Joint laboratory MICC – Thales

MICC, the Media Integration and Communication Center of the University of Florence, and Thales Italy have established a partnership to create a joint laboratory between university and company, in order to research and develop innovative solutions for safety, sensitive sites, critical infrastructure and transport.

MICC - Thales joint lab demo at Thales Technoday 2011

In particular, the technology program is mainly focused on (but not limited to) surveillance through video analysis, employing computer vision and pattern recognition technologies.

An active field of research, carried on from 2009 to 2011, studied how to increase the effectiveness of classic video surveillance systems by using active sensors (Pan-Tilt-Zoom cameras) to obtain higher resolution images of tracked targets.

MICC - Thales joint lab projects

The collaboration made it possible to start studying the inherent complexities of PTZ camera setups and target tracking algorithms, and focused on the study and verification of a set of basic video analysis functionalities.

Thales

The joint lab led to two important demos at two main events: Festival della Creatività in October 2010 in Florence (Italy) and Thales Technoday 2011 in January 2011 in Paris (France). At the latter, the PTZ Tracker was nominated as a VIP Demo (Very ImPortant Demo).

Some videos about these events:

Particle filter-based visual tracking

The project’s goal is to develop a computationally efficient, robust, real-time particle filter-based visual tracker. In particular, we aim to increase the robustness of the tracker when it is used in conjunction with weak (but computationally efficient) appearance models, such as color histograms. To achieve this goal, we have proposed an adaptive parameter estimation method that estimates the statistical parameters of the particle filter on-line, so that the uncertainty in the filter can be increased or reduced depending on a measure of its performance (tracking quality).
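To give an idea of the mechanism, here is a minimal sketch of a bootstrap particle filter whose process noise is scaled by a tracking-quality score. The function names, the use of the mean appearance likelihood as the quality measure, and the specific noise-scaling rule are illustrative assumptions, not the exact method developed in the project.

```python
import numpy as np

def adaptive_pf_step(particles, weights, likelihood_fn,
                     prev_quality, base_sigma=5.0, min_quality=1e-3):
    """One bootstrap particle filter step with adaptive process noise
    (illustrative sketch, not the project's actual implementation).

    particles    : (N, 2) array of (x, y) target hypotheses
    weights      : (N,) normalized weights from the previous frame
    likelihood_fn: maps an (x, y) hypothesis to an appearance likelihood,
                   e.g. a similarity between color histograms
    prev_quality : tracking quality of the previous frame (mean likelihood)
    """
    n = len(particles)

    # Systematic resampling proportional to the previous weights.
    u = (np.arange(n) + np.random.rand()) / n
    particles = particles[np.searchsorted(np.cumsum(weights), u)]

    # Adapt the spread of the motion model: poor tracking quality inflates
    # the process noise (explore more), good quality shrinks it (lock on).
    sigma = base_sigma / max(prev_quality, min_quality)
    particles = particles + np.random.normal(0.0, sigma, particles.shape)

    # Re-weight with the appearance model and normalize.
    lik = np.array([likelihood_fn(p) for p in particles])
    weights = lik / (lik.sum() + 1e-12)

    # Weighted mean gives the state estimate; the mean likelihood is the
    # quality score fed back at the next frame.
    estimate = weights @ particles
    quality = float(lik.mean())
    return particles, weights, estimate, quality
```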

Particle filter based visual tracking

The method has proved effective in dramatically increasing the robustness of a particle filter-based tracker in situations that are usually critical for visual tracking, such as in the presence of occlusions and highly erratic motion.

The data set we used is now available for download, together with ground truth data, so that others can test their trackers on it and compare performance.

It consists of 10 video sequences showing a remote-controlled toy car (Ferrari F40) filmed from two different points of view: ground level and ceiling. The sequences are provided in MJPEG format, together with text files (one per sequence) containing ground truth data (position and size of the target’s bounding box) for each frame. Below you can see an example of the ground truth provided with our data set (sequence #10):
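As a simple illustration of how the annotations can be used, the snippet below loads a ground-truth file and overlays the annotated bounding box on the corresponding frames. The file names and the assumed column layout (one line per frame with "x y w h") are hypothetical; please check them against the files in the download.

```python
import cv2

# Hypothetical file names; the actual names depend on the downloaded data set.
video_path = "sequence_10.avi"
gt_path = "sequence_10_groundtruth.txt"

# Assumed layout: one line per frame with "x y w h" (top-left corner and
# size of the bounding box); adjust the parsing if the real files differ.
with open(gt_path) as f:
    boxes = [tuple(int(float(v)) for v in line.split())
             for line in f if line.strip()]

cap = cv2.VideoCapture(video_path)
for x, y, w, h in boxes:
    ok, frame = cap.read()
    if not ok:
        break
    # Overlay the ground-truth bounding box on the frame.
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("ground truth", frame)
    if cv2.waitKey(30) & 0xFF == 27:  # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```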

We have tested the performance of the resulting tracker on the sequences of our data set by comparing the segmentation provided by the tracker with the ground truth data. Quantitative measures of this performance are reported in the literature. Below we show a few videos that demonstrate the tracker’s capabilities.
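For a comparison of this kind, a common choice is the bounding-box overlap (intersection over union) between the tracker output and the ground truth. The sketch below shows such a generic comparison; it is an example of a standard metric, not necessarily the exact measure used in our published evaluation.

```python
def iou(box_a, box_b):
    """Intersection-over-union of two (x, y, w, h) bounding boxes."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    iw = max(0, min(ax + aw, bx + bw) - max(ax, bx))
    ih = max(0, min(ay + ah, by + bh) - max(ay, by))
    inter = iw * ih
    union = aw * ah + bw * bh - inter
    return inter / union if union > 0 else 0.0

def sequence_score(tracker_boxes, gt_boxes, threshold=0.5):
    """Fraction of frames where the tracker's box overlaps the ground
    truth by at least `threshold` (a simple per-sequence summary)."""
    hits = sum(iou(t, g) >= threshold for t, g in zip(tracker_boxes, gt_boxes))
    return hits / len(gt_boxes)
```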

This is an example of tracking on sequence #9 of the data set:

An example of tracking humans outdoors with a PTZ camera. In this video (not part of the data set) the camera was steered by the tracker. It is thus an example of active tracking, and it shows that the method can be applied to PTZ cameras, since it does not use any background modeling technique:
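As an aside, steering a PTZ camera from tracker output can be as simple as turning the target’s offset from the image centre into pan/tilt velocity commands. The proportional-control law and the parameter names below are assumptions for illustration only, not the control scheme used in the demo.

```python
def ptz_command(target_cx, target_cy, frame_w, frame_h,
                gain_pan=0.2, gain_tilt=0.2, dead_zone=0.05):
    """Map the tracked target's centre to pan/tilt velocities so the camera
    keeps the target near the image centre (simple proportional control).

    Returns (pan_speed, tilt_speed) in [-1, 1]; how they are sent to the
    camera depends on the specific PTZ protocol in use.
    """
    # Normalized offset of the target from the image centre, in [-1, 1].
    dx = (target_cx - frame_w / 2.0) / (frame_w / 2.0)
    dy = (target_cy - frame_h / 2.0) / (frame_h / 2.0)

    # Ignore tiny offsets to avoid jitter around the centre.
    pan = 0.0 if abs(dx) < dead_zone else max(-1.0, min(1.0, gain_pan * dx))
    tilt = 0.0 if abs(dy) < dead_zone else max(-1.0, min(1.0, gain_tilt * dy))
    return pan, tilt
```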