Visual Tracking Demos
Please click the images on this page to watch the corresponding video demos.
Stan Li and Yu Qiao, Real-time Multiple Face Tracking Based on Boosted Face Filter (Microsoft Research)
Multiple face tracking is performed based on the output of a boosted face filter. The boosted face filter initializes the tracking and provides a likelihood estimate that guides and updates the tracking model. Gradient search combined with template tracking is used to improve efficiency. No assumption is made about the stationarity of the background, nor is skin color used. The resulting system can handle a varying number of faces, changes in face appearance, brief mutual occlusion, and a moving background. It runs at about 25 fps on 320x240 images on a P4 1.4 GHz PC.
(2.56M) (4.60M) (2.15M)
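The combination described above, a template-based likelihood refined by gradient search, can be sketched minimally. This is an illustrative sketch under assumed details (grayscale arrays, a negative-SSD likelihood, greedy pixel-wise hill-climbing), not the authors' implementation:

```python
import numpy as np

def ssd_likelihood(frame, template, x, y):
    """Negative sum-of-squared-differences between the template and the
    frame patch at (x, y); higher means a better match."""
    h, w = template.shape
    if x < 0 or y < 0 or y + h > frame.shape[0] or x + w > frame.shape[1]:
        return -np.inf  # out of bounds
    patch = frame[y:y + h, x:x + w]
    return -float(np.sum((patch - template) ** 2))

def gradient_search(frame, template, x, y, max_steps=50):
    """Greedy hill-climbing on the likelihood surface: repeatedly move
    one pixel toward the best-scoring neighbor until a local maximum."""
    best = ssd_likelihood(frame, template, x, y)
    for _ in range(max_steps):
        moved = False
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            score = ssd_likelihood(frame, template, x + dx, y + dy)
            if score > best:
                best, x, y, moved = score, x + dx, y + dy, True
        if not moved:
            break  # local maximum reached
    return x, y
```

In a tracker of this style, each new frame would start the search from the previous frame's estimated position, so only a few steps are needed per frame.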
Zhihong Zeng and Songde Ma, An Efficient Vision System for Multiple Car Tracking, Int. Conf. on Pattern Recognition, Vol. 2, pp. 609-612, 2002
An efficient system is proposed for tracking multiple cars on the highway. The main modules of the system are lane detection, separate 2D model-based trackers, heuristic car detection, and a process coordinator. Dynamic creation and termination of tracking processes keeps the computational cost proportional to the number of cars in view. The system was successfully tested on a 2767-frame image sequence from PETS2001, and the processing time per frame is 12 ms on a Pentium III 450 MHz PC. Some results of the system (AVI files) are shown here. The resolution of the original images is 768×576 pixels.
(3.51M) (3.91M) (3.39M)
(3.40M) (3.45M) (3.33M)
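The coordinator idea, spawning a tracking process for each newly detected car and terminating trackers that lose their target, can be sketched as follows. The class and method names and the bounding-box matching rule are illustrative assumptions, not the paper's design:

```python
def overlaps(a, b):
    """Axis-aligned overlap test for (x, y, w, h) boxes."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

class CarTracker:
    def __init__(self, car_id, box):
        self.car_id = car_id
        self.box = box      # (x, y, w, h) 2D model state
        self.misses = 0     # consecutive frames without supporting detection

    def update(self, detections, max_misses=3):
        """Match this tracker to an overlapping detection; count misses.
        Returns False once the target is considered lost."""
        for box in detections:
            if overlaps(self.box, box):
                self.box, self.misses = box, 0
                return True
        self.misses += 1
        return self.misses <= max_misses

class Coordinator:
    def __init__(self):
        self.trackers, self.next_id = [], 0

    def step(self, detections):
        # 1. Update existing trackers, terminating those that lost their car.
        self.trackers = [t for t in self.trackers if t.update(detections)]
        # 2. Spawn a new tracker for every detection no tracker covers.
        for box in detections:
            if not any(overlaps(t.box, box) for t in self.trackers):
                self.trackers.append(CarTracker(self.next_id, box))
                self.next_id += 1
```

This is the mechanism that makes the computational cost track the number of visible cars: idle trackers are never kept alive for cars that have left the scene.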
Zhihong Zeng and Songde Ma, Head Tracking by Active Particle Filtering, Int. Conf. on Automatic Face and Gesture Recognition, pp. 89-94, 2002
Particle filtering has attracted much attention due to its robust tracking performance in clutter. However, the price of this robustness is computational cost. Active particle filtering is proposed in this paper. Unlike traditional particle filtering, every particle in active particle filtering is first driven to a local maximum of the likelihood before it is weighted. This improves the efficiency of every particle and greatly reduces the number of particles required. In fact, the number of particles in active particle filtering is determined mainly by the degree of clutter in the environment rather than by the size of the model's configuration space. Extensive experimental results show that the tracker is efficient and robust, tracking heads undergoing translation and full 360-degree out-of-plane rotation with partial occlusion in cluttered environments. The test sequences are from the Computer Science Department of Stanford University; we are grateful to Dr. Stan Birchfield and his group for making these wonderful image sequences available.
1. Cluttered environment (717KB) 2. Partial occlusion (1.167MB) 3. A lady with long hair (2.304MB)
4. Multiple moving people (487KB) 5. Severe occlusion (94KB) 6. Rapid movement (149KB)
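The "active" refinement, driving each particle to a local likelihood maximum before weighting and resampling, can be illustrated on a toy one-dimensional state. The scalar state, fixed hill-climbing step, and Gaussian likelihood are assumptions for illustration only:

```python
import random

def active_particle_step(particles, likelihood, step=0.5, climbs=10):
    """One step of the active-particle idea: hill-climb each particle to
    a local maximum of the likelihood, then weight and resample."""
    # 1. Drive each particle uphill (the "active" refinement).
    moved = []
    for x in particles:
        for _ in range(climbs):
            if likelihood(x + step) > likelihood(x):
                x += step
            elif likelihood(x - step) > likelihood(x):
                x -= step
            else:
                break  # local maximum reached
        moved.append(x)
    # 2. Weight each particle at its refined position.
    weights = [likelihood(x) for x in moved]
    total = sum(weights)
    weights = [w / total for w in weights]
    # 3. Resample proportionally to weight.
    return random.choices(moved, weights=weights, k=len(moved))
```

Because every particle lands on a likelihood peak before being weighted, far fewer particles are wasted in low-likelihood regions, which is the source of the reduced particle count the abstract describes.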
Zhihong Zeng and Songde Ma, Real-time Face Tracking under Partial Occlusion and Illumination Change, Int. Conf. on Multimodal Interfaces, pp. 135-142, 2000
We propose an approach that tracks human faces robustly in real-time applications by combining region matching with an active contour model. The proposed technique is applied to track the head of a person doing Taiji exercises in live video sequences. The system demonstrates promising performance, and the tracking time per frame is about 40 ms on a Pentium II 400 MHz PC. The resolution of the original images is 640×480 pixels.
(942 KB) (477 KB)
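The active-contour half of such a combination can be sketched with one greedy snake iteration: each control point moves to the neighboring pixel that trades off contour smoothness against image edge strength. The energy terms and the edge function here are toy assumptions, not the paper's formulation:

```python
import math

def greedy_snake_step(points, edge_strength, alpha=1.0):
    """One greedy iteration of an active contour: each control point of
    the closed contour moves to the 8-neighborhood pixel minimizing
    (smoothness energy) - (edge strength)."""
    new_points = []
    n = len(points)
    for i, (x, y) in enumerate(points):
        px, py = points[(i - 1) % n]       # previous contour point
        qx, qy = points[(i + 1) % n]       # next contour point
        best, best_pt = None, (x, y)
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                cx, cy = x + dx, y + dy
                # Internal energy: distance to the two adjacent points.
                internal = math.hypot(cx - px, cy - py) + math.hypot(cx - qx, cy - qy)
                energy = alpha * internal - edge_strength(cx, cy)
                if best is None or energy < best:
                    best, best_pt = energy, (cx, cy)
        new_points.append(best_pt)
    return new_points
```

In a region-matching-plus-contour scheme of the kind the abstract describes, region matching would provide the coarse head position each frame, after which a few snake iterations lock the contour onto the head boundary.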
Zhihong Zeng and Songde Ma, Real-time visual target tracking and stabilization, technical report, 2000-1-30
We achieve real-time target stabilization on a Pentium II 400 MHz PC in an efficient and robust way. The resolution of the original images is 640×480 pixels.
Go back to the home page of Zhihong Zeng