A Particle Filter without Dynamics for Robust 3D Face Tracking

            Le Lu    Xiang-Tian Dai    Gregory Hager


Abstract

The particle filter is a highly successful technique for sequential state
estimation. Its convergence depends critically on the balance between the
number of particles/hypotheses and how well the dynamic model guiding the
temporal evolution of the particles fits the application. Because learning
or tuning complex dynamics for agile motions is difficult, real applications
usually require thousands of particles. This paper presents a hybrid sampling
solution that combines sampling in the image feature space with sampling in
the state space, via RANSAC and the particle filter respectively. We show
that the number of particles can be reduced to dozens for a full 3D tracking
problem containing considerable noise of several types. For non-semantic
motions a specific dynamic model may not even exist; our algorithm avoids
the need for one. A theoretical convergence proof \cite{Crisan02,Doucet00}
for particle filtering with integrated RANSAC is difficult, so we address
the question empirically by analyzing the likelihood distributions of the
particles in a real tracking example: with RANSAC in the loop, sampling
efficiency (the concentration of samples in the more likely regions) is
much higher. We also discuss measuring tracking quality in terms of entropy
or statistical testing. The algorithm is applied to 3D face pose tracking
under moderate to intense expression changes, and we demonstrate its
validity on several video sequences acquired in a casual environment.
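As a rough illustration of the hybrid sampling idea (not the authors'
implementation), the Python sketch below mixes particles diffused from the
previous posterior with pose hypotheses generated by RANSAC on feature
correspondences, then reweights everything by an image likelihood. The
2D-translation pose model, the robust likelihood, and all function names
(solve_pose, likelihood, hybrid_pf_step, weight_entropy) are hypothetical
stand-ins for the paper's full 3D face pose and appearance model.

import numpy as np

rng = np.random.default_rng(0)

def solve_pose(prev_pts, curr_pts):
    # Minimal stand-in pose model: least-squares 2D translation between
    # matched feature points (the paper estimates full 3D pose).
    return np.mean(curr_pts - prev_pts, axis=0)

def likelihood(pose, prev_pts, curr_pts, sigma=2.0):
    # Stand-in image likelihood: robust (median) reprojection residual.
    resid = np.linalg.norm(curr_pts - (prev_pts + pose), axis=1)
    return np.exp(-np.median(resid) ** 2 / (2 * sigma ** 2))

def weight_entropy(w):
    # Entropy of the normalized weights: a few dominant particles give low
    # entropy, which can flag degrading tracking quality, in the spirit of
    # the entropy-based measure the paper discusses.
    return -np.sum(w * np.log(w + 1e-12))

def hybrid_pf_step(particles, weights, prev_pts, curr_pts,
                   n_ransac=20, n_min=3, noise=0.5):
    # One filtering step with no learned dynamics: part of the particle set
    # is resampled from the previous posterior and diffused (state space);
    # the rest comes from RANSAC on random minimal subsets of feature
    # correspondences (image feature space).
    n = len(particles)
    keep = n - n_ransac
    idx = rng.choice(n, size=keep, p=weights)
    survivors = particles[idx] + rng.normal(0.0, noise, size=(keep, 2))
    hyps = np.array([
        solve_pose(prev_pts[sub], curr_pts[sub])
        for sub in (rng.choice(len(prev_pts), size=n_min, replace=False)
                    for _ in range(n_ransac))
    ])
    particles = np.vstack([survivors, hyps])
    w = np.array([likelihood(p, prev_pts, curr_pts) for p in particles])
    return particles, w / w.sum()

# Toy run: a point cloud translating by (3, -2) per frame with noise.
pts = rng.uniform(0.0, 100.0, size=(50, 2))
particles = rng.normal(0.0, 5.0, size=(60, 2))   # dozens, not thousands
weights = np.full(60, 1.0 / 60.0)
for _ in range(5):
    curr = pts + np.array([3.0, -2.0]) + rng.normal(0.0, 0.5, pts.shape)
    particles, weights = hybrid_pf_step(particles, weights, pts, curr)
    pts = curr
print("pose estimate:", np.average(particles, axis=0, weights=weights))
print("weight entropy:", weight_entropy(weights))

In this toy setting a few dozen particles suffice because the RANSAC
hypotheses land directly in the high-likelihood region, which is the effect
the paper analyzes empirically.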

Testing Results

comparison.avi (validity test of the RANSAC-PF combination)


cher.avi (3D Face Tracking with moderate expression changes)


donald.avi (3D Face Tracking with intense expression changes)


Full Text Paper (pdf)