Kostas Daniilidis' Talk

Title: Signal analysis and geometry of immersive sensing

Immersive visualization is rapidly gaining popularity with the spread of platforms that let viewers switch among viewpoints and viewing directions. While it might look like a pure graphics problem when the content is virtual, it becomes a problem of immersive visual sensing when we visualize the real world, and in real time.

Immersive sensing is best described by the notion of the plenoptic function. In this talk I will start by describing tele-immersion, a system that amplifies the sense of co-presence over the Internet. Then I will present ways to analyze samplings of the plenoptic function beyond the traditional perspective plane, starting with omnidirectional systems having a single viewpoint.
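
For reference, the plenoptic function is usually written in its seven-dimensional form, recording the intensity of light observable at every position, in every direction, at every wavelength and time:

\[ P = P(V_x, V_y, V_z, \theta, \phi, \lambda, t). \]

Omnidirectional sensing with a single viewpoint fixes the position (V_x, V_y, V_z) and samples the directional part (\theta, \phi) over the whole sphere.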

I will present a new unifying theory of panoramic image formation that covers all central omnidirectional sensors as well as any conventional pinhole camera. The model consists of a spherical projection followed by a projection from the sphere to the omnidirectional image plane. The natural domain in which to process an omnidirectional signal is the sphere, considered as a homogeneous space under the group action of rotations. By applying a Fourier transform on rotations we can obtain attitude information directly, without point or line correspondences.
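
To fix ideas, here is a minimal sketch of this two-step model in one common parameterization (the parameter names l and m are mine: l is the height of the virtual projection center on the sphere axis and l + m its distance to the image plane; l = 0 recovers the pinhole camera and l = 1 a parabolic catadioptric sensor):

```python
import numpy as np

def unified_projection(X, l=1.0, m=1.0):
    """Unified central projection: a 3D point is projected onto the unit
    sphere and then reprojected from the point (0, 0, l) on the sphere's
    axis onto an image plane at distance l + m from that point.
    l = 0 gives a conventional pinhole camera; l = 1 a parabolic mirror."""
    x, y, z = X
    r = np.linalg.norm(X)          # step 1: spherical projection X -> X / r
    s = (l + m) / (l * r - z)      # step 2: perspective reprojection factor
    return np.array([s * x, s * y])

# With l = 0 this reduces to the perspective division x / (-z):
print(unified_projection((1.0, 0.0, -2.0), l=0.0))   # -> [0.5  0. ]
```

The attitude-recovery step can be phrased as a correlation over the rotation group: given two spherical images f and h, one seeks

\[ \hat{R} = \arg\max_{R \in SO(3)} \int_{S^2} f(\omega)\, h(R^{-1}\omega)\, d\omega, \]

which can be evaluated for all rotations at once in the spectral domain from the images' spherical harmonic coefficients (a sketch of the general idea; the talk's specific algorithm may differ in detail).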

To describe more general mappings of omnidirectional planes, we consider a new representation in which the omnidirectional plane is lifted to a 3D circle space, where transformations preserving circles can be modeled as elements of the Lorentz group SO(3,1). Such a mapping also models the intrinsic geometry of an omnidirectional camera, and it turns out to drastically simplify the problem of 3D motion estimation. The additional robustness afforded by a huge field of view makes such sensors irreplaceable in navigation and wide-area mapping tasks.
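
One standard form of such a lifting, stated here only to fix notation (the talk's exact construction may differ): a point (x, y) of the plane is mapped to a null vector of the Minkowski space R^{3,1},

\[ (x, y) \;\mapsto\; \Big(x,\; y,\; \tfrac{1 - x^2 - y^2}{2},\; \tfrac{1 + x^2 + y^2}{2}\Big), \]

with an inner product of signature (3,1). Circles in the plane then correspond to spacelike vectors of this space, and the circle-preserving (Möbius) transformations act linearly on it as the Lorentz group SO(3,1).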

