Schöps T, Engel J, Cremers D (2014): Semi-dense visual odometry for AR on a smartphone
Publication Type: Conference contribution
Publication year: 2014
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages Range: 145-150
Conference Proceedings Title: ISMAR 2014 - IEEE International Symposium on Mixed and Augmented Reality - Science and Technology 2014, Proceedings
Event location: Munich, DEU
ISBN: 9781479961849
DOI: 10.1109/ISMAR.2014.6948420
We present a direct monocular visual odometry system which runs in real-time on a smartphone. Being a direct method, it tracks and maps directly on the images themselves instead of using extracted features such as keypoints. New images are tracked using direct image alignment, while geometry is represented in the form of a semi-dense depth map. Depth is estimated by filtering over many small-baseline, pixel-wise stereo comparisons. This leads to significantly fewer outliers and allows mapping and using all image regions with sufficient gradient, including edges. We show how a simple world model for AR applications can be derived from semi-dense depth maps, and demonstrate the practical applicability in the context of an AR application in which simulated objects can collide with real geometry.
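The depth-filtering step described in the abstract can be pictured as a per-pixel recursive fusion of inverse-depth hypotheses obtained from many small-baseline stereo comparisons. The sketch below is a minimal illustration of that idea as a simple Gaussian product update; it is not the authors' implementation, and all function and variable names are hypothetical.

```python
import numpy as np

def fuse_inverse_depth(d_prior, var_prior, d_obs, var_obs):
    """Fuse a per-pixel inverse-depth prior with one new small-baseline
    stereo observation, modelling both as Gaussians (illustrative only)."""
    var_post = (var_prior * var_obs) / (var_prior + var_obs)
    d_post = var_post * (d_prior / var_prior + d_obs / var_obs)
    return d_post, var_post

# Example: one pixel's hypothesis refined by three noisy stereo observations
# (inverse depth in 1/m); the variance shrinks as observations accumulate.
d, var = 0.50, 0.10
for d_obs, var_obs in [(0.55, 0.05), (0.52, 0.05), (0.54, 0.05)]:
    d, var = fuse_inverse_depth(d, var, d_obs, var_obs)
print(f"fused inverse depth: {d:.3f} (variance {var:.4f})")
```

In a full system such a filter would run only at pixels with sufficient image gradient, which is what makes the resulting depth map semi-dense.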
APA:
Schöps, T., Engel, J., & Cremers, D. (2014). Semi-dense visual odometry for AR on a smartphone. In Robert W. Lindeman, Christian Sandor, & Simon Julier (Eds.), ISMAR 2014 - IEEE International Symposium on Mixed and Augmented Reality - Science and Technology 2014, Proceedings (pp. 145-150). Munich, DEU: Institute of Electrical and Electronics Engineers Inc.
MLA:
Schöps, Thomas, Jakob Engel, and Daniel Cremers. "Semi-dense visual odometry for AR on a smartphone." Proceedings of the 13th IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2014, Munich, DEU. Ed. Robert W. Lindeman, Christian Sandor, and Simon Julier. Institute of Electrical and Electronics Engineers Inc., 2014. 145-150.