Elsevier

Computers & Graphics

Volume 35, Issue 4, August 2011, Pages 831-840

Mobile Augmented Reality
Robust detection and tracking of annotations for outdoor augmented reality browsing

https://doi.org/10.1016/j.cag.2011.04.004
Open access under a Creative Commons license

Abstract

A common goal of outdoor augmented reality (AR) is the presentation of annotations that are registered to anchor points in the real world. We present an enhanced approach for registering and tracking such anchor points that is suitable for current-generation mobile phones and can successfully deal with the wide variety of viewing conditions encountered in real-life outdoor use. The approach is based on the on-the-fly generation of panoramic images by sweeping the camera over the scene. The panoramas are then used for stable orientation tracking while the user performs purely rotational movements. This basic approach is improved by several new techniques for the re-detection and tracking of anchor points. For re-detection, specifically after temporal variations, we first compute a panoramic image with extended dynamic range, which can better represent varying illumination conditions. The panorama is then searched for known anchor points while orientation tracking continues uninterrupted. We then use information from an internal orientation sensor to prime an active search scheme for the anchor points, which improves matching results. Finally, global consistency is enhanced by statistical estimation of a global rotation that minimizes the overall position error of anchor points when transforming them from the source panorama in which they were created to the current view represented by a new panorama. Once the anchor points are re-detected, we track the user's movement using a novel 3-degree-of-freedom orientation tracking approach that combines vision tracking with the absolute orientation from inertial and magnetic sensors. We tested our system using an AR campus guide as an example application and provide detailed results for our approach using an off-the-shelf smartphone. Results show that the re-detection rate is improved by a factor of 2 compared with previous work, reaching almost 90% over a wide variety of test cases while still running at interactive frame rates.
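The 3-degree-of-freedom orientation tracking described above fuses relative vision tracking with absolute orientation from inertial and magnetic sensors. A common way to combine such sources is a complementary filter; the sketch below is a hypothetical illustration of that idea (not the paper's implementation), blending a drift-free but noisy absolute sensor yaw into a smooth vision-tracked yaw:

```python
import math

def fuse_orientation(vision_yaw, sensor_yaw, alpha=0.98):
    """Complementary filter for one orientation axis (radians).

    vision_yaw: smooth but drift-prone estimate from vision tracking.
    sensor_yaw: noisy but absolute estimate from compass/inertial sensors.
    alpha:      weight kept on the vision estimate (illustrative value).
    """
    # Wrap the angular difference so blending works across the +/- pi boundary.
    diff = math.atan2(math.sin(sensor_yaw - vision_yaw),
                      math.cos(sensor_yaw - vision_yaw))
    # Pull the vision estimate a small step toward the absolute sensor reading.
    return vision_yaw + (1.0 - alpha) * diff
```

Applied per frame and per axis, this slowly corrects vision drift while keeping the low-noise, high-rate character of the vision tracker.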

Highlights

► Enhanced approach to register and track annotations' anchor points.
► Panoramic image with extended dynamic range to increase image quality.
► Use of an internal orientation sensor to prime an active search for the anchor points.
► Minimizing the position error of anchor points by aligning panoramas.
► Tracking using a 3-degree-of-freedom orientation approach based on sensor fusion.
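Minimizing the position error of anchor points by aligning panoramas amounts to estimating a single global rotation between the source and current panorama. For matched unit direction vectors in the panorama plane, the least-squares rotation angle has a closed form; the following sketch is an illustrative reconstruction of that alignment step, not the authors' code:

```python
import math

def estimate_global_rotation(src_pts, dst_pts):
    """Least-squares estimate of one rotation angle mapping anchor
    directions from a source panorama onto the current one.

    src_pts, dst_pts: matched (x, y) unit vectors on the panorama's
    horizontal circle (hypothetical representation for illustration).
    """
    s, c = 0.0, 0.0
    for (x1, y1), (x2, y2) in zip(src_pts, dst_pts):
        c += x1 * x2 + y1 * y2   # sum of dot products  (~ cos of angle)
        s += x1 * y2 - y1 * x2   # sum of cross products (~ sin of angle)
    return math.atan2(s, c)      # angle minimizing the summed squared error
```

Averaging over all matched anchors in this way makes the estimate robust to noise in any single correspondence.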

Keywords

Augmented reality
Annotation
Tracking
Mobile phone
