Mobile3DRecon: Real-time Monocular 3D Reconstruction on a Mobile Phone.
IEEE Transactions on Visualization and Computer Graphics ( IF 5.2 ) Pub Date : 2020-09-21 , DOI: 10.1109/tvcg.2020.3023634
Xingbin Yang , Liyang Zhou , Hanqing Jiang , Zhongliang Tang , Yuanbo Wang , Hujun Bao , Guofeng Zhang

We present Mobile3DRecon, a real-time monocular 3D reconstruction system that runs on a mobile phone. Using the embedded monocular camera, our system provides online mesh generation on the back end together with real-time 6DoF pose tracking on the front end, so that users can achieve realistic AR effects and interactions on mobile phones. Unlike most existing state-of-the-art systems, which produce only point-cloud-based 3D models online or surface meshes offline, we propose a novel online incremental mesh generation approach that achieves fast online dense surface mesh reconstruction to satisfy the demands of real-time AR applications. For each keyframe of the 6DoF tracking, we perform robust monocular depth estimation using a multi-view semi-global matching method followed by a depth-refinement post-processing step. The proposed mesh generation module incrementally fuses each estimated keyframe depth map into an online dense surface mesh, which is useful for achieving realistic AR effects such as occlusions and collisions. We verify our real-time reconstruction results on two mid-range mobile platforms. Quantitative and qualitative experiments demonstrate the effectiveness of the proposed monocular 3D reconstruction system, which handles occlusions and collisions between virtual objects and real scenes to achieve realistic AR effects.
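The incremental fusion step described in the abstract resembles classical TSDF-style depth-map integration. The sketch below is an illustration of that general idea, not the authors' implementation: the function name, the pinhole intrinsics, the voxel-grid layout, and the simple weight-1 running average are all assumptions for the example.

```python
import numpy as np

def fuse_depth_keyframe(tsdf, weights, depth, K, pose,
                        voxel_origin, voxel_size, trunc):
    """Fuse one keyframe depth map into a TSDF volume (updates in place)."""
    H, W = depth.shape
    # World coordinates of every voxel center.
    idx = np.stack(np.meshgrid(*(np.arange(s) for s in tsdf.shape),
                               indexing="ij"), axis=-1)
    world = voxel_origin + (idx + 0.5) * voxel_size
    # Transform voxel centers into the camera frame (pose is camera-to-world).
    w2c = np.linalg.inv(pose)
    cam = world @ w2c[:3, :3].T + w2c[:3, 3]
    z = cam[..., 2]
    zs = np.where(z > 0, z, 1.0)  # avoid divide-by-zero; masked out below
    # Project onto the image plane with the pinhole intrinsics K.
    u = np.round(K[0, 0] * cam[..., 0] / zs + K[0, 2]).astype(int)
    v = np.round(K[1, 1] * cam[..., 1] / zs + K[1, 2]).astype(int)
    valid = (z > 0) & (u >= 0) & (u < W) & (v >= 0) & (v < H)
    d = np.where(valid, depth[v.clip(0, H - 1), u.clip(0, W - 1)], 0.0)
    # Signed distance along the viewing ray, truncated to [-1, 1].
    sdf = d - z
    valid &= (d > 0) & (sdf > -trunc)
    new = np.clip(sdf / trunc, -1.0, 1.0)
    # Running weighted average with per-observation weight 1.
    w = weights[valid]
    tsdf[valid] = (tsdf[valid] * w + new[valid]) / (w + 1.0)
    weights[valid] = w + 1.0
```

Each tracked 6DoF keyframe would trigger one such fusion call; a marching-cubes pass over the resulting `tsdf` volume would then yield the incrementally updated surface mesh used for occlusion and collision.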

Updated: 2020-11-13