Abstract

The rise of consumer devices capable of 3D scanning has given many people access to technology that can reconstruct three-dimensional environments in real time at relatively low cost. Available consumer solutions for scene reconstruction, however, are mostly focused on producing a single, cleaned, watertight mesh suitable for 3D printing. They offer few, if any, ways for a user to navigate and manipulate the reconstructed scene in a natural and intuitive manner. To this end, the virtual scene must be segmented into distinct objects that resemble their real-world counterparts. To further aid the design of an intuitive system, prevalent approaches to scene navigation have to be revisited and improved upon.

In this research project, a prototype for the virtual reconstruction and manipulation of three-dimensional scenes has been developed. The system allows quick and intuitive 3D scanning and reconstruction using a tablet display in combination with a depth camera. Distinct planes and objects are identified and separated from each other through a combination of automatic and manual segmentation techniques. A subsequent processing stage allows users to fill possible gaps and remove small, unwanted components. Afterwards, the tracking capabilities of the 3D reconstruction algorithm enable users to navigate through their reconstructed virtual scene by physically moving the tablet: the mobile display functions as a window into the virtual world. Using the touch capabilities of the screen, previously segmented objects can be picked up and repositioned anywhere in the scene; duplication, transformation, and removal of items are also possible with the provided tools. Edited scenes can be exported to one of several common file formats for use in other applications.

After development, a small user study was conducted to evaluate the prototype. The results show that the system offers high usability and is easy to learn. Participants were able to reconstruct and segment scenes of reasonable quality without much effort. Moreover, the methods used for scene navigation and interaction proved to be highly intuitive and natural. The evaluation also revealed several possible areas of improvement for future releases.
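
The abstract does not name the automatic segmentation technique. A common choice for detecting the dominant planes (floors, walls, tabletops) in reconstructed depth data is RANSAC plane fitting; the following NumPy sketch illustrates the idea, with all names and parameter values chosen for illustration only, not taken from the thesis.

```python
# Illustrative RANSAC plane fit; not the thesis's actual implementation.
import numpy as np

def ransac_plane(points, n_iters=500, dist_thresh=0.01, seed=None):
    """Fit a dominant plane n.x + d = 0 to an (N, 3) point array.

    Returns (normal, d, inlier_mask). Assumes N >= 3 with at least
    one non-collinear triple.
    """
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(points), dtype=bool)
    best_plane = (np.array([0.0, 0.0, 1.0]), 0.0)
    for _ in range(n_iters):
        # Three random points span a candidate plane.
        p = points[rng.choice(len(points), size=3, replace=False)]
        normal = np.cross(p[1] - p[0], p[2] - p[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-9:                     # collinear sample: skip
            continue
        normal /= norm
        d = -normal.dot(p[0])
        # Inliers lie within dist_thresh of the candidate plane.
        inliers = np.abs(points @ normal + d) < dist_thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_plane = inliers, (normal, d)
    return best_plane[0], best_plane[1], best_inliers
```

Planes can then be peeled off iteratively: fit, remove the inliers, and repeat until the best plane's support falls below a threshold; whatever remains is clustered into candidate objects and, where automatic segmentation falls short, refined manually.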
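
Likewise, the removal of small, unwanted components is not detailed in the abstract. One standard approach is to compute the connected components of the reconstructed mesh and discard those below a size threshold, as in this sketch (the `min_faces` cutoff is hypothetical):

```python
# Drop small disconnected mesh components via union-find over shared
# vertices; a sketch with an invented size threshold.
import numpy as np

def keep_large_components(faces, num_vertices, min_faces=50):
    """Return a boolean keep-mask over `faces` (an (M, 3) int array)."""
    parent = list(range(num_vertices))

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]   # path halving
            a = parent[a]
        return a

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    for v0, v1, v2 in faces:                # a face joins its vertices
        union(v0, v1)
        union(v1, v2)

    roots = np.array([find(v0) for v0, _, _ in faces])
    root_ids, counts = np.unique(roots, return_counts=True)
    sizes = dict(zip(root_ids.tolist(), counts.tolist()))
    return np.array([sizes[r] >= min_faces for r in roots])
```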
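
Navigation with the tablet as a "window into the virtual world" falls out of the tracker almost for free: the pose estimate that the reconstruction algorithm already maintains for the depth camera can be inverted and used as the renderer's view matrix, so physically moving the device moves the virtual viewpoint. A minimal sketch, assuming a rigid 4x4 camera-to-world pose:

```python
# Turn a tracked camera-to-world pose into a renderer view matrix.
import numpy as np

def view_from_pose(pose_c2w):
    """Invert a rigid 4x4 pose: view = [R^T | -R^T t]."""
    R, t = pose_c2w[:3, :3], pose_c2w[:3, 3]
    view = np.eye(4)
    view[:3, :3] = R.T
    view[:3, 3] = -R.T @ t
    return view
```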
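
Touch-based manipulation requires mapping a 2D touch point to a 3D object. The abstract does not describe the mechanism; a typical implementation unprojects the touch into a world-space ray through the pinhole camera model and picks the first segmented object the ray hits. A sketch with hypothetical intrinsics (fx, fy, cx, cy):

```python
# Unproject a touch point (u, v) into a world-space picking ray.
import numpy as np

def touch_to_ray(u, v, fx, fy, cx, cy, pose_c2w):
    """Return (origin, direction) of the ray in world coordinates."""
    # Direction in camera space under the pinhole model (+z forward).
    d_cam = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    d_world = pose_c2w[:3, :3] @ d_cam
    origin = pose_c2w[:3, 3].copy()
    return origin, d_world / np.linalg.norm(d_world)
```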

Reference

Huber, C. (2016). Editing reality: a tool for interactive segmentation and manipulation of 3D reconstructed scenes [Diploma Thesis, Technische Universität Wien]. reposiTUm. https://doi.org/10.34726/hss.2016.29541