Abstract

With the continued digitalization of the construction industry, BIM (Building Information Modeling) lends itself to ever more use cases. Comparing a BIM model to its real-life twin has traditionally been a manual and often time-consuming task. The BIMCheck project was initiated to automate this comparison. So far, using LiDAR (Light Detection and Ranging) and SLAM (Simultaneous Localization and Mapping), a building and its various rooms can be roughly compared with their BIM model. However, due to the low density of the LiDAR point clouds, small objects of interest such as power sockets, light switches, or emergency installations cannot be reliably recognized and compared to their BIM model counterparts. To capture these smaller objects, this thesis introduces an Azure Kinect component into the modular BIMCheck pipeline. This component records, registers, and post-processes, in real time, multiple unstructured colored point clouds extracted from Azure Kinect frames. While most of the registration is achieved by a good initial guess obtained from LiDAR SLAM, additional refinement through Perspective-n-Point (PnP) is also employed. Because the Azure Kinect generates large amounts of data, the component also features multiple data reduction techniques that keep only the most relevant data while filtering away the rest.

The component is evaluated in terms of registration quality, data reduction rate, and runtime performance. Two types of environments are used to evaluate the registration accuracy: a simple but feature-rich empty hallway, and a larger office complex under normal working conditions. Our component shows robust results in both environments, with the ability to fine-tune parameters depending on the environment and recording procedure. The component overcomes many of the limitations that come with registering and combining Azure Kinect point cloud data. It was written in C++ using PCL and OpenCV.
Together with the LiDAR SLAM component, the system may also be used outside of the construction industry, ranging from scanning archaeological sites to creating 3D models of real-world objects for use in virtual reality applications.
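The data-reduction idea mentioned above can be illustrated with a minimal voxel-grid downsampling sketch: all points falling into the same cubic voxel are replaced by their centroid. This is a plain-C++ illustration of the general technique, not the thesis's PCL-based implementation; all names (`Point`, `voxelDownsample`, `leaf`) are hypothetical.

```cpp
#include <cmath>
#include <cstdint>
#include <unordered_map>
#include <vector>

struct Point { float x, y, z; };

// Pack a voxel's signed integer grid coordinates into one 64-bit key
// (assumes each index fits into 21 bits, i.e. |index| < 2^20).
static std::uint64_t voxelKey(int ix, int iy, int iz) {
    auto enc = [](int v) -> std::uint64_t {
        return static_cast<std::uint64_t>(v + (1 << 20)) & 0x1FFFFF;
    };
    return (enc(ix) << 42) | (enc(iy) << 21) | enc(iz);
}

// Reduce a point cloud by replacing all points that fall into the same
// cubic voxel of edge length `leaf` (in meters) with their centroid.
std::vector<Point> voxelDownsample(const std::vector<Point>& cloud, float leaf) {
    struct Acc { double x = 0, y = 0, z = 0; std::size_t n = 0; };
    std::unordered_map<std::uint64_t, Acc> grid;
    for (const Point& p : cloud) {
        const std::uint64_t key = voxelKey(
            static_cast<int>(std::floor(p.x / leaf)),
            static_cast<int>(std::floor(p.y / leaf)),
            static_cast<int>(std::floor(p.z / leaf)));
        Acc& a = grid[key];
        a.x += p.x; a.y += p.y; a.z += p.z; ++a.n;
    }
    std::vector<Point> out;
    out.reserve(grid.size());
    for (const auto& [key, a] : grid) {
        (void)key;
        out.push_back({static_cast<float>(a.x / a.n),
                       static_cast<float>(a.y / a.n),
                       static_cast<float>(a.z / a.n)});
    }
    return out;
}
```

The leaf size trades off data volume against detail: a smaller leaf preserves small objects such as sockets and switches at the cost of a lower reduction rate.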

Reference

Prohaska, J. (2024). Integrating Azure Kinect data into a real-time planned-vs-built comparison pipeline [Diploma thesis, Technische Universität Wien]. reposiTUm. https://doi.org/10.34726/hss.2024.119303