Real-time Simultaneous Localization and Mapping (SLAM)

This research proposes a versatile mobile platform that builds an incremental 3D map of the environment using an orthogonal pair of Light Detection and Ranging (Lidar) devices. The main advantages of the proposed technique are its high update rate, high output resolution, and effective progress visualization. The technique has the potential to be used in infrastructure surveying applications such as geometric analysis, building maintenance, and energy modeling in unknown environments.
Laser-scanned point clouds are valuable for geometric analysis, building maintenance, and energy modeling, yet obtaining fully registered point clouds is a labor-intensive process when targets or common registration features are absent. We therefore propose a versatile mobile platform that builds an incremental 3D map of the environment in real time using an orthogonal pair of Lidar devices. The horizontal scanner estimates the robot's position and orientation with Simultaneous Localization and Mapping (SLAM) techniques, while the vertical scanner recovers the building structure in the vertical plane. We also developed a real-time point cloud visualization tool that allows an operator to track the mapping progress. The method was evaluated with walk-through laser scans of a complete building floor.
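The core geometric step in this kind of orthogonal-Lidar mapping can be illustrated with a short sketch: each return from the vertical scanner is transformed into the world frame using the 2D pose (x, y, heading) that the horizontal SLAM scanner provides at that instant. The function name and data layout below are illustrative assumptions, not the authors' actual implementation, which is described in the cited papers.

```python
import math

def vertical_scan_to_world(pose, scan):
    """Project a vertical Lidar scan into world-frame 3D points.

    pose: (x, y, theta) -- 2D robot pose estimated by the horizontal
          SLAM scanner (theta is heading in radians).
    scan: list of (range, angle) returns from the vertical scanner,
          with angle measured in the robot's vertical (forward-up) plane.
    Layout and names are a hypothetical sketch, not the paper's API.
    """
    x, y, theta = pose
    points = []
    for r, a in scan:
        # Decompose each return into a forward component and a height.
        fwd = r * math.cos(a)
        z = r * math.sin(a)
        # Rotate the forward component by the robot heading and
        # translate by the robot position to reach the world frame.
        wx = x + fwd * math.cos(theta)
        wy = y + fwd * math.sin(theta)
        points.append((wx, wy, z))
    return points
```

Accumulating these points over a walk-through yields the incremental 3D map; accuracy then hinges entirely on the quality of the horizontal SLAM pose estimate.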


Kim, P., Chen, J., and Cho, Y. (2018). "SLAM-driven robotic mapping and registration of 3D point clouds." Automation in Construction, DOI: 10.1016/j.autcon.2018.01.009.

Kim, P., Chen, J., and Cho, Y. (2017). "Automated Point Clouds Registration using Visual and Planar Features for Construction Environments." ASCE Journal of Computing in Civil Engineering, Volume 32, Issue 2, March 2018, DOI: 10.1061/(ASCE)CP.1943-5487.0000720 [Full text]

Chen, J. and Cho, Y. (2016). "Real-time 3D Mobile Mapping for the Built Environment." International Symposium on Automation and Robotics in Construction (ISARC), Auburn, AL, July 18-21, 2016, DOI: 10.22260/ISARC2016/0028 [Full text]