Location Estimation in Indoor Environments Using a Time-of-Flight Range Camera
by Tobias Kohoutek

Indoor positioning with various technical solutions is omnipresent in industrial and academic research. The most important applications are Location Based Services (LBS), which require objects to be referenced in a coordinate system. Research and development target, for example, the automation of processes in smart warehousing and logistics, or the monitoring of people during rescue missions. Indoor positioning is also highly relevant to robotics and autonomous navigation. The poor performance of Global Navigation Satellite Systems (GNSS) in indoor environments calls for other solutions. Diverse requirements and varying environmental conditions, in particular Non-Line-of-Sight (NLoS) signal propagation, explain the currently insufficient level of performance in indoor positioning and navigation. Wireless devices (e.g. RFID systems) enjoy widespread use in numerous applications, including sensor networks deployed in all kinds of environments and organizing themselves in an ad-hoc fashion. However, knowing the correct positions of the network nodes is an essential precondition for their deployment. Optical sensors, in contrast, do not require the deployment of physical reference infrastructure inside buildings and offer several solutions covering all required accuracy levels.
The aim of this thesis is to apply range images from a Time-of-Flight (ToF) range camera to indoor positioning. Range Imaging (RIM) is a special technique within the spectrum of electro-optical and videometric principles. It is capable of capturing the environment three-dimensionally in real time. Single-camera systems offer high potential for indoor applications. The camera position and its possible movements can be derived after insignificant details have been eliminated. Furthermore, semantic information can be extracted from the purely metrical data using geometric constraints, establishing a connection to the spatio-semantic information of installations and objects in the scene.
This thesis is based on five scientific publications, which are framed by an introduction and a concluding chapter. Publication 1 focuses on the localization and tracking/monitoring of a robot. Publication 2 describes motion detection of people for human-computer interaction. Publications 3 to 5 concentrate on the location estimation of the ToF range camera itself by comparing the observed scene to a spatio-semantic interior building model. Such models can be referenced to any arbitrary coordinate system. The proposed approach can therefore be used for absolute positioning of objects/installations and human operators in real time with centimeter accuracy. However, the camera position relative to the surrounding objects, which are matched against their database models, is derived with decimeter accuracy. Simultaneous Localization And Mapping (SLAM) generates the 3D modeled environments used in the proposed method.