Abstract: Simultaneous localization and mapping is a prevalent method for localization in automated parking applications. However, it requires access to an area for exploration before the automated parking function can be fully used. Since this is inconvenient for parking functions, which often involve entering previously unknown areas, our approach proposes a radar-based localization method primarily for applications inside buildings for which a ground plan is available. This ground plan contains the walls, pillars and parking lots of the building in a two-dimensional, bird's eye view perspective. From the ground plan, a synthesized point cloud is generated and matched against the filtered radar point cloud via a Normal Distributions Transform (NDT) algorithm. The resulting pose measurements are fused with odometry measurements in a factor graph. This architecture can process independent, asynchronously arriving data in parallel and can easily be extended, e.g., with camera data. We outline our pipeline and show in experiments that it serves as a solid basis that competes with state-of-the-art localization algorithms. Some drawbacks, e.g., the noisiness of the radar data at low speed or at standstill, are discussed. Future work could incorporate camera data to further improve the robustness of this approach.
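As a rough illustration of the scan-to-map matching step described above, the following sketch registers a filtered radar point cloud against a point cloud synthesized from a ground plan using PCL's NormalDistributionsTransform. It is a minimal sketch under assumed conditions: the cloud contents, parameter values, and variable names are illustrative only and do not reflect the authors' configuration.

```cpp
// Minimal NDT scan-to-map matching sketch with PCL.
// All clouds and parameters below are illustrative assumptions.
#include <iostream>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/registration/ndt.h>
#include <Eigen/Core>

int main() {
  using Cloud = pcl::PointCloud<pcl::PointXYZ>;

  // Synthesized map cloud: points sampled along a wall of the ground plan.
  Cloud::Ptr map_cloud(new Cloud);
  for (float x = 0.f; x < 10.f; x += 0.1f)
    map_cloud->push_back(pcl::PointXYZ(x, 0.f, 0.f));

  // Filtered radar point cloud: the same wall observed with a small offset.
  Cloud::Ptr radar_cloud(new Cloud);
  for (float x = 0.f; x < 10.f; x += 0.2f)
    radar_cloud->push_back(pcl::PointXYZ(x + 0.3f, 0.15f, 0.f));

  // Normal Distributions Transform registration (illustrative parameters).
  pcl::NormalDistributionsTransform<pcl::PointXYZ, pcl::PointXYZ> ndt;
  ndt.setTransformationEpsilon(0.01);  // convergence threshold
  ndt.setStepSize(0.1);                // line search step size
  ndt.setResolution(1.0);              // NDT voxel grid resolution [m]
  ndt.setMaximumIterations(35);
  ndt.setInputSource(radar_cloud);
  ndt.setInputTarget(map_cloud);

  // Initial guess, e.g. taken from odometry; identity here.
  Eigen::Matrix4f init = Eigen::Matrix4f::Identity();

  Cloud aligned;
  ndt.align(aligned, init);

  // The resulting transform would serve as the pose measurement
  // that is subsequently fused with odometry in a factor graph.
  std::cout << "converged: " << ndt.hasConverged()
            << ", fitness: " << ndt.getFitnessScore() << "\n"
            << ndt.getFinalTransformation() << std::endl;
  return 0;
}
```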