11 Feb 2021 | 14:00 - 16:00
Guest Lecture

Multi-Sensor Fusion for Autonomous Driving

Autonomous driving imposes stringent requirements on the accuracy, availability, and integrity of positioning. A fusion of multiple complementary and redundant sensors is therefore needed.
This talk focuses on the fusion of GNSS, INS, wheel odometry, camera, and lidar data. We perform a tight coupling of GNSS, INS, and wheel odometry with a Kalman filter. The state parameters of the Kalman filter include the position, velocity, attitude quaternion, quaternion rate, carrier-phase ambiguities, and IMU biases. RTK/PPP corrections are used, and the carrier-phase ambiguities are resolved to integers to achieve maximum precision. Moreover, position and attitude information derived from the visual sensors (camera and lidar) is integrated into the sensor fusion to improve performance in areas with limited or no GNSS signal reception.
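The tightly coupled filter described above can be illustrated with a much-simplified sketch: a linear Kalman filter over a 2D constant-velocity state, updated with a noisy position fix standing in for GNSS. The state vector, motion model, and noise values below are illustrative assumptions, not the actual ANavS filter (which also carries attitude, ambiguity, and bias states).

```python
import numpy as np

def kalman_step(x, P, z, H, R, F, Q):
    """One predict/update cycle of a linear Kalman filter."""
    # Predict: propagate state and covariance with the motion model F, Q
    x = F @ x
    P = F @ P @ F.T + Q
    # Update: correct the prediction with measurement z via the gain K
    y = z - H @ x                     # innovation
    S = H @ P @ H.T + R               # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

dt = 0.1
# Toy state [px, py, vx, vy] -- a stand-in for the full position/velocity/
# attitude/ambiguity/bias state of a real tightly coupled filter
F = np.eye(4)
F[0, 2] = F[1, 3] = dt                # constant-velocity motion model
Q = 1e-3 * np.eye(4)                  # assumed process noise
H = np.eye(2, 4)                      # GNSS-like position measurement
R = 0.25 * np.eye(2)                  # position noise, std 0.5 m

x = np.zeros(4)
P = np.eye(4)
for k in range(50):
    truth = np.array([k * dt * 1.0, 0.0])   # vehicle moving at 1 m/s in x
    z = truth + np.random.default_rng(k).normal(0.0, 0.5, 2)
    x, P = kalman_step(x, P, z, H, R, F, Q)
```

After 50 noisy position fixes, the filter's velocity estimate settles near the true 1 m/s even though velocity itself is never measured; in the real filter, wheel odometry and INS measurements would additionally constrain it.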
We consider three approaches for visual positioning with camera and lidar data. The first is "Robust Visual-Inertial Odometry" (ROVIO), which performs robust feature tracking and derives odometry information from time series of camera images. The second is "Deep Visual Odometry" (DeepVO), which uses deep recurrent convolutional neural networks to determine odometry information from monocular camera images. DeepVO is an end-to-end solution, i.e. it provides odometry information directly from the time series of camera images without classical feature extraction and matching techniques. The third approach is lidar-based "Simultaneous Localization and Mapping" (SLAM); here we consider "Surfel-based Mapping" (SuMa) and its enhanced version SuMa++, which additionally uses semantic information determined by convolutional neural networks.
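At the core of most lidar SLAM front-ends (including surfel-based ones) is the estimation of a rigid transform between consecutive scans. The sketch below shows only that alignment step, using the closed-form Kabsch/Procrustes SVD solution on already-known correspondences; it is not SuMa's surfel pipeline, where correspondences come from projective data association and the alignment is iterated.

```python
import numpy as np

def align_rigid(src, dst):
    """Least-squares rigid transform (R, t) mapping src onto dst.

    src and dst are (N, 3) arrays of corresponding points. In a real
    SLAM front-end the correspondences come from nearest-neighbour
    search or projective data association, and the step is iterated.
    """
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)          # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # Correction term guards against reflections (det = -1 solutions)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])
    R = Vt.T @ D @ U.T
    t = mu_d - R @ mu_s
    return R, t

# Synthetic check: rotate a point cloud 10 degrees about z and shift it
rng = np.random.default_rng(0)
cloud = rng.normal(size=(100, 3))
theta = np.deg2rad(10.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([1.0, -0.5, 0.2])
moved = cloud @ R_true.T + t_true
R_est, t_est = align_rigid(cloud, moved)
```

With exact correspondences the recovered rotation and translation match the applied ones to numerical precision; chaining such scan-to-scan transforms yields the lidar odometry fed into the fusion.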
ANavS GmbH - Advanced Navigation Solutions offers two precise positioning systems for autonomous driving. The first is its Multi-Sensor RTK module, with a tight coupling of GNSS, INS, and wheel-odometry measurements. The second is the ANavS Integrated Sensor Platform (ISP), with three integrated GNSS receivers, an IMU, a CAN odometry interface, two cameras, a 3D lidar, an LTE module for the reception of RTK/PPP corrections, and two powerful processors for performing the sensor fusion. All sensors are integrated into a single platform, and the respective raw data are synchronized, time-stamped, calibrated, and accessible.
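The time-stamping and synchronization mentioned above is what lets heterogeneous streams be fused at all: each sensor runs at its own rate, so measurements must be brought onto a common time base. A minimal illustrative sketch, assuming simple linear interpolation of a high-rate IMU signal onto lower-rate camera frame times (the sensor rates and signal are invented for the example, not ANavS specifications):

```python
import numpy as np

def resample_to(query_t, sensor_t, sensor_vals):
    """Linearly interpolate a 1D sensor stream onto query timestamps.

    A minimal stand-in for multi-sensor time alignment: high-rate
    samples are interpolated onto another sensor's timestamps so the
    fusion sees all measurements on a common time base.
    """
    return np.interp(query_t, sensor_t, sensor_vals)

# IMU at 200 Hz, camera at 20 Hz (timestamps in seconds)
imu_t = np.arange(0.0, 1.0, 0.005)
imu_gyro_z = np.sin(2 * np.pi * imu_t)     # fake yaw-rate signal
cam_t = np.arange(0.0, 1.0, 0.05)
gyro_at_frames = resample_to(cam_t, imu_t, imu_gyro_z)
```

Real platforms additionally estimate and compensate fixed latencies per sensor, but the resampling-to-a-common-clock idea is the same.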

Speaker

Dr.-Ing. Patrick Henkel (Institute for Communications and Navigation, TUM)

Organizer

GRK 2159 (i.c.sens)

Date

11 Feb 2021
14:00 - 16:00

Contact

Dr. rer. nat. Katja Lohmann
GRK 2159 (i.c.sens)
lohmann@ife.uni-hannover.de

Location

online