TY  - CHAP
A1  - Agrawal, Shiva
A1  - Song, Rui
A1  - Doycheva, Kristina
A1  - Knoll, Alois
A1  - Elger, Gordon
ED  - Klein, Cornel
ED  - Jarke, Matthias
ED  - Ploeg, Jeroen
ED  - Helfert, Markus
ED  - Berns, Karsten
ED  - Gusikhin, Oleg
T1  - Intelligent Roadside Infrastructure for Connected Mobility
T2  - Smart Cities, Green Technologies, and Intelligent Transport Systems: 11th International Conference, SMARTGREENS 2022 and 8th International Conference, VEHITS 2022: Revised Selected Papers
UR  - https://doi.org/10.1007/978-3-031-37470-8_6
Y1  - 2023
UR  - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:bvb:573-41761
SN  - 978-3-031-37470-8
SN  - 1865-0937
SP  - 134
EP  - 157
PB  - Springer
CY  - Cham
ER  - 
TY  - JOUR
A1  - Senel, Numan
A1  - Kefferpütz, Klaus
A1  - Doycheva, Kristina
A1  - Elger, Gordon
T1  - Multi-Sensor Data Fusion for Real-Time Multi-Object Tracking
JF  - Processes
N2  - Sensor data fusion is essential for environmental perception within smart traffic applications. Using multiple sensors cooperatively increases the accuracy and reliability of the perception, which is crucial in critical traffic scenarios and under bad weather conditions. In this paper, a modular real-time-capable multi-sensor fusion framework is presented and tested to fuse data at the object-list level from distributed automotive sensors (cameras, radar, and LiDAR). The modular multi-sensor fusion architecture receives an object list (untracked objects) from each sensor. The fusion framework combines classical data fusion algorithms: it contains a coordinate transformation module, an object association module (Hungarian algorithm), an object tracking module (unscented Kalman filter), and a movement compensation module. Owing to the modular design, the fusion framework is adaptable and does not depend on the number or type of sensors; moreover, it continues to operate in the case of an individual sensor failure, which is an essential feature for safety-critical applications. The architecture targets environmental perception in challenging, time-critical applications. The developed fusion framework is tested using simulation and public-domain experimental data. With the developed framework, sensor fusion is achieved in well under 10 milliseconds of computing time on an AMD Ryzen 7 5800H mobile processor using the Python programming language. Furthermore, the object-level multi-sensor approach enables the detection of changes in the extrinsic calibration of the sensors and of potential sensor failures. A concept was developed to use the multi-sensor framework to identify sensor malfunctions. This feature will become extremely important for ensuring the functional safety of the sensors in autonomous driving.
UR  - https://doi.org/10.3390/pr11020501
KW  - environmental perception
KW  - sensor fusion
KW  - autonomous vehicle
KW  - unscented Kalman filter
KW  - object tracking
KW  - roadside units
Y1  - 2023
UR  - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:bvb:573-31989
SN  - 2227-9717
VL  - 11
IS  - 2
PB  - MDPI
CY  - Basel
ER  - 
TY  - JOUR
A1  - Agrawal, Shiva
A1  - Bhanderi, Savankumar
A1  - Doycheva, Kristina
A1  - Elger, Gordon
T1  - Static multi-target-based auto-calibration of RGB cameras, 3D Radar, and 3D Lidar sensors
JF  - IEEE Sensors Journal
UR  - https://doi.org/10.1109/JSEN.2023.3300957
KW  - Autonomous vehicles
KW  - camera
KW  - feature extraction
KW  - intelligent roadside infrastructure
KW  - lidar
KW  - radar
KW  - sensor calibration
Y1  - 2023
SN  - 1530-437X
VL  - 23
IS  - 18
SP  - 21493
EP  - 21505
PB  - IEEE
CY  - Piscataway
ER  - 