TY - JOUR
A1 - Haas, Lukas
A1 - Haider, Arsalan
A1 - Kastner, Ludwig
A1 - Zeh, Thomas
A1 - Poguntke, Tim
A1 - Kuba, Matthias
A1 - Schardt, Michael
A1 - Jakobi, Martin
A1 - Koch, Alexander W.
A2 - Angelini, Federico
T1 - Velocity Estimation from LiDAR Sensors Motion Distortion Effect
T2 - Sensors
N2 - Many modern automated vehicle sensor systems use light detection and ranging (LiDAR) sensors. The prevailing technology is scanning LiDAR, in which a collimated laser beam illuminates objects sequentially, point by point, to capture 3D range data. In current systems, the point clouds from LiDAR sensors are mainly used for object detection. Estimating the velocity of an object of interest (OoI) in the point cloud requires object tracking or sensor data fusion. Scanning LiDAR sensors exhibit the motion distortion effect, which occurs when objects move relative to the sensor. This effect is often filtered out using sensor data fusion so that an undistorted point cloud can be used for object detection. In this study, we developed a method that uses an artificial neural network to estimate an object’s velocity and direction of motion in the sensor’s field of view (FoV) from the motion distortion effect, without any sensor data fusion. The network was trained and evaluated on a synthetic dataset featuring the motion distortion effect. With the method presented in this paper, the velocity and direction of an OoI that moves independently of the sensor can be estimated from a single point cloud captured by a single sensor. The method achieves a root mean squared error (RMSE) of 0.1187 m s⁻¹ and a two-sigma confidence interval of [−0.0008 m s⁻¹, 0.0017 m s⁻¹] for the axis-wise estimation of an object’s relative velocity, and an RMSE of 0.0815 m s⁻¹ and a two-sigma confidence interval of [0.0138 m s⁻¹, 0.0170 m s⁻¹] for the estimation of the resultant velocity. The extracted velocity information (4D-LiDAR) is available for motion prediction and object tracking and can yield more reliable velocity data through added redundancy for sensor data fusion.
KW - LiDAR sensor
KW - deep learning
KW - motion distortion effect
KW - point cloud
KW - advanced driver assistance systems
KW - highly automated driving
KW - velocity estimation
Y1 - 2023
UR - https://opus4.kobv.de/opus4-hs-kempten/frontdoor/index/index/docId/2079
SN - 1424-8220
VL - 23
IS - 23
SP - 1
EP - 16
PB - MDPI
CY - Basel, Switzerland
ER -