Computer Vision News - August 2022

See Eye to Eye

To perform the comparison, the metric used is the average precision (AP) over 40 recall positions, at 0.7 and 0.5 IoU (Intersection over Union) thresholds, for both BEV (bird's-eye view) and 3D boxes. In the first task (nuScenes -> KITTI), SEE closes the performance gap by 39.61 AP for SECOND-IoU and 24.49 AP for PV-RCNN in 3D AP. In the second task (Waymo -> KITTI), SEE closes the performance gap by 53.60 AP for SECOND-IoU and 37.85 AP for PV-RCNN in 3D AP. Here, the authors' approach outperforms ST3D with both SECOND-IoU and PV-RCNN at both IoU thresholds. A small sketch of how this AP metric can be computed is included at the end of this article.

The table below shows the results of the last experiment on the new dataset, where green highlights the performance increase of models trained with SEE over the Source-only method. The lower performance on the nuScenes -> Baraja Spectrum-Scan™ task suggests that the domain gap between these two datasets is larger, owing to the difference in scan patterns between the two lidars (Figure 3). This difference is drastically reduced for the Waymo dataset, which achieves much better performance with both the Source-only and SEE methods.

The authors state that the gap between SEE and SEE-Ideal can be closed if the misalignment between the camera and lidar viewpoints is minimised, as this reduces background points. Unfortunately, SEE doesn't run in real time, but it seems to be the only method available that does not require retraining on new lidars, especially those with an adjustable scan pattern, which appears to be a large component of the domain gap. From the results and ablation studies, and given the nature of the approach, the authors conclude that SEE performs better with more points on the object (which may in any case be the general future direction of lidar manufacturing) and with better surface completion methods.

We are already at the end of this one too! See you next month. If you have any suggestions for future articles or questions about the topics discussed, feel free to contact me.
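As promised above, here is a minimal sketch of the KITTI-style average precision over 40 recall positions mentioned in the evaluation. This is not the authors' evaluation code: it assumes that detections have already been matched to ground-truth boxes at a chosen IoU threshold (e.g. 0.7 or 0.5 for 3D or BEV boxes), and the function name and inputs are illustrative only.

```python
import numpy as np

def ap_r40(scores, tp_flags, num_gt):
    """Average precision sampled at 40 recall positions.

    scores   : confidence score per detection
    tp_flags : 1 if the detection matched a ground-truth box at the
               chosen IoU threshold, 0 otherwise
    num_gt   : total number of ground-truth boxes
    """
    order = np.argsort(-np.asarray(scores))      # rank detections by confidence
    tp = np.asarray(tp_flags, dtype=float)[order]
    fp = 1.0 - tp

    cum_tp = np.cumsum(tp)
    cum_fp = np.cumsum(fp)
    recall = cum_tp / max(num_gt, 1)
    precision = cum_tp / np.maximum(cum_tp + cum_fp, 1e-9)

    # Sample precision at 40 equally spaced recall positions (1/40 ... 1.0),
    # taking the maximum precision at or beyond each recall level.
    ap = 0.0
    for r in np.linspace(1.0 / 40, 1.0, 40):
        mask = recall >= r
        ap += precision[mask].max() if mask.any() else 0.0
    return ap / 40

# Toy example: 5 detections evaluated against 4 ground-truth boxes
print(ap_r40([0.9, 0.8, 0.7, 0.6, 0.5], [1, 1, 0, 1, 0], num_gt=4))
```

The per-detection matching step (computing BEV or 3D IoU against ground truth) is left out here; only the recall-sampled averaging that defines the reported AP numbers is shown.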
