Computer Vision News - June 2023

Data-driven Feature Tracking for Event Cameras

In this paper, Nico and Mathias explore using event cameras in combination with frame cameras to achieve robust feature tracking in sequential images. Feature tracking is crucial for various applications. The Robotics and Perception Group focuses on SLAM algorithms and pose estimation using cameras placed on robots. The researchers also work with drones, whose high-speed motion and maneuvers require robust feature tracks to compute the pose using SLAM and visual odometry (VO) backends.

“Robust feature tracking is often called the front end of visual odometry pipelines,” Mathias tells us. “These VO pipelines are the foundation of mobile robotics because they’re required for the control algorithms in the robots to tell the robot where it needs to go or what it needs to do. If you don’t have access to these visual feature tracks, it’s almost impossible to tell the robot what it needs to do from its own perception.”

Previous work in this domain has predominantly relied on standard image cameras, such as video cameras, which have some drawbacks. Existing trackers based on standard camera images are affected by issues such as motion blur in high-speed scenarios, resulting in a loss of scene structure. Additionally, the frame rate of standard image cameras is typically limited to around 20 fps. This work proposes the inclusion of an event camera alongside a traditional camera, which offers higher temporal resolution and enhances the robustness of the feature tracks. Event cameras take their cue from biological vision: the human eye does not act on everything it senses, but if it deems it relevant or significant, it uses that information downstream. Similarly, event cameras exclusively process changes and ignore static scenes to avoid processing redundant information.
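To make "processing only changes" concrete, here is a minimal, illustrative sketch of the standard event-generation model, not the authors' code: a pixel fires an event only when its log-brightness change exceeds a contrast threshold, so static regions produce no data at all. The function name, array shapes, and threshold value are assumptions chosen for illustration.

```python
import numpy as np

def events_from_brightness_change(prev_frame, curr_frame, t, contrast=0.2):
    """Emit (x, y, t, polarity) events where |delta log I| exceeds the threshold."""
    log_prev = np.log(prev_frame.astype(np.float32) + 1e-3)
    log_curr = np.log(curr_frame.astype(np.float32) + 1e-3)
    delta = log_curr - log_prev
    ys, xs = np.nonzero(np.abs(delta) > contrast)   # only changed pixels fire
    polarity = np.sign(delta[ys, xs]).astype(np.int8)
    return [(int(x), int(y), t, int(p)) for x, y, p in zip(xs, ys, polarity)]

# A static scene yields an empty event list, while fast motion yields many
# finely timestamped events -- far denser in time than ~20 fps frames.
```

Real event cameras apply this principle asynchronously and independently at each pixel in hardware, which is what gives them the high temporal resolution the article describes.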
