Bay Vision - Spring 2018

AEye develops advanced vision hardware, software and algorithms that act as the eyes and visual cortex of autonomous vehicles. Its intelligent sensing platform, iDAR (Intelligent Detection and Ranging), combines an agile, MOEMS-based LiDAR, pre-fused with a low-light camera, and embedded artificial intelligence to create a machine perception platform.

Jordan Greene, lead strategist and one of the co-founders of AEye, tells us that their story began when CEO Luis Dussan, inspired by the DARPA Grand Challenge, set out to build a new solution focused on exceeding the performance of the human eyes and visual cortex. Legacy systems have architectures that collect data and information inefficiently. The goal of iDAR was to target a specific type of data set by emulating how the human eyes and visual cortex pre-process and add intelligence to information before it is passed to the executive functions.

AEye built a hardware architecture around what they deemed the two primary sensors in autonomous vehicles today, a camera and a LiDAR: the two sensors that allow you to close the path-planning loop fast enough to provide autonomous navigation. They began using those two technologies in a novel way, utilising embedded AI: unique algorithms that allow the two systems to work together and optimise data collection.

How did they know this would work? Jordan thinks their CEO's credentials help explain. Luis Dussan earned his bachelor's degree in electrical engineering and computer science, a Master's in optics and photonics, a second Master's in quantum optics, and began PhD work in computational physics, which he set aside to start this venture. That background made him very good at simulating real-life phenomena, mainly light-matter interaction.
His background enabled him to assemble a simulated system in the virtual world and prove that it would in fact work. He used seed funding to build the first prototype, which demonstrated what the simulation had initially projected. Jordan explains that the novelty of their approach was that they were addressing a system-level problem, deriving their hardware architecture to deliver a specified, intelligent data set. The goal was two-fold: never miss anything, and devote additional attention to critical areas or objects. What they were enabling for the autonomous vehicle path-planning
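That two-fold goal can be pictured as a scan-scheduling problem: guarantee a baseline pass over every region of the scene, then spend any remaining sensing budget revisiting regions flagged as critical. The sketch below is a hypothetical illustration of that idea, not AEye's actual algorithm; the function name, region IDs and round-robin policy are all assumptions made for the example.

```python
# Hypothetical sketch of foveated scan scheduling (not AEye's algorithm):
# every region is scanned once per frame ("never miss anything"), and any
# leftover budget is spent revisiting camera-flagged regions.

def schedule_scans(regions, flagged, budget):
    """Return the ordered list of region IDs to scan this frame.

    regions -- all region IDs; each is scanned at least once
    flagged -- region IDs marked critical; they receive extra revisits
    budget  -- total scans available this frame (>= len(regions))
    """
    order = list(regions)          # baseline pass: full coverage
    flagged = [r for r in flagged if r in set(regions)]
    extra = budget - len(order)
    i = 0
    # Spend remaining budget on flagged regions, round-robin.
    while extra > 0 and flagged:
        order.append(flagged[i % len(flagged)])
        i += 1
        extra -= 1
    return order

plan = schedule_scans(regions=["A", "B", "C", "D"],
                      flagged=["B", "D"], budget=8)
# Every region appears at least once; B and D get extra attention.
```

The camera here plays the role the article describes for iDAR's pre-fusion: it supplies the "flagged" list that biases where the agile LiDAR spends its attention.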

RkJQdWJsaXNoZXIy NTc3NzU=