Mobileye, an Intel subsidiary, is working on a system for autonomous vehicles that relies solely on cameras mounted on the body of the car, with no LiDAR (light detection and ranging). A demo of how it works in a real-life scenario was shown at CES 2020 in Las Vegas earlier this month.
The results are impressive, to say the least. A Ford sedan, generically dubbed the Autonomous Driving Test Vehicle, completed a drive that would typically take 30 minutes in just 21 minutes, on the crowded, often-unsignaled streets of Jerusalem. You can see the demo in the video at the bottom of the page.
Tests of AVs in crowded urban environments have been done before, with similarly impressive results. But most AVs use a combination of cameras, radar, ultrasonic sensors, and LiDAR to map out the surrounding environment and operate. Mobileye's system uses just 12 cameras.
The feed from the cameras, mounted at strategic points on the body of the car, is turned into a 3D model of the environment using "a chain of algorithmic redundancies based on multiple computer vision engines and deep networks." The environment is also shown on a display in the car, so the safety operator can compare it with what they are seeing and step in to take over from the computer if what the AI thinks it is seeing is wrong.
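Mobileye hasn't published the internals of that redundancy chain, but the basic idea of cross-checking several independent vision engines can be illustrated with a toy sketch. Everything below (the `Box` type, the `fuse` function, the voting threshold) is a hypothetical simplification for illustration, not Mobileye's actual pipeline: a detection is accepted only when multiple independent engines report an overlapping bounding box.

```python
# Toy sketch of "algorithmic redundancy" across independent vision engines.
# NOT Mobileye's implementation -- all names and thresholds are illustrative.
from dataclasses import dataclass

@dataclass(frozen=True)
class Box:
    """Axis-aligned 2D bounding box from one detection engine."""
    x1: float
    y1: float
    x2: float
    y2: float

def iou(a: Box, b: Box) -> float:
    """Intersection-over-union: how much two boxes overlap (0..1)."""
    ix1, iy1 = max(a.x1, b.x1), max(a.y1, b.y1)
    ix2, iy2 = min(a.x2, b.x2), min(a.y2, b.y2)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a.x2 - a.x1) * (a.y2 - a.y1)
    area_b = (b.x2 - b.x1) * (b.y2 - b.y1)
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def fuse(engine_outputs, min_votes=2, iou_thresh=0.5):
    """Keep a detection only if at least `min_votes` engines report an
    overlapping box, suppressing duplicates of already-accepted boxes."""
    fused = []
    for box in (b for out in engine_outputs for b in out):
        # Count how many engines independently corroborate this box.
        votes = sum(
            any(iou(box, other) >= iou_thresh for other in out)
            for out in engine_outputs
        )
        already_kept = any(iou(box, f) >= iou_thresh for f in fused)
        if votes >= min_votes and not already_kept:
            fused.append(box)
    return fused

# Two engines agree on one object; a third reports a spurious box,
# which fails the vote and is discarded.
result = fuse([
    [Box(0, 0, 10, 10)],    # engine 1
    [Box(1, 1, 10, 10)],    # engine 2, slightly offset but overlapping
    [Box(50, 50, 60, 60)],  # engine 3, spurious detection
])
```

The point of the redundancy is that a single engine's false positive (or miss) is caught by disagreement with the others, which is the safety argument behind running several independent perception chains at once.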
During the 21-minute ride, the car came across “roundabouts, unprotected turns, narrow streets, close maneuvers, lots of pedestrians, and pretty much anything you can think of.” That included a parked car pulling out without signaling and pedestrians jaywalking, and the camera system was enough for the AI to correctly interpret the environment and drive the car without incident, sometimes at speeds of 64 kph / 40 mph.
Before being acquired by Intel, Mobileye worked with Tesla, until a disagreement over a Florida accident, in which a Tesla on Autopilot drove into a tractor-trailer, led to their parting ways. Elon Musk also believes that LiDAR is cumbersome and less efficient than a combination of cameras and excellent software to interpret their feed.
However, as The Verge also points out, there is a caveat to this otherwise-impressive video. It never mentions whether testing had been done beforehand on this specific route with the same vehicle and technology, or whether remote assistance was provided.