Have you ever dreamed of owning a talking car? It may sound far-fetched, but one company wants to develop a system that lets vehicles communicate with pedestrians.
The idea comes from Drive.ai, a startup that intends to build a system allowing autonomous cars to interact with humans. According to Carol Reiley, the company's co-founder and president, people will trust self-driving vehicles only if they can understand what the driverless cars are doing.
Evidently, the startup has thought beyond the usual communication that happens on the road, such as turn signals, and has already ruled out the conventional horn for its plan.
Instead, Drive.ai wants to build a system that uses emoji and computer-generated sounds to show the humans around the vehicle what the self-driving system is about to do. The company will pioneer the system on its own fleet of autonomous vehicles.
Among the examples Reiley cited are situations where human drivers would flash their lights, honk their horns, turn on the hazard lights, or make hand gestures. Some might also nod their heads, or lower the window and talk to another road user.
Since most of these gestures can mean different things depending on who makes them, and implementing them in a driverless car is close to impossible, Drive.ai instead wants its cars to display emoji and play warning sounds whenever they detect that people are confused by their presence. The company has not yet detailed where the emoji will be displayed or what range of emoji will be available.
As The Verge notes, the first prototype from Drive.ai will be a retrofit kit for existing vehicles. The package will be installed by the company and adapted to each make and model. At first, it will be offered only to fleet companies, and it will be used to allow the machine to learn from human interaction.
Drive.ai already has a license to test autonomous cars in California, and it received $12 million in funding last year to support development. The next phase of the plan will involve "deep learning."
The term describes artificial intelligence that teaches itself from previous experience. The same will happen with Drive.ai's first self-driving cars, and the learning will continue over the course of the project.