CES 2019 is now open and filled with either wonders or horrors, depending on how you perceive the ultra-connected future that companies keep shoving in our faces. Autonomous cars, walking cars, jumping cars, AI, VR, all came together in Las Vegas to paint a picture of a future that is not at all far-fetched.
For cars, the future translates into digital, connected and autonomous. But how will humans perceive these future cars that drive and tend to themselves?
Various companies have already begun working on ways to make the experience of riding in an autonomous car as engaging as possible. Apple, for instance, plans to take humans on a double trip: as their bodies travel in the car, their minds can roam elsewhere by means of virtual reality headsets.
Coming up with ways to keep humans interested and connected during an autonomous drive will probably be the main focus of most auto companies in the coming years.
Kia is planning to keep humans not only interested and connected, but also happy. At CES, the South Koreans are showing the Real-time Emotion Adaptive Driving technology, or READ.
READ is an artificial intelligence system developed to read human emotions, understand them, and then act in accordance with them.
The system uses a variety of sensors to decipher the occupant’s emotional state. It can read their facial expressions, heart rate, and electrodermal activity. Depending on its interpretation of those readings, it can modify the interior environment in an attempt to create “a more joyful mobility experience.”
Kia is showing the tech at CES in three configurations. The first is READ Me, a one-person cockpit that can be modified by the system to include sounds and fragrances tailored to the occupant’s mood.
The second is a two-person cockpit called READ Now. This is designed as an autonomous tour car that makes suggestions on route choice and in-car entertainment based on an analysis of the driver’s mood.
The third is READ Motion, a four-person cockpit created as the workspace of a Kia executive.
Kia did not say when or if it plans to introduce this system or parts of it in production cars.