We need self-driving cars that can monitor our attention along with our emotions

Last month, for the first time, an autonomous vehicle killed a pedestrian. The death of Elaine Herzberg, struck by a self-driving Uber vehicle in Arizona, has caused a crisis of conscience in the autonomous vehicle industry. Now, engineers and startups are scrambling to shift the focus to technology they say could help prevent future self-driving collisions, especially as more and more autonomous vehicles are expected to hit the road.
One of these startups is Renovo Auto, a Silicon Valley company that has developed an operating system integrating all the software needed to run a fleet of autonomous vehicles. You might remember the Renovo Coupe, a $529,000 electric supercar with 1,000 lb-ft of torque and a 0-60 mph time of 3.4 seconds, or, more recently, the company's project to convert a DeLorean to an electric powertrain and then have it do autonomous donuts.
"Automated mobility has a huge human component."
Now, Renovo is highlighting its work on a self-driving system that not only monitors the driver's attention but can also read the facial expressions of passengers and pedestrians for a better understanding of the emotions inside and outside the vehicle. The company recently began working with Affectiva, an artificial intelligence company, to integrate this technology into its fleet of test vehicles. The point is to "build trust" between passengers and the technology driving the car, Renovo CEO Chris Heiser said.
"We spent a lot of time trying to figure out how to detect inanimate objects with LIDAR and cameras, and that's super important," Heiser said. "But automated mobility has a large human component, and companies like Affectiva give us a new stream of data to observe and help each of the people in our ecosystem: people who are building self-driving systems, people who are building teleoperations, people who are building ride-hailing apps. Everyone wants to know how people feel and how they react to these automated vehicles."
Affectiva's technology works both as a driver-monitoring tool, ensuring that safety drivers keep their eyes on the road even while the software drives the vehicle, and as an emotion tracker to ensure that robot-taxi passengers feel safe during their autonomous rides. Using deep learning algorithms, Affectiva trained its software to read emotional reactions by studying a wide range of people of all ages and ethnic backgrounds, Heiser said. Renovo then integrates Affectiva's application into its operating system, which gives it access to any of the cameras inside or outside the car.
How Affectiva determines human emotions is quite interesting. According to the company's website:
Computer vision algorithms identify key landmarks on the face, for example, the corners of the eyebrows, the tip of the nose, and the corners of the mouth. Deep learning algorithms then analyze the pixels in those regions to classify the facial expressions. Combinations of these facial expressions are mapped to emotions.
Affectiva says it can measure seven "emotion metrics": anger, contempt, disgust, fear, joy, sadness, and surprise. It can also track 20 facial-expression metrics. The company offers a software development kit (SDK) and an application programming interface (API) that provide emoji, gender, age, ethnicity, and a number of other metrics. There is no word, however, on how effective it would be with someone who is sociopathic, or at least very good at suppressing their emotions.
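To make the final step of the pipeline described above concrete, here is a minimal sketch of mapping detected facial expressions to emotion scores. The expression names and rules are hypothetical illustrations; Affectiva's actual classifiers are learned from data, not hand-written rules like these.

```python
# Toy sketch of the "expression combinations are mapped to emotions" step.
# Expression names and rules below are hypothetical, not Affectiva's.
EMOTION_RULES = {
    "joy":      {"smile"},
    "surprise": {"brow_raise", "jaw_drop"},
    "anger":    {"brow_furrow", "lip_press"},
    "disgust":  {"nose_wrinkle", "upper_lip_raise"},
}

def map_expressions_to_emotions(expressions):
    """Score each emotion as the fraction of its rule's
    required expressions that were actually detected."""
    detected = set(expressions)
    return {
        emotion: len(required & detected) / len(required)
        for emotion, required in EMOTION_RULES.items()
    }

scores = map_expressions_to_emotions(["brow_furrow", "lip_press"])
# "anger" scores 1.0 here; the unrelated emotions score 0.0
```

A production system would emit continuous confidences per video frame rather than set-membership fractions, but the shape of the output, a score per emotion metric, is the same idea.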
A vehicle that can detect whether a passenger is afraid could slow down, or dim the lights if it senses discomfort or frustration, Heiser said. More importantly, with a camera pointed at the safety driver, Renovo can tell if that person is tired or distracted and issue the appropriate alerts or warnings to ensure attention stays on the road. And that's where the Renovo-Affectiva collaboration might have prevented last month's fatal Uber collision.
Dashcam footage released by Tempe police showed the Uber safety driver looking down for several seconds before the crash. A driver-monitoring system like the one Renovo proposes could have prompted the driver to look up, perhaps with enough time to avoid colliding with Herzberg.
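The escalating-warning behavior described here can be sketched as a simple policy keyed to how long the driver's eyes have been off the road. The thresholds and alert names are hypothetical, not Renovo's or Affectiva's actual parameters.

```python
# Toy driver-monitoring alert policy; thresholds are illustrative only.
def attention_alert(seconds_eyes_off_road):
    """Escalate warnings the longer the safety driver looks away."""
    if seconds_eyes_off_road < 1.0:
        return None               # brief glances are normal
    if seconds_eyes_off_road < 3.0:
        return "chime"            # gentle audio reminder
    return "urgent_warning"       # sustained distraction

attention_alert(0.5)  # -> None
attention_alert(2.0)  # -> "chime"
attention_alert(5.0)  # -> "urgent_warning"
```

In the Tempe case, the driver reportedly looked down for several seconds, which under a policy of this shape would have crossed the highest escalation tier well before impact.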

Image: Renovo Auto

Of course, Renovo is not the only company working on driver-monitoring tools; many are available in production models on the road today. The most prominent is Cadillac's semi-autonomous "hands-free" Super Cruise system, which uses infrared cameras mounted on the steering column to track the driver's attention and ensure they stay focused on the road. And the new Subaru Forester comes with facial recognition technology to help detect driver fatigue. But Heiser said such systems can take years to bring to production, while Renovo's work with Affectiva could be ready much sooner.
"Working with pure software and putting it on a platform gives us a lot of speed in deployment," he said. "And it also means you can take things like Affectiva and integrate them directly with the self-driving system or the teleoperation system or the content delivery system. That's something we can demonstrate in a matter of days or weeks."
Heiser said he wants Renovo to be for self-driving cars what Amazon Web Services is for cloud computing. The company has created software middleware to help other companies pull together all the pieces of autonomous technology. Its operating system, called AWare, allows a fleet of autonomous vehicles to handle huge amounts of sensor data. Renovo recently started working with smartphone giant Samsung to help test, develop, and deploy driverless cars.
"We are the operating system for automated mobility," Heiser said.
