A dashboard robot with expressive cartoon eyes, now in the works at MIT, may someday help you avoid traffic jams, remind you to pick up the milk, and help you have a great night out. This Affective Intelligent Driving Agent (AIDA) will do that by learning from your expressions and your driving habits.
Researchers say that after a week of use, AIDA will know your regular stops. After a month, it can suggest the optimal route home via your favorite grocery store because it knows you are leaving work and that this is the day you shop.
You can watch a YouTube video that profiles this “personal driving companion who understands your personal driving habits, frequent destinations, and the city environment.”
AIDA is a project of MIT’s Personal Robots Group and the SENSEable City Lab in collaboration with the Volkswagen Group of America’s Electronics Research Lab.
“AIDA builds on our long experience in building sociable robots,” says Prof. Cynthia Breazeal SM ’93, ScD ’00, director of the Personal Robots Group. “We are developing AIDA to read the driver’s mood from facial expression and other cues and respond in a socially appropriate and informative way.”