
What about automated driving?

Posted by Albert Gareev on Apr 22, 2010 | Categories: Links

Imagine an intelligent and emotional robot sitting next to you, helping you drive through city traffic, and even remembering your driving habits. Does it sound like sci-fi to you? 

Well, not anymore.

Meet AIDA – Affective Intelligent Driving Agent

AIDA communicates with the driver through a small robot embedded in the dashboard. “AIDA builds on our long experience in building sociable robots,” explains professor Cynthia Breazeal, director of the Personal Robots Group at the MIT Media Lab. “We are developing AIDA to read the driver’s mood from facial expression and other cues and respond in a socially appropriate and informative way.”

AIDA communicates in a very immediate way: with the seamlessness of a smile or the blink of an eye. Over time, the project envisions that a kind of symbiotic relationship develops between the driver and AIDA, whereby both parties learn from each other and establish an affective bond.

To identify the set of goals the driver would like to achieve, AIDA analyses the driver’s mobility patterns, keeping track of common routes and destinations. AIDA draws on an understanding of the city beyond what can be seen through the windshield, incorporating real-time event information and knowledge of environmental conditions, as well as commercial activity, tourist attractions, and residential areas.

“When it merges knowledge about the city with an understanding of the driver’s priorities and needs, AIDA can make important inferences,” explains Assaf Biderman, associate director of the SENSEable City Lab. “Within a week AIDA will have figured out your home and work location. Soon afterwards the system will be able to direct you to your preferred grocery store, suggesting a route that avoids a street fair-induced traffic jam. On the way AIDA might recommend a stop to fill up your tank, upon noticing that you are getting low on gas,” says Biderman. “AIDA can also give you feedback on your driving, helping you achieve more energy efficiency and safer behavior.”

Source: http://web.mit.edu/press/2009/mit-researchers-develop-affective-intelligent-driving-agent-aida-.html

The AIDA project (Affective, Intelligent Driving Agent), a collaboration between Volkswagen of America and the Massachusetts Institute of Technology (SENSEable City Lab and Personal Robots Group of Media Lab), is a platform comprising a personal robot and an intelligent navigation system that aims to deliver an innovative driving experience. We envision a navigation system that mimics the friendly expertise of a driving companion who is familiar with both the driver and the city. Instead of focusing solely on determining routes to a specified waypoint, our system analyzes driver behavior in order to identify the set of goals the driver would like to achieve. Furthermore, AIDA involves an understanding of the city beyond what can be seen through the windshield, incorporating information such as business and shopping districts, tourist and residential areas, as well as real-time event information and environmental conditions. Functionalities that gather information about driver preferences additionally help AIDA behave more intelligently. One mandatory task for AIDA is to predict the destination of the driver as well as the most likely route that they will follow. This in turn allows for useful reactions from AIDA, such as proposing route alternatives when something unexpected happens on the predicted route, providing the right information at the right time (e.g. a fuel warning before passing by a gas station), or even helping save energy.
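To give a feel for the destination-prediction idea described above, here is a minimal sketch in Python. Everything in it (the `DestinationPredictor` class, the trip format, the time-of-day bucketing) is an illustrative assumption of mine, not part of the actual AIDA system — it simply shows how logged trips could yield a "most likely destination" prediction:

```python
from collections import Counter, defaultdict


class DestinationPredictor:
    """Hypothetical sketch: learn a driver's habits from logged trips."""

    def __init__(self):
        # counts[(origin, time_bucket)] -> Counter of destinations
        self.counts = defaultdict(Counter)

    @staticmethod
    def _bucket(hour):
        # Coarse time-of-day bucket; a real system would be far richer.
        return "morning" if hour < 12 else "evening"

    def record_trip(self, origin, destination, hour):
        self.counts[(origin, self._bucket(hour))][destination] += 1

    def predict(self, origin, hour):
        # Return the historically most frequent destination, if any.
        dests = self.counts[(origin, self._bucket(hour))]
        if not dests:
            return None
        return dests.most_common(1)[0][0]


predictor = DestinationPredictor()
predictor.record_trip("home", "work", 8)
predictor.record_trip("home", "work", 9)
predictor.record_trip("home", "grocery store", 18)

print(predictor.predict("home", 8))   # morning from home -> "work"
print(predictor.predict("home", 18))  # evening from home -> "grocery store"
```

With a prediction like this in hand, the system could then check the predicted route against real-time traffic or the fuel level and volunteer suggestions — which is exactly the kind of inference Biderman describes above.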

Source: http://senseable.mit.edu/aida/

The Personal Robots Group focuses on developing the principles, techniques, and technologies for personal robots. Cynthia and her students have developed numerous robotic creatures ranging from robotic flower gardens, to embedding robotic technologies into familiar everyday artifacts (e.g., clothing, lamps, desktop computers), to creating highly expressive humanoids — including the well-known social robot, Leonardo. Ongoing research includes the development of socially intelligent robot partners that interact with humans in human-centric terms, work with humans as peers, and learn from people as an apprentice. Other projects have explored how HRI can be applied to enhance human behavior as applied to motor learning and cognitive performance. More recent work investigates the impact of long-term HRI applied to communication, quality of life, health, and educational goals. The ability of these robot systems to naturally interact, learn from, and effectively cooperate with people has been evaluated in numerous human subjects experiments, both inside the lab and in real-world environments.

Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported
This work by Albert Gareev is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License.