Aida is a robot built by Mikey Siegel from the MIT Media Lab for a research project at Audi. It is supposed to be a driving companion, something to be installed in your car!
During the summer of 2009, when I was doing an internship at the Intelligent Robotics Group at NASA Ames, I met Mikey for the first time. He was on his way to the Audi Research Center in the heart of sunny Silicon Valley to present the robot he had built for them, but he decided to stop at NASA Ames first to show it to us, because he used to be an intern here at the IRG.
The purpose of the robot is to experiment with the idea of using a robot to influence people's driving behavior. The researchers hope to use the movement of the robot (really just the neck movement), its different facial expressions, and its speech to encourage people to drive more safely. This requires the robot to be able to communicate with humans using many social cues, which was exactly the research topic of the Personal Robots Group at MIT, led by Dr. Cynthia Breazeal, Mikey's advisor.
According to Mikey, the robot was built within a three-day period (I assume he didn't get much sleep), which made all our jaws drop. The lovely head was printed on a 3D printer, and he machined all the mechanical parts himself. However, to be fair to the other members of his lab, he added, the neck design was copied from another project, the animated eyes and mouth movements were created by a friend (if I remember correctly, someone from Pixar), and the software control was a mixture of modules previously developed at MIT and open source libraries such as OpenCV.
When Mikey demoed the robot to us, Aida was able to recognize faces. It became excited when surrounded by many people, and acted bored when left alone. The animated emoticons projected onto the plastic face from the back of the head made the robot look very cute, and the smooth neck movement made it almost appear "alive". At that time, the only sensor it had was a video camera mounted on the base (not moving with the neck or head), but eventually Aida will be equipped with more eyes (cameras) and ears (microphones), so it can better sense the world around it.
Having a cute robot interacting with people in their cars sounds very cool; however, I am not so sure it is such a great idea.
First of all, couldn't the moving robot distract the driver with its cute winks? I couldn't help but remember those signs next to bus drivers I used to see as a young kid: "Do not talk to the driver!" These days, when many states are making it illegal to talk on a cell phone while driving, what would they think of a robot that not only talks to the driver, but also tries to get the driver to look at it?
Secondly, don't you get annoyed sometimes when your better half keeps criticizing your driving skills (or was that just me)? Now imagine a robot nagging constantly right next to your ear like your dear grandma, telling you that you are driving too fast, or that you hit the brakes too hard. Especially after you rear-end someone, I am sure a nagging robot saying "Told you! Told you not to follow so closely!" would be the last thing you want... (Disclaimer: I have never rear-ended anyone!)
On the other hand, for those LA solo commuters who regularly get stuck in traffic for hours (I was recently stuck in LA traffic for hours, so I know!), Aida would make a great driving companion! And I certainly wouldn't mind such a cute robot making conversation with me while my car drives itself to my intended destination!
Video of the Day:
If you were there at the Liverpool Street Station on January 15, 2009, would you have joined in?