Leave me comments so I know people are actually reading my blogs! Thanks!


Tuesday, January 27, 2009

Robot of the Day: MQ-9 Reaper

Since I am really in this UAV mood, let's talk about another UAV (Unmanned Aerial Vehicle) today.

The MQ-9 Reaper (also known as the Predator B) is a scaled-up version of the MQ-1 Predator UAV we discussed in the previous post. It is also designed by General Atomics Aeronautical Systems, only larger, faster, more powerful, and much, much more deadly!

Picture credit: U.S. Air Force

The Reaper was designed to be a hunter-killer, hence the name. It has a wingspan of 66 feet (20.12 m), can fly at an impressive 300 miles per hour, and can stay in the air for up to 30 hours. The Reaper is still propeller-driven, with the propeller at the rear end of the plane. It can carry 3,800 lb of weapons -- for example, 14 Hellfire missiles, or other weapons such as the 500-pound laser-guided bombs shown in the picture above. A U.S. Air Force wing was activated to operate the MQ-9 Reaper on May 1, 2007.

Fully loaded Reaper!





Looks like the U.S. Customs and Border Protection also upgraded their UAVs to the Predator B Reapers!




Picture of the Day:

Disclaimer: This is not my house. Read the story below!

 

"Good news is that I truly out did myself this year with my Christmas decorations. The bad news is that I had to take him down after two days. I had more people come screaming up to my house than ever. Great stories. But two things made me take it down.


First, the cops advised me that it would cause traffic accidents as they almost wrecked when they drove by.

Second, a 55-year-old lady grabbed the 75-pound ladder and almost killed herself putting it against my house, and didn't realize that it was fake until she climbed to the top (she was not happy). By the way, she was one of the many people who attempted to do that. My yard couldn't take it either. I have more than a few tire tracks where people literally drove up onto my yard."

Monday, January 26, 2009

Robot of the Day: MQ-1 Predator

The MQ-1 Predator is probably the most famous UAV (Unmanned Aerial Vehicle) of all time. It was developed by General Atomics Aeronautical Systems for the USAF (U.S. Air Force) and the CIA.

Photo credit: U.S. Air Force

The MQ-1 Predator UAV has a wingspan of 48.7 ft (14.8 m), can fly at a maximum speed of 135 miles per hour, and can stay in the air for 14 hours. An early production unit cost around $3.2 million.

Initially it was only a reconnaissance system, allowing the remote operators to acquire aerial video in real time. After the CIA deployed Predator UAVs to Afghanistan, they expressed a strong desire to add the capability of firing Hellfire missiles from the Predator to kill terrorists. So it was done. On February 4, 2002, a CIA Predator attacked a convoy of sport utility vehicles, killing a suspected al Qaeda leader whom the CIA thought was Osama bin Laden.

The Predator UAV requires a satellite link and is operated by two pilots (most likely in a military base in Nevada) sitting in front of cockpit-like devices. The control of the UAV falls under the tele-operation category because most decisions are made by human operators.

The first video below showcases the capability of the MQ-1 Predator to quickly track down a moving vehicle (note that it is much easier to track a lone moving car in a desert than to track the same car in, say, LA traffic). The second video shows the firing of missiles.






An unknown number of Predator UAVs are also used by U.S. Customs and Border Protection. I would guess these Predators don't shoot missiles at illegal aliens.

Picture of the Day:


Residents in Norway were stunned by the beautiful yet mysterious light show. It turned out to be caused by a malfunctioning Russian missile test. Follow this link to read more.

Tuesday, January 20, 2009

Robot of the Day: UAVs at BYU

Since in the previous post I talked about a BYU UAV demo dry run, I thought it might be a good idea to present some of the UAVs we used at BYU for research purposes.

The research group WiSAR (short for Wilderness Search And Rescue) at BYU consists of faculty and students from three research labs: the MAGICC lab (ME and EE departments), the HCMI lab (CS department), and the Computer Vision lab (CS department). The objective of the research group is to investigate and develop technologies to support wilderness search and rescuers with an Unmanned Aerial Vehicle (UAV).

In the past, we have been using UAVs built by the MAGICC lab students. The UAV in the picture below is named Madre (Spanish for "mother") and was built by the MAGICC lab. Madre retired in 2008 and now simply sits on top of a closet in our lab for display purposes only.


 
Madre: UAV built by BYU MAGICC Lab


Some students in the WiSAR group graduated and then decided to license technologies from BYU and start a local company making UAVs. The company is named Procerus and has been quite successful. So later we simply bought a plane from them. The second picture below shows the current UAV we use. We just called it "The UAV" because we couldn't come up with a good name.


 
UAV built by Procerus. It doesn't have a name. We call it "The UAV".


The fixed-wing UAVs we use in our research are small and light, with wingspans of 42-50 inches. Each weighs about 2 lbs. They are propelled by standard electric motors powered by lithium batteries -- good for up to 2 hours in the air.

The sensors onboard include three-axis rate gyroscopes, three-axis accelerometers, static and differential barometric pressure sensors, a global positioning system module, and a video camera on a gimballed mount. A 900 MHz radio transceiver is used for data communication and an analog 2.4 GHz transmitter is used for video downlink. The autopilot was designed at BYU and built on a small microprocessor. It stabilizes the aircraft's roll and pitch angles and attitude, and also flies the UAV to a desired altitude or waypoint.
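Just to give a flavor of what such an autopilot does, here is a minimal sketch of an inner-loop attitude controller -- my own illustration in Python with made-up gains, not the actual BYU/Procerus autopilot (which runs as firmware on the microprocessor). The idea is simply a PID loop that turns the error between the commanded and measured roll angle into an aileron deflection; similar loops handle pitch, altitude, and heading toward a waypoint.

# Illustrative only: a tiny PID roll-hold loop of the kind a small
# fixed-wing autopilot might run. Gains and limits are hypothetical.
class PID:
    def __init__(self, kp, ki, kd, limit):
        self.kp, self.ki, self.kd, self.limit = kp, ki, kd, limit
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, command, measurement, dt):
        error = command - measurement
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        # Saturate to the physical deflection range of the control surface.
        return max(-self.limit, min(self.limit, out))

# Hypothetical gains, 100 Hz loop: command 0.2 rad of roll, measure 0.15 rad.
roll_pid = PID(kp=0.8, ki=0.05, kd=0.1, limit=0.35)
aileron_deflection = roll_pid.update(command=0.2, measurement=0.15, dt=0.01)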

Each UAV has many autonomous capabilities. For example, it can auto-launch (all you have to do is throw it into the air), auto-land (it spirals down and crash-lands), and if the UAV loses communication with the base, it will automatically return to base and loiter. The video below shows the auto-launching and auto-landing capabilities of Madre.




The gimballed camera onboard the UAV provides a bird's-eye view of the area. Because the UAV can quickly get to hard-to-reach areas and cover a lot of ground quickly, the visual information it provides can help wilderness search and rescuers improve situational awareness and support the search for a missing or injured person. The next video shows the kind of video the operator can see from the ground. (You can skip to the end to see the crash landing.)




Maybe you have noticed from the previous video that video data from the UAV is not easy to use (jitter, disorientation, moving too fast, etc.). That's why our research group developed video mosaicing algorithms that piece video frames together to help with the search task. This method keeps each video frame in sight much longer for the video observer, thus improving detectability.
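For the curious, here is a rough sketch of how frame-to-frame mosaicing can work -- my own simplified illustration using OpenCV, not the WiSAR group's actual algorithm. Each new frame is registered to the growing mosaic with a homography estimated from matched features, then warped into place so earlier ground coverage stays visible.

# Simplified mosaicing sketch (illustrative, not the research code).
import cv2
import numpy as np

def add_frame_to_mosaic(mosaic, frame):
    """mosaic: a large color canvas already containing earlier frames."""
    orb = cv2.ORB_create(1000)
    k1, d1 = orb.detectAndCompute(mosaic, None)
    k2, d2 = orb.detectAndCompute(frame, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(d2, d1), key=lambda m: m.distance)[:100]
    src = np.float32([k2[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k1[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    # Robustly estimate the frame-to-mosaic homography.
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    warped = cv2.warpPerspective(frame, H, (mosaic.shape[1], mosaic.shape[0]))
    # Keep existing mosaic pixels; fill empty areas with the new frame.
    empty = (mosaic.sum(axis=2) == 0)
    mosaic[empty] = warped[empty]
    return mosaic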




We have also developed other automation to assist with the search and rescue work. Examples include automatically suggesting likely places to find a missing person, various automatic path-planning algorithms for the UAV, anomaly detection algorithms, etc. Those will be discussed in a separate blog post in the future.

The video below is a compilation of some other capabilities of the UAVs made by the MAGICC lab, including obstacle avoidance, multiple-UAV formation flight, etc. Too bad the audio track was disabled, but you can leave the music running from the videos above and then watch it in rhythm. :) Note that at the beginning of the video, the UAV was launched from inside the BYU campus. Of course, this is no longer allowed due to tighter FAA rules and regulations!




Picture of the Day:



People have always wanted to roam the sky freely like birds.
I don't, because I've got UAVs.

Thursday, January 15, 2009

Robot of the Day: Aida, Your Driving Companion

[Don't get confused with the dates. You'll find that I frequently travel back and forth through time -- in my blog. :) ]


Aida is a robot built by Mikey Siegel from the MIT Media Lab for a research project at Audi. It is supposed to be a driving companion, something to be installed in your car!

During the summer of 2009, when I was doing an internship at the Intelligent Robotics Group (IRG) at NASA Ames, I met Mikey for the first time. He was on his way to the Audi Research Center, located in the heart of sunny Silicon Valley, to present the robot he had built for them, but decided to stop at NASA Ames first to show us the robot, because he used to be an intern at the IRG.

The purpose of the robot is to experiment with the idea of using a robot to influence people's driving behavior. Researchers hope to use the movement of the robot (really just neck movement), its different facial expressions, and its speech to encourage people to drive more safely. This requires the robot to be able to communicate with humans using many social cues, which is exactly the research topic of the Personal Robots Group at MIT, led by Dr. Cynthia Breazeal, Mikey's advisor.

According to Mikey, the robot was built within a three-day period (I assume he didn't get much sleep), which caused all our jaws to drop. The lovely head was printed on a 3D printer, and he also machined all the mechanical parts himself. However, to be fair to the other members of his lab, he added, the neck design was copied from another project, the animated eyes and mouth movements were created by a friend (if I remember correctly, someone from Pixar), and the software control was a mixture of modules previously developed at MIT and open-source libraries such as OpenCV.

When Mikey demoed the robot to us, Aida was able to recognize faces. It became excited when it was surrounded by many people, and acted bored when it was left alone. The animated emoticons projected onto the plastic face from the back of the head made the robot look very cute, and the smooth neck movement made it almost appear "alive". At that time, the only sensor it had was a video camera mounted on the base (not moving with the neck or head), but eventually, Aida will be equipped with more eyes (cameras) and ears (microphones), so it can sense the world around it better.




Having a cute robot interacting with people in their cars sounds very cool; however, I am not so sure it is such a great idea.

First of all, could it be possible that the moving robot might distract the driver with its cute winks? I couldn't help but remember those signs next to bus drivers I used to see when I was a young kid: "Do not talk to the driver!" These days, when many states are making it illegal to talk on a cell phone while driving, what would they think of a robot that not only talks to the driver, but also tries to get the driver to look at it?

Secondly, don't you get annoyed sometimes when your better half keeps criticizing your driving skills (or was that just me)? Now imagine a robot, nagging constantly right next to your ear like your dear Grandma, telling you that you are driving too fast, or that you hit the brake too hard. Especially after you rear-end someone, I am sure a nagging robot saying "Told you! Told you to not follow so closely" would be the last thing you want.... (Disclaimer: I have never rear-ended anyone!)

On the other hand, for those LA solo commuters who regularly get stuck in traffic for hours (I was recently stuck in LA traffic for hours, so I know!), Aida would make a great driving companion! And I certainly wouldn't mind such a cute robot making conversation with me while my car drives itself to my intended destination!

Video of the Day:

If you were there at the Liverpool Street Station on January 15, 2009, would you have joined in?

Sunday, January 11, 2009

Robot of the Day: G8 Robotic Fish to Detect Water Pollution

British scientists, specifically researchers at the University of Essex, plan to release a bunch of robot fish into the sea off northern Spain to detect pollution. This is part of a three-year research project funded by the European Commission and coordinated by BMT Group Ltd.



These carp-shaped robots look very much like real fish, only big ones (nearly 5 feet long) -- roughly the size of a seal. The tiny chemical sensors installed on these robot fish enable them to find sources of potentially hazardous pollutants in the water.

These robots all have autonomous navigation capabilities, meaning no remote control is needed to direct them. All that is required is to simply "let them loose". Using Wi-Fi technology, the data collected can be transmitted to the port's control center. The battery on each fish lasts approximately 8 hours, and similar to the Roomba vacuum cleaning robots, they are smart enough to return to a "charging hub" to recharge when the battery runs low. The video below demonstrates the swimming capability of such a robot fish, the G8 model. It really swims like a fish!!



The fish can swim at a maximum speed of about one meter per second, which means a fish can venture as far as 14.4 kilometers from the "charging hub" (which I think might be too far for the charging hub to still receive good signals). The cost of building one such robot fish is around £20,000 (roughly $29,000), so it is certainly not cheap. There are also smaller ones created by the same group of researchers, as shown in the video below. I guess these are more suited for a fish tank.
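In case you are wondering where the 14.4 kilometers comes from, here is my own back-of-the-envelope check (assuming the fish swims out and back at top speed on a single charge):

# My own arithmetic, not from the article.
speed_m_per_s = 1.0                  # maximum swimming speed
battery_hours = 8.0                  # battery life per charge
total_km = speed_m_per_s * battery_hours * 3600 / 1000   # 28.8 km of swimming
one_way_range_km = total_km / 2      # 14.4 km out, leaving enough to swim back
print(one_way_range_km)              # 14.4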





So why robot fish? Why not very machine-looking mini-submarines? Rory Doyle, a senior research scientist at BMT Group, said,

"In using robotic fish we are building on a design created by hundreds of millions of years' worth of evolution which is incredibly energy efficient. This efficiency is something we need to ensure that our pollution detection sensors can navigate in the underwater environment for hours on end."


Personally, I think this technology is great because:
1. As stated, using the fish design is very energy efficient.
2. The robots can navigate autonomously, which requires no human interaction.
3. Chemicals dissolved in the water under the surface can be detected.
4. Data can be sent to the data center wirelessly.
5. The fish robots can recharge themselves when needed.
6. The fish form also helps them blend in with the environment (and maybe disguises them from people who intentionally pollute our water).

Now if they were capable of the following, it would be even better:
1. Trace the source of the pollution autonomously (maybe through some heuristic path-planning algorithm -- see the sketch after this list)
2. Take pictures of the pollution source (to help identify/analyze the cause and maybe use them as evidence in a court of law).
3. Somehow obtain energy on their own? Eat seaweed, little fish, or shrimp and generate energy through metabolism?
4. Also, in case of malfunction, is there an easy way to retrieve it? Maybe using another robotic fish?
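To illustrate what the heuristic path planning in item 1 could look like, here is a toy source-seeking sketch of my own (not anything the Essex/BMT researchers have published): the fish keeps sampling the pollutant concentration around itself and greedily moves toward higher readings.

# Toy chemotaxis-style source seeking; purely illustrative.
def seek_source(start, sense, step=5.0, max_steps=200):
    """Greedy hill climbing on a sensed concentration field."""
    x, y = start
    for _ in range(max_steps):
        candidates = [(x + step, y), (x - step, y), (x, y + step), (x, y - step)]
        best = max(candidates, key=sense)
        if sense(best) <= sense((x, y)):
            break          # local maximum -- hopefully the pollution source
        x, y = best
    return x, y

# Hypothetical concentration field peaking at (120, -40).
def sense(p):
    return 1.0 / (1.0 + (p[0] - 120) ** 2 + (p[1] + 40) ** 2)

print(seek_source((0.0, 0.0), sense))   # ends up near (120, -40)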

Every coin has two sides, and there are certainly concerns about this technology too. For example: what if a bigger fish (a shark, say) attacks the robotic fish and treats it as food? I am sure the robot fish won't be easy to digest and might kill the poor (real) fish. Who's responsible for that? And how about the disappointed fisherman who happens to catch the robotic fish?

You can read more about the robotic fish from the following articles:

Article at BMT web site
News Article at Reuters





Sheer willpower, no matter how strong it is, will not make a problem go away.

Wednesday, January 07, 2009

Robot of the Day: Winebot, the Wine Tasting Robot

Researchers at NEC and Mie University, both in Japan, came up with a robot, the Winebot, that is capable of distinguishing good wine from bad, telling you the brand, and suggesting side dishes that will go well with the wine. It is also capable of identifying different cheeses and hors d'oeuvres.

Wait a second. What are hors d'oeuvres? I had never heard of this word before. So I did a Google search, and there is actually a Wikipedia entry for it (including a sound file for pronunciation). Basically, it means appetizer.


Cold Hors d'œuvre

When presented with three apples, without actually taking a bite, the robot was able to separate the sweet one from the two sour ones. So how does the robot do it? There is actually an infrared spectrometer at the end of the robot's left arm. When objects are placed up against the sensor, the robot fires off a beam of infrared light and then analyzes the reflected light in real time. It takes advantage of the fact that different foods reflect different ratios of light at each wavelength, depending on their water, protein, and other content -- kind of like unique "fingerprints" -- in order to distinguish them.
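In other words, the classification can be as simple as comparing the measured spectrum against a library of stored "fingerprints" and picking the closest match. Here is a toy nearest-neighbor sketch of my own (the reference numbers are made up; this is not NEC's algorithm):

# Toy fingerprint matching; reference spectra are invented for illustration.
import numpy as np

references = {
    "sweet apple": np.array([0.62, 0.55, 0.48, 0.40]),
    "sour apple":  np.array([0.58, 0.60, 0.44, 0.37]),
    "prosciutto":  np.array([0.35, 0.42, 0.50, 0.55]),
}

def classify(measured):
    """Return the label whose reference spectrum is closest (Euclidean distance)."""
    return min(references, key=lambda label: np.linalg.norm(references[label] - measured))

print(classify(np.array([0.60, 0.56, 0.47, 0.41])))   # -> "sweet apple"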



Just to add a bit more fun, because it is analyzing the chemical composition of the wine or food placed before it, it can also alert its owner to possible health issues, gently warning against fatty or salty products. However, it is not capable of detecting drunkenness, which could have been another great feature.

However, when the Winebot makes mistakes, the consequences could be dramatic. When a reporter’s hand was placed against the robot’s taste sensor, it was identified as prosciutto. A cameraman was mistaken for bacon.



This cute robot is about 2 feet tall, can slightly swivel its head, and when it speaks in a child-like voice, lights flash around its mouth. If you think this makes a great birthday or Christmas present, think again! It costs as much as a new car. With that amount of money, I would have expected the robot to
  • cook for me
  • wash the dishes
  • make beds
  • clean the house
  • take out the garbage
  • mow the lawn
  • rake the leaves
  • shovel the snow
  • and the list goes on and on and on ...


Take a short break from your busy life so you can appreciate the beautiful things you might be ignoring.


Friday, July 04, 2008

Robot of the Day: Robotic Flowers

Another make-up post. When will I ever catch up?!

If you like decorating your rooms with flowers and plants but always "forget" to water your plants, why not consider having a robot water your plants for you? Now here's an even better idea: why not just have robotic flowers and plants??!!


Chonnam National University in Korea has developed just the right solution for you: a robotic plant that emits oxygen, moisture, and aroma, and even dances to music (shown in the picture above). This robot, 130 cm tall and 40 cm in diameter, also knows how to greet you by bending toward you and blooming for you. It also reacts to light changes and even loud voices.

I tried to find more pictures or videos of this robotic flower but could not find any. There isn't any pricing or release date for retail products, either. I just hope it dances better than this one:



The idea of robotic flowers/plants is not new, and researchers in the US have also developed various prototypes. The video below shows the robotic flowers designed by Dr. Cynthia Breazeal (MIT) that sway when a human hand is near and glow in beautiful bright colors.



Sena Clara Creston, an artist from New York City, built a robotic flower garden as one of her art projects. The video below shows some of the flowers she created. The next paragraph is a direct quote from her statement about this project.



"Flower Garden is an interactive installation that detects and responds to the viewers' movements. The garden consists of about 20 paper-mâché and wire flowers each equipped with a distance sensor and arranged around a path for the viewer to walk through. Once the viewer gets within range the flower encloses its petals within its leaves. If the viewer remains in range the flower begins to shake making it appear to be nervous or frightened and if the viewer continues to approach, the flower responds by becoming aggressive, snapping it's petals and leaves open and shut. If the viewer steps out of range the flower seems to relax. It stops shaking and very slowly opens back up, exposing its petals. When the viewer enters the garden, the seemingly benign environment of fragile and vulnerable sculptures will have tuned into a mass of creatures fully expressing their aversion to the intrusion either by putting up their defenses, or in cases of extreme attack, becoming offensive. The viewer, realizing the impact they are having on the environment, will in turn react to the flowers, either choosing to hurry through the path causing as little disturbance as possible or embrace confrontation and continue to provoke the flowers."

I know these robotic flowers are supposed to represent timid, fragile things that are nervous and frightened, but why do I keep getting this creepy feeling, with monstrous man-eating creatures looming in my mind? Do you also get the feeling that they might just bite you all of a sudden, like this robot below does?



Frankly speaking, robotic flowers or plants that are for decoration only don't excite me that much. They are cool and cute (or creepy), but don't you wish they would do a bit more for you, such as checking your email for you?! Don't laugh! I didn't come up with this idea; someone else did, and even published a paper on it. Read my adviser's survey paper on Human-Robot Interaction if you are interested.

Here's another idea. How about letting your robotic plant be your personal psychiatrist? Sega Toys actually came up with such a product, named "Pekoppa", that will listen to your endless and meaningless ranting and react to it. (Disclaimer: I don't know Japanese, so I have no idea what harm is being done to the poor little robot!)



So, have you found the robotic flower/plant just right for you? I am still looking....





Rather than constantly worrying about the many things you have to get done, just start doing them one at a time.

Tuesday, July 01, 2008

Robot of the Day: Atom (Astro Boy)

This is a make-up post!

This is the first ever "Robot of the Day" blog post, and I just feel obligated to designate this "honor" to Atom (aka 阿童木, Astro Boy), a fictional robot character created in Japanese manga and television animation series in the 1950s-60s, because the lovable, brave, and peace-loving Atom inspired a whole generation of kids, some of whom went on to become robotics researchers (me included). Some people even claim that Atom is a big reason why Japan is at the forefront of android development today! In a WIRED magazine article, "The 50 Best Robots Ever", Atom was ranked #2 despite being only a fictional character. Impressive!!



I still remember when I was just a little kid, the entire neighborhood of over 100 families shared one television set -- a 14-inch color TV (this might give you some clues about how old I am) -- which was locked in an iron cabinet in a spacious open area in the neighborhood. Every evening at around 6:30pm, people (mostly kids and some adults) would start claiming spots in front of the TV cabinet with their small wooden stools. Of course, a good spot (close to the TV) might have required an even earlier arrival. At exactly 7:00pm, the uncle in charge of the TV cabinet would unlock the cabinet and turn on the TV. As soon as the theme song started to play, the chaotic crowd would immediately quiet down, and soon everyone was immersed in the adventures of a cute little robot named Atom. Ask anyone who was born in the 70s in China, and he or she could probably still sing a few lines from the famous Atom theme song (see video below)....



The character Atom was created in 1952 as a comic character by the Japanese comics artist, animator, producer, and medical doctor Osamu Tezuka (手冢治虫), who is often credited as the "Father of Anime". The story was put on television in Japan from 1963 to 1966 and immediately achieved great success. The Atom series was remade in the 1980s as Shin Tetsuwan Atomu, which was also translated into English as "Astro Boy" and broadcast by NBC in the United States. The video below shows the opening theme of the English version of Astro Boy.



In the story, Atom was built by the head of the Ministry of Science on April 7, 2003, as a replacement for his son, who died in a car accident. Although Atom looked identical to his lost son, he soon realized that the little android was a failure because it was only a robot that couldn't grow or express human aesthetics. So he sold Atom to a circus. Professor Ochanomizu (茶水博士), the new head of the Ministry of Science, noticed Atom and managed to become his guardian. He also gave Atom seven special powers. Using these special powers, Atom fought crime, evil, and injustice:
  1. Jet engines under his feet for flying;
  2. Ability to speak 60 different languages;
  3. Ability to distinguish good and evil;
  4. Hearing 1,000 times more powerful than a human's;
  5. Strong searchlights as his eyes;
  6. Ass cannon (later changed to finger machine gun in the new TV series)
  7. 100,000 horse power (later improved to 1 million horse power)
In the story Atom was born in 2003. It is already 2008 now, and we are still far from being able to create a robot of Atom's caliber. However, the story of Atom does paint a good picture of what we'd like to achieve with robotic technology. Specifically, I look forward to the day when robots and humans can peacefully and happily live together under the same sky.

Interesting facts:
  • In real life, on April 7, 2003, a Japanese city officially registered Atom as an honorary citizen and issued a certificate of citizenship.
  • In 2009, a feature-film version of Astro Boy is scheduled to hit theater screens.
Bonus:




When you have good thoughts or ideas, write them down before you forget.