The WiSAR (short for Wilderness Search and Rescue) research group at BYU consists of faculty and students from three research labs: the MAGICC lab (ME and EE departments), the HCMI lab (CS department), and the Computer Vision lab (CS department). The group's objective is to investigate and develop technologies that support wilderness search and rescuers with an Unmanned Aerial Vehicle (UAV).
In the past, we used UAVs built by MAGICC lab students. The UAV in the picture below is named Madre (meaning "mother" in Spanish) and was built by the MAGICC lab. Madre retired in 2008 and now simply sits on top of a closet in our lab for display purposes only.
Some students in the WiSAR group graduated and then decided to license technologies from BYU and start a local company making UAVs. The company, named Procerus, has been quite successful, so we later simply bought a plane from them. The second picture below shows the current UAV we use. We just call it "The UAV" because we couldn't come up with a good name.
The fixed-wing UAVs we use in our research are small and light, with wingspans of 42-50 inches. Each weighs about 2 lbs. They are propelled by standard electric motors powered by lithium batteries, good for up to 2 hours in the air.
The sensors onboard include three-axis rate gyroscopes, three-axis accelerometers, static and differential barometric pressure sensors, a global positioning system module, and a video camera on a gimballed mount. A 900 MHz radio transceiver is used for data communication, and an analog 2.4 GHz transmitter is used for the video downlink. The autopilot was designed at BYU and built around a small microprocessor. It stabilizes the aircraft's attitude (roll and pitch angles) and also flies the UAV to a desired altitude or waypoint.
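To give a feel for what "stabilizing roll" means inside an autopilot, here is a minimal sketch of a PID-style attitude loop. This is an illustration only, not the BYU autopilot's actual code; the gains, the crude roll dynamics, and the saturation limit are all made-up assumptions.

```python
# Sketch of a PID roll-hold loop (hypothetical gains and plant model,
# NOT the actual BYU autopilot firmware).
class PidController:
    def __init__(self, kp, ki, kd, limit):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.limit = limit        # saturate output, e.g. max aileron deflection
        self.integral = 0.0

    def update(self, commanded, measured, rate, dt):
        """Use the rate gyro measurement directly as the derivative term."""
        error = commanded - measured
        self.integral += error * dt
        u = self.kp * error + self.ki * self.integral - self.kd * rate
        return max(-self.limit, min(self.limit, u))

# Toy simulation: hold a 10-degree roll command against a crude
# first-order roll-rate model (roll_rate' = 20*aileron - 2*roll_rate).
roll_hold = PidController(kp=0.8, ki=0.0, kd=0.3, limit=0.6)
roll, roll_rate, dt = 0.0, 0.0, 0.02
for _ in range(500):                 # 10 seconds of simulated flight
    aileron = roll_hold.update(10.0, roll, roll_rate, dt)
    roll_rate += (20.0 * aileron - 2.0 * roll_rate) * dt
    roll += roll_rate * dt
print(round(roll, 1))
```

The integral gain is left at zero here because the toy plant has no trim disturbance; on real hardware the integral term soaks up steady offsets such as a miscalibrated aileron.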
Each UAV has many autonomous capabilities. For example, it can auto-launch (all you have to do is throw it into the air), auto-land (crash land after spiraling down), and if the UAV loses communication with the base, it will automatically return to base and loiter there. The video below shows the auto-launching and auto-landing capabilities of Madre.
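The lost-communication behavior described above is essentially a small failsafe state machine. The sketch below shows the idea; the state names, timeout, and loiter radius are hypothetical values for illustration, not the actual firmware logic.

```python
# Hypothetical lost-link failsafe logic: if telemetry drops out, fly back
# to the launch point and loiter until contact resumes. Thresholds are
# made-up illustration values.
NORMAL, RETURN_TO_BASE, LOITER = "NORMAL", "RETURN_TO_BASE", "LOITER"
LINK_TIMEOUT_S = 5.0    # assumed: declare link lost after 5 s of silence
HOME_RADIUS_M = 50.0    # assumed: consider ourselves "home" within 50 m

def failsafe_step(seconds_since_last_packet, distance_to_home_m):
    if seconds_since_last_packet < LINK_TIMEOUT_S:
        return NORMAL            # link healthy: keep flying the mission
    if distance_to_home_m > HOME_RADIUS_M:
        return RETURN_TO_BASE    # link lost: head for the launch point
    return LOITER                # arrived home: circle until contact resumes

print(failsafe_step(1.0, 4000.0))    # link fine, far from home
print(failsafe_step(12.0, 4000.0))   # link lost, still far away
print(failsafe_step(12.0, 30.0))     # link lost, back over base
```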
The gimballed camera onboard the UAV provides a bird's-eye view of the area. Because the UAV can quickly reach hard-to-access areas and cover a lot of ground, the visual information it provides can help wilderness search and rescuers improve situation awareness and support the search for a missing or injured person. The next video shows the kind of video the operator can see from the ground. (You can skip to the end to see the crash landing.)
Maybe you noticed from the previous video that video data from the UAV is not easy to use (jitter, disorientation, fast motion, etc.). That's why our research group developed video mosaicing algorithms that piece video frames together to help with the search task. This method keeps video frames in sight much longer for the video observer, thus improving detectability.
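The core idea of mosaicing is to paste each incoming frame into a shared canvas at its estimated position, so pixels remain visible long after the camera has panned past them. Here is a deliberately simplified sketch: real mosaicing estimates a homography between frames, while this toy version assumes the inter-frame motion is a known pure translation.

```python
import numpy as np

# Simplified mosaicing sketch (NOT the group's algorithm): composite each
# frame into a common canvas at its offset. Real systems estimate a
# homography per frame; here the offsets are given, translation-only.
def mosaic(frames, offsets, canvas_shape):
    canvas = np.zeros(canvas_shape, dtype=frames[0].dtype)
    for frame, (row, col) in zip(frames, offsets):
        h, w = frame.shape
        canvas[row:row + h, col:col + w] = frame  # newest frame wins on overlap
    return canvas

# Three overlapping 4x4 "frames" as the camera slides right 2 px per step.
frames = [np.full((4, 4), v, dtype=np.uint8) for v in (1, 2, 3)]
offsets = [(0, 0), (0, 2), (0, 4)]
m = mosaic(frames, offsets, (4, 8))
print(m[0])  # leftmost columns still show frame 1 after the camera moved on
```

The point of the example: columns covered only by the first frame stay on the canvas even after two newer frames arrive, which is exactly why a mosaic gives the observer more time to spot something.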
We have also developed other automation to assist with the search and rescue work. Examples include automatically suggesting likely places to find a missing person, various automatic path-planning methods for the UAV, anomaly detection algorithms, etc. These will be discussed in a separate blog post in the future.
The video below is a compilation of some other capabilities of the UAVs made by the MAGICC lab, including obstacle avoidance, multiple-UAV formation flight, etc. Too bad the audio track was disabled, but you can leave the music running from the videos above and then watch it in rhythm. :) Note that at the beginning of the video, the UAV was launched from inside BYU campus. Of course, this is no longer allowed due to tighter FAA rules and regulations!
Picture of the Day:
People have always wanted to roam the sky freely like birds.
I don't, because I've got UAVs.