MIT engineers have developed a new virtual-reality training system for drones that enables a vehicle to “see” a rich, virtual environment while flying in an empty physical space (Image: William Litant)
Researchers from the Massachusetts Institute of Technology (MIT) have developed a virtual-reality (VR) training system to reduce the time and cost of developing autonomous drones.
The system, which the team has dubbed “Flight Goggles,” could significantly reduce the number of crashes that drones experience in actual training sessions. It can also provide a virtual testbed for environments and conditions in which researchers might want to train fast-flying drones.
Training drones to fly fast around obstacles is a crash-prone exercise that regularly forces engineers to repair or replace vehicles. The VR training system enables a drone to “see” a rich environment while flying in an empty physical space.
Sertac Karaman, associate professor of aeronautics and astronautics at MIT, said, “We think this is a game-changer in the development of drone technology. The system can make autonomous vehicles more responsive, faster, and more efficient.”
Pushing boundaries
Training autonomous drones typically involves flying them in large, enclosed testing grounds, where researchers often hang large nets to catch errant vehicles. They also set up props, such as windows and doors, through which a drone can learn to fly.
When vehicles crash, they must be repaired or replaced, which delays development and adds to a project’s cost.
Karaman said testing drones in this way can work for vehicles that are not meant to fly fast, such as drones for mapping, but is not suitable for fast-flying vehicles that need to process visual information quickly as they fly through an environment.
“The moment you want to do high-throughput computing and go fast, even the slightest changes you make to its environment will cause the drone to crash,” said Karaman. “You can’t learn in that environment. If you want to push boundaries on how fast you can go and compute, you need some sort of virtual-reality environment.”
Flight Goggles
The VR system comprises a motion capture system, an image rendering program, and electronics that enable the team to quickly process images and transmit them to the drone.
The test space — a hangar-like gymnasium in MIT’s new drone-testing facility in Building 31 — is lined with motion-capture cameras that track the orientation of the drone as it’s flying.
With the image-rendering system, Karaman and his colleagues can draw up photorealistic scenes, such as a loft apartment or a living room, and beam these virtual images to the drone as it’s flying through the empty facility.
The virtual images can be processed by the drone at a rate of about 90 frames per second — around three times as fast as the human eye can see and process images. To enable this, the team custom-built circuit boards that integrate a powerful embedded supercomputer, along with an inertial measurement unit and a camera.
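That frame rate leaves a budget of roughly 11 milliseconds per frame to track the drone, render the scene, and transmit the image. As a rough illustration only, the Python sketch below shows one plausible shape of that capture-render-transmit loop; MotionCapture, Renderer, and Downlink are hypothetical stand-ins for the facility’s components, whose actual interfaces are not described in this article.

import time

FRAME_RATE_HZ = 90                     # rendering rate reported by the team
FRAME_BUDGET_S = 1.0 / FRAME_RATE_HZ   # ~11.1 ms to track, render, transmit

class MotionCapture:
    """Stand-in for the motion-capture cameras lining the test space."""
    def latest_pose(self):
        return {"position": (0.0, 0.0, 1.5), "orientation": (1.0, 0.0, 0.0, 0.0)}

class Renderer:
    """Stand-in for the photorealistic image-rendering program."""
    def render(self, pose):
        return b"frame rendered from the drone's viewpoint"

class Downlink:
    """Stand-in for the electronics that beam images to the drone."""
    def send(self, frame):
        pass

def run(steps=5):
    mocap, renderer, link = MotionCapture(), Renderer(), Downlink()
    for _ in range(steps):
        t0 = time.monotonic()
        pose = mocap.latest_pose()     # where the real drone is right now
        frame = renderer.render(pose)  # virtual scene from that viewpoint
        link.send(frame)               # transmit the image to the drone
        time.sleep(max(0.0, FRAME_BUDGET_S - (time.monotonic() - t0)))

if __name__ == "__main__":
    run()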
The hardware is fitted into a small, 3-D-printed nylon and carbon-fiber-reinforced drone frame.
A crash course
The researchers carried out a set of trials for the VR system, including one in which the drone learned to fly through a virtual window about twice its size. As the drone flew through this virtual room, the researchers tuned a navigation algorithm, enabling the drone to learn on the fly.
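The article does not spell out the tuning procedure, but the idea of learning from virtual crashes can be pictured with a toy loop: a hypothetical routine fly_pass commands one pass at the virtual window and reports, from the motion-capture data, whether the drone cleared it, and a single controller gain is nudged accordingly. This is purely illustrative, not the team’s algorithm.

def tune_on_the_fly(fly_pass, gain=1.0, passes=40, step=0.05):
    """Toy tune-as-you-fly loop (illustrative only, not the team's method).

    fly_pass(gain) is a hypothetical routine that commands one pass at the
    virtual window and reports whether the drone cleared it. After a virtual
    crash the gain is dialed back; after a clean pass it creeps up again,
    so aggressiveness is recovered without risking real hardware.
    """
    for _ in range(passes):
        if fly_pass(gain):
            gain += step / 2   # clean pass: push a little harder
        else:
            gain -= step       # virtual crash: back off
    return gain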
Over 10 flights, the drone, flying at around 2.3 meters per second (5 miles per hour), successfully flew through the virtual window 361 times, only “crashing” into the window three times, according to positioning information provided by the facility’s motion-capture cameras.
In a final test, the team set up an actual window in the test facility and turned on the drone’s onboard camera to enable it to see and process its actual surroundings. Using the navigation algorithm tuned in the virtual system, the drone, over eight flights, was able to fly through the real window 119 times, only crashing or requiring human intervention six times.
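Counting each crash or intervention as a separate attempt (an assumption, since the source reports only totals), those figures imply success rates of roughly 99 percent in the virtual room and 95 percent at the real window, as the quick computation below shows.

# Implied success rates from the reported totals,
# assuming attempts = clean passes + failures.
virtual = {"passes": 361, "failures": 3}
real    = {"passes": 119, "failures": 6}

for name, trial in (("virtual window", virtual), ("real window", real)):
    attempts = trial["passes"] + trial["failures"]
    print(f'{name}: {trial["passes"]}/{attempts} = {trial["passes"] / attempts:.1%}')

# virtual window: 361/364 = 99.2%
# real window: 119/125 = 95.2%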
“It’s something we programmed it to do in the virtual environment, by making mistakes, falling apart, and learning. But we didn’t break any actual windows in this process,” said Karaman.
The training system can also be used to test new sensors, or new specifications for existing sensors, to see how they might perform on a fast-flying drone, or to train drones to fly safely around humans.
“There are a lot of mind-bending experiments you can do in this whole virtual reality thing. Over time, we will showcase all the things you can do,” added Karaman.
The details of the virtual training system will be presented at the IEEE International Conference on Robotics and Automation (ICRA) this month.
The research was funded in part by the US Office of Naval Research, MIT Lincoln Laboratory, and the NVIDIA Corporation.
Source: MIT News Office.