Microsoft has released a beta version of an advanced virtual world for training autonomous drones and other gadgets that move on their own. The software, available on GitHub, recreates potentially confusing real-world conditions like shadows and reflections in a highly detailed, highly realistic virtual environment – without the risk of the real thing.
Microsoft says it hopes the move will help the “democratization of robotics,” letting individuals, researchers and companies test systems that would otherwise be impossible, or too resource-intensive, for them to tackle on their own.
Why test drones and other self-navigating devices in a virtual world instead of the real one? Mainly because simulated testing is vastly more affordable when part of your process is teaching self-navigating software to distinguish between things like shadows and solid, dark-colored walls. If you ask a drone with an advanced onboard brain to fly into something that might be solid or might be immaterial, you could end up with a very smashed, very expensive failure. If you do that in a virtual setting, all you’ve lost is a few moments and maybe some electricity for powering your PC.
Of course, simulation also allows you to increase the volume and speed of your scenario testing and learning, and to train up AI systems more efficiently. But for this to be an effective way to educate autonomous flight software, the simulation also has to be highly accurate; Microsoft says its simulator takes advantage of recent advances in graphics processing technology to render faithful virtual versions of real-world details like shadows, glare from the sun, haze and pooled surface water on roads.
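The article doesn’t name the project, but Microsoft’s open-source drone simulator on GitHub is AirSim, and it exposes a programmatic client so training code can fly the virtual drone and pull rendered camera frames, shadows, glare and all, for a learning system to consume. The sketch below is a minimal illustration using the airsim Python package’s multirotor client; the package name and calls reflect the project as it later matured, not necessarily the exact API of this beta.

```python
import numpy as np
import airsim  # Python client for Microsoft's open-source AirSim simulator

# Connect to a running AirSim simulation and take programmatic control of the drone.
client = airsim.MultirotorClient()
client.confirmConnection()
client.enableApiControl(True)
client.armDisarm(True)

# Fly a short test leg: take off, then move 20 m forward at 5 m/s.
# (AirSim uses NED coordinates, so a negative z value means "up".)
client.takeoffAsync().join()
client.moveToPositionAsync(20, 0, -10, 5).join()

# Grab an uncompressed frame from the forward camera; the rendered image
# includes the simulated shadows, glare and reflections described above and
# could be fed straight into a perception or learning pipeline.
responses = client.simGetImages([
    airsim.ImageRequest("0", airsim.ImageType.Scene, False, False)
])
frame = np.frombuffer(responses[0].image_data_uint8, dtype=np.uint8)
frame = frame.reshape(responses[0].height, responses[0].width, -1)  # H x W x channels

# Land and hand control back to the simulator's default flight stack.
client.landAsync().join()
client.armDisarm(False)
client.enableApiControl(False)
```

Crashing this drone into a wall that turned out to be solid costs nothing but a simulation reset, which is exactly the economic argument Microsoft is making.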
The software can also be used to test autonomous driving systems, Microsoft’s Ashish Kapoor notes in the company’s blog post, and it should be applicable to other kinds of robots that need to learn how to make their way through their environment.
Featured Image: Scott Eklund/Red Box Pictures