I am excited to share our code, datasets, video, and paper on DroNet, a lightweight deep neural network architecture that allows drones to navigate autonomously in the streets of a city by imitating cars and bicycles, thus respecting traffic rules. One of the biggest problems in deep learning is collecting enough training data. Since flying in city streets is illegal, besides being very dangerous, we used videos recorded from cars and bicycles. The results were surprising: even though it was trained on the streets of San Francisco, the learned model is able to control the drone in the streets of Zurich, and also in indoor environments such as corridors and parking lots. As you can see in the video, the drone stays on the right-hand side of the lane, never crosses into the oncoming lane, and stops in the presence of pedestrians, bicycles, construction work, and other vehicles.

DroNet is designed as a fast 8-layer CNN and produces, for each single input image, two outputs: a steering angle, to keep the drone navigating while avoiding obstacles, and a probability of collision, to let the drone recognize dangerous situations and react to them promptly. To share our findings with the robotics community, we publicly release all our datasets, code, and trained networks.
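As a rough illustration of how the two network outputs could be turned into a flight command, here is a minimal Python sketch: the forward speed shrinks as the predicted collision probability grows, low-pass filtered for smooth flight, while the steering angle sets the yaw. The constants `V_MAX` and `ALPHA` are illustrative assumptions, not values taken from the paper; see the released code for the actual controller.

```python
V_MAX = 1.0   # assumed maximum forward speed (m/s), illustrative only
ALPHA = 0.7   # assumed low-pass smoothing factor, illustrative only

def update_command(prev_speed, steering_angle, p_collision):
    """Map the CNN's two outputs to a (forward speed, yaw) command.

    steering_angle: predicted steering, normalized to [-1, 1]
    p_collision:    predicted probability of collision, in [0, 1]
    """
    # Slow down (and eventually stop) as collision probability rises,
    # blending with the previous speed so the drone reacts smoothly.
    target_speed = (1.0 - p_collision) * V_MAX
    speed = (1.0 - ALPHA) * prev_speed + ALPHA * target_speed
    yaw = steering_angle  # steer while moving forward
    return speed, yaw

# An imminent obstacle (p_collision = 1) pulls the speed toward zero:
speed, yaw = update_command(prev_speed=1.0, steering_angle=0.0, p_collision=1.0)
```

With a confident collision prediction, the target speed drops to zero and the drone brakes at a rate set by the smoothing factor; with no predicted danger it cruises at `V_MAX`.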
Video: https://youtu.be/ow7aw9H4BcA
Paper: http://rpg.ifi.uzh.ch/docs/RAL18_Loquercio.pdf
Datasets, code, and models: http://rpg.ifi.uzh.ch/dronet.html
Reference:
Antonio Loquercio, Ana Isabel Maqueda, Carlos Roberto del Blanco, and Davide Scaramuzza. DroNet: Learning to Fly by Driving. IEEE Robotics and Automation Letters, 22 January 2018. DOI: 10.1109/LRA.2018.2795643