Drone Development
Here you can find an overview of our current and past drone development projects.
Software: GAAS is an open-source autonomous aviation software platform designed for fully autonomous drones and flying cars. GAAS was built to provide a common infrastructure for computer-vision-based drone intelligence. Our build environment runs on Ubuntu 18.04, JetPack 4.3, and ROS Melodic. We selected the ZED 2 stereo camera to generate a disparity map and point cloud data. Our team is working to test our own navigation package on top of the existing GAAS foundation.
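As a rough illustration of how the ZED 2 point cloud reaches our navigation code under ROS Melodic, the sketch below subscribes to the camera wrapper's registered point cloud; the topic name follows the zed-ros-wrapper defaults and may differ in our actual launch configuration.

```python
#!/usr/bin/env python
# Minimal sketch: listen to the ZED 2 point cloud published by the
# zed-ros-wrapper. The topic name is the wrapper's default and is an
# assumption about our setup.
import rospy
from sensor_msgs.msg import PointCloud2
import sensor_msgs.point_cloud2 as pc2

def cloud_callback(msg):
    # Count the finite points in this frame as a quick sanity check
    # before handing the cloud to the navigation stack.
    points = list(pc2.read_points(msg, field_names=("x", "y", "z"), skip_nans=True))
    rospy.loginfo_throttle(1.0, "ZED 2 cloud: %d valid points", len(points))

if __name__ == "__main__":
    rospy.init_node("zed_cloud_listener")
    rospy.Subscriber("/zed2/zed_node/point_cloud/cloud_registered",
                     PointCloud2, cloud_callback, queue_size=1)
    rospy.spin()
```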
Hardware: The Jetson TX2 was used as our compute module, mounted on the Auvidea J120 carrier board. We used the PX4-based Pixracer (fmu-v4) flight controller running firmware version 1.10.0. Local position estimation fuses sensor data from a downward-facing LeddarOne lidar sensor for altitude setpoints. Visual-inertial odometry from the YGZ SLAM package is available for estimating the local pose.
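For context on how an external odometry estimate can be handed to the flight controller, here is a minimal sketch that relays a SLAM pose into MAVROS's standard vision-pose interface; the YGZ SLAM output topic name is an assumption.

```python
#!/usr/bin/env python
# Minimal sketch: relay an external VIO pose estimate to PX4 through MAVROS
# so the flight controller can fuse it into its local position estimate.
# The SLAM output topic name below is assumed; /mavros/vision_pose/pose is
# the standard MAVROS vision-pose input.
import rospy
from geometry_msgs.msg import PoseStamped

def relay(msg, pub):
    msg.header.stamp = rospy.Time.now()   # restamp so the sample is treated as fresh
    pub.publish(msg)

if __name__ == "__main__":
    rospy.init_node("vio_to_mavros")
    pub = rospy.Publisher("/mavros/vision_pose/pose", PoseStamped, queue_size=10)
    rospy.Subscriber("/ygz_slam/pose", PoseStamped, relay, callback_args=pub)
    rospy.spin()
```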
Software: HumanPose is an open-source autonomous aviation software platform initially designed for use with the DJI Tello drone. HumanPose was originally conceived as a pose-estimation neural network for piloting a flying selfie stick. The onboard camera streams video over Wi-Fi to an offboard computer, which uses OpenPose in conjunction with the Intel OpenVINO suite to capture the user's pose and interpret it into an action for the drone to take. The program is currently housed in a Docker container and can run on any Ubuntu 18.04 machine with Docker and wireless connectivity. From this, we developed and experimented with a ROS compatibility driver for the HumanPose system in order to migrate the software to other drone platforms.
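To illustrate the offboard control path, the sketch below sends plain-text commands to the Tello over its published UDP SDK and maps a detected pose to one of those commands; the pose-to-command table is illustrative, not our exact mapping, and acknowledgements from the drone are ignored for brevity.

```python
# Minimal sketch of the offboard control path: the Tello is commanded over
# Wi-Fi with plain-text UDP messages (per DJI's Tello SDK), and the action
# chosen by the pose classifier is mapped to one of those commands.
import socket

TELLO_ADDR = ("192.168.10.1", 8889)   # Tello command port per the SDK docs

POSE_TO_COMMAND = {
    "arms_up": "up 50",       # hypothetical mapping: both arms raised -> climb 50 cm
    "arms_out": "cw 90",      # arms out to the side -> rotate 90 degrees
    "hands_on_hips": "land",  # hands on hips -> land
}

def send(sock, command):
    sock.sendto(command.encode("utf-8"), TELLO_ADDR)

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", 9000))         # local port for the Tello's acknowledgements
    send(sock, "command")         # enter SDK mode
    send(sock, "streamon")        # start the 720p video stream on UDP port 11111
    detected_pose = "arms_up"     # stand-in for the OpenPose/OpenVINO output
    send(sock, POSE_TO_COMMAND.get(detected_pose, "land"))
```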
Hardware: The Tello drone was used as our test platform; it carries a 5 MP, 720p camera. The offboard computer in this case was a mobile workstation based on Intel's Haswell architecture running the HumanPose software.
Software: Redtail Enhanced was developed as an extension of the original Project Redtail and features an improved user interface. We were primarily interested in this extension because its dependencies were brought up to Ubuntu 18.04, JetPack 4.2.2, and ROS Melodic, which aligned with our implementation of the GAAS project. This project focused on using the stereo DNN for navigation, which adds another layer of information to the training data [1].
Hardware: The Jetson TX2 was used as our compute module, mounted on the Connect Tech Orbitty carrier board. We used the PX4-based Pixracer (fmu-v4) flight controller running firmware version 1.7.0. Local position estimation onboard the flight controller fuses a downward-facing LIDAR-Lite v3HP for altitude setpoints and a PX4Flow optical flow sensor for x-y velocity setpoints.
Software: Project Redtail uses a novel deep learning approach to navigating trails and paths based on the training dataset used [2]. The project uses the open-source convolutional neural network YOLO to perform object detection. Our implementation was built on Ubuntu 16.04, JetPack 3.3, and ROS Kinetic.
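As a rough sketch of the trail-following idea described in [2], the snippet below turns the network's left/center/right class probabilities into a steering correction; the gain values and sign convention are assumptions, not the project's tuned parameters.

```python
# Minimal sketch of trail following as described in [2]: the trail DNN emits
# softmax probabilities for view orientation (facing left/center/right of the
# trail) and lateral offset (left/center/right of the trail center), and the
# controller steers back toward the trail in proportion to the imbalance.
# Gains and signs here are illustrative assumptions.

def steering_angle(view_probs, offset_probs, k_view=10.0, k_offset=5.0):
    """view_probs / offset_probs: (left, center, right) softmax outputs."""
    view_left, _, view_right = view_probs
    off_left, _, off_right = offset_probs
    # The center probabilities contribute no correction; only the
    # left/right imbalance drives the turn command.
    return k_view * (view_right - view_left) + k_offset * (off_right - off_left)

if __name__ == "__main__":
    # Example: the net believes the camera is rotated left of the trail axis,
    # so the correction turns back toward the trail.
    print(steering_angle((0.7, 0.2, 0.1), (0.3, 0.5, 0.2)))
```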
Hardware: The Jetson TX2 was used as our compute module, mounted on the Auvidea J120 carrier board; we also implemented this project on the Connect Tech Orbitty carrier board. We used the PX4-based Pixracer (fmu-v4) flight controller running firmware version 1.7.0.
We have since retired this project in favor of Redtail Enhanced and its stereo DNN.
This drone was developed to compete in the Vertical Flight Society's 7th Annual MAV Student Challenge; however, we were not able to complete the qualifying requirements in time to attend the event. We encountered significant challenges in building a platform under 500 grams that could perform all of the functions required by the competition. This was also our research group's first year of experience in building autonomous systems.
Our team began our journey into the autonomous drone world by experimenting with the Intel Aero, an out-of-the-box solution that allowed us to hit the ground running in learning new techniques and algorithms on a capable computing platform. We eventually retired development on the Aero drone due to its limitations and restrictions, in favor of more open-source solutions.
[1] Smolyanskiy, Nikolai, et al. “On the Importance of Stereo for Accurate Depth Estimation: An Efficient Semi-Supervised Deep Neural Network Approach.” ArXiv.org, 8 July 2020, arxiv.org/abs/1803.09719.
[2] Smolyanskiy, Nikolai, et al. “Toward Low-Flying Autonomous MAV Trail Navigation Using Deep Neural Networks for Environmental Awareness.” ArXiv.org, 22 July 2017, arxiv.org/abs/1705.02550.