Autonomous Object-tracking Quadcopter with Remote Processing Capability


Matthew Eng
Thuong Nguyen
Qiwei Fu
Wenta Zhu
Professor: Marco Levorato
TA: Davide Callegaro
EECS
Winter 2018

 

Project Goal

The tasks performed by autonomous drones are becoming increasingly complex. Many involve executing sophisticated algorithms (including deep learning) and analyzing video and other sensor input in real time. Limited onboard processing capability restricts the ability of some drones to perform such tasks effectively. Additionally, the computational load can significantly shorten a drone's operational time due to the limited onboard energy storage. These issues are especially relevant to Unmanned Aerial Vehicles (UAVs) because of their tight weight constraints.

The objective of the project is to build an architecture capable of offloading the computational tasks that control the navigation of a UAV. The architecture takes inspiration from recent edge and fog computing algorithms.

 

Project Outline

- Install and test a control framework for UAV navigation. In particular, a video stream captured by an onboard camera is read by a Raspberry Pi and sent to the ground station; the Raspberry Pi also acts as the interface between the ground station and the Pixhawk controller onboard the drone. The Pixhawk controller takes the commands received from the Raspberry Pi and maneuvers accordingly. The Raspberry Pi can also run the tracking algorithm itself in case the connection to the ground station is lost.

- Build a data communication pipeline that transports the video stream over Wi-Fi to a laptop on the ground, which processes the video with an object tracking algorithm and sends movement commands back to the Raspberry Pi; the Raspberry Pi in turn forwards each command to the Pixhawk controller to complete the maneuver.

- Measure tracking performance and power usage for onboard processing versus processing at the remote ground station. Operation time on a single charge and tracking accuracy are our performance metrics.
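The video pipeline in the second step needs some way to delimit frames inside the TCP byte stream. A minimal sketch of one common approach is shown below; the length-prefix format and function names are illustrative assumptions, not necessarily what our scripts use:

```python
import struct

# Length-prefixed framing for compressed (e.g. JPEG) frames sent over TCP.
# The 4-byte big-endian length header is an illustrative choice.

def frame_packet(jpeg_bytes: bytes) -> bytes:
    """Prefix a compressed frame with its length so the receiver
    knows where one frame ends and the next begins."""
    return struct.pack(">I", len(jpeg_bytes)) + jpeg_bytes

def unframe_packet(stream: bytes) -> tuple:
    """Split one complete frame off the front of a received byte
    buffer; returns (frame, remaining_bytes)."""
    (length,) = struct.unpack(">I", stream[:4])
    return stream[4:4 + length], stream[4 + length:]
```

On the drone side, each webcam frame would be compressed (for instance with OpenCV's `cv2.imencode`) and pushed through `frame_packet` before `socket.sendall`; the ground side accumulates bytes and peels frames off with `unframe_packet`.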

 

Implementation

 

Above is the flowchart for the project implementation. 

The project is divided into the UAV module and the Ground module:

  • UAV module: the UAV module runs on the Raspberry Pi onboard the drone. This module handles the communication between the drone and the ground station. The Raspberry Pi is connected to the Pixhawk controller on the drone, which allows it to control the drone. The module takes the video feed from the webcam attached to the drone, compresses it, and sends it to the ground station over a TCP connection. It then waits for a command from the ground station; once a command arrives, it sends the corresponding instruction to the Pixhawk controller to initiate the movement.
  • Ground module: the ground module runs on any laptop/desktop with a Wi-Fi connection. This module takes the video feed from the UAV module and uses it to determine the commands the drone should receive to track the object of interest. This is accomplished with an OpenCV computer vision algorithm that locates the object of interest in each frame. Once the object is located, the module runs the tracking algorithm we developed to determine the movement command to send to the drone. The command is sent over TCP to the UAV module.
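The Ground module's decision step can be sketched as follows. In practice the binary mask of the color-flagged target would come from `cv2.inRange` on an HSV frame; the deadband value, command names, and function name below are illustrative assumptions:

```python
import numpy as np

# Sketch of the ground-station decision step: given a binary mask of
# the color-flagged target, find its centroid and map the horizontal
# pixel offset from the image center to a movement command.

def command_from_mask(mask: np.ndarray, deadband: float = 0.1) -> str:
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return "HOLD"            # target lost: hover in place
    h, w = mask.shape
    # normalized horizontal offset of the centroid, in [-1, 1]
    offset = (xs.mean() - w / 2) / (w / 2)
    if offset > deadband:
        return "YAW_RIGHT"
    if offset < -deadband:
        return "YAW_LEFT"
    return "FORWARD"             # roughly centered: close the distance
```

A fuller version would also use the mask's apparent size to regulate standoff distance, but the centroid-offset rule above captures the basic yaw-then-advance behavior.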

Hardware Used

  1. 3DR Solo Quadcopter

  2. Raspberry Pi

  3. Logitech Webcam

  4. Pixhawk Flight Controller

Functionality:

Our current functionality is as follows:

  • Working tracking algorithm implemented using OpenCV.

  • Working communication protocol over TCP to issue movement commands and stream the video feed.

  • Working drone control script that allows for guided maneuvers.

  • Ability for the drone to track a color-flagged moving object.

  • Expandable tracking framework that supports a multitude of alternative tracking algorithms.
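The drone control script's command step can be sketched as a simple mapping from a ground-station command to a body-frame velocity setpoint. The speeds, command names, and helper below are illustrative; the real script would send the resulting setpoint to the Pixhawk as a MAVLink SET_POSITION_TARGET_LOCAL_NED message (for example via DroneKit's `message_factory`) while the vehicle is in GUIDED mode:

```python
# Illustrative mapping from tracking commands to velocity setpoints.

SPEED = 0.5       # m/s, assumed cruise speed
YAW_RATE = 0.3    # rad/s, assumed turn rate

def setpoint_for(command: str):
    """Translate a ground-station command into (vx, vy, vz, yaw_rate)
    in the body frame: forward, right, down, clockwise. Unknown
    commands fall back to hovering in place."""
    table = {
        "FORWARD":   (SPEED, 0.0, 0.0, 0.0),
        "YAW_LEFT":  (0.0, 0.0, 0.0, -YAW_RATE),
        "YAW_RIGHT": (0.0, 0.0, 0.0, YAW_RATE),
        "HOLD":      (0.0, 0.0, 0.0, 0.0),
    }
    return table.get(command, (0.0, 0.0, 0.0, 0.0))
```

Keeping this mapping in one table is also what makes the tracking type expandable: a new tracking algorithm only needs to emit commands from the same small vocabulary.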

The drone tracking functionality is a success, and the drone has no trouble tracking objects.

Challenges and Issues:

Although the drone can track objects reliably, several challenges and issues remain. One issue is the limited processing power of the Raspberry Pi, which creates a bottleneck in the video feed transfer from the UAV module to the Ground module. The resulting high response time in the video and command exchange caused the tracking to overshoot the target at times. The other issue was the difficulty of capturing good-quality video onboard the drone: vibration from the drone makes it harder for the tracking algorithm to identify the object of interest. This could be fixed in the future with an attached gimbal for a stable video feed.

Member Roles

Link to Demo

https://www.youtube.com/watch?v=YljgPcyVxvU&