
Sudarshan's Blog

How I got a $100 toy drone to follow me around.

It’s that time of the year again when I decide to add to my blog. It was my birthday last week, and I decided to buy myself a small DJI drone to hack with. I wanted something lightweight, cheap, and easily replaced. The DJI Tello has so far turned out to be delightfully sufficient for my criteria.

The video is here.

The experiment I set out to do is to have the Tello latch on to a face and follow it. To do this, I needed to be able to:

- Control the drone's movements.
- Be able to get drone flight data/visual. 
- Have the drone recognize a face. 
- Tell the drone to follow the face.

Controlling the drone:

So, with no information about the drone, I naively assumed I’d have to push some code/instructions onto the drone, like an Arduino or a Lego Mindstorms unit. However, the Android app that came with the Tello required the phone to connect to the drone over Wi-Fi.

This got me thinking. There had to be a server of sorts, with a protocol the Android app was using to communicate with the drone. All I had to do was write a client that spoke that protocol, and I wouldn’t have to burn any software into the drone.

I found this excellent article that a lot of other hackers had worked super hard to put together. So all I needed was to write a client that could send these UDP packets.

To my most pleasant surprise, I found that gobot had already done the excellent work of implementing this protocol as an SDK here. Their documentation listed examples of starting and controlling the drone and the much cooler one of displaying the drone output on mplayer.

The code below (taken from gobot) would have your tello take off and land.

package main

import (
	"fmt"
	"time"

	"gobot.io/x/gobot"
	"gobot.io/x/gobot/platforms/dji/tello"
)

func main() {
	drone := tello.NewDriver("8888")

	work := func() {
		drone.TakeOff()

		gobot.After(5*time.Second, func() {
			drone.Land()
		})
	}

	robot := gobot.NewRobot("tello",
		[]gobot.Connection{},
		[]gobot.Device{drone},
		work,
	)

	robot.Start()
}

Getting the flight visual:

My second problem was also easily solved. Gobot lets you stream the drone's UDP camera output straight to mplayer (or, as we will do shortly, into ffmpeg pipes).

For my experiment, I used the ffmpeg encoder with a stdout pipe.

	ffmpeg := exec.Command("ffmpeg", "-hwaccel", "auto", "-hwaccel_device", "opencl", "-i", "pipe:0",
		"-pix_fmt", "bgr24", "-s", strconv.Itoa(frameX)+"x"+strconv.Itoa(frameY), "-f", "rawvideo", "pipe:1")
	ffmpegIn, err := ffmpeg.StdinPipe()
	if err != nil {
		return nil, err
	}
	ffmpegOut, err := ffmpeg.StdoutPipe()
	if err != nil {
		return nil, err
	}

I then read full frames from this pipe in a loop:

	for {
		outData := make([]byte, frameSize)
		if _, err := io.ReadFull(f.ffmpegOut, outData); err != nil {
			return err
		}
		...
	}

Have the drone recognize a face:

I did a little bit of Caffe work to train a CNN to recognize faces. The GoCV bindings to OpenCV come with an XML cascade file that does this sufficiently well, but I liked my model better. Plugging it into GoCV is as simple as

	net := gocv.ReadNetFromCaffe(f.protoPath, f.modelPath)

Once an input blob has been set with net.SetInput, the network can run inference:

	detBlob := net.Forward("detection_out")

	...
	confidence := detections.GetFloatAt(r, 2)
	if confidence < 0.5 {
		continue
	}

Tell the drone to follow the face:

This was the most fun part. My logic was simple: I created bounds within the drone's camera frame and had the drone turn toward the face's bounding box whenever the face left its line of sight. I also added a distance vector to have the drone maintain a safe distance from the face.

	if f.detectSize {
		f.detectSize = false
		refDistance = dist(left, top, right, bottom)
	}

	distance := dist(left, top, right, bottom)

	if right < W/2 {
		f.drone.CounterClockwise(50)
	} else if left > W/2 {
		f.drone.Clockwise(50)
	} else {
		f.drone.Clockwise(0)
	}

	if top < H/10 {
		f.drone.Up(25)
	} else if bottom > H-H/10 {
		f.drone.Down(25)
	} else {
		f.drone.Up(0)
	}

	if distance < refDistance-distTolerance {
		f.drone.Forward(20)
	} else if distance > refDistance+distTolerance {
		f.drone.Backward(20)
	} else {
		f.drone.Forward(0)
	}

Note:

If you watch the video closely, you’ll notice that my boundary values are so small that the drone sometimes “loses” my face. This is an improvement I will be working on.

Source code:

All the code is available at https://github.com/sudarshan-reddy/dogbot
