├── scripts
│   ├── Windows_run_raw.bat
│   ├── Windows_run_fake_balloon.bat
│   ├── Windows_run_find_balloon.bat
│   ├── Windows_run_colour_finder.bat
│   ├── mavproxy_sequence.txt
│   ├── linux_run_strategy.sh
│   ├── opencv_version_check.py
│   ├── balloon_finder.cnf.example
│   ├── balloon_utils.py
│   ├── OF.py
│   ├── pid.py
│   ├── web_server.py
│   ├── colour_finder_web.py
│   ├── colour_finder.py
│   ├── balloon_config.py
│   ├── position_vector.py
│   ├── raw.py
│   ├── attitude_history.py
│   ├── fake_balloon.py
│   ├── balloon_video.py
│   ├── find_balloon.py
│   └── balloon_strategy.py
├── README.md
├── smart_camera
│   ├── smart_camera.cnf.example
│   ├── sc_webcam.py
│   ├── sc_config.py
│   ├── sc_video.py
│   └── sc_main.py
├── startup
│   ├── rc.local
│   ├── start_balloon_finder.sh
│   └── start_mavproxy.sh
├── .gitignore
└── LICENSE
/scripts/Windows_run_raw.bat: -------------------------------------------------------------------------------- 1 | python raw.py -------------------------------------------------------------------------------- /scripts/Windows_run_fake_balloon.bat: -------------------------------------------------------------------------------- 1 | python fake_balloon.py -------------------------------------------------------------------------------- /scripts/Windows_run_find_balloon.bat: -------------------------------------------------------------------------------- 1 | python find_balloon.py -------------------------------------------------------------------------------- /scripts/Windows_run_colour_finder.bat: -------------------------------------------------------------------------------- 1 | python colour_finder.py -------------------------------------------------------------------------------- /scripts/mavproxy_sequence.txt: -------------------------------------------------------------------------------- 1 | mode loiter 2 | arm throttle 3 | rc 3 1800 4 | rc 3 1500 5 | mode guided -------------------------------------------------------------------------------- /scripts/linux_run_strategy.sh: 
-------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | 3 | sim_vehicle.sh -v ArduCopter -f X -j 2 --aircraft=balloon --quadcopter --console --map 4 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | ardupilot-balloon-finder 2 | ======================== 3 | 4 | Code meant to run on an Odroid, allowing an ArduCopter/Pixhawk-based multicopter to find red balloons for SparkFun's AVC 2014 competition. 5 | 6 | Installation instructions can be found here: http://dev.ardupilot.com/wiki/odroid-via-mavlink/#Red_Balloon_Finder -------------------------------------------------------------------------------- /scripts/opencv_version_check.py: -------------------------------------------------------------------------------- 1 | # 2 | # opencv_version_check - displays the OpenCV version installed on the machine 3 | # 4 | # Start from command line using 'python opencv_version_check.py'. 
5 | 6 | import cv2 7 | 8 | def main(): 9 | # opencv_version_check - prints opencv version 10 | print "OpenCV %s" % cv2.__version__ 11 | 12 | if __name__ == "__main__": 13 | main() 14 | -------------------------------------------------------------------------------- /smart_camera/smart_camera.cnf.example: -------------------------------------------------------------------------------- 1 | [camera0] 2 | # for no camera set type = 0, for webcam set type = 1 3 | type = 1 4 | #width = 640 5 | #height = 480 6 | #horizontal-fov = 70.42 7 | #vertical-fov = 43.3 8 | #video_output_file = ~/smartcamera-%%Y-%%m-%%d-%%H-%%M.avi 9 | 10 | [camera1] 11 | # for no camera set type = 0, for webcam set type = 1 12 | #type = 0 13 | 14 | [general] 15 | #debug = True 16 | -------------------------------------------------------------------------------- /startup/rc.local: -------------------------------------------------------------------------------- 1 | #!/bin/sh -e 2 | # 3 | # rc.local 4 | # 5 | # This script is executed at the end of each multiuser runlevel. 6 | # Make sure that the script will "exit 0" on success or any other 7 | # value on error. 8 | # 9 | # In order to enable or disable this script just change the execution 10 | # bits. 11 | # 12 | # By default this script does nothing. 
13 | 14 | # Overclock to 1.92GHz 15 | #echo 1920000 > /sys/devices/system/cpu/cpu0/cpufreq/scaling_max_freq 16 | 17 | # start mavproxy as odroid 18 | ( 19 | date 20 | su odroid -c /home/odroid/GitHub/ardupilot-balloon-finder/startup/start_mavproxy.sh 21 | ) >> /tmp/start_mavproxy.log 2>&1 22 | 23 | exit 0 24 | -------------------------------------------------------------------------------- /startup/start_balloon_finder.sh: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | # 3 | # start_balloon_finder.sh 4 | # 5 | # This script starts the balloon finder scripts 6 | 7 | # wait 2 seconds before starting 8 | sleep 2 9 | 10 | date 11 | PATH=$PATH:/bin:/sbin:/usr/bin:/usr/local/bin 12 | PATH=$PATH:$HOME/GitHub/ardupilot-balloon-finder 13 | PATH=$PATH:$HOME/GitHub/ardupilot-balloon-finder/scripts 14 | export PATH 15 | echo "PATH:" $PATH 16 | 17 | PYTHONPATH=$PYTHONPATH:$HOME/GitHub/ardupilot-balloon-finder/scripts 18 | export PYTHONPATH 19 | echo "PYTHONPATH:" $PYTHONPATH 20 | 21 | cd $HOME/GitHub/ardupilot-balloon-finder/scripts 22 | 23 | python balloon_strategy.py 24 | 25 | echo "start_balloon_finder.sh done" 26 | -------------------------------------------------------------------------------- /startup/start_mavproxy.sh: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | # 3 | # start_mavproxy.sh 4 | # 5 | # This script starts up mavproxy as odroid and should be called from /etc/rc.local 6 | 7 | # give 10 seconds before starting 8 | sleep 10 9 | 10 | date 11 | export HOME=/home/odroid 12 | PATH=$PATH:/bin:/sbin:/usr/bin:/usr/local/bin 13 | PATH=$PATH:$HOME/GitHub/ardupilot-balloon-finder 14 | PATH=$PATH:$HOME/GitHub/ardupilot-balloon-finder/scripts 15 | PATH=$PATH:$HOME/GitHub/ardupilot-balloon-finder/smart_camera 16 | export PATH 17 | echo "PATH:" $PATH 18 | 19 | PYTHONPATH=$PYTHONPATH:$HOME/GitHub/ardupilot-balloon-finder/scripts 20 | export PYTHONPATH 21 | echo 
"PYTHONPATH:" $PYTHONPATH 22 | 23 | cd $HOME 24 | echo $HOME 25 | 26 | ls -l /dev/ttyUSB* 27 | ls -l /dev/serial/by-id 28 | ls -l /dev/video* 29 | 30 | screen -d -m -s /bin/bash mavproxy.py --master=/dev/ttyUSB0 --baudrate 921600 --aircraft MyCopter 31 | 32 | echo "start_mavproxy.sh done" 33 | -------------------------------------------------------------------------------- /scripts/balloon_finder.cnf.example: -------------------------------------------------------------------------------- 1 | [dronekit] 2 | #connection_string = "/dev/ttyUSB0" 3 | #baud = 921600 4 | 5 | [camera] 6 | ## set type to 0 for webcam, 1 for RPi camera ## 7 | #type = 0 8 | #width = 640 9 | #height = 480 10 | #horizontal-fov = 70.42 11 | #vertical-fov = 43.3 12 | #video_output_file = ~/balloon-%%Y-%%m-%%d-%%H-%%M.avi 13 | 14 | [balloon] 15 | #h-low = 160 16 | #h-high = 180 17 | #s-low = 120 18 | #s-high = 255 19 | #v-low = 210 20 | #v-high = 255 21 | #radius_m = 0.3 22 | 23 | [general] 24 | #debug = True 25 | #simulate = False 26 | #SEARCH_YAW_SPEED = 5.0 27 | #SEARCH_TARGET_DELAY = 2.0 28 | #VEL_XY_P = 1.0 29 | #VEL_XY_I = 0.0 30 | #VEL_XY_D = 0.0 31 | #VEL_XY_IMAX = 10.0 32 | #VEL_Z_P = 2.5 33 | #VEL_Z_I = 0.0 34 | #VEL_Z_D = 0.0 35 | #VEL_Z_IMAX = 10.0 36 | #VEL_SPEED_MIN = 1.0 37 | #VEL_SPEED_MAX = 4.0 38 | #VEL_ACCEL = 0.5 39 | #VEL_DIST_RATIO = 0.5 40 | #VEL_PITCH_TARGET = -5.0 41 | #VEL_UPDATE_RATE_SEC = 0.2 -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | # Byte-compiled / optimized / DLL files 2 | __pycache__/ 3 | *.py[cod] 4 | 5 | # C extensions 6 | *.so 7 | 8 | # Distribution / packaging 9 | .Python 10 | env/ 11 | bin/ 12 | build/ 13 | develop-eggs/ 14 | dist/ 15 | eggs/ 16 | lib/ 17 | lib64/ 18 | parts/ 19 | sdist/ 20 | var/ 21 | *.egg-info/ 22 | .installed.cfg 23 | *.egg 24 | 25 | # Installer logs 26 | pip-log.txt 27 | pip-delete-this-directory.txt 28 
| 29 | # Unit test / coverage reports 30 | htmlcov/ 31 | .tox/ 32 | .coverage 33 | .cache 34 | nosetests.xml 35 | coverage.xml 36 | 37 | # Translations 38 | *.mo 39 | 40 | # Mr Developer 41 | .mr.developer.cfg 42 | .project 43 | .pydevproject 44 | .cproject 45 | 46 | # Rope 47 | .ropeproject 48 | 49 | # Django stuff: 50 | *.log 51 | *.pot 52 | 53 | # Sphinx documentation 54 | docs/_build/ 55 | 56 | # ardupilot-balloon-finder files to ignore 57 | balloon_finder.cnf 58 | smart_camera.cnf 59 | *.tlog 60 | *.raw 61 | *.parm 62 | *.avi -------------------------------------------------------------------------------- /scripts/balloon_utils.py: -------------------------------------------------------------------------------- 1 | import math 2 | from balloon_video import balloon_video 3 | 4 | """ 5 | This file include utility functions for the balloon_finder project 6 | """ 7 | 8 | # 9 | # image related helper function 10 | # 11 | 12 | # shift_pixels_down - shifts point down by "shift pixels" 13 | def shift_pixels_down((x,y),shift_pixels): 14 | return (x, y+shift_pixels) 15 | 16 | # rotate_xy - rotates an x,y pixel position around the center of the image by the specified angle (in radians) 17 | # x and y : position integer or floats holding the pixel location as an offset from center or angle offset from center 18 | # angle : float holding the angle (in radians) to rotate the position. 
+ve = rotate clockwise, -ve = rotate counter-clockwise 19 | # returns x,y rotated by the angle 20 | def rotate_xy(x, y, angle): 21 | cos_ang = math.cos(angle) 22 | sin_ang = math.sin(angle) 23 | # x and y are already offsets from the image centre, 24 | # so they can be rotated directly 25 | x_rotated = x * cos_ang - y * sin_ang 26 | y_rotated = x * sin_ang + y * cos_ang 27 | return x_rotated, y_rotated 28 | 29 | # get_distance_from_pixels - returns distance to object in meters given its size in pixels and its actual size in meters 30 | # size_in_pixels : diameter or radius of the object on the image (in pixels) 31 | # actual_size : diameter or radius of the object in meters 32 | def get_distance_from_pixels(size_in_pixels, actual_size): 33 | # avoid divide by zero by returning 9999.9 meters for zero sized object 34 | if (size_in_pixels == 0): 35 | return 9999.9 36 | # convert num_pixels to angular size 37 | return actual_size / balloon_video.pixels_to_angle_x(size_in_pixels) 38 | 39 | # get_pixels_from_distance - returns object diameter or radius in pixels for a given distance 40 | # distance : distance from object in meters 41 | # actual_size: diameter or radius of the object in meters 42 | def get_pixels_from_distance(distance, actual_size): 43 | return balloon_video.angle_to_pixels_x(actual_size / distance) 44 | 45 | # wrap_PI - wraps value between -2*PI ~ +2*PI (i.e. -360 ~ +360 degrees) down to -PI ~ PI (i.e. -180 ~ +180 degrees) 46 | # angle should be in radians 47 | def wrap_PI(angle): 48 | if (angle > math.pi): 49 | return (angle - (math.pi * 2.0)) 50 | if (angle < -math.pi): 51 | return (angle + (math.pi * 2.0)) 52 | return angle 53 | -------------------------------------------------------------------------------- /scripts/OF.py: -------------------------------------------------------------------------------- 1 | # 2 | # OF.py - is a demo of using OpenCV for optical flow. 
3 | # 4 | # Start from command line using 'python OF.py', select a window and press ESC at any time to quit 5 | # 6 | # 1 window will be displayed at startup: 7 | # frame : displays the tracking results. 8 | # How to use: 9 | # Start the program and hold the object in front of the camera (the object should not fill the entire screen) 10 | # Move the object, and you should see the object leaving streaks behind it. 11 | 12 | import cv2 13 | import numpy as np 14 | cap = cv2.VideoCapture(0) 15 | cap.set(cv2.cv.CV_CAP_PROP_FRAME_WIDTH,640) 16 | cap.set(cv2.cv.CV_CAP_PROP_FRAME_HEIGHT,480) 17 | 18 | 19 | # params for ShiTomasi corner detection 20 | feature_params = dict( maxCorners = 100, 21 | qualityLevel = 0.3, 22 | minDistance = 7, 23 | blockSize = 7 ) 24 | 25 | # Parameters for Lucas-Kanade optical flow 26 | lk_params = dict( winSize = (15,15), 27 | maxLevel = 2, 28 | criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 0.03)) 29 | 30 | # Create some random colors 31 | color = np.random.randint(0,255,(100,3)) 32 | 33 | # Take first frame and find corners in it 34 | ret, old_frame = cap.read() 35 | old_gray = cv2.cvtColor(old_frame, cv2.COLOR_BGR2GRAY) 36 | p0 = cv2.goodFeaturesToTrack(old_gray, mask = None, **feature_params) 37 | 38 | # Create a mask image for drawing purposes 39 | mask = np.zeros_like(old_frame) 40 | 41 | while(1): 42 | ret,frame = cap.read() 43 | frame_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY) 44 | 45 | # calculate optical flow 46 | p1, st, err = cv2.calcOpticalFlowPyrLK(old_gray, frame_gray, p0, None, **lk_params) 47 | 48 | # Select good points (p1 is None when tracking is lost completely) 49 | try: 50 | good_new = p1[st==1] 51 | good_old = p0[st==1] 52 | except: 53 | break 54 | 55 | # draw the tracks 56 | for i,(new,old) in enumerate(zip(good_new,good_old)): 57 | a,b = new.ravel() 58 | c,d = old.ravel() 59 | cv2.line(mask, (a,b),(c,d),color[i].tolist(), 2) 60 | cv2.circle(frame,(a,b),5,color[i].tolist(),-1) 61 | #cv2.imshow('frame',mask) 62 | img = cv2.add(frame,mask) 63 | 64 | 
cv2.imshow('frame',img) 65 | k = cv2.waitKey(30) & 0xff 66 | if k == 27: 67 | break 68 | 69 | #Now update the previous frame and previous points 70 | old_gray = frame_gray.copy() 71 | p0 = good_new.reshape(-1,1,2) 72 | 73 | cv2.destroyAllWindows() 74 | cap.release() 75 | -------------------------------------------------------------------------------- /scripts/pid.py: -------------------------------------------------------------------------------- 1 | """ 2 | pid class : implements a pid controller 3 | 4 | """ 5 | 6 | import math 7 | import time 8 | 9 | class pid(object): 10 | 11 | def __init__(self, initial_p=0, initial_i=0, initial_d=0, initial_imax=0): 12 | # controller gains 13 | self.p_gain = initial_p 14 | self.i_gain = initial_i 15 | self.d_gain = initial_d 16 | self.imax = abs(initial_imax) 17 | self.integrator = 0 18 | self.last_error = None 19 | self.last_update = time.time() 20 | 21 | # __str__ - print gains and integrator as string 22 | def __str__(self): 23 | return "P:%s,I:%s,D:%s,IMAX:%s,Integrator:%s" % (self.p_gain, self.i_gain, self.d_gain, self.imax, self.integrator) 24 | 25 | # get_dt - returns time difference since last get_dt call 26 | def get_dt(self, max_dt): 27 | now = time.time() 28 | time_diff = now - self.last_update 29 | self.last_update = now 30 | if time_diff > max_dt: 31 | return 0.0 32 | else: 33 | return time_diff 34 | 35 | # get_p - return p term 36 | def get_p(self, error): 37 | return self.p_gain * error 38 | 39 | # get_i - return i term 40 | def get_i(self, error, dt): 41 | self.integrator = self.integrator + error * self.i_gain * dt 42 | self.integrator = min(self.integrator, self.imax) 43 | self.integrator = max(self.integrator, -self.imax) 44 | return self.integrator 45 | 46 | # get_d - return d term (error slope scaled by d gain, guarded against zero dt) 47 | def get_d(self, error, dt): 48 | if self.last_error is None: 49 | self.last_error = error 50 | ret = 0.0 if dt <= 0 else (error - self.last_error) * self.d_gain / dt 51 | self.last_error = error 52 | return ret 53 | 54 | # get pi - return p and i 
terms 55 | def get_pi(self, error, dt): 56 | return self.get_p(error) + self.get_i(error,dt) 57 | 58 | # get pid - return p, i and d terms 59 | def get_pid(self, error, dt): 60 | return self.get_p(error) + self.get_i(error,dt) + self.get_d(error, dt) 61 | 62 | # get_integrator - return built up i term 63 | def get_integrator(self): 64 | return self.integrator 65 | 66 | # reset_I - clears I term 67 | def reset_I(self): 68 | self.integrator = 0 69 | 70 | # main - used to test the class 71 | def main(self): 72 | 73 | # print object 74 | print "Test PID: %s" % test_pid 75 | 76 | # run it through a test 77 | for i in range (0, 100): 78 | result_p = test_pid.get_p(i) 79 | result_i = test_pid.get_i(i, 0.1) 80 | result_d = test_pid.get_d(i, 0.1) 81 | result = result_p + result_i + result_d 82 | print "Err %s, Result: %f (P:%f, I:%f, D:%f, Int:%f)" % (i, result, result_p, result_i, result_d, self.get_integrator()) 83 | 84 | # run the main routine if this is file is called from the command line 85 | if __name__ == "__main__": 86 | # create pid object P, I, D, IMAX 87 | test_pid = pid(1.0, 0.5, 0.01, 50) 88 | test_pid.main() -------------------------------------------------------------------------------- /smart_camera/sc_webcam.py: -------------------------------------------------------------------------------- 1 | """ 2 | sc_webcam.py 3 | 4 | This file includes functions to: 5 | initialise a web cam 6 | capture image from web cam 7 | 8 | Image size is held in the smart_camera.cnf 9 | """ 10 | 11 | import sys 12 | import time 13 | import math 14 | import cv2 15 | import sc_config 16 | 17 | class SmartCameraWebCam: 18 | 19 | def __init__(self, instance): 20 | 21 | # health 22 | self.healthy = False; 23 | 24 | # record instance 25 | self.instance = instance 26 | self.config_group = "camera%d" % self.instance 27 | 28 | # get image resolution 29 | self.img_width = sc_config.config.get_integer(self.config_group,'width',640) 30 | self.img_height = 
sc_config.config.get_integer(self.config_group,'height',480) 31 | 32 | # background image processing variables 33 | self.img_counter = 0 # num images requested so far 34 | 35 | # latest image captured 36 | self.latest_image = None 37 | 38 | # setup video capture 39 | self.camera = cv2.VideoCapture(self.instance) 40 | 41 | # check we can connect to camera 42 | if not self.camera.isOpened(): 43 | print "failed to open webcam %d" % self.instance 44 | 45 | # __str__ - print webcam resolution as string 46 | def __str__(self): 47 | return "SmartCameraWebCam Object W:%d H:%d" % (self.img_width, self.img_height) 48 | 49 | # get_latest_image - returns latest image captured 50 | def get_latest_image(self): 51 | return self.latest_image 52 | 53 | # get_image_counter - returns number of images captured since startup 54 | def get_image_counter(self): 55 | return self.img_counter 56 | 57 | # take_picture - take a picture 58 | # returns True on success 59 | def take_picture(self): 60 | # setup video capture 61 | self.camera = cv2.VideoCapture(self.instance) 62 | self.camera.set(cv2.cv.CV_CAP_PROP_FRAME_WIDTH,self.img_width) 63 | self.camera.set(cv2.cv.CV_CAP_PROP_FRAME_HEIGHT,self.img_height) 64 | 65 | # check we can connect to camera 66 | if not self.camera.isOpened(): 67 | self.healthy = False 68 | return False 69 | 70 | # get an image from the webcam 71 | success_flag, self.latest_image = self.camera.read() 72 | 73 | # release camera 74 | self.camera.release() 75 | 76 | # if successful overwrite our latest image 77 | if success_flag: 78 | self.img_counter = self.img_counter+1 79 | return True 80 | 81 | # return failure 82 | return False 83 | 84 | # main - tests SmartCameraWebCam class 85 | def main(self): 86 | 87 | while True: 88 | # send request to image capture for image 89 | if self.take_picture(): 90 | # display image 91 | cv2.imshow('image_display', self.get_latest_image()) 92 | else: 93 | print "no image" 94 | 95 | # check for ESC key being pressed 96 | k = cv2.waitKey(5) & 0xFF 
97 | if k == 27: 98 | break 99 | 100 | # take a rest for a bit 101 | time.sleep(0.01) 102 | 103 | # run test run from the command line 104 | if __name__ == "__main__": 105 | sc_webcam0 = SmartCameraWebCam(0) 106 | sc_webcam0.main() -------------------------------------------------------------------------------- /scripts/web_server.py: -------------------------------------------------------------------------------- 1 | import cherrypy 2 | import json 3 | import os 4 | import cv2 5 | import tempfile 6 | from cherrypy.lib.static import serve_file 7 | 8 | 9 | 10 | class Config(object): 11 | 12 | exposed = True 13 | 14 | def __init__(self, config_parser): 15 | self.config = config_parser 16 | 17 | def get_config(self): 18 | """ Return a config as a dictionary""" 19 | dict = {} 20 | for section in self.config.sections(): 21 | for option in self.config.options(section): 22 | dict[section + '.' + option] = self.config.get(section, option) 23 | 24 | return dict 25 | 26 | def handle_set(self, id, newval): 27 | split = id.split('.') 28 | section = split[0] 29 | option = split[1] 30 | self.config.set(section, option, newval) 31 | return "%s has been set to %s" % (id, newval) 32 | 33 | def GET(self, id=None, set=None): 34 | if id == None: 35 | return json.dumps(self.get_config()) 36 | else: 37 | # We allow the user to do 'sets' via a query param of the form ?set=value 38 | #setparam = cherrypy.request.params.get('set') 39 | if set != None: 40 | return self.handle_set(id, set) 41 | else: 42 | return self.get_config()[id] 43 | 44 | class Image(object): 45 | 46 | exposed = True 47 | 48 | def __init__(self, image_callback): 49 | self.image_file = tempfile.mktemp(suffix=".jpg") 50 | self.image_callback = image_callback 51 | 52 | def GET(self, id=None): 53 | """Write our image to a temp file, then give it back to the user""" 54 | cv2.imwrite(self.image_file, self.image_callback()) 55 | return serve_file(self.image_file, content_type='image/jpeg') 56 | 57 | class Static: 58 | def 
index(self): 59 | html = """ 60 | <html> 61 | <head> 62 | <title>AVC Red balloon ass kicker</title> 63 | </head> 64 | <body> 65 | <h1>AVC Red balloon ass kicker</h1> 66 | <img src="/image"/> 67 | <!-- page refresh script stripped from this dump --> 68 | </body> 69 | </html> 70 | """ 71 | return html 72 | 73 | index.exposed = True 74 | 75 | class Webserver(object): 76 | def __init__(self, config_parser, image_callback): 77 | cherrypy.tree.mount( 78 | Config(config_parser), '/config', 79 | {'/': {'request.dispatch': cherrypy.dispatch.MethodDispatcher()} } ) 80 | cherrypy.tree.mount( 81 | Image(image_callback), '/image', 82 | {'/': {'request.dispatch': cherrypy.dispatch.MethodDispatcher()} } ) 83 | cherrypy.tree.mount(Static(), '/') 84 | 85 | cherrypy.config.update({ 86 | 'server.socket_port': 8081, 87 | 'server.socket_host': '0.0.0.0', 88 | 'log.screen': None 89 | }) 90 | 91 | cherrypy.engine.start() 92 | # cherrypy.engine.block() 93 | 94 | def close(self): 95 | cherrypy.engine.stop() 96 | 97 | if __name__ == '__main__': 98 | import ConfigParser 99 | Webserver(ConfigParser.SafeConfigParser(), lambda: None) # standalone smoke test: empty config, no image source 100 | cherrypy.engine.block() -------------------------------------------------------------------------------- /scripts/colour_finder_web.py: -------------------------------------------------------------------------------- 1 | # 2 | # colour_finder_web.py - helps find the min and max Hue, Saturation and Brightness (aka Value) of a desired object 3 | # 4 | # Start from command line using 'python colour_finder_web.py' 5 | # 6 | # Access the final image at http://<odroid-ip>:8081 7 | # 8 | # Adjust the colour filters directly in the balloon_finder.cnf file 9 | # h-low : hue min 10 | # h-high: hue max 11 | # s-low : saturation min 12 | # s-high: saturation max 13 | # v-low : value (aka brightness) min 14 | # v-high: value (aka brightness) max 15 | # 16 | 17 | import time 18 | import cv2 19 | import numpy as np 20 | import balloon_config 21 | from web_server import Webserver 22 | from balloon_video import balloon_video 23 | from find_balloon import BalloonFinder 24 | 25 | class ColourFinder: 26 | 27 | # constructor 28 | def __init__(self): 29 | # check if config exists and if not create it 30 | if (balloon_config.config.get_integer('balloon','h-low',-1) == -1): 31 | 
balloon_config.config.set_integer('balloon','h-low',BalloonFinder.default_h_low) 32 | balloon_config.config.set_integer('balloon','h-high',BalloonFinder.default_h_high) 33 | balloon_config.config.set_integer('balloon','s-low',BalloonFinder.default_s_low) 34 | balloon_config.config.set_integer('balloon','s-high',BalloonFinder.default_s_high) 35 | balloon_config.config.set_integer('balloon','v-low',BalloonFinder.default_v_low) 36 | balloon_config.config.set_integer('balloon','v-high',BalloonFinder.default_v_high) 37 | balloon_config.config.save(); 38 | print "Initialised colour filters to default in balloon_finder.cnf file" 39 | 40 | # initialise config last read time 41 | self.config_last_read = time.time() 42 | 43 | # read config file 44 | self.read_config_file(True) 45 | 46 | # frame that will be displayed by webserver 47 | self.frame_filtered = None 48 | 49 | # read_config_file - reads latest colour filters from config file 50 | # force : boolean. if set to True will always read from config file even if a read has been performed recently 51 | def read_config_file(self, force_read): 52 | 53 | # check at least one second has passed since the last read 54 | if (force_read or (time.time() - self.config_last_read > 1)): 55 | # re-read the config file 56 | balloon_config.config.read() 57 | 58 | # load colour filters from config file 59 | self.h_low = balloon_config.config.get_integer('balloon','h-low',BalloonFinder.default_h_low) 60 | self.h_high = balloon_config.config.get_integer('balloon','h-high',BalloonFinder.default_h_high) 61 | self.s_low = balloon_config.config.get_integer('balloon','s-low',BalloonFinder.default_s_low) 62 | self.s_high = balloon_config.config.get_integer('balloon','s-high',BalloonFinder.default_s_high) 63 | self.v_low = balloon_config.config.get_integer('balloon','v-low',BalloonFinder.default_v_low) 64 | self.v_high = balloon_config.config.get_integer('balloon','v-high',BalloonFinder.default_v_high) 65 | 66 | # store time config file was last 
read (we will check it once per second) 67 | self.config_last_read = time.time() 68 | 69 | # run - main routine to help user find colour filters 70 | def run(self): 71 | 72 | try: 73 | print "starting web server" 74 | # initialise web server which will display final image 75 | web = Webserver(balloon_config.config.parser, (lambda : self.frame_filtered)) 76 | 77 | print "initialising camera" 78 | # initialise video capture 79 | balloon_video.init_camera() 80 | 81 | print "Ready to go!" 82 | 83 | while(True): 84 | # get a frame 85 | frame = balloon_video.capture_image() 86 | 87 | # Convert BGR to HSV 88 | hsv_frame = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV) 89 | 90 | # get latest colour filters 91 | self.read_config_file(False) 92 | 93 | # use trackbar positions to filter image 94 | colour_low = np.array([self.h_low,self.s_low,self.v_low]) 95 | colour_high = np.array([self.h_high,self.s_high,self.v_high]) 96 | 97 | # Threshold the HSV image 98 | mask = cv2.inRange(hsv_frame, colour_low, colour_high) 99 | 100 | # Erode 101 | erode_kernel = np.ones((3,3),np.uint8); 102 | eroded_img = cv2.erode(mask,erode_kernel,iterations = 1) 103 | 104 | # dilate 105 | dilate_kernel = np.ones((10,10),np.uint8); 106 | dilated_img = cv2.dilate(eroded_img,dilate_kernel,iterations = 1) 107 | 108 | # Bitwise-AND mask and original image 109 | self.frame_filtered = cv2.bitwise_and(frame,frame, mask=dilated_img) 110 | 111 | # handle interrupts 112 | except: 113 | print "interrupted, exiting" 114 | 115 | # exit and close web server 116 | print "exiting..." 
117 | web.close() 118 | 119 | # create global colour_finder object and run it 120 | colour_finder = ColourFinder() 121 | colour_finder.run() -------------------------------------------------------------------------------- /scripts/colour_finder.py: -------------------------------------------------------------------------------- 1 | # 2 | # colour_finder.py - helps find the min and max Hue, Saturation and Brightness (aka Value) of a desired object 3 | # 4 | # Start from command line using 'python colour_finder.py', select a window and press ESC at any time to quit 5 | # 6 | # 4 windows will be displayed: 7 | # Colour Filters : display high and low colour filters for Hue, Saturation and Brightness (aka Value) 8 | # Original : displays the raw image from the camera 9 | # Mask : displays black and white mask where black parts will be removed, white parts will remain 10 | # Filtered Result: Original image with Mask applied. Only desired object should be visible 11 | # 12 | # How to use: 13 | # Start the program and hold the object in front of the camera (the object should not fill the entire screen) 14 | # Increase the min and decrease the Max Hue, Saturation and Brightness trackbars so that only the desired object is shown in the Filtered Result 15 | # Record the values so they can be input manually into the usb_cam_test.py script 16 | 17 | import cv2 18 | import numpy as np 19 | import balloon_config 20 | from balloon_video import balloon_video 21 | 22 | class ColourFinder: 23 | 24 | # constructor 25 | def __init__(self): 26 | # initialise colour filters to no filtering 27 | self.h_low = 0 28 | self.h_high = 255 29 | self.s_low = 0 30 | self.s_high = 255 31 | self.v_low = 0 32 | self.v_high = 255 33 | 34 | # initialise save trackbar setting 35 | self.save = 0 36 | 37 | # call back for trackbar movements 38 | def empty_callback(self,x): 39 | pass 40 | 41 | # call back for save trackbar that saves colour filters to config file 42 | def save_callback(self,x): 43 | if x 
== 10: 44 | balloon_config.config.set_integer('balloon','h-low',self.h_low) 45 | balloon_config.config.set_integer('balloon','h-high',self.h_high) 46 | balloon_config.config.set_integer('balloon','s-low',self.s_low) 47 | balloon_config.config.set_integer('balloon','s-high',self.s_high) 48 | balloon_config.config.set_integer('balloon','v-low',self.v_low) 49 | balloon_config.config.set_integer('balloon','v-high',self.v_high) 50 | balloon_config.config.save(); 51 | print "Saved colour filters to config file!" 52 | return 53 | 54 | # run - main routine to help user find colour filters 55 | def run(self): 56 | 57 | # initialise video capture 58 | balloon_video.init_camera() 59 | 60 | # create trackbars for color change 61 | cv2.namedWindow('Colour Filters') 62 | cv2.createTrackbar('Hue min','Colour Filters',self.h_low,255,self.empty_callback) 63 | cv2.createTrackbar('Hue max','Colour Filters',self.h_high,255,self.empty_callback) 64 | cv2.createTrackbar('Sat min','Colour Filters',self.s_low,255,self.empty_callback) 65 | cv2.createTrackbar('Sat max','Colour Filters',self.s_high,255,self.empty_callback) 66 | cv2.createTrackbar('Bgt min','Colour Filters',self.v_low,255,self.empty_callback) 67 | cv2.createTrackbar('Bgt max','Colour Filters',self.v_high,255,self.empty_callback) 68 | cv2.createTrackbar('Save','Colour Filters',0,10,self.save_callback) 69 | 70 | while(True): 71 | # get a frame 72 | frame = balloon_video.capture_image() 73 | 74 | # Convert BGR to HSV 75 | hsv_frame = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV) 76 | 77 | # get latest trackbar positions 78 | self.h_low = cv2.getTrackbarPos('Hue min','Colour Filters') 79 | self.h_high = cv2.getTrackbarPos('Hue max','Colour Filters') 80 | self.s_low = cv2.getTrackbarPos('Sat min','Colour Filters') 81 | self.s_high = cv2.getTrackbarPos('Sat max','Colour Filters') 82 | self.v_low = cv2.getTrackbarPos('Bgt min','Colour Filters') 83 | self.v_high = cv2.getTrackbarPos('Bgt max','Colour Filters') 84 | 85 | # use trackbar 
positions to filter image 86 | colour_low = np.array([self.h_low,self.s_low,self.v_low]) 87 | colour_high = np.array([self.h_high,self.s_high,self.v_high]) 88 | 89 | # Threshold the HSV image 90 | mask = cv2.inRange(hsv_frame, colour_low, colour_high) 91 | 92 | # Erode 93 | erode_kernel = np.ones((3,3),np.uint8); 94 | eroded_img = cv2.erode(mask,erode_kernel,iterations = 1) 95 | 96 | # dilate 97 | dilate_kernel = np.ones((10,10),np.uint8); 98 | dilated_img = cv2.dilate(eroded_img,dilate_kernel,iterations = 1) 99 | 100 | # Bitwise-AND mask and original image 101 | res = cv2.bitwise_and(frame,frame, mask=dilated_img) 102 | 103 | cv2.imshow('Original',frame) 104 | cv2.imshow('Mask',mask) 105 | cv2.imshow('Filtered Result',res) 106 | 107 | #cv2.imshow('grey_res',grey_res) 108 | k = cv2.waitKey(5) & 0xFF 109 | if k == 27: 110 | break 111 | 112 | # close camera 113 | frame = balloon_video.close_camera() 114 | 115 | # close all windows 116 | cv2.destroyAllWindows() 117 | 118 | # create global colour_finder object and run it 119 | colour_finder = ColourFinder() 120 | colour_finder.run() -------------------------------------------------------------------------------- /scripts/balloon_config.py: -------------------------------------------------------------------------------- 1 | """ 2 | BalloonConfig class : handles config for the balloon_finder project 3 | 4 | balloon_finder.cnf file is created in the local directory 5 | 6 | other classes or files wishing to use this class should add 7 | import balloon_config 8 | 9 | """ 10 | 11 | from os.path import expanduser 12 | import ConfigParser 13 | 14 | class BalloonConfig(object): 15 | 16 | def __init__(self): 17 | # default config file 18 | self.config_file = expanduser("~/balloon_finder.cnf") 19 | 20 | # create the global parser object 21 | self.parser = ConfigParser.SafeConfigParser() 22 | 23 | # read the config file into memory 24 | self.read() 25 | 26 | # read - reads the contents of the file into the dictionary in RAM 27 | 
def read(self): 28 | try: 29 | self.parser.read(self.config_file) 30 | except IOError as e: 31 | print 'Error {0} reading config file: {1}: '.format(e.errno, e.strerror) 32 | return 33 | 34 | # save - saves the config to disk 35 | def save(self): 36 | try: 37 | with open(self.config_file, 'wb') as configfile: 38 | self.parser.write(configfile) 39 | except IOError as e: 40 | print 'Error {0} writing config file: {1}: '.format(e.errno, e.strerror) 41 | return 42 | 43 | # check_section - ensures the section exists, creates it if not 44 | def check_section(self, section): 45 | if not self.parser.has_section(section): 46 | self.parser.add_section(section) 47 | return 48 | 49 | # get_boolean - returns the boolean found in the specified section/option or the default if not found 50 | def get_boolean(self, section, option, default): 51 | try: 52 | return self.parser.getboolean(section, option) 53 | except ConfigParser.Error: 54 | return default 55 | 56 | # set_boolean - sets the boolean to the specified section/option 57 | def set_boolean(self, section, option, new_value): 58 | self.check_section(section) 59 | self.parser.set(section, option, str(bool(new_value))) 60 | return 61 | 62 | # get_integer - returns the integer found in the specified section/option or the default if not found 63 | def get_integer(self, section, option, default): 64 | try: 65 | return self.parser.getint(section, option) 66 | except ConfigParser.Error: 67 | return default 68 | 69 | # set_integer - sets the integer to the specified section/option 70 | def set_integer(self, section, option, new_value): 71 | self.check_section(section) 72 | self.parser.set(section, option, str(int(new_value))) 73 | return 74 | 75 | # get_float - returns the float found in the specified section/option or the default if not found 76 | def get_float(self, section, option, default): 77 | try: 78 | return self.parser.getfloat(section, option) 79 | except ConfigParser.Error: 80 | return default 81 | 82 | # set_float - sets
the float to the specified section/option 83 | def set_float(self, section, option, new_value): 84 | self.check_section(section) 85 | self.parser.set(section, option, str(float(new_value))) 86 | return 87 | 88 | # get_string - returns the string found in the specified section/option or the default if not found 89 | def get_string(self, section, option, default): 90 | try: 91 | return self.parser.get(section, option) 92 | except ConfigParser.Error: 93 | return default 94 | 95 | # set_string - sets the string to the specified section/option 96 | def set_string(self, section, option, new_value): 97 | self.check_section(section) 98 | self.parser.set(section, option, str(new_value)) 99 | return 100 | 101 | # main - tests BalloonConfig class 102 | def main(self): 103 | # print welcome message 104 | print "BalloonConfig v0.1 test" 105 | print "config file: %s" % self.config_file 106 | 107 | # write and read a boolean 108 | section = 'Test_Section1' 109 | option = 'Test_boolean' 110 | print "Writing %s/%s = True" % (section,option) 111 | self.set_boolean(section,option,True) 112 | print "Read %s/%s : %s" % (section, option, self.get_boolean(section, option, False)) 113 | 114 | # write and read an integer 115 | section = 'Test_Section1' 116 | option = 'Test_integer' 117 | print "Writing %s/%s = 11" % (section,option) 118 | self.set_integer(section,option,11) 119 | print "Read %s/%s : %s" % (section, option, self.get_integer(section, option, 99)) 120 | 121 | # write and read a float 122 | section = 'Test_Section1' 123 | option = 'Test_float' 124 | print "Writing %s/%s = 12.345" % (section,option) 125 | self.set_float(section,option,12.345) 126 | print "Read %s/%s : %s" % (section, option, self.get_float(section, option, 0.01)) 127 | 128 | # read an undefined number to get back the default 129 | section = 'Test_Section2' 130 | option = 'test_default' 131 | print "Read %s/%s : %s" % (section, option, self.get_float(section, option, 21.21)) 132 | 133 | # save the config file 
134 | self.save() 135 | 136 | return 137 | 138 | # declare global balloon_config object 139 | config = BalloonConfig() 140 | 141 | # run the main routine if this is file is called from the command line 142 | if __name__ == "__main__": 143 | config.main() -------------------------------------------------------------------------------- /smart_camera/sc_config.py: -------------------------------------------------------------------------------- 1 | """ 2 | SmartCameraConfig class : handles config for the smart_camera project 3 | 4 | smart_camera.cnf file is created in the local directory 5 | 6 | other classes or files wishing to use this class should add 7 | import sc_config 8 | 9 | """ 10 | 11 | from os.path import expanduser 12 | import ConfigParser 13 | 14 | class SmartCameraConfig(object): 15 | 16 | def __init__(self): 17 | # default config file 18 | self.config_file = expanduser("~/smart_camera.cnf") 19 | 20 | # print config file location 21 | print "config file: %s" % self.config_file 22 | 23 | # create the global parser object 24 | self.parser = ConfigParser.SafeConfigParser() 25 | 26 | # read the config file into memory 27 | self.read() 28 | 29 | # read - reads the contents of the file into the dictionary in RAM 30 | def read(self): 31 | try: 32 | self.parser.read(self.config_file) 33 | except IOError as e: 34 | print 'Error {0} reading config file: {1}: '.format(e.errno, e.strerror) 35 | return 36 | 37 | # save - saves the config to disk 38 | def save(self): 39 | try: 40 | with open(self.config_file, 'wb') as configfile: 41 | self.parser.write(configfile) 42 | except IOError as e: 43 | print 'Error {0} writing config file: {1}: '.format(e.errno, e.strerror) 44 | return 45 | 46 | # check_section - ensures the section exists, creates it if not 47 | def check_section(self, section): 48 | if not self.parser.has_section(section): 49 | self.parser.add_section(section) 50 | return 51 | 52 | # get_boolean - returns the boolean found in the specified section/option or 
the default if not found 53 | def get_boolean(self, section, option, default): 54 | try: 55 | return self.parser.getboolean(section, option) 56 | except ConfigParser.Error: 57 | return default 58 | 59 | # set_boolean - sets the boolean to the specified section/option 60 | def set_boolean(self, section, option, new_value): 61 | self.check_section(section) 62 | self.parser.set(section, option, str(bool(new_value))) 63 | return 64 | 65 | # get_integer - returns the integer found in the specified section/option or the default if not found 66 | def get_integer(self, section, option, default): 67 | try: 68 | return self.parser.getint(section, option) 69 | except ConfigParser.Error: 70 | return default 71 | 72 | # set_integer - sets the integer to the specified section/option 73 | def set_integer(self, section, option, new_value): 74 | self.check_section(section) 75 | self.parser.set(section, option, str(int(new_value))) 76 | return 77 | 78 | # get_float - returns the float found in the specified section/option or the default if not found 79 | def get_float(self, section, option, default): 80 | try: 81 | return self.parser.getfloat(section, option) 82 | except ConfigParser.Error: 83 | return default 84 | 85 | # set_float - sets the float to the specified section/option 86 | def set_float(self, section, option, new_value): 87 | self.check_section(section) 88 | self.parser.set(section, option, str(float(new_value))) 89 | return 90 | 91 | # get_string - returns the string found in the specified section/option or the default if not found 92 | def get_string(self, section, option, default): 93 | try: 94 | return self.parser.get(section, option) 95 | except ConfigParser.Error: 96 | return default 97 | 98 | # set_string - sets the string to the specified section/option 99 | def set_string(self, section, option, new_value): 100 | self.check_section(section) 101 | self.parser.set(section, option, str(new_value)) 102 | return 103 | 104 | # main - tests SmartCameraConfig class 105 | 
def main(self): 106 | # print welcome message 107 | print "SmartCameraConfig v1.0 test" 108 | print "config file: %s" % self.config_file 109 | 110 | # write and read a boolean 111 | section = 'Test_Section1' 112 | option = 'Test_boolean' 113 | print "Writing %s/%s = True" % (section,option) 114 | self.set_boolean(section,option,True) 115 | print "Read %s/%s : %s" % (section, option, self.get_boolean(section, option, False)) 116 | 117 | # write and read an integer 118 | section = 'Test_Section1' 119 | option = 'Test_integer' 120 | print "Writing %s/%s = 11" % (section,option) 121 | self.set_integer(section,option,11) 122 | print "Read %s/%s : %s" % (section, option, self.get_integer(section, option, 99)) 123 | 124 | # write and read a float 125 | section = 'Test_Section1' 126 | option = 'Test_float' 127 | print "Writing %s/%s = 12.345" % (section,option) 128 | self.set_float(section,option,12.345) 129 | print "Read %s/%s : %s" % (section, option, self.get_float(section, option, 0.01)) 130 | 131 | # read an undefined number to get back the default 132 | section = 'Test_Section2' 133 | option = 'test_default' 134 | print "Read %s/%s : %s" % (section, option, self.get_float(section, option, 21.21)) 135 | 136 | # save the config file 137 | self.save() 138 | 139 | return 140 | 141 | # declare global config object 142 | config = SmartCameraConfig() 143 | 144 | # run the main routine if this is file is called from the command line 145 | if __name__ == "__main__": 146 | config.main() -------------------------------------------------------------------------------- /scripts/position_vector.py: -------------------------------------------------------------------------------- 1 | """ 2 | PositionVector class : holds a 3 axis position offset from home in meters in NEU format 3 | X = North-South with +ve = North 4 | Y = West-East with +ve = West 5 | Z = Altitude with +ve = Up 6 | 7 | """ 8 | 9 | import math 10 | from dronekit import LocationGlobal 11 | 12 | # global variables used 
by position vector class 13 | posvec_home_location = LocationGlobal(0,0,0) 14 | posvec_lon_scale = 1.0 # scaling used to account for shrinking distance between longitude lines as we move away from equator 15 | posvec_latlon_to_m = 111319.5 # converts lat/lon to meters 16 | 17 | class PositionVector(object): 18 | 19 | # define public members 20 | x = 0 # north-south direction. +ve = north of home 21 | y = 0 # west-east direction, +ve = east of home 22 | z = 0 # vertical direction, +ve = above home 23 | 24 | def __init__(self,initial_x=0,initial_y=0,initial_z=0): 25 | # default config file 26 | self.x = initial_x 27 | self.y = initial_y 28 | self.z = initial_z 29 | 30 | # get_from_location - returns a position vector created from a location 31 | @classmethod 32 | def get_from_location(cls, location): 33 | ret = PositionVector(0,0,0) 34 | ret.set_from_location(location) 35 | return ret 36 | 37 | # __str__ - print position vector as string 38 | def __str__(self): 39 | return "Pos:X=%s,Y=%s,Z=%s" % (self.x, self.y, self.z) 40 | 41 | # __add__ - addition operator overload 42 | def __add__(self, other): 43 | return PositionVector(self.x + other.x, self.y + other.y, self.z + other.z) 44 | 45 | # __sub__ - subtraction operator overload 46 | def __sub__(self, other): 47 | return PositionVector(self.x - other.x, self.y - other.y, self.z - other.z) 48 | 49 | # __mul__ - multiply operator overload 50 | def __mul__(self, scalar): 51 | return PositionVector(self.x * scalar, self.y * scalar, self.z * scalar) 52 | 53 | # get_location - return the location (i.e. lat, lon, alt) of the position vector 54 | def get_location(self): 55 | dlat = self.x / posvec_latlon_to_m 56 | dlon = self.y / (posvec_latlon_to_m * posvec_lon_scale) 57 | return LocationGlobal(posvec_home_location.lat + dlat,posvec_home_location.lon + dlon,posvec_home_location.alt + self.z) 58 | 59 | # set_from_location - sets x,y,z offsets given a location object (i.e. 
lat, lon and alt) 60 | def set_from_location(self, location): 61 | # convert lat, lon to meters from home 62 | self.x = (location.lat - posvec_home_location.lat) * posvec_latlon_to_m 63 | self.y = (location.lon - posvec_home_location.lon) * posvec_latlon_to_m * posvec_lon_scale 64 | self.z = location.alt - posvec_home_location.alt # altitude stored as an offset above home, matching get_location 65 | 66 | # set_home_location - sets home location used by all position vector instances 67 | @classmethod 68 | def set_home_location(cls, home_location): 69 | global posvec_home_location 70 | posvec_home_location = home_location 71 | PositionVector.update_lon_scale(posvec_home_location.lat) 72 | 73 | # get_home_location - returns home location used by all position vector instances 74 | @classmethod 75 | def get_home_location(cls): 76 | global posvec_home_location 77 | return posvec_home_location 78 | 79 | # get_distance_xy - returns horizontal distance in meters from one position to another 80 | @classmethod 81 | def get_distance_xy(cls, pos1, pos2): 82 | dx = pos2.x - pos1.x 83 | dy = pos2.y - pos1.y 84 | return math.sqrt(dx**2+dy**2) 85 | 86 | # get_distance_xyz - returns distance in meters from one position to another 87 | @classmethod 88 | def get_distance_xyz(cls, pos1, pos2): 89 | dx = pos2.x - pos1.x 90 | dy = pos2.y - pos1.y 91 | dz = pos2.z - pos1.z 92 | return math.sqrt(dx**2+dy**2+dz**2) 93 | 94 | # get_bearing - returns bearing from origin to destination in radians 95 | @classmethod 96 | def get_bearing(cls, origin, destination): 97 | # avoid error when origin and destination are exactly on top of each other 98 | if destination.x == origin.x and destination.y == origin.y: 99 | return 0 100 | bearing = math.radians(90) + math.atan2(-(destination.x-origin.x), destination.y-origin.y) 101 | if bearing < 0: 102 | bearing += math.radians(360) 103 | return bearing 104 | 105 | # get_elevation - returns an elevation in radians from origin to destination 106 | @classmethod 107 | def get_elevation(cls, origin, destination): 108 | # avoid error when origin and
destination are exactly on top of each other 109 | if destination.x == origin.x and destination.y == origin.y and destination.z == origin.z: 110 | return 0 111 | 112 | # calculate distance to destination 113 | dist_xy = PositionVector.get_distance_xy(origin, destination) 114 | 115 | # calculate elevation 116 | elevation = -math.atan2(destination.z-origin.z, dist_xy) 117 | return elevation 118 | 119 | # update_lon_scale - updates the lon scaling factor to account for the curvature of the earth 120 | @classmethod 121 | def update_lon_scale(cls, lat): 122 | global posvec_lon_scale 123 | if lat != 0: 124 | posvec_lon_scale = math.cos(math.radians(lat)) 125 | 126 | # main - used to test the class 127 | def main(self): 128 | # set home position - to tridge's home field (this is just for testing anyway) 129 | PositionVector.set_home_location(LocationGlobal(-35.362938,149.165085,0)) 130 | print "Home %s" % PositionVector.get_home_location() 131 | home_pos = PositionVector(0,0,0) 132 | print "Home %s" % home_pos 133 | 134 | # other position 135 | other_pos = PositionVector.get_from_location(PositionVector.get_home_location()) 136 | print "Other %s" % other_pos 137 | 138 | # set vehicle to be 10m north, 10m east and 10m above home 139 | veh_pos = PositionVector(10,10,10) 140 | print "Vehicle %s" % veh_pos.get_location() 141 | print "Vehicle %s" % veh_pos 142 | print "Distance from home: %f" % PositionVector.get_distance_xyz(home_pos,veh_pos) 143 | 144 | 145 | # run the main routine if this file is called from the command line 146 | if __name__ == "__main__": 147 | dummy = PositionVector(0,0,0) 148 | dummy.main() -------------------------------------------------------------------------------- /scripts/raw.py: -------------------------------------------------------------------------------- 1 | # 2 | # raw.py - helps find the min and max Hue, Saturation and Brightness (aka Value) of a desired object 3 | # 4 | # Start from command line using 'python raw.py', select a window and press ESC at any
time to quit 5 | # 6 | # 1 window will be displayed at startup: 7 | # Colour Filters : displays the high and low colour filters for Hue, Saturation and Brightness (aka Value) and the tracking results. 8 | # the first 6 sliders are disabled until you set the 7th slider to OFF 9 | # 10 | # 3 Windows will be displayed when the 7th slider is set to OFF 11 | # Colour Filters : displays the raw image from the camera 12 | # Mask : displays a black and white mask where black parts will be removed, white parts will remain 13 | # Filtered Result: Original image with Mask applied. Only the desired object should be visible 14 | # 15 | # How to use: 16 | # Start the program and hold the object in front of the camera (the object should not fill the entire screen) 17 | # Increase the min and decrease the max Hue, Saturation and Brightness trackbars so that only the desired object is shown in the Filtered Result 18 | # Set the 7th slider to ON to see it tracking. 19 | # The 8th slider selects whether only the largest circle is chosen. 20 | import sys 21 | import cv2 22 | import cv2.cv 23 | import numpy as np 24 | 25 | # define image resolution 26 | img_width = 640 27 | img_height = 480 28 | 29 | # setup video capture 30 | video_capture = cv2.VideoCapture(0) 31 | video_capture.set(cv2.cv.CV_CAP_PROP_FRAME_WIDTH,img_width) 32 | video_capture.set(cv2.cv.CV_CAP_PROP_FRAME_HEIGHT,img_height) 33 | 34 | # check we can connect to camera 35 | if not video_capture.isOpened(): 36 | print "failed to open camera, exiting!"
37 | sys.exit(0) 38 | 39 | # call back for trackbar movements 40 | def empty_callback(x): 41 | pass 42 | 43 | # default filters -- all visible 44 | h_low = 0 45 | h_high = 255 46 | s_low = 0 47 | s_high = 255 48 | v_low = 0 49 | v_high = 255 50 | switch = 1 51 | num_circle = 1 52 | # default filter -- yellow tennis ball (Randy's default) 53 | h_low = 0 54 | h_high = 22 55 | s_low = 168 56 | s_high = 255 57 | v_low = 147 58 | v_high = 255 59 | #default filter -- orange paint lid (Phils Default) 60 | h_low = 0 61 | h_high = 22 62 | s_low = 168 63 | s_high = 255 64 | v_low = 147 65 | v_high = 255 66 | 67 | # create trackbars for color change 68 | cv2.namedWindow('Colour Filters') 69 | cv2.createTrackbar('Hue min','Colour Filters',h_low,255,empty_callback) 70 | cv2.createTrackbar('Hue max','Colour Filters',h_high,255,empty_callback) 71 | cv2.createTrackbar('Sat min','Colour Filters',s_low,255,empty_callback) 72 | cv2.createTrackbar('Sat max','Colour Filters',s_high,255,empty_callback) 73 | cv2.createTrackbar('Bgt min','Colour Filters',v_low,255,empty_callback) 74 | cv2.createTrackbar('Bgt max','Colour Filters',v_high,255,empty_callback) 75 | cv2.createTrackbar('0 : OFF \n1 : ON','Colour Filters',switch,1,empty_callback) 76 | cv2.createTrackbar('1 Circle \nAll Circles','Colour Filters',num_circle,1,empty_callback) 77 | 78 | #create basic output window 79 | out_win = np.zeros((300,512,3),np.uint8) 80 | 81 | 82 | while(1): 83 | 84 | # Take each frame 85 | _, frame = video_capture.read() 86 | 87 | switch = cv2.getTrackbarPos('0 : OFF \n1 : ON','Colour Filters') 88 | num_circle = cv2.getTrackbarPos('1 Circle \nAll Circles','Colour Filters') 89 | 90 | # Convert BGR to HSV 91 | hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV) 92 | 93 | # use trackbar positions to filter image 94 | colour_low = np.array([h_low,s_low,v_low]) 95 | colour_high = np.array([h_high,s_high,v_high]) 96 | 97 | # Threshold the HSV image 98 | mask = cv2.inRange(hsv, colour_low, colour_high) 99 | 100 | # blur 
the result 101 | #mask = cv2.medianBlur(mask,9) 102 | 103 | # Erode 104 | erode_kernel = np.ones((3,3),np.uint8); 105 | eroded_img = cv2.erode(mask,erode_kernel,iterations = 1) 106 | 107 | # dilate 108 | dilate_kernel = np.ones((10,10),np.uint8); 109 | dilate_img = cv2.dilate(eroded_img,dilate_kernel,iterations = 1) 110 | 111 | # Bitwise-AND mask and original image 112 | res = cv2.bitwise_and(frame,frame, mask= dilate_img) 113 | 114 | # create a grey version of the result 115 | grey_res = cv2.cvtColor(res,cv2.COLOR_BGR2GRAY) 116 | 117 | # threshold it to a black and white image 118 | #thresh_used, grey_res = cv2.threshold(grey_res,10,255,cv2.THRESH_BINARY) 119 | 120 | #grey_res = cv2.cvtColor(mask,cv2.COLOR_BGR2GRAY) 121 | # blur it to reduce false circles 122 | grey_res = cv2.medianBlur(grey_res,5) 123 | 124 | if (switch == 1): 125 | circles = cv2.HoughCircles(grey_res,cv2.cv.CV_HOUGH_GRADIENT,1,50,param1=50,param2=30,minRadius=0,maxRadius=0) 126 | #circles = cv2.HoughCircles(grey_res,cv2.cv.CV_HOUGH_GRADIENT,1,20) 127 | 128 | # check if any circles were found 129 | if not (circles is None): 130 | # print(circles) 131 | # draw circles around the circles 132 | circles = np.uint16(np.around(circles)) 133 | biggest = 0 134 | big_circle = 0 135 | for i in circles[0,:]: 136 | if (i[2] > biggest): 137 | biggest = i[2] 138 | # print biggest 139 | big_circle = i 140 | if (num_circle == 1): 141 | cv2.line(frame,(big_circle[0],1),(big_circle[0],640),5000,2) 142 | cv2.line(frame,(1,big_circle[1]),(640,big_circle[1]),5000,2) 143 | else: 144 | for i in circles[0,:]: 145 | # draw the outer circle 146 | cv2.circle(frame,(i[0],i[1]),i[2],(0,255,0),-1) 147 | cv2.line(frame,(i[0],1),(i[0],640),5000,2) 148 | cv2.line(frame,(1,i[1]),(640,i[1]),5000,2) 149 | # draw the center of the circle 150 | # cv2.circle(frame,(i[0],i[1]),2,(0,0,255),3) 151 | # print switch 152 | else: 153 | # get latest trackbar positions 154 | h_low = cv2.getTrackbarPos('Hue min','Colour Filters') 155 | h_high = 
cv2.getTrackbarPos('Hue max','Colour Filters') 156 | s_low = cv2.getTrackbarPos('Sat min','Colour Filters') 157 | s_high = cv2.getTrackbarPos('Sat max','Colour Filters') 158 | v_low = cv2.getTrackbarPos('Bgt min','Colour Filters') 159 | v_high = cv2.getTrackbarPos('Bgt max','Colour Filters') 160 | switch = cv2.getTrackbarPos('0 : OFF \n1 : ON','Colour Filters') 161 | num_circle = cv2.getTrackbarPos('1 Circle \nAll Circles','Colour Filters') 162 | 163 | cv2.imshow('Mask',mask) 164 | cv2.imshow('Filtered Result',res) 165 | #cv2.imshow('grey_res',grey_res) 166 | 167 | cv2.imshow('Colour Filters',frame) 168 | 169 | # set escape key for exit 170 | k = cv2.waitKey(5) & 0xFF 171 | if k == 27: 172 | break 173 | 174 | cv2.destroyAllWindows() 175 | -------------------------------------------------------------------------------- /scripts/attitude_history.py: -------------------------------------------------------------------------------- 1 | """ 2 | AttitudeHistory class : provides delayed attitude and location information 3 | """ 4 | 5 | import math 6 | import time 7 | from pymavlink import mavutil 8 | from dronekit import VehicleMode, Attitude 9 | from balloon_utils import wrap_PI 10 | 11 | class AttitudeHistory(object): 12 | 13 | def __init__(self, dronekit_vehicle, max_delay): 14 | self.vehicle = dronekit_vehicle # reference to drone api's vehicle for retrieving attitude and location 15 | self.max_delay = max_delay # maximum expected delay in seconds 16 | self.last_update = 0 # system time of last update call 17 | self.att_dict = dict() # initialise attitude dictionary 18 | 19 | self.max_update_rate = 0.02 # do not store updates at more than 50hz 20 | # debug testing counter 21 | self.testing_counter = 0 22 | 23 | # __str__ - print contents of dictionary 24 | def __str__(self): 25 | return "AttHist MaxDelay:%d \nAttDict:%s" % (self.max_delay, self.att_dict) 26 | 27 | # update - captures an attitude from the drone api and stores in a dictionary 28 | def update(self): 29 | 
# get current time 30 | now = time.time() 31 | 32 | # don't update if it's too soon since the last update 33 | time_diff = now - self.last_update 34 | if time_diff < self.max_update_rate: 35 | return 36 | 37 | # exit immediately if vehicle is not initialised 38 | if self.vehicle is None: 39 | return 40 | 41 | # check attitude is initialised 42 | if not self.vehicle.attitude is None: 43 | # add attitude to dictionary 44 | self.att_dict[now] = self.vehicle.attitude 45 | # debug 46 | #print "Add Att: t:%f r:%f p:%f y:%f" % (now, self.vehicle.attitude.roll, self.vehicle.attitude.pitch, self.vehicle.attitude.yaw) 47 | 48 | # clear out any old entries 49 | for t in self.att_dict.keys(): 50 | if t < now - self.max_delay: 51 | self.att_dict.pop(t) 52 | 53 | # store last update time 54 | self.last_update = now 55 | 56 | # get_attitude - get attitude at given time 57 | def get_attitude(self, desired_time_in_sec): 58 | # return current attitude immediately if dictionary is empty 59 | if len(self.att_dict) == 0: 60 | return self.vehicle.attitude 61 | 62 | # get times from dict 63 | keys = sorted(self.att_dict.keys()) 64 | 65 | # debug 66 | #print "AttDict looking for %f" % desired_time_in_sec 67 | 68 | # initialise best before and after keys 69 | key_before = keys[0] 70 | time_diff_before = (key_before - desired_time_in_sec) 71 | key_after = keys[len(keys)-1] 72 | time_diff_after = (key_after - desired_time_in_sec) 73 | 74 | # handle case where we hit the time exactly or the time is beyond the end 75 | if (time_diff_before >= 0): 76 | #debug 77 | #print "Time %f was before first entry's time %f" % (desired_time_in_sec, key_before) 78 | return self.att_dict[key_before] 79 | if (time_diff_after <= 0): 80 | #debug 81 | #print "Time %f was after last entry's time %f" % (desired_time_in_sec, key_after) 82 | return self.att_dict[key_after] 83 | 84 | # iteration through attitude dictionary 85 | for t in keys: 86 | # calc this dictionary entry's time diff from the desired time 87 | 
time_diff = t - desired_time_in_sec 88 | 89 | # if before the desired time but after the current best 'before' time use it 90 | if time_diff <= 0 and time_diff > time_diff_before: 91 | time_diff_before = time_diff 92 | key_before = t 93 | 94 | # if after the desired time but before the current best 'after' time use it 95 | if time_diff >= 0 and time_diff < time_diff_after: 96 | time_diff_after = time_diff 97 | key_after = t 98 | 99 | # calc time between before and after attitudes 100 | tot_time_diff = -time_diff_before + time_diff_after 101 | 102 | # debug 103 | if (tot_time_diff <= 0): 104 | print "Div By Zero!" 105 | print "des time:%f" % desired_time_in_sec 106 | print "num keys:%d" % len(keys) 107 | print "key bef:%f aft:%f" % (key_before, key_after) 108 | print "keys: %s" % keys 109 | 110 | # get attitude before and after 111 | att_before = self.att_dict[key_before] 112 | att_after = self.att_dict[key_after] 113 | 114 | # interpolate roll, pitch and yaw values 115 | interp_val = (-time_diff_before / tot_time_diff) 116 | roll = wrap_PI(att_before.roll + (wrap_PI(att_after.roll - att_before.roll) * interp_val)) 117 | pitch = att_before.pitch + (att_after.pitch - att_before.pitch) * interp_val 118 | yaw = wrap_PI(att_before.yaw + (wrap_PI(att_after.yaw - att_before.yaw) * interp_val)) 119 | 120 | ret_att = Attitude(pitch, yaw, roll) 121 | 122 | return ret_att 123 | 124 | # main - test the class 125 | def main(self): 126 | 127 | # print object 128 | print "Test AttitudeHistory" 129 | 130 | for i in range(0,40): 131 | # add some entries 132 | self.update() 133 | time.sleep(0.1) 134 | 135 | # print out dictionaries 136 | print str(self) 137 | 138 | # retrieve attitude 0.2 sec before 139 | att = self.get_attitude(time.time()-0.2) 140 | print "Att 0.2 sec ago: %s" % att 141 | 142 | # test entry which is later than last item 143 | #att = self.get_attitude(time.time()) 144 | #print "Att Now: %s" % att 145 | 146 | # test entry which is earlier than earliest item 147 |
#att = self.get_attitude(time.time()-10) 148 | #print "Att 10 sec ago: %s" % att 149 | 150 | # retrieve attitude from last time stored 151 | #keys = sorted(self.att_dict.keys()) 152 | #att = self.get_attitude(keys[len(keys)-1]) 153 | #print "Got attitude: %s" % att 154 | 155 | # retrieve attitude from earliest time stored 156 | #keys = sorted(self.att_dict.keys()) 157 | #att = self.get_attitude(keys[0]) 158 | #print "Got attitude: %s" % att 159 | 160 | # test wrap 161 | 162 | # test items are cleared when they get too old 163 | 164 | ''' 165 | # run the main routine if this is file is called from the command line 166 | if __name__ == "__main__": 167 | 168 | # for testing from the command line 169 | print "No command line tests supported" 170 | 171 | else: 172 | # for testing in the simulator 173 | 174 | connection_str = balloon_config.config.get_string('dronekit','connection_string','/dev/ttyUSB0') 175 | connection_baud = balloon_config.config.get_integer('dronekit','baud',921600) 176 | vehicle = dronekit.connect(connection_str, connection_baud) 177 | 178 | # create test attitude history class with maximum 2 second delay 179 | test_atthist = AttitudeHistory(vehicle, 2.0) 180 | 181 | # run tests 182 | test_atthist.main() 183 | ''' -------------------------------------------------------------------------------- /smart_camera/sc_video.py: -------------------------------------------------------------------------------- 1 | """ 2 | sc_video.py 3 | 4 | This file includes functions to: 5 | Initialise the camera 6 | Initialise the output video 7 | 8 | Image size is held in the smart_camera.cnf 9 | """ 10 | 11 | import sys 12 | from os.path import expanduser 13 | import time 14 | import math 15 | from multiprocessing import Process, Pipe 16 | import cv2 17 | import sc_config 18 | 19 | class SmartCameraVideo: 20 | 21 | def __init__(self): 22 | # get image resolution 23 | self.img_width = sc_config.config.get_integer('camera','width',640) 24 | self.img_height = 
sc_config.config.get_integer('camera','height',480) 25 | 26 | # get image center 27 | self.img_center_x = self.img_width / 2 28 | self.img_center_y = self.img_height / 2 29 | 30 | # define field of view 31 | self.cam_hfov = sc_config.config.get_float('camera','horizontal-fov',70.42) 32 | self.cam_vfov = sc_config.config.get_float('camera','vertical-fov',43.3) 33 | 34 | # define video output filename 35 | self.video_filename = sc_config.config.get_string('camera','video_output_file','~/smartcamera-%Y-%m-%d-%H-%M.avi') 36 | self.video_filename = expanduser(self.video_filename) 37 | self.video_filename = time.strftime(self.video_filename) 38 | 39 | # background image processing variables 40 | self.proc = None # background process object 41 | self.parent_conn = None # parent end of communication pipe 42 | self.img_counter = 0 # num images requested so far 43 | 44 | # __str__ - print smart camera video object as string 45 | def __str__(self): 46 | return "SmartCameraVideo Object W:%d H:%d" % (self.img_width, self.img_height) 47 | 48 | # get_camera - initialises camera and returns VideoCapture object 49 | def get_camera(self): 50 | # setup video capture 51 | self.camera = cv2.VideoCapture(0) 52 | self.camera.set(cv2.cv.CV_CAP_PROP_FRAME_WIDTH,self.img_width) 53 | self.camera.set(cv2.cv.CV_CAP_PROP_FRAME_HEIGHT,self.img_height) 54 | 55 | # check we can connect to camera 56 | if not self.camera.isOpened(): 57 | print "failed to open camera, exiting!"
58 | sys.exit(0) 59 | 60 | return self.camera 61 | 62 | # open_video_writer - begin writing to video file 63 | def open_video_writer(self): 64 | # Define the codec and create VideoWriter object 65 | # Note: setting ex to -1 will display pop-up requesting user choose the encoder 66 | ex = int(cv2.cv.CV_FOURCC('M','J','P','G')) 67 | self.video_writer = cv2.VideoWriter(self.video_filename, ex, 25, (self.img_width,self.img_height)) 68 | 69 | return self.video_writer 70 | 71 | # pixels_to_angle_x - converts a number of pixels into an angle in radians 72 | def pixels_to_angle_x(self, num_pixels): 73 | return num_pixels * math.radians(self.cam_hfov) / self.img_width 74 | 75 | # pixels_to_angle_y - converts a number of pixels into an angle in radians 76 | def pixels_to_angle_y(self, num_pixels): 77 | return num_pixels * math.radians(self.cam_vfov) / self.img_height 78 | 79 | # angle_to_pixels_x - converts a horizontal angle (i.e. yaw) to a number of pixels 80 | # angle : angle in radians 81 | def angle_to_pixels_x(self, angle): 82 | return int(angle * self.img_width / math.radians(self.cam_hfov)) 83 | 84 | # angle_to_pixels_y - converts a vertical angle (i.e. 
pitch) to a number of pixels 85 | # angle : angle in radians 86 | def angle_to_pixels_y(self, angle): 87 | return int(angle * self.img_height / math.radians(self.cam_vfov)) 88 | 89 | # 90 | # background image processing routines 91 | # 92 | 93 | # image_capture_background - captures all images from the camera in the background and returns the latest image via the pipe when the parent requests it 94 | def image_capture_background(self, imgcap_connection): 95 | # exit immediately if imgcap_connection is invalid 96 | if imgcap_connection is None: 97 | print "image_capture failed because pipe is uninitialised" 98 | return 99 | 100 | # open the camera 101 | camera = self.get_camera() 102 | 103 | # clear latest image 104 | latest_image = None 105 | 106 | while True: 107 | # constantly get the image from the webcam 108 | success_flag, image=camera.read() 109 | 110 | # if successful overwrite our latest image 111 | if success_flag: 112 | latest_image = image 113 | 114 | # check if the parent wants the image 115 | if imgcap_connection.poll(): 116 | recv_obj = imgcap_connection.recv() 117 | # if -1 is received we exit 118 | if recv_obj == -1: 119 | break 120 | 121 | # otherwise we return the latest image 122 | imgcap_connection.send(latest_image) 123 | 124 | # release camera when exiting 125 | camera.release() 126 | 127 | # start_background_capture - starts background image capture 128 | def start_background_capture(self): 129 | # create pipe 130 | self.parent_conn, imgcap_conn = Pipe() 131 | 132 | # create and start the sub process and pass it its end of the pipe 133 | self.proc = Process(target=self.image_capture_background, args=(imgcap_conn,)) 134 | self.proc.start() 135 | 136 | def stop_background_capture(self): 137 | # send exit command to image capture process 138 | self.parent_conn.send(-1) 139 | 140 | # join process 141 | self.proc.join() 142 | 143 | # get_image - returns latest image from the camera captured from the background process 144 | def get_image(self):
145 | # return immediately if pipe is not initialised 146 | if self.parent_conn is None: 147 | return None 148 | 149 | # send request to image capture for image 150 | self.parent_conn.send(self.img_counter) 151 | 152 | # increment counter for next iteration 153 | self.img_counter = self.img_counter + 1 154 | 155 | # wait endlessly until image is returned 156 | recv_img = self.parent_conn.recv() 157 | 158 | # return image to caller 159 | return recv_img 160 | 161 | # main - tests SmartCameraVideo class 162 | def main(self): 163 | 164 | # start background process 165 | self.start_background_capture() 166 | 167 | while True: 168 | # send request to image capture for image 169 | img = self.get_image() 170 | 171 | # check image is valid 172 | if not img is None: 173 | # display image 174 | cv2.imshow('image_display', img) 175 | else: 176 | print "no image" 177 | 178 | # check for ESC key being pressed 179 | k = cv2.waitKey(5) & 0xFF 180 | if k == 27: 181 | break 182 | 183 | # take a rest for a bit 184 | time.sleep(0.1) 185 | 186 | # send exit command to image capture process 187 | self.stop_background_capture() 188 | 189 | print "a2p 10 = %f" % self.angle_to_pixels_x(10) 190 | print "p2a 10 = %f" % self.pixels_to_angle_x(10) 191 | 192 | # create a single global object 193 | sc_video = SmartCameraVideo() 194 | 195 | if __name__ == "__main__": 196 | sc_video.main() -------------------------------------------------------------------------------- /smart_camera/sc_main.py: -------------------------------------------------------------------------------- 1 | import time 2 | import math 3 | import cv2 4 | from pymavlink import mavutil 5 | from droneapi.lib import VehicleMode, Location 6 | import sc_config 7 | from sc_video import sc_video 8 | from sc_webcam import SmartCameraWebCam 9 | 10 | """ 11 | sc_main.py - runs top level smart camera function 12 | 13 | To run this module: 14 | * Run mavproxy.py with the correct options to connect to your vehicle 15 | * module load api
16 | * api start sc_main.py 17 | 18 | (Once tested we can put these directives into a mavinit.scr file and mavproxy will load/run 19 | this code automatically) 20 | 21 | """ 22 | 23 | class SmartCamera(object): 24 | def __init__(self, use_api): 25 | 26 | # if using droneapi connect to vehicle 27 | if (use_api): 28 | # First get an instance of the API endpoint (the connect via web case will be similar) 29 | self.api = local_connect() 30 | 31 | # Our vehicle (we assume the user is trying to control the first vehicle attached to the GCS) 32 | self.vehicle = self.api.get_vehicles()[0] 33 | else: 34 | self.api = None 35 | self.vehicle = None 36 | 37 | # initialised flag 38 | self.home_initialised = False 39 | # timer to intermittently check for home position 40 | self.last_home_check = time.time() 41 | self.home_location = None 42 | 43 | # vehicle mission 44 | self.mission_cmds = None 45 | 46 | # check if we should display debug messages 47 | self.debug = sc_config.config.get_boolean('general','debug',True) 48 | 49 | # register cameras 50 | self.register_cameras() 51 | 52 | # initialise video writer 53 | self.writer = None 54 | 55 | # register_cameras - creates camera objects based on camera-type configuration 56 | def register_cameras(self): 57 | 58 | # initialise list 59 | self.camera_list = [] 60 | 61 | # look for up to 2 cameras 62 | for i in range(0,2): 63 | config_group = "camera%d" % i 64 | camera_type = sc_config.config.get_integer(config_group, 'type', 0) 65 | # webcam 66 | if camera_type == 1: 67 | new_camera = SmartCameraWebCam(i) 68 | self.camera_list = self.camera_list + [new_camera] 69 | 70 | # display number of cameras found 71 | print "cameras found: %d" % len(self.camera_list) 72 | 73 | # fetch_mission - fetch mission from flight controller 74 | def fetch_mission(self): 75 | # download the vehicle waypoints 76 | self.mission_cmds = self.vehicle.commands 77 | self.mission_cmds.download() 78 | self.mission_cmds.wait_valid() 79 | 80 | # check home -
intermittently checks for changes to the home location 81 | def check_home(self): 82 | 83 | # return immediately if home has already been initialised 84 | if self.home_initialised: 85 | return True 86 | 87 | # check for home no more than once every two seconds 88 | if (time.time() - self.last_home_check > 2): 89 | 90 | # update that we have performed a status check 91 | self.last_home_check = time.time() 92 | 93 | # check if we have a vehicle 94 | if self.vehicle is None: 95 | self.vehicle = self.api.get_vehicles()[0] 96 | return False 97 | 98 | # ensure the vehicle's position is known 99 | if self.vehicle.location is None: 100 | return False 101 | if self.vehicle.location.lat is None or self.vehicle.location.lon is None or self.vehicle.location.alt is None: 102 | return False 103 | 104 | # download the vehicle waypoints if we don't have them already 105 | if self.mission_cmds is None: 106 | self.fetch_mission() 107 | return False 108 | 109 | # get the home lat and lon 110 | home_lat = self.mission_cmds[0].x 111 | home_lon = self.mission_cmds[0].y 112 | home_alt = self.mission_cmds[0].z 113 | 114 | # sanity check the home position 115 | if home_lat is None or home_lon is None or home_alt is None: 116 | return False 117 | 118 | # sanity check again and set home position 119 | if home_lat != 0 and home_lon != 0: 120 | self.home_location = Location(home_lat,home_lon,home_alt) 121 | self.home_initialised = True 122 | else: 123 | self.mission_cmds = None 124 | 125 | # To-Do: if we wish to have the same home position as the flight controller 126 | # we must download the home waypoint again whenever the vehicle is armed 127 | 128 | # return whether home has been initialised or not 129 | return self.home_initialised 130 | 131 | # checks if video output should be started 132 | def check_video_out(self): 133 | 134 | # return immediately if video has already been started 135 | if not self.writer is None: 136 | return 137 | 138 | # start video once vehicle is armed 139 | if
self.vehicle.armed: 140 | self.writer = sc_video.open_video_writer() 141 | 142 | # check_status - polls the vehicle's status to determine if we should record video or not 143 | def check_status(self): 144 | 145 | # download the vehicle waypoints if we don't have them already 146 | # To-Do: do not load waypoints if vehicle is armed 147 | if self.mission_cmds is None: 148 | self.fetch_mission() 149 | return 150 | 151 | # take_picture_all - ask all cameras to take a picture 152 | def take_picture_all(self): 153 | for cam in self.camera_list: 154 | cam.take_picture() 155 | 156 | # save_picture_all - ask all cameras for their latest image and save to files 157 | def save_picture_all(self): 158 | cam_num = 0 159 | for cam in self.camera_list: 160 | img = cam.get_latest_image() 161 | # display image 162 | window_name = "cam%d" % cam_num 163 | cv2.imshow(window_name, img) 164 | # write to file 165 | #imgfilename = "C:\Users\rmackay9\Documents\GitHub\ardupilot-balloon-finder\smart_camera\img%d-%d.jpg" % (cam_num,cam.get_image_counter()) 166 | imgfilename = "C:\Temp\img%d-%d.jpg" % (cam_num,cam.get_image_counter()) 167 | print imgfilename 168 | cv2.imwrite(imgfilename, img) 169 | 170 | # check for ESC key being pressed 171 | k = cv2.waitKey(5) & 0xFF 172 | if k == 27: 173 | break 174 | cam_num = cam_num + 1 175 | 176 | # get image from sc_video class and write to file 177 | def analyze_image(self): 178 | 179 | # record time 180 | now = time.time() 181 | 182 | # get new image from camera 183 | f = sc_video.get_image() 184 | 185 | # save image for debugging later 186 | if not self.writer is None: 187 | self.writer.write(f) 188 | 189 | def run(self): 190 | while True: 191 | # ask all cameras to take a picture 192 | self.take_picture_all() 193 | 194 | # store images to disk 195 | self.save_picture_all() 196 | 197 | # Don't suck up too much CPU, only process a new image occasionally 198 | time.sleep(2.0) 199 | 200 | ''' 201 | while not self.api.exit: 202 | 203 | # only process images
once home has been initialised 204 | if self.check_home(): 205 | 206 | # start video if required 207 | self.check_video_out() 208 | 209 | # check if we are controlling the vehicle 210 | self.check_status() 211 | 212 | # ask all cameras to take a picture 213 | self.take_picture_all() 214 | 215 | # Don't suck up too much CPU, only process a new image occasionally 216 | time.sleep(2.0) 217 | 218 | if not self.use_simulator: 219 | sc_video.stop_background_capture() 220 | ''' 221 | 222 | # initialise depending upon whether running from command line or as part of mavproxy 223 | if __name__ == "__main__": 224 | sc_main = SmartCamera(False) 225 | else: 226 | sc_main = SmartCamera(True) 227 | 228 | # run the smart camera 229 | sc_main.run() 230 | 231 | -------------------------------------------------------------------------------- /scripts/fake_balloon.py: -------------------------------------------------------------------------------- 1 | # 2 | # fake_balloon.py - creates an image of a balloon given a vehicle position, vehicle attitude and balloon position 3 | # 4 | # Start from command line using 'python fake_balloon.py'. 5 | # The program will run for 10 seconds and will create a video file, "balloon_finder.avi", which contains the original video along with 6 | # super-imposed circles showing where circles were detected 7 | # 8 | # How to use: 9 | # run colour_finder.py to find the best min and max Hue, Saturation and Brightness (aka Value) levels.
Enter these into h_low, h_high, etc below 10 | # run this script and hold the object in front of the camera and move it around 11 | # check the balloon_finder.avi file to ensure the super-imposed circles accurately follow the object 12 | 13 | import sys 14 | from time import time 15 | import math 16 | import cv2 17 | import numpy 18 | from dronekit import LocationGlobal 19 | import balloon_config 20 | from balloon_video import balloon_video 21 | import balloon_utils 22 | from position_vector import PositionVector 23 | from find_balloon import balloon_finder 24 | 25 | class BalloonSimulator(object): 26 | 27 | # constructor 28 | def __init__(self): 29 | 30 | # read fake balloon location from config file 31 | self.fake_balloon_location = LocationGlobal(balloon_config.config.get_float('fake-balloon', 'lat',-35.363274), 32 | balloon_config.config.get_float('fake-balloon', 'lon',149.164630), 33 | balloon_config.config.get_float('fake-balloon', 'alt',15)) 34 | 35 | # fake balloon's colour is mid way between colour filter's low and high values 36 | h = (balloon_finder.filter_low[0] + balloon_finder.filter_high[0]) / 2 37 | s = (balloon_finder.filter_low[1] + balloon_finder.filter_high[1]) / 2 38 | v = (balloon_finder.filter_low[2] + balloon_finder.filter_high[2]) / 2 39 | 40 | # convert colour to BGR palette 41 | fake_balloon_colour_bgr = cv2.cvtColor(numpy.uint8([[[h,s,v]]]),cv2.COLOR_HSV2BGR) 42 | self.fake_balloon_colour_bgr_scalar = [fake_balloon_colour_bgr.item(0), fake_balloon_colour_bgr.item(1), fake_balloon_colour_bgr.item(2)] 43 | 44 | # fake balloon is same radius as actual balloon 45 | self.fake_balloon_radius = balloon_finder.balloon_radius_expected 46 | 47 | # background sky and ground colours 48 | self.background_sky_colour_bgr = (232, 228, 227) 49 | self.background_ground_colour_bgr_scalar = [87, 145, 158] 50 | 51 | # last iterations balloon radius 52 | self.last_balloon_radius = 0 53 | 54 | 55 | # get_background - returns a background image given a roll and 
pitch angle 56 | # vehicle_roll and pitch are in radians 57 | def get_background(self, vehicle_roll, vehicle_pitch): 58 | 59 | # create sky coloured image 60 | image = numpy.zeros((balloon_video.img_height, balloon_video.img_width, 3),numpy.uint8) 61 | image[:] = self.background_sky_colour_bgr 62 | 63 | # create large rectangle which will become the ground 64 | top_left = balloon_utils.rotate_xy(balloon_video.img_center_x-1000, balloon_video.img_center_y, -vehicle_roll) 65 | top_right = balloon_utils.rotate_xy(balloon_video.img_center_x+1000, balloon_video.img_center_y, -vehicle_roll) 66 | bot_left = balloon_utils.rotate_xy(balloon_video.img_center_x-1000,balloon_video.img_center_y+1000, -vehicle_roll) 67 | bot_right = balloon_utils.rotate_xy(balloon_video.img_center_x+1000,balloon_video.img_center_y+1000, -vehicle_roll) 68 | 69 | # calculate vertical pixel shift 70 | pitch_pixel_shift = balloon_video.angle_to_pixels_y(vehicle_pitch) 71 | 72 | # add pitch adjustment 73 | top_left = balloon_utils.shift_pixels_down(top_left, pitch_pixel_shift) 74 | top_right = balloon_utils.shift_pixels_down(top_right, pitch_pixel_shift) 75 | bot_left = balloon_utils.shift_pixels_down(bot_left, pitch_pixel_shift) 76 | bot_right = balloon_utils.shift_pixels_down(bot_right, pitch_pixel_shift) 77 | 78 | # draw horizon 79 | box = numpy.array([top_left, top_right, bot_right, bot_left],numpy.int32) 80 | cv2.fillConvexPoly(image, box, self.background_ground_colour_bgr_scalar) 81 | 82 | return image 83 | 84 | # draw_fake_balloon - draws fake balloon in the frame at the specified roll, pitch and yaw angle 85 | # veh_pos : PositionVector holding the vehicle's offset from home 86 | # balloon_pos : PositionVector holding the balloon's offset from home 87 | # vehicle roll, pitch and yaw angles should be in radians 88 | def draw_fake_balloon(self, frame, veh_pos, balloon_pos, vehicle_roll, vehicle_pitch, vehicle_yaw): 89 | # calculate bearing to balloon 90 | bearing_to_balloon = 
PositionVector.get_bearing(veh_pos, balloon_pos) 91 | yaw_to_balloon = balloon_utils.wrap_PI(bearing_to_balloon-vehicle_yaw) 92 | 93 | # calculate earth frame pitch angle from vehicle to balloon 94 | pitch_to_balloon = vehicle_pitch + PositionVector.get_elevation(veh_pos, balloon_pos) 95 | 96 | #print "Fake Balloon Bearing:%f Pitch:%f Dist:%f" % (math.degrees(bearing_to_balloon), math.degrees(pitch_to_balloon), dist_to_balloon_xy) 97 | 98 | # calculate pixel position of balloon 99 | balloon_x = balloon_video.angle_to_pixels_x(yaw_to_balloon) + balloon_video.img_center_x 100 | balloon_y = balloon_video.angle_to_pixels_y(pitch_to_balloon) + balloon_video.img_center_y 101 | 102 | # calculate size of balloon in pixels from distance and size 103 | dist_to_balloon_xyz = PositionVector.get_distance_xyz(veh_pos, balloon_pos) 104 | balloon_radius = balloon_utils.get_pixels_from_distance(dist_to_balloon_xyz, balloon_finder.balloon_radius_expected) 105 | 106 | # store balloon radius 107 | self.last_balloon_radius = balloon_radius 108 | 109 | # draw balloon 110 | cv2.circle(frame,(balloon_x,balloon_y), balloon_radius, self.fake_balloon_colour_bgr_scalar, -1) 111 | 112 | # get_simulated_frame - returns an image of a simulated background and balloon based upon vehicle position, vehicle attitude and balloon position 113 | def get_simulated_frame(self, veh_pos, vehicle_roll, vehicle_pitch, vehicle_yaw): 114 | # get balloon position 115 | balloon_pos = PositionVector.get_from_location(self.fake_balloon_location) 116 | # get background 117 | sim_frame = self.get_background(vehicle_roll, vehicle_pitch) 118 | # draw balloon on background 119 | self.draw_fake_balloon(sim_frame, veh_pos, balloon_pos, vehicle_roll, vehicle_pitch, vehicle_yaw) 120 | return sim_frame 121 | 122 | # main - tests this class 123 | def main(self): 124 | 125 | # set home to tridge's home field (absolute alt = 270) 126 | PositionVector.set_home_location(LocationGlobal(-35.362938,149.165085,0)) 127 | 128 | # 
calculate balloon position 129 | fake_balloon_pos = PositionVector.get_from_location(self.fake_balloon_location) 130 | 131 | # vehicle attitude and position 132 | veh_pos = PositionVector(0,0,fake_balloon_pos.z) # at home location 133 | veh_roll = math.radians(0) # leaned right 0 deg 134 | veh_pitch = math.radians(0) # pitched back at 0 deg 135 | veh_yaw = PositionVector.get_bearing(veh_pos,fake_balloon_pos) # facing towards fake balloon 136 | 137 | # display positions from home 138 | print "Vehicle %s" % veh_pos 139 | print "Balloon %s" % fake_balloon_pos 140 | 141 | # generate simulated frame of the balloon as seen from the vehicle 142 | img = self.get_simulated_frame(veh_pos, veh_roll, veh_pitch, veh_yaw) 143 | 144 | while True: 145 | # move vehicle towards balloon 146 | veh_pos = veh_pos + (fake_balloon_pos - veh_pos) * 0.01 147 | 148 | # regenerate frame 149 | img = self.get_simulated_frame(veh_pos, veh_roll, veh_pitch, veh_yaw) 150 | 151 | # look for balloon in image using blob detector 152 | found_in_image, xpos, ypos, size = balloon_finder.analyse_frame(img) 153 | 154 | # display estimated vs actual distance 155 | dist_actual = PositionVector.get_distance_xyz(veh_pos, fake_balloon_pos) 156 | dist_est = balloon_utils.get_distance_from_pixels(size, balloon_finder.balloon_radius_expected) 157 | print "Dist Est:%f Act:%f Size Est:%f Act:%f" % (dist_est, dist_actual, size, self.last_balloon_radius) 158 | 159 | # show image 160 | cv2.imshow("fake balloon", img) 161 | 162 | # wait for keypress 163 | k = cv2.waitKey(5) & 0xFF 164 | if k == 27: 165 | break 166 | 167 | # destroy windows 168 | cv2.destroyAllWindows() 169 | 170 | # declare global instance 171 | balloon_sim = BalloonSimulator() 172 | 173 | # call main if this file is being run from the command line 174 | if __name__ == "__main__": 175 | balloon_sim.main() 176 | -------------------------------------------------------------------------------- /scripts/balloon_video.py:
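Both SmartCameraVideo and BalloonVideo map between pixel offsets and angles with the same linear field-of-view model (full image width spans the full horizontal FOV). A minimal standalone sketch of that conversion, for reference; the camera constants below are the repo's config defaults (640x480, 70.42/43.3 degree FOV) and the functions mirror, but are not part of, the classes in the listings:

```python
import math

# default camera parameters from balloon_finder.cnf
IMG_WIDTH = 640
CAM_HFOV = 70.42   # horizontal field of view in degrees

def pixels_to_angle_x(num_pixels):
    # linear model: the full image width spans the full horizontal FOV
    return num_pixels * math.radians(CAM_HFOV) / IMG_WIDTH

def angle_to_pixels_x(angle):
    # inverse mapping, truncated to whole pixels as in the classes above
    return int(angle * IMG_WIDTH / math.radians(CAM_HFOV))

# half the image width should correspond to half the horizontal FOV
half_fov = pixels_to_angle_x(IMG_WIDTH / 2)
print(math.degrees(half_fov))
```

Note the model is only an approximation for a real lens (it ignores distortion), which is adequate at the small offsets the balloon finder works with.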
-------------------------------------------------------------------------------- 1 | """ 2 | balloon_video.py 3 | 4 | This file includes functions to: 5 | Initialise the camera 6 | Initialise the output video 7 | 8 | Image size is held in the balloon_finder.cnf 9 | """ 10 | 11 | import sys 12 | from os.path import expanduser 13 | import time 14 | import math 15 | from multiprocessing import Process, Pipe 16 | import cv2 17 | import balloon_config 18 | 19 | # import pi camera if present 20 | try: 21 | from picamera.array import PiRGBArray as PiRGBArray 22 | from picamera import PiCamera as PiCamera 23 | except ImportError: 24 | print "picamera not installed - please install if running on RPI" 25 | 26 | class BalloonVideo: 27 | 28 | def __init__(self): 29 | # camera type - 0=web cam, 1=RPI camera 30 | self.camera_type = balloon_config.config.get_integer('camera','type',1) 31 | 32 | # camera objects 33 | self.camera = None 34 | 35 | # get image resolution 36 | self.img_width = balloon_config.config.get_integer('camera','width',640) 37 | self.img_height = balloon_config.config.get_integer('camera','height',480) 38 | 39 | # get image center 40 | self.img_center_x = self.img_width / 2 41 | self.img_center_y = self.img_height / 2 42 | 43 | # define field of view 44 | self.cam_hfov = balloon_config.config.get_float('camera','horizontal-fov',70.42) 45 | self.cam_vfov = balloon_config.config.get_float('camera','vertical-fov',43.3) 46 | 47 | # define video output filename 48 | self.video_filename = balloon_config.config.get_string('camera','video_output_file','~/balloon-%Y-%m-%d-%H-%M.avi') 49 | self.video_filename = expanduser(self.video_filename) 50 | self.video_filename = time.strftime(self.video_filename) 51 | 52 | # background image processing variables 53 | self.proc = None # background process object 54 | self.parent_conn = None # parent end of communication pipe 55 | self.img_counter = 0 # num images requested so far 56 | 57 | # __str__ - return summary of object as
string 58 | def __str__(self): 59 | return "BalloonVideo Object W:%d H:%d" % (self.img_width, self.img_height) 60 | 61 | # initialise camera 62 | def init_camera(self): 63 | # return immediately if already initialised 64 | if not self.camera is None: 65 | return 66 | 67 | # use webcam 68 | if self.camera_type == 0: 69 | self.camera = cv2.VideoCapture(0) 70 | #self.camera.set(cv2.cv.CV_CAP_PROP_FRAME_WIDTH,self.img_width) 71 | #self.camera.set(cv2.cv.CV_CAP_PROP_FRAME_HEIGHT,self.img_height) 72 | 73 | # check we can connect to camera 74 | if not self.camera.isOpened(): 75 | print "failed to open camera, exiting!" 76 | sys.exit(0) 77 | 78 | # use rpi camera 79 | if self.camera_type == 1: 80 | self.camera = PiCamera() 81 | self.camera.resolution = (self.img_width,self.img_height) 82 | # to-do: check we can connect to camera 83 | 84 | # close camera 85 | def close_camera(self): 86 | # return immediately if camera was never initialised 87 | if self.camera is None: 88 | return 89 | # use webcam 90 | if self.camera_type == 0: 91 | self.camera.release() 92 | # use rpi camera 93 | if self.camera_type == 1: 94 | self.camera.close() 95 | 96 | # capture image from camera 97 | def capture_image(self): 98 | # check camera is initialised 99 | self.init_camera() 100 | 101 | # use webcam 102 | if self.camera_type == 0: 103 | success_flag, image=self.camera.read() 104 | return image 105 | 106 | # use rpi camera 107 | if self.camera_type == 1: 108 | image_array = PiRGBArray(self.camera) 109 | self.camera.capture(image_array, format="bgr") 110 | image = image_array.array 111 | return image 112 | 113 | # open_video_writer - begin writing to video file 114 | def open_video_writer(self): 115 | # Define the codec and create VideoWriter object 116 | # Note: setting ex to -1 will display pop-up requesting user choose the encoder 117 | ex = cv2.VideoWriter_fourcc('M','J','P','G') 118 | self.video_writer = cv2.VideoWriter(self.video_filename, ex, 25, (self.img_width,self.img_height)) 119 | if not
self.video_writer is None: 120 | print "started recording video to %s" % self.video_filename 121 | else: 122 | print "failed to start recording video to %s" % self.video_filename 123 | return self.video_writer 124 | 125 | # pixels_to_angle_x - converts a number of pixels into an angle in radians 126 | def pixels_to_angle_x(self, num_pixels): 127 | return num_pixels * math.radians(self.cam_hfov) / self.img_width 128 | 129 | # pixels_to_angle_y - converts a number of pixels into an angle in radians 130 | def pixels_to_angle_y(self, num_pixels): 131 | return num_pixels * math.radians(self.cam_vfov) / self.img_height 132 | 133 | # angle_to_pixels_x - converts a horizontal angle (i.e. yaw) to a number of pixels 134 | # angle : angle in radians 135 | def angle_to_pixels_x(self, angle): 136 | return int(angle * self.img_width / math.radians(self.cam_hfov)) 137 | 138 | # angle_to_pixels_y - converts a vertical angle (i.e. pitch) to a number of pixels 139 | # angle : angle in radians 140 | def angle_to_pixels_y(self, angle): 141 | return int(angle * self.img_height / math.radians(self.cam_vfov)) 142 | 143 | # 144 | # background image processing routines 145 | # 146 | 147 | # image_capture_background - captures all images from the camera in the background and returns the latest image via the pipe when the parent requests it 148 | def image_capture_background(self, imgcap_connection): 149 | # exit immediately if imgcap_connection is invalid 150 | if imgcap_connection is None: 151 | print "image_capture failed because pipe is uninitialised" 152 | return 153 | 154 | # open the camera 155 | self.init_camera() 156 | 157 | # clear latest image 158 | latest_image = None 159 | 160 | while True: 161 | # constantly get the image from the webcam 162 | image = self.capture_image() 163 | 164 | # if successful overwrite our latest image 165 | if not image is None: 166 | latest_image = image 167 | 168 | # check if the parent wants the image 169 | if imgcap_connection.poll(): 170 |
recv_obj = imgcap_connection.recv() 171 | # if -1 is received we exit 172 | if recv_obj == -1: 173 | break 174 | 175 | # otherwise we return the latest image 176 | imgcap_connection.send(latest_image) 177 | 178 | # release camera when exiting 179 | self.close_camera() 180 | 181 | # start_background_capture - starts background image capture 182 | def start_background_capture(self): 183 | # create pipe 184 | self.parent_conn, imgcap_conn = Pipe() 185 | 186 | # create and start the sub process and pass it its end of the pipe 187 | self.proc = Process(target=self.image_capture_background, args=(imgcap_conn,)) 188 | self.proc.start() 189 | 190 | def stop_background_capture(self): 191 | # send exit command to image capture process 192 | self.parent_conn.send(-1) 193 | 194 | # join process 195 | self.proc.join() 196 | 197 | # get_image - returns latest image from the camera captured from the background process 198 | def get_image(self): 199 | # return immediately if pipe is not initialised 200 | if self.parent_conn is None: 201 | return None 202 | 203 | # send request to image capture for image 204 | self.parent_conn.send(self.img_counter) 205 | 206 | # increment counter for next iteration 207 | self.img_counter = self.img_counter + 1 208 | 209 | # wait endlessly until image is returned 210 | recv_img = self.parent_conn.recv() 211 | 212 | # return image to caller 213 | return recv_img 214 | 215 | # main - tests BalloonVideo class 216 | def main(self): 217 | 218 | # start background process 219 | self.start_background_capture() 220 | 221 | while True: 222 | # send request to image capture for image 223 | img = self.get_image() 224 | 225 | # check image is valid 226 | if not img is None: 227 | # display image 228 | cv2.imshow('image_display', img) 229 | else: 230 | print "no image" 231 | 232 | # check for ESC key being pressed 233 | k = cv2.waitKey(5) & 0xFF 234 | if k == 27: 235 | break 236 | 237 | # take a rest for a bit 238 | time.sleep(0.1) 239 | 240 | # send exit
command to image capture process 241 | self.stop_background_capture() 242 | 243 | print "a2p 10 = %f" % self.angle_to_pixels_x(10) 244 | print "p2a 10 = %f" % self.pixels_to_angle_x(10) 245 | 246 | # create a single global object 247 | balloon_video = BalloonVideo() 248 | 249 | if __name__ == "__main__": 250 | balloon_video.main() -------------------------------------------------------------------------------- /scripts/find_balloon.py: -------------------------------------------------------------------------------- 1 | # 2 | # find_balloon.py - find the pixel location of red balloons (although currently hard-coded colour filters search for yellow circles) 3 | # 4 | # Start from command line using 'python find_balloon.py'. 5 | # The program will run for 10 seconds and will create a video file, "find_balloon.avi", which contains the original video along with 6 | # super-imposed circles showing where circles were detected 7 | # 8 | # How to use: 9 | # run colour_finder.py to find the best min and max Hue, Saturation and Brightness (aka Value) levels.
Enter these into h_low, h_high, etc below 10 | # run this script and hold the object in front of the camera and move it around 11 | # check the find_balloon.avi file to ensure the super-imposed circles accurately follow the object 12 | 13 | import sys 14 | import time 15 | import cv2 16 | import numpy 17 | import math 18 | import balloon_config 19 | from web_server import Webserver 20 | from balloon_video import balloon_video 21 | import balloon_utils 22 | from position_vector import PositionVector 23 | 24 | class BalloonFinder(object): 25 | 26 | # default colours for red balloon 27 | default_h_low = 160 28 | default_h_high = 180 29 | default_s_low = 120 30 | default_s_high = 255 31 | default_v_low = 210 32 | default_v_high = 255 33 | 34 | def __init__(self): 35 | 36 | # define expected balloon radius in meters 37 | self.balloon_radius_expected = balloon_config.config.get_float('balloon','radius_m',0.3) 38 | 39 | # colour filters for balloon 40 | self.filter_low = numpy.array([balloon_config.config.get_integer('balloon','h-low',BalloonFinder.default_h_low), 41 | balloon_config.config.get_integer('balloon','s-low',BalloonFinder.default_s_low), 42 | balloon_config.config.get_integer('balloon','v-low',BalloonFinder.default_v_low)]) 43 | 44 | self.filter_high = numpy.array([balloon_config.config.get_integer('balloon','h-high',BalloonFinder.default_h_high), 45 | balloon_config.config.get_integer('balloon','s-high',BalloonFinder.default_s_high), 46 | balloon_config.config.get_integer('balloon','v-high',BalloonFinder.default_v_high)]) 47 | 48 | self.frame = None 49 | 50 | # analyse_frame - look for balloon in image using SimpleBlobDetector 51 | # returns: 52 | # found: boolean which is true if balloon is found 53 | # x: an integer holding the horizontal position in pixels of the center of the balloon on image 54 | # y: an integer holding the vertical position in pixels of the center of the balloon on image 55 | # radius: a float(?)
holding the radius of the balloon in pixels 56 | def analyse_frame(self,frame): 57 | balloon_found = False 58 | balloon_x = 0 59 | balloon_y = 0 60 | balloon_radius = 0 61 | 62 | # Convert BGR to HSV 63 | hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV) 64 | 65 | # Threshold the HSV image 66 | mask = cv2.inRange(hsv, self.filter_low, self.filter_high) 67 | 68 | # Erode 69 | erode_kernel = numpy.ones((3,3),numpy.uint8) 70 | eroded_img = cv2.erode(mask,erode_kernel,iterations = 1) 71 | 72 | # dilate 73 | dilate_kernel = numpy.ones((10,10),numpy.uint8) 74 | dilate_img = cv2.dilate(eroded_img,dilate_kernel,iterations = 1) 75 | 76 | # blob detector 77 | blob_params = cv2.SimpleBlobDetector_Params() 78 | blob_params.minDistBetweenBlobs = 50 79 | blob_params.filterByInertia = False 80 | blob_params.filterByConvexity = False 81 | blob_params.filterByColor = True 82 | blob_params.blobColor = 255 83 | blob_params.filterByCircularity = False 84 | blob_params.filterByArea = False 85 | #blob_params.minArea = 20 86 | #blob_params.maxArea = 500 87 | blob_detector = cv2.SimpleBlobDetector_create(blob_params) 88 | keypts = blob_detector.detect(dilate_img) 89 | 90 | # draw centers of all keypoints in new image 91 | #blob_img = cv2.drawKeypoints(frame, keypts, color=(0,255,0), flags=0) 92 | 93 | # find largest blob 94 | if len(keypts) > 0: 95 | kp_max = keypts[0] 96 | for kp in keypts: 97 | if kp.size > kp_max.size: 98 | kp_max = kp 99 | 100 | # draw circle around the largest blob 101 | cv2.circle(frame,(int(kp_max.pt[0]),int(kp_max.pt[1])),int(kp_max.size),(0,255,0),2) 102 | 103 | # set the balloon location 104 | balloon_found = True 105 | balloon_x = kp_max.pt[0] 106 | balloon_y = kp_max.pt[1] 107 | balloon_radius = kp_max.size 108 | 109 | # return results 110 | return balloon_found, balloon_x, balloon_y, balloon_radius 111 | 112 | # add_artificial_horizon - adds artificial horizon to an image using the vehicle's attitude 113 | def add_artificial_horizon(self, frame, vehicle_roll,
vehicle_pitch): 114 | # horizon line is 200 pixels in either direction from center of image 115 | ah1_x, ah1_y = balloon_utils.rotate_xy(balloon_video.img_center_x - 200, balloon_video.img_center_y, -vehicle_roll) 116 | ah2_x, ah2_y = balloon_utils.rotate_xy(balloon_video.img_center_x + 200, balloon_video.img_center_y, -vehicle_roll) 117 | # shift down by -ve pitch angle 118 | pitch_pixel_shift = int(math.degrees(vehicle_pitch) / float(balloon_video.cam_vfov) * balloon_video.img_height) 119 | # draw line 120 | cv2.line(frame,(int(ah1_x),int(ah1_y)+pitch_pixel_shift),(int(ah2_x),int(ah2_y)+pitch_pixel_shift),5000,2) 121 | 122 | # 123 | # Below are functions that together can be used to convert the location on the image 124 | # to an estimate of the balloon's actual earth-frame location 125 | # 126 | 127 | # pixels_to_direction - converts a pixel location and vehicle attitude to an earth-frame pitch angle and heading 128 | # pixels_x should be a number from 0 ~ img_width 129 | # pixels_y should be a number from 0 ~ img_height 130 | # roll_in_radians, pitch_in_radians, yaw_in_radians should be the vehicle's roll, pitch and yaw angles in radians 131 | # returns vector towards target: 132 | # vertical angle (-ve = down, +ve = upwards) in degrees 133 | # absolute heading (in degrees) 134 | def pixels_to_direction(self, pixels_x, pixels_y, vehicle_roll, vehicle_pitch, vehicle_yaw): 135 | # rotate position by +ve roll angle 136 | x_rotated, y_rotated = balloon_utils.rotate_xy(pixels_x - balloon_video.img_width/2, pixels_y - balloon_video.img_height/2, vehicle_roll) 137 | # calculate vertical pixel shift from pitch angle 138 | pitch_pixel_shift = int(math.degrees(vehicle_pitch) / float(balloon_video.cam_vfov) * balloon_video.img_height) 139 | pitch_dir = (-y_rotated + pitch_pixel_shift) / float(balloon_video.img_height) * balloon_video.cam_vfov 140 | # calculate yaw shift in degrees 141 | yaw_dir = x_rotated / float(balloon_video.img_width) *
float(balloon_video.cam_hfov) + math.degrees(vehicle_yaw) 142 | # return vertical angle to target and heading 143 | return pitch_dir, yaw_dir 144 | 145 | # project_position - calculates the position in m given a yaw angle, pitch angle and distance 146 | # origin : PositionVector of the origin 147 | # pitch : earth frame pitch angle from vehicle to object. +ve = above vehicle, -ve = below vehicle 148 | # yaw : earth frame heading. +ve = clockwise from north, -ve = counter clockwise from north 149 | # distance : distance from vehicle to object in meters 150 | def project_position(self, origin, pitch, yaw, distance): 151 | cos_pitch = math.cos(pitch) 152 | dx = distance * math.cos(yaw) * cos_pitch 153 | dy = distance * math.sin(yaw) * cos_pitch 154 | dz = distance * math.sin(pitch) 155 | ret = PositionVector(origin.x + dx,origin.y + dy, origin.z + dz) 156 | return ret 157 | 158 | # get_ef_velocity_vector - calculates the earth frame velocity vector in m/s given a yaw angle, pitch angle and speed in m/s 159 | # pitch : earth frame pitch angle from vehicle to object. +ve = above vehicle, -ve = below vehicle 160 | # yaw : earth frame heading. 
+ve = clockwise from north, -ve = counter clockwise from north 161 | # speed : scalar speed in m/s 162 | def get_ef_velocity_vector(self, pitch, yaw, speed): 163 | cos_pitch = math.cos(pitch) 164 | x = speed * math.cos(yaw) * cos_pitch 165 | y = speed * math.sin(yaw) * cos_pitch 166 | z = speed * math.sin(pitch) 167 | return x,y,z 168 | 169 | # main - tests the BalloonFinder class 170 | def main(self): 171 | web = Webserver(balloon_config.config.parser, (lambda : self.frame)) 172 | 173 | # initialise camera 174 | balloon_video.init_camera() 175 | video_writer = balloon_video.open_video_writer() 176 | 177 | # get start time 178 | start_time = time.time() 179 | 180 | # loop for 20 seconds looking for the balloon 181 | while(time.time() - start_time < 20): 182 | 183 | # Take each frame 184 | frame = balloon_video.capture_image() 185 | self.frame = frame 186 | 187 | # get the x & y position in frame of the largest balloon 188 | found_in_image, xpos, ypos, size = self.analyse_frame(frame) 189 | 190 | # display image 191 | cv2.imshow('frame',frame) 192 | 193 | # write the frame 194 | video_writer.write(frame) 195 | 196 | # exit if user presses ESC 197 | k = cv2.waitKey(5) & 0xFF 198 | if k == 27: 199 | break 200 | 201 | print "exiting..."
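The helpers above (`project_position` and `get_ef_velocity_vector`) both apply the same spherical-to-cartesian trigonometry: a pitch angle, heading and range (or speed) map to x/y/z offsets. The sketch below is an illustrative, standalone re-implementation of that mapping for experimenting outside the repo; `project_offset` is a hypothetical name, not a function from these scripts.

```python
# Standalone sketch of the trigonometry used by project_position and
# get_ef_velocity_vector: convert pitch, yaw and range to x/y/z offsets.
import math

def project_offset(pitch, yaw, distance):
    # pitch    : earth-frame angle to target in radians (+ve = above vehicle)
    # yaw      : earth-frame heading to target in radians (+ve = clockwise from north)
    # distance : range to target in meters (or speed in m/s for a velocity vector)
    cos_pitch = math.cos(pitch)
    dx = distance * math.cos(yaw) * cos_pitch   # offset towards north
    dy = distance * math.sin(yaw) * cos_pitch   # offset towards east
    dz = distance * math.sin(pitch)             # vertical offset
    return dx, dy, dz

# a balloon 10m away, 30 degrees right of north, level with the vehicle
dx, dy, dz = project_offset(0.0, math.radians(30), 10.0)
```

Passing a speed instead of a distance yields the same numbers interpreted as an earth-frame velocity vector, which is exactly how `get_ef_velocity_vector` reuses the formula.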
202 | web.close() 203 | 204 | # close the window that displayed the real-time video 205 | cv2.destroyAllWindows() 206 | 207 | # release camera 208 | balloon_video.close_camera() 209 | 210 | # create the global balloon_finder object 211 | balloon_finder = BalloonFinder() 212 | 213 | # run a test if this file is being invoked directly from the command line 214 | if __name__ == "__main__": 215 | balloon_finder.main() 216 | -------------------------------------------------------------------------------- /scripts/balloon_strategy.py: -------------------------------------------------------------------------------- 1 | import sys 2 | import os 3 | sys.path.insert(1, os.getcwd()) 4 | 5 | import time 6 | import math 7 | from pymavlink import mavutil 8 | from dronekit import connect, VehicleMode, LocationGlobal 9 | import balloon_config 10 | from balloon_video import balloon_video 11 | from balloon_utils import get_distance_from_pixels, wrap_PI 12 | from position_vector import PositionVector 13 | from find_balloon import balloon_finder 14 | from fake_balloon import balloon_sim 15 | import pid 16 | from attitude_history import AttitudeHistory 17 | 18 | """ 19 | This is an early guess at a top level controller that uses DroneKit and OpenCV magic 20 | to try and win the challenge. 21 | 22 | General approach: 23 | * A state machine will need to know if we are in the portion of the route where we should even 24 | be looking for balloons (by looking at vehicle mode == AUTO and between wpt #x and #y?) 25 | * Periodically grab openCV frames and ask the image analysis function if it thinks there 26 | is a balloon there and if so - what direction should we go. 27 | * Initially we use current vehicle position + orientation + camera orientation to do a GUIDED 28 | goto for a wpt beyond where the vehicle should go.
29 | * If our altitude gets too low, or we get too far from the 'bee-line' path, abandon looking for 30 | this particular balloon (to prevent crashing or wasting too much time looking for balloons) 31 | * Until the vehicle enters auto mode the strategy will only look for balloons (and generate 32 | log messages) 33 | * If there was a mavlink message for telling the vehicle to shoot for a particular heading/altitude 34 | rather than just going to wpts that might be a good optimization 35 | 36 | To run this module: 37 | * Run mavproxy.py with the correct options to connect to your vehicle 38 | * module load api 39 | * api start balloon-strategy.py 40 | 41 | or 42 | 43 | * execute linux_run_strategy.sh 44 | 45 | Then arm it and take off as in mavproxy_sequence.txt 46 | """ 47 | 48 | class BalloonStrategy(object): 49 | def __init__(self): 50 | 51 | # connect to vehicle with dronekit 52 | self.vehicle = self.get_vehicle_with_dronekit() 53 | 54 | # initialised flag 55 | self.home_initialised = False 56 | # timer to intermittently check for home position 57 | self.last_home_check = time.time() 58 | 59 | # historical attitude 60 | self.att_hist = AttitudeHistory(self.vehicle, 2.0) 61 | self.attitude_delay = 0.0 # expected delay between image and attitude 62 | 63 | # search variables 64 | self.search_state = 0 # 0 = not searching, 1 = spinning and looking, 2 = rotating to best balloon and double checking, 3 = finalise yaw 65 | self.search_start_heading = None # initial heading of vehicle when search began 66 | self.search_target_heading = None # the vehicle's current target heading (updated as vehicle spins during search) 67 | self.search_heading_change = None # heading change (in radians) performed so far during search 68 | self.search_balloon_pos = None # position (as an offset from home) of closest balloon (so far) during search 69 | self.search_balloon_heading = None # earth-frame heading (in radians) from vehicle to closest balloon 70 | self.search_balloon_pitch_top = None #
earth-frame pitch (in radians) from vehicle to closest balloon 71 | self.search_balloon_distance = None # distance (in meters) from vehicle to closest balloon 72 | self.targeting_start_time = 0 # time vehicle first pointed towards final target (used with delay_time below) 73 | self.targeting_delay_time = balloon_config.config.get_float('general','SEARCH_TARGET_DELAY',2.0) # time vehicle waits after pointing towards balloon and before heading towards it 74 | self.search_yaw_speed = balloon_config.config.get_float('general','SEARCH_YAW_SPEED',5.0) 75 | 76 | # vehicle mission 77 | self.mission_cmds = None 78 | self.mission_alt_min = 0 # min altitude from NAV_GUIDED mission command (we ignore balloons below this altitude). "0" means no limit 79 | self.mission_alt_max = 0 # max altitude from NAV_GUIDED mission command (we ignore balloons above this altitude). "0" means no limit 80 | self.mission_distance_max = 0 # max distance from NAV_GUIDED mission command (we ignore balloons further than this distance).
"0" means no limit 81 | 82 | # we are not in control of vehicle 83 | self.controlling_vehicle = False 84 | 85 | # vehicle position captured at time camera image was captured 86 | self.vehicle_pos = None 87 | 88 | # balloon direction and position estimate from latest call to analyze_image 89 | self.balloon_found = False 90 | self.balloon_pitch = None 91 | self.balloon_pitch_top = None # earth-frame pitch (in radians) from vehicle to top of closest balloon 92 | self.balloon_heading = None 93 | self.balloon_distance = None 94 | self.balloon_pos = None # last estimated position as an offset from home 95 | 96 | # time of the last target update sent to the flight controller 97 | self.guided_last_update = time.time() 98 | 99 | # latest velocity target sent to flight controller 100 | self.guided_target_vel = None 101 | 102 | # time the target balloon was last spotted 103 | self.last_spotted_time = 0 104 | 105 | # if we lose sight of a balloon for this many seconds we will consider it lost and give up on the search 106 | self.lost_sight_timeout = 3 107 | 108 | # The module only prints log messages unless the vehicle is in GUIDED mode (for testing). 109 | # Once this seems to work reasonably well change self.debug to False and then it will 110 | # actually _enter_ guided mode when it thinks it sees a balloon 111 | self.debug = balloon_config.config.get_boolean('general','debug',True) 112 | 113 | # use the simulator to generate fake balloon images 114 | self.use_simulator = balloon_config.config.get_boolean('general','simulate',False) 115 | 116 | # start background image grabber 117 | if not self.use_simulator: 118 | balloon_video.start_background_capture() 119 | 120 | # initialise video writer 121 | self.writer = None 122 | 123 | # horizontal velocity pid controller.
maximum effect is 10 degree lean 124 | xy_p = balloon_config.config.get_float('general','VEL_XY_P',1.0) 125 | xy_i = balloon_config.config.get_float('general','VEL_XY_I',0.0) 126 | xy_d = balloon_config.config.get_float('general','VEL_XY_D',0.0) 127 | xy_imax = balloon_config.config.get_float('general','VEL_XY_IMAX',10.0) 128 | self.vel_xy_pid = pid.pid(xy_p, xy_i, xy_d, math.radians(xy_imax)) 129 | 130 | # vertical velocity pid controller. maximum effect is 10 degree lean 131 | z_p = balloon_config.config.get_float('general','VEL_Z_P',2.5) 132 | z_i = balloon_config.config.get_float('general','VEL_Z_I',0.0) 133 | z_d = balloon_config.config.get_float('general','VEL_Z_D',0.0) 134 | z_imax = balloon_config.config.get_float('general','VEL_IMAX',10.0) 135 | self.vel_z_pid = pid.pid(z_p, z_i, z_d, math.radians(z_imax)) 136 | 137 | # velocity controller min and max speed 138 | self.vel_speed_min = balloon_config.config.get_float('general','VEL_SPEED_MIN',1.0) 139 | self.vel_speed_max = balloon_config.config.get_float('general','VEL_SPEED_MAX',4.0) 140 | self.vel_speed_last = 0.0 # last recorded speed 141 | self.vel_accel = balloon_config.config.get_float('general','VEL_ACCEL', 0.5) # maximum acceleration in m/s/s 142 | self.vel_dist_ratio = balloon_config.config.get_float('general','VEL_DIST_RATIO', 0.5) 143 | 144 | # pitch angle to hit balloon at. 
negative means come down from above 145 | self.vel_pitch_target = math.radians(balloon_config.config.get_float('general','VEL_PITCH_TARGET',-5.0)) 146 | 147 | # velocity controller update rate 148 | self.vel_update_rate = balloon_config.config.get_float('general','VEL_UPDATE_RATE_SEC',0.2) 149 | 150 | # stats 151 | self.num_frames_analysed = 0 152 | self.stats_start_time = 0 153 | 154 | # connect to vehicle with dronekit 155 | def get_vehicle_with_dronekit(self): 156 | connection_str = balloon_config.config.get_string('dronekit','connection_string','/dev/ttyUSB0') 157 | connection_baud = balloon_config.config.get_integer('dronekit','baud',921600) 158 | print "connecting to vehicle on %s, baud=%d" % (connection_str, connection_baud) 159 | vehicle = connect(connection_str, baud=connection_baud) 160 | print "connected to vehicle" 161 | return vehicle 162 | # fetch_mission - fetch mission from flight controller 163 | def fetch_mission(self): 164 | # download the vehicle waypoints 165 | print "fetching mission.."
166 | self.mission_cmds = self.vehicle.commands 167 | self.mission_cmds.download() 168 | self.mission_cmds.wait_ready() 169 | if not self.mission_cmds is None: 170 | print "retrieved mission with %d commands" % self.mission_cmds.count 171 | else: 172 | print "failed to retrieve mission" 173 | 174 | # check home - intermittently checks for changes to the home location 175 | def check_home(self): 176 | 177 | # return immediately if home has already been initialised 178 | if self.home_initialised: 179 | return True 180 | 181 | # check for home no more than once every two seconds 182 | if (time.time() - self.last_home_check > 2): 183 | 184 | # update that we have performed a status check 185 | self.last_home_check = time.time() 186 | 187 | # check if we have a vehicle 188 | if self.vehicle is None: 189 | self.vehicle = self.get_vehicle_with_dronekit() 190 | return 191 | 192 | # download the vehicle waypoints (and home) if we don't have them already 193 | if self.mission_cmds is None: 194 | self.fetch_mission() 195 | return False 196 | 197 | # ensure the vehicle's position is known 198 | if self.vehicle.location.global_relative_frame is None: 199 | print "waiting for vehicle position.." 200 | return False 201 | if self.vehicle.location.global_relative_frame.lat is None or self.vehicle.location.global_relative_frame.lon is None or self.vehicle.location.global_relative_frame.alt is None: 202 | print "waiting for vehicle position.." 203 | return False 204 | 205 | # get home location from mission command list 206 | if self.vehicle.home_location is None: 207 | print "waiting for home location.." 208 | self.fetch_mission() 209 | return False 210 | 211 | # sanity check home position 212 | if self.vehicle.home_location.lat == 0 and self.vehicle.home_location.lon == 0: 213 | print "home position all zero, re-fetching.." 
214 | self.fetch_mission() 215 | return False 216 | 217 | # set home position 218 | PositionVector.set_home_location(LocationGlobal(self.vehicle.home_location.lat,self.vehicle.home_location.lon,0)) 219 | self.home_initialised = True 220 | print "got home location" 221 | 222 | # To-Do: if we wish to have the same home position as the flight controller 223 | # we must download the home waypoint again whenever the vehicle is armed 224 | 225 | # return whether home has been initialised or not 226 | return self.home_initialised 227 | 228 | # checks if video output should be started 229 | def check_video_out(self): 230 | 231 | # return immediately if video has already been started 232 | if not self.writer is None: 233 | return 234 | 235 | # start video once vehicle is armed 236 | if self.vehicle.armed: 237 | self.writer = balloon_video.open_video_writer() 238 | 239 | # check_status - polls the vehicle's status to determine if we are in control of the vehicle or not 240 | def check_status(self): 241 | 242 | # we are active in guided mode 243 | if self.vehicle.mode.name == "GUIDED": 244 | if not self.controlling_vehicle: 245 | self.controlling_vehicle = True 246 | print "taking control of vehicle in GUIDED" 247 | # set default limits on balloon position 248 | self.mission_alt_min = 1 249 | self.mission_alt_max = 0 250 | self.mission_distance_max = 50 251 | # start search for balloon 252 | print "starting search alt_min:%f alt_max:%f dist_max:%f" % (self.mission_alt_min, self.mission_alt_max, self.mission_distance_max) 253 | self.start_search() 254 | return 255 | 256 | # download the vehicle waypoints if we don't have them already 257 | # To-Do: do not load waypoints if vehicle is armed 258 | if self.mission_cmds is None: 259 | self.fetch_mission() 260 | return 261 | 262 | # Check for Auto mode and executing Nav-Guided command 263 | if self.vehicle.mode.name == "AUTO": 264 | 265 | # get active command number and mavlink id 266 | active_command = self.vehicle.commands.next 267 |
active_command_id = self.vehicle.commands[active_command].command 268 | 269 | # ninety two is the MAVLink id for Nav-Guided-Enable commands 270 | if active_command_id == 92: 271 | if not self.controlling_vehicle: 272 | self.controlling_vehicle = True 273 | print "taking control of vehicle in AUTO" 274 | self.mission_alt_min = self.vehicle.commands[active_command].param2 275 | self.mission_alt_max = self.vehicle.commands[active_command].param3 276 | self.mission_distance_max = self.vehicle.commands[active_command].param4 277 | print "starting search alt_min:%f alt_max:%f dist_max:%f" % (self.mission_alt_min, self.mission_alt_max, self.mission_distance_max) 278 | self.start_search() 279 | return 280 | 281 | # if we got here then we are not in control 282 | if self.controlling_vehicle: 283 | self.controlling_vehicle = False 284 | print "giving up control of vehicle in %s" % self.vehicle.mode.name 285 | 286 | # condition_yaw - send condition_yaw mavlink command to vehicle so it points at specified heading (in degrees) 287 | def condition_yaw(self, heading): 288 | # create the CONDITION_YAW command 289 | msg = self.vehicle.message_factory.mission_item_encode(0, 0, # target system, target component 290 | 0, # sequence 291 | mavutil.mavlink.MAV_FRAME_GLOBAL_RELATIVE_ALT, # frame 292 | mavutil.mavlink.MAV_CMD_CONDITION_YAW, # command 293 | 2, # current - set to 2 to make it a guided command 294 | 0, # auto continue 295 | heading, 0, 0, 0, 0, 0, 0) # param 1 ~ 7 296 | # send command to vehicle 297 | self.vehicle.send_mavlink(msg) 298 | self.vehicle.flush() 299 | 300 | # send_nav_velocity - send nav_velocity command to vehicle to request it fly in specified direction 301 | def send_nav_velocity(self, velocity_x, velocity_y, velocity_z): 302 | # create the SET_POSITION_TARGET_LOCAL_NED command 303 | msg = self.vehicle.message_factory.set_position_target_local_ned_encode( 304 | 0, # time_boot_ms (not used) 305 | 0, 0, # target system, target component 306 | 
mavutil.mavlink.MAV_FRAME_LOCAL_NED, # frame 307 | 0b0000111111000111, # type_mask (only speeds enabled) 308 | 0, 0, 0, # x, y, z positions (not used) 309 | velocity_x, velocity_y, velocity_z, # x, y, z velocity in m/s 310 | 0, 0, 0, # x, y, z acceleration (not used) 311 | 0, 0) # yaw, yaw_rate (not used) 312 | # send command to vehicle 313 | self.vehicle.send_mavlink(msg) 314 | self.vehicle.flush() 315 | 316 | # advance_current_cmd - ask vehicle to advance to next command (i.e. abort current command) 317 | def advance_current_cmd(self): 318 | 319 | # exit immediately if we are not in AUTO mode or not controlling the vehicle 320 | if not self.vehicle.mode.name == "AUTO" or not self.controlling_vehicle: 321 | return 322 | 323 | # download the vehicle waypoints if we don't have them already 324 | if self.mission_cmds is None: 325 | self.fetch_mission() 326 | 327 | # get active command 328 | active_command = self.vehicle.commands.next 329 | 330 | # ensure there is one more command at least 331 | if (self.vehicle.commands.count > active_command): 332 | # create the MISSION_SET_CURRENT command 333 | msg = self.vehicle.message_factory.mission_set_current_encode(0, 0, active_command+1) # target system, target component, sequence 334 | # send command to vehicle 335 | self.vehicle.send_mavlink(msg) 336 | self.vehicle.flush() 337 | else: 338 | print "Failed to advance command" 339 | 340 | # get_frame - get a single frame from the camera or simulator 341 | def get_frame(self): 342 | if self.use_simulator: 343 | veh_pos = PositionVector.get_from_location(self.vehicle.location.global_relative_frame) 344 | frame = balloon_sim.get_simulated_frame(veh_pos, self.vehicle.attitude.roll, self.vehicle.attitude.pitch, self.vehicle.attitude.yaw) 345 | else: 346 | frame = balloon_video.get_image() 347 | return frame 348 | 349 | # get image from balloon_video class and look for balloon, results are held in the following variables: 350 | # self.balloon_found : set to True if balloon is 
found, False otherwise 351 | # self.balloon_pitch : earth frame pitch (in radians) from vehicle to balloon (i.e. takes account of vehicle attitude) 352 | # self.balloon_heading : earth frame heading (in radians) from vehicle to balloon 353 | # self.balloon_distance : distance (in meters) from vehicle to balloon 354 | # self.balloon_pos : position vector of balloon's position as an offset from home in meters 355 | def analyze_image(self): 356 | 357 | # record time 358 | now = time.time() 359 | 360 | # capture vehicle position 361 | self.vehicle_pos = PositionVector.get_from_location(self.vehicle.location.global_relative_frame) 362 | 363 | # capture vehicle attitude in buffer 364 | self.att_hist.update() 365 | 366 | # get delayed attitude from buffer 367 | veh_att_delayed = self.att_hist.get_attitude(now - self.attitude_delay) 368 | 369 | # get new image from camera 370 | f = self.get_frame() 371 | 372 | # look for balloon in image using blob detector 373 | self.balloon_found, xpos, ypos, size = balloon_finder.analyse_frame(f) 374 | 375 | # add artificial horizon 376 | balloon_finder.add_artificial_horizon(f, veh_att_delayed.roll, veh_att_delayed.pitch) 377 | 378 | if self.balloon_found: 379 | # record time balloon was found 380 | self.last_spotted_time = now 381 | 382 | # convert x, y position to pitch and yaw direction (in radians) 383 | self.balloon_pitch, self.balloon_heading = balloon_finder.pixels_to_direction(xpos, ypos, veh_att_delayed.roll, veh_att_delayed.pitch, veh_att_delayed.yaw) 384 | self.balloon_pitch = math.radians(self.balloon_pitch) 385 | self.balloon_pitch_top = self.balloon_pitch + balloon_video.pixels_to_angle_y(size) # add balloon radius so we aim for top of balloon 386 | self.balloon_heading = math.radians(self.balloon_heading) 387 | 388 | # get distance 389 | self.balloon_distance = get_distance_from_pixels(size, balloon_finder.balloon_radius_expected) 390 | 391 | # update estimated balloon position 392 | self.balloon_pos =
balloon_finder.project_position(self.vehicle_pos, self.balloon_pitch, self.balloon_heading, self.balloon_distance) 393 | 394 | # save image for debugging later 395 | if not self.writer is None: 396 | self.writer.write(f) 397 | 398 | # increment frames analysed for stats 399 | self.num_frames_analysed += 1 400 | if self.stats_start_time == 0: 401 | self.stats_start_time = time.time() 402 | 403 | # start_search - start search for balloon 404 | def start_search(self): 405 | # exit immediately if we are not controlling the vehicle 406 | if not self.controlling_vehicle: 407 | return 408 | # initialise search variables 409 | self.search_state = 1 # 0 = not searching, 1 = spinning and looking, 2 = rotating to best balloon and double checking, 3 = finalise yaw 410 | self.search_balloon_pos = None 411 | self.search_start_heading = self.vehicle.attitude.yaw 412 | self.search_target_heading = self.search_start_heading 413 | self.search_total_angle = 0 414 | self.targeting_start_time = 0 415 | # reset vehicle speed 416 | self.vel_speed_last = 0 417 | 418 | # search - spin vehicle looking for balloon 419 | # analyze_image should have been called just before and should have filled in self.balloon_found, self.balloon_pos, balloon_distance, balloon_pitch and balloon_heading 420 | def search_for_balloon(self): 421 | 422 | # exit immediately if we are not controlling the vehicle or do not know our position 423 | if not self.controlling_vehicle or self.vehicle_pos is None: 424 | return 425 | 426 | # exit immediately if we are not searching 427 | if self.search_state <= 0: 428 | return 429 | 430 | # search states: 0 = not searching, 1 = spinning and looking, 2 = rotating to best balloon and double checking, 3 = finalise yaw 431 | 432 | # search_state = 1: spinning and looking 433 | if (self.search_state == 1): 434 | 435 | # check if we have seen a balloon 436 | if self.balloon_found: 437 | # if this is the first balloon we've found or the closest, store its position as the closest 438 | if
(self.search_balloon_pos is None) or (self.balloon_distance < self.search_balloon_distance): 439 | # check distance is within acceptable limits 440 | if (self.mission_alt_min == 0 or self.balloon_pos.z >= self.mission_alt_min) and (self.mission_alt_max == 0 or self.balloon_pos.z <= self.mission_alt_max) and (self.mission_distance_max == 0 or self.balloon_distance <= self.mission_distance_max): 441 | # record this balloon as the closest 442 | self.search_balloon_pos = self.balloon_pos 443 | self.search_balloon_heading = self.balloon_heading 444 | self.search_balloon_pitch_top = self.balloon_pitch_top # we target top of balloon 445 | self.search_balloon_distance = self.balloon_distance 446 | print "Found Balloon at heading:%f Alt:%f Dist:%f" % (math.degrees(self.balloon_heading), self.balloon_pos.z, self.balloon_distance) 447 | else: 448 | print "Balloon Ignored at heading:%f Alt:%f Dist:%f" % (math.degrees(self.balloon_heading), self.balloon_pos.z, self.balloon_distance) 449 | 450 | # update yaw so we keep spinning 451 | if math.fabs(wrap_PI(self.vehicle.attitude.yaw - self.search_target_heading)) < math.radians(self.search_yaw_speed * 2.0): 452 | # increase yaw target 453 | self.search_target_heading = self.search_target_heading - math.radians(self.search_yaw_speed) 454 | self.search_total_angle = self.search_total_angle + math.radians(self.search_yaw_speed) 455 | # send yaw heading 456 | self.condition_yaw(math.degrees(self.search_target_heading)) 457 | 458 | # end search if we've gone all the way around 459 | if self.search_total_angle >= math.radians(360): 460 | # if we never saw a balloon then just complete (return control to user or mission) 461 | if self.search_balloon_pos is None: 462 | print "Search did not find balloon, giving up" 463 | self.search_state = 0 464 | self.complete() 465 | else: 466 | # update target heading towards closest balloon and send to flight controller 467 | self.search_target_heading = self.search_balloon_heading 468 | 
self.condition_yaw(math.degrees(self.search_target_heading)) 469 | self.search_state = 2 # advance towards rotating to best balloon stage 470 | print "best balloon at %f" % math.degrees(self.search_balloon_heading) 471 | 472 | # search_state = 2: rotating to best balloon and double checking 473 | elif (self.search_state == 2): 474 | # check yaw is close to target 475 | if math.fabs(wrap_PI(self.vehicle.attitude.yaw - self.search_target_heading)) < math.radians(5): 476 | # reset targeting start time 477 | if self.targeting_start_time == 0: 478 | self.targeting_start_time = time.time() 479 | 480 | # if targeting time has elapsed, double check the visible balloon is within acceptable limits 481 | if (time.time() - self.targeting_start_time > self.targeting_delay_time): 482 | if self.balloon_found and (not self.balloon_pos is None) and (self.mission_alt_min == 0 or self.balloon_pos.z >= self.mission_alt_min) and (self.mission_alt_max == 0 or self.balloon_pos.z <= self.mission_alt_max) and (self.mission_distance_max == 0 or self.balloon_distance <= self.mission_distance_max): 483 | # balloon is within limits so reset balloon target 484 | self.search_balloon_pos = self.balloon_pos 485 | self.search_balloon_heading = self.balloon_heading 486 | self.search_balloon_pitch_top = self.balloon_pitch_top 487 | self.search_balloon_distance = self.balloon_distance 488 | # move to final heading 489 | self.search_target_heading = self.search_balloon_heading 490 | self.condition_yaw(math.degrees(self.search_target_heading)) 491 | # move to finalise yaw state 492 | self.search_state = 3 493 | print "Balloon was near expected heading, finalising yaw to %f" % (math.degrees(self.search_balloon_heading)) 494 | 495 | # we somehow haven't found the balloon when we've turned back to find it so give up 496 | elif (time.time() - self.targeting_start_time) > (self.targeting_delay_time + 1.0): 497 | print "Balloon was not at expected heading: %f, giving up" %
(math.degrees(self.search_target_heading)) 498 | self.search_state = 0 499 | self.complete() 500 | 501 | # search_state = 3: finalising yaw 502 | elif (self.search_state == 3): 503 | # check yaw is close to target 504 | if math.fabs(wrap_PI(self.vehicle.attitude.yaw - self.search_target_heading)) < math.radians(5): 505 | # complete search, move should take-over 506 | # Note: we don't check again for the balloon, but surely it's still there 507 | self.search_state = 0 508 | print "Finalised yaw to %f, beginning run" % (math.degrees(self.search_balloon_heading)) 509 | 510 | # move_to_balloon - velocity controller to drive vehicle to balloon 511 | # analyze_image should have been called prior to this and filled in self.balloon_found, balloon_pitch, balloon_heading, balloon_distance 512 | def move_to_balloon(self): 513 | 514 | # exit immediately if we are not controlling the vehicle 515 | if not self.controlling_vehicle: 516 | return 517 | 518 | # get current time 519 | now = time.time() 520 | 521 | # exit immediately if it's been too soon since the last update 522 | if (now - self.guided_last_update) < self.vel_update_rate: 523 | return; 524 | 525 | # if we have a new balloon position recalculate velocity vector 526 | if (self.balloon_found): 527 | 528 | # calculate change in yaw since we began the search 529 | yaw_error = wrap_PI(self.balloon_heading - self.search_balloon_heading) 530 | 531 | # calculate pitch vs ideal pitch angles. 
This will cause the vehicle to attempt to get to 5deg above the balloon 532 | pitch_error = wrap_PI(self.balloon_pitch_top - self.vel_pitch_target) 533 | 534 | # get time since last time velocity pid controller was run 535 | dt = self.vel_xy_pid.get_dt(2.0) 536 | 537 | # get speed towards balloon based on balloon distance 538 | speed = self.balloon_distance * self.vel_dist_ratio 539 | 540 | # apply min and max speed limit 541 | speed = min(speed, self.vel_speed_max) 542 | speed = max(speed, self.vel_speed_min) 543 | 544 | # apply acceleration limit 545 | speed_chg_max = self.vel_accel * dt 546 | speed = min(speed, self.vel_speed_last + speed_chg_max) 547 | speed = max(speed, self.vel_speed_last - speed_chg_max) 548 | 549 | # record speed for next iteration 550 | self.vel_speed_last = speed 551 | 552 | # calculate yaw correction and final yaw movement 553 | yaw_correction = self.vel_xy_pid.get_pid(yaw_error, dt) 554 | yaw_final = wrap_PI(self.search_balloon_heading + yaw_correction) 555 | 556 | # calculate pitch correction and final pitch movement 557 | pitch_correction = self.vel_z_pid.get_pid(pitch_error, dt) 558 | pitch_final = wrap_PI(self.search_balloon_pitch_top + pitch_correction) 559 | 560 | # calculate velocity vector we wish to move in 561 | self.guided_target_vel = balloon_finder.get_ef_velocity_vector(pitch_final, yaw_final, speed) 562 | 563 | # send velocity vector to flight controller 564 | self.send_nav_velocity(self.guided_target_vel[0], self.guided_target_vel[1], self.guided_target_vel[2]) 565 | self.guided_last_update = now 566 | 567 | # if we have not seen the balloon 568 | else: 569 | # if more than a few seconds have passed without seeing the balloon give up 570 | if now - self.last_spotted_time > self.lost_sight_timeout: 571 | if self.debug: 572 | print "Lost Balloon, Giving up" 573 | # To-Do: start searching again or maybe slow down?
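The speed shaping in `move_to_balloon` above can be summarised on its own: speed is proportional to the remaining distance (`VEL_DIST_RATIO`), clamped between `VEL_SPEED_MIN` and `VEL_SPEED_MAX`, then rate-limited by `VEL_ACCEL` so it never jumps. The sketch below is an illustrative standalone version using the config defaults from `__init__`; `shape_speed` is a hypothetical helper, not a function in the repo.

```python
# Standalone sketch of the speed shaping performed in move_to_balloon:
# distance-proportional speed, clamped, then acceleration-limited.
def shape_speed(distance, last_speed, dt,
                dist_ratio=0.5, speed_min=1.0, speed_max=4.0, accel=0.5):
    # get speed towards balloon based on balloon distance
    speed = distance * dist_ratio
    # apply min and max speed limit
    speed = min(speed, speed_max)
    speed = max(speed, speed_min)
    # apply acceleration limit so speed changes by at most accel * dt
    speed_chg_max = accel * dt
    speed = min(speed, last_speed + speed_chg_max)
    speed = max(speed, last_speed - speed_chg_max)
    return speed

# starting from rest 20m away, a 0.2s update can only reach 0.5 * 0.2 = 0.1 m/s
first = shape_speed(20.0, 0.0, 0.2)
```

Note the acceleration limit dominates early on: even though the distance term asks for 4 m/s, the vehicle ramps up by at most `accel * dt` per update.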
                self.complete()

        # FIXME - check if vehicle altitude is too low
        # FIXME - check if we are too far from the desired flightplan

    def output_frame_rate_stats(self):
        # get current time
        now = time.time()

        # output stats every 10 seconds
        time_diff = now - self.stats_start_time
        if time_diff < 10 or time_diff <= 0:
            return

        # output frame rate
        frame_rate = self.num_frames_analysed / time_diff
        print "FrameRate: %f (%d frames in %f seconds)" % (frame_rate, self.num_frames_analysed, time_diff)

        # reset stats
        self.num_frames_analysed = 0
        self.stats_start_time = now

    def run(self):
        while True:

            # only process images once home has been initialised
            if self.check_home():

                # start video if required
                self.check_video_out()

                # check if we are controlling the vehicle
                self.check_status()

                # look for balloon in image
                self.analyze_image()

                # search or move towards balloon
                if self.search_state > 0:
                    # search for balloon
                    self.search_for_balloon()
                else:
                    # move towards balloon
                    self.move_to_balloon()

                # output stats
                self.output_frame_rate_stats()

            # Don't suck up too much CPU, only process a new image occasionally
            time.sleep(0.05)

        if not self.use_simulator:
            balloon_video.stop_background_capture()

    # complete - balloon strategy has somehow completed so return control to the autopilot
    def complete(self):
        # debug
        if self.debug:
            print "Complete!"
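The bookkeeping in output_frame_rate_stats (count frames, report the average rate once an interval has elapsed, then reset) can be factored into a small helper. A sketch for illustration only; the class name and 10-second interval are assumptions mirroring the code above, not names from this repository:

```python
import time

class FrameRateStats(object):
    """Count frames and report the average rate every `interval` seconds."""

    def __init__(self, interval=10.0):
        self.interval = interval
        self.num_frames = 0
        self.start_time = time.time()

    def tick(self):
        # call once per analysed frame
        self.num_frames += 1

    def report(self, now=None):
        # return frames-per-second once the interval has elapsed, else None;
        # on a successful report, reset the counter and the window start
        now = time.time() if now is None else now
        elapsed = now - self.start_time
        if elapsed < self.interval:
            return None
        rate = self.num_frames / elapsed
        self.num_frames = 0
        self.start_time = now
        return rate
```

Resetting both the counter and the window start on each report means every printed rate covers only the most recent window, which is what makes a drop in frame rate visible promptly rather than averaged away.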

        # stop the vehicle and give up control
        if self.controlling_vehicle:
            self.guided_target_vel = (0,0,0)
            self.send_nav_velocity(self.guided_target_vel[0], self.guided_target_vel[1], self.guided_target_vel[2])
            self.guided_last_update = time.time()

        # if in GUIDED mode switch back to LOITER
        if self.vehicle.mode.name == "GUIDED":
            self.vehicle.mode = VehicleMode("LOITER")
            self.vehicle.flush()

        # if in AUTO move to next command
        if self.vehicle.mode.name == "AUTO":
            self.advance_current_cmd()

        # flag we are not in control of the vehicle
        self.controlling_vehicle = False

        return

strat = BalloonStrategy()
strat.run()

-------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | GNU GENERAL PUBLIC LICENSE 2 | Version 3, 29 June 2007 3 | 4 | Copyright (C) 2007 Free Software Foundation, Inc. 5 | Everyone is permitted to copy and distribute verbatim copies 6 | of this license document, but changing it is not allowed. 7 | 8 | Preamble 9 | 10 | The GNU General Public License is a free, copyleft license for 11 | software and other kinds of works. 12 | 13 | The licenses for most software and other practical works are designed 14 | to take away your freedom to share and change the works. By contrast, 15 | the GNU General Public License is intended to guarantee your freedom to 16 | share and change all versions of a program--to make sure it remains free 17 | software for all its users. We, the Free Software Foundation, use the 18 | GNU General Public License for most of our software; it applies also to 19 | any other work released this way by its authors. You can apply it to 20 | your programs, too. 21 | 22 | When we speak of free software, we are referring to freedom, not 23 | price.
Our General Public Licenses are designed to make sure that you 24 | have the freedom to distribute copies of free software (and charge for 25 | them if you wish), that you receive source code or can get it if you 26 | want it, that you can change the software or use pieces of it in new 27 | free programs, and that you know you can do these things. 28 | 29 | To protect your rights, we need to prevent others from denying you 30 | these rights or asking you to surrender the rights. Therefore, you have 31 | certain responsibilities if you distribute copies of the software, or if 32 | you modify it: responsibilities to respect the freedom of others. 33 | 34 | For example, if you distribute copies of such a program, whether 35 | gratis or for a fee, you must pass on to the recipients the same 36 | freedoms that you received. You must make sure that they, too, receive 37 | or can get the source code. And you must show them these terms so they 38 | know their rights. 39 | 40 | Developers that use the GNU GPL protect your rights with two steps: 41 | (1) assert copyright on the software, and (2) offer you this License 42 | giving you legal permission to copy, distribute and/or modify it. 43 | 44 | For the developers' and authors' protection, the GPL clearly explains 45 | that there is no warranty for this free software. For both users' and 46 | authors' sake, the GPL requires that modified versions be marked as 47 | changed, so that their problems will not be attributed erroneously to 48 | authors of previous versions. 49 | 50 | Some devices are designed to deny users access to install or run 51 | modified versions of the software inside them, although the manufacturer 52 | can do so. This is fundamentally incompatible with the aim of 53 | protecting users' freedom to change the software. The systematic 54 | pattern of such abuse occurs in the area of products for individuals to 55 | use, which is precisely where it is most unacceptable. 
Therefore, we 56 | have designed this version of the GPL to prohibit the practice for those 57 | products. If such problems arise substantially in other domains, we 58 | stand ready to extend this provision to those domains in future versions 59 | of the GPL, as needed to protect the freedom of users. 60 | 61 | Finally, every program is threatened constantly by software patents. 62 | States should not allow patents to restrict development and use of 63 | software on general-purpose computers, but in those that do, we wish to 64 | avoid the special danger that patents applied to a free program could 65 | make it effectively proprietary. To prevent this, the GPL assures that 66 | patents cannot be used to render the program non-free. 67 | 68 | The precise terms and conditions for copying, distribution and 69 | modification follow. 70 | 71 | TERMS AND CONDITIONS 72 | 73 | 0. Definitions. 74 | 75 | "This License" refers to version 3 of the GNU General Public License. 76 | 77 | "Copyright" also means copyright-like laws that apply to other kinds of 78 | works, such as semiconductor masks. 79 | 80 | "The Program" refers to any copyrightable work licensed under this 81 | License. Each licensee is addressed as "you". "Licensees" and 82 | "recipients" may be individuals or organizations. 83 | 84 | To "modify" a work means to copy from or adapt all or part of the work 85 | in a fashion requiring copyright permission, other than the making of an 86 | exact copy. The resulting work is called a "modified version" of the 87 | earlier work or a work "based on" the earlier work. 88 | 89 | A "covered work" means either the unmodified Program or a work based 90 | on the Program. 91 | 92 | To "propagate" a work means to do anything with it that, without 93 | permission, would make you directly or secondarily liable for 94 | infringement under applicable copyright law, except executing it on a 95 | computer or modifying a private copy. 
Propagation includes copying, 96 | distribution (with or without modification), making available to the 97 | public, and in some countries other activities as well. 98 | 99 | To "convey" a work means any kind of propagation that enables other 100 | parties to make or receive copies. Mere interaction with a user through 101 | a computer network, with no transfer of a copy, is not conveying. 102 | 103 | An interactive user interface displays "Appropriate Legal Notices" 104 | to the extent that it includes a convenient and prominently visible 105 | feature that (1) displays an appropriate copyright notice, and (2) 106 | tells the user that there is no warranty for the work (except to the 107 | extent that warranties are provided), that licensees may convey the 108 | work under this License, and how to view a copy of this License. If 109 | the interface presents a list of user commands or options, such as a 110 | menu, a prominent item in the list meets this criterion. 111 | 112 | 1. Source Code. 113 | 114 | The "source code" for a work means the preferred form of the work 115 | for making modifications to it. "Object code" means any non-source 116 | form of a work. 117 | 118 | A "Standard Interface" means an interface that either is an official 119 | standard defined by a recognized standards body, or, in the case of 120 | interfaces specified for a particular programming language, one that 121 | is widely used among developers working in that language. 122 | 123 | The "System Libraries" of an executable work include anything, other 124 | than the work as a whole, that (a) is included in the normal form of 125 | packaging a Major Component, but which is not part of that Major 126 | Component, and (b) serves only to enable use of the work with that 127 | Major Component, or to implement a Standard Interface for which an 128 | implementation is available to the public in source code form. 
A 129 | "Major Component", in this context, means a major essential component 130 | (kernel, window system, and so on) of the specific operating system 131 | (if any) on which the executable work runs, or a compiler used to 132 | produce the work, or an object code interpreter used to run it. 133 | 134 | The "Corresponding Source" for a work in object code form means all 135 | the source code needed to generate, install, and (for an executable 136 | work) run the object code and to modify the work, including scripts to 137 | control those activities. However, it does not include the work's 138 | System Libraries, or general-purpose tools or generally available free 139 | programs which are used unmodified in performing those activities but 140 | which are not part of the work. For example, Corresponding Source 141 | includes interface definition files associated with source files for 142 | the work, and the source code for shared libraries and dynamically 143 | linked subprograms that the work is specifically designed to require, 144 | such as by intimate data communication or control flow between those 145 | subprograms and other parts of the work. 146 | 147 | The Corresponding Source need not include anything that users 148 | can regenerate automatically from other parts of the Corresponding 149 | Source. 150 | 151 | The Corresponding Source for a work in source code form is that 152 | same work. 153 | 154 | 2. Basic Permissions. 155 | 156 | All rights granted under this License are granted for the term of 157 | copyright on the Program, and are irrevocable provided the stated 158 | conditions are met. This License explicitly affirms your unlimited 159 | permission to run the unmodified Program. The output from running a 160 | covered work is covered by this License only if the output, given its 161 | content, constitutes a covered work. This License acknowledges your 162 | rights of fair use or other equivalent, as provided by copyright law. 
163 | 164 | You may make, run and propagate covered works that you do not 165 | convey, without conditions so long as your license otherwise remains 166 | in force. You may convey covered works to others for the sole purpose 167 | of having them make modifications exclusively for you, or provide you 168 | with facilities for running those works, provided that you comply with 169 | the terms of this License in conveying all material for which you do 170 | not control copyright. Those thus making or running the covered works 171 | for you must do so exclusively on your behalf, under your direction 172 | and control, on terms that prohibit them from making any copies of 173 | your copyrighted material outside their relationship with you. 174 | 175 | Conveying under any other circumstances is permitted solely under 176 | the conditions stated below. Sublicensing is not allowed; section 10 177 | makes it unnecessary. 178 | 179 | 3. Protecting Users' Legal Rights From Anti-Circumvention Law. 180 | 181 | No covered work shall be deemed part of an effective technological 182 | measure under any applicable law fulfilling obligations under article 183 | 11 of the WIPO copyright treaty adopted on 20 December 1996, or 184 | similar laws prohibiting or restricting circumvention of such 185 | measures. 186 | 187 | When you convey a covered work, you waive any legal power to forbid 188 | circumvention of technological measures to the extent such circumvention 189 | is effected by exercising rights under this License with respect to 190 | the covered work, and you disclaim any intention to limit operation or 191 | modification of the work as a means of enforcing, against the work's 192 | users, your or third parties' legal rights to forbid circumvention of 193 | technological measures. 194 | 195 | 4. Conveying Verbatim Copies. 
196 | 197 | You may convey verbatim copies of the Program's source code as you 198 | receive it, in any medium, provided that you conspicuously and 199 | appropriately publish on each copy an appropriate copyright notice; 200 | keep intact all notices stating that this License and any 201 | non-permissive terms added in accord with section 7 apply to the code; 202 | keep intact all notices of the absence of any warranty; and give all 203 | recipients a copy of this License along with the Program. 204 | 205 | You may charge any price or no price for each copy that you convey, 206 | and you may offer support or warranty protection for a fee. 207 | 208 | 5. Conveying Modified Source Versions. 209 | 210 | You may convey a work based on the Program, or the modifications to 211 | produce it from the Program, in the form of source code under the 212 | terms of section 4, provided that you also meet all of these conditions: 213 | 214 | a) The work must carry prominent notices stating that you modified 215 | it, and giving a relevant date. 216 | 217 | b) The work must carry prominent notices stating that it is 218 | released under this License and any conditions added under section 219 | 7. This requirement modifies the requirement in section 4 to 220 | "keep intact all notices". 221 | 222 | c) You must license the entire work, as a whole, under this 223 | License to anyone who comes into possession of a copy. This 224 | License will therefore apply, along with any applicable section 7 225 | additional terms, to the whole of the work, and all its parts, 226 | regardless of how they are packaged. This License gives no 227 | permission to license the work in any other way, but it does not 228 | invalidate such permission if you have separately received it. 
229 | 230 | d) If the work has interactive user interfaces, each must display 231 | Appropriate Legal Notices; however, if the Program has interactive 232 | interfaces that do not display Appropriate Legal Notices, your 233 | work need not make them do so. 234 | 235 | A compilation of a covered work with other separate and independent 236 | works, which are not by their nature extensions of the covered work, 237 | and which are not combined with it such as to form a larger program, 238 | in or on a volume of a storage or distribution medium, is called an 239 | "aggregate" if the compilation and its resulting copyright are not 240 | used to limit the access or legal rights of the compilation's users 241 | beyond what the individual works permit. Inclusion of a covered work 242 | in an aggregate does not cause this License to apply to the other 243 | parts of the aggregate. 244 | 245 | 6. Conveying Non-Source Forms. 246 | 247 | You may convey a covered work in object code form under the terms 248 | of sections 4 and 5, provided that you also convey the 249 | machine-readable Corresponding Source under the terms of this License, 250 | in one of these ways: 251 | 252 | a) Convey the object code in, or embodied in, a physical product 253 | (including a physical distribution medium), accompanied by the 254 | Corresponding Source fixed on a durable physical medium 255 | customarily used for software interchange. 
256 | 257 | b) Convey the object code in, or embodied in, a physical product 258 | (including a physical distribution medium), accompanied by a 259 | written offer, valid for at least three years and valid for as 260 | long as you offer spare parts or customer support for that product 261 | model, to give anyone who possesses the object code either (1) a 262 | copy of the Corresponding Source for all the software in the 263 | product that is covered by this License, on a durable physical 264 | medium customarily used for software interchange, for a price no 265 | more than your reasonable cost of physically performing this 266 | conveying of source, or (2) access to copy the 267 | Corresponding Source from a network server at no charge. 268 | 269 | c) Convey individual copies of the object code with a copy of the 270 | written offer to provide the Corresponding Source. This 271 | alternative is allowed only occasionally and noncommercially, and 272 | only if you received the object code with such an offer, in accord 273 | with subsection 6b. 274 | 275 | d) Convey the object code by offering access from a designated 276 | place (gratis or for a charge), and offer equivalent access to the 277 | Corresponding Source in the same way through the same place at no 278 | further charge. You need not require recipients to copy the 279 | Corresponding Source along with the object code. If the place to 280 | copy the object code is a network server, the Corresponding Source 281 | may be on a different server (operated by you or a third party) 282 | that supports equivalent copying facilities, provided you maintain 283 | clear directions next to the object code saying where to find the 284 | Corresponding Source. Regardless of what server hosts the 285 | Corresponding Source, you remain obligated to ensure that it is 286 | available for as long as needed to satisfy these requirements. 
287 | 288 | e) Convey the object code using peer-to-peer transmission, provided 289 | you inform other peers where the object code and Corresponding 290 | Source of the work are being offered to the general public at no 291 | charge under subsection 6d. 292 | 293 | A separable portion of the object code, whose source code is excluded 294 | from the Corresponding Source as a System Library, need not be 295 | included in conveying the object code work. 296 | 297 | A "User Product" is either (1) a "consumer product", which means any 298 | tangible personal property which is normally used for personal, family, 299 | or household purposes, or (2) anything designed or sold for incorporation 300 | into a dwelling. In determining whether a product is a consumer product, 301 | doubtful cases shall be resolved in favor of coverage. For a particular 302 | product received by a particular user, "normally used" refers to a 303 | typical or common use of that class of product, regardless of the status 304 | of the particular user or of the way in which the particular user 305 | actually uses, or expects or is expected to use, the product. A product 306 | is a consumer product regardless of whether the product has substantial 307 | commercial, industrial or non-consumer uses, unless such uses represent 308 | the only significant mode of use of the product. 309 | 310 | "Installation Information" for a User Product means any methods, 311 | procedures, authorization keys, or other information required to install 312 | and execute modified versions of a covered work in that User Product from 313 | a modified version of its Corresponding Source. The information must 314 | suffice to ensure that the continued functioning of the modified object 315 | code is in no case prevented or interfered with solely because 316 | modification has been made. 
317 | 318 | If you convey an object code work under this section in, or with, or 319 | specifically for use in, a User Product, and the conveying occurs as 320 | part of a transaction in which the right of possession and use of the 321 | User Product is transferred to the recipient in perpetuity or for a 322 | fixed term (regardless of how the transaction is characterized), the 323 | Corresponding Source conveyed under this section must be accompanied 324 | by the Installation Information. But this requirement does not apply 325 | if neither you nor any third party retains the ability to install 326 | modified object code on the User Product (for example, the work has 327 | been installed in ROM). 328 | 329 | The requirement to provide Installation Information does not include a 330 | requirement to continue to provide support service, warranty, or updates 331 | for a work that has been modified or installed by the recipient, or for 332 | the User Product in which it has been modified or installed. Access to a 333 | network may be denied when the modification itself materially and 334 | adversely affects the operation of the network or violates the rules and 335 | protocols for communication across the network. 336 | 337 | Corresponding Source conveyed, and Installation Information provided, 338 | in accord with this section must be in a format that is publicly 339 | documented (and with an implementation available to the public in 340 | source code form), and must require no special password or key for 341 | unpacking, reading or copying. 342 | 343 | 7. Additional Terms. 344 | 345 | "Additional permissions" are terms that supplement the terms of this 346 | License by making exceptions from one or more of its conditions. 347 | Additional permissions that are applicable to the entire Program shall 348 | be treated as though they were included in this License, to the extent 349 | that they are valid under applicable law. 
If additional permissions 350 | apply only to part of the Program, that part may be used separately 351 | under those permissions, but the entire Program remains governed by 352 | this License without regard to the additional permissions. 353 | 354 | When you convey a copy of a covered work, you may at your option 355 | remove any additional permissions from that copy, or from any part of 356 | it. (Additional permissions may be written to require their own 357 | removal in certain cases when you modify the work.) You may place 358 | additional permissions on material, added by you to a covered work, 359 | for which you have or can give appropriate copyright permission. 360 | 361 | Notwithstanding any other provision of this License, for material you 362 | add to a covered work, you may (if authorized by the copyright holders of 363 | that material) supplement the terms of this License with terms: 364 | 365 | a) Disclaiming warranty or limiting liability differently from the 366 | terms of sections 15 and 16 of this License; or 367 | 368 | b) Requiring preservation of specified reasonable legal notices or 369 | author attributions in that material or in the Appropriate Legal 370 | Notices displayed by works containing it; or 371 | 372 | c) Prohibiting misrepresentation of the origin of that material, or 373 | requiring that modified versions of such material be marked in 374 | reasonable ways as different from the original version; or 375 | 376 | d) Limiting the use for publicity purposes of names of licensors or 377 | authors of the material; or 378 | 379 | e) Declining to grant rights under trademark law for use of some 380 | trade names, trademarks, or service marks; or 381 | 382 | f) Requiring indemnification of licensors and authors of that 383 | material by anyone who conveys the material (or modified versions of 384 | it) with contractual assumptions of liability to the recipient, for 385 | any liability that these contractual assumptions directly impose on 
386 | those licensors and authors. 387 | 388 | All other non-permissive additional terms are considered "further 389 | restrictions" within the meaning of section 10. If the Program as you 390 | received it, or any part of it, contains a notice stating that it is 391 | governed by this License along with a term that is a further 392 | restriction, you may remove that term. If a license document contains 393 | a further restriction but permits relicensing or conveying under this 394 | License, you may add to a covered work material governed by the terms 395 | of that license document, provided that the further restriction does 396 | not survive such relicensing or conveying. 397 | 398 | If you add terms to a covered work in accord with this section, you 399 | must place, in the relevant source files, a statement of the 400 | additional terms that apply to those files, or a notice indicating 401 | where to find the applicable terms. 402 | 403 | Additional terms, permissive or non-permissive, may be stated in the 404 | form of a separately written license, or stated as exceptions; 405 | the above requirements apply either way. 406 | 407 | 8. Termination. 408 | 409 | You may not propagate or modify a covered work except as expressly 410 | provided under this License. Any attempt otherwise to propagate or 411 | modify it is void, and will automatically terminate your rights under 412 | this License (including any patent licenses granted under the third 413 | paragraph of section 11). 414 | 415 | However, if you cease all violation of this License, then your 416 | license from a particular copyright holder is reinstated (a) 417 | provisionally, unless and until the copyright holder explicitly and 418 | finally terminates your license, and (b) permanently, if the copyright 419 | holder fails to notify you of the violation by some reasonable means 420 | prior to 60 days after the cessation. 
421 | 422 | Moreover, your license from a particular copyright holder is 423 | reinstated permanently if the copyright holder notifies you of the 424 | violation by some reasonable means, this is the first time you have 425 | received notice of violation of this License (for any work) from that 426 | copyright holder, and you cure the violation prior to 30 days after 427 | your receipt of the notice. 428 | 429 | Termination of your rights under this section does not terminate the 430 | licenses of parties who have received copies or rights from you under 431 | this License. If your rights have been terminated and not permanently 432 | reinstated, you do not qualify to receive new licenses for the same 433 | material under section 10. 434 | 435 | 9. Acceptance Not Required for Having Copies. 436 | 437 | You are not required to accept this License in order to receive or 438 | run a copy of the Program. Ancillary propagation of a covered work 439 | occurring solely as a consequence of using peer-to-peer transmission 440 | to receive a copy likewise does not require acceptance. However, 441 | nothing other than this License grants you permission to propagate or 442 | modify any covered work. These actions infringe copyright if you do 443 | not accept this License. Therefore, by modifying or propagating a 444 | covered work, you indicate your acceptance of this License to do so. 445 | 446 | 10. Automatic Licensing of Downstream Recipients. 447 | 448 | Each time you convey a covered work, the recipient automatically 449 | receives a license from the original licensors, to run, modify and 450 | propagate that work, subject to this License. You are not responsible 451 | for enforcing compliance by third parties with this License. 452 | 453 | An "entity transaction" is a transaction transferring control of an 454 | organization, or substantially all assets of one, or subdividing an 455 | organization, or merging organizations. 
If propagation of a covered 456 | work results from an entity transaction, each party to that 457 | transaction who receives a copy of the work also receives whatever 458 | licenses to the work the party's predecessor in interest had or could 459 | give under the previous paragraph, plus a right to possession of the 460 | Corresponding Source of the work from the predecessor in interest, if 461 | the predecessor has it or can get it with reasonable efforts. 462 | 463 | You may not impose any further restrictions on the exercise of the 464 | rights granted or affirmed under this License. For example, you may 465 | not impose a license fee, royalty, or other charge for exercise of 466 | rights granted under this License, and you may not initiate litigation 467 | (including a cross-claim or counterclaim in a lawsuit) alleging that 468 | any patent claim is infringed by making, using, selling, offering for 469 | sale, or importing the Program or any portion of it. 470 | 471 | 11. Patents. 472 | 473 | A "contributor" is a copyright holder who authorizes use under this 474 | License of the Program or a work on which the Program is based. The 475 | work thus licensed is called the contributor's "contributor version". 476 | 477 | A contributor's "essential patent claims" are all patent claims 478 | owned or controlled by the contributor, whether already acquired or 479 | hereafter acquired, that would be infringed by some manner, permitted 480 | by this License, of making, using, or selling its contributor version, 481 | but do not include claims that would be infringed only as a 482 | consequence of further modification of the contributor version. For 483 | purposes of this definition, "control" includes the right to grant 484 | patent sublicenses in a manner consistent with the requirements of 485 | this License. 
486 | 487 | Each contributor grants you a non-exclusive, worldwide, royalty-free 488 | patent license under the contributor's essential patent claims, to 489 | make, use, sell, offer for sale, import and otherwise run, modify and 490 | propagate the contents of its contributor version. 491 | 492 | In the following three paragraphs, a "patent license" is any express 493 | agreement or commitment, however denominated, not to enforce a patent 494 | (such as an express permission to practice a patent or covenant not to 495 | sue for patent infringement). To "grant" such a patent license to a 496 | party means to make such an agreement or commitment not to enforce a 497 | patent against the party. 498 | 499 | If you convey a covered work, knowingly relying on a patent license, 500 | and the Corresponding Source of the work is not available for anyone 501 | to copy, free of charge and under the terms of this License, through a 502 | publicly available network server or other readily accessible means, 503 | then you must either (1) cause the Corresponding Source to be so 504 | available, or (2) arrange to deprive yourself of the benefit of the 505 | patent license for this particular work, or (3) arrange, in a manner 506 | consistent with the requirements of this License, to extend the patent 507 | license to downstream recipients. "Knowingly relying" means you have 508 | actual knowledge that, but for the patent license, your conveying the 509 | covered work in a country, or your recipient's use of the covered work 510 | in a country, would infringe one or more identifiable patents in that 511 | country that you have reason to believe are valid. 
512 | 513 | If, pursuant to or in connection with a single transaction or 514 | arrangement, you convey, or propagate by procuring conveyance of, a 515 | covered work, and grant a patent license to some of the parties 516 | receiving the covered work authorizing them to use, propagate, modify 517 | or convey a specific copy of the covered work, then the patent license 518 | you grant is automatically extended to all recipients of the covered 519 | work and works based on it. 520 | 521 | A patent license is "discriminatory" if it does not include within 522 | the scope of its coverage, prohibits the exercise of, or is 523 | conditioned on the non-exercise of one or more of the rights that are 524 | specifically granted under this License. You may not convey a covered 525 | work if you are a party to an arrangement with a third party that is 526 | in the business of distributing software, under which you make payment 527 | to the third party based on the extent of your activity of conveying 528 | the work, and under which the third party grants, to any of the 529 | parties who would receive the covered work from you, a discriminatory 530 | patent license (a) in connection with copies of the covered work 531 | conveyed by you (or copies made from those copies), or (b) primarily 532 | for and in connection with specific products or compilations that 533 | contain the covered work, unless you entered into that arrangement, 534 | or that patent license was granted, prior to 28 March 2007. 535 | 536 | Nothing in this License shall be construed as excluding or limiting 537 | any implied license or other defenses to infringement that may 538 | otherwise be available to you under applicable patent law. 539 | 540 | 12. No Surrender of Others' Freedom. 541 | 542 | If conditions are imposed on you (whether by court order, agreement or 543 | otherwise) that contradict the conditions of this License, they do not 544 | excuse you from the conditions of this License. 
If you cannot convey a 545 | covered work so as to satisfy simultaneously your obligations under this 546 | License and any other pertinent obligations, then as a consequence you may 547 | not convey it at all. For example, if you agree to terms that obligate you 548 | to collect a royalty for further conveying from those to whom you convey 549 | the Program, the only way you could satisfy both those terms and this 550 | License would be to refrain entirely from conveying the Program. 551 | 552 | 13. Use with the GNU Affero General Public License. 553 | 554 | Notwithstanding any other provision of this License, you have 555 | permission to link or combine any covered work with a work licensed 556 | under version 3 of the GNU Affero General Public License into a single 557 | combined work, and to convey the resulting work. The terms of this 558 | License will continue to apply to the part which is the covered work, 559 | but the special requirements of the GNU Affero General Public License, 560 | section 13, concerning interaction through a network will apply to the 561 | combination as such. 562 | 563 | 14. Revised Versions of this License. 564 | 565 | The Free Software Foundation may publish revised and/or new versions of 566 | the GNU General Public License from time to time. Such new versions will 567 | be similar in spirit to the present version, but may differ in detail to 568 | address new problems or concerns. 569 | 570 | Each version is given a distinguishing version number. If the 571 | Program specifies that a certain numbered version of the GNU General 572 | Public License "or any later version" applies to it, you have the 573 | option of following the terms and conditions either of that numbered 574 | version or of any later version published by the Free Software 575 | Foundation. If the Program does not specify a version number of the 576 | GNU General Public License, you may choose any version ever published 577 | by the Free Software Foundation. 
578 | 579 | If the Program specifies that a proxy can decide which future 580 | versions of the GNU General Public License can be used, that proxy's 581 | public statement of acceptance of a version permanently authorizes you 582 | to choose that version for the Program. 583 | 584 | Later license versions may give you additional or different 585 | permissions. However, no additional obligations are imposed on any 586 | author or copyright holder as a result of your choosing to follow a 587 | later version. 588 | 589 | 15. Disclaimer of Warranty. 590 | 591 | THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY 592 | APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT 593 | HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY 594 | OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, 595 | THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR 596 | PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM 597 | IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF 598 | ALL NECESSARY SERVICING, REPAIR OR CORRECTION. 599 | 600 | 16. Limitation of Liability. 601 | 602 | IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING 603 | WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS 604 | THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY 605 | GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE 606 | USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF 607 | DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD 608 | PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS), 609 | EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF 610 | SUCH DAMAGES. 611 | 612 | 17. Interpretation of Sections 15 and 16. 
613 | 614 | If the disclaimer of warranty and limitation of liability provided 615 | above cannot be given local legal effect according to their terms, 616 | reviewing courts shall apply local law that most closely approximates 617 | an absolute waiver of all civil liability in connection with the 618 | Program, unless a warranty or assumption of liability accompanies a 619 | copy of the Program in return for a fee. 620 | 621 | END OF TERMS AND CONDITIONS 622 | 623 | How to Apply These Terms to Your New Programs 624 | 625 | If you develop a new program, and you want it to be of the greatest 626 | possible use to the public, the best way to achieve this is to make it 627 | free software which everyone can redistribute and change under these terms. 628 | 629 | To do so, attach the following notices to the program. It is safest 630 | to attach them to the start of each source file to most effectively 631 | state the exclusion of warranty; and each file should have at least 632 | the "copyright" line and a pointer to where the full notice is found. 633 | 634 | {one line to give the program's name and a brief idea of what it does.} 635 | Copyright (C) {year} {name of author} 636 | 637 | This program is free software: you can redistribute it and/or modify 638 | it under the terms of the GNU General Public License as published by 639 | the Free Software Foundation, either version 3 of the License, or 640 | (at your option) any later version. 641 | 642 | This program is distributed in the hope that it will be useful, 643 | but WITHOUT ANY WARRANTY; without even the implied warranty of 644 | MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the 645 | GNU General Public License for more details. 646 | 647 | You should have received a copy of the GNU General Public License 648 | along with this program. If not, see <http://www.gnu.org/licenses/>. 649 | 650 | Also add information on how to contact you by electronic and paper mail.
651 | 652 | If the program does terminal interaction, make it output a short 653 | notice like this when it starts in an interactive mode: 654 | 655 | {project} Copyright (C) {year} {fullname} 656 | This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'. 657 | This is free software, and you are welcome to redistribute it 658 | under certain conditions; type `show c' for details. 659 | 660 | The hypothetical commands `show w' and `show c' should show the appropriate 661 | parts of the General Public License. Of course, your program's commands 662 | might be different; for a GUI interface, you would use an "about box". 663 | 664 | You should also get your employer (if you work as a programmer) or school, 665 | if any, to sign a "copyright disclaimer" for the program, if necessary. 666 | For more information on this, and how to apply and follow the GNU GPL, see 667 | <http://www.gnu.org/licenses/>. 668 | 669 | The GNU General Public License does not permit incorporating your program 670 | into proprietary programs. If your program is a subroutine library, you 671 | may consider it more useful to permit linking proprietary applications with 672 | the library. If this is what you want to do, use the GNU Lesser General 673 | Public License instead of this License. But first, please read 674 | <http://www.gnu.org/licenses/why-not-lgpl.html>. --------------------------------------------------------------------------------