├── .gitignore
├── README.md
├── config.ini
├── help.txt
├── main.py
├── modules
│   ├── Camera.py
│   ├── Controller.py
│   ├── Speak.py
│   ├── Timer.py
│   ├── Tracker.py
│   ├── Turret.py
│   ├── __init__.py
│   └── driver
│       ├── Adafruit_I2C.py
│       ├── Adafruit_PWM_Servo_Driver.py
│       ├── __init__.py
│       └── monkeypatch.py
├── robot.png
└── run.sh
/.gitignore:
--------------------------------------------------------------------------------
1 | myfiles/
2 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # SentryTurret
2 |
3 | The purpose of this project is to create a cheap autonomous turret, capable of detecting motion and targeting objects.
4 |
5 | ![Pew Pew](https://raw.githubusercontent.com/steve-vincent/SentryTurret/master/robot.png "Working Example")
6 |
7 | ## Overview
8 |
9 | These instructions were written for constructing the robot with RPi2+RaspiCam+servos, but **you may try it out on a desktop with a webcam** (after installing the required Python packages).
10 |
11 | **The basic structure of the program is as follows:**
12 | - Main()
13 |   - Spawn Camera() thread to fetch camera frames
14 |   - Spawn Turret() thread to listen for target coordinates and move the servos. If centered and armed, it will fire.
15 |   - Spawn Controller() thread to listen for input (keyboard or bluetooth)
16 |   - Loop:
17 |     - Get last frame from Camera()
18 |     - If tracking target:
19 |       - Find center of target in frame
20 |       - Give Turret() new target coords
21 |     - Else if searching for target, do one of:
22 |       - Detect face mode
23 |       - Detect motion by color
24 |       - Detect motion by object
25 |     - If input received, run the command. E.g.:
26 |       - Quit
27 |       - Targeting mode
28 |       - Adjust setting
29 |       - etc.
30 |
31 | ## Install
32 |
33 | See [CONFIG.INI](https://github.com/steve-vincent/SentryTurret/blob/master/config.ini) for settings
34 |
35 | ### Python Package Requirements
36 | ~~~
37 | python-dev
38 | libopencv-dev
39 | python-opencv
40 | open-cv
41 | dlib (see note below)
42 | configobj
43 | python-smbus
44 | webcolors
45 | bluez python-bluetooth python-gobject (for bluetooth)
46 | pyttsx (for speech)
47 | ~~~
48 |
49 | **Installing dlib**
50 | While "sudo apt-get install" works for opencv, [dlib](http://dlib.net) (used for object tracking) requires compilation. Here are the basic steps:
51 | ~~~
52 | sudo apt-get install cmake
53 | Install boost from: http://sourceforge.net/projects/boost
54 | sudo apt-get install libboost-python-dev
55 | wget http://dlib.net/files/dlib-18.18.tar.bz2
56 | tar jxf dlib-18.18.tar.bz2
57 | cd dlib*
58 | sudo python setup.py install
59 | pip install dlib
60 | ~~~
61 |
62 | ### Hardware To Construct Robot
63 | - Raspberry Pi 2 + power supply
64 | - RaspiCam
65 | - PWM Servo Driver + power supply
66 | - 3 standard servos
67 | - Optional: Speaker (for speech feedback)
68 | - Optional: Bluetooth USB adapter (for bluetooth control)
69 | - Robot frame: camera mounted in the direction of the gun, with servos for the trigger and pan/tilt rotation
70 |
71 | ## Usage
72 |
73 | `$ python main.py`
74 |
75 | See [HELP.TXT](https://github.com/steve-vincent/SentryTurret/blob/master/help.txt) for commands
76 |
77 | ## Setup Tips
78 |
79 | **Camera:**
80 | To use the RaspiCam with OpenCV, load the bcm2835-v4l2 module with `sudo modprobe bcm2835-v4l2` (read [here](http://raspberrypi.stackexchange.com/questions/17068/using-opencv-with-raspicam-and-python) and [here](https://www.raspberrypi.org/forums/viewtopic.php?f=43&t=94381)).
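Before launching the bot, you can confirm OpenCV actually sees the camera. The snippet below is a minimal sketch (not part of this repo) that assumes the Python packages above are installed:
~~~
# quick camera sanity check -- hypothetical helper, not used by the bot
import cv2
cap = cv2.VideoCapture(0)   # same device index Camera.py uses
ok, frame = cap.read()
if ok:
    print "camera OK, frame shape:", frame.shape
else:
    print "no frame - check the modprobe above and that /dev/video0 exists"
cap.release()
~~~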
81 | - CONFIG.INI settings:
82 |   - upsidedown: set according to how the camera is mounted
83 |   - width,height: camera capture dimensions
84 |   - scaledown: shrink frames for faster processing/framerate
85 |   - display: on or off to show the robot's view (slower)
86 |
87 | **Turret/Servos:**
88 | Here is a [good tutorial](https://learn.adafruit.com/adafruit-16-channel-servo-driver-with-raspberry-pi) on connecting servos to your RPi. (Also read about [setting I2C permissions](http://www.raspberrypi.org/forums/viewtopic.php?p=238003#p238003)).
89 | - CONFIG.INI settings:
90 |   - panchannel/tiltchannel/triggerchannel: channels on the servo driver for each servo
91 |   - fireposition: how far back to pull the trigger
92 |   - firesensitivity: how centered the target must be before firing (higher is more "trigger happy")
93 |   - *TODO: These other settings need some cleanup or auto-calibration. Tweak if needed.*
94 |   - stepsleep: seconds per microstep
95 |   - pixelsperpulse: measure and set this manually per turret/camera (the number of pixels a point moves in the frame when the turret rotates by 1.0 pulse)
96 |   - fps: how often the turret expects new coordinates
97 |
98 |
99 | **Speech:** (Optional)
100 | If you connect a speaker, the robot will announce modes and actions. Make sure the volume is up the first time: `$ amixer sset PCM,0 100%`.
101 | - CONFIG.INI settings:
102 |   - quiet: on or off (turn off if you don't have a speaker)
103 |
104 | **Controller:**
105 | Send commands by keyboard (direct USB) or bluetooth (USB dongle + Android app, such as [BlueMCU](https://play.google.com/store/apps/details?id=com.bluetooth.BlueMCU&hl=en)). Read about [bluetooth setup](https://github.com/metachris/android-bluetooth-spp) and [helpful commands](https://www.raspberrypi.org/forums/viewtopic.php?p=521067).
106 | - CONFIG.INI settings:
107 |   - usebluetooth: on or off (turn off if you don't use it)
108 |
109 | E.g., to enable the serial port and bind bluetooth to your phone:
110 | ~~~
111 | $ sdptool add SP
112 | $ sudo rfcomm bind rfcomm0 XX:XX:XX:XX:XX:XX 1 (<-your address here)
113 | ~~~
114 |
115 | ## Issues and Workarounds
116 |
117 | A lot of output appears on startup; it may be ignored.
118 |
119 | Bluetooth may need a reset before restarting the bot.
120 | ~~~
121 | $ sudo hciconfig hci0 reset
122 | ~~~
123 |
124 | Stdin may stop echoing what you type after the program runs. Reset it with:
125 | ~~~
126 | $ stty sane
127 | ~~~
128 |
129 | ## Contribute
130 |
131 | Contributions are welcome to improve the accuracy and movement of the robot!
132 | 133 | ## License 134 | 135 | This project licensed under GNU GENERAL PUBLIC LICENSE 136 | https://www.gnu.org/licenses/gpl.txt 137 | -------------------------------------------------------------------------------- /config.ini: -------------------------------------------------------------------------------- 1 | 2 | [controller] 3 | usebluetooth = 1#see readme 4 | 5 | 6 | [turret] 7 | useturret = 1 8 | panchannel = 15 9 | tiltchannel = 3 10 | triggerchannel = 1 11 | centerx = 0.0 12 | centery = 0.0 13 | stepsleep = 0.05#time per step (smoothness) 14 | pixelsperpulse = 340.0#measured pixels moved for 1.0 pulse 15 | fireposition = 0.2#servo position when pulled/fired 16 | pulsecenter = 1.6 17 | pulserange = 0.8#per adafruit 0.5 = 1.0...1.5...2.0 18 | 19 | 20 | [camera] 21 | display = 1 22 | width = 160 23 | height = 120 24 | scaledown = 1.0#shrink for processing speed 25 | upsidedown = 1#is camera mounted upside down 26 | fps = 5#movement is wild if set too high *****AUTO CALC THIS**** 27 | 28 | 29 | [tolerance] 30 | reloadseconds = 3#seconds between fire/reload (3) 31 | firesensitivity = 0.01940598#fire if within pulse range, higher number = more trigger happy (0.02) 32 | [[objecttrack]] 33 | areathreshold = 100#min contour area, ignores noise (100) 34 | [[colortrack]] 35 | areathreshold = 100#min contour area, ignores noise (100) 36 | #starting HSV +/- range 37 | hue = 20#try 20 38 | saturation = 40#try 40 39 | value = 40#try 40 40 | 41 | 42 | [speak] 43 | quiet = 0 44 | volume = 0.5#0.0 to 1.0 45 | rate = 150 46 | dialect = 0x12b0690 47 | taunts = beep boop. beep beep boop. ha ha ha just kidding, "domo ah,di 'gahtow, mis'ter robato", Anybody here named John Conner or Sarah Conner?, I don't have a sister but I got lots of transistors., Why did the robot cross the road? Because it was programmed to., "with a face like this, its lucky I'm not self aware", "Can a robot go down the stairs? Yes, but only once.", "the singularity, is near", Who has three 'servos and likes to pivot? This guy., Eat your heart out R2 D2, "Pay me some 'bitcoin, and I'll tell more jokes." 48 | -------------------------------------------------------------------------------- /help.txt: -------------------------------------------------------------------------------- 1 | '?' or 'h' - this menu 2 | 'q' - quit (exit 0) 3 | 'r' - restart (exit 1) 4 | --- Targeting --- 5 | ' ' - reset 6 | 't' - target color at center 7 | '1' - track face on detection 8 | '2' - track color on motion 9 | '3' - track object on motion 10 | --- Controls --- 11 | 'j' - joke/taunt 12 | 'g' - fire 13 | --- Settings --- 14 | 'p' - toggle arm/safety 15 | 'n' - toggle speech/quiet 16 | 'i' - toggle frames/second output 17 | 'w' - pause display 18 | '+' - volume up 19 | '-' - volume down 20 | 'y' - restart speech (if crashes) 21 | --- Tolerance --- 22 | 'l/m' - more/less trigger happy 23 | 'a/z' - +/- size threshold color motion 24 | 's/x' - +/- size threshold object motion 25 | --- Turret/Servo --- 26 | You may click window to position servos. 
27 | -------------------------------------------------------------------------------- /main.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/python 2 | 3 | import sys 4 | import os 5 | import cv2 6 | import dlib 7 | import numpy as np 8 | import threading 9 | from time import sleep 10 | import modules.Camera as Camera 11 | import modules.Speak as Speak 12 | import modules.Controller as Controller 13 | import modules.Timer as Timer 14 | import modules.Tracker as Tracker 15 | import modules.Turret as Turret 16 | 17 | 18 | # =========================================================================== 19 | # This is the main executable for the robot. 20 | # =========================================================================== 21 | 22 | 23 | #if display, respond to mouse 24 | def on_click(event, cx, cy, flag, param): 25 | if(event==cv2.cv.CV_EVENT_LBUTTONDOWN): 26 | print "moving..." + str(cx),str(cy) 27 | coords = (cx,cy) 28 | newturr = turret.coordToPulse(coords) 29 | currturr = (turret.xy[0],turret.xy[1]) 30 | turret.sendTarget(newturr, currturr) 31 | if(event==cv2.cv.CV_EVENT_MBUTTONDOWN): 32 | print "firing..." 33 | turret.fire() 34 | if(event==cv2.cv.CV_EVENT_RBUTTONDOWN): 35 | print "exiting...(to do)" 36 | 37 | 38 | #load config 39 | from configobj import ConfigObj #this library supports writing/saving config 40 | cfg = ConfigObj(os.path.dirname(os.path.abspath(__file__)) + '/config.ini') 41 | 42 | 43 | def main() : 44 | global cam, turret 45 | 46 | #start cam 47 | cam = Camera.Capture(cfg) 48 | cam.daemon = True 49 | cam.start() 50 | i=0 51 | while (i<50): #allow 5sec for startup 52 | i += 1 53 | sleep(0.1) 54 | if cam.rawframe != None: 55 | break 56 | frame = cam.getFrame() #need during init 57 | 58 | #start turret 59 | turret = Turret.Targetting(cfg) 60 | turret.daemon = True 61 | turret.recenter() 62 | turret.start() 63 | 64 | #start controller 65 | cmdlistener = Controller.Listener(cfg) 66 | cmdlistener.startlisteners() 67 | key = "" 68 | 69 | #display 70 | displaytext = "" #for writing message in window 71 | if os.environ.get('DISPLAY') and int(cfg['camera']['display']): 72 | display = True 73 | print 'display found' 74 | else: 75 | display = False 76 | print 'no display' 77 | if display: 78 | cv2.namedWindow("display", cv2.cv.CV_WINDOW_AUTOSIZE) 79 | cv2.setMouseCallback("display", on_click, 0) 80 | 81 | #tracking functions 82 | track = Tracker.Tracking(cfg, display) 83 | tracker = None #dlib object tracking 84 | 85 | #motion detection 86 | avgframe = np.float32(frame) 87 | avgtimer = threading.Event() #TODO: put this in Timer thread? 
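# How motion detection works: while avgtimer is set (a 5-second Timer.Countdown started by the
# '2'/'3' commands), cv2.accumulateWeighted() blends frames into avgframe to learn a background image;
# once the timer clears, each new frame is diffed against that background (track.getMotionContour)
# and the largest contour above the area threshold becomes the target.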
88 | 89 | #speak 90 | speak = Speak.Speak(cfg) 91 | speak.say("ready") 92 | 93 | cam.resetFPS() 94 | 95 | while(1): 96 | 97 | #capture frame,position 98 | framexy = turret.xy 99 | frame = cam.getFrame() 100 | if display: #default display 101 | displayframe = frame 102 | 103 | #targetting color (hsv) 104 | if track.mode == 4: 105 | cx,cy,displayframe = track.targetHsv(frame) 106 | if cx: 107 | turret.sendTarget(turret.coordToPulse((cx,cy)), framexy) 108 | #targetting object (dlib) 109 | if track.mode == 5: 110 | tracker.update(frame) 111 | rect = tracker.get_position() 112 | cx = (rect.right() + rect.left()) / 2 113 | cy = (rect.top() + rect.bottom()) / 2 114 | turret.sendTarget(turret.coordToPulse((cx,cy)), framexy) 115 | if display: 116 | pt1 = (int(rect.left()), int(rect.top())) 117 | pt2 = (int(rect.right()), int(rect.bottom())) 118 | cv2.rectangle(displayframe, pt1, pt2, (255, 255, 255), 3) 119 | #detect motionhsv, motionobj 120 | elif track.mode == 2 or track.mode == 3: 121 | if avgtimer.isSet(): 122 | #learn image for motion detection 123 | cv2.accumulateWeighted(frame, avgframe, 0.8) 124 | resframe = cv2.convertScaleAbs(avgframe) 125 | else: 126 | cnt, motionmask = track.getMotionContour(frame, resframe, track.areathresholdobject if track.mode==3 else track.areathresholdcolor) 127 | if display: 128 | displayframe = cv2.cvtColor(motionmask.copy(), cv2.COLOR_GRAY2RGB) 129 | if not cnt == None: 130 | #motionhsv 131 | if track.mode == 2: #mask for mean 132 | track.bgrtarget = cv2.mean(frame, motionmask) 133 | framehsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV) 134 | track.hsvtarget = cv2.mean(framehsv, motionmask) 135 | track.setColorRange() 136 | turret.armed = True 137 | displaytext = "Targetting " + Speak.BGRname(track.bgrtarget) 138 | speak.say(displaytext) 139 | track.mode = 4 #hsvtarget 140 | cam.resetFPS() 141 | #motionobj 142 | if track.mode == 3: #object points 143 | x,y,w,h = cv2.boundingRect(cnt) 144 | if x>3 and y>3: # and x+w best_area: 163 | best_area = area 164 | best_face = d 165 | if not best_face == None: #face points 166 | points = (best_face.left(),best_face.top(),best_face.right(),best_face.bottom()) 167 | tracker = dlib.correlation_tracker() 168 | tracker.start_track(frame, dlib.rectangle(*points)) 169 | displaytext = "Targetting human" 170 | speak.say(displaytext) 171 | track.mode = 5 #dlib 172 | #if face, go for shirt color 173 | # xs = int(x + (x/2)) #shirt 174 | # ys = int(y+(h*1.7)) #shirt 175 | # framecenter = frame[ys-track.sampleradius:ys+track.sampleradius, xs-track.sampleradius:xs+track.sampleradius] 176 | # framecenterhsv = cv2.cvtColor(framecenter, cv2.COLOR_BGR2HSV) 177 | # track.hsvtarget = [int(cv2.mean(framecenterhsv)[0]), int(cv2.mean(framecenterhsv)[1]), int(cv2.mean(framecenterhsv)[2])] 178 | # track.setColorRange() 179 | # track.bgrtarget = [int(cv2.mean(framecenter)[0]), int(cv2.mean(framecenter)[1]), int(cv2.mean(framecenter)[2])] 180 | # speak.say("Hey, you in the " + Speak.BGRname(track.bgrtarget) + " shirt") 181 | # track.mode = 4 #hsvtarget 182 | 183 | 184 | 185 | #display frame 186 | if display: 187 | if displaytext: 188 | textsize = 0.003 * cam.w 189 | cv2.putText(displayframe, displaytext, (5,cam.h-5), cv2.FONT_HERSHEY_SIMPLEX, textsize, (0,255,255)) 190 | cv2.imshow('display', displayframe) 191 | cv2.waitKey(1) 192 | 193 | 194 | # Command Handler -------------------------- 195 | if cmdlistener.cmdsent.isSet(): 196 | key = cmdlistener.cmd 197 | cmdlistener.reset() 198 | #quit/restart 199 | if key=="?" 
or key=="h": 200 | f = open(os.path.dirname(os.path.abspath(__file__)) + '/help.txt','r') 201 | print(f.read()) 202 | elif key=="q" or key=="r": 203 | turret.quit() 204 | cam.quit() 205 | cmdlistener.quit() 206 | cfg.write() 207 | if display: 208 | cv2.destroyAllWindows() 209 | if key=="q": 210 | speak.say("quitting. bye") 211 | return 0 212 | if key=="r": 213 | speak.say("restarting") 214 | return 1 215 | #--- Targeting --- 216 | elif key==" ": #reset all 217 | speak.say("Reset") 218 | displaytext = "" 219 | turret.armed = False 220 | track.mode = 0 221 | turret.recenter() 222 | cam.resetFPS() 223 | elif key=="t": #sample center of image 224 | sampleradius = 0.02 * (cam.w / float(cfg['camera']['scaledown'])) 225 | framecenter = frame[(cam.h/2)-sampleradius:(cam.h/2)+sampleradius, (cam.w/2)-sampleradius:(cam.w/2)+sampleradius] 226 | framecenterhsv = cv2.cvtColor(framecenter, cv2.COLOR_BGR2HSV) 227 | track.hsvtarget = [int(cv2.mean(framecenterhsv)[0]), int(cv2.mean(framecenterhsv)[1]), int(cv2.mean(framecenterhsv)[2])] 228 | track.setColorRange() 229 | track.bgrtarget = [int(cv2.mean(framecenter)[0]), int(cv2.mean(framecenter)[1]), int(cv2.mean(framecenter)[2])] 230 | displaytext = "Targetting " + Speak.BGRname(track.bgrtarget) 231 | speak.say(displaytext) 232 | track.mode = 4 233 | cam.resetFPS() 234 | elif key=="1": #start face detect 235 | displaytext = "Seeking humans." 236 | speak.say(displaytext) 237 | track.mode = 1 #face 238 | cam.resetFPS() 239 | #track.faceCascade = cv2.CascadeClassifier("haarcascade_frontalface_default.xml") 240 | elif key=="2" or key=="3": #start motion detect: 2:hsv or 3:dlib 241 | displaytext = "Motion " + ("by color" if key=="2" else "by object") 242 | speak.say(displaytext) 243 | Timer.Countdown(5, avgtimer).thread.start() 244 | sleep(.1) #timer set 245 | track.mode = int(key) #motiondetect 246 | #--- Controls --- 247 | elif key=="j": #joke/taunt 248 | speak.taunt() 249 | elif key=="g": #test fire 250 | turret.fire() 251 | #--- Settings --- 252 | elif key=="i": #fps 253 | print(cam.getFPS()) 254 | cam.resetFPS() 255 | elif key=="p": #toggle armed 256 | turret.armed = not turret.armed 257 | if turret.armed: 258 | speak.say("armed") 259 | else: 260 | speak.say("disarmed") 261 | elif key=="n": #toggle noisy 262 | if speak.quiet: 263 | speak.quiet = 0 264 | speak.say("talking") 265 | else: 266 | speak.say("quiet") 267 | sleep(.1) 268 | speak.quiet = 1 269 | cfg['speak']['quiet'] = speak.quiet 270 | elif key=="+": #volume 271 | speak.volume(.1) 272 | speak.say("check") 273 | elif key=="-": 274 | speak.volume(-.1) 275 | speak.say("check") 276 | elif key=="y": #restart speak 277 | speak = Speak.Speak() 278 | speak.say("ok") 279 | elif key=="w": #pause display 280 | cam.resetFPS() 281 | if not display and os.environ.get('DISPLAY') and int(cfg['camera']['display']): 282 | display = True 283 | print("- display resumed -") 284 | else: 285 | display = False 286 | print("- display paused -") 287 | #--- Tolerance --- 288 | else: 289 | adjustpercent = 0.10 #step +/- percent for manual key adjustments 290 | adjust = lambda val,sign: val * (1 + (sign*adjustpercent)) 291 | if key=="l" or key=="m": #target sensitivity 292 | turret.firesensitivity = adjust(turret.firesensitivity, 1 if key=="l" else -1) 293 | cfg['tolerance']['firesensitivity'] = turret.firesensitivity 294 | if key=="a" or key=="z": #threshhold area 295 | track.areathresholdcolor = adjust(track.areathresholdcolor, 1 if key=="a" else -1) 296 | cfg['tolerance']['colortrack']['areathreshold'] = 
track.areathresholdcolor
297 | if key=="s" or key=="x": #threshold area
298 | track.areathresholdobject = adjust(track.areathresholdobject, 1 if key=="s" else -1)
299 | cfg['tolerance']['objecttrack']['areathreshold'] = track.areathresholdobject
300 | print "Area(Color):", track.areathresholdcolor, "Area(Object):", track.areathresholdobject, "Trigger:", turret.firesensitivity
301 |
302 |
303 | if __name__ == "__main__" :
304 | #start
305 | sys.exit(main())
306 |
307 |
--------------------------------------------------------------------------------
/modules/Camera.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/python
2 |
3 | import threading
4 | import cv2
5 | import datetime
6 |
7 | class Capture(threading.Thread) :
8 |
9 | def __init__(self, cfg):
10 | threading.Thread.__init__(self)
11 | self.upsidedown = int(cfg['camera']['upsidedown'])
12 | self.w = int(cfg['camera']['width'])
13 | self.h = int(cfg['camera']['height'])
14 | #for faster processing
15 | self.scaledown = float(cfg['camera']['scaledown'])
16 | self.w = int(self.w / self.scaledown)
17 | self.h = int(self.h / self.scaledown)
18 | #for fps
19 | self.rawframe = None
20 | self.frametime = None
21 | self.framecount = 0
22 | self.stopped = False
23 |
24 |
25 | def run(self):
26 | #loop to always get latest frame
27 | self.capture = cv2.VideoCapture(0)
28 | self.capture.set(cv2.cv.CV_CAP_PROP_FRAME_WIDTH, self.w) #TODO: does this work with a webcam?
29 | self.capture.set(cv2.cv.CV_CAP_PROP_FRAME_HEIGHT, self.h)
30 | while True:
31 | if self.stopped:
32 | return
33 | _, self.rawframe = self.capture.read()
34 | self.frametime = datetime.datetime.now()
35 |
36 |
37 | def getFrame(self):
38 | frame = self.rawframe
39 | self.framecount = self.framecount + 1
40 | # if self.scaledown != 1:
41 | # frame = cv2.resize(self.rawframe, (self.w, self.h)) #TODO: only if needed
42 | #if cam mounted upside down, rotate 180
43 | if self.upsidedown: #TODO: do this in display, but for coords with math
44 | M = cv2.getRotationMatrix2D((self.w/2, self.h/2), 180, 1.0)
45 | frame = cv2.warpAffine(frame, M, (self.w, self.h))
46 | return frame
47 |
48 | def getFPS(self):
49 | return str(self.framecount / (datetime.datetime.now() - self.starttime).total_seconds()) #frames per elapsed second since resetFPS()
50 |
51 | def resetFPS(self):
52 | self.starttime = datetime.datetime.now()
53 | self.framecount = 0
54 |
55 | def quit(self):
56 | self.stopped = True
57 | self.capture.release()
58 |
59 |
60 |
--------------------------------------------------------------------------------
/modules/Controller.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/python
2 |
3 | import threading
4 | import bluetooth
5 | #import getch
6 |
7 | import fcntl, os, sys, termios
8 |
9 | def getch():
10 | fd = sys.stdin.fileno()
11 | oldattr = termios.tcgetattr(fd)
12 | newattr = termios.tcgetattr(fd)
13 | newattr[3] = newattr[3] & ~termios.ICANON & ~termios.ECHO
14 | termios.tcsetattr(fd, termios.TCSANOW, newattr)
15 | oldflags = fcntl.fcntl(fd, fcntl.F_GETFL)
16 | fcntl.fcntl(fd, fcntl.F_SETFL, oldflags | os.O_NONBLOCK)
17 | try:
18 | while True:
19 | try:
20 | c = sys.stdin.read(1)
21 | except IOError:
22 | pass
23 | else:
24 | return c
25 | finally:
26 | termios.tcsetattr(fd, termios.TCSAFLUSH, oldattr)
27 | fcntl.fcntl(fd, fcntl.F_SETFL, oldflags)
28 |
29 | class Listener():
30 |
31 | def __init__(self, cfg):
32 | self.ready = threading.Event()
33 | self.cmdsent = threading.Event()
34 | self.threadquit = threading.Event()
35
| self.cmd = "" 36 | self.usebluetooth = True if int(cfg['controller']['usebluetooth']) else False 37 | 38 | 39 | def startlisteners(self) : 40 | if self.usebluetooth: 41 | self.bluetooththread = threading.Thread(target=self.bluetoothlisten) 42 | self.bluetooththread.daemon=True 43 | self.bluetooththread.start() 44 | self.keyboardthread = threading.Thread(target=self.keyboardlisten) 45 | self.keyboardthread.daemon=True 46 | self.keyboardthread.start() 47 | self.ready.set() 48 | 49 | def reset(self) : 50 | self.cmdsent.clear() 51 | self.cmd = "" 52 | self.ready.set() 53 | 54 | def gotchar(self, ch): 55 | self.ready.clear() 56 | self.cmd = ch 57 | print "Command: ", self.cmd 58 | self.cmdsent.set() 59 | 60 | def keyboardlisten(self) : 61 | print "Keyboard: listening" 62 | while self.ready.wait() and not self.threadquit.isSet(): 63 | self.gotchar(getch()) #getch.getch()) 64 | print "Keyboard: done" 65 | 66 | def bluetoothlisten(self) : 67 | self.server_sock = bluetooth.BluetoothSocket( bluetooth.RFCOMM ) 68 | self.server_sock.bind(("", 1)) #for controlling device MAC address, run: $sudo rfcomm bind rfcomm0 XX:XX:XX:XX:XX:XX 1 69 | self.server_sock.listen(1) #1 connection at a time. 70 | bluetooth.advertise_service( self.server_sock, "BluetoothRobot", "fa87c0d0-afac-11de-8a39-0800200c9a66" ) 71 | print "Bluetooth: Waiting on RFCOMM channel %d" % self.server_sock.getsockname()[1] 72 | self.sock = None 73 | self.sock, self.info = self.server_sock.accept() 74 | print "Bluetooth: listening" 75 | while self.ready.wait() and not self.threadquit.isSet(): 76 | try: 77 | self.gotchar(self.sock.recv(1024)) 78 | except IOError: 79 | pass 80 | print "Bluetooth: done" 81 | 82 | def quit(self): 83 | self.threadquit.set() 84 | self.ready.set() 85 | if self.usebluetooth: 86 | if not self.sock == None: 87 | self.sock.close() 88 | self.server_sock.close() 89 | 90 | 91 | -------------------------------------------------------------------------------- /modules/Speak.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/python 2 | 3 | #sudo apt-get install espeak 4 | 5 | #import threading 6 | import webcolors 7 | import pyttsx 8 | import random 9 | 10 | class Speak(): 11 | def __init__(self, cfg): 12 | self.cfg = cfg 13 | self.engine = pyttsx.init() 14 | self.engine.setProperty('rate', int(cfg['speak']['rate'])) 15 | self.engine.setProperty('speak', cfg['speak']['dialect']) 16 | self.engine.setProperty('volume', float(cfg['speak']['volume'])) 17 | self.quiet = int(cfg['speak']['quiet']) 18 | self.taunts = cfg['speak']['taunts'] 19 | self.tauntidx = random.randint(0, len(self.taunts)-1) #first taunt 20 | def say(self, msg): 21 | print "*** ROBOT ***: \"" + msg + "\"" 22 | if not self.quiet: 23 | self.engine.say(msg) 24 | self.engine.runAndWait() 25 | def volume(self, val): 26 | currvol = float(self.engine.getProperty('volume')) 27 | vol = currvol + val 28 | if (0.0 <= vol <= 1.0): 29 | print "volume:", vol 30 | self.engine.setProperty('volume', vol) 31 | self.cfg['speak']['volume'] = vol 32 | def taunt(self): 33 | self.say(self.taunts[self.tauntidx]) 34 | if self.tauntidx < (len(self.taunts)-1): 35 | self.tauntidx = self.tauntidx + 1 36 | else: 37 | self.tauntidx = 0 38 | 39 | def BGRname(bgrcolor): 40 | print "rgb = ", bgrcolor[2], bgrcolor[1], bgrcolor[0] 41 | try: 42 | closest_name = webcolors.rgb_to_name((bgrcolor[2], bgrcolor[1], bgrcolor[0])) 43 | except ValueError: 44 | min_colors = {} 45 | for key, name in webcolors.css21_hex_to_names.items(): 46 | r_c, g_c, b_c 
= webcolors.hex_to_rgb(key)
47 | bd = (b_c - bgrcolor[0]) ** 2
48 | gd = (g_c - bgrcolor[1]) ** 2
49 | rd = (r_c - bgrcolor[2]) ** 2
50 | min_colors[(rd + gd + bd)] = name
51 | closest_name = min_colors[min(min_colors.keys())]
52 | return closest_name
53 |
54 |
55 |
--------------------------------------------------------------------------------
/modules/Timer.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/python
2 |
3 | import threading
4 | from time import sleep
5 |
6 | class Countdown(threading.Thread):
7 | def __init__(self, seconds, event):
8 | self.runTime = seconds
9 | self.event = event
10 | self.thread = threading.Thread(target=self.run)
11 | def run(self):
12 | if not self.event.isSet():
13 | self.event.set()
14 | sleep(self.runTime)
15 | self.event.clear()
16 |
--------------------------------------------------------------------------------
/modules/Tracker.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/python
2 |
3 | import cv2
4 | import numpy as np
5 | import math
6 |
7 | class Tracking() :
8 |
9 | def __init__(self, cfg, display):
10 | self.display = display
11 | self.mode = 0 # 0:waiting, 1:faceshirthsv, 2:motionhsv, 3:motionobj, 4:hsvtarget
12 | #facetrack
13 | self.facecascade = None
14 | #objtrack
15 | self.areathresholdobject = int(cfg['tolerance']['objecttrack']['areathreshold']) / float(cfg['camera']['scaledown'])
16 | #colortrack
17 | self.areathresholdcolor = int(cfg['tolerance']['colortrack']['areathreshold']) / float(cfg['camera']['scaledown'])
18 | self.hsvtarget = None #HSV range: 0-180,255,255
19 | self.hsvtolerance = [float(cfg['tolerance']['colortrack']['hue']),
20 | float(cfg['tolerance']['colortrack']['saturation']),
21 | float(cfg['tolerance']['colortrack']['value'])]
22 | self.bgrtarget = None #for display only
23 |
24 |
25 | def setColorRange(self):
26 | self.hsvlower = np.array([max(0,self.hsvtarget[0] - self.hsvtolerance[0]),
27 | max(0,self.hsvtarget[1] - self.hsvtolerance[1]),
28 | max(0,self.hsvtarget[2] - self.hsvtolerance[2])], dtype=np.uint8)
29 | self.hsvupper = np.array([min(180,self.hsvtarget[0] + self.hsvtolerance[0]),
30 | min(255,self.hsvtarget[1] + self.hsvtolerance[1]),
31 | min(255,self.hsvtarget[2] + self.hsvtolerance[2])], dtype=np.uint8)
32 | print "Target: Hue(",self.hsvlower[0],self.hsvupper[0],") Sat(",self.hsvlower[1],self.hsvupper[1],") Val(",self.hsvlower[2],self.hsvupper[2],")"
33 |
34 |
35 | def getBestContour(self, imgmask, threshold):
36 | contours,hierarchy = cv2.findContours(imgmask, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)
37 | best_cnt = None
38 | for cnt in contours:
39 | area = cv2.contourArea(cnt)
40 | if area > threshold:
41 | threshold = area
42 | best_cnt = cnt
43 | return best_cnt
44 |
45 |
46 | def getMotionContour(self, frame, resframe, threshold):
47 | #mask all motion
48 | motionmask = cv2.absdiff(frame, resframe)
49 | motionmask = cv2.cvtColor(motionmask, cv2.COLOR_RGB2GRAY)
50 | _,motionmask = cv2.threshold(motionmask, 25, 255, cv2.THRESH_BINARY)
51 | cntmask = motionmask.copy()
52 | cnt = self.getBestContour(cntmask, threshold)
53 | return cnt, motionmask
54 |
55 |
56 | def targetHsv(self, frame):
57 | #find best contour in hsv range
58 | displayframe = None
59 | framehsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
60 | framemask = cv2.inRange(framehsv, self.hsvlower, self.hsvupper)
61 | if self.display: #color copy
62 | displaymaskframe = cv2.cvtColor(framemask.copy(), cv2.COLOR_GRAY2BGR)
63 | displayframe
= cv2.addWeighted(frame, 0.7, displaymaskframe, 0.3, 0) 64 | best_cnt = self.getBestContour(framemask, self.areathresholdcolor) 65 | if not best_cnt == None: #if match found 66 | #get center 67 | M = cv2.moments(best_cnt) 68 | cx,cy = int(M['m10']/M['m00']), int(M['m01']/M['m00']) 69 | if self.display: 70 | #display circle target 71 | objcenter = (cx,cy) 72 | objradius = int(round(math.sqrt(cv2.contourArea(best_cnt)),2)) 73 | cv2.circle(displayframe, objcenter, objradius, (0,0,0), 5) 74 | cv2.circle(displayframe, objcenter, objradius, self.bgrtarget, 3) 75 | return cx, cy, displayframe 76 | else: 77 | return None, None, frame 78 | 79 | 80 | 81 | 82 | -------------------------------------------------------------------------------- /modules/Turret.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/python 2 | 3 | import threading 4 | from time import sleep 5 | try: 6 | from driver.Adafruit_PWM_Servo_Driver import PWM 7 | except ImportError: 8 | from driver.monkeypatch import PWM 9 | 10 | # =========================================================================== 11 | # Wrapper for your servo driver goes here. 12 | # move() is called with servo positions in range -1(left/down)...0(center)...1(right/up) 13 | # 14 | # This example implements Adafruit PWM servo driver from RaspberryPi 15 | # https://learn.adafruit.com/adafruit-16-channel-servo-driver-with-raspberry-pi 16 | # =========================================================================== 17 | class ServoDriver : 18 | 19 | def __init__(self, cfg): 20 | # PWM settings 21 | FREQ = 60.0 # Set frequency to 60 Hz 22 | self.pwm = PWM(0x40, debug=False) 23 | self.pwm.setPWMFreq(FREQ) 24 | pulseLength = 1000000 # 1,000,000 us per second 25 | pulseLength /= FREQ # 60 Hz 26 | pulseLength /= 4096 # 12 bits of resolution 27 | self.pulseFactor = 1000 / pulseLength # millisecond multiplier 28 | # Tested Servo range 29 | self.pulsecenter = float(cfg['turret']['pulsecenter']) 30 | self.pulserange = float(cfg['turret']['pulserange']) 31 | 32 | def move(self, servo, position): 33 | if -1 < position < 1: #movement range -1...0...1 34 | pulse = self.pulsecenter + (position*self.pulserange) # w/in range from middle 35 | pulse *= self.pulseFactor # ~120-600 at 60hz 36 | self.pwm.setPWM(servo, 0, int(pulse)) 37 | 38 | 39 | class Countdown(threading.Thread): 40 | def __init__(self, seconds, event): 41 | self.runTime = seconds 42 | self.event = event 43 | self.thread = threading.Thread(target=self.run) 44 | def run(self): 45 | if not self.event.isSet(): 46 | self.event.set() 47 | sleep(self.runTime) 48 | self.event.clear() 49 | 50 | 51 | #globals 52 | threadquit = threading.Event() 53 | 54 | 55 | class Targetting(threading.Thread) : 56 | 57 | def __init__(self, cfg): 58 | 59 | #settings 60 | self.servoX = int(cfg['turret']['panchannel']) 61 | self.servoY = int(cfg['turret']['tiltchannel']) 62 | self.servoF = int(cfg['turret']['triggerchannel']) 63 | self.center = [float(cfg['turret']['centerx']), float(cfg['turret']['centery'])] 64 | self.fps = int(cfg['camera']['fps']) 65 | self.stepsleep = float(cfg['turret']['stepsleep']) 66 | self.firesensitivity = float(cfg['tolerance']['firesensitivity']) 67 | self.triggerwait = float(cfg['tolerance']['reloadseconds']) 68 | self.pixelPulseRatio = float(cfg['turret']['pixelsperpulse']) 69 | self.camW = int(cfg['camera']['width']) 70 | self.camH = int(cfg['camera']['height']) 71 | self.driver = ServoDriver(cfg) 72 | self.cfg = cfg 73 | 74 | self.wmiddle = self.camW/2 75 | 
self.hmiddle = self.camH/2 76 | self.triggertimer = threading.Event() 77 | self.armed = False 78 | self.xy = self.center[:] #current xy 79 | self.deltaxy = [0.0, 0.0] #current move 80 | self.deltaxylast = [0.0, 0.0] #prior move 81 | self.stepxy = [0.0, 0.0] #xy per step 82 | self.steps = (1.0/self.fps)/self.stepsleep #steps per frame 83 | self.stepcounter = 0 #motion step counter 84 | threading.Thread.__init__(self) 85 | 86 | def coordToPulse(self, coord): 87 | xpulse = (float(coord[0]) - self.wmiddle) / self.pixelPulseRatio 88 | ypulse = (self.hmiddle - float(coord[1])) / self.pixelPulseRatio 89 | return (xpulse,ypulse) 90 | 91 | def fire(self): #pull trigger 92 | #CUSTOMIZE THIS, per your servo configuration 93 | self.driver.move(self.servoF, 0 ) 94 | sleep(.2) #todo: should be calculated 95 | self.driver.move(self.servoF, float(self.cfg['turret']['fireposition']) ) 96 | sleep(.2) #todo: should be calculated 97 | self.driver.move(self.servoF, 0 ) 98 | Countdown(self.triggerwait, self.triggertimer).thread.start() #between fire 99 | 100 | def recenter(self): 101 | #zero to center 102 | self.stepcounter = 0 103 | self.sendTarget([-self.xy[0]+self.center[0], -self.xy[1]+self.center[1]], self.xy) 104 | 105 | def sendTarget(self, pulsexy, framexy): #pulsexy: delta from 0,0 / framexy: position at capture 106 | self.deltaxylast = self.deltaxy[:] 107 | #subtract distance since capture 108 | self.deltaxy[0] = pulsexy[0]-(self.xy[0]-framexy[0]) 109 | self.deltaxy[1] = pulsexy[1]-(self.xy[1]-framexy[1]) 110 | #stay on newest delta 111 | if self.stepcounter > 0: 112 | if abs(self.deltaxy[0])>> PEW! PEW!" 124 | if not self.deltaxy[0] == 0: 125 | self.fire() 126 | 127 | def quit(self): #cleanup 128 | global threadquit 129 | self.recenter() 130 | sleep(1) 131 | threadquit.set() 132 | 133 | def run(self): 134 | global threadquit 135 | while(not threadquit.isSet()): 136 | sleep(self.stepsleep) 137 | if self.stepcounter>0: #stepping to target 138 | self.xy[0] += self.stepxy[0] 139 | self.driver.move(self.servoX, self.xy[0] ) 140 | self.xy[1] += self.stepxy[1] 141 | self.driver.move(self.servoY, self.xy[1] ) 142 | self.stepcounter -= 1 143 | else: #set next target 144 | self.stepxy[0] = self.deltaxy[0]/self.steps 145 | self.stepxy[1] = self.deltaxy[1]/self.steps 146 | self.deltaxy = [0.0, 0.0] 147 | self.stepcounter = self.steps 148 | 149 | -------------------------------------------------------------------------------- /modules/__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/decentropy/SentryTurret/4d8c627c4e1bf24ac3f8e1ce6cb70c5929f760ef/modules/__init__.py -------------------------------------------------------------------------------- /modules/driver/Adafruit_I2C.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/python 2 | import re 3 | import smbus 4 | 5 | # =========================================================================== 6 | # Adafruit_I2C Class 7 | # =========================================================================== 8 | 9 | class Adafruit_I2C(object): 10 | 11 | @staticmethod 12 | def getPiRevision(): 13 | "Gets the version number of the Raspberry Pi board" 14 | # Revision list available at: http://elinux.org/RPi_HardwareHistory#Board_Revision_History 15 | try: 16 | with open('/proc/cpuinfo', 'r') as infile: 17 | for line in infile: 18 | # Match a line of the form "Revision : 0002" while ignoring extra 19 | # info in front of the revsion (like 1000 when 
the Pi was over-volted). 20 | match = re.match('Revision\s+:\s+.*(\w{4})$', line) 21 | if match and match.group(1) in ['0000', '0002', '0003']: 22 | # Return revision 1 if revision ends with 0000, 0002 or 0003. 23 | return 1 24 | elif match: 25 | # Assume revision 2 if revision ends with any other 4 chars. 26 | return 2 27 | # Couldn't find the revision, assume revision 0 like older code for compatibility. 28 | return 0 29 | except: 30 | return 0 31 | 32 | @staticmethod 33 | def getPiI2CBusNumber(): 34 | # Gets the I2C bus number /dev/i2c# 35 | return 1 if Adafruit_I2C.getPiRevision() > 1 else 0 36 | 37 | def __init__(self, address, busnum=-1, debug=False): 38 | self.address = address 39 | # By default, the correct I2C bus is auto-detected using /proc/cpuinfo 40 | # Alternatively, you can hard-code the bus version below: 41 | # self.bus = smbus.SMBus(0); # Force I2C0 (early 256MB Pi's) 42 | # self.bus = smbus.SMBus(1); # Force I2C1 (512MB Pi's) 43 | self.bus = smbus.SMBus(busnum if busnum >= 0 else Adafruit_I2C.getPiI2CBusNumber()) 44 | self.debug = debug 45 | 46 | def reverseByteOrder(self, data): 47 | "Reverses the byte order of an int (16-bit) or long (32-bit) value" 48 | # Courtesy Vishal Sapre 49 | byteCount = len(hex(data)[2:].replace('L','')[::2]) 50 | val = 0 51 | for i in range(byteCount): 52 | val = (val << 8) | (data & 0xff) 53 | data >>= 8 54 | return val 55 | 56 | def errMsg(self): 57 | print "Error accessing 0x%02X: Check your I2C address" % self.address 58 | return -1 59 | 60 | def write8(self, reg, value): 61 | "Writes an 8-bit value to the specified register/address" 62 | try: 63 | self.bus.write_byte_data(self.address, reg, value) 64 | if self.debug: 65 | print "I2C: Wrote 0x%02X to register 0x%02X" % (value, reg) 66 | except IOError, err: 67 | return self.errMsg() 68 | 69 | def write16(self, reg, value): 70 | "Writes a 16-bit value to the specified register/address pair" 71 | try: 72 | self.bus.write_word_data(self.address, reg, value) 73 | if self.debug: 74 | print ("I2C: Wrote 0x%02X to register pair 0x%02X,0x%02X" % 75 | (value, reg, reg+1)) 76 | except IOError, err: 77 | return self.errMsg() 78 | 79 | def writeRaw8(self, value): 80 | "Writes an 8-bit value on the bus" 81 | try: 82 | self.bus.write_byte(self.address, value) 83 | if self.debug: 84 | print "I2C: Wrote 0x%02X" % value 85 | except IOError, err: 86 | return self.errMsg() 87 | 88 | def writeList(self, reg, list): 89 | "Writes an array of bytes using I2C format" 90 | try: 91 | if self.debug: 92 | print "I2C: Writing list to register 0x%02X:" % reg 93 | print list 94 | self.bus.write_i2c_block_data(self.address, reg, list) 95 | except IOError, err: 96 | return self.errMsg() 97 | 98 | def readList(self, reg, length): 99 | "Read a list of bytes from the I2C device" 100 | try: 101 | results = self.bus.read_i2c_block_data(self.address, reg, length) 102 | if self.debug: 103 | print ("I2C: Device 0x%02X returned the following from reg 0x%02X" % 104 | (self.address, reg)) 105 | print results 106 | return results 107 | except IOError, err: 108 | return self.errMsg() 109 | 110 | def readU8(self, reg): 111 | "Read an unsigned byte from the I2C device" 112 | try: 113 | result = self.bus.read_byte_data(self.address, reg) 114 | if self.debug: 115 | print ("I2C: Device 0x%02X returned 0x%02X from reg 0x%02X" % 116 | (self.address, result & 0xFF, reg)) 117 | return result 118 | except IOError, err: 119 | return self.errMsg() 120 | 121 | def readS8(self, reg): 122 | "Reads a signed byte from the I2C device" 123 | try: 124 | 
result = self.bus.read_byte_data(self.address, reg) 125 | if result > 127: result -= 256 126 | if self.debug: 127 | print ("I2C: Device 0x%02X returned 0x%02X from reg 0x%02X" % 128 | (self.address, result & 0xFF, reg)) 129 | return result 130 | except IOError, err: 131 | return self.errMsg() 132 | 133 | def readU16(self, reg, little_endian=True): 134 | "Reads an unsigned 16-bit value from the I2C device" 135 | try: 136 | result = self.bus.read_word_data(self.address,reg) 137 | # Swap bytes if using big endian because read_word_data assumes little 138 | # endian on ARM (little endian) systems. 139 | if not little_endian: 140 | result = ((result << 8) & 0xFF00) + (result >> 8) 141 | if (self.debug): 142 | print "I2C: Device 0x%02X returned 0x%04X from reg 0x%02X" % (self.address, result & 0xFFFF, reg) 143 | return result 144 | except IOError, err: 145 | return self.errMsg() 146 | 147 | def readS16(self, reg, little_endian=True): 148 | "Reads a signed 16-bit value from the I2C device" 149 | try: 150 | result = self.readU16(reg,little_endian) 151 | if result > 32767: result -= 65536 152 | return result 153 | except IOError, err: 154 | return self.errMsg() 155 | 156 | if __name__ == '__main__': 157 | try: 158 | bus = Adafruit_I2C(address=0) 159 | print "Default I2C bus is accessible" 160 | except: 161 | print "Error accessing default I2C bus" 162 | -------------------------------------------------------------------------------- /modules/driver/Adafruit_PWM_Servo_Driver.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/python 2 | 3 | import time 4 | import math 5 | from Adafruit_I2C import Adafruit_I2C 6 | 7 | # ============================================================================ 8 | # Adafruit PCA9685 16-Channel PWM Servo Driver 9 | # ============================================================================ 10 | 11 | class PWM : 12 | # Registers/etc. 
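# Each of the 16 PCA9685 channels has four registers (ON_L, ON_H, OFF_L, OFF_H) laid out
# consecutively from LED0_ON_L with a stride of 4, which is why setPWM() below writes to
# __LED0_ON_L + 4*channel; the ALL_LED_* registers apply one on/off pair to every channel at once (setAllPWM()).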
13 | __MODE1 = 0x00 14 | __MODE2 = 0x01 15 | __SUBADR1 = 0x02 16 | __SUBADR2 = 0x03 17 | __SUBADR3 = 0x04 18 | __PRESCALE = 0xFE 19 | __LED0_ON_L = 0x06 20 | __LED0_ON_H = 0x07 21 | __LED0_OFF_L = 0x08 22 | __LED0_OFF_H = 0x09 23 | __ALL_LED_ON_L = 0xFA 24 | __ALL_LED_ON_H = 0xFB 25 | __ALL_LED_OFF_L = 0xFC 26 | __ALL_LED_OFF_H = 0xFD 27 | 28 | # Bits 29 | __RESTART = 0x80 30 | __SLEEP = 0x10 31 | __ALLCALL = 0x01 32 | __INVRT = 0x10 33 | __OUTDRV = 0x04 34 | 35 | general_call_i2c = Adafruit_I2C(0x00) 36 | 37 | @classmethod 38 | def softwareReset(cls): 39 | "Sends a software reset (SWRST) command to all the servo drivers on the bus" 40 | cls.general_call_i2c.writeRaw8(0x06) # SWRST 41 | 42 | def __init__(self, address=0x40, debug=False): 43 | self.i2c = Adafruit_I2C(address) 44 | self.i2c.debug = debug 45 | self.address = address 46 | self.debug = debug 47 | if (self.debug): 48 | print "Reseting PCA9685 MODE1 (without SLEEP) and MODE2" 49 | self.setAllPWM(0, 0) 50 | self.i2c.write8(self.__MODE2, self.__OUTDRV) 51 | self.i2c.write8(self.__MODE1, self.__ALLCALL) 52 | time.sleep(0.005) # wait for oscillator 53 | 54 | mode1 = self.i2c.readU8(self.__MODE1) 55 | mode1 = mode1 & ~self.__SLEEP # wake up (reset sleep) 56 | self.i2c.write8(self.__MODE1, mode1) 57 | time.sleep(0.005) # wait for oscillator 58 | 59 | def setPWMFreq(self, freq): 60 | "Sets the PWM frequency" 61 | prescaleval = 25000000.0 # 25MHz 62 | prescaleval /= 4096.0 # 12-bit 63 | prescaleval /= float(freq) 64 | prescaleval -= 1.0 65 | if (self.debug): 66 | print "Setting PWM frequency to %d Hz" % freq 67 | print "Estimated pre-scale: %d" % prescaleval 68 | prescale = math.floor(prescaleval + 0.5) 69 | if (self.debug): 70 | print "Final pre-scale: %d" % prescale 71 | 72 | oldmode = self.i2c.readU8(self.__MODE1); 73 | newmode = (oldmode & 0x7F) | 0x10 # sleep 74 | self.i2c.write8(self.__MODE1, newmode) # go to sleep 75 | self.i2c.write8(self.__PRESCALE, int(math.floor(prescale))) 76 | self.i2c.write8(self.__MODE1, oldmode) 77 | time.sleep(0.005) 78 | self.i2c.write8(self.__MODE1, oldmode | 0x80) 79 | 80 | def setPWM(self, channel, on, off): 81 | "Sets a single PWM channel" 82 | self.i2c.write8(self.__LED0_ON_L+4*channel, on & 0xFF) 83 | self.i2c.write8(self.__LED0_ON_H+4*channel, on >> 8) 84 | self.i2c.write8(self.__LED0_OFF_L+4*channel, off & 0xFF) 85 | self.i2c.write8(self.__LED0_OFF_H+4*channel, off >> 8) 86 | 87 | def setAllPWM(self, on, off): 88 | "Sets a all PWM channels" 89 | self.i2c.write8(self.__ALL_LED_ON_L, on & 0xFF) 90 | self.i2c.write8(self.__ALL_LED_ON_H, on >> 8) 91 | self.i2c.write8(self.__ALL_LED_OFF_L, off & 0xFF) 92 | self.i2c.write8(self.__ALL_LED_OFF_H, off >> 8) 93 | 94 | -------------------------------------------------------------------------------- /modules/driver/__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/decentropy/SentryTurret/4d8c627c4e1bf24ac3f8e1ce6cb70c5929f760ef/modules/driver/__init__.py -------------------------------------------------------------------------------- /modules/driver/monkeypatch.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/python 2 | 3 | #used if no servos connected 4 | 5 | 6 | class PWM : 7 | 8 | def __init__(self, address=0x40, debug=False): 9 | self.test = True 10 | 11 | def setPWMFreq(self, freq): 12 | self.test = True 13 | 14 | def setPWM(self, channel, on, off): 15 | self.test = True 16 | 
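# This stub mirrors the PWM methods that Turret.py's ServoDriver calls (setPWMFreq, setPWM), so the
# code can still run on a desktop/webcam setup without smbus or the I2C servo board attached;
# every servo command simply becomes a no-op.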
-------------------------------------------------------------------------------- /robot.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/decentropy/SentryTurret/4d8c627c4e1bf24ac3f8e1ce6cb70c5929f760ef/robot.png -------------------------------------------------------------------------------- /run.sh: -------------------------------------------------------------------------------- 1 | #!/bin/sh 2 | 3 | # To autostart bot, add below line to /etc/rc.local 4 | # $ sudo sh /run.sh & 5 | 6 | # add bluetooth serial port 7 | sdptool add SP 8 | 9 | # bind phone mac address 10 | sudo rfcomm bind rfcomm0 F8:A9:D0:A2:07:25 1 11 | 12 | cd /home/pi/bot 13 | 14 | #start bot, restarting on error(0) 15 | while true ; do 16 | 17 | # reset bluetooth interface 18 | sudo hciconfig hci0 reset 19 | 20 | #start bot and log 21 | EXITCODE=`/usr/bin/python main.py >> /home/pi/bot/bot.log 2>&1` 22 | if [ $? -ne 1 ]; then 23 | break 24 | fi 25 | 26 | #repair ssh stdin echo 27 | stty sane 28 | done 29 | --------------------------------------------------------------------------------