├── _config.yml
├── Synopsis.pdf
├── motor_driver_connections.jpg
├── ultrasonic_sensor_connections.jpg
├── RedBall.py
├── README.md
└── MainCode.py
/_config.yml:
--------------------------------------------------------------------------------
theme: jekyll-theme-architect
--------------------------------------------------------------------------------
/Synopsis.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ROHIT1005/Ball-Tracking-Robot-RPi-OpenCV/HEAD/Synopsis.pdf
--------------------------------------------------------------------------------
/motor_driver_connections.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ROHIT1005/Ball-Tracking-Robot-RPi-OpenCV/HEAD/motor_driver_connections.jpg
--------------------------------------------------------------------------------
/ultrasonic_sensor_connections.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ROHIT1005/Ball-Tracking-Robot-RPi-OpenCV/HEAD/ultrasonic_sensor_connections.jpg
--------------------------------------------------------------------------------
/RedBall.py:
--------------------------------------------------------------------------------
import numpy as np
import cv2
import time
import os

# This system command loads the right driver for the Raspberry Pi camera
os.system('sudo modprobe bcm2835-v4l2')

w = 480
h = 320
my_camera = cv2.VideoCapture(0)
my_camera.set(3, w)  # property 3 = frame width
my_camera.set(4, h)  # property 4 = frame height
time.sleep(2)

while True:
    success, image = my_camera.read()
    image = cv2.flip(image, -1)
    image = cv2.GaussianBlur(image, (5, 5), 0)
    image_HSV = cv2.cvtColor(image, cv2.COLOR_BGR2HSV)
    # HSV range that matches the red ball
    lower_red = np.array([0, 220, 20])
    upper_red = np.array([15, 255, 190])
    mask = cv2.inRange(image_HSV, lower_red, upper_red)
    mask = cv2.GaussianBlur(mask, (5, 5), 0)
    # findContours returns a list of the outlines of the white shapes in the
    # mask (and a hierarchy that we shall ignore); taking [-2:] keeps this
    # working on both OpenCV 3 (three return values) and OpenCV 4 (two)
    contours, hierarchy = cv2.findContours(mask, cv2.RETR_TREE, cv2.CHAIN_APPROX_SIMPLE)[-2:]
    # If we have at least one contour, look through each one and pick the biggest
    if len(contours) > 0:
        largest = 0
        area = 0
        for i in range(len(contours)):
            # get the area of the ith contour
            temp_area = cv2.contourArea(contours[i])
            # if it is the biggest we have seen, keep it
            if temp_area > area:
                area = temp_area
                largest = i
        # Compute the coordinates of the center of the largest contour
        coordinates = cv2.moments(contours[largest])
        if coordinates['m00'] > 0:  # guard against a zero-area contour
            target_x = int(coordinates['m10'] / coordinates['m00'])
            target_y = int(coordinates['m01'] / coordinates['m00'])
            # Pick a suitable diameter for our target (grows with the contour)
            diam = int(np.sqrt(area) / 4)
            # draw on a target
            cv2.circle(image, (target_x, target_y), diam, (0, 255, 0), 1)
            cv2.line(image, (target_x - 2 * diam, target_y), (target_x + 2 * diam, target_y), (0, 255, 0), 1)
            cv2.line(image, (target_x, target_y - 2 * diam), (target_x, target_y + 2 * diam), (0, 255, 0), 1)
    cv2.imshow('View', image)
    # Esc key to stop, otherwise repeat after 3 milliseconds
    key_pressed = cv2.waitKey(3)
    if key_pressed == 27:
        break

cv2.destroyAllWindows()
my_camera.release()
# due to a bug in OpenCV you need to call waitKey a few more times to get the
# window to disappear properly; each wait only lasts 10 milliseconds
cv2.waitKey(10)
time.sleep(0.1)
cv2.waitKey(10)
cv2.waitKey(10)
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
# Red Ball Follower Robot

## Table of Contents

* [About the Project](#about-the-project)
* [Prerequisites](#prerequisites)
* [Installation](#installation)
* [License](#license)

## About The Project
I made this project in order to build a basic ball-tracking car. The robot uses a camera to capture frames and performs image processing on each frame to track the ball; features of the ball such as colour, shape, and size can be used. I chose the Raspberry Pi as the controller for this project because it offers great flexibility: it works directly with the Raspberry Pi camera module and lets me code in Python, which is very user friendly, with the OpenCV library for image analysis.

For controlling the motors, I used an H-bridge to switch each motor between clockwise, counter-clockwise, and stopped. The code drives this bridge whenever direction or speed has to change in different obstacle situations.

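The switching described above can be sketched as a small truth table, independent of any GPIO library (the pin labels and the `h_bridge_states` helper are illustrative, not the project's actual wiring):

```python
# Illustrative truth table for one channel of an L298N-style H-bridge.
# IN1/IN2 are the two direction inputs; the enable pin carries the PWM speed.
def h_bridge_states(command):
    """Return the (IN1, IN2) logic levels for a motor command."""
    states = {
        "forward": (1, 0),  # current flows one way -> motor spins clockwise
        "reverse": (0, 1),  # current reversed -> counter-clockwise
        "stop":    (0, 0),  # both inputs low -> the motor coasts to a stop
    }
    return states[command]

print(h_bridge_states("reverse"))  # (0, 1)
```

On the real robot each tuple maps onto two GPIO outputs per motor, while the driver's enable pin is fed a PWM signal to set the speed.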
A crucial point while detecting the ball frame by frame was avoiding frame drops, because the robot can end up in a limbo state if it cannot predict the ball's direction after a few dropped frames. The same happens when the ball leaves the camera's field of view entirely. In that case, I made the robot take a 360-degree look around its environment, rotating until the ball comes back into the camera's view, and then start moving in its direction.

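This recovery behaviour reduces to a tiny decision rule. A sketch (the `search_action` name is mine; `flag` stands in for the last-seen-direction variable the main code keeps):

```python
# Sketch of the "lost ball" recovery: keep spinning towards the side where
# the ball was last seen until it re-enters the camera's field of view.
def search_action(found, flag):
    """found: 1 if the ball is in view; flag: which side it was last seen on (0 or 1)."""
    if found:
        return "track"  # ball visible again: hand control back to the tracker
    return "rightturn" if flag == 0 else "leftturn"

print(search_action(0, 0))  # rightturn
```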
For the image analysis, I take each frame and mask it with the colour I need. For noise reduction, I erode the noise away and dilate the remaining major blobs. Then I find all the contours, pick the largest one, bound it in a rectangle, draw that rectangle on the main image, and compute the coordinates of the rectangle's centre. I have attached the algorithm (pseudo-code) of the image analysis part and demonstrated it in the video as well.

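The last step, taking a binary mask and recovering the blob's bounding rectangle and centre, can be shown with plain NumPy. This is a simplified single-blob stand-in for the OpenCV `findContours`/`boundingRect` calls the project actually uses:

```python
import numpy as np

def blob_centre(mask):
    """Return the (x, y) centre of the bounding rectangle of all set pixels."""
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None                # nothing detected in this frame
    x, y = xs.min(), ys.min()      # top-left corner of the bounding box
    w = xs.max() - x + 1           # box width
    h = ys.max() - y + 1           # box height
    return (int(x + w // 2), int(y + h // 2))

mask = np.zeros((120, 160), dtype=np.uint8)  # one 160x120 frame's binary mask
mask[40:60, 70:90] = 255                     # a 20x20 "ball" blob
print(blob_centre(mask))                     # (80, 50)
```

Unlike contours, this treats all set pixels as one blob, so in the real pipeline the erode/dilate cleanup beforehand matters: stray noise pixels would otherwise stretch the box.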
Finally, the robot tries to bring the coordinates of the ball to the centre of its imaginary coordinate axis. That is how the robot works. I enjoyed working on this project a lot, and it was a great experience.

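The centring amounts to a dead-band controller on the ball's horizontal offset. As a sketch (the thresholds mirror the main code; the `steer` function itself is illustrative):

```python
# The frame is 160 px wide, so after subtracting 80 the ball's x-offset lies
# in [-80, 80] around the optical axis. Turn until the offset is inside a
# +/-20 px dead-band, then just drive forward.
def steer(centre_x, dead_band=20):
    if centre_x <= -dead_band:
        return "rightturn"  # the frame is mirrored, so negative offset -> turn right
    if centre_x >= dead_band:
        return "leftturn"
    return "forward"

print(steer(-35), steer(5), steer(50))  # rightturn forward leftturn
```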
### Built With
1. Raspberry Pi (any version) with Raspbian OS installed
2. 3 ultrasonic sensors
3. Motor driver (L298N or L293D module)
4. Power supply
5. Jumper cables (female-female)
6. Camera module (prefer the Pi camera module over a USB camera for better quality)
7. Red ball for testing

### Prerequisites

1. Python 2
2. OpenCV 3 (e.g. 3.1.0)
3. picamera
4. Raspberry Pi with Raspbian OS installed

### Installation

1. Create a Python file on the Raspberry Pi
```sh
sudo nano filename.py
```
2. Paste the code into the file and save
```sh
Ctrl+O to save, then Ctrl+X to exit
```
3. Connect the motor driver to the Raspberry Pi (refer to https://github.com/AmeyaUpalanchi/Interfacing-motor_driver-with-raspberry_pi/)

4. Connect the ultrasonic sensors to the Raspberry Pi by referring to the wiring diagram

5. Run the program
```sh
python2 filename.py
```

## License

Distributed under the MIT License. See `LICENSE` for more information.
--------------------------------------------------------------------------------
/MainCode.py:
--------------------------------------------------------------------------------
# import the necessary packages
from picamera.array import PiRGBArray  # VideoCapture cannot grab frames reliably from the Pi camera, so we use picamera directly
from picamera import PiCamera
import RPi.GPIO as GPIO
import time
import cv2
import numpy as np

# hardware setup
GPIO.setmode(GPIO.BCM)
GPIO_TRIGGER1 = 5   # Left ultrasonic sensor
GPIO_ECHO1 = 31

GPIO_TRIGGER2 = 13  # Front ultrasonic sensor
GPIO_ECHO2 = 6

GPIO_TRIGGER3 = 26  # Right ultrasonic sensor
GPIO_ECHO3 = 19

MOTOR1B = 25  # Left motor
MOTOR1E = 8

MOTOR2B = 23  # Right motor
MOTOR2E = 15
en = 24       # PWM enable pins of the motor driver
en1 = 14
LED_PIN = 13  # lights up when it finds the ball (NOTE: 13 clashes with GPIO_TRIGGER2 above; move one of them to a free pin)

# Set pins as output and input
GPIO.setup(GPIO_TRIGGER1, GPIO.OUT)  # Trigger
GPIO.setup(GPIO_ECHO1, GPIO.IN)      # Echo
GPIO.setup(GPIO_TRIGGER2, GPIO.OUT)  # Trigger
GPIO.setup(GPIO_ECHO2, GPIO.IN)
GPIO.setup(GPIO_TRIGGER3, GPIO.OUT)  # Trigger
GPIO.setup(GPIO_ECHO3, GPIO.IN)
GPIO.setup(LED_PIN, GPIO.OUT)
GPIO.setup(en, GPIO.OUT)
GPIO.setup(en1, GPIO.OUT)
p = GPIO.PWM(en, 1000)   # 1 kHz PWM, started at full duty cycle
p1 = GPIO.PWM(en1, 1000)
p.start(100)
p1.start(100)

# Set triggers to False (low)
GPIO.output(GPIO_TRIGGER1, False)
GPIO.output(GPIO_TRIGGER2, False)
GPIO.output(GPIO_TRIGGER3, False)

def sonar(GPIO_TRIGGER, GPIO_ECHO):
    # Measure the distance (in cm) reported by one HC-SR04 ultrasonic sensor
    start = 0
    stop = 0
    # Set pins as output and input
    GPIO.setup(GPIO_TRIGGER, GPIO.OUT)  # Trigger
    GPIO.setup(GPIO_ECHO, GPIO.IN)      # Echo

    # Set trigger to False (low)
    GPIO.output(GPIO_TRIGGER, False)

    # Allow module to settle
    time.sleep(0.01)

    # Send a 10 us pulse to trigger
    GPIO.output(GPIO_TRIGGER, True)
    time.sleep(0.00001)
    GPIO.output(GPIO_TRIGGER, False)
    begin = time.time()
    while GPIO.input(GPIO_ECHO) == 0 and time.time() < begin + 0.05:  # wait for the echo to start
        start = time.time()
    while GPIO.input(GPIO_ECHO) == 1 and time.time() < begin + 0.1:   # wait for the echo to end
        stop = time.time()
    # Sound travels about 34300 cm/s; halve because the pulse goes out and back
    elapsed = stop - start
    distance = elapsed * 34300 / 2
    return distance

# Motor pins as outputs
GPIO.setup(MOTOR1B, GPIO.OUT)
GPIO.setup(MOTOR1E, GPIO.OUT)
GPIO.setup(MOTOR2B, GPIO.OUT)
GPIO.setup(MOTOR2E, GPIO.OUT)

# H-bridge direction helpers: driving one input of a channel high and the
# other low sets that motor's direction; both low stops it. The exact
# HIGH/LOW pairing depends on how the motors are wired.
def forward():
    GPIO.output(MOTOR1B, GPIO.HIGH)
    GPIO.output(MOTOR1E, GPIO.LOW)
    GPIO.output(MOTOR2B, GPIO.HIGH)
    GPIO.output(MOTOR2E, GPIO.LOW)

def leftturn():
    GPIO.output(MOTOR1B, GPIO.LOW)
    GPIO.output(MOTOR1E, GPIO.HIGH)
    GPIO.output(MOTOR2B, GPIO.HIGH)
    GPIO.output(MOTOR2E, GPIO.LOW)

def rightturn():
    GPIO.output(MOTOR1B, GPIO.HIGH)
    GPIO.output(MOTOR1E, GPIO.LOW)
    GPIO.output(MOTOR2B, GPIO.LOW)
    GPIO.output(MOTOR2E, GPIO.HIGH)

def stop():
    GPIO.output(MOTOR1B, GPIO.LOW)
    GPIO.output(MOTOR1E, GPIO.LOW)
    GPIO.output(MOTOR2B, GPIO.LOW)
    GPIO.output(MOTOR2E, GPIO.LOW)

def segment_colour(frame):
    # mask the red parts of the frame in HSV, then clean the mask up with
    # erosion (remove speckle noise) and dilation (grow the remaining blobs);
    # the threshold values are an example red range and should be tuned for
    # your ball and lighting
    hsv_img = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv_img, np.array([160, 100, 50]), np.array([180, 255, 255]))
    mask = cv2.erode(mask, np.ones((3, 3), np.uint8))
    mask = cv2.dilate(mask, np.ones((8, 8), np.uint8))
    return mask

def find_blob(blob):
    largest_contour = 0
    cont_index = 0
    contours, hierarchy = cv2.findContours(blob, cv2.RETR_CCOMP, cv2.CHAIN_APPROX_SIMPLE)[-2:]
    for idx, contour in enumerate(contours):
        area = cv2.contourArea(contour)
        if (area > largest_contour):
            largest_contour = area
            cont_index = idx
    #if res>15 and res<18:
    #    cont_index=idx

    r = (0, 0, 2, 2)
    if len(contours) > 0:
        r = cv2.boundingRect(contours[cont_index])

    return r, largest_contour

def target_hist(frame):
    hsv_img = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv_img], [0], None, [50], [0, 255])
    return hist

# CAMERA CAPTURE
# initialize the camera and grab a reference to the raw camera capture
camera = PiCamera()
camera.resolution = (160, 120)
camera.framerate = 16
rawCapture = PiRGBArray(camera, size=(160, 120))

# allow the camera to warm up
time.sleep(0.001)

# capture frames from the camera
for image in camera.capture_continuous(rawCapture, format="bgr", use_video_port=True):
    # grab the raw NumPy array representing the image
    frame = image.array
    frame = cv2.flip(frame, 1)
    global centre_x
    global centre_y
    centre_x = 0.
    centre_y = 0.
    hsv1 = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask_red = segment_colour(frame)  # mask the red parts of the frame
    loct, area = find_blob(mask_red)
    x, y, w, h = loct

    # distance from the front ultrasonic sensor
    distanceC = sonar(GPIO_TRIGGER2, GPIO_ECHO2)
    # distance from the right ultrasonic sensor
    distanceR = sonar(GPIO_TRIGGER3, GPIO_ECHO3)
    # distance from the left ultrasonic sensor
    distanceL = sonar(GPIO_TRIGGER1, GPIO_ECHO1)

    if (w * h) < 10:
        found = 0
    else:
        found = 1
        simg2 = cv2.rectangle(frame, (x, y), (x + w, y + h), 255, 2)
        centre_x = x + w / 2
        centre_y = y + h / 2
        cv2.circle(frame, (int(centre_x), int(centre_y)), 3, (0, 110, 255), -1)
        centre_x -= 80            # offset from the middle of the 160 px wide frame
        centre_y = 60 - centre_y  # offset from the middle of the 120 px high frame
        print centre_x, centre_y
    initial = 400
    flag = 0
    GPIO.output(LED_PIN, GPIO.LOW)
    if (found == 0):
        # the ball is not found: rotate towards the side where it was last seen
        if flag == 0:
            rightturn()
            time.sleep(0.05)
        else:
            leftturn()
            time.sleep(0.05)
        stop()
        time.sleep(0.0125)

    elif (found == 1):
        if (area < initial):
            # the ball is still far away; if something is directly in front,
            # steer around the obstacle and keep heading for the ball
            if (distanceC < 10):
                if distanceR >= 8:
                    rightturn()
                    time.sleep(0.00625)
                    stop()
                    time.sleep(0.0125)
                    forward()
                    time.sleep(0.00625)
                    stop()
                    time.sleep(0.0125)
                    #while found==0:
                    leftturn()
                    time.sleep(0.00625)
                elif distanceL >= 8:
                    leftturn()
                    time.sleep(0.00625)
                    stop()
                    time.sleep(0.0125)
                    forward()
                    time.sleep(0.00625)
                    stop()
                    time.sleep(0.0125)
                    rightturn()
                    time.sleep(0.00625)
                    stop()
                    time.sleep(0.0125)
                else:
                    stop()
                    time.sleep(0.01)
            else:
                # otherwise it moves forward
                forward()
                time.sleep(0.00625)
        elif (area >= initial):
            initial2 = 6700
            if (area < initial2):
                if (distanceC > 10):
                    # it brings the coordinates of the ball to the centre of the camera's imaginary axis
                    if (centre_x <= -20 or centre_x >= 20):
                        if (centre_x < 0):
                            flag = 0
                            rightturn()
                            time.sleep(0.025)
                        elif (centre_x > 0):
                            flag = 1
                            leftturn()
                            time.sleep(0.025)
                    forward()
                    time.sleep(0.00003125)
                    stop()
                    time.sleep(0.00625)
                else:
                    stop()
                    time.sleep(0.01)

            else:
                # it has found the ball and the ball is very close: light up the LED
                GPIO.output(LED_PIN, GPIO.HIGH)
                time.sleep(0.1)
                stop()
                time.sleep(0.1)
    #cv2.imshow("draw",frame)
    rawCapture.truncate(0)  # clear the stream in preparation for the next frame

    if (cv2.waitKey(1) & 0xff == ord('q')):
        break

GPIO.cleanup()  # free all the GPIO pins

--------------------------------------------------------------------------------