├── screenshot.png
├── azure-pipelines-1.yml
├── azure-pipelines.yml
├── LICENSE
├── README.md
├── directkeys.py
└── gesturecontrol.py
/screenshot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gnana1997/Gesture-Gaming-using-OpenCV/HEAD/screenshot.png
--------------------------------------------------------------------------------
/azure-pipelines-1.yml:
--------------------------------------------------------------------------------
1 | # Python package
2 | # Create and test a Python package on multiple Python versions.
3 | # Add steps that analyze code, save the dist with the build record, publish to a PyPI-compatible index, and more:
4 | # https://docs.microsoft.com/azure/devops/pipelines/languages/python
5 |
6 | trigger:
7 | - master
8 |
9 | pool:
10 | vmImage: ubuntu-latest
11 | strategy:
12 | matrix:
13 | Python36:
14 | python.version: '3.6'
15 |
16 |
17 | steps:
18 | - task: UsePythonVersion@0
19 | inputs:
20 |     versionSpec: '$(python.version)'
21 |   displayName: 'Use Python $(python.version)'
22 |
23 | - script: |
24 | python -m pip install --upgrade pip
25 | displayName: 'Install dependencies'
26 |
27 |
--------------------------------------------------------------------------------
/azure-pipelines.yml:
--------------------------------------------------------------------------------
1 | # Python package
2 | # Create and test a Python package on multiple Python versions.
3 | # Add steps that analyze code, save the dist with the build record, publish to a PyPI-compatible index, and more:
4 | # https://docs.microsoft.com/azure/devops/pipelines/languages/python
5 |
6 | trigger:
7 | - master
8 |
9 | pool:
10 | vmImage: ubuntu-latest
11 | strategy:
12 | matrix:
13 | Python36:
14 | python.version: '3.6'
15 |
16 | steps:
17 | - task: UsePythonVersion@0
18 | inputs:
19 |     versionSpec: '$(python.version)'
20 |   displayName: 'Use Python $(python.version)'
21 |
22 | - script: |
23 | python -m pip install --upgrade pip
24 |   displayName: 'Install dependencies'
25 |
26 | - script: |
27 | pip install pytest pytest-azurepipelines
28 | pytest
29 | displayName: 'pytest'
30 |
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | MIT License
2 |
3 | Copyright (c) 2020 Gnana
4 |
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 |
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 |
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
22 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # Gesture Gaming using OpenCV
2 | Hand gestures are super cool to use instead of keyboard keys!
3 | So, I have used my hand gestures to play the Hill Climb Racing game with the help of the OpenCV library in Python.
4 | ## Controls
5 | - Right Hand Fist - Acceleration
6 | - Left Hand Fist - Brake
7 |
8 | ## Requirements
9 | - python 3.x
10 | - imutils
11 | - numpy
12 | - opencv-python
13 |
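To set things up, install the packages with `pip install imutils numpy opencv-python` and start the controller with `python gesturecontrol.py` (press `q` in the preview window to quit). Note that `directkeys.py` sends key presses through the Win32 `SendInput` API, so the controller itself needs to run on Windows.
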
14 | ## Description
15 | Using OpenCV, the screen is divided in such a way that the bottom-left region of the screen is used for applying the brake, whereas the bottom-right region of the screen is used for acceleration.
16 |
17 | When a navy blue circle (fists in navy blue gloves) is detected in either region, the corresponding key press is sent to the game by the program.
18 | NOTE: No machine learning model is used for detecting objects!!
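
The detection is plain colour segmentation in HSV space. Below is a trimmed, illustrative sketch of the idea used in `gesturecontrol.py` (the `detect_region` helper and the 0.55 row threshold are invented for this example; the real script also draws the on-screen overlay and sends the key presses through `directkeys.py`):

```python
import cv2
import imutils
import numpy as np

BLUE_LOWER = np.array([110, 40, 40])    # HSV lower bound for navy blue (same values as gesturecontrol.py)
BLUE_UPPER = np.array([130, 255, 255])  # HSV upper bound

def detect_region(frame, min_radius=30):
    """Return 'brake', 'accelerate', or None, depending on where a large navy blue blob sits."""
    frame = cv2.flip(frame, 1)                       # mirror, so the left hand stays on the left
    blurred = cv2.GaussianBlur(frame, (11, 11), 0)
    hsv = cv2.cvtColor(blurred, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, BLUE_LOWER, BLUE_UPPER)  # keep only navy blue pixels
    mask = cv2.erode(mask, None, iterations=2)       # drop small speckles
    mask = cv2.dilate(mask, None, iterations=2)
    height, width = mask.shape[:2]
    for name, half in (("brake", mask[:, :width // 2]), ("accelerate", mask[:, width // 2:])):
        cnts = imutils.grab_contours(
            cv2.findContours(half.copy(), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE))
        if not cnts:
            continue
        (_, y), radius = cv2.minEnclosingCircle(max(cnts, key=cv2.contourArea))
        if radius > min_radius and y > height * 0.55:  # blob is big enough and sits in the bottom part
            return name
    return None
```
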
19 | Let's take a ride then!
20 |
21 |
22 | Further, you can play your favourite games and add more controls based on your requirements.
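
For example, an extra control could reuse `PressKey`/`ReleaseKey` from `directkeys.py` with another keyboard scan code. The `tap` helper below is a hypothetical sketch (Windows only), using the same `0x11` ('W') scan code that the module's own self-test uses:

```python
import time

from directkeys import PressKey, ReleaseKey

W_KEY = 0x11  # scan code for 'W', the same code used in directkeys.py's self-test loop

def tap(scan_code, hold=0.1):
    """Hypothetical helper: press a key by scan code, hold it briefly, then release it."""
    PressKey(scan_code)
    time.sleep(hold)
    ReleaseKey(scan_code)

# call tap(W_KEY) from your own gesture check, e.g. when a blob is detected in a new screen region
```
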
23 | Here is the link to the YouTube video of my Hill Climb Racing gameplay using hand gestures.
24 |
--------------------------------------------------------------------------------
/directkeys.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python3
2 | # -*- coding: utf-8 -*-
3 | """
4 | Created on Tue Jul 21 20:42:01 2020
5 | @author: Gnana Murthiy
6 | @description: Send simulated key presses to the game (triggered by the camera gestures).
7 | @reference: http://stackoverflow.com/questions/14489013/simulate-python-keypresses-for-controlling-a-game
8 | """
9 | import ctypes
10 | import time
11 |
12 | SendInput = ctypes.windll.user32.SendInput
13 |
14 | #List of Scan codes: https://wiki.osdev.org/PS/2_Keyboard
15 | #cursor right pressed
16 | right_pressed=0x4D
17 |
18 | #cursor left pressed
19 | left_pressed=0x4B
20 |
21 | # C struct redefinitions
22 | PUL = ctypes.POINTER(ctypes.c_ulong)
23 | class KeyBdInput(ctypes.Structure):
24 | _fields_ = [("wVk", ctypes.c_ushort),
25 | ("wScan", ctypes.c_ushort),
26 | ("dwFlags", ctypes.c_ulong),
27 | ("time", ctypes.c_ulong),
28 | ("dwExtraInfo", PUL)]
29 |
30 | class HardwareInput(ctypes.Structure):
31 | _fields_ = [("uMsg", ctypes.c_ulong),
32 | ("wParamL", ctypes.c_short),
33 | ("wParamH", ctypes.c_ushort)]
34 |
35 | class MouseInput(ctypes.Structure):
36 | _fields_ = [("dx", ctypes.c_long),
37 | ("dy", ctypes.c_long),
38 | ("mouseData", ctypes.c_ulong),
39 | ("dwFlags", ctypes.c_ulong),
40 | ("time",ctypes.c_ulong),
41 | ("dwExtraInfo", PUL)]
42 |
43 | class Input_I(ctypes.Union):
44 | _fields_ = [("ki", KeyBdInput),
45 | ("mi", MouseInput),
46 | ("hi", HardwareInput)]
47 |
48 | class Input(ctypes.Structure):
49 | _fields_ = [("type", ctypes.c_ulong),
50 | ("ii", Input_I)]
51 |
52 | def PressKey(hexKeyCode):
53 | extra = ctypes.c_ulong(0)
54 | ii_ = Input_I()
55 |     ii_.ki = KeyBdInput(0, hexKeyCode, 0x0008, 0, ctypes.pointer(extra))  # 0x0008 = KEYEVENTF_SCANCODE (key down by scan code)
56 | x = Input( ctypes.c_ulong(1), ii_ )
57 | ctypes.windll.user32.SendInput(1, ctypes.pointer(x), ctypes.sizeof(x))
58 |
59 | def ReleaseKey(hexKeyCode):
60 | extra = ctypes.c_ulong(0)
61 | ii_ = Input_I()
62 |     ii_.ki = KeyBdInput(0, hexKeyCode, 0x0008 | 0x0002, 0, ctypes.pointer(extra))  # KEYEVENTF_SCANCODE | KEYEVENTF_KEYUP (key up)
63 | x = Input( ctypes.c_ulong(1), ii_ )
64 | ctypes.windll.user32.SendInput(1, ctypes.pointer(x), ctypes.sizeof(x))
65 |
66 | if __name__=='__main__':
67 |     while True:
68 |         PressKey(0x11)  # quick self-test: taps the 'W' key (scan code 0x11) once per second
69 | time.sleep(1)
70 | ReleaseKey(0x11)
71 | time.sleep(1)
72 |
--------------------------------------------------------------------------------
/gesturecontrol.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python3
2 | # -*- coding: utf-8 -*-
3 | """
4 | Created on Tue Jul 21 20:42:01 2020
5 | @author: Gnana Murthiy
6 | @description: Game control with fists in navy blue gloves using OpenCV. Left fist - brake, right fist - acceleration.
7 | This code is inspired by a project by Patel Digant: https://github.com/pateldigant/gesture-gaming-python
8 | Custom logic was added to handle both keys being pressed simultaneously, as the game requires.
9 | """
10 |
11 | from imutils.video import VideoStream
12 | import numpy as np
13 | import cv2
14 | import imutils
15 | import time
16 | from directkeys import right_pressed,left_pressed
17 | from directkeys import PressKey, ReleaseKey
18 |
19 |
20 | break_key_pressed=left_pressed
21 | accelerato_key_pressed=right_pressed
22 |
23 | # define the lower and upper boundaries of the "navy blue" object in the HSV color space
24 | #https://stackoverflow.com/questions/36817133/identifying-the-range-of-a-color-in-hsv-using-opencv
25 | blueLower = np.array([110, 40, 40])
26 | blueUpper = np.array([130,255,255])
27 |
28 | vs = VideoStream(src=0).start()
29 |
30 | # allow the camera or video file to warm up
31 | time.sleep(2.0)
32 | initial = True
33 | flag = False
34 | current_key_pressed = set()
35 | circle_radius = 30
36 | windowSize = 160
37 | lr_counter = 0
38 |
39 | # keep looping
40 | break_pressed=False
41 | accelerator_pressed=False
42 | while True:
43 | keyPressed = False
44 | break_pressed=False
45 | accelerator_pressed=False
46 | # grab the current frame
47 | frame = vs.read()
48 |
49 |
50 |     # flip the frame so that the left hand appears on the left side and the right hand on the right side
51 |     frame = cv2.flip(frame, 1)
52 |
53 | # resize the frame, blur it, and convert it to the HSV color space
54 |     frame = imutils.resize(frame, width=600)
55 |     height, width = frame.shape[:2]  # read the dimensions after resizing so the left/right split matches the processed frame
56 |     blurred = cv2.GaussianBlur(frame, (11, 11), 0)
57 | hsv = cv2.cvtColor(blurred, cv2.COLOR_BGR2HSV)
58 |
59 |     # create a mask for the navy blue color and perform dilation and erosion to remove any small
60 |     # blobs left in the mask
61 | mask = cv2.inRange(hsv, blueLower, blueUpper)
62 | mask = cv2.erode(mask, None, iterations=2)
63 | mask = cv2.dilate(mask, None, iterations=2)
64 |
65 |     # find contours in each half of the mask and initialize the current
66 |     # (x, y) center of the blue object
67 |
68 |     # divide the mask into two halves: the left half controls the brake
69 |     # and the right half controls the acceleration
70 | left_mask = mask[:,0:width//2,]
71 | right_mask = mask[:,width//2:,]
72 |
73 | #find the contours in the left and right frame to find the center of the object
74 | cnts_left = cv2.findContours(left_mask.copy(), cv2.RETR_EXTERNAL,
75 | cv2.CHAIN_APPROX_SIMPLE)
76 | cnts_left = imutils.grab_contours(cnts_left)
77 | center_left = None
78 |
79 | cnts_right = cv2.findContours(right_mask.copy(), cv2.RETR_EXTERNAL,
80 | cv2.CHAIN_APPROX_SIMPLE)
81 | cnts_right = imutils.grab_contours(cnts_right)
82 | center_right = None
83 | # only proceed if at least one contour was found
84 | key_count=0
85 | key_pressed=0
86 | if len(cnts_left) > 0:
87 | # find the largest contour in the mask, then use
88 | # it to compute the minimum enclosing circle and centroid
89 | c = max(cnts_left, key=cv2.contourArea)
90 | ((x, y), radius) = cv2.minEnclosingCircle(c)
91 | M = cv2.moments(c)
92 | # find the center from the moments 0.000001 is added to the denominator so that divide by
93 | # zero exception doesn't occur
94 | center_left = (int(M["m10"] / (M["m00"]+0.000001)), int(M["m01"] / (M["m00"]+0.000001)))
95 | #print("center_left",center_left)
96 | # only proceed if the radius meets a minimum size
97 | if radius > circle_radius:
98 | # draw the circle and centroid on the frame,
99 | cv2.circle(frame, (int(x), int(y)), int(radius),
100 | (0, 0, 255), 2)
101 | cv2.circle(frame, center_left, 5, (0, 0, 255), -1)
102 | #Bottom Left region
103 | if center_left[1] > 250:
104 |                 cv2.putText(frame,'Brake Applied',(10,30),cv2.FONT_HERSHEY_SIMPLEX,1,(0,0,255),3)
105 | PressKey(break_key_pressed)
106 | break_pressed=True
107 | current_key_pressed.add(break_key_pressed)
108 |                 # brake key scan code: 0x4B (75), accelerator key: 0x4D (77)
109 | key_pressed=break_key_pressed
110 | keyPressed = True
111 | key_count=key_count+1
112 | # only proceed if at least one contour was found
113 | if len(cnts_right) > 0:
114 | c2 = max(cnts_right, key=cv2.contourArea)
115 | ((x2, y2), radius2) = cv2.minEnclosingCircle(c2)
116 | M2 = cv2.moments(c2)
117 | center_right = (int(M2["m10"] / (M2["m00"]+0.000001)), int(M2["m01"] / (M2["m00"]+0.000001)))
118 | center_right = (center_right[0]+width//2,center_right[1])
119 | # only proceed if the radius meets a minimum size
120 | if radius2 > circle_radius:
121 | cv2.circle(frame, (int(x2)+width//2, int(y2)), int(radius2),
122 | (0, 255, 0), 2)
123 | cv2.circle(frame, center_right, 5, (0, 255, 0), -1)
124 | #Bottom Right region
125 | if center_right[1] >250 :
126 | cv2.putText(frame,'Acc. Applied',(350,30),cv2.FONT_HERSHEY_SIMPLEX,1,(0,255,0),3)
127 | PressKey(accelerato_key_pressed)
128 | key_pressed=accelerato_key_pressed
129 | accelerator_pressed=True
130 | keyPressed = True
131 | current_key_pressed.add(accelerato_key_pressed)
132 | key_count=key_count+1
133 |
134 | frame_copy=frame.copy()
135 | #Bottom left region rectangle
136 |     frame_copy = cv2.rectangle(frame_copy,(0,height//2),(width//2,height),(255,255,255),1)
137 |     cv2.putText(frame_copy,'Brake',(10,280),cv2.FONT_HERSHEY_SIMPLEX,1,(255,255,255),3)
138 | #Bottom right region rectangle
139 | frame_copy = cv2.rectangle(frame_copy,(width//2,height//2),(width,height),(255,255,255),1)
140 | cv2.putText(frame_copy,'Acceleration',(330,280),cv2.FONT_HERSHEY_SIMPLEX,1,(255,255,255),3)
141 |
142 | # show the frame to our screen
143 | cv2.imshow("Frame", frame_copy)
144 |
145 |     # If branch: when no key is detected in this frame, release whatever key is still held; otherwise the
146 |     # program would keep sending it. Elif branch: when only one key is detected this frame but two keys are
147 |     # currently held, release the key left over from the previous frame and keep tracking the one still pressed.
148 | if not keyPressed and len(current_key_pressed) != 0:
149 | for key in current_key_pressed:
150 | ReleaseKey(key)
151 | current_key_pressed = set()
152 |     elif key_count==1 and len(current_key_pressed)==2:
153 |         for key in current_key_pressed:
154 |             if key_pressed!=key:
155 |                 ReleaseKey(key)
156 |         # keep tracking only the key that is still pressed, so it can be
157 |         # released once it is no longer detected
158 |         current_key_pressed = set()
159 |         current_key_pressed.add(key_pressed)
160 |
161 | key = cv2.waitKey(1) & 0xFF
162 | # if the 'q' key is pressed, stop the loop
163 | if key == ord("q"):
164 | break
165 |
166 |
167 | vs.stop()
168 | # close all windows
169 | cv2.destroyAllWindows()
--------------------------------------------------------------------------------