├── .DS_Store
├── .gitignore
├── LICENSE
├── README.md
├── _config.yml
├── _includes
│   ├── headerbutton.html
│   └── navigation.html
├── _layouts
│   └── default.html
├── assets
│   └── css
│       └── style.scss
├── blue_background.md
├── draft-d
│   ├── README.md
│   └── resources
│       └── resources.md
├── draft-h
│   ├── README.md
│   └── resources
│       └── resources.md
├── draft-j
│   ├── README.md
│   └── resources
│       ├── Untitled Diagram.xml
│       ├── client.png
│       ├── demo.gif
│       ├── learning_curve.png
│       ├── mb.xml
│       ├── out-mouse.gif
│       ├── resources.md
│       ├── server.png
│       └── workflow.png
├── editguide.md
├── gestures.md
├── reference.md
├── resources
│   ├── .DS_Store
│   ├── blue-table-cloth800x.jpg
│   ├── download.jpeg
│   ├── gestures
│   │   ├── hand_gestures.png
│   │   ├── index.md
│   │   ├── l-m-r.png
│   │   ├── l1.jpg
│   │   ├── l2.jpg
│   │   ├── l3.jpg
│   │   ├── l4.jpg
│   │   ├── l5.jpg
│   │   ├── lc.jpg
│   │   ├── le.jpg
│   │   ├── lo.jpg
│   │   ├── lq.jpg
│   │   ├── lw.jpg
│   │   ├── m1.jpg
│   │   ├── m2.jpg
│   │   ├── m3.jpg
│   │   ├── m4.jpg
│   │   ├── m5.jpg
│   │   ├── mc.jpg
│   │   ├── me.jpg
│   │   ├── mo.jpg
│   │   ├── mq.jpg
│   │   ├── mw.jpg
│   │   ├── r1.jpg
│   │   ├── r2.jpg
│   │   ├── r3.jpg
│   │   ├── r4.jpg
│   │   ├── r5.jpg
│   │   ├── rc.jpg
│   │   ├── re.jpg
│   │   ├── ro.jpg
│   │   ├── rq.jpg
│   │   ├── rw.jpg
│   │   ├── v1.jpg
│   │   ├── v2.jpg
│   │   ├── v3.jpg
│   │   ├── v4.jpg
│   │   ├── v5.jpg
│   │   ├── vc.jpg
│   │   ├── ve.jpg
│   │   ├── vo.jpg
│   │   ├── vq.jpg
│   │   └── vw.jpg
│   ├── images.jpeg
│   ├── remote-control800x.jpg
│   ├── tracking_hands800x.jpg
│   └── virtual-keyboard2.jpg
└── teams-all
    ├── SML109_team_project_abstracts.docx
    └── SML109_team_project_abstracts.pdf
/.DS_Store:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/.DS_Store
--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------
1 | MANUAL
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | The MIT License
2 |
3 | Copyright (c) 2017-present whatifif and andyli8500
4 |
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 |
12 | The above copyright notice and this permission notice shall be included in
13 | all copies or substantial portions of the Software.
14 |
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
21 | THE SOFTWARE.
22 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | [Home](/README.md) | [Gestures](/gestures.md) | [Blue_Background](/blue_background.md) | [Reference](/reference.md) | [Edit Guide](/editguide.md) | | | [Teams-all](/teams-all/SML109_team_project_abstracts.pdf)
2 |
3 |
4 | # Controlling a Computer by Hand Gesture
5 |
6 | ## Study group of Sydney Machine Learning
7 |
8 | This study group was formed to study [Harvard CS109](https://github.com/cs109) among the members of the [Sydney Machine Learning Meetup](https://www.meetup.com/Sydney-Machine-Learning/).
9 | ( Youtube channel [https://www.youtube.com/channel/UCcZ5Sy4JzVUaiD1ZYRGDM0g/videos](https://www.youtube.com/channel/UCcZ5Sy4JzVUaiD1ZYRGDM0g/videos) )
10 |
11 | Members were re-grouped into 10 teams, and each team completed its own project.
12 | My team was "Team Echo", named after the prize, an [Amazon Echo](https://www.amazon.com/Amazon-Echo-Bluetooth-Speaker-with-Alexa-Black/dp/B00X4WHP5E).
13 |
14 | - study period: 2017.08.02 ~ 2017.10.23 ( 3 months including 1 month project )
15 | - study place : Amazon Web Service ( [https://aws.amazon.com/](https://aws.amazon.com/), Australia, 2 Park St, Sydney NSW 2000 )
16 |
17 | ## Project
18 |
19 | - name: Controlling a Computer by Hand Gesture
20 | - homepage: [https://github.com/whatifif/handgesture](https://github.com/whatifif/handgesture)
21 | - code page: [https://github.com/whatifif/handgesturecode](https://github.com/whatifif/handgesturecode)
22 | - slack: [https://sml109.slack.com](https://sml109.slack.com)
23 | - team name: Team Echo
24 | - team members: heeseob, jiaxi
25 | - period: 1 month
26 |
27 | Among the [10 teams](/teams-all/SML109_team_project_abstracts.pdf), we won first prize ( an Amazon Echo ( Alexa ) as the award ) jointly with one other team ( DeepAI ).
28 |
29 | ## Brief Introduction to this project
30 |
31 | Almost everyone uses a desktop or laptop these days. One serious problem is that we are stuck to the keyboard and mouse, which causes health problems in the long run. Moreover, in the VR/AR age we cannot use a keyboard and mouse at all. Our purpose is to replace the keyboard and mouse with hand gestures. We devised a virtual keyboard and virtual mouse based on subtle hand gestures and trained a machine learning model to recognise them, so that we can control a computer remotely. Amazon Echo has ears now; it will have eyes in the future. We need a standard set of gestures that people can adopt as easily as the standard keyboard and mouse, and machine learning will make that possible for us. We used deep learning as the machine learning method in this project.
32 |
33 | - Using hand movements and gestures to control the mouse and clicks through the webcam
34 | 
35 |
36 |
37 | ## Introduction
38 |
39 | People have used the keyboard and mouse as the standard way of input to a computer for a long time. One serious problem with the keyboard and mouse is that we are stuck to them and sit still while using a desktop or laptop computer. This causes health problems in the long run.
40 |
41 | Moreover, we are entering the VR ( Virtual Reality ) / AR ( Augmented Reality ) age, in which we cannot use a keyboard and mouse as we did before. We have to use some kind of mobile controller, or gestures!
42 |
43 | ML ( Machine Learning ) can recognise our gestures and control the computer remotely as we intend.
44 | We will feel like magicians. By moving our body to control the computer, we avoid sitting still and the health problems that come with it. Moreover, we can play games with a much more immersive experience by moving our hands, head and body.
45 |
46 | To replace a keyboard and mouse, we have to devise many subtle gestures, and these gestures have to be easy for people to learn. They should also become a standard, like the standard keyboard and mouse; it would be annoying to have to learn different gestures to control different devices in the future.
47 |
48 | We suggest a standard set of hand gestures as follows. The goal of this project is to explore whether these subtle hand gestures can control a computer well enough to replace a keyboard and mouse completely.
49 |
50 |
51 | ## Recent works in this field
52 |
53 | 1. [Carnegie Mellon University OpenPose](https://github.com/CMU-Perceptual-Computing-Lab/openpose): webcam
54 |
55 | 2. [Microsoft hololens](https://www.microsoft.com/en-au/hololens): 3D sensor
56 |
57 | 3. [Microsoft hand tracking](https://www.microsoft.com/en-us/research/project/fully-articulated-hand-tracking/): 3D sensor
58 |
59 | 4. [Leap Motion](https://www.leapmotion.com): 3D sensor
60 |
61 | 5. [Mano Motion](https://www.manomotion.com/): smartphone camera
62 |
63 | 6. [PilotBit Mobile Hand Tracking](http://www.pilotbit.com/): 3D sensor
64 |
65 |
66 | ## Challenging points
67 |
68 | - using a webcam only, without a 3D sensor (depth camera)
69 |
70 | - detecting the subtle gestures in real time
71 |
72 | - running deep learning on mobile devices such as smartphones, smart glasses and VR/AR headsets
73 |
74 |
75 |
76 | ## Work flow
77 |
78 | 1. searching GitHub for open-source code and YouTube for useful information.
79 |
80 | 2. defining as many hand gestures as possible to cover all the keys of a keyboard and mouse.
81 |
82 | 3. coding tools to create as many data sets as possible in a short time.
83 |
84 | 4. coding a deep learning model that can recognise these hand gestures in real time.
85 |
86 | 5. coding for tracking and capturing the hand.
87 |
88 | 6. coding demos such as basic calculation or a game.
89 |
90 |
91 |
92 | ## Tools used for Team work
93 | 1. github site for project page: [https://github.com/whatifif/handgesture](https://github.com/whatifif/handgesture)
94 | 2. github site for coding work : [https://github.com/whatifif/handgesturecode](https://github.com/whatifif/handgesturecode)
95 | 3. slack for team communication and instant file sharing: [https://sml109.slack.com](https://sml109.slack.com)
96 |
97 | ## Technical Details
98 |
99 | #### Main Dependencies:
100 | - Python 2.7
101 | - Opencv 3.2.0
102 | - MxNet 0.11.0
103 | - Numpy 1.13.1
104 | - Pandas 0.20.3
105 |
106 | #### Hardware and software
107 | - Coding on a MacBook Air (i7, 8GB RAM)
108 | - ML model training on an Nvidia GTX 960 (2GB graphics memory) in an i7 CPU, 8GB RAM, Ubuntu 14.04 64-bit personal computer
109 | - Microsoft Webcam HD-3000
110 | - MacBook Air webcam
111 | - Jupyter Notebook as the Python editor
112 |
113 |
114 | ## Hand Gestures as a Standard Way of Input like a Keyboard and a Mouse
115 | - Using hand for a future virtual keyboard
116 | 
117 |
118 | We normally have two hands: we can use one hand for the keyboard and the other for the mouse.
119 | The standard gestures should be easy for people to learn. Just imagine that there is a virtual keyboard in front of your left side and a virtual mouse on your right side. Let's focus on the virtual keyboard first.
120 |
121 | We divided the left side into three regions:
122 |
123 | 1. left region
124 | 2. middle region
125 | 3. right region
126 |
127 | - left, middle and right regions of the left side.
128 | ( These are LEFT-side images. Images are flipped horizontally when captured, so do not be confused. )
129 | 
130 |
131 |
132 | We can make 10 easy, distinct gestures in each region:
133 |
134 | 1. closed hand
135 | 2. open hand
136 | 3. thumb only
137 | 4. additional index finger
138 | 5. additional middle finger
139 | 6. folding first finger
140 | 7. folding second finger
141 | 8. folding middle finger
142 | 9. folding index finger
143 | 10. folding thumb
144 |
145 | If we use gestures 6, 7, 8, 9 and 10 as inputs, we have 5 input gestures in each region, and 5 * 3 = 15 input gestures across the three regions.
146 |
147 | If we use gestures 3, 4 and 5 as controls, we have 3 * 3 = 9 controls, and 15 * 9 = 135 different combinations, which cover the whole range of keys ( numbers, lower-case letters, upper-case letters, special keys and controls ).
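
To make the counting concrete, here is a short sketch of the combinatorics (labels follow the q/w/e and 1-5 shorthand used on the [gestures page](/gestures.md); the code itself is only an illustration, not part of the project's codebase):

```python
# Illustration of the gesture combinatorics described above.
from itertools import product

regions = ["left", "middle", "right"]   # three regions of the left-hand side
inputs = ["1", "2", "3", "4", "5"]      # gestures 6-10 used as inputs (5 per region)
controls = ["q", "w", "e"]              # gestures 3-5 used as controls (3 per region)

input_gestures = [r + g for r, g in product(regions, inputs)]      # 5 * 3 = 15
control_gestures = [r + c for r, c in product(regions, controls)]  # 3 * 3 = 9
key_codes = list(product(control_gestures, input_gestures))        # 9 * 15 = 135

print(len(input_gestures), len(control_gestures), len(key_codes))  # 15 9 135
```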
148 |
149 | For the right hand, used as the mouse, we have the same 10 gestures as for the left hand, which cover all inputs from a mouse. The centre of the hand acts as the mouse cursor: the computer's mouse cursor tracks the centre of the right hand. There is only one region on the right side for the mouse.
150 |
151 | For left-handed people, the left and right hands can of course be swapped.
152 |
153 | - 30 hand gestures suggested as a standard way of input for remotely-controlled devices.
154 | 
155 |
156 | [See the detailed gestures for keyboard and mouse](/gestures.md)
157 |
158 | ## Making a data set
159 | To train the model, several thousand examples are needed, and we had to prepare these data ourselves.
160 | So we wrote a program to capture hand images easily. With this capturing program, about 2000 hand images were collected in a short period.
161 |
162 | ## Detecting and tracking Hand
163 | Since we move our hands freely in front of the webcam, the hands must be detected and tracked correctly within the webcam frame in real time. Haar cascades, background subtraction and skin color detection were tried for tracking a hand; skin color detection was found to be the most stable.
164 |
165 | #### detection region for hand and mouse
166 | Since our face has the same skin color as our hands, we have to find a way to ignore the face. A Haar cascade could be applied for this purpose, but due to the time limitation of this project we simply defined a detection region and tried not to put our face into that region.
167 |
168 | 
169 |
170 |
171 | #### tracking hands
172 | Since we use skin color to track the hand, the background and our shirt should contrast with skin color, and we have to wear a long-sleeved shirt to hide the arm from detection as well.
173 |
174 | Ambient light affects skin color significantly, so overly bright rooms were avoided, and a blue screen made from a blue vinyl table cloth was used as the background to get good data. It turns out that a whiteboard is a good background as well.
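
As an illustration of this skin color approach, here is a minimal OpenCV sketch (the HSV thresholds and the region coordinates are assumptions for this example; the project's actual code lives in the [handgesturecode](https://github.com/whatifif/handgesturecode) repository):

```python
# Minimal skin-color segmentation sketch (assumed thresholds, not the project's exact values).
import cv2
import numpy as np

lower_skin = np.array([0, 48, 80], dtype=np.uint8)     # assumed lower HSV bound for skin
upper_skin = np.array([20, 255, 255], dtype=np.uint8)  # assumed upper HSV bound for skin

cap = cv2.VideoCapture(0)
while True:
    ret, frame = cap.read()
    if not ret:
        break
    frame = cv2.flip(frame, 1)                       # mirror the frame, as in the captured images
    roi = frame[100:300, 100:300]                    # fixed detection region (example coordinates)
    hsv = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, lower_skin, upper_skin)  # keep skin-colored pixels only
    mask = cv2.medianBlur(mask, 5)                   # remove speckle noise
    hand = cv2.bitwise_and(roi, roi, mask=mask)
    cv2.imshow("hand", hand)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```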
175 |
176 | - Main program used to capture hand images:
177 | There are two detection regions. When one of the buttons below is clicked, the detected image is captured, resized to 200x200 and saved in JPEG format; at the same time the file name is appended to a CSV file. About 2000 images were collected in a short period.
178 | 
179 |
180 | # Deep Learning Model to recognise the gesture
181 |
182 | ## Difficulties
183 | - Very few existing datasets fit our purpose. Therefore, we had to capture the hand pictures and do the labelling; in other words, we had to build the dataset ourselves.
184 |
185 | - Tracking and recognising the hand are two separate challenges in this project, and each requires a large amount of time. Therefore, we focused primarily on recognition and used a just-feasible approach to hand detection to save time.
186 |
187 | - Model selection, especially the neural network architecture. If the network is too "deep", it takes a long time to converge and is not suitable for real-time camera input, while a single-layer network does not perform well enough.
188 |
189 | - Training on the 200x200 pixel JPEG images gave an "out of memory" error on the Nvidia GTX 960 ( 2GB graphics memory ), so a 64x64 version of the data was prepared and used to train the model.
190 |
191 | - There are intermediate gestures between any two gestures when capturing in real time. If the model is forced to predict continuously on these intermediate frames, it makes many mistakes. We therefore sampled only 1 out of every 10 frames and accepted a gesture only when 3 consecutive predictions agreed ( see the sketch below ), so the delay shown in the demo is not the actual delay of the model itself.
192 |
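The frame-skipping and agreement rule can be sketched as follows (a simplified illustration; `predict_gesture` stands in for the actual model call):

```python
# Simplified sketch of the prediction-smoothing rule: sample 1 of every 10 frames
# and accept a gesture only after 3 consecutive identical predictions.
from collections import deque

SAMPLE_EVERY = 10   # predict on 1 out of every 10 frames
AGREE_COUNT = 3     # require 3 consecutive identical predictions

def smooth_predictions(frames, predict_gesture):
    """frames: iterable of camera frames; predict_gesture: model inference function."""
    recent = deque(maxlen=AGREE_COUNT)
    for i, frame in enumerate(frames):
        if i % SAMPLE_EVERY != 0:
            continue
        recent.append(predict_gesture(frame))
        if len(recent) == AGREE_COUNT and len(set(recent)) == 1:
            yield recent[0]   # accepted as the final gesture
            recent.clear()
```
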
193 | ## Why was MxNet chosen as the deep learning framework for this project?
194 | - Super fast
195 | - Various language bindings: R, Python, Scala, Perl, Julia
196 | - Strong API interface
197 | - Scale linearly and easily
198 | - Easy to deploy into production
199 |
200 |
201 | ## How can the trained model be transferred to work on mobile devices such as smartphones?
202 | Two ways:
203 |
204 | 1. The trained model can be installed on the smartphone and run predictions on the client side. In this case, the model needs to be updated periodically, which might require a large update package for the application.
205 |
206 | 
207 |
208 | 2. The trained model stays on the application server, while the detection utility and the necessary preprocessing stay on the client side to reduce the workload on the server. The model can then be updated periodically without much impact on the client application.
209 |
210 | 
211 |
212 |
213 | ### Main Work Flow:
214 |
215 | 
216 |
217 | ### Model Details
218 |
219 | - ~1800 images for training
220 | - ~200 images for testing
221 | - Mini-batch size 128
222 | - Model converges in around 400 iterations, with learning rate 0.001
223 | - Weight decay set to 0.001 to regularize the model against overfitting
224 | - Achieved over 97% accuracy on the test dataset
225 |
226 | 
227 |
228 |
229 | ## Summary of Progress
230 |
231 | - The possibility of using subtle hand gestures to control a computer remotely was confirmed.
232 |
233 | - A standard set of hand gestures that people can learn easily, and that can replace a whole keyboard and mouse, is suggested.
234 |
235 | - The left hand is used for the keyboard and the right hand for the mouse; these can be swapped for left-handed people.
236 |
237 | - The left hand has 30 gestures, based on the hand shape and its angle in the picture. These 30 gestures can be mapped to 135 different keys on the keyboard.
238 |
239 |
240 |
241 |
242 | ## Demo
243 | - Performing simple calculations using different LEFT hand gestures through web cam.
244 | 
245 |
246 | - Using Right hand to control the mouse cursor.
247 | 
248 |
249 | ## Future Work
250 | - Improve our hand detection; this can be extended to utilise a deep learning model for more accurate results.
251 |
252 | - By using a Haar cascade, the face can be detected and removed, or only the hands can be detected. Then the fixed detection region for hands is no longer needed and we can move our body freely.
253 |
254 | - Add reference points to sample the user's skin color and set it as the new color to detect. During preprocessing, map the new skin color to the skin color used in training, so that other skin colors can also be tracked correctly.
255 |
256 | - Add an 'unknown' class when the prediction probability is below a threshold. When a frame is classified as 'unknown', immediately capture the next frame instead of waiting another 10 frames.
257 |
258 | - Add more gestures, such as raising the first finger, the index finger, the index and middle fingers, or the first finger and thumb.
259 |
260 | - Collect more training data for various hand conditions, such as different skin colors, hand sizes, brightness levels, and occlusion, and modify the model structure if needed.
261 |
262 | - Use advanced training methods that require fewer data, such as training on a 3D hand model instead of 2D hand images.
263 |
264 | - Establish and promote the standard hand gestures for mobile and remotely-controlled devices.
265 |
266 | ## P.S. What will be the future of humans in the AI (Artificial Intelligence) age?
267 | What will be the future of us humans in the AI age? There may be no work that humans have to do for a living. We may have a Universal Basic Income and the freedom to do what we like. AI may create a utopian world for humans. Then, to reach that world as soon as possible, why not cooperate in developing AI rather than compete for limited resources and be greedy? By using technology, including AI, we can make our resources abundant enough for all of humanity. Humans will become a multiplanetary species in the future, and there are infinite resources out there in the universe. Let's make AI work for us humans, and let's enjoy our lives as human beings.
268 |
269 |
--------------------------------------------------------------------------------
/_config.yml:
--------------------------------------------------------------------------------
1 | theme: jekyll-theme-cayman
--------------------------------------------------------------------------------
/_includes/headerbutton.html:
--------------------------------------------------------------------------------
1 |
2 |
3 |
--------------------------------------------------------------------------------
/_includes/navigation.html:
--------------------------------------------------------------------------------
1 |
2 | Home |
3 | Reference |
4 | Edit Guide |
5 |
6 |
--------------------------------------------------------------------------------
/_layouts/default.html:
--------------------------------------------------------------------------------
1 |
2 |
3 |
17 | {% if site.show_downloads %}
18 | Download .zip
19 | Download .tar.gz
20 | {% endif %}
21 |
22 |
23 |
24 | {{ content }}
25 |
26 |
32 |
33 |
34 | {% if site.google_analytics %}
35 |
44 | {% endif %}
45 |
46 |
47 |
--------------------------------------------------------------------------------
/assets/css/style.scss:
--------------------------------------------------------------------------------
1 | ---
2 | ---
3 |
4 | @import "{{ site.theme }}";
5 |
6 | .page-header {
7 | background-image: linear-gradient(120deg, #155799, #0f482b);
8 | }
9 |
10 | .nav a {
11 | text-decoration: none;
12 | }
13 |
14 | .nav:hover {
15 | background-color: #eee
16 | }
17 |
--------------------------------------------------------------------------------
/blue_background.md:
--------------------------------------------------------------------------------
1 | [Home](/README.md) | [Gestures](/gestures.md) | [Blue_Background](/blue_background.md) | [Reference](/reference.md) | [Edit Guide](/editguide.md) | |
2 |
3 |
4 | # The Blue Background used to capture a hand image
5 |
6 | 
7 |
8 | Made with 2 brooms, a blue color vinyl table cloth, some clips and a clothes hanger
9 |
--------------------------------------------------------------------------------
/draft-d/README.md:
--------------------------------------------------------------------------------
1 |
2 |
--------------------------------------------------------------------------------
/draft-d/resources/resources.md:
--------------------------------------------------------------------------------
1 |
2 |
--------------------------------------------------------------------------------
/draft-h/README.md:
--------------------------------------------------------------------------------
1 |
2 | ## Brief Introduction to the Team Echo project
3 |
4 | Almost everyone uses a desktop or laptop these days. One serious problem is that we are stuck to the keyboard and mouse, which causes health problems in the long run. Moreover, in the VR/AR age we cannot use a keyboard and mouse at all. Our purpose is to replace the keyboard and mouse with hand gestures. We devised a virtual keyboard and virtual mouse based on subtle hand gestures and trained a machine learning model to recognise them, so that we can control a computer remotely. Amazon Echo has ears now; it will have eyes in the future. We need a standard set of gestures that people can adopt as easily as the standard keyboard and mouse, and machine learning will make that possible for us. We used deep learning as the machine learning method in this project.
5 |
6 | ## Introduction
7 |
8 | People have used the keyboard and mouse as the standard way of input to a computer for a long time. One serious problem with the keyboard and mouse is that we are stuck to them and sit still while using a desktop or laptop computer. This causes health problems in the long run.
9 |
10 | Moreover, we are entering the VR ( Virtual Reality ) / AR ( Augmented Reality ) age, in which we cannot use a keyboard and mouse as we did before. We have to use some kind of mobile controller, or gestures!
11 |
12 | ML ( Machine Learning ) can recognise our gestures and control the computer remotely as we intend.
13 | We will feel like magicians. By moving our body to control the computer, we avoid sitting still and the health problems that come with it. Moreover, we can play games with a much more immersive experience by moving our hands, head and body.
14 |
15 | To replace a keyboard and mouse, we have to devise many subtle gestures, and these gestures have to be easy for people to learn, like the standard keyboard and mouse. It would be annoying to have to learn different gestures to control different devices in the future.
16 |
17 | We suggest a standard virtual keyboard and standard virtual mouse as follows. The goal of this project is to explore whether these subtle gestures can control a computer well enough to replace a keyboard and mouse completely.
18 |
19 |
20 | ## Hand Gestures as a Standard Way to replace a Keyboard and Mouse
21 |
22 | We normally have two hands: we can use one hand for the keyboard and the other for the mouse.
23 | The standard gestures should be easy for people to learn. Just imagine that there is a virtual keyboard in front of your left side and a virtual mouse on your right side. Let's focus on the virtual keyboard first.
24 |
25 | We divided the left side into three regions:
26 |
27 | 1. left region
28 | 2. middle region
29 | 3. right region
30 |
31 | We can make 10 easy, distinct gestures in each region:
32 |
33 | 1. closed hand
34 | 2. open hand
35 | 3. thumb only
36 | 4. additional index finger
37 | 5. additional middle finger
38 | 6. folding first finger
39 | 7. folding second finger
40 | 8. folding middle finger
41 | 9. folding index finger
42 | 10. folding thumb
43 |
44 | If we use gestures 6, 7, 8, 9 and 10 as inputs, we have 5 input gestures in each region, and 5 * 3 = 15 input gestures across the three regions.
45 |
46 | If we use gestures 3, 4 and 5 as controls, we have 3 * 3 = 9 controls, and 15 * 9 = 135 different combinations, which cover the whole range of keys ( numbers, lower-case letters, upper-case letters, special keys and controls ).
47 |
48 | For the mouse, the right hand has the same 10 gestures as the left hand, which cover all inputs from a mouse. The centre of the hand acts as the mouse cursor: the computer's mouse cursor tracks the centre of the right hand.
49 |
50 | For left-handed people, the left and right hands can of course be swapped.
51 |
52 |
53 | ## My thought
54 |
55 | Our project confirmed the possibility of using these hand gestures to replace a keyboard and mouse completely.
56 | We also suggested a standard set of hand gestures to establish a standard way of input for all remotely-controlled devices,
57 | because without a standard, people would have to learn separate gestures for each device, which would be annoying and easy to forget.
58 |
59 | I think that AI should directly affect the physical world to be practically useful for humans, since we are physical beings.
60 | AI should have a body ( arms and legs ) as well as ears and eyes.
61 | That is why I am interested in robots, and it would be useful if these robots could recognise human intentions from hand gestures.
62 |
63 |
64 |
--------------------------------------------------------------------------------
/draft-h/resources/resources.md:
--------------------------------------------------------------------------------
1 |
2 |
--------------------------------------------------------------------------------
/draft-j/README.md:
--------------------------------------------------------------------------------
1 | # Use hand gestures to control the keyboard and mouse
2 |
3 |
4 | ## Project Outline
5 |
6 | - Using hand gestures to control the keyboard through the webcam
7 |
8 | 
9 |
10 | - Using hand movements and gestures to control the mouse and clicks through the webcam
11 |
12 | 
13 |
14 |
15 |
16 | ## Technical Details
17 |
18 | ### Main Dependencies:
19 | - Python 2.7
20 | - Opencv 3.2.0
21 | - MxNet 0.11.0
22 | - Numpy 1.13.1
23 | - Pandas 0.20.3
24 | - ..
25 |
26 | ### Difficulties
27 | - Very few existing datasets fit our purpose. Therefore, we had to capture the hand pictures and do the labelling; in other words, we had to build the dataset ourselves.
28 |
29 | - Tracking and recognising the hand are two separate challenges in this project, and each requires a large amount of time. Therefore, we focused primarily on recognition and used a just-feasible approach to hand detection to save time.
30 |
31 | - Model selection, especially the neural network architecture. If the network is too "deep", it takes a long time to converge and is not suitable for real-time camera input, while a single-layer network does not perform well enough.
32 |
33 | ### Why MxNet?
34 | - Super fast
35 | - Various language bindings: R, Python, Scala, Perl, Julia
36 | - Strong API interface
37 | - Scale linearly and easily
38 | - Easy to deploy into production
39 |
40 | ### How can the trained model be transferred to work on a smartphone?
41 | Two ways:
42 |
43 | 1. The trained model can be installed on the smartphone and run predictions on the client side. In this case, the model needs to be updated periodically, which might require a large update package for the application
44 |
45 | 2. The trained model stays on the application server, while the detection utility and the necessary preprocessing stay on the client side to reduce the workload on the server. The model can then be updated periodically without much impact on the client application
46 |
47 | ### Main Work Flow:
48 |
49 | 
50 |
51 | ### Model Details
52 |
53 | - ~1800 images for training
54 | - ~200 images for testing
55 | - Mini-batch size 128
56 | - Model converges in around 400 iterations, with learning rate 0.001
57 | - Weight decay set to 0.001 to regularize the model against overfitting
58 | - Achieved over 97% accuracy on the test dataset
59 |
60 | 
61 |
62 |
63 | ## Our Progress
64 |
65 | - Use the left hand for the keyboard and the right hand for the mouse
66 |
67 | - At the current stage, the left hand can be used for around 30 different keys on the keyboard, based on the hand gesture and its angle in the picture
68 |
69 |
70 |
71 |
72 | ## Demo
73 | - Performing simple calculations using different LEFT hand gestures through web cam.
74 | 
75 |
76 | - Using Right hand to control the mouse cursor.
77 | 
78 |
79 | ## Future Work
80 | - Improve our hand detection technique; this can be extended to utilise a deep learning model for more accurate feedback
81 |
82 | - Determine the classes (keys) in a more efficient and effective way
83 |
84 | - Collect more training data for various hand conditions, such as different skin colors, hand sizes, brightness levels, and occlusion, and modify the model structure if needed
85 |
86 |
87 |
88 |
89 |
90 |
91 |
92 |
93 |
94 |
95 |
96 |
97 |
98 |
99 |
--------------------------------------------------------------------------------
/draft-j/resources/Untitled Diagram.xml:
--------------------------------------------------------------------------------
1 | UzV2zq1wL0osyPDNT0nNUTV2VTV2LsrPL4GwciucU3NyVI0MMlNUjV1UjYwMgFjVyA2HrCFY1qAgsSg1rwSLBiADYTaQg2Y1AA==
--------------------------------------------------------------------------------
/draft-j/resources/client.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/draft-j/resources/client.png
--------------------------------------------------------------------------------
/draft-j/resources/demo.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/draft-j/resources/demo.gif
--------------------------------------------------------------------------------
/draft-j/resources/learning_curve.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/draft-j/resources/learning_curve.png
--------------------------------------------------------------------------------
/draft-j/resources/mb.xml:
--------------------------------------------------------------------------------
1 | UzV2zq1wL0osyPDNT0nNUTV2VTV2LsrPL4GwciucU3NyVI0MMlNUjV1UjYwMgFjVyA2HrCFY1qAgsSg1rwSLBiADYTaQg2Y1AA==
--------------------------------------------------------------------------------
/draft-j/resources/out-mouse.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/draft-j/resources/out-mouse.gif
--------------------------------------------------------------------------------
/draft-j/resources/resources.md:
--------------------------------------------------------------------------------
1 |
2 |
--------------------------------------------------------------------------------
/draft-j/resources/server.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/draft-j/resources/server.png
--------------------------------------------------------------------------------
/draft-j/resources/workflow.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/draft-j/resources/workflow.png
--------------------------------------------------------------------------------
/editguide.md:
--------------------------------------------------------------------------------
1 | [Home](/README.md) | [Gestures](/gestures.md) | [Blue_Background](/blue_background.md) | [Reference](/reference.md) | [Edit Guide](/editguide.md) | |
2 |
3 |
4 |
5 | ### Edit Guide of Pages
6 |
7 |
8 |
9 | README.md is the main page.
10 |
11 | Copy the navigation line at the top of README.md to the top of all the other pages.
12 |
13 | You can use the [editor on GitHub](https://github.com/whatifif/handgesture/edit/master/README.md) to maintain and preview the content for your website in Markdown files.
14 |
15 | Whenever you commit to this repository, GitHub Pages will run [Jekyll](https://jekyllrb.com/) to rebuild the pages in your site, from the content in your Markdown files.
16 |
17 | ### Markdown
18 |
19 | Markdown is a lightweight and easy-to-use syntax for styling your writing. It includes conventions for
20 |
21 | ```markdown
22 | Syntax highlighted code block
23 |
24 | # Header 1
25 | ## Header 2
26 | ### Header 3
27 |
28 | - Bulleted
29 | - List
30 |
31 | 1. Numbered
32 | 2. List
33 |
34 | **Bold** and _Italic_ and `Code` text
35 |
36 | [Link](url) and 
37 | ```
38 |
39 | For more details see [GitHub Flavored Markdown](https://guides.github.com/features/mastering-markdown/).
40 |
41 | ### Jekyll Themes
42 |
43 | Your Pages site will use the layout and styles from the Jekyll theme you have selected in your [repository settings](https://github.com/whatifif/handgesture/settings). The name of this theme is saved in the Jekyll `_config.yml` configuration file.
44 |
45 | ### Support or Contact
46 |
47 | Having trouble with Pages? Check out our [documentation](https://help.github.com/categories/github-pages-basics/) or [contact support](https://github.com/contact) and we’ll help you sort it out.
48 |
49 | ### [LICENSE](/LICENSE)
50 | The MIT License
51 |
52 | Copyright (c) 2017-present whatifif and andyli8500
53 |
54 | See the [LICENSE](/LICENSE) file for the full terms.
56 |
57 |
--------------------------------------------------------------------------------
/gestures.md:
--------------------------------------------------------------------------------
1 | [Home](/README.md) | [Gestures](/gestures.md) | [Blue_Background](/blue_background.md) | [Reference](/reference.md) | [Edit Guide](/editguide.md) | |
2 |
3 | # Hand Gestures as a Standard Way of Input
4 | # to replace Keyboard and Mouse
5 |
6 | 
7 |
8 |
9 | # Hand Gestures of Left Hand side
10 | ( mirrored images for easy capturing )
11 |
12 |
13 | ## There are three regions of left hand side for keyboard
14 |
15 | #### Left region
16 |  leftclosed as ready state
17 |  leftopen as ready state
18 |  leftq as control 1
19 |  leftw as control 2
20 |  lefte as control 3
21 |  left1 as number 1 or alphabet 1
22 |  left2 as number 2 or alphabet 2
23 |  left3 as number 3 or alphabet 3
24 |  left4 as number 4 or alphabet 4
25 |  left5 as number 5 or alphabet 5
26 |
27 |
28 |
29 | #### Middle region
30 |  middleclosed as ready state
31 |  middleopen as ready state
32 |  middleq as control 4
33 |  middlew as control 5
34 |  middlee as control 6
35 |  middle1 as number 6 or alphabet 6
36 |  middle2 as number 7 or alphabet 7
37 |  middle3 as number 8 or alphabet 8
38 |  middle4 as number 9 or alphabet 9
39 |  middle5 as number 0 or alphabet 10
40 |
41 |
42 |
43 | #### Right region
44 |  rightclosed as ready state
45 |  rightopen as ready state
46 |  rightq as control 7
47 |  rightw as control 8
48 |  righte as control 9
49 |  right1 as plus or alphabet 11
50 |  right2 as minus or alphabet 12
51 |  right3 as multiply or alphabet 13
52 |  right4 as divide or alphabet 14
53 |  right5 as equal or alphabet 15
54 |
55 |
56 |
57 | # Hand Gestures of Right Hand side
58 | ( mirrored images for easy capturing )
59 |
60 | ## There is only one region of right hand side for mouse
61 |  mouse closed as ready state
62 |  mouse open as ready state
63 |  mouse q as control 10
64 |  mouse w as control 11
65 |  mouse e as control 12
66 |  mouse 1 as scroll down
67 |  mouse 2 as left button
68 |  mouse 3 as middle button
69 |  mouse 4 as right button
70 |  mouse 5 as scroll up
71 |
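As an illustration of how these right-hand gestures could drive the pointer (a hedged sketch using the pynput library listed on the [Reference](/reference.md) page; the v-prefixed labels follow the image file names in /resources/gestures, and the mapping itself is only an example):

```python
# Example mapping from the right-hand (mouse) gestures to pointer actions using pynput.
from pynput.mouse import Button, Controller

mouse = Controller()

def apply_mouse_gesture(gesture, hand_center):
    """gesture: a label such as 'v1'..'v5'; hand_center: (x, y) in screen coordinates."""
    mouse.position = hand_center      # the cursor tracks the centre of the right hand
    if gesture == "v1":
        mouse.scroll(0, -1)           # mouse 1: scroll down
    elif gesture == "v2":
        mouse.click(Button.left)      # mouse 2: left button
    elif gesture == "v3":
        mouse.click(Button.middle)    # mouse 3: middle button
    elif gesture == "v4":
        mouse.click(Button.right)     # mouse 4: right button
    elif gesture == "v5":
        mouse.scroll(0, 1)            # mouse 5: scroll up
    # closed/open are ready states; q, w, e are controls 10-12 (not mapped here)
```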
72 |
73 |
74 |
75 |
76 |
77 |
--------------------------------------------------------------------------------
/reference.md:
--------------------------------------------------------------------------------
1 | [Home](/README.md) | [Gestures](/gestures.md) | [Blue_Background](/blue_background.md) | [Reference](/reference.md) | [Edit Guide](/editguide.md) | |
2 |
3 | # References
4 |
5 | - MxNet (A Flexible and Efficient Library for Deep Learning)
6 | [http://mxnet.incubator.apache.org/](http://mxnet.incubator.apache.org/)
7 |
8 | - OpenCV documentation
9 | [https://docs.opencv.org/2.4.13/](https://docs.opencv.org/2.4.13/)
10 |
11 | - Opencv python hand gesture recognition
12 | [http://creat-tabu.blogspot.com.au/2013/08/opencv-python-hand-gesture-recognition.html](http://creat-tabu.blogspot.com.au/2013/08/opencv-python-hand-gesture-recognition.html)
13 |
14 | - Fingers-Detection-using-OpenCV-and-Python
15 | [https://github.com/lzane/Fingers-Detection-using-OpenCV-and-Python](https://github.com/lzane/Fingers-Detection-using-OpenCV-and-Python)
16 |
17 | - Mahaveerverma's hand gesture recognition
18 | [https://github.com/mahaveerverma/hand-gesture-recognition-opencv](https://github.com/mahaveerverma/hand-gesture-recognition-opencv)
19 |
20 | - Handling the mouse
21 | [http://pythonhosted.org/pynput/mouse.html](http://pythonhosted.org/pynput/mouse.html)
22 |
23 | - Realtime hand gesture recognition to control your window manager
24 | [https://github.com/mre/tracker](https://github.com/mre/tracker)
25 |
29 | - GestuRe: A mixed-initiative interactive machine learning system for recognizing hand gestures
30 | [https://github.com/atduskgreg/gestuRe](https://github.com/atduskgreg/gestuRe):
31 |
32 | - Hand Gesture Datasets
33 | [http://lttm.dei.unipd.it/downloads/gesture/](http://lttm.dei.unipd.it/downloads/gesture/)
34 |
35 | - HandGenerator
36 | [http://lttm.dei.unipd.it/downloads/handposegenerator/index.html](http://lttm.dei.unipd.it/downloads/handposegenerator/index.html)
37 |
38 | - YOLO: Real-Time Object Detection
39 | [https://pjreddie.com/darknet/yolo/](https://pjreddie.com/darknet/yolo/)
40 |
41 | - Universe allows an AI agent to use a computer like a human does: by looking at screen pixels and operating a virtual keyboard and mouse
42 | [https://blog.openai.com/universe/](https://blog.openai.com/universe/)
43 |
44 | - How to Use Your Smartphone as a Mouse, Keyboard, and Remote Control for Your PC
45 | [https://www.howtogeek.com/240794/how-to-use-your-smartphone-as-a-mouse-keyboard-and-remote-control-for-your-pc/](https://www.howtogeek.com/240794/how-to-use-your-smartphone-as-a-mouse-keyboard-and-remote-control-for-your-pc/)
46 |
47 | - 9 OpenCV tutorials to detect and recognize hand gestures
48 | [https://www.intorobotics.com/9-opencv-tutorials-hand-gesture-detection-recognition/](https://www.intorobotics.com/9-opencv-tutorials-hand-gesture-detection-recognition/)
49 |
50 |
51 |
52 | # Youtube
53 |
54 | - Handpose: Fully Articulated Hand Tracking : microsoft research
55 | [https://www.youtube.com/watch?v=A-xXrMpOHyc](https://www.youtube.com/watch?v=A-xXrMpOHyc)
56 |
57 | - Kinect Finger Recognition For Games
58 | [https://www.youtube.com/watch?v=NqjopQmqWAE](https://www.youtube.com/watch?v=NqjopQmqWAE)
59 |
60 | - IMPRESSIVE HAND TRACKING! | Blocks VR (Oculus Rift DK2 + Leap Motion Orion )
61 | [https://www.youtube.com/watch?v=LJPxyWM9Ujg](https://www.youtube.com/watch?v=LJPxyWM9Ujg)
62 |
63 | - OpenCV + Python + Hand Tracking + Gesture Recognition
64 | [https://www.youtube.com/watch?v=ycd3t6K2ofs](https://www.youtube.com/watch?v=ycd3t6K2ofs)
65 |
66 | - OpenCV Python Webcam Hand Gesture Detection API
67 | [https://www.youtube.com/watch?v=oH0ZkfFoeYU](https://www.youtube.com/watch?v=oH0ZkfFoeYU)
68 |
69 | - Hand Gesture Detection AI With Convolutional Neural Networks
70 | [https://www.youtube.com/watch?v=Y6oLbRKwmPk](https://www.youtube.com/watch?v=Y6oLbRKwmPk)
71 |
72 | - Playing Mario With Hand Gesture Neural Network + LSES Update/Demo
73 | [https://www.youtube.com/watch?v=HaQizxcc1d0](https://www.youtube.com/watch?v=HaQizxcc1d0)
74 |
75 |
76 | # Resources
77 |
78 | - Remote control images
79 | [https://thetechhacker.com/wp-content/uploads/2013/08/Control-Your-Home-With-Gesture-And-Voice-Using-Flowton.jpg](https://thetechhacker.com/wp-content/uploads/2013/08/Control-Your-Home-With-Gesture-And-Voice-Using-Flowton.jpg)
80 |
81 | - virtual keyboard from Google images
82 | [https://www.gearbest.com/bluetooth-keyboard/pp_69416.html](https://www.gearbest.com/bluetooth-keyboard/pp_69416.html)
83 |
84 |
85 |
86 |
87 |
88 |
89 |
90 |
--------------------------------------------------------------------------------
/resources/.DS_Store:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/.DS_Store
--------------------------------------------------------------------------------
/resources/blue-table-cloth800x.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/blue-table-cloth800x.jpg
--------------------------------------------------------------------------------
/resources/download.jpeg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/download.jpeg
--------------------------------------------------------------------------------
/resources/gestures/hand_gestures.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/hand_gestures.png
--------------------------------------------------------------------------------
/resources/gestures/index.md:
--------------------------------------------------------------------------------
1 |
2 |
--------------------------------------------------------------------------------
/resources/gestures/l-m-r.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/l-m-r.png
--------------------------------------------------------------------------------
/resources/gestures/l1.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/l1.jpg
--------------------------------------------------------------------------------
/resources/gestures/l2.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/l2.jpg
--------------------------------------------------------------------------------
/resources/gestures/l3.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/l3.jpg
--------------------------------------------------------------------------------
/resources/gestures/l4.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/l4.jpg
--------------------------------------------------------------------------------
/resources/gestures/l5.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/l5.jpg
--------------------------------------------------------------------------------
/resources/gestures/lc.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/lc.jpg
--------------------------------------------------------------------------------
/resources/gestures/le.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/le.jpg
--------------------------------------------------------------------------------
/resources/gestures/lo.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/lo.jpg
--------------------------------------------------------------------------------
/resources/gestures/lq.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/lq.jpg
--------------------------------------------------------------------------------
/resources/gestures/lw.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/lw.jpg
--------------------------------------------------------------------------------
/resources/gestures/m1.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/m1.jpg
--------------------------------------------------------------------------------
/resources/gestures/m2.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/m2.jpg
--------------------------------------------------------------------------------
/resources/gestures/m3.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/m3.jpg
--------------------------------------------------------------------------------
/resources/gestures/m4.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/m4.jpg
--------------------------------------------------------------------------------
/resources/gestures/m5.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/m5.jpg
--------------------------------------------------------------------------------
/resources/gestures/mc.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/mc.jpg
--------------------------------------------------------------------------------
/resources/gestures/me.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/me.jpg
--------------------------------------------------------------------------------
/resources/gestures/mo.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/mo.jpg
--------------------------------------------------------------------------------
/resources/gestures/mq.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/mq.jpg
--------------------------------------------------------------------------------
/resources/gestures/mw.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/mw.jpg
--------------------------------------------------------------------------------
/resources/gestures/r1.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/r1.jpg
--------------------------------------------------------------------------------
/resources/gestures/r2.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/r2.jpg
--------------------------------------------------------------------------------
/resources/gestures/r3.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/r3.jpg
--------------------------------------------------------------------------------
/resources/gestures/r4.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/r4.jpg
--------------------------------------------------------------------------------
/resources/gestures/r5.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/r5.jpg
--------------------------------------------------------------------------------
/resources/gestures/rc.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/rc.jpg
--------------------------------------------------------------------------------
/resources/gestures/re.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/re.jpg
--------------------------------------------------------------------------------
/resources/gestures/ro.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/ro.jpg
--------------------------------------------------------------------------------
/resources/gestures/rq.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/rq.jpg
--------------------------------------------------------------------------------
/resources/gestures/rw.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/rw.jpg
--------------------------------------------------------------------------------
/resources/gestures/v1.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/v1.jpg
--------------------------------------------------------------------------------
/resources/gestures/v2.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/v2.jpg
--------------------------------------------------------------------------------
/resources/gestures/v3.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/v3.jpg
--------------------------------------------------------------------------------
/resources/gestures/v4.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/v4.jpg
--------------------------------------------------------------------------------
/resources/gestures/v5.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/v5.jpg
--------------------------------------------------------------------------------
/resources/gestures/vc.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/vc.jpg
--------------------------------------------------------------------------------
/resources/gestures/ve.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/ve.jpg
--------------------------------------------------------------------------------
/resources/gestures/vo.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/vo.jpg
--------------------------------------------------------------------------------
/resources/gestures/vq.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/vq.jpg
--------------------------------------------------------------------------------
/resources/gestures/vw.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/gestures/vw.jpg
--------------------------------------------------------------------------------
/resources/images.jpeg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/images.jpeg
--------------------------------------------------------------------------------
/resources/remote-control800x.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/remote-control800x.jpg
--------------------------------------------------------------------------------
/resources/tracking_hands800x.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/tracking_hands800x.jpg
--------------------------------------------------------------------------------
/resources/virtual-keyboard2.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/resources/virtual-keyboard2.jpg
--------------------------------------------------------------------------------
/teams-all/SML109_team_project_abstracts.docx:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/teams-all/SML109_team_project_abstracts.docx
--------------------------------------------------------------------------------
/teams-all/SML109_team_project_abstracts.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/whatifif/handgesture/24e84561ee5c2244fee0b45cb31d69bccd6ff123/teams-all/SML109_team_project_abstracts.pdf
--------------------------------------------------------------------------------