├── CNAME
├── Publications-and-Articles.md
├── README.md
├── _config.yml
├── data
│   ├── Code Examples
│   │   ├── Code_to_read_Data_Python.py
│   │   ├── Code_to_read_Test_Data_Python.py
│   │   ├── MATLAB_Code_To_Read_Test_Data.m
│   │   └── MATLAB_Code_To_Read_Training_Data.m
│   └── Data_Download_And_Details.md
└── images
    ├── 22 (1).png
    ├── Images.txt
    ├── Picture3-635x500 (1).png
    ├── Picture4.png
    ├── Picture5.png
    ├── Picture6.png
    ├── Picture7.png
    └── Picture8.png

--------------------------------------------------------------------------------
/CNAME:
--------------------------------------------------------------------------------
dop-net.com
--------------------------------------------------------------------------------
/Publications-and-Articles.md:
--------------------------------------------------------------------------------
# UCL Publications:

## DopNET Publication:

M. Ritchie, R. Capraru, and F. Fioranelli, “Dop-NET: a micro-Doppler radar data challenge,” Electronics Letters, vol. 56, no. 11, pp. 568–570, May 2020.

## Key Publications

A. Bannon, R. Capraru, and M. Ritchie, “Exploring gesture recognition with low-cost CW radar modules in comparison to FMCW architectures,” in 2020 IEEE International Radar Conference (RADAR), 2020, pp. 744–748.

Ritchie, M.; Jones, A.; Brown, J.; Griffiths, H., “Hand gesture classification using 24 GHz FMCW dual polarized radar system”, IET Radar Conference, Belfast, 2017.

Ritchie, M.; Jones, A., “Micro-Doppler Gesture Recognition using Doppler, Time and Range Based Features”, IEEE Radar Conference, Boston, MA, USA, 2019.

## 2019

J. Le Kernec et al., “Radar signal processing for sensing in assisted living: The challenges associated with real-time implementation of emerging algorithms,” IEEE Signal Process. Mag., vol. 36, no. 4, pp. 29–41, 2019.

## 2018

G. Li, R. Zhang, M. Ritchie, and H. Griffiths, “Sparsity-Driven Micro-Doppler Feature Extraction for Dynamic Hand Gesture Recognition,” IEEE Trans. Aerosp. Electron. Syst., vol. 54, no. 2, 2018.

J. S. Patel, F. Fioranelli, M. Ritchie, and H. Griffiths, “Multistatic radar classification of armed vs unarmed personnel using neural networks,” Evol. Syst., vol. 9, no. 2, pp. 135–144, 2018.

Z. Chen, G. Li, F. Fioranelli, and H. Griffiths, “Personnel Recognition and Gait Classification Based on Multistatic Micro-Doppler Signatures Using Deep Convolutional Neural Networks,” IEEE Geosci. Remote Sens. Lett., vol. 15, no. 5, pp. 669–673, 2018.

## 2017

F. Fioranelli, M. Ritchie, S. Z. Gürbüz, and H. Griffiths, “Feature Diversity for Optimized Human Micro-Doppler Classification Using Multistatic Radar,” IEEE Trans. Aerosp. Electron. Syst., vol. 53, no. 2, 2017.

G. Li, R. Zhang, M. Ritchie, and H. Griffiths, “Sparsity-based dynamic hand gesture recognition using micro-Doppler signatures,” in 2017 IEEE Radar Conference, RadarConf 2017, 2017.

Fioranelli, F.; Ritchie, M.; Balleri, A.; Griffiths, H., “Practical investigation of multiband mono- and bistatic radar signatures of wind turbines”, IET RSN, vol. 11, issue 6, June 2017.

Fioranelli, F.; Ritchie, M.; Griffiths, H., “Bistatic human micro-Doppler signatures for classification of indoor activities”, IEEE Radar Conference, May 2017.

Chen, Q.; Ritchie, M.; Liu, Y.; Chetty, K.; Woodbridge, K., “Joint fall and aspect angle recognition using fine-grained micro-Doppler classification”, IEEE Radar Conference, May 2017.

## 2016

F. Fioranelli, M. Ritchie, and H. Griffiths, “Centroid features for classification of armed/unarmed multiple personnel using multistatic human micro-Doppler,” IET Radar, Sonar Navig., vol. 10, no. 9, 2016.

Ritchie, M.; Ash, M.; Chetty, K., “Soprano Radar Through Wall Classification of Human Micro-Doppler”, MDPI Sensors, May 2016.

Ritchie, M.; Fioranelli, F.; Griffiths, H., “Bistatic Radar Configuration for Human Motion Detection and Classification”, invited chapter in “Radar for Indoor Monitoring: Detection, Localization and Assessment”, Artech House, accepted for publication 2016.

Ritchie, M.; Fioranelli, F.; Borrion, H.; Griffiths, H., “Multistatic Micro-Doppler Radar Features for Classification of Unloaded/Loaded Micro-Drones”, IET RSN, May 2016.

Ritchie, M.; Fioranelli, F.; Griffiths, H.; Torvik, B., “Monostatic and Bistatic Radar Measurements of Birds and Micro-Drone”, IEEE Radar Conference, Philadelphia, 2016.

Ritchie, M.; Fioranelli, F.; Borrion, H., “Micro UAV Crime Prevention: Can we help Princess Leia?”, chapter in Crime Prevention in the 21st Century, Springer, August 2016.

M. Ritchie, A. Stove, K. Woodbridge, and H. Griffiths, “NetRAD: Monostatic and Bistatic Sea Clutter Texture and Doppler Spectra Characterization at S-Band”.

## 2015

F. Fioranelli, M. Ritchie, and H. Griffiths, “Multistatic human micro-Doppler classification of armed/unarmed personnel,” IET Radar, Sonar Navig., vol. 9, no. 7, pp. 857–865, 2015.

Ritchie, M.; Fioranelli, F.; Balleri, A.; Griffiths, H., “Measurement and analysis of multiband bistatic and monostatic radar signatures of wind turbines”, IET Electronics Letters, vol. 51, issue 14, pp. 1112–1113, 2015.

Fioranelli, F.; Ritchie, M.; Griffiths, H., “Classification of Loaded/Unloaded Micro-Drones Using Multistatic Radar”, Electronics Letters, vol. 51, issue 22, 2015.

Fioranelli, F.; Ritchie, M.; Griffiths, H., “Classification of Unarmed/Armed Personnel Using the NetRAD Multistatic Radar for Micro-Doppler and Singular Value Decomposition Features,” IEEE Geoscience and Remote Sensing Letters, vol. 12, no. 9, pp. 1933–1937, Sept. 2015.

Fioranelli, F.; Ritchie, M.; Griffiths, H., “Aspect angle dependence and multistatic data fusion for micro-Doppler classification of armed/unarmed personnel”, IET RSN, vol. 9, issue 9, Dec. 2015.

## 2009

D. Tahmoush and J. Silvious, “Remote detection of humans and animals,” Proc. – Appl. Imag. Pattern Recognit. Work., pp. 1–8, 2009. https://ieeexplore.ieee.org/document/5466303
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
# What is DopNet?

**DopNet** is a large radar database organised as a hierarchy in which each node holds the data of one person, divided into the different gestures recorded from that person. The data was measured with FMCW and CW radars.

DopNet's structure makes it a useful resource for machine-learning gesture recognition and for image processing of the spectrograms.
The shared data was generated by **Dr. Matthew Ritchie** (**University College London (UCL)**, London, United Kingdom) and **Richard Capraru** (**Nanyang Technological University (NTU)** and **Singapore Agency for Science, Technology and Research (A*STAR)**, Singapore) within the **UCL Radar Research Group**, in collaboration with **Dr. Francesco Fioranelli** (**Delft University of Technology (TU Delft)**, Delft, Netherlands). The project started as a **Laidlaw Scholarship project**.

## How do I get hold of the data?

The data is available to universities and scientific bodies for use in their research. It can be obtained here: [Data Information and Download Links](https://github.com/UCLRadarGroup/DopNet/blob/main/data/Data_Download_And_Details.md)

If you want to read publications and articles relating to micro-Doppler gesture recognition or radar research, they are available on the [Publications and Articles page](https://github.com/UCLRadarGroup/DopNet/blob/main/Publications-and-Articles.md). If you publish work that uses DopNet or its data and would like it added to that page, contact us.

## Radar & Micro-Doppler

Radar sensors have previously been used successfully to classify different actions such as walking and carrying an item, and to discriminate between the gaits of people and animals or between drone and bird targets [2–6]. All of this analysis exploits the phenomenon called micro-Doppler: the additional movements a target makes on top of its bulk velocity. For example, a person may walk forwards at 3 m/s, but as they move at this speed their arms and legs oscillate back and forth. This movement creates a signature that was coined micro-Doppler by researcher V. Chen [7].

A typical short-range radar system architecture is the Frequency Modulated Continuous Wave (FMCW) radar. This constantly transmits a chirp, a signal whose frequency increases (up-chirp) or decreases (down-chirp) linearly with time. The transmitted chirp is then mixed with the received signal in order to obtain the range/Doppler of a target. The data generated for this classification challenge was created using an Ancortek 24 GHz FMCW radar with a 750 MHz bandwidth; more details about the radar system are given in the table below:

![alt text](https://github.com/UCLRadarGroup/DopNet/blob/6221b177c76048d356d0ea62ca5210772be55c1e/images/Picture4.png?raw=true)
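The mixing (dechirping) step described above can be illustrated with a short simulation. The sketch below is a minimal illustration rather than the dataset's actual processing: it synthesises an up-chirp, delays it to mimic an echo from a hypothetical target at 2 m, mixes the two signals and takes an FFT, so the resulting beat frequency maps back to range. The bandwidth matches the 750 MHz quoted above; the chirp duration, sample rate and target range are assumptions chosen for illustration.

```python
import numpy as np

# Illustrative FMCW parameters: bandwidth from the text, the rest assumed
B = 750e6        # sweep bandwidth (Hz)
T = 1e-3         # chirp duration (s) -- assumption
fs = 10e6        # baseband sample rate (Hz) -- assumption
c = 3e8          # speed of light (m/s)
R_target = 2.0   # hypothetical target range (m)

t = np.arange(0, T, 1 / fs)
k = B / T                          # chirp slope (Hz/s)
tau = 2 * R_target / c             # round-trip delay (s)

tx = np.exp(1j * np.pi * k * t**2)            # transmitted up-chirp (baseband)
rx = np.exp(1j * np.pi * k * (t - tau)**2)    # delayed echo

# Mixing (dechirping): the product has a constant beat frequency f_b = k * tau
beat = tx * np.conj(rx)

# An FFT of the beat signal gives a peak whose frequency maps back to range
spectrum = np.abs(np.fft.fft(beat))
f_b = np.fft.fftfreq(t.size, 1 / fs)[np.argmax(spectrum)]
print(f"Estimated range: {c * f_b / (2 * k):.2f} m")   # prints ~2.00 m
```

In the real system this range processing is performed within each chirp (fast time), and the gesture's Doppler content is then extracted across chirps (slow time), as described in the processing steps further down.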
The focus of this data challenge is the classification of 4 separate hand gestures. There has been a vast amount of research into various technologies used as human-machine interfaces (HMI). This includes the Microsoft Kinect sensor, virtual reality wand controllers and even sensors that read your brainwaves. Google has recently developed a small radar sensor called Soli, which is proposed as a device for gesture recognition [1]. This research challenge proposes the use of a compact radar sensor as an HMI device and has encouraged researchers to investigate the feasibility of radar in this role.

Some examples of gestures are Wave / Pinch / Click / Swipe.

![alt text](https://github.com/UCLRadarGroup/DopNet/blob/6221b177c76048d356d0ea62ca5210772be55c1e/images/Picture5.png?raw=true)

The image below shows an example plot of the micro-Doppler signature of a person walking towards a radar.

![alt text](https://github.com/UCLRadarGroup/DopNet/blob/6221b177c76048d356d0ea62ca5210772be55c1e/images/Picture6.png?raw=true)

In order to evaluate how effective a radar is at recognising gestures, this challenge provides data to which classification methods can be applied. A database of gestures has been created and uploaded here using the Ancortek radar system. This database includes signals from 4 different gesture types (Wave / Pinch / Click / Swipe). The data has been pre-processed: the signatures have been cut into individual actions from a long data stream, filtered to enhance the desired components and processed to produce Doppler vs. time data. The data is stored in this format so that it can be read in, features extracted and classification performed.

The Ancortek radar system used to generate the dataset is a 24 GHz FMCW radar that has a standalone GUI to control and capture data, or can be commanded from a MATLAB interface to capture signals. The system we used has one transmit antenna and two receive antennas (only one was used for the purposes of this dataset). It was set up on a lab bench at the same height as the gesture action. It was then initiated to capture 30 seconds of data, and the candidate repeated the actions numerous times within this window. Afterwards, the raw data was cut into the individual gestures that occurred over the whole period. These individual gesture actions have varying matrix sizes, hence a cell data format was used to create a ragged data cube. The data shared as part of this challenge was created by the following flow of processing (a sketch of which is given after the list):

1. De-interleave channels 1–2 and the I/Q samples.
2. Break the vector of samples into a 2D matrix of chirp vs. time.
3. FFT the samples to convert to the range domain, resulting in a range vs. time matrix (RTI).
4. Filter the signal such that static targets are suppressed and moving targets are highlighted. This is called MTI filtering in radar signal processing.
5. Extract the rows within the RTI that contain the gesture movement and coherently sum these.
6. Generate a Doppler vs. time 2D matrix by applying a Short Time Fourier Transform to the vector of selected samples.
7. Store the complex samples of the Doppler vs. time matrix within a larger cell array, which is a data cube of the N repeats of the 4 gestures from each person.
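The sketch below illustrates steps 2–6 in Python under stated assumptions; it is not the exact code used to produce the dataset. The function name, the two-pulse-canceller choice of MTI filter and the STFT settings are all assumptions made for illustration.

```python
import numpy as np
from scipy import signal

def doppler_time_matrix(iq, n_chirps, n_samples, gesture_bins, fs_slow):
    """Steps 2-6 of the processing chain, as a sketch.

    iq          : 1-D complex vector of de-interleaved I/Q samples (step 1 done)
    n_chirps    : number of chirps in the capture
    n_samples   : samples per chirp
    gesture_bins: range bins containing the gesture (step 5 selection)
    fs_slow     : chirp repetition frequency (slow-time sample rate, Hz)
    """
    # Step 2: break the sample vector into a 2D chirp vs. time matrix
    data = iq.reshape(n_chirps, n_samples)

    # Step 3: range FFT along fast time -> range vs. time (RTI) matrix
    rti = np.fft.fft(data, axis=1).T          # rows: range bins, cols: chirps

    # Step 4: simple two-pulse-canceller MTI filter along slow time,
    # suppressing static clutter while keeping moving targets
    rti_mti = rti[:, 1:] - rti[:, :-1]

    # Step 5: coherently sum the range bins that contain the gesture
    slow_time = rti_mti[gesture_bins, :].sum(axis=0)

    # Step 6: STFT of the slow-time signal -> Doppler vs. time matrix
    f, t, dop = signal.stft(slow_time, fs=fs_slow, nperseg=256,
                            noverlap=192, return_onesided=False)
    return np.fft.fftshift(dop, axes=0)       # centre zero Doppler

# Display in dB, normalised to the peak, as in the example scripts:
# x = doppler_time_matrix(iq, 3000, 1024, np.arange(5, 15), 1000.0)
# x_db = 20 * np.log10(np.abs(x) / np.abs(x).max())
```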
An example of a Doppler vs. time matrix for each gesture can be seen below:

![alt text](https://github.com/UCLRadarGroup/DopNet/blob/6221b177c76048d356d0ea62ca5210772be55c1e/images/Picture3-635x500%20(1).png?raw=true)

By eye, it is clear that these gestures look different from each other. The waving gesture has an oscillatory shape and a longer duration, whereas the click gesture happens over the shortest time frame (a click is only a short, sharp action). The pinch and swipe actions show some level of similarity, which could make them challenging for a classifier.

The training data we share is a set of Doppler vs. time signals stored in a cell format. This is a labelled dataset that can be used to create a classifier model. Separate MATLAB and Python scripts are shared to show users how to read this data. Within the whole database, there are 3052 files from 6 different people.
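Because the training cells are labelled, a classifier can be prototyped in a few lines once the data is read in. The sketch below is a minimal illustration, not a baseline: it reuses the cell indexing from `Code_to_read_Data_Python.py` (see the code examples further down), reduces each Doppler vs. time matrix to a handful of crude summary features, and fits a scikit-learn SVM. The feature set and classifier choice are assumptions made purely for illustration.

```python
import numpy as np
import scipy.io as sio
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def crude_features(x):
    """A few simple summary features of one Doppler vs. time matrix."""
    ratio = np.abs(x) / np.abs(x).max()
    mag = 20 * np.log10(np.maximum(ratio, 1e-6))        # dB, floored to avoid log(0)
    return [mag.mean(), mag.std(), x.shape[1],          # level stats + duration
            np.abs(x).sum(axis=1).argmax()]             # dominant Doppler bin

X, y = [], []
mat = sio.loadmat("Data_Per_PersonData_Training_Person_A.mat")
signals = mat["Data_Training"]["Doppler_Signals"][0][0][0]
for gesture in range(4):                 # 0=Wave, 1=Pinch, 2=Swipe, 3=Click
    for repeat in signals[gesture]:      # each cell holds one repeat's matrix
        X.append(crude_features(repeat[0]))
        y.append(gesture)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC().fit(X_tr, y_tr)
print(f"Hold-out accuracy: {clf.score(X_te, y_te):.2f}")
```

With features this crude the accuracy will be modest; the intent is only to show the read, extract and classify loop end to end.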
## UCL Radar Research Group

More detail on the UCL radar group can be found here:
[Radar Group Wiki](https://collab.ee.ucl.ac.uk/radar-research/doku.php)

There is also a YouTube channel showing some technical seminars from researchers within the group: [Radar Group YouTube Channel](https://www.youtube.com/channel/UCI5KZFVKvLPqntR5HMoDb_Q?view_as=subscriber)

## Acknowledgements

The data provided for this challenge is freely available to the public, with the complex data available upon request for use by universities or scientific bodies. Participants shall use the database only for non-commercial research and educational purposes. You are required to acknowledge UCL and the DopNet dataset in any publications or media that use the data.

The authors would like to thank Colin Horne, Dr. Riccardo Palamà, Alvaro Arenas Pingarron and Folker Hoffman for their support in data collection for this research.

## References

[1] https://atap.google.com/soli/

[2] F. Fioranelli, M. Ritchie, and H. Griffiths, “Centroid features for classification of armed/unarmed multiple personnel using multistatic human micro-Doppler,” IET Radar, Sonar Navig., vol. 10, no. 9, 2016.

[3] F. Fioranelli, M. Ritchie, S. Z. Gürbüz, and H. Griffiths, “Feature Diversity for Optimized Human Micro-Doppler Classification Using Multistatic Radar,” IEEE Trans. Aerosp. Electron. Syst., vol. 53, no. 2, 2017.

[4] D. Tahmoush and J. Silvious, “Remote detection of humans and animals,” Proc. – Appl. Imag. Pattern Recognit. Work., pp. 1–8, 2009. https://ieeexplore.ieee.org/document/5466303

[5] F. Fioranelli, M. Ritchie, and H. Griffiths, “Multistatic human micro-Doppler classification of armed/unarmed personnel,” IET Radar, Sonar Navig., vol. 9, no. 7, pp. 857–865, 2015.

[6] M. Ritchie, F. Fioranelli, H. Borrion, and H. Griffiths, “Multistatic Micro-Doppler Radar Features for Classification of Unloaded/Loaded Micro-Drones,” IET RSN, vol. 11, no. 1, Jan. 2017.

[7] V. C. Chen, F. Li, S.-S. Ho, and H. Wechsler, “Micro-Doppler effect in radar: phenomenon, model, and simulation study,” IEEE Trans. Aerosp. Electron. Syst., vol. 42, no. 1, pp. 2–21, 2006.
--------------------------------------------------------------------------------
/_config.yml:
--------------------------------------------------------------------------------
theme: jekyll-theme-cayman
--------------------------------------------------------------------------------
/data/Code Examples/Code_to_read_Data_Python.py:
--------------------------------------------------------------------------------
# This script displays a micro-Doppler spectrogram to help you
# familiarise yourself with the data.
# By choosing the person, gesture and repeat number, the corresponding
# sample is displayed as a spectrogram.

import os

import numpy as np
import scipy.io as sio
import matplotlib.pyplot as plt

# Here you must specify the sample you want to see
person = input("Please tell me the person letter (A, B, C, D, E or F): ")
gesture = input("Please tell me the gesture (Wave=0, Pinch=1, Swipe=2, Click=3): ")
sample = input("Please tell me the repeat number: ")

# Load the file; get the path from the folder containing the data
test = sio.loadmat(os.path.realpath("Data_Per_PersonData_Training_Person_" + person + ".mat"))

# Get the chosen gesture repeat (a complex Doppler vs. time matrix)
x = test["Data_Training"]["Doppler_Signals"][0][0][0][int(gesture)][int(sample)][0]

# Convert to dB, normalised to the peak magnitude
x = 20 * np.log10(abs(x) / np.amax(abs(x)))

# Display the spectrogram
plt.imshow(x, vmin=-50, vmax=0, cmap='jet', aspect='auto',
           extent=[0, x.shape[1], -501, 500])
plt.ylabel("Doppler", fontsize=17)
plt.xlabel("Time", fontsize=17)
plt.colorbar()
plt.show()
--------------------------------------------------------------------------------
/data/Code Examples/Code_to_read_Test_Data_Python.py:
--------------------------------------------------------------------------------
import os

import numpy as np
import scipy.io as sio
import matplotlib.pyplot as plt

Number = input("Please tell me the number of the sample you want to display: ")

# Load the test data file
test = sio.loadmat(os.path.realpath("Data_For_Test.mat"))

# Get the gesture (a complex Doppler vs. time matrix)
x = test["Data_rand"][int(Number)][0][0][0]

# Convert to dB, normalised to the peak magnitude
x = 20 * np.log10(abs(x) / np.amax(abs(x)))

# Display the spectrogram
plt.imshow(x, vmin=-50, vmax=0, cmap='jet', aspect='auto')
plt.colorbar()
plt.show()
--------------------------------------------------------------------------------
/data/Code Examples/MATLAB_Code_To_Read_Test_Data.m:
--------------------------------------------------------------------------------
% This part gets the directory where MATLAB is currently running
folder = pwd;
% This part loads the data
load(fullfile(folder, 'Data_For_Test.mat'))

% Choose one of the measurements
Repeat = input('Please tell me the repeat number: ');

% Get the data
x = Data_rand{Repeat,1}{1,1};

% Plot the data (in dB, normalised to the peak) to see what it looks like
figure; imagesc(20*log10(abs(x)./max(abs(x(:)))), [-30,0])
--------------------------------------------------------------------------------
/data/Code Examples/MATLAB_Code_To_Read_Training_Data.m:
--------------------------------------------------------------------------------
% This part gets the directory where MATLAB is currently running
folder = pwd;
% This part loads the data
Person = 'A'; % Choose the person letter
load(fullfile(folder, ['Data_Per_PersonData_Training_Person_', Person, '.mat']))

Gesture = 1; % Choose one of the gestures, which are in this order: 'Wave'=1, 'Pinch'=2, 'Swipe'=3, 'Click'=4

Repeat = 17; % Choose one of the measurements

x = Data_Training.Doppler_Signals{Gesture}{Repeat}; % Get the data

% Plot the data (in dB, normalised to the peak) to see what it looks like
figure; imagesc(20*log10(abs(x)./max(abs(x(:)))), [-30,0])
--------------------------------------------------------------------------------
/data/Data_Download_And_Details.md:
--------------------------------------------------------------------------------
# Data

The data can be found here:

https://figshare.com/s/e2d70817514f8e40e400?file=45297967

- Folders
  - "/Gesture Data/Test Data/"
  - "/Gesture Data/Training Data/"
--------------------------------------------------------------------------------
/images/22 (1).png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/UCLRadarGroup/DopNet/b1b456650757ca79d5f3c1447d8b5af4ae805b7f/images/22 (1).png
--------------------------------------------------------------------------------
/images/Images.txt:
--------------------------------------------------------------------------------
Some images used for the main pages
--------------------------------------------------------------------------------
/images/Picture3-635x500 (1).png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/UCLRadarGroup/DopNet/b1b456650757ca79d5f3c1447d8b5af4ae805b7f/images/Picture3-635x500 (1).png
--------------------------------------------------------------------------------
/images/Picture4.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/UCLRadarGroup/DopNet/b1b456650757ca79d5f3c1447d8b5af4ae805b7f/images/Picture4.png
--------------------------------------------------------------------------------
/images/Picture5.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/UCLRadarGroup/DopNet/b1b456650757ca79d5f3c1447d8b5af4ae805b7f/images/Picture5.png
--------------------------------------------------------------------------------
/images/Picture6.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/UCLRadarGroup/DopNet/b1b456650757ca79d5f3c1447d8b5af4ae805b7f/images/Picture6.png
--------------------------------------------------------------------------------
/images/Picture7.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/UCLRadarGroup/DopNet/b1b456650757ca79d5f3c1447d8b5af4ae805b7f/images/Picture7.png
--------------------------------------------------------------------------------
/images/Picture8.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/UCLRadarGroup/DopNet/b1b456650757ca79d5f3c1447d8b5af4ae805b7f/images/Picture8.png
--------------------------------------------------------------------------------