├── .DS_Store ├── .Rhistory ├── LICENSE ├── README.md ├── kinectPipeline.py └── pictures ├── DumpKinectSkeletonRelease.png ├── KinectStudio_skeleton.PNG ├── anatomicalPosition.png ├── capture_test.PNG ├── kinectjoints.png ├── kinectmodel.png ├── leftHandRule.png ├── planesOfMotion.png ├── powerShell.PNG ├── powerShell_navigate.PNG ├── sensor.png └── tpose.png /.DS_Store: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/stevenhirsch/Human-Motion-Analysis-with-Kinect-v2/d6355fb7c7aacd96cedaaea098eea68b4c71af2c/.DS_Store -------------------------------------------------------------------------------- /.Rhistory: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/stevenhirsch/Human-Motion-Analysis-with-Kinect-v2/d6355fb7c7aacd96cedaaea098eea68b4c71af2c/.Rhistory -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2021 Steven Hirsch 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 22 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Human Motion Analysis with Kinect v2 2 | 3 | 4 | 5 | ### Why? 6 | 7 | - I've noticed most studies or projects in this space only use the joint centers to compute planar projection angles, or the angles between vectors, and don't quantify 3D rotations in a similar way that researchers would if using something like Qualisys or Vicon. 8 | - People still use the Kinect since it's cheap, and you can probably find someone who has one lying around in a drawer somewhere (I got mine in exchange for a few gyros wraps). 9 | - These concepts and code can be adapted for the newer Kinect Azure cameras (from what I understand of those newer systems) and potentially other markerless motion capture systems. 10 | 11 | 12 | 13 | ### Main applications 14 | 15 | - Gross-level human motion analysis (e.g., strength and conditioning movement assessments). 
16 | 17 | - Mediolateral, Anteroposterior, and Vertical Axis rotations for the hip and knee joints (based on the International Society of Biomechanics (ISB) recommendations by [Wu and Cavanagh](https://www.ece.uvic.ca/~bctill/papers/mocap/Wu_Cavanagh_1995.pdf) (1995) and [Wu et al.](https://www.sciencedirect.com/science/article/pii/S0021929001002226?casa_token=2dAvjy8tpVsAAAAA:WMEtDcaXO0Jrg_GirWFR84c3w4etz7MzVceTSjGYivBhKOUqNOeWdkNDgGFfgN9g4q-3XHU43w) (2002). Also see [Research Methods in Biomechanics](https://www.amazon.ca/Research-Methods-Biomechanics-2nd-Gordon-Robertson/dp/0736093400) by Gordon Robertson et al. for more information) 18 | 19 | - Joint center tracking (25 total "joints", elaborated on more below). 20 | 21 | - Teaching tool for undergraduate or graduate students (depending on how much "raw" data is made available). 22 | 23 | - Can be scaled for undergraduate and graduate students in kinesiology, physiotherapy, chiropractic, engineering, or related disciplines. 24 | 25 | - Much easier (and cheaper) to first learn with a markerless system relative to a marker-based system! 26 | 27 | 28 | 29 | ### What this overview does not do 30 | 31 | - Provide a rigorous overview of the math and other skills required for collecting and processing motion data (for markerless or marker-based systems). 32 | 33 | - Some supplementary information is available for students or researchers who may want an additional refresher of the material or a better understanding of the computations without digging into the code right away. 34 | 35 | - Teach you how to appropriately analyze your data following its extraction. 36 | 37 | - Capture upper extremity joint angles based on ISB recommendations. 38 | 39 | - Operate as a diagnostic or screening tool. 40 | 41 | 42 | 43 | ## tl;dr 44 | 45 | - `.\DumpKinectSkeleton.exe --prefix=<file prefix>` is the command to collect the data with the Kinect. Make sure that a standing calibration (in anatomical position) is collected along with any motion trials. This has been cloned from the [DumpKinectSkeleton repository created by sebtoun](https://github.com/sebtoun/DumpKinectSkeleton) (Sebastien Andary). 46 | 47 | - Only runs on Windows 8 or later. 48 | 49 | - `python kinectPipeline.py -i <input csv> -c <calibration csv> -o <output csv>` is the command to take the csv outputs from the DumpKinectSkeleton command and create a new csv file with joint angles based on ISB recommendations. 50 | 51 | - Can be used on Windows, macOS, or Linux as long as you have Python 3.7 with Pandas, NumPy, SciPy, and PyQuaternion installed. 52 | 53 | 54 | 55 | ## 1 Getting started 56 | 57 | ### 1.1 Getting the Kinect SDK running on your computer 58 | 59 | Many existing guides explain how to get the Kinect "running" on your computer. I'll provide a brief overview of the software and hardware required and then provide links to other helpful resources below. You can read more on the [Xbox Support Website](https://support.xbox.com/en-CA/help/hardware-network/kinect/kinect-for-windows-v2-setup-with-adapter) as well. Essentially, though, all it involves is downloading the software development kit (SDK) and connecting the Kinect using an adaptor. 
60 | 61 | *Hardware Requirements (Minimum)* 62 | 63 | - USB 3.0 port 64 | - [Kinect adaptor /Power supply unit/ USB 3.0 cable combination](https://www.amazon.ca/Kinect-Adapter-Sensor-Windows-Version/dp/B07MDB1DQY/ref=sr_1_1_sspa?dchild=1&keywords=kinect+windows+adapter&qid=1618065435&sr=8-1-spons&psc=1&smid=A3K3D4LA1E20ZI&spLa=ZW5jcnlwdGVkUXVhbGlmaWVyPUFGWEk4RjE2RlEyQ0gmZW5jcnlwdGVkSWQ9QTA4ODY2NzkzMEhOVzMwWFpUWTdJJmVuY3J5cHRlZEFkSWQ9QTA3Njg0NDYyODcxTUdDR0pUU0lPJndpZGdldE5hbWU9c3BfYXRmJmFjdGlvbj1jbGlja1JlZGlyZWN0JmRvTm90TG9nQ2xpY2s9dHJ1ZQ==) (unless you have a Kinect v2 for Windows as this is part of the package) 65 | - 64-bit (x64) processor 66 | - Dual-core 3.2GHz or faster processor 67 | - 2GB RAM 68 | 69 | *Software Requirements* 70 | 71 | - Windows 8 or later 72 | 73 | - [Kinect SDK 2.0](https://www.microsoft.com/en-ca/download/details.aspx?id=44561) 74 | 75 | - Microsoft Visual Studio 2012 Express or later 76 | 77 | - .NET Framework 4.5 78 | 79 | - Python 3.7 (recommended to create an environment using [Miniconda](https://docs.conda.io/projects/continuumio-conda/en/latest/user-guide/install/macos.html) or [Anaconda](https://docs.anaconda.com/anaconda/install/) to run locally or copy and paste into Google Colab- only required for data processing pipeline, not the Kinect itself) 80 | - Pandas (1.1.5) 81 | - NumPy (1.19.5) 82 | - SciPy (1.4.1) 83 | - PyQuaternion (0.9.9) 84 | 85 | *Other Helpful Resources* 86 | 87 | - ["How to Connect a Kinect"](https://www.instructables.com/How-to-Connect-a-Kinect/) 88 | - [Set up Kinect for Windows v2 with a Kinect Adaptor for Windows 10 PC (Xbox Support website)](https://support.xbox.com/en-CA/help/hardware-network/kinect/kinect-for-windows-v2-setup-with-adapter) 89 | 90 | *What you should "see" before you move onto the next step* 91 | 92 | - After downloading the SDK and connecting the Kinect to your computer, you should be able to open Kinect Studio, click the connect button, and see something like this: 93 | 94 | ![](pictures/KinectStudio_skeleton.PNG) 95 | 96 | I also like using Kinect Studio as an initial "check" of the data collection space. 97 | 98 | 99 | 100 | ### 1.2 Getting the "raw" data 101 | 102 | The raw **joint centers**, **orientations**, **timestamps**, and **states** are collected using the [DumpKinectSkeleton repository created by sebtoun](https://github.com/sebtoun/DumpKinectSkeleton) (Sebastien Andary). Kinect also includes data about the colour and depth recorded from the camera and IR sensors, respectively. However, those are not necessary for our current purposes. I will highlight some other information/uncertainties regarding the data export that weren't contained in the original repo. 103 | 104 | First, I am not sure what Microsoft's default behaviour is regarding the "states". In other words, I'm not sure what the precise difference is between "tracked" (state = 2), "inferred" (state = 1), and "not tracked" (state=0). I've trusted Microsoft's algorithms (for now), but it's also possible to interpolate the data yourself if you don't trust it. If anyone has documentation about this, please let me know! 105 | 106 | Second, the "JointID" legend is as follows: 107 | 108 | ![](pictures/kinectjoints.png) 109 | 110 | More on this in future sections. 111 | 112 | 113 | 114 | Third, I want to expand upon using the package for those unfamiliar with using a command line interface. 
After downloading the latest release from the [DumpKinectSkeleton repository](https://github.com/sebtoun/DumpKinectSkeleton), located in the right column underneath the "About" section: 115 | 116 | ![](pictures/DumpKinectSkeletonRelease.png) 117 | 118 | unzip the folder and save it to another directory. I would suggest moving it into Documents, but you can place this folder anywhere you'll remember it. Then, you will need to open PowerShell. 119 | 120 | ![](pictures/powerShell.PNG) 121 | 122 | To navigate within PowerShell, you use the command `cd` followed by the pathname. For example, suppose you wanted to navigate to your Documents folder. In that case, you'd type `cd \Documents\` into the command prompt. To collect data, navigate to the `DumpKinectSkeleton-v3.1.0` directory. If you've placed this folder in your Documents folder, you can copy and paste: 123 | 124 | `cd .\Documents\DumpKinectSkeleton-v3.1.0\` 125 | 126 | into PowerShell: 127 | 128 | ![](pictures/powerShell_navigate.PNG) 129 | 130 | Hit enter to navigate to the appropriate directory. Next, use the command outlined in Sebastien's repo. The most common way I start collecting is with the command: 131 | 132 | `.\DumpKinectSkeleton.exe --prefix=pid_task` 133 | 134 | where `pid` refers to the anonymized participant ID and `task` refers to what I am collecting (e.g., squats). If you also want to export the video that the Kinect records, add `-v` to your command: 135 | 136 | `.\DumpKinectSkeleton.exe -v --prefix=pid_task` 137 | 138 | This additional argument is helpful if you want to do other qualitative analyses along with the quantitative analyses. The video is in a raw yuy2 format, so additional processing needs to be done to "view" this video file. There are other arguments that can be used along with this command, which are highlighted in the original repository. If everything is working properly, you should see this prompt in PowerShell after hitting enter: 139 | 140 | ![](pictures/capture_test.PNG) 141 | 142 | PowerShell will display information about the skeleton(s) found (only the first skeleton found is recorded), the capture rate (which is not consistent, as it depends on computer load and processing power; more on this later), and how to stop the data collection. As the prompt suggests, press control+c, x, or q to stop the recording. You should then find a csv file saved as `<prefix>_body.csv` in the `\DumpKinectSkeleton-v3.1.0` folder (the `_body` suffix is added automatically to all csv files). 143 | 144 | Once you've obtained the csv, you can either process it yourself or use the existing pipeline, which I will elaborate on below. 145 | 146 | 147 | 148 | ## 2 How is the data being processed? 149 | 150 | ### 2.1 What are the Kinect outputs? 151 | 152 | The joint position data is relatively straightforward to understand. It refers to the X-Y-Z components of each joint in the global coordinate system (GCS). As a reminder, here are the joint positions Kinect will automatically track: 153 | 154 | ![](pictures/kinectjoints.png) 155 | 156 | 157 | 158 | The GCS for Kinect is built using the IR sensor: 159 | 160 | 161 | 162 | ![](pictures/sensor.png) 163 | 164 | 165 | 166 | The origin of this system is located at the center of the IR sensor. The Z-axis points away from the sensor, the X-axis points to the sensor's left, and the Y-axis points upwards, which is dependent on the sensor's tilt. I believe Kinect includes an algorithm that can find the floor to correct tilt errors, but I've found that being proactive with the sensor alignment works well if you don't mind the origin remaining in the center of the IR sensor. All coordinates are measured in meters, and the recommended depth of a participant is between 1.5 and 4.0 m away from the sensor. Further information is provided in [Microsoft's SDK](https://docs.microsoft.com/en-us/previous-versions/windows/kinect/dn772836(v%3dieb.10)).
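If you want a quick, optional check that a recording respects that depth recommendation, a short Python sketch like the one below works. This is not part of the pipeline; the filename is a hypothetical example, and the column names (including their leading spaces) are the ones found in the csv that DumpKinectSkeleton writes and that `kinectPipeline.py` expects.

```python
# Optional sanity check: confirm the participant stayed within the recommended
# 1.5-4.0 m depth range. "pid_task_body.csv" is a hypothetical example filename.
import pandas as pd

df = pd.read_csv("pid_task_body.csv")       # csv written by DumpKinectSkeleton
spine_base = df[df[" jointType"] == 0]      # jointType 0 = SPINEBASE
depth = spine_base[" position.Z"]           # depth from the IR sensor, in meters

print(f"Depth range: {depth.min():.2f} m to {depth.max():.2f} m")
if depth.min() < 1.5 or depth.max() > 4.0:
    print("Warning: participant may have left the recommended 1.5-4.0 m range.")
```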
167 | 168 | 169 | 170 | The joint orientation data is a little more confusing given the lack of information provided in the [SDK](https://docs.microsoft.com/en-us/previous-versions/windows/kinect/dn758647(v=ieb.10)) and because the algorithms are not public knowledge. After a bit of digging and experimenting ([thank you wingcloud](https://social.msdn.microsoft.com/Forums/en-US/f2e6a544-705c-43ed-a0e1-731ad907b776/meaning-of-rotation-data-of-k4w-v2?forum=k4wv2devpreview)!), I'll explain how I got things to "click". The local coordinate system (LCS) for each joint is as follows: 171 | 172 | ![](pictures/kinectmodel.png) 173 | 174 | (again, all credit to wingcloud for putting this together). It seems odd at first (especially relative to traditional biomechanics software), but once you're made aware of how the default algorithm works it's not that bad. Assume one's reference posture is a T-Pose: 175 | 176 | ![](pictures/tpose.png) 177 | 178 | and the "home" joint is SPINEBASE. The LCS for the "home" joint is built by "mirroring" the GCS so that the LCS_Y is still vertical, but the LCS_Z now points in the negative GCS_Z direction and the LCS_X points in the negative GCS_X direction. All other LCSs are built by rotating the LCS_Y so it points in the "direction of travel" along the skeleton. For example, the LCS_Y for HIPRIGHT must point in the direction of travel (i.e., to the right), whereas the LCS_Y for HIPLEFT is oriented to the left (and you can probably see the "pattern" for the LCS_Y of every other joint). The X- and Z-axes are then rotated "along for the ride" as shown in the diagram above (the actual formula is provided in wingcloud's explanation in the link above; from the work I've done and what I've seen in other published studies, this is correct). Because of this, it's important that the person faces the Kinect as squarely as possible; otherwise, extra code is needed to "fix" these coordinate system offsets (there's only a single camera and sensor, so it's critical to use it as well as possible when collecting data). Since all these joints have LCSs oriented in different directions, it will be essential to make them all consistent prior to computing any joint angles (more on this below). 179 | 180 | 181 | 182 | The other thing to note is that all orientations are defined using [quaternions](https://en.wikipedia.org/wiki/Quaternion), where `orientation.W` represents the scalar component and `orientation.X`, `orientation.Y`, and `orientation.Z` represent the i, j, and k components, respectively. Although quaternions are nice mathematical objects for describing rotations, since they don't suffer from [gimbal lock](https://en.wikipedia.org/wiki/Gimbal_lock) or [Codman's paradox](https://pubmed.ncbi.nlm.nih.gov/9728676/) and are computationally efficient in comparison to rotation matrices, they are relatively difficult to interpret from a practical perspective. After all, we do these analyses for practitioners, and if a practitioner asked you to describe someone's knee angle during an assessment and you provided them with the quaternions for their shank and thigh, they'd probably look for a new analyst 😊. Although I'd argue that any 3D rotation is somewhat difficult to interpret practically, the next section describes how the quaternions are converted to standard "biomechanical" angles that are easier for a practitioner to interpret from an anatomical perspective they are more familiar with.
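To make that concrete before getting into the pipeline, here is a minimal sketch (with made-up numbers) that converts a single Kinect-style quaternion into a rotation matrix and a set of Cardan angles using SciPy, the same library the pipeline relies on. Note that SciPy's `from_quat` expects the scalar component last, whereas the Kinect data labels it `orientation.W`.

```python
# Illustration only: one made-up quaternion, expressed the way the Kinect csv
# stores it (orientation.W = scalar part; X, Y, Z = i, j, k parts).
import numpy as np
from scipy.spatial.transform import Rotation as R

w, x, y, z = 0.97, 0.24, 0.02, -0.05
r = R.from_quat([x, y, z, w])   # SciPy wants the scalar component last

print(np.round(r.as_matrix(), 3))                     # 3x3 orientation matrix
print(np.round(r.as_euler("xyz", degrees=True), 1))   # three Cardan angles, in degrees
```

The three angles on the last line are already far easier to reason about than the four quaternion components, which is essentially what the rest of the pipeline works towards.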
183 | 184 | 185 | 186 | ### 2.2 What is the pipeline doing? 187 | 188 | Only a high-level overview is provided in this document. Please refer to the other texts and papers for more detail on kinematic analyses. Briefly, the steps include reorienting the LCSs, resampling the data to a consistent 30 Hz, smoothing the quaternions, creating orientation matrices, normalizing these orientation matrices to an upright standing posture, transforming the proximal LCS into the distal LCS to get the rotation matrix of a joint, and then decomposing that rotation matrix into Cardan-Euler angles. I've intentionally computed things this way so the pipeline is similar to that used in many analyses with marker-based motion capture systems (however, this is much simpler to implement relative to marker-based systems, so it's great for learning). 189 | 190 | 191 | 192 | #### 2.2.1 Reorienting the LCSs 193 | 194 | The first thing that needs to be done is to reorient the LCSs so they all use the same coordinate system. Since the orientation data are currently quaternions (and thus `orientation.W` is dependent only on the angle of rotation and not the axis), it's as simple as swapping and negating `orientation.X`, `orientation.Y`, and `orientation.Z` to convert them to a new coordinate system (this is possible since Kinect uses a right-handed system for all rotations). Therefore, the axes are redefined so that the LCSs are consistent between joints and match those of the ISB recommendation papers. 195 | 196 | #### 2.2.2 Resampling the data to 30 Hz 197 | 198 | Since the Kinect does not collect at a consistent 30 Hz (its sampling rate is dependent on the current demands placed on the processor), the signals are resampled to a consistent 30 Hz: the positions using the Fourier method and the quaternions using spherical linear interpolation at the new time stamps. 199 | 200 | #### 2.2.3 Smoothing the Quaternions 201 | 202 | The raw quaternion outputs from Kinect are quite jittery, and thus need to be smoothed. To smooth this data, I adapted the [following MATLAB code](https://ww2.mathworks.cn/help/nav/ug/lowpass-filter-orientation-using-quaternion-slerp.html), whereby spherical linear interpolation (slerp) is used to lowpass filter the noisy quaternions. More information about how this works can be found at that link. 203 | 204 | #### 2.2.4 Creating orientation matrices 205 | 206 | After the LCSs are reoriented to match those of the ISB recommendations, orientation matrices for each segment are built from the quaternion data. [See this Wikipedia page](https://en.wikipedia.org/wiki/Quaternions_and_spatial_rotation) for more information pertaining to quaternions and spatial rotation. 207 | 208 | 209 | 210 | #### 2.2.5 Normalizing orientations to anatomical position 211 | 212 | It's common in most biomechanics work to define joint angles relative to anatomical position so that upright standing represents 0 degrees of rotation. This is done to assist with the interpretation of the calculated joint angles. Therefore, the first trial of any data collection session is usually a standing calibration, where the participant is recorded while standing in what is referred to as anatomical position: 213 | 214 | ![](pictures/anatomicalPosition.png) 215 | 216 | The normalized angle is computed using the orientation matrices from when the individual is in anatomical position, not by simply subtracting the joint angle measured in the calibration posture. This is because the computed joint angles are not vectors, and thus can't be subtracted from one another. There are other methods of normalizing joint angles, but this is the simplest to implement with the Kinect data.
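In code, the normalization is a single matrix product per frame. Here is a rough sketch of what `compute_relative_orientation()` in `kinectPipeline.py` does, with matrices made up for illustration:

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

# Made-up orientation matrices for one segment: during a movement trial and
# during the standing calibration.
R_segment = R.from_euler("xyz", [35.0, 4.0, -2.0], degrees=True).as_matrix()
R_cal = R.from_euler("xyz", [5.0, 3.0, -1.0], degrees=True).as_matrix()

# Normalization: express the segment's orientation relative to its calibration pose.
R_normalized = R_segment @ R_cal.T

# Applied to the calibration frame itself, the result is the identity (0 degrees).
print(np.round(R.from_matrix(R_cal @ R_cal.T).as_euler("xyz", degrees=True), 6))
```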
217 | 218 | 219 | 220 | #### 2.2.6 Transforming the proximal LCS to the distal LCS 221 | 222 | Transforming the proximal LCS to the distal LCS is how joint angles are defined in biomechanics. For example, if we're interested in computing the knee angle, we are assessing how much the shank has rotated relative to the thigh. Mathematically, this is described as multiplying the transpose of the orientation matrix of the thigh by the orientation matrix of the shank (because rotation matrices are orthogonal, the transpose is equal to the inverse, which simplifies the computations). Remember, though, we want to compute these relative to anatomical position (i.e., the "calibration" position), so these calculations are modified slightly to include those orientations. More detail is provided on page 57 of the Research Methods in Biomechanics textbook. 223 | 224 | 225 | 226 | One thing to note is that the vocabulary for computing rotations is a little different for Kinect than what most people with biomechanics training are likely used to. This is because Kinect refers to everything as "joints", whereas biomechanists usually refer to the segments themselves. Below is a terminology conversion table to translate between the two "languages" when computing hip and knee rotations: 227 | 228 | | Joint Angle | Device | Vocabulary | Data | 229 | | ----------- | ------ | ---------- | ------------ | 230 | | Hip | Kinect | Parent | Hip Center | 231 | | | | Child | Knee Center | 232 | | | Mocap | Reference | Pelvis | 233 | | | | Segment | Thigh | 234 | | Knee | Kinect | Parent | Knee Center | 235 | | | | Child | Ankle Center | 236 | | | Mocap | Reference | Thigh | 237 | | | | Segment | Shank | 238 | 239 | Therefore, the knee in Kinect is computed by expressing the ankle center orientation relative to the knee center orientation. It's slightly odd to use the joint centers when joint angles are really expressing the relationships between segments, but it's just the way that Kinect decided to name things. I'm assuming this nomenclature is easier to follow if you were trying to build animations for video games, which is what the Kinect was designed to do, so we just need to get used to the vocabulary when doing biomechanical analyses.
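As a small sketch of the transformation itself (mirroring `compute_joint_angle()` in `kinectPipeline.py`, but with made-up matrices), the knee is obtained by pre-multiplying the child (ankle center, i.e., shank) orientation by the transpose of the parent (knee center, i.e., thigh) orientation:

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

# Made-up, already-normalized orientation matrices.
R_thigh = R.from_euler("xyz", [20.0, 5.0, 2.0], degrees=True).as_matrix()   # Kinect "parent": knee center
R_shank = R.from_euler("xyz", [60.0, 2.0, -3.0], degrees=True).as_matrix()  # Kinect "child": ankle center

# Rotation of the shank expressed in the thigh's LCS, i.e., the knee joint matrix.
R_knee = R_thigh.T @ R_shank
print(np.round(R_knee, 3))
```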
240 | 241 | #### 2.2.7 Decomposing the rotation matrix 242 | 243 | Great! We've now computed a rotation matrix for the knee. We're at the same abstract point as before with the quaternions, though, in that the rotation matrix for a joint is not very helpful for a practitioner. Thankfully, these matrices can be decomposed into something a little more interpretable using Cardan-Euler angles. An important caveat is that matrix multiplication is not commutative (which is the precise way of saying that multiplying rotation matrices A\*B is not equivalent to B\*A, as it would be for the real numbers). Therefore, the order of rotations that we specify during this decomposition matters. This can be illustrated with the following example. Use your left hand to make a coordinate system like this: 244 | 245 | ![](pictures/leftHandRule.png) 246 | 247 | Grab your phone with your right hand and have your phone oriented so that the screen is facing upwards and the “top” of the phone is pointed forwards. Positive rotations are counterclockwise, and we will only use two rotations. Rotate your phone +90 deg around your thumb, and then +90 deg around your middle finger. Your phone screen should now be pointed towards you in landscape mode with the “top” pointed to the left. Now let’s change the order of operations. With your phone in the original position, rotate first +90 deg around your middle finger, and then +90 deg around your thumb. You are now in a different position in comparison to the rotations performed previously, with the phone now in portrait mode, the screen pointing to the right, and the “top” of the phone pointed upwards. 248 | 249 | 250 | 251 | So what is the appropriate order of rotations? The ISB recommendation is to decompose the rotations first about the mediolateral axis, then the anteroposterior axis, and finally the vertical axis. We use this order as it is equivalent to the joint coordinate system proposed by [Grood and Suntay (1983)](https://asmedigitalcollection.asme.org/biomechanical/article-abstract/105/2/136/397206/A-Joint-Coordinate-System-for-the-Clinical?redirectedFrom=fulltext) ([Cole et al., 1993](https://asmedigitalcollection.asme.org/biomechanical/article-abstract/115/4A/344/395760)). Since we don't have any predefined axes based on anatomical information with the Kinect (since it's a markerless system), this isn't a truly equivalent computation, but it's as close as we can get without using more sophisticated techniques to estimate the joint rotation axes. Using this convention, the first rotation corresponds to flexion-extension, the second rotation corresponds to abduction-adduction, and the third rotation refers to internal-external rotation of a joint. This roughly corresponds to the motions-in-planes interpretation of joint angles that therapists are used to dealing with. 252 | 253 | ![](pictures/planesOfMotion.png) 254 | 255 | It must be made clear, though, that **motion about axes is not equivalent to motion in planes**, even though we might talk about them that way (Zatsiorsky's Kinematics of Human Motion textbook is a great resource for more details pertaining to this and kinematic analysis in general). Further, these rotations are simplifications of the actual motions that occur at a joint. These simplifications are likely fine for the level of analysis that the Kinect is useful for, but these underlying assumptions should be made explicit to best understand what the data can and can't tell us. Again, caution is advised when interpreting these angles, especially for the second (abduction-adduction) and third (internal-external) rotations.
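The effect of the decomposition order is also easy to see numerically. Below is a small sketch (with made-up angles) that decomposes one and the same rotation using two different axis sequences; the pipeline uses SciPy's `'xyz'` sequence, i.e., mediolateral, then anteroposterior, then vertical:

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

r = R.from_euler("xyz", [45.0, 20.0, 10.0], degrees=True)   # an arbitrary 3D rotation

print(np.round(r.as_euler("xyz", degrees=True), 1))   # recovers [45. 20. 10.]
print(np.round(r.as_euler("zyx", degrees=True), 1))   # a different triplet for the same rotation
```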
256 | 257 | 258 | 259 | #### 2.2.8 Data export 260 | 261 | The Python pipeline will automatically perform these computations and export the data into a new csv file for you to explore. To install the pipeline, either clone this repository, download it as a ZIP (green "code" button at the top of the screen), or create a `kinectPipeline.py` file in a text editor and then copy and paste the code from this repository into that file. The command to run the pipeline in PowerShell (or Terminal if you want to process the data using macOS or Linux), assuming the python file is located in your current working directory, is: 262 | 263 | `python kinectPipeline.py -i <input csv> -c <calibration csv> -o <output csv>` 264 | 265 | where the relative filepath for the data you'd like to process is written after `-i`, the calibration trial after `-c`, and the resulting output filename after `-o`. For example, let's say I am analyzing a squat trial for a participant with the anonymized id of 12. Processing the data with the command may look something like this if you are using Mac or Linux (if you are working with Windows, simply swap the forward slashes for backslashes, as was shown in the examples above): 266 | 267 | `python kinectPipeline.py -i data/p12_squats_body.csv -c data/p12_standcal_body.csv -o processed_data/p12_jointangles.csv` 268 | 269 | This would take `p12_squats_body.csv` and `p12_standcal_body.csv` from the `data` directory and then save the resulting processed file, `p12_jointangles.csv`, into an existing directory named `processed_data`. Note that the syntax for file paths on macOS/Linux and Windows is slightly different, so the command above would need to be modified accordingly. 270 | 271 | If you're like me and will almost certainly forget the syntax for this command and need to refresh yourself, you can use the argument `-h` for help. 272 | 273 | The csv file from this export will include the frame number, the time stamp corresponding to the frame number (which has now been resampled), the mediolateral, anteroposterior, and vertical rotations of the hip and knee joints, and the global XYZ positions of the tracked joint centers (the hand, hand tip, thumb, and spine-shoulder joints are not exported).
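Once the export exists, it can be inspected like any other csv. For example, here is a quick peek at the file produced by the command above (the column names are the ones written by `kinectPipeline.py`):

```python
import pandas as pd

angles = pd.read_csv("processed_data/p12_jointangles.csv")   # example output from above

print(angles[["timeStamp", "r_kneeFlexion", "l_kneeFlexion"]].head())
print(f"Peak right knee flexion: {angles['r_kneeFlexion'].max():.1f} deg")
```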
274 | 275 | 276 | 277 | #### 2.2.9 Miscellaneous data collection tips 278 | 279 | - There is only one camera, so if any joints are occluded there will be larger errors in the resulting output. Do your best (without disturbing the "natural" motion of the participant) to ensure that the joints are easily visible to the camera for the entire collection. 280 | - Wait until the software finds the Kinect skeleton prior to initiating any movements. I would suggest providing a small buffer between the DumpKinectSkeleton command and the start of the movement trial. 281 | 282 | 283 | 284 | #### 2.2.10 Current omissions 285 | 286 | I've only included rotations for the hip and knee joints as they can be applied to gait analysis or other lower extremity exercises. Given that upper extremity motion is quantified a bit differently (a different rotation sequence is recommended than the one used in this pipeline) and it is less relevant to some of the "beginner" things I am exploring with this pipeline, I've omitted it. The principles outlined in this document should be helpful to guide future work and teaching projects for other upper extremity joints. 287 | 288 | 289 | 290 | #### 2.2.11 Fun things to try for those who are keen to explore further and refine your modelling skills 291 | 292 | - use only quaternions to get the same rotations about the mediolateral, anteroposterior, and vertical axes 293 | - use PCA to "correct" for the potential errors in the second and third rotations, as in [Baudet et al. (2014)](https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0102098) or [Skaro et al. (2021)](https://asmedigitalcollection.asme.org/biomechanical/article/143/5/054501/1096599?casa_token=MrUib7bkkdEAAAAA:ECwOFkkwHSiKUbQw7FXa7KvWmFnlCmMe9jzCVA1UbtV-BKGF4Sr5eXXNsie8vawP-pLnK6k) (note: the cross-talk errors are due to the difficulties in specifying the axes of rotation when using marker-based systems, whereas these axes are technically not defined using any anatomical information with the Kinect; having said this, PCA will almost certainly reduce the magnitude of the second and third rotations considerably with the Kinect) 294 | 295 | 296 | 297 | ## 3 Next Steps 298 | 299 | I have a few next steps planned for this repository. If you (or anyone you know) are interested in contributing to this repo or collaborating on some of these potential projects, please let me know! 300 | 301 | - (Recurrent) Neural Network to "correct" joint angle data relative to a multi-camera system (e.g., Qualisys, Vicon) 302 | - Validation of both the "raw" Kinect and an ML correction algorithm in comparison to gold-standard systems with more exercise tasks (e.g., squats, hinges, lunges, jumps) 303 | - Add joint angular velocities and accelerations 304 | -------------------------------------------------------------------------------- /kinectPipeline.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/python 2 | import sys, getopt 3 | import os 4 | import pandas as pd 5 | import numpy as np 6 | import pyquaternion as pyq 7 | from pyquaternion import Quaternion 8 | from scipy import signal 9 | from scipy.spatial.transform import Slerp 10 | from scipy.spatial.transform import Rotation as R 11 | 12 | 13 | def main(argv): 14 | inputfile = '' 15 | calfile = '' 16 | outputfile = '' 17 | try: 18 | opts, args = getopt.getopt(argv,"hi:c:o:",["ifile=", "cfile=","ofile="]) 19 | except getopt.GetoptError: 20 | print('kinectPipeline.py -i <input csv> -c <calibration csv> -o <output csv>') 21 | sys.exit(2) 22 | for opt, arg in opts: 23 | if opt == '-h': 24 | print('kinectPipeline.py -i <input csv> -c <calibration csv> -o <output csv>') 25 | sys.exit() 26 | elif opt in ("-i", "--ifile"): 27 | inputfile = arg 28 | elif opt in ("-c", "--cfile"): 29 | calfile = arg 30 | elif opt in ("-o", "--ofile"): 31 | outputfile = arg 32 | 33 | # Creating Functions 34 | def orientation_matrix(q0, q1, q2, q3): 35 | # based on https://automaticaddison.com/how-to-convert-a-quaternion-to-a-rotation-matrix/ 36 | r11 = 2 * (q0 ** 2 + q1 ** 2) - 1 37 | r12 = 2 * (q1 * q2 - q0 * q3) 38 | r13 = 2 * (q1 * q3 + q0 * q2) 39 | r21 = 2 * (q1 * q2 + q0 * q3) 40 | r22 = 2 * (q0 ** 2 + q2 ** 2) - 1 41 | r23 = 2 * (q2 * q3 - q0 * q1) 42 | r31 = 2 * (q1 * q3 - q0 * q2) 43 | r32 = 2 * (q2 * q3 + q0 * q1) 44 | r33 = 2 * (q0 ** 2 + q3 ** 2) - 1 45 | 46 | return r11, r12, r13, r21, r22, r23, r31, r32, r33 47 | 48 | def compute_relative_orientation(seg, cal): 49 | ''' 50 | Calculating the relative orientation between two matrices. 
This is used for the initial normalization 51 | procedure using the standing calibration 52 | ''' 53 | R_11 = np.array([]) 54 | R_12 = np.array([]) 55 | R_13 = np.array([]) 56 | R_21 = np.array([]) 57 | R_22 = np.array([]) 58 | R_23 = np.array([]) 59 | R_31 = np.array([]) 60 | R_32 = np.array([]) 61 | R_33 = np.array([]) 62 | 63 | for i in range(seg.shape[0]): 64 | segment = np.asmatrix([ 65 | [np.array(seg['o11'])[i], np.array(seg['o12'])[i], np.array(seg['o13'])[i]], 66 | [np.array(seg['o21'])[i], np.array(seg['o22'])[i], np.array(seg['o23'])[i]], 67 | [np.array(seg['o31'])[i], np.array(seg['o32'])[i], np.array(seg['o33'])[i]] 68 | ]) 69 | 70 | segment_cal = np.asmatrix([ 71 | [np.array(cal['o11'])[i], np.array(cal['o12'])[i], np.array(cal['o13'])[i]], 72 | [np.array(cal['o21'])[i], np.array(cal['o22'])[i], np.array(cal['o23'])[i]], 73 | [np.array(cal['o31'])[i], np.array(cal['o32'])[i], np.array(cal['o33'])[i]] 74 | ]) 75 | # normalization 76 | r = np.matmul(segment, segment_cal.T) 77 | 78 | new_orientations = np.asarray(r).reshape(-1) 79 | 80 | R_11 = np.append(R_11, new_orientations[0]) 81 | R_12 = np.append(R_12, new_orientations[1]) 82 | R_13 = np.append(R_13, new_orientations[2]) 83 | R_21 = np.append(R_21, new_orientations[3]) 84 | R_22 = np.append(R_22, new_orientations[4]) 85 | R_23 = np.append(R_23, new_orientations[5]) 86 | R_31 = np.append(R_31, new_orientations[6]) 87 | R_32 = np.append(R_32, new_orientations[7]) 88 | R_33 = np.append(R_33, new_orientations[8]) 89 | 90 | return R_11, R_12, R_13, R_21, R_22, R_23, R_31, R_32, R_33 91 | 92 | def compute_joint_angle(df, child, parent): 93 | c = df[df[' jointType'] == child] 94 | p = df[df[' jointType'] == parent] 95 | ml = np.array([]) 96 | ap = np.array([]) 97 | v = np.array([]) 98 | 99 | # Compute Rotation Matrix Components 100 | for i in range(c.shape[0]): 101 | segment = np.asmatrix([ 102 | [np.array(c['n_o11'])[i], np.array(c['n_o12'])[i], np.array(c['n_o13'])[i]], 103 | [np.array(c['n_o21'])[i], np.array(c['n_o22'])[i], np.array(c['n_o23'])[i]], 104 | [np.array(c['n_o31'])[i], np.array(c['n_o32'])[i], np.array(c['n_o33'])[i]] 105 | ]) 106 | 107 | reference_segment = np.asmatrix([ 108 | [np.array(p['n_o11'])[i], np.array(p['n_o12'])[i], np.array(p['n_o13'])[i]], 109 | [np.array(p['n_o21'])[i], np.array(p['n_o22'])[i], np.array(p['n_o23'])[i]], 110 | [np.array(p['n_o31'])[i], np.array(p['n_o32'])[i], np.array(p['n_o33'])[i]] 111 | ]) 112 | 113 | # transformation of segment to reference segment 114 | r = np.matmul(reference_segment.T, segment) 115 | # decomposition to Euler angles 116 | rotations = R.from_matrix(r).as_euler('xyz', degrees=True) 117 | ml = np.append(ml, rotations[0]) 118 | ap = np.append(ap, rotations[1]) 119 | v = np.append(v, rotations[2]) 120 | 121 | return ml, ap, v 122 | 123 | def resample_df(d, new_freq=30, method='linear'): 124 | # Resamples data at 30Hz unless otherwise specified 125 | joints_without_quats = [3, 15, 19, 21, 22, 23, 24] 126 | resampled_df = pd.DataFrame( 127 | columns=['# timestamp', ' jointType', ' orientation.X', ' orientation.Y', ' orientation.Z', 128 | ' orientation.W', ' position.X', ' position.Y', ' position.Z']) 129 | new_df = pd.DataFrame() 130 | for i in d[' jointType'].unique(): 131 | current_df = d.loc[d[' jointType'] == i].copy() 132 | old_times = np.array(current_df['# timestamp']) 133 | new_times = np.arange(min(current_df['# timestamp']), max(current_df['# timestamp']), 1 / new_freq) 134 | 135 | o_x = np.array(current_df[' orientation.X']) 136 | o_y = 
np.array(current_df[' orientation.Y']) 137 | o_z = np.array(current_df[' orientation.Z']) 138 | o_w = np.array(current_df[' orientation.W']) 139 | 140 | p_x = np.array(current_df[' position.X']) 141 | p_y = np.array(current_df[' position.Y']) 142 | p_z = np.array(current_df[' position.Z']) 143 | 144 | if i in joints_without_quats: 145 | orientation_x = np.repeat(0.0, len(new_times)) 146 | orientation_y = np.repeat(0.0, len(new_times)) 147 | orientation_z = np.repeat(0.0, len(new_times)) 148 | orientation_w = np.repeat(0.0, len(new_times)) 149 | else: 150 | if method == "linear": 151 | orientation_x = np.interp(new_times, old_times, o_x) 152 | orientation_y = np.interp(new_times, old_times, o_y) 153 | orientation_z = np.interp(new_times, old_times, o_z) 154 | orientation_w = np.interp(new_times, old_times, o_w) 155 | elif method == 'slerp': 156 | quats = [] 157 | for t in range(len(old_times)): 158 | quats.append([o_x[t], o_y[t], o_z[t], o_w[t]]) 159 | # Create rotation object 160 | quats_object = R.from_quat(quats) 161 | 162 | # Spherical Linear Interpolation 163 | slerp = Slerp(np.array(current_df['# timestamp']), quats_object) 164 | interp_rots = slerp(new_times) 165 | new_quats = interp_rots.as_quat() 166 | 167 | # Create new orientation objects 168 | orientation_x = np.array([item[0] for item in new_quats]) 169 | orientation_y = np.array([item[1] for item in new_quats]) 170 | orientation_z = np.array([item[2] for item in new_quats]) 171 | orientation_w = np.array([item[3] for item in new_quats]) 172 | else: 173 | raise ValueError("Method must be either linear or spherical (slerp) interpolation.") 174 | 175 | position_x = signal.resample(p_x, num=int(max(current_df['# timestamp']) * new_freq)) 176 | position_y = signal.resample(p_y, num=int(max(current_df['# timestamp']) * new_freq)) 177 | position_z = signal.resample(p_z, num=int(max(current_df['# timestamp']) * new_freq)) 178 | 179 | new_df['# timestamp'] = pd.Series(new_times) 180 | new_df[' jointType'] = pd.Series(np.repeat(i, len(new_times))) 181 | 182 | new_df[' orientation.X'] = pd.Series(orientation_x) 183 | new_df[' orientation.Y'] = pd.Series(orientation_y) 184 | new_df[' orientation.Z'] = pd.Series(orientation_z) 185 | new_df[' orientation.W'] = pd.Series(orientation_w) 186 | 187 | new_df[' position.X'] = pd.Series(position_x) 188 | new_df[' position.Y'] = pd.Series(position_y) 189 | new_df[' position.Z'] = pd.Series(position_z) 190 | 191 | resampled_df = resampled_df.append(new_df, ignore_index=True) 192 | 193 | return resampled_df 194 | 195 | def smooth_rotations(o_x, o_y, o_z, o_w): 196 | o_x = np.array(o_x) 197 | o_y = np.array(o_y) 198 | o_z = np.array(o_z) 199 | o_w = np.array(o_w) 200 | trajNoisy = [] 201 | for i in range(len(o_x)): 202 | trajNoisy.append([o_x[i], o_y[i], o_z[i], o_w[i]]) 203 | 204 | trajNoisy = np.array(trajNoisy) 205 | # This code was adapted from https://ww2.mathworks.cn/help/nav/ug/lowpass-filter-orientation-using-quaternion-slerp.html 206 | 207 | # As explained in the link above, "The interpolation parameter to slerp is in the closed-interval [0,1], so the output of dist 208 | # must be re-normalized to this range. However, the full range of [0,1] for the interpolation parameter gives poor performance, 209 | # so it is limited to a smaller range hrange centered at hbias." 
210 | hrange = 0.4 211 | hbias = 0.4 212 | low = max(min(hbias - (hrange / 2), 1), 0) 213 | high = max(min(hbias + (hrange / 2), 1), 0) 214 | hrangeLimited = high - low 215 | 216 | # initial filter state is the quaternion at frame 0 217 | y = trajNoisy[0] 218 | qout = [] 219 | 220 | for i in range(1, len(trajNoisy)): 221 | x = trajNoisy[i] 222 | # x = mathutils.Quaternion(x) 223 | # y = mathutils.Quaternion(y) 224 | # d = x.rotation_difference(y).angle 225 | x = pyq.Quaternion(x) 226 | y = pyq.Quaternion(y) 227 | d = (x.conjugate * y).angle 228 | # Renormalize dist output to the range [low, high] 229 | 230 | hlpf = (d / np.pi) * hrangeLimited + low 231 | # y = y.slerp(x, hlpf) 232 | y = Quaternion.slerp(y, x, hlpf).elements 233 | qout.append(np.array(y)) 234 | 235 | # because a frame of data is lost during this process, I've (arbitrarily) decided to append an extra quaternion at the end of the trial 236 | # that is identical to the n-1th frame. This keeps the length consistent (so there is no issues with merging later) and should not 237 | # negatively impact the data since the last frame is rarely of interest (and the data collector can decide to collect for a split second 238 | # after their trial of interest has completed to attenuate any of these "errors" that may propogate in the analyses) 239 | qout.append(qout[int(len(qout) - 1)]) 240 | 241 | orientation_x = [item[0] for item in qout] 242 | orientation_y = [item[1] for item in qout] 243 | orientation_z = [item[2] for item in qout] 244 | orientation_w = [item[3] for item in qout] 245 | 246 | return orientation_x, orientation_y, orientation_z, orientation_w 247 | 248 | def smooth_quaternions(d): 249 | for i in d[' jointType'].unique(): 250 | current_df = d.loc[d[' jointType'] == i].copy() 251 | 252 | current_df[' orientation.X'], current_df[' orientation.Y'], current_df[' orientation.Z'], current_df[ 253 | ' orientation.W'] = smooth_rotations(current_df[' orientation.X'], current_df[' orientation.Y'], 254 | current_df[' orientation.Z'], current_df[' orientation.W']) 255 | 256 | d[d[' jointType'] == i] = current_df 257 | 258 | return d 259 | 260 | def compute_segment_angle(df, SEGMENT): 261 | s = df[df[' jointType'] == SEGMENT] 262 | ml = np.array([]) 263 | ap = np.array([]) 264 | v = np.array([]) 265 | 266 | # Compute Rotation Matrix Components 267 | for i in range(s.shape[0]): 268 | segment = np.asmatrix([ 269 | [np.array(s['n_o11'])[i], np.array(s['n_o12'])[i], np.array(s['n_o13'])[i]], 270 | [np.array(s['n_o21'])[i], np.array(s['n_o22'])[i], np.array(s['n_o23'])[i]], 271 | [np.array(s['n_o31'])[i], np.array(s['n_o32'])[i], np.array(s['n_o33'])[i]] 272 | ]) 273 | 274 | # decomposition to Euler angles 275 | rotations = R.from_matrix(segment).as_euler('xyz', degrees=True) 276 | ml = np.append(ml, rotations[0]) 277 | ap = np.append(ap, rotations[1]) 278 | v = np.append(v, rotations[2]) 279 | 280 | return ml, ap, v 281 | 282 | dir = os.getcwd() 283 | 284 | # Loading Data 285 | print('... Loading data') 286 | cal = pd.read_csv(os.path.join(dir, calfile)) 287 | df = pd.read_csv(os.path.join(dir, inputfile)) 288 | df['# timestamp'] = df['# timestamp'] * 10 ** -3 289 | cal['# timestamp'] = cal['# timestamp'] * 10 ** -3 290 | 291 | df_reoriented = df.copy() 292 | cal_reoriented = cal.copy() 293 | print('... 
Reorienting LCSs') 294 | # Hips 295 | df_reoriented.loc[df[' jointType'] == 16, ' orientation.X'] = df.loc[df[' jointType'] == 16, ' orientation.Z'] 296 | df_reoriented.loc[df[' jointType'] == 16, ' orientation.Y'] = df.loc[df[' jointType'] == 16, ' orientation.X'] 297 | df_reoriented.loc[df[' jointType'] == 16, ' orientation.Z'] = df.loc[df[' jointType'] == 16, ' orientation.Y'] 298 | 299 | cal_reoriented.loc[cal[' jointType'] == 16, ' orientation.X'] = cal.loc[cal[' jointType'] == 16, ' orientation.Z'] 300 | cal_reoriented.loc[cal[' jointType'] == 16, ' orientation.Y'] = cal.loc[cal[' jointType'] == 16, ' orientation.X'] 301 | cal_reoriented.loc[cal[' jointType'] == 16, ' orientation.Z'] = cal.loc[cal[' jointType'] == 16, ' orientation.Y'] 302 | 303 | df_reoriented.loc[df[' jointType'] == 12, ' orientation.X'] = df.loc[df[' jointType'] == 12, ' orientation.Z'] 304 | df_reoriented.loc[df[' jointType'] == 12, ' orientation.Y'] = df.loc[df[' jointType'] == 12, ' orientation.X'] * -1 305 | df_reoriented.loc[df[' jointType'] == 12, ' orientation.Z'] = df.loc[df[' jointType'] == 12, ' orientation.Y'] * -1 306 | 307 | cal_reoriented.loc[cal[' jointType'] == 12, ' orientation.X'] = cal.loc[cal[' jointType'] == 12, ' orientation.Z'] 308 | cal_reoriented.loc[cal[' jointType'] == 12, ' orientation.Y'] = cal.loc[cal[' jointType'] == 12, ' orientation.X'] * -1 309 | cal_reoriented.loc[cal[' jointType'] == 12, ' orientation.Z'] = cal.loc[cal[' jointType'] == 12, ' orientation.Y'] * -1 310 | 311 | # Knees 312 | df_reoriented.loc[df[' jointType'] == 17, ' orientation.X'] = df.loc[df[' jointType'] == 17, ' orientation.X'] * -1 313 | df_reoriented.loc[df[' jointType'] == 17, ' orientation.Y'] = df.loc[df[' jointType'] == 17, ' orientation.Y'] * -1 314 | df_reoriented.loc[df[' jointType'] == 17, ' orientation.Z'] = df.loc[df[' jointType'] == 17, ' orientation.Z'] 315 | 316 | cal_reoriented.loc[cal[' jointType'] == 17, ' orientation.X'] = cal.loc[cal[' jointType'] == 17, ' orientation.X'] * -1 317 | cal_reoriented.loc[cal[' jointType'] == 17, ' orientation.Y'] = cal.loc[cal[' jointType'] == 17, ' orientation.Y'] * -1 318 | cal_reoriented.loc[cal[' jointType'] == 17, ' orientation.Z'] = cal.loc[cal[' jointType'] == 17, ' orientation.Z'] 319 | 320 | df_reoriented.loc[df[' jointType'] == 13, ' orientation.X'] = df.loc[df[' jointType'] == 13, ' orientation.X'] 321 | df_reoriented.loc[df[' jointType'] == 13, ' orientation.Y'] = df.loc[df[' jointType'] == 13, ' orientation.Y'] * -1 322 | df_reoriented.loc[df[' jointType'] == 13, ' orientation.Z'] = df.loc[df[' jointType'] == 13, ' orientation.Z'] * -1 323 | 324 | cal_reoriented.loc[cal[' jointType'] == 13, ' orientation.X'] = cal.loc[cal[' jointType'] == 13, ' orientation.X'] 325 | cal_reoriented.loc[cal[' jointType'] == 13, ' orientation.Y'] = cal.loc[cal[' jointType'] == 13, ' orientation.Y'] * -1 326 | cal_reoriented.loc[cal[' jointType'] == 13, ' orientation.Z'] = cal.loc[cal[' jointType'] == 13, ' orientation.Z'] * -1 327 | 328 | # Ankles 329 | df_reoriented.loc[df[' jointType'] == 18, ' orientation.X'] = df.loc[df[' jointType'] == 18, ' orientation.X'] * -1 330 | df_reoriented.loc[df[' jointType'] == 18, ' orientation.Y'] = df.loc[df[' jointType'] == 18, ' orientation.Y'] * -1 331 | df_reoriented.loc[df[' jointType'] == 18, ' orientation.Z'] = df.loc[df[' jointType'] == 18, ' orientation.Z'] 332 | 333 | cal_reoriented.loc[cal[' jointType'] == 18, ' orientation.X'] = cal.loc[cal[' jointType'] == 18, ' orientation.X'] * -1 334 | cal_reoriented.loc[cal[' 
jointType'] == 18, ' orientation.Y'] = cal.loc[cal[' jointType'] == 18, ' orientation.Y'] * -1 335 | cal_reoriented.loc[cal[' jointType'] == 18, ' orientation.Z'] = cal.loc[cal[' jointType'] == 18, ' orientation.Z'] 336 | 337 | df_reoriented.loc[df[' jointType'] == 14, ' orientation.X'] = df.loc[df[' jointType'] == 14, ' orientation.X'] 338 | df_reoriented.loc[df[' jointType'] == 14, ' orientation.Y'] = df.loc[df[' jointType'] == 14, ' orientation.Y'] * -1 339 | df_reoriented.loc[df[' jointType'] == 14, ' orientation.Z'] = df.loc[df[' jointType'] == 14, ' orientation.Z'] * -1 340 | 341 | cal_reoriented.loc[cal[' jointType'] == 14, ' orientation.X'] = cal.loc[cal[' jointType'] == 14, ' orientation.X'] 342 | cal_reoriented.loc[cal[' jointType'] == 14, ' orientation.Y'] = cal.loc[cal[' jointType'] == 14, ' orientation.Y'] * -1 343 | cal_reoriented.loc[cal[' jointType'] == 14, ' orientation.Z'] = cal.loc[cal[' jointType'] == 14, ' orientation.Z'] * -1 344 | 345 | # Resampling data to 30Hz 346 | df_reoriented = resample_df(df_reoriented, new_freq=30, method='slerp') 347 | # Smooth Quaternion Rotations 348 | df_reoriented = smooth_quaternions(df_reoriented) 349 | # need to re-sort and reset the index following the resampling 350 | df_reoriented = df_reoriented.sort_values(by=['# timestamp', ' jointType']).reset_index() 351 | 352 | df_reoriented['o11'], df_reoriented['o12'], df_reoriented['o13'], df_reoriented['o21'], df_reoriented['o22'], \ 353 | df_reoriented['o23'], df_reoriented['o31'], df_reoriented['o32'], df_reoriented['o33'] \ 354 | = orientation_matrix(df_reoriented[' orientation.W'], df_reoriented[' orientation.X'], 355 | df_reoriented[' orientation.Y'], df_reoriented[' orientation.Z']) 356 | 357 | cal_reoriented['o11'], cal_reoriented['o12'], cal_reoriented['o13'], cal_reoriented['o21'], cal_reoriented['o22'], \ 358 | cal_reoriented['o23'], cal_reoriented['o31'], cal_reoriented['o32'], cal_reoriented['o33'] \ 359 | = orientation_matrix(cal_reoriented[' orientation.W'], cal_reoriented[' orientation.X'], 360 | cal_reoriented[' orientation.Y'], cal_reoriented[' orientation.Z']) 361 | 362 | df_reoriented.set_index(' jointType', inplace=True) 363 | cal_reoriented.set_index(' jointType', inplace=True) 364 | cal_reoriented = cal_reoriented.groupby(' jointType').mean().drop(columns=['# timestamp']) 365 | cal_reoriented = pd.concat([cal_reoriented] * np.int64(df_reoriented.shape[0] / 25)) 366 | 367 | print('... Normalizing to calibration pose') 368 | # Normalize orientations to calibration pose 369 | df_reoriented['n_o11'], df_reoriented['n_o12'], df_reoriented['n_o13'], df_reoriented['n_o21'], df_reoriented[ 370 | 'n_o22'], \ 371 | df_reoriented['n_o23'], df_reoriented['n_o31'], df_reoriented['n_o32'], df_reoriented['n_o33'] \ 372 | = np.array(compute_relative_orientation(cal_reoriented, df_reoriented)) 373 | 374 | df_reoriented.reset_index(inplace=True) 375 | 376 | print('... 
Computing joint angles') 377 | r_hipFlexion, r_hipAbduction, r_hipV = compute_joint_angle(df_reoriented, child=17, parent=16) 378 | l_hipFlexion, l_hipAbduction, l_hipV = compute_joint_angle(df_reoriented, child=13, parent=12) 379 | r_kneeFlexion, r_kneeAbduction, r_kneeV = compute_joint_angle(df_reoriented, child=18, parent=17) 380 | l_kneeFlexion, l_kneeAbduction, l_kneeV = compute_joint_angle(df_reoriented, child=14, parent=13) 381 | 382 | # Note that 16 or 12 can be used for the pelvis (given Kinect's definitions) 383 | pelvis_rotation = compute_segment_angle(df_reoriented, 16)[0] 384 | r_thigh_rotation = compute_segment_angle(df_reoriented, 17)[0] 385 | l_thigh_rotation = compute_segment_angle(df_reoriented, 13)[0] 386 | r_shank_rotation = compute_segment_angle(df_reoriented, 18)[0] 387 | l_shank_rotation = compute_segment_angle(df_reoriented, 14)[0] 388 | 389 | new_df = pd.DataFrame({ 390 | 'frame': np.arange(df_reoriented['# timestamp'].unique().shape[0]), 391 | 'timeStamp': df_reoriented['# timestamp'].unique(), 392 | # Below are adjusted for relatively easy anatomical interpretations 393 | 'r_hipFlexion' : r_hipFlexion, 394 | 'l_hipFlexion' : l_hipFlexion*-1, 395 | 'r_hipAbduction' : r_hipAbduction*-1, 396 | 'l_hipAbduction' : l_hipAbduction, 397 | 'r_hipV' : r_hipV *-1, 398 | 'l_hipV' : l_hipV *-1, 399 | 'r_kneeFlexion' : r_kneeFlexion*-1, 400 | 'l_kneeFlexion' : l_kneeFlexion, 401 | 'r_kneeAdduction' : r_kneeAbduction, 402 | 'l_kneeAdduction' : l_kneeAbduction*-1, 403 | 'r_kneeV' : r_kneeV*-1, 404 | 'l_kneeV' : l_kneeV, 405 | # Below are adjusted specifically for use with relative phase analyses 406 | 'pelvis_rotation': pelvis_rotation, 407 | 'r_thigh_rotation': r_thigh_rotation, 408 | 'l_thigh_rotation': l_thigh_rotation*-1, 409 | 'r_shank_rotation': r_shank_rotation, 410 | 'l_shank_rotation': l_shank_rotation*-1, 411 | # Below are left in the GCS 412 | 'r_hip_x': np.array(df_reoriented[df_reoriented[' jointType'] == 16][' position.X']), 413 | 'r_hip_y': np.array(df_reoriented[df_reoriented[' jointType'] == 16][' position.Y']), 414 | 'r_hip_z': np.array(df_reoriented[df_reoriented[' jointType'] == 16][' position.Z']), 415 | 'l_hip_x': np.array(df_reoriented[df_reoriented[' jointType'] == 12][' position.X']), 416 | 'l_hip_y': np.array(df_reoriented[df_reoriented[' jointType'] == 12][' position.Y']), 417 | 'l_hip_z': np.array(df_reoriented[df_reoriented[' jointType'] == 12][' position.Z']), 418 | 'r_knee_x': np.array(df_reoriented[df_reoriented[' jointType'] == 17][' position.X']), 419 | 'r_knee_y': np.array(df_reoriented[df_reoriented[' jointType'] == 17][' position.Y']), 420 | 'r_knee_z': np.array(df_reoriented[df_reoriented[' jointType'] == 17][' position.Z']), 421 | 'l_knee_x': np.array(df_reoriented[df_reoriented[' jointType'] == 13][' position.X']), 422 | 'l_knee_y': np.array(df_reoriented[df_reoriented[' jointType'] == 13][' position.Y']), 423 | 'l_knee_z': np.array(df_reoriented[df_reoriented[' jointType'] == 13][' position.Z']), 424 | 'r_ankle_x': np.array(df_reoriented[df_reoriented[' jointType'] == 18][' position.X']), 425 | 'r_ankle_y': np.array(df_reoriented[df_reoriented[' jointType'] == 18][' position.Y']), 426 | 'r_ankle_z': np.array(df_reoriented[df_reoriented[' jointType'] == 18][' position.Z']), 427 | 'l_ankle_x': np.array(df_reoriented[df_reoriented[' jointType'] == 14][' position.X']), 428 | 'l_ankle_y': np.array(df_reoriented[df_reoriented[' jointType'] == 14][' position.Y']), 429 | 'l_ankle_z': np.array(df_reoriented[df_reoriented[' jointType'] == 14][' 
position.Z']), 430 | 'r_foot_x': np.array(df_reoriented[df_reoriented[' jointType'] == 19][' position.X']), 431 | 'r_foot_y': np.array(df_reoriented[df_reoriented[' jointType'] == 19][' position.Y']), 432 | 'r_foot_z': np.array(df_reoriented[df_reoriented[' jointType'] == 19][' position.Z']), 433 | 'l_foot_x': np.array(df_reoriented[df_reoriented[' jointType'] == 15][' position.X']), 434 | 'l_foot_y': np.array(df_reoriented[df_reoriented[' jointType'] == 15][' position.Y']), 435 | 'l_foot_z': np.array(df_reoriented[df_reoriented[' jointType'] == 15][' position.Z']), 436 | 'spinebase_x': np.array(df_reoriented[df_reoriented[' jointType'] == 0][' position.X']), 437 | 'spinebase_y': np.array(df_reoriented[df_reoriented[' jointType'] == 0][' position.Y']), 438 | 'spinebase_z': np.array(df_reoriented[df_reoriented[' jointType'] == 0][' position.Z']), 439 | 'spinemid_x': np.array(df_reoriented[df_reoriented[' jointType'] == 1][' position.X']), 440 | 'spinemid_y': np.array(df_reoriented[df_reoriented[' jointType'] == 1][' position.Y']), 441 | 'spinemid_z': np.array(df_reoriented[df_reoriented[' jointType'] == 1][' position.Z']), 442 | 'neck_x': np.array(df_reoriented[df_reoriented[' jointType'] == 2][' position.X']), 443 | 'neck_y': np.array(df_reoriented[df_reoriented[' jointType'] == 2][' position.Y']), 444 | 'neck_z': np.array(df_reoriented[df_reoriented[' jointType'] == 2][' position.Z']), 445 | 'head_x': np.array(df_reoriented[df_reoriented[' jointType'] == 3][' position.X']), 446 | 'head_y': np.array(df_reoriented[df_reoriented[' jointType'] == 3][' position.Y']), 447 | 'head_z': np.array(df_reoriented[df_reoriented[' jointType'] == 3][' position.Z']), 448 | 'r_shoulder_x': np.array(df_reoriented[df_reoriented[' jointType'] == 8][' position.X']), 449 | 'r_shoulder_y': np.array(df_reoriented[df_reoriented[' jointType'] == 8][' position.Y']), 450 | 'r_shoulder_z': np.array(df_reoriented[df_reoriented[' jointType'] == 8][' position.Z']), 451 | 'r_elbow_x': np.array(df_reoriented[df_reoriented[' jointType'] == 9][' position.X']), 452 | 'r_elbow_y': np.array(df_reoriented[df_reoriented[' jointType'] == 9][' position.Y']), 453 | 'r_elbow_z': np.array(df_reoriented[df_reoriented[' jointType'] == 9][' position.Z']), 454 | 'r_wrist_x': np.array(df_reoriented[df_reoriented[' jointType'] == 10][' position.X']), 455 | 'r_wrist_y': np.array(df_reoriented[df_reoriented[' jointType'] == 10][' position.Y']), 456 | 'r_wrist_z': np.array(df_reoriented[df_reoriented[' jointType'] == 10][' position.Z']), 457 | 'l_shoulder_x': np.array(df_reoriented[df_reoriented[' jointType'] == 4][' position.X']), 458 | 'l_shoulder_y': np.array(df_reoriented[df_reoriented[' jointType'] == 4][' position.Y']), 459 | 'l_shoulder_z': np.array(df_reoriented[df_reoriented[' jointType'] == 4][' position.Z']), 460 | 'l_elbow_x': np.array(df_reoriented[df_reoriented[' jointType'] == 5][' position.X']), 461 | 'l_elbow_y': np.array(df_reoriented[df_reoriented[' jointType'] == 5][' position.Y']), 462 | 'l_elbow_z': np.array(df_reoriented[df_reoriented[' jointType'] == 5][' position.Z']), 463 | 'l_wrist_x': np.array(df_reoriented[df_reoriented[' jointType'] == 6][' position.X']), 464 | 'l_wrist_y': np.array(df_reoriented[df_reoriented[' jointType'] == 6][' position.Y']), 465 | 'l_wrist_z': np.array(df_reoriented[df_reoriented[' jointType'] == 6][' position.Z']) 466 | }) 467 | 468 | new_df.to_csv(str(outputfile)) 469 | print('... 
Finished processing') 470 | 471 | if __name__ == "__main__": 472 | main(sys.argv[1:]) -------------------------------------------------------------------------------- /pictures/DumpKinectSkeletonRelease.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/stevenhirsch/Human-Motion-Analysis-with-Kinect-v2/d6355fb7c7aacd96cedaaea098eea68b4c71af2c/pictures/DumpKinectSkeletonRelease.png -------------------------------------------------------------------------------- /pictures/KinectStudio_skeleton.PNG: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/stevenhirsch/Human-Motion-Analysis-with-Kinect-v2/d6355fb7c7aacd96cedaaea098eea68b4c71af2c/pictures/KinectStudio_skeleton.PNG -------------------------------------------------------------------------------- /pictures/anatomicalPosition.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/stevenhirsch/Human-Motion-Analysis-with-Kinect-v2/d6355fb7c7aacd96cedaaea098eea68b4c71af2c/pictures/anatomicalPosition.png -------------------------------------------------------------------------------- /pictures/capture_test.PNG: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/stevenhirsch/Human-Motion-Analysis-with-Kinect-v2/d6355fb7c7aacd96cedaaea098eea68b4c71af2c/pictures/capture_test.PNG -------------------------------------------------------------------------------- /pictures/kinectjoints.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/stevenhirsch/Human-Motion-Analysis-with-Kinect-v2/d6355fb7c7aacd96cedaaea098eea68b4c71af2c/pictures/kinectjoints.png -------------------------------------------------------------------------------- /pictures/kinectmodel.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/stevenhirsch/Human-Motion-Analysis-with-Kinect-v2/d6355fb7c7aacd96cedaaea098eea68b4c71af2c/pictures/kinectmodel.png -------------------------------------------------------------------------------- /pictures/leftHandRule.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/stevenhirsch/Human-Motion-Analysis-with-Kinect-v2/d6355fb7c7aacd96cedaaea098eea68b4c71af2c/pictures/leftHandRule.png -------------------------------------------------------------------------------- /pictures/planesOfMotion.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/stevenhirsch/Human-Motion-Analysis-with-Kinect-v2/d6355fb7c7aacd96cedaaea098eea68b4c71af2c/pictures/planesOfMotion.png -------------------------------------------------------------------------------- /pictures/powerShell.PNG: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/stevenhirsch/Human-Motion-Analysis-with-Kinect-v2/d6355fb7c7aacd96cedaaea098eea68b4c71af2c/pictures/powerShell.PNG -------------------------------------------------------------------------------- /pictures/powerShell_navigate.PNG: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/stevenhirsch/Human-Motion-Analysis-with-Kinect-v2/d6355fb7c7aacd96cedaaea098eea68b4c71af2c/pictures/powerShell_navigate.PNG -------------------------------------------------------------------------------- /pictures/sensor.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/stevenhirsch/Human-Motion-Analysis-with-Kinect-v2/d6355fb7c7aacd96cedaaea098eea68b4c71af2c/pictures/sensor.png -------------------------------------------------------------------------------- /pictures/tpose.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/stevenhirsch/Human-Motion-Analysis-with-Kinect-v2/d6355fb7c7aacd96cedaaea098eea68b4c71af2c/pictures/tpose.png --------------------------------------------------------------------------------