├── .gitignore
├── LICENSE.txt
├── PyKinect2.pyproj
├── PyKinect2.sln
├── README.md
├── environment.yml
├── examples
│   ├── Open3D.png
│   ├── OpenCV2D.png
│   ├── PyKinectBodyGame.py
│   ├── PyKinectInfraRed.py
│   ├── __init__.py
│   ├── basic_2D.py
│   ├── basic_3D.py
│   └── utils_PyKinectV2.py
├── pykinect2
│   ├── PyKinectRuntime.py
│   ├── PyKinectV2.py
│   └── __init__.py
└── setup.py
/.gitignore:
--------------------------------------------------------------------------------
1 | # Byte-compiled / optimized / DLL files
2 | __pycache__/
3 | *.py[cod]
4 |
5 | # C extensions
6 | *.so
7 |
8 | # Visual Studio
9 | *.suo
10 |
11 | # Distribution / packaging
12 | .Python
13 | env/
14 | build/
15 | develop-eggs/
16 | dist/
17 | downloads/
18 | eggs/
19 | .eggs/
20 | lib/
21 | lib64/
22 | parts/
23 | sdist/
24 | var/
25 | *.egg-info/
26 | .installed.cfg
27 | *.egg
28 |
29 | # PyInstaller
30 | # Usually these files are written by a python script from a template
31 | # before PyInstaller builds the exe, so as to inject date/other infos into it.
32 | *.manifest
33 | *.spec
34 |
35 | # Installer logs
36 | pip-log.txt
37 | pip-delete-this-directory.txt
38 |
39 | # Unit test / coverage reports
40 | htmlcov/
41 | .tox/
42 | .coverage
43 | .coverage.*
44 | .cache
45 | nosetests.xml
46 | coverage.xml
47 | *.cover
48 |
49 | # Translations
50 | *.mo
51 | *.pot
52 |
53 | # Django stuff:
54 | *.log
55 |
56 | # Sphinx documentation
57 | docs/_build/
58 |
59 | # PyBuilder
60 | target/
61 |
62 |
--------------------------------------------------------------------------------
/LICENSE.txt:
--------------------------------------------------------------------------------
1 | The MIT License (MIT)
2 |
3 | Copyright (c) Microsoft
4 |
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 |
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 |
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
--------------------------------------------------------------------------------
/PyKinect2.pyproj:
--------------------------------------------------------------------------------
1 |
2 |
3 |
4 | Debug
5 | 2.0
6 | 862156b9-96a7-4f85-bb66-b7ec70418cf7
7 | .
8 | pykinect2/PyKinectBodyGame.py
9 |
10 |
11 | .
12 | .
13 | PyKinect2
14 | PyKinect2
15 | False
16 |
17 |
18 | true
19 | false
20 |
21 |
22 | true
23 | false
24 |
25 |
26 |
27 |
28 |
29 |
30 |
31 |
32 |
33 |
34 | 10.0
35 | $(MSBuildExtensionsPath32)\Microsoft\VisualStudio\v$(VisualStudioVersion)\Python Tools\Microsoft.PythonTools.targets
36 |
37 |
38 |
39 |
42 |
43 |
44 |
45 |
46 |
47 |
--------------------------------------------------------------------------------
/PyKinect2.sln:
--------------------------------------------------------------------------------
1 |
2 | Microsoft Visual Studio Solution File, Format Version 12.00
3 | # Visual Studio 14
4 | VisualStudioVersion = 14.0.22609.0
5 | MinimumVisualStudioVersion = 10.0.40219.1
6 | Project("{888888A0-9F3D-457C-B088-3A5042F75D52}") = "PyKinect2", "PyKinect2.pyproj", "{862156B9-96A7-4F85-BB66-B7EC70418CF7}"
7 | EndProject
8 | Global
9 | GlobalSection(SolutionConfigurationPlatforms) = preSolution
10 | Debug|Any CPU = Debug|Any CPU
11 | Release|Any CPU = Release|Any CPU
12 | EndGlobalSection
13 | GlobalSection(ProjectConfigurationPlatforms) = postSolution
14 | {862156B9-96A7-4F85-BB66-B7EC70418CF7}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
15 | {862156B9-96A7-4F85-BB66-B7EC70418CF7}.Release|Any CPU.ActiveCfg = Release|Any CPU
16 | EndGlobalSection
17 | GlobalSection(SolutionProperties) = preSolution
18 | HideSolutionNode = FALSE
19 | EndGlobalSection
20 | EndGlobal
21 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # Added more examples to PyKinect2
2 |
3 |
4 |
5 |
6 |
7 | 1) examples/basic_2D.py shows how to display body, body index, color, aligned color, depth and IR images in 2D using OpenCV
8 | 2) examples/basic_3D.py shows how to display a coloured point cloud, joints and joint orientations in 3D using Open3D (one way to run both scripts is sketched below)
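One way to run them (a sketch, assuming the kv2 environment described below is active; launch from inside the examples folder, since the scripts import utils_PyKinectV2 directly):
```
cd examples
python basic_2D.py
python basic_3D.py
```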
9 |
10 | In addition to the prerequisites stated below, you will also need [opencv-python](https://pypi.org/project/opencv-python/) and [open3d-python](http://www.open3d.org/docs/tutorial/Basic/python_interface.html) for the 2D and 3D displays, since these examples do not use PyGame for display.
11 |
12 | To create a Python virtual environment with the packages required:
13 | ```
14 | conda env create -f environment.yml
15 | ```
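Then activate the environment (named kv2 in environment.yml) before running the examples:
```
conda activate kv2
```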
16 |
17 | Note: A first-time install of PyKinectV2 via pip may encounter [AssertionError: 80 File "C:\Users\...\Anaconda2\lib\site-packages\pykinect2\PyKinectV2.py", line 2216, in assert sizeof(tagSTATSTG) == 72, sizeof(tagSTATSTG)](https://github.com/Kinect/PyKinect2/issues/37).
18 | If you hit this, go to the pykinect2 installation in the site-packages folder on your computer and replace the PyKinectV2.py file with the one from this GitHub repository.
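A minimal sketch of that replacement (the destination path varies per machine, so the copy target below is illustrative only):
```
:: Print where pip installed the pykinect2 package
python -c "import pykinect2, os; print(os.path.dirname(pykinect2.__file__))"
:: Overwrite the installed PyKinectV2.py with the copy from this repository, e.g.
copy pykinect2\PyKinectV2.py <printed-site-packages-path>\PyKinectV2.py
```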
19 |
20 | # PyKinect2
21 |
22 | Enables writing Kinect applications, games, and experiences using Python. Inspired by the original [PyKinect project on CodePlex](http://pytools.codeplex.com/wikipage?title=PyKinect).
23 |
24 | Only color, depth, body and body index frames are supported in this version.
25 | PyKinectBodyGame is a sample game. It demonstrates how to use Kinect color and body frames.
26 |
27 |
28 | ## Prerequisites
29 |
30 | The easiest way to get most of the prerequisites is to use Anaconda, which includes NumPy. You'll then need to pip install comtypes. The PyKinectBodyGame sample also requires PyGame, which needs to be installed manually.
31 |
32 | 1. Download [Anaconda](https://store.continuum.io/cshop/anaconda/); get the 32-bit version. This includes NumPy.
33 | 2. pip install comtypes
34 | 3. Install the [Kinect for Windows SDK v2](http://aka.ms/k4wv2sdk)
35 |
36 | Full List of Dependencies
37 | * [Python 2.7.x or 3.4 and higher](https://www.python.org/)
38 | * [NumPy](http://www.numpy.org/)
39 | * [comtypes](https://github.com/enthought/comtypes/)
40 | * [Kinect for Windows SDK v2](http://aka.ms/k4wv2sdk)
41 | * [Kinect v2 sensor and adapter](http://aka.ms/k4wv2purchase) Note: you can use a Kinect for Xbox One as long as you also have the Kinect Adapter for Windows
42 | * [PyGame](http://www.pygame.org) - for running PyKinectBodyGame sample
43 | 
44 |
45 |
46 | ## Installation
47 |
48 | The package can be installed through pip using the usual means:
49 | ```
50 | pip install pykinect2
51 | ```
52 | If you are using a virtual environment, be sure to activate it first.
53 |
54 | For more information, please see https://pip.pypa.io/en/latest/user_guide.html#installing-packages
55 |
56 |
57 | ## Installation (Manual)
58 |
59 | To install the package manually, clone this repository to a local folder and include it in the appropriate python environment. If installing in a virtual environment, be sure to install all required dependencies (above).
60 |
61 | For example:
62 | ```
63 | cd c:\projects\myproject\env\
64 | Scripts\activate.bat
65 |
66 | easy_install -a c:\projects\downloads\PyKinect2
67 | ```
68 | After installation is complete, you can launch the interactive python shell and `import pykinect2` to ensure everything has been installed properly.
69 |
70 | Core helper classes for working with the Kinect sensor are located in PyKinectRuntime.py. For usage examples, please see /examples/PyKinectBodyGame.py.
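As a quick sanity check of the API (a minimal sketch only, based on the methods defined in pykinect2/PyKinectRuntime.py; it assumes a sensor is connected and omits error handling):
```
from pykinect2 import PyKinectV2, PyKinectRuntime

# Open the sensor and request only depth frames
kinect = PyKinectRuntime.PyKinectRuntime(PyKinectV2.FrameSourceTypes_Depth)

while True:
    if kinect.has_new_depth_frame():
        depth = kinect.get_last_depth_frame()  # flat uint16 array, one value per depth pixel (mm)
        print(depth.shape)                     # (512 * 424,) by default
        break

kinect.close()
```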
71 |
--------------------------------------------------------------------------------
/environment.yml:
--------------------------------------------------------------------------------
1 | # Create a Python virtual environment with the packages required
2 | # conda env create -f environment.yml
3 |
4 | name: kv2
5 |
6 | dependencies:
7 | - python=3.6
8 | - numpy
9 |
10 | - pip:
11 | - opencv-python
12 | - pykinect2
13 | - open3d-python
14 |
--------------------------------------------------------------------------------
/examples/Open3D.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/limgm/PyKinect2/21e9adc4ad9c12da4665eb2ddb129242209ee330/examples/Open3D.png
--------------------------------------------------------------------------------
/examples/OpenCV2D.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/limgm/PyKinect2/21e9adc4ad9c12da4665eb2ddb129242209ee330/examples/OpenCV2D.png
--------------------------------------------------------------------------------
/examples/PyKinectBodyGame.py:
--------------------------------------------------------------------------------
1 | from pykinect2 import PyKinectV2
2 | from pykinect2.PyKinectV2 import *
3 | from pykinect2 import PyKinectRuntime
4 |
5 | import ctypes
6 | import _ctypes
7 | import pygame
8 | import sys
9 |
10 | if sys.hexversion >= 0x03000000:
11 | import _thread as thread
12 | else:
13 | import thread
14 |
15 | # colors for drawing different bodies
16 | SKELETON_COLORS = [pygame.color.THECOLORS["red"],
17 | pygame.color.THECOLORS["blue"],
18 | pygame.color.THECOLORS["green"],
19 | pygame.color.THECOLORS["orange"],
20 | pygame.color.THECOLORS["purple"],
21 | pygame.color.THECOLORS["yellow"],
22 | pygame.color.THECOLORS["violet"]]
23 |
24 |
25 | class BodyGameRuntime(object):
26 | def __init__(self):
27 | pygame.init()
28 |
29 | # Used to manage how fast the screen updates
30 | self._clock = pygame.time.Clock()
31 |
32 | # Set the width and height of the screen [width, height]
33 | self._infoObject = pygame.display.Info()
34 | self._screen = pygame.display.set_mode((self._infoObject.current_w >> 1, self._infoObject.current_h >> 1),
35 | pygame.HWSURFACE|pygame.DOUBLEBUF|pygame.RESIZABLE, 32)
36 |
37 | pygame.display.set_caption("Kinect for Windows v2 Body Game")
38 |
39 | # Loop until the user clicks the close button.
40 | self._done = False
41 |
42 | # Used to manage how fast the screen updates
43 | self._clock = pygame.time.Clock()
44 |
45 | # Kinect runtime object, we want only color and body frames
46 | self._kinect = PyKinectRuntime.PyKinectRuntime(PyKinectV2.FrameSourceTypes_Color | PyKinectV2.FrameSourceTypes_Body)
47 |
48 | # back buffer surface for getting Kinect color frames, 32bit color, width and height equal to the Kinect color frame size
49 | self._frame_surface = pygame.Surface((self._kinect.color_frame_desc.Width, self._kinect.color_frame_desc.Height), 0, 32)
50 |
51 | # here we will store skeleton data
52 | self._bodies = None
53 |
54 |
55 | def draw_body_bone(self, joints, jointPoints, color, joint0, joint1):
56 | joint0State = joints[joint0].TrackingState;
57 | joint1State = joints[joint1].TrackingState;
58 |
59 | # both joints are not tracked
60 | if (joint0State == PyKinectV2.TrackingState_NotTracked) or (joint1State == PyKinectV2.TrackingState_NotTracked):
61 | return
62 |
63 | # both joints are not *really* tracked
64 | if (joint0State == PyKinectV2.TrackingState_Inferred) and (joint1State == PyKinectV2.TrackingState_Inferred):
65 | return
66 |
67 | # ok, at least one is good
68 | start = (jointPoints[joint0].x, jointPoints[joint0].y)
69 | end = (jointPoints[joint1].x, jointPoints[joint1].y)
70 |
71 | try:
72 | pygame.draw.line(self._frame_surface, color, start, end, 8)
73 | except: # need to catch it due to possible invalid positions (with inf)
74 | pass
75 |
76 | def draw_body(self, joints, jointPoints, color):
77 | # Torso
78 | self.draw_body_bone(joints, jointPoints, color, PyKinectV2.JointType_Head, PyKinectV2.JointType_Neck);
79 | self.draw_body_bone(joints, jointPoints, color, PyKinectV2.JointType_Neck, PyKinectV2.JointType_SpineShoulder);
80 | self.draw_body_bone(joints, jointPoints, color, PyKinectV2.JointType_SpineShoulder, PyKinectV2.JointType_SpineMid);
81 | self.draw_body_bone(joints, jointPoints, color, PyKinectV2.JointType_SpineMid, PyKinectV2.JointType_SpineBase);
82 | self.draw_body_bone(joints, jointPoints, color, PyKinectV2.JointType_SpineShoulder, PyKinectV2.JointType_ShoulderRight);
83 | self.draw_body_bone(joints, jointPoints, color, PyKinectV2.JointType_SpineShoulder, PyKinectV2.JointType_ShoulderLeft);
84 | self.draw_body_bone(joints, jointPoints, color, PyKinectV2.JointType_SpineBase, PyKinectV2.JointType_HipRight);
85 | self.draw_body_bone(joints, jointPoints, color, PyKinectV2.JointType_SpineBase, PyKinectV2.JointType_HipLeft);
86 |
87 | # Right Arm
88 | self.draw_body_bone(joints, jointPoints, color, PyKinectV2.JointType_ShoulderRight, PyKinectV2.JointType_ElbowRight);
89 | self.draw_body_bone(joints, jointPoints, color, PyKinectV2.JointType_ElbowRight, PyKinectV2.JointType_WristRight);
90 | self.draw_body_bone(joints, jointPoints, color, PyKinectV2.JointType_WristRight, PyKinectV2.JointType_HandRight);
91 | self.draw_body_bone(joints, jointPoints, color, PyKinectV2.JointType_HandRight, PyKinectV2.JointType_HandTipRight);
92 | self.draw_body_bone(joints, jointPoints, color, PyKinectV2.JointType_WristRight, PyKinectV2.JointType_ThumbRight);
93 |
94 | # Left Arm
95 | self.draw_body_bone(joints, jointPoints, color, PyKinectV2.JointType_ShoulderLeft, PyKinectV2.JointType_ElbowLeft);
96 | self.draw_body_bone(joints, jointPoints, color, PyKinectV2.JointType_ElbowLeft, PyKinectV2.JointType_WristLeft);
97 | self.draw_body_bone(joints, jointPoints, color, PyKinectV2.JointType_WristLeft, PyKinectV2.JointType_HandLeft);
98 | self.draw_body_bone(joints, jointPoints, color, PyKinectV2.JointType_HandLeft, PyKinectV2.JointType_HandTipLeft);
99 | self.draw_body_bone(joints, jointPoints, color, PyKinectV2.JointType_WristLeft, PyKinectV2.JointType_ThumbLeft);
100 |
101 | # Right Leg
102 | self.draw_body_bone(joints, jointPoints, color, PyKinectV2.JointType_HipRight, PyKinectV2.JointType_KneeRight);
103 | self.draw_body_bone(joints, jointPoints, color, PyKinectV2.JointType_KneeRight, PyKinectV2.JointType_AnkleRight);
104 | self.draw_body_bone(joints, jointPoints, color, PyKinectV2.JointType_AnkleRight, PyKinectV2.JointType_FootRight);
105 |
106 | # Left Leg
107 | self.draw_body_bone(joints, jointPoints, color, PyKinectV2.JointType_HipLeft, PyKinectV2.JointType_KneeLeft);
108 | self.draw_body_bone(joints, jointPoints, color, PyKinectV2.JointType_KneeLeft, PyKinectV2.JointType_AnkleLeft);
109 | self.draw_body_bone(joints, jointPoints, color, PyKinectV2.JointType_AnkleLeft, PyKinectV2.JointType_FootLeft);
110 |
111 |
112 | def draw_color_frame(self, frame, target_surface):
113 | target_surface.lock()
114 | address = self._kinect.surface_as_array(target_surface.get_buffer())
115 | ctypes.memmove(address, frame.ctypes.data, frame.size)
116 | del address
117 | target_surface.unlock()
118 |
119 | def run(self):
120 | # -------- Main Program Loop -----------
121 | while not self._done:
122 | # --- Main event loop
123 | for event in pygame.event.get(): # User did something
124 | if event.type == pygame.QUIT: # If user clicked close
125 | self._done = True # Flag that we are done so we exit this loop
126 |
127 | elif event.type == pygame.VIDEORESIZE: # window resized
128 | self._screen = pygame.display.set_mode(event.dict['size'],
129 | pygame.HWSURFACE|pygame.DOUBLEBUF|pygame.RESIZABLE, 32)
130 |
131 | # --- Game logic should go here
132 |
133 | # --- Getting frames and drawing
134 |             # --- Woohoo! We've got a color frame! Let's fill our back buffer surface with the frame's data
135 | if self._kinect.has_new_color_frame():
136 | frame = self._kinect.get_last_color_frame()
137 | self.draw_color_frame(frame, self._frame_surface)
138 | frame = None
139 |
140 | # --- Cool! We have a body frame, so can get skeletons
141 | if self._kinect.has_new_body_frame():
142 | self._bodies = self._kinect.get_last_body_frame()
143 |
144 | # --- draw skeletons to _frame_surface
145 | if self._bodies is not None:
146 | for i in range(0, self._kinect.max_body_count):
147 | body = self._bodies.bodies[i]
148 | if not body.is_tracked:
149 | continue
150 |
151 | joints = body.joints
152 | # convert joint coordinates to color space
153 | joint_points = self._kinect.body_joints_to_color_space(joints)
154 | self.draw_body(joints, joint_points, SKELETON_COLORS[i])
155 |
156 | # --- copy back buffer surface pixels to the screen, resize it if needed and keep aspect ratio
157 | # --- (screen size may be different from Kinect's color frame size)
158 | h_to_w = float(self._frame_surface.get_height()) / self._frame_surface.get_width()
159 | target_height = int(h_to_w * self._screen.get_width())
160 | surface_to_draw = pygame.transform.scale(self._frame_surface, (self._screen.get_width(), target_height));
161 | self._screen.blit(surface_to_draw, (0,0))
162 | surface_to_draw = None
163 | pygame.display.update()
164 |
165 | # --- Go ahead and update the screen with what we've drawn.
166 | pygame.display.flip()
167 |
168 | # --- Limit to 60 frames per second
169 | self._clock.tick(60)
170 |
171 | # Close our Kinect sensor, close the window and quit.
172 | self._kinect.close()
173 | pygame.quit()
174 |
175 |
176 | if __name__ == "__main__":
177 |     game = BodyGameRuntime()
178 |     game.run()
179 |
180 |
--------------------------------------------------------------------------------
/examples/PyKinectInfraRed.py:
--------------------------------------------------------------------------------
1 | from pykinect2 import PyKinectV2
2 | from pykinect2.PyKinectV2 import *
3 | from pykinect2 import PyKinectRuntime
4 |
5 | import ctypes
6 | import _ctypes
7 | import pygame
8 | import sys
9 | import numpy as np
10 |
11 |
12 | if sys.hexversion >= 0x03000000:
13 | import _thread as thread
14 | else:
15 | import thread
16 |
17 | # colors for drawing different bodies
18 | SKELETON_COLORS = [pygame.color.THECOLORS["red"],
19 | pygame.color.THECOLORS["blue"],
20 | pygame.color.THECOLORS["green"],
21 | pygame.color.THECOLORS["orange"],
22 | pygame.color.THECOLORS["purple"],
23 | pygame.color.THECOLORS["yellow"],
24 | pygame.color.THECOLORS["violet"]]
25 |
26 |
27 | class InfraRedRuntime(object):
28 | def __init__(self):
29 | pygame.init()
30 |
31 | # Used to manage how fast the screen updates
32 | self._clock = pygame.time.Clock()
33 |
34 | # Loop until the user clicks the close button.
35 | self._done = False
36 |
37 | # Used to manage how fast the screen updates
38 | self._clock = pygame.time.Clock()
39 |
40 |         # Kinect runtime object, we want only infrared frames
41 | self._kinect = PyKinectRuntime.PyKinectRuntime(PyKinectV2.FrameSourceTypes_Infrared)
42 |
43 |         # back buffer surface for getting Kinect infrared frames, 8bit grey, width and height equal to the Kinect infrared frame size
44 | self._frame_surface = pygame.Surface((self._kinect.infrared_frame_desc.Width, self._kinect.infrared_frame_desc.Height), 0, 24)
45 | # here we will store skeleton data
46 | self._bodies = None
47 |
48 | # Set the width and height of the screen [width, height]
49 | self._infoObject = pygame.display.Info()
50 | self._screen = pygame.display.set_mode((self._kinect.infrared_frame_desc.Width, self._kinect.infrared_frame_desc.Height),
51 | pygame.HWSURFACE|pygame.DOUBLEBUF|pygame.RESIZABLE, 32)
52 |
53 | pygame.display.set_caption("Kinect for Windows v2 Infrared")
54 |
55 |
56 |
57 | def draw_infrared_frame(self, frame, target_surface):
58 |         if frame is None: # some USB hubs do not provide the infrared image; it works with Kinect Studio though
59 | return
60 | target_surface.lock()
61 | f8=np.uint8(frame.clip(1,4000)/16.)
62 | frame8bit=np.dstack((f8,f8,f8))
63 | address = self._kinect.surface_as_array(target_surface.get_buffer())
64 | ctypes.memmove(address, frame8bit.ctypes.data, frame8bit.size)
65 | del address
66 | target_surface.unlock()
67 |
68 | def run(self):
69 | # -------- Main Program Loop -----------
70 | while not self._done:
71 | # --- Main event loop
72 | for event in pygame.event.get(): # User did something
73 | if event.type == pygame.QUIT: # If user clicked close
74 | self._done = True # Flag that we are done so we exit this loop
75 |
76 | elif event.type == pygame.VIDEORESIZE: # window resized
77 | self._screen = pygame.display.set_mode(event.dict['size'],
78 | pygame.HWSURFACE|pygame.DOUBLEBUF|pygame.RESIZABLE, 32)
79 |
80 |
81 | # --- Getting frames and drawing
82 | if self._kinect.has_new_infrared_frame():
83 | frame = self._kinect.get_last_infrared_frame()
84 | self.draw_infrared_frame(frame, self._frame_surface)
85 | frame = None
86 |
87 | self._screen.blit(self._frame_surface, (0,0))
88 | pygame.display.update()
89 |
90 | # --- Go ahead and update the screen with what we've drawn.
91 | pygame.display.flip()
92 |
93 | # --- Limit to 60 frames per second
94 | self._clock.tick(60)
95 |
96 | # Close our Kinect sensor, close the window and quit.
97 | self._kinect.close()
98 | pygame.quit()
99 |
100 |
101 | if __name__ == "__main__":
102 |     game = InfraRedRuntime()
103 |     game.run()
104 |
105 |
--------------------------------------------------------------------------------
/examples/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/limgm/PyKinect2/21e9adc4ad9c12da4665eb2ddb129242209ee330/examples/__init__.py
--------------------------------------------------------------------------------
/examples/basic_2D.py:
--------------------------------------------------------------------------------
1 | ################################################################################
2 | ### Sample program to stream
3 | ### body, body index, color, align color, depth and IR images in 2D using OpenCV
4 | ################################################################################
5 | import cv2
6 | import numpy as np
7 | import utils_PyKinectV2 as utils
8 | from pykinect2.PyKinectV2 import *
9 | from pykinect2 import PyKinectV2
10 | from pykinect2 import PyKinectRuntime
11 |
12 | #############################
13 | ### Kinect runtime object ###
14 | #############################
15 | kinect = PyKinectRuntime.PyKinectRuntime(PyKinectV2.FrameSourceTypes_Body |
16 | PyKinectV2.FrameSourceTypes_BodyIndex |
17 | PyKinectV2.FrameSourceTypes_Color |
18 | PyKinectV2.FrameSourceTypes_Depth |
19 | PyKinectV2.FrameSourceTypes_Infrared)
20 |
21 | depth_width, depth_height = kinect.depth_frame_desc.Width, kinect.depth_frame_desc.Height # Default: 512, 424
22 | color_width, color_height = kinect.color_frame_desc.Width, kinect.color_frame_desc.Height # Default: 1920, 1080
23 |
24 | while True:
25 | ##############################
26 | ### Get images from camera ###
27 | ##############################
28 | if kinect.has_new_body_frame() and \
29 | kinect.has_new_body_index_frame() and \
30 | kinect.has_new_color_frame() and \
31 | kinect.has_new_depth_frame() and \
32 | kinect.has_new_infrared_frame():
33 |
34 | body_frame = kinect.get_last_body_frame()
35 | body_index_frame = kinect.get_last_body_index_frame()
36 | color_frame = kinect.get_last_color_frame()
37 | depth_frame = kinect.get_last_depth_frame()
38 | infrared_frame = kinect.get_last_infrared_frame()
39 |
40 | #########################################
41 | ### Reshape from 1D frame to 2D image ###
42 | #########################################
43 | body_index_img = body_index_frame.reshape(((depth_height, depth_width))).astype(np.uint8)
44 | color_img = color_frame.reshape(((color_height, color_width, 4))).astype(np.uint8)
45 | depth_img = depth_frame.reshape(((depth_height, depth_width))).astype(np.uint16)
46 | infrared_img = infrared_frame.reshape(((depth_height, depth_width))).astype(np.uint16)
47 |
48 | ###############################################
49 | ### Useful functions in utils_PyKinectV2.py ###
50 | ###############################################
51 | align_color_img = utils.get_align_color_image(kinect, color_img)
52 | align_color_img = utils.draw_bodyframe(body_frame, kinect, align_color_img) # Overlay body joints on align_color_img
53 | body_index_img = utils.color_body_index(kinect, body_index_img) # Add color to body_index_img
54 |
55 | ######################################
56 | ### Display 2D images using OpenCV ###
57 | ######################################
58 | color_img_resize = cv2.resize(color_img, (0,0), fx=0.5, fy=0.5) # Resize (1080, 1920, 4) into half (540, 960, 4)
59 | depth_colormap = cv2.applyColorMap(cv2.convertScaleAbs(depth_img, alpha=255/1500), cv2.COLORMAP_JET) # Scale to display from 0 mm to 1500 mm
60 | infrared_img = cv2.convertScaleAbs(infrared_img, alpha=255/65535) # Scale from uint16 to uint8
61 |
62 | cv2.imshow('body index', body_index_img) # (424, 512)
63 | cv2.imshow('color', color_img_resize) # (540, 960, 4)
64 | cv2.imshow('align color with body joints', align_color_img) # (424, 512)
65 | cv2.imshow('depth', depth_colormap) # (424, 512)
66 | cv2.imshow('infrared', infrared_img) # (424, 512)
67 |
68 | key = cv2.waitKey(30)
69 | if key==27: # Press esc to break the loop
70 | break
71 |
72 | kinect.close()
73 | cv2.destroyAllWindows()
--------------------------------------------------------------------------------
/examples/basic_3D.py:
--------------------------------------------------------------------------------
1 | ########################################################################
2 | ### Sample program to stream
3 | ### Coloured point cloud, joint and joint orientation in 3D using Open3D
4 | ########################################################################
5 | import cv2
6 | import numpy as np
7 | import utils_PyKinectV2 as utils
8 | from open3d import *
9 | from numpy.linalg import inv
10 | from pykinect2.PyKinectV2 import *
11 | from pykinect2 import PyKinectV2
12 | from pykinect2 import PyKinectRuntime
13 |
14 | #############################
15 | ### Kinect runtime object ###
16 | #############################
17 | kinect = PyKinectRuntime.PyKinectRuntime(PyKinectV2.FrameSourceTypes_Body |
18 | PyKinectV2.FrameSourceTypes_Color |
19 | PyKinectV2.FrameSourceTypes_Depth)
20 |
21 | depth_width, depth_height = kinect.depth_frame_desc.Width, kinect.depth_frame_desc.Height # Default: 512, 424
22 | color_width, color_height = kinect.color_frame_desc.Width, kinect.color_frame_desc.Height # Default: 1920, 1080
23 |
24 | ##############################
25 | ### User defined variables ###
26 | ##############################
27 | depth_scale = 0.001 # Default kinect depth scale where 1 unit = 0.001 m = 1 mm
28 | clipping_distance_in_meters = 1.5 # Set the maximum distance to display the point cloud data
29 | clipping_distance = clipping_distance_in_meters / depth_scale # Convert dist in mm to unit
30 | width=depth_width; height=depth_height; ppx=260.166; ppy=205.197; fx=367.535; fy=367.535 # Hardcode the camera intrinsic parameters for backprojection
31 | # width=depth_width; height=depth_height; ppx=258.981; ppy=208.796; fx=367.033; fy=367.033 # Hardcode the camera intrinsic parameters for backprojection
32 |
33 | ############################
34 | ### Open3D visualisation ###
35 | ############################
36 | intrinsic = PinholeCameraIntrinsic(width, height, fx, fy, ppx, ppy)
37 | flip_transform = [[1,0,0,0], [0,-1,0,0], [0,0,-1,0], [0,0,0,1]] # To convert [x,y,z] -> [x,-y,-z]
38 | # Define the objects to be drawn
39 | obj_pcd = PointCloud()
40 | obj_bone = utils.create_line_set_bones(np.zeros((24,3), dtype=np.float32)) # 24 bones connecting 25 joints
41 | obj_axis = []
42 | for i in range(PyKinectV2.JointType_Count): # 25 axes for 25 joints
43 | obj_axis.append(create_mesh_coordinate_frame(size=0.1, origin=[0,0,0])) # XYZ axis length of 0.1 m
44 | # Create Open3D Visualizer
45 | vis = Visualizer()
46 | vis.create_window(width=width, height=height)
47 | vis.get_render_option().point_size = 3
48 | vis.add_geometry(obj_bone)
49 | for i in range(PyKinectV2.JointType_Count): # 25 axes for 25 joints
50 | vis.add_geometry(obj_axis[i])
51 |
52 | first_loop = True
53 | while True:
54 | ##############################
55 | ### Get images from camera ###
56 | ##############################
57 | if kinect.has_new_body_frame() and kinect.has_new_color_frame() and kinect.has_new_depth_frame():
58 |
59 | body_frame = kinect.get_last_body_frame()
60 | color_frame = kinect.get_last_color_frame()
61 | depth_frame = kinect.get_last_depth_frame()
62 |
63 | #########################################
64 | ### Reshape from 1D frame to 2D image ###
65 | #########################################
66 | color_img = color_frame.reshape(((color_height, color_width, 4))).astype(np.uint8)
67 | depth_img = depth_frame.reshape(((depth_height, depth_width))).astype(np.uint16)
68 |
69 | ###############################################
70 | ### Useful functions in utils_PyKinectV2.py ###
71 | ###############################################
72 | align_color_img = utils.get_align_color_image(kinect, color_img)
73 | obj_pcd.points, obj_pcd.colors = utils.create_color_point_cloud(align_color_img, depth_img, depth_scale, clipping_distance_in_meters, intrinsic)
74 | obj_pcd.transform(flip_transform)
75 | joint3D, orientation = utils.get_single_joint3D_and_orientation(kinect, body_frame, depth_img, intrinsic, depth_scale)
76 | obj_bone.points = Vector3dVector(joint3D)
77 | obj_bone.transform(flip_transform)
78 |
79 | ###############################################
80 | ### Draw the orientation axes at each joint ###
81 | ###############################################
82 | for i in range(PyKinectV2.JointType_Count):
83 | obj_axis[i].transform(utils.transform_geometry_quaternion(joint3D[i,:], orientation[i,:]))
84 | obj_axis[i].transform(flip_transform)
85 |
86 | if first_loop:
87 | vis.add_geometry(obj_pcd)
88 | first_loop = False
89 | vis.update_geometry()
90 | vis.poll_events()
91 | vis.update_renderer()
92 |
93 | # Need to inverse the transformation else it will be additive in every loop
94 | for i in range(PyKinectV2.JointType_Count):
95 | obj_axis[i].transform(flip_transform)
96 | obj_axis[i].transform(inv(np.array(utils.transform_geometry_quaternion(joint3D[i,:], orientation[i,:]))))
97 |
98 | ######################################
99 | ### Display 2D images using OpenCV ###
100 | ######################################
101 | depth_colormap = cv2.applyColorMap(cv2.convertScaleAbs(depth_img, alpha=255/clipping_distance), cv2.COLORMAP_JET)
102 | cv2.imshow('depth', depth_colormap) # (424, 512)
103 |
104 | key = cv2.waitKey(30)
105 | if key==27: # Press esc to break the loop
106 | break
107 |
108 | kinect.close()
109 | vis.destroy_window()
110 | cv2.destroyAllWindows()
--------------------------------------------------------------------------------
/examples/utils_PyKinectV2.py:
--------------------------------------------------------------------------------
1 | ##############################################################
2 | ### Set of useful utility functions related to PyKinectV2 ###
3 | ##############################################################
4 | import cv2
5 | import ctypes
6 | import numpy as np
7 | from open3d import *
8 | from pykinect2.PyKinectV2 import *
9 | from pykinect2 import PyKinectV2
10 | from pykinect2 import PyKinectRuntime
11 |
12 |
13 | ##########################
14 | ### Map color to depth ###
15 | ##########################
16 | def get_align_color_image(kinect, color_img, color_height=1080, color_width=1920, depth_height=424, depth_width=512):
17 | CSP_Count = kinect._depth_frame_data_capacity
18 | CSP_type = _ColorSpacePoint * CSP_Count.value
19 | CSP = ctypes.cast(CSP_type(), ctypes.POINTER(_ColorSpacePoint))
20 |
21 | kinect._mapper.MapDepthFrameToColorSpace(kinect._depth_frame_data_capacity,kinect._depth_frame_data, CSP_Count, CSP)
22 |
23 | colorXYs = np.copy(np.ctypeslib.as_array(CSP, shape=(depth_height*depth_width,))) # Convert ctype pointer to array
24 | colorXYs = colorXYs.view(np.float32).reshape(colorXYs.shape + (-1,)) # Convert struct array to regular numpy array https://stackoverflow.com/questions/5957380/convert-structured-array-to-regular-numpy-array
25 | colorXYs += 0.5
26 | colorXYs = colorXYs.reshape(depth_height,depth_width,2).astype(np.int)
27 | colorXs = np.clip(colorXYs[:,:,0], 0, color_width-1)
28 | colorYs = np.clip(colorXYs[:,:,1], 0, color_height-1)
29 |
30 | align_color_img = np.zeros((depth_height,depth_width, 4), dtype=np.uint8)
31 | align_color_img[:, :] = color_img[colorYs, colorXs, :]
32 |
33 | return align_color_img
34 |
35 |
36 | ##################################
37 | ### Get the joints information ###
38 | ##################################
39 | def get_single_joint(joints, jointPoints, jointType):
40 | jointState = joints[jointType].TrackingState;
41 |
42 | # Joint not tracked or not 'really' tracked
43 | if (jointState == PyKinectV2.TrackingState_NotTracked) or (jointState == PyKinectV2.TrackingState_Inferred):
44 | return np.zeros((1,2), dtype=np.int32) # Return zeros
45 | else:
46 | return np.array([jointPoints[jointType].x, jointPoints[jointType].y], dtype=np.int32)
47 |
48 |
49 | def get_joint2D(joints, jointPoints):
50 | joint2D = np.zeros((PyKinectV2.JointType_Count,2), dtype=np.int32) # [25, 2] Note: Total 25 joints
51 | for i in range(PyKinectV2.JointType_Count):
52 | joint2D[i,:] = get_single_joint(joints, jointPoints, i)
53 |
54 | return joint2D
55 |
56 |
57 | def get_joint3D(joints, jointPoints, depth_img, intrinsics, depth_scale):
58 | joint3D = np.zeros((PyKinectV2.JointType_Count,3), dtype=np.float32) # [25, 3] Note: Total 25 joints
59 | joint2D = get_joint2D(joints, jointPoints)
60 |
61 | fx = intrinsics.intrinsic_matrix[0,0]
62 | fy = intrinsics.intrinsic_matrix[1,1]
63 | cx = intrinsics.intrinsic_matrix[0,2]
64 | cy = intrinsics.intrinsic_matrix[1,2]
65 |
66 | # Back project the 2D points to 3D coor
67 | for i in range(PyKinectV2.JointType_Count):
68 | u, v = joint2D[i,0], joint2D[i,1]
69 | joint3D[i,2] = depth_img[v,u]*depth_scale # Z coor
70 | joint3D[i,0] = (u-cx)*joint3D[i,2]/fx # X coor
71 | joint3D[i,1] = (v-cy)*joint3D[i,2]/fy # Y coor
72 |
73 | return joint3D
74 |
75 |
76 | def get_joint_quaternions(orientations):
77 | quat = np.zeros((PyKinectV2.JointType_Count,4), dtype=np.float32) # [25, 4] Note: Total 25 joints
78 | for i in range(PyKinectV2.JointType_Count):
79 | quat[i,0] = orientations[i].Orientation.w
80 | quat[i,1] = orientations[i].Orientation.x
81 | quat[i,2] = orientations[i].Orientation.y
82 | quat[i,3] = orientations[i].Orientation.z
83 |
84 | return quat
85 |
86 |
87 | ######################
88 | ### Draw on OpenCV ###
89 | ######################
90 | # Define the BGR color for 6 different bodies
91 | colors_order = [(0,0,255), # Red
92 | (0,255,0), # Green
93 | (255,0,0), # Blue
94 | (0,255,255), # Yellow
95 | (255,0,255), # Magenta
96 | (255,255,0)] # Cyan
97 | def draw_joint2D(img, j2D, color=(0,0,255)): # Default red circles
98 | for i in range(j2D.shape[0]): # Should loop 25 times
99 | cv2.circle(img, (j2D[i,0],j2D[i,1]), 5, color, -1)
100 |
101 | return img
102 |
103 |
104 | def draw_bone2D(img, j2D, color=(0,0,255)): # Default red lines
105 | # Define the kinematic tree where each of the 25 joints is associated to a parent joint
106 | k = [0,0,1,2, # Spine
107 | 20,4,5,6, # Left arm
108 | 20,8,9,10, # Right arm
109 | 0,12,13,14, # Left leg
110 | 0,16,17,18, # Right leg
111 | 1, # Spine
112 | 7,7, # Left hand
113 | 11,11] # Right hand
114 |
115 | for i in range(j2D.shape[0]): # Should loop 25 times
116 | if j2D[k[i],0]>0 and j2D[k[i],1]>0 and j2D[i,0]>0 and j2D[i,1]>0:
117 | cv2.line(img, (j2D[k[i],0],j2D[k[i],1]), (j2D[i,0],j2D[i,1]), color)
118 |
119 | return img
120 |
121 |
122 | def color_body_index(kinect, img):
123 | height, width = img.shape
124 | color_img = np.zeros((height, width, 3), dtype=np.uint8)
125 | for i in range(kinect.max_body_count):
126 | color_img[np.where(img == i)] = colors_order[i]
127 |
128 | return color_img
129 |
130 |
131 | def draw_bodyframe(body_frame, kinect, img):
132 | if body_frame is not None:
133 | for i in range(0, kinect.max_body_count):
134 | body = body_frame.bodies[i]
135 | if body.is_tracked:
136 | joints = body.joints
137 | joint_points = kinect.body_joints_to_depth_space(joints) # Convert joint coordinates to depth space
138 | joint2D = get_joint2D(joints, joint_points) # Convert to numpy array format
139 | img = draw_joint2D(img, joint2D, colors_order[i])
140 | img = draw_bone2D(img, joint2D, colors_order[i])
141 |
142 | return img
143 |
144 |
145 | ################################
146 | ### For Open3D visualisation ###
147 | ################################
148 | def create_line_set_bones(joints):
149 | # Draw the 24 bones (lines) connecting 25 joints
150 |     # The lines below define the kinematic tree connecting parent and child joints
151 | lines = [[0,1],[1,20],[20,2],[2,3], # Spine
152 | [20,4],[4,5],[5,6],[6,7],[7,21],[7,22], # Left arm and hand
153 | [20,8],[8,9],[9,10],[10,11],[11,23],[11,24], # Right arm and hand
154 | [0,12],[12,13],[13,14],[14,15], # Left leg
155 | [0,16],[16,17],[17,18],[18,19]] # Right leg
156 | colors = [[0,0,1] for i in range(24)] # Default blue
157 | line_set = LineSet()
158 | line_set.lines = Vector2iVector(lines)
159 | line_set.colors = Vector3dVector(colors)
160 | line_set.points = Vector3dVector(joints)
161 |
162 | return line_set
163 |
164 |
165 | def create_color_point_cloud(align_color_img, depth_img,
166 | depth_scale, clipping_distance_in_meters, intrinsic):
167 |
168 |     align_color_img = align_color_img[:,:,0:3] # Only get the first three channels
169 | align_color_img = align_color_img[...,::-1] # Convert opencv BGR to RGB
170 | rgbd_image = create_rgbd_image_from_color_and_depth(
171 | Image(align_color_img.copy()),
172 | Image(depth_img),
173 | depth_scale=1.0/depth_scale,
174 | depth_trunc=clipping_distance_in_meters,
175 | convert_rgb_to_intensity = False)
176 | pcd = create_point_cloud_from_rgbd_image(rgbd_image, intrinsic)
177 |
178 | # Point cloud only without color
179 | # pcd = create_point_cloud_from_depth_image(
180 | # Image(depth_img),
181 | # intrinsic,
182 | # depth_scale=1.0/depth_scale,
183 | # depth_trunc=clipping_distance_in_meters)
184 |
185 | return pcd.points, pcd.colors
186 |
187 |
188 | def get_single_joint3D_and_orientation(kinect, body_frame, depth_img, intrinsic, depth_scale):
189 | joint3D = np.zeros((PyKinectV2.JointType_Count,3), dtype=np.float32)
190 | orientation = np.zeros((PyKinectV2.JointType_Count,4), dtype=np.float32)
191 |
192 | if body_frame is not None:
193 | for i in range(0, kinect.max_body_count):
194 | body = body_frame.bodies[i]
195 | if body.is_tracked:
196 | joints = body.joints
197 | joint_points = kinect.body_joints_to_depth_space(joints) # Convert joint coordinates to depth space
198 | joint3D = get_joint3D(joints, joint_points, depth_img, intrinsic, depth_scale) # Convert to numpy array format
199 | orientation = get_joint_quaternions(body.joint_orientations)
200 |
201 | # Note: Currently only return single set of joint3D and orientations
202 | return joint3D, orientation
203 |
204 |
205 | def transform_geometry_quaternion(joint3D, orientation):
206 |
207 | qw,qx,qy,qz = orientation[0],orientation[1],orientation[2],orientation[3]
208 | tx,ty,tz = joint3D[0],joint3D[1],joint3D[2]
209 |
210 | # Convert quaternion to rotation matrix
211 | # http://www.euclideanspace.com/maths/geometry/rotations/conversions/quaternionToMatrix/index.htm
212 | transform_matrix = [[ 1 - 2*qy*qy - 2*qz*qz, 2*qx*qy - 2*qz*qw , 2*qx*qz + 2*qy*qw , tx],
213 | [ 2*qx*qy + 2*qz*qw , 1 - 2*qx*qx - 2*qz*qz, 2*qy*qz - 2*qx*qw , ty],
214 | [ 2*qx*qz - 2*qy*qw , 2*qy*qz + 2*qx*qw , 1 - 2*qx*qx - 2*qy*qy, tz],
215 | [ 0, 0, 0, 1]]
216 |
217 | return transform_matrix
218 |
--------------------------------------------------------------------------------
/pykinect2/PyKinectRuntime.py:
--------------------------------------------------------------------------------
1 | from pykinect2 import PyKinectV2
2 | from pykinect2.PyKinectV2 import *
3 |
4 | import ctypes
5 | import _ctypes
6 | from _ctypes import COMError
7 | import comtypes
8 | import sys
9 | import numpy
10 | import time
11 |
12 | import importlib
13 |
14 | if sys.hexversion >= 0x03000000:
15 | import _thread as thread
16 | else:
17 | import thread
18 |
19 | KINECT_MAX_BODY_COUNT = 6
20 |
21 | class PyKinectRuntime(object):
22 |     """Manages Kinect objects and simplifies access to them"""
23 | def __init__(self, frame_source_types):
24 | # recipe to get address of surface: http://archives.seul.org/pygame/users/Apr-2008/msg00218.html
25 | is_64bits = sys.maxsize > 2**32
26 | if not is_64bits:
27 | self.Py_ssize_t = ctypes.c_int
28 | else:
29 | self.Py_ssize_t = ctypes.c_int64
30 |
31 | self._PyObject_AsWriteBuffer = ctypes.pythonapi.PyObject_AsWriteBuffer
32 | self._PyObject_AsWriteBuffer.restype = ctypes.c_int
33 | self._PyObject_AsWriteBuffer.argtypes = [ctypes.py_object,
34 | ctypes.POINTER(ctypes.c_void_p),
35 | ctypes.POINTER(self.Py_ssize_t)]
36 |
37 | #self._color_frame_ready = PyKinectV2._event()
38 | #self._depth_frame_ready = PyKinectV2._event()
39 | #self._body_frame_ready = PyKinectV2._event()
40 | #self._body_index_frame_ready = PyKinectV2._event()
41 | #self._infrared_frame_ready = PyKinectV2._event()
42 | #self._long_exposure_infrared_frame_ready = PyKinectV2._event()
43 | #self._audio_frame_ready = PyKinectV2._event()
44 |
45 | self._close_event = ctypes.windll.kernel32.CreateEventW(None, False, False, None)
46 |
47 | self._color_frame_arrived_event = 0
48 | self._depth_frame_arrived_event = 0
49 | self._body_frame_arrived_event = 0
50 | self._body_index_frame_arrived_event = 0
51 | self._infrared_frame_arrived_event = 0
52 | self._long_exposure_infrared_frame_arrived_event = 0
53 | self._audio_frame_arrived_event = 0
54 |
55 | self._color_frame_lock = thread.allocate()
56 | self._depth_frame_lock = thread.allocate()
57 | self._body_frame_lock = thread.allocate()
58 | self._body_index_frame_lock = thread.allocate()
59 | self._infrared_frame_lock = thread.allocate()
60 | self._long_exposure_infrared_frame_lock = thread.allocate()
61 | self._audio_frame_lock = thread.allocate()
62 |
63 | #initialize sensor
64 | self._sensor = ctypes.POINTER(PyKinectV2.IKinectSensor)()
65 | hres = ctypes.windll.kinect20.GetDefaultKinectSensor(ctypes.byref(self._sensor))
66 | hres = self._sensor.Open()
67 |
68 | self._mapper = self._sensor.CoordinateMapper
69 |
70 | self.frame_source_types = frame_source_types
71 | self.max_body_count = KINECT_MAX_BODY_COUNT
72 |
73 | self._handles = (ctypes.c_voidp * 8)()
74 | self._handles[0] = self._close_event
75 | self._handles[1] = self._close_event
76 | self._handles[2] = self._close_event
77 | self._handles[3] = self._close_event
78 | self._handles[4] = self._close_event
79 | self._handles[5] = self._close_event
80 | self._handles[6] = self._close_event
81 | self._handles[7] = self._close_event
82 |
83 | self._waitHandleCount = 1
84 |
85 | self._color_source = self._sensor.ColorFrameSource
86 | self.color_frame_desc = self._color_source.FrameDescription
87 | self._infrared_source = self._sensor.InfraredFrameSource
88 | self.infrared_frame_desc = self._infrared_source.FrameDescription
89 | self._depth_source = self._sensor.DepthFrameSource
90 | self.depth_frame_desc = self._depth_source.FrameDescription
91 | self._body_index_source = self._sensor.BodyIndexFrameSource
92 | self.body_index_frame_desc = self._body_index_source.FrameDescription
93 | self._body_source = self._sensor.BodyFrameSource
94 | self._body_frame_data = ctypes.POINTER(ctypes.POINTER(IBody))
95 | self.max_body_count = self._body_source.BodyCount
96 |
97 | self._color_frame_data = None
98 | self._depth_frame_data = None
99 | self._body_frame_data = None
100 | self._body_index_frame_data = None
101 | self._infrared_frame_data = None
102 | self._long_exposure_infrared_frame_data = None
103 | self._audio_frame_data = None
104 |
105 | if(self.frame_source_types & FrameSourceTypes_Color):
106 | self._color_frame_data = ctypes.POINTER(ctypes.c_ubyte)
107 | self._color_frame_data_capacity = ctypes.c_uint(self.color_frame_desc.Width * self.color_frame_desc.Height * 4)
108 | self._color_frame_data_type = ctypes.c_ubyte * self._color_frame_data_capacity.value
109 | self._color_frame_data = ctypes.cast(self._color_frame_data_type(), ctypes.POINTER(ctypes.c_ubyte))
110 | self._color_frame_reader = self._color_source.OpenReader()
111 | self._color_frame_arrived_event = self._color_frame_reader.SubscribeFrameArrived()
112 | self._handles[self._waitHandleCount] = self._color_frame_arrived_event
113 | self._waitHandleCount += 1
114 |
115 | if(self.frame_source_types & FrameSourceTypes_Infrared):
116 | self._infrared_frame_data = ctypes.POINTER(ctypes.c_ushort)
117 | self._infrared_frame_data_capacity = ctypes.c_uint(self.infrared_frame_desc.Width * self.infrared_frame_desc.Height)
118 | self._infrared_frame_data_type = ctypes.c_ushort * self._infrared_frame_data_capacity.value
119 | self._infrared_frame_data = ctypes.cast(self._infrared_frame_data_type(), ctypes.POINTER(ctypes.c_ushort))
120 | self._infrared_frame_reader = self._infrared_source.OpenReader()
121 | self._infrared_frame_arrived_event = self._infrared_frame_reader.SubscribeFrameArrived()
122 | self._handles[self._waitHandleCount] = self._infrared_frame_arrived_event
123 | self._waitHandleCount += 1
124 |
125 | if(self.frame_source_types & FrameSourceTypes_Depth):
126 | self._depth_frame_data = ctypes.POINTER(ctypes.c_ushort)
127 | self._depth_frame_data_capacity = ctypes.c_uint(self.depth_frame_desc.Width * self.depth_frame_desc.Height)
128 | self._depth_frame_data_type = ctypes.c_ushort * self._depth_frame_data_capacity.value
129 | self._depth_frame_data = ctypes.cast(self._depth_frame_data_type(), ctypes.POINTER(ctypes.c_ushort))
130 | self._depth_frame_reader = self._depth_source.OpenReader()
131 | self._depth_frame_arrived_event = self._depth_frame_reader.SubscribeFrameArrived()
132 | self._handles[self._waitHandleCount] = self._depth_frame_arrived_event
133 | self._waitHandleCount += 1
134 |
135 | if(self.frame_source_types & FrameSourceTypes_BodyIndex):
136 | self._body_index_frame_data = ctypes.POINTER(ctypes.c_ubyte)
137 | self._body_index_frame_data_capacity = ctypes.c_uint(self.body_index_frame_desc.Width * self.body_index_frame_desc.Height)
138 | self._body_index_frame_data_type = ctypes.c_ubyte * self._body_index_frame_data_capacity.value
139 | self._body_index_frame_data = ctypes.cast(self._body_index_frame_data_type(), ctypes.POINTER(ctypes.c_ubyte))
140 | self._body_index_frame_reader = self._body_index_source.OpenReader()
141 | self._body_index_frame_arrived_event = self._body_index_frame_reader.SubscribeFrameArrived()
142 | self._handles[self._waitHandleCount] = self._body_index_frame_arrived_event
143 | self._waitHandleCount += 1
144 |
145 | self._body_frame_data = None
146 | if(self.frame_source_types & FrameSourceTypes_Body):
147 | self._body_frame_data_capacity = ctypes.c_uint(self.max_body_count)
148 | self._body_frame_data_type = ctypes.POINTER(IBody) * self._body_frame_data_capacity.value
149 | self._body_frame_data = ctypes.cast(self._body_frame_data_type(), ctypes.POINTER(ctypes.POINTER(IBody)))
150 | self._body_frame_reader = self._body_source.OpenReader()
151 | self._body_frame_arrived_event = self._body_frame_reader.SubscribeFrameArrived()
152 | self._body_frame_bodies = None
153 | self._handles[self._waitHandleCount] = self._body_frame_arrived_event
154 | self._waitHandleCount += 1
155 |
156 | thread.start_new_thread(self.kinect_frame_thread, ())
157 |
158 | self._last_color_frame = None
159 | self._last_depth_frame = None
160 | self._last_body_frame = None
161 | self._last_body_index_frame = None
162 | self._last_infrared_frame = None
163 | self._last_long_exposure_infrared_frame = None
164 | self._last_audio_frame = None
165 |
166 | start_clock = time.clock()
167 | self._last_color_frame_access = self._last_color_frame_time = start_clock
168 | self._last_body_frame_access = self._last_body_frame_time = start_clock
169 | self._last_body_index_frame_access = self._last_body_index_frame_time = start_clock
170 | self._last_depth_frame_access = self._last_depth_frame_time = start_clock
171 | self._last_infrared_frame_access = self._last_infrared_frame_time = start_clock
172 | self._last_long_exposure_infrared_frame_access = self._last_long_exposure_infrared_frame_time = start_clock
173 | self._last_audio_frame_access = self._last_audio_frame_time = start_clock
174 |
175 | def close(self):
176 | if self._sensor is not None:
177 | ctypes.windll.kernel32.SetEvent(self._close_event)
178 | ctypes.windll.kernel32.CloseHandle(self._close_event)
179 |
180 | self._color_frame_reader = None
181 | self._depth_frame_reader = None
182 | self._body_index_frame_reader = None
183 | self._body_frame_reader = None
184 |
185 | self._color_source = None
186 | self._depth_source = None
187 | self._body_index_source = None
188 | self._body_source = None
189 |
190 | self._body_frame_data = None
191 |
192 | self._sensor.Close()
193 | self._sensor = None
194 |
195 | def __del__(self):
196 | self.close()
197 |
198 | def __enter__(self):
199 | return self
200 |
201 | def __exit__(self, *args):
202 | self.close()
203 |
204 | def surface_as_array(self, surface_buffer_interface):
205 | address = ctypes.c_void_p()
206 | size = self.Py_ssize_t()
207 | self._PyObject_AsWriteBuffer(surface_buffer_interface,
208 | ctypes.byref(address), ctypes.byref(size))
209 | bytes = (ctypes.c_byte * size.value).from_address(address.value)
210 | bytes.object = surface_buffer_interface
211 | return bytes
212 |
213 | def has_new_color_frame(self):
214 | has = (self._last_color_frame_time > self._last_color_frame_access)
215 | return has
216 |
217 | def has_new_depth_frame(self):
218 | has = (self._last_depth_frame_time > self._last_depth_frame_access)
219 | return has
220 |
221 | def has_new_body_frame(self):
222 | has = (self._last_body_frame_time > self._last_body_frame_access)
223 | return has
224 |
225 | def has_new_body_index_frame(self):
226 | has = (self._last_body_index_frame_time > self._last_body_index_frame_access)
227 | return has
228 |
229 | def has_new_infrared_frame(self):
230 | has = (self._last_infrared_frame_time > self._last_infrared_frame_access)
231 | return has
232 |
233 | def has_new_long_exposure_infrared_frame(self):
234 | has = (self._last_long_exposure_infrared_frame_time > self._last_long_exposure_infrared_frame_access)
235 | return has
236 |
237 | def has_new_audio_frame(self):
238 | has = (self._last_audio_frame_time > self._last_audio_frame_access)
239 | return has
240 |
241 |
242 | def get_last_color_frame(self):
243 | with self._color_frame_lock:
244 | if self._color_frame_data is not None:
245 | data = numpy.copy(numpy.ctypeslib.as_array(self._color_frame_data, shape=(self._color_frame_data_capacity.value,)))
246 | self._last_color_frame_access = time.clock()
247 | return data
248 | else:
249 | return None
250 |
251 | def get_last_infrared_frame(self):
252 | with self._infrared_frame_lock:
253 | if self._infrared_frame_data is not None:
254 | data = numpy.copy(numpy.ctypeslib.as_array(self._infrared_frame_data, shape=(self._infrared_frame_data_capacity.value,)))
255 | self._last_infrared_frame_access = time.clock()
256 | return data
257 | else:
258 | return None
259 |
260 | def get_last_depth_frame(self):
261 | with self._depth_frame_lock:
262 | if self._depth_frame_data is not None:
263 | data = numpy.copy(numpy.ctypeslib.as_array(self._depth_frame_data, shape=(self._depth_frame_data_capacity.value,)))
264 | self._last_depth_frame_access = time.clock()
265 | return data
266 | else:
267 | return None
268 |
269 | def get_last_body_index_frame(self):
270 | with self._body_index_frame_lock:
271 | if self._body_index_frame_data is not None:
272 | data = numpy.copy(numpy.ctypeslib.as_array(self._body_index_frame_data, shape=(self._body_index_frame_data_capacity.value,)))
273 | self._last_body_index_frame_access = time.clock()
274 | return data
275 | else:
276 | return None
277 |
278 | def get_last_body_frame(self):
279 | with self._body_frame_lock:
280 | if self._body_frame_bodies is not None:
281 | self._last_body_frame_access = time.clock()
282 | return self._body_frame_bodies.copy()
283 | else:
284 | return None
285 |
286 |
287 | def body_joint_to_color_space(self, joint):
288 | return self._mapper.MapCameraPointToColorSpace(joint.Position)
289 |
290 | def body_joint_to_depth_space(self, joint):
291 | return self._mapper.MapCameraPointToDepthSpace(joint.Position)
292 |
293 |
294 | def body_joints_to_color_space(self, joints):
295 | joint_points = numpy.ndarray((PyKinectV2.JointType_Count), dtype=numpy.object)
296 |
297 | for j in range(0, PyKinectV2.JointType_Count):
298 | joint_points[j] = self.body_joint_to_color_space(joints[j])
299 |
300 | return joint_points
301 |
302 | def body_joints_to_depth_space(self, joints):
303 | joint_points = numpy.ndarray((PyKinectV2.JointType_Count), dtype=numpy.object)
304 |
305 | for j in range(0, PyKinectV2.JointType_Count):
306 | joint_points[j] = self.body_joint_to_depth_space(joints[j])
307 |
308 | return joint_points
309 |
310 | def kinect_frame_thread(self):
311 | while 1:
312 | wait = ctypes.windll.kernel32.WaitForMultipleObjects(self._waitHandleCount, self._handles, False, PyKinectV2._INFINITE)
313 |
314 | if wait == 0:
315 | break
316 |
317 | if self._handles[wait] == self._color_frame_arrived_event:
318 | self.handle_color_arrived(wait)
319 | elif self._handles[wait] == self._depth_frame_arrived_event:
320 | self.handle_depth_arrived(wait)
321 | elif self._handles[wait] == self._body_frame_arrived_event:
322 | self.handle_body_arrived(wait)
323 | elif self._handles[wait] == self._body_index_frame_arrived_event:
324 | self.handle_body_index_arrived(wait)
325 | elif self._handles[wait] == self._infrared_frame_arrived_event:
326 | self.handle_infrared_arrived(wait)
327 | elif self._handles[wait] == self._long_exposure_infrared_frame_arrived_event:
328 | self.handle_long_exposure_infrared_arrived(wait)
329 | elif self._handles[wait] == self._audio_frame_arrived_event:
330 | self.handle_audio_arrived(wait)
331 | else:
332 | break
333 |
334 |
335 | def handle_color_arrived(self, handle_index):
336 | colorFrameEventData = self._color_frame_reader.GetFrameArrivedEventData(self._handles[handle_index])
337 | colorFrameRef = colorFrameEventData.FrameReference
338 | try:
339 | colorFrame = colorFrameRef.AcquireFrame()
340 | try:
341 | with self._color_frame_lock:
342 | colorFrame.CopyConvertedFrameDataToArray(self._color_frame_data_capacity, self._color_frame_data, PyKinectV2.ColorImageFormat_Bgra)
343 | self._last_color_frame_time = time.clock()
344 | except:
345 | pass
346 | colorFrame = None
347 | except:
348 | pass
349 | colorFrameRef = None
350 | colorFrameEventData = None
351 |
352 |
353 | def handle_depth_arrived(self, handle_index):
354 | depthFrameEventData = self._depth_frame_reader.GetFrameArrivedEventData(self._handles[handle_index])
355 | depthFrameRef = depthFrameEventData.FrameReference
356 | try:
357 | depthFrame = depthFrameRef.AcquireFrame()
358 | try:
359 | with self._depth_frame_lock:
360 | depthFrame.CopyFrameDataToArray(self._depth_frame_data_capacity, self._depth_frame_data)
361 | self._last_depth_frame_time = time.clock()
362 | except:
363 | pass
364 | depthFrame = None
365 | except:
366 | pass
367 | depthFrameRef = None
368 | depthFrameEventData = None
369 |
370 |
371 | def handle_body_arrived(self, handle_index):
372 | bodyFrameEventData = self._body_frame_reader.GetFrameArrivedEventData(self._handles[handle_index])
373 |         bodyFrameRef = bodyFrameEventData.FrameReference
374 | try:
375 |             bodyFrame = bodyFrameRef.AcquireFrame()
376 |
377 | try:
378 | with self._body_frame_lock:
379 | bodyFrame.GetAndRefreshBodyData(self._body_frame_data_capacity, self._body_frame_data)
380 | self._body_frame_bodies = KinectBodyFrameData(bodyFrame, self._body_frame_data, self.max_body_count)
381 | self._last_body_frame_time = time.clock()
382 |
383 | # need these 2 lines as a workaround for handling IBody referencing exception
384 | self._body_frame_data = None
385 | self._body_frame_data = ctypes.cast(self._body_frame_data_type(), ctypes.POINTER(ctypes.POINTER(IBody)))
386 |
387 | except:
388 | pass
389 |
390 | bodyFrame = None
391 | except:
392 | pass
393 |         bodyFrameRef = None
394 | bodyFrameEventData = None
395 |
396 |
397 | def handle_body_index_arrived(self, handle_index):
398 | bodyIndexFrameEventData = self._body_index_frame_reader.GetFrameArrivedEventData(self._handles[handle_index])
399 | bodyIndexFrameRef = bodyIndexFrameEventData.FrameReference
400 | try:
401 | bodyIndexFrame = bodyIndexFrameRef.AcquireFrame()
402 | try:
403 | with self._body_index_frame_lock:
404 | bodyIndexFrame.CopyFrameDataToArray(self._body_index_frame_data_capacity, self._body_index_frame_data)
405 | self._last_body_index_frame_time = time.clock()
406 | except:
407 | pass
408 | bodyIndexFrame = None
409 | except:
410 | pass
411 |         bodyIndexFrameRef = None
412 | bodyIndexFrameEventData = None
413 |
414 | def handle_infrared_arrived(self, handle_index):
415 | infraredFrameEventData = self._infrared_frame_reader.GetFrameArrivedEventData(self._handles[handle_index])
416 | infraredFrameRef = infraredFrameEventData.FrameReference
417 | try:
418 | infraredFrame = infraredFrameRef.AcquireFrame()
419 | try:
420 | with self._infrared_frame_lock:
421 | infraredFrame.CopyFrameDataToArray(self._infrared_frame_data_capacity, self._infrared_frame_data)
422 | self._last_infrared_frame_time = time.clock()
423 | except:
424 | pass
425 | infraredFrame = None
426 | except:
427 | pass
428 | infraredFrameRef = None
429 | infraredFrameEventData = None
430 |
431 | def handle_long_exposure_infrared_arrived(self, handle_index):
432 | pass
433 |
434 | def handle_audio_arrived(self, handle_index):
435 | pass
436 |
437 |
438 |
439 | class KinectBody(object):
440 | def __init__(self, body = None):
441 | self.is_restricted = False
442 | self.tracking_id = -1
443 |
444 | self.is_tracked = False
445 |
446 | if body is not None:
447 | self.is_tracked = body.IsTracked
448 |
449 | if self.is_tracked:
450 | self.is_restricted = body.IsRestricted
451 | self.tracking_id = body.TrackingId
452 | self.engaged = body.Engaged
453 | self.lean = body.Lean
454 | self.lean_tracking_state = body.LeanTrackingState
455 | self.hand_left_state = body.HandLeftState
456 | self.hand_left_confidence = body.HandLeftConfidence
457 | self.hand_right_state = body.HandRightState
458 | self.hand_right_confidence = body.HandRightConfidence
459 | self.clipped_edges = body.ClippedEdges
460 |
461 | joints = ctypes.POINTER(PyKinectV2._Joint)
462 | joints_capacity = ctypes.c_uint(PyKinectV2.JointType_Count)
463 | joints_data_type = PyKinectV2._Joint * joints_capacity.value
464 | joints = ctypes.cast(joints_data_type(), ctypes.POINTER(PyKinectV2._Joint))
465 | body.GetJoints(PyKinectV2.JointType_Count, joints)
466 | self.joints = joints
467 |
468 | joint_orientations = ctypes.POINTER(PyKinectV2._JointOrientation)
469 | joint_orientations_data_type = PyKinectV2._JointOrientation * joints_capacity.value
470 | joint_orientations = ctypes.cast(joint_orientations_data_type(), ctypes.POINTER(PyKinectV2._JointOrientation))
471 | body.GetJointOrientations(PyKinectV2.JointType_Count, joint_orientations)
472 | self.joint_orientations = joint_orientations
473 |
474 |
475 | class KinectBodyFrameData(object):
476 | def __init__(self, bodyFrame, body_frame_data, max_body_count):
477 | self.bodies = None
478 | self.floor_clip_plane = None
479 | if bodyFrame is not None:
480 | self.floor_clip_plane = bodyFrame.FloorClipPlane
481 | self.relative_time = bodyFrame.RelativeTime
482 |
483 | self.bodies = numpy.ndarray((max_body_count), dtype=numpy.object)
484 | for i in range(0, max_body_count):
485 | self.bodies[i] = KinectBody(body_frame_data[i])
486 |
487 | def copy(self):
488 | res = KinectBodyFrameData(None, None, 0)
489 | res.floor_clip_plane = self.floor_clip_plane
490 | res.relative_time = self.relative_time
491 | res.bodies = numpy.copy(self.bodies)
492 | return res
493 |
494 |
495 |
--------------------------------------------------------------------------------
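The frame thread above only caches the most recent data; callers poll the runtime for it. A minimal polling sketch (not part of the repository), assuming the has_new_body_frame()/get_last_body_frame() accessors defined earlier in PyKinectRuntime.py and that the mapped color-space points expose x/y fields:

import time
from pykinect2 import PyKinectV2, PyKinectRuntime

# Open the sensor with only the body stream subscribed (FrameSourceTypes_Body == 32).
kinect = PyKinectRuntime.PyKinectRuntime(PyKinectV2.FrameSourceTypes_Body)

while True:
    if kinect.has_new_body_frame():
        body_frame = kinect.get_last_body_frame()
        for body in body_frame.bodies:
            if not body.is_tracked:
                continue
            # Map all 25 skeleton joints into color-camera pixel coordinates.
            joint_points = kinect.body_joints_to_color_space(body.joints)
            head = joint_points[PyKinectV2.JointType_Head]
            print('head at (%.0f, %.0f) in color space' % (head.x, head.y))
    time.sleep(0.01)

The same polling pattern applies to the color, depth, infrared and body-index caches filled by the handlers above.
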
/pykinect2/PyKinectV2.py:
--------------------------------------------------------------------------------
1 | # -*- coding: mbcs -*-
2 | typelib_path = 'c:\\Users\\vladkol\\Documents\\PyKinect2\\idl\\Kinect.tlb'
3 | _lcid = 0 # change this if required
4 | import ctypes
5 | import comtypes
6 | from ctypes import *
7 | from comtypes import *
8 | from comtypes import GUID
9 | from ctypes import HRESULT
10 | from comtypes import helpstring
11 | from comtypes import COMMETHOD
12 | from comtypes import dispid
13 | STRING = c_char_p
14 | INT_PTR = c_int
15 | from ctypes.wintypes import _LARGE_INTEGER
16 | from ctypes.wintypes import _ULARGE_INTEGER
17 | from ctypes.wintypes import _ULARGE_INTEGER
18 | from ctypes.wintypes import _FILETIME
19 | WSTRING = c_wchar_p
20 |
21 | from _ctypes import COMError
22 | comtypes.hresult.E_PENDING = 0x8000000A
23 |
24 | import numpy.distutils.system_info as sysinfo
25 |
26 |
27 | class _event(object):
28 | """class used for adding/removing/invoking a set of listener functions"""
29 | __slots__ = ['handlers']
30 |
31 | def __init__(self):
32 | self.handlers = []
33 |
34 | def __iadd__(self, other):
35 | self.handlers.append(other)
36 | return self
37 |
38 | def __isub__(self, other):
39 | self.handlers.remove(other)
40 | return self
41 |
42 | def fire(self, *args):
43 | for handler in self.handlers:
44 | handler(*args)
45 |
46 | class IBody(comtypes.IUnknown):
47 | _case_insensitive_ = True
48 | _iid_ = GUID('{46AEF731-98B0-4D18-827B-933758678F4A}')
49 | _idlflags_ = []
50 | class _Joint(Structure):
51 | pass
52 | class _JointOrientation(Structure):
53 | pass
54 |
55 | # values for enumeration '_DetectionResult'
56 | DetectionResult_Unknown = 0
57 | DetectionResult_No = 1
58 | DetectionResult_Maybe = 2
59 | DetectionResult_Yes = 3
60 | _DetectionResult = c_int # enum
61 |
62 | # values for enumeration '_HandState'
63 | HandState_Unknown = 0
64 | HandState_NotTracked = 1
65 | HandState_Open = 2
66 | HandState_Closed = 3
67 | HandState_Lasso = 4
68 | _HandState = c_int # enum
69 |
70 | # values for enumeration '_TrackingConfidence'
71 | TrackingConfidence_Low = 0
72 | TrackingConfidence_High = 1
73 | _TrackingConfidence = c_int # enum
74 | class _PointF(Structure):
75 | pass
76 |
77 | # values for enumeration '_TrackingState'
78 | TrackingState_NotTracked = 0
79 | TrackingState_Inferred = 1
80 | TrackingState_Tracked = 2
81 | _TrackingState = c_int # enum
82 | IBody._methods_ = [
83 | COMMETHOD([], HRESULT, 'GetJoints',
84 | ( [], c_uint, 'capacity' ),
85 | ( [], POINTER(_Joint), 'joints' )),
86 | COMMETHOD([], HRESULT, 'GetJointOrientations',
87 | ( [], c_uint, 'capacity' ),
88 | ( [], POINTER(_JointOrientation), 'jointOrientations' )),
89 | COMMETHOD(['propget'], HRESULT, 'Engaged',
90 | ( ['retval', 'out'], POINTER(_DetectionResult), 'detectionResult' )),
91 | COMMETHOD([], HRESULT, 'GetExpressionDetectionResults',
92 | ( [], c_uint, 'capacity' ),
93 | ( [], POINTER(_DetectionResult), 'detectionResults' )),
94 | COMMETHOD([], HRESULT, 'GetActivityDetectionResults',
95 | ( [], c_uint, 'capacity' ),
96 | ( [], POINTER(_DetectionResult), 'detectionResults' )),
97 | COMMETHOD([], HRESULT, 'GetAppearanceDetectionResults',
98 | ( [], c_uint, 'capacity' ),
99 | ( [], POINTER(_DetectionResult), 'detectionResults' )),
100 | COMMETHOD(['propget'], HRESULT, 'HandLeftState',
101 | ( ['retval', 'out'], POINTER(_HandState), 'handState' )),
102 | COMMETHOD(['propget'], HRESULT, 'HandLeftConfidence',
103 | ( ['retval', 'out'], POINTER(_TrackingConfidence), 'confidence' )),
104 | COMMETHOD(['propget'], HRESULT, 'HandRightState',
105 | ( ['retval', 'out'], POINTER(_HandState), 'handState' )),
106 | COMMETHOD(['propget'], HRESULT, 'HandRightConfidence',
107 | ( ['retval', 'out'], POINTER(_TrackingConfidence), 'confidence' )),
108 | COMMETHOD(['propget'], HRESULT, 'ClippedEdges',
109 | ( ['retval', 'out'], POINTER(c_ulong), 'ClippedEdges' )),
110 | COMMETHOD(['propget'], HRESULT, 'TrackingId',
111 | ( ['retval', 'out'], POINTER(c_ulonglong), 'TrackingId' )),
112 | COMMETHOD(['propget'], HRESULT, 'IsTracked',
113 | ( ['retval', 'out'], POINTER(c_bool), 'tracked' )),
114 | COMMETHOD(['propget'], HRESULT, 'IsRestricted',
115 | ( ['retval', 'out'], POINTER(c_bool), 'IsRestricted' )),
116 | COMMETHOD(['propget'], HRESULT, 'Lean',
117 | ( ['retval', 'out'], POINTER(_PointF), 'amount' )),
118 | COMMETHOD(['propget'], HRESULT, 'LeanTrackingState',
119 | ( ['retval', 'out'], POINTER(_TrackingState), 'TrackingState' )),
120 | ]
121 | ################################################################
122 | ## code template for IBody implementation
123 | ##class IBody_Impl(object):
124 | ## def GetJoints(self, capacity):
125 | ## '-no docstring-'
126 | ## #return joints
127 | ##
128 | ## @property
129 | ## def IsTracked(self):
130 | ## '-no docstring-'
131 | ## #return tracked
132 | ##
133 | ## @property
134 | ## def HandLeftState(self):
135 | ## '-no docstring-'
136 | ## #return handState
137 | ##
138 | ## @property
139 | ## def HandLeftConfidence(self):
140 | ## '-no docstring-'
141 | ## #return confidence
142 | ##
143 | ## @property
144 | ## def TrackingId(self):
145 | ## '-no docstring-'
146 | ## #return TrackingId
147 | ##
148 | ## @property
149 | ## def Lean(self):
150 | ## '-no docstring-'
151 | ## #return amount
152 | ##
153 | ## @property
154 | ## def Engaged(self):
155 | ## '-no docstring-'
156 | ## #return detectionResult
157 | ##
158 | ## @property
159 | ## def HandRightState(self):
160 | ## '-no docstring-'
161 | ## #return handState
162 | ##
163 | ## @property
164 | ## def ClippedEdges(self):
165 | ## '-no docstring-'
166 | ## #return ClippedEdges
167 | ##
168 | ## def GetJointOrientations(self, capacity):
169 | ## '-no docstring-'
170 | ## #return jointOrientations
171 | ##
172 | ## def GetExpressionDetectionResults(self, capacity):
173 | ## '-no docstring-'
174 | ## #return detectionResults
175 | ##
176 | ## @property
177 | ## def IsRestricted(self):
178 | ## '-no docstring-'
179 | ## #return IsRestricted
180 | ##
181 | ## def GetActivityDetectionResults(self, capacity):
182 | ## '-no docstring-'
183 | ## #return detectionResults
184 | ##
185 | ## @property
186 | ## def HandRightConfidence(self):
187 | ## '-no docstring-'
188 | ## #return confidence
189 | ##
190 | ## def GetAppearanceDetectionResults(self, capacity):
191 | ## '-no docstring-'
192 | ## #return detectionResults
193 | ##
194 | ## @property
195 | ## def LeanTrackingState(self):
196 | ## '-no docstring-'
197 | ## #return TrackingState
198 | ##
199 |
200 | class IColorCameraSettings(comtypes.IUnknown):
201 | _case_insensitive_ = True
202 | _iid_ = GUID('{DBF802AB-0ADF-485A-A844-CF1C7956D039}')
203 | _idlflags_ = []
204 | IColorCameraSettings._methods_ = [
205 | COMMETHOD(['propget'], HRESULT, 'ExposureTime',
206 | ( ['retval', 'out'], POINTER(c_longlong), 'ExposureTime' )),
207 | COMMETHOD(['propget'], HRESULT, 'FrameInterval',
208 | ( ['retval', 'out'], POINTER(c_longlong), 'FrameInterval' )),
209 | COMMETHOD(['propget'], HRESULT, 'Gain',
210 | ( ['retval', 'out'], POINTER(c_float), 'Gain' )),
211 | COMMETHOD(['propget'], HRESULT, 'Gamma',
212 | ( ['retval', 'out'], POINTER(c_float), 'Gamma' )),
213 | ]
214 | ################################################################
215 | ## code template for IColorCameraSettings implementation
216 | ##class IColorCameraSettings_Impl(object):
217 | ## @property
218 | ## def ExposureTime(self):
219 | ## '-no docstring-'
220 | ## #return ExposureTime
221 | ##
222 | ## @property
223 | ## def FrameInterval(self):
224 | ## '-no docstring-'
225 | ## #return FrameInterval
226 | ##
227 | ## @property
228 | ## def Gamma(self):
229 | ## '-no docstring-'
230 | ## #return Gamma
231 | ##
232 | ## @property
233 | ## def Gain(self):
234 | ## '-no docstring-'
235 | ## #return Gain
236 | ##
237 |
238 | class IAudioBeamFrameReader(comtypes.IUnknown):
239 | _case_insensitive_ = True
240 | _iid_ = GUID('{B5733DE9-6ECF-46B2-8B23-A16D71F1A75C}')
241 | _idlflags_ = []
242 | class IAudioBeamFrameArrivedEventArgs(comtypes.IUnknown):
243 | _case_insensitive_ = True
244 | _iid_ = GUID('{E0DBE62D-2045-4571-8D1D-ECF3981E3C3D}')
245 | _idlflags_ = []
246 | class IAudioBeamFrameList(comtypes.IUnknown):
247 | _case_insensitive_ = True
248 | _iid_ = GUID('{5393C8B9-C044-49CB-BDD6-23DFFFD7427E}')
249 | _idlflags_ = []
250 | class IAudioSource(comtypes.IUnknown):
251 | _case_insensitive_ = True
252 | _iid_ = GUID('{52D1D743-AED1-4E61-8AF8-19EF287A662C}')
253 | _idlflags_ = []
254 | IAudioBeamFrameReader._methods_ = [
255 | COMMETHOD([], HRESULT, 'SubscribeFrameArrived',
256 | ( ['retval', 'out'], POINTER(INT_PTR), 'waitableHandle' )),
257 | COMMETHOD([], HRESULT, 'UnsubscribeFrameArrived',
258 | ( ['in'], INT_PTR, 'waitableHandle' )),
259 | COMMETHOD([], HRESULT, 'GetFrameArrivedEventData',
260 | ( ['in'], INT_PTR, 'waitableHandle' ),
261 | ( ['retval', 'out'], POINTER(POINTER(IAudioBeamFrameArrivedEventArgs)), 'eventData' )),
262 | COMMETHOD([], HRESULT, 'AcquireLatestBeamFrames',
263 | ( ['retval', 'out'], POINTER(POINTER(IAudioBeamFrameList)), 'audioBeamFrameList' )),
264 | COMMETHOD(['propget'], HRESULT, 'IsPaused',
265 | ( ['retval', 'out'], POINTER(c_bool), 'IsPaused' )),
266 | COMMETHOD(['propput'], HRESULT, 'IsPaused',
267 | ( [], c_bool, 'IsPaused' )),
268 | COMMETHOD(['propget'], HRESULT, 'AudioSource',
269 | ( ['retval', 'out'], POINTER(POINTER(IAudioSource)), 'AudioSource' )),
270 | ]
271 | ################################################################
272 | ## code template for IAudioBeamFrameReader implementation
273 | ##class IAudioBeamFrameReader_Impl(object):
274 | ## def GetFrameArrivedEventData(self, waitableHandle):
275 | ## '-no docstring-'
276 | ## #return eventData
277 | ##
278 | ## @property
279 | ## def AudioSource(self):
280 | ## '-no docstring-'
281 | ## #return AudioSource
282 | ##
283 | ## def AcquireLatestBeamFrames(self):
284 | ## '-no docstring-'
285 | ## #return audioBeamFrameList
286 | ##
287 | ## def UnsubscribeFrameArrived(self, waitableHandle):
288 | ## '-no docstring-'
289 | ## #return
290 | ##
291 | ## def _get(self):
292 | ## '-no docstring-'
293 | ## #return IsPaused
294 | ## def _set(self, IsPaused):
295 | ## '-no docstring-'
296 | ## IsPaused = property(_get, _set, doc = _set.__doc__)
297 | ##
298 | ## def SubscribeFrameArrived(self):
299 | ## '-no docstring-'
300 | ## #return waitableHandle
301 | ##
302 |
303 | class IDepthFrame(comtypes.IUnknown):
304 | _case_insensitive_ = True
305 | _iid_ = GUID('{D8600853-8835-44F9-84A7-E617CDD7DFDD}')
306 | _idlflags_ = []
307 | class IFrameDescription(comtypes.IUnknown):
308 | _case_insensitive_ = True
309 | _iid_ = GUID('{21F6EFB7-EB6D-48F4-9C08-181A87BF0C98}')
310 | _idlflags_ = []
311 | class IDepthFrameSource(comtypes.IUnknown):
312 | _case_insensitive_ = True
313 | _iid_ = GUID('{C428D558-5E46-490A-B699-D1DDDAA24150}')
314 | _idlflags_ = []
315 | IDepthFrame._methods_ = [
316 | COMMETHOD([], HRESULT, 'CopyFrameDataToArray',
317 | ( [], c_uint, 'capacity' ),
318 | ( [], POINTER(c_ushort), 'frameData' )),
319 | COMMETHOD([], HRESULT, 'AccessUnderlyingBuffer',
320 | ( [], POINTER(c_uint), 'capacity' ),
321 | ( [], POINTER(POINTER(c_ushort)), 'buffer' )), #'out'
322 | COMMETHOD(['propget'], HRESULT, 'FrameDescription',
323 | ( ['retval', 'out'], POINTER(POINTER(IFrameDescription)), 'FrameDescription' )),
324 | COMMETHOD(['propget'], HRESULT, 'RelativeTime',
325 | ( ['retval', 'out'], POINTER(c_longlong), 'RelativeTime' )),
326 | COMMETHOD(['propget'], HRESULT, 'DepthFrameSource',
327 | ( ['retval', 'out'], POINTER(POINTER(IDepthFrameSource)), 'DepthFrameSource' )),
328 | COMMETHOD(['propget'], HRESULT, 'DepthMinReliableDistance',
329 | ( ['retval', 'out'], POINTER(c_ushort), 'DepthMinReliableDistance' )),
330 | COMMETHOD(['propget'], HRESULT, 'DepthMaxReliableDistance',
331 | ( ['retval', 'out'], POINTER(c_ushort), 'DepthMaxReliableDistance' )),
332 | ]
333 | ################################################################
334 | ## code template for IDepthFrame implementation
335 | ##class IDepthFrame_Impl(object):
336 | ## @property
337 | ## def RelativeTime(self):
338 | ## '-no docstring-'
339 | ## #return RelativeTime
340 | ##
341 | ## @property
342 | ## def DepthMaxReliableDistance(self):
343 | ## '-no docstring-'
344 | ## #return DepthMaxReliableDistance
345 | ##
346 | ## @property
347 | ## def FrameDescription(self):
348 | ## '-no docstring-'
349 | ## #return FrameDescription
350 | ##
351 | ## def CopyFrameDataToArray(self, capacity):
352 | ## '-no docstring-'
353 | ## #return frameData
354 | ##
355 | ## def AccessUnderlyingBuffer(self):
356 | ## '-no docstring-'
357 | ## #return capacity, buffer
358 | ##
359 | ## @property
360 | ## def DepthMinReliableDistance(self):
361 | ## '-no docstring-'
362 | ## #return DepthMinReliableDistance
363 | ##
364 | ## @property
365 | ## def DepthFrameSource(self):
366 | ## '-no docstring-'
367 | ## #return DepthFrameSource
368 | ##
369 |
370 | class IDepthFrameArrivedEventArgs(comtypes.IUnknown):
371 | _case_insensitive_ = True
372 | _iid_ = GUID('{2B01BCB8-29D7-4726-860C-6DA56664AA81}')
373 | _idlflags_ = []
374 | class IDepthFrameReference(comtypes.IUnknown):
375 | _case_insensitive_ = True
376 | _iid_ = GUID('{20621E5E-ABC9-4EBD-A7EE-4C77EDD0152A}')
377 | _idlflags_ = []
378 | IDepthFrameArrivedEventArgs._methods_ = [
379 | COMMETHOD(['propget'], HRESULT, 'FrameReference',
380 | ( ['retval', 'out'], POINTER(POINTER(IDepthFrameReference)), 'depthFrameReference' )),
381 | ]
382 | ################################################################
383 | ## code template for IDepthFrameArrivedEventArgs implementation
384 | ##class IDepthFrameArrivedEventArgs_Impl(object):
385 | ## @property
386 | ## def FrameReference(self):
387 | ## '-no docstring-'
388 | ## #return depthFrameReference
389 | ##
390 |
391 | class IColorFrameSource(comtypes.IUnknown):
392 | _case_insensitive_ = True
393 | _iid_ = GUID('{57621D82-D8EE-4783-B412-F7E019C96CFD}')
394 | _idlflags_ = []
395 | class IFrameCapturedEventArgs(comtypes.IUnknown):
396 | _case_insensitive_ = True
397 | _iid_ = GUID('{24CBAB8E-DF1A-4FA8-827E-C1B27A44A3A1}')
398 | _idlflags_ = []
399 | class IColorFrameReader(comtypes.IUnknown):
400 | _case_insensitive_ = True
401 | _iid_ = GUID('{9BEA498C-C59C-4653-AAF9-D884BAB7C35B}')
402 | _idlflags_ = []
403 |
404 | # values for enumeration '_ColorImageFormat'
405 | ColorImageFormat_None = 0
406 | ColorImageFormat_Rgba = 1
407 | ColorImageFormat_Yuv = 2
408 | ColorImageFormat_Bgra = 3
409 | ColorImageFormat_Bayer = 4
410 | ColorImageFormat_Yuy2 = 5
411 | _ColorImageFormat = c_int # enum
412 | class IKinectSensor(comtypes.IUnknown):
413 | _case_insensitive_ = True
414 | _iid_ = GUID('{3C6EBA94-0DE1-4360-B6D4-653A10794C8B}')
415 | _idlflags_ = []
416 | IColorFrameSource._methods_ = [
417 | COMMETHOD([], HRESULT, 'SubscribeFrameCaptured',
418 | ( ['retval', 'out'], POINTER(INT_PTR), 'waitableHandle' )),
419 | COMMETHOD([], HRESULT, 'UnsubscribeFrameCaptured',
420 | ( ['in'], INT_PTR, 'waitableHandle' )),
421 | COMMETHOD([], HRESULT, 'GetFrameCapturedEventData',
422 | ( ['in'], INT_PTR, 'waitableHandle' ),
423 | ( ['retval', 'out'], POINTER(POINTER(IFrameCapturedEventArgs)), 'eventData' )),
424 | COMMETHOD(['propget'], HRESULT, 'IsActive',
425 | ( ['retval', 'out'], POINTER(c_bool), 'IsActive' )),
426 | COMMETHOD([], HRESULT, 'OpenReader',
427 | ( ['retval', 'out'], POINTER(POINTER(IColorFrameReader)), 'reader' )),
428 | COMMETHOD([], HRESULT, 'CreateFrameDescription',
429 | ( [], _ColorImageFormat, 'format' ),
430 | ( ['retval', 'out'], POINTER(POINTER(IFrameDescription)), 'FrameDescription' )),
431 | COMMETHOD(['propget'], HRESULT, 'FrameDescription',
432 | ( ['retval', 'out'], POINTER(POINTER(IFrameDescription)), 'rawFrameDescription' )),
433 | COMMETHOD(['propget'], HRESULT, 'KinectSensor',
434 | ( ['retval', 'out'], POINTER(POINTER(IKinectSensor)), 'sensor' )),
435 | ]
436 | ################################################################
437 | ## code template for IColorFrameSource implementation
438 | ##class IColorFrameSource_Impl(object):
439 | ## def UnsubscribeFrameCaptured(self, waitableHandle):
440 | ## '-no docstring-'
441 | ## #return
442 | ##
443 | ## @property
444 | ## def KinectSensor(self):
445 | ## '-no docstring-'
446 | ## #return sensor
447 | ##
448 | ## def OpenReader(self):
449 | ## '-no docstring-'
450 | ## #return reader
451 | ##
452 | ## @property
453 | ## def FrameDescription(self):
454 | ## '-no docstring-'
455 | ## #return rawFrameDescription
456 | ##
457 | ## def CreateFrameDescription(self, format):
458 | ## '-no docstring-'
459 | ## #return FrameDescription
460 | ##
461 | ## def GetFrameCapturedEventData(self, waitableHandle):
462 | ## '-no docstring-'
463 | ## #return eventData
464 | ##
465 | ## def SubscribeFrameCaptured(self):
466 | ## '-no docstring-'
467 | ## #return waitableHandle
468 | ##
469 | ## @property
470 | ## def IsActive(self):
471 | ## '-no docstring-'
472 | ## #return IsActive
473 | ##
474 |
475 | class Library(object):
476 |     u'Kinect for Windows v2 Type Library'
477 | name = u'Kinect'
478 | _reg_typelib_ = ('{7E31D9B1-D4F2-4DEF-999A-6601F7AB0562}', 1, 0)
479 |
480 |
481 | # values for enumeration '_Activity'
482 | Activity_EyeLeftClosed = 0
483 | Activity_EyeRightClosed = 1
484 | Activity_MouthOpen = 2
485 | Activity_MouthMoved = 3
486 | Activity_LookingAway = 4
487 | Activity_Count = 5
488 | _Activity = c_int # enum
489 |
490 | # values for enumeration '_FrameSourceTypes'
491 | FrameSourceTypes_None = 0
492 | FrameSourceTypes_Color = 1
493 | FrameSourceTypes_Infrared = 2
494 | FrameSourceTypes_LongExposureInfrared = 4
495 | FrameSourceTypes_Depth = 8
496 | FrameSourceTypes_BodyIndex = 16
497 | FrameSourceTypes_Body = 32
498 | FrameSourceTypes_Audio = 64
499 | _FrameSourceTypes = c_int # enum
500 |
501 | # values for enumeration '_KinectCapabilities'
502 | KinectCapabilities_None = 0
503 | KinectCapabilities_Vision = 1
504 | KinectCapabilities_Audio = 2
505 | KinectCapabilities_Face = 4
506 | KinectCapabilities_Expressions = 8
507 | KinectCapabilities_Gamechat = 16
508 | _KinectCapabilities = c_int # enum
509 | class IAudioBeamFrameReference(comtypes.IUnknown):
510 | _case_insensitive_ = True
511 | _iid_ = GUID('{1BD29D0E-6304-4AFB-9C85-77CFE3DC4FCE}')
512 | _idlflags_ = []
513 | IAudioBeamFrameArrivedEventArgs._methods_ = [
514 | COMMETHOD(['propget'], HRESULT, 'FrameReference',
515 | ( ['retval', 'out'], POINTER(POINTER(IAudioBeamFrameReference)), 'audioBeamFrameReference' )),
516 | ]
517 | ################################################################
518 | ## code template for IAudioBeamFrameArrivedEventArgs implementation
519 | ##class IAudioBeamFrameArrivedEventArgs_Impl(object):
520 | ## @property
521 | ## def FrameReference(self):
522 | ## '-no docstring-'
523 | ## #return audioBeamFrameReference
524 | ##
525 |
526 | class ILongExposureInfraredFrameReader(comtypes.IUnknown):
527 | _case_insensitive_ = True
528 | _iid_ = GUID('{2AF23594-0115-417B-859F-A0E3FFB690D2}')
529 | _idlflags_ = []
530 | class ILongExposureInfraredFrameArrivedEventArgs(comtypes.IUnknown):
531 | _case_insensitive_ = True
532 | _iid_ = GUID('{D73D4B5E-E329-4F04-894C-0C97482690D4}')
533 | _idlflags_ = []
534 | class ILongExposureInfraredFrame(comtypes.IUnknown):
535 | _case_insensitive_ = True
536 | _iid_ = GUID('{D1199394-9A42-4577-BE12-90A38B72282C}')
537 | _idlflags_ = []
538 | class ILongExposureInfraredFrameSource(comtypes.IUnknown):
539 | _case_insensitive_ = True
540 | _iid_ = GUID('{D7150EDA-EDA2-4673-B4F8-E3C76D1F402B}')
541 | _idlflags_ = []
542 | ILongExposureInfraredFrameReader._methods_ = [
543 | COMMETHOD([], HRESULT, 'SubscribeFrameArrived',
544 | ( ['retval', 'out'], POINTER(INT_PTR), 'waitableHandle' )),
545 | COMMETHOD([], HRESULT, 'UnsubscribeFrameArrived',
546 | ( ['in'], INT_PTR, 'waitableHandle' )),
547 | COMMETHOD([], HRESULT, 'GetFrameArrivedEventData',
548 | ( ['in'], INT_PTR, 'waitableHandle' ),
549 | ( ['retval', 'out'], POINTER(POINTER(ILongExposureInfraredFrameArrivedEventArgs)), 'eventData' )),
550 | COMMETHOD([], HRESULT, 'AcquireLatestFrame',
551 | ( ['retval', 'out'], POINTER(POINTER(ILongExposureInfraredFrame)), 'longExposureInfraredFrame' )),
552 | COMMETHOD(['propget'], HRESULT, 'IsPaused',
553 | ( ['retval', 'out'], POINTER(c_bool), 'IsPaused' )),
554 | COMMETHOD(['propput'], HRESULT, 'IsPaused',
555 | ( [], c_bool, 'IsPaused' )),
556 | COMMETHOD(['propget'], HRESULT, 'LongExposureInfraredFrameSource',
557 | ( ['retval', 'out'], POINTER(POINTER(ILongExposureInfraredFrameSource)), 'LongExposureInfraredFrameSource' )),
558 | ]
559 | ################################################################
560 | ## code template for ILongExposureInfraredFrameReader implementation
561 | ##class ILongExposureInfraredFrameReader_Impl(object):
562 | ## def GetFrameArrivedEventData(self, waitableHandle):
563 | ## '-no docstring-'
564 | ## #return eventData
565 | ##
566 | ## @property
567 | ## def LongExposureInfraredFrameSource(self):
568 | ## '-no docstring-'
569 | ## #return LongExposureInfraredFrameSource
570 | ##
571 | ## def UnsubscribeFrameArrived(self, waitableHandle):
572 | ## '-no docstring-'
573 | ## #return
574 | ##
575 | ## def _get(self):
576 | ## '-no docstring-'
577 | ## #return IsPaused
578 | ## def _set(self, IsPaused):
579 | ## '-no docstring-'
580 | ## IsPaused = property(_get, _set, doc = _set.__doc__)
581 | ##
582 | ## def AcquireLatestFrame(self):
583 | ## '-no docstring-'
584 | ## #return longExposureInfraredFrame
585 | ##
586 | ## def SubscribeFrameArrived(self):
587 | ## '-no docstring-'
588 | ## #return waitableHandle
589 | ##
590 |
591 |
592 | # values for enumeration '_FrameEdges'
593 | FrameEdge_None = 0
594 | FrameEdge_Right = 1
595 | FrameEdge_Left = 2
596 | FrameEdge_Top = 4
597 | FrameEdge_Bottom = 8
598 | _FrameEdges = c_int # enum
599 |
600 | # values for enumeration '_FrameCapturedStatus'
601 | FrameCapturedStatus_Unknown = 0
602 | FrameCapturedStatus_Queued = 1
603 | FrameCapturedStatus_Dropped = 2
604 | _FrameCapturedStatus = c_int # enum
605 | IFrameCapturedEventArgs._methods_ = [
606 | COMMETHOD(['propget'], HRESULT, 'FrameType',
607 | ( ['retval', 'out'], POINTER(_FrameSourceTypes), 'FrameType' )),
608 | COMMETHOD(['propget'], HRESULT, 'FrameStatus',
609 | ( ['retval', 'out'], POINTER(_FrameCapturedStatus), 'FrameStatus' )),
610 | COMMETHOD(['propget'], HRESULT, 'RelativeTime',
611 | ( ['retval', 'out'], POINTER(c_longlong), 'RelativeTime' )),
612 | ]
613 | ################################################################
614 | ## code template for IFrameCapturedEventArgs implementation
615 | ##class IFrameCapturedEventArgs_Impl(object):
616 | ## @property
617 | ## def FrameStatus(self):
618 | ## '-no docstring-'
619 | ## #return FrameStatus
620 | ##
621 | ## @property
622 | ## def FrameType(self):
623 | ## '-no docstring-'
624 | ## #return FrameType
625 | ##
626 | ## @property
627 | ## def RelativeTime(self):
628 | ## '-no docstring-'
629 | ## #return RelativeTime
630 | ##
631 |
632 | class IKinectSensorCollection(comtypes.IUnknown):
633 | _case_insensitive_ = True
634 | _iid_ = GUID('{EF1FE50F-641C-4FB8-B7BA-C2A8295E1C74}')
635 | _idlflags_ = []
636 | class IEnumKinectSensor(comtypes.IUnknown):
637 | _case_insensitive_ = True
638 | _iid_ = GUID('{E7DEB409-8F82-4D72-9F91-2BB1D2025DC4}')
639 | _idlflags_ = []
640 | IKinectSensorCollection._methods_ = [
641 | COMMETHOD(['propget'], HRESULT, 'Enumerator',
642 | ( ['retval', 'out'], POINTER(POINTER(IEnumKinectSensor)), 'Enumerator' )),
643 | ]
644 | ################################################################
645 | ## code template for IKinectSensorCollection implementation
646 | ##class IKinectSensorCollection_Impl(object):
647 | ## @property
648 | ## def Enumerator(self):
649 | ## '-no docstring-'
650 | ## #return Enumerator
651 | ##
652 |
653 | IAudioBeamFrameReference._methods_ = [
654 | COMMETHOD([], HRESULT, 'AcquireBeamFrames',
655 | ( ['retval', 'out'], POINTER(POINTER(IAudioBeamFrameList)), 'audioBeamFrameList' )),
656 | COMMETHOD(['propget'], HRESULT, 'RelativeTime',
657 | ( ['retval', 'out'], POINTER(c_longlong), 'RelativeTime' )),
658 | ]
659 | ################################################################
660 | ## code template for IAudioBeamFrameReference implementation
661 | ##class IAudioBeamFrameReference_Impl(object):
662 | ## def AcquireBeamFrames(self):
663 | ## '-no docstring-'
664 | ## #return audioBeamFrameList
665 | ##
666 | ## @property
667 | ## def RelativeTime(self):
668 | ## '-no docstring-'
669 | ## #return RelativeTime
670 | ##
671 |
672 | class ILongExposureInfraredFrameReference(comtypes.IUnknown):
673 | _case_insensitive_ = True
674 | _iid_ = GUID('{10043A3E-0DAA-409C-9944-A6FC66C85AF7}')
675 | _idlflags_ = []
676 | ILongExposureInfraredFrameArrivedEventArgs._methods_ = [
677 | COMMETHOD(['propget'], HRESULT, 'FrameReference',
678 | ( ['retval', 'out'], POINTER(POINTER(ILongExposureInfraredFrameReference)), 'longExposureInfraredFrameReference' )),
679 | ]
680 | ################################################################
681 | ## code template for ILongExposureInfraredFrameArrivedEventArgs implementation
682 | ##class ILongExposureInfraredFrameArrivedEventArgs_Impl(object):
683 | ## @property
684 | ## def FrameReference(self):
685 | ## '-no docstring-'
686 | ## #return longExposureInfraredFrameReference
687 | ##
688 |
689 | class IBodyFrameSource(comtypes.IUnknown):
690 | _case_insensitive_ = True
691 | _iid_ = GUID('{BB94A78A-458C-4608-AC69-34FEAD1E3BAE}')
692 | _idlflags_ = []
693 | class IBodyFrameReader(comtypes.IUnknown):
694 | _case_insensitive_ = True
695 | _iid_ = GUID('{45532DF5-A63C-418F-A39F-C567936BC051}')
696 | _idlflags_ = []
697 | IBodyFrameSource._methods_ = [
698 | COMMETHOD([], HRESULT, 'SubscribeFrameCaptured',
699 | ( ['retval', 'out'], POINTER(INT_PTR), 'waitableHandle' )),
700 | COMMETHOD([], HRESULT, 'UnsubscribeFrameCaptured',
701 | ( ['in'], INT_PTR, 'waitableHandle' )),
702 | COMMETHOD([], HRESULT, 'GetFrameCapturedEventData',
703 | ( ['in'], INT_PTR, 'waitableHandle' ),
704 | ( ['retval', 'out'], POINTER(POINTER(IFrameCapturedEventArgs)), 'eventData' )),
705 | COMMETHOD(['propget'], HRESULT, 'IsActive',
706 | ( ['retval', 'out'], POINTER(c_bool), 'IsActive' )),
707 | COMMETHOD(['propget'], HRESULT, 'BodyCount',
708 | ( ['retval', 'out'], POINTER(c_int), 'BodyCount' )),
709 | COMMETHOD([], HRESULT, 'OpenReader',
710 | ( ['retval', 'out'], POINTER(POINTER(IBodyFrameReader)), 'reader' )),
711 | COMMETHOD(['propget'], HRESULT, 'KinectSensor',
712 | ( ['retval', 'out'], POINTER(POINTER(IKinectSensor)), 'sensor' )),
713 | COMMETHOD([], HRESULT, 'OverrideHandTracking',
714 | ( [], c_ulonglong, 'TrackingId' )),
715 | COMMETHOD([], HRESULT, 'OverrideAndReplaceHandTracking',
716 | ( [], c_ulonglong, 'oldTrackingId' ),
717 | ( [], c_ulonglong, 'newTrackingId' )),
718 | ]
719 | ################################################################
720 | ## code template for IBodyFrameSource implementation
721 | ##class IBodyFrameSource_Impl(object):
722 | ## def UnsubscribeFrameCaptured(self, waitableHandle):
723 | ## '-no docstring-'
724 | ## #return
725 | ##
726 | ## @property
727 | ## def KinectSensor(self):
728 | ## '-no docstring-'
729 | ## #return sensor
730 | ##
731 | ## def SubscribeFrameCaptured(self):
732 | ## '-no docstring-'
733 | ## #return waitableHandle
734 | ##
735 | ## def OverrideHandTracking(self, TrackingId):
736 | ## '-no docstring-'
737 | ## #return
738 | ##
739 | ## def OverrideAndReplaceHandTracking(self, oldTrackingId, newTrackingId):
740 | ## '-no docstring-'
741 | ## #return
742 | ##
743 | ## def GetFrameCapturedEventData(self, waitableHandle):
744 | ## '-no docstring-'
745 | ## #return eventData
746 | ##
747 | ## def OpenReader(self):
748 | ## '-no docstring-'
749 | ## #return reader
750 | ##
751 | ## @property
752 | ## def BodyCount(self):
753 | ## '-no docstring-'
754 | ## #return BodyCount
755 | ##
756 | ## @property
757 | ## def IsActive(self):
758 | ## '-no docstring-'
759 | ## #return IsActive
760 | ##
761 |
762 | class IAudioBeamFrame(comtypes.IUnknown):
763 | _case_insensitive_ = True
764 | _iid_ = GUID('{07AADCC8-EC4A-42F8-90A9-C72ECF0A1D06}')
765 | _idlflags_ = []
766 | IAudioBeamFrameList._methods_ = [
767 | COMMETHOD(['propget'], HRESULT, 'BeamCount',
768 | ( ['retval', 'out'], POINTER(c_uint), 'count' )),
769 | COMMETHOD([], HRESULT, 'OpenAudioBeamFrame',
770 | ( [], c_uint, 'index' ),
771 | ( ['out'], POINTER(POINTER(IAudioBeamFrame)), 'audioBeamFrame' )),
772 | ]
773 | ################################################################
774 | ## code template for IAudioBeamFrameList implementation
775 | ##class IAudioBeamFrameList_Impl(object):
776 | ## def OpenAudioBeamFrame(self, index):
777 | ## '-no docstring-'
778 | ## #return audioBeamFrame
779 | ##
780 | ## @property
781 | ## def BeamCount(self):
782 | ## '-no docstring-'
783 | ## #return count
784 | ##
785 |
786 |
787 | # values for enumeration '_Appearance'
788 | Appearance_WearingGlasses = 0
789 | Appearance_Count = 1
790 | _Appearance = c_int # enum
791 | class IColorFrameArrivedEventArgs(comtypes.IUnknown):
792 | _case_insensitive_ = True
793 | _iid_ = GUID('{82A2E32F-4AE5-4614-88BB-DCC5AE0CEAED}')
794 | _idlflags_ = []
795 | class IColorFrame(comtypes.IUnknown):
796 | _case_insensitive_ = True
797 | _iid_ = GUID('{39D05803-8803-4E86-AD9F-13F6954E4ACA}')
798 | _idlflags_ = []
799 | IColorFrameReader._methods_ = [
800 | COMMETHOD([], HRESULT, 'SubscribeFrameArrived',
801 | ( ['retval', 'out'], POINTER(INT_PTR), 'waitableHandle' )),
802 | COMMETHOD([], HRESULT, 'UnsubscribeFrameArrived',
803 | ( ['in'], INT_PTR, 'waitableHandle' )),
804 | COMMETHOD([], HRESULT, 'GetFrameArrivedEventData',
805 | ( ['in'], INT_PTR, 'waitableHandle' ),
806 | ( ['retval', 'out'], POINTER(POINTER(IColorFrameArrivedEventArgs)), 'eventData' )),
807 | COMMETHOD([], HRESULT, 'AcquireLatestFrame',
808 | ( ['retval', 'out'], POINTER(POINTER(IColorFrame)), 'colorFrame' )),
809 | COMMETHOD(['propget'], HRESULT, 'IsPaused',
810 | ( ['retval', 'out'], POINTER(c_bool), 'IsPaused' )),
811 | COMMETHOD(['propput'], HRESULT, 'IsPaused',
812 | ( [], c_bool, 'IsPaused' )),
813 | COMMETHOD(['propget'], HRESULT, 'ColorFrameSource',
814 | ( ['retval', 'out'], POINTER(POINTER(IColorFrameSource)), 'ColorFrameSource' )),
815 | ]
816 | ################################################################
817 | ## code template for IColorFrameReader implementation
818 | ##class IColorFrameReader_Impl(object):
819 | ## def GetFrameArrivedEventData(self, waitableHandle):
820 | ## '-no docstring-'
821 | ## #return eventData
822 | ##
823 | ## def UnsubscribeFrameArrived(self, waitableHandle):
824 | ## '-no docstring-'
825 | ## #return
826 | ##
827 | ## @property
828 | ## def ColorFrameSource(self):
829 | ## '-no docstring-'
830 | ## #return ColorFrameSource
831 | ##
832 | ## def _get(self):
833 | ## '-no docstring-'
834 | ## #return IsPaused
835 | ## def _set(self, IsPaused):
836 | ## '-no docstring-'
837 | ## IsPaused = property(_get, _set, doc = _set.__doc__)
838 | ##
839 | ## def AcquireLatestFrame(self):
840 | ## '-no docstring-'
841 | ## #return colorFrame
842 | ##
843 | ## def SubscribeFrameArrived(self):
844 | ## '-no docstring-'
845 | ## #return waitableHandle
846 | ##
847 |
848 |
849 | # values for enumeration '_KinectAudioCalibrationState'
850 | KinectAudioCalibrationState_Unknown = 0
851 | KinectAudioCalibrationState_CalibrationRequired = 1
852 | KinectAudioCalibrationState_Calibrated = 2
853 | _KinectAudioCalibrationState = c_int # enum
854 | ILongExposureInfraredFrameReference._methods_ = [
855 | COMMETHOD([], HRESULT, 'AcquireFrame',
856 | ( ['retval', 'out'], POINTER(POINTER(ILongExposureInfraredFrame)), 'longExposureInfraredFrame' )),
857 | COMMETHOD(['propget'], HRESULT, 'RelativeTime',
858 | ( ['retval', 'out'], POINTER(c_longlong), 'RelativeTime' )),
859 | ]
860 | ################################################################
861 | ## code template for ILongExposureInfraredFrameReference implementation
862 | ##class ILongExposureInfraredFrameReference_Impl(object):
863 | ## @property
864 | ## def RelativeTime(self):
865 | ## '-no docstring-'
866 | ## #return RelativeTime
867 | ##
868 | ## def AcquireFrame(self):
869 | ## '-no docstring-'
870 | ## #return longExposureInfraredFrame
871 | ##
872 |
873 | class IDepthFrameReader(comtypes.IUnknown):
874 | _case_insensitive_ = True
875 | _iid_ = GUID('{81C0C0AB-6E6C-45CB-8625-A5F4D38759A4}')
876 | _idlflags_ = []
877 | IDepthFrameSource._methods_ = [
878 | COMMETHOD([], HRESULT, 'SubscribeFrameCaptured',
879 | ( ['retval', 'out'], POINTER(INT_PTR), 'waitableHandle' )),
880 | COMMETHOD([], HRESULT, 'UnsubscribeFrameCaptured',
881 | ( ['in'], INT_PTR, 'waitableHandle' )),
882 | COMMETHOD([], HRESULT, 'GetFrameCapturedEventData',
883 | ( ['in'], INT_PTR, 'waitableHandle' ),
884 | ( ['retval', 'out'], POINTER(POINTER(IFrameCapturedEventArgs)), 'eventData' )),
885 | COMMETHOD(['propget'], HRESULT, 'IsActive',
886 | ( ['retval', 'out'], POINTER(c_bool), 'IsActive' )),
887 | COMMETHOD([], HRESULT, 'OpenReader',
888 | ( ['retval', 'out'], POINTER(POINTER(IDepthFrameReader)), 'reader' )),
889 | COMMETHOD(['propget'], HRESULT, 'DepthMinReliableDistance',
890 | ( ['retval', 'out'], POINTER(c_ushort), 'DepthMinReliableDistance' )),
891 | COMMETHOD(['propget'], HRESULT, 'DepthMaxReliableDistance',
892 | ( ['retval', 'out'], POINTER(c_ushort), 'DepthMaxReliableDistance' )),
893 | COMMETHOD(['propget'], HRESULT, 'FrameDescription',
894 | ( ['retval', 'out'], POINTER(POINTER(IFrameDescription)), 'FrameDescription' )),
895 | COMMETHOD(['propget'], HRESULT, 'KinectSensor',
896 | ( ['retval', 'out'], POINTER(POINTER(IKinectSensor)), 'sensor' )),
897 | ]
898 | ################################################################
899 | ## code template for IDepthFrameSource implementation
900 | ##class IDepthFrameSource_Impl(object):
901 | ## def UnsubscribeFrameCaptured(self, waitableHandle):
902 | ## '-no docstring-'
903 | ## #return
904 | ##
905 | ## @property
906 | ## def KinectSensor(self):
907 | ## '-no docstring-'
908 | ## #return sensor
909 | ##
910 | ## def OpenReader(self):
911 | ## '-no docstring-'
912 | ## #return reader
913 | ##
914 | ## @property
915 | ## def DepthMaxReliableDistance(self):
916 | ## '-no docstring-'
917 | ## #return DepthMaxReliableDistance
918 | ##
919 | ## @property
920 | ## def FrameDescription(self):
921 | ## '-no docstring-'
922 | ## #return FrameDescription
923 | ##
924 | ## def GetFrameCapturedEventData(self, waitableHandle):
925 | ## '-no docstring-'
926 | ## #return eventData
927 | ##
928 | ## def SubscribeFrameCaptured(self):
929 | ## '-no docstring-'
930 | ## #return waitableHandle
931 | ##
932 | ## @property
933 | ## def DepthMinReliableDistance(self):
934 | ## '-no docstring-'
935 | ## #return DepthMinReliableDistance
936 | ##
937 | ## @property
938 | ## def IsActive(self):
939 | ## '-no docstring-'
940 | ## #return IsActive
941 | ##
942 |
943 | class IAudioBeam(comtypes.IUnknown):
944 | _case_insensitive_ = True
945 | _iid_ = GUID('{F692D23A-14D0-432D-B802-DD381A45A121}')
946 | _idlflags_ = []
947 | class IAudioBeamSubFrame(comtypes.IUnknown):
948 | _case_insensitive_ = True
949 | _iid_ = GUID('{0967DB97-80D1-4BC5-BD2B-4685098D9795}')
950 | _idlflags_ = []
951 | IAudioBeamFrame._methods_ = [
952 | COMMETHOD(['propget'], HRESULT, 'AudioSource',
953 | ( ['retval', 'out'], POINTER(POINTER(IAudioSource)), 'AudioSource' )),
954 | COMMETHOD(['propget'], HRESULT, 'duration',
955 | ( ['retval', 'out'], POINTER(c_longlong), 'duration' )),
956 | COMMETHOD(['propget'], HRESULT, 'AudioBeam',
957 | ( ['retval', 'out'], POINTER(POINTER(IAudioBeam)), 'AudioBeam' )),
958 | COMMETHOD(['propget'], HRESULT, 'SubFrameCount',
959 | ( ['retval', 'out'], POINTER(c_uint), 'pSubFrameCount' )),
960 | COMMETHOD([], HRESULT, 'GetSubFrame',
961 | ( [], c_uint, 'subFrameIndex' ),
962 | ( ['out'], POINTER(POINTER(IAudioBeamSubFrame)), 'audioBeamSubFrame' )),
963 | COMMETHOD(['propget'], HRESULT, 'RelativeTimeStart',
964 | ( ['retval', 'out'], POINTER(c_longlong), 'RelativeTime' )),
965 | ]
966 | ################################################################
967 | ## code template for IAudioBeamFrame implementation
968 | ##class IAudioBeamFrame_Impl(object):
969 | ## @property
970 | ## def AudioSource(self):
971 | ## '-no docstring-'
972 | ## #return AudioSource
973 | ##
974 | ## @property
975 | ## def SubFrameCount(self):
976 | ## '-no docstring-'
977 | ## #return pSubFrameCount
978 | ##
979 | ## @property
980 | ## def RelativeTimeStart(self):
981 | ## '-no docstring-'
982 | ## #return RelativeTime
983 | ##
984 | ## @property
985 | ## def AudioBeam(self):
986 | ## '-no docstring-'
987 | ## #return AudioBeam
988 | ##
989 | ## @property
990 | ## def duration(self):
991 | ## '-no docstring-'
992 | ## #return duration
993 | ##
994 | ## def GetSubFrame(self, subFrameIndex):
995 | ## '-no docstring-'
996 | ## #return audioBeamSubFrame
997 | ##
998 |
999 |
1000 | # values for enumeration '_JointType'
1001 | JointType_SpineBase = 0
1002 | JointType_SpineMid = 1
1003 | JointType_Neck = 2
1004 | JointType_Head = 3
1005 | JointType_ShoulderLeft = 4
1006 | JointType_ElbowLeft = 5
1007 | JointType_WristLeft = 6
1008 | JointType_HandLeft = 7
1009 | JointType_ShoulderRight = 8
1010 | JointType_ElbowRight = 9
1011 | JointType_WristRight = 10
1012 | JointType_HandRight = 11
1013 | JointType_HipLeft = 12
1014 | JointType_KneeLeft = 13
1015 | JointType_AnkleLeft = 14
1016 | JointType_FootLeft = 15
1017 | JointType_HipRight = 16
1018 | JointType_KneeRight = 17
1019 | JointType_AnkleRight = 18
1020 | JointType_FootRight = 19
1021 | JointType_SpineShoulder = 20
1022 | JointType_HandTipLeft = 21
1023 | JointType_ThumbLeft = 22
1024 | JointType_HandTipRight = 23
1025 | JointType_ThumbRight = 24
1026 | JointType_Count = 25
1027 | _JointType = c_int # enum
1028 | class IBodyFrameArrivedEventArgs(comtypes.IUnknown):
1029 | _case_insensitive_ = True
1030 | _iid_ = GUID('{BF5CCA0E-00C1-4D48-837F-AB921E6AEE01}')
1031 | _idlflags_ = []
1032 | class IBodyFrame(comtypes.IUnknown):
1033 | _case_insensitive_ = True
1034 | _iid_ = GUID('{52884F1F-94D7-4B57-BF87-9226950980D5}')
1035 | _idlflags_ = []
1036 | IBodyFrameReader._methods_ = [
1037 | COMMETHOD([], HRESULT, 'SubscribeFrameArrived',
1038 | ( ['retval', 'out'], POINTER(INT_PTR), 'waitableHandle' )),
1039 | COMMETHOD([], HRESULT, 'UnsubscribeFrameArrived',
1040 | ( ['in'], INT_PTR, 'waitableHandle' )),
1041 | COMMETHOD([], HRESULT, 'GetFrameArrivedEventData',
1042 | ( ['in'], INT_PTR, 'waitableHandle' ),
1043 | ( ['retval', 'out'], POINTER(POINTER(IBodyFrameArrivedEventArgs)), 'eventData' )),
1044 | COMMETHOD([], HRESULT, 'AcquireLatestFrame',
1045 | ( ['retval', 'out'], POINTER(POINTER(IBodyFrame)), 'bodyFrame' )),
1046 | COMMETHOD(['propget'], HRESULT, 'IsPaused',
1047 | ( ['retval', 'out'], POINTER(c_bool), 'IsPaused' )),
1048 | COMMETHOD(['propput'], HRESULT, 'IsPaused',
1049 | ( [], c_bool, 'IsPaused' )),
1050 | COMMETHOD(['propget'], HRESULT, 'BodyFrameSource',
1051 | ( ['retval', 'out'], POINTER(POINTER(IBodyFrameSource)), 'BodyFrameSource' )),
1052 | ]
1053 | ################################################################
1054 | ## code template for IBodyFrameReader implementation
1055 | ##class IBodyFrameReader_Impl(object):
1056 | ## def GetFrameArrivedEventData(self, waitableHandle):
1057 | ## '-no docstring-'
1058 | ## #return eventData
1059 | ##
1060 | ## def UnsubscribeFrameArrived(self, waitableHandle):
1061 | ## '-no docstring-'
1062 | ## #return
1063 | ##
1064 | ## @property
1065 | ## def BodyFrameSource(self):
1066 | ## '-no docstring-'
1067 | ## #return BodyFrameSource
1068 | ##
1069 | ## def _get(self):
1070 | ## '-no docstring-'
1071 | ## #return IsPaused
1072 | ## def _set(self, IsPaused):
1073 | ## '-no docstring-'
1074 | ## IsPaused = property(_get, _set, doc = _set.__doc__)
1075 | ##
1076 | ## def AcquireLatestFrame(self):
1077 | ## '-no docstring-'
1078 | ## #return bodyFrame
1079 | ##
1080 | ## def SubscribeFrameArrived(self):
1081 | ## '-no docstring-'
1082 | ## #return waitableHandle
1083 | ##
1084 |
1085 | class IColorFrameReference(comtypes.IUnknown):
1086 | _case_insensitive_ = True
1087 | _iid_ = GUID('{5CC49E38-9BBD-48BE-A770-FD30EA405247}')
1088 | _idlflags_ = []
1089 | IColorFrameArrivedEventArgs._methods_ = [
1090 | COMMETHOD(['propget'], HRESULT, 'FrameReference',
1091 | ( ['retval', 'out'], POINTER(POINTER(IColorFrameReference)), 'colorFrameReference' )),
1092 | ]
1093 | ################################################################
1094 | ## code template for IColorFrameArrivedEventArgs implementation
1095 | ##class IColorFrameArrivedEventArgs_Impl(object):
1096 | ## @property
1097 | ## def FrameReference(self):
1098 | ## '-no docstring-'
1099 | ## #return colorFrameReference
1100 | ##
1101 |
1102 | ILongExposureInfraredFrame._methods_ = [
1103 | COMMETHOD([], HRESULT, 'CopyFrameDataToArray',
1104 | ( [], c_uint, 'capacity' ),
1105 | ( [], POINTER(c_ushort), 'frameData' )),
1106 | COMMETHOD([], HRESULT, 'AccessUnderlyingBuffer',
1107 | ( [], POINTER(c_uint), 'capacity' ),
1108 | ( [], POINTER(POINTER(c_ushort)), 'buffer' )), #'out'
1109 | COMMETHOD(['propget'], HRESULT, 'FrameDescription',
1110 | ( ['retval', 'out'], POINTER(POINTER(IFrameDescription)), 'FrameDescription' )),
1111 | COMMETHOD(['propget'], HRESULT, 'RelativeTime',
1112 | ( ['retval', 'out'], POINTER(c_longlong), 'RelativeTime' )),
1113 | COMMETHOD(['propget'], HRESULT, 'LongExposureInfraredFrameSource',
1114 | ( ['retval', 'out'], POINTER(POINTER(ILongExposureInfraredFrameSource)), 'LongExposureInfraredFrameSource' )),
1115 | ]
1116 | ################################################################
1117 | ## code template for ILongExposureInfraredFrame implementation
1118 | ##class ILongExposureInfraredFrame_Impl(object):
1119 | ## @property
1120 | ## def LongExposureInfraredFrameSource(self):
1121 | ## '-no docstring-'
1122 | ## #return LongExposureInfraredFrameSource
1123 | ##
1124 | ## @property
1125 | ## def RelativeTime(self):
1126 | ## '-no docstring-'
1127 | ## #return RelativeTime
1128 | ##
1129 | ## @property
1130 | ## def FrameDescription(self):
1131 | ## '-no docstring-'
1132 | ## #return FrameDescription
1133 | ##
1134 | ## def CopyFrameDataToArray(self, capacity):
1135 | ## '-no docstring-'
1136 | ## #return frameData
1137 | ##
1138 | ## def AccessUnderlyingBuffer(self):
1139 | ## '-no docstring-'
1140 | ## #return capacity, buffer
1141 | ##
1142 |
1143 | class IAudioBeamList(comtypes.IUnknown):
1144 | _case_insensitive_ = True
1145 | _iid_ = GUID('{3C792C7B-7D95-4C56-9DC7-EF63955781EA}')
1146 | _idlflags_ = []
1147 | IAudioSource._methods_ = [
1148 | COMMETHOD([], HRESULT, 'SubscribeFrameCaptured',
1149 | ( ['retval', 'out'], POINTER(INT_PTR), 'waitableHandle' )),
1150 | COMMETHOD([], HRESULT, 'UnsubscribeFrameCaptured',
1151 | ( ['in'], INT_PTR, 'waitableHandle' )),
1152 | COMMETHOD([], HRESULT, 'GetFrameCapturedEventData',
1153 | ( ['in'], INT_PTR, 'waitableHandle' ),
1154 | ( ['retval', 'out'], POINTER(POINTER(IFrameCapturedEventArgs)), 'eventData' )),
1155 | COMMETHOD(['propget'], HRESULT, 'KinectSensor',
1156 | ( ['retval', 'out'], POINTER(POINTER(IKinectSensor)), 'sensor' )),
1157 | COMMETHOD(['propget'], HRESULT, 'IsActive',
1158 | ( ['retval', 'out'], POINTER(c_bool), 'IsActive' )),
1159 | COMMETHOD(['propget'], HRESULT, 'SubFrameLengthInBytes',
1160 | ( ['retval', 'out'], POINTER(c_uint), 'length' )),
1161 | COMMETHOD(['propget'], HRESULT, 'SubFrameDuration',
1162 | ( ['retval', 'out'], POINTER(c_longlong), 'duration' )),
1163 | COMMETHOD(['propget'], HRESULT, 'MaxSubFrameCount',
1164 | ( ['retval', 'out'], POINTER(c_uint), 'count' )),
1165 | COMMETHOD([], HRESULT, 'OpenReader',
1166 | ( ['retval', 'out'], POINTER(POINTER(IAudioBeamFrameReader)), 'reader' )),
1167 | COMMETHOD(['propget'], HRESULT, 'AudioBeams',
1168 | ( ['retval', 'out'], POINTER(POINTER(IAudioBeamList)), 'audioBeamList' )),
1169 | COMMETHOD(['propget'], HRESULT, 'AudioCalibrationState',
1170 | ( ['retval', 'out'], POINTER(_KinectAudioCalibrationState), 'AudioCalibrationState' )),
1171 | ]
1172 | ################################################################
1173 | ## code template for IAudioSource implementation
1174 | ##class IAudioSource_Impl(object):
1175 | ## def UnsubscribeFrameCaptured(self, waitableHandle):
1176 | ## '-no docstring-'
1177 | ## #return
1178 | ##
1179 | ## @property
1180 | ## def KinectSensor(self):
1181 | ## '-no docstring-'
1182 | ## #return sensor
1183 | ##
1184 | ## def SubscribeFrameCaptured(self):
1185 | ## '-no docstring-'
1186 | ## #return waitableHandle
1187 | ##
1188 | ## @property
1189 | ## def SubFrameLengthInBytes(self):
1190 | ## '-no docstring-'
1191 | ## #return length
1192 | ##
1193 | ## @property
1194 | ## def AudioCalibrationState(self):
1195 | ## '-no docstring-'
1196 | ## #return AudioCalibrationState
1197 | ##
1198 | ## def GetFrameCapturedEventData(self, waitableHandle):
1199 | ## '-no docstring-'
1200 | ## #return eventData
1201 | ##
1202 | ## def OpenReader(self):
1203 | ## '-no docstring-'
1204 | ## #return reader
1205 | ##
1206 | ## @property
1207 | ## def MaxSubFrameCount(self):
1208 | ## '-no docstring-'
1209 | ## #return count
1210 | ##
1211 | ## @property
1212 | ## def AudioBeams(self):
1213 | ## '-no docstring-'
1214 | ## #return audioBeamList
1215 | ##
1216 | ## @property
1217 | ## def SubFrameDuration(self):
1218 | ## '-no docstring-'
1219 | ## #return duration
1220 | ##
1221 | ## @property
1222 | ## def IsActive(self):
1223 | ## '-no docstring-'
1224 | ## #return IsActive
1225 | ##
1226 |
1227 | class _Vector4(Structure):
1228 | pass
1229 | _Vector4._fields_ = [
1230 | ('x', c_float),
1231 | ('y', c_float),
1232 | ('z', c_float),
1233 | ('w', c_float),
1234 | ]
1235 | assert sizeof(_Vector4) == 16, sizeof(_Vector4)
1236 | assert alignment(_Vector4) == 4, alignment(_Vector4)
1237 | class IBodyIndexFrameReader(comtypes.IUnknown):
1238 | _case_insensitive_ = True
1239 | _iid_ = GUID('{E9724AA1-EBFA-48F8-9044-E0BE33383B8B}')
1240 | _idlflags_ = []
1241 | class IBodyIndexFrameArrivedEventArgs(comtypes.IUnknown):
1242 | _case_insensitive_ = True
1243 | _iid_ = GUID('{10B7E92E-B4F2-4A36-A459-06B2A4B249DF}')
1244 | _idlflags_ = []
1245 | class IBodyIndexFrame(comtypes.IUnknown):
1246 | _case_insensitive_ = True
1247 | _iid_ = GUID('{2CEA0C07-F90C-44DF-A18C-F4D18075EA6B}')
1248 | _idlflags_ = []
1249 | class IBodyIndexFrameSource(comtypes.IUnknown):
1250 | _case_insensitive_ = True
1251 | _iid_ = GUID('{010F2A40-DC58-44A5-8E57-329A583FEC08}')
1252 | _idlflags_ = []
1253 | IBodyIndexFrameReader._methods_ = [
1254 | COMMETHOD([], HRESULT, 'SubscribeFrameArrived',
1255 | ( ['retval', 'out'], POINTER(INT_PTR), 'waitableHandle' )),
1256 | COMMETHOD([], HRESULT, 'UnsubscribeFrameArrived',
1257 | ( ['in'], INT_PTR, 'waitableHandle' )),
1258 | COMMETHOD([], HRESULT, 'GetFrameArrivedEventData',
1259 | ( ['in'], INT_PTR, 'waitableHandle' ),
1260 | ( ['retval', 'out'], POINTER(POINTER(IBodyIndexFrameArrivedEventArgs)), 'eventData' )),
1261 | COMMETHOD([], HRESULT, 'AcquireLatestFrame',
1262 | ( ['retval', 'out'], POINTER(POINTER(IBodyIndexFrame)), 'bodyIndexFrame' )),
1263 | COMMETHOD(['propget'], HRESULT, 'IsPaused',
1264 | ( ['retval', 'out'], POINTER(c_bool), 'IsPaused' )),
1265 | COMMETHOD(['propput'], HRESULT, 'IsPaused',
1266 | ( [], c_bool, 'IsPaused' )),
1267 | COMMETHOD(['propget'], HRESULT, 'BodyIndexFrameSource',
1268 | ( ['retval', 'out'], POINTER(POINTER(IBodyIndexFrameSource)), 'BodyIndexFrameSource' )),
1269 | ]
1270 | ################################################################
1271 | ## code template for IBodyIndexFrameReader implementation
1272 | ##class IBodyIndexFrameReader_Impl(object):
1273 | ## def GetFrameArrivedEventData(self, waitableHandle):
1274 | ## '-no docstring-'
1275 | ## #return eventData
1276 | ##
1277 | ## def UnsubscribeFrameArrived(self, waitableHandle):
1278 | ## '-no docstring-'
1279 | ## #return
1280 | ##
1281 | ## def _get(self):
1282 | ## '-no docstring-'
1283 | ## #return IsPaused
1284 | ## def _set(self, IsPaused):
1285 | ## '-no docstring-'
1286 | ## IsPaused = property(_get, _set, doc = _set.__doc__)
1287 | ##
1288 | ## @property
1289 | ## def BodyIndexFrameSource(self):
1290 | ## '-no docstring-'
1291 | ## #return BodyIndexFrameSource
1292 | ##
1293 | ## def AcquireLatestFrame(self):
1294 | ## '-no docstring-'
1295 | ## #return bodyIndexFrame
1296 | ##
1297 | ## def SubscribeFrameArrived(self):
1298 | ## '-no docstring-'
1299 | ## #return waitableHandle
1300 | ##
1301 |
1302 |
1303 | # values for enumeration '_AudioBeamMode'
1304 | AudioBeamMode_Automatic = 0
1305 | AudioBeamMode_Manual = 1
1306 | _AudioBeamMode = c_int # enum
1307 | class ISequentialStream(comtypes.IUnknown):
1308 | _case_insensitive_ = True
1309 | _iid_ = GUID('{0C733A30-2A1C-11CE-ADE5-00AA0044773D}')
1310 | _idlflags_ = []
1311 | class IStream(ISequentialStream):
1312 | _case_insensitive_ = True
1313 | _iid_ = GUID('{0000000C-0000-0000-C000-000000000046}')
1314 | _idlflags_ = []
1315 | IAudioBeam._methods_ = [
1316 | COMMETHOD(['propget'], HRESULT, 'AudioSource',
1317 | ( ['retval', 'out'], POINTER(POINTER(IAudioSource)), 'AudioSource' )),
1318 | COMMETHOD(['propget'], HRESULT, 'AudioBeamMode',
1319 | ( ['retval', 'out'], POINTER(_AudioBeamMode), 'AudioBeamMode' )),
1320 | COMMETHOD(['propput'], HRESULT, 'AudioBeamMode',
1321 | ( [], _AudioBeamMode, 'AudioBeamMode' )),
1322 | COMMETHOD(['propget'], HRESULT, 'BeamAngle',
1323 | ( ['retval', 'out'], POINTER(c_float), 'BeamAngle' )),
1324 | COMMETHOD(['propput'], HRESULT, 'BeamAngle',
1325 | ( [], c_float, 'BeamAngle' )),
1326 | COMMETHOD(['propget'], HRESULT, 'BeamAngleConfidence',
1327 | ( ['retval', 'out'], POINTER(c_float), 'BeamAngleConfidence' )),
1328 | COMMETHOD([], HRESULT, 'OpenInputStream',
1329 | ( ['retval', 'out'], POINTER(POINTER(IStream)), 'stream' )),
1330 | COMMETHOD(['propget'], HRESULT, 'RelativeTime',
1331 | ( ['retval', 'out'], POINTER(c_longlong), 'RelativeTime' )),
1332 | ]
1333 | ################################################################
1334 | ## code template for IAudioBeam implementation
1335 | ##class IAudioBeam_Impl(object):
1336 | ## def _get(self):
1337 | ## '-no docstring-'
1338 | ## #return AudioBeamMode
1339 | ## def _set(self, AudioBeamMode):
1340 | ## '-no docstring-'
1341 | ## AudioBeamMode = property(_get, _set, doc = _set.__doc__)
1342 | ##
1343 | ## @property
1344 | ## def AudioSource(self):
1345 | ## '-no docstring-'
1346 | ## #return AudioSource
1347 | ##
1348 | ## @property
1349 | ## def RelativeTime(self):
1350 | ## '-no docstring-'
1351 | ## #return RelativeTime
1352 | ##
1353 | ## @property
1354 | ## def BeamAngleConfidence(self):
1355 | ## '-no docstring-'
1356 | ## #return BeamAngleConfidence
1357 | ##
1358 | ## def OpenInputStream(self):
1359 | ## '-no docstring-'
1360 | ## #return stream
1361 | ##
1362 | ## def _get(self):
1363 | ## '-no docstring-'
1364 | ## #return BeamAngle
1365 | ## def _set(self, BeamAngle):
1366 | ## '-no docstring-'
1367 | ## BeamAngle = property(_get, _set, doc = _set.__doc__)
1368 | ##
1369 |
1370 | class IBodyFrameReference(comtypes.IUnknown):
1371 | _case_insensitive_ = True
1372 | _iid_ = GUID('{C3A1733C-5F84-443B-9659-2F2BE250C97D}')
1373 | _idlflags_ = []
1374 | IBodyFrameArrivedEventArgs._methods_ = [
1375 | COMMETHOD(['propget'], HRESULT, 'FrameReference',
1376 | ( ['retval', 'out'], POINTER(POINTER(IBodyFrameReference)), 'bodyFrameReference' )),
1377 | ]
1378 | ################################################################
1379 | ## code template for IBodyFrameArrivedEventArgs implementation
1380 | ##class IBodyFrameArrivedEventArgs_Impl(object):
1381 | ## @property
1382 | ## def FrameReference(self):
1383 | ## '-no docstring-'
1384 | ## #return bodyFrameReference
1385 | ##
1386 |
1387 | IColorFrameReference._methods_ = [
1388 | COMMETHOD([], HRESULT, 'AcquireFrame',
1389 | ( ['retval', 'out'], POINTER(POINTER(IColorFrame)), 'colorFrame' )),
1390 | COMMETHOD(['propget'], HRESULT, 'RelativeTime',
1391 | ( ['retval', 'out'], POINTER(c_longlong), 'RelativeTime' )),
1392 | ]
1393 | ################################################################
1394 | ## code template for IColorFrameReference implementation
1395 | ##class IColorFrameReference_Impl(object):
1396 | ## @property
1397 | ## def RelativeTime(self):
1398 | ## '-no docstring-'
1399 | ## #return RelativeTime
1400 | ##
1401 | ## def AcquireFrame(self):
1402 | ## '-no docstring-'
1403 | ## #return colorFrame
1404 | ##
1405 |
1406 | IDepthFrameReader._methods_ = [
1407 | COMMETHOD([], HRESULT, 'SubscribeFrameArrived',
1408 | ( ['retval', 'out'], POINTER(INT_PTR), 'waitableHandle' )),
1409 | COMMETHOD([], HRESULT, 'UnsubscribeFrameArrived',
1410 | ( ['in'], INT_PTR, 'waitableHandle' )),
1411 | COMMETHOD([], HRESULT, 'GetFrameArrivedEventData',
1412 | ( ['in'], INT_PTR, 'waitableHandle' ),
1413 | ( ['retval', 'out'], POINTER(POINTER(IDepthFrameArrivedEventArgs)), 'eventData' )),
1414 | COMMETHOD([], HRESULT, 'AcquireLatestFrame',
1415 | ( ['retval', 'out'], POINTER(POINTER(IDepthFrame)), 'depthFrame' )),
1416 | COMMETHOD(['propget'], HRESULT, 'IsPaused',
1417 | ( ['retval', 'out'], POINTER(c_bool), 'IsPaused' )),
1418 | COMMETHOD(['propput'], HRESULT, 'IsPaused',
1419 | ( [], c_bool, 'IsPaused' )),
1420 | COMMETHOD(['propget'], HRESULT, 'DepthFrameSource',
1421 | ( ['retval', 'out'], POINTER(POINTER(IDepthFrameSource)), 'DepthFrameSource' )),
1422 | ]
1423 | ################################################################
1424 | ## code template for IDepthFrameReader implementation
1425 | ##class IDepthFrameReader_Impl(object):
1426 | ## def GetFrameArrivedEventData(self, waitableHandle):
1427 | ## '-no docstring-'
1428 | ## #return eventData
1429 | ##
1430 | ## def UnsubscribeFrameArrived(self, waitableHandle):
1431 | ## '-no docstring-'
1432 | ## #return
1433 | ##
1434 | ## def _get(self):
1435 | ## '-no docstring-'
1436 | ## #return IsPaused
1437 | ## def _set(self, IsPaused):
1438 | ## '-no docstring-'
1439 | ## IsPaused = property(_get, _set, doc = _set.__doc__)
1440 | ##
1441 | ## @property
1442 | ## def DepthFrameSource(self):
1443 | ## '-no docstring-'
1444 | ## #return DepthFrameSource
1445 | ##
1446 | ## def AcquireLatestFrame(self):
1447 | ## '-no docstring-'
1448 | ## #return depthFrame
1449 | ##
1450 | ## def SubscribeFrameArrived(self):
1451 | ## '-no docstring-'
1452 | ## #return waitableHandle
1453 | ##
1454 |
1455 | IBodyIndexFrameSource._methods_ = [
1456 | COMMETHOD([], HRESULT, 'SubscribeFrameCaptured',
1457 | ( ['retval', 'out'], POINTER(INT_PTR), 'waitableHandle' )),
1458 | COMMETHOD([], HRESULT, 'UnsubscribeFrameCaptured',
1459 | ( ['in'], INT_PTR, 'waitableHandle' )),
1460 | COMMETHOD([], HRESULT, 'GetFrameCapturedEventData',
1461 | ( ['in'], INT_PTR, 'waitableHandle' ),
1462 | ( ['retval', 'out'], POINTER(POINTER(IFrameCapturedEventArgs)), 'eventData' )),
1463 | COMMETHOD(['propget'], HRESULT, 'IsActive',
1464 | ( ['retval', 'out'], POINTER(c_bool), 'IsActive' )),
1465 | COMMETHOD([], HRESULT, 'OpenReader',
1466 | ( ['retval', 'out'], POINTER(POINTER(IBodyIndexFrameReader)), 'reader' )),
1467 | COMMETHOD(['propget'], HRESULT, 'FrameDescription',
1468 | ( ['retval', 'out'], POINTER(POINTER(IFrameDescription)), 'FrameDescription' )),
1469 | COMMETHOD(['propget'], HRESULT, 'KinectSensor',
1470 | ( ['retval', 'out'], POINTER(POINTER(IKinectSensor)), 'sensor' )),
1471 | ]
1472 | ################################################################
1473 | ## code template for IBodyIndexFrameSource implementation
1474 | ##class IBodyIndexFrameSource_Impl(object):
1475 | ## def UnsubscribeFrameCaptured(self, waitableHandle):
1476 | ## '-no docstring-'
1477 | ## #return
1478 | ##
1479 | ## @property
1480 | ## def KinectSensor(self):
1481 | ## '-no docstring-'
1482 | ## #return sensor
1483 | ##
1484 | ## def OpenReader(self):
1485 | ## '-no docstring-'
1486 | ## #return reader
1487 | ##
1488 | ## @property
1489 | ## def FrameDescription(self):
1490 | ## '-no docstring-'
1491 | ## #return FrameDescription
1492 | ##
1493 | ## def GetFrameCapturedEventData(self, waitableHandle):
1494 | ## '-no docstring-'
1495 | ## #return eventData
1496 | ##
1497 | ## def SubscribeFrameCaptured(self):
1498 | ## '-no docstring-'
1499 | ## #return waitableHandle
1500 | ##
1501 | ## @property
1502 | ## def IsActive(self):
1503 | ## '-no docstring-'
1504 | ## #return IsActive
1505 | ##
1506 |
1507 | IBodyFrameReference._methods_ = [
1508 | COMMETHOD([], HRESULT, 'AcquireFrame',
1509 | ( ['retval', 'out'], POINTER(POINTER(IBodyFrame)), 'bodyFrame' )),
1510 | COMMETHOD(['propget'], HRESULT, 'RelativeTime',
1511 | ( ['retval', 'out'], POINTER(c_longlong), 'RelativeTime' )),
1512 | ]
1513 | ################################################################
1514 | ## code template for IBodyFrameReference implementation
1515 | ##class IBodyFrameReference_Impl(object):
1516 | ## @property
1517 | ## def RelativeTime(self):
1518 | ## '-no docstring-'
1519 | ## #return RelativeTime
1520 | ##
1521 | ## def AcquireFrame(self):
1522 | ## '-no docstring-'
1523 | ## #return bodyFrame
1524 | ##
1525 |
1526 | class IBodyIndexFrameReference(comtypes.IUnknown):
1527 | _case_insensitive_ = True
1528 | _iid_ = GUID('{D0EA0519-F7E7-4B1E-B3D8-03B3C002795F}')
1529 | _idlflags_ = []
1530 | IBodyIndexFrameArrivedEventArgs._methods_ = [
1531 | COMMETHOD(['propget'], HRESULT, 'FrameReference',
1532 | ( ['retval', 'out'], POINTER(POINTER(IBodyIndexFrameReference)), 'bodyIndexFrameReference' )),
1533 | ]
1534 | ################################################################
1535 | ## code template for IBodyIndexFrameArrivedEventArgs implementation
1536 | ##class IBodyIndexFrameArrivedEventArgs_Impl(object):
1537 | ## @property
1538 | ## def FrameReference(self):
1539 | ## '-no docstring-'
1540 | ## #return bodyIndexFrameReference
1541 | ##
1542 |
1543 |
1544 | # values for enumeration '_Expression'
1545 | Expression_Neutral = 0
1546 | Expression_Happy = 1
1547 | Expression_Count = 2
1548 | _Expression = c_int # enum
1549 | IColorFrame._methods_ = [
1550 | COMMETHOD(['propget'], HRESULT, 'RawColorImageFormat',
1551 | ( ['retval', 'out'], POINTER(_ColorImageFormat), 'RawColorImageFormat' )),
1552 | COMMETHOD(['propget'], HRESULT, 'FrameDescription',
1553 | ( ['retval', 'out'], POINTER(POINTER(IFrameDescription)), 'rawFrameDescription' )),
1554 | COMMETHOD([], HRESULT, 'CopyRawFrameDataToArray',
1555 | ( [], c_uint, 'capacity' ),
1556 | ( [], POINTER(c_ubyte), 'frameData' )),
1557 | COMMETHOD([], HRESULT, 'AccessRawUnderlyingBuffer',
1558 | ( [], POINTER(c_uint), 'capacity' ),
1559 | ( [], POINTER(POINTER(c_ubyte)), 'buffer' )), #'out'
1560 | COMMETHOD([], HRESULT, 'CopyConvertedFrameDataToArray',
1561 | ( [], c_uint, 'capacity' ),
1562 | ( [], POINTER(c_ubyte), 'frameData' ), #( [], POINTER(c_ubyte), 'frameData' )
1563 | ( [], _ColorImageFormat, 'colorFormat' )),
1564 | COMMETHOD([], HRESULT, 'CreateFrameDescription',
1565 | ( [], _ColorImageFormat, 'format' ),
1566 | ( ['retval', 'out'], POINTER(POINTER(IFrameDescription)), 'FrameDescription' )),
1567 | COMMETHOD(['propget'], HRESULT, 'ColorCameraSettings',
1568 | ( ['retval', 'out'], POINTER(POINTER(IColorCameraSettings)), 'ColorCameraSettings' )),
1569 | COMMETHOD(['propget'], HRESULT, 'RelativeTime',
1570 | ( ['retval', 'out'], POINTER(c_longlong), 'RelativeTime' )),
1571 | COMMETHOD(['propget'], HRESULT, 'ColorFrameSource',
1572 | ( ['retval', 'out'], POINTER(POINTER(IColorFrameSource)), 'ColorFrameSource' )),
1573 | ]
1574 | ################################################################
1575 | ## code template for IColorFrame implementation
1576 | ##class IColorFrame_Impl(object):
1577 | ## def CopyConvertedFrameDataToArray(self, capacity, colorFormat):
1578 | ## '-no docstring-'
1579 | ## #return frameData
1580 | ##
1581 | ## @property
1582 | ## def ColorCameraSettings(self):
1583 | ## '-no docstring-'
1584 | ## #return ColorCameraSettings
1585 | ##
1586 | ## @property
1587 | ## def RelativeTime(self):
1588 | ## '-no docstring-'
1589 | ## #return RelativeTime
1590 | ##
1591 | ## @property
1592 | ## def FrameDescription(self):
1593 | ## '-no docstring-'
1594 | ## #return rawFrameDescription
1595 | ##
1596 | ## @property
1597 | ## def RawColorImageFormat(self):
1598 | ## '-no docstring-'
1599 | ## #return RawColorImageFormat
1600 | ##
1601 | ## def AccessRawUnderlyingBuffer(self):
1602 | ## '-no docstring-'
1603 | ## #return capacity, buffer
1604 | ##
1605 | ## def CreateFrameDescription(self, format):
1606 | ## '-no docstring-'
1607 | ## #return FrameDescription
1608 | ##
1609 | ## def CopyRawFrameDataToArray(self, capacity):
1610 | ## '-no docstring-'
1611 | ## #return frameData
1612 | ##
1613 | ## @property
1614 | ## def ColorFrameSource(self):
1615 | ## '-no docstring-'
1616 | ## #return ColorFrameSource
1617 | ##
1618 |
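# Usage sketch (assumption: `color_frame` is an IColorFrame acquired from a color
# frame reader; 1920x1080 is the Kinect v2 color resolution and BGRA uses 4 bytes
# per pixel). CopyConvertedFrameDataToArray converts the raw YUY2 frame while copying.
def _example_copy_color_frame_sketch(color_frame):
    capacity = 1920 * 1080 * 4                   # bytes needed for one BGRA frame
    frame_data = (c_ubyte * capacity)()          # destination buffer
    color_frame.CopyConvertedFrameDataToArray(capacity, frame_data,
                                              ColorImageFormat_Bgra)
    return frame_data
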
1619 | ISequentialStream._methods_ = [
1620 | COMMETHOD([], HRESULT, 'RemoteRead',
1621 | ( [], POINTER(c_ubyte), 'pv' ),
1622 | ( ['in'], c_ulong, 'cb' ),
1623 | ( [], POINTER(c_ulong), 'pcbRead' )),
1624 | COMMETHOD([], HRESULT, 'RemoteWrite',
1625 | ( ['in'], POINTER(c_ubyte), 'pv' ),
1626 | ( ['in'], c_ulong, 'cb' ),
1627 | ( [], POINTER(c_ulong), 'pcbWritten' )),
1628 | ]
1629 | ################################################################
1630 | ## code template for ISequentialStream implementation
1631 | ##class ISequentialStream_Impl(object):
1632 | ## def RemoteRead(self, cb):
1633 | ## '-no docstring-'
1634 | ## #return pv, pcbRead
1635 | ##
1636 | ## def RemoteWrite(self, pv, cb):
1637 | ## '-no docstring-'
1638 | ## #return pcbWritten
1639 | ##
1640 |
1641 | IFrameDescription._methods_ = [
1642 | COMMETHOD(['propget'], HRESULT, 'Width',
1643 | ( ['retval', 'out'], POINTER(c_int), 'Width' )),
1644 | COMMETHOD(['propget'], HRESULT, 'Height',
1645 | ( ['retval', 'out'], POINTER(c_int), 'Height' )),
1646 | COMMETHOD(['propget'], HRESULT, 'HorizontalFieldOfView',
1647 | ( ['retval', 'out'], POINTER(c_float), 'HorizontalFieldOfView' )),
1648 | COMMETHOD(['propget'], HRESULT, 'VerticalFieldOfView',
1649 | ( ['retval', 'out'], POINTER(c_float), 'VerticalFieldOfView' )),
1650 | COMMETHOD(['propget'], HRESULT, 'DiagonalFieldOfView',
1651 | ( ['retval', 'out'], POINTER(c_float), 'DiagonalFieldOfView' )),
1652 | COMMETHOD(['propget'], HRESULT, 'LengthInPixels',
1653 | ( ['retval', 'out'], POINTER(c_uint), 'LengthInPixels' )),
1654 | COMMETHOD(['propget'], HRESULT, 'BytesPerPixel',
1655 | ( ['retval', 'out'], POINTER(c_uint), 'BytesPerPixel' )),
1656 | ]
1657 | ################################################################
1658 | ## code template for IFrameDescription implementation
1659 | ##class IFrameDescription_Impl(object):
1660 | ## @property
1661 | ## def HorizontalFieldOfView(self):
1662 | ## '-no docstring-'
1663 | ## #return HorizontalFieldOfView
1664 | ##
1665 | ## @property
1666 | ## def DiagonalFieldOfView(self):
1667 | ## '-no docstring-'
1668 | ## #return DiagonalFieldOfView
1669 | ##
1670 | ## @property
1671 | ## def VerticalFieldOfView(self):
1672 | ## '-no docstring-'
1673 | ## #return VerticalFieldOfView
1674 | ##
1675 | ## @property
1676 | ## def Height(self):
1677 | ## '-no docstring-'
1678 | ## #return Height
1679 | ##
1680 | ## @property
1681 | ## def Width(self):
1682 | ## '-no docstring-'
1683 | ## #return Width
1684 | ##
1685 | ## @property
1686 | ## def BytesPerPixel(self):
1687 | ## '-no docstring-'
1688 | ## #return BytesPerPixel
1689 | ##
1690 | ## @property
1691 | ## def LengthInPixels(self):
1692 | ## '-no docstring-'
1693 | ## #return LengthInPixels
1694 | ##
1695 |
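# Usage sketch (assumption: `frame_description` is an IFrameDescription returned by
# any of the frame sources above). LengthInPixels * BytesPerPixel is the buffer size
# expected by the corresponding CopyFrameDataToArray call.
def _example_frame_buffer_size_sketch(frame_description):
    return frame_description.LengthInPixels * frame_description.BytesPerPixel
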
1696 | IBodyIndexFrameReference._methods_ = [
1697 | COMMETHOD([], HRESULT, 'AcquireFrame',
1698 | ( ['retval', 'out'], POINTER(POINTER(IBodyIndexFrame)), 'bodyIndexFrame' )),
1699 | COMMETHOD(['propget'], HRESULT, 'RelativeTime',
1700 | ( ['retval', 'out'], POINTER(c_longlong), 'RelativeTime' )),
1701 | ]
1702 | ################################################################
1703 | ## code template for IBodyIndexFrameReference implementation
1704 | ##class IBodyIndexFrameReference_Impl(object):
1705 | ## @property
1706 | ## def RelativeTime(self):
1707 | ## '-no docstring-'
1708 | ## #return RelativeTime
1709 | ##
1710 | ## def AcquireFrame(self):
1711 | ## '-no docstring-'
1712 | ## #return bodyIndexFrame
1713 | ##
1714 |
1715 | class tagSTATSTG(Structure):
1716 | pass
1717 | IStream._methods_ = [
1718 | COMMETHOD([], HRESULT, 'RemoteSeek',
1719 | ( ['in'], _LARGE_INTEGER, 'dlibMove' ),
1720 | ( ['in'], c_ulong, 'dwOrigin' ),
1721 | ( [], POINTER(_ULARGE_INTEGER), 'plibNewPosition' )),
1722 | COMMETHOD([], HRESULT, 'SetSize',
1723 | ( ['in'], _ULARGE_INTEGER, 'libNewSize' )),
1724 | COMMETHOD([], HRESULT, 'RemoteCopyTo',
1725 | ( ['in'], POINTER(IStream), 'pstm' ),
1726 | ( ['in'], _ULARGE_INTEGER, 'cb' ),
1727 | ( [], POINTER(_ULARGE_INTEGER), 'pcbRead' ),
1728 | ( [], POINTER(_ULARGE_INTEGER), 'pcbWritten' )),
1729 | COMMETHOD([], HRESULT, 'Commit',
1730 | ( ['in'], c_ulong, 'grfCommitFlags' )),
1731 | COMMETHOD([], HRESULT, 'Revert'),
1732 | COMMETHOD([], HRESULT, 'LockRegion',
1733 | ( ['in'], _ULARGE_INTEGER, 'libOffset' ),
1734 | ( ['in'], _ULARGE_INTEGER, 'cb' ),
1735 | ( ['in'], c_ulong, 'dwLockType' )),
1736 | COMMETHOD([], HRESULT, 'UnlockRegion',
1737 | ( ['in'], _ULARGE_INTEGER, 'libOffset' ),
1738 | ( ['in'], _ULARGE_INTEGER, 'cb' ),
1739 | ( ['in'], c_ulong, 'dwLockType' )),
1740 | COMMETHOD([], HRESULT, 'Stat',
1741 | ( [], POINTER(tagSTATSTG), 'pstatstg' ),
1742 | ( ['in'], c_ulong, 'grfStatFlag' )),
1743 | COMMETHOD([], HRESULT, 'Clone',
1744 | ( ['out'], POINTER(POINTER(IStream)), 'ppstm' )),
1745 | ]
1746 | ################################################################
1747 | ## code template for IStream implementation
1748 | ##class IStream_Impl(object):
1749 | ## def RemoteSeek(self, dlibMove, dwOrigin):
1750 | ## '-no docstring-'
1751 | ## #return plibNewPosition
1752 | ##
1753 | ## def Stat(self, grfStatFlag):
1754 | ## '-no docstring-'
1755 | ## #return pstatstg
1756 | ##
1757 | ## def UnlockRegion(self, libOffset, cb, dwLockType):
1758 | ## '-no docstring-'
1759 | ## #return
1760 | ##
1761 | ## def Clone(self):
1762 | ## '-no docstring-'
1763 | ## #return ppstm
1764 | ##
1765 | ## def Revert(self):
1766 | ## '-no docstring-'
1767 | ## #return
1768 | ##
1769 | ## def RemoteCopyTo(self, pstm, cb):
1770 | ## '-no docstring-'
1771 | ## #return pcbRead, pcbWritten
1772 | ##
1773 | ## def LockRegion(self, libOffset, cb, dwLockType):
1774 | ## '-no docstring-'
1775 | ## #return
1776 | ##
1777 | ## def Commit(self, grfCommitFlags):
1778 | ## '-no docstring-'
1779 | ## #return
1780 | ##
1781 | ## def SetSize(self, libNewSize):
1782 | ## '-no docstring-'
1783 | ## #return
1784 | ##
1785 |
1786 | IDepthFrameReference._methods_ = [
1787 | COMMETHOD([], HRESULT, 'AcquireFrame',
1788 | ( ['retval', 'out'], POINTER(POINTER(IDepthFrame)), 'depthFrame' )),
1789 | COMMETHOD(['propget'], HRESULT, 'RelativeTime',
1790 | ( ['retval', 'out'], POINTER(c_longlong), 'RelativeTime' )),
1791 | ]
1792 | ################################################################
1793 | ## code template for IDepthFrameReference implementation
1794 | ##class IDepthFrameReference_Impl(object):
1795 | ## @property
1796 | ## def RelativeTime(self):
1797 | ## '-no docstring-'
1798 | ## #return RelativeTime
1799 | ##
1800 | ## def AcquireFrame(self):
1801 | ## '-no docstring-'
1802 | ## #return depthFrame
1803 | ##
1804 |
1805 | class _CameraSpacePoint(Structure):
1806 | pass
1807 | _CameraSpacePoint._fields_ = [
1808 | ('x', c_float),
1809 | ('y', c_float),
1810 | ('z', c_float),
1811 | ]
1812 | assert sizeof(_CameraSpacePoint) == 12, sizeof(_CameraSpacePoint)
1813 | assert alignment(_CameraSpacePoint) == 4, alignment(_CameraSpacePoint)
1814 | _Joint._fields_ = [
1815 | ('JointType', _JointType),
1816 | ('Position', _CameraSpacePoint),
1817 | ('TrackingState', _TrackingState),
1818 | ]
1819 | assert sizeof(_Joint) == 20, sizeof(_Joint)
1820 | assert alignment(_Joint) == 4, alignment(_Joint)
1821 | IBodyIndexFrame._methods_ = [
1822 | COMMETHOD([], HRESULT, 'CopyFrameDataToArray',
1823 | ( [], c_uint, 'capacity' ),
1824 | ( [], POINTER(c_ubyte), 'frameData' )),
1825 | COMMETHOD([], HRESULT, 'AccessUnderlyingBuffer',
1826 | ( [], POINTER(c_uint), 'capacity' ),
1827 | ( [], POINTER(POINTER(c_ubyte)), 'buffer' )), #'out'
1828 | COMMETHOD(['propget'], HRESULT, 'FrameDescription',
1829 | ( ['retval', 'out'], POINTER(POINTER(IFrameDescription)), 'FrameDescription' )),
1830 | COMMETHOD(['propget'], HRESULT, 'RelativeTime',
1831 | ( ['retval', 'out'], POINTER(c_longlong), 'RelativeTime' )),
1832 | COMMETHOD(['propget'], HRESULT, 'BodyIndexFrameSource',
1833 | ( ['retval', 'out'], POINTER(POINTER(IBodyIndexFrameSource)), 'BodyIndexFrameSource' )),
1834 | ]
1835 | ################################################################
1836 | ## code template for IBodyIndexFrame implementation
1837 | ##class IBodyIndexFrame_Impl(object):
1838 | ## @property
1839 | ## def BodyIndexFrameSource(self):
1840 | ## '-no docstring-'
1841 | ## #return BodyIndexFrameSource
1842 | ##
1843 | ## @property
1844 | ## def RelativeTime(self):
1845 | ## '-no docstring-'
1846 | ## #return RelativeTime
1847 | ##
1848 | ## @property
1849 | ## def FrameDescription(self):
1850 | ## '-no docstring-'
1851 | ## #return FrameDescription
1852 | ##
1853 | ## def CopyFrameDataToArray(self, capacity):
1854 | ## '-no docstring-'
1855 | ## #return frameData
1856 | ##
1857 | ## def AccessUnderlyingBuffer(self):
1858 | ## '-no docstring-'
1859 | ## #return capacity, buffer
1860 | ##
1861 |
1862 | _PointF._fields_ = [
1863 | ('x', c_float),
1864 | ('y', c_float),
1865 | ]
1866 | assert sizeof(_PointF) == 8, sizeof(_PointF)
1867 | assert alignment(_PointF) == 4, alignment(_PointF)
1868 | class _ColorSpacePoint(Structure):
1869 | pass
1870 | _ColorSpacePoint._fields_ = [
1871 | ('x', c_float),
1872 | ('y', c_float),
1873 | ]
1874 | assert sizeof(_ColorSpacePoint) == 8, sizeof(_ColorSpacePoint)
1875 | assert alignment(_ColorSpacePoint) == 4, alignment(_ColorSpacePoint)
1876 | class _RectF(Structure):
1877 | pass
1878 | _RectF._fields_ = [
1879 | ('x', c_float),
1880 | ('y', c_float),
1881 | ('Width', c_float),
1882 | ('Height', c_float),
1883 | ]
1884 | assert sizeof(_RectF) == 16, sizeof(_RectF)
1885 | assert alignment(_RectF) == 4, alignment(_RectF)
1886 | class IMultiSourceFrameArrivedEventArgs(comtypes.IUnknown):
1887 | _case_insensitive_ = True
1888 | _iid_ = GUID('{3532F40B-D908-451D-BBF4-6CA73B782558}')
1889 | _idlflags_ = []
1890 | class IMultiSourceFrameReference(comtypes.IUnknown):
1891 | _case_insensitive_ = True
1892 | _iid_ = GUID('{DD70E845-E283-4DD1-8DAF-FC259AC5F9E3}')
1893 | _idlflags_ = []
1894 | IMultiSourceFrameArrivedEventArgs._methods_ = [
1895 | COMMETHOD(['propget'], HRESULT, 'FrameReference',
1896 | ( ['retval', 'out'], POINTER(POINTER(IMultiSourceFrameReference)), 'frames' )),
1897 | ]
1898 | ################################################################
1899 | ## code template for IMultiSourceFrameArrivedEventArgs implementation
1900 | ##class IMultiSourceFrameArrivedEventArgs_Impl(object):
1901 | ## @property
1902 | ## def FrameReference(self):
1903 | ## '-no docstring-'
1904 | ## #return frames
1905 | ##
1906 |
1907 | class _DepthSpacePoint(Structure):
1908 | pass
1909 | _DepthSpacePoint._fields_ = [
1910 | ('x', c_float),
1911 | ('y', c_float),
1912 | ]
1913 | assert sizeof(_DepthSpacePoint) == 8, sizeof(_DepthSpacePoint)
1914 | assert alignment(_DepthSpacePoint) == 4, alignment(_DepthSpacePoint)
1915 | _JointOrientation._fields_ = [
1916 | ('JointType', _JointType),
1917 | ('Orientation', _Vector4),
1918 | ]
1919 | assert sizeof(_JointOrientation) == 20, sizeof(_JointOrientation)
1920 | assert alignment(_JointOrientation) == 4, alignment(_JointOrientation)
1921 | class IMultiSourceFrame(comtypes.IUnknown):
1922 | _case_insensitive_ = True
1923 | _iid_ = GUID('{29A63AFB-76CE-4359-895A-997F1E094D1C}')
1924 | _idlflags_ = []
1925 | IMultiSourceFrameReference._methods_ = [
1926 | COMMETHOD([], HRESULT, 'AcquireFrame',
1927 | ( ['retval', 'out'], POINTER(POINTER(IMultiSourceFrame)), 'frame' )),
1928 | ]
1929 | ################################################################
1930 | ## code template for IMultiSourceFrameReference implementation
1931 | ##class IMultiSourceFrameReference_Impl(object):
1932 | ## def AcquireFrame(self):
1933 | ## '-no docstring-'
1934 | ## #return frame
1935 | ##
1936 |
1937 | ILongExposureInfraredFrameSource._methods_ = [
1938 | COMMETHOD([], HRESULT, 'SubscribeFrameCaptured',
1939 | ( ['retval', 'out'], POINTER(INT_PTR), 'waitableHandle' )),
1940 | COMMETHOD([], HRESULT, 'UnsubscribeFrameCaptured',
1941 | ( ['in'], INT_PTR, 'waitableHandle' )),
1942 | COMMETHOD([], HRESULT, 'GetFrameCapturedEventData',
1943 | ( ['in'], INT_PTR, 'waitableHandle' ),
1944 | ( ['retval', 'out'], POINTER(POINTER(IFrameCapturedEventArgs)), 'eventData' )),
1945 | COMMETHOD(['propget'], HRESULT, 'IsActive',
1946 | ( ['retval', 'out'], POINTER(c_bool), 'IsActive' )),
1947 | COMMETHOD([], HRESULT, 'OpenReader',
1948 | ( ['retval', 'out'], POINTER(POINTER(ILongExposureInfraredFrameReader)), 'reader' )),
1949 | COMMETHOD(['propget'], HRESULT, 'FrameDescription',
1950 | ( ['retval', 'out'], POINTER(POINTER(IFrameDescription)), 'FrameDescription' )),
1951 | COMMETHOD(['propget'], HRESULT, 'KinectSensor',
1952 | ( ['retval', 'out'], POINTER(POINTER(IKinectSensor)), 'sensor' )),
1953 | ]
1954 | ################################################################
1955 | ## code template for ILongExposureInfraredFrameSource implementation
1956 | ##class ILongExposureInfraredFrameSource_Impl(object):
1957 | ## def UnsubscribeFrameCaptured(self, waitableHandle):
1958 | ## '-no docstring-'
1959 | ## #return
1960 | ##
1961 | ## @property
1962 | ## def KinectSensor(self):
1963 | ## '-no docstring-'
1964 | ## #return sensor
1965 | ##
1966 | ## def OpenReader(self):
1967 | ## '-no docstring-'
1968 | ## #return reader
1969 | ##
1970 | ## @property
1971 | ## def FrameDescription(self):
1972 | ## '-no docstring-'
1973 | ## #return FrameDescription
1974 | ##
1975 | ## def GetFrameCapturedEventData(self, waitableHandle):
1976 | ## '-no docstring-'
1977 | ## #return eventData
1978 | ##
1979 | ## def SubscribeFrameCaptured(self):
1980 | ## '-no docstring-'
1981 | ## #return waitableHandle
1982 | ##
1983 | ## @property
1984 | ## def IsActive(self):
1985 | ## '-no docstring-'
1986 | ## #return IsActive
1987 | ##
1988 |
1989 | class IInfraredFrameReference(comtypes.IUnknown):
1990 | _case_insensitive_ = True
1991 | _iid_ = GUID('{60183D5B-DED5-4D5C-AE59-64C7724FE5FE}')
1992 | _idlflags_ = []
1993 | IMultiSourceFrame._methods_ = [
1994 | COMMETHOD(['propget'], HRESULT, 'colorFrameReference',
1995 | ( ['retval', 'out'], POINTER(POINTER(IColorFrameReference)), 'colorFrameReference' )),
1996 | COMMETHOD(['propget'], HRESULT, 'depthFrameReference',
1997 | ( ['retval', 'out'], POINTER(POINTER(IDepthFrameReference)), 'depthFrameReference' )),
1998 | COMMETHOD(['propget'], HRESULT, 'bodyFrameReference',
1999 | ( ['retval', 'out'], POINTER(POINTER(IBodyFrameReference)), 'bodyFrameReference' )),
2000 | COMMETHOD(['propget'], HRESULT, 'bodyIndexFrameReference',
2001 | ( ['retval', 'out'], POINTER(POINTER(IBodyIndexFrameReference)), 'bodyIndexFrameReference' )),
2002 | COMMETHOD(['propget'], HRESULT, 'infraredFrameReference',
2003 | ( ['retval', 'out'], POINTER(POINTER(IInfraredFrameReference)), 'infraredFrameReference' )),
2004 | COMMETHOD(['propget'], HRESULT, 'longExposureInfraredFrameReference',
2005 | ( ['retval', 'out'], POINTER(POINTER(ILongExposureInfraredFrameReference)), 'longExposureInfraredFrameReference' )),
2006 | ]
2007 | ################################################################
2008 | ## code template for IMultiSourceFrame implementation
2009 | ##class IMultiSourceFrame_Impl(object):
2010 | ## @property
2011 | ## def depthFrameReference(self):
2012 | ## '-no docstring-'
2013 | ## #return depthFrameReference
2014 | ##
2015 | ## @property
2016 | ## def bodyIndexFrameReference(self):
2017 | ## '-no docstring-'
2018 | ## #return bodyIndexFrameReference
2019 | ##
2020 | ## @property
2021 | ## def longExposureInfraredFrameReference(self):
2022 | ## '-no docstring-'
2023 | ## #return longExposureInfraredFrameReference
2024 | ##
2025 | ## @property
2026 | ## def bodyFrameReference(self):
2027 | ## '-no docstring-'
2028 | ## #return bodyFrameReference
2029 | ##
2030 | ## @property
2031 | ## def infraredFrameReference(self):
2032 | ## '-no docstring-'
2033 | ## #return infraredFrameReference
2034 | ##
2035 | ## @property
2036 | ## def colorFrameReference(self):
2037 | ## '-no docstring-'
2038 | ## #return colorFrameReference
2039 | ##
2040 |
2041 | class _CameraIntrinsics(Structure):
2042 | pass
2043 | _CameraIntrinsics._fields_ = [
2044 | ('FocalLengthX', c_float),
2045 | ('FocalLengthY', c_float),
2046 | ('PrincipalPointX', c_float),
2047 | ('PrincipalPointY', c_float),
2048 | ('RadialDistortionSecondOrder', c_float),
2049 | ('RadialDistortionFourthOrder', c_float),
2050 | ('RadialDistortionSixthOrder', c_float),
2051 | ]
2052 | assert sizeof(_CameraIntrinsics) == 28, sizeof(_CameraIntrinsics)
2053 | assert alignment(_CameraIntrinsics) == 4, alignment(_CameraIntrinsics)
2054 | class ICoordinateMapper(comtypes.IUnknown):
2055 | _case_insensitive_ = True
2056 | _iid_ = GUID('{8784DF2D-16B0-481C-A11E-55E70BF25018}')
2057 | _idlflags_ = []
2058 | class ICoordinateMappingChangedEventArgs(comtypes.IUnknown):
2059 | _case_insensitive_ = True
2060 | _iid_ = GUID('{E9A2A0BF-13BD-4A53-A157-91FC8BB41F85}')
2061 | _idlflags_ = []
2062 | ICoordinateMapper._methods_ = [
2063 | COMMETHOD([], HRESULT, 'SubscribeCoordinateMappingChanged',
2064 | ( ['retval', 'out'], POINTER(INT_PTR), 'waitableHandle' )),
2065 | COMMETHOD([], HRESULT, 'UnsubscribeCoordinateMappingChanged',
2066 | ( ['in'], INT_PTR, 'waitableHandle' )),
2067 | COMMETHOD([], HRESULT, 'GetCoordinateMappingChangedEventData',
2068 | ( ['in'], INT_PTR, 'waitableHandle' ),
2069 | ( ['retval', 'out'], POINTER(POINTER(ICoordinateMappingChangedEventArgs)), 'eventData' )),
2070 | COMMETHOD([], HRESULT, 'MapCameraPointToDepthSpace',
2071 | ( [], _CameraSpacePoint, 'cameraPoint' ),
2072 | ( ['retval', 'out'], POINTER(_DepthSpacePoint), 'depthPoint' )),
2073 | COMMETHOD([], HRESULT, 'MapCameraPointToColorSpace',
2074 | ( [], _CameraSpacePoint, 'cameraPoint' ),
2075 | ( ['retval', 'out'], POINTER(_ColorSpacePoint), 'colorPoint' )),
2076 | COMMETHOD([], HRESULT, 'MapDepthPointToCameraSpace',
2077 | ( [], _DepthSpacePoint, 'depthPoint' ),
2078 | ( [], c_ushort, 'depth' ),
2079 | ( ['retval', 'out'], POINTER(_CameraSpacePoint), 'cameraPoint' )),
2080 | COMMETHOD([], HRESULT, 'MapDepthPointToColorSpace',
2081 | ( [], _DepthSpacePoint, 'depthPoint' ),
2082 | ( [], c_ushort, 'depth' ),
2083 | ( ['retval', 'out'], POINTER(_ColorSpacePoint), 'colorPoint' )),
2084 | COMMETHOD([], HRESULT, 'MapCameraPointsToDepthSpace',
2085 | ( [], c_uint, 'cameraPointCount' ),
2086 | ( ['in'], POINTER(_CameraSpacePoint), 'cameraPoints' ),
2087 | ( [], c_uint, 'depthPointCount' ),
2088 | ( [], POINTER(_DepthSpacePoint), 'depthPoints' )),
2089 | COMMETHOD([], HRESULT, 'MapCameraPointsToColorSpace',
2090 | ( [], c_uint, 'cameraPointCount' ),
2091 | ( ['in'], POINTER(_CameraSpacePoint), 'cameraPoints' ),
2092 | ( [], c_uint, 'colorPointCount' ),
2093 | ( [], POINTER(_ColorSpacePoint), 'colorPoints' )),
2094 | COMMETHOD([], HRESULT, 'MapDepthPointsToCameraSpace',
2095 | ( [], c_uint, 'depthPointCount' ),
2096 | ( ['in'], POINTER(_DepthSpacePoint), 'depthPoints' ),
2097 | ( [], c_uint, 'depthCount' ),
2098 | ( ['in'], POINTER(c_ushort), 'depths' ),
2099 | ( [], c_uint, 'cameraPointCount' ),
2100 | ( [], POINTER(_CameraSpacePoint), 'cameraPoints' )),
2101 | COMMETHOD([], HRESULT, 'MapDepthPointsToColorSpace',
2102 | ( [], c_uint, 'depthPointCount' ),
2103 | ( ['in'], POINTER(_DepthSpacePoint), 'depthPoints' ),
2104 | ( [], c_uint, 'depthCount' ),
2105 | ( ['in'], POINTER(c_ushort), 'depths' ),
2106 | ( [], c_uint, 'colorPointCount' ),
2107 | ( [], POINTER(_ColorSpacePoint), 'colorPoints' )),
2108 | COMMETHOD([], HRESULT, 'MapDepthFrameToCameraSpace',
2109 | ( [], c_uint, 'depthPointCount' ),
2110 | ( ['in'], POINTER(c_ushort), 'depthFrameData' ),
2111 | ( [], c_uint, 'cameraPointCount' ),
2112 | ( [], POINTER(_CameraSpacePoint), 'cameraSpacePoints' )),
2113 | COMMETHOD([], HRESULT, 'MapDepthFrameToColorSpace',
2114 | ( [], c_uint, 'depthPointCount' ),
2115 | ( ['in'], POINTER(c_ushort), 'depthFrameData' ),
2116 | ( [], c_uint, 'colorPointCount' ),
2117 | ( [], POINTER(_ColorSpacePoint), 'colorSpacePoints' )),
2118 | COMMETHOD([], HRESULT, 'MapColorFrameToDepthSpace',
2119 | ( [], c_uint, 'depthDataPointCount' ),
2120 | ( ['in'], POINTER(c_ushort), 'depthFrameData' ),
2121 | ( [], c_uint, 'depthPointCount' ),
2122 | ( [], POINTER(_DepthSpacePoint), 'depthSpacePoints' )),
2123 | COMMETHOD([], HRESULT, 'MapColorFrameToCameraSpace',
2124 | ( [], c_uint, 'depthDataPointCount' ),
2125 | ( ['in'], POINTER(c_ushort), 'depthFrameData' ),
2126 | ( [], c_uint, 'cameraPointCount' ),
2127 | ( [], POINTER(_CameraSpacePoint), 'cameraSpacePoints' )),
2128 | COMMETHOD([], HRESULT, 'GetDepthFrameToCameraSpaceTable',
2129 | ( [], POINTER(c_uint), 'tableEntryCount' ),
2130 | ( ['retval', 'out'], POINTER(POINTER(_PointF)), 'tableEntries' )),
2131 | COMMETHOD([], HRESULT, 'GetDepthCameraIntrinsics',
2132 | ( ['retval', 'out'], POINTER(_CameraIntrinsics), 'cameraIntrinsics' )),
2133 | ]
2134 | ################################################################
2135 | ## code template for ICoordinateMapper implementation
2136 | ##class ICoordinateMapper_Impl(object):
2137 | ## def GetDepthCameraIntrinsics(self):
2138 | ## '-no docstring-'
2139 | ## #return cameraIntrinsics
2140 | ##
2141 | ## def MapDepthPointsToCameraSpace(self, depthPointCount, depthPoints, depthCount, depths, cameraPointCount):
2142 | ## '-no docstring-'
2143 | ## #return cameraPoints
2144 | ##
2145 | ## def GetCoordinateMappingChangedEventData(self, waitableHandle):
2146 | ## '-no docstring-'
2147 | ## #return eventData
2148 | ##
2149 | ## def MapCameraPointsToDepthSpace(self, cameraPointCount, cameraPoints, depthPointCount):
2150 | ## '-no docstring-'
2151 | ## #return depthPoints
2152 | ##
2153 | ## def MapColorFrameToCameraSpace(self, depthDataPointCount, depthFrameData, cameraPointCount):
2154 | ## '-no docstring-'
2155 | ## #return cameraSpacePoints
2156 | ##
2157 | ## def MapCameraPointToColorSpace(self, cameraPoint):
2158 | ## '-no docstring-'
2159 | ## #return colorPoint
2160 | ##
2161 | ## def SubscribeCoordinateMappingChanged(self):
2162 | ## '-no docstring-'
2163 | ## #return waitableHandle
2164 | ##
2165 | ## def MapDepthPointsToColorSpace(self, depthPointCount, depthPoints, depthCount, depths, colorPointCount):
2166 | ## '-no docstring-'
2167 | ## #return colorPoints
2168 | ##
2169 | ## def MapDepthPointToCameraSpace(self, depthPoint, depth):
2170 | ## '-no docstring-'
2171 | ## #return cameraPoint
2172 | ##
2173 | ## def MapColorFrameToDepthSpace(self, depthDataPointCount, depthFrameData, depthPointCount):
2174 | ## '-no docstring-'
2175 | ## #return depthSpacePoints
2176 | ##
2177 | ## def UnsubscribeCoordinateMappingChanged(self, waitableHandle):
2178 | ## '-no docstring-'
2179 | ## #return
2180 | ##
2181 | ## def MapCameraPointsToColorSpace(self, cameraPointCount, cameraPoints, colorPointCount):
2182 | ## '-no docstring-'
2183 | ## #return colorPoints
2184 | ##
2185 | ## def GetDepthFrameToCameraSpaceTable(self):
2186 | ## '-no docstring-'
2187 | ## #return tableEntryCount, tableEntries
2188 | ##
2189 | ## def MapDepthFrameToCameraSpace(self, depthPointCount, depthFrameData, cameraPointCount):
2190 | ## '-no docstring-'
2191 | ## #return cameraSpacePoints
2192 | ##
2193 | ## def MapDepthFrameToColorSpace(self, depthPointCount, depthFrameData, colorPointCount):
2194 | ## '-no docstring-'
2195 | ## #return colorSpacePoints
2196 | ##
2197 | ## def MapCameraPointToDepthSpace(self, cameraPoint):
2198 | ## '-no docstring-'
2199 | ## #return depthPoint
2200 | ##
2201 | ## def MapDepthPointToColorSpace(self, depthPoint, depth):
2202 | ## '-no docstring-'
2203 | ## #return colorPoint
2204 | ##
2205 |
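# Usage sketch (assumption: `mapper` is the ICoordinateMapper obtained from an
# IKinectSensor and `depth_frame_data` is a (c_ushort * (512 * 424)) array copied
# out of a depth frame; 512x424 is the Kinect v2 depth resolution). Depth values
# are in millimeters, camera-space points in meters.
def _example_depth_to_camera_space_sketch(mapper, depth_frame_data):
    point_count = 512 * 424
    camera_points = (_CameraSpacePoint * point_count)()
    mapper.MapDepthFrameToCameraSpace(point_count, depth_frame_data,
                                      point_count, camera_points)
    return camera_points
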
2206 | tagSTATSTG._fields_ = [
2207 | ('pwcsName', WSTRING),
2208 | ('type', c_ulong),
2209 | ('cbSize', _ULARGE_INTEGER),
2210 | ('mtime', _FILETIME),
2211 | ('ctime', _FILETIME),
2212 | ('atime', _FILETIME),
2213 | ('grfMode', c_ulong),
2214 | ('grfLocksSupported', c_ulong),
2215 | ('clsid', GUID),
2216 | ('grfStateBits', c_ulong),
2217 | ('reserved', c_ulong),
2218 | ]
2219 | required_size = 64 + sysinfo.platform_bits // 4   # 72 on 32-bit Python, 80 on 64-bit
2220 |
2221 | assert sizeof(tagSTATSTG) == required_size, sizeof(tagSTATSTG)
2222 | assert alignment(tagSTATSTG) == 8, alignment(tagSTATSTG)
2223 | IAudioBeamList._methods_ = [
2224 | COMMETHOD(['propget'], HRESULT, 'BeamCount',
2225 | ( ['retval', 'out'], POINTER(c_uint), 'count' )),
2226 | COMMETHOD([], HRESULT, 'OpenAudioBeam',
2227 | ( [], c_uint, 'index' ),
2228 | ( ['out'], POINTER(POINTER(IAudioBeam)), 'AudioBeam' )),
2229 | ]
2230 | ################################################################
2231 | ## code template for IAudioBeamList implementation
2232 | ##class IAudioBeamList_Impl(object):
2233 | ## def OpenAudioBeam(self, index):
2234 | ## '-no docstring-'
2235 | ## #return AudioBeam
2236 | ##
2237 | ## @property
2238 | ## def BeamCount(self):
2239 | ## '-no docstring-'
2240 | ## #return count
2241 | ##
2242 |
2243 | class IInfraredFrameReader(comtypes.IUnknown):
2244 | _case_insensitive_ = True
2245 | _iid_ = GUID('{059A049D-A0AC-481E-B342-483EE94A028B}')
2246 | _idlflags_ = []
2247 | class IInfraredFrameArrivedEventArgs(comtypes.IUnknown):
2248 | _case_insensitive_ = True
2249 | _iid_ = GUID('{7E17F78E-D9D1-4448-90C2-4E50EC4ECEE9}')
2250 | _idlflags_ = []
2251 | class IInfraredFrame(comtypes.IUnknown):
2252 | _case_insensitive_ = True
2253 | _iid_ = GUID('{EA83823C-7613-4F29-BD51-4A9678A52C7E}')
2254 | _idlflags_ = []
2255 | class IInfraredFrameSource(comtypes.IUnknown):
2256 | _case_insensitive_ = True
2257 | _iid_ = GUID('{4C299EC6-CA45-4AFF-87AD-DF5762C49BE7}')
2258 | _idlflags_ = []
2259 | IInfraredFrameReader._methods_ = [
2260 | COMMETHOD([], HRESULT, 'SubscribeFrameArrived',
2261 | ( ['retval', 'out'], POINTER(INT_PTR), 'waitableHandle' )),
2262 | COMMETHOD([], HRESULT, 'UnsubscribeFrameArrived',
2263 | ( ['in'], INT_PTR, 'waitableHandle' )),
2264 | COMMETHOD([], HRESULT, 'GetFrameArrivedEventData',
2265 | ( ['in'], INT_PTR, 'waitableHandle' ),
2266 | ( ['retval', 'out'], POINTER(POINTER(IInfraredFrameArrivedEventArgs)), 'eventData' )),
2267 | COMMETHOD([], HRESULT, 'AcquireLatestFrame',
2268 | ( ['retval', 'out'], POINTER(POINTER(IInfraredFrame)), 'infraredFrame' )),
2269 | COMMETHOD(['propget'], HRESULT, 'IsPaused',
2270 | ( ['retval', 'out'], POINTER(c_bool), 'IsPaused' )),
2271 | COMMETHOD(['propput'], HRESULT, 'IsPaused',
2272 | ( [], c_bool, 'IsPaused' )),
2273 | COMMETHOD(['propget'], HRESULT, 'InfraredFrameSource',
2274 | ( ['retval', 'out'], POINTER(POINTER(IInfraredFrameSource)), 'InfraredFrameSource' )),
2275 | ]
2276 | ################################################################
2277 | ## code template for IInfraredFrameReader implementation
2278 | ##class IInfraredFrameReader_Impl(object):
2279 | ## def GetFrameArrivedEventData(self, waitableHandle):
2280 | ## '-no docstring-'
2281 | ## #return eventData
2282 | ##
2283 | ## @property
2284 | ## def InfraredFrameSource(self):
2285 | ## '-no docstring-'
2286 | ## #return InfraredFrameSource
2287 | ##
2288 | ## def UnsubscribeFrameArrived(self, waitableHandle):
2289 | ## '-no docstring-'
2290 | ## #return
2291 | ##
2292 | ## def _get(self):
2293 | ## '-no docstring-'
2294 | ## #return IsPaused
2295 | ## def _set(self, IsPaused):
2296 | ## '-no docstring-'
2297 | ## IsPaused = property(_get, _set, doc = _set.__doc__)
2298 | ##
2299 | ## def AcquireLatestFrame(self):
2300 | ## '-no docstring-'
2301 | ## #return infraredFrame
2302 | ##
2303 | ## def SubscribeFrameArrived(self):
2304 | ## '-no docstring-'
2305 | ## #return waitableHandle
2306 | ##
2307 |
2308 | IInfraredFrameSource._methods_ = [
2309 | COMMETHOD([], HRESULT, 'SubscribeFrameCaptured',
2310 | ( ['retval', 'out'], POINTER(INT_PTR), 'waitableHandle' )),
2311 | COMMETHOD([], HRESULT, 'UnsubscribeFrameCaptured',
2312 | ( ['in'], INT_PTR, 'waitableHandle' )),
2313 | COMMETHOD([], HRESULT, 'GetFrameCapturedEventData',
2314 | ( ['in'], INT_PTR, 'waitableHandle' ),
2315 | ( ['retval', 'out'], POINTER(POINTER(IFrameCapturedEventArgs)), 'eventData' )),
2316 | COMMETHOD(['propget'], HRESULT, 'IsActive',
2317 | ( ['retval', 'out'], POINTER(c_bool), 'IsActive' )),
2318 | COMMETHOD([], HRESULT, 'OpenReader',
2319 | ( ['retval', 'out'], POINTER(POINTER(IInfraredFrameReader)), 'reader' )),
2320 | COMMETHOD(['propget'], HRESULT, 'FrameDescription',
2321 | ( ['retval', 'out'], POINTER(POINTER(IFrameDescription)), 'FrameDescription' )),
2322 | COMMETHOD(['propget'], HRESULT, 'KinectSensor',
2323 | ( ['retval', 'out'], POINTER(POINTER(IKinectSensor)), 'sensor' )),
2324 | ]
2325 | ################################################################
2326 | ## code template for IInfraredFrameSource implementation
2327 | ##class IInfraredFrameSource_Impl(object):
2328 | ## def UnsubscribeFrameCaptured(self, waitableHandle):
2329 | ## '-no docstring-'
2330 | ## #return
2331 | ##
2332 | ## @property
2333 | ## def KinectSensor(self):
2334 | ## '-no docstring-'
2335 | ## #return sensor
2336 | ##
2337 | ## def OpenReader(self):
2338 | ## '-no docstring-'
2339 | ## #return reader
2340 | ##
2341 | ## @property
2342 | ## def FrameDescription(self):
2343 | ## '-no docstring-'
2344 | ## #return FrameDescription
2345 | ##
2346 | ## def GetFrameCapturedEventData(self, waitableHandle):
2347 | ## '-no docstring-'
2348 | ## #return eventData
2349 | ##
2350 | ## def SubscribeFrameCaptured(self):
2351 | ## '-no docstring-'
2352 | ## #return waitableHandle
2353 | ##
2354 | ## @property
2355 | ## def IsActive(self):
2356 | ## '-no docstring-'
2357 | ## #return IsActive
2358 | ##
2359 |
2360 | IInfraredFrameArrivedEventArgs._methods_ = [
2361 | COMMETHOD(['propget'], HRESULT, 'FrameReference',
2362 | ( ['retval', 'out'], POINTER(POINTER(IInfraredFrameReference)), 'infraredFrameReference' )),
2363 | ]
2364 | ################################################################
2365 | ## code template for IInfraredFrameArrivedEventArgs implementation
2366 | ##class IInfraredFrameArrivedEventArgs_Impl(object):
2367 | ## @property
2368 | ## def FrameReference(self):
2369 | ## '-no docstring-'
2370 | ## #return infraredFrameReference
2371 | ##
2372 |
2373 | class IAudioBodyCorrelation(comtypes.IUnknown):
2374 | _case_insensitive_ = True
2375 | _iid_ = GUID('{C5BA2355-07DB-47C3-ABC4-68D24B91DE61}')
2376 | _idlflags_ = []
2377 | IAudioBeamSubFrame._methods_ = [
2378 | COMMETHOD(['propget'], HRESULT, 'FrameLengthInBytes',
2379 | ( ['retval', 'out'], POINTER(c_uint), 'length' )),
2380 | COMMETHOD(['propget'], HRESULT, 'duration',
2381 | ( ['retval', 'out'], POINTER(c_longlong), 'duration' )),
2382 | COMMETHOD(['propget'], HRESULT, 'BeamAngle',
2383 | ( ['retval', 'out'], POINTER(c_float), 'BeamAngle' )),
2384 | COMMETHOD(['propget'], HRESULT, 'BeamAngleConfidence',
2385 | ( ['retval', 'out'], POINTER(c_float), 'BeamAngleConfidence' )),
2386 | COMMETHOD(['propget'], HRESULT, 'AudioBeamMode',
2387 | ( ['retval', 'out'], POINTER(_AudioBeamMode), 'AudioBeamMode' )),
2388 | COMMETHOD(['propget'], HRESULT, 'AudioBodyCorrelationCount',
2389 | ( ['retval', 'out'], POINTER(c_uint), 'pCount' )),
2390 | COMMETHOD([], HRESULT, 'GetAudioBodyCorrelation',
2391 | ( ['in'], c_uint, 'index' ),
2392 | ( ['out'], POINTER(POINTER(IAudioBodyCorrelation)), 'ppAudioBodyCorrelation' )),
2393 | COMMETHOD([], HRESULT, 'CopyFrameDataToArray',
2394 | ( [], c_uint, 'capacity' ),
2395 | ( [], POINTER(c_ubyte), 'frameData' )),
2396 | COMMETHOD([], HRESULT, 'AccessUnderlyingBuffer',
2397 | ( [], POINTER(c_uint), 'capacity' ),
2398 | ( [], POINTER(POINTER(c_ubyte)), 'buffer' )), #'out'
2399 | COMMETHOD(['propget'], HRESULT, 'RelativeTime',
2400 | ( ['retval', 'out'], POINTER(c_longlong), 'RelativeTime' )),
2401 | ]
2402 | ################################################################
2403 | ## code template for IAudioBeamSubFrame implementation
2404 | ##class IAudioBeamSubFrame_Impl(object):
2405 | ## @property
2406 | ## def AudioBeamMode(self):
2407 | ## '-no docstring-'
2408 | ## #return AudioBeamMode
2409 | ##
2410 | ## @property
2411 | ## def RelativeTime(self):
2412 | ## '-no docstring-'
2413 | ## #return RelativeTime
2414 | ##
2415 | ## def GetAudioBodyCorrelation(self, index):
2416 | ## '-no docstring-'
2417 | ## #return ppAudioBodyCorrelation
2418 | ##
2419 | ## def CopyFrameDataToArray(self, capacity):
2420 | ## '-no docstring-'
2421 | ## #return frameData
2422 | ##
2423 | ## @property
2424 | ## def FrameLengthInBytes(self):
2425 | ## '-no docstring-'
2426 | ## #return length
2427 | ##
2428 | ## @property
2429 | ## def AudioBodyCorrelationCount(self):
2430 | ## '-no docstring-'
2431 | ## #return pCount
2432 | ##
2433 | ## def AccessUnderlyingBuffer(self):
2434 | ## '-no docstring-'
2435 | ## #return capacity, buffer
2436 | ##
2437 | ## @property
2438 | ## def BeamAngleConfidence(self):
2439 | ## '-no docstring-'
2440 | ## #return BeamAngleConfidence
2441 | ##
2442 | ## @property
2443 | ## def duration(self):
2444 | ## '-no docstring-'
2445 | ## #return duration
2446 | ##
2447 | ## @property
2448 | ## def BeamAngle(self):
2449 | ## '-no docstring-'
2450 | ## #return BeamAngle
2451 | ##
2452 |
2453 | IEnumKinectSensor._methods_ = [
2454 | COMMETHOD([], HRESULT, 'GetNext',
2455 | ( ['retval', 'out'], POINTER(POINTER(IKinectSensor)), 'sensor' )),
2456 | COMMETHOD([], HRESULT, 'Reset'),
2457 | ]
2458 | ################################################################
2459 | ## code template for IEnumKinectSensor implementation
2460 | ##class IEnumKinectSensor_Impl(object):
2461 | ## def GetNext(self):
2462 | ## '-no docstring-'
2463 | ## #return sensor
2464 | ##
2465 | ## def Reset(self):
2466 | ## '-no docstring-'
2467 | ## #return
2468 | ##
2469 |
2470 | IBodyFrame._methods_ = [
2471 | COMMETHOD([], HRESULT, 'GetAndRefreshBodyData',
2472 | ( [], c_uint, 'capacity' ),
2473 | ( ['in'], POINTER(POINTER(IBody)), 'bodies' )),
2474 | COMMETHOD(['propget'], HRESULT, 'FloorClipPlane',
2475 | ( ['retval', 'out'], POINTER(_Vector4), 'FloorClipPlane' )),
2476 | COMMETHOD(['propget'], HRESULT, 'RelativeTime',
2477 | ( ['retval', 'out'], POINTER(c_longlong), 'RelativeTime' )),
2478 | COMMETHOD(['propget'], HRESULT, 'BodyFrameSource',
2479 | ( ['retval', 'out'], POINTER(POINTER(IBodyFrameSource)), 'BodyFrameSource' )),
2480 | ]
2481 | ################################################################
2482 | ## code template for IBodyFrame implementation
2483 | ##class IBodyFrame_Impl(object):
2484 | ## @property
2485 | ## def FloorClipPlane(self):
2486 | ## '-no docstring-'
2487 | ## #return FloorClipPlane
2488 | ##
2489 | ## @property
2490 | ## def BodyFrameSource(self):
2491 | ## '-no docstring-'
2492 | ## #return BodyFrameSource
2493 | ##
2494 | ## def GetAndRefreshBodyData(self, capacity):
2495 | ## '-no docstring-'
2496 | ## #return bodies
2497 | ##
2498 | ## @property
2499 | ## def RelativeTime(self):
2500 | ## '-no docstring-'
2501 | ## #return RelativeTime
2502 | ##
2503 |
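# Usage sketch (assumption: `body_frame` is an IBodyFrame acquired from a body frame
# reader). The Kinect v2 always reports KINECT_SKELETON_COUNT (6) body slots, tracked
# or not; GetAndRefreshBodyData fills the supplied array of IBody pointers in place.
def _example_refresh_bodies_sketch(body_frame):
    bodies = (POINTER(IBody) * 6)()
    body_frame.GetAndRefreshBodyData(6, bodies)
    return bodies
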
2504 | ICoordinateMappingChangedEventArgs._methods_ = [
2505 | ]
2506 | ################################################################
2507 | ## code template for ICoordinateMappingChangedEventArgs implementation
2508 | ##class ICoordinateMappingChangedEventArgs_Impl(object):
2509 |
2510 | IInfraredFrameReference._methods_ = [
2511 | COMMETHOD([], HRESULT, 'AcquireFrame',
2512 | ( ['retval', 'out'], POINTER(POINTER(IInfraredFrame)), 'infraredFrame' )),
2513 | COMMETHOD(['propget'], HRESULT, 'RelativeTime',
2514 | ( ['retval', 'out'], POINTER(c_longlong), 'RelativeTime' )),
2515 | ]
2516 | ################################################################
2517 | ## code template for IInfraredFrameReference implementation
2518 | ##class IInfraredFrameReference_Impl(object):
2519 | ## @property
2520 | ## def RelativeTime(self):
2521 | ## '-no docstring-'
2522 | ## #return RelativeTime
2523 | ##
2524 | ## def AcquireFrame(self):
2525 | ## '-no docstring-'
2526 | ## #return infraredFrame
2527 | ##
2528 |
2529 | class IIsAvailableChangedEventArgs(comtypes.IUnknown):
2530 | _case_insensitive_ = True
2531 | _iid_ = GUID('{3A6DD52E-967F-4982-B3D9-74B9E1A044C9}')
2532 | _idlflags_ = []
2533 | class IMultiSourceFrameReader(comtypes.IUnknown):
2534 | _case_insensitive_ = True
2535 | _iid_ = GUID('{C0F6432B-9FFE-4AB3-A683-F37C72BBB158}')
2536 | _idlflags_ = []
2537 | IKinectSensor._methods_ = [
2538 | COMMETHOD([], HRESULT, 'SubscribeIsAvailableChanged',
2539 | ( ['retval', 'out'], POINTER(INT_PTR), 'waitableHandle' )),
2540 | COMMETHOD([], HRESULT, 'UnsubscribeIsAvailableChanged',
2541 | ( ['in'], INT_PTR, 'waitableHandle' )),
2542 | COMMETHOD([], HRESULT, 'GetIsAvailableChangedEventData',
2543 | ( ['in'], INT_PTR, 'waitableHandle' ),
2544 | ( ['retval', 'out'], POINTER(POINTER(IIsAvailableChangedEventArgs)), 'eventData' )),
2545 | COMMETHOD([], HRESULT, 'Open'),
2546 | COMMETHOD([], HRESULT, 'Close'),
2547 | COMMETHOD(['propget'], HRESULT, 'IsOpen',
2548 | ( ['retval', 'out'], POINTER(c_bool), 'IsOpen' )),
2549 | COMMETHOD(['propget'], HRESULT, 'IsAvailable',
2550 | ( ['retval', 'out'], POINTER(c_bool), 'IsAvailable' )),
2551 | COMMETHOD(['propget'], HRESULT, 'ColorFrameSource',
2552 | ( ['retval', 'out'], POINTER(POINTER(IColorFrameSource)), 'ColorFrameSource' )),
2553 | COMMETHOD(['propget'], HRESULT, 'DepthFrameSource',
2554 | ( ['retval', 'out'], POINTER(POINTER(IDepthFrameSource)), 'DepthFrameSource' )),
2555 | COMMETHOD(['propget'], HRESULT, 'BodyFrameSource',
2556 | ( ['retval', 'out'], POINTER(POINTER(IBodyFrameSource)), 'BodyFrameSource' )),
2557 | COMMETHOD(['propget'], HRESULT, 'BodyIndexFrameSource',
2558 | ( ['retval', 'out'], POINTER(POINTER(IBodyIndexFrameSource)), 'BodyIndexFrameSource' )),
2559 | COMMETHOD(['propget'], HRESULT, 'InfraredFrameSource',
2560 | ( ['retval', 'out'], POINTER(POINTER(IInfraredFrameSource)), 'InfraredFrameSource' )),
2561 | COMMETHOD(['propget'], HRESULT, 'LongExposureInfraredFrameSource',
2562 | ( ['retval', 'out'], POINTER(POINTER(ILongExposureInfraredFrameSource)), 'LongExposureInfraredFrameSource' )),
2563 | COMMETHOD(['propget'], HRESULT, 'AudioSource',
2564 | ( ['retval', 'out'], POINTER(POINTER(IAudioSource)), 'AudioSource' )),
2565 | COMMETHOD([], HRESULT, 'OpenMultiSourceFrameReader',
2566 | ( [], c_ulong, 'enabledFrameSourceTypes' ),
2567 | ( ['retval', 'out'], POINTER(POINTER(IMultiSourceFrameReader)), 'multiSourceFrameReader' )),
2568 | COMMETHOD(['propget'], HRESULT, 'CoordinateMapper',
2569 | ( ['retval', 'out'], POINTER(POINTER(ICoordinateMapper)), 'CoordinateMapper' )),
2570 | COMMETHOD(['propget'], HRESULT, 'UniqueKinectId',
2571 | ( [], c_uint, 'bufferSize' ),
2572 | ( ['retval', 'out'], POINTER(c_ushort), 'UniqueKinectId' )),
2573 | COMMETHOD(['propget'], HRESULT, 'KinectCapabilities',
2574 | ( ['retval', 'out'], POINTER(c_ulong), 'capabilities' )),
2575 | ]
2576 | ################################################################
2577 | ## code template for IKinectSensor implementation
2578 | ##class IKinectSensor_Impl(object):
2579 | ## def SubscribeIsAvailableChanged(self):
2580 | ## '-no docstring-'
2581 | ## #return waitableHandle
2582 | ##
2583 | ## def OpenMultiSourceFrameReader(self, enabledFrameSourceTypes):
2584 | ## '-no docstring-'
2585 | ## #return multiSourceFrameReader
2586 | ##
2587 | ## def GetIsAvailableChangedEventData(self, waitableHandle):
2588 | ## '-no docstring-'
2589 | ## #return eventData
2590 | ##
2591 | ## @property
2592 | ## def AudioSource(self):
2593 | ## '-no docstring-'
2594 | ## #return AudioSource
2595 | ##
2596 | ## @property
2597 | ## def IsOpen(self):
2598 | ## '-no docstring-'
2599 | ## #return IsOpen
2600 | ##
2601 | ## @property
2602 | ## def BodyIndexFrameSource(self):
2603 | ## '-no docstring-'
2604 | ## #return BodyIndexFrameSource
2605 | ##
2606 | ## @property
2607 | ## def InfraredFrameSource(self):
2608 | ## '-no docstring-'
2609 | ## #return InfraredFrameSource
2610 | ##
2611 | ## @property
2612 | ## def BodyFrameSource(self):
2613 | ## '-no docstring-'
2614 | ## #return BodyFrameSource
2615 | ##
2616 | ## @property
2617 | ## def UniqueKinectId(self, bufferSize):
2618 | ## '-no docstring-'
2619 | ## #return UniqueKinectId
2620 | ##
2621 | ## def UnsubscribeIsAvailableChanged(self, waitableHandle):
2622 | ## '-no docstring-'
2623 | ## #return
2624 | ##
2625 | ## @property
2626 | ## def ColorFrameSource(self):
2627 | ## '-no docstring-'
2628 | ## #return ColorFrameSource
2629 | ##
2630 | ## @property
2631 | ## def LongExposureInfraredFrameSource(self):
2632 | ## '-no docstring-'
2633 | ## #return LongExposureInfraredFrameSource
2634 | ##
2635 | ## @property
2636 | ## def KinectCapabilities(self):
2637 | ## '-no docstring-'
2638 | ## #return capabilities
2639 | ##
2640 | ## @property
2641 | ## def DepthFrameSource(self):
2642 | ## '-no docstring-'
2643 | ## #return DepthFrameSource
2644 | ##
2645 | ## def Close(self):
2646 | ## '-no docstring-'
2647 | ## #return
2648 | ##
2649 | ## @property
2650 | ## def IsAvailable(self):
2651 | ## '-no docstring-'
2652 | ## #return IsAvailable
2653 | ##
2654 | ## def Open(self):
2655 | ## '-no docstring-'
2656 | ## #return
2657 | ##
2658 | ## @property
2659 | ## def CoordinateMapper(self):
2660 | ## '-no docstring-'
2661 | ## #return CoordinateMapper
2662 | ##
2663 |
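# Usage sketch (assumption: `sensor` is an IKinectSensor pointer such as the one
# returned by the GetDefaultKinectSensor wrapper further down in this module).
def _example_open_depth_reader_sketch(sensor):
    sensor.Open()                                # connect to / power up the sensor
    depth_source = sensor.DepthFrameSource       # propget -> IDepthFrameSource
    return depth_source.OpenReader()             # -> IDepthFrameReader
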
2664 | IAudioBodyCorrelation._methods_ = [
2665 | COMMETHOD(['propget'], HRESULT, 'BodyTrackingId',
2666 | ( ['retval', 'out'], POINTER(c_ulonglong), 'TrackingId' )),
2667 | ]
2668 | ################################################################
2669 | ## code template for IAudioBodyCorrelation implementation
2670 | ##class IAudioBodyCorrelation_Impl(object):
2671 | ## @property
2672 | ## def BodyTrackingId(self):
2673 | ## '-no docstring-'
2674 | ## #return TrackingId
2675 | ##
2676 |
2677 | IInfraredFrame._methods_ = [
2678 | COMMETHOD([], HRESULT, 'CopyFrameDataToArray',
2679 | ( [], c_uint, 'capacity' ),
2680 | ( [], POINTER(c_ushort), 'frameData' )),
2681 | COMMETHOD([], HRESULT, 'AccessUnderlyingBuffer',
2682 | ( [], POINTER(c_uint), 'capacity' ),
2683 | ( [], POINTER(POINTER(c_ushort)), 'buffer' )), #'out'
2684 | COMMETHOD(['propget'], HRESULT, 'FrameDescription',
2685 | ( ['retval', 'out'], POINTER(POINTER(IFrameDescription)), 'FrameDescription' )),
2686 | COMMETHOD(['propget'], HRESULT, 'RelativeTime',
2687 | ( ['retval', 'out'], POINTER(c_longlong), 'RelativeTime' )),
2688 | COMMETHOD(['propget'], HRESULT, 'InfraredFrameSource',
2689 | ( ['retval', 'out'], POINTER(POINTER(IInfraredFrameSource)), 'InfraredFrameSource' )),
2690 | ]
2691 | ################################################################
2692 | ## code template for IInfraredFrame implementation
2693 | ##class IInfraredFrame_Impl(object):
2694 | ## @property
2695 | ## def RelativeTime(self):
2696 | ## '-no docstring-'
2697 | ## #return RelativeTime
2698 | ##
2699 | ## @property
2700 | ## def FrameDescription(self):
2701 | ## '-no docstring-'
2702 | ## #return FrameDescription
2703 | ##
2704 | ## def CopyFrameDataToArray(self, capacity):
2705 | ## '-no docstring-'
2706 | ## #return frameData
2707 | ##
2708 | ## @property
2709 | ## def InfraredFrameSource(self):
2710 | ## '-no docstring-'
2711 | ## #return InfraredFrameSource
2712 | ##
2713 | ## def AccessUnderlyingBuffer(self):
2714 | ## '-no docstring-'
2715 | ## #return capacity, buffer
2716 | ##
2717 |
2718 | IMultiSourceFrameReader._methods_ = [
2719 | COMMETHOD([], HRESULT, 'SubscribeMultiSourceFrameArrived',
2720 | ( ['retval', 'out'], POINTER(INT_PTR), 'waitableHandle' )),
2721 | COMMETHOD([], HRESULT, 'UnsubscribeMultiSourceFrameArrived',
2722 | ( ['in'], INT_PTR, 'waitableHandle' )),
2723 | COMMETHOD([], HRESULT, 'GetMultiSourceFrameArrivedEventData',
2724 | ( ['in'], INT_PTR, 'waitableHandle' ),
2725 | ( ['retval', 'out'], POINTER(POINTER(IMultiSourceFrameArrivedEventArgs)), 'eventData' )),
2726 | COMMETHOD([], HRESULT, 'AcquireLatestFrame',
2727 | ( ['retval', 'out'], POINTER(POINTER(IMultiSourceFrame)), 'multiSourceFrame' )),
2728 | COMMETHOD(['propget'], HRESULT, 'FrameSourceTypes',
2729 | ( ['retval', 'out'], POINTER(c_ulong), 'enabledFrameSourceTypes' )),
2730 | COMMETHOD(['propget'], HRESULT, 'IsPaused',
2731 | ( ['retval', 'out'], POINTER(c_bool), 'IsPaused' )),
2732 | COMMETHOD(['propput'], HRESULT, 'IsPaused',
2733 | ( [], c_bool, 'IsPaused' )),
2734 | COMMETHOD(['propget'], HRESULT, 'KinectSensor',
2735 | ( ['retval', 'out'], POINTER(POINTER(IKinectSensor)), 'sensor' )),
2736 | ]
2737 | ################################################################
2738 | ## code template for IMultiSourceFrameReader implementation
2739 | ##class IMultiSourceFrameReader_Impl(object):
2740 | ## @property
2741 | ## def KinectSensor(self):
2742 | ## '-no docstring-'
2743 | ## #return sensor
2744 | ##
2745 | ## def GetMultiSourceFrameArrivedEventData(self, waitableHandle):
2746 | ## '-no docstring-'
2747 | ## #return eventData
2748 | ##
2749 | ## def UnsubscribeMultiSourceFrameArrived(self, waitableHandle):
2750 | ## '-no docstring-'
2751 | ## #return
2752 | ##
2753 | ## def SubscribeMultiSourceFrameArrived(self):
2754 | ## '-no docstring-'
2755 | ## #return waitableHandle
2756 | ##
2757 | ## def _get(self):
2758 | ## '-no docstring-'
2759 | ## #return IsPaused
2760 | ## def _set(self, IsPaused):
2761 | ## '-no docstring-'
2762 | ## IsPaused = property(_get, _set, doc = _set.__doc__)
2763 | ##
2764 | ## @property
2765 | ## def FrameSourceTypes(self):
2766 | ## '-no docstring-'
2767 | ## #return enabledFrameSourceTypes
2768 | ##
2769 | ## def AcquireLatestFrame(self):
2770 | ## '-no docstring-'
2771 | ## #return multiSourceFrame
2772 | ##
2773 |
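# Usage sketch (assumption: `sensor` is an opened IKinectSensor; the
# FrameSourceTypes_* bit flags are defined earlier in this module). AcquireLatestFrame
# raises a comtypes.COMError if no multi-source frame is available yet.
def _example_multi_source_reader_sketch(sensor):
    enabled = FrameSourceTypes_Color | FrameSourceTypes_Depth
    reader = sensor.OpenMultiSourceFrameReader(enabled)
    multi_frame = reader.AcquireLatestFrame()
    depth_frame = multi_frame.depthFrameReference.AcquireFrame()
    return reader, depth_frame
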
2774 | IIsAvailableChangedEventArgs._methods_ = [
2775 | COMMETHOD(['propget'], HRESULT, 'IsAvailable',
2776 | ( ['retval', 'out'], POINTER(c_bool), 'IsAvailable' )),
2777 | ]
2778 | ################################################################
2779 | ## code template for IIsAvailableChangedEventArgs implementation
2780 | ##class IIsAvailableChangedEventArgs_Impl(object):
2781 | ## @property
2782 | ## def IsAvailable(self):
2783 | ## '-no docstring-'
2784 | ## #return IsAvailable
2785 | ##
2786 |
2787 | __all__ = [ 'IKinectSensor', 'IAudioBeamSubFrame',
2788 | 'JointType_WristLeft', 'Activity_MouthOpen',
2789 | 'FrameSourceTypes_Color', 'FrameSourceTypes_Audio',
2790 | 'JointType_ThumbRight', '_FrameEdges',
2791 | 'IAudioBeamFrameList', 'IBodyIndexFrame',
2792 | 'FrameCapturedStatus_Dropped', 'tagSTATSTG',
2793 | 'IBodyFrameReference', 'IFrameDescription',
2794 | 'FrameEdge_Right', 'HandState_Lasso',
2795 | 'JointType_ShoulderRight', '_AudioBeamMode',
2796 | 'IBodyFrameReader', 'FrameCapturedStatus_Unknown',
2797 | '_KinectCapabilities', '_DepthSpacePoint',
2798 | 'HandState_Closed', '_FrameSourceTypes', '_TrackingState',
2799 | 'HandState_NotTracked', 'IAudioBeamFrameReader',
2800 | 'JointType_ShoulderLeft',
2801 | 'ILongExposureInfraredFrameArrivedEventArgs',
2802 | 'JointType_SpineMid', 'ILongExposureInfraredFrameSource',
2803 | 'IKinectSensorCollection', 'TrackingState_Inferred',
2804 | 'Activity_MouthMoved', 'TrackingState_NotTracked',
2805 | 'IMultiSourceFrameArrivedEventArgs',
2806 | 'AudioBeamMode_Manual', '_JointOrientation',
2807 | 'Activity_EyeRightClosed', 'IBodyIndexFrameReference',
2808 | 'IStream', 'KinectCapabilities_Face', 'Expression_Count',
2809 | 'JointType_HandLeft', 'IMultiSourceFrameReader',
2810 | 'FrameEdge_Bottom', 'JointType_SpineShoulder',
2811 | 'IFrameCapturedEventArgs', 'JointType_KneeLeft',
2812 | 'KinectCapabilities_Vision', 'IDepthFrame', 'IColorFrame',
2813 | '_HandState', 'IBodyIndexFrameReader', '_CameraIntrinsics',
2814 | '_KinectAudioCalibrationState', 'IAudioBodyCorrelation',
2815 | 'IColorFrameSource', 'DetectionResult_Yes',
2816 | 'IDepthFrameReader', 'Appearance_WearingGlasses',
2817 | 'KinectCapabilities_Gamechat', 'AudioBeamMode_Automatic',
2818 | 'JointType_HandTipLeft', 'JointType_AnkleLeft', '_Joint',
2819 | 'INT_PTR', 'Activity_LookingAway', 'IInfraredFrameSource',
2820 | '_RectF', 'JointType_HipRight', 'DetectionResult_No',
2821 | 'IColorFrameArrivedEventArgs', 'ISequentialStream',
2822 | 'IAudioBeamFrameArrivedEventArgs', 'IBody',
2823 | 'IMultiSourceFrame', 'ICoordinateMapper',
2824 | 'DetectionResult_Maybe', 'JointType_ElbowRight',
2825 | 'JointType_HandTipRight', 'JointType_FootLeft',
2826 | 'HandState_Open', 'IBodyFrameSource',
2827 | 'Activity_EyeLeftClosed', 'KinectCapabilities_None',
2828 | 'KinectAudioCalibrationState_Calibrated',
2829 | 'FrameSourceTypes_LongExposureInfrared',
2830 | 'IDepthFrameReference', 'IBodyIndexFrameSource',
2831 | 'FrameCapturedStatus_Queued', 'JointType_ElbowLeft',
2832 | 'ColorImageFormat_Bayer', 'IInfraredFrameReader',
2833 | 'JointType_Head', 'FrameSourceTypes_BodyIndex', '_Vector4',
2834 | 'IBodyIndexFrameArrivedEventArgs',
2835 | 'TrackingConfidence_High', 'FrameEdge_None',
2836 | 'IDepthFrameSource', 'ColorImageFormat_Bgra',
2837 | 'TrackingState_Tracked', 'JointType_Neck', 'IAudioBeam',
2838 | 'JointType_AnkleRight', 'ILongExposureInfraredFrameReader',
2839 | 'IColorFrameReference', 'KinectCapabilities_Audio',
2840 | '_Expression', 'ICoordinateMappingChangedEventArgs',
2841 | 'KinectAudioCalibrationState_CalibrationRequired',
2842 | 'JointType_SpineBase', 'IIsAvailableChangedEventArgs',
2843 | 'Appearance_Count', 'IInfraredFrameReference',
2844 | 'IBodyFrameArrivedEventArgs',
2845 | 'ILongExposureInfraredFrameReference', 'HandState_Unknown',
2846 | 'IInfraredFrame', 'IInfraredFrameArrivedEventArgs',
2847 | 'FrameSourceTypes_None', 'DetectionResult_Unknown',
2848 | '_Appearance', '_FrameCapturedStatus', 'IEnumKinectSensor',
2849 | 'FrameSourceTypes_Infrared', 'JointType_Count',
2850 | 'Expression_Happy', 'IAudioBeamFrameReference',
2851 | '_CameraSpacePoint', 'ColorImageFormat_Yuv',
2852 | '_TrackingConfidence', 'JointType_ThumbLeft',
2853 | 'JointType_WristRight', 'IAudioBeamList',
2854 | '_ColorSpacePoint', 'FrameEdge_Left',
2855 | 'TrackingConfidence_Low', 'FrameSourceTypes_Body',
2856 | 'FrameEdge_Top', 'IBodyFrame', 'IAudioSource',
2857 | '_JointType', '_PointF', 'Activity_Count',
2858 | 'KinectAudioCalibrationState_Unknown',
2859 | 'IMultiSourceFrameReference', 'IAudioBeamFrame',
2860 | 'JointType_HandRight', 'IDepthFrameArrivedEventArgs',
2861 | '_ColorImageFormat', 'KinectCapabilities_Expressions',
2862 | 'ColorImageFormat_None', 'JointType_FootRight',
2863 | 'FrameSourceTypes_Depth', 'ILongExposureInfraredFrame',
2864 | 'JointType_KneeRight', 'Expression_Neutral',
2865 | 'JointType_HipLeft', 'ColorImageFormat_Rgba',
2866 | 'IColorCameraSettings', '_DetectionResult',
2867 | 'IColorFrameReader', 'ColorImageFormat_Yuy2', '_Activity']
2868 | from comtypes import _check_version; _check_version('')
2869 |
2870 |
2871 | KINECT_SKELETON_COUNT = 6
2872 |
2873 | class DefaultKinectSensor:
2874 |     _kinect20 = ctypes.WinDLL('Kinect20')
2875 |     _GetDefaultKinectSensorProto = _kinect20.GetDefaultKinectSensor
2876 |     _GetDefaultKinectSensorProto.argtypes = [ctypes.POINTER(ctypes.POINTER(IKinectSensor))]
2877 |     _GetDefaultKinectSensorProto.restype = ctypes.HRESULT
2878 |
2879 |
2880 | _kernel32 = ctypes.WinDLL('kernel32')
2881 | _CreateEvent = _kernel32.CreateEventW
2882 | _CreateEvent.argtypes = [ctypes.c_voidp, ctypes.c_uint, ctypes.c_bool, ctypes.c_wchar_p]
2883 | _CreateEvent.restype = ctypes.c_voidp
2884 |
2885 | _CloseHandle = _kernel32.CloseHandle
2886 | _CloseHandle.argtypes = [ctypes.c_voidp]
2887 | _CloseHandle.restype = c_bool
2888 |
2889 | _WaitForSingleObject = _kernel32.WaitForSingleObject
2890 | _WaitForSingleObject.argtypes = [ctypes.c_voidp, ctypes.c_uint32]
2891 | _WaitForSingleObject.restype = ctypes.c_uint32
2892 |
2893 | _WaitForMultipleObjects = _kernel32.WaitForMultipleObjects
2894 | _WaitForMultipleObjects.argtypes = [ctypes.c_uint32, ctypes.POINTER(ctypes.c_voidp), ctypes.c_uint, ctypes.c_uint32]
2895 | _WaitForMultipleObjects.restype = ctypes.c_uint32
2896 |
2897 | _WAIT_OBJECT_0 = 0
2898 | _WAIT_OBJECT_1 = 1
2899 | _INFINITE = 0xffffffff
2900 |
2901 | _oleaut32 = ctypes.WinDLL('oleaut32')
2902 | _SysFreeString = _oleaut32.SysFreeString
2903 | _SysFreeString.argtypes = [ctypes.c_voidp]
2904 | _SysFreeString.restype = None
2905 |
2906 | def HRValue(hr):
2907 | _hr = comtypes.HRESULT(hr)
2908 | return ctypes.c_ulong(_hr.value).value
2909 |
2910 | def IsHR(hr, value):
2911 | _hr = comtypes.HRESULT(hr)
2912 | return ctypes.c_ulong(_hr.value).value == value
2913 |
2914 |
2915 | __name__ = 'PyKinectV2'
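2916 | ################################################################
2917 | ## Illustrative usage sketch (hand-written example, not comtypes output):
2918 | ## how the helpers above might be used to obtain and open the default
2919 | ## sensor. S_OK (0) is assumed as the success code; uncomment to try.
2920 | ##sensor = POINTER(IKinectSensor)()
2921 | ##hr = DefaultKinectSensor._GetDefaultKinectSensorProto(ctypes.byref(sensor))
2922 | ##if IsHR(hr, 0) and sensor:
2923 | ##    sensor.Open()
2924 | ##    # ... acquire frames through a reader here ...
2925 | ##    sensor.Close()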
--------------------------------------------------------------------------------
/pykinect2/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/limgm/PyKinect2/21e9adc4ad9c12da4665eb2ddb129242209ee330/pykinect2/__init__.py
--------------------------------------------------------------------------------
/setup.py:
--------------------------------------------------------------------------------
1 | from setuptools import setup, find_packages
2 |
3 | setup(name='pykinect2',
4 | version='0.1.0',
5 | description='Wrapper to expose Kinect for Windows v2 API in Python',
6 | license='MIT',
7 | author='Microsoft Corporation',
8 | author_email='k4w@microsoft.com',
9 | url='https://github.com/Kinect/PyKinect2/',
10 | classifiers=[
11 | 'Development Status :: 4 - Beta',
12 | 'Programming Language :: Python',
13 | 'Programming Language :: Python :: 2.7',
14 | 'Programming Language :: Python :: 3.4',
15 | 'License :: OSI Approved :: MIT License'],
16 | packages=find_packages(),
17 | install_requires=['numpy>=1.9.2',
18 | 'comtypes>=1.1.1']
19 | )
20 |
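21 | # Illustrative note: with this setup.py, the package is typically installed
22 | # from a checkout with `pip install .` (or `pip install -e .` for development),
23 | # which also pulls in the numpy and comtypes requirements listed above.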
--------------------------------------------------------------------------------