About the Author ........................................................ xi
About the Technical Reviewer .......................................... xiii
Acknowledgments ......................................................... xv
Introduction .......................................................... xvii

Part I ................................................................... 1

Chapter 1 Mixed Reality and Kinect ....................................... 3
    A Brief History of Mixed Reality ..................................... 3
    The IMU (Accelerometer and Gyroscope) ............................... 17
    Enhancing Kinect with Azure ......................................... 18

Chapter 2 The Developer Toolbox ......................................... 21
    Overview of the Microsoft Azure SDK ................................. 21
    Sensor SDK System Requirements ...................................... 22
    Azure Kinect Body Tracking SDK ...................................... 32
    Body Tracking SDK System Requirements ............................... 32
    The Kinect Body Tracking Viewer ..................................... 34
    Setting Up the Development Environment .............................. 35
    Unity 3D and Visual Studio .......................................... 36

Part II ................................................................. 41

Chapter 3 Configuring the Device ........................................ 43
    Adding the Kinect SDKs in Unity3D ................................... 43
    The Azure Kinect Binaries ........................................... 44
    Deploying Your Unity Application .................................... 52
    Mastering Azure Kinect: Source Code ................................. 54
    Starting and Stopping the Device .................................... 55
    Kinect Device Configuration ......................................... 58
    Handling Invalid Configurations ..................................... 61

Chapter 4 ............................................................... 69
    Structure of a Color Frame .......................................... 69
    Displaying Color Data in Unity3D .................................... 75
    Specifying the Color Configuration .................................. 76
    Reading Kinect Color Data as BGRA32 ................................. 78
    Reading Kinect Color Data as MJPG ................................... 79

Chapter 5 ............................................................... 83
    Structure of a Depth Frame .......................................... 83
    Narrow and Wide Fields of View ...................................... 85
    Displaying Depth Data in Unity3D .................................... 89
    Depth Configuration and Data ........................................ 91
    Grayscale Depth Visualization ....................................... 94
    Too Close to the Camera or Too Far from the Camera .................. 96

Chapter 6 .............................................................. 103
    The Technology of Body Tracking .................................... 103
    The New Azure Kinect Approach ...................................... 105
    Structure of a Human Body .......................................... 107
    Constructing Body Objects .......................................... 116
    Displaying Body Data in Unity3D .................................... 117

Part III ............................................................... 127

Chapter 7 Streaming Data in the Background ............................. 129
    Creating the Streaming Class ....................................... 131
    Starting and Stopping the Device ................................... 132
    Streaming Data in a Background Thread .............................. 134
    Using the KinectSensor Class ....................................... 138

Chapter 8 Coordinate Mapping ........................................... 143
    Coordinate Transformations ......................................... 149
    Using the Coordinate Mapper Class .................................. 159
    The Complete Coordinate Mapper Class ............................... 161

Chapter 9 Augmented Reality: Removing the Background of the Users ...... 169
    Mixing the Physical and the Digital Worlds ......................... 170
    A Background Removal Game in Unity3D ............................... 185

Chapter 10 Motion Analysis ............................................. 191
    Measuring Physical Distances ....................................... 194
    Example: Evaluating a Squat ........................................ 197
    Example: Counting Bicep Curls ...................................... 205
    Example: How Fast Are You Walking? ................................. 212

Part IV The "Azure" in Kinect .......................................... 217

Chapter 11 Azure Cognitive Services .................................... 219
    There Is an API for That! .......................................... 220
    Understanding Azure in Kinect ...................................... 224
    Creating a Computer Vision API ..................................... 224

Chapter 12 Computer Vision and Object Detection ........................ 233
    Do You Need a Kinect After All? .................................... 235
    The Azure Cognitive Services SDK ................................... 237
    The Computer Vision NuGet Package .................................. 238
    Importing the Packages in Unity3D .................................. 239
    Creating a New Unity Scene ......................................... 241
    Referencing the Azure Computer Vision SDK .......................... 242
    Computer Vision in Action .......................................... 243
    Input: Kinect Color Frames ......................................... 244
    Output: Object Rectangles .......................................... 247

Index .................................................................. 259