
Key things to remember


While gesture detection sounds very inspiring and motivating, and can encourage developers to use their imagination, there are a few areas we need to consider while implementing a gesture-enabled application. They are as follows:

  • User actions

  • Development

  • Data matching

  • Testing

User actions or inputs are the key elements for any gesture-enabled application. Wrong inputs can mislead the application if they are not handled properly. So, make sure the user knows what needs to be done to trigger each action. This can be done by training users beforehand or by providing live feedback in the application's UI.
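
As an illustration, here is a minimal sketch of such live feedback, assuming the Kinect for Windows SDK v1.x skeleton stream and a TextBlock placed in a WPF window; the class and member names are illustrative only, not taken from the book's samples:

// A minimal sketch of live feedback, assuming the Kinect for Windows SDK v1.x
// skeleton stream and a TextBlock in the application's UI. Names are illustrative.
using System.Linq;
using System.Windows.Controls;
using Microsoft.Kinect;

public class GestureFeedback
{
    private readonly TextBlock feedbackText;   // a TextBlock placed in the application's UI

    public GestureFeedback(KinectSensor sensor, TextBlock feedbackText)
    {
        this.feedbackText = feedbackText;
        sensor.SkeletonStream.Enable();
        sensor.SkeletonFrameReady += this.OnSkeletonFrameReady;
    }

    private void OnSkeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
    {
        using (SkeletonFrame frame = e.OpenSkeletonFrame())
        {
            if (frame == null)
            {
                return;
            }

            Skeleton[] skeletons = new Skeleton[frame.SkeletonArrayLength];
            frame.CopySkeletonDataTo(skeletons);

            Skeleton user = skeletons.FirstOrDefault(
                s => s.TrackingState == SkeletonTrackingState.Tracked);

            // Tell the user what the sensor sees and what is expected next.
            this.feedbackText.Text = user == null
                ? "Stand in front of the sensor so that it can track you."
                : "You are being tracked. Raise your right hand to start the gesture.";
        }
    }
}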

While developing, make sure you capture all the conditions that can come from user input. Also, you must validate the boundary conditions for the entry and exit criteria of the gestures. If the user input matches the data set, invoke the desired action; otherwise, send a message back to the user. This makes your development a bit complex as you need to check for both positive and negative...
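
To show how the positive and negative paths can be checked together, the following sketch validates the entry and exit criteria of a simple "right hand raised above the head" gesture; the phase names and the reportToUser callback are assumptions made for this example, not the book's implementation:

// A sketch of validating entry and exit criteria for a simple gesture
// ("right hand raised above the head"). The phase names and the
// reportToUser callback are assumptions for this example.
using System;
using Microsoft.Kinect;

public enum GesturePhase
{
    NotStarted,
    InProgress,
    Completed
}

public class HandRaisedGesture
{
    public GesturePhase Phase { get; private set; }

    // Call this once per skeleton frame with the tracked skeleton.
    public void Update(Skeleton skeleton, Action<string> reportToUser)
    {
        float handY     = skeleton.Joints[JointType.HandRight].Position.Y;
        float shoulderY = skeleton.Joints[JointType.ShoulderRight].Position.Y;
        float headY     = skeleton.Joints[JointType.Head].Position.Y;
        float hipY      = skeleton.Joints[JointType.HipCenter].Position.Y;

        switch (this.Phase)
        {
            case GesturePhase.NotStarted:
                // Entry criterion: the hand rises above the shoulder.
                if (handY > shoulderY)
                {
                    this.Phase = GesturePhase.InProgress;
                }
                break;

            case GesturePhase.InProgress:
                if (handY > headY)
                {
                    // Positive path: the input matches, invoke the desired action.
                    this.Phase = GesturePhase.Completed;
                    reportToUser("Gesture recognized.");
                }
                else if (handY < hipY)
                {
                    // Negative path: the exit criterion is hit, reset and inform the user.
                    this.Phase = GesturePhase.NotStarted;
                    reportToUser("Gesture cancelled. Please raise your hand again.");
                }
                break;
        }
    }
}

Keeping the entry and exit checks together in one place also makes it easier to test the boundary conditions later.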