OpenNI Cookbook

By: Soroush Falahati

Overview of this book

The release of the Microsoft Kinect, followed by the PrimeSense Sensor and the Asus Xtion, opened new doors for developers to interact with users, redesign their applications' UIs, and make them environment (context) aware. For this purpose, developers need a good framework that provides a complete application programming interface (API), and OpenNI is the first choice in this field. This book introduces the new version of OpenNI.

"OpenNI Cookbook" will show you how to start developing a Natural Interaction UI for your applications or games with high-level APIs and, at the same time, how to access raw data from the different sensors of the various hardware supported by OpenNI using low-level APIs. It also deals with expanding OpenNI by writing new modules, and with extending applications using different OpenNI-compatible middleware, including NiTE. "OpenNI Cookbook" favors practical examples over plain theory, giving you a more hands-on experience to help you learn.

The book starts with installing devices and retrieving raw data from them, and then shows how to use this data in applications. Through examples, you will learn how to access a device, read data from it and display it using OpenGL, and use middleware (especially NiTE) to track and recognize users and hands and to estimate the skeleton of a person in front of a device. You will also learn about more advanced topics, such as how to write a simple module or middleware for OpenNI itself. "OpenNI Cookbook" shows you how to start and experiment with both NIUI designs and OpenNI itself, using examples.

Getting a user's skeleton joints and displaying their position in the depth map

In this recipe, we are going to show you how to request calibration of a user's skeleton and track the user's skeleton joints; we will then display these joints on screen, overlaid on the depth stream.
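Before any joint can be read, NiTE must calibrate the user's skeleton: a newly detected user starts with no skeleton, tracking is requested for them, and joints become available only once calibration succeeds. The sketch below models that life cycle with plain C++ so the flow is visible in isolation; the `User` struct and `updateUser()` function are hypothetical stand-ins, not NiTE types (in real code you would call `userTracker.startSkeletonTracking(user.getId())` and poll the skeleton's state each frame).

```cpp
#include <cassert>

// Hypothetical model of NiTE's skeleton states. The names mirror
// nite::SkeletonState, but this enum is our own simplified sketch.
enum SkeletonState { SKELETON_NONE, SKELETON_CALIBRATING, SKELETON_TRACKED };

// Hypothetical stand-in for nite::UserData.
struct User {
    int id;
    bool isNew;
    SkeletonState state;
};

// Called once per frame for each visible user. Tracking is requested
// only for users we have not seen before; after that we just wait for
// calibration to finish. Here calibration succeeds instantly, whereas
// NiTE flips the state to tracked only after it finds a valid pose.
void updateUser(User& u) {
    if (u.isNew && u.state == SKELETON_NONE) {
        u.state = SKELETON_CALIBRATING;  // calibration/tracking requested
        u.isNew = false;
    } else if (u.state == SKELETON_CALIBRATING) {
        u.state = SKELETON_TRACKED;      // joints are now readable
    }
}
```

The important design point this captures is that joint positions must only be drawn for users whose skeleton state is tracked; during calibration the joint data is not yet meaningful.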

Getting ready

Create a project in Visual Studio and prepare it for working with OpenNI and NiTE using the Create a project in Visual Studio 2010 recipe in Chapter 2, OpenNI and C++; then, configure Visual Studio to use OpenGL using the Configuring Visual Studio 2010 to use OpenGL recipe in Chapter 3, Using Low-level Data.

Then, copy the code from the Identifying and coloring users' pixels in depth map recipe of Chapter 5, NiTE and User Tracking, into this project.

How to do it...

  1. Locate the following line in the gl_DisplayCallback() function:

  2. Add the following lines of code relative to the preceding line:

        glBegin( GL_POINTS );
        glColor3f( 1.f, 0.f, 0.f );
        const nite::Array<nite::UserData>& users...
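Each joint position NiTE reports is in real-world coordinates (millimeters), while `GL_POINTS` are drawn in depth-map pixel coordinates, so every joint must be projected before it can overlay the depth stream. In real code this conversion is done by NiTE's `UserTracker::convertJointCoordinatesToDepth()`; the snippet below only illustrates the idea with a simple pinhole model, and the focal length and resolution are assumed values, not ones read from the device.

```cpp
#include <cassert>

// Hypothetical stand-in for nite::Point3f: a joint position in
// millimeters, in the sensor's real-world coordinate system.
struct Point3f { float x, y, z; };

// Assumed camera parameters for illustration only; real code should
// use nite::UserTracker::convertJointCoordinatesToDepth() instead.
const float kFocalLengthPx = 525.0f;  // assumed focal length, in pixels
const int   kDepthWidth    = 640;
const int   kDepthHeight   = 480;

// Project a real-world joint position onto depth-map pixel coordinates
// using a pinhole model: x scales with 1/z and the Y axis is flipped
// because image rows grow downward.
bool worldToDepth(const Point3f& world, float* outX, float* outY) {
    if (world.z <= 0.0f) return false;  // behind the sensor; skip drawing
    *outX = kDepthWidth  / 2.0f + kFocalLengthPx * world.x / world.z;
    *outY = kDepthHeight / 2.0f - kFocalLengthPx * world.y / world.z;
    return true;
}
```

The resulting `(outX, outY)` pair is what would be handed to `glVertex2f()` inside the `glBegin(GL_POINTS)`/`glEnd()` block above, typically only for joints whose position confidence is high enough to trust.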