This page documents Kinect research undertaken at LUSTlab. LUSTlab has contributed to several installations using the Microsoft Kinect sensor; the following is a list of projects using this hardware and an explanation of the software involved in their execution. We intend to release part of this software in the new year.
The Kinect was primarily developed and marketed by Microsoft as a sensor for video games. However, artists and designers quickly realised the interactive capabilities it offered compared to existing hardware. The sensor captures both colour and depth data, so the distance from the sensor to the scene at each image pixel can be calculated. This depth data greatly improves the reliability of user tracking.
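To illustrate how per-pixel distance follows from a depth image, the sketch below back-projects each pixel through a pinhole camera model. The intrinsic values are assumptions that roughly approximate a Kinect depth camera, not calibrated figures from any of our installations.

```python
import numpy as np

# Assumed (uncalibrated) intrinsics roughly matching a Kinect depth camera.
FX, FY = 594.2, 591.0   # focal lengths in pixels
CX, CY = 339.5, 242.7   # principal point in pixels

def depth_to_points(depth_m: np.ndarray) -> np.ndarray:
    """Back-project a depth image (metres, H x W) into an H x W x 3 point cloud."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return np.dstack((x, y, depth_m))

def distances(depth_m: np.ndarray) -> np.ndarray:
    """Euclidean distance from the sensor to the scene at every pixel."""
    return np.linalg.norm(depth_to_points(depth_m), axis=2)
```

Note that the depth value is the distance along the optical axis, so pixels towards the edge of the image lie slightly farther from the sensor than their depth value alone suggests.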
The Kinect (or similar RGB-Depth sensors), in combination with software such as OpenNI and NITE, allows developers to determine not only the position of the user in space but also the user's pose at interactive speeds. Knowing the user's pose allows the creation of rich natural user interfaces in which interaction takes place by means of gestures and pointing.
A single Kinect sensor can map a space around three meters wide. This is sufficient for playing video games but most of the installations that we develop require larger observable spaces. In most cases, constraining the physical space in which this interaction can take place will diminish or complicate the experience. To overcome this problem we developed software to enable the use of multiple Kinect sensors.
This custom-built software, n-Track, fuses data from multiple depth sensors to track people in large spaces. n-Track merges the observable spaces of the individual Kinects into a single large space, allowing users to be tracked through the full space as if it were observed by a single sensor. n-Track was developed specifically for interactive media installations that require natural user interaction in a large space.
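The core idea of merging observable spaces can be sketched as two steps: transform each sensor's detections into a shared world frame using that sensor's known pose, then merge detections that plainly refer to the same person. This is an illustrative sketch under assumed conventions, not LUSTlab's actual n-Track code; the 0.5 m merge radius is a made-up parameter.

```python
import numpy as np

def to_world(points: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Transform N x 3 sensor-frame points into the shared world frame,
    given the sensor's rotation R (3x3) and translation t (3,)."""
    return points @ R.T + t

def merge_detections(detections, radius=0.5):
    """Greedy merge of per-sensor user positions (world frame): detections
    closer than `radius` metres are treated as the same person."""
    merged = []
    for p in detections:
        for i, m in enumerate(merged):
            if np.linalg.norm(p - m) < radius:
                merged[i] = (m + p) / 2  # average overlapping observations
                break
        else:
            merged.append(np.asarray(p, dtype=float).copy())
    return merged
```

With overlapping fields of view, a person standing in the overlap appears once per sensor; after the transform, those observations coincide in world coordinates and collapse into a single track.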
In May 2011, LUST created the installation Polyarc for the Festival international d'Affiche in Chaumont, France. For this project, LUSTlab made custom software for the Kinect to allow users to browse through an archive of 25,000 posters. Polyarc consists of two nine-meter-high screens installed within the arcs of the Chapelle des Jésuites. The interaction was designed to be as intuitive as possible: just walking in front of the large screens is enough to start the interaction. A more refined interaction with the archive is engaged when the user walks closer to the screen and is invited to browse the archive through swipe gestures.
At LUSTlab, we strive for invisible technology and natural interaction, so the Kinect was an ideal tool in this instance.
Equipped with experience from the Polyarc project, we improved the n-Track software further for use in the LUST installation Posterwall 3.0, presented in the exhibition Graphic Design: Now in Production at the Walker Art Center in Minneapolis. We used an array of four Kinect sensors and our n-Track software to enable natural user interaction on an eight-meter-wide virtual poster wall. The poster wall software continuously generates posters from news, Twitter and blog posts and displays these on the wall. A user can browse through the generated posters by walking along the wall. Refined interaction takes place when the user moves close enough to a single poster: that poster then takes focus and displays a QR code that leads to a webpage with poster details.
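The two interaction zones described above amount to a simple mapping from a user's tracked world position to an application state. The sketch below is a hypothetical illustration; the poster width and focus distance are assumed parameters, not those of the Walker installation.

```python
WALL_WIDTH = 8.0      # metres, as in the installation
POSTER_WIDTH = 1.0    # assumed width of one poster slot
FOCUS_DISTANCE = 1.2  # assumed distance (metres) that triggers focus

def interaction_state(user_x, user_z):
    """Map a tracked user position to an interaction state.
    user_x: position along the wall (0..WALL_WIDTH);
    user_z: distance from the wall.
    Returns ('browse', None) or ('focus', poster_index)."""
    if user_z > FOCUS_DISTANCE:
        return ("browse", None)
    last = int(WALL_WIDTH // POSTER_WIDTH) - 1
    index = min(int(user_x // POSTER_WIDTH), last)  # clamp the far edge
    return ("focus", index)
```

Because n-Track reports all users in one shared frame, the same function can run per user, letting several visitors focus different posters at once.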
At the TEDx Flanders conference in Antwerp, six Kinects were used to track speakers throughout their lectures. The n-Track software was combined with custom software to communicate with the venue's in-house lighting control.