Through their company, Design I/O, digital artists/designers Emily Gobeille and Theo Watson have been experimenting with Kinect for some time. Last year they demonstrated a prototype of a hand-controlled bird that caused a stir in the Kinect hacking community and well beyond.
Now, they’ve unveiled the latest iteration of their work in the form of "Puppet Parade," an installation at the 2011 Cinekid Festival in Amsterdam that sent an audience of adorable tots on an avian adventure. With the aid of Kinect, these tiny hand puppeteers can control the movement of giant psychedelic-colored birds that appear on a screen, making them swoop down on absurdly long necks to squawk and eat (the food is generated by the movements and gestures of other children standing closer to the screen, whose silhouettes the Kinect is also tracking).
We asked Watson to walk us through the "Puppet Parade."
Co.Create: Can you explain the basic tech setup you used to create the installation?
Theo Watson: The setup consists of two Xbox Kinect cameras mounted on a stage, which track the arms and hands of two puppeteers. The Kinect cameras are connected to a computer running an openFrameworks application that performs computer vision on the Kinect's depth image to detect the position of each arm and hand and map it to the puppets on screen. There is also an infrared camera that looks at the whole screen and tracks the people standing in front of it. This enables the audience to interact directly with the puppets or to make food by holding out their hands.
How has this installation evolved from your demo last year, and what advances made that evolution possible?
The installation has evolved quite a bit since our first demo. We improved the arm and hand tracking immensely, basically rewriting it from scratch to be much more robust. The creatures are now 3-D, so they can actually turn towards or away from the audience. They can also transform their appearance based on what they eat. We added interaction to the wall to help create a dialog between the audience and the puppeteers. For example, if you now touch a creature on its nose with your hand, it does a sneeze of sorts and shakes its head.
From the technical side, a couple of advances really made a big difference. The libFreenect (open-source Kinect) library and ofxKinect, which wraps it, have come a long way, making it much easier to work with multiple Kinects. Also, the new openFrameworks 007 release included a lot of new features (especially 3-D model loading) that made switching the characters to 3-D much less painful. The rest of what allowed this was really time, and having a space to show the project. The Cinekid festival was a big part of that: they encouraged us to make it big, and since the festival gets so busy, we knew it had to support a lot of participants at once.
How do you see the Kinect platform evolving, and what's the potential of this and similar tech for you as an artist and designer?
First off, I have to say we are very appreciative just to have the Kinect as it is. It has really opened up the possibilities of what we can do as interactive artists, and I think there is still a lot to be realized with the current version.
The last four projects we have made since last June have all used a Kinect as the primary source of tracking, so it has become a big part of our toolkit. If I had a wishlist for the next Kinect version, the first thing would be an increased depth range (currently it is 40cm to 400cm), followed by a higher-resolution color image. Outside of hardware, I see the software written for the Kinect getting a lot smarter. Open-source skeleton detection and sound localization that rival what exists on the Xbox seem to be things that could be coming soon.
For more on "Puppet Parade," see the live footage clip below.