Leap represents an entirely new way to interact with your computers. It’s more accurate than a mouse, as reliable as a keyboard and more sensitive than a touchscreen. For the first time, you can control a computer in three dimensions with your natural hand and finger movements.

www.leapmotion.com

Pretty much every TV at CES had the same functionality: Wi-Fi. Gesture control. Voice control. Given that your console, your TV, your cable box, your light switches, your hi-fi, your phone and your tablet will all have these performative technologies, we've got to find ways to add direction to our waving hands and faltering voices.

This is pretty much Minority Report, isn't it?

It’s been built at MIT using Kinect, libfreenect and Linux. The graphical interface and the hand detection software were written to talk to the open source robotics package ‘ROS’, developed by Willow Garage. The hand detection software showcases the abilities of the Point Cloud Library (PCL), a part of ROS that MIT has been helping to optimize.

The hand detection software is able to distinguish hands and fingers in a cloud of more than 60,000 points at 30 frames per second, allowing natural, real-time interaction.
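Keeping up with 60,000 points at 30 frames per second depends on cheap per-frame reduction of the cloud before any recognition runs. One standard PCL technique for this is voxel-grid downsampling: bucket points into cubic cells and keep one centroid per cell. Here's a minimal standalone Python sketch of the idea (PCL's actual implementation is the C++ `VoxelGrid` filter; this is just an illustration, not the MIT code):

```python
# Voxel-grid downsampling: bucket points into cubic cells and replace
# each cell's points with their centroid. This is the kind of reduction
# PCL applies before running recognition on a dense Kinect cloud.
from collections import defaultdict

def voxel_downsample(points, voxel_size):
    """Reduce a cloud of (x, y, z) tuples to one centroid per voxel."""
    cells = defaultdict(list)
    for x, y, z in points:
        key = (int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
        cells[key].append((x, y, z))
    # Average each coordinate across the points sharing a voxel.
    return [
        tuple(sum(coord) / len(pts) for coord in zip(*pts))
        for pts in cells.values()
    ]

# Example: four points collapse into two voxel centroids.
cloud = [(0.1, 0.1, 0.1), (0.2, 0.2, 0.2), (1.5, 1.5, 1.5), (1.6, 1.4, 1.5)]
reduced = voxel_downsample(cloud, voxel_size=1.0)
print(len(reduced))  # 2
```

The trade-off is resolution for speed: a coarser `voxel_size` means fewer points per frame, which is what makes 30 fps interaction feasible on commodity hardware.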

Code available at:

http://www.ros.org/wiki/kinect

http://www.ros.org/wiki/mit-ros-pkg

OnObject is a small device the user wears on their hand to program physical objects to respond to gestural triggers. Attach an RFID tag to any object, grab it by the tag, and program its responses to your grab, release, shake, swing and thrust gestures using the built-in microphone or an on-screen interface.
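The interaction model described here is essentially a lookup from an (RFID tag, gesture) pair to a programmed response. A hypothetical sketch of that model, with all names and structure mine rather than the actual OnObject software:

```python
# Hypothetical OnObject-style bindings: each (RFID tag, gesture) pair
# maps to a programmed response, such as a sound clip recorded via the
# built-in microphone. Illustrative only; not the real OnObject code.
GESTURES = {"grab", "release", "shake", "swing", "thrust"}

class GestureBindings:
    def __init__(self):
        self._bindings = {}

    def program(self, tag_id, gesture, response):
        """Bind a response to a gesture performed on a tagged object."""
        if gesture not in GESTURES:
            raise ValueError(f"unknown gesture: {gesture}")
        self._bindings[(tag_id, gesture)] = response

    def trigger(self, tag_id, gesture):
        """Return the programmed response, or None if nothing was bound."""
        return self._bindings.get((tag_id, gesture))

bindings = GestureBindings()
bindings.program("tag-42", "shake", "play: rattle.wav")
print(bindings.trigger("tag-42", "shake"))  # play: rattle.wav
print(bindings.trigger("tag-42", "swing"))  # None
```

What makes the device interesting is that this table is authored physically, by grabbing the tag and demonstrating, rather than in an editor.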

This is a 3D Input Interface for Mobile Devices developed by Ishikawa Komuro Laboratory. It’s very cool but when I watch this video I can’t help thinking that it might be a step too far.

The beauty of modern mobile devices is that you touch them. They're personal to you, so the fact that you touch them and they do stuff makes things feel more real - especially when you're using them to communicate with other people. Recently I heard someone refer to iPhones jokingly as (I forget who it was and I'm paraphrasing) "like little pets you keep in your pocket and stroke from time to time". I've been thinking about that a lot and it's really struck a chord, so this just seems like it's going one step away from all that.

Why would you not want to touch something that’s already in your other hand?

Gestural stuff is going to be great for big fixed screens, but mobile devices? Not sure yet - unless it's for one-handed BIG gestures that you can't convey with a small touch.

For L’Artisan Electronique, Unfold created - alongside the ceramic printer - a virtual pottery wheel in collaboration with Tim Knapen. This pottery wheel gives visitors a chance to ‘turn’ their own forms.

At regular intervals, a selection of these designs is printed in clay and exhibited in the space.

In L’Artisan Electronique, pottery, one of the oldest artisanal techniques for making utilitarian objects, is combined with new digital techniques. The virtual pottery wheel was realised by means of a 3D scanner and digital design software. However, the installation still clearly refers to the artisanal process of working in clay. The printing process imitates the traditional technique used by ceramicists, in which the form is built up by stacking coils of clay.
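That coil-stacking process maps naturally onto a toolpath generator: each layer is one circular coil whose radius follows the turned profile. A minimal sketch of that idea, with all function names and parameters my own assumptions rather than Unfold's actual software:

```python
# Sketch of a coil-stacking toolpath: one circle of clay per layer,
# with the radius of each layer taken from the 'turned' profile.
# Hypothetical illustration; not Unfold's printer code.
import math

def coil_path(radius_profile, layer_height, points_per_turn=8):
    """Generate (x, y, z) toolpath points, one stacked coil per layer.

    radius_profile: radius of each layer, e.g. sampled from the form
    a visitor turned on the virtual pottery wheel.
    """
    path = []
    for layer, radius in enumerate(radius_profile):
        z = layer * layer_height
        for step in range(points_per_turn):
            angle = 2 * math.pi * step / points_per_turn
            path.append((radius * math.cos(angle),
                         radius * math.sin(angle),
                         z))
    return path

# A simple vase profile: wide base narrowing toward the rim (mm).
profile = [50, 48, 45, 40, 35]
path = coil_path(profile, layer_height=3.0)
print(len(path))  # 40 points: 5 layers x 8 points per turn
```

A real clay printer would extrude continuously along a helix rather than discrete circles, but the principle - the digital process imitating a ceramicist's stacked coils - is the same.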

Would you like to know more?