As devices like Google Glass become mainstream, we’re going to see a lot more wearable computers around. What’s less clear is how we’ll control them. One idea is gesture control, which would let users communicate with wearable computers without pulling out a separate smartphone or other device to do so.
But so far, gesture control for most devices — the Xbox Kinect, for instance — has depended on cameras watching users’ movements. That means staying within a fixed space and using pre-programmed gestures that aren’t exactly natural, but that cameras can reliably pick up. As a result, today’s gesture control technologies are far from perfect. In fact, most to date are downright bad.
Y Combinator-backed startup Thalmic Labs believes it has a better way of determining user intent with gesture control. To that end, it has developed a new device called MYO, an armband worn around the forearm. The armband connects wirelessly over Bluetooth to other devices, such as PCs and mobile phones, letting users control them with arm movements without ever touching the electronics directly.
See it in action here: