Imagine a construction site where the foreman stands off to the side, peers at the growing skeleton of a building, moves his arms this way and that, and the blueprints of the building slide into his glasses. Or a doctor, who stands in the hallway, moving her arms as x-ray slides zoom through her lenses.

These sorts of visions have been advertised for Google Glass and other smart eyewear makers, but today Thalmic, maker of the Myo armband, is jumping into the picture. It's launching technology that will let wearable developers integrate Myo into their applications, allowing wearers to control the devices with body gestures instead of by tilting their heads and using voice commands.

Myo is essentially a motion sensor that wraps around your forearm. Move your arm and whatever you're controlling moves that way. To date the company has sold the vision of controlling things like drones, but now it's making a play for bigger businesses that need workers to free up their hands. It has already released developer kits and hopes to start shipping enterprise-ready devices later this fall.

APX Labs is teaming up with Thalmic to build "heads-up applications," as smartglass applications are called, for labor-intensive work. APX CEO Brian Ballard notes the limits of gesture-based applications today: they are restricted to navigating images and information, not using gestures to build digital objects.

"Most of the enterprise cases tend to fall in the Q/A and service side, rather than design side," he says.

Despite all of Google's marketing around Glass, the concept of smart eyewear becoming the dashboard for the workplace has been around for decades. Thad Starner discussed how Boeing might use such devices in its manufacturing back in 1996. Since then, good old Moore's Law has allowed the idea to become real. Now the question is which of all these new wearable devices, from Glass to armbands to smartwatches, are actually useful.

For instance, does a construction worker really need to see the blueprints so much that he needs to wear an armband and Google Glass? Does the doctor really save effort (and dignity) by waving her arms around, when she could just use Glass's onboard touch-and-tap navigation tools?

Chris Gooden, Thalmic's developer evangelist, points out that doctors need to keep their hands sterile during procedures, yet still have to view x-rays and MRIs. Typically they have to take off their gloves, or an assistant holds the film for them. Even if they were wearing smart eyewear to render the images, they'd have to touch the non-sterile device. Gesture navigation would solve this, he says.

Construction workers' gloves also get dirty, sometimes covered in caustic chemicals, and they work in loud environments. Glass relies on voice commands, which might not survive the journey from mouth to microphone. Gesture controls would cut through the noise.

"This can finally give users a hands free interaction," Gooden argues.

Another hurdle for uptake is Myo's size. Doctors and construction workers need their hands unencumbered, so naturally Thalmic continues to work at trimming the weight and thickness of the device. The Myo is elegant and futuristic-looking, but it's still a sizable band around the forearm.