On the second day at SolidWorks World 2015, author, futurist, and theoretical physicist Michio Kaku took the stage. With acerbic wit and humor, he recounted the time he agreed to test-drive a self-driving car for a TV documentary.
“BBC put me in that car, but I intended to drive it myself,” he said. “When the cameraman behind me said, Let go of the steering wheel, I thought, What? Are you nuts? Crazy? I’m not going to let go.”
Eventually Kaku managed to let go, despite his trepidation, and the car drove itself. That is the difficulty with technologies that go against years of human habit. Even if a superior navigation method is possible, you still must convince the humans to let go of their established instincts.
At Thalmic Labs, in Kitchener in southern Ontario, Canada, a tight-knit group of mechatronics engineers is working on an armband dubbed Myo, which lets you control games and software with your arm movements. Convincing people to let go of their mouse may be part of the challenge they face.
This is Your Brain Talking
Movement comes naturally to most of us who are able-bodied, so we don’t think about the process. But Chris Goodine, a developer at Thalmic Labs, and his colleagues think about it a lot.
“When you want to move your body, like to make a fist, for instance, your brain fires up a signal,” Chris explained. “That neurological signal is transmitted to your muscles, causing them to contract. The signal is detectable through your skin.”
If you can capture that unique electrical signal, you are capturing the brain’s vocabulary: the language the brain uses to command your limbs.
“The Myo armband uses electromyographic (EMG) sensors, the same type used in hospitals,” said Chris. “Except at a hospital, you’d probably be asked to shave and apply conductive gel before the sensors go on. They measure the electrical signals that appear on your skin when your muscles activate.”
Eight EMG sensors embedded in the armband continually monitor for these signals. Depending on the movement you make, different sensors in the armband activate. These patterns can be programmed to trigger certain actions remotely in other software, such as a video game or a CAD program. The same patterns can also trigger operations in the control software of a robot or toy. That is how Chris is able to fly a real drone, control a virtual aircraft, or rotate a 3D assembly with, quite literally, a wave of his hand.
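Conceptually, the pattern-to-action mapping can be thought of as a dispatch table: once the armband recognizes a pose from its eight EMG channels, the host application looks up and runs the corresponding command. The sketch below is purely illustrative; the gesture names and functions are assumptions, not the actual Myo SDK API.

```python
# Hypothetical gesture-to-action dispatch. The real Myo SDK delivers pose
# events to an application; here we fake the actions as simple functions.

def drone_takeoff():
    return "drone takeoff"

def rotate_model(direction):
    return f"rotate assembly {direction}"

# Map each recognized EMG pattern (pose) to an action in the target software.
GESTURE_ACTIONS = {
    "fist": drone_takeoff,
    "wave_in": lambda: rotate_model("left"),
    "wave_out": lambda: rotate_model("right"),
}

def on_pose(pose):
    """Called whenever the classifier recognizes a pose; returns the
    action's result, or None if the pose has no binding."""
    action = GESTURE_ACTIONS.get(pose)
    return action() if action else None

print(on_pose("fist"))     # drone takeoff
print(on_pose("wave_in"))  # rotate assembly left
```

The same table could just as easily bind poses to CAD commands or a toy robot's motor controller, which is what makes the armband application-agnostic.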
Around January 2014, Chris and his colleagues began working the crowd at local shopping malls to collect data. They set up a booth and gave demonstrations. With an armband that could commandeer virtual planes on a large monitor, Chris found that he didn’t have to do a whole lot to recruit volunteers. People were naturally curious and happy to lend a hand, so to speak.
“We’d put a band on them, ask them to make a fist, open their hand, wave, snap, and so on,” Chris said. The volume and accuracy of the data gave Thalmic Labs engineers confidence in their device, which has to account for the EMG activity of a range of individuals (thick-skinned, thin-skinned, fit, heavyset, slim, and so on). “The signal data was extremely important. It’s fundamental to our product,” said Chris.
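The collection process described above amounts to recording labeled snapshots: a volunteer holds a known gesture while the eight channels are sampled. A minimal sketch, with a random stand-in for the hardware read (the function names and record layout are assumptions for illustration):

```python
import random

def read_emg_channels():
    # Stand-in for sampling the armband's eight EMG sensors; real hardware
    # would stream actual muscle-activation values per channel.
    return [random.uniform(0.0, 1.0) for _ in range(8)]

def collect_samples(gesture_label, n_samples):
    """Record n labeled EMG snapshots while a volunteer holds one gesture."""
    return [{"label": gesture_label, "channels": read_emg_channels()}
            for _ in range(n_samples)]

# One booth session: each volunteer cycles through the requested gestures.
dataset = []
for gesture in ["fist", "open_hand", "wave", "snap"]:
    dataset.extend(collect_samples(gesture, 50))
```

Each record pairs a ground-truth label with raw channel values, which is exactly the form of data a gesture classifier needs for training.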
Data collection continues to this day.
The armband itself was designed in SolidWorks. “It’s a robust program that allows us to build what we need, but it’s the integration of 3D printing that makes it useful for us,” Chris said.
The earliest prototype was a cotton sweatband with EMG sensors sewn in. After the CAD phase, Thalmic Labs’ in-house 3D printer churned out prototypes quickly as the engineers experimented with different shapes.
“To come up with a one-size-fits-all band for all types of forearms was a challenge, especially when it involves detecting electrical pulses,” said Chris.
Thalmic Labs opted for a stretchable design that can expand or contract to accommodate different arm sizes, with sensors housed in elastic, conductive materials. “We knew the armband would be stretched over and over. Part of that design phase was simulating the wear and tear of the materials over time [inside SolidWorks Simulation],” revealed Chris.
An older woman and a teenage boy making a fist register different EMG signals on the Myo band. It is neither possible nor practical to manually code every EMG signature to trigger the desired operation. The only way to accomplish this is through machine learning.
“We have a team dedicated to machine learning,” said Chris. “The software takes in a wide variety of signals, but must understand that they all mean the same thing [someone making a fist]. That’s machine learning for us.”
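To make the idea concrete, here is a toy illustration, not Thalmic’s actual pipeline: a nearest-centroid classifier that maps differing 8-channel readings from different people to the same gesture label. All readings and function names here are invented for the example.

```python
# Toy nearest-centroid gesture classifier over 8-channel EMG snapshots.

def centroid(samples):
    """Average each of the 8 channels across a gesture's training samples."""
    n = len(samples)
    return [sum(s[i] for s in samples) / n for i in range(8)]

def train(labeled):
    # labeled: {gesture_name: [eight-channel readings, ...]}
    return {gesture: centroid(samps) for gesture, samps in labeled.items()}

def classify(model, reading):
    """Return the gesture whose centroid is closest to this reading."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda g: sq_dist(model[g], reading))

# Two volunteers' "fist" readings differ, yet both shape the fist centroid.
training = {
    "fist": [[0.9, 0.8, 0.1, 0.1, 0.7, 0.8, 0.2, 0.1],
             [0.8, 0.9, 0.2, 0.0, 0.6, 0.9, 0.1, 0.2]],
    "open_hand": [[0.1, 0.2, 0.8, 0.9, 0.1, 0.2, 0.9, 0.8],
                  [0.2, 0.1, 0.9, 0.8, 0.2, 0.1, 0.8, 0.9]],
}
model = train(training)

# A third person's fist, unlike either training sample, still classifies
# as "fist" because it lands nearest that gesture's centroid.
print(classify(model, [0.85, 0.8, 0.15, 0.1, 0.65, 0.85, 0.15, 0.1]))  # fist
```

A production system would use far richer features and models trained on those tens of thousands of mall-collected samples, but the core task is the same: many different signals, one meaning.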
With the rise of digital prototyping, software makers and manufacturers have begun promoting the use of digital simulation in product development. But gesture computing, which incorporates movement-triggered operations, may challenge the proposition.
“It’s important to remember that individual humans are different,” said Chris. “You can get a decent understanding through digital simulation, but to build a rock solid product, you need to get a deeper understanding of your users.”
For Thalmic Labs, that understanding could only come from tens of thousands of sample signals, collected from willing volunteers with real pulses and heartbeats.
For a demo of the Myo armband, watch the company-produced video below: