As Facebook looks to the next stage of digital connection, which includes the development of its own AR-enabled glasses, it’s also working on new control tools that will enable people to interact more seamlessly with these overlays and options.
Such controls are a key part of the experience – in order to make best use of AR overlays, you need to be able to interact with them easily, without disrupting your everyday tasks. Voice commands are one option, while other research has focused on eye-based controls, with varying results.
But Facebook’s research has actually pointed it to the wrist as the key control point for such interactions.
As you can see in the video, Facebook is now developing new tools based on a wristband-type device that would read muscle commands as they travel through your arm, enabling you to respond within the digital environment.
The process utilizes EMG, or electromyography, which uses sensors to detect electrical motor nerve signals as they travel through your limb and translates them into input commands.
“These signals let you communicate crisp one-bit commands to your device, a degree of control that’s highly personalizable and adaptable to many situations. The signals through the wrist are so clear that EMG can understand finger motion of just a millimeter. That means input can be effortless.”
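Facebook hasn’t published the signal processing behind this, but the core idea of turning a raw EMG stream into a “crisp one-bit command” can be sketched as a simple threshold detector: rectify the signal, smooth it into an envelope, and fire a command when a deliberate muscle activation pushes the envelope over a threshold. Everything below (function names, parameters, the simulated signals) is illustrative, not Facebook’s implementation:

```python
import numpy as np

def emg_envelope(signal, window=50):
    """Rectify the raw EMG signal and smooth it with a moving average."""
    rectified = np.abs(signal)
    kernel = np.ones(window) / window
    return np.convolve(rectified, kernel, mode="same")

def detect_command(signal, threshold=0.5, window=50):
    """Return True if the smoothed EMG envelope crosses the threshold,
    i.e. a deliberate muscle activation (a 'click') was detected."""
    return bool(np.max(emg_envelope(signal, window)) > threshold)

# Simulated signals: low-amplitude noise (resting muscle) vs. the
# same noise plus a brief burst of activation.
rng = np.random.default_rng(0)
rest = rng.normal(0, 0.05, 1000)            # resting: no command
active = rest.copy()
active[400:600] += rng.normal(0, 1.0, 200)  # brief activation burst

print(detect_command(rest))    # → False
print(detect_command(active))  # → True
```

A real system would of course use multiple sensor channels and learned classifiers rather than a fixed threshold, but the same rectify-smooth-decide pipeline is the standard starting point for EMG gesture detection.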
The logic of utilizing this form of control for AR, and even VR, makes sense – as Facebook notes:
“The wrist is a traditional place to wear a watch, meaning it could reasonably fit into everyday life and social contexts. It’s a comfortable location for all-day wear. It’s located right next to the primary instruments you use to interact with the world – your hands. This proximity would allow us to bring the rich control capabilities of your hands into AR, enabling intuitive, powerful, and satisfying interaction.”
The process would enable simple, effective interaction with digital overlays and supplemental tools, and could even be used to control on-screen elements.
And according to Facebook, such input may even outpace your hands in some applications.
“It’s highly likely that ultimately you’ll be able to type at high speed with EMG on a table or your lap – maybe even at higher speed than is possible with a keyboard today. Initial research is promising. In fact, since joining FRL in 2019, the CTRL-labs team has made important progress on personalized models, reducing the time it takes to train custom keyboard models that adapt to an individual’s typing speed and technique.”
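The “personalized models” Facebook mentions aren’t detailed, but the underlying idea of adapting a system to an individual user can be illustrated at its simplest: record a short calibration session and derive per-user parameters from it, rather than shipping one fixed setting for everyone. The sketch below picks a detection threshold from a user’s own resting and activation recordings; the names and figures are hypothetical, not from Facebook’s CTRL-labs work:

```python
import numpy as np

def calibrate_threshold(rest_recording, active_recording, margin=0.5):
    """Pick a per-user detection threshold partway between this user's
    resting EMG level and their typical activation level."""
    rest_level = np.percentile(np.abs(rest_recording), 95)
    active_level = np.percentile(np.abs(active_recording), 95)
    return rest_level + margin * (active_level - rest_level)

# Hypothetical calibration session for one user.
rng = np.random.default_rng(1)
rest = rng.normal(0, 0.05, 1000)    # user at rest
active = rng.normal(0, 1.0, 1000)   # user performing the gesture
threshold = calibrate_threshold(rest, active)
print(round(threshold, 2))
```

Training a full per-user typing model is a far harder machine-learning problem, but the principle is the same: the less calibration data a new user must supply, the faster the model adapts to their individual speed and technique.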
Facebook’s not yet able to project a release timeline for the EMG wristband, but it is working on its AR glasses, the first iteration of which is set for release sometime this year. That first version, which Facebook’s calling Smart Glasses, won’t be fully AR-enabled, but in combination with Facebook’s development of its own smart watch, you can see the roadmap starting to take shape, with these new connective processes coming together fast.
Maybe too fast.
Back in 2017, Facebook freaked a whole lot of people out when it previewed its work on a new computing interface that would connect directly to your brain, and respond to commands by essentially reading your thoughts as they occur.
Giving Facebook direct access to your thoughts seems like a risky proposition, on many fronts, and Facebook quietly moved on from that project (or publicly discussing it at least) as its various data privacy controversies began to stack up.
This new EMG control process is along similar lines, connecting into your body, and your motor intentions, in order to facilitate device control. It’s not giving Facebook access to your thoughts, as such, but the implications could still be more far-reaching than we’re considering. Facebook does note that user privacy is front of mind when developing all of these tools and processes. But still, Facebook has shown a willingness to ‘move fast and break things’ in the past.
Regardless, the next stage of digital connection is coming, and it’ll logically look a lot like this. That will facilitate a range of new possibilities, new marketing opportunities, new data tracking processes and debates, and whole new product categories.
It’s interesting to consider the broader implications, as we move forward with the next stage.
Follow Andrew Hutchinson on Twitter