M.G. Siegler

A Mere Thought of a Flick of the Wrist

With an AI assist, Meta nears the productization of CTRL-Labs gesture (and non-gesture) interface wristband
Meta Unveils Wristband for Controlling Computers With Hand Gestures
When you write your name in the air, you can see the letters appear on your smartphone.

It's sort of wild to think that far more people currently interact with computers through multi-touch than do through a mouse and keyboard. And while the smartphone is clearly here to stay for a while – much to the chagrin of Meta and Amazon – there also clearly needs to be a new interaction paradigm for what's next. And if you believe that what's next isn't likely going to be one thing at the scale of the iPhone, but instead a collection of devices all tied together through AI, it's reasonable to think there will also be multiple types of input. Voice will undoubtedly be one.

And gestures perhaps another...

The prototype looks like a giant rectangular wristwatch. But it doesn’t tell the time: It lets you control a computer from across the room simply by moving your hand.

With a gentle turn of the wrist, you can push a cursor across your laptop screen. If you tap your thumb against your forefinger, you can open an app on your desktop computer. And when you write your name in the air, as if you were holding a pencil, the letters will appear on your smartphone.

Makes sense, and Meta has talked about – and even shown off – this wristband before, alongside prototypes of some new devices they've been working on. But what about gestures without actually gesturing...

Designed by researchers at Meta, the tech giant that owns Facebook, Instagram and WhatsApp, this experimental technology reads the electrical signals that pulse through your muscles when you move your fingers. These signals, generated by commands sent from your brain, can reveal what you are about to do even before you do it, as the company detailed in a research paper published on Wednesday in the journal Nature.

With a little practice, you can even move your laptop cursor simply by producing the right thought. “You don’t have to actually move,” Thomas Reardon, the Meta vice president of research who leads the project, said in an interview. “You just have to intend the move.”

This sounds like science fiction, but I can assure you that it's science fact. I know this because almost a decade ago, I was sitting in my office when I got a demo of an early build of this technology. It was hands-down – pun very much intended – the coolest live demo I had seen in person. And so making an investment, as GV did in what was then called CTRL-Labs, was an easy call.

That demo – again, this was about eight years ago – went something like this: a person – in this particular case, Reardon – is sitting at a desk typing on a computer, wearing a large wristband on one wrist, while the words he's typing appear on screen. But then the computer is taken away and he keeps moving his fingers as if he were typing on that same keyboard that is no longer there – and the words he's typing still keep coming on the screen. Then he stops moving his fingers. The words just keep going.

Meta smartly snapped up the company nearly six years ago, and they've been perfecting the very same technology in-house ever since.

Meta’s wristband uses a technique called electromyography, or EMG, to gather electrical signals from muscles in the forearm. These signals are produced by neurons in the spinal cord — called alpha motor neurons — that connect to individual muscle fibers.

Because these neurons connect directly to the muscle fibers, the electrical signals are particularly strong — so strong that they can be read from outside the skin. The signal also moves much faster than the muscles. If a device like Meta’s wristband can read the signals, it can type much faster than your fingers.

“We can see the electrical signal before your finger even moves,” Dr. Reardon said.
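To make that a bit more concrete, here's a back-of-the-envelope sketch – mine, purely illustrative, not Meta's code, with made-up sample rates, channel counts, and thresholds – of what "seeing the signal before the finger moves" looks like in practice: rectify and smooth the raw EMG channels into an envelope, then flag the moment it rises above the resting baseline.

```python
import numpy as np

# Illustrative only: spot the moment of muscle activation ("onset") in a
# multi-channel EMG recording. Sample rate, channel count, and thresholds
# are made-up placeholders, not anything Meta has published.
SAMPLE_RATE_HZ = 2000
N_CHANNELS = 16

def emg_envelope(raw: np.ndarray, window_ms: float = 50.0) -> np.ndarray:
    """Rectify each channel and smooth it with a moving average."""
    rectified = np.abs(raw - raw.mean(axis=0))        # remove DC offset, rectify
    win = max(1, int(SAMPLE_RATE_HZ * window_ms / 1000))
    kernel = np.ones(win) / win
    return np.stack(
        [np.convolve(rectified[:, ch], kernel, mode="same") for ch in range(raw.shape[1])],
        axis=1,
    )

def detect_onset(raw: np.ndarray, k: float = 5.0) -> int | None:
    """First sample where any channel's envelope exceeds baseline mean + k*std,
    i.e. where the muscle 'lights up' electrically."""
    env = emg_envelope(raw)
    baseline = env[: SAMPLE_RATE_HZ // 2]             # assume first 0.5 s is rest
    threshold = baseline.mean(axis=0) + k * baseline.std(axis=0)
    hits = np.argwhere((env > threshold).any(axis=1))
    return int(hits[0, 0]) if hits.size else None

# Usage: one second of simulated rest followed by a burst of activity.
rng = np.random.default_rng(0)
rest = rng.normal(0.0, 1.0, size=(SAMPLE_RATE_HZ, N_CHANNELS))
burst = rng.normal(0.0, 8.0, size=(SAMPLE_RATE_HZ // 4, N_CHANNELS))
print(detect_onset(np.vstack([rest, burst])))         # onset found near sample 2000
```

Real systems are obviously far more sophisticated than a moving-average threshold, but the core trick is the same: the activation is visible electrically before it's visible mechanically.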

One challenge back in those early days was the training required to get the system to work for anyone beyond the individual it had been tailored to. It worked, but it took a lot of calibration. Enter AI:

Although Dr. Reardon and his colleagues have been privately demonstrating their technology for years, they are only now beginning to publicly share their work because it is now mature enough for the marketplace. The key development is the use of A.I. techniques to analyze the EMG signals.

After collecting these signals from 10,000 people who agreed to test the prototype, Dr. Reardon used a machine learning system called a neural network — the same breed of A.I. that drives ChatGPT — to identify common patterns in this data. Now, it can look for these same patterns even when a different person is using the device.

“Out of the box, it can work with a new user it has never seen data for,” said Patrick Kaifosh, director of research science at Reality Labs and one of the neuroscientists who founded CTRL-Labs.
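That "out of the box" part is the real unlock, and AI is what gets you there. Here's a rough sketch of the idea – again, mine, with fake data and made-up shapes, nothing resembling Meta's actual model: pool labeled EMG windows from lots of users, train a single neural network on all of them, and then test it on a user it has never seen. If the network has learned person-independent patterns, the new wearer gets a working device with no per-person calibration.

```python
import torch
from torch import nn

# Illustrative only: train one gesture classifier on pooled multi-user EMG
# windows, then test it on a held-out user. Data, shapes, and labels are fake.
N_USERS, WINDOWS_PER_USER = 20, 200
N_CHANNELS, WINDOW_LEN, N_GESTURES = 16, 100, 4
FEATURES = N_CHANNELS * WINDOW_LEN

def fake_user_data(seed: int):
    """Stand-in for one user's recorded EMG windows and gesture labels."""
    g = torch.Generator().manual_seed(seed)
    y = torch.randint(0, N_GESTURES, (WINDOWS_PER_USER,), generator=g)
    # Each gesture gets a shared cross-user signature plus per-user noise,
    # mimicking patterns that are common across people.
    shared = torch.Generator().manual_seed(12345)
    signatures = torch.randn(N_GESTURES, FEATURES, generator=shared)
    x = signatures[y] + 0.5 * torch.randn(WINDOWS_PER_USER, FEATURES, generator=g)
    return x, y

model = nn.Sequential(nn.Linear(FEATURES, 128), nn.ReLU(), nn.Linear(128, N_GESTURES))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Train on users 1..19; user 0 is never seen during training.
train = [fake_user_data(u) for u in range(1, N_USERS)]
x_train = torch.cat([x for x, _ in train])
y_train = torch.cat([y for _, y in train])
for _ in range(200):
    opt.zero_grad()
    loss_fn(model(x_train), y_train).backward()
    opt.step()

# "Out of the box": how well does it do on the user it has never seen?
x_new, y_new = fake_user_data(0)
with torch.no_grad():
    acc = (model(x_new).argmax(dim=1) == y_new).float().mean().item()
print(f"held-out user accuracy: {acc:.2f}")
```

Swap the toy data for EMG windows from 10,000 volunteers and the tiny classifier for something far bigger, and you get the gist of why the calibration problem goes away.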

It sounds like it's not quite ready to roll as an external product yet, but it's close.

According to Dr. Reardon, who is also known as the founding father of the Internet Explorer web browser at Microsoft, Meta plans to fold the technology into products over the next few years. Last fall, the company demonstrated how its wristband could be used to control an experimental version of its smart glasses, which can take photos, record videos, play music and verbally describe the world around you.

Do they pair it with the Meta Ray-Ban Smart Glasses? The 'Orion' (now codenamed 'Artemis'?) "true" AR glasses? Something else? All of the above? And will it be exclusive to Meta AI/AR products, or will they let the wristband work with, say, your MacBook? I could see arguments both ways, but we all know how much Mark Zuckerberg hates (envies) the tight integration that, say, the Apple Watch has with the iPhone...

Speaking of Apple, how do they eventually play in this space? With said Apple Watch? A ring? Obviously the Vision Pro is built around gesture-based controls, but those are tracked visually, with cameras. The Apple Watch has some that are movement-based. But this Meta wristband is the next level, clearly.

In a similar way, Meta’s wristband lets you control a computer with the appropriate thought. Merely thinking about a movement is not enough. But if you intend to make a movement, the wristband can pick up on what you aim to do — even if you do not physically move.

“It feels like the device is reading your mind, but it is not,” Dr. Reardon said. “It is just translating your intention. It sees what you are about to do.”

When you move your arm or hand or finger, the number of muscle fibers you activate varies depending on how big or how small the movement is. If you practice using the wristband long enough, you can learn to activate a tiny number of fibers without actually moving your fingers.

“We can listen to a single neuron. We are working at the atomic level of the nervous system.”

I'm obviously biased given the above (though I have no current conflicts here beyond the occasional friendly back-and-forth with Reardon), but this is wild. And it will be more so when it's actually in the wild.
