Why the Apple Watch’s newest feature is doomed to fail

Double tap symbol on Apple Watch Series 9
Andy Boxall / Digital Trends

Gesture control systems work best when they are simple, quick, and easy to learn. They have to feel natural and intuitive so that you’ll actually remember to use them, and most of all, they have to be reliable. Apple’s Double Tap on the Apple Watch Series 9 and Apple Watch Ultra 2 is a good example of gesture controls working well.

I think it’s a cool, fun feature, and I’m not alone. However, it’s far from the first of its kind, and history shows us that, unfortunately, gestures on mobile devices are more likely to be abandoned and forgotten than loved and widely adopted.

Motion Sense on the Pixel 4

Leaving aside early efforts like the 2009 Sony Ericsson Yari, which used its front-facing camera to track body movements and control preloaded games, the devices many tech fans will immediately think of when gesture controls are mentioned are the 2019 Google Pixel 4 and Pixel 4 XL.

These two devices were the first to feature Google’s Project Soli chip, which uses radar to detect even the smallest movements around it. The resulting feature, called Motion Sense, let you swipe in the air above the phone to play or pause music, snooze an alarm, or silence an incoming call, and it also helped the phone unlock as you reached for it.

Pixel 4 XL accented buttons
Julian Chokkattu / Digital Trends

It was technically exciting, but in practice, it did not work reliably enough, and regulatory issues meant the radar could not be used in every country, hurting the phone’s sales potential. The Pixel 4 ended up being the only smartphone to feature Project Soli’s gesture controls, although the chip also appears, without Motion Sense, in the Google Nest Hub, where it helps measure your breathing while you sleep.

eyeSight Technologies

Motion Sense on the Pixel 4 is probably one of the better-known failed gesture control systems on phones, but other companies were working on gesture controls long before it. In 2011, Korean electronics brand Pantech released the Vega LTE, which supported essentially the same gestures as the Pixel 4, but it used a software-based system that relied on the front camera to “see” them. The system was developed by a company called eyeSight Technologies.

For several years after the Pantech Vega, eyeSight Technologies worked hard to bring gesture controls to mobile devices. It pinned its hopes on its platform-agnostic natural user interface (NUI) software, which could be integrated directly into a device’s operating system, or even into individual apps, to use the camera for gesture control.

The company worked with Indian smartphone brand Micromax on the A85 superphone, which used gestures similar to the Pantech Vega’s, made its NUI software available on Android and iOS, and showed its technology at trade shows on several occasions. It boasted partnerships with companies ranging from Nokia to AMD and even tried to capitalize on the VR craze of early 2016. Despite all these efforts, it never reached the mainstream, and the company eventually changed its name to Cipia and pivoted to in-car technology.

Air Gestures on the Samsung Galaxy S4

Galaxy S4's gorgeous screen
digital trends

Around the same time that eyeSight Technologies was promoting its software-based gesture system, Samsung introduced a small set of gesture controls called Air Gestures on the then-new Galaxy S4. The phone used an infrared sensor to recognize basic hand movements above the screen, letting you check the time without touching the phone, accept calls, interact with apps, and even scroll through webpages with a wave of your hand.

It worked well enough, but the short range of the sensor meant your hand was almost touching the screen anyway, making the feature feel more like a gimmick than the clever technology it probably was. Air Gestures stayed in Samsung’s repertoire for a while but was gradually phased out in favor of Air Actions, which uses gestures made with the S Pen stylus to perform similar functions without the need for an infrared sensor.

Samsung Galaxy S4: How to use Air Gestures

So far, we’ve seen radar, camera software, and infrared sensors used to detect hand movements and control features on our phones, which shows how keen companies were to experiment; there was no established “best” way to add gesture recognition to a smartphone. But we’re not done yet.

Elliptic Labs

Elliptic Labs technology demonstrated at a trade show
Mallary Gokey / Digital Trends

Fast forward to 2017, and interestingly, the Galaxy S4 was also called in to demonstrate gesture recognition technology from Elliptic Labs, which, like eyeSight Technologies, spent a lot of time and effort trying to get us to wave at our smartphones and other devices.

Elliptic Labs’ technology used ultrasound to detect motion, promising a larger detection area and a wider variety of gestures, along with no reliance on light, lower power consumption, and greater accuracy. The company planned to license its ultrasound gesture technology to device makers, but despite riding the Internet of Things (IoT) boom and adapting the same system for integration into speakers and lights, it never moved past the demo and concept stage.

Instead, Xiaomi used the company’s ultrasound technology to replace the traditional proximity sensor and shrink the bezels on the original Mi Mix. Today, Elliptic Labs still works on proximity sensing and has removed dedicated hardware from the system entirely, offering software-driven proximity detection that can be found on compact folding phones like the Motorola Razr Plus and Razr (2023).

Air Motion on the LG G8 ThinQ

LG G8 ThinQ
Julian Chokkattu / Digital Trends

Both Elliptic Labs and eyeSight Technologies, along with other companies like Neonode, experimented with gesture control between 2010 and 2017, but they didn’t make much of an impact outside of tech trade shows like CES and MWC. When the Pixel 4 reignited interest in gesture controls in 2019, it was joined by another big-name device: the LG G8 ThinQ.

LG, which has since stopped making smartphones altogether, loved to try new things with its phones, be it modular hardware on the LG G5 or secondary screens on phones like the LG V10 and V50 ThinQ. Air Motion used the front-facing camera and a time-of-flight (ToF) sensor to detect various hand movements, including a motion that simulated turning a volume knob to adjust a music player’s volume.

Like all the proximity-based gesture control systems before it, its usefulness was questionable when the touchscreen was right there, just an inch away from your rotating fingers. It was also not particularly reliable, which put people off using it. The LG G8 ThinQ marked the end of the line for LG’s G series, and along with Project Soli, Air Motion was probably the last gesture control system to be heavily promoted by a phone maker.

What about smartwatches?

Photo of a person gesturing with his hand to use Double Tap on the Apple Watch Ultra 2.
Joe Maring / Digital Trends

Until now, gesture controls have mostly been tried out and demonstrated on smartphones. But what about smartwatches? Double Tap on the Apple Watch Series 9 and Apple Watch Ultra 2 owes its existence to an accessibility feature called AssistiveTouch, which has been part of watchOS for several years. Samsung offers a similar accessibility feature on the Galaxy Watch 6, too.

Smartwatches are tight on space inside, with little room for cameras, proximity sensors, or other complex pieces of hardware. Double Tap instead uses the watch’s heart rate sensor, accelerometer, and software to recognize when you tap your finger and thumb together, adding yet another detection method to the list.
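Apple hasn’t said exactly how its detection works beyond naming the sensors involved, but as a purely illustrative sketch of the general idea, a naive detector could simply watch for two sharp spikes in accelerometer magnitude arriving close together. Every name, threshold, and timing below is invented for the example; it is not Apple’s algorithm.

```swift
import Foundation

// A toy double-tap detector, illustrating the general idea of spotting
// finger taps in accelerometer data. All names, thresholds, and timings
// here are assumptions made up for illustration. Apple's real Double Tap
// fuses accelerometer, gyroscope, and heart rate sensor data through an
// on-device machine learning model, which is far more robust than this.
struct ToyDoubleTapDetector {
    private var lastSpikeTime: TimeInterval?
    private let spikeThreshold = 2.5        // assumed magnitude (in g) marking a "tap"
    private let minGap: TimeInterval = 0.05 // assumed debounce between distinct taps
    private let maxGap: TimeInterval = 0.40 // assumed max time between the two taps

    // Feed one accelerometer sample; returns true when a double tap is seen.
    mutating func process(magnitude: Double, at time: TimeInterval) -> Bool {
        guard magnitude > spikeThreshold else { return false }
        if let last = lastSpikeTime, (minGap...maxGap).contains(time - last) {
            lastSpikeTime = nil  // pair consumed; a third spike starts a new pair
            return true
        }
        lastSpikeTime = time
        return false
    }
}

// Two spikes 0.2 seconds apart should register as a double tap.
var detector = ToyDoubleTapDetector()
let samples: [(magnitude: Double, time: TimeInterval)] = [
    (0.9, 0.00), (3.1, 0.10), (1.0, 0.20), (3.2, 0.30),
]
for sample in samples {
    if detector.process(magnitude: sample.magnitude, at: sample.time) {
        print("Double tap detected at \(sample.time)s")
    }
}
```

In practice, a rule this crude would misfire constantly, which is exactly why sensor fusion and machine learning are needed to make a gesture like this reliable.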

In addition to the Apple Watch and Double Tap, Google demonstrated Project Soli inside a smartwatch, but it never made it into the finished Google Pixel Watch. The oddly named MAD Gaze Watch apparently used bone conduction to enable a variety of gesture controls ranging from finger snaps to arm taps, and in 2015, a company called Deus Ex crowdfunded Aria, an add-on gesture module for the Pebble smartwatch. And although it’s not a smartwatch, Google used head nods to add hands-free control to Google Glass.

Simple gestures are best

A person makes the double tap gesture on the Apple Watch Series 9.
Andy Boxall / Digital Trends

All these examples show that there have been more failures than successes in the effort to popularize gesture control systems on our phones and smartwatches. However, a few simple gestures have proven effective and reliable, to the extent that we don’t even consider them special anymore. A great example is raise-to-wake, where lifting or tilting the device’s screen toward your face turns on the display: the perfect example of a natural motion activating a feature. It could be argued that anything beyond that is too complicated.

Even wrist flicks and twists, which were used on the Moto 360 to help with scrolling, seem to be a gesture too far, and they have rarely been seen since the Fossil Hybrid HR. Aside from these few isolated examples and essential accessibility features, gestures haven’t changed the regular, everyday use of wearables or smartphones for most people.

Double Tap has the potential to join raise-to-wake as one of the few widely used gestures on smartwatches because it’s simple, natural, and works really well. Sadly, history shows that gesture control systems and mobile devices have struggled to hold our interest, and I hope Double Tap doesn’t end up on a future list of abandoned but promising gesture controls. It’s far too interesting to suffer that fate.





