From User Interface to Omniface

“An iPod, a phone, and an internet communicator…do you get it?” These were the words Steve Jobs spoke right before he unveiled the first iPhone, a device that went on to impact the lives of billions of people. Jobs’ vision was to combine existing technologies and tools, as if the answer had been there all along and everyone else had simply failed to see it.

As remarkable as the iPhone seems, I believe the essence of its innovation is not unique. Many other impactful innovations we use on a daily basis have recombined existing technologies and trends, updated for the context of today. In some cases, this means combining previously ‘dumb’ devices with connectivity to create ‘smart’ new iterations. Sonos, for example, combined the app ecosystem, WiFi connectivity and the rise of music streaming services into what we now refer to as the ‘smart speaker’.

Our appetite for smart devices seems insatiable. Globally, every year we create a staggering 1.6B smartphones, 0.8B smart home devices (smart speakers, thermostats, doorbells, etc.), 320M smart wearables, 240M ‘hearables’ and 0.5B PCs and laptops. To meet this global demand, the world produces enough LCD screens to cover the entire surface area of the Netherlands ten times over: an enormous 320,000 km².

The world has moved on quickly from the original promise of the iPhone – a single interface for all your devices – to offering many interfaces across many devices. Our definition of an ‘interface’ is evolving, too. For every physical mobile interface, there are 80+ digital app interfaces. We will increasingly find ourselves immersed in multiple interfaces at the same time, wherever we are, on all sensory levels (audio, visual and touch).

Today, many companies still offer experiences through a single physical or digital interface. Efforts to diversify those interfaces often fall flat: in a bid to retain relevance, companies create new company-owned platforms, interfaces and forced user flows. In my view, the easiest path is actually to find ways to partner and integrate with existing platforms and interfaces. This means taking a counterintuitive step: yielding full control of the experience and instead organizing around the user and their desired journey. Companies should combine and connect multiple physical and digital interfaces into a single, user-driven experience: an ‘omni-face’ experience.

Creating omni-face experiences is, of course, easy for companies like Apple that own an entire platform and retain control over the end-to-end flow. Via its Fitness+ service, for example, your Apple Watch can share real-time data with an Apple TV and overlay it on your workout, while also providing haptic feedback for exercise countdowns to keep you focused. A more common omni-face experience occurs when buying products online, where banking apps are integrated seamlessly into purchase flows. This works because banking apps understand their function in the user flow: to provide payment authentication as quickly as possible through the sensors on the phone. A more sensory example is Philips Hue lighting, which can extend the mood of a movie into the entire living room. Some consumers are even creating their own omni-face experiences: a Tesla owner, for example, can automate the heating or cooling of their car based on their calendar, a connected thermometer and the car’s internal sensors.

True omni-face experiences are starting to emerge, but I believe we’ve only scratched the surface of what’s possible. What if you could orchestrate a scary movie on Netflix, your WhatsApp chat, your smart thermostat, smart lighting and your smart aroma diffuser into an integrated, omni-face experience?
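
To make that idea a little more concrete, here is a minimal sketch of what such an orchestration could look like in code. Every device client in it (HueLights, Thermostat, AromaDiffuser, ChatNotifications) is a hypothetical stand-in rather than a real vendor API; the point is simply that one user intent, “watch a scary movie”, fans out across several interfaces at once.

```python
# Illustrative sketch of an "omni-face" orchestration rule.
# All device clients below are hypothetical stand-ins, not real APIs.

from dataclasses import dataclass


@dataclass
class PlaybackEvent:
    title: str
    genre: str    # e.g. "horror", "comedy"
    service: str  # e.g. "Netflix"


class HueLights:
    def set_scene(self, brightness: float, colour: str) -> None:
        print(f"Lights -> {int(brightness * 100)}% brightness, {colour}")


class Thermostat:
    def set_target(self, celsius: float) -> None:
        print(f"Thermostat -> {celsius:.1f} °C")


class AromaDiffuser:
    def set_blend(self, blend: str) -> None:
        print(f"Diffuser -> '{blend}' blend")


class ChatNotifications:
    def mute(self, minutes: int) -> None:
        print(f"Chat -> muted for {minutes} minutes")


def orchestrate(event: PlaybackEvent, lights: HueLights, heating: Thermostat,
                diffuser: AromaDiffuser, chat: ChatNotifications) -> None:
    """Map one user intent onto several interfaces at once."""
    if event.genre == "horror":
        lights.set_scene(brightness=0.1, colour="deep red")  # dim, moody light
        heating.set_target(18.0)                             # a slight chill
        diffuser.set_blend("cedar and smoke")                # unsettling scent
        chat.mute(120)                                       # no interruptions


if __name__ == "__main__":
    orchestrate(PlaybackEvent("The Haunting", "horror", "Netflix"),
                HueLights(), Thermostat(), AromaDiffuser(), ChatNotifications())
```

The design point is that the devices themselves stay simple; the value sits in the orchestration layer that understands the user’s intent and lets each interface play its small part in the journey.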

This article is part of the Tech-Bites series, exploring our favourite tech trends for 2021.