During Google’s Pixel 2 event last week, several products were launched across the AR & VR spectrum. But one product didn’t get that designation from reporters covering the event, though we believe it could be a key piece of Google’s AR master plan: Pixel Buds.
In fact, ARtillry’s recent report on ARCore and ARKit predicted that Google would launch an AirPods competitor as an AR play. Like AirPods, Pixel Buds create another touch point for serving information — in this case, audio from Google Assistant and other channels.
But what’s the AR angle? Though they aren’t explicitly for AR, Pixel Buds (like AirPods) represent an unsung AR modality: sound. In other words, ambient audio for information about one’s surroundings could be a prominent type of “overlay,” in addition to visual graphics.
For example, using Google Assistant, Pixel Buds can perform real-time language translation. Think of it like the in-ear translation system used by UN delegates, but for the rest of us. In fact, live audible language translation is a great example of what we call “AR audio.”
We often joke that the original form of AR was radio. It “augments” your perception of the world while you jog, drive or otherwise tune in. And like graphical AR’s evolution toward “true AR” (i.e. SLAM), AR audio will evolve toward textured and intelligent sound overlays.
Besides language translation, what will that look like? AR audio could deliver details about an upcoming business meeting, or someone you’re shaking hands with at a conference. LinkedIn could develop an app that delivers these audible stats subtly and on the fly.
The way it could play out: sleekness and portability will condition people to leave them in their ears all day. That engenders a new channel for ambient audio. From there, it’s up to app developers – as with ARCore – to develop content and use cases like the LinkedIn example.
AR audio also brings to mind Google’s smartphone-era construct of “micro moments.” These are the content-snacking moments in the grocery line or on the subway — pulling out your phone for a quick fix of email, Facebook or Snapchat. They created lots of opportunity for media delivery.
But audio’s advantage is discreetness. It’s less cumbersome than pulling out your phone. And because AR glasses are held back by cultural and stylistic factors, the subtlety of ambient audio could fill an important gap. All-day use also creates a massive opening for content.
Of course visual media won’t go away and is more conducive to several content formats. But audio could take over a certain share of micro moments like getting informed about people or surroundings. We’re talking local discovery, shopping and proximity-based social media.
Breaking the Sound Barrier
As for who’s better positioned, AirPods have greater near-term reach than Pixel Buds. The former operate with about 600 million iPhones, while the latter work fully only with Google’s Pixel phones. Pixel Buds can connect to other phones (even iPhones), but only as standard headphones.
But Pixel Buds have a longer-run advantage once they — and Google Assistant — are phased into the larger Android universe. Moreover, Google Assistant (the brain behind Pixel Buds) blows Siri (the brain behind AirPods) out of the water in terms of digital-assistant chops.
Apple’s Achilles’ heel for AR audio is, in fact, Siri. Google Assistant will win the voice-search and “general knowledge” AI game, based on the extensiveness of Google’s knowledge graph. It will also outperform Amazon Alexa, because it’s a better AI engine with more data behind it.
Pixel Buds add another weapon to Google’s AR arsenal. Alongside ARCore, this could be carried forward through Google- and developer-created applications for AR audio. Meanwhile, if live translation of foreign-language dialogue isn’t “augmented reality,” I don’t know what is.
Disclosure: ARtillry has no financial stake in the companies mentioned in this post, nor has it received payment for its production. Our disclosure and ethics policy can be seen here.