Ottoman L.

Beyond vision | Spatial Computing reaches the entire Apple ecosystem

Just as eyesight often eclipses our other senses, Apple Vision Pro stole the spotlight at WWDC. But clear signs point to a much larger ambition for Apple’s spatial computing platform. The real revolution is being staged outside the augmented reality headset. From the iPhone and Watch to the less flashy TV and AirPods, Apple has been building spatial awareness into its devices for years. But to what end? And what does it reveal about Apple’s spatial computing strategy?

An iceberg symbolizes the larger ambition of spatial computing lurking beneath the surface. To understand what spatial computing really means, you need to understand the platform and how it spans other Apple devices.
Look beneath the surface and you’ll see the significance of spatial computing for the iPhone and Watch. Image: Simon LEE

After Tim Cook’s presentation, something felt off. Surely that couldn’t be all. Spatially aware technologies aren’t unique to Vision Pro; Apple has been building them into its devices for at least three years. If you’ve ever wondered why Apple would include cutting-edge spatial chips in stationary devices like the HomePod speaker, you’re not alone. With its latest announcements, everything becomes crystal clear.

The success of spatial computing is intertwined with its availability across all Apple devices, and there are signs that this has been a deliberate strategy for the company. Are you ready to find out what happens when it all comes together? Let’s unravel the mystery of Apple’s spatial computing strategy!

Spatial computing promises to enrich our world with digital information and interactions seamlessly integrated into our physical environment. That vision cannot be fully realized if a headset is the sole interface for spatial computing. Wearing a headset all day is neither practical nor desirable for most users, and putting one on for a short interaction requires intentionality, which creates friction. We need a more flexible and natural way to interact with spatial computing, one that adapts to different levels of immersion and comfort. This is where gradual interaction comes into play.

Glass steps symbolize the gradual interaction offered by multimodal spatial computing. Each device, like the iPhone or Apple Watch, acts as an accessible stepping stone to spatial interactions without the need to wear a Vision Pro headset.
Multimodality offers gradual stepping stones into spatial computing experiences without the friction of wearing a headset. A good way to transform user habits. Image: Nathan Watson

Gradual interaction is the idea that users can access spatial computing across various devices, depending on their needs and preferences. A headset offers the most immersive and lifelike experience, but it isn’t the only option. Given the choice, many users will opt for less intrusive, more portable devices, such as the iPhone or Apple Watch, for specific spatially aware information or actions.

For example, pointing the iPhone at a plant could display its watering history and soil moisture level. Likewise, your watch might detect when you gesture toward the TV and adjust the volume. Reach for the doorknob and your watch could remind you to take an umbrella because rain is forecast for the evening. Walk past your mailbox and your AirPods could chime to let you know your package has been delivered. These are the promises of multimodal spatial computing.

This is made possible by a common platform that synchronizes all your spatial experiences through iCloud, letting you switch from one device to another without losing context or continuity. Gradual interaction lets users take advantage of spatial computing in a more convenient and accessible way, while retaining the ability to immerse themselves fully when desired.

A long-term investment

Why would Apple include advanced spatial chips in a watch or a speaker? It only makes sense in light of a grander purpose.

Apple has a secret weapon for spatial computing: the U1 chip. This hidden gem has been quietly added to iPhones since 2019, and to Apple Watches and HomePods since 2020. It enables remarkable spatial interactions using ultra-wideband radio, a technology that accurately measures distance and direction between devices. It’s a chip without an obvious purpose today, but it holds the keys to unlocking the full potential of spatial computing.

Let’s look at some examples:

Your iPhone is a magic wand

Handoff already lets you tap your iPhone on your HomePod to transfer whatever music is currently playing. This proximity interaction, handled by the U1, could soon turn into a long-range interaction and reach far more dumb devices than you’d expect. With the U1 chip, your iPhone and HomePod can locate each other, so you might get the same Handoff functionality without moving from your couch. Just point the phone at the speaker.
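As a rough illustration, here is how an app might decide the phone is pointing at a peer, given the kind of direction vector that ultra-wideband ranging (NearbyInteraction) can report. This is a minimal sketch: the `Vec3` type, the `isPointing` helper, and the 15-degree tolerance are my own assumptions, not Apple API.

```swift
import Foundation

/// A 3-component direction vector, similar in spirit to the direction
/// that NearbyInteraction reports for a nearby peer device.
struct Vec3 {
    var x, y, z: Double
    var length: Double { (x * x + y * y + z * z).squareRoot() }
    func dot(_ o: Vec3) -> Double { x * o.x + y * o.y + z * o.z }
}

/// Returns true when `directionToPeer` lies within `toleranceDegrees`
/// of the device's forward axis. Following the camera convention of
/// looking down -Z, the forward axis is (0, 0, -1).
func isPointing(at directionToPeer: Vec3, toleranceDegrees: Double = 15) -> Bool {
    let forward = Vec3(x: 0, y: 0, z: -1)
    let cosAngle = forward.dot(directionToPeer) / (forward.length * directionToPeer.length)
    let angleDegrees = acos(max(-1, min(1, cosAngle))) * 180 / .pi
    return angleDegrees <= toleranceDegrees
}
```

A peer almost straight ahead would pass the test; one off to the side would not, which is exactly the filtering a point-to-hand-off gesture needs.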

To avoid unwanted interactions, the last missing ingredient is a simple trigger, which could very well come in the form of the much-rumored Action Button. That deserves an article of its own.

Thinking more systemically, nothing prevents your iPhone from controlling devices that aren’t equipped with a U1 chip. If you’ve previously mapped your space with ARKit, you only need the iPhone’s position and orientation within that space to infer what it is pointing at.

You may be able to trigger specific Shortcuts by pointing your iPhone at a bookshelf or coffee machine. Can you imagine long-range spatial shortcuts that would be useful to you?
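To sketch the idea, assume the space was mapped earlier and a few objects were anchored at known positions. The code below picks the object nearest to the phone’s pointing ray; the `MappedObject` type, the positions, and the cone tolerance are hypothetical, not ARKit API.

```swift
import Foundation

/// A hypothetical object anchored during an earlier ARKit mapping pass.
struct MappedObject {
    let name: String
    let position: (x: Double, y: Double, z: Double)
}

/// Given the iPhone's position and forward direction in the mapped space,
/// returns the name of the object closest to the pointing ray, or nil when
/// nothing falls within `toleranceDegrees` of the ray.
func target(from origin: (x: Double, y: Double, z: Double),
            forward: (x: Double, y: Double, z: Double),
            among objects: [MappedObject],
            toleranceDegrees: Double = 10) -> String? {
    func normalize(_ v: (x: Double, y: Double, z: Double)) -> (x: Double, y: Double, z: Double) {
        let l = (v.x * v.x + v.y * v.y + v.z * v.z).squareRoot()
        return (v.x / l, v.y / l, v.z / l)
    }
    let f = normalize(forward)
    var best: (name: String, angle: Double)?
    for object in objects {
        // Angle between the pointing ray and the direction to the object.
        let d = normalize((object.position.x - origin.x,
                           object.position.y - origin.y,
                           object.position.z - origin.z))
        let angle = acos(max(-1, min(1, f.x * d.x + f.y * d.y + f.z * d.z))) * 180 / .pi
        if angle <= toleranceDegrees && angle < (best?.angle ?? .infinity) {
            best = (object.name, angle)
        }
    }
    return best?.name
}
```

An app could map each returned name to a Shortcut, so pointing at the coffee machine runs one automation and pointing at the bookshelf runs another.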

But wait! ARKit can also map the world outside your home.

To me, a future where you point your iPhone at a restaurant’s storefront to quickly pull up its menu is inevitable.

Heck, you might even get a spatial Mapstr widget letting you know that your friends tried and loved the croissants, or that the oysters made them sick. This is definitely not something within reach of a bulky AR headset.

Apple TV is a mirrored portal into your living room

One of the most underrated announcements at WWDC23 is the arrival of Continuity Camera on Apple TV. It states the following:

Developers can leverage the Continuity Camera API on Apple TV 4K to integrate the iPhone or iPad camera and microphone into their tvOS app and create new shared and engaging entertainment experiences for the living room.

Have you ever wondered what spatial computing and Continuity Camera can do for Apple TV? This illustration depicts Apple TV as an AR portal in a living room: the 3D model is contained within a cube whose side walls are invisible, letting you see inside the room.
Continuity Camera on Apple TV hints at its future as an AR mirror of your living room. Image: Siednji Leon

It’s easy to focus on its video-calling applications. But the real impact comes when this camera is used to show you an enhanced mirror image of your world. One of the most exciting applications is in the Fitness app, which could track your 3D body posture in real time as you exercise, giving you detailed feedback on your form and counting your reps. It could even adjust the workout’s pace to match your own. Ubisoft’s Just Dance could also make a welcome return to hands-free body control using Continuity Camera on Apple TV. And that’s only scratching the surface of its potential.
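Rep counting, for instance, can be reduced to threshold crossings on a joint angle that a body-pose tracker could estimate from the camera feed. The sketch below uses made-up angle thresholds and is not Apple’s Fitness implementation; it only shows the shape of the logic.

```swift
import Foundation

/// Counts completed reps from a series of joint angles (in degrees), the kind
/// of signal a body-pose tracker could derive from a camera feed.
/// A rep is one excursion below `flexed` followed by a return above `extended`.
func countReps(angles: [Double], flexed: Double = 70, extended: Double = 150) -> Int {
    var reps = 0
    var isFlexed = false
    for angle in angles {
        if !isFlexed && angle < flexed {
            isFlexed = true        // bottom of the movement reached
        } else if isFlexed && angle > extended {
            isFlexed = false       // back to full extension: one rep done
            reps += 1
        }
    }
    return reps
}
```

The two thresholds form a hysteresis band, so small jitter in the pose estimate around a single threshold doesn’t inflate the count.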

These few examples illustrate how spatial computing is a platform that is ready to go once the use cases are there. And Apple is ahead of the curve.

Vision Pro’s best trick lies in its eye-tracking capabilities, predicting users’ intentions and reacting to what they’re looking at. The magic of eye tracking could be distilled into other devices. While it may be diluted in the process, the convenience of the interaction could still make it an enticing UI. With head tracking in your AirPods, Siri could whisper relevant information into your ear based on where you’re looking or where you’re going. Approach a subway entrance and it could announce upcoming departures. Spatialized chimes could subtly guide you through underground corridors without your having to look at a screen.
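Guiding someone with spatialized sound mostly comes down to knowing where a cue should sit relative to the listener’s head. The helper below computes that relative bearing; the head heading could come from headphone motion tracking (for example CMHeadphoneMotionManager), while the coordinates and the function itself are hypothetical.

```swift
import Foundation

/// Relative bearing (degrees, in -180...180) from the listener's head heading
/// to a point of interest: negative means to the left, positive to the right.
/// Headings use compass convention: 0 = north (+y), 90 = east (+x).
func relativeBearing(headingDegrees: Double,
                     from listener: (x: Double, y: Double),
                     to target: (x: Double, y: Double)) -> Double {
    // Absolute bearing from listener to target in compass convention.
    let bearing = atan2(target.x - listener.x, target.y - listener.y) * 180 / .pi
    // Difference from where the head is facing, wrapped into -180...180.
    var delta = bearing - headingDegrees
    while delta > 180 { delta -= 360 }
    while delta < -180 { delta += 360 }
    return delta
}
```

A spatial audio engine could then pan the chime by this angle, so the sound itself tells you which corridor to take.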

The Vision Pro is just the tip of the iceberg. As Apple continues to develop a spatial computing ecosystem with gradual, multimodal interactions, countless new use cases will emerge. Technology will recede into the background of our lives, allowing us to focus on what truly matters: each other and the world around us. This could finally realize one of technology’s wildest dreams, a user interface so intuitive it feels like magic imbued into your world.

Our brains, after all, are inherently multimodal, filling in missing sensory information through associations with other senses. Similarly, multimodal spatial computing can compensate for the absence of immersive visual feedback, creating natural, intuitive user interfaces built on our other senses. Once other devices join the dance, haptic taps and chimes could give rise to virtual experiences that are perceptible even without visual feedback. No more scrolling through long lists of apps: simply point at an associated object to trigger an action and feel the result. You don’t want to wait until you put on an AR headset to learn that your plants need watering. Instead, your watch could vibrate as you pass a neglected plant’s moisture sensor, warning you that it is drying out.

The Vision Pro is a diamond in the rough. A spatial computing ecosystem with gradual, multimodal interactions would open the door to many more use cases, and over time this flow may polish the gemstone to reveal its true luster.

The possibilities of the AR headset aren’t limited by its hardware but by its social acceptance. A gradual, multimodal approach could bring spatial computing to the streets.

We expected a slowdown in technological breakthroughs, but what’s coming could catch us off guard. Apple equipped the Vision Pro with an external display facing the world, surely a metaphor for what is to come. The light is flowing out of it and into the rest of the Apple ecosystem.

When devices recede into the background and technology permeates our physical objects, the world moves closer to magic. The genie is already out of the bottle, and the collaboration between Disney and Apple could grant its first wish.

By now it is clear that the Vision Pro is just the spark that ignites everything; once the dust settles, it will become clearer what all these changes mean for Apple’s device ecosystem.

We are on the brink of a new era, where technology becomes magic and magic becomes reality. Are you ready for this?
