Apple’s ‘mixed reality’ headset: What could the future look like through new VR glasses?
Goggles are a secret and a mystery – but clues have been lurking in Apple’s products for years
In a couple of weeks, at its annual Worldwide Developers Conference, Apple could be set to reveal one of its most important and risky products ever: a whole new way of seeing the world.
The “mixed reality” headset is still a secret and a mystery. Though it has been strongly rumoured, it still might never arrive; it would certainly not be a surprise if it did not turn up at that WWDC event, though it would be something of a disappointment given how many whispers there have been.
But whether it arrives now, later or never, Apple’s headset has been in the works for years. And clues to what it might look like – or what the world might look like through it – are already sprinkled through the company’s other products and apps.
Rumours suggest the headset might look something like ski goggles, with cameras mounted on the outside and a high-resolution screen on the inside, borrowing some of the design language of the Apple Watch. Eventually, that is expected to morph into something less like a virtual reality headset and more like glasses, with a screen that blends into normal life.
We have a lot of clues about what the mixed reality experiences that Apple wants people to see through it will look like. That’s because Apple and its executives have frequently discussed the uses of augmented reality in public, even as the hardware has stayed a tightly protected secret.
Apple’s work on a kind of headset is thought to have been ongoing since 2016, though what exactly that headset was rumoured to be has shifted over the years. The main rumours have since coalesced around a “mixed reality” headset: something that would use cameras to show a picture of the real world inside a headset strapped to the user’s face, and sensors and software to overlay virtual objects on that real world.
In 2017, Tim Cook spoke to The Independent about the importance of augmented reality. At that point, he was talking primarily about the iPhone and iPad, and how their cameras, sensors and screens could be used to layer virtual objects on top of the real world.
It was clear even then, however, that Apple planned to bring that technology to some sort of glasses or goggles. Asked at the time, Mr Cook pointed to “rumours and stuff about companies working on those” and declined to discuss Apple’s plans, but he said that the “technology itself doesn’t exist to do [devoted hardware] in a quality way”.
“The display technology required, as well as putting enough stuff around your face – there’s huge challenges with that,” he said then. “The field of view, the quality of the display itself, it’s not there yet,” he added, saying that Apple doesn’t “give a rat’s about being first” and would not launch anything until it was satisfied.
In the time since, Apple has seemingly been working very hard to get satisfied with that technology. It appears to be paying off: last week, virtual reality pioneer Palmer Luckey, who created Oculus and then sold it to Meta, suggested that he had seen an early version of the headset and that it was “so good”.
Morsels of unclear and uncertain information like that are all we have to go on about the headset, which has not really been leaked in any definitive form even as rumours suggest it is just weeks from launch. But if the hardware is unknown, Apple’s plans are not so mysterious.
That is because Apple may already have been quietly planning for the metaverse – or something like it, given that the company is unlikely to adopt a term that has been soured both by the mixed reaction to “Web 3.0” and by Facebook’s attempt to co-opt it, changing its name to Meta and re-orienting the company around it.
Apple will probably avoid much of that branding, since it has quickly become attached to a kind of hype and boosterism that it tends not to associate itself with. It also usually likes to make up its own terms – and recent trademarks have included filings for “xrOS”, which may be the branding for the operating system that powers the headset.
But those early experiments in the metaverse do already show some of the ways that Apple might be imagining us interacting with its new headset. The most obvious comparison is with the Oculus headsets that are now developed by Meta.
Those have focused more on virtual reality than augmented reality; Meta has been interested in creating whole new digital worlds, rather than overlaying virtual objects on the real world, as Apple appears to be. But the use cases may be the same, and Meta has focused on applications such as business meetings, where people can sit around a real table and discuss things, as well as VR games such as the hugely popular Beat Saber, where people try to strike objects with a sword in time to a beat.
Apple has in recent years been more focused on the quality of interactions with devices, and has actually encouraged users to spend less time on their devices and more time being active. As such, it might not opt for mixed reality experiences that leave people cooped up inside the headset, and might instead be more outward-looking.
Apple might still focus on meetings, then, but would presumably use its augmented reality technology to add virtual people to real rooms, rather than stuffing them into computer-designed spaces. And it might focus more on people’s interactions with their real environment by, for instance, allowing people to see their messages or directions as they move around – a version of which is already familiar to people who wear the Apple Watch.
Apple has already offered a host of technologies aimed at making it easier for developers to include augmented reality in their iPhone and iPad apps. In 2017, when Mr Cook made those tantalising comments about a possible headset, he was actually showing off uses of those frameworks: in the app Night Sky, for instance, which overlays virtual images of the constellations on top of real camera views of the stars.
The company makes that technology possible with a range of systems, in both its hardware and software. That hardware includes the cameras, of course, but also the LiDAR sensor in some iPhones and iPads that allows for precise scanning of the environment; the software is most obvious in the form of ARKit, which does the heavy lifting of understanding the world around a user so that developers can just place their virtual objects into it.
Since then, developers such as Ikea have used that technology to let people drop virtual furniture into a real room and see how it might fit. Other apps, such as 3D Scanner, harness those sensors to let people easily create virtual views of those rooms, as if they were in a video game.
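For a sense of how much of that heavy lifting ARKit takes on, the sketch below shows roughly what an iPhone developer might write to drop a single virtual object into the camera’s view of a real room. It is a minimal, illustrative example rather than code from any particular app: the class name, the cube and where it is placed are arbitrary assumptions, while the camera tracking and surface detection all happen inside ARKit.

```swift
import UIKit
import ARKit
import SceneKit

// A minimal sketch, not production code: ARKit tracks the device and
// understands the scene, while the app only describes the virtual content.
// The class name and the cube are illustrative assumptions.
class ARDemoViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)

        // Ask ARKit to track the device's position and look for flat surfaces.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        sceneView.session.run(configuration)

        // Place a small virtual cube half a metre in front of where the
        // camera started; ARKit keeps it anchored as the user moves around.
        let box = SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0.01)
        let boxNode = SCNNode(geometry: box)
        boxNode.position = SCNVector3(x: 0, y: 0, z: -0.5)
        sceneView.scene.rootNode.addChildNode(boxNode)
    }
}
```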
But the most telling way of guessing what Apple’s plans for the headset are can be found in its own technology. Apple has been quietly creating a whole world of apps that appear to be ready for mixed reality, hidden in plain view.
One of them is Memoji, which are cartoon versions of people that can be used as stickers to react to messages or as profile pictures within messages. Apple has leaned on them heavily in recent years, using them as decorations in its presentations, for instance.
At the moment, they are just fun pictures. But it’s very easy to imagine how they could instantly be used as avatars for people in virtual environments.
If people are to participate in virtual meetings through their headsets, then other participants in that meeting will need to appear in some form, and current versions of those environments such as Meta’s Horizon Worlds rely on cartoonish drawings of people rather than realistic depictions. Apple already has those ready to go – many iPhone owners might already have created their virtual avatar, without even realising it.
Aside from meetings, one of the most discussed use cases of mixed reality is in finding directions. A person might be walking up to a junction, for instance, and see a virtual signal overlaid on the real road, telling them to turn left down the street they need to take.
Apple has been putting the foundations of that kind of technology in place for years. It launched Apple Maps in 2012 and has been working ever since to integrate it with the real world: collecting detailed 3D information so that it can show accurate and precise virtual models of real buildings, for instance, and introducing a “Look Around” tool in 2019 that works like Google Street View and lets people scroll around three-dimensional panoramas.
At the moment, those features are used to make people’s phone navigation look nice: as you drive into a city, for instance, the 3D models of buildings shift around, to make it feel like the virtual car on your phone is really driving along. But it takes very little imagination at all to see that the real purpose of those models may be to allow Apple to have a precise understanding of the real world, so that the headset will know where it is and help its wearer.
Apple has also already been working on wearable products, which might themselves offer an insight into how the headset will work. The Apple Watch and AirPods already work by attaching themselves to people’s bodies and adding a layer of digital information – which, in a way, is exactly what the headset will aim to do.
The AirPods, for instance, have found a way to lightly add information on top of the real world: using Siri to read out messages as they arrive, and adding transparency modes that allow the noises of the real world in as well as the sound coming through the earphones.
And the Watch may be a glimpse at how Apple will encourage people to strap a computer to their body in a way that might look alarmingly sci-fi but has become accepted as relatively normal. Apple helped the Watch become an acceptable accoutrement by focusing on personalisation, and on a design soft enough that the hardware did not seem too much like a computer – it will probably do much the same with the headset.
Apple’s new products are often described as if they have been magicked up from nowhere. But in truth all of its recent innovations have had their roots in previous products: the iPhone combined an iPod and a computer, and both the iPad and Apple Watch were bigger and smaller versions of that iPhone.
The same thing appears to have been happening with Apple’s headset, which might look a lot more familiar than we are expecting. When the company takes to the stage to reveal its new products in a couple of weeks, we might realise that the product Apple wants us to strap to our faces has been developed right under our noses.