First look: Apple Vision Pro

Verity McIntosh
Jun 5, 2023
Happy birthday to me, happy birthday to me (to explain - today is my birthday 🥳)

After years of speculation, hype and hyperbole, Apple have officially joined the wearable immersive tech party with a brand new product, the Apple Vision Pro.

It is still early doors and, to be clear, I have not tried the kit myself as yet, but from what we saw in today’s announcement, this is a generational shift in spatial computing technology. A few first thoughts on what we have seen so far:

1. Augmented to virtual reality on a sliding scale

Apple are offering an AR (augmented reality) first approach, where it is assumed that you will want to retain your view of the world around you, and bring 3D objects into this live view of your environment. If you want to enter a fully virtualised world you can (literally) dial up the virtual world using a ‘digital crown’ similar to one already familiar to millions on the Apple Watch.

2. Peekaboo, I see you

One of the biggest complaints about VR is that it is anti-social. If you’re in VR you are effectively blindfolded. You can’t see your immediate surroundings, and people around you can no longer make eye-contact with you or have any sense of what you are doing.

The Apple Vision Pro suggests a different approach, becoming a sort of translucent visor that uses the internal cameras to give a lenticular view of the eyes and expression of the person in the headset. Lenticular as in: whichever angle you approach from, the view of the face looks 3D, as though the headset is barely there.

Note the lack of top strap so the ‘more-than-bald’ users finally have hair styling options — woo!

From the point of view of the person in the headset, the blend of passthrough and full VR is a personal choice, and when people in your physical environment come close, e.g. to offer you a cup of tea, give you a hug or (hopefully not) attempt to steal your stuff whilst you’re in the headset, passthrough mode will kick in and automatically give you a clear view of people on approach.

3. 3D photography — the ‘killer app’?

Anyone else getting Delenn from Babylon 5 vibes?

This really is something that we haven’t seen anywhere else, at least not this well and not all in the same place. Using the front-facing depth camera and LiDAR scanner you will be able to capture 3D photographs and video. This will allow you to relive precious memories and moments on your headset for years to come. This strikes me as a very consumer-friendly use case that could make the difference for so many people who have discounted XR thus far as ‘not for them’ or ‘just a gaming thing’. Smart move, Apple.

4. Sounds good, right?

Let’s be honest, for all of its immersive potential, sound has been pretty pap in previous headsets. You can either live with the low-quality onboard speakers or contend with the cable spaghetti/bluetooth lottery of using external headphones that were not designed to work with a face-mounted device. Apple are claiming that their onboard speakers are simultaneously high fidelity, discreet and spatial, using a clever system to map the physical character of your environment (sofa, curtains etc.) and adjust the acoustic quality of the sound to match the size and contents of your space.

5. Tech spec

6. Here’s looking at you — photorealistic avatar systems

Using ‘visionOS’, the front-facing camera can be turned around to capture a map of your own face in order to construct a (close to) photorealistic avatar. The internal-facing cameras will then track your facial movement so that the two together can give you a semi-realistic avatar that can presumably be used in social XR platforms, but in the keynote was shown specifically as a way to take part in 2D communications contexts such as a Teams or Zoom call. A bit of a shame for those wanting to take advantage of the more fluid identities offered to them by other avatar systems, but perhaps that will come later?

7. Data privacy as a first principle

This is a big one for me. Many current generation headsets have gaze detection built in. This can have lots of technical benefits, e.g. ID verification, accessibility tools, foveated rendering etc. It has also been identified as the hidden economy of the ‘metaverse’. If you can see where people look and how their pupils dilate in response to certain stimuli, you can rapidly monetise that information by profiling users, selling advertising and nudging behaviour to serve corporate or state agendas.

Apple have stated that, whilst sophisticated eye scanning will be used to identify users (even claiming to differentiate between identical twins) and to deliver technical benefits, individual gaze data (where you look, how intently and for how long) will never leave the headset. It will not be data that is available to Apple, and it will not be available to third parties to use in any circumstances. It is the same mentality that they have shown with the iOS ‘ask app not to track’ consent position, and it is a bold and, as far as I’m concerned, very welcome first principle when it comes to user privacy and security.

8. Great for us folk in education?

Erm… not sure. It’s as yet unclear how these might work in settings where headsets are expected to be used by multiple users, such as education settings, libraries, cultural venues, theme parks etc. This does seem to presume, as did Meta before it, that one user = one headset, and can therefore work locked to one verified user ID. However, Apple have always had good form with enterprise-level management via Apple School Manager and the like, with Mobile Device Management (MDM) solutions and configurable setups, so here’s hoping that more info along these lines comes out in the build-up to release.

9. When and how much?

Ages. And loads 😳


Verity McIntosh

Senior Lecturer and researcher in Virtual and Extended Realities at UWE Bristol.