
Impressions and Thoughts from Early Vision Pro Reviews

Apple’s new Vision Pro headset is here (see “Apple Vision Pro Arrives 2 February 2024,” 8 January 2024). As always, Apple seeded some select reviewers with pre-release units. You could easily spend several hours reading and watching them all, and unless you’re highly intrigued by the Vision Pro, that may be more time than you have. Plus, the reviewers largely agreed about most things, so it’s not as though there are many different perspectives. That’s not unexpected—the Vision Pro is so new and so unusual that most people are just trying to explain the basics for now.

The Reviews

To get started, watch the video by Joanna Stern of the Wall Street Journal. It’s less than 9 minutes long, but she puts the Vision Pro through its paces in a real, if slightly contrived, environment, and her work is always insightful and entertaining.

Once you’ve watched Joanna Stern’s video to get a sense of what the Vision Pro looks like and what a user would see, you have the necessary visual background to read the 7250-word review by John Gruber of Daring Fireball. Gruber’s description and analysis are careful and thoughtful, which is good because they’re backed with only a single still image.

After that, you have a choice. If you’d like to read more, Nilay Patel of The Verge penned a 9000-word review illustrated with numerous still images. He’s the least positive of all the reviewers and the one who most emphasizes that wearing a headset is inherently isolating. And he’s not happy about it messing up his hair. The Verge’s video is impressively produced, but content-wise, it’s essentially Patel reading the text of his article (though you get to see the effect of the Vision Pro on his hair).

YouTuber Marques Brownlee of MKBHD created three videos about the Vision Pro. The first is a 19-minute unboxing video, and while I skimmed some of the early parts, once he’s explaining all the parts and pieces, it’s helpful and informative. He followed that up with another well-done 38-minute video about using the Vision Pro that’s more in-depth than Joanna Stern’s. But the best one is the third video, in which he shares opinions about how well the Vision Pro hardware and software work.

After that, I was honestly a little burned out on learning more about the Vision Pro, but if you want to press on, you can read a 5000-word review from Mark Spoonauer of Tom’s Guide or a 7000-word review from Scott Stein of CNET. For more video coverage and the most enthusiastic review, watch iJustine’s 32-minute video that combines unboxing and usage. Note her warning that the videos of what the reviewer sees look very fast and jumpy but don’t reflect what’s being experienced.

Initial Impressions

After consuming all these reviews, I came away with some impressions about the Vision Pro:

  • It’s an amazing VR headset: All the reviewers were blown away by the level and amount of technology that Apple has shoehorned into the Vision Pro. They say every aspect of the experience is better than what they’ve experienced with other VR headsets. The Vision Pro may be expensive, but it doesn’t feel like Apple is overcharging based on what’s inside.
  • It’s more expensive than we thought: The $3499 list price is just the beginning. That model has 256 GB of storage; 512 GB costs $3699 and 1 TB is $3899. Given the size of 3D content, apps, and your Photos library, 256 GB may not be enough. Many will also need the Zeiss optical inserts: $99 for readers or $149 for prescription lenses. Apple’s padded carrying case for travel runs another $199, and AppleCare+—almost certainly essential—costs $499. You could also pay $199 for a backup battery pack and $199 for another light seal cushion if someone else in your household wants to try it. You could spend nearly $5000 on a kitted-out Vision Pro.
  • It’s heavy: Everyone commented on the weight. Even though Apple has done a good job with the design and provided two types of head straps, the Vision Pro still puts 650 grams (1.4 pounds) on the front of your face. That’s ergonomically problematic (and would have been worse if Apple had integrated the battery pack), so if you’re buying the Vision Pro, I urge you to increase session time slowly to avoid overloading your neck muscles.
  • EyeSight is weird: EyeSight, which displays a digital representation of your eyes on the front of the Vision Pro, is weird and ineffective. The resolution and brightness are much lower than Apple’s marketing suggested. The eyes don’t look natural, and if you have dark skin, like Marques Brownlee, they’re hard to see. This is disappointing since Apple made such a big deal about how EyeSight would help make others feel more comfortable with you when you’re wearing goggles. Perhaps most people will remove the Vision Pro when others are around.
  • Personas are deeply unsettling: Personas, which give you a digital avatar from the chest up, are even more disconcerting. They’re highly accurate representations that are still profoundly wrong, which lands them deep in the uncanny valley. Apple says Personas are still in beta, but it’s hard to imagine them improving enough to climb out of the valley.
  • Guest mode, but seemingly no multi-user support: Several reviewers briefly mentioned that there’s a Guest mode that lets someone else try out a Vision Pro, and iJustine has a friend test it. From what I can tell, though, it’s designed for demos, not for regular sharing with someone else in your family.
  • You can have just one 4K Mac display: Everyone liked displaying a Mac’s screen within the Vision Pro environment, and you can expand it. However, that screen is limited to 2560-by-1440 pixels, and you can only have one. Although you can arrange native Vision Pro and compatible iPad apps next to the virtual Mac screen, it’s not exactly the “infinite canvas” Apple has been promoting.
  • A physical keyboard is essential for being productive: If your aim is to get work done, you’ll need a Bluetooth or laptop keyboard. The visionOS virtual keyboard is fine for inputting passwords and other short bits of text, but that’s it.

Final Thoughts

The most important thing to remember about the Vision Pro is that it’s not for “the rest of us.” It’s not even clear who the Vision Pro is for—there is no killer app yet. Apple is planting seeds for developers, early adopters, and the tech curious who are willing to spend big to be on the bleeding edge. To be sure, Apple also wants to get ahead of Meta and other companies that may want to stake out similar territory. The hope is that the Vision Pro will provide everyone—Apple, developers, and users—with the experience necessary to work toward a point where the technology has evolved sufficiently that it can become a mainstream product at an affordable price.

Or, rather, that’s my analysis of the situation, which matches that of many experienced tech industry watchers. But it’s not what you’ll hear from Apple’s marketing machine, which has to pretend that the Vision Pro is for everyone and solves real-world problems today. Apple can’t say out loud that the initial Vision Pro is a necessary first step toward an imagined future, but when you compare it to everything Tim Cook has said about AR and VR, it’s pretty clear that’s what is happening.

Perhaps the second-generation Vision Pro will be half the weight, the third-generation model will integrate the battery, and the fourth-generation will be actual glasses rather than a VR headset with passthrough video. (And surely we’ll get a “Vision Air” at some point.) The promise is compelling, if dangerously close to science fiction—a Vision Pro could one day supplant every other Apple product. Who needs an iPhone, Mac, Apple Watch, or Apple TV if a pair of glasses (contact lenses would be science fiction) could provide all the same processing, communications, recording, and virtual display options? That may be looking too far into the future, but regardless, I’d prefer Apple’s vision—and implementation—of what comes next to those of any of today’s other tech giants, who would likely bombard you with ads, track your every eye movement, and try to sell you more stuff.


Comments About Impressions and Thoughts from Early Vision Pro Reviews

Notable Replies

  1. I have to admit I’m not a fan of including a recommendation for iJustine in the same collection as Joanna Stern’s review. Joanna Stern does serious reviews (albeit in an entertaining manner) and is ready to offer actual insight and critique. iJustine is an entitled hyperactive brat who does content-devoid puff pieces for Apple marketing. IMHO she needs absolutely no advertisement at all from a reputable source. Just my 2¢.

  2. Thanks for this summary and pointers. I’ve watched many of them, and overall I was reminded of early iPhone reviews where people felt compelled to find flaws, to the point of absurdity. (Such as the angst over messing up your hair, or complaining that it’s too heavy while simultaneously suggesting the battery should be in the headset.)

  3. Thanks much for a great review. I haven’t taken the time to follow all the links, but there’s a lot to see here.

    I wonder how long before developers can have fun with things like EyeSight and Personas - like the ability to use cartoony avatars instead of your real face. (Am I the only one who thinks it would be fun for the front display to show Simpsons-style spherical eyeballs?)

  4. I thought all the videos you mention were well done and informative (though I skip iJustine for the same reason I never watch Linus). I have two thoughts about the device.

    First, I think it is unfortunate that they base visionOS on iPadOS. I don’t think much of the latter and am particularly unimpressed by the iPad App Store. As I have mentioned elsewhere, it lacks the apps that I am interested in: things like compilers and development apps, scientific data analysis apps, CAD apps, etc.

    I have found iPadOS hard to use for productive work largely due to the lack of a flexible file management capability and the many little annoyances in iOS generally.

    I wish visionOS were based upon macOS. Then, one could have a completely self-contained computer in the headset and, moreover, a computer one could use for productive work, with a keyboard, mouse, and no App Store requirement. The last is a dream, of course.

    Second, I think the Vision Pro would be a superb device for certain professions such as surgeons, architects, certain machine operators, robot controllers at nuclear power plants and pilots. As an inactive pilot, I think that the device would be a great platform for a superb, immersive flight simulator.

    As some reviewers (e.g. Gruber) have said, its greatest strength at present is in entertainment. While I agree, the solitary nature of the experience is somewhat disturbing. I am guessing that entertainment will be dominant for at least a year or so for those not bothered by watching movies and sports events alone.

  5. I’m disappointed with the one-monitor limit; heck, I wanted more than one Mac.

    I have three Macs on the go at my desk, two of them with two monitors each, which isn’t typical, I acknowledge. I’d have been curious about a virtual replica of that setup. With a couple of iPads…

    Version 1. It will develop.

  6. I hesitated about including iJustine’s review, but at this point in the Vision Pro lifespan, any additional perspective is interesting. For instance, she was the only person who covered Guest mode, which feels important early on.

    And while her style (I’ve never met her; there’s no telling how much is an act) grates on my nerves as well, I could see some (very different from me) people finding her more compelling than the others.

    I don’t think there’s any reason to believe this. All of Apple’s operating systems share a great deal of code and interface approach, and visionOS is a new fork of that OS core, focused on the interaction methods possible with the Vision Pro and appropriate to the level of processing power available with the M2. It’s a little hard to imagine writing code on a Vision Pro, though, of course, you could do that in a Mac window. Things like data analysis and CAD apps are much easier to imagine, but it will take developers some time to figure out what’s possible, after which they’ll have to decide if there’s a sufficient market for such focused apps or if such an app could be so compelling on the Vision Pro that users would buy the hardware to get it. I could see it going either way eventually.

  7. Perhaps I didn’t understand what was said in many of the reviews, but I thought that visionOS was deliberately based on iPadOS so that all the existing iPad apps (with the developer’s approval) would run seamlessly on Vision Pro. I have not heard any mention of the possibility of loading Vision apps except through the Vision App Store.

    If Apple runs true to form it will not be possible to run compilers or development systems on the device, as is the case with iOS and iPadOS. I understand from the developers of TeXpad for iOS that they had great difficulty in getting it accepted since, as a typesetting app for math, it came close to violating the compiler ban.

    I have always been surprised that there were so few CAD apps for the iPad. One would think that the platform would be a natural, with the Pencil. I have heard users of the one good CAD app praising the iPad-plus-Pencil combination to the skies as the best platform for their software. Perhaps it is the App Store, which seems inhospitable to fairly expensive apps, together with Apple’s 30% cut, that dissuades developers.

  8. It’s absolutely true that iPad apps can run on the Vision Pro if the developer allows it (iPhone apps too?) but that doesn’t imply that visionOS is based on iPadOS. macOS has the same feature.

    I can’t imagine there ever will be one unless Apple is forced to do it by the EU. ;-)

  9. Given that the EU alternative app stores will not be available for devices other than the iPhone, the App Store will be the only path to obtaining Vision Pro apps for the immediate future.

  10. Thank you, Adam. I was just fantasizing a Vision Pro based upon macOS where one would have an almost infinite screen in which to place one’s Mac windows where one could drag and drop between windows and use the Mac UI to manage all this. This seemed better than having a single Mac screen surrounded by iPad windows (or visionOS windows) with (at present) limited capacity to drag and drop between them.

    By the way, I can’t imagine a comfortable use of the single Mac screen without a mechanical keyboard and pointing device. For some reason, Apple does not support Bluetooth mice (like their own Magic Mouse), my preferred pointing device. That seems odd and I hope it changes and, further, that Apple allows more than one Mac screen. For those Mac-centric people (like me), it would make much more sense to buy a couple of Studio Displays than use a Vision Pro for Mac operations at present.

    It will be interesting to see how the Vision Pro evolves.

  11. Yeah, having Mac app windows being able to stand alone within the visionOS interface would be nice, but it really is a fantasy. macOS just isn’t designed for the gaze-oriented interface of visionOS, so Mac apps would be very hard to use that way. iPad apps are already assuming a touch interface and thus have controls with larger targets because a finger is much larger than a mouse pointer.

  12. I like the phrase “gaze-oriented interface”, an apt description of the visionOS UI. Although it is potentially more accurate than a touch-based interface, it still lacks the multiplicity of modes allowed by a mouse with multiple buttons. Furthermore, it limits one’s use of one’s gaze for more than pointing - gazing elsewhere during a UI operation could have weird consequences. I wonder whether users will get fatigued by this after a long session.

  13. If it’s a laptop you’re mirroring inside the AVP, you can still use the laptop’s existing keyboard and trackpad. It works really well, and you can even use the trackpad to control Vision apps (similar to using a trackpad on iPad).

    You can also use a bluetooth keyboard and trackpad. But not bluetooth mice as you mentioned – I wonder if that’s because the OS wants the trackpad’s touch gestures?

    (Personally, I can’t use mice any more. Tried a few years ago and got hand cramps. So weird. I’ve just been using trackpads since the mid-90s when Apple started putting them on laptops and never looked back.)

  15. It’s important to keep straight how the different OSes in Apple’s ecosystem overlap.

    • They’re all based on the Darwin Unix kernel (originally based on BSD, but I assume has evolved in unique directions since then).
    • It is safe to assume that builds of Darwin for different hardware platforms (Intel Mac, Apple Silicon Mac, iPhone, iPad, Watch, TV, Vision Pro) are compiled with different sets of device drivers and with various kernel-level features enabled/disabled, as appropriate for the products.
    • On top of Darwin is a huge slew of frameworks - libraries upon which apps are built. Some (e.g. CoreFoundation and CoreImage) are probably present on every platform, even if the capabilities provided on each are not 100% identical. And some (e.g. Cocoa or UIKit) are going to be platform-specific (only Macs have Cocoa; UIKit is for iOS, iPadOS, and tvOS).
    • And then, of course, there are the bundled apps. Even when two platforms have apps with the same name and icons (e.g. Mail or Keynote), they are probably not built from identical code, although they may share some app-specific libraries under the covers.

    I think it is reasonable to assume that the Vision Pro is based on a Darwin kernel, just like all of Apple’s products. But it’s going to be a custom build designed for its distinctive hardware devices. And although there will be many common frameworks for app developers to use, I can guarantee you that they won’t all be present, there will be several unique ones, and those that are cross-platform will not have identical implementations.

  16. I have been using both trackpads and mice and have a strong preference for the Magic Mouse, despite its slightly uncomfortable feel. The 2D touch surface is wonderful and can be used for (2D) scrolling, switching screens in Safari, and lots of other things. I still like trackpads - my first one was the pad on a PowerBook 1400, where they introduced tap-to-click, and I loved that.

    I find mice much more convenient to use in CAD and drawing apps - for some reason, drawing is easier using a mouse (for me).

  16. I get it.

    Speaking of drawing, yesterday I tried Apple’s Freeform app for the first time (the one they announced last summer as part of the new OS), but on Vision Pro. That was freaky. You draw in the air with your hand! I can’t really draw, so I don’t know how a real artist would like it, but it was much better than I thought. The finger tracking was surprisingly good.

    But it was weird and I’m sure to my dog watching me, I looked rather insane. :joy:

  17. One thing about Apple first editions: they become obsolete pretty quickly, and the price drops rapidly. Remember the first iPhone - and how Apple had to give all buyers a big Apple Store credit when they quickly lowered the price? And how the first iPad didn’t have a camera? And how the Watch original version died away as Series 1, etc., were introduced?

    So I think this is for the most serious of Apple maniacs with money to spare.

    One thing about the “extras” that seems strange to me (well, extra for the carrying case seems weird to me too) is the prescription glasses thing. So many people wear glasses! Why can’t they adjust for that in VR software?

  18. The Marques Brownlee reviews are possibly the best ones for getting an appreciation of the setup and working features of the Vision Pro. In his last video, he discusses ‘how it feels’ to be in Apple’s Vision Pro spatial environment, in other words, his psychological reaction to a new way of interaction. Also, his idea of doing his video editing while sitting in an airline seat shows the advantages of such a device. There are now other videos by people wearing the headset in real life, and it’s interesting how people respond to someone wearing one.

    I have come away from these reviews impressed by the Apple developers’ deep thinking about what is needed to function in this device. A remarkable achievement, given that the one we are seeing today is the first edition - we know how previous Apple products developed over time. Like many products, you don’t buy the first edition unless it is a book.

    Further development of functions and reduction in weight are obvious next steps. But it will be app developers who now have real opportunities to create apps beyond the personal use cases and take them out into the industrial and business world…

  19. I think Apple does not want to get involved with the certification and licensing required in many jurisdictions to fill prescriptions and does not want to work on the ramifications of doing it virtually. Even for the in-store demos, they will not work with a written prescription but instead measure your current glasses and provide lenses that should be close enough. For purchase, Apple manages the sale of the inserts, but lens maker Zeiss provides the lenses.

  20. Weird. And complicated. What happens in a household with more than one person?

    doug

  21. At this point, like iOS and iPadOS, visionOS supports only a single user. So a household with more than one person should buy more than one Vision Pro. That said, I understand that the prescription inserts attach magnetically, so they are easily removable and can be replaced with other inserts, if Apple ever sells the inserts separately.

  22. It somehow seems strange that a physical insert is needed. Oh - one question. In my case, I am near-sighted and need glasses when reading my external monitor or walking outside. But if I’m reading a book on my iPhone, for example, I take off the glasses. In that case, do I need a prescription to use the device?

  23. All of that info is here.

    But I understand that the Vision Pro’s focal length design is something like 1.5 meters. I’ve heard the exact distance but can’t recall it; regardless, you’d want correction for being able to see something that far away.

  24. I can’t imagine adopting the Vision Pro anytime soon. But I do appreciate the direction it’s going in. All other VR stuff is either tied to a place (the Massachusetts Museum of Contemporary Art had two Laurie Anderson-designed virtual reality spaces that were truly wonderful, and I’ve had VR Cardboard for a long time) or aimed at some weird entertainment arena (Meta goggles, etc.) that has no appeal to me. The potential of large workspaces, immersive collaboration, and AR/VR in situ looks promising technology-wise. Will Hollywood adopt the Vision Pro for production as well as content? I see a use for that. I wonder what these things will look like in 10 years! So, much thanks for the article wrap-up, and it’s great that the new sponsor Art Authority is delivering art services for the Vision Pro - although, do visit your local art gallery in person sometimes.

  25. In software? I’m not sure how. People who need corrective lenses don’t need content to be bigger. They need the image to be distorted in a way that will produce a focused image on their retinas. Software can’t make an OLED screen do this. No amount of blurring the image can produce a sharper result after it passes through the eyes of someone needing corrective lenses.

    To do this, you’d need to change the light path, which no flat panel screen can do. Maybe some future holographic system might be able to (or maybe not - I really don’t understand holograms), but for now, the only thing you can do is place corrective lenses between your eyes and the screen.

    The AVP has (I think) some ability to mechanically adjust its optics to compensate for different sizes and shapes of faces, in order to align with your eyes. But it would take more than moving the existing lenses to compensate for people with poor vision.

    You would need lenses with a variety of shapes. Correcting myopia (near-sightedness) requires concave lenses. Correcting hyperopia (far-sightedness) requires convex lenses. Astigmatism requires lenses with a non-uniform curve.

    To do that in a single device without requiring you to swap lenses would require something akin to a phoropter, like what an optometrist uses to determine your prescription. Or maybe adaptive optics. Both would be very large, very expensive, and probably couldn’t be used in a consumer device due to medical licensing regulations.

    Hence the Apple design of allowing the insertion of corrective lenses, which would be used in addition to the rest of the lens assembly.

    I suppose it would theoretically be possible to make the AVP larger so you can wear it over your regular glasses, but that would probably make it harder for the eye tracking and iris authentication systems to work reliably, since the image of your eyes would be distorted by the lenses. The presence of the frames in the field of the sensors might also create problems.

    I am a bit surprised, however, that you can’t wear contact lenses while using it (at least Apple says you can’t). Maybe that would also mess up the eye tracking or iris-authentication system, but I’d love to understand how and why.

    I suspect that over time, you’ll find the lens mounting rings available to opticians in much the same way they sell eyeglass frames. You’ll then buy the rings from an optician who will fit prescription lenses to them.

    Probably a bit more tricky than eyeglasses because certain measurements (e.g. position of your pupil relative to the frame) will likely be different, since the AVP adjusts the position of some internal parts to fit your face. But I see no technical reason why any optician shouldn’t be able to fit their own lenses into the mounting rings where the Zeiss lenses go today.

  26. That’s all very interesting. Using the device sounds like an extreme nuisance for contact lens wearers, though. Imagine having to take out and put in your contacts multiple times a day, whenever you take off or put on the device!

    I think Apple needs to find another solution. Do other VR headsets have the same issue?

  27. The link to the Apple document that Doug Miller posted earlier (HT213965) has more detail about contact lenses. Some types are ok.

    If you use single vision soft prescription contact lenses, you can use Apple Vision Pro without ZEISS Optical Inserts — Prescription.

    • If you use hard contact lenses, it might impact your experience with Apple Vision Pro. If you experience difficulty with eye tracking, and your eye care provider has indicated that eye glasses with a comprehensive prescription is an option for you, then you may consider ordering ZEISS Optical Inserts. Otherwise, you may use an alternate form of input such as Pointer Control.
    • Cosmetic contact lenses are not compatible with Apple Vision Pro and should be removed before using the device.
    • If you typically use reading glasses in conjunction with your prescription contact lenses, you may benefit from using ZEISS Optical Inserts — Readers in conjunction with your contact lenses.

  28. For issues that may affect wearers of hard contact lenses, see Mark Z’s initial report:

    I suspect some of the issues are caused by reflections off the surface of the lenses that do not occur for folks with soft contacts or no contacts at all.

  29. BTW, here’s a quick update on my hard contact lens situation.

    Over the weekend I switched to the thicker face pad (AVP comes with two thicknesses) as the device kept warning me that my eyes were “too close” and could result in injury if I fell down while wearing AVP. At first I thought the thicker pad was much worse – everything seemed blurry – but later I tried it again and it was fine and the warnings stopped. On a whim, I redid the eye setup several times, wondering if the thicker pad and moving my eyes further away might make the system work better. But no: I still failed the eye test and it didn’t seem to improve the tracking. I have kept the thicker pad on, though.

    Since then I’ve experimented with a lot of the accessibility options. I finally tried the “wrist” pointer just to see what it is and it’s bizarre: you get a 3D sort of clear light pole emanating from your hand that you can use to point at stuff (it reminds me of those clear plastic rods on window blinds except infinitely long). The pointer makes it much easier to see what you’re pointing at, but is very distracting.

    I switched to head tracking with “pointer control” on, which displays a dot that moves around the screen as you move your head. This works fairly well and is at least more noticeable so you can see what you’re doing. One of the biggest frustrations of the eye tracking is it’s really hard to tell what’s going on when it’s not working. With the head pointer control, you can at least move your head until you find the pointer (the equivalent of jiggling the mouse until you locate the cursor on the screen) and then use it to point at what you want to control.

    However, I discovered three key problems with the head tracking:

    • It’s possible to end up with a window or control in a weird place where your head has trouble directing the cursor. For example, in VP the window close button is at the very bottom. If a tall window appears right in front of me, sometimes its close button is way down south. My chin hits my chest and I literally can’t look any lower – yet I can’t reach the button to close the window! Sometimes you can do some gyrations to make this work, or you can press the digital crown to reset the view (which re-centers all the windows) and that could make them easier to retrieve. You can also use Siri to close the app. It’s a bit of a chore.

    • Pointer control is buggy. Sometimes the pointer just vanishes and you’re in the dark again, unsure what you’re pointing at. It’s like using a Mac with an invisible mouse cursor. You can click on things blindly, but it’s really hard to tell what you’re doing. When you can’t see the pointer, getting back into Settings to reset Pointer Control is a challenge.

    • When I put on VP this morning, Pointer Control was turned off even though I left it on yesterday. I don’t know if that’s normal or a bug, but it’s a problem. If you can’t use eye tracking at all, there’s no way to turn head tracking back on without using your eyes! (Fortunately, I can use some eye tracking. It’s just not super-accurate or consistent.) And it’s not like you can hand an AVP to someone else to have them set it up for you like you could an iPad.

    All in all, the Accessibility features on the AVP are awesome, and I’m glad and impressed they’re included, but they are definitely buggy. Since the device was such a secret even within Apple, I imagine the number of disabled testers was tiny, if any.

    Finally, the best news of all: today I hooked up an Apple Trackpad from one of my Macs. The AVP only supports the newer version of the Magic Trackpad, the one that charges via Lightning and doesn’t use replaceable batteries. Fortunately, I had one of those. It showed up right within Settings under Bluetooth and I activated it.

    It’s wonderful! Not perfect, but it works great 99% of the time. It gives you a pointer you can use to move around the entire AVP interface, and it supports scrolling, long clicks (like Control-clicking on the Mac, which brings up things like word definitions if you have a word highlighted), and other gestures.

    One issue I’ve found is that it seems to be a little too “window-based” – it doesn’t give you a free-roaming cursor that covers the entire display, but one that magically jumps from window to window. In other words, the cursor appears only within the active window. I was testing in the PGA Tour golf app, which has a sidebar of icons that you pick to enter different modes. That floating sidebar seems to be a separate window – sometimes when I got my cursor right on the edge of the main window, it wouldn’t “jump” to the sidebar window, so I couldn’t select those icons. Other times it jumped easily. Perhaps it needs momentum. It was just a little disconcerting, or perhaps it’s something I simply need to learn how to use properly.

    That same issue can be a problem when accessing Control Center. Normally you look up and see the Control Center icon floating there. But again, you can’t move your cursor there with the trackpad until that control is highlighted (active). I was able to highlight it with my gaze, after which the trackpad click worked, but it was a little strange. (It’s so annoying that you can’t use Siri to open Control Center. Siri just says, “You don’t have an app named ‘Control Center.’ Do you want to search the App Store?” Grrr. There really needs to be another way to open it.)

    Overall, though, trackpad control is terrific and a great solution for someone having trouble with eye tracking. It does mean you have another device to carry as part of the AVP, though. The promise of needing no controller but your hands and eyes is defeated – though in my case, that’s down to my eyes, not the device. I could see this being an issue if I wanted to sit on my deck with the AVP on and do some work outside this summer – I’d have to have my laptop nearby to pair with it and keep the trackpad with me. That might not be a problem; I won’t know what a hassle it is until I try it.

    I still have more things I want to test and try, but I’m learning and making progress. The hard contact lens issue definitely caught me by surprise (I didn’t learn about it until I pre-ordered), but it’s not the end of the world.

  30. Thank you for your very complete descriptions of using the Vision Pro with rigid gas-permeable contacts. I fully agree with the poster who said that hard contacts present multiple reflecting surfaces, which confuse eye tracking: whenever there is an abrupt change in the index of refraction, there is a reflection at the interface between the two materials. I guess soft lenses adhere better to the cornea and are less of a problem.

    I am struck that Apple, having two very successful, time-tested pointing systems (mouse and trackpad) for manipulating objects on a relatively stationary and large canvas, seems to need to invent two more: a touch-oriented interface for iPads and a gaze-oriented interface for the Vision Pro. In my opinion, a mouse is a far superior pointing device for an iPad of any size than touch, which, I grant, is necessary on the small screen of an iPhone. Similarly, I don’t see why a trackpad (or mouse) couldn’t be the primary user interface on the Vision Pro. It would be less portable, but we have already seen that using the device while moving is not a very good idea.

  31. The unspoken assumption here is that you’re using an iPad while sitting in front of a flat surface upon which a mouse can move. I rarely use an iPad there, and the places I do would be literally impossible for a mouse (armchair, public transport, in the car, etc.). Given that, for me, the touch interface is not only superior, it makes the iPad possible in a way a mouse wouldn’t.

  32. Sorry, I should have been clearer that almost all productive work on an iPad is done with something like a Magic Keyboard, with the iPad mounted vertically – sort of a laptop replacement where one would use the trackpad (on the Magic Keyboard). Using the touch interface in this mode is very tiring and can lead to “gorilla arm,” something Jobs mentioned. Studies have shown that the ergonomics of touch are poor in this configuration.

    From my experience, I have found that it is hard to do productive work with the iPad flat on a surface (or one’s lap) though one can do certain things like access the web, read email, etc. in that configuration. For one thing, one needs to use the software keyboard, often a frustrating experience. Again, this is my experience, others might do better.

  33. It really depends on what kind of “productive work” you’re talking about.

    My daughter used her iPad Pro extensively while in college, as an electronic sketchbook (she was a theater major) for costume, makeup, hair and set design work. She did most of her work with the iPad lying flat, doing her drawing with an Apple Pencil, mimicking what artists have traditionally used a pad of paper for.

    But if you’re editing Office documents, well that’s a radically different use-case.

  34. Yes, I should have excluded graphics work, where one can do very well with the Pencil and the iPad flat on a surface. Of course, adding text to drawings relies on the onscreen keyboard, and all other interactions use the touch interface, which can be difficult if the touch targets are too close together. When I do CAD, I much prefer a mouse on a Mac to a Pencil on an iPad. Tastes vary.

    On another topic, kudos to iFixit for using the only useful measure of graphics resolution: pixels/degree. Angular resolution is what matters: if I use reading glasses, I can place my face 2 feet from my 24" 4K computer monitor and get the same angular resolution as my 55" 4K OLED TV at 4.6 feet. The apparent resolution, apparent screen size, and image quality will be the same, everything else being equal.

    I want to scream when I hear reviewers talk about having a 10-foot-wide Mac screen when the angular resolution is the same as that of a small screen viewed more closely. For similar (apparent) screen sizes, the angular resolution of the AVP is about half that of a Studio Display, which has 5K pixels across compared to the ~2.5K on the virtual Mac screen used in the AVP. The Studio Display will look much sharper – like going from a non-Retina to a Retina display. (Objects will appear at a similar size on both screens, since the Studio Display uses ~2.5K points, which determines the size of on-screen elements.) Again, the iFixit article explains this beautifully.
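    The pixels/degree claim above is easy to check with a little trigonometry. A minimal sketch (the display sizes, resolutions, and viewing distances are the ones given in the comment; the 16:9 aspect ratio is an assumption):

```python
import math

def pixels_per_degree(diag_in, h_pixels, distance_in, aspect=(16, 9)):
    """Angular resolution (pixels/degree) across the horizontal field of view."""
    w, h = aspect
    width_in = diag_in * w / math.hypot(w, h)  # physical width of the panel
    fov_deg = 2 * math.degrees(math.atan(width_in / 2 / distance_in))
    return h_pixels / fov_deg

# 24" 4K monitor at 2 ft vs. 55" 4K TV at 4.6 ft -- nearly identical:
print(round(pixels_per_degree(24, 3840, 24.0), 1))   # ~81.5 ppd
print(round(pixels_per_degree(55, 3840, 55.2), 1))   # ~81.8 ppd
```

    Both come out to roughly 82 pixels/degree, which is why the two setups look equally sharp despite the very different panel sizes.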

  35. I don’t disagree, but we have to acknowledge that it’s a lot less ambiguous to specify angular rather than spatial resolution on a device like the AVP, where there is a standardized distance from eye to pixel. On a Mac or iPhone, that’s a whole lot harder to get right, since every user has their own preference and perhaps even different uses.

    I like to sit up real close to the 27" display hooked up to my main work Mac (reducing its angular resolution), but I know every ergonomist says that’s a bad thing, and sure enough, plenty of Mac users sit much farther from their 27" displays (increasing the apparent angular resolution of that very same display). It’s similar with the iPhone: in bed I’ll have it 4 inches from my face to read a book, but watching a video on transit, I’ll hold it over a foot from my eyes. Now which angular resolution would I say is the iPhone’s? Spatial resolution gets around that ambiguity.

    Oh, and BTW, the Studio Display has 5K pixels horizontally, not across. Across (diagonally), it’s almost 6K (5874). My understanding is that the AVP has a WUHD display for each eye, giving each eye the same horizontal pixel count as 5K but only 2160 vertically (basically like a wide 4K screen), or 5557 diagonally. My understanding is further that the largest “virtual Mac display” the AVP can show is 4K (3840×2160), which is about 4406 diagonally.
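    The diagonal pixel counts above follow from the Pythagorean theorem. A quick sketch, assuming the resolutions discussed in this thread (5K = 5120×2880, WUHD = 5120×2160, standard 4K = 3840×2160):

```python
import math

# Diagonal pixel count = hypotenuse of the horizontal and vertical counts
panels = {
    "Studio Display (5K)": (5120, 2880),
    "WUHD panel": (5120, 2160),
    "4K virtual display": (3840, 2160),
}
for name, (w, h) in panels.items():
    print(f"{name}: {round(math.hypot(w, h))} px diagonally")
# -> 5874, 5557, and 4406 respectively
```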

  36. Sorry about the ambiguity of the word “across”. I meant horizontally. I think I would have said “diagonally” if I meant that.

    I still hold that there is no problem with angular resolution in any situation. Two devices with the same pixel count and the same angular resolution will subtend exactly the same angle at your eyes and produce identical images on the retina if the content is the same. Neglecting the inability of my eyes to focus close up, an iPhone that is 5 inches from my eyes could look exactly the same as a TV 5 feet away if the angular resolutions are the same. Angular resolution is the sole determinant of perceived resolution, no matter what device is used or how far away it is.

    Maybe we are saying the same thing.

  37. Jason Snell has now posted his review, comparing it to the time when we bought computers not so much because they were useful but because “they were the future.”

    To an extent, that’s a fair comparison—I remember my parents buying my first computer (a Franklin ACE 1000 that was an Apple ][ clone) without knowing exactly what we’d do with it.

    But before I migrated to an Atari 1040ST to go to Cornell, I used that ACE 1000 for a bunch of word processing (every paper I wrote after that in high school), spreadsheet stuff (chemistry and physics lab work, mostly), rudimentary programming with BASIC, and of course games. I also learned a lot about hardware sysadmin tasks in terms of what you could get away with doing to computers, disks, and peripherals. My parents used it too for basic word processing and spreadsheet things—I can still hear the screech of the Epson LX-80 printer we got, which had replaced a daisywheel printer that let me down by breaking its S character when I had to print my college application essay.

    So while the comparison might have been apt for him and others, for me computers were both the future AND had immediate real-world utility that couldn’t be achieved in any other way. And that experience gave me background that informed everything I did in college and professionally after that.

    In that sense, the Vision Pro might be a taste of the future, but I have trouble seeing how it will teach us anything about that future that is important to learn now. Or, to put it another way, I wouldn’t buy one for a 14-year-old to make sure they were exposed to the future early enough.

  38. I think a key difference is that back then we didn’t know what the future was going to be. Now, for this tech, we’ve a pretty good rough idea.

  39. Brian Chen of the NYTimes has what I’d call a slightly negative review. (Link should not be paywalled.)

    The Vision Pro is the start of something — of what, exactly, I’m not sure.

    But the point of a product review is to evaluate the here and now. In its current state, the Vision Pro is an impressive but incomplete first-generation product with problems and big trade-offs. Other than being a fancy personal TV, it lacks purpose.

    Most striking to me about the Vision Pro is, for such an expensive computer, how difficult it is to share the headset with others. There’s a guest mode, but there’s no ability to create profiles for different family members to load their own apps and videos.

    So it’s a computer for people to use alone, arriving at a time when we are seeking to reconnect after years of masked solitude. That may be the Vision Pro’s biggest blind spot.

    For me this seems about right. I’m impressed with the technology, but less so with this particular product.

  40. I read and liked Brian Chen’s review, but I would recommend the current ATP podcast as one of the best discussions I have heard about the device, by three articulate and very computer-literate people, two of whom own the headset. The consensus seems to be that the device is great for movies and other entertainment but of questionable value for productivity.

    One thing I found interesting is some skepticism (which I share) about the utility of a gaze-oriented UI. Frankly, I don’t understand how this can work well – it seems to go completely against the way we use our eyes. We are constantly moving our eyes as we gaze at things, and superimposing a touch control on this behavior sounds like it would be very awkward, tiring, and error-prone.

    Interesting times…

  41. That was fun!

    Thanks!

  42. I just did a Google search for “CAD apps iPad” and turned up “About 8,750,000 results (0.32 seconds)”.

  43. Here’s another review of the Vision Pro from a programmer. He’s very enthusiastic about the device from a productivity perspective. His comments on using it with a Mac are interesting, especially how you can position your virtual display anywhere you want. That’s hard to do with a physical display, no matter how flexible the stand.

  44. I saw that Casey Neistat walked around New York all day with it. I’m not a fan or follower of his, but here’s the link. He said that after a little while, something “clicked” and it all made sense…

    The very last thing he said was “buy Apple stock because this is a new product category Apple will see through”.

  45. A very interesting and well-written review – thanks for the link. His use of the S-curve to explain the likely trajectory of the Vision Pro is good, and the expectation must be that the Vision Pro will follow the same S-curve as most technologies that have gone the distance.

  46. BTW, yesterday I took my Vision Pro and had my uncle try it. It’s the first time I’ve tried guest mode for another user. He’s 78 and since he had cataract surgery, he doesn’t need glasses, so it was worth a try. He’s a computer guy – he’s the one who got me into computers when he showed me his Osborne in 1981. (After seeing it, I switched from saving for an IBM Selectric typewriter to saving for a computer.)

    My uncle has been reading about the Vision Pro, and while he probably isn’t interested in buying one (he buys 10-year-old computers and tinkers), he was super-excited to see mine. He is paralyzed and in a wheelchair, and his hands are rather gnarled (limited finger movement), so I wasn’t at all sure how it would go. We just did it in my car with him in the passenger seat. He held up his hands for the initial scan, and it worked even though his fingers wouldn’t extend!

    He then did the eye scans. He struggled with the first one because he didn’t yet know how to use the headset. I reminded him how to look at the dots and “tap” by pinching two fingers. It took him about a minute to do the first scan, and then he breezed through the other two in 30 seconds total. Way faster than I did! Mine always failed because of my hard contacts, but his reported “Eye scan complete” and let him use the Vision Pro.

    I had him fire up the impressive rhino video. Unfortunately, I’d forgotten the only internet was tethering with my iPhone and cell coverage is terrible at the coast where he lives, so the video took forever to stream. But he got a few seconds of it finally and I heard him audibly gasp. Then he was all, “Woah! The rhinos are all around me. They look so real!” Then he was speechless for a while. It was pretty cool.

    He looked at some of the photos in my Photos library and was wildly impressed. He had a bit of a struggle moving the windows around to where he could see them more comfortably – “grabbing” them can be tricky the first few times. But within 60 seconds of use, he was already using the eye tracking and finger clicking, and as I told him, with a few more minutes of practice he’d be a master. We didn’t wait that long – it was just a quick 10-minute demo. I should have thought ahead and downloaded some media for him. Next time!

    But he came away wowed. I was so impressed that he was even able to use it. This could be a huge device for the physically impaired. With it he could “travel” and experience life in ways he never could in real life. The lack of a keyboard isn’t a problem since he can’t really type anyway (he works on real keyboards the same finger-poke way you do with the virtual one in Vision Pro).

  47. Wow! Yours is the first review of the Apple Vision Pro for its optimum, though not intended, audience! Not flaky Gen X curio techs, but elderly, semi-mobile, or incapacitated intelligent people who want and NEED isolated but superb displays! What a boon to them! Deeper pockets, often situated in a head-supported recliner… without the need for a mouse, keyboard, or trackpad.
    Go further with this… interview him or video his experience!

  48. Here’s a look at the vision accessibility aspect of the Vision Pro.

  49. Here’s a look at the Vision Pro from an enterprise user.

  50. A bit of a contrarian view from Alison Sheridan, whose try-out experience wasn’t the best.

  51. I’m not sure what that tells us, since she didn’t use Guest Mode, tried the headset while it was set up for someone else’s eye tracking, and then was critical when the eye tracking didn’t work for her…

    🤦

  52. Joanna Stern’s One-Month follow-up: in the Wall Street Journal or on Apple News+.

    By the way, I find the reported return rate of about 1% astoundingly low. The attitude of the buyers I chatted with was that this was an unknown, and they had two weeks to figure out whether to keep it. If 99% of the folks are keeping it, that is a terrific acceptance rate for an experimental device.
