Apple Vision Pro: EyeSight, comfort, and more impressions from my latest demo

Vision Pro pre-orders begin on Friday, and Apple invited me to New York City this week to spend some more time with its first spatial computer.

Right off the bat, I’ll make it clear that this is not a review. My demo today was around 30 minutes, which is not long enough to draw any firm conclusions. Stay tuned for a full review sometime in the future. That being said, I have a lot of thoughts and impressions that I want to share about my most recent demo.


Before you read this, check out my initial hands-on with Vision Pro from WWDC last year!

Vision Pro comfort, fit, and sizing

The most important aspect of Vision Pro is comfort. To get started, I scanned my head using a process that’s somewhat similar to setting up Face ID on a new iPhone. You look up, down, left, and right, then repeat the process a second time. This is the same process that people pre-ordering Vision Pro on Friday will go through.

With Vision Pro, Apple is including two bands in the box: the Solo Knit Band and the Dual Loop Band. Both of these options were available during my demo, but Apple suggested I use the Dual Loop Band.

During my WWDC demo, I used the Solo Knit Band paired with a separate top strap. That configuration is no longer offered; Apple has instead settled on two options: the Solo Knit Band and a differently designed Dual Loop Band.

While the Solo Knit Band is absolutely the more aesthetically pleasing option, the Dual Loop Band did a great job of distributing the weight of Vision Pro. It took me a few tries to adjust the sizing of each of the two bands to my liking, but once I found the sweet spot, I didn’t have to adjust it again.

Is Vision Pro heavy? Yes, absolutely. It feels heavy in your hands when you pick it up, and it feels heavy when it’s attached to your face, at least at first. In my experience, I noticed the weight at the very start of my demo, but it gradually faded away as I got used to it over the course of the 30-minute demo.

I didn’t find myself as affected by Vision Pro’s weight as other people who experienced the same demo. Maybe it’s because I’m used to wearing AirPods Max for multiple hours every day, or maybe it’s because I have a big head. Who’s to say?

You can see some pictures above that Apple took of me wearing Vision Pro. I think it’s hard to make anyone look normal while wearing this thing. You can see where my hair is slightly stuck in the front, but I didn’t notice this until I actually saw the pictures after the fact.

I’m eager to spend an extended amount of time with Vision Pro to get a better feeling of its comfort. Is it going to be enjoyable to sit down and watch a 2.5-hour movie with Vision Pro strapped to your face? I think so, especially if you lean back, but I’m not jumping to any firm conclusions yet.

Vision Pro is very much a “your mileage may vary” product. It’s going to feel great on some people’s heads, and not so great on other people’s. In many ways, it’s similar to AirPods and AirPods Pro. Some people just can’t use AirPods Pro’s in-ear tips. It’s something Apple will have to contend with as time progresses and it releases future iterations of Vision Pro.

In the meantime, I’m very glad that Apple is including two different strap options in the box with Vision Pro. Hopefully, this boosts the chances that people find a fit that’s comfortable for them.

visionOS: Gestures, apps, productivity, and more

Diving into visionOS, I was surprised at how well I remembered the gestures that I first learned at WWDC. I think this speaks volumes about the thought Apple put into the intuitiveness of Vision Pro. There’s a learning curve, but once you learn the different methods of interactivity, it comes naturally. It’s a lot like riding a bike.

As I wrote in my initial Vision Pro recap last year, one of the most impressive aspects of Vision Pro is how well it recognizes hand gestures.

You may look at pictures of people wearing Vision Pro and assume that you have to hold your hands out in front of you while performing the different gestures. This isn’t actually the case. I could have my hands resting on my lap, seemingly out of sight of the headset. Still, Vision Pro could pick up on any gestures and respond instantaneously.

One of the things I was most excited to try was the visionOS keyboard, which I wasn’t allowed to try at WWDC. There are essentially two ways of interacting with the visionOS keyboard. You can reach out and type on it by “pressing” the keys on the virtual keyboard floating in front of you. Or, you can look at each key, and tap your index and thumb together, as you do elsewhere throughout visionOS.

I used the keyboard in Safari to type in “,” which is certainly not the easiest domain to type, given you have to switch back and forth between the letter and number layouts. I also used dictation to visit “,” which was a surprisingly fast process.

I absolutely couldn’t write this very story using the visionOS keyboard. That would require an external Bluetooth keyboard. I could absolutely see myself using the visionOS keyboard to fire off a quick iMessage or jot something down in the Notes app, though. It’s certainly not a “complete write-off” as some other people have suggested.

In Keynote, I immersed myself in the Steve Jobs Theater to imagine I was on stage preparing for an important speaking event. My slides appeared in front of me, allowing me to reference them and see my notes.

While there’s a collection of apps built specifically for visionOS, the Vision Pro will also be able to run iPad apps (unless the developer opts out). I tried out the Yummly app for iPad on Vision Pro and it worked pretty well. You can adjust the window size, move it freely around the landscape, and place it alongside apps designed for Vision Pro. These apps appear in a dedicated “Compatible apps” folder on the visionOS home screen.

Once I had a few windows open at the same time and was ready to end things, I simply said, “Siri, close everything” and all the floating windows disappeared.

Spatial photos and videos

Spatial videos continue to be one of the most impressive aspects of Vision Pro. I was blown away by the feature when I spent time with Vision Pro at WWDC, and even more so this time around.

During my demo, I saw a handful of spatial photos and spatial videos. Some of the spatial videos were shot on iPhone, while some were shot on Vision Pro. There’s a clear difference in quality between the two, with the ones shot on Vision Pro having more depth thanks to the 3D camera. Still, the results from the iPhone 15 Pro Max were quite compelling.

At WWDC, we had no idea that the iPhone 15 Pro would be able to shoot spatial video. Instead, Apple had led us to believe that you could only capture spatial videos while wearing Vision Pro. This really tempered my excitement for spatial videos. Would I really put on Vision Pro to walk around and take spatial videos? Probably not.

But now we know spatial videos can be shot with iPhone 15 Pro. That’s a big deal, and it makes spatial videos a vastly more compelling Vision Pro feature. I already find myself wishing I had a bigger library of spatial videos. I plan on making it a priority to take more spatial videos going forward.


Apple showed me the (almost) same sizzle reel of Immersive Videos that it showed me at WWDC, although I did notice that the clip of an NBA game had been removed. Apple Immersive Videos are incredibly impressive, but the big question is how many of those videos are actually coming.

My favorite Immersive Video is “Alicia Keys: Rehearsal Room,” which takes you inside a music studio with Alicia Keys. Apple has confirmed that this is one of a few Immersive Videos that will be available when Vision Pro launches.

The most impressive demo, though, was in the Disney+ app. As Apple announced earlier this week, Disney+ will be available on Vision Pro with a set of custom environments. In those environments, you’ll be able to watch any Disney+ content. For instance, I watched a trailer for a Star Wars movie while sitting in Luke Skywalker’s landspeeder, overlooking the planet Tatooine.

Watching a movie in the Disney+ Theater (which is apparently inspired by the historic El Capitan Theatre in Hollywood) is going to be incredible.

I also got to try out the JigSpace app for Vision Pro, which put a life-size Alfa Romeo C43 Formula 1 car right in the demo room of Apple’s New York City building. I could tap and pull apart different pieces, rotate them, and even see the light reflect off of the side mirrors. I have no idea how useful this actually is, but it’s easy to think of similar models that could exist for other things.


EyeSight

One of the big differences between my hands-on time with Vision Pro this week and what I experienced at WWDC is EyeSight. This is Apple’s feature that uses Vision Pro’s outward-facing display to show your eyes to the people around you while you are using apps or fully immersed. I didn’t get to see EyeSight in action at WWDC, but this time I saw the feature while an Apple employee was wearing Vision Pro.

The feature is built around the Vision Pro’s Persona feature, showing a rendering of the person’s eyes on the outward-facing display. For example, when the Apple employee blinked, I could see the rendering of their eyes on the display also blink.

When the Apple employee was using an app, a blue gradient appeared on the external display to signal to people they were actively focused on something. When they went into a fully immersed environment, a moving gradient similar to the top of a HomePod appeared. As I approached the person and they started talking, their “eyes” broke through.

When the Apple employee took a picture using the dedicated camera button on Vision Pro, the outward-facing display flashed, similar to a camera shutter.

EyeSight is a weird feature, there’s no denying that. Apple says that EyeSight is Vision Pro’s way of keeping you connected to the world around you while you have the device strapped to your face. I’m not sure how well this will translate to the real world, but kudos to Apple for trying. Even if it doesn’t catch on, it’s an impressive feature.

Wrap up

I can’t deny that I’m excited about Vision Pro. I’ve had two demos courtesy of Apple so far, but these have been highly controlled demos. I’m eager to actually sit down in my house, put on Vision Pro, and spend hours combing through every little detail. I still have a lot of unanswered questions.

When I tried Vision Pro at WWDC, I was blown away. In the months since then, my excitement ebbed and flowed a bit. Now that I’ve had the chance to spend more time with Vision Pro, however, I’m reminded of why I was so excited after my WWDC demo. It really is an incredible product.

At the same time, Vision Pro is a first-generation product. It’s perhaps the most “first-gen” product Apple has ever released. That doesn’t mean it’s not a great product. And more so than any Apple product before it, Vision Pro has a learning curve. There’s an obvious learning curve for the person wearing it, but there’s also one for society as a whole – especially when it comes to EyeSight.

The Vision Pro interface is delightful and whimsical, with beautiful animations and an attention to detail that only Apple can pull off. The hardware is something to marvel at, even if there are drawbacks. The tech specs are equally impressive.

Vision Pro is truly something only Apple could create, and I think it sets the stage for a very exciting future. This is going to be fun!

Follow Chance: Threads, Twitter, Instagram, and Mastodon. 
