YouTube is probably one of the parts of the internet I consume the most, so I was more than a little sad when YouTube announced that they don’t have plans to build a visionOS app, and disabled the option to load the iPad app. This leaves you with Safari, and the website is okay, but definitely doesn’t feel like a visionOS app. Couple that with visionOS not having the option to add websites to your Home Screen, and YouTube isn’t that convenient on visionOS by default.
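To put the “by default” in perspective, the most bare-bones workaround is wrapping the website yourself. A minimal sketch, assuming a tiny SwiftUI app and that WKWebView behaves on visionOS the way it does on iPad (the type names here are hypothetical, not an existing project):

```swift
import SwiftUI
import WebKit

// Hypothetical minimal wrapper: load youtube.com in its own window so it
// isn't just another Safari tab. Purely illustrative.
struct YouTubeWebView: UIViewRepresentable {
    let url = URL(string: "https://www.youtube.com")!

    func makeUIView(context: Context) -> WKWebView {
        let config = WKWebViewConfiguration()
        // iOS-style flag; assuming it carries over to visionOS so videos
        // play inline instead of taking over the whole view.
        config.allowsInlineMediaPlayback = true
        let webView = WKWebView(frame: .zero, configuration: config)
        webView.load(URLRequest(url: url))
        return webView
    }

    func updateUIView(_ uiView: WKWebView, context: Context) {}
}

@main
struct YouTubeWrapperApp: App {
    var body: some Scene {
        WindowGroup {
            YouTubeWebView()
        }
    }
}
```

That gets you a dedicated YouTube window, but it still doesn’t feel like a visionOS app, which is the whole problem.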
Can someone help me understand the point of the Vision Pro? It’s not VR. And every app and screenshot I’m seeing looks like “let’s throw this window, which you could normally have on your desktop or TV, into your field of view.” Are there any mechanisms for it to interact with your surroundings in an AR-style manner? Or does it just overlay flat windows on top of what you’re seeing?
Afaik there are only a handful of actual AR features right now:
Application windows stay anchored to your environment until you reset them. If you put them on top of your desk, they’ll stay there even if you move to your kitchen.
When you look at your Mac, it’ll sometimes pop up a button that lets you start a virtual display for your Mac.
When you look down at your Bluetooth keyboard, it’ll show whatever you type in a floating box, complete with suggestions.
Maybe there are more I’m not aware of.
There we go! That’s some stuff that I was missing. Thank you!
Here’s a guided tour of it if you want to learn more
I want it for the virtual monitor aspect. Especially since I have ADHD, I think this might cut down on distractions.
It’s AR and VR in whatever mix you want it to be. The little spinny dial at the top controls how much of the real world you see vs how much of a virtual environment you see. The bottom end is full AR, the top end is full VR.
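For what it’s worth, that dial is the Digital Crown, and the mix maps pretty directly onto the developer-facing API. A minimal sketch, assuming visionOS’s SwiftUI scene types behave as documented: an ImmersiveSpace with the .progressive style is the mode where the Crown blends passthrough and the virtual environment.

```swift
import SwiftUI
import RealityKit

// Sketch of how an app opts into the AR/VR blend. With .progressive,
// the Digital Crown dials between passthrough (AR-ish) and a fully
// virtual environment (VR-ish); .mixed and .full are the two extremes.
@main
struct MixedRealityApp: App {
    @State private var immersionStyle: ImmersionStyle = .progressive

    var body: some Scene {
        WindowGroup {
            Text("Open the immersive space, then turn the Digital Crown.")
        }

        ImmersiveSpace(id: "environment") {
            RealityView { content in
                // Placeholder content; a real app would load a scene here.
                content.add(ModelEntity(mesh: .generateSphere(radius: 0.2)))
            }
        }
        // The space itself is opened with the openImmersiveSpace environment action.
        .immersionStyle(selection: $immersionStyle, in: .mixed, .progressive, .full)
    }
}
```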
Preparation for the future. If you want an actual VR device, get an Oculus.
That’s just it: I don’t. I want AR, but just throwing up a random flat window in my field of view without interacting with the environment is not AR. It’s just your monitor with a dynamic background.
It’s going to take time for meaningful AR apps to exist, because this is the first device you can even test them on in a functional way.
But ARKit is already out there and extremely capable on the iPhone. The Vision Pro will be able to do way more than the phone thanks to the field of view and having your hands free.
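A minimal sketch of what “ARKit on iPhone” already covers (the class and delegate names are the standard ARKit/SceneKit ones; the rest is illustrative): world tracking plus detection of real surfaces, with a callback whenever the phone anchors a newly found plane.

```swift
import UIKit
import SceneKit
import ARKit

// Bare-bones iPhone AR view controller: track the camera and report
// real-world surfaces (tables, floors, walls) as ARKit finds them.
class PlaneFinderViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal, .vertical]  // tables, floors, walls
        sceneView.session.run(config)
    }

    // Called whenever ARKit anchors a newly detected surface in the real world.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let plane = anchor as? ARPlaneAnchor else { return }
        print("Found a \(plane.alignment == .horizontal ? "horizontal" : "vertical") surface")
    }
}
```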
This isn’t even technically AR. It’s all VR.
Mixed Reality is what this is called
“Spatial Computing” is what this is called. insert SpongeBob meme
Sure it is.
Applications are obviously limited while developers don’t have the device yet, but the tooling is all there to interact with and modify your perception of objects in the real world. ARKit is already reasonably well tested on mobile. It’s just more/better input and output.
The fact that the real world is passed through to a low-latency display doesn’t make it not AR.
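To make that concrete, here’s a hedged sketch of the kind of real-world interaction the tooling already supports, assuming RealityKit’s plane-classification anchors behave as documented: pin a virtual object to an actual table so it stays put as you walk around the room.

```swift
import SwiftUI
import RealityKit

// Illustrative only: anchor a virtual box to a real table detected by the
// headset. Assumes this view is shown inside an ImmersiveSpace so the
// system is allowed to anchor content to real surfaces.
struct TableTopView: View {
    var body: some View {
        RealityView { content in
            // Ask the system for a horizontal surface classified as a table.
            let tableAnchor = AnchorEntity(
                .plane(.horizontal, classification: .table, minimumBounds: [0.3, 0.3])
            )

            // A simple box "sitting" on top of the real table.
            let box = ModelEntity(
                mesh: .generateBox(size: 0.1),
                materials: [SimpleMaterial(color: .blue, isMetallic: false)]
            )
            box.position.y = 0.05  // rest on the surface rather than intersect it
            tableAnchor.addChild(box)

            content.add(tableAnchor)
        }
    }
}
```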