There’s very little discussion about VR on this forum (and zero threads with a VR tag in this category). Was it a tech demo where development just stopped after the first version shipped? Is it waiting for Eevee Next to receive updates?
I’m just curious because I tried it with a few scenes I thought would be easy: around 60k triangles each, anywhere from two to a dozen objects, half a dozen lights, contact shadows in one scene (but no difficult materials), materials that use a bit of noise in another… and on a 3090, I got about 10 fps. Even reprojection was absolutely tanking (solid mode was fine, of course).
But then I tried it with an animal with very detailed fur, and I thought this wasn’t going to end well, but it actually performed better than my scenes! So then I thought I’d go crazy and just open a movie project we recently did (medium-sized for us, a small ID company): 1.5 million triangles, 500 objects, at least one material that does some fairly involved stuff, massive contact shadow distance and AO distance… and not only did I get what looked like 30 fps (which is still low for VR), I was able to play the movie and glide along with the camera and see everything, including geometry nodes animations, which was pretty mind-blowing.
So, a very uneven experience. Granted, Blender does a lot, which makes it difficult to optimize, but now I’m very curious what is happening in this area (we’re so, so very close to pulling the trigger on moving to Unreal where I work).
(Also, just one small detail which would improve the UX by 1000%: if I aim the laser pointer at something whose normals point up, don’t place my headset directly on it. It’s probably a floor, so it would make more sense to place the headset at my body height above it. We already do this in Unreal, where one has direct control over the navigation implementation. Also, I wasn’t able to figure out how to switch from movement to manipulation, but maybe that’s not implemented yet.)
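For what it’s worth, the floor heuristic I mean is simple to sketch. This is a minimal, self-contained Python mock of the idea, not Blender’s actual VR session API — the function names, the eye-height constant, and the dot-product threshold are all my own assumptions:

```python
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float

    def dot(self, other: "Vec3") -> float:
        return self.x * other.x + self.y * other.y + self.z * other.z

EYE_HEIGHT = 1.7           # assumed default body height in meters
FLOOR_DOT_THRESHOLD = 0.7  # hit normals within ~45° of straight up count as "floor"

def teleport_target(hit_point: Vec3, hit_normal: Vec3) -> Vec3:
    """Where to place the headset for a laser-pointer hit.

    Assumes hit_normal is already normalized and the world is Z-up,
    as in Blender.
    """
    up = Vec3(0.0, 0.0, 1.0)
    if hit_normal.dot(up) >= FLOOR_DOT_THRESHOLD:
        # Upward-facing surface: treat it as a floor and stand on it.
        return Vec3(hit_point.x, hit_point.y, hit_point.z + EYE_HEIGHT)
    # Walls, ceilings, etc.: fall back to placing the headset at the hit point.
    return hit_point
```

Aiming at a flat floor at z = 0 would then put the headset at z = 1.7 instead of clipping the camera into the ground, while aiming at a wall keeps the current behavior.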