Will EEVEE Support VR (Equirectangular) Rendering?

Woah, thank you so much! It also works fine with the Blender Alpha from yesterday’s buildbot (I tested equirectangular 360°).

Great script!

Currently, the way to run the script is to open it in the Text Editor, press Run Script, and a toolbar should then appear on the left side of your 3D viewport. Sometimes the toolbar is hidden, so you just have to drag the little arrow on the left of your 3D viewport to the right.
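For anyone wondering why a toolbar appears at all: a script can add panels to the 3D viewport by registering a `bpy.types.Panel`. This is not the add-on’s actual code, just a minimal sketch of the mechanism (the class name, label and tab are made up, and the exact region the panel lands in differs between Blender versions):

```python
import bpy

class VIEW3D_PT_pano_tools(bpy.types.Panel):
    """Hypothetical panel, only to illustrate how Run Script can add a viewport toolbar."""
    bl_label = "360 Render"        # illustrative label
    bl_space_type = 'VIEW_3D'      # attach the panel to the 3D viewport
    bl_region_type = 'UI'          # collapsible sidebar; older builds used the left 'TOOLS' region
    bl_category = "Tool"           # tab the panel appears under

    def draw(self, context):
        self.layout.label(text="Panel registered by the script")

bpy.utils.register_class(VIEW3D_PT_pano_tools)
```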

I posted this in the coding section all the way back in June, but it didn’t get all that much traction. There’s still a lot of work to be done on this, but at the moment I have no means to continue working on it, at least for the next few weeks!

It’d be cool if Blender had a way to view the current scene in the headset. Not in real time per se, but as a rendered cube map. V-Ray does it by basically rendering a (stereoscopic) cube map from the current camera position, and as long as you’re not changing anything in the scene (which would cause rendering to start over) you can look around inside the VR headset while it is still refining the render. In Eevee this would basically mean you’d get an almost instantly clean image to look around in, with only things like shadows and volumes still refining.
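As a rough illustration of the cube-map idea (not any particular renderer’s implementation): six cameras with a 90° field of view sharing one position cover the full sphere. A Blender Python sketch of that setup could look roughly like this (names, rotations and resolution are just one convention):

```python
import bpy
from math import radians

# Six 90-degree cameras at one point whose renders tile a cube map.
ROTATIONS = {
    "front": (90, 0, 0),    # looks along -Y
    "right": (90, 0, 90),   # +X
    "back":  (90, 0, 180),  # +Y
    "left":  (90, 0, -90),  # -X
    "up":    (180, 0, 0),   # +Z
    "down":  (0, 0, 0),     # -Z (a fresh camera already looks down its local -Z)
}

scene = bpy.context.scene
scene.render.resolution_x = 1024    # cube faces must be square
scene.render.resolution_y = 1024

for name, rot in ROTATIONS.items():
    cam_data = bpy.data.cameras.new(f"cube_{name}")
    cam_data.angle = radians(90)                  # 90° FOV so six faces cover the sphere
    cam_obj = bpy.data.objects.new(f"cube_{name}", cam_data)
    cam_obj.rotation_euler = tuple(radians(a) for a in rot)
    scene.collection.objects.link(cam_obj)

# Each face could then be rendered by making its camera active, e.g.:
# scene.camera = cam_obj; bpy.ops.render.render(write_still=True)
```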

Ah, this is great! It will help me work with Eevee before a final render in Cycles. Thank you!

I have a question: I noticed that the output from your script is much darker / has different lighting, and I’m curious to understand why. Also, it doesn’t seem to fix all the stitch issues. Great tool nonetheless. It saves time.

The reason for the difference in lighting is that the script internally changes the colour space to Linear, since otherwise there were issues; that’s something I might look into in the future. Some of the stitch issues can be caused by bloom, which I also want to fix at some point, although I’m not sure how I would go about doing that.
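For what it’s worth, one plausible place such a colour-space switch shows up in a script (purely a sketch, not the add-on’s actual code) is the scene’s view transform: rendering with a Raw/linear transform while the viewport uses Filmic would explain a darker-looking result. Saving and restoring the setting looks roughly like this:

```python
import bpy

scene = bpy.context.scene
original_transform = scene.view_settings.view_transform   # e.g. 'Filmic'

# 'Raw' skips the display transform, so the saved image looks darker/flatter
# than the Filmic viewport (value assumes the default colour management config).
scene.view_settings.view_transform = 'Raw'
try:
    bpy.ops.render.render(write_still=True)
finally:
    # Restore the original transform so normal work isn't affected.
    scene.view_settings.view_transform = original_transform
```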

Ahh, thank you for your answer, this is very helpful. I was using bloom in my scene when I tried to export my own cube map using my own six cameras. I was about to give up and try rendering this in Cycles, but the render time for an 8K 60 fps animation would be insane even for subpar results. I’ll give stitching a cube map another try then… without the bloom.

Do you think a fake bloom could be added in post instead? That might be easier than trying to fix bloom.
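If it helps, the “fake bloom in post” idea can be sketched with the compositor’s Glare node set to Fog Glow. The node identifiers are real Blender ones; the threshold value is just an example, and the Glare properties assume a 2.8x-era build:

```python
import bpy

scene = bpy.context.scene
scene.use_nodes = True
tree = scene.node_tree
tree.nodes.clear()

# Render Layers -> Glare (Fog Glow) -> Composite, as a bloom-like post effect.
render = tree.nodes.new("CompositorNodeRLayers")
glare = tree.nodes.new("CompositorNodeGlare")
glare.glare_type = 'FOG_GLOW'   # soft, bloom-like glow
glare.threshold = 1.0           # only pixels brighter than this start to glow
composite = tree.nodes.new("CompositorNodeComposite")

tree.links.new(render.outputs["Image"], glare.inputs["Image"])
tree.links.new(glare.outputs["Image"], composite.inputs["Image"])
```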

I don’t know nearly enough about Blender to implement that at the moment; that might be something to look into in the future!


Hi EternalTrail, nice addon by the way.

I guess you are stitching the images together with an alpha channel.
There is a problem with alpha and bloom.

see here:


Finally this is happening…

  • Panoramic Camera: By rendering up to 6 views we can cover any panoramic projection by reprojecting the sample.

https://developer.blender.org/T93220
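For anyone curious what reprojecting up to six views means in practice, the gist is: map each panorama pixel to a view direction, then pick the cube face (i.e. which of the six 90° renders) that direction falls on. A small illustrative sketch, using one of several possible axis conventions:

```python
import math

def equirect_to_cube_face(u, v):
    """u, v in [0, 1): horizontal/vertical coordinates of the equirectangular panorama."""
    lon = (u - 0.5) * 2.0 * math.pi      # longitude: -pi .. pi
    lat = (0.5 - v) * math.pi            # latitude: pi/2 (up) .. -pi/2 (down)

    # Direction vector for this pixel (axis convention is illustrative).
    x = math.cos(lat) * math.sin(lon)
    y = math.cos(lat) * math.cos(lon)
    z = math.sin(lat)

    # The dominant axis decides which of the six face renders to sample.
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        return "+X" if x > 0 else "-X"
    if ay >= ax and ay >= az:
        return "+Y" if y > 0 else "-Y"
    return "+Z" if z > 0 else "-Z"
```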

This will make my VR rendering a lot easier.


I’ve waited for this like a movie premiere, queuing around the block.
Finally, true spatial potential for the web.