Will EEVEE Support VR (Equirectangular) Rendering?

Thank you so much for the procedure, @csimeon .

I still had to go through some of the (outdated) tutorials on the Hugin website and some video tutorials to get the hang of it, but with the settings you mentioned I was good to go and didn’t even need to set any control points. I don’t see any seams whatsoever, though that could be because my environment texture is a landscape instead.

I also read your thread about interior design linked above, and it was interesting to see how you came up with various possible solutions to the seams problem and how you worked it out.

Well, I guess there must be more people interested in this, and there will be more:

1. The Hugin UI isn’t that easy to figure out (the basic wizard makes it even more confusing, IMHO).
2. You get faster render times in Eevee, which is always nice.
3. You get rid entirely of the fireflies and grain induced by engines like Cycles.
4. I suppose you can get a different aesthetic than you would in Cycles.

I, for instance, am a Blender Internal user and never quite made the leap to Cycles because I’m not after that kind of realism, but Eevee gets me what I want and I’m happy with it, and it’s cool to be able to make environment maps with it.

With a fee and a well-explained step-by-step setup (video included), I think it would have potential. I too need equirectangular for Eevee.

You’re welcome. Glad you think it’s useful.
I was surprised how little interest there was, because I find 360’s exciting.
@DavidRivera thanks.

I guess too many people are more worried about the 2.8 final than about this kind of feature. Stereoscopy is another abandoned area, but not for this soldier… no stereo is left behind… and that includes VR :slight_smile: and 360º video rendering.

To export multiple cameras at once, see this: https://www.youtube.com/watch?v=U3KlJNiw12k&fbclid=IwAR057J9LWbQ9RBHvqrLiQfQN5lcSx_b8e7nXyb9cv9frjR6yj2Y-mumgJoY

Yes, that was the first result that popped up a while ago when I was researching this. I can say it’s not convoluted; I’m just wondering how it will turn out if I take the time to set it up.

Is there any more news on this?
Equirectangular for VR in Eevee sounds like a must-have for 2.8.


Are you talking about this?


Shameless plug here, but it’s relevant: I wrote a script for Blender which renders equirectangular images with a variable FOV in Eevee. You can download the script at https://github.com/EternalTrail/eeVR. It’s far from perfect, but it does the job.
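For anyone curious what an equirectangular renderer does under the hood: the core trick is mapping each output pixel to a view direction on the sphere, so the right sample can be fetched from a conventional (e.g. cube-face) render. This is a generic sketch of that mapping, my own illustration rather than code from eeVR, and the axis convention here is an assumption:

```python
import math

def equirect_to_direction(u, v):
    """Map normalized equirectangular coordinates (u, v) in [0, 1]
    to a unit view-direction vector (x, y, z).

    Convention (an assumption, not necessarily eeVR's):
    u wraps longitude from -pi to pi, v spans latitude
    from -pi/2 (bottom) to +pi/2 (top); (0.5, 0.5) looks
    straight ahead along -z.
    """
    lon = (u - 0.5) * 2.0 * math.pi   # -pi .. pi
    lat = (v - 0.5) * math.pi         # -pi/2 .. pi/2
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = -math.cos(lat) * math.cos(lon)
    return (x, y, z)
```

The renderer then evaluates this per pixel (in practice on the GPU) and looks the direction up in the captured views, which is also why seams can appear exactly where adjacent captures meet.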


Thanks @EternalTrail
Installing the .py or the .zip didn’t seem to do anything for me. I couldn’t see any checkbox.
Any ideas?

Ooh, I might have a use for this soon, fantastic timing. Thanks a lot!
Have you considered posting it in the addon section here?

Woah, thank you so much! Works fine also with Blender Alpha from yesterday’s buildbot (I tested equirectangular 360°).

Great script!

Currently, the way to run the script is to open it in the Text Editor, press Run Script, and a toolbar should then appear on the left side of your 3D Viewport. Sometimes the toolbar is hidden, so you just have to drag the little arrow on the left of the 3D Viewport to the right.

I posted this in the coding section all the way back in June, but it didn’t get all that much traction. There’s still a lot of work to be done on it, but at the moment I have no means to continue working on it, at least for the next few weeks!

It’d be cool if Blender had a way to view the current scene in a headset. Not in real time per se, but as a rendered cube map. V-Ray does it by basically rendering a (stereoscopic) cube map from the current camera position, and as long as you’re not changing anything in the scene (which would cause rendering to start over), you can look around inside the VR headset while it is still refining the render. In Eevee this would basically mean you get an almost instantly clean image to look around in, with only things like shadows and volumes still refining.

Ah this is great! It will help me work with eevee before a final render in cycles. Thank you!

I have a question: I noticed that the output from your script is much darker / has different lighting, and I’m curious to understand why. Also, it doesn’t seem to fix all the stitching issues. Great tool nonetheless; it saves time.

The reason for the difference in lighting is that internally the script changes the colour space to Linear, since otherwise there were issues. That’s something I might look into in the future. Some of the stitching issues can be caused by bloom, which I also want to fix in the future, although I’m not sure how I would go about doing that.
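For context on why a Linear render looks darker: sRGB encoding brightens mid-tones for display, so raw linear values appear dark until they are re-encoded. A minimal sketch of the standard sRGB transfer functions (from IEC 61966-2-1, independent of what the script itself does):

```python
def srgb_to_linear(c):
    """Decode an sRGB-encoded channel value (0..1) to linear light."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    """Encode a linear channel value (0..1) for sRGB display."""
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055
```

Note how a display-referred mid-grey of 0.5 corresponds to only about 0.21 in linear light, which is roughly the darkening you see when a linear image is viewed without the encoding step.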

Ahh, thank you for your answer, this is very helpful. I was using bloom in my scene when I tried to export my own cube map using my own six cameras. I was about to give up and try rendering this in Cycles, but the render time for an 8K 60 fps animation would be insane, even for subpar results. I’ll give stitching a cube map another try then… without the bloom.

Do you think a fake bloom could be added in post instead? That might be easier than trying to fix bloom.

I don’t know nearly enough about Blender to implement that at the moment; that might be something to look into in the future!
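For what it’s worth, a fake bloom in post can be as simple as thresholding the bright pixels, blurring that bright pass, and adding it back on top. A naive, illustrative sketch on a grayscale image; every name and parameter here is hypothetical, not from any add-on:

```python
def fake_bloom(img, threshold=0.8, strength=0.5, radius=1):
    """Naive post-process bloom on a 2D grayscale image
    (a list of rows of floats in 0..1+):
      1) keep only pixels above `threshold`,
      2) box-blur that bright pass with the given `radius`,
      3) add the blurred pass back, scaled by `strength`.
    """
    h, w = len(img), len(img[0])
    bright = [[p if p > threshold else 0.0 for p in row] for row in img]
    blurred = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc, n = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += bright[yy][xx]
                        n += 1
            blurred[y][x] = acc / n
    return [[img[y][x] + strength * blurred[y][x] for x in range(w)]
            for y in range(h)]
```

Doing this after stitching would sidestep the seam problem entirely, since the glow is computed on the finished equirectangular image rather than per camera; a real implementation would use a Gaussian blur and work per colour channel.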


Hi @EternalTrail, nice add-on, by the way.

I guess you are stitching the images together with an alpha channel, and there is a known problem with alpha and bloom.

See here: