Rendering Omni‐directional Stereo Content on Eevee | 360º 3D VR

Hello friends!

I’m making a 360º stereoscopic Virtual Reality short animation rendered in Eevee called Resilience Satelle r.i.c.e.: https://blenderartists.org/t/resilience-satelle-r-i-c-e

…but as you know we don’t have an equirectangular/panoramic camera for Eevee just yet. :sweat_smile:

So far, THIS PAPER from Google is still the best description of how to achieve this.
Keep in mind that I’m not a programmer, but even so the paper makes sense to me.

My question:
:thinking:
How hard would it be to convert the code they share in the last two pages into an add-on that creates such a camera for Blender’s Eevee renderer?

I know that eventually Eevee will have a panoramic camera, but because we are almost ready to render I’m searching for a more immediate solution.

Is this code even compatible with, or adaptable to, a Blender Python script/add-on?
I’m open to investing some money in this, but first I would like confirmation that it’s actually worth it.

Best regards
-Rogério-

So the paper’s code seems to want lower-level access than I believe Blender provides. Unless you can access the render engine’s rays, which maybe you can?

But is there any reason you can’t use two cameras as a stereo pair, rotate them 360° and stitch everything together? That seems similar to what the code is doing.
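To illustrate the rotating stereo pair idea, here is a minimal geometry sketch (plain Python, not Blender API code; the function name and the 6.5 cm eye separation are my own assumptions): at each rotation angle, the two cameras sit on a small circle and face tangentially, which is the camera layout an omni-directional stereo capture samples.

```python
import math

def stereo_rig_positions(theta_deg, ipd=0.065):
    """Left/right camera positions (x, y) for a stereo pair rotated
    theta_deg around the vertical axis. ipd is the interpupillary
    distance in meters (0.065 m is an assumed typical value)."""
    theta = math.radians(theta_deg)
    half = ipd / 2.0
    # Each eye orbits the rig center on a circle of radius ipd/2,
    # so the baseline between the eyes turns with the head.
    left = (-half * math.cos(theta), -half * math.sin(theta))
    right = (half * math.cos(theta), half * math.sin(theta))
    return left, right
```

Driving a loop over `theta_deg` from 0 to 359 with this function would give the per-frame camera placements for the render-and-stitch approach.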


Hello @RajW!
Thank you very much for taking the time to look at the paper!

Well… my only reason is that, compared to doing all that, it’s just way faster and less painful to render everything in Cycles :sweat_smile: We do have all our shaders compatible with both Eevee and Cycles… but I like the Eevee look better, and while a 4092*4092 frame takes 5 minutes to render in Eevee, it takes 1+ hour in Cycles.

This being an animated movie with 20000 frames…
Unity and Unreal Engine do this stuff, right? There must be a way! :slight_smile:

Unity does this by rendering 2 cubemaps (https://blogs.unity3d.com/pt/2018/01/26/stereo-360-image-and-video-capture/) and unwrapping them at the end into an equirectangular projection… Blender has light probes that capture cubemaps; couldn’t a script be made to use the probe data to render a VR output?


This may be a very stupid question, but wouldn’t something like this work?

If you place 2 light probes about 6 cm apart and somehow obtain the cubemaps that these light probes capture for reflections… convert the cubemaps to equirectangular and stack one on top of the other… wouldn’t that output a 360º stereoscopic rendering? :thinking:
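For anyone curious what the cubemap-to-equirectangular unwrap step actually involves, here is a minimal sketch of the lookup math (plain Python, not Blender API code; the face labels are my own convention): each pixel of the equirectangular output maps to a view direction, and the direction’s dominant axis picks which cube face to sample.

```python
import math

def equirect_to_cubemap(u, v):
    """u, v in [0, 1): normalized equirectangular coordinates.
    Returns (face, s, t) with s, t in [-1, 1] on that cube face."""
    # View direction for this equirectangular pixel
    lon = (u - 0.5) * 2.0 * math.pi    # longitude: -pi..pi
    lat = (0.5 - v) * math.pi          # latitude: -pi/2..pi/2
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    ax, ay, az = abs(x), abs(y), abs(z)
    # The dominant axis of the direction picks the cube face
    if ax >= ay and ax >= az:
        face = '+x' if x > 0 else '-x'
        s, t = (-z / ax if x > 0 else z / ax), y / ax
    elif ay >= az:
        face = '+y' if y > 0 else '-y'
        s, t = x / ay, (-z / ay if y > 0 else z / ay)
    else:
        face = '+z' if z > 0 else '-z'
        s, t = (x / az if z > 0 else -x / az), y / az
    return face, s, t
```

Running this for every output pixel and bilinearly sampling the chosen face is essentially what Unity’s cubemap unwrap does; the stereo problem discussed below is separate from this projection step.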

Why can’t it be done?

I understand that this thread is old, but I think I can answer this last question for anyone still interested:

No, it can’t be done the way you describe, because the parallax would only be correct when looking at the panorama from one specific point of view (the one looking along the direction of the two spherical cameras). If you turn your head to the opposite side, the parallax of the left and right eyes would be flipped.

The only way I know to solve the problem without coding is to do a lot of renderings: say, 360 spherical panoramas for the left eye and 360 for the right, rotating the camera 1 degree for each pair of renderings. After that, crop the central strip of each spherical panorama and stitch it next to the central strip of the next rendering until you have reconstructed the whole thing (as explained here: https://www.youtube.com/watch?v=a5hy4QdcFGU&t=157s ; this is in Unreal Engine, but the concept is the same).

The cons of this approach are that you need to do 360*2 spherical renderings and throw away 99% of the rendered area, and the final result only works well at equatorial height; if you look up or down you will see noticeable aberrations.


Hey @Rikkarlo!
Thank you very much for the clear explanation!
I completely understand it now. It also explains why no one has achieved it yet.

Considering all that complexity, I guess using Cycles is just a lot faster and easier.

Btw: I feel like I just posted this a week ago… how the hell has so much time already passed!? :neutral_face:

Ahaha, I feel you, it always happens to me too; time runs so fast lately :rofl:.
About our topic: yes, Cycles already provides a great algorithm for omnidirectional stereoscopic panoramas, and I also think it’s definitely worth using that instead :slight_smile:
