Will EEVEE Support VR (Equirectangular) Rendering?

If your renderer doesn’t support it natively, you can “manually” render it out by doing something like this:

  • set your render resolution to something square (e.g. 512x512)
  • set your camera FOV to 90 degrees
  • for each side of the cube, rotate the camera so that it looks at that side and insert a keyframe
  • render the six frames
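As a sanity check on the rotation step, here is a small pure-Python sketch. The face names and exact angles are my own illustration; it assumes Blender's convention that a zero-rotation camera looks down -Z, with extrinsic XYZ Euler order:

```python
import math

# Euler XYZ rotations (degrees) that aim a default Blender camera
# (which looks along -Z) at each face of a surrounding cube.
# Face names and angle choices are my own illustration.
CUBE_FACE_ROTATIONS = {
    "front": (90, 0, 0),    # looks along +Y
    "back":  (90, 0, 180),  # looks along -Y
    "left":  (90, 0, 90),   # looks along -X
    "right": (90, 0, -90),  # looks along +X
    "up":    (180, 0, 0),   # looks along +Z
    "down":  (0, 0, 0),     # looks along -Z
}

def forward_vector(rx, ry, rz):
    """Rotate the camera's local -Z axis by extrinsic XYZ Euler angles (degrees)."""
    v = (0.0, 0.0, -1.0)
    for axis, deg in (("x", rx), ("y", ry), ("z", rz)):
        a = math.radians(deg)
        c, s = math.cos(a), math.sin(a)
        x, y, z = v
        if axis == "x":
            v = (x, y * c - z * s, y * s + z * c)
        elif axis == "y":
            v = (x * c + z * s, y, -x * s + z * c)
        else:
            v = (x * c - y * s, x * s + y * c, z)
    return tuple(round(comp, 6) for comp in v)
```

In Blender you would assign each tuple (converted to radians) to the camera's rotation_euler before inserting a keyframe.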

I tried the manual technique and it’s pretty close; there’s just one thing I don’t get. The top and bottom cameras seem a bit offset for some reason when I reconstruct the cubemap in Photoshop. They don’t match somehow. Does anyone know what causes this?

The cameras all have a 90-degree FOV and are rendered at 1024x1024. They’re all positioned at the same point.

I think I found out why: I changed the camera’s clip start value to the lowest value I could, and that seems to have resolved the problem.

Here’s my rig. You’d have to render each camera individually and then stitch the images in After Effects or other software to make the cubemap. But it works; tested here in Montreal in the SAT dome.

blender_28_cubemap_camera_rig.zip (106.6 KB)

Thanks to the UnitedFilmDom team for their excellent tutorial and Blender rig files.
As previous posts mention, it uses 6 cameras from a single vantage point, each with a 90° FOV.
The rig is a collection which you add to your scene.
You render out in Eevee, preferably as a batch, in seconds or a few minutes depending on hardware.
You map the resulting 6 images onto a UV-mapped cube, with an emission shader on each face, facing inward. Then, using a camera in the middle (no other lights or objects at all), you render with Cycles using the built-in panoramic equirectangular camera.
Finally, you may add the metadata to view it in a 360 VR viewer like Google Photos, Flickr, or others.
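Under the hood, that Cycles re-projection amounts to a per-pixel lookup: map each equirectangular pixel to a view direction, then find which cube face that direction hits. Here is my own sketch of that logic (not taken from the rig; the u/v orientation per face is one convention of many):

```python
import math

def equirect_to_direction(s, t):
    """Map normalized equirect coords (s: 0..1 left-right, t: 0..1
    top-bottom) to a unit view direction, Z-up."""
    lon = (s - 0.5) * 2.0 * math.pi   # -pi .. pi
    lat = (0.5 - t) * math.pi         # +pi/2 (top) .. -pi/2 (bottom)
    return (math.cos(lat) * math.sin(lon),
            math.cos(lat) * math.cos(lon),
            math.sin(lat))

def direction_to_cube_face(x, y, z):
    """Return the cube face a direction hits, plus (u, v) in [0, 1]
    on that face."""
    ax, ay, az = abs(x), abs(y), abs(z)
    if az >= ax and az >= ay:
        face, u, v = ("up" if z > 0 else "down"), x / az, y / az
    elif ay >= ax:
        face, u, v = ("front" if y > 0 else "back"), x / ay, z / ay
    else:
        face, u, v = ("right" if x > 0 else "left"), y / ax, z / ax
    # remap from [-1, 1] to [0, 1] texture coordinates
    return face, ((u + 1) / 2, (v + 1) / 2)
```

Letting Cycles do this via an emission-shaded cube avoids writing the resampling yourself and gets filtering for free.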

I adapted this. The most important tweaks:

  • No bloom (or any other post-render compositing or processing that modifies lighting per shot)
  • Extend the camera screen space (not the actual FOV) to overlap outside the field of view.
  • Add metadata so you can see the result in a 360 viewer like Google Photos.

The rig and adapted workflow give a 42.5-megapixel VR panorama in a few minutes (about 5)!
View my work and the panorama

Until that time comes, when it is supported, I have solved it! No seams.
In my previous post yesterday I had managed a decent result with slight seams; that workflow could go no further.

I worked on idea #1:
I stuck with the six-camera rig and increased the FOV of each camera from 90° to 96°, so that the shots overlap around the perimeter by about 15% (in the following example each shot is 2300 pixels and the overlap is 150 on each side → the resulting panorama is 8000x4000 equirectangular, and each image has a unique area of 2000x2000 pixels).
Then I fed the set to Hugin’s PanoTools, input the cubemap coordinates in PanoTools format (yaw, pitch), and used the overlap corners for control points.
This is saved as a .pto project file, so you can rerun the set with new Eevee outputs in one click!
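The pixel bookkeeping above can be double-checked in a couple of lines (numbers taken straight from the post):

```python
shot = 2300                # rendered width/height of each cube face, px
overlap = 150              # overlap on each side, px
unique = shot - 2 * overlap            # unique area per face: 2000 px
total_overlap = 2 * overlap / unique   # extra coverage: 300/2000 = 15%
pano_w, pano_h = 4 * unique, 2 * unique  # 8000 x 4000 equirectangular
```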

Hugin not only removes seams but does a clearly better job of preserving detail near the edges of the individual photos, where pixels are stretched.

View my work in progress and the links to the panorama test shots (Eevee Architectural Interior)

What happens if you stitch together 1-pixel-wide strips? That would be a lot of images, but would all the screen-space effects get lost?

It’s possible to script this coordinate config and stitch, but Hugin’s PanoTools needs overlap to clean up seams and fix exposure differences. One pixel wide means no overlap.
What is your goal?

Was just thinking out loud. Maybe “stitching” wouldn’t be necessary; just place them all side by side. Assuming screen-space effects are off, I don’t understand why there would be “exposure differences” in a computer-generated setup.

But yeah, this is way out of my comfort zone, ignore me if I don’t know what I’m talking about here :slight_smile:

There are differences because Eevee is a biased renderer, not a ray tracer.
The screen space (view angle, etc.) influences the render result.

The panoramas are very sweet! Stitching with Hugin’s PanoTools is a scripted, 3-minute, computer-time-only routine.
View 12000x6000 equirectangular panorama
(After generating the 6 cubemap images with a scripted batch, running Blender in background mode: Eevee takes about 2.5 minutes for each of the six on my laptop, at 3450x3450 resolution with dozens of light sources.)
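A scripted batch along these lines could look like the following; the file names, frame range, and output paths are placeholders of my own, not the author's actual setup:

```shell
# Render frames 1-6 (one keyframed camera pose per frame) with Eevee,
# Blender running in background mode
blender -b cubemap_rig.blend -E BLENDER_EEVEE -o //faces/face_# -s 1 -e 6 -a

# Re-stitch with the saved Hugin project (Hugin's command-line runner)
hugin_executor --stitching --prefix=panorama cubemap.pto
```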

Hi, @csimeon . Gorgeous panorama there! It looks perfect.

I’m also interested in rendering equirectangular HDRIs with Eevee.

I understand the part about rendering 6 square 90° FoV pictures. But I can’t figure out how to convert this cubemap into an equirectangular projection in Hugin.

Any chance you could provide a tutorial on this? I can’t find any video tutorial covering it. Also, I’m a noob with HDRIs and don’t know about setting these control points and whatnot. I’d expect a perfect cubemap from 3D software (unlike real-world pictures) to convert into an equirectangular projection without much hassle, though?

Not quite:
Any photo stitcher, including Hugin’s PanoTools, needs overlap to stitch photos. Overlap areas are used to align the geometry of adjacent shots and to calibrate the exposures seamlessly.
90° does indeed give the complete cube covering the full 360°; however, there is no overlap.
A little trigonometry shows that to get 15% extra all around the shot you need to set the FOV to 97.982°:
arctan(1.15) = 48.991° (half the FOV); multiplied by 2 = 97.982°.
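The same check in code, taking “15% extra all around” to mean the half-width of the image plane grows from 1.0 to 1.15:

```python
import math

half_width = 1.15  # a 90-degree FOV has a half-width of 1.0; add 15%
half_fov = math.degrees(math.atan(half_width))  # ~48.991 degrees
fov = 2 * half_fov                              # ~97.982 degrees
```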

Then you need to map the 6 individual shots in Hugin, telling the software which is down, up, left, etc., by specifying pitch and yaw. For example: the front camera has yaw=0, pitch=0; down has yaw=0, pitch=-90; etc.

Then you set the control points for the adjacent photo pairs, 15% inside the photo limits.

You do this manually once! Hugin generates a .pto script file.
Every time after that, you just name your 6 renders with the names you chose when making your .pto and run the script. Done!
(Or you edit the script; it’s plain text, so just substitute the six photo names. Editing also lets you set new resolutions and more.)
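For reference, the relevant lines of a .pto project look roughly like this. The sizes, file names, and yaw/pitch assignments below are placeholders of mine based on the numbers in this thread, not the author’s actual script:

```
# p line: the output panorama; f2 = equirectangular, v360 = full FOV
p f2 w8000 h4000 v360 n"TIFF_m"
# i lines: one per cube face; f0 = rectilinear, v = per-image HFOV,
# y/p/r = yaw, pitch, roll in degrees
i w2300 h2300 f0 v97.982 y0   p0   r0 n"front.png"
i w2300 h2300 f0 v97.982 y90  p0   r0 n"right.png"
i w2300 h2300 f0 v97.982 y180 p0   r0 n"back.png"
i w2300 h2300 f0 v97.982 y-90 p0   r0 n"left.png"
i w2300 h2300 f0 v97.982 y0   p90  r0 n"up.png"
i w2300 h2300 f0 v97.982 y0   p-90 r0 n"down.png"
```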

Above is the “algorithm”. I’m considering providing ready-made scripts on Gumroad or BlenderMarket for a small fee. Do you think it has potential?


You can convert a cubemap to equirectangular images with Cycles. Map the cubemap textures onto a cube using only an emission material and set the camera to equirectangular. Use a 2:1 aspect ratio for the image, remove all lights from the scene, and render with only 16 samples.

True, this approach is mentioned higher up in this thread. I also tried it before I switched to this latest method with Hugin. Follow the thread and you can read why; I don’t want to clutter it. In short, it comes down to this:

Depending on the differences in exposure between the six orientations, it may work better or worse.
Below is a case in point: a 360 panorama where one side faces windows in direct sunlight and the opposite side is a much darker corner. Hugin stitches and adjusts exposure smoothly, despite the differences and the bloom (exaggerated for this example).
Example of extreme exposure differences and bloom
In such a case, the method of rendering the cubemap with Cycles would come out poorly, with discontinuities and pronounced seams at the cube edges.

You’re right. Nice tip with the hugin :wink:

Thank you so much for the procedure, @csimeon .

I still had to go through some of the (outdated) tutorials on the Hugin website and some video tutorials to get the hang of it, but with the settings you mentioned I was good to go and didn’t even need to set any control points. I don’t see any seams whatsoever, though that could be because my environment texture is a landscape.

I also read your thread about the interior design linked above, and it was interesting to see how you came up with various possible solutions to the seams problem and how you worked it out.

Well, I guess there must be more people interested in this, and there will be more:

  • The Hugin UI isn’t that easy to figure out (the basic wizard makes it even more confusing, IMHO).
  • You get faster render times in Eevee, which is always nice.
  • You get rid entirely of the fireflies and grain induced by engines like Cycles.
  • You can probably get a different aesthetic than you would in Cycles.

I, for instance, am a Blender Internal user and never quite made the leap to Cycles because I’m not after that kind of realism, but Eevee gets me what I want and I’m happy with it, and it’s cool to be able to make environment maps with it.

With a fee and a well-explained step-by-step setup (video included), I think it will have potential. I too need equirectangular for Eevee.

You’re welcome. Glad you think it’s useful.
I was surprised at how little interest there was, because I find 360s exciting.
@DavidRivera thanks.

I guess too many people are worried about the 2.8 final rather than these kinds of features. Stereoscopic is another abandoned area, but not for this soldier… no stereo is left behind… and that includes VR :slight_smile: and 360° video rendering.