I’m experimenting with some rendering techniques to produce stereo panoramic images that can be used with the Oculus Rift, Gear VR and other VR headsets.
There’s some discussion in the OculusVR forums about the best way to render stereo panoramic images. The common-sense approach is that you need two images with a separation between them to simulate the eye separation. But if you just set it up that way and render with a panoramic projection, the results will be wrong, because the center of each panoramic projection will not be the point between the eyes — it will be the center of each individual camera. It produces this kind of problem:
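To make the geometry concrete, here is a small sketch (my own illustration, not code from the forum thread) of what a correct omnidirectional stereo setup implies: for each viewing direction, each eye should sit on a circle of radius half the eye separation, offset perpendicular to that direction, instead of staying at two fixed points. The `IPD` value and function names are assumptions for the example.

```python
import math

# Typical interpupillary distance in meters (an assumed value for illustration).
IPD = 0.065

def eye_position(theta, eye):
    """Position of the left (eye=-1) or right (eye=+1) camera when the
    viewer looks along the direction (sin(theta), cos(theta)).

    The eye is offset by IPD/2 perpendicular to the viewing direction,
    so the stereo baseline rotates together with the view."""
    r = IPD / 2.0
    # Perpendicular to the viewing direction (sin t, cos t) is (cos t, -sin t).
    return (eye * r * math.cos(theta), eye * -r * math.sin(theta))

# Looking forward (theta=0) vs. looking backward (theta=pi): the eye
# positions are mirrored, which is exactly what a fixed camera pair
# fails to do — hence the inverted stereo behind you.
front_right = eye_position(0.0, +1)
back_right = eye_position(math.pi, +1)
```

With a fixed stereo pair, `back_right` would equal `front_right`, so the left/right images end up swapped when you look backward; rotating the baseline per direction fixes that.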
Notice that the part in front of you is correct — you see it in stereo — but the left and right parts of the image are not stereo, and the back is inverted. Paul Bourke has explained this problem and proposed some solutions in this paper.
Digging in the OculusVR forum, this is what they suggested:
- Set the FOV to 1º (now possible in Blender 2.74 thanks to this patch)
- Render 360 strips, each 5 px wide (a camera rotating 360º, 1º per frame, 360 frames)
- Join all the strips with StereoPhoto Maker (http://stereo.jpn.org/eng/stphmkr/)
- Repeat this for the other eye
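The forum thread uses StereoPhoto Maker for the joining step, but the join itself is just horizontal concatenation of the 360 strips. A minimal NumPy sketch of that step, with placeholder random data standing in for the actual rendered frames:

```python
import numpy as np

# Assumed dimensions for illustration: 360 strips, each 512 px tall
# and 5 px wide, with 3 color channels.
H, STRIP_W, N = 512, 5, 360

# In the real workflow each strip would be loaded from a rendered
# frame; here we fake them with random pixels.
strips = [np.random.randint(0, 256, (H, STRIP_W, 3), dtype=np.uint8)
          for _ in range(N)]

# Concatenate the strips side by side to get the full panorama
# for one eye: 360 strips x 5 px = 1800 px wide.
panorama = np.hstack(strips)
```

Repeating the same concatenation with the second camera's strips gives the other eye's panorama.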
It seems a bit crazy and it’s a lot of work, but it just works!
Here is the result:
This way we have a correct stereo image in all directions!
I also checked the Multiview branch, and it’s a great improvement to the stereo 3D workflow! At this time it doesn’t yet fix this problem with stereo panoramic rendering. I talked with Dalai (dfelinto), who is working on it, and it looks like he has a patch to implement it, but it’s not in the branch yet.
For now we can use this hackish solution for static images, but it’s not feasible to render stereoscopic 3D movies this way — at least until dfelinto or another developer implements proper support for rendering stereo panoramic images.
Does anyone have an easier solution for this task?