Project UVs from 360 camera?

Hey everyone! I am trying to project the UVs of my scene objects based on my 360 camera location. Here is an example image of what I would like my ground plane UVs to look like so they match up with the skydome.

The goal of this is to add depth to my skydomes for VR. So far I have not been able to project from the camera, and I can't seem to get a good result with a Sphere Projection.

This video best represents the effect that I am trying to achieve https://www.youtube.com/watch?v=X8IzbQ_E6BI

Thanks!

Matellis

Hum, I’ve seen it done, but can’t remember where and how it was done…
Seems like other software indeed uses spherical projections for this, which makes sense: https://www.youtube.com/watch?v=uWbfdswQ1mM

What was the issue when you tried? Maybe something is missing in your setup?

Hey Sozap, thanks for replying.

When you create a spherical UV projection in Maya, it gives you some settings to adjust (image below). These let you move the projection sphere to where the camera would have been when it captured the HDRI.

While Blender does have a sphere projection, I don’t see any way to refine it to what I want.
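For what it's worth, this kind of camera-centred spherical projection can also be scripted. Below is a rough bpy sketch, not a polished tool: the object and camera names are placeholders, and the longitude may need a constant offset added so it lines up with how your HDRI is rotated.

```python
# Rough sketch: spherical (equirectangular) UV projection centred on the camera,
# mimicking Maya's movable projection sphere. Names below are placeholders.
import bpy
import math

obj = bpy.data.objects["Ground"]      # mesh to project onto (assumed name)
cam = bpy.data.objects["Camera"]      # the 360 camera position used for the HDRI
cam_pos = cam.matrix_world.translation

mesh = obj.data
uv_layer = mesh.uv_layers.get("CamProjection") or mesh.uv_layers.new(name="CamProjection")

for loop in mesh.loops:
    # Direction from the camera to this face corner's vertex, in world space
    co_world = obj.matrix_world @ mesh.vertices[loop.vertex_index].co
    d = (co_world - cam_pos).normalized()
    # Equirectangular mapping: longitude -> U, latitude -> V.
    # You may need to add a constant offset to u to match the HDRI's rotation.
    u = 0.5 + math.atan2(d.y, d.x) / (2.0 * math.pi)
    v = 0.5 + math.asin(max(-1.0, min(1.0, d.z))) / math.pi
    uv_layer.data[loop.index].uv = (u, v)
```

Faces that straddle the seam where U wraps from 1 back to 0 will smear across the texture; splitting those faces or giving them their own UV island fixes that.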

Ultimately I am trying to project my HDRI image onto geo to add depth, bake the result out to a texture using those spherical UVs, and then import that geo and texture into Unreal. There I can view the image in VR with real depth and a small amount of parallax. I'm trying to achieve all this without needing Maya or Mari.
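If you go the scripted route, the projected UV map then just needs to drive the HDRI in the object's material so the bake reads the image from the camera's point of view. A minimal sketch, assuming the "CamProjection" UV map from the snippet above and a placeholder image path:

```python
# Sketch: material that samples the equirectangular HDRI through the
# camera-projected UV map, so a bake picks up the image exactly as seen
# from the camera. Object name and image path are placeholders.
import bpy

obj = bpy.data.objects["Ground"]                     # assumed object name
hdri = bpy.data.images.load("//skydome_8k.hdr")      # assumed image path

mat = bpy.data.materials.new("HDRI_Projection")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links

uv_node = nodes.new("ShaderNodeUVMap")
uv_node.uv_map = "CamProjection"                     # UV map from the projection script

img_node = nodes.new("ShaderNodeTexImage")
img_node.image = hdri

emit = nodes.new("ShaderNodeEmission")
out = nodes["Material Output"]

links.new(uv_node.outputs["UV"], img_node.inputs["Vector"])
links.new(img_node.outputs["Color"], emit.inputs["Color"])
links.new(emit.outputs["Emission"], out.inputs["Surface"])

obj.data.materials.clear()
obj.data.materials.append(mat)
```

Using an Emission shader keeps the bake a straight copy of the image, with no lighting mixed in.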

Thanks!

I’m not exactly sure if this is what you want to do:

test_HDR.blend (187.3 KB)

Once you’ve modeled the scene, you’ll need to unwrap the mesh and bake the texture.
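Roughly, that bake step could be scripted like this; just a sketch that assumes the material set up above, with placeholder names and resolution:

```python
# Sketch: add a second, ordinary UV layout for the baked texture, then bake the
# emission (the projected HDRI) into a new image with Cycles.
import bpy

obj = bpy.data.objects["Ground"]          # assumed object name
mesh = obj.data
bpy.context.view_layer.objects.active = obj
obj.select_set(True)

# Second UV map that Unreal will actually sample from
bake_uv = mesh.uv_layers.new(name="BakeUV")
mesh.uv_layers.active = bake_uv           # the bake writes into the active UV map

# Quick automatic unwrap of the new layer
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.uv.smart_project()
bpy.ops.object.mode_set(mode='OBJECT')

# Image to bake into, wired to an extra Image Texture node made active
bake_img = bpy.data.images.new("ground_baked", 4096, 4096, float_buffer=True)
mat = obj.data.materials[0]
img_node = mat.node_tree.nodes.new("ShaderNodeTexImage")
img_node.image = bake_img
mat.node_tree.nodes.active = img_node

bpy.context.scene.render.engine = 'CYCLES'
bpy.ops.object.bake(type='EMIT')          # emission = the projected HDRI colour

bake_img.filepath_raw = "//ground_baked.exr"
bake_img.file_format = 'OPEN_EXR'
bake_img.save()
```

The baked EXR plus the BakeUV layout are what you would then export to Unreal with the geo.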


This is great sozap! This solves a lot of my issues. I can now construct the scene in Blender. Next will be making sure the object UVs project from the same location as the camera, so the UVs pretty much stick to where you see them on the HDRI 🙂 Thank you so much for the help!
