Workflow for multiple scene renders, one normal and one with Red/Green "UVs"?

I am looking for tips on how to best set up View Layers or other Blender configurations for a project. I modeled a proof-of-concept virtual set, which will be put on a very large monitor behind our talent in a live public-access news show.

The virtual set itself has a fake curved screen:

Using view layers, I render a file that is just the UV coordinates of the screen (masking for the desk in front) from the virtual camera’s view:


To animate content on the screen in real time (hopefully once the virtual set is photorealistic), I can then use a fast 2D shader in Godot to composite 2D elements onto the screen, faking the distortion of the curved surface. The following image shows the debug grid texture composited purely in 2D in Godot:

Notice that there are some weird noisy spots at the top and bottom of the fake screen. Since the red and green channels of each pixel are used for mapping, it’s critical that they purely represent UV coordinates. I think what is happening is that antialiasing is slightly altering the red and green values of the pixels at the edges:

I haven’t figured out how to get a “perfect” UV map with red, green, and alpha channels.
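For illustration, the 2D UV-lookup compositing described above can be sketched in Python/NumPy rather than Godot’s shader language (the function and array names here are hypothetical, and it assumes a clean, non-antialiased UV pass):

```python
import numpy as np

def composite_screen(uv_pass, content):
    """Warp `content` onto the virtual screen using a baked UV pass.

    uv_pass: (H, W, 3) float array; the R and G channels hold the
             screen's UV coordinates in [0, 1], as rendered in Blender.
    content: (Hc, Wc, 3) float array; the 2D element to display.
    Returns an (H, W, 3) array with `content` mapped onto the screen.
    """
    h_c, w_c = content.shape[:2]
    # Red channel -> U (horizontal), green channel -> V (vertical).
    u = np.clip(uv_pass[..., 0], 0.0, 1.0)
    v = np.clip(uv_pass[..., 1], 0.0, 1.0)
    # Nearest-neighbour lookup; a real shader would interpolate.
    x = np.minimum((u * w_c).astype(int), w_c - 1)
    y = np.minimum((v * h_c).astype(int), h_c - 1)
    return content[y, x]
```

In the actual Godot shader this is just one texture fetch per pixel, which is why the approach is fast enough for real-time use.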

I’d really like to be able to render both the set and the screen mapping image in a single button press, so I’ve been using view layers:

This approach is pretty close, but there are times when I will want to use the same model but switch the material to a shadow or reflection catcher. I think Cycles can substitute a single material on a view layer (as for clay renders), but I’d want to be able to do something like make everything except the screen a holdout/catcher. The catcher should also let me create natural-looking reflections, since the red/green texture could reflect off catcher surfaces.

Sorry for such a long write-up, but I wanted to get ideas for best practices on achieving this as a workflow, and I’m afraid that if I asked a more specific question, I might be missing a more holistic change in approach.

Thank you!

For holdouts, you can set collections as holdout in the outliner and it does work with view layers. Search in the filters menu, it’s not visible by default.


If you wanted to override a specific object’s material, you could do it using the “scenes” feature and make a copy of the scene which you can modify as you want.

Can UV coordinates even have an alpha channel? I am guessing if you removed the transparency, it would reveal that those pixels are mixed with a black background, giving them the wrong color.

Maybe the alpha of the UV pass would need to be slightly trimmed on the edges before being used, so the border with the wrong colors isn’t visible in the end result.


For holdouts, you can set collections as holdout in the outliner and it does work with view layers. Search in the filters menu, it’s not visible by default.

That’s a great feature, thanks! That cuts down on the amount of duplicate hold-out geometry I’m using by a lot. Scenes look like an even more flexible way to do something similar, but maybe I can avoid that level of complexity. I tried changing materials by making a new “Linked Copy” scene, but I haven’t figured out yet how to de-link just the material.

Can UV coordinates even have an alpha channel? I am guessing if you removed the transparency, it would reveal that those pixels are mixed with a black background, giving them the wrong color.

The render definitely has an alpha channel, but you’re right that they appear to be mixed with a black background. The pixel where the arrow is pointing along the fringe has a red channel that is way too low for the area around it:

I forget the right word, but it’s almost like compositing an image whose pre-multiply setting isn’t configured correctly. Any idea if there is a way to normalize by the alpha value to get the proper pixel value back? Playing around in GIMP, it really looks like dividing a pixel’s R or G value by its alpha should work, but my shader isn’t cooperating.
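Assuming the pass really is premultiplied (each channel scaled by alpha when antialiasing blends the edge over transparent black), the divide-by-alpha idea can be sketched in plain Python (the function name and the UV values here are hypothetical):

```python
def unpremultiply(r, g, a, eps=1e-6):
    """Recover straight (un-premultiplied) UV values from an
    antialiased edge pixel blended over transparent black.

    Premultiplied storage means stored = original * alpha, so
    dividing by alpha undoes the darkening. Guard against
    alpha near zero, where no UV information survives anyway.
    """
    if a < eps:
        return 0.0, 0.0
    return r / a, g / a

# Example: an edge pixel whose true UV is (0.8, 0.3), half-covered
# by the screen (alpha = 0.5). Antialiasing stores it premultiplied:
stored_r, stored_g, alpha = 0.8 * 0.5, 0.3 * 0.5, 0.5
u, v = unpremultiply(stored_r, stored_g, alpha)
# u, v recover the original (0.8, 0.3)
```

The same two lines of math translate directly into a fragment shader; the alpha-near-zero guard matters there too, since fully transparent pixels would otherwise divide by zero.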

Any idea on a way to trim the edge on the Blender side? That said, I think I’d just run into more variants of the same alpha-to-black mixing. I think I can do it in the Godot shader, but it looked like an expensive operation to compare against neighboring pixels.

I am not sure if you can de-link a material. However, if you work with a linked copy, you can add and delete objects in just one scene but not the other. You could probably de-link not the material but the object itself, so you can make any alterations you need to it.

Does the bake’s margin not work in this case? I am wondering if it wouldn’t be useful to extend the color a bit outside the UVs.

Does the bake’s margin not work in this case? I am wondering if it wouldn’t be useful to extend the color a bit outside the UVs.

I got the shader fix working! I was dumb and mistyping something. Without normalizing red and green by alpha, you get the crunchy bits at the top and bottom:

With that fixed, it is nice and smooth:

Even along the fades:

I’m going to dig more into the multi-scene and shadow catcher issues tomorrow. Thanks!
