How does Blender Combine the Refraction Pass?

SHORT VERSION: I want to know the proper way to combine render passes to match the output of the combined pass that Blender makes.

EDIT: It looks like now I’m only having trouble with the refraction pass.

FULL STORY:

I need to separate refraction and shadows from my scene and render them as separate passes. Before moving on I wanted to make sure that my method of combining the render passes matched Blender's method (the "Combined" pass in another render layer).

My method of combining these passes does not match with Blender’s internal method, and I want to know what is wrong.

I am currently multiplying the “image” output (identical to a diffuse pass) with the shadow pass and then “Adding” the reflection, refraction, and specular passes (Factor = 1 for all).
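In per-pixel terms, that amounts to something like this (a minimal numpy sketch just to show the arithmetic; the array names are my own placeholders, not Blender's socket names):

```python
import numpy as np

# Placeholder float RGB passes (H x W x 3); in practice these come from the render layer.
h, w = 4, 4
diffuse = np.full((h, w, 3), 0.5)   # the "image"/diffuse output
shadow  = np.full((h, w, 3), 0.8)
spec    = np.zeros((h, w, 3))
reflect = np.zeros((h, w, 3))
refract = np.zeros((h, w, 3))

# Multiply the diffuse output by the shadow pass, then Add the other passes (Factor = 1).
composite = diffuse * shadow + spec + reflect + refract
```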

The result is an image that is too dark in comparison to Blender’s method. The shadows are a bit too dark and the refraction is WAY too dark. So I would like to know the proper method of combination.

So here's what's strange about the refraction: using the "Add" method with the refraction pass should not produce a dark result if most of the pixels of that pass are dark or black. It should have nearly no influence in those areas. In fact, "Add" should never darken any pixel of an image. If I run the refraction pass through a black-and-white color ramp (just converting it to B&W), then it behaves normally with the "Add" function. So it's like there's some funny business in the refraction data/image.

I am using a recent build from Graphicall: r30426. Any ideas? Thanks!

The short answer is, you can’t easily re-create the Internal renderer results by combining passes. It would be hugely inefficient for Internal to work that way, so it doesn’t. Blender’s compositing nodes aren’t really designed for building an image from the ground up, but rather they let you access the passes to enhance the render.

Ugh, if you’re right, that’s a real pain. Well, I’ll do more experimenting, but it really shouldn’t be that complicated to document how Blender does this (or at least a close approximation).

As I mentioned, the refraction pass is really acting strange. I almost wonder if this is a bug, because the pixel math with Add should not behave this way. I’ll try piping it through an unaltered RGB curves node to see if that fixes the issue without distorting the pass.

I do not think that is true. If you know how to do it, you can recreate an exact copy of the combined pass. The only pass I didn’t try yet is refraction. But the rest works fine. Maybe not particularly easy, but fine.



Thanks Sebastian! I agree.

In the time since your post, I made a couple of discoveries. OK, so note how I said that Adding the refraction pass (which was mostly dark or black) was actually darkening areas of my image EVEN where I was Adding black pixels from the refraction pass (value = 0). So this was strange.

Well, I discovered that some regions of the refraction pass will have negative values! So when I Add the refraction pass to my original image it adds a negative quantity (i.e. subtracts) in those black regions, and that is how seemingly black pixels in the refraction pass can actually darken the image when Added to the original image. The rest of the pass Adds as you would expect.
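To make that concrete with made-up numbers (just a tiny numpy illustration of the point):

```python
import numpy as np

base    = np.array([0.5, 0.5, 0.5])     # one pixel of my original image
refract = np.array([-0.1, -0.1, -0.1])  # a "black-looking" refraction pixel that is actually negative

print(base + refract)  # [0.4 0.4 0.4] -- Adding darkens here, even though the pass displays as black
```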

OK, so this was enlightening, but it did not solve my problem. But then I stumbled upon the solution: I had been Adding the refraction pass with a full "factor = 1" (and hey, why not?). It turns out the magic number is "factor = 0.92." Why? I don't know yet. I'm going to do a test with a fresh scene to see if it was a setting I had made somewhere, so I don't know yet whether 0.92 is standard for all scenes. I did check that it's not dependent on the index of refraction (IOR) setting.

The first image I’ve attached shows the selection of pixels that have negative value. I’ve selected them using the Math node (any pixels less than 0 are colored white in the viewer node). It’s these regions in the refraction pass that actually look black yet are negative values.

The second image shows the perfect matching of my composite to the original image using the “Difference” function for comparing images (absolute value of the difference between pixel values in the two images). I displayed it in Split View to compare factor = 1.0 with factor = 0.92.
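Per pixel, that Difference check amounts to something like this (a numpy sketch; the names are my placeholders for the two composites):

```python
import numpy as np

# Placeholders: 'base' is my composite without the refraction pass added yet,
# 'blender_combined' is Blender's own Combined pass.
base, refract, blender_combined = (np.zeros((4, 4, 3)) for _ in range(3))

diff_100 = np.abs(blender_combined - (base + 1.00 * refract))
diff_092 = np.abs(blender_combined - (base + 0.92 * refract))
# In my scene, diff_092 is essentially zero everywhere, while diff_100 is not.
```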

So this appears to have solved my problem. I hope it’s useful to someone else! If anyone has any idea on why the refraction pass needs to be multiplied by 0.92 before Adding, I would be interested to know.




@Sebastian_K: that’s nice stuff and good to know. Note I didn’t say it was impossible, just not easy! :slight_smile:

Did you work out those nodes from first principles (i.e. reverse-engineering the renderer), or by trial and error?

Thanks! :slight_smile: Well, some trial&error, wiki, and a great tut from blenderunderground. Haven’t got the link atm. Cheers!

OK, well more bad news. I can get the result I mentioned above just fine if I only separate out the refraction pass for example. I was doing this so I could isolate the problem. But now, after breaking my render into diffuse, shadow, specular, reflect, refract, and emit, my above solution does not work. There is a difference between my composite and Blender’s.

Currently my node setup does this:

Composite = F_shad*Diffuse*Shad + F_spec*Spec + F_refl*Refl + F_refr*Refr + F_emit*Emit,

where all the factor coefficients (F_shad, F_spec, ...) are 1 except for F_refr (refraction) right now.
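For anyone who wants to reproduce the chain, this is roughly what my node tree does, written as a Python script (only a sketch: I built the tree by hand in the compositor, and the node/socket names below are my guesses for a 2.5 build and may differ in yours):

```python
import bpy

scene = bpy.context.scene
scene.use_nodes = True
tree = scene.node_tree
tree.nodes.clear()

# Render Layers node supplies the separated passes.
rl = tree.nodes.new('CompositorNodeRLayers')

def mix(blend_type, a, b, fac=1.0):
    """Chain one Mix node (Multiply or Add) between two image sockets."""
    node = tree.nodes.new('CompositorNodeMixRGB')
    node.blend_type = blend_type
    node.inputs['Fac'].default_value = fac
    tree.links.new(a, node.inputs[1])
    tree.links.new(b, node.inputs[2])
    return node.outputs['Image']

# Pass socket names ('Diffuse', 'Shadow', ...) are my assumptions -- check your build.
shaded = mix('MULTIPLY', rl.outputs['Diffuse'], rl.outputs['Shadow'])  # Diffuse * Shad
result = mix('ADD', shaded, rl.outputs['Specular'])                    # + Spec
result = mix('ADD', result, rl.outputs['Reflect'])                     # + Refl
result = mix('ADD', result, rl.outputs['Refract'], fac=0.92)           # + F_refr * Refr (my empirical 0.92)
result = mix('ADD', result, rl.outputs['Emit'])                        # + Emit

comp = tree.nodes.new('CompositorNodeComposite')
tree.links.new(result, comp.inputs['Image'])
```

With the Add blend type, a Mix node's Fac effectively scales only the second input (out = Color1 + Fac*Color2), so Fac = 0.92 on the refraction Mix plays the role of F_refr above.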

Then I am displaying the difference between my composite and Blender’s.

So still no luck when I break up into all passes. Any ideas/feedback?



Ah, I’ve found that tutorial on blenderunderground. Hopefully that has the answers I’m looking for. But if not, I’ll have to keep plugging away.

Here’s the tut: Part 1: http://blenderunderground.com/2008/03/31/introduction-to-composite-nodes-part-1/

Part 2: http://blenderunderground.com/2008/04/10/introduction-to-composite-nodes-part-2/

Ehh, I just read through the blenderunderground tutorials and, although they're well written, they didn't give me any new information.

The struggle continues.

Things like how you apply AO depend on UI settings, which are not known to the nodes.

@PapaSmurf:

Right. I remember seeing somewhere that someone was compositing AO in a specific way based on the AO Energy and whether it was set to Add, Subtract, or Both (via a few Math nodes). I think that was on the Blender documentation wiki. Yet I still have trouble figuring out how to do this for Reflection and Refraction. Each of these passes contains negative values in certain pixel regions.

So as I said, it appears AO depends on the AO Energy and the Add/Subtract/Both setting. I haven't tested AO yet, but I'm more concerned with Reflection and Refraction. So what do Reflection and Refraction depend on? It doesn't appear to be IOR.
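Just to pin down what I mean, here's my guess at the kind of branching those Math nodes were doing for AO (pure speculation on my part; I haven't verified any of this against Blender's Combined pass):

```python
def apply_ao(diffuse, ao, energy, mode):
    """Hypothetical sketch: fold the AO pass into the diffuse result based on the
    AO Energy value and the Add/Subtract/Both setting. Not verified against Blender."""
    if mode == 'ADD':
        return diffuse + ao * energy
    if mode == 'SUBTRACT':
        return diffuse - ao * energy
    # 'BOTH' presumably combines the two somehow, but I haven't worked that out.
    return diffuse
```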

Ok, now i’m confused as well. Everything works as expected when AO is set to multiply and you don’t use Environment or Indirect Lighting. But when you enable these two, things start to go weird. So the question remains: how to composite reflect/refract when using env.light/ind.light?

Ah, yes I haven’t looked into that AO situation.

I’m using a test scene with a sphere placed over a plane to do some compositing tests. If you have reflection and refraction in your scene, you can separate out those passes and composite them back together with no trouble (Combined + Reflect + Refract). This works.

But what if I want to alter the shadow pass? Here's where the trouble is. If you want to separate the shadow pass, you must also separate the specularity pass (you must multiply diffuse*shadow, then add spec). This is fine (and works if there is no reflect and refract), but for some reason, after doing [diffuse*shadow + spec] + reflect + refract, there is an error in the composite. It should match exactly the case where I just did Combined (including spec and shadow) + reflect + refract, but it does not; the shadowed area behind the sphere is wrong.

My first image shows my composite, with an incorrectly yellowish shadow area (compare with Blender's composite). The second shows the result of the Difference between my composite and Blender's.




So now that I’m using this test scene I created, it brings to light more simply the problem that I encountered before with my more complex model.

I guess I’ll post my .blend of this test. I’ve created three scenes for comparison (differing based on whether reflection and refraction is present in the materials). Then some of these scenes are duplicated so that only certain render passes are separated out as control tests and to prove that things work when isolated.

Here it is: http://dl.dropbox.com/u/1531862/Blender/CompTest.blend

Hm, I could be wrong, but from looking at your file I think the error might have been that you were using the Combined pass in your comps. That of course will lead to errors, because it already includes all the passes. I have exchanged it with the diffuse pass, and to me it seems that everything is working now.
Here’s the file:

I also disabled the render-layer exclude settings that you had used.

EDIT: Oops, I completely missed the three other scenes. Sorry, I'll have a look at those now.

EDIT 2: OK, but you also did the same thing, which to me appears to be a mistake. You have to use the diffuse pass instead of Combined.

I also tried to isolate the problem with Refraction and Env. Light. From what I can see, the problem seems to be somehow related to sky color and transparency.
When you enable AO and Env. Light without any transparency (and hence no refraction), everything's fine. But AO + Env. Light + Transparency totally messes things up.
You also get a problem when you disable Env. Light but also un-check the "include sky" setting. That also gives a wrong result.

I have added Ambient Occlusion to SCUEY's sample file:


All seems to work fine: there is reflection, refraction, and AO, and the Combined pass looks exactly the same as the composite of the render passes.

Then I added Environment Lighting:


And now reflection/refraction look completely wrong.

So the question here is: how do Sky and Environment Lighting relate to reflection/refraction in terms of render layers? How do I have to set up the render-layer node tree so that I get the correct result when I use transparency, reflections, and Environment Lighting?
Any help would be great!

I think the problem was quite easy, though I don't dare say I fixed your specific env. light problem (the .blend didn't work for me).
AO + env:
http://dl.dropbox.com/u/5921697/CompTest_Working_AO_env_coen.blend

Thanks!
But unfortunately that’s not the solution. The point of the separate renderpasses is to be able to control each of them separately with curves, color-balance and so on. That’s why I tried to avoid the combined pass in my setup, except for the sky-background.
Your setup is basically just the combined pass itself, because you are alpha-overing the passes with inverted alpha over the combined pass - which in the end is zero and therefore just leaving that combined pass. And also it is not using the env.light-pass at all.
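Roughly, Alpha Over works like this, which is why a zero alpha just leaves the bottom (Combined) image (a simplified straight-alpha sketch; Blender's actual node also handles premultiplied alpha):

```python
def alpha_over(bottom, top, alpha):
    # Simplified Alpha Over: blend 'top' over 'bottom' by the top layer's alpha.
    return bottom * (1.0 - alpha) + top * alpha

# With the passes' alpha inverted to 0, the output is just 'bottom' -- i.e. the Combined pass.
```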
So that’s not the solution.
But thanks for trying! I really appreciate it!
:slight_smile: