Ok, guys! A few fellows and I started trying stuff in another thread, and I decided to create a new one so we don’t fill that one with these tests.
We were trying to figure out ways of getting mist working in Cycles. As some of you may know, in BI we had a mist effect ready to go; we’re not so lucky with Cycles yet hehe. Also, using the Z-buffer to get the effect in the compositor isn’t very effective, because we get some jagged edges.
We have some “tricky tricks” to get it through node materials, or get some pass from BI… so I’d like to discuss these methods to see if together we can find something easier/more efficient.
When we have some cool stuff going on, I’ll try to make a video tutorial explaining the different techniques.
Come on! I’ll add some tests later, when I get home.
PS: I posted this thread here in Lighting and Rendering, but we can do it in the compositor/materials… whatever!
I think Agus3D’s latest “mist with shadows” patch is the most efficient. My volumetric patch with min/max bounces < 2 can be acceptable for some scenes; maybe if I add a trick like the one in Mitsuba, which guarantees a 50% probability of hitting the medium when there is a surface hit, it will be almost usable. Until then, you can use Agus3D’s light length input info in nodes.
Hi, @Kaluura. Yes, that’s the usual way of creating fog in compositing. But as I said, it uses the Z-buffer, which is problematic in Cycles. In that scene it works quite well, but that’s not always the case. You may get jagged edges when there is a lot of contrast between the foreground and the background. Also, it doesn’t work correctly in combination with depth of field. Take that scene and apply some depth of field… you’ll see how the focused objects get extremely noisy borders, even if the render without mist is quite clean.
Ok, guys. I created a little scene to test this out. It includes some overlapping objects with depth of field, in order to test if the mist affects the jagged edges and so on.
And here is the “original” render, with no mist effect added. 200 samples, not completely clean, but almost. Later I’ll start doing things with this one:
EDIT: I forgot to tell you, the .blend file is set up with two render layers, one for the scene itself and a second one for the sky background. The scene render layer is set up to render almost every pass.
First test. This is the “typical” way I use to create mist. I’ve been doing it this way in Blender Internal for a long time, and I think it’s probably the common way to do it, not only in Blender.
This method uses the Z pass as a mask for an image/color/whatever that increases opacity with distance and acts as fog. It can also be inverted to mask the visible parts of the scene and fade them to transparent with distance, so the farther away, the more we see the background behind the scene.
In normal scenes it works… just OK; sometimes there are some aliased borders. But in scenes with depth of field it’s not good at all: we get something like aliased noise in the Z-buffer, which is why we get jagged edges in slightly defocused parts, even if the original combined image is fine.
Here you have the composited image with the node setup, and the zbuffer.
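The compositor setup above boils down to simple per-pixel math. Here’s a minimal sketch of linear, Z-driven mist (the `mist_start`/`mist_depth` names and the default values are my own, not actual node names):

```python
def mist_factor(z, mist_start, mist_depth):
    """Linear mist: 0 at mist_start, 1 at mist_start + mist_depth."""
    return min(max((z - mist_start) / mist_depth, 0.0), 1.0)

def apply_mist(pixel_rgb, z, fog_rgb, mist_start=5.0, mist_depth=20.0):
    """Blend the rendered pixel toward the fog color with distance."""
    f = mist_factor(z, mist_start, mist_depth)
    return tuple((1.0 - f) * c + f * fc for c, fc in zip(pixel_rgb, fog_rgb))
```

Inverting the factor (1 − f) gives the second variant described above, where the scene fades out and the background shows through.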
I tried a different method I had in mind, and it works… kind of
I duplicated the scene and turned every Cycles parameter down to the minimum (like bounces), so it’s fast and no cool things are rendered. I set up a black background, and the scene has only a point light at the camera position. I made a material override with a completely white material. This way we can generate some kind of “Z-buffer” image, and we can control the “distance” by increasing/decreasing the strength of the point light.
This way we get an image we can use to mask the fog, and it has the same depth of field effect as the original render. The issue with it is that surfaces not facing the camera are shaded, so they come out wrong in the final composite…
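The point-light trick works because a point light’s brightness falls off with the square of the distance, so the rendered luminance encodes depth. A minimal sketch of the idea (function names are my own; the cosine shading term is ignored here, which is exactly the cause of the artifact on surfaces not facing the camera):

```python
import math

def luminance_from_point_light(distance, strength):
    """Inverse-square falloff of a point light on a white diffuse surface
    (facing angle ignored -- the real render shades tilted surfaces darker,
    which is the artifact described above)."""
    return strength / (4.0 * math.pi * distance ** 2)

def distance_from_luminance(lum, strength):
    """Invert the falloff to recover an approximate depth per pixel."""
    return math.sqrt(strength / (4.0 * math.pi * lum))
```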
In the next tests… I’ll take stuff from Blender Internal, and let’s see where we can get… but these are only temporary solutions… we really need a good Z-buffer pass in Cycles!! How much would that cost? I’m all in for donating to some developer to get it right
Ok, I decided to try deactivating the camera’s “real” depth of field and using a post-processing Defocus node instead… but the result is not very good either. Also, we’d lose the greatness of “real” depth of field
Also, in case you’re wondering, even without depth of field, the edges of the objects using the Z-buffer are aliased. Somehow, I can’t find the Full Sample option in the latest release to test whether activating it works better.
Here is another test, this time taking the Z-buffer from BI and applying a Defocus node to it. Apparently, Defocus doesn’t blur the Z-buffer, but only does something like “eroding” it; not sure why that is. Smooth and blur filters have the same kind of effect on it.
The next test will be using materials, because for now it’s the only way I’ve seen that works properly… but I was trying to avoid it, since it means we need to add a node group to every material in the scene, which can be pretty tedious in large scenes. Also, I’ll try this last method with the “mist” pass in BI.
Ok, I did the same as before, but using the mist pass instead of the Z-buffer. The result is not perfect, but it looks much better. I added an RGB Curves node to control the amount of mist a little.
Finally tried using materials. It’s probably the best result so far, not requiring different renders to composite, and it works pretty well.
Things I don’t like: the “size” of the mist (like with the Z-buffer) is hard to control, and it doesn’t allow tweaking in compositing; it must be set up before the render, since it’s applied in the materials. In large scenes with lots of objects/materials it can be a tedious job, because this method requires attaching a node group before the output of every material in the scene. (The good thing about using a node group is that it’s instanced, so by tweaking one of them, all the rest of the scene’s materials change too.)
Here is the result (I made an error in this one: instead of a transparent shader for the mist, I should have added an emission shader, so the objects don’t become transparent, but the method works) and the node tree of a material using this technique:
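The node group is basically computing a distance-based mix factor between the material’s shader and the mist color. For reference, an exponential falloff is another common choice besides a linear one (a sketch under my own naming, not necessarily what the node group above uses):

```python
import math

def exp_mist_factor(distance, density):
    """Exponential fog: the factor starts at 0 at the camera and
    approaches 1 as distance grows; density controls how fast."""
    return 1.0 - math.exp(-density * distance)
```

In the material this factor would drive the Mix Shader between the original surface shader and the mist shader.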
The Z-value is the distance from the camera plane to the first intersection point. For something like an out-of-focus object or motion blur, every pixel would have many Z-values; encoding that information would require a higher-dimensional image (like a light field).
Blurring is basically taking the average of the surrounding pixels, and indeed the Z-buffer values will get “blurred”, but that destroys their meaning. Nodes that expect a meaningful Z-buffer will therefore no longer work.
It’s not really a problem with Cycles or Blender; what you want to do is simply impossible using (2D) compositing. Similarly, any post-process DOF/motion-blur effect will always have artifacts, because the required information simply isn’t there.
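Zalamander’s point about blurring is easy to see with numbers: averaging neighboring Z-values across a depth discontinuity produces depths that belong to no surface in the scene. A tiny sketch with made-up depths:

```python
def box_blur(depths):
    """1-D box blur over a row of Z-values (edge pixels clamped)."""
    out = []
    for i in range(len(depths)):
        window = depths[max(0, i - 1):i + 2]
        out.append(sum(window) / len(window))
    return out

row = [1.0, 1.0, 100.0, 100.0]   # a near object next to a far one
blurred = box_blur(row)          # the edge pixels come out at 34 and 67,
                                 # depths that match no surface in the scene
```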
I finally tried the Ray Length, and I think it’s the best option so far… I need to try rendering that in a separate scene to be able to control the mist in the compositor (which is handier in my opinion, because we can make fine adjustments to the fog later without having to render the scene again).
I wasn’t familiar with that option, and wanted to find out which was the most useful method for me. Thanks a lot for your support!
Thanks for your explanation, Zalamander. I’m not a very technical guy and didn’t know about that, but what you say makes sense. But in this case… is the Z-buffer really useful for compositing? Because the fact is that if I want to composite the image using that channel… it’s not useful without being antialiased. Or maybe I’m missing something! Please, enlighten me
YAY!! I hope that finally happens! Yes, I agree with you completely: having it as a pass would be a great help, to avoid setting up two different scenes and two renders.
Hi, storm, I selected the “world” icon and the “world” shader nodes, then I added the node setup in your example but I see no mist. I obviously am really lost. Can you please tell me what I need to do to see the mist?
It’s not for the World (background); it works only with surface materials, so you need to “hack” every material with that trick. That’s its limitation, but the quality will be very high, with zero impact on convergence time. It’s worth the extra hand work, at least for simple scenes.