What is missing to make real-looking images with Cycles? I think Cycles does a good job, however some subtle “real look” is missing. For example, Corona handles caustics as perfect reflections and refractions.
With the Open Path Guiding Library and other improvements we are getting closer.
Once a new feature exists, you know what was missing before it.
If we look at the PBR behavior light has on a material, it should be possible to make some
test renderings and measurements to check whether the light behaves as in the real world.
Edit: There are many other examples, see glass fibres. The light undergoes total internal reflection, carrying the light transport over many meters. Whether total internal reflection gets calculated is maybe very shader-dependent.
E.g. light strength has an inverse-square falloff with distance from its source. If it hits a surface, then the reflection/refraction should bounce further according to the Fresnel equations, until it is too weak to bounce any further.
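To make the two effects concrete, here is a minimal sketch of both: the inverse-square law, and the Fresnel reflectance via Schlick's approximation (an approximation to the full Fresnel equations, with a glass-like IOR of 1.45 picked just for illustration):

```python
import math

def intensity_at(distance_m, source_power=1.0):
    """Inverse-square law: intensity falls off with the square of distance."""
    return source_power / (distance_m ** 2)

def schlick_fresnel(cos_theta, ior=1.45):
    """Schlick's approximation of the Fresnel reflectance for a dielectric.
    cos_theta is the cosine of the angle between ray and surface normal."""
    r0 = ((1.0 - ior) / (1.0 + ior)) ** 2
    return r0 + (1.0 - r0) * (1.0 - cos_theta) ** 5

# Light at 2 m carries a quarter of the intensity it had at 1 m:
print(intensity_at(1.0))  # 1.0
print(intensity_at(2.0))  # 0.25

# Head-on rays reflect only a few percent; grazing rays reflect everything:
print(schlick_fresnel(1.0))  # ~0.034 for IOR 1.45
print(schlick_fresnel(0.0))  # 1.0
```

So a correct renderer should show bright grazing reflections on glass but almost none head-on, and the energy of every further bounce is scaled by this reflectance.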
As simple as it is, I guess that with different render settings (especially the number of bounces) a different render result can be expected.
In the real world, light bounces until no energy is left: it is absorbed, scattered, reflected and attenuated until it is too weak.
For this reason I started this thread. For most artists this is nothing new, but I think that light transport with correct light behavior makes the biggest impact in a render.
I had the idea to make a test scene with a specific light source and to measure whether the light reaches an expected distance with an expected intensity. Very basic, but if the bounce count is too low, the light doesn’t reach the point. And if the light doesn’t bounce correctly, e.g. is attenuated too early, then something is wrong as well.
Not sure atm how to best set up this test scene; ideas are welcome. Maybe such a test scene already exists?
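One way the measurement could be sketched before building an actual scene (all numbers here are made up for illustration, and the albedo/falloff model is a simplification): attenuate the light along a path of bounces, with inverse-square falloff per segment and an albedo loss at every bounce, then check whether the remaining energy at the measurement point is above some sensor threshold.

```python
def energy_after_path(segment_lengths_m, albedo=0.8, source_power=100.0):
    """Attenuate light along a path: inverse-square falloff per segment,
    multiplied by the surface albedo at each bounce in between."""
    energy = source_power
    for i, length in enumerate(segment_lengths_m):
        energy /= length ** 2          # falloff over this segment
        if i < len(segment_lengths_m) - 1:
            energy *= albedo           # energy lost at each bounce
    return energy

def reaches_target(segment_lengths_m, max_bounces, threshold=1e-3, **kw):
    """A render with too few bounces cuts the path off entirely."""
    bounces_needed = len(segment_lengths_m) - 1
    if bounces_needed > max_bounces:
        return False  # path terminated early: light never arrives
    return energy_after_path(segment_lengths_m, **kw) > threshold

path = [2.0, 1.5, 3.0]  # light -> wall -> wall -> measurement point
print(reaches_target(path, max_bounces=4))  # enough bounces: True
print(reaches_target(path, max_bounces=1))  # too few bounces: False
```

This is exactly the failure mode described above: with the bounce limit below the number of surfaces in the path, the measured point stays black no matter how strong the source is.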
My favourite type of topic - feature request time!
It is possible to make realistic results with Cycles that will spar with Corona Renderer. Besides compositing, good textures, details (wear, dirt), lighting setup etc., which are important to achieve realism, what Blender and Cycles are missing are things that will speed up production - I wrote about it once. Lighting is almost the same because all renderers use the same or similar equations and they approach the same results as in real life (what differs between engines the most is shading, but that would be better with Principled BSDF V2):
Archviz people are always pursuing the goal of creating the most appealing and realistic scenes, so what they do, for example, is save renders as EXR and put them into the Corona Image Editor. Corona has a really easy-to-use (and faster!) tonemapper with features like highlight compression, LUT loading, physical glare and bloom effects, and custom bokeh effects. Other artists use K-Cycles or Octane because of this (art of Jan Morek made in Blender + Octane):
Blender doesn’t have a dirt shader, triplanar mapping, automatic vertical camera alignment, archviz glass, basic assets, better microdisplacement, LIGHT LINKING etc. - it is possible for the user to build these things, but it’s tedious work and I think they should be built in. These are small things that add up to a big picture.
With new developers things are getting better - there will be Project Heist, which puts emphasis on realism, so the artists from the Studio will need new tools. There’s Principled BSDF V2, so shading would be far more realistic than currently; the real-time compositor will speed up checking end results; the caustics solver and path guiding will also bring significant speedups. Filmic V2, aka AgX, is going to change the look of our renders for the better.
For me it’s the fogging/volumetric atmosphere stuff. I find doing it with Blender volumetrics very cumbersome, and it makes render times unacceptable.
In Unreal Engine, using the sky/atmosphere, ground fog and volumetric clouds is super simple and takes the realism on outdoor stuff up greatly. I’m fine post-processing bloom, lens artifacts, chromatic aberration, film grain etc. in DaVinci. Compositing the mist pass always feels very crude.
As a small note, Cycles/Eevee does have triplanar: just set the image texture projection to “box” (and feed in Object coordinates or similar). A dirt shader is present as well, or at least the building blocks for it - dirt shaders are just ambient occlusion and procedural noise. Blender is less focused on having prefab effects at the base shader level; you’re supposed to make a node group for a dirt shader.
Having premade node groups shipped with Blender is something that has come up in the past, a dirt shader might be a good one.
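For anyone curious what the “box” projection does conceptually, here is a rough sketch of the standard triplanar blend (an illustration of the general technique, not Blender's actual implementation): sample the texture once per axis and blend the three samples by how closely the surface normal aligns with each axis.

```python
def triplanar_weights(normal, blend_sharpness=4.0):
    """Blend weights for triplanar (box) mapping: each axis is weighted
    by how closely the surface normal aligns with it."""
    w = [abs(n) ** blend_sharpness for n in normal]
    total = sum(w)
    return [wi / total for wi in w]

def triplanar_sample(sample_x, sample_y, sample_z, position, normal):
    """Blend three planar projections of a texture.
    sample_* are callables taking the 2D coordinates of their plane."""
    x, y, z = position
    wx, wy, wz = triplanar_weights(normal)
    return (wx * sample_x(y, z)   # projection along X uses the YZ plane
          + wy * sample_y(x, z)   # projection along Y uses the XZ plane
          + wz * sample_z(x, y))  # projection along Z uses the XY plane

# A face pointing straight up samples only the top (Z) projection:
print(triplanar_weights((0.0, 0.0, 1.0)))  # [0.0, 0.0, 1.0]
```

A higher blend sharpness corresponds to a tighter blend region at the seams, much like the Blend slider on the box projection.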
In my book, caustics have been the biggest problem in terms of realism: you’d need to use a few tricks in the material nodes to filter by shadow ray, since even with reflective / refractive caustics enabled, glass panels in windows would cause unnatural darkness while bent glass wouldn’t properly magnify light… also there’s no such thing as light dispersion in the ray system itself. Fortunately I understand the next Blender release will feature a new caustics system which is said to be better, fingers crossed on that!
Otherwise we still lack a system for volumetric refraction: You can’t do things like air distorting above a hot plate or jet engine and so on, not without needing to use more trickery like hidden particles.
Besides the established trick of having a mesh with a refraction shader (which fades as the angle to the camera gets steeper), I am not aware of any system in any engine that does this natively in a way that is easy to control. I believe a lot of people also just add stuff like this in post for now (if they can’t emulate it at render time).
Yes, it’s one of the core features still missing. Ideally this would be supported in PrincipledVolumeBSDF as a parameter to deviate the light rays; that would be the best way to simulate stuff like heat haze. As it stands you need either a mesh with a special material using the facing parameter for smooth fading of the distortion, or to extract the smoke as a separate pass and use its normals as displacement in the compositor (which is what I believe I did at some point).
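The facing-based fade mentioned above can be sketched like this (a generic version of the trick, not Blender's actual node setup; function names are illustrative): the distortion offset applied to the refraction lookup is scaled by how directly the surface faces the camera, falling to zero edge-on so the fake-refraction mesh shows no hard silhouette.

```python
def facing_factor(normal, view_dir):
    """1.0 when the surface faces the camera head-on, 0.0 edge-on.
    Both vectors are assumed normalized; this mirrors the 'facing'-style
    outputs used in shader node tricks."""
    dot = sum(n * v for n, v in zip(normal, view_dir))
    return abs(dot)

def distorted_uv(uv, offset, normal, view_dir, strength=0.05):
    """Offset the background lookup for fake refraction, fading the
    distortion out at grazing angles."""
    f = facing_factor(normal, view_dir)
    return tuple(c + strength * f * o for c, o in zip(uv, offset))

# Head-on surface: full distortion applied to the lookup coordinates.
print(distorted_uv((0.5, 0.5), (1.0, -1.0), (0, 0, 1), (0, 0, -1)))
# Edge-on surface: distortion fades to nothing, no visible seam.
print(distorted_uv((0.5, 0.5), (1.0, -1.0), (1, 0, 0), (0, 0, -1)))
```

The same idea carries over to the compositor variant: the smoke normals provide the offset direction, and a facing-style mask provides the fade.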