I know this has been asked before, but I didn’t find a satisfying solution.
I’m currently starting to integrate Blender into my professional workflow.
I have one issue that bothers me: Cycles render output does not match the render preview. I know this is common in pretty much any software, but the differences in Blender are much bigger. Admittedly, what I’m talking about are more or less minor differences.
The colors especially are off; mostly, light colors turn out brighter, to a degree that texture detail in those areas is lost. Colors can be corrected in Photoshop, but recovering lost texture is more time-consuming. At the moment this is pretty much a dealbreaker for me, since money-shot renders simply have to be perfect.
What are your thoughts on this? Do you have a workaround? Or did you just get used to the output differences and know by heart how things will turn out in the final render?
In the viewport, are you using Rendered shading or Material Preview? Those two are very different.
I can’t say I’ve ever really noticed a difference between what I see in the viewport and what I get in the render, unless I’ve accidentally toggled a light or something in the Outliner to not render.
Have you made sure you have Scene Lights and Scene World ticked when rendering in the viewport?
The color management looks different mainly because the render dimensions don’t correspond to the dimensions of the part of the screen dedicated to the viewport.
The viewport may also use fewer samples so it renders faster.
Before the final render, I make render tests by cropping a region of the expected render.
You can find Exposure settings in the Film panel and in the Color Management panel of Render Properties.
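As a side note on how the Exposure slider in the Color Management panel behaves: it is measured in stops, so each step of +1 doubles scene-linear pixel values before the view transform is applied. A minimal sketch of that relationship (plain Python, just the math, not Blender’s actual pipeline code):

```python
# Color Management > Exposure works in photographic stops:
# scene-linear values are multiplied by 2**exposure before
# the view transform maps them to the display.

def apply_exposure(value, stops):
    """Multiply a scene-linear value by 2**stops."""
    return value * (2.0 ** stops)

print(apply_exposure(0.5, 1.0))   # one stop brighter -> 1.0
print(apply_exposure(0.5, -1.0))  # one stop darker  -> 0.25
print(apply_exposure(0.5, 0.0))   # unchanged        -> 0.5
```

This is why a small exposure tweak can push already-bright surfaces past the point where the display transform can still separate them.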
You can also make adjustments in the Compositor.
But losing texture because of overly bright lights is rare. It may be a user mistake, like hiding some lights in the viewport but forgetting to disable them for the render as well.
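To illustrate why near-white areas can lose their texture detail: once neighbouring pixel values go past 1.0 in scene-linear light, a clipping display transform maps them all to the same white, while a highlight-compressing transform (Filmic-style) keeps them distinct. A minimal sketch, using the simple Reinhard curve as a stand-in for Blender’s actual Filmic math (which is more involved):

```python
# Two neighbouring texels of bright concrete, slightly different
# in scene-linear light, both above 1.0:
a, b = 1.2, 1.5

def clip_transform(x):
    """Standard-like behaviour: values above 1.0 are clipped to white."""
    return min(x, 1.0)

def compressive_transform(x):
    """Reinhard tone map x/(1+x): compresses highlights instead of clipping.
    NOT Blender's Filmic curve, just a simple compressive example."""
    return x / (1.0 + x)

# Clipping destroys the difference -> flat white, texture gone:
print(clip_transform(a), clip_transform(b))              # 1.0 1.0

# Compression keeps the texels distinct:
print(round(compressive_transform(a), 3),
      round(compressive_transform(b), 3))                # 0.545 0.6
```

So if the viewport and the final image end up with different view transforms or exposure, the same texture can survive in one and flatten to white in the other.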
That would be the first thing to look at, since the viewport render and the final render can each have different samples, noise thresholds, and denoise settings.
Then there’s the Compositor, which may or may not be changing the rendered image relative to the viewport, and it’s possible that things like normal/displacement maps aren’t being fully applied in the viewport, but I haven’t checked or tested that.
If I understand you correctly, when you say cropping, you mean you first render a region?
I’m mostly losing texture when the color is nearly white, for example light concrete or paper. In the viewport I can see those fine details, but after rendering they are much less visible.
Yes. That is what I meant.
There is also a difference in color management, because of what is not rendered.
But it should be less annoying, because what matters is that the rendered surfaces are at the same scale as in the expected result.