Using Blender 3.1.2. I tried rendering with both CUDA and OptiX to test which one would be faster for my hardware, but discovered that the render outputs look very different. Can anyone help me make a best guess at why this is happening?
Sorry I can only embed one image
Switching between these images, I do not see any difference. What is different?
(With motion blur I recently had problems/different results when rendering with CUDA vs. OptiX.)
Admittedly the differences are numerous and hard to spot. They are more obvious towards the bottom left of the images, on the reflection of the leaves. Some areas have different lighting conditions too (with and without shadows).
Yes, now I see it; I did not notice it before.
Did you also compare it with CPU rendering?
Did you use a denoiser? Does the difference get smaller with more samples?
These are all good points. I don’t ever plan on using CPU rendering, but I should test it out as soon as I get the chance. The denoiser was on, but my understanding is that as long as the settings used are identical, OptiX and CUDA renders should produce identical results. I will test with more samples, but I doubt that’s where the problem lies: the differences cover a much larger area than a single pixel, and appear to be related to normals / specularity. Thanks for your help, I’ll get back to this topic with more information.
Yes, I am curious about the test, and yes!
It should be the same result.
To test, you can also render only the problematic area with a render region.
Results from CPU, OptiX and CUDA render, this time with compositing and denoise turned off, and max samples set to 512.
ALL of them look slightly different from one another.
You can report this to the bug tracker for developers to decide if this is a bug or a limitation (open Blender, “Help” menu, “Report a bug”).
You need to share a .blend file where developers can reproduce the problem. Much better if you can simplify the scene and share only the problematic elements.
Everything is different! Pick the most beautiful one.
What plants did you use: Botaniq, Scatter, …?
That’s true, to be honest, I don’t really care about these differences. I only thought that it might be revealing some flaws in my workflow, for example, a wrong shader node setup, or bad normals. I’m alright with them looking slightly different, and I’ll stick with OptiX rendering after making the upgrade to RTX cards.
In terms of assets, these were from Scatter5 biomes: Forest 08 and Wasteland, with modified distributions.
So I’d like to revisit this topic, having just done a few tests for another reason and in doing so discovered this very issue.
No need for a sample file; one can just use the BMW demo file on the Blender website.
On loading that and changing nothing, I rendered it using CPU and GPU with both CUDA and OptiX. All three images are different on Blender 3.3
Most of the difference is in dark/shadow areas and can’t be seen on normal viewing. The most visible difference is usually around the headlight on the nearest car. The same sample settings were used, with no denoising in any of the renders.
To really see the difference, load each image into Krita, layer them on top of each other and set the blend mode to ‘Divide’. If the images were the same, you’d get nothing but white. Instead you get various coloured pixels like this:
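If you’d rather measure the mismatch than eyeball it in a paint program, here is a minimal sketch of my own (not something from the thread) that computes per-pixel difference statistics between two renders with NumPy; you’d first load each saved render into an array, e.g. with Pillow’s `Image.open`:

```python
import numpy as np

def diff_stats(img_a, img_b):
    """Return (max, mean) absolute per-channel difference between two
    images given as arrays of equal shape (e.g. H x W x 3)."""
    d = np.abs(img_a.astype(np.float32) - img_b.astype(np.float32))
    return float(d.max()), float(d.mean())
```

For two genuinely identical renders both numbers come out as exactly 0.0; differing device backends would show small non-zero values, concentrated in the problem areas.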
So, this makes me wonder, is this the expected result, do other renderers have the same issue?
For a still image, I guess it doesn’t matter all that much. But suppose you were doing an animation with some frames rendered locally and some via a render farm, or you did most of the shot on the GPU but part of it didn’t fit in GPU VRAM and had to be done on the CPU. I can foresee the potential for jitter or image flicker if you mixed the frame rendering like that while expecting the result to be calculated, and hence produced, the same.
It is better that the developers answer this question. You can report this to the bug tracker for developers to decide if this is a bug or a limitation (open Blender, “Help” menu, “Report a bug”).
Well, there is this old report for 2.93, but it was closed/merged: https://developer.blender.org/T89351
With no real dev statement as to if it’s expected or otherwise.
It then got merged with what seems to be a very specific large plane issue: https://developer.blender.org/T85078
That one is again over a year old now, with no real solution, and more focused on just a large plane.
I guess I can report it and see what happens. Though I’d still like to know how other render engines compare. I only use Cycles, so, assuming that some of the other engines support both CPU and GPU rendering: if you render the same scene, is there any difference in the final image?
I don’t think there is a problem with you opening a new report explaining what you discovered. The maintainers are the ones in charge of deciding whether this is related to some other previous entry.
Or, if you think the issue you found already has an open entry, you can add a comment to that post with the differences you found, just in case it’s useful to the developers.
I don’t know that much about other render engines. What I do know is that when in Cycles there are differences in results using different devices, Blender developers prefer that the user report the problem so that they can see if it is due to some limitation or if it is something that can be fixed.
Well, I created a bug report: https://developer.blender.org/T101561
So guess we will see what happens.
I have added a comment in the report there. Are you sure Divide mode is what should be used to see the differences between layers? I have done a test, and even opening exactly the same image twice as layers, in Divide mode I get differences.
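One possible explanation for that (my guess, sketched numerically here; not a statement about Krita’s internals): Divide mode is numerically unstable wherever the lower layer is at or near zero, so even identical layers can show artifacts in near-black regions, whereas Difference mode gives exact black for identical images:

```python
import numpy as np

# Four sample pixel values, and an identical second "layer".
a = np.array([0.0, 1.0, 128.0, 255.0], dtype=np.float32)
b = a.copy()

# Difference mode: identical layers give exactly 0 (pure black) everywhere.
difference = np.abs(a - b)

# Divide mode: identical non-zero pixels give 1.0 (white), but 0/0 is
# undefined, so editors must special-case black pixels somehow.
with np.errstate(invalid="ignore"):
    divide = a / b
```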
Don’t even need Divide mode, I see a difference just looking at them in the area around the nearest headlight. Flicking back and forth you can see a spot that is lighter/darker across all three rendered images.
Here are the images:
So I heard back from a dev and basically, it’s not a bug; there are and will be differences. They try to get results as close as possible, but they will basically never be exactly the same.
On an interesting side note, I recently got a new GPU (a 3080 Ti). Compared to the same render done using OptiX on my old 1070 Ti, the OptiX render on the 3080 Ti (which of course has actual RT cores, etc.) shows some very minor differences.
So yup, for an animation, don’t mix your render hardware across frames.
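If mixing hardware is unavoidable, one could at least scan the finished frame sequence for sudden jumps before compositing. A rough sketch (the threshold value is an arbitrary assumption to tune per scene; frames are assumed already loaded as arrays):

```python
import numpy as np

def flag_flicker(frames, threshold=1.0):
    """Return indices of frames whose mean absolute deviation from the
    previous frame exceeds the threshold - candidates for flicker."""
    flagged = []
    for i in range(1, len(frames)):
        dev = np.abs(frames[i].astype(np.float32)
                     - frames[i - 1].astype(np.float32)).mean()
        if dev > threshold:
            flagged.append(i)
    return flagged
```

A flagged index only says the frame changed a lot; for slow scenes that is itself suspicious, while for fast action you’d want a higher threshold or a smarter per-region metric.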