Ok, so here’s the deal. I’m able to use two instances of Blender 2.64 to render out alternate frames of the same file. Both instances are rendering with Cycles: the first with CPU cores, and the second with GPU cores on my dedicated GTX 580 Classified card.
The frames look nearly identical; however, when the sequence is assembled into a video file, you can see the texture of the background wall shifting between frames (even though the foreground objects seem to move smoothly). It almost feels like the camera is bouncing a little. Perhaps this is a UV issue between the two renderers? I’m not sure.
If anyone has any advice, I’d appreciate it. For now, I’ll just split the job into separate frame ranges, but it’s always nice to be able to give both instances the same frame range and use the Placeholder option.
Can you upload two PNGs of the same frame, one rendered on CPU and one on GPU?
I’ll have to render an extra duplicate frame to compare both. I’ll post once I have it.
Is the background image an environment background (like a sky)? I’ve noticed CPU–GPU differences in Cycles in the position of those, which of course would lead to such “jumping” backgrounds…
On the GPU, Cycles will use hardware texture interpolation, which probably can’t be matched predictably by the CPU routine. I don’t know whether that is the cause, but it is a likely candidate for further investigation.
In the future, GPU texture handling will have to be redone anyway (accounting for the texture limits) and I believe at that point a software routine for interpolation will be used, too.
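To make the difference concrete: a half-texel disagreement in where the interpolation samples are placed shifts the entire texture. Below is a minimal sketch in plain Python (not actual Cycles code; the `center_convention` flag is just an illustration of two common sampling conventions) of a bilinear lookup:

```python
import math

def bilinear(tex, u, v, center_convention):
    """Bilinear lookup into a 2D grid `tex` (rows of floats) at UV (u, v) in [0, 1].

    center_convention=True places samples at texel centers (the usual GPU
    hardware convention); False places them at texel corners. The two
    conventions differ by half a texel.
    """
    h, w = len(tex), len(tex[0])
    x = u * w - (0.5 if center_convention else 0.0)
    y = v * h - (0.5 if center_convention else 0.0)
    x0, y0 = math.floor(x), math.floor(y)
    fx, fy = x - x0, y - y0

    def texel(ix, iy):  # clamp-to-edge addressing
        return tex[min(max(iy, 0), h - 1)][min(max(ix, 0), w - 1)]

    top = texel(x0, y0) * (1 - fx) + texel(x0 + 1, y0) * fx
    bottom = texel(x0, y0 + 1) * (1 - fx) + texel(x0 + 1, y0 + 1) * fx
    return top * (1 - fy) + bottom * fy

# Same UV coordinate, two conventions, different results:
tex = [[0.0, 1.0],
       [0.0, 1.0]]  # left column black, right column white
print(bilinear(tex, 0.5, 0.5, True))   # → 0.5 (sample lands between the columns)
print(bilinear(tex, 0.5, 0.5, False))  # → 1.0 (sample lands half a texel over)
```

Sampling the exact same UV coordinate under the two conventions returns different values, which is exactly the kind of per-renderer offset that would make a background appear to jump between alternating CPU/GPU frames.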
@bashi - the background image is a texture on the interior side of a cylinder that is forming the backdrop. The camera is rotating around the position where the product would be sitting in the full scene.
@zalamander, I suspected that something like that might be happening. Do we have any technical evidence that this is the case from any of the devs (are you a dev?)? Also, I’m wondering if it will ever be possible to harmonize the two render modes, so CPUs and GPUs can be used to render the same frame?
Do we have any technical evidence that this is the case from any of the devs (are you a dev?).
I’m not a (blender) dev. If you want evidence that hardware interpolation is being used for CUDA, you can just look at the source (which I did). Whether that is causing your problem, as I’ve said, I don’t know.
Also, I’m wondering if it will ever be possible to harmonize the two render modes, so CPUs and GPUs can be used to render the same frame?
Yes, by using an equivalent software routine on the GPU. This is desirable for quality reasons (better filters than bilinear are possible) as well as for flexibility (only 128 hardware textures can be used with CUDA). The drawback would be a performance hit.
But again, I’m not sure if that is really causing your problem. It may also be related to floating-point precision or maybe there is a subtle bug in the software interpolation.
EDIT: This image illustrates the difference in interpolation on the CPU (top) and GPU (bottom). The Cycles routine chooses different interpolation points. Since this is also inconsistent with both the OpenGL viewport and Blender Internal (and quite possibly other renderers), you might consider filing a bug report.
I’m fairly sure now this subtle offset causes your problem. Try increasing your texture resolution and see if it is less obvious.
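The resolution suggestion works because the mismatch is on the order of half a texel, so its size in UV space falls off as 1/resolution. Some back-of-the-envelope numbers (assuming the offset really is half a texel):

```python
# How large a half-texel offset is, as a fraction of the UV range,
# at a few common texture widths (assumes the offset is half a texel):
for width in (512, 1024, 2048, 4096):
    print(f"{width:>4}px: offset = {0.5 / width:.6f} of the UV range")
```

Doubling the texture resolution halves the apparent shift, so at high enough resolution the jump should fall below what the eye can pick up.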
I guess I’m wondering if this is something that should be reported in the bugtracker or if it’s something most devs are aware of.
Looks like you missed my edit by posting the very minute I did. I do think this should be brought to attention: since it is an inconsistency that causes real problems, it is worth a bug report. Do note that this code isn’t final, and it will certainly be extended or changed in the future.