AMD graphics cards and rendering animation?

Several months ago I read that AMD cards were good for rendering individual images but poor for animations because they had to compile something before rendering each frame. It was also said that a precompiled version would eventually be made available to fix this problem.

What’s the status of using AMD cards for rendering animations?

Here’s the thread about the compiling problem. Is this a Blender problem, or is it something that AMD needs to provide?

The compiling issue still exists. [edit: my bad, recompiling the kernel only happens once per session, not once per frame.] If you are deciding on a graphics card, I still recommend Nvidia: no precompiling every frame, and overall faster and more reliable performance. [edit: excepting flagship cards younger than 1 year]

Can you point me towards some proof that this compiling issue is real? I have checked the log files of my benchmark runs with Radeon GPUs, and you only have to compile the kernel(s) the very first time you fire up Blender. If you change your GPU to another Radeon, a recompile may take place; I’m not sure what happens after a driver update, but it is possible that it needs a recompile as well. All I can find is hearsay: my friend’s friend told me…

I use AMD GPUs for my rendering system. The kernel compile occurs at the beginning of the initial render ONLY.

During animation rendering it only does this on the first frame (just tested it on my RX Vega with an RX 480). I have yet to experience this compile on each frame, so I have to call that an incorrect statement.

Can you point me to benchmarks where Nvidia is overall faster than AMD?

Blender’s own tests (click to filter a device out) show that a GTX 1080 is noticeably slower than an RX Vega 64, with the GTX 1080 Ti on par (some scenes faster, some slower).

I have no experience rendering on AMD GPUs.

I do have about five years of experience watching AMD users gnash their teeth at how poorly (or not at all) their GPUs worked with Cycles, all while sailing smoothly on Nvidia’s CUDA from day 0.

I am glad to see that AMD is picking up their performance, but I need a little more time before I can trust them or recommend them.

I have edited my original post to correct inaccurate information, but my recommendation still stands.

If you are rendering a single frame (just an F12 render), does it recompile each time?

No, while the Blender session stays open it only compiles the first time, at least for my current workflow. I do expect that if I made a drastic change to my workflow (say, a unique shader that I hadn’t used yet), it might have to recompile, but at this time I don’t recall such a thing happening to me.
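The behavior described above (compile once per session, recompile only when the shader configuration changes) can be illustrated with a small caching sketch. This is not Blender’s actual code; the function names, the feature sets, and the simulated compile cost are all made up for illustration:

```python
import functools
import time

# Illustration only: the "kernel" is built the first time a given shader
# configuration is rendered, then reused from a cache for every later frame.
@functools.lru_cache(maxsize=None)
def compile_kernel(shader_features: frozenset) -> str:
    # Stand-in for the expensive kernel build step.
    time.sleep(0.01)  # pretend compilation takes time
    return "kernel[" + ",".join(sorted(shader_features)) + "]"

def render_frame(frame: int, shader_features: frozenset) -> str:
    kernel = compile_kernel(shader_features)  # cache hit after the first frame
    return f"frame {frame} rendered with {kernel}"

features = frozenset({"diffuse", "glossy"})

# Frames 1-3 share a single compile; adding a new shader triggers one more.
for f in range(1, 4):
    render_frame(f, features)
render_frame(4, features | {"sss"})

print(compile_kernel.cache_info().misses)  # prints 2: two configurations compiled
```

The point of the sketch is just that the compile cost is paid per unique configuration, not per frame, which matches what posters here observed in their logs.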


Thanks for debunking the myth. I’ll be rebuilding my system one of these days and am planning on using a Threadripper for the CPU. It’s nice to know that I can use an AMD GPU and not have to worry about offending the CPU with an Nvidia GPU.

I’m currently running a Threadripper 1950X with one RX Vega and one RX 480 (soon to add three more RX 480s once I get all the water blocks on).

I will say that Threadripper was created internally by AMD fans (it wasn’t driven by a business requirement; rather, engineers brought the solution to the business to see if they would like to release it). And I have no regrets… well, except that I wish I could afford the 2990X :slight_smile:

Do note that if you are planning to build one in the near future, I recommend also waiting until the Blender team releases support for the RTX 2080 series. Initial LuxRender results show the 2080 Ti finally regaining the crown, by a nice margin. That is, if one can actually afford one :slight_smile:

I used to feel the same way, but they appear to have gotten on track. See, for example,