Shiny new Radeon VII, Linux, Cycles, and a giant pain in the arse

Bought a new computer last week. Ryzen 3600, 32 GB of RAM, X470 “Gaming Plus” motherboard and a nice fat Radeon VII. For the OS I run Linux Mint. (s/o to Chaos & Evolutions, I’ve wanted a chance to switch to this OS ever since Francesco mentioned it)

Obviously my main goal here is to run Cycles faster than my old Nvidia 1050 could. It’s taken about a week of configuring drivers to get Cycles to run on the Radeon VII at all without freezing the whole machine, which is more or less what I expected with a new computer and Linux.

Now it’s rendering, at least. About 1.5x SLOWER than the 1050. Why.

I have tried open-source drivers, closed-source drivers, 3 or 4 different kernels… every combination either doesn’t run at all or runs unusably slowly. I am completely lost. Help!

PS: it’s also producing ultra-bright fireflies in Cycles (I know they’re ultra-bright because they result in completely blown-out highlights that cover a good chunk of the screen after the denoiser gets to them) that don’t show up on the 1050. Because what the hell, right.

Just for further analysis, could you try running a couple of older versions of Blender/Cycles, just to know whether it’s a regression in Blender or a system-side thing…?

Realistically: can you return the card?
Buying AMD cards for CG and Linux is just asking for trouble.
If you return the CPU as well, you can say no to GPU rendering altogether and get a 3900X or even a 3950X for the money you spent on the 3600 and the VII. No driver worries, you can allocate threads during a render, and you have a lot more RAM to work with.

Sure, which versions would you suggest?

2.79 for starters; if that works fine, maybe look for a build of the master branch from between the 2.79 and 2.80 releases. I think just trying 2.79 and comparing it to 2.80 might be enough to see whether it’s a regression or not.
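If it helps, something along these lines should grab the old 2.79b build and run it without touching your current config (the exact archive name is from memory, so double-check it against download.blender.org):

```bash
# Grab the last official 2.79 release (verify the exact filename at
# https://download.blender.org/release/Blender2.79/ first).
wget https://download.blender.org/release/Blender2.79/blender-2.79b-linux-glibc219-x86_64.tar.bz2
tar xf blender-2.79b-linux-glibc219-x86_64.tar.bz2

# --factory-startup skips your saved preferences, so 2.79 and 2.80
# can be compared with clean settings.
cd blender-2.79b-linux-glibc219-x86_64
./blender --factory-startup
```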

I hope you manage to resolve the GPU issues for sure.

And definitely stick with GPU rendering over CPU rendering. I might agree in part with anaho with regard to potentially returning the AMD GPU and getting an RTX card in its place.

Below is a recent summary by Techgage on CPU vs. GPU (and mixed) rendering, and another, newer one on OptiX (granted, on Windows, which means it should be even faster).

Hm. Just for fun, could you try a different distro? Bodhi has always been pretty awesome, but then, I don’t run it on the latest hardware like you do :slight_smile: www.bodhilinux.com

It’s Ubuntu-based, so whatever you were doing to get your drivers going in Mint, similar procedures should work.

I have no idea WHY I’m suggesting this, other than when I used Mint, it was a resource hog. Almost WIN-ish! Bodhi is MUCH leaner.

Report back here, will you please? :slight_smile:

Alright guys, sorry it’s taken me so long to respond, my life is hell rn.

So I went to try 2.79, but 2.79 doesn’t even detect the Radeon VII…? [screenshot]

I did a quick search for which distro is best for OpenCL… but the consensus seems to be that Ubuntu is best, and Mint is an Ubuntu derivative anyway. So idk where to go from here…

PS: as far as Mint being a resource hog, Mint comes in three flavors of varying “fanciness”. I run MATE, which I find to be orders of magnitude lighter than Windows. Maybe you were using Cinnamon, which is quite heavy.

Edit: I’m running memtestCL to see if there are any obvious problems with the hardware itself. It hasn’t found any errors yet.
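In case it’s useful to anyone following along, this is roughly how I’m running it (memtestCL’s positional arguments, megabytes to test and iteration count, are from memory, so check its README if that’s off):

```bash
# Test 2048 MB of VRAM for 50 iterations (arguments from memory,
# see the memtestCL README for the exact usage).
./memtestCL 2048 50

# In a second terminal, watch for amdgpu errors while the test runs.
dmesg --follow | grep -i amdgpu
```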

Bump.

Should I look into coding a custom driver or something…? It doesn’t seem like a solution for running Cycles efficiently on the Radeon VII on Linux exists at all at the moment. I kind of wish I had known this before I bought the card; it was advertised as being “good for blender” and having “open source drivers ready on launch”, which was almost a year ago…

I haven’t heard of other people having issues with that card on Linux. I don’t know what the issue is, but it’s probably on your end.

Hi. To determine whether the problem is on the driver side or the Blender side, you can run LuxMark (GPU only) and compare results for the different scenes:
http://wiki.luxcorerender.org/LuxMark_v3#Binaries

Here are the results for the Hotel Lobby scene:
http://www.luxmark.info/top_results/Hotel/OpenCL/GPU/1
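
Before comparing scores, it’s worth confirming that the OpenCL stack even exposes the card. A rough sanity check like this (assuming the clinfo package is available on your distro) lists the platforms and devices that any OpenCL app, LuxMark or Cycles, will be able to see:

```bash
# Install the OpenCL info tool (package name on Ubuntu/Mint).
sudo apt install clinfo

# List every platform/device the OpenCL ICD loader can see.
# The Radeon VII should appear under an AMD platform (ROCm or AMDGPU-PRO);
# if it is missing here, LuxMark and Cycles will not see it either.
clinfo | grep -E 'Platform Name|Device Name|Device Version'
```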

LuxMark v3.1 on my Radeon VII, rendering the Hotel Lobby scene, gets a score of 3,265. The other Radeon VIIs in that list score 6,919 or higher. This is consistent with the sub-par results I’ve gotten from Cycles. However, it doesn’t throw any errors… what would cause a GPU to work, but only work this slowly? I thought it either worked or it didn’t, aha.

Most likely the drivers you’re using.

Which ones are you using?

amdgpu is my display driver.
idk how to check what my OpenCL driver is… I installed a few different ones before it worked at all and I forget which one I installed last :frowning: I wanna say “ROCm” tho.

What I remember from my Ubuntu/AMD days is that only the amdgpu-pro driver leverages the full capability of OpenCL.
You might have gotten into trouble while experimenting with all of those drivers, in terms of conflicts and/or precedence.
Ubuntu-based systems are a bit opaque in this area, and I don’t blame you as a newcomer to Linux. There is a chance your system isn’t using the newly installed drivers but is still falling back to the default ones.
Go read about blacklisting and lspci; the Arch Linux Wiki gives a great high-level overview of those topics even if you use another distro. The quick checks sketched below should also tell you what is actually in use.
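
A rough sketch of those checks, assuming an Ubuntu/Mint-style setup (package names and paths may differ on other installs):

```bash
# Which kernel driver is actually bound to the card?
# "Kernel driver in use:" should say amdgpu for a Radeon VII.
lspci -k | grep -A 3 -i 'vga\|display'

# Which OpenCL implementations are registered with the ICD loader?
# Leftover .icd files from earlier driver experiments can shadow the one you want.
ls /etc/OpenCL/vendors/

# Which AMD/ROCm/OpenCL packages are still installed?
dpkg -l | grep -E -i 'amdgpu|rocm|opencl'
```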

Many years ago, when I was an ATI user on Linux, AMD’s policy was this: the open-source drivers were relatively stable, but their features and performance were poor. The proprietary drivers performed better than the open-source ones (still far from Windows performance) and had more features, but they were unstable. AMD promised for a long time that they would deliver a single (open-source) driver with all the features and good performance, but those were always just promises.
I don’t think much of this has changed under Linux.

@cerebral_malfunction, if you still can’t solve the problem, try making some noise about it on the official AMD site/forums.

It might be that the crappy thermal pad between the heatsink and the GPU die is bad, causing the card to throttle its clocks as it gets too hot.
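
That’s easy enough to check while a render or a LuxMark run is going. Something like this (assuming lm-sensors is installed, and rocm-smi if the ROCm stack is what ended up in use) should show whether the clocks drop as the temperature climbs:

```bash
# Refresh temperatures every second while the GPU is under load;
# the amdgpu sensor exposes the card's temperature readings.
watch -n 1 sensors

# If the ROCm tools are installed, this also reports current clocks,
# fan speed and power draw, which makes throttling easy to spot.
watch -n 1 rocm-smi
```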


Hm, I didn’t have such problems with my Vega back in the day.
Of course you’re right that GPU drivers are a major issue on Linux. I really hoped that the collaboration between the BF and AMD/NVIDIA would somehow carry over into the development culture of the drivers, but that seems to have been a pipe dream.

@Felix_Kutt good point

An easy way to rule out hardware issues is to try Windows on the same machine.


Indeed, Windows is a good tool to investigate hardware issues.