I’m thinking of buying an RX Vega 64 for use in Blender 2.8. Is anyone else using it with Blender, and if so, is it a good buy?
I don’t use AMD, but I would say that if you want to be efficient with Blender Cycles, you should use Nvidia GPUs with CUDA architecture.
See you ++
Cheers for the response tricotou. I intend to use Eevee only, does that make a difference?
I use only Vega cards, and they keep performing better with each driver update. Vega is a very impressive compute device, roughly on par with the GTX 1080 Ti.
The RTX 2080 Ti is still not fully supported (the ray tracing cores aren’t used), but it is still a tad faster, so yes. Additionally, on Nvidia you can jump on E-Cycles, which boosts performance even further.
Still, I have 1 Vega 64 + 3 Vega 56, and the performance is impressive.
Note this is about a year old, but it’s Blender’s own internal testing.
Click on the legend to remove any device from the bar chart.
Still, the Vega 64 is slightly slower than the GTX 1080 Ti overall, and faster in one scene (koro) by a significant margin.
I wish Blender would update these performance numbers… they give a very nice and clear comparison.
Another independent test by Techgage, done a year ago, shows the Vega 64 ahead of the GTX 1080 Ti.
With regards to Eevee, I played around with it and it works normally, but I can NOT point to any actual benchmarks of rendered scenes and would be eager to see an Eevee performance comparison.
I’m not big on graphs and pie charts and stuff like that; my eyes go blurry and I get dizzy, but your word does it for me. I’m sold on a Vega! I’m going to hold off for a few weeks, though, and wait for the new AMD cards to hit the market, which might lower the prices on the Vega. Thanks for your insights, really, thank you.
Yes, it’s a good time to wait, though on the used market it’s already possible to find Vega 64s for under 200.
I used a Vega 64 before switching to two RTX 2070s.
Kernel compile times in Cycles were the biggest issue for me. Since you won’t use Cycles, that’s not a concern for you. But I will warn you that the Vega is loud, really loud. They produce a lot of heat and are real power hogs at the outlet.
While I would have preferred AMD for their effort in Open Source and particularly Blender, I wouldn’t want to go back at the moment.
Hi chalybeum, yeah, I heard that they’re loud. However, my situation is about money, or the lack of it, so if I have to put up with jet aircraft noises from my card I might just have to suck it up till I can upgrade to a quieter card.
I can relate to your situation and I am absolutely fine with your decision.
I just didn’t want these drawbacks to go unmentioned, because when I heard my Vega for the first time, I honestly thought it was broken.
All 4 in the same machine? Which motherboard are you using, and did you water-cool them?
lol that’s so funny
Undervolt to pull the heat production down; you won’t necessarily have to touch the clocks or performance to get the heat down a bit.
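For what it’s worth, on Linux with the amdgpu driver an undervolt can be done through sysfs. This is only a rough sketch: the card path, state index, clock, and voltage values below are made-up examples for a Vega 64 and need tuning per card, and overdrive has to be enabled via the `amdgpu.ppfeaturemask` kernel parameter first. Do this at your own risk:

```shell
# Requires root, and amdgpu overdrive enabled at boot, e.g.:
#   amdgpu.ppfeaturemask=0xffffffff  (on the kernel command line)
GPU=/sys/class/drm/card0/device   # adjust card0 to match your Vega

# Show the current SCLK/MCLK voltage-frequency table
cat "$GPU/pp_od_clk_voltage"

# Lower the voltage of the top SCLK state (index 7) while keeping
# its clock: "s <state> <clock MHz> <voltage mV>" -- example values!
echo "s 7 1630 1050" > "$GPU/pp_od_clk_voltage"

# Commit the modified table to the GPU
echo "c" > "$GPU/pp_od_clk_voltage"

# If the card becomes unstable, reset everything to defaults:
# echo "r" > "$GPU/pp_od_clk_voltage"
```

On Windows the same thing is usually done through WattMan in the Radeon settings; either way, stress-test after each voltage step down.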
@chalybeum Was it a reference design cooler?
Yes, I pre-ordered mine. Something I won’t do again.
Reference cards are good for one thing: water cooling. Reference designs are typically the most likely to get full-cover blocks from third-party manufacturers.
Yeah, that would have been a cool solution. But the compile times drove me in the direction of Nvidia anyway. By the way, it’s not only the fan making noise. The card itself made a very unpleasant scratching noise when it was calculating.
Something like this:
If it’s coil whine, then an undervolt, possibly with a slight underclock, might have reduced or solved it. But yeah, the compile times would not have been helped by that.
No, coil whine is what I would describe in English as “whistling”. If we met in person I’d use our German word “Fiepen” (a high-pitched squeaking).
Anyway, the sound my Vega made resembled a needle scratching over bare metal. I have to say it wasn’t that much of a problem in Blender, but it was really prominent in DaVinci Resolve. Then again, Resolve isn’t really well supported on AMD cards.
I know some people have managed to get their GPUs replaced by the manufacturer just over coil whine; what you describe could probably have gotten the card exchanged too. Not sure if it would have helped, but maybe.