Problem with AMD and Cycles

Hello, I have an AMD Radeon HD 7700 Series graphics card, but Blender doesn’t seem to recognize it for use with Cycles. I’ve tried updating the drivers (my PC is new anyway) and nothing happens.
How can I make sure I can do GPU rendering with Cycles?

Thanks for reading!

You cannot use your card for GPU rendering. You need a compatible Nvidia card to use CUDA.
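If you want to confirm what Blender actually detected, here is a minimal sketch against the Blender 2.6x-era Python API (property names are from that generation of Blender and may differ in other versions):

```python
# Minimal sketch (Blender 2.6x-era API): show which compute device type
# Blender detected at startup. Run this from Blender's Python console.
import bpy

system = bpy.context.user_preferences.system

# Reports 'NONE' or 'CUDA' depending on what was found at startup; on an
# AMD-only machine this typically stays 'NONE', which is why the GPU
# option never appears in the Cycles render settings.
print("Compute device type:", system.compute_device_type)
```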

It’s a shame that Cycles only uses Nvidia right now. A lot of people like AMD, but hey, Nvidia does like toying with their competitors lol

OK GUYS, YOU JUST GOT ME UPSET. MY NEPHEW JUST GAVE ME AN AMD Radeon™ HD 7770 GHz Edition, SO NOW YOU’RE SAYING IT’S NO GOOD TO USE WITH BLENDER? THIS IS JUNE 2013. DID THEY UPDATE ANYTHING FOR IT TO RENDER IN BLENDER? A DRIVER OR SOMETHING??

It’s not Blender’s fault, it’s really AMD’s/OpenCL’s fault. Didn’t you notice that no pure GPU renderer supports OpenCL, while all of them support CUDA?
Before going all CapsRage, you should read around a bit on this forum.

I’m not sure if AMD and ATI are the same company. To me they’re two separate companies: ATI only makes video cards, while AMD apparently makes both CPUs and GPUs. Right now I’m having a serious problem with Blender itself. I have a computer that is as powerful as a workstation because it uses an AMD FirePro V5900, which is built with the intention of creating 3D graphics. So it’s very powerful.

However, I bought the graphics card to work with 3D software. The problem? Blender itself is causing my computer to shut down repeatedly by overheating the CPUs, which is bad. I’ve been hearing that Blender can only use Nvidia. If I understand correctly, Nvidia produces gaming graphics cards, not CG graphics cards.

In response to “it’s not Blender’s fault”: so basically it is not Blender’s fault that the computer shut down before the cores could damage themselves? Anyone with a piece of garbage will fry their CPUs just by trying to render in Blender. I have to say two things. One, Blender wasn’t designed to run on a professional graphics workstation. Two, AMD built both the CPU on the motherboard and the GPU on the AMD FirePro card, so if Blender can run on an AMD processor, why couldn’t it run on the AMD GPU as well? It looks like nobody has ever built Blender for a workstation that doesn’t use Nvidia.

No offense meant, I’ve been told that “Blender was made to run on garbage. The only halfway decent component garbage has is a processor.”

In order to solve the shutdown problems, Blender must get off the CPU and onto the GPU, no matter what kind it is. The CPUs will shut down if they get too hot, and they’ve been getting way too hot recently.

I noticed that the person above said “no pure GPU renderer”. So basically Blender wasn’t made to run on professional workstations with one of the most powerful GPUs on the market?

Again, don’t blame the hardware. Hardware can’t change, but software can. Blender just wasn’t built to work with some of the pretty powerful graphics cards out there. That shortcoming in the program itself will cause problems in the professional graphics industry. Unless that changes, most of the graphics artists out there who use a pure GPU renderer won’t be able to use their cards.

Is there another way of tricking Blender into thinking the GPU is a CPU? I think it would help keep the CPUs cool.

Oh, and by the way: according to the AMD FirePro V5900 website, the card supports OpenCL 1.1, and yet Blender acts like it only sees the CPUs.

First of all, AMD bought ATI some years ago, so it’s the same company.
Second, Nvidia does make professional cards, called Quadro.
Five or six years ago I had an ATI FireGL V7100; it was top of the line at the time.
But with Maya and dual screens I had glitches and problems. Then I bought an Nvidia GeForce card
and my problems and glitches were gone. And GeForce is a gaming card, not a pro card. What I mean is that I don’t trust AMD/ATI for any serious 3D work. This is my experience and just my opinion.
About your overheating problems, I think it’s your hardware’s fault. Did you test another 3D application, or Cinebench, or even the Intel stress test?

As the other guys said, Cycles supports only CUDA, not OpenCL.

My apologies. I was pretty frustrated last night after trying repeatedly to render a scene, only to see the computer just turn off. After twelve tries my patience was pretty frayed. So sorry for any offense.

Thanks for the corrections.

The computer I have is designed to run not just one 3D CG program; it’s also designed to work with others as well, like Maya, 3D Studio, and so on. The graphics card was bought because Maya is pretty intensive on the processors. So I got the graphics card to reduce the strain on the CPUs.

I am not expecting the GPU to run the operating system, just to take over the rendering. That way it can free up the CPUs. After all, weren’t GPUs supposed to handle rendering?

The overheating is isolated to Blender itself. I’ve run a 3D game called Borderlands 2: no shutdown, no overheating. I’ve also built a 200 to 300 MB scene in Maya and run it; the computer never shut down. I’ve rendered an image in it, but it never overheated.

There are actually two solutions to the overheating problem. One is to get a bigger fan; the other is to force Blender to use only one core. I have four cores, and Blender was pushing all of them up to 100% of their capability, and all four of them got very hot, very fast. So I forced Blender to use one thread (one core, obviously); the render took longer, but the computer didn’t shut down.
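For reference, that single-thread setting can also be applied from Blender’s Python console instead of the UI (the `-t 1` launch flag does the same thing from the command line); a minimal sketch against the 2.6x-era API:

```python
# Minimal sketch: cap Blender's render threads from Python, the same
# setting as Render Properties -> Performance -> Threads (or `blender -t 1`).
import bpy

render = bpy.context.scene.render
render.threads_mode = 'FIXED'  # default 'AUTO' uses one thread per core
render.threads = 1             # render on a single core only
```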

Good to hear that Blender is in good hands. Why don’t Octane, V-Ray RT, and iray support AMD GPUs? Why just NVIDIA?

Obviously I’m a little behind the times on AMD and ATI merging. Thanks for letting me know.

Low system requirements, which means Blender is better than most 3D programs in that regard. In my opinion, Blender is better than Maya and 3D Studio Max. If it were less CPU intensive, that would help. If they set Blender up to run on the GPU when the computer has one, it could shift the rendering workload off the CPUs and onto the GPUs. That’s just an idea of how it might help the computer stay cool.

To bls: I understand and respect your opinion. I am more accustomed to the ATI/AMD brand. I’ve used Maya and I didn’t notice any serious problems with the AMD FirePro; it renders fine without any problems.

To Zalamader: I apologize for giving you a reason to get annoyed and/or angry. I thought that even though AMD bought ATI, they would still use ATI’s technology in the Radeon graphics cards. Unless they changed Radeon? Ah, so AMD basically has different OpenGL software that Blender doesn’t recognize; did I get that correct?

What I have is a quad-core Intel i5 with the stock cooling fan (apparently not a high-end fan). Blender was using all four cores until I told it to use only one, which apparently solved the heating problem but also slows down the render. Not the best solution, but better than nothing.

I’ve never seen my computer shut down after only about 10 to 15 minutes, tops, of rendering a movie in Blender. It never happened before; this was the first time I’ve seen it do that. All other programs run fine. You are right that the CPUs won’t get damaged, but I have a question: what would happen if it did this over and over, again and again? Would the CPUs start to wear out faster? The computer shut down no fewer than 12 times, all within less than 20 minutes. I restarted the computer and tried again; it shut down again, and that gets quite annoying after a couple of tries.

Again, good to hear that Blender is in good hands. Oh, and I was rendering in Blender Internal. As for that person, he doesn’t think highly of Blender, and when my computer shut down after rendering a couple of frames, it didn’t improve his opinion of it; it makes the program and me look bad to him.

EDIT: I’ve just looked at an NVIDIA vs. AMD comparison. It basically says that Blender is far behind Max. I think I will cry.

Why do you keep going on about Blender using all cores and causing overheating, or saying that Blender should be less CPU intensive?
That’s complete nonsense.

If a tool uses all your cores at 100%, that’s actually good; you want that.
Processors are made to handle full load. A processor with the reference cooler can run at full load within its thermal specifications. That’s how they are designed, built, and shipped.

If your CPU gets too hot, it’s your fault alone: neither Blender’s nor the CPU’s. Either you have no or bad thermal compound, a bad or damaged fan, a dust-clogged cooler or bad air circulation in your case, or you live somewhere with an average daytime temperature of 45°C.

A good first step would be to install CoreTemp or SpeedFan, both free, to keep an eye on your CPU temperature.
You say you have an i5? That’s even more puzzling, because an i5 actually decreases its clock speed and lowers its voltage if it gets hot. Shutting down is not done by the CPU as far as I know, at least not by the Core i series. But I might be wrong.

You might have set a mainboard feature in the BIOS to shut the machine down when the CPU temperature reaches a set threshold.
Or, even more trivially, your PSU is broken or not powerful enough, and under full load the system just dies.

And again, to boil it down in the simplest way, as it seems you still haven’t fully understood it:

AMD, which also builds your card, only supports OpenCL.
Nvidia offers CUDA, and does so exclusively.
CUDA is used by all major GPU renderers; Blender, too, currently only works properly with CUDA.
It’s not Blender’s fault; it’s the fault of AMD’s OpenCL compiler, which is supposed to compile the render kernel.
Your card will not take rendering load off the CPU, because it’s not properly supported yet.
Blender Internal will NOT support GPU rendering; only Cycles does, and Cycles also runs on the CPU.

Yes, I can understand that. It’s just that I get annoyed when people post opinions based on inaccurate information and misconceptions. If left uncorrected, somebody else might pick them up, and soon enough they become “common wisdom”.

The computer I have is designed to run not just one 3D CG program; it’s also designed to work with others as well, like Maya, 3D Studio, and so on. The graphics card was bought because Maya is pretty intensive on the processors. So I got the graphics card to reduce the strain on the CPUs.

That doesn’t really work. In the vast majority of cases, you cannot just offload work from the CPU to the GPU.

I am not expecting the GPU to run the operating system, just to take over the rendering. That way it can free up the CPUs. After all, weren’t GPUs supposed to handle rendering?

GPUs accelerate the viewport, but the final rendering is usually performed on the CPU, because that’s much more flexible. Only since GPUs have become programmable has there been a shift towards using them for production rendering as well, but it’s still early days and there are many limitations (lack of memory, a difficult programming environment, crappy drivers/compilers).

The overheating is isolated to Blender itself. I’ve run a 3D game called Borderlands 2: no shutdown, no overheating. I’ve also built a 200 to 300 MB scene in Maya and run it; the computer never shut down. I’ve rendered an image in it, but it never overheated.

I don’t know the characteristics of what you did there, but games in general tend not to use 100% of the CPU.

There are actually two solutions to the overheating problem. One is to get a bigger fan; the other is to force Blender to use only one core. I have four cores, and Blender was pushing all of them up to 100% of their capability, and all four of them got very hot, very fast. So I forced Blender to use one thread (one core, obviously); the render took longer, but the computer didn’t shut down.

I guess you could try using only 3 cores? That will leave your system more responsive, and maybe it won’t overheat. You could also try “underclocking” your CPU.

Good to hear that Blender is in good hands. Why don’t Octane, V-Ray RT, and iray support AMD GPUs? Why just NVIDIA?

NVIDIA provides a more mature GPU programming solution called CUDA, as well as a more reliable OpenCL compiler.

To bls: I understand and respect your opinion. I am more accustomed to the ATI/AMD brand. I’ve used Maya and I didn’t notice any serious problems with the AMD FirePro; it renders fine without any problems.

Whatever you do in Maya, you’re probably not rendering with the GPU. Neither mental ray nor the Maya internal renderer uses it.

I thought that even though AMD bought ATI, they would still use ATI’s technology in the Radeon graphics cards. Unless they changed Radeon? Ah, so AMD basically has different OpenGL software that Blender doesn’t recognize; did I get that correct?

They’re using what used to be ATI technology. OpenGL works just fine with AMD, but the OpenCL compiler will fail to run Cycles, which is why the option is hidden from you (to avoid people complaining about it failing).
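If memory serves, builds from that era only exposed the experimental OpenCL device behind an environment variable (CYCLES_OPENCL_TEST; both the name and its values are stated here as an assumption, so treat this as a sketch, not a supported switch):

```python
# Hedged sketch: launch Blender with the (assumed) CYCLES_OPENCL_TEST
# environment variable set, which in 2.6x-era builds exposed the hidden,
# unsupported OpenCL device option. Expect kernel compile failures on AMD.
import os
import subprocess

env = dict(os.environ, CYCLES_OPENCL_TEST="all")  # assumption: "all" enables every OpenCL device
subprocess.call(["blender"], env=env)             # assumes `blender` is on the PATH
```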

I’ve never seen my computer shut down after only about 10 to 15 minutes, tops, of rendering a movie in Blender. It never happened before; this was the first time I’ve seen it do that. All other programs run fine. You are right that the CPUs won’t get damaged, but I have a question: what would happen if it did this over and over, again and again? Would the CPUs start to wear out faster? The computer shut down no fewer than 12 times, all within less than 20 minutes. I restarted the computer and tried again; it shut down again, and that gets quite annoying after a couple of tries.

Most programs won’t use all of the CPU 100% of the time. Try rendering a movie in mental ray and see if that works out better.

EDIT: I’ve just looked at an NVIDIA vs. AMD comparison. It basically says that Blender is far behind Max. I think I will cry.

I don’t know what that means. It shouldn’t come as a surprise that a commercially developed, $3,500 piece of software is (in some regards) ahead of software developed mainly by volunteers.

EDIT:

I hadn’t considered that. That might indeed be the case.

Thanks for the feedback. What I meant is that using all of the CPU cores at 100% raised the temperature too fast; the cooling fan couldn’t cool them off fast enough.

BTW, I initially suspected it was a hardware problem. However, somebody else blamed it on Blender. “Made to run on garbage” was what that person said, and not in a good way.

I got CoreTemp yesterday, and it showed all four cores going over 160°F, which is above the shutdown temperature set in the BIOS. So I told Blender to use one thread, which solved the problem but takes longer to render. I’m thinking of getting a Zalman heat sink and seeing if that keeps the computer happy.

I now realize that Blender doesn’t have a hardware-accelerated renderer for Blender Internal, just a software renderer. It seems that when it comes to that kind of rendering, the GPU goes completely unused and the software renderer runs on the CPUs.

The PSU isn’t the problem. All four of the CPU cores were being pushed to 100%, so changing it to one thread solved the problem, though it takes longer to render.

I have a question: didn’t you say that Blender uses fewer system resources than most programs? That depends on how many faces there are in a scene, correct? The scene I was rendering has 21,515 faces, and other users online have said that if the face count exceeds 20,000 it overheats the computer. It looks like rendering 20,000+ faces uses up a lot of system resources, correct?

I definitely need a better heat sink and cooling fan.

If your CPU overheats under full load, that’s just plain bad hardware design.

If you want to use all threads but prevent full CPU load, you can go to the power plan’s advanced settings and set the maximum processor state to under 100% (assuming you use Windows), as sketched below.
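The same setting can also be scripted; a hedged sketch for Windows using powercfg’s documented GUID aliases (the 80% limit is just an example value):

```python
# Hedged sketch (Windows only, run as administrator): cap the power plan's
# "maximum processor state" by shelling out to powercfg, the same setting
# as Power Options -> advanced settings -> Processor power management.
import subprocess

LIMIT = 80  # percent; anything under 100 prevents sustained full clocks

subprocess.check_call(["powercfg", "/setacvalueindex", "SCHEME_CURRENT",
                       "SUB_PROCESSOR", "PROCTHROTTLEMAX", str(LIMIT)])
subprocess.check_call(["powercfg", "/setactive", "SCHEME_CURRENT"])  # re-apply the plan
```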

I run on garbage: an Intel Pentium 4 (single core, hyperthreaded), 1 GB of RAM, an onboard ATI chip, no plug-in GPU card, and no issues other than viewport slowdown and long render times.

You can’t really blame the program if your computer isn’t capable.

(OK, I don’t use it all the time now, but enough to justify this post.)

Ah, thanks for clarifying the requirements. That’s true: Blender is smaller than the commercial packages and has practically everything built in.

That’s true about it adding up. The strange thing is that the first time I rendered the movie, before I added the second plane, it rendered just fine. When I added the plane the following morning, the computer started to overheat. I have no idea why it worked fine the night before and not in the morning.

The problem, obviously, is that the cooling fan can’t cope with all four CPU cores running at the same time and couldn’t cool them down. So I will fix that when I can.

That is not a Blender issue, nor a CPU issue.
It’s a user issue: you have to make sure you have proper cooling.
You also don’t drive a V8 through a desert with half a gallon of water in the radiator and then blame the sun or the car when it stops…

You should ignore your pal blaming Blender and having a bad opinion of you and Blender. He obviously has no technical expertise and is just a hater. There’s not much you can gain from him, unless he’s a client with money for you…

160… that’s around 7 potatoes or 15.7 football fields… or about 71 degrees Celsius. :stuck_out_tongue:
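For anyone who wants to check that conversion, a trivial sketch of the standard formula:

```python
def fahrenheit_to_celsius(f):
    """Standard conversion: C = (F - 32) * 5/9."""
    return (f - 32) * 5.0 / 9.0

# The 160 F that CoreTemp reported above:
print(round(fahrenheit_to_celsius(160), 1))  # -> 71.1 C
```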

For the Core i5-3570K, the i5 flagship, the maximum allowed temperature at the heatspreader is 67.4°C.
So yeah, you’re overheating.
Another problem is that the heatspreaders (that’s the metal plate that’s part of your CPU) of the Sandy Bridge generation (i5-2500K) were made on a larger production process, and the heatspreader was soldered onto the die.
The new Ivy Bridge generation (i5-3570K) is made on a smaller process, which means a smaller surface area has to give off almost the same wattage of heat; on top of that, the heatspreader sits on the die with thermal paste, causing these chips to run hotter because the thermal conductivity is worse.

Anyway, with a decent tower cooler like the Cooler Master V6, Thermalright Silver Arrow SB-E, Noctua NH-D14, or Noctua NH-U12P SE2, you should have no problems.

I personally have the NH-U12P SE2 and a 3570K overclocked to 4 GHz. Lately we’ve had around 34°C here, which is around 93°F, and my CPU runs under full load at around 61°C. That’s a delta-T of 27°C, which is quite good.
Never forget that you can only cool relative to your room temperature, and more importantly the temperature inside your computer’s case.
If you don’t have any fans expelling heat from your case and it heats up to 40-50°C, which the chipset, GPU, and HDDs can do quite easily, and your cooler can only manage a 27°C delta, your CPU will heat up to 50°C + 27°C = 77°C.
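To make the arithmetic in that last paragraph explicit, a toy sketch:

```python
# Toy model from the paragraph above: the temperature a cooler can reach
# is the case air temperature plus the cooler's delta-T, never an absolute.
def cpu_temp_c(case_air_c, cooler_delta_c):
    return case_air_c + cooler_delta_c

print(cpu_temp_c(34, 27))  # well-ventilated case: ~61 C under full load
print(cpu_temp_c(50, 27))  # hot, unventilated case: 77 C, well past 67.4 C
```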

Another alternative is an all-in-one, ready-to-run watercooling system like the NZXT Kraken, Corsair Hydro series, or Thermaltake Water Pro series… there are dozens by now.
The disadvantage is that you often need to cool the voltage regulators sitting around your CPU socket separately, as they are no longer in the airflow of the CPU cooler.

hth.

With twin Xeons and 12 cores (24 with hyperthreading), my system runs very cool during CPU rendering, with all 24 threads working evenly. Blender distributes the load perfectly.

CUDA = NVIDIA, hence the reason I cancelled my order for an HD 7950 for Mac and have a replacement K5000 for Mac on the way. The first K5000 was defective and crashed my system when GPU rendering (shame on you, NVIDIA), but that’s another story…

That’s the strange thing. The night before the computer started having problems, I rendered the exact same scene, except without the added plane. I was rendering 750 frames, and the first time it rendered all of them without shutting down. I turned the computer off at about 4 AM local time. At about 12 PM to maybe 2 PM (I don’t remember the exact time), I added a plane, subdivided it, and added a displacement to it; I even had it textured. It was the same scene, but the second time around the computer overheated. It didn’t overheat the first time; every time after that, it did. Maybe the fan got tired of working at full load.

I hadn’t noticed that Blender distributed the load; all of the cores were running at full capacity. There was no distributing, just all of them running at the same time.

I will get a new fan with a heat sink, and that should solve the problem. Thanks, guys, for helping.