Hi everyone, I’m pretty new here, so I started to follow some tutorials. When the topic of rendering came up and using your GPU to make faster renders, I noticed mine was not being used. For clarity, I’m using Blender 2.8 on a Windows 10 laptop with a GTX 1050 Ti with 4 GB of VRAM, an Intel i5 at 2.5 GHz, and 16 GB of RAM. In my preferences I have the CPU disabled and the GPU checked. In my render settings I have GPU Compute on, yet when rendering it shows only about 300 MB of memory being used, and Task Manager shows 100% CPU usage. I don’t know what is causing this. Any help would be greatly appreciated, thank you.
Is the GPU option actually available? That is, the option hasn’t been greyed out?
Also, did these issues exist in previous versions of Blender?
I had similar issues with Blender 2.8 a while ago. It ended up fixing itself without my intervention. It was likely that the drivers for my graphics card (GTX 1060) did not yet support the new Blender update, and all it took was time before the drivers were updated to work. Could this be the issue? Maybe try updating/reinstalling your drivers and see what happens.
Is your problem solely that the task manager is showing 100% CPU usage? Do note that the CPU still has to do work to keep the GPU fed with data.
If you install HWiNFO and run another render, what does it show for the GPU? Even if the GPU isn’t working at a constant 100% utilization, as long as it isn’t constantly sitting idle at around 0%, chances are it is working fine. It might simply be a CPU bottleneck.
That is, the CPU might be clocked so low that it can’t keep the GPU fed with data, so the GPU sits around waiting and only kicks into gear once it has everything it needs to start rendering.
Hi, thanks for your reply. I downloaded the tool you suggested and ran another render; it showed about 3% GPU usage. Is this a CPU bottleneck I can’t avoid?
Hi, thank you for your reply. The option is not greyed out; I’m about to reinstall my drivers and hopefully that works.
That very much does sound like the GPU might not have been used for rendering at all. Hopefully the driver reinstall helps. Make sure your card’s CUDA compute capability is also recent enough, at least 3.0 (which it should be for the 10x0 series), and that you have the latest CUDA software version (10.1).
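As a quick sanity check, that compute-capability requirement can be expressed in a few lines of Python. This is just an illustrative sketch: the model-to-capability table is a small subset I’ve filled in from NVIDIA’s published specs, not anything from Blender itself.

```python
# Minimal sketch: does a card meet Cycles' minimum CUDA compute
# capability (3.0 for Blender 2.8x)? The table is an illustrative
# subset -- see NVIDIA's CUDA GPUs page for the full list.
COMPUTE_CAPABILITY = {
    "GTX 1050": 6.1,
    "GTX 1050 Ti": 6.1,
    "GTX 1060": 6.1,
    "GTX 1080 Ti": 6.1,
    "RTX 3070": 8.6,
}

CYCLES_MINIMUM = 3.0

def supported_by_cycles(model: str) -> bool:
    """True if the card is in the table and meets the minimum."""
    cap = COMPUTE_CAPABILITY.get(model)
    return cap is not None and cap >= CYCLES_MINIMUM

print(supported_by_cycles("GTX 1050 Ti"))  # True: 6.1 is well above 3.0
```

As you can see, every 10x0-series card clears the 3.0 bar comfortably, so if the GPU still isn’t used, the problem is drivers or settings rather than hardware support.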
Looking at this page, however, the desktop 1050 is listed, but oddly the 1050 Ti is not. I’m assuming it’s implied that they are the same, though:
More importantly, on the notebook side, neither the 1050 nor the 1050 Ti is listed… Seems like an omission due to human error to me, but who knows. I haven’t owned an NVIDIA GPU since the 680, so I could be wrong (but I doubt it).
Looking at this thread:
Someone else is having trouble with the same GPU too.
Hi!
I’m having the same issue as well. In my case, I’m using a GTX 1080 Ti with 11 GB.
I have the option to render only with the GPU, but this is what Task Manager shows while rendering:
So basically it’s using only 10–11% of its capacity?
Any suggestion?
Just ran into a similar problem, but realized that in Task Manager you need to switch one of the GPU graphs to CUDA. For example, change Copy or Video Encode to CUDA and it will then show 99% usage. The speed seems problematic, though: it looks like the 1060 is only as fast as one Ryzen 3700X core, and that should not be the case.
Just ran into this as well, didn’t realize I needed to set the renderer to use the GPU in the Render Properties.
Sometimes another tool will show the GPU usage properly. It seems the Windows GPU graph doesn’t report what other programs are doing, only the OS, maybe? My card came with a bundled GPU-Z plus a monitoring app; in my case it’s called GPUTweak or some such. That monitor shows 100% while the Windows panel shows, as was said, nothing.
I have the same issue too. All my renders are going straight to the CPU. Task Manager shows the GPU at 0%, as does GPU-Z. The only things that change in GPU-Z when the render is done are that the CPU temperature drops and the system memory used drops by about 500 MB. I have CUDA selected for my GTX 1060 6GB and even tried unchecking the CPU, but renders never use the GPU. Obviously a software compatibility issue somewhere, if other people can verify this function does actually work. I just upgraded to the latest version, 3.1.2, also.
If you changed the preferences and the GPU is still not being used as it should, I found a potential solution: give this video a watch: https://www.youtube.com/watch?v=eb4EwMc9ceY. It’s only 3 minutes long and has a good chance of fixing the issue.
That video is about enabling GPU denoising; the render itself was already using the GPU, as you would have seen had you not waited for the renderer to finish before checking Task Manager the first time.
Hi, I have the same issue with my RTX 3070 that has 16 GB VRAM, an i7-10700K, and 16 GB of DDR4, but Task Manager shows Cycles using 100% CPU and 0% GPU even though it’s set to GPU Compute. Any solutions?
Do you have preferences set correctly…
Only the GPU should be checked, not the CPU.
Do you have render properties set correctly…
Of course, fairly recent GPU drivers, a recent Blender install, etc. should be in place.
PS: the RTX 3070 only has 8 GB of VRAM; it was never released with 16.
This is the way to go. If you have a CUDA-enabled driver, then go to
Edit > Preferences > System
then select the CUDA tab at the top and check the dedicated GPU.
Now if you go to Task Manager, it will show 100% usage on the dedicated GPU.
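The same preference changes can also be scripted from Blender’s Python console, which is a handy way to verify that Cycles actually sees the card. A minimal sketch for Blender 2.8x — note this must run inside Blender itself, since it uses the bundled `bpy` module:

```python
import bpy

# Point Cycles at CUDA and refresh its device list.
prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "CUDA"
prefs.get_devices()

# Enable every CUDA device (the dedicated GPU); leave CPU devices off.
for dev in prefs.devices:
    dev.use = (dev.type == "CUDA")
    print(dev.name, dev.type, "enabled" if dev.use else "disabled")

# Tell the current scene's Cycles settings to render on the GPU.
bpy.context.scene.cycles.device = "GPU"
```

If the loop prints no device of type CUDA at all, Blender isn’t seeing the card, which points to a driver problem rather than a settings one.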
Guys, I found the solution! I don’t know why it worked, but it worked for me, so you should try it. In Preferences, under CUDA, just select both devices, as in my image. Previously, with only the GTX activated, the render was done entirely by the CPU and 0% by the GPU. I don’t know why, but after also selecting my Intel Core, the render now runs on CPU + GPU and it’s faster.
Update: After doing that and making one render, you can deselect the CPU and the render then runs 100% on the GPU; it’s even faster than with CPU + GPU selected. Apparently these steps are needed for Cycles to render correctly with the GPU.