Blender Isn't Using My NVIDIA GeForce MX350

So I bought a new computer in the past few months, and since then Blender has not really wanted to use my GPU! I’ve installed the GeForce drivers through the GeForce Experience app, tried standalone drivers, and even reinstalled Windows entirely. At this point I think it’s JUST Blender, because I haven’t had any problems while gaming or using apps like Resolve!

Below I have some screenshots showing what I mean. This screenshot shows my GPU utilization while a render is running: only 7% of my GPU is being used while my CPU is completely overloaded! None of my options are greyed out, and it shows that I’m using the GPU…

I’m using Blender 2.9 and am reluctant to update to 3.0 until this issue gets resolved… If anyone else has experienced this issue, can you give some pointers on how you were able to fix it? Any help would be appreciated!

Extra info about my PC

  • Windows 11
  • HP Pavilion All-in-One
  • 11th Gen Intel(R) Core™ i5-11500T @ 1.50 GHz, 1512 MHz, 6 cores, 12 logical processors
  • NVIDIA GeForce MX350

Is your monitor connected to the graphics card? Because your screenshot is showing the integrated graphics from the CPU.

In Windows, check whether the power profile is set to use the MX350.

In Blender, go to Edit > Preferences > System and check whether the MX350 is enabled in the CUDA and OptiX tabs. Note: I am not sure the MX350 can work with OptiX.

You should update to Blender 3.0.

Unfortunately my computer is an all-in-one, as it was all I could afford (though this one was still around $1k), so I don’t know what’s connected to what!

Rendering is a tad bit faster, but it’s still going through my CPU and is now showing even less GPU usage… even after updating to 3.0, so I don’t know whether it was changing the settings or updating Blender.

Which render setting do you have in Blender, CUDA or OptiX?

OptiX, as that’s what’s worked with previous graphics cards I’ve used. I’m a new user on Blender Artists, so it only allowed me to post one screenshot! Should I change over to CUDA?

If you are using OptiX, you are using the GPU.

If the MX350 appears under OptiX and you selected it, then it should be in use. Keep in mind that it is a weak card, but test it in CUDA as well; since OptiX is newer tech, maybe the MX350 can’t profit from it. Also test with CPU only.


OptiX requires graphics cards with compute capability 5.0 and higher and a driver version of at least 470. To make sure your GPU is supported, see the list of Nvidia graphics cards. OptiX works best on RTX graphics cards with hardware ray tracing support (e.g. Turing and above).

Since your card is based on the 1050, it should have Compute 5.0. You need to check what your GeForce driver version is, and whether it is at least driver 470.
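If you’re not sure how to compare the version numbers, here’s a tiny plain-Python sketch (the helper name is made up, just for illustration) that checks whether a driver version string, as reported by GeForce Experience or `nvidia-smi`, meets that minimum:

```python
# Hypothetical helper: check whether an NVIDIA driver version string
# (e.g. "472.12") meets OptiX's minimum major version of 470.
def meets_optix_minimum(driver_version: str, minimum: int = 470) -> bool:
    # Only the major version before the first dot matters here.
    major = int(driver_version.split(".")[0])
    return major >= minimum

print(meets_optix_minimum("472.12"))  # True
print(meets_optix_minimum("456.71"))  # False
```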

For a test, download the Car Demo with the BMW model from here.

Open it and render in CPU, CUDA, and OptiX, then post your times.

I also did some more digging: apparently there are ONLY game drivers for my graphics card, so I don’t know if that affects anything or not… But I did find out I could add programs to my NVIDIA dashboard, so I pointed it to the Blender 3.0 folder, whilst also doing a full clean install, and now it maxes out BOTH the GPU and the CPU! Not sure if that’s normal, but things seem to be rendering better. I also grabbed screenshots of the CUDA and OptiX rendering stats!

Now I’m thinking I’m just bad at optimizing my scenes…

CPU — 07:28.95 | CUDA — 02:28.67 | Optix — 02:41.37
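For context, converting those times to seconds shows roughly how much faster the GPU is; a quick plain-Python check of the numbers above:

```python
# Convert the posted "MM:SS.ss" render times to seconds and compare.
def to_seconds(t: str) -> float:
    minutes, seconds = t.split(":")
    return int(minutes) * 60 + float(seconds)

cpu, cuda, optix = map(to_seconds, ("07:28.95", "02:28.67", "02:41.37"))
print(f"CUDA is {cpu / cuda:.1f}x faster than CPU")    # ~3.0x
print(f"OptiX is {cpu / optix:.1f}x faster than CPU")  # ~2.8x
```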

Great, you fixed that issue.
But do not select the CPU when you select the MX350; it might be faster that way.
When I select my CPU and the RTX 3060, it is 1 sec slower than just using the RTX 3060. It was explained to me that Cycles X is not yet optimised for mixing different compute devices: 2 or 3 similar GPUs are okay, but CPU+GPU is no good.

Does your system allow an upgrade? With a laptop RTX 3060 I can do the BMW scene in 12 sec in OptiX. These RTX cards and Blender’s OptiX implementation are an amazing improvement.

I’m not all too sure how to go about upgrading my computer; it’s an all-in-one and the graphics card is built into it! I’ll try to look and see if I can, but that’s kind of the gamble I took when I bought this computer. It was better than some of the towers that I saw for $1,000!
As for turning off the CPU, how do I do that? Is that where I choose between CUDA/OptiX? Because I never realized that was an option. I’ve always had everything selected by default, since that’s what used to work.
Glad to hear I fixed it for the most part. I was just concerned that I was using up all of my CPU to render certain scenes when I should have seen more load on my GPU.

That does not affect anything in Cycles. The Nvidia control panel has no control over what Blender does with the GPU.
Just install the newest drivers for your GPU, and that’s it from the Nvidia side.

If you want to render with the GPU, you have to:

  1. Set the render device to GPU Compute.

  2. In Blender preferences, set CUDA or OptiX.

  3. Check the checkbox for each device you want to render with.

  • If you want to render simultaneously with CPU and GPU, check both,
    but in your case the GPU and CPU are probably sharing the same cooling system, so better leave only the GPU.

  4. Start the render.
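As a side note, those same preferences can be set from Blender’s built-in Python console. This is only a rough configuration sketch for Blender 3.0’s Cycles, assuming the Cycles add-on is enabled; it runs only inside Blender, where the `bpy` module exists:

```python
# Runs inside Blender only (e.g. the built-in Python console).
import bpy

# Pick the backend: "OPTIX" or "CUDA" (matches step 2 above).
prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "OPTIX"
prefs.get_devices()  # refresh the device list

# Step 3: enable only the GPU devices, leaving the CPU unchecked.
for device in prefs.devices:
    device.use = (device.type != "CPU")

# Step 1: set the scene's render device to GPU Compute.
bpy.context.scene.cycles.device = "GPU"
```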

Thank you! Everything is working much better now!