Quick question concerning discrete graphics card

I am considering buying a Ryzen 3 3200G machine.
This comes with Vega 8 integrated graphics (not actually a discrete card). Now I have a simple question that could be answered with yes or no!

I am planning to buy (later) an Nvidia GTX 1650 card. Will Blender be able to render using both GPUs at the same time? Thanks!

Hi community! I still need an answer. This is what I got from my research:

Quora: Can we combine the integrated graphics and the dedicated graphics card in our laptop to do faster rendering work in Blender?

A: “That’s unlikely, unless both cards can be used through CUDA or OpenCL. (…) If both integrated and dedicated cards are “visible” through either CUDA or OpenCL APIs as “compute devices”, then it should be able to utilize them both. Otherwise, you’re out of luck.”
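The point about cards being "visible" as compute devices can be checked directly. This is a minimal sketch that lists what Cycles can see per backend; it has to run inside Blender's own Python console (it needs the `bpy` module), and the property names are from the Blender 2.8x API, so they may differ in other versions:

```python
# Sketch: list the compute devices Cycles can see for each backend.
# Must be run inside Blender (Python console or a script); the
# property names below are from the 2.8x API.
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
for backend in ("CUDA", "OPENCL"):
    prefs.compute_device_type = backend
    prefs.get_devices()
    names = [d.name for d in prefs.devices if d.type == backend]
    print(backend, "devices:", names)
```

If a card only ever shows up under one backend, that backend is the only way Cycles can use it in a single render.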

Quora: Can I in some way use both dedicated graphics and integrated graphics at the same time in a game to boost FPS? If so, how?

A: “(…) You cannot combine the integrated graphics and dedicated graphics processing power for most applications. You can utilize integrated graphics to accelerate some applications (…) but for something like gaming, viewport rendering in an application etc, you cannot. (…) AMD APU integrated graphics and an entry level AMD dedicated GPU however can run in a hybrid crossfire mode for added performance.”

Blender Artists: Blender performance with APU? (Ryzen 5 2400G)

@TiborNyers_BC answered: "I have recently benchmarked the Ryzen 3 2200G Vega 8 and its OpenCL compute performance is on par with the RX 550. Additionally, you can use both the IGP and the dGPU in a multi-GPU setup with excellent scaling: Blender & Indigo Renderer OpenCL benchmarks with Adrenalin 18.5.1 unified driver - RX 550 + Vega 8 mini workstation"

So I need to know: can I combine the Vega 8 + Nvidia GTX 1650? I want this setup because I want a cheaper machine for further upgrades, and later I will need CUDA for some applications. But if I can’t add up the power of both graphics cards, then I should stick with a Ryzen processor that doesn’t have integrated graphics. Help me solve this question before Black Friday…

You can’t currently, because Nvidia cards render via CUDA and AMD cards via OpenCL, and Cycles cannot mix the two backends in a single render. So you could only render animations in parallel (one instance of Blender per card). But due to possible differences in noise/render results between the two, mixing them can be tricky.
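The "one instance of Blender per card" workaround can be scripted: split the animation's frame range into one chunk per GPU, then launch each chunk with Blender's command line (`blender -b file.blend -s START -e END -a`). A minimal sketch of the split, assuming a hypothetical helper name:

```python
# Sketch: split a frame range into contiguous chunks, one per GPU,
# so each chunk can be rendered by a separate
#   blender -b scene.blend -s START -e END -a
# instance (each instance configured to use a different card).
def split_frames(start, end, n_gpus):
    total = end - start + 1
    base, extra = divmod(total, n_gpus)
    chunks, cur = [], start
    for i in range(n_gpus):
        size = base + (1 if i < extra else 0)  # spread the remainder
        chunks.append((cur, cur + size - 1))
        cur += size
    return chunks

print(split_frames(1, 100, 2))  # [(1, 50), (51, 100)]
```

Note that per bliblubli's caveat above, the two cards may produce slightly different noise, so alternating chunks between backends can be visible in the final animation.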


Stefano, you could get a used 980 Ti for around the same price as that underpowered GPU, or a used 1080 Ti for just a wee bit more. Either one of those cards would completely obliterate the GTX 1650 in a render race…

Hey, I get it, GPUs are expensive, I’m in the process of replacing my three 980Ti Hybrids with three 2070 Super Hybrids… Trust me, my advice is (always) to get high-end used GPUs (1 or even 2 gens older if needed) over low-end new GPUs…

A 980 Ti, for instance, has 2GB more VRAM and roughly three times the memory bandwidth, just for starters; then look at how many more CUDA cores it has…

Edit: Okay, I just checked ebay and 1080Ti is prolly out of your price range at $400+, but a 980Ti is not. I’d get a 980Ti if I were you, if one will indeed fit on your mobo and your case is big enough…

Edit part deux: the GTX 1650 gets a lowly OctaneBench score of 78… A 980 Ti gets a very respectable OB of 156. Exactly double. Bottom line: twice the rendering power and 2GB more (and ~3x faster) memory… Now I have to get back to work. :stuck_out_tongue:


thanks @bliblubli for the precise answer.

thanks @norka for the kind recommendations… FOR NOW, I need to get some air and rise above the “crappy notebook” level. Hey, I’m South American! The 1650 is by far the most cost-effective GPU of the moment here. In addition it runs cooler, is more modern, uses newer technology, consumes less power, and has a longer warranty… When I make my next equipment change in 2021/22 I will take your kind advice into consideration :slight_smile:

You are going to regret that decision, bro, if rendering final art and getting quick feedback while you work is important to you; that is, basically all rendering. The rest of you, who may find yourselves in similar circumstances to Stefano’s at some point, take my advice and disregard his decision and rationale, which is woefully ill-informed (imho) and is going to have him ruing the day he picked that toy GPU. A used 980 Ti is twice the GPU a GTX 1650 is, for around the same price, any way you cut it. Period.

I said it in a very similar thread and I’ll say it again here: for as long as that feeble GPU is going to take to render, you might as well just draw your artwork by hand.


Well, you can in fact render with OpenCL on Nvidia cards. It works in Blender for stills in one instance.
That said, it’s not worth it. In my experiments with such a setup I ended up with differently colored tiles. OpenCL is deliberately crippled on Nvidia, and the setup intended in this thread isn’t worth the hassle anyway.
Just adding this for the sake of completeness.


And the notion of high-end used gear over low-end new gear does not stop at GPUs… I would take a used three- or four-year-old (maybe even older) high-end workstation mobo with a kickin’ CPU (or better yet, dual CPUs) and 32GB+ of quad-channel ECC memory, any day of the week, over some new low-end-to-mid-range PC, if said high-end gear could be had for around the same amount. Yes, there are gray areas, and sometimes new generations bring radical changes that make this not always workable, but a really kickass X99 board with four PCIe slots, quad-channel memory support, USB 3, and hopefully at least one onboard NVMe M.2 will rock the house like a mofo. If you can get these things in your price range, this is a solid foundation for anyone with limited funds (a student, someone down on their luck) to build a workstation powerful enough to tackle most anything, once you have at least one relatively powerful GPU in it. Then you could even edit video in DaVinci Resolve or do mograph in AE or Fusion, etc… Just tossin’ this out there yo!


I know, I just didn’t want to confuse things. For completeness: it’s even possible to render a single frame with OpenCL + CUDA + CPU together, with patches available on the tracker. But the differences in noise and other subtleties mean the final image can look weird.


I can understand the incentive.


Norka, I appreciate your intention to help, and I acknowledge that your advice is very well founded! But remember that this is an international community. Hardware prices vary widely between regions. Here in Brazil we do not have the fertile, competitive used market you would find in the USA. I looked on eBay and actually found some 980 Ti cards for $130–180 (and even cheaper). The price of a new GTX 1650 on Amazon is $150, confirming your information.

However, here in my regional pricing reality, the numbers are different.

  • New GTX 1650 costs R$750 (about $180).
  • New 980 Ti costs R$3300 (about $790).
  • New 1080 Ti costs R$4000 (about $960, just for comparison).
  • Used 980 Ti I can find for R$1700 (about $400).

OK, so simply put: a used 980 Ti costs double the price of a new 1650 here. With that investment, I could build a 2x GTX 1650 machine, which I believe would roughly match 1x 980 Ti performance. Used prices also depend on luck, and since availability is not very high, you will not see the abrupt price drops that happen in your region.
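That trade-off can be made concrete with a rough cost-effectiveness check, using the R$ prices listed above and the OctaneBench (OB) scores cited elsewhere in this thread (1650 = 78, 980 Ti = 156). Note this is just a sketch: the 2x 1650 entry assumes perfect scaling, which Cycles would not actually deliver for a single still.

```python
# Rough OB-points-per-real comparison; prices from the list above,
# OB scores from norka's posts in this thread. The 2x 1650 line
# assumes perfect multi-GPU scaling, which is optimistic.
options = {
    "new GTX 1650 (R$750)": (750, 78),
    "used 980 Ti (R$1700)": (1700, 156),
    "2x new GTX 1650 (R$1500)": (1500, 156),
}
for name, (price_brl, ob) in options.items():
    print(f"{name}: {ob / price_brl * 1000:.0f} OB points per R$1000")
```

At these local prices the single new 1650 comes out ahead of the used 980 Ti per real spent, which is the opposite of the conclusion at US eBay prices.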

Finally, I can’t fully explain why hardware prices are so high in my country. Import duties are high, but the answer is more complex than that; perhaps it is because we do not have a truly free market.


Oh, and not everyone uses Blender to make photorealistic renders… I use it to make cartoons. I wonder how you would handle the limitation of creating art on a simple notebook (an i3-7100U processor and the awesome Intel HD 620 graphics).

Okay, I’m truly sorry, man. I had no idea it was quite that bad. Maybe the Blender Foundation should take some of their newfound wealth and try to help Blender artists in developing countries get better access to affordable hardware, used and new… Edit: Or maybe the BF could set up a render farm for disadvantaged 3D artists (who must reside in developing countries) to render final art at no charge… I dunno…

I have two GTX 580 3GB cards sitting on a shelf that, combined, are about as powerful as a 980 Ti, and I’d be more than happy to give them to you for free if you paid shipping, but unfortunately I don’t think Cycles supports the GTX 500 series (Fermi) anymore… : - (

And for anyone else, I think Cycles and EEVEE still support Kepler (600 and 700 series), and the 780 Ti is definitely a very usable GPU in Blender/Cycles for those who can get one cheap. It scores 116 in OctaneBench. To recap some OctaneBench scores: RTX 2080 Ti = 302, RTX 2070 Super = 220, GTX 1080 Ti = 217, GTX 980 Ti = 156, GTX 780 Ti = 116, GTX 1650 = 78. I believe these numbers represent raw CUDA rendering in Octane, with no OptiX in the benching.


For $99 you can transform a single 1650 into two, by the way, with the right software :slight_smile: