Does blender need a GPU to render?


(soriyath) #1

Hello!

I know some people might already have asked this but I’m being too lazy to search for posts and maybe it hasn’t been asked this way. So here we go…

Does Blender need a graphics card/GPU for renders? I am talking about renders only here.

I bought a new workstation, basically a 2 GHz AMD64 with an NVIDIA GeForce 6600 graphics card. Whenever I render, the Blender process uses 99% of my CPU. So I figure Blender mainly relies on the CPU for renders. Is that the case?

I’m thinking that if that is the case, I might buy some cheap micro-ATX towers and build very basic systems, but with a powerful CPU and lots of RAM, to do the renders.

Most motherboards today have an integrated GPU and sound card, so we can build systems with only a CPU, mainboard, RAM and hard disk. That way I could keep working on my current PC, and when I need to do heavy renders, send them to the other machine, which would do the slave work and leave me the leisure of using my main PC the way I want.
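For what it’s worth, that kind of split can work with nothing fancier than SSH. A rough sketch, assuming a render box reachable as `renderbox` (the hostname, file names and paths here are all made up):

```shell
# Copy the scene to the render machine, then render it there in
# background mode (-b, no display needed) while the main PC stays free.
scp scene.blend renderbox:/tmp/
ssh renderbox 'blender -b /tmp/scene.blend -o /tmp/frames/ -F PNG -s 1 -e 100 -a'

# Fetch the finished frames back when it's done.
scp 'renderbox:/tmp/frames/*' ./frames/
```

The render machine never needs a real graphics card for this, since `-b` runs Blender without opening the interface at all.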


(nykysle) #2

No! RAM and CPU for rendering.


(phlip) #3

GPU = good for modelling. The whole modelling interface (including all the buttons and such) is drawn with OpenGL and can be hardware accelerated. That said, lower subsurf levels and splitting a scene into layers can go a long way towards speeding up a modelling scene.
CPU and RAM = good for rendering. RAM especially, because once you run out of it the system starts swapping to the hard disk, and that kills efficiency, especially with something as memory-bound as a 3D render. Once you’ve got plenty of fast RAM, get a mobo that supports a nice high FSB speed and a juicy CPU.

Pretty much the major thing in any CGI design is speed vs realism. Scanline is faster than raytracing. Sampled AA is faster than distributed rays. Envmaps are faster than ray reflections. Etc, etc, etc.

The GPU is specifically designed to be fast enough for real-time graphics, at the expense of the fancier features. You simply can’t do things like AO on a GPU; it just isn’t designed for that.


(yu_wang) #4

CPU and RAM: for rendering.
GPU: for Blender’s interface, so that you can create something to render in the first place.

So you’ll need both anyway. :stuck_out_tongue:


(MassTA) #5

I’ve seen this question for about the tenth time since I’ve been on elysiun… and I haven’t even been here half a year.


(Duoas) #6

http://download.blender.org/documentation/htmlII/a8314.html

Hope this helps.


(youngbatcat) #7

Well, simply because when users see what games can do in real time, they assume Blender or any other software can do it too.

CPU and RAM for rendering the normal way.
GPU for the interface. A GPU can do basically everything with a few hacks thrown in, but nobody has programmed Blender to do those things yet.

Now, rant mode :smiley: A GPU could, if programmed, render one pass of shadows for AO or radiosity and then use the resulting map as a shadow map; the same goes for highlights and other passes… So all of these hacks could be sped up via the GPU, and then the rest of the frame rendered either with the normal CPU renderer or with OpenGL-style lighting…

Ah, where art thou, coders of GPU cards :expressionless:


(Duoas) #8

What? You mean the game engine doesn’t allow you to use the stencil buffer for shadows? That’s annoying… (Of course, I’ve never messed with the game engine myself…)

where are ye coders


(soriyath) #9

Thanks for all the replies, very useful. It’s good to know Blender doesn’t need a GPU to render. I think next time I invest money in computers, I’ll build a basic but powerful machine and render from the command line.
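For anyone landing here later, a command-line render looks roughly like this; a minimal sketch (the file name and output path are just placeholders):

```shell
# Render frame 1 of scene.blend without opening the UI:
#   -b = background mode (no GPU/display needed)
#   -o = output path (// means relative to the .blend file's directory)
#   -F = image format, -f = frame number
blender -b scene.blend -o //render_ -F PNG -f 1

# Render a whole animation, frames 1 through 250:
blender -b scene.blend -o //render_ -F PNG -s 1 -e 250 -a
```

Note that the frame range (`-s`/`-e`) has to come before `-a` on the command line for it to take effect.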


(sundialsvc4) #10

I often wonder why Blender can’t use the GPUs that are found in so many graphics cards. I mean, wouldn’t it be nice if the cards could do the graphics-crunching, and Blender could retrieve the output and put it somewhere . . .


(z3r0 d) #11

perhaps because each individual card would come up with a slightly different result

because there is no standard cross platform way to do this

because the coders would have to write their own fallback renderer anyway [for those people with anything worse than a geforce fx or radeon 9500, or for people without the latest drivers]

because blender has a renderer already…


(indigomonkey) #12

And because the processor is so much better at the type of stuff that needs doing to render something NOT in realtime.


(youngbatcat) #13

Then how does Blender even exist? If Blender has all of these problems with different graphics cards, how, or rather why, does anyone code for it? Why not make the interface and everything else CPU-only as well?

Merging the line between what we have now in OpenGL and what we could have, if someone knew how to add it, is perplexing…


(indigomonkey) #14

Because graphics cards are much better at the calculations involved in outputting information to a screen in real time - they’re quicker at that kind of thing. The CPU wouldn’t be able to do it; it has a different architecture that focuses on other things.


(youngbatcat) #15

Yes, true, so why not things like shaders? If it’s the same tech behind rendering to the screen, how much effort would it take to draw a shader preview on the mesh? We already have somewhat good lighting to see what the lamps are casting light on; why can’t this be upgraded to basic shaders? Just basic ones to start with.


(tedi) #16

http://graphics.stanford.edu/projects/brookgpu/

OK, nothing much, but at some point in time somebody with the knowledge could look into it…

Well, I’m not advocating making an open-source Gelato for Blender, but…