hardware rendering

Is there a good script/plugin/program/etc that can render blender scenes using hardware, or an export/converter to a format that a hardware renderer can use? Software rendering takes ages.

(I have an NVIDIA GeForce FX 5200 with OpenGL 2.0 support.)

You’re confusing Render with Render. :smiley:

It’s not as complicated as learning Japanese Sign Language, but the term “rendering” has (at least) two meanings in computer graphics.

What you’re referring to will take the same amount of time. It has nothing to do with your video setup; in fact it doesn’t use your video card at all. “Rendering” in this case is the generation of the image from a shitload of mathematical calculations simulating objects and lights, and the only thing you can do to speed it up is to throw more and/or faster CPUs at the task. Certain video cards and applications can have specifically tailored or accelerated rendering of a scene, but it’s very hardware/software specific (like nVidia’s Gelato or the older BMRT).
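To make the “pure CPU math” point concrete, here is a toy Python sketch (mine, not anything Blender actually uses) of the kind of per-pixel arithmetic an offline renderer grinds through: solving a quadratic to test whether a ray hits a sphere, something a real renderer repeats millions of times per frame, entirely on the CPU.

```python
import math

def ray_hits_sphere(origin, direction, center, radius):
    """Return the nearest hit distance along the ray, or None.

    Solves |origin + t*direction - center|^2 = radius^2 for t,
    a quadratic in t -- the kind of arithmetic a software renderer
    does over and over, with no help from the video card.
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None          # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t >= 0 else None

# A ray looking straight down -z hits a unit sphere 5 units away.
print(ray_hits_sphere((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # 4.0
```

Multiply that by every pixel, every light, and every bounce, and you can see why more or faster CPUs are the usual answer.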

That’s as opposed to the “rendering” you think you’re referring to: having the GPU generate the graphical output on screen. In that sense Blender is fully OpenGL, and as long as your setup is equipped with OGL hardware acceleration (like your GeForce card) and a properly configured driver, you are definitely seeing the effects of OGL in hardware; your screen is very responsive when you’re in the 3D modeller, isn’t it?

Thank you for your response. :smiley:

Correct me if I’m wrong, but Blender only uses hardware (through OpenGL) when rendering in real time, such as when editing. The final product is entirely rendered in software, using only the CPU. You seem to be confirming this (I think).

What I’m looking for is something to render the final product using (at least partially) the GPU. When not trying to render in real time, hardware rendering can be quite impressive. If I remember correctly, Maya has something like this; I think it can at least use hardware to render certain elements, such as particles. (Imagine the speed increase when rendering hair created in Blender.)

I know that rendering this way may be more restrictive and many things possible for software-only rendering may not be possible when using hardware, but the speed increase may be worth it in some cases.

Sorry if I’m not explaining myself well. :frowning:

correct

I’d argue the problem isn’t what you can do so much as the hardware you can do it on, and that it doesn’t make sense for a render farm (all of your machines would need the same graphics card, so you can’t take advantage of buying something more powerful for less, because the results wouldn’t be exactly the same).

I don’t know of any free or open source gpu-accelerated renderers. [but I also am not really motivated to look for one]

One of the problems is that when you render your final image
1- it needs to look a lot better than the real time preview
2- you want to store the result on your hard drive

Both of these require the GPU to send its data back to the main system, something GPUs are not really designed to do. There are probably some workarounds, but they would be hardware specific, not a universal solution that works on any graphics card.

Thank you for your responses. :smiley:

it doesn’t make sense for a render farm
But for people without a render farm it may be useful.

it needs to look a lot better than the real time preview
It can, when not attempting to render in real time.

you want to store the result on your hard drive … There are probably some work-arounds to this, but they would be hardware specific, not a universal solution that would work on any graphics card.
You mean like glReadPixels? I’m not an expert, but I believe you could simply create an OpenGL viewport of the desired size, render your scene to the back buffer, and copy the pixels from the back buffer. You could then save the pixel data however you liked. Although I wouldn’t be surprised if there are better ways of doing this, since it’s nothing new.
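For what it’s worth, the “save the pixel data however you liked” step really is just a few lines. Here’s a minimal Python sketch (the function name `save_ppm` is my own, and the buffer is fabricated); in a real program the bytes would come from `glReadPixels(0, 0, w, h, GL_RGB, GL_UNSIGNED_BYTE, buf)`, keeping in mind that OpenGL returns rows bottom-up, so they may need flipping first.

```python
def save_ppm(path, width, height, pixels):
    """Write raw RGB bytes to a binary PPM file.

    `pixels` is a flat sequence of width*height*3 byte values --
    the same layout glReadPixels fills in with GL_RGB /
    GL_UNSIGNED_BYTE (though GL's rows come bottom-up).
    """
    assert len(pixels) == width * height * 3
    with open(path, "wb") as f:
        f.write(b"P6\n%d %d\n255\n" % (width, height))
        f.write(bytes(pixels))

# Fabricated stand-in for a readback buffer: a 4x2 all-red image.
save_ppm("frame.ppm", 4, 2, [255, 0, 0] * (4 * 2))
```

The hardware-specific part isn’t the saving, it’s how fast (and how correctly) the card can do the readback in the first place.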

If you want an example of what hardware rendering can do, go to Dimajix and view the Mirroring Spheres example. Software rendering would take ages for such reflection, yet for me the demo runs fullscreen at 1600 x 1200 at several frames per second, even on my weak graphics card, while I have Blender, Firefox, Code::Blocks, and several other programs running.

http://dimajix.de/uploads/pics/spheres.big.jpg

Yeah, see, the problem here is a small one… no coder is willing to try it yet…

While it would be great… until someone steps up to the plate it won’t get done…
There are all sorts of ways to utilize the GPU to handle parts of the rendering, instead of just rendering to screen at high FPS rates… But that is a goal one must aim for…

The person that creates a special “rendering card” for apps like Lightwave or Blender would get my $500 in a heartbeat. Some of these GPUs can crank stuff out fast. How you’d go about that… I’m no electrical engineer…

Sadly, it’s not going to be $500… maybe $5000… definitely $50,000…

For just hardware-accelerated fades and assorted transitions on an Avid NLE setup, you could spring for a Mojo so your transitions don’t have to wait on your “slow” 4-proc Opteron PC… at around $1500… For nVidia’s Gelato, the software costs that, per node, plus the price of a Quadro card. Each. Per node. Wanna set up a RT rendering environment for film? Shouldn’t take more than 40 or 50 boxen, right? What’s $300K when you’re planning on making $24M net?

The point is, when does this stop being a hobby and start being capitalist production? All those toys are targeted (and priced accordingly) for people making a living doing this.

Yes, 3Dc was fun to play with. And we all got a kick out of rthdribl years back, trying to overheat our 6800GTs… but it’s just playing around, showing what you can do with real-time pixel shaders 2.0/3.0… Don’t you think they want paying jobs too? :smiley:

gelato is free now

Hardware rendering is fast, but highly chipset dependent. We want Blender to run everywhere. Even on lame-ass on-board graphics.

AFAIK, common cards all use 8-bit color channels.

Software rendering runs everywhere. Want it faster? Throw money at the problem for faster cpus, or a nice little thousand node renderfarm.

Software rendering can handle high-bit-depth formats like OpenEXR and Cineon that use more than 8 bits per color channel. This is becoming important in the film and video world.
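To see why more than 8 bits per channel matters, here’s a small Python sketch (`quantize` is a made-up helper, not part of any real format): two nearby intensities collapse to the same 8-bit code but stay distinct at 16 bits, which is exactly the banding problem high-bit-depth film formats avoid.

```python
def quantize(value, bits):
    """Round a linear intensity in [0, 1] to the nearest code of an
    integer channel with the given bit depth, then map it back."""
    levels = (1 << bits) - 1
    return round(value * levels) / levels

# Two subtly different intensities: 8-bit storage collapses them into
# the same code (visible banding); 16 bits still tells them apart.
a, b = 0.500, 0.501
print(quantize(a, 8) == quantize(b, 8))    # True
print(quantize(a, 16) == quantize(b, 16))  # False
```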

Somebody hook Gelato up to Blender. (Wish there was Gelato for Debian, though.)

Yeah, I have an ATI 9600 Pro that has OpenGL 2.0 support and have been waiting for an app to support hardware rendering for some time now. Don’t know why no one has done it yet, as I’m sure it would be widely used. As for Gelato, I prefer ATI and have no plans of switching. Anyway, I know this is an old topic, so if anyone has any news on hardware rendering, please tell. :wink:

Hey, doesn’t Blender already support hardware rendering through the game engine (GE)?

I had a demo rendering a movie animation.