NVIDIA Gelato

Anyone tried using this yet?

I was thinking about playing around with it later today but I don’t think my GPU is supported =(

http://www.nvidia.com/page/gz_home.html

Added:
lol, they have a huge blender thread on their forums - going to check that out
http://forums.nvidia.com/index.php?showtopic=14227

Also check out:
http://blenderartists.org/forum/showthread.php?t=66308&highlight=gelato

Screw Gelato, I wish there was a plugin for CUDA, so you could use the internal renderer and have your GPU do all the work.
http://www.nvidia.com/object/IO_37226.html

Gelato IS GPU rendering :wink:

CUDA isn’t actually a program; it’s a way to let programmers write programs that use the GPU for processing, using C.
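To give an idea, here’s a rough sketch of what CUDA code looks like: plain C plus a `__global__` kernel function that the GPU runs across many threads in parallel. (Just an illustration I typed up, not from any renderer — you’d need nvcc and an 8-series card to actually build and run it.)

```cuda
#include <stdio.h>

// Kernel: each GPU thread scales one element of the array.
__global__ void scale(float *data, float factor, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        data[i] *= factor;
}

int main(void)
{
    const int n = 1024;
    float host[1024], *dev;
    for (int i = 0; i < n; ++i)
        host[i] = (float)i;

    // Copy input to the GPU, launch the kernel with 256 threads
    // per block, then copy the result back to the CPU.
    cudaMalloc((void **)&dev, n * sizeof(float));
    cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);
    scale<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);
    cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(dev);

    printf("%f\n", host[10]);
    return 0;
}
```

So it’s not a renderer by itself — it’s a toolkit. Someone would still have to rewrite Blender’s render code as kernels like this.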

Yes, Gelato is GPU rendering, but it is still an external renderer. With CUDA (and someone programming it to work with Blender) you could use the native Blender internal renderer, just at much faster speeds.

I’m cool with that. Anyone up for a little bit of C hacking?

I could be wrong, but isn’t CUDA just for their Quadro line, not GeForce GPUs?

The new Quadros and the GeForce 8 series GPUs would be the only cards seeing the full benefit from it.

Getting the internal renderer, or any renderer written for the CPU, to run on CUDA would take a huge amount of work. Most of the code would have to be changed and the algorithms would have to be rethought. You couldn’t do this even in a few months.

If I could build Blender, heck, I’d give it a go.

That’s true; however, not all the code would have to be modified. Ideally you would profile the code and target just the bottlenecks.

Instead of CUDA, it might be worth looking into a more general (and open-source) approach such as BrookGPU.

OK, I can’t get Gelato to work with Blender, either with the direct Blender plugin or with the Pixie plugin and RenderMan plugin for Gelato.
Both plugins fail on the Blender side, not in Gelato.

Can anyone help please???

Perhaps if you gave some detail on the actual failure, you’d get more assistance. Do you have the full Python 2.5 installed?

IMO Gelato is not worth it at the moment. I’ve tried a couple of the samples included in the SDK. Gelato is not a pure GPU renderer; instead, it is some sort of hybrid renderer that offloads some of the functions onto the GPU. The speed improvement isn’t that drastic — on some of the simple scenes (teapot + shadow, dinosaur SSS) the Blender internal renderer seems to be faster.

Plus, to get a real performance boost out of Gelato, you would need a really expensive board like the GeForce 8800 series. But then, with that sort of money, I would just get a quad-core Core 2.

Try this:

http://www.kino3d.com/forum/download.php?id=4155

Is Blender even optimised for quad cores, not only dual?

That would be great, because at the end of July there should be a big price reduction from Intel.

@JiriH The render engine can handle as many threads as you want, but “optimized for quad cores” is not really correct. Optimized builds are available, but at the moment only SSE2 builds have been spotted (hmm, it seems an SSE3 build was posted for testing yesterday). For an optimal build for a quad you would need an SSE4 build.

I tried that plug-in and it had lots of errors.