Free version of Nvidia Gelato released!

I don’t know how new this is, but Nvidia has released a free version of its hardware-accelerated renderer.

They also released the API, so it’s easy (at least if you know how) to code a plugin for any 3D package.

I guess having an exporter for Blender would be great, but unfortunately I have no programming skills… is there anyone out there who could try to code a plugin? I could help with testing. I have a Quadro FX–equipped workstation (WinXP) that would be perfect for testing purposes…

Anyone? :slight_smile:



Let the naysayers commence!

I don’t think they would do it, because it’s not open source (AFAIK).

As far as I know, Gelato has a Python API, so Blender’s GPL license shouldn’t be a problem if Gelato gets used from Blender Python.
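To make the idea concrete, here’s a rough sketch of what the Blender-side exporter could look like: it just writes a Gelato-style scene file as text. Gelato’s .pyg scene files are Python scripts calling the renderer’s API; the call names written into the file below (Output, Attribute, Camera, World, Shader, Sphere, Render) are assumptions based on the docs, so check them against techref.pdf before relying on this.

```python
# Hypothetical sketch of a Blender->Gelato exporter: it emits a .pyg-style
# scene file as plain text. The Gelato call names written into the file
# (Output, Attribute, Camera, World, Shader, Sphere, Render) are
# assumptions taken from the docs, not a verified API reference.

def export_pyg(path, shader="plastic", resolution=(640, 480)):
    """Write a minimal test scene that renders a single shaded sphere."""
    lines = [
        '# generated by a hypothetical Blender exporter',
        'Output ("test.tif", "tiff", "rgba", "camera")',
        'Attribute ("int[2] resolution", (%d, %d))' % resolution,
        'Camera ("camera")',
        'World ()',
        'Shader ("surface", "%s")' % shader,
        'Sphere (1, -1, 1, 360)',
        'Render ("camera")',
    ]
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")
    return lines

scene = export_pyg("test.pyg")
```

A real exporter would of course walk Blender’s scene data and emit mesh geometry instead of the hard-coded sphere; this only shows the file-writing side.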

I downloaded it, and after a first glimpse it looks quite nice and fast.

This is potentially very interesting stuff. Any idea as to just how fast/slow it’s supposed to be?

After a first glimpse it feels like one of the faster renderers (I only have a 6600, though), but it isn’t so fast that I couldn’t imagine a pure CPU renderer doing the same. I guess a lot of people expect more speed because of all the “it uses the GPU” talk.

Well, a GPU isn’t necessarily quicker than a CPU, is it? It’s just a processor… it can help if it has specific rendering operations hard-wired into it, but only in those cases will it really be quicker.

Basically, a GPU is a quite fast vector processor. For some things (like a lot of the math that happens during rendering) it should be a lot faster than a general-purpose CPU. With the higher shader models there also isn’t that much hard-wiring going on anymore. I don’t know why Gelato isn’t faster. :slight_smile: Maybe it will take a few more GPU generations and better APIs (no strange OpenGL detours) for such tasks.
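To illustrate the vector-processor point: shading is data-parallel, because the same arithmetic runs independently on every sample, which is exactly what a GPU’s many lanes are built for. A toy pure-Python sketch of per-sample Lambert shading (an illustration of the data layout, not a performance demo):

```python
# Toy illustration of why shading maps well onto a vector processor:
# the same arithmetic (here, a clamped Lambert dot product) is applied
# independently to every sample, so it could run in lockstep on many
# GPU lanes at once. Pure-Python sketch, not a performance claim.

def lambert(normals, light):
    # conceptually one "vector instruction": dot(n, light) for all normals
    return [max(0.0, sum(n * l for n, l in zip(nrm, light)))
            for nrm in normals]

normals = [(0.0, 0.0, 1.0), (0.0, 1.0, 0.0), (0.6, 0.0, 0.8)]
light = (0.0, 0.0, 1.0)
print(lambert(normals, light))  # -> [1.0, 0.0, 0.8]
```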

Gelato is a hybrid GPU/CPU renderer, and thus should be faster than a pure CPU renderer…

It just depends on what elements your scene contains, as the GPU is not able to handle every calculation; but they won’t say exactly what the GPU does and what the CPU does…

A Blender plug-in shouldn’t be that much of a problem, as the Gelato API is open and documented…

They also say they want third-party plugins to be created, to attract more potential customers who might upgrade to Gelato Pro,

so they’re quite open in that respect.

Personally, I almost always go straight to the image gallery of any new software I find, and judging by that, this renderer is pretty powerful. SSS shaders, DOF, caustics, etc. are all nice. :slight_smile:

I wonder if or when someone will code an exporter.

I have tried version 2.0 for Linux. My video card is an NVIDIA 6800 GTX (not a Quadro FX).

I used the BlenderPixie Python plugin:

RenderMan plugin for Gelato:

A simple test image, ambient occlusion:

Render time: 21.21 s first pass, 8.61 s second pass.

Subsurface scattering:

Render time: 3.96 s first pass, 37.80 s second pass.

Technical details (sorry, Italian only):

Holy shit! (if that’s offensive I’ll asterisk it for you, but it needs to be said.)

Wow. Even with a consumer 6800, a baby 32-bit 2.2 GHz Athlon, and 1.5 GB of RAM… this is wicked fast, at huge resolutions. I bumped the caustics example to ray depth 12, all shadows, everything on, at 1600×1200: the photon pass took ~8 s, the image pass about a minute flat.

A working (fully supported, that is, textures and all) RIB exporter would really make for some fun comparisons. Oi!

Does Gelato work on Nvidia GeForce4 cards? When I run checkgelato.bat I get a segfault on Windows. :frowning:

Sweet test marioamb! I have to check this out.

now we need a tutorial!

jaycun: as you can see from , GeForce4 is not supported. But my 6600GT is! :stuck_out_tongue:

The lowest card supported is the FX 5200, no?

I highly suggest grabbing rsl2gsl (linked on the Gelato download page at nVidia) and starting to convert all the shaders you can get your grubby little mitts on. :smiley: Whoo, we’re havin’ fun now.
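For batch conversion, something like the following little script could drive rsl2gsl over a pile of shaders. The command-line form used here ("rsl2gsl -o out.gsl in.sl") is an assumption; check the tool’s actual usage first. By default the sketch only builds the commands without running anything.

```python
# Sketch of batch-converting RenderMan SL shaders to Gelato GSL with
# rsl2gsl. The command-line form used here ("rsl2gsl -o out.gsl in.sl")
# is an assumption; check the tool's real usage before running with
# run=True.
import glob
import subprocess

def convert_shaders(pattern="*.sl", run=False):
    """Build (and optionally run) one rsl2gsl command per matching shader."""
    cmds = []
    for sl in sorted(glob.glob(pattern)):
        gsl = sl[:-len(".sl")] + ".gsl"
        cmd = ["rsl2gsl", "-o", gsl, sl]  # hypothetical flags
        cmds.append(cmd)
        if run:
            subprocess.run(cmd, check=True)
    return cmds
```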

Software used: RibGelato, rsl2gsl, and Gelato:

Time: 0.42 s shadow map, 29.22 s final render.

From the book “Advanced RenderMan”, Figure 12.12, page 316.
Source rib file:

I got to admit, this is fun. :smiley:

I wasn’t sure it was really using my cheapo (well, by comparison) FX card, but I ran a temperature monitor… yep, this bad boy heats up as if you were gaming. That’s so cool. The techref.pdf is quite well documented too; it’s nice to get free access to such a commercially targeted application.

(float dispersive in the glass.gsl shader doing prismatic color separation, dependent on the shape of the refraction… b-e-a-utiful.)

The following is the two-pass subsurf example, with ray:maxdepth at 8 and spatialquality 16×16 subpixel antialiasing, rendered at 1920×1440. Shadow mapping (at shadowsamples 16) and diffuse baking: 9 m 21 s. Beauty render and subsurface scattering: 14 m 3 s. Yeah, I’d say it’s coprocessing. I don’t know how much, but… damn!
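For reference, settings like these would presumably show up in the .pyg scene as attribute calls. The attribute names (ray:maxdepth, spatialquality, shadowsamples) come straight from the description above; the Attribute(declaration, value) signature is my guess at Gelato’s API, so the sketch uses a stand-in function rather than claiming the real one:

```python
# The attribute names (ray:maxdepth, spatialquality, shadowsamples) are
# taken from the post above; the Attribute(declaration, value) signature
# is an assumed approximation of Gelato's API, recorded via a stand-in.
settings = {}

def Attribute(decl, value):
    """Stand-in for Gelato's Attribute call; just records the setting."""
    settings[decl] = value

Attribute("int ray:maxdepth", 8)
Attribute("float[2] spatialquality", (16, 16))
Attribute("int shadowsamples", 16)
```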

I’m thinking it’s time to convert The Cornell Box, if someone hasn’t already. (Someone must have.)

edit: screwed up link.

File cornell.rib from SIGGRAPH: