G Buffer Extractor

For those who don't know what G(eometry) Buffers are, they're 2D images that encode information about a 3D scene and are useful in compositing and image manipulation. 'G Buffer Extractor' is a Python script that uses mesh vertex colours to store scene information and render it out. So far my script will render a depth buffer, an object ID buffer or a surface normal buffer, as shown below using Endi's Rhino…

Render:
http://www.flippyneck.com/wip/rhinotest_col.jpg

Depth image:
http://www.flippyneck.com/wip/rhinotest_depth.jpg

Surface normal image:
http://www.flippyneck.com/wip/rhinotest_norm.jpg

Object ID image:
http://www.flippyneck.com/wip/rhinotest_id.jpg
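In rough terms, the colour encodings boil down to something like this (a simplified sketch rather than the script's actual code; the near/far distances and the ID-to-colour scheme are just placeholders):

[code]
# Simplified sketch of the kind of per-vertex colour encodings involved.
# The near/far distances and the ID->colour scheme are placeholders.

def depth_to_grey(z, near, far):
    # Map a camera-space distance to a 0-255 grey value (near = white, far = black).
    t = (z - near) / float(far - near)
    t = min(max(t, 0.0), 1.0)
    return int(round(255 * (1.0 - t)))

def normal_to_rgb(nx, ny, nz):
    # Map a unit surface normal (components in -1..1) into the 0-255 RGB range.
    r = int(round(255 * (nx * 0.5 + 0.5)))
    g = int(round(255 * (ny * 0.5 + 0.5)))
    b = int(round(255 * (nz * 0.5 + 0.5)))
    return r, g, b

def id_to_rgb(index):
    # Give each object a flat colour derived from its index (unique for up to 256 objects).
    return (index * 67) % 256, (index * 129) % 256, (index * 211) % 256

print depth_to_grey(5.0, 1.0, 20.0)   # a vertex 5 units from the camera
print normal_to_rgb(0.0, 0.0, 1.0)    # a face pointing straight along +Z
print id_to_rgb(3)                    # flat colour for the fourth object
[/code]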

Hopefully I can add illumination and a crude edge buffer as well.

The question is, would anyone else find this useful? At the moment the script is pretty rough and ready, but if there is enough demand I will develop it as a proper tool. I don’t know if it can be done, but in an ideal world it would be great to automatically combine the different buffer types and load them as layers into Gimp/Photoshop. Does anyone know if this is possible?
http://www.flippyneck.com/wip/g_buffer_extractor_a05.py

I don't think it is possible to render to a file from the Python API

I really wish it was; from that point, loading it into another app (or back into Python) should be reasonably trivial.

Well anyway, with the information you're able to extract this way you could create some nice toon edges. If you can, play with the ATI toon shading demo to see what I mean:
http://www.ati.com/developer/demos/r9700.html
[non-photorealistic rendering]


[quote]I don't think it is possible to render to a file from the Python API[/quote]

It isn't, but I am going to enquire at .org to see if this is going to be added at any point. It is possible to set the render directory for output, though, so I think I can hack around the current limitation.

[quote]Well anyway, with the information you're able to extract this way you could create some nice toon edges. If you can, play with the ATI toon shading demo to see what I mean:
http://www.ati.com/developer/demos/r9700.html
[non-photorealistic rendering][/quote]

That’s exactly what I’m hoping to be able to do: write some scripts that will work on the g buffers to generate NPR images. Thanks for the link.
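As a quick experiment in that direction, a crude ink/edge pass can be pulled out of the depth buffer with PIL; a rough sketch, assuming the Python Imaging Library is installed and the depth image above has been saved locally:

[code]
# Rough sketch: extract toon-style edges from a rendered depth buffer with PIL.
# Assumes the Python Imaging Library is installed and the depth image exists
# on disk as 'rhinotest_depth.jpg'.
import Image, ImageFilter

depth = Image.open("rhinotest_depth.jpg").convert("L")

# Depth discontinuities show up as bright pixels after an edge filter.
edges = depth.filter(ImageFilter.FIND_EDGES)

# Threshold and invert so we end up with black ink lines on white.
ink = edges.point(lambda p: 255 * (p <= 16))

ink.save("rhinotest_edges.png")
[/code]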

[quote]The question is, would anyone else find this useful?[/quote]

For my part, extremely! A must for compositing and post-effects. Development would be very much appreciated.

good luck

marin

This is VERY useful!

[quote]I don't think it is possible to render to a file from the Python API

It isn't, but I am going to enquire at .org to see if this is going to be added at any point.[/quote]

Better ask quick since the feature is already in CVS.

lib-dip… lib-dip…KADOOF.KADOOF.KADOOF… :o YES!


This looks excellent - very promising. Well done.

I'd love to try it, but the link seems to be broken?

nik

There's a letter missing in the link to the Python script:
http://www.flippyneck.com/wip/g_buffer_extractor_a05.py

So, is it ready yet? :wink: I need this now. :smiley:

It's a very useful tool, I've waited a long time for this!

You’ve got my vote :smiley: Great script and very useful, keep up the good work.

/Nozzy

This is much needed. That said, though, I’ve always wanted to see this integrated in the main code. If you’re really vertex painting things, then it kind of destroys the usefulness of this for any project that uses vert painting. A direct implementation would not mess with your vert paint values, but just render in the data space you requested.

harkyman, I agree. I see no reason why this couldn’t be implemented as a special renderer type in Blender itself. I wish I had one iota of an understanding of C so that I could give it a go. My hope is that if I knock together the best script that I can, someone else with the skills will recognise the interest, pick up the baton and code something ‘real’.

It would be possible to read existing vertex colours from a model and restore them after rendering (my script already strips and then restores material settings). This would lengthen the processing time, but I'll try to get it working for those who might need it. A far more fundamental limitation of vertex colours is of course that this will only work with meshes…
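For anyone interested, the save/restore part is straightforward; something along these lines should do it (an untested sketch against the NMesh API, so treat the details as assumptions):

[code]
# Untested sketch: back up and restore a mesh's vertex colours using the
# NMesh API. Colours are stored per face corner as 0-255 r, g, b, a values.
import Blender

def backup_vertex_colours(mesh):
    saved = []
    for face in mesh.faces:
        saved.append([(c.r, c.g, c.b, c.a) for c in face.col])
    return saved

def restore_vertex_colours(mesh, saved):
    for face, cols in zip(mesh.faces, saved):
        for c, (r, g, b, a) in zip(face.col, cols):
            c.r, c.g, c.b, c.a = r, g, b, a
    mesh.update()

me = Blender.NMesh.GetRaw("Rhino")      # mesh name is just an example
original = backup_vertex_colours(me)
# ... bake buffer data into the vertex colours and render here ...
restore_vertex_colours(me, original)
[/code]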

Seeing the considerable interest I shall continue with this. Thanks guys. As they say, ‘watch this space’…

[quote]lib-dip… lib-dip…KADOOF.KADOOF.KADOOF… YES![/quote]

Praise indeed. Thank you :smiley:

:o Wow!! Man, for a while I was looking for a way to get the depth info, and your script is going to do that and more… that's a very good job.

And what kind of file is it going to export? RLA, RPF?

Do those normal maps relate in some manner to youngbatcat's normal mapping?

Thank you for your effort.

Aw! Does it get the entire mesh? If so, you have normal map extraction! Sweet!

Just to update you, folks. I’ve done a bit more work on this based on a few of your comments. Principally the script now gives the option to either render to an image, or to ‘bake’ the scene information into the vertex colours. The latter can hopefully be used in conjunction with jms’ vertex paint script to export normal maps.
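To give an idea of what the 'bake' option does for the normal buffer, it amounts to something like this (a simplified, untested sketch, not the script's actual code):

[code]
# Simplified, untested sketch: bake vertex normals into the vertex colours.
# Each face corner gets the colour-encoded normal of its vertex.
import Blender

me = Blender.NMesh.GetRaw("Rhino")   # example mesh name
me.hasVertexColours(1)               # make sure a vertex colour layer exists

for face in me.faces:
    # make sure there is one colour entry per face corner
    while len(face.col) < len(face.v):
        face.col.append(Blender.NMesh.Col())
    for i in range(len(face.v)):
        n = face.v[i].no             # vertex normal, components in -1..1
        face.col[i].r = int(255 * (n[0] * 0.5 + 0.5))
        face.col[i].g = int(255 * (n[1] * 0.5 + 0.5))
        face.col[i].b = int(255 * (n[2] * 0.5 + 0.5))

me.update()
[/code]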

youngbatcat, I haven’t followed all the work that you’ve done on normal mapping, but if anything I have done is useful to you then feel free to use it.

hagen, I looked at the RLA/RPF file information, but I can't get access to enough of the data types encoded in those files to export to them. Also I don't have access to Combustion etc., unless anyone knows of other (free or open-source) software that can view them. The plan is to export to image files that can then be manually or automatically loaded into Gimp/Photoshop layers for making composite masks. By the way, there is a z-buffer rendering plugin for the sequence editor. I'm not sure if it still works though…

http://www-users.cs.umn.edu/~mein/blender/plugins/sequence/showzbuf/index.html
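Back on the 'load them as layers' idea, it looks like Gimp's Python-Fu could handle the assembling once the buffers are on disk. A rough, untested sketch (assuming Gimp 2.x with Python-Fu; the procedure names should be double-checked in the procedure browser):

[code]
# Rough, untested sketch: stack exported buffer images as layers in a single
# Gimp image via Python-Fu (Gimp 2.x). Run from Gimp's Python console.
from gimpfu import *

def load_buffers_as_layers(filenames):
    base = pdb.gimp_file_load(filenames[0], filenames[0])
    for name in filenames[1:]:
        layer = pdb.gimp_file_load_layer(base, name)
        base.add_layer(layer, -1)      # add on top of the layer stack
    gimp.Display(base)
    gimp.displays_flush()

load_buffers_as_layers(["rhinotest_col.jpg",
                        "rhinotest_depth.jpg",
                        "rhinotest_norm.jpg",
                        "rhinotest_id.jpg"])
[/code]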

Soon I hope to have the following options available for both vertex colour baking and export to image:

Z Depth
Object ID
Face/Vertex normal
Angle to view vector (see the sketch after this list)
UV coordinates

and these two for image export only:

Illumination
Shadows
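For those wondering about the 'angle to view vector' buffer mentioned above, it is just the dot product between the surface normal and the direction from the point to the camera, scaled into a grey value. Roughly:

[code]
# Sketch of the 'angle to view vector' encoding: the dot product between the
# unit surface normal and the unit direction towards the camera, mapped to a
# grey value (facing the camera = white, edge-on = black).
import math

def angle_to_view_grey(normal, point, camera_pos):
    view = [camera_pos[i] - point[i] for i in range(3)]
    length = math.sqrt(sum([c * c for c in view]))
    view = [c / length for c in view]
    dot = sum([normal[i] * view[i] for i in range(3)])
    dot = max(0.0, dot)               # clamp back-facing points to black
    return int(round(255 * dot))

# a face at the origin pointing straight at a camera 10 units up the Z axis
print angle_to_view_grey((0.0, 0.0, 1.0), (0.0, 0.0, 0.0), (0.0, 0.0, 10.0))
[/code]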

Look out for something new by the end of the week…

Any chance of making it possible to render out an image sequence of the chosen buffer type for animations? :slight_smile:

/nozzy

[quote]Any chance of making it possible to render out an image sequence of the chosen buffer type for animations?

/nozzy[/quote]

I’m on it; should’ve mentioned that. The user can now define a project name, set the required buffers and a range of frames and render the frames into a directory dedicated to that project.
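In outline the loop is nothing clever; something like this, with bake_buffers() standing in for the real per-frame work (the stand-in name is made up, not a function from the script):

[code]
# Outline of the animation loop. bake_buffers() is a made-up stand-in for the
# real per-frame work (filling the vertex colours and saving the output).
import Blender

def bake_buffers():
    # stand-in: bake the chosen buffers for the current frame
    pass

def render_range(project, start, end):
    for frame in range(start, end + 1):
        Blender.Set('curframe', frame)   # move the scene to this frame
        Blender.Redraw()
        bake_buffers()
        # ...save this frame's buffers into the project's directory...

render_range("rhinotest", 1, 25)
[/code]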

You're the man :smiley: It's shaping up really nicely. Are the new changes in the download link on page 1, or are you going to make a new version?

/nozzy