Point-based rendering article - Blender adaptable?

Just bumped into an interesting article via a posting on another forum, cgsociety.org. It seems to describe a RenderMan implementation of point-based rendering as an alternative to a full ray-tracing and occlusion solution. While a TD will still want the flexibility to choose how a given scene is lit, this is another option, and it seems to render in a shorter time frame with fewer resources. I am curious whether this is something that is being discussed here, as I did a search and did not find any related threads.

http://features.cgsociety.org/story_custom.php?story_id=5615&referer=newsletter

Already done, not as far as Pixar but pretty close: turn on indirect lighting, then turn on approximate gather, and hey presto, point-based GI.
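
If you want to poke at it from a script, this is roughly that setup - a hedged sketch assuming the 2.5-series bpy API, so the exact property names (gather_method, use_indirect_light, etc.) may differ in render branch builds:

```python
import bpy

# World lighting settings of the active scene (2.5-series API names assumed).
ls = bpy.context.scene.world.light_settings

ls.gather_method = 'APPROXIMATE'   # point-based (AAO-style) gathering instead of raytrace
ls.use_ambient_occlusion = True    # occlusion term
ls.use_indirect_light = True       # the colour-bleed / point-based GI part
ls.indirect_factor = 1.0           # strength of the indirect bounce

bpy.ops.render.render()            # same as pressing F12
```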

I will have to check out your comment and do some comparisons. If this is like the article, I sure haven’t heard the hurrahs on the forum about the speed-up and results, nor have I heard it related to the RenderMan solution, but I will await the wisdom of others.
Paul

AAO came into Blender in 2.46, so point-based ambient occlusion has been there for some time now, though that was without the colour bleed stuff that is now in the render branches.

Paul,

color bleeding and HDRI IBL are only possible with the raytrace version of AO.
Currently that mode is hard to use because it is so slow.

Renderings look much better and nicer, but this method needs a big speed improvement.
In conjunction with mirror reflections it is nearly unusable.
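
For reference, the raytrace mode being described would look roughly like this from a script - again only a sketch assuming the 2.5-series bpy API, so check the property names against your build:

```python
import bpy

ls = bpy.context.scene.world.light_settings

ls.gather_method = 'RAYTRACE'          # the slow but fuller-featured mode
ls.use_ambient_occlusion = True
ls.use_environment_light = True        # environment / IBL contribution
ls.environment_color = 'SKY_TEXTURE'   # sample the world (e.g. HDRI) texture colours
ls.samples = 16                        # more samples = less noise, but slower
```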

Brecht is doing a full shader/render recode using many of the latest techniques, which mfoxdogg did a great “preview-tour” video of on his blog. This long-awaited and much-needed refactor will transform Blender’s renderer into a powerhouse, IMO. Check it out, watch the vid, and build and post results from the latest code in the render branch! Trick stuff! Go Brecht! :smiley:

(I’m really hoping that ypoissant was able to contribute some of his experience/insights into the design, too.)

Be aware that this is a marketing article. Point-based techniques are viable but not an end-all solution, and when they talk about how slow raytracing is, they must be referring to RenderMan’s raytracing, because that really sucks compared to V-Ray, for example. Read the thread on CGTalk for more info.

Brecht took every paper into account before he decided, and I’m quite confident he chose the best solution to integrate into Blender.

@mzungu:
Brecht and Yves had a long and highly technical discussion on the bf-committers mailing list about the shader refactor design. Hope they mixed their experiences & talents…

Looks like us common folk can test it out for ourselves.

http://www.renderpixie.com/pixiewiki/Documentation/Point_based_GI

Maybe we need a Blender render branch vs Pixie GI smackdown…

Yeah, it was reading some of that thread (what I could comprehend of it, anyway) that got me stoked to hear that Brecht is tackling this challenge. I only hope that Yves was able to impart some of what his research had taught him, as it seems he had (at the time) been coding a similar & proprietary (read: commercial) algorithm for his employers. Great stuff!

I’m really salivating over this upgrade, as I hope it will allow the incorporation (among many other things) of a real-world-relatable shader preset library into Blender, setting aside the need for endless tweaking of the current system’s myriad legacy shader settings, which must each be painstakingly adjusted for every different scene setup. Perhaps that capability will remain for those who love to (or maybe need to) fake things to achieve a particular look. But I’d be happy to just, for example, select the “glass” preset from a menu list, maybe nudge a couple of the settings to my taste, then F12 away - knowing it’ll come out looking at least close to glass from the get-go.

I gathered, from reading their discussion, as well as some other postings of Yves here at BA, that this becomes much more “do-able” with a BxDF setup such as Brecht is implementing.

Uncle

does Pixie easily work with Blender 2.5?

I thought it was decided earlier this year that BxDF materials wouldn’t be done for Durian, because it would be a lot of work for only a minor improvement in quality (for the Durian project itself, that is).

I believe you can read that on the Durian blog itself.

I am curious where this will go.

Feature-wise, the 2.5 render branch in its current stage is already better than 2.4 and 2.5.

With improved code and the new material shading system I hope it will catch up.

HyperShot and all those quick and easy HDRI render tools are gaining more and more popularity, and I cannot push Blender since it is too slow and complicated to use.

I hope that with better HDRI IBL support and way faster raytracing I could promote it again.

A simple HDRI file, a point light or sun, and the rendering already looks brilliant in Blender.
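
For what it’s worth, that “HDRI file plus a sun” setup would look something like this as a script - a hypothetical sketch assuming the 2.5-series bpy API, with a placeholder file path, and using the light settings shown earlier in the thread:

```python
import bpy

world = bpy.context.scene.world

# Load the HDR image and hook it up as an angular-mapped world texture.
img = bpy.data.images.load("/path/to/environment.hdr")  # placeholder path
tex = bpy.data.textures.new("hdri_env", type='IMAGE')
tex.image = img

slot = world.texture_slots.add()
slot.texture = tex
slot.texture_coords = 'ANGMAP'   # angular-map projection, typical for light probes
slot.use_map_horizon = True      # show the image in the background

# One sun lamp as the key light.
bpy.ops.object.lamp_add(type='SUN', location=(0.0, 0.0, 10.0))
```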

Supposedly one of the external renderer scripts speaks RenderMan, but I haven’t tried it yet. Can’t seem to find it either; maybe they took it out of trunk or something.

Sigh, you all glossed over my post. Why do you have to use Pixie when Blender already has this? I pointed out that this has already been available in Blender since 2.46 as AAO, minus the color bleeding stuff (which I believe is now available through the render branches; I have never used those branches). Here are the release notes for AAO; look at the first and last links in particular.

Point-based ambient occlusion is AAO:

http://www.blender.org/development/release-logs/blender-246/approximate-ambient-occlusion/

Tyrant

Sigh, did you even read my previous post? Color bleeding only works with the raytrace gather method
and NOT with AAO.

:wink:

Point-based occlusion and color bleeding will not seem much faster than photon mapping unless you’re dealing with micropolygons and displacements. This is because in RenderMan all geometry is diced into micropolys, so it’s a small step to record area or color information from each into a point cloud, then reuse those clouds in the beauty pass for occlusion and indirect lighting. When raytracing with micropolys the amount of geometry traced is insane, so point clouds are the only way to go in RenderMan.
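
To illustrate the bake-once/reuse idea in the simplest possible terms (this is not RenderMan or RIBMOSAIC code, just a toy Python sketch; a real point cloud also stores normals and micropoly areas and does proper filtering rather than a nearest-point lookup):

```python
import math
import pickle

def expensive_occlusion(position):
    """Stand-in for a costly per-point query (ray casts, occlusion, etc.)."""
    x, y, z = position
    return 0.5 + 0.5 * math.sin(x) * math.cos(y)  # fake but deterministic

def bake_point_cloud(points, filename):
    """'Bake' pass: evaluate the expensive quantity once per point and store it."""
    cloud = [(p, expensive_occlusion(p)) for p in points]
    with open(filename, "wb") as f:
        pickle.dump(cloud, f)

def lookup(cloud, position):
    """'Beauty' pass: reuse the nearest baked value instead of recomputing it."""
    return min(cloud, key=lambda entry: math.dist(entry[0], position))[1]

if __name__ == "__main__":
    shading_points = [(i * 0.1, i * 0.2, 0.0) for i in range(100)]
    bake_point_cloud(shading_points, "occlusion.pcl")
    with open("occlusion.pcl", "rb") as f:
        baked = pickle.load(f)
    print(lookup(baked, (1.05, 2.1, 0.0)))
```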

RIBMOSAIC supports point clouds as well as raytracing, photon mapping and env map techniques, which you can see at the bottom of this blog. However, I have not documented any of this as I’m busy writing the new RIBMOSAIC for Blender 2.5 :frowning:

Eric Back (WHiTeRaBBiT)

New RIBMOSAIC for 2.5? Yay …

Any ETA?

The render branch can now also handle micropolygons; however, they are not accounted for in the shadow pass.

Claas

The per-tile subdivision isn’t done with the same technique as RenderMan, though; the Blender Internal renderer in render25 is still not REYES-based. It’s basically a different technique that allows for micropolygon-quality subdivision while not being built on a micropolygon core.

Yep, that is true. But it looks nice :wink: