Paper: Approximating Dynamic Global Illumination in Image Space

Approximating Dynamic Global Illumination in Image Space :yes:

It's an improvement beyond ambient occlusion in image space (as seen in the game Mirror's Edge, for example) that has colour bleeding…

I think it was mentioned on the Peach blog that the AAO implementation Blender currently has could also allow, in the future, one bounce of colour bleeding…
I'm no tech, but I liked the sample images in the document :slight_smile: Thanks for sharing, Otomo :wink:

Hm, doesn't look too bad.

color bleeding is seriously needed.

It just underlines that we seriously need SOME kind of GI solution.

I’m not a coder, but if there are a multitude of open source realistic raytracers, why can’t Blender get in on the action? We really need SOMETHING.

Someone send this paper to Ton.

There was already an AO build with color bleeding. I wonder why they didn't use that?

Reading it. Very, very fast. Nice results. Don't expect anyone with the chops to implement this to actually pick it up and run with it right now, though. Everyone's working like crazy on 2.5.

I ask myself the same thing.

Well, Blender's raytracer, from what I read, is not the fastest anymore and also needs some polishing up. But I can imagine it's just not so easy to dump it, put a new one in, and make that one work with all the other systems Blender offers, like the node system.

The node system improved greatly, and I hope after 2.5 is settled they go and look into extending the GI solutions.

We have AO, AAO, and Lightcuts, but all have their ups and downs:
AO is slow.
AAO is fast but doesn't seem good for products.
Lightcuts is still in development.

Let's see where all this goes.

The IBL patch uses some tricks to simulate second bounces, and as I was told it is fast simply because it cheats. But it is quite fast, and the results I got with it were honestly great.

I could be wrong, but IIRC the IBL patch had nothing to do with shadows or bounces.

I've been told that it isn't a 'true' Global Illumination solution, but it looks amazing. The current tragedy is that apparently it won't compile against 2.48.

I have personally found that external renderers are kind of a mixed bag. Yafaray is great, but on my Steam Walker, it died on the table. That could be due to the complexity of the scene or something else, but this isn’t the first time this has happened. Previously, I’ve been chasing all over the web trying to find out what arcane error messages in Indigo mean.

I generally find that when a scene reaches a certain complexity, or you do something in your model that your rendering solution doesn't like, suddenly you don't have a rendering solution apart from Blender internal.

I mean, if freaking Truespace has a GI solution, then surely we can beg, borrow, or steal something that will work in the internal renderer? Lightcuts is very promising, but it has a few really bad quirks, and the speed, considering its funky method, isn't really on a par with something like Yafaray (when it works).

Perhaps you might consider looking into LuxRender (which is scheduled to release a new version in the next couple of days). It's a GI solution supporting both biased and unbiased methods.

It works quite well with Blender, and is much more stable than yaf(a)ray currently is.

Yes, the IBL patch is not true GI, but that doesn't matter much because the results are pleasing.

I also agree that external systems are a mixed bag, because they can never support all the jazz Blender has. Unless they integrate Yaf(a)ray the way Maya includes Mental Ray.

Until the external renderers are able to render good-quality motion blur, they are limited as far as proper production use is concerned.

And then I visit the LuxRender forums out of curiosity and see examples of MOTION BLUR being implemented for the next version. :cool:

Well, cheers then, 'cause I'm going to have to give it a whirl too when it comes out. :slight_smile:

I’ll have to give the new version a whirl. I was using the CVS version and it crashed when trying to render my last scene as well.

I love things like Yafaray and LuxRender, but I just need something I can count on not to expire halfway through a project.

This is going to be a silly question, but is anyone thinking of coding the method described at the beginning of this thread into Blender? I've seen a number of recent "instant radiosity/raytracing" methods proposed by the likes of Nvidia. It would be nice to see if they are any good and whether we could get something like them into Blender.

Okay, so I got curious after reading the paper and started digging into nodes. I'm coding a node for doing this sort of fake Screen Space Global Illumination. Some pics:

Plain render:

SSGI pass:


Right now it’s just checking normals, and not very intelligently. It should be much better once I get the normal evaluation to be a little more sophisticated, and add in depth-based attenuation.
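For anyone curious what "just checking normals" plus depth-based attenuation could look like, here's a rough numpy sketch of a naive screen-space colour-bleeding gather. This is purely my own illustration of the idea, not the actual node code: the function name `ssgi_pass`, the facing/attenuation weighting, and the parameters are all made up for the example.

```python
import numpy as np

def ssgi_pass(color, normal, depth, radius=4, strength=0.5):
    """Naive screen-space colour-bleeding gather (illustrative sketch only).

    color  : (H, W, 3) float RGB of the plain render
    normal : (H, W, 3) float camera-space unit normals
    depth  : (H, W)    float linear depth
    Nearby pixels whose normals face a pixel contribute their colour,
    attenuated by the depth difference (fake "distance").
    """
    H, W, _ = color.shape
    bleed = np.zeros_like(color)
    weight = np.zeros((H, W, 1))
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dx == 0 and dy == 0:
                continue
            # shift the whole image so every pixel sees one neighbour
            # (np.roll wraps at the borders -- fine for a sketch)
            sc = np.roll(color, (dy, dx), axis=(0, 1))
            sn = np.roll(normal, (dy, dx), axis=(0, 1))
            sd = np.roll(depth, (dy, dx), axis=(0, 1))
            # facing term: a sender whose normal opposes ours bleeds the most
            facing = np.clip(-np.sum(normal * sn, axis=-1), 0.0, 1.0)
            # depth-based attenuation: senders far away in depth contribute less
            atten = 1.0 / (1.0 + np.abs(depth - sd))
            w = (facing * atten)[..., None]
            bleed += w * sc
            weight += w
    bleed /= np.maximum(weight, 1e-6)
    return color + strength * bleed
```

On a flat wall (all normals parallel) the facing term is zero everywhere, so nothing bleeds, which is roughly the behaviour you'd want from a normals-only test.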

Ah, the method that I wrote down years ago, but nobody believed in it. I opened a topic here years ago too, but nobody believed me. Yeah, I am cursed.

this is that topic…
(image missing…)

Another try from some months ago, with an idea to convert the normal pass to a speed pass and use motion blur to fake radiosity:
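To make the trick above concrete, here's a small numpy sketch of what "normal pass to speed pass, then motion blur" could mean. Everything here is an assumption of mine, not the original setup: `normal_to_speed` just reuses the normal's screen-plane components as a blur vector, and `vector_smear` stands in for what a vector/motion-blur node would do with that pass.

```python
import numpy as np

def normal_to_speed(normal, scale=4.0):
    """Remap a normal pass into a fake 2D speed pass (hypothetical).

    normal : (H, W, 3) camera-space normals in [-1, 1]
    returns (H, W, 2) per-pixel screen-space blur vectors; surfaces
    tilted away from the camera get longer smear vectors.
    """
    return normal[..., :2] * scale

def vector_smear(color, speed, steps=4):
    """Average each pixel's colour along its speed vector (box filter),
    mimicking a motion-blur node driven by the converted pass."""
    H, W, _ = color.shape
    ys, xs = np.mgrid[0:H, 0:W].astype(float)
    out = np.zeros_like(color)
    for t in range(steps):
        f = t / max(steps - 1, 1)
        # sample along the vector, clamped to the image borders
        sy = np.clip((ys + f * speed[..., 1]).round().astype(int), 0, H - 1)
        sx = np.clip((xs + f * speed[..., 0]).round().astype(int), 0, W - 1)
        out += color[sy, sx]
    return out / steps
```

The smear drags colour across neighbouring surfaces at different orientations, which is where the radiosity-like bleed would come from; pixels facing the camera straight on get a zero vector and stay sharp.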

harkyman! Very good result!