So, the idea of somehow blurring the ray-traced indirect lighting and ambient occlusion passes has been rattling around in my mind for a while; the problem was always that the blur would bleed between objects. Then a few days ago I was experimenting with the compositor and realized how ridiculously powerful it is. I also realized that there was this little node called ID Mask…
So long story short: ID Mask into a 1px blur, inverted, mixed with (1-sample IL × OCC), into a 16px blur, mixed with black using the inverted ID Mask as the factor, then mixed with the diffuse and spec.
^16 seconds, 1 sample OCC & IL^
So here are a bunch of renders showing different settings:
without the blur but at 22 samples, 4min 18sec:
without the blur, 1 sample with cache, 1min 2sec:
![1smpl-cache-noblur-1min2sec.png](http://www.dgdigital.net/blenderblurtrace/1smpl-cache-noblur-1min2sec.png)
approximate IL, no blur, 0.25 error, cache, 20sec:
no IL or OCC, no blur, 10sec:
Basic explanation of what’s happening: Blender renders the scene with the indirect light and OCC separated from the diffuse/spec passes. It then masks and blurs the indirect light and OCC for each object separately, and mixes the result back with the original image.
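For anyone who wants the per-object blur spelled out, here’s a rough sketch in plain Python (hypothetical helper names; the real setup is compositor nodes, not a script): the noisy IL*OCC pass is box-blurred, but only pixels inside the same object’s ID mask are averaged, so neighbouring objects can’t bleed into each other.

```python
def masked_box_blur(pass_img, id_mask, radius):
    """Box-blur a 2D grayscale pass (e.g. a noisy IL*OCC pass),
    averaging only pixels that belong to the same object according
    to the ID mask, so the blur cannot bleed across silhouettes."""
    h, w = len(pass_img), len(pass_img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if not id_mask[y][x]:
                continue  # outside the object: stays black here
            total, count = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w and id_mask[ny][nx]:
                        total += pass_img[ny][nx]
                        count += 1
            out[y][x] = total / count
    return out

# Toy 1x4 frame: the left two pixels belong to the masked object,
# the right two to a bright neighbouring object.
noisy_pass = [[1.0, 0.0, 5.0, 5.0]]
id_mask    = [[1,   1,   0,   0]]
smoothed = masked_box_blur(noisy_pass, id_mask, radius=1)
# The neighbour's 5.0 values never bleed into the masked object:
# smoothed == [[0.5, 0.5, 0.0, 0.0]]
```

Doing this once per object ID and summing the results is roughly what the chain of ID Mask, blur, and mix nodes is computing.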
Also, my renders are all slow because I’m using a vintage single-core Athlon 64…
and here’s the .blend
btw, I’m using the render branch 2.52 r29776 because it seems there is no 2.53 render branch yet… if there is, please let me know.
There are many, many obvious issues, but I think if this were implemented into the renderer like the cache, and set to mask every face instead of every object, it could be a viable time saver…
Another option would be to do this almost backwards: have a script that bakes OCC & IL, then blurs that bake and applies it to all objects from the camera’s perspective only.
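That backwards variant could look something like this sketch (again hypothetical names, plain Python standing in for a bake plus a compositor blur): blur the baked OCC/IL buffer once in camera space, then multiply it over the beauty pass. No per-object masks here, since the bake is already attached to surfaces.

```python
def box_blur(baked, radius):
    """Unmasked box blur over a camera-space OCC/IL bake."""
    h, w = len(baked), len(baked[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += baked[ny][nx]
                        count += 1
            out[y][x] = total / count
    return out

def apply_bake(beauty, occ_il):
    """Multiply the blurred OCC/IL bake over the rendered beauty pass."""
    return [[b * o for b, o in zip(brow, orow)]
            for brow, orow in zip(beauty, occ_il)]

# A single noisy bright sample gets smeared out, then darkens
# the beauty pass smoothly instead of as a hard dot.
smoothed = box_blur([[0.0, 1.0, 0.0]], radius=1)
composite = apply_bake([[2.0, 3.0, 6.0]], smoothed)
```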
P.S. This is my first post; I’m really sorry if it’s in the wrong place or if I’ve set it up wrong…