Lightning fast AO… integrate into Blender?

Hi! I'm new here.

I'm a game programmer, and I've developed a realtime AO technique, a mix between SSAO and disk-based ambient occlusion. It's a GLSL shader that works in screen space as a post-process. I know Blender has a node editor and that it's possible to composite a final render with it, but I don't know how to create a new compositing node and integrate it into Blender.

This is the output of the algorithm, computed in a few milliseconds (60+ frames per second):
http://www.thehiddenfactory.com/arkano/maxedssao.jpg

I think it would be useful for games made with the Blender game engine, too. Anyway, what I'm asking for here is some info about creating new composite nodes. I've heard there's a Python way to do it, but that it was half-baked, and I'm aware that someone created an SSGI node for Blender, but I have no clue how they did it.

(I don't know if this is the correct forum to post this and ask for help… sorry!)

I would actually ask about this on the mailing list. You will probably get a lot more relevant help there.

http://lists.blender.org/mailman/listinfo/bf-committers

Your other option is to hang out in IRC, but that will involve a lot more waiting around for the right devs to step into the channel.

#blendercoders on freenode

Ok, thank you very much :slight_smile:

I'll ask on the mailing list to see if someone can help me :slight_smile:

Not sure if GLSL and composite nodes play nice together but nodes aren’t very hard to make.

You have a link or something to the code?

omg! this will be killer for sculpting! A great addition, thanks.

If you can come up with a cavity shader, I will sell my children for medical experiments, and send you the money.

Link to a thread with code and more screens:
http://www.gamedev.net/community/forums/topic.asp?whichpage=4&pagesize=25&topic_id=556187
There's also a link to a RenderMonkey (HLSL) project there, so you can test it yourself. The HLSL version is currently more advanced than the GLSL one, though.

It can be used as a cavity shader too, by cranking the intensity up to maximum and using a small radius and scale (there's a rough sketch of that setup at the end of this post). It looks like this: :)

(You can keep your children, btw xD)
I've already written to the list and received a response. I'll try to integrate it (it may take some time, since I've never programmed for Blender) and notify you when it's done.
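
To sketch what the cavity setup means in shader terms: aoBuffer is a placeholder for the raw AO output, and g_intensity mirrors the uniform from the HLSL version linked above; the values are purely illustrative.

//Rough sketch only: "aoBuffer" is a placeholder input, and g_intensity
//mirrors the uniform name from the HLSL version -- not real node inputs.
uniform sampler2D aoBuffer;
uniform float g_intensity;   //cranked to maximum for the cavity look

void main()
{
    float ao = texture2D(aoBuffer, gl_TexCoord[0].st).r;
    //small radius/scale are set on the AO pass itself; here we just
    //exaggerate the result and clamp it into a crisp cavity mask:
    float cavity = clamp(ao * g_intensity, 0.0, 1.0);
    gl_FragColor = vec4(vec3(1.0 - cavity), 1.0);   //white surface, dark cavities
}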

wow, this is awesome :slight_smile:

Other than the actual node in source/blender/nodes/intern/CMP_nodes, here's the basic framework for a composite node.

If you want to do custom buttons and stuff like that, here's an example that shows the basic principle.

Storing data in the .blend is a little more work but not too bad.

Thank you very much!

That looks like some valuable info. I'm going to check it out immediately.

WOW! I don’t have any coding advice, just encouragement. See this to a finished node and it will certainly be included in an official release.

Yeah, this is great for a cavity mask! Are you planning to implement it in the material nodes too?

If you output a grayscale mask from a cavity-mask/AO node then you can use that grayscale mask to set one material for the object and another for the cavity.

Yeah, the output of the node will be a grayscale buffer. You can then use it to dim the lighting, to blend between two materials, or for whatever else you want, not only AO. The screens above use it this way, for example: final = diffuse*(light-AO).
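
To make that concrete, here's a minimal GLSL-style sketch of the compositing step; aoBuffer, diffuseTex and lightTex are just placeholder inputs, not actual node names:

uniform sampler2D aoBuffer;    //grayscale AO mask from the node (placeholder)
uniform sampler2D diffuseTex;  //diffuse color buffer (placeholder)
uniform sampler2D lightTex;    //lighting buffer (placeholder)

void main()
{
    vec2 uv = gl_TexCoord[0].st;
    float ao = texture2D(aoBuffer, uv).r;
    vec3 diffuse = texture2D(diffuseTex, uv).rgb;
    vec3 light = texture2D(lightTex, uv).rgb;

    //dim the lighting, as in the screens above: final = diffuse*(light-AO)
    vec3 finalCol = diffuse * (light - vec3(ao));

    //alternatively, blend two (placeholder) materials with the mask:
    //vec3 finalCol = mix(materialA, materialB, ao);

    gl_FragColor = vec4(finalCol, 1.0);
}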

That's great, looking forward to seeing it working… thanks!

Hey Arkano22!

I've seen your work and you rock!
I've already implemented your SSGI shader in the Blender Game Engine:
http://blenderartists.org/forum/showthread.php?t=171563
though I replaced your random texture with procedural noise…

I also tried the disk-to-disk SSAO, but unfortunately the Game Engine doesn't have a normal buffer…
I did implement a shader that calculates normals from the scene depth, but with your shader they seem to be wrong…

This is the best SSAO I've ever seen :), I just can't imagine anything more faithful to real AO in screen space.

Integrating this into Blender itself would be awesome!

I would love to see this working in the BGE in a way where you activate it just by turning it on in the world settings or the shader settings.

Either way can be done now, because the render settings and BGE settings are completely separated in 2.5: you select the game engine as your renderer and you only get the BGE options.

Yea!
If we could use the cavity shader as a mask for another material, it would be identical to the one in 3D-Coat and ZBrush.

I would just be happy with the ability to change the AO color from black to something else.
Then we could do things like patina; it would be awesome for making a bronze statue.
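
That should be a one-line change wherever the AO gets composited; a quick sketch, with aoBuffer and aoColor being made-up names:

uniform sampler2D bgl_RenderedTexture;
uniform sampler2D aoBuffer;   //grayscale AO/cavity mask (placeholder)
uniform vec3 aoColor;         //e.g. a greenish patina instead of plain black

void main()
{
    vec2 uv = gl_TexCoord[0].st;
    vec3 col = texture2D(bgl_RenderedTexture, uv).rgb;
    float ao = clamp(texture2D(aoBuffer, uv).r, 0.0, 1.0);

    //instead of subtracting the mask (which only darkens towards black),
    //blend towards an arbitrary occlusion color:
    gl_FragColor = vec4(mix(col, aoColor, ao), 1.0);
}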

Either way, even just a black cavity shader is awesome! It will make a really cool toon shader for the game engine!

So I’ve been trying to figure this out…


float doAmbientOcclusion(in float2 tcoord, in float2 uv, in float3 p, in float3 cnorm)
{
  // Vector from the receiver position to the sampled (occluder) position.
  float3 diff = getPosition(tcoord + uv) - p;
  const float3 n = getNormal(tcoord + uv);
  const float3 v = normalize(diff);
  const float  d = length(diff) * 0.1;

  // Disk-to-disk form factor: occluder orientation * receiver orientation
  // * distance falloff, with a small bias (0.05) to reduce self-occlusion.
  return
    (1.0 - saturate(dot(n, -v) - 0.05)) *
    saturate(dot(cnorm, v) - 0.05) *
    (1.0f - 1.0f / sqrt(0.2f / (d * d * g_scale) + 1.0f));
}

PS_OUTPUT main(PS_INPUT i)
{
  PS_OUTPUT o = (PS_OUTPUT)0;

  float d = getDepth(i.uv);

  // Reject pixels at the near plane, and clamp depth so the screen-space
  // radius (which is divided by depth below) never gets too large.
  if (d < 2.0f)
    discard;
  d = max(d, 50.0);

  float3 p = getPosition(i.uv);   // view-space position
  float3 n = getNormal(i.uv);     // view-space normal
  float3 r = getRandom(i.uv);     // per-pixel random vector for jittering

  float ao = 0.0f;

  float incx = g_sample_rad * g_inv_screen_size.x;
  float incy = g_sample_rad * g_inv_screen_size.y;
  float dx0 = incx;
  float dy0 = incy;
  float ang = 0.0;
  const int iterations = 64;

  // Spiral pattern: the radius grows every iteration while the direction
  // rotates by a non-integer fraction of a turn (360/8.5 degrees).
  for (int j = 0; j < iterations; ++j)
  {
    float dzx = (dx0 + r.x * g_jitter) / d;
    float dzy = (dy0 + r.y * g_jitter) / d;
    float a  = radians(ang);
    float dx = cos(a) * dzx - sin(a) * dzy;
    float dy = sin(a) * dzx + cos(a) * dzy;

    ao += doAmbientOcclusion(i.uv, float2(dx, dy), p, n);

    dx0 += incx;
    dy0 += incy;
    ang += 360.0 / 8.5;
  }
  ao /= iterations;   // normalize by the sample count

  o.color.rgb = 1.0f;

  if (g_use_lighting)
    o.color.rgb *= float3(1, 0.7, 0.4) + 0.55f * saturate(dot(n, normalize(float3(1.0f, 3.0f, -2.0f))));

  if (g_use_ambient_occlusion)
    o.color.rgb -= saturate(ao * g_intensity);

  return o;
}

If I understand this correctly, in main() d would be the Z-buffer lookup, n would come from Nor, p would be the x,y pixel position, and r a value looked up from another (random) image.

Not sure about saturate(), g_sample_rad and g_scale.

And this is all per pixel.

--edit-- saturate(x) = clamp(x, 0.0, 1.0)
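
For what it's worth, the rest of the HLSL maps almost one-to-one to GLSL; a quick cheat sheet (the saturate() helper is just a convenience so the form-factor code can stay unchanged when porting):

//HLSL -> GLSL equivalents for the listing above:
//  float2, float3, float4  ->  vec2, vec3, vec4
//  saturate(x)             ->  clamp(x, 0.0, 1.0)
//  PS_OUTPUT / o.color     ->  just write to gl_FragColor
float saturate(float x) { return clamp(x, 0.0, 1.0); }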

bump…

bump +1 :wink:

Uncle Entity:
that is an HLSL shader, though the principle is the same in GLSL…

Here is the GLSL filter working in the Blender Game Engine; attach it to a custom 2D Filter actuator.

But as I said… Blender needs a real normal buffer for this (maybe someone can make a deferred renderer for the BGE :} ). Also, gl_ProjectionMatrixInverse doesn't return any data… a bug, I suppose.


uniform sampler2D bgl_RenderedTexture;
uniform sampler2D bgl_DepthTexture;

//cheap procedural noise, used instead of a random-vector texture:
vec3 grandom(in vec2 coord){
    float noiseR = fract(sin(dot(coord, vec2(12.9898,78.233))) * 43758.5453);
    float noiseG = fract(sin(dot(coord, vec2(12.9898,78.233)*2.0)) * 43758.5453);
    float noiseB = fract(sin(dot(coord, vec2(12.9898,78.233)*3.0)) * 43758.5453);
    return vec3(noiseR, noiseG, noiseB);
}

//reconstruct a normal from the depth buffer via screen-space derivatives
//(returned packed into 0..1, which may be why the normals look wrong):
vec3 readNormal(in vec2 coord){
    float near = 0.5;
    float far = 5.0;
    float depth = texture2D(bgl_DepthTexture, coord).r;
    float depth1 = -near / (-1.0 + depth * ((far-near)/far));
    vec3 worldPoint = vec3(coord, depth1) * depth1;
    vec3 normal = normalize(cross(dFdx(worldPoint), dFdy(worldPoint)));
    return normal*0.5 + 0.5;
}

//reconstruct a view-space position from the depth buffer:
vec3 posFromDepth(vec2 coord){
    float d = texture2D(bgl_DepthTexture, coord).r;
    vec3 tray = mat3(gl_ProjectionMatrixInverse)*vec3((coord.x-0.5)*2.0, (coord.y-0.5)*2.0, 1.0);
    return tray*d;
}

//Ambient occlusion form factor:
float aoFF(in vec3 ddiff, in vec3 cnorm, in float c1, in float c2){
    vec3 vv = normalize(ddiff);
    float rd = length(ddiff);
    return (1.0 - clamp(dot(readNormal(gl_TexCoord[0].st + vec2(c1,c2)), -vv), 0.0, 1.0)) *
           clamp(dot(cnorm, vv), 0.0, 1.0) *
           (1.0 - 1.0/sqrt(1.0/(rd*rd) + 1.0));
}

//GI form factor:
float giFF(in vec3 ddiff, in vec3 cnorm, in float c1, in float c2){
    vec3 vv = normalize(ddiff);
    float rd = length(ddiff);
    return clamp(dot(readNormal(gl_TexCoord[0].st + vec2(c1,c2)), -vv), 0.0, 1.0) *
           clamp(dot(cnorm, vv), 0.0, 1.0) /
           (rd*rd + 1.0);
}

void main()
{
    //read current normal, position and color:
    vec3 n = readNormal(gl_TexCoord[0].st);
    vec3 p = posFromDepth(gl_TexCoord[0].st);
    vec3 col = texture2D(bgl_RenderedTexture, gl_TexCoord[0].st).rgb;

    //randomization vector (procedural, tiled over the screen):
    vec2 fres = vec2(800.0/128.0*5.0, 600.0/128.0*5.0);
    vec3 random = grandom(gl_TexCoord[0].st*fres.xy);
    random = random*2.0 - vec3(1.0);

    //initialize variables:
    float ao = 0.0;
    vec3 gi = vec3(0.0);
    float incx = 1.0/800.0*0.1;
    float incy = 1.0/600.0*0.1;
    float pw = incx;
    float ph = incy;
    float cdepth = texture2D(bgl_DepthTexture, gl_TexCoord[0].st).r;

    //3 rounds of 8 samples each:
    for(int i = 0; i < 3; ++i)
    {
        //jitter the offsets and scale them by 1/depth:
        float npw = (pw + 0.0007*random.x)/cdepth;
        float nph = (ph + 0.0007*random.y)/cdepth;

        vec3 ddiff  = posFromDepth(gl_TexCoord[0].st + vec2( npw, nph)) - p;
        vec3 ddiff2 = posFromDepth(gl_TexCoord[0].st + vec2( npw,-nph)) - p;
        vec3 ddiff3 = posFromDepth(gl_TexCoord[0].st + vec2(-npw, nph)) - p;
        vec3 ddiff4 = posFromDepth(gl_TexCoord[0].st + vec2(-npw,-nph)) - p;
        vec3 ddiff5 = posFromDepth(gl_TexCoord[0].st + vec2( 0.0, nph)) - p;
        vec3 ddiff6 = posFromDepth(gl_TexCoord[0].st + vec2( 0.0,-nph)) - p;
        vec3 ddiff7 = posFromDepth(gl_TexCoord[0].st + vec2( npw, 0.0)) - p;
        vec3 ddiff8 = posFromDepth(gl_TexCoord[0].st + vec2(-npw, 0.0)) - p;

        ao += aoFF(ddiff,  n,  npw,  nph);
        ao += aoFF(ddiff2, n,  npw, -nph);
        ao += aoFF(ddiff3, n, -npw,  nph);
        ao += aoFF(ddiff4, n, -npw, -nph);
        ao += aoFF(ddiff5, n,  0.0,  nph);
        ao += aoFF(ddiff6, n,  0.0, -nph);
        ao += aoFF(ddiff7, n,  npw,  0.0);
        ao += aoFF(ddiff8, n, -npw,  0.0);

        gi += giFF(ddiff,  n,  npw,  nph)*texture2D(bgl_RenderedTexture, gl_TexCoord[0].st + vec2( npw, nph)).rgb;
        gi += giFF(ddiff2, n,  npw, -nph)*texture2D(bgl_RenderedTexture, gl_TexCoord[0].st + vec2( npw,-nph)).rgb;
        gi += giFF(ddiff3, n, -npw,  nph)*texture2D(bgl_RenderedTexture, gl_TexCoord[0].st + vec2(-npw, nph)).rgb;
        gi += giFF(ddiff4, n, -npw, -nph)*texture2D(bgl_RenderedTexture, gl_TexCoord[0].st + vec2(-npw,-nph)).rgb;
        gi += giFF(ddiff5, n,  0.0,  nph)*texture2D(bgl_RenderedTexture, gl_TexCoord[0].st + vec2( 0.0, nph)).rgb;
        gi += giFF(ddiff6, n,  0.0, -nph)*texture2D(bgl_RenderedTexture, gl_TexCoord[0].st + vec2( 0.0,-nph)).rgb;
        gi += giFF(ddiff7, n,  npw,  0.0)*texture2D(bgl_RenderedTexture, gl_TexCoord[0].st + vec2( npw, 0.0)).rgb;
        gi += giFF(ddiff8, n, -npw,  0.0)*texture2D(bgl_RenderedTexture, gl_TexCoord[0].st + vec2(-npw, 0.0)).rgb;

        //increase sampling area:
        pw += incx;
        ph += incy;
    }
    //average the 3*8 = 24 samples:
    ao /= 24.0;
    gi /= 24.0;

    //gl_FragColor = vec4(col - vec3(ao) + gi*5.0, 1.0);
    gl_FragColor = vec4(vec3(ao) + gi*5.0, 1.0); //ao + gi only
}