Possible OS X BGE bug??

I’m going out of my mind trying to figure out why this toon outline filter and other filters I’ve tried won’t work for me.

I’ve been told there is nothing wrong with the script and that it works on Windows machines:

uniform sampler2D bgl_RenderedTexture;
uniform sampler2D bgl_DepthTexture;


void main()
{
  float depth = texture2D( bgl_DepthTexture, gl_TexCoord[0].xy).r;
  float depth2 = texture2D( bgl_DepthTexture, gl_TexCoord[0].xy + vec2(0,0.002)).r;
  float depth3 = texture2D( bgl_DepthTexture, gl_TexCoord[0].xy + vec2(0.002,0)).r;

  float fac = abs(depth-depth2) + abs(depth-depth3);

  float intensity = 300;

  vec4 sub = vec4(fac*intensity,fac*intensity,fac*intensity,0);

  gl_FragColor = texture2D( bgl_RenderedTexture, gl_TexCoord[0].xy ) - sub;
}
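For anyone wondering what the filter is actually doing: it samples the depth buffer at each pixel and at two neighboring pixels (one above, one to the right), and darkens the pixel wherever the depth jumps sharply, which happens at object silhouettes. Here's a rough Python sketch of that logic on a toy depth buffer (my own illustration for clarity, not part of the actual shader):

```python
# Toy illustration of the depth-difference outline: sample depth at a
# pixel and at two neighbors; a large difference marks an edge.

def outline_factor(depth, x, y, intensity=300.0):
    """Return how much to darken pixel (x, y), mimicking the shader's
    fac = |depth - depth2| + |depth - depth3| scaled by intensity."""
    d  = depth[y][x]
    d2 = depth[y + 1][x]      # neighbor at offset (0, 0.002) in the shader
    d3 = depth[y][x + 1]      # neighbor at offset (0.002, 0) in the shader
    fac = abs(d - d2) + abs(d - d3)
    return fac * intensity

# A 3x3 depth buffer: the left column is a near object (depth 0.2),
# the rest is far background (depth 0.9).
depth = [
    [0.2, 0.9, 0.9],
    [0.2, 0.9, 0.9],
    [0.2, 0.9, 0.9],
]

flat = outline_factor(depth, 1, 1)   # inside the background: no edge
edge = outline_factor(depth, 0, 1)   # on the object border: strong edge
print(flat, edge)
```

Where the scene is flat the factor is zero and the rendered color passes through untouched; at the border the factor is large, so subtracting it draws the dark outline.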

Is there anyone out there who wouldn’t mind giving this a shot and letting me know whether it works on OS X? I am desperate and frustrated.

EDIT
SOLVED: I asked for help on MacRumors as well and got this reply:

"GLSL on Mac OS tends to be strict. If you declare a float you must assign a float. There are two instances where the author declared float, but assigned an integer. The author is lazy.

This is the correct code:

uniform sampler2D bgl_RenderedTexture;
uniform sampler2D bgl_DepthTexture;

void main()
{
  float depth = texture2D( bgl_DepthTexture, gl_TexCoord[0].xy).r;
  float depth2 = texture2D( bgl_DepthTexture, gl_TexCoord[0].xy + vec2(0.0, 0.002)).r;
  float depth3 = texture2D( bgl_DepthTexture, gl_TexCoord[0].xy + vec2(0.002, 0.0)).r;

  float fac = abs(depth-depth2) + abs(depth-depth3);

  float intensity = 300.0;

  vec4 sub = vec4(fac*intensity, fac*intensity, fac*intensity, 0.0);

  gl_FragColor = texture2D( bgl_RenderedTexture, gl_TexCoord[0].xy ) - sub;
}

"