Assuming you are trying to apply depth of field (DOF) to a Blender 3D image, but the default Defocus filter is not doing the job the way you want, you can either pass the parts of the image through different render layers and then through separate DOF passes and mix the results, or you can use stand-in objects on another pass to generate a fake z pass and merge that with the original z pass. Not as easy as it sounds, as many of the processes you can do on an image don't work on a z pass. (You can't blur it, for example, as this makes no sense logically -- a z value is a distance, not a colour.)
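To make the "merge a fake z pass with the original" idea concrete, here is a hypothetical sketch in plain Python (not the actual Blender node API), using lists as stand-ins for per-pixel depth buffers. Since a z pass stores a camera-space distance per pixel, the merge that makes sense is "nearest surface wins" (a per-pixel minimum), which is what a z-combine effectively does to depths:

```python
# Stand-in for merging a fake z pass with the original z pass.
# Each list is one depth value per pixel; smaller = nearer the camera.

def merge_z_passes(original_z, fake_z):
    """Combine two z passes by keeping the nearer depth at each pixel."""
    return [min(a, b) for a, b in zip(original_z, fake_z)]

# Example: the fake pass pulls the middle pixel closer and is "empty"
# (pushed out to a far value, here 99.0) everywhere else.
original = [10.0, 25.0, 40.0]
fake     = [10.0, 15.0, 99.0]
merged = merge_z_passes(original, fake)
print(merged)  # [10.0, 15.0, 40.0]
```

The merged depths then feed the DOF pass in place of the original z buffer; the function names here are illustrative only.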
If you have a problem where, for example, a reflection shows sharp when the reflected object is at a distance where it should be blurred, you can use a separate pass to isolate the reflective object, then put its reflection pass alone through a separate DOF pass in which you use an Add node to tweak the z buffer data. This lets you make the reflected objects seem further away than they are. You then need to add the reflection pass back into the reflecting object, and z-combine the whole thing back into the scene.
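The Add-node-plus-Z-Combine step above can be sketched the same way, again in plain Python as a hypothetical stand-in for the node setup: offset every depth in the reflection's z buffer so the reflected objects read as further away, then z-combine with the scene by keeping the nearer fragment at each pixel.

```python
# Stand-in for an Add node applied to a z buffer, then a Z Combine.

def offset_z(z_pass, amount):
    """Mimic an Add node on a z buffer: push every depth back by `amount`."""
    return [z + amount for z in z_pass]

def z_combine(img_a, z_a, img_b, z_b):
    """Keep whichever pixel is nearer the camera, as Z Combine does."""
    out_img, out_z = [], []
    for ca, za, cb, zb in zip(img_a, z_a, img_b, z_b):
        if za <= zb:
            out_img.append(ca)
            out_z.append(za)
        else:
            out_img.append(cb)
            out_z.append(zb)
    return out_img, out_z

# Reflection pass pushed 20 units back before recombining with the scene.
refl_rgb, refl_z = ["r1", "r2"], offset_z([5.0, 30.0], 20.0)
scene_rgb, scene_z = ["s1", "s2"], [12.0, 60.0]
img, z = z_combine(refl_rgb, refl_z, scene_rgb, scene_z)
print(img, z)  # ['s1', 'r2'] [12.0, 50.0]
```

The 20-unit offset is an arbitrary example value; in practice you would tune the Add node until the reflection's blur matches the apparent distance of the reflected objects.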
If you need to tweak the reflected/refracted objects independently, then there is no option but to build a set of scenes, each containing the reflecting/refracting object and some of the reflected/refracted objects, and then merge all the parts back together.
Unfortunately, Blender does not provide a reflected/refracted z pass, alpha pass, or index pass, all of which would be very useful, as would the ability to make an object completely invisible to a layer by stopping it from affecting the shadow, reflection, and refraction passes on layers where it does not appear. At the moment, an object will still be seen in reflections even when it is on a hidden layer… Weird!
Another option would be for Blender to do a 'reflection to mesh' conversion, creating a pseudo-mesh representing the objects seen in the reflection. But I digress.
Matt