Simulating Kinect and 3D scanner

Hi
We need to simulate the output of the Kinect and also a regular 3D scanner.
The task is probably trivial for someone who knows their way around Blender, but I'm a total newbie, so any help will be appreciated!

We have imported a regular OBJ file (and used it as skin on an armature, but that's not really relevant, I think?)

Here is what needs to be done:
1) Kinect simulation: simply save the depth map/z-buffer of the camera's view as a JPG.
2) 3D scanner simulation: project stripes onto the scene (structured light) and save the camera's view as a JPG.

(For those who are super alert: the Kinect simulation won't give us exactly what the Kinect outputs, but it's close enough.)
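For (2), the stripe pattern itself doesn't need Blender at all; here's a plain NumPy sketch (function name, sizes, and period are just placeholders) for generating a texture that could then be loaded as an image texture on a spot lamp acting as the projector:

```python
import numpy as np

def stripe_texture(width=512, height=512, period=32):
    """Vertical black/white stripes as an 8-bit grayscale image.

    Save this with any image library and use it as the projector texture.
    `period` is the stripe repeat length in pixels (placeholder value).
    """
    x = np.arange(width)
    on = (x % period) < (period // 2)        # first half of each period is white
    row = np.where(on, 255, 0).astype(np.uint8)
    return np.tile(row, (height, 1))         # repeat the row down the image
```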

Could someone help me out here?

Just in case my question wasn't clear…

All I want to do is render the depth buffer of a scene as a JPG. I.e. open Blender, import an OBJ file, render the z-buffer.
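To be concrete about what "render the z-buffer as a JPG" means numerically, this is the kind of mapping I have in mind (plain NumPy; the near/far clip values are made up for illustration):

```python
import numpy as np

def depth_to_8bit(z, z_near=0.1, z_far=10.0):
    """Clip raw camera-space depth to [z_near, z_far] and scale to 0-255.

    z_near/z_far are placeholder clip distances; the result is an array
    you could hand to any image library to write out as a JPG.
    """
    z = np.clip(np.asarray(z, dtype=np.float64), z_near, z_far)
    norm = (z - z_near) / (z_far - z_near)   # 0.0 at near plane, 1.0 at far
    return (norm * 255).astype(np.uint8)
```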

I was hoping something like this would be rather simple. Can anyone help me out?