Hmm, how is that going to work? I don't see a way of doing this at render time. But by looking at the original slit-scan effect (in 2001: A Space Odyssey), maybe it's possible to redo the effect inside Blender by recreating the same rig/setup they used at the time. I guess that's tough too, because there may be a trick involving exposing the camera film in a non-standard way, and then it won't work.
The big issue is that when rendering frame 100, you can't access geometry / render data from frame 1.
In the compositor there is a bit of the same problem: you can't access the same image sequence at different frames without loading n copies of the sequence, each offset by one frame. That's doable but very, very inefficient.
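Just to show the scale of the problem, here is a rough sketch (my own assumption of how you'd script it, not a recommendation) that loads a handful of offset copies of the same sequence as compositor Image nodes; a real per-column slit scan would need one copy per pixel column:

```python
import bpy

# Hypothetical sequence path, length and copy count.
SEQ_PATH = "//frames/frame_0001.png"
SEQ_LENGTH = 250
NUM_COPIES = 10   # would have to be ~1920 for a per-column slit scan

scene = bpy.context.scene
scene.use_nodes = True
tree = scene.node_tree

for i in range(NUM_COPIES):
    img = bpy.data.images.load(bpy.path.abspath(SEQ_PATH), check_existing=False)
    img.source = 'SEQUENCE'

    node = tree.nodes.new('CompositorNodeImage')
    node.image = img
    node.frame_duration = SEQ_LENGTH
    node.frame_offset = i          # each copy lags one more frame behind
    node.location = (0, -300 * i)
    # Each output would still have to be cropped to a single column and
    # merged, so the node count and memory use explode very quickly.
```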
In the VSE I think something is doable too, but it's the same issue as with the compositor: you'll end up loading 1920 copies of the image sequence and offsetting them.
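The VSE version would look roughly like this (same assumption of a numbered PNG sequence; paths and strip names are made up), and it hits the same wall:

```python
import bpy
import os

FRAMES_DIR = "/tmp/frames"   # hypothetical folder of frame_0001.png, frame_0002.png, ...
NUM_COPIES = 10              # would need to be ~1920 for a per-column slit scan

scene = bpy.context.scene
if scene.sequence_editor is None:
    scene.sequence_editor_create()
seqs = scene.sequence_editor.sequences

files = sorted(f for f in os.listdir(FRAMES_DIR) if f.endswith(".png"))

for i in range(NUM_COPIES):
    # One copy of the whole sequence per offset, shifted back by i frames.
    strip = seqs.new_image(
        name=f"offset_{i}",
        filepath=os.path.join(FRAMES_DIR, files[0]),
        channel=i + 1,
        frame_start=1 - i,
    )
    for f in files[1:]:
        strip.elements.append(f)
    # Each strip would still need a crop/transform down to a single column.
```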
With python you’re screwed too because it won’t help anything related to “doing it at rendertime” and you’ll end up writing a slitscan plugin that load each image and paste it on another one.
One thing that should be possible is to look into lattice deformers and try to mimic a similar deformation. As long as you're working with abstract images, that might give something that works, but it won't be slit-scan anymore, just a geometry deformation that gives a similar effect.
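As a sketch of that lattice idea (setup only; names and point counts are my own assumptions), something like this gives you a textured plane you can smear with an animated lattice:

```python
import bpy

# Plane that would carry the abstract image as a texture
bpy.ops.mesh.primitive_plane_add()
plane = bpy.context.active_object

# Simple subdivision so the lattice has vertices to push around
sub = plane.modifiers.new(name="Subdiv", type='SUBSURF')
sub.subdivision_type = 'SIMPLE'
sub.levels = 6
sub.render_levels = 6

# Lattice with many points along U, so each "column" can be offset a bit
bpy.ops.object.add(type='LATTICE')
lat = bpy.context.active_object
lat.data.points_u = 16
lat.data.points_v = 2
lat.data.points_w = 1

lmod = plane.modifiers.new(name="Lattice", type='LATTICE')
lmod.object = lat
# Animating the lattice points over time smears the image, but it is purely
# a geometric deformation; there is no real time displacement involved.
```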
I'd look at another application: Natron has a slit-scan plugin too and is a good compositing application. You could maybe render your images as OpenGL sequences to get a quick preview and limit the final exports. That said, I can understand why you'd want to do everything in the same application…