I think I know how Perfect Dark (BTW, I don’t actually own this game) messes up your vision when you get hit really hard or injured (or whatever), and I wonder if it would be possible to do in Blender’s real-time engine. I don’t know OpenGL and am not very good at Python, but I think I have planned this out well enough that someone who does could judge its feasibility.
ok, here goes:
- access to each frame that is rendered (before it is displayed to the screen)
- access to an image buffer of the same size
- a way to combine the two with a variable amount of opacity (of the currentFrame)
So then, each frame displayed on the screen is the combination of the buffer and the current frame, weighted by the opacity variable, and the result is stored back into the buffer. On the first frame the opacity is full. The opacity will have to be adjustable to account for the variation this effect would cause with an unpredictable framerate on the user’s computer.
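One way to make the trail fade at the same real-time rate no matter the framerate is to derive the per-frame opacity from the frame’s delta time. A minimal sketch of that idea (the function name and the `persistence_per_second` tuning knob are my own inventions, not any Blender API):

```python
def frame_opacity(dt, persistence_per_second=0.1):
    """Opacity of the current frame for one blend step.

    persistence_per_second: what fraction of the old buffer should
    still be visible after one real second (a hypothetical tuning
    parameter).  Because the leftover fraction per step is
    persistence_per_second ** dt, two half-length frames fade the
    buffer exactly as much as one full-length frame would.
    """
    return 1.0 - persistence_per_second ** dt
```

So at 30 fps each frame gets a smaller opacity than at 10 fps, but the ghost images decay at the same speed in wall-clock time either way.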
I hope that made sense. Here is a lower-level explanation (nothing new here if the above made sense):
prevFrames # the image buffer
opacity # starts at full (1.0)
on each new frame: # before it is displayed
    # combine the buffer and the current frame
    theCurrentFrame = combine(prevFrames, theCurrentFrame, opacity)
    # store a copy of the image, not the pointer or handle
    prevFrames = copy(theCurrentFrame)
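As a sanity check of the blend logic above, here is a small simulation using NumPy arrays to stand in for frame buffers. The real thing would need access to the rendered frame (e.g. OpenGL render-to-texture in the game engine), which I can’t show; this only demonstrates the math:

```python
import numpy as np

def blend(prev_frames, current_frame, opacity):
    """Composite the current frame over the buffer at the given opacity."""
    return opacity * current_frame + (1.0 - opacity) * prev_frames

# Simulate three frames of solid gray values 1, 2, 3 on a tiny 4x4 "screen".
h, w = 4, 4
prev_frames = np.zeros((h, w))
opacity = 1.0  # first frame: opacity full, so the buffer becomes the frame
for i in range(3):
    current = np.full((h, w), float(i + 1))
    combined = blend(prev_frames, current, opacity)
    prev_frames = combined.copy()  # a copy, not a reference to the same array
    opacity = 0.3  # after the first frame, drop the opacity to leave a trail
```

After frame 1 the buffer is 1.0; frame 2 gives 0.3·2 + 0.7·1 = 1.3; frame 3 gives 0.3·3 + 0.7·1.3 = 1.81, so each old frame fades geometrically rather than ever fully disappearing.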
What will it look like (in theory)?
Assuming the opacity has been set low, and we are viewing a frame significantly after the first one, it would look like motion blur gone more than crazy, and the current position would be difficult to distinguish from the previous ones.
Unfortunately I can’t put my mind’s picture of this into this explanation, but I hope you can understand.