Simulate photographic exposure in Blender

Hi,

does anyone know how to do something like this in Blender?
http://www.deliverhouston.com/images/night_traffic.jpg?nxg_versionuid=published

I’m not referring specifically to simulating night traffic. I’m referring to simulating in Blender the behavior of photographic film during a long exposure.

My first idea was:

  • use the VSE, apply a glow effect, create a metastrip
  • duplicate the metastrip, offset it by 1 frame with Add blending mode, and create a new metastrip
  • duplicate that metastrip, set Add blending mode

and so on. But this is tedious, and I don’t think it gives real control over the process. I could simply use GIMP and add the sequenced images as layers with Add blending mode, but I want a technique that is easy to animate.
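Even a script would only automate the duplication part, but just to illustrate the stacking idea, here is a rough, untested bpy sketch (the frames folder, the number of copies and the exact API calls are placeholders from memory):

    # Sketch: stack copies of a rendered image sequence in the VSE,
    # each offset by one frame and set to Add blending.
    import os
    import bpy

    scene = bpy.context.scene
    if scene.sequence_editor is None:
        scene.sequence_editor_create()
    sequences = scene.sequence_editor.sequences

    frames_dir = bpy.path.abspath("//render/")   # hypothetical folder with the frames
    files = sorted(os.listdir(frames_dir))

    copies = 25                                   # how many offset copies to stack
    for i in range(copies):
        strip = sequences.new_image(
            name="exposure_%02d" % i,
            filepath=os.path.join(frames_dir, files[0]),
            channel=i + 1,                        # one VSE channel per copy
            frame_start=1 + i,                    # offset each copy by one frame
        )
        for f in files[1:]:                       # load the rest of the sequence
            strip.elements.append(f)
        strip.blend_type = 'ADD'                  # additive blending, like stacking exposures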

Any ideas on how to achieve this using the VSE, nodes, or another workaround or hack? I could also use particles to achieve the effect, but I want to find a light-based workflow, not a particle-based one.

Perhaps another app such as CinePaint can combine hundreds of sequenced images working with their exposure data, or at least with their brightness values?

Thanks,
Raimon

Can’t you just animate it, render the frames to separate images and blend them in Photoshop or something? Or maybe use a very high motion blur factor…
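Or, if you’d rather not do it by hand in Photoshop, a quick script outside Blender could stack the rendered frames. Just a sketch, assuming Pillow and numpy are installed and the frames sit in a "frames" folder:

    # Sketch: additively blend a rendered frame sequence into one "long exposure".
    import glob
    import numpy as np
    from PIL import Image

    paths = sorted(glob.glob("frames/*.png"))
    accum = None
    for p in paths:
        img = np.asarray(Image.open(p).convert("RGB"), dtype=np.float64)
        accum = img if accum is None else accum + img

    # Straight addition blows out quickly, so either clip...
    added = np.clip(accum, 0, 255).astype(np.uint8)
    Image.fromarray(added).save("exposure_add.png")

    # ...or average, which behaves more like a neutral long exposure.
    mean = np.clip(accum / len(paths), 0, 255).astype(np.uint8)
    Image.fromarray(mean).save("exposure_mean.png")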

Hi th555,

thanks for your response, but I can’t. Or at least, I want to try a less tedious method. Look for example at this image from Andrew Price: it’s an exposure of at least 10, 20 or 30 seconds, so no less than 250 frames…

For a single project this is achievable, but I’m looking for something that gives an easier workflow in other cases. I guess I’m not the first to ask how to do this in 3D or with digitally generated images.

I tinkered a little with CinePaint, but this app is so poorly documented… and so far I haven’t seen anything in it, related to what I’m looking for, that couldn’t be done in the Blender compositor.

I also thought about motion blur, and I think it’s an area I should investigate.

Any ideas are still welcome.
Raimon

See attached file. Needs some serious tweaking but it’s a start.
Made with 2.5 svn r26199.

Attachments

film_exposure_simulation.blend (200 KB)

Particle halos emitted over time from a moving emitter, very streaky effect.

Hi,

thanks all for your answers, especially blendercomp for your file. I’ve looked at it and it’s an interesting setup. I guess, though, that it differs from my first approach to the problem. I need a way to record, in a single image, the luminance/brightness and chromatic values of a huge range of rendered frames.

In other words, the motion must exist in 3D; it shouldn’t be faked in post-production, because in this context faked motion isn’t controllable enough, unlike the motion of an object following a curve.

The most obvious way to reproduce the effect is with particles, but I must insist that I want to go beyond this. Particles are ideal for light trails like those from a headlamp, car lights, etc.

But what about a lit moving subject in a long exposure photo? This kind of blurring is only (AFAIK) achievable through motion blur, but even with this method I need a way to spread the motion blur calculation over a long time range. Any ideas on how to do this?

Thanks,
Raimon

Raimon, based on the description you gave, I thought you needed to fake such an effect. If you want to animate the trajectory of a light, I don’t see why it would be difficult to do with such a setup. Maybe it’s not an optimal or generic solution, but that depends on what you want to do.

Hi blendercomp,

What I’m trying to achieve is not only animating a light trajectory. I need some method (perhaps a proposed workflow, perhaps a specific app) to generate images according to photographic behavior.

In other words, I want to simulate the behavior of film or a digital camera’s CCD in a fully digital context.

As I said, motion blur is the closest technique to this, but I don’t know how to control the time range over which motion blur is calculated, so I lack photographic control. I’ll keep investigating.

Thanks,
Raimon

Hi,

I just found this thread asking for the same thing as mine:
http://blenderartists.org/forum/showthread.php?t=46248

Again, the main method seems motion blur, but there are some limitations:

  • motion blur doesn’t seem to work with lights, i.e., it doesn’t seem to blur fast-moving lit areas.
  • the frame range covered by the motion blur calculation is small, as previously said.

Tomorrow, more :slight_smile:
Raimon

Hmm, I wonder if there is a way to translate object movement over time into voxel data for the volumetric renderer? You can use other spatial data in there, so why not object surface patch movement?

Raimon, when you do figure it out, please post here, as some of us might be interested.

Just use another render system. I believe Lux and Yafaray both support better camera models, which can do true motion blur.

Hi,

I’ve made some progress on this issue. My approach now is based on the compositor, setting up a sort of loop where the output of one frame is the input for the next. So with a simple Add node you can progressively sum the results of all previous frames.
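Roughly, the node tree looks like this when built from Python (an untested sketch; the output path is a placeholder and the exact frame offset handling may need adjusting, but it shows the feedback idea):

    # Sketch: read the previously written frame back in, add it to the current
    # render, and write the result out again so the next frame can pick it up.
    import bpy

    scene = bpy.context.scene
    scene.use_nodes = True
    scene.render.filepath = "//accum/"   # composite result written here as an image sequence
    tree = scene.node_tree
    tree.nodes.clear()

    rlayers = tree.nodes.new(type='CompositorNodeRLayers')

    # Image node pointing at the already-written sequence, offset so that
    # frame N reads (roughly) the file written at frame N-1.
    prev = tree.nodes.new(type='CompositorNodeImage')
    prev.image = bpy.data.images.load(bpy.path.abspath("//accum/0001.png"))  # first frame must already exist
    prev.image.source = 'SEQUENCE'
    prev.frame_duration = scene.frame_end
    prev.frame_offset = -1
    prev.use_auto_refresh = True

    # Add the previous accumulation to the fresh render.
    add = tree.nodes.new(type='CompositorNodeMixRGB')
    add.blend_type = 'ADD'

    composite = tree.nodes.new(type='CompositorNodeComposite')

    tree.links.new(rlayers.outputs['Image'], add.inputs[1])
    tree.links.new(prev.outputs['Image'], add.inputs[2])
    tree.links.new(add.outputs['Image'], composite.inputs['Image'])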

Only one thing: my current Blender installation crashes quite often with this setup. I’m not sure if it’s the fault of my “looping” setup, a Blender bug, or a potentially conflicting setting on some of the nodes.

You can see the results here:

http://vimeo.com/26227200

I plan to write a sort of tutorial or explanation about this. Thanks for your interest, and I’d be glad if you try other approaches to the problem and comment on them here.
Raimon

Hi, this sounds interesting. I’d be interested to see the node setup with a bit of explanation, if you’re happy to share it.

As for the crashing, I sometimes have better results when I change the Threads property, under the render panel’s Performance section, from Auto-detect to Fixed.

Personally, I sometimes get more stable performance when I switch the render display to ‘Keep UI’.
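If it’s easier to flip the thread setting from a script, something like this should do it (just a small sketch; pick whatever fixed count is stable on your machine):

    import bpy

    rd = bpy.context.scene.render
    rd.threads_mode = 'FIXED'   # switch from Auto-detect to Fixed
    rd.threads = 2              # fixed number of render threads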

Hope that helps you too. :yes:

Aidy.

You might also try Blender’s not-so-well-known true motion blur. It’s not in the compositor; it’s right below the anti-aliasing panel in the render settings. By setting the steps to 20 and the shutter to the maximum of ten, Blender will render 20 sub-frames over a ten-frame span and combine all 20 into one image!!
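The same thing from the Python console, in case that’s handy (property names as I remember them from the 2.5 API, so double-check in the docs):

    import bpy

    rd = bpy.context.scene.render
    rd.use_motion_blur = True      # enable the sampled ("true") motion blur
    rd.motion_blur_samples = 20    # the "steps": sub-renders combined per frame
    rd.motion_blur_shutter = 10.0  # shutter length, in frames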