I need to generate hundreds of images based on variations (color, texture, material, camera position) of the same model. This is for a configurator.
I am a programmer and I have a little experience with Blender.
Should be pretty simple, depending on how you wish to generate the variations. Random selection should be easy enough.
What do you mean by configurator? Like a web app where a user chooses a color, camera angle, etc.?
If the variations are fixed and not random, it should just be a couple of nested for loops.
for MDL in models:
    for LC in lighting_conditions:
        for CA in cam_angles:
            for M in materials:
                for CLR in colors:
                    render(MDL, LC, CA, M, CLR)
30 colors * 5 materials * 12 cam angles * 2 lighting conditions = 3600 renders per model. That's a lot of pictures!
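If you want to sanity-check the combination count before kicking off a long batch, the nested loops above can be flattened with itertools.product. This is just a sketch: the option lists and the render() stub are placeholders (in a real Blender script the stub would set scene properties and call bpy.ops.render.render instead of returning a filename).

```python
import itertools

models = ["chair"]
lighting_conditions = ["studio", "outdoor"]
cam_angles = list(range(12))
materials = ["wood", "metal", "plastic", "fabric", "glass"]
colors = [f"color_{i:02d}" for i in range(30)]

def render(model, lighting, angle, material, color):
    # placeholder: in Blender you would configure the scene here
    # and call bpy.ops.render.render(write_still=True)
    return f"{model}_{lighting}_{angle}_{material}_{color}.png"

# one job tuple per render; len(jobs) is the total batch size
jobs = list(itertools.product(models, lighting_conditions,
                              cam_angles, materials, colors))
print(len(jobs))  # 1 * 2 * 12 * 5 * 30 = 3600
```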
It may be possible to speed this up by separating the output into layers though, and compositing them together in post or in realtime in the app if possible (is your platform Flash?). For instance, rendering lighting out to its own pass means 2 lighting conditions * 12 cam angles = 24 renders. Then the unlit model needs 30 colors * 5 materials * 12 camera angles = 1800. You're already down to 1824 renders.
In this case you multiply the proper light buffer with the matching unlit image to get the final output. The heavier the renders, the more time you will save doing it this way.
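The multiply step, on raw pixel values, looks something like this. A toy per-pixel sketch in plain Python (a real pipeline would do this in Blender's compositor or with PIL/NumPy over whole images); the function name and normalization are my own:

```python
def multiply_pixels(unlit, light, scale=255):
    # per-channel multiply of two 8-bit RGB pixels, normalized so a
    # white light buffer (255, 255, 255) leaves the unlit color unchanged
    return tuple((u * l) // scale for u, l in zip(unlit, light))

# a red unlit pixel under half-intensity neutral light comes out half as bright
print(multiply_pixels((255, 0, 0), (128, 128, 128)))  # (128, 0, 0)
```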
The above is assuming a single color option per model though. Multiple colors will make things more complex. If you are able to composite images in realtime in your app, I suggest you try to output scalar channels to use as alpha masks for applying local color values. If not, then it will also be faster to color models this way in post, and it makes it much easier to add more colors on request, rather than rendering out a whole new batch per model.
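The alpha-mask idea in toy form: the scalar channel (0-255) controls how much of a flat tint color replaces the base pixel. Again a per-pixel sketch with made-up names, just to show the blend math:

```python
def apply_mask(base, tint, mask, scale=255):
    # linear blend per channel: mask 0 keeps the base pixel,
    # mask 255 replaces it entirely with the tint color
    return tuple((b * (scale - mask) + t * mask) // scale
                 for b, t in zip(base, tint))

# grey base, red tint, mask at full strength -> pure red
print(apply_mask((200, 200, 200), (255, 0, 0), 255))  # (255, 0, 0)
```

Swapping the tint value is then just a lookup, so adding a 31st color costs nothing in render time.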
The images are quite small (around 700x400 px) and I shouldn't have to regenerate them too often (once a year at most).