I have a Python script that creates around 3000 objects (cylinders, …). Each one has an image assigned to one of its faces. Every image is different (it cannot be reproduced with procedural textures).
When I render the scene (Cycles), only 1024 images out of the ~3000 are shown; the rest are discarded. Each image (JPG) is ~4 KB.
Is there a way to solve this (using the Blender Python API)? I am running Blender 2.73 on Linux (Ubuntu, 64-bit). In the script I select the CPU, and I run the render as a background job.
I know GPU Cycles has a texture limit, but I am not sure about the CPU. I recall that older versions also had a limit on the CPU, but I believe it was removed (although I am not certain about this).
Regarding the settings, I use the following to select the CPU:
bpy.context.scene.cycles.device = 'CPU'
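For completeness, the surrounding setup looks roughly like this (a sketch; the output path and file names are placeholders, and it only runs inside Blender):

```python
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'          # make sure Cycles is the active engine
scene.cycles.device = 'CPU'             # render on the CPU
scene.render.filepath = '/tmp/out.png'  # placeholder output path

# Invoked headless with something like:
#   blender --background scene.blend --python setup_and_render.py
bpy.ops.render.render(write_still=True)
```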
If someone can confirm that there is also a limit on the number of images when rendering on the CPU, then there is nothing more to discuss. If not, what are the right settings?
Again - citing AD: “The limit for CPU rendering used to be 100 textures just like with the GPU, but was removed by Sergey as part of the Tears of Steel project.
Apparently, I can see now that what he did was raise the limit to 512 textures.”
You could also ask about it on IRC!
I don't have any specific documentation for this, so it would be interesting to get the real answer from the devs!
You could even ask them to raise the limit to whatever level you like, say 3000 or 5000!
But this is a lot of images; it will take a lot of memory and may make Blender sluggish!
The best way is to talk with the devs in a bug report or on IRC, I guess.
I don't know if this can be a solution to your problem, but can't you make your script apply just one texture? I mean, you could try merging all these images into a single texture atlas, and then have your script UV-map each cylinder to its respective position in the atlas.
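The UV part of that idea can be sketched in plain Python: pack the N images into a square-ish grid atlas and compute, for image number i, the UV rectangle its face should map to. The function name and grid layout here are my own illustration, not anything from the Blender API:

```python
import math

def atlas_uv_rect(index, count, atlas_cols=None):
    """Return (u_min, v_min, u_max, v_max) for tile `index` of `count`
    images packed left-to-right, bottom-to-top into a grid atlas."""
    cols = atlas_cols or math.ceil(math.sqrt(count))  # square-ish grid
    rows = math.ceil(count / cols)
    col = index % cols
    row = index // cols
    u_min = col / cols
    v_min = row / rows
    return (u_min, v_min, u_min + 1 / cols, v_min + 1 / rows)

# 3000 images fit in a 55x55 grid; tile 0 occupies the bottom-left cell.
print(atlas_uv_rect(0, 3000))
```

The script would then paste image i into that cell of the atlas and scale/offset the cylinder's UV coordinates by the returned rectangle, so Cycles only ever sees one image texture.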