Load and pack images from memory

Hi, I want to give my importer the ability to load models from a compressed archive, so I extract the files into memory using mmap (model file, textures, materials, etc.).
The only difficulty I came across is how to load the textures, as bpy.data.images.load() expects a path to a file on disk.
I'd like to just load the images into Blender from memory and pack them as PNGs into the .blend file.

Is there another way to load images? If not, could you please point me in the right direction for checking Blender's source, and does it look like a hard modification to make?

Cheers

Campbell linked to these in IRC:

http://lists.blender.org/pipermail/bf-committers/2011-January/030253.html

the latter is a script to directly access the buffer memory, but it's Linux-only.

Do you know how to access the mmap'ed data in memory with Python in the first place?
If so, try to turn that into a Python sequence and assign it to image.pixels; it won't really be in-memory then, of course.
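To sketch what that suggestion looks like in practice: image.pixels takes a flat sequence of floats in the 0.0-1.0 range, so a raw RGBA byte buffer (e.g. from mmap) has to be normalized first. This is a minimal, hedged example; the buffer name and the final Blender calls in the comment are illustrative, and the actual assignment only works inside Blender.

```python
def rgba_bytes_to_pixels(buf, width, height):
    """Convert width*height*4 raw RGBA bytes (0-255) into the flat
    0.0-1.0 float list that image.pixels expects."""
    expected = width * height * 4
    if len(buf) != expected:
        raise ValueError("expected %d bytes, got %d" % (expected, len(buf)))
    # bytearray() also accepts mmap objects, yielding ints in Python 3
    return [b / 255.0 for b in bytearray(buf)]

# Inside Blender you would then do something like:
# img = bpy.data.images.new("MyImage", width, height)
# img.pixels[:] = rgba_bytes_to_pixels(buf, width, height)
# img.pack(as_png=True)  # embed in the .blend
```

Assigning the whole list in one go (`img.pixels[:] = ...`) is much faster than setting pixels individually.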

Thanks for the links. From what I understood from the mailing list thread, this is kind of a hard topic for many valid reasons, so yes, it looks complicated. I don't have Linux installed right now, but I'll try it later. I wonder how to use it though: where to put the script, how to access the buffer (use _image_get_buffer(image)?), and whether there's any chance of getting it working on all platforms.
Honestly, these kinds of things take me a lot of hours to understand unless I have a working example.

> # Example usage.
> # Inserts property into Blender's Image RNA.
> import image_buffer_raw_access
> ima = bpy.data.images['MyImage']
> x, y, rect, rect_float = ima.buffer_raw
> pixel_index_max = x * y * 4
> # set colors for first pixel
> rect[0] = 0    # red
> rect[1] = 255  # green
> rect[2] = 128  # blue
> rect[3] = 255  # alpha

About image.pixels, you mean decoding the texture to RGBA and then creating a (generated) image? I thought about that, but I read it is really slow (hence the code snippet from Campbell), plus they're DDS textures and I couldn't find a good library to read them in Python 3.3.

Edit: I know the easiest way is probably to just create a tmp folder, extract the textures there, load and pack them, and then delete them from disk. But having an easy way to access the memory would be great for many compressed formats.
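For reference, that temp-file workaround is only a few lines. Here is a hedged sketch: `write_temp_texture` is a plain helper, while `load_and_pack` defers the `bpy` import so the module can at least be imported outside Blender; the function names are my own, not an existing API.

```python
import os
import tempfile

def write_temp_texture(data, suffix=".dds"):
    """Write in-memory texture bytes to a temporary file, return its path."""
    fd, path = tempfile.mkstemp(suffix=suffix)
    with os.fdopen(fd, "wb") as f:
        f.write(data)
    return path

def load_and_pack(data, suffix=".dds"):
    """Temp-file workaround: dump the bytes, load them in Blender,
    pack the image into the .blend, then remove the file."""
    import bpy  # only available inside Blender
    path = write_temp_texture(data, suffix)
    try:
        img = bpy.data.images.load(path)
        img.pack()  # embeds the image data in the .blend file
        return img
    finally:
        os.remove(path)
```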

I think it is as easy as placing it in a folder where Blender searches for scripts, or appending the path:

sys.path.append(r"C:\the\path")

Then import the script. AFAIK all you need to do is run the example call; the rest of the script should work on its own without modifications.

It gives you direct access to the buffer; you can assign to "rect", which is the actual memory for the image data.

Campbell already checked Windows compatibility, but there's no way to make ctypes work there, which is necessary.

Using .pixels can be slow, but if you assign just once, it should be fast. What will take long is probably the conversion from compressed texture to raw data. Better to just save to a temp file and load from there; Blender can handle DDS right away. You could write the mmap'ed files to disk one after another to save disk space (DDS to file -> load -> pack -> remove file, next).
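The one-file-at-a-time loop can be factored so the disk handling is separate from the Blender calls. A sketch under those assumptions: `for_each_temp_file` is a hypothetical helper that writes each entry to a temp file, hands it to a callback, and deletes it before the next one, so only one file ever exists on disk; the Blender callback in the comment assumes the standard bpy.data.images.load/pack calls.

```python
import os
import tempfile

def for_each_temp_file(textures, process, suffix=".dds"):
    """textures: iterable of (name, raw_bytes) pairs. For each entry,
    write the bytes to a temp file, call process(name, path), then
    delete the file, so only one temp file exists on disk at a time."""
    results = []
    for name, data in textures:
        fd, path = tempfile.mkstemp(suffix=suffix)
        with os.fdopen(fd, "wb") as f:
            f.write(data)
        try:
            results.append(process(name, path))
        finally:
            os.remove(path)
    return results

# Inside Blender, process would load and pack each texture:
# def load_pack(name, path):
#     img = bpy.data.images.load(path)
#     img.name = name
#     img.pack()
#     return img
# for_each_temp_file(archive_textures, load_pack)
```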

Thanks a lot for the help. I'll try different methods and post here, but it could take me a while. Any help investigating how to accomplish this will be much appreciated.

Decompressing and accessing the RGBA data doesn't sound like such a bad idea to me; there are some Python modules wrapping FreeImage that might work to get that data. I'm running tests with simpler image formats first.