However, I’m feeling the need to write some custom algorithms. I have implemented basic simplex noise with my own modifications, but I have no idea how to turn that into a Texture object in Blender. Is that possible? Can you pass a byte array or something and create a texture object, or do I have to save to the filesystem and then load it as a bitmap texture?
I think I might have expressed myself badly. I’m not trying to apply the texture to the object or do anything like Substance or FilterForge. I’m just generating 2D noise that I can display in different levels of black and white. However, I don’t know how to turn that noise data into a bpy.types.Texture so I can use it in modifiers like Displace. Take this, for example:
Yes, that’s the issue. I don’t want to save my bitmap data to a file before loading it into the texture object; I was looking for a way to do it directly.
Well, you can’t do this without an image. If you don’t want to save to disk, you could use an in-memory image: create a blank generated image and write your data into it via Python by filling image.pixels (a flat sequence of RGBA float values, not a 2D array) with the values you generate. This might be slow if you have many pixels. The image can then be packed into the .blend itself.
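A minimal sketch of that approach, assuming a placeholder noise function (my_noise here stands in for your custom simplex implementation). The pixel-buffer construction is plain Python; the Blender-specific calls are shown at the end and only run inside Blender:

```python
import math

def my_noise(x, y):
    # Hypothetical placeholder: any function returning a float in [0, 1].
    # Replace with your custom simplex noise.
    return 0.5 + 0.5 * math.sin(x * 12.9898 + y * 78.233)

def build_pixels(width, height, noise):
    # image.pixels expects a flat float sequence: RGBA per pixel,
    # rows ordered bottom-to-top.
    buf = [0.0] * (width * height * 4)
    for y in range(height):
        for x in range(width):
            v = noise(x / width, y / height)
            i = (y * width + x) * 4
            buf[i] = buf[i + 1] = buf[i + 2] = v  # grayscale
            buf[i + 3] = 1.0                      # opaque alpha
    return buf

# Inside Blender:
# import bpy
# img = bpy.data.images.new("NoiseImage", width=256, height=256)
# img.pixels[:] = build_pixels(256, 256, my_noise)
# img.pack()  # embed the image in the .blend, no file on disk needed
# tex = bpy.data.textures.new("NoiseTex", type='IMAGE')
# tex.image = img  # usable by e.g. the Displace modifier
```

Assigning to img.pixels[:] in one slice is much faster than setting pixels individually; for large images, building the buffer with numpy and using img.pixels.foreach_set() is faster still.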
There are no other options besides image textures, though. Adding a new type of procedural texture would require changes to Blender’s C code.