generate and display texture from 2D python list ('array')

Hi,

I wasn’t sure whether to post this in the python section or the bge section :S

I have a python script that generates a 2D list, and I want to use that to produce a grayscale texture on a plane. For example, the list:

 [[0,0,0],[0,1,0],[0,0,0]] 

would produce a 3x3 texture that is all black except the central square, which would be white. I’ve been searching for how to achieve this, and I saw a post recommending the VideoTexture module.

So I’ve been looking at various tutorials covering VideoTexture and bge.texture, but they all require a source. That is, you have to have an image somewhere that you take the texture data from. I could, of course, write my data to a file and then link to that, but that would be very slow.

Somewhere inside the texture object there must be an array of some sort, containing colour values (RGBA), that can be edited in Python and then updated. I thought this would be common enough that there would be lots of information on it. Surely some people must be writing their own procedural textures? Anyway, if anyone has any ideas for how I could achieve this, please let me know. :slight_smile:
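For concreteness, the conversion step itself is just plain Python: flattening the 2D list into the kind of flat RGBA byte buffer that texture APIs generally expect. A quick sketch, no Blender required:

```python
def grayscale_to_rgba(rows):
    """Flatten a 2D list of grey values (0.0-1.0) into a flat RGBA byte buffer."""
    buf = bytearray()
    for row in rows:
        for v in row:
            g = int(round(v * 255))
            buf.extend((g, g, g, 255))  # R, G, B, full alpha
    return bytes(buf)

data = [[0, 0, 0], [0, 1, 0], [0, 0, 0]]
buf = grayscale_to_rgba(data)  # 9 pixels * 4 bytes = 36 bytes
```

The open question is then just the last step: handing a buffer like that to a texture.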

Well, you can do the opposite like so:

image = bge.texture.ImageFFmpeg(FILENAME)
raw_array = bge.texture.imageToArray(image, 'R') # a flat buffer of all the red values

But I can’t see any functions to go the way you want to.

Most procedural textures take the form of GLSL shaders.

So what are you actually trying to do? Is there some alternate method that could work? (like having a high poly object and setting vertex colors)
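Edit: actually, the bge.texture docs do list a buffer-backed source, ImageBuffer, that might do it. An untested sketch (class and argument names are my reading of the 2.7x API reference, so double-check them):

```python
def show_array(obj, rows):
    """Sketch: push a 2D list of grey values (0.0-1.0) onto obj's first texture.

    Untested outside the game engine; the ImageBuffer/Texture usage is assumed
    from the bge.texture API docs.
    """
    import bge  # only available inside the game engine

    height, width = len(rows), len(rows[0])
    buf = bytearray()
    for row in rows:
        for v in row:
            g = int(v * 255)
            buf.extend((g, g, g, 255))      # RGBA, opaque
    img = bge.texture.ImageBuffer()
    img.load(bytes(buf), width, height)     # flat RGBA buffer plus dimensions
    tex = bge.texture.Texture(obj, 0)       # first material/texture channel
    tex.source = img
    tex.refresh(False)
    bge.logic.current_tex = tex             # keep a reference so it isn't freed
```

The last line matters: the docs say the Texture object must stay referenced for as long as you want the replacement texture visible.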

Thanks for your reply, sdfgeoff. I’m trying to simulate microscope images. I want a region that I put the sample (a Blender object) into, calculate the image as an array, and then show that array as a greyscale image on a plane. I normally do this in C++ and output the images as image files, but I was curious as to what was possible in real time.

I haven’t looked into vertex colours. I was thinking of the possibility of setting each face to a different material and changing the diffuse colour of each material, but it seemed like a very messy solution for something that should be very easy.

In theory it’s possible; it just hasn’t been done completely!
There’s a small blend of realtime texture painting: This one. It should be a starting point! Now that I think about it, I understand a bit more Python now and should be able to come up with something!

Edit: Errr, yeah, I forgot about the maths involved in that blend! I need more time to understand the whole thing! Maybe ask the creator what he had in mind…

Thanks torakunsama, I’ll check it out. I’m still shocked that there isn’t some simple, built-in method of doing this.

Weird question, but can this be done backwards? I’ve looked through the API with no luck…
Example:
imageToArray > do something with the data > array back to image > set as a material
Is this possible? Because if it were, I think some interesting things could be done (with a limited resolution, that is).
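The middle step, at least, is easy to try outside Blender — for example, inverting an RGBA buffer while leaving alpha alone. The two bge ends of the round trip are left as comments because they only run inside the engine, and the imageToArray/ImageBuffer usage there is assumed from the API docs rather than tested:

```python
def invert_rgba(buf):
    """Invert the colour channels of a flat RGBA buffer, leaving alpha alone."""
    out = bytearray(buf)
    for i in range(0, len(out), 4):
        out[i]     = 255 - out[i]        # R
        out[i + 1] = 255 - out[i + 1]    # G
        out[i + 2] = 255 - out[i + 2]    # B
    return bytes(out)

# Inside the game engine the round trip might look like this (untested sketch):
# import bge
# img = bge.texture.ImageFFmpeg(FILENAME)
# buf = bge.texture.imageToArray(img, 'RGBA')   # flat RGBA buffer
# new = bge.texture.ImageBuffer()
# new.load(invert_rgba(buf), img.size[0], img.size[1])
# tex = bge.texture.Texture(own, 0)
# tex.source = new
# tex.refresh(False)
```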

You can use bge.texture and bgl to plot it, I think.