Hi all.
Does anyone here know how to display an image texture using the BGL module in the BGE?
Image mipmap options with BGL would also be appreciated.
The best I can do is give you some WebGL code I wrote. This example is from an iTunes-style cover flow; I make a basic shape out of plain vertex data (a thin cube, ±0.01 on z) to paint the texture onto.
If you want more help, I'll need you to post your actual GL code, and then I can walk you through some of the hard parts.
, Shape : { ThinCube : function ( ) {
var GL = this.gl, attr = this.attribute
// vertex positions (transformed later by the model matrix)
GL.bindBuffer ( GL.ARRAY_BUFFER, this['DefaultCubeVertices'] )
GL.vertexAttribPointer ( attr.VertexPosition, 3, GL.FLOAT, false, 0, 0 )
// texture to faces map
GL.bindBuffer ( GL.ARRAY_BUFFER, this['DefaultCubeTextureVertices'] )
GL.vertexAttribPointer ( attr.TextureRelativePosition, 2, GL.FLOAT, false, 0, 0 )
// mapping between
GL.bindBuffer ( GL.ELEMENT_ARRAY_BUFFER, this['DefaultCubeIndices'] )
GL.useProgram ( this.program )
}
}
, Texture : { RGBA : function (rgba) {
var texture = this.gl.createTexture ( )
, MipMapLevel = 0, Format = this.gl.RGBA
this.gl.bindTexture (this.gl.TEXTURE_2D, texture)
this.gl.texImage2D ( this.gl.TEXTURE_2D, MipMapLevel, Format // target, mip level, internal format
, 1, 1, 0, Format // width, height, border, format
, this.gl.UNSIGNED_BYTE // data type of each channel value
, new Uint8Array(rgba) // pixel data: a single RGBA texel
)
return texture
}
}
, Draw : { background : function ( Texture ) {
if( ! Texture ) Texture = this.Texture.RGBA([255,255,255,255])
var GL = this.gl
GL.bindTexture ( GL.TEXTURE_2D, Texture )
this.Shape.ThinCube.call ( this )
this.setUniform ('ModelViewMatrix'
, Math.matrix.scale( Math.matrix.identity(false,0,0,-100.0)
, 100, 100, 100 ))
GL.activeTexture(GL.TEXTURE0) // select texture unit 0 (GL.TEXTURE0 is an enum constant, not the number 0)
GL.uniform1i(this.uniform.Sampler, 0) // the sampler uniform takes the unit *index* (0), not the GL.TEXTURE0 enum
GL.drawElements ( GL.TRIANGLES, 36, GL.UNSIGNED_SHORT, 0 )
return true
}
}
Here is how I define the cube and its UV map:
const DefaultCubeVertices=[-1.0,-1.0,0.01,1.0,-1.0,0.01,1.0,1.0,0.01,-1.0,1.0,0.01,-1.0,-1.0,-0.01,-1.0,1.0,-0.01,1.0,1.0,-0.01,1.0,-1.0,-0.01,-1.0,1.0,-0.01,-1.0,1.0,0.01,1.0,1.0,0.01,1.0,1.0,-0.01,-1.0,-1.0,-0.01,1.0,-1.0,-0.01,1.0,-1.0,0.01,-1.0,-1.0,0.01,1.0,-1.0,-0.01,1.0,1.0,-0.01,1.0,1.0,0.01,1.0,-1.0,0.01,-1.0,-1.0,-0.01,-1.0,-1.0,0.01,-1.0,1.0,0.01,-1.0,1.0,-0.01]
const DefaultCubeTextureVertices=[0.0,0.0,1.0,0.0,1.0,1.0,0.0,1.0,1.0,0.0,1.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,1.0,0.0,1.0]
const DefaultCubeIndices = [
0, 1, 2, 0, 2, 3, // front
4, 5, 6, 4, 6, 7, // back
8, 9, 10, 8, 10, 11, // top
12, 13, 14, 12, 14, 15, // bottom
16, 17, 18, 16, 18, 19, // right
20, 21, 22, 20, 22, 23 // left
]
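These three arrays have to stay mutually consistent: 24 vertices (4 corners × 6 faces), one UV pair per vertex, and 36 indices that never point past the last vertex. A quick standalone Python sanity check of that layout (the `check_cube_buffers` helper is my own illustration, not from the post; the stand-in arrays just mimic the sizes above):

```python
# Sanity-check the cube buffers: 6 faces x 4 corners = 24 vertices,
# 3 floats per position, 2 floats per UV, 6 indices per face.
def check_cube_buffers(vertices, uvs, indices):
    n_verts = len(vertices) // 3
    assert len(vertices) == n_verts * 3   # xyz per vertex
    assert len(uvs) == n_verts * 2        # uv per vertex
    assert len(indices) % 3 == 0          # whole triangles only
    assert max(indices) < n_verts         # every index stays in range
    return n_verts, len(indices) // 3     # (vertex count, triangle count)

# Stand-ins with the same sizes as the arrays above: 72 position floats,
# 48 UV floats, and the 0,1,2 / 0,2,3 index pattern per face.
verts = [0.0] * 72
uvs = [0.0] * 48
idx = [i for f in range(6) for i in (4*f, 4*f+1, 4*f+2, 4*f, 4*f+2, 4*f+3)]
print(check_cube_buffers(verts, uvs, idx))  # -> (24, 12)
```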
Let me know what you don't understand; I have no way of knowing otherwise.
1st of all: thanks for the reply.
2nd of all: are you sure this is a TEXTURE IMAGE script?
I don't see a line in the script asking for a texture image.
The image code is elsewhere, like this:
function AfterDownload ( event ) {
var Type = GL.TEXTURE_2D, MipmapReduction = 0
, Format = GL.RGBA, ValueLength = GL.UNSIGNED_BYTE
this.texture = GL.createTexture ( )
// bind as the active texture, so the following calls write to it
GL.bindTexture ( Type, this.texture )
GL.texImage2D ( Type, MipmapReduction, Format, Format, ValueLength, this.image )
// texture projection (clamp)
GL.texParameteri ( Type, GL.TEXTURE_WRAP_S, GL.CLAMP_TO_EDGE )
GL.texParameteri ( Type, GL.TEXTURE_WRAP_T, GL.CLAMP_TO_EDGE )
// LINEAR min filter avoids mipmap lookup; without mipmaps the texture would otherwise be incomplete
GL.texParameteri ( Type, GL.TEXTURE_MIN_FILTER, GL.LINEAR )
this.w = this.image.width, this.h = this.image.height
this.ar = this.w/this.h
this.loaded = true
this.loading = false
}
Right, but my example is from WebGL; I'm painting images/videos onto a cube in a web browser.
So while the GL code is similar, I haven't written GL in Python.
Therefore, if you want me to help you with your Python, you have to actually put your code here. I only know the GL part.
I found something to get you started:
https://blender.stackexchange.com/questions/28534/rendering-to-texture-with-bgl-in-python
You have to actually give it a try. I can see that it's the same approach, so if you dive into it, I really will be able to help you.
I would like to be able to use one of my own textures.
Not an image created from scratch.
It's at the top of the FFmpeg page:
https://shuvit.org/python_api/bge.texture.html?highlight=ffmpeg#bge.texture.ImageFFmpeg
from bge import logic
from bge import texture
def createTexture(cont):
    """Create a new Dynamic Texture."""
    obj = cont.owner
    # get the reference pointer (ID) of the internal texture
    ID = texture.materialID(obj, 'IMoriginal.png')
    # create a texture object
    object_texture = texture.Texture(obj, ID)
    # create a new source with an external image
    url = logic.expandPath("//newtexture.jpg")
    new_source = texture.ImageFFmpeg(url)
    # the texture has to be stored in a permanent Python object
    logic.texture = object_texture
    # update/replace the texture
    logic.texture.source = new_source
    logic.texture.refresh(False)

def removeTexture(cont):
    """Delete the Dynamic Texture, reverting the material back to its original state."""
    try:
        del logic.texture
    except AttributeError:
        pass
from bge import texture
import numpy as np

clipboard = {}

mat_id = texture.materialID(own, "TextureName")
tex = texture.Texture(own, mat_id)
path = gamepath + "\\data\\textures\\%s.png" % ("textureName")
tex.source = im = texture.ImageFFmpeg(path)

im_buff = np.asarray(im.image)
checkTex = np.ndarray((512, 512, 4))  # resolution, hardcoded to 512x512
im_buff.shape = checkTex.shape
checkTex[:] = im_buff

tex.refresh(True)
clipboard["textureName"] = tex
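The in-place reshape used above (`im_buff.shape = checkTex.shape`) is plain numpy and easy to try outside the BGE. A minimal standalone sketch, with a fake 4x4 RGBA buffer standing in for the ImageFFmpeg pixel data:

```python
import numpy as np

# Fake flat RGBA byte buffer, standing in for im.image (4x4 pixels, 4 channels).
flat = np.arange(4 * 4 * 4, dtype=np.uint8)

# Destination array with the target resolution, like checkTex above.
dest = np.empty((4, 4, 4), dtype=np.uint8)

flat.shape = dest.shape   # in-place reshape; raises ValueError if sizes differ
dest[:] = flat            # copy the pixel data into the destination

print(dest.shape)  # -> (4, 4, 4)
```

The in-place `shape` assignment is a cheap safety net: if the decoded image size doesn't match the hardcoded resolution, it fails immediately instead of silently producing garbage.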
The scripts above are for REPLACING A TEXTURE ON AN OBJECT.
I am looking for basically BLF-style drawing FOR textures.
I am looking for a way to --> DRAW TEXTURES DIRECTLY FROM SCRATCH USING A FILE PATH (NO MESH OBJECTS)
pop the texture on a plane bro
Aye, slap it on.
It's an extra mesh per image in memory, versus doing it purely with OpenGL for every image.
I don't know the benefits of the latter. There might not be any, and the code is much denser.
@BluePrintRandom + @Liebranca Ok . . .
Look.
You guys know how BLF drawing works? Drawing fonts directly onto the screen?
I want to do exactly that, but with textures (not fonts, textures).
(Thank you both for your valuable time)
Why do I not want to put the texture on an object?
Because I want
Fifty materials for shadeless widgets? You’re doing it wrong.
Pack the UI stuff into a texture and do your UVs; problem solved.
In this case you would probably want to make a texture atlas that stores all your game textures. This will greatly reduce the number of draw calls and make working with your menu widgets flexible.
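The UV math for an atlas is small enough to sketch: each widget owns a sub-rectangle of the big texture, and you remap its local 0-to-1 UVs into that rectangle. A minimal, BGE-agnostic Python sketch (the uniform grid layout and the `atlas_uv` name are my own illustration):

```python
def atlas_uv(u, v, tile_x, tile_y, grid_w, grid_h):
    """Map a widget-local UV (0..1) into tile (tile_x, tile_y)
    of a grid_w x grid_h texture atlas."""
    return ((tile_x + u) / grid_w, (tile_y + v) / grid_h)

# A widget assigned to tile (2, 1) of a 4x4 atlas: its local
# (0,0)..(1,1) quad maps to the atlas rectangle (0.5, 0.25)..(0.75, 0.5).
print(atlas_uv(0.0, 0.0, 2, 1, 4, 4))  # -> (0.5, 0.25)
print(atlas_uv(1.0, 1.0, 2, 1, 4, 4))  # -> (0.75, 0.5)
```

Since all widgets then share one texture and one material, every quad can be drawn in the same batch; only the UVs differ.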
If you want to draw it directly into the screen buffer, you’ll need to write a fragment shader.
What I need from you is for you to begin doing this.
I’ve been in the business for a long time, and I’ve mentored my fair share of juniors. What I’ve learned is that in order for you to succeed, I can’t do it for you. You seem to have a strong will toward a particular goal, and I will help you get there. It starts with you.
You can also have a polygon holding a texture with a camera facing it, and plot that into the buffer in Python.
@Liebranca + @Thomas_Murphy + @horusscope + @BluePrintRandom
Thanks for all your replies. I will try all your suggestions.
However, do we really not have a way to display a texture image DIRECTLY from scratch (??)
I thought BGE could do anything.
Is this just me guessing, or not?
Ah. Ok . . . If you think so