Does UPBGE 0.3 not support setUV()?
This code runs, but the texture doesn't seem to move?
speed_x = 0.1  # amount to shift the U coordinate each run
speed_y = 0.0  # amount to shift the V coordinate each run
mesh = owner.meshes[0]  # access the mesh of the object in question
v_count = mesh.getVertexArrayLength(0)  # how many verts are there?
for v in range(v_count):  # go through the UVs of every vert
    vert = mesh.getVertex(0, v)
    uv = vert.getUV()
    print(uv)
    uv[0] += speed_x  # change the UV coordinates. uv[0] = u, uv[1] = v
    uv[1] += speed_y
    vert.setUV(uv)  # get the game engine to notice the change!!
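One thing worth checking regardless of whether the engine picks up the change: adding to the UVs every frame makes the values grow without bound. With a repeating texture you can wrap the coordinates back into [0, 1) and get the same visual result while keeping the numbers small. A minimal sketch of that wrapping in plain Python (no bge needed, function name is my own):

```python
def scroll_uv(uv, speed_u, speed_v):
    """Shift a (u, v) pair and wrap each component into [0, 1).

    With texture repeat enabled, (u + 1.0, v) samples the same texel
    as (u, v), so the wrap does not change what is drawn."""
    return ((uv[0] + speed_u) % 1.0, (uv[1] + speed_v) % 1.0)

uv = (0.95, 0.0)
uv = scroll_uv(uv, 0.1, 0.0)  # wraps past 1.0 back toward 0
print(uv)                     # roughly (0.05, 0.0), plus float rounding
```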
Can we use object parameters in the shader editor?
Another working approach is the UVWarp modifier: move the “To” object to make the UVs move.
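A sketch of that UVWarp trick: nudge the “To” empty a little each logic tick, and the modifier shifts the UVs by the offset between its “From” and “To” objects. In the engine you would fetch the real empty with something like `scene.objects["UVTarget"]` (the name is made up); the tiny stand-in class below just lets the snippet run outside the game engine:

```python
class FakeEmpty:
    """Stand-in for a KX_GameObject so this runs without bge."""
    def __init__(self):
        self.worldPosition = [0.0, 0.0, 0.0]

def step_uv_target(empty, speed_x=0.1, speed_y=0.0):
    """Nudge the UVWarp 'To' object; the modifier translates the
    mesh UVs by the From->To offset, so moving it scrolls the texture."""
    empty.worldPosition[0] += speed_x
    empty.worldPosition[1] += speed_y

target = FakeEmpty()
for _ in range(3):           # three logic ticks
    step_uv_target(target)
print(target.worldPosition)  # roughly [0.3, 0.0, 0.0]
```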
I don’t like TAA or the UPBGE 0.3 denoising; it makes objects blurry, which is very noticeable on game objects right in front of the camera.
But shadows are not related to TAA; they need a different filter or a shader improvement.
I also didn’t share a lighting comparison, but UPBGE lighting is really missing light bounces and global illumination, something Eevee doesn’t have either, which would make lighting a lot better and more accurate.
It will be a lot of work to get UPBGE 0.3 to display modern lighting, as Eevee is not the solution for good global-illumination lighting across whole game levels.
So my little experiment with transform feedback works: I got the GPU to move vertices around before the actual drawing stage. Sending in bone matrices and weights as uniforms gets us skinning.
Big problem though: it crashes Blender after ten or so seconds of engine runtime. The error is linked to a glUseProgram call in one of my methods, so I'm guessing I can't just do that. Any ideas?
And here's the test shader, which works as expected.
#version 150 core

in vec3 in_position;
out vec3 out_position;

void main() {
    out_position = in_position * 0.5;
}
I think I've seen a snippet of Moguri's where he changes the active shader through the rasterizer. Maybe I should try that. If it all fails, devtalk it is.
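For reference, the skinning that the post moves onto the GPU (bone matrices plus weights) is just linear blend skinning: each output vertex is a weighted sum of the bone-transformed rest-pose vertex. A minimal CPU sketch of the same math in plain Python, with made-up example matrices:

```python
def skin_vertex(position, bone_matrices, weights):
    """Linear blend skinning: v' = sum_i w_i * (M_i @ v).

    position      -- (x, y, z) rest-pose vertex
    bone_matrices -- list of 4x4 matrices (rows of 4 floats)
    weights       -- one weight per bone, summing to 1.0"""
    x, y, z = position
    out = [0.0, 0.0, 0.0]
    for m, w in zip(bone_matrices, weights):
        for row in range(3):  # only the xyz rows matter for the position
            out[row] += w * (m[row][0] * x + m[row][1] * y +
                             m[row][2] * z + m[row][3])  # homogeneous w = 1
    return tuple(out)

identity = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
shift_x  = [[1, 0, 0, 2], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]

# 50/50 blend between "stay put" and "move 2 units along x":
print(skin_vertex((1.0, 0.0, 0.0), [identity, shift_x], [0.5, 0.5]))
# -> (2.0, 0.0, 0.0)
```

This is exactly what the transform-feedback vertex shader would do per vertex, with the matrices passed in as uniforms.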
https://www.youtube.com/watch?v=qC5KtatMcUw - They talk about trillions of triangles, realistic lighting, and 8K textures per frame? With no baking and such, they will just work? Like CGI quality, with scanned assets going straight into the game engine? Can this be implemented in UPBGE or something?
Most of this is hardware
(PS5 dev kit + fast PCIe SSD).
Basically it's using an SSD like a giant stick of RAM.
So they are streaming raw geometry into a buffer capable of transforming the chunks, and running a tessellator against the streaming buffer
(I think) from the presentation.
It's not fully raw data: the demo uses materials (so baked textures and UVs), it's not material-per-vertex.
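The streaming idea above (treating the SSD as an extension of RAM) can be sketched with a memory-mapped file: geometry chunks are pulled into a fixed-size buffer on demand instead of loading the whole mesh up front. The file name and chunk layout below are invented for the example:

```python
import mmap
import os
import struct
import tempfile

CHUNK_VERTS = 4                # vertices per streamed chunk
VERT_SIZE = 3 * 4              # 3 floats (x, y, z), 4 bytes each
CHUNK_SIZE = CHUNK_VERTS * VERT_SIZE

# Write some fake "raw geometry" to disk: 8 vertices = 2 chunks.
path = os.path.join(tempfile.mkdtemp(), "geometry.bin")
with open(path, "wb") as f:
    for i in range(8):
        f.write(struct.pack("<3f", float(i), 0.0, 0.0))

def stream_chunk(mapped, index):
    """Decode one chunk of the memory-mapped file into vertex tuples.

    The OS pages the bytes in from disk only when they are touched,
    which is the 'SSD as RAM' idea in miniature."""
    raw = mapped[index * CHUNK_SIZE:(index + 1) * CHUNK_SIZE]
    return [struct.unpack_from("<3f", raw, v * VERT_SIZE)
            for v in range(CHUNK_VERTS)]

with open(path, "rb") as f:
    with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as m:
        chunk0 = stream_chunk(m, 0)
        chunk1 = stream_chunk(m, 1)

print(chunk0[0], chunk1[0])  # first vertex of each chunk
```

A real engine would hand each decoded chunk to the GPU and recycle the buffer, but the access pattern is the same.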
I’m pretty sure all next-gen games in the making for next-gen consoles are using normal maps, because that’s the most efficient way to render micro detail, along with traditional LODs; progressive LOD without user control only works if distant meshes keep enough detail to avoid visual glitches.
I don’t think every company will drop normal maps and LODs and use raw data just to ship 30 fps games at 2K resolution like the demo (though features like DLSS 2 could help a lot).
Yeah … freak Unreal and Nvidia … guess what’s real is Blender. Can’t wait for AMD to take over GPUs too, and for Ubisoft to move onto Epic! If Unreal has its C++ source open on GitHub, then there is no crack needed to adapt it in Python!! Right, Ffffred k?