A new fork of the engine

Umm,
it doesn’t seem to run; everything goes white in UPBGE 0.1.4. I changed some values to see if it works, but it doesn’t bloom, it only changes the screen brightness and things like that.

@Dark Power, switch to UPBGE 0.1.5.

We are never going to get that, since the BF doesn’t want to touch the BLF module.

Maybe we can get the code for it from PIL?

You may be interested in PIL’s ImageDraw, which has a textsize method. See http://effbot.org/imagingbook/imagedraw.htm

Side note: can we include PIL in upbge?

The drawing stuff could be handy if we can point it at the right buffers.

The 0.1.5 release note is now published: https://doc.upbge.org/releases.php?id=0.1.5.

@BPR: You can import PIL, but this library seems to draw only to images.

And it can draw text into buffers.

And you can get the width of the text.

Ah nice, I have waited for it. :smiley: And thanks one more time. :slight_smile:

I read that martinsh’s water shader has been implemented in the latest release of UPBGE. Does anybody have any info on this? Is it an addon? How is it used?

@cdo1955: Hi, no, that is false information. But in UPBGE 0.1.5 we implemented planar reflections/refractions. It’s martinsh who wrote the main shader (GLSL code) for our implementation of planar reflections/refractions, and he also contributed to the rendering code (C++ code). We use his unlimited planar reflections/refractions method (the one he uses in his water demo file in the render-to-texture script). So we didn’t implement his water shader, but something with a more general approach which allows all kinds of planar reflections/refractions. And we can do some cool water effects with it, as I showed in this tutorial: https://www.youtube.com/watch?v=C9aLerVmEpc&t=97s

Again about the filmic color management. This is what I meant: https://docs.blender.org/manual/en/dev/render/post_process/color_management.html

Apparently, using color management and doing an OpenGL render works (you can use the settings in the color management section to affect the look of the image after rendering).
So I guess what I’m asking for is a way of using LUTs and tonemapping in UPBGE. This whole topic is a bit confusing to me, but maybe you get what I mean.

EDIT: WARNING, the tonemapping script(s) somehow don’t work… (they only work in the .blend I made them in…)
EDIT2: It does work, but it needs a shadow buffer (WTF?). If you want to use it, here’s an example .blend:
https://www.file-upload.net/download-12413852/Tonemap.blend.html (simply append the .plane and there you go)

there you have it:

uniform sampler2D bgl_RenderedTexture;
uniform sampler2D bgl_ColorCorrectionTexture;
uniform float avgL;

const float TextureSize = 16.0;
vec2 texcoord = vec2(gl_TexCoord[0]).st;


// Reinhard-style tone curve, applied per channel
vec4 tonemapping(vec4 col)
{
    col = vec4(col.r / (col.r + 0.187) * 1.035, col.g / (col.g + 0.187) * 1.035, col.b / (col.b + 0.187) * 1.035, col.a);
    return col;
}

vec4 getLinearColor(sampler2D tex, vec2 coord)
{
    vec4 col =  texture2D(tex, coord);
    return vec4(pow(col.r,2.2), pow(col.g,2.2), pow(col.b,2.2), col.a);
}

vec4 linearTosRGB(vec4 col)
{
    return vec4(pow(col.r,0.454545), pow(col.g,0.454545), pow(col.b,0.454545), col.a);
}

// Look up a 3D LUT packed as `width` slices laid side by side in a 2D strip
vec4 colorcorrection(vec4 col, float width) {

    col.r = clamp(col.r, 0.0, 1.0);
    col.g = 1.0-clamp(col.g, 0.0, 1.0); // the LUT strip is flipped in V
    col.b = clamp(col.b, 0.0, 1.0);
    // Inset r/g so samples land on texel centres of each slice
    col.r = col.r*width/(width+1.0)+1.0/(2.0*width);
    col.g = col.g*width/(width+1.0)+1.0/(2.0*width);
    
    float zSlice0 = min(floor(col.b * width), width-1.0);
    float zSlice1 = zSlice0 + 1.0;
    float s0 = (col.r + zSlice0) / width;
    float s1 = (col.r + zSlice1) / width;
    
    vec4 slice0Color = getLinearColor(bgl_ColorCorrectionTexture, vec2(s0, col.g));
    vec4 slice1Color = getLinearColor(bgl_ColorCorrectionTexture, vec2(s1, col.g));
    float zOffset = mod(col.b * width, 1.0);
    vec4 result    = mix(slice0Color, slice1Color, zOffset);
    return result;

}

void main(void)
{
    // Exposure: divide by the gamma-decoded average scene luminance
    float Lum = pow(avgL, 2.2);
    vec4 value = getLinearColor(bgl_RenderedTexture, texcoord);
    value = vec4(value.r/Lum, value.g/Lum, value.b/Lum, value.a);
    value = tonemapping(value);
    // Grade in sRGB space: the LUT is indexed with gamma-encoded values
    value = colorcorrection(linearTosRGB(value), TextureSize);
    gl_FragColor = linearTosRGB(value);
}
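
For reference, the per-channel curve in tonemapping() above is a Reinhard-style mapping. A quick standalone Python sketch (constants copied from the GLSL; the function name is mine) shows its behaviour:

```python
# Reinhard-style per-channel tone curve from the 2D filter above.
# The constants 0.187 and 1.035 are copied from the GLSL.
def tonemap(x):
    """Map linear intensity x >= 0 into roughly [0, 1.035)."""
    return x / (x + 0.187) * 1.035

# tonemap(0.0) is 0.0 (black stays black); tonemap(1.0) is about 0.87,
# so highlights are compressed and never blow out past ~1.035.
```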

That’s the 2DFilter, but you also need this Python script:

from bge import texture, render, logic
import bgl
from bgl import *

cont = logic.getCurrentController()
own = cont.owner
scene = own.scene

own["bgl_ColorCorrectionTexture"] = bindID = own.meshes[0].materials[0].getTextureBindcode(0) # get the first texture.
glActiveTexture(GL_TEXTURE0 + own["bgl_ColorCorrectionTexture"])
glBindTexture(GL_TEXTURE_2D, bindID)

Then you also have to apply a texture like this: https://docs.unrealengine.com/latest/images/Engine/Rendering/PostProcessEffects/ColorGrading/RGBTable16x1.png to the first texture slot of the first material slot of the object to which you assign the two scripts.
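
The colorcorrection() function in the filter treats that 256x16 strip as a 16x16x16 3D LUT packed as 16 slices side by side. Here is a hedged Python sketch of the same coordinate math (the function name and return layout are mine, not part of the filter):

```python
def lut_coords(r, g, b, width=16.0):
    """Compute the two sample positions used to interpolate a packed 3D LUT.

    Mirrors the GLSL colorcorrection() math: r selects the horizontal
    position inside a slice, g the vertical position, and b picks the two
    neighbouring slices to blend between.
    """
    r = min(max(r, 0.0), 1.0)
    g = 1.0 - min(max(g, 0.0), 1.0)               # the LUT strip is flipped in V
    r = r * width / (width + 1.0) + 1.0 / (2.0 * width)  # inset to texel centres
    g = g * width / (width + 1.0) + 1.0 / (2.0 * width)

    z0 = min(float(int(b * width)), width - 1.0)  # floor(), clamped to last slice
    z1 = z0 + 1.0
    s0 = (r + z0) / width                         # u coordinate in slice z0
    s1 = (r + z1) / width                         # u coordinate in slice z1
    z_offset = (b * width) % 1.0                  # blend factor between slices
    return s0, s1, g, z_offset
```

The filter then samples the strip at (s0, g) and (s1, g) and mixes the two colors by z_offset.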
You can also add a float property “avgL” and a second Python script for an auto-exposure effect:

import GameLogic as G
import bgl as BGL
import Rasterizer as R

scene = G.getCurrentScene()
cont = G.getCurrentController()
objList = scene.objects
own = cont.owner

viewport = BGL.Buffer(BGL.GL_INT, 4)
BGL.glGetIntegerv(BGL.GL_VIEWPORT, viewport)

width = R.getWindowWidth()
height = R.getWindowHeight()

x = viewport[0] + int(width/2)
y = viewport[1] + int(height/2)
#print('error if not int',type(x), type(y))

x1 = viewport[0] + int(width/3)
y1 = viewport[1] + int(height/3)

x2 = viewport[0] + int(width/3)
y2 = viewport[1] + int(height/2)

x3 = viewport[0] + int(width/2)
y3 = viewport[1] + int(height/2)

x4 = viewport[0] + int(width/2)
y4 = viewport[1] + int(height/3)

pixels = BGL.Buffer(BGL.GL_FLOAT, [1])
pixels1 = BGL.Buffer(BGL.GL_FLOAT, [1])
pixels2 = BGL.Buffer(BGL.GL_FLOAT, [1])
pixels3 = BGL.Buffer(BGL.GL_FLOAT, [1])
pixels4 = BGL.Buffer(BGL.GL_FLOAT, [1])

BGL.glReadPixels(x, y, 1, 1, BGL.GL_LUMINANCE, BGL.GL_FLOAT, pixels)
BGL.glReadPixels(x1, y1, 1, 1, BGL.GL_LUMINANCE, BGL.GL_FLOAT, pixels1)
BGL.glReadPixels(x2, y2, 1, 1, BGL.GL_LUMINANCE, BGL.GL_FLOAT, pixels2)
BGL.glReadPixels(x3, y3, 1, 1, BGL.GL_LUMINANCE, BGL.GL_FLOAT, pixels3)
BGL.glReadPixels(x4, y4, 1, 1, BGL.GL_LUMINANCE, BGL.GL_FLOAT, pixels4)

avgPixels = (pixels[0]+pixels1[0]+pixels2[0]+pixels3[0]+pixels4[0])/5.0

#Slow adaptation
own['avgL'] = (own['avgL']*0.99 + avgPixels*0.01)

avgPixels = own['avgL']
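
The “slow adaptation” line is just an exponential moving average; isolated in plain Python (no BGE needed, function name is mine), you can see it drift toward the measured brightness over many frames:

```python
def adapt(avg_l, measured, rate=0.01):
    """One frame of the slow-adaptation update from the script above."""
    return avg_l * (1.0 - rate) + measured * rate

# Simulate the scene suddenly getting brighter (0.1 -> 0.8):
avg = 0.1
for _ in range(500):
    avg = adapt(avg, 0.8)
# After ~500 frames, avg is close to (but still slightly below) 0.8.
```

The rate constant 0.01 controls how fast the eye “adapts”: larger values react faster but make the exposure jumpier.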


The ideas are from here: https://docs.unrealengine.com/latest/INT/Engine/Rendering/PostProcessEffects/ColorGrading/index.html. The code was partly written by me; the rest is code from the internet.
But you might realize, once you’ve set everything up correctly, that something looks odd. The reason is (I think) that the floor() function (float(int(x)) too) doesn’t work with values you divided beforehand: if you use 0.5 it works, if you use 1/2 it doesn’t.
A demonstration (the strange lines across everything):



That bug makes the whole script useless… :spin:
So if anyone knows a fix, I’d be glad to hear it. (I even tried splitting the filter into two scripts, but that doesn’t help either… :confused: )
A random thing I’m working on (because I have to write a seminar paper (is that correct English? I just used Google Translate) about realistic rendering in real time, and I’m trying to do it in the Blender Game Engine):


This still needs blurring for a nicer look, but you get the idea.
With UPBGE 0.1.5, blurring in a separate pass is possible, right? So I just have to figure out how that works :o
This AO method is from here: http://john-chapman-graphics.blogspot.de/2013/01/ssao-tutorial.html
It uses a “real” normal buffer and not just normals reconstructed from the depth buffer, so smooth normals + normal maps are possible. I achieved this by rendering the whole scene a second time with a material that displays the normals via nodes. This is of course not fast, but is it the only way of getting a real normal buffer, or can I achieve it with a faster method? What I also plan is having a direct light pass, an indirect light pass, an albedo pass and a reflection pass (image-based lighting).
Then render the AO, multiply it with the indirect light pass, add the direct pass, multiply that with the albedo pass, render screen-space reflections, mix them with the reflection pass, and then mix the reflection with the diffuse path using Fresnel.
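
A rough scalar sketch of that compositing order (all names, pass values and weights here are made-up placeholders for illustration, not UPBGE API):

```python
def mix(a, b, t):
    """GLSL-style linear interpolation between a and b."""
    return a * (1.0 - t) + b * t

def composite(direct, indirect, ao, albedo, ibl, ssr, ssr_mask, fresnel):
    """Combine the passes in the order described above (per channel)."""
    diffuse = (direct + ao * indirect) * albedo   # AO darkens only indirect light
    reflection = mix(ibl, ssr, ssr_mask)          # SSR overrides IBL where it hit
    return mix(diffuse, reflection, fresnel)      # Fresnel blends toward reflection
```

With fresnel = 0 you get the pure diffuse path; with fresnel = 1 you get the pure reflection path, which is the grazing-angle behaviour the Fresnel mix is meant to give.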
I also already made a PBR shader via nodes (with GGX specular!!!). There it is in action with the AO (+ GI, because that effect is just three lines of code added to the AO shader; you can see the illumination at the foot of the Buddha statue):


Okay, that’s probably the longest post I’ve ever made on BA, yayy (^.^)/

@TheLumcoin: Hello,


from bge import texture, render, logic
import bgl
from bgl import *

cont = logic.getCurrentController()
own = cont.owner
scene = own.scene

own["bgl_ColorCorrectionTexture"] = bindID = own.meshes[0].materials[0].getTextureBindcode(0) # get the first texture.
glActiveTexture(GL_TEXTURE0 + own["bgl_ColorCorrectionTexture"])
glBindTexture(GL_TEXTURE_2D, bindID)

Please avoid guessing the texture state. Here you are assuming that texture slot 0 (+ bind code ???) will still contain the bind code you set by the time the filter renders; if another filter is rendered before it, that can be wrong.

@TheLumcoin: You could look at the Python API for KX_2DFilter: https://pythonapi.upbge.org/bge.types.KX_2DFilter.html. With this API you can set a texture using setTexture:


filter = scene.filterManager.getFilter(0)
filter.setTexture(0, own.meshes[0].materials[0].textures[0].bindCode, "bgl_ColorCorrectionTexture")

But you might realize, once you’ve set everything up correctly, that something looks odd. The reason is (I think) that the floor() function (float(int(x)) too) doesn’t work with values you divided beforehand: if you use 0.5 it works, if you use 1/2 it doesn’t.

Maybe an int cast issue; use 1.0 / 2.0 instead?
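
For context (my assumption about the cause): in GLSL, a literal like 1/2 is integer division and evaluates to 0, so any factor built from it silently zeroes the expression. Python’s floor division shows the same distinction:

```python
# GLSL: int / int truncates; Python's // operator does the same.
int_result = 1 // 2        # 0, like GLSL's 1/2
float_result = 1.0 / 2.0   # 0.5, like GLSL's 1.0/2.0

# So floor(x * (1/2)) would collapse to floor(0.0) for every x
# if the 1/2 is evaluated as integer division:
x = 7.3
wrong = x * (1 // 2)       # 0.0 for any x
right = x * (1.0 / 2.0)    # 3.65
```

That would explain why writing 0.5 works while writing 1/2 does not.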

The BlenderVR team says that this should work almost out of the box with UPBGE.

Hello everyone,

Version 0.1.6 is now released. The major features of this release are the debug options and list filtering.
More info is on the UPBGE documentation website: https://doc.upbge.org/releases.php?id=0.1.6.

The builds for Linux and Windows will come tomorrow.

This isn’t a very big release, but it lets us work on tasks for the next release, like the culling refactor and bug fixing.

Thanks again for supporting us.

And bpplayer support I hope :stuck_out_tongue:

Could you post a copy/paste of the data from the website here?

The upbge.org website does not work well on mobile atm, and I don’t have a PC for a while
(sun_downish)

Could you explain what exactly the BPPlayer support will be? Will it convert the blends to block files automatically when the engine is executed, will there be an addon to easily make the conversion from inside Blender, or will BPPlayer just be released along with the UPBGE binaries (or something else I didn’t mention)? Sorry if this was answered somewhere; I didn’t search for it.

Edit: By the way, nice work with the release, thanks so much! :smiley:

@joelgomes1994: BPPlayer is software that allows you to publish a game without sharing the .blend files (they can be encrypted to protect assets): https://blenderartists.org/forum/showthread.php?130089-BPPlayer-BGE-Security-(1-07b-Win32)-Blender-2-78-support

For now, BPPlayer is not supported in official UPBGE, but lordloki worked on making it compatible with UPBGE, so we have a working patch waiting to be merged.

Are the new builds available? Because I’ve downloaded the recent build from the UPBGE site several times and it’s still 0.1.5 :open_mouth:

I just got a Win 64 build yesterday and it was 0.1.6.

Are you on Linux, or 32-bit, or something else?