I’ve been playing with the Blender Game Engine for some time now and have been wondering about something quite specific. I hope this is the right place to talk about in-game shaders.
Long story short, I’d like to port these shaders from Unity to Blender GLSL:
The goal with these is to achieve a retro PSX game look. PSX games have a very specific look because of the way the graphics processor calculated polygon positions (all calculations were made with integers instead of floats, hence the lack of precision), combined with affine texture mapping (i.e. no perspective correction). You can find a more technical explanation here:
I have close to no knowledge of GLSL and was wondering about the feasibility of porting those shaders, or whether there are other ways to achieve this.
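To get a feel for the integer-precision point above, here is a plain-Python sketch (my own simplification, not the PSX's actual fixed-point format; `snap_ndc` is a hypothetical helper name) of quantizing a normalized device coordinate to a 320-pixel-wide grid, which is the root of the famous vertex jitter:

```python
import math

def snap_ndc(x, resolution=320):
    """Quantize a normalized device coordinate (-1..1) to a pixel grid,
    mimicking the PSX's lack of sub-pixel vertex precision."""
    half = resolution / 2
    return math.floor(half * x) / half

# Two slightly different positions collapse onto the same pixel column,
# so slowly moving vertices visibly "pop" from pixel to pixel:
print(snap_ndc(0.1234))  # -> 0.11875
print(snap_ndc(0.1190))  # -> 0.11875  (same column)
print(snap_ndc(0.1250))  # -> 0.125    (next column)
```

This is the same quantization that the vertex shader posted later in this thread applies to the projected vertex position.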
In Blender, to achieve results like those shaders you don’t need GLSL. Simply put the BGE into Multitexture mode and you’re all set!
Other than that, porting shaders isn’t that hard, but I doubt you’ll find someone else to do it for you; heck, writing your own per-vertex lighting isn’t hard in GLSL. I think there may even be an example of it in the Resources subforum somewhere.
I’ve tried to get something like this several times using a python script instead of a shader. I’m glad someone finally managed to do this!
- We have a “polygon jitter” shader working
- I don’t think you can disable the z-buffer(?)
Disabling mipmapping/texture filtering is easy. Just use a Python script with bge.render.setMipmapping(0).
Do not switch to Multitexture shading. It is kind of broken and not very well supported. For example, light properties like intensity, color, and falloff cannot be properly set. (Yes, I know Multitexture mode uses vertex shading instead of per-pixel shading, but the broken lights are a reported and confirmed bug that has not yet been fixed.) Instead, use GLSL mode with a basic vertex lighting shader.
Rather than setting the resolution of the game to 320x240, I would use render-to-texture at 320x240. That way it would work well with any monitor resolution.
However, there are still two important pieces of the puzzle missing. First, the PS1 used a special type of color dithering. I believe there is a filter for it in the Resources section, but if I remember right it has some issues.
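For anyone who wants to experiment: the PS1 dithered with a 4x4 ordered (Bayer) pattern before truncating colors to 15-bit. Here is a rough plain-Python sketch of the idea (the exact offsets and clamping of the real hardware may differ, and `dither_channel` is just an illustrative helper):

```python
# 4x4 Bayer matrix, values 0..15
BAYER_4X4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def dither_channel(value, x, y):
    """Reduce an 8-bit color channel to 5 bits with ordered dithering,
    roughly like the PS1's 15-bit output stage."""
    # Map the matrix entry to an offset of about -4..+3 (one 5-bit step is 8)
    offset = (BAYER_4X4[y % 4][x % 4] / 16.0 - 0.5) * 8.0
    v = min(255.0, max(0.0, value + offset))
    return (int(v) >> 3) << 3  # keep the top 5 bits

# The same input value rounds differently depending on screen position:
print(dither_channel(103, 0, 0))  # -> 96
print(dither_channel(103, 3, 0))  # -> 104
```

Averaged over the screen, those per-pixel offsets fake the intermediate shades that 15-bit color cannot represent, which is where the characteristic crosshatch pattern comes from.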
Second, the PS1 used affine texture projection. Here is an example of what that looks like: https://m.youtube.com/watch?v=5Pm8GNfquEk (This video uses the Godot engine, but I just want to demonstrate the effect for those who don’t know it.)
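Numerically, the difference the video shows is between interpolating texture coordinates directly in screen space (affine, what the PS1 did) and dividing through by depth first (perspective-correct, what modern GPUs do). A small illustrative Python sketch with made-up values:

```python
def affine_lerp(u0, u1, t):
    """Screen-space interpolation of a texture coordinate (PS1 style)."""
    return u0 + (u1 - u0) * t

def perspective_lerp(u0, w0, u1, w1, t):
    """Perspective-correct interpolation: interpolate u/w and 1/w, then divide."""
    u_over_w = (u0 / w0) * (1.0 - t) + (u1 / w1) * t
    inv_w = (1.0 / w0) * (1.0 - t) + (1.0 / w1) * t
    return u_over_w / inv_w

# Midpoint of an edge whose far end is four times deeper (w = 4):
print(affine_lerp(0.0, 1.0, 0.5))                 # -> 0.5
print(perspective_lerp(0.0, 1.0, 1.0, 4.0, 0.5))  # -> 0.2
```

The affine result samples much too far into the texture on the near half of the edge, which is exactly the texture "swimming" you see on large PS1 polygons.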
I have yet to find a way to do this in Blender, but I am 99% sure it is possible with a shader.
So basically, to best replicate the PS1 aesthetic, we just need color dithering and affine texture projection (I don’t think anything can be done to disable the z-buffer).
Not really - you do need custom shaders to get the look and feel of an early-gen 3D title. Simply using Multitexture just changes how lighting is calculated; it doesn’t lower the fidelity of how textures are applied to a model or of where its vertices are rendered.
Yes, a vertex shader would work fine. The difference when the models move (that polygon instability - that shakiness) is what people are aiming for when they want PS1-style visuals, I think.
Here’s another video to work with that shows this polygon instability.
I tested the vertex shader to make 3D scenes look as if rendered by the Super FX chip in Super Nintendo games:
To use the vertex shader in Blender you also have to override the material properties with a fragment shader. In this script I added a basic fragment shader that uses the texture from the material and also takes the light source into account (it does not work very well).
import bge

cont = bge.logic.getCurrentController()

VertexShader = """
const float WIDTH = 320.0;  // resolution to emulate
const float HEIGHT = 240.0;

varying vec3 diffuseColor;

void main()  // all vertex shaders define a main() function
{
    vec3 normalDirection = normalize(gl_NormalMatrix * gl_Normal);
    vec3 lightDirection;
    float attenuation;

    if (0.0 == gl_LightSource[0].position.w)
    {
        // directional light: no attenuation
        attenuation = 1.0;
        lightDirection = normalize(vec3(gl_LightSource[0].position));
    }
    else  // point light or spotlight (or other kind of light)
    {
        vec3 vertexToLightSource = vec3(gl_LightSource[0].position
            - gl_ModelViewMatrix * gl_Vertex);
        float distance = length(vertexToLightSource);
        attenuation = 1.0 / distance;  // linear attenuation
        lightDirection = normalize(vertexToLightSource);

        if (gl_LightSource[0].spotCutoff <= 90.0)  // spotlight?
        {
            float clampedCosine = max(0.0, dot(-lightDirection,
                normalize(gl_LightSource[0].spotDirection)));
            if (clampedCosine < gl_LightSource[0].spotCosCutoff)
            {
                // outside of spotlight cone
                attenuation = 0.0;
            }
            else
            {
                attenuation = attenuation * pow(clampedCosine,
                    gl_LightSource[0].spotExponent);
            }
        }
    }

    // diffuse term, without material color
    vec3 diffuseReflection = vec3(attenuation
        * max(0.0, dot(normalDirection, lightDirection)));
    diffuseColor = diffuseReflection;

    gl_TexCoord[0] = gl_MultiTexCoord0;

    // snap the projected vertex to a low-resolution grid ("polygon jitter")
    vec4 snapToPixel = gl_ModelViewProjectionMatrix * gl_Vertex;
    vec4 vertex = snapToPixel;
    vertex.xyz = snapToPixel.xyz / snapToPixel.w;
    vertex.x = floor((WIDTH / 2.0) * vertex.x) / (WIDTH / 2.0);
    vertex.y = floor((HEIGHT / 2.0) * vertex.y) / (HEIGHT / 2.0);
    vertex.xyz *= snapToPixel.w;  // undo the divide so the hardware's own perspective divide still works
    gl_Position = vertex;
}
"""

FragmentShader = """
varying vec3 diffuseColor;
uniform sampler2D color;

void main()
{
    // get the texel from the material's color map
    vec3 texture = texture2D(color, gl_TexCoord[0].st).xyz;
    gl_FragColor = vec4(texture + diffuseColor - vec3(0.5), 1.0);
}
"""

for mesh in cont.owner.meshes:
    for mat in mesh.materials:
        shader = mat.getShader()
        if shader is not None and not shader.isValid():
            shader.setSource(VertexShader, FragmentShader, 1)
            shader.setSampler("color", 0)
You also have to disable the texture filters at the start of the game, using the bge.render.setMipmapping(0) call mentioned earlier.