Make vertex near camera semi-transparent

In my game, a wall can sometimes be between my character and the camera.

Sometimes it's the back face of the wall, so it's already transparent; sometimes the wall is simply clipping through the camera. How can I make it semi-transparent each time (in both cases, or at least the first one)?

I was thinking of a script to make an object semi-transparent when a rayCast hits it, but I was wondering if there's a simpler way to do it for all the vertices near the camera (whatever the object)… :slight_smile:


You can do this in three ways, in my opinion:

A) With material nodes. For that to work, every material you want to become transparent has to have its transparency mapped to a function that fades the object out the closer the camera gets. I remember an example with clouds somewhere here on the forum which might serve as a good reference for the node setup.

B) An actual raycast between camera and character. If something is in between, make it invisible or modify its transparency. This might work with obj.color (the fourth component is alpha). But here too, the transparency would have to be taken from the object color via nodes, I guess.

C) Transparency change via texture replacement. Similar to the above; just take the transparency from some image file and replace it dynamically at runtime.
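The raycast route (B) can be sketched in plain Python. The names here are illustrative; in the BGE the tuple would come from `KX_GameObject.rayCast(player, camera, 0.0)` and the new color would be written back to `obj.color`:

```python
def find_blocker(ray_hit, player):
    """Given a (hit_object, hit_point, hit_normal) tuple like the one
    KX_GameObject.rayCast returns, pick the object to fade, if any."""
    hit_object, _point, _normal = ray_hit
    if hit_object is not None and hit_object is not player:
        return hit_object  # something stands between camera and player
    return None

def faded(color, alpha=0.3):
    """Return a copy of an RGBA color with its alpha replaced;
    in the BGE you would assign this back to obj.color."""
    r, g, b, _old = color
    return [r, g, b, alpha]
```

Remember that for the alpha to show, the blocking object's material must take its transparency from the object color (Object Color enabled, Z Transparency on), as noted above.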

Thanks for the answer. I already use obj.color when selecting objects. I just tested the 4th component (alpha) and it works well. But I'd prefer to have only the backfaces affected.

But I'd prefer to avoid this solution because I would have to split my entire building into many parts :frowning: otherwise it would start making many walls transparent at once.

So I think your A) is more the way I would want it (even if it's going to make all objects sharing the same material turn transparent at the same time :confused:). But I'm a total noob with materials, so using nodes makes me even more of a noob.

I thought it would be something more like a backface culling option, but set to something like 50% (half transparent).

I'm sure people have wanted this at some point in the BGE's more than 10 years of existence.

Please drop a minimal example, I would like to give it a try.

Hey, I just figured it out.

Actually, the easiest way to go (you know all this, but it's for the total noobs googling):

  • Select your map mesh and enter Edit Mode with face selection.
  • Select all the walls you want half-transparent when they get in the way (typically the external walls).
  • Shift+D to duplicate, then flip the normals.
  • Give the new mesh 'No Collision' physics.
  • Make a copy of the wall material (so you don't modify the original) and keep only that one on the new mesh.
  • Still in the Material tab: Transparency: Z Transparency, Alpha 0.5.

Tadaaam… so basically, wherever the old mesh was showing its backface, you just create a duplicate mesh showing 50% transparency.


So when your camera is inside the room, it shows the first wall mesh; when the camera is outside the building, you see the clone mesh with the new modified material :slight_smile:


I think it would be awesome to have progressive camera clipping… not 0/1 but a function.

The back side of faces won't be rendered, so if you have a cube and flip the normals so they point to the inside of the cube, you get the same effect: you will see the inside from the outside, while from the inside you would still see the wall as normal.

So you don’t have to add multiple meshes.

I don't understand. Flipping the normals will just invert the problem, inside-out. No? :hear_no_evil:

I was thinking of using 2 different materials (one on each side) on the same mesh (one side with the basic material, the other a copy of it but with 75% transparency). How could I achieve that?

You can feed the actor's position in as a uniform using object color, and use the vector to that position and the distance from each shaded point to it to make an 'x-ray hole' through the mesh using alpha blend.

Another trick in the old BGE is using the shadow of a lamp vs. the distance to the actor.
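The falloff for that x-ray hole can be sketched as plain math. Names are mine; in a real setup the actor position would arrive in the shader via object color and this would run per fragment:

```python
import math

def xray_alpha(point, actor, hole_radius=2.0):
    """Alpha that opens a transparent hole around the actor:
    0 at the actor's position, ramping up to 1 at hole_radius."""
    d = math.dist(point, actor)
    return min(max(d / hole_radius, 0.0), 1.0)
```

Anything closer to the actor than `hole_radius` fades out, so the wall only goes transparent in a disc around the character instead of as a whole.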

At this point, I would like to know how to use 2 different materials on the same object: one on the front faces, the second on the backfaces. I know it's possible without nodes, but I cannot find any source explaining it clearly.

I just can’t understand this paper. The way it’s explained is so twisted.

Nodes vs. no nodes, Cycles vs. BGE vs. Blender Render, GLSL vs. Multitexture, materials and textures, Object Mode vs. Edit Mode… etc. If anyone wants to put a bullet in their head when dealing with materials in Blender, it's normal… I will not believe anyone who says it's easy. I really hate working with it.

I found this, but it seems impossible to show the image textures; only the red and blue show up.

To have 2 materials using nodes, you can compare the normal against the origin:

Take the dot product of the normal with (position - location).normalized().

That can be used to mix 2 or more textures (diffuse / normal / spec, whatever).

Faces whose normals point in line toward the origin get a different texture than outward-facing ones.

(This is in 2.8x, but you just do the same and use a Material node instead of a PBR node.)
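The facing test behind that node tree can be written out as plain vector math (function names are illustrative, not BGE API):

```python
import math

def facing(normal, position, origin=(0.0, 0.0, 0.0)):
    """Dot product of the face normal with the normalized direction from
    the origin to the face; positive means the normal points away from
    the origin (an outward-facing polygon)."""
    direction = [p - o for p, o in zip(position, origin)]
    length = math.sqrt(sum(c * c for c in direction)) or 1.0
    return sum(n * c / length for n, c in zip(normal, direction))

def pick_alpha(facing_value, outward_alpha=1.0, inward_alpha=0.5):
    """Outward-facing polygons stay opaque; inward-facing ones go half-transparent."""
    return outward_alpha if facing_value > 0.0 else inward_alpha
```

The sign of the dot product is all the node tree needs: it is the mix factor that selects between the two materials (or between full and half alpha).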

Why should I have to use nodes? In the video above the guy seems to do it fine. What about this option in BGE 2.79?


Maybe he's in Multitexture mode and his cards are still cubes. Anyway, I would probably have to go with nodes.

You need to generate additional geometry or flip what you already have.
I wouldn't go with either: I'd make the camera not clip through walls.

That would be my choice too. You can fire a ray from the character back towards the camera. If it hits something, then set the camera distance to the ray distance (minus a little so the camera is definitely in front).
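That pull-back step is just vector math. A sketch under assumed names; in the BGE the positions would come from `worldPosition` and the hit point from `rayCast`:

```python
import math

def clamp_camera(player, camera, hit_point, margin=0.2):
    """Move the camera along the player->camera line so it sits just in
    front of the obstacle the ray hit; pass hit_point=None for no hit."""
    if hit_point is None:
        return tuple(camera)  # nothing in the way, keep the camera as-is
    offset = [c - p for p, c in zip(player, camera)]
    length = math.sqrt(sum(c * c for c in offset)) or 1.0
    # shorten the distance to the hit, minus a margin so the camera
    # ends up definitely in front of the wall
    new_len = max(math.dist(player, hit_point) - margin, 0.0)
    return tuple(p + c / length * new_len for p, c in zip(player, offset))
```

Run this every frame and the camera slides in when a wall gets between it and the character, then eases back out when the line of sight is clear.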

That's a good option.

That's what I have now: 2 planes instead of 1. I was just wondering why the BGE offers the 'Double Sided' option. I want just 1 plane with 1 solid material on the front and a 50% transparent material on the backface (so not simply backface culling). As reported by BluePrintRandom, it's possible with nodes. But I wonder if node materials will work in the BGE.

Sometimes you want to draw the backface instead of modelling the inside; skirts come to mind.
This is not a BGE thing, it's graphics in general. The question is: do I want this polygon on the screen, will it ever be visible, and should I bug the GPU with yet another evaluation on every single frame?

Vanilla BGE supports Math and Vector nodes AFAIK, so a slight variation on BPR's method should work: take the Greater Than result as the factor to mix between full white and some shade of grey, then plug that into the output alpha.
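Emulated in plain Python, that variation is just two node operations (the values stand in for the node sockets' scalars; the wiring described in the comments is the assumption):

```python
def greater_than(value, threshold):
    """Math node in Greater Than mode: outputs 1.0 or 0.0."""
    return 1.0 if value > threshold else 0.0

def mix(fac, a, b):
    """Mix node: blends from a (fac = 0) to b (fac = 1)."""
    return a * (1.0 - fac) + b * fac

# The facing value (dot product) feeds the Greater Than test; its result
# picks between grey (inward-facing, fac = 0) and white (outward-facing,
# fac = 1), and that brightness is wired to the material's output alpha.
alpha = mix(greater_than(0.7, 0.0), 0.5, 1.0)
```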