[ BGMC28 ] To The Surface (Now with beta download)

Download

For BGMC 18 I made a game called “CaveFly” where you flew around a procedurally generated cave to turn on the lights (video). For BGMC28, I’ll be revisiting this idea, but as an adventure game - exploring the underground world.

Why make this thread so early? Well, the scope that I’m planning for the game is quite large, and to make the level using existing tools would be impossible. So I’m starting to develop a level-creation tool now, so that when the contest starts I can actually make the game. This tool will share no code with the game, and while it will share a shader, I won’t develop that shader until the contest starts.
Since this isn’t quite regular, you guys can decide if this makes my entry “non-competing” or if it’s allowed.

The tool will:

  • Paint the terrain types
  • Generate the collision mesh for the cave (this would take hours to do by hand for the level I’m planning)

Screenshot of the tool interface:


For those interested: the collision mesh generation is done by rendering the terrain out to a bitmap image and using a program called potrace to convert it to an SVG. The SVG is then imported into blender, and a set of cleanup operations is performed.
This means that the work to create a level is now a case of painting it, and then placing any specific assets/sprites.
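A minimal sketch of the tracing step (the file names are placeholders, not paths from the actual tool; potrace’s `-s` flag selects its SVG backend):

```python
import shutil
import subprocess


def potrace_cmd(src_bmp, dst_svg):
    """Build a potrace invocation that traces a bitmap into an SVG.

    potrace's -s flag selects the SVG backend; the input must be a
    bitmap format potrace understands (BMP works).
    """
    return ["potrace", "-s", src_bmp, "-o", dst_svg]


def trace_terrain(src_bmp="terrain.bmp", dst_svg="terrain.svg"):
    """Run the tracer, but only if potrace is actually installed."""
    if shutil.which("potrace") is None:
        raise RuntimeError("potrace not found in PATH")
    subprocess.run(potrace_cmd(src_bmp, dst_svg), check=True)
```

The resulting SVG is then handed off to blender for the mesh cleanup described above.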

This project is available on github.

Rule #4 says you can use assets you made in the past (which I think should have been, and should always be, the rule), so I don’t see why you should be barred. But I’ll leave that up to Smoke, he’s the host.

I think it’s great, but yes, unfair…having access to and using tools that are not available to others…I wouldn’t consider it cheating…but it is a handicap for everyone else. Unless you released said tools to the other contestants in time for them to learn how to use them…

This is just my opinion as a bystander.

I really love your work though…games…meh :slight_smile: but there is always a lot of good ideas and cool toys that come from your work…you are an excellent designer my friend…I’m sure you know this. :wink:

The code people use isn’t available; it was never necessary to post your code so everyone could use it, like artists had to post their assets. This seems to give programmers an unfair advantage, but I think I’m being petty, so I’ll back off. I dislike any restrictions, because I think it hinders good game making. So pay no attention to me. :slight_smile:

No, you have a valid point…
My point is that he will be developing this tool prior to the contest beginning…which is fine. It is however a game dependent on this tool…so in fairness it should either a) be available for all contestants or b) not be developed until the contest starts.

I really do not care in any case, I am just looking out for those who will not be able to compete on the same level playing field. That is what good competition is all about.

The code for the tool is up on github, so technically it’s available to all participants - but I do agree that it gives me a large advantage. So if you guys say “nope, disqualified” then I will make it over the BGMC period anyway, and just won’t submit it for voting.

To me the BGMC’s are just an excuse to make the games from my notebook of games-to-make, and this one has been on my mind for a long time now. All the technical requirements and solutions have been in my head for months, and I just need an excuse to sit down and type it out. Unfortunately the requirements of the level mean that blender is not a complete solution for creating levels, nor is any other program. I estimated that developing the tool would take 4-5 days in the evenings, and that leaves very little time to make the actual game during the BGMC period - particularly as it is an adventure game (and thus has a story line).

I’m really looking forward to what you can produce…you are a far better coder than I.

Just to be clear, I’m advocating for fewer rules, not more rules (in the interest of better quality games). :slight_smile: But there is no rule that I have ever seen in BGMC that says you can’t use code that you have made in the past. So anyone can write a module that could apply to character movement, pickups, save/load, etc. at any time, then use that code in the next BGMC. Correct me if I am mistaken. So I would think that it’s perfectly fine for Jeoff to use his old code from CaveFly, in addition to any new code. But again, that’s up to Smoking Mirror. :slight_smile:

Here’s a quick demo of the tool:

As you can see, it allows you to paint and then generates a physics mesh. The terrain is defined by a low-res image (in this case 16px/bu) and detail will be added by a shader. To generate the physics mesh, you need to consider the shader applied to the terrain. To do this, the map is rendered at a higher resolution (the extra window that appears) and then run through a BMP → SVG converter (potrace). The result is imported into blender and post-processed into a workable physics mesh.

Anyway, there’s only a few more things I want to add to the tool, and then I’ll call it done. The remaining features are:

  • Load/save multiple maps (currently the map it edits is a hardcoded path)
  • Toggle buttons for the recolor vs dig
  • Possibly another brush control dial for brush strength
  • Correctly align generated physics mesh inside blend file rather than at runtime
  • Figure out where the scaling factor of 1.384 comes from…

Edits to the terrain tool that will be done after the contest starts:

  • Create terrain shader
  • Decide on needed px/bu for the low-res map
  • Decide on needed upscale for rendering out the image for physics conversion

Obviously, in the shader, the colors won’t be red, green, and blue. I plan for one dial to be “undergrowth” (eg plants), one dial to be ground details (eg pebbles, rocks), and the final channel is currently spare (any ideas?)
I’ve also got a neat 2D water shader designed in my head - time will tell if these shaders actually work in practice…

(@Justin, if you want to use it you can start learning it! Link to github is in the first post. Linux only, need make, python, upbge, blender and potrace in the system path, need PIL accessible to blender’s python. Only edits the file “test.png” and all usage instructions are on-screen)

I am linux only :)…but I have other things on my plate…been really busy with Myrlea now that it is in the last 10% or so of development (so 90% to go, right? :)).

Still, I’m always watching for your posts…always curious what you are up to. You are one of the few truly decent programmers here in the BGE forums who also has an eye for design…usually it is one or the other…

no need to respond, I understand that it is hard to sound modest when replying to a comment :).

I am totally about allowing your game to participate. I think it’s totally fine under your conditions to use a pre-made tool like that. We need to see more good games taking part in BGMC, and I know you can handle the task of making a good game.

For those who aren’t subscribed to my youtube, here’s a video I posted yesterday:

Hopefully this evening I will make the terrain shader.

Looks fantastic! That springy landing gear gave me an idea for my own space game.

Hahaha, I wasted so much time fighting bullet’s constraint system to get them to work nicely. But it was totally worth it.

And we now have terrain:

Shadow casting is from the physics mesh (generated using the tool shown earlier in this thread).
I want to automate the level building process a little bit more, as there are still some bits in working with the generated meshes that are awkward (eg alignment, setting up materials)

Also, how did you make the gear springiness? I want to do something similar for my space game. I was thinking of using the car suspension constraints.

Here is an image of the physics setup:

You can see (on the left leg) that the center of rotation is in a funny place: down “inside” one of the boosters. This helps the collision bounds to approximately follow the leg as it rotates upwards (see the leg on the right). It isn’t perfect, but it is close enough that it is unnoticeable during the game.
The leg collider is joined to the body using a 6DOF rigid body joint, free to rotate about one axis, but bounded by the constraint limits to the “up” and “down” positions. Then, to change position, I use the constraint motors: the target velocity controls the direction, and the motor force is limited so that the end-stops work properly.

This all sounds easy, except that a rigid body 6DOF joint’s motors have a singularity along the axis due to gimbal lock in Euler coordinates. To overcome this, the “default” position of the colliders is half-way through their deployment, so the joint goes from -pi/4 to +pi/4, never hitting the pi/2 singularity.
(The axis parameter in bge.constraints.createConstraint appears to be broken, so this is set up by rotating the objects before the constraint is created.)
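The rest-pose trick can be sketched as a tiny helper (a hypothetical function for illustration, not code from the actual game):

```python
import math


def deployment_angle(t):
    """Map a deployment fraction t in [0, 1] to a joint angle.

    Instead of sweeping 0..pi/2 (which would pass through the pi/2
    motor singularity), the rest pose is rotated half-way through the
    deployment, so the joint only ever sweeps -pi/4..+pi/4.
    """
    t = max(0.0, min(1.0, t))
    return (t - 0.5) * (math.pi / 2)
```

Fully stowed (t = 0) gives -pi/4, fully deployed (t = 1) gives +pi/4, and the angle never reaches the singular pi/2.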

Finally, the visual meshes are linked to the collider by setting the worldOrientation every frame.

Looks amazing. I’ve got no problem with people using code or helper utilities they made in the past. It’s no different to using art assets or music.

@theoldguy I have to admit that it does seem like coders are at an advantage with being able to reuse old code, but that’s one of the facts of life. Being able to write modular, reusable code is a huge advantage for anyone, not just in game jams, but in game development generally.

Thanks Smoking_mirror. I’ll upload a new video of the tool (showing how it relates to this game), and you may change your mind…

Anyway, I added a HUD, and spent way too much time on the fade-in/out effects:

I also added sounds (not in the video), but they’re just noise I modulated in audacity, and they’re pretty bad. Hopefully I’ll get time to redo them, but even if I don’t, it has something at least.

And here’s a demo of the terrain painting tool, now that I have the terrain shader.

I still have an entire terrain dial unused (space for 3 more textures), and need to make another “undergrowth” texture.

The terrain shader uses four texture samples, and is likely to be cheaper than a typical normal+spec+diffuse material because there are no lighting calculations and a similar number of samples.

And to give an idea of the size of the map, here is a montage:

Using a 1024px texture, the map is an almighty 1024 bu square. The ship is 2bu high. How on earth does a 1px-per-bu map work? Using another texture to add detail and interpolation:
map

Because each pixel has 256 possible alpha values, interpolation gives control over the terrain edge on a scale of approximately 1/256th of a bu - which is more than enough. Couple that with a detail texture to add interest, and you’re all set.
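As a quick sanity check on that arithmetic:

```python
# Back-of-envelope check: a 1px/bu map with 8-bit alpha still gives
# fine control over where the terrain edge lands, because the edge is
# defined by an interpolated alpha threshold rather than pixel edges.
MAP_PX_PER_BU = 1    # the low-res terrain map: 1024px over 1024bu
ALPHA_LEVELS = 256   # 8-bit alpha per pixel

# With linear interpolation between pixels, the threshold crossing can
# sit at any of 256 sub-pixel positions, so edge placement resolves to:
edge_resolution_bu = 1.0 / (MAP_PX_PER_BU * ALPHA_LEVELS)
print(edge_resolution_bu)  # 0.00390625, i.e. 1/256 bu
```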

The detail textures are 1024px as well, and each spans 20bu (more than a screen width), so surely you’d see those pixels as well? Well, because of the clipping value, the interpolation, and the 256 possible values, you really can’t. But if you zoom in to the grass, you can see the effect of the interpolation. Bear in mind this is several hundred times closer than you’ll see in game - normally the camera only gets about as close as the top of the landing legs:

The shader node that handles the details looks like:


There are four input channels (the RGBA of the terrain map). Each gets multiplied against a channel in the associated texture. This means you have 9 different detail textures, of which three can be displayed simultaneously (and you can display mixes between them). Each detail texture is mapped to 1/3 of the alpha values, which allows the full range of the detail texture’s channel values to come from the texture. This also allows you to put grass on the rocks without either texture “clipping.”
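A rough sketch of how that alpha-to-band mapping could work (the exact band edges and weight formula here are my assumption, not the actual shader node values):

```python
def alpha_to_band(alpha):
    """Split an 8-bit alpha value into (band index, local weight).

    Each of three detail textures owns one third of the 0-255 alpha
    range, so the full 0..1 weight range of that texture's channel is
    recoverable from the low-res map.
    """
    if not 0 <= alpha <= 255:
        raise ValueError("alpha must be an 8-bit value")
    band = min(alpha * 3 // 256, 2)            # texture index: 0, 1 or 2
    band_start = band * 256 / 3
    weight = (alpha - band_start) / (256 / 3)  # 0.0 .. ~1.0 within the band
    return band, weight
```

So alpha 0 lands at the start of the first texture’s range, alpha 128 at about the middle of the second, and alpha 255 near the top of the third.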

The multiply and power values on each image are fudge factors to solve an off-by-half issue whose root cause I couldn’t identify. I think it may be to do with the conversion from RGB into B/W.


For the physics mesh, the terrain (with the detail shader, but without grass) is rendered out at 16x resolution (ie 16px/bu, 16384x16384). It takes about 1/30th of a second for the GPU to generate, and then about 30 seconds to transfer it to the CPU and save it as a 1GB BMP file. This is then converted into an SVG by a tool called “potrace”. The SVG is imported into blender and then:

  • The curves are converted into a mesh
  • Doubles are removed at a distance of 0.01bu (because the SVG curves are quite detailed, but we only have 16px/bu of actual information)
  • A limited dissolve is run at 5 degrees to prevent excessive vertices on flat areas
  • The outline is extruded to give the physics engine something to work with

This whole process takes about 1 minute.

You can see here the detail on the generated physics mesh:

For the map shown in the top image of this post, the polycount is 3000 polygons for the lower cave structure, and it is both far more detailed and far more efficient than anything I could do by hand. As the level grows, the polycount will increase, and I hope bullet will be able to cope. I suspect it will end up being ~100,000 polygons. If need be, I may end up separating it into multiple smaller meshes in a grid pattern to better use the physics engine’s broadphase.
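If it comes to that, the grid split could look something like this (the cell size and centroid-bucketing approach are hypothetical, just one way to do it):

```python
def grid_cell(x, y, cell_size=64.0):
    """Cell key for a point, for splitting one big static mesh into
    chunks (one physics mesh per cell) so the broadphase can cull
    whole chunks cheaply. A 64bu cell is an arbitrary pick."""
    return (int(x // cell_size), int(y // cell_size))


def split_triangles(triangles, cell_size=64.0):
    """Bucket triangles by the grid cell containing their centroid."""
    cells = {}
    for tri in triangles:
        cx = sum(v[0] for v in tri) / 3.0
        cy = sum(v[1] for v in tri) / 3.0
        cells.setdefault(grid_cell(cx, cy, cell_size), []).append(tri)
    return cells
```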

The physics mesh is also used for casting the shadows, as it can be “seen” in 3D by the render engine.