In the roadmap, Ton speaks of the game engine using viewport code.

Talking with the author of the code, he says the two have incompatible coding styles.

Who is right? Ton or Psy-Fi? Can they both be right?

Can the code branch at some point, GE one direction and interactive GLSL the other?

Can island detection be run just before pressing P?

I think the game engine is filling a need for a lot of people,
including myself, and if the drawing code were modernized, it would
boost the engine into this decade.


People say the engine is not good,
but then all the reasons they describe will be fixed
when the viewport is fixed…

Why not enjoy the ride?

“All the reasons” are not related to the viewport. Some of them aren’t even technical in nature. Hell, the game engine renderer has actually been more advanced, supporting more features than Blender’s native 3D View. And it’s been like that for a while. In some respects, the viewport project is playing catch-up… and then leapfrogging. And while the native 3D View will see substantial improvements, those improvements do not necessarily translate to being good for a game engine renderer. There’s a lot of data that’s useful in content creation that is simply pure overhead when it comes to a game engine. That means that the 3D View will benefit from increased interactive capabilities and realtime display, but those benefits don’t necessarily translate to the game engine.

Viewport enhancement is not a panacea (for the game engine… or anything else, for that matter).

The Blender game engine is useful and it is fun. I daresay it’s not a bad way to quickly prototype a game… especially for people who aren’t developers themselves. But it’s not what you want it to be. It may not ever be.

I really can do anything I try in the game engine.

The only limits I have are no instancing
and no batched static draw calls.

And that is pretty much it,

besides Android support for the export pipeline.

I have pretty much replicated the mechanics of Grand Theft Auto, mixed with Space Engineers. I still need to work on UI, animations and assets, but none of that is a fault of the engine.

Really? You have worked out how to get the Blender Game Engine to handle levels the size of small cities, with level data streamed on the fly from the hard-drive during gameplay, whilst simulating crowds & traffic?

I don’t see that in any of your demo videos, so colour me skeptical.

I mean the swapping of actor control, without the overhead, into vehicles, drones, etc.,
and my weapon system is actually more flexible than theirs.

About streaming data:

Divide the ground plane into sections, and store two lists in each section:
large items and small items.

Set up LOD for the sections.

When LOD < X, load in the pieces off the list over time (x pieces per frame), and delete them from the list (large items).

When LOD < X-1, load in the pieces off the list over time, and delete them from the list (small items).

If the LOD level goes above the threshold and you're done loading in, start deleting the items and saving them back to the two lists.
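The scheme above can be sketched engine-agnostically. This is a minimal toy model, not code from the Wrectified project: the `Sector` class, the `x` threshold, and the per-frame `budget` are illustrative names and assumptions.

```python
class Sector:
    """One ground-plane section holding two spawn lists (large and small)."""

    def __init__(self, large_items, small_items):
        self.pending = {"large": list(large_items), "small": list(small_items)}
        self.loaded = []  # (item, size_class) pairs currently spawned

    def stream(self, lod, x=2, budget=2):
        """Spawn or despawn up to `budget` items this frame, driven by LOD."""
        if lod < x:
            # Close enough: spawn large items first, then small ones
            # (small items only once LOD drops below x - 1).
            if self.pending["large"]:
                pool, cls = self.pending["large"], "large"
            elif lod < x - 1 and self.pending["small"]:
                pool, cls = self.pending["small"], "small"
            else:
                return
            for _ in range(min(budget, len(pool))):
                self.loaded.append((pool.pop(), cls))
        else:
            # Far away again: despawn and hand items back to their lists.
            for _ in range(min(budget, len(self.loaded))):
                item, cls = self.loaded.pop()
                self.pending[cls].append(item)
```

Calling `stream()` once per frame spreads the spawning cost over time instead of hitching when the player crosses a section boundary.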


Is that Wrectified or a different project? I'd love to play that game! :D

Wrectified -

Still need quite a bit of assets to be what I have been dreaming of, but the code is there.

I daresay it’s not a bad way to quickly prototype a game…

Here are some questions from an amateur:

A lot of people say this, but what are the specific reasons why the BGE is not a good engine for making full games?
What do Unity and all those engines have that the BGE doesn't?
Yes, yes, making a remake of GTA V would be pretty much impossible in the BGE… but would you be able to do it in Unity with only a team of one or a few people?


GTA 5 assets would take years to make, unless you could use procedural methods to generate them, or even 3D scans.

As far as how the BGE stacks up against Unity, it's mostly shader stuff we can't do at the moment.
If you are doing a ton of calculations, Python
is not the fastest due to threading, but I have
not hit that wall yet, by using managers instead of hundreds of
objects running Python.
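The "manager" idea can be illustrated with a toy sketch: one update loop drives many agents, instead of attaching a separate script (with its per-invocation overhead) to every object. `Agent` and `Manager` are made-up names for illustration, not part of the BGE API or the compHost script.

```python
class Agent:
    """A single game entity whose logic is driven from outside."""

    def __init__(self, name):
        self.name = name
        self.ticks = 0

    def update(self):
        self.ticks += 1  # per-object logic lives here

class Manager:
    """One object runs this; it updates every agent in a single loop."""

    def __init__(self, agents):
        self.agents = list(agents)

    def update(self):
        # One script invocation per frame, however many agents exist.
        for agent in self.agents:
            agent.update()

mgr = Manager(Agent("npc%d" % i) for i in range(100))
mgr.update()
```

In BGE terms, the manager would be the one object with a Python controller, iterating over plain game objects instead of giving each of them its own controller.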

Check out my demo for Wrectified, the script compHost.

It's mostly shader stuff we can't do at the moment.


I don’t really care about that.
I guess you know the game Rust.
Poorest graphics I've ever seen in a game, but still an awesome game that is pretty popular.

It's more than the shader code!

BTW… in case it wasn’t clear, this has been answered:

As has been noted in other threads (and here), it’s not that simple. Up until recently, the game engine renderer was (in some ways) superior to the native viewport. But did you ever stop to wonder why a lot of the advancements in the game engine renderer (old as they are now) never found their way back to the native viewport? It’s apples and oranges. Features that are advantageous for games aren’t necessarily advantageous for a content creation viewport… and vice versa.

The native viewport needs enhancing. It’s likely that, feature-wise, it will eventually outstrip the capabilities of the game engine renderer… and even gain some programmable interactivity that would be useful in content creation. However, those features might not be useful to the current game engine renderer. There will be data overhead that’s necessary for content creation, but useless to a game engine renderer… and there might not be a clean way to strip that overhead out.

In short, it’s probably inaccurate to say that the plan moving forward is to integrate the game engine into Blender proper. A more accurate way to put it is that the goal is to have a native viewport in Blender that has features that are close to parity with a game engine. You probably could make an actual game using those features (Excel had a flight simulator in it at one point, after all)… but it probably wouldn’t be that great of an idea to use those features for more than prototyping.

What about using everything they have in common in the code up to a point, and diverging somewhere?
Or switches in the code?

if mode == 1: etc.?

Still impractical?

Software development is seldom as simple as it seems, in almost no case can things be solved by a statement here and a function there.

This is not BGE python you’re dealing with, the development of Blender itself is an entirely different ball of wax.

Generally speaking, it’s not that simple (see where I wrote “and there might not be a clean way to strip that overhead out”?). Furthermore, you’re not acknowledging the bigger point. The codebases have already diverged. Entropy has set in. The game engine renderer and the viewport renderer are not going to be merged in the way you think. The viewport will gain features… perhaps even interactivity. The game engine renderer will either be developed independently or languish altogether.

That is the most likely future.

I think separating the BGE's material system from the renderer would be a good first step, so the renderer that uses the data can decide how to use it.

So Ogre or a new renderer B could be swapped in without breaking anything.

So the core program feeds data, and the renderer interprets it.

This way you can have vanilla, high-end, and mobile renderers easily.
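The "core feeds data, renderer interprets it" idea amounts to a plugin interface. Here is a rough sketch of what that could look like; `RendererBackend`, the backend registry, and `render_frame` are hypothetical names for illustration, not actual Blender or BGE code.

```python
class RendererBackend:
    """Abstract backend: each renderer decides how to draw the scene data."""

    def draw(self, scene_data):
        raise NotImplementedError

class VanillaRenderer(RendererBackend):
    def draw(self, scene_data):
        return "vanilla:%d objects" % len(scene_data)

class MobileRenderer(RendererBackend):
    def draw(self, scene_data):
        # A mobile backend might draw a stripped-down subset of the data.
        return "mobile:%d objects" % len(scene_data)

# Swappable registry: a new backend plugs in without touching the core.
BACKENDS = {"vanilla": VanillaRenderer(), "mobile": MobileRenderer()}

def render_frame(scene_data, backend="vanilla"):
    # The core program feeds data; the selected backend interprets it.
    return BACKENDS[backend].draw(scene_data)
```

The hard part, as noted above, is not the interface itself but untangling the existing bindings so the scene data can be handed over cleanly.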

Compartmentalized code sections that link up using a uniform language
would free the game engine renderer for plugin-style renderers.

Compartmentalized logic could mean swapping the SCA for nodes as a user choice, or pure Python, etc.

New coders could use a template, try a method, and get it into the wild without breaking Blender or the BGE.

I will ask lordLoki how hard it would be to do this,
and how far away it is after the refactoring that has already
taken place.

So, in short, you’re going to ignore everything I’ve written.


No, I was saying that to implement a better renderer, one first has to isolate the bindings from the current renderer.

Then we could potentially have the ability to select renderers, like UE4-ish or Unity-ish or ???, so you could see assets in the game engine and they would look like that engine's output.

Also, we could have an Ogre plugin, or a new BGE renderer, or a mobile-style renderer.

The important thing is to isolate the renderer so it's a plugin, and then be able
to test new ones just by downloading the developer's addon.