Is it just me...

Is it just me, or does anyone else feel that Blender would be a stronger product if it branched into two development streams, one for animation and one for the game engine, as separate products with separate release schedules? As an animator, I find that I am constantly kludging in features (especially physics) that were designed for the BGE but would have been better implemented, even if just in layout and discoverability, if they weren't scattered all over the place, half in the BGE and half in the animation panels. And I wonder whether Blender wouldn't be faster for both products if the code unused by each product were gone. For example, I imagine all kinds of hooks in the code that say "IF BGE THEN…" which, if they didn't have to be evaluated every time, would speed up other parts.

Also, there is an element of professional credibility at stake when a single software package tries to be too many things: it makes people suspect it doesn't do any one thing really well.

I've split your topic off into a new thread, since it's a completely different thought from the original poster's concern. Perhaps it will stimulate more conversation around your idea this way.

Thanks blendenzo, I wanted to answer and was waiting for someone to move this post :slight_smile:

I think the idea of splitting Blender into branches would be great for us specialists,
but to be honest, I don't think splitting it into those two branches would do any good for the majority of Blender's users.

As far as I understand the engine, some parts are simply not included when the scene is compiled, so there shouldn't be many "if BGE then" checks made in my part of Blender (which is the game engine part).

I also think that (for realtime!) there won't be much performance gain, since, as mentioned, most of the non-realtime code is not part of the compiled file. The only positive thing about splitting would be that there are fewer buttons to press, and thereby less confusion about them.
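The compile-time-exclusion point can be sketched like this: a minimal, illustrative Python model (not Blender's actual build system; the flag name and feature table are hypothetical) of how code left out of a build costs nothing at runtime, as opposed to a runtime "if BGE then" check:

```python
# Hypothetical build flag -- in a real C build this would be a
# preprocessor define, and the guarded code would simply not exist
# in the compiled binary.
WITH_GAMEENGINE = False

# Features are only registered when the flag is set at "build" time.
features = {"render": lambda: "rendered frame"}
if WITH_GAMEENGINE:
    features["game"] = lambda: "game engine running"

# The animation-only build has no "game" entry at all; rendering
# never evaluates a per-call "am I the BGE?" check.
print(features["render"]())
print("game" in features)
```

Under this model, stripping the BGE out wouldn't speed up rendering, because the excluded code was never on rendering's execution path in the first place.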

In the end, the only useful split might be to make a blender-modeller.exe and a blender-gameengine.exe, but since most people use both parts I don't think that would be great either, since most users would have to switch between the two Blenders anyway.
I know some engines which use five or more executables for the different parts of the work, and, to be honest again, I disliked every last one of them, since I sometimes had more than ten programs running, and although I am good at multitasking, ten windows and their dependencies on one another are too much for my brain.

I thought about making a build which only includes the realtime parts, but since 2.50 is around the corner I will at least wait until 2.51 or even 2.52 before getting into that.

greetings
manarius

I would consider myself quite a specialist in Blender's game engine.
…And I can tell you that splitting Blender into two separate pieces of software
would be a pain for creating games. Especially for beginners, but also for specialists.

If it's the interface's layout that bugs you, just create a "window setup" for Blender games,
and a window setup for other things. Myself, I have 4 setups explicitly named:

1-Scripts
2-Model & Games
3-UV & IPO
4-Nodes

I tell you, the key is to have a good setup. Don't feel
forced to use Blender's default setup :stuck_out_tongue:

Also, I don't follow your belief that making the game engine external would
make anything else in Blender go faster. From the coding skills I've gathered lately,
I can tell you that code that isn't executed isn't slowing anything down.
For example, when you model something, the fact that Blender can simulate
water does not affect your modelling speed at all, just like the fact that you'll
be able to simulate your object in a Blender game doesn't influence anything.
(Except the way you're modelling the thingy, of course, heheh.)

As for the incredible confusion this game engine in Blender brings you:
come on, it's only a single sub-button hidden in the "button window" sub-window.
If that confuses you that much, man, you're going to get hellishly confused with Blender hahhaahh.
Oh, and there's also a shortcut, F4, that will invoke the Game engine panel.
…But I don't think you should bother with that; it's only going to create more confusion hahahaha…
…my bad…

More seriously, I think you understand that it might not be such a good idea.
Blender is known for doing everything. If you start to divide it, you'll soon end up saying:
"I never do any fluid simulations, let's cut that off too"
…and then nodes go out as well
…etc.

Blender handles all these different personalities quite brilliantly. If you really
want to mess with its code, don't waste a second of your life trying to split Blender.

Instead, do something creative and improve or create a new function for blender.
There’s so much to do.

Anywho,
Have a nice day folks. :slight_smile:

I wasn't really suggesting separating all those elements (e.g. modeling), only the rendering/animation from the BGE.

A million years ago, when I was a telecom engineer, because we had so many products for different end users, our software was structured into distinct layers: the bottommost layers held the common functionality, and as you went higher, the layers became more specific to the particular end product. Any upper layer could access code from any lower layer (and picked up all lower-layer bug fixes automatically, because the layers of code were separate builds).

In this case it would be much simpler, because you'd have one common bottom layer (interface, Python scripting, modeling, materials, etc.) and then two equal but separate higher layers, one specifically for game design and one specifically for animation. Each higher layer could reach down to the bottom layer for common functionality, such as the Bullet physics, but each specific layer (e.g. animation) would implement that same functionality differently in terms of layout, where to find it, the settings available, etc.
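The layering described above can be sketched in a few lines. This is a minimal, illustrative Python model; the function names and the trivial physics are hypothetical, not Blender's actual modules:

```python
def base_physics_step(position, velocity, dt):
    """Common bottom layer: one shared physics implementation."""
    return position + velocity * dt


def bge_simulate(position, velocity, dt):
    """Game layer: calls the shared physics live, once per frame."""
    return base_physics_step(position, velocity, dt)


def animation_bake_to_ipo(position, velocity, dt, frames):
    """Animation layer: the same shared physics, but baked out
    to a keyframe curve (the IPO-recording idea) instead of run live."""
    curve = []
    for _ in range(frames):
        position = base_physics_step(position, velocity, dt)
        curve.append(position)
    return curve
```

Both upper layers call down to the same base function, so a bug fix in `base_physics_step` benefits both, while each layer presents the functionality in its own way.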

For instance, one aspect of Blender's animation side that is kludgy in implementation is recording physics to IPOs. It has the feel of something designed well after physics was implemented in the BGE, and so feels like an underdeveloped workaround so that animators can partly access the physics. Same with GLSL materials.

Anyway, it's probably more work than the developers want, but it would eventually allow dedicated "base layer" coders, separate from dedicated BGE coders, who are in turn separate from animation coders, to all make functionality calls into the base layer while each designs their own implementation of how that functionality is presented above it.

Since the new interface of Blender 2.5 should be pretty flexible,
you could always take the opportunity to script what you have in mind.

Create the design you want and show it off :smiley: