At the moment, the BGE cannot load libraries dynamically. After some discussion in #blendercoders and #gameblender about possible ways around it, the consensus seems to point to one thing: modifying the source.
Why would you want to load libraries dynamically? Well, in my case, a virtual world is completely fluid; I’d like content to come and go transiently, like in a web browser. In games, you could keep a smaller memory footprint, since things wouldn’t need to load all at once.
Modifying the source sounds awfully scary to me, since I never planned on recompiling and have very little C experience. I have compiled Blender successfully, though, so I’d like to at least give it a try before giving up and accepting the limitation.
So, this thread will discuss problems, solutions, and possible routes toward getting dynamic content working in the BGE. So far, I’m thinking that if I use pymalloc in conjunction with exposing the converter, I can allocate memory, append the objects, and convert them for use.
Feel free to call me out and set me straight on details. I’d like to first make sure this is plausible!
The converter functions can be found in ‘blender/source/gameengine/Converter’. I’m going to snoop around and see if I can find where the conversion is actually performed when you hit P to play your game.
Here is the header file for BL_BlenderDataConversion.h
(Thanks to brecht for finding this for me:)
I think this may be the heart of the routine, where the conversion begins (line 330):
// create a scene converter, create and convert the startingscene
KX_ISceneConverter* sceneconverter = new KX_BlenderSceneConverter(blenderdata,sipo, ketsjiengine);
Do you think modifying the source as you described could eventually allow dynamic streaming of objects? It would also speed up large worlds, since objects that haven’t been loaded wouldn’t cost any frame time, and you could swap various objects in and out of the game world.
For example, if you’re in a transition area, you could unload all the objects named ‘Box1’ and load all the objects named ‘Box2’. Box1 then costs no frame time, since it’s unloaded by the game and won’t exist until a function loads it into the scene again.
From what I think I know right now, you should already be able to load and unload meshes from your scene. When you start your game, all the memory for your game’s assets is allocated up front and then frozen; you can’t add to it or subtract from it. However, you should be able to move objects between memory and your scene to control framerate.
What I mean by dynamic library loading is allocating and deallocating new memory while the game engine is running: downloading new assets, converting them from Blender objects to BGE objects, using them, and discarding them, all in realtime, without having to restart. My understanding of Blender and its Game Engine is primitive (hence starting this discussion to explore possible roadblocks or development directions), so again, if anybody knows otherwise, please comment and correct me.
If I add the wrapper to the GameLogic module, I think the desired loadBlend() function will be available in both the Blender Player and the Blender main program.
After following the ConvertScene method… it looks like the Python interpreter is updated with knowledge of the new scene objects. I’m still not sure. Hopefully I won’t have to mess with it.
I might need to run the asset load as a C++ thread, if possible. Threads can be done in Python, but it would look a bit hacky: I don’t think you could just call a function; you’d have to wrap it in a thread and make sure the results are only picked up once per frame render.
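For what it’s worth, the “wrap it in a thread” idea can be sketched in plain Python with the standard threading and queue modules. This is only a sketch; load_asset, poll_loaded_assets and the file name are made-up names for illustration. The real version would do the downloading/parsing inside load_asset and call the poll function once per frame from a logic tick:

```python
import threading
import queue

# Finished loads land here; the game drains this once per frame.
results = queue.Queue()

def load_asset(path):
    # Placeholder for the expensive work (downloading/parsing a file).
    return f"data from {path}"

def loader_thread(path):
    results.put(load_asset(path))

# Kick off a load without blocking the game loop.
t = threading.Thread(target=loader_thread, args=("level2.obj",))
t.start()

def poll_loaded_assets():
    """Called once per frame; returns whatever loads have finished."""
    loaded = []
    while True:
        try:
            loaded.append(results.get_nowait())
        except queue.Empty:
            break
    return loaded
```

The queue is the important part: the worker thread never touches the scene itself, it only hands finished data back to the main thread, which keeps the per-frame bookkeeping in one place.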
@Delirium, thinking about this problem: if it’s over your head, why not start by working around it? (Not pretty, I know.)
Start the level with 20 meshes, each of 1000 disconnected triangles.
Instead of loading a blend, open an OBJ or similar and move the verts to match the OBJ (of course you’re limited to 1000 faces or whatever).
Even so, with this you can prototype what it’s like to stream meshes over a network. You can set UVs, vertex colors, etc.
Once you have this working nicely, other devs might be more inclined to write a less hackish method.
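To make the suggestion concrete, here is a minimal sketch of the kind of loader described above, in plain Python. load_obj is a made-up name, and it deliberately handles only ‘v’ lines and triangulated ‘f’ lines, ignoring everything else:

```python
def load_obj(lines):
    """Minimal OBJ reader: vertex positions and triangle faces only.

    Ignores normals, UVs, materials and anything else, and assumes the
    file is already triangulated (no ngons)."""
    verts = []
    faces = []
    for line in lines:
        parts = line.split()
        if not parts:
            continue
        if parts[0] == 'v':
            verts.append(tuple(float(c) for c in parts[1:4]))
        elif parts[0] == 'f':
            # 'f 1/1/1 2/2/2 3/3/3' -> keep only the vertex index,
            # shifted from OBJ's 1-based indexing to 0-based.
            faces.append(tuple(int(p.split('/')[0]) - 1 for p in parts[1:4]))
    return verts, faces
```

The 0-based indices line up with whatever routine then moves the placeholder mesh’s verts to match.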
Whoa, that’s a really cool idea; I didn’t know you could change verts and UVs around from inside the game engine. That would probably be really useful for something like realtime collaborative mesh editing. Or, if I can’t cut it, yeah, just try converting OBJs >_>;
I have rudimentary C experience, so I picked up a small C++ book, and I’ll see how that goes. But this is definitely a great option if the problem proves beyond me, thanks a bunch ideasman
Ah, wasn’t sure if you’d like this way since it’s totally simple, but it actually has a surprising number of benefits.
You can probably parse the data in a thread, since it’s all Python floats/ints.
You could avoid lockups by parsing one line of the file per logic tick rather than running a loop over thousands of lines.
No C experience needed: it can all be done in Python.
No security issues from loading other people’s blend files over the web.
OBJ is a very simple format; you can write a basic loader for verts + triangles in a few lines. Don’t try to support arbitrary files to begin with, though, since there are cases that break a simple loader: ngons, odd ordering of data.
The scenes, though limited, will have very predictable performance. For a virtual world this can be a lot better than idiots putting 500k-poly meshes there.
If you get this working well, it could even be made less horrible by not rendering unused faces, and by loading some image data as well.
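The “one line per logic tick” point above can be sketched as a Python generator: the parser state lives inside the generator, and the game just advances it one step per frame, so no single frame stalls on a big file. obj_stream and tick are hypothetical names; the real per-frame call would come from a logic brick:

```python
def obj_stream(lines):
    """Parse an OBJ incrementally, one meaningful line per next() call.

    Yields ('v', (x, y, z)) for vertices and ('f', (i, j, k)) for
    triangulated faces (indices shifted to 0-based); other lines are
    skipped."""
    for line in lines:
        parts = line.split()
        if parts and parts[0] == 'v':
            yield 'v', tuple(float(c) for c in parts[1:4])
        elif parts and parts[0] == 'f':
            yield 'f', tuple(int(p.split('/')[0]) - 1 for p in parts[1:4])

def tick(parser):
    """Per-frame driver: do one step of parsing, then return to the game.

    Returns the parsed item, or None once the file is exhausted."""
    try:
        return next(parser)
    except StopIteration:
        return None
```

Compared with parsing the whole file at once, this spreads the cost over many frames at the price of a longer total load time, which is usually the right trade for a game loop.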
Some BGE games that rely on heavy armature animation suffer from not being able to have physics-enabled characters with dynamically updating collision meshes. Maybe have Erwin look at how it should be done; after all, he’s the one who updates Bullet in Blender.
I never expected to dig this far down; I’m gonna take a break x.x Converting a scene isn’t enough if I want to use this practically: I need to access objects within the scene, and I’d rather modify the source than rely on the init workaround to do it. I’m just not at that level yet.
I was planning to prototype a virtual reality application with Blender, but since dynamic loading is not (easily) possible, I think I have to drop that plan.
It would be really awesome if one could create meshes dynamically in the GE. Hey, then it would be possible to build a modeler inside the GE!
EDIT: What’s the point of making a modeler in the GE? Well, one could create a simplified tool for a specific task. In my case it would be a simple architecture modeler.