Blender People - New MySQL Setup Documentation

Get the latest version right here:
http://www.harkyman.com/bp.html

Subscribe to BlenderPeople Development News RSS Feed here:
http://www.harkyman.com/bpblog/blenderpeople.xml

Now follows old stuff from the original post…

First - I’m moving discussion of this from the blender.org boards to here. So, anyone who’s been following this over there - welcome! I just couldn’t stand the forum errors on long threads there anymore. Ack.

Here’s a link to the latest animation:
http://66.134.133.114/battle24fps.avi

The camera was stuck to one of the red team Actors, on assault against a defending Blue team. BTW, I will not be posting future updates on blender.org. I’ll put them right in this thread.

Everyone else can bail here, but if you’re interested in the progress report and to-do list, read on…

  1. Did a complete rewrite of the code. It is now significantly faster: it calculates a full frame of motion for over 140 actors in less than one second on my Athlon XP1800 machine with 0.5 GB of RAM.

  2. Fixed a bug in the Python API source, allowing the generation of smooth IPO curves from Python. The calculated motion frames can now be used as keyframes every 3, 6, 12 or however many real frames, so one second of calculation covers 12 full frames of motion. I might try a 1000-actor sim this weekend. We’ll see how it scales.

  3. The code rewrite made it easy to add new kinds of orders, so I added several new types: StrictMarch and StrictDefend, which are like their normal versions except that actors will ignore enemies unless directly attacked, and RegroupMain and RegroupCommander, which cause forces to regroup to a central location or to their hierarchical commander, respectively.

  4. Rewrote the code for barrier avoidance. It still needs work, but it’s much more efficient and intelligent now.

  5. The big one: DIRECTIONAL AWARENESS. Actors are now directional creatures. Before, they evaluated everything on a global basis. Now they make their decisions based on what they see in front of them, with user-definable fields of vision and turning speeds. In the video, you can see them “looking around” for enemies to attack. Actors also “hear” in a limited radius, so they are aware of crap that’s going on behind them, just not so much as the stuff in front of their face. I came up with solutions to some nasty over-rotation problems, so if you’ve been working on a script and your rotations get out of control once they go over 360 degrees, PM me and I’ll give you my solution (a general sketch of the idea follows this list).
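For anyone curious, here is a minimal Python sketch of the general idea behind items 5: a field-of-vision test plus a wrap-safe turn that never spins the long way around. It assumes headings in degrees on a flat ground plane, and the function names and parameters are illustrative rather than taken from the actual BlenderPeople code.

    import math

    def bearing_to_target(actor_pos, heading_deg, target_pos):
        # Signed angle in degrees (-180..180) from the actor's heading to the target.
        dx = target_pos[0] - actor_pos[0]
        dy = target_pos[1] - actor_pos[1]
        bearing = math.degrees(math.atan2(dy, dx))
        # Wrapping the difference into -180..180 keeps a small turn to the left
        # from ever being treated as a 350-degree spin to the right.
        return (bearing - heading_deg + 180.0) % 360.0 - 180.0

    def can_see(actor_pos, heading_deg, target_pos, fov_deg):
        # True if the target falls inside the actor's field of vision.
        return abs(bearing_to_target(actor_pos, heading_deg, target_pos)) <= fov_deg / 2.0

    def turn_toward(heading_deg, bearing_delta, max_turn_deg):
        # Rotate toward the target by at most max_turn_deg per sim step,
        # always taking the short way around, and keep the heading in 0..360.
        step = max(-max_turn_deg, min(max_turn_deg, bearing_delta))
        return (heading_deg + step) % 360.0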

TO DO LIST:

  1. Testing on each of the Orders, to make sure they do what they’re supposed to, and that Actors ignore them when they’re supposed to. For documentation’s sake, I’ll make a little video of each Order type in action.

  2. DB creation code. All you will have to do is have MySQL installed, and Blender will create the appropriate databases and tables for you if they don’t already exist (a rough sketch follows this list).

  3. Decide how to handle vertical orientation and include movement constraints for slopes on terrain.
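To make the DB creation item concrete, here is a minimal sketch of what the setup could look like using the MySQLdb module. The connection settings, database name and table layout below are placeholders for illustration, not the actual BlenderPeople schema.

    import MySQLdb

    # Placeholder connection settings; point these at your own MySQL install.
    conn = MySQLdb.connect(host="localhost", user="blender", passwd="secret")
    cur = conn.cursor()

    # IF NOT EXISTS means re-running the setup against an existing install is harmless.
    cur.execute("CREATE DATABASE IF NOT EXISTS blenderpeople")
    cur.execute("USE blenderpeople")
    cur.execute("""
        CREATE TABLE IF NOT EXISTS actors (
            id INT NOT NULL AUTO_INCREMENT PRIMARY KEY,
            name VARCHAR(64) NOT NULL,
            team VARCHAR(16) NOT NULL,
            x FLOAT, y FLOAT, rot FLOAT
        )
    """)
    conn.commit()
    cur.close()
    conn.close()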

Once that’s done, the whole package will be one third finished. Another third is the script set that will link NLA info with the Action logs generated by THIS part of the script, creating the character animation to go along with the keyed motion. The final third of the project is GUI and documentation. Anyone who likes this and has done a GUI before, please contact me. I need your help.

This has progressed very well, harkyman. It looks like it will be a fantastic script once you have it done.

Good on you for doing this.

BgDM

Holy CRAP - so that movie is from a program instead of being animated? WOW. By the way, I’m a user interface developer, but strictly web-based interfaces up to this point.

But, I’d be willing to try.

hehheh… I’ve seen this stuff at blender.org, and like it.
the camera angle in this movie isn’t really showing what is happening… the bird’s-eye view you had earlier was much better… I think.

anyways, I’m interested to see where this is going, and what real use of the script will eventually look like…

.b

I’ve got no idea what this is… but it looks interesting enough for me to check back later :wink:

I followed your project on blender.org. superb!

for real characters with rigs you could do something like this:

a gui to create “soldier groups”. first of all very basic, like blue/red. or advanced like blue-archery, blue-infantry, etc.

then add several actions to a group, like “walking”, “attacking”, “standing around”, and have the script randomly select one of x attacking actions.

maybe you already thought of it all. if you need support (except python :wink: ) please ask me.

The work you are doing for the Blender community is fantastic. With scripts like yours, Blender is going to be elevated to professional status in no time. Great work; I can’t wait to use this script.

Keep it up.

Ken

Man, I’d say in a year your prog will start to look like the LOTR battle scenes… so far it looks great. And like basse said, the bird’s-eye view was better, or at least a view showing more going on. GJ so far.

Very good progress. I always look forward to seeing your updates on this.

Something I was wondering: is the action randomly generated at render time? If so, you can never re-render a scene to refine a shot. Perhaps the battle generation should be output to a file or database in a first pass, and read into memory by a Blender plug-in. You would be able to refine your camera moves to fly around the most interesting sections of the battle, refine a few “hero” moves, and preview in a Blender 3D view - things not possible if the battle is built on the fly each render.

The generated motion is stored in IPOs. Once it’s generated, you can do whatever you want with it… add actors that are keyframed by hand, remove actors that do really stupid things, etc. So, you can test your lighting, find good camera angles and do anything once the motion is recorded. And as I’ve said motion generation is pretty fast. Don’t like the way things played out this time? Duplicate your Blender file in case you change your mind, then run it again. If you’ll be changing camera angles, you could use snippets from different sim results.

Once you have good motion, then the second stage of the script suite kicks in (which is not written yet). You replace your dummy objects with character rigs, which will follow the IPOs. It uses an “action log” that is generated in the motion stage (it’s actually a MySQL database, with a full log for each actor) to assign appropriate NLA information to character rigs, based on what they are doing and how fast they are doing it.

Currently, there is no NLA access through the Python API. I fixed the IPO recording bug in the cvs for this project, and if I have to, I’ll write the NLA module myself. Hopefully someone else will get to it before me, though.
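To give a sense of how the second pass might consume that action log, here is a hypothetical Python sketch that pulls one actor’s log rows out of MySQL and maps each logged state to the name of an NLA Action. The table name, columns and Action names are invented for the example; the real log format may well differ.

    import MySQLdb

    def load_action_log(actor_id):
        # One row per sim turn for this actor (hypothetical schema).
        conn = MySQLdb.connect(host="localhost", user="blender",
                               passwd="secret", db="blenderpeople")
        cur = conn.cursor()
        cur.execute("SELECT frame, action, speed FROM action_log "
                    "WHERE actor_id = %s ORDER BY frame", (actor_id,))
        rows = cur.fetchall()
        cur.close()
        conn.close()
        return rows

    def pick_nla_action(action, speed):
        # Map a logged state to the name of an Action on the character rig.
        if action == "fight":
            return "Attack"
        if action == "die":
            return "Die"
        # Use the logged speed to choose between walk and run cycles.
        if speed > 2.0:
            return "Run"
        return "Walk"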

Cheers!

I’m feeling this! Whoa yeah. Great stuff Harkyman!

I never thought that this script would progress so well since seeing the original post.

Fantastic script; the movement of the actors is definitely more realistic than it was. Keep up the good work.
I agree with basse, I preferred the bird’s-eye view of the scene.

Ian

LOL, cool new animation… though the camera makes me seasick… maybe animate its motion differently?

Ok sorry it took me so long to register.

I like the newest test, Harkyman. Your code is very clear (in dbBotsBoxBounding.blend). Is there anything I can do to help? Also, we should probably try to get some people to make some rigged medieval warriors. Like someone else said, a contest might work.

Here’s the thing about the rigged warriors: there is no Python access to the NLA. I’m going to have to write it myself. This might take a couple of weeks. I’m going to contact Hos, who did some major work on the NLA and Actions code, to see if he’ll give me a synopsis of how the code works. Once that’s done, though, first tests ought to be done on hoverbots, getting the NLA stuff to work with auxiliary motion before tackling walk cycles.

I only have a couple more things to do on this stage of the project before I build a GUI. I’ve been prepping the code for that this morning. Mr_Rob, have you used MySQL before? What platform are you on? I know this was your idea from the beginning, and if you’d like, I can let you run remote tests once there’s a GUI, to get user issues with the MySQL integration hammered out.

New addition as of this morning: each actor is randomly assigned a “turn”, which is an integer slot within the resolution of the sim. If you have the sim run once every twelve frames, then you have 12 “turn” slots. Keys are then generated with offsets corresponding to the turn slot the actor occupies. This gives a much better overall look, as Actors appear to move, change orders, etc. as individuals rather than en masse. The order in which actors are now evaluated is based on their turn slots and relative speeds; previously, it was based on creation order.
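As a minimal sketch of the turn-slot idea in plain Python, assuming keys land every RESOLUTION real frames; the class and field names here are mine for illustration and are not the actual BlenderPeople code:

    import random

    RESOLUTION = 12  # sim evaluated once every 12 real frames -> 12 turn slots

    class Actor:
        def __init__(self, name, speed):
            self.name = name
            self.speed = speed
            # Random slot in 0..RESOLUTION-1, assigned once at creation.
            self.turn = random.randrange(RESOLUTION)

        def key_frame(self, sim_step):
            # Real frame where this actor's key for a given sim step lands.
            return sim_step * RESOLUTION + self.turn

    def evaluation_order(actors):
        # Evaluate by turn slot, then by speed (fastest first), not creation order.
        return sorted(actors, key=lambda a: (a.turn, -a.speed))

    actors = [Actor("red_01", 3.0), Actor("blue_07", 2.5), Actor("red_02", 4.0)]
    for a in evaluation_order(actors):
        print("%s keys at frame %d" % (a.name, a.key_frame(1)))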

great great great. just a general question:

Aside from Massive, the original program, are there other battle simulators on the market? Some Max or Maya plug-in we’re not aware of…

Anyway, I can help by doing benchmarks and tests on your code; I’m studying this now, so it may be useful.

gogogogo

I’ve never used MySQL except on a web server. There’s a large community for it, so it shouldn’t be too much of a problem to install and get configured. I run Windows XP Home. I’ll do any tests if you need me to.

Hey harkyman
This is very impressive stuff.
I agree with the others about the camera; maybe you should attach it to some actor that gets a defensive command to stay a bit at the back (the war reporter :))
Apart from that, I wanted to ask: after one actor beats another, what happens to the one that loses the fight (right now they disappear)? Could that be replaced with a wounded/dying animation?
I will keep following, and when modeling of real characters is needed I would really like to help.
Really like the way this system works!

Cheers (and keep up the good work)

Eyal

The scripts run in two passes. Pass one (the one I’m working on now, and the one you are seeing) generates the gross character motions - translations and rotations across the stage. It also writes a log of what action each character is engaged in, including dying.

Once you’re happy with the overall look of the scene, you move on to the next step. The second pass uses the action log, the IPO curves and the NLA system to generate the character animation, including getting hit and biting the dust.

Oohh! I overlooked this thread before!

Great stuff!

Stefano