I am particularly impressed with the powerful features available in the Game Engine Realtime buttons, and an excellent example of their flexibility is Zero Consequences, the little-seen FPS that seemingly uses only logic bricks.
Anyway, I was wondering whether it would be possible to design a node-like system of AI logic.
For instance, an enemy (represented as a block labelled “Enemy”) sees the player and could either fight or flee. The Enemy block is visually linked to the Player block by a line, and by double-clicking the Enemy you could specify attack patterns, animations, sound effects, death sequences and so on, using button-style programming.
The interface could be laid out so that the designer defines reactions methodically, in the proper order, leading to less confusion.
This could be backed up with Python scripting, and whole ‘webs’ of AI could be represented… What do you think?
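Just to make the idea concrete, here is a rough Python sketch of what one of those AI ‘webs’ might look like under the hood. All the names here (`AINode`, `Enemy`, the fight/flee conditions) are made up for illustration and not part of any real Blender API:

```python
# Hypothetical sketch of a node-style AI web: each node is a state
# with outgoing links, like blocks joined by lines in the editor.

class AINode:
    """One block in the AI web: a named state with conditional links."""
    def __init__(self, name, action):
        self.name = name
        self.action = action      # callable run while in this state
        self.links = []           # (condition, target_node) pairs

    def link(self, condition, target):
        self.links.append((condition, target))

    def step(self, enemy):
        self.action(enemy)
        # follow the first link whose condition holds, else stay put
        for condition, target in self.links:
            if condition(enemy):
                return target
        return self


class Enemy:
    def __init__(self, health, sees_player):
        self.health = health
        self.sees_player = sees_player
        self.log = []             # records which actions ran


# Build a tiny web: Idle -> Attack or Flee, depending on health.
idle   = AINode("Idle",   lambda e: e.log.append("idle"))
attack = AINode("Attack", lambda e: e.log.append("attack"))
flee   = AINode("Flee",   lambda e: e.log.append("flee"))

idle.link(lambda e: e.sees_player and e.health > 50, attack)
idle.link(lambda e: e.sees_player and e.health <= 50, flee)
attack.link(lambda e: e.health <= 50, flee)

# Run a few game ticks.
enemy = Enemy(health=60, sees_player=True)
state = idle
for _ in range(3):
    state = state.step(enemy)
print(enemy.log)  # → ['idle', 'attack', 'attack']
```

The double-click editor you describe would essentially be a front end for wiring up these nodes and links graphically, with Python stepping in for anything the buttons can't express.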