Message sensor bug

Hi, I came across an interesting problem while playing with many messages flying around between my objects. It turned out that messages can easily get lost if you don’t set the message sensor to pulse mode with 0 delay.
I got this problem with a complicated blend, so I don't know if I can reproduce it with a simple one, but in any case the problem disappeared when I enabled pulse mode, and it didn't cost any extra CPU.

Why are you using bricks to send messages between objects?

Just get the object from a python script and modify properties/actions directly.
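
Here is a rough sketch of what I mean, assuming the old GameLogic API and an object named "Bot" with a game property "alert" (both names are just examples, not anything from your blend):

import GameLogic

scene = GameLogic.getCurrentScene()
objects = scene.getObjectList()      # old-style list, keyed by "OB" + object name
bot = objects["OBBot"]
bot.alert = 1                        # set the game property directly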

Social, unfortunately not everyone knows Python, and in my case I don't have enough time to learn it yet.

That’s not true. I know some Python.

Logic bricks are useful sometimes: in this case I needed to get the result of ray sensors attached to unrelated objects; sending a message seems to be the only way, since I'm not aware of any method to get hold of a sensor that is not attached to the current controller.

However, once I patch the BGE to allow ray casting from anywhere to anywhere, I certainly won't use messages.

You can also have a ray sensor toggle a boolean prop on each object. Then you just access the individual props from the main script.
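
Something like this, for example, assuming each node carries a boolean property "seesPlayer" toggled by its ray sensor and that the node objects are named "node.001", "node.002", etc. (those names and the property are just placeholders):

import GameLogic

scene = GameLogic.getCurrentScene()
watchers = []
for obj in scene.getObjectList():
    # old-style names are prefixed with "OB"; only nodes carry "seesPlayer"
    if obj.getName().startswith("OBnode") and getattr(obj, "seesPlayer", 0):
        watchers.append(obj)
# "watchers" now holds every node whose ray sensor currently sees the player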

Of course, it all depends on what the most convenient method for your purposes is, but in the end “bugged is bugged”, so I would advise that you refrain from using message bricks as your main form of intercommunication.

Anyway, if you can give me some more details as to what you want to do (exactly), I can probably suggest a better workaround.

OK, here is the thing (sorry if it is a bit long): it's part of the enhancements I'm bringing to your FPS template btw. See the current status here:
http://blenderartists.org/forum/showthread.php?t=114242
In the next version, I want to give the bots the ability to search for the player even if he’s hiding.
The idea is as follows: the designer of the room puts nodes at strategic places so that walk paths can be established between the nodes and all areas of the room are covered. Currently, these nodes are transparent, collision-free objects that are just there to help the bots walk through the room. When the bots detect the presence of the player (either by seeing him or hearing him shooting), they try to shoot him. But what if they can't see him?
My plan is to use the nodes to spy on the player: the nodes that can see him will tell the bots (that's where I'm using the messages: for the question and answer). The bot will go towards the closest node that can also see the player and hopefully, it will also see the player. I already have a version that works like that, but it's far from satisfactory:

  • it requires a trackTo actuator and a ray sensor on every node; a huge waste of CPU, as I only need to know once in a while which node can see the player
  • the bot will run to the closest node, but what if this node is not directly reachable because of the complexity of the room?
    It is important to note here that I want the AI of the bot to be 100% generic: no hardcoded knowledge of any kind.

Actually I know where I want to go:

  • export the RayCast class to Python to allow ray casting from anywhere to anywhere without using a ray sensor (unfortunately it requires a custom Blender build, but hopefully it can be integrated into the next Blender release)
  • have the nodes register themselves in a global dictionary at startup so that the bots know immediately which nodes are in the room. The nodes will also register the geometry of the room as an NxN matrix of which nodes can see which other nodes (see the sketch after this list). I'm still thinking of the best way to do that. Apart from this initial activity, the nodes will be 100% passive objects.
  • when the bots must search for the player, they will use the ray casting facility to find 1) all the nodes they can see directly from where they are and 2) the nodes that can see the player directly. With this information and the geometry of the room, they can build an optimized path to reach the player.
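
To make the last two points more concrete, here is a minimal sketch of the registry and path search, assuming the old GameLogic API and a hypothetical can_see(a, b) helper standing in for the anywhere-to-anywhere ray cast (none of this is final, it's just how I picture it for now):

import GameLogic

def register_node(cont):
    # run once per node at startup: add the node to a global registry
    own = cont.getOwner()
    if not hasattr(GameLogic, "nodes"):
        GameLogic.nodes = []          # all node objects in the room
        GameLogic.visibility = {}     # (i, j) -> 1 if node i can see node j
    GameLogic.nodes.append(own)

def build_visibility(can_see):
    # build the NxN matrix once every node is registered;
    # can_see(a, b) is the (hypothetical) anywhere-to-anywhere ray cast
    nodes = GameLogic.nodes
    for i in range(len(nodes)):
        for j in range(len(nodes)):
            GameLogic.visibility[(i, j)] = (i == j) or can_see(nodes[i], nodes[j])

def find_path(start, goal):
    # breadth-first search over the visibility graph; returns node indices
    frontier = [[start]]
    visited = {start: 1}
    while frontier:
        path = frontier.pop(0)
        last = path[-1]
        if last == goal:
            return path
        for j in range(len(GameLogic.nodes)):
            if GameLogic.visibility.get((last, j)) and j not in visited:
                visited[j] = 1
                frontier.append(path + [j])
    return None  # the goal node is not reachable through the node graph

The bot would then simply walk from node to node along the returned path, which keeps its AI fully generic.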

The result should be a pretty smart bot that will hunt you wherever you’re hiding.

Of course, if there is an efficient way to do that without making a custom Blender build, that’s much better.

How do I send a message to an object from a pygame sounds script with a single actuator? I tried this:

def msg_fire_sound(sound_msg): #-use fire(name of weapon) STRING
    msg_fire.setObject(sound_msg)
    gl.addActiveActuator(msg_fire, 1)

and:

msg_fire_sound('fire_rifle')

I use the message actuator to talk between (over/under)lay scenes in real time; that can't (?) be done with Python alone (without using global variables and polling them constantly).

Nevermind, it’s setSubject().
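
For the record, the fixed snippet would look something like this (assuming gl is GameLogic and msg_fire is the message actuator, as in the original):

def msg_fire_sound(sound_msg): #-use fire(name of weapon) STRING
    msg_fire.setSubject(sound_msg)   # setSubject(), not setObject()
    gl.addActiveActuator(msg_fire, 1)

msg_fire_sound('fire_rifle')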