Good point @Monster. While thinking about this I was looking at event managers, and not thinking about sensors that are local to a certain object and have their own settings.
Global events are simple and I covered those, but local events are not.
Can't think of a way to do this without using logic bricks, since the events are based on logic brick settings (ray cast direction, for example). As well as having a real logic brick, you'd need the sensor->controller connection so the BGE's logic manager would know the sensor was active and evaluate it.
There is a command on the Python controller that allows you to register a Python function that gets called directly when a sensor is true. But this seems a bit clunky, and not that different from having a sensor call a Python script directly.
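To show the pattern I mean, here is a plain-Python sketch of "register a function, have it called when the sensor fires". Every name in it (Sensor, register, evaluate) is made up for illustration; this is not the real BGE API.

```python
class Sensor:
    """Stand-in for a BGE sensor, just to illustrate the callback pattern."""

    def __init__(self, name):
        self.name = name
        self.positive = False  # True while the sensor is triggered
        self._callbacks = []

    def register(self, func):
        # comparable to registering a function on the python controller
        self._callbacks.append(func)

    def evaluate(self):
        # the logic manager would do this each frame the sensor is active
        if self.positive:
            for func in self._callbacks:
                func(self)


def on_mouse(sensor):
    print("sensor fired:", sensor.name)

mouse = Sensor("mouse_left")
mouse.register(on_mouse)
mouse.positive = True
mouse.evaluate()  # calls on_mouse
```

The clunky part is visible here too: the user still has to know about the sensor object to register against it, which is barely different from pointing the sensor at a script.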
Hints on how this could be made less crap are welcome.
Take 2… this is an imaginary API, but maybe not SO far off being possible (the subclassing stuff is already there):
    def onMouseDown(self, event, buttons=[0, 2]):
        print("which button was pressed?", event.button)
        self.loc *= 0.1

    def onRayCast(self, event, rays=((0, 1, 0), (0, 0, -1))):
        print("which ray was hit?", event.ray)

    def onKeyDown(self, event, keys=['UP', 'DOWN', 'LEFT', 'RIGHT']):
        print("which key was pressed?", event.key)

    def onCollision(self, event, properties=['ground', 'kill']):
        sensor = event.sensor
        for ob in sensor.hitObjects:
            print("collided with", ob)
For this to work the class would be inspected, and internally sensors would be added to match it, but users would not need to make the logic bricks themselves.

One thing that is odd about this is that each class method could have to map to multiple sensors in the BGE. Having the sensor settings as keyword arguments on the method is also a little odd.

Oddness aside, I really like having one class that describes all the logic for an object, without mixing it with logic bricks.
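The inspection step is already doable in plain Python today. A rough sketch, using the stdlib inspect module, of how the engine could pull the sensor settings out of the on* methods' keyword defaults (PlayerLogic and the resulting spec dict are made up for the example; nothing here touches the real bge module):

```python
import inspect

def collect_sensor_specs(cls):
    """Map each on* handler name to the sensor settings in its keyword defaults."""
    specs = {}
    for name, func in inspect.getmembers(cls, inspect.isfunction):
        if not name.startswith("on"):
            continue
        # keyword defaults (keys=..., rays=...) become the sensor settings
        settings = {
            pname: param.default
            for pname, param in inspect.signature(func).parameters.items()
            if param.default is not inspect.Parameter.empty
        }
        specs[name] = settings
    return specs


class PlayerLogic:  # hypothetical user class in the imaginary API
    def onKeyDown(self, event, keys=['UP', 'DOWN', 'LEFT', 'RIGHT']):
        pass

    def onCollision(self, event, properties=['ground', 'kill']):
        pass

print(collect_sensor_specs(PlayerLogic))
# {'onCollision': {'properties': ['ground', 'kill']},
#  'onKeyDown': {'keys': ['UP', 'DOWN', 'LEFT', 'RIGHT']}}
```

From a spec dict like that, the engine could internally create one or more sensors per method and wire them up, which is exactly the "multiple sensors per function" oddness mentioned above.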