Receive MIDI messages in a UPBGE Python script?

Is it possible to get real-time MIDI events from several MIDI sources (sequencer/DAW, MIDI keyboard, MIDI controller) into a Python script in the game engine?

My (not so deep) knowledge of Blender and the GE is 10 years old, and there is now a new game engine.
So I would appreciate feedback from other Blender users on whether the above is possible in the current Blender / game engine, and possibly some hints for the implementation.

My assumptions are:

  • the (new) game engine is needed for the real-time nature of this data visualization,
  • a Python script in the GE receives MIDI messages from 3 MIDI sources and triggers appropriate Blender actions in real time,
  • if MIDI input is not possible, I can write an external program (Python 3 + the mido module + OSC?) to filter and convert MIDI events into another type of communication channel (network port, pipe)
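That external-bridge fallback could look roughly like this: a pure mapping from MIDI messages to OSC address/argument pairs, plus a small forwarding loop. This is only a sketch under assumptions: it presumes mido (with python-rtmidi) and oscpy are pip-installed, and the OSC addresses, host, and port are made up for illustration.

```python
# Sketch of an external MIDI -> OSC bridge (assumptions: mido and oscpy
# are installed; the /midi/* addresses and port 8000 are invented here).

def midi_to_osc(msg):
    """Map a mido-style message dict to an (address, args) OSC pair.

    `msg` is a plain dict (as returned by mido's Message.dict()) so the
    mapping stays testable without mido installed.
    """
    if msg['type'] in ('note_on', 'note_off'):
        return (b'/midi/note', [msg['channel'], msg['note'], msg['velocity']])
    if msg['type'] == 'control_change':
        return (b'/midi/cc', [msg['channel'], msg['control'], msg['value']])
    return (b'/midi/other', [])


def run_bridge():
    # The real I/O lives here, so the mapping above has no dependencies.
    import mido                         # pip install mido python-rtmidi
    from oscpy.client import OSCClient  # pip install oscpy

    client = OSCClient('127.0.0.1', 8000)   # hypothetical game-side port
    with mido.open_input() as port:         # default ALSA/JACK MIDI input
        for m in port:                      # blocks until a message arrives
            address, args = midi_to_osc(m.dict())
            client.send_message(address, args)

# run_bridge() is not called here; it would be the entry point of the
# standalone filter/converter program described above.
```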

I am using Linux (Ubuntu Studio 20.04) with the ALSA/JACK audio/MIDI layer on an Intel i5 or i8 system, the latest stable Blender installed via snap, a USB MIDI keyboard and a controller for real-time MIDI input, and MuseScore or Reaper as the sequencer for real-time MIDI ‘reference data’.
If the above cannot be done in Blender, I will program it in Python with Qt5 and the mido module.

Thanks in advance,

Some time ago I was making a stage-lighting visualization tool in UPBGE. It was meant to receive MIDI messages and trigger logic based on them, but I abandoned it halfway through. You can download it below. It requires the mido Python module installed in your Blender in order to work.

The important part is in the scripts/ file, where it creates a new thread that watches for new MIDI events. The thread is needed because the watcher runs a while loop, which would freeze the game engine if it were not in a separate thread.
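The threaded pattern described above can be sketched like this: a background thread blocks on the MIDI port and pushes messages into a thread-safe queue, and the per-frame game script drains the queue without blocking. This is a generic sketch, not the linked project's actual code; the function names are mine, and it assumes mido is installed in Blender's Python.

```python
# Minimal sketch of "MIDI watcher in a thread, drained per frame"
# (assumption: mido is pip-installed; names here are invented).

import queue
import threading

events = queue.Queue()            # thread-safe handoff to the game loop

def drain(q, limit=64):
    """Pull everything currently queued without blocking the frame."""
    out = []
    while len(out) < limit:
        try:
            out.append(q.get_nowait())
        except queue.Empty:
            break
    return out

def midi_watcher(q):
    """Runs in its own thread; this loop would freeze UPBGE if it ran
    on the main thread."""
    import mido                   # pip install mido python-rtmidi
    with mido.open_input() as port:
        for msg in port:          # blocks, but only blocks this thread
            q.put(msg)

def start_watcher():
    # daemon=True so the thread dies with the engine instead of hanging it
    threading.Thread(target=midi_watcher, args=(events,), daemon=True).start()

# In the controller script run each frame:
#   for msg in drain(events):
#       ...trigger logic based on msg...
```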

I hope it helps. (1.3 MB)

I often use OSC to communicate between a Python script and the BGE.
I use oscpy from the Kivy project:

I wrote a tutorial:
Hope it helps


Thanks a lot serge3576,
your documentation makes it very clear how to integrate the code into the Blender environment.
Also thanks for the warning about the “the port is in use” error; that is very useful to know. Up to now I have only used oscpy to send simple commands, so I never ran into this problem.

Hey, I followed the guide in this thread and it’s almost working. The problem I’m having is that UPBGE seems to be blocking the OSC messages, and the rate is very slow.
The ‘always’ function prints rapidly, but the actual data received only changes every second or so.

To test this, you can put a print("test") in your on_action function: the rate at which Blender prints “test” is very slow while in-game, but ramps up when you exit (the oscpy server stays on). (I am constantly sending OSC messages to the server, around 60 times a second.)

I’ve stripped the code back to just a receiving server doing nothing but printing, to debug (no variables, no ‘always’ script, etc.), but it remains slow.
I’ve tried both nodes/logic bricks (as per the tutorial) and converting the code to the component system, and the result is the same.
I’ve done some research on Python threading, but I was under the impression that oscpy was already threaded? Sorry, I’m not much of a programmer.

I am recreating a small part of the functionality of the AddRoutes add-on to get FaceCap working in UPBGE (sending the OSC data to shape keys + controlling some transforms).

Sorry to resurface an older thread, but I was wondering whether someone has encountered this or has a solution?

The “always” sensor must pulse every frame, and you must send OSC messages at a rate below 60 per second.
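The sending-side advice above can be enforced with a small throttle. This is just a sketch: the 30 Hz figure, the class name, and the injectable clock are my own choices, not from the tutorial.

```python
# One way to cap the OSC send rate below the engine's 60 Hz, as
# suggested above (a sketch; names and the 30 Hz target are mine).

import time

class Throttle:
    """ready() returns True at most `rate` times per second."""
    def __init__(self, rate, clock=time.monotonic):
        self.interval = 1.0 / rate
        self.clock = clock                 # injectable for testing
        self.last = -self.interval         # allow the first call at once

    def ready(self):
        now = self.clock()
        if now - self.last >= self.interval:
            self.last = now
            return True
        return False

# Usage in the sender loop:
#   throttle = Throttle(30)               # stay well under 60 msgs/s
#   if throttle.ready():
#       client.send_message(b'/HR', values)
```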
I worked today on skeleton animation with an Intel RealSense camera, and it works
fine, without latency. I send one big message with all the data from the camera to the BGE,
at around 30 FPS.
See below, but it is a work in progress and a bit complicated.

If I send a lot of small messages (one message = one keypoint), I get a big latency,
so I create a single message with all the keypoint coordinates.

oscpy accepts only a flat list, not a list of lists, so I build a single list of integers.
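The flattening trick described above is easy to sketch: pack the (x, y, z) keypoints into one flat list of integers for the single big message, and rebuild the tuples on the receiving side. The function names are mine, not taken from serge3576's project.

```python
# Sketch of "one big flat message" for oscpy, which accepts a flat
# list of arguments but not nested lists (names here are invented).

def flatten(keypoints):
    """[(x, y, z), ...] -> [x0, y0, z0, x1, y1, z1, ...]"""
    return [int(c) for point in keypoints for c in point]

def unflatten(values, stride=3):
    """Inverse of flatten(): rebuild the list of (x, y, z) tuples."""
    return [tuple(values[i:i + stride]) for i in range(0, len(values), stride)]

# Sender side, one message per camera frame instead of one per keypoint:
#   client.send_message(b'/keypoints', flatten(points))
```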

Hmm, if that’s the case I must be doing something wrong. Here’s a video of how it’s behaving at the moment:

As you can see at the end of the video, I do get the values I should be receiving, in real time. I can receive them at the correct rate, but only when the engine is not running.

I will keep researching. Thanks for the help.

The way it keeps running after you stop the engine shows that it’s threaded, which means it runs independently of the framerate anyway. If I had to guess, I’d say you’re reinitialising it every frame rather than just reading the values? Hard to say without seeing the .blend.
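The "initialise once, read every frame" pattern suggested above can be enforced with a module-level guard. This is a generic sketch, not the poster's actual setup: `factory` stands in for whatever creates your oscpy server.

```python
# Guard against reinitialising the OSC server on every 'always' pulse
# (a sketch; `factory` is a placeholder for your own server setup).

_server = None

def get_server(factory):
    """Create the server on the first call only; later calls (e.g. the
    'always' sensor firing every frame) just return the same instance."""
    global _server
    if _server is None:
        _server = factory()
    return _server

# In the controller script run each frame:
#   server = get_server(make_server)   # created once, then only read
```

In UPBGE you could equally stash the instance in `bge.logic.globalDict`, which survives between script runs; the module-level global shown here is the dependency-free version of the same idea.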

Yes, oscpy is threaded: see the oscpy source code

and the problem of latency with OSC is as old as the world!

In Blender, do you have a callback for every tag, or a default callback?

A good solution is to send and receive a bundle, but it isn’t described in the documentation.
You must look at the comments in the oscpy source code!
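A bundle-based sender could look roughly like this. Hedged heavily: as noted above, bundles are only documented in oscpy's source comments, so check `OSCClient.send_bundle` there for the exact signature; the `/HR` and `/BS` addresses and the helper names below are my own illustration.

```python
# Sketch of packing several addresses into one bundle per frame
# (assumption: oscpy's OSCClient.send_bundle takes a list of
# (address, values) pairs; verify against the oscpy source).

def build_bundle(head_rotation, blendshapes):
    """Pack one frame's worth of data into a single list of
    (address, values) pairs, ready for send_bundle()."""
    return [
        (b'/HR', list(head_rotation)),
        (b'/BS', list(blendshapes)),
    ]

def send_frame(client, head_rotation, blendshapes):
    # One packet per frame instead of many small messages.
    client.send_bundle(build_bundle(head_rotation, blendshapes))

# Hypothetical usage:
#   from oscpy.client import OSCClient
#   client = OSCClient('127.0.0.1', 8000)
#   send_frame(client, (0.1, 0.2, 0.3), [0.0] * 52)
```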

If you are a programmer, the best solution to communicate with the BGE is Twisted and JSON!

I wrote this tutorial a long, long time ago:

OSC Latency


I’m just doing one callback for one address (/HR, which is FaceCap’s head-rotation data and returns 3 values). There are more addresses being sent, but I’m not doing anything with those yet.

After doing a few more tests, yes, it does seem to run faster if I also send a bundle back.
It’s still not full speed, and I noticed that if I stop my iPad while in-game, Blender still prints values for a while afterwards, trying to catch up, whereas when I stop my iPad while the engine isn’t running, the printing stops immediately.
I find this behaviour really weird: I can receive the OSC messages correctly and at full speed when the engine is not running, but they slow down when it is running.

I might send over a .blend later, but I’ll do some more research first.

I guess this is a bigger can of worms than I initially thought. I’ll take a look at that latency tutorial.
Thanks for the help!