Hello, I am pleased to publicly release my second Add-on. This Add-on allows you to bind properties to OSC events in real time in the viewport.

Update 22 Oct 2015: The new version 0.14 is now as easy to install as a normal Blender Add-on. Just grab the zip file, which contains everything.

See my page for all the information:

Update 28 Jan 2020: I am working on a new project for Blender 2.8 with OSC support here:


Great that you made an OSC version as well! I’m waiting for the ‘easier’ versions of both plugins (and some time where I can integrate it into my workflow!), but I’m really excited about this.

Good luck @JPfeP,

I will wait for your good version, then.

Thank you so much for sharing this addon!

I downloaded the ‘python-osc-1.5’ source archive, unpacked it, and copied the ‘pythonosc’ folder from it into my Blender user configuration directory under ‘scripts’ -> ‘modules’,
so the full path in my case on OS X is ‘/Users/username/Library/Application Support/Blender/2.76/scripts/modules/pythonosc’.
Your addon is placed in ‘/Users/username/Library/Application Support/Blender/2.76/scripts/addons/’.

I’m happy to say that the addon works for me without the need for the full Python install described on your web page :slight_smile:

Hi, I am going to release the new version (easy to install) very soon now. Initially I wanted to bundle the BGE framework with the next release, but I will do a separate release for that.

Anyway, you can already do what iondev12 did: download and copy the directory “pythonosc” yourself into any place listed by “import os ; os.sys.path” (in the Blender Python console), and the add-on should find the module and work as expected.
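To check whether the copy worked, a small sketch like this can be pasted into the Blender Python console (the helper itself is plain Python and runs in any interpreter):

```python
# Sketch: verify that the "pythonosc" module is reachable from the
# current Python. Works in the Blender Python console or any interpreter.
import importlib.util
import sys

def module_available(name):
    """Return True if `name` can be imported from the current sys.path."""
    return importlib.util.find_spec(name) is not None

# List the places this Python searches for modules; the "pythonosc"
# folder must live in one of them:
for p in sys.path:
    print(p)

# After copying the folder into one of those directories,
# this should report True:
print(module_available("pythonosc"))
```

If it prints False, the folder is not in any of the listed directories.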

Hello, the new easy to install version is ready for general consumption. :wink:

The module python-osc is now bundled with the Add-on. Just grab the zip file and do “install from file” in Blender. That’s all.

Hi there,
is there a module to control the BG camera through your add-on? I want to build a real-time previz system with a Kinect2, based on the NextStagePro software, which can stream the camera position in real time, and I want to drive the BG camera in sync with a loaded film set. Possible?


Yes, it should be quite possible with the new BGE module I was working on. I have been kept busy by other things, but it’s mostly finished and just needs some testing. I will try to release it in the coming days (it will be bundled with the addon). In the meantime, do you know how the X, Y, Z coordinates are sent? I mean, are they bundled as a whole like “/camera x y z”, or are they sent as separate values like “/cameraX x”, “/cameraY y” and “/cameraZ z”? Do you know if there is some documentation about this? And by the way, will you also need the angle values?
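The two layouts in question can be shown as plain (address, arguments) pairs; the address names here are only hypothetical examples:

```python
# Sketch of the two possible OSC layouts, shown as plain
# (address, arguments) pairs. The address names are hypothetical.
x, y, z = 1.0, 2.0, 3.0

# Layout A: one message carrying all three coordinates as a list.
bundled = ("/camera", [x, y, z])

# Layout B: three messages, one coordinate each.
separate = [("/cameraX", [x]), ("/cameraY", [y]), ("/cameraZ", [z])]

print(bundled)
print(separate)
```

An add-on that maps one OSC address to one property handles layout B directly; layout A needs list support or an intermediate splitter.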

I have sent him an email asking about the data format and explaining what I intend to do: stream the data live into Blender and sync the viewport camera. I actually don’t need the Blender game engine, just a model with baked textures; this is extremely fast, since nothing needs to render, you just move the camera around. The only thing he mentioned is that the Kinect only streams at 30 fps, so if I shoot in 24p I need a way to scale the data by 0.8 to run at 24 instead of 30 fps. Is there a way to do that in Python? If I record the data and import it into Blender this is not a problem, Blender will scale the data (Collada file) to 24p automatically, but streaming is always at 30 fps.
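The 0.8 factor mentioned above is just 24/30, so the remapping itself is a one-line calculation; a minimal sketch:

```python
# Sketch: remap frame numbers from a 30 fps stream onto a 24 fps
# timeline. 24/30 = 0.8, the scale factor mentioned above.
SOURCE_FPS = 30.0
TARGET_FPS = 24.0
SCALE = TARGET_FPS / SOURCE_FPS  # 0.8

def remap_frame(frame_30fps):
    """Map a frame index of the 30 fps stream to the 24 fps timeline."""
    return frame_30fps * SCALE

# One second of captured data (frame 30) lands on frame 24
# of the 24p timeline:
print(remap_frame(30))
```

A live receiver would apply the same factor to incoming timestamps before setting the scene frame.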

If you want to receive the stream in real time in the viewport to move the camera, you don’t need the BGE module; the Add-on is the way to go. The Add-on can handle a refresh rate as low as 1 ms. That’s 1000 FPS, though I doubt the display can follow. :slight_smile: For reference, at 30 FPS a frame lasts 33 ms or so, and a refresh rate of 10 ms would do fine if you want to capture the stream. But the major problem is that the Add-on cannot yet cope with several values sent at once in one OSC message, and it seems that NextStagePro does just that. Their FAQ says: “Each tracked frame is sent as a single string containing the Kinect TimeStamp, the Kinect’s Position, it’s Rotation in quaternion values followed by the rotation in euler values.” So for now you would need an intermediate piece of software to split the stream into several messages, then send them to Blender.
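The intermediate splitter could be a small script of this shape. The field layout (timestamp, position, quaternion, then euler rotation) follows the FAQ quote above, but the exact string format and the output addresses here are assumptions; re-sending each pair as its own OSC message is left to whatever OSC client you use:

```python
# Sketch of an intermediate "splitter": parse a single NextStagePro-style
# string into separate per-value pairs that a one-address-per-property
# OSC binding can consume. Field order follows the FAQ quote; the exact
# string format and the "/camera/..." addresses are assumptions.
def split_tracking_string(payload):
    """Turn 'ts x y z qx qy qz qw rx ry rz' into (address, value) pairs."""
    fields = [float(v) for v in payload.split()]
    names = ["timestamp",
             "posX", "posY", "posZ",
             "quatX", "quatY", "quatZ", "quatW",
             "eulerX", "eulerY", "eulerZ"]
    return [("/camera/" + n, v) for n, v in zip(names, fields)]

msgs = split_tracking_string("0.5 1 2 3 0 0 0 1 10 20 30")
for address, value in msgs:
    print(address, value)  # each pair would be re-sent as one OSC message
```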

Yes, I got the same answer from him about the data format. Is there a solution to that problem, maybe via Unity with UniOSC?


Hi, JPfeP
First, thank you for the add-on! It makes listening to OSC messages in Blender so easy.
I’m working on a project with a visual artist who wants some simple shapes to move and deform based on the OSC messages from an Emotiv Epoc. I would like to know whether I can listen to OSC messages in the BGE, and how. Can you help me?
Thanks in advance

Hi, Akuattro. It seems that there is an OSC application for the Emotiv Epoc:

So, theoretically it should be possible to link them. However, if this application sends the various data packed together to the same OSC address, then AddOSC in its current form won’t be able to dispatch them to several Blender keys.

It is an annoying limitation that I wish to resolve. The BGE OSC support suffers from this too. That is partly why I haven’t released it yet, since it also supports only simple messages (one OSC message to one single BGE property), plus the fact that I had a very demanding freelance job. Now that job is over, and it’s probably time to get back to these personal projects. :slight_smile:


At least I have found the time to finish the OSC module for the BGE. Please see this page, there is even a .blend file to test it with your smartphone:

It seems to work as expected here, but it is a first release. The great news is that it can work with complex messages like lists of coordinates. I am now going to see how to do the same with the normal Add-on in the Blender UI, to address the handling of existing devices, as some of you asked.

Just grab the zip on github as usual.

I have been testing it for a few days in the normal Blender environment (not the BGE). So far it works great and I could easily send messages in and out. A few things, though:

  • multiple values in and out are not yet possible?
  • the pose bones do not work
  • I need the ability to bypass the Keying Sets and add or remove my own props more easily.

Hi, it is possible to bypass the Keying Set mechanism and import your own properties using a script of your own. I have to write some documentation to explain that. I will investigate the issue you report for the next release. The multiple-values support was already coded a few weeks ago; I need some time to finish it. I am currently on “holidays”, but I intend to work on it in my free time. Other news: before leaving for the south of France I made a smartphone application, well, a working prototype, to use AddOSC remotely.

Hi, back to Paris very briefly.

There is a new version on github fixing the problem for the bones (and for some other properties with a long path).

There is also preliminary support for list messages. It works only in receiving mode right now. There is a new parameter “Index” for each prop, visible in monitor mode, to play with.

If a message “/blender 1 2 3” is received, the index lets you pick one of the 3 numbers (0 is the first). Since several properties can share the same address, you can then route the 3 numbers to the 3 properties of your choice.
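The routing described above boils down to something like this sketch, where the property names are purely illustrative:

```python
# Sketch of the "Index" routing: several bindings can share one OSC
# address, and each picks one element of the argument list.
# The property names are illustrative, not AddOSC internals.
def route(address, args, bindings):
    """bindings: list of (osc_address, index, property_name) tuples."""
    updates = {}
    for addr, index, prop in bindings:
        if addr == address and index < len(args):
            updates[prop] = args[index]
    return updates

bindings = [("/blender", 0, "location_x"),
            ("/blender", 1, "location_y"),
            ("/blender", 2, "location_z")]

print(route("/blender", [1, 2, 3], bindings))
# three values of one message fan out to three properties
```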

Note: right now, if an index is >0 there will be no sending from Blender, to avoid absurdities. But it shouldn’t be a problem for now, as list messages are mostly received in situations like using a Kinect or FaceOSC. However, I intend to support sending as well in the near future.

I haven’t had time to fix a long-standing issue where updates of the tool panel happen only when you move the mouse. I will try to address that next week, and maybe it will be an occasion to refactor the GUI a little bit, with more flexibility.

Meanwhile (I am off for another week), I will try to write the documentation of the API so that you can write scripts that use the addon without having to import a Keying Set, allowing sharable “drivers” for special OSC devices and various other tasks.

@JPfeP, hi. I’m working with Blender on a PC. I installed your addon, and it runs as in the video. I installed the TouchOSC app from Google Play (bought it) on my Android phone. So I launch Blender; in the AddOSC Settings > Listen on: ( which is my local IP. I also add a simple X translate (position) to be used as an Imported Key Set. All fine until now.
I installed OSCBridge on my computer, so it will send the port signal (in or out) to the OSC device (my Android phone).

But nothing works. I can’t send OSC commands to Blender (Windows 7 x64) from my phone.
What other step am I missing?

@sirdavid32: in AddOSC, the input IP is the one of the interface you listen on. If it is the IP of the ethernet card and you get the data from a wireless device, it won’t work. You have to find the IP bound to the WiFi card, maybe in the configuration panels of Windows. In the meantime you can use as the input IP; it will listen on all the interfaces. It is a little less safe, but unless you have a NAT route set in your internet box for the input port, it shouldn’t be a problem. Please tell me if you’re making progress. You shouldn’t need extra software; is OSCBridge intended for a special purpose in your case?

To all, I seize the opportunity to announce v0.16, already available on GitHub for a few days. It solves the refresh problem of the monitoring feature, as well as the one where an index >0 prevented the monitored value from being displayed.

Thanks for the reply @JPfeP. This is what I’m doing: (from your post I understand I need to know what the wireless IP of my cellphone is, so I looked it up, but that reveals the IP of my internet provider to me, not the phone’s wireless IP). OSCBridge uses the PC network and routes over WiFi to TouchOSC (the app) on my phone, so I thought I needed to install it.
Your video example is on a Mac; I’m running this on a PC.
Step 1: AddOSC Settings > OSC Settings > Listen on ( ; destination address > (ip.number.of.pc?)
Input port: 9001, output port: 9002. START.
Step 2: Drive some parameters via listen commands…

Help me out on this one please.

Also, I don’t know how to install this for Blender: