AddRoutes (MIDI, OSC + Blemote, a nascent Android app)

My post yesterday included this error message, but since I was multitasking I thought the error could have been on my part.

I downloaded it from GitHub and installed it, pointing Blender to the .zip file.

Yep, that’s because the GitHub version only contains sources and consequently needs some additional libraries (I updated the front page to explain that a bit better).

That’s why I provide a ready-to-go zipped version in the Download section of my own site. It includes some binaries required for the three main OSes. That’s the price to pay to talk MIDI.

BTW, a few days ago I quietly released version 0.28, which offers two new little features. One converts the new text file format exported by FaceCap (an iOS app for facial motion capture), the other converts a generic text file into keyframes (using the PureData qfile format). See my blog for more details.
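The text-file-to-keyframes idea can be sketched roughly like this. This is a hypothetical parser, assuming simple whitespace-separated "seconds value" lines ending in a semicolon; the actual qfile/FaceCap formats that AddRoutes reads may well differ:

```python
# Hypothetical sketch: parse a plain text file of "seconds value;" lines
# into (frame, value) keyframe tuples. The real FaceCap / qfile formats
# may differ; this only illustrates the general idea.

def parse_keyframes(text, fps=24.0):
    """Convert whitespace-separated 'seconds value' lines to (frame, value)."""
    keyframes = []
    for line in text.splitlines():
        line = line.strip().rstrip(";")  # tolerate qlist-style trailing ';'
        if not line or line.startswith("#"):
            continue                     # skip blanks and comments
        t, v = line.split()[:2]
        keyframes.append((round(float(t) * fps), float(v)))
    return keyframes

sample = "0.0 0.5;\n0.5 1.0;\n1.0 0.25;\n"
print(parse_keyframes(sample))  # [(0, 0.5), (12, 1.0), (24, 0.25)]
```

Each resulting tuple could then be fed to a keyframe-insertion call for the target property.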


Sounds really exciting. Giving it a spin tomorrow for sure. Really looking forward to using this in my main daily setup in Blender!


Hello to all, I just released v0.29:

It is mostly a maintenance release, as two important regressions were affecting OSC.


Thanks @JPfeP, AddRoutes is a really handy tool for shows. I actually use it for tracking boid particles, video-projected around dancers. It works really, really well…
One thing that could be cool: collapsing routes, to navigate quickly when you have a lot of them… it’s a little bit painful. And perhaps managing some kind of groups.
But again, thanks for it.
As an Art-Net user, it could also be cool to integrate an Art-Net DMX route type, with universes too.
I’ve got my own one, done in a more Pythonish way… so having it integrated in AddRoutes could be wonderful.

Thanks for your supportive comment.

Collapsing is easy to add, and is actually planned for the next version in a few days, I hope. Maybe in the middle of next week.

Adding other protocols is something I really want, in order to offer more options to users. I don’t know DMX very well, and I have to learn about the Art-Net DMX route type, but I think there are already some Python libs for DMX. I have to dive into that. It seems there are, however, some OSC bridges to convert DMX to OSC.

Do you think native support of DMX would still be really useful? Adding a DMX engine, if you can provide some information to help me, could be done in the next months.

Here is a post I started a few years ago. At first it was in the BGE, and it was my beginnings in Python, with embedded code and without universes. I used it without threads, because at that stage I didn’t know about them. Then, this year, one guy started building on it, creating a more advanced script/add-on around it. So I think this is a good starting point for the Art-Net protocol.
Art-Net is used worldwide for managing stage lighting… I hope you’ll find interesting information there.

I’ve commented my code to explain where the information is placed in the packet. So it’ll be a good start, I hope.
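For readers curious about the wire side of this: an ArtDmx packet is just a small UDP datagram. Here is a minimal sketch in Python of the field layout as I understand it from the public Art-Net 4 documentation (the surrounding add-on code is a separate matter):

```python
import struct

def artdmx_packet(universe, channels, sequence=0):
    """Build an ArtDmx packet carrying up to 512 DMX channel values."""
    data = bytes(channels)
    if not 0 < len(data) <= 512:
        raise ValueError("1..512 channel values required")
    if len(data) % 2:                      # data length must be even
        data += b"\x00"
    return (
        b"Art-Net\x00"                     # fixed 8-byte ID string
        + struct.pack("<H", 0x5000)        # OpCode OpDmx, little-endian
        + struct.pack(">H", 14)            # protocol version, big-endian
        + bytes([sequence, 0])             # sequence, physical input port
        + struct.pack("<H", universe)      # 15-bit port-address (SubUni + Net)
        + struct.pack(">H", len(data))     # data length, big-endian
        + data                             # DMX channel values, 0-255 each
    )

pkt = artdmx_packet(universe=0, channels=[255, 128, 0])
print(pkt[:8], len(pkt))  # b'Art-Net\x00' 22
```

A sender would typically push such packets over UDP to port 6454, broadcast or unicast, at the usual DMX refresh rate.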

I’m playing with AddRoutes now, but having some issues, mostly with understanding how it really works. I have been able to recreate the sample file, but if I want to have the animation on the Y or Z of position/rotation/scale, I can’t seem to do that; the data path always goes to the X position, rotation, or scale. This may be something fundamental I am missing about Blender, scripting, or data paths, but if anyone has an answer I would greatly appreciate it. Also, is there anywhere with an active discussion on using the plug-in, or is this the most active place?

I apologize for the lack of news but I was working hard on a new version that is at last ready:

It’s now possible to define some default settings for MIDI or OSC. That way the connection is active as soon as you open any project. This makes it easier, for instance, to use some MIDI devices to control the GUI via the system routes implemented recently.

@skantron: I intend to make some videos in the near future; I hope you have found your way since. I think this is the right place, so feel free to describe any problem you still have.

Hi, I’m trying to use Blender’s timeline to control other software via OSC.

This add-on is pretty cool but I was struggling to get data out.

Turns out that the playhead has to move for OSC to send anything.

It would be nice if you could make this clearer somehow, and also possibly send out OSC data on change even when the playhead is not moving.

Also, “Create realtime route” fails for custom properties, and I can’t figure out how to set up a route for them manually either. Any ideas?

Hi there @JPfeP. Glad to see the project is active.

I still can’t get AddRoutes to work. I’m using a Launchpad, Blender 2.83, and the latest AddRoutes as of 6-23-2020.

I have set up my Launchpad in inputs and outputs in the AddR Config tab. I go to the Snap option, right-click and create a realtime route. Then in the Routes tab I configure it as in the screenshot.

I have tried all the options in the tab (Direct, Auto, Cut) and (Send, Receive, Both), as well as Continuous Controller 7-bit, RPN 7-bit, and all the others, selecting 112 and 70. I am attaching a screenshot of the Launchpad implementation. I’ve been at it for several months to no avail. Could you help me get this working?

Yep, all your assumptions are right. Sending currently relies on the animation being played. I will update the documentation to make that clearer. The idea is to prevent useless processing while the user is only editing the scene. I intend to modernize that a bit in the future.
And yes, custom properties are not supported right now. For now you can use an empty as a proxy and create a driver for your custom prop that takes the empty’s location.x as its source, for instance.
It’s on my TODO list to support them directly sooner or later.

First and foremost, I advise you to try my 2 examples found in the Multirouting chapter here.

Normally you only have to configure your MIDI device, and if these examples don’t work, then there is something wrong in the configuration.

In any case, once you have the MIDI configuration set up, you can see incoming messages using the debug option. BUT the messages are printed to a console, and consequently you have to start Blender from a command-line shell. That can be tricky for someone not used to navigating folders, but it might be a serious help at first to get things rolling.

Then, concerning your example: I think use_snap is a Boolean property, therefore any non-zero value will turn it ON and only 0 will turn it OFF. But again, the debug option should tell you whether the route is receiving and the processing is OK. Another source of error could be that CCs are sometimes counted from 0 and sometimes from 1 by various softwares. So 112 might not work while 111 will.
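The two pitfalls above can be sketched in a few lines of Python (the helper names here are mine, purely for illustration, not part of AddRoutes):

```python
# Two common MIDI gotchas: a Boolean target treats any non-zero CC value
# as ON, and some editors display controller numbers 1-based while the
# value on the wire is 0-based.

def cc_to_bool(value):
    """Map a 7-bit MIDI CC value (0-127) to a Boolean property."""
    if not 0 <= value <= 127:
        raise ValueError("CC value out of 7-bit range")
    return value >= 1          # 0 -> OFF, anything else -> ON

def displayed_to_wire(cc_number):
    """Convert a 1-based controller number (as some software shows it) to 0-based."""
    return cc_number - 1

print(cc_to_bool(64), cc_to_bool(0))   # True False
print(displayed_to_wire(112))          # 111
```

So if a route bound to CC 112 stays silent, it is worth trying 111 before assuming the route itself is broken.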

Thanks so much for the response. When I have time I will go over the steps above. I’ll check again with MIDI-OX and the debug console to see where on my machine the MIDI message is being blocked, so to speak.

I had forgotten about the 0-127 / 1-128 implementation. That could be it.

I am taking a little break for the summer, leaving tomorrow for a bicycle trip in French Normandy. I will be back in a few weeks.

@MechaX, I have an idea to modify the add-on so the debug messages appear in the Info window. That would make things easier.

Hello, I just released the new version 0.31, see this blog article :

This release is mostly intended to help new users (but not only them) see what is happening. The debug reports for MIDI/OSC/Blemote events can now be seen in the text editor. More convenient than the terminal shell.

There is some other polishing as well, but more substantial improvements are planned for the next versions.



what a great plugin. Thanks for putting it up!

I cannot seem to make it work, though. I launched Blender from a terminal and tried to send some OSC messages from another program with the address /blender and value 1, but nothing seems to be coming in. I can’t see any OSC output anywhere, neither in the console nor in the text panel (which I set up in the AddRoutes config panel). I’m on Blender 2.90.1 on Manjaro Linux; could that be it?

Here’s a screenshot of my setup. Any ideas?
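For what it’s worth, when debugging at this level it can help to know what such a message should look like on the wire (e.g. in a packet sniffer). A minimal sketch of OSC 1.0 encoding, handling int32 arguments only:

```python
import struct

def osc_message(address, *ints):
    """Encode a minimal OSC 1.0 message with int32 arguments only."""
    def pad(b):
        # OSC strings are null-terminated and padded to a 4-byte boundary
        b += b"\x00"
        return b + b"\x00" * (-len(b) % 4)
    msg = pad(address.encode()) + pad(("," + "i" * len(ints)).encode())
    for value in ints:
        msg += struct.pack(">i", value)   # arguments are big-endian
    return msg

msg = osc_message("/blender", 1)
print(len(msg))  # 20 bytes: padded address + ",i" type tag + int32
```

Such a datagram would be sent over UDP to whatever port AddRoutes is listening on; if a sniffer shows these bytes arriving and nothing appears in the debug output, the problem is on the Blender side.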


Sorry for the late answer, I’ve been working on several projects…

Have you found your way since? It can be many things, actually. The add-on uses two tabs in the “N” panel, one for settings and one for the routes. The preferences shown on the right concern only a small set of general parameters; enabling debug is actually done in the second tab, “AddR: Config” (see just above the opened one in your pic), not in the preferences, because it’s a per-project setting.


@JPfeP Thank you for making this add-on. I’ve been looking for something like this for quite a while now, and most other ones require special builds of Blender so this is a welcome addition.

I, too, am having trouble getting any kind of result from my MIDI device or OSC app.

On Windows 10: I’ve installed MIDI-OX, and I am definitely seeing all the events coming from my MIDI controller (M-Audio Code 25).

However, in Blender, it’s not clear to me how to tell if it’s working. I see my Code 25 device listed in the add-on dropdown. Any time I choose it, that field turns RED. Is that normal? Any device I choose also turns RED.

I’ve tried the ‘Debug’ checkbox but have no idea where that info would display. I thought maybe the Python console window, but it remains static. Documentation on this is sparse.

I’ve tried a few of your examples but still can’t seem to get a result, with the exception of the keyboard one, and that only works because of the MIDI file conversion; no live input seems to work for me.

I think many (less technical) folks like me could really benefit from a simple video tutorial.
I’d also love to see more helpful tooltips on various parameters.

Anyway, I’ll keep trying, and thanks again.

This is an amazing plugin. I’ve been following Jimmy Gunawan’s demonstrations for some time, but unfortunately I’ve been unable to replicate any of his results. The most recent one appears to use an older version where the UI is still in one tab. I’m having some trouble telling whether there is a successful connection, much less how to couple the data to the shape keys. I’m clearly out of my depth. I’ve been using Blender for some time, but not with anything this low-level. Here are a few questions.

Does anyone know of a more recent tutorial or demonstration?
What is the “Address:” field? He just puts in “/W”; I have no idea what that means.
Should I be using Project Routes or System Routes? What is the difference?
In his most recent demo there was also a “FaceCa…” tab that I don’t think he opened. Is there a plugin I don’t know about that facilitates the transfer of data from the FaceCap app on the iPhone?

Thanks again for this amazing plugin. It’s going to be a game changer once I can wrap my head around it.