After quite some work, I’m happy to release my first Add-on. As its name indicates, this Add-on allows you to bind properties to MIDI events in real time in the viewport.
The possible uses are many: you can use physical MIDI knobs to control objects in the scene, for instance, or, if you are a composer, work in your MIDI sequencer and Blender at the same time to build visuals closely tied to your songs. It works both ways (sending/receiving).
I am providing an example which turns Blender into a polyphonic synthesizer (well, a bare sample player for now), but it is just for fun and to demonstrate how to use update functions when you create your own properties, so you can do whatever you want. You could theoretically build a full remote controller that way.
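To make the “update functions” idea concrete, here is a plain-Python sketch of the pattern. In Blender you would declare the property with `bpy.props` (e.g. `bpy.types.Scene.midi_cc1 = bpy.props.FloatProperty(update=my_update)`, where Blender calls `my_update(self, context)` whenever the value changes); the class and names below are hypothetical and only show the flow.

```python
# Plain-Python sketch of the "update function" pattern the example relies on.
# In Blender you would declare the property with bpy.props, roughly:
#   bpy.types.Scene.midi_cc1 = bpy.props.FloatProperty(update=my_update)
# and Blender would call my_update(self, context) on every change.
# Everything below (names included) is hypothetical, to show the flow only.

class MidiBoundProperty:
    """A value that fires an update callback when written, like bpy.props."""

    def __init__(self, update):
        self._update = update
        self.raw = 0

    def set(self, raw):
        self.raw = raw
        self._update(self)  # Blender invokes your update function the same way


def my_update(prop):
    """React to the new value: move an object, play a sample, anything."""
    prop.normalized = prop.raw / 127.0  # map MIDI's 0-127 range to 0.0-1.0


knob = MidiBoundProperty(update=my_update)
knob.set(127)           # a CC message arrives with value 127
print(knob.normalized)  # 1.0
```

The callback is where all the “whatever you want” happens: the add-on only delivers the value, and your update function decides what it drives.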
Hi! I’ve been trying to make an add-on to control Blender with MIDI inputs, but as a complete noob with Python I wasn’t able to do much.
Here I have a little MIDI keyboard that I’d like to use to switch between editor types and screen layouts. After a quick look at your add-on I think it’s totally doable! What do you think?
Hello, I want to make this add-on easier to install. I need time to investigate how to do that; maybe it is possible to redistribute the dependencies with the Add-on. The author of the Python module for “rt-midi” told me he could provide some binaries in the future, so things could improve in the near term, as he was close to a stable version of his module.
However, it is hard to say if and when it will be as easy as a regular add-on. Python’s standard library doesn’t deal with MIDI; MIDI is bound to the operating system, and you will always need a layer to access the MIDI devices.
Recently a friend on Mac OSX wrote me what he had to do to install AddMIDI, and with that I am going to update my documentation a bit (probably this weekend). Maybe by the end of September things will have improved already, but it will be a gradual process: better documentation, fewer manual steps.
As a last resort I could provide some builds on Graphicall.
Good news! It also means it’s working with Blender 2.77, so as soon as I have some time to dig into it, I will!
Possibilities are near endless!
Something I’m not sure about is when you say “it works both ways”: I understand I can drive stuff in Blender with my MIDI controller, but can Blender drive stuff in my DAW? That would be awesome… for example a noise controlling a filter, a curve controlling a volume fader, or, inside a simulation, a cube hitting the floor or another object with my DAW making the noise?
Yes David, Blender will send the values of the chosen properties through the selected MIDI out port (if any). I agree the possibilities are endless, but I don’t know yet if it is possible to detect the collision of one object with another. Maybe with a ‘handler’ (a special Blender Python feature that would test, on each frame, the coordinates of some objects). But for the simpler cases there is no problem.
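For the curious, the per-frame handler idea can be sketched like this. It is plain Python with hypothetical positions and function names; in Blender the function would be appended to `bpy.app.handlers.frame_change_post` and would read the object locations itself, and the returned message would go out through the MIDI port.

```python
# Hedged sketch of the per-frame "handler" idea: on each frame, check whether
# two objects are close enough to count as a contact, and if so emit a MIDI
# event so the DAW makes the noise. In Blender this function would be appended
# to bpy.app.handlers.frame_change_post; here the positions are passed in
# directly so the logic stands on its own.
import math

def objects_touching(loc_a, loc_b, threshold=0.1):
    """True when the two object centres are within `threshold` units."""
    return math.dist(loc_a, loc_b) <= threshold

def on_frame(cube_loc, floor_top_z):
    """A crude 'cube hits the floor' test: the cube's z reaches the floor."""
    if cube_loc[2] <= floor_top_z:
        return ("note_on", 60)  # e.g. trigger middle C in the DAW
    return None

print(on_frame((0.0, 0.0, 0.05), 0.1))  # ('note_on', 60)
print(on_frame((0.0, 0.0, 2.0), 0.1))   # None
```

This is only the detection half; a real handler would also have to debounce the contact so a resting cube doesn’t retrigger the note on every frame.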
With the help of a friend, here is a new easy-to-install version of AddMIDI, this time for Apple OSX users! All the dependencies are bundled, so it is as easy to install as any Add-on. Keep it zipped, as always, do “load from file” in the User Preferences, then choose the zip.
I keep hoping you can create an easy install for Windows as well, because try as I might, I can’t seem to get it working manually. The closest I’ve come is having it show up in my add-ons, but I can’t enable it: it only throws up this error message:
Traceback (most recent call last):
  File "C:\Program Files\Blender Foundation\Blender\2.78\scripts\modules\addon_utils.py", line 330, in enable
    mod = __import__(module_name)
  File "C:\Users\Bobby\AppData\Roaming\Blender Foundation\Blender\2.78\scripts\addons\AddMIDI\__init__.py", line 73, in <module>
    from select import select
ImportError: DLL load failed: The specified module could not be found.
I’ve tried following the steps outlined on the website, and even attempted multiple versions of the files provided, to no avail. Any idea what I might be doing wrong?
@Friendly Floyd: I don’t have a Windows machine (only Linux), but the Add-on doesn’t find the compiled module for rtmidi. There might be several reasons for that, but it is not easy to answer without seeing how things went during your manual installation. I am going to check the documentation I wrote a little and try to provide a more helpful answer soon. Maybe it is not very hard to solve; I will contact you in private, if you agree.
From a broader perspective, I want to make an update to AddMIDI (a backport of a fix from AddOSC), and I will try at the same time to provide the long-awaited easy-to-install version for Windows. I have a friend who could help me with that, but it could take a little longer.
Unless there’s some way of simulating physics in Reaper scripting, I realised that rather than doing the audio in Blender, it makes more sense to have the turntable simulation act as a virtual controller. Audio in Blender is limited and fiddly; even though the Python Audaspace API is nice, Blender is never going to compete with something like Reaper. Ideally Blender would output MIDI Time Code (or whatever it’s called) for transport and play-head location to Reaper (and other MIDI-transport-aware software). But would that be problematic here? I have had JACK Audio working on Windows, but it’s very fiddly compared to things like Reaper’s ReaRoute and loopback MIDI devices (I use “loopMIDI”). If the transport won’t work, then I guess it could be done by simply sending MIDI CC.
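As a footnote on that last fallback: a MIDI Control Change message is just three bytes, so building one is trivial. The function below is my own sketch (the name is hypothetical); with python-rtmidi you would hand the resulting list to `send_message()` on an output port opened against the loopMIDI virtual device.

```python
# Sketch of the "simply sending MIDI CC" fallback. A Control Change message
# is three bytes: status (0xB0 | channel), controller number, value, with the
# two data bytes limited to 0-127. With python-rtmidi (third-party, not shown)
# the list would be passed to a MidiOut's send_message() on a loopMIDI port.

def midi_cc(channel, controller, value):
    """Build a raw MIDI Control Change message (all arguments 0-based)."""
    if not (0 <= channel <= 15 and 0 <= controller <= 127 and 0 <= value <= 127):
        raise ValueError("out-of-range MIDI CC field")
    return [0xB0 | channel, controller, value]

# e.g. drive a volume fader on CC 7, channel 1, from a curve value in 0.0-1.0
curve_value = 0.75
print(midi_cc(0, 7, round(curve_value * 127)))  # [176, 7, 95]
```

Driving a fader this way sidesteps the transport question entirely: Reaper (or any DAW) just sees an ordinary hardware-style controller on the loopMIDI port.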