Long rant time…
After trying to find a way to use device inputs in Blender (and Google didn’t help), I am writing here to find out whether there is a market for this type of thing, how easy it would be to do in Blender, and whether it’s worth me learning Python and whatnot in order to make this addon.
Effectively, the main aspect of this addon would be what I’m calling “input units” (a name I made up just because I needed a name for it).
So… what is an input unit? Effectively, an input unit is just like a keyframe driver, only it takes input from a connected device such as a control pad or control stick.
As part of the user preferences, perhaps in a new tab, you would be able to assign a controller device as detected by the operating system, and set up a range of options for each device input, such as stick axes, button presses, and throttle sliders. Each of these would have its own options, such as range, sensitivity curve, falloff, and deadzone, plus a “name” variable so the user can name the input manually; otherwise Blender would just take the input names directly from the device. These name variables become the names of the “input units”, the system used to assign the controls.
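To make the per-input options concrete, here is a minimal sketch in plain Python (all names are made up for illustration, this isn’t real addon code) of how one input unit might condition a raw stick value: clamp a deadzone, apply a sensitivity curve, then scale into the user-defined output range.

```python
def condition_axis(raw, deadzone=0.1, curve=2.0, out_min=-1.0, out_max=1.0):
    """Map a raw stick value in [-1, 1] to the unit's output range."""
    sign = 1.0 if raw >= 0 else -1.0
    mag = abs(raw)
    if mag < deadzone:                       # ignore stick noise near centre
        return (out_min + out_max) / 2.0     # rest at the middle of the range
    # re-normalise the remaining travel to [0, 1], then apply the curve
    t = (mag - deadzone) / (1.0 - deadzone)
    t = t ** curve                           # >1 = softer start, <1 = snappier
    half = (out_max - out_min) / 2.0
    centre = (out_min + out_max) / 2.0
    return centre + sign * t * half
```

A real addon would run something like this on every sampled device value before it ever touches a keyable property.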
Now that we have a simple way of setting up an input device, we can do many things with it…
Most importantly, we could press “I” over any keyable value in Blender and assign that value to be represented by an input unit, so alongside the options for “insert key” and “driver” we would have the option “input unit”. Now, how do we use this to animate…
Firstly, we turn on auto-keyframing via the timeline or user preferences, and play the viewport animation with the timeline or Alt+A…
Now, as the animation plays, IF an input unit is active, a keyframe is written on every frame. When the last frame is reached, the input unit is deactivated, so the second (looped) playthrough shows what you keyed with the input units on the first one; otherwise you would still be MAKING animation. I hope this seems simple…
So, to animate, you simply activate the input unit and press play, then use the control device to manipulate the keyframes directly. When you reach the last frame, the input unit is deactivated and playback shows the previously written keys (just like the dope sheet editor does).
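The record-then-playback loop above can be modelled in a few lines of plain Python (no bpy, class and method names are invented for the sketch): while the unit is active, every frame samples the device and writes a key; at the last frame the unit deactivates, and later passes just read the baked keys back. In a real addon the “write a key” step would be a call along the lines of `keyframe_insert` on the target property.

```python
class InputUnit:
    def __init__(self, sampler):
        self.sampler = sampler      # callable: frame -> device value
        self.active = True
        self.keys = {}              # frame -> baked value

    def evaluate(self, frame, last_frame):
        if self.active:
            self.keys[frame] = self.sampler(frame)   # record live input
            if frame == last_frame:
                self.active = False                  # auto-deactivate at the end
        return self.keys.get(frame)                  # playback reads baked keys

unit = InputUnit(sampler=lambda f: f * 10)           # stand-in for a stick axis
first_pass = [unit.evaluate(f, last_frame=4) for f in range(5)]   # recording
second_pass = [unit.evaluate(f, last_frame=4) for f in range(5)]  # pure playback
```

The second pass returns exactly what the first pass recorded, which is the whole point: once deactivated, the unit is just ordinary keyframes.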
Another cool thing we could do with input units, once implemented, is camera manipulation in the viewport. It might make for very interesting sculpting or mesh editing if we could rotate the camera with one hand on the control pad and use the mouse with the other; maybe the ability to map keypresses to controller buttons or stick axes would allow the use of the Shift and Ctrl keys during sculpting.
The point of this, I hope, would be to make things like facial animation more intuitive and faster. For example, with a face rig set up using bones, you could use input units to rotate and transform those bones, and with a little practice on YOUR particular setup, facial animation could feel much like controlling an animatronic puppet. Things like breathing keyframes could take just a few minutes to set up, and you could key in realtime, effectively.
Some questions, hopefully answered.
Why use a new tab in the user preferences?
Because it would mean you can easily save your controller input setup as part of the default blend file on your system, meaning that once you set it up, it can be used without effort. You could then simply tweak this setup to match your needs, like adjusting input range and falloff…
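As an illustration only (field names are made up, and a real addon would more likely store this via Blender’s addon preferences system), the saved setup could be a simple mapping that serialises and restores losslessly:

```python
import json

# Hypothetical controller setup: one named input unit per device input,
# each carrying the options described above (range, deadzone, curve, name).
setup = {
    "device": "Generic Gamepad",
    "inputs": {
        "left_stick_x": {"name": "jaw_open", "range": [0.0, 1.0],
                         "deadzone": 0.1, "curve": 2.0},
        "button_a": {"name": "blink", "range": [0.0, 1.0]},
    },
}

saved = json.dumps(setup)       # persisted with the user's defaults…
restored = json.loads(saved)    # …and read back on startup, ready to tweak
```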
What’s wrong with keying by hand?
Nothing at all; it’s the best way to do it. This is not intended to replace keyframing by hand, but for some things this system could be very useful and very fast to set up.
Think: setting up one control stick for head movement and the other stick for eye movement. Press play… make them do stuff… release the controls to stop recording keys… press stop, fine-tune, done.
Would it be complicated to get this into Blender?
I honestly don’t know enough about Blender to answer that, but I can’t see how it could affect TOO much besides the manipulation of the viewport and the ability to assign device inputs as control assignments for things like viewport navigation. The main part, keying animations the way a driver does, would be pretty much done already, and Blender can already use mouse movements to add keys in the viewport during playback.
what about using multiple controller devices?
I don’t see why not, if the operating system lets an app use more than one.
Isn’t that just drivers with a different type of input?
No. One subtle difference is that these would only be used to bake keyframes. Once auto-keying is turned off, or the input unit in question is deactivated, the keys are played back as normal from the dope sheet / action editor. This effectively disables the input unit altogether and allows the “underlying” animation to be shown correctly. Think of it as baking a keyframe driver’s information into a set of keyframes, turning the driver on and off as we need to.
What could this be used for?
Realtime keying of almost any property in Blender: everything from materials, textures, physics attributes, particle info, transforms (loc/rot/scale), shape keys, NLA blending… pretty much anything that can be keyed could be recorded in realtime.
Things like animatronic-style rig manipulation, car steering/movement, plane-flight-style animation… in fact, anything where you need a high level of control but want to animate faster than key-by-key animation…
When will this be released?
When someone who knows how to make this happen reads this and decides it would be useful to them. Personally, I have no knowledge of making addons in Blender, so it would take me a long time to learn enough Python to make this happen, if it’s even possible.
This is all just daydreaming, IMHO; I have no hope of such an addon being developed, but maybe one of our more seasoned developers/coders could whip something functional together in a weekend.
Another thought would be to allow the input to directly refresh the viewport, as a basic tool for posing or manipulating objects… use the sticks to pose a face-bone setup, then press “A” on the controller to make a regular keyframe…
These are just ideas… if a system like this were in place, it wouldn’t take long for people to learn how to best make use of it.