I’ve combined two of my hobbies and made a preliminary test to see whether the Android mobile sensors could be used to control Blender objects. I had already tested the first option: recording the sensor values and then importing them into Blender through a Python script.
Today I made another test in which the object is updated in real time in Blender while the mobile phone is moved. Communication is done over Wi-Fi using sockets. Check it out.
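To give an idea of how such a link can work, here is a minimal sketch of the Blender-side receiver. The message format (one comma-separated `x,y,z` triple per line) and the port number are my assumptions, not the exact protocol I used; the `bpy` call shown in the comment is the standard way to drive an object's rotation from a script.

```python
import socket

def parse_sensor_line(line):
    """Parse one 'x,y,z' text message from the phone into a tuple of floats."""
    x, y, z = (float(v) for v in line.strip().split(","))
    return (x, y, z)

def receive_rotations(host="0.0.0.0", port=5000):
    """Accept one phone connection and yield parsed rotation triples.

    Inside Blender, each yielded triple would be applied with something like:
        bpy.data.objects["Cube"].rotation_euler = (x, y, z)
    so the object follows the phone as messages arrive.
    """
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((host, port))
        srv.listen(1)
        conn, _addr = srv.accept()
        with conn, conn.makefile() as stream:
            for line in stream:
                yield parse_sensor_line(line)
```

On the phone side, the app simply writes one line per sensor reading to the same socket.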
My objective is to use the mobile sensors to create more realistic animations without expensive motion-tracking equipment. For that, in most cases I will need not only the phone’s position but also its velocity. The problem I’ll probably face is that velocity is not directly measured by any sensor; it must be computed by integrating the accelerations, and sensor noise accumulates into drift over time. I’ll see whether low-pass filters can help to do the job.
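The two steps involved can be sketched as follows: an exponential moving average as a simple low-pass filter to tame the accelerometer noise, followed by trapezoidal integration to turn acceleration into velocity. The filter coefficient `alpha` is an assumption to be tuned; this is only an illustration of the idea, and it does not remove the drift problem, only reduces the noise feeding into it.

```python
def low_pass(samples, alpha=0.2):
    """Exponential moving average: a basic low-pass filter for noisy
    accelerometer readings. Smaller alpha = heavier smoothing."""
    state = samples[0]
    filtered = []
    for s in samples:
        state = alpha * s + (1.0 - alpha) * state
        filtered.append(state)
    return filtered

def integrate(samples, dt):
    """Trapezoidal integration of a sampled signal, e.g. acceleration
    to velocity, assuming a fixed sampling period dt (seconds)."""
    v = 0.0
    velocities = [v]
    for a0, a1 in zip(samples, samples[1:]):
        v += 0.5 * (a0 + a1) * dt
        velocities.append(v)
    return velocities
```

Any constant bias left in the filtered acceleration still integrates into a linearly growing velocity error, which is exactly the drift issue mentioned above.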
If it proves feasible, one can then attach the phone to a part of the body, or to an object we want to animate, in order to capture more realistic movements.