I’ve been working on a tool that streams your facial expressions to Blender. Yeah, I know there’s at least one other solution, but I’ve been working on this for a few years and I’m finally getting around to publishing it. Better late than never!
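The post doesn’t spell out the wire format, but a common approach for this kind of tool is to stream ARKit-style blendshape weights (e.g. "jawOpen") over the network and apply them to shape keys on a Blender mesh. Here’s a minimal sketch of the receiving side; the packet format and blendshape names are my assumptions, not the actual protocol:

```python
import json

def parse_blendshape_packet(data: bytes) -> dict:
    """Decode one packet of ARKit-style blendshape weights.

    Hypothetical wire format: a UTF-8 JSON object mapping
    blendshape names (e.g. "jawOpen") to floats in [0, 1].
    """
    raw = json.loads(data.decode("utf-8"))
    # Clamp to the valid shape-key range so a malformed packet
    # can't push Blender shape keys out of bounds.
    return {name: min(max(float(v), 0.0), 1.0) for name, v in raw.items()}

# Inside Blender, the clamped weights would drive shape keys, e.g.:
#   for name, value in weights.items():
#       key = obj.data.shape_keys.key_blocks.get(name)
#       if key is not None:
#           key.value = value

packet = b'{"jawOpen": 0.42, "eyeBlinkLeft": 1.3}'
weights = parse_blendshape_packet(packet)
print(weights)  # jawOpen passes through; eyeBlinkLeft is clamped to 1.0
```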
You can also use your iPhone as a spatial input device. For example, you could drive your scene camera interactively. Or you could animate a plane or spaceship flight path by zooming your iPhone through the air, like you did when you were a kid.
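Driving the camera this way boils down to mapping the phone’s streamed pose (a position plus an orientation quaternion, as ARKit reports it) onto the camera object’s transform in Blender. A sketch of the rotation half, assuming a unit quaternion in (w, x, y, z) order; the function name is mine:

```python
import math

def quat_to_matrix(w, x, y, z):
    """Convert a unit quaternion (streamed phone orientation)
    into a 3x3 rotation matrix, row-major."""
    return [
        [1 - 2 * (y * y + z * z), 2 * (x * y - z * w),     2 * (x * z + y * w)],
        [2 * (x * y + z * w),     1 - 2 * (x * x + z * z), 2 * (y * z - x * w)],
        [2 * (x * z - y * w),     2 * (y * z + x * w),     1 - 2 * (x * x + y * y)],
    ]

# Inside Blender you'd feed this (plus the streamed position)
# into the camera's world matrix, e.g. via mathutils:
#   cam.matrix_world = Matrix.Translation(pos) @ Matrix(rot).to_4x4()

m = quat_to_matrix(1.0, 0.0, 0.0, 0.0)  # identity orientation -> identity matrix
```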
Also in the works… turning your Vive VR setup into an armature MoCap solution. This is working now, and it’s awesome, but it needs some more work before rollout.