Android phone for Motion & Facial Capturing

My smartphone is about to die, and I want to get something new that could be useful for some facial capture and motion capture tests.

Because I am also an Android developer, I would rather use an Android phone than an iPhone (even though, since the iPhone X, their facial tracking has been very good).

Do you have any recommendations?

No one interested in mocap or face capture here?

I am interested in mocap as well, but I have never looked into it.

Basically, I would start with the simplest approach: test how things go with standard equipment, and if the results are not good enough, then consider more specialized and expensive options.

Things like that can be done with OpenCV:
https://www.youtube.com/results?search_query=opencv+mocap
https://www.youtube.com/results?search_query=opencv+facial+landmark+detection

Those who are not interested in C++ programming at all might want to look at vvvv, which is mostly node-based. I have never tried that software either, but it seems legit, and I would have no problem taking a look at it at some point.

I don’t know specifically whether the iPhone X is better at this task or not. I looked at the camera specs, and it seems their camera senses depth much like a Kinect does. The question is whether 3D detection is generally superior to 2D detection. To me it looks like 2D detection with OpenCV has advanced and been refined so much, due to its wide use, that it now offers solid results, whereas the Kinect has remained exotic from the beginning and was never used widely enough for its techniques to mature the same way. Perhaps it could give some interesting results in skeletal animation, but still far from the superior results of dedicated mocap suits.

That means that a Kinect might be a better investment than a smartphone for this, right?

I would say that a Kinect can track depth information as well as skeleton information. These two combined make it a low-budget 3D scanner plus a basic mocap device. However, I would still rely on 2D tracking for high-detail face tracking.

Also, I would say the iPhone X is something like a beefed-up version of the Kinect. Specifically for face tracking, it seems to work out of the box with superb results. It looks like Apple invested real effort in those 3D emojis and put a neural network chip in the hardware to make it work on the fly with minimal imperfections.

Last but not least, even with standard equipment and accessible software it should be possible to achieve respectable results. I had not looked at it before, but it seems that Blender has a decent tracker that can track markers directly.

My strategy for picking equipment would be:

  • 2D webcam: the most cost-approachable option, with easily accessible free software.
  • iPhone X: a very specialized, out-of-the-box solution for face tracking, but a somewhat more closed software ecosystem (you can’t just build apps and share them; you have to get a developer account to distribute them through the store, though you can distribute the source code and let anyone download and build it themselves).
  • Kinect: still trying to find convincing advantages beyond those mentioned above, but until then it is about the same as a 2D webcam.

Thanks a lot! It’s true that Apple invested a lot into it. And it seems there is no Android phone on the same level, even though it has already been 3 years.

I know nothing about it, but the Samsung Galaxy Note 10+ has a Time-of-Flight sensor.

Awesome, Samsung is currently my favorite. The only thing I would miss is stock Android.

I have a Zenfone AR phone (I bought it last month for $250, instead of the $1000 it cost back in 2017). It has a ToF sensor (like my Kinect), but I can’t find an app for real-time mocap like the iPhone X’s “Face Cap”, or even one that scans a face correctly like https://hege.sh/
Constructor destroys the nose each time. I guess infrared plays a part.

I was reading about this a few days ago and I think it is worth sharing. It uses the Apple ARKit.

Mobile facial animation capture system is only available on iOS devices with a front-facing TrueDepth camera, such as the iPhone X, iPhone XS, iPhone XS Max, iPhone XR, iPad Pro (11-inch), and the iPad Pro (12.9-inch, 3rd generation).

Optionally, the Unreal Engine ARKit implementation enables you to send facial tracking data directly into the Engine via the Live Link plugin, including current facial expression and head rotation. In this way, users can utilize their phones as motion capture devices to puppeteer an on-screen character.

So it means you can use your phone as a “web camera” of sorts and stream the data directly into Unreal. My idea here would be to do the same through some connection in Blender, or to create some triple link of iPhone/Unreal/Blender. I still know nothing about this, and I have no equipment to test any of it.
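
As a rough sketch of that idea, assuming the phone app streams its per-frame facial values over UDP on the local network: the comma-separated "name=value" packet format below is purely an assumption for illustration, not the actual Live Link wire format, and the port number is arbitrary.

```python
# Hypothetical receiver for face-capture data streamed from a phone over UDP.
# The "name=value,name=value" packet format is an assumption for this sketch,
# not the real Live Link protocol.
import socket

def parse_packet(data: bytes) -> dict:
    """Turn b'jawOpen=0.42,browUp=0.10' into {'jawOpen': 0.42, 'browUp': 0.10}."""
    values = {}
    for pair in data.decode("ascii").split(","):
        name, value = pair.split("=")
        values[name.strip()] = float(value)
    return values

def listen_once(port: int = 11111) -> dict:
    """Block until one packet arrives and return the parsed values.
    Inside Blender, something like this could run on a timer and drive
    shape-key values on a face rig."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    data, _addr = sock.recvfrom(1024)
    sock.close()
    return parse_packet(data)
```

The real protocol would of course need timestamps and a fixed channel list, but the shape of the pipeline is the same: decode a packet per frame and map its values onto a rig.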

This looks super promising, but I don’t want to work with Apple products. Unfortunately, it seems they are the only ones really investing in these technologies.

Currently, as @j2l2 said, you can’t find software to use yet, and the quality of the tracking is also questionable. This is mostly because Apple uses a special neural network chip to correct all of the imperfections and inaccuracies and get butter-smooth results.

That said, it is a somewhat easy technology to replicate: anybody with a neural network can do it, even purely in software rather than in a dedicated chip; the chip just makes the real-time performance faster.
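
To make that concrete, here is the simplest possible software-only cleanup pass: an exponential moving average over noisy per-frame landmark positions. This is just a minimal stand-in for the kind of smoothing such a chip would do, and the alpha value is an arbitrary tuning assumption.

```python
# Exponential-moving-average smoothing for noisy 2D landmark positions.
# alpha close to 1.0 follows the raw data; closer to 0.0 smooths harder
# (at the cost of added lag).
def smooth(positions, alpha=0.5):
    """Smooth a sequence of (x, y) landmark positions frame by frame."""
    if not positions:
        return []
    sx, sy = positions[0]
    out = [(sx, sy)]
    for x, y in positions[1:]:
        sx = alpha * x + (1 - alpha) * sx
        sy = alpha * y + (1 - alpha) * sy
        out.append((sx, sy))
    return out
```

In a real tracker you would run one such filter per landmark (or something fancier like a one-euro filter), trading jitter against responsiveness.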

This is a pros-and-cons situation at this point in time. Perhaps in the future there will be equivalent alternatives, but for now it seems that Apple has the upper hand on that exact feature.

Update: I just found this interesting video by luck. I remembered having answered here, so here comes a follow-up on finding something that works.

I am going to purchase a Samsung Galaxy S10 or a Pixel 4 soon. Hopefully there will be something to try this out with as well.

Here is how I see the problem of buying a phone:

  • iPhone: a great phone if you can live within the restricted walls of the Apple ecosystem. It means you CAN’T do certain things in terms of “power-using” your device, but the things you CAN do, you can do perfectly. Also, since application quality is top notch, you always get proper performance and response times; Apple places heavy control and restrictions on developers precisely so that they produce well-performing applications.

  • Android devices: the exact opposite of Apple. Android lets you fully control how you use your device (you can even install a new OS or root everything), and it mostly lets developers jump right in and make applications with real ease. However, on the application side, things might be a bit shady. You will find what you are looking for among the most popular apps (e.g. Viber, FB), but searching for really specific apps can be very difficult.

Sure there is: starting with the S10, and now the new S20, both have depth sensors. It’s just that everyone is blinded by the idea that Apple is somehow better.

Epic Games definitely focuses on Apple when it comes to mobile; however, UE runs best on PCs due to the GPU power needed, which Apple just can’t muster since they’re always so proprietary. I do know that there are some Blender-to-UE Live Link pipelines in the works, and there is a great solution for virtual production (a virtual camera app) available on the marketplace and the Google store.

@Sean_Lake1 TrueDepth (iPhone only) and Time-of-Flight (Samsung and Apple) sensors aren’t of the same precision.
Apple may have a patent on TrueDepth (Face ID), which is why you don’t see it on other phones.

I am interested.