I was reading about this a few days ago and I think it's worth sharing. It uses Apple's ARKit.
The mobile facial animation capture system is only available on iOS devices with a front-facing TrueDepth camera, such as the iPhone X, iPhone XS, iPhone XS Max, iPhone XR, iPad Pro (11-inch), and iPad Pro (12.9-inch, 3rd generation).
Optionally, the Unreal Engine ARKit implementation enables you to send facial tracking data directly into the Engine via the Live Link plugin, including current facial expression and head rotation. In this way, users can utilize their phones as motion capture devices to puppeteer an on-screen character.
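As a mental model, what ARKit's face tracking produces per frame is a set of roughly 52 named blendshape coefficients, each between 0 and 1 (e.g. "jawOpen", "eyeBlinkLeft"), plus the head's transform. Here's a rough Python sketch of that payload; the class and field names are mine, not Apple's API, and the head rotation is simplified (ARKit actually gives a full 4x4 transform):

```python
# Rough sketch (not Apple's API) of the per-frame data ARKit face
# tracking produces: ~52 named blendshape coefficients in [0, 1]
# plus the head's orientation.
from dataclasses import dataclass, field

@dataclass
class FaceFrame:
    timestamp: float
    # coefficient name -> weight, 0.0 (neutral) to 1.0 (fully expressed)
    blendshapes: dict[str, float] = field(default_factory=dict)
    # head rotation as Euler angles in radians (pitch, yaw, roll);
    # a simplification, since ARKit itself exposes a 4x4 transform
    head_rotation: tuple[float, float, float] = (0.0, 0.0, 0.0)

example = FaceFrame(
    timestamp=0.016,
    blendshapes={"jawOpen": 0.42, "eyeBlinkLeft": 0.9},
)
```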
So it means you can use your phone as a "web camera" of sorts and stream the data directly into Unreal. My idea would be to do the same with some connection into Blender, or to set up a triple link of iPhone/Unreal/Blender. I still know very little about this, and I have no equipment at all to test any of it.
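I haven't tried this, but the Blender half of that idea could be prototyped with a small script: a non-blocking UDP listener registered as a Blender timer that applies incoming coefficients to matching shape keys. Everything below is an assumption on my part, including the JSON wire format, the port, and the mesh/shape-key names; an iPhone-side sender would still have to be written separately.

```python
# Hypothetical sketch: receive ARKit-style blendshape frames over UDP
# and drive matching shape keys on a Blender mesh. Run from Blender's
# scripting workspace. The wire format (JSON with a "blendshapes" dict)
# and port are assumptions, not any established protocol.
import json
import socket

import bpy

PORT = 11111        # assumed port, pick anything free
MESH_NAME = "Face"  # assumed: a mesh whose shape keys are named after
                    # ARKit blendshapes, e.g. "jawOpen", "eyeBlinkLeft"

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", PORT))
sock.setblocking(False)

def poll_frames():
    """Timer callback: drain queued UDP packets, apply the newest frame."""
    frame = None
    while True:
        try:
            data, _addr = sock.recvfrom(65536)
            frame = json.loads(data)
        except BlockingIOError:
            break  # no more packets waiting
    if frame is not None:
        obj = bpy.data.objects.get(MESH_NAME)
        if obj and obj.data.shape_keys:
            keys = obj.data.shape_keys.key_blocks
            # Coefficients are 0..1, which maps directly onto a
            # shape key's value.
            for name, value in frame.get("blendshapes", {}).items():
                if name in keys:
                    keys[name].value = value
    return 1 / 60  # re-run in roughly one frame

bpy.app.timers.register(poll_frames)
```

The timer plus non-blocking socket is deliberate: a plain blocking receive loop would freeze Blender's UI, whereas `bpy.app.timers` lets the script poll for packets between redraws.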