For Thanos, they didn’t actually create his face using AI. Machine learning was used to get more accurate performance capture data from Josh Brolin’s face. They still modeled the face manually.
But yes, generating human faces is possible, given enough work and resources. Rather than hundreds of photos, though, you'd probably need thousands of full, professionally made 3D scans; otherwise the output quality would be limited to what you can extract from a photo, which is to say, bad. If you wanted these faces to also be ready for animation, you'd have to give each of those thousands of scans correct topology and a rig. And then you'd need access to enough computing power to crunch all that data, which means serious GPU cluster time, well beyond what a home PC can offer. Some of these things may get cheaper in the future, but for now it'd be a big and expensive project.
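To give a sense of what "crunching" thousands of scans could look like without deep learning, here's a hedged toy sketch of a 3D Morphable Model: run PCA over many aligned face meshes, then synthesize a new face as the mean plus a random mix of principal components. All the data here is random stand-in data; real scans would need identical topology (same vertex count and ordering) for this to make sense.

```python
# Toy 3D Morphable Model sketch: PCA over aligned face scans, then sample
# new faces from the learned components. Sizes and data are placeholders.
import numpy as np

rng = np.random.default_rng(0)
n_scans, n_vertices = 200, 500  # toy sizes; a real model would use far more
# Each "scan" is a flattened (x, y, z) array; real data must share topology.
scans = rng.normal(size=(n_scans, n_vertices * 3))

mean_face = scans.mean(axis=0)
# PCA via SVD of the centered data matrix
_, sigma, components = np.linalg.svd(scans - mean_face, full_matrices=False)

# Synthesize a new face: mean shape plus a random combination of the
# top-k components, scaled by how much variance each one explains.
k = 20
coeffs = rng.normal(size=k) * (sigma[:k] / np.sqrt(n_scans))
new_face = mean_face + coeffs @ components[:k]
print(new_face.shape)  # same flattened-vertex layout as the input scans
```

The point is that the "model" is just statistics over the scans, which is why the quality and variety of the input data matters so much.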
That is, if you want to go the AI route. A less fancy solution using good old human ingenuity is also possible, and is in fact used for random NPCs in many games like Fallout and Dragon Age, with varying degrees of success.
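The non-AI route is typically just randomizing the same morph-target sliders the character creator exposes. A minimal sketch, with made-up slider names for illustration:

```python
# Hedged sketch of procedural NPC face generation: pick a random weight for
# each morph-target slider; the game engine would then blend the sculpted
# extreme shapes into the base head mesh. Slider names are hypothetical.
import random

SLIDERS = ["nose_width", "jaw_height", "brow_depth",
           "cheekbone_prominence", "eye_spacing", "chin_length"]

def random_npc_face(seed=None):
    """Return a weight in [0, 1] for each morph target. Seeding makes the
    same NPC come out the same way every time the game generates them."""
    rng = random.Random(seed)
    return {name: round(rng.uniform(0.0, 1.0), 2) for name in SLIDERS}

face = random_npc_face(seed=42)
print(face)
```

The "varying degree of success" part comes from how well-behaved the sliders are in combination: purely uniform sampling happily produces the potato-faced NPCs these games are known for, so real systems usually constrain or correlate the parameters.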