The iPhone X has a very interesting feature: a realtime 3D scanner.

It projects around 30,000 dots and scans your facial features with an IR camera. The thing is, this could have a few uses in 3D.

3D scanning objects to get a 3D model with texture, maybe even in realtime?

Face motion capture. Animating faces could be super easy now.

Creating diffuse, normal, and displacement maps of almost-flat surfaces.
There are probably many more uses and it’s very interesting what devs will come up with.
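On the normal-map idea above: if you can get a height or depth image out of the sensor, turning it into a tangent-space normal map is mostly a gradient computation. A minimal sketch (assuming a plain NumPy array as input; nothing here is an actual iPhone API):

```python
import numpy as np

def height_to_normal_map(height, strength=1.0):
    """Convert a 2D height map (H x W floats) into a tangent-space
    normal map (H x W x 3, values in [0, 1]) via finite differences."""
    # np.gradient returns per-axis gradients: rows (y) first, columns (x) second.
    dy, dx = np.gradient(height.astype(np.float64))
    # The surface normal of z = h(x, y) is proportional to (-dh/dx, -dh/dy, 1).
    nx = -dx * strength
    ny = -dy * strength
    nz = np.ones_like(height, dtype=np.float64)
    length = np.sqrt(nx**2 + ny**2 + nz**2)
    n = np.stack([nx / length, ny / length, nz / length], axis=-1)
    # Remap from [-1, 1] to [0, 1], as stored in normal-map textures.
    return (n + 1.0) * 0.5

# A perfectly flat surface yields the canonical "flat" normal color (0.5, 0.5, 1.0).
flat = height_to_normal_map(np.zeros((4, 4)))
```

The diffuse map would just be the color camera's image, and the displacement map is the height map itself, so the normal map is the only part that needs computing.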

I think the Xbox Kinect was the first commercial 3D scanner, but you can’t carry that in your pocket.

You can buy a decent portable 3D scanner for around $400, and it’s purpose-built for 3D scanning. An iPhone probably isn’t the best choice for that task at the moment, and it’s $1000. I guess if you’re drinking the Kool-Aid and buying an iPhone anyway, it would be nice to have 3D scanning capabilities, but I’m not going to shell out that kind of coin. Plus, the Apple guy giving the presentation for the new iPhone couldn’t get the face recognition thing to work.

Actually, that might not be true (regarding the comparison with the $400 scanner).
Apple bought PrimeSense, the company that was the leader in this kind of technology. Their Carmine 1.09 sensors were by far the best on the market in terms of price/performance, comparable with scanners that cost 50x more. I remember the day when I finally decided to pay $200 for their scanner, only to find the Apple logo on their webpage. I was so angry! :slight_smile:
So the question is where the company has taken this technology after being totally silent about it for at least 3-4 years. I guess they didn’t try to make it worse in the meantime :slight_smile:

IMHO the key thing is that what the iPhone X has is a scanner, but it’s not a “3D scanner” in the usual sense; it’s a face scanner, heavily optimized for facial motion capture. So maybe this feature can be leveraged into a cool facial mocap tool combined with other full-body solutions.


To defend Apple a bit here against some misinformation about the presentation: Face ID didn’t work because someone had looked at the iPhone before the presenter, so it locked itself, just like it would if someone tried the wrong fingerprint.

Also, the iPhone X costs more than $1000 in the rest of the world, more like $1400 in Europe. So the Americans are getting a better deal anyway.

I was thinking about the possibility of it. It could still, in theory, do scans, and you always have it with you. The portability aspect is the key here.

Of course, if I’m going to do professional scans, I’d rather buy a dedicated scanner for €1000-€2000 than a smartphone.

bigbad, you can already do that with the Structure Sensor. It works with iPhone and iPad, and you can do scans with just the sensor and the iPhone, no matter which iPhone, but the quality is not very good, at least IMO.


It has a full-fledged IR-based depth camera, just like the Kinect (made by the same company) which can be jerry-rigged to work as a “movable” high-quality 3D scanner. Professional IR-based depth cameras by other vendors are priced much higher.

Face scanning is the application they are using it for, but there’s a chance that developers will be able to access the raw depth data eventually.

If I’m right, and I’m not sure, the iPhone X has a structured light scanner, which is different from the Kinect: the projected dots are an ordered matrix, always projected in the same positions, and their displacement gives depth and shape information. It’s not the same as the Kinect, AFAIK.


@juang3d what you describe is how the Kinect works, or at least I don’t see the distinction:

The Kinect falls under “structured light scanner” as well, but unlike most of such devices it uses infrared light.
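For what it’s worth, the depth recovery in both devices comes down to the same triangulation idea: a dot’s lateral shift (disparity) between where the projector expects it and where the camera sees it gives depth as z = f·b/d. A toy sketch under a simplified rectified geometry, with made-up numbers (not Apple’s or Microsoft’s actual calibration):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Classic structured-light/stereo triangulation, z = f * b / d.

    focal_px     -- camera focal length in pixels
    baseline_m   -- projector-to-camera distance in meters
    disparity_px -- how far the dot shifted from its reference position
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: 580 px focal length, 2.5 cm baseline.
# A dot shifted by 29 px then corresponds to a surface 0.5 m away.
z = depth_from_disparity(580, 0.025, 29)  # -> 0.5
```

The formula also shows why depth precision falls off with distance: far surfaces produce tiny disparities, so a one-pixel measurement error swings the depth estimate much more than it does up close.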

You are totally right :slight_smile:

But I am not sure whether the quality of the scan is really detailed enough to be usable.

There are 30,000 dots projected. I don’t know at what angle, or how many of those dots actually hit the target. But I can assume it should be good enough, because Apple says it can tell practically all faces apart. We’ll have to wait and see what the reviews say. I really hope it can be used as a 3D scanner.

Oh great. Another thread that will eventually degenerate into an Apple hate flame war… Yay.

Why is that?

So far it’s been calm. I think this isn’t about Apple but about a piece of electronics inside the iPhone X; we could just as well be talking about Project Tango, one of whose functionalities is specifically a scanner.


There is a video on the Apple homepage that shows how many dots land on a face. I don’t know if that would be good enough for a proper 3D scan?

The closer it can be held to a subject, the more detail you can get out of those dots, much like photogrammetry getting more detail from the available pixels of a photo. Another thing is feature detection and tracking to enable stitching, if you were to try scanning the whole head (or more) in one go.

Android has had Project Tango (now ARCore) for years already; implementing it made the device cost explode.
Devices that support Tango and can do 3D scans:

  • Lenovo Phab 2 Pro
  • Asus ZenFone AR

I would use them to create a sketch (e.g. of a room) first and then model it manually.

Android has had some sort of AR for at least a couple of years now, but Project Tango is a completely different thing that is only now making its way into mobile devices (it is full-on scanning technology that can turn a 3D space into a 3D model, something you can’t do with a generic mobile camera).

Sure, you may have heard of Tango many months back, but at that point it was still under R&D in Google’s labs.

As for the iPhone X itself, I would hold off on the purchase until the new RED phone is out, with technologies like a holographic display (made by the same company that produces those high-quality cinema cameras). The rumored price for that device is $1200, which is actually not much more, if you’re really able to spare that amount of cash for a phone.

Apple needed years to make an open API for Siri, and even then with limitations. I don’t think they will let developers use this technology that easily. Probably you’ll only get simple data.

If the simple data is just 3D points, that’s still great; at the presentation, Snapchat was using the 3D points from the iPhone X’s front sensors and camera. But those might only be face-related points, we don’t really know.