Use your smartphone to track the camera (IDEA!)

Hi! My name’s Adam Abrahamsson! I’m interested in doing visual effects and have done quite a lot of motion tracking! This has led me to think: why don’t we just track the camera motion directly on set, instead of doing it in post? It’s just an idea that popped into my head the other day. Being able to grab the movements from the actual camera would mean:

  1. No manual motion tracking needed.
  2. The possibility to get the camera movement from shots that would otherwise be very hard to track (e.g. blurry footage, a sky/sea, a shot during the night).
    Of course, a lot of the time motion tracking works just fine, but with this it might not be necessary, and it isn’t perfect! (Even though you can get great results!)

How is this made possible?

I’m by no means knowledgeable in this subject, that’s one reason I’m turning to you guys! :slight_smile:

I figured there are two things that somehow need to be measured: the rotation of the camera (a gyroscope might work for this) and the translation (X, Y, Z) of the camera (this I don’t know how to measure).

Modern smartphones might have everything needed to measure the camera’s movement! It would be great to create a solution for this, as I believe it would be an advancement in the realm of motion tracking!

Note that this is just an idea and I might demonstrate my lack of knowledge here, I just want to know the possibilities! :slight_smile: What do you think? Wouldn’t it be great to be able to precisely record the movements of the camera?! Please share your thoughts!

The data from phone accelerometers is pretty coarse. It could be used to refine tracking data in cases where there is extreme blurring, but integrating that data in would be complicated.

In theory, with high-quality accelerometers that could provide good enough data, you might be able to get a decent track from motion data alone. But you’re looking at a complex problem that would require a lot of research.
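To make the difficulty concrete: position has to come from integrating acceleration twice, so even a tiny constant sensor bias grows into position error quadratically over time. A toy sketch in Python (the 200 Hz rate and 0.05 m/s² bias are made-up illustrative numbers, not measured phone specs):

```python
import numpy as np

# Toy dead-reckoning example: double-integrate acceleration to get position.
# A small constant bias is added to show how the position error explodes
# even though the camera is not moving at all.
dt = 1.0 / 200.0                  # assumed 200 Hz IMU sample rate
t = np.arange(0.0, 10.0, dt)      # 10 seconds of samples
true_accel = np.zeros_like(t)     # camera is actually standing still
bias = 0.05                       # m/s^2, illustrative sensor bias

measured = true_accel + bias
velocity = np.cumsum(measured) * dt   # first integration: m/s
position = np.cumsum(velocity) * dt   # second integration: m

print(f"position error after 10 s: {position[-1]:.2f} m")
# roughly 0.5 * bias * t^2 = 2.5 m of drift from bias alone
```

That’s why IMU data in practice gets fused with an optical signal (which is what ARKit-style visual-inertial tracking does) rather than integrated on its own.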

Hmm, yeah I was thinking about that but had no idea how bad/good the accelerometer in phones actually was! Just dreaming here :wink:

That’s something that is already being done in the industry:
https://vimeo.com/161949709

You don’t need any of that. Just film and then use a motion tracker to get the rotation and movement of the camera.

This is perfectly possible. The only thing is that you cannot do this inside Blender; you have to use Unreal Engine 4 or Unity with something like ARKit or ARCore, not just the accelerometers. Then you can program some kind of recording app, and that’s it, you have everything you may need. It’s not perfect, but it may work.
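For what it’s worth, ARKit hands you the camera pose as a 4×4 transform every frame (ARCore can give you translation + quaternion directly), so the recording app mostly just has to serialize it. A hedged sketch of unpacking such a matrix on the post side in Python, assuming a row-major layout with the translation in the last column (check your SDK’s convention):

```python
import numpy as np
from scipy.spatial.transform import Rotation  # pip install scipy

def pose_from_matrix(m):
    """Split a 4x4 camera transform into a position vector and an
    (x, y, z, w) quaternion, assuming translation sits in the last column."""
    m = np.asarray(m, dtype=float)
    position = m[:3, 3]
    quaternion = Rotation.from_matrix(m[:3, :3]).as_quat()
    return position, quaternion

# Example: identity rotation, camera sitting 1.6 m up.
pos, quat = pose_from_matrix([
    [1, 0, 0, 0.0],
    [0, 1, 0, 1.6],
    [0, 0, 1, 0.0],
    [0, 0, 0, 1.0],
])
print(pos, quat)  # [0.  1.6 0. ] [0. 0. 0. 1.]
```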

cheers

Looch, the Lytro camera (while awesome) is solving an unrelated problem. The technology certainly could work in conjunction, though, if you want to spend a ton of money.

I think there is an opportunity for this technology, but someone has to put all of the pieces together. It will take more than just a garage tinkerer to do it, though. You’ll need to develop an exchange format for it to work in a non-hacky way, and somehow embed this information into each captured frame.
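Even before a real standard exists, a sidecar format is easy to imagine: one row per captured frame, keyed by timecode, with the camera pose alongside. A minimal sketch (every field name here is my own invention, not an existing spec):

```python
import csv

# Hypothetical per-frame camera-pose sidecar: position + quaternion per frame.
FIELDS = ["frame", "timecode", "tx", "ty", "tz", "qx", "qy", "qz", "qw"]

def write_take(path, samples):
    """samples: iterable of dicts keyed by FIELDS, one per captured frame."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        writer.writerows(samples)

write_take("take_001.csv", [
    {"frame": 1, "timecode": "00:00:00:01",
     "tx": 0.0, "ty": 1.6, "tz": 0.0,
     "qx": 0.0, "qy": 0.0, "qz": 0.0, "qw": 1.0},
])
```

A sidecar like this only survives as long as the timecode does, which is exactly why embedding the data into each frame would be the non-hacky way.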

… and it still won’t always help those who are working downstream. They don’t always have the luxury of dictating how things are shot, like figuring out how to tell Zapruder to turn the camera around a bit further. :stuck_out_tongue:

I could swear professional productions are/were already doing motion-control camera work (even as far back as the original Star Wars), so there’s likely already some kind of standards in place. Could be just a matter of getting it in a more consumer-level product, or finding a reasonable rental place.

There are several solutions which do on-set camera tracking already, but they are mostly for studio scenes. It’s nearly the same as a motion capture setup, except that you don’t track the actor, you track the camera.
All the current on-set previz relies on such techniques.
e.g. http://www.ncam-tech.com/

Motion control rigs are still too expensive and bulky to use for many shots in a movie. Look at the giant robots here:
http://stillerstudios.com/

And you need to know what you want to shoot before you shoot, because you need to program the exact motion into the robot. There is no quick change of plans on set.

Very limiting for most tasks on set.

I just thought it would be cool to basically slap your phone onto your camera and start tracking! Would be such a nice possibility! Would it perhaps be possible to use some kind of local GPS system to track the position…?

I already told you how to do this: you need Unreal Engine or Unity, program a small app to record camera movements using ARKit or ARCore, and that’s it, you have all you need as long as you have tracking.

GPS doesn’t have enough precision for this.

Cheers.

Same idea I had a couple of days ago!!! How can we make this a reality? As far as I see, it would go like this: slap a phone (in my case a Samsung S6) onto a GH5 (maybe on the hotshoe with a standardized adapter?) and have a small app built with Unreal or Unity running on the phone that records the video and the tracking data. Then you hit record on your GH5, do your normal shot, and that’s it. In post you take your GH5 footage, download your phone video + tracking data, and it magically marries in, say, C4D or Unreal. You then build your 3D content into your footage, export to maybe Fusion (inside Resolve) or Premiere, and you are good to go.

I have no idea how to go about any of this, but I am willing to learn, especially the Unreal/Unity part.
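If the phone app writes something simple like a CSV, the “magically marries” step is not much magic at all. A rough sketch of reading a hypothetical pose file and keyframing the active camera, done in Blender rather than C4D/Unreal (the path and column names are assumptions about what such an app would write):

```python
import csv
import bpy

# Keyframe the scene's active camera from a hypothetical pose CSV with
# columns: frame, tx, ty, tz, rx, ry, rz (translation in meters, Euler
# rotation in radians). Run from Blender's text editor.
cam = bpy.context.scene.camera

with open("/tmp/phone_track.csv", newline="") as f:
    for row in csv.DictReader(f):
        frame = int(row["frame"])
        cam.location = (float(row["tx"]), float(row["ty"]), float(row["tz"]))
        cam.rotation_euler = (float(row["rx"]), float(row["ry"]), float(row["rz"]))
        cam.keyframe_insert(data_path="location", frame=frame)
        cam.keyframe_insert(data_path="rotation_euler", frame=frame)
```

You’d still have to sync the first tracked frame to the GH5 footage (a clap or a flash in frame works) and match the scene scale by hand.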

Don’t see why game engines are required when sensor apps can log to csv and/or stream data over network protocols. For csv there’s this:

Or spreadsheet it into a bvh file (just a text file / csv with some headers and metadata).
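The bvh route really is that simple; here’s a rough sketch of writing one joint’s worth of camera motion as a BVH that most 3D apps can import (units and rotation order are placeholders for whatever your logger actually records):

```python
def write_bvh(path, samples, fps=25.0):
    """Write pose samples as a single-joint BVH file.

    samples: list of (tx, ty, tz, rx, ry, rz) tuples, one per frame,
    rotations in degrees. Channel order follows the common BVH convention.
    """
    with open(path, "w") as f:
        f.write("HIERARCHY\n")
        f.write("ROOT Camera\n{\n")
        f.write("  OFFSET 0.0 0.0 0.0\n")
        f.write("  CHANNELS 6 Xposition Yposition Zposition "
                "Zrotation Xrotation Yrotation\n")
        f.write("  End Site\n  {\n    OFFSET 0.0 0.0 1.0\n  }\n}\n")
        f.write("MOTION\n")
        f.write(f"Frames: {len(samples)}\n")
        f.write(f"Frame Time: {1.0 / fps}\n")
        # Data columns must match the CHANNELS declaration above.
        for tx, ty, tz, rx, ry, rz in samples:
            f.write(f"{tx} {ty} {tz} {rz} {rx} {ry}\n")

# e.g. two frames of a camera drifting along X:
write_bvh("camera_track.bvh", [(0.0, 1.6, 0.0, 0.0, 0.0, 0.0),
                               (0.1, 1.6, 0.0, 0.0, 0.0, 0.0)])
```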

A Structure sensor or similar depth sensor would probably be good for this, since you could get a camera track and some point cloud data at once. For a stage setup there is Lighthouse tracking, which has been used for real-time camera tracking to do mixed reality filming. There is also some promising DIY work being done with Lighthouse.