Help tracking a scene...

Hello all…
I need some help tracking a scene that looks like it should be really easy. My son and I took a DJI Phantom 3 and flew around an old train bridge. I wanted to import the footage into Blender, track it, put in a 3D train, and send it down the tracks. I thought this would be easy since the scene seems to have a lot of trackable features, slow camera movement, plenty of parallax, etc. After working on it for hours and hours, though, I'm starting to lose hope.

Can anyone offer some advice on how this type of scene can be tracked? I've uploaded the original movie to my Google Drive here: https://drive.google.com/open?id=0ByMMbmA9cIVvRlB6Ynk3bG9sN1U. It is an mp4 file, around 50 MB, ~500 frames or so. The initial preview of the movie is very low quality, but if you download it, it is full quality.

I looked around for the camera specs on a DJI Phantom and put those into Blender, and the solution actually got worse. It has a 1/2.3" sensor and a 20mm focal length with a 94 degree FOV (info gathered from here). When I try to track it, Blender can't find the perspective at all, and when I set up the scene from the solution, everything is flipped around. I've tried setting the floor plane by selecting three points on the track; that didn't work. I picked three points on the water below; that didn't work. I tried setting the X axis by picking a point along the track; that didn't work. Obviously I don't know what I'm doing.

Can anyone offer any advice in this situation? Thanks!


Are you tracking points on the trees? If so, that could be where the problem is coming from. Trees don't usually provide very solid track points, so I would stick mainly to points on the tracks/bridge structure.
If not, then make sure that your two solving keyframes are good ones with strong perspective information.

Use more manual trackers and place them on the bridge, the railway, and static ground, or delete the ones that sit on moving trees and bushes before the solve. Moving tracking points can mess up the solve.
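
If you want to cull the bad ones in bulk, here is a minimal bpy sketch (a workflow suggestion, not a tested script) that flags tracks with a high average reprojection error after a first solve so you can delete them in the Movie Clip Editor; the 1.0 px threshold is just a starting guess:

```python
# Minimal sketch: after a first solve, select the tracks whose average
# reprojection error is high so they can be deleted in the Movie Clip
# Editor. Assumes a single movie clip is loaded.
import bpy

ERROR_LIMIT = 1.0  # pixels; an assumed threshold, tune to taste

clip = bpy.data.movieclips[0]
for track in clip.tracking.tracks:
    # Tracks on swaying trees and bushes tend to end up with a high error.
    track.select = track.average_error > ERROR_LIMIT

print("flagged:", sum(t.select for t in clip.tracking.tracks))
```

Blender's Track > Clean Up > Clean Tracks tool in the Movie Clip Editor does much the same filtering interactively.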

Setting the ground plane for scene orientation does not improve the solve; it only scales, rotates, and translates the whole scene after it is solved. You must first fix the 2D feature tracks, then the solve, and only then align the scene.

Is it possible that the image has been electronically stabilised in the camera? If so, Blender will have a hard time reverse-engineering the sensor location from the reference trackers, as the image centre actually drifts around.

Thanks for the tips. I can get a good track (less than 1% error), but the perspective is still all screwy: the whole scene is elongated and not level. I will post a picture when I get a chance.

Are your tracking points scattered enough, and not lumped together in some small area? And do you let the solver calculate the FOV for you, or do you set it up manually?

If the trackers are too close together, you can easily get an inverted perspective or a wrong FOV and still have a small error.

The data you are using for the FOV is wrong. The 94 degree FOV is for the 2.7K resolution, but your footage is the HD version, which is cropped from the sensor; the real FOV is in the 60-70 degree range. You must also set the sensor size according to this 1920/2704 = 0.71x crop. The focal length stays the same across the crop, but it is not 20mm (that is the 35mm equivalent); the real value is in the single-digit millimeters.

So to put it all together:
Sensor:

  • 1/2.3" sensor is 6.17 x 4.55 mm
  • you are interested in horizontal measure as it does not use all lines vertically (2704/1520 = 1.778, but 6.17/4.55 is 1.356)
  • real used sensor width is 6.17mm x 0.71 = ~4.38 mm. Height is 4.38 / 1.778 = ~2.46mm

Focal Length:

  • 35mm equivalent is 20mm
  • real focal length is 20mm x (6.17/35) = ~3.53mm

FOV:

  • you can calculate the FOV with basic trigonometry: FOV = 2 x atan(0.5 x sensor width / focal length)
  • FOV = ~63.6 degrees
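
If you want to sanity-check the arithmetic, a small plain-Python snippet reproduces the numbers (the tenth of a degree it differs from ~63.6 is just intermediate rounding):

```python
# Quick check of the numbers above; plain Python, nothing Blender-specific.
import math

SENSOR_W_FULL = 6.17            # full 1/2.3" sensor width in mm
CROP = 1920 / 2704              # HD frame cropped from the 2.7K width, ~0.71
ASPECT = 2704 / 1520            # 2.7K aspect ratio, ~1.778

sensor_w = SENSOR_W_FULL * CROP        # ~4.38 mm effective width
sensor_h = sensor_w / ASPECT           # ~2.46 mm effective height
focal = 20.0 * (SENSOR_W_FULL / 35.0)  # ~3.53 mm real focal length

fov = 2 * math.degrees(math.atan(0.5 * sensor_w / focal))
print(f"{sensor_w:.2f} x {sensor_h:.2f} mm sensor, {focal:.2f} mm focal, {fov:.1f} deg FOV")
# -> 4.38 x 2.46 mm sensor, 3.53 mm focal, 63.7 deg FOV
```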

The numbers I slapped together are based on the assumptions that the 2.7K resolution fills the sensor horizontally, that the HD resolution is a crop from the sensor, and that the 20mm equivalent is correct. All of these assumptions can be wrong, but from my testing, a FOV around 64 degrees is a much better estimate than 94 degrees. The solver gives me 71 degrees (I use SynthEyes myself), so there can be something else going on; I would put my 2 cents on a wrong focal length equivalent.
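
If it helps, here is a rough bpy sketch for pushing those estimates into Blender's tracking camera before re-solving (I use SynthEyes myself, so treat the exact values as starting points, not specs):

```python
# Rough sketch: feed the estimated intrinsics to Blender's tracking camera.
# Both values come from the crop estimate above, so they are assumptions.
import bpy

cam = bpy.data.movieclips[0].tracking.camera
cam.sensor_width = 4.38   # effective (cropped) sensor width in mm
cam.focal_length = 3.53   # real focal length in mm, not the 35mm equivalent
```

You can also enable focal length refinement in the solve settings so Blender fine-tunes the value during the solve.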

That is really helpful, thanks! The track is looking much better.
Thanks for the help, everyone.

Hi, and all the best to all of you!

I would kindly like to ask for your help on the same topic. I have footage from a DJI Mavic Mini, and it never works with my Blender animations; I always get perspective errors.

What settings do I have to enter to get accurate results? Please just let me know what I have to do!

Thank you so much!
All the best and kind regards,
I hope you can help me.

Boris