Camera motion solver does not position points correctly.

I have a video sequence in which the camera is rotated (panned) from right to left. I have set up about 12 tracking points and am trying to reconstruct the camera movement using the solver in the Movie Clip Editor.

Here is how the tracking works.


Unfortunately, running the solver does not reconstruct the scene correctly. Looking at the tracking points in the editor, it is obvious that the ones at the bottom of the screen are closer to the observer than the ones in the centre. However, the points that appear in the 3D view are arranged in a strange way:


The first problem is that they end up very close to each other. The second is that their ordering seems completely random: sometimes it is as it should be (closer points end up closer), sometimes not. I cannot diagnose the reason for this strange behaviour. The solve error is 1.35, which is supposedly quite good (from what I have read). I have also experimented with the "Optical Center" settings, but I did not see any improvement.

Unfortunately, I do not have the optical details of the camera that was used, so I use generic ones. I don't know whether it is related to the problem, but the behaviour was different when I tried another video (which differs from this one: it shows one object that is always visible, while here the whole scene moves).

The clip is quite short: I use only about 15 frames. What could be the reason for this strange behaviour?

If the camera is not translating (up, down, left, right, back, forward), the solver has no way of differentiating distance: all of the trackers move with the same speed and direction, so there is no parallax. There may be a way for it to still work, but you're making it harder than normal. I'm not 100% sure, but planar tracking might be a solution.
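To see why a pure pan carries no depth information, here is a minimal sketch with a pinhole-camera model (plain numpy, not the Blender API; the focal length and point coordinates are made-up example values). Under a rotation about the camera centre, a point and the same point at twice the distance project to exactly the same pixel, so their depths are indistinguishable; adding even a small translation separates them, which is the parallax the solver needs.

```python
import numpy as np

def project(p, f=35.0):
    """Pinhole projection of a camera-space point onto the image plane."""
    return f * np.array([p[0] / p[2], p[1] / p[2]])

def rot_y(deg):
    """Rotation matrix for a pan (yaw) around the Y axis."""
    a = np.radians(deg)
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

near = np.array([0.5, -0.2, 4.0])   # a point 4 units from the camera
far = near * 2.0                    # same viewing direction, twice the depth

R = rot_y(10.0)                     # pan the camera by 10 degrees

# Pure rotation: both points land on exactly the same pixel,
# because scaling a point along its viewing ray cancels in x/z, y/z.
print(np.allclose(project(R @ near), project(R @ far)))   # True

# Add a sideways translation: the projections now differ (parallax),
# which is what lets the solver recover relative depth.
t = np.array([0.3, 0.0, 0.0])
print(np.allclose(project(R @ near + t), project(R @ far + t)))   # False
```

This is also why rotation-only footage is normally handled as a special case: the solver can recover the camera orientation, but the reconstructed point depths are essentially arbitrary, which matches the random-looking point placement you are seeing.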