On the whole, the camera tracker works like a dream. However…
On some occasions, though, the solved camera moves in completely the wrong direction - and that's despite a solve error of less than one pixel.
I can only put this down to Blender mistaking foreground tracks for background tracks (or vice versa).
Is it that Blender suffers from the 'turning silhouette' illusion? I'm sure most of us have seen it - the spinning figure that can be perceived as rotating in either direction in 3D space. Similarly, I'm guessing that the camera tracker has no real idea of what's near or far, since all it has to go on is a sequence of 2D images. I'm no mathematician, but if this is indeed the case, then surely a function to mark tracks as B/G or F/G would be most welcome!
But just to reiterate… Is there a way of 'telling' Blender which tracks are which - a way of signalling that, e.g., a track is in the near distance relative to the others?
Failing that, is there a workaround for the problem?
Any suggestions would be gratefully received!
PS There is, of course, the possibility that I’m being a complete spoon and overlooking the obvious!