Why can I never EVER get a good track?!

So I am very upset. I bought a Canon Vixia HF R21 about two years ago for the purpose of making videos, doing effects, etc. However, one of the main reasons I bought it was so that I could do some camera tracking with it. But in the dozens of tracks I have done with it, I have never once been able to get a good track out of the footage I use. I get an average solve error of like 150…

Please. If anyone could help me by using the below footage to try to get a good track, I would love you. Even more, if you could go as far as to screencast yourself doing it, I would find that very helpful. I really want to be able to get good at motion tracking with this camera, but I am having a very hard time doing so :confused:

Thank you SO much for the help. Again, it is a Canon Vixia HF R21.

The footage is in 1080i, shot at 29.97 fps. The focal length SHOULD be around 5mm, but I’m not even sure about that. All I know is that I wasn’t zoomed in all that much. As for the sensor width, don’t even ask. I don’t know much about cameras…


Adding a 5th point: remove any sort of in-camera stabilization when you shoot… this warps the image and makes it nearly impossible to track. Just stabilize it after the track.

Adding a 5th point: remove any sort of in-camera stabilization when you shoot… this warps the image and makes it nearly impossible to track. Just stabilize it after the track.

Quoted in agreement… again, refer to the above link.

In addition to what everyone else told you, are you converting the footage to an image sequence?

What do you mean ‘use the 35mm equivalent’? And yes, I usually do convert to a PNG sequence, but I didn’t think you needed to. Andrew Price uses an MP4 in his tutorial and everything works fine. Also, I shoot in 1080i because it’s the best I can shoot at. It gives me the most detail to track…

OK, it says this for the sensor data:

Image Sensor: 1/4.85-inch CMOS, RGB Primary Color Filter

How do I get the sensor width out of that? Also, I looked up some 35mm equivalents, but I’m not quite sure what my camera’s focal length should be.

Oh and thanks so much for all the help everyone.

If you can, shoot 720p instead of 1080i… the “i” means the footage is interlaced, as in it only records every second line for each frame: frame 1 will be, say, all the even-numbered lines, and frame 2 will be all the odd-numbered lines…
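To make the interlacing problem concrete, here is a minimal sketch (hypothetical field data, not from the actual footage) of how an interlaced frame weaves together two fields captured at different moments, which is what confuses the tracker:

```python
# A 60i frame is woven from two fields captured ~1/60 s apart.
# Even lines come from one instant, odd lines from the next, so any
# motion between the two fields produces a "combed" image that
# feature trackers struggle with.
field_even = ["even-0", "even-2", "even-4"]  # lines 0, 2, 4 (first field)
field_odd = ["odd-1", "odd-3", "odd-5"]      # lines 1, 3, 5 (second field)

frame = []
for even_line, odd_line in zip(field_even, field_odd):
    frame.append(even_line)  # from the first capture instant
    frame.append(odd_line)   # from the second capture instant

print(frame)
```

Progressive footage (720p) captures every line at the same instant, so none of this applies.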

What do you mean ‘use the 35mm equivalent’?

Blender doesn’t really care what your real sensor width is… what really matters is the ratio between the sensor width (sw) and focal length (fl), which is a good thing, because your camera doesn’t actually always use the whole sensor in certain modes…

Zoom Ratio: 28x Advanced / 20x Optical / 400x Digital
Focal Length: f=3.0 – 60mm
35mm Equivalent:
Optical Zoom: 41.2 – 824mm (Standard IS/OFF), 49 – 980mm (Dynamic IS ON)
Advanced Zoom: 41.2 – 1154mm (Standard IS/OFF), 49 – 1176mm (Dynamic IS ON)
Frame Rate: 60i, 24p Progressive (records at 60i), 30p Progressive (records at 60i)

Here you can see what I mean. At its widest setting, your camera’s lens has a focal length of 3.0mm, but you need to know the exact sensor size for Blender’s tracker to work properly, and that is usually very hard to find (it’s a long, confusing subject… a 1/4.85" sensor type translates to a sensor width of somewhere around 2.8mm). It’s much easier to just use the 35mm equivalent they give you: Blender will give you the same solve whether you set sw = (somewhere around 2.8mm) and fl = 3.0mm, or sw = 36mm and fl = 41.2mm (again, it’s the ratio that counts, not the real values).
Now look at the difference between (Standard IS/OFF) and (Dynamic IS ON): they are very different. Since it’s the same lens at the same setting, the only explanation is that the camera uses less of the image sensor in IS mode, and I guarantee you will never find that number.

So, set sw = 36mm and fl = 41.2mm, and leave IS off (it will mess up your lens distortion values).
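As a sanity check on the ratio argument (the focal lengths come from the spec sheet above; the computed sensor width is an estimate, not an official figure):

```python
# Blender's solver only cares about the ratio focal_length / sensor_width.
# A 35 mm full-frame reference is 36 mm wide, so the "35 mm equivalent"
# numbers imply the camera's real sensor width.
FULL_FRAME_WIDTH = 36.0   # mm, width of a 35 mm film frame

fl_real = 3.0             # mm, lens at its widest (spec sheet)
fl_equiv = 41.2           # mm, 35 mm equivalent (Standard IS/OFF)

# Implied real sensor width from the equivalence:
sw_real = fl_real * FULL_FRAME_WIDTH / fl_equiv
print(round(sw_real, 2))  # ~2.62 mm, in the ballpark of the ~2.8 mm guess

# Either (sw, fl) pair gives the same ratio, hence the same solve:
print(abs(fl_real / sw_real - fl_equiv / FULL_FRAME_WIDTH) < 1e-9)
```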

AHHH! Success!! :smiley: Thank you all SO much! I got a great track with a solve error of only .6 by doing what you guys said.

But there is one more thing I’d like help on, if you all would be so kind. It’s with the compositing. When I do “Setup Tracking Scene”, the plane it creates gives me a weird result when rendered. Any idea how to fix this? Again, thank you all so much for the help. I have been waiting years for this moment :slight_smile:

This is what I am speaking of:

The result of all of your efforts… It’s not anywhere near perfect (there are clearly some sliding issues), but I could easily have added more trackers in to make it better… Thank you so much everyone!

Oh, I ended up just using a simple Alpha Over node with Shadows Only to do the shadows… it gave me some problems later on when I tried to do Vector Blur.

I highly suggest using Cycles with true motion blur instead of vector-blurring in the compositor; this way the shadows will be blurred correctly.
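For what it’s worth, enabling true motion blur in Cycles is just a couple of render settings. A minimal sketch using the Blender Python API (only runs inside Blender, and the exact property names may vary between Blender versions):

```python
import bpy  # Blender's Python API; only available inside Blender

scene = bpy.context.scene
scene.render.engine = 'CYCLES'          # switch to the Cycles renderer
scene.render.use_motion_blur = True     # sample real 3D motion at render time
scene.render.motion_blur_shutter = 0.5  # shutter open for half a frame
```

The same settings are available in the Render properties panel, so no scripting is actually required.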

If I post the file, could you show me how I would do all that? I’m basically brand new to this compositing thing and I know very little about Cycles and render layers and all that… Thanks!

the video:

the blend file:

Nah, I don’t have the time to do it in your file… working on my own tracking shots atm.

Just look at an introduction to Cycles rendering… and then go from there. There will never be a perfect tutorial for your exact setup; the best thing to do is absorb the information and concepts from the tutorial and experiment for yourself.

OK. Thanks anyway for all the help you’ve been today. I will be sure to look into Cycles render layers and all that some more. Cycles is the better option anyways.