Please help if you understand camera tracking. Even with 0.12 error it looks awful

I tracked a video shot at 60fps in bright sunlight on a good camera (not my phone), with lots of high-contrast features to track. I got the lowest solve error I've had since I started learning camera tracking: 0.12. Yet the 3D objects in the shot slide around relative to the background.

I don’t understand what is going on. I’ve literally watched over a dozen hours of tutorials on camera tracking in the last 3 months. One video was over an hour long. So I don’t think I am missing any steps.

I think I am going to have to give up on all my ideas that involve camera tracking, and that is a real shame, since Blender is obviously capable, based on what I've seen other people do. But unless someone here can help me figure out what's going on, I can't keep wasting months of my life on things I can't get to work.


:thinking: Hmmmm… this sounds more like a threat than a cry for help. You can stop using Blender if you want to. But I would advise you to actually ask for help, because Blender is great :sunglasses:.


Threat? I’m just saying that I’ve waisted moths of time trying and I’ve never gotten the tracking to work. So I’ll have to give up on my ideas. Not blender. I’m still doing animation and other things in it.

I’ve asked multiple questions about tracking, and I’ve never gotten any actual help from anyone. And your message changing the meaning of what I said to a threat doesn’t help me get any closer either.

I did ask for help. The last part of my post is an expression of my desperation and the fact that no one here has ever actually helped me figure out the tracking. As you can see, you're the only person who has posted, and I have no idea if you've ever even done tracking, since you didn't actually answer my question.

Maybe I didn’t word it well enough.

I’m trying to understand how I can get such a low error rate and still have the tracking jiggle out of sink with the video.

Do you have tracking down? If so, what sort of things could I be missing that would cause this problem?

I did a lot of tracking in Blender, and it's very good and simple, but you need to have the focal length and sensor size correct. You can get very low error values and the track will still look awful if those parameters are not set correctly. Also, for optimal results, use manual trackers that can track through the entire footage; Blender can solve with just 8 good ones. I have a lot of examples, check it out: https://vimeo.com/lucascoutinho
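
For reference, here is a minimal bpy sketch of where those intrinsics live, in case it helps to check them from Blender's Python console (the clip name and lens values are placeholders):

```python
import bpy

# Assumes the footage is already loaded in the Movie Clip Editor;
# "shot.mp4" is a placeholder for your clip's name.
clip = bpy.data.movieclips["shot.mp4"]
cam = clip.tracking.camera

# The solver trusts these numbers, so enter the real lens data.
cam.focal_length = 35.0   # millimeters
cam.sensor_width = 23.5   # millimeters (e.g. a typical APS-C sensor)
cam.pixel_aspect = 1.0
```
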
Forgot something: before all that, you need to undistort your footage. I use NukeX Non-Commercial; it's limited to 1080p.

I put in the right focal length and sensor size, but when I use the refine focal length button, it changes what I typed, though I get a lower solve error.

I’ve never heard in any tutorial them say you have to undistort your footage. Are you serious? That would be a major oversight of design in Blender as well as of every tutorial out there not to mention that.

Don’t refine it. Just use that when you don’t have camera data. About lens distortion, Blender generates a undistorted node in the compositor when you create your 3D scene after tracking. It’s not very good, you need to undistort before. VFX it’s not something Blender artists are looking for, that’s why you didn’t heard before. But all movies have that process. Undistort > Work > Redistort.
Upload your video to Vimeo, tell me the camera model and lens, and I will create an example file for you.
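
As a rough illustration of that Undistort > Work > Redistort round trip, a sketch using Blender's own Movie Distortion compositor node, which pulls its distortion values from the clip's solved camera settings (the clip name is a placeholder, and this assumes the clip has already been tracked and solved):

```python
import bpy

scene = bpy.context.scene
scene.use_nodes = True
tree = scene.node_tree
tree.nodes.clear()

clip = bpy.data.movieclips["shot.mp4"]  # placeholder clip name

# Read the footage straight from the tracked clip.
clip_node = tree.nodes.new("CompositorNodeMovieClip")
clip_node.clip = clip

# Undistort using the solved lens distortion, then redistort the
# final result so it matches the original lens again.
undistort = tree.nodes.new("CompositorNodeMovieDistortion")
undistort.clip = clip
undistort.distortion_type = 'UNDISTORT'

redistort = tree.nodes.new("CompositorNodeMovieDistortion")
redistort.clip = clip
redistort.distortion_type = 'DISTORT'

out = tree.nodes.new("CompositorNodeComposite")

tree.links.new(clip_node.outputs["Image"], undistort.inputs["Image"])
# ... your CG compositing work would slot in between these two nodes ...
tree.links.new(undistort.outputs["Image"], redistort.inputs["Image"])
tree.links.new(redistort.outputs["Image"], out.inputs["Image"])
```
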

Maybe you should check these tutorials out; maybe they will help you… Sorry for the misunderstanding, hope there are no hard feelings :blush:.
https://www.blenderguru.com/tutorials/introduction-to-camera-tracking
https://blender.community/c/today/VTfbbc/

Welcome to my hell.

BOOOO

HISSSS

BOOOO

*throws cow poop at you*


I have asked whether I need to know the focal length of the camera beforehand (and whether using cheap Android phones that I can't find any focal length info for makes tracking impossible), and I've always been told I don't need to know the focal length. I've also never successfully tracked anything in a year of trying. Meanwhile CG Matter has 5-minute tutorials about how to get a successful track with ONLY 3 STILL PHOTOS SOMEHOW.


I consider CG Matter the best Blender channel online, and I learned tracking with his tutorials. He has some very good VFX skills. I started small and have moved on to more advanced shots. His videos mostly teach with footage that has smooth camera movements. For smartphones it is nearly impossible to get camera data, but I found these websites (see also the EXIF fallback sketched after the links):
https://www.digicamdb.com/sensor-sizes/
https://www.camerafv5.com/devices/
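
If those sites don't list a particular phone, one rough fallback is to estimate the sensor width from EXIF data in a still photo taken with the same camera. A hedged Python sketch using Pillow; it assumes the photo actually carries both focal length tags (many phone videos carry no EXIF at all), and the 36 mm constant assumes the usual full-frame-width convention for the 35mm equivalent:

```python
from PIL import Image, ExifTags

def estimate_sensor_width(photo_path):
    """Rough sensor width estimate from EXIF, in millimeters."""
    exif = Image.open(photo_path).getexif()
    # Lens tags live in the Exif sub-IFD (0x8769).
    ifd = exif.get_ifd(0x8769)
    tags = {ExifTags.TAGS.get(k, k): v for k, v in ifd.items()}
    focal = float(tags["FocalLength"])               # real focal length, mm
    focal_35 = float(tags["FocalLengthIn35mmFilm"])  # 35mm equivalent, mm
    # Crop factor = focal_35 / focal; a full-frame sensor is 36 mm wide.
    return 36.0 * focal / focal_35

print(estimate_sensor_width("phone_photo.jpg"))  # placeholder path
```
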

I did a lot of tracking, and also tried PFTrack and 3DEqualizer; they all work better with undistorted footage.

This one, I undistorted manually in Nuke and tracked in Blender, no camera data:

Same shot with CG, still a WIP:

Thanks for the link, but that blenderguru tutorial is 9 years old. Some of the buttons aren’t even the same any more.

Since none of the tutorials I've watched bothered to undistort the video, and it worked for them, I don't think I'm going to go down that road. I could see doing it once everything else was working, to make the result even better.

Like I said, I've tried this with my good-quality Samsung phone, as well as with a $1,000 Sony camera that is not part of a phone. Same problems.

That’s great you have been able to get it to work. I’d be curious if you tried doing it without “undistorting it” and what the difference in quality was for the track.

Tracking is one of those very difficult things that is made to look easy in tutorials. In fact it is worse than difficult: it is a very dark art. I think it is possible to use Blender for tracking - but in reality I think it is very difficult to learn tracking with Blender.

The non-Blender route is, I think, going to be a much more successful learning path, but it is going to cost. I would suggest either giving up and focusing on some other part of 3D, or picking up a copy of Syntheyes. It is the only affordable option and, sadly, has a terrible 1990s interface - but it is state of the art and used by professionals. It also has good tutorials written by its developer.

Nuke Non-Commercial will track, so it could be used to learn - but you can't export the solve. You can build 3D geometry in Nuke, so you can render out to see if the track is working. PFTrack has a nice interface but is considerably more expensive. 3DEqualizer requires a mortgage to buy. Fusion standalone has a 3D camera tracker and costs about the same as Syntheyes - but has considerably less functionality.

You can’t track distorted plates so either you undistort - or the software undistorts under the hood. At the risk of oversimplification tracking is finding a solution (regression) to a set of mathematical equations. Having a low solve error just means the solution found is a good fit to the data you input. That good solution may bear no relationship to the real world that the data derived from.

Even recent tutorials commit the unforgivable sin of not providing the viewer with the exact video file they used, so you can't follow along on the same footage.