I need some help with tracking

I’m new to camera tracking, and I need some help. I have set up a track which seems to be all right, but when I press Solve Camera it still has a very high solve error of 120. With a little bit of tweaking in the camera data settings I could bring the solve error down to about 15, but that’s still too much. Have you got any idea? Please help! :(

This is my camera: http://camcorder-test.slashcam.com/compare-cmd-i-view-u-what-i-detail-u-lang-i-en-u-id-i-158-u-name-i-Panasonic-HDC-SD66.html

Here is a screenshot.


You need to input the correct camera settings for focal length and sensor width in the Camera Data panel.
The sensor is about 3.2mm and you should know what focal length you used when you shot your footage.
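
If you prefer to set these from a script rather than the Camera Data panel, here is a minimal sketch for Blender’s Python console. It assumes your footage is already loaded in the Movie Clip Editor, and the 3.2 mm / 3.02 mm values are just examples; use whatever matches your own camera and zoom setting.

```python
import bpy

# Assumes the footage is the first (or only) clip loaded in Blender.
clip = bpy.data.movieclips[0]
cam = clip.tracking.camera   # per-clip camera intrinsics used by the solver

cam.sensor_width = 3.2       # sensor width in mm (example value)
cam.focal_length = 3.02      # focal length in mm (example: wide end of the zoom)

print(cam.sensor_width, cam.focal_length)
```

These are the same two values the Camera Data panel exposes, so changing them in the panel or from the console has the same effect on the solve.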

Thanks. Now I have set the sensor width to 3.2. But where exactly do I get the focal length from? When I shot the scene I didn’t change anything; I used the standard settings. Beneath my camera lens there is a small bit of information that says f=3.02~75.5 mm 1:18. Is this the focal length?


I just did another camera solve, but now the solve error went up to 1180.08. I looked again, and I think my sensor width is 35.7. But what about the focal length? Another thing I noticed is that when it finished solving the camera, it said something like “some data could not be reconstructed, see console for more information”. Could it be that this has something to do with the video format? Originally my video was AVCHD. Blender couldn’t use that, so I rendered it out to an AVI in Sony Vegas.

I tried new video formats, but that also didn’t help. It got even harder to track the points.

Going by the first screenshot you posted, it looks like your source video is interlaced. This will probably cause havoc with the motion tracking. Go to a section of your video where there’s significant movement and zoom in - does the video look like it’s split up into fine horizontal lines?

Ideally, you would shoot the footage as progressive video, but some cameras don’t have this as an option. What model camera do you have?

If you can’t shoot progressive, you can convert interlaced video to progressive on your computer before importing into Blender.
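
If Vegas keeps giving you trouble, another option (assuming you have, or can install, the free ffmpeg tool) is to deinterlace with ffmpeg’s yadif filter before importing. Here is a small Python wrapper as a sketch; the file names are placeholders:

```python
import subprocess

def deinterlace(src, dst):
    """Deinterlace src into a progressive file dst using ffmpeg's yadif filter."""
    subprocess.run(
        ["ffmpeg", "-i", src,   # interlaced source (e.g. the AVI exported from Vegas)
         "-vf", "yadif",        # deinterlace: one progressive frame per input frame
         "-c:v", "libx264",     # re-encode; any codec Blender can read is fine
         "-crf", "18",          # near-visually-lossless quality
         dst],
        check=True)             # raise an error if ffmpeg fails

deinterlace("shot_interlaced.avi", "shot_progressive.mp4")
```

Whatever tool you use, check the result for combing artifacts on movement before you start tracking.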

What Richard said is important, but making sure the video is progressive is the first thing to do.

Yes, the video does split up into horizontal lines when there is significant movement! So my video is interlaced. This is my camera: http://camcorder-test.slashcam.com/compare-cmd-i-view-u-what-i-detail-u-lang-i-en-u-id-i-158-u-name-i-Panasonic-HDC-SD66.html

But how can I find out if I can shoot progressive video with my camera, and how can I convert my videos?

OK. I rendered the scene out again in Sony Vegas, deinterlaced and everything. But now, when I import the scene into Blender and try to track it, it still doesn’t work. When I play back the scene in Blender, the image jerks around for a few seconds, but then the rest plays normally. Tracking still doesn’t work; it is even worse now. The tracking points just slide away a little bit, and then start tracking. Here is a screenshot. I hope you understand what I’m trying to explain :)


The following might be causing problems:

  1. format of the video

  2. quality of the de-interlacing

  3. badly tracked points

  4. tracking the right objects

  5. sensor size and focal length settings

Taking these in turn:

  1. AVI is a container format; the actual video and audio inside could be any number of codecs. To avoid compatibility problems, people sometimes use PNG sequences (just a load of PNG files in a folder). See this thread for how to export a PNG sequence from Vegas:
    http://forums.creativecow.net/thread/24/889840

    To open the PNGs in Blender, go to the Movie Clip Editor, click Open, navigate to the folder containing the PNGs, and double-click the first image in the folder.

  2. Video can be de-interlaced in a number of ways, and not all programs do a good job of it. I don’t know what Vegas is doing, but look carefully at your exported frames and make sure they don’t show any signs of double images or aliasing on movement. The video should look clean, with the exception of motion blur, which is to be expected in any footage.

  3. It’s important that all track points are free from errors. Select each track point, click ‘Lock to Selection’, zoom in on the point, and slowly step through the video, making sure the track doesn’t jump around. Repeat for all the points (the script sketch after this list gives a quick way to spot the worst offenders).

  4. Blender’s motion tracking system is a work in progress. The following seems to improve the quality of tracks for me:

  • Points in the distance often don’t solve well and cause problems. Try limiting track points to within about 30 feet of the camera (few or no points at the horizon).
  • Make sure the camera’s location changes; videos with no movement except panning don’t work as well.
  • Go to frame 1 before clicking ‘Solve Camera (or Object) Motion’. I heard somewhere that this is necessary.
  • I select ‘Focal Length, K1’ in the Refine option before solving.
  5. I don’t know how important it is to get the camera data right (I’m still a newbie at tracking). It’s not easy to know what focal length a video was shot with on a lot of consumer cameras. What I’ve done so far is leave the sensor at 35.00 and adjust the focal length to what I think is right. This requires a bit of trial and error (and familiarity with lenses). For the video of the field, try leaving the sensor at 35.00 and increasing the focal length from 25.00 to 35.00 in 3 mm increments to see if the solve error improves at one of the settings; the sketch below shows one way to read off the solve error for each attempt.
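
For points 3 and 5, if you don’t mind a few lines of Python, here is a rough sketch for Blender’s Python console (run it after a solve; the property names come from Blender’s tracking API, and the clip index assumes only one clip is loaded). It prints the overall solve error plus each track’s own error, which makes hunting for bad markers and comparing focal length guesses a bit quicker:

```python
import bpy

clip = bpy.data.movieclips[0]
tracking = clip.tracking

# Overall average reprojection error of the last solve
# (the same number shown in the Solve panel).
print("solve error: {:.3f}".format(tracking.reconstruction.average_error))

# Per-track errors, worst first. Only tracks that were actually
# reconstructed (have a 3D bundle) report a meaningful error.
solved = [t for t in tracking.tracks if t.has_bundle]
for track in sorted(solved, key=lambda t: t.average_error, reverse=True):
    print("{:20s} {:.3f}".format(track.name, track.average_error))
```

Any track whose error stands well clear of the rest is a good candidate for the ‘Lock to Selection’ check above, and re-running the script after each focal length change gives you a quick way to compare solves.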

Have you watched any tracking tutorials? I found them very helpful for understanding the process.

I don’t think the camera can shoot progressive video, but that shouldn’t be a problem as long as the de-interlacing is good.

OK, thanks for your help. I rendered out another deinterlaced file in Sony Vegas, following a tutorial, and this one works a lot better. Another thing I will do now is re-shoot the scene with somewhat steadier camera handling and some “artificial” tracking points. I think I will also try a different shutter speed to avoid strong motion blur. :)