How far can I push Blender's camera tracker?

I wasted way too much time tracking this :rolleyes: (and 5 whole minutes throwing in some Suzannes tracked to the camera), but here ya go… hoverbike your brains out:

It would be much faster/easier to track if you used some markers… if you don’t want to mask them out later, just use some small rocks or sticks - anything that wouldn’t look out of place on the lawn or on the surrounding trees.

I started by picking what seemed to be some of the best keyframes for perspective change (150, 188), used the 35mm-equivalent values (as I advised you to do earlier), and refined fl, k1, and k2.

Thanks, that looks really good! I was thinking about shooting the same thing today but putting some colored markers, or at least printed numbers at different places in the yard to make tracking easier, but I’d have to see if the After Effects erase function does a good job. I have the Mocha Pro trial, but I can’t rely on that since it’s only good for two weeks.

I’m looking at your file now, and I’d like to ask you a few questions. You say to use the 35mm-equivalent values. In the case of this camera, a Canon XF100, the manual says the sensor is a 1/3-inch CMOS. So that would be 0.33 inches, or 8.382 mm. Is that the right way to do it, or is there some formula for the conversion? I’m asking because in your file you set the sensor to 36mm. Then the focal length you put in there, 33.83mm: is that what the solver calculated when you set it to refine focal length, k1, k2? Because in reality the actual focal length is less than that. This is the spec for the lens in the manual:

f=4.25-42.5mm, F1.8-2.8, 10x optical zoom
35mm equivalent: 30.4-304mm

With the zoom all the way back, it seems to me it can’t be 4.25mm, because that would look too wide, and it doesn’t. But it doesn’t look like 30.4mm either; to me it’s more like 20mm, though I could be wrong.
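For what it’s worth, the “35mm equivalent” conversion is just a crop-factor multiplication, and the lens spec quoted above already contains everything needed to derive it. Note that “1/3 inch” is a legacy tube-type designation, not the physical sensor size, which is why converting 0.33 in to mm directly gives the wrong number. A quick sketch of the arithmetic (plain Python; the numbers come from the XF100 spec, the variable names are mine):

```python
# 35mm-equivalent focal length = real focal length * crop factor.
# The lens spec itself gives the crop factor: 30.4mm equiv at 4.25mm real.

real_fl_wide = 4.25          # mm, from the XF100 lens spec
equiv_fl_wide = 30.4         # mm, 35mm equivalent from the same spec

crop_factor = equiv_fl_wide / real_fl_wide      # ~7.15

# A full-frame 35mm diagonal is ~43.27mm, so the actual sensor diagonal is:
sensor_diag = 43.27 / crop_factor               # ~6.05mm, not 8.382mm

# Entering 36mm as the sensor width together with the 35mm-equivalent
# focal length in Blender is therefore self-consistent: both values are
# expressed in full-frame terms.
print(crop_factor, sensor_diag)
```

So the solver’s refined 33.83mm is a 35mm-equivalent value; dividing it by the crop factor of ~7.15 gives roughly 4.7mm of real focal length, which is near the wide end of the zoom, consistent with the shot.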

I’m surprised that you don’t have a lot of markers there, and yet you got a great solve. Also, all your markers seem to be just “Loc” and rather small, and you left the speed at “Fastest”. I usually start with markers at 50 and a 250 search area, because small markers lose the track very easily. But you accomplished this perfect solve with a small number of small markers, without changing the speed, so I’ll try that approach and see what happens.

Thanks to everybody for all your help!

sw = sensor width, fl = focal length… I was just too lazy to type it out each time…

The smaller your marker is, the easier it will be to mask out. I have often used the tops of 2-liter Coke bottles as markers: small, but quite visible from a distance. If the grass is too tall, I put them on nails or small sticks pushed into the ground (and yes, I always recycle). It goes without saying that you should make dang sure you retrieve ALL of the nails right away, or the next time you mow the lawn it may become extremely unpleasant.

If it’s a natural setting (or anything that is not too sterile), just spread some leaves and flowers and stuff around.

I don’t, but I may buy a few if they’re not expensive. I shot new footage with the sheets I printed, and at first I had terrible results, but I started changing the A and B keyframes from 0 in increments of 50, with a range of 100, and eventually got a full solve with an error of just over 4. That’s not great, but it’s much better than what I got before. I’m sure I can refine it if I spend a lot of time on it, but first I have to see whether After Effects will erase the paper sheets correctly, and if not, use something else like those golf balls, or ping-pong ones.

One thing I can tell is that the distances are all messed up. For example, the distance between two specific sheets was two meters in reality (I measured it), but if I select the markers for those two sheets and set the scale to 2 meters, the distance the solved camera travels is far less than the distance I walked in reality, which is about 24 meters. To place the beginning of the camera travel at 24 meters, I have to enter a scale of 3 meters between those two sheets.
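A back-of-the-envelope check of the mismatch described above (numbers from the post; the helper name is mine, not a Blender API):

```python
# If setting the marker distance to 3 m (instead of the measured 2 m) is
# what gives the correct 24 m of camera travel, then at scale 2 the solved
# travel must have been 24 * (2/3) = 16 m, i.e. the solve came out ~1.5x
# too small everywhere.

def corrected_scale(set_scale, solved_travel, real_travel):
    """Scale value that makes the solved camera path span the real distance."""
    return set_scale * real_travel / solved_travel

solved = 24 * 2 / 3            # 16.0 m of solved travel at scale 2
print(corrected_scale(2, solved, 24))   # 3.0
```

Since the solve is only defined up to a global scale, a uniform shrink like this usually just means the two reference markers were solved slightly closer together than reality; picking two markers that are farther apart (and tracked very cleanly) tends to give a more reliable scale.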

Another curious thing is that the solved camera doesn’t turn around as much as I did in reality. I have a path that the hoverbike follows, and it lines up along one area before I turn, but sits 2 meters to the left after I turn.

One thing I don’t understand is why, to define X and Y, you can only select one marker. I mean, you should set each axis by selecting two markers that you placed on the shoot and that you know establish X and Y (X and Y in Blender; X and Z in most other programs).

Also, one thing that I’ll never understand is how scales in the FBX format differ depending on the program. For example, I export this as FBX and open it in Modo, and the camera travels a few millimeters. To import the FBX into Modo with the correct camera distances, I have to set the Blender export to a scale of 100. Then Modo opens it with a gigantic camera wireframe that I have to scale down to 0.01, and on top of that, the camera in Modo shows a scale of 115084.7534%, keyframed to values that drift slightly up and down. I had dealt with FBX scale screw-ups before, but this is the first time I’ve seen the camera come in with keyframed scale.

You can get a dozen of the hollow plastic ones for 2 or 3 dollars at places like Target or Walmart. I hot-glued mine to some tees so that they sit above the grass.

One thing I don’t understand is why to define X and Y you can only select one marker.

Because you’ve already selected the origin, which serves as the other point on the line.

Steve S

Oh, so it is two points, it’s just that one of them is the origin. That makes things so much easier. Thanks!
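A toy 2D illustration of why one extra marker is enough (plain Python; this is a sketch of the idea, not Blender’s actual solver code):

```python
import math

# The origin marker plus ONE selected marker already define a line, so
# "set X axis" only needs that one extra point: aligning the solve is just
# the rotation that brings the origin->marker vector onto the chosen axis.

def angle_to_x_axis(marker_xy):
    """Rotation (radians) that maps the origin->marker vector onto +X."""
    x, y = marker_xy
    return -math.atan2(y, x)

def rotate(p, theta):
    x, y = p
    return (x * math.cos(theta) - y * math.sin(theta),
            x * math.sin(theta) + y * math.cos(theta))

marker = (3.0, 3.0)               # a solved marker; the origin is at (0, 0)
theta = angle_to_x_axis(marker)
aligned = rotate(marker, theta)   # lands on the +X axis: (~4.243, 0.0)
```

Picking a second marker for the axis would actually over-determine the problem, since the origin is already fixed.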

If it isn’t windy, it might be cheaper to use ping-pong balls, I think; if you’re going for balls, that is.

Ping Pong balls are several times more expensive than the hollow golf balls. Also, you don’t want the ball to move at all during the shoot or it could mess up your tracking. Even a slight breeze could move a Ping-Pong ball.

Steve S

Any tips for when you’ve reached the lowest solve error you can get? By changing the keyframe range I was able to get down to 2.0718, with refinement on for all the options. Then I tried some values for the K3 parameter, since refine doesn’t touch it, and it helped, but only by a tiny amount: down to 2.0712.

Would the way to get it below 1 be to keep adding markers in the sections that have the fewest? Or to add more Affine or Perspective markers, as opposed to Loc only?

True, but ping pong balls can turn into a lot more fun:

Wow, really? I assumed they were much cheaper, due to using less material (aren’t they just some sort of plastic balloon cooled off and sealed?)

Man, this thing really couldn’t be more frustrating. I’m not talking about the tracking; I was able to get a solve error of less than 1, so I’m happy with that. In Blender, a small cube I placed at the origin looks absolutely perfect when I scroll through the timeline: it doesn’t slide, doesn’t move at all, relative to the ground of course.

Then I export an FBX so I can work with it in Modo. It turns out, as I said in a post above, that unless I set the scale to 100, it doesn’t work. Scale 100 is the only way the virtual camera travels roughly the same distance I traveled when walking with the camera, about 24 meters.

The main problem is that there seems to be some other bug in the Blender FBX exporter, because the export looks terrible in Modo, and also in LightWave. Simply put, the cube at the origin appears out of place, and as I move the timeline cursor forward, it sort of reaches the origin after a lot of sliding, then keeps sliding until it’s not at the origin anymore. Like I said, in Blender the cube sticks to the origin perfectly. Some of you may be thinking it’s a frame offset, but no: there was a one-frame offset and I already corrected for it. If I don’t compensate, it looks even worse, and it still slides. Also, both Blender and Modo are at 24 fps (not 23.98, exactly 24), and the footage was originally exported from After Effects at 24 fps and read by Blender as 24 fps.

I also tried exporting Collada, which has given me better results than FBX for certain things. Even though it doesn’t compensate for Blender’s swapped Y and Z axes, I compensated for that in Modo by rotating -90° on the X axis. Still the same weird effect.

The fact that Lightwave also shows the same weird behavior makes me think that this is a problem in how Blender exports the file. Do you guys know anything about this, or should I maybe post in another section of the forum that deals with exports?

So things on export are moving 100 times less than they should? That sorta sounds like Blender exporting in centimeters while the other programs expect meters.

Could be, but even if that’s the case, exporting with a scale that’s off by a power of ten shouldn’t make a difference as long as the scale is right. The only way to get the same distances as in Blender is to export at a scale of 100: that way, the starting position shows as 24.51885 on Y in Blender and as -24.5189 m on Z in Modo. Otherwise, the decimal point gets moved around.

So say, for example, I go to frame 240. The coordinates in Blender are (-8.31714, 12.59197, 2.20023); in Modo they are (-8.3171, 2.2002, -12.592), so they are the same, since the FBX exporter already swapped Y and Z. Except for the minus sign on the Z axis, but I set up the Z axis so that the camera is in the right place at the start; otherwise, I would have to rotate the group locator that contains the camera by 180°.
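The position swap described above, as a quick sanity check: converting Blender’s Z-up coordinates to a Y-up program like Modo maps (x, y, z) to (x, z, -y), which matches the frame-240 numbers up to rounding (plain Python; the function name is mine):

```python
# Blender is Z-up; Modo and most other DCCs are Y-up. The FBX exporter's
# axis conversion for positions is (x, y, z) -> (x, z, -y).

def z_up_to_y_up(p):
    x, y, z = p
    return (x, z, -y)

blender_frame_240 = (-8.31714, 12.59197, 2.20023)
print(z_up_to_y_up(blender_frame_240))
# (-8.31714, 2.20023, -12.59197), matching Modo's values up to rounding
```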

OK, so frame 240 has the same coordinates. But making this comparison just now, I realized that the problem is in the rotation values. They are all screwed up, and it’s not just because of the Y/Z axis difference between Blender and other programs; they differ completely. For example, at frame 240 Blender has the rotation values 87.295°, 3.574°, -151.875°, while Modo, again at frame 240, shows 8.428°, -61.3185°, -5.7117°.

So now I’m even more lost than before. When exporting or importing between programs, I always came across scale issues, so I would have to compensate by applying a different scale on export or import. But the camera animation was always perfect. Here it’s completely messed up. Can anybody tell why?
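One likely explanation for the rotation mismatch: Euler angles can’t be compared component-wise across programs with different up axes and rotation orders, because the same physical orientation yields completely different angle triples. The underlying rotation matrices, though, are related by a simple change of basis. A minimal sketch of that math (plain Python, no external libs; the example rotation is mine, and this is an illustration, not Blender’s or Modo’s actual export code):

```python
import math

# M_yup = C @ M_zup @ C.T, where C is the basis change (x, y, z) -> (x, z, -y).

def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def transpose(m):
    return [[m[j][i] for j in range(3)] for i in range(3)]

def rot_z(deg):
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def rot_y(deg):
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

C = [[1, 0, 0], [0, 0, 1], [0, -1, 0]]   # Z-up -> Y-up basis change

# A 90-degree turn about Blender's up axis (Z)...
m_zup = rot_z(90)
# ...is the same motion as a 90-degree turn about the Y-up program's Y axis:
m_yup = mat_mul(mat_mul(C, m_zup), transpose(C))
```

Here Euler (0, 0, 90) in Blender corresponds to (0, 90, 0) in a Y-up program: same orientation, completely different numbers. If the cube still slides after accounting for this, the export really is producing wrong orientations, not just differently expressed ones.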

Edit: I realized I hadn’t attached the blend file, so here it is: AA203401_013.blend (1.22 MB)

Just ignore the Mocha files, the sequence is the same.

It’s probably a really long shot, but could you give me the set of values for a few other frames (the original and the exported rotation values)? I want to see if I can find any patterns.

Is there a way to export the whole set of values from the Blender camera animation? I know how to export them from Modo, but not from Blender. That would make it easier to find patterns.
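A small script run from Blender’s text editor can dump the camera’s per-frame values to CSV. A sketch using the standard bpy API (the CSV helper is plain Python and runs anywhere; the bpy loop only works inside Blender, and the function names are mine):

```python
import csv
import io
import math

def rows_to_csv(rows):
    """Format (frame, loc, rot_degrees) tuples as CSV text (pure Python)."""
    buf = io.StringIO()
    w = csv.writer(buf)
    w.writerow(["frame", "x", "y", "z", "rx", "ry", "rz"])
    for frame, loc, rot in rows:
        w.writerow([frame, *loc, *rot])
    return buf.getvalue()

def dump_camera(filepath):
    """Run inside Blender: write the active camera's animation to CSV."""
    import bpy  # only available inside Blender
    scene = bpy.context.scene
    cam = scene.camera
    rows = []
    for f in range(scene.frame_start, scene.frame_end + 1):
        scene.frame_set(f)  # evaluate the animation at this frame
        loc = tuple(cam.matrix_world.to_translation())
        rot = tuple(math.degrees(a) for a in cam.matrix_world.to_euler('XYZ'))
        rows.append((f, loc, rot))
    with open(filepath, "w", newline="") as fh:
        fh.write(rows_to_csv(rows))
```

Using `matrix_world` rather than the camera object’s raw `rotation_euler` also bakes in any parent or constraint transforms, so the dumped values should match what the viewport shows.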

I found that by exporting to .py I can open the file and see all the position and rotation values, along with a bunch of other stuff. I’ll try to clean that up and put it in a text file, and upload both that and the text file with all the Modo info to my Mediafire account so you can take a look. It’s going to take a while, so I’ll post back when it’s done.