So I can track footage in Blender now, great, but how the heck do I use that data and apply it to a camera in a scene???

I guess that’s the part that’s on the ‘to do’ list.

This is still very very early in development, so don’t expect it to work now.

The thing you need, Mr Black, to get it to work is patience.

Ok, so let me get this straight: it's completely useless for right now, right? I don't think so; in the commit entries there are globs of stuff but no explanation of how to use it. By the way, there's no need for the uptight rudeness just because y'all don't know how it works either. And since when did I ever let on a feeling of "this is supposed to be the final build"? Seriously, I asked one simple question nicely and the hounds are released on me to rip me to shreds. Worst online community ever.

Seriously, like I said before, I think it’s great that you can do camera tracking in blender, but how the heck do you apply that tracked data to a camera? The commit entry list shows a lot of things that would imply that you can do this. Anyone at all know how this is achieved?

That's right, Jackblack, it is an in-progress commit… that's why it is not in the main branch at the moment. It's in the tomato branch; the tomato branch was specifically set up for new features for Google Summer of Code.

Be patient and you will get it eventually :slight_smile:

From what I’ve seen, it’s going to be great when it’s finally done. If you’re really anxious to get some tracking done, there are open source external alternatives anyway.

I would wait till the end of GSoC to have even partially useful tools - basically, figure the end of the summer.

Yo relax, nobody is tearing you down, even if your post came off as a bit demanding.

What commit entries are you referring to? What I have seen is this:

Added the very basic implementation of 2D tracking. It should be treated as a draft of the tracking architecture, which will probably be cleaned up, changed, or whatever else. The current implementation was supposed to demonstrate that our structures and understanding are correct for interacting with libmv easily.

What you are asking for, and what we all want (apply the tracking data to a camera) is like the final result of this GSoC, and this project has only just started. That’s all we are saying. I follow this branch and if I find out something new I’ll definitely pass it along (others will no doubt beat me to it).

Bark…Bark… (pause) Bark…Bark…

A bit of moderation:


I can understand that you want to express your frustration, but there is really no need to be rude (see your first post) and then repeat it again. "How the heck" is not the right phrasing for a constructive discussion. I think you would get more help if you asked kindly.


you might want to rephrase your post or delete it via Edit Post. I’m pretty sure you can express yourself in a more intelligent, respectful and valuable way :wink: (even if you might disagree).



you might want to get a sense of humour (even if you disagree).
Irony is also regarded as a higher form of intelligence.
So I won't be deleting my post, as I think it's quite CLEVER, it RESPECTS the OP's sense of humour (even if it's completely wasted on some people), and its VALUE is determined by the intelligence, or lack of intelligence, of the reader. So even if you don't "get it", some people will.


when you said “the hounds are released on me to rip me to shreds.” I thought you were kidding.

@sx-1 I was sorta kidding.

@Monster Where I come from, saying things like "how the heck" is not rude (I'm from the southern U.S.); heck is a very common word here, just as some of the things people in your country say may seem rude to me.

Blah blah blah… question is answered. just close this.

What?
Another fascist.
Let's have free software,
but at the same time let's abnegate free speech,
just because we don't find it agreeable?
You know, it's extremely offensive and hypocritical to talk of moderation
and then to viciously attack someone's intelligence, ethics, and integrity.
Perhaps if I told you to delete your post, or maybe not to post at all, or not even to have an opinion, you might not be so quick to want to close a thread.
Perhaps you think there isn't room on this thread for an apology?

Blah blah blah… The Third Reich is gone, mate; the time of jumped-up little dictators is over.

Bear in mind that this is a 2D track… It’s most useful in motion graphics and compositing tasks. It wouldn’t really make any sense to try to apply any of this tracking data to Blender’s 3D camera. That’s well down the road.

@Fweeb, in the commit entries list it says this:
“- Display bundles in 3D view as spheres. Selection is synchronized to markers associated with this bundle.
Bundle can’t be selected in 3D view.”
That would lead me to believe that it includes some sort of 3D implementation. Also, couldn't there be a way to fake a Z field by calculating the space between two tracking points and their separation from each other? I guess what I'm saying goes like this: if point 1 is 50 pixels away from point 2, and the space between points 1 and 2 grows to 75 pixels as the film progresses, then a Z-field factor can be calculated as moving "forward" in Z space. Does that make sense, and would it be effective?
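Just to make the idea concrete, here's a minimal Python sketch of that scale-based depth cue. All of the names here are hypothetical; this is not part of Blender's tracker, just a way to play with the arithmetic:

```python
# Naive "Z factor" from the change in separation between two 2D
# tracker points.  Purely illustrative; not Blender API code.

import math

def point_distance(p1, p2):
    """Euclidean distance between two 2D tracker points (in pixels)."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

def z_factor(p1_start, p2_start, p1_end, p2_end):
    """Ratio of point separation at the end of the shot to the
    separation at the start.  > 1.0 suggests motion toward the points
    (or a zoom-in); < 1.0 suggests motion away."""
    d_start = point_distance(p1_start, p2_start)
    d_end = point_distance(p1_end, p2_end)
    return d_end / d_start

# The example from the post: 50 px apart at the start, 75 px at the end.
print(z_factor((0, 0), (50, 0), (0, 0), (75, 0)))  # 1.5
```

Of course, as the replies below note, a growing separation alone can't tell a dolly-forward apart from a zoom, which is part of why a real solver needs more than two points.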

No, because that could also happen if the camera rotates.

I don't think so. If the camera rotates, then the points stay in sync, but if they spread apart, that could be interpreted as movement toward or away from the camera. Just like when you zoom in: the image gets larger and any tracking points on the image head toward the edges. This doesn't occur in footage where the camera rotates; in that case the points move in synchronicity.
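A toy pinhole-camera experiment makes it easy to poke at this. Everything below is a hypothetical illustration (not Blender code): dollying the camera forward scales the projected separation between two points, while rolling the camera about its optical axis rotates the points together and leaves their separation exactly unchanged. (Pans and tilts are messier: for wide lenses they do change separations somewhat, which is where the ambiguity comes from.)

```python
# Toy pinhole projection: forward translation vs. roll rotation.
# Hypothetical illustration only, not Blender's tracker.

import math

FOCAL = 100.0  # focal length in pixels (arbitrary)

def project(x, y, z):
    """Project a 3D point with a pinhole camera at the origin, +Z forward."""
    return (FOCAL * x / z, FOCAL * y / z)

def separation(p, q):
    return math.hypot(q[0] - p[0], q[1] - p[1])

a = (1.0, 0.0, 10.0)
b = (-1.0, 0.0, 10.0)
d_before = separation(project(*a), project(*b))

# Dolly forward 5 units: both points get closer, separation grows.
a_fwd = (a[0], a[1], a[2] - 5.0)
b_fwd = (b[0], b[1], b[2] - 5.0)
d_after = separation(project(*a_fwd), project(*b_fwd))
print(d_after / d_before)  # 2.0 -- halving the distance doubles the separation

# Roll 30 degrees about the optical axis: points rotate together in the
# image, but the separation between them stays exactly the same.
t = math.radians(30.0)
a_roll = (a[0] * math.cos(t) - a[1] * math.sin(t),
          a[0] * math.sin(t) + a[1] * math.cos(t), a[2])
b_roll = (b[0] * math.cos(t) - b[1] * math.sin(t),
          b[0] * math.sin(t) + b[1] * math.cos(t), b[2])
d_roll = separation(project(*a_roll), project(*b_roll))
print(abs(d_roll - d_before) < 1e-9)  # True -- roll preserves separation
```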