The user selects which plane track to use (for this they
need to select the movie clip datablock, object, and track
names).
The node gets image and mask inputs (both are optional).
The node outputs:
- The input image warped into the plane.
- The input mask warped by the plane.
- The plane rasterized to a mask.
Warping the image is done by computing reverse bilinear
coordinates and fetching the pixel from the corresponding
position.
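The reverse bilinear step above can be sketched as follows. This is a minimal illustration, not Blender's actual implementation: given a point inside the tracked quad, it recovers the (u, v) coordinates that bilinearly interpolate the four corners to that point, which is then where the input image gets sampled. The corner ordering and the quadratic-solve approach are assumptions.

```python
import math

def cross2d(ax, ay, bx, by):
    # 2D cross product (z-component of the 3D cross product).
    return ax * by - ay * bx

def inverse_bilinear(px, py, a, b, c, d):
    """Given point p inside quad a-b-c-d (a at uv=(0,0), b at (1,0),
    c at (1,1), d at (0,1)), recover the (u, v) that bilinearly
    interpolates the corners to p.  Returns None if p is outside."""
    ex, ey = b[0] - a[0], b[1] - a[1]
    fx, fy = d[0] - a[0], d[1] - a[1]
    gx, gy = a[0] - b[0] + c[0] - d[0], a[1] - b[1] + c[1] - d[1]
    hx, hy = px - a[0], py - a[1]

    # p = a + u*e + v*f + u*v*g, which reduces to a quadratic in v.
    k2 = cross2d(gx, gy, fx, fy)
    k1 = cross2d(ex, ey, fx, fy) + cross2d(hx, hy, gx, gy)
    k0 = cross2d(hx, hy, ex, ey)

    if abs(k2) < 1e-9:
        # Quad is a parallelogram: the equation is linear.
        v = -k0 / k1
    else:
        disc = k1 * k1 - 4.0 * k0 * k2
        if disc < 0.0:
            return None
        v = (-k1 - math.sqrt(disc)) / (2.0 * k2)
        if not 0.0 <= v <= 1.0:
            v = (-k1 + math.sqrt(disc)) / (2.0 * k2)
    u = (hx - fx * v) / (ex + gx * v)
    if 0.0 <= u <= 1.0 and 0.0 <= v <= 1.0:
        return (u, v)
    return None
```

For example, with corners (0,0), (2,0), (3,2), (0,1), the quad's center point (1.25, 0.75) maps back to (u, v) = (0.5, 0.5).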
This requires some tricks with downsampling to make the warped
image look smooth.
Currently the compositor doesn’t support downsampling, so we
needed to implement our own operation for this.
Currently the idea is dead simple: the value of an output pixel
equals the average of a neighborhood around the corresponding
pixel in the input image. The size of the neighborhood is defined
by the ratio between the input and output resolutions.
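The averaging described above can be sketched like this (a minimal box-filter downsampler over a plain 2D list of floats; the function name and the exact footprint rounding are assumptions, not Blender’s code):

```python
def downsample_box(src, dst_w, dst_h):
    """Downsample a 2D grid of floats: each output pixel is the
    average of the input pixels in its footprint, whose size is the
    ratio between input and output resolutions."""
    src_h, src_w = len(src), len(src[0])
    ratio_x = src_w / dst_w
    ratio_y = src_h / dst_h
    out = []
    for y in range(dst_h):
        row = []
        for x in range(dst_w):
            # Input-space footprint of this output pixel (at least 1x1).
            x0 = int(x * ratio_x)
            x1 = min(max(int((x + 1) * ratio_x), x0 + 1), src_w)
            y0 = int(y * ratio_y)
            y1 = min(max(int((y + 1) * ratio_y), y0 + 1), src_h)
            total = sum(src[sy][sx]
                        for sy in range(y0, y1)
                        for sx in range(x0, x1))
            row.append(total / ((y1 - y0) * (x1 - x0)))
        out.append(row)
    return out
```

For example, downsampling a 4x4 gradient to 2x2 averages each 2x2 block, which is exactly why the scheme only works when the output is smaller than the input.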
This operation doesn’t give perfect results and works only
for downsampling. But it’s a purely internal operation not
exposed in the interface, so it’s easy to replace with a
smarter bi-directional sampler with nicer filtering.
Limitations:
- The node currently only warps the image and outputs the mask
created from the plane; warping the input mask is not implemented yet.
- Image warping doesn’t report a proper dependent area of
interest yet, meaning interactivity might not be great.
- There’s no anti-aliasing applied to the edges of the warped
image and plane mask, so they look really sharp at the moment.
This promises something awesome! I wonder if it can resolve some scaling issues, as the UV scaler is a bit weak for distortion in nodes.
To clarify, Blender already does planar tracking; that’s the perspective track type. Sadly you cannot use the tracked region, just the centre. However, by linking a series of four track points together you can drive a plane or UV-type distortion.
Wow, that’s cool. I like the mask parenting. I guess you could construct simple 2 1/2 D geometry like this in the compositor? With multiple planes, that would save performing a complete solve for 3D space.
So will this tracking feature replace the sometimes confusing UV_Project modifier, or will it work as a companion? Here is another question: will there ever be any documentation on when to use the LocScale, LocScaleRot, Affine, and Perspective trackers? I always use the Loc tracker, but the LocScale or LocScaleRot trackers always seem to lose track on simple scenes with scaling and rotation.
It’s a great addition, but I didn’t expect it to work like this. Why not use one perspective track and use it to define a plane? Then you could use a parented empty (that follows rotation and scale), or use a node with the rotation and scale information of the tracker.
The way it is implemented in Mocha, you can track almost anything planar (or almost planar, like sunglasses) without having to use four markers (sometimes there are no features to track in that plane).
But, anyway, this is a great addition. Congratulations!
Sorry to double post, but I wanted to show a use case to explain the difference between the implementations. Have you seen the photo frame tracking effect, like this one:
You can’t easily track four points in the frame, as the background is changing. With a proper planar tracker, you could track the whole frame, masking out the hole (this actually exists in Blender, but isn’t much used), and use that information to deform an image in the compositor.
Don’t think of this as working like Mocha, think of it as corner pinning.
Blender does indeed have the ability to track a plane like Mocha does. Try creating one single track point, setting it to Affine, and dragging the corners to cover a small area. It loses the track a lot, but if you work with it, it does actually track a planar surface. There’s just no way to use that data in the same way Mocha uses it. Planar tracking in Blender is only to help the actual individual track point stick better.
So yeah, just think of this as corner pinning, not planar tracking. Having awesome corner pinning without going through the 3d view is still awesome.
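A corner pin of the kind described here just maps the source image’s unit square onto four tracked corner positions. A minimal sketch, using bilinear interpolation (note this is an illustration only: a true perspective pin would use a homography, and the corner ordering is an assumption):

```python
def corner_pin(u, v, corners):
    """Map normalized source coordinates (u, v) in [0, 1]^2 onto a
    quad given by four corners ordered bottom-left, bottom-right,
    top-right, top-left."""
    (ax, ay), (bx, by), (cx, cy), (dx, dy) = corners
    # Standard bilinear blend of the four corner positions.
    x = (1 - u) * (1 - v) * ax + u * (1 - v) * bx + u * v * cx + (1 - u) * v * dx
    y = (1 - u) * (1 - v) * ay + u * (1 - v) * by + u * v * cy + (1 - u) * v * dy
    return (x, y)
```

With the corners at the unit square this is the identity; drag the corners to the tracked positions and every source pixel lands on the pinned quad.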
After you have finished with the mask tracking, go to the compositor and pipe the mask(s) into your processing nodes and watch the masks act as a traveling matte.