follow camera zoom with scale of a scene

Hey there,

I am trying to scale a scene onto a zooming plane. I set keyframes at the beginning and the end of the zoom so that the scene matches the size of the plane at these points. I also set all curves to linear, including the curve of the zoom. But the zoom of the plane and the scaled scene don't match. Any ideas how I can match both?

Here is a video which should illustrate what i mean:

At the beginning and at the end the two fit, but not in the middle. I don't really want to match each frame by hand.
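A quick sanity check (plain Python, made-up numbers, not the actual scene values) of why it only fits at the endpoints: apparent size follows 1/distance, so with a linear camera move, a linearly interpolated scale keyframe agrees at the start and end but drifts in between.

    # Illustration only: linear scale keyframes vs. perspective scaling.

    def apparent_scale(start_scale, start_distance, distance):
        # perspective: apparent size is inversely proportional to distance
        return start_scale * start_distance / distance

    def linear_scale(start_scale, end_scale, t):
        # what a linear scale keyframe produces at parameter t in [0, 1]
        return start_scale + t * (end_scale - start_scale)

    start_distance, end_distance = 15.0, 5.0
    start_scale = 0.2
    end_scale = apparent_scale(start_scale, start_distance, end_distance)  # 0.6

    # at the midpoint of a linear camera move the distance is 10.0
    mid_distance = (start_distance + end_distance) / 2
    correct_mid = apparent_scale(start_scale, start_distance, mid_distance)  # 0.3
    linear_mid = linear_scale(start_scale, end_scale, 0.5)                   # 0.4

    print(correct_mid, linear_mid)  # they disagree, hence the drift mid-zoom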

I wrote a little Python script which measures the distance from the plane to the camera and then calculates the factor by which the scale gets multiplied to get the correct scale. This value gets written into a Value node so it can be used in the node tree. The only problem is that the scaled scene jitters a little bit. I think this might be caused by rounding.

I am sure there is a much simpler way than what I did.

import bpy

def main():
    # camera and plane objects
    camera = bpy.data.objects["Camera"]
    plane = bpy.data.objects["Plane"]

    # distance and scale at the first frame of the zoom
    start_distance = 14.98587
    start_scale = 0.221

    # current camera-to-plane distance
    distance = (camera.location - plane.location).length

    # apparent size is inversely proportional to the distance
    calc_scale = start_scale * start_distance / distance

    print(calc_scale)
    bpy.data.scenes['main'].node_tree.nodes['Value'].outputs[0].default_value = calc_scale

def call(scene):
    print("script executed!")
    main()

# run main() before every frame change
bpy.app.handlers.frame_change_pre.append(call)
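If the jitter really is a rounding/precision issue, one cheap workaround (just a sketch in plain Python, not tied to the blend file; the alpha value is a guess) would be to smooth the computed scale over frames with an exponential moving average before writing it into the Value node:

    # Sketch: exponential moving average to damp per-frame jitter
    # in the computed scale values (illustrative only).

    def smooth(values, alpha=0.5):
        # alpha closer to 1.0 follows the raw value more tightly
        out = []
        prev = None
        for v in values:
            prev = v if prev is None else alpha * v + (1 - alpha) * prev
            out.append(prev)
        return out

    raw = [0.300, 0.305, 0.299, 0.304, 0.301]  # jittery per-frame scales
    print(smooth(raw))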

Here is the blend file:
comp_zoom_test_001.blend (529 KB)

Here is a video of the result:

Why not do this in 3D space? Are you just using the compositor?

Another fast way would be to just use UV-unwrapped planes in the 3D view on separate layers, then use the UV distort node to map your images onto each layer. That way you can fly the camera any way you like and the results should stay perfectly locked without too much render overhead (I think).

BTW, the first video seems to have an offset at the finishing point. Are you sure the dope sheet doesn't show that the keyframes are a bit off?

Hey,

Yes, before using the tracking method I could use the rendered scene as a video texture, for example. But I'd like to render everything in one render workflow.
I can't find a UV distort node. Where is it located?
In the first video there might be an offset. I can't check it because I overwrote the file…

Add Node > Distort > Map UV

Found it. Where can I get the UV vector from? Which node?

Ah, got it. I just had to switch it on in the layer dialog. Works great! I thought there must be such a feature but couldn't find it, so thanks a lot for pointing me to it! The only strange thing: there seems to be a small border around the face, like there is a wireframe on top of the texture.

Yes, it has an aliasing error at the edge. Try subdividing the plane first, then unwrap it.

The subdividing didn't change much for use with a plane; maybe it helps with more complex wraps. I resized the mapped image slightly with a Scale node. I also tried giving the object behind my scene some alpha. Both worked for me… thanks!