You can see a video tracked and constrained onto a circular line, the horizon, projected in a 3D scene. Is it possible to do something similar in Blender? (I have no experience with C4D, which is used in the video.)
Also, do you think it is possible to integrate an image sequence, with the distortion and data inherited from tracking, into a 3D scene as an object or plane?
Or maybe have a camera project an image sequence into the 3D scene using the tracking data?
I did my best to make things clear; if it still isn't, tell me and I'll try to explain better.
Cheers!
The cameras were solved with the tripod method, then projected onto a sphere with an alpha, and manually adjusted to match the horizon line. I've been doing tracking jobs for a while and must say this is a tricky one; I would avoid it unless you're very familiar with matchmove, photogrammetry/LiDAR and technical data.
I learned projections from this tutorial, but that was with a simple photo. You will basically do the same thing from a solved camera, projecting onto a sphere, but using a movie clip instead, for each clip you have.
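If you prefer scripting it, here is a minimal sketch of enabling the tripod solve on a movie clip through Blender's Python API. The clip name is a placeholder; you still track markers and run the solve as usual:

```python
import bpy

# Placeholder clip name; load your footage in the Movie Clip Editor first
clip = bpy.data.movieclips["shot_01.mp4"]

# Tripod solver = rotation-only (nodal pan) solve, no parallax required
clip.tracking.settings.use_tripod_solver = True

# Track your markers, then run Solve Camera Motion in the Movie Clip Editor;
# bpy.ops.clip.solve_camera() needs a clip-editor context to run from a script
```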
I am not quite sure we are talking about the same thing here… when you mention the tripod method, are you referring to what is shown in the video I quoted?
Let me put it differently: do you know a way to have a tracked/stabilized image sequence inside Blender's viewport (not just in the camera view)?
Those 3 cameras were tracked and solved as a nodal pan, aligned to a single point; in Blender this is the tripod method. After that, the footage from each camera was projected onto a sphere.
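Here is a minimal sketch of how such a projection could be set up through Blender's Python API, assuming a sphere and one solved camera with the placeholder names below. It uses a UV Project modifier driven by the camera, plus an auto-refreshing image-sequence texture:

```python
import bpy

sphere = bpy.data.objects["ProjectionSphere"]  # placeholder sphere object
camera = bpy.data.objects["Camera_A"]          # one of the solved cameras

# Separate UV map that the camera projection will overwrite every frame
sphere.data.uv_layers.new(name="ProjectedUV")

# UV Project modifier: reprojects the sphere's UVs from the camera's view
mod = sphere.modifiers.new(name="ProjectFromCamera_A", type='UV_PROJECT')
mod.uv_layer = "ProjectedUV"
mod.projectors[0].object = camera
mod.aspect_x = 16.0   # match your footage's aspect ratio
mod.aspect_y = 9.0

# Material with the footage as an auto-refreshing image sequence
mat = bpy.data.materials.new("Footage_A")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links

tex = nodes.new('ShaderNodeTexImage')
tex.image = bpy.data.images.load("/path/to/footage_A_0001.png")  # placeholder path
tex.image.source = 'SEQUENCE'
tex.image_user.frame_duration = 250   # length of your clip in frames
tex.image_user.use_auto_refresh = True

uv = nodes.new('ShaderNodeUVMap')
uv.uv_map = "ProjectedUV"
links.new(uv.outputs['UV'], tex.inputs['Vector'])
links.new(tex.outputs['Color'], nodes["Principled BSDF"].inputs['Base Color'])

sphere.data.materials.append(mat)
```

You would repeat this per camera/clip; in the video, each of the three clips gets its own solved camera and its own projection.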
That image is a sequence of the UV unwrap of that sphere, with all the projected footage combined.
I don't know exactly what they did, and I believe there isn't a simple way to do it. But I'm positive their method is very similar to that: track, project and unwrap, and your footage will be stabilized.
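One way to actually extract that stabilized sequence (a sketch of the general idea, not necessarily their exact method) is to bake the projected footage into the sphere's unwrapped UV layout once per frame with Cycles, saving each result. The image name and output path are placeholders, and the bake target image must be assigned to the active Image Texture node in the sphere's material:

```python
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'   # object baking requires Cycles

sphere = bpy.data.objects["ProjectionSphere"]  # placeholder, as above
bake_img = bpy.data.images["BakeTarget"]       # blank image the bake writes into

bpy.context.view_layer.objects.active = sphere
sphere.select_set(True)

for f in range(scene.frame_start, scene.frame_end + 1):
    scene.frame_set(f)   # advances the sequence texture and the UV projection
    bpy.ops.object.bake(type='DIFFUSE', pass_filter={'COLOR'})
    bake_img.filepath_raw = f"/tmp/stabilized_{f:04d}.png"
    bake_img.file_format = 'PNG'
    bake_img.save()
```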