Camera Tracking Multiple Shots Into A Single 3D Scene

I’m currently working on a project involving 12 different green screen shots with a single actor in each. My camera solves are very accurate (the green screen stage had a great tracking-marker setup).

I’ve gotten a single one of these shots into the 3D environment with a solid composite setup. As I move ahead to the other shots, I’m curious what the best approach would be for linking multiple camera tracks into the same Blender scene. I imagine this involves linking scenes, but I wanted to get some feedback from the community on how to streamline the process. Ideally, I would not have to build a separate compositing setup for each shot, as the lighting is uniform across all of the shots I’m tracking.

The scene involves an individual standing in the middle of a “digital environment”, with floating HUD/UI elements surrounding them in a sphere. The talent is always at the origin of the scene and interacts with the pieces as they float around them. Again, setting up the camera track is very manageable, but as I continue to bring more shots into the environment I’m trying to stay ahead on organization and avoid duplicate work in the compositor.

There are a few images attached for reference. If anyone needs more info, I’d be happy to provide. Thanks in advance, this community rocks.

I think you’re on the right track haha! I’d say if you have your compositing and scene objects all set up, using either ‘Link Objects’ or ‘Link Object Data’ when you create a new scene for each shot should work for you. Then you should be able to just set up a new camera within each scene to receive the new tracking data. Looks interesting! Keep us posted!

https://www.blender.org/manual/data_system/scenes/introduction.html#controls
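If you end up scripting that setup instead of clicking through it twelve times, here’s a rough bpy sketch of the same idea. It assumes the 2.8+ collection API, a finished scene named "Scene", and placeholder shot names (shot_01 … shot_12), so adjust to your file:

```python
import bpy

source = bpy.data.scenes["Scene"]  # the scene with your finished environment
shot_names = ["shot_%02d" % i for i in range(1, 13)]  # placeholder names

for name in shot_names:
    shot = bpy.data.scenes.new(name)

    # Link (not copy) the environment objects so every scene shares the
    # same HUD/UI pieces, lights, and world: edit once, it updates everywhere.
    # (On 2.7x, use shot.objects.link(obj) instead of the collection call.)
    for obj in source.objects:
        if obj.type != 'CAMERA':
            shot.collection.objects.link(obj)
    shot.world = source.world

    # Each shot scene gets its own camera to receive its own solve.
    cam_data = bpy.data.cameras.new(name + "_cam")
    cam_obj = bpy.data.objects.new(name + "_cam", cam_data)
    shot.collection.objects.link(cam_obj)
    shot.camera = cam_obj
```

On the compositor side, wrapping the shared keying/grading nodes into a node group should cover the “don’t rebuild it per shot” part: node groups are shared data-blocks, so each scene’s tree only needs its own footage input wired into that same group.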