How to combine a rendered 360 image from a 3D model with a real 360 image taken from a drone?

Hello!
I have a task: overlaying a 360° panoramic render of a 3D model of a construction site onto a real 360° photo taken by a drone at that site. To do this, the camera in Blender that the render is made with must be placed exactly where the drone was when it took the panoramic photo.
The problem is choosing the camera position to render from. I’m not very familiar with Blender, and my current workflow looks like this:

  • pick an approximate camera position in Blender;
  • render a PNG with transparency;
  • overlay the render on the drone panorama in Photoshop;
  • estimate roughly where the camera in Blender needs to move for a better match;
  • move the camera in Blender and repeat (the render step itself can be scripted, see the sketch after this list).
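A minimal sketch of scripting that render step so that only the coordinates change between attempts (it assumes the camera object is named "Camera" and is already set to Panoramic/Equirectangular; the coordinates and output path are placeholders):

```python
import bpy

scene = bpy.context.scene
cam = bpy.data.objects["Camera"]  # assumed object name

# Transparent background so the PNG contains only the model
scene.render.film_transparent = True
scene.render.image_settings.file_format = 'PNG'
scene.render.image_settings.color_mode = 'RGBA'

def render_at(x, y, z, path):
    """Move the camera to (x, y, z) and render one overlay candidate."""
    cam.location = (x, y, z)
    scene.render.filepath = path
    bpy.ops.render.render(write_still=True)

# Placeholder candidate position and output path (relative to the .blend file)
render_at(12.0, -5.0, 30.0, "//overlay_test.png")
```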

It takes me up to 10 iterations, and the process is very slow.
I believe there is a more correct way to solve this problem.
If so, could someone describe it in sufficient detail?
Thank you.

If you can map the drone footage onto the inside of a sphere object in Blender, I guess you could move the camera around until it looks right. I’m not sure whether camera matching works with 360° scenes.
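If you want to try that sphere approach, here is a rough script sketch (the image path is a placeholder, and note that a UV sphere’s default UV map is only approximately equirectangular, so the world Environment Texture route described later in this thread is usually cleaner):

```python
import bpy

# Large UV sphere surrounding the scene
bpy.ops.mesh.primitive_uv_sphere_add(radius=100.0, segments=64, ring_count=32)
sphere = bpy.context.object

# Flip the normals so the texture is visible from inside the sphere
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.mesh.flip_normals()
bpy.ops.object.mode_set(mode='OBJECT')

# Unlit emission material showing the drone panorama
mat = bpy.data.materials.new("DronePanorama")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links
nodes.clear()
tex = nodes.new("ShaderNodeTexImage")
tex.image = bpy.data.images.load("/path/to/drone_panorama.jpg")  # placeholder path
emit = nodes.new("ShaderNodeEmission")
out = nodes.new("ShaderNodeOutputMaterial")
links.new(tex.outputs["Color"], emit.inputs["Color"])
links.new(emit.outputs["Emission"], out.inputs["Surface"])
sphere.data.materials.append(mat)
```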

Could you explain in more detail? I’m new to Blender.

See this YouTube video; it’s basically what they do in the second half of the video.

If you use 360° images, try the add-on shown in the video below.

If you need motion tracking of 360° video, you will need to find another method.

Let me remind you that I’m trying to match the camera position in Blender to the real point from which the drone panorama was shot.

What I was able to do:

  • set Properties → World → Surface → Color to an Environment Texture (Linear, Equirectangular, Single Image);
  • loaded the drone panorama as that texture;
  • set the camera in Blender to Panoramic;
  • switched to camera view (a script version of this setup is sketched below).
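For reference, the same setup can be done from a script; a minimal sketch (the image path is a placeholder, and the camera’s panorama type is easiest to leave set to Equirectangular in the camera properties, since the property name for it differs between Blender versions):

```python
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'

# World: equirectangular Environment Texture feeding the Background shader
world = scene.world
world.use_nodes = True
nodes, links = world.node_tree.nodes, world.node_tree.links
env = nodes.new("ShaderNodeTexEnvironment")
env.image = bpy.data.images.load("/path/to/drone_panorama.jpg")  # placeholder path
env.projection = 'EQUIRECTANGULAR'
links.new(env.outputs["Color"], nodes["Background"].inputs["Color"])

# Camera: panoramic, so the render uses the same projection as the photo
bpy.data.objects["Camera"].data.type = 'PANO'  # assumes the camera is named "Camera"
```

To rotate the panorama, you can also insert Texture Coordinate → Mapping nodes in front of the Environment Texture instead of eyeballing the world rotation.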

Now I try to adjust the camera position manually by changing the rotation of the world texture and the camera’s X, Y, Z coordinates. It is very slow and inaccurate.
Maybe there is some way to position the camera by triangulation, for example, or something else?
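One way to do that triangulation without an add-on: pick a few points (ideally four or more, well spread out) whose 3D coordinates are known in the model and note the pixel where each appears in the panorama. Each pixel of an equirectangular image corresponds to a viewing direction, and the camera position is then the least-squares intersection of the rays "camera → point". A rough sketch, assuming the panorama is level and its heading relative to the model is known or can be guessed and refined; the numbers at the bottom are placeholders:

```python
import numpy as np

def bearing_from_pixel(u, v, width, height, heading=0.0):
    """Viewing direction (unit vector) for a pixel of an equirectangular image.

    Assumes the panorama is level (no roll/pitch). 'heading' is the yaw of the
    image's centre column relative to world +Y, in radians -- an assumption you
    have to supply or refine by trial.
    """
    lon = (u / width - 0.5) * 2.0 * np.pi + heading  # left/right angle
    lat = (0.5 - v / height) * np.pi                 # up/down angle
    return np.array([np.cos(lat) * np.sin(lon),
                     np.cos(lat) * np.cos(lon),
                     np.sin(lat)])

def camera_position(points_3d, bearings):
    """Least-squares intersection of the rays camera -> point.

    Every known point P_i should lie on the ray from the unknown camera
    position C along bearing b_i, so minimise the summed squared distance of
    the P_i from those rays:  C = (sum A_i)^-1 * sum A_i P_i,  A_i = I - b_i b_i^T.
    """
    A_sum = np.zeros((3, 3))
    rhs = np.zeros(3)
    for P, b in zip(points_3d, bearings):
        A = np.eye(3) - np.outer(b, b)
        A_sum += A
        rhs += A @ P
    return np.linalg.solve(A_sum, rhs)

# Placeholder data: model points (Blender world coordinates, metres) and the
# pixels where they appear in an 8192 x 4096 panorama.
points = [np.array([10.0, 40.0, 3.0]),
          np.array([-25.0, 15.0, 12.0]),
          np.array([5.0, -30.0, 0.0]),
          np.array([60.0, -10.0, 8.0])]
pixels = [(5123, 1980), (1490, 1870), (6900, 2300), (4300, 2050)]
bearings = [bearing_from_pixel(u, v, 8192, 4096) for u, v in pixels]
print("estimated camera position:", camera_position(points, bearings))
```

If the heading is unknown, you can run this over a sweep of heading values and keep the one with the smallest residual (summed distance of the points from their rays).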

It depends on what your drone image looks like. You should be able to locate the X/Y position on the ground relatively easily; then you just need to compensate for elevation.

Are there any landmarks in the drone image you can approximate the height from? Do you have any data from the drone itself?
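For the elevation part, one landmark of known height is enough in principle: read off how far below the horizon it appears in the (levelled) panorama and how far away it is horizontally in the model, and simple trigonometry gives the camera’s height above it. A toy example with made-up numbers:

```python
import math

# Placeholder values: a building corner with a known model position is
# 80 m away horizontally and appears 5 degrees below the panorama's horizon.
horizontal_distance_m = 80.0
angle_below_horizon_deg = 5.0

height_above_corner_m = horizontal_distance_m * math.tan(math.radians(angle_below_horizon_deg))
print(f"camera is about {height_above_corner_m:.1f} m above that corner")  # ~7.0 m
```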

At the moment it is very difficult to pick the camera coordinates manually from the panoramic photos. There is snow everywhere, and the buildings that appear in both the model and the panoramic photo are far from the shooting point, which makes it hard to align against them with sufficient accuracy.
So I thought there might be, for example, some kind of add-on for triangulating the camera position from several corresponding points on the model and in the panoramic photo.

Unfortunately, GPS does not work in this area, so I cannot use the coordinates from the drone photo to place the camera.

I’ve been looking into this, but there’s no obvious way to do it in Blender. :thinking:

I don’t know whether this add-on can do it…
Ask the seller.

The video below shows different software.
