Hello,
I’m looking for a way to make a camera in Blender match a real-world camera, and I’m hoping someone can point me in the right direction. I’ve found some features built into Blender that look promising, but nothing that matches my exact use case. Here’s the situation…
I have a real-world scene that I’ve taken a picture of. To make this example more concrete, let’s say it is a basketball court. I also have a Blender model of this court, and the physical dimensions of the model very accurately match the actual court: the length and width of the court, the positions of the baskets, and the markings on the floor (free-throw lines, three-point lines, etc.) are all accurately modeled in Blender.
Is there a way to calibrate / optimize my Blender camera so that it best matches the real-world camera that took the picture? In general, I think what I want is to click N points on the image and associate a known XYZ position with each point I click. For example, I would click a corner of the basketball court in the image and associate it with that corner’s XYZ position from the Blender model. It seems like, once I had enough points spread throughout the volume of the scene, it would be possible to “solve” for the camera’s position, orientation, focal length, etc.
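To make the idea concrete, here is a minimal sketch of the kind of solve I have in mind — the Direct Linear Transform for camera resectioning. Everything here is illustrative: the function names are mine, and the camera and point picks are a made-up synthetic example using FIBA court dimensions (28 m × 15 m, rim at 3.05 m), not data from my actual photo.

```python
import numpy as np

def solve_projection_dlt(points_3d, points_2d):
    """Estimate the 3x4 camera projection matrix P from at least six
    2D <-> 3D point correspondences (Direct Linear Transform)."""
    A = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    # Least-squares solution of A p = 0 with ||p|| = 1: the right
    # singular vector belonging to the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)

def project(P, point_3d):
    """Project a world-space point through P into pixel coordinates."""
    x = P @ np.append(point_3d, 1.0)
    return x[:2] / x[2]

# Synthetic check: a camera 10 m above centre court, looking straight down.
K_true = np.array([[1000.0, 0, 960], [0, 1000, 540], [0, 0, 1]])
R_true = np.diag([1.0, -1.0, -1.0])
t_true = np.array([-14.0, 7.5, 10.0])
P_true = K_true @ np.hstack([R_true, t_true[:, None]])

pts_3d = [(0, 0, 0), (28, 0, 0), (28, 15, 0), (0, 15, 0),   # court corners
          (5.8, 7.5, 0), (22.2, 7.5, 0),                     # free-throw lines
          (1.575, 7.5, 3.05), (26.425, 7.5, 3.05)]           # rim centres
pts_2d = [project(P_true, p) for p in pts_3d]

P = solve_projection_dlt(pts_3d, pts_2d)
```

The clicked points must not all lie in one plane (all-floor points would be degenerate), which is why the sketch includes the two rim positions above the floor.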
Can this sort of thing be done in Blender? And if not, can it be done in some other tool outside Blender that produces data I could use to position and orient my Blender camera?
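For reference, here is how I imagine an external solve would map back onto a Blender camera: split the projection matrix into intrinsics and pose, then convert from the computer-vision camera convention (+Z forward, y down) to Blender’s (−Z forward, y up). This is only a sketch of the math under those assumed conventions; the function names are my own.

```python
import numpy as np

def decompose_projection(P):
    """Split a 3x4 projection matrix P ~ K [R | t] into intrinsics K,
    rotation R, and translation t, via an RQ decomposition of the left
    3x3 block (built from numpy's QR, since numpy has no rq())."""
    M = P[:, :3]
    rev = np.flipud(np.eye(3))            # permutation reversing row order
    Q, U = np.linalg.qr((rev @ M).T)
    K = rev @ U.T @ rev                   # upper triangular
    R = rev @ Q.T                         # orthogonal
    D = np.diag(np.sign(np.diag(K)))      # force positive pixel scales on K
    K, R = K @ D, D @ R
    if np.linalg.det(R) < 0:              # P is only known up to sign;
        R, P = -R, -P                     # pick the proper rotation
    t = np.linalg.solve(K, P[:, 3])
    return K / K[2, 2], R, t

def blender_camera_pose(R, t):
    """Convert a CV-convention extrinsic (x_cam = R @ x_world + t, camera
    looking along +Z with y down) into a Blender-style 4x4 world matrix
    (camera looking along -Z with y up)."""
    C = -R.T @ t                          # camera centre in world coordinates
    flip = np.diag([1.0, -1.0, -1.0])     # CV camera axes -> Blender camera axes
    Mw = np.eye(4)
    Mw[:3, :3] = R.T @ flip               # camera-to-world rotation
    Mw[:3, 3] = C
    return Mw

# In Blender I would then expect something like:
#   cam.matrix_world = mathutils.Matrix(Mw.tolist())
# and, assuming square pixels and horizontal sensor fit:
#   cam.data.lens = K[0, 0] / image_width_px * cam.data.sensor_width
```

The diag(1, −1, −1) flip is the only Blender-specific step: it rotates the camera frame 180° about its x axis so the solved camera looks down its local −Z, the way Blender cameras do.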