I’ve written a script that provides a more accurate emulation of optical-lens focal length for Blender’s camera, intended to make scene-matching with real-world camera output more successful. It also makes it possible to animate focal length (zooms) and DoF effects:
Thanks for the replies, folks. Sounds like it might be a useful accessory
Yes, blenditall, it has a full GUI. Lots of options beyond the basic focal length setting. The website has full details.
Currently I consider the script to be in beta status because I’m the only one who’s spent any time using it – I might be a little short-sighted ;), over-familiar, y’know? So, if anyone’s interested in helping with a test cycle before public release, drop me a Private Message, please.
I want to keep it in very limited circulation until others have used it and it checks out well in terms of un-bugginess and user friendliness.
The contra-zoom (dolly-zoom, or “zoomdolly” as I once named it when first figuring out how to do it in a 3D app) has also been used by such luminaries as Spielberg (Jaws) and Jackson (LoTR/FoTR). It can be overdone, of course, like any effect.
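For anyone who wants to try the effect, the geometry behind it is just similar triangles: to hold the subject at the same size in frame while the focal length changes, the camera distance has to scale in proportion. A quick sketch of that relationship – illustrative names and numbers, not code from the script:

```python
# Contra-zoom relationship: subject size in frame is proportional to
# focal_length / distance, so keeping it constant means distance must
# scale with focal length.  Names and numbers are illustrative only.

def dolly_distance(d0, f0, f):
    """Distance needed at focal length f (mm) to keep the subject the
    same apparent size it had at distance d0 with focal length f0."""
    return d0 * (f / f0)

# Starting 10 m from the subject at 35 mm, zooming to 70 mm means
# dollying back to 20 m over the course of the shot:
print(dolly_distance(10.0, 35.0, 70.0))  # 20.0
```

Animate both the focal length and that distance over the shot and the background appears to stretch or compress while the subject stays put.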
Thanks, BeBraw. Here’s an example of a scene-match from a digital camera shot:
The camera is a consumer-level point-and-shoot: zoom lens only, autofocus, auto-flash, auto-exposure, the whole idiot-proof shebang. But that makes it a good test case for the script and the calibration methods I outline at the BLenses website. This is my first test of the digital-camera calibration method.
Lens was at its shortest zoom-range limit, 6mm focal length by the specs. My calibration got a sensor size of circa 5.323mm (confirmed by a little research – it’s a 1/2.7" sensor, size 5.3mm x 4mm according to one source I found). I shot the pic of my very humble abode with only one measurement – the house was about 20m from the camera. After plugging the numbers into the BLenses script & using its camera controls to get Blender’s camera into a matching position, I dropped a few Suzy statues on my lawn!
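For the curious, the arithmetic behind that kind of calibration is plain pinhole-camera similar triangles. Here’s a sketch of the principle with made-up measurements – not the actual BLenses code:

```python
import math

# Pinhole-camera similar triangles: if a scene span of W metres just
# fills the frame at distance d metres with the lens at f mm, the
# implied sensor width is f * W / d.  The measurements below are
# invented purely for illustration.

def sensor_width(f_mm, scene_width_m, distance_m):
    """Sensor width (mm) implied by a scene scene_width_m wide
    exactly filling the frame at distance_m, lens at f_mm."""
    return f_mm * scene_width_m / distance_m

def horizontal_fov(f_mm, sensor_mm):
    """Horizontal field of view (degrees) of a pinhole camera."""
    return math.degrees(2 * math.atan(sensor_mm / (2 * f_mm)))

w = sensor_width(6.0, 17.7, 20.0)  # ~5.31 mm -- 1/2.7"-chip territory
fov = horizontal_fov(6.0, w)       # the wide-ish view you'd expect
```

Once the sensor size checks out against a known reference like that, the same numbers drive the matching Blender camera.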
Once the script gets out to the public, sharing camera profiles would be useful – there are too many variations in digital sensors to dig them all off the web, so a user-generated database would be great.
Not sure what you mean by a “still zoom.” To me, a zoom is always an active change in focal length during a shot (vid or movie film). In that case, if you know the start and end points of the change in focal length, you can use those as a starting point for generating a matching Ipo for Blender’s camera using the BLenses animation mode. If those values aren’t known exactly, it’ll take more trial and error getting the focal length values to match, but the script can help streamline that process.
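As a first pass at such an Ipo, a plain linear ramp between the two known focal lengths is usually good enough to start refining against the footage. A hypothetical sketch of that first pass – not the script’s own code:

```python
# First-pass zoom curve: linearly interpolate focal length across the
# frame range, then refine per-frame against the live footage.  The
# function name and values are illustrative only.

def focal_ramp(f_start, f_end, n_frames):
    """One focal-length value (mm) per frame, linearly interpolated."""
    if n_frames < 2:
        return [f_start]
    step = (f_end - f_start) / (n_frames - 1)
    return [f_start + i * step for i in range(n_frames)]

curve = focal_ramp(6.0, 6.2, 49)  # 49 frames from 6.0 mm to 6.2 mm
```

If the real lens zoomed non-linearly, individual frames can then be nudged off the ramp until the perspective locks to the footage.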
Ideally, the focal length range of the zoom, plus camera-to-subject distance, and any other camera-position measurements that can be nailed down (height above ground, etc), should be recorded during the shooting for any scene designed to be composited. But idealized shoots rarely happen, so some informed estimates may need to be made and then honed to fit the live footage. For the above composite I tweaked the nominal focal length (from 6 to 6.2mm) and jiggered the camera and POI positions to get the best match to the perspective, making some educated guesses about tripod height, etc. Even so, it only took an hour or so to line up the perspective match. Getting even a halfway decent lighting match took longer.
Worst case would be trying to match a totally “wild” shot – no info at all regarding the camera, lens, zoom range, etc. In that case, there are some possible workarounds to develop a first-trial starting point, but it would be a lengthy process getting a decent match unless Lady Luck was smiling graciously ;). An app like Icarus would be a useful tool in this scenario, with BLenses providing the focal-length matching capability for Blender.
“Still zoom” meaning, you never touch the zoom, it’s constant throughout the shot. (Makes for a boring shot, usually.)
But your answer was what I expected. I’ll have to get the tape measure out more often.
Now, for still shots, it would be SWEET if it could read the EXIF data from the photos and grab all the info (focal length and such) and slap it into the script. (NOT suggesting it! Just something that passed through my mind.)
So you’re equating the word “zoom” with “focal length.” I guess that may be a consequence of so many cameras now sporting only zoom (variable focal length) lenses. For “still zoom” I’d be inclined to say “static” or “fixed” focal length since “zoom” has always meant to me an active change in focal length (the zoom lenses making this possible).
Accessing and parsing the EXIF data for an image would add a major layer of complexity to the script for little gain – the data from the EXIF can be human-read a lot more easily ;). It does have value in providing focal length values for shots made in between zoom range limits when a lens scale isn’t available, but little else in the data is useful for BLenses.
EXIF is also not a stable standard yet, and the data can be corrupted by image processors, so it can’t be considered a universally reliable source of data for the image.
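For what it’s worth, the lookup itself would be tiny once something else had decoded the tags – the complexity is all in the decoding and in trusting the data. A hypothetical helper (not part of BLenses), assuming a tag dictionary like the ones image libraries hand back:

```python
# Hypothetical helper, not part of BLenses: pull the focal length out
# of a dict of already-decoded EXIF tags.  Tag 0x920A is FocalLength
# in the EXIF spec, usually stored as a rational (num, den) pair.

def focal_length_mm(exif_tags):
    value = exif_tags.get(0x920A)
    if value is None:
        return None               # tag absent -- nothing was logged
    try:
        num, den = value          # rational form, e.g. (60, 10)
        return num / float(den)
    except TypeError:
        return float(value)       # some readers give a plain number

print(focal_length_mm({0x920A: (60, 10)}))  # 6.0
```

Which rather proves the point: the hard part isn’t this function, it’s getting trustworthy tags in the first place.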
Ok, you caught me. I’ve never worked professionally with live footage. Static FL makes more sense, you’re right – but yeah, I see variable FL as the same thing as “zoom”… Oh well.
You read too much into my EXIF comment! See, EXIF doesn’t make that much sense to me, but I have some friends who have nice digital video cameras, and these cameras automatically log the focal length and other interesting data onto an SD card, or some such memory, so that when you’re working with the footage, you know what length you were at.
Once that type of camera becomes commonplace… you should consider being able to parse THOSE files.
EDIT: Have I pulled your thread far enough off topic yet? Sorry.
Erm, why don’t you make the script public yet? Or the website? I’d like to take a look at some explanations and examples rather than try the script right now, since I have some other projects going on, but it definitely looks interesting. And your work seems to be very well done, judging from your comments and the effect videos you’ve posted.
It’s basically going through a cycle of beta-testing. One thing I’ve found from scripting in a number of languages and environments (Pixelscript/TcL, RealBasic, UnrealScript, Python, to name the biggies) is that one person can never anticipate all the possible ways a piece of programming can screw up. Users have an uncanny ability to uncover, or trip over, the faults in code. Even just usability can be an issue. So for now, the script is being used on a limited basis so that if anything nasty crops up, or if it seems to have some usability issues, it can be fixed before lots of people start using it. No sense releasing two or three versions of a script within a couple of weeks to patch flaws if some up-front testing can produce a cleaner public release.
I’ll PM you with the website link – feel free to browse away, no obligation to test. Sound like a deal?