Blender was used for a bunch of different parts of this teaser/trailer I worked on: see details below!
- The “Cyberspace” scenes use the “Suicidator City Generator” and extensive compositing node trees.
- The gameplay simulations use orthographic projections of the cyberspace city for the backdrop, and Blender was used to create the icon bezels.
- The VR goggles were modeled, textured, animated, and rendered in Blender.
- The “clip-on wireless jack” has a faceplate made in Blender.
- Near the end, the “floating in space” VR interface was modeled, tracked, and composited in Blender.
The camera tracks were a bit rough: I don’t think there was quite enough camera motion for Blender’s tracker to get a solid lock. Even with reprojection errors of 0.3 and below, there is a pronounced curve to what should be a flat plane. The object tracks were rougher still. I ran into a lot of trouble with scale and the scale parameter’s odd behavior, and even at low reprojection errors the tracking point cloud would occasionally invert for a frame or two. All of this required a lot of manual cleanup, which unfortunately yields a less accurate track: more than a little visible sliding remains.
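For anyone curious what those error numbers actually measure: reprojection error is the pixel distance between where a tracked feature was observed in the footage and where the solved 3D point lands when projected back through the solved camera. Here is a minimal illustrative sketch (not Blender’s solver; the focal length, principal point, and sample coordinates are made up for the example):

```python
# Illustrative sketch of reprojection error for a single tracked point.
# All camera parameters here (focal length in pixels, 1920x1080 principal
# point) are assumed for the example, not taken from the actual trailer.
import math

def project(point3d, focal_px, principal=(960.0, 540.0)):
    """Pinhole projection of a camera-space 3D point to pixel coordinates."""
    x, y, z = point3d
    return (focal_px * x / z + principal[0],
            focal_px * y / z + principal[1])

def reprojection_error(observed_px, point3d, focal_px):
    """Pixel distance between the 2D observation and the reprojected 3D point."""
    px, py = project(point3d, focal_px)
    ox, oy = observed_px
    return math.hypot(px - ox, py - oy)

# A hypothetical solved point whose reprojection lands 0.3 px away from
# where the tracker actually observed the feature:
err = reprojection_error((1000.0, 540.3), (40.0 / 1920.0, 0.0, 1.0), 1920.0)
print(err)  # approximately 0.3
```

The catch, as I found out, is that the solver reports the average of this over all tracks and frames, so a low number can still hide frames where individual points jump badly.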
I relied heavily on Blender’s compositor for nearly all the scenes: projecting shadows onto a “head” model that was then keyed over the footage noticeably improved the realism. I’m particularly happy with how the extreme closeup of the goggled character turned out.
A lot of the compositing work was done in Mocha Pro and Apple’s Motion. Color grading was done in Apple Color (DaVinci Resolve kept choking on timecode weirdness in the transcoded Canon HDSLR video), and editing was done in Final Cut Pro 7.
All in all, I learned a lot about how to track in Blender and, more importantly, how to plan shots to give the best chance at a successful track!
The whole trailer is in support of a Kickstarter campaign to raise money for an iOS (iPhone, iPad, iPod Touch) video game:
Thanks for looking!