Jetset & Autoshot: Blender-centric Virtual Production (FREE BETA)

I’m excited to finally post this; it’s the system I wanted to build when I started Lightcraft a long time ago.

You can read some of our history at https://www.lightcraft.pro/about, but the basic idea has stayed the same: if you automated a bunch of the tedious tasks in VFX, like camera tracking and matte extraction, a small team could make a full production.

Our early systems were big and expensive, but they enabled massive shot processing and won awards. Their Achilles’ heel (besides the $100k+ price tag!) was that only the production/data-capture side of the pipeline was automated. The post-production side was always a big unsolvable mess: Nuke, Maya, Python, hacked scripts, etc.

And then we saw what Blender had become, and where it was going.

One of our users (Kyle Dell’Aquila) showed us the current version of Blender in 2021, and we switched our whole 3D pipeline over to it. Then we saw 3.0 and Cycles-X, followed by the Viewport Compositor, and for the first time it was possible to automate and unify the 2D and 3D sides of VFX post work – quickly.

With Blender and an iPhone or iPad, we could make complex VFX production both easy to use and available to (almost) everyone. We built a whole system around this idea, and here we are.

The system has 2 parts. The production or on-set side is an iOS app called Jetset. You can sign up for a free beta invite by clicking the gold button at the top of https://www.lightcraft.pro/.

It does real-time 3D camera tracking, rendering, and compositing on a recent iPhone or iPad Pro (LiDAR is a big plus!), and reads Blender-generated USDZ files for its virtual scenes.
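
If you like to script your exports, the USDZ side is just Blender’s built-in USD exporter. Here’s a minimal sketch, assuming a Blender build whose USD exporter can write .usdz directly (otherwise export .usdc and repackage it, e.g. with Apple’s usdzip):

```python
import bpy

# Export the current scene as USDZ for use as a Jetset virtual scene.
# Assumes the USD exporter in this Blender build accepts a .usdz
# extension; the filename is a placeholder.
bpy.ops.wm.usd_export(
    filepath=bpy.path.abspath("//jetset_scene.usdz"),  # '//' = next to the .blend
    export_materials=True,
    export_animation=False,
)
```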

Jetset captures all the technical data (iPhone/iPad original video, 3D environment scans, 3D tracking, etc.) needed to create complex VFX shots. It helps the user line up a given live-action ‘mark’ with a corresponding 3D Mark (or Scene Locator) in the 3D blend file.
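
For illustration, a locator like that can be authored as an empty in Blender with a few lines of Python. The name ‘SceneLocator’ and the position here are made up for the example; check the Jetset tutorials for the exact convention the app expects:

```python
import bpy

# Hypothetical sketch: add a Scene Locator empty at a known floor mark.
loc = bpy.data.objects.new("SceneLocator", None)  # None = empty object
loc.empty_display_type = 'ARROWS'
loc.empty_display_size = 0.5
loc.location = (2.0, 1.0, 0.0)  # where the physical mark maps into the set
bpy.context.scene.collection.objects.link(loc)
```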

You can see more details of how it works at https://www.lightcraft.pro/tutorial/jetset-01/your-first-shot-with-jetset

There are 3 free models that you can pick when you start up the app, and you can also click Upgrade to load in your own Blender-created USDZ files.

Important: the Jetset beta is done through Apple’s TestFlight program, and when you click Upgrade it will ask you if you want to pay for a monthly or yearly subscription. The beta is FREE, so there is NO CHARGE either way. Picking the ‘yearly’ option will keep the Upgrade option working for about a day at a time. (I know, it’s a pain, but this is how Apple does it, so we’re working with it...)

Autoshot is a PC executable (Mac version upcoming) that pulls take files from Jetset, links to the original Blender 3D scene file, and builds a new ‘Render’ blend file with the camera footage brought in and keyed through the Viewport Compositor.
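
You can get a feel for what that generated file contains with a hand-rolled version of the same idea: footage keyed over the CG render in the scene’s compositor node tree, which the Viewport Compositor evaluates live. The node layout, file path, and key color below are illustrative guesses, not Autoshot’s actual output:

```python
import bpy

# Minimal keying comp: live-action footage over the CG render.
scene = bpy.context.scene
scene.use_nodes = True
tree = scene.node_tree
tree.nodes.clear()

footage = tree.nodes.new('CompositorNodeMovieClip')
footage.clip = bpy.data.movieclips.load('/path/to/take.mp4')  # placeholder

key = tree.nodes.new('CompositorNodeKeying')
key.inputs['Key Color'].default_value = (0.0, 0.8, 0.0, 1.0)  # greenscreen

render = tree.nodes.new('CompositorNodeRLayers')  # the CG background
over = tree.nodes.new('CompositorNodeAlphaOver')
out = tree.nodes.new('CompositorNodeComposite')

tree.links.new(footage.outputs['Image'], key.inputs['Image'])
tree.links.new(render.outputs['Image'], over.inputs[1])  # background
tree.links.new(key.outputs['Image'], over.inputs[2])     # keyed foreground
tree.links.new(over.outputs['Image'], out.inputs['Image'])
```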

The free Autoshot downloads are here (https://www.lightcraft.pro/downloads), and there is a series of tutorials on installing and running the Autoshot pieces, but you can see the live comp come together here: https://youtube.com/watch?v=x8IQSvrgNEM&si=EnSIkaIECMiOmarE&t=509

The live viewport composite naturally leads to a more detailed 2D final composite. We’ve added some powerful node groups that automate bluescreen & greenscreen operations like screen compensation and 3D tracking garbage mattes:
https://www.lightcraft.pro/tutorial/autoshot-03/screen-compensation
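
Those node groups are prebuilt, but the underlying garbage matte idea is roughly this: render the scanned screen geometry from the tracked camera, and discard everything outside it before keying. A hand-rolled sketch, with made-up layer and node names:

```python
import bpy

# Assumes a view layer that renders only the scanned screen geometry
# from the tracked camera; Autoshot's node groups automate all of this.
scene = bpy.context.scene
tree = scene.node_tree

if 'ScreenMatteLayer' not in scene.view_layers:
    scene.view_layers.new('ScreenMatteLayer')

matte = tree.nodes.new('CompositorNodeRLayers')
matte.layer = 'ScreenMatteLayer'

# The Keying node removes white garbage-matte areas, so invert the
# screen's alpha coverage: white everywhere the screen is NOT.
inv = tree.nodes.new('CompositorNodeInvert')
tree.links.new(matte.outputs['Alpha'], inv.inputs['Color'])

key = tree.nodes['Keying']  # the keying node from the sketch above
tree.links.new(inv.outputs['Color'], key.inputs['Garbage Matte'])
```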

There is even a feature for wrapping up a shot into a ZIP file suitable for SheepIt or similar Blender render farm systems, with 4K video images, while keeping the upload size small:
https://www.lightcraft.pro/tutorial/autoshot-03/render-farm-export
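
Under the hood this is the classic pack-and-zip hand-off. A bare-bones sketch looks like the following (placeholder paths; note that Blender can’t pack video into a .blend, so shipping the footage efficiently is the part the Autoshot export handles for you):

```python
import bpy
import os
import zipfile

# Embed external resources (textures, etc.) into the .blend, save it,
# then zip the result for upload to a render farm.
bpy.ops.file.pack_all()
blend_path = '/tmp/shot_render.blend'  # placeholder
bpy.ops.wm.save_as_mainfile(filepath=blend_path)

with zipfile.ZipFile('/tmp/shot_render.zip', 'w', zipfile.ZIP_DEFLATED) as z:
    z.write(blend_path, os.path.basename(blend_path))
```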

There is a lot more (converting Unreal scenes to Blender with Omniverse! Render farm setup! Distributed Blender workflows! Driving shots!), but we want to start seeing what people do with this.

This is a free beta, so we want people to try it out and let us know what we got right and what we need to fix.

I hope to see some small teams make a big splash.

-Eliot