Fluttering Flag and Augmented Reality: Blender Assets for Two Museum Interactives

We have an urgent need for a Blender generalist to help create assets for two museum interactives (described below). The total amount of work will likely be 3-4 days: 2 days immediately, with the balance spent on minor iterations over the next month. As you will communicate with our art director and programming team via email and Skype, a decent command of written and spoken English is required. After reading the descriptions below, if you are interested, please PM me explaining how your skills qualify you for this work. Be sure to specify a desired hourly rate (responses that omit this will not be considered). Note that as part of this work, you must assign copyright over all work product to our company. You must also supply all Blender files needed to replicate the outputs.

What follows is a description of the two interactives along with the specific Blender outcomes we are looking for.


The first interactive is a linear, 8-screen multi-touch table showing a historical timeline. If no visitor touches the table for 1 minute, we want to apply a gently fluttering flag effect to the screen contents. It’s important to understand that the screen contents aren’t known in advance; they depend on how the most recent visitors have been using the table. Step 1, then, will be to take a snapshot of the eight 1920x1080 screens to act as the flag texture. We then plan to use an OpenGL shader and pre-calculated UV and normal maps to achieve the desired effect.

The Blender work here involves setting up a loopable cloth simulation of a fluttering flag, rendering out the corresponding frames as UV and normal maps, and encoding this information into the RGB channels of two QuickTime videos. Our programmer will write the OpenGL shader that references frames of the video, unpacks the UV and normal-map data, and applies the flag texture.
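To make the encoding step concrete, here is a minimal Python sketch of one plausible packing scheme: per-pixel UV offsets in [-1, 1] quantized into 8-bit colour channels, which the shader would then reverse (unit normals would use the standard n * 0.5 + 0.5 normal-map convention). The exact ranges and channel assignments are assumptions to be agreed with our programmer, and note that 8 bits per channel plus video compression limits precision, so a near-lossless codec is worth considering.

```python
def pack_uv(du, dv):
    """Quantize a UV offset pair in [-1, 1] into 8-bit channel values
    (e.g. the R and G channels of one video frame)."""
    to_byte = lambda x: round((x * 0.5 + 0.5) * 255)
    return to_byte(du), to_byte(dv)

def unpack_uv(r, g):
    """Shader-side inverse: map the bytes back to offsets in [-1, 1]."""
    to_float = lambda b: (b / 255.0) * 2.0 - 1.0
    return to_float(r), to_float(g)

# Round-trip error stays within one quantization step (1/255).
r, g = pack_uv(0.25, -0.5)
du, dv = unpack_uv(r, g)
assert abs(du - 0.25) < 1 / 255 and abs(dv + 0.5) < 1 / 255
```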


The second interactive is a pan-tilt screen situated on a balcony overlooking what used to be a speed-skating oval. A video camera on the back of the screen shows a live feed of the space below, effectively turning the screen into a digital window. As the visitor pans and tilts the screen, an augmented-reality overlay tracks features and provides hotspots they can press for added information.

The Blender work here involves setting up a simple 3D scene that matches our physical setup. This will include accurate placement of a few building details, virtual cameras at the three locations where our interactive will be deployed, and a simple model of the original oval track. We have lots of reference material, architectural plans … and even a detailed SketchUp model of the space (which is overly complicated). The output will be in Collada format, suitable for import by SceneKit on OS X.
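By "simple model of the oval track" we mean something on the order of a stadium-shaped centreline (two straights joined by semicircular ends) extruded into a flat ribbon. The sketch below generates such a centreline in plain Python; the dimensions are placeholders, not measurements of the actual rink, and in practice the points would be built into a Blender curve or mesh via bpy.

```python
import math

def oval_outline(straight=112.0, radius=26.0, segments=16):
    """Return (x, y) points tracing a stadium-shaped oval centreline:
    two straights of length `straight` joined by semicircular ends of
    radius `radius`. All dimensions are illustrative placeholders."""
    pts = []
    half = straight / 2.0
    # Right semicircular end, swept from the bottom straight to the top.
    for i in range(segments + 1):
        a = -math.pi / 2 + math.pi * i / segments
        pts.append((half + radius * math.cos(a), radius * math.sin(a)))
    # Left semicircular end, swept from the top straight back to the bottom.
    for i in range(segments + 1):
        a = math.pi / 2 + math.pi * i / segments
        pts.append((-half + radius * math.cos(a), radius * math.sin(a)))
    return pts

pts = oval_outline()
```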

Thanks for reading.


I’ll do it.