Blender to get Inpainting node! Awesomeness

Peter Schlaile proposes a new “Inpainting” node for Blender! :eek:

Check out the wiki:

Potential for denoising here, depending on the algos employed :eyebrowlift:.

Sadly it seems that Blender is in lockdown for 2.64 :frowning: but maybe Mango could wedge it in, as there is more potential for another detail keyer!

This seems like a big deal for the compositor.


I wrote an inpaint node for the compositor.

Since we are directly before a release, I haven’t committed it, but I’d
like people to review it.

In case you don’t know, inpainting does this:

Its use cases in Blender are

  • wire removal
  • green screen background reconstruction

The latter is important to actually improve keying in Blender.

The node isn’t tile based (for fundamental reasons), but it is very fast,
since it first builds a Manhattan distance map and after that performs
color convolution only on the edges.
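To make the idea concrete, here is a rough pure-Python sketch of that approach (my own illustration, not Peter’s actual implementation): a two-pass chamfer sweep builds the Manhattan distance map, then unknown pixels are filled ring by ring in order of increasing distance, so the averaging work only ever happens along the moving edge of the hole.

```python
# Sketch of distance-map-based inpainting (illustration, not Peter's code).
# img is a 2D grid of grayscale values, known marks pixels with valid data.
# Assumes at least one known pixel exists.

INF = 10**9

def manhattan_distance_map(known):
    """Two-pass chamfer sweep: Manhattan distance to the nearest known pixel."""
    h, w = len(known), len(known[0])
    d = [[0 if known[y][x] else INF for x in range(w)] for y in range(h)]
    for y in range(h):                      # forward pass (top-left -> bottom-right)
        for x in range(w):
            if y > 0: d[y][x] = min(d[y][x], d[y-1][x] + 1)
            if x > 0: d[y][x] = min(d[y][x], d[y][x-1] + 1)
    for y in range(h - 1, -1, -1):          # backward pass (bottom-right -> top-left)
        for x in range(w - 1, -1, -1):
            if y < h - 1: d[y][x] = min(d[y][x], d[y+1][x] + 1)
            if x < w - 1: d[y][x] = min(d[y][x], d[y][x+1] + 1)
    return d

def inpaint(img, known):
    """Fill unknown pixels ring by ring, averaging already-filled neighbours."""
    h, w = len(img), len(img[0])
    d = manhattan_distance_map(known)
    out = [row[:] for row in img]
    for level in range(1, max(max(row) for row in d) + 1):
        for y in range(h):
            for x in range(w):
                if d[y][x] == level:
                    # every level-n pixel has at least one neighbour closer to
                    # the known region, so this list is never empty
                    nbrs = [out[ny][nx]
                            for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1))
                            if 0 <= ny < h and 0 <= nx < w and d[ny][nx] < level]
                    out[y][x] = sum(nbrs) / len(nbrs)
    return out
```

The key property is that each pixel is visited a constant number of times, regardless of how deep the hole is.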

That’s something one should probably also add to the dilate node (in
step mode) to make it perform a lot better for dilate iteration counts greater
than 3.

It would bring its computing time from O(n^3) down to O(n^2).
Take a look here for the details:
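The dilate speed-up can be sketched the same way (again my own illustration of the idea, under the assumption of 4-neighbour “step mode” dilation): running r iterations of binary dilation touches every pixel r times, but thresholding a Manhattan distance map gives the identical result in just two sweeps, independent of r.

```python
# Sketch: "dilate by r steps" via a distance map instead of r full passes.
# A pixel ends up set iff its Manhattan distance to the nearest set pixel <= r,
# which is exactly r iterations of 4-neighbour binary dilation.

INF = 10**9

def manhattan_distance(mask):
    """Two-pass chamfer sweep: distance to the nearest set pixel."""
    h, w = len(mask), len(mask[0])
    d = [[0 if mask[y][x] else INF for x in range(w)] for y in range(h)]
    for y in range(h):
        for x in range(w):
            if y > 0: d[y][x] = min(d[y][x], d[y-1][x] + 1)
            if x > 0: d[y][x] = min(d[y][x], d[y][x-1] + 1)
    for y in range(h - 1, -1, -1):
        for x in range(w - 1, -1, -1):
            if y < h - 1: d[y][x] = min(d[y][x], d[y+1][x] + 1)
            if x < w - 1: d[y][x] = min(d[y][x], d[y][x+1] + 1)
    return d

def dilate(mask, r):
    """Equivalent to r iterations of 4-neighbour binary dilation."""
    d = manhattan_distance(mask)
    return [[1 if dv <= r else 0 for dv in row] for row in d]
```

A single pixel dilated by r grows into a diamond of radius r; the cost no longer depends on r at all.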

My aim is to implement something like the IBK keyer in Nuke, since all current
solutions within Blender fail rather badly on hair details.

You can see first steps in this direction here, which already do some
nice improvements:

(compare key_raw.png to inpaint_key.png)

The trick I use is the following:

If you consider the usual compositing equation

Composite = Background * (1-ALPHA) + Foreground * ALPHA

for the case that Background is our GreenScreen and Foreground is the
object we want to separate, you’ll notice that we can more or less
successfully pull an alpha matte (using a color channel node), but
currently fail to subtract the GreenScreen Background properly from the composite.

That’s no surprise, since until now, the GreenScreen Background wasn’t
actually known (we don’t have any clean plates shot in Mango).

But: we can inpaint the surrounding greenscreen into the area behind
the semi-transparent regions and subtract that instead.
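Worked through per pixel, the trick is just the compositing equation solved for the Foreground once an inpainted Background estimate exists (a minimal sketch; variable names are mine):

```python
# Given  Composite = Background*(1-alpha) + Foreground*alpha,
# an inpainted estimate of the green-screen Background lets us recover the
# Foreground by subtraction, for alpha > 0:

def recover_foreground(composite, background, alpha):
    """composite/background are (r, g, b) tuples, alpha in (0, 1]."""
    return tuple((c - b * (1.0 - alpha)) / alpha
                 for c, b in zip(composite, background))

# a half-transparent hair pixel over pure green screen:
bg = (0.0, 1.0, 0.0)          # inpainted green-screen estimate
fg = (0.4, 0.3, 0.2)          # true hair colour
a = 0.5
comp = tuple(b * (1 - a) + f * a for b, f in zip(bg, fg))
print(recover_foreground(comp, bg, a))   # recovers fg up to rounding
```

This only works if the Background at that pixel is actually known, which is exactly what the inpainting provides for the semi-transparent regions.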

The only thing missing in Blender for that task is said inpainting
node. And that’s why I added it :slight_smile:

So: please use my git repository at

git checkout image-keyer

and tell me your findings.

If I should commit to trunk, please let me know. If team Mango can make
use of it, feel free to commit to tomato branch.

Cheers and good night,

P.S.: There are a lot more sophisticated solutions for inpainting, some
are convolution based, like my simple approach (which convolves the
known surrounding pixels with a weighted average into the unknown
region), some are a lot more advanced. For all practical reasons (the
ones noted above), my node should work fairly well. If you want to add
additional inpainting algos, feel free to add a type and activate
custom1 as a type parameter variable.

Peter Schlaile

This is a great addition to Blender. It should really be in trunk ASAP. Nodes like this one bring Blender closer to professional compositors. Thanks to Peter!

No worries.

It can go into 2.65. Checking things and “stabilizing” new versions is being done for a reason.

Many thanks to Peter for making that thing.

Wow, latest update from Peter. Amazing keyer results:

OMG that’s dagnabbit fantastic :eek: Here’s the same image comped with Nuke’s IBK keyer at default settings:

Anyone knows what happened to this great node? Has it been included in any builds yet?

it’s in trunk and 2.64.
you can find it under the Filter nodes.
if you wanna “inpaint” a masked area, use the Set Alpha node: add the mask as alpha to your image, feed that result into the Inpaint node, and transparent areas will be “inpainted”.

Where is the “Image Keyer” node from this image?

It’s now just called “Keying”. I think the node in that screenshot was an early version of it.

Reading Peter Schlaile’s proposal and trying to understand. So this is basically an improved method of removing a background like a green screen?

I’m not entirely sure. Without having played with the node yet, I thought inpainting was more like noise removal by comparing frames?

Inpainting fills the gaps in an alpha channel.

In After Effects, it’s the same thing the Re:Fill plug-in does.

Cycore’s wire removal tool also does the same type of thing.
Blender example:
Here I created an alpha channel with the mask tool. I drew a shape around the twigs on the wall.

And with the Inpaint node, you can see how it pulls the pixels into the gap created by the roto, filling it. You can also see in this example that this isn’t always the best solution for removing things from an image. The correct way to remove that twig from the wall would be to create a cleaned-up patch image and track it in.

However, in regards to keying, the Inpaint node is awesome!

It plays a key part in copying Nuke’s IBK (Image Based Keyer). Nuke’s IBK works by taking a greenscreen shot and comparing it with a “clean” pass of the same greenscreen: the exact same shot, once with an actor and once without. It uses the math described in Peter’s write-up to mathematically replace the green with the background image. You can see Nuke’s IBK in action here:

Blender’s new Keying node has an input for a Key Color. If you use this color swatch to sample a green from the image, it works as a standard keyer does. But, if you use the Inpaint node to create a “clean” version of your greenscreen footage, you can plug that image into the key color input, and it will then function very much like Nuke’s IBK.
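One way to see why the per-pixel clean plate matters is a screen-difference sketch like the following (my own illustration of one common formulation; the exact math inside Nuke’s IBK or Blender’s Keying node may well differ): dividing each pixel’s green excess by the clean plate’s green excess at the same pixel makes uneven screen lighting cancel out.

```python
# Hedged sketch of an IBK-style screen-difference alpha (one common
# formulation -- not necessarily what Nuke or Blender actually compute).

def screen_alpha(pixel, clean):
    """pixel and clean are (r, g, b) tuples; clean is the inpainted plate."""
    r, g, b = pixel
    cr, cg, cb = clean
    excess = g - max(r, b)              # how "green screen"-like this pixel is
    clean_excess = cg - max(cr, cb)     # same measure on the clean plate
    if clean_excess <= 0.0:
        return 1.0                      # clean plate isn't green here: keep pixel
    a = 1.0 - excess / clean_excess
    return min(1.0, max(0.0, a))        # clamp to [0, 1]

print(screen_alpha((0.1, 0.8, 0.1), (0.1, 0.8, 0.1)))  # pure screen -> 0.0
print(screen_alpha((0.4, 0.3, 0.2), (0.1, 0.8, 0.1)))  # opaque hair -> 1.0
```

With a single sampled key color instead of a per-pixel clean plate, `clean_excess` is a constant, so any lighting variation across the screen leaks straight into the matte.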

It’s really good, advanced keying stuff. :slight_smile:

Hi, thanks! I got a little confused by the difference in complexity, but got it working. Heh, as a Nuke guy I actually feel like using the Blender compositor more now :slight_smile:

Thanks, hype!

I tried to use the keyer mode as a subtraction node, which was OK, but the noise from the camera killed the idea, as it was too different from frame to frame.

I’m glad this made it into trunk, what a powerful node!

very exciting news :slight_smile: