"Edge detection-like filter" with nodes?

As always, sorry for my bad english

So, there is a filter that is very common in image editor programs, called “edge detection”. Basically (at least, that’s what I think it does…) it looks for abrupt changes of color in the image, so it’s kind of looking for “borders” between colors, and then it creates lines along these borders. Take a look at this picture:

Now, take a look at this:

This is a plane made of Voronoi cells deformed by a noise texture (a different kind of deformation, which you can find here:

and here:
http://blender.stackexchange.com/questions/45892/is-it-possible-to-distort-a-voronoi-texture-like-the-wave-textures-distortion-sl )

I simply want to get the borders from these materials: the borders between the cells. If there is a way to make the material detect the boundaries between the cells (it can judge by the colors, just like the filter does), that would be very useful. That’s basically it, so if you can help me, I’ll be very happy! :slight_smile:


You could use a simple Color Ramp to add colors around the edges!

Are you using Cycles or BI?

A Color Ramp detects all edges in the texture and adds some colors after each edge.

See example.

happy cl


"Edge detection is simple image processing, which aims at identifying points in a digital image at which the image brightness changes sharply, or more formally, has discontinuities. Sobel Edge detection uses implementation of the Sobel operator as explained in http://en.wikipedia.org/wiki/Sobel_operator." -from the Internet.

Easy to do in a photo editor; it could likely be done in Blender’s Compositor. AFAIK it’s not possible using Cycles material nodes.
You could, however, generate an image and send it to an external application (G’MIC or ImageMagick, e.g.), then get the resulting image back using Python.


It’s possible to do some edge detection with nodes. All you need to do is find the derivatives of your texture color (in two or three dimensions, depending on the nature of your texture). This means creating copies of your original texture, translating them by a small amount along each axis, and calculating the derivative against the original.

However, this is not very practical if, for example, you need to make changes to your texture, not to mention the slowdown at render time.
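The finite-difference trick described above can be sketched outside Blender in plain Python. The `tex` function here is a hypothetical stand-in for any procedural texture (in the real node setup it would be the Voronoi texture); the logic of sampling at slightly offset coordinates and comparing against the original is the same:

```python
def tex(x, y):
    # Toy "cell" texture: a constant value inside each unit cell,
    # so every cell border is a sharp color change.
    return (int(x) + int(y)) % 2  # 0 or 1, checkerboard of cells

def edge(x, y, delta=0.01, threshold=0.0):
    """Return True if the texture changes within `delta` of (x, y)."""
    c0 = tex(x, y)
    dx = abs(tex(x + delta, y) - c0) / delta   # finite difference along x
    dy = abs(tex(x, y + delta) - c0) / delta   # finite difference along y
    return (dx + dy) > threshold

print(edge(0.5, 0.5))    # well inside a cell -> False
print(edge(0.999, 0.5))  # just before the x = 1 border -> True
```

Smaller `delta` gives thinner edge lines, exactly like shrinking the offset fed into the duplicated texture's coordinates in the node tree.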


Part of the problem, I think, is Blender’s relative inability to blur textures in the Material Node Editor. Secrop is presumably using a Noise Texture to blur it a bit (the best way to blur a texture, AFAIK: by scrambling the texture coordinates a little).

Here’s an example from the Compositor, where you have a Blur node: blur your image, then composite it over the original image with a Difference blend mode.

So if you can manage this sort of blur with a texture, you can achieve a similar result in the Material Node Editor.
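The blur-then-difference recipe above can be mimicked in a few lines of plain Python (no Blender): blur a signal, subtract it from the original, and the residue is largest exactly where the signal changes sharply. The `box_blur` helper is a hypothetical stand-in for the Compositor's Blur node:

```python
def box_blur(signal, radius=1):
    """Simple 1-D box blur with clamping at the ends."""
    n = len(signal)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def difference(a, b):
    # Equivalent of the Difference blend mode, per sample.
    return [abs(x - y) for x, y in zip(a, b)]

# A step edge between two flat regions:
signal = [0, 0, 0, 1, 1, 1]
edges = difference(signal, box_blur(signal))
print(edges)  # nonzero only around the step, i.e. at the edge
```

The flat regions cancel out completely; only the samples adjacent to the step survive, which is why the composite reads as an edge map.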


The noise was just to match the OP’s texture, and to give the Voronoi cells that wobbling effect.

The main part is where a tiny bit is added to each coordinate. The value node ‘Delta’ scales each component (RGB), and the result is added to the original coordinates. After that, we get the texture values at those new coordinates and compare them to the texture values at the original coordinates to determine the derivative. If the derivative is greater than 0, it’s because there was a color change, and we can do something with it.