Why isn't there a simple, straightforward blur texture/shader node already?

This has really astounded me for years… the concept of blurring cannot possibly be alien to the developers of Blender. So why isn't there already a straightforward blur node? It makes absolutely no sense. It's not a new concept; blurring has been a known thing in photography for hundreds of years.

So can someone explain what is so difficult about adding a node that simply takes the color/image from a previous node and applies an honest-to-god, straight-up blur? With perhaps a few parameters the user can tweak, like blur size/range and amount (e.g. internally mixing with the original image/texture/color input). Every option for doing this that I have seen relies on manipulating the mapping on the vector inputs, long before any images or textures are involved. There doesn't seem to be any way to do it further along the node chain, because it relies on the mapping, which can only happen before any images/patterns/textures are generated. After that point, there is nothing.

Can someone explain to me why this simple thing wasn't added to Blender years ago? I don't get what is so difficult.

Case in point: in the image below, I have a generated sky/starfield in progress. Some of the star clusters are denser, and I wanted those stars to shine brighter, so I have separated them out.

To achieve the effect I want, I would now blur this before mixing it back with the original, unconnected node. But there isn't any way to feed the generated image in as an input, so basically I am stuffed by the lack of what really should be a basic, functional node.


Well, Cycles has been part of Blender since 2.62.
During the 2.6x series, the work was about adding the basics to the software.
It should have been done at that moment.
But because Cycles was not advanced enough, most users were using Blender Internal during that period.
Then, during 2.7x, the priority was to push more advanced stuff (baking, hair, volumes, displacement) and to improve support for the features needed by the open movie productions (Cosmos Laundromat, Caminandes, Agent 327).
After the 2.8x refactor, the idea was to add stuff compatible with EEVEE that could replace BI features, like the new procedural texture nodes.
That is also the period when multiple works on denoising were done.
During the 2.9x series, Cycles X became the target for the 3.x series.

There were two windows, during the 2.6x and 2.7x series, when developers could have thought about that.
But they did not.

Nowadays, a blur node is a target. It was announced in the article about texture layers.


Blurring is actually a lot more difficult than you might think, for both EEVEE and Cycles (rasterization and path tracing). It's a well-known problem with many imperfect solutions; you will find this same question asked about other render engines as well.

Think of it this way: a ray hits an object, which is shaded with, for example, a procedural “clouds” texture.

Procedural clouds are made in such a way that you can ask “give me a color at coordinate (x,y)”. If you sample this with a different coordinate every time, like a UV coordinate, you get a coherent cloud pattern. This is how you build shaders in the material editor.

Now imagine you want to perform a 50 px blur. Now you can't just ask for the value at (x, y); you have to ask for every pixel under that 50 px kernel and take a weighted average. That means suddenly you're not asking it 1×, but roughly 50 × 50 = 2500×. That's no good for render times, of course.
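Here is a minimal Python sketch of that cost blowup, with a hypothetical `clouds(x, y)` function standing in for a procedural texture and a pixel size picked purely for illustration:

```python
import math

def clouds(x, y):
    # Hypothetical stand-in for a procedural texture: a pure function that
    # answers "give me a color at coordinate (x, y)". A real clouds texture
    # would use layered noise instead of a sine wave.
    return 0.5 + 0.5 * math.sin(x * 12.9898 + y * 78.233)

# One shading sample: the renderer asks for the value at a single point.
value = clouds(0.4, 0.7)

def blurred_clouds(x, y, radius_px=25, px=1.0 / 1024):
    # A naive 50 px blur at that same point: evaluate the texture once per
    # pixel under the kernel and average, i.e. roughly 50 * 50 = 2500
    # evaluations instead of 1, and that for every shading sample.
    total, count = 0.0, 0
    for dx in range(-radius_px, radius_px + 1):
        for dy in range(-radius_px, radius_px + 1):
            total += clouds(x + dx * px, y + dy * px)
            count += 1
    return total / count
```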


So, the reason you can't put a blur "after" your texture sample is that at that point, it's just a color. It has no knowledge of what came before, of neighboring pixels, or of anything else.


The easiest way to "blur" a texture is to jitter the input vector with a random value, for example by using a White Noise node. This adds a bit of randomness for each render sample, which gets smoothed out over time, but it's not perfect and only works well at higher render sample counts. https://www.youtube.com/watch?v=Ci2FDxsUjyQ
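A small sketch of that idea, reusing the hypothetical `clouds(x, y)` stand-in from above; in an actual node tree the random offset would come from a White Noise (or similar) node added to the texture's vector input:

```python
import math
import random

def clouds(x, y):
    # Same hypothetical stand-in procedural texture as in the sketch above.
    return 0.5 + 0.5 * math.sin(x * 12.9898 + y * 78.233)

def jittered_lookup(x, y, blur_radius):
    # One render sample: offset the lookup by a random vector within the
    # blur radius, which still costs only one texture evaluation.
    dx = random.uniform(-blur_radius, blur_radius)
    dy = random.uniform(-blur_radius, blur_radius)
    return clouds(x + dx, y + dy)

# Averaged over many render samples, the jitter converges towards a blur,
# which is why the trick stays noisy at low sample counts.
samples = [jittered_lookup(0.4, 0.7, blur_radius=0.05) for _ in range(512)]
print(sum(samples) / len(samples))
```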

If you can bake your textures to an image, you can pre-blur that image (in Photoshop/GIMP/Krita, in the Blender compositor, or anything else) and use that.


I can't argue about technicalities and difficulties, since I don't know what they are. From an artistic perspective, blurring is just a basic and useful building block in both contexts: in a shader as well as in a compositing network (the latter is available in Blender). Let's hope it eventually arrives. I tried the UV blurring trick; I'm just not sure if or how I can use it at any point in the shading tree, and for any type of image data, whether loaded from disk or generated by other nodes.

Hello,
What we call "blur" is a pixel-level process that fakes the optical situation where something is out of focus in a camera or the eye.
In real life, when the light does not converge on the focal plane, the rays leave a "jittered" imprint, creating the blur effect. And this is what the noise method pointed out by @ThomasKole recreates.

If we want to use a conventional blur filter, we have to "bake" the material.
I really like the idea of a "convert to image texture" node (it could even lead to merging shading nodes and compositing nodes into the same editor), while being aware that we would lose proceduralism in that branch of the shader.
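Once the data exists as pixels, neighbor access is trivial and a conventional blur is just a convolution. A rough sketch in plain Python (a naive box blur over a baked grayscale image; a real tool would use a Gaussian kernel and a far faster separable or GPU implementation):

```python
def box_blur(image, radius):
    # Conventional image-space blur: average each pixel with its neighbors.
    # `image` is a 2D list of grayscale values, e.g. a baked texture.
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += image[ny][nx]
                        count += 1
            out[y][x] = total / count
    return out

# A hard 0/1 edge, as you might get from a baked mask,
# softened once the neighboring pixels are known.
baked = [[0.0] * 4 + [1.0] * 4 for _ in range(8)]
soft = box_blur(baked, radius=1)
```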

I've watched two or three tutorials that offer solutions for blurring with a node setup; I guess you already know these solutions? But yes, it would be great to simply add a blur node. It would also be great to have a node that converts an image into a normal map.

Unless we're talking about depth of field… as Thomas said above, blurring is impractical or impossible in a path tracer. However, in an image-processing context it is straightforward. The compositor does this very well already, and if you need procedural blur, the new texture nodes (2023?) may have your back as well.


Your knowledge of Blender development is astounding. Do you know why Texture Nodes are implemented separately instead of being part of Shading Nodes? That seems like extra work to me, and something that will eventually be merged anyway.

As for the blur, there is a Blur node in the Compositor, but I've never understood when a blur node would be useful in shading. I know the Substance programs have it, but I've never found a use for artificial blurring. I want to know cases where people use blurring in the shading process, with some examples.

That made sense with the UI of old releases.
A texture in the old Blender Internal renderer is a datablock that is not used only for shading, but everywhere.
So, when you create a texture for one thing, you can reuse it wherever it makes sense without setting it up again.

In Blender 3, they are still used by paint and sculpt brushes, by modifiers, by particles, and by Freestyle strokes.

For EEVEE and Cycles, texture nodes were added directly to the shading nodes.
But EEVEE and Cycles texture nodes could not be used to influence modifiers before the arrival of Geometry Nodes.

So, since 2.8, the plan has been to remove Blender Internal textures and make Cycles/EEVEE textures available to all modifiers and to brushes.

There is currently a Texture Nodes editor for the old procedural textures; it is a kind of procedural texture generator.
So, logically, the idea is to update it with useful nodes, following the same pattern as in EEVEE and Cycles, to create complex textures, cache them, and use them anywhere (shading, painting, sculpting, modifiers, compositing, …).


Hey, I have worked in this area for many years, but I am quite new to Blender and was very surprised that it doesn't have a blur node that can be applied to a texture/bitmap, and also that Blender lacks an actual global material editor :).

To answer your question:
Blur is extremely important when it comes to height map textures and displacement. It helps with sharp transitions, to avoid projection stretching (a lack of diffuse/albedo information after the displacement has been applied to the geometry). It can also help with "jagged pixel transitions" when there are not enough pixels to displace denser geometry.

So blurs are extremely important and useful for heightmap-based displacement when the heightmap has very sharp transitions, like 0-1, etc.
This often happens when we deal with scan-based data, but also with full dynamic range procedurals.
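A minimal 1D sketch of why pre-blurring the height data helps; the values and radius here are made up purely for illustration:

```python
def smooth_heights(heights, radius, passes=1):
    # Pre-blur a height map so sharp 0-1 steps become ramps before
    # displacement. A single 1D row is enough to show the idea.
    for _ in range(passes):
        out = []
        for i in range(len(heights)):
            lo, hi = max(0, i - radius), min(len(heights), i + radius + 1)
            out.append(sum(heights[lo:hi]) / (hi - lo))
        heights = out
    return heights

# A scan-like hard transition: displacing this directly gives a vertical
# wall with badly stretched texels on the cliff face.
row = [0.0] * 8 + [1.0] * 8

# After a couple of blur passes the step becomes a ramp, so the displaced
# geometry (and its projected texture) transitions gradually.
print(smooth_heights(row, radius=2, passes=2))
```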

Of course such maps can be blurred in Photoshop etc., but it would be cool to have this in Blender.


Blurring is not cheap… and in, for example, some special application for procedural material making, it is used to produce a texture, not for direct use in the render.

Funnily enough, the blur node of one very well known app for this actually does nothing more than copying the texture multiple times at different locations and transparencies, because in a shader you do not know anything about your neighboring pixels…
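That "shifted copies" trick looks roughly like this, sketched in Python with a hypothetical `texture(x, y)` lookup and equal weights; real implementations use more taps and a proper weighting curve:

```python
import math

def texture(x, y):
    # Hypothetical stand-in for any texture lookup (image or procedural).
    return 0.5 + 0.5 * math.sin(x * 12.9898 + y * 78.233)

def multi_tap_blur(x, y, radius, taps=8):
    # Fake a blur as a fixed number of shifted, semi-transparent copies:
    # the texture is looked up at several offsets and blended together.
    total = texture(x, y)
    for i in range(taps):
        angle = 2.0 * math.pi * i / taps
        total += texture(x + radius * math.cos(angle),
                         y + radius * math.sin(angle))
    return total / (taps + 1)
```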

Also, there are two different approaches in shaders (I haven't found the correct terms yet):

  • In some (for example GLSL), a pattern node produces a pattern, and then you can transform and compose that pattern (for example, to make a blur).
  • In Cycles, and also in OSL, a pattern node produces the pattern at the given location, so you can't change positions afterwards…

If someone knows what these two principles are called… :star_struck: …please tell.

Also, even if GLSL (the OpenGL Shading Language) is capable of this, and you need OpenGL 4… for the newest Blender anyway, EEVEE (AFAIK) can't do this…


It’s a kernel.

I meant the different shader principles, not any image-processing convolution matrix… :stuck_out_tongue_winking_eye:

Hi Grzegorz!

What you are asking for is more complicated than it might seem. When doing a blur on a 2D image, you need to sample the values of neighboring pixels to find the average value; basically every blur is based on this principle.
This cannot be done during rendering, as there is no way to communicate between already rendered pixels. The only way is to prepare the data beforehand. You can already do that:

  1. The first, obvious one is to blur the texture in an external editor.
  2. The second is to use a very dense mesh to store the color from a texture on individual vertices, blur it between them, and then finally displace or shade your mesh with the blurred attribute (see the sketch after this list). You can do it with Geometry Nodes, and this works not only for image textures but for 3D textures too (Voronoi, Noise, etc.).
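A rough sketch of the blur step in that second approach, written in plain Python rather than as an actual Geometry Nodes setup (the data layout is hypothetical; a real setup would do the same averaging over connected vertices inside the node tree):

```python
def blur_vertex_attribute(values, neighbors, iterations=3):
    # Relax a per-vertex attribute across the mesh: each iteration replaces
    # a vertex value with the average of itself and its connected neighbors.
    # `values` is indexed by vertex; `neighbors[i]` lists adjacent vertices.
    for _ in range(iterations):
        new_values = []
        for i, v in enumerate(values):
            total = v + sum(values[j] for j in neighbors[i])
            new_values.append(total / (1 + len(neighbors[i])))
        values = new_values
    return values

# Tiny example: a strip of five vertices with a hard 0/1 transition,
# as if a texture had already been sampled onto them.
values = [0.0, 0.0, 1.0, 1.0, 1.0]
neighbors = [[1], [0, 2], [1, 3], [2, 4], [3]]
print(blur_vertex_attribute(values, neighbors))
```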





Unless there are add-ons for this, you can forget about it. As long as the data ownership is exposed in the UI, this won't happen. The best solution is to make your own material assets and use the Asset Browser. Editing assets is cumbersome, but it works.
