Seamless texture patching and filtering addon

edit2: The project is now hosted on GitHub:

edit3: Current notable features:

  • Two different algorithms to generate seamless textures
  • Automatic material generation for the Blender Internal renderer: normals and specular
  • Cropping to powers of two
  • Various image filters (blurs, sharpening, greyscale)
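For context, "cropping to powers of two" just means finding the largest power-of-two box that fits inside the image. A minimal sketch of the size calculation (hypothetical helper names, not the addon's actual code):

```python
def prev_pow2(n):
    """Largest power of two <= n (assumes n >= 1)."""
    p = 1
    while p * 2 <= n:
        p *= 2
    return p

def crop_pow2(width, height):
    """Return the largest power-of-two dimensions that fit inside the image."""
    return prev_pow2(width), prev_pow2(height)

# e.g. a 1300x700 image would be cropped to 1024x512
```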

original text:

I coded this little thing. It takes in any image and, according to the parameters, tries to patch the edges so that the seams become less visible. It’s still a work in progress. The algorithm isn’t perfect, but you’ll get the idea I’m going for.

Included in this post should be the .blend file demo that includes the script and test images.

When you run the script, it will create a tab in the UV/Image editor tool shelf. Set the parameters and press the button to generate the image.

Anyone who is willing to alpha test it, please do so.

Please do comment if you think the idea is worthwhile. I’m also considering adding normal map creation from bitmap images in addition to the seamless generator.

It’s my first addon so there may be some things that are completely incorrect or just dumb. Any suggestions for improvements would be very welcome!


Experimental smoothing filter:

Interesting try. There are still artifacts, and it crashes when loading and processing a random PNG.

nice start :slight_smile:

That’s cool. Could be useful for previs and such.

Thank you for the comments! The crashing problem is on my todo list. It’s good to hear that there could be some use for this type of addon in Blender.

Here’s the full addon, with GIMP style seamless texture generation added and the previous patching algorithm improved.
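For anyone wondering what "GIMP style" means here: the classic trick is to offset the image by half its size and blend the two copies so that opposite edges match when tiled. A toy numpy sketch of that idea (an assumed illustration, not the addon's actual code):

```python
import numpy as np

def make_seamless(img):
    """Blend an image with a half-offset copy of itself.

    The blend weight favours the original near the centre and the
    offset copy near the borders, so opposite edges line up when tiled.
    Works on (H, W) greyscale or (H, W, C) colour float arrays.
    """
    h, w = img.shape[:2]
    shifted = np.roll(np.roll(img, h // 2, axis=0), w // 2, axis=1)
    y = np.abs(np.linspace(-1.0, 1.0, h))[:, None]
    x = np.abs(np.linspace(-1.0, 1.0, w))[None, :]
    dist = np.maximum(y, x)              # 0 at the centre, 1 at the borders
    mask = dist if img.ndim == 2 else dist[..., None]
    return img * (1.0 - mask) + shifted * mask
```

At the borders the mask is 1, so the output there comes entirely from the offset copy, whose left and right (and top and bottom) edges are adjacent columns/rows of the original image.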

For some reason I get an autorun disabled message when I try to run this script (even with autorun enabled in the user preferences). Not sure what’s going on, but the tab with options never appears. I think the idea is brilliant though. Something like this script gets us closer to one day just being able to drag and drop images from Google straight into Blender and, like alchemy, create worlds from them. I’m very much a fan of this idea. Keep working at it.

Here’s a short demo of how the addon works. The music stops halfway through, don’t mind it. :wink:

that looks like a nice start. keep up the good work.

Moved from “Python Support” to “Released Scripts and Themes” by OP request

Normal maps now work. Also added various filters. I still have problems figuring out how to update unsaved images for materials and viewports without actually saving the images. Many quality of life improvements are in my crosshairs, as long as I can figure out how to do them in the Blender Python API.
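For the curious, one common way to build a normal map from a bitmap is to treat the image as a heightmap and normalize its gradients. A rough numpy sketch of that technique (not necessarily how this addon does it):

```python
import numpy as np

def height_to_normal(height, strength=1.0):
    """Convert a heightmap (H, W float array) to an RGB normal map in [0, 1].

    Gradients use wrapped central differences, so a tileable heightmap
    yields a tileable normal map; 'strength' scales the apparent bumpiness.
    """
    dx = (np.roll(height, -1, axis=1) - np.roll(height, 1, axis=1)) * 0.5
    dy = (np.roll(height, -1, axis=0) - np.roll(height, 1, axis=0)) * 0.5
    nx, ny, nz = -dx * strength, -dy * strength, np.ones_like(height)
    length = np.sqrt(nx * nx + ny * ny + nz * nz)
    normal = np.stack([nx, ny, nz], axis=-1) / length[..., None]
    return normal * 0.5 + 0.5   # map [-1, 1] components into [0, 1] for an image
```

A flat heightmap comes out as the uniform colour (0.5, 0.5, 1.0), the familiar light-blue of an "empty" tangent-space normal map.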

5 Stars from me!

An invert button for the normal map would be uberfine!

There’s an invert option in the Image menu below. You should be able to change x by inverting red and change y by inverting green. As long as those work, I don’t want to overlap with Blender’s own tools.
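To illustrate the point being made: flipping a normal map’s X or Y axis really is just inverting the red or green channel. A toy numpy sketch (hypothetical helper name):

```python
import numpy as np

def flip_normal_axes(normal, flip_x=False, flip_y=False):
    """Flip a normal map's X/Y axes by inverting the red/green channels.

    'normal' is an (H, W, 3) float array with components encoded in [0, 1].
    """
    out = normal.copy()
    if flip_x:
        out[..., 0] = 1.0 - out[..., 0]   # red channel = X
    if flip_y:
        out[..., 1] = 1.0 - out[..., 1]   # green channel = Y
    return out
```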

Amazing work! It would be really cool if the functionality of the plugin could be available as an image node in the Node Editor. A Normal Map Node and a Seamless Texture Node would be incredibly useful.

Here is some interesting reading on seamless texture generation algorithms for you :wink:

Regarding the fact that you have to reload the image: yes, there is no solution as far as I know :wink:

Yes, I agree 100%. The trick is finding out how to do that. Also, Blender doesn’t have image node groups, only texture, material and compositing node trees, of which only the compositing one has useful nodes. Here’s what I’m able to do with just those nodes:

Unfortunately, it seems the compositing nodes can only be used for compositing, not for general image or material creation.

You do know that solving massive systems of linear equations and/or optimizing them is no easy task ;). But I’m looking into it, is all I can say. It’s definitely the way forward, even though the compositing nodes already have inpainting functionality, which is slightly aggravating because I can’t find a way to use it.

Looks like bpy.ops.image.invert(invert_r=False, invert_g=False, invert_b=False, invert_a=False) works for updating the images, because IMAGE_OT_invert is the only operator in image_ops.c that sets the IB_DISPLAY_BUFFER_INVALID flag, which seems to be the key to image updates.

It’s nice to see that someone has taken to the idea of having seamless texture and normalmap generation inside of Blender itself.

The only critique is that the algorithm seems to have a ways to go yet; what’s needed is sophisticated feature and element detection, so you would avoid cases such as those stones getting merged together (which gives an image containing stones with funny shapes).

It’s true, the algorithm is pretty crude. You should be seeing the discrete Poisson equation solved on the gradient domain within a week, if all goes as planned. We’ll see how that goes.
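For anyone curious, the gradient-domain approach mentioned above boils down to solving the discrete Poisson equation, laplacian(f) = div(g), for new pixel values f given a target gradient field g. A toy Jacobi iteration sketch (illustrative only, not the planned implementation; real solvers use much faster methods):

```python
import numpy as np

def solve_poisson_jacobi(div_g, boundary, iters=500):
    """Solve the discrete Poisson equation laplacian(f) = div_g.

    'boundary' is a full-size array; its edge pixels act as fixed
    Dirichlet boundary values and its interior as the initial guess.
    """
    f = boundary.copy()
    for _ in range(iters):
        # Jacobi update: average of the four neighbours minus the source term.
        avg = (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
               np.roll(f, 1, 1) + np.roll(f, -1, 1)) / 4.0
        new = avg - div_g / 4.0
        # Re-pin the boundary pixels every iteration.
        new[0, :], new[-1, :] = boundary[0, :], boundary[-1, :]
        new[:, 0], new[:, -1] = boundary[:, 0], boundary[:, -1]
        f = new
    return f
```

With div_g = 0 this reduces to the Laplace equation, which smoothly interpolates the boundary values into the interior; that is essentially how gradient-domain seam hiding spreads an edge mismatch invisibly across the whole tile.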