Two different algorithms to generate seamless textures
Blender Internal renderer automatic material generation: normals and specular
Cropping to powers of two
Various image filters (blurs, sharpening, greyscale)
original text:
I coded this little thing. It takes in any image and, according to the parameters, tries to patch the edges so that the seams become less visible. It’s still a work in progress. The algorithm is not perfect, but you will get the idea I’m going for.
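For anyone curious, the general edge-patching idea can be sketched with a classic trick (an illustration only, not necessarily the exact algorithm used here): offset the image by half its size in both axes, so the old borders meet along a center cross where they can then be patched or blended.

```python
# Half-offset trick (illustrative sketch, not the addon's actual code):
# wrap a 2D image by half its width and height so the former edges
# end up in the middle, where seams are easier to repair.
def half_offset(pixels):
    """Wrap a 2D image (list of rows) by half its size in both axes."""
    h, w = len(pixels), len(pixels[0])
    dy, dx = h // 2, w // 2
    return [[pixels[(y + dy) % h][(x + dx) % w] for x in range(w)]
            for y in range(h)]

img = [[1, 2, 3, 4],
       [5, 6, 7, 8],
       [9, 10, 11, 12],
       [13, 14, 15, 16]]
shifted = half_offset(img)
# The shifted image tiles exactly like the original, but the original
# border pixels now sit along the center, ready for patching.
```

Applying the offset twice returns the original image, which is a handy sanity check.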
Included in this post should be a demo .blend file that contains the script and test images.
When you run the script, it will create a tab in the UV/Image editor tool shelf. Set the parameters and press the button to generate the image.
Anyone who is willing to alpha test it, please do so.
Please do comment if you think the idea is worthwhile. I’m also considering adding normal map creation from bitmap images in addition to the seamless generator.
It’s my first addon so there may be some things that are completely incorrect or just dumb. Any suggestions for improvements would be very welcome!
For some reason I get an autorun disabled message when I try to run this script (even with autorun enabled in the user preferences). Not sure what’s going on, but it doesn’t give me a tab with options. I think the idea is brilliant, though. Something like this script gets us closer to one day just being able to drag and drop images from Google straight into Blender and, like alchemy, create worlds from them. I’m very much a fan of this idea. Keep working at it.
Normal maps now work. I’ve also added various filters. I still have problems figuring out how to update unsaved images for materials and viewports without actually saving the images. Many quality-of-life improvements are in my crosshairs, as long as I can figure out how to do them in the Blender Python API.
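A minimal sketch of how normal map creation from a greyscale heightmap can work, using central differences over the height values (the function and parameter names are illustrative assumptions, not the addon's actual code):

```python
import math

# Illustrative sketch: derive per-pixel normals from a 2D heightmap
# via central differences. Not the addon's actual implementation.
def height_to_normal(height, strength=1.0):
    """Return per-pixel (nx, ny, nz) normals from a 2D heightmap."""
    h, w = len(height), len(height[0])
    normals = []
    for y in range(h):
        row = []
        for x in range(w):
            # Central differences with clamped (edge-repeated) sampling.
            dx = (height[y][min(x + 1, w - 1)] - height[y][max(x - 1, 0)]) * strength
            dy = (height[min(y + 1, h - 1)][x] - height[max(y - 1, 0)][x]) * strength
            # The normal of the surface z = height(x, y) is
            # (-dz/dx, -dz/dy, 1), normalized to unit length.
            length = math.sqrt(dx * dx + dy * dy + 1.0)
            row.append((-dx / length, -dy / length, 1.0 / length))
        normals.append(row)
    return normals
```

A flat heightmap yields straight-up normals, and raising `strength` exaggerates the slopes; for an actual normal map texture each component would then be remapped from [-1, 1] into [0, 1] for storage in RGB.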
There’s an invert option in the Image menu below. You should be able to flip x by inverting red and flip y by inverting green. As long as those work, I don’t want to overlap with Blender’s own tools.
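Why inverting a channel flips an axis: a normal component n in [-1, 1] is stored in a channel as c = n * 0.5 + 0.5, so inverting the channel (c → 1 - c) encodes exactly the negated component. A tiny sketch (`invert_channel` is a hypothetical helper for illustration, not a Blender API call):

```python
# Hypothetical helper: invert one channel of an RGB triple in [0, 1].
def invert_channel(rgb, channel):
    out = list(rgb)
    out[channel] = 1.0 - out[channel]
    return tuple(out)

# Channel value c encodes normal component n = (c - 0.5) * 2, so
# inverting red (channel 0) flips the normal's x component:
# red = 1.0 encodes +X; after inversion red = 0.0 encodes -X.
flipped_x = invert_channel((1.0, 0.5, 0.5), 0)
```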
Amazing work! It would be really cool if the functionality of the plugin could be available as an image node in the Node Editor. A Normal Map node and a Seamless Texture node would be incredibly useful.
Yes, I agree 100%. The trick is finding out how to do that. Also, Blender doesn’t have image node groups, only texture, material, and compositing ones, of which only compositing has useful nodes. Here’s what I’m able to do with just those nodes:
Unfortunately it seems the compositing nodes can be only exploited for compositing, not for general image or material creation.
You do know that solving massive systems of linear equations and/or optimizing them is no easy task ;). But I’m looking into it, is all I can say. It’s definitely the way forward, even though the compositing nodes already have inpainting functionality, which is slightly aggravating because I can’t find a way to use it.
Looks like bpy.ops.image.invert(invert_r=False, invert_g=False, invert_b=False, invert_a=False) works for updating the images: IMAGE_OT_invert is the only operator in image_ops.c that sets the IB_DISPLAY_BUFFER_INVALID flag, which seems to be the key to getting images to update.
It’s nice to see that someone has taken to the idea of having seamless texture and normalmap generation inside of Blender itself.
The only critique is that the algorithm seems to have a ways to go yet. What is needed is sophisticated feature and element detection, so you would avoid cases such as those stones getting merged together (which gives an image containing stones with funny shapes).
It’s true, the algorithm is pretty crude. You should be seeing the discrete Poisson equation solved on the gradient domain within a week, if all goes as planned. We’ll see how that goes.
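As a rough illustration of the gradient-domain idea (a 1D toy, not the planned implementation): reconstruct values whose differences match target gradients by iterating the discrete Poisson equation, keeping the boundary values fixed.

```python
# 1D gradient-domain reconstruction sketch: find f such that
# f[i+1] - f[i] follows the target gradients g[i], with fixed endpoints.
# The discrete Poisson equation f[i+1] - 2*f[i] + f[i-1] = g[i] - g[i-1]
# is solved here by simple Jacobi iteration (slow but easy to follow).
def solve_poisson_1d(gradients, left, right, iters=500):
    n = len(gradients) + 1  # number of samples
    # Start from a linear ramp between the fixed boundary values.
    f = [left + (right - left) * i / (n - 1) for i in range(n)]
    for _ in range(iters):
        new = f[:]
        for i in range(1, n - 1):
            # Jacobi update, rearranged from the Poisson equation above.
            new[i] = (f[i - 1] + f[i + 1] + gradients[i - 1] - gradients[i]) / 2.0
        f = new
    return f
```

When the target gradients are consistent with the boundary values, the result reproduces them exactly; when they conflict, the solve spreads the error smoothly, which is exactly why this approach hides seams well.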