Neural Style Transfer Node?

Could we have something like Deep Dream in a node?

The input texture is processed by a neural net, and a style from a second image is applied to it?


[result image]
was made from
[source image]


It could take a while to cook, but it appears to be an instantaneous method to create kick-ass greebles.
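For anyone wondering what's actually going on under the hood: the usual approach (Gatys et al.) runs both images through a pretrained network and matches feature statistics. A rough sketch in PyTorch, assuming (1, 3, H, W) image tensors; the layer indices are the commonly used picks for torchvision's vgg19, everything else is illustrative:

```python
import torch
import torch.nn.functional as F
from torchvision.models import vgg19

# Pretrained VGG-19 as a fixed feature extractor (weights download on first use).
vgg = vgg19(pretrained=True).features.eval()

# Commonly used layer indices in torchvision's vgg19.features:
# conv4_2 for content, conv1_1..conv5_1 for style.
CONTENT_LAYERS = [21]
STYLE_LAYERS = [0, 5, 10, 19, 28]

def gram_matrix(features):
    # features: (1, C, H, W) activations from one layer; the Gram matrix
    # captures which channels fire together, i.e. the "style" statistics.
    _, c, h, w = features.shape
    f = features.view(c, h * w)
    return f @ f.t() / (c * h * w)

def extract(img, layers):
    feats, x = {}, img
    for i, layer in enumerate(vgg):
        x = layer(x)
        if i in layers:
            feats[i] = x
    return feats

def total_loss(result, content_img, style_img, style_weight=1e6):
    # Keep the result close to the content image in deep features,
    # and close to the style image in Gram-matrix statistics.
    r = extract(result, set(CONTENT_LAYERS + STYLE_LAYERS))
    c = extract(content_img, CONTENT_LAYERS)
    s = extract(style_img, STYLE_LAYERS)
    content = sum(F.mse_loss(r[i], c[i]) for i in CONTENT_LAYERS)
    style = sum(F.mse_loss(gram_matrix(r[i]), gram_matrix(s[i]))
                for i in STYLE_LAYERS)
    return content + style_weight * style
```

The Gram matrices are why it can hallucinate greebles: it copies texture statistics, not pixels.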

What is the second image supposed to be anyway?

If it’s supposed to be a heightmap, then the algorithm obviously didn’t do a very good job. It would also be an iffy result if the intent was stylization because of all the grey junk floating around.

The second image was the source image, which I generated using Cycles.
I took the Cycles bake result and passed it through a style transfer with a sheet of greebles as the style image.

This would be great if someone could code it! Wish my Python skills were better than beginner level, lol!

The method can also be used to upscale low-res textures really well using multiple reference images!
See here, this guy did it with Minecraft textures and got great results: https://nucl.ai/blog/enhance-pixel-art/
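For reference, the core trick in that post is matching patches of the input against patches of high-res reference images. A loose, non-neural sketch of the idea in plain NumPy (the real method matches patches in a network's feature space; the names, grayscale-only handling, and patch size here are all just for illustration):

```python
import numpy as np

def make_dictionary(hi_ref, scale=2, p=4):
    # Box-downsample the high-res reference (dims assumed divisible by scale),
    # then collect (low-res patch, high-res patch) pairs.
    lo_ref = hi_ref.reshape(hi_ref.shape[0] // scale, scale,
                            hi_ref.shape[1] // scale, scale).mean(axis=(1, 3))
    lo_patches, hi_patches = [], []
    for y in range(lo_ref.shape[0] - p + 1):
        for x in range(lo_ref.shape[1] - p + 1):
            lo_patches.append(lo_ref[y:y+p, x:x+p].ravel())
            hi_patches.append(hi_ref[y*scale:(y+p)*scale, x*scale:(x+p)*scale])
    return np.array(lo_patches), hi_patches

def upscale(lo_img, lo_patches, hi_patches, scale=2, p=4):
    # For each tile of the low-res input, paste the high-res counterpart of
    # its nearest-neighbour dictionary patch (borders ignored for brevity).
    out = np.zeros((lo_img.shape[0] * scale, lo_img.shape[1] * scale))
    for y in range(0, lo_img.shape[0] - p + 1, p):
        for x in range(0, lo_img.shape[1] - p + 1, p):
            q = lo_img[y:y+p, x:x+p].ravel()
            best = np.argmin(((lo_patches - q) ** 2).sum(axis=1))
            out[y*scale:(y+p)*scale, x*scale:(x+p)*scale] = hi_patches[best]
    return out
```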

So what software are you using here to generate the texture (if you are using external software)? Do you have a link?

https://dreamscopeapp.com/

but this is very ‘black box’

I think the devs should look into neural stuff for Blender sooner or later. I guess many tools in the coming years will use this technology. But there's a long way to go. First you pick the best library (TensorFlow?), then you start doing stuff step by step. Style transfer is quite advanced already, although there are papers on how to write this fairly easily…
I thought about some kind of animation development, but I don't really have time for that…
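For what it's worth, those papers mostly boil down to a small optimization loop: treat the output image's pixels as the variables and minimize a combined content + style loss. Continuing the earlier sketch (PyTorch rather than TensorFlow purely for brevity; assumes the total_loss() sketch above):

```python
# content_img and style_img are (1, 3, H, W) tensors; we optimize a copy
# of the content image directly.
result = content_img.clone().requires_grad_(True)
opt = torch.optim.LBFGS([result])

def closure():
    opt.zero_grad()
    loss = total_loss(result, content_img, style_img)
    loss.backward()
    return loss

for _ in range(50):   # more iterations = better match, longer cook time
    opt.step(closure)
```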

Google Deep Dream and Google Deep Style?

Anyone able to get a node running this?

It looks like the whole thing is driven by Python, so…

Also, it only appears to run on NVIDIA?


It is really fun for post-processing, like Freestyle, and could be fun as a node inside Cycles.

(that would need to bake its image any time something changed upstream in the node graph)
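To make that concrete, here's a hedged sketch of the "re-bake when something upstream changes" part using the current Blender Python API; the cache dict and dirty flag are invented for the example, and a real add-on would want to debounce this:

```python
import bpy
from bpy.app.handlers import persistent

# Invented for this example: a one-entry cache for the baked result.
NEURAL_CACHE = {"dirty": True, "image": None}

@persistent
def flag_neural_rebake(scene, depsgraph):
    # Handler signature as in recent Blender versions.
    # Any edit to a material or shader node tree invalidates the bake.
    for update in depsgraph.updates:
        if isinstance(update.id, (bpy.types.Material, bpy.types.ShaderNodeTree)):
            NEURAL_CACHE["dirty"] = True
            break

bpy.app.handlers.depsgraph_update_post.append(flag_neural_rebake)
```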

edit:
boom
https://nucl.ai/blog/enhance-pixel-art/

https://nucl.ai/files/2016/04/Doom_light.gif

https://nucl.ai/blog/neural-doodles/

here is the git

lol, I posted a link to the nucl.ai blog in comment #4 :slight_smile:

There's some really good stuff up there, man.

giggity


OK, it is confirmed to be useful.

Nice work :slight_smile:

The green one looks like a circuit board. The others would be useful as generic metal panel textures. Have you tried using the software to combine normal maps or height maps yet? That might produce some interesting results.

The green one's source was an image called "motherboard" :slight_smile:

The other ones were made using metal plate textures and a greebles atlas.


And here it is working on a 3D model:

edit:


Nice :slight_smile:

For some reason they kind of remind me of the cubes from the game "Portal".

> The process should complete in 3 to 8 minutes, even with twice the iteration count.

As a node… no, this will never make it into Cycles… material calculations need to take a fraction of a second to compute, rather than minutes… imagine having to wait 3-8 minutes longer per sample than before.

As a comp node… probably not either… it would lock up Blender and people would complain that it doesn't work.

It's best to keep this sort of thing as a separate program.
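Agreed, though the separate program could still be driven from Blender. A sketch that shells out to a hypothetical external script and loads the result back in (style_transfer.py and its flags are made up for illustration):

```python
import subprocess
import bpy

def neural_bake(content_path, style_path, out_path):
    # style_transfer.py and its arguments are hypothetical stand-ins for
    # whichever external tool you actually use.
    subprocess.run(
        ["python", "style_transfer.py",
         "--content", content_path,
         "--style", style_path,
         "--out", out_path],
        check=True,  # raise if the external process fails
    )
    # Pull the finished texture back into Blender.
    return bpy.data.images.load(out_path)
```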

I suppose there could be a neural bake mode, with a progress bar like you get when baking textures?
Then maybe a node that can hold the baked neural-transfer images? Or the images could be saved and used as normal…

At least then the user will know they have to wait for the process to complete.
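Blender's Python API does have the pieces for exactly that: a modal operator running the slow job on a thread, plus the window manager's progress calls. A sketch, with run_style_transfer() as a stand-in for the real work:

```python
import threading
import time

import bpy

def run_style_transfer(state):
    # Stand-in for the real neural step; just simulates slow progress.
    for i in range(100):
        time.sleep(0.05)
        state["progress"] = i + 1

class TEXTURE_OT_neural_bake(bpy.types.Operator):
    """Run a (slow) style transfer without freezing the UI"""
    bl_idname = "texture.neural_bake"
    bl_label = "Neural Bake"

    def execute(self, context):
        wm = context.window_manager
        self._state = {"progress": 0}
        self._thread = threading.Thread(target=run_style_transfer,
                                        args=(self._state,))
        self._thread.start()
        self._timer = wm.event_timer_add(0.2, window=context.window)
        wm.progress_begin(0, 100)   # cursor-based progress indicator
        wm.modal_handler_add(self)
        return {'RUNNING_MODAL'}

    def modal(self, context, event):
        wm = context.window_manager
        if event.type == 'TIMER':
            wm.progress_update(self._state["progress"])
            if not self._thread.is_alive():
                wm.event_timer_remove(self._timer)
                wm.progress_end()
                return {'FINISHED'}
        return {'PASS_THROUGH'}

bpy.utils.register_class(TEXTURE_OT_neural_bake)
```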

I’m guessing it would be a hard job to make an add-on that functioned in that way though…

But then again, it's easy enough to make them in separate programs and import them into Blender.