Generate PBR materials from photos | v0.1.0


The Material addon is a material creation and editing addon for Blender that lets users generate the PBR textures they want from simple flash photos. It offers three different approaches: two based on recent machine-learning studies and one based on a Compositor pipeline. The machine-learning approaches, NeuralMaterial and MaterialGAN, require a CUDA-supported GPU with the appropriate CUDA drivers installed (both v10 and v11 work).
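Since the two ML approaches need a working CUDA setup, a quick sanity check before running them can save some head-scratching. This is just my own illustration (the addon may check things differently), and it assumes PyTorch is the ML backend, which you should confirm against the repo's requirements:

```python
import importlib.util

def cuda_status():
    """Best-effort check of whether the ML approaches can run.
    Returns a short human-readable status string."""
    if importlib.util.find_spec("torch") is None:
        return "torch not installed"
    import torch  # imported lazily so the check works without PyTorch present
    if torch.cuda.is_available():
        return "cuda available: " + torch.cuda.get_device_name(0)
    return "torch installed, cuda unavailable"
```

If CUDA is unavailable, the Compositor-based approach is still usable.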

The addon is still in the early stages and this is my first time creating a Blender addon, so any feedback is much appreciated. A more detailed usage tutorial is being actively worked on.

The addon works on Blender 3.0, but should also work on versions later than 2.8. The addon also has an integrated updater based on

For installation follow the steps mentioned in the first release.


For any questions and troubleshooting please reply in this thread or DM.


Hey, cool stuff my dude! How does it compare to this addon?

Maybe you can take that algorithm too and put it in your addon.


This is excellent work. Thanks so much for sharing.

Any chance we could get a video showing how to install the Python environment, if that won’t be too much trouble?

@ulf3000 Never heard of DeepBump before. Looks really good.


Hey. Thank you for the positive feedback. As I see it, that machine-learning model generates just the normal texture from a photo, while the two approaches provided in my addon generate albedo, normal, roughness and specular maps. I’ll see if its model setup has any interesting advantages, but it shouldn’t be hard to include.

Thank you for the feedback. I will be releasing a video with an installation tutorial
and a separate video with a more detailed usage demo. Perhaps I will include a script to automate the installation as much as possible, but for now I have just updated the installation section in the README.

Could you show us some results from your addon using the same inputs as DeepBump? I’m also wondering how they compare…

Thanks. Yeah, I think automating the installation as much as possible is the way to go.

I am still having issues installing it. There is no python folder in the Blender Foundation folder. Do you need to have Python installed before you run this?

Hi, will post some as soon as I have the time.

No, you don’t need to, because you can use the Python bundled with Blender. It should be in the folder specified in the new installation guide, that is “<YOUR_PATH>\Blender Foundation\Blender 3.0\3.0\python\bin”. We can continue this in DMs if you still have issues.
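As a small illustration of why the bundled interpreter matters: any script run from inside Blender can reach pip through `sys.executable`, which points at that bundled Python. The helper below is my own sketch (not the addon's actual installer), and the package names passed to it would be hypothetical until checked against the repo's requirements:

```python
import subprocess
import sys

def build_pip_command(packages):
    """Build a pip invocation for the interpreter running this script.
    Inside Blender, sys.executable is the bundled Python mentioned above."""
    return [sys.executable, "-m", "pip", "install", *packages]

def install_packages(packages):
    """Run pip for the given (hypothetical) package names."""
    return subprocess.run(build_pip_command(packages), check=True)
```

Running this from Blender's scripting tab installs into the bundled environment, so no system-wide Python is needed.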

Interesting, and thank you. What might be the pros and cons of the 3 approaches?
NeuralMaterial, Algorithmic, MaterialGAN

For a very short overview, since this is part of my Bachelor thesis, which I am still working on:

  1. MaterialGAN
    (-) requires you to take photos of materials with an AprilTag-marked A4 paper frame with a cutout to perform perspective rectification,
    (-) produces 256x256 initial textures, which can then be upsampled with another model,
    (+) offers a “more interesting” editing option by exploring the latent space of the trained model starting from the initially generated material,
    (+) has, in my experience, produced more accurate results in fewer epochs.

  2. NeuralMaterial
    (+) can generate texture maps at any resolution, because the model draws from an infinite noise space,
    (+) the materials have a more solid noise structure and can be easily regenerated with different noise seeds,
    (-) the editing step works by interpolating in latent space between materials, which offers less versatility.

  3. Algorithmic
    (+) Much faster.
    (+) Generates more maps (AO, cavity, curvature).
    (-) Has a strong tendency to separate the albedo poorly from the other maps.
    (-) Has very limited editing options (to a degree because of the way I implemented it).

If you want to learn more about the science behind each approach, I would recommend checking out their project websites or repos: MaterialGAN, NeuralMaterial, Algorithmic.