Aunt Beatrice's bathroom, reimagined

Hi. I started this a while back, but it was tough to finish for a number of reasons. At last, though, it's done!

I modelled everything except the plants and the accessories on the table. I wouldn’t have been able to render this without Google Colab.
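In case anyone wants to try the Colab route: a Cycles render can be launched headlessly from a Colab cell with something roughly like the sketch below. This is only an illustration, not my exact notebook; the paths, the unpacked Blender build, and the CUDA flag are placeholders for whatever your own setup looks like.

```python
# Rough sketch of a headless Cycles render from a Colab cell.
# Assumes a Linux build of Blender has already been unpacked under
# /content/blender and the .blend file uploaded to /content/scene.blend.
import subprocess

subprocess.run([
    "/content/blender/blender",        # path to the unpacked Blender binary (placeholder)
    "-b", "/content/scene.blend",      # -b = run in background, no UI
    "-E", "CYCLES",                    # force the Cycles render engine
    "-o", "/content/render_####",      # output path pattern (#### = frame number)
    "-f", "1",                         # render frame 1 as a still
    "--", "--cycles-device", "CUDA",   # let Cycles use the Colab GPU
], check=True)
```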

This project is finished, but your critique will be appreciated and considered in my next project. Ciao :slight_smile:

11 Likes

I think this is amazing work! The materials and colors are perfect!
All the smudges and imperfections, and especially the glass material on the picture frame, create a very nice effect.
The only thing I would change is the towel and its material: maybe add a particle system so it looks like a fluffier, softer, more pleasant material. But as it is, it still works well and creates depth in the picture!
Great work! I am looking forward to seeing more!

1 Like

Howdy. Thank you so much, bro. I did try haha, but my normals were acting weird. Next time, though :slight_smile:

What sample count or other light path settings made you use Google Colab? My renders tend to come out blurry, but yours look nicely crisp.

For small props with relatively low relevance, Stable Diffusion comes in handy.
[Image: towel]

2 Likes

Sup. Woah, my mind's blown! Thanks for the tip. Know of any good tutorials? Or can you tell me exactly how you did it? Haha

I kept the default light path settings; the sample count was wild: 2500 samples with a noise threshold of 0.0001 (crazy, I know). I also exported as a 16-bit PNG with 100% compression. Hope that helps.
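If it's easier to read as a script, those settings look roughly like this in Blender's Python console. This is just a sketch of the values above using the standard Cycles and render output properties, not my actual setup file:

```python
import bpy

scene = bpy.context.scene

# Cycles sampling: 2500 samples with a very low adaptive noise threshold
scene.cycles.use_adaptive_sampling = True
scene.cycles.adaptive_threshold = 0.0001
scene.cycles.samples = 2500

# Output: 16-bit PNG with maximum (100%) compression
scene.render.image_settings.file_format = 'PNG'
scene.render.image_settings.color_depth = '16'
scene.render.image_settings.compression = 100
```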

I don't remember all the resources, but let's start with these:

  1. Search Google for "How to install Stable Diffusion on Windows (AUTOMATIC1111)".
    At this point you will find out whether your PC has enough power to run anything at all.
    We are talking about a maximum working resolution of 512x512 to 768x768, as used for this towel.

  2. Go to civitai.com and search for the most realistic "model" (model type "checkpoint"). You can filter results by "checkpoint" and sort by most downloaded. It doesn't matter that the preview images look like porn or soft porn. Avoid architectural models; they aren't as good as you would expect.

  3. The most important part: search YouTube for how to install "ControlNet v1.1.196".

  4. In Stable Diffusion, choose your checkpoint (upper left corner), go to the img2img tab, then the Inpaint tab, load your render, and paint a mask with the brush (e.g. mask out the towel). Choose the sampling method DPM++ 2M Karras and set "Inpaint masked" with "Only masked" ("only masked" lets that masked part use the maximum resolution you set earlier). Set the CFG Scale anywhere from 2 to 7. Denoising strength is the most important setting: 0.76 gives the AI a free hand, which we don't want when we already have our favourite towel set up. Start at 0.55 and increase towards 0.8 with each generation until it fits your needs. (A scripted version of this step is sketched after the list.)

  5. In ControlNet, load your render again and set it to "Depth". That should constrain the AI so it doesn't wander off too freely in its search.

  6. Write a prompt like: RAW photo, fluffy black and white towel with geometric pattern, (high detailed:1.2), 8k uhd, dslr, high quality, film grain, Fujifilm XT3
    Negative prompt: (normal quality), (low quality), (worst quality), paintings, sketch

The first days with Stable Diffusion are hard… but as you have seen, the towel was ready after the 9th try.
These 6 points are more like "search for me on YouTube" pointers, because it's a very broad topic.
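If you ever prefer to script point 4 instead of clicking through the UI, the AUTOMATIC1111 webui also exposes an HTTP API when you launch it with the --api flag. The sketch below is only a rough example under that assumption: the towel_render.png / towel_mask.png names are made up, the values mirror the settings above, and I'd still set up the ControlNet depth constraint from point 5 in the UI, since its API fields change between extension versions.

```python
import base64
import requests

def b64(path):
    """Read an image file and return it as a base64 string."""
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode()

payload = {
    "init_images": [b64("towel_render.png")],   # your Blender render (placeholder name)
    "mask": b64("towel_mask.png"),              # white = area to inpaint, i.e. the towel
    "prompt": ("RAW photo, fluffy black and white towel with geometric pattern, "
               "(high detailed:1.2), 8k uhd, dslr, high quality, film grain, Fujifilm XT3"),
    "negative_prompt": "(normal quality), (low quality), (worst quality), paintings, sketch",
    "sampler_name": "DPM++ 2M Karras",
    "steps": 30,
    "cfg_scale": 4,                 # anywhere from 2 to 7
    "denoising_strength": 0.55,     # start here, creep up towards 0.8
    "inpainting_fill": 1,           # 1 = fill the masked area with the original content
    "inpaint_full_res": True,       # "only masked": the masked part gets the full resolution
    "inpaint_full_res_padding": 32,
    "width": 512,
    "height": 512,
}

# The webui must be running locally with --api for this endpoint to exist.
r = requests.post("http://127.0.0.1:7860/sdapi/v1/img2img", json=payload)
r.raise_for_status()

# The API returns base64-encoded result images.
with open("towel_inpainted.png", "wb") as f:
    f.write(base64.b64decode(r.json()["images"][0]))
```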

1 Like

I featured you on BlenderNation, have a great weekend!

1 Like

Thanks a ton Bart! Have an awesome weekend :slight_smile:

1 Like

Roger that! Thanks man :wink: