HDR painting experiments in Krita + compositing in Blender

So, since rewriting the base color selector, and with the palette branch now having a file format that can store non-8-bit colors, as well as Filmic Blender floating around, I’ve decided to have some fun with super-bright values.

This is the result of my second attempt (so it will be picked for the thread thumbnail :D)


This is the image in Krita. I was sorta going for a winter morning, but it turned more into a winter evening. I think if you live far enough north you may have experienced an evening like this in winter.

There are a lot of proportion errors here, but I think it works well for a practice piece.


I actually wanted to see if I could progressively blur the image, so I kept the buildings and people separate. However, it seems my computer isn’t powerful enough to process this image, so I ended up just using a glare node and some color balance instead.
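
For anyone who wants to try something similar, here is a rough bpy sketch of a glare + color balance compositor chain. The file path and the glare settings are placeholders, not the exact values from this image:

```python
import bpy

scene = bpy.context.scene
scene.use_nodes = True
tree = scene.node_tree
tree.nodes.clear()

# Load the painting (hypothetical path; use a high-bit-depth export such as EXR)
image_node = tree.nodes.new("CompositorNodeImage")
image_node.image = bpy.data.images.load("/path/to/krita_export.exr")

# Glare node: fog glow gives a soft bloom around values above the threshold
glare = tree.nodes.new("CompositorNodeGlare")
glare.glare_type = 'FOG_GLOW'
glare.threshold = 1.0   # only scene-referred values above 1.0 start to glow

balance = tree.nodes.new("CompositorNodeColorBalance")
composite = tree.nodes.new("CompositorNodeComposite")

tree.links.new(image_node.outputs["Image"], glare.inputs["Image"])
tree.links.new(glare.outputs["Image"], balance.inputs["Image"])
tree.links.new(balance.outputs["Image"], composite.inputs["Image"])
```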


Filmic Blender doesn’t make much difference here, as I am in full control of the values as opposed to a render.


This was my first test. I threw a regular blur filter over it as a filter layer at about 35 px, then added a gradient to the layer so that it gives this kind of defocus effect.
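
If you want to script the same kind of effect outside Krita, here is a minimal NumPy/Pillow sketch of the idea: blur a copy of the image, then blend it back in with a gradient mask (the file names and the gradient direction are just examples):

```python
import numpy as np
from PIL import Image, ImageFilter

img = Image.open("painting.png").convert("RGB")             # hypothetical file
blurred = img.filter(ImageFilter.GaussianBlur(radius=35))   # ~35 px, as above

w, h = img.size
# Vertical gradient: 1.0 (fully blurred) at the top row, 0.0 (sharp) at the bottom
mask = np.linspace(1.0, 0.0, h).reshape(h, 1, 1)

base = np.asarray(img, dtype=np.float32)
soft = np.asarray(blurred, dtype=np.float32)
out = base * (1.0 - mask) + soft * mask

Image.fromarray(out.astype(np.uint8)).save("painting_defocus.png")
```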

Left is sRGB OETF, right is Filmic Log + Base Contrast. The interesting thing here is that in the morning I like the right one more, and when I’ve been using my computer longer I like the left one more.
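
For reference, switching between the two views in Blender is just a scene setting. The exact view and look names depend on which OCIO config (and which Filmic version) you have installed, so treat these strings as examples:

```python
import bpy

vs = bpy.context.scene.view_settings

# Plain sRGB view (left image)
vs.view_transform = 'sRGB OETF'

# Filmic view (right image); view/look names vary between config versions
vs.view_transform = 'Filmic Log Encoding Base'
vs.look = 'Base Contrast'
```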

From exploring HDR stuff I’ve learned that we need proper displays/monitors to really see the HDR spectrum (Rec. 2020)… only then is it truly observed!
As it is now, the majority is limited to the realm of 24-bit RGB color, and many are unknowingly just faking it (illusory superiority) for the sake of the HDR illusion, complicating the simplicity for the sake of pleasure-making magic ;).
For now its use is limited and more indirect, exploited for other goals and purposes (lighting a VR scene, displacement maps, high-quality prints, and lately HDR displays, projectors…)

None of that is either true or false, it is mostly just deeply confused. I am also not quite sure why you decided to post that here?

Any computer-generated image will need to go through a transform to be put into something that displays correctly on screen. To create that transform, you need to profile and calibrate your display. If you haven’t, a default transform will be used that is indeed incorrect. That said, Rec. 2020 is cool, but it is not the definitive color space; it is just the color space the industry is attempting to get its HDR TVs to conform to.
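
To make "go through a transform" concrete: the plain sRGB view applies roughly the standard sRGB OETF per channel, and it simply clips everything above 1.0, which is exactly why very bright scene values need something like Filmic. A minimal sketch:

```python
def srgb_oetf(linear: float) -> float:
    """Encode a scene-linear value (clipped to 0..1) for an sRGB display."""
    l = min(max(linear, 0.0), 1.0)      # anything above 1.0 is thrown away
    if l <= 0.0031308:
        return 12.92 * l
    return 1.055 * (l ** (1.0 / 2.4)) - 0.055
```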

Furthermore, within the world of color maths, using a floating-point linear space in an environment set up for scene-referred imaging actually simplifies creating certain effects, like the nice bloom you see above from the glare node. Of course, you prepare an image for the web by transforming it to sRGB or something similar, but that has nothing to do with the editing process itself.
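
A tiny numeric illustration of why the scene-referred linear values matter for that bloom: blurring in linear light lets a very bright pixel keep spilling energy into its neighbours, while a copy clipped at 1.0 has already thrown that energy away (the kernel here is just a stand-in for a blur/glare filter):

```python
import numpy as np

row = np.zeros(11, dtype=np.float32)
row[5] = 16.0                          # one very bright scene-referred pixel

kernel = np.array([1, 2, 3, 2, 1], dtype=np.float32)
kernel /= kernel.sum()                 # simple stand-in for a blur/glare kernel

linear_blur = np.convolve(row, kernel, mode="same")
clipped_blur = np.convolve(np.clip(row, 0.0, 1.0), kernel, mode="same")

print(linear_blur.round(3))    # neighbours pick up a strong glow from the 16.0
print(clipped_blur.round(3))   # the clipped copy barely spreads any light
```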

But like I said, I am not sure why you posted this. Do you like it? Hate it? Do you want me to stop wasting my time on this? What is the purpose of your post?

Ah, no… sorry for the confusion. I like these works.
But as noted, you’re a few steps ahead of many others :wink:
What we see now (on our displays) is still far from what we wish to observe, recreate, and represent.
You’re on a good path.

Ah, alright then. Thanks. Hopefully I can find some nice do’s and don’ts while doing these so others can follow :slight_smile: