Huge resolution - how to properly render?

I am rendering a file with a resolution of 59055 x 17717 px on my new iMac.

To make rendering a bit easier for Blender, I used the add-on ‘Animated Render Border’ to split the Camera View into 10 x 10 tiles using the Render Border. I would like to render each tile separately and stitch them back together afterwards in Photoshop.
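In bpy terms, this is roughly what the add-on automates for me (a minimal sketch; the grid size and output path are just examples, not my actual settings):

```python
# A rough sketch of tiled rendering via the Render Border: a 10 x 10 grid,
# each tile saved as its own image. Grid size and path are examples.
import bpy

scene = bpy.context.scene
tiles = 10

scene.render.use_border = True          # render only the border region
scene.render.use_crop_to_border = True  # save just the tile, not the full frame

for ty in range(tiles):
    for tx in range(tiles):
        scene.render.border_min_x = tx / tiles
        scene.render.border_max_x = (tx + 1) / tiles
        scene.render.border_min_y = ty / tiles
        scene.render.border_max_y = (ty + 1) / tiles
        scene.render.filepath = f"//tiles/tile_{ty:02d}_{tx:02d}"  # example path
        bpy.ops.render.render(write_still=True)
```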

I noticed that I can render to Multilayer OpenEXR (to have my Cryptomatte pass included as well), but this results in an OpenEXR with a file size of 450 MB, which Photoshop and Blender cannot open, probably because it’s too large. So instead I tried rendering a ‘single-layer’ OpenEXR, but that means I also need to save each Cryptomatte pass to its own file. So I added a File Output node in the Compositor and connected all three Cryptomatte passes to it.
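Expressed as a script, my setup looks more or less like this (a sketch only; socket names like ‘CryptoObject00’ depend on your Blender version and on which Cryptomatte passes are enabled in the View Layer):

```python
# A sketch of the single-layer setup: one File Output node saving each
# Cryptomatte layer from the Render Layers node to its own file.
# The socket names and output folder below are assumptions.
import bpy

scene = bpy.context.scene
scene.use_nodes = True
tree = scene.node_tree

rl = tree.nodes.new("CompositorNodeRLayers")
out = tree.nodes.new("CompositorNodeOutputFile")
out.base_path = "//crypto/"            # example output folder
out.format.file_format = 'OPEN_EXR'

for name in ("CryptoObject00", "CryptoMaterial00", "CryptoAsset00"):
    if name in rl.outputs:             # only link passes that actually exist
        out.file_slots.new(name)
        tree.links.new(rl.outputs[name], out.inputs[name])
```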

But as soon as I start rendering from the Terminal (to speed things up a bit), I see that once the render itself is done, Blender starts ‘Compositing’ for minutes while my memory usage spikes like crazy. It even crashes the Terminal session.

What do I need to do to get proper renders out of Blender for this huge file? And how can I stop Blender from spending minutes ‘Compositing’ each Cryptomatte pass?

What do you mean by ‘Photoshop cannot open OpenEXR’? (There are plugins to open OpenEXR in Photoshop, like https://www.exr-io.com)

Thanks, but I already use that plugin. An OpenEXR file of 250 MB is apparently a bit too large to open in Photoshop, After Effects and Blender (afterwards) :wink:

Ha, ok, sorry :slight_smile: But 250 MB doesn’t seem that “big”; that’s strange anyway.

Haha no problem, didn’t mean it like that :wink:

That’s exactly what I thought. It’s probably due to all the layers inside, since Photoshop doesn’t really support Cryptomatte in an OpenEXR file and therefore generates many, many unusable layers in protest. :wink:

I keep trying to think of a response, but I keep running into ‘why’ questions. I have done some very high-res renders on this scale before, but I don’t know what your goal with exporting the Cryptomatte passes is. Could your end compositing program even handle files of this resolution?
What compositing are you doing? Could you do your compositing in Blender?

It seems like adding Cryptomatte on top of rendering at super high res is just asking for trouble.

Any additional information you could supply would be helpful.


Good question! Since the file will have an enormous resolution, I just want to be sure I don’t have to render that monster again :wink: That’s why I’d like to render a Cryptomatte pass as well - to be able to color-correct things more easily afterwards (should that be needed). It’s more of a precaution, really.

However, I am still curious why Blender needs to ‘Composite’ all tiles afterwards when I render to single-layer OpenEXR, and why this entire step (which takes lots (!) of memory) is skipped as soon as I render to multi-layer OpenEXR. That just doesn’t make sense.

What do your compositing nodes look like for multi-layer and single-layer EXR?

And can you confirm that your final compositor can even handle those resolutions? It seems like a lot of work to ensure adjustability, and it would be a bummer to have it not work in the end.

Multi-layer:
Render Layers 1 > Composite 1

Single-layer:
Render Layers 1 > Cryptomatte 1 (Object) > File Output 1
Render Layers 1 > Cryptomatte 2 (Material) > File Output 1
Render Layers 1 > Cryptomatte 3 (Asset) > File Output 1

I assume Photoshop can handle these smaller tiles, or am I wrong here?
I could always lower the DPI to decrease the resolution?

Out of pure curiosity - what’s the use case for a gigapixel render?

It’s for a huge banner we’re going to put in a 10 x 5 m frame on the front of one of our offices. :wink:

Such a strange thing! I created an override material with just a white Emission node and removed the current HDRI entirely from the scene. Blender rendered at lightning speed, but when it came to Compositing, it choked again and I had to force-quit Blender. I just don’t get it.

That seems like excessive resolution for that application. Consider: movie theaters project at a similar size in meters, and there the resolution is at best 4K.

Unless you expect people to look at your banner up close (~30 cm), such a high resolution is just wasted effort, since the human eye is nowhere near able to resolve it.

I know that that’s not answering your question, and yes, Blender should be able to deal with pretty much any size images.
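If you want to sanity-check that, here is a rough back-of-the-envelope in Python, assuming the common ~1 arcminute figure for visual acuity (my assumption, not a measurement):

```python
# How much DPI a viewer can actually resolve at a given distance,
# assuming ~1 arcminute of visual acuity (a rule of thumb).
import math

def required_dpi(viewing_distance_m):
    detail_m = viewing_distance_m * math.tan(math.radians(1 / 60))
    return 0.0254 / detail_m

print(required_dpi(0.3))   # ~291 dpi with your nose 30 cm from the banner
print(required_dpi(3.0))   # ~29 dpi from 3 m away
```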


I print large-format banners and usually work at 300 dpi at 1/4 size (scaling to full size on the printer). You can probably halve your render resolution and still get a good-quality print; unless you want pin-sharp details with your nose pressed against it :]
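As a quick sanity check against the numbers earlier in this thread (10 m banner width, 59055 px render width), a sketch of the arithmetic:

```python
# The thread's numbers: a 10 m wide banner rendered at 59055 px.
full_width_in = 10 / 0.0254                  # ~393.7 inches
quarter_at_300dpi = full_width_in / 4 * 300  # ~29528 px at 1/4 size, 300 dpi
print(quarter_at_300dpi)                     # roughly half of 59055 px
```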

Yeah, I already decreased the DPI to 150 and even to 72, but I’m still curious why Blender behaves so weirdly :wink:

I rendered the file without any Cryptomatte passes, just the default Depth pass, which I connected to the Composite node, and the same thing occurs! There were no File Output nodes in the file. Blender keeps taking all my RAM for compositing (each time there’s only 5 MB left).
Only when I connect the Image pass to the Composite node (the default) does everything go smoothly.

Is it possible I need to correct my Blender settings?

Just disable compositing nodes (‘Use Nodes’). You shouldn’t be using compositing nodes for Cryptomatte. Cryptomatte data needs to be written directly to a multi-layer EXR. I know there are YouTube videos suggesting otherwise. They’re wrong.
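In script form, that advice boils down to something like this (a sketch, assuming Blender 2.93+ property names; adjust for your version):

```python
# A sketch: disable compositing, enable the Cryptomatte passes on the
# View Layer, and write straight to a multi-layer EXR.
import bpy

scene = bpy.context.scene
scene.use_nodes = False                 # 'Use Nodes' off: no compositing tree
scene.render.use_compositing = False    # also skip the Compositing post-process

view_layer = bpy.context.view_layer
view_layer.use_pass_cryptomatte_object = True
view_layer.use_pass_cryptomatte_material = True
view_layer.use_pass_cryptomatte_asset = True

img = scene.render.image_settings
img.file_format = 'OPEN_EXR_MULTILAYER'  # Cryptomatte goes into the multi-layer EXR
img.color_depth = '32'                   # full float, as Cryptomatte data expects
```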

But that’s actually the whole reason I started this thread: since the multi-layer OpenEXR files are about 450 MB, Photoshop is unable to open them.

I also rendered without a Cryptomatte pass, but the problem still remains: Blender starts using ALL of my 32 GB of RAM to composite a simple Depth pass to the Composite node, even when I use an Override Material and remove all but one object in my scene.

Hence my question whether there are some Compositor settings in Blender I can adjust to resolve this. :wink:

I’m inclined to think you’re running out of memory due to a combination of excessive resolution and high bit depth.

Those have nothing to do with the compositing process.

If all you need is a depth pass, disable comp nodes, then render to a regular OpenEXR with ‘Z Buffer’ enabled.
The compositing process splits the image into tiles so that it can make use of all of your threads. The more threads, the more RAM consumed. If the process doesn’t crash, then consuming all your memory is acceptable. However, I think lowering ‘Chunk Size’ will reduce per-tile RAM consumption.
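Something like this, as a sketch (the ‘chunk_size’ property is my reading of the UI’s ‘Chunk Size’ setting; verify against your Blender version’s API docs):

```python
# A sketch of both suggestions: a plain EXR with an embedded Z buffer and
# no compositing, plus a smaller compositor chunk size as the alternative.
import bpy

scene = bpy.context.scene
scene.use_nodes = False                   # skip the compositor entirely
bpy.context.view_layer.use_pass_z = True  # render the Depth/Z pass

img = scene.render.image_settings
img.file_format = 'OPEN_EXR'
img.use_zbuffer = True                    # the 'Z Buffer' option: embed Z in the EXR

# If you keep compositing on instead, smaller chunks should cut per-tile RAM:
if scene.node_tree is not None:
    scene.node_tree.chunk_size = '32'
```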