Just got a new computer so I’m testing just how far I can push Blender. One of my tests is of planet earth, using the highest resolution maps I can find. Here’s a breakdown of the texture sizes I’m using:
Base Earthmap: 21k
Bumpmap: 16k
Water mask: 16k
Earth lights: 30k
Clouds: 8k (yeah, that was the largest I could find)
I can load them all as textures individually, so I know there’s no corruption in the files, but when I try to load more than one, Blender dies instantly. The same thing happens with both 2.49b and the 2.5 beta.
I’ve used Blender for a very long time, but I’ve never been in a situation where I could actually test its limitations before, so this is a little new to me. I’d love to hear if you guys can actually load textures of this size, or if it’s something that’s happening on this machine.
Oh, by the way, since I’ve got this new computer, I’m falling rapidly in love with Blender 2.5. It’s an absolute joy to use.
Try scaling the textures in a photo-editing package so their dimensions are powers of 2.
For rendering it usually wouldn’t be a problem, but since you are storing them in GPU RAM and processing the data with the GPU, you have to follow its rules/limitations.
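As a sketch of that first step (plain Python, outside Blender; the 21600-pixel width is just an example), here’s one way to find the power-of-two size to scale a dimension down to:

```python
# Find the largest power-of-two size that fits inside a given dimension,
# as a first step before resizing a texture in an image editor.

def nearest_pow2(n):
    """Largest power of two that is <= n."""
    p = 1
    while p * 2 <= n:
        p *= 2
    return p

# e.g. a 21600-pixel-wide earthmap would come down to 16384 wide:
print(nearest_pow2(21600))  # -> 16384
```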
EDIT: Just to point out, I can’t see a situation arising in a production, say a scene or a short film, where ultra-high-resolution textures/materials would need to be enabled in the viewport; a low-resolution pre-vis texture would suffice. (Shame Blender doesn’t do this with large texture maps.)
5 gigs of RAM, and 1 gig of RAM on my graphics card. I’m fairly certain it isn’t a hardware limitation, as I can edit and manipulate those textures easily in Photoshop without my RAM taking too much of a hit.
It’s probably a bit of both. I know Max can’t handle that many textures in the viewport; not sure about Maya, but it’s probably the same. That’s why people use proxy textures: to ease the workload and allow better frame rates in the viewport. Unless you’re editing the textures in the viewport, you probably won’t need to see several ultra-high-resolution images at once.
Mari is probably the most cutting-edge application (that we know of) when it comes to handling large image textures. It can handle a huge amount, but it’s all streamed from the HDD and RAM, and it makes use of tiling and downsamples the images when zooming out. If you have a 32k image, it won’t actually show you the full 32k detail until you’ve zoomed in far enough, at which point it only caches the part of the texture that is in view. As far as I know, Blender doesn’t do this, so what you are doing is loading all of these ultra-high-resolution texture maps, and for every frame Blender needs to update the viewport and manage these massive texture files. It’s a lot of work, even for high-end cards!
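The downsample-on-zoom idea can be sketched roughly: pick a mip level so the texture is shown at about its on-screen resolution. This is a hypothetical helper for illustration, not how Mari or Blender actually implement it:

```python
import math

def mip_level(full_res, displayed_res):
    """Number of halvings that bring full_res down to roughly displayed_res."""
    if displayed_res >= full_res:
        return 0  # shown at full size or larger: use the original
    return int(math.log2(full_res // displayed_res))

# A 32k-wide texture displayed 2048 pixels wide only needs mip level 4,
# i.e. a 2048-wide copy (32768 / 2**4 == 2048):
print(mip_level(32768, 2048))  # -> 4
```

At that level the app only needs the 2048-wide copy in memory, which is why streaming viewers stay responsive with huge source images.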
Yup, all of that is understandable, but Blender crashes instantly upon attempting to load them, so I don’t think it’s even getting the chance to test my hardware.
I could understand Blender crashing if I attempted to display a 21k texture in the viewport, but I’m not doing that – I just want to test the size of textures I can render.
Has nobody else tried using a texture that size? Some of you guys must have machines a lot more powerful than mine.
– Edit –
I tried resizing the 21k earthmap to 16,000 x 8,000 pixels, and Blender still won’t open it.
I see, I thought it was crashing when you viewed them in the viewport.
I’ll download the textures and try them out.
EDIT: When I try to add a 32k by 32k image in the UV editor it does crash straight out.
A 16k by 16k texture works, but it’s unworkable for painting.
My memory use also jumped from 1 GB to 2.92 GB after adding the 16k image, so you can imagine how much RAM your textures are taking up; it’s probably running out of available RAM.
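That jump is in line with a back-of-the-envelope estimate: an uncompressed image costs width × height × bytes-per-pixel in RAM (and Blender may keep float buffers internally, which costs even more). A quick sanity check in Python:

```python
# Uncompressed RAM cost of an image at 4 bytes per pixel (8-bit RGBA).

def texture_ram_gib(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / (1024 ** 3)

print(texture_ram_gib(16384, 16384))  # 16k x 16k RGBA -> 1.0 GiB
```

So a single 16k-square map is a gigabyte before any viewport mipmaps or float conversion, and a handful of them will exhaust a 32-bit address space easily.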
Try reporting a bug and see what the developers say; there might be a bug, or there may just be software/hardware limitations.
Please mention the actual pixel dimensions instead… (you know… width × height)
Also, one thing to keep in mind: those textures are probably compressed on disk, but they won’t be compressed anymore once you load them into main memory (not VRAM; that’s not required for rendering with BI).
And what kind of system are you trying this on? Windows? 32 bit? 64 bit?
It seems there is a maximum texture size in Blender: 16,384 pixels per dimension. This could explain the instant crash.
Try scaling the images down to that maximum resolution and you should be fine; if you use Photoshop, make sure to pick a good resampling method.
EDIT: Reporting a bug now. It seems the ‘New Image’ pop-up menu, which you get from clicking the new-image icon at the bottom, lets you enter 32000 by 32000 pixels but then crashes out, though only for the bottom input; the top one accepts it fine and shows a 32k by 16k image or whatever.
I would have thought it was blatantly obvious myself… especially seeing as I actually provided a link to the textures.
It seems there is a maximum texture size in Blender: 16,384 pixels per dimension. This could explain the instant crash.
Try scaling the images down to that maximum resolution and you should be fine; if you use Photoshop, make sure to pick a good resampling method.
Yeah, I did…the current map I’m using is 16,000 x 8,000 px, and it’s still crashing out Blender if I have another map of a comparable size open.
I did a little test: I opened up a new file and loaded the original 21k image. The RAM consumption went up from 700 MB to 1.5 GB, so that’s 800 MB used for Blender to open the largest file I have.
That is interesting…I think I need to experiment a little more.
Blender has a bug where it can’t open images with a file size bigger than 2 GB (we ran into this on Durian). In our case we had more urgent bugs to deal with, but you can check whether this is the problem by saving a JPEG version of the image and making sure it’s under 2 GB in file size.
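If you want to check a file against that limit before trying it in Blender, something like this works (the path is a placeholder for your texture file):

```python
import os

TWO_GIB = 2 * 1024 ** 3  # 2 GiB in bytes

def under_2gib(path):
    """True if the file at `path` is below the 2 GiB size that triggers the bug."""
    return os.path.getsize(path) < TWO_GIB
```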
No problem here with three maps, 1 of them being 21k, the other two 16k.
Blender doesn’t crash when I load them, nor when I open them in the UV/Image Editor.
I suspect it could be an issue with your video card driver…