32 MB Blender file (incl. textures) leads to out of memory on the GPU

I’ve been following Andrew’s tutorial on making a nice Earth, as found here:

At some point I noticed my rendered viewport would no longer update; it said my GPU was out of memory. Blender was using over 3 GB, whereas Andrew reaches about 200 MB at this point (40 minutes in). Restarting Blender fixed the problem, so I figure I had a memory leak or something.

Now it still comes up with memory errors whenever I render with F12 (at 1920×1080 px, where Blender jumps to 3.8 GB) or have a viewport set to “Rendered” and change any node settings. I’ve got 6,440 faces and a few textures which, when appended, bring the total file size to 32 MB. I use one of them as an HDR background. (Completely off-topic: any tips on how to use an image as an HDR background yet NOT have it emit any light? :wink: )

I’d appreciate it if someone could help me solve this. I don’t think a 3.5-4 GB GPU (GeForce GTX 970) should choke on a 32 MB file, or am I missing something?

.Blend for those interested:

Just offhand: check and update your video drivers. I did tech support for several years, and this often solved problems that were being blamed on the software being used (which merely highlighted the problem) rather than on the video driver. NVIDIA is really good about ongoing updates; occasionally a version will be troublesome, but I roll back and then lurch forward when they fix it with yet another driver update.

Still trying to update the drivers. They weren’t that old, but there was a new one. For some reason the update removes the old driver and then fails to install the new one. I know how to fix it, but it takes some time ^^

In the meantime I realized/found out that 32 MB of textures will of course take up much more than 32 MB, because the textures are loaded uncompressed; a nice explanation can be found here if others are interested:

However, even accounting for that, I still end up at only 1.3 GB of RAM.
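For anyone curious about the math: uncompressed in-memory size is just width × height × channels × bytes per channel, which is why a small file on disk balloons in RAM. A quick sketch (the resolutions below are hypothetical examples, not the actual textures in the .blend):

```python
# Rough estimate of RAM used by uncompressed textures.
# Resolutions are made-up examples, not the real .blend's textures.
def texture_bytes(width, height, channels=4, bytes_per_channel=1):
    """In-memory size: every pixel is stored raw, no compression."""
    return width * height * channels * bytes_per_channel

# An 8k RGBA texture: a few MB as a JPG on disk, 128 MiB in RAM.
earth_8k = texture_bytes(8192, 4096)
print(f"8k RGBA texture: {earth_8k / 2**20:.0f} MiB")

# A handful of large maps quickly dwarfs the 32 MB file size.
maps = [(8192, 4096), (8192, 4096), (4096, 2048), (4096, 2048)]
total = sum(texture_bytes(w, h) for w, h in maps)
print(f"Four maps total: {total / 2**20:.0f} MiB uncompressed")
```

Cycles also keeps its own copies of the scene (BVH, geometry, render buffers) on top of the textures, so the GPU figure can be higher still.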


GPU-Z says it uses 4,300 MB when I render it in the viewport… 4 GB is not enough to render the file.

Don’t know why it uses so much memory… but it’s not the file size.


…However, if you put in some effort to get some elbow room, this renders on a 2 GB GPU. Barely, but it renders.
First, why do you load an 8 MB background image when all you want to render is the Earth? The file shows you render a transparent background, and the Milky Way is easy to do in post using the compositor (it could be mapped onto a plane or cylinder; it doesn’t look like a spherical mapping fits there).
Second, some of the mask images are set to greyscale, which in theory should decrease the amount of data Cycles loads; the clouds image is not.
In theory, because one of the FOSS applications is telling lies and I’m not quite sure which one: in Blender’s UV editor, the N panel shows the image properties (size and type) under the loaded image’s path. The type always reads “RGB byte”, while GIMP exports (and loads back) Greyscale or RGBA. Maybe this is due to us being in the 21st century; black-and-white is a long-forgotten format and media space doesn’t matter either, I don’t know.
Maybe I just don’t quite understand the intricacies, but RGBA translates to more data in a file than RGB or Greyscale+Alpha does.
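The per-pixel byte counts make the point concrete. A sketch with a hypothetical 4096×2048 mask at 8 bits per channel (assuming Blender really does expand everything to RGBA, as the “RGB byte” label suggests):

```python
# Per-pixel storage cost for one hypothetical 4096x2048 mask,
# 8 bits per channel, depending on how the image is stored.
w, h = 4096, 2048
greyscale = w * h * 1   # 1 channel: the mask itself
grey_a    = w * h * 2   # greyscale + alpha
rgb       = w * h * 3
rgba      = w * h * 4   # what "RGB byte"/RGBA expansion implies

for name, size in [("Greyscale", greyscale), ("Grey+A", grey_a),
                   ("RGB", rgb), ("RGBA", rgba)]:
    print(f"{name}: {size // 2**20} MiB")
# If every mask is expanded to RGBA on load, a greyscale image
# costs 4x the memory it actually needs.
```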
Since I suspect Blender might waste some bytes here and there, I combined the 3 b/w mask images into one using GIMP, and since the clouds did not like computing transparency from a b/w PNG, I added an alpha channel.
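The packing trick above can be sketched in plain Python. The masks here are tiny made-up 2×2 examples (in practice GIMP does the packing and Blender’s Separate RGBA node splits the channels back out):

```python
# Sketch of the channel-packing trick: three separate b/w masks
# plus clouds-in-alpha become one RGBA image, so the renderer
# loads four channels once instead of four full images.
# Hypothetical 2x2 masks, one byte per pixel:
spec_mask  = [255, 0, 0, 255]   # e.g. ocean specular mask -> R
night_mask = [0, 255, 0, 0]     # e.g. city-lights mask    -> G
bump_mask  = [128, 128, 0, 64]  # e.g. bump/height mask    -> B
clouds     = [0, 64, 200, 255]  # clouds                   -> A

# Interleave the four single-channel masks into one RGBA buffer.
rgba = []
for r, g, b, a in zip(spec_mask, night_mask, bump_mask, clouds):
    rgba.extend([r, g, b, a])

# Memory before vs after, assuming each source image would have
# been loaded expanded to RGBA anyway.
pixels = len(spec_mask)
before = 4 * pixels * 4   # four images, 4 bytes per pixel each
after  = len(rgba)        # one image, 4 bytes per pixel
print(before, after)      # packing cuts memory to a quarter
```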

Thank you both. Interesting, Eppo: with your file I get 2 GB according to GPU-Z as well, although Blender reports only a few hundred MB in the viewport.
The Milky Way: good point there. I’ve been fiddling a bit with nodes and the compositor; it’s been a long time since I used them, so I guess I wasn’t very efficient. Will keep it in mind.
As for what you did with the files, it took me some time to figure out but that’s actually quite smart, nice! :slight_smile:

All in all, my problems have been reduced. The new driver did something, I think, and better understanding of what’s going on also helps. Thanks! :slight_smile: (Although it still bothers me that I get 1.3 GB when assuming RGB (or was it RGBA?) for all images. I wouldn’t expect a factor of more than 3 on top of that.)