CUDA error "Out of mem in cuArray" ?

Hi out there,
I am trying to render a scene with Cycles GPU compute, so I switched to CUDA in the User Preferences. But although the scene is very, very small and I am only using an HDRI to light it, I constantly get the following error: “Time:00:00.0 / Mem:23.52, Peak:23.52 / cancel / CUDA error: out of memory in cuArrayCreate(&handle, &desc), line 783”

Does this make any sense to you?

I am new to Blender, so sorry for the silly questions. :o

And when I use the GPU and hit render, how can I switch from bucket to progressive rendering?

Thanks in advance for your help.
Hilmar

Is your card by chance overclocked?.. I once forgot that I had overclocked my card and Blender was throwing CUDA memory errors on every render…

Unfortunately you’re providing very little relevant information, so no, it cannot make much sense to us. What Blender version are you using? What OS? What are the system specs (especially: what GPU, and how much VRAM)?

Your statement implies that using an HDRI is easy on memory and therefore can’t possibly cause your issue. Quite the opposite is true: HDRIs tend to come in gargantuan resolutions, and combined with 32-bit color depth that makes for a perfect memory hog.

BTW, if you want to use progressive instead of bucket rendering, enable “Progressive Refine” in the Render tab (Performance subsection). Mind you, though, that progressive refine will need even more memory.
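
In case you prefer doing it from the Python console, here is a minimal sketch, assuming Blender 2.79’s Cycles property names (hover the checkbox to verify them in your build):

```python
# Minimal sketch: enable Cycles' "Progressive Refine" from Blender's Python console.
# Assumes the 2.79 property name 'use_progressive_refine'; check the checkbox tooltip if it differs.
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'               # make sure Cycles is the active render engine
scene.cycles.use_progressive_refine = True   # refine the whole frame instead of rendering tile by tile
```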

Hi DflippinK and IkariShinji,
thanks for your answers.
No, my card isn’t overclocked or anything like that.

My PC is fairly new, so it shouldn’t be a hardware problem (i7-7700K CPU @ 4.2 GHz, 32 GB RAM and an Nvidia GeForce GTX 1060).
I am working with Blender 2.79 on Windows 10.

I guess it is more likely caused by a wrong workflow. Coming from V-Ray, I might have chosen a wrong setting. In Max/V-Ray it is so easy to accidentally tick one checkbox and ruin all your render settings. I guess I did the same in Cycles. I will go through all the settings first and hopefully get rid of this error.

Thanks for your help.
Hilmar

One thing I noticed… HDRI images can be huuuuge! It works with a 50 MB file, but not with a 250 MB HDRI file. How big are the HDRI images you use?

Does your GTX 1060 have 3 GB or 6 GB of VRAM?
Could you share the problematic scene here:
http://pasteall.org/blend/

Edit:
If it is only 3 GB, well, that’s not much VRAM considering that Windows 10 itself can take a lot of it. You can use an external application like GPU-Z to measure VRAM usage by the system and by Cycles right at the moment it starts rendering.
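
If you’d rather watch it from a script than from GPU-Z’s window, a rough sketch that polls nvidia-smi (which ships with the Nvidia driver) once a second could look like this:

```python
# Rough sketch: sample total VRAM usage once per second while Cycles starts rendering.
# Requires the Nvidia driver's nvidia-smi tool to be on the PATH.
import subprocess
import time

for _ in range(30):  # watch for 30 seconds
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total", "--format=csv,noheader"],
        capture_output=True, text=True,
    )
    print(result.stdout.strip())  # e.g. "1234 MiB, 3072 MiB"
    time.sleep(1)
```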

Edited again:
You have an Intel iGPU. To save VRAM on the Nvidia card, you could set up the iGPU as the primary display while you render with Nvidia CUDA: configure the iGPU as the primary display in the BIOS, connect the display to the motherboard and then install the Intel graphics driver.
Or you could use a lightweight Linux distribution.

Does that GTX 1060 have 3 GB or 6 GB of VRAM?
If it is 3 GB, well, there you have it. That would be quite scarce by today’s standards.

When it comes to textures, the resolution is more important than the size on the disk, as Cycles needs access to uncompressed texture space. Let’s do the math for a 16,000 x 8,000 pixel HDR texture’s memory usage (feel free to correct me if the math is wrong):

32-bit HDR image
16,000 x 8,000 = 128,000,000 pixels
32 bits per channel x 4 channels = 128 bits = 16 bytes
16 bytes/pixel x 128,000,000 pixels = 2,048,000,000 bytes = 2,048 MB

So, a single 16K x 8K HDR texture would need more than 2 GB of VRAM…
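
If anyone wants to re-check that estimate on their end, here it is as a quick Python calculation (any overhead Cycles adds on top is not included):

```python
# Re-check of the estimate above: uncompressed 32-bit float RGBA HDR texture.
width, height = 16000, 8000
channels = 4            # RGBA
bytes_per_channel = 4   # 32 bits

total_bytes = width * height * channels * bytes_per_channel
print(total_bytes)                  # 2,048,000,000 bytes
print(total_bytes / 1e6, "MB")      # 2048.0 MB (decimal megabytes)
print(total_bytes / 2**20, "MiB")   # ~1953 MiB
```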

I’m having similar problems when rendering.
I have a laptop GeForce 1060 with 6 GB,
16 GB RAM,
can’t check the CPU at the moment, but it’s an Intel something.
Blender 2.9

The peak memory of the render should be around 2 GB, but after a few frames of the animation it can spike to over 5 GB, so it quits…

Does the denoiser sometimes cause a RAM spike?
I’m doing some tests with a lower-resolution HDRI, but I never had that problem before.
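
In case it helps with those tests, here is a rough sketch for halving a loaded HDRI’s resolution from Blender’s Python console (the image name and output path are placeholders):

```python
# Rough sketch: halve a loaded HDRI's resolution to reduce its memory footprint.
# "environment.hdr" and the output path are placeholders; use your own names.
import bpy

img = bpy.data.images["environment.hdr"]
img.scale(img.size[0] // 2, img.size[1] // 2)   # resample the image in place

img.filepath_raw = "//environment_half.hdr"     # save the smaller copy next to the .blend
img.file_format = 'HDR'
img.save()
```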