CUDA error halfway through rendering while denoising

2.79 / 980ti x 2 / OSX 10.11.6

This is very weird. It is using about 1800 MB or so out of the 6 GB available on either of the two cards.

To be honest, the CUDA error always shows up at some point. No one has been able to explain this ongoing Blender problem. Are they ever going to fix this issue?

Any ideas?

CUDA error: Out of memory in cuMemAlloc(&device_pointer, size), line 568

Read “Notes and issues”:
https://docs.blender.org/manual/en/dev/render/cycles/settings/scene/render_layers/denoising.html

Have you tried reducing the tile size?

Is that measured with an external application like GPU-Z (I do not know which one exists on OSX)? Blender does not correctly report CUDA memory usage.
And to correctly calculate the available vRAM, if you have the monitor/display connected to one of those GTX 980 Ti cards, you have to subtract the vRAM used by the system.

Thanks for replying, YAFU.

I’ve always struggled to find a proper way to watch vRAM usage on OSX. I know the memory gets stuck or something, because the error happens a lot more when I have big projects open in PS; as soon as I close PS I can render again. But this time the render is not finishing and I can’t tell why. I will keep looking for a good tool to get a proper GPU usage report.

On another note, they should add an extra denoising option to delay denoising until the end of the rendering process. The extra memory usage right when denoising kicks in could be what is causing the error, I believe.

I just had a look at iStat Menus and it reports everything just fine, but you can’t tell what the problem is from it. I doubt anyone can tell what is going wrong inside Blender by checking a tool like this.

I am truly disappointed with Blender’s GPU memory management, to be honest. Does this problem happen with other GPU-based render engines, even inside Blender? Let’s say Octane…

A clarification just in case: you should measure vRAM usage with the external monitoring tool while Cycles is rendering, not before.

I think you did not mention which tile size you are using, or whether you have tried reducing it.

I’ve had this happen on a 4 GB card with scenes that are pushing it. I’ve learned not to trust what Blender tells me about memory (vRAM in particular) and instead optimise my render or break it down for compositing.

Just because Blender tells you that it’s only using half of your vRAM doesn’t mean that it is. I’d bet money that you are running out, unless the same happens on very lightweight scenes (e.g. a cube and a lamp). If it happens on the basic scene, then, depending on your OS, look at drivers or (Linux is all I know about) your CUDA installation.

Right now I only get this error while rendering with denoising active. I always use a tile size of 450x450 because it’s the fastest for me. I tried 256x256 just now and it went fine. It seems to throw the same error line I posted in my first comment if I use bigger tiles.

Does that mean the denoising feature is not optimized for the GPU? For GPU rendering, you normally want tiles bigger than 256 to get the best performance.

Not sure how I could measure vRAM the way you say. Right now I am only able to see how much vRAM is used while rendering with iStat Menus, and it sits at about 30-35%. So I doubt it is really a vRAM issue. Nothing else is running in the background.

The thing is, I only get this error on this project while denoising, and it is not that big: 2 million faces, a couple of 4K PNG textures, and a 4K HDRI.
I’ve rendered heavier projects with almost 4 GB of vRAM usage. vRAM usage is a mystery, to be honest. How come with 6 GB of vRAM you can’t go over 4?

Denoising needs some additional vRAM on top of what is used for rendering the tiles.
Check this with GPU-Z, or whatever the macOS tool is called (I forgot).
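
As far as I understand, the denoiser works on each tile together with its neighboring tiles plus some extra feature passes, so the overhead grows with the tile size. A rough back-of-the-envelope sketch in Python (the 14 float channels per pixel is just an assumed figure to illustrate the scaling, not the exact number Cycles uses, and it ignores the render buffers the card is already holding):

    # Rough size of the extra buffer the denoiser needs for one tile:
    # a 3x3 neighborhood of tiles, with an assumed 14 float channels per pixel.
    def denoise_buffer_mb(tile_size, channels=14, bytes_per_float=4):
        side = 3 * tile_size  # the tile plus its 8 neighbors
        return side * side * channels * bytes_per_float / (1024 * 1024)

    print(denoise_buffer_mb(256))  # ~31 MB
    print(denoise_buffer_mb(450))  # ~97 MB, roughly 3x more than with 256x256 tiles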

Try reducing the texture size by clamping it in “Simplify” to a limit that does not change the render result, e.g. 2048 or 1024.
As long as we don’t have the global memory feature implemented, there is not much you can do other than try not to exceed your vRAM.
You may want to check the swerner patch (https://developer.blender.org/D2056).
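
For reference, a minimal sketch of the same Simplify clamp from Blender’s Python console (property names as they appear in 2.79; treat them as an assumption if your build differs):

    import bpy

    scene = bpy.context.scene
    scene.render.use_simplify = True
    # Clamp texture resolution for final renders, e.g. 2048 as suggested above.
    scene.cycles.texture_limit_render = '2048'
    # Optionally clamp viewport textures even further.
    scene.cycles.texture_limit = '1024'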

Jens

Thanks, man. The tool I have reports vRAM usage of about 30-35% while rendering, as I said. I don’t think denoising can jump from there to 100% unless there is a bug or something. It wouldn’t make sense.

CUDA error: Out of memory in cuMemAlloc(&device_pointer, size), line 568

That line should give us a clue about what’s going on, but I don’t know where to look for it.

What do you mean by “reducing with clamping and ‘Simplify’”?

I still think denoising should have nothing to do with the textures used in the project. Denoising is meant to be applied to the output image only, which in this case is pretty small, less than 1K. Just speculating.

Wow, the forum was not updating and I ended up posting the same reply four times. Any way to remove those extra posts?

1: Denoising may have only a small impact on its own, but on a GPU the vRAM is not one homogeneous memory space; there are dedicated “portions” for different types (texture space / constant memory / local memory / etc.), and adding everything up can push one of them over the limit.

  1. Look at the Scene/Simplify panel to limit your memory usage with the texture limit and culling, as a test.

  2. Keep the tile size low; in the current master, 16x16 is optimal (see the sketch after this list for setting both from the Python console).
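
A minimal sketch for trying both suggestions from the Python console (property names assumed from 2.79):

    import bpy

    scene = bpy.context.scene

    # Small tiles for denoising tests (16x16 as suggested above).
    scene.render.tile_x = 16
    scene.render.tile_y = 16

    # Simplify-based camera culling: skip objects outside the camera frustum.
    scene.render.use_simplify = True
    scene.cycles.use_camera_cull = True
    scene.cycles.camera_cull_margin = 0.1
    # Culling also has to be enabled per object.
    for obj in scene.objects:
        if obj.type == 'MESH':
            obj.cycles.use_camera_cull = True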

Jens

P.S.: yep, sometimes the forum behaves oddly atm…

I’ve never used the Simplify options; I will check out how they work. Thanks!

The thing is that I don’t see any spike in vRAM usage in my monitoring tool. That’s why I was thinking it might be a bug rather than a project problem.

Testing with 256 again, it reported another error. I guess I will have to ditch the denoising feature for now. Pretty sad.

I discovered this can happen when a file is corrupted as well. I built a really huge neighborhood scene with lots of linked files. I thought it was because I had made a high-poly tree for the scene that my card couldn’t handle, so I started separating everything, the same with the grass, etc. But it seemed the computer could handle less and less. It would render a tree fine by itself, and the grass by itself, but the regular scene that had never had problems before would throw this CUDA error.

When I discovered that it would not even render a hedge from my scene by itself, I realized the problem was with a linked file. I had the original in a completely different folder; I rendered it and it was very fast. So I saved that file over the corrupted one and made sure the texture was also re-assigned from the new location. Problem solved, no more real issues. As a matter of fact, the scene now renders pretty fast with everything at 2K.

If you still have a problem rendering something, make sure you clean up the computer (registries etc.), and sometimes you might have to close other programs if there are still problems. This has worked for me so far.

Thanks for taking the time to report your experience. It hasn’t happened to me again for a long time now. What I get sometimes instead is that the render area gets cut in half, so only half of the image is visible and the other half is transparent. It seems to be another annoying memory bug.