  1. #1

    Cuda error half way while Denoising

    2.79 / 980ti x 2 / OSX 10.11.6

    This is very weird. It is using about 1800 MB or so out of the 6 GB available on either of the two cards.

    To be honest, the CUDA error always shows up at some point. No one has been able to explain this ongoing Blender problem. Are they ever going to fix this issue?

    Any ideas?

    CUDA error: Out of memory in cuMemAlloc(&device_pointer, size), line 568
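    For context, `cuMemAlloc` is the CUDA driver call that reserves a block of GPU memory, and it fails with this error when the requested block no longer fits in free VRAM. A toy sketch of the failure mode (pure Python with made-up numbers, not actual Cycles internals): the render itself fits, but one extra allocation partway through pushes the total past the budget.

    ```python
    # Toy model of a GPU memory budget: earlier allocations succeed, but a
    # later request (e.g. an extra denoiser buffer) exceeds free VRAM.
    # All sizes are illustrative, not measured from Cycles.

    FREE_VRAM_MB = 6 * 1024  # 6 GB card (illustrative budget)

    def cu_mem_alloc(allocated, size_mb, budget_mb=FREE_VRAM_MB):
        """Mimic cuMemAlloc: fail if the request exceeds what is left."""
        if sum(allocated) + size_mb > budget_mb:
            raise MemoryError("CUDA error: Out of memory in cuMemAlloc")
        allocated.append(size_mb)

    allocations = []
    cu_mem_alloc(allocations, 1800)      # scene data: fits
    cu_mem_alloc(allocations, 3000)      # render buffers: still fits
    try:
        cu_mem_alloc(allocations, 2000)  # extra buffer: 6800 > 6144 MB
    except MemoryError as e:
        print(e)                         # the point where the render aborts
    ```
    
    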



  2. #2
    Member YAFU's Avatar
    Join Date
    Mar 2013
    Posts
    2,870
    Read "Notes and issues":
    https://docs.blender.org/manual/en/d...denoising.html

    Have you tried reducing the tile size?

    Originally Posted by animani View Post
    It is using about 1800 MB or so out of the 6 GB available on either of the two cards
    Is that measured with an external application like GPU-Z (I do not know which one to use on OSX)? Blender does not correctly report CUDA memory usage.
    And to correctly calculate available VRAM, if you have the monitor/display connected to one of those GTX 980 Tis, you have to subtract the VRAM used by the system.
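    To make that subtraction concrete, here is a back-of-the-envelope calculation (Python; the display's share is a guessed figure for illustration — an external monitoring tool gives the real one):

    ```python
    # Estimate the VRAM actually usable by Cycles on the display GPU.
    # The system/display share below is a guess; measure it with a
    # monitoring tool to get the real figure.
    total_vram_mb = 6144       # GTX 980 Ti: 6 GB
    display_usage_mb = 700     # OS compositor + open apps (illustrative)
    usable_mb = total_vram_mb - display_usage_mb
    print(f"Usable for Cycles: {usable_mb} MB of {total_vram_mb} MB")
    ```
    
    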
    Last edited by YAFU; 12-Oct-17 at 06:24.
    Be patient, English is not my language.



  3. #3
    Thanks for replying YAFU

    I've always struggled to find a proper way to watch VRAM on OSX. I know the memory used gets stuck or something, as it happens a lot more if I have big projects open in PS; as soon as I close PS I can render again. But this time it is not finishing and I can't tell why. I will keep looking for a good tool to get a proper GPU usage report.

    On another note, they should add an extra denoising option to delay denoising until the end of the rendering process. The extra memory usage right when denoising starts could be what is causing the error, I believe.
    Last edited by animani; 12-Oct-17 at 07:15.



  4. #4
    I just had a look at iStat Menus and it reports everything just fine, but you can't tell what the problem is. I doubt anyone can tell what is going on inside Blender by checking a tool like this one.

    I am truly disappointed with Blender's GPU management, to be honest. Does this problem happen with any other GPU-based rendering engine, even inside Blender? Let's say Octane...



  5. #5
    Member YAFU's Avatar
    Join Date
    Mar 2013
    Posts
    2,870
    A clarification just in case: you should measure VRAM usage with an external tool while Cycles is rendering, not before.

    I think you did not mention which tile size you are using, and whether you tried reducing it.
    Be patient, English is not my language.



  6. #6
    I've had this happen on a 4 GB card with scenes that push it. I've learned not to trust what Blender tells me about memory (VRAM in particular) and instead optimise my render or break it down for compositing.

    Just because Blender tells you it's only using half of your VRAM doesn't mean that it is. I'd bet money that you are running out, unless the same happens on very lightweight scenes (e.g. a cube and a lamp). If it happens on the basic scene, then, depending on your OS, look at drivers or (Linux is all I know about) your CUDA installation.



  7. #7
    Originally Posted by YAFU View Post
    A clarification just in case: you should measure VRAM usage with an external tool while Cycles is rendering, not before.

    I think you did not mention which tile size you are using, and whether you tried reducing it.
    Right now I only get this error while rendering with denoising active. I always use a 450x450 tile because it's the fastest for me. I tried 256x256 just now and it went well. It seems to point to the same error line I posted in my first comment if I use bigger tiles.

    Does that mean the denoising feature is not optimized for GPU? On GPU, you want to use tiles bigger than 256 to get the best performance.

    Not sure how I could measure VRAM the way you describe. Right now I can only see how much VRAM is used while rendering with iStat Menus, and it's about 30-35%. So I doubt it is really a VRAM issue. Nothing else is running in the background.
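    The tile-size dependence would fit one explanation: the Cycles denoiser filters each tile using its neighbours, so its working set grows with tile area. A rough comparison (Python; the 3x3 neighbourhood and the per-pixel byte count are illustrative assumptions, not Cycles internals):

    ```python
    # Rough estimate of denoiser working-set size vs. tile size.
    # Assumes the denoiser holds a 3x3 neighbourhood of tiles and roughly
    # 100 bytes/pixel of feature passes -- both figures are illustrative.
    BYTES_PER_PIXEL = 100
    NEIGHBORHOOD = 9  # the tile plus its 8 neighbours

    def denoise_working_set_mb(tile_side):
        return tile_side * tile_side * NEIGHBORHOOD * BYTES_PER_PIXEL / (1024 * 1024)

    for tile in (450, 256, 16):
        print(f"{tile}x{tile}: ~{denoise_working_set_mb(tile):.0f} MB")
    ```

    Under these assumptions a 450x450 tile needs roughly three times the denoiser memory of a 256x256 tile, which would explain why only the bigger tiles trigger the error.
    
    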



  8. #8
    Originally Posted by Roken View Post
    I've had this happen on a 4 GB card with scenes that push it. I've learned not to trust what Blender tells me about memory (VRAM in particular) and instead optimise my render or break it down for compositing.

    Just because Blender tells you it's only using half of your VRAM doesn't mean that it is. I'd bet money that you are running out, unless the same happens on very lightweight scenes (e.g. a cube and a lamp). If it happens on the basic scene, then, depending on your OS, look at drivers or (Linux is all I know about) your CUDA installation.
    The thing is, I only get this error on this project while denoising. It is not that big: 2 million faces, a couple of 4K PNG textures, and a 4K HDRI.
    I've rendered heavier projects with almost 4 GB of VRAM usage. VRAM usage is a mystery, to be honest. How come with 6 GB of VRAM you can't go over 4?



  9. #9
    Originally Posted by animani View Post
    The thing is, I only get this error on this project while denoising. It is not that big: 2 million faces, a couple of 4K PNG textures, and a 4K HDRI.
    I've rendered heavier projects with almost 4 GB of VRAM usage. VRAM usage is a mystery, to be honest. How come with 6 GB of VRAM you can't go over 4?
    Denoising needs some additional VRAM on top of what rendering the tiles uses.
    Check this with GPU-Z, or whatever the equivalent macOS tool is called (I forgot).

    Try reducing the texture size in "Simplify" by clamping it to a limit that does not change the render result, e.g. 2048 or 1024.
    As long as we don't have the global memory feature implemented, there is not much you can do other than trying not to exceed your VRAM.
    You may want to check the swerner patch (https://developer.blender.org/D2056).
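    For a sense of how much that texture limit saves: GPU texture memory scales with the square of the resolution, so clamping a 4K texture to 2K cuts its footprint to a quarter. A quick sketch (Python; assumes uncompressed 8-bit RGBA at 4 bytes/pixel, which is an assumption about how the textures are stored):

    ```python
    # Footprint of a square RGBA texture at various Simplify limits.
    # Assumes uncompressed 8-bit RGBA (4 bytes/pixel) on the GPU.
    def texture_mb(side_px, bytes_per_pixel=4):
        return side_px * side_px * bytes_per_pixel / (1024 * 1024)

    for side in (4096, 2048, 1024):
        print(f"{side}x{side}: {texture_mb(side):.0f} MB")
    ```
    
    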

    Jens
    Last edited by jensverwiebe; 13-Oct-17 at 04:56.



  10. #10
    Thanks, man. The tool I have reports VRAM usage, as I said, of about 30-35% while rendering. I don't think denoising can jump to 100% from there unless there is a bug or something. It wouldn't make sense.

    CUDA error: Out of memory in cuMemAlloc(&device_pointer, size), line 568

    That line should give us a clue about what's going on, but I don't know where to look for it.

    What do you mean by "reducing with clamping and "simplify""?

    I still think denoising should have nothing to do with the textures used in the project. Denoising is meant to be applied at the output document size only, which in this case is pretty small, less than 1K. Just speculating.



  11. #11
    1234567890
    Last edited by animani; 13-Oct-17 at 05:01.



  12. #12
    1234567890
    Last edited by animani; 13-Oct-17 at 05:02.



  13. #13
    1234567890
    Last edited by animani; 13-Oct-17 at 05:02.



  14. #14
    Wow the forum was not updating and I posted it 4 times. Any way to remove those extra posts?



  15. #15
    Originally Posted by animani View Post
    Thanks, man. The tool I have reports VRAM usage, as I said, of about 30-35% while rendering. I don't think denoising can jump to 100% from there unless there is a bug or something. It wouldn't make sense.

    CUDA error: Out of memory in cuMemAlloc(&device_pointer, size), line 568

    That line should give us a clue about what's going on, but I don't know where to look for it.

    What do you mean by "reducing with clamping and "simplify""?

    I still think denoising should have nothing to do with the textures used in the project. Denoising is meant to be applied at the output document size only, which in this case is pretty small, less than 1K. Just speculating.
    1. Denoising itself may have only a small impact, but GPU VRAM is not a homogeneous memory space; there are separate "portions" for dedicated types (texture space / constant memory / local memory / etc.), and when these add up they blow past the limit.

    2. Look at the Scene/Simplify panel to limit your memory usage via the texture limit and culling, as a test.

    3. Keep the tile size low; in current master 16x16 is optimal.
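    Putting the tile-size and memory-budget ideas together, here is a hypothetical helper that walks down from a preferred tile size until the estimated denoiser working set fits in the VRAM you have left (the 9-tile neighbourhood and bytes/pixel figure are illustrative guesses, not Cycles internals):

    ```python
    # Pick the largest candidate tile size whose estimated denoiser
    # working set fits within the remaining VRAM budget. The 9-tile
    # neighbourhood and bytes/pixel figure are illustrative assumptions.
    def fits(tile_side, budget_mb, bytes_per_pixel=100, neighborhood=9):
        need_mb = tile_side * tile_side * neighborhood * bytes_per_pixel / (1024 * 1024)
        return need_mb <= budget_mb

    def pick_tile_size(budget_mb, candidates=(512, 256, 128, 64, 32, 16)):
        for tile in candidates:
            if fits(tile, budget_mb):
                return tile
        return None  # nothing fits; reduce the scene instead

    print(pick_tile_size(100))  # with ~100 MB to spare, 512 is too big
    ```
    
    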

    Jens

    P.S.: Yep, sometimes the forum behaves oddly at the moment...
    Last edited by jensverwiebe; 13-Oct-17 at 05:25.



  16. #16
    I've never used the Simplify options; I'll check how they work. Thanks!

    The thing is that I don't see any spike in VRAM usage in my monitoring tool. That's why I was thinking it might be a bug rather than a project problem.



  17. #17
    Testing with 256 again, it reported another error. I guess I will have to ditch the denoising feature for now. Pretty sad.


