OptiX denoiser gives Blender out of memory

I cannot use the OptiX denoiser in the viewport.
I can render with it perfectly fine,
but when I preview in the viewport I get an error at the top saying:

Failed to create CUDA context (illegal address)

If I go to the system console I get this:
outofmem
Not even a simple default cube will make OptiX
work in the viewport.
My rig:
SYS: Windows 7 Professional 64-bit
RAM: 4GB
GPU: Asus NVIDIA GT 1030 2GB
CPU: AMD Athlon Dual-Core 5200+ 2.6GHz

Yep, I get that all the time too (GTX 1050 4GB). However, depending on what I'm doing, sometimes it doesn't produce an error, but I've never been able to narrow it down to a certain action. Sometimes it works, sometimes it doesn't.

The less vRAM you have, the more likely you are to have out-of-memory problems. 2GB of vRAM is really too low these days.

In these cases there is a workaround: lower the viewport resolution. You get faster render times and lower vRAM usage, but the image will look more blurred and less detailed:

Render tab > Performance > Viewport > Pixel Size, set 2x or 4x

If your monitor resolution is higher than 1080p, Automatic is probably already using 2x by default, so use at least 4x.
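If you prefer to set this from a script, here is a minimal sketch using Blender's Python API (assuming Blender 2.81+, where the OptiX denoiser and `RenderSettings.preview_pixel_size` are available):

```python
import bpy

# Lower the viewport (preview) resolution to reduce vRAM usage.
# Valid values are 'AUTO', '1', '2', '4' and '8'; '4' means each
# rendered pixel covers a 4x4 block of screen pixels.
scene = bpy.context.scene
scene.render.preview_pixel_size = '4'
```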

I lowered the pixel size, but when I preview, Blender still crashes after a few seconds.

As I said, 2GB is very little vRAM. Did it crash with 8x too?
You could give more information about the scene you are using: for example, the texture resolutions, whether you are using adaptive subdivision, or how much vRAM a normal render image occupies (use GPU-Z to monitor vRAM on Windows, not Blender's own readout).

You can use the Simplify menu to try to optimize the scene a bit more for the viewport, especially the viewport Texture Limit if you are using large images.
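For reference, the same Simplify options can be toggled from Python; a minimal sketch (assuming Cycles is the active render engine, since `texture_limit` lives on `scene.cycles`):

```python
import bpy

scene = bpy.context.scene

# Enable Simplify and cap subdivision levels shown in the viewport.
scene.render.use_simplify = True
scene.render.simplify_subdivision = 2

# Cycles-only: clamp texture sizes in the viewport to save vRAM.
# Valid values include 'OFF', '128', '256', '512', '1024', '2048', ...
scene.cycles.texture_limit = '1024'
```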

By the way, a normal render image may be at the limit of available vRAM and still not cause problems, because during a final render the OptiX denoiser works on each small tile rather than on the entire image.

Yeah, I made a very big mistake:
I applied the subdivision modifier :frowning:

But also, even a simple default cube causes Blender to crash.

Just in case: download the latest drivers available from the NVIDIA website, and when you run the installer, check the option to do a clean install.

Yeah, the installer does not work; it just says:
this version is not compatible with your system.
And if I use NVIDIA GeForce Experience,
it just gets stuck at 100% and nothing installs.

Anyway, the problem is your 2GB of vRAM. Here on my system, the default cube with the OptiX denoiser in the viewport uses 1.5GB of vRAM. Then you must add whatever Windows and other apps are using, and that quickly exceeds the available 2GB.

Another workaround that might help a little to reduce vRAM usage: use camera view, set a Render Region (Ctrl+B) inside the camera frame, and keep the viewport window small. But this reduces vRAM usage to almost the same level as lowering the viewport resolution with Pixel Size.
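Setting a render region from Python looks like this; a minimal sketch (the border coordinates are hypothetical example values, normalized from 0 to 1 within the camera frame):

```python
import bpy

scene = bpy.context.scene

# Restrict rendering to a region of the camera frame (same as Ctrl+B).
scene.render.use_border = True
scene.render.border_min_x = 0.25  # hypothetical region: the center quarter
scene.render.border_max_x = 0.75
scene.render.border_min_y = 0.25
scene.render.border_max_y = 0.75
```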

Will Blender also use virtual RAM?

Virtual memory is related to the machine's RAM; it will not help with the vRAM on the graphics card.

I've got some bad news: now, for some reason, I cannot preview
or render even with the denoiser off :disappointed_relieved: