GPU Memory

Hi, I have a file that won’t render on the GPU. I have a 2080 Ti with 11 GB of memory.
When I render on the CPU it says that peak memory is 7 GB. Why won’t it work on the GPU, shouldn’t 11 GB be enough?

When I work on the file in Blender, Blender uses 30 GB of RAM.
Thanks

If you also use the 2080 Ti for display, then it has already occupied some VRAM before rendering.
To check whether the VRAM is enough, look at the memory usage before/during rendering.
(In an up-to-date Windows 10: Task Manager -> Performance tab -> GPU -> memory usage.)
To use all 11 GB, a good approach may be to use another GPU for display and let the 2080 Ti do only the rendering.
That said, it’s probably not a “not enough VRAM” issue, since CUDA GPU rendering can fall back to host memory when VRAM runs out.
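If you prefer a command-line check over Task Manager, something like this rough sketch (it just shells out to nvidia-smi, so it assumes the NVIDIA driver utilities are on your PATH) prints used vs. total VRAM per GPU:

```python
import subprocess

# Query used/total VRAM per GPU via nvidia-smi.
# Assumes nvidia-smi is installed and on PATH (ships with the NVIDIA driver).
out = subprocess.check_output(
    ["nvidia-smi", "--query-gpu=name,memory.used,memory.total", "--format=csv,noheader"],
    text=True,
)
print(out)
```

Running this while a render is in progress shows how much of the 11 GB is actually free for Cycles.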

Ok, so GPU shouldn’t be a problem.
I have done some more testing with CUDA.

E-cycles, 1 or 2 GPUs: Doesn’t render in the viewport and can’t render at all; it just says operation cancelled.

Blender 2.81a, 1 GPU: Renders in the viewport. Renders a black file, but a couple of tiles were briefly rendered before disappearing into the transparent checker pattern, leaving a black output.

Blender 2.81a, 2 GPUs: Renders in the viewport. Renders a black file without showing any tiles being rendered, only the checker pattern.

CPU rendering works fine in both E-cycles and 2.81a.
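(For anyone wanting to reproduce these GPU configurations from a script rather than the Preferences window, a minimal sketch using the Blender 2.8x Python API, assuming Cycles and CUDA devices are available:)

```python
import bpy

# Enable CUDA rendering devices in the Cycles add-on preferences (Blender 2.8x API).
prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "CUDA"
prefs.get_devices()  # refresh the device list

for dev in prefs.devices:
    # Toggle dev.use per device to test 1-GPU vs. 2-GPU setups.
    dev.use = (dev.type == "CUDA")
    print(dev.name, dev.type, "enabled" if dev.use else "disabled")

# Tell the scene to render on the GPU instead of the CPU.
bpy.context.scene.cycles.device = "GPU"
```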

Any idea what is going on?

Solved it. I changed the tile size to 128x128 instead of 512x512.
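For reference, the same change can be made from Blender’s Python console (a minimal sketch for Blender 2.8x, where Cycles tile size is still set manually):

```python
import bpy

# Smaller tiles reduce per-tile GPU memory pressure; 512x512 was too large here.
scene = bpy.context.scene
scene.render.tile_x = 128
scene.render.tile_y = 128
```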