RTX 3090 Ti: System is out of GPU and shared host memory

Judging by this thread, I need to have 48 GB of RAM in order to fully utilize the 24 GB of VRAM.

That would track with what I'm seeing: I have 32 GB of RAM and I cap out at around 16 GB of VRAM usage in Blender and memtest.
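
If the rule from that thread is right (usable VRAM capped at roughly half of system RAM), a quick check along these lines would show the numbers. This is just a sketch, assuming the third-party packages psutil and nvidia-ml-py (pynvml) are installed; neither comes from the thread itself.

```python
# Compare total VRAM against system RAM and estimate usable VRAM
# under the "RAM must be ~2x VRAM" pattern described above.
import psutil
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)                 # first GPU (e.g. the 3090 Ti)
vram_total = pynvml.nvmlDeviceGetMemoryInfo(handle).total     # bytes of dedicated VRAM
ram_total = psutil.virtual_memory().total                     # bytes of system RAM
pynvml.nvmlShutdown()

gib = 1024 ** 3
print(f"VRAM: {vram_total / gib:.1f} GiB, system RAM: {ram_total / gib:.1f} GiB")

# If the claim holds, VRAM usage stalls around half of system RAM.
usable_vram = min(vram_total, ram_total / 2)
print(f"Expected usable VRAM under that rule: {usable_vram / gib:.1f} GiB")
```

With 32 GB of RAM that estimate lands right at 16 GiB, which matches where the usage stops climbing.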

I’ve never heard of this before. If it's true, it should be common knowledge and something GPU manufacturers would advertise up front.