Unexpected behavior with motion blur between RTX 3060 Ti and RTX 2070 Super cards

Running Blender 2.91.0 on all machines, each with 32 GB of RAM, rendering in Cycles with OptiX as the render device.

Windows 10 Pro:
- Machine 1: RTX 3060 Ti
- Machine 2: RTX 2070 Super

Ubuntu 20.04:
- Machine 3: RTX 2070 Super

All NVIDIA drivers are up to date on all machines.

With motion blur off, all machines render fine, showing ~4500 MB peak memory.
With motion blur on, the RTX 3060 Ti machine still renders, showing ~5900 MB peak memory.
Both RTX 2070 Super machines crash with an out-of-CUDA-memory error.
Mind you, the RTX 3060 Ti machine was also running DaVinci Resolve at the time!
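
For anyone who wants to reproduce this, here's roughly what I'm doing expressed as a Blender Python sketch (the .blend file and script name are placeholders; I'm actually flipping the same toggles through the UI):

```python
# A minimal sketch of the setup using Blender's Python API (bpy).
# Run as: blender --background scene.blend --python render_test.py
import bpy

scene = bpy.context.scene
prefs = bpy.context.preferences.addons['cycles'].preferences

# Render with Cycles on the GPU, using OptiX as the compute backend.
prefs.compute_device_type = 'OPTIX'
scene.render.engine = 'CYCLES'
scene.cycles.device = 'GPU'

# Enable the GPU devices (needed when running in background mode).
prefs.get_devices()
for device in prefs.devices:
    device.use = True

# This is the one toggle that makes the 2070 Super machines fall over.
scene.render.use_motion_blur = True

# Render; peak memory is reported in the render stats.
bpy.ops.render.render(write_still=True)
```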

Now I'm confused: on paper, both graphics cards sport 8 GB of VRAM.
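
I haven't ruled out that something else on the 2070 Super machines is eating VRAM before the render even starts. If anyone wants to compare, here's a quick sketch using the pynvml package (assuming `pip install pynvml`) that dumps what each card reports as free:

```python
# Quick VRAM check with pynvml.
# Prints total/used/free memory for every NVIDIA GPU in the machine.
import pynvml

pynvml.nvmlInit()
for i in range(pynvml.nvmlDeviceGetCount()):
    handle = pynvml.nvmlDeviceGetHandleByIndex(i)
    name = pynvml.nvmlDeviceGetName(handle)
    if isinstance(name, bytes):  # older pynvml versions return bytes
        name = name.decode()
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
    mib = 1024 * 1024
    print(f"{name}: total {mem.total // mib} MiB, "
          f"used {mem.used // mib} MiB, free {mem.free // mib} MiB")
pynvml.nvmlShutdown()
```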

Is there any reason one would expect the 3060 Ti to handle motion blur when a 2070 Super cannot?

–mjlg