World size and HDRI

So I tried to implement an HDRI environment/background, and I am quite happy with how easy it is to do and how nice it looks. There is, however, a kind of mystery (hence this question).

I have a model created to scale (metric measurements), but when I place my camera in front of the object, the background image looks too big.
I could move the camera or change the depth of field, but that seems like a work-around. Somehow the distance to the HDRI image seems too short. Is there any way to tell Blender to expand the size of the virtual room?

Googling HDRI and world scale or size turns up endless variations of switching to metric units and making sure things are sized correctly. But that is NOT what I am asking. My model is correct in size. It’s the world that’s not correct… so to speak.

Running Blender 2.82 BTW

edit: I realize this would probably be fixed easily if the HDRI image were “wider”. But somehow that doesn’t make sense, since it appears to be 360 degrees… And I tried 2K and 4K with the same issue (although, naturally, the image quality changed).

The HDRI is projected at infinity. There is no way to make it larger or smaller. The camera is the only way to do what you want.
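
In practice that means reframing with the camera: a shorter focal length (wider angle) makes the background features look smaller in frame, and moving the camera closer to the subject keeps the subject at roughly the same apparent size. Here is a rough bpy sketch of that idea; the object names ("Camera", "MyModel"), focal length, and distance are placeholders, not anything specific to your scene:

```python
import bpy

# Rough sketch: reframe so the HDRI background appears smaller relative to
# the subject. "Camera" and "MyModel" are placeholder object names.
cam_obj = bpy.data.objects["Camera"]
subject = bpy.data.objects["MyModel"]

# The HDRI sits at infinity, so its apparent size in frame depends only on
# the field of view: a shorter focal length (wider angle) makes background
# features look smaller, a longer one makes them look larger.
cam_obj.data.lens = 24.0  # focal length in mm (Blender's default is 50 mm)

# Move the camera closer along its current line to the subject so the
# subject still fills roughly the same portion of the wider frame.
# (Assumes the camera is already aimed at the subject.)
direction = (cam_obj.location - subject.location).normalized()
cam_obj.location = subject.location + direction * 2.0  # e.g. 2 m from the subject
```

You can of course make the same adjustments by hand in the viewport; the script just makes the relationship between focal length, camera distance, and apparent background size explicit.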

Thanks, WildBill

I still can’t wrap my head around the right way to solve this. In the real world I would expect to be able to position the background and foreground in camera, but I should probably avoid HDRI backgrounds with trees!