Rendered HDRI looks different from the original 3D scene

Hey Guys,
I’ve created a space environment with a sphere and two particle systems. If I render it at 1920 × 1080 at 100%, it looks the way I want it to. Now I want to create an HDRI map out of this, or at least a spherical texture, that I can use as a background in other Blender scenes. For this I set the resolution to 16384 × 8192, and in the camera settings I set the lens to Panoramic with the Equirectangular type.
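For reference, the setup described above can also be done from a script. This is just a minimal sketch, assuming Cycles; note that the panorama type property moved from the Cycles add-on settings onto the camera itself in Blender 3.0:

```python
# Sketch of the render/camera setup described above (Cycles assumed).
# Run inside Blender's Python console or a Text Editor script.
import bpy

scene = bpy.context.scene
cam = scene.camera.data

# Equirectangular output needs a 2:1 aspect ratio.
scene.render.resolution_x = 16384
scene.render.resolution_y = 8192
scene.render.resolution_percentage = 100

cam.type = 'PANO'
# Blender 3.0+: panorama type lives on the camera data block.
# On older versions it was cam.cycles.panorama_type instead.
cam.panorama_type = 'EQUIRECTANGULAR'
```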
But the render result doesn’t match the original view. The stars are much bigger, so there are fewer of them on screen. My question: why is the HDRI so different from the normal view in the scene, and how can I change that, so I don’t have to render the whole HDRI image out again and again?
Also, the HDRI image looks kind of low-res, but I think that’s related to it looking kind of zoomed in.

P.S. Sorry for my bad English

Can you share both renders here? (Ideally, the .blend file)

I can’t upload the .blend. It says it’s too big.

Make sure your camera is at the center of your sphere.

An equirectangular panoramic image is supposed to look distorted (especially toward the edges). It’s hard to tell here, I think because you cropped the image to 1920 × 1080.

Also, far fewer stars will appear in a perspective render, since it only covers a narrow slice of the full sphere. See the renders below, which I just took for comparison:

Perspective (3000x1500)

Panoramic (3000x1500)

Here is the .blend file i used:
space_panoramic.blend (764.3 KB)


Besides, you can download something like GoPro VR Player to view and test your panoramic image.
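A quick back-of-envelope check also helps here (assuming the perspective render used Blender’s default 50 mm lens on a 36 mm sensor — adjust if yours differs). It suggests the two renders sample the sky at almost the same angular resolution, so the difference you see is coverage, not missing detail:

```python
import math

# Horizontal field of view of Blender's default perspective camera:
# 50 mm focal length, 36 mm sensor width.
fov_deg = math.degrees(2 * math.atan(36 / (2 * 50)))  # ~39.6 degrees

# Degrees of sky covered by one pixel in each render.
persp_deg_per_px = fov_deg / 1920  # 1920-wide perspective render
pano_deg_per_px = 360 / 16384      # 16384-wide equirectangular render

# Ratio close to 1.0 means comparable angular sampling.
print(round(fov_deg, 1))
print(round(persp_deg_per_px / pano_deg_per_px, 2))
```

So a 16384 × 8192 panorama packs roughly the same detail per degree of sky as the 1920-wide perspective view; any remaining softness comes from the reprojection when the map is viewed through a camera again.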

The second image is the sphere with the particle system, rendered with the normal settings. The first image is the rendered HDRI map applied to the Background shader in Blender, rendered with the default settings. Both images are 1920 × 1080.
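In case it helps to script that step too: a sketch of wiring a rendered map into the World’s Background shader (the image path is a placeholder — point it at your own render):

```python
# Sketch: load the rendered panorama as the world background.
# The file path below is a placeholder, not from the thread.
import bpy

world = bpy.context.scene.world
world.use_nodes = True
nodes = world.node_tree.nodes
links = world.node_tree.links

# Environment Texture -> Background (the default world tree
# already contains a node named "Background").
env = nodes.new('ShaderNodeTexEnvironment')
env.image = bpy.data.images.load('/path/to/space_hdri.exr')
links.new(env.outputs['Color'], nodes['Background'].inputs['Color'])
```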

Make sure you are using the right color space.