Is there any way of achieving cubemaps in real time? I’ve seen static cubemaps made from 360° images, and I also thought of rendering (or baking) the scene into an HDR image and using that as a cubemap, but it would still be static if objects are moved. So I wanted to learn how to do it properly, using the current (and best) way to achieve it.
There’s a feature called dynamic textures. You’ll likely need to know Python, and you will definitely need a powerful computer. I’ve also heard UPBGE added a planar reflections feature.
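As a rough idea of what dynamic textures look like in practice, here is a minimal sketch using the `bge.texture` module (render-to-texture from a second camera, refreshed every frame). It only runs inside the game engine, and the object, camera, and material names (`MirrorCam`, `MAMirrorMat`, the `mirror_tex` property) are placeholders for illustration, not part of the API. A single `ImageRender` gives one view; a true cubemap would need six renders (or UPBGE’s built-in reflection features).

```python
# Sketch: per-frame render-to-texture in BGE/UPBGE (runs only inside the engine).
# Attach as a Python module controller on the reflective object.
import bge

def update_reflection(cont):
    own = cont.owner
    scene = bge.logic.getCurrentScene()

    # Create the render source once and cache it as a game property
    if "mirror_tex" not in own:
        cam = scene.objects["MirrorCam"]            # placeholder camera name
        mat_id = bge.texture.materialID(own, "MAMirrorMat")  # placeholder material
        tex = bge.texture.Texture(own, mat_id)
        tex.source = bge.texture.ImageRender(scene, cam)
        own["mirror_tex"] = tex

    # Re-render the scene from the camera and upload it to the texture
    own["mirror_tex"].refresh(True)
```

Refreshing every frame is what makes it expensive, which is why the hardware requirements come up below.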
Just know that real-time solutions will never look as good as rendered/raytraced output (read: Cycles). We’ve gotten close, but the hardware requirements are very high.
I just watched the video and tried it myself, and now I’m wondering if there’s any way to change the color of the cubemap. At first glance it looks like a chrome ball reflecting its surroundings, so I tried using a matcap texture in a separate material and mixing it in the node editor to change the look of the ball. Here’s the file: