I have a very specific use case for Cycles and Python to make an environment map. I’d like to be able to sample a ray of my choosing and get the rendered color. Basically I’d be creating my own “camera” or mapping of image pixels to rays.
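To make the kind of mapping I mean concrete, here is a minimal sketch in plain Python (no bpy needed) of one possible pixel-to-ray function, assuming an equirectangular environment map; the function name and the particular projection are just illustrative, since any custom mapping could be substituted:

```python
import math

def pixel_to_ray(x, y, width, height):
    """Hypothetical example: map an image pixel to a world-space ray
    direction for an equirectangular environment map."""
    # Normalized coordinates in (0, 1), sampled at pixel centers.
    u = (x + 0.5) / width
    v = (y + 0.5) / height
    # Longitude spans [-pi, pi), latitude spans [-pi/2, pi/2].
    lon = (u - 0.5) * 2.0 * math.pi
    lat = (0.5 - v) * math.pi
    # Spherical to Cartesian, Z up (matching Blender's convention);
    # the result is a unit vector by construction.
    return (math.cos(lat) * math.sin(lon),
            math.cos(lat) * math.cos(lon),
            math.sin(lat))
```

The question is essentially how to hand each such direction to Cycles and read back the rendered color, rather than going through a camera.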
I’ve figured out that I can set a very small image size, like 4×4 pixels, point the camera where I want, and render the scene, then postprocess to combine all the renders into one image. This kind of works, but it’s extremely slow because the scene is re-parsed for every render, and writing a few pixels at a time to separate image files on disk is very inefficient.
I know this is a bit of a stretch, but I’d like to be able to cast my own rays and get the resulting colors back. Is there anything remotely approaching this in Blender/Cycles?
FYI, I am aware of the panoramic camera options but those don’t do what I need.