Hi everyone! I’m a relative newbie to using Python in Blender (I’ve only been messing around with it for a week or so). I’m looking to write a script that does automatic LOD calculations on a complex scene using collapse (angle-based) decimation before rendering, to help reduce compute times. The idea is that the smaller an object is and the further it sits from the camera, the higher the limiting angle on its polygons, and thus the fewer polygons it keeps. All good so far; I should have no problem with that (I just need to figure out how to get the bounding box size, the object midpoint, and the current camera location). But there’s one thing I’ve searched for and haven’t found out how to do: I want lower limiting angles on objects that are directly visible in the viewing frustum than on those that are not. That is to say, a shadow caster or a reflection or whatnot doesn’t need to be as precise as an object right in front of you, even if both are equidistant from the camera.
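To show what I mean by the size/distance heuristic, here’s a rough standalone sketch (no bpy). The function name, the clamp, and the 5°–45° range are all placeholder choices of mine, not anything Blender-specific:

```python
def lod_angle_limit(bbox_size, distance, base_angle=5.0, max_angle=45.0):
    """Hypothetical heuristic: scale the decimate angle limit by the
    object's apparent size (bounding-box diagonal / camera distance).
    Small or distant objects get a larger limit, hence fewer polygons."""
    if distance <= 0.0:
        return base_angle                    # degenerate case: keep detail
    apparent = bbox_size / distance          # rough proxy for angular size
    t = min(1.0, apparent)                   # clamp to [0, 1]
    # Large apparent size maps to base_angle, tiny one to max_angle.
    return max_angle - (max_angle - base_angle) * t

# A 2-unit object 1 unit away keeps detail; the same object 50 units
# away gets a much coarser angle limit.
print(lod_angle_limit(2.0, 1.0))   # near/large: close to base_angle
print(lod_angle_limit(2.0, 50.0))  # far/small:  close to max_angle
```

The result would then feed the Decimate modifier’s angle limit per object, with the visibility test below nudging it up or down.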
Is there a way in Blender to test whether an object is directly visible to the camera (best), or at least whether it falls within the viewing frustum regardless of occlusion (second-best option)? I mean, I could look up the algorithm for a frustum check and program it myself against the object’s vertices or bounding box, but that would be a pain and almost certainly slower than using anything internal.
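To illustrate the sort of manual check I’d rather avoid, here’s a rough standalone sketch of a per-point frustum test (plain tuples, no bpy; the symmetric pinhole-camera assumption and all the names are mine):

```python
import math

def point_in_frustum(point, cam_pos, cam_forward, cam_up, fov, aspect,
                     near=0.1, far=1000.0):
    """Test one world-space point against a symmetric pinhole frustum.
    Vectors are plain (x, y, z) tuples; fov is the horizontal field of
    view in radians. Illustrative only, not Blender API."""
    def sub(a, b):   return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
    def dot(a, b):   return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
    def cross(a, b): return (a[1]*b[2]-a[2]*b[1],
                             a[2]*b[0]-a[0]*b[2],
                             a[0]*b[1]-a[1]*b[0])

    right = cross(cam_forward, cam_up)       # camera's local x axis
    rel = sub(point, cam_pos)
    depth = dot(rel, cam_forward)            # distance along view axis
    if not (near <= depth <= far):
        return False
    half_w = depth * math.tan(fov / 2.0)     # frustum half-width at depth
    half_h = half_w / aspect
    return abs(dot(rel, right)) <= half_w and abs(dot(rel, cam_up)) <= half_h

# Camera at origin looking down +Y, 90-degree FOV, square aspect:
cam = dict(cam_pos=(0, 0, 0), cam_forward=(0, 1, 0), cam_up=(0, 0, 1),
           fov=math.pi / 2, aspect=1.0)
print(point_in_frustum((0, 5, 0), **cam))   # dead ahead -> True
print(point_in_frustum((0, -5, 0), **cam))  # behind     -> False
```

Per object I’d run this over the eight bounding-box corners and call it “in frustum” if any corner passes, though that misses the edge case of a big object spanning the view with every corner outside. Which is exactly why I’d rather use something internal.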
Does any such functionality exist internally in Blender?