Scaling sunlight/starlight arcseconds

With a constant enthusiasm for space scenes and wide-field-of-view landscapes around the Forum, I thought I’d ask: is there a way to specify the Sun’s size in arcseconds?

It’s not as crazy as it sounds. The Blender Sun, as I understand it, generates parallel rays. That’s fine until you hit a large scale. Point lights, on the other hand, cast shadows radially outward from themselves, which is fine until you get to a small scale.

Somewhere in between, it seems there should be arcseconds - how much of the Sun’s disc you’re using to light the scene you’re showing. A way to bend the Sun’s projected rays from parallel to diverging while keeping shadow sharpness controlled by an angular size value.
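For reference, that “how much of the Sun” number is plain geometry: the angular diameter follows from the body’s physical diameter and its distance. A minimal Python sketch (the Sun’s diameter and the Earth–Sun distance below are rounded public figures):

```python
import math

def angular_diameter_arcsec(diameter_km: float, distance_km: float) -> float:
    """Angular diameter of a sphere seen from `distance_km`, in arcseconds."""
    radians = 2.0 * math.atan2(diameter_km / 2.0, distance_km)
    return math.degrees(radians) * 3600.0

# Sun as seen from Earth: ~1.39 million km across, ~149.6 million km away.
sun_arcsec = angular_diameter_arcsec(1_391_400, 149_600_000)
print(f"{sun_arcsec:.0f} arcsec = {sun_arcsec / 3600:.3f} degrees")
# Roughly 1919 arcsec, i.e. about 0.533 degrees or 32 arcminutes.
```

So the real Sun is about half a degree across from Earth, which is why sub-degree angular control matters for realistic shadows.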

I realize that the camera point of view and perspective will probably negate any actual accuracy, but I was just thinking and thought I’d toss the question out there as proof to the Universe that I was thinking.

Since most people here probably don’t study astronomy to a high degree, it’s probably a safer bet to explain your question in more lay terms, even if it doesn’t make you sound as smart. Yeah, you know what you’re doing there! Sorry, I had to poke fun at ya a bit. :stuck_out_tongue:

Anyway, I personally only took a semester of astronomy in college, so I’m fuzzy on what you’re asking.
You mention arc seconds in relation to the size of the sun, but then you’re talking about the angle of the light rays.

So are you asking if it’s possible to describe the angular diameter of an object (the sun) in arcseconds, or the angular distance between objects? That’s basically what arcseconds and arcminutes are used for.

Or are you asking if it’s possible to have a light cone angle in units smaller than one degree?
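For what it’s worth, Blender’s Sun lamp does accept sub-degree values: its Angle setting is the angular diameter of the disc, shown in degrees in the UI but stored in radians. A small conversion sketch; the exact `bpy` property path in the comment is my assumption of the current API, so treat it as untested:

```python
import math

ARCSEC_PER_DEGREE = 3600.0

def arcsec_to_radians(arcsec: float) -> float:
    """Convert an angular size in arcseconds to radians."""
    return math.radians(arcsec / ARCSEC_PER_DEGREE)

# The real Sun subtends ~1919 arcsec. In a Blender script you would assign
# this to the Sun light's angular-diameter property, e.g. (assumed path):
#   bpy.data.lights["Sun"].angle = arcsec_to_radians(1919)
print(arcsec_to_radians(1919))  # ~0.0093 rad
```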

Certainly nothing I know holds a candle to what the Blender Developers know and do. But I needed that grounding, thank you! Working off your clarification, I’m talking about the angular diameters of celestial objects from the point of view of the sun. The distance between objects from a star’s point of view would also be important in forcing celestial perspective.

Creative Shrimp has a learn-the-universe package for Blender and it’s all sorts of great. We can make a planet and we can light it; but once you add a second planet at almost any distance from the first, the Blender lighting is betrayed as either a Sun with parallel rays or some Point light that’s obviously not scaled far enough away to be as realistic as the planets. And if you do get that point light far enough away, other tweaks have to be made to strengthen the light, adjust shadows, etc.
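On the “strengthen the light” point: that follows directly from the inverse-square law, so pushing a point light farther back to flatten its rays gets expensive fast. A rough sketch of the required power scaling (the numbers are arbitrary illustrations):

```python
def power_for_distance(base_power_w: float, base_dist: float, new_dist: float) -> float:
    """Power needed at `new_dist` to match the illuminance `base_power_w`
    produced at `base_dist`, by the inverse-square law."""
    return base_power_w * (new_dist / base_dist) ** 2

# Move a 1 kW point light from 10 units away to 1000 units away:
print(power_for_distance(1000, 10, 1000))  # → 10000000.0 (ten megawatts)
```

Every 10x increase in distance needs a 100x increase in power, which is why a “realistically far” point light quickly becomes impractical to tune by hand.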

I hope this isn’t an awful example but what comes to mind in simple terms is an eclipse. I don’t know a Blender lighting solution that can keep hard shadows on the closer body while “reaching around” to cast a diffused shadow on the farther body behind it. You have to spoil one to light the other.
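The hard-shadow/diffuse-shadow split in an eclipse is actually pure geometry: a finite-size light source casts a dark umbra cone of finite length behind a body, and beyond that cone only soft penumbra remains. A sketch of that cone length, using rounded real figures for the Sun and Moon:

```python
def umbra_length_km(r_sun_km: float, r_body_km: float, sun_distance_km: float) -> float:
    """Length of the fully dark umbra cone behind a sphere lit by a
    finite-size sun, from similar triangles."""
    return r_body_km * sun_distance_km / (r_sun_km - r_body_km)

# Moon (radius ~1737 km) at ~149.6 million km from the Sun (radius ~695,700 km):
moon_umbra = umbra_length_km(695_700, 1_737.4, 149_600_000)
print(f"{moon_umbra:,.0f} km")  # roughly 374,000 km
```

That’s about the Earth–Moon distance, which is why total solar eclipses barely happen at all: the hard shadow only just reaches us, and everything beyond it gets the diffused penumbra.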

Meanwhile… :slight_smile:

If I understand light cone angles at all, is that in reference to something like watching a cloud obscure a light source? The sun is blotted out at the same moment the shadow arrives, unless, in Blender, your angles are all off and the observer camera is still lit even though, visually, the ‘sun’ is behind clouds. As I think about this, I wonder if HDRI environments aren’t actually the solution, since their distances aren’t physically modeled the way Blender lights are…

A cloud passing before that sun would always cast a shadow on the observer no matter where in the scene they were. Interesting.

What specifically are you trying to do? Provide more context.
If you’re looking to create space scenes, there are two scenarios that I can think of.

In scenario A, you want to make something that looks pretty. If this is the case, forget about accurate math and scale and just use your creative license, because you’re going to have to exaggerate the scale to make it look good anyway.

In scenario B, you want something mathematically accurate involving more than one celestial body. In this case, you’re talking about a scale so… well, astronomical… that you aren’t going to have more than a few pixels to describe objects anyway.

You can’t really have it both ways, my friend. :slight_smile: The real universe just isn’t as scenic as sci-fi makes it out to be. Even well-known real-life space photos like the Pillars of Creation got the “Instagram filter” treatment to look good for public viewing.
You’re either rendering objects unrealistically close, or you’re rendering dots that are millions of miles apart.
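To put numbers on “dots that are millions of miles apart”: here’s a rough back-of-envelope of how many pixels a realistically placed planet would cover. It uses a small-angle approximation, and the 40° field of view and 1920 px image width are arbitrary assumptions:

```python
import math

def pixels_across(object_angle_deg: float, camera_fov_deg: float, image_width_px: int) -> float:
    """Approximate pixel span of an object near image center,
    valid for angles small relative to the field of view."""
    return image_width_px * object_angle_deg / camera_fov_deg

# Earth (radius ~6371 km) seen from Mars at closest approach (~56 million km):
earth_deg = math.degrees(2 * math.atan2(6_371, 56_000_000))
earth_px = pixels_across(earth_deg, 40.0, 1920)
print(f"{earth_px:.2f} px")  # well under a single pixel
```

At true scale, a whole planet doesn’t even fill one pixel from the next planet over, which is the core of the scenario-B problem.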

I don’t know, maybe I’m light years off from what you’re asking. :wink: