I bought an Insta360 One X camera and am really enjoying the one-click pano capture, albeit with seemingly only 3 stops of range? Has anyone else looked into using this device for environments, lighting and reflection maps? I’m wondering about the best file conversion or HDRI method. It seems to store .dng files for its HDR output.
You’d need something like a 12-stop ND filter (ref this) to properly shoot the sun.
I don’t know the camera, but surely you can do manual exposure? And is that 3 stops just a bracketing thing? For manual exposure shooting you’d ideally want a mount with a bubble level and tethering, or at least remote control. You still won’t be able to expose for the sun; you’d either have to paint it in manually afterwards, or add an additional sun lamp and line it up in a spherical reflection.
Hi Carl
Thanks for the input. I essentially bought this camera to avoid the whole painful dSLR workflow (I have 2 dSLRs, a mirrorless, suitable lenses, and pretty much everything except a pano head).
The Insta360 One X does have an HDR function built in, and it even auto-paints out the mount (which is a really amazing time saver IMHO); it just doesn’t cover as many stops as the dSLR route, and I’m wondering if one needs so many stops? This is the inside of my bathroom (soon to get some CG characters comped in). I opened the .dng in Darktable and saved out an .exr file and the .jpg I posted (which seems a bit darker).
I think I need to clean my bathroom, that’s what I think. :)
I don’t think 3 extra stops is enough to avoid all clipping, even for artificially lit interiors.
Still, you’re pretty much only going to clip the fixtures themselves. Seeing the filament is silly imo.
I wouldn’t worry too much about it. Only you can experiment and decide what is good enough.
The 18MP resolution (6080x3040) is borderline of what I would call acceptable for an LDR backplate; you become restricted in what focal lengths you can use. You need about 10000x5000 to get relatively sharp results with a 50mm lens. Of course, you can always blur the background, which is usually a good reason to use longer focal lengths in the first place. Or shoot proper high-res backplate alternatives in the regular fashion; they won’t always match the lighting, but usually only you would know.
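To make the focal-length restriction concrete, here is a small sketch (the function name is mine, and a 36mm full-frame sensor is assumed) that estimates how many horizontal pixels an equirectangular pano actually provides for a given lens:

```python
import math

def equirect_crop_width(pano_width, focal_mm, sensor_mm=36.0):
    # Horizontal field of view of the lens on a full-frame sensor,
    # then the slice of the 360-degree pano it covers, in pixels.
    hfov_deg = math.degrees(2 * math.atan(sensor_mm / (2 * focal_mm)))
    return pano_width * hfov_deg / 360.0

print(round(equirect_crop_width(6080, 50)))   # ~669 px from the Insta360 pano
print(round(equirect_crop_width(10000, 50)))  # ~1100 px from a 10000x5000 pano
```

Even the 10000-wide pano only gives a 50mm view about 1100 source pixels across, which is why “relatively sharp” is about the best you can hope for.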
It’s more than enough for HDR reflections.
It’s overkill for HDR lighting, at least if you do it using three separate images (the sIBL way, per HDR Labs).
Does the .dng really contain the HDR? Or is it just an increased bit depth saved to .dng? I know many claim to be HDR if they can produce a tonemapped image from a higher bit depth. Some range is going to get in there, sure, but… 3 stops? I’m no expert, but it doesn’t sound like it does more than save in raw.
I’d call it more than low dynamic range, but not close to bracketed high dynamic range, and way out of reach from deep dynamic range (sun). But maybe good enough for interiors.
Frankly, I think the jpg of the bathroom would suffice. Just add a simple lamp on the ceiling.
Here is an image showing what I mean about backplate resolution. This one is 8000x4000, and the guys get quite blurry using a 50mm lens.
I might be wrong as I don’t have this 360 camera, but I think the HDR function is to create a tonemapped image with a larger dynamic range, not a proper HDR probe that you can use to light your scenes.
I created the HDR for this image by rendering a panorama in Vue. I went down to -8EV and managed to get decent shadows, although I had to use light blocking objects in the scene to cut down the ambient lighting onto the plane. Vue is meant to have a physically accurate lighting model (the sun was only just starting to clip at -8EV) but I don’t know how it compares to a real photograph.
I opened the hdr in GIMP (it was created in Photomatix using half a dozen tiffs exported from Vue at different exposures) and the colour picker showed that the sun was giving RGB values of around 35 while other pixels in the scene were in the expected range (about 0.2 - 0.6), so the sun was about 100 times brighter than the rest of the scene.
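For reference, that ratio converts to stops with a base-2 log; a quick check (my own arithmetic, not from the Photomatix output):

```python
import math

# Sun pixels at ~35 vs typical scene pixels at ~0.35 (100x brighter):
stops = math.log2(35 / 0.35)
print(round(stops, 2))  # ~6.64 stops between the sun and the rest of the scene
```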
Photographer here, writing a paper on HDRI for CG. (let’s see what I know, yay)
EDIT: looking at Fabien’s YouTube video I realize I haven’t looked at some cameras; it’s a good video. If I were a liking man I’d “like” it.
In short:
Yes, you need that many stops, though not more than 24.
Your camera, monitor and eyes can only see a limited number of EV, but the mechanics in CG software have no such limitation. If you make a black reflective sphere and blast a proper HDRI at it, you will be able to see detail reflected off the sphere even when the background is completely white. This is assuming you want proper IBL. I mean, Canon’s in-built HDR is ±2EV × 3 shots = 6EV.
File type is important unless you just want a backplate:
So… I may be wrong about this because I got conflicting results from my tests, but I don’t believe it is a good idea to use DNG as an HDR format. You want a 32-bit format like .hdr or .exr (Blender doesn’t like .exr I think, which it should love). DNG by itself is a 16-bit format which covers about 2EV of difference, and it should not be possible to have an HDR of 20EV range, yet somehow Lightroom seems to do it, at least within the software. As a backplate though, meaning it isn’t a light source, it’s completely okay to use pretty much anything.
The camera you are using is practical but not made for this:
I own 2 Canon full-frame cameras, a 3m pole, a homemade pano head… you start to feel the weight on location. Anyway, I was looking at using a 360° camera from a practical standpoint, but because of the reasons described in point 1 I haven’t bought one yet. There is an app, or a mode of use, that makes it possible to take about 6 dynamic shots with that camera (I think), but then the question is whether the shots are far enough apart, i.e. if you have 6 16-bit DNG files with 1EV difference you are basically covering 7EV of DR. It’s still better than nothing though. One note though: you can’t do night scenes that way, I’m sure of it.
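A rough sanity check of that bracketing arithmetic, assuming (as above, and it is only an assumption) about 2EV of usable range per shot:

```python
def bracket_range_ev(shots, step_ev, per_shot_ev):
    # Spread between the darkest and brightest exposures, plus the
    # usable range of a single exposure.
    return (shots - 1) * step_ev + per_shot_ev

# 6 DNGs, 1 EV apart, ~2 EV usable each:
print(bracket_range_ev(6, 1, 2))  # 7 EV total, as estimated above
```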
By the way guys, did you know Dolby Vision makes a JPG that can record in [visual] HDR and nobody wants to pay the license to use it? Freaking corporations holding the whole world back by license alone. Pretty sure Apple iPhones have access to this, but I might be wrong; Apple prefers to use its own stuff. And! And! There’s a JPEG 2000 format (you probably saw it in Blender even) which is actually a more advanced version of JPEG… that nobody is using! At least in my tests, everything failed to open without issue. Or have all of you been using it without problem forever now and I’ve just been living under a rock?
Then maybe you da man to ask
Based on this, I use 441 as sun strength (4500K). For sRGB, my sun disk will clip to full white in reflection at EV -11.5. -11.5 + 20 = +8.5EV; that’s a lot of shadow detail. As a range for the camera, sure. But a probe having 24EV?
Sunny 16 gets me to -6.64, and everything lit by direct sunlight is properly exposed (although above I didn’t account for atmosphere). Are these numbers in the correct ballpark, according to you?
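The -6.64 figure falls out of the standard EV formula if you assume that Blender exposure 0 corresponds to EV 8 (that pivot is my assumption about the Photographer add-on’s mapping, not a documented fact):

```python
import math

def ev100(aperture, shutter_s):
    # Exposure value at ISO 100: log2(N^2 / t)
    return math.log2(aperture ** 2 / shutter_s)

def blender_exposure(ev, pivot_ev=8.0):
    # Assumed mapping: Blender exposure 0 == EV 8
    return pivot_ev - ev

ev = ev100(16, 1 / 100)                # Sunny 16 at ISO 100: ~14.64
print(round(blender_exposure(ev), 2))  # -6.64
```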
I also quite recently read an article or saw a video by some of the big players in development regarding sun hdri shooting. It was very informative, but I’ll be damned if I’m able to relocate it.
That depends on what lit the rest of the scene. In the link above, the comments state sun & sky at 441 and 29; I’m using 400/40 because it is easier to remember. He also divides by 100 to keep exposure at 0. Meaning that for an indoor fixture, i.e. a 470-lumen bulb: 470/666 = 0.7057, further divided by 100 = 0.0071. I don’t like such small numbers. 666: I can never remember the real conversion factor (683), but 666 is close enough and easy to remember.
What’s important to remember for sun & sky is that for an indoor scene, the sun usually contributes mostly indirect light (meaning noise) bouncing off the floor, whereas the sky comes through a full hemisphere in a window, providing a lot more direct light (less noise).
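The fixture arithmetic above, as a sketch (using the rounded 666 factor and the divide-by-100 exposure convention described earlier; both are conventions from this thread, not Blender defaults):

```python
def lumen_to_blender_watt(lumens, factor=666, scene_scale=1 / 100):
    # Divide the bulb's lumen rating by the lumen-to-watt factor,
    # then apply the same 1/100 scale used for sun & sky above.
    return lumens / factor * scene_scale

print(round(lumen_to_blender_watt(470), 4))  # ~0.0071 for a 470-lumen bulb
```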
I just watched the video, and I agree on everything. And the addon is great.
What EV value are you using? Photographer addon?
I don’t think the Sunny 16 rule works with my addon if you use 441 as the sun intensity. I will have to adjust my addon now that lamp intensities have been clearly communicated as Watts.
The problem is that watts aren’t enough to define light intensities; you need to know the luminous efficacy. Does Blender assume a 100% efficiency of 683 lumens/W? I have been asking Blender developers but never got a confirmation from Brecht.
Also, from captures I’ve done in the past, sun intensity is definitely around 100,000 - 120,000 lux, and if you convert that using Blendergrid’s first equation, then the sun intensity in Blender should be more around 160; that’s quite a big difference from 441…
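That conversion is quick to check; dividing illuminance by the 683 lm/W peak luminous efficacy (a simplification, since real sunlight is not a monochromatic 555 nm source) lands in the same ballpark:

```python
# Measured direct-sun illuminance quoted above, in lux:
for lux in (100_000, 120_000):
    print(round(lux / 683))  # ~146 to ~176 W/m^2, i.e. around 160
```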
I am a bit lost as to what to trust to make sure my addon is correct, and now I’m considering trusting LuxCore as the ground truth. I am also about to release something that would really benefit from getting these units correct, for good. I know William and Clement are also all for that, but I don’t know why nothing gets decided…
Yes, Photographer sets exposure to -6.64 (guessing it’s from 6 2/3) when doing Sunny 16. Going up 6.64 to 0 (approx office lighting) corresponds to a 1" shutter, and my darker-than-an-office living room shows up darker than an office. So I don’t think it is too far off.
For something I just tried to help out with; not my stuff. But here I apply Sunny 16 to characters, and in frontal sunlight they look “correctly exposed” (not perfect; the idea is that you can’t go completely wrong, compared to a camera light meter that can do all sorts of things, kinda). Looks about right to me:
For fixtures, it’s just messy. I have to create an analytical light in one room and do the lumen-via-683 scaling on it; and similarly, I don’t know if that’s an accurate conversion. Then adjust the mesh light in another room until they match. We can get weight using rigid body, but not surface area (that I know of).
And then there are Fresnel lamps. But for spots we can’t override the falloff when we reduce the cone angle. I get that it would be annoying having to deal with it for regular art stuff. Not that I ever use them practically, but it would be nice to have if the situation comes up.
683 (or my 666 :p) is, as far as I understand it, lumen to (Blender) watt. No efficiency/efficacy involved. Luminous efficacy is for dealing with light sources that don’t have a lumen rating given, like a 60W incandescent light bulb. But I could be wrong. I’m no expert in this, and it confuses the living daylights out of me. Who thought using watts could be confusing, when old light bulbs deal with a completely different kind of watt?
And of course, the manual doesn’t state anywhere how to set up a scene using physically correct values, or at least physically correct relations.
Maybe I am misunderstanding you, but it’s quite the opposite: lumens are now always communicated for light bulbs that you can buy, and are the only value you can trust to know how bright a bulb is. We shouldn’t expect the artist to do the math himself to find the correct value (even if it’s a simple division).
Watts only reflect how much electric energy is needed to turn on that light bulb, and can’t be used as a light intensity measure, as there are many different types of bulbs available commercially (tungsten, halogen, fluorescent and LED). An LED lamp is about 6 times more energy efficient than a tungsten bulb.
Blender exposure doesn’t mean much. It is EV, but the assumption that Blender exposure of 0 = Photographer EV of 8 is completely arbitrary. I hoped I wasn’t too far from the truth, but my tests did show that I was, unfortunately…
This I fully agree with, so I’m not sure how we’re opposite. Lumen is the brightness listed on the bulb or package itself. Watts (the Blender one) is also brightness; it’s just not the same kind of watts we read on a 60W bulb, which is the power consumption.
Yeah, I tried convincing the devs that the exposure should have meaning, but it was brushed off. See how well Filament does this with its physical camera. I also have a utility node group for exposure compensation, like in Filament, although I use it to maintain the exact same output no matter the exposure. Although now they complain that the content image is in Filmic when they want Standard.
Filament documentation is a gold mine. There is nothing groundbreaking there (render engines have been doing most of what they do for a long time), but it’s nice to see it clearly explained and written down.
@troy_s Can you please share these documents? I have trouble finding artist-friendly information about radiant flux and couldn’t find the discussion you are mentioning.
All recent PBR engines went for lux / lumen / nits; these are more understandable, well-documented and easy-to-measure units. Maybe they are all wrong not to use radiant flux then, but if we want Cycles to be easy for artists to use, it would be better to offer similar unit options.
When dealing with physically plausible light transport models, you can’t be using a perceptual system of measurement, and the above image demonstrates why. Despite being at equal energy output, none of the squares appear at the same perceptual energy level; green looks “brightest”, with red next, and blue quite “dark”.
Hence radiant flux is the only unit of measurement that matters in terms of describing the facets of a light transport model. If a PBR model is using a lumen, I wouldn’t be trusting the documentation nor model.
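The perceptual mismatch described above is easy to check numerically with the Rec.709/sRGB luminance weights (assuming that working space; the function is just an illustration, not anyone’s production code):

```python
def rel_luminance(r, g, b):
    # Rec.709 linear-light luminance weights
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

# Three patches at equal radiometric output, one per primary:
print(rel_luminance(1, 0, 0))  # 0.2126 -> red reads mid
print(rel_luminance(0, 1, 0))  # 0.7152 -> green reads brightest
print(rel_luminance(0, 0, 1))  # 0.0722 -> blue reads darkest
```

Equal energy in, three very different perceived brightnesses out, which is exactly why a perceptual unit is a shaky basis for describing light transport.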
We’re not on a spectral renderer, so my guess is that lighting with primaries isn’t a great idea in the first place.
Tell me which artists really understand any of this stuff?
Formulas use symbols I didn’t even know exist.
If I read 430lm on a bulb, I should be able to plug that number in somewhere. That’s a number that is readily available, understandable, and artist friendly.
But since you’re here, then: how do I correctly go from the lumen rating printed on the bulb (the only available data, other than measuring it) to watts? Here is the manual page for lights. Feel free to point out where in the manual a regular artist with no science background is supposed to figure out that watts is really about radiant flux, or joules per second (whatever that is), and not the wattage we know from incandescent bulbs. And that the readily available lumen rating on the bulb itself, the only source of information we have about it, just has to be ignored because it is false. And how to set the sun strength to get the correct relative factor between everything?
The three lights being fired into the scene that bounce out of the scene are primaries. They are of a given energy level and specific, unique colour.
I don’t know what an “artist” is, but I studied art and design to end up with a fine arts degree after five years. Every other person there studied technique and light study as well. There’s even a rumor that some hack from the Renaissance spent days in a morgue dissecting bodies to get an accurate representation of muscle tissue for his work…
You’ll end up with varying light energy level output if you rely on luminous flux versus radiant flux.
It’s not wattage in the same sense you are thinking. It’s a measurement of energy. The overall energy in your image depends on the relative exposure level.
We want to use real-life, absolute exposure levels. So how many watts does a common light bulb emit? A common TV? The sun during a bright day?
There is no documentation about this anywhere. You can’t say this is artist-friendly and then be unable to give reference values that we can input into Blender to get physically based results.