I'm working on a super large scale scene with Cycles.
The position of the shadow (cast by the Cycles sun) is shifted, and I have no idea which part is wrong. Can anyone help?
I’ll attach my file here
pengyu_cycls_shadow_shifted.blend (906.8 KB)
It's because you have an abnormally large scene (many times larger than the Earth).
All the calculations are being thrown off.
You have objects all over the place; move them back to the center instead of hundreds of thousands of miles away in random spots. Once I moved everything to a place that made sense and got the dimensions under control, it all worked fine.
Dang, I didn't check the dimensions of that object; I didn't realize it was that far off the charts. Thanks for the help!!
Another question though: how do I make Cycles work with a large scale scene? My original one covers roughly a 50x50 km region; I'm working on a full scale city design.
I wouldn’t make it that big.
I would scale everything down by a factor of 10 or even more. So instead of 50 km do 5 km, and just make sure all the things you put in the city are scaled accordingly. Unless you're doing water/smoke simulations and need them super realistic, there is no reason to have a scene that big.
It comes down to floating point precision. There's a limit to how much data Blender can use to describe a location in space. You CAN work at larger scales (50 km, or even 50 ly), but you have to trade off precision at the smaller scale. You do this with the unit scale setting on the scene properties tab. Increasing this value to something like 100 would allow you to work at your desired scale. However, don't expect sub-mm precision. Conversely, you can decrease the unit scale to work at the microscopic level, but you then lose precision at large scale.
So like icyou520 said, don’t go that big. Or have a separate scene for the large and another for the small details.
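For reference, you can reach the same setting from Python. A minimal sketch (the value of 100 is just the example from above, not a recommendation; pick whatever trade-off suits your scene):

```python
# Minimal sketch: raise the unit scale so 1 Blender unit stands for 100 m,
# which buys room for a ~50 km scene at the cost of sub-mm precision.
import bpy

settings = bpy.context.scene.unit_settings
settings.system = 'METRIC'
settings.scale_length = 100.0  # 1 Blender unit now displays as 100 m
```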
I shrank the whole scene down to 1/100th of the original, from 50x50 km to 500x500 m, so the 1.8 m dude is now 1.8 cm. The shadow shift still happens when an object is too far from the center, even though it's now at the 500 m scale level.
yeah, well explained.
Are you applying scale (Ctrl+A) to all your objects?
Any time you drastically change the scale of an object, you need to apply the scale afterwards.
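If there are a lot of objects, something along these lines should do it from the Python console (a minimal sketch, assuming you're in Object Mode and everything you want to touch is selectable):

```python
# Minimal sketch: select everything and apply scale only,
# leaving each object's location and rotation untouched.
import bpy

bpy.ops.object.select_all(action='SELECT')
bpy.ops.object.transform_apply(location=False, rotation=False, scale=True)
```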
I did apply the scale. Cycles just can't handle this level of scale precision atm. I understand it now… thanks for the help!!
This wouldn't be an issue if Blender supported double-precision values, but it's unlikely to happen, as the average machine would need more time and significantly more memory to compute them.
There are ways to ‘fake’ long distances by using compositing and sleight-of-hand techniques like forced perspective.
But MAN would that satisfy my inner OCD!
There’s even a type of number called a quad, believe it or not, with enough precision to make a decent recreation of the known universe.
That is, if you’re willing to accept your render engine devouring tons of your RAM and outputting the image slowly. The tricks described above will have to do for now.
Thank you guys very much @anon12133251. I marked @cgCody 's explanation as the best solution, since other fellas might stumble upon a similar issue and can go straight to the more technical explanation.
You mean I can align my vertices with sub-atomic precision? Shut up and take my money! lol
You are welcome Pengyu. I’d say @anon12133251 answered your question. I just added a little more context.
Perhaps the biggest issue currently is that Nvidia, in recent generations of their video cards, has decided double precision is a “pro” feature, so consumer cards have no hardware acceleration for DP and DP calculations in CUDA end up 24-32 times slower than SP. If Cycles switched to 64-bit double-precision floats, you would lose Cycles GPU compute on consumer Nvidia cards. If AMD starts offering cheap DP performance, then maybe Nvidia will have to change its ways.
Single precision basically limits your 3D universe to 33,554,433 “places” in each dimension, so take the maximum dimension based on your unit scale and divide by 32M, and that gives you some idea of the smallest distance that can exist between two separate points. When that becomes significant, the math starts to go pear-shaped, especially in rendering.
So in a cubical volume 2 km on a side (+/- 1 km from the origin), you get something like 16 places per mm where you can put a vertex, so you can do pretty finely detailed modeling and never notice. If you want to work on a sphere the size of Earth, you only have places to put vertices about every 1/3 of a meter through space. If you're modeling large structures to be seen from a distance, then maybe that's not horrible, but try modeling a 40 cm teapot sitting on the ground and you'll have a bad day.
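If you want to sanity-check those numbers yourself, here's a rough sketch in plain Python with NumPy (it assumes a unit scale of 1, i.e. 1 Blender unit = 1 m, and just prints the raw float32 spacing, so the values may differ a little from the estimates above):

```python
# Print the smallest step single precision can represent at a few
# distances from the origin: 1 m, 1 km, 25 km, and roughly Earth's radius.
import numpy as np

for dist_m in (1.0, 1_000.0, 25_000.0, 6_371_000.0):
    gap = np.spacing(np.float32(dist_m))  # gap between adjacent float32 values there
    print(f"float32 spacing at {dist_m:>12,.0f} m from the origin: {gap * 1000:.4g} mm")
```

At around 1 km you get a gap of roughly 0.06 mm (about 16 places per mm, as above), at 25 km it's already a couple of millimeters, and at Earth-radius distances it's on the order of half a meter.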
And things like procedural textures become a pain to work with as you end up having to multiply everything by the scene’s unit scale, etc.
As usual, all of the above is subject to being nonsense if I messed up the math, which is entirely possible.