I scaled my object up so it fits better with the HDRI background, but now I don't see my object anymore. I'm new to using this kind of background and found that with different backgrounds my object was tiny (a little bigger than a strand of grass from my HDRI), and after advice from BA I scaled the object up. When I do that and render, the object is no longer visible.
I take it my camera settings are wrong, but I haven't a clue where to begin to correct this.
When you prepare the .blend for upload, just enabling Compress makes the file much smaller: 22 MB -> 12 MB in this one. But you don't have to include everything, as long as it demonstrates the problem.
There is a clipping range, and objects outside it aren't shown. By default the range is different for the viewport and for the render camera (Clip Start/End in the camera settings). The objects in your file are outside the render camera's clipping range, and that's why they don't show up; increase Clip End so they fit inside it.
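If you prefer to set this from a script, here is a minimal sketch. It only runs inside Blender (the `bpy` module isn't available in plain Python), and the specific clip values are assumptions you'd tune to your scene:

```python
import bpy  # only available inside Blender

# Render camera: raise Clip End so distant, scaled-up objects still render.
cam = bpy.context.scene.camera.data   # data-block of the active camera
cam.clip_start = 0.1                  # keep well above 0 for depth precision
cam.clip_end = 10000.0                # large enough for an HDRI-scale scene

# The 3D viewport has its own, separate clip range.
for area in bpy.context.screen.areas:
    if area.type == 'VIEW_3D':
        space = area.spaces.active
        space.clip_start = 0.1
        space.clip_end = 10000.0
```

Note the viewport and the camera are configured independently, which is exactly why a scene can look fine in the viewport and come out empty in the render.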
If you must know, the clipping settings are there for adjusting Z-buffer precision, as in how much precision is available for calculating what distance the polygons are at in 3D space. If you have surfaces overlapping each other, or too little precision to tell the distances of two surfaces apart, you get artifacts; the effect is called Z-fighting. The clipping values are there so you can adjust the precision to the scale of the scene.
12.040 Depth buffering seems to work, but polygons seem to bleed through polygons that are in front of them. What’s going on?
You may have configured your zNear and zFar clipping planes in a way that severely limits your depth buffer precision. Generally, this is caused by a zNear clipping plane value that’s too close to 0.0. As the zNear clipping plane is set increasingly closer to 0.0, the effective precision of the depth buffer decreases dramatically. Moving the zFar clipping plane further away from the eye always has a negative impact on depth buffer precision, but it’s not one as dramatic as moving the zNear clipping plane.
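To see why zNear dominates, you can compute the world-space distance covered by one depth-buffer step for a standard perspective projection. A small sketch (not Blender code; a 24-bit depth buffer is assumed):

```python
def depth_step(z, near, far, bits=24):
    """World-space distance spanned by one depth-buffer increment at depth z.

    Window depth for a perspective projection is
        d(z) = far * (z - near) / (z * (far - near)),
    so d'(z) = far * near / (z**2 * (far - near)), and one LSB
    (1 / 2**bits) of the buffer corresponds to a world-space step
    of (1 LSB) / d'(z).
    """
    lsb = 1.0 / (2 ** bits)
    return lsb * z * z * (far - near) / (far * near)

# Precision at depth 100 with a sane near plane...
ok = depth_step(100.0, near=0.1, far=10000.0)
# ...and with near pushed toward zero: resolution collapses ~100x.
bad = depth_step(100.0, near=0.001, far=10000.0)
print(ok, bad)
```

Moving `near` from 0.1 to 0.001 makes each depth step about a hundred times coarser at the same distance, while moving `far` out by the same factor barely changes it, which is the asymmetry the FAQ describes.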
Speaking of viewing range, a bit of trivia. As the distance increases, the human eye loses detail rapidly, and even colors and values get distorted because of aerial perspective. If the viewer is standing on Earth at ground level, the 5 km clipping distance set in the Stack Exchange link is plenty, because that is farther than the horizon for most people. The distance to the horizon is: distance (in km) = 3.57 * sqrt(height (in meters)), so a 5 km horizon corresponds to an eye height of about 2 meters.
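That back-of-the-envelope formula is easy to check in a few lines (3.57 is the usual geometric-horizon coefficient for Earth's radius, ignoring refraction):

```python
import math

def horizon_km(eye_height_m):
    """Distance to the geometric horizon in km for an observer whose
    eyes are eye_height_m meters above the ground: d = 3.57 * sqrt(h)."""
    return 3.57 * math.sqrt(eye_height_m)

print(horizon_km(2.0))   # ~5.05 km: a 5 km clip end reaches past the horizon
print(horizon_km(1.6))   # ~4.52 km for a shorter eye height
```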
I don’t know what you’re talking about. The uploaded scene didn’t have metric units set but had the default Blender units. When changing to metric, 1 Blender unit = 1 meter, with a unit scale of 1. The objects have unapplied, non-uniform scale though, so the mesh dimensions are way different from what you’re looking at in the viewport. Ctrl+A -> Scale applies the object scale for the selected objects; that changes the mesh dimensions to match the object dimensions, and the object scale goes back to (1, 1, 1).
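The same apply-scale step can be done from a script. Again a Blender-only sketch (`bpy` isn't importable outside Blender), operating on whatever objects are currently selected:

```python
import bpy  # only available inside Blender

# Equivalent of Ctrl+A -> Scale: bake each selected object's scale into
# its mesh data. Dimensions stay the same; object scale returns to (1, 1, 1).
bpy.ops.object.transform_apply(location=False, rotation=False, scale=True)

for obj in bpy.context.selected_objects:
    print(obj.name, obj.scale[:])  # all (1.0, 1.0, 1.0) after applying
```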