Using camera distance to scale texture?

I currently have a noise texture whose scale is set to the reciprocal of the distance from the camera, using the Camera Data node. The noise texture is intended to simulate a fine bump and specular detail on roughened plastic. The problem I was encountering was that the detail kept getting lost in a way that looked unrealistic when rendered at a distance.

My question is whether this method is advisable for maintaining fine detail. Will I encounter problems if I rely on this technique, and if so, under what circumstances (e.g., in animation)?

Any thought on this?

Can you show the node setup in Cycles?

If the distance is great, then it takes a long time to render while adding no detail at all.
If we had something like LOD for Cycles, it would be faster.

Show us some renders at different distances.

happy cl

If I am understanding your concern: yes, if the distance were positively correlated with scale, then distance would add render time, as the texture scale would approach infinity. That wouldn’t help me much, though. What I need is a texture scale that shrinks with distance (down to a minimum offset), so the detail gets “bigger”.

The formula for the node is as follows:

s = (1/d) + k


s = scale
d = distance
k = offset (minimum scale)
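For anyone who wants to play with the numbers before wiring up nodes, here is a minimal plain-Python sketch of the same formula (the function name and the example distances are just illustration, not part of the node group):

```python
# Sketch of the "Camera Scaler" formula: s = (1/d) + k.
# Near the camera, s is large (finer noise features);
# far away, s settles at the offset k (coarser, "bigger" features).

def camera_scale(distance, offset=4.0):
    """Texture scale that shrinks toward `offset` as the camera recedes."""
    # Guard against division by zero if the camera sits on the object.
    d = max(distance, 1e-6)
    return 1.0 / d + offset

print(camera_scale(0.5))    # close to camera -> 6.0
print(camera_scale(100.0))  # far from camera -> 4.01
```

Remember that in the Noise Texture node a larger Scale value means smaller noise features, so a big `s` up close gives you the fine detail, and the offset `k` sets the floor that distant shots settle into.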

Node graph of the “Camera Scaler” group. Note the “Noise texture” node on the parent graph.

Offset arbitrarily set to 4.0. Y = scale factor, X = distance of the object from the camera.

Not certain if this is useful!

But don’t forget to change your render resolution too!
You may need a very high value to see the effect!

Also, the bigger the scale is, the smaller the noise is!

happy cl

edit - doesn’t matter

I haven’t tried it this way, but a long time ago, in other 3D software, we used the opposite trick: reducing detail with distance to help antialiasing deal with moiré patterns. It’s been about 20 years since then, but I think that’s how it was supposed to work. Why not render out a quick animation to see how it looks?

If you can create a driver that measures the distance between the camera and your target point, you could use it to drive the size of an Empty, which can then serve as the Object mapping for the main object’s texture. Here is the simple node setup for your main object (this assumes you have an Empty, probably parented to your object):

When the Empty is small:

When the Empty is larger:

People who are better at Python than I am can figure out the right way to code the driver. There are snippets online for calculating distance in Blender. But ultimately, this can scale your noise (or any other scalable texture) pretty easily as the distance changes. I love the new Object mapping feature!
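For reference, a rough sketch of one way such a driver might be set up from the Python console, using Blender's built-in "Distance" driver variable. The object names (`"Empty"`, `"Camera"`, `"Target"`) and the offset value are assumptions; adapt them to your scene:

```python
import bpy

# Sketch: drive the mapping Empty's scale from the camera-to-target distance.
empty = bpy.data.objects["Empty"]    # the Empty used for Object mapping (assumed name)
cam = bpy.data.objects["Camera"]     # scene camera (assumed name)
target = bpy.data.objects["Target"]  # the object being textured (assumed name)

for axis in range(3):  # X, Y, Z scale
    fcurve = empty.driver_add("scale", axis)
    drv = fcurve.driver
    drv.type = 'SCRIPTED'
    var = drv.variables.new()
    var.name = "d"
    var.type = 'LOC_DIFF'  # built-in "Distance" variable between two targets
    var.targets[0].id = cam
    var.targets[1].id = target
    # Same idea as the formula above: s = 1/d + k, with k = 4.0
    drv.expression = "1.0 / max(d, 0.001) + 4.0"
```

This only runs inside Blender, and I haven't battle-tested it, so treat it as a starting point rather than a finished driver.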

The catch is that your noise is going to look a bit “swimmy” as it scales up and down.

It’s like when you were doing a mega-zoom in on Earth: you’d have to render three or more versions with different levels of texture detail, then fade between them as you came closer to the Earth, increasing detail with each new version.