Arnold 5.0

Our beloved Cycles’ older sister, Arnold, is turning 5.0.

Couldn’t help noticing that many of the features remind me of what’s in Cycles.

  • Dithered Sampling (kinda reminds me of the denoiser; blue noise is somehow smarter than random sampling)
  • new Standard Shader, Volume Shader, Hair Shader
  • per-light AOVs
  • OSL (check)
  • VR Camera (check, has been in Blender for a while as panoramic cameras)

Overall it looked like a solid release (pun intended). I mean, if you are using Arnold, this must be quite a nice release. The better sampling tech, like the dithered sampling paper, might be cool for Cycles? But from what I’ve seen, the denoiser seems quite far advanced already.

Unfortunately, Blue-noise Dithered Sampling still has some issues in Cycles: https://developer.blender.org/D2149

Yeah, but it’s nice that it’s Lukas on it; it will probably get implemented sooner or later.

“Multi-threaded maketx”: I know I’ve mentioned this on Twitter, and I think Blender does this under the hood? Converting PNGs and JPEGs to .tx files when rendering? (Maybe someone knows and can refresh my memory.)

I recently saw a Cinema 4D to Arnold video which shows texture files being processed and saved as .tx files on the file system. They got significant speed improvements from it.

Around the ~3:00 min mark.

Pretty funny to watch, if not for the voice-over.

AOVs have been in Arnold for years; in v5 they introduce per-light-group AOVs. Cycles has neither light groups nor AOVs at this point (well, there’s a patch…).

Anything other than sampling based on true, wild, brute randomness will produce low-frequency and shadow-flickering artifacts in difficult areas. Calling it sophisticated names does not change this fact. And there are no caustic components in that presentation, even though they could play a significant role.

No, Blender does not convert. It will read from .tx files, but the benefits of that apply only when rendering in OSL mode.

It’s worth pointing out that not all blue-noise dithering is created equal. Their dithering pattern/matrix is incredibly complex, and it will likely be a long time (if ever) before we get the results they’re getting with a single sample. And shutting it down as just a “sophisticated name” is silly without knowing what they’re doing under the hood. I’ve been using Arnold 5 since release, and I can say that their new sampling is a game changer. Preview rendering for scene setup is insanely fast now, and at higher sample rates there is absolutely no additional low-frequency variance. I’m dying to see it paired with modern denoising, since the perceptual nature of modern algorithms and the perceptual cleanliness of low-sample blue-noise dithering should pair very nicely.
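For anyone curious what “dithered sampling” actually does: the core idea in the Solid Angle paper is that every pixel shares one low-discrepancy sequence, decorrelated per pixel by a toroidal shift (a Cranley–Patterson rotation) read from a small dither mask. Here is a minimal, hypothetical Python sketch of just that mechanism; a real implementation would use an optimized blue-noise mask, while a plain random mask stands in here:

```python
import random

def van_der_corput(n, base=2):
    """A simple radical-inverse low-discrepancy sequence in [0, 1)."""
    seq = []
    for i in range(n):
        v, denom, x = i, 1, 0.0
        while v:
            denom *= base
            v, r = divmod(v, base)
            x += r / denom
        seq.append(x)
    return seq

def cranley_patterson(base_samples, shift):
    """Toroidally shift a 1D sample sequence by a per-pixel mask value."""
    return [(s + shift) % 1.0 for s in base_samples]

# Hypothetical per-pixel dither mask: random here, blue-noise in the paper.
W = H = 4
mask = [[random.random() for _ in range(W)] for _ in range(H)]

base = van_der_corput(8)
# Each pixel reuses the same base sequence, decorrelated by its mask value,
# so neighboring pixels' error patterns differ like the mask does.
pixel_samples = {
    (x, y): cranley_patterson(base, mask[y][x])
    for y in range(H) for x in range(W)
}
```

The point of the blue-noise mask (as opposed to the random one used above) is that the per-pixel error then has mostly high-frequency structure, which our eyes, and presumably denoisers, forgive much more readily.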

The problem with the good blue-noise algorithms is that they’re heavily patented. One thing I did think about, though: if I’m correct from the last time I read up on them, these blue-noise algorithms are only used to generate a small, repeatable screen-space sample-mask texture. Maybe one way around this would be to output one of the better blue-noise implementations to a lossless texture format and import that into Blender, to get around the code patents.
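On the “export it as a lossless texture” idea: the mask itself is just a small grid of normalized values, so any lossless grayscale format would do. A hypothetical sketch writing such a mask as a 16-bit ASCII PGM (the filename and format choice are illustrative; as I understand it, the patents would cover the generation algorithm, not the stored data):

```python
def write_pgm(path, mask):
    """Write a 2D grid of values in [0, 1] as a lossless 16-bit ASCII PGM."""
    h, w = len(mask), len(mask[0])
    with open(path, "w") as f:
        # PGM header: magic number, dimensions, maximum gray value.
        f.write(f"P2\n{w} {h}\n65535\n")
        for row in mask:
            # Quantize each value to the full 16-bit range.
            f.write(" ".join(str(round(v * 65535)) for v in row) + "\n")
```

Sixteen bits should be plenty of precision for a sample-offset mask, and PGM is trivial to parse on the importing side.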

I did just find this on GitHub, though, which is based on the Solid Angle paper:

Might be worth pointing Lukas in this direction if he hasn’t found it already.

@3DLuver it can’t hurt, so I commented it on the diff patch for Lukas.

Looks like that dude Josh is working at Framestore as a shader developer, which lends some credibility that it might be useful code.

The two main problems I see with it are:

  • It just runs straight into the first local minimum because it doesn’t do simulated annealing: it rejects every swap that increases the total energy.
  • It recalculates the energy of the entire system after every swap, which is extremely wasteful. Due to the local nature of the energy term used, only a handful of terms have to be recalculated. Doing so helps reduce the computational complexity a lot (my tool used to perform billions of swaps, IIRC, while his tool currently takes 30 min to do 4096 iterations), as well as reducing numerical issues.
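To illustrate both points, here’s a toy Python sketch of swap-based mask optimization on a 1D mask with a made-up local energy term (not the Gaussian energy from the actual paper): it does proper simulated annealing, occasionally accepting uphill swaps, and recomputes only the energy terms touching the two swapped entries instead of the whole system:

```python
import math
import random

def pair_energy(a, b):
    # Toy local term: penalize similar neighboring values. A stand-in for
    # the real (Gaussian) energy used in blue-noise mask optimization.
    return math.exp(-abs(a - b))

def local_energy(mask, i):
    # Only the pairs touching position i contribute (wrap-around neighbors,
    # so the mask stays tileable). This is what makes incremental
    # recalculation cheap compared to re-summing the whole system.
    n = len(mask)
    return (pair_energy(mask[i], mask[(i - 1) % n]) +
            pair_energy(mask[i], mask[(i + 1) % n]))

def anneal(mask, swaps=20000, t0=1.0, cooling=0.9995):
    n, t = len(mask), t0
    for _ in range(swaps):
        i, j = random.randrange(n), random.randrange(n)
        before = local_energy(mask, i) + local_energy(mask, j)
        mask[i], mask[j] = mask[j], mask[i]
        after = local_energy(mask, i) + local_energy(mask, j)
        delta = after - before  # only the affected terms, not the full sum
        # (for brevity this double-counts the shared pair when i, j are
        # adjacent; the sign of delta is what drives acceptance here)
        if delta > 0 and random.random() >= math.exp(-delta / t):
            mask[i], mask[j] = mask[j], mask[i]  # uphill swap rejected
        t *= cooling  # cool down so late iterations become greedy
    return mask
```

The Boltzmann acceptance test (`exp(-delta / t)`) is the part the GitHub tool skips, which is why it parks in the first local minimum it finds.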

Or maybe not
https://developer.blender.org/D2149#62301

Lukas highlights a couple of major issues compared to the patch he is working on, so I’m afraid the code there is a no-go for his patch.

Anything other than sampling based on true, wild, brute randomness will produce low-frequency and shadow-flickering artifacts in difficult areas. Calling it sophisticated names does not change this fact. And there are no caustic components in that presentation, even though they could play a significant role.

Care to post examples of where true randomness shines compared to algorithms developed by some of the best in the industry? Also, Arnold can do caustics; they just require a lot of samples, just like in Cycles.