I am having trouble understanding the hype surrounding the Arnold renderer

I would like to post this here just in case I am missing something. I have had a chance to play with this particular render engine before, and recently, with some excitement, I was able to sit in front of a computer running an updated version in Maya 2014.

After which, I began to wonder about a few things… :confused:

I made a little red cow and coated it with red car paint. I set the cow atop a plane and threw in an HDRI background. I made the scene identical in both software packages, which included fidgeting with Maya's color space and Arnold's gamma. After I was satisfied that both renders looked the same in both color and lighting (to the naked eye), I began rendering the scene.

First off, Arnold's and Cycles' photorealism seems identical. I even played with the skin shader in Arnold that everyone seems to love and compared it to the Cycles "simple skin shader" that I recently got off of CG Cookie. Again they looked about the same. Practically, if not completely… identical. Hair was a bit trickier because I seem to know how to do it better in Blender than in Maya. But with the new hair node it seems that Cycles can do hair as well as V-Ray or anything else that is not Hair Farm.

But what really got me was the render times. Cycles was faster than Arnold while running on the CPU, and by no small margin either. On the GPU it was much faster than Arnold. This surprised me a great deal, because you only ever hear people talking about how fast and amazing Arnold is. Well, it might be when compared to a renderer like Maxwell, but compared to Cycles this just does not seem to be the case at all.

Note: I was running Arnold in the range of 4-5 diffuse, 4-5 spec, and 6 AA samples at 1920 x 1080 (which I think are very reasonable low- to mid-range settings) in order to get rid of the noise. I did increase the samples for the sun, but I admit that I do not know how to set the equivalent of multiple importance sampling in Arnold/Maya.

My Observation:

The only benefits that I saw in Arnold were: 1. it did not create as many fireflies (but the renders took a lot longer), and 2. there were no shading issues like there are inside of Blender. But that is more of a Maya thing than an Arnold thing. Or perhaps I should say it is more of a Blender thing, because I do not seem to run into shading problems in other packages.

So I am not posting this as a "Cycles is so awesome and better than Arnold" thread; I am actually asking if I missed something. Or is it just that people are so fed up with V-Ray's complexity that Arnold's simplicity seems like a godsend to them? Is it perhaps that Arnold is simply fantastic for the type of renderer that it is (I think it's unbiased Monte Carlo or something?) but, when compared to other render engines, be it Cycles, Octane, or whatever… it is still a bit on the slower side?

I am just not “getting it” apparently. So could someone here educate me? :o

You’re never going to see the benefit of Arnold until you’re using it in production. Start layering materials, instancing objects, using hundreds of lights, using volumes, heavy displacement, motion blur, and depth of field. Then render it all on a 64-thread machine. Then try the exact same scene with Cycles, VRay, or pretty much anything else on the same machine. Then try to take it all into Katana ;).

You’ll very quickly see where Arnold shines. The average hobbyist doesn’t need what Arnold offers. Studios haven’t paid millions of dollars to switch from Renderman to Arnold just because it’s fun.

So basically you have to be working with scenes so big that they would make Blender, as well as most every other rendering package on your basic $2,500 - $3,000 (medium- to high-end) computer, choke and crash in order to see its benefits?

People should probably mention that when they praise Arnold’s awesomeness :spin:

Anyway, thank you for explaining.

Your little red cow is not a good benchmark scene. Load up several billion polys, no instancing, then compare render times. Remember the huge habitable torus in orbit from Elysium? I remember reading somewhere that it was a couple trillion polys with instancing, a few billion unique. No idea how much RAM the farm had, but the fact they rendered it at all is a miracle.

Well, that was a bit of an exaggeration, but honestly your test case was pretty terrible for showing off what a path tracer like Arnold can do.

As far as the cgcookie skin shader being similar, well… I created it to more or less BE the Arnold skin shader, so that’s nice to hear :wink:

Actually, I was referring to the one put together by Kent Trammell in the "Skin Shading with Cycles" tutorial. Your Arnold port was also featured in that video, but that is not the one I was using at the time. That is why I referred to it as the "simple" shader, as that is what it was named in that tutorial. Yours is referred to as an "Arnold port" if I am not mistaken.

I have major doubts that my $2,500, 16 GB RAM MainGear machine could handle the scene that you described while running Cinema 4D, Blender, or Modo. Maya I don't know; I never tried. But I don't think that I am exaggerating too much, based upon the requirements that you listed. If that type of scene is indeed what is needed in order to truly see "what Arnold can do", then unless you are making a production-length feature film with over-the-top special effects shots, Arnold is just another rendering tool in a person's potential arsenal.

The fact of the matter is, no one ever mentions that when rendering basic, small, run-of-the-mill scenes… Arnold is slow. Scenes like a couple of cars in front of a backdrop, a city street, or a few spaceships over a planet. From what I have seen, I think it can be said that Arnold is magnificent in its intended role, but it should not be inferred that it is magnificent among all other render engines, because it's just… well… not.

Now, I’m not going to do this for everyone, so nobody expect personalized support, mmkay? :wink:

I'm one of the core software engineers on Arnold, and I spend time with major VFX studios regularly, helping them optimize their shots and get render times down with Arnold (but most of my time is spent on adding features, fixing bugs, and optimizing the Arnold core itself). Your sample settings immediately jumped out at me. You have to understand how sample settings work in Arnold, and then you'll realize where you went wrong.

Just about every sample setting in Arnold is "squared", meaning if you set your AA samples to 6, you will get 6*6=36 pixel samples. Same for diffuse GI samples, glossy, per-light samples, etc. Arnold also "splits" rays at the first hit, meaning it's not a pure path tracer. This is a bit of a simplification, but roughly what happens is that on the first hit (from a pixel sample ray) it will spawn diffuse rays, glossy rays, and light (shadow) rays for each light, all based on the (squared) sample settings you set. After those bounces, each hit will then scale back to 1 ray each for all of those (one diffuse, one glossy, one sample per light, etc.) until the ray depths are hit, so that you don't have an exponential explosion in the number of rays traced.

So, by setting your samples to 6 AA, and 5 diffuse and 5 glossy/spec samples, that means in reality you were tracing 6*6 * (5*5 [dif] + 5*5 [glossy]) = 1800 glossy and diffuse GI rays per pixel (assuming your ray depths were set to 1). If your light samples were set higher than one, well, square those and add them in to get an idea of the shadow rays. That's a crap ton of rays. No wonder it was slow. :cool:
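To make that arithmetic easy to play with, here is a tiny sketch of the same calculation. It is not Arnold code, just my own back-of-the-envelope helper (the function name is made up); it assumes ray depth 1 and ignores light/shadow rays:

```python
# Rough ray-count helper illustrating Arnold's "squared" sample settings.
# Not part of Arnold or MtoA; purely illustrative.

def first_bounce_rays_per_pixel(aa, diffuse, glossy):
    """Approximate diffuse + glossy GI rays traced per pixel.

    Every sample setting is squared, and the squared GI samples are
    spawned once per camera (AA) sample at the first hit.
    """
    pixel_samples = aa * aa                                # 6 AA -> 36 camera rays
    gi_rays_per_hit = diffuse * diffuse + glossy * glossy  # 25 + 25 = 50
    return pixel_samples * gi_rays_per_hit

print(first_bounce_rays_per_pixel(aa=6, diffuse=5, glossy=5))  # 1800
```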

There is an art to reducing noise with these sample knobs. My philosophy is this: turn on a few useful AOVs, like indirect diffuse, indirect glossy, etc., so you can see how noisy each of those components is. Render at least once with all GI and motion blur turned off to see how noisy your direct lighting is. What you want to do is hold your AA samples still (say, 2 or 4 or something) and set all the other samples to one. Turn each of those other samples up (GI samples for diffuse and/or glossy), and adjust your light samples up a bit at a time, until all of those AOVs come back roughly equally noisy. As long as none of them is much noisier than the others, you've got nicely balanced settings. Then increase your AA samples until the noise is acceptable for a final render.
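To make "roughly equally noisy" a bit more concrete, here is a rough sketch (my own helper, not part of Arnold or MtoA) of one way to compare AOV noise levels, assuming you have loaded each AOV as a float NumPy array (for example from EXRs):

```python
import numpy as np

def noise_estimate(img):
    """Crude noise proxy: mean absolute difference between horizontally
    adjacent pixels, normalised by mean brightness."""
    img = np.asarray(img, dtype=np.float64)
    return np.abs(np.diff(img, axis=1)).mean() / (img.mean() + 1e-8)

def rank_aovs(aovs):
    """aovs: dict mapping AOV name -> image array (e.g. indirect_diffuse,
    indirect_specular, direct lighting). Prints noisiest first."""
    scores = {name: noise_estimate(img) for name, img in aovs.items()}
    for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
        print("%-20s noise ~ %.4f" % (name, score))
    # Bump the sample setting feeding the noisiest AOV first, keeping AA
    # fixed, until the scores come back roughly level.
```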

When you have motion blur, the only way to get rid of that noise is to increase AA samples. If you find that you have to increase those a lot (to say, 8-12 AA samples), then usually you can compensate by reducing the diffuse, glossy, light, etc. samples. Roughly speaking, a render with 1 AA sample and 5 diffuse GI samples will be fairly close in GI noise to one with 5 AA samples and 1 diffuse GI sample. So as you increase AA samples you can generally get away with reducing the other samples.
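The squared-samples rule is what makes those two settings land on the same GI ray budget; a quick back-of-the-envelope check (my own arithmetic, not Arnold output):

```python
# First-bounce diffuse GI rays per pixel (AA^2 * diffuse^2):
print(1 * 1 * (5 * 5))  # 1 AA, 5 diffuse samples -> 25 rays
print(5 * 5 * (1 * 1))  # 5 AA, 1 diffuse sample  -> 25 rays
# Same diffuse budget, but the 5 AA version also takes 25x the pixel
# samples, which is what actually cleans up motion blur and aliasing.
```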

My take? Set your AA samples to 6, set glossy to 1 or 2, set diffuse to 1, and set your light samples to 1 or 2. See how she flies. Increase AA samples to 8 if it's still a touch noisy; or, if you find the indirect_diffuse and/or indirect_specular AOVs are a bit noisy, increase the diffuse and/or glossy GI samples by one. I'll bet your render times will be much more manageable.
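If it helps anyone following along in Maya, those knobs can also be set from the Script Editor instead of the render settings UI. This is only a sketch; the attribute names are my assumption based on the MtoA builds of that era, so check the defaultArnoldRenderOptions node in your install if they differ:

```python
# Run in Maya's Script Editor with the Arnold (MtoA) plug-in loaded.
# Attribute names below are assumptions; verify against your own
# defaultArnoldRenderOptions node.
import maya.cmds as cmds

cmds.setAttr("defaultArnoldRenderOptions.AASamples", 6)         # camera (AA) samples
cmds.setAttr("defaultArnoldRenderOptions.GIDiffuseSamples", 1)  # diffuse GI samples
cmds.setAttr("defaultArnoldRenderOptions.GIGlossySamples", 2)   # glossy/spec GI samples

# Per-light samples live on each light shape, e.g. (hypothetical light name):
# cmds.setAttr("sunShape.aiSamples", 2)
```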

Happy rendering!

I was at Whiskytree helping them optimize renders on Elysium. :slight_smile: One of the lead guys there showed us a log where they had 400 million unique triangles (so, actual triangle data) and 4.5 trillion visible triangles (so, with instancing). I almost didn't believe them until they showed me the logs and I saw for myself. By the way, this was rendering in about 20 GB of RAM, if I recall. I don't remember the render time per frame, but it wasn't totally horrible (not blazing fast either, but what are you going to do when you have that much data?).

But yeah, there were some recent tests that a couple of our users did that showed render times for some (almost :P) production-like scenes. With low poly counts, some of the other renderers were faster than Arnold (Cycles, V-Ray, Clarisse?, I can't remember them all), but there was an inflection point where, at so many tens of millions of triangles, Arnold passed them up and began to really smoke them. In fact, Arnold showed the classic logarithmic curve in render time as you throw more polygons at it, while the other renderers' times increased linearly. Arnold actually bears out the theoretical scaling of ray tracing, while the others fall behind.

When you throw GBs of texture data, millions of polygons, millions of curves, complex shading networks, and deferred procedurals at it, Arnold pulls away pretty quickly in terms of performance. But support for all of those features comes at a cost, which manifests as overhead for simple scenes. We've been continuously eliminating those overheads, though, so in many ways we're narrowing the gap. The next release of the Arnold core is going to have some pretty nice performance improvements for everything from displacement, to light sampling, to volume rendering, to scene processing/startup time.

I’m willing to say, Jon Smith, that if you tweak your sample settings like I mentioned, and then also try out the next release, you’ll be pleasantly surprised. For really simple scenes it may not actually beat Cycles, but I’ll bet that with those tuned sample settings you could get pretty darn close. And with more complex scenes it would scale up quite well.

@Mike

Thanks for the input!

I’m not even an arnold user, but that was an incredibly helpful post.

I also think that Cycles needs more time in the cooker before it can claim to compete on a production level. It's a fast, stable renderer so far, but it is still missing a lot of important features. On the simpler side, things that the independent freelancer or small studio could really use are OpenSubdiv support, Ptex, displacement, Alembic, etc. A feature I'm personally dying to see is some kind of cached texture format support (or is such a thing already in there and I missed it?). Once you're throwing gigs of textures at a scene, it would be simply awesome if Cycles didn't have to hold all of those textures in RAM at once.

It’s little things like that that Cycles needs before it can really jump into the competition and compete head-to-head. In the meantime though, I love Cycles anyway. Rendering has never been more fun!

UDIM textures are supported, yes. You can make a node to read your textures.
Search in the texture forum; I made one.

I talked about Cycles pros/cons in my podcast with Andrew Price (see http://www.blenderguru.com/podcast-lets-talk-shaders-with-mike-farnsworth/), since I’ve spent time in the source code and have contributed patches. Cycles has a lot going for it. The goals and target audience for Arnold and Cycles do overlap, but they’re fairly different in the end. You should always use the right tool for the job.

I’ll always have a soft spot in my heart for Cycles. :wink:

Well, let’s not get carried away and say it’s “supported”. You can hack UDIM loading together, but there is no auto-read (1001, 1002, etc.) and there’s no way to view them within Blender.

Mike, thanks for peeking your head in here :wink:

The adaptive Metropolis sampling patch with noise-based stopping is quite promising and could be what Cycles needs to become a potent solution for general production, and that is just in its current state, which Lukas has noted still has a list of things to be addressed.

The buildbot also now has builds that render much faster on scenes with hair and transparent surfaces. It's obviously going to be a while before Cycles gets to Arnold's level, since they have an entire team on an engine that has had a number of years' head start in development (I don't even know if Cycles would survive scenes with trillions of polygons, since very few people here would actually try to do that).

New sampling algorithms are great but don’t really address the issue that fahr raises. As awesome as Cycles may be, the underlying infrastructure makes it really difficult to feed it with the high density geometry and texture data that is often needed to sell a shot.

Given their complexity, the fact that the Elysium shots were rendering with only about 20 GB of RAM is pretty amazing. I won’t claim that it’s impossible, but I can’t imagine trying to assemble a shot with even a 100th of their complexity in Blender. So unless some other package integrates Cycles I’m afraid there is an upper limit to the types of scenes that Cycles will ever see.

I’m inclined to agree; integration algorithm changes are really helpful, but only for certain classes of scenes. Oftentimes what directors want, though, is not necessarily lighting complexity but scene complexity and detail that requires the crazy production rendering features that are staples of Arnold and Prman, for example.

Jedfrechette: Have you seen this (normal calculation at render time)?
https://developer.blender.org/D487

This is the start of what DingTo hopes will be a series of optimizations that cover memory, sampling, and speed, and one of the results should be a Cycles that is more memory efficient.

It won’t get Cycles to Arnold’s level, but the potential memory reduction should allow for a measurable increase in complexity.

I hope I'm not coming off as negative, as I really love Cycles. I hope Thomas Dinges (or Brecht, or anyone) considers looking into some form of texture caching this summer. Rendering speed and GPU acceleration don't mean a whole lot if you can't hold your scene in memory. And if you drop several characters, each with multiple 4K maps, into a complex environment, you're going to start running out of RAM really fast.

Thanks for the in-depth post about sampling, Mike. I have to ask: it's awesome that you contributed to Cycles, but how is that possible from a legal point of view? I thought you'd have to abide by some sort of exclusivity contract…? And while I'm at it, what about the opposite? As a programming layman I have long wondered whether companies developing commercial products draw inspiration or even chunks of code from open source projects. Who's preventing anyone from doing that, since their source is kept secret?

Edit: Well, you answered some of that in the podcast. Thanks!

Hadrien

Oh, and… a friend at school is doing his graduation movie with Arnold. It's looking incredible so far, and he has a simple question for you (yeah, I know what you said, but this is really short!): why is there no support for the surface luminance node in Arnold for Maya? He seems to be missing it very much (his only prior experience being with mental ray).

Hadrien

Mike contributed to Cycles prior to getting picked up by Solid Angle. That's why he had to stop working on Cycles. He was previously working at Tippett Studio, which DID give him permission to work on Cycles.