Cycles tests - the new blender CPU/GPU renderer of awesomeness

Here it is, the finished Cycles Weather Animation which I made with my Science Teacher. This project’s Cycles renders included 240,500 passes in total (all on the CPU!).

Some Conclusions

17 Conclusions about Cycles in Animation

Materials and Render setup are a breeze with Cycles

I’ve found the quick preview and simple materials make the workflow so much faster, unlike Blender Internal, where you tweak one thing, wait two minutes for it to render, and repeat.

Any time savings from the above are annihilated by the demoralizing amount of rendering required.
Seriously, with stills it’s fine: you can just wait for your render to clear up. But when it comes to animation, a frame can take ten minutes to go through a sufficient number of passes; multiply that by the thousands of frames in an animation project, and you will be waiting quite a while.

The Compositor is your Friend
Luckily, after you’ve rendered out a sequence, and you discover it’s still too noisy, you can render out a second round with a different seed, and mix them together to reduce noise.
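The trick works because the two renders are independent estimates of the same image, so a 50/50 mix halves the per-pixel variance and cuts the noise standard deviation by a factor of √2. A minimal sketch of the statistics (the pixel value and noise level are made-up illustration numbers, not anything from this project):

```python
import random
import statistics

random.seed(0)

TRUE_VALUE = 0.7  # hypothetical converged pixel value
NOISE = 0.1       # per-render noise standard deviation (made up)

def noisy_render(n_pixels=10_000):
    """Simulate one render: each pixel is the true value plus noise."""
    return [random.gauss(TRUE_VALUE, NOISE) for _ in range(n_pixels)]

render_seed0 = noisy_render()  # e.g. the sequence rendered with seed 0
render_seed1 = noisy_render()  # e.g. the sequence rendered with seed 1

# What a compositor Mix node at factor 0.5 does, per pixel:
mixed = [(a + b) / 2 for a, b in zip(render_seed0, render_seed1)]

noise_single = statistics.stdev(render_seed0)
noise_mixed = statistics.stdev(mixed)
# noise_mixed should be about noise_single / sqrt(2), i.e. ~0.707x
```

In Blender this is just a Mix node at factor 0.5 between the two sequences; mixing k independent seeds reduces the noise by √k, which is why the gains taper off.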

2 GI bounces or more is overkill
1 bounce should be plenty. You will not notice the extra bounces unless it’s a dark scene lit primarily by bounce lighting (e.g. a dark room with light-colored walls and a single window).

Know when to randomize noise, and when to leave the seed at 0.
A static noise pattern can stand out and look odd, but a randomly shifting pattern can make an animation look noisier than it is. Typically, if lots of things in the scene are moving, animate the seed value with #frame, #frame+1, etc. If the large objects and camera aren’t moving, leave the seed alone.

Cycles is more Distributed Rendering-friendly than Blender Internal

With Cycles, you can simply tell one computer to render the entire animation at 20 passes/frame with seed 0, another at 20 passes/frame with seed 1, and so on, rather than splitting up the frames, which may take unequal amounts of time to render.

If you’re an Environmentalist like me, Remember it’s Winter (in the Northern Hemisphere). Think about it: essentially all the electricity used by a computer is turned into heat, with the calculations and data as a byproduct. If you have a programmable thermostat, then every joule of heat generated by the render is a joule of heat the heater doesn’t have to produce. That ought to cover us until Brecht finishes his optimizations, hopefully before summer arrives and rendering becomes a double whammy for the Earth.

For Dark Rooms, use large, invisible lights. Make them invisible to camera and glossy rays. You should see a decrease in noise.

With Experimental Patches (e.g. Volume), the sky will not fall the minute you use them.
In my experience, Storm’s Volume patch has been fairly stable. That being said, it’s still a good idea to save often.

While the Sky may not fall, compatibility will.
Keep your old (volume) builds, as your scenes may stop being compatible.

Cycles DOF is awesome.
It works better than Blender Internal’s faking method, and the blur doesn’t get cut off when the unblurred object ends.

Cycles DOF is not awesome
Man, is it noisy!

  1. Your Poly Count Doesn’t Affect Your Render Times (at least not noticeably)
    While it may result in very long BVH build times, poly count generally does not have a large impact on render times.

  2. You can get away with diffuse on most objects.
    Viewers will probably not notice that that plastic object uses plain diffuse, or wonder why those snowballs being thrown don’t have proper SSS.

  3. Embrace the Penguin
    Cycles can sometimes render twice as fast under Linux.

Build your Own Blender (Or grab one from Graphicall)
I’m not talking about the kitchen appliance (though I’ve heard of a guy who built his own toaster… which melted on first use). A self-compiled Blender usually renders noticeably faster (in both Cycles and Blender Internal), most likely because it can be optimized for your specific CPU rather than for generic hardware.

Lie
There are lots of tricks you can use. For example, you can use a holdout panel in front of the camera and Alpha Over compositing to rerender only the parts of your scene that move, then composite in the static parts as a background. This trick breaks down in scenes with lots of reflections.
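For reference, the Alpha Over step this trick relies on is a simple per-pixel formula. Here is a hedged sketch of straight-alpha “over” compositing in Python (the function name is mine; Blender’s Alpha Over node also has premultiply options that this ignores):

```python
def alpha_over(fg, bg):
    """Composite a straight-alpha foreground pixel over a background
    pixel; both are (r, g, b, a) tuples with components in [0, 1]."""
    fa, ba = fg[3], bg[3]
    # Resulting coverage: foreground plus whatever background shows through.
    out_a = fa + ba * (1.0 - fa)
    if out_a == 0.0:
        return (0.0, 0.0, 0.0, 0.0)  # fully transparent result
    rgb = tuple((f * fa + b * ba * (1.0 - fa)) / out_a
                for f, b in zip(fg[:3], bg[:3]))
    return rgb + (out_a,)

# An opaque foreground completely hides the background:
# alpha_over((1, 0, 0, 1), (0, 1, 0, 1)) -> (1.0, 0.0, 0.0, 1.0)
```

With a holdout panel, the rerendered moving parts carry alpha only where they exist, so the static background frame shows through everywhere else.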

@OL77:
I was shocked and a bit discouraged by the ad images rendered by Arion linked in the “longest and most active thread”; that was my dream of what Cycles’ end-product quality could be. I was desperately writing a reply with my test image of a volume fog caustic from a glass sphere, pressed cancel when I realized how poorly it compared, got a grip, opened this thread and…

Speechless. That is the real answer. I would not trade Cycles for that $10k+ renderer, even if you gave me two of them. I cannot start vim, make changes, run make -j4 install, and get a modified Arion. Period.

BTW, nice animation, even the final credits :). Seriously though, it screams for more samples; it’s too noisy. Maybe I need to break the flatness of the ray tree, take more samples along the ray inside the first volume from the camera, and then sum them. But that would require rewriting kernel_path_integrate() as a recursive function… I will try it after some experiments with MLT. Arion, Octane, whatever: they will be defeated!

The Musgrave texture was recently fixed; here are some tests. Beware, Arion, Cycles is coming!11!!!1


Hey, do I see an MLT panel? Is this Metropolis Light Transport? Do the volume builds render faster / with less noise? Or am I totally wrong about MLT…?

I’m glad you’ve shared your thoughts. Where are you getting the volume builds from? Graphicall? I only see 32-bit builds there.

  1. Snow is not a flat plane. Actually, the “six” axes are the X, Y and Z axes, so it has the shape of a “Fractal Blender Empty”.

EDIT: It seems it is flat, so I was wrong: “Capped Columns” here shows two snowflakes connected by a column. The “Radiating Dendrites” I believed were the norm seem to be a rare thing.

I guess that’s because of an older Blender version, where it had to be rotated 90°.
In the current Blender version we don’t have that issue anymore.

I will upload a new File.

Kind regards
Alain

The rotation issue with particles had been known for a long time; it had something to do with what the devs call “funny space”, and the particle developer Jahka fixed it after Ton mentioned it at a dev meeting, as I recall, some months ago.

“Funny space”, or “crazy space”… I guess it is some matrix that converts between local/global space, or even normal space, so that particles have Z up on the vertex/face normal.

@pxl666:
Yes, I am experimenting with an MLT-like sampler and trying to do a bidirectional tracer (I have even rewritten them three times from scratch), but don’t hold your breath: it is so unusable, buggy and slow that I don’t dare share the actual source (I intentionally skip kernel_mlt.c in the patch). The MLT panel is real; I was too lazy to comment it out when rebasing the volume patch, so just ignore it for now. MLT does not improve volume scenes, except maybe with point/spot/IES lights. In general, sampling is orthogonal to the other features: MLT saves time by making fewer attempts in places where light cannot pass (completely blocked rays, absolutely black or unreachable surfaces) and more in very bright parts like caustics; that is all. The bidirectional tracer is another story: it will support more light sources and render transparent refractive objects more correctly, attenuating light rays from point-like lights (the current Cycles algorithm just tests for ray shadow probe intersections, which is sometimes inefficient, when it is possible at all).

Storm, why are you so pessimistic every time? :wink:
I mean, you brought us the volume shader; that was a heavy task!
It doesn’t work with emitters yet, but you can already make nice stuff with it.

@tungee:
Sorry, it is a fact, tested many times by people much smarter than I am. MLT has been publicly known since Veach’s paper in 1997, and there have been many similar algorithms since then, with many implementations; the latest is energy redistribution path tracing (ERPT). They all share the same advantages, and none of them are about reducing noise in volumes.

The volume patch is a one-liner, BTW; it is simply free_fly_distance = -log(random_value) / sigma. All the rest is just glue into Brecht’s code. I doubt it can be called a “heavy task”. Heavy tasks are a motion-blur-aware BVH optimized for the GPU, or dynamic GPU memory management; that is. Even MLT could be copy-pasted from the pbrt or LuxRender source; it is not that hard, it just takes time.
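That one-liner is just inverting the Beer-Lambert transmittance to importance-sample a free-flight distance. A small Python sketch (the function name and test values are mine; the real patch is C code in the Cycles kernel), with the minus sign written explicitly so the distance comes out positive:

```python
import math
import random

def sample_free_flight(sigma_t, u):
    """Invert Beer-Lambert transmittance T(d) = exp(-sigma_t * d):
    for u uniform in [0, 1), d = -ln(1 - u) / sigma_t is an
    exponentially distributed free-flight distance with mean 1/sigma_t."""
    return -math.log(1.0 - u) / sigma_t

random.seed(42)
sigma_t = 2.0  # hypothetical extinction coefficient
samples = [sample_free_flight(sigma_t, random.random())
           for _ in range(100_000)]
mean = sum(samples) / len(samples)
# The sample mean should approach the mean free path 1 / sigma_t = 0.5.
```

Denser media (larger sigma_t) yield shorter scattering distances, which is exactly the behavior the volume integrator needs.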

Hi storm_st,
I have built with your Volume patch, and it looks really promising!
It can be very slow, but I tested it in a very extreme situation and without the knowledge to optimize it, and from my little experience with LuxRender and Maxwell, a wrong setting can cause many troubles of this sort, so nothing very worrying.

What I noticed is that sometimes, when I change the input texture for the density, it doesn’t refresh, even after disconnecting and reconnecting it.

And here is a picture (two, indeed), not so appealing, in which something weird happens: in the scene there is a plane emitting yellow from its back side, plus a sun lamp; there is a sphere with a transparent surface and a blue-tinted volume, and inside the sphere a Suzanne that is pure diffuse red.
What is strange is that the light from the sun, which is pure white, appears yellow too; the more I strengthen it, the more yellow it becomes, and at the same time more fully saturated yellow scattered light arises inside the volume.



sunlight stronger (same angle, from front-top-right)

Thank you very much for your great work and dedication.

paolo

Only an emissive background and emissive meshes are fully supported as light sources; other lights such as sun, point and spot lamps work half-heartedly, because Cycles lacks the bidirectional part of tracing from the light source. I use a constant-color background in my test images for a reason: it “cheats” and makes less noise. You need an insane (literally) number of passes to get a good image like the posted fuzzy ball with the yellow emissive plane, I mean 20,000+ (twenty thousand) samples per pixel, weeks of rendering per frame. But I promise, when it finishes rendering, you get decent, cream-like gradients; it is an unbiased, physically correct multiple-scattering integrator after all.

Thanks for the explanations, storm_st.
I’m not missing lamp light so much at the moment; with mesh lights and HDRI you can do almost everything. The only problem is sunlight, because small, distant (and strong) mesh lights take too long to converge.
As for rendering time, the first image posted above, with less extreme and more correct settings, was converging quite fast: one hour for 1215 samples on my i7, and it’s almost cooked. So I think, as OL77 says, this volume is quite usable even now, even if you say it isn’t. :wink:

regards, paolo


The Animation was shown at school today, and it got a pretty good reception. Mr. Redman actually made up a quiz (against my recommendation), and the funny thing is, he pulled images off the web via Google Images, and one of them was from a certain website that watermarks all their images. How did that get past iPrism? :stuck_out_tongue: 8th period was much more mature about it than 6th period. :wink:

BTW, can you see the freezing rain image? At school it was so dark you couldn’t even make it out.

I was shocked and a bit discouraged by the ad images rendered by Arion linked in the “longest and most active thread”; that was my dream of what Cycles’ end-product quality could be. I was desperately writing a reply with my test image of a volume fog caustic from a glass sphere, pressed cancel when I realized how poorly it compared, got a grip, opened this thread and…

Speechless. That is the real answer. I would not trade Cycles for that $10k+ renderer, even if you gave me two of them. I cannot start vim, make changes, run make -j4 install, and get a modified Arion. Period.

BTW, nice animation, even the final credits :). Seriously though, it screams for more samples; it’s too noisy.

Well, if a quarter of a million passes isn’t enough, I don’t know what is. Especially on the intro scene: after 80 passes total (20 at a time) I stopped, as the benefits decline, so passes #160-180 are not as helpful as passes #40-60 were.
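The diminishing returns are exactly what Monte Carlo convergence predicts: noise falls off as 1/√N in the number of passes, so early passes buy far more than later ones. A quick illustration, normalizing the noise level to the first 20 passes:

```python
import math

def relative_noise(n_passes, baseline=20):
    """Monte Carlo noise scales as 1/sqrt(N); report the noise level
    relative to the baseline pass count."""
    return math.sqrt(baseline / n_passes)

for n in (20, 40, 80, 160, 180):
    print(f"{n:3d} passes -> noise x{relative_noise(n):.3f}")
# Going from 20 to 40 passes removes ~29% of the noise, while going
# from 160 to 180 removes only ~6% of what 160 passes leaves behind.
```

Halving the noise always costs 4x the passes, which is why a render that is “almost clean” still takes ages to finish converging.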

Maybe I need to break the flatness of the ray tree, take more samples along the ray inside the first volume from the camera, and then sum them. But that would require rewriting kernel_path_integrate() as a recursive function… I will try it after some experiments with MLT. Arion, Octane, whatever: they will be defeated!

The Musgrave texture was recently fixed; here are some tests. Beware, Arion, Cycles is coming!11!!!1

Yay! There’s always the next episode to test with. :wink:

Just out of curiosity… what grade are you in? Because my science teacher probably wouldn’t give a quiz on a 2-minute video that a kid made… just saying.

And I really need to get started w/ cycles.

8th.

And it’s not like a paper quiz; it’s a slideshow thing he made.


New Volume Builds, hot off the compiler

I know people have been asking for this. I finally worked up the stamina to build the Volume Patch on Windows (before promptly being reminded of how much I hate building on Windows; I have to download 2.2 GIGABYTES of libs?!):


Oh, and it looks like Blendermf has uploaded a Mac OS X build: http://graphicall.org/592

8D!!! I LOVE YOU!!! If I weren’t straight, I’d kiss you.

Lately I have been testing the various Light Path node outputs, trying to figure out how to use them for an outdoor scene.
It seems that most of these options make sense for the World environment nodes, and some of them (Is Shadow Ray, Is Reflection Ray) for material nodes.
I have a problem with the Is Singular Ray output. According to the documentation, it should return 1.0 only for sharp, specular reflections. I have tried it in the sun lamp nodes, object material nodes and environment (World) nodes, and it continuously returns 1.0 for all pixels.

Have you ever used the Light Path: Is Singular Ray output for anything? Or should I report it as a bug?