The latest additions to Cycles have made it nearly real-time (Theory Build)

Is this something we will see in 2.79?

Or is it a rare test-branch edition?

Or will it be part of the buildbot builds now?

1:48 render time with scramble distance 0 and AO bounces 1:

Awesome :slight_smile:


Yes, but it applies the trick to every light path type, including Glossy and Refraction, which gives unwanted behaviour. Please, devs, give a ‘diffuse only’ option a try (or even per-light-path settings!).
BTW, doing this trick via nodes also has a slower render time than doing it via Simplify. Why?
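For anyone wanting to compare, here’s a minimal Python sketch of how the node version of the trick is usually wired (the helper name and the white-diffuse fallback are my choices, not anything official):

```python
import bpy

# Sketch: past a chosen ray depth, swap the real surface shader for a
# plain white diffuse (an AO-like approximation of the Simplify trick).
# Assumes the material uses nodes and already has a surface shader linked.
def add_ao_bounce_nodes(material, max_bounces=1):
    nodes = material.node_tree.nodes
    links = material.node_tree.links

    light_path = nodes.new("ShaderNodeLightPath")
    deeper = nodes.new("ShaderNodeMath")
    deeper.operation = 'GREATER_THAN'
    deeper.inputs[1].default_value = max_bounces

    white = nodes.new("ShaderNodeBsdfDiffuse")
    white.inputs["Color"].default_value = (1.0, 1.0, 1.0, 1.0)

    mix = nodes.new("ShaderNodeMixShader")
    output = nodes["Material Output"]
    original = output.inputs["Surface"].links[0].from_socket

    links.new(light_path.outputs["Ray Depth"], deeper.inputs[0])
    links.new(deeper.outputs["Value"], mix.inputs["Fac"])  # 1 when too deep
    links.new(original, mix.inputs[1])
    links.new(white.outputs["BSDF"], mix.inputs[2])
    links.new(mix.outputs["Shader"], output.inputs["Surface"])
```

The extra Light Path and Math nodes get evaluated per shading point, which would at least partly explain why the node version renders slower than the built-in Simplify option.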

If so, how about other global tricks like that? In particular, I frequently use two additional “insert node groups”:

  1. If backside, then render as transparent. With a fully white setting this is the same as backface culling during rendering (see the sketch after this list). I wouldn’t always want to use this globally, e.g. when I actually utilize the backside, such as in plants and windows.
  2. If diffuse ray, then calculate as diffuse only with a brightness multiplier. If applied to a rough, shiny floor (near-black diffuse), light would still scatter diffusely to nearby surfaces as if it were a purely diffuse surface, rather than simply being dropped. Useful when rendering without reflective caustics - which for me means always. But currently I have to define it for every material, although floors and walls in a room would be the major contributors. It would probably be even better to use a calculated albedo as the base. It basically means that specularly reflected light would still contribute to light transport, even if it happens in a diffuse way. It won’t be “correct”, but it will brighten the scene.
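For item 1, here’s a minimal Python sketch of the backface-culling group (the group and helper names are mine; the wiring is the standard Geometry → Mix Shader → Transparent setup):

```python
import bpy

# Sketch: a reusable node group that renders backfaces as transparent,
# which behaves like backface culling at render time.
def make_backface_culling_group():
    group = bpy.data.node_groups.new("BackfaceCulling", "ShaderNodeTree")
    group.inputs.new("NodeSocketShader", "Shader")
    group.outputs.new("NodeSocketShader", "Shader")

    group_in = group.nodes.new("NodeGroupInput")
    group_out = group.nodes.new("NodeGroupOutput")
    geometry = group.nodes.new("ShaderNodeNewGeometry")  # has "Backfacing"
    transparent = group.nodes.new("ShaderNodeBsdfTransparent")
    mix = group.nodes.new("ShaderNodeMixShader")

    # Backfacing = 1 -> the transparent shader wins, i.e. the face is culled.
    group.links.new(geometry.outputs["Backfacing"], mix.inputs["Fac"])
    group.links.new(group_in.outputs["Shader"], mix.inputs[1])
    group.links.new(transparent.outputs["BSDF"], mix.inputs[2])
    group.links.new(mix.outputs["Shader"], group_out.inputs["Shader"])
    return group
```

Drop the group between any material’s shader and its output, and simply leave it out of the plant and window materials where the backside is actually wanted.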

I’m with you, man!

Nice little things to know. I guess this isn’t documented unless you look into the code.

OK, on second thought, why is this AO trick in the Simplify panel? Shouldn’t it belong in Light Paths?
Or maybe…
Why not think of a whole new ‘Biased tricks’ panel under the Render tab where all those tweaks could go together? Glossy filter, clamp, AO bounces, light sampling threshold, and what else?
It would make sense to have all the biased tricks together in one place instead of scattered here and there.
God, I wish I could write my own UI customization with Python. Anyone know where to point me to start learning? :spin:

It’s very easy. Just open one of the Properties UI scripts that come with Blender. You can simply copy and rename an existing panel to create a new one, and easily move settings around. You’ll get the hang of it in no time.
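To make that concrete, here’s a minimal sketch of a panel along the lines of the ‘Biased tricks’ idea above - the class name is made up, but the three properties (blur_glossy, sample_clamp_indirect, light_sampling_threshold) are existing Cycles settings:

```python
import bpy

# Sketch: a custom Properties panel that regroups existing Cycles
# "biased tricks" settings in one place.
class RENDER_PT_biased_tricks(bpy.types.Panel):
    bl_label = "Biased Tricks"
    bl_space_type = 'PROPERTIES'
    bl_region_type = 'WINDOW'
    bl_context = "render"

    @classmethod
    def poll(cls, context):
        return context.scene.render.engine == 'CYCLES'

    def draw(self, context):
        layout = self.layout
        cscene = context.scene.cycles
        layout.prop(cscene, "blur_glossy", text="Filter Glossy")
        layout.prop(cscene, "sample_clamp_indirect", text="Clamp Indirect")
        layout.prop(cscene, "light_sampling_threshold")

def register():
    bpy.utils.register_class(RENDER_PT_biased_tricks)

if __name__ == "__main__":
    register()
```

Since the panel is its own class, this can live in a separate .py (run from the Text Editor, or installed as an add-on) instead of being patched into the bundled UI scripts.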

Cool! Is it possible to append my own .py to the main ui.py? I’d rather not mess too much with the original file.

It’s more of a convenience override than a speed thing. When doing interiors I never add outside faces to walls, and they’re not in the way while modelling other stuff. For real-time preview things get a bit trickier: either figure out all the materials defining the walls and add the trick manually, or hide the walls, which reveals the sun & sky fully so you have to deal with exposure, plus you don’t see the rendered object in its true environment.

Speaking of overrides, it would be very helpful if the material override system in render layers had a switch to ignore emission nodes in materials - so that they weren’t overridden. Although I prefer light objects over emission materials (certain visibility tricks are faster to set up at the object level, and they clean up faster), I sometimes have to use emission to exploit features not available to light objects (camera visibility, single-ray visibility, texture space). An overridden material is currently lit only by lamp objects and the world, which is not always sufficient.

OK, I think I might have completely misunderstood the concept behind this one. I thought it was for blurring reflections in general in secondary bounces, and was most useful when trying to get reflective caustic patterns, which would still not clear up fast enough to be a regular rendering effect. I try to be a bit selective about which materials/objects/lights get MIS, and only dominating surfaces and lights (typically walls, floors, and ceilings; usually never tables or dark surfaces) get more bounces.

The experimental build is amazing! I render interior scenes heavy with indirect light, and I am currently testing this experimental build in my production no matter the possible risk. So far it seems to speed up my renders by up to 5 times with acceptable results! And I am using only AO bounces for now. The other stuff seems amazing as well - so many improvements. Since it feels like miracle time, it would be awesome to see adaptive sampling per material, or with a mask, similar to the Maxwell renderer’s functionality. Is this something that might be possible in the future?

Lukas Stockner has done it again… What this feature does is pretty much take the dynamic BVH of the viewport and apply it to final renders, so you don’t have to synchronize the entire scene every frame.

Sadly this feature is very hacky and wouldn’t be accepted into Blender as it currently stands. But the potential is amazing!

Oh wow! That looks amazing! I’d like to try this with an animated character scene. Where did you get this build?

I test builds for our studio, so I can’t exactly hand out our builds, but people from the community are more than welcome to build their own, since all the features I show off are more than likely committed somewhere.

https://developer.blender.org/D2613

These latest Cycles developments are amazing. Surely a game changer.
I wasn’t sure what was going on with the last video until just over halfway through, and then it was just… wow. So I guess, now that it works, it could theoretically be made into a more solid, less hacky feature in time?

It’s a little bit confusing because there are two current threads on this subject now.
I’ve been testing a build of what I think are the latest denoise developments, which 3DLuver put up in the older denoise thread a short time back.
I’m getting incredible results and speed-ups with this build - render times down to around a fourth or a fifth of what they were, with a clean and sharp image. But I also always render animation batches from a recent project for my tests, to be certain how it works across animated frames. That, for me, is the real test. It’s working very well so far. For me at least.

This is a life changer, but why would this not be allowed? Can someone explain it to me in the simplest terms?

Hey Lord, this is a bit misleading!
The optimization only works if nothing but the camera moves, right? The entire BVH has to be rebuilt if even a single vertex is moved/displaced/rotated/etc., which is a common scenario in animation.

And by the way, hasn’t that checkbox already been present in regular builds for years? (EDIT: OK, reading the commit makes it clear ;))
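For reference, the long-standing checkbox maps to an existing render setting; a minimal sketch of using it from Python (the property itself is stock Blender - what actually persists between frames is what the patch extends):

```python
import bpy

# "Persistent Images" in the 2.7x UI: keeps render data around between
# renders instead of freeing and re-syncing it every time.
scene = bpy.context.scene
scene.render.use_persistent_data = True

# With D2613 applied, unchanged scene data (BVHs etc.) would be reused
# across these frames instead of being rebuilt per frame.
bpy.ops.render.render(animation=True)
```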

Well, I guess it’s still got quite good potential for architectural fly-throughs and background passes.

Hey Guys,

Here’s a little Easter Sunday gift for you all to play with after your Sunday lunch.

OCL_Denoise_V5_RayDivergenceReduction_PersistantSceneData:

https://mega.nz/#!c8IiFL6A!lSeiA0C6t6ggIDvSd8NtN9GklSsEQidFvCkiZre8APk

This build includes:

OpenCL & CUDA GPU denoise system (this is Lukas’ latest denoise code)
GPU denoise multi-GPU support (even in the viewport; definitely works for CUDA, but not tested with multiple OpenCL GPUs)
Scramble Distance added for Sobol and Multi-Jitter (works on CPU & GPU); also added to the Supported Features render tab
Thread Divergence Sort Reduction patch (gives a 30% speedup in the classroom scene and 8% in barcelona)
Latest master
Cycles: Implement persistent storage for all scene data (patch D2613)
CUDA & OpenCL supported

Enjoy.

Cool, cool! Is this a Windows-only version?

I have tested scramble distance with Correlated Multi-Jitter - it has no effect.

I have tried to render a scene with hair and persistent storage (patch D2613) - it has no effect :confused: and the BVH is calculated each frame, hmm (maybe I have done something wrong).

The feature only works with Sobol (the UI is deceptive in this case since the scramble distance field does not get grayed out).
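If someone wants to patch that locally, the fix is basically a one-line `active` toggle in the panel’s draw code - a sketch, assuming the patch exposes the setting as a Cycles scene property called scrambling_distance (that name is my guess from the build notes; sampling_pattern and its 'SOBOL' value are stock Cycles properties):

```python
# Sketch: gray the field out whenever the sampler isn't Sobol.
def draw_scramble_distance(layout, cscene):
    row = layout.row()
    row.active = (cscene.sampling_pattern == 'SOBOL')  # inactive for CMJ
    row.prop(cscene, "scrambling_distance")
```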

Thank you :).