This technique could be integrated into GPU and unidirectional path tracing. We need this; it would solve all the caustics problems once and for all.
This paper has actually been talked about quite a lot, but there are a few reasons why it is unlikely to be implemented. One of the Cycles devs, Lukas Stockner, gives a good explanation as to why here:
In short, Lukas’ explanation is that with this technique, the render engine will spend a lot of cycles exploring the caustic paths but leave little for the general path tracing (so your caustics are clean, but every other spot is super noisy).
There needs to be a balanced approach. Making caustics require somewhat more samples to clean up would be a worthwhile trade if it means the rest of the image gets clean enough that AI denoising produces good results. Ideally, a caustic path would trigger only a limited amount of exploration with a few samples, instead of an unbounded search.
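To make the budget idea concrete, here is a toy sketch (purely hypothetical names, not Cycles code) of a per-pixel sample scheduler in which detecting a caustic path triggers only a small, capped amount of extra exploration, so the average cost stays close to the base path-tracing budget:

```python
import random

def shade_pixel(base_samples, caustic_prob, caustic_budget, rng):
    # Every pixel gets its ordinary unidirectional PT budget.
    spent = base_samples
    # A specular-diffuse ("caustic") path being hit is modelled here
    # as a simple coin flip; when it happens, spend only a small,
    # fixed extra budget exploring it instead of an unbounded search.
    if rng.random() < caustic_prob:
        spent += caustic_budget
    return spent

rng = random.Random(42)
# With 20% of pixels seeing a caustic path and 8 extra samples each,
# the expected cost is 16 + 0.2 * 8 = 17.6 samples per pixel.
avg = sum(shade_pixel(16, 0.2, 8, rng) for _ in range(10_000)) / 10_000
```

This is only an illustration of the trade-off being discussed, not how SMS actually allocates work; the point is that a capped `caustic_budget` bounds the overhead.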
Yeah, I’m sure it could be made to work, I was just highlighting that it’s not the perfect solution to everyone’s rendering problems, and it does have some drawbacks.
OK, I get it, but still, we are talking unidirectional on the GPU. Even if, say, 80% of the time were spent searching for caustics, it would still be faster than VCM on the CPU, and we would get a complete solution. Hell, even MNEE is not capable of clean results. So I think it would be worth implementing for when you totally need that perfect light transport. Remember, there are no alternatives but bidir and MLT, and this thing is truly genius so far. So, yeah… we only need it to solve for an AOV if needed. That would be totally worth it. Did you see how clean it is? My gosh…
Oh, and it’s not that slow. The samples needed are really low compared to PT and even MNEE, not to mention the noise. And remember, MNEE is used in production! So it would totally be worth it, even as a secondary render with a separate AOV. One renderer, no fuss setting up the scene for Lux or PRMan etc.
@enilnacs, if you are so sure this is needed, you should try implementing it yourself. Otherwise I think you have to take the word of someone who works on the code that it’s not ready to just be dropped in.
For the record, you can’t just use a different rendering technique for a single AOV unless you do a completely separate pass.
Yeah, that is what I am playing with ATM. I’m trying out some Cycles modifications; it’s just in the concept phase. I don’t have that deep a knowledge of C, and I am also a beginner with rendering algorithms in C, so… let’s see. I’m also splitting my time between earning a living at my job and free time, but I won’t give this up.
I have dreamed of fast and amazing caustics for a long time.
I know it takes another kernel, and that in itself is a challenge, but I am hoping that, if I can believe what the paper says, it would be easy to extend the unidirectional Cycles engine to do just that.
Once I have something usable I will share it; first I need to know what I’m doing.
I was always confused by the fact that caustics need so many samples.
Don’t you think it could be possible to create something like an interpolation matrix that takes a smaller number of samples and calculates the light intensity in the areas between them, instead of trying to use samples for everything?
It’s just a question; I don’t understand anything about the issue, but I believe it could be possible to find a reasonable middle-ground solution between fake caustics and fully raytraced ones. Don’t you agree?
Well, caustics can be done like that, but then you lose the fine details that you only get by brute-forcing them. You can see that with the older photon caches, which also need an insane number of photons to get really clean. Interpolating is never the solution; we even need good pixel subsampling.
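To illustrate the blurring trade-off being described, here is a toy 1-D sketch of the gather step a photon cache performs (all names hypothetical; real photon maps gather in 2-D/3-D over surfaces). Widening the interpolation radius so fewer photons suffice smears out exactly the sharp peak that makes a caustic look crisp:

```python
def density_estimate(photons, x, radius):
    """Toy 1-D photon-cache gather: average the energy of all photons
    within `radius` of x (a crude kernel density estimate)."""
    energy = sum(e for p, e in photons if abs(p - x) <= radius)
    return energy / (2.0 * radius)  # divide by the 1-D kernel "area"

# A sharp caustic: all photon energy concentrated tightly around x = 0.5.
photons = [(0.5 + 0.001 * i, 1.0) for i in range(-5, 6)]

sharp = density_estimate(photons, 0.5, 0.01)   # small radius: tall, crisp peak
blurred = density_estimate(photons, 0.5, 0.2)  # large radius: same energy smeared wide
```

With the small radius the estimate at the peak is much higher than with the large one; the total energy is the same, but the wide kernel trades the crisp detail for smoothness, which is the limitation being pointed out above.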
The problem is that, say, you have a caustic ray coming from far outside that has to pass through a keyhole and hit a wall in a dark room… just searching for that needs at least bidir stuff or manifold sampling. So there are caustics, and there are caustics.
Manifold sampling is capable, AFAIU, of finding those thin sources of rays without any blurring, albeit at a slower sample speed (about 80 times slower than PT).
I believe it’s worth it, because it can be implemented on top of classic PT.
However, I am not an expert in rendering algorithms, and I trust people like Lukas, etc., but I wish there were more experimental implementations in Blender. We have a playground switch in the render engine (Experimental) where we could have these things.
In cinema we say:
The best camera is the one you have with you. Meaning that even an experimental feature is worth more than no feature. Let’s not forget that a lot of movies are made with experimental renderers to achieve that breakthrough and win a technical Oscar. Experimental? I don’t care… can we implement it by tomorrow? Go for it! RENDER
But to be honest, at the moment I would be even more interested in EEVEE Vulkan raytracing, with an implementation of realtime caustics like in Unreal Engine. THAT is amazing.
You have LuxCore, appleseed, YafaRay, and a shitload of commercial renderers from V-Ray to RenderMan with you (in the above sense), if you really need caustics.
Nobody needs time spent on Cycles integration of a very young algorithm that is not at all production-proven.
There are a lot of more important missing pieces in Cycles to this day (e.g. full baking support, mipmapping/texture caching).
The topic of this thread is just so absurd to me, I’m sorry.
He plans to attempt an integration of the algorithm himself, so this (for the moment) is not going to require much time and attention from Brecht, Stefan, and whoever the new engineer is.
He will do it in his free time; let him attempt to prove the algorithm with his own images and animations first, and then see for yourself whether he gets to the point where builds become available.
I believe most of us know that, but let’s be honest: is it worth having to work with a completely different node system, materials, lights, and renders, learning how to work with all that stuff, and ending up with a mess of different types of Blender files in your folders, just to have caustics?
I personally prefer to work with Cycles because of its perfect integration in Blender, for obvious reasons; the idea of messing around with different renderers sounds like a nightmare to me.
I have already worked a lot with YafaRay, back when installing the renderer was not a nightmare in itself, and I agree that the caustics are perfect, blah, blah, blah… BUT with YafaRay you always depend on older versions of Blender, because of the delay between a new Blender release and the adaptation of the newer Yafa versions. Just annoying.
That’s completely ignoring the fact that you effectively have to rebuild all of your materials from scratch to work in most of these. Some node groups in Cycles cannot be translated into these other renderers, especially where they use Cycles-specific functionality (i.e. specific procedural textures, bevel, etc., which may have no equivalent).
I personally think caustics would be an important addition to Cycles. They play a much bigger role in day-to-day life than people give them credit for.
I don’t get why Cycles has to be focused on animation, especially now that we have Eevee, which is being used for that purpose too. Many people use Cycles for static renders, and if a caustics algorithm can be integrated into the existing engine, then I think it should be given a chance. If people want to animate in Cycles and don’t like the flickering that caustics generated with a biased algorithm cause, they can simply turn them off (as many people already do).
I have to add that at work I have 128-core Epyc servers that scream with bidir renderers. And people have no clue what it means to truly work with beautiful caustics; what they are missing is a world in itself. Unfortunately, I can’t afford to buy such a thing myself, and that is why I want mere mortals to have it.
And yes, the sheer amount of time required to rebuild and remake a lot of nodes and other stuff makes other renderers impractical.
Well, all I said was that, in the sense of the analogy mentioned by @enilnacs (the best camera is the one you have with you on set), some renderer other than Cycles very much is the camera you have with you.
So all the arguments about having to learn new software or rebuild shaders are as if the DOP, on the day of shooting, had a certain camera with him (other than his go-to camera) but refused to use it, because he isn’t familiar with it and is unwilling to familiarize himself.
If we’re talking about a paid gig here, that’s just unprofessional. Otherwise it’s just lazy and a matter of making excuses.
I know, I know, deadlines may be tight, budgets may be low, there may be neither time nor money to be spent on familiarizing with renderer X, or on converting nodetrees.
But in such a case, you’d probably have known you’d want or need caustics from the start, so why not make the right choice of renderer beforehand?
Anyway, I’m far from saying the OP mustn’t try implementing this if he likes to; I’m sorry if it sounded like that. All I meant was that, imho, Brecht et al. have more important missing features to add to Cycles (baking, texture caching, many-light sampling, OpenVKL, probably more).
The same could be said of any of the functionality subsequently added to Cycles after its initial release, so why bother with all the additions and improvements over the years?
Yes, it could be said. But I don’t think I did say that. My point was/is:
(In my opinion) not having an efficient way of rendering caustics is holding Cycles back as a production renderer to a far lesser extent than not having, e.g., mipmapping/texture caching, proper instancing support, USD support, feature-complete baking capabilities, more efficient volume rendering… the list goes on.
Take a look at Arnold:
- Does it have SMS or any other algorithm for efficient caustics? - Not to my knowledge.
- Does it have the above mentioned things (and more)? - It sure does.
- Is it being widely used by pretty much everyone (not necessarily exclusively, but that’s beside the point) for all kinds of work, including some of the best high-end VFX-work? - Clearly so. Despite no efficient caustics.
Maybe this makes it more clear what I meant.
Because it is a “production” render engine. That means the main focus of the engine is animation. EEVEE can be used for animation too, but that really is not its focus, because it is not a raytracer and is limited by the GPU used to render. These are two completely different beasts aimed at different use cases, but the developers are doing great work unifying them and trying to make EEVEE produce results similar to Cycles where possible.
Caustics are a nice-to-have feature; the only problem to date is animating them: they tend not to be “stable” in animation (I mean, prone to noticeable artifacts), cannot really be controlled by artists, and in “production” are still considered harmful in most cases. That’s why you don’t see much of them in rendered movies and such (mostly in very specific situations, and added in post-production, since it takes just a few minutes to fake and animate them in After Effects anyway).
Now, that should not stop anyone from trying to implement them; in the past, several developers tried to add this to the old internal renderer and to Cycles, but the problem was deemed too complex at the time. Just don’t expect a quick dev review, because it needs to be battle-tested in production.
If we’re looking at what we want the core team to work on for the time being, I do agree that the focus should be on things like many-light rendering and completing the microdisplacement code (which would include the geometry cache).
That shouldn’t stop the OP from his desire to prove this algorithm in production (as you have already hinted), though. He should go ahead and implement it, and then battle-test it with sample scenes and animations. If it can’t (at the least) render some of the open-movie scenes with caustics turned on and without flicker, then we will know.
The term “production render engine” is ambiguous and ill-defined. It doesn’t necessarily imply “mainly for animation”.
In which case they can be turned off when rendering animations, as many already do with the existing implementation of caustics.