With Nvidia adding “real-time raytracing” to their GPUs, does this have any impact on Cycles? Cycles is doing ray tracing… and it’s on the GPU… is RTX something that Cycles will, or could eventually, take advantage of to render faster?
Will the viewport (Eevee, BI) ever take advantage of RTX?
This is all assuming that RTX actually becomes more than just a novelty, of course.
Hi, RTX cards aren’t even supported for rendering on their CUDA cores yet.
They want to support RT cores at some point but nobody knows when.
Take a look here: https://devtalk.blender.org/t/rtx-gpu-support/2306/17
As far as I am aware, the RTX series has some special cores dedicated only to raytracing but it’s not like they are some incredible tech that does perfect raytracing in realtime. What makes them work is the realtime denoising on the raytracing pass that they do in-game.
So Cycles shouldn’t gain any massive performance boost from the RTX series. Eevee probably won’t benefit much either, since it would need access to Nvidia’s denoiser, which I don’t believe is open-source compatible. I may be wrong about that, though.
One of the render engines that has been most proactive in looking into RTX is V-Ray.
They posted an overview of their initial findings back in October.
Here is a short excerpt:
…For the three scenes, the RT Cores provide a speedup of 1.78x, 1.53x and 1.47x respectively compared to the pure CUDA version. We expect these results to get better as we get closer to the official builds in the coming months.
Yeah, that makes sense. My hope is that because the RT cores are designed for raytracing, they’ll be optimized for that task in a way other GPU cores aren’t. Similar to the boost we get going from CPUs (which CAN ray trace, but aren’t optimized for it) to GPU cores (which CAN’T do other things, like hardware IO ops), I’m hoping that RT cores will be further optimized for the operations involved in ray tracing.
I have a hard time believing that Eevee CAN’T take advantage of RTX cores as it is. The whole point of RTX is for software developers (typically game developers) to target it in their real-time engines. Eevee is a real-time graphics engine, more or less the same as any first-person shooter. If games can implement support for it, surely Eevee can too. The time and effort required are a whole other issue, of course, but it should at least be possible.
One thing to keep in mind is that so-called raytracing renderers don’t necessarily spend most of their time actually tracing rays. Suppose perfect hardware could do ray intersection in zero time: if the renderer spends 50% of its time doing other things, like shading or just waiting on memory, the speedup will only be 2x. That’s why “production scenes” with complex shaders aren’t magically going to turn realtime.
The other part is the overhead of actually using these hardware features. Right now, you just hand OptiX your geometry and some code to run for intersections, which may or may not fit your architecture well.
So, if RTX happens, don’t expect miracles.