Eevee raytracing

Lol, ya been outplayed. I'd also like to mention that the guy using the UE4 demo has a smaller viewport and a better GPU than me.

And after 1-2 seconds he also has FINAL frames. You don't; you only have viewport denoise, and it takes many samples to get clean. AI denoise is not meant to do that.

I think you're making stuff up at this point. IDK what you think Unreal is doing to get those final renders (FYI, it's denoising).

“Realtime” raytracing is low samples with denoising…
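To illustrate the point (a hedged sketch, not actual Blender or Unreal code): "realtime" raytracing shoots only a few rays per pixel each frame, and a temporal denoiser blends each new noisy frame into an accumulated history buffer, so the image converges over a few frames instead of within one.

```python
import random

def noisy_frame(true_value, noise=0.5):
    # One frame's low-sample ray result: correct on average, noisy per frame.
    return true_value + random.uniform(-noise, noise)

def temporal_denoise(true_value, frames=60, alpha=0.1):
    # Exponential moving average over the frame history, the core idea
    # behind temporal accumulation denoisers.
    history = noisy_frame(true_value)
    for _ in range(frames):
        history = (1 - alpha) * history + alpha * noisy_frame(true_value)
    return history

random.seed(1)
result = temporal_denoise(1.0)  # lands much closer to 1.0 than any single frame
```

Any single `noisy_frame` can be off by up to 0.5, but the accumulated result hovers near the true value, which is exactly why low sample counts are viable per frame.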


You should see Quake 2 before it's denoised: it literally has like 20 pixels of light that magically turn into a clean final frame. Look at those screenshots at the bottom.

The viewport denoise is NOT temporally stable. If you want to use Cycles on a single machine for an animation, you need a TON of samples! That also holds for ALL AI denoisers (OptiX, etc.).
Raytracing-enhanced rasterisation IS temporally stable, which is why Eevee is so good for animations and would be even better with raytracing.
Even if you had 100 teraflops to throw at realtime raytracing with Cycles, it would be either temporally unstable (AI denoise) or noisy. Period.
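The "TON of samples" point can be made concrete with a toy Monte Carlo sketch (an illustration, not renderer code): path-tracing noise falls off as 1/sqrt(N), so you have to quadruple the samples just to halve the noise, which is why a clean single frame with no temporal reuse gets expensive fast.

```python
import random
import statistics

def render_pixel(samples):
    # Average of `samples` noisy ray estimates of a pixel whose true value is 1.0.
    return sum(1.0 + random.uniform(-1, 1) for _ in range(samples)) / samples

def mean_error(samples, trials=400):
    # Average absolute error over many independent renders of the same pixel.
    random.seed(7)
    return statistics.mean(abs(render_pixel(samples) - 1.0) for _ in range(trials))

# 4x the samples should cut the noise roughly in half: sqrt(64 / 16) = 2.
ratio = mean_error(16) / mean_error(64)
```

Going from 16 to 64 samples only halves the error; getting a frame 8x cleaner costs 64x the work.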


It's not only that; it's also rasterisation first. Only the RT buffer is raytraced.

Check out some stuff Cycles will get soon. This video is three years old and on a GTX 970. Sure, it jitters a lot, but with today's modern GPUs, and with the Blender 2.8+ versions of Cycles being significantly faster, this is really, really fast even without denoising. After blue-noise dithering, denoising is significantly cleaner too.

I still believe making Cycles faster should be the primary goal.


Raytracing denoising is different, because ONLY the rays are denoised. Everything else (primary rays, details, geometry, and all textures) is clean, because it's rasterized.
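A toy sketch of that hybrid split (assumed buffer names, purely illustrative): only the noisy ray-traced lighting buffer goes through the denoiser, while the crisp rasterized albedo is multiplied back in untouched, so texture detail never gets blurred.

```python
def box_blur(buf):
    # Crude stand-in for a denoiser: a 1D box filter over neighbouring pixels.
    return [sum(buf[max(0, i - 1):i + 2]) / len(buf[max(0, i - 1):i + 2])
            for i in range(len(buf))]

albedo = [0.2, 0.9, 0.2, 0.9]    # crisp rasterized texture detail: never filtered
rt_light = [1.3, 0.6, 1.1, 0.9]  # noisy ray-traced lighting, true value ~1.0

# Denoise ONLY the ray-traced term, then recombine with the clean albedo.
final = [a * l for a, l in zip(albedo, box_blur(rt_light))]
```

The lighting noise gets smoothed out, but the sharp contrast in the albedo survives intact, which is the whole appeal over denoising a fully path-traced frame.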


You know you can denoise only the light passes in Cycles too, right?

Yes, making Cycles faster should also be a goal, but realtime is the future. I don't want (who does?) to use a render farm to render an animation anymore.


Well, considering I own a render farm, I'd love to have to use it in the future :smiley: Not that any of my render times are over 30 seconds anyway on my own projects.

Yes, I know you can, but because in Cycles everything is pathtraced, you lose either detail, temporal stability, or speed. You can't have all three.


Good for you, congrats on your render farm :smiley: Most people don't have one.


BTW, I tried Neat Video denoise. It's pretty good, but ONLY if your scene is not changing too fast from frame to frame, because it uses a temporal technique to recover the details. So fast action scenes in particular are not suited.

Yeah, I heard about that a while back, before Cycles got three denoisers haha.

It was the first thing I tried when I got my hands on an RTX card. Knowing the look of heavy denoising makes the game feel uncomfortable. That said, Control's more restrained implementation is far more comfortable, and often striking with its clean reflections.

Neither Eevee nor UE4 produces images temporally consistent enough to be valid for common production use. They also don't have acceptable motion blur or proper depth of field without significant compositing effort.

But you've got your head in the clouds and spin, while some of us have our heads in papers and production, so I'm done here.



You judge final-frame rendering by games? Huh?

DOF and motion blur ARE compositing anyway, unless you really, really need 3D deformation motion blur. So no problems there.
And with enough samples (still way fewer than pathtracing needs) you have no noise in Eevee or Unreal.
Just look at Redshift: it's a rasterisation renderer with RT/GI caches. It's very similar.

The irony. I can't run CUDA, but there's no problem having that in Blender.

Ya got OpenCL and CPU lol.