Very impressive real time path tracing demo

I found this online:

Still very GPU-intensive though, as it used a couple of Titans: 40 fps at 720p, 25 fps at 1080p. But give it a generation or two and this could be viable in a normal high-end PC. Maybe earlier with improved tracing algorithms.

Incredible! Why can't Blender produce this? :frowning:

I think we need dedicated Monte Carlo accelerators, with specially crafted hardware units for QMC, a shader interpreter, and the main logic loop. Maybe a new Carmack must be born who will make a next-gen game despite current GPU limitations, and then the GPU industry will go that way instead of sticking with the 30+-year-old Z-buffer architecture and its shader band-aids.

Modern GPUs already spend the vast majority of transistors on general-purpose processors. The rasterization hardware itself isn’t holding us back in any way, and it is actually quite interesting for general computing as well.

In any case, I wouldn’t bet on raytraced games any time soon. Just take the number of samples you need for an acceptable level of image quality, imagine dedicated RT hardware that is 10x as fast as a modern GPU (something that is unlikely), and assume processing power doubles every 2 years: it will still take a decade or more, depending on your definition of “acceptable”.
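The back-of-the-envelope estimate above is easy to sanity-check. All the numbers below are illustrative assumptions (sample counts, the 10x speedup, the 2-year doubling), not measurements from any real engine:

```python
import math

def years_until_realtime(samples_needed, samples_today,
                         dedicated_speedup=10.0, doubling_period_years=2.0):
    """Estimate years until real-time path tracing is feasible, assuming
    throughput doubles every `doubling_period_years` on top of a one-off
    `dedicated_speedup` from special-purpose RT hardware."""
    gap = samples_needed / (samples_today * dedicated_speedup)
    if gap <= 1.0:
        return 0.0  # already there
    return math.log2(gap) * doubling_period_years

# Say we can trace ~4 samples/pixel at game framerates today, but need
# ~1024 samples/pixel for "acceptable" quality:
print(years_until_realtime(1024, 4))  # -> about 9.36 years, even with a 10x chip
```

Even with generous assumptions, the exponential gap works out to roughly a decade, which matches the post's estimate.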

Until then, we’ll still use every hack in the book that makes things look better.

A hardware QMC generator unit would take 1/100 the area of a typical general-purpose ALU, and the same goes for a streamlined hardware pipe for BSDF sampling. Only custom shaders require a general processor. BVH-based triangle intersection can be handled the same way, using a fraction of the power (less heat). Only the huge legacy base of old games prevents this: the new hardware would not be able to render old DirectX games at all, so nobody would buy the accelerators except a few big studios, and those studios already have render farms. Chicken and egg.
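For context, QMC here means quasi-Monte Carlo: deterministic low-discrepancy sequences such as Halton or Sobol that cover sample space more evenly than random numbers. The radical-inverse construction behind the Halton sequence is simple enough (digit reversal in a small base) that a tiny hardware unit for it is at least plausible. Here is the standard textbook version in software, as a sketch of what such a unit would compute, not anyone's actual hardware design:

```python
def halton(index, base):
    """Element `index` of the Halton low-discrepancy sequence in `base`:
    the radical inverse of `index`, i.e. its base-`base` digits
    mirrored about the radix point."""
    result, f = 0.0, 1.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

# 2D sample points (e.g. for pixel jittering) use coprime bases 2 and 3:
points = [(halton(i, 2), halton(i, 3)) for i in range(1, 5)]
# -> [(0.5, 0.333...), (0.25, 0.666...), (0.75, 0.111...), (0.125, 0.444...)]
```

The whole thing is a small loop of integer divides and adds, which is why it would be so cheap in silicon compared to a general ALU.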

The arrival of graphene CPUs could bring render speeds fast enough to defeat any current GPU.

Brigade uses energy redistribution path tracing in its engine. I’ve often wondered about the pros and cons and whether or not it could find its way into Cycles one day. It seems to be faster across the board than vanilla path tracing, but it may have some material limitations or corner cases that don’t work well. Looking at something like Brigade though it definitely looks pretty great, especially for interiors.

This is not the first time Brigade is mentioned; it’s very impressive indeed.
Brigade is improving even more quickly than I thought!
Path tracing will surely be the future of real-time 3D and games, but the question is how long before it happens.
Brigade is developed in parallel with Octane AND cloud services such as cloud gaming. I saw them coming before they said anything lol… an actual recent quote from Sam Lampere answering a question:

“-Cool & how well do you think a Next Gen Console using a 2nd GPGPU for computing will work out?”

“- […] I never thought I would be able to run the kind of scenes we’re doing with Brigade today (which haven’t been shown to the public btw) so I honestly don’t know. But in case the next gen consoles can’t handle it, the cloud will.”

So even if our PCs/consoles don’t have the power, OTOY and others have it and will sell it over the cloud to enable people to play photorealistic games.

I thought it sounded similar to the way Cycles works. Can you explain, or do you know of a link that explains, the difference between a render engine like Cycles and Brigade? Energy redistribution path tracer, you say?

On another note: the realtime bloom and glare in the Brigade video is very nice. Is that a post effect, or is it part of Brigade’s different implementation of path tracing compared to Cycles…?

This is an old video that showcases the differences between path tracing and ERPT in the Brigade engine. It’s a method of path tracing that is similar to bidirectional without the very high memory costs, if I understand correctly. It’s very good for interiors and resolving caustics. I need to do some more research on it myself. Note that the whole “starting off dark” issue was resolved in videos after that one. It would be worth it to watch all of the videos on that channel, IMO.

The bloom is almost certainly a post effect. Doing it for real would require all kinds of expensive optics calculations that just wouldn’t be worth wasting CPU/GPU power on.
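For what it's worth, the kind of bloom games apply as a post effect is just a bright-pass followed by a blur, composited back over the frame. This is a generic toy implementation on a grayscale image (not Brigade's actual code, and a real engine would use a separable Gaussian on the GPU rather than a naive box blur):

```python
def bloom(image, threshold=1.0, radius=1, strength=0.5):
    """Toy HDR bloom post effect on a 2D grayscale image (list of rows):
    bright-pass pixels above `threshold`, box-blur the result, and add
    it back on top of the original frame."""
    h, w = len(image), len(image[0])
    # 1) Bright pass: keep only energy above the threshold.
    bright = [[max(0.0, image[y][x] - threshold) for x in range(w)]
              for y in range(h)]
    # 2) Naive box blur of the bright pass.
    blurred = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += bright[ny][nx]
                        count += 1
            blurred[y][x] = total / count
    # 3) Composite the glow over the original.
    return [[image[y][x] + strength * blurred[y][x] for x in range(w)]
            for y in range(h)]

# A single hot pixel (an emitter at 5.0) bleeds glow into its neighbours:
frame = [[0.1] * 5 for _ in range(5)]
frame[2][2] = 5.0
out = bloom(frame)
```

The point being: this costs a couple of screen-space passes, versus actually simulating lens scattering inside the path tracer.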

I don’t think they used ERPT in those demos; the sampling distribution looks like regular path tracing. Like bidirectional path tracing, ERPT requires the entire framebuffer to be present in memory, because samples are written out to pixels other than the ones the camera rays originated from, which makes tiled rendering impractical. It also shares the problem of occasional “splotchy” artifacts that are more objectionable and harder to remove than the uniform noise of regular path tracing.
In my opinion, for “real world” scenes it’s better to look into virtual lights combined with simple path tracing than to work on complicated and likely inflexible path-space methods. There are just too few programmers who understand (or want to understand) these things, and artists have proven time and time again that they don’t care about “scientific” solutions…
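The framebuffer point can be illustrated in a few lines: in plain path tracing every sample lands in the pixel its camera ray came from, so a renderer can keep just one tile in memory; a splatting integrator (ERPT, bidirectional light-path connections) may deposit energy anywhere, and a tile-only renderer has nowhere to put it. A toy illustration, not any engine's actual code:

```python
def render_tile(tile_x0, tile_y0, tile_size, deposits):
    """Render one tile while keeping only its own pixels in memory.
    `deposits` is a list of (pixel_x, pixel_y, radiance) writes.
    Returns (tile, lost): `lost` is energy that fell outside the tile,
    which is zero for plain path tracing but not once an integrator
    splats to arbitrary pixels."""
    tile = [[0.0] * tile_size for _ in range(tile_size)]
    lost = 0.0
    for x, y, radiance in deposits:
        if tile_x0 <= x < tile_x0 + tile_size and tile_y0 <= y < tile_y0 + tile_size:
            tile[y - tile_y0][x - tile_x0] += radiance
        else:
            lost += radiance  # only a full resident framebuffer could catch this
    return tile, lost

# Plain path tracing on a 4x4 tile: every sample stays in its own pixel.
_, lost = render_tile(0, 0, 4, [(1, 1, 0.3), (2, 3, 0.5)])
# ERPT-style splatting: a mutated path deposits energy outside the tile.
_, lost_splat = render_tile(0, 0, 4, [(1, 1, 0.3), (6, 2, 0.7)])
print(lost, lost_splat)  # -> 0.0 0.7
```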

I don’t buy that at all. We have services like OnLive struggling financially because of infrastructure costs, and that’s while using mid-range hardware. Nobody in their right mind would develop a game that requires thousands of dollars in hardware per active user.

Here’s a new Brigade demo:

Intel is working on rendering technology.
We only need a hardware-accelerated path tracer,
like the Quick Sync technology…

This reminds me of OTOY showing off how fast Octane is with 8 GTX 690s, haha.

I concur: a decade is not a bad bet, given that no one is currently working on a raytraced game.
However, one can’t deny that Brigade is (one of) the sharpest tools in the shed:

  • instancing, realtime BVH, .blend support :D, AMD & Nvidia support, bone animation, collision detection…

And the Brigade 3.0 demo looks quite acceptable (with them being suspiciously quiet about the hardware used). I remember when all games were 2D and people said it would be aeons before 3D games.
And suddenly Elite, Another World and Wolfenstein 3D popped up: looking crappy, with no proper hardware to run them on. I also remember Nintendo’s FX chip, or good old Glide. There was a lot of “fail”. :slight_smile:
But it opened a new field of hardware and software to work with and in.

IMO raytraced games are the next logical step, but the problems are obvious:
How do you make it attractive for gamers? Because they don’t care about the tech, they care about the eye candy. :smiley:

They were running on a pair of Titans last time they showed off a demo. I’d assume they didn’t downgrade.

It seems that this new demo requires only a single Titan card… If true, there have been terrific algorithmic improvements in the software…