Path-Tracing: Why not use it in the BGE?

Hello,

First, I’d like to clarify that I’m not asking for any new feature, just giving you my point of view and an idea, and I’d like to know what you think about it :).

These last years, with the emergence of OpenCL (or CUDA for Nvidia), massive amounts of real-time parallel computation have become possible. Thanks to this ever-increasing power, real-time path tracing is becoming more and more of a reality. Of course this is the ultimate goal, the Holy Grail for any game engine.

When we look at Crysis, Far Cry or Unreal, the graphics already look excellent (and I remember being really astonished the first time I saw the Crysis jungle in real time :)).
However, all we see is a bunch of very complicated “hacks” that try to get closer to reality. For example, to simulate global illumination they use SSAO filters, hard/soft shadows, faked radiosity, etc…

The day real-time path tracing becomes possible, all of these technologies will become useless, because global illumination is simulated naturally by path tracing.
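To make that point concrete, here is a minimal, purely illustrative Python sketch (not Brigade’s or Cycles’ actual code) of how path tracing works: each pixel just averages many random light paths, and effects like color bleeding, soft shadows and ambient occlusion emerge from the recursion instead of being separate hacks. The tiny one-plane “scene” and all the numbers in it are made up for the example.

```python
import math, random

def intersect(origin, direction):
    """Hypothetical scene: a single diffuse ground plane at y = 0. Returns the hit point or None."""
    if direction[1] >= -1e-6:
        return None                                  # ray goes up or sideways: it misses the plane
    t = -origin[1] / direction[1]
    return tuple(o + t * d for o, d in zip(origin, direction))

def sky(direction):
    """Simple environment light: brighter the more the ray points upward."""
    return max(direction[1], 0.0) * 2.0

def cosine_sample_hemisphere():
    """Random cosine-weighted direction on the upper hemisphere (y is 'up')."""
    u1, u2 = random.random(), random.random()
    r, phi = math.sqrt(u1), 2.0 * math.pi * u2
    return (r * math.cos(phi), math.sqrt(1.0 - u1), r * math.sin(phi))

def radiance(origin, direction, depth=0, albedo=0.7, max_depth=3):
    """One path sample: follow the ray, and on a hit keep bouncing recursively."""
    if depth >= max_depth:
        return 0.0
    hit = intersect(origin, direction)
    if hit is None:
        return sky(direction)                        # ray escaped: light comes from the environment
    bounce = cosine_sample_hemisphere()              # diffuse bounce off the surface
    return albedo * radiance(hit, bounce, depth + 1) # indirect light = recursive estimate

# One "pixel" is the average of many noisy path samples. This averaging is exactly
# what makes real-time path tracing so expensive and why Brigade-style engines fight noise.
samples = [radiance((0.0, 1.0, 0.0), (0.3, -1.0, 0.2)) for _ in range(256)]
print(sum(samples) / len(samples))
```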

Today, on powerful computers, this dream is almost within reach with engines such as:

I’m deeply convinced that the first path-traced video games will appear in the next few years (maybe 4-5 years?) and that this will be a revolution for video games.

Of course, the limited power of tablets and mobile phones will not allow path tracing there for a much longer time. However, with the development of cloud gaming, where the computation is done on a server and no longer on the device, this should not be a problem anymore.

So I think that instead of painfully trying to bring to the BGE the technologies that Unreal or CryEngine have had for a long time, we could look further ahead and try to take one step in advance, similarly to Brigade.
This is a huge opportunity for us to bring the BGE back into the race by having this kind of renderer before the others.

And maybe we’re not so far off? Cycles is already very fast, and maybe if a special branch of Cycles, more oriented towards real time (with tweaks and optimizations similar to Brigade’s), were developed, we could reach real-time path tracing?

In the long term, this would also allow the BGE to be integrated more into Blender, because they would share the same sort of render engine (for example, material nodes could be used in both Cycles and the BGE without any modification needed). And that would be awesome…

I don’t know about a new rendering engine.

But for a new, say, OpenGL 3/4 engine, the BGE may want to start with this:

http://nvidia.fullviewmedia.com/gtc2012/0515-B-S0610.html

This, in my opinion, is next-generation rendering, and it’s what UE4 uses. The latest high-end DirectX 11 hardware is needed, however.

Creating an engine like that for the BGE would require the user to have the absolute best hardware you can get, which means one would have to spend thousands to get a rig to run it.

And real-time path tracing for games isn’t quite there yet when it comes to matching the quality of, say, CryEngine. YouTube videos showing the test game for the first Brigade engine showed a lot of noise during scene updates, and that was with the latest hardware at the time.

Like Sinan said, the recommended thing for the BGE would be to make use of some of the latest OpenGL techniques, but there is still a lot of room left in exploring older techniques that would guarantee that most people with GLSL support today can use it.

Unless I’m missing something, if it requires DirectX, it can’t be used within the BGE because the BGE is cross-platform, right…?

EDIT: Do you mean it requires DirectX 11-compatible / caliber hardware? Or that it actually requires DirectX 11?

Maybe with voxels, a limited form of path tracing could be done on today’s very high-end hardware. For diffuse surfaces, the path tracer could do several raycasts through a very low-res version of the scene (allowing for less noise and fewer samples), and some samples could be shared across pixels. It would have to be limited to one bounce only for diffuse surfaces.
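Here is a rough Python sketch of that idea, just to illustrate it (the grid resolution, step counts and sample counts are made-up numbers, not from any real engine): each shading point fires a handful of rays through a coarse voxel proxy of the scene and gathers whatever stored light it hits, limited to one diffuse bounce.

```python
import numpy as np

RES = 32                                    # resolution of the low-res voxel proxy of the scene
occupied = np.zeros((RES, RES, RES), bool)  # which voxels contain geometry
emission = np.zeros((RES, RES, RES))        # light stored per voxel (e.g. from direct lighting)

occupied[10:22, 4, 10:22] = True            # a directly lit "floor" patch as a stand-in scene
emission[10:22, 4, 10:22] = 1.0

def march(pos, direction, steps=48):
    """Step through the coarse grid; return the light of the first occupied voxel, else a flat sky term."""
    p = np.array(pos, float)
    for _ in range(steps):
        p += direction                       # one-voxel steps are enough at this resolution
        i, j, k = p.astype(int)
        if not (0 <= i < RES and 0 <= j < RES and 0 <= k < RES):
            return 0.5                       # ray escaped the grid: constant ambient/sky value
        if occupied[i, j, k]:
            return emission[i, j, k]         # one bounce only: take the voxel's light and stop
    return 0.0

def one_bounce_diffuse(pos, normal, samples=4):
    """A few hemisphere samples per shading point; neighbouring pixels could share/reuse these."""
    total = 0.0
    for _ in range(samples):
        d = np.random.normal(size=3)
        d /= np.linalg.norm(d)
        if np.dot(d, normal) < 0.0:          # keep directions on the normal's side
            d = -d
        total += march(pos, d)
    return total / samples

# A point on a "ceiling" looking down towards the lit floor picks up bounce light.
print(one_bounce_diffuse((16.0, 8.0, 16.0), np.array([0.0, -1.0, 0.0])))
```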

This way of handling lighting gives some nice results, but if you follow Cyril Crassin’s presentations & papers you quickly see that his method is… very complicated :confused:, and the results, despite having nice color bleeding, glossy reflections, etc., aren’t as realistic as ray or path tracing (although I wouldn’t refuse it if it were in the BGE X) ).

Once again, the day real-time path tracing is achieved with an acceptable amount of noise, it’s sad to say, but Cyril Crassin’s technique, along with all the other current lighting systems we see in games, will become outdated and useless.

For the next 3-4 years it totally makes sense to look for this kind of “ingenious hack” to get closer to the GI we see in path/ray-traced pictures, because our computers aren’t yet powerful enough for real path tracing.
Of course Epic Games or Crytek can’t just wait 5 years until everyone has a computer able to do real-time path tracing; they must produce games today.

I totally agree that, as Ace Dragon said, this requires a very powerful computer. However, with this very good hardware, we can already have real-time path tracing today (even if it’s far too noisy for the moment). In a few years, with the progress of computing power, a not-too-expensive computer could be able to render real-time path tracing.

Also, Brigade is still young and I’m sure a path tracer can be optimized further. For example, we could do fewer passes and remove the noise with a denoising filter; many optimizations are possible…
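As a toy illustration of the “fewer samples plus a denoising filter” idea, here is a tiny edge-preserving blur over a noisy render in numpy. Real game denoisers (and whatever Brigade uses) are far more sophisticated and are usually guided by normals and depth; this only shows the principle, and all the numbers are arbitrary.

```python
import numpy as np

def denoise(img, radius=2, sigma=0.2):
    """Average each pixel with its neighbours, down-weighting neighbours that differ a lot (edges)."""
    h, w = img.shape
    out = np.zeros_like(img)
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            patch = img[y0:y1, x0:x1]
            weights = np.exp(-((patch - img[y, x]) ** 2) / (2 * sigma ** 2))
            out[y, x] = (weights * patch).sum() / weights.sum()
    return out

# Fake "1 sample per pixel" render: a flat grey image plus heavy Monte Carlo noise.
noisy = 0.5 + np.random.normal(0.0, 0.15, size=(64, 64))
clean = denoise(noisy)
print(noisy.std(), clean.std())   # the filtered image should be noticeably less noisy
```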

I honestly don’t think that we’ll be able to catch up on all the lighting & rendering techniques other engines already have, because it’s a massive amount of work. When I see that real-time path tracing is starting to become possible on today’s high-end devices, I really think this is the occasion to get better graphics than other commercial engines with less work and in “one move” (i.e. not trying to catch up on features one by one but having an innovative solution that nobody else has yet).
Brigade is developed by only one guy and he’s already getting impressive results!

It would not be a short-term solution because of current power limitations, but in 3, 4 or 5 years this will definitely be possible.

The day it becomes possible, Epic Games will only have to buy the Brigade technology, like they did with Cyril Crassin’s technology, and it’s done. Or they could hire some people to do the work, because they have the money. If the BGE doesn’t get ahead today, then when the time of path-traced games comes, it will become outdated again :(.

Path or ray tracing is the endpoint; we cannot get more realistic than that. It’s clear that video games will eventually come to this technology. Even people without a powerful machine could play through cloud gaming, which can offer high-end power.

Also, beyond the BGE, real-time path tracing would be awesome for artists because they’d see a render of their scene in real time (a bit in the same spirit as Crytek’s CineBox, i.e. using the game engine’s renderer for making movies and pictures).

Do you think that a modified version of Cycles, with optimizations, fewer passes, a denoising filter, etc., could produce the same results as Brigade? Or would it need to be rewritten from scratch?

So you make a cool photorealistic game using the BGE’s new real-time path tracer, and you are excited that the BGE has this feature and thus is up to date.

Then you post it on the forums and find that it is fully playable for only 5 percent of the userbase or less, so all of that dev work went into something that only a small subset of the community can enjoy.

In light of this, I don’t think it is quite time for something like this to be developed for the BGE. Now I know that not everyone on the forum can make good use of the Cycles engine, but that is different: not only does it not have to be real-time, the work produced with it can be enjoyed by everyone, as opposed to a path-traced game, which could only be run by those with the absolute latest generation of GPUs (and the high-end versions at that).

Well, before it reaches enough optimization to be noise-free and real-time for everyone, it could provide a very fast preview in the viewport and replace the current preview we get when we switch to “rendered” viewport shading.
This would allow people who don’t have a powerful computer to still get a fast preview without having to wait hours.

It’d be nice to have the speed we see in Brigade when doing previews and renders.

This would not reach just 5% of users but many more, because it would reach not only game developers and players in a few years, but also artists, allowing them to have very fast or even real-time previews and almost instantaneous renders.

For the moment, every lighting system developed for the BI or Cycles is useless to the BGE, and every lighting system developed for the BGE is of no use to the BI & Cycles.
A fast path tracer would be useful to both game developers and artists.

The idea would not be to replace the current BGE render system straight away and thus make it unusable for many people. But if a fast path tracer (which could maybe be Cycles with some optimizations) were introduced first as a preview tool for artists, and later as an option for game developers who have enough power or use cloud gaming, this would just benefit everyone.

The whole thing is less a question of how many users will be able to run such a framework. It is more a question of who is capable of and willing to develop and integrate such a render framework.

Having different 3D render engines for the BGE is not a new (and not a bad) idea. But no working solution was ever completed.

We have the option to exchange the physics engine. Over the releases, the internal (BGE) physics engine was removed in favor of Bullet/Bullet2.

@Monster: Yep, that’s right. I hope that by the time I finish my studies I’ll have enough math & physics skills to help Cycles or some other render engine get closer to real-time path tracing. But maybe Cycles will evolve enough in the coming years to be as optimized as, for example, Octane, which is already very close to real-time path tracing. We’ll see.
But I’m convinced that in a few years path-traced games will appear, because this can produce photorealistic results, so I don’t think game engine companies will refuse it if it can run in real time.

Yeah, what I meant by DirectX 11 is that you need DirectX 11-class hardware but use the OpenGL equivalents of Direct3D 11, i.e. OpenGL 4.2 or 4.3. So in the BGE’s case you would have to upgrade the renderer and shader code to support the equivalent OpenGL features. Basically, OpenGL 4.2 or 4.3 and up would give you enough features to start implementing what’s in that paper.

DirectX 11 is never an option for the BGE because it needs to support Mac, Linux and Android.

I’ve read a recent interview, though, that talks about how the UE4 technique doesn’t scale very well to large cities, and how you might need something like the techniques used in CryEngine 3 and Geomerics’ Enlighten to get real-time GI that works in practice for games, with many cases covered.

However, as the techniques in UE4 get developed further, they will become just as good as current techniques for real-time GI.

I’ve been playing an MMO game built on CryEngine, and you know what, it was pretty low poly!
I also see that these games use a lot fewer polygons than the CryEngine playable demos, but they make heavy use of MegaTextures and advanced shaders; that’s what ordinary people like, the “OMG, shiny!” effect. We care about polycount because we are geeks…
It’s not because they are bad at the engine, it’s because they research their market.

If you want to go into the “unlimited detail” craziness, good luck.
I want to see how you are going to keep that up against others who use fewer polys but get things done with less work, and how you are going to deliver real-time soft-body physics on an average computer over the internet.

So I hope that some day the BGE gets a MegaTexture implementation; meanwhile we keep writing GLSL stuff.

Note that id Software’s implementation is not so different from our current animated texture and texture atlasing methods combined, with the difference that it only uploads to video RAM, at the moment it is needed, the part of the texture that is actually used.
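A loose sketch of that streaming idea in Python (this is not id Software’s actual code, and the tile budget and window size are made-up): the huge texture lives as tiles on disk, and only the tiles the camera currently references get “uploaded”, with the least recently used ones evicted when the budget is full.

```python
from collections import OrderedDict

TILE_BUDGET = 64                       # how many tiles fit in video memory (made-up number)

class TileCache:
    def __init__(self):
        self.resident = OrderedDict()  # tile id -> fake GPU handle, kept in least-recently-used order

    def request(self, tile):
        """Called each frame for every tile the visible geometry references."""
        if tile in self.resident:
            self.resident.move_to_end(tile)    # already uploaded: just mark it as recently used
            return self.resident[tile]
        if len(self.resident) >= TILE_BUDGET:
            self.resident.popitem(last=False)  # evict the least recently used tile
        handle = f"gpu_texture_{tile}"         # stand-in for the real upload to VRAM
        self.resident[tile] = handle
        return handle

cache = TileCache()
for frame in range(3):
    # Pretend the camera sees a sliding 8x8 window of tiles each frame.
    for x in range(frame, frame + 8):
        for y in range(8):
            cache.request((x, y))
    print(f"frame {frame}: {len(cache.resident)} tiles resident")
```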

You can’t have path tracers for game engines.

Rendering in game engines has to happen at 30-60 FPS, with plenty of power left over for other tasks like AI, physics and gameplay processing.

Even Cycles today renders, at best, 1 frame every 2 seconds for rather simple scenes, even when using high-end GPUs. Are you going to get that improved to even 10 FPS at any point? I just don’t see how it’s possible at this stage.
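To put rough numbers on that concern: taking the 2 seconds per frame above at face value, and assuming (my assumption, not a measured figure) that rendering can only use about half of each frame because AI, physics and game logic need the rest, the required speedup works out like this:

```python
current_frame_time = 2.0                  # seconds per frame today (simple scene, high-end GPU)

for fps in (30, 60):
    render_budget = (1.0 / fps) * 0.5     # seconds available for rendering per frame
    speedup_needed = current_frame_time / render_budget
    print(f"{fps} FPS target: about {speedup_needed:.0f}x faster path tracing needed")
```

That is roughly a 120x to 240x speedup, before even touching more complex scenes.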

Innovation in parallelism needs to happen; think 1 GPU core for every object :slight_smile:
An object-based processor?

The GTX 690 has something like 3k CUDA cores and probably still can’t do path tracing at 60 FPS, so more parallelism isn’t the answer; we need more clever tricks. Furthermore, path tracing isn’t some kind of silver bullet that’s going to fix all of the BGE’s problems. We have plenty of work we can do to bring the BGE more up to speed on modern polygon rasterization techniques. Let’s leave the truly ground-breaking stuff to researchers and the like and work on getting the lower-hanging fruit working. With the limited developer resources the BGE has available, this will give us much more “bang for the buck.”

I have a path tracing scheme that is a trick…

Roll a ball at your target; if it hits, walk the way the ball went, otherwise adjust the offset angle from the straight line and add Z torque. Throw multiple balls, and when one strikes, follow that path?

So the ball follows a path that is an arc, which will not go up too steep an object…

Are you aware of what path tracing is?

I may have been thinking of navigation meshes :slight_smile:

lol, offtopic :slight_smile:

It looks nice and shiny,
but why does it have to?
I thought games were for fun?
or to express a concept,
or both…
looking pretty is for newb-sausages…

Unless you have a Super computer for me to play with :slight_smile:

Can I light based on collisions with a sphere’s verts/faces?
Also, could I use the collision to remove faces, or just negate the results of those collisions? If I did this to a spawned sphere while scaling the sphere up, I could have shadows in real time… like a point light.
It would probably also take some CPU if you had the “brightness” falloff based on each scale iteration…

So I add a sphere and it gets bigger; anything a vert touches is “lit”, and as the sphere gets bigger each vert is dimmer; if I can delete any vert that collides, this is in effect a shadow,

rinse and repeat