Raytrace lighting in BGE

Surely the idea isn’t new and has been discussed before. I learned about raytrace lighting after looking up how Cycles works, and it wasn’t long before I asked myself “what if path tracing could be used in games too, not just offline rendering?”. BGE is already considered a poor engine by many (untrue IMO), and this would be a great opportunity to make it stand out as one of the first complete game engines with raytrace lighting. Blender already has Cycles, so the GE could reuse a few of its components.

Now I’m experienced enough with both Blender and game development (outside of Blender) to know this is not as easy as it might sound. Sure, one could make Cycles render the viewport while BGE is running, but what about performance? Everyone expects a good engine not to drop below 60 FPS, but render engines are designed to cope with seconds / minutes / hours per frame! Using the Rendered viewport and moving the camera around gives a good idea of how Cycles would act in realtime… and no doubt it takes at least a second before anything becomes clear through all that grain.

However, raytracing in realtime is not an impossible dream either. I looked it up and found a surprising number of attempts, using OpenCL or even just GLSL. Some of them actually look pretty mind-blowing! They also seem to run at excellent performance, and some have little to no grain thanks to ray approximation (grain which Cycles certainly has). Here are the best examples I found on YouTube:

https://youtu.be/pm85W-f7xuk

https://youtu.be/pXZ33YoKu9w

https://youtu.be/LNj2PvKZf6s

To get this working in BGE, there’s no doubt that GPU rendering needs to be ready and working, and that a much simpler version of what Cycles does needs to be created. If there’s any hope of getting 60 FPS in a scene of normal complexity, we likely can’t have more than a few ray bounces and only limited glossy / reflection / refraction shaders, and other optimizations would likely be needed too. Still, it’s been proven possible… and maybe it’s possible for BGE too.
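
To make “only a few ray bounces” more concrete, here’s a tiny Python sketch of the kind of loop I mean: one hard-coded sphere, one point light, Lambert shading for the direct light, and a hard cap on mirror bounces. The scene and numbers are made up for illustration; a real implementation would of course live on the GPU rather than in Python, and this is nothing like how Cycles itself is structured.

```python
import math

SPHERE_C = (0.0, 0.0, -3.0)    # sphere centre
SPHERE_R = 1.0                 # sphere radius
LIGHT    = (2.0, 2.0, 0.0)     # point light position

def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
def norm(v):
    length = math.sqrt(dot(v, v))
    return (v[0] / length, v[1] / length, v[2] / length)

def hit_sphere(origin, direction):
    """Distance to the sphere along the ray, or None if it misses."""
    oc = sub(origin, SPHERE_C)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - SPHERE_R * SPHERE_R
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None

def trace(origin, direction, bounces_left):
    """Direct (Lambert) lighting at the hit point, plus capped mirror bounces."""
    t = hit_sphere(origin, direction)
    if t is None:
        return 0.05                                    # background brightness
    p = (origin[0] + t * direction[0],
         origin[1] + t * direction[1],
         origin[2] + t * direction[2])                 # hit point
    n = norm(sub(p, SPHERE_C))                         # surface normal
    l = norm(sub(LIGHT, p))                            # direction to the light
    brightness = max(dot(n, l), 0.0)                   # direct lighting only
    if bounces_left > 0:                               # the "few bounces" cap
        d = dot(direction, n)
        r = norm((direction[0] - 2 * d * n[0],
                  direction[1] - 2 * d * n[1],
                  direction[2] - 2 * d * n[2]))        # mirror reflection
        brightness += 0.3 * trace(p, r, bounces_left - 1)
    return brightness

# One ray through the middle of the view, capped at 2 bounces:
print(trace((0.0, 0.0, 0.0), (0.0, 0.0, -1.0), 2))
```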

What do you think about the idea? How feasible do you think it is to add raytrace lighting to BGE? Has anyone tried it yet, or does anybody plan to? Any ideas on what systems and technologies could be used?

All very well, but my standpoint is that we need some 5 to 10 years before any of these methods become realistic. We simply need better hardware. You need a 500€ GPU to even consider realtime raytracing, and even then it wouldn’t work in interiors. All the Brigade demos are exterior scenes. Why? Well, look at how much slower Octane is in dark interiors: you need more samples to simulate the diffuse light.

Yes, realtime raytracing will be the future. But not yet. I will be interested in 10 years, when I have a mid-budget GPU that can do this stuff. But not before that. :no:

My opinion concerning BGE is: 1) We do not need NEW bling bling, we need to implement the features we already have better. Get the area lights working, and maybe get the shaders hardcoded into BGE. 2) What BGE actually needs is optimization. Work under the hood. GLSL looks good enough, I say; we do not need a new render engine, we need an optimized game engine.

I missed the fact that those scenes are all outdoor areas. In that case it makes sense.

I heard that hardware is an issue too, but kept hoping a smart ray tracer could find a way to run decently on an average GPU. Thinking realistically though, I admit that sounds about right: if you have a resolution of 1680 x 1050, you have 1,764,000 pixels. For each one, you need to trace a ray of light which bounces several times and scans its environment. The whole thing must be done 60 times in one second. Yeah… that’s pretty insane, and even a very smart formula wouldn’t help much.
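
Just to put numbers on that (primary rays only, before counting bounces or extra samples per pixel):

```python
width, height = 1680, 1050
fps = 60

pixels = width * height           # 1,764,000 pixels per frame
rays_per_second = pixels * fps    # 105,840,000 primary rays every second

print(pixels, rays_per_second)
```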

10 years sounds very long, but that’s probably when it will become the default standard everywhere. Hopefully it won’t be more than 3 years until an average computer can run a raytrace engine at what could be considered barely decent performance.

It’s not that far away; the secret is “synaptic chips”.

With these, you could do algorithmic lighting by passing data around in loops.

Imagine something that can handle a 1024*1024 image process on its own, and then imagine a network of them that pass data in 3D…

It’s just a different ball game now…

Think of each thread as an independent node that can be specialized or generalized in its function.
Now imagine you can pass data along more than one route to get to that node, in and out.

Processing now:
Bob is thinking about it.
Jane wants to know now!
Bob will get back to her.

Processing with this:
While Bob thinks about it, Jane grabs the data from the last frame, unless Bob is done.

So all the nodes are constantly updating an image buffer.
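
In other words, it’s like double buffering: the game loop always reads whichever frame was finished most recently and never waits for the renderer. A minimal Python sketch of that idea (the names, timings and fake “frames” are purely illustrative):

```python
import threading, time

finished = [0] * 4                 # whatever "frame" was completed most recently
lock = threading.Lock()

def render_worker():
    """Bob: keeps rendering into a spare buffer, swaps it in only when done."""
    global finished
    for frame in range(1, 11):
        time.sleep(0.05)           # pretend this is one slow ray-traced frame
        new_buffer = [frame] * 4   # rendered into a spare buffer...
        with lock:
            finished = new_buffer  # ...and swapped in once it is complete

threading.Thread(target=render_worker, daemon=True).start()

for tick in range(10):             # Jane: the game loop never waits
    with lock:
        latest = finished
    print("game tick", tick, "shows frame", latest[0])
    time.sleep(0.02)
```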

The prototypes offer further evidence of the growing importance of “parallel processing”, or computers doing multiple tasks simultaneously. That is important for rendering graphics and crunching large amounts of data.

Real-time ray tracing, using super high-powered processors for those years (eight cores in each CPU).

I want to see it on this :smiley:

Could you not simulate reality in real time?

The second video says “on a cluster of IBM Cell blades”. What is that exactly?

http://www-01.ibm.com/common/ssi/rep_ca/7/897/ENUS106-677/index.html

@BluePrintRandom: That sounds like a better system; it would be nice to see something like this. I still imagined it the classic way: compute X rays of light for Y pixels, Z times a second. Using multi-threading wisely would be an important part of getting an acceptable realtime ray tracer to work.
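
For the multi-threading part, the simplest split I can picture is giving each worker its own band of scanlines. Here is a rough Python sketch; shade() is just a stand-in for tracing one ray per pixel, and in a real engine this would be native threads or GPU work groups rather than Python code:

```python
from concurrent.futures import ThreadPoolExecutor

WIDTH, HEIGHT, WORKERS = 1024, 768, 4

def shade(x, y):
    """Placeholder for tracing one ray; returns a fake brightness value."""
    return (x ^ y) & 255

def render_rows(y0, y1):
    """Each worker traces only its own band of scanlines."""
    return [[shade(x, y) for x in range(WIDTH)] for y in range(y0, y1)]

rows_per_worker = HEIGHT // WORKERS
with ThreadPoolExecutor(max_workers=WORKERS) as pool:
    bands = [pool.submit(render_rows, i * rows_per_worker, (i + 1) * rows_per_worker)
             for i in range(WORKERS)]
    frame = [row for band in bands for row in band.result()]

print(len(frame), "scanlines rendered")   # 768
```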

Anyway, I gave more thought to the matter last night. It still feels like a raytrace engine on today’s hardware should be possible… BUT with many compromises. First, a lower resolution: 1024 x 768 means 786,432 pixels, which shouldn’t be as hard to compute. Second, don’t go for 60 FPS, but aim for anything above 30. Third, don’t do any bounces (indirect lighting). As much as this kills the purpose of ray tracing, hardware can’t handle bouncing so many rays of light… so direct lighting would be where it stops.
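
Running the same back-of-the-envelope numbers with those compromises applied shows how much they buy:

```python
width, height = 1024, 768
fps = 30

pixels = width * height           # 786,432 pixels per frame
rays_per_second = pixels * fps    # 23,592,960 rays per second

# Roughly 4.5x fewer rays than 1680 x 1050 at 60 FPS, before even counting
# the rays saved by skipping bounces entirely.
print(pixels, rays_per_second)
```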

If those compromises are accepted (especially the third), path tracing should work acceptably for a game engine. Obviously it wouldn’t be usable for a serious project, but people could make demos or even small games with it. Resolution and frame rate are things the user adjusts or gets automatically, while bounces can be an option people enable based on their hardware and discretion.

But even if it takes time for hardware to catch up, I believe now would be a good time for engines to start implementing the system. First, it would let those who have a good GPU enjoy it earlier. Second, we could better see how it works, have test cases for research, and get to improve the system… so by the time mainstream hardware fully supports path tracing, the engines will already be as good as they get.

For all those reasons, I still support adding it to BGE today. The question is whether any developer finds it worth the trouble.

What makes a lot of other engines better and more popular than BGE is that they don’t care whether everyone has a good PC that can run their games or not. That’s why BGE is stuck at a certain level of progress and new users are constantly switching to other engines. So yes, I support ray tracing “as an option”.

I support a “selection menu” to choose the level of Blender to use:

1. Old
2. New GLSL
3. BLEEDING EDGE! CUDA etc.
4. CRAZY - realtime everything!

Just because it ‘can’ be done on a GPU or in GLSL doesn’t mean it should be implemented, especially in this case, where Moguri or Kupoman would end up spending valuable development time on something that no one can cleanly render at 60 FPS yet.

Sure, you might argue about future-proofing, but it’s important to ensure that the only things that get implemented are those for which we have the hardware to use in games right now (not 5-10 years from now).