Microsoft announces raytracing in DirectX 12

Keep in mind that this demo is not 100% path traced; it’s a combination of rasterisation with ray tracing added in specific places (AO, reflections). Think BI, not Cycles.

Yes. Also everything is just glossy :rolleyes:

It is exactly that, actually. The point is to use classic rasterization and add raytracing on top of it.
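For the curious, here is roughly what that hybrid structure looks like as a minimal C++ sketch. Every function below is an illustrative stand-in, not the real DXR/D3D12 API, and it assumes a deferred-style G-buffer:

```cpp
#include <cstddef>
#include <vector>

struct Surface { float albedo[3]; };   // one G-buffer sample (position/normal omitted)
struct Color   { float r, g, b; };

// Placeholder stand-ins for the raster pass and the two kinds of traced rays.
Color ShadeRasterized(const Surface& s)  { return {s.albedo[0], s.albedo[1], s.albedo[2]}; }
float TraceAORays(const Surface&)        { return 1.0f; }       // occlusion term in 0..1
Color TraceReflectionRay(const Surface&) { return {0, 0, 0}; }  // one glossy/mirror bounce

// Classic rasterization produces the G-buffer; rays are added only where
// rasterization struggles: ambient occlusion and reflections.
std::vector<Color> RenderHybridFrame(const std::vector<Surface>& gbuffer) {
    std::vector<Color> frame(gbuffer.size());
    for (std::size_t i = 0; i < gbuffer.size(); ++i) {
        Color base = ShadeRasterized(gbuffer[i]);      // most of the image
        float ao   = TraceAORays(gbuffer[i]);          // raytraced AO
        Color refl = TraceReflectionRay(gbuffer[i]);   // raytraced reflection
        frame[i] = { base.r * ao + refl.r,
                     base.g * ao + refl.g,
                     base.b * ao + refl.b };
    }
    return frame;
}
```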

We are far from a full raytracing engine; I would even say the name is a bit of an exaggeration, as it concerns only a small part of what is computed.

Also, the tech is not limited to Nvidia. DXR is part of DX12, and each vendor (AMD, Nvidia, Intel) will come with its own driver. Nvidia calls its implementation RTX; as for what AMD will call theirs, well, I don’t know.

Epic just showed its UE4 DXR integration working with a movie-quality Star Wars scene, and I can tell you the quality is very good, at offline-renderer level. It also needed four top Nvidia cards to run it, lol. But since this is a dev version of DXR, performance will no doubt be better by the public release. We have hit movie-quality rendering in realtime now; things can only get better from here. Within a few years mainstream cards will be able to do this.

Honestly, I use Linux exclusively. DirectX is nothing but a blip on the landscape. Can contributors keep in mind that we don’t all live in Microsoft’s world? (And yes, I know that MS contributes code to Linux, but proprietary stuff is best kept that way.)

Here, see it running on a DGX Station (4x NVLinked Tesla V100s) at full HD @ 24 fps


Well, you’re in luck, brother: AMD have also built a realtime raytracing platform to be released (I’ve been trying my best to persuade AMD to show more after their talks at GDC today, but who knows, maybe their work isn’t so impressive yet. I always worry when new tech doesn’t get showcased). But their OPEN SOURCE platform will use Vulkan. Let’s just hope AMD have matched what UE4 showed today with DirectX and Nvidia hardware.

I have my doubts about this. As has been said, if anybody wanted to do (hybrid) raytracing in realtime, they could do it with compute shaders right now. It’s just not worth it on current hardware. This is going to be a software solution, and if and when hardware support eventually arrives, this interface isn’t going to be optimal (unless Microsoft and the GPU vendors have already planned this out).

Honestly, I don’t think a lot of people care.

DirectX is nothing but a blip on the landscape. Can contributors keep in mind that we don’t all live in Microsoft’s world?

I doubt Blender will get any form of DirectX support any time soon, but if somebody stepped up and did it (maybe to bring Blender to UWP and the Windows Store), should they not do it just so the 5% of Linux users don’t feel left out? I don’t think that makes sense. The same “argument” could be used against CUDA support. We’d just have worse software.

tumbleweeds rolling across the screen

Not quite sure what you mean here. If you watched the above video, you now have to admit that it’s reality. And even though it’s on 4x Teslas, it’s still running butter smooth at above 24 fps (movie frame rate). The whole point, as I see it, is to show you don’t need a CUDA or OpenCL compute platform to do this anymore, just the fragment shaders, vertex shaders, compute shaders, geometry shaders, etc. that are part of any new realtime graphics platform like DX12 or Vulkan.

But guess what: DX12 and Vulkan will be far better supported by hardware developers than CUDA (closed) or OpenCL, which is capable but not invested in by most producers or software compatibility engineers.

By pushing this into game tech, card producers had to react. And that SHOULD mean hardware designed to match these needs from now on (which maybe also kills off the problem of compute platforms for virtual currencies mashing up prices to the point where hardware costs twice what it should).

I have no trouble admitting it; I was already convinced it was possible before. I just don’t think there’s a strong case for it on current hardware.

I remember a graphics demo from many years ago which did realtime hybrid raytracing, where the raytracing part was done on the CPU. It was a heavy load, and I’m curious how it would perform today; unfortunately, I can’t find it now.

And even though it’s on 4x Teslas, it’s still running butter smooth at above 24 fps (movie frame rate).

…which means that’s about 8-10 years away from being in consumer electronics, maybe 3-5 with dedicated hardware support.

The whole point, as I see it, is to show you don’t need a CUDA or OpenCL compute platform to do this anymore, just the fragment shaders, vertex shaders, compute shaders, geometry shaders, etc. that are part of any new realtime graphics platform like DX12 or Vulkan.

I don’t think so. It’s been clear that raytracing is possible with compute shaders, but it’s still way too expensive to be used in a hybrid renderer, except maybe for very specialized cases like what ILM is doing. What this new tech is trying to tell me instead is that DirectX is the great new interface to use with the raytracing hardware of the future; or maybe it isn’t, and then I shouldn’t waste my time on it now. Anybody can make a fancy demo for tech that goes nowhere; remember geometry shaders?
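To be fair, the math itself has never been the obstacle: the core operation of a ray tracer is a handful of arithmetic ops that any compute shader can already run. A minimal ray-sphere intersection in plain C++ (the same code translates line for line to GLSL/HLSL compute); the expense comes from doing this millions of times against millions of primitives, not from the operation itself:

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3  sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }

// Returns distance along the ray to the nearest hit, or -1 on a miss.
// The ray direction is assumed to be normalized.
float IntersectSphere(Vec3 orig, Vec3 dir, Vec3 center, float radius) {
    Vec3 oc = sub(orig, center);
    float b = dot(oc, dir);
    float c = dot(oc, oc) - radius * radius;
    float disc = b * b - c;                       // discriminant of the quadratic
    if (disc < 0.0f) return -1.0f;                // ray misses the sphere
    float t = -b - std::sqrt(disc);               // nearest of the two roots
    return (t > 0.0f) ? t : -1.0f;
}

int main() {
    float t = IntersectSphere({0, 0, -5}, {0, 0, 1}, {0, 0, 0}, 1.0f);
    std::printf("hit distance: %f\n", t);         // expect 4.0
}
```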

By pushing this into game tech, card producers had to react. And that SHOULD mean hardware designed to match these needs from now on

I don’t know; ImgTec has had RT-capable GPUs for a while now. Maybe they own all the relevant patents and nobody else wants to bother.

You’re wrong on this one. Even if I had to buy a few cards, getting that render quality at that speed makes all the difference.

There’s a lot of announcements being made, but we have to wait until the fog clears to understand which APIs will support it, what kind of limitations there will be, which GPUs will support it and what performance will be like. The fact that this is aimed at gaming is good for Eevee and Cycles in the long run, since this usually means better tested drivers and good support on consumer cards.

The Star Wars demo looks great, and from what I understand ray tracing is used only for (relatively sharp) reflections, area light shadows and ambient occlusion. These are the same kinds of effects that raytracing was first used for in offline production rendering, as they are relatively cheap, and with denoising over multiple frames they become accessible in realtime. This performance is about what you might expect from running OptiX on 4 Volta cards at 24 fps though, so the fact that a tech demo like this could be made is not that surprising on a technical level.
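For reference, “denoising over multiple frames” mostly means temporal accumulation: blending each frame’s noisy raytraced term into a running history so the effective sample count grows over time. A minimal sketch, assuming a per-pixel scalar term like AO; the blend factor and the reset-on-disocclusion policy are illustrative choices, not what the demo actually uses:

```cpp
#include <cstddef>
#include <vector>

struct TemporalAccumulator {
    std::vector<float> history;  // one value per pixel (e.g. an AO term)
    float alpha = 0.1f;          // weight of the new frame; smaller = smoother

    void AddFrame(const std::vector<float>& noisy,
                  const std::vector<bool>& disoccluded) {
        if (history.empty()) history = noisy;     // first frame: take as-is
        for (std::size_t i = 0; i < noisy.size(); ++i) {
            if (disoccluded[i]) {
                history[i] = noisy[i];            // history invalid: restart
            } else {
                // Exponential moving average: roughly 1/alpha frames of
                // effective accumulation, so noise shrinks over time.
                history[i] += alpha * (noisy[i] - history[i]);
            }
        }
    }
};
```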

GI and DoF were not raytraced in this demo, and I didn’t see soft glossy, glass, motion blur, SSS, hair, volumes, etc. These kinds of effects are more expensive, since not only do you need to trace a lot of rays, you also need to do a lot of shader evaluations and deal with some difficult types of noise. So I don’t expect raytracing to replace existing realtime techniques that quickly, but there’s a lot of interesting research to be done for new and hybrid techniques.

Brecht, you also have to take into account that this demo is not just a rendered animation; the sound, music, realtime particle systems and post-pro are also done in there. It’s a finished shot.

Yep, like I’ve been saying for 2-3 years: realtime is on the way, which is why I always wanted an Eevee-style realtime renderer in Blender. Yeah, this stuff will not replace true rendering for at least another 10 years, but like you say, good-looking movies were made before these extraordinary new path tracing tools existed. Pixar said it best when they stated: who cares if it’s physically correct, go with what works.

Brecht, you’ll laugh at this, but I even started work on my own realtime path tracer a while back (not touched since January last year).

Even on an old GPU it wasn’t that far from realtime.

It was aimed at exactly what’s going on now: raytraced primary rays replaced with a rasterized G-buffer; screen-space reflections with a reflection miss mask (if the screen trace doesn’t hit, fall back to the BVH); and distance-field soft shadows, which could replace a lot of unneeded shadow rays at far camera distances. Many things I posted here and in other places but didn’t have the skill to do on my own.
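A sketch of that miss-mask idea, with both trace functions as hypothetical placeholders: try the cheap screen-space trace first, and only send the rays it can’t resolve to the full BVH:

```cpp
struct Ray    { float orig[3], dir[3]; };
struct Color3 { float r, g, b; };

// Cheap path: march the depth buffer; fails for anything off-screen.
// Placeholder body; a real version would step through screen space.
bool ScreenSpaceTrace(const Ray&, Color3* out) { *out = {0, 0, 0}; return false; }

// Expensive path: full scene traversal. Placeholder body.
Color3 TraceBVH(const Ray&) { return {0, 0, 0}; }

Color3 ReflectionColor(const Ray& ray) {
    Color3 c;
    if (ScreenSpaceTrace(ray, &c))
        return c;             // hit on-screen: reuse an already shaded pixel
    return TraceBVH(ray);     // screen trace missed: fall back to the BVH
}
```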

You, on the other hand, are perfectly placed, brother, to take all this in and come up with something. Your brain must be loving all these developments :slight_smile:

TBH the only part of all that I might miss in Eevee is the reflections. Eevee lacks true reflection power, but if we had real reflections, Eevee would be awesome (more awesome than it already is, I mean, hehe), because for some use cases, like architectural visualization, reflections are what give us the life, and SSR or spherical/planar reflections are not enough.

But the demos are awesome :slight_smile:

Cheers!

Will it be possible to implement these DX12 raytracing effects in Eevee when they are ready?
Or is Eevee based on a different technology?

Eevee is built on OpenGL, so no DX raytracing.

The problem with Eevee is that it’s not finalized. Raytracing is more realistic, but very expensive for the user.
Right now I see that Eevee still has to solve the problems with reflections and with crystal and glass. I’ve also had the problem of pressing F12 in Eevee and getting nothing but a black screen (2.80.5).
I don’t know how many features Eevee still lacks in order to reach that degree of realism, and I don’t know if OpenGL can continue to evolve that far. Nor do I believe Cycles can be merged with Eevee at this point, although it would be great to be able to do this by “different layers”; that hasn’t been invented yet!!

Exactly my thoughts. Bold for emphasis. It’s a demo, but it looks great. The takeaway should certainly be that the trickle-down is on the way. The good news for Blender users is that this tech is much more imminent for viewport-level rendering than for game engines, since the performance requirements for interactivity are lower.

Doesn’t Eevee use a little bit of raytracing as well (to resolve things that can’t readily be done by rasterization)?

According to the videos, you’d be thinking of Eevee once 2.8 comes out, with no trace of BI left.

Screen-space reflection is raytracing, technically. It’s just cheaper because you can sample a pixel that has already been shaded and lit.
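In sketch form (buffer layout and step count are illustrative assumptions), an SSR trace is just a ray march through the depth buffer, and a hit returns a pixel whose already-lit color can be reused instead of shading a new hit point:

```cpp
#include <cstddef>
#include <vector>

struct Framebuffer {
    int w, h;
    std::vector<float> depth;   // view-space depth per pixel
    std::vector<float> color;   // RGB per pixel, already shaded and lit
};

// March a reflected ray in screen space from pixel (px, py) at depth pdepth,
// stepping by (dx, dy, ddepth). On a depth-buffer hit, return the index of
// the pixel whose shaded color can be reused; return -1 on a miss.
int MarchScreenSpace(const Framebuffer& fb, float px, float py, float pdepth,
                     float dx, float dy, float ddepth, int steps = 64) {
    for (int i = 1; i <= steps; ++i) {
        int x = int(px + dx * i), y = int(py + dy * i);
        if (x < 0 || y < 0 || x >= fb.w || y >= fb.h)
            return -1;                                  // ray left the frame
        float rayDepth = pdepth + ddepth * i;
        std::size_t idx = std::size_t(y) * fb.w + x;
        if (fb.depth[idx] < rayDepth)
            return int(idx);        // ray passed behind geometry: count as hit
    }
    return -1;                      // no intersection found on screen
}
```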