Microsoft announces raytracing in DirectX 12

Well, maybe someday your game assets might look more like they do in Cycles.

Ze Matrix is coming …

Quite impressive.

Microsoft could very well try to use this as a killer feature to help their struggling Windows Mobile line and their faltering Xbox business (the latter is well behind the PS4 in raw numbers and may eventually fall behind the Switch as well).

You seriously don’t think game developers using this tech will have any deployment option outside of Xbox, Windows 10 Mobile, and Windows 10 PC (since it uses DirectX), do you?

Nvidia’s current plan with this tech.

In other words, the RTX technology is not going to run at all on current gaming cards and will only work on a future lineup such as the hypothetical Ampere or Turing cards.

If you want this tech now, though, your only option is the $3,000 Titan V.

I can’t see real-time ray tracing being exclusive to DirectX for very long. Other libraries will eventually catch up. This is definitely going to be in the next Xbox, and I’m sure Sony are keeping an eye on this too for the PS5.

I believe this is a technology demonstrator and a pact between Nvidia and Microsoft; the PS4 might sell nicely (for the moment).
But don’t underestimate Microsoft. This tech probably won’t be used in mobiles, but it might be in their next game console, or the one after that.
Microsoft has a lot of money, and in contrast to some tech giants it often creates new stuff; marketing just isn’t always their strong point when something isn’t one of their key businesses, e.g. the Microsoft HoloLens, the Kinect One (one of the best TOF cams), the Microsoft Surface table.
Sure, they make wrong marketing choices too (who remembers Windows Mobile 2003? Android didn’t even exist then).

Also note that Microsoft and Unity work together (it’s integrated into Visual Studio 2015 and 2017), giving Microsoft more ground as a game development platform; in fact, you can write C# and compile Unity games for Android. But in 5 to 10 years the tech giants could change the landscape so much that it might be normal to run ray-traced 3D games on a mobile, if you still use a mobile by then.
Microsoft is just preparing for that future to happen, I think.

Vega cards have 100% of DirectX 12 enabled. So it should work with Vega cards.

The real question is: will 3D programs (Blender FTW!) be able to leverage this technology moving forward? In other words, will real-time raytracing become the norm rather than the exception?

Cool tech demo by Remedy:

It’s very cool tech. But it’s also


It seems that Nvidia has pulled some crap against AMD again. Some brands are forbidden from making AMD cards anymore.

All is not that bad or lost…
AMD Announces Real-time Ray Tracing Support for ProRender and Radeon GPU Profiler 1.2

But again it feels like another new year’s event (spring has sprung, days get longer…), more bling-bling flashing lights right in the field of vision rather than solving real problems. Most of what can be seen in the demo scenes is not realistically complex enough for today’s science to brag about (no complex shaders, no intersecting or overlapping XXL geometry, no messy organic scenes… as if they all know what not to show and what not to do). So it’s more like a circus coming to town: “Pump up the hype!” :wink:

There is no new tech… it’s just a new M$-specific API. It has been possible to do ray tracing since the first programmable shaders 10 years ago… Nvidia created OptiX and CUDA around that time. The reason it was not used is that there are cheap hacks which worked well enough for the computational power of most GPUs so far.
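
Just to back that up with something concrete (a toy, not OptiX and not the new API; the scene and camera numbers below are made up for the example): one primary ray per pixel against a single sphere in plain CUDA, which is the kind of thing GPUs have been able to run for years. It builds with nvcc and writes a grayscale PGM you can eyeball.

```cuda
// Toy example: one primary ray per pixel, analytic ray-sphere intersection.
// Nothing DXR- or OptiX-specific here; plain CUDA has been able to express
// this for years. Scene and camera values are made up for the sketch.
#include <cstdio>
#include <cuda_runtime.h>

__device__ float hitSphere(float3 ro, float3 rd, float3 c, float r) {
    // Solve |ro + t*rd - c|^2 = r^2 for the nearest positive t (or -1 if miss).
    float3 oc = make_float3(ro.x - c.x, ro.y - c.y, ro.z - c.z);
    float b = oc.x * rd.x + oc.y * rd.y + oc.z * rd.z;
    float cterm = oc.x * oc.x + oc.y * oc.y + oc.z * oc.z - r * r;
    float disc = b * b - cterm;
    if (disc < 0.0f) return -1.0f;
    float t = -b - sqrtf(disc);
    return (t > 0.0f) ? t : -1.0f;
}

__global__ void render(unsigned char* img, int w, int h) {
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= w || y >= h) return;

    // Simple pinhole camera at the origin looking down -z.
    float u = (2.0f * x / w - 1.0f) * ((float)w / h);
    float v = 1.0f - 2.0f * y / h;
    float3 ro = make_float3(0.0f, 0.0f, 0.0f);
    float len = sqrtf(u * u + v * v + 1.0f);
    float3 rd = make_float3(u / len, v / len, -1.0f / len);

    float t = hitSphere(ro, rd, make_float3(0.0f, 0.0f, -3.0f), 1.0f);
    // Shade hits by depth, misses as black.
    img[y * w + x] = (t > 0.0f) ? (unsigned char)(255.0f / (1.0f + t)) : 0;
}

int main() {
    const int w = 512, h = 512;
    unsigned char* d_img = nullptr;
    cudaMalloc(&d_img, w * h);
    dim3 block(16, 16), grid((w + 15) / 16, (h + 15) / 16);
    render<<<grid, block>>>(d_img, w, h);

    unsigned char* img = new unsigned char[w * h];
    cudaMemcpy(img, d_img, w * h, cudaMemcpyDeviceToHost);
    cudaFree(d_img);

    // Dump as a grayscale PGM so the result can be checked.
    FILE* f = fopen("sphere.pgm", "wb");
    fprintf(f, "P5\n%d %d\n255\n", w, h);
    fwrite(img, 1, w * h, f);
    fclose(f);
    delete[] img;
    return 0;
}
```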

I guess the development of this API is what triggered AMD to start working on ray tracing a few years ago, so I can give them that. AMD would probably never have gotten into ray tracing if it weren’t for this API… They slept through a lot of ray tracers, including Cycles. Then suddenly they started working on Cycles and ProRender… Well, this DirectX API was most probably the reason. If you want to see what this tech will look like, look at ProRender; it’s most probably the back end of the DirectX API. The API will simply offer better integration between rasterization and ray tracing. It will give you a ready-to-use implementation, so you don’t need to write your own. Initially the ray-traced part will be very small: area-light shadows, reflections, things that can’t work in a rasterizer… It will be far from a modern ray tracer, but at least it’s some progress toward making ray tracing more mainstream.
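
To make the “a little ray tracing on top of a rasterizer” part concrete, here’s a rough sketch in plain CUDA rather than the actual DXR shader stages. The per-pixel positions and normals stand in for what a rasterizer’s G-buffer would already give you, and rays are only used to answer visibility toward a rectangular area light. The plane/sphere/light setup is invented for the example; the real API traces against a driver-built acceleration structure instead.

```cuda
// Sketch of the hybrid idea: the G-buffer (per-pixel position + normal)
// stands in for a rasterization pass, and shadow rays toward an area light
// are the only ray tracing being done. Scene and light are made up.
#include <cstdio>
#include <cuda_runtime.h>
#include <curand_kernel.h>

struct Sphere { float3 c; float r; };

// Does the segment from ro along rd (length maxT) hit the sphere?
__device__ bool occluded(float3 ro, float3 rd, float maxT, Sphere s) {
    float3 oc = make_float3(ro.x - s.c.x, ro.y - s.c.y, ro.z - s.c.z);
    float b = oc.x * rd.x + oc.y * rd.y + oc.z * rd.z;
    float c = oc.x * oc.x + oc.y * oc.y + oc.z * oc.z - s.r * s.r;
    float disc = b * b - c;
    if (disc < 0.0f) return false;
    float t = -b - sqrtf(disc);
    return t > 1e-3f && t < maxT;
}

__global__ void areaLightShadows(const float3* pos, const float3* nrm,
                                 float* out, int w, int h, Sphere blocker,
                                 float3 lightCorner, float3 lightU,
                                 float3 lightV, int samples) {
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= w || y >= h) return;
    int idx = y * w + x;

    curandState rng;
    curand_init(1234ULL, idx, 0, &rng);

    float3 p = pos[idx], n = nrm[idx];
    int lit = 0;
    for (int s = 0; s < samples; ++s) {
        // Jittered point on the rectangular light, then a shadow ray toward it.
        float a = curand_uniform(&rng), b = curand_uniform(&rng);
        float3 lp = make_float3(lightCorner.x + a * lightU.x + b * lightV.x,
                                lightCorner.y + a * lightU.y + b * lightV.y,
                                lightCorner.z + a * lightU.z + b * lightV.z);
        float3 d = make_float3(lp.x - p.x, lp.y - p.y, lp.z - p.z);
        float dist = sqrtf(d.x * d.x + d.y * d.y + d.z * d.z);
        float3 rd = make_float3(d.x / dist, d.y / dist, d.z / dist);
        float cosT = n.x * rd.x + n.y * rd.y + n.z * rd.z;
        if (cosT > 0.0f && !occluded(p, rd, dist, blocker)) ++lit;
    }
    out[idx] = (float)lit / samples;  // 1 = fully lit, 0 = fully in shadow
}

int main() {
    const int w = 256, h = 256, samples = 16;
    // Fake G-buffer: a ground plane at y = 0 with up-facing normals.
    float3* pos = new float3[w * h];
    float3* nrm = new float3[w * h];
    for (int i = 0; i < w * h; ++i) {
        pos[i] = make_float3(4.0f * (i % w) / w - 2.0f, 0.0f,
                             4.0f * (i / w) / h - 2.0f);
        nrm[i] = make_float3(0.0f, 1.0f, 0.0f);
    }
    float3 *dPos, *dNrm; float* dOut;
    cudaMalloc(&dPos, w * h * sizeof(float3));
    cudaMalloc(&dNrm, w * h * sizeof(float3));
    cudaMalloc(&dOut, w * h * sizeof(float));
    cudaMemcpy(dPos, pos, w * h * sizeof(float3), cudaMemcpyHostToDevice);
    cudaMemcpy(dNrm, nrm, w * h * sizeof(float3), cudaMemcpyHostToDevice);

    Sphere blocker = { make_float3(0.0f, 1.0f, 0.0f), 0.5f };  // floating sphere
    dim3 block(16, 16), grid((w + 15) / 16, (h + 15) / 16);
    areaLightShadows<<<grid, block>>>(dPos, dNrm, dOut, w, h, blocker,
                                      make_float3(-0.5f, 3.0f, -0.5f),
                                      make_float3(1.0f, 0.0f, 0.0f),
                                      make_float3(0.0f, 0.0f, 1.0f), samples);

    float* out = new float[w * h];
    cudaMemcpy(out, dOut, w * h * sizeof(float), cudaMemcpyDeviceToHost);
    FILE* f = fopen("softshadow.pgm", "wb");
    fprintf(f, "P5\n%d %d\n255\n", w, h);
    for (int i = 0; i < w * h; ++i) fputc((int)(out[i] * 255.0f), f);
    fclose(f);
    return 0;
}
```

The point of the sketch is the split: rasterization (here, the fake G-buffer) does the heavy lifting, and only a handful of rays per pixel answer the one question a rasterizer can’t, which is exactly the area-light-shadow case mentioned above.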

Considering there is just one main source right now, it may or may not be overblown (the response is from Nvidia, but it’s important to get their side of the story too). Now, if some of the other well-known tech people such as Linus confirm the allegations, then we might have something.

Much like some of the details about why Nvidia’s Pascal cards won’t support the RTX tech: it appears that this kind of technology works best on the so-called “tensor cores” introduced with Volta (something the GTX 1080 Ti and below do not have).

I really hope it’s just a misunderstanding. Nvidia needs competition, and we need the progress.

You need it, not Nvidia XD

The official cut of the video did not show the segment with the flickering and noise (instead showing the evenly lit scene, designed to give the appearance that noiseless real-time raytracing is here and ready to use in games).

Though I don’t know if the denoising being worked on has been applied yet (it will be the only way to make things noise-free, though that does not solve the flickering of highlights). Even with denoising, you’ll still have sharper and smaller details that just can’t be resolved with high fidelity from just 1 sample per pixel (as seen in Nvidia’s papers).
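
To illustrate that trade-off: a toy spatial denoiser in CUDA, just a plain bilateral filter and nothing like Nvidia’s trained denoiser, run on a synthetic noisy stripe image standing in for a 1-spp render. The noise mostly averages away, but stripes thinner than the filter window get softened too, which is exactly the small-detail problem.

```cuda
// Toy spatial denoiser: a plain bilateral filter over a noisy grayscale
// buffer. Neighbours with similar values get averaged (noise goes away),
// while detail finer than the filter window gets softened. The test image
// (stripes + uniform noise) is synthetic, standing in for a 1-spp render.
#include <cstdio>
#include <cstdlib>
#include <cmath>
#include <cuda_runtime.h>

__global__ void bilateral(const float* in, float* out, int w, int h,
                          int radius, float sigmaSpace, float sigmaValue) {
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= w || y >= h) return;

    float center = in[y * w + x];
    float sum = 0.0f, wsum = 0.0f;
    for (int dy = -radius; dy <= radius; ++dy)
        for (int dx = -radius; dx <= radius; ++dx) {
            int nx = min(max(x + dx, 0), w - 1);
            int ny = min(max(y + dy, 0), h - 1);
            float v = in[ny * w + nx];
            // Weight by spatial distance and by value similarity.
            float ws = expf(-(dx * dx + dy * dy) / (2.0f * sigmaSpace * sigmaSpace));
            float wv = expf(-(v - center) * (v - center) / (2.0f * sigmaValue * sigmaValue));
            sum += ws * wv * v;
            wsum += ws * wv;
        }
    out[y * w + x] = sum / wsum;
}

int main() {
    const int w = 256, h = 256;
    float* img = new float[w * h];
    // Synthetic "1-spp" stand-in: thin bright stripes plus heavy uniform noise.
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            float signal = (x % 16 < 2) ? 1.0f : 0.2f;
            float noise = 0.4f * ((float)rand() / RAND_MAX - 0.5f);
            img[y * w + x] = fminf(fmaxf(signal + noise, 0.0f), 1.0f);
        }

    float *dIn, *dOut;
    cudaMalloc(&dIn, w * h * sizeof(float));
    cudaMalloc(&dOut, w * h * sizeof(float));
    cudaMemcpy(dIn, img, w * h * sizeof(float), cudaMemcpyHostToDevice);

    dim3 block(16, 16), grid((w + 15) / 16, (h + 15) / 16);
    bilateral<<<grid, block>>>(dIn, dOut, w, h, 3, 2.0f, 0.25f);

    cudaMemcpy(img, dOut, w * h * sizeof(float), cudaMemcpyDeviceToHost);
    FILE* f = fopen("denoised.pgm", "wb");
    fprintf(f, "P5\n%d %d\n255\n", w, h);
    for (int i = 0; i < w * h; ++i) fputc((int)(img[i] * 255.0f), f);
    fclose(f);
    return 0;
}
```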

Nice video, maybe they figured out noise-free rendering?
Hmm… I didn’t see any glass bottle or the like. Is it ray tracing, or something Eevee-like?

Glass, milk, and skin materials are harder to render, but… I won’t rule out that they solved those as well.

Announcing Microsoft DirectX Raytracing! More about it on Microsoft’s DirectX Developer Blog.