As Nvidia is one of the companies supporting Blender, this is something that could become accessible with Blender 2.9 or 3.0!
The development of GPU acceleration, with CUDA as the language, is very interesting…
Another aspect is that it seems to be an open-source initiative…
Imagine a global network of Blender users working on a production and using such a platform/technology.
I remember at SIGGRAPH 2004, Nvidia devs were saying that they were ‘thinking’ of expanding CUDA as a main core language for acceleration…
Nvidia released Omniverse in open beta today. The render speed is very impressive, almost real-time. The download comes with some amazingly photoreal looking example scenes.
Would be cool to see this in Blender some day
EDIT: link to the download site: https://www.nvidia.com/en-us/design-visualization/omniverse/create/
Maya, UE4, Substance, and a few others are already supported; Blender, Houdini, and a few more are on the way next.
Noticed that too. Hopefully the Blender plugin will be released soon. I did some quick tests with some of my models, and it looks to be quite a bit faster than Cycles with OptiX.
I don’t have good screen recording on my system (I tried OBS, but the result looks stuttery), but this might give an idea of the speed (although it looks sped up, so it’s not the best example):
edit: SSS test (from https://twitter.com/gavriilklimov/status/1305178703517765633?s=20)
Is anyone else seeing this as a winning tool for scene building & rendering, especially when taking into account all the GPU-accelerated physics/dynamics tools? I think more is to come as well, and there are plenty of experimental modules to play with too. I’m pretty excited.
Rys, could you tell me where you downloaded the sample scenes, please?
@akario: You have to install the Local Nucleus Collaboration Service under the collaboration tab in the launcher. Then in Omniverse, connect to localhost and go to omniverse://localhost/NVIDIA/Samples/
There are a few tutorials on the Nvidia page which may help:
https://www.nvidia.com/en-us/on-demand/session/omniverse2020-om1228/ (launcher overview)
https://www.nvidia.com/en-us/on-demand/session/omniverse2020-om1242/ (very cool overview of Omniverse’s real-time rendering capabilities, using some sample scenes)
Adding a screenshot of the content tree in “localhost”. It only appears after installing the Nucleus collaboration server and logging in to it.
And a short video of the Marbles sample scene: https://vimeo.com/493706366
I just cannot believe how fast it is, I have no idea how this is even possible.
Is this fully path traced, or real time raytracing hybrid like UE4? This is Pixar Hydra tech, right?
This is pure ray tracing and path tracing based on Nvidia’s RTX technology. There is no hybrid rasterization component in Omniverse, but in one of the tutorial videos (https://www.nvidia.com/en-us/on-demand/session/omniverse2020-om1242/), they mentioned plans to integrate Pixar’s Storm renderer (the OpenGL rasterizer that ships with USD) as one of the Hydra “delegates”.
Some examples of real-time ray tracing in Omniverse from Nvidia’s YT channel:
It just amazes me how many cool things you can do with this.
Right, so this is Nvidia’s own RTX path tracer running as a Hydra delegate? I also saw Iray running on it too, through OV.
AFAIK both RTX renderers are fully Hydra compliant, and I think the same goes for Iray. In theory, even Cycles could run in Omniverse as a Hydra delegate, or, the other way around, Nvidia’s RTX path tracer could run as a Hydra delegate in Blender (see AMD’s work on USD Hydra support in Blender). That would just require porting the PBR materials as faithfully as possible from one renderer to the other.
All this “Omniverse” stuff is built around Pixar’s USD system. Hydra is just the rendering framework part. So yes I think the RTX renderer in Omniverse is just running as a Hydra render delegate.
Also AFAIK, Nucleus is just a database of USD data.
I’m not sure if the Omniverse app lets you use other render delegates such as Cycles, Storm, or others.
Speaking objectively (and my opinions here are my own, not my company’s): the Nucleus/Omniverse system IS very cool from a content creation and sharing standpoint. IMO, though, it is unfortunate that it locks users into using MDL materials and rebrands Pixar USD as “Omniverse” (this will make it harder to move to other renderers). But hey, I guess Nvidia is going to do what Nvidia does?
In this video, they show a bit of Pixar’s Storm running as a Hydra render delegate in Omniverse, but I haven’t found a way to enable it so far. Not sure if it’s part of the beta release.
So it can render SSS; what about hair, like Cycles can?
I haven’t read anything about hair/fur in the docs. Iray supports RTX accelerated hair (https://resources.nvidia.com/events/GTC2020s22494), so I think it will come to Omniverse at some point.
Thanks! BTW, what GPU are you running Omniverse on? It seems really quick in your video. lol Maybe it’s really quick in general.
I’m gonna give it a try later today. If only we could get this rendering technology in the Blender viewport!
I’m using a GeForce RTX 2080 Ti. Wish I had the RTX 3090, but it’s plenty fast already.
I’ve been looking for a way to use it in combination with Blender by exporting to USD. Some exported scenes work well, but not all, still trying to figure out why.
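For reference, here’s a minimal sketch of the Blender side of that export workflow, assuming Blender 2.82+ (where the `bpy.ops.wm.usd_export` operator was introduced). The `usd_export_path` helper is my own; run the script from inside Blender, e.g. `blender scene.blend --background --python export_usd.py`:

```python
import os


def usd_export_path(blend_path: str) -> str:
    """Derive a .usd path next to the .blend file (plain helper, not Blender API)."""
    root, _ = os.path.splitext(blend_path)
    return root + ".usd"


try:
    import bpy  # only available when running inside Blender

    # Export the whole current scene to USD next to the .blend file.
    bpy.ops.wm.usd_export(filepath=usd_export_path(bpy.data.filepath))
except ImportError:
    # Running outside Blender; nothing to export.
    pass
```

Note that the exporter doesn’t carry over every material feature, which might explain why some scenes look off in Omniverse: Cycles node setups generally need to be remapped to MDL/UsdPreviewSurface on the other side.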