So when is this going to make its way to Blender? It looks awesome.
I know Blender devs have already started working on getting OpenSubdiv in.
Hydra seems very much in line with the current goal of modernizing the viewport.
Is it going to be used, or is Blender going to have its own independent viewport?
Perhaps using theirs would make it easier to also integrate the other open-source Pixar technologies that were designed to work with it?
The open source release includes Hydra, a high-performance preview renderer capable of interactively displaying large data sets.
“With USD, Hydra, and OpenSubdiv, we’re sharing core technologies that can be used in filmmaking tools across the industry,” says George ElKoura, Supervising Lead Software Engineer at Pixar. “Our focus in developing these libraries is to provide high-quality, high-performance software that can be used reliably under demanding production scenarios.”
It also seems to use Python and C++.
Seems like a match made in heaven for b3d.
ProRender is not a real-time renderer. Someone is going to implement support for it in Blender because AMD pays for it.
Hydra, on the other hand, is a real-time renderer. However, it is not clear whether it will be used in Blender. Looking at the ongoing viewport work, it seems rather unlikely that Hydra will be integrated at all.
I read somewhere else that Blender devs are going to use Unreal 4’s viewport code. Does that mean just their PBR shader or the entire viewport?
It was confirmed by cgmasters
It would make sense for them to port it over to b3d, but when you compare it to the scaling and performance of Hydra, it is just not at the same level. In fact, now that Hydra is open source, it is possible that Unreal devs will start looking into it.
Either way, OpenGL 3.2 should impose some limitations. I run a laptop that can’t go higher than that, and it might also mean choosing one technology over the other.
There seem to be a lot of rumors and bits of info surrounding the viewport, but no official statement on the direction or on collaboration with other existing projects.
My understanding is that Hydra uses OpenGL 4.4; Hydra is just OpenGL-based anyway. The reason it’s also so fast is its very efficient scene graph setup and the way it streams data from disk to feed the view. I’ve been meaning to have a look at the OpenGL code in Hydra; should be interesting.
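For what it’s worth, here is a toy sketch of that kind of pull-based scene graph setup. The class and method names below are my own invention, not Hydra’s actual C++ API (which has things like a render index and scene delegates): the renderer only re-pulls data that has been flagged dirty, so a huge scene isn’t re-synced every frame.

```python
# Toy sketch of a pull-based render index / scene delegate model.
# Names are illustrative only -- Hydra's real API is C++.

class SceneDelegate:
    """Owns the app's scene data; the renderer pulls from it on demand."""
    def __init__(self, prims):
        self._prims = dict(prims)       # prim path -> mesh data
        self._dirty = set(self._prims)  # everything is dirty at startup

    def mark_dirty(self, path):
        self._dirty.add(path)

    def get(self, path):
        return self._prims[path]

    def pop_dirty(self):
        dirty, self._dirty = self._dirty, set()
        return dirty

class RenderIndex:
    """Caches GPU-ready data; syncs only what changed since last frame."""
    def __init__(self, delegate):
        self._delegate = delegate
        self._cache = {}
        self.synced_this_frame = 0

    def sync(self):
        self.synced_this_frame = 0
        for path in self._delegate.pop_dirty():
            self._cache[path] = self._delegate.get(path)  # pretend GPU upload
            self.synced_this_frame += 1

delegate = SceneDelegate({"/world/mesh%d" % i: object() for i in range(1000)})
index = RenderIndex(delegate)
index.sync()                        # first frame: all 1000 prims synced
delegate.mark_dirty("/world/mesh7")
index.sync()                        # next frame: only 1 prim re-synced
print(index.synced_this_frame)      # -> 1
```

The point of the design is that per-frame cost tracks how much the scene *changed*, not how big it is.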
The UE4 thing: they couldn’t implement Epic’s rendering code; it’s only the PBR shader setup. The AMD thing is, as I understand it, two projects: 1) getting ProRender working with Blender, and 2) getting a new viewport rendering system working that will also support Vulkan. Makes sense. Vulkan beats OpenGL thanks to its minimal driver layer, and because of how its shader system works: GLSL is converted into SPIR-V (works more like DirectX, where shaders are compiled beforehand, not ad hoc like with OpenGL, which was always one of its main weaknesses).
The good thing with Vulkan is that the GLSL-to-SPIR-V conversion can be adapted to turn pretty much any shader language into SPIR-V; even brand-new shading languages could be built and just compiled to SPIR-V.
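To make the “compiled beforehand” point concrete, here is a small sketch of the offline step using glslangValidator, the reference GLSL-to-SPIR-V compiler shipped with the Vulkan SDK. Actually running the command assumes the SDK is installed; the helper below just builds the command line so the ahead-of-time idea is visible.

```python
# Sketch: compiling GLSL ahead of time to SPIR-V with glslangValidator
# (the reference front end from the Vulkan SDK). Running it requires the
# SDK to be installed; spirv_compile_cmd() only builds the argv.
import subprocess

def spirv_compile_cmd(glsl_path, spv_path):
    # -V selects Vulkan semantics and emits a SPIR-V binary
    return ["glslangValidator", "-V", glsl_path, "-o", spv_path]

def compile_shader(glsl_path, spv_path):
    """Invoke the compiler; raises CalledProcessError on shader errors."""
    subprocess.run(spirv_compile_cmd(glsl_path, spv_path), check=True)

print(spirv_compile_cmd("shader.vert", "shader.vert.spv"))
```

The resulting `.spv` binary is what you hand to Vulkan at runtime, so shader compilation errors show up at build time rather than mid-frame.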
I’d like to hear more about the state of the AMD viewport stuff myself. Blender’s viewport in edit mode, for example, is so bloody slow it drives me mad; it’s one of Blender’s biggest weaknesses right now.
https://wiki.blender.org/index.php/Dev:2.8/Viewport
According to this, they are not using the actual Unreal code, but they may follow its PBR model. I would be surprised if the Unreal license even allowed that kind of usage.
It is not clear to me whether Hydra is practically usable within a game engine. For high-end stuff it seems to scale really well, but is that also the case if you don’t have high-end hardware? There is also the idea of using the viewport renderer for the game engine; I would be surprised if Hydra could easily be used for that purpose.
Hydra is fast, but certainly not as pretty as some of the other options already out in the wild (including Blender’s own PBR fork and Modo’s new viewport). I’m wondering if there weren’t a lot of features turned off in every presentation I’ve seen, because it could be pushed much further.