Does Blender use OpenGL 4.0/4.1/4.2/4.3?
Or is it still using OpenGL 2.0?
It's mostly using 1.4.
That might soon change with the completion of Jwilkens' Viewport FX project, whose purpose is to replace all of the old drawing routines and cruft with modern methods.
Currently, the drawing code in places is little more than "draw this vertex, now draw this vertex" (glBegin/glEnd), which is about as basic as OpenGL use can possibly get. This is part of the reason why viewport speed is one of the few remaining major areas still holding Blender back, and yet not much can be done to improve it now if we don't want to make it even more painful for the Viewport FX devs to keep their branch in sync.
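To make the difference concrete, here is a rough sketch (not Blender's actual code, and not real OpenGL): immediate mode issues one API call per vertex, while buffer-based drawing issues a single call for the whole batch. The `FakeGL` class below is a hypothetical call counter standing in for the driver.

```python
# Illustrative stand-in for the GL driver; just counts how many API calls
# each drawing style makes for the same geometry.
class FakeGL:
    def __init__(self):
        self.calls = 0
    def glBegin(self, mode):
        self.calls += 1
    def glVertex3f(self, x, y, z):
        self.calls += 1
    def glEnd(self):
        self.calls += 1
    def glDrawArrays(self, mode, first, count):
        self.calls += 1

verts = [(float(i), float(i), 0.0) for i in range(1000)]

# Old-style immediate mode: one call per vertex, every single frame.
gl = FakeGL()
gl.glBegin("GL_POINTS")
for v in verts:
    gl.glVertex3f(*v)
gl.glEnd()
immediate_calls = gl.calls   # 1002 calls for 1000 vertices

# Buffer-based drawing: vertex data already lives in a VBO on the GPU,
# so the whole batch is one call.
gl = FakeGL()
gl.glDrawArrays("GL_POINTS", 0, len(verts))
buffered_calls = gl.calls    # 1 call

print(immediate_calls, buffered_calls)  # 1002 1
```

The actual speedup depends on the driver, but cutting per-vertex call overhead is the core reason VBO-based drawing scales so much better.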
In short, the OpenGL code is basic now, but Jwilkens has mentioned that he is likely to have at least something presentable within a few months, so improvements are coming, though it might not be for a couple of releases yet.
Just to elaborate a little: while we do use 1.4, we also use some extensions in places.
The Viewport FX branch won't do miracles. I don't know exactly what performance impact it will have; it might actually even make things slower for some.
It's mostly for compatibility, AFAIR. To make the viewport faster we still need to rewrite the code.
It's mostly for compatibility, AFAIR. To make the viewport faster we still need to rewrite the code.
So… sometime in the future?
4.3 is out and we're still using OpenGL 1.4? I think it's time to develop the core and not concentrate on new features. Well, new features are always welcome, BTW!
Does OpenGL 4 really offer better performance, or do we just want bigger numbers?
DirectX 11 is better than DirectX 9, so OpenGL 4.3 is better than 1.4. I won't go into details; check out the documentation!
Two things:
- Not everyone has a video card that supports OpenGL 4.x. Blender is used by a lot of different people in a lot of different places who can't necessarily afford to buy the newest/fastest/best(est) VFX card.
- People generally work on whatever they want when doing Blender development, so…
Not everyone has a card that supports OGL4, but I would wager that NOBODY is using a card that only supports 1.4. And if they are, they are going to quickly find that CG work probably isn't a hobby they'll be able to keep up with for very long. Even an OpenGL 2 card is dirt cheap these days (in fact, I doubt they even make cards that aren't at least OGL4 compliant anymore). At a certain point we can't let the functionality of Blender be held back by people who won't use hardware from the current decade. If you can't save up for a $50-$100 graphics card over the course of 10 years (the amount of time the OpenGL 2.0 standard has been available), then maybe 3D isn't for you. Hell, the draw method used in Blender has been deprecated and considered bad practice since 2008; at least that should be addressed.
4.3 is out and we're still using OpenGL 1.4?
Just going by version numbers in OpenGL isn't very meaningful. Pretty much any feature that is part of the OpenGL core of a given version is accessible through the OpenGL extension mechanism beforehand. Blender certainly doesn't use any features that became core in OpenGL 4.x (tessellation, compute shaders) for now.
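As a rough sketch of that mechanism (a hypothetical helper, not Blender or bgl code): a feature is usable either because the context version is high enough that it's core, or because the driver advertises the corresponding extension.

```python
# Hypothetical version/extension check; names are illustrative, not PyOpenGL.
def has_feature(gl_version, core_since, extensions, ext_name):
    """gl_version and core_since are (major, minor) tuples;
    extensions is the set of strings the driver advertises."""
    return gl_version >= core_since or ext_name in extensions

# GLSL became core in 2.0, but even a 1.4 context may expose it via extensions:
exts = {"GL_ARB_shader_objects", "GL_ARB_vertex_buffer_object"}
print(has_feature((1, 4), (2, 0), exts, "GL_ARB_shader_objects"))   # True
print(has_feature((1, 4), (2, 0), set(), "GL_ARB_shader_objects"))  # False
```

This is why "we use 1.4" understates what actually runs: the baseline version is 1.4, with newer features pulled in per-extension where available.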
The features required for VBOs and GLSL mode became core in 2.0, for instance. VBOs make the single largest impact on rendering speed, and as soon as all drawing code uses them, there isn't that much to be gained anymore (at least as far as OpenGL is concerned).
Now, having said that, the difference between code using old-style immediate-mode/fixed-function OpenGL and shader/buffer-based OpenGL is enormous. It's quite a burden to support both, which is evident from the Viewport FX project spending a lot of time on just creating a compatibility layer for functions which are deprecated (and don't exist in OpenGL ES, used on Android).
Not everyone has a card that supports OGL4, but I would wager that NOBODY is using a card that only supports 1.4. Even an OpenGL 2 card is dirt cheap these days (in fact, I doubt they even make cards that aren't at least OGL4 compliant anymore).
There are unfortunately a lot of people who have really crappy integrated Intel graphics without full support for 2.0. I don't think Blender should support those, but it does affect certain users (like UncleEntity). As for OpenGL 4.x, it isn't even supported on Mac OS yet.
We just want a custom build for all those who have an Nvidia 6 series graphics card or ATI, and Win 7 or 8 64-bit; maybe later for other OSes.
I just hope that if the viewport code gets revisited, the devs extend the API to allow us to write custom shaders for the viewport. Then we could crank out higher quality playblasts.
DX11 supports more features than DX9, but it seems to be slower in some games, not to mention that it has nothing to do with OpenGL.
But I guess I'm just whining for no reason, since I would like to see OpenGL 4.3 features used in Blender; it's just that I don't buy this "bigger number -> must be better" anymore.
For one, $30 can get you a DX10-level GPU: http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709%20600007816%20600030348&IsNodeId=1&bop=And&ShowDeactivatedMark=False&Order=PRICE&PageSize=20 (on Amazon such GPUs can be purchased used, for less).
Those cards at least support OpenGL 3.2, which provides modern real-time rendering (drawing) methods that are faster than OpenGL 1.4 methods.
And maybe it makes sense to embed a voluntary poll of users' hardware setups (and OS) into BF? This way it's clear how many users use what hardware, and whether it makes sense to keep supporting old stuff.
Another option is to keep OpenGL 1.4 as a fallback API and have OpenGL 3.2 as the main API. OpenGL 3.2 was released in 2009, almost 4 years ago, so many video cards support it.
This is more or less already possible by overriding e.g. the view_draw method in a CustomRenderEngine using OpenGL bindings of your choice (such as PyOpenGL). It's more a problem of converting all the Blender data efficiently, because you can't access the GPU buffers of the geometry through the Python API. Just writing custom shaders doesn't get you much further than what you can do with node materials already; you'd have to implement things like shadow maps and post-processing yourself to get good results.
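The data-conversion problem can be sketched like this (a hypothetical helper, not part of the actual Blender API): polygons come out of the Python API as per-face vertex index lists, but GL draw calls want flat triangle lists, so a custom view_draw ends up re-triangulating on the CPU for every redraw.

```python
def polygons_to_triangles(polygons):
    """Fan-triangulate convex polygons given as lists of vertex indices.
    Each n-gon [v0, v1, ..., vn-1] becomes n-2 triangles sharing v0."""
    tris = []
    for poly in polygons:
        for i in range(1, len(poly) - 1):
            tris.append((poly[0], poly[i], poly[i + 1]))
    return tris

# One quad becomes two triangles sharing vertex 0:
print(polygons_to_triangles([[0, 1, 2, 3]]))  # [(0, 1, 2), (0, 2, 3)]
```

Doing this per frame in Python for a heavy scene is exactly the kind of overhead that makes direct GPU buffer access so attractive.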
OpenGL 3.x doesn't add anything fundamentally new (apart from the seldom-used geometry shaders); the major question is whether you require shaders for future Blender versions or not. Keeping 1.4 around as a fallback means essentially implementing things twice in significantly different ways (and not being able to implement certain things at all). I don't think it is reasonable to do that, but it's up to the people who actually develop these things to choose which direction they go.
If Blender (the Foundation) wishes to remain accessible to people who can't have a modern gfx card, then it shouldn't also want to go after bigger studios…
Bigger studios and customers who would pay for support should be a priority for Blender. This would benefit the rest of the Blender community…
There is always software mode if hw mode fails.
This is more or less already possible by overriding e.g. the view_draw method in a CustomRenderEngine using OpenGL bindings of your choice (such as PyOpenGL)
I was hoping for a more Blender-centric solution rather than relying on third-party APIs.
Just writing custom shaders doesn't get you much further than what you can do with node materials already
Screen Space AO would be nice.
I am not looking to write another Renderer, just a better OpenGL playblast. For instance, in the BGE I can install filter shaders.
I was hoping for a more Blender-centric solution rather than relying on third-party APIs.
It's not really a third-party API; it's just a binding to OpenGL, like bgl (but more complete). bgl isn't particularly Blender-centric either.
Screen Space AO would be nice. I am not looking to write another Renderer, just a better OpenGL playblast. For instance, in the BGE I can install filter shaders.
For SSAO, you'll need a depth buffer texture. But for other effects, you might need to render all kinds of things to textures. You'd really want to be in control of the rendering (or "write another renderer", as you put it; not that big a deal really, unless you want to please everyone). That's the only way to get all the data you actually need into your shaders.
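As one concrete step such an SSAO pass needs (a sketch assuming a standard perspective projection; near/far are the camera clip distances, which you'd fetch from the scene): the depth buffer stores a nonlinear value in [0, 1], and the shader has to convert it back to eye-space distance before comparing samples. Shown here in Python for clarity; in practice this lives in the GLSL fragment shader.

```python
def linearize_depth(d, near, far):
    """Convert a [0, 1] depth-buffer sample back to eye-space depth,
    assuming a standard perspective projection with the given clip planes."""
    z_ndc = 2.0 * d - 1.0  # window depth -> normalized device coordinates
    return (2.0 * near * far) / (far + near - z_ndc * (far - near))

print(linearize_depth(0.0, 0.1, 100.0))  # ~0.1 (near plane)
print(linearize_depth(1.0, 0.1, 100.0))  # ~100.0 (far plane)
```

Without a step like this, depth differences near the camera dominate and the occlusion term comes out wrong, which is why just having "a depth texture" isn't quite enough on its own.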