OpenGL 4.2/4.3 for Blender

Does Blender use OpenGL 4.0/4.1/4.2/4.3?
Or is it still using OpenGL 2.0? :stuck_out_tongue:

It’s mostly using OpenGL 1.4.

That might soon change with the completion of Jwilken’s Viewport FX project, whose purpose is to replace all of the old drawing routines and cruft with modern methods.

Currently, the drawing code in places is little more than ‘draw this vertex, now draw this vertex’ (glBegin/glEnd), which is about as basic as OpenGL use can possibly get. This is part of the reason why Blender’s viewport speed is one of the few remaining major areas still holding Blender back, and yet not much can be done to improve it right now if we don’t want to make it even more painful for the Viewport FX devs to keep their branch in sync.
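To see why per-vertex immediate mode is such a bottleneck, here is a toy call-count model (hypothetical numbers, not real OpenGL code): immediate mode re-issues one driver call per vertex every frame, while buffered drawing uploads the mesh once and then draws it with a constant handful of calls.

```python
# Hypothetical call-count model (illustration only, not real OpenGL code):
# why per-vertex immediate-mode drawing scales so badly versus buffered drawing.

def immediate_mode_api_calls(n_triangles):
    """glBegin + one glVertex3f per triangle vertex + glEnd, re-issued EVERY frame."""
    return 2 + 3 * n_triangles

def vbo_api_calls(n_triangles):
    """glBindBuffer + glVertexAttribPointer + glDrawArrays, regardless of mesh size."""
    return 3  # the vertex data itself already lives on the GPU

# For a modest 100k-triangle mesh:
print(immediate_mode_api_calls(100_000))  # 300002 driver calls per frame
print(vbo_api_calls(100_000))             # 3 driver calls per frame
```

The model ignores driver internals, but the shape of the gap (linear in mesh size versus constant) is the core of the viewport-speed argument.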

In short, the OpenGL code is basic now, but Jwilkens has mentioned that he is likely to have at least something presentable within a few months. Improvements are coming, but it might not be for a couple of releases yet.

Just to elaborate a little: while we do use 1.4, we also use some extensions in places.
The Viewport FX branch won’t do miracles. I don’t know exactly what performance impact it will have, but it might actually even make things slower for some.
It’s mostly for compatibility, AFAIR; to make the viewport faster we still need to rewrite the code.

It’s mostly for compatibility, AFAIR; to make the viewport faster we still need to rewrite the code.

So… sometime in the future?

4.3 is out and we’re still using OpenGL 1.4? I think it’s time to develop the core and not concentrate on new features… well, new features are always welcome, BTW!

Does OpenGL 4 really offer better performance, or do we just want bigger numbers?

DirectX 11 is better than DirectX 9. So OpenGL 4.3 is better than 1.4. I won’t go into details… check out the documentation!

Two things:

  1. Not everyone has a video card that supports OpenGL 4.x. Blender is used by a lot of different people in a lot of different places who can’t necessarily afford to buy the newest/fastest/best(est) VFX card.

  2. People generally work on whatever they want when doing Blender development, so…

Not everyone has a card that supports OGL4, but I would wager that NOBODY is using a card that only supports 1.4. And if they are, they are going to quickly find that CG work probably isn’t a hobby they’ll be able to keep up with for very long. Even an OpenGL 2 card is dirt cheap these days (in fact, I doubt they even make cards that aren’t at least OGL4 compliant anymore). At a certain point we can’t let the functionality of Blender be held back by people who won’t use hardware from the current decade. If you can’t save up for a $50-$100 graphics card over the course of 10 years (the amount of time the OpenGL 2.0 standard has been available), then maybe 3D isn’t for you. Hell, the draw method used in Blender has been deprecated and considered bad practice since 2008; at least that should be addressed.

4.3 is out and we’re still using OpenGL 1.4?
Just going by version numbers in OpenGL isn’t very meaningful. Pretty much any feature that is part of the OpenGL core of a given version is accessible through the OpenGL extension mechanism beforehand. Blender certainly doesn’t use any features that became core in OpenGL 4.x (tessellation, compute shaders) for now.
The features required for VBOs and GLSL mode became core in 2.0, for instance. VBOs make the single largest impact on rendering speed, and as soon as all drawing code uses them, there isn’t that much to be gained anymore (at least as far as OpenGL is concerned).
Now, having said that, the difference between code using old-style immediate-mode/fixed-function OpenGL and shader/buffer-based OpenGL is enormous. It’s quite a burden to support both, which is evident from the Viewport FX project spending a lot of time on just creating a compatibility layer for functions which are deprecated (and don’t exist in OpenGL ES, used on Android).
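The extension mechanism mentioned above works by querying a space-separated extension string and gating code paths on it. Here is a minimal sketch of that pattern; the sample extension string and the `has_extension` helper are made up for illustration.

```python
# Sketch of the OpenGL extension mechanism: before a feature becomes core,
# applications check the space-separated extensions string and fall back to an
# older code path if the extension is absent. Sample string is hypothetical.

def has_extension(extensions_string, name):
    """Exact-token match; a plain substring check would wrongly match
    shorter names that are prefixes of longer extension names."""
    return name in extensions_string.split()

sample = "GL_ARB_vertex_buffer_object GL_ARB_shader_objects GL_ARB_multitexture"

use_vbo = has_extension(sample, "GL_ARB_vertex_buffer_object")
print(use_vbo)  # True -> take the buffered drawing path, else immediate mode
```

This is how pre-2.0 code could still use VBOs on capable hardware, which is why the core version number alone says little about what a program actually uses.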

Not everyone has a card that supports OGL4 but I would wager that NOBODY is using a card that only supports 1.4. Even an openGL 2 card is dirt cheap these days (in fact, I doubt they even make cards that aren’t at least OGL4 compliant anymore).

There are unfortunately a lot of people who have really crappy integrated Intel graphics without full support for 2.0. I don’t think Blender should support those, but it does affect certain users (like UncleEntity). As for OpenGL 4.x, it isn’t even supported on Mac OS yet.

We just want a custom build for all those who have an NVIDIA 6-series graphics card or ATI… and Win 7 or 8 64-bit… maybe later for other OSes…

I just hope that if the viewport code gets revisited, the devs extend the API to allow us to write custom shaders for the viewport. Then we could crank out higher-quality playblasts.

DX11 supports more features than DX9, but it seems to be slower in some games, not to mention that it has nothing to do with OpenGL.

But I guess I’m just whining for no reason, since I would like to see OpenGL 4.3 features used in Blender; it’s just that I don’t buy this “bigger number -> must be better” anymore.

For one, $30 can get you a DX10-level GPU: http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709%20600007816%20600030348&IsNodeId=1&bop=And&ShowDeactivatedMark=False&Order=PRICE&PageSize=20 (on Amazon such GPUs can be purchased used, for less)

Those cards support at least OpenGL 3.2, which provides modern real-time rendering (drawing) methods that are faster than the OpenGL 1.4 methods.

And maybe it makes sense for the BF to run a voluntary poll of users’ hardware setups (and OS)? That way it would be clear how many users use what hardware, and whether it makes sense to keep supporting old stuff.

Another option is to keep OpenGL 1.4 as a fallback API and have OpenGL 3.2 as the main API. OpenGL 3.2 was released in 2009, almost four years ago, so many video cards support it.

This is more or less already possible by overriding e.g. the view_draw method in a CustomRenderEngine using OpenGL bindings of your choice (such as PyOpenGL). It’s more a problem of converting all the Blender data efficiently, because you can’t access the GPU buffers of the geometry through the Python API. Just writing custom shaders doesn’t get you much further than what you can do with node materials already; you’d have to implement things like shadow maps and post-processing yourself to get good results.

OpenGL 3.x doesn’t add anything fundamentally new (apart from the seldom-used geometry shaders) - the major question is whether you require shaders for future Blender versions or not. Keeping 1.4 around as a fallback means essentially implementing things twice in significantly different ways (and not being able to implement certain things at all). I don’t think it is reasonable to do that, but it’s up to the people who actually develop these things to choose which direction they go.

If Blender (the Foundation) wishes/wished to remain accessible to people who can’t have a modern gfx card, then it shouldn’t also want to go after bigger studios…
Bigger studios and customers who would pay for support should be a priority for Blender. This would benefit the rest of the Blender community…

There is always software mode if hardware mode fails.

This is more or less already possible by overriding e.g. the view_draw method in a CustomRenderEngine using OpenGL bindings of your choice (such as PyOpenGL)
I was hoping for a more Blender centric solution rather than relying on third party APIs.

Just writing custom shaders doesn’t get you much further than what you can do with node materials already
Screen Space AO would be nice.

I am not looking to write another Renderer, just a better OpenGL playblast. For instance, in the BGE I can install filter shaders.

I was hoping for a more Blender centric solution rather than relying on third party APIs.
It’s not really a third-party API, it’s just a binding to OpenGL just like bgl (but more complete). bgl isn’t particularly blender-centric either.
Screen Space AO would be nice. I am not looking to write another Renderer, just a better OpenGL playblast. For instance, in the BGE I can install filter shaders.
For SSAO, you’ll need a depth buffer texture. But for other effects, you might need to render all kinds of things to textures. You’d really want to be in control of the rendering (or “write another renderer”, as you put it - not that big a deal really, unless you want to please everyone). That’s the only way to get all the data you actually need into your shaders.
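To illustrate the depth-buffer dependency of SSAO mentioned above, here is a toy occlusion estimate over a 2D depth grid. Real SSAO runs in a shader with view-space hemisphere sampling; this flat CPU version (the `occlusion` helper and sample values are made up for illustration) only shows the core idea: a pixel is darkened in proportion to how many nearby depth samples are closer to the camera.

```python
# Toy sketch of the SSAO idea: estimate a pixel's occlusion from how many of
# its depth-buffer neighbours are closer to the camera. Illustration only;
# real SSAO samples a hemisphere in view space inside a fragment shader.

def occlusion(depth, x, y, radius=1, bias=0.01):
    """Fraction of in-bounds neighbours within `radius` that are nearer
    than this pixel by more than `bias` (bias avoids self-shadowing noise)."""
    d = depth[y][x]
    occluded = total = 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dx == 0 and dy == 0:
                continue
            ny, nx = y + dy, x + dx
            if 0 <= ny < len(depth) and 0 <= nx < len(depth[0]):
                total += 1
                if depth[ny][nx] < d - bias:  # neighbour is closer: it occludes
                    occluded += 1
    return occluded / total if total else 0.0

# Flat plane at depth 0.5 with one nearer blob in the middle:
depth = [[0.5] * 5 for _ in range(5)]
depth[2][2] = 0.3
print(occlusion(depth, 2, 1))  # 0.125 -> slight darkening next to the blob
print(occlusion(depth, 0, 0))  # 0.0   -> far corner is unoccluded
```

The point of the sketch is the data flow: without access to a depth texture from the viewport, none of this is computable, which is why custom shaders alone aren’t enough for a better playblast.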