Going from OpenGL 1.1 to 2.1 for Blender 2.74

Hi @Psy-fy, thanks for throwing some light on this subject. Just to make this clear:

If my card supports OpenGL 2.1 or above, I will be able to use all Blender features; if I don't have support for OpenGL 2.1, I will still be able to use Blender, but some features won't be enabled.

Is this statement right?

Well, I can understand that the devs don't want to lose too many people with an upgrade. It's a valid argument. But upgrading from a long-outdated solution to another long-outdated solution is no solution at all. Especially when you keep in mind that this area will then not be touched for the next 10 years, since it already got its "upgrade" and there are more important things to do.

My point is: if you are already at it, better make it right and up to date. That way it can really last for a while.

BTW, the CUDA argument is not really the same. Updating to the newest OpenGL would actually leave a great many users unable to run Blender at all, while in Cycles you always have the option to render with your CPU; GPU rendering is more of a bonus feature anyway.

But when you use Blender, you usually have an Nvidia card with CUDA. I would bet that you will have a hard time finding Blender users who cannot run at least OpenGL 3 on their PC, even if they have an ATI card. And for very old PCs there are still the older Blender versions available.

A PC is usually outdated after 5 years and then gets replaced by a newer machine. That's the normal range you need to take into account when you implement new features. I would even go back just 3 years, since 3D is a rapidly evolving area, and everybody who deals with it usually knows that.

@blurymind
Yes, we’ll try to expose full screen compositing in the node editor. But if you want SSAO as a material parameter it gets complex because you want to make a full screen compositing shader that depends on material definition or vice versa. So I’d say that most likely this will be done by compositing material render layers with an SSAO shader and some material ID (same as ambient occlusion is a shader in cycles rather than a color).
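
A rough sketch of what such a compositing pass could look like, just to make the idea concrete: a GLSL 1.20 fragment shader (embedded in a C string) that applies an SSAO layer only to pixels whose material ID matches a chosen ID. The texture names, the target_id uniform and the ID encoding are hypothetical; this is not anything Blender actually ships.

```c
/* Hypothetical full-screen compositing pass: apply an SSAO term only to
 * pixels whose material ID matches a chosen ID. Names and encoding are
 * illustrative only. */
static const char *ssao_composite_frag =
    "#version 120\n"
    "uniform sampler2D color_layer;   /* rendered material layer        */\n"
    "uniform sampler2D ssao_layer;    /* screen-space ambient occlusion */\n"
    "uniform sampler2D id_layer;      /* per-pixel material ID          */\n"
    "uniform float     target_id;     /* ID that should receive AO      */\n"
    "void main(void)\n"
    "{\n"
    "    vec2  uv    = gl_TexCoord[0].st;\n"
    "    vec4  color = texture2D(color_layer, uv);\n"
    "    float ao    = texture2D(ssao_layer, uv).r;\n"
    "    float id    = texture2D(id_layer, uv).r;\n"
    "    /* darken only the selected material, leave the rest untouched */\n"
    "    float mask  = (abs(id - target_id) < 0.5 / 255.0) ? 1.0 : 0.0;\n"
    "    gl_FragColor = vec4(color.rgb * mix(1.0, ao, mask), color.a);\n"
    "}\n";
```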

@Tiles
It depends on what your PBR material definition is but again we can use extensions for any functionality that is needed to get such materials. As far as I have seen in papers such materials may use texture level of detail or derivative functions (to get a glossy texture for instance) which are OGL 3.x level, and usually an extension or two away from OGL 2.1. It’s not that hard to use such extensions, we do use them for real time bump mapping for instance. Again, the challenge here is having shaders -at all-. Assuming we won’t be updating the backend in ten years is inaccurate at best. We can always do abstractions for better hardware. For instance, instancing (ha!) requires OpenGL 3.x extensions and we’ll definitely be using that at some point.
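
To make the "an extension or two away" point concrete, here is a minimal sketch of instancing on a 2.1 context, assuming GLEW for extension loading; the function and the fallback strategy are illustrative, not Blender's actual drawing code.

```c
#include <GL/glew.h>

/* Draw `instance_count` copies of the currently bound vertex data.
 * GL_ARB_draw_instanced is an OpenGL 3.x-level feature, but it is widely
 * exposed as an extension on 2.1-class drivers. */
static void draw_instances(GLint first, GLsizei vert_count, GLsizei instance_count)
{
    if (GLEW_ARB_draw_instanced) {
        glDrawArraysInstancedARB(GL_TRIANGLES, first, vert_count, instance_count);
    }
    else {
        /* Without the extension, fall back to one draw call per instance;
         * the shader would read the instance index from a uniform here. */
        GLsizei i;
        for (i = 0; i < instance_count; i++) {
            glDrawArrays(GL_TRIANGLES, first, vert_count);
        }
    }
}
```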

@lich Not everything, but all core functionality: UI, tools, most materials and display. (Cycles on the GPU or some fancy materials won't work, for instance.) Personally I think shader model 3 instruction counts and conditional execution, framebuffer objects and float textures should be a requirement too.

@rdo3 this is incorrect. 1.2 is totally different from 2.1.

The reason we don't upgrade to a newer OpenGL is that we would preclude even more users from core functionality if we did. OpenGL 2.1 is a good middle ground between functionality and compatibility. The key point to remember here is "core". Even the UI and tools will be able to assume shader support, not just isolated parts like Cycles on the GPU.

Blender users who do not have an Nvidia CUDA card are far more common than you think, not even counting the people who use newer AMD cards. If you only go by the people voicing their concerns here on the forum, you do not get an accurate picture of the world's Blender users.
Personally I would not mind the developers upgrading to at least OpenGL 3, since I have two computers that can handle that without a problem and I am going to upgrade soon to something even better. But I know what it is like not to be able to do that, and I know that many people are in that position. OpenGL 2.1 is a good middle ground.

Also, far from everyone uses Cycles to render, and even fewer use their GPU for rendering.

This can’t be stressed enough:

One doesn’t “update” to a higher OpenGL version like one would update a library. What you do is start using features, those can be either extensions or core features. Usually features are available as extensions first, then they become core features (and requirements) of a particular OpenGL version. Blender has been using (but not requiring) OpenGL 2.1+ features for years already!
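
As a concrete illustration of "using but not requiring" (a sketch assuming GLEW, not Blender's actual capability code): framebuffer objects became core in OpenGL 3.0, but most 2.1-era drivers already expose them through the older EXT/ARB extensions, so the runtime check covers both paths.

```c
#include <GL/glew.h>
#include <stdbool.h>

/* Framebuffer objects are core in OpenGL 3.0, but shipped for years before
 * that as GL_EXT_framebuffer_object / GL_ARB_framebuffer_object.  In
 * practice you check for the feature, not for a version number. */
static bool has_framebuffer_objects(void)
{
    return GLEW_VERSION_3_0 ||
           GLEW_ARB_framebuffer_object ||
           GLEW_EXT_framebuffer_object;
}
```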

Always using (and requiring) the latest core features of OpenGL is about the stupidest thing you could do. NVIDIA is usually the first company to support the latest version, AMD tends to follow only months later, followed by Intel. That's on Windows; Linux drivers tend to lag behind even more. Mac OS lags years behind and hasn't been updated past 4.1 to this day (that means no "Compute Shaders" on OSX!). Especially new features tend to be buggy, and few developers use and test them, which makes it less likely that the bugs get fixed. Very few developers use OpenGL in the first place, so it doesn't get the same priority as Direct3D. The last major game to launch on OpenGL was RAGE, and it was a disaster.

What you want to do is use as few fancy OpenGL features as possible, if you value your developer time. You want to go with what has been stress-tested and proven to work. We don't want to transfer the AMD OpenCL experience to the viewport, do we?

Okay, so it seems we cannot convince you and the other developers. A shame; you are really missing a chance here.

Nevertheless, good luck, and thanks for giving us some insight.

It depends on what your PBR material definition is

It's not a definition really; it's a realtime render technique, and it requires different shaders and different textures than the traditional approach with a diffuse map, specular map, opacity map and so on. Now you have an albedo map, a reflectivity map, etc.

PBR means physically based rendering. It's the new workhorse in games for making surfaces look as good and physically correct as possible. Allegorithmic Substance offers it, and the big game engines like Unreal and Unity use it. And I think I have already seen such an implementation even for the BGE.

Tools for creating the textures for your mesh are switching to PBR now too, like 3D Coat. And sooner or later you might want the tools to create PBR textures in Blender as well. It's the future for realtime.

You want to go with what has been stress-tested and proven to work.

You might have noticed that the computer world evolves from time to time, and 3D is one of the areas that evolves fastest. I know that BI is proven and works, but I prefer to work with Cycles nowadays, since BI is outdated and doesn't give results as good as Cycles. The same goes for OpenGL 2: it is already outdated.

You misunderstand. There are many techniques to render PB materials. If you read papers from recent games you'll see there are differences in the implementations. Some just do raw texture sampling, others precompute a texture map to store some parameters, others use different BRDFs, and so on. Not all of them do the same shader calculations. That's what I'm referring to when I say definition. This is more a math and material definition problem than an API problem, though. At the shader level you just have textures, and you combine them according to the mathematical definition of your material.
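
For illustration only, here is one common parameterization of such a specular term (GGX distribution, Schlick Fresnel, Smith-style geometry term) written as plain C math. It is a sketch of one possible definition, not the one any particular engine uses; swap the distribution or the geometry term and you get a different shader even if the textures feeding it are the same.

```c
#include <math.h>

/* One possible "PBR" specular term: GGX distribution, Schlick Fresnel,
 * Smith-style geometry.  All dot products are assumed clamped to [0, 1]. */
static float ggx_specular(float n_dot_l, float n_dot_v, float n_dot_h,
                          float v_dot_h, float roughness, float f0)
{
    const float pi = 3.14159265358979f;
    float a  = roughness * roughness;   /* perceptual roughness -> alpha */
    float a2 = a * a;

    /* GGX normal distribution function */
    float denom = n_dot_h * n_dot_h * (a2 - 1.0f) + 1.0f;
    float d = a2 / (pi * denom * denom);

    /* Schlick approximation of the Fresnel term */
    float f = f0 + (1.0f - f0) * powf(1.0f - v_dot_h, 5.0f);

    /* Schlick-GGX geometry term, Smith form */
    float k   = a / 2.0f;
    float g_l = n_dot_l / (n_dot_l * (1.0f - k) + k);
    float g_v = n_dot_v / (n_dot_v * (1.0f - k) + k);

    return (d * f * g_l * g_v) / fmaxf(4.0f * n_dot_l * n_dot_v, 1e-4f);
}
```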

I see, thanks :slight_smile:

Never mind. Realized my back vs. forward compatibility mistake. Braincramp. Sorry.

Psy-Fi, would you be so kind as to answer directly whether supporting multiple OpenGL versions is a problem? It would allow users with older hardware to continue using Blender, as well as keep the performance/feature-hungry camp happy.

Thank you

The thing is, you don’t really know what you are talking about (and you probably haven’t even read my post). Just earlier you said Blender should “update” to the newest version of OpenGL (4.5). That would mean dropping support for Mac OS X and all Intel/AMD GPUs. Clearly the 3D world isn’t evolving quite as fast as you’d think.

OpenGL 2.1 is just a feature level. It’s not outdated, because those features are a subset of all newer OpenGL versions. (If you disregard the deprecated features, which are not going to be used).

@rdo3:

GLSL version is not the same as OpenGL version, at least back then it wasn’t

Better to sculpt the model in real clay. Too many people can't afford a computer :eyebrowlift:

There are probably people who don't know their system can support OpenGL 2.1. Off topic: I recently discovered, after 6 years, that my system is 64-bit, so I installed Ubuntu Studio 64-bit, and I'm so glad it saved my ass on one project. Anyway, I think sooner or later Blender should move on; it is better for the developers to focus on one thing, whether that's a more stable version or adding display features to the scene. In my opinion the Blender viewport is slow, so my vote is yes. Certainly there are people who can't run OpenGL 2.1, but I think having the option to download an older version of Blender is a good answer for them.

2.1 is a good compromise, I think. My graphics card could handle 4.3, but hey, not everybody has a top-notch GPU, so I think we shouldn't be egoistic here.
Blender users aren't only from wealthy Europe or the USA, so give the others a chance too :yes:

If the worst consequence for a user of an older card is falling back to solid mode or missing out on some of the eye candy, then I see no reason not to do it. If you are using 10+ year old hardware, you should be used to not having eye candy. It will still render and model and texture just as it always has for the end user, just a tiny subset of the userbase won’t have the option to turn on all the fancy switches.

Really, we have support for GPU rendering, but not everyone has a GPU that can leverage it. It's an analogous situation in my book; we shouldn't remove GPU rendering to ensure compatibility with all users' systems.

Tough call. Viewport improvement is really a must. I have no real idea what the differences between 2.1 and 4.x are or what the latest versions offer.

But you have to make a trade-off: if some people sadly cannot use their outdated GPUs anymore at some point, then maybe they should invest in one that can deal with 2.1. Those cards are not that expensive. Some will always be left behind, as painful as that is. That's just reality.

And to be honest, Blender is a 3D app; if you want to do 3D, you need to be aware of what it requires. I hope I do not sound unfair or harsh; I am trying to be quite direct and honest here.

I have to tell my students the same thing when they complain about homework time and money. They made the decision to study design, so they should expect the workload, and that here in this country it will sadly cost a lot of money.

No frigging wonder people think that Blender is a toy. Updating to a 9-year-old API. This is beyond a bad joke.

There are hobbies that everyone can get into, and hobbies that, by definition, NOT everyone can get into. I'd put 3D in the latter category. Sure, anyone can tumble around a cube or a sphere (metaphorically speaking), but if you want to advance beyond that point, you just can't get around the fact that you need some brute-force horsepower. I don't think it wise to base the progression of tech requirements in a field as fast-changing as 3D on the lowest common denominator, no matter how noble Ton's "3D for everyone" goals are. It's simply not feasible, just like "skydiving for everyone" or "world-class yacht championships for everyone". As with any hobby, there comes a point in your education where you need to give in and upgrade. Limiting scope by catering to those who refuse to get "serious" about 3D is a losing strategy in the long run, in my opinion. Blenders 1.8 - 2.73 aren't going anywhere (the beauty of open source). If you can stretch the limits of 2.73 without even considering an upgrade to a GPU from this decade, and $30-$50 for an OGL 3-capable card is breaking the bank, then you may want to reconsider your priorities regarding where you're spending your time and money.

That said, 2.1 compliance is better than nothing. The jump to a shader-based pipeline is much bigger than the eventual jump from 2.1 compliance to 3.0+ compliance will be. While I'd rather see 3.0 as the minimum (I think 7 years is a MORE than healthy lead time), as Antony said, most of what comes after 2.1 can be had in the form of extensions. My concern is that things will quickly turn into a heap of hacked-in extensions, such that it would have made more sense to just bite the bullet and push headfirst into modern times from the get-go.

Now, I’m not an OpenGL expert by any means, just some dabbling here and there. And, as I understand it, simply jumping to modern OpenGL wholesale would completely destroy ALL of Blender’s UI code, requiring a total rewrite of pretty much everything you see. So I understand the iterative upgrade approach. I just wonder if a big “shock” in the form of a major rewrite isn’t what it will finally take to get Blender in line with the rest of the 3D field, and if that won’t come at a point where everyone else has moved on to the next big thing anyway. Judging by Ton’s thoughts on 3.0 on the code blog, it seems he has a similar line of thinking, but given the limited coding resources of the Foundation, who knows how that will pan out.