GSoC: OpenGL viewport optimization!

Because of this very thread. I’m seeing more problems than there are successes, and the GSoC term has run out or is about to knock on the door. Sounds like there are a whole lot of issues left. It isn’t even a beta if it has issues with cards.

I like this project as much as any other. I just wanted confirmation whether this project is a go or a no-go. It is not something to be ashamed of, nor is my statement negative in nature. I’m just making an observation, and my other observation is that there is a graveyard full of failed or unfinished GSoCs.

Of course your statement was negative in nature; adding a question mark at the end doesn’t change that :).

If you want my opinion, I’m quite optimistic about this project getting into trunk.

Intel graphics are the bastard child of graphics, a far third. Any problems are on their end, and ATI isn’t faring much better, they are struggling a bit as a company.

The new version works perfectly as far as I can tell!!

Thanks for clearing that up for me. In that case I’m anxiously awaiting the stable version (although I’m on Intel graphics too).

On a side note… whatever happened to lightcuts? I thought it was finished before the end of the GSoC term, and that Uncle Ziev even went beyond the scope by adding bounces. I’m just asking because I don’t understand the whole GSoC-to-Blender procedure. Do finished GSoC projects get added to the main branch right away, or are they put on ice or something?

Please realize that just because you find graphics errors in an application, it isn’t automatically the application’s fault.

Your graphics driver is responsible for a large majority of the rendering tasks. You can tell there is a driver issue when cards A and B work fine but card C doesn’t.

Nvidia has engineered the best OpenGL drivers in the past few years. ATI has fairly bad OpenGL drivers (they are slowly improving…), and Intel’s are by far the worst.

It was only 3 weeks ago that I could finally USE Blender 2.50 on my laptop’s integrated Intel graphics. Driver issues - bleh.

Sorry to change the subject, but I would like to point out that Intel has invested heavily in the graphics stack of Linux and open source in general. Intel is helping to change the whole structure of the open source Linux drivers and is (from kernel 2.6.30) starting to show good results. Look for DRI2 and KMS… it’s promising.

About the GSoC project: congratulations on any progress; optimizing the 3D viewport can’t be easy.

Thank you! =)

> Intel graphics are the bastard child of graphics

They are also, unfortunately, the most popular graphics chipset on the planet (a low-cost computer pretty much means an integrated Intel chipset).

Hmm… strange opinions here; looking at the thread on graphicall I see many positive opinions about the project. So why call it a failed GSoC project? Because someone’s Intel “integrated graphics card” is not fully supported?

In my opinion, if someone is serious about CGI, they can invest some money in a GPU with good OpenGL support, like a FireGL or Quadro. Look at other CG forums; many users have those cards.

As someone said, optimizing the 3D viewport isn’t easy. Blender uses immediate mode, the easy 1992-era way of drawing geometry; VBOs are the modern 2003-era way of handling geometry. Lukas did great work introducing them into Blender. I believe that if the project is merged into trunk, there will be some option to disable them…
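To make the immediate-mode vs. VBO contrast concrete, here is a minimal sketch (not Blender’s actual code): immediate mode issues one driver call per attribute per vertex every frame, while a VBO packs the data into one interleaved array, uploads it once, and draws with a single call. The `gl*` calls are shown only in comments because they need a live GL context; the buffer-packing logic is real.

```c
#include <assert.h>
#include <stddef.h>
#include <string.h>

/* Interleaved layout a VBO upload would use: position + normal per vertex. */
typedef struct { float pos[3]; float nor[3]; } Vertex;

/* Pack separate position/normal arrays into one interleaved buffer,
 * returning the byte count you would hand to glBufferData. */
static size_t pack_interleaved(const float *pos, const float *nor,
                               size_t nverts, Vertex *out)
{
    for (size_t i = 0; i < nverts; i++) {
        memcpy(out[i].pos, pos + 3 * i, sizeof out[i].pos);
        memcpy(out[i].nor, nor + 3 * i, sizeof out[i].nor);
    }
    return nverts * sizeof(Vertex);
}

/*
 * Immediate mode (1992-era, one call per attribute, per vertex, per frame):
 *     glBegin(GL_TRIANGLES);
 *     for (i = 0; i < nverts; i++) {
 *         glNormal3fv(nor + 3 * i);
 *         glVertex3fv(pos + 3 * i);
 *     }
 *     glEnd();
 *
 * VBO (2003-era, upload once, one draw call per frame):
 *     glGenBuffersARB(1, &vbo);
 *     glBindBufferARB(GL_ARRAY_BUFFER_ARB, vbo);
 *     glBufferDataARB(GL_ARRAY_BUFFER_ARB, nbytes, verts, GL_STATIC_DRAW_ARB);
 *     glVertexPointer(3, GL_FLOAT, sizeof(Vertex), (void *)offsetof(Vertex, pos));
 *     glNormalPointer(GL_FLOAT, sizeof(Vertex), (void *)offsetof(Vertex, nor));
 *     glDrawArrays(GL_TRIANGLES, 0, nverts);
 */
```

The win is purely in call count: for a 100k-triangle mesh, immediate mode makes ~600k driver calls per frame, while the VBO path makes a handful, which is why the benefit shows up most on drivers with high per-call overhead.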

I’m positive about this project; I believe it is a good starting point for further optimization.

Cheers :smiley:

FX5200 WinXP guy here.
Sample scene: http://www.pasteall.org/blend/535
The GSoC VBO build (overlap draw method) is about three times slower than the official 2.49a. Visually, I’d say 8fps in the GSoC build against 30fps in official 2.49a. Yeah, all these 2.5 builds tend to be slower than 2.49a (which is the main reason I’d like to stick with the old interface for some time after 2.5 is finally released). Nevertheless, I tried opening the same scene using rev 22334 (a 2.5 build without VBOs, I guess) and you know what? It’s about two times faster than your GSoC project! Visually, 20fps against 8. Collecting all the data for the same scene, we get:

GSOC build (overlap): ~10fps
2.5 rev 22334 (overlap): ~20fps
2.49a (official): ~30fps

Thus, if this becomes included in trunk, I will get major slowdown in 2.5 for no particular reason.
At least, I need an option to turn this deoptimization off. I guess those who have old cards will support me.

I’m sorry that you are finding such a slowdown, but it’s not for “no particular reason”… it’s because you have a GPU that is 5 generations old.

Please don’t take this out of turn, but it’s not fair to call obvious improvements within Blender bad because your hardware is so out of date, and though I do have sympathy for you, a button that allows the optimisations to be turned off realistically isn’t very likely or practical :S

Maybe it’s time to get a newer card? The one you have is 6 years old, after all.

They are so cheap now it’s not even funny.

True words. If you can stay with a six-year-old GPU, you can also stick with an old Blender version. Everything else just constricts further development; the same goes for Intel GPUs.

I also agree. An ATI graphics card with almost 9x more RAM than your current card costs just €55. That’s really cheap. For Blender’s sake, just stop using an absolutely obsolete graphics card. I understand you don’t want to, and it’s your right to use your old graphics card, but from nothing comes nothing.

There are times when you have to move on, and it seems the time has come :yes:. Blender development is for the future, not the (very long ago) past.

But you should get in contact with Jaghurandi (lol did I write it right ;D?). Maybe you can help him fix these problems! I’m sure he doesn’t have many testers with workstation graphics cards.

I think the Blender devs should be about as concerned with supporting very old computers and GPU chipsets as app makers today are with supporting Windows 95: for Blender to head into the future, support for extremely old GPUs and computers may have to be dropped. Many people who make CG art with any app probably have computers and GPUs less than 4 years old.

accessiore: this is imbusy’s branch, not jaguarandi’s.

Regarding the slow downs some are seeing, please keep in mind that optimization is not a simple process. You don’t just insert the “make things go faster” GL command; there’s a long process of tweaking and testing to get things right. The people saying “get a new card” might want to slow down; the branch hasn’t even been merged yet, and we can certainly keep trying to improve performance.

If it really does end up that VBOs simply won’t work right on some cards, then there probably will be a UI toggle for this feature.
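Such a fallback would likely hang off a runtime capability check. A minimal sketch (an assumption about how it could be done, not code from the branch): GL extension names arrive as one space-separated string from `glGetString(GL_EXTENSIONS)`, and a plain `strstr()` is wrong because one name can be a prefix of another (e.g. a hypothetical `GL_ARB_vertex_buffer` vs. `GL_ARB_vertex_buffer_object`), so you have to match whole tokens.

```c
#include <assert.h>
#include <string.h>

/* Return 1 if `name` appears as a whole token in the space-separated
 * extension string, 0 otherwise. */
static int has_gl_extension(const char *ext_string, const char *name)
{
    size_t len = strlen(name);
    const char *p = ext_string;

    while ((p = strstr(p, name)) != NULL) {
        /* Token must start at the string start or right after a space... */
        int starts = (p == ext_string) || (p[-1] == ' ');
        /* ...and end at a space or at the end of the string. */
        int ends = (p[len] == ' ') || (p[len] == '\0');
        if (starts && ends)
            return 1;
        p += len;
    }
    return 0;
}

/* Usage (needs a live GL context; `user_pref_vbo` is a hypothetical
 * user-preference flag standing in for the toggle discussed above):
 *
 *     int use_vbo = user_pref_vbo &&
 *         has_gl_extension((const char *)glGetString(GL_EXTENSIONS),
 *                          "GL_ARB_vertex_buffer_object");
 */
```

With a check like this, the VBO path is only taken when both the user preference and the driver allow it, and everything else falls back to the existing immediate-mode drawing.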

Summary: it’s useful to post reports and benchmarks comparing the speed of this branch to 2.5. It’s useful to compare performance between cards, drivers, and operating systems. It is less useful to bash other people’s hardware choices, or complain about the state of an in-progress branch.

Ok lol, I’m absolutely sorry… dunno, I read something about jaguarandi having two projects, so I thought this was his second.

JEEZZZZ, man, you are soooo negative. I have this feeling that anyone who complains gets treated like he’s some harbinger of death or something.

Anyway, the project is not done yet, and like Nicholas said, there should be a lot of optimizations on the way.
Who said anything about a graphics card 6 generations old? In computer-hardware / Moore’s law terms, that is like 3 months ago? I think it is fair to expect this to work right on 5-year-old computers too.

I’d rather challenge Agent_AL to code us a 2.5 viewport optimization patch that would be proven to be much better and faster than the GSoC project for both old and new computers if he thinks this could’ve been done much better.

I used to have an FX5200 a few years ago, then I upgraded to a 7300GS and then an 8800GT came with my new Dell XPS Quad-Core.

I’d just like to reiterate, it is useful to post benchmark numbers, like Agent_AL did.

This does not need to be a fight between old and new hardware. Try to provide useful information on what works, and what doesn’t. I for one would like to see the changes in this branch get merged, and I’d like to see it work on a wide variety of hardware. This happens sooner if good feedback is provided. :slight_smile:

Please don’t take this out of turn, but it’s not fair to call obvious improvements within Blender bad because your hardware is so out of date,

I see an obvious slowdown since 2.49. I see devs counting eye-candy milestones instead of performance ones. This is not how I’d like to see the future of Blender.

Maybe it’s time to get a newer card? The one you have is 6 years old, after all.

I have a top-end NVidia video card nearby for modeling heavy things if there’s a need to. But most of the time I’m optimizing OpenGL video game engines, as I’m a programmer. The FX5200 is the first card in my fleet that supports pixel shaders, and it’s great for testing purposes: it has all the necessary instructions for shading, yet it isn’t fast enough to hide the bottlenecks in the pipeline code of my 3D engines.

JEEZZZZ, man you are soooo negative.

That’s because I see no progress from my point of view. All drawing features should be optional, and that’s why I insist on an option for VBOs. The thing I like in Blender is that it’s extremely compact and fast for a 3D application. I have experience with many commercial 3D packages (like Maya and 3ds Max), and all of them provide the same functionality while having slow startup and a huge installation size. Blender is like Notepad for 3D: you don’t need to load heavy word-processing software just to open and edit some code. Now it’s moving much closer to that word-processing software instead of staying fast and compact.

I’d rather challenge Agent_AL to code us a 2.5 viewport optimization patch that would be proven to be much better and faster than the GSoC project for both old and new computers if he thinks this could’ve been done much better.

There’s no need to. Look, the 2.49 viewport is at least twice as efficient on old computers. Just add one more checkbox in User Preferences to disable that whole GSoC part in 2.5. Anyone feeling depressed by a major unexplained slowdown can just click that checkbox and enjoy the new 2.5 features at the speed of 2.49a.