to all those who bash intel integrated graphics...

my macbook pro with a radeon 6750 (1gb vram) screws up when performing an opengl viewport render, but the onboard intel hd3000 handles it just fine.

when is ati/amd gonna give us decent opengl support??

> to all those who bash intel integrated graphics…
> when is ati/amd gonna give us decent opengl support??

From one generalisation to another

mostly tongue in cheek, but as i recall ati does have a long history of driver issues with blender’s opengl interface, and this experience does seem to continue that tradition. if i’m doing something wrong, i’ll happily correct it and admit my mistake!

that said, i’d been on an nvidia gpu for 2.5 years prior to this, and had nary a graphics glitch. further, i’m getting really inconsistent viewport performance in general under the ati card. sometimes i’ll load a scene and it’ll refresh at 2-3 frames per second, but if i quit and re-open, it’ll be smooth sailing.

so feel free to dismiss this as an unfair generalization from personal experience alone, but correct me if i’m wrong: doesn’t it sound like there’s at least some sort of problem here?

Oddly, under OS X there seems to be the opposite problem. ATI cards seem to always beat Nvidia cards at benchmarks, even when there is no hardware reason why they should. My 5770 has never really had issues, unless you count the OpenGL render coming out square all the time. Btw, OP, is that what you are referring to?

Welcome to the hell of AMD/ATI graphics drivers. I try to give them another chance like once a year, and every time it’s the same. There’s a reason most professional studios have nvidia powered machines.

Heh, I’m running Windows 7 with a switch to change between integrated graphics and an Nvidia GeForce GT 425M. When I try to open Blender on the integrated card, it takes like a minute to even open, then runs really slowly. That said, Nvidia has had some major issues with older OpenGL games lately too.
Intel’s graphics: I’ve never really bashed them; I just never expected much from them. Nvidia though… How can you fail at the one thing you were made to do?

btw, i’m not saying intel graphics are great by any stretch; i was just lightheartedly pointing out that while their performance may suck, i was surprised to see them doing something correctly that the discrete card could not.

@J_the_Ninja, i sometimes get the square viewport render you mentioned under the ati card, which isn’t ideal, but at least the image is there. more frustratingly, it often crops the top and bottom regions of the camera frame, so i’m losing half of what should be visible.

for my work, it’s really nice to use viewport render to pump out temp assets for review/placeholders in larger projects, so losing the camera framing is a disappointment!