I am planning to purchase a new laptop and really need to know how Blender performs on an Intel HD Graphics card. Most laptops I am looking at come with integrated Intel HD Graphics - I will definitely be going with an Intel Core i3 or i5 as the processor. I have heard that Intel’s graphics cards aren’t that great, while their processors are amazing. If you have a machine with Intel HD Graphics and use Blender on it, how well does Blender perform?
I’d advise you NOT to buy any machine with an Intel graphics chip; they are an utter disgrace.
If its main use is Blender, or you plan on making use of any graphically intensive content, even Aero for Windows 7, I’d suggest you look at the models with a dedicated Nvidia card, or perhaps look at the AMD range, as they won’t have Intel GPUs in them.
I’m not sure how the latest ATI cards work with Blender, but I’m pretty certain they run with fewer issues.
I disagree with you: I get fairly decent performance with an integrated Intel card. But I also don’t do anything with that many polys on my laptop. As he suggested, you should probably look into getting one with an Nvidia or ATI card. My suggestion would be to find a laptop with Nvidia’s Optimus technology, as it will give you extended battery life and a decent GPU when you need it.
Forget about onboard video and get an Nvidia card (mid-range at least), but don’t expect to use your laptop off the wall a lot.
That about sums it up. Don’t settle for less, it won’t make you a happy Blender user (or a happy user of any 3d package).
These guys are right. Blender does run on almost any graphics card, even on my Eee PC 901, but it really cannot handle anything remotely high-poly, and it doesn’t support GLSL materials (which I use a lot for material previews). Any non-integrated GPU from Nvidia or ATI will be totally worth it.
Another thing to look at is having a separate numpad, since it helps a lot with navigation. If the model you plan to buy has one, then good. If it doesn’t, you might want to buy a USB numpad; they’re not that expensive.
I advise you to buy a laptop with an Nvidia graphics card and an Intel i5 inside. Integrated video won’t cut it at all. If you want to use the Blender Game Engine, you should choose at least an Nvidia GeForce GT 220.
It’s only my opinion, but I think Nvidia + Intel is the best solution for 3D work.
Got to add: don’t do ATI. Recently they have had loads of problems with OpenGL and with drivers generally.
Get an Nvidia chip.
Recently as in pre-HD series, or…?
The latest ATI problem is the GSOD.
Gray screen of death in the HD series. It cropped up recently, especially with the 5850 and 5870. Most aren’t even aware of that problem; it almost seems like ATI pays e-zines not to report on it.
Screen turns gray and system freezes, or you get white horizontal lines.
Users blame faulty memory chips, voltage control, or drivers; there’s no real fix there.
ATI blames their driver plus Windows 7, but it also happens on other Windows versions.
ATI released a hotfix, which at least works for some people, but it doesn’t seem to be a real fix at all.
I really hope, though, that it’s just a driver issue and no major hardware problem; ATI doesn’t deserve that.
Every time I personally have had to use an ATI card, I have had problems with OpenGL. A friend of mine recently purchased a FirePro card that is touted as being dedicated to 3D workstation graphics. His system was plagued by display issues and screen glitches. He tried about five different drivers to no avail. A call to ATI’s help desk was completely useless. We thought maybe he had a duff card, but a quick Google search proved he was not alone.
@arexma - I too thought it strange that the e-zines would praise their cards so highly when there are so many known issues. Are they using just benchmarks, or are they actually trying to use the cards?
Nvidia all the way. And for anyone on, or thinking of moving to, Linux: I believe that if you want GPU hardware-decoded HD video playback, then the VDPAU- / libva-enabled builds of VLC, FFmpeg, etc. only work on Nvidia hardware, not ATI. No problem for Windows users, as it’s handled by DirectX?
VDPAU is an open API created by Nvidia for other manufacturers to adopt if they’d like. And only Nvidia cards from the 8400 series up are able to use VDPAU, since older chips lack the playback engine VDPAU relies on.
The latest Nvidia graphics cards are hot like class-O stars and eat energy like experiments at CERN.
I’ve got two HD 5770s; sadly Blender does not make use of several GPUs (I don’t know if it’s code-specific or if AMD/ATI has to make a Crossfire profile). I’ve had no problems with OpenGL. I’ve had four different AMD/ATI cards while using Blender, and none had issues, only speed issues (the old select-mode driver issue, and the UV/image editor).
The select issues have been solved, and really Blender could use a different way to display images.
ATM Nvidia cards produce lava-hot air and suck the life out of your PSU (while it yells om nom nom).
Intel GPUs are called GPUs, but really they should be called WTBGPUs (wannabe GPUs).
I recommend Nvidia. Why Nvidia, you ask? Remember 3dfx, the masters of accelerated OpenGL and the first games that ran on OpenGL in real time? Nvidia absorbed 3dfx, and in their latest graphics cards they implemented what 3dfx left behind in the “development department”.
That is why I’ll choose Nvidia over any other brand.
Intel’s Larrabee won’t happen, because Nvidia got CUDA straight to market first.
So Intel should do what they do best: CPUs.
Well, everything that needs to be said has been said, but I’ll add my 2 cents: ATI has a bad rap when it comes to GLSL, and integrated cards don’t perform well at all, so go for a dedicated Nvidia (or ATI) card; Nvidia and Intel usually come together in most systems these days… so yeah.
The HD 4xxx series has been working quite fine for me;
then again, I haven’t necessarily always updated to the latest drivers.
I only upgrade when I have problems, and I’ve had none thus far.
Nor have I tried the newest (5xxx) series cards, so good to know, thanks guys.
And yeah, I do prefer Nvidia chips myself as well.
Rather surprised how many don’t know this, but only full-screen* applications can make use of multiple GPUs (rasterisers);
afaik the stream processors from both GPUs should be usable, but nothing in Blender is OpenCL-accelerated.
*That is to say, applications running in their own separate X session.
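On Linux, a quick way to check which GPU and driver your OpenGL context actually runs on (and hence what Blender will see) is to read glxinfo’s output. A minimal sketch, assuming the glxinfo utility (mesa-utils package on most distros) is installed; the parsing helper is my own illustration, not part of Blender:

```python
import re
import subprocess

def parse_glxinfo(text):
    """Extract the OpenGL vendor/renderer/version strings from glxinfo output."""
    info = {}
    for key in ("vendor", "renderer", "version"):
        m = re.search(r"OpenGL %s string: (.+)" % key, text)
        if m:
            info[key] = m.group(1).strip()
    return info

if __name__ == "__main__":
    # Needs glxinfo on the PATH (package mesa-utils on Debian/Ubuntu).
    out = subprocess.run(["glxinfo"], capture_output=True, text=True).stdout
    print(parse_glxinfo(out))
```

If the renderer string says something like “Software Rasterizer” or “Mesa GLX Indirect”, OpenGL isn’t using your GPU at all, which explains a lot of the performance complaints in threads like this.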
The graphics card alone isn’t everything, either. I have what anybody would call a “miserable graphics card” (an Nvidia 7300 with 256 MB)… But the rest of the machine compensates for that, and it will have to until I can find a cheap, flashed, compatible card with 2-GPU sharing to replace it with… :o
So do you own GF100 cards yourself, or are you just parroting the populist fanboy chitchat everyone has repeated since the first information on Fermi leaked back in November?
Power consumption: yeah, the 480 uses around 100 W more than a 5870, but that’s the TDP under full load, and consumption doesn’t scale linearly with load; under normal conditions they are not far apart.
I own a GF100-series card, and my whole system (incl. screens and two graphics cards) takes in around 320 W from the wall socket while working in Blender, around 400 W when rendering. During gaming it peaks at around 450-470 W, while the TDP of the Fermi chip alone is ~310 W.
My card runs at 35-40 °C idle and while blending, and under full load at ~52 °C with ~28 °C air temperature in my case, and it is barely audible. If that’s lava-hot for you, you’d better stay inside the house this summer.
One can’t buy a high-end card and then complain about it consuming power.
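For a rough sense of why draw under normal work sits well below TDP: dynamic (switching) power scales roughly as C·V²·f, so the lower clocks and voltage a card drops to outside of full load cut power disproportionately. A back-of-the-envelope sketch with made-up numbers (not measured GF100 figures):

```python
def dynamic_power(c_eff, voltage, freq_hz):
    """Approximate dynamic switching power: P ~ C_eff * V^2 * f."""
    return c_eff * voltage**2 * freq_hz

# Hypothetical 3D vs. reduced clock states (illustrative values only).
full_load = dynamic_power(1e-9, 1.00, 700e6)  # full 3D voltage and clock
reduced   = dynamic_power(1e-9, 0.90, 400e6)  # lowered voltage and clock

# ~43% lower clock plus ~10% lower voltage cuts dynamic power by more than half.
print(reduced / full_load)  # ≈ 0.46
```

The quadratic voltage term is what makes the difference between “rendering” and “gaming at peak” wall draw so large even on the same card.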
The only fail I see in the Fermi cards is the reference cooling; every single aftermarket cooler does a ten times better job than the crap they build onto the cards. But I guess Nvidia does that deliberately to keep the aftermarket business alive.
Same for ATI - their reference cooling for the HD 58xx series was a joke, loud’n’hot.
Now that the board partners have come up with custom cooling, and Nvidia has exactly the same as ATI, it’s all forgotten: ATI is grand, and Nvidia is still the one said to produce room heaters.
That said, remember, I am no fanboy, I buy what gets my job done best for my money.
Yes, the Nvidia GTX 460/465 look like a good deal with reasonable power consumption.
But if everyone buys Nvidia, AMD goes bankrupt, and then Nvidia and Intel can raise prices as they like and stop development without competition.
It is strange that people can recommend Nvidia cards here, while in another thread just saying that I’m thinking of replacing my 5770 with a GTX 470 made me a viral marketer.
But knowing now that there are problems with Blender and the GTX 470 I will wait.