Intel did it again

As with the Pentium back then, so now …

and one would assume that a company with that expertise would be able to prevent this - or, better, just get it right in the first place.

When the hard drive in my laptop died, I was initially depressed at the thought of having to invest in a new laptop. But a check-up at the local computer repair shop showed that only the hard drive needed replacing. So I guess I can afford to wait at least another two years until I have to get a new machine - that ought to give the chip-makers plenty of time to find and iron out any more flaws in their production.

With chips of that level of complexity (a number of regions serving different computing purposes, at one of the smallest nanometer scales yet), it may not always be a surprise when a flaw turns up. Such processors require a major level of expertise, design knowledge, and scrutiny in the development process just to make sure they are even fit to ship.

After that, it's making sure the fabrication equipment is programmed properly to manufacture these things. Each chip is the product of a long, meticulous research and development process that has to be repeated constantly to deliver the increase in computing power people have come to expect from Moore's Law.

Another “we have a bug” ploy. How much is Intel going to make off of this one? Let's see - the article says they're going to lose “$700 million”, which will translate into a $1.4 to $2.1 billion gain. If you remember the famous “Pentium bug” of the 1990s, which was the best thing that could have happened for Intel and the Pentium, then you'll realize this “hard flaw in the Sandy Bridge chipset” is in the same vein.


That is so true - the 90s problem generated a lot of product awareness and boosted sales. What an irony.

However, compared to AMD there is not much competition - nevertheless, Intel seems to be a good chip designer today.

Well, maybe for CPUs, but when it comes to graphics cards I've heard Nvidia (is it pronounced En-vidia or Neh-vidia? I usually say En-vidia) and ATI still beat Intel's.

anyway, good thing my laptop is an “old” model with the good-ol’ dual-core chip system :wink: it’s a Toshiba Satellite L300, btw

Nah - I never claimed Intel was a competitor to Nvidia at all.

With chips I was more talking about the CPUs.

I think some here are not aware of the fact that this bug does not concern the Sandy Bridge CPUs.

Sandy Bridge CPUs work just fine; as far as current testing goes, they are flawless.

The bug is in the southbridge, the part of a mainboard's chipset also known as the ICH (I/O Controller Hub), which also holds the SATA controller.
Within the first three years of use there is a 5% chance that the error occurs - which is hardly comparable to a faulty floating-point unit in a Pentium chip.
It also only affects the 3 Gb/s SATA ports, while the 6 Gb/s ports are not affected - and most users will likely never connect anything to the 3 Gb/s ports.
Intel has already stopped shipping the faulty chips and has revised them.
All those facts made this bug rather hard to spot in testing, also because it is not reliably reproducible… it's more of a glitch than a bug, and from what I understand it happened due to the production process and not because of a wrong design. And rest assured, state-of-the-art chipsets are quite a bit more complicated than CPUs, since CPUs have to abide by the “x86 compatibility laws” while chipsets change at a rather fast pace.
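As a back-of-the-envelope check, the quoted "5% within the first three years" figure can be annualized. This is a hedged sketch that assumes, purely for illustration, a constant failure rate over those three years - real hardware degradation need not be uniform:

```python
# Convert a cumulative "5% failure chance within 3 years" into an
# approximate per-year probability, assuming a constant hazard rate
# (an illustrative simplification, not a claim about the real failure curve).
p_3yr = 0.05                         # quoted cumulative failure probability
annual = 1 - (1 - p_3yr) ** (1 / 3)  # per-year probability under constant hazard

print(f"approx. {annual:.1%} per year")  # prints: approx. 1.7% per year
```

So under that assumption, the headline figure amounts to well under 2% per year per affected port.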

So the worst that can happen is that one of your 3 Gb/s SATA ports has a 5% chance of losing a bit of performance over the next three years… *yawns*

And I feel for Intel somehow.
If they hadn't disclosed it, they would be bashed for hiding a serious flaw.
By disclosing it, they are bashed for doing it “again”, or for being attention hogs trying a marketing move to generate any kind of news.

yes and no

There is also something called quality management and product testing.

Every company has to deal with it, and failures in this area have increased in recent years - so much so that today we use the joke that the customer does the final product testing, and the feedback goes back to the company to fix it for good.

Think about the car companies, Apple and its materials issues, etc.

But truth is that this does not have the magnitude of the 90s bug.

There is also something called quality management and product testing.

Yeah, that would make sense for something like a guitar, but not necessarily a CPU.

Not necessarily? You must be kidding, man.