So… I’ve recently realized that, since most programs that I know of do not allow work to be split among processor cores… why do AMD and Intel keep adding more cores instead of ‘upping’ the clock speed? I’d much rather have a 6 GHz dual-core than a 3 or possibly even 4 GHz quad… Or even a 5 GHz quad over a 3 or 4 GHz 6-core… It just seems like they’re going in the wrong direction, unless new software is written that allows programs to be split among cores… (like one program being processed across many cores). Unless this is already the case, in which case either I’ve been lied to, or there are at least two people on this planet who are quite misinformed.

You should read “The Free Lunch Is Over” by Herb Sutter. It should help you understand why there are no 6 GHz x86 processors.

Companies have pretty much hit the GHz limit for processors, so now there are three things that can be done to make programs run faster:

-Better architecture, meaning better wiring, more cache memory, better interconnects, more efficient data transfer, etc…

-Faster memory; we have not reached the speed limit on memory as far as I know.

-More cores. Many 3D and content-creation programs already take advantage of multi-core architectures (splitting complex tasks among multiple processors so each has less work to do, for example, or having each core focus on a single aspect like physics). There are even game engines that use multiple cores, which has in part contributed to the demise of specialized processors for things like physics. This makes programs run much faster.

There’s also OpenCL, which promises to make it much easier for programs to take advantage of multiple cores and the many processors in GPUs. A quad-core can also, for example, run many programs at once more comfortably than a single core.

In addition to what c++user and AC said, (with current technology) CPUs cannot go over 4 GHz without (literally) having a meltdown. In order to break the 4 GHz limit, you would need to super-cool the processor with something like liquid nitrogen (link).

PS: Even if you can overclock your processor above 4.0 GHz, you will heavily reduce the lifespan of your hardware.

heat heat heat

Plus, having multiple cores in one chip also allows multi-threaded work.

Imagine a render where a single core crunches away on one complex element;
a multi-core machine could dedicate the other cores to the remaining tiles
while that one core works on the complex part.
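The tile idea above can be sketched like this (the tile names and “costs” are invented for the example): each worker grabs the next tile as soon as it finishes its current one, so one expensive tile doesn’t hold up the cheap ones.

```python
# Illustrative sketch, not real renderer code: tiles go into a pool of
# workers, and results come back in whatever order they finish.
from concurrent.futures import ProcessPoolExecutor, as_completed

# Fake per-tile workload: tile 0 is the "complex element", rest are cheap.
TILES = {0: 500_000, 1: 1_000, 2: 1_000, 3: 1_000}

def render_tile(tile_id):
    # Busy-work standing in for actually shading one tile.
    acc = 0
    for i in range(TILES[tile_id]):
        acc += i
    return tile_id, acc

if __name__ == "__main__":
    done_order = []
    with ProcessPoolExecutor() as pool:
        futures = [pool.submit(render_tile, t) for t in TILES]
        for fut in as_completed(futures):
            done_order.append(fut.result()[0])
    # The cheap tiles typically come back before the expensive tile 0,
    # because the other cores didn't have to wait for it.
    print(done_order)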

For multitasking this is good, since in a good design the cores also have
their own access to the RAM and can work as individual brains.

Over the long term your energy bill is also lower, since the system
doesn’t draw as much power.

One thing that must also be kept in mind is … memory is always the bottleneck for a CPU. It’s not just a matter of “having enough of it,” but also of “making it fast enough to keep up with the demand.”

Every core must have instructions to execute and data to process, and somewhere to put the results. Cores can only operate independently to the extent that they have the necessary data on-chip, and a sufficient path to the outside world. So, “it is not a free lunch.”
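A back-of-envelope illustration of that “feeding the cores” problem — every number below is invented for the example, not a measurement:

```python
# Rough arithmetic, with assumed numbers: how much memory bandwidth
# the cores would want vs. what the memory bus can actually deliver.
cores = 4
clock_hz = 3.0e9        # assume 3 GHz per core
bytes_per_cycle = 4     # assume each core wants 4 bytes of data per cycle

demand = cores * clock_hz * bytes_per_cycle   # bytes/second the cores want
supply = 25.6e9                               # e.g. dual-channel DDR3-1600

print(f"demand: {demand/1e9:.1f} GB/s, supply: {supply/1e9:.1f} GB/s")
# With these numbers demand (48 GB/s) outstrips supply (25.6 GB/s),
# which is exactly why on-chip caches matter so much.
```

So adding cores multiplies the demand side of that equation while the supply side stays fixed — hence “it is not a free lunch.”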

Even a multi-core CPU is not equivalent to having independent CPUs doing the same things.

The most important thing that you can do for rendering is to plan your moves well. Be sure that you are giving the computer a realistic task to do, avoiding redundant effort (e.g. re-rendering the background or non-moving objects over and over in every frame) and finding ways to do things “just good enough.”


Yep - which is why Intel chips don’t have a separate memory bridge anymore,
but direct access through an on-die memory controller.

Yes, the hardware is becoming “terribly advanced,” and most motherboards out there are actually able to keep up. (It wasn’t always that way.)

With rendering, you’ll be asking the computer to run “just as fast as it possibly can,” on all cores, “for hours on end.” And, it’s the last four words of that sentence that are the bugaboo. You’ll need to be extremely sure that the system’s design is actually rated to do that continuously… and to deliver optimum performance while doing so. Such computers are not cheap, and they are rare. Every single part of the system must be able to do that.

After all, there were plenty of video cards out there … sold everywhere … which would literally blow capacitors etc. when you actually asked them to do what you bought them to do, for long periods. They looked great in the box, and melted in the heat.