ARM is the real alternative to Intel!

Hi

I’ve been reading a lot about ARM recently, and the processor and computer markets are changing rapidly as ARM starts producing more powerful CPUs. Looking on ARM’s website, their top-spec A-series CPU, the Cortex-A15, can run at 2.5GHz in quad- or octo-core configurations.

That’s not far short of even a powerful desktop Intel chip; add to this the lower power consumption and cost, and you have a real contender for the CPU throne. That is of course only half the battle: you need consumers to purchase your products, and in a way ARM is doing well there too. Most tablets sold have either an ARM or Intel Atom core, but there is also a growing move towards low-cost desktops sporting ARM CPUs.

The reason I post this here is that the growing success of ARM brings Linux and open source software along with it. A lot of the low-cost ARM desktops run Linux, and I think it would be good if Blender could run on the ARM architecture.

I know there was talk of making a Blender GE version for ARM, “as at the time ARM CPUs were used mostly in phones”, but I think with the number of computers running ARM now, this warrants a discussion about the Blender application itself.

  • By computers running ARM I include tablets, and in tablets I include the iPhone/iPad, as mobile devices that powerful are just as capable as desktop computers.

What do you think?

I’d put myself somewhere between beginner and intermediate for programming skills, but from what I’ve heard, programming for the ARM architecture is more difficult, and porting from an x86 architecture is fairly futile.

Well ARM is quite successful in the table market but that is not what you would put into a desktop.

Intel chips are good but not really the best either. For the high end there are better options.

I think Intel will have a problem in the mobile market.

Of course with Intel ramping up facilities to start producing the 22nm Tri-gate 3D processors, which will boost processing power at half the power consumption, and at a reduced cost, they see the technology easily crossing over into smaller forms. Read up on tri-gate and Ivy Bridge for more info.

I’ll agree!

You know, things can change. With both Apple and Microsoft moving to ARM, both OSX and Windows, it should be clear that things are changing. I think it’s a relevant, if slightly premature, discussion with regards to multi-platform software like Blender. Also because it paves the way for some form of Blender on tablets.

(btw, I presume you mean tablets and not tables! :slight_smile: )

Who says Surface/tables aren’t going to pack low-power CPUs as well? :wink:

but yeah, that’s probably what he meant. :slight_smile:
Still, smart tables are going to be sweet as well. :slight_smile:

ARM = RISC
x86 = CISC

It always depends on the application; for a workstation, ARM is no real option.

It’s like saying a 50-ton dumper is a real alternative to a 500kg Dacia Logan because you can use both to get from A to B.

While a function might take a RISC processor 1 instruction, the same function could take a CISC processor 10,000.
And a complex function could take 1 instruction on a CISC processor while the RISC needs 10,000.

So a processor that is 100 times faster than the other could still be slower on the same function.
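
A back-of-the-envelope sketch of that point (the instruction counts and clock speeds below are made up purely for illustration, not measurements of any real ARM or Intel chip):

```python
# Hypothetical numbers only - the point is that time = instructions / throughput,
# so a higher clock does not guarantee a faster result.

def task_time(instructions, clock_hz, instructions_per_cycle=1.0):
    """Seconds to finish a task on a simplified core."""
    return instructions / (clock_hz * instructions_per_cycle)

# Imagine a "complex" function: 1 instruction on the CISC core,
# 10,000 instructions on the RISC core.
cisc_time = task_time(instructions=1, clock_hz=1.0e9)        # 1 GHz CISC core
risc_time = task_time(instructions=10_000, clock_hz=2.5e9)   # 2.5 GHz RISC core

print(f"CISC: {cisc_time:.1e} s, RISC: {risc_time:.1e} s")
# CISC: 1.0e-09 s, RISC: 4.0e-06 s -> the "slower" 1 GHz chip wins by ~4000x here.
```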

I heard rumors about AMD starting to implement the ARM architecture while still using x86 as well. I think we need a new competitor to get better CPUs than those already here.

No, just no

I’ve read similar stuff on another website; ARM is gaining.

Clock speed means nothing. When comparing different architectures, there are design factors that are vastly more influential on overall processing capacity than clock speed alone.

Mmm, I think nearly all the code would need a rewrite… The CISC/RISC split has a history: back in the day there was a debate over whether to build big caches with a few basic instructions, or smaller caches with “more intelligent” processors. Changing from one to the other is a big deal, and I don’t think it would be worthwhile (no one produces 3D on a phone, and for games there is gamekit).

This reminds me of the old days of Apple Motorolas vs PC Intels. What about using GPUs for all the major gruntwork? It would be interesting to see an OS built around that idea.

Grunts are Einsteins compared with a GPU - it is kind of an autistic army of communist single-cell organisms.
They are SIMD (single instruction, multiple data) chips, specialized for parallelization.

You can take a truckload of data and throw it at the GPU, where the streaming processors each do one stupid calculation - but there are many of them and they do it in parallel. That’s also the art of GPGPU programming: parallelizing the code. For many algorithms that’s hard, and for some it might not even be doable.
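
A loose sketch of that “one stupid calculation over a truckload of data” idea, using NumPy array operations as a stand-in for the data-parallel programming model (this runs on the CPU; it only illustrates the shape of work a GPU is built for):

```python
import numpy as np

# A truckload of data: one million values.
data = np.arange(1_000_000, dtype=np.float32)

# Scalar style: step through the elements one by one, like a general-purpose
# core running a loop.
looped = [x * 2.0 + 1.0 for x in data]

# Data-parallel style: one simple operation, written once, applied across the
# whole array - the kind of work SIMD hardware eats for breakfast.
vectorised = data * 2.0 + 1.0

assert np.allclose(looped, vectorised)
# Algorithms that reduce to this pattern map well onto a GPU; algorithms where
# every step depends on the previous result largely do not.
```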

So making a GPU OS would be an effort that would not justify the result.

Grunts are Einsteins compared with a GPU

  • Einsteins is STUPID
  • the world’s most powerful supercomputers are using GPUs

To crunch through large databases of numbers etcetera.

CPUs are still the way forward for generalized computing. GPUs could, however, come to be used as co-processors for the OS ‘when’ OpenCL matures enough.

That’s not a matter of faster or slower; it’s a matter of architecture and design. One person who can do 1000 things per second and 1000 people who can each do one thing per second are really different. English isn’t my mother tongue, so it’s difficult for me to explain, but search a bit on the net and you’ll find a lot of explanations. http://en.wikipedia.org/wiki/Instruction_level_parallelism (the world’s most powerful computer will be slow at operations that can’t be parallelized)
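
One standard way to put numbers on “operations that can’t be parallelized” is Amdahl’s law (my addition here, not from the linked article): the part of the job that has to stay serial caps the speedup no matter how many parallel workers you throw at it.

```python
def amdahl_speedup(parallel_fraction, workers):
    """Overall speedup when only part of the work can be spread over workers."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / workers)

# Suppose 90% of a task parallelizes and 10% is inherently serial.
for n in (10, 1_000, 100_000):
    print(f"{n:>7} workers -> {amdahl_speedup(0.9, n):5.2f}x speedup")
# 10 -> 5.26x, 1,000 -> 9.91x, 100,000 -> 10.00x:
# a GPU-sized army of workers still tops out near 10x.
```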

Einstein, not Einsteins - and he was one, if ever there was; he is no more.
Anyway, why would that be?

My point. They are trivial.
Supercomputers are generally built for one specific purpose, and on top of that they rely on insane parallelization - things like weather simulation, thermal and fluid simulations and the like.
The exact opposite of an end-user desktop.

I think the definition of “end-user desktop” will be very different in 10 years; you will see a lot more 3D/realtime stuff in the browser, and who knows what else people will come up with. Remember that yesterday’s supercomputer is today’s cellphone.