What system are you developing on? (Post your computer specs)

Yeah, but what if the new RAMs are made out of crystals or something? :smiley:

When I was a teenager, computers were doubling in power like every 6 months. If you bought a 1 meg graphics card, by the time you got back from the shop it was obsolete and everyone else had a 2 meg card.

I always held off buying a new rig until there had been a real jump in technology, and when it happened, not much of the old stuff was useful any more. I remember throwing away a bunch of RAM because the standard slot had been changed to a larger size, so the older, smaller RAM modules wouldn’t fit in the new motherboard.

It sometimes seems to me as if hardware development these days has pretty much stopped.

I’m still waiting for the next big improvement, but it seems like I’ve been waiting about 5 years. In fact, the computer I bought in 2008 doesn’t seem much worse than the one I’ve got now. The only real difference is the really noisy Blu-ray drive this one has got in it. Can’t watch movies on it because I can’t hear them over the noise of the spinning disc.

Back in the 80s, people said that the only limit to increases in computer processing power would be the physical limitations of chip size. But it turns out that if you can make a chip that’s 5% faster and make 30% more profit for the company, there’s no point in making a 30% faster chip for a 5% bigger profit. I think a lot of people have seen this, and that’s why PC sales are in the toilet.

Yeah, the next leap will be optical computing that exploits quantum effects, and potentially full quantum computing.

We’ve hit the limits of silicon. Below a 14 nm feature size, the position of the electrons is uncertain enough that they will tunnel between transistors, not go where they’re meant to, and generally be annoying. We also can’t make chips bigger, as then power dissipation becomes an issue. As a result, the only improvements recently have been small layout changes leading to fractional percentage performance gains. If you compare an i7-2600 to the i7-6700, the main difference is a reduction in power consumption.
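To give a feel for why shrinking further makes leakage worse, here’s a rough back-of-envelope sketch of how tunnelling scales with barrier width. The 1 eV barrier height and use of the free electron mass are illustrative assumptions, not real transistor parameters; the point is just the exponential dependence on the gap.

```python
# Back-of-envelope sketch: quantum tunnelling probability through a barrier
# falls off exponentially with barrier width, so shrinking the insulating gap
# by a few nm multiplies leakage by many orders of magnitude.
# The barrier height (1 eV) and free electron mass are illustrative guesses,
# not real device values.
import math

HBAR = 1.055e-34   # J*s
M_E = 9.11e-31     # kg, free electron mass
EV = 1.602e-19     # J per eV
BARRIER_EV = 1.0   # assumed barrier height above the electron's energy

kappa = math.sqrt(2 * M_E * BARRIER_EV * EV) / HBAR  # decay constant, 1/m

def tunnel_probability(width_nm):
    """Rough WKB estimate T ~ exp(-2 * kappa * d) for a rectangular barrier."""
    return math.exp(-2 * kappa * width_nm * 1e-9)

for d in (3.0, 2.0, 1.0):
    print(f"{d:.0f} nm barrier -> tunnelling probability ~ {tunnel_probability(d):.1e}")
```

Going from a 3 nm gap to a 1 nm gap raises the tunnelling probability by roughly nine orders of magnitude in this toy model, which is the kind of scaling that makes leakage the limiting factor.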

Graphics processors are still changing - a bit. They’re still exploiting massively parallel processing, and I expect a few more big jumps in GPU power, but in 5 years even that will stop.
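For anyone curious what “massively parallel” means in practice, here’s a small CPU-side sketch of the data-parallel pattern GPUs are built around: one operation applied across a whole array at once instead of element by element. It’s just NumPy on the CPU, not an actual GPU kernel, so treat it as an illustration of the style rather than a GPU benchmark.

```python
# Illustration of the data-parallel style GPUs scale up across thousands of cores:
# apply one operation to every element of a big array at once, instead of
# looping one element at a time.
import time
import numpy as np

pixels = np.random.rand(5_000_000).astype(np.float32)  # pretend framebuffer

# Serial: one element at a time, how a single CPU core would do it
start = time.perf_counter()
out_serial = [p * 0.5 + 0.1 for p in pixels]
serial_t = time.perf_counter() - start

# Data-parallel: the whole array in one shot, the pattern a GPU kernel uses
start = time.perf_counter()
out_parallel = pixels * 0.5 + 0.1
parallel_t = time.perf_counter() - start

assert np.allclose(out_serial, out_parallel)
print(f"serial loop: {serial_t:.2f} s")
print(f"vectorised:  {parallel_t:.3f} s")
```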

What’s next?

  • Optical computing (will allow 10-20% performance increase)
  • Graphene computing (will reduce size and power consumption) (this is crystalline stuff - kind of)
  • ?? (Will be the actual next computation method)

The problem is that all these emerging technologies have to overtake existing silicon levels of performance, and that’s no easy feat.

  1. Quantum Zeno effect single-photon detection (http://www.technologyreview.com/view/522016/quantum-light-harvesting-hints-at-entirely-new-form-of-computing/)

  2. Single-photon laser systems

  3. Optical NAND (emits or does not emit; see the NAND sketch after this list)

  4. Optical Boolean (blocks or does not block)
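The reason item 3 matters so much: if you can build a reliable NAND, whatever the physical medium, you can build every other logic gate from it, because NAND is universal. Here’s a tiny plain-Python check of that claim (the optical part obviously isn’t modelled, this is only the Boolean logic):

```python
# NAND is a universal gate: NOT, AND, OR and XOR can all be built from it.
def nand(a, b):
    return not (a and b)

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor_(a, b): return and_(or_(a, b), nand(a, b))

# Quick truth-table check against Python's own operators
for a in (False, True):
    for b in (False, True):
        assert and_(a, b) == (a and b)
        assert or_(a, b)  == (a or b)
        assert xor_(a, b) == (a != b)
print("all gates reproduced from NAND alone")
```

So a working optical NAND on its own would be enough, in principle, to build a complete optical logic family.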

I’m running an old Dell Optiplex 745 that I’ve replaced some parts in.

I don’t know what everything does, but when I need to replace a part, my friends usually tell me, and then I buy the piece they say and plug it in. :stuck_out_tongue:

CPU: Intel Pentium D 3.40 GHz x2
GPU: Radeon 230
RAM: 4 GB
OS: 32-bit Ubuntu 14.04

I’ve redone the cooling system and it’s not junked up with anything except Blender, so most of the time it works fast for me. The graphics card isn’t fully operational since I replaced it, and it sometimes gives me weird shading artifacts, like making everything look triangulated or highlighting an entire edge rather than just one vertex, but other than that it’s great… except the internet… it sucks…


Using an Alienware 15 for the main monitor and the latest Intel NUC for the other two monitors; both systems are running Manjaro Linux.