AMD's Vega GPUs: information about them is starting to come out

It’s not like developing for CUDA is a piece of cake either (there have been many bug reports on the developer website about renders crashing or performance regressing after code changes or driver updates, for one thing). There have also been reports on this site alleging that the new Maxwell cards, in some cases, don’t perform that much better than cards several generations old, despite being far more powerful on paper. Compare that to the CPU, where the code just works as long as there are no compilation errors, and where each new generation brings a fairly reliable speed boost.

I would think in that case that part of the problem with developing for AMD isn’t always the drivers, but the inherent difficulty of getting a GPU to work with a fully-featured render engine (even the Octane devs aren’t immune to this, and they know GPU programming better than anyone). Things have improved somewhat in that area, but it’s still far from the ease of CPU programming; otherwise the developers wouldn’t have to spend an enormous amount of time shuffling things around and tweaking for the hardware.

The problem with presumably unreliable drivers is that you can never be sure whether it’s your bug or their bug. Languages like OpenCL may look like C, but there are lots of pitfalls around (lack of) ordering guarantees and memory alignment that you don’t face on an x86 CPU. The tools for debugging those problems are also much worse than what you get for CPUs.
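To make the ordering pitfall concrete, here’s a minimal OpenCL C sketch (the kernel name and what it computes are hypothetical, purely for illustration). Read sequentially, C-style, it looks obviously correct; on a GPU, the final read races against other work-items’ writes unless you insert an explicit barrier:

    // Each work-item stages its element into local memory, then reads
    // the element staged by a *different* work-item in the same group.
    __kernel void reverse_in_group(__global float *data, __local float *tmp)
    {
        size_t lid  = get_local_id(0);
        size_t gid  = get_global_id(0);
        size_t size = get_local_size(0);

        tmp[lid] = data[gid];

        // Without this barrier there is no ordering guarantee at all:
        // the read below may observe a stale or uninitialized value.
        barrier(CLK_LOCAL_MEM_FENCE);

        data[gid] = tmp[size - 1 - lid];
    }

Forget that barrier and the kernel may still appear to work on one driver or one GPU while silently producing garbage on another, which is exactly the “your bug or their bug” situation described above.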

If you can split up your problem into many simple data-parallel programs (and still maintain efficiency), that may not be such a big deal. That’s where GPUs come from (programmable shaders), and that’s what they’re good at.
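For contrast with the kernel above, this is the shape of the shader-style, data-parallel work GPUs are built for (again a hypothetical kernel, just to illustrate): every work-item performs the same small computation on its own element, with no communication or ordering between work-items at all:

    // A pure per-element "map": no barriers, no shared state, no ordering.
    __kernel void scale(__global const float4 *in,
                        __global float4       *out,
                        const float            gain)
    {
        size_t i = get_global_id(0);
        out[i] = in[i] * gain;
    }

A fully-featured raytracer is the opposite case: divergent branches, irregular memory access, and lots of per-ray state, which is why mapping it onto this model takes so much shuffling and tweaking.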

GPU architectures aren’t actually well suited for complex programs like a fully-featured raytracer. The fact that we still use them for that purpose is an economic accident. The GPU chips you can buy for $600 are huge; a chip of similar size from Intel would cost thousands (especially the higher-clocked ones), because there’s no big consumer demand for CPUs with more than 4 cores.

Maybe if web developers could figure out a way to make web sites unusable without a 12-core CPU, we could have some progress here. JavaScript is still single-threaded, but there are web workers for multi-core, which could be used to mine Dogecoins on the user’s machine to supplement ad revenue.