Enabling GPU Cycles Rendering in Blender 2.65a on OSX 10.6.8

This feels like a really stupid question, but how do I enable GPU Cycles rendering? I’m just watching Andrew Price’s intro to Cycles video, but the drop-down menu he says to click just isn’t there.

Is this a problem with me being on OSX?


- The Render tab (I’m 100,000,000% definitely running Cycles).


- The 2 graphics cards in my 15 inch Macbook Pro (sometime-in-2010 model - I forget).

The only thing I can think to try, after looking about a bit, is to install the CUDA toolkit and update drivers, but the toolkit is taking so long to download on the bad internet connection in my town that I figured I’d get a quicker response here.

Apologies if this has been covered in another thread. I tried searching, but couldn’t really see anything. Then again, I have been told I’d lose my head if it wasn’t attached to my neck…

Cheers for any help! :)

File / User Preferences / System, then select your Compute Device in the bottom left corner. Also, your graphics card may not meet the minimum requirements for the Cycles renderer, in which case you’ll just have to stick with CPU rendering. Even if you do get GPU rendering working, with such a low-end card you may actually get better performance sticking with the CPU.
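In case it helps, here’s a minimal sketch of the same toggle from Blender’s Python console. It assumes the 2.6x-era property names (user_preferences.system.compute_device_type and friends); the device name 'CUDA_0' is just an illustrative example. If no supported CUDA/OpenCL device is detected, the type stays 'NONE' and the drop-down simply doesn’t show up in the UI, which sounds like what you’re seeing.

```python
import bpy

# Blender 2.6x API; property names may differ in other versions.
system = bpy.context.user_preferences.system

# 'NONE' here means Blender found no usable compute device,
# which is why the GPU drop-down is missing from User Preferences.
print(system.compute_device_type)

# If a CUDA device was detected, these two lines mirror the
# User Preferences > System setting ('CUDA_0' is a hypothetical example):
# system.compute_device_type = 'CUDA'
# system.compute_device = 'CUDA_0'

# Finally, tell Cycles to render the current scene on the GPU
# (the Device option in the Render panel):
# bpy.context.scene.cycles.device = 'GPU'
```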

Also, before you ask: that tutorial is old, and some of the render settings have moved. Just open the different panels in the render settings and you’ll find them easily enough.

Cheers tonnes. After a look at the minimum requirements, though, it appears you’re right and there’s not really much point in me bothering with my GPU. I sometimes forget that this laptop is getting out of date. Back to sticking the fan on full blast when CPU rendering then, I guess.

Don’t suppose you know whether using the GPU would make any difference to my system heating up? My main reason for wanting to switch is that even with my fans running at full speed (thanks to smcFanControl, an awesome little app for any Mac users) my CPU temp is still hitting 80-90C, and sometimes tickling 100C, which is obviously not cool. Reckon it’d be worth installing all the CUDA gubbins just to spread the workload around my hardware a bit, and hopefully stop my laptop becoming a pool of melted aluminium?

Cheers for any further help. Don’t worry if you don’t want to reply; I realise I’m drifting slightly off topic now, it’s just that it’d save clogging up the forum with another thread.