Can't use AMD graphics card

Hello,

I am trying to render in Cycles using GPU Compute on an AMD Radeon R9 graphics card with 2 GB of VRAM. Every time I try to render even the simplest things, it runs out of memory. On top of that, the rendered viewport is ridiculously slow.

I’ve been using the CPU (4 GHz Intel Core i7), but I am trying to make 200–300 frame animations with around 600k faces, and I’ve set everything from sampling to subdivision to the bare minimum. It still takes 5–10 hours just to render 240 frames. Most of the render time seems to be spent tessellating planes and building/packing the BVH. (I am using displacement modifiers to build terrain, but I kept even that to a bare minimum.)

So, is there something I can do to speed this up? (I’ve already gone through everything from both the Blender Guru and Remington Graphics videos on the subject.) Or is there any add-on that would let me use my graphics card?

P.S. I am running the stable build of 2.78 on a Mac (latest version of macOS) with 32 GB of RAM.

Thank you.

2 GB of VRAM is about the bare minimum these days. And I’m curious: why bundle a card with so little VRAM with otherwise powerful components?

And no, if the scene does not fit into the card’s VRAM, there’s not much you can do, other than perhaps splitting the scene into multiple render layers and combining those in compositing. Or upgrading your graphics card — but then again, it’s a Mac, and I fear you just can’t.

BTW, is that a Retina display?
That would explain why the rendered viewport is so slow, as that would run in the enormous native resolution of your monitor…

It is a Retina display! I am thinking of upgrading my custom-built PC, which I use for games, but hey, it could work. I am eyeing a GTX 1080 Ti (after the prices drop back to normal) — with 11 GB of VRAM! Would that give a much better render time than a quad-core i7 CPU? I’ll also try the layers idea, thanks.