GPU Rendering with Cycles - Complete Guide

When I click on the link I get an HTTP error.

Yup, the link to the FAQ is broken. Is this info available anywhere else on the web?

Hey guys! I've been wondering about, and searching Google for, the possibility of using Blender's network renderer with the GPU. I know this feature was developed during the Mango project, and that scene complexity sometimes works against GPU memory capacity. But I've got a 4GB graphics card that could easily handle all of my shots, and it seems such a waste to have a computer rendering on the CPU while the GPU is resting. Is there any way to send jobs from the network renderer to both the CPU and the GPU (almost like two separate slaves)? That hack would effectively double everyone's rendering speed.

Well, if Cycles were hybrid this could be done, but it isn't yet.
For now you can send the same rendering job to another computer with different settings (GPU and a different noise seed), then composite the results into one image.
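
A rough sketch of that workaround in Blender Python (the output path and seed values are my own illustration, not an official recipe): render the same frame on each machine with a different Cycles seed, then average the results.

import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'
# Run this on the GPU box; on the CPU box use device = 'CPU' and seed = 1.
scene.cycles.device = 'GPU'
scene.cycles.seed = 0                    # a different seed per machine gives independent noise
scene.render.filepath = '//render_gpu_'  # hypothetical output prefix
bpy.ops.render.render(write_still=True)

Averaging the two noisy images afterwards (e.g. a Mix node at 0.5 in the compositor) gives roughly the same noise level as a single render with twice the samples.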

I'm not sure if I should be posting here, but I have read (most) of this thread and need to ask for an opinion.

I'm running Windows with an out-of-date NVIDIA card (and a poor spec).

The trouble is, there are so many cards out there, it's difficult to know which way to go...

I'm looking at three cards at the mo:

Any advice on which to go for? I've also heard that the number of CUDA cores is a key performance indicator?

Thanks

Go for the 960, more computing power per watt.

The GTX 960 has 1024 CUDA cores, whereas the GT 740 has 384.

The GT 740 uses 64W while the GTX 960 uses 120W. The GTX 960 is more than twice as powerful as the 740, and may use less total energy over a long render (64W for 20 minutes is more energy than 120W for 10 minutes, right?).
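
A quick sanity check on that arithmetic in Python (the render times are the hypothetical ones from the parenthesis above):

# energy (Wh) = power (W) x time (h)
gt740_energy = 64 * (20 / 60)        # ~21.3 Wh for the slower card
gtx960_energy = 120 * (10 / 60)      # 20.0 Wh for the faster card
print(gt740_energy > gtx960_energy)  # True: the faster card uses slightly less total energy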

Even though the 960 is far more expensive, it is the better choice. It is rumored that NVIDIA will release a new card soon, probably either a dual-GPU 990 or a new Titan, which would lower the price of the current 980 and likely bring down prices on the 970 as well. If they are going to, it has to be soon: they have already confirmed the Pascal GPU architecture for 2016, which should bring a huge leap in GPU performance if you can wait that long to upgrade (I know I can't).

Definitely don't get the 1GB version of the 740. I currently have a graphics card with 1GB of VRAM, and it's not enough.

Thanks Phith

Exactly the answer I wanted, and it confirms what I've heard from other people too. I guess cores win over RAM.

To be honest, power draw isn't critical as it's for business use, but I absolutely understand your explanation.

I'll also look into the new 990/Titan, although I need this card pretty quickly.

The next best thing is always on the horizon!

I'm not sure I fully understand the relevance of PCI Express 3, but I'm reading up on it now.

Thanks again.

Just to let everyone know, I went with the following card

I'm running on an i7 4770 with 16GB RAM.

I had the following problems...

Firstly, on Blender 2.72b, it throws an immediate error...

"blender CUDA binary kernel for this graphics card compute capability (5.2) not found"

This can be resolved by renaming (or copying) the file 'kernel_sm_50.cubin' (in Blender/2.72/scripts/addons/cycles/lib) to 'kernel_sm_52.cubin'.
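
As a small script, that fix looks roughly like this (the install path is an assumption based on a default 2.72 layout; copying instead of renaming keeps the original kernel around). It works because CUDA binaries are forward-compatible within the same major compute version, so the sm_50 kernel runs on a compute capability 5.2 card:

import os
import shutil

# Assumed default Blender 2.72 layout; adjust the path to your install.
kernel_dir = os.path.join("Blender", "2.72", "scripts", "addons", "cycles", "lib")
src = os.path.join(kernel_dir, "kernel_sm_50.cubin")
dst = os.path.join(kernel_dir, "kernel_sm_52.cubin")
shutil.copyfile(src, dst)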

That worked... great... but...

I work on large models with complex geometry, and if I try to switch to rendered view (Shift + Z) it runs out of memory, because this card only has 2GB versus the 16GB of system RAM I had to start with... OK, never mind, I can work around this by switching to a full F12 render using tiles, which only needs to commit the current tile to memory - and that works.
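
For reference, the same setup can be done from Blender's Python console (2.7x API; 256x256 is just a common starting tile size for GPUs, not a magic number):

import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'
scene.cycles.device = 'GPU'
# Larger tiles reduce per-tile overhead on a GPU; small tiles suit CPUs.
scene.render.tile_x = 256
scene.render.tile_y = 256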

Lastly, on large .blend files (I am currently working on a 600MB file... yes, 600MB!), there is very little perceptible difference in render speed. I did a test using two identical PCs - one rendering on the standard CPU and one with my new GPU compute. The end times were very similar.

A disappointing result!

On smaller files, there is a very noticeable improvement in speed, in both the rendered view and final F12 renders.

That said, a MAJOR advantage is that while the GPU is doing the rendering, my CPU is freed up to carry on with the rest of my daily tasks, and the PC isn't clogged up trying to process the render.

All in all, in my case, I'm not sure it was worth it.

Hi,

I have a GeForce GTX 660 Ti GPU.

I am running Debian Jessie with up-to-date proprietary NVIDIA drivers (version 340.65-2).

When I check with the nvidia-settings tool I see 1344 CUDA cores.

I don't see CUDA under User Preferences -> System -> Compute Device (only None and then CPU in the selection).

When I start Blender (v2.73a) from the command line it doesn't print any errors/warnings.

Do I need a special build for GPU rendering to work?

Edit:

I've found this, and installing the libcuda1 package solved my problem. CUDA works now.
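
For anyone hitting the same thing, a quick way to confirm Blender can see the CUDA device after installing libcuda1 is the built-in Python console (2.7x API):

import bpy

system = bpy.context.user_preferences.system
print(system.compute_device_type)  # 'CUDA' should be available once libcuda1 is in place
print(system.compute_device)       # the selected device, e.g. 'CUDA_0'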

Dear all, I would like to upgrade my graphics card. Should I go for the GTX 970 or the Quadro 5000? Besides Blender, the other software I'm using is ZBrush, Mudbox and Photoshop. Thanks, and looking forward to your reply.

Hi, some software benefits from the special drivers for Quadro cards - Maya, for example. Blender does not profit from them.
CUDA render engines like Cycles or Octane are much faster on gaming cards.
There was a Quadro card in my benchmark for 2.72:

http://www.blenderartists.org/forum/attachment.php?attachmentid=354192&d=1420649558

Cheers, mib

So I am using a MacBook Pro 15-inch Retina Display; it has an NVIDIA GeForce GT 750M CUDA-capable video card. I installed the CUDA driver so that Blender would recognize it, and now I'm getting this error when rendering with the GPU in Cycles:
"CUDA error: Launch exceeded timeout in cuCtxSynchronize()"

I read that someone else had the same error, but on a PC, and they were told to fix something in the registry (I probably wouldn't do that even if it applied to me). Has any Mac user fixed this issue?
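
For context, that error usually means the OS display watchdog killed a GPU kernel that ran too long on a card that is also driving the screen; the Windows registry tweak mentioned above raises that limit. A driver-agnostic workaround that is often suggested (a sketch against the 2.7x Python API, not a guaranteed fix) is to shorten each launch with smaller tiles:

import bpy

scene = bpy.context.scene
# Smaller tiles mean shorter kernel launches, which are more likely to
# finish before the display watchdog's limit on a GPU that also drives
# the monitor.
scene.render.tile_x = 128
scene.render.tile_y = 128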

Also, the link in the first post gives me a 404 error.

The guide linked in the original post 404s; does anyone have a mirror?

I've got a Dell Latitude E5520.

It's got an embedded Intel graphics chip:

VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)

(Someone in a different thread here mentioned the NVIDIA website as a better link for graphics card info. It shows how to find graphics card information on Windows and Macs, but not Linux.)

In Linux, this command works (run in the Terminal):

lspci -v -s $(lspci | awk '/VGA/{print $1}')

which, for me, returns this information:

VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09) (prog-if 00 [VGA controller])
Subsystem: Dell Device 049a
Flags: bus master, fast devsel, latency 0, IRQ 41
Memory at e1c00000 (64-bit, non-prefetchable) [size=4M]
Memory at d0000000 (64-bit, prefetchable) [size=256M]
I/O ports at 7000 [size=64]
Expansion ROM at <unassigned> [disabled]
Capabilities: <access denied>
Kernel driver in use: i915

More info on Linux and Video Card Info here:

WebGL is broken on Chrome because of this card, and it barely runs on Firefox (latest versions of both). I'm on Linux Mint 17.1 (one behind the most current, 17.2).

Blender doesn't recognize my video card. There is no "Device" option under Render.

It doesn't help that Blender works with NVIDIA and now (with 2.75) some ATI/AMD video cards.

The latest Dell XPS 13 upgrade for Linux uses the:

Intel(R) HD Graphics 5500

with NO options for video upgrades.

By default, ZAReason's Chimera 2

http://zareason.com/shop/Chimera-2.html

offers the following specs / upgrades:

    Nvidia GTX or Quadro graphics:
    Nvidia GTX 970M dedicated graphics with 6GB video memory
    Nvidia GTX 980M dedicated graphics with 8GB video memory (optional)
    Nvidia Quadro K3100 workstation graphics with 4GB video memory (optional)
    Nvidia Quadro K4100 workstation graphics with 4GB video memory (optional)
    Blue backlit keyboard with on/off switch + brightness control
    Two SATA 2.5" drive bays
    Up to 32 GB of memory (4x SO-DIMM slots)

So, Dell's BEST for Linux, its ONLY choice for Linux, falls far short of ZAReason's for about the same price.


Memory is only PART of the equation. I've got 8 GB of system memory, and it does NOT matter.

Not enough is being said about video memory. I have no idea how good an Intel 5500 is. If it's anything like my current card, I can expect that in a few years I'll be blacklisted again.

I'd like to buy a laptop (yes, a LAPTOP) designed for graphic design, something that registers with a program like Blender.

And I'd like to get it RIGHT this time.

Got any suggestions?

Thanx

Hi,

I just downloaded Blender yesterday and have been looking at some tutorials etc. to get started.

However, when I am supposed to switch from "Blender Render" to "Cycles Render", Blender crashes. I haven't made a scene or done any editing (I only have the start scene with the grey box). I tried Google but couldn't find anyone with quite the same issue... :confused:

Does anyone have any idea why this is happening?

I am sorry if this is a very n00b question :o)

Are there any rules here? And why are forums like starmen.net powered by COPPA?

Hi, a while back I tried using the GPU for rendering on Ubuntu. I tried a ton of different things and couldn't get it to work. Finally I saw a thread somewhere that mentioned running Blender as root (i.e. sudo blender), and it worked! I tested on a vanilla system with just the proprietary NVIDIA driver installed and no other changes, and running Blender with sudo worked great. Don't know about Fedora, but it's worth a try!

Thank you! I got GPU rendering working fine on Fedora 22.3 the day after. Here's a walkthrough of the steps I went through (a condensed version is posted at the bottom as a comment; I don't know if it's awaiting moderation):

https://ask.fedoraproject.org/en/question/76256/how-to-install-libcuda-on-fedora-22-so-that-blender-finds-the-cuda-device/

I can't open the link in the first post. It first says that it might be a security risk; when I ignore that, it asks for a username and password, then 401s. Is there an up-to-date guide anywhere?

For me it gives a 'connection refused' error.