Hi guys and gals.
I just got a laptop with the infamous Optimus technology, and I'm having trouble getting the Cycles GPU render to work as intended under Linux (through Bumblebee). I tested Blender under Windows and it works great. On Linux I tried launching Blender from its folder with optirun ./blender and with primusrun ./blender; in both cases I was able to select my graphics card (GTX 780M) under User Settings. Everything seemed OK, but the GPU render in Blender was slower than the CPU.
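For reference, this is more or less exactly what I ran (the Blender folder name below is just an example, yours will depend on the build you downloaded):

cd ~/blender-2.68-linux-x86_64    # example path, adjust to wherever you extracted Blender
optirun ./blender                 # or: primusrun ./blender
optirun nvidia-smi                # quick check that the 780M is visible through Bumblebee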
Windows 7 Ultimate, 64-bit, Nvidia Driver: 320.49
GPU (CUDA), GTX 780M, 200 Samples, Tiles: 256x256:
Render Time: 1:04.80
CPU (i7-4700MQ), Tiles: 16x16:
Render Time: 2:44.53
Ubuntu 13.04, 64-bit, Nvidia Driver: 304.88
GPU (CUDA), GTX 780M, Bumblebee with optirun, 200 Samples, Tiles: 256x256:
Render Time: 2:06.60
GPU (CUDA), GTX 780M, Bumblebee with primusrun, 200 Samples, Tiles: 256x256:
Render Time: 2:06.66
CPU (i7-4700MQ), Tiles: 16x16:
Render Time: 1:59.54
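To make the comparison easier, here are the same times in seconds:

Windows GPU:  64.80 s
Linux GPU:   126.60 s  (126.60 / 64.80 ≈ 1.95x slower than the same render on Windows)
Linux CPU:   119.54 s  (so on Linux the GPU render is actually slower than the CPU render)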
I also noticed that during the GPU render on Ubuntu, one of my CPU cores was pegged at 100% usage, which didn't happen under Windows. Here are some screenshots:
CPU Usage on Linux when rendering on CPU:
CPU Usage on Linux when rendering on GPU through Optirun:
CPU Usage on Linux when rendering on GPU through Primusrun:
CPU Usage on Windows when rendering on GPU:
http://img834.imageshack.us/img834/5559/lb4e.jpg
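While the render is running I can also watch the card with nvidia-smi (assuming the 304.88 driver exposes enough stats for a GeForce card; on some GeForce GPUs it only shows a subset of the info):

optirun nvidia-smi -l 1    # refresh the GPU stats every second during the render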
And also, here are some links to show that Bumblebee is working fine on my machine; I tested it with games on Ubuntu too and everything seems to work correctly:
Optirun: http://img268.imageshack.us/img268/5098/jedh.jpg
Primusrun: http://img202.imageshack.us/img202/2858/lpzm.jpg
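Besides the games, another quick sanity check (glxinfo comes with the mesa-utils package; the renderer string should switch from the Intel GPU to the GTX 780M when run through Bumblebee):

glxinfo | grep "OpenGL renderer"             # without Bumblebee: reports the Intel GPU
optirun glxinfo | grep "OpenGL renderer"     # should report the GTX 780M
primusrun glxinfo | grep "OpenGL renderer"   # same, through primus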
Sorry for the lengthy post, but I tried to gather as much information as I could to help you help me =)
Thank you.