The NVIDIA GeForce GTX 750 and GTX 750 Ti are out; let's see what we can expect from the Maxwell architecture. Please post any benchmarks of upcoming NVIDIA Maxwell family cards that you find or run yourself, in Cycles or any other CUDA-based benchmark.
I can't test the GTX 750 Ti with the official 2.68a build, because the GTX 750 Ti (Maxwell architecture) has CUDA compute capability 5.0, which isn't supported at the moment.
My build is based on the official Blender source, r61305.
I compiled it with default settings; the only changes are that I added support for CUDA compute capability 5.0 and compiled with the CUDA Toolkit v6.0 RC (official builds are based on v5.0 or v5.5), which is necessary to build the kernel_sm_50.cubin for the Maxwell architecture.
So the only option is to compile the latest Blender 2.68a source with CUDA Toolkit v6.0 RC and test it.
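For anyone wanting to try the same thing, here is a rough sketch of such a build. This is an assumption-laden illustration, not the exact commands used: the CMake variable names (`WITH_CYCLES_CUDA_BINARIES`, `CYCLES_CUDA_BINARIES_ARCH`, `CUDA_TOOLKIT_ROOT_DIR`) reflect Blender's build system as I understand it and may differ between revisions, and the toolkit path is hypothetical.

```shell
# Sketch only: variable names and paths are illustrative, not guaranteed exact.
# Point the build at the CUDA 6.0 RC toolkit; older toolkits cannot emit sm_50.
export CUDA_TOOLKIT_ROOT_DIR=/usr/local/cuda-6.0

# Configure an out-of-source build with sm_50 added to the cubin targets,
# so kernel_sm_50.cubin gets built alongside the usual architectures.
cmake ../blender \
    -DWITH_CYCLES_CUDA_BINARIES=ON \
    -DCYCLES_CUDA_BINARIES_ARCH="sm_20;sm_21;sm_30;sm_35;sm_50"
make -j8
```

Everything else stays at the defaults; the only meaningful change is appending `sm_50` to the list of CUDA binary architectures.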
Rolf, it's really cool. But I still don't know how to compile for the older CUDA compute capability 1.0. I have a GeForce 8600 GT (DDR3), which has compute capability 1.0. Is it possible to build a v1.0 cubin so it can use the 'Experimental GPU Compute' setting? I'd appreciate it if you could show us how and make a test build for v1.0 CUDA cards too. Big thanks.
I don't think it is possible to run it on compute capability 1.0; as far as I know, that generation lacks some instructions/functions that Cycles uses. Time for a little upgrade.
Rolf is the best. Thank you, thank you, thank you. I downloaded your build and it works great with my GTX 750 Ti! I'm in California; if I'm ever in Germany, I've gotta bring you a prize.
Wow, the GTX 750 Ti with 2 GB seems really great: nearly as fast as a GTX 580 (~200 W) but drawing only 60 W. And it costs 140 euros. So you can nearly buy two GTX 750 Ti cards for the price of one used GTX 580 with 3 GB, consume half the power, and still get even greater rendering speed. Of course, each card only has 2 GB.
Looking forward to the higher-end Maxwell graphics cards.
Maybe off topic, but two questions came to my mind:
1. Will Cycles be able to make use of regular system memory (or can any other GPU renderer already do that)? Or is this impossible?
2. And why is VRAM so much more expensive than regular RAM?
Rolf, thanks for all the info. I just learned that the GTX 750 Ti is not SLI-ready. How do two cards work together, then? Or is SLI not necessary for this? (Sorry, I don't have much hardware knowledge yet.)
Thx
Can you post some benchmark results from your PC?
I forgot to mention: the GTX 750 Ti is not SLI-ready.
But SLI is not necessary to render with two cards, and unlike with SLI, Cycles can use two cards of different series together.
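To illustrate the multi-card setup, here is a hedged sketch of selecting CUDA devices from Blender's Python console. The property names follow the Blender 2.6x Python API as I recall it and may differ between versions; the `'CUDA_MULTI_2'` enum value is an assumption about what the device list shows on a two-GPU machine.

```python
import bpy  # only available inside Blender's embedded Python

# Sketch only: property names per the Blender 2.6x API, may vary by version.
system = bpy.context.user_preferences.system
system.compute_device_type = 'CUDA'

# A 'CUDA_MULTI_*' entry (assumed name) groups all detected CUDA GPUs;
# SLI is not involved, so mixed series (e.g. a GTX 580 plus a GTX 750 Ti)
# can render the same frame together.
system.compute_device = 'CUDA_MULTI_2'

# Finally, tell the current scene to render on the GPU.
bpy.context.scene.cycles.device = 'GPU'
```

The same settings are reachable through User Preferences > System > Compute Device in the UI, which is what most people will use in practice.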
Hi guys. Blender noob here and this is my first post since joining in September of last year.
Here is a short video demonstrating the CUDA compute capability of Nvidia’s Maxwell chip on my new video card, as it renders Mike Pan’s popular BMW scene using the Blender build uploaded by Rolf. Enjoy.