Nvidia 1070/1080 anti-aliasing of text: GPU rendering not as good as CPU rendering

I'll spare you the long story and just say I have actually had this issue for a while, but only now, when I needed to make a decent graphic, did I really immerse myself in the problem.

And the problem is simple: terrible aliasing when rendering in Blender Internal or Cycles, no matter what the anti-aliasing settings are.

My current system specs:

i7 3770 CPU
Asus P8Z77-V LE Plus mainboard
16 GB DDR3 RAM at 1600 MHz
850 W power supply
3x 32-inch Sharp 1080p TV monitors
240 GB SSD with a 6 Gb/s 2 TB storage drive to accompany it
Nvidia GTX 1070 (installed 2 days before writing this; the old GPU was a GTX 780, which had the same issue)

Yes, I know how to set up CUDA and anti-aliasing. The rendering of TEXT is awful, and I noticed the same thing when I put various objects in the scene: very apparent jagged edges.

I set up the CUDA device in preferences and made sure to enable GPU rendering in Blender. BTW, 2.77a will not do Cycles rendering with the new cards; I had to go back to the original 2.77, and in that folder it does.
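For anyone who wants to double-check the same setup from Blender's Python console, here is a minimal sketch. It assumes the 2.78-style API, where the compute device lives in the Cycles add-on preferences (in 2.77 it sat under the System preferences instead):

```python
import bpy

# 2.78-style: compute device is in the Cycles add-on preferences.
cycles_prefs = bpy.context.user_preferences.addons['cycles'].preferences
cycles_prefs.compute_device_type = 'CUDA'

# Tell the current scene to run Cycles on the GPU.
bpy.context.scene.cycles.device = 'GPU'
```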

This is what I tried before discovering the problem:

  1. changing anti-aliasing settings in Blender and rendering at different Cycles sample levels, from 32 to 1000 (see the script sketch after this list)
  2. rendering in Blender Render (Internal): same issue
  3. rendering really big, like 4000x3000, and shrinking the image (also in the sketch below)
  4. messing with the settings in the Nvidia Control Panel (outside of Blender): I used the per-program settings Nvidia lets you set, configured anti-aliasing there, and it made no difference in the render
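If it helps anyone reproduce steps 1 to 3, here is a rough sketch of the equivalent settings from the Python console. The sample count, resolution, and AA values are just the ones I tried, not recommendations:

```python
import bpy
scene = bpy.context.scene

# Step 1: crank the Cycles sample count (I tried everything from 32 to 1000).
scene.render.engine = 'CYCLES'
scene.cycles.samples = 1000

# Step 2: the same test in Blender Render (Internal) with its own anti-aliasing.
# scene.render.engine = 'BLENDER_RENDER'
# scene.render.use_antialiasing = True
# scene.render.antialiasing_samples = '16'

# Step 3: render oversized (4000x3000) and shrink the image afterwards.
scene.render.resolution_x = 4000
scene.render.resolution_y = 3000
scene.render.resolution_percentage = 100
```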

I did end up finding the problem, and no, I am not happy!
Here is what the problem is and why this post is here:

I decided to try rendering in CPU mode. Oh yes, it is slow!!!
But it worked; the difference in render quality is huge.

So the problem is GPU rendering with Nvidia cards! CPU rendering takes longer but looks right.
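If you want to A/B the two devices yourself, here is a quick sketch that renders the identical frame once per device and writes both out so the aliasing can be compared side by side (the output path is just an example):

```python
import bpy
scene = bpy.context.scene
scene.render.engine = 'CYCLES'

# Render the same frame on GPU and then CPU, saving each result.
for device in ('GPU', 'CPU'):
    scene.cycles.device = device
    scene.render.filepath = '//aa_test_%s' % device.lower()  # example path
    bpy.ops.render.render(write_still=True)
```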

So now I am pissed that I spent nearly $500 on a video card and it is of no use!

I don't know if the guys who programmed Blender can fix this or not. I can see Nvidia doing this to make people pay them a royalty for using CUDA technology; that is the difference between CUDA and OpenGL anyway, CUDA is proprietary!

I am taking the card back and am going to get a Radeon RX 480 when they get in stock and find out. This sure is costing a lot of time and money to get working!

Here are some pics of the Nvidia GPU rendering and then the Intel i7 rendering.

This is the Nvidia GTX 1070 render; look at the massive jagged edges.


This is the slow-as-hell Intel i7 quad-core render. Oh, but it looks so nice!


:wink:

Use a Blender version that supports the new cards.
Download http://download.blender.org/release/Blender2.78/

Also try changing sampling settings to Branched Path Tracing.
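If you'd rather set that from the Python console, here is a minimal sketch (the AA sample count is just an example value):

```python
import bpy
scene = bpy.context.scene

# Switch the Cycles integrator from Path Tracing to Branched Path Tracing.
scene.cycles.progressive = 'BRANCHED_PATH'

# In Branched mode, anti-aliasing samples are set separately from the
# per-bounce sample counts.
scene.cycles.aa_samples = 8
```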