AMD's new Ryzen CPU, using Blender to render.

Workstation users who really benefit from multi-core will probably go with a dual-CPU rig.

Yeah. Right now I have two computers with Intel Xeon E5 ES V3 & V4, two CPUs each.
If AMD does not offer something similar, there is no reason for me to change.
I don’t expect any ES version of AMD Naples to show up on eBay any time soon.

I did mention the allegation by someone in the comments that the article used the wrong image (followed by a posting of the ‘correct’ image which showed Zen matching Intel).

All of this right now, though, is ‘word on the street from China’ (and then there’s the idea that not all of the chip’s features have actually been enabled yet). Then there’s the question of how the Chinese benchmark guys got their hands on the chip in the first place (since they are not in distribution yet).

Very true, and that is why Intel and Nvidia should be funding Blender to make sure that Cycles runs best with their hardware.

Though how do we know that Cinebench treats all chips from all brands fairly (as there are some out there who claim it is a benchmark compiled with Intel’s compiler and designed for Intel’s hardware)? I do not know this myself, but then again we see everyone talking about an article on a known rumor-mill site.

Also, if AMD really wanted to bias the results in their favor, then they could’ve made you download a custom Blender build with their test file (AMD can’t just commit special code to master without Brecht and Sergey’s permission).

Cinebench uses the Intel compiler and was infamous for the “cripple non-Intel CPUs” flag. Not that it matters much, because a good chunk of the software industry uses that compiler, but holy hell, giving any credibility to a Fritz or Cinebench score is beyond nuts at this point. More info is likely to drop at CES. Stop reading shitty sites like wccftech.

RyzerGraphic27.blend

My rig with a 5960X: 32 s.

It is a two-year-old PC. And to be honest, I’ve never rendered a single frame with the CPU. In my opinion, a 1080 Ti with 16 GB of VRAM is worth waiting for.

Best Regards,
Albert

It’s an ES version of the chip; those are usually distributed to partners much earlier and can leak out in many illegal ways.
As for Intel, it’s the same story: you can already get 32-core Skylake-EP ES CPUs from Chinese sellers, even though there aren’t any motherboards for that chip available yet.

The RX 480 renders 30% faster than a GTX 1080, if I remember correctly. Maybe the 1080 Ti will match the RX 480’s performance, but it will still be behind the RX 490, and the RX 490X should set new records in rendering and in games using the next-gen Vulkan API. The RX 480 already matches the GTX 1070 in games using Vulkan. And AMD wants to integrate Vulkan, their open-source VR technology, and ProRender into Blender.
The Blender Foundation has hired an AMD developer full time to work with Sergey on improving OpenCL support.

You remember that wrong. The 1080 will be faster than any midrange card produced today. Just look at the Blender benchmark thread.

The RX 480 matches a GTX 1070 in Vulkan games? What?

I’m talking about a render engine built for OpenCL; Cycles has issues with OpenCL. The RX 480 being faster than a GTX 1080 was shown during the AMD GPU conference with the ProRender OpenCL demo.

My bad on the in-game performance: it’s in between the GTX 1060 and 1070 with Vulkan.

And OpenCL will have a big advantage over CUDA once it is properly implemented in Cycles (or should we use ProRender?): you can use both your CPU (plus its graphics chipset) and your GPU at the same time. So a Ryzen CPU + RX 490X is a better choice than a GTX 1080 Ti, because with CUDA your CPU stays idle during the render.
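For anyone curious what that device choice actually looks like from Blender’s side, here is a minimal sketch using the 2.78-era Python API (the exact property names are an assumption and have moved around between Blender versions):

```python
import bpy

# Pick the Cycles compute back-end in the add-on preferences
# (2.78-era path; older builds keep this under user_preferences.system,
#  newer ones under bpy.context.preferences).
cycles_prefs = bpy.context.user_preferences.addons['cycles'].preferences
cycles_prefs.compute_device_type = 'OPENCL'   # or 'CUDA', or 'NONE' for CPU only

# The scene itself still chooses between CPU and GPU rendering.
bpy.context.scene.cycles.device = 'GPU'       # 'CPU' keeps the render on the CPU
```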

My CPU doesn’t stay idle when rendering. I read newspapers or code and listen to music :slight_smile:
Like I said, I have never rendered a single frame with my CPU. It’s a great solution for the industry if you have a render farm, but not for home studios. A CPU with one rendered frame cannot compete with 30 frames rendered in the same time by my two GPUs. The only limitation is VRAM size, which can be worked around with render layers, etc. This may also change when VRAM crosses the magic barrier of 16 GB. Nowadays you can do pretty much anything on the GPU more or less efficiently, and it seems the CPU is just there for decoration.
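A very rough sketch of that render-layer workaround, assuming the 2.7x API (the render_layer_add operator and the 20 scene-layer slots); how objects are actually grouped across scene layers depends entirely on the scene, and the chunk size here is a made-up example:

```python
import bpy

scene = bpy.context.scene
CHUNK = 5   # hypothetical: how many of the 20 scene layers each pass should load

# Split the scene layers into groups so each render layer only pulls part of
# the scene into VRAM; the resulting passes get composited back together later.
for start in range(0, 20, CHUNK):
    bpy.ops.scene.render_layer_add()           # the new render layer becomes active
    rl = scene.render.layers.active
    rl.name = "vram_chunk_%02d" % (start // CHUNK)
    rl.layers = [start <= i < start + CHUNK for i in range(20)]
```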

Best Regards,

Albert

Looks like another fake from that joke of a site.

https://www.reddit.com/r/hardware/comments/5jc6vf/amd_ryzen_cinebench_r15_gegen_core_i77700k_und/

@Karab44

Is your 5960X overclocked?

Otherwise that is some generational improvement. I have a Xeon E5-2687W (slightly faster clocks, +300 MHz) and I only get 48 s. Though I guess you could also have tested it on a non-Windows system?

Got my second Xeon in.

So 2x Xeon E5-2687W = 26 s… this is in Windows. The general question is: do you think the benchmark that AMD showed was done in Linux to get an even faster rendering time?

It shouldn’t make much of a difference because the speedup on Linux also applies to Intel chips (so valid comparisons can still be made if that was the case).
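If anyone wants to check that for themselves, timing the same headless render on both systems gives directly comparable numbers. A minimal sketch (the Blender path is a placeholder, point it at whatever you actually have installed):

```python
import subprocess
import time

BLENDER = "blender"                    # placeholder: path to the Blender binary
BLEND_FILE = "RyzerGraphic27.blend"    # the benchmark file shared earlier in the thread

start = time.time()
# -b runs headless, -f 1 renders only frame 1
subprocess.check_call([BLENDER, "-b", BLEND_FILE, "-f", "1"])
print("Render wall time: %.1f s" % (time.time() - start))
```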

Is there really a case where one frame on the CPU takes as long as 30 frames on the GPU? If anything, it’s maybe 3:1 in most cases. But if you render with the CPU at the same time, you’ll get 4 frames done instead of 3. Every frame counts :slight_smile:
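A rough sketch of how that “render with the CPU at the same time” idea can be wired up: two headless Blender instances, each taking a slice of the frame range. The binary path, file name and frame split are placeholder assumptions, and the GPU instance assumes a compute device is already configured in the user preferences:

```python
import subprocess

BLENDER = "blender"            # placeholder: path to the Blender binary
BLEND = "animation.blend"      # placeholder: the animation being rendered

def render_range(start, end, device):
    """Launch one headless Blender instance rendering frames start..end."""
    # Override the Cycles device for this instance only.
    expr = "import bpy; bpy.context.scene.cycles.device = '%s'" % device
    return subprocess.Popen([
        BLENDER, "-b", BLEND,
        "--python-expr", expr,
        "-s", str(start), "-e", str(end),
        "-a",                  # render the frame range set above
    ])

# The GPU takes the bigger slice and the CPU picks up the rest (roughly the
# 3:1 ratio mentioned above); both run at the same time.
jobs = [render_range(1, 75, "GPU"), render_range(76, 100, "CPU")]
for job in jobs:
    job.wait()
```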

There is the question of whether realtime graphics will have a place in animation (it will). For most scenes it’s better to bake and then render in real time anyway.

Bigdad, I use multiple GPUs for rendering. And yes, the difference is extremely shocking. Using the CPU for rendering is good if you are in one of those distributed render-farm programs, where you can share one or two cores for network rendering with others. I don’t see any other reasonable application for it…

Grzesiek, I work on W10 Pro. No, I use the standard factory clock.

In the AMD video presentation they test Ryzen on W10.

Best Regards!
Albert

The Titan X (Pascal) has 12 GB of GDDR5X (GP102), so I really doubt we will see a Pascal 1080 Ti with 16 GB.

You’re right. The latest leaks from different sources mention 10 and 12 GB of GDDR5X VRAM for the 1080 Ti. There’s also something about a 16 GB AMD Vega, which could be a good alternative for 3D artists…
However, I would wait for an official announcement.

Best Regards!
Albert