Cycles HIP for AMD

I totally agree… When Cycles was first released in that Easter announcement, it was announced as OpenCL compatible. Then, while they were fleshing out the renderer and making it work across all cards, OpenCL was dropped because it couldn't handle it, with the work left waiting for AMD to fix their drivers.

A few years later, AMD submitted the split-kernel patch, which brought support back to AMD cards. Support for AMD cards had returned! But again, within a few years it was pretty average, blamed on AMD drivers once more.

Now there is HIP rendering, and whilst I am hopeful that this time will be different (AMD being a Blender Development Fund supporter, etc.), I am still doubtful.


Once is happenstance. Twice is coincidence. The third time it’s enemy action.

If Blender drops CUDA / HIP in the next few years, I will truly believe they're out to get AMD :laughing:.

From some things I read last year it seems the entire world lost interest in OpenCL like 3 years ago.

Blender didn’t choose anything. Nvidia threw time and manpower at CUDA/OptiX year after year and was rewarded with at least a dozen dirt-poor people I know sacrificing everything to buy Nvidia products to improve their Blender experience. Blender gained a dozen more users even though Nvidia did most of the work to get better Nvidia support into Blender.

Not exactly. Cycles debuted with CPU and CUDA support, and OpenCL was explored, but didn’t work at the time of release.

The problem with OpenCL was that it was too complex for the hardware of the era, and coding for it wasn’t exactly easy. A simple “hello world” in OpenCL could fail in spectacular ways depending on the hardware you were using, and any code had to be recompiled every time the data or routine changed in subtle ways. AMD leadership was quite shortsighted at the time and treated general GPU compute as a second-class citizen, since its investors didn’t believe Nvidia’s CUDA would take off the way it did. (Granted, Intel’s OpenCL implementation was also known to be rather unstable.) AMD’s OpenCL implementation was hit or miss, since their hardware was never designed with OpenCL in mind until the R9 3xx series at best, and only Polaris had decent support (despite driver bugs).
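To make that concrete, here is a minimal host-side sketch (plain OpenCL C API with a made-up one-line kernel, nothing Cycles-specific): the kernel ships as source text and the vendor’s driver compiles it at runtime on the user’s machine, which is exactly where those hardware-dependent failures tended to surface.

```cpp
// Minimal OpenCL host-side sketch: the kernel is plain source text that the
// vendor's driver compiles at runtime on whatever hardware the user has.
#include <CL/cl.h>
#include <cstdio>

const char* kSource =
    "__kernel void hello(__global float* out) {"
    "    out[get_global_id(0)] = 1.0f;"
    "}";

int main() {
    cl_platform_id platform;
    cl_device_id device;
    cl_int err;

    clGetPlatformIDs(1, &platform, nullptr);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, nullptr);

    cl_context ctx  = clCreateContext(nullptr, 1, &device, nullptr, nullptr, &err);
    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSource, nullptr, &err);

    // The driver compiles the kernel here; any change to the kernel text means
    // going through this step (and its driver bugs) again on the user's machine.
    err = clBuildProgram(prog, 1, &device, "", nullptr, nullptr);
    if (err != CL_SUCCESS)
        printf("clBuildProgram failed: %d\n", err);

    clReleaseProgram(prog);
    clReleaseContext(ctx);
    return 0;
}
```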

So what happened with OpenCL? Its inventor, Apple, discontinued it, and the rest of the world followed. You can’t fix what was born broken. And since AMD’s OpenCL implementation was broken by design (sorry, sounds harsh, but it’s the closest thing to reality, really), they discontinued it as well.

Now AMD has HIP, which works on their own cards, and tooling that makes it easy to convert/port code directly from CUDA. Let’s hope that this time the driver development department at AMD doesn’t f*** it up.
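For anyone curious what that porting looks like in practice, here is a tiny illustrative sketch (a made-up `scale` kernel, not actual Cycles code): the HIP runtime API mirrors the CUDA runtime almost name-for-name, which is what AMD’s hipify tooling exploits when converting sources.

```cpp
// A made-up HIP kernel, not Cycles code: the structure and API calls mirror
// the CUDA runtime almost one-to-one, which is what makes porting feasible.
#include <hip/hip_runtime.h>

__global__ void scale(float* data, float factor) {   // identical syntax to CUDA
    data[threadIdx.x] *= factor;
}

int main() {
    float host[256];
    for (int i = 0; i < 256; ++i) host[i] = float(i);

    float* dev = nullptr;
    hipMalloc(&dev, sizeof(host));                              // cudaMalloc
    hipMemcpy(dev, host, sizeof(host), hipMemcpyHostToDevice);  // cudaMemcpy
    scale<<<1, 256>>>(dev, 2.0f);                               // same launch syntax
    hipMemcpy(host, dev, sizeof(host), hipMemcpyDeviceToHost);
    hipFree(dev);                                               // cudaFree
    return 0;
}
```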

Do you think it’s appropriate for Blender to clarify their mission statement? How about…

get the world’s best 3d technology in the hands of artists (#1) as open-source (#2), and make amazing things with it (#1).
#1 - Provided you have access to a supported GPU.

#2 - GPU acceleration to rely on vendor-specific proprietary SDKs, with initial support supplied by GPU vendors and future maintenance at Blender’s discretion, based on how complex Blender considers that implementation.

Sounds awfully realistic.

On another note: you cannot display anything on a screen without a GPU, so you have to dance with the devil(s).


Why?
The largest number of Blender users are on Windows, not Linux. And Blender also runs on Mac, an even more proprietary platform… so should Blender be stopped on Windows and Mac?

Blender also exists with OpenCL; it is just bad and does not really advance Blender. We have Vulkan as an open architecture, but I don’t know how useful it will be.

I think you hit the nail on the head. If one statement below is correct, surely both are correct?

  • Halting Blender development for Windows or Mac would disadvantage those that use Windows or Mac.
  • Halting Blender development for OpenCL would disadvantage those that use OpenCL.

Who puts most, if not all, of the CUDA and OptiX code into Blender? Blender devs or Nvidia devs?

Does anyone know?

What about OpenCL and any Radeon-specific code?

So you’re saying that Blender should never be able to move away from obsolete tech/code…

thinsoldier and Built first, I would like to commend you on your endless pursuit of defending Blender’s mission; this awe-inspiring story will be talked about for years! I dedicate to you my last attempt at one mission to rule them all.

get the world’s best 3d technology in the hands of artists (#1) as open-source (#2, #3), and make amazing things with it (#1). (#4)
#1 - Provided you have access to a supported GPU.
#2 - GPU acceleration:
(a) To rely on vendor-specific proprietary SDKs, with initial support supplied by GPU vendors.
(b) Future maintenance of code is primarily to be supported by GPU vendors.
(c) From time to time Blender may forbid certain open technologies.
#3 - Windows, Mac, and Linux operating systems supported.
#4 - If you feel Blender does not live up to this mission then fork it.

I remember a scene from the Blade II movie where Blade says to a guy:
“You aren’t a vampire, you are human.” The dude’s response: “Barely. I’m a lawyer.”


I might shock you, Loas, but the truth is that:

#1: Blender has ALWAYS, since day ZERO, required a supported GPU. It’s just that in the old days the term GPU didn’t exist; it was called a “graphics accelerator”, and one of those cost thousands of dollars in its heyday. Blender devs could never support SiS or S3, and 3dfx barely worked at the time, so apart from stating the obvious, what’s your point?

#2 (a) GPU acceleration: From day zero they relied on a specific proprietary SDK. It was called OpenGL (proprietary at that time, although I think 1.0 was eventually made properly “open”). At the time of day zero it was proprietary; that’s why MesaGL was born (compatible with OpenGL, but a different implementation done by… OMG!!! the actual manufacturers!!! Who could have known?! (/s)) (except Nvidia and Broadcom, whose shareholders love closed things). And in the beginning all of your software (even Linux) depended on proprietary APIs and drivers. Again, what’s your point?

#2 (b): “Future maintenance of code is primarily to be supported by GPU vendors.” Why do you think GPU drivers are so complicated? They maintain binary lists, CRCs, and workarounds for games. All GPU manufacturers nowadays dedicate developers to helping game developers make their games run at the highest FPS they can. Nvidia developers even go as far as replacing the entire shader code of several games at the driver level to make them run faster on their cards. What they do with Blender is donate the code to run on their product, and then you make your clients happy. Maintenance is shared between Blender devs and GPU manufacturer devs. It’s called collaboration. Do you know the concept? (I believe Linux and other open-source OSes depend on that concept, I think.)

#2 (c): “From time to time Blender may forbid certain open technologies.” Yeah, sure. From time to time Linux even does that: you are forbidden from running any Linux kernel from 5.8 onwards on any 386-class or lower system. MesaGL has also dropped support for any ATI Radeon up to the 8500-era cards, SiS, S3, and 3dfx since December 2021, so you need the older versions if you insist on keeping those systems running. OpenCL failed in the real world™, and even its original author (Apple) recognized that and discontinued it in 2018. It is a known fact that OpenGL is in maintenance-only mode and will be deprecated sooner or later, with Vulkan replacing it, and who knows? Developers might need the GPU manufacturers’ assistance (wowzers!!) the day Blender devs (who am I kidding, the day Brecht, Sergey, Clément, or Jeroen have the time/resources to do so) decide to port Blender to Vulkan. BTW, you can’t run Blender 3.x on Windows 7 either, and that’s not the Blender devs’ fault.

#3: Windows, Mac, and Linux… You’re free to support other OSes; nobody is stopping you. FreeBSD is still on 2.9x since nobody cares enough to do the port. iOS is restricted by Apple. Android ports do exist, but they are mostly non-functional, since Android GPUs don’t support OpenGL but a different API known as OpenGL ES, which is not compatible with the original, albeit similar.

#4: There have been several of those forks. Blender-instinctive was one; there was another years ago that tried to do software-only rendering for the viewport (to make Blender work on S3/SiS/3dfx cards) but was never able to compile, and I knew of several others that tried to turn it into print-shop modeling software. Nowadays there’s “Bforartists”, which tries to put the old MS Office 2000/Maya interface paradigm into Blender.

I might be beating a dead horse, but Blender’s mission nowadays is to “get the world’s best 3D technology in the hands of artists as open-source, and make amazing things with it.” And like it or not, the “best 3D technology” is not the deprecated one at this point. AMD is just trying to repair the mistakes their older management made back in the day, and I, at least, welcome this, and anyone with an AMD system should too. Heck, even Nvidia users should, since in the end everyone benefits from it. If you can’t understand this, there’s always Blender 2.9x and older for you (and Windows 7 and older, with their legacy drivers too!).

For mods: we’re derailing here. You might want to clean up this thread.

Uh. What? Can you explain what you mean by OpenGL being proprietary? Kinda seems like the opposite of that.


I am referring to older times, IRIS GL times, when OpenGL wasn’t really “open” (in the sense it is now). That is what led to the MesaGL initiative… Quite old times.

Of course, since you work for AMD, you might have another opinion or know the facts better than this grumpy old geezer (won’t be the first nor the last time :slight_smile: ). (BTW, I edited the original post for clarity.)

So I just got a 6900 XT… It renders pretty nicely, though I was expecting it to do slightly better with a furry character. I wonder when we will get hardware-accelerated ray tracing…

EDIT: And it seems there is a texture size limit… It would be good if, behind the scenes, it just handled anything we threw at it at full resolution… Gotta push that 16 GB of VRAM to the max. It’s also very common for us to use 48k × 24k textures.

Could you elaborate on your statement about the texture limit?

A single texture cannot be larger than 8k, I believe. 16k textures don’t work.
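If you want to check what the hardware itself reports (whether the 8k cap comes from the device or from Cycles’ own image handling isn’t clear from this thread), a quick HIP query would look roughly like this:

```cpp
// Query what the driver reports as the maximum 2D texture dimensions.
// This only shows the hardware-reported limit; any cap inside Cycles itself
// is a separate question.
#include <hip/hip_runtime.h>
#include <cstdio>

int main() {
    int maxW = 0, maxH = 0;
    hipDeviceGetAttribute(&maxW, hipDeviceAttributeMaxTexture2DWidth, 0);
    hipDeviceGetAttribute(&maxH, hipDeviceAttributeMaxTexture2DHeight, 0);
    printf("Max 2D texture size: %d x %d\n", maxW, maxH);
    return 0;
}
```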

I see. Thank you for clearing this up.

Does anyone know if there is any update regarding AMD GPU support on Linux?