Mac: M1, M2 - M3? *Blender 3.5 Stable!*

There are no problems running Blender on ARM; it has been doable for years.

The bigger problem is the graphics API, and the potential of Apple blocking apps like it does on iOS.


The hardware side of PCs is an easy win, especially with AMD’s Ryzen. Their weakness is sadly Windows 10 with its evil practices. I’m still on Windows 7. :face_with_raised_eyebrow: Linux always makes me angry, so I can see why people would try Apple.

The Anandtech article is pretty informative and shows that, for years, Apple has had much steeper performance gains between generations than Intel. If they keep up that trend at least for a couple more years, I don’t see Intel inside any of my future devices.


These days, they’re more likely to make their own chips so they won’t have to pay someone else for it. It’s a lot cheaper.

I’m gonna stick my neck out here and say the biggest beneficiaries will be Cinema 4D users on a Mac. Look at recent acquisitions: Maxon buys Redshift, and Redshift and Apple are (early days) developing Redshift using the Metal API, etc. Apple can afford to keep their approach streamlined; they don’t welcome too much competition, preferring to choose affiliates themselves and dictate the roadmap. And you know what, I just think: look at the astonishing work created right now in Blender, by people with technically quite inferior setups, and realise that it’s the final result we are all inspired by, not the power of the processors that created it. Man, there’s more FOMO being a C4D user looking at stuff like HardOps and DecalMachin3 AAAAAAND Eevee in Blender than there is for PC users looking at people snapping up Mac Minis with Bionic chips :kissing_heart:


Well, the M1 chip’s target is Intel CPUs and Intel integrated GPUs. They fully pushed Intel out here, mostly through efficient energy management. The GPU is nowhere near Nvidia or a general desktop GPU. So much for the little computers that have been updated.
The next thing will likely be the M2 for the Pro laptops and iMacs. Not sure what they will have by then; expect it later next year.
It could be that Apple can already put more than one CPU in an iMac. And if you take four M1s, suddenly you have a massive machine with really low energy usage and enough RAM to run Octane on complex scenes. It doesn’t have to go that way, but it could get interesting.


That’s a good theory, I didn’t think about that. I mean, they’re kind of in charge of their entire pipeline now and able to draw any roadmap they desire.
3-5 years from now, what is standard for the industry today may look entirely different. Or maybe the same-old same-old of the last twenty to thirty years… here’s your motherboard, here’s your CPU, here’s your GPU, here’s your HDD/SSD, get to work. :stuck_out_tongue:

Just read that OctaneX supports the M1, including network rendering on up to 20 machines, incl. iPads and iPhones.
Not sure I want to use my phone for rendering, but it could be interesting for everyone.


I’m on Windows, but would love to switch to Linux and can’t, because there’s too much Windows-only software I need for work. Windows is the one that makes me angry, because fundamental parts of it (File Explorer, search, the Start menu) are just crap, and there are bugs that seem to have existed since Windows 95 still occurring (issues with icons sometimes being wrong, for instance).

I tried using a Mac some time ago, and I sold it a couple of months after I bought it because it felt like a sort of locked down Linux, which had some advantages of Linux, but many weird limitations that I didn’t have on Windows or Linux, lack of customisation of annoying things like window animations, and having to download an app to add cut and paste seemed ridiculous.

One thing I do like about Apple, is that when they introduce new hardware, such as the Apple pen, it enjoys wide app support in ways that MS never does with Windows peripherals. Look at how much of a mess graphics tablets are on Windows; MS introduced Windows Ink a few years ago, but it made things worse for many tablet users and if you’ve tried using graphics apps on a Windows convertible laptop, you’ll realise that nothing is actually optimised for a workflow that has no keyboard and only pen input. It’s a similar situation when you compare Android with iOS, where Apple’s features get proper support and work right, whereas Google’s often feel half arsed, or don’t get support from software companies or phone manufacturers.

Back on topic though, and the Apple Silicon does sound like a potentially large advantage in the long run, due to Apple’s control of every aspect of their computers, which gives them the chance to heavily optimise for their own hardware, as well as dictating how other software should work or be optimised in order to get on the app store etc.

This whole time, I’ve been wondering if you folks have heard of Louis Rossmann on YouTube. Great quality videos.


So the M1 chip has “16 GB unified RAM”.
This is nice for the performance of ‘normal’ programs, but might be very bad for rendering, in my opinion.
Right now (on PC) I have 32 GB of RAM and 11 GB of GPU memory.
I render using CUDA, so I have 11 GB for rendering.
If 11 GB is not enough, I can switch to CPU and use 32 GB.
But if the M1 chip has unified memory, it means I would have less, because everything is now stored in the same memory!
I think it would limit the size of the scenes I can render, simply because of the shared memory model.
You can never even use the full 16 GB, because at least the OS and Blender will take up memory space.
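The budgets above can be sketched in a few lines. This is only illustrative arithmetic: the 11/32/16 GB figures come from the post, while the 3 GB OS-and-application overhead is a made-up placeholder, not a measured value.

```python
# Illustrative render-memory budgets: discrete-GPU PC vs. unified memory.
# The 3 GB overhead for OS + Blender is an assumed placeholder value.

def render_budget_gb(total_gb, overhead_gb=0.0):
    """Memory left for scene data after fixed overhead."""
    return max(total_gb - overhead_gb, 0.0)

# PC with a discrete GPU: VRAM is dedicated to rendering,
# while system RAM absorbs the OS and application overhead.
pc_gpu_budget = render_budget_gb(11)                  # CUDA: full 11 GB VRAM
pc_cpu_budget = render_budget_gb(32, overhead_gb=3)   # CPU fallback: 29 GB

# M1: one 16 GB pool shared by the OS, Blender, and the renderer.
m1_budget = render_budget_gb(16, overhead_gb=3)       # 13 GB

print(pc_gpu_budget, pc_cpu_budget, m1_budget)
```

Under these assumed numbers, the unified pool sits between the two PC cases: more than the 11 GB VRAM ceiling, but well below the CPU-fallback budget.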

Another aspect I’m not sure about:
In the current architecture, GPU memory is MUCH faster than regular RAM.
The reason is that regular RAM is multi-purpose, while GPU memory is used by one chip only.
Is M1 memory as fast as GPU memory is right now or is it somewhere in between current RAM and GPU memory?
(I personally think it will be in between but I don’t know)
If it’s not as fast as current GPU memory, rendering would be slower.
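To get a feel for why bandwidth matters for rendering, here is a rough sketch. The bandwidth figures are ballpark public numbers I am assuming for illustration (dual-channel DDR4 around 50 GB/s, M1 unified memory around 68 GB/s, high-end discrete GPU GDDR6 around 600 GB/s), not benchmarks of any specific machine.

```python
# Rough time to stream a 10 GB scene once through memory at assumed
# ballpark bandwidths. Purely illustrative, not measured figures.

scene_gb = 10

bandwidths_gb_s = {
    "DDR4 system RAM": 50,
    "M1 unified memory": 68,
    "discrete GPU GDDR6": 600,
}

for name, bw in bandwidths_gb_s.items():
    # time = data / bandwidth, converted to milliseconds
    print(f"{name}: {scene_gb / bw * 1000:.1f} ms per full pass")
```

If those ballpark numbers are roughly right, unified memory would indeed land “in between”: noticeably faster than plain system RAM, but an order of magnitude slower than dedicated GPU memory.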

~100+ GB of textures in the M1’s 16 GB, available immediately to the rendering engine thanks to unified memory.


Did you ever try C4D with ‘millions’ of objects? Or a lot of external files?
Good luck with that :wink:

According to Geekbench, it is close to the entry-level Mac Pro from 2019:

Mac mini (M1)
Single Core: 1687
Multi Core: 7433

Mac Pro (Late 2019)
Single Core: 1039
Multi Core: 8237

Really looking forward to the 25th, when my Mac mini arrives ;D

The 16 GB limit is the reason (I think…) they came out with the laptops & the mini first.
16 GB is nowhere near enough for graphic and 3D artists during the day. I can easily cap my 64 GB doing sims, or stitching very large HDRIs, while doing graphic work and having a 3D app open.

Another thing to consider is how quickly developers will convert their applications to this new silicon, and how long Apple will keep supporting your current x86 hardware. Looking at the past, that might be a disappointing path to travel.



There seems to be a lot of focus on what is released right now, comparing it to current rendering requirements, as in, people are looking at a 13" MacBook Air and wondering how it would perform rendering a complex scene. For a start, no one buys a 13" MacBook Air for 3D. I’ve soldiered on with a 15" MacBook Pro from 2015, and the amount I’ve learned on that machine is incredible. Sure, there are a few things I’d really like to get into, such as Substance Painter, that are hampered by my underpowered machine, but that can wait until I own a machine that can handle them. So what did I do? I focused on learning how to model properly, simple setups, that kind of thing. It’s all too easy to sit and look at this technology and complain about what it can’t do, when really that shouldn’t stop you from making stuff.

It’s the same over in photography blogs – there will be a beautiful landscape scene and the comment section turns into a bitchfest about camera manufacturers.

I think Apple are more than capable of producing a machine, and chips, to really push the power of Cinema 4D; it’s really clear that’s where their priorities lie. I work in advertising in the UK, and every studio is stuffed full of iMacs, and the 3D software of choice is Cinema 4D. Apple knows this; their biggest market for computers is this industry. Their attitude will be ‘here’s the hardware, if you want your software used on our platform then here’s the architecture, the code, the API; you conform to those standards and our rules and we’ve got a deal’. The world’s [sometimes] richest company doesn’t have to rely on open source software with a cool user base for survival. It creates its own arena…

FWIW I want to build my own PC for my private / hobbyist 3d work, I just have to use Mac for work.


The ARM cores sound nice, but so far the GPU seems to be a match for Intel integrated GPUs only.

The vendor lock-in and the prices for RAM and SSDs are horrible. :slight_smile:

What I dislike most is that the SSDs are soldered on. Basically, that’s planned obsolescence, given the 3000 or so rewrite cycles a typical TLC flash cell can endure.


The SSD is certainly not well suited for planned obsolescence.

Just do the math:

(write cycles × capacity) / (SSD factor × data written per year) = lifetime in years

The SSD factor indicates the ratio of the data actually written to the flash to the real (user) data volume.

Example for a Samsung 850 PRO with 1 TB:

(3000 × 1000 GB) / (5 × 1750 GB per year) ≈ 343 years

Sure, this is only a theoretical value. The warranty for the mentioned SSD is ten years :wink:

Another fact a lot of people get wrong: it is not an ARM CPU. This is because Apple has not licensed an actual processor design from ARM, but only the instruction set.


I’m trying to understand Jules’ tweet. 16 GB of RAM will hold a 100 GB scene in Octane X because of unified memory and texture compression? :flushed:


That single-core score is insane! There was a new Mac mini in that list that scored in the 1700s. :flushed:

That’s on par with the brand-new Ryzen chips.
