I’ve got some issues using Cycles on Apple Silicon. When I’m rendering on the GPU, I often get an error. I already checked memory usage, which is fine: about 10GB of the 64GB available.
It always throws the same error:
Error: Command buffer not completed. status = 5. (PSO_GENERIC.integrator_intersect_closest):
Since it is a Mac with Apple Silicon, the CPU and GPU share the SoC’s unified memory, 64GB in total (and the system actively swaps if memory usage exceeds that).
But as already stated, the current render doesn’t exceed 10GB of total RAM (GPU and CPU combined), and it fails before even getting past the BVH build.
If I render on the CPU only, no more than 8GB of RAM is used either.
I observe the same issue in far simpler scenes…
Sometimes even the viewport render fails to load with the default cube.
I dug into the triangle limit for rendering and found someone who had performance issues with Geometry Nodes: a huge performance hit unless he realized the geometry. Going to test that out ASAP…
Meanwhile, I tested the same scene on my Windows PC (i7-8700K, 16GB RAM, GTX 1070 with 8GB VRAM). Most of the frames rendered within 2 minutes; at the frame number where the render crashed on my Mac, it became painfully slow (4 samples in 16 minutes).
(I’m trying to render a point cloud consisting of approximately 1.6 million instances of an icosphere.)
Will look into the links you provided later this evening. Thanks for the support so far
Well, with realized geometry it works, and it’s even 20% faster. It’s difficult to render 1.6 million points of a point cloud while using fewer faces (currently an icosphere with 1 subdivision)
(unless I just don’t know how to render them properly, since using a volume and Point Info is too time-consuming for a dissolving title effect).
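For a rough sense of why that instance count hurts, here is a back-of-the-envelope triangle count. This is only a sketch: the 20-triangle base icosahedron and the “each subdivision multiplies triangles by 4” rule are my assumptions, and Blender’s Ico Sphere “Subdivisions” setting may count levels slightly differently.

```python
# Back-of-the-envelope triangle count for the instanced icospheres.
# Assumption (not from the thread): base icosahedron = 20 triangles,
# and each subdivision step multiplies the triangle count by 4.

def icosphere_tris(subdivisions: int) -> int:
    """Triangle count of an icosphere after `subdivisions` subdivision steps."""
    return 20 * 4 ** subdivisions

instances = 1_600_000                  # ~1.6 million points in the cloud
tris_per_instance = icosphere_tris(1)  # one subdivision -> 80 triangles

total = instances * tris_per_instance
print(f"{total:,} triangles")  # 128,000,000 triangles
```

That lands in the same ballpark as the ~150M faces mentioned later in the thread, which is why realizing (or simplifying) the instanced geometry makes such a difference.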
I heard there is a trick: create a new object with a Geometry Nodes modifier, then drag the original Geometry Nodes object into it, which basically caches the result of the first object in the second, yielding massive performance improvements.
Sorry I didn’t answer for so long; I was quite sick for a few days…
Hope y’all had good Christmas holidays (for those who celebrate it)
and thanks so far for the answers…
(For those interested: I’m doing a text dissolution effect in quite high resolution, currently using spheres but probably switching to the mesh faces themselves to reduce overhead, sacrificing a little bit of quality… A second test I did, with a longer text, used 150M faces without any problems, even exceeding the total system RAM, resulting only in a performance penalty.)
I think there is a memory leak. I’m trying to render a smoke sim and get this error every 30 frames or so (the sim was baked at 512 resolution!), but if I cancel and hit render again, it carries on.
I’d like to revive this topic… with a little addition…
I got exactly the same error when trying to render on an M2 Max with Blender 4.1 (release version).
And I don’t think it’s a VRAM issue. The scene uses about 8GB of RAM… and my MacBook has 96GB. (No other RAM-heavy programs are open.)
I also don’t use Geometry Nodes in this scene.
What triggers the problem for me is simply enabling motion blur. As soon as I disable it, it works. On the CPU I can also render with motion blur.
Did anyone else encounter this?
It’s pretty annoying, as I desperately need motion blur in the scene… and CPU rendering takes forever…