Nvidia 555 drivers kill CUDA

I can’t see this having been mentioned anywhere. I use Arch Linux (BTW :wink: ) and have asked on the forums there, but nada. However, I’m thinking it’s a Blender problem.

The Nvidia 555 drivers (the sub-variant doesn’t matter) are failing with CUDA errors. This is on Blender 4.2.

Illegal address in CUDA queue copy_from_device (integrator_init_from_bake integrator_shade_surface integrator_sorted_paths_array)

Reverting to the 550 drivers and all is good.

Except it gets weirder: Blender 4.1 segfaults.

# Python backtrace
# Blender 4.1.1, Commit date: 2024-04-15 15:11, Hash e1743a0317bc
Read library:  '/home/steve/.config/blender/4.2/scripts/addons/botaniq_full/blends_280/flowers/Flower_Daisy_A_spring-summer.blend', '//../../.config/blender/4.2/scripts/addons/botaniq_full/blends_280/flowers/Flower_Daisy_A_spring-summer.blend', parent '<direct>'  # Info
Read library:  '/home/steve/.config/blender/4.2/scripts/addons/botaniq_full/blends_280/grass/Grass_Tall_A_spring-summer.blend', '//../../.config/blender/4.2/scripts/addons/botaniq_full/blends_280/grass/Grass_Tall_A_spring-summer.blend', parent '<direct>'  # Info
Read library:  '/home/steve/.config/blender/4.2/scripts/addons/botaniq_full/blends_280/Library_Botaniq_Materials.blend', '//../../.config/blender/4.2/scripts/addons/botaniq_full/blends_280/Library_Botaniq_Materials.blend', parent '/home/steve/.config/blender/4.2/scripts/addons/botaniq_full/blends_280/grass/Grass_Tall_A_spring-summer.blend'  # Info
Read library:  '/home/steve/.config/blender/4.2/scripts/addons/botaniq_full/blends_280/weed/Weed_Dandellion_A_spring-summer.blend', '//../../.config/blender/4.2/scripts/addons/botaniq_full/blends_280/weed/Weed_Dandellion_A_spring-summer.blend', parent '<direct>'  # Info

# backtrace
blender(+0x207290b) [0x616a9198790b]
blender(+0x15fe239) [0x616a90f13239]
/usr/lib/libc.so.6(+0x3cae0) [0x7a4e48f65ae0]
/usr/lib/liboslexec.so.1.13(_ZN9OSL_v1_133pvt9LLVM_Util13call_functionEPN4llvm5ValueEN16OpenImageIO_v2_54spanIKS4_Lln1EEE+0x41) [0x7a4e5339b441]
/usr/lib/liboslexec.so.1.13(+0x1490be) [0x7a4e533490be]
/usr/lib/liboslexec.so.1.13(+0x15505f) [0x7a4e5335505f]
/usr/lib/liboslexec.so.1.13(+0x22e69a) [0x7a4e5342e69a]
/usr/lib/liboslexec.so.1.13(+0x15d58a) [0x7a4e5335d58a]
/usr/lib/liboslexec.so.1.13(+0x22e69a) [0x7a4e5342e69a]
/usr/lib/liboslexec.so.1.13(+0x17f369) [0x7a4e5337f369]
/usr/lib/liboslexec.so.1.13(+0x18a476) [0x7a4e5338a476]
/usr/lib/liboslexec.so.1.13(+0x6ae3b) [0x7a4e5326ae3b]
/usr/lib/liboslexec.so.1.13(+0x6d05c) [0x7a4e5326d05c]
/usr/lib/libstdc++.so.6(+0xe0c84) [0x7a4e492e0c84]
/usr/lib/libc.so.6(+0x92ded) [0x7a4e48fbbded]
/usr/lib/libc.so.6(+0x1160dc) [0x7a4e4903f0dc]

# Python backtrace

Even with 550 drivers.

I can’t share a scene: it uses Botaniq, so it contains paid-for assets.

I tried re-installing CUDA, but no change.

For now, I’ve pinned nvidia, nvidia-utils (and, not that it matters, lib32-nvidia-utils) at 550, but any ideas?
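In case it helps anyone else hitting this: a minimal sketch of how the pinning can be done on Arch, assuming the standard pacman route (package names as they appear in the official repos):

```ini
# /etc/pacman.conf — hold the driver packages at 550 until this is fixed.
# pacman -Syu will skip anything listed here and print a warning instead.
IgnorePkg = nvidia nvidia-utils lib32-nvidia-utils
```

Remove the line (or the package names) once a fixed driver lands, otherwise the kernel and driver can eventually drift out of sync on a rolling release.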

The reason I’m thinking it’s a Blender fault is that everything else I’ve tested works just fine: Cyberpunk 2077 and the three most recent Unigine benchmarks (Heaven, Valley and Superposition).

It must be something to do with Linux, as I run the 556.12 driver and it’s just fine in Windows…

Given the eerie silence both here and on the Arch forums, I suspect it’s a “me” problem somewhere.

When was the last time you didn’t have this problem, and what has changed since?

Honestly, it’s only with the 555 drivers; the 550 drivers work just fine. There has been no update to CUDA, so it must be something in the driver.

At least with Blender 4.2. I don’t often use the stable version, so I can’t say when that broke, but it’s a different error.

I assume you have been to the NVidia boards already… but it might be a good idea to post it there if you haven’t.

Running the 555.99 Studio drivers here, but under Windows. Still, for what it’s worth, I’ve not had any issues.