NVIDIA GTX 1050 correct installation under Linux Mint 18.2

I know this post is not strictly related to Blender, but I can’t find a way to solve this issue…

I’ve installed Linux Mint 18.2 on a new machine with an NVIDIA GTX 1050. On first boot the GTX 1050 didn’t work properly, even after changing drivers: it failed to recognize the (HP) monitor and kept the resolution at 1024x768. I added the NVIDIA drivers PPA and installed driver 387.12. After the installation I got an error message, but after rebooting the system everything seemed to work.
But when working in Blender with GPU rendering, I cannot render scenes with a high polycount or with big textures (it says “CUDA out of memory”).

nvidia-smi

This is what I get:

Sun Dec 31 10:26:45 2017       
| NVIDIA-SMI 387.12                 Driver Version: 387.12                    |
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|   0  GeForce GTX 105...  Off  | 00000000:22:00.0  On |                  N/A |
| 45%   21C    P5   ERR! /  75W |    209MiB /  4035MiB |      0%      Default |
| Processes:                                                       GPU Memory |
|  GPU       PID   Type   Process name                             Usage      |
|    0      1425      G   /usr/lib/xorg/Xorg                           130MiB |
|    0      3455      G   ...-token=A034FA1325227D2EB85498ED4077F77A    77MiB |

I’m too much of a noob to solve these issues… what can I do?
P.S.: Sorry for my bad English

I’m pretty sure that’s a GTX 1050 Ti, which is relevant in terms of the available VRAM on the card. The regular 1050 only has 2 GB VRAM, while the 1050 Ti ships with 4 GB VRAM.

Having said that, even 4 GB of VRAM is IMHO more or less the bare minimum these days. If the card runs out of VRAM, there are only three options: (1) optimize your scene so that it fits into VRAM (e.g. by using render layers + compositing), (2) buy a better card, or (3) render on the CPU.
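To see why large textures alone can exhaust VRAM, here is a rough back-of-the-envelope sketch (the size formula is standard; the sample dimensions are just illustrative):

```python
def texture_mib(width, height, channels=4, bytes_per_channel=1):
    """Approximate in-memory size of an uncompressed texture in MiB."""
    return width * height * channels * bytes_per_channel / (1024 * 1024)

# A single 8-bit RGBA 4K texture:
print(round(texture_mib(4096, 4096)))  # 64 (MiB)
# The same texture stored as 32-bit float quadruples that:
print(round(texture_mib(4096, 4096, bytes_per_channel=4)))  # 256 (MiB)
```

A handful of float 4K textures plus geometry, BVH, and render buffers can already come close to a 4 GB budget.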

That’s true, but nvidia-smi reports 4GB. So ‘waaf’ must have a 1050 Ti.

If you are using Denoising in 2.79, make sure you are not using very large tile sizes. Use, for example, X=480, Y=270.
I don’t think we can say much without knowing more about the kind of scenes you’re working with.
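That tile-size setting can also be applied from Blender’s Python console; a minimal sketch, assuming the 2.79 bpy API (this only runs inside Blender, not in a standalone interpreter):

```python
import bpy

scene = bpy.context.scene
scene.render.tile_x = 480  # tile width in pixels
scene.render.tile_y = 270  # tile height in pixels
# Smaller tiles mean smaller per-tile denoising buffers on the GPU.
```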
You can monitor VRAM use in real time with:

watch -n 1 nvidia-smi
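If you would rather log the numbers than eyeball them, here is a small sketch that pulls the memory column out of nvidia-smi’s plain-text output (the sample line is copied from the output above; the fixed-layout parsing is an assumption):

```python
import re

def parse_memory(line):
    """Return (used_mib, total_mib) from an nvidia-smi status line."""
    m = re.search(r"(\d+)MiB\s*/\s*(\d+)MiB", line)
    if m is None:
        raise ValueError("no memory column found")
    return int(m.group(1)), int(m.group(2))

sample = "| 45%   21C    P5   ERR! /  75W |    209MiB /  4035MiB |      0%      Default |"
used, total = parse_memory(sample)
print(used, total, f"{100 * used / total:.0f}%")  # 209 4035 5%
```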

Could you run ‘nvidia-smi’ again and expand the size of the terminal so that all the text appears?

Sorry!!! Yes, it is a 1050Ti!!!

@Yafu: how can I expand the size to show everything? If I expand it, the text gets bigger but doesn’t show anything more.

I run a pair of 1050 Ti cards on a Debian amd64 machine and I’d be happy to be a data point if needed. I’ve not run into the problems you are seeing, but I’d be curious about your power supply, since your nvidia-smi output shows an error in the power-usage column. My system uses a 750W PSU to drive the 2 cards.

The problem with pasting the nvidia-smi output is that there doesn’t seem to be a way of selecting a fixed-width font in these forums. (If there is please enlighten me.)

I may not be processing scenes as complex as yours but I’d be happy to run a sample of yours to see what happens. Here is my nvidia-smi output to compare (note that my driver is 375.82):

Mon Jan  1 22:39:45 2018
| NVIDIA-SMI 375.82                 Driver Version: 375.82                    |
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|   0  GeForce GTX 105...  Off  | 0000:01:00.0      On |                  N/A |
| 30%   44C    P0    36W /  75W |    272MiB /  4038MiB |      0%      Default |
|   1  GeForce GTX 105...  Off  | 0000:02:00.0     Off |                  N/A |
| 30%   39C    P0    36W /  75W |      4MiB /  4038MiB |      0%      Default |

| Processes:                                                       GPU Memory |
|  GPU       PID   Type   Process name                             Usage      |
|    0      1100      G   /usr/lib/xorg/Xorg                            19MiB |
|    0      1235      G   /usr/lib/xorg/Xorg                           109MiB |
|    0      1468      G   ...el-token=8956EBCE147BDD572D35B4EF9A14085B  62MiB |
|    0     18722      G   /home/glenl/devo/blender-2.79/blender         59MiB |

Thank you G60!
My power supply should be a Cooler Master 600W 80+
I’d be glad to send you my file… but it’s 400 MB (200 MB zipped).

I’m not sure you got my PM. I suggested that you put the zipped blend file somewhere and then PM me with a link.

BTW, I think your 600W PSU should be sufficient, provided your system isn’t loaded with too many other devices. Before I got my second card I was running with a 400W PSU, but I tend to run my machine pretty lean, with a separate file/device server.

Got the blend file.

The good news is that I don’t think there is anything wrong with your card: mine failed as well. Watching memory, it died after showing 3.8 GB and stepping into 4 GB. Boom. It gave it a proper try.

I would say that some models are simply going to have polycounts beyond what a 4 GB card can hold. My card recovers, however, and I am able to continue processing other models. Your card and driver should at least manage that.

Thanks g60! Mine crashes too at 3.8 GB. So my hardware and drivers should be fine… I just need to buy a better card :slight_smile:
Again, thank you very much…

Hi. Try a Blender 2.79 nightly build; it is portable for Linux too (download the first files with “2.79” in the name).

Support for using system RAM with CUDA cards was recently included. So if you have enough RAM in the system, you may be able to avoid “CUDA out of memory” errors with your scene.

Very cool! Here are my results:

CPU-only render: 5:21
non-display 1050Ti + CPU: 1:37

Both of these peaked at 4377.87 MB of memory. (We were so close.)

Important: you can only use one card with this setup because it will attempt to use half of system memory. I seem to remember the device list telling me which card was assigned to the display, but I didn’t see that in the list. I think I guessed right.