Memory or GPU Upgrade & LuxRender Problems - 2-in-1 Post


Can I have your thoughts on the following please.

  1. My PC is a Dell Optiplex 780 with an Intel Core 2 vPro, 8 gig of memory and an Nvidia GeForce 9300 GE video card. I’ve got about £200 to spend, and I was considering either buying a GeForce 560 Ti (384 CUDA cores) and leaving the machine memory at 8 gig, or buying an Nvidia GeForce 550 (192 CUDA cores) and upgrading the machine memory to 16 gig. Any thoughts?

  2. I occasionally have access to a number of machines, between 5 and 20, which are pretty big - 128 gig of memory and 24 cores each (4 × Intel Xeon 5650) - and I’ve tried running LuxRender on them in a distributed fashion using luxconsole and pylux in Blender. Although it works, I find that quite often the luxconsole process terminates with a segmentation fault and I have to restart it. When I do have access to the machines it is only during the night, and I don’t want to sit up all night. Any help on this?


Unless I am mistaken, doesn’t a Core 2 motherboard support only DDR2 RAM? That kind of RAM is pretty hard to find, as most shops stock DDR3, and those that stock DDR2 charge a pretty penny for it. If you upgrade to a new system in a year’s time or so you won’t be able to carry your RAM over; your video card you will, but with DDR2 RAM you will be stuck.

The memory in the machine at the moment is “DDR3 PC3-10600” but good point about being able to move things over when upgrading.

G41, G43, nForce 790i Ultra SLI, P41, P43, P45, X38 and X48 chipset boards have a DDR3 memory controller; some even have a combo and allow 2× DDR3 and 2× DDR2 slots.
But they are rather exotic.

8GB should be plenty unless you plan to do ridiculously big renders and/or simulations.

Best to spend it on a graphics card, but unless you need CUDA I’d wait for the Radeon 7000 series to launch, which is supposedly this month, and see how the market changes, especially the prices.
To me the 7000 series looks promising, and if it’s too expensive for you, I am certain the 6900 series will drop in price, which is a good choice for Blender’s viewport performance.
If you need CUDA, there’s not much choice though :wink:
But then the 560 Ti offers the best price/performance and performance/watt for CUDA.

Holy shit, nice access there. Have you tried rendering with Mitsuba? IMO it’s the open-source renderer with the best scalability. I wish Blender’s Cycles renderer adapted to network rendering as nicely - showing available cores and interactive preview updates - as it does now on a single machine.

I would go with the better graphics card. You’ll need it for preview renders with Cycles all the time, and the GUI will probably be faster when sculpting etc.

RAM is needed only when baking, or when loading frames/movies into RAM for editing, 2D/3D tracking or compositing.

I guess the only question to ask now is: are you running out of memory often, or do you foresee yourself running out of memory before you have to make another upgrade?

@Arexma thanks, I had never heard of those boards.

aermartin - I’ll try Mitsuba and let you know how I get on. From what I’ve just read it looks good, and the Blender plugin is always helpful.

arexma - In the Cycles experimental render settings there is support for CUDA but not for ATI Stream, and with OpenCL being a generic framework over both of them, I figured that with Nvidia I’d get support for both OpenCL and CUDA. Other than that I don’t really know what the difference in performance is between Nvidia and AMD; everything I’ve read seems to suggest it is personal preference?

tyrant monkey - I don’t think I’m running out of memory, but the GUI slows when running at high res when sculpting. I’m working on some models/characters for a short film I’m producing, and I figured that when I get all the models and sets together for a scene, even with texture baking, I’m going to have a lot of vertices, and this might result in the same slowdown. I’m not sure whether the memory or the GPU will solve this best?

Nvidia: CUDA support; deliberately crippled OpenGL/CUDA performance to sell Quadro/Tesla cards (still fast for CUDA, not so in OpenGL); Stereo3D support; awesome, although proprietary, Linux drivers.
In particular glReadPixels() and backface shading for double-sided triangles are extremely slow on GF400+ cards - multiple times slower than on GF200-series cards. Nvidia still says it’s for the best, else customers would have to pay too much because the cards would be too good, so just lock out some features :smiley:
And no, a Quadro does not give you any real benefits in Blender (yet), and it has crippled CUDA performance as well, so you need a Tesla too :slight_smile:

AMD: no CUDA; occasional driver issues, though I haven’t heard of any lately; some cards have slow gl_select (being bypassed by Psy-Fi’s patch); cranky Linux drivers; very good OpenGL performance in Blender; Eyefinity.

PS: I have 8 GeForce cards, 1 Quadro and 1 Radeon here, and 1 Quadro in the studio.

If you need to know more, it’s a good bet you’ll find all you need if you use the forum search with “arexma nvidia” or “arexma cuda”… Well, just arexma in combination with anything graphics-card related… I have already written a book in this forum about those topics. True story! :smiley:

That explains a lot. I was wondering why a Quadro with a seemingly lesser spec was selling for more than what appeared to be an equivalent GeForce, but I guess the crippling is the cause.

Anyway thanks for the info. I’ll check out your other posts.


This is the first time in a long time I’ve heard about anybody having issues with LuxRender’s networking. If you’d be willing to describe your problem in more detail on the LuxRender forum it would be much appreciated and you’d most likely get better help. In particular the render settings and the exact version of LuxRender would be of interest. In my own experience network rendering has been very reliable lately; it should be possible to get it running nicely on your systems as well.

As to the other question, I’ve had so many driver crashes with a GeForce 580 lately (on two systems, even) that I’d be seriously looking at ATI. And I’d use some of the money for memory, but that all depends on the way you’re using your computer.

Yeah, that’s what I thought. I figured other people must use distributed LuxRender, but when I googled the seg-fault issue I did find other people with similar problems (although I couldn’t say offhand how old the posts were), and there seemed to be a number of solutions, all involving scripts to restart the process automatically, which seemed to me like a bit of a hack. I’ll pop over to the LuxRender forum and post the details there. Thanks.
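For what it’s worth, the restart-script hack people suggest is only a few lines. Here’s a minimal sketch in Python - the luxconsole invocation at the bottom is hypothetical (check the actual flags for your version), and the retry limits are arbitrary:

```python
import subprocess
import time

def run_with_restarts(cmd, max_restarts=100, delay=5):
    """Re-launch cmd whenever it dies with a non-zero exit code
    (e.g. a luxconsole segfault). Returns the number of restarts used."""
    restarts = 0
    while True:
        result = subprocess.run(cmd)
        if result.returncode == 0:
            return restarts          # clean exit, we're done
        restarts += 1
        if restarts > max_restarts:
            return restarts          # give up after too many crashes
        time.sleep(delay)            # brief pause before relaunching

# Hypothetical invocation - adjust the executable and arguments for your setup:
# run_with_restarts(["luxconsole", "scene.lxs"], delay=30)
```

Run something like this on each render node before going to bed and the process should come back by itself after a crash, instead of sitting dead until morning.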

As far as going from a 9300 to the 560, you had better take a look at your power supply. I believe the 9300 will run on a 400 W power supply, but the 460/560 need a bit more power. So out of that £200 you may want to consider an upgrade to a 600 W power supply, to make sure you don’t inherit crashing issues because you don’t actually have enough power to run the hardware. I don’t know the spec on the Optiplex, but if it already has a decent-sized power supply you may be good to go.
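As a rough illustration of the headroom maths - the wattages below are my own ballpark assumptions, not measured figures for these exact parts:

```python
# Back-of-the-envelope PSU check. The component draws are assumed
# ballpark values, not specs for the actual hardware in question.
def psu_headroom(psu_watts, draws):
    """Watts left over after summing the estimated component draws."""
    return psu_watts - sum(draws.values())

estimated_draw = {
    "GeForce 560 Ti (approx. board power)": 170,
    "CPU (approx. TDP)": 95,
    "motherboard/RAM/drives/fans": 100,
}

print(psu_headroom(400, estimated_draw))   # a 400 W supply leaves very little margin
print(psu_headroom(600, estimated_draw))   # a 600 W supply leaves comfortable headroom
```

The exact numbers don’t matter much; the point is that a stock OEM supply sized for an integrated-class card can be nearly maxed out once you add a mid-range GPU.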

Good point - I had wondered about that, but it’s all irrelevant now anyway, because the machine I was going to stick this in is my machine at work, and when I crawled under the desk and took a look at it I found it is an SFF machine, so I’m not going to get a decent card in there, am I - sheesh! Anyway, I’m thinking of biting the bullet, spending another few hundred and getting a new machine. So I’m now considering the best approach: whether to buy ready-made or build it myself, reusing and scrounging peripheral parts so I can spend the money on raw power! This is the shopping list so far:

Intel Core i7-2600 (Sandy Bridge)
16 gig of memory
Asus P8Z68-V Pro (socket 1155) motherboard

and an Nvidia 560 Ti, or the Radeon when it comes out?

What do you think? Not sure I need the £140 ASUS motherboard, or whether something less full-on would do?

I want something that is specifically for Blender-type work, so a good GPU and plenty of memory are what I see as essentials, but it’s been many years since I last built a machine from scratch, so a) I don’t know if it’s worth it any more, and b) I don’t know if the bits that glue it all together (motherboard, case, disk or SSD) have stayed the same, or whether there are other things to worry about, like the PSU rating.

Hey ho, just another adventure…