Hardware specs?

Having a little difficulty with Blender… It keeps running out of memory.

What sort of specs should I be running as a bare minimum?

I have an XP box running a 2.5GHz dual-core CPU, 2GB RAM, and a 3D card with 1GB of VRAM. It just doesn’t seem to be powerful enough; I can happily create complex meshes all day long, but start sculpting said meshes and down it goes… I have the option of upping the RAM to 4GB without a new motherboard.

I’m getting reasonably good at this modelling malarkey, and I’d like to get good enough to produce a decent portfolio and maybe start a little freelance work, but if the PC I’m using isn’t up to scratch I’ll need to invest in new hardware.

I’m happy to run a dual-boot OS and have Ubuntu sat in reserve in case this could solve the issues I’m having (not that I’m under any illusions here), or at least help.

Can anyone suggest the best route for me… will Linux free up enough resources to run Blender smoothly?

Quick cause of crash: Multires level 6 Catmull-Clark (obviously dependent on mesh complexity; this level was reached while sculpting a realistic head).
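For a rough sense of why level 6 hurts (the numbers below are illustrative guesses, not measured Blender figures):

```python
# Catmull-Clark subdivision quadruples the face count at every
# multires level, so memory use explodes geometrically.
base_faces = 10_000               # assumed base head mesh (a guess)
level = 6
faces = base_faces * 4**level     # ~41 million faces at level 6
bytes_per_vert = 100              # rough guess: coords, normals, sculpt data
print(f"{faces:,} faces, ~{faces * bytes_per_vert / 2**30:.1f} GiB")
# -> 40,960,000 faces, ~3.8 GiB: well past any 32-bit process limit
```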

In 32-bit XP, each process gets about 2GB of usable address space, and in practice Blender tends to fall over around 1.5GB, which means that if your Blender scene needs more than that, Blender will crash.
In Ubuntu you should be able to handle up to 3GB, similar to a Large Address Aware (LAA) enabled build on Windows.
For more than that, you will need a 64-bit operating system and a 64-bit Blender.
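The ceilings in question, as a quick sketch (the ~1.5GB figure is a practical limit rather than an architectural one):

```python
# Rough per-process address-space ceilings behind the advice above.
GiB = 2**30
ceilings = {
    "32-bit Windows, default user address space": 2 * GiB,
    "32-bit build with LAA (or 32-bit Linux)": 3 * GiB,
    "64-bit build (current 48-bit AMD64 hardware)": 2**48,
}
for name, limit in ceilings.items():
    print(f"{name}: {limit / GiB:,.0f} GiB")
```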

A RAM upgrade would probably be the most useful on your current rig, but what good is it if you’re running a 32-bit OS?

Cheers, exactly what I needed to know…

It may be of interest to know that I solved the problem by installing Ubuntu 10.10 64-bit… and adding a GB of RAM.

Well done!
Actually, 64-bit is rather becoming the standard… ’til it all jumps to 128-bit!
:slight_smile:

The AMD64 architecture uses 48 bits to address memory, so the new limit is 256TB. I doubt that this limit will be a problem anytime soon (or ever) for our purposes.
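The arithmetic, if you want to check it:

```python
# 48 address bits -> 2^48 bytes of addressable memory.
address_bits = 48
print(f"{2**address_bits / 2**40:.0f} TiB")  # -> 256 TiB
```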

While it is widely disputed, and Bill Gates says he never made the remark, the popular rumor is that he once said ‘640K of memory is all that anybody with a computer would ever need’.

If they ever break through the current problem of not being able to get processors to work faster than the present 3–4GHz limit and manage to push them into the 50–100GHz range, 256TB might not be enough…

just sayin…
Randy

I knew those quotes, and I have to admit that it is risky to predict future demands. But some things are somewhat foreseeable; for example, more than 150dpi for a screen won’t be necessary unless evolution drives human eyes to extreme performance.

256TB of RAM could hold ~5.8 hours of 3D stereoscopic video at 4K resolution, 4 channels, 32-bit depth per channel, at 50fps, with 8-channel 32-bit 192kHz sound. All uncompressed. Beat me if I miscalculated something :slight_smile:
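Here’s my arithmetic, assuming “4K” means 3840×2160 and TB means tebibytes (it comes out at ~5.9 hours, so call the figure above a truncation):

```python
# Uncompressed data rate for stereoscopic 4K video plus audio.
pixels = 3840 * 2160                  # one "4K" (UHD) frame, assumed
video = pixels * 4 * 4 * 2 * 50       # 4 channels * 4 bytes * 2 eyes * 50 fps
audio = 8 * 4 * 192_000               # 8 channels * 32-bit * 192 kHz
capacity = 256 * 2**40                # 256 TiB
print(f"{capacity / (video + audio) / 3600:.1f} hours")  # -> 5.9
```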

And that’s just RAM! SSDs will also develop a lot in the coming years.

And now to repeat myself (notice the MAYBE I added for you :slight_smile: ):
I doubt that this limit will be a problem anytime soon (or MAYBE ever) for our purposes.

Have you heard about nanotube transistors? They are right now roughly the same speed as our current high-end silicon transistors, but this guy says their theoretical limit should be about a terahertz.

If I have understood correctly, the 64-bit architecture is able to use 64-bit memory addresses, although the current generations only use 48 bits. This means you could have 18 million terabytes of RAM before an architecture change is needed. That is a billion times more than our current high-end computers, and before we can hit that limit, we need to come up with something that can actually fill that amount of space. (I guess you could take about 20 million average computers and load all their hard drives into your RAM.)
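Those numbers hold up, assuming decimal terabytes and taking a 16GB machine as “current high-end” (both assumptions mine):

```python
# Full 64-bit address space, in decimal terabytes and in
# multiples of an assumed 16GB high-end machine.
TB = 10**12                          # decimal terabytes
print(2**64 / TB / 1e6)              # -> ~18.4 million TB
print(2**64 / (16 * 2**30) / 1e9)    # -> ~1.07 billion times 16GB
```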

Yeah, 640K ought to be enough for anybody.

“I think there is a world market for about five computers” - Thomas J. Watson, 1943

dude, read then post!

Yeah, those were the days when everybody still thought along the Multivac idiom, you know, the one where one big-ass analog computer served cities full of people.

I believe he never actually said that; at least, his biography on Wikipedia says so. But even if he didn’t, that statement only looks ridiculous to us because we have the gift of hindsight.

In 1943, computers were large, government-funded, filled entire warehouses, and needed a team of people to run. So in 1943, would that alleged statement have been ridiculous to make, or something fairly reasonable to say?

It’s a joke, get a grip.

Yes, that’s the point - it sounds pretty reasonable, if not optimistic, for 1943. Makes you think about what the next paradigm shift in computing might bring. Jorzi has already mentioned carbon nanotube transistors; there are also memristors and the various other candidates for next-generation computing technology.

yeah, right.

…Also, some engineers in the ’90s were sure that 200MHz was the highest CPU speed one could ever reach… :yes:

> While it is widely disputed, and Bill Gates says he never made the remark, the popular rumor is that he once said ‘640K of memory is all that anybody with a computer would ever need’.

He did say that. I saw it in a video documentary about him, and I truly don’t remember which one, sorry, “but he did say that”.
I need a 512-bit 554345532THz computer with unlimited RAM to speed up Blender’s rendering engine.

Heh, I just want to use the super-density that results from a singularity as a computer. I mean, imagine the power of infinity! :slight_smile:

The problem is that it’s physically impossible to get any data out from behind the event horizon that surrounds the singularity :stuck_out_tongue:

In fact I would NOT want a singularity inside my computer. They just mess things up and then they explode.