Virtual memory

What would happen if you use too much virtual memory?

Virtual memory is slow, since it is written to / read from the hard disk… all that will happen is your computer will go. extra. slow. Like, I am talking 30+ seconds to register a mouse click.

You get a low-memory error message. From there you should save your work, increase your virtual memory, and restart.

Every now and then, if you’re doing a lot of intense modeling or what-not, save the file, then quit Blender, reload, and proceed.

Blender’s memory-management structure by and large keeps objects around even with a user count of zero. This can cause memory to build up, which the quit-and-reload trick above trivially solves.
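For what it’s worth, you can also do that purge from Blender’s Python console without quitting — a minimal sketch, assuming a reasonably recent bpy API (newer builds also offer a one-shot purge operator, but the loop makes the idea plain):

```python
# Free data-blocks whose user count has dropped to zero, instead of
# the save/quit/reload dance. Run from Blender's Python console.
import bpy

# Collections that commonly pile up; extend the tuple to taste.
for collection in (bpy.data.meshes, bpy.data.materials,
                   bpy.data.textures, bpy.data.images):
    for block in list(collection):    # copy first; we mutate it below
        if block.users == 0:
            collection.remove(block)
```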

Also, watch out that you don’t have a bunch of repetitive things in memory … like duplicated materials and so forth. We just saw another thread where an import script produced thousands of dupes. If you’re coloring many things identically, they probably should all link to the same material (one way to merge such dupes is sketched below). And so on.
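Here’s that sketch — it assumes the dupes follow Blender’s usual `.001`-style name-suffix convention for copies, so treat it as a starting point rather than a general deduplicator:

```python
# Relink "Material.001"-style duplicates back to the base material,
# so identically-colored objects share one data-block.
import bpy

for obj in bpy.data.objects:
    for slot in obj.material_slots:
        mat = slot.material
        if mat is None:
            continue
        base_name, _, suffix = mat.name.rpartition('.')
        # Only treat it as a dupe if the suffix is numeric, e.g. ".001"
        if suffix.isdigit() and base_name in bpy.data.materials:
            slot.material = bpy.data.materials[base_name]
```

The abandoned duplicates then drop to zero users and can be purged with the loop above.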

That it will be abused.

Just a ten-second FYI note on what “virtual memory” actually is …

A long time ago, a PhD candidate noticed that, even though a computer program needs “so much memory” in order to operate, it doesn’t reference every byte of it all the time. In fact, there is a clearly exploitable pattern called locality of reference: the memory accesses are, at any particular moment in time, “tightly clustered,” although the “clusters” move around constantly. Therefore (and this is what would have gotten you a PhD in those days), a program only needs “truly instantaneous” access to its current set of clusters … its so-called working set. The rest of the memory space must be accessible on demand, but the program can tolerate a slight delay.
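Here’s a toy illustration of that clustering in plain Python — the same total number of reads either way, but the second walk jumps a page-sized stride on every access, defeating locality. Treat the numbers as suggestive only; CPython’s interpreter overhead blunts an effect that is dramatic in C:

```python
# Same work, different access pattern: sequential reads stay inside
# cached/resident pages; page-sized strides touch a new page every read.
import time

data = bytearray(16 * 1024 * 1024)   # 16 MB buffer
PAGE = 4096                          # a typical page size

def touch(step):
    total = 0
    start = time.perf_counter()
    for offset in range(step):                    # `step` interleaved passes...
        for i in range(offset, len(data), step):
            total += data[i]                      # ...each byte read exactly once
    return time.perf_counter() - start

print("sequential:  ", touch(1))
print("page-strided:", touch(PAGE))
```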

The system keeps the “working set” constantly available, and fetches any not-recently-used information transparently, on demand. It “steals” not-recently-used pages when necessary.
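You can watch that fetch-on-demand behavior directly with a memory-mapped file — a sketch, where render_cache.bin is a hypothetical stand-in for any large existing file:

```python
# mmap is "virtual memory by hand": the whole file becomes addressable
# immediately, but no page is read from disk until it is touched.
import mmap

BIG_FILE = "render_cache.bin"   # hypothetical placeholder path

with open(BIG_FILE, "rb") as f:
    mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    first = mm[0]                # page fault: the OS fetches page 0 now
    middle = mm[len(mm) // 2]    # another fault, another single page
    # Untouched pages never cost a disk read, and under memory pressure
    # the OS can quietly "steal" resident pages back.
    mm.close()
```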

The “virtual memory size” of a program is the total amount of memory that it has access to (and has ever referenced). The “resident size” is the size of its working set … the currently and very recently active (so-called) “pages.” Its “footprint.”
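On Linux you can read both numbers for the current process straight out of /proc (other systems expose the same idea through different tools — Task Manager, Activity Monitor, and so on):

```python
# VmSize = total virtual size; VmRSS = resident set, the "footprint".
def vm_sizes():
    sizes = {}
    with open("/proc/self/status") as f:
        for line in f:
            if line.startswith(("VmSize:", "VmRSS:")):
                key, value = line.split(":", 1)
                sizes[key] = value.strip()   # e.g. "123456 kB"
    return sizes

print(vm_sizes())   # {'VmSize': '... kB', 'VmRSS': '... kB'}
```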

Now, here’s the CG problem: these programs really do “constantly reference” a lot of memory, especially when rendering. Their working-set size, their “footprint,” is the Ten-Million-Pound Elephant. Unlike most programs, they are sucking up the CPU and doing comparatively little I/O … exactly the opposite of the usual behavior of a computer program, and quite contrary to what most operating systems are “tuned” for. That’s why, when setting up a CG machine, you need RAM, above all else.

What you want to see in a CG system (when rendering) is the following; a quick script for eyeballing these numbers is sketched after the list:

  • One hundred percent CPU utilization, or very nearly so, all the time.
  • Virtual-memory size nearly equal to resident size for this application … in other words, it’s getting instantaneous access to all the memory that it needs, whether or not it’s consuming all the RAM in the machine.
  • Paging rate (the number of I/Os occurring to fetch pages on demand) near zero.
  • Physical infrastructure that can support that constant activity without hardware-imposed delays, and without melting.
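Here’s the promised sketch for eyeballing those numbers during a render. It assumes the third-party psutil package (pip install psutil); the swap counters are cumulative and only meaningful on some platforms:

```python
# Print CPU load, virtual vs. resident size, and swap traffic.
import psutil

proc = psutil.Process()                  # or psutil.Process(<renderer pid>)

cpu = psutil.cpu_percent(interval=1.0)   # whole-machine CPU over a 1 s sample
mem = proc.memory_info()                 # .vms = virtual size, .rss = resident
swap = psutil.swap_memory()              # .sin/.sout = bytes swapped in/out

print(f"CPU: {cpu:.0f}%")
print(f"virtual: {mem.vms / 2**20:.0f} MB   resident: {mem.rss / 2**20:.0f} MB")
print(f"swapped in: {swap.sin / 2**20:.0f} MB   out: {swap.sout / 2**20:.0f} MB")
```

If the virtual and resident figures track each other and the swap numbers stay flat while the CPU is pegged, the machine is healthy for rendering.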

I don’t know that much, but I think virtual memory combines your computer’s RAM with temporary space on your hard disk. When RAM runs low, virtual memory moves data from RAM to a space called a paging file. Moving data to and from the paging file frees up RAM so your computer can complete its work.