Memory Issues

Image Memory Configurations

This page is incomplete, but the information here should still be useful.

There are several memory options that you can configure when starting an image. Note that these are fixed for the lifetime of the session and cannot be changed once the image has started.

The following formula helps you calculate the total VM Size (as reported by the Windows Task Manager) that the session will use. Note that it assumes you are using the 'M' (megabyte) form of the settings; see the Smallworld help for other possible variations of memory configuration. Also remember that on 32-bit systems you have a maximum of 4GB of total VM Size available for all programs running*!

(Mnew * 2) + (Mold * 2) + Mpage + image size

Example:
-Mnew 384M -Mold 128M -Mpage 256M = (384 * 2) + (128 * 2) + 256 = 1,280MB + image size.
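The arithmetic above can be checked with a short script (a sketch only; the function name and parameters are illustrative, not part of any Smallworld API):

```python
def session_vm_size_mb(mnew, mold, mpage):
    """Estimated VM Size in MB of a Smallworld session, excluding the image
    itself, using the rule of thumb (Mnew * 2) + (Mold * 2) + Mpage."""
    return (mnew * 2) + (mold * 2) + mpage

# -Mnew 384M -Mold 128M -Mpage 256M
print(session_vm_size_mb(384, 128, 256))  # 1280 (plus image size)
```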

*There is more information at this blog about what VM Size is actually reporting.

Sectors and extension memory

I ran into this and wanted to post it for the next person trying to figure out why their extension file is so large…

I was reading a large amount of data into memory, specifically a large SHP file that contained 27,000 features.

I kept running into yellow, and then red, extension memory errors. I traced it down to the sectors I was reading: I was capping the number of coordinates per sector at 500. That, it turns out, was not a good idea… Let's see why.

If you read the documentation about how Smallworld handles memory, there are four types: new space, old space, extension space, and the original image. The documentation states (Section 3.2, Creating, promoting and updating objects, of the Session Management: Memory management overview) that any object larger than 4KB (4096 bytes) is automatically treated as a large object and thus automatically put into extension memory.

Now, if you go over 255 coordinates in a sector, the memory used by its vector slot (a coords_vector) exceeds 4KB. The sector is then automatically considered a large object and, blammo, it lands in extension memory. For sector_z, you need to cap at 170 coordinates to stay under 4KB.
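A rough back-of-the-envelope check shows where those two limits come from, assuming each ordinate (x, y, or z) is an 8-byte double and ignoring the vector's own header overhead (both assumptions are mine, not from the Smallworld docs):

```python
LARGE_OBJECT_THRESHOLD = 4096  # bytes; objects at or above this go to extension memory
DOUBLE_BYTES = 8               # assumed size of one ordinate

def coords_vector_bytes(n_coords, dims):
    """Approximate payload size of a coords_vector holding n_coords points."""
    return n_coords * dims * DOUBLE_BYTES

print(coords_vector_bytes(255, 2))  # 4080 -> still under 4096
print(coords_vector_bytes(256, 2))  # 4096 -> hits the large-object threshold
print(coords_vector_bytes(170, 3))  # 4080 -> under for sector_z
print(coords_vector_bytes(171, 3))  # 4104 -> over
```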

So if you are running into extension memory problems, dig deep into the belly of your code and see if you are creating vectors that are larger than 4KB.

One additional thing I found: you should definitely use sector.new_for(max_size). This ensures the automatic space allocation for the underlying coords_vector doesn't produce a coords_vector with some oddball large size (like 16776 or something).
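To illustrate the capping idea, here is a sketch in Python (not Magik) of splitting a long coordinate list into sector-sized chunks; the 170 cap is the sector_z limit discussed above, and in Magik each chunk would then become a sector sized up front with sector.new_for(max_size):

```python
MAX_COORDS_PER_SECTOR = 170  # stays under the 4KB large-object threshold for 3D sectors

def split_into_sectors(coords, max_size=MAX_COORDS_PER_SECTOR):
    """Split a coordinate list into chunks of at most max_size points,
    so no single sector's coords_vector grows past the 4KB threshold."""
    return [coords[i:i + max_size] for i in range(0, len(coords), max_size)]

chunks = split_into_sectors(list(range(500)))
print([len(c) for c in chunks])  # [170, 170, 160]
```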

Some methods I used to track this down were:

system.report_session_objects(), which reports which objects are large and gives you a clue about what to look for. And

system.sys!object_info(), or, if you can get it from GE, object.object_size(). These give the size of an object, which I used to determine whether it was under or over the 4KB mark.

By no means is this the be-all and end-all of memory issues; these are just some thoughts I hope may be useful to others.

Enjoy,
Mark

Field Consulting and Services, Inc.
www.field-csi.com

Unless otherwise stated, the content of this page is licensed under Creative Commons Attribution-ShareAlike 3.0 License