On 26.09.09 14:40, Jim Guo wrote:
Memory sizes nowadays are larger, and each time I build a new image I have
to set the Memory Policy manually. Also, we don't know in advance the exact
memory size of the computer which will run the program, so it seems better
to decide automatically.
Is there a way? Thanks.
Yes, and no. :-)
You can save the configuration of the Runtime Packager, including settings
for MemoryPolicy parameters, for later use, such that you don't have to
enter the values every time you build an image (see page 21-21 in the
Application Developer's Guide). But that addresses only part of the problem.
MemoryPolicy does not implement the ability to adapt its parameters to its
environment. But you can write a subclass of MemoryPolicy (see the sketch
below) such that
- it computes memoryUpperBound dynamically instead of using a constant value
- it computes growthRegimeUpperBound as e.g. 90% of memoryUpperBound or
whatever seems reasonable
- it sets defaults for other parameters to values more appropriate to larger
memory sizes and the needs of your application (e.g., if your application
needs large amounts of memory, it is a good idea to set
preferredGrowthIncrement to a larger value than the default of 1 MB - I tend
to set it to 10 MB without thinking much about it).
MemoryPolicy has several places where it can be adapted. For example,
you can tweak the permitMemoryGrowth: method, the memoryMeasurementBlock,
or the way memoryUpperBound is determined to achieve what you want.
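A minimal sketch of such a subclass, written as plain method definitions
(untested; AdaptiveMemoryPolicy and physicalMemorySize are names I made up
for this example, and accessor spellings differ between VisualWorks
versions - in some releases the selector is memoryUpperbound with a
lowercase "b" - so check the MemoryPolicy class in your image first):

In a subclass AdaptiveMemoryPolicy of MemoryPolicy:

    memoryUpperBound
        "Derive the upper bound from the machine's physical memory
         instead of using a hard-coded constant, leaving a quarter
         of it to the OS and other programs. physicalMemorySize is
         a hypothetical helper - see the sketch further down."
        ^(self physicalMemorySize * 3 // 4) max: (64 * 1024 * 1024)

    growthRegimeUpperBound
        "End the growth regime at 90% of the upper bound."
        ^self memoryUpperBound * 9 // 10

    preferredGrowthIncrement
        "Grow memory in 10 MB steps instead of the 1 MB default."
        ^10 * 1024 * 1024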
It is relatively easy to detect how much physical memory your computer has
using the appropriate operating system calls, and it's a good idea to set
memoryUpperBound lower than that. VisualWorks applications can be extremely
slow when parts of their memory are swapped to disk, especially when running
global garbage collections.
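On Linux, for example, you can get at that number without any C glue by
parsing /proc/meminfo; on Windows you would wrap GlobalMemoryStatusEx via
DLL and C Connect instead. A rough sketch of the hypothetical helper used
above (assuming the Filename protocol; error handling kept minimal):

    physicalMemorySize
        "Answer the physical memory size in bytes. Linux only:
         parse the 'MemTotal:  8167520 kB' line (value in
         kilobytes) from /proc/meminfo."
        | stream line |
        stream := '/proc/meminfo' asFilename readStream.
        [[stream atEnd] whileFalse:
            [line := stream upTo: (Character value: 10).
            (line size >= 8 and: [(line copyFrom: 1 to: 8) = 'MemTotal'])
                ifTrue: [^(line select: [:each | each isDigit]) asNumber
                            * 1024]]]
            ensure: [stream close].
        ^self error: 'could not read MemTotal from /proc/meminfo'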
You could also implement a SubSystem which reads parameters for the
MemoryPolicy from a configuration file and installs or configures the
currentMemoryPolicy accordingly in its startUp method.
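For example (the file name and format are made up, and the installation
protocol is an assumption - in the VisualWorks versions I have used,
sending install to a MemoryPolicy instance makes it the
currentMemoryPolicy, but verify that in your image):

    startUp
        "Read 'memory.cfg' next to the image, expecting one
         'parameterName value' pair per line, e.g.
         'preferredGrowthIncrement 10485760', and install a
         policy configured from it."
        | stream tokens policy |
        policy := MemoryPolicy new.
        stream := 'memory.cfg' asFilename readStream.
        [[stream atEnd] whileFalse:
            [tokens := (stream upTo: (Character value: 10)) tokensBasedOn: $ .
            tokens size = 2 ifTrue:
                [policy
                    perform: (tokens first , ':') asSymbol
                    with: tokens last asNumber]]]
            ensure: [stream close].
        policy install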
You should also set the ObjectMemory's sizesAtStartup to larger values,
especially the sizes of eden and survivor space. My rule of thumb is to set
eden and survivor space to 10 times the default size, and then run the
TimeProfiler on parts of the application which are likely to produce lots of
temporary objects. If it shows that there are too many scavenges (garbage
collection runs in NewSpace), I increase the sizes further until the number
of scavenges doesn't decrease any more or the time to execute the code
starts increasing. You can
increase the other size parameters too, in particular stack space and
compiled code cache. I usually double them. If you don't use fixed space,
leave its size at the default value. Increase OldSpace headroom such that it
is large enough to hold the objects created while starting and initializing
your application, and a bit more. That way, your application will start a
bit faster, because it doesn't have to allocate more OldSpace segments
during initialization.
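Setting the sizes is a one-time action before saving the deployment image.
The entries of sizesAtStartup are multipliers applied to the built-in
default sizes; which index belongs to which space is something I never
remember correctly, so check the method comment of sizesAtStartup: in your
image before relying on the assignments in this sketch:

    | sizes |
    sizes := ObjectMemory sizesAtStartup copy.
    sizes at: 1 put: 10.0.    "eden: 10 times the default size"
    sizes at: 2 put: 10.0.    "survivor space: 10 times the default size"
    ObjectMemory sizesAtStartup: sizes.
    "Save the image; the new sizes take effect at the next startup."

Stack space and the compiled code cache have their own entries in the same
array; doubling them means putting 2.0 at the corresponding indices.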
HTH,
Joachim Geidel
_______________________________________________
vwnc mailing list
vwnc@cs.uiuc.edu
http://lists.cs.uiuc.edu/mailman/listinfo/vwnc