I was converting the fileStream into a ReadWriteStream so that the character
$( I add to it does not affect the file itself. If you use the fileStream
directly, I think your file will be modified and then you will not be able to
re-import it at all.
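For illustration, the conversion described above could look roughly like this (a sketch only; 'model.mse' is a hypothetical file name, and the exact FileStream protocol varies between Squeak/Pharo versions):

	| fileStream stream |
	"Open the file read-only and copy its contents into an in-memory
	ReadWriteStream, so a later nextPut: $( mutates only the copy."
	fileStream := FileStream readOnlyFileNamed: 'model.mse'.
	stream := ReadWriteStream with: fileStream contentsOfEntireFile.
	stream reset.	"with: leaves the position at the end; rewind before reading"
	fileStream close.

Note that this reads the whole file into memory, which is exactly what hurts for very large models.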
2010/2/18 Laval Jannik <jannik.laval(a)inria.fr>
Hi Johan,
When I work on big models, I run the VM with more memory:
In Command line:
Squeak\ 4.2.2beta1U.app/Contents/MacOS/Squeak\ VM\ Opt -memory 1500m
It should work with this.
Cheers,
Jannik
On Feb 18, 2010, at 08:12 , Johan Brichau wrote:
Thanks Doru. I will take a look at that.
It all seems to work very well now, except for large models. What can you do
in Pharo to use more memory?
I have a large model exported from VW which gives trouble when importing into
Pharo, merely because of its size.
One thing that at first seemed to help is not converting the fileStream into a
ReadWriteStream in the #importFromFamix2MSE method. I changed the method as
follows:
importFromFamix2MSE
	<menuItem: 'Import from Famix2 MSE' category: 'Import / Export'>
	| stream contents streamCopy |
	stream := UITheme builder
		fileOpen: 'Import model from MSE file'
		extensions: #('mse').
	"we skip all the Famix 2 specific head of the file"
	stream upTo: $(.
	stream upTo: $(.
	stream upTo: $(.
	stream upTo: $(.
	stream next: 6.
	stream nextPut: $(.
	stream back.
	self name: (FileDirectory baseNameFor: stream localName).
	self importFromFamix2MSEStream: stream.
	stream close
However, I later ran out of memory while parsing it.
Any ideas?
_______________________________________________
Moose-dev mailing list
Moose-dev(a)iam.unibe.ch
https://www.iam.unibe.ch/mailman/listinfo/moose-dev