I am trying to run a model by writing the input .dat file directly.
However, the computer's memory nearly fills up before the calculation begins. The model is big, but I think temporary files are being created that I do not need.
I have NO ECHO and NO PRINT introduced in my .dat to reduce the size of the .out files.
Nevertheless, .t19 files are being created, and just a single .t16 file, even though I am using multi-processing with 10 threads.
What could be the problem?
Normally, in the calculations I have done until now, a global .t16 file was created together with one .t16 file per thread used in the multi-processing.
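For what it's worth, this is the quick check I run after submitting a job to see which post files have appeared (the job name is just an example from my automated runs, and the numbered per-domain prefix is what I see on my installation):

import glob

jobname = "model_0001"  # example job id from my automated runs

# On my installation a DDM run normally writes one global post file
# (jobname.t16) plus one per domain (1jobname.t16, 2jobname.t16, ...).
for pattern in (f"*{jobname}*.t16", f"*{jobname}*.t19"):
    print(pattern, "->", sorted(glob.glob(pattern)))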
If any further explanation is required to better understand the problem, please ask.
I did not know about the out-of-core option. What shall I write in the input .dat file? I generate the models in an automated way and do not open them in Marc to run them.
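Since the .dat files are generated by a script, my plan would be to patch the out-of-core setting in automatically. Below is a sketch of that patching step; the SOLVER_CARD content is only a placeholder, because I still need to look up the exact data-line fields of the SOLVER model definition option (and its out-of-core flag) in Marc Volume C, Program Input. The only Marc-specific assumption hard-coded here is that the parameter section ends with an END line and the model definition options follow it.

from pathlib import Path

# Placeholder: the actual SOLVER data line(s) must be taken from
# Marc Volume C (Program Input); I do not know the exact position
# of the out-of-core flag yet.
SOLVER_CARD = "SOLVER\n<data line with the out-of-core flag set>\n"

def add_solver_card(dat_path: str) -> None:
    """Insert the SOLVER option at the start of the model definition
    section, i.e. right after the END line of the parameter section."""
    path = Path(dat_path)
    out, inserted = [], False
    for line in path.read_text().splitlines(keepends=True):
        out.append(line)
        if not inserted and line.strip().upper() == "END":
            out.append(SOLVER_CARD)
            inserted = True
    path.write_text("".join(out))

add_solver_card("model_0001.dat")  # example file name from my setup

Does something along these lines look right, or is the out-of-core option controlled somewhere else entirely?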