[FLASH-USERS] FLASH memory usage: Static, Dynamic and Machine Specific

Sean M. Couch couch at pa.msu.edu
Thu Aug 11 13:06:44 EDT 2016


Why not use the original multipole solver? It is not really any less
accurate.

Oh?  Then I guess we shouldn’t have bothered to write this whole paper about how much more accurate (and efficient) it is: http://adsabs.harvard.edu/abs/2013ApJ...778..181C.  And going back to the old solver doesn’t address Rahul’s problem at all.

Rahul, in the reference paper we go to very high L_max, but in 2D only.  Perhaps try setting mpole_3DAxisymmetry to .TRUE. in your .par file.  This is likely not an ultimate fix, since I assume you need non-axisymmetric gravity for your problem, but it could tell you whether it is indeed the memory requirements of the mpole arrays that are the issue.
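
For concreteness, here is what that might look like in the .par file.  mpole_3DAxisymmetry is the parameter I mean above; mpole_Lmax is, I believe, the name of the maximum-moment parameter in the Multipole_new solver, but check both names against your FLASH version:

    mpole_Lmax          = 60       # maximum multipole moment L_max
    mpole_3DAxisymmetry = .TRUE.   # keep only the m = 0 moments, which
                                   # shrinks the mpole arrays from ~(L+1)^2
                                   # to ~(L+1) entries in the LM dimension

If the run then fits in memory, that points squarely at the mpole arrays.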


Sean


--
On 08/11/16 12:41, Klaus Weide wrote:
> On Wed, 10 Aug 2016, Rahul Kashyap wrote:
>
>> Yes, I forgot to mention that I'm using the new multipole implementation
>> with 60 poles.
>>
>> I have attached a small txt file with a short summary of three runs,
>> which describes my problem very well. 1024 procs were used for all runs,
>> with fixed lrefine_max and base blocks. I get three different errors for
>> three different maxblocks values.
>>
>> My understanding was that reasonable use of maxblocks avoids any such
>> memory failures.
> Rahul,
>
> It appears that the total memory required by
>
> PARAMESH Grid + multipole solver ( + Particles + other units )
>
> is just too much; I suspect that this is PRIMARILY due to the memory
> requirements of the Multipole(_new) solver.
>
> There are several large arrays allocated, see in particular statements
> like
>
> allocate (gr_mpoleScratch (1:gr_mpoleMaxLM,1:gr_mpoleMaxQ ),...)
>
> in gr_mpoleAllocateRadialArrays.F90, where gr_mpoleMaxQ may be very large
> and gr_mpoleMaxLM ~ gr_mpoleMaxL ** 2 ( == 60**2 in your case?).
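>
> As a rough, illustrative estimate (the numbers are assumptions, not
> values read from your runs): with 60 poles, gr_mpoleMaxLM is of order
> 60**2 = 3600, and if the fine radial binning gives, say, gr_mpoleMaxQ
> of order 10**6, a single double-precision array of that shape costs
> about
>
>   3600 * 10**6 * 8 bytes  ~  29 GB
>
> per process, before counting the Grid, particles, or any other arrays.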
>
> Unfortunately this memory is required for each process; you cannot reduce
> it by running on more procs.
>
> It makes sense in general to try to reduce the memory required by the
> PARAMESH Grid by lowering maxblocks, but this can go only so far;
> maxblocks has to leave room for "a few" more blocks than the number
> actually required by the distributed grid. These additional slots
> are needed as temporary storage by some internal PARAMESH routines
> for operations such as block redistribution. I don't
> know of a reliable way to predict how low "a few" can go in a given
> case, so this has to be determined empirically. Apparently,
> this was too low in your maxblocks=5 case.
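>
> As a concrete (hypothetical) example of the arithmetic: if the log file
> reports 8192 blocks in total on 1024 procs, that is ~8 blocks per
> process on average, so maxblocks = 5 cannot even hold the distributed
> grid, let alone the extra temporary slots; something like
> maxblocks = 12-16 would be a safer starting point.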
>
> It may be possible to tweak the maxblocks value further, perhaps in
> combination with modifying the values of maxblocks_alloc and
> maxblocks_tr (see amr_initialize.F90 and paramesh_dimensions.F90),
> in order to allow the Grid domain initialization to proceed with
> maxblocks < 10; but even then you may not be left with enough free
> memory for the multipole solver (and particles, etc.).
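>
> For orientation, the relevant pattern in amr_initialize.F90 is roughly
> the following (the factor of 10 is an assumption from memory; check
> your PARAMESH source before relying on it):
>
>   maxblocks_alloc = maxblocks * 10   ! slots for temporary block storage
>   maxblocks_tr    = maxblocks * 10   ! size of the tree data structures
>
> so these scale with maxblocks unless you change them explicitly.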
>
> So you should investigate ways to lower the memory requirements of the
> Poisson solver; you may have to lower its resolution (I'm not sure which
> runtime parameters to change), or perhaps use a different implementation.
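>
> The runtime parameters most likely involved are mpole_Lmax and the
> radial zone controls of the Multipole_new implementation; the names
> below are my best guess from the solver's Config, so verify them
> before use:
>
>   mpole_Lmax           = 40    # fewer moments -> smaller gr_mpoleMaxLM
>   mpole_MaxRadialZones = 1
>   mpole_ZoneScalar_1   = 2.0   # coarser radial bins -> smaller gr_mpoleMaxQ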
>
> Klaus
