[FLASH-USERS] too many refinement iterations
Sean M. Couch
couch at pa.msu.edu
Tue Sep 24 09:14:18 EDT 2019
Hi Jon,
There are various temporary/scratch arrays that are allocated at runtime, so it could be that the machine is running out of memory. Some arrays are sized according to maxblocks (or a _multiple_ thereof). Have you tried reducing the maxblocks compiled into the application? Also, if you set up the code with `useFortran2003=True`, you will get some very useful memory usage statistics in the log file just before the main iteration loop info starts printing.
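In case it's helpful, here is a sketch of what that setup invocation might look like (the simulation name, dimensionality, and maxblocks value below are placeholders, and shortcut spellings can vary between FLASH versions):

    ./setup MySimulation -auto -3d -maxblocks=200 useFortran2003=True

Lowering maxblocks shrinks the per-process arrays that are sized on it, at the cost of allowing fewer blocks per process.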
Sean
----------------------------------------------------------------------
Sean M. Couch, Ph.D.
Assistant Professor
Department of Physics and Astronomy
Department of Computational Mathematics, Science, and Engineering
Facility for Rare Isotope Beams
Michigan State University
567 Wilson Rd, 3260 BPS
East Lansing, MI 48824
(517) 884-5035 --- couch at pa.msu.edu --- www.pa.msu.edu/~couch
On Sep 23, 2019, 11:27 AM -0400, Slavin, Jonathan <jslavin at cfa.harvard.edu> wrote:
Hi Marissa,
Thanks for sharing your experience. I'm currently letting it run, as I mentioned, having changed the refinement criteria to remove pressure as a refine_var. Before I try anything more I'd like to see how that turns out. Running under gdb could take some time: even using MPI with 10 CPUs on my desktop, it takes a couple of hours before the run gets into that mode where it keeps increasing the refinement. I could, however, compile with various checks enabled and at low optimization, which could produce useful information.
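Concretely, since I'm building with the Intel Fortran compiler, I have in mind flags along these lines (a sketch only; exact spellings vary between compiler versions, so check against your Makefile.h):

    -O0 -g -traceback -check all -ftrapuv -fpe0 -warn all

Here -check all turns on runtime bounds and uninitialized-variable checking, -ftrapuv poisons stack variables, and -fpe0 aborts on floating-point exceptions instead of letting them propagate.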
As for the units used, here they are (note that the drag unit under Particles/ParticlesForces and the dust unit under Particles/ParticlesMain are units that I have written):
FLASH Units used:
Driver/DriverMain/Split
Grid/GridBoundaryConditions
Grid/GridMain/paramesh/interpolation/Paramesh4/prolong
Grid/GridMain/paramesh/interpolation/prolong
Grid/GridMain/paramesh/paramesh4/Paramesh4dev/PM4_package/headers
Grid/GridMain/paramesh/paramesh4/Paramesh4dev/PM4_package/mpi_source
Grid/GridMain/paramesh/paramesh4/Paramesh4dev/PM4_package/source
Grid/GridMain/paramesh/paramesh4/Paramesh4dev/PM4_package/utilities/multigrid
Grid/GridParticles/GridParticlesMapFromMesh
Grid/GridParticles/GridParticlesMapToMesh/Paramesh/MoveSieve
Grid/GridParticles/GridParticlesMove/Sieve/BlockMatch
Grid/GridParticles/GridParticlesMove/paramesh
Grid/GridSolvers/HYPRE/paramesh
IO/IOMain/hdf5/serial/PM
IO/IOParticles/hdf5/serial
Particles/ParticlesForces/shortRange/drag
Particles/ParticlesInitialization
Particles/ParticlesMain/active/dust
Particles/ParticlesMapping/Quadratic
Particles/ParticlesMapping/meshWeighting/MapToMesh
PhysicalConstants/PhysicalConstantsMain
RuntimeParameters/RuntimeParametersMain
Simulation/SimulationMain
flashUtilities/contiguousConversion
flashUtilities/general
flashUtilities/interpolation/oneDim
flashUtilities/nameValueLL
flashUtilities/rng
flashUtilities/sorting/quicksort
flashUtilities/system/memoryUsage/legacy
monitors/Logfile/LogfileMain
monitors/Timers/TimersMain/MPINative
physics/Diffuse/DiffuseMain/Unsplit
physics/Eos/EosMain/Gamma
physics/Hydro/HydroMain/split/PPM/PPMKernel
physics/materialProperties/Conductivity/ConductivityMain/PowerLaw
Also note that this run uses FLASH 4.3.
Regards,
Jon
On Mon, Sep 23, 2019 at 11:02 AM Marissa Adams <madams at pas.rochester.edu> wrote:
Hi Jonathan,
I encountered something quite similar earlier this past summer. When I ran with AMR on a supercomputer, it would refine all the way, then crash via segfault. I then tried running the same executable under gdb, and it ran just fine. However, under gdb it wouldn't even register the refinement! It just pushed through the time step where it had refined and crashed, as if it were on a pseudo-fixed grid, and it did this even when I asked for only one or two levels of refinement. I am wondering if you can run yours under gdb and see if it does something similar? Then perhaps we'd know we are dealing with the same "heisenbug" I encountered.
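In case it's useful, the way I attach gdb to an MPI run is the usual xterm-per-rank recipe (a sketch; the rank count and binary name are placeholders, and the xterms need a working X display):

    mpirun -np 10 xterm -e gdb ./flash4

Each rank opens in its own gdb session; type run in each window to start them.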
After moving back to my local machine to debug further, I tracked down some uninitialized variables that may have been corrupting memory and causing that sort of crash. I compiled with -O0, debug symbols, and additional -W flags to make sure it crashed in a way I could actually diagnose. It was a process... but perhaps list what units you're using and I can tell you if we have any overlap, and which variables I initialized that seemed to fix the problem.
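As a rough sketch of the kind of flags I mean (gfortran syntax here; the Intel equivalents differ):

    -O0 -g -Wall -Wextra -fcheck=all -finit-real=snan -ffpe-trap=invalid,zero,overflow

The -finit-real=snan option initializes reals to signaling NaNs, so the first use of an uninitialized variable trips the FPE trap right away instead of quietly corrupting the run.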
Best,
Marissa
On Mon, Sep 23, 2019 at 9:05 AM Slavin, Jonathan <jslavin at cfa.harvard.edu> wrote:
Hi,
I've run into an issue with a simulation where, after evolving just fine, it begins to take more and more iterations during grid refinement until it fails. It's strange because I ran the same simulation on a different system with the same code and the same parameters and it worked just fine. I did use different versions of the Intel Fortran compiler, and the hardware (CPUs) was a bit different, but in terms of the software it is the same.
I'm currently trying again with refinement on pressure removed: previously I refined on both density and pressure; now I'm refining on density alone.
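In flash.par terms the change amounts to something like this (a sketch, assuming the usual "dens" and "pres" mesh variable names):

    refine_var_1 = "dens"
    refine_var_2 = "none"    # was "pres"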
If anyone has any other suggestions, I'd like to hear them.
Thanks,
Jon
--
Jonathan D. Slavin
Astrophysicist - High Energy Astrophysics Division
Center for Astrophysics | Harvard & Smithsonian
Office: (617) 496-7981 | Cell: (781) 363-0035
60 Garden Street | MS 83 | Cambridge, MA 02138