[FLASH-USERS] Fatal error in PMPI_Waitall: See the MPI_ERROR field in MPI_Status for the error code

Eddie Hansen ehansen at flash.uchicago.edu
Tue Dec 10 13:54:46 EST 2019


What Ryan meant is that the *total* memory of the simulation increases as
you increase the number of processors, because each processor has its own
copy of that data array.

NUNK_VARS is the number of variables in that data array within the code,
which is more than the number of variables that are plot_vars in the plot
files. NUNK_VARS includes every variable: dens, velx, vely, velz, temp,
etc., whereas the plot_vars are just the subset that you specify in the par
file.
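
For a rough sense of scale, here is a minimal Python sketch of the
solution-array memory, using the block sizes from your log (NXB=NYB=NZB=8,
MAXBLOCKS=3000). The NUNK_VARS value below is hypothetical (check the
Flash.h generated by setup for the real count), guard cells and work arrays
are ignored, and 8 bytes per value assumes double precision, matching the
-fdefault-real-8 in your compile flags:

# Rough estimate of per-processor and total solution-array memory.
# NUNK_VARS=20 is a placeholder; guard cells and work arrays are ignored.
NXB = NYB = NZB = 8      # zones per block (from the log)
MAXBLOCKS = 3000         # from the setup line in the log
NUNK_VARS = 20           # hypothetical; depends on the units in your setup
BYTES = 8                # double precision (-fdefault-real-8)

per_proc = NUNK_VARS * NXB * NYB * NZB * MAXBLOCKS * BYTES
print(f"per processor: {per_proc / 2**20:.0f} MiB")
for procs in (8, 10, 16):
    print(f"{procs:3d} procs -> total: {per_proc * procs / 2**30:.1f} GiB")

With 20 variables that is roughly 234 MiB of solution data per processor
before guard cells, and the total across the run grows linearly with the
number of processors as long as maxblocks stays fixed.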


On Tue, Dec 10, 2019 at 12:34 PM guido <g.granda at irya.unam.mx> wrote:

> Hello,
>
> Thank you, but I'm confused. You said that the more processors I use, the
> more memory each processor will require. However, from maxblocks ~
> 1.02 * num_blks_requested / num_procs, maxblocks is inversely proportional
> to the number of processors, and since the memory each processor needs is
> NUNK_VARS x NXB x NYB x NZB x MAXBLOCKS, the memory is inversely
> proportional to num_procs, right?
>
> Another question: is NUNK_VARS the number of plot_vars written into
> the plot files?
>
> Cheers,
> On 06/12/19 22:53, Ryan Farber wrote:
>
> Hi Guido,
>
> I've seen FLASH crash with similar stdout due to insufficient memory many
> times. Keep in mind that the more processors you use, the more memory the
> run requires (since each processor has its own solution data array of
> NUNK_VARS x NXB x NYB x NZB x MAXBLOCKS).
>
> I was going to suggest reducing maxblocks, since 3000 sounds rather
> extreme to me. My simulations are happiest with MAXBLOCKS=200 (the default value).
>
> Assuming static mesh refinement, you only need maxblocks to be ~1.02 *
> num_blks_requested / num_procs (in your case, 1.02*97801/procs).
>
> So, if you want to use more nodes, decrease maxblocks accordingly.
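>
> As a worked example (a sketch in Python; 97801 is the "tot blks requested"
> from your log, and the processor counts are just illustrations):
>
> import math
>
> num_blks_requested = 97801   # from the log: tot blks requested
> for procs in (64, 128, 256, 512):
>     maxblocks = math.ceil(1.02 * num_blks_requested / procs)
>     print(f"{procs:4d} procs -> maxblocks >= {maxblocks}")
>
> # Output:
> #   64 procs -> maxblocks >= 1559
> #  128 procs -> maxblocks >= 780
> #  256 procs -> maxblocks >= 390
> #  512 procs -> maxblocks >= 195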
>
> Best,
> --------
> Ryan
>
>
> On Fri, Dec 6, 2019 at 8:42 PM guido <g.granda at irya.unam.mx> wrote:
>
>> Hello,
>> Thank you for replying. I tried your suggestion and I still got the same
>> errors. However, I tried increasing the number of nodes from 8 to 16 and
>> the simulation runs. So, I guess the crash was due to a lack of resources
>> such as RAM or something similar. How can I check that this was the cause
>> of the crash?
>>
>> Cheers,
>>
>> On 12/6/19 6:13 PM, Eddie Hansen wrote:
>>
>> I haven't seen this error before, so I don't know if this is the
>> solution, but here's something I noticed...
>>
>> You mentioned that you have loaded a version of hdf5 with parallel mode
>> enabled, yet the IO unit that is loaded during setup is
>> IO/IOMain/hdf5/serial. Try using the setup shortcut +hdf5typeio. This will
>> load, by default, the parallel IO unit. Resetup, recompile, rerun, and see
>> if you get the same error.
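>>
>> For example, reusing the setup line from your log file (a sketch; adjust
>> for your own site and options as needed):
>>
>> /home/guido/FLASH4.5/bin/setup.py magnetoHd/MC_evolution -auto -3d +hdf5typeio -maxblocks=3000 -objdir=obj_mc -site=irya.guido --without-unit=physics/Hydro/HydroMain/split/MHD_8Wave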
>>
>>
>> On Fri, Dec 6, 2019 at 5:46 PM guido <g.granda at irya.unam.mx> wrote:
>>
>>> Hello Flash users,
>>>
>>> I’m having trouble running a simulation with FLASH 4.5. I compiled
>>> FLASH 4.5 using:
>>>
>>> - hdf5-1.8.20 with parallel mode enabled
>>>
>>> - mpich-3.2.1
>>>
>>> - gcc-7.3.0
>>>
>>> The simulations run fine until I increase the maximum level of refinement
>>> and the number of nodes used. For example, a simulation with lmax=9 runs
>>> slowly but safely on 8 nodes. However, when I increase the number of
>>> nodes to 10, the simulation crashes while trying to write the coefficients
>>> of the Taylor expansion to file.
>>>
>>> The log file is the following:
>>>
>>>
>>> FLASH log file: 12-05-2019 12:01:48.640 Run number: 1
>>>
>>>
>>> ==============================================================================
>>>
>>> Number of MPI tasks: 1
>>>
>>> MPI version: 3
>>>
>>> MPI subversion: 1
>>>
>>> Dimensionality: 3
>>>
>>> Max Number of Blocks/Proc: 3000
>>>
>>> Number x zones: 8
>>>
>>> Number y zones: 8
>>>
>>> Number z zones: 8
>>>
>>> Setup stamp: Wed Sep 18 16:44:51 2019
>>>
>>> Build stamp: Wed Sep 18 16:45:20 2019
>>>
>>> System info: Linux mouruka.crya.privado 2.6.32-504.16.2.el6.x86_64 #1
>>> SMP Wed Apr 22 06:48:29
>>>
>>> Version: FLASH 4.5_release
>>>
>>> Build directory: /home/guido/FLASH4.5/obj_mc
>>>
>>> Setup syntax: /home/guido/FLASH4.5/bin/setup.py magnetoHd/MC_evolution
>>> -auto -3d -maxblocks=3000 -objdir=obj_mc -site=irya.guido
>>> --without-unit=physics/Hydro/HydroMain/split/MHD_8Wave
>>>
>>> f compiler flags:
>>> /home/guido/libraries/compiled_with_gcc-7.3.0/mpich-3.2.1/bin/mpif90 -g -c
>>> -O0 -fdefault-real-8 -fdefault-double-8 -ffree-line-length-none
>>> -Wuninitialized -g -c -fdefault-real-8 -fdefault-double-8
>>> -ffree-line-length-none -Wuninitialized -DMAXBLOCKS=3000 -DNXB=8 -DNYB=8
>>> -DNZB=8 -DN_DIM=3
>>>
>>> c compiler flags:
>>> /home/guido/libraries/compiled_with_gcc-7.3.0/mpich-3.2.1/bin/mpicc
>>> -I/home/guido/libraries/compiled_with_gcc-7.3.0/hdf5-1.8.20/include
>>> -DH5_USE_16_API -O0 -c -DMAXBLOCKS=3000 -DNXB=8 -DNYB=8 -DNZB=8 -DN_DIM=3
>>>
>>>
>>> ==============================================================================
>>>
>>> Comment: MC evolution
>>>
>>>
>>> ==============================================================================
>>>
>>> FLASH Units used:
>>>
>>> Driver/DriverMain/Split
>>>
>>> Driver/localAPI
>>>
>>> Grid/GridBoundaryConditions
>>>
>>> Grid/GridMain/paramesh/interpolation/Paramesh4/prolong
>>>
>>> Grid/GridMain/paramesh/interpolation/prolong
>>>
>>> Grid/GridMain/paramesh/paramesh4/Paramesh4dev/PM4_package/headers
>>>
>>> Grid/GridMain/paramesh/paramesh4/Paramesh4dev/PM4_package/mpi_source
>>>
>>> Grid/GridMain/paramesh/paramesh4/Paramesh4dev/PM4_package/source
>>>
>>> Grid/GridMain/paramesh/paramesh4/Paramesh4dev/PM4_package/source
>>>
>>>
>>> Grid/GridMain/paramesh/paramesh4/Paramesh4dev/PM4_package/utilities/multigrid
>>>
>>> Grid/GridMain/paramesh/paramesh4/Paramesh4dev/flash_avoid_orrery
>>>
>>> Grid/GridSolvers/BHTree/Wunsch
>>>
>>> Grid/localAPI
>>>
>>> IO/IOMain/hdf5/serial/PM
>>>
>>> IO/localAPI
>>>
>>> PhysicalConstants/PhysicalConstantsMain
>>>
>>> RuntimeParameters/RuntimeParametersMain
>>>
>>> Simulation/SimulationMain/magnetoHD/MC_evolution
>>>
>>> flashUtilities/contiguousConversion
>>>
>>> flashUtilities/general
>>>
>>> flashUtilities/interpolation/oneDim
>>>
>>> flashUtilities/nameValueLL
>>>
>>> flashUtilities/sorting/quicksort
>>>
>>> flashUtilities/system/memoryUsage/legacy
>>>
>>> monitors/Logfile/LogfileMain
>>>
>>> monitors/Timers/TimersMain/MPINative
>>>
>>> physics/Eos/EosMain/Gamma
>>>
>>> physics/Eos/localAPI
>>>
>>> physics/Gravity/GravityMain/Poisson/BHTree
>>>
>>> physics/Hydro/HydroMain/split/Bouchut/Bouchut3
>>>
>>> physics/Hydro/localAPI
>>>
>>> physics/sourceTerms/Polytrope/PolytropeMain
>>>
>>>
>>> ==============================================================================
>>>
>>> RuntimeParameters:
>>>
>>>
>>> ==============================================================================
>>>
>>> bndpriorityone = 1
>>>
>>> bndprioritythree = 3
>>>
>>> bndprioritytwo = 2
>>>
>>> checkpointfileintervalstep = 1000000 [CHANGED]
>>>
>>> checkpointfilenumber = 0
>>>
>>> dr_abortpause = 2
>>>
>>> dr_dtminbelowaction = 1
>>>
>>> dr_numposdefvars = 4
>>>
>>> drift_break_inst = 0
>>>
>>> drift_trunc_mantissa = 2
>>>
>>> drift_verbose_inst = 0
>>>
>>> eos_entrelescalechoice = 6
>>>
>>> eos_loglevel = 700
>>>
>>> fileformatversion = 9
>>>
>>> forcedplotfilenumber = 0
>>>
>>> gr_bhtwmaxqueuesize = 10000
>>>
>>> gr_lrefmaxtimevalue_1 = -1
>>>
>>> gr_lrefmaxtimevalue_10 = -1
>>>
>>> gr_lrefmaxtimevalue_11 = -1
>>>
>>> gr_lrefmaxtimevalue_12 = -1
>>>
>>> gr_lrefmaxtimevalue_13 = -1
>>>
>>> gr_lrefmaxtimevalue_14 = -1
>>>
>>> gr_lrefmaxtimevalue_15 = -1
>>>
>>> gr_lrefmaxtimevalue_16 = -1
>>>
>>> gr_lrefmaxtimevalue_17 = -1
>>>
>>> gr_lrefmaxtimevalue_18 = -1
>>>
>>> gr_lrefmaxtimevalue_19 = -1
>>>
>>> gr_lrefmaxtimevalue_2 = -1
>>>
>>> gr_lrefmaxtimevalue_20 = -1
>>>
>>> gr_lrefmaxtimevalue_3 = -1
>>>
>>> gr_lrefmaxtimevalue_4 = -1
>>>
>>> gr_lrefmaxtimevalue_5 = -1
>>>
>>> gr_lrefmaxtimevalue_6 = -1
>>>
>>> gr_lrefmaxtimevalue_7 = -1
>>>
>>> gr_lrefmaxtimevalue_8 = -1
>>>
>>> gr_lrefmaxtimevalue_9 = -1
>>>
>>> gr_pmrpdivergencefree = 1
>>>
>>> gr_pmrpifaceoff = 0
>>>
>>> gr_pmrpl2p5d = 0
>>>
>>> gr_pmrpmaxblocks = -1
>>>
>>> gr_pmrpmflags = 1
>>>
>>> gr_pmrpnboundaries = 6
>>>
>>> gr_pmrpndim = 3
>>>
>>> gr_pmrpnedgevar1 = -1
>>>
>>> gr_pmrpnfacevar = -1
>>>
>>> gr_pmrpnfielddivf = -1
>>>
>>> gr_pmrpnfluxvar = -1
>>>
>>> gr_pmrpnguard = -1
>>>
>>> gr_pmrpnguardwork = -1
>>>
>>> gr_pmrpnvar = -1
>>>
>>> gr_pmrpnvarwork = 1
>>>
>>> gr_pmrpnvarcorn = 0
>>>
>>> gr_pmrpnvaredge = 0
>>>
>>> gr_pmrpnxb = -1
>>>
>>> gr_pmrpnyb = -1
>>>
>>> gr_pmrpnzb = -1
>>>
>>> gr_restrictallmethod = 3
>>>
>>> gr_sanitizedatamode = 3 [CHANGED]
>>>
>>> gr_sanitizeverbosity = 5
>>>
>>> grv_bhewaldfieldnxv42 = 32
>>>
>>> grv_bhewaldfieldnyv42 = 32
>>>
>>> grv_bhewaldfieldnzv42 = 32
>>>
>>> grv_bhewaldnper = 32
>>>
>>> grv_bhewaldnrefv42 = -1
>>>
>>> grv_bhewaldseriesn = 10
>>>
>>> grv_bhmpdegree = 2
>>>
>>> iprocs = 1
>>>
>>> interpol_order = 2
>>>
>>> irenorm = 0
>>>
>>> jprocs = 1
>>>
>>> kprocs = 1
>>>
>>> lrefine_del = 0
>>>
>>> lrefine_max = 9 [CHANGED]
>>>
>>> lrefine_min = 3 [CHANGED]
>>>
>>> lrefine_min_init = 1
>>>
>>> max_particles_per_blk = 100
>>>
>>> memory_stat_freq = 100000
>>>
>>> meshcopycount = 1
>>>
>>> min_particles_per_blk = 1
>>>
>>> nbegin = 1
>>>
>>> nblockx = 1
>>>
>>> nblocky = 1
>>>
>>> nblockz = 1
>>>
>>> nend = 1000000 [CHANGED]
>>>
>>> nrefs = 2
>>>
>>> nsteptotalsts = 5
>>>
>>> outputsplitnum = 1
>>>
>>> plotfileintervalstep = 1000000 [CHANGED]
>>>
>>> plotfilenumber = 0
>>>
>>> refine_var_count = 4
>>>
>>> rolling_checkpoint = 10000
>>>
>>> sim_refine = 9 [CHANGED]
>>>
>>> sweeporder = 123
>>>
>>> wr_integrals_freq = 1
>>>
>>> cfl = 0.800E+00
>>>
>>> checkpointfileintervaltime = 0.316E+14 [CHANGED]
>>>
>>> checkpointfileintervalz = 0.180+309
>>>
>>> derefine_cutoff_1 = 0.300E+00 [CHANGED]
>>>
>>> derefine_cutoff_2 = 0.200E+00
>>>
>>> derefine_cutoff_3 = 0.200E+00
>>>
>>> derefine_cutoff_4 = 0.200E+00
>>>
>>> dr_dtmincontinue = 0.000E+00
>>>
>>> dr_posdefdtfactor = 0.100E+01
>>>
>>> dr_tstepslowstartfactor = 0.100E+00
>>>
>>> dtinit = 0.316E+11 [CHANGED]
>>>
>>> dtmax = 0.316E+13 [CHANGED]
>>>
>>> dtmin = 0.316E+10 [CHANGED]
>>>
>>> eintswitch = 0.100E-03 [CHANGED]
>>>
>>> eos_singlespeciesa = 0.100E+01
>>>
>>> eos_singlespeciesz = 0.100E+01
>>>
>>> gamma = 0.167E+01
>>>
>>> gr_bhtreelimangle = 0.500E+00
>>>
>>> gr_bhtreemaxcellmass = 0.100+100
>>>
>>> gr_bhtreemincellmass = 0.100E-98
>>>
>>> gr_bhtreesafebox = 0.120E+01
>>>
>>> gr_lrefinemaxredlogbase = 0.100E+02
>>>
>>> gr_lrefinemaxredradiusfact = 0.000E+00
>>>
>>> gr_lrefinemaxredtref = 0.000E+00
>>>
>>> gr_lrefinemaxredtimescale = 0.100E+01
>>>
>>> gr_lrefmaxtime_1 = -0.100E+01
>>>
>>> gr_lrefmaxtime_10 = -0.100E+01
>>>
>>> gr_lrefmaxtime_11 = -0.100E+01
>>>
>>> gr_lrefmaxtime_12 = -0.100E+01
>>>
>>> gr_lrefmaxtime_13 = -0.100E+01
>>>
>>> gr_lrefmaxtime_14 = -0.100E+01
>>>
>>> gr_lrefmaxtime_15 = -0.100E+01
>>>
>>> gr_lrefmaxtime_16 = -0.100E+01
>>>
>>> gr_lrefmaxtime_17 = -0.100E+01
>>>
>>> gr_lrefmaxtime_18 = -0.100E+01
>>>
>>> gr_lrefmaxtime_19 = -0.100E+01
>>>
>>> gr_lrefmaxtime_2 = -0.100E+01
>>>
>>> gr_lrefmaxtime_20 = -0.100E+01
>>>
>>> gr_lrefmaxtime_3 = -0.100E+01
>>>
>>> gr_lrefmaxtime_4 = -0.100E+01
>>>
>>> gr_lrefmaxtime_5 = -0.100E+01
>>>
>>> gr_lrefmaxtime_6 = -0.100E+01
>>>
>>> gr_lrefmaxtime_7 = -0.100E+01
>>>
>>> gr_lrefmaxtime_8 = -0.100E+01
>>>
>>> gr_lrefmaxtime_9 = -0.100E+01
>>>
>>> grv_bhaccerr = 0.100E+00
>>>
>>> grv_bhextrnpotcenterx = 0.000E+00
>>>
>>> grv_bhextrnpotcentery = 0.000E+00
>>>
>>> grv_bhextrnpotcenterz = 0.000E+00
>>>
>>> grv_bhnewton = -0.100E+01
>>>
>>> hall_parameter = 0.000E+00
>>>
>>> hyperresistivity = 0.000E+00
>>>
>>> nusts = 0.100E+00
>>>
>>> plotfileintervaltime = 0.316E+13 [CHANGED]
>>>
>>> plotfileintervalz = 0.180+309
>>>
>>> point_mass = 0.000E+00
>>>
>>> point_mass_rsoft = 0.000E+00
>>>
>>> polytropedens1 = 0.212E-24 [CHANGED]
>>>
>>> polytropedens2 = 0.100+100
>>>
>>> polytropedens3 = 0.100+100
>>>
>>> polytropedens4 = 0.100+100
>>>
>>> polytropedens5 = 0.100+100
>>>
>>> polytropegamma1 = 0.100E+01
>>>
>>> polytropegamma2 = 0.100E+01
>>>
>>> polytropegamma3 = 0.100E+01
>>>
>>> polytropegamma4 = 0.100E+01
>>>
>>> polytropegamma5 = 0.100E+01
>>>
>>> polytropekonst = 0.400E+09 [CHANGED]
>>>
>>> refine_cutoff_1 = 0.700E+00 [CHANGED]
>>>
>>> refine_cutoff_2 = 0.800E+00
>>>
>>> refine_cutoff_3 = 0.800E+00
>>>
>>> refine_cutoff_4 = 0.800E+00
>>>
>>> refine_filter_1 = 0.100E-01
>>>
>>> refine_filter_2 = 0.100E-01
>>>
>>> refine_filter_3 = 0.100E-01
>>>
>>> refine_filter_4 = 0.100E-01
>>>
>>> rss_limit = -0.100E+01
>>>
>>> sim_bx = 0.710E-05 [CHANGED]
>>>
>>> sim_by = 0.000E+00
>>>
>>> sim_bz = 0.000E+00
>>>
>>> sim_cs = 0.200E+05 [CHANGED]
>>>
>>> sim_delta = 0.463E+19 [CHANGED]
>>>
>>> sim_mu_mol = 0.127E+01
>>>
>>> sim_pert = 0.150E+01 [CHANGED]
>>>
>>> sim_rho = 0.212E-21 [CHANGED]
>>>
>>> sim_sigma = 0.250E+00 [CHANGED]
>>>
>>> sim_vx = 0.000E+00
>>>
>>> sim_vy = 0.000E+00
>>>
>>> sim_vz = 0.000E+00
>>>
>>> sim_x_cent = 0.000E+00
>>>
>>> sim_y_cent = 0.000E+00
>>>
>>> sim_z_cent = 0.000E+00
>>>
>>> small = 0.100E-39 [CHANGED]
>>>
>>> smalle = 0.100E-09
>>>
>>> smallp = 0.100E-21 [CHANGED]
>>>
>>> smallt = 0.100E+01 [CHANGED]
>>>
>>> smallu = 0.100E-39 [CHANGED]
>>>
>>> smallx = 0.100E-09
>>>
>>> smlrho = 0.100E-39 [CHANGED]
>>>
>>> tinitial = 0.000E+00
>>>
>>> tmax = 0.252E+15 [CHANGED]
>>>
>>> tstep_change_factor = 0.200E+01
>>>
>>> wall_clock_checkpoint = 0.432E+05
>>>
>>> wall_clock_time_limit = 0.605E+06
>>>
>>> x_refine_center = 0.000E+00
>>>
>>> xmax = 0.154E+20 [CHANGED]
>>>
>>> xmin = -0.154E+20 [CHANGED]
>>>
>>> y_refine_center = 0.000E+00
>>>
>>> ymax = 0.154E+20 [CHANGED]
>>>
>>> ymin = -0.154E+20 [CHANGED]
>>>
>>> zfinal = 0.000E+00
>>>
>>> zinitial = -0.100E+01
>>>
>>> z_refine_center = 0.000E+00
>>>
>>> zmax = 0.154E+20 [CHANGED]
>>>
>>> zmin = -0.154E+20 [CHANGED]
>>>
>>> unitsystem = CGS [CHANGED]
>>>
>>> basenm = mc_evolution_ [CHANGED]
>>>
>>> dr_posdefvar_1 = none
>>>
>>> dr_posdefvar_2 = none
>>>
>>> dr_posdefvar_3 = none
>>>
>>> dr_posdefvar_4 = none
>>>
>>> eosmode = dens_ie
>>>
>>> eosmodeinit = dens_ie
>>>
>>> geometry = cartesian
>>>
>>> gr_pmrpoutputdir = ./
>>>
>>> grav_boundary_type = periodic [CHANGED]
>>>
>>> grav_boundary_type_x = periodic [CHANGED]
>>>
>>> grav_boundary_type_y = periodic [CHANGED]
>>>
>>> grav_boundary_type_z = periodic [CHANGED]
>>>
>>> grv_bhewaldfname = ewald_coeffs
>>>
>>> grv_bhewaldfnameaccv42 = ewald_field_acc
>>>
>>> grv_bhewaldfnamepotv42 = ewald_field_pot
>>>
>>> grv_bhextrnpotfile = external_potential.dat
>>>
>>> grv_bhextrnpottype = planez
>>>
>>> grv_bhmac = ApproxPartialErr
>>>
>>> log_file = mc_evolution.log [CHANGED]
>>>
>>> output_directory =
>>>
>>> pc_unitsbase = CGS
>>>
>>> plot_grid_var_1 = none
>>>
>>> plot_grid_var_10 = none
>>>
>>> plot_grid_var_11 = none
>>>
>>> plot_grid_var_12 = none
>>>
>>> plot_grid_var_2 = none
>>>
>>> plot_grid_var_3 = none
>>>
>>> plot_grid_var_4 = none
>>>
>>> plot_grid_var_5 = none
>>>
>>> plot_grid_var_6 = none
>>>
>>> plot_grid_var_7 = none
>>>
>>> plot_grid_var_8 = none
>>>
>>> plot_grid_var_9 = none
>>>
>>> plot_var_1 = dens [CHANGED]
>>>
>>> plot_var_10 = magz [CHANGED]
>>>
>>> plot_var_11 = none
>>>
>>> plot_var_12 = none
>>>
>>> plot_var_13 = none
>>>
>>> plot_var_14 = none
>>>
>>> plot_var_15 = none
>>>
>>> plot_var_16 = none
>>>
>>> plot_var_17 = none
>>>
>>> plot_var_18 = none
>>>
>>> plot_var_2 = pres [CHANGED]
>>>
>>> plot_var_3 = none
>>>
>>> plot_var_4 = eint [CHANGED]
>>>
>>> plot_var_5 = velx [CHANGED]
>>>
>>> plot_var_6 = vely [CHANGED]
>>>
>>> plot_var_7 = velz [CHANGED]
>>>
>>> plot_var_8 = magx [CHANGED]
>>>
>>> plot_var_9 = magy [CHANGED]
>>>
>>> prof_file = profile.dat
>>>
>>> refine_var_1 = dens [CHANGED]
>>>
>>> refine_var_2 = none
>>>
>>> refine_var_3 = none
>>>
>>> refine_var_4 = none
>>>
>>> refine_var_thresh = dens [CHANGED]
>>>
>>> run_comment = MC evolution [CHANGED]
>>>
>>> run_number = 1
>>>
>>> stats_file = mc.dat [CHANGED]
>>>
>>> xl_boundary_type = periodic
>>>
>>> xr_boundary_type = periodic
>>>
>>> yl_boundary_type = periodic
>>>
>>> yr_boundary_type = periodic
>>>
>>> zl_boundary_type = periodic
>>>
>>> zr_boundary_type = periodic
>>>
>>> allowdtstsdominate = F
>>>
>>> alwayscomputeuservars = T
>>>
>>> alwaysrestrictcheckpoint = T
>>>
>>> chkguardcellsinput = F
>>>
>>> chkguardcellsoutput = F
>>>
>>> converttoconsvdformeshcalls = F
>>>
>>> converttoconsvdinmeshinterp = T
>>>
>>> corners = F
>>>
>>> dr_printtsteploc = T
>>>
>>> dr_shortenlaststepbeforetmax = F
>>>
>>> dr_useposdefcomputedt = F
>>>
>>> drift_tuples = F
>>>
>>> eachprocwritesownabortlog = F
>>>
>>> eachprocwritessummary = F
>>>
>>> earlyblockdistadjustment = T
>>>
>>> enablemaskedgcfill = F
>>>
>>> flux_correct = T
>>>
>>> geometryoverride = F
>>>
>>> gr_bcenableapplymixedgds = T
>>>
>>> gr_bhphysmaccomm = F
>>>
>>> gr_bhphysmactw = F
>>>
>>> gr_bhuseunifiedtw = F [CHANGED]
>>>
>>> gr_lrefinemaxbytime = F
>>>
>>> gr_lrefinemaxreddobylogr = F
>>>
>>> gr_lrefinemaxreddobytime = F
>>>
>>> gr_pmrpadvancealllevels = F
>>>
>>> gr_pmrpamrerrorchecking = F
>>>
>>> gr_pmrpcartesianpm = F
>>>
>>> gr_pmrpconserve = F
>>>
>>> gr_pmrpconsvfluxdensities = T
>>>
>>> gr_pmrpconsvfluxes = F
>>>
>>> gr_pmrpcurvilinear = F
>>>
>>> gr_pmrpcurvilinearconserve = F
>>>
>>> gr_pmrpcylindricalpm = F
>>>
>>> gr_pmrpdiagonals = T
>>>
>>> gr_pmrpedgevalue = T
>>>
>>> gr_pmrpedgevalueinteg = F
>>>
>>> gr_pmrpemptycells = F
>>>
>>> gr_pmrpforceconsistency = T
>>>
>>> gr_pmrplsingularline = F
>>>
>>> gr_pmrpnopermanentguardcells = F
>>>
>>> gr_pmrppolarpm = F
>>>
>>> gr_pmrppredcorr = F
>>>
>>> gr_pmrpsphericalpm = F
>>>
>>> gr_pmrptimingmpi = F
>>>
>>> gr_pmrptimingmpix = F
>>>
>>> gr_pmrpvardt = F
>>>
>>> grav_temporal_extrp = F
>>>
>>> grav_unjunkpden = T
>>>
>>> grv_bhewaldalwaysgenerate = T
>>>
>>> grv_bhlinearinterpolonlyv42 = T
>>>
>>> grv_bhuserelaccerr = F
>>>
>>> grv_useexternalpotential = F
>>>
>>> grv_usepoissonpotential = T
>>>
>>> ignoreforcedplot = F
>>>
>>> io_writemscalarintegrals = F
>>>
>>> killdivb = T [CHANGED]
>>>
>>> plotfilegridquantitydp = F
>>>
>>> plotfilemetadatadp = F
>>>
>>> reducegcellfills = F
>>>
>>> refine_on_particle_count = F
>>>
>>> restart = F
>>>
>>> summaryoutputonly = F
>>>
>>> threadblocklistbuild = F
>>>
>>> threaddriverblocklist = F
>>>
>>> threaddriverwithinblock = F
>>>
>>> threadeoswithinblock = F
>>>
>>> threadhydroblocklist = F
>>>
>>> threadhydrowithinblock = F
>>>
>>> threadraytracebuild = F
>>>
>>> threadwithinblockbuild = F
>>>
>>> typematchedxfer = T
>>>
>>> unbiased_geometry = F
>>>
>>> updategravity = T
>>>
>>> updatehydrofluxes = T
>>>
>>> useburn = F
>>>
>>> usecollectivehdf5 = T
>>>
>>> useconductivity = F
>>>
>>> usecool = F
>>>
>>> usecosmology = F
>>>
>>> usedeleptonize = F
>>>
>>> usediffuse = F
>>>
>>> usediffusecomputedtspecies = F
>>>
>>> usediffusecomputedttherm = F
>>>
>>> usediffusecomputedtvisc = F
>>>
>>> usediffusecomputedtmagnetic = F
>>>
>>> useenergydeposition = F
>>>
>>> useflame = F
>>>
>>> usegravity = T
>>>
>>> useheat = F
>>>
>>> useheatexchange = F
>>>
>>> usehydro = T
>>>
>>> useincompns = F
>>>
>>> useionize = F
>>>
>>> uselegacylabels = T
>>>
>>> usemagneticresistivity = F
>>>
>>> usemassdiffusivity = F
>>>
>>> useopacity = F
>>>
>>> useparticles = F
>>>
>>> useplasmastate = F
>>>
>>> usepolytrope = T [CHANGED]
>>>
>>> useprimordialchemistry = F
>>>
>>> useprotonemission = F
>>>
>>> useprotonimaging = F
>>>
>>> useradtrans = F
>>>
>>> useraytrace = F
>>>
>>> usests = F
>>>
>>> usestsfordiffusion = F
>>>
>>> usestir = F
>>>
>>> usethomsonscattering = F
>>>
>>> usetreeray = F
>>>
>>> useturb = T
>>>
>>> useviscosity = F
>>>
>>> usexrayimaging = F
>>>
>>> use_cma_advection = F
>>>
>>> use_cma_flattening = F
>>>
>>> use_flash_surr_blks_fill = T
>>>
>>> use_reduced_orrery = T
>>>
>>> use_steepening = T
>>>
>>> writestatsummary = T
>>>
>>>
>>>
>>> ==============================================================================
>>>
>>>
>>> WARNING: Ignored Parameters :
>>>
>>> These parameters were found in the flash.par file, but they were
>>>
>>> not declared in any Config file for the simulation!
>>>
>>>
>>> energyFix
>>>
>>>
>>>
>>> ==============================================================================
>>>
>>>
>>> Known units of measurement:
>>>
>>>
>>> Unit CGS Value Base Unit
>>>
>>> 1 cm 1.0000 cm
>>>
>>> 2 s 1.0000 s
>>>
>>> 3 g 1.0000 g
>>>
>>> 4 K 1.0000 K
>>>
>>> 5 esu 1.0000 esu
>>>
>>> 6 mol 1.0000 mol
>>>
>>> 7 m 100.00 cm
>>>
>>> 8 km 1.00000E+05 cm
>>>
>>> 9 pc 3.08568E+18 cm
>>>
>>> 10 kpc 3.08568E+21 cm
>>>
>>> 11 Mpc 3.08568E+24 cm
>>>
>>> 12 Gpc 3.08568E+27 cm
>>>
>>> 13 Rsun 6.96000E+10 cm
>>>
>>> 14 AU 1.49598E+13 cm
>>>
>>> 15 yr 3.15569E+07 s
>>>
>>> 16 Myr 3.15569E+13 s
>>>
>>> 17 Gyr 3.15569E+16 s
>>>
>>> 18 kg 1000.0 g
>>>
>>> 19 Msun 1.98892E+33 g
>>>
>>> 20 amu 1.66054E-24 g
>>>
>>> 21 eV 11605. K
>>>
>>> 22 C 2.99792E+09 esu
>>>
>>> 23 LFLY 3.08568E+24 cm
>>>
>>> 24 TFLY 2.05759E+17 s
>>>
>>> 25 MFLY 9.88470E+45 g
>>>
>>> 26 clLength 3.08568E+24 cm
>>>
>>> 27 clTime 3.15569E+16 s
>>>
>>> 28 clMass 1.98892E+48 g
>>>
>>> 29 clTemp 1.16045E+07 K
>>>
>>> -----------End of Units--------------------
>>>
>>>
>>> Known physical constants:
>>>
>>>
>>> Constant Name Constant Value cm s g K esu mol
>>>
>>> 1 Newton 6.67408E-08 3.0 -2.0 -1.0 0.0 0.0 0.0
>>>
>>> 2 speed of light 2.99792E+10 1.0 -1.0 0.0 0.0 0.0 0.0
>>>
>>> 3 Planck 6.62607E-27 2.0 -1.0 1.0 0.0 0.0 0.0
>>>
>>> 4 electron charge 4.80320E-10 0.0 0.0 0.0 0.0 1.0 0.0
>>>
>>> 5 electron mass 9.10938E-28 0.0 0.0 1.0 0.0 0.0 0.0
>>>
>>> 6 proton mass 1.67262E-24 0.0 0.0 1.0 0.0 0.0 0.0
>>>
>>> 7 fine-structure 7.29735E-03 0.0 0.0 0.0 0.0 0.0 0.0
>>>
>>> 8 Avogadro 6.02214E+23 0.0 0.0 0.0 0.0 0.0 -1.0
>>>
>>> 9 Boltzmann 1.38065E-16 2.0 -2.0 1.0 -1.0 0.0 0.0
>>>
>>> 10 ideal gas constant 8.31446E+07 2.0 -2.0 1.0 -1.0 0.0 -1.0
>>>
>>> 11 Wien 0.28978 1.0 0.0 0.0 1.0 0.0 0.0
>>>
>>> 12 Stefan-Boltzmann 5.67037E-05 0.0 -3.0 1.0 -4.0 0.0 0.0
>>>
>>> 13 Radiation Constant 7.56572E-15 -1.0 -2.0 1.0 -4.0 0.0 0.0
>>>
>>> 14 pi 3.1416 0.0 0.0 0.0 0.0 0.0 0.0
>>>
>>> 15 e 2.7183 0.0 0.0 0.0 0.0 0.0 0.0
>>>
>>> 16 Euler 0.57722 0.0 0.0 0.0 0.0 0.0 0.0
>>>
>>>
>>> ==============================================================================
>>>
>>>
>>> Multifluid database: not configured in
>>>
>>>
>>>
>>> ==============================================================================
>>>
>>> [ 12-05-2019 12:02:26.975 ] [gr_initGeometry] checking BCs for idir: 1
>>>
>>> [ 12-05-2019 12:02:26.976 ] [gr_initGeometry] checking BCs for idir: 2
>>>
>>> [ 12-05-2019 12:02:26.976 ] [gr_initGeometry] checking BCs for idir: 3
>>>
>>> [ 12-05-2019 12:02:27.074 ] [GRID amr_refine_derefine]: initiating
>>> refinement
>>>
>>> [ 12-05-2019 12:02:27.302 ] [GRID amr_refine_derefine]: redist. phase.
>>> tot blks requested: 9
>>>
>>> [GRID amr_refine_derefine] min blks 0 max blks 1 tot blks 9
>>>
>>> [GRID amr_refine_derefine] min leaf blks 0 max leaf blks 1 tot leaf blks
>>> 8
>>>
>>> [ 12-05-2019 12:02:27.543 ] [GRID amr_refine_derefine]: refinement
>>> complete
>>>
>>> [ 12-05-2019 12:02:27.968 ] [GRID gr_expandDomain]: iteration=1, create
>>> level=3
>>>
>>> INFO: Grid_fillGuardCells is ignoring masking.
>>>
>>> [ 12-05-2019 12:02:28.078 ] [mpi_amr_comm_setup]: buffer_dim_send=74785,
>>> buffer_dim_recv=1
>>>
>>> [ 12-05-2019 12:02:28.157 ] [GRID amr_refine_derefine]: initiating
>>> refinement
>>>
>>> [ 12-05-2019 12:02:29.011 ] [GRID amr_refine_derefine]: redist. phase.
>>> tot blks requested: 73
>>>
>>> [GRID amr_refine_derefine] min blks 0 max blks 1 tot blks 73
>>>
>>> [GRID amr_refine_derefine] min leaf blks 0 max leaf blks 1 tot leaf blks
>>> 64
>>>
>>> [ 12-05-2019 12:02:29.440 ] [GRID amr_refine_derefine]: refinement
>>> complete
>>>
>>> [ 12-05-2019 12:02:29.833 ] [GRID gr_expandDomain]: iteration=2, create
>>> level=4
>>>
>>> [ 12-05-2019 12:02:30.253 ] [GRID amr_refine_derefine]: initiating
>>> refinement
>>>
>>> [ 12-05-2019 12:02:30.322 ] [GRID amr_refine_derefine]: redist. phase.
>>> tot blks requested: 585
>>>
>>> [GRID amr_refine_derefine] min blks 2 max blks 4 tot blks 585
>>>
>>> [GRID amr_refine_derefine] min leaf blks 1 max leaf blks 3 tot leaf blks
>>> 512
>>>
>>> [ 12-05-2019 12:02:32.046 ] [GRID amr_refine_derefine]: refinement
>>> complete
>>>
>>> [ 12-05-2019 12:02:32.564 ] [GRID gr_expandDomain]: iteration=3, create
>>> level=5
>>>
>>> [ 12-05-2019 12:02:32.684 ] [mpi_amr_comm_setup]:
>>> buffer_dim_send=233185, buffer_dim_recv=135889
>>>
>>> [ 12-05-2019 12:02:33.443 ] [GRID amr_refine_derefine]: initiating
>>> refinement
>>>
>>> [ 12-05-2019 12:02:34.807 ] [GRID amr_refine_derefine]: redist. phase.
>>> tot blks requested: 3033
>>>
>>> [GRID amr_refine_derefine] min blks 11 max blks 14 tot blks 3033
>>>
>>> [GRID amr_refine_derefine] min leaf blks 9 max leaf blks 11 tot leaf
>>> blks 2654
>>>
>>> [ 12-05-2019 12:02:35.561 ] [GRID amr_refine_derefine]: refinement
>>> complete
>>>
>>> [ 12-05-2019 12:02:35.977 ] [GRID gr_expandDomain]: iteration=4, create
>>> level=6
>>>
>>> [ 12-05-2019 12:02:36.173 ] [mpi_amr_comm_setup]:
>>> buffer_dim_send=575965, buffer_dim_recv=351529
>>>
>>> [ 12-05-2019 12:02:37.611 ] [GRID amr_refine_derefine]: initiating
>>> refinement
>>>
>>> [ 12-05-2019 12:02:40.401 ] [GRID amr_refine_derefine]: redist. phase.
>>> tot blks requested: 6297
>>>
>>> [GRID amr_refine_derefine] min blks 24 max blks 26 tot blks 6297
>>>
>>> [GRID amr_refine_derefine] min leaf blks 20 max leaf blks 23 tot leaf
>>> blks 5510
>>>
>>> [ 12-05-2019 12:02:41.154 ] [GRID amr_refine_derefine]: refinement
>>> complete
>>>
>>> [ 12-05-2019 12:02:41.544 ] [GRID gr_expandDomain]: iteration=5, create
>>> level=7
>>>
>>> [ 12-05-2019 12:02:41.815 ] [mpi_amr_comm_setup]:
>>> buffer_dim_send=540997, buffer_dim_recv=411109
>>>
>>> [ 12-05-2019 12:02:44.513 ] [GRID amr_refine_derefine]: initiating
>>> refinement
>>>
>>> [ 12-05-2019 12:02:46.394 ] [GRID amr_refine_derefine]: redist. phase.
>>> tot blks requested: 16185
>>>
>>> [GRID amr_refine_derefine] min blks 62 max blks 65 tot blks 16185
>>>
>>> [GRID amr_refine_derefine] min leaf blks 54 max leaf blks 57 tot leaf
>>> blks 14162
>>>
>>> [ 12-05-2019 12:02:47.277 ] [GRID amr_refine_derefine]: refinement
>>> complete
>>>
>>> [ 12-05-2019 12:02:47.696 ] [GRID gr_expandDomain]: iteration=6, create
>>> level=8
>>>
>>> [ 12-05-2019 12:02:48.106 ] [mpi_amr_comm_setup]:
>>> buffer_dim_send=881329, buffer_dim_recv=699133
>>>
>>> [ 12-05-2019 12:02:50.281 ] [GRID amr_refine_derefine]: initiating
>>> refinement
>>>
>>> [ 12-05-2019 12:02:51.436 ] [GRID amr_refine_derefine]: redist. phase.
>>> tot blks requested: 89497
>>>
>>> [GRID amr_refine_derefine] min blks 348 max blks 351 tot blks 89497
>>>
>>> [GRID amr_refine_derefine] min leaf blks 304 max leaf blks 308 tot leaf
>>> blks 78310
>>>
>>> [ 12-05-2019 12:02:53.153 ] [GRID amr_refine_derefine]: refinement
>>> complete
>>>
>>> [ 12-05-2019 12:02:53.443 ] [GRID gr_expandDomain]: iteration=7, create
>>> level=9
>>>
>>> [ 12-05-2019 12:02:56.118 ] [mpi_amr_comm_setup]:
>>> buffer_dim_send=1952545, buffer_dim_recv=1773109
>>>
>>> [ 12-05-2019 12:03:03.333 ] [GRID amr_refine_derefine]: initiating
>>> refinement
>>>
>>> [ 12-05-2019 12:03:07.980 ] [GRID amr_refine_derefine]: redist. phase.
>>> tot blks requested: 626345
>>>
>>> [GRID amr_refine_derefine] min blks 2445 max blks 2449 tot blks 626345
>>>
>>> [GRID amr_refine_derefine] min leaf blks 2139 max leaf blks 2142 tot
>>> leaf blks 548052
>>>
>>> [ 12-05-2019 12:03:13.985 ] [GRID amr_refine_derefine]: refinement
>>> complete
>>>
>>> [ 12-05-2019 12:03:14.055 ] [GRID gr_expandDomain]: iteration=8, create
>>> level=9
>>>
>>> [ 12-05-2019 12:03:25.399 ] [mpi_amr_comm_setup]:
>>> buffer_dim_send=6183577, buffer_dim_recv=5623945
>>>
>>> [ 12-05-2019 12:03:55.420 ] [GRID amr_refine_derefine]: initiating
>>> refinement
>>>
>>> [ 12-05-2019 12:03:58.377 ] [GRID amr_refine_derefine]: redist. phase.
>>> tot blks requested: 97801
>>>
>>> [GRID amr_refine_derefine] min blks 381 max blks 384 tot blks 97801
>>>
>>> [GRID amr_refine_derefine] min leaf blks 333 max leaf blks 336 tot leaf
>>> blks 85576
>>>
>>> [ 12-05-2019 12:04:01.822 ] [GRID amr_refine_derefine]: refinement
>>> complete
>>>
>>> [ 12-05-2019 12:04:01.849 ] [GRID gr_expandDomain]: iteration=9, create
>>> level=9
>>>
>>> [ 12-05-2019 12:04:03.709 ] [mpi_amr_comm_setup]:
>>> buffer_dim_send=2129845, buffer_dim_recv=1958629
>>>
>>> [ 12-05-2019 12:04:11.340 ] [GRID amr_refine_derefine]: initiating
>>> refinement
>>>
>>> [ 12-05-2019 12:04:15.464 ] [GRID amr_refine_derefine]: redist. phase.
>>> tot blks requested: 626345
>>>
>>> [GRID amr_refine_derefine] min blks 2445 max blks 2449 tot blks 626345
>>>
>>> [GRID amr_refine_derefine] min leaf blks 2139 max leaf blks 2142 tot
>>> leaf blks 548052
>>>
>>> [ 12-05-2019 12:04:24.519 ] [GRID amr_refine_derefine]: refinement
>>> complete
>>>
>>> [ 12-05-2019 12:04:24.588 ] [GRID gr_expandDomain]: iteration=10, create
>>> level=9
>>>
>>> [ 12-05-2019 12:04:35.959 ] [mpi_amr_comm_setup]:
>>> buffer_dim_send=6183577, buffer_dim_recv=5623945
>>>
>>> [ 12-05-2019 12:05:06.370 ] [GRID amr_refine_derefine]: initiating
>>> refinement
>>>
>>> [ 12-05-2019 12:05:09.335 ] [GRID amr_refine_derefine]: redist. phase.
>>> tot blks requested: 97801
>>>
>>> [GRID amr_refine_derefine] min blks 381 max blks 384 tot blks 97801
>>>
>>> [GRID amr_refine_derefine] min leaf blks 333 max leaf blks 336 tot leaf
>>> blks 85576
>>>
>>> [ 12-05-2019 12:05:12.763 ] [GRID amr_refine_derefine]: refinement
>>> complete
>>>
>>> [ 12-05-2019 12:05:12.790 ] [GRID gr_expandDomain]: iteration=11, create
>>> level=9
>>>
>>> [ 12-05-2019 12:05:14.289 ] [gr_ensureValidNeighborInfo] found
>>> mpi_pattern_id: -10
>>>
>>> [ 12-05-2019 12:05:14.317 ] [mpi_amr_comm_setup]: buffer_dim_send=73525,
>>> buffer_dim_recv=62437
>>>
>>> [ 12-05-2019 12:05:14.806 ] memory: /proc vsize (MiB): 2918.54 (min)
>>> 3030.76 (max) 2946.03 (avg)
>>>
>>> [ 12-05-2019 12:05:14.807 ] memory: /proc rss (MiB): 1798.14 (min)
>>> 1883.50 (max) 1830.61 (avg)
>>>
>>> [ 12-05-2019 12:05:14.808 ] memory: /proc vsize (MiB): 2918.54 (min)
>>> 3030.76 (max) 2946.03 (avg)
>>>
>>> [ 12-05-2019 12:05:14.809 ] memory: /proc rss (MiB): 1798.18 (min)
>>> 1883.53 (max) 1830.67 (avg)
>>>
>>> [ 12-05-2019 12:05:17.553 ] [BHTree]: Coefficients in the Taylor
>>> expansion written into file
>>>
>>>
>>>
>>> The error file says:
>>>
>>>
>>> Fatal error in PMPI_Waitall: See the MPI_ERROR field in MPI_Status for
>>> the error code
>>>
>>>
>>> The out file says:
>>>
>>>
>>> RuntimeParameters_read: ignoring unknown parameter "energyFix"...
>>>
>>> Grid_init: resolution based on runtime params:
>>>
>>> lrefine dx dy dz
>>>
>>> 3 9.643E+17 9.643E+17 9.643E+17
>>>
>>> 4 4.821E+17 4.821E+17 4.821E+17
>>>
>>> 5 2.411E+17 2.411E+17 2.411E+17
>>>
>>> 6 1.205E+17 1.205E+17 1.205E+17
>>>
>>> 7 6.027E+16 6.027E+16 6.027E+16
>>>
>>> 8 3.013E+16 3.013E+16 3.013E+16
>>>
>>> 9 1.507E+16 1.507E+16 1.507E+16
>>>
>>> MaterialProperties initialized
>>>
>>> Cosmology initialized
>>>
>>> Initializing Polytropic Equation of State
>>>
>>> Source terms initialized
>>>
>>> iteration, no. not moved = 0 0
>>>
>>> refined: total leaf blocks = 8
>>>
>>> refined: total blocks = 9
>>>
>>> [amr_morton_process]: Initializing surr_blks using standard orrery
>>> implementation
>>>
>>> INFO: Grid_fillGuardCells is ignoring masking.
>>>
>>> iteration, no. not moved = 0 6
>>>
>>> iteration, no. not moved = 1 0
>>>
>>> refined: total leaf blocks = 64
>>>
>>> refined: total blocks = 73
>>>
>>> iteration, no. not moved = 0 46
>>>
>>> iteration, no. not moved = 1 2
>>>
>>> iteration, no. not moved = 2 0
>>>
>>> refined: total leaf blocks = 512
>>>
>>> refined: total blocks = 585
>>>
>>> iteration, no. not moved = 0 459
>>>
>>> iteration, no. not moved = 1 58
>>>
>>> iteration, no. not moved = 2 0
>>>
>>> refined: total leaf blocks = 2654
>>>
>>> refined: total blocks = 3033
>>>
>>> iteration, no. not moved = 0 2675
>>>
>>> iteration, no. not moved = 1 765
>>>
>>> iteration, no. not moved = 2 0
>>>
>>> refined: total leaf blocks = 5510
>>>
>>> refined: total blocks = 6297
>>>
>>> iteration, no. not moved = 0 5642
>>>
>>> iteration, no. not moved = 1 1146
>>>
>>> iteration, no. not moved = 2 0
>>>
>>> refined: total leaf blocks = 14162
>>>
>>> refined: total blocks = 16185
>>>
>>> iteration, no. not moved = 0 15209
>>>
>>> iteration, no. not moved = 1 1503
>>>
>>> iteration, no. not moved = 2 0
>>>
>>> refined: total leaf blocks = 78310
>>>
>>> refined: total blocks = 89497
>>>
>>> iteration, no. not moved = 0 79298
>>>
>>> iteration, no. not moved = 1 4927
>>>
>>> iteration, no. not moved = 2 0
>>>
>>> refined: total leaf blocks = 548052
>>>
>>> refined: total blocks = 626345
>>>
>>> iteration, no. not moved = 0 91484
>>>
>>> iteration, no. not moved = 1 36351
>>>
>>> iteration, no. not moved = 2 0
>>>
>>> refined: total leaf blocks = 85576
>>>
>>> refined: total blocks = 97801
>>>
>>> iteration, no. not moved = 0 89804
>>>
>>> iteration, no. not moved = 1 6879
>>>
>>> iteration, no. not moved = 2 0
>>>
>>> refined: total leaf blocks = 548052
>>>
>>> refined: total blocks = 626345
>>>
>>> iteration, no. not moved = 0 91447
>>>
>>> iteration, no. not moved = 1 36080
>>>
>>> iteration, no. not moved = 2 0
>>>
>>> refined: total leaf blocks = 85576
>>>
>>> refined: total blocks = 97801
>>>
>>> NSIZE = 4 1
>>>
>>> Tree solver initialized
>>>
>>> Finished with Grid_initDomain, no restart
>>>
>>> Ready to call Hydro_init
>>>
>>> Hydro initialized
>>>
>>> Gravity initialized
>>>
>>> Initial dt verified
>>>
>>>
>>> I’m not sure what the reason for this kind of crash is. Have you ever
>>> experienced something similar?
>>>
>>> My guess is that it is a communication problem due to the use of
>>> mpich-3.2.1, but I’m not sure.
>>>
>>>
>>> Cheers,
>>>
>>>
>>
>> --
>> Eddie Hansen, PhD
>> Postdoctoral Scholar
>> University of Chicago
>> 607-341-6126 | Flash Center
>>
>>
>>

-- 
Eddie Hansen, PhD
Postdoctoral Scholar
University of Chicago
607-341-6126 | Flash Center

