[zorzi@cees-tool-8 object]$ mpirun -np 1 flash4
--------------------------------------------------------------------------
No OpenFabrics connection schemes reported that they were able to be
used on a specific port.  As such, the openib BTL (OpenFabrics
support) will be disabled for this port.

  Local host:           cees-tool-8
  Local device:         mlx4_1
  Local port:           1
  CPCs attempted:       rdmacm, udcm
--------------------------------------------------------------------------
 Grid_init: resolution based on runtime params:
  lrefine      dx        dy
     1       0.125     0.125
     2       0.062     0.062
     3       0.031     0.031
     4       0.016     0.016
     5       0.008     0.008
     6       0.004     0.004
 MaterialProperties initialized
 Cosmology initialized
 Source terms initialized
  iteration, no. not moved =   0   0
 refined: total leaf blocks =   1
 refined: total blocks =   1
 [amr_morton_process]: Initializing surr_blks using standard orrery implementation
 INFO: Grid_fillGuardCells is ignoring masking.
  iteration, no. not moved =   0   0
 refined: total leaf blocks =   4
 refined: total blocks =   5
  iteration, no. not moved =   0   0
 refined: total leaf blocks =   16
 refined: total blocks =   21
  iteration, no. not moved =   0   5
  iteration, no. not moved =   1   0
 refined: total leaf blocks =   49
 refined: total blocks =   65
  iteration, no. not moved =   0   22
  iteration, no. not moved =   1   0
 refined: total leaf blocks =   121
 refined: total blocks =   161
  iteration, no. not moved =   0   66
  iteration, no. not moved =   1   0
 refined: total leaf blocks =   280
 refined: total blocks =   373
  iteration, no. not moved =   0   0
 refined: total leaf blocks =   268
 refined: total blocks =   357
 Finished with Grid_initDomain, no restart
 Ready to call Hydro_init
 Hydro initialized
 Gravity initialized
 Initial dt verified
 *** Wrote checkpoint file to test_hdf5_chk_0000 ****
 *** Wrote plotfile to test_hdf5_plt_cnt_0000 ****
 Initial plotfile written
 Driver init all done
   n       t           dt       (    x,        y,       z)   |  dt_hydro
   1  1.0000E-10  2.0000E-10  ( 0.377  ,  0.654  ,  0.00  ) |  3.123E-01
   2  3.0000E-10  4.0000E-10  ( 0.432  ,  0.529  ,  0.00  ) |  3.123E-01
   3  7.0000E-10  8.0000E-10  ( 0.432  ,  0.529  ,  0.00  ) |  3.123E-01
   4  1.5000E-09  1.6000E-09  ( 0.432  ,  0.529  ,  0.00  ) |  3.123E-01
   5  3.1000E-09  3.2000E-09  ( 0.432  ,  0.529  ,  0.00  ) |  3.123E-01
   6  6.3000E-09  6.4000E-09  ( 0.432  ,  0.529  ,  0.00  ) |  3.123E-01
   7  1.2700E-08  1.2800E-08  ( 0.432  ,  0.529  ,  0.00  ) |  3.123E-01
   8  2.5500E-08  2.5600E-08  ( 0.432  ,  0.529  ,  0.00  ) |  3.123E-01
   9  5.1100E-08  5.1200E-08  ( 0.580  ,  0.553  ,  0.00  ) |  3.123E-01
 *** Wrote plotfile to test_hdf5_plt_cnt_0001 ****
  10  1.0230E-07  1.0240E-07  ( 0.432  ,  0.529  ,  0.00  ) |  3.123E-01
  11  2.0470E-07  2.0480E-07  ( 0.432  ,  0.529  ,  0.00  ) |  3.123E-01
  12  4.0950E-07  4.0960E-07  ( 0.549  ,  0.521  ,  0.00  ) |  3.123E-01
  13  8.1910E-07  8.1920E-07  ( 0.549  ,  0.521  ,  0.00  ) |  3.123E-01
  14  1.6383E-06  1.6384E-06  ( 0.596  ,  0.627  ,  0.00  ) |  3.123E-01
  15  3.2767E-06  3.2768E-06  ( 0.580  ,  0.658  ,  0.00  ) |  3.122E-01
  16  6.5535E-06  6.5536E-06  ( 0.596  ,  0.627  ,  0.00  ) |  3.122E-01
  17  1.3107E-05  1.3107E-05  ( 0.580  ,  0.658  ,  0.00  ) |  3.122E-01
  18  2.6214E-05  2.6214E-05  ( 0.580  ,  0.658  ,  0.00  ) |  3.122E-01
  19  5.2429E-05  5.2429E-05  ( 0.580  ,  0.658  ,  0.00  ) |  3.122E-01
 *** Wrote plotfile to test_hdf5_plt_cnt_0002 ****
  20  1.0486E-04  1.0486E-04  ( 0.549  ,  0.521  ,  0.00  ) |  3.122E-01
  21  2.0972E-04  2.0972E-04  ( 0.588  ,  0.502  ,  0.00  ) |  3.121E-01
  22  4.1943E-04  4.1943E-04  ( 0.459  ,  0.502  ,  0.00  ) |  3.120E-01
  23  8.3886E-04  8.3886E-04  ( 0.373  ,  0.654  ,  0.00  ) |  2.551E-01
  iteration, no. not moved =   0   152
  iteration, no. not moved =   1   0
 refined: total leaf blocks =   271
 refined: total blocks =   361
  24  1.6777E-03  1.0000E-10  ( 0.436  ,  0.631  ,  0.00  ) |  6.435E-12
  25  1.6777E-03  1.0000E-10  ( 0.404  ,  0.611  ,  0.00  ) |  1.175E-40
  iteration, no. not moved =   0   110
  iteration, no. not moved =   1   0
 refined: total leaf blocks =   280
 refined: total blocks =   373
 dtCheck=   0.0000000000000000
 Driver_abort called. See log file for details.
 Error message is [Hydro]: Computed dt is not positive! Aborting!
 Calling MPI_Abort() for shutdown in 2 seconds!
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------