No OpenFabrics connection schemes reported that they were able to be
used on a specific port.  As such, the openib BTL (OpenFabrics
support) will be disabled for this port.

  Local host:           cees-tool-8
  Local device:         mlx4_1
  Local port:           1
  CPCs attempted:       rdmacm, udcm
--------------------------------------------------------------------------
 Grid_init: resolution based on runtime params:
 lrefine      dx         dy
    1       0.125      0.125
    2       0.062      0.062
    3       0.031      0.031
    4       0.016      0.016
    5       0.008      0.008
    6       0.004      0.004
 MaterialProperties initialized
 Cosmology initialized
 Source terms initialized
 iteration, no. not moved =   0   0
 refined: total leaf blocks =   1
 refined: total blocks =   1
 [amr_morton_process]: Initializing surr_blks using standard orrery implementation
 INFO: Grid_fillGuardCells is ignoring masking.
 iteration, no. not moved =   0   0
 refined: total leaf blocks =   4
 refined: total blocks =   5
 iteration, no. not moved =   0   0
 refined: total leaf blocks =   16
 refined: total blocks =   21
 iteration, no. not moved =   0   5
 iteration, no. not moved =   1   0
 refined: total leaf blocks =   49
 refined: total blocks =   65
 iteration, no. not moved =   0   22
 iteration, no. not moved =   1   0
 refined: total leaf blocks =   121
 refined: total blocks =   161
 iteration, no. not moved =   0   66
 iteration, no. not moved =   1   0
 refined: total leaf blocks =   280
 refined: total blocks =   373
 iteration, no. not moved =   0   0
 refined: total leaf blocks =   268
 refined: total blocks =   357
 Finished with Grid_initDomain, no restart
 Ready to call Hydro_init
 Hydro initialized
 Gravity initialized
 Initial dt verified
 *** Wrote checkpoint file to test_hdf5_chk_0000 ****
 *** Wrote plotfile to test_hdf5_plt_cnt_0000 ****
 Initial plotfile written
 Driver init all done
  n      t           dt          (   x   ,   y   ,   z  )  |  dt_hydro
  1  1.0000E-10  2.0000E-10  ( 0.373 , 0.619 , 0.00 )  |  2.641E-03
  2  3.0000E-10  4.0000E-10  ( 0.436 , 0.631 , 0.00 )  |  2.641E-03
  3  7.0000E-10  8.0000E-10  ( 0.537 , 0.510 , 0.00 )  |  2.641E-03
  4  1.5000E-09  1.6000E-09  ( 0.537 , 0.510 , 0.00 )  |  2.641E-03
  5  3.1000E-09  3.2000E-09  ( 0.600 , 0.564 , 0.00 )  |  2.641E-03
  6  6.3000E-09  6.4000E-09  ( 0.600 , 0.564 , 0.00 )  |  2.641E-03
  7  1.2700E-08  1.2800E-08  ( 0.600 , 0.564 , 0.00 )  |  2.641E-03
  8  2.5500E-08  2.5600E-08  ( 0.600 , 0.564 , 0.00 )  |  2.641E-03
  9  5.1100E-08  5.1200E-08  ( 0.600 , 0.564 , 0.00 )  |  2.641E-03
 *** Wrote plotfile to test_hdf5_plt_cnt_0001 ****
 10  1.0230E-07  1.0240E-07  ( 0.600 , 0.564 , 0.00 )  |  2.641E-03
 11  2.0470E-07  2.0480E-07  ( 0.600 , 0.564 , 0.00 )  |  2.641E-03
 12  4.0950E-07  4.0960E-07  ( 0.600 , 0.564 , 0.00 )  |  2.640E-03
 13  8.1910E-07  8.1920E-07  ( 0.600 , 0.564 , 0.00 )  |  2.639E-03
 14  1.6383E-06  1.6384E-06  ( 0.600 , 0.564 , 0.00 )  |  2.637E-03
 15  3.2767E-06  3.2768E-06  ( 0.600 , 0.564 , 0.00 )  |  2.634E-03
 16  6.5535E-06  6.5536E-06  ( 0.600 , 0.564 , 0.00 )  |  2.626E-03
 17  1.3107E-05  1.3107E-05  ( 0.600 , 0.564 , 0.00 )  |  2.611E-03
 18  2.6214E-05  2.6214E-05  ( 0.600 , 0.564 , 0.00 )  |  2.582E-03
 19  5.2429E-05  5.2429E-05  ( 0.600 , 0.564 , 0.00 )  |  2.526E-03
 *** Wrote plotfile to test_hdf5_plt_cnt_0002 ****
 20  1.0486E-04  1.0486E-04  ( 0.600 , 0.564 , 0.00 )  |  2.422E-03
 21  2.0972E-04  2.0972E-04  ( 0.600 , 0.564 , 0.00 )  |  2.256E-03
 22  4.1943E-04  4.1943E-04  ( 0.600 , 0.564 , 0.00 )  |  2.091E-03
 23  8.3886E-04  8.3886E-04  ( 0.381 , 0.592 , 0.00 )  |  2.108E-03
 24  1.6777E-03  1.6777E-03  ( 0.600 , 0.564 , 0.00 )  |  1.889E-03
 25  3.3554E-03  1.4895E-03  ( 0.404 , 0.682 , 0.00 )  |  1.490E-03
 iteration, no. not moved =   0   152
 iteration, no. not moved =   1   0
 refined: total leaf blocks =   271
 refined: total blocks =   361
 26  4.8450E-03  1.0000E-10  ( 0.600 , 0.564 , 0.00 )  |  2.691E-12
 27  4.8450E-03  1.0000E-10  ( 0.443 , 0.525 , 0.00 )  |  2.176E-32
 28  4.8450E-03  1.0000E-10  ( 0.568 , 0.549 , 0.00 )  |  3.435-232
 29  4.8450E-03  1.0000E-10  ( 0.561 , 0.557 , 0.00 )  |  3.256-296
 *** Wrote plotfile to test_hdf5_plt_cnt_0003 ****
 dtCheck=   0.0000000000000000
 Driver_abort called. See log file for details.
 Error message is [Hydro]: Computed dt is not positive! Aborting!
 Calling MPI_Abort() for shutdown in 2 seconds!
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
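For context on the last column and the abort, a minimal sketch of how an explicit hydro step limit of this kind is usually formed (the exact formula used by the Hydro unit here is an assumption, not shown in this log): a CFL-style estimate over all cells,

\[
\Delta t_{\mathrm{hydro}} \;=\; C_{\mathrm{CFL}} \,\min_{i} \frac{\Delta x_i}{\lvert v_i\rvert + c_{s,i}},
\]

where \(\Delta x_i\) is the cell size, \(v_i\) the fluid speed, and \(c_{s,i}\) the sound speed in cell \(i\). Read this way, dt_hydro collapsing from 2.641E-03 to roughly 1E-232 within four steps after the refinement at step 25 points to a cell whose signal speed has blown up (or whose state has become non-physical), driving the estimate toward zero and triggering the "[Hydro]: Computed dt is not positive!" abort above.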