RuntimeParameters_read: ignoring unknown parameter "ed_useLaserIO"...
RuntimeParameters_read: ignoring unknown parameter "ed_laserIOMaxNumberOfPositions"...
RuntimeParameters_read: ignoring unknown parameter "ed_laserIOMaxNumberOfRays"...
 NOTE: Enabling curvilinear, cartesian_pm/cylindrical_pm/spherical_pm/polar_pm so far was F F F F
 Grid_init: resolution based on runtime params:
 lrefine        dx           dy
    1       2.500E-04    2.500E-04
    2       1.250E-04    1.250E-04
    3       6.250E-05    6.250E-05
    4       3.125E-05    3.125E-05
 MaterialProperties initialized
 Cosmology initialized
 [eos_tabBrowseIonmix4Tables] IONMIX4 file found: he-imx-005.cn4
 [eos_tabBrowseIonmix4Tables] IONMIX4 file found: al-imx-003.cn4
 in eos_inittabulated, tableName = he-imx-005.cn4
 in eos_inittabulated, groupName = -none-
 in eos_inittabulated, tableName = he-imx-005.cn4
 in eos_inittabulated, groupName = -none-
 in eos_inittabulated, tableName = al-imx-003.cn4
 in eos_inittabulated, groupName = -none-
 in eos_inittabulated, tableName = al-imx-003.cn4
 in eos_inittabulated, groupName = -none-
 RadTrans initialized
 [EnergyDeposition_init] INFO: Using ed_irradVar= 16 ed_irradVarName=lase
 Source terms initialized
 iteration, no. not moved =  0 0
 refined: total leaf blocks =  2
 refined: total blocks =  2
 [amr_morton_process]: Initializing surr_blks using standard orrery implementation
 INFO: Grid_fillGuardCells is ignoring masking.
 iteration, no. not moved =  0 1
 iteration, no. not moved =  1 0
 refined: total leaf blocks =  5
 refined: total blocks =  6
 iteration, no. not moved =  0 1
 iteration, no. not moved =  1 0
 refined: total leaf blocks =  20
 refined: total blocks =  26
 iteration, no. not moved =  0 4
 iteration, no. not moved =  1 0
 refined: total leaf blocks =  44
 refined: total blocks =  58
 Finished with Grid_initDomain, no restart
 Ready to call Hydro_init
 [Hydro_init]: Using non-Cartesian Geometry!
 Hydro initialized
 Gravity initialized
 Initial dt verified
 *** Wrote checkpoint file to /nobackup1/tmarkj/FLASH_runs/biermann_test_3D_output/gasjetexp_hdf5_chk_0000 ****
HDF5-DIAG: Error detected in HDF5 (1.10.5) MPI-process 1:
  #000: H5F.c line 444 in H5Fcreate(): unable to create file
    major: File accessibilty
    minor: Unable to open file
  #001: H5Fint.c line 1498 in H5F_open(): unable to open file: time = Wed Aug 30 13:14:07 2023
, name = '/nobackup1/tmarkj/FLASH_runs/biermann_test_3D_output/gasjetexp_hdf5_plt_cnt_0000à|', tent_flags = 13
    major: File accessibilty
    minor: Unable to open file
  #002: H5FD.c line 734 in H5FD_open(): open failed
    major: Virtual File Layer
    minor: Unable to initialize object
  #003: H5FDmpio.c line 998 in H5FD_mpio_open(): MPI_File_open failed
    major: Internal error (too specific to document in detail)
    minor: Some MPI function failed
  #004: H5FDmpio.c line 998 in H5FD_mpio_open(): MPI_ERR_NO_SUCH_FILE: no such file or directory
    major: Internal error (too specific to document in detail)
    minor: MPI Error String

Program received signal SIGSEGV: Segmentation fault - invalid memory reference.
Backtrace for this error:
HDF5-DIAG: Error detected in HDF5 (1.10.5) MPI-process 0:
  #000: H5F.c line 444 in H5Fcreate(): unable to create file
    major: File accessibilty
    minor: Unable to open file
  #001: H5Fint.c line 1498 in H5F_open(): unable to open file: time = Wed Aug 30 13:14:07 2023
, name = '/nobackup1/tmarkj/FLASH_runs/biermann_test_3D_output/gasjetexp_hdf5_plt_cnt_0000file', tent_flags = 13
    major: File accessibilty
    minor: Unable to open file
  #002: H5FD.c line 734 in H5FD_open(): open failed
    major: Virtual File Layer
    minor: Unable to initialize object
  #003: H5FDmpio.c line 998 in H5FD_mpio_open(): MPI_File_open failed
    major: Internal error (too specific to document in detail)
    minor: Some MPI function failed
  #004: H5FDmpio.c line 998 in H5FD_mpio_open(): MPI_ERR_IO: input/output error
    major: Internal error (too specific to document in detail)
    minor: MPI Error String
 Error: unable to initialize file
 Driver_abort called. See log file for details.
 Error message is unable to initialize hdf5 file
 Calling MPI_Abort() for shutdown in 2 seconds!

#0  0x2aaaacc4c3af in ???
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
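Both ranks die inside H5Fcreate(), and the plot-file name in each error stack carries different trailing garbage ('...0000à|' on rank 1, '...0000file' on rank 0) even though the checkpoint file with the same prefix was written fine. That pattern is consistent with a filename buffer that is unterminated or overrun by the time it reaches HDF5's C layer, so each rank reads past the end of the string into whatever happens to follow. The snippet below is a minimal standalone sketch of the parallel file-creation path named in the trace (H5Fcreate over the MPI-IO driver, i.e. the H5FDmpio frames); it is illustrative only, not FLASH's actual io_h5 code, and the filename is copied from the log:

/* Minimal sketch of the call path in the error stack above:
 * H5Fcreate -> H5FD_mpio_open -> MPI_File_open.
 * Illustrative only; not FLASH's actual I/O code. If `name` were
 * not NUL-terminated when handed to this layer (as the trailing
 * garbage in the trace suggests), H5Fcreate would see a corrupted
 * path and fail exactly as logged. */
#include <stdio.h>
#include <mpi.h>
#include <hdf5.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    /* Filename taken verbatim from the log. */
    const char *name =
        "/nobackup1/tmarkj/FLASH_runs/biermann_test_3D_output/"
        "gasjetexp_hdf5_plt_cnt_0000";

    /* File-access property list selecting the MPI-IO virtual file
     * driver, matching the H5FDmpio frames of the error stack. */
    hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);
    H5Pset_fapl_mpio(fapl, MPI_COMM_WORLD, MPI_INFO_NULL);

    /* Collective create: every rank must pass an identical, intact
     * name, or the open fails (or worse, diverges across ranks). */
    hid_t file = H5Fcreate(name, H5F_ACC_TRUNC, H5P_DEFAULT, fapl);
    if (file < 0) {
        /* Same failure mode as the log: HDF5 prints its error stack
         * and the caller aborts, as FLASH's Driver_abort does. */
        fprintf(stderr, "H5Fcreate failed for '%s'\n", name);
        H5Pclose(fapl);
        MPI_Abort(MPI_COMM_WORLD, 1);
    }

    H5Fclose(file);
    H5Pclose(fapl);
    MPI_Finalize();
    return 0;
}

Because every rank passes the name to a collective MPI_File_open, a per-rank corrupted string also explains why rank 0 and rank 1 report different MPI errors (MPI_ERR_IO vs. MPI_ERR_NO_SUCH_FILE) for what should be the same file.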