[FLASH-USERS] cell density goes to zero when refining several times
Christoph Federrath
christoph.federrath at monash.edu
Thu May 29 18:54:39 EDT 2014
Hi Andrea,
if you use the sink particle unit's Jeans refinement, the parameter jeans_ncells_deref must be set at least twice as high as jeans_ncells_ref, e.g.:
jeans_ncells_ref = 32.0
jeans_ncells_deref = 64.0
Did you do that? If jeans_ncells_deref is less than 2 x jeans_ncells_ref, blocks can oscillate back and forth between refinement and derefinement.
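To see why the factor of 2 is needed: when a block derefines, its cell size dx doubles, so the Jeans length measured in units of dx halves. Schematically, the check looks like this (a simplified standalone sketch with made-up values, not the actual FLASH source):

  program jeans_check
    implicit none
    real, parameter :: PI = 3.14159265, G = 6.674e-8
    real :: cs2  = 4.0e8    ! sound speed squared [cm^2/s^2] (made-up)
    real :: dens = 1.0e-20  ! gas density [g/cm^3] (made-up)
    real :: dx   = 1.0e15   ! cell size [cm] (made-up)
    real :: jeans_ncells_ref = 32.0, jeans_ncells_deref = 64.0
    real :: lJ
    lJ = sqrt(PI*cs2/(G*dens))                  ! Jeans length
    if (lJ < jeans_ncells_ref*dx) then
      print *, 'refine'                         ! under-resolved: refine
    else if (lJ > jeans_ncells_deref*dx) then
      print *, 'derefine'                       ! comfortably resolved
    end if
  end program jeans_check

If jeans_ncells_deref < 2 x jeans_ncells_ref, a block that has just derefined (dx -> 2 dx) immediately trips the refinement test again, and the grid oscillates.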
I should also say that the Bouchut MHD solver (Waagan et al. 2011, Journal of Computational Physics) is not yet in the public FLASH version, so we should continue discussing this offline. The only thing I'll say here is that there were some minor bugs in the code that you probably got from somewhere. Who gave you the code? Those bugs can strongly degrade the stability of the solver in certain situations, in particular for supersonic turbulence and non-isothermal gas; among other things, there were problems with the dual energy formalism switch. I fixed those bugs a few months ago, so it is very unlikely that you have the latest version.

The solver itself is extremely well suited for the problem you are trying to attack. However, I do not recommend using the 5-wave version of the solver for MHD, only for HD. If you have magnetic fields present, I recommend using the 3-wave version, a.k.a. HLL3R (see Waagan et al. 2011: http://adsabs.harvard.edu/abs/2011JCoPh.230.3331W).
Now, the problems you are having might still originate in some of your custom refinement, but I suspect the points above may be contributing as well.
Kind regards,
Christoph
________________________________
Dr. Christoph Federrath
Monash Centre for Astrophysics,
School of Mathematical Sciences,
Monash University,
Clayton, VIC 3800, Australia
+61 3 9905 9760
http://www.ita.uni-heidelberg.de/~chfeder/index.shtml?lang=en
On 29.05.2014 at 12:47, Andrea Gatto wrote:
> Dear Christoph, Klaus, et al,
>
> Let me try to be more specific.
>
> I am using the (split) Bouchut 5-wave solver, since I'd like to include
> magnetic fields in the future. I'm using MODE_DENS_EI.
>
> I have a simulation of gas spanning a wide range of densities and
> temperatures, with a box size of the order of 10^20-10^21 cm.
> The gas is shaped by stellar feedback.
> This setup makes the medium extremely turbulent and intermittent.
>
> Specifically, I'm interested in capturing the correct physical processes
> at small scales. However, since the gas is extremely turbulent, increasing
> lrefine_max doesn't help me: the 2nd derivative-based refinement criterion
> causes the whole (or the majority of the) domain to be refined to this
> maximum level, which makes the simulation too slow to run.
>
> Moreover, I'm not really interested in resolving shocks/discontinuities.
> Even if I have a big dense cloud with uniform thermodynamic properties,
> I'd like to have it refined to high resolution.
>
> For this reason, I've modified the Grid unit to allow for different
> refinement criteria, each with its own maximum refinement level.
> These criteria are: refinement on variable thresholds and Jeans-length
> refinement (based on the default routine present in the sink particle
> unit).
>
> What I do in practice is rearrange the refine, derefine, and stay flags
> of the blocks in Grid_markRefineDerefine, in the following way.
>
> 1) I call gr_markRefineDerefine and let the code mark the blocks
> following the default 2nd derivative-based criterion. I then rearrange
> the flags so that this method refines only up to a level lrefine_2ndD;
> each block above this refinement level is automatically marked for
> derefinement (see the sketch after this list).
>
> 2) I call the new criteria, e.g. gr_markVarThreshold, which reorganize
> the refine, derefine, and stay flags according to what I want, up to a
> higher maximum refinement level.
> For instance, I can say I want a block refined if the density of one of
> its cells is above 1.e-22 g/cm^3, and derefined if its densities are
> below 1.e-24 g/cm^3.
> These routines can refine up to a higher maximum refinement level than
> lrefine_2ndD.
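>
> In pseudo-code, the flag rearrangement of step 1 looks roughly like this
> (a standalone toy sketch with made-up block levels, not my actual
> Grid_markRefineDerefine; in FLASH the refine/derefine/lrefine arrays
> live in the PARAMESH tree module):
>
>   program mark_sketch
>     implicit none
>     integer, parameter :: nblocks = 4, lrefine_2ndD = 5
>     integer :: lrefine(nblocks) = (/4, 5, 6, 7/) ! toy block levels
>     logical :: refine(nblocks), derefine(nblocks)
>     integer :: lb
>     refine = .true.; derefine = .false. ! pretend step 1 marked all blocks
>     do lb = 1, nblocks                  ! cap the default criterion
>        if (lrefine(lb) >= lrefine_2ndD) refine(lb)   = .false.
>        if (lrefine(lb) >  lrefine_2ndD) derefine(lb) = .true.
>     end do
>     ! step 2 (e.g. gr_markVarThreshold) would then re-mark blocks whose
>     ! cells cross the density thresholds, up to its own higher max level
>     print *, refine
>     print *, derefine
>   end program mark_sketch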
>
> This method is extremely powerful: I'm able to reach high resolution
> exactly where I want without having too many blocks.
> For instance, in my tests I see 40,000-60,000 blocks, against the 2-3
> million I get if I allow the 2nd derivative-based criterion to refine to
> the highest refinement level.
>
> Now, I know this is a bad idea, but it's the only way to reach high
> resolution where I want it.
> In principle I could play a bit with the 2nd derivative-based criterion
> parameters, but that still would not let me reach high resolution in the
> gas parcels I'm interested in.
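>
> For reference, these are the stock parameters I mean (illustrative
> values, not my actual settings):
>
>   refine_var_1      = "dens"
>   refine_cutoff_1   = 0.8   # refine where the normalized 2nd derivative exceeds this
>   derefine_cutoff_1 = 0.2   # derefine where it falls below this
>   refine_filter_1   = 0.01  # filters out small ripples
>   lrefine_max       = 10
>
> Raising the cutoffs thins the refinement a little, but in a strongly
> turbulent medium almost every block still trips the criterion.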
>
> I've run a number of tests with this new method, from a single supernova
> explosion in a uniform medium to ram-pressure stripping of a dense blob
> moving at high velocity, and the blocks are refined and derefined
> correctly following my criteria, without any problem.
>
> The problem arises when I have a turbulent medium. The method refines and
> derefines correctly, but at a certain point the solver crashes, as I
> described in the previous email.
> I've found that a single cell (and its neighboring cells) has a density
> which goes to zero, and internal and kinetic energies that explode.
> Every time I've checked, I've found that this cell is a guard cell
> between two blocks at different high refinement levels.
>
> This problem arises regardless of the method I choose (Jeans, threshold,
> or a combination of them), with or without sink particles, with or
> without gravity, using a periodic box or a box elongated along z with
> periodic boundary conditions in x and y and diode in z.
> As long as I have multiple maximum refinement levels in a turbulent
> environment, the code crashes. I've also tried different combinations of
> eintSwitch and density/energy floors, but the problem remains.
>
> Given the "universality" of this problem, I suppose this has to do with
> the amr routines which handle the density/energy/velocity filling from
> parent to child blocks and vice versa.
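>
> (To illustrate the kind of failure I have in mind: plain linear
> interpolation across a steep gradient can undershoot. A toy example,
> with made-up numbers and nothing to do with FLASH's actual monotonic
> interpolants:
>
>   program prolong_sketch
>     implicit none
>     real :: parent(3) = (/1.0e-24, 1.0e-24, 4.0e-22/) ! coarse densities
>     real :: slope, child_lo, child_hi
>     slope = 0.5*(parent(3) - parent(1)) ! unlimited central difference
>     child_lo = parent(2) - 0.25*slope   ! left child of the middle cell
>     child_hi = parent(2) + 0.25*slope   ! right child
>     print *, child_lo, child_hi         ! child_lo comes out negative
>   end program prolong_sketch
>
> Whether the real prolongation can do something similar across a jump in
> refinement level is exactly what I'd like to understand.)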
>
> I'll try to use the routine Klaus suggested to take a deeper look at the
> problem; however, I'd really appreciate any help in this regard.
>
> Sorry for the lengthy email, but I think now you understand what I'm
> trying to do.
>
> Thank you.
>
> Best,
>
> Andrea
>
>
>> On Wed, 28 May 2014, Andrea Gatto wrote:
>>> Dear all,
>>> I'm running a simulation with FLASH 4.0.1 of a turbulent interstellar
>>> medium.
>>> For different reasons, some regions of my domain are refined several
>>> times during the evolution (nrefs=2), so that every two timesteps
>>> blocks in the same region are refined, derefined, refined again, etc.
>>> After a few hundred timesteps, however, the code crashes due to a steep
>>> increase in internal and kinetic energy.
>>> Every time I checked, I found that a single cell or a few cells
>>> belonging to one of these highly time-varying blocks show a density
>>> that rapidly goes to zero and a subsequent explosion in temperature and
>>> velocity.
>>> I tried several solutions, such as the inclusion of guard cell filling
>>> calls in different parts of the code, but the problem remains. This
>>> doesn't happen when the blocks are not refined so often.
>>> Do you have any idea how to solve this problem?
>> Andrea,
>> First of all, I think a situation like this should be avoided if at all
>> possible. I.e., don't refine-then-derefine the same region so many times.
>> Second, the behavior will likely depend on the Hydro version you are
>> using and its runtime parameters, like "order", "RiemannSolver", etc.,
>> if you are using unsplit Hydro. Others will be better able to give
>> guidance here.
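>> For example, one might try more diffusive, more robust settings like
>> the following (illustrative values only; whether they help depends on
>> the setup):
>>
>>   order         = 2          # spatial reconstruction order
>>   slopeLimiter  = "minmod"   # the most diffusive limiter
>>   RiemannSolver = "HLL"      # more diffusive than Roe/HLLC
>>   shockDetect   = .true.     # detect strong compressions for stability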
>> Third, the following MAY be helpful for circumventing or analyzing the
>> problematic behavior:
>> In FLASH4.2, we have added two runtime parameters to make it easier to
>> deal with situations where invalid values start appearing in density
>> (also, total and internal energy). Note that the mode that "fixes" data
>> by flooring, gr_sanitizeDataMode=3, has not been tested much in practice.
>> Excerpt from source/Grid/GridMain/paramesh/paramesh4/Config :
>> ========================================================================
>> D gr_sanitizeDataMode What to do when gr_sanitizeDataAfterInterp is called
>> D & to check for acceptable values in the dens, ener, and eint
>> D & cell-centered variables after a Grid operation may have resulted in
>> D & grid interpolation.
>> D & 0: Do nothing.
>> D & 1: Check (if variable is not masked out) and report (see
>> D & sanitizeVerbosity).
>> D & 2: Check (ignoring variable mask) and report (see sanitizeVerbosity).
>> D & 3: Check (if variable is not masked out) and fix (apply floor value).
>> D & 4: Check (if variable is not masked out) and abort if cell is found
>> D & below floor value.
>> PARAMETER gr_sanitizeDataMode INTEGER 1 [0,1,2,3,4]
>>
>> D gr_sanitizeVerbosity How to write information about unacceptable
>> D & values in the dens, ener, and eint cell-centered variables if
>> D & gr_sanitizeDataAfterInterp finds values that are below the
>> D & acceptable floor. This reporting is in addition to other actions
>> D & selected with gr_sanitizeDataMode=3 or 4.
>> D & 0: Be quiet.
>> D & 1: Only write a log file message per block if unacceptable value
>> D & found, on MASTER_PE.
>> D & 4: As 1, and each proc writes a line to standard output for each
>> D & block with bad values.
>> D & 5: As 4, and each proc writes lines showing the values in all cells
>> D & of the block (in 1D/2D) or a 2D slice (in 3D).
>> PARAMETER gr_sanitizeVerbosity INTEGER 5 [0,1,4,5]
>> ========================================================================
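>> For example, to abort at the first bad cell and get the most verbose
>> reporting, one could set in flash.par:
>>
>>   gr_sanitizeDataMode  = 4
>>   gr_sanitizeVerbosity = 5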
>> HTH,
>> Klaus