[FLASH-USERS] Growing error in magnetic field when updating fluid variables in wind tunnel simulation

Rukmani Vijayaraghavan rukmani at virginia.edu
Fri Feb 19 12:59:22 EST 2016


Hi Dongwook,

I'm using the HLLC Riemann solver. I've attached a recent log file from 
a short run.

Thanks!

Best,
Rukmani


On 02/18/2016 10:31 PM, Dongwook Lee wrote:
> Dear Rukmani,
>
> What kind of Riemann solver are you using?
> Can you send me a log file or flash.par?
>
> Thanks,
> Dongwook
>
> On Feb 18, 2016, at 6:59 PM, Rukmani Vijayaraghavan 
> <rukmani at virginia.edu <mailto:rukmani at virginia.edu>> wrote:
>
>> Hi Jason, Klaus,
>>
>> This block-by-block variation is correlated with similar variation in
>> other fluid variables (density, pressure), and it persists even with a
>> zero-velocity inflow, with a uniform grid, and with both the USM and
>> PPM (pure hydro) solvers. Switching the gravity solver from Multigrid
>> to Multipole doesn't make a difference either. I'm using the FLASH
>> Gamma EOS unit. As far as I've seen, there is no variation in the
>> B-field across grid cells adjacent to block / refinement boundaries;
>> this only happens at the inflow edge.
>>
>> I have also updated the magnetic field face variables (MAG_FACE_VAR
>> and/or MAGI_FACE_VAR), with no effect; div(B) still seems to be 0.
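>>
>> For reference, a rough way to check the divergence is sketched below
>> (Python with yt; the plotfile name is a placeholder, and this
>> cell-centered finite-difference estimate is only an approximate
>> diagnostic, since the USM solver constrains the face-centered
>> divergence):
>>
>> #~~~~
>> import numpy as np
>> import yt
>>
>> # Placeholder plotfile name; any FLASH HDF5 plotfile works the same way.
>> ds = yt.load("galaxy_wt_mhd_hdf5_plt_cnt_0010")
>>
>> # Deposit the AMR data onto a uniform grid at the finest level so that
>> # simple finite differences can be used (memory-hungry for deep AMR;
>> # a lower level also works for a quick look).
>> level = ds.index.max_level
>> dims = ds.domain_dimensions * ds.refine_by**level
>> cg = ds.covering_grid(level, left_edge=ds.domain_left_edge, dims=dims)
>>
>> bx = cg["flash", "magx"].to_ndarray()
>> by = cg["flash", "magy"].to_ndarray()
>> bz = cg["flash", "magz"].to_ndarray()
>> dx, dy, dz = (ds.domain_width / dims).to_ndarray()
>>
>> # Cell-centered estimate of div(B), normalized by |B| / dx.
>> divb = (np.gradient(bx, dx, axis=0) +
>>         np.gradient(by, dy, axis=1) +
>>         np.gradient(bz, dz, axis=2))
>> bmag = np.sqrt(bx**2 + by**2 + bz**2) + 1.0e-30
>> print("max |div B| * dx / |B| =", (np.abs(divb) * dx / bmag).max())
>> #~~~~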
>>
>> Any other suggestions would be great!
>>
>> Thanks,
>> Rukmani
>>
>>
>>
>> On 02/18/2016 01:30 PM, Jason Galyardt wrote:
>>> Hi Rukmani,
>>>
>>> I used a spatially varying wind; the velocity of the wind varies 
>>> along the boundary, but it has a well-defined, time-independent 
>>> form. I've also seen problems with more realistic B-field geometries 
>>> which (to my horror) included step functions in the domain interior. 
>>> I had to smooth these out to avoid unphysical evolution in those 
>>> regions.
>>>
>>> I've also seen some modest increase in B-field magnitude for the 
>>> cells adjacent to a refinement boundary. I haven't reported the 
>>> latter previously because I haven't had time to figure out what's 
>>> going on there. You might try setting lrefine_min = lrefine_max to 
>>> get uniform refinement and see whether that helps (some of our 
>>> group's simulations do this).
>>>
>>> The block-by-block variation does seem strange. I would expect this
>>> kind of variation to be correlated with variation in another 
>>> variable. How do the other variables look in the problem region?
>>>
>>> Another idea: could this variation be tied to the equation of state? 
>>> If you're using one of the supported FLASH EOS units, you're 
>>> probably fine.
>>>
>>> Regards,
>>> Jason
>>>
>>>
>>> On Thu, Feb 18, 2016 at 11:07 AM, Rukmani Vijayaraghavan 
>>> <rukmani at virginia.edu> wrote:
>>>
>>>     Hi Jason,
>>>
>>>     Thanks! I'm using FLASH 4.2; I'll try 4.3 to see if that
>>>     makes a difference. I haven't tried refining on the magnetic
>>>     variables yet.
>>>
>>>     For the different runtime parameters --
>>>
>>>     1. I've tried cfl = 0.5 and 0.8, but nothing lower yet. I'll
>>>     check to see if that works.
>>>
>>>     2. For the Riemann Solver, I've found HLLC to be a bit more
>>>     dissipative than HLLD, and therefore marginally better at
>>>     smoothing out the magnetic field at the edges. Ditto with second
>>>     order MUSCL-Hancock over third order PPM.
>>>
>>>     3. All the other runtime parameters are mostly the same. I don't
>>>     refine on the magnetic variables, but I tried a higher overall
>>>     lrefine_min (to make sure the outer edges get further refined)
>>>     and it didn't help -- the same block-based discontinuity persists.
>>>
>>>     4. I'm using a constant wind inflow for this particular run. One
>>>     thing I checked was whether a round-off error in reading my
>>>     input variables into double precision arrays could act as a tiny
>>>     "seed" instability that then grows, but that doesn't seem to be
>>>     an issue. What is strange is that the value (and sign) of the
>>>     initial instability varies block by block. In your simulations,
>>>     did you use a constant wind?
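>>>
>>>     For concreteness, here is that round-off check as a minimal
>>>     Python sketch (just an illustration, not part of my FLASH setup;
>>>     6.0e7 is 600 km/s in cgs):
>>>
>>>     #~~~~
>>>     import numpy as np
>>>
>>>     # Worst-case relative perturbation if an inflow value ever passes
>>>     # through single precision before landing in a double precision array.
>>>     print("float32 eps:", np.finfo(np.float32).eps)   # ~1.2e-7
>>>
>>>     # Concrete check for the wind speed used here (600 km/s in cgs).
>>>     vx = 6.0e7
>>>     seed = abs(float(np.float32(vx)) - vx) / vx
>>>     print("round-off seed for vx:", seed)  # 0.0 here: 6.0e7 is exact in float32
>>>     #~~~~
>>>
>>>     Either way, a ~1e-7 seed is far smaller than the ~1% block-to-block
>>>     variation, which is consistent with round-off not being the culprit.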
>>>
>>>     Thanks,
>>>     Rukmani
>>>
>>>
>>>     On 02/18/2016 09:38 AM, Jason Galyardt wrote:
>>>>     Hi Rukmani,
>>>>
>>>>     I've had some similar issues with MHD runs. You didn't mention
>>>>     which version of FLASH you're using, but I've found the latest
>>>>     (v4.3) to be a bit more stable than v4.2 or v2.5. As for
>>>>     runtime parameters, I've found the following combination to be helpful:
>>>>
>>>>     #~~~~
>>>>     # Refine on the magnetic variables:
>>>>     refine_var_1    = "dens"
>>>>     refine_var_2    = "magp"
>>>>     # -OR-
>>>>     # refine_var_2  = "magx"
>>>>     # refine_var_3  = "magy"
>>>>     # refine_var_4  = "magz"
>>>>
>>>>     # Prefer higher refinement according to magp (default refine_cutoff_X = 0.8)
>>>>     refine_cutoff_2 = 0.7
>>>>     # refine_cutoff_3 = 0.7
>>>>     # refine_cutoff_4 = 0.7
>>>>
>>>>     # Lower CFL: between 0.25 and 0.5
>>>>     cfl = 0.5
>>>>
>>>>     # Use the second order MUSCL-Hancock reconstruction scheme
>>>>     order = 2
>>>>
>>>>     # I've mostly used the "hybrid" slope limiter, but occasionally
>>>>     # I've found "minmod" useful in particularly difficult situations
>>>>     slopeLimiter    = "hybrid"
>>>>
>>>>     # Use flattening (dissipative; originally for PPM)
>>>>     use_flattening  = .true.
>>>>
>>>>     # Use the high order algorithm for E-field construction
>>>>     E_modification  = .true.
>>>>
>>>>     # Update magnetic energy using staggered B-fields
>>>>     energyFix       = .true.
>>>>
>>>>     # Prolongation method (injection_prol, balsara_prol) -- using
>>>>     # Balsara's method is particularly critical, in my experience
>>>>     prolMethod      = "BALSARA_PROL"
>>>>
>>>>     # For the Riemann solver, I use HLLD for MHD runs and HLLC for
>>>>     # pure hydro runs
>>>>     RiemannSolver   = "HLLD"
>>>>     #~~~~
>>>>
>>>>     What sort of inflow conditions have you implemented? Small
>>>>     non-linearities in the inflow can grow into large unphysical
>>>>     features over time (I've seen this happen in my own
>>>>     simulations). So, it's worth checking your boundary condition
>>>>     code for undesirable features. In any case, I hope this helps.
>>>>
>>>>     Sean: is the E_upwind option available for the unsplit MHD
>>>>     solver in FLASH 4.3? My recollection is that it caused some
>>>>     problems in previous versions....
>>>>
>>>>     Regards,
>>>>     Jason
>>>>
>>>>
>>>>     On Wed, Feb 17, 2016 at 9:22 PM, Rukmani Vijayaraghavan
>>>>     <rukmani at virginia.edu> wrote:
>>>>
>>>>         Hi everyone,
>>>>
>>>>         I've come across an error when updating fluid variables at
>>>>         the inflow edge of a wind tunnel simulation. I'm running a
>>>>         simulation of a galaxy (with active dark matter particles,
>>>>         gas, and passive particles) in a box, whose fluid is
>>>>         initialized to be identical to the incoming wind (with vx,
>>>>         vy, vz = 600 km/s, 0, 0). There is a small error (on the
>>>>         order of 1%) when updating grid cells near the inflow
>>>>         boundary (with both USM and PPM solvers), and this error is
>>>>         spatially correlated with block boundaries. While this
>>>>         error itself is tolerable as far as the density and
>>>>         pressure go, it has bad consequences for the magnetic
>>>>         field, which grows as the wind propagates through the box
>>>>         (see attached figure, xl_boundary). This figure shows
>>>>         slices of Bx at two timesteps (annotated with block
>>>>         boundaries and magnetic field vectors). The dynamic range
>>>>         of Bx in this image has been reduced to highlight these
>>>>         discontinuities. At the timesteps shown in the attached
>>>>         image, the fluctuations in Bx are ~1%, but grow with time
>>>>         up to order unity. I've tried a variety of Riemann solvers
>>>>         (HLLC, HLLD, Roe, Hybrid), slope limiters (mc, minmod,
>>>>         etc.), interpolation orders, and prolongation methods, and
>>>>         turned specific USM switches on and off, but nothing has
>>>>         solved the issue so far. Has anybody else dealt with and/or
>>>>         successfully solved this issue?
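>>>>
>>>>         For anyone who wants to look at this, a minimal yt sketch
>>>>         that produces a similar Bx slice with block outlines and
>>>>         field vectors (the plotfile name and color limits are
>>>>         placeholders, not the exact values used for the attached
>>>>         figure):
>>>>
>>>>         #~~~~
>>>>         import yt
>>>>
>>>>         # Placeholder plotfile name.
>>>>         ds = yt.load("galaxy_wt_mhd_hdf5_plt_cnt_0010")
>>>>
>>>>         # Slice of Bx with block (grid) outlines and in-plane field vectors.
>>>>         slc = yt.SlicePlot(ds, "z", ("flash", "magx"))
>>>>         slc.set_log(("flash", "magx"), False)
>>>>         # Narrow the color range (placeholder limits) to expose ~1% jumps.
>>>>         slc.set_zlim(("flash", "magx"), -1.0e-6, 1.0e-6)
>>>>         slc.annotate_grids()
>>>>         slc.annotate_quiver(("flash", "magx"), ("flash", "magy"))
>>>>         slc.save("bx_slice.png")
>>>>         #~~~~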
>>>>
>>>>         Thanks,
>>>>         Rukmani
>>>>
>>>>         -- 
>>>>         Rukmani Vijayaraghavan
>>>>         NSF Astronomy & Astrophysics Postdoctoral Fellow
>>>>         University of Virginia
>>>>         rukmani at virginia.edu <mailto:rukmani at virginia.edu>
>>>>
>>>>
>>>
>>>     -- 
>>>     Rukmani Vijayaraghavan
>>>     NSF Astronomy & Astrophysics Postdoctoral Fellow
>>>     University of Virginia
>>>     rukmani at virginia.edu <mailto:rukmani at virginia.edu>
>>>
>>>
>>
>> -- 
>> Rukmani Vijayaraghavan
>> NSF Astronomy & Astrophysics Postdoctoral Fellow
>> University of Virginia
>> rukmani at virginia.edu

-- 
Rukmani Vijayaraghavan
NSF Astronomy & Astrophysics Postdoctoral Fellow
University of Virginia
rukmani at virginia.edu

-------------- next part --------------
A non-text attachment was scrubbed...
Name: galaxy_wt_mhd_aniso.log
Type: text/x-log
Size: 93683 bytes
Desc: not available
URL: <http://flash.rochester.edu/pipermail/flash-users/attachments/20160219/505756bd/attachment.bin>

