[FLASH-USERS] MHD staggered mesh (Flash4a)

Seyit Hocuk seyit at astro.rug.nl
Fri Jul 15 06:23:39 EDT 2011


Dear Dongwook,

Thanks for the reply.

I hadn't put my setup under magnetoHD; I assumed that only the 
"customizeProlong" file was required, so I copied that file to my setup 
directory. I wasn't aware that the location was so important. DivB does seem 
to be preserved, though: its values are still around 1e-33.

I have initialized the magnetic field by giving it a constant value.

       magx(i) = sim_mfield
       magy(i) = sim_mfield
       magz(i) = sim_mfield      

#if NFACE_VARS > 0            
        call Grid_putRowData(blockID, CENTER, MAGX_VAR, EXTERIOR, &
             IAXIS, startingPos, magx, sizeX) 
        call Grid_putRowData(blockID, CENTER, MAGY_VAR, EXTERIOR, &
             IAXIS, startingPos, magy, sizeX) 
        call Grid_putRowData(blockID, CENTER, MAGZ_VAR, EXTERIOR, &
             IAXIS, startingPos, magz, sizeX) 
#endif

#if NFACE_VARS > 0
           if (sim_killdivb) then
              facexData(MAG_FACE_VAR,:,:,:) = sim_mfield
              faceyData(MAG_FACE_VAR,:,:,:) = sim_mfield
              if (NDIM == 3) facezData(MAG_FACE_VAR,:,:,:) = sim_mfield
           endif
#endif

where sim_mfield = 1.0e-5 in this case, chosen to keep the problem simple. 
This is all I have done; the rest is just my standard setup.

kind regards,
Seyit




dongwook at flash.uchicago.edu wrote:
> Hi Seyit,
>
>   
>> Dear Flash developers,
>>
>> I am testing the USM solver a bit in Flash4-alpha and noticed some
>> curious features. I use my standard setup, which I normally run with
>> split PPM, and add +usm to the setup command. Everything looks and runs
>> fine; however, whenever I increase the number of processors, dt_hydro
>> seems to drop by a corresponding amount. I do not use super-time-stepping.
>> I don't understand this behaviour yet.
>>     
>
> We haven't seen this issue in runs with large numbers of processors using
> the USM solver. I'd like to check what kind of problem you are setting up
> when you see this. One quick note: in case you have your own MHD simulation,
> you NEED to put it under
>
> source/Simulation/SimulationMain/magnetoHD/YourMHDsimulation
>
> in order to preserve divB=0 condition on a staggered grid.
>
> It is best to start from one of the MHD simulations provided in the
> /magnetoHD/ directory: copy it under another name and then start modifying
> Simulation_initBlock.F90, etc.
>
> It sounds like you probably ran a simulation that lives in
> Simulation/SimulationMain/ and set it up once with PPM and then with USM by
> simply adding +usm.
>
> This won't work, especially for USM: you will need to set up the B-field
> correctly on a staggered grid so that the run starts with the divB=0
> condition.
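>
> For example (just a sketch; adjust the problem name, dimensionality, and
> other setup options to your case), you would copy an existing magnetoHD
> problem and set it up with the unsplit staggered mesh solver roughly like
>
>    cp -r source/Simulation/SimulationMain/magnetoHD/OrszagTang \
>          source/Simulation/SimulationMain/magnetoHD/YourMHDsimulation
>    ./setup magnetoHD/YourMHDsimulation -auto -2d +usm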
>
>   
>> Another question I have is that if it is normal that USM MHD requires so
>> much more (ram) memory than PPM. It is about 3-4 times more memory
>> intensive. I noticed that there are much more (about 3 times more)
>> variables than in PPM. These are mainly the scratch and the flux
>> variables. So, is it normal that USM requires this much memory?
>>     
>
> Yes, it is true that the memory requirement of USM is higher than that of
> any split solver. The expense comes partly from USM being an unsplit solver,
> and partly from the additional variables needed for MHD, including the
> B-fields and electric fields in the USM MHD formulation.
>
> Of course, these variables are not needed in a gas dynamics solver. Also,
> the PPM solver is a split solver, so it does not require transverse fluxes
> in each directional sweep (i.e., no y or z fluxes are needed in the
> x-sweep). This is not the case for directionally unsplit solvers.
>
> Because of this, the USM solver needs to store the fluxes in all of the
> x, y, and z directions, requiring more memory.
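>
> As a rough illustration of the flux storage alone: a split solver can reuse
> a single set of flux scratch arrays per directional sweep (about 1 x NFLUXES
> variables), while the unsplit update has to hold the x-, y-, and z-direction
> fluxes at the same time (about 3 x NFLUXES), which by itself is consistent
> with the roughly threefold increase in variables you noticed.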
>
>
>   
>> Lastly, I am new to magnetic field studies and thus not so familiar how
>> to implement them. The way I implement magnetic fields is as follow: I
>> use my standard setup +usm and in Simulation_initBlock, I give
>> reasonable values to center values of MAGX/Y/Z and also do the same for
>> the face values of MAG (facex, facey, facez), similar to how it is done
>> in the supplied test runs. My run differs from these test runs in the
>> fact that I have gravity (and particles) included. I'm curious how the
>> magnetic fields will be amplified/weakened over time. I'm also wondering
>> why it is not necessary for the code to know the ion/electron abundance.
>> Is this assuming some flux freezing state?
>>     
>
> As I mentioned above, you should be very careful in initializing B-fields
> for the USM solver. You need to set up fields that satisfy divB=0 on a
> staggered grid. If you use constant B-fields this is straightforward;
> otherwise, you will also need to initialize facex, facey, facez
> (face-centered) differently from MAGX/Y/Z (cell-centered), because face
> centers are dx/2 away from cell centers.
>
> Please have a look at the initialization of the fields in the CurrentSheet
> problem if your B-fields are complicated. Or, if you know the analytic forms
> of your B-fields, have a look at Simulation_initBlock in OrszagTang to help
> build your understanding.
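>
> As a minimal sketch of that idea (not taken from any particular problem;
> sim_b0, sim_k, and xCenter are placeholder names, solnData stands for the
> cell-centered block pointer, and the face arrays follow your snippet): first
> fill the face-centered fields from a divergence-free analytic form, then set
> the cell-centered fields as the average of the two bounding faces, roughly
>
>        ! x-faces: Bx = b0 (constant); y-faces: By = b0*sin(k*x).
>        ! This form has divB = 0 on the staggered grid (z omitted for brevity).
>        facexData(MAG_FACE_VAR,i,j,k) = sim_b0
>        faceyData(MAG_FACE_VAR,i,j,k) = sim_b0*sin(sim_k*xCenter(i))
>
>        ! Cell-centered fields = average of the two bounding face values,
>        ! consistent with the staggered representation.
>        solnData(MAGX_VAR,i,j,k) = 0.5*(facexData(MAG_FACE_VAR,i,  j,k) &
>                                      + facexData(MAG_FACE_VAR,i+1,j,k))
>        solnData(MAGY_VAR,i,j,k) = 0.5*(faceyData(MAG_FACE_VAR,i,j,  k) &
>                                      + faceyData(MAG_FACE_VAR,i,j+1,k))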
>
> Magnetic energy will eventually decrease if you don't have any explicit
> source terms, such as B-field injection through a boundary condition or a
> Biermann battery term for a dynamo effect. In ideal MHD, the magnetic energy
> will decrease because of numerical diffusivity on the grid scale, and in
> resistive MHD, the magnetic diffusivity (resistivity) will diffuse away the
> strength of the fields.
>
> Best,
> Dongwook
>
>   
>> Kind regards,
>> Seyit
>>
>>     
>
>   



