[FLASH-USERS] How does FLASH3 calculate temperatures for checkpoint and plot dumps?

John ZuHone jzuhone at milkyway.gsfc.nasa.gov
Fri Dec 16 13:43:13 EST 2011


Hi David,

Could you tell us a little more about your setup? For example, what EOS are you using? 

I assume it is either Gamma or Multigamma. If it is Gamma, can you tell us the runtime parameters for gamma, eos_singleSpeciesZ, and eos_singleSpeciesA? 
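(For reference, those parameters live in the flash.par runtime file; a hypothetical fragment, with placeholder values for a fully ionized plasma — the actual values depend on your gas composition:)

```
# Hypothetical flash.par fragment for the Gamma (single-species ideal gas) EOS
gamma              = 1.6667   # adiabatic index
eos_singleSpeciesA = 1.0      # nucleon number A of the single species
eos_singleSpeciesZ = 1.0      # proton number Z of the single species
```

The mean molecular weight the EOS derives from A and Z directly sets the pressure-to-temperature conversion, so a wrong A or Z will show up as a wrong temp even when pres and dens look fine.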

If it is Multigamma, can you give us a description of how you have set up Simulation_initSpecies? 
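(For Multigamma, the per-species properties are typically set in Simulation_initSpecies with Multispecies_setProperty; a minimal sketch, with hypothetical species names and placeholder values:)

```fortran
subroutine Simulation_initSpecies()
  use Multispecies_interface, ONLY : Multispecies_setProperty
  implicit none
#include "Flash.h"
#include "Multispecies.h"

  ! Hypothetical two-species setup: hydrogen and helium
  call Multispecies_setProperty(H1_SPEC,  A, 1.0)
  call Multispecies_setProperty(H1_SPEC,  Z, 1.0)
  call Multispecies_setProperty(H1_SPEC,  GAMMA, 1.6667)

  call Multispecies_setProperty(HE4_SPEC, A, 4.0)
  call Multispecies_setProperty(HE4_SPEC, Z, 2.0)
  call Multispecies_setProperty(HE4_SPEC, GAMMA, 1.6667)
end subroutine Simulation_initSpecies
```

If a species' mass fraction was initialized to zero (or left uninitialized) in part of the domain, the mean molecular weight there is undefined and the derived temperature can be wildly off even with sensible pres and dens.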

Finally, which checkpoint does this odd behavior show up in? If it is the first one, something may have gone wrong in the initial setup. For example, if you are using Multigamma, the initial species setup may be at fault. 

Best,

John Z

On Dec 16, 2011, at 1:37 PM, Dave wrote:

> When examining the checkpoint file from a simple test run of a galaxy, I'm finding that the pres and dens values are what I expect, but the temp values seem to be way off. In an (unrealistic) supposedly isothermal test, the outer parts of the galaxy are ~10^7 K, as they should be, but there is a sharp discontinuity in the centre, where the temperature drops to ~10^7 K, according to the temp scalar - even though the pres and dens values are what they should be.
> 
> I'm trying to track down where in Flash the temperature is calculated from the internal energy etc. for output, to determine whether it is just being dumped out wrong, or whether the internal energy itself is wrong. Any idea where that happens?
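One quick way to narrow this down, independent of where FLASH computes temp: recompute the temperature yourself from the pres and dens arrays in the checkpoint under a gamma-law ideal gas. If the recomputed value matches your expected ~10^7 K, the problem is in the stored temp (or the EOS parameters used to derive it), not in the hydro state. A sketch in Python, where mu is an assumed mean molecular weight, not something read from the file:

```python
# Hypothetical sanity check: invert the ideal-gas law, T = p * mu * m_H / (rho * k_B),
# in CGS units, to cross-check the temp field against pres and dens.
k_B = 1.380649e-16   # Boltzmann constant [erg/K]
m_H = 1.6735575e-24  # hydrogen atom mass [g]

def ideal_gas_temperature(pres, dens, mu=0.6):
    """Temperature from pressure and density for an ideal gas (CGS)."""
    return pres * mu * m_H / (dens * k_B)

# Round trip: build a pressure from a known temperature, then recover it
dens = 1.0e-26                           # g/cm^3, ICM-like placeholder value
T_in = 1.0e7                             # K
pres = dens * k_B * T_in / (0.6 * m_H)   # ideal-gas pressure at T_in
T_out = ideal_gas_temperature(pres, dens)
```

Applied to the actual checkpoint arrays (e.g. read with h5py), this takes one line per cell and immediately tells you whether temp or the underlying state is inconsistent.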
