Hi Klaus, thanks for your reply, and thanks for clarifying how the
Conductivity interface operates; I did not notice it was overloaded.

The issue in the fullState routines themselves is that they reference
3T variable names that are not substituted with single-temperature
analogs. An example is "EOS_CVELE" in Conductivity_fullState within the
SpitzerHighZ module; that symbol is not defined when 3T is not in use.
The other Conductivity modules appear to have similar naming issues. (A
schematic of the kind of substitution I have in mind is in the P.S.
below the quoted message.)

As for the PFFT issues, I'll try running on smaller processor counts,
but the crash was occurring ~30 minutes into a high-resolution run.
I'll let you know if I can isolate a simple test case.

--
James Guillochon
Einstein Fellow at Harvard CfA
jguillochon@cfa.harvard.edu

On November 4, 2013 at 2:44:01 PM, Klaus Weide (klaus@flash.uchicago.edu) wrote:
> Hi all, I've been attempting to use the Conductivity + Diffuse modules in
> FLASH 4.0.1, and I've encountered a number of issues/inconsistencies I was
> hoping to have resolved:

> - It seems that most of the Conductivity modules include a
> "Conductivity_fullState" routine, which does not compile without errors
> unless the 3T EOS is used, but as far as I can tell
> "Conductivity_fullState" is not called by anything in the FLASH source. The
> way the rest of the Conductivity modules are written seems to indicate that
> they should work with a single temp EOS, but the inability to compile the
> fullState function makes me a bit wary. Are the Conductivity modules written
> to support a single temp EOS?

James,

Yes, the Conductivity unit should be compilable with a 1T EOS, but I
am not sure whether we are testing this. Please let us know how you
set up a test and what compilation errors you are getting.
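
One way to try it (a sketch only; the unit paths are quoted from memory,
so adjust as needed) would be to take a standard 1T simulation, e.g.
Sedov, and pull in a Conductivity implementation plus the split Diffuse
solver:

  ./setup Sedov -auto -2d \
      --with-unit=physics/materialProperties/Conductivity/ConductivityMain/SpitzerHighZ \
      --with-unit=physics/Diffuse/DiffuseMain/Split

If that fails to compile, the full compiler message together with the
list of units that setup reports would tell us where the 3T-only
references are coming from.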

Note that even though Conductivity_fullState may not be invoked by
that name, it may still be invoked through the generic interface named
just "Conductivity". That is the effect of the following lines in Conductivity_interface.F90:

========================================================================
  interface Conductivity

     subroutine Conductivity(xtemp,xden,massfrac,isochoricCond,diff_coeff,component)

       real, intent(IN) :: xtemp
       real, intent(IN) :: xden
       real, intent(OUT) :: diff_coeff
       real, intent(OUT) :: isochoricCond
       real, intent(IN) :: massfrac(NSPECIES)
       integer, intent(IN) :: component

     end subroutine Conductivity

     subroutine Conductivity_fullState(solnVec,isochoricCond,diffCoeff,component)

       real, intent(IN) :: solnVec(NUNK_VARS)
       real, OPTIONAL, intent(OUT) :: diffCoeff
       real, OPTIONAL, intent(OUT) :: isochoricCond
       integer, OPTIONAL, intent(IN) :: component

     end subroutine Conductivity_fullState
  end interface
========================================================================

When the compiler encounters a 'call Conductivity(...)', it decides
based on the actual arguments whether to call the implementation named
'Conductivity' or the implementation named 'Conductivity_fullState'.
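
As a purely schematic illustration (the variable names here are made up,
not taken from any particular caller):

  real    :: xtemp, xden, isochoricCond, diffCoeff
  real    :: massfrac(NSPECIES), solnVec(NUNK_VARS)
  integer :: component

  ! Scalar first argument -> resolves to the specific named Conductivity
  call Conductivity(xtemp, xden, massfrac, isochoricCond, diffCoeff, component)

  ! Rank-1 array first argument -> resolves to Conductivity_fullState
  call Conductivity(solnVec, isochoricCond, diffCoeff, component)

Both calls are written against the generic name; the rank of the first
actual argument is what selects the specific routine.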


> - I am getting errors/warnings in the Pfft solver when using the implicit
> mode from DiffuseMain/Split on a number of processors not equal to a power
> of 2 (I think, 64 processors worked fine, 96 processors produced errors for
> otherwise identical simulations). The four types of messages I see in the
> output are (note that NONE of these are seen when running on 64 processors):

> [gr_pfftInitMetadata]: WARNING... making work arrays larger artificially!!!

> (INFO) Processor: 11 has no pencil grid points.

The above two messages are warnings or informational. They don't mean
that anything is wrong. They may indicate, however, that the grid is
configured in some "unusual" way, and/or that performance may be
decreased. The PFFT solver attempts to factor the number of processors
that participate in the PFFT solve in some automatic way, and that may
not always work out well. (Worst case should be when that number is
prime.) The code still SHOULD handle this correctly.
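
Just to illustrate what that factoring amounts to (a simplified sketch,
not the actual gr_pfft code):

  ! Sketch only: split the participating rank count into the most
  ! nearly square pair of factors for a 2D pencil processor grid.
  subroutine near_square_factors(nprocs, p1, p2)
    implicit none
    integer, intent(in)  :: nprocs
    integer, intent(out) :: p1, p2
    integer :: i
    p1 = 1
    do i = 1, int(sqrt(real(nprocs)))
       if (mod(nprocs, i) == 0) p1 = i
    end do
    p2 = nprocs / p1
  end subroutine near_square_factors

With 64 ranks a split like this gives 8 x 8, 96 gives 8 x 12, and a
prime count such as 97 degenerates to 1 x 97, the worst case mentioned
above.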

> perfmon: ran out of space for timer, "guardcell internal", cannot time this
> timer with perfmon

> [Timers_start] Ran out of space on timer call stack. Probably means calling
> start without a corresponding stop.

I don't think I have seen these two in connection with the PFFT solver.
Something is wrong with the Timers_start / Timers_stop calls; we would
need a test case to investigate.

> - The timestep limiter seems to still apply when using an implicit solver,
> although as far as I understand there's no need to limit the timestep when
> solving implicitly. Is this just an oversight? Should I just set
> dt_diff_factor to a large number when using the implicit solver?

Yes, dt_diff_factor is often set to something ridiculously high like
1.e100 in that situation. You may still find the dt_Diff information in
the standard output useful at times.
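
For example, in flash.par:

  # effectively remove the explicit diffusion time-step limit,
  # since the implicit solver does not need it
  dt_diff_factor = 1.e100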

Klaus
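
P.S. To make the 1T substitution I mentioned above concrete, something
schematically like the following is what I have in mind (purely
illustrative, not the actual SpitzerHighZ source; I'm assuming EOS_CV is
still defined by Eos.h in a 1T build):

  #ifdef EOS_CVELE
        cv_ele = eos_arr(EOS_CVELE)   ! 3T: electron specific heat
  #else
        cv_ele = eos_arr(EOS_CV)      ! 1T: fall back to the single-temperature cv
  #endif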