
Code update for HR4_roughness #2022

Merged: 23 commits merged into ufs-community:develop on Jan 30, 2024

Conversation

grantfirl
Collaborator

@grantfirl grantfirl commented Dec 1, 2023

PR Author Checklist:

  • I have linked PRs from all sub-components involved in the section below.
  • I am confirming that reviews are completed in ALL sub-component PRs.
  • I have run the full RT suite on either Hera or Cheyenne AND have attached the log to this PR below this line:
  • I have added the list of all failed regression tests to the "Anticipated changes" section.
  • I have filled out all sections of the template.

Description

From @Qingfu-Liu:
The overall features for the updates are:

The current wave-model-derived momentum roughness increases with increasing 10-m wind speed, which can lead to too much drag in high winds and a consequent reduction of TC intensity. To reduce the negative hurricane intensity biases when coupled with the wave model, the modified code limits the momentum roughness to a constant value in high winds (wind speeds larger than about 30 m/s). The modification also includes the molecular viscosity effect to avoid too much reduction of the exchange coefficients for heat and moisture in weak winds. More detailed discussion of this issue can be found here.
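The two surface-layer changes can be pictured with a minimal Python sketch. This is illustrative only, not the actual Fortran in sfc_diff.f: the cap value Z0_CAP, the 0.11*nu/u* smooth-flow coefficient, and the function names are assumptions, and only the ~30 m/s threshold comes from the description above.

```python
NU_AIR = 1.5e-5   # kinematic viscosity of air (m^2/s)
Z0_CAP = 1.7e-2   # assumed constant roughness ceiling for high winds (m)

def momentum_roughness(z0_wave, wind10m, wind_cap=30.0):
    """Limit the wave-model momentum roughness to a constant in high winds."""
    if wind10m > wind_cap:
        return min(z0_wave, Z0_CAP)
    return z0_wave

def thermal_roughness(ustar):
    """Smooth-flow (molecular viscosity) roughness scale for heat/moisture.

    Keeping a viscosity-based contribution prevents the heat/moisture
    exchange coefficients from collapsing when u* is small (weak winds).
    """
    return 0.11 * NU_AIR / max(ustar, 1e-6)
```

In this sketch, a wave-model roughness of 0.05 m at 40 m/s is capped at Z0_CAP, while at 10 m/s it passes through unchanged.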

The HR2 coupled runs with the above modified surface layer scheme have been completed for the 2020 hurricane season (July 21 - Nov. 20, 2020) by Wei Li, and the hurricane statistics have been evaluated by Jiayi Peng. The impacts of this change are neutral. However, since observations indicate drag reduction at high wind speeds (> about 30 m/s at 10 m), this change is more consistent with the observations.

Background diffusivity (K0) in the inversion layers near the PBL top is increased from 0.15 m^2/s to 0.4 m^2/s to reduce excessive stratocumulus formation in the coastal areas of the east Pacific and east Atlantic oceans [https://www.emc.ncep.noaa.gov/gmb/jhan/vsdbw/hr2d04/: hr2d01 (K0=0.15 m^2/s); hr2d02 (K0=0.3 m^2/s); hr2d03 (K0=0.4 m^2/s); hr2d04 (K0=0.5 m^2/s)].
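The K0 change amounts to raising a background value that bounds the eddy diffusivity from below near the PBL-top inversion. A minimal sketch, where the function name and the use of K0 as a simple floor are hypothetical; only the 0.15 -> 0.4 m^2/s values come from the text above:

```python
K0_OLD = 0.15  # m^2/s, previous background diffusivity
K0_NEW = 0.4   # m^2/s, value adopted in this PR

def apply_background_diffusivity(k_scheme, k0=K0_NEW):
    """Impose the background diffusivity as a lower bound on the
    scheme-computed eddy diffusivity in the inversion layer."""
    return max(k_scheme, k0)
```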

Maximum allowable TKE-dependent entrainment enhancement is reduced from a factor of 15 to a factor of 10 in the shallow convection scheme to avoid a potential numerical instability due to too large an entrainment increase. cmxfac=10 also improves hurricane forecasts in HAFS.
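The cmxfac change can be pictured as clamping a TKE-dependent multiplier on the entrainment rate. This is a sketch under assumptions: the linear TKE scaling, tke_ref, and the function name are hypothetical, while the cap values 15 (old) and 10 (new) come from the text above.

```python
CMXFAC = 10.0  # new maximum enhancement factor (was 15.0)

def enhanced_entrainment(base_rate, tke, tke_ref=1.0, cmxfac=CMXFAC):
    """Scale entrainment with TKE, but clamp the factor at cmxfac so large
    TKE cannot drive an unbounded (and numerically unstable) increase."""
    factor = min(1.0 + tke / tke_ref, cmxfac)
    return base_rate * factor
```

With tke=0.5 the factor is 1.5; with tke=50 it saturates at the cap of 10.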

Commit Message

CCPP physics code update for HR4 related to surface roughness, background diffusivity and shallow convection + CI fix

Linked Issues and Pull Requests

Associated UFSWM Issue to close

Subcomponent Pull Requests

Blocking Dependencies

Subcomponents involved:

  • AQM
  • CDEPS
  • CICE
  • CMEPS
  • CMakeModules
  • FV3
  • GOCART
  • HYCOM
  • MOM6
  • NOAHMP
  • WW3
  • stochastic_physics
  • none

Anticipated Changes

Input data

  • No changes are expected to input data.
  • Changes are expected to input data:
    • New input data.
    • Updated input data.

Regression Tests:

  • No changes are expected to any regression test.
  • Changes are expected to the following tests:
Tests affected by changes in this PR: many tests are expected to change results due to changes to constants in the shallow convection and PBL schemes. There is also a change in sfc_diff.f that applies when waves are coupled to the atmosphere.

001 cpld_control_p8_mixedmode_intel failed in check_result
cpld_control_p8_mixedmode_intel 001 failed in run_test
002 cpld_control_gfsv17_intel failed in check_result
cpld_control_gfsv17_intel 002 failed in run_test
005 cpld_mpi_gfsv17_intel failed in check_result
cpld_mpi_gfsv17_intel 005 failed in run_test
006 cpld_debug_gfsv17_intel failed in check_result
cpld_debug_gfsv17_intel 006 failed in run_test
007 cpld_control_p8_intel failed in check_result
cpld_control_p8_intel 007 failed in run_test
009 cpld_control_qr_p8_intel failed in check_result
cpld_control_qr_p8_intel 009 failed in run_test
011 cpld_2threads_p8_intel failed in check_result
cpld_2threads_p8_intel 011 failed in run_test
012 cpld_decomp_p8_intel failed in check_result
cpld_decomp_p8_intel 012 failed in run_test
013 cpld_mpi_p8_intel failed in check_result
cpld_mpi_p8_intel 013 failed in run_test
014 cpld_control_ciceC_p8_intel failed in check_result
cpld_control_ciceC_p8_intel 014 failed in run_test
015 cpld_control_c192_p8_intel failed in check_result
cpld_control_c192_p8_intel 015 failed in run_test
017 cpld_bmark_p8_intel failed in check_result
cpld_bmark_p8_intel 017 failed in run_test
019 cpld_control_noaero_p8_intel failed in check_result
cpld_control_noaero_p8_intel 019 failed in run_test
020 cpld_control_nowave_noaero_p8_intel failed in check_result
cpld_control_nowave_noaero_p8_intel 020 failed in run_test
021 cpld_debug_p8_intel failed in check_result
cpld_debug_p8_intel 021 failed in run_test
022 cpld_debug_noaero_p8_intel failed in check_result
cpld_debug_noaero_p8_intel 022 failed in run_test
023 cpld_control_noaero_p8_agrid_intel failed in check_result
cpld_control_noaero_p8_agrid_intel 023 failed in run_test
024 cpld_control_c48_intel failed in check_result
cpld_control_c48_intel 024 failed in run_test
025 cpld_control_p8_faster_intel failed in check_result
cpld_control_p8_faster_intel 025 failed in run_test
026 cpld_control_pdlib_p8_intel failed in check_result
cpld_control_pdlib_p8_intel 026 failed in run_test
029 cpld_debug_pdlib_p8_intel failed in check_result
cpld_debug_pdlib_p8_intel 029 failed in run_test
030 control_flake_intel failed in check_result
control_flake_intel 030 failed in run_test
031 control_CubedSphereGrid_intel failed in check_result
control_CubedSphereGrid_intel 031 failed in run_test
032 control_CubedSphereGrid_parallel_intel failed in check_result
control_CubedSphereGrid_parallel_intel 032 failed in run_test
033 control_latlon_intel failed in check_result
control_latlon_intel 033 failed in run_test
034 control_wrtGauss_netcdf_parallel_intel failed in check_result
control_wrtGauss_netcdf_parallel_intel 034 failed in run_test
035 control_c48_intel failed in check_result
control_c48_intel 035 failed in run_test
036 control_c192_intel failed in check_result
control_c192_intel 036 failed in run_test
037 control_c384_intel failed in check_result
control_c384_intel 037 failed in run_test
038 control_c384gdas_intel failed in check_result
control_c384gdas_intel 038 failed in run_test
039 control_stochy_intel failed in check_result
control_stochy_intel 039 failed in run_test
041 control_lndp_intel failed in check_result
control_lndp_intel 041 failed in run_test
042 control_iovr4_intel failed in check_result
control_iovr4_intel 042 failed in run_test
043 control_iovr5_intel failed in check_result
control_iovr5_intel 043 failed in run_test
044 control_p8_intel failed in check_result
control_p8_intel 044 failed in run_test
045 control_p8_ugwpv1_intel failed in check_result
control_p8_ugwpv1_intel 045 failed in run_test
047 control_noqr_p8_intel failed in check_result
control_noqr_p8_intel 047 failed in run_test
049 control_decomp_p8_intel failed in check_result
control_decomp_p8_intel 049 failed in run_test
050 control_2threads_p8_intel failed in check_result
control_2threads_p8_intel 050 failed in run_test
051 control_p8_lndp_intel failed in check_result
control_p8_lndp_intel 051 failed in run_test
052 control_p8_rrtmgp_intel failed in check_result
control_p8_rrtmgp_intel 052 failed in run_test
054 merra2_thompson_intel failed in check_result
merra2_thompson_intel 054 failed in run_test
078 control_csawmg_intel failed in check_result
control_csawmg_intel 078 failed in run_test
079 control_csawmgt_intel failed in check_result
control_csawmgt_intel 079 failed in run_test
080 control_ras_intel failed in check_result
control_ras_intel 080 failed in run_test
081 control_wam_intel failed in check_result
control_wam_intel 081 failed in run_test
082 control_p8_faster_intel failed in check_result
control_p8_faster_intel 082 failed in run_test
084 control_CubedSphereGrid_debug_intel failed in check_result
control_CubedSphereGrid_debug_intel 084 failed in run_test
085 control_wrtGauss_netcdf_parallel_debug_intel failed in check_result
control_wrtGauss_netcdf_parallel_debug_intel 085 failed in run_test
086 control_stochy_debug_intel failed in check_result
control_stochy_debug_intel 086 failed in run_test
087 control_lndp_debug_intel failed in check_result
control_lndp_debug_intel 087 failed in run_test
089 control_csawmgt_debug_intel failed in check_result
control_csawmgt_debug_intel 089 failed in run_test
090 control_ras_debug_intel failed in check_result
control_ras_debug_intel 090 failed in run_test
091 control_diag_debug_intel failed in check_result
control_diag_debug_intel 091 failed in run_test
092 control_debug_p8_intel failed in check_result
control_debug_p8_intel 092 failed in run_test
111 control_wam_debug_intel failed in check_result
control_wam_debug_intel 111 failed in run_test
131 hafs_regional_atm_intel failed in check_result
hafs_regional_atm_intel 131 failed in run_test
132 hafs_regional_atm_thompson_gfdlsf_intel failed in check_result
hafs_regional_atm_thompson_gfdlsf_intel 132 failed in run_test
133 hafs_regional_atm_ocn_intel failed in check_result
hafs_regional_atm_ocn_intel 133 failed in run_test
134 hafs_regional_atm_wav_intel failed in check_result
hafs_regional_atm_wav_intel 134 failed in run_test
135 hafs_regional_atm_ocn_wav_intel failed in check_result
hafs_regional_atm_ocn_wav_intel 135 failed in run_test
136 hafs_regional_1nest_atm_intel failed in check_result
hafs_regional_1nest_atm_intel 136 failed in run_test
137 hafs_regional_telescopic_2nests_atm_intel failed in check_result
hafs_regional_telescopic_2nests_atm_intel 137 failed in run_test
138 hafs_global_1nest_atm_intel failed in check_result
hafs_global_1nest_atm_intel 138 failed in run_test
139 hafs_global_multiple_4nests_atm_intel failed in check_result
hafs_global_multiple_4nests_atm_intel 139 failed in run_test
140 hafs_regional_specified_moving_1nest_atm_intel failed in check_result
hafs_regional_specified_moving_1nest_atm_intel 140 failed in run_test
141 hafs_regional_storm_following_1nest_atm_intel failed in check_result
hafs_regional_storm_following_1nest_atm_intel 141 failed in run_test
142 hafs_regional_storm_following_1nest_atm_ocn_intel failed in check_result
hafs_regional_storm_following_1nest_atm_ocn_intel 142 failed in run_test
143 hafs_global_storm_following_1nest_atm_intel failed in check_result
hafs_global_storm_following_1nest_atm_intel 143 failed in run_test
145 hafs_regional_storm_following_1nest_atm_ocn_debug_intel failed in check_result
hafs_regional_storm_following_1nest_atm_ocn_debug_intel 145 failed in run_test
146 hafs_regional_storm_following_1nest_atm_ocn_wav_intel failed in check_result
hafs_regional_storm_following_1nest_atm_ocn_wav_intel 146 failed in run_test
147 hafs_regional_docn_intel failed in check_result
hafs_regional_docn_intel 147 failed in run_test
148 hafs_regional_docn_oisst_intel failed in check_result
hafs_regional_docn_oisst_intel 148 failed in run_test
167 control_p8_atmlnd_sbs_intel failed in check_result
control_p8_atmlnd_sbs_intel 167 failed in run_test
168 atmwav_control_noaero_p8_intel failed in check_result
atmwav_control_noaero_p8_intel 168 failed in run_test
169 control_atmwav_intel failed in check_result
control_atmwav_intel 169 failed in run_test
170 atmaero_control_p8_intel failed in check_result
atmaero_control_p8_intel 170 failed in run_test
171 atmaero_control_p8_rad_intel failed in check_result
atmaero_control_p8_rad_intel 171 failed in run_test
172 atmaero_control_p8_rad_micro_intel failed in check_result
atmaero_control_p8_rad_micro_intel 172 failed in run_test
regional_atmaq_intel 173 failed in run_test
175 regional_atmaq_faster_intel failed in check_result
regional_atmaq_faster_intel 175 failed in run_test
176 control_c48_gnu failed in check_result
control_c48_gnu 176 failed in run_test
177 control_stochy_gnu failed in check_result
control_stochy_gnu 177 failed in run_test
178 control_ras_gnu failed in check_result
control_ras_gnu 178 failed in run_test
179 control_p8_gnu failed in check_result
control_p8_gnu 179 failed in run_test
180 control_p8_ugwpv1_gnu failed in check_result
control_p8_ugwpv1_gnu 180 failed in run_test
181 control_flake_gnu failed in check_result
control_flake_gnu 181 failed in run_test
196 control_diag_debug_gnu failed in check_result
control_diag_debug_gnu 196 failed in run_test
206 control_ras_debug_gnu failed in check_result
control_ras_debug_gnu 206 failed in run_test
207 control_stochy_debug_gnu failed in check_result
control_stochy_debug_gnu 207 failed in run_test
208 control_debug_p8_gnu failed in check_result
control_debug_p8_gnu 208 failed in run_test
control_wam_debug_gnu 212 failed in run_test
231 cpld_control_p8_gnu failed in check_result
cpld_control_p8_gnu 231 failed in run_test
232 cpld_control_nowave_noaero_p8_gnu failed in check_result
cpld_control_nowave_noaero_p8_gnu 232 failed in run_test
233 cpld_debug_p8_gnu failed in check_result
cpld_debug_p8_gnu 233 failed in run_test
234 cpld_control_pdlib_p8_gnu failed in check_result
cpld_control_pdlib_p8_gnu 234 failed in run_test
235 cpld_debug_pdlib_p8_gnu failed in check_result
cpld_debug_pdlib_p8_gnu 235 failed in run_test

Libraries

  • Not Needed
  • Needed
    • Create separate issue in JCSDA/spack-stack asking for update to library. Include library name, library version.
    • Add issue link from JCSDA/spack-stack following this item
Code Managers Log
  • This PR is up-to-date with the top of all sub-component repositories except for those sub-components which are the subject of this PR.
  • Move new/updated input data on RDHPCS Hera and propagate input data changes to all supported systems.
    • N/A

Testing Log:

  • RDHPCS
    • Hera
    • Orion
    • Hercules
    • Jet
    • Gaea
    • Cheyenne
  • WCOSS2
    • Dogwood/Cactus
    • Acorn
  • CI
    • Completed
  • opnReqTest
    • N/A
    • Log attached to comment

@zach1221 zach1221 added the Baseline Updates Current baselines will be updated. label Dec 15, 2023
@zach1221
Collaborator

@grantfirl could you sync up your branch for us, please?


@grantfirl please bring these up to date with respective authoritative repositories

  • ufs-weather-model NOT up to date

@jkbk2004
Collaborator

@grantfirl Can you sync up the branch so that we can work on this PR?

@grantfirl
Collaborator Author

@jkbk2004 @zach1221 Sorry for the delay. It should be ready to test now.

@BrianCurtis-NOAA
Collaborator

@grantfirl or @Qingfu-Liu I've added a commit message section under the description. Could you please fill that out containing the information you prefer users to understand about the changes from this PR. Thanks.

@grantfirl
Collaborator Author

> @grantfirl or @Qingfu-Liu I've added a commit message section under the description. Could you please fill that out containing the information you prefer users to understand about the changes from this PR. Thanks.

@Qingfu-Liu Could you help distill the changes into a one-liner for the purposes of what will be displayed in the git commit history?

@zach1221 zach1221 added Ready for Commit Queue The PR is ready for the Commit Queue. All checkboxes in PR template have been checked. jenkins-ci Jenkins CI: ORT build/test on docker container labels Dec 15, 2023
@Qingfu-Liu
Collaborator

>> @grantfirl or @Qingfu-Liu I've added a commit message section under the description. Could you please fill that out containing the information you prefer users to understand about the changes from this PR. Thanks.

> @Qingfu-Liu Could you help distill the changes into a one-liner for the purposes of what will be displayed in the git commit history?

Not sure where to fill in the message. Please use the following commit message: "Code update for HR4_roughness, background diffusivity and shallow convection". Thanks

@grantfirl
Collaborator Author

>>> @grantfirl or @Qingfu-Liu I've added a commit message section under the description. Could you please fill that out containing the information you prefer users to understand about the changes from this PR. Thanks.

>> @Qingfu-Liu Could you help distill the changes into a one-liner for the purposes of what will be displayed in the git commit history?

> No sure where to fill the message. Please use the following commit message: "Code update for HR4_roughness, background diffusivity and shallow convection". Thanks

Thanks, @Qingfu-Liu I've put this into the description above. There is a new section called "Commit Message".

@epic-cicd-jenkins
Collaborator

Jenkins-ci ORTs passed

@jkbk2004
Collaborator

regional_atmaq_intel crashes. err is on hera at /scratch1/NCEPDEV/stmp2/Jong.Kim/FV3_RT/rt_37357/regional_atmaq_intel

218: FATAL from PE   218: compute_qs: saturation vapor pressure table overflow, nbad=      1
218: fv3.exe            0000000002C0F2C4  sat_vapor_pres_mo        2138  sat_vapor_pres.F90

@grantfirl @Qingfu-Liu Can you check with the case? @BrianCurtis-NOAA FYI

@jkbk2004 jkbk2004 removed the Ready for Commit Queue The PR is ready for the Commit Queue. All checkboxes in PR template have been checked. label Dec 17, 2023
@Qingfu-Liu
Collaborator

> regional_atmaq_intel crashes. err is on hera at /scratch1/NCEPDEV/stmp2/Jong.Kim/FV3_RT/rt_37357/regional_atmaq_intel
>
> 218: FATAL from PE   218: compute_qs: saturation vapor pressure table overflow, nbad=      1
> 218: fv3.exe            0000000002C0F2C4  sat_vapor_pres_mo        2138  sat_vapor_pres.F90
>
> @grantfirl @Qingfu-Liu Can you check with the case? @BrianCurtis-NOAA FYI

@jkbk2004 I will run a test

@Qingfu-Liu
Collaborator

>> regional_atmaq_intel crashes. err is on hera at /scratch1/NCEPDEV/stmp2/Jong.Kim/FV3_RT/rt_37357/regional_atmaq_intel
>>
>> 218: FATAL from PE   218: compute_qs: saturation vapor pressure table overflow, nbad=      1
>> 218: fv3.exe            0000000002C0F2C4  sat_vapor_pres_mo        2138  sat_vapor_pres.F90
>>
>> @grantfirl @Qingfu-Liu Can you check with the case? @BrianCurtis-NOAA FYI

> @jkbk2004 I will run a test

My test also failed. I will contact the code developer

@BrianCurtis-NOAA
Collaborator

> My test also failed. I will contact the code developer

I've been working on updates to the regional_atmaq tests. I would recommend disabling them, and I'll prioritize my work on the test updates.

@Qingfu-Liu
Collaborator

> regional_atmaq

@BrianCurtis-NOAA Thank you very much for working on the regional_atmaq tests

@grantfirl
Collaborator Author

It also looks like there may be an issue with control_wam_debug_gnu. I'm retesting it now.

@grantfirl
Collaborator Author

>>> regional_atmaq_intel crashes. err is on hera at /scratch1/NCEPDEV/stmp2/Jong.Kim/FV3_RT/rt_37357/regional_atmaq_intel
>>>
>>> 218: FATAL from PE   218: compute_qs: saturation vapor pressure table overflow, nbad=      1
>>> 218: fv3.exe            0000000002C0F2C4  sat_vapor_pres_mo        2138  sat_vapor_pres.F90
>>>
>>> @grantfirl @Qingfu-Liu Can you check with the case? @BrianCurtis-NOAA FYI

>> @jkbk2004 I will run a test

> My test also failed. I will contact the code developer

@Qingfu-Liu Which code developer are you referring to? I thought that this error typically signals numerical instability?

@BrianCurtis-NOAA
Collaborator

>>>> regional_atmaq_intel crashes. err is on hera at /scratch1/NCEPDEV/stmp2/Jong.Kim/FV3_RT/rt_37357/regional_atmaq_intel
>>>>
>>>> 218: FATAL from PE   218: compute_qs: saturation vapor pressure table overflow, nbad=      1
>>>> 218: fv3.exe            0000000002C0F2C4  sat_vapor_pres_mo        2138  sat_vapor_pres.F90
>>>>
>>>> @grantfirl @Qingfu-Liu Can you check with the case? @BrianCurtis-NOAA FYI

>>> @jkbk2004 I will run a test

>> My test also failed. I will contact the code developer

> @Qingfu-Liu Which code developer are you referring to? I thought that this error typically signals numerical instability?

This does seem to be a numerical instability issue. Reducing the time step has typically fixed it in the past, but that seems more like a band-aid; I would prefer a code solution if possible.

@grantfirl
Collaborator Author

FYI, the control_wam_debug_gnu test that shows as failed in the original uploaded RT log ran fine for me on Hera. It failed in check_result, as expected.

So, @BrianCurtis-NOAA, you would like us to temporarily comment out the regional_atmaq_intel test in rt.conf for this PR?

@jkbk2004 jkbk2004 added the Ready for Commit Queue The PR is ready for the Commit Queue. All checkboxes in PR template have been checked. label Jan 26, 2024
@jkbk2004
Collaborator

@BrianCurtis-NOAA @zach1221 @FernandoAndrade-NOAA FYI: we will let this PR run over the weekend.

@FernandoAndrade-NOAA
Collaborator

Please hold while I rerun cpld_debug_gfsv17_intel on Jet, as it's missing from the log.

@DeniseWorthen
Collaborator

@FernandoAndrade-NOAA Why is it missing? Did it not compile, not run?

@FernandoAndrade-NOAA
Collaborator

> @FernandoAndrade-NOAA Why is it missing? Did it not compile, not run?

There was a timeout during the BL creation; I'm unsure whether it was just system variability or whether the test will need an increase in its time limit on Jet moving forward.

@FernandoAndrade-NOAA
Collaborator

FernandoAndrade-NOAA commented Jan 29, 2024

Apologies for the delay; there was another timeout. I'm going to increase the time limit for the test to 55 for Jet and rerun. We will revisit the issue in an upcoming PR.

@zach1221
Collaborator

zach1221 commented Jan 29, 2024

Alright, regression testing is complete. Let's begin the merging process with the ccpp-physics sub-pr.

@zach1221
Collaborator

@grantfirl fv3atm pr merged. Hash: NOAA-EMC/fv3atm@bd38c56

@grantfirl
Collaborator Author

> @grantfirl fv3atm pr merged. Hash: NOAA-EMC/fv3atm@bd38c56

@zach1221 FV3 submodule pointer updated and .gitmodules reverted. This should be ready for approval/merge.

@zach1221 zach1221 merged commit 625ac0c into ufs-community:develop Jan 30, 2024
Labels
Baseline Updates Current baselines will be updated. jenkins-ci Jenkins CI: ORT build/test on docker container Ready for Commit Queue The PR is ready for the Commit Queue. All checkboxes in PR template have been checked.
9 participants