
feature/global_det_mpmd #610

Merged

Conversation

@malloryprow commented Nov 22, 2024

Note to developers: You must use this PR template!

Description of Changes

This includes development to address NCO Bugzilla 1547 for global_det. Closes #553.

Developer Questions and Checklist

  • Is this a high priority PR? If so, why and is there a date it needs to be merged by?

No

  • Do you have any planned upcoming annual leave/PTO?

Yes, 11/27-11/29

  • Are there any changes needed for when the jobs are supposed to run?

No

  • The code changes follow NCO's EE2 Standards.
  • Developer's name is removed throughout the code, and ${USER} is used where necessary.
  • References to the feature branch for HOMEevs are removed from the code.
  • J-Job environment variables, COMIN and COMOUT directories, and output follow what has been defined for EVS.
  • Jobs over 15 minutes in runtime have restart capability.
  • If applicable, changes in dev/drivers/scripts or dev/modulefiles have been made in the corresponding ecf/scripts and ecf/defs/evs-nco.def.
  • Jobs contain the appropriate file checking and don't run METplus for any missing data.
  • Code is using METplus wrappers structure and not calling MET executables directly.
  • Log is free of any ERRORs or WARNINGs.

Testing Instructions

Set-up

  1. Clone my fork and check out the branch feature/global_det_mpmd (a set-up sketch follows this list)
  2. ln -sf /lfs/h2/emc/vpppg/noscrub/emc.vpppg/verification/EVS_fix fix
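
For reference, here is a minimal set-up sketch. The fork URL is an assumption; substitute the actual fork if it differs:

    # Clone the fork and switch to the feature branch (fork URL assumed)
    git clone https://github.com/malloryprow/EVS.git
    cd EVS
    git checkout feature/global_det_mpmd
    # Link the shared fix files into the top level of the clone
    ln -sf /lfs/h2/emc/vpppg/noscrub/emc.vpppg/verification/EVS_fix fix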

global_det stats - wave

  1. cd dev/drivers/scripts/stats/global_det
  2. Run: jevs_global_det_gfs_wave_grid2obs_stats.sh (a submission sketch follows this list)
  • Set HOMEevs to the location of the clone.
  • Set COMIN to /lfs/h2/emc/vpppg/noscrub/emc.vpppg/$NET/$evs_ver_2d
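
A possible way to run this step (a sketch; it assumes HOMEevs and COMIN are set by editing the export lines near the top of the driver, and the /path/to/EVS placeholder is hypothetical):

    cd dev/drivers/scripts/stats/global_det
    # In jevs_global_det_gfs_wave_grid2obs_stats.sh, set:
    #   export HOMEevs=/path/to/EVS
    #   export COMIN=/lfs/h2/emc/vpppg/noscrub/emc.vpppg/$NET/$evs_ver_2d
    qsub jevs_global_det_gfs_wave_grid2obs_stats.sh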

global_det plots - wave

  1. cd dev/drivers/scripts/plots/global_det
  2. Run: jevs_global_det_wave_grid2obs_plots_last31days.sh, jevs_global_det_wave_grid2obs_plots_last90days.sh (a loop sketch follows this list)
  • Set HOMEevs to the location of the clone.
  • Set COMIN to /lfs/h2/emc/vpppg/noscrub/emc.vpppg/$NET/$evs_ver_2d
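
Once HOMEevs and COMIN are set in each driver, the two wave plots jobs can be submitted back to back, e.g. (a sketch):

    cd dev/drivers/scripts/plots/global_det
    # Submit the 31-day and 90-day wave grid2obs plots drivers
    for driver in jevs_global_det_wave_grid2obs_plots_last31days.sh \
                  jevs_global_det_wave_grid2obs_plots_last90days.sh; do
        qsub "$driver"
    done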

global_det stats - atmos

  1. cd dev/drivers/scripts/stats/global_det
  2. Run: jevs_global_det_cfs_atmos_grid2grid_stats.sh, jevs_global_det_cfs_atmos_grid2obs_stats.sh, jevs_global_det_cmc_atmos_grid2grid_stats.sh, jevs_global_det_cmc_atmos_grid2obs_stats.sh, jevs_global_det_cmc_regional_atmos_grid2grid_stats.sh, jevs_global_det_dwd_atmos_grid2grid_stats.sh, jevs_global_det_ecmwf_atmos_grid2grid_stats.sh, jevs_global_det_ecmwf_atmos_grid2obs_stats.sh, jevs_global_det_fnmoc_atmos_grid2grid_stats.sh, jevs_global_det_fnmoc_atmos_grid2obs_stats.sh, jevs_global_det_gfs_atmos_grid2grid_stats.sh, jevs_global_det_gfs_atmos_grid2obs_stats.sh, jevs_global_det_imd_atmos_grid2grid_stats.sh, jevs_global_det_imd_atmos_grid2obs_stats.sh, jevs_global_det_jma_atmos_grid2grid_stats.sh, jevs_global_det_jma_atmos_grid2obs_stats.sh, jevs_global_det_metfra_atmos_grid2grid_stats.sh, jevs_global_det_ukmet_atmos_grid2grid_stats.sh, jevs_global_det_ukmet_atmos_grid2obs_stats.sh
  • Set HOMEevs to the location of the clone.
  • Set COMIN to /lfs/h2/emc/vpppg/noscrub/emc.vpppg/$NET/$evs_ver_2d
  • May need to adjust VDATE depending on when testing is done
  3. Run jevs_global_det_gfs_atmos_wmo_daily_stats.sh
  • Set HOMEevs to the location of the clone.
  • Set COMIN to /lfs/h2/emc/vpppg/noscrub/emc.vpppg/$NET/$evs_ver_2d
  • May need to adjust VDATE depending on when testing is done
  4. Run jevs_global_det_gfs_atmos_wmo_monthly_stats.sh (see the submission note after this list)
  • Set HOMEevs to the location of the clone.
  • Set COMIN to /lfs/h2/emc/vpppg/noscrub/emc.vpppg/$NET/$evs_ver_2d
  • Set VDATE to 20241130
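
If preferred, VDATE can be overridden at submission time instead of edited into the driver, as is done later in this thread, e.g.:

    # Run the monthly WMO stats job for the November 2024 valid period
    qsub -v VDATE=20241130 jevs_global_det_gfs_atmos_wmo_monthly_stats.sh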

global_det plots - atmos

  1. cd dev/drivers/scripts/plots/global_det
  2. Run: jevs_global_det_atmos_grid2grid_means_plots_last31days.sh, jevs_global_det_atmos_grid2grid_means_plots_last90days.sh, jevs_global_det_atmos_grid2grid_precip_plots_last31days.sh, jevs_global_det_atmos_grid2grid_precip_plots_last90days.sh, jevs_global_det_atmos_grid2grid_pres_levs_plots_last31days.sh, jevs_global_det_atmos_grid2grid_pres_levs_plots_last90days.sh, jevs_global_det_atmos_grid2grid_sea_ice_plots_last31days.sh, jevs_global_det_atmos_grid2grid_sea_ice_plots_last90days.sh, jevs_global_det_atmos_grid2grid_snow_plots_last31days.sh, jevs_global_det_atmos_grid2grid_snow_plots_last90days.sh, jevs_global_det_atmos_grid2grid_sst_plots_last31days.sh, jevs_global_det_atmos_grid2grid_sst_plots_last90days.sh, jevs_global_det_atmos_grid2obs_pres_levs_plots_last31days.sh, jevs_global_det_atmos_grid2obs_pres_levs_plots_last90days.sh, jevs_global_det_atmos_grid2obs_ptype_plots_last31days.sh, jevs_global_det_atmos_grid2obs_ptype_plots_last90days.sh, jevs_global_det_atmos_grid2obs_sfc_plots_last31days.sh, jevs_global_det_atmos_grid2obs_sfc_plots_last90days.sh
  • Set HOMEevs to the location of the clone.
  • Set COMIN to /lfs/h2/emc/vpppg/noscrub/emc.vpppg/$NET/$evs_ver_2d
  • May need to adjust VDATE depending on when testing is done (a loop submission sketch follows)
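
Because the atmos plots drivers share a common naming pattern, one option is to submit them with a glob loop (a sketch; it assumes the glob matches only the drivers listed above and that HOMEevs, COMIN, and VDATE have already been set in each driver):

    cd dev/drivers/scripts/plots/global_det
    # Submit every atmos grid2grid/grid2obs plots driver for both look-back periods
    for driver in jevs_global_det_atmos_grid2*_plots_last*days.sh; do
        qsub "$driver"
    done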

@PerryShafran-NOAA commented Dec 2, 2024

OK, on to plots. I will be working on things in the order they are listed in the instructions.

@malloryprow modified the milestones: EVS v2.0.x, EVS v2.0.0 on Dec 2, 2024
@PerryShafran-NOAA

The GFS wave plots are finished.

The .o files are here: /lfs/h2/emc/vpppg/noscrub/perry.shafran/pr610test/EVS/dev/drivers/scripts/plots/global_det
The plot tarballs are here: /lfs/h2/emc/ptmp/perry.shafran/evs/v2.0/plots/global_det/wave.20241201
The working directories are here: /lfs/h2/emc/stmp/perry.shafran/evs_test/prod/tmp/jevs_global_det_wave_grid2obs_plots_lastxxdays_00.201014xx.dbqs01

@malloryprow

The global_det wave plots look good as well!

Can we wait until 1845Z to test the global_det atmos stats? I reran the prep step this morning for the dates that didn't complete due to stmp filling up. The stats that ran yesterday in the emc.vpppg parallel wouldn't have data for those missing files. If we wait until the jobs run later today, both will be running with the same prep data, making it easier to compare the final stats files.

@PerryShafran-NOAA

Yes, we can wait. I think that will be around 2:45 pm, and that means that most jobs might end after your workday ends. I think some of the quick ones will finish before that.

@PerryShafran-NOAA

@malloryprow I think I miscalculated when 1845Z was, darn it. I'll set off all the atmos stats jobs now.

@malloryprow

Sounds good! I will review them tomorrow morning!

@malloryprow

Good with the first round of the global_det atmos stats. We can do the WMO stats next. Both daily and monthly. We can run the daily with VDATE=20241201 and the monthly with VDATE=20241130.

@PerryShafran-NOAA

@malloryprow Thank you for checking! On to the WMO stats.

@PerryShafran-NOAA

@malloryprow The wmo stats jobs are complete.

.o files: /lfs/h2/emc/vpppg/noscrub/perry.shafran/pr610test/EVS/dev/drivers/scripts/stats/global_det
small stat files: /lfs/h2/emc/vpppg/noscrub/perry.shafran/evs/v2.0/stats/global_det/atmos.20241130/gfs/wmo
final monthly stat files: /lfs/h2/emc/vpppg/noscrub/perry.shafran/evs/v2.0/stats/global_det/gfs.20241130
final daily stat files: /lfs/h2/emc/vpppg/noscrub/perry.shafran/evs/v2.0/stats/global_det/gfs.20241201
working directories: /lfs/h2/emc/stmp/perry.shafran/evs_test/prod/tmp/jevs_global_det_gfs_atmos_wmo directories

@malloryprow

The WMO stats look good for the daily!

Could we run the monthly with VDATE=20241031? It looks like the output for 20241130 in the parallel is messed up from stmp filling up.

@malloryprow

Actually just caught something in the working directories using /lfs/h1/ops/prod/com/gfs/v16.3/gdas.YYYYmmdd/HH/atmos/gdas.tHHz.cnvstat. The file permissions need to be set to rstprod. I'll need to make a change for that.

@PerryShafran-NOAA

@malloryprow OK, let me know when you want me to run the monthly WMO for October. Does this file permission thing you are talking about refer to the daily or the monthly?

@malloryprow

It is the daily, so both the WMO monthly and daily will need to be rerun:

  1. rm -r /lfs/h2/emc/vpppg/noscrub/perry.shafran/evs/v2.0/stats/global_det/atmos.20241201/gfs/wmo
  2. rm /lfs/h2/emc/vpppg/noscrub/perry.shafran/evs/v2.0/stats/global_det/gfs.20241201/evs.stats.gfs.atmos.wmo.station_info.v20241201.stat
  3. rm /lfs/h2/emc/vpppg/noscrub/perry.shafran/evs/v2.0/stats/global_det/gfs.20241201/evs.stats.gfs.atmos.wmo.v20241201.stat
  4. qsub -v VDATE=20241201 jevs_global_det_gfs_atmos_wmo_daily_stats.sh
  5. qsub -v VDATE=20241031 jevs_global_det_gfs_atmos_wmo_monthly_stats.sh

@PerryShafran-NOAA

Both jobs are underway.

@PerryShafran-NOAA

@malloryprow Both the wmo jobs are complete if you want to check. The locations are the same as listed above.

@PerryShafran-NOAA

Re-running the daily wmo stats job as I had forgotten to pull in changes.

Mallory says that the monthly wmo stats look good.

@malloryprow

I can comment on PRs again. Posting what I sent in an email.

For the daily WMO job, it doesn't look like the changes were pulled in. /lfs/h2/emc/vpppg/noscrub/perry.shafran/pr610test/EVS/ush/global_det/global_det_atmos_stats_wmo_reformat_cnvstat.py doesn't have the changes. Can you pull the changes in and run again? There is no need to set VDATE, or you can do qsub -v VDATE=20241202.
The monthly WMO job is good.

@PerryShafran-NOAA

@malloryprow Running the daily job for 20241201. I did all the rm's before I submitted.

@malloryprow

WMO daily job is good!

We can move on to plots. Please run with VDATE_END=20241202.

@PerryShafran-NOAA

@malloryprow All plots jobs are underway.

@malloryprow

Everything is good for plots!

@PerryShafran-NOAA

OK cool! I'll do a code check and then approve if all is good.

@AliciaBentley-NOAA You should check and approve when ready as well.

@PerryShafran-NOAA left a comment

Code works as expected. Approved for merge.

@AliciaBentley-NOAA left a comment

I have reviewed the changes made in this PR and found that they are consistent with the changes required to address the MPMD bugzilla. I like the introduction of data/ in the various paths and the ${job_work_dir} variable. I approve this PR to be merged.

CC @PerryShafran-NOAA @malloryprow

@PerryShafran-NOAA

Excellent! Ready to merge!

@PerryShafran-NOAA merged commit 68a4fc0 into NOAA-EMC:develop on Dec 4, 2024
@malloryprow deleted the feature/global_det_mpmd branch on December 4, 2024 20:19
Successfully merging this pull request may close these issues.

global_det: Address Bugzilla 1547 - MPMD processes share the same working directory