diff --git a/.gitignore b/.gitignore
index e702edf6c..9609c69c3 100644
--- a/.gitignore
+++ b/.gitignore
@@ -27,4 +27,5 @@ README.md
*.log
compile_options.sh
.DS_Store
+.venv
.vscode
diff --git a/.readthedocs.yaml b/.readthedocs.yaml
new file mode 100644
index 000000000..cbf746c4d
--- /dev/null
+++ b/.readthedocs.yaml
@@ -0,0 +1,35 @@
+# Read the Docs configuration file for Sphinx projects
+# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details
+
+# Required
+version: 2
+
+# Set the OS, Python version and other tools you might need
+build:
+ os: ubuntu-22.04
+ tools:
+ python: "3.11"
+ # You can also specify other tool versions:
+ # nodejs: "20"
+ # rust: "1.70"
+ # golang: "1.20"
+
+# Build documentation in the "docs/" directory with Sphinx
+sphinx:
+ configuration: docs/userguide/conf.py
+ # You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
+ # builder: "dirhtml"
+ # Fail on all warnings to avoid broken references
+ # fail_on_warning: true
+
+# Optionally build your docs in additional formats such as PDF and ePub
+formats:
+ - pdf
+# - epub
+
+# Optional but recommended, declare the Python requirements required
+# to build your documentation
+# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
+python:
+ install:
+ - requirements: docs/requirements.txt
diff --git a/docs/requirements.txt b/docs/requirements.txt
new file mode 100644
index 000000000..483a4e960
--- /dev/null
+++ b/docs/requirements.txt
@@ -0,0 +1 @@
+sphinx_rtd_theme
diff --git a/docs/userguide/_static/ug_theme.css b/docs/userguide/_static/ug_theme.css
new file mode 100644
index 000000000..834909fe0
--- /dev/null
+++ b/docs/userguide/_static/ug_theme.css
@@ -0,0 +1,29 @@
+.wy-nav-content {
+ max-width: 1000px !important;
+}
+
+.center {
+ text-align: center;
+}
+
+.underline {
+ text-decoration: underline;
+}
+
+.filename {
+ font-family: Courier, "Courier New", monospace;
+ white-space: pre-wrap;
+}
+
+.program {
+ font-family: Courier, "Courier New", monospace;
+ white-space: pre-wrap;
+ color: greenyellow;
+ background-color: black;
+ padding-left: 5px;
+ padding-right: 5px;
+}
+
+.wy-table-responsive table td, .wy-table-responsive table th {
+ white-space: normal;
+}
\ No newline at end of file
diff --git a/docs/userguide/appendices.rest b/docs/userguide/appendices.rest
new file mode 100644
index 000000000..236c843ed
--- /dev/null
+++ b/docs/userguide/appendices.rest
@@ -0,0 +1,2312 @@
+.. vim: syntax=rst
+.. role:: raw-html(raw)
+ :format: html
+
+APPENDICES
+==========
+
+This section contains supplementary information.
+
+.. _section-a1:
+
+A1. Example of Dependency Installation for Ubuntu 24.04 LTS
+-----------------------------------------------------------
+
+The example below uses the GNU compilers and Open MPI. Commands are
+issued as the root user in the bash shell::
+
+ ##########################################################
+ ### Get libraries available through apt-get ###
+ ##########################################################
+
+ apt-get update
+
+ apt-get install wget bzip2 ca-certificates gfortran \
+ libnetcdff-dev mpi-default-dev \
+ cmake git netcdf-bin
+
+ ##########################################################
+ ### Check netCDF installs (optional) ###
+ ##########################################################
+
+ nf-config --all
+ nc-config --all
+
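+
+If the installation succeeds, the compiler and netCDF locations needed by
+a subsequent WRF-Hydro build can be confirmed with the same tools. The
+commands below are a supplementary sketch, not part of the required
+installation; they only query the toolchain installed above::
+
+   gfortran --version      # GNU Fortran compiler
+   mpif90 --version        # MPI Fortran compiler wrapper (Open MPI)
+   nc-config --includedir  # netCDF-C include directory
+   nc-config --libs        # netCDF-C link flags
+   nf-config --flibs       # netCDF-Fortran link flags
+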
+.. _section-a2:
+
+A2. Exceptions for Running WRF-Hydro with the Noah LSM
+------------------------------------------------------
+
+Support for the Noah Land Surface Model (LSM) within WRF-Hydro is
+currently frozen at Noah version 3.6. Since the Noah LSM is not under
+active development by the community, WRF-Hydro is continuing to support
+Noah in deprecated mode only. Some new model features, such as the
+improved output routines, have not been set up to be backward compatible
+with Noah. Noah users should follow the guidelines below for adapting
+the WRF-Hydro workflow to work with Noah:
+
+- **LSM initialization:** The simple wrfinput.nc initialization file
+ created by the create_Wrfinput.R script does not currently include
+ all of the fields required by the Noah LSM. Therefore, Noah users
+ should use the WRF real.exe utility to create a wrfinput_d0x file.
+ Refer to the WRF documentation and user guides for information on how
+ to do this.
+
+- **Time-varying vegetation specifications:** While the Noah LSM will
+ be properly initialized with green vegetation fraction from the
+ wrfinput file, there is currently no automated method to update this
+ field over time (e.g., seasonally based on climatology). Therefore,
+ Noah users will need to provide these time-varying fields in the
+ model input forcing files (e.g., LDASIN).
+
+- **Spatially varying parameters**: Spatially varying soil and
+ vegetation parameters (e.g., soil_properties.nc) are not supported in
+ Noah.
+
+- **Model outputs:** The updated output routines have not been adapted
+ to work with Noah. Therefore, Noah users should always use
+ io_form_outputs = 0 to activate the deprecated output routines.
+ Scale/offset and compression options, CF compliance, augmented
+ spatial metadata, etc. are not available in this deprecated mode.
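+
+As a minimal illustration of the last point, the deprecated output
+routines are selected in the hydro.namelist (see Section A5 for the full
+annotated file); the fragment below is a sketch showing only the relevant
+setting:
+
+.. code-block:: fortran
+
+   &HYDRO_nlist
+   ! Required when running with the Noah LSM: use the deprecated output routines
+   io_form_outputs = 0
+   /
+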
+
+.. _section-a3:
+
+A3. Noah `namelist.hrldas` File with Description of Options
+-----------------------------------------------------------
+
+Below is an annotated namelist.hrldas file for running with the Noah
+land surface model. Notes and descriptions follow the items they
+describe and are indicated with <--.
+
+.. code-block:: fortran
+
+ &NOAHLSM_OFFLINE
+ HRLDAS_CONSTANTS_FILE = "./DOMAIN/wrfinput_d01" !<-- Path to wrfinput file containing initialization data
+ ! for the LSM. This is required even for a warm start
+ ! where a restart file is provided.
+
+ INDIR = "./FORCING" !<-- Path to atmospheric forcing data directory.
+ OUTDIR = "./" !<-- Generally leave this as-is (output goes to base run directory);
+ ! redirected output only applies to LSM output files and can cause
+ ! issues when running coupled to WRF-Hydro.
+ START_YEAR = 2013 !<-- Simulation start year
+ START_MONTH = 09 !<-- Simulation start month
+ START_DAY = 01 !<-- Simulation start day
+ START_HOUR = 00 !<-- Simulation start hour
+ START_MIN = 00 !<-- Simulation start min
+ RESTART_FILENAME_REQUESTED = "RESTART.2013090100_DOMAIN1" !<-- Path to LSM restart file if using; this contains a
+ ! "warm" model state from a previous model run.
+ ! Comment out if not a restart simulation.
+
+ ! Specification of simulation length in days or hours
+ KHOUR = 24 !<-- Number of hours for simulation
+
+ ! Timesteps in units of seconds
+ FORCING_TIMESTEP = 3600 !<-- Timestep for forcing input data (in seconds)
+ NOAH_TIMESTEP = 3600 !<-- Timestep on which the LSM cycles (in seconds)
+ OUTPUT_TIMESTEP = 86400 !<-- Timestep for LSM outputs, LDASOUT (in seconds)
+
+ ! Land surface model restart file write frequency
+ RESTART_FREQUENCY_HOURS = 6 !<-- Timestep for LSM restart files to be generated (in hours). A value of -99999
+ ! will simply output restarts on the start of each month, useful for longer
+ ! model runs. Restart files are generally quite large, so be cognizant of
+ ! storage space and runtime impacts when specifying.
+ ! Split output after split_output_count output times.
+ SPLIT_OUTPUT_COUNT = 1 !<-- Number of timesteps to put in a single output file. This option
+ ! must be 1 for NWM output configurations.
+
+ ! Soil layer specification
+ NSOIL=4 !<-- Number of soil layers
+ ZSOIL(1) = 0.10 !<-- Thickness of top soil layer (m)
+ ZSOIL(2) = 0.30 !<-- Thickness of second soil layer (m)
+ ZSOIL(3) = 0.60 !<-- Thickness of third soil layer (m)
+ ZSOIL(4) = 1.00 !<-- Thickness of bottom soil layer (m)
+
+ ! Forcing data measurement heights
+ ZLVL = 2.0 !<-- Height of input temperature and humidity measurement/estimate
+ ZLVL_WIND = 10.0 !<-- Height of input wind speed measurement/estimate
+
+ IZ0TLND = 0 !<-- Switch to control land thermal roughness length. Option 0 is the default,
+ ! non-vegetation dependent value and option 1 introduces a vegetation dependence.
+ SFCDIF_OPTION = 0 !<-- Option to use the newer (option 1) or older
+ ! (option 0) SFCDIF routine. The default value is 0.
+ UPDATE_SNOW_FROM_FORCING = .FALSE. !<-- Option to activate or deactivate updating the snow cover
+ ! fields from available analyses. The default option is true.
+
+ ! -------- Section: Select atmospheric forcing input file format, FORC_TYP -------- !
+ ! Specification of forcing data: 1=HRLDAS-hr format, 2=HRLDAS-min format,
+ ! 3=WRF,4=Idealized, 5=Ideal w/ Spec.Precip.,
+ ! 6=HRLDAS-hrly format w/ Spec. Precip, 7=WRF w/ Spec. Precip
+ FORC_TYP = 3
+ /
+
+.. _section-a4:
+
+A4. Noah-MP `namelist.hrldas` File with Description of Options
+--------------------------------------------------------------
+
+Below is an annotated namelist.hrldas file for running with the Noah-MP
+land surface model. Note that the namelist group is named
+``&NOAHLSM_OFFLINE`` even though it is used with the Noah-MP LSM; this
+group name is hardcoded and thus not easily changed. Notes and
+descriptions follow the options they describe and are indicated with
+<--. See the official HRLDAS namelist description here:
+https://github.com/NCAR/hrldas-release/blob/release/HRLDAS/run/README.namelist
+
+.. code-block:: fortran
+
+ &NOAHLSM_OFFLINE
+ HRLDAS_SETUP_FILE = "./DOMAIN/wrfinput_d01" !<-- Path to wrfinput file containing initialization
+ ! data for the LSM. This is required even for a warm
+ ! start where a restart file is provided.
+ INDIR = "./FORCING" !<-- Path to atmospheric forcing data directory.
+
+ SPATIAL_FILENAME = "./DOMAIN/soil_properties.nc" !<-- Path to optional 2d/3d soil and vegetation
+ ! parameter file. If you are using this option,
+ ! you must also use a binary compiled with
+ ! SPATIAL_SOIL=1. If using the traditional
+ ! parameter lookup tables, compile with
+ ! SPATIAL_SOIL=0 and comment out this option.
+ OUTDIR = "./" !<-- Generally leave this as-is (output goes to base run directory); redirected
+ ! output only applies to LSM output files
+ ! and can cause issues when running coupled to WRF-Hydro.
+ START_YEAR = 2013 !<-- Simulation start year
+ START_MONTH = 09 !<-- Simulation start month
+ START_DAY = 12 !<-- Simulation start day
+ START_HOUR = 04 !<-- Simulation start hour
+ START_MIN = 00 !<-- Simulation start min
+ RESTART_FILENAME_REQUESTED = "RESTART.2013091204_DOMAIN1" !<-- Path to LSM restart file if using;
+ ! this contains a "warm" model state
+ ! from a previous model run. Comment out
+ ! if not a restart simulation.
+ ! Specification of simulation length in days OR hours
+ KHOUR = 24 !<-- Number of hours for simulation
+
+ ! -------- Following Section: Noah-MP physics options -------- !
+
+ ! Physics options (see the documentation for details)
+
+ DYNAMIC_VEG_OPTION = 4
+ CANOPY_STOMATAL_RESISTANCE_OPTION = 1
+ BTR_OPTION = 1
+ RUNOFF_OPTION = 3
+ SURFACE_DRAG_OPTION = 1
+ FROZEN_SOIL_OPTION = 1
+ SUPERCOOLED_WATER_OPTION = 1
+ RADIATIVE_TRANSFER_OPTION = 3
+ SNOW_ALBEDO_OPTION = 2
+ PCP_PARTITION_OPTION = 1
+ TBOT_OPTION = 2
+ TEMP_TIME_SCHEME_OPTION = 3
+ GLACIER_OPTION = 2
+ SURFACE_RESISTANCE_OPTION = 4
+
+ ! Timesteps in units of seconds
+ FORCING_TIMESTEP = 3600 !<-- Timestep for forcing input data (in seconds)
+ NOAH_TIMESTEP = 3600 !<-- Timestep on which the LSM cycles (in seconds)
+ OUTPUT_TIMESTEP = 86400 !<-- Timestep for LSM outputs, LDASOUT (in seconds)
+
+ ! Land surface model restart file write frequency
+ RESTART_FREQUENCY_HOURS = 2 !<-- Timestep for LSM restart files to be generated (in hours).
+ ! A value of -99999 will simply output restarts on the start of
+ ! each month, useful for longer model runs. Restart files are
+ ! generally quite large, so be cognizant of storage space and
+ ! runtime impacts when specifying.
+ ! Split output after split_output_count output times.
+ SPLIT_OUTPUT_COUNT = 1 !<-- Number of timesteps to put in a single output file.
+ ! This option must be 1 for NWM output configurations.
+
+ ! Soil layer specification
+ NSOIL=4 !<-- Number of soil layers
+ soil_thick_input(1) = 0.10 !<-- Thickness of top soil layer (m)
+ soil_thick_input(2) = 0.30 !<-- Thickness of second soil layer (m)
+ soil_thick_input(3) = 0.60 !<-- Thickness of third soil layer (m)
+ soil_thick_input(4) = 1.00 !<-- Thickness of bottom soil layer (m)
+
+ ! Forcing data measurement height for winds, temp, humidity
+ ZLVL = 10.0 !<-- Height of input wind speed
+
+ ! -------- Following Section: Restart IO file formats -------- !
+
+ ! Options to specify whether restart files (both read in and output)
+ ! should be in binary or netCDF format. Generally recommend using
+ ! netCDF format (option 0) for both.
+
+ ! Restart file format options
+ rst_bi_in = 0 !<-- 0: use netcdf input restart file 1: use parallel io for reading multiple
+ ! restart files (1 per core)
+ rst_bi_out = 0 !<-- 0: use netcdf output restart file 1: use parallel io for outputting multiple
+ ! restart files (1 per core)
+ /
+
+ &WRF_HYDRO_OFFLINE
+
+ ! Specification of forcing data: 1=HRLDAS-hr format, 2=HRLDAS-min format,
+ ! 3=WRF, 4=Idealized, 5=Ideal w/ Spec.Precip.,
+ ! 6=HRLDAS-hrly format w/ Spec. Precip, 7=WRF w/ Spec. Precip
+ FORC_TYP = 1
+ /
+
+.. _section-a5:
+
+A5. WRF-Hydro `hydro.namelist` File with Description of Options
+---------------------------------------------------------------
+
+Below is an annotated hydro.namelist file. Annotations follow what is
+being described and are indicated with <--. Note that these annotations
+are meant to accompany the commented description that precedes each
+option in the namelist.
+
+.. _hydro-namelist:
+
+.. code-block:: fortran
+
+ &HYDRO_nlist
+ !!!! --------------- SYSTEM COUPLING -------------- !!!!
+ ! Specify what is being coupled: 1=HRLDAS (offline Noah-LSM),
+ ! 2=WRF, 3=NASA/LIS, 4=CLM
+ sys_cpl = 1 !<-- For offline runs, including Noah and NoahMP, this will be option 1.
+
+ !!!! ----------- MODEL INPUT DATA FILES ----------- !!!!
+ ! Specify land surface model gridded input data file (e.g.: "geo_em.d01.nc")
+ GEO_STATIC_FLNM = "./DOMAIN/geo_em.d01.nc" !<-- Path to the “GEOGRID” file which contains base
+ ! information on the LSM grid (this file is generally
+ ! created via WPS in the model preprocessing steps).
+
+ ! Specify the high-resolution routing terrain input data file (e.g.: "Fulldom_hires.nc")
+ GEO_FINEGRID_FLNM = "./DOMAIN/Fulldom_hires.nc" !<-- Path to the “routing stack” which contains
+ ! base information on the high-resolution routing
+ ! grid. This file is generally created via the
+ ! GIS pre-processing tools.
+
+ ! Specify the spatial hydro parameters file (e.g.: "hydro2dtbl.nc")
+ ! If you specify a filename and the file does not exist, it will
+ ! be created for you.
+ HYDROTBL_F = "./DOMAIN/hydro2dtbl.nc" !<-- Path to the new 2d hydro parameters file. If this file
+ ! does not exist, it will be created for you based on
+ ! HYDRO.TBL and the soil and land class grids found in the
+ ! GEOGRID netCDF file
+
+ ! Specify spatial metadata file for land surface grid. (e.g.: "GEOGRID_LDASOUT_Spatial_Metadata.nc")
+ LAND_SPATIAL_META_FLNM = "./DOMAIN/GEOGRID_LDASOUT_Spatial_Metadata.nc" !<-- Path to the geospatial
+ ! metadata file for your domain. This file is required
+ ! if using any of the io_form_outputs options (i.e.,
+ ! io_form_outputs > 0). This file is generally created
+ ! via the GIS pre-processing tools.
+
+ ! Specify the name of the restart file if starting from restart...comment out with '!' if not...
+ RESTART_FILE = 'HYDRO_RST.2013-09-12_04:00_DOMAIN3' !<-- Path to hydro restart file; this contains
+ ! a "warm" model state from a previous model run.
+
+ !!!! ------------- MODEL SETUP OPTIONS ------------ !!!!
+ ! Specify the domain or nest number identifier...(integer)
+ IGRID = 1 !<-- Domain ID number. This comes from the WRF coupling framework and is intended to
+ ! specify which nested domain you are running. For standalone runs, this is not relevant;
+ ! HOWEVER, this ID must match the number specified after DOMAIN in your forcing file names
+ ! (e.g., the "1" in "2013091200.LDASIN_DOMAIN1").
+
+ ! Specify the restart file write frequency in minutes
+ ! A value of -99999 will output restarts on the first day of the month only.
+ rst_dt = 120 !<-- Specify how often hydro restart files should be generated, in minutes. This should
+ ! generally track your LSM restart file frequency (as specified in namelist.hrldas).
+ ! A value of -99999 will simply output restarts on the start of each month, useful for
+ ! longer model runs. Hydro restart files are generally quite large, so be cognizant of
+ ! storage space and runtime impacts.
+
+ ! Reset the LSM soil states from the high-res routing restart file (1=overwrite, 0=no overwrite)
+ ! NOTE: Only turn this option on if overland or subsurface routing is active!
+ rst_typ = 1 !<-- Specify whether or not to use the soil conditions (soil moisture and ponded water)
+ ! from the high-resolution hydro restart file, if "warm" starting the model with a
+ ! provided HYDRO_RST file. If this option is 0, the LSM restart states will be used
+ ! instead. IMPORTANT: If you are NOT running with terrain routing turned on, do not set
+ ! this option to 1 as it may bring in invalid values.
+
+ ! Restart file format control !<-- Options for whether restart files (input and output separately)
+ ! should be in binary or netCDF format. It is generally recommended to use
+ ! netCDF format (option 0) for both.
+ rst_bi_in = 0 !0: use netCDF input restart file (default) 1: use parallel io for reading multiple
+ ! restart files, 1 per core
+ rst_bi_out = 0 !0: use netCDF output restart file (default) 1: use parallel io for outputting multiple
+ ! restart files, 1 per core
+
+ ! Restart switch to set restart accumulation variables to 0 (0=no reset, 1=yes reset to 0.0)
+ RSTRT_SWC = 0 !<-- Specify whether or not to reset any accumulated output variables to 0 (option 1)
+ ! or to continue accumulating from the values in the hydro restart file (option 0).
+ ! Note that this only applies to the hydrologic model outputs; the LSM outputs will
+ ! always continue to accumulate from the LSM restart file.
+
+ ! Specify baseflow/bucket model initialization (0=cold start from table, 1=restart file)
+ GW_RESTART = 1 !<-- Specify whether to initialize the groundwater bucket states from the hydro
+ ! restart file (option 1) or "cold" start the bucket states from the parameter
+ ! table, GWBUCKPARM.nc.
+
+ !!!! ------------ MODEL OUTPUT CONTROL ------------ !!!!
+ ! Specify the output file write frequency...(minutes)
+ out_dt = 60 !<-- Timestep for hydro model outputs, in minutes. This covers all output options
+ ! listed below (CHRTOUT, GWOUT, RTOUT, LAKEOUT, etc.) so be cognizant of impacts
+ ! on disk space and runtime when specifying.
+
+ ! Specify the number of output times to be contained within each output history file...(integer)
+ ! SET = 1 WHEN RUNNING CHANNEL ROUTING ONLY/CALIBRATION SIMS!!!
+ ! SET = 1 WHEN RUNNING COUPLED TO WRF!!!
+ SPLIT_OUTPUT_COUNT = 1 !<-- Number of timesteps to put in a single output file.
+
+ ! Specify the minimum stream order to output to netCDF point file (integer)
+ ! Note: lower value of stream order produces more output.
+ order_to_write = 4 !<-- Lowest stream order to include in output files. Selecting 1 gives
+ ! you output for every reach/channel cell, selecting a higher order number
+ ! gives you fewer channel output elements.
+
+ ! Flag to turn on/off new I/O routines:
+ ! 0 = deprecated output routines (only use when running with the Noah LSM),
+ ! 1 = with scale/offset/compression,
+ ! 2 = with scale/offset/NO compression,
+ ! 3 = compression only,
+ ! 4 = no scale/offset/compression (default)
+ io_form_outputs = 1 !<-- Specify which output option to use (NOTE: option 0 is the only
+ ! supported option when running with the original Noah LSM)
+
+ ! Realtime run configuration option:
+ ! 0=all (default), 1=analysis, 2=short-range, 3=medium-range,
+ ! 4=long-range, 5=retrospective,
+ ! 6=diagnostic (includes all of 1-4 outputs combined)
+ io_config_outputs = 1 !<-- Specify which configuration of output variables to generate
+ ! (NOTE: not active when io_form_outputs=0)
+
+ ! Option to write output files at time 0 (restart cold start time): 0=no, 1=yes (default)
+ t0OutputFlag = 1 !<-- Select whether or not to create outputs at the initial timestep.
+
+ ! Options to output channel & bucket influxes. Only active for UDMP_OPT=1.
+ ! Nonzero choice requires that out_dt above matches NOAH_TIMESTEP in namelist.hrldas.
+ ! 0=None (default), 1=channel influxes (qSfcLatRunoff, qBucket)
+ ! 2=channel+bucket fluxes (qSfcLatRunoff, qBucket, qBtmVertRunoff_toBucket)
+ ! 3=channel accumulations (accSfcLatRunoff, accBucket) *NOT TESTED*
+ output_channelBucket_influx = 0 !<-- Select which additional channel and groundwater bucket
+ ! outputs will be generated. These additional variables can
+ ! be used to drive the channel-only model.
+
+ ! Output netCDF file control - specify which outputs to generate for the run.
+
+ CHRTOUT_DOMAIN = 1 !<-- Channel output variables (streamflow, velocity, head, etc.) as a netCDF
+ ! point timeseries output at all channel points (1d) 0 = no output, 1 = output
+
+ CHANOBS_DOMAIN = 0 !<-- NetCDF point timeseries at forecast points or gage points (defined in
+ ! Route_Link.nc) 0 = no output, 1 = output
+
+ CHRTOUT_GRID = 0 !<-- NetCDF grid of channel streamflow values (2d) 0 = no output, 1 = output
+ ! NOTE: Not available with reach-based routing
+
+ LSMOUT_DOMAIN = 0 !<-- NetCDF grid of variables passed between LSM and routing components (2d)
+ ! (generally used for diagnostics only)
+ ! 0 = no output, 1 = output NOTE: No scale_factor/add_offset available
+
+ RTOUT_DOMAIN = 1 !<-- NetCDF grid of terrain routing variables on routing grid (2d)
+ ! 0 = no output, 1 = output
+
+ output_gw = 1 !<-- NetCDF groundwater output, 0 = no output, 1 = output
+ ! Groundwater bucket outputs [level, inflow, outflow]
+
+ outlake = 1 !<-- NetCDF point timeseries of lake values (1d) 0 = no output, 1 = output
+ ! Lake output variables (if lakes are included in the domain) [level, inflow, outflow]
+
+ frxst_pts_out = 0 !<-- ASCII text file of streamflow at forecast points or gage points
+ ! (defined in Route_Link.nc), 0 = no output, 1 = output
+
+ !!!! ---- PHYSICS OPTIONS AND RELATED SETTINGS ---- !!!!
+
+ ! Specify the number of soil layers (integer) and the depth of the bottom of each layer... (meters)
+ ! Notes: In the current version of WRF-Hydro these must be the same as in the namelist.input file.
+ ! Future versions may permit this to be different.
+ NSOIL=4 !<-- Number of soil layers
+ ZSOIL8(1) = -0.10 !<-- Depth of bottom boundary of top soil layer in meters
+ ZSOIL8(2) = -0.40 !<-- Depth of bottom of second soil layer in meters (note that this is specified
+ ! differently than the namelist.hrldas; this is total depth from the surface
+ ! instead of thickness)
+ ZSOIL8(3) = -1.00 !<-- Depth of bottom of third soil layer in meters (note that this is specified
+ ! differently than the namelist.hrldas; this is total depth from the surface
+ ! instead of thickness)
+ ZSOIL8(4) = -2.00 !<-- Depth of bottom of fourth (last) soil layer in meters (note that this is
+ ! specified differently than the namelist.hrldas; this is total depth from the
+ ! surface instead of thickness)
+
+ ! Specify the grid spacing of the terrain routing grid (meters)
+ DXRT = 100.0 !<-- Resolution of the high-res routing grid
+ ! Specify the integer multiple between the land model grid and the terrain routing grid (integer)
+ AGGFACTRT = 10 !<-- Aggregation factor between the high-res routing grid and the LSM grid;
+ ! e.g., a 100-m routing grid resolution and a 1km LSM grid resolution would
+ ! be AGGFACTRT = 10.
+
+ ! Specify the channel routing model timestep (seconds)
+ DTRT_CH = 10 !<-- Timestep for the channel routing module to cycle, in seconds; model runtime
+ ! will be sensitive to this timestep, so choose something appropriate for your
+ ! domain resolution (finer resolutions generally require finer timesteps).
+ ! Specify the terrain routing model timestep (seconds)
+ DTRT_TER = 10 !<-- Timestep for the terrain routing module to cycle, in seconds; model runtime
+ ! will be sensitive to this timestep, so choose something appropriate for your
+ ! domain resolution (finer resolutions generally require finer timesteps).
+
+ ! Switch to activate subsurface routing...(0=no, 1=yes)
+ SUBRTSWCRT = 1 !<-- Turn on/off subsurface routing module.
+ ! Switch to activate surface overland flow routing...(0=no, 1=yes)
+ OVRTSWCRT = 1 !<-- Turn on/off overland routing module.
+
+ ! Specify overland flow routing option:
+ ! 1=Steepest Descent (D8), 2=CASC2D (not active)
+ ! NOTE: Currently subsurface flow is only steepest descent
+ rt_option = 1 !<-- For both terrain routing modules, specify whether flow should follow the
+ ! steepest path (option 1) or multi-directional (option 2).
+ ! Option 2 is currently unsupported.
+
+ ! Switch to activate channel routing...(0=no, 1=yes)
+ CHANRTSWCRT = 1 !<-- Turn on/off channel routing module.
+
+ ! Specify channel routing option:
+ ! 1=Muskingum-reach, 2=Musk.-Cunge-reach, 3=Diff.Wave-gridded
+ channel_option = 3 !<-- If channel routing module is active, select which physics option to use.
+
+ ! Specify the reach file for reach-based routing options (e.g.: "Route_Link.nc")
+ route_link_f = "./DOMAIN/Route_Link.nc" !<-- If using one of the reach-based channel routing
+ ! options (channel_option = 1 or 2), specify the path
+ ! to the Route_Link.nc file, which provides the
+ ! channel-reach parameters.
+
+ ! If using channel_option=2, activate the compound channel formulation? (Default=.FALSE.)
+ ! This option is currently only supported if using reach-based routing with UDMP=1.
+ compound_channel = .FALSE.
+
+ ! Specify the lake parameter file (e.g.: "LAKEPARM.nc"). Note: REQUIRED if lakes are on.
+ route_lake_f = "./DOMAIN/LAKEPARM.nc" !<-- If lakes are active, specify the path to the lake
+ ! parameter file, which provides the lake parameters.
+
+ ! Switch to activate baseflow bucket model...
+ ! (0=none, 1=exp. bucket, 2=pass-through)
+ GWBASESWCRT = 1 !<-- Turn on/off the ground water bucket module. Option 1 activates the
+ ! exponential bucket model, Option 2 bypasses the bucket model and dumps all
+ ! flow from the bottom of the soil column directly into the channel, and
+ ! Option 0 creates a sink at the bottom of the soil column (water draining from
+ ! the bottom of the soil column leaves the system, so note that this option will
+ ! not have water balance closure).
+
+ ! Groundwater/baseflow 2d mask specified on land surface model grid (e.g.: "GWBASINS.nc").
+ ! NOTE: Only required if baseflow model is active (1 or 2) and UDMP_OPT=0.
+ gwbasmskfil = "./DOMAIN/GWBASINS.nc" !<-- For configurations where the bucket or pass-through
+ ! groundwater modules are active, provide the path to the
+ ! 2d netCDF file (LSM grid resolution) that maps the
+ ! groundwater basin IDs. Bucket parameters will be specified
+ ! through the GWBUCKPARM.nc file, whose IDs should match
+ ! those in the groundwater basin mask file.
+
+ ! Groundwater bucket parameter file (e.g.: "GWBUCKPARM.nc")
+ GWBUCKPARM_file = "./DOMAIN/GWBUCKPARM.nc" !<-- For configurations where the groundwater bucket
+ ! model is active, specify the path to the bucket
+ ! parameter file, which provides bucket parameters
+ ! by catchment.
+
+ ! User defined mapping, such as NHDPlus: 0=no (default), 1=yes
+ UDMP_OPT = 0 !<-- If 1, this tells the model to use a "user-defined mapping" scheme to translate
+ ! between terrain and groundwater flow and reaches, e.g., NHDPlus.
+ ! If UDM is on, specify the user-defined mapping file (e.g.: "spatialweights.nc")
+ !udmap_file = "./DOMAIN/spatialweights.nc" !<-- If UDMP_OPT=1 (user defined mapping is active),
+ ! provide the path to the required spatial weights
+ ! file, which maps between grid cells and catchments.
+
+ / !<-- End of hydro namelist HYDRO_nlist
+
+ &NUDGING_nlist !<-- Start of separate namelist for nudging, only used if the model is compiled
+ ! with the compile-time option WRF_HYDRO_NUDGING=1. Ignored otherwise.
+
+ ! Path to the "timeslice" observation files.
+ timeSlicePath = "./nudgingTimeSliceObs/" !<-- Path to a directory containing nudging "time slice"
+ ! observation files. There are no requirements on the
+ ! existence of files in the directory, but the directory
+ ! itself must exist if specified.
+ nudgingParamFile = "DOMAIN/nudgingParams.nc" !<-- Path to the required nudging parameter file.
+ ! Nudging restart file. nudgingLastObsFile defaults to '', which will look for
+ ! nudgingLastObs.YYYY-mm-dd_HH:MM:SS.nc *AT THE INITIALIZATION TIME OF THE RUN*. Set to a missing
+ ! file to use no restart.
+ !nudgingLastObsFile = '/a/nonexistent/file/gives/nudging/cold/start' !<-- Optional path to an
+ ! optional nudging restart
+ ! file. See comments above.
+ ! Parallel input of nudging timeslice observation files?
+ readTimesliceParallel = .TRUE. !<-- Can read the observation files in parallel (on different cores)
+ ! for quicker run speeds.
+
+ ! temporalPersistence defaults to true, only runs if necessary params present.
+ temporalPersistence = .FALSE. !<-- This option uses the expCoeff
+ ! parameter for persisting observations
+
+ ! The total number of last (obs, modeled) pairs to save in nudgingLastObs for removal of bias.
+ ! This is the maximum array length. (This option is active when persistBias=FALSE)
+ ! (Default=960=10days @15min obs resolution, if all the obs are present and longer if not.)
+ nLastObs = 960 !<-- The maximum trailing window size for calculating bias correction.
+
+ ! If using temporalPersistence the last observation persists by default. This option instead
+ ! persists the bias after the last observation.
+ persistBias = .FALSE. !<-- Apply bias correction as observations move into the past?
+ ! AnA (FALSE) vs Forecast (TRUE) bias persistence.
+
+ ! If persistBias: Does the window for calculating the bias end at model init time (=t0)?
+ ! FALSE = window ends at model time (moving),
+ ! TRUE = window ends at init=t0(fcst) time.
+ ! (If commented out, Default=FALSE)
+ ! Note: Perfect restart tests require this option to be .FALSE.
+ biasWindowBeforeT0 = .FALSE. !<-- Is the bias window shifting with
+ ! model integration?
+
+ ! If persistBias: Only use this many last (obs, modeled) pairs.
+ ! (If Commented out, Default=-1*nLastObs)
+ ! > 0: apply an age-based filter, units=hours.
+ ! = 0: apply no additional filter, use all available/usable obs.
+ ! < 0: apply a count-based filter, units=count
+ maxAgePairsBiasPersist = -960
+
+ ! If persistBias: The minimum number of last (obs, modeled) pairs, with age less
+ ! than maxAgePairsBiasPersist, required to apply a bias correction. (default=8)
+ minNumPairsBiasPersist = 8
+
+ ! If persistBias: give more weight to observations closer in time? (default=FALSE)
+ invDistTimeWeightBias = .TRUE. !<-- The exact form of this
+ ! weighting is currently hard-coded.
+
+ ! If persistBias: "No constructive interference in bias correction?", reduce the bias
+ ! adjustment when the model and the bias adjustment have the same sign relative to the
+ ! modeled flow at t0? (default=FALSE)
+ ! Note: Perfect restart tests require this option to be .FALSE.
+ noConstInterfBias = .FALSE. !<-- Tactical response to phase errors.
+ /
+
+.. _section-A6:
+.. _section-16:
+
+A6. Noah land surface model parameter tables
+--------------------------------------------
+
+The Noah land surface model requires three parameter table files denoted
+by the file suffix TBL. The variables contained within these files are
+described in the tables below.
+
+Please refer to the Noah land surface model documentation
+(https://ral.ucar.edu/sites/default/files/public/product-tool/unified-noah-lsm/Noah_LSM_USERGUIDE_2.7.1.pdf)
+for additional information.
+
+`GENPARM.TBL` - This file contains global parameters for the Noah land surface model.
+
+.. table::
+ :width: 90%
+ :align: center
+
+ +--------------------+-------------------------------------------------+
+ | **Variable name** | **Description** |
+ +====================+=================================================+
+ | SLOPE_DATA | Linear reservoir coefficient |
+ +--------------------+-------------------------------------------------+
+ | SBETA_DATA | Parameter used to calculate vegetation effect |
+ | | on soil heat |
+ +--------------------+-------------------------------------------------+
+ | FXEXP_DAT | Soil evaporation exponent used in DEVAP |
+ +--------------------+-------------------------------------------------+
+ | CSOIL_DATA | Soil heat capacity [:math:`J/m^3/K`] |
+ +--------------------+-------------------------------------------------+
+ | SALP_DATA | Shape parameter of distribution function of |
+ | | snow cover |
+ +--------------------+-------------------------------------------------+
+ | REFDK_DATA | Parameter in the surface runoff |
+ | | parameterization |
+ +--------------------+-------------------------------------------------+
+ | REFKDT_DATA | Parameter in the surface runoff |
+ | | parameterization |
+ +--------------------+-------------------------------------------------+
+ | FRZK_DATA | Frozen ground parameter |
+ +--------------------+-------------------------------------------------+
+ | ZBOT_DATA | Depth of lower boundary soil temperature |
+ | | [:math:`m`] |
+ +--------------------+-------------------------------------------------+
+ | CZIL_DATA | Parameter used in the calculation of the |
+ | | roughness length for heat |
+ +--------------------+-------------------------------------------------+
+ | SMLOW_DATA | Soil moisture wilt, soil moisture reference |
+ | | parameter |
+ +--------------------+-------------------------------------------------+
+ | SMHIGH_DATA | Soil moisture wilt, soil moisture reference |
+ | | parameter |
+ +--------------------+-------------------------------------------------+
+ | LVCOEF_DATA | Parameter in the snow albedo formulation |
+ +--------------------+-------------------------------------------------+
+
+| `SOILPARM.TBL` - This file contains parameters that are assigned based
+ upon soil classification.
+| *All parameters are a function of soil class.*
+
+.. table::
+ :width: 90%
+ :align: center
+
+ +-------------+--------------------------------------------------------+
+ | **Variable | **Description** |
+ | name** | |
+ +=============+========================================================+
+ | BB | B parameter |
+ +-------------+--------------------------------------------------------+
+ | DRYSMC | Dry soil moisture threshold at which direct |
+ | | evaporation from top soil layer ends |
+ +-------------+--------------------------------------------------------+
+ | F11 | Soil thermal diffusivity/conductivity coefficient |
+ +-------------+--------------------------------------------------------+
+ | MAXSMC | Saturation soil moisture content (i.e. porosity) |
+ +-------------+--------------------------------------------------------+
+ | REFSMC | Reference soil moisture (field capacity), where |
+ | | transpiration begins to stress |
+ +-------------+--------------------------------------------------------+
+ | SATPSI | Saturation soil matric potential |
+ +-------------+--------------------------------------------------------+
+ | SATDK | Saturation soil conductivity |
+ +-------------+--------------------------------------------------------+
+ | SATDW | Saturation soil diffusivity |
+ +-------------+--------------------------------------------------------+
+ | WLTSMC | Wilting point soil moisture |
+ +-------------+--------------------------------------------------------+
+ | QTZ | Soil quartz content |
+ +-------------+--------------------------------------------------------+
+
+| `VEGPARM.TBL` - This file contains parameters that are assigned based on land cover type.
+| *All parameters are a function of land cover type.*
+
+.. table::
+ :width: 90%
+ :align: center
+
+ +-------------------+---------------------------------------------------------+
+ | **Variable name** | **Description** |
+ +===================+=========================================================+
+ | SHDFAC | Green vegetation fraction |
+ +-------------------+---------------------------------------------------------+
+ | NROOT | Number of soil layers (from the top) reached by |
+ | | vegetation roots |
+ +-------------------+---------------------------------------------------------+
+ | RS | Minimum stomatal resistance [`s/m`] |
+ +-------------------+---------------------------------------------------------+
+ | RGL | Parameter used in radiation stress function |
+ +-------------------+---------------------------------------------------------+
+ | HS | Parameter used in vapor pressure deficit function |
+ +-------------------+---------------------------------------------------------+
+ | SNUP | Threshold water-equivalent snow depth [m] that implies |
+ | | 100% snow cover |
+ +-------------------+---------------------------------------------------------+
+ | MAXALB | Upper bound on maximum albedo over deep snow [`\%`] |
+ +-------------------+---------------------------------------------------------+
+ | LAIMIN | Minimum leaf area index through the year |
+ | | [dimensionless] |
+ +-------------------+---------------------------------------------------------+
+ | LAIMAX | Maximum leaf area index through the year |
+ | | [dimensionless] |
+ +-------------------+---------------------------------------------------------+
+ | EMISSMIN | Minimum background emissivity through the year |
+ | | [fraction 0.0 to 1.0] |
+ +-------------------+---------------------------------------------------------+
+ | EMISSMAX | Maximum background emissivity through the year |
+ | | [fraction 0.0 to 1.0] |
+ +-------------------+---------------------------------------------------------+
+ | ALBEDOMIN | Minimum background albedo through the year [fraction |
+ | | 0.0 to 1.0] |
+ +-------------------+---------------------------------------------------------+
+ | ALBEDOMAX | Maximum background albedo through the year [fraction |
+ | | 0.0 to 1.0] |
+ +-------------------+---------------------------------------------------------+
+ | Z0MIN | Minimum background roughness length through the year |
+ | | [`m`] |
+ +-------------------+---------------------------------------------------------+
+ | Z0MAX | Maximum background roughness length through the year |
+ | | [`m`] |
+ +-------------------+---------------------------------------------------------+
+ | TOPT_DATA | Optimum transpiration air temperature [`K`] |
+ +-------------------+---------------------------------------------------------+
+ | CMCMAX_DATA | Maximum canopy water capacity [volumetric fraction] |
+ | | |
+ +-------------------+---------------------------------------------------------+
+ | CFACTR_DATA | Parameter used in the canopy interception calculation |
+ | | [dimensionless] |
+ +-------------------+---------------------------------------------------------+
+ | RSMAX_DATA | Maximal stomatal resistance [`s/m`] |
+ +-------------------+---------------------------------------------------------+
+ | BARE | The land-use category representing bare ground (used to |
+ | | set the vegetation fraction to zero) [land-use category |
+ | | index] |
+ +-------------------+---------------------------------------------------------+
+ | NATURAL | The land-use category representative of the non-urban |
+ | | portion of urban land-use points [land-use category |
+ | | index] |
+ +-------------------+---------------------------------------------------------+
+
+.. _section-A7:
+
+A7. Noah-MP land surface model parameter tables
+-----------------------------------------------
+
+The Noah-MP land surface model requires three parameter table files
+denoted by the file suffix TBL. The variables contained within these
+files are described in the tables below.
+
+As part of the work conducted for the National Water Model
+implementation, the ability to specify a number of these land surface
+model parameters spatially on a two or three dimensional grid was
+introduced. This is done through the use of the compile time option
+``SPATIAL_SOIL`` and the specification of a netCDF format parameter file
+with the default filename soil_properties.nc. A list of the variables
+contained in this file is included in a table below as well.
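+
+For example, when the model is compiled with SPATIAL_SOIL=1, the
+spatially distributed parameters are read from the file pointed to by
+SPATIAL_FILENAME in namelist.hrldas (see Section A4). The fragment below
+is a sketch showing only that setting:
+
+.. code-block:: fortran
+
+   &NOAHLSM_OFFLINE
+   ! Only read when the model is compiled with SPATIAL_SOIL=1;
+   ! comment out to fall back to the parameter lookup tables.
+   SPATIAL_FILENAME = "./DOMAIN/soil_properties.nc"
+   /
+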
+
+`GENPARM.TBL` - This file contains global parameters for the Noah-MP
+land surface model.
+
+.. table::
+ :width: 90%
+ :align: center
+
+ +---------------+------------------------------------------------------+
+ | **Variable | **Description** |
+ | name** | |
+ +===============+======================================================+
+ | SLOPE_DATA | Linear reservoir coefficient |
+ +---------------+------------------------------------------------------+
+ | SBETA_DATA | Parameter used to calculate vegetation effect on |
+ | | soil heat |
+ +---------------+------------------------------------------------------+
+ | FXEXP_DAT | Soil evaporation exponent used in DEVAP |
+ +---------------+------------------------------------------------------+
+ | CSOIL_DATA | Soil heat capacity [:math:`J/m^3/K`] |
+ +---------------+------------------------------------------------------+
+ | SALP_DATA | Shape parameter of distribution function of snow |
+ | | cover |
+ +---------------+------------------------------------------------------+
+ | REFDK_DATA | Parameter in the surface runoff parameterization |
+ +---------------+------------------------------------------------------+
+ | REFKDT_DATA | Parameter in the surface runoff parameterization |
+ +---------------+------------------------------------------------------+
+ | FRZK_DATA | Frozen ground parameter |
+ +---------------+------------------------------------------------------+
+ | ZBOT_DATA | Depth of lower boundary soil temperature [:math:`m`] |
+ +---------------+------------------------------------------------------+
+ | CZIL_DATA | Parameter used in the calculation of the roughness |
+ | | length for heat |
+ +---------------+------------------------------------------------------+
+ | SMLOW_DATA | Soil moisture wilt, soil moisture reference |
+ | | parameter |
+ +---------------+------------------------------------------------------+
+ | SMHIGH_DATA | Soil moisture wilt, soil moisture reference |
+ | | parameter |
+ +---------------+------------------------------------------------------+
+ | LVCOEF_DATA | Parameter in the snow albedo formulation |
+ +---------------+------------------------------------------------------+
+
+`SOILPARM.TBL` - This file contains parameters that are assigned based
+on soil classification.
+
+.. table::
+ :width: 90%
+ :align: center
+
+ +--------------+-------------------------------------------------------+
+ | **Variable | **Description** |
+ | name** | |
+ +==============+=======================================================+
+ | BB | B parameter |
+ +--------------+-------------------------------------------------------+
+ | DRYSMC | Dry soil moisture threshold at which direct |
+ | | evaporation from top soil layer ends |
+ +--------------+-------------------------------------------------------+
+ | F11 | Soil thermal diffusivity/conductivity coefficient |
+ +--------------+-------------------------------------------------------+
+ | MAXSMC | Saturation soil moisture content (i.e. porosity) |
+ +--------------+-------------------------------------------------------+
+ | REFSMC | Reference soil moisture (field capacity), where |
+ | | transpiration begins to stress |
+ +--------------+-------------------------------------------------------+
+ | SATPSI | Saturation soil matric potential |
+ +--------------+-------------------------------------------------------+
+ | SATDK | Saturation soil conductivity |
+ +--------------+-------------------------------------------------------+
+ | SATDW | Saturation soil diffusivity |
+ +--------------+-------------------------------------------------------+
+ | WLTSMC | Wilting point soil moisture |
+ +--------------+-------------------------------------------------------+
+ | QTZ | Soil quartz content |
+ +--------------+-------------------------------------------------------+
+
+`MPTABLE.TBL` - This file contains parameters that are a function of
+land cover type.
+
+.. table::
+ :width: 90%
+ :align: center
+
+ +-------------------------+--------------------------------------------+
+ | **Variable name** | **Description** |
+ +=========================+============================================+
+ | VEG_DATASET_DESCRIPTION | Land cover classification dataset |
+ +-------------------------+--------------------------------------------+
+ | NVEG | Number of land cover categories |
+ +-------------------------+--------------------------------------------+
+ | ISURBAN | Land cover category for urban |
+ +-------------------------+--------------------------------------------+
+ | ISWATER | Land cover category for water |
+ +-------------------------+--------------------------------------------+
+ | ISBARREN | Land cover category for barren |
+ +-------------------------+--------------------------------------------+
+ | ISICE | Land cover category for ice |
+ +-------------------------+--------------------------------------------+
+ | EBLFOREST | Land cover category for evergreen |
+ | | broadleaf forest |
+ +-------------------------+--------------------------------------------+
+ | .. centered:: *Parameters below are a function of land cover type* |
+ +-------------------------+--------------------------------------------+
+ | CH2OP | Maximum intercepted H\ :sub:`2`\O per unit |
+ | | LAI + SAI [:math:`mm`] |
+ +-------------------------+--------------------------------------------+
+ | DLEAF | Characteristic leaf dimension [:math:`m`] |
+ +-------------------------+--------------------------------------------+
+ | Z0MVT | Momentum roughness length [:math:`m`] |
+ +-------------------------+--------------------------------------------+
+ | HVT | Top of canopy [:math:`m`] |
+ +-------------------------+--------------------------------------------+
+ | HVB | Bottom of canopy [:math:`m`] |
+ +-------------------------+--------------------------------------------+
+ | DEN | Tree density [:math:`trunks/m^2`\] |
+ +-------------------------+--------------------------------------------+
+ | RC | Tree crown radius [:math:`m`] |
+ +-------------------------+--------------------------------------------+
+ | MFSNO | Snowmelt m parameter |
+ +-------------------------+--------------------------------------------+
+ | RHOS_VIS | Stem reflectance, visible |
+ +-------------------------+--------------------------------------------+
+ | RHOS_NIR | Stem reflectance, near infrared |
+ +-------------------------+--------------------------------------------+
+ | TAUL_VIS | Leaf transmittance, visible |
+ +-------------------------+--------------------------------------------+
+ | TAUL_NIR | Leaf transmittance, near infrared |
+ +-------------------------+--------------------------------------------+
+ | TAUS_VIS | Stem transmittance, visible |
+ +-------------------------+--------------------------------------------+
+ | TAUS_NIR | Stem transmittance, near infrared |
+ +-------------------------+--------------------------------------------+
+ | XL | Leaf / stem orientation index |
+ +-------------------------+--------------------------------------------+
+ | CWPVT | Canopy wind parameter |
+ +-------------------------+--------------------------------------------+
+ | C3PSN | Photosynthetic pathway [c4 = 0. \| c3 = |
+ | | 1.] |
+ +-------------------------+--------------------------------------------+
+ | KC25 | CO2 Michaelis-Menten constant |
+ | | at 25°C [:math:`Pa`] |
+ +-------------------------+--------------------------------------------+
+ | AKC | Q10 for KC25 |
+ +-------------------------+--------------------------------------------+
+ | KO25 | O2 Michaelis-Menten constant |
+ | | at 25°C [:math:`Pa`] |
+ +-------------------------+--------------------------------------------+
+ | AKO | Q10 for KO25 |
+ +-------------------------+--------------------------------------------+
+ | AVCMX | Q10 for VCMX25 |
+ +-------------------------+--------------------------------------------+
+ | AQE | Q10 for QE25 |
+ +-------------------------+--------------------------------------------+
+ | LTOVRC | Leaf turnover [:math:`1/s`] |
+ +-------------------------+--------------------------------------------+
+ | DILEFC | Coefficient for leaf stress death |
+ | | [:math:`1/s`] |
+ +-------------------------+--------------------------------------------+
+ | DILEFW | Coefficient for leaf stress death |
+ | | [:math:`1/s`] |
+ +-------------------------+--------------------------------------------+
+ | RMF25 | Leaf maintenance respiration at 25°C |
+ | | [:math:`umol\ CO_{2}/m^2/s`] |
+ +-------------------------+--------------------------------------------+
+ | SLA | Single-sided leaf area [:math:`m^2/kg`] |
+ +-------------------------+--------------------------------------------+
+ | FRAGR | Fraction of growth respiration |
+ +-------------------------+--------------------------------------------+
+ | TMIN | Minimum temperature for photosynthesis |
+ | | [:math:`K`] |
+ +-------------------------+--------------------------------------------+
+ | VCMX25 | Maximum rate of carboxylation at 25°C |
+ | | [:math:`umol\ CO_{2}/m^2/s`] |
+ +-------------------------+--------------------------------------------+
+ | TDLEF | Characteristic temperature for leaf |
+ | | freezing [:math:`K`] |
+ +-------------------------+--------------------------------------------+
+ | BP | Minimum leaf conductance |
+ | | [:math:`umol\ /m^2/s`] |
+ +-------------------------+--------------------------------------------+
+ | MP | Slope of conductance to photosynthesis |
+ | | relationship |
+ +-------------------------+--------------------------------------------+
+ | QE25 | Quantum efficiency at 25°C |
+ | | [:math:`umol\ CO_{2} / umol\ photon`] |
+ +-------------------------+--------------------------------------------+
+ | RMS25 | Stem maintenance respiration at 25°C |
+ | | [:math:`umol\ CO_{2}/kg_{bio}/s`] |
+ +-------------------------+--------------------------------------------+
+ | RMR25 | Root maintenance respiration at 25°C |
+ | | [:math:`umol\ CO_{2}/kg_{bio}/s`] |
+ +-------------------------+--------------------------------------------+
+ | ARM | Q10 for maintenance respiration |
+ +-------------------------+--------------------------------------------+
+ | FOLNMX | Foliage nitrogen concentration when |
+ | | :math:`f(n)=1` [:math:`\%`] |
+ +-------------------------+--------------------------------------------+
+ | WRRAT | Wood to non-wood ratio |
+ +-------------------------+--------------------------------------------+
+ | MRP | Microbial respiration parameter |
+ | | [:math:`umol\ CO_{2}/kg_{C}/s`] |
+ +-------------------------+--------------------------------------------+
+ | NROOT | Number of soil layers with root present |
+ +-------------------------+--------------------------------------------+
+ | RGL | Parameter used in radiation stress |
+ | | function |
+ +-------------------------+--------------------------------------------+
+ | RS | Stomatal resistance [:math:`s/m`] |
+ +-------------------------+--------------------------------------------+
+ | HS | Parameter used in vapor pressure deficit |
+ | | function |
+ +-------------------------+--------------------------------------------+
+ | TOPT | Optimum transpiration air temperature [K] |
+ +-------------------------+--------------------------------------------+
+ | RSMAX | Maximal stomatal resistance |
+ | | [:math:`s/m`] |
+ +-------------------------+--------------------------------------------+
+ | SAI | Stem area index |
+ +-------------------------+--------------------------------------------+
+ | LAI | Leaf area index |
+ +-------------------------+--------------------------------------------+
+ | SLAREA | (not used in Noah-MP as configured in |
+ | | WRF-Hydro) |
+ +-------------------------+--------------------------------------------+
+ | EPS1 | (not used in Noah-MP as configured in |
+ | | WRF-Hydro) |
+ +-------------------------+--------------------------------------------+
+ | EPS2 | (not used in Noah-MP as configured in |
+ | | WRF-Hydro) |
+ +-------------------------+--------------------------------------------+
+ | EPS3 | (not used in Noah-MP as configured in |
+ | | WRF-Hydro) |
+ +-------------------------+--------------------------------------------+
+ | EPS4 | (not used in Noah-MP as configured in |
+ | | WRF-Hydro) |
+ +-------------------------+--------------------------------------------+
+ | EPS5 | (not used in Noah-MP as configured in |
+ | | WRF-Hydro) |
+ +-------------------------+--------------------------------------------+
+ | .. centered:: *Parameters below are a function of soil color index* |
+ +-------------------------+--------------------------------------------+
+ | ALBSAT_VIS | Saturated soil albedos for visible |
+ +-------------------------+--------------------------------------------+
+ | ALBSAT_NIR | Saturated soil albedos for near infrared |
+ +-------------------------+--------------------------------------------+
+ | ALBDRY_VIS | Dry soil albedos for visible |
+ +-------------------------+--------------------------------------------+
+ | ALBDRY_NIR | Dry soil albedos for near infrared |
+ +-------------------------+--------------------------------------------+
+ | .. centered:: *Parameters below are global* |
+ +-------------------------+--------------------------------------------+
+ | ALBICE | Albedo land ice (visible and near |
+ | | infrared) |
+ +-------------------------+--------------------------------------------+
+ | ALBLAK | Albedo frozen lakes (visible and near |
+ | | infrared) |
+ +-------------------------+--------------------------------------------+
+ | OMEGAS | Two-stream parameter for snow |
+ +-------------------------+--------------------------------------------+
+ | BETADS | Two-stream parameter for snow |
+ +-------------------------+--------------------------------------------+
+ | BETAIS | Two-stream parameter for snow |
+ +-------------------------+--------------------------------------------+
+ | EG | Emissivity soil surface (soil and lake) |
+ +-------------------------+--------------------------------------------+
+ | CO2 | CO\ :sub:`2` partial pressure |
+ +-------------------------+--------------------------------------------+
+ | O2 | O\ :sub:`2` partial pressure |
+ +-------------------------+--------------------------------------------+
+ | TIMEAN | Grid cell mean topographic index [global |
+ | | mean] |
+ +-------------------------+--------------------------------------------+
+ | FSATMX | Maximum surface saturated fraction [global |
+ | | mean] |
+ +-------------------------+--------------------------------------------+
+ | Z0SNO | Snow surface roughness length [:math:`m`] |
+ +-------------------------+--------------------------------------------+
+ | SSI | Liquid water holding capacity for snowpack |
+ | | [:math:`m^3/m^3`] |
+ +-------------------------+--------------------------------------------+
+ | SWEMX | New snow mass to fully cover old snow |
+ | | [:math:`mm`] |
+ +-------------------------+--------------------------------------------+
+ | TAU0 | Tau0 from Yang97 eqn. 10a |
+ +-------------------------+--------------------------------------------+
+ | GRAIN_GROWTH | Growth from vapor diffusion Yang97 |
+ | | eqn. 10b |
+ +-------------------------+--------------------------------------------+
+ | EXTRA_GROWTH | Extra growth near freezing Yang97 |
+ | | eqn. 10c |
+ +-------------------------+--------------------------------------------+
+ | DIRT_SOOT | Dirt and soot term Yang97 eqn. 10d |
+ +-------------------------+--------------------------------------------+
+ | BATS_COSZ | Zenith angle snow albedo |
+ | | adjustment; b in Yang97 eqn. 15 |
+ +-------------------------+--------------------------------------------+
+ | BATS_VIS_NEW | New snow visible albedo |
+ +-------------------------+--------------------------------------------+
+ | BATS_NIR_NEW | New snow NIR albedo |
+ +-------------------------+--------------------------------------------+
+ | BATS_VIS_AGE | Age factor for diffuse visible snow |
+ | | albedo Yang97 eqn. 17 |
+ +-------------------------+--------------------------------------------+
+ | BATS_NIR_AGE | Age factor for diffuse NIR snow |
+ | | albedo Yang97 eqn. 18 |
+ +-------------------------+--------------------------------------------+
+ | BATS_VIS_DIR | Cosz factor for direct visible snow |
+ | | albedo Yang97 eqn. 15 |
+ +-------------------------+--------------------------------------------+
+ | BATS_NIR_DIR | Cosz factor for direct NIR snow |
+ | | albedo Yang97 eqn. 16 |
+ +-------------------------+--------------------------------------------+
+ | RSURF_SNOW | Surface resistance for snow [:math:`s/m`] |
+ +-------------------------+--------------------------------------------+
+ | RSURF_EXP | Exponent in the shape parameter for |
+ | | soil resistance option 1 |
+ +-------------------------+--------------------------------------------+
+
+`soil\_properties.nc` [optional]
+
+.. table::
+ :width: 90%
+ :align: center
+
+ +------------+----------------------------------------------------------+
+ | **Variable | **Description** |
+ | name** | |
+ +============+==========================================================+
+ | bexp | Beta parameter |
+ +------------+----------------------------------------------------------+
+ | cwpvt | Empirical canopy wind parameter |
+ +------------+----------------------------------------------------------+
+ | dksat | Saturated soil hydraulic conductivity |
+ +------------+----------------------------------------------------------+
+ | dwsat | Saturated soil hydraulic diffusivity |
+ +------------+----------------------------------------------------------+
+ | hvt | Top of vegetation canopy [:math:`m`] |
+ +------------+----------------------------------------------------------+
+ | mfsno | Snowmelt m parameter |
+ +------------+----------------------------------------------------------+
+ | mp | Slope of conductance to photosynthesis relationship |
+ +------------+----------------------------------------------------------+
+ | psisat | Saturated soil matric potential |
+ +------------+----------------------------------------------------------+
+ | quartz | Soil quartz content |
+ +------------+----------------------------------------------------------+
+ | refdk | Parameter in the surface runoff parameterization |
+ +------------+----------------------------------------------------------+
+ | refkdt | Parameter in the surface runoff parameterization |
+ +------------+----------------------------------------------------------+
+ | rsurf_exp | Exponent in the shape parameter for soil |
+ | | resistance option 1 |
+ +------------+----------------------------------------------------------+
+ | slope | Slope index |
+ +------------+----------------------------------------------------------+
+ | smcdry | Dry soil moisture threshold where direct evaporation |
+ | | from the top layer ends |
+ +------------+----------------------------------------------------------+
+ | smcmax | Saturated value of soil moisture [volumetric] |
+ +------------+----------------------------------------------------------+
+ | smcref | Reference soil moisture (field capacity) [volumetric] |
+ +------------+----------------------------------------------------------+
+ | smcwlt | Wilting point soil moisture [volumetric] |
+ +------------+----------------------------------------------------------+
+ | vcmx25 | Maximum rate of carboxylation at 25°C |
+ | | [:math:`umol\ CO_{2}/m^2/s`] |
+ +------------+----------------------------------------------------------+
+
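+As an illustration only (this is not a prescribed calibration workflow), the
+spatially distributed parameters in `soil\_properties.nc` can be adjusted with
+standard NCO tools prior to a simulation. For example, to uniformly scale the
+surface runoff parameter ``refkdt`` by 20%:
+
+.. code-block:: bash
+
+ # Illustration only: back up the parameter file, then scale refkdt in place
+ cp soil_properties.nc soil_properties_backup.nc
+ ncap2 -O -s 'refkdt=refkdt*1.2f' soil_properties.nc soil_properties.nc
+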
+.. _section-a8:
+
+A8. Terrain routing parameter files
+-----------------------------------
+
+Parameters for the lateral routing component of WRF-Hydro are specified
+via either the `HYDRO.TBL` file or the `hydro2dtbl.nc` file. Variables
+within these files are described in the tables below.
+
+`HYDRO.TBL`
+
+.. table::
+ :width: 90%
+ :align: center
+
+ +--------------------+------------------------------------------------------+
+ | **Variable name** | **Description** |
+ +====================+======================================================+
+ | .. centered:: *The parameter below is a function of land cover type* |
+ +--------------------+------------------------------------------------------+
+ | SFC_ROUGH | Overland flow roughness coefficient |
+ +--------------------+------------------------------------------------------+
+ | .. centered:: *The parameters below are a function of soil class* |
+ +--------------------+------------------------------------------------------+
+ | SATDK | Saturated soil hydraulic conductivity [:math:`m/s`] |
+ +--------------------+------------------------------------------------------+
+ | MAXSMC | Maximum volumetric soil moisture |
+ | | [:math:`m^3/m^3`] |
+ +--------------------+------------------------------------------------------+
+ | REFSMC | Reference volumetric soil moisture |
+ | | [:math:`m^3/m^3`] |
+ +--------------------+------------------------------------------------------+
+ | WLTSMC | Wilting point volumetric soil moisture |
+ | | [:math:`m^3/m^3`] |
+ +--------------------+------------------------------------------------------+
+ | QTZ | Quartz fraction of the soil |
+ +--------------------+------------------------------------------------------+
+
+`hydro2dtbl.nc`
+
+.. table::
+ :width: 90%
+ :align: center
+
+ +-----------------------+----------------------------------------------+
+ | **Variable name** | **Description** |
+ +=======================+==============================================+
+ | SMCMAX1 | Maximum volumetric soil moisture |
+ | | [:math:`m^3/m^3`] |
+ +-----------------------+----------------------------------------------+
+ | SMCREF1 | Reference volumetric soil moisture |
+ | | [:math:`m^3/m^3`] |
+ +-----------------------+----------------------------------------------+
+ | SMCWLT1 | Wilting point volumetric soil moisture |
+ | | [:math:`m^3/m^3`] |
+ +-----------------------+----------------------------------------------+
+ | OV_ROUGH2D | Overland flow roughness coefficient |
+ +-----------------------+----------------------------------------------+
+ | LKSAT | Lateral saturated soil hydraulic |
+ | | conductivity [:math:`m/s`] |
+ +-----------------------+----------------------------------------------+
+
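+As a quick, illustrative check (assuming the file is named `hydro2dtbl.nc` as
+above), the variables in the spatial hydro parameter file can be listed with
+:program:`ncdump`:
+
+.. code-block:: bash
+
+ # Print the file header and keep only the spatial hydro parameter variables
+ ncdump -h hydro2dtbl.nc | grep -E 'SMCMAX1|SMCREF1|SMCWLT1|OV_ROUGH2D|LKSAT'
+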
+.. _section-a9:
+
+A9. Channel routing parameter tables (`CHANPARM.TBL` and `Route\_Link.nc`)
+---------------------------------------------------------------------------
+
+Variables of the channel routing parameter tables are described in
+the tables below.
+
+| `CHANPARM.TBL`
+| *All parameters are a function of Strahler stream order*
+
+.. table::
+ :width: 90%
+ :align: center
+
+ +----------------------+---------------------------------------------------+
+ | **Variable name** | **Description** |
+ +======================+===================================================+
+ | Bw | Channel bottom width [:math:`m`] |
+ +----------------------+---------------------------------------------------+
+ | HLINK | Initial depth of water in the channel [:math:`m`] |
+ +----------------------+---------------------------------------------------+
+ | ChSSlp | Channel side slope [:math:`m/m`] |
+ +----------------------+---------------------------------------------------+
+ | MannN | Manning’s roughness coefficient |
+ +----------------------+---------------------------------------------------+
+
+| `Route\_Link.nc`
+| *All parameters are specified per stream segment (i.e., link)*
+
+.. table::
+ :width: 90%
+ :align: center
+
+ +----------------------+-----------------------------------------------+
+ | **Variable name** | **Description** |
+ +======================+===============================================+
+ | BtmWdth | Channel bottom width [:math:`m`] |
+ +----------------------+-----------------------------------------------+
+ | ChSlp | Channel side slope [:math:`m/m`] |
+ +----------------------+-----------------------------------------------+
+ | Kchan | Channel conductivity [:math:`mm/hr`] |
+ +----------------------+-----------------------------------------------+
+ | Length | Stream segment length [:math:`m`] |
+ +----------------------+-----------------------------------------------+
+ | MusK | Muskingum routing time [:math:`s`] |
+ +----------------------+-----------------------------------------------+
+ | MusX | Muskingum weighting coefficient |
+ +----------------------+-----------------------------------------------+
+ | NHDWaterbodyComID | ComID of an associated water body if any |
+ +----------------------+-----------------------------------------------+
+ | Qi | Initial flow in link [:math:`m^3/s`] |
+ +----------------------+-----------------------------------------------+
+ | So | Slope [:math:`m/m`] |
+ +----------------------+-----------------------------------------------+
+ | alt | Elevation from the NAD88 datum at start node |
+ | | [:math:`m`] |
+ +----------------------+-----------------------------------------------+
+ | ascendingIndex | Index to use for sorting IDs - *only in NWM |
+ | | files* |
+ +----------------------+-----------------------------------------------+
+ | from | From Link ID |
+ +----------------------+-----------------------------------------------+
+ | gages | Identifier for stream gage at this location |
+ +----------------------+-----------------------------------------------+
+ | lat | Latitude of the segment midpoint *[degrees |
+ | | north]* |
+ +----------------------+-----------------------------------------------+
+ | link | Link ID |
+ +----------------------+-----------------------------------------------+
+ | lon | Longitude of the segment midpoint *[degrees |
+ | | east]* |
+ +----------------------+-----------------------------------------------+
+ | n | Manning's roughness |
+ +----------------------+-----------------------------------------------+
+ | order | Strahler stream order |
+ +----------------------+-----------------------------------------------+
+ | to | To Link ID |
+ +----------------------+-----------------------------------------------+
+ | time | Time of measurement |
+ +----------------------+-----------------------------------------------+
+
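+For orientation, ``MusK`` and ``MusX`` correspond to the storage constant `K`
+and the weighting factor `X` of the classic Muskingum storage relation (shown
+here as a sketch only; the Muskingum-Cunge option derives its routing
+coefficients dynamically):
+
+.. math::
+
+ S = K \left[ X I + \left( 1 - X \right) O \right]
+
+where `S` is reach storage, `I` is reach inflow, and `O` is reach outflow.
+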
+.. _section-a10:
+
+A10. Groundwater input and parameter files
+------------------------------------------
+
+The contents of the groundwater input and parameter files are described
+in the tables below.
+
+`GWBASINS.nc`
+
+.. table::
+ :width: 90%
+ :align: center
+
+ +----------------------+-----------------------------------------------+
+ | **Variable name** | **Description** |
+ +======================+===============================================+
+ | y | projection y coordinate |
+ +----------------------+-----------------------------------------------+
+ | x | projection x coordinate |
+ +----------------------+-----------------------------------------------+
+ | crs | coordinate reference system definition |
+ +----------------------+-----------------------------------------------+
+ | BASIN | groundwater basin ID |
+ +----------------------+-----------------------------------------------+
+
+`GWBUCKPARM.nc`
+
+.. table::
+ :width: 90%
+ :align: center
+
+ +-----------------------+----------------------------------------------+
+ | **Variable name** | **Description** |
+ +=======================+==============================================+
+ | Basin | Basin monotonic ID (1...n) |
+ +-----------------------+----------------------------------------------+
+ | Coeff | Coefficient |
+ +-----------------------+----------------------------------------------+
+ | Expon | Exponent |
+ +-----------------------+----------------------------------------------+
+ | Zmax | Zmax |
+ +-----------------------+----------------------------------------------+
+ | Zinit | Zinit |
+ +-----------------------+----------------------------------------------+
+ | Area_sqkm | Basin area [:math:`km^2`] |
+ +-----------------------+----------------------------------------------+
+ | ComID | NHDCatchment FEATUREID (NHDFlowline ComID) |
+ +-----------------------+----------------------------------------------+
+
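+For orientation only, the parameters above feed the exponential baseflow
+bucket (``GWBASESWCRT=1``), whose outflow takes approximately the following
+form (a sketch; consult the model source code for the exact expression):
+
+.. math::
+
+ Q_{bucket} \approx C \left( e^{E z / z_{max}} - 1 \right)
+
+where `C` is ``Coeff``, `E` is ``Expon``, `z` is the current bucket water
+depth (initialized from ``Zinit``), and `z_{max}` is ``Zmax``.
+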
+.. _section-20:
+
+A11. Spatial weights input file variable description
+----------------------------------------------------
+
+The contents of the `spatialweights.nc` file are described in the table
+below.
+
+.. table::
+ :width: 90%
+ :align: center
+
+ +--------------+---------------------------------------------+---------------+
+ | **Variable | **Description** | **Dimension** |
+ | name** | | |
+ +==============+=============================================+===============+
+ | polyid | ID of polygon | polyid |
+ +--------------+---------------------------------------------+---------------+
+ | IDmask | Polygon ID (polyid) associated with each | data |
+ | | record | |
+ +--------------+---------------------------------------------+---------------+
+ | overlaps | Number of intersecting polygons | polyid |
+ +--------------+---------------------------------------------+---------------+
+ | weight | Fraction of intersecting polygon(polyid) | data |
+ | | intersected by poly2 | |
+ +--------------+---------------------------------------------+---------------+
+ | regridweight | Fraction of intersecting | data |
+ | | polyid(overlapper) intersected by | |
+ | | polygon(polyid) | |
+ +--------------+---------------------------------------------+---------------+
+ | i_index | Index in the x dimension of the raster | data |
+ | | grid *(starting with 1,1 in the LL corner)* | |
+ +--------------+---------------------------------------------+---------------+
+ | j_index | Index in the y dimension of the raster | data |
+ | | grid *(starting with 1,1 in the LL corner)* | |
+ +--------------+---------------------------------------------+---------------+
+
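+As an illustrative quick look (assuming the NCO utilities are installed), the
+polygon IDs and their overlap counts can be printed directly from the file:
+
+.. code-block:: bash
+
+ # Dump only the polyid and overlaps variables and show the first records
+ ncks -v polyid,overlaps spatialweights.nc | head -n 40
+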
+.. _section-a12:
+
+A12. Lake and reservoir parameter tables (`LAKEPARM.nc`)
+--------------------------------------------------------
+
+Variables within the `LAKEPARM.nc` file are described in the table below.
+
+.. table::
+ :width: 90%
+ :align: center
+
+ +--------------------+-------------------------------------------------+
+ | **Variable name** | **Description** |
+ +====================+=================================================+
+ | lake_id | Lake index (consecutively from 1 to n # of |
+ | | lakes) |
+ +--------------------+-------------------------------------------------+
+ | LkArea | Area [:math:`m^2`] |
+ +--------------------+-------------------------------------------------+
+ | LkMxE | Elevation of maximum lake height [:math:`m`, |
+ | | AMSL] |
+ +--------------------+-------------------------------------------------+
+ | WeirC | Weir coefficient (ranges from zero to one) |
+ +--------------------+-------------------------------------------------+
+ | WeirL | Weir length [:math:`m`] |
+ +--------------------+-------------------------------------------------+
+ | OrificeC | Orifice coefficient (ranges from zero to one) |
+ +--------------------+-------------------------------------------------+
+ | OrificeA | Orifice area [:math:`m^2`] |
+ +--------------------+-------------------------------------------------+
+ | OrificeE | Orifice elevation [:math:`m`, AMSL] |
+ +--------------------+-------------------------------------------------+
+ | lat | Latitude *[decimal degrees north]* |
+ +--------------------+-------------------------------------------------+
+ | lon | Longitude *[decimal degrees east]* |
+ +--------------------+-------------------------------------------------+
+ | time | time |
+ +--------------------+-------------------------------------------------+
+ | WeirE | Weir elevation [:math:`m`, AMSL] |
+ +--------------------+-------------------------------------------------+
+ | ascendingIndex | Index to use for sorting IDs (ascending) |
+ +--------------------+-------------------------------------------------+
+ | ifd | Initial fraction water depth |
+ +--------------------+-------------------------------------------------+
+ | crs | CRS definition |
+ +--------------------+-------------------------------------------------+
+
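+For orientation, the weir and orifice parameters above enter level-pool
+outflow relations of roughly the following form (a sketch only; see the
+level-pool routing code for the exact formulation and coefficient
+conventions):
+
+.. math::
+
+ Q_{weir} \propto C_w L_w \left( h - E_w \right)^{3/2}, \qquad
+ Q_{orifice} \propto C_o A_o \sqrt{2 g \left( h - E_o \right)}
+
+where `h` is the lake water surface elevation, `C_w`, `L_w` and `E_w` are
+``WeirC``, ``WeirL`` and ``WeirE``, and `C_o`, `A_o` and `E_o` are
+``OrificeC``, ``OrificeA`` and ``OrificeE``.
+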
+.. _section-22:
+
+A13. Restart Files Overview
+----------------------------
+
+.. figure:: media/restarts.png
+ :align: center
+
+ **Figure A13.** Overview of restart files for the various model physics
+ components.
+
+A13.1 RESTART_MP File Variable Table
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+.. note::
+ Noah-MP restarts are written in subroutine ``lsm_restart()`` in :file:`module_NoahMP_hrldas_driver.F`.
+ Noah-MP variables are defined in subroutine ``noahmplsm()`` in :file:`module_sf_noahmpdrv.F`.
+
+`RESTART\_MP` file variable descriptions
+
+.. table::
+ :width: 90%
+ :align: center
+
+ +-------------+-------------------------------------------+------------------+
+ | **Variable | **Description** | **Units** |
+ | name** | | |
+ +=============+===========================================+==================+
+ | ACMELT | accumulated melting water out of snow | :math:`mm` |
+ | | bottom | |
+ +-------------+-------------------------------------------+------------------+
+ | ACSNOW | accumulated snowfall on grid | :math:`mm` |
+ +-------------+-------------------------------------------+------------------+
+ | ALBOLD | snow albedo at last time step (-) | |
+ +-------------+-------------------------------------------+------------------+
+ | AREAXY | (in the file but not used by the model) | |
+ +-------------+-------------------------------------------+------------------+
+ | CANICE | Canopy ice water content / | :math:`mm` |
+ | | canopy-intercepted ice | |
+ +-------------+-------------------------------------------+------------------+
+ | CANLIQ | Canopy liquid water content / | :math:`mm` |
+ | | canopy-intercepted liquid water | |
+ +-------------+-------------------------------------------+------------------+
+ | CH | Sensible heat exchange coefficient | |
+ +-------------+-------------------------------------------+------------------+
+ | CM | Momentum drag coefficient | |
+ +-------------+-------------------------------------------+------------------+
+ | DEEPRECHXY | soil moisture below the bottom of the | :math:`m^3/m^3` |
+ | | column | |
+ +-------------+-------------------------------------------+------------------+
+ | EAH | canopy air vapor pressure | :math:`Pa` |
+ +-------------+-------------------------------------------+------------------+
+ | EQZWT | (in the file but not used by the model) | |
+ +-------------+-------------------------------------------+------------------+
+ | FASTCP | short-lived carbon in shallow soil | :math:`g/m^2` |
+ +-------------+-------------------------------------------+------------------+
+ | FDEPTHXY | (in the file but not used by the model) | |
+ +-------------+-------------------------------------------+------------------+
+ | FWET | Wetted or snowed fraction of canopy | :math:`fraction` |
+ +-------------+-------------------------------------------+------------------+
+ | GVFMAX | annual maximum in vegetation fraction | |
+ +-------------+-------------------------------------------+------------------+
+ | GVFMIN | annual minimum in vegetation fraction | |
+ +-------------+-------------------------------------------+------------------+
+ | ISNOW | Number of snow layers | :math:`count` |
+ +-------------+-------------------------------------------+------------------+
+ | LAI | leaf area index | |
+ +-------------+-------------------------------------------+------------------+
+ | LFMASS | Leaf mass | :math:`g/m^2` |
+ +-------------+-------------------------------------------+------------------+
+ | PEXPXY | (in the file but not used by the model) | |
+ +-------------+-------------------------------------------+------------------+
+ | QRFSXY | Stem mass | :math:`g/m^2` |
+ +-------------+-------------------------------------------+------------------+
+ | QRFXY | (in the file but not used by the model) | |
+ +-------------+-------------------------------------------+------------------+
+ | QSFC | bulk surface specific humidity | |
+ +-------------+-------------------------------------------+------------------+
+ | QSLATXY | Stable carbon in deep soil | :math:`g/m^2` |
+ +-------------+-------------------------------------------+------------------+
+ | QSNOW | snowfall rate on the ground | :math:`mm/s` |
+ +-------------+-------------------------------------------+------------------+
+ | QSPRINGSXY | Mass of wood and woody roots | :math:`g/m^2` |
+ +-------------+-------------------------------------------+------------------+
+ | QSPRINGXY | (in the file but not used by the model) | |
+ +-------------+-------------------------------------------+------------------+
+ | RECHXY | recharge to the water table (diagnostic) | :math:`m^3/m^3` |
+ +-------------+-------------------------------------------+------------------+
+ | RIVERBEDXY | (in the file but not used by the model) | |
+ +-------------+-------------------------------------------+------------------+
+ | RIVERCONDXY | (in the file but not used by the model) | |
+ +-------------+-------------------------------------------+------------------+
+ | RTMASS | mass of fine roots | :math:`g/m^2` |
+ +-------------+-------------------------------------------+------------------+
+ | SAI | stem area index | |
+ +-------------+-------------------------------------------+------------------+
+ | SFCRUNOFF | Accumulated surface runoff | :math:`mm` |
+ +-------------+-------------------------------------------+------------------+
+ | SH2O | volumetric liquid soil moisture | :math:`m^3/m^3` |
+ +-------------+-------------------------------------------+------------------+
+ | SMC | Volumetric Soil Moisture | :math:`m^3/m^3` |
+ +-------------+-------------------------------------------+------------------+
+ | SMCWTDXY | soil moisture below the bottom of the | :math:`m^3/m^3` |
+ | | column | |
+ +-------------+-------------------------------------------+------------------+
+ | SMOISEQ | volumetric soil moisture | :math:`m^3/m^3` |
+ +-------------+-------------------------------------------+------------------+
+ | SNEQV | Snow water equivalent | :math:`kg/m^2` |
+ +-------------+-------------------------------------------+------------------+
+ | SNEQVO | snow mass at last time step | :math:`mm` |
+ +-------------+-------------------------------------------+------------------+
+ | SNICE | snow layer ice | :math:`mm` |
+ +-------------+-------------------------------------------+------------------+
+ | SNLIQ | Snow layer liquid water | :math:`mm` |
+ +-------------+-------------------------------------------+------------------+
+ | SNOWH | Snow depth | :math:`m` |
+ +-------------+-------------------------------------------+------------------+
+ | SNOW_T | snow temperature | :math:`K` |
+ +-------------+-------------------------------------------+------------------+
+ | SOIL_T | Soil Temperature on NSOIL layers | :math:`K` |
+ +-------------+-------------------------------------------+------------------+
+ | STBLCP | Stable carbon in deep soil | :math:`g/m^2` |
+ +-------------+-------------------------------------------+------------------+
+ | STMASS | stem mass | :math:`g/m^2` |
+ +-------------+-------------------------------------------+------------------+
+ | TAH | Canopy Air Temperature | :math:`K` |
+ +-------------+-------------------------------------------+------------------+
+ | TAUSS | snow age factor | |
+ +-------------+-------------------------------------------+------------------+
+ | TG | Ground Temperature | :math:`K` |
+ +-------------+-------------------------------------------+------------------+
+ | TV | Canopy Temperature | :math:`K` |
+ +-------------+-------------------------------------------+------------------+
+ | UDRUNOFF | Accumulated underground runoff | :math:`mm` |
+ +-------------+-------------------------------------------+------------------+
+ | WA | Water in aquifer relative to reference | :math:`kg/m^2` |
+ | | level | |
+ +-------------+-------------------------------------------+------------------+
+ | WOOD | Mass of wood and woody roots | :math:`g/m^2` |
+ +-------------+-------------------------------------------+------------------+
+ | WSLAKE | lake water storage | :math:`mm` |
+ +-------------+-------------------------------------------+------------------+
+ | WT | Water in aquifer and saturated soil | :math:`kg/m^2` |
+ +-------------+-------------------------------------------+------------------+
+ | ZSNSO | Snow layer depths from snow surface | :math:`m` |
+ +-------------+-------------------------------------------+------------------+
+ | ZWT | water table depth | :math:`m` |
+ +-------------+-------------------------------------------+------------------+
+ | VEGFRA | Vegetation fraction | |
+ +-------------+-------------------------------------------+------------------+
+ | ACCPRCP | Accumulated precipitation | :math:`mm` |
+ +-------------+-------------------------------------------+------------------+
+ | ACCECAN | Accumulated canopy evaporation | :math:`mm` |
+ +-------------+-------------------------------------------+------------------+
+ | ACCEDIR | Accumulated direct soil evaporation | :math:`mm` |
+ +-------------+-------------------------------------------+------------------+
+ | ACCETRAN | Accumulated transpiration | :math:`mm` |
+ +-------------+-------------------------------------------+------------------+
+ | SMOISEQ | volumetric soil moisture | :math:`m^3/m^3` |
+ +-------------+-------------------------------------------+------------------+
+
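+For reference, the contents of an individual Noah-MP restart file can be
+inspected with :program:`ncdump` (the file name below is illustrative and
+matches the restart name used in the sample NWM namelists later in these
+appendices):
+
+.. code-block:: bash
+
+ # Show a few snow and soil state entries from the restart file header
+ ncdump -h RESTART.2018060100_DOMAIN1 | grep -E 'SNEQV|SNOWH|SMC|SOIL_T'
+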
+.. _section-23:
+
+A13.2 HYDRO_RST File Variable Table
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+.. note::
+ The variables are written to the `HYDRO_RST` file in subroutine
+ ``RESTART_OUT_nc`` in :file:`Routing/module_HYDRO_io.F90`. The tables below
+ describe the dimensions and variables in the hydro restart file
+ (`HYDRO\_RST`).
+
+.. table::
+ :width: 90%
+ :align: center
+
+ +---------------+-----------------------------+---------------------------+
+ | **Dimension** | **Description** | **It is written** |
+ +===============+=============================+===========================+
+ | depth | Number of soil layers | |
+ +---------------+-----------------------------+---------------------------+
+ | ix | Number of columns in the | |
+ | | coarse grid (LSM) | |
+ +---------------+-----------------------------+---------------------------+
+ | iy | Number of rows in the | |
+ | | coarse grid (LSM) | |
+ +---------------+-----------------------------+---------------------------+
+ | ixrt | Number of columns in the | |
+ | | fine grid (hydro) | |
+ +---------------+-----------------------------+---------------------------+
+ | iyrt | Number of rows in the fine | |
+ | | grid (hydro) | |
+ +---------------+-----------------------------+---------------------------+
+ | links | Number of links/reaches | |
+ +---------------+-----------------------------+---------------------------+
+ | basns | Number of basins for the | Only if ``GWBASESWCRT=1`` |
+ | | groundwater/baseflow | in the `hydro.namelist` |
+ | | modeling | |
+ +---------------+-----------------------------+---------------------------+
+ | lakes | Number of lakes | Only if the lake |
+ | | | routing is turned on |
+ +---------------+-----------------------------+---------------------------+
+
+.. table::
+ :width: 90%
+ :align: center
+
+ +--------------+-------------------------------+----------------+----------------+-----------------+
+ | **Variable | **Description** | **# Dimensions | **Resolution** | **Units** |
+ | name** | | (not | | |
+ | | | including | | |
+ | | | time)** | | |
+ +==============+===============================+================+================+=================+
+ | cvol | volume of stream in cell | 1 | fine/link | :math:`m^3` |
+ +--------------+-------------------------------+----------------+----------------+-----------------+
+ | hlink | stream stage | 1 | fine/link | :math:`m` |
+ +--------------+-------------------------------+----------------+----------------+-----------------+
+ | infxsrt | infiltration excess water | 2 | coarse | :math:`mm` |
+ +--------------+-------------------------------+----------------+----------------+-----------------+
+ | infxswgt | weights for disaggregation of | 2 | fine | \- |
+ | | infxsrt | | | |
+ +--------------+-------------------------------+----------------+----------------+-----------------+
+ | qbdryrt | accumulated value of the | 2 | fine | :math:`mm` |
+ | | boundary flux | | | |
+ +--------------+-------------------------------+----------------+----------------+-----------------+
+ | qlink1 | stream flow in to cell/reach | 1 | fine/link | :math:`m^3/s` |
+ +--------------+-------------------------------+----------------+----------------+-----------------+
+ | qlink2 | stream flow out of cell/reach | 1 | fine/link | :math:`m^3/s` |
+ +--------------+-------------------------------+----------------+----------------+-----------------+
+ | qstrmvolrt | Accumulated depth of stream | 2 | fine | :math:`mm` |
+ | | channel inflow | | | |
+ +--------------+-------------------------------+----------------+----------------+-----------------+
+ | sfcheadrt | surface head on the coarse | 2 | coarse | :math:`mm` |
+ | | grid | | | |
+ +--------------+-------------------------------+----------------+----------------+-----------------+
+ | sfcheadsubrt | surface head on the routing | 2 | fine | :math:`mm` |
+ | | grid | | | |
+ +--------------+-------------------------------+----------------+----------------+-----------------+
+ | sh2owgt | weights for disaggregation of | 3 | fine | \- |
+ | | total soil moisture (smc) | | | |
+ +--------------+-------------------------------+----------------+----------------+-----------------+
+ | sh2ox | liquid soil moisture | 3 | coarse | :math:`m^3/m^3` |
+ +--------------+-------------------------------+----------------+----------------+-----------------+
+ | smc | total liq+ice soil moisture. | 3 | coarse | :math:`m^3/m^3` |
+ +--------------+-------------------------------+----------------+----------------+-----------------+
+ | soldrain | soil drainage | 2 | coarse | :math:`mm` |
+ +--------------+-------------------------------+----------------+----------------+-----------------+
+ | stc | soil temperature | 3 | coarse | :math:`K` |
+ +--------------+-------------------------------+----------------+----------------+-----------------+
+ | lake_inflort | lake inflow | 2 | fine | :math:`mm` |
+ +--------------+-------------------------------+----------------+----------------+-----------------+
+ | resht | water surface elevation | 1 | link | :math:`m` |
+ +--------------+-------------------------------+----------------+----------------+-----------------+
+ | qlakeo | outflow from lake used in | 1 | link | :math:`m^3/s` |
+ | | diffusion scheme | | | |
+ +--------------+-------------------------------+----------------+----------------+-----------------+
+ | qlakei | lake inflow | numLakes | link | :math:`m^3/s` |
+ +--------------+-------------------------------+----------------+----------------+-----------------+
+ | z_gwsubbas | depth in ground water bucket | 1 | link | :math:`m` |
+ +--------------+-------------------------------+----------------+----------------+-----------------+
+
+A14. Streamflow Nudging
+-----------------------
+
+**Figure A14.1** Below is the netCDF header of an example nudging time slice
+observation file containing 2 gages. The command :program:`ncdump -h` was used
+to produce this header information.
+
+::
+
+ netcdf 2013-06-01_21:45:00.15min.usgsTimeSlice {
+ dimensions:
+ stationIdStrLen = 15 ;
+ stationIdInd = UNLIMITED ; // (2 currently)
+ timeStrLen = 19 ;
+ variables:
+ char stationId(stationIdInd, stationIdStrLen) ;
+ stationId:long_name = "USGS station identifier of length 15" ;
+ char time(stationIdInd, timeStrLen) ;
+ time:units = "UTC" ;
+ time:long_name = "YYYY-MM-DD_HH:mm:ss UTC" ;
+ float discharge(stationIdInd) ;
+ discharge:units = "m^3/s" ;
+ discharge:long_name = "Discharge.cubic_meters_per_second" ;
+ short discharge_quality(stationIdInd) ;
+ discharge_quality:units = "-" ;
+ discharge_quality:long_name = "Discharge quality 0 to 100 to be scaled by 100." ;
+ float queryTime(stationIdInd) ;
+ queryTime:units = "seconds since 1970-01-01 00:00:00 local TZ" ;
+ // global attributes:
+ :fileUpdateTimeUTC = "2017-08-25_17:24:22" ;
+ :sliceCenterTimeUTC = "2013-06-01_21:45:00" ;
+ :sliceTimeResolutionMinutes = "15" ;
+ }
+
+**Figure A14.2** Below is the netCDF header of an example `nudgingParams.nc`
+file containing parameters for 3 gages. The command :program:`ncdump -h` was
+used to produce this header information.
+
+::
+
+ netcdf nudgingParams {
+ dimensions:
+ stationIdInd = UNLIMITED ; // (3 currently)
+ monthInd = 12 ;
+ threshCatInd = 2 ;
+ threshInd = 1 ;
+ stationIdStrLen = 15 ;
+ variables:
+ float G(stationIdInd) ;
+ G:units = "-" ;
+ G:long_name = "Amplitude of nudging" ;
+ float R(stationIdInd) ;
+ R:units = "meters" ;
+ R:long_name = "Radius of influence in meters" ;
+ float expCoeff(stationIdInd, monthInd, threshCatInd) ;
+ expCoeff:units = "minutes" ;
+ expCoeff:long_name = "Coefficient b in denominator e^(-dt/b)" ;
+ float qThresh(stationIdInd, monthInd, threshInd) ;
+ qThresh:units = "m^3/s" ;
+ qThresh:long_name = "Discharge threshold category" ;
+ char stationId(stationIdInd, stationIdStrLen) ;
+ stationId:units = "-" ;
+ stationId:long_name = "USGS station identifer" ;
+ float tau(stationIdInd) ;
+ tau:units = "minutes" ;
+ tau:long_name = "Time tapering parameter half window size in minutes" ;
+ }
+
+.. _section-a15:
+
+A15. National Water Model (NWM) Configuration
+---------------------------------------------
+
+It is important to note that the community WRF-Hydro modeling system is
+the underlying modeling architecture of the NOAA National Water Model
+(NWM). This means that the community WRF-Hydro model code can be
+configured to reproduce the National Water Model configurations that run
+operationally at the National Centers for Environmental Prediction
+(NCEP).
+
+.. pull-quote::
+
+ “\ *The NWM is an hourly cycling uncoupled analysis and forecast system
+ that provides streamflow for 2.7 million river reaches and other
+ hydrologic information on 1km and 250m grids. The model provides
+ complementary hydrologic guidance at current NWS River Forecast Center
+ (RFC) river forecast locations and significantly expanded guidance
+ coverage and type in underserved locations.*
+
+ *The NWM ingests forcing from a variety of sources including Multi-Radar
+ Multi-Sensor (MRMS) radar-gauge observed precipitation data and
+ High-Resolution Rapid Refresh (HRRR), Rapid Refresh (RAP), Global
+ Forecast System (GFS) and Climate Forecast System (CFS) Numerical
+ Weather Prediction (NWP) forecast data. USGS real-time streamflow
+ observations are assimilated and all NWM configurations benefit from the
+ inclusion of ~5500 reservoirs. The core of the NWM system is the
+ National Center for Atmospheric Research (NCAR)-supported community
+ Weather Research and Forecasting (WRF)-Hydro hydrologic model. WRF-Hydro
+ is configured to use the Noah Multi-Parameterization (Noah-MP) Land Surface
+ Model (LSM) to simulate land
+ surface processes. Separate water routing modules perform diffusive wave
+ surface routing and saturated subsurface flow routing on a 250m grid,
+ and Muskingum-Cunge channel routing down NHDPlusV2 stream reaches. River
+ analyses and forecasts are provided across a domain encompassing the
+ continental U.S. and hydrologically-contributing areas, while land
+ surface output is available on a larger domain that extends beyond the
+ continental U.S. into Canada and Mexico (roughly from latitude 19N to
+ 58N). In addition, NWM forcing datasets are provided on this domain at a
+ resolution of 1km.*\ ”
+
+.. centered:: *Excerpt from NOUS41 KWBC 061735 PNSWSH NWS Office of Science and Technology Integration*
+
+For more information regarding the operational configuration, input, and
+output data of the National Water Model see the Office of Water
+Prediction website: http://water.noaa.gov/about/nwm and the Open Commons
+Consortium Environmental Data Commons website:
+http://edc.occ-data.org/nwm/.
+
+The NWM/WRF-Hydro modeling system suite of tools for data preparation,
+evaluation, and calibration is continually under development and will be
+rolled out to the community as each tool is finalized with supporting
+documentation for public use. To be notified when tools become available,
+please subscribe to the WRF-Hydro email list:
+https://ral.ucar.edu/projects/wrf_hydro/subscribe.
+
+The figures below illustrate the physics permutations available in the
+WRF-Hydro framework and the Noah-MP land surface model, those used in the
+National Water Model configuration as of March 2018, and the NWM
+ecosystem and suite of tools. Sample NWM configuration namelists follow
+the figures.
+
+.. _figure-A15.1:
+.. figure:: media/hydro-physics-permutations.png
+ :align: center
+ :scale: 90%
+
+ **Figure A15.1** Illustration of WRF-Hydro physics permutations and those used in the
+ current configuration of the National Water Model (NWM).
+
+.. _figure-A15.2:
+.. figure:: media/noahmp-physics-permutations.png
+ :align: center
+ :scale: 75%
+
+ **Figure A15.2.** Illustration of Noah-MP physics permutations and those
+ used in the configuration of the National Water Model (NWM).
+
+.. _figure-A15.3:
+.. figure:: media/nwm-wrf-hydro.png
+ :align: center
+ :scale: 135%
+
+ **Figure A15.3** National Water Model/WRF-Hydro Modeling System
+ Ecosystem and Suite of Tools.
+
+Several NWM configurations run operationally. The full list of
+configurations and their specifics can be found at
+https://water.noaa.gov/about/nwm. Below we provide sample namelists for the
+Standard Analysis configuration (self-cycling with a 3-hour look-back,
+used to initialize CONUS short- and medium-range forecasts). Note that the
+file names differ from the conventions used throughout this documentation;
+they match the file names used in operations (a subset of the model
+parameter files used by the operational implementation of the NWM is
+available on the NWM website). The namelists for all configurations are
+also distributed with the model code.
+
+.. rubric::
+ Below are sample NWM configuration namelists for both the LSM (NoahMP) and WRF-Hydro:
+
+`namelist.hrldas` (sample NWM configuration)
+
+.. code-block:: fortran
+
+ &NOAHLSM_OFFLINE
+ HRLDAS_SETUP_FILE = "./DOMAIN/wrfinput_d01_1km.nc"
+ INDIR = "./forcing"
+ SPATIAL_FILENAME = "./DOMAIN/soil_veg_properties_ASM.nc"
+ OUTDIR = "./"
+ START_YEAR = 2018
+ START_MONTH = 06
+ START_DAY = 01
+ START_HOUR = 00
+ START_MIN = 00
+ RESTART_FILENAME_REQUESTED = "RESTART.2018060100_DOMAIN1"
+
+ ! Specification of simulation length in days OR hours
+ !KDAY = 1
+ KHOUR = 3
+
+ ! Physics options (see the documentation for details)
+ DYNAMIC_VEG_OPTION = 4
+ CANOPY_STOMATAL_RESISTANCE_OPTION = 1
+ BTR_OPTION = 1
+ RUNOFF_OPTION = 3
+ SURFACE_DRAG_OPTION = 1
+ FROZEN_SOIL_OPTION = 1
+ SUPERCOOLED_WATER_OPTION = 1
+ RADIATIVE_TRANSFER_OPTION = 3
+ SNOW_ALBEDO_OPTION = 1
+ PCP_PARTITION_OPTION = 1
+ TBOT_OPTION = 2
+ TEMP_TIME_SCHEME_OPTION = 3
+ GLACIER_OPTION = 2
+ SURFACE_RESISTANCE_OPTION = 4
+
+ ! Timesteps in units of seconds
+ FORCING_TIMESTEP = 3600
+ NOAH_TIMESTEP = 3600
+ OUTPUT_TIMESTEP = 3600
+
+ ! Land surface model restart file write frequency
+ RESTART_FREQUENCY_HOURS = 1
+
+ ! Split output after split_output_count output times.
+ SPLIT_OUTPUT_COUNT = 1
+
+ ! Soil layer specification
+ NSOIL=4
+ soil_thick_input(1) = 0.10
+ soil_thick_input(2) = 0.30
+ soil_thick_input(3) = 0.60
+ soil_thick_input(4) = 1.00
+
+ ! Forcing data measurement height for winds, temp, humidity
+ ZLVL = 10.0
+
+ ! Restart file format options
+ rst_bi_in = 0 !0: use netcdf input restart file
+ !1: use parallel io for reading multiple restart files (1 per core)
+ rst_bi_out = 0 !0: use netcdf output restart file
+ !1: use parallel io for outputting multiple restart files (1 per core)
+ /
+
+ &WRF_HYDRO_OFFLINE
+ ! Specification of forcing data: 1=HRLDAS-hr format, 2=HRLDAS-min
+ ! format, 3=WRF, 4=Idealized, 5=Ideal w/ spec. precip,
+ ! 6=HRLDAS-hr format w/ spec. precip, 7=WRF w/ spec. precip
+ FORC_TYP = 2
+
+ /
+
+`hydro.namelist` (sample NWM configuration)
+
+.. code-block:: fortran
+
+ &HYDRO_nlist
+
+ !!!! ---------------------- SYSTEM COUPLING -----------------------
+ !!!!
+
+ ! Specify what is being coupled: 1=HRLDAS (offline Noah-LSM), 2=WRF, 3=NASA/LIS, 4=CLM
+ sys_cpl = 1
+
+ !!!! ------------------- MODEL INPUT DATA FILES -------------------
+ !!!!
+
+ ! Specify land surface model gridded input data file (e.g.: "geo_em.d01.nc")
+ GEO_STATIC_FLNM = "./DOMAIN/geo_em.d01_1km.nc"
+
+ ! Specify the high-resolution routing terrain input data file (e.g.: "Fulldom_hires.nc")
+ GEO_FINEGRID_FLNM = "./DOMAIN/Fulldom_hires_netcdf_250m.nc"
+
+ ! Specify the spatial hydro parameters file (e.g.: "hydro2dtbl.nc")
+ ! If you specify a filename and the file does not exist, it will be created for you.
+ HYDROTBL_F = "./DOMAIN/HYDRO_TBL_2D.nc"
+
+ ! Specify spatial metadata file for land surface grid. (e.g.: "GEOGRID_LDASOUT_Spatial_Metadata.nc")
+ LAND_SPATIAL_META_FLNM = "./DOMAIN/WRF_Hydro_NWM_geospatial_data_template_land_GIS.nc"
+
+ ! Specify the name of the restart file if starting from restart...comment out with '!' if not...
+ RESTART_FILE = 'HYDRO_RST.2018-06-01_00:00_DOMAIN1'
+
+ !!!! --------------------- MODEL SETUP OPTIONS --------------------
+ !!!!
+
+ ! Specify the domain or nest number identifier...(integer)
+ IGRID = 1
+
+ ! Specify the restart file write frequency...(minutes)
+ ! A value of -99999 will output restarts on the first day of the month only.
+ rst_dt = 60
+
+ ! Reset the LSM soil states from the high-res routing restart file (1=overwrite, 0=no overwrite)
+ ! NOTE: Only turn this option on if overland or subsurface routing is active!
+ rst_typ = 1
+
+ ! Restart file format control
+
+ rst_bi_in = 0 !0: use netcdf input restart file (default)
+ !1: use parallel io for reading multiple restart files, 1 per core
+
+ rst_bi_out = 0 !0: use netcdf output restart file (default)
+ !1: use parallel io for outputting multiple restart files, 1 per core
+
+ ! Restart switch to set restart accumulation variables to 0 (0=no reset, 1=yes reset to 0.0)
+ RSTRT_SWC = 1
+
+ ! Specify baseflow/bucket model initialization...(0=cold start from table, 1=restart file)
+ GW_RESTART = 1
+
+ !!!! -------------------- MODEL OUTPUT CONTROL --------------------
+ !!!!
+
+ ! Specify the output file write frequency...(minutes)
+ out_dt = 60
+
+ ! Specify the number of output times to be contained within each output history file...(integer)
+ ! SET = 1 WHEN RUNNING CHANNEL ROUTING ONLY/CALIBRATION SIMS!!!
+ ! SET = 1 WHEN RUNNING COUPLED TO WRF!!!
+ SPLIT_OUTPUT_COUNT = 1
+
+ ! Specify the minimum stream order to output to netcdf point file...(integer)
+ ! Note: lower value of stream order produces more output.
+ order_to_write = 1
+
+ ! Flag to turn on/off new I/O routines: 0 = deprecated output routines (use when running with Noah LSM),
+ ! 1 = with scale/offset/compression, 2 = with scale/offset/NO compression,
+ ! 3 = compression only, 4 = no scale/offset/compression (default)
+ io_form_outputs = 2
+
+ ! Realtime run configuration option:
+ ! 0=all (default), 1=analysis, 2=short-range, 3=medium-range, 4=long-range, 5=retrospective,
+ ! 6=diagnostic (includes all of 1-4 outputs combined)
+ io_config_outputs = 1
+
+ ! Option to write output files at time 0 (restart cold start time): 0=no, 1=yes (default)
+ t0OutputFlag = 1
+
+ ! Options to output channel & bucket influxes. Only active for UDMP_OPT=1.
+ ! Nonzero choice requires that out_dt above matches NOAH_TIMESTEP in namelist.hrldas.
+ ! 0=None (default), 1=channel influxes (qSfcLatRunoff, qBucket)
+ ! 2=channel+bucket fluxes (qSfcLatRunoff, qBucket, qBtmVertRunoff_toBucket)
+ ! 3=channel accumulations (accSfcLatRunoff, accBucket) **NOT TESTED**
+
+ output_channelBucket_influx = 2
+
+ ! Output netcdf file control
+ CHRTOUT_DOMAIN = 1 ! Netcdf point timeseries output at all channel points (1d)
+ ! 0 = no output, 1 = output
+
+ CHANOBS_DOMAIN = 0 ! Netcdf point timeseries at forecast points or gage points (defined in Routelink)
+ ! 0 = no output, 1 = output at forecast points or gage points.
+
+ CHRTOUT_GRID = 0 ! Netcdf grid of channel streamflow values (2d)
+ ! 0 = no output, 1 = output
+ ! NOTE: Not available with reach-based routing
+
+ LSMOUT_DOMAIN = 0 ! Netcdf grid of variables passed between LSM and routing components (2d)
+ ! 0 = no output, 1 = output
+ ! NOTE: No scale_factor/add_offset available
+
+ RTOUT_DOMAIN = 1 ! Netcdf grid of terrain routing variables on routing grid (2d)
+ ! 0 = no output, 1 = output
+
+ output_gw = 0 ! Netcdf GW output, 0 = no output, 1 = output
+ outlake = 1 ! Netcdf grid of lake values (1d), 0 = no output, 1 = output
+
+ frxst_pts_out = 0 ! ASCII text file of forecast points or gage points (defined in Routelink)
+ ! 0 = no output, 1 = output
+
+ !!!! ------------ PHYSICS OPTIONS AND RELATED SETTINGS ------------
+ !!!!
+
+ ! Specify the number of soil layers (integer) and the depth of the bottom of each layer... (meters)
+ ! Notes: In Version 1 of WRF-Hydro these must be the same as in the namelist.input file.
+ ! Future versions will permit this to be different.
+ NSOIL=4
+ ZSOIL8(1) = -0.10
+ ZSOIL8(2) = -0.40
+ ZSOIL8(3) = -1.00
+ ZSOIL8(4) = -2.00
+
+ ! Specify the grid spacing of the terrain routing grid...(meters)
+ DXRT = 250.0
+
+ ! Specify the integer multiple between the land model grid and the terrain routing grid...(integer)
+ AGGFACTRT = 4
+
+ ! Specify the channel routing model timestep...(seconds)
+ DTRT_CH = 300
+
+ ! Specify the terrain routing model timestep...(seconds)
+ DTRT_TER = 10
+
+ ! Switch to activate subsurface routing...(0=no, 1=yes)
+ SUBRTSWCRT = 1
+
+ ! Switch to activate surface overland flow routing...(0=no, 1=yes)
+ OVRTSWCRT = 1
+
+ ! Specify overland flow routing option: 1=Steepest Descent (D8) 2=CASC2D (not active)
+ ! NOTE: Currently subsurface flow is only steepest descent
+ rt_option = 1
+
+ ! Switch to activate channel routing...(0=no, 1=yes)
+ CHANRTSWCRT = 1
+
+ ! Specify channel routing option: 1=Muskingum-reach, 2=Musk.-Cunge-reach, 3=Diff.Wave-gridded
+ channel_option = 2
+
+ ! Specify the reach file for reach-based routing options (e.g.: "Route_Link.nc")
+ route_link_f = "./DOMAIN/RouteLink_NHDPLUS.nc"
+
+ ! If using channel_option=2, activate the compound channel formulation? (Default=.FALSE.)
+ compound_channel = .TRUE.
+
+ ! Specify the lake parameter file (e.g.: "LAKEPARM.nc").
+ ! Note REQUIRED if lakes are on.
+ route_lake_f = "./DOMAIN/LAKEPARM_NHDPLUS.nc"
+
+ ! Switch to activate baseflow bucket model...(0=none, 1=exp. bucket, 2=pass-through)
+ GWBASESWCRT = 1
+
+ ! Groundwater/baseflow 2d mask specified on land surface model grid (e.g.: "GWBASINS.nc")
+ ! Note: Only required if baseflow model is active (1 or 2) and UDMP_OPT=0.
+ gwbasmskfil = "./DOMAIN/GWBASINS.nc"
+
+ ! Groundwater bucket parameter file (e.g.: "GWBUCKPARM.nc")
+ GWBUCKPARM_file = "./DOMAIN/GWBUCKPARM_CONUS.nc"
+
+ ! User defined mapping, such as NHDPlus: 0=no (default), 1=yes
+ UDMP_OPT = 1
+
+ ! If on, specify the user-defined mapping file (e.g.: "spatialweights.nc")
+ udmap_file = "./DOMAIN/spatialweights_250m_all_basins.nc"
+
+ /
+
+ &NUDGING_nlist
+
+ ! Path to the "timeslice" observation files.
+ timeSlicePath = "./nudgingTimeSliceObs/"
+ nudgingParamFile = "./DOMAIN/nudgingParams.nc"
+
+ ! Nudging restart file = "nudgingLastObsFile"
+ ! nudgingLastObsFile defaults to '', which will look for nudgingLastObs.YYYY-mm-dd_HH:MM:SS.nc
+ ! **AT THE INITIALIZATION TIME OF THE RUN**. Set to a missing file to use no restart.
+
+ !nudgingLastObsFile = '/a/nonexistent/file/gives/nudging/cold/start'
+
+ !! Parallel input of nudging timeslice observation files?
+ readTimesliceParallel = .TRUE.
+
+ ! temporalPersistence defaults to true, only runs if necessary params present.
+ temporalPersistence = .TRUE.
+
+ ! The total number of last (obs, modeled) pairs to save in nudgingLastObs for
+ ! removal of bias. This is the maximum array length. (This option is active when persistBias=FALSE)
+ ! (Default=960=10days @15min obs resolution, if all the obs are present and longer if not.)
+ nLastObs = 480
+
+ ! If using temporalPersistence the last observation persists by default.
+ ! This option instead persists the bias after the last observation.
+ persistBias = .TRUE.
+
+ ! AnA (FALSE) vs Forecast (TRUE) bias persistence.
+ ! If persistBias: Does the window for calculating the bias end at
+ ! model init time (=t0)?
+ ! FALSE = window ends at model time (moving),
+ ! TRUE = window ends at init=t0(fcst) time.
+ ! (If commented out, Default=FALSE)
+ ! Note: Perfect restart tests require this option to be .FALSE.
+ biasWindowBeforeT0 = .FALSE.
+
+ ! If persistBias: Only use this many last (obs, modeled) pairs. (If Commented out, Default=-1*nLastObs)
+ ! > 0: apply an age-based filter, units=hours.
+ ! = 0: apply no additional filter, use all available/usable obs.
+ ! < 0: apply a count-based filter, units=count
+ maxAgePairsBiasPersist = 3
+
+ ! If persistBias: The minimum number of last (obs, modeled) pairs, with age less than
+ ! maxAgePairsBiasPersist, required to apply a bias correction. (default=8)
+ minNumPairsBiasPersist = 1
+
+ ! If persistBias: give more weight to observations closer in time? (default=FALSE)
+ invDistTimeWeightBias = .TRUE.
+
+ ! If persistBias: "No constructive interference in bias correction?", Reduce the bias adjustment
+ ! when the model and the bias adjustment have the same sign relative to the modeled flow at t0?
+ ! (default=FALSE)
+ ! Note: Perfect restart tests require this option to be .FALSE.
+ noConstInterfBias = .TRUE.
+
+ /
+
+
+.. _section-a16:
+
+A16. The Crocus Glacier Model
+-----------------------------
+
+Crocus is an energy and mass transfer snowpack model, initially developed for
+avalanche forecasting (*Brun et al., 1989, 1992*). The version that was
+implemented into the French SURFEX model V8.0 (*Vionnet et al., 2012*) is being
+used here. This version has several updates from older versions of Crocus, such
+as the impacts of wind drift.
+
+The Crocus snowpack model is a multilayered, physically based snow model that
+explicitly calculates snow grain properties in each snow layer and how these
+properties change over time. The grain properties of dendricity, sphericity and
+size are prognosed in Crocus through metamorphism, compaction and impacts of
+wind drift. Furthermore, the snow albedo is calculated based on the snow grain
+properties from the top 3cm of the snowpack (*Vionnet et al., 2012*) and is
+calculated in three spectral bands (0.3-0.8, 0.8-1.5 and 1.5-2.5 `\mu m`).
+Impurities in aging snow are parameterized in the UV and visible spectral band
+(0.3-0.8 `\mu m`) from the age of the snow, with a time constant of 60 days. See
+*Vionnet et al. (2012)* for a detailed description of the albedo calculations.
+The albedo over ice is held constant at 0.38, 0.23 and 0.08 for the spectral
+bands 0.3-0.8, 0.8-1.5 and 1.5-2.5 `\mu m`, respectively. The sensible and
+latent heat are parameterized with an effective roughness length over snow and
+ice (see *Vionnet et al. (2012)* for further details).
+
+In the Crocus model, it is possible to divide the snow into a user-defined
+maximum number of dynamically evolving layers. As new snow is accumulated, a
+new active layer is added. As different snow layers become similar (based upon
+the number of user-set layers, the thickness of the snow layers and the snow
+grain characteristics), these snow layers will merge into single snow layers.
+
+The Crocus module is added to the Noah-MP land surface model in WRF-Hydro to
+act as a glacier mass balance model (*Eidhammer et al. 2021*). Over designated
+glacier grid points, the Crocus snow model represents both snow and ice, while
+outside of the designated glacier grid points, the regular three-layer snow
+model in Noah-MP is used. Since the current Crocus implementation in WRF-Hydro
+only acts over designated glacier grid points, we follow *Gerbaux et al. (2005)*
+and assume that the temperatures at the bottom of the glacier and the ground
+below are both at `0^\circ C`. Note that we have not yet incorporated
+fluxes between the glacier and the ground below; thus, there is a constant
+temperature boundary condition.
+
+Both Crocus and Noah-MP (for the non-glacier grid points) output runoff from
+snowmelt (and precipitation). This runoff is provided to the terrain routing
+models in WRF-Hydro.
+
+Note that the implementation of Crocus as a glacier mass balance model does not
+address glacier movement (i.e., plastic flow) nor lateral wind (re)distribution
+of snow. However, there are two options for including impacts on the snow due
+to wind. One of the options impacts the snow density during blowing snow events
+(*Brun et al., 1997*). This option is important in polar environments (*Brun et
+al., 1997*). The other option is the sublimation due to snow drift, which was
+implemented by *Vionnet et al. (2012)* and is included in the Crocus version
+used here.
+
+As implemented, if the glacier completely melts over a user-defined glacier
+grid point, the original Noah-MP module is used from this point on. Therefore,
+as currently implemented, the glacier cannot grow horizontally in extent; it
+can only decrease in extent, as no dynamic response of the ice mass is included
+in the model. Over short model time periods, the lack of increase in glacier
+extent might impact a few grid points at the edges of the glacier. However,
+given the expected increase in temperature in the future, we do not expect that
+limiting glacier horizontal growth will have a major impact over most studied
+glaciers as most are likely to decrease in mass and extent.
+
+.. rubric:: Running WRF-Hydro / Glacier
+
+Below is a description of how to run with Crocus as a glacier model. There are
+only a few namelist options that need to be added to :file:`namelist.hrldas`:
+
+.. code-block:: fortran
+
+ &CROCUS_nlist
+ crocus_opt = 1 ! 0 model is off, 1 model is on
+ act_lev = 40 ! 1-50, 20-40 normal options
+ /
+
+The initialization file wrfinput needs two additional fields to be defined:
+
+ | ``glacier``
+ | ``glacier_thickness``
+
+The field ``glacier`` should have the value of 1 over glacier gridpoints. The
+``glacier`` field can be provided by the user, or the user can use the glacier
+category from ``IVGTYP``.
+
+Here is an example of how to generate initial glacier fields for an “ideal”
+simulation with a homogeneous glacier thickness layer. In this case,
+``IVGTYP=24`` represents glaciers:
+
+.. code-block:: bash
+
+ ncap2 -O -s 'glacier=IVGTYP' wrfinput.nc wrfinput.nc
+ ncap2 -O -s 'where(glacier!=24) glacier=0' wrfinput.nc wrfinput.nc
+ ncap2 -O -s 'where(glacier==24) glacier=1' wrfinput.nc wrfinput.nc
+
+To create a 300 m thick glacier:
+
+.. code-block:: bash
+
+ ncap2 -O -s 'glacier_thickness=glacier*300' wrfinput.nc wrfinput.nc
+
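+A quick, illustrative sanity check that both fields were created:
+
+.. code-block:: bash
+
+ # Confirm that the glacier and glacier_thickness fields now exist in wrfinput
+ ncdump -h wrfinput.nc | grep -i glacier
+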
+At initialization, it is assumed that the glacier consists of only ice, and the
+density is that of pure ice (`900 \frac{kg}{m^3}`). Within the user-defined maximum
+layers (``act_lev``) the glacier is initialized with all the layers having the same
+assumed density and snow grain properties. As new snow accumulates during the
+simulations, the layers representing the glacier will start to merge since all
+layers contain the initialized ice.
+
+.. rubric:: Crocus outputs
+
+.. table::
+ :width: 90%
+ :align: center
+ :name: table-a16
+
+ +--------------+-----------+----------------------------------------------+--------------------+
+ | | Dimension | Explanation | Units |
+ +==============+===========+==============================================+====================+
+ | PSNOWSWE | 3D | Snow water equivalent | `kg/m^2` |
+ +--------------+-----------+----------------------------------------------+--------------------+
+ | PSNOWTEMP | 3D | Glacier temperature | `K` |
+ +--------------+-----------+----------------------------------------------+--------------------+
+ | PSNOWALB | 2D | Albedo | `-` |
+ +--------------+-----------+----------------------------------------------+--------------------+
+ | PSNOWTHRUFAL | 2D | Surface runoff rate | `kg/m^2/s` |
+ +--------------+-----------+----------------------------------------------+--------------------+
+ | PSNOWHEIGHT | 2D | Total glacier thickness | `m` |
+ +--------------+-----------+----------------------------------------------+--------------------+
+ | PSNOWTOTSWE | 2D | Total glacier snow water equivalent | `kg/m^2` |
+ +--------------+-----------+----------------------------------------------+--------------------+
+ | PSNOWGRAN1 | 3D | Snow grain parameter 1 | `-` |
+ +--------------+-----------+----------------------------------------------+--------------------+
+ | PSNOWGRAN2 | 3D | Snow grain parameter 2 | `-` |
+ +--------------+-----------+----------------------------------------------+--------------------+
+ | PSNOWDZ | 3D | Thickness of snow/ice layers | `m` |
+ +--------------+-----------+----------------------------------------------+--------------------+
+ | PSNOWHIST | 3D | Snow grain historical parameter | `-` |
+ +--------------+-----------+----------------------------------------------+--------------------+
+ | PSNOWLIQ | 3D | Liquid content | `kg/m^2` |
+ +--------------+-----------+----------------------------------------------+--------------------+
+ | PSNOWRHO | 3D | Snow/ice density | `kg/m^3` |
+ +--------------+-----------+----------------------------------------------+--------------------+
+ | FLOW_ICE | 2D | Accumulated surface runoff from ice surface | `kg/m^2` (or `mm`) |
+ +--------------+-----------+----------------------------------------------+--------------------+
+ | FLOW_SNOW | 2D | Accumulated surface runoff from snow surface | `kg/m^2` (or `mm`) |
+ +--------------+-----------+----------------------------------------------+--------------------+
+
+.. note::
+   Note on other WRF-Hydro outputs: the following outputs are populated from
+   both Noah-MP and Crocus. Over glacier gridpoints they are taken from
+   Crocus: ``ACCET``, ``ALBEDO``, ``SNOWEQV``, ``SNOWH`` and ``ACSNOWM``.
+
+   Currently there are no namelist options to change parameter values.
+   Several important parameters that can be modified can be found in
+   :file:`src/Land_models/NoahMP/phys/surfex/modd_snow_par.F90`.
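+
+As a quick way to examine the Crocus fields listed above, the output files
+can be inspected with the netCDF command-line tools. A minimal sketch is
+shown below; it assumes the Crocus fields are written to the land model
+output stream, and the file name shown is illustrative:
+
+.. code-block:: bash
+
+   # List the Crocus glacier variables present in a land model output file
+   ncdump -h 201809010000.LDASOUT_DOMAIN1 | grep -E 'PSNOW|FLOW_ICE|FLOW_SNOW'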
+
diff --git a/docs/userguide/conf.py b/docs/userguide/conf.py
new file mode 100644
index 000000000..4aa01779b
--- /dev/null
+++ b/docs/userguide/conf.py
@@ -0,0 +1,32 @@
+project = 'WRF-Hydro Modeling System'
+author = 'WRF-Hydro Team'
+copyright = '2024, '+author
+version = 'v5.4.0'
+release = '5.4.0'
+try:
+ import sphinx_rtd_theme
+ extensions = [
+ 'sphinx_rtd_theme',
+ ]
+ # html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]
+ html_theme = 'sphinx_rtd_theme'
+ html_theme_options = {
+ 'navigation_depth': -1
+ }
+except ImportError:
+ pass
+html_static_path = ['_static']
+html_css_files = ['ug_theme.css']
+numfig_secnum_depth = 2
+
+#these are enforced by rstdoc, but keep them for sphinx-build
+numfig = 0
+smartquotes = 0
+source_suffix = '.rest'
+templates_path = []
+language = 'en'
+highlight_language = "none"
+default_role = 'math'
+pygments_style = 'sphinx'
+exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store']
+master_doc = 'index'
diff --git a/docs/userguide/index.rest b/docs/userguide/index.rest
new file mode 100644
index 000000000..6d6bd9294
--- /dev/null
+++ b/docs/userguide/index.rest
@@ -0,0 +1,113 @@
+.. vim: syntax=rst
+.. include:: meta.rest
+
+.. image:: media/wrfhydro-banner.png
+ :align: center
+
+=========================================================
+The NCAR WRF-Hydro® Modeling System Technical Description
+=========================================================
+.. rst-class:: center
+
+ | Version |version_long|
+ |
+ | Originally Created:
+ | April 14, 2013
+ |
+ | Updated:
+ | October 17, 2024
+
+Until further notice, please cite the WRF-Hydro® modeling system as
+follows:
+
+Gochis, D.J., M. Barlage, R. Cabell, M. Casali, A. Dugger, T. Eidhammer,
+K. FitzGerald, M. McAllister, J. McCreight, A. McCluskey, A. RafieeiNasab,
+S. Rasmussen, L. Read, K. Sampson, D. Yates, Y. Zhang (2024).
+*The WRF-Hydro® Modeling System Technical Description,* (Version 5.4).
+NCAR Technical Note. 107 pages. Available online at:
+https://wrf-hydro.readthedocs.io/en/latest/
+
+.. rubric:: FORWARD
+
+This Technical Description describes the WRF-Hydro® model coupling
+architecture and physics options, released in Version 5.4 in Oct. 2024.
+As the WRF-Hydro® modeling system is developed further, this document
+will be continuously enhanced and updated. Please send feedback to
+wrfhydro@ucar.edu.
+
+.. rubric:: Prepared by:
+
+David Gochis, Michael Barlage, Ryan Cabell, Matt Casali, Aubrey Dugger, Trude
+Eidhammer, Katelyn FitzGerald, Molly McAllister, James McCreight, Alyssa
+McCluskey, Arezoo RafieeiNasab, Soren Rasmussen, Laura Read, Kevin Sampson,
+David Yates, and Yongxin Zhang
+
+.. rubric:: Special Acknowledgments:
+
+Development of the NCAR WRF-Hydro system has been significantly enhanced
+through numerous collaborations. The following persons are graciously
+thanked for their contributions to this effort:
+
+John McHenry and Carlie Coats, Baron Advanced Meteorological Services
+
+Martyn Clark and Fei Chen, National Center for Atmospheric Research
+
+Zong-Liang Yang, Cedric David, Peirong Lin and David Maidment of the
+University of Texas at Austin
+
+Harald Kunstmann, Benjamin Fersch and Thomas Rummler of Karlsruhe
+Institute of Technology, Garmisch-Partenkirchen, Germany
+
+Alfonso Senatore, University of Calabria, Cosenza, Italy
+
+Brian Cosgrove, Ed Clark, Fernando Salas, Trey Flowers, Xia Feng, Yuqiong Liu,
+Nels Frazier, Fred Ogden, Dave Mattern, Don Johnson, and Tom Graziano of the
+National Oceanic and Atmospheric Administration Office of Water Prediction
+
+Ismail Yucel, Middle East Technical University, Ankara, Turkey
+
+Erick Fredj, The Jerusalem College of Technology, Jerusalem, Israel
+
+Amir Givati, Surface water and Hydrometeorology Department, Israeli
+Hydrological Service, Jerusalem.
+
+Antonio Parodi, Fondazione CIMA - Centro Internazionale in Monitoraggio
+Ambientale, Savona, Italy
+
+Blair Greimann, Sedimentation and Hydraulics section, U.S. Bureau of
+Reclamation
+
+Z George Xue and Dongxiao Yin, Louisiana State University
+
+Funding support for the development and application of the WRF-Hydro®
+modeling system has been provided by:
+
+The National Science Foundation and the National Center for Atmospheric
+Research
+
+The U.S. National Weather Service
+
+The Colorado Water Conservation Board
+
+Baron Advanced Meteorological Services
+
+National Aeronautics and Space Administration (NASA)
+
+National Oceanic and Atmospheric Administration (NOAA) Office of Water
+Prediction (OWP)
+
+
+.. toctree::
+ :hidden:
+
+ Preface / Acknowledgements
+ introduction
+ model-code-config
+ model-physics
+ nudging
+ model-inputs-preproc
+ model-outputs
+ references
+ appendices
diff --git a/docs/userguide/introduction.rest b/docs/userguide/introduction.rest
new file mode 100644
index 000000000..88468973b
--- /dev/null
+++ b/docs/userguide/introduction.rest
@@ -0,0 +1,264 @@
+.. vim: syntax=rst
+.. include:: meta.rest
+
+1. Introduction
+===============
+
+The purpose of this technical note is to describe the physical
+parameterizations, numerical implementation, coding conventions and
+software architecture for the NCAR Weather Research and Forecasting
+model (WRF) hydrological modeling system, hereafter referred to as
+WRF-Hydro. The system is intended to be flexible and extensible and
+users are encouraged to develop, add and improve components to meet
+their application needs.
+
+It is critical to understand that, like the WRF atmospheric modeling
+system, the WRF-Hydro modeling system is not a singular 'model' per se;
+rather, it is a modeling architecture that facilitates coupling of
+multiple alternative hydrological process representations. There are
+numerous (over 100) different configuration permutations possible in
+WRF-Hydro Version |version_short|. Users need to become familiar with the
+concepts behind the processes within the various model options in order to
+optimally tailor the system for their particular research and
+application activities.
+
+1.1 Brief History
+-----------------
+
+The WRF-Hydro modeling system provides a means to couple hydrological
+model components to atmospheric models and other Earth System modeling
+architectures. The system is intended to be extensible and is built upon
+a modular Modern Fortran architecture. The code has also been parallelized
+for distributed memory parallel computing applications. Numerous options
+for terrestrial hydrologic routing physics are contained within Version
+|version_short| of WRF-Hydro but users are encouraged to add additional components
+to meet their research and application needs. The initial version of
+WRF-Hydro, developed in 2003 and referred to as 'Noah-distributed' after the
+underlying Noah land surface model upon which the original code was based,
+included a distributed, 3-dimensional, variably-saturated surface and
+subsurface flow model.
+Initially, the implementation of terrain routing and, subsequently,
+channel and reservoir routing functions into the 1-dimensional Noah land
+surface model was motivated by the need to account for increased
+complexity in land surface states and fluxes and to provide
+physically-consistent land surface flux and stream channel discharge
+information for hydrometeorological applications. The original
+implementation of the surface overland flow and subsurface saturated
+flow modules into the Noah land surface model are described by Gochis
+and Chen (2003). In that work, a simple subgrid
+disaggregation-aggregation procedure was employed as a means of mapping
+land surface hydrological conditions from a “coarsely” resolved land
+surface model grid to a much more finely resolved terrain routing grid
+capable of adequately resolving the dominant local landscape gradient
+features responsible for the gravitational redistribution of terrestrial
+moisture. Since then numerous improvements to the Noah-distributed model
+have occurred including optional selection for 2-dimensional (in `x` and
+`y`) or 1-dimensional (“steepest descent” or so-called “D8” methodologies)
+terrain routing, a 1-dimensional, grid-based, hydraulic routing model, a
+reservoir routing model, 2 reach-based hydrologic channel routing
+models, and a simple empirical baseflow estimation routine. In 2004, the
+entire modeling system, then referred to as the NCAR WRF-Hydro
+hydrological modeling extension package was coupled to the Weather
+Research and Forecasting (WRF) mesoscale meteorological model (*Skamarock
+et al., 2005*) thereby permitting a physics-based, fully coupled land
+surface hydrology-regional atmospheric modeling capability for use in
+hydrometeorological and hydroclimatological research and applications.
+The code has since been fully parallelized for high-performance
+computing applications. During late 2011 and 2012, the WRF-Hydro code
+underwent a major reconfiguration of its coding structures to facilitate
+greater and easier extensibility and upgradability with respect to the
+WRF model, other hydrological modeling components, and other Earth
+system modeling frameworks. The new code and directory structure
+implemented is reflected in this document. Additional changes to the
+directory structure occurred during 2014-2015 to accommodate the
+coupling with the new Noah-MP land modeling system. Between 2015-2018,
+new capabilities were added to permit more generalized, user-defined
+mapping onto irregular objects, such as catchments or hydrologic
+response units. As additional changes and enhancements to WRF-Hydro
+occur, they will be documented in future versions of this document.
+
+1.2 Model Description
+------------------------
+
+WRF-Hydro has been developed to facilitate improved representations of
+terrestrial hydrologic processes related to the spatial redistribution
+of surface, subsurface and channel waters across the land surface and to
+facilitate coupling of hydrologic models with atmospheric models.
+Switch-activated modules in WRF-Hydro enable treatment of terrestrial
+hydrological physics, which have either been created or have been
+adapted from existing distributed hydrological models. The conceptual
+architecture for WRF-Hydro is shown in Figures 1.1 and 1.2 where
+WRF-Hydro exists as a coupling architecture (blue box) or “middle-ware”
+layer between weather and climate models and terrestrial hydrologic
+models and land data assimilation systems. WRF-Hydro can also operate in
+a standalone mode as a traditional land surface hydrologic modeling
+system.
+
+.. _figure-1.1:
+.. figure:: media/conceptual_diagram_wrfhydro.png
+ :align: center
+ :scale: 80%
+
+ **Figure 1.1.** Generalized conceptual schematic of the WRF-Hydro
+ architecture showing various categories of model components.
+
+.. figure:: media/coupling_schematic.png
+ :align: center
+ :scale: 80%
+
+ **Figure 1.2.** Model schematic illustrating where many existing
+ atmosphere, land surface and hydrological model components *could* fit
+ into the WRF-Hydro architecture. NOTE: Not all of these models are
+ currently coupled into WRF-Hydro at this time. This schematic is meant
+ to be illustrative. Components which are coupled have an asterisk (\*)
+ by their name.
+
+WRF-Hydro is designed to enable improved simulation of land surface
+hydrology and energy states and fluxes at a fairly high spatial
+resolution (typically 1 km or less) using a variety of physics-based and
+conceptual approaches. As such, it is intended to be used as either a
+land surface model in both standalone (“uncoupled” or “offline”) mode
+and fully-coupled (to an atmospheric model) mode. Both time-evolving
+“forcing” and static input datasets are required for model operation.
+The exact specification of both forcing and static data depends greatly
+on the selection of model physics and component options to be used. The
+principal model physics options in WRF-Hydro include:
+
+- 1-dimensional (vertical) land surface parameterization
+
+- surface overland flow
+
+- saturated subsurface flow
+
+- channel routing
+
+- reservoir routing
+
+- conceptual/empirical baseflow
+
+Both the Noah land surface and Noah-MP land surface model options are
+available for use in the current version of WRF-Hydro. The rest of
+this document will focus on their implementation. Future versions will
+include other land surface model options.
+
+Like nearly all current land surface models, the Noah and Noah-MP land
+surface parameterizations require a few basic meteorological forcing
+variables. Required meteorological forcing variables are listed in Table
+1.1.
+
+.. table:: **Table 1.1** Required input meteorological forcing variables for the
+ Noah and Noah-MP LSMs
+ :width: 90%
+ :align: center
+
+ +----------------------------------------+-----------+
+ | **Variable** | **Units** |
+ +========================================+===========+
+ | Incoming shortwave radiation | `W/m^2` |
+ +----------------------------------------+-----------+
+ | Incoming longwave radiation | `W/m^2` |
+ +----------------------------------------+-----------+
+ | Specific humidity | `kg/kg` |
+ +----------------------------------------+-----------+
+ | Air temperature | `K` |
+ +----------------------------------------+-----------+
+ | Surface pressure | `Pa` |
+ +----------------------------------------+-----------+
+ | Near surface wind in the u - component | `m/s` |
+ +----------------------------------------+-----------+
+ | Near surface wind in the v-component | `m/s` |
+ +----------------------------------------+-----------+
+ | Liquid water precipitation rate | `mm/s` |
+ +----------------------------------------+-----------+
+
+*[Different land surface models may require other or additional forcing
+variables or the specification of forcing variables in different units.]*
+
+When coupled to the WRF regional atmospheric model, the meteorological
+forcing data are provided by the atmospheric model at a frequency
+dictated by the land surface model time step specified in WRF. When run
+in standalone mode, meteorological forcing data must be provided as
+gridded input time series. Further details on the preparation of forcing
+data for standalone WRF-Hydro execution are provided in :ref:`section-5.7`.
+
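+When preparing standalone forcing, one quick check is to list the contents
+of a forcing file and compare the variables against Table 1.1 (a minimal
+sketch; the file name is illustrative and variable names depend on the
+forcing preparation workflow used):
+
+.. code-block:: bash
+
+   # List the variables contained in one hourly forcing file
+   ncdump -h 2018090100.LDASIN_DOMAIN1
+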
+External, third party, Geographic Information System (GIS) tools are
+used to delineate a stream channel network, open water (i.e., lake,
+reservoir, and ocean) grid cells and groundwater/baseflow basins. Water
+features are mapped onto the high-resolution terrain-routing grid and
+post-hoc consistency checks are performed to ensure consistency between
+the coarse resolution Noah/Noah-MP land model grid and the fine
+resolution terrain and channel routing grid.
+
+The WRF-Hydro model components calculate fluxes of energy and moisture
+back to the atmosphere and, in the case of moisture fluxes, to stream and
+river channels and through reservoirs. Depending on the physics options
+selected, the primary output variables include, but are not limited to,
+those in the table below. Output variables and options are discussed in
+detail in :ref:`section-6.0`.
+
+.. table:: **Table 1.2** Primary Output data from WRF-Hydro
+ :width: 90%
+ :align: center
+
+ +-----------------------------------------------------------+------------+
+ | **Variable** | **Units** |
+ +===========================================================+============+
+ | Surface latent heat flux | `W/m^2` |
+ +-----------------------------------------------------------+------------+
+ | Surface sensible heat flux | `W/m^2` |
+ +-----------------------------------------------------------+------------+
+ | Ground heat flux | `W/m^2` |
+ +-----------------------------------------------------------+------------+
+ | Ground surface and/or canopy skin temperature | `K` |
+ +-----------------------------------------------------------+------------+
+ | Surface evaporation components (soil evaporation, | `kg/m^2/s` |
+ | transpiration, canopy water evaporation, snow sublimation | |
+ | and ponded water evaporation) | |
+ +-----------------------------------------------------------+------------+
+ | Soil moisture | `m^3/m^3` |
+ +-----------------------------------------------------------+------------+
+ | Soil temperature | `K` |
+ +-----------------------------------------------------------+------------+
+ | Deep soil drainage | |
+ +-----------------------------------------------------------+------------+
+ | Surface runoff | `mm` |
+ +-----------------------------------------------------------+------------+
+ | Canopy moisture content | `mm` |
+ +-----------------------------------------------------------+------------+
+ | Snow depth | `m` |
+ +-----------------------------------------------------------+------------+
+ | Snow liquid water equivalent | `mm` |
+ +-----------------------------------------------------------+------------+
+ | Stream channel inflow (optional with terrain routing) | `mm` |
+ +-----------------------------------------------------------+------------+
+ | Channel flow rate (optional with channel routing) | `m^3/s` |
+ +-----------------------------------------------------------+------------+
+ | Channel flow depth (optional with channel routing) | `mm` |
+ +-----------------------------------------------------------+------------+
+ | Reservoir height and discharge (optional with channel and | |
+ | reservoir routing) | |
+ +-----------------------------------------------------------+------------+
+
+WRF-Hydro has been developed for Linux-based operating systems including
+small local clusters and high-performance computing systems.
+Additionally, the model code has also been ported to a selection of
+virtual machine environments (e.g. "containers") to enable the use of
+small domain cases on many common desktop computing platforms (e.g.
+Windows and MacOS). The parallel computing schema is provided in
+:ref:`section-2.3`. WRF-Hydro utilizes a combination of netCDF and flat
+ASCII file formats.
+
+The majority of input and output is handled using the netCDF data format
+and the netCDF library is a requirement for running the model. Details on the
+software requirements are available online on the FAQs page of the
+website as well as in the *How To Build & Run WRF-Hydro V5 in Standalone
+Mode* document also available from
+https://ral.ucar.edu/projects/wrf_hydro.
+
+WRF-Hydro is typically set up as a computationally-intensive modeling
+system. Simple small domains (e.g. 16 `km^2`) can be configured to
+run on a desktop platform. Large-domain model runs can require hundreds
+or thousands of processors. We recommend beginning with an example “test
+case” we supply at the WRF-Hydro website
+https://ral.ucar.edu/projects/wrf_hydro before moving to your region of
+interest, particularly if your region or domain is reasonably large.
\ No newline at end of file
diff --git a/docs/userguide/media/aggfactr.png b/docs/userguide/media/aggfactr.png
new file mode 100644
index 000000000..059467825
Binary files /dev/null and b/docs/userguide/media/aggfactr.png differ
diff --git a/docs/userguide/media/channel-props.svg b/docs/userguide/media/channel-props.svg
new file mode 100644
index 000000000..9ee1b26da
--- /dev/null
+++ b/docs/userguide/media/channel-props.svg
@@ -0,0 +1,34 @@
+
\ No newline at end of file
diff --git a/docs/userguide/media/channel-routing-grid-link.png b/docs/userguide/media/channel-routing-grid-link.png
new file mode 100644
index 000000000..02f677272
Binary files /dev/null and b/docs/userguide/media/channel-routing-grid-link.png differ
diff --git a/docs/userguide/media/channel-terms.png b/docs/userguide/media/channel-terms.png
new file mode 100644
index 000000000..3c5677797
Binary files /dev/null and b/docs/userguide/media/channel-terms.png differ
diff --git a/docs/userguide/media/channel_decomp.png b/docs/userguide/media/channel_decomp.png
new file mode 100644
index 000000000..65d3fbe04
Binary files /dev/null and b/docs/userguide/media/channel_decomp.png differ
diff --git a/docs/userguide/media/conceptual_diagram_wrfhydro.png b/docs/userguide/media/conceptual_diagram_wrfhydro.png
new file mode 100644
index 000000000..c926ab89d
Binary files /dev/null and b/docs/userguide/media/conceptual_diagram_wrfhydro.png differ
diff --git a/docs/userguide/media/coupling_schematic.png b/docs/userguide/media/coupling_schematic.png
new file mode 100644
index 000000000..cee97e1c7
Binary files /dev/null and b/docs/userguide/media/coupling_schematic.png differ
diff --git a/docs/userguide/media/gridded_decomp.png b/docs/userguide/media/gridded_decomp.png
new file mode 100644
index 000000000..2a072679f
Binary files /dev/null and b/docs/userguide/media/gridded_decomp.png differ
diff --git a/docs/userguide/media/groundwater.svg b/docs/userguide/media/groundwater.svg
new file mode 100644
index 000000000..91253bbbc
--- /dev/null
+++ b/docs/userguide/media/groundwater.svg
@@ -0,0 +1,1417 @@
+
+
+
+
diff --git a/docs/userguide/media/hydro-physics-permutations.png b/docs/userguide/media/hydro-physics-permutations.png
new file mode 100644
index 000000000..027899a23
Binary files /dev/null and b/docs/userguide/media/hydro-physics-permutations.png differ
diff --git a/docs/userguide/media/level-pool.png b/docs/userguide/media/level-pool.png
new file mode 100644
index 000000000..7ffe64dce
Binary files /dev/null and b/docs/userguide/media/level-pool.png differ
diff --git a/docs/userguide/media/modular_calling.png b/docs/userguide/media/modular_calling.png
new file mode 100644
index 000000000..6b1c25c6e
Binary files /dev/null and b/docs/userguide/media/modular_calling.png differ
diff --git a/docs/userguide/media/noahmp-physics-permutations.png b/docs/userguide/media/noahmp-physics-permutations.png
new file mode 100644
index 000000000..1ab294414
Binary files /dev/null and b/docs/userguide/media/noahmp-physics-permutations.png differ
diff --git a/docs/userguide/media/nudging-workflow.svg b/docs/userguide/media/nudging-workflow.svg
new file mode 100644
index 000000000..88a8c6c69
--- /dev/null
+++ b/docs/userguide/media/nudging-workflow.svg
@@ -0,0 +1,998 @@
+
+
+
+
diff --git a/docs/userguide/media/nwm-wrf-hydro.png b/docs/userguide/media/nwm-wrf-hydro.png
new file mode 100644
index 000000000..89df8e06a
Binary files /dev/null and b/docs/userguide/media/nwm-wrf-hydro.png differ
diff --git a/docs/userguide/media/overland-flow.png b/docs/userguide/media/overland-flow.png
new file mode 100644
index 000000000..366549b5a
Binary files /dev/null and b/docs/userguide/media/overland-flow.png differ
diff --git a/docs/userguide/media/physics-inputs.png b/docs/userguide/media/physics-inputs.png
new file mode 100644
index 000000000..cd5268c6d
Binary files /dev/null and b/docs/userguide/media/physics-inputs.png differ
diff --git a/docs/userguide/media/restarts.png b/docs/userguide/media/restarts.png
new file mode 100644
index 000000000..359fa0372
Binary files /dev/null and b/docs/userguide/media/restarts.png differ
diff --git a/docs/userguide/media/subsurface-flow.png b/docs/userguide/media/subsurface-flow.png
new file mode 100644
index 000000000..07961b343
Binary files /dev/null and b/docs/userguide/media/subsurface-flow.png differ
diff --git a/docs/userguide/media/trapezoid-compound-channel.png b/docs/userguide/media/trapezoid-compound-channel.png
new file mode 100644
index 000000000..d43f55b67
Binary files /dev/null and b/docs/userguide/media/trapezoid-compound-channel.png differ
diff --git a/docs/userguide/media/user-defined-mapping.png b/docs/userguide/media/user-defined-mapping.png
new file mode 100644
index 000000000..a256eab96
Binary files /dev/null and b/docs/userguide/media/user-defined-mapping.png differ
diff --git a/docs/userguide/media/wrf-hydro-components.png b/docs/userguide/media/wrf-hydro-components.png
new file mode 100644
index 000000000..e38fa9209
Binary files /dev/null and b/docs/userguide/media/wrf-hydro-components.png differ
diff --git a/docs/userguide/media/wrf_coupling.png b/docs/userguide/media/wrf_coupling.png
new file mode 100644
index 000000000..f8495c084
Binary files /dev/null and b/docs/userguide/media/wrf_coupling.png differ
diff --git a/docs/userguide/media/wrfhydro-banner.png b/docs/userguide/media/wrfhydro-banner.png
new file mode 100644
index 000000000..b98bead35
Binary files /dev/null and b/docs/userguide/media/wrfhydro-banner.png differ
diff --git a/docs/userguide/media/wrfhydro-outputs.png b/docs/userguide/media/wrfhydro-outputs.png
new file mode 100644
index 000000000..e74fade78
Binary files /dev/null and b/docs/userguide/media/wrfhydro-outputs.png differ
diff --git a/docs/userguide/meta.rest b/docs/userguide/meta.rest
new file mode 100644
index 000000000..740f01082
--- /dev/null
+++ b/docs/userguide/meta.rest
@@ -0,0 +1,17 @@
+.. |version_short| replace:: 5.4
+.. |version_long| replace:: 5.4.0
+
+.. role:: center
+ :class: center
+
+.. role:: underline
+ :class: underline
+
+.. role:: file
+ :class: filename
+
+.. role:: program
+ :class: program
+
+.. role:: output
+ :class: filename
\ No newline at end of file
diff --git a/docs/userguide/model-code-config.rest b/docs/userguide/model-code-config.rest
new file mode 100644
index 000000000..c5d4a4d07
--- /dev/null
+++ b/docs/userguide/model-code-config.rest
@@ -0,0 +1,465 @@
+.. vim: syntax=rst
+.. include:: media.rest
+.. include:: meta.rest
+
+2. Model Code and Configuration Description
+===========================================
+
+This chapter presents the technical description of the WRF-Hydro model
+code. The chapter is divided into the following sections:
+
+2.1 Brief Code Overview
+-----------------------
+
+WRF-Hydro is written in a modularized, modern Fortran coding structure whose
+routing physics modules are switch-activated through a model namelist
+file called :file:`hydro.namelist` (see :ref:`section-2.7`). The code has been
+parallelized for execution on high-performance, parallel computing
+architectures including Linux operating system commodity clusters and
+multi-processor desktops as well as multiple supercomputers. More detailed model
+requirements depend on the choice of model driver, described in the next section.
+
+2.2 Driver Level Description
+----------------------------
+
+WRF-Hydro is essentially a group of modules and functions which handle
+the communication of information between atmosphere components (such as
+WRF, CESM or prescribed meteorological analyses) and sets of land
+surface hydrology components. From a coding perspective the WRF-hydro
+system can be called from an existing architecture such as the WRF
+model, the CESM, NASA LIS, etc. or can run in a standalone mode with its
+own driver which has adapted part of the NCAR High Resolution Land Data
+Assimilation System (HRLDAS). Each new coupling effort requires some
+basic modifications to a general set of functions to manage the
+coupling. In WRF-Hydro, each new system that WRF-Hydro is coupled into
+gets assigned to a directory indicating the name of the coupling
+component WRF-Hydro is coupled to. For instance, the code which handles
+the coupling to the WRF model is contained in the :file:`WRF_cpl/` directory in
+the WRF-Hydro system. Similarly, the code which handles the coupling to
+the offline Noah land surface modeling system is contained within the
+:file:`Noah_cpl/` directory and so on. Description of each directory is
+provided in :ref:`section-2.4`.
+
+The coupling structure is illustrated here, briefly, in terms of the
+coupling of WRF-Hydro into the WRF model. A similar approach is used for
+coupling the WRF-Hydro extension package into other modeling systems or
+for coupling other modeling systems into WRF-Hydro.
+
+ *Example:* For coupled WRF/WRF-Hydro runs the WRF-Hydro components are
+ compiled as a single library function call with the WRF system. As such,
+ a single executable is created upon compilation (:program:`wrf.exe`). As
+ illustrated in :ref:`Figure 2.1 <figure2.1>`, WRF-Hydro is called directly
+ from WRF in the WRF surface driver module (:file:`phys/module_surface_driver.F90`).
+ The code that manages the communication is the :file:`WRF_drv_Hydro.F90`
+ interface module that is contained within the :file:`WRF_cpl/` directory.
+ The :file:`WRF_drv_Hydro.F90` interface module is the specific instance of
+ a 'General WRF-Hydro Coupling Interface' for the WRF model which passes data,
+ grid and time information between WRF and WRF-Hydro. Components within
+ WRF-Hydro then manage the dynamic regridding (“data mapping”) and sub-component
+ routing functions (e.g. surface, subsurface and/or channel routing) within
+ WRF-Hydro (see :ref:`Fig. 1.1 <figure-1.1>` for an illustration of components
+ contained within WRF-Hydro).
+
+Upon completion of the user-specified routing functions, WRF-Hydro will
+remap the data back to the WRF model grid and then pass the necessary
+variables back to the WRF model through the :file:`WRF_drv_Hydro.F90` interface
+module. Therefore, the key component of the WRF-Hydro system is the
+proper construction of the :file:`WRF_cpl_Hydro` interface module (or more
+generally :file:`{XXX}_cpl_Hydro`). Users wishing to couple new modules to
+WRF-Hydro will need to create a unique “General WRF-Hydro Coupling
+Interface” for their components. Some additional examples of this
+interface module are available upon request for users to build new
+coupling components. This simple coupling interface is similar in
+structure to other general model coupling interfaces such as those
+within the Earth System Modeling Framework (ESMF) or the Community
+Surface Dynamics Modeling System (CSDMS).
+
+.. _figure2.1:
+.. figure:: media/wrf_coupling.png
+ :align: center
+
+ **Figure 2.1** Schematic illustrating the coupling and calling structure
+ of WRF-Hydro from the WRF Model.
+
+The model code has been compiled using the Intel :program:`ifort` compiler and
+the freely-available GNU Fortran compiler :program:`gfortran` for use with
+Unix-type operating systems on desktops, clusters, and supercomputing
+systems. Because the WRF-Hydro modeling system relies on netCDF input and
+output file conventions, netCDF Fortran libraries must be installed and
+properly compiled on the system upon which WRF-Hydro is to be executed.
+Not doing so will result in numerous compilation error messages such as
+``undefined reference to netCDF library`` or similar.
+For further installation requirements see the FAQs page of the website
+as well as the *How To Build & Run WRF-Hydro v5 in Standalone Mode* document
+also available from https://ral.ucar.edu/projects/wrf_hydro.
+
+.. _section-2.3:
+
+2.3 Parallelization strategy
+----------------------------
+
+Parallelization of the WRF-Hydro code utilizes geographic domain
+decomposition and 'halo' array passing structures similar to those used
+in the WRF atmospheric model (:ref:`Figures 2.2 <figure2.2>` and :ref:`2.3 <figure2.3>`).
+Message passing between processors is accomplished using MPI protocols.
+Therefore, the relevant MPI libraries must be installed and properly compiled
+on the system upon which WRF-Hydro is to be executed in parallel mode.
+Currently a sequential compile is not supported, so MPI libraries are
+required even if running on a single core.
+
+.. figure:: media/gridded_decomp.png
+ :align: center
+ :name: figure2.2
+
+ **Figure 2.2** Schematic of parallel domain decomposition scheme in
+ WRF-Hydro. Boundary or 'halo' arrays in which memory is shared between
+ processors (P1 and P2) are shaded in purple.
+
+.. figure:: media/channel_decomp.png
+ :align: center
+ :name: figure2.3
+
+ **Figure 2.3** Schematic of parallel domain decomposition scheme in
+ WRF-Hydro as applied to channel routing. Channel elements (stars) are
+ communicated at boundaries via ‘halo’ arrays in which memory is shared
+ between processors (P1 and P2). Black and red stars indicate overlapping
+ channel elements used in the diffusive wave solver.
+
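+Because MPI is required even for single-processor runs, the compiled
+standalone model is launched through an MPI job launcher. A minimal sketch
+is shown below, assuming an ``mpirun``-style launcher and that the
+executable (named :program:`wrf_hydro.exe` here for illustration) and its
+namelists have been copied into a run directory:
+
+.. code-block:: bash
+
+   # Launch the standalone model on 4 cores from within the run directory
+   mpirun -np 4 ./wrf_hydro.exe
+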
+.. _section-2.4:
+
+2.4 Directory Structures
+------------------------
+
+The top-level directory structure of the code is provided below as
+nested under :file:`trunk/NDHMS` and subdirectory structures are described
+thereafter. The tables below provide brief descriptions of the file
+contents of each directory where the model code resides.
+
+.. default-role:: file
+
+.. table:: **Table 2.1** Description of the file contents of each directory
+ where the model *code* resides
+ :align: center
+ :width: 90%
+ :name: table-2.1
+
+ +-------------------------+--------------------------------------------------+
+ | **File/directory name** | **Description** |
+ | | |
+ +=========================+==================================================+
+ | Main code files and directories (under version control in |
+ | a GitHub repository): |
+ +-------------------------+--------------------------------------------------+
+ | :underline:`Top-Level Files and Directories:` |
+ +-------------------------+--------------------------------------------------+
+ | `CMakeLists.txt` | Top-level CMake build script used to compile |
+ | | the WRF-Hydro model |
+ +-------------------------+--------------------------------------------------+
+ | `docs/` | Pointer to location of full documentation (i.e. |
+ | | this document). |
+ +-------------------------+--------------------------------------------------+
+ | `tests` | Scripts and data used to test the model |
+ +-------------------------+--------------------------------------------------+
+ | `src` | WRF-Hydro Model source code |
+ +-------------------------+--------------------------------------------------+
+ | :underline:`Source code directories under \`src/\`:` |
+ +-------------------------+--------------------------------------------------+
+ | `CPL/Noah_cpl/` | Contains the WRF-Hydro coupling interface for |
+ | | coupling WRF-Hydro components with the |
+ | | standalone (offline) Noah land surface model |
+ | | data assimilation and forecasting system |
+ +-------------------------+--------------------------------------------------+
+ | `CPL/NoahMP_cpl/` | Contains the WRF-Hydro coupling interface for |
+ | | coupling WRF-Hydro components with the |
+ | | standalone (offline) Noah-MP land surface model |
+ | | data assimilation and forecasting system |
+ +-------------------------+--------------------------------------------------+
+ | `CPL/WRF_cpl/` | Contains the WRF-Hydro coupling interface for |
+ | | coupling WRF-Hydro components with the WRF |
+ | | system |
+ +-------------------------+--------------------------------------------------+
+ | `CPL/CLM_cpl/` , | Work in progress for ongoing coupling work. |
+ | `CPL/LIS_cpl/` , | Only NUOPC is actively supported. |
+ | `CPL/NUOPC_cpl/` | |
+ +-------------------------+--------------------------------------------------+
+ | `Data_Rec/` | Contains some data declaration modules |
+ +-------------------------+--------------------------------------------------+
+ | `Debug_Utilities/` | Utilities for debugging |
+ +-------------------------+--------------------------------------------------+
+ | `deprecated/` | Contains files not currently used |
+ +-------------------------+--------------------------------------------------+
+ | `HYDRO_drv/` | Contains the high-level WRF-Hydro component |
+ | | driver: `module_HYDRO_drv.F90` |
+ +-------------------------+--------------------------------------------------+
+ | `Land_models/Noah/` | Contains the Noah land surface model driver for |
+ | | standalone or uncoupled applications |
+ +-------------------------+--------------------------------------------------+
+ | `Land_models/NoahMP/` | Contains the Noah-MP land surface model driver |
+ | | for standalone or uncoupled applications |
+ +-------------------------+--------------------------------------------------+
+ | `MPP/` | Contains MPI parallelization routines and |
+ | | functions |
+ +-------------------------+--------------------------------------------------+
+ | `nudging/` | Contains nudging data assimilation routines and |
+ | | functions |
+ +-------------------------+--------------------------------------------------+
+ | `Rapid_routing/` | Contains the files necessary for RAPID routing |
+ | | model coupling. Unsupported as version of RAPID |
+ | | is out of date. |
+ +-------------------------+--------------------------------------------------+
+ | `Routing/` | Contains modules and drivers related to specific |
+ | | routing processes in WRF-Hydro |
+ +-------------------------+--------------------------------------------------+
+ | `template/` | Contains example namelist files for Noah, |
+ | | Noah-MP and the WRF-Hydro modules (HYDRO). |
+ | | Default and example parameter tables are also |
+ | | included for HYDRO. Note: Parameter tables for |
+ | | Noah and Noah-MP are stored within the |
+ | | :file:`Land_models` directory. |
+ +-------------------------+--------------------------------------------------+
+ | `utils/` | internal model versioning |
+ +-------------------------+--------------------------------------------------+
+ | :underline:`Files:` |
+ +-------------------------+--------------------------------------------------+
+ | `docs/BUILD.md` | WRF-Hydro build instructions for the standalone |
+ | | model |
+ +-------------------------+--------------------------------------------------+
+ | `wrf_hydro_config` | Configure script for coupled WRF \| WRF-Hydro |
+ | | configuration |
+ +-------------------------+--------------------------------------------------+
+ | `\*.json` | JSON files used for testing |
+ +-------------------------+--------------------------------------------------+
+ | Local files and directories created by CMake in the build directory |
+ | (not part of the version controlled repository): |
+ +-------------------------+--------------------------------------------------+
+ | :underline:`Directories:` |
+ +-------------------------+--------------------------------------------------+
+ | `lib/` | Directory where compiled libraries are written |
+ +-------------------------+--------------------------------------------------+
+ | `mods/` | Directory where compiled `.mod`` files are |
+ | | written upon compilation |
+ +-------------------------+--------------------------------------------------+
+ | `Run/` | Directory where model executable, example |
+ | | parameter tables, and example namelist files |
+ | | for the compiled model configuration will be |
+ | | populated. These files will be overwritten on |
+ | | compile. It is recommended the user copy the |
+ | | contents of this directory into an alternate |
+ | | location, separate from the code, to execute |
+ | | model runs. |
+ +-------------------------+--------------------------------------------------+
+
+.. table:: **Table 2.2** Modules within the :file:`Routing/` directory which relate to
+ routing processes in WRF-Hydro
+ :width: 90%
+ :align: center
+ :name: table-2.2
+
+ +--------------------------------------+-------------------------------------------------+
+ | **File/directory name** | **Description** |
+ | | |
+ +======================================+=================================================+
+ | `Overland/` | Directory containing overland routing modules |
+ +--------------------------------------+-------------------------------------------------+
+ | `Makefile` | Makefile for WRF-Hydro component |
+ +--------------------------------------+-------------------------------------------------+
+ | `module_channel_routing.F90` | Module containing WRF-Hydro channel routing |
+ | | components |
+ +--------------------------------------+-------------------------------------------------+
+ | `module_date_utilities_rt.F90` | Module containing various date/time utilities |
+ | | for routing routines |
+ +--------------------------------------+-------------------------------------------------+
+ | `module_GW_baseflow.F90` | Module containing model physics for simple |
+ | | baseflow model |
+ +--------------------------------------+-------------------------------------------------+
+ | `module_HYDRO_io.F90` | Module containing WRF-Hydro input and (some) |
+ | | output functions |
+ +--------------------------------------+-------------------------------------------------+
+ | `module_HYDRO_utils.F90` | Module containing several WRF-Hydro utilities |
+ | | |
+ +--------------------------------------+-------------------------------------------------+
+ | `module_lsm_forcing.F90` | Module containing the options for reading in |
+ | | different forcing data types |
+ +--------------------------------------+-------------------------------------------------+
+ | `module_noah_chan_param_init_rt.F90` | Module containing routines to initialize |
+ | | WRF-Hydro routing grids |
+ +--------------------------------------+-------------------------------------------------+
+ | `module_NWM_io.F90` | Module containing output routines to produce |
+ | | CF-compliant desired output files. |
+ +--------------------------------------+-------------------------------------------------+
+ | `module_NWM_io_dict.F90` | Dictionary to support CF-compliant output |
+ | | routines. |
+ +--------------------------------------+-------------------------------------------------+
+ | `module_RT.F90` | Module containing the calls to all the |
+ | | WRF-Hydro routing initialization |
+ +--------------------------------------+-------------------------------------------------+
+ | `module_UDMAP.F90` | Module for the user-defined mapping |
+ | | capabilities, currently used for NWM |
+ | | configuration (NHDPlus network) |
+ +--------------------------------------+-------------------------------------------------+
+ | `Noah_distr_routing.F90` | Module containing overland flow and subsurface |
+ | | physics routines and grid disaggregation |
+ | | routine |
+ +--------------------------------------+-------------------------------------------------+
+ | `module_gw_gw2d.F90` | Module containing routines for the experimental |
+ | | 2D groundwater model |
+ +--------------------------------------+-------------------------------------------------+
+
+.. default-role::
+
+2.5 Model Sequence of Operations
+--------------------------------
+
+The basic structure and sequencing of WRF-Hydro are diagrammatically
+illustrated in :ref:`Figure 2.4 <figure2.4>`. Overall system execution (time
+management, initialization, I/O and model completion) is handled by the
+WRF-Hydro system unless WRF-Hydro is coupled into, and beneath, a different
+modeling architecture.
+The WRF-Hydro system can either call an independent land model driver such
+as the NCAR High Resolution Land Data Assimilation System (HRLDAS) for
+both Noah and Noah-MP land surface models to execute column land surface
+physics or be called by a different modeling architecture such as WRF,
+the NCAR CESM, or the NASA LIS. When run in a standalone or “uncoupled”
+mode, WRF-Hydro must read in the meteorological forcing data necessary to
+perform land surface model calculations and it contains the necessary
+routines to do this. When run in a coupled mode with WRF or another larger
+architecture, WRF-Hydro receives meteorological forcing or land surface
+states and fluxes from the parent architecture. The basic execution
+process is as follows:
+
+ 1. Upon initialization static land surface physiographic data are read
+ into the WRF-Hydro system and the model domain and computational
+ arrays are established.
+
+ 2. Depending on whether or not WRF-Hydro is run offline as a standalone
+ system or whether it is coupled into another architecture, either
+ forcing data is read in or land surface states and fluxes are passed
+ in.
+
+ 3. For offline simulations which require land model execution, the
+ gridded column land surface model is executed.
+
+ 4. If routing is activated and there is a difference between the land
+ model grid and the routing grid, land surface states and fluxes are
+ then disaggregated to the high-resolution terrain routing grids.
+
+ 5. If activated, sub-surface routing physics are executed.
+
+ 6. If activated, surface routing physics are executed.
+
+ 7. If activated, the conceptual base flow model is executed.
+
+ 8. If activated, channel and reservoir routing components are executed.
+ Streamflow nudging is currently available to be applied within the
+ Muskingum-Cunge routing call.
+
+ 9. Updated land surface states and fluxes are then aggregated from the
+ high-resolution terrain routing grid to the land surface model grid
+ (if routing is activated and there is a difference between the land
+ model grid and the routing grid).
+
+ 10. Results from these integrations are then written to the model output
+ files and restart files or, in the case of a coupled WRF/WRF-Hydro
+ simulation, passed back to the WRF model.
+
+As illustrated at the bottom of :ref:`Figure 2.4 <figure2.4>`, a data
+assimilation component coupling WRF-Hydro with `NCAR's DART
+<https://www.image.ucar.edu/DAReS/DART/>`__ has been developed. This
+currently only works with WRF-Hydro in standalone mode. DART updates
+WRF-Hydro states independently of model time integration.
+
+.. _figure2.4:
+.. figure:: media/modular_calling.png
+ :align: center
+
+ **Figure 2.4** Modular calling structure of WRF-Hydro.
+
+2.6 WRF-Hydro compile-time options
+----------------------------------
+
+Compile time options are choices about the model structure which are
+determined when the model is compiled. Compile time choices select a
+WRF-Hydro instance from some of the options illustrated in
+:ref:`Figure 2.4. ` Compile time options fall into two
+categories: 1) the selected model driver, and 2) the compile options
+for the choice of driver. In this guide we limit the description of
+model drivers to WRF, Noah, and Noah-MP. Configuring, compiling, and
+running WRF-Hydro in standalone mode is described in detail in the
+*How To Build & Run WRF-Hydro V5 in Standalone Mode* document available
+from https://ral.ucar.edu/projects/wrf_hydro.
+
+Compile-time options are listed during the CMake build configuration
+process. These options are passed to CMake as variables using the
+``-D[OPTION]=[0|1]`` syntax. The options/variables are listed
+below along with a description of what each option does:
+
+.. parsed-literal::
+
+ ===================================================================
+ -- Start of WRF-Hydro Env VARIABLES
+ WRF_HYDRO = 1 *Always set to 1 for WRF-Hydro*
+
+ HYDRO_D = 0 *Set to 1 for enhanced diagnostic output*
+
+ WRF_HYDRO_RAPID = 0 *Currently unsupported, always set to 0*
+
+ SPATIAL_SOIL = 1 *Set to 1 to allow NoahMP LSM to use*
+                            *spatially distributed parameters*
+ *vs. a table based on soil class and*
+ *land use categories*
+
+ WRFIO_NCD_LARGE_FILE_SUPPORT = 0 *Set to 1 if using a*
+ *WRF/WRF-Hydro coupled build*
+
+ NCEP_WCOSS = 0 *Set to 1 if compile for NOAA WCOSS*
+
+ NWM_META = 0 *Set to 1 if using NWM-style metadata in output*
+
+ WRF_HYDRO_NUDGING = 0 *Set to 1 if using streamflow nudging*
+
+ OUTPUT_CHAN_CONN = 0 *Set to 1 to output channel network*
+ *diagnostic information*
+
+ PRECIP_DOUBLE = 0 *Set to 1 to double all incoming*
+ *precipitation (for debug purposes only)*
+
+ WRF_HYDRO_NUOPC = 0 *Set to 1 when using NUOPC coupling*
+ ===================================================================
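+
+For example, a build enabling diagnostic output and streamflow nudging
+could be configured as sketched below (the build directory name is
+illustrative; see :file:`docs/BUILD.md` for the supported build workflow):
+
+.. code-block:: bash
+
+   # Configure the build with selected compile-time options, then compile
+   cmake -B build -S . -DHYDRO_D=1 -DWRF_HYDRO_NUDGING=1
+   make -C build -j 4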
+
+.. _section-2.7:
+
+2.7 WRF-Hydro run time options
+------------------------------------
+
+There are two namelist files that users must edit in order to
+successfully execute the WRF-Hydro system in a standalone mode or
+“uncoupled” to WRF. One of these namelist files is the hydro.namelist
+file and in it are the various settings for operating all of the routing
+components of the WRF-Hydro system. The hydro.namelist file is
+internally commented so that it should be clear as to what is needed for
+each setting. A full annotated example of the hydro.namelist file is
+provided in :ref:`section-a5`.
+
+The second namelist is the namelist which specifies the land surface
+model options to be used. This namelist can change depending on which
+land model is to be used in conjunction with the WRF-Hydro routing
+components. For example, a user would use one namelist when running the
+Noah land surface model coupled to WRF-Hydro but that user would need to
+use a different namelist file when running the CLM model, the Noah-MP
+model or NASA LIS model coupled to WRF-Hydro. The reason for this is
+WRF-Hydro is intended to be *minimally-invasive* to other land surface
+models or land model driver structures and not require significant
+changes to those systems. This minimal invasiveness facilitates easier
+coupling with new systems and helps facilitate easy supportability and
+version control with those systems. When the standalone WRF-Hydro model
+is compiled the appropriate namelist.hrldas template file is copied over
+to the Run directory based upon the specified land surface model.
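+
+In practice, and as recommended in :ref:`Table 2.1 <table-2.1>`, the
+contents of the :file:`Run/` directory are usually copied to a separate
+run directory before the namelists are edited. A minimal sketch, with
+illustrative paths:
+
+.. code-block:: bash
+
+   # Copy the populated Run/ directory out of the build tree
+   cp -r build/Run ~/wrf_hydro_runs/my_domain
+   cd ~/wrf_hydro_runs/my_domain
+
+   # Edit the two run-time namelists for the chosen configuration
+   vi namelist.hrldas hydro.namelist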
+
+In WRF-Hydro v\ |version_short|, Noah and Noah-MP land surface models are the main
+land surface model options when WRF-Hydro is run in standalone mode.
+Both Noah and Noah-MP use a namelist file called namelist.hrldas, which,
+as noted above, will contain different settings for the two different
+land surface models. For a run where WRF-Hydro is coupled to the WRF
+model, the WRF model input file namelist.input becomes the second
+namelist file. Full annotated example namelist.hrldas files for Noah and
+Noah-MP are provided in :ref:`section-a3` and :ref:`section-a4`.
+
diff --git a/docs/userguide/model-inputs-preproc.rest b/docs/userguide/model-inputs-preproc.rest
new file mode 100644
index 000000000..3392f31df
--- /dev/null
+++ b/docs/userguide/model-inputs-preproc.rest
@@ -0,0 +1,732 @@
+.. vim: syntax=rst
+.. include:: meta.rest
+
+5. Model inputs and preprocessing
+=================================
+
+This chapter describes WRF-Hydro input and parameter file requirements
+and the preprocessing tools used to generate them.
+
+5.1 Overview of model inputs
+----------------------------
+
+.. _figure-5.1:
+.. figure:: media/physics-inputs.png
+ :align: center
+ :figwidth: 90%
+
+ **Figure 5.1** WRF-Hydro input and parameter files organized by model
+ physics component. See the Key for files specific to a certain land
+ model or channel configuration.
+
+WRF-Hydro requires a number of input files describing the model domain,
+parameters, initial conditions, and when run in a standalone
+configuration meteorological forcing files. A full list of these files
+broken up by model physics component is shown in :ref:`Figure 5.1 <figure-5.1>`.
+Note that the set of files required to run WRF-Hydro varies depending upon
+model configuration. For example, different land surface models and model
+physics components may require different parameter and input files.
+
+While some parameter files and templates are included with the model
+source code, most must be generated by the user. We provide a number of
+scripts and preprocessing utilities on the WRF-Hydro website
+(https://ral.ucar.edu/projects/wrf_hydro) in order to aid in this
+process. These include NCAR Command Language (NCL) scripts to regrid
+forcing data from commonly used data sources, R scripts to generate
+parameter and model initialization files, and a set of Python based
+ArcGIS pre-processing tools. The specific utilities used to generate
+different files are listed in :ref:`Table 5.1 <table-5.1>`. Users
+should be aware that these tools do not support all potential datasets
+and use cases and that the use of default parameters will often result
+in suboptimal model performance. More details regarding the pre-processing
+utilities, file requirements, and descriptions follow.
+
+In addition to the ArcGIS pre-processing tools, the WRF-Hydro group has also
+developed a set of open-source, Python-based preprocessing tools. These tools
+replicate existing capabilities and operate within the same software
+environments as WRF-Hydro. Built with Python 3 and libraries such as
+Whitebox Tools, NumPy, and GDAL, users can configure their environments
+using Anaconda, Miniconda, or custom setups. This initiative responds to user
+demand for an open-source option, ensuring portability and multi-platform
+support while offering efficient, parallelized geoprocessing. All underlying
+libraries are open-source, eliminating proprietary licensing restrictions.
+For more information, please visit:
+https://ral.ucar.edu/dataset/open-source-wrf-hydro-gis-pre-processing-tools-beta.
+
+5.2 Domain processing and description of surface physiographic input files
+--------------------------------------------------------------------------
+
+This subsection describes the process of defining the WRF-Hydro model
+domain, generating model initial conditions, and deriving geospatial
+input and parameter files via the WRF-Hydro GIS pre-processing tools. As
+noted in the previous section a number of scripts and utilities have
+been developed to facilitate the creation of these files. Additionally,
+we rely on a utility within the Weather Research and Forecasting (WRF)
+model preprocessing system (WPS) called GEOGRID to define the land
+surface model grid and relevant geospatial data and produce the
+resulting :file:`geo_em.d0{x}.nc` file hereafter referred to as a “geogrid”
+file. This geogrid file is then used as input to the ArcGIS preprocessing
+tools, along with external datasets such as high resolution topographic
+data, which generate the high resolution routing grid and all surface
+physiographic input data files required by the model. The geogrid file
+is also passed to utilities in order to generate land surface model
+initial condition files.
+
+5.2.1 Defining the model domain
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+The data required to define the domain and geospatial attributes of a
+spatially-distributed, or gridded, 1-dimensional (vertical) land surface
+model (LSM) are specified in a geogrid (:file:`geo_em.d0{x}.nc`) netCDF file.
+This file is generated by the :program:`GEOGRID` utility in the WRF preprocessing
+system (WPS). WPS is a preprocessing system that prepares both land surface and
+atmospheric data for use in the model. The GEOGRID component of WPS
+automates the procedure of defining in space, georeferencing and
+attributing most of the land surface parameter data required to execute
+both the Noah and Noah-MP land surface models. GEOGRID interpolates land
+surface terrain, soils and vegetation data from standard, readily
+available data products. These data are distributed as a geographical
+input data package via the WRF website. Complete documentation and user
+instructions for use of the WPS system are provided online by NCAR and
+are updated regularly and, thus, are not discussed in great detail here.
+This :file:`geo_em.d0{x}.nc` file is also required as input to other WRF-Hydro
+preprocessing utilities.
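+
+Once generated, the geogrid file's grid definition can be checked quickly
+with the netCDF command-line tools before it is passed to the WRF-Hydro
+pre-processing utilities (a minimal sketch; the domain number in the file
+name depends on the WPS configuration):
+
+.. code-block:: bash
+
+   # Inspect grid dimensions and global attributes (e.g. DX, DY, map projection)
+   ncdump -h geo_em.d01.nc | less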
+
+5.2.2 Initial land surface conditions
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+Initial conditions for the land surface, such as soil moisture, soil
+temperature, and snow states, are prescribed via the :file:`wrfinput_d0x.nc`
+file. This netCDF file can be generated one of two ways, through the
+:program:`real.exe` program within WRF or via an R script (:file:`create_Wrfinput.R`)
+distributed on the WRF-Hydro website. When created using the real.exe
+program in WRF, initial conditions are pulled from existing reanalysis
+or realtime products (see WRF documentation for data and system
+requirements). This will typically result in more realistic initial
+model states. However, the process is somewhat involved and requires the
+user to obtain additional external datasets.
+
+The R script will create a simplified version of the wrfinput file
+(:file:`wrfinput_d0{x}.nc`) including all necessary fields for the Noah-MP
+land surface model, but with spatially uniform initial conditions prescribed
+within the script. It requires only the geogrid file :file:`geo_em.d0{x}.nc`
+as input. Step-wise instructions and detailed
+requirements are included in the documentation distributed with the script.
+Users should be aware that the model will likely require additional spin-up
+time when initialized from this file.
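+
+A minimal check of the resulting initial conditions file might look like the
+following (a sketch; assumes the netCDF4-python library, an example file name,
+and the standard WRF state variable names ``SMOIS``, ``TSLB`` and ``SNOW``)::
+
+    from netCDF4 import Dataset
+
+    # Print the range of the initial soil moisture, soil temperature and
+    # snow water equivalent states written to the wrfinput file
+    with Dataset("wrfinput_d01.nc") as wrfinput:
+        for name in ("SMOIS", "TSLB", "SNOW"):
+            if name in wrfinput.variables:
+                data = wrfinput.variables[name][:]
+                print(name, "min =", data.min(), "max =", data.max())
+            else:
+                print(name, "is missing from the file")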
+
+5.2.3 Generating hydrologic routing input files via the WRF-Hydro GIS pre-processing tools
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+A suite of Python-based utilities, the *WRF-Hydro GIS Pre-processing
+Toolkit*, has been developed to facilitate the process of deriving
+WRF-Hydro input and parameter files from commonly available geospatial
+data products, such as hydrologically processed digital elevation
+models. A large number of the hydro input and parameter files described
+in :ref:`Table 5.1 <table-5.1>` can be generated by these tools as well
+as a geospatial metadata file to support georeferencing of WRF-Hydro model
+output files and relevant shapefiles to aid in visualizing the model
+components. The WRF-Hydro GIS pre-processing tools are developed to
+function as an additional ArcToolbox within the Esri ArcGIS software.
+Specific operating system and software requirements are addressed in the
+full *WRF-Hydro GIS Pre-processing Toolkit* documentation.
+
+The minimum input data requirements for the pre-processing tools are the
+geogrid file (:file:`geo_em.d0{x}.nc`) and a hydrologically conditioned digital
+elevation model covering the full extent of the domain of interest. From
+these datasets the terrain routing file (:file:`Fulldom_hires.nc`) and channel
+routing files (see Appendix :ref:`A9 <section-a9>`) can be created. A text file
+with stream gage locations can also be supplied, allowing the user to
+demarcate these locations in the model input files and optionally produce
+time series outputs for only these locations (:file:`frxst_pts_out.txt` or
+:file:`*CHANOBS_DOMAIN{x}`).
+
+This text file denoting the location of stream gages or forecast points
+can also be used to generate groundwater input files. Effectively
+groundwater basins are delineated above each of these locations and
+default parameters will be assigned to a parameter file that can also be
+generated using this tool.
+
+Lake and reservoir component input files also require a supplementary
+input file. A shapefile containing polygons defining the extent of each
+lake must be provided as input. From this file and the processed digital
+elevation model a number of parameters are derived for each lake
+(however, note that other parameters are only assigned a global default
+value). More details about this process and the contents of the input
+and parameter files can be found in Appendix :ref:`A12 <section-a12>`
+and the full *WRF-Hydro GIS Pre-processing Toolkit* documentation.
+
+The *WRF-Hydro GIS Preprocessing Toolkit* will also produce a geospatial
+metadata file for the land surface model grid (as defined by the geogrid
+file), :file:`GEOGRID_LDASOUT_Spatial_Metadata.nc`. This file contains
+projection and coordinate information for the land surface model grid.
+While this file is an optional input to WRF-Hydro, in combination with
+the file output routines introduced in version 5.0 of WRF-Hydro it
+enables the creation of CF (Climate and Forecast metadata convention)
+compliant output files. This allows for files to be more easily viewed
+in GIS systems (e.g. ArcGIS and QGIS) as well as other visualization
+software. Additional documentation for this toolkit including step by
+step instructions and detailed requirements is provided on the WRF-Hydro
+website.
+
+Requirements for the hydro components of the model (i.e. those not
+directly associated with the land surface model or data assimilation)
+are described in the model physics :ref:`Section 3 <section-3>` and in
+:ref:`Table 5.1 <table-5.1>`.
+
+.. table:: **Table 5.1** Input and parameter files for hydro components of
+ WRF-Hydro.
+ :width: 90%
+ :align: center
+ :name: table-5.1
+
+ +---------------------------+---------------------+----------------------------------+--------------+
+ | **Filename** | **Description** | **Source** | **Required** |
+ +===========================+=====================+==================================+==============+
+ | :file:`Fulldom_hires.nc` | High resolution | WRF-Hydro GIS | Yes |
+ | | full domain file. | pre-processing | |
+ | | Includes all fields | toolkit | |
+ | | specified on the | | |
+ | | routing grid. | | |
+ +---------------------------+---------------------+----------------------------------+--------------+
+ | :file:`Route_Link.nc` | Channel reach | WRF-Hydro GIS | When reach |
+ | | parameters (ComID, | pre-processing | based |
+ | | gage ID, bottom | toolkit | routing is |
+ | | width, slope, | | used |
+ | | roughness, order, | | (including |
+ | | etc.) | | user defined |
+ | | | | mapping) |
+ +---------------------------+---------------------+----------------------------------+--------------+
+ | :file:`GWBASINS.nc` | 2D file defining | WRF-Hydro GIS | When the |
+ | | the locations of | pre-processing | baseflow |
+ | | groundwater basins | toolkit | bucket model |
+ | | on a grid | | is turned on |
+ | | | | and user |
+ | | | | defined |
+ | | | | mapping is |
+ | | | | off |
+ +---------------------------+---------------------+----------------------------------+--------------+
+ | :file:`GWBUCKPARM.nc` | Groundwater | WRF-Hydro GIS | When the |
+ | | parameter table | pre-processing | baseflow |
+ | | containing bucket | toolkit | bucket model |
+ | | model parameters | | is turned on |
+ | | for each basin | | |
+ +---------------------------+---------------------+----------------------------------+--------------+
+ | :file:`LAKEPARM.nc` | Lake parameter | WRF-Hydro GIS | When lake |
+ | | table containing | pre-processing | and |
+ | | lake model | toolkit | reservoir |
+ | | parameters for each | | routing is |
+ | | catchment | | turned on |
+ +---------------------------+---------------------+----------------------------------+--------------+
+ | :file:`hydro2dtbl.nc` | Spatially | :file:`create_SoilProperties.R` | When using |
+ | | distributed netCDF | script | spatially |
+ | | version of | | distributed |
+ | | :file:`HYDRO.TBL` | (will also be automatically | terrain |
+ | | | generated by WRF-Hydro) | routing |
+ | | | | parameters |
+ +---------------------------+---------------------+----------------------------------+--------------+
+ | :file:`HYDRO.TBL` | Parameter table for | :file:`template/HYDRO` | Yes |
+ | | lateral flow | directory in the | |
+ | | routing within | model code | |
+ | | WRF-Hydro. In the | | |
+ | | :file:`HYDRO.TBL` | | |
+ | | file parameters are | | |
+ | | specified by land | | |
+ | | cover type or soil | | |
+ | | category | | |
+ +---------------------------+---------------------+----------------------------------+--------------+
+ | :file:`HYDRO_MODIS.TBL` | Version of | :file:`template/HYDRO` | Replacement |
+ | | :file:`HYDRO.TBL` | directory in the | for :file:\ |
+ | | using MODIS land | model code | `HYDRO.TBL` |
+ | | use categories | | when using |
+ | | rather than USGS. | | MODIS land |
+ | | (Change name to | | use |
+ | | :file:`HYDRO.TBL` | | categories |
+ | | for use.) | | |
+ +---------------------------+---------------------+----------------------------------+--------------+
+ | :file:`CHANPARM.TBL` | Parameters for | :file:`template/HYDRO` | When gridded |
+ | | gridded channel | directory in the | routing is |
+ | | routing scheme. | model code | used |
+ | | Parameters are | | |
+ | | specified by | | |
+ | | Strahler stream | | |
+ | | order | | |
+ +---------------------------+---------------------+----------------------------------+--------------+
+ | :file:`spatialweights.nc` | Spatial weight file | distributed with NWM domain | When using |
+ | | used to map fluxes | files | user defined |
+ | | to catchment | | mapping |
+ | | objects | | |
+ +---------------------------+---------------------+----------------------------------+--------------+
+
+.. _section-5.3:
+
+5.3 Description of land surface model and lateral routing parameter files
+-------------------------------------------------------------------------
+
+Parameters for the Noah and Noah-MP land surface models as well as for
+the lateral routing component are specified via a collection of text
+files (i.e. parameter tables) denoted by the file suffix :file:`.TBL`.
+Default parameter tables for the Noah and Noah-MP models are included in
+the WRF-Hydro source code within the directory structure for their
+respective land model and the appropriate files are automatically moved
+to the Run directory upon building the model.
+
+The Noah land surface model requires three parameter table files, outlined
+in :ref:`Table 5.2 `. The first of these is the general parameter
+table or :file:`GENPARM.TBL`. This file contains a number of global parameters
+for the Noah land surface model. The next is the vegetation parameter table or
+:file:`VEGPARM.TBL`. This file contains parameters that are a function
+of land cover type. The final table is the soil parameter table or
+:file:`SOILPARM.TBL`. This parameter table contains parameters that are
+assigned based upon the soil classification. The variables contained
+within these files are described in Appendix :ref:`A6 <section-a6>`.
+
+The Noah-MP land surface model requires three parameter table files, outlined
+in :ref:`Table 5.3 `. The first of these is the general parameter
+table or :file:`GENPARM.TBL`. This file contains a number of global parameters
+for the Noah-MP land surface model. The next is the soil parameter table or
+:file:`SOILPARM.TBL`. This parameter table contains parameters that are
+assigned based upon the soil classification. The final table is the
+:file:`MPTABLE.TBL`. This file contains parameters that are a function
+of land cover type. The variables contained within these files are
+described in Appendix :ref:`A7 <section-a7>`.
+
+As part of work conducted for the National Water Model implementation,
+the ability to specify a number of these land surface model parameters
+spatially on a two or three dimensional grid was introduced. This is
+done through the use of the compile time option ``SPATIAL_SOIL`` and the
+specification of a netCDF format parameter file with the default
+filename :file:`soil_properties.nc`. A list of the variables contained in
+this file is included in Appendix :ref:`A7 <section-a7>`.
+
+.. table:: **Table 5.2** Parameter tables for the Noah land surface model. These
+ parameter tables can be found within the land surface model source code
+   :file:`Run/` directory and will be copied over to the WRF-Hydro Run
+ directory when the compile script for this LSM is run.
+ :align: center
+ :width: 90%
+ :name: table-5.2
+
+ .. default-role:: file
+
+ +---------------+-------------------------------------------+----------------+
+ | **Filename** | **Description** | **Required** |
+ +===============+===========================================+================+
+ | GENPARM.TBL | Miscellaneous model parameters that are | Yes |
+ | | applied globally | |
+ +---------------+-------------------------------------------+----------------+
+ | VEGPARM.TBL | Vegetation parameters indexed by land use | Yes |
+ | | / land cover categories | |
+ +---------------+-------------------------------------------+----------------+
+ | SOILPARM.TBL | Soil parameters indexed by soil texture | Yes |
+ | | classes | |
+ +---------------+-------------------------------------------+----------------+
+
+ .. default-role::
+
+.. table:: **Table 5.3** Parameter tables for the Noah-MP land surface model. These
+ parameter tables can be found within the land surface model source code
+ Run directory and will be copied over to the WRF-Hydro Run directory
+ when the compile script for this LSM is run.
+ :name: table-5.3
+ :align: center
+ :width: 90%
+
+ .. default-role:: file
+
+ +----------------------+---------------------------------------------+----------------+
+ | **Filename** | **Description** | **Required** |
+ +======================+=============================================+================+
+ | `GENPARM.TBL` | Miscellaneous model parameters that are | Yes |
+ | | applied globally | |
+ +----------------------+---------------------------------------------+----------------+
+ | `MPTABLE.TBL` | Vegetation parameters indexed by land use / | Yes |
+ | | land cover categories | |
+ +----------------------+---------------------------------------------+----------------+
+ | `SOILPARM.TBL` | Soil parameters indexed by soil texture | Yes |
+ | | classes | |
+ +----------------------+---------------------------------------------+----------------+
+ | `soil_properties.nc` | NetCDF file with spatially distributed land | No |
+ | | surface model parameters used when | |
+ | | WRF-Hydro is compiled with SPATIAL_SOIL=1. | |
+ | | This allows the user to specify parameters | |
+ | | on the model grid rather than as a single | |
+ | | value or function of soil or land cover | |
+ | | type. | |
+ | | | |
+ | | This file is created by the | |
+ | | :file:`create_SoilProperties.R` script | |
+ +----------------------+---------------------------------------------+----------------+
+
+   .. default-role::
+
+
+Parameters for the lateral routing component of WRF-Hydro are specified
+in a similar way via the :file:`HYDRO.TBL` file or the :file:`hydro2dtbl.nc`
+file. The :file:`HYDRO.TBL` file is distributed with the WRF-Hydro source code
+in the :file:`template/HYDRO` directory and is copied over to the :file:`Run` directory
+upon building the model. There is also an additional :file:`HYDRO_MODIS.TBL`
+file for those using the MODIS land cover classification scheme.
+
+The :file:`HYDRO.TBL` parameter table file contains two parts. The first part
+contains the Manning's roughness coefficients for overland flow as a
+function of the USGS vegetation types used in the Noah land surface
+model. The roughness values are strictly indexed to the
+USGS vegetation classes so that if one wanted to use a different
+vegetation index dataset (e.g. the MODIS/IGBP option in the Noah land
+surface model) a user would need to remap these roughness values to
+those new vegetation indices. Users can alter the values of overland
+flow roughness here for a given vegetation type. However, users may also
+'scale' these initial values of roughness by changing the gridded values
+of the overland flow roughness scaling factor (``OVROUGHRTFAC``) that are
+contained within the high resolution routing data netCDF file. Because
+hydrological models are often calibrated over a particular region or
+watershed as opposed to a specific vegetation type it is recommended
+that users modify the ``OVROUGHRTFAC`` scaling factor as opposed to altering
+the roughness values in :file:`HYDRO.TBL`.
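+
+For example, a uniform 20% increase in effective overland flow roughness could
+be applied by scaling the ``OVROUGHRTFAC`` grid in :file:`Fulldom_hires.nc` (a
+sketch; assumes the netCDF4-python library and that a backup copy of the file
+has been made first)::
+
+    from netCDF4 import Dataset
+
+    # Open the high resolution routing file in place for editing
+    with Dataset("Fulldom_hires.nc", "r+") as fulldom:
+        ovrough = fulldom.variables["OVROUGHRTFAC"]
+        # Multiply the existing scaling factor everywhere by 1.2; a calibration
+        # workflow would typically restrict this to a basin or land cover mask
+        ovrough[:] = ovrough[:] * 1.2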
+
+The second part of the :file:`HYDRO.TBL` parameter table contains several soil
+hydraulic parameters that are classified as functions of soil type.
+These soil parameters are copied from the :file:`SOILPARM.TBL` parameter table
+from the Noah land surface model. They are provided in :file:`HYDRO.TBL` to
+allow the user to modify those parameters as needed during model
+calibration activities without modifying the :file:`SOILPARM.TBL` file; this
+duplication is purely for convenience. In effect, when routing options in
+WRF-Hydro are activated the code will read the soil hydraulic parameters
+from :file:`HYDRO.TBL`. If the Noah land surface model is run within WRF-Hydro
+without any of the routing options active, the code will simply use the
+parameter values specified in :file:`SOILPARM.TBL`.
+
+The :file:`hydro2dtbl.nc` file is a spatially distributed netCDF version of the
+:file:`HYDRO.TBL` parameter table. This netCDF file can be created via the
+:file:`create_SoilProperties.R` script distributed on the WRF-Hydro website
+(https://ral.ucar.edu/projects/wrf_hydro) or will automatically be
+generated by the model from the :file:`HYDRO.TBL` if the filename specified in
+the :file:`hydro.namelist` does not already exist. See Appendix
+:ref:`A8 <section-a8>` for further explanation of the variables in the
+:file:`HYDRO.TBL` and :file:`hydro2dtbl.nc` files.
+
+5.3.1 Spatially distributed parameter files
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+As of version 5.0 of WRF-Hydro we now allow for the specification of a
+number of spatially distributed land surface model and / or lateral
+routing parameters in netCDF input files :file:`soil_properties.nc` and
+:file:`hydro2dtbl.nc`. This option was implemented as part of work conducted
+for the National Water Model and allows the user to specify parameters on
+the model grid rather than as a single value or function of soil or land
+cover type. The files can be generated via an R script provided on our
+website (:file:`create_SoilProperties.R`) and are described in more detail in
+Appendices :ref:`A7 <section-a7>` and :ref:`A8 <section-a8>`. In order for the
+model to read in the :file:`soil_properties.nc` file the ``SPATIAL_SOIL`` environment
+variable must be set to 1 at compile time. This option gives users more
+flexibility in the specification of land surface model parameters and is
+particularly useful in the context of calibration and parameter
+regionalization.
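+
+As an illustration of how such a file might be adjusted during calibration (a
+sketch; assumes the netCDF4-python library, and the parameter name used below
+is an example that should be checked against the variables actually present in
+your :file:`soil_properties.nc`)::
+
+    from netCDF4 import Dataset
+
+    with Dataset("soil_properties.nc", "r+") as soilprop:
+        # List the spatially distributed parameters available in the file
+        print(sorted(soilprop.variables.keys()))
+
+        # Example adjustment: scale an infiltration-related parameter on the grid
+        if "refkdt" in soilprop.variables:
+            refkdt = soilprop.variables["refkdt"]
+            refkdt[:] = refkdt[:] * 0.9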
+
+.. _section-5.4:
+
+5.4 Description of channel routing parameter files
+--------------------------------------------------
+
+Channel parameters for WRF-Hydro are specified in one of two files. If
+the model is configured using gridded channel routing these parameters
+will be stored in :file:`CHANPARM.TBL`. If the model is configured using reach
+based routing (including the NWM configuration) the parameters and
+channel geometry are specified within the :file:`Route_Link.nc` file generated
+by the *WRF-Hydro GIS Pre-processing Toolkit*. Variables of the
+:file:`CHANPARM.TBL` and :file:`Route_Link.nc` files are described in Appendix
+:ref:`A9 <section-a9>`.
+
+It is important to keep in mind that there is large uncertainty
+associated with these parameters. Therefore, model calibration is almost
+always warranted. Also, because fully-distributed estimates of flow
+depth (``HLINK``) are not available for model initialization, it is almost
+always necessary to use a small initial value of ``HLINK`` and let the model
+come to its own equilibrium (i.e. “spin-up”) after several hours of
+integration. The necessary time required to spin up the channel network
+is a direct function of how dense and long your channel network is.
+Larger, more dense networks will take substantially longer to spin up.
+Estimates of total travel time from the furthest channel element to the
+basin outlet are a reasonable initial approximation of the time it will
+take to spin up the channel elements.
+
+.. _section-5.5:
+
+5.5 Description of groundwater input and parameter files
+--------------------------------------------------------
+
+Depending upon the choice of channel configuration groundwater input and
+parameter files are specified in slightly different ways. For the
+National Water Model (NWM) implementation of the model where user
+defined mapping is active the :file:`spatialweights.nc` file is used to map
+gridded fluxes to the appropriate catchments, the spatial unit of the
+NWM groundwater bucket model. In other configurations of the model where
+user defined mapping is not used, grid-based groundwater basins are
+defined in a :file:`GWBASINS.nc` netCDF file. The contents of these files are
+described in Appendix :ref:`A10 <section-a10>`.
+
+Groundwater bucket model parameters are assigned via the :file:`GWBUCKPARM.nc`
+file for all configurations. The contents of these files are also
+summarized in Appendix :ref:`A10 <section-a10>` and, like the
+groundwater basins files, these files are produced by the *WRF-Hydro GIS
+Pre-processing Toolkit*. Note that global default parameters are
+prescribed when these files are generated so user adjustments and/or
+calibration are recommended.
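+
+A quick look at, and adjustment of, the default bucket parameters could be done
+as follows (a sketch; assumes the netCDF4-python library, and the parameter
+name ``Zmax`` used below is an example that should be verified against your own
+:file:`GWBUCKPARM.nc`)::
+
+    from netCDF4 import Dataset
+
+    with Dataset("GWBUCKPARM.nc", "r+") as gwbuck:
+        # Print the per-basin parameter arrays present in the file
+        print(sorted(gwbuck.variables.keys()))
+
+        # Example adjustment: raise the maximum bucket height in every basin by 10%
+        if "Zmax" in gwbuck.variables:
+            zmax = gwbuck.variables["Zmax"]
+            zmax[:] = zmax[:] * 1.1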
+
+.. _section-5.6:
+
+5.6 Description of lake and reservoir parameter tables
+------------------------------------------------------
+
+Lake parameter values are specified for each one of the lake objects.
+Typically, baseline parameters are derived within the high-resolution
+terrain preprocessing stages described above using tools such as ArcGIS
+(e.g. ``LkArea``, ``LkMxE``). Values for the weir and orifice coefficients and
+sizes can be drawn from standard engineering hydraulics textbooks (e.g.
+*Chow et al., 1957*) and calibrated based on lake level performance. Weir
+parameters are specified for reservoir “overflow” or “spill” and orifice
+parameters are specified for design operations. The behavior of the
+reservoir to store and release water is highly dependent on these
+parameters and therefore it is highly recommended that the user modify
+this file with their own set of parameters beyond the default given in
+the *WRF-Hydro GIS Pre-processing Toolkit*. See Appendix
+:ref:`A12 <section-a12>` for descriptions of the variables within the
+:file:`LAKEPARM.nc` file.
+
+.. _section-5.7:
+
+5.7 Specification of meteorological forcing data
+------------------------------------------------
+
+Modern land surface hydrology models, including WRF-Hydro, require
+meteorological forcing data to simulate land-atmosphere exchanges and
+terrestrial hydrologic processes when uncoupled to atmospheric modeling
+systems. Most land models require a similar set of input variables with
+some variation in terms of the units, spectral bandwidths of radiation,
+handling of precipitation phase, etc. Most commonly these variables
+include: incoming short and longwave radiation, humidity, temperature,
+pressure, wind speed and precipitation. The required variables for the
+Noah and Noah-MP land surface models supported in version 5.x of
+WRF-Hydro are listed in :ref:`Table 5.4 <table-5.4>`. These variables'
+names, units, and several of the forcing data file format options described
+below are borrowed from the High-Resolution Land Data Assimilation System
+(HRLDAS),
+an offline driver for Noah land surface models. When WRF-Hydro is
+coupled into other modeling architectures such as the NASA Land
+Information System (LIS), these systems will set the requirements for
+the forcing data.
+
+.. table:: **Table 5.4** Input forcing data for the Noah and Noah-MP land surface
+ models
+ :width: 90%
+ :align: center
+ :name: table-5.4
+
+ +-----------------+------------------------------------+---------------+
+ | **Variable | **Description** | **Units** |
+ | name** | | |
+ +=================+====================================+===============+
+ | ``SWDOWN`` | Incoming shortwave radiation | `W/m^2` |
+ +-----------------+------------------------------------+---------------+
+ | ``LWDOWN`` | Incoming longwave radiation | `W/m^2` |
+ +-----------------+------------------------------------+---------------+
+ | ``Q2D`` | Specific humidity | `kg/kg` |
+ +-----------------+------------------------------------+---------------+
+ | ``T2D`` | Air temperature | `K` |
+ +-----------------+------------------------------------+---------------+
+ | ``PSFC`` | Surface pressure | `Pa` |
+ +-----------------+------------------------------------+---------------+
+ | ``U2D`` | Near surface wind in the | `m/s` |
+ | | `u`-component | |
+ +-----------------+------------------------------------+---------------+
+ | ``V2D`` | Near surface wind in the | `m/s` |
+ | | `v`-component | |
+ +-----------------+------------------------------------+---------------+
+ | ``RAINRATE`` | Precipitation rate | `mm/s` *or* |
+   |                 |                                    | `kg/m^2/s`    |
+ +-----------------+------------------------------------+---------------+
+
+Here we simply describe the requirements and options that are available
+in the standalone version of WRF-Hydro. Presently, there are 7 forcing
+data input types in WRF-Hydro. Because it is untenable to support a
+large variety of input file formats and data types within the model,
+WRF-Hydro requires that most processing of forcing data be handled
+external to the model (i.e. as a “pre-process”) and that users put their
+forcing data into one of the required formats. This includes performing
+tasks like gridding of station observations, making sure forcing data
+is gridded to match the domain grid and has the correct variable names
+and units (see :ref:`Table 5.4 <table-5.4>`), reformatting data into the
+prescribed netCDF format, etc. To facilitate these pre-processing activities
+we have developed numerous scripts which can be executed to help in the
+forcing data preparation process. These scripts along with sample data files
+are distributed on the WRF-Hydro website.
+
+The input forcing data type is specified in the land surface model
+namelist file :file:`namelist.hrldas` by modifying the ``FORC_TYP`` namelist
+option.
+
+Model forcing type namelist options are specified as follows:
+
+ 1=HRLDAS-hr format
+
+ 2=HRLDAS-min format
+
+ 3=WRF output
+
+ 4=Idealized
+
+ 5=Idealized with specified precipitation
+
+ 6=HRLDAS-hr format with specified precipitation
+
+ 7=WRF output with specified precipitation
+
+.. rubric:: 1 - HRLDAS hourly format input files:
+
+This option requires meteorological forcing data to be provided in the HRLDAS
+hourly forcing data format. Scripts provided on the WRF-Hydro website will
+generate files in this format. Forcing files in this format can also be found
+in the example cases. In this format, gridded forcing data for all
+meteorological forcing variables with the names and units shown in :ref:`Table
+5.4 <table-5.4>` are included in a single netCDF file for each time step. The
+forcing data grids must match the model grid specified in the :file:`geo_em.d0{x}.nc`
+“geogrid” file. Filenames must conform to the following convention:
+:file:`YYYYMMDDHH.LDASIN_DOMAIN{X}`
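+
+A minimal sketch of writing a single time step in this format is shown below
+(illustrative only; assumes the netCDF4-python and NumPy libraries, example
+grid and dimension names, and forcing fields already regridded to the model
+grid as NumPy arrays)::
+
+    import numpy as np
+    from netCDF4 import Dataset
+
+    ny, nx = 100, 120                    # must match the geogrid dimensions
+    fields = {                           # names and units from Table 5.4
+        "SWDOWN": np.zeros((ny, nx)),
+        "LWDOWN": np.zeros((ny, nx)),
+        "Q2D": np.zeros((ny, nx)),
+        "T2D": np.full((ny, nx), 288.0),
+        "PSFC": np.full((ny, nx), 100000.0),
+        "U2D": np.zeros((ny, nx)),
+        "V2D": np.zeros((ny, nx)),
+        "RAINRATE": np.zeros((ny, nx)),
+    }
+
+    with Dataset("2018090100.LDASIN_DOMAIN1", "w", format="NETCDF4") as ldasin:
+        ldasin.createDimension("Time", 1)
+        ldasin.createDimension("south_north", ny)
+        ldasin.createDimension("west_east", nx)
+        for name, data in fields.items():
+            var = ldasin.createVariable(name, "f4",
+                                        ("Time", "south_north", "west_east"))
+            var[0, :, :] = data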
+
+.. rubric:: 2 - HRLDAS minute format input files:
+
+This option requires meteorological forcing data to be provided in the HRLDAS
+minute forcing data format. Like the HRLDAS hourly format, this standard is
+borrowed from the HRLDAS modeling system. However, this format allows for the
+specification of forcing data at more frequent time intervals (up to
+every minute as specified by the forcing time step in the
+:file:`namelist.hrldas` file). In this format, gridded forcing data for all
+meteorological forcing variables with the names and units shown in :ref:`Table
+5.4 <table-5.4>` are included in a single netCDF file for each time step. The forcing
+data grids must match the model grid specified in the :file:`geo_em.d0{x}.nc`
+file. Filenames must conform to the following convention:
+:file:`YYYYMMDDHHmm.LDASIN_DOMAIN{X}`
+
+.. rubric:: 3 - WRF output files as input to WRF-Hydro:
+
+This option allows for meteorological forcing data to be read directly from a
+WRF model output (“wrfout”) file so long as the WRF model grid is the same as
+that for WRF-Hydro. The WRF-Hydro code will not remap or spatially-subset the
+data in any way. All necessary fields are available in a default WRF output
+file but users should verify their existence if modifications have been made.
+These files must be written with only a single time step per file and retain
+the default filenames. The file naming convention for the wrfout file is
+:file:`wrfout_d0{x}_YYYY-MM-DD_HH:MM:SS`.
+
+.. rubric:: 4 - Idealized forcing:
+
+This option requires no input files. Instead a simple rainfall event is
+prescribed (i.e. “hardcoded”) in the model. This event is a spatially uniform
+25.4 `mm/hr` (1 `in/hr`) for 1 hour duration over the first hour of the model
+simulation. The remainder of the forcing variables are set to have either
+constant values (in space and time) or, in the case of temperature and
+radiation variables, a fixed diurnal cycle (see :ref:`Table 5.5 <table-5.5>`).
+This option is primarily used for simple testing of the model and is convenient
+for checking whether or not components besides the forcing data are properly
+being read into the model and working. Future versions of WRF-Hydro will allow
+the user to specify values for the precipitation event and the other meteorological
+variables. Note that this forcing type requires the user-specified
+``FORCING_TIMESTEP`` namelist parameter to be set to 3600 (1 hr) in the
+:file:`namelist.hrldas` file.
+
+.. table:: **Table 5.5.** Description of idealized forcing
+ :align: center
+ :width: 90%
+ :name: table-5.5
+
+ +--------------+--------------------------+------------------------------+
+ | **Variable | **Prescribed value or | **Timing** |
+ | name** | range of values** | |
+ +==============+==========================+==============================+
+ | ``SWDOWN`` | 0 - 900 [`W/m^2`] | Diurnal cycle |
+ +--------------+--------------------------+------------------------------+
+ | ``LWDOWN`` | 375 - 425 | Diurnal cycle |
+ | | [`W/m^2`] | |
+ +--------------+--------------------------+------------------------------+
+ | ``Q2D`` | 0.01 [`kg/kg`] | Constant |
+ +--------------+--------------------------+------------------------------+
+ | ``T2D`` | 287 - 293 [`K`] | Diurnal cycle |
+ +--------------+--------------------------+------------------------------+
+ | ``PSFC`` | 100,000 [`Pa`] | Constant |
+ +--------------+--------------------------+------------------------------+
+ | ``U2D`` | 1.0 [`m/s`] | Constant |
+ +--------------+--------------------------+------------------------------+
+ | ``V2D`` | 1.0 [`m/s`] | Constant |
+ +--------------+--------------------------+------------------------------+
+   | ``RAINRATE`` | 25.4 [`mm/hr`]           | For first hourly time step   |
+   |              |                          | and zero thereafter          |
+ +--------------+--------------------------+------------------------------+
+
+.. rubric:: 5 - Idealized forcing with specified precipitation:
+
+This option is identical to forcing type 4 with the exception that the
+WRF-Hydro system will look for user provided supplementary precipitation
+files. These supplementary precipitation files are netCDF files containing a
+single gridded field with either the name ``precip`` and units of `mm` or
+``precip_rate`` with units of `mm/s`. When using this forcing type,
+the WRF-Hydro system will look for a new precipitation input file based
+on the user-specified ``FORCING_TIMESTEP`` namelist option set in the
+namelist.hrldas file. Scripts provided on the WRF-Hydro website will
+generate files in this format (specifically the MRMS regridding
+scripts). Forcing files in this format can also be found in the example
+test cases. Filenames for supplemental precipitation files must conform
+to this convention: :file:`{YYYYMMDDHHMM}.PRECIP_FORCING.nc`.
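+
+A minimal example of writing one such supplemental precipitation file is
+sketched below (assumes the netCDF4-python and NumPy libraries and example grid
+dimensions; in practice the precipitation array would come from a regridded QPE
+product)::
+
+    import numpy as np
+    from netCDF4 import Dataset
+
+    ny, nx = 100, 120                         # must match the model grid
+    precip_rate = np.full((ny, nx), 1.0e-4)   # precipitation rate in mm/s
+
+    with Dataset("201809010000.PRECIP_FORCING.nc", "w", format="NETCDF4") as pcp:
+        pcp.createDimension("Time", 1)
+        pcp.createDimension("south_north", ny)
+        pcp.createDimension("west_east", nx)
+        var = pcp.createVariable("precip_rate", "f4",
+                                 ("Time", "south_north", "west_east"))
+        var.units = "mm/s"
+        var[0, :, :] = precip_rate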
+
+.. rubric:: 6 - HRLDAS hourly format input files with specified
+ precipitation:
+
+This option is identical to forcing type 1 with the exception that the
+WRF-Hydro system will also look for user provided supplementary precipitation
+files. These supplementary precipitation files are netCDF files containing a
+single gridded field with either the name ``precip`` and units of `mm` or
+``precip_rate`` with units of `mm/s`. When using this forcing type, the
+WRF-Hydro system will look for a new precipitation input file based on the
+user-specified ``FORCING_TIMESTEP`` namelist option set in the
+:file:`namelist.hrldas` file. Scripts provided on the WRF-Hydro website will
+generate files in this format (specifically the MRMS regridding scripts).
+Forcing files in this format can also be found in the example test cases.
+Filenames for supplemental precipitation files must conform to this
+convention: :file:`{YYYYMMDDHHMM}.PRECIP_FORCING.nc`.
+
+This option is useful when combining atmospheric analyses from
+reanalysis products or other models with a separate analysis of
+precipitation (e.g. a gridded gauge product, radar QPE, nowcasts,
+satellite QPE, etc). The model reads in the meteorological forcing data
+fields on each hour and then holds those values constant for the entire
+hour. Precipitation can be read in more frequently based on the
+user-specified ``FORCING_TIMESTEP`` namelist parameter in the
+namelist.hrldas file. For example, the user can have 'hourly'
+meteorology with '5-minute' precipitation analyses.
+
+.. rubric:: 7 - WRF output files as input to WRF-Hydro with specified
+ precipitation:
+
+This option is identical to forcing type 3 with the exception that the
+WRF-Hydro system will also look for user provided supplementary precipitation
+files. These supplementary precipitation files are netCDF files containing a
+single gridded field with either the name ``precip`` and units of `mm` or
+``precip_rate`` with units of `mm/s`. When using this forcing type, the
+WRF-Hydro system will look for a new precipitation input file based on the
+user-specified ``FORCING_TIMESTEP`` namelist option set in the
+:file:`namelist.hrldas` file. Scripts provided on the WRF-Hydro website will
+generate files in this format (specifically the MRMS regridding scripts).
+Forcing files in this format can also be found in the example test cases.
+Filenames for supplemental precipitation files must conform to this
+convention: :file:`{YYYYMMDDHHMM}.PRECIP_FORCING.nc`.
+
+This option is useful when combining forcing data from WRF with a
+separate analysis of precipitation (e.g. a gridded gauge product, radar
+QPE, nowcasts, satellite QPE, etc). The model reads in the
+meteorological forcing data fields from the WRF output file and then
+holds those values constant until the next file is available.
+Precipitation can be read in more frequently based on the user-specified
+``FORCING_TIMESTEP`` namelist parameter in the namelist.hrldas file. For
+example, the user can have 'hourly' meteorology with '5-minute'
+precipitation analyses.
+
diff --git a/docs/userguide/model-outputs.rest b/docs/userguide/model-outputs.rest
new file mode 100644
index 000000000..3e2070539
--- /dev/null
+++ b/docs/userguide/model-outputs.rest
@@ -0,0 +1,403 @@
+.. vim: syntax=rst
+.. include:: meta.rest
+
+.. _section-6.0:
+
+6. Description of Output Files from WRF-Hydro
+=============================================
+
+This chapter describes the output files from Version 5.x of WRF-Hydro.
+
+The user has several options to allow flexibility when outputting from
+the WRF-Hydro modeling system. All of the options to control outputs are
+located in the :file:`hydro.namelist` file that the user edits prior to running a
+simulation. Prior to turning specific file options on, there are a few
+high-level namelist options (flags) that help control the number of
+variables each file will contain, along with the level of compression
+applied to the files.
+
+ .. rubric:: ``io_form_outputs``:
+
+ This flag directs the output to utilize optional internal netCDF compression
+ and the use of optional scale_factor/add_offset attributes to pack variables
+ from floating point to integer. However, the user also has the flexibility
+ to turn these optional features off. For additional information on these
+ “packing” attributes, consult the netCDF documentation for a more in-depth
+ explanation
+ (http://www.unidata.ucar.edu/software/netcdf/docs/index.html). It should
+ be noted that the use of internal compression adds time to output files
+ being produced. This may become costly for large-scale modeling
+ applications. Tests have indicated a cost of 15-25% additional time
+ spent producing output variables when internal netCDF compression is
+ utilized, depending on the number of output files being produced.
+ However, without the use of compression, it is possible file sizes could
+ become large depending on the application. It is also important to note
+ that a value of ``0`` will result in the code deferring to old output
+ routines used in version 3.0 of WRF-Hydro. For these outputs, the user
+ is encouraged to read the documentation for that version of the code.
+ The following values for the ``io_form_outputs`` option are available:
+
+ ``0`` - Defer to old output routines for version 3.0 of WRF-Hydro
+ (NOTE: this is the ONLY option that is supported when running with
+ the Noah LSM)
+
+ ``1`` - Utilize internal netCDF compression in conjunction with
+ scale_factor/add_offset byte packing
+
+ ``2`` - Utilize scale_factor/add_offset byte packing without internal
+ netCDF compression
+
+ ``3`` - Utilize internal netCDF compression without
+ scale_factor/add_offset byte packing.
+
+ ``4`` - No internal netCDF compression and no scale_factor/add_offset
+ byte packing.
+
+ .. rubric:: ``io_config_outputs``:
+
+ This flag offers different sets of output variables for each file. This
+ offers the user some flexibility to the number of output variables being
+ produced. *NOTE*: This flag has no effect when ``io_form_outputs = 0``.
+
+ .. rubric:: ``t0OutputFlag``:
+
+ This flag controls if output files are produced on the initial timestep
+ of the model simulation. It is important to note that some variables are
+ initialized to missing values and may translate to missing values in the
+ output files for the initial time step. However, these files may offer
+ useful information to the user for diagnosing purposes.
+
+ .. rubric:: ``output_channelBucket_influx``:
+
+ This flag controls the creation of output variables specific to running
+ a channel-only configuration of the model. These variables provide useful
+ information on flow coming into channel links located in the simulation
+ domain, which can be used for diagnosing purposes. *Note*: this value must
+ be zero for running a gridded channel routing configuration of the model.
+
+An overview of available model output files is shown in :ref:`Figure 6.1 <figure-6.1>`.
+For a detailed table of each variable contained within each output file, see
+the *WRF-Hydro Output Variable Matrix V5* located on our website
+(https://ral.ucar.edu/projects/wrf_hydro).
+There is no optimal combination of namelist options to use
+for outputs. Flexibility was given to the user as end applications will
+vary from one user to another. While a combination of many output
+variables with compression may work for a one-time model simulation,
+having fewer variables with less time spent on compression may be more
+suitable for a user that is operations driven. Future code upgrades will
+allow further flexibility on the exact variables to output for each
+file.
+
+.. figure:: media/wrfhydro-outputs.png
+ :name: figure-6.1
+ :figwidth: 90%
+ :width: 90%
+ :align: center
+
+ **Figure 6.1** WRF-Hydro output files organized by model physics
+ component. See the Key for files specific to a certain channel
+ configuration.
+
+Please note a proper land spatial metadata file is highly encouraged
+when producing land surface output from the simulations. This file is
+specified by the ``LAND_SPATIAL_META_FLNM`` option in the hydro.namelist
+file. This file contains several geospatial variables and attributes
+which are translated to output files that meet CF compliance
+(http://cfconventions.org/). This file can be created using the
+*WRF-Hydro GIS Pre-processing Toolkit* associated with this release. For
+gridded output files, coordinate variable data and attributes are used
+from the spatial metadata file for the output variables. Additionally,
+geospatial attributes, which can help the user display data in GIS
+applications are located within the metadata file. These attributes
+translate to the output files during the output creation process. For
+the 2D high resolution routing output files (``RT_DOMAIN``, ``CHRTOUT_GRID``),
+geospatial attributes and coordinate variables are translated from the
+:file:`Fulldom_hires.nc` file if they are detected. For point output files
+(``CHRTOUT_DOMAIN``, ``CHANOBS_DOMAIN``, ``LAKEOUT_DOMAIN``), the geospatial
+attributes and coordinate variables have been hard-coded to be latitude
+and longitude for this version of the code.
+
+Each output file will potentially contain some set of attributes and
+variables that contain temporal and geospatial information useful to the
+user. Again, it is worth noting that the lack of a land spatial metadata
+file, or proper attributes in the :file:`Fulldom_hires.nc` file will result
+in a less comprehensive output file in terms of metadata. Each output file
+will contain a time dimension and variable that specifies the number of
+timesteps located in the output file, along with a numeric value for
+each timestep in the form of minutes since EPOCH. A ``reference_time``
+dimension (usually 1 in dimension size) and variable exist. This
+variable will contain the model initialization in minutes since EPOCH.
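+
+For example, the time values can be converted back to calendar dates with the
+netCDF4-python library (a sketch; the file name is an example and assumes the
+``time`` variable carries a CF-style units attribute)::
+
+    from netCDF4 import Dataset, num2date
+
+    with Dataset("201809010000.CHRTOUT_DOMAIN1") as chrtout:
+        times = chrtout.variables["time"]
+        # num2date interprets the "minutes since ..." units attribute
+        print(num2date(times[:], times.units))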
+
+Gridded output files will contain an `x` and `y` coordinate dimension and
+variable that will contain the center-point coordinate values for either
+the routing grid, or land surface grid in the model projected space. For
+example, on a Lambert Conformal modeling domain, these values would be
+in meters. Gridded output files will also contain a ``CRS`` variable,
+which contains useful geospatial metadata attributes about the modeling
+domain. Output files for points, river channel links, or lakes will
+contain latitude, longitude, and elevation variables to offer metadata
+about each location in the output file.
+
+Additionally, output files at points will contain a feature_id variable
+that will list either a global ID value associated with that point, or a
+predefined ID value extracted from an input file. For example, with 2D
+gridded channel routing, each channel pixel cell has an ID value that
+ranges from `1-n` where `n` is the global number of channel pixel cells.
+However, with reach-based routing, each channel reach may have a
+predefined link ID value specified via the :file:`Route_Link.nc` file. All files
+contain ``model_initialization_time`` and ``model_output_valid_time`` character
+attributes to offer additional time information about the output file.
+For files that were produced with ``io_form_outputs`` options of ``1`` or ``2``,
+standard netCDF variable attributes ``scale_factor`` and ``add_offset`` are
+present to help users and netCDF APIs unpack integer data back to
+floating point for visualization and analysis. For a more in-depth
+description of netCDF CF compliant output, please visit
+http://cfconventions.org.
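+
+For instance, the netCDF4-python library applies ``scale_factor`` and
+``add_offset`` automatically when reading, so packed files can be used like any
+other output (a sketch; the file and variable names are examples)::
+
+    from netCDF4 import Dataset
+
+    with Dataset("201809010000.CHRTOUT_DOMAIN1") as chrtout:
+        streamflow = chrtout.variables["streamflow"]
+        # Automatic unpacking to floating point (the default behavior)
+        print(streamflow[:5])
+
+        # Disable automatic scaling to see the raw packed integers instead
+        streamflow.set_auto_scale(False)
+        print(streamflow[:5])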
+
+Two output files that do not necessarily follow the above mentioned
+format will be the groundwater output (``GWOUT_DOMAIN``) file and
+:file:`frxst_pts_out.txt` text file. Groundwater outputs are representative of a
+spatial region, as opposed to points or fixed pixel cells. Future code
+upgrades will attempt to incorporate additional spatial information
+about groundwater buckets. The :file:`frxst_pts_out.txt` text file is a simple
+ASCII text file, not netCDF.
+
+The following output files are available to the user, depending on their
+run configuration:
+
+ 1. Land surface model output
+
+ 2. Land surface diagnostic output
+
+ 3. Streamflow output at all channel reaches/cells
+
+ 4. Streamflow output at forecast points or gage reaches/cells
+
+ 5. Streamflow on the 2D high resolution routing grid (gridded channel
+ routing only)
+
+ 6. Terrain routing variables on the 2D high resolution routing grid
+
+ 7. Lake output variables
+
+ 8. Ground water output variables
+
+ 9. A text file of streamflow output at either forecast points or gage
+ locations (:file:`frxst_pts_out.txt`)
+
+The output files will be described below.
+
+File naming convention of output files: ``YYYY`` = year, ``MM`` = month,
+``DD`` = day, ``HH`` = hour, ``MM`` = minutes, ``DOMAINX`` = the domain
+number that is specified in the hydro.namelist input file (also matches
+the domain number of the geogrid input file)
+
+.. rubric:: 1. Land surface model output
+
+\
+ :file:`{YYYYMMDDHHMM}.LDASOUT_DOMAIN{X}`
+
+ For this output file, land surface model variables are written to a
+ multi-dimensional netCDF file. Output is produced on the land surface
+ grid, most variables coming directly from the land surface model. The `x`
+ and `y` dimensions of the output file match those of the geogrid input
+ file and the land spatial metadata file. The ``soil_layers_stag`` and
+ ``snow_layers`` dimensions specify the number of soil and snow layers
+ being produced by the land surface model. The names and definitions for
+ each output variable in the LSM output file are generally consistent
+ with those output from standard Noah or Noah-MP LSM coupled to WRF. The
+ output frequency of this file is dictated by ``OUTPUT_TIMESTEP`` specified in
+ :file:`namelist.hrldas`.
+
+.. rubric:: 2. Land surface diagnostic output
+
+\
+ :file:`{YYYYMMDDHHMM}.LSMOUT_DOMAIN{X}`
+
+ Variables for this output file will not change with varying values of
+ ``io_config_outputs`` as there is a limited set of land surface states
+ produced for this output file. In general, the user will not desire this
+ output file as the regular land surface output files contain a larger
+ amount of land surface output. However, for examining model state and
+ flux passing between the LSM and the routing routines, this file could
+ contain potentially valuable information that would assist in those
+ efforts. Some of these states include soil moisture, soil temperature,
+ infiltration excess, and surface head. Like the land surface output
+ files, output variables in this output file will match the land surface
+ grid. The output frequency of this file is dictated by ``OUTPUT_TIMESTEP``
+ specified in :file:`namelist.hrldas`.
+
+.. rubric:: 3. Streamflow output at all channel reaches/cells
+
+\
+ :file:`{YYYYMMDDHHMM}.CHRTOUT_DOMAIN{X}`
+
+ The ``CHRTOUT_DOMAIN`` option in the :file:`hydro.namelist` is used to
+ activate this output. This output file will produce a set of streamflow
+ (and related) variables for each channel location in the modeling domain.
+ For 2D gridded routing on the channel network, this is every pixel cell
+ on the high-resolution modeling domain classified as a channel pixel cell.
+ For reach-based routing, this is every channel reach defined in the
+ :file:`Route_Link.nc` file. If the user desires to limit the number of
+ streamflow points, the ``order_to_write`` option in :file:`hydro.namelist`
+ will reduce the number of points based on the Strahler order number.
+ Otherwise, all points will be outputted to the file. Each file will
+ contain a ``latitude``, ``longitude``, ``elevation``, and ``order`` variable
+ to describe basic information on each channel point. The ``CRS`` projection
+ variable has been hard-coded (as it is with all other point output
+ files) as the coordinate variables for point files are in
+ latitude/longitude.
+
+.. rubric:: 4. Streamflow output at forecast points or gage reaches/cells
+
+\
+ :file:`{YYYYMMDDHHMM}.CHANOBS_DOMAIN{X}`
+
+ The ``CHANOBS_DOMAIN`` option in the hydro.namelist is used to activate this
+ output. This output file is very similar to the regular streamflow
+ output file format. The key difference is output only occurs at
+ predefined forecast points or gage locations. For 2D gridded channel
+ routing, the user defines forecast points during the setup of their
+ modeling domain. Under this configuration, streamflow will be produced
+ at those points. It is worth noting output points can be constrained by
+ the ``order_to_write`` as they are in the regular streamflow output files.
+ For reach-based routing, it is possible to create outputs at a set of
+ predefined gage points in the :file:`Route_Link.nc` file. Within the
+ :file:`Route_Link.nc` file, a variable called ``gages`` of type character will
+ need to be created by the user containing a string for each channel
+ reach that contains a gage. This variable is of length ``feature_id`` (see
+ description of the Route_Link.nc file in Appendix :ref:`A9 <section-a9>`),
+ and size 15. If a channel reach does not contain a gage, the string
+ stays empty. For example, ``" "`` would represent a channel
+ reach with no gage, and ``" 07124000"`` would contain a gage labeled
+ “07124000”. It is up to the user to create this variable and populate it with
+ character strings if there is a desire to connect gage locations to channel
+ reaches. If no locations are found, the output code will simply bypass
+ creating this output file. Like the other point files, similar
+ geospatial information will be placed into the output files.
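+
+ A sketch of creating and populating such a ``gages`` variable with the
+ netCDF4-python library is shown below (illustrative only; the character-length
+ dimension name and the array index used here are examples)::
+
+     import numpy as np
+     from netCDF4 import Dataset, stringtochar
+
+     with Dataset("Route_Link.nc", "r+") as routelink:
+         # 15-character slot for each gage ID (dimension name is an example)
+         if "IDLength" not in routelink.dimensions:
+             routelink.createDimension("IDLength", 15)
+         if "gages" not in routelink.variables:
+             routelink.createVariable("gages", "S1", ("feature_id", "IDLength"))
+
+         gages = routelink.variables["gages"]
+         # A blank (all spaces) entry means no gage on that reach; gage IDs are
+         # right-justified in the 15-character field
+         gage_id = "07124000".rjust(15)
+         gages[100, :] = stringtochar(np.array([gage_id], dtype="S15"))[0]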
+
+.. rubric:: 5. Streamflow on the 2D high resolution routing grid
+
+\
+ :file:`{YYYYMMDDHHMM}.CHRTOUT_GRID{X}`
+
+ The ``CHRTOUT_GRID`` option in the :file:`hydro.namelist` is used to activate this
+ output. This output file is a 2D file created from streamflow with 2D gridded
+ channel routing. Currently, this file is not available for reach-based
+ routing as channel routing does not occur on the channel grid. Output
+ occurs on the high resolution channel routing grid, which means file
+ sizes may be large depending on the size of your domain. In addition to
+ geospatial metadata and coordinate variables, an ``index`` variable is
+ created on the 2D grid producing a global index value for each channel
+ pixel cell on the grid. The main motivation behind creating this file is
+ for easy spatial visualization of the streamflow occurring across the
+ modeling domain.
+
+.. rubric:: 6. Terrain routing variables on the 2D high resolution routing grid
+
+\
+ :file:`{YYYYMMDDHHMM}.RTOUT_DOMAIN{X}`
+
+ The ``RTOUT_DOMAIN`` option in the :file:`hydro.namelist` is used to activate this
+ output. This output file is a 2D file created on the high resolution routing
+ grid. The primary variables created for this file are overland and
+ subsurface routing components that may be of interest to the user. The
+ format is very similar to the 2D streamflow file. Due to the sheer size
+ of these data layers, care should be used in deciding when to output
+ high-resolution terrain data.
+
+.. rubric:: 7. Lake output variables
+
+\
+ :file:`{YYYYMMDDHHMM}.LAKEOUT_DOMAIN{X}`
+
+ The ``outlake`` option in the :file:`hydro.namelist` will activate this output.
+ This file is a point output file containing lake/reservoir inflow,
+ outflow and elevation values for each lake/reservoir object created in
+ the modeling domain. The format follows that of the other point output
+ files in terms of geospatial metadata. If no lake/reservoir objects were
+ created in the modeling domain, no output will be created.
+
+.. rubric:: 8. Ground water output variables
+
+\
+ :file:`{YYYYMMDDHHMM}.GWOUT_DOMAIN{X}`
+
+ The ``output_gw`` option in the :file:`hydro.namelist` will activate this output.
+ When groundwater buckets are activated in the model simulations, it is
+ possible to output bucket inflow/outflow/depth states via netCDF files.
+ One important note to reiterate for these output files is that they will
+ not contain the same geospatial metadata as other point files. Each
+ element in the output array represents a spatial groundwater bucket that
+ covers a region that is neither a single pixel cell nor a point on the
+ modeling domain. For these reasons, this is the only netCDF output file
+ that will not contain full geospatial metadata and coordinate variables.
+
+.. rubric:: 9. :file:`frxst_pts_out.txt`
+
+\
+ The ``frxst_pts_out`` option in the :file:`hydro.namelist` will activate this
+ output. The forecast points text file is a unique output file that distills
+ modeled streamflow and stage down to a simple text file with the
+ following columns:
+
+ - column 1 : time (in seconds) into simulation
+
+ - column 2 : date and time as YYYY-MM-DD_HH:MM:SS
+
+ - column 3 : station number index (same as ``feature_id`` in netCDF files)
+
+ - column 4 : station longitude (in decimal degrees)
+
+ - column 5 : station latitude (in decimal degrees)
+
+ - column 6 : streamflow discharge (in cubic meters per second)
+
+ - column 7 : streamflow discharge (in cubic feet per second)
+
+ - column 8 : flow depth/river stage (in meters above channel bottom)
+
+ *Note*: Column 8 is not active for reach-based routing.
+
+ Each row in the text file is representative of a predefined forecast
+ point (2D gridded channel routing only) or a gage point (reach-based
+ routing). It is worth noting that the number of points will be reduced
+ (as with CHANOBS and CHRTOUT) if the user specifies a higher
+ ``order_to_write`` namelist option.
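+
+ The file can be read with, for example, the pandas library (a sketch; assumes
+ the columns are comma separated as in the sample output distributed with the
+ test cases, and the column names used here are just labels)::
+
+     import pandas as pd
+
+     cols = ["secs", "timestamp", "station", "lon", "lat",
+             "q_cms", "q_cfs", "stage_m"]
+     frxst = pd.read_csv("frxst_pts_out.txt", header=None, names=cols,
+                         skipinitialspace=True)
+     # Peak discharge (in cubic meters per second) at each forecast point
+     print(frxst.groupby("station")["q_cms"].max())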
+
+Once output files are generated, the user should inspect the files using
+the :program:`ncdump` netCDF utility for displaying the contents of a netCDF
+file. With the exception of groundwater output files, the forecast
+points text file, and any files generated using ``io_form_outputs`` of 0,
+the user should see some baseline variables and attributes. A ``crs``
+variable will be present indicating the projection coordinate system for
+the output files. If these variables are missing from the 2D files, it is
+possible the :file:`Fulldom_hires.nc` or land spatial metadata file does not
+contain the necessary ``crs`` variable. The same logic can be applied to
+the ``x`` and ``y`` coordinate variables in the 2D output files. The
+omission of these indicates they were not present in the input files
+prior to running the model. For additional help indicating potential
+issues with the output code, please inspect the standard output from the
+model. Specifically, look for any ":output:`WARNING`" messages that may indicate
+why files have not appeared or metadata is missing. For example,
+":output:`WARNING: Unable to locate the crs variable. No crs variable or \
+attributes will be created.`" would indicate the model was unable to locate
+the ``crs`` variable in one of the input files.
+
+
+.. note::
+ **Additional Notes:**
+
+ - The output descriptions above may not be fully accurate when running
+ with the Noah LSM, which is not actively in development and we
+ therefore support only in a deprecated state. New and improved output
+ routines (e.g., with CF compliance, scale/offset/compression options,
+ augmented metadata) only work with the Noah-MP LSM, while the Noah
+ LSM relies on deprecated output routines. See Appendix
+ :ref:`A2 ` for more details on running with the Noah LSM.
+
+ - For proper QGIS display of the 2D variables, the user will need to
+ rename netCDF output files to include a “.nc” at the end as some
+ versions of QGIS struggle to properly read in information from a
+ netCDF file without this extension. Future upgrades will
+ automatically add this file extension into the filenames.
diff --git a/docs/userguide/model-physics.rest b/docs/userguide/model-physics.rest
new file mode 100644
index 000000000..fe2da1a3c
--- /dev/null
+++ b/docs/userguide/model-physics.rest
@@ -0,0 +1,1468 @@
+.. vim: syntax=rst
+.. include:: meta.rest
+
+.. _section-3:
+
+3. Model Physics Description
+============================
+
+This chapter describes the physics behind each of the modules in Version
+|version_short| of WRF-Hydro and the associated namelist options which are
+specified at “run time”.
+
+3.1 Physics Overview
+--------------------
+
+.. _figure3.1:
+.. figure:: media/wrf-hydro-components.png
+ :align: center
+
+ **Figure 3.1.** Conceptual diagram of WRF-Hydro physics components and
+ relative outputs.
+
+First, the 1-dimensional (1D) column land surface model calculates the
+vertical fluxes of energy (sensible and latent heat, net radiation) and
+moisture (canopy interception, infiltration, infiltration-excess, deep
+percolation) and soil thermal and moisture states. Infiltration excess,
+ponded water depth and soil moisture are subsequently disaggregated from
+the 1D LSM grid, typically of 1-4 km spatial resolution, to a
+high-resolution, typically 30-100 m, routing grid using a time-step
+weighted method *(Gochis and Chen, 2003)* and are passed to the subsurface
+and overland flow terrain-routing modules. In typical U.S. applications,
+land cover classifications for the 1D LSMs are provided by the USGS
+24-type Land Use Land Cover product or MODIS Modified IGBP 20-category
+land cover product (see WRF/WPS documentation); soil classifications are
+provided by the 1-km STATSGO database *(Miller and White, 1998)*; and soil
+hydraulic parameters that are mapped to the STATSGO soil classes are
+specified by the soil analysis of *Cosby et al. (1984)*. Other land cover
+and soil type classification datasets can be used with WRF-Hydro but
+users are responsible for mapping those categories back to the same
+categories as used in the USGS or MODIS land cover and STATSGO soil type
+datasets. The WRF model pre-processing system (WPS) also provides a
+fairly comprehensive database of land surface data that can be used to
+set up the Noah and Noah-MP land surface models. It is possible to use
+other land cover and soils datasets, and more recently, data from the USGS
+National Land Cover Dataset (NLCD) and international soils datasets have
+been integrated into WRF-Hydro.
+
+Then, subsurface lateral flow in WRF-Hydro is calculated prior to the
+routing of overland flow to allow exfiltration from fully saturated grid
+cells to be added to the infiltration excess calculated by the LSM. The
+method used to calculate the lateral flux of the saturated portion of
+the soil column is that of *Wigmosta et al. (1994)* and *Wigmosta and
+Lettenmaier (1999)*, implemented in the Distributed Hydrology Soil
+Vegetation Model (DHSVM). It calculates a quasi-3D flow, which includes
+the effects of topography, saturated soil depth, and depth-varying
+saturated hydraulic conductivity values. Hydraulic gradients are
+approximated as the slope of the water table between adjacent grid cells
+in either the steepest descent or in both `x`- and `y`-directions. The flux
+of water from one cell to its down-gradient neighbor on each time-step
+is approximated as a steady-state solution. The subsurface flux occurs
+on the coarse grid of the LSM while overland flow occurs on the fine
+grid.
+
+Next, WRF-Hydro calculates the water table depth according to the depth
+of the top of the saturated soil layer that is nearest to the surface.
+Typically, a minimum of four soil layers is used in the 2-meter soil
+column in WRF-Hydro, but this is not a strict requirement.
+Additional discretization permits improved resolution of a time-varying
+water table height and users may vary the number and thickness of soil
+layers in the model namelist described in the Appendices :ref:`section-a3`,
+:ref:`section-a4`, and :ref:`section-a5`.
+
+Then, overland flow is defined. The fully unsteady, spatially explicit,
+diffusive wave formulation of *Julien et al. (1995-CASC2D)* with later
+modification by *Ogden (1997)* is the current option for representing
+overland flow, which is calculated when the depth of water on a model
+grid cell exceeds a specified retention depth. The diffusive wave
+equation accounts for backwater effects and allows for flow on adverse
+slopes *(Ogden, 1997)*. As in *Julien et al. (1995)*, the continuity
+equation for an overland flood wave is combined with the diffusive wave
+formulation of the momentum equation. Manning's equation is used as the
+resistance formulation for momentum and requires specification of an
+overland flow roughness parameter. Values of the overland flow roughness
+coefficient used in WRF-Hydro were obtained from *Vieux (2001)* and were
+mapped to the existing land cover classifications provided by the USGS
+24-type land-cover product of *Loveland et al. (1995)* and the MODIS
+20-type land cover product, which are the same land cover classification
+datasets used in the 1D Noah/Noah-MP LSMs.
+
+Additional modules have also been implemented to represent stream
+channel flow processes, lakes and reservoirs, and stream baseflow. In
+WRF-Hydro v\ |version_short| inflow into the stream network and lake and
+reservoir objects is a one-way process. Overland flow reaching grid cells
+identified as 'channel' grid cells passes a portion of the surface water
+in excess of the local ponded water retention depth to the channel
+model. This current formulation implies that stream and lake inflow from
+the land surface is always positive to the stream or lake element. There
+currently are no channel or lake loss functions where water can move
+from channels or lakes back to the landscape. Channel flow in WRF-Hydro
+is represented by one of a few different user-selected methodologies
+described below. Water passing into and through lakes and reservoirs is
+routed using a simple level pool routing scheme. Baseflow to the stream
+network is represented using a conceptual catchment storage-discharge
+bucket model formulation (discussed below) which obtains “drainage” flow
+from the spatially-distributed landscape. Discharge from buckets is
+input directly into the stream using an empirically-derived
+storage-discharge relationship. If overland flow is active, the only
+water flowing into the buckets comes from soil drainage. This is because
+the overland flow scheme will pass water directly to the channel model.
+If overland flow is switched off and channel routing is still active,
+then surface infiltration excess water from the land model is collected
+over the pre-defined catchment and passed into the bucket as well. Each
+of these process options is enabled through the specification of
+options in the model namelist file.
+
+3.2 Land model description: The community Noah and Noah-MP land surface models
+-------------------------------------------------------------------------------
+
+.. note::
+ As of this writing, only the Noah and Noah-MP land surface
+ models are supported within WRF-Hydro. Additional land surface models
+ such as CLM, or land model driver frameworks such as the NASA Land
+ Information System (LIS), have been coupled with WRF-Hydro, but those
+ efforts are in various phases of development and are not yet formally
+ supported as part of the main code repository.
+
+The Noah land surface model is a community, 1-dimensional land surface
+model that simulates soil moisture (both liquid and frozen), soil
+temperature, skin temperature, snowpack depth, snowpack water
+equivalent, canopy water content and the energy flux and water flux
+terms at the Earth's surface *(Mitchell et al., 2002; Ek et al., 2003)*.
+The model has a long heritage, with legacy versions extensively tested
+and validated, most notably within the Project for Intercomparison of
+Land surface Parameterizations (PILPS), the Global Soil Wetness Project
+*(Dirmeyer et al. 1999)*, and the Distributed Model Intercomparison
+Project *(Smith, 2002)*. *Mahrt and Pan (1984)* and *Pan and Mahrt (1987)*
+developed the earliest predecessor to Noah at Oregon State University
+(OSU) during the mid-1980's. The original OSU model calculated sensible
+and latent heat flux using a two-layer soil model and a simplified plant
+canopy model. Recent development and implementation of the current
+version of Noah has been sustained through the community participation
+of various agency modeling groups and the university community (e.g.
+*Chen et al., 2005*). *Ek et al. (2003)* detail the numerous changes that
+have evolved since its inception including a four-layer soil
+representation (with soil layer thicknesses of 0.1, 0.3, 0.6 and 1.0 m),
+modifications to the canopy conductance formulation *(Chen et al., 1996)*,
+bare soil evaporation and vegetation phenology *(Betts et al., 1997)*,
+surface runoff and infiltration *(Schaake et al., 1996)*, thermal
+roughness length treatment in the surface layer exchange coefficients
+*(Chen et al., 1997a)* and frozen soil processes *(Koren et al., 1999)*.
+More recently refinements to the snow-surface energy budget calculation
+*(Ek et al., 2003)* and seasonal variability of the surface emissivity
+*(Tewari et al., 2005)* have been implemented.
+
+The Noah land surface model has been tested extensively in both offline
+(e.g., *Chen et al., 1996, 1997*; *Chen and Mitchell, 1999*; *Wood et al.,
+1998*; *Bowling et al., 2003*) and coupled (e.g. *Chen et al., 1997*, *Chen
+and Dudhia, 2001*, *Yucel et al., 1998*; *Angevine and Mitchell, 2001*; and
+*Marshall et al., 2002*) modes. The most recent version of Noah is
+currently one of the operational LSPs participating in the interagency
+NASA-NCEP real-time Land Data Assimilation System (LDAS, 2003; see *Mitchell
+et al., 2004* for details). Gridded versions of the Noah model are
+currently coupled to real-time weather forecasting models such as the
+National Center for Environmental Prediction (NCEP) North American Model
+(NAM), and the community WRF model. Users are referred to *Ek et al.
+(2003)* and earlier works for more detailed descriptions of the
+1-dimensional land surface model physics of the Noah LSM.
+
+Support for the Noah Land Surface Model within WRF-Hydro is currently
+frozen at Noah version 3.6. Since the Noah LSM is not under active
+development by the community, WRF-Hydro is continuing to support Noah in
+deprecated mode only. Some new model features, such as the improved
+output routines, have not been set up to be backward compatible with Noah.
+Noah users should follow the guidelines in Appendix :ref:`A2 <section-a2>`
+for adapting the WRF-Hydro workflow to work with Noah.
+
+Noah-MP is a land surface model (LSM) using multiple options for key
+land-atmosphere interaction processes (*Niu et al., 2011*). Noah-MP was
+developed to improve upon some of the limitations of the Noah LSM (*Koren
+et al., 1999; Ek et al., 2003*). Specifically, Noah-MP contains a
+separate vegetation canopy defined by a canopy top and bottom, crown
+radius, and leaves with prescribed dimensions, orientation, density, and
+radiometric properties. The canopy employs a two-stream radiation
+transfer approach along with shading effects necessary to achieve proper
+surface energy and water transfer processes including under-canopy snow
+processes (*Dickinson, 1983; Niu and Yang, 2004*). Noah-MP contains a
+multi-layer snow pack with liquid water storage and melt/refreeze
+capability and a snow-interception model describing loading/unloading,
+melt/refreeze capability, and sublimation of canopy-intercepted snow
+(*Yang and Niu 2003; Niu and Yang 2004*). Multiple options are available
+for surface water infiltration and runoff and groundwater transfer and
+storage including water table depth to an unconfined aquifer (*Niu et
+al., 2007*) as well as options for different snow processes such as snow
+albedo.
+
+The Noah-MP land surface model can be executed by prescribing both the
+horizontal and vertical density of vegetation using either ground- or
+satellite-based observations. Another available option is for prognostic
+vegetation growth that combines a Ball-Berry photosynthesis-based
+stomatal resistance (*Ball et al., 1987*) with a dynamic vegetation model
+(*Dickinson et al. 1998*) that allocates carbon to various parts of
+vegetation (leaf, stem, wood and root) and soil carbon pools (fast and
+slow). The model is capable of distinguishing between `C_3` and
+`C_4` photosynthesis pathways and defines vegetation-specific
+parameters for plant photosynthesis and respiration.
+
+In addition to the three-layer snow model in Noah-MP, WRF-Hydro also supports
+optionally running the Crocus snowpack model within Noah-MP for glacial
+representation. For details on using this option, please see
+:ref:`Appendix A16 <section-a16>`.
+
+3.3 Spatial Transformations
+---------------------------
+
+The WRF-Hydro system has the ability to execute a number of physical
+processes (e.g. column physics, routing processes, reservoir
+fluxes) on different spatial frameworks (e.g. regular grids, catchments,
+river channel vectors, reservoir polygons, etc). This means that spatial
+transformations between differing spatial elements have become a critical
+part of the overall modeling process. Starting in v5.0 of WRF-Hydro,
+increased support has been developed to aid in the mapping between
+differing spatial frameworks. Section 3.3.1 describes the spatial
+transformation process which relies on regular, rectilinear grid-to-grid
+mapping using a simplified integer linear multiple
+aggregation/disaggregation scheme. This basic scheme has been utilized
+in WRF-Hydro since its creation, as described in *Gochis and Chen (2003)*.
+
+Section 3.3.2 describes new spatial transformation methods that have been
+developed and are currently supported in v5.0 and beyond and, more
+specifically, in the NOAA National Water Model (NWM). Those user-defined
+transformations rely on the pre-processing development and specification
+of interpolation or mapping weights which must be read into the model.
+As development continues future versions will provide more options and
+flexibility for spatial transformations using similar user-defined methodologies.
+
+.. _section-5:
+
+3.3.1 Subgrid disaggregation-aggregation
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+This section details the implementation of a subgrid
+aggregation/disaggregation scheme in WRF-Hydro. The
+disaggregation-aggregation routines are activated when routing of either
+overland flow or subsurface flow is active and the specified routing
+grid increment is different from that of the land surface model grid.
+Routing in WRF-Hydro is “switch-activated” through the declaration of
+parameter options in the primary model namelist file :file:`hydro.namelist`,
+which are described in Appendix :ref:`section-a5`.
+
+In WRF-Hydro subgrid aggregation/disaggregation is used to represent
+overland and subsurface flow processes on grid scales much finer than
+the native land surface model grid. Hence, only routing is represented
+within a subgrid framework. It is possible to run both the land surface
+model and the routing model components on the same grid. This
+effectively means that the aggregation factor between the grids has a
+value of 1.0. The following section describes the
+aggregation/disaggregation methodology in the context of a “subgrid”
+routing implementation.
+
+In WRF-Hydro the routing portions of the code have been structured so
+that it is simple to perform both surface and subsurface routing
+calculations on grid cells that potentially differ from the native land
+surface model grid sizes provided that each land surface model grid cell
+is divided into integer portions for routing. Hence routing calculations
+can be performed on comparatively high-resolution land surfaces (e.g., a
+25 `m` digital elevation model) while the native land surface model can be
+run at much larger (e.g., 1 `km`) grid sizes. (In this example, the
+integer multiple of disaggregation would be equal to 40.) This
+capability adds considerable flexibility in the implementation
+of WRF-Hydro. However, it is well recognized that surface hydrological
+responses exhibit strongly scale-dependent behavior such that
+simulations at different scales, run with the same model forcing, may
+yield quite different results.
+
+The aggregation/disaggregation routines are currently activated by
+specifying either the overland flow or subsurface flow routing options
+in the model namelist file and prescribing terrain grid domain file
+dimensions (``IXRT``, ``JXRT``) which differ from the land surface model domain
+file dimensions (``IX``, ``JX``). Additionally, the model sub-grid size (``DXRT``),
+the routing time-step (``DTRT``), and the integer divisor (``AGGFACTRT``), which
+determines how the aggregation/disaggregation routines will divide up a
+native model grid square, all need to be specified in the model
+`hydro.namelist` file.
+
+If ``IXRT=IX``, ``JXRT=JX`` and ``AGGFACTRT=1`` the aggregation/disaggregation
+schemes will be activated but will not yield any effective changes in
+the model resolution between the land surface model grid and the terrain
+routing grid. Specifying different values for ``IXRT`` and ``JXRT`` and setting ``AGGFACTRT`` ≠ 1
+will yield effective changes in model resolution between the land model
+and terrain routing grids. As described in the Surface Overland Flow
+Routing section `3.5 <#surface-overland-flow-routing>`__, ``DXRT`` and ``DTRT``
+must always be specified in accordance with the routing grid even if
+they are the same as the native land surface model grid.
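+
+As a simple illustration of how these settings relate, the following is a
+minimal Python sketch (the names mirror the namelist options and domain-file
+dimensions described above; the specific values are hypothetical):
+
+.. code-block:: python
+
+   # Minimal sketch relating land-model and routing-grid settings.
+   # Values are hypothetical; consult your own domain files and hydro.namelist.
+   IX, JX = 100, 80       # land surface model grid dimensions
+   DX = 1000.0            # land model grid spacing (m)
+   AGGFACTRT = 4          # integer disaggregation factor
+
+   # Routing grid dimensions and spacing implied by the aggregation factor
+   IXRT = IX * AGGFACTRT  # 400
+   JXRT = JX * AGGFACTRT  # 320
+   DXRT = DX / AGGFACTRT  # 250.0 m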
+
+The disaggregation/aggregation routines are implemented in WRF-Hydro as
+two separate spatial loops that are executed after the main land surface
+model loop. The disaggregation loop is run prior to routing of saturated
+subsurface and surface water. The main purpose of the disaggregation
+loop is to divide up specific hydrologic state variables from the land
+surface model grid square into integer portions as specified by
+``AGGFACTRT``. An example disaggregation (where ``AGGFACTRT=4``) is given in
+Figure 3.2.
+
+.. _figure3.2:
+.. figure:: media/aggfactr.png
+ :align: center
+ :scale: 50%
+
+ **Figure 3.2** Example of the routing sub-grid implementation within the
+ regular land surface model grid for an aggregation factor = 4.
+
+The following model variables are disaggregated for higher
+resolution routing calculations:
+
+ `SMCMAX` - maximum soil moisture content for each soil type
+
+ `SMCREF` - reference soil moisture content (field capacity) for each soil
+ type
+
+ `INFXS` - infiltration excess
+
+ `LKSAT` - lateral saturated conductivity for each soil type
+
+ `SMC` - soil moisture content for each soil layer
+
+In the model code, fine-grid values bearing the same name as these with
+an “RT” extension are created for each native land surface model grid
+cell (e.g. ``INFXSRT`` vs ``INFXS``).
+
+To preserve the structure of the spatial variability of soil moisture
+content on the sub-grid from one model time step to the next, simple,
+linear sub-grid weighting factors are assigned. These values indicate
+the fraction of the total land surface model grid value that is
+partitioned to each sub-grid pixel. After disaggregation, the routing
+schemes are executed using the fine grid values.
+
+Following execution of the routing schemes the fine-grid values are
+aggregated back to the native land surface model grid. The aggregation
+procedure used is a simple linear average of the fine-grid components.
+For example the aggregation of surface head (``SFHEAD``) from the fine grid
+to the native land surface model grid would be:
+
+.. rst-class:: center
+
+ :math:`{SFHEAD}_{i,j} = \frac{\sum_{i_{rt}}\sum_{j_{rt}} {SFHEADRT}_{i_{rt},j_{rt}}}{AGGFACTRT^2}`
+ (3.0)
+
+where, `i_{rt}` and `j_{rt}` are the indices of all of the grid
+cells residing within the native land model grid cell `i`,\ `j`. The following
+variables are aggregated and, where applicable, update land surface
+model variable values:
+
+ | SFHEAD - surface head (or, equivalently, depth of ponded water)
+ | SMC - soil moisture content for each soil layer
+
+These updated values are then used on the next iteration of the land
+surface model.
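+
+The following is a minimal Python sketch of this disaggregation/aggregation
+cycle for ``AGGFACTRT=4``. It is purely illustrative (not the model's Fortran
+implementation) and, for simplicity, uses uniform sub-grid weights rather than
+the weights carried over from the previous time step:
+
+.. code-block:: python
+
+   import numpy as np
+
+   AGGFACTRT = 4
+   infxs = np.random.rand(10, 12)   # coarse-grid infiltration excess (mm)
+
+   # Disaggregate: replicate each coarse cell onto its AGGFACTRT x AGGFACTRT
+   # sub-grid cells (uniform weights, for illustration only).
+   infxsrt = np.kron(infxs, np.ones((AGGFACTRT, AGGFACTRT)))
+
+   # ... overland/subsurface routing would update the fine-grid surface head ...
+   sfheadrt = infxsrt.copy()
+
+   # Aggregate back (Eq. 3.0): linear average over each sub-grid block.
+   ny, nx = infxs.shape
+   sfhead = sfheadrt.reshape(ny, AGGFACTRT, nx, AGGFACTRT).mean(axis=(1, 3))
+   assert sfhead.shape == infxs.shape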
+
+3.3.2 User-Defined Mapping
+~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+The emergence of hydrologic models, like WRF-Hydro, that are capable of
+running on gridded as well as vector-based processing units requires
+generic tools for processing input and output data as well as methods
+for transferring data between models. Such a spatial transformation is
+currently utilized when mapping between model grids and catchments in
+the WRF-Hydro/National Water Model (NWM) system. In the NWM, selected
+model fluxes are mapped from WRF-Hydro model grids onto the NHDPlus
+catchment polygon and river vector network framework. The GIS
+pre-processing framework described here allows for fairly generalized
+geometric relationships between features to be characterized and for
+parameters to be summarized for any discrete unit of geography.
+
+3.3.3 Data Remapping for Hydrological Applications
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+A common task in hydrologic modeling is to regrid or aggregate data from
+one unit of analysis to another. Frequently, atmospheric model data
+variables such as temperature and precipitation may be produced on a
+rectilinear model grid while the hydrologic unit of analysis may be a
+catchment Hydrologic Response Unit (cHRU), defined using a closed
+polygon and derived from a hydrography dataset or terrain processing
+application. Often, cHRU-level parameters must be derived from data on a
+grid. Depending on the difference between the scale of the gridded and
+feature data, simple interpolation schemes such as nearest neighbor may
+introduce significant error when estimating data at the cHRU scale.
+Other GIS analysis methods such as zonal statistics require resampling
+of the gridded and/or feature data and limited control over the common
+analysis grid resolution, which may also introduce significant error.
+Area-weighted grid statistics provide a robust and potentially
+conservative method for transferring data from one or multiple features
+to another. In the case of runoff calculated from a land surface model
+grid, the runoff should be conservatively transferred between the grid
+and the cHRU, such that the runoff volume is conserved.
+
+The correspondence between polygons and grid cells need only be
+generated once for any grid/polygon collection. The correspondence file
+that is output from the tool stores all necessary information for
+translating data between the datasets in either direction.
+
+There are a variety of useful regridding and spatial analysis tools
+available for use in the hydrologic and atmospheric sciences. Many
+regridding utilities exist that are able to either characterize and
+store the relationship between grid features and polygons or perform
+regridding from one grid to another. The Earth System Modeling Framework
+(ESMF) offers high performance computing (HPC) software for building and
+coupling weather, climate, and related models. ESMF provides the
+:program:`ESMF_RegridWeightGen` utility for parallel generation of interpolation
+weights between two grid files in netCDF format. These utilities will
+work for structured (rectilinear) and unstructured grids. The NCAR
+Command Language (NCL) has supported the :program:`ESMF_RegridWeightGen` tool since
+version 6.1.0. Other commonly used tools in the atmospheric sciences
+are the Climate Data Operators (CDO), which offer 1\ :sup:`st`\- and 2\ :sup:`nd`\-
+order conservative regridding (:program:`remapcon`, :program:`remapcon2`) and regrid weight
+generation (:program:`gencon`, :program:`gencon2`) based on the work of *Jones (1999)*. All of
+the above-mentioned utilities require SCRIP grid description files to
+perform the remapping. The SCRIP standard format for correspondence
+stores geometry information for regridding, while the correspondence
+files described here store just the spatial weights. Thus, WRF-Hydro spatial
+correspondence files are more generic, with compact file sizes, and may
+be used for non-gridded data.
+
+The GIS pre-processing script quantifies the polygon-to-polygon correspondence
+between geometries in two separate features (grid cells represented by polygons
+and basins represented by polygons). This correspondence is stored in a
+netCDF format file that contains the spatial weights and identification
+of all polygons from one input shapefile that intersect each polygon in
+another input shapefile. The storage of correspondence information
+between one dataset and another allows for many types of regridding and
+spatial interpolation between the spatial datasets. This file needs only
+to be derived once between any two sets of polygons, and the
+correspondence file can be used to regrid variables between those
+spatial datasets. This is useful if multiple variables must be
+regridded, or a single variable across many timesteps. As long as the
+grids do not change in space or time, the relationship between all
+features will remain constant, and the correspondence file may be used
+to regrid data between them.
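+
+The following sketch illustrates, in Python, how such stored spatial weights
+can be applied to aggregate a gridded flux onto catchment polygons. The array
+names are hypothetical stand-ins for the contents of a correspondence file,
+not the actual variable names used in the WRF-Hydro spatial weights file:
+
+.. code-block:: python
+
+   import numpy as np
+
+   grid_value = np.array([[2.0, 4.0], [6.0, 8.0]])   # gridded flux (e.g. runoff)
+
+   # For each grid-cell/polygon intersection: which polygon, which grid cell,
+   # and the fraction of the polygon's area covered by that cell.
+   poly_id = np.array([0, 0, 1, 1])
+   cell_i = np.array([0, 0, 1, 1])
+   cell_j = np.array([0, 1, 0, 1])
+   weight = np.array([0.25, 0.75, 0.6, 0.4])         # weights sum to 1 per polygon
+
+   n_poly = poly_id.max() + 1
+   poly_value = np.zeros(n_poly)
+   np.add.at(poly_value, poly_id, weight * grid_value[cell_i, cell_j])
+   print(poly_value)   # area-weighted mean flux for each catchment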
+
+There are uses for this utility that range outside of the hydrological
+sciences, and this utility may be of broader interest to the geospatial
+community. Although interpolation packages exist, this method allows for
+storage of the correspondence information for future use in a compact
+file. Users wanting to create custom spatial weight interpolation files
+for WRF-Hydro need to refer to the *WRF-Hydro GIS Pre-processing
+Toolkit* and documentation. For reference, variable descriptions of the
+contents of the spatial weights file are located in :ref:`section-20`.
+
+.. _figure3.3:
+.. figure:: media/user-defined-mapping.png
+ :align: center
+ :scale: 90%
+
+ **Figure 3.3.** An illustration of implementing user-defined mapping to
+ translate from gridded fluxes and states to aggregated catchment fluxes
+ and states, which can be passed into, for example, vector-based channel
+ routing modules.
+
+3.4 Subsurface Routing
+----------------------
+
+Subsurface lateral flow is calculated prior to the routing of overland
+flow. This is because exfiltration from a supersaturated soil column is
+added to infiltration excess from the land surface model, which
+ultimately updates the value of surface head prior to routing of
+overland flow. A supersaturated soil column is defined as a soil column
+that possesses a positive subsurface moisture flux which, when added to
+the existing soil water content, is in excess of the total soil water
+holding capacity of the entire soil column. Figure 3.4 illustrates the
+lateral flux and exfiltration processes in WRF-Hydro.
+
+In the current default implementation of WRF-Hydro with the Noah and
+Noah-MP land surface models, there are four soil layers. The depth of
+the soil layers in WRF-Hydro can be manually specified in the model
+namelist file under the ``ZSOIL`` variable. Users must be aware that, in
+the present version of WRF-Hydro, total soil column depth and individual
+soil layer thicknesses are constant throughout the entire model domain.
+Future versions under development are relaxing this constraint. However,
+the model is capable of using a different distribution of soil column
+layer depths and these simply need to be specified in the model namelist
+file. Assuming a 2-m soil profile, the default soil layer depths (and
+associated water table depths) are specified in Table 3.1.
+
+.. table:: **Table 3.1: Depths of 4 soil layers in WRF-Hydro**
+ :align: center
+
+ +-------------+-------------------------+-----------------------------+
+ | **Layer** | **Soil Thickness (mm)** | **Z (depth to top of layer) |
+ | | | (mm)** |
+ +=============+=========================+=============================+
+ | 1 | 100 | 0 |
+ +-------------+-------------------------+-----------------------------+
+ | 2 | 300 | 100 |
+ +-------------+-------------------------+-----------------------------+
+ | 3 | 600 | 400 |
+ +-------------+-------------------------+-----------------------------+
+ | 4 | 1000 | 1000 |
+ +-------------+-------------------------+-----------------------------+
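+
+As a quick check, the depth to the top of each layer in Table 3.1 follows
+directly from the cumulative layer thicknesses, as in the short sketch below
+(illustrative only; values in mm for the default 2 m soil column):
+
+.. code-block:: python
+
+   thickness = [100, 300, 600, 1000]   # layer thicknesses (mm)
+
+   depth_to_top = []
+   z = 0
+   for dz in thickness:
+       depth_to_top.append(z)
+       z += dz
+
+   print(depth_to_top)   # [0, 100, 400, 1000]
+   print(z)              # 2000 mm total soil column depth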
+
+.. _figure3.4:
+.. figure:: media/subsurface-flow.png
+ :align: center
+ :scale: 50%
+
+ **Figure 3.4** Conceptualization of saturated subsurface flow
+ components.
+
+The method used to calculate the lateral flow of saturated soil moisture
+employs a quasi three-dimensional flow representation, which includes the
+effects of topography, saturated soil depth (in this case layers), and
+saturated hydraulic conductivity. Hydraulic gradients are approximated
+as the slope of the water table between adjacent grid cells in the x-
+and y-directions or in an eight-direction (D8) steepest descent
+methodology that is specified by the user in the model namelist. In each
+cell, the flux of water from one cell to its down-gradient neighbor on
+each timestep is approximated as a steady-state solution. The looping
+structure through the model grid performs flux calculations separately
+in the x- and y-directions for the 2-dimensional routing option or
+simply along the steepest D8 pathway.
+
+Using Dupuit-Forchheimer assumptions the rate of saturated subsurface
+flow at time `t` can be calculated as:
+
+.. _eqn3.1:
+.. rst-class:: center
+.. math::
+
+ q_{i,j} &= - T_{i,j}\beta_{i,j}w_{i,j} \quad \text{when } \beta_{i,j} < 0 \\
+ &= 0 \quad \text{when } \beta_{i,j} \geq 0
+
+ (3.1)
+
+where, `q_{i,j}` is the flow rate from cell `(i,j)`, `T_{i,j}` is the
+transmissivity of cell `(i,j)`, `\beta_{i,j}` is the water table slope and
+`w_{i,j}` is the width of the cell which is fixed for a regular grid.
+`\beta_{i,j}` is calculated as the difference in water table depths between
+two adjacent grid cells divided by the grid spacing. The method by which
+the water table depth is determined is provided below. Transmissivity is
+a power law function of saturated hydraulic conductivity (`Ksat_{i,j}`)
+and soil thickness (`D_{i,j}`) given by:
+
+.. _eqn3.2:
+.. rst-class:: center
+.. math::
+
+ T_{i,j} &= \frac{{Ksat}_{i,j}D_{i,j}}{n_{i,j}} \left( 1 - \frac{z_{i,j}}{D_{i,j}} \right)^{n_{i,j}} \quad \text{when } z_{i,j} \leq D_{i,j} \\
+ &= 0 \quad \text{when } z_{i,j} > D_{i,j}
+
+ (3.2)
+
+where, `z_{i,j}` is the depth to the water table. `n_{i,j}` in :ref:`Eq. (3.2) <eqn3.2>`
+is defined as the local power law exponent and is a tunable parameter
+(currently hard-coded to 1 but will be exposed in future versions) that
+dictates the rate of decay of `Ksat_{i,j}` with depth. When :ref:`Eq. (3.2) <eqn3.2>` is
+substituted into :ref:`Eq. (3.1) <eqn3.1>` the flow rate from cell `(i,j)`
+to its neighbor in the `x`-direction can be expressed as:
+
+.. _eqn3.3:
+.. rst-class:: center
+.. math::
+
+ q_{x(i,j)} = \gamma_{x(i,j)} h_{i,j} \quad \text{when } \beta_{x(i,j)} < 0
+
+ (3.3)
+
+where,
+
+.. _eqn3.4:
+.. rst-class:: center
+.. math::
+
+ \gamma_{x(i,j)} = - \left( \frac{w_{i,j}{Ksat}_{i,j}D_{i,j}}{n_{i,j}} \right)\beta_{x(i,j)}
+
+ (3.4)
+
+.. _eqn3.5:
+.. rst-class:: center
+.. math::
+
+ h_{i,j} = \left( 1-\frac{z_{i,j}}{D_{i,j}} \right)
+
+ (3.5)
+
+This calculation is repeated for the y-direction when using the
+two-dimensional routing method. The net lateral flow of saturated
+subsurface moisture (`Q_{net}`) for cell `(i,j)` then becomes:
+
+.. _eqn3.6:
+.. rst-class:: center
+.. math::
+
+ Q_{net(i,j)} = h_{i,j} \sum_x{\gamma_{x(i,j)}} + h_{i,j} \sum_y{\gamma_{y(i,j)}}
+
+ (3.6)
+
+The mass balance for each cell on a model time step (`\Delta t`) can then be
+calculated in terms of the change in depth to the water table (`\Delta z`):
+
+.. _eqn3.7:
+.. rst-class:: center
+.. math::
+
+ \Delta z = \frac{1}{\phi_{(i,j)}} \left[ \frac{Q_{net(i,j)}}{A} - R_{(i,j)} \right] \Delta t
+
+ (3.7)
+
+where, `\phi` is the soil porosity, `R` is the soil column recharge rate
+from infiltration or deep subsurface injection and *A* is the grid cell
+area. In WRF-Hydro, `R` is implicitly accounted for during the land
+surface model integration as infiltration and subsequent soil moisture
+increase. Assuming there is no deep soil injection of moisture (i.e.
+pressure-driven flow from below the lowest soil layer), `R` in
+WRF-Hydro is set equal to 0.
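+
+A minimal, illustrative Python sketch of the cell-to-cell flux calculation in
+Equations :ref:`3.1 <eqn3.1>` through :ref:`3.5 <eqn3.5>` is given below. This
+is not the model code; all input values are hypothetical, and the water table
+slope is taken simply as the difference in water table depths over the grid
+spacing, as described above:
+
+.. code-block:: python
+
+   ksat = 5.0e-6       # saturated hydraulic conductivity (m/s)
+   D = 2.0             # total soil column depth (m)
+   n_exp = 1.0         # power-law decay exponent (currently hard-coded to 1)
+   w = 250.0           # cell width (m)
+   dx = 250.0          # grid spacing (m)
+   z_local, z_neighbor = 0.8, 1.2   # depth to water table in each cell (m)
+
+   beta = (z_local - z_neighbor) / dx   # water table slope toward the neighbor
+   if beta < 0.0:
+       gamma = -(w * ksat * D / n_exp) * beta   # Eq. (3.4)
+       h = 1.0 - z_local / D                    # Eq. (3.5)
+       q = gamma * h                            # Eq. (3.3), flow rate (m^3/s)
+   else:
+       q = 0.0                                  # Eq. (3.1), no up-gradient flow
+   print(q)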
+
+The methodology outlined in Equations :ref:`3.2 <eqn3.2>` through :ref:`3.7 <eqn3.7>`
+has no explicit information on soil layer structure, as the method treats
+the soil as a single homogeneous column (with an assumed exponential decay of
+saturated hydraulic conductivity). Therefore, changes in water table
+depth (`\Delta z`) need to be remapped to the land surface model soil layers.
+WRF-Hydro specifies the water table depth according to the depth of the
+top of the highest (i.e. nearest to the surface) saturated layer. The
+residual saturated water above the uppermost, saturated soil layer is
+then added to the overall soil water content of the overlying
+unsaturated layer. This computational structure requires accounting
+steps to be performed prior to calculating `Q_{net}`.
+
+Given the timescale for groundwater movement and limitations in the
+model structure there is significant uncertainty in the time it takes to
+properly spin-up groundwater systems. The main things to consider
+include 1) the specified depth of soil and number and thickness of the
+soil vertical layers and 2) the prescription of the model bottom
+boundary condition. Typically, for simulations with deep soil profiles
+(e.g. > 10 `m`) the bottom boundary condition is set to a ‘no-flow’
+boundary (``SLOPE_DATA = 0.0``) in the :file:`GENPARM.TBL` parameter file (see
+Appendices :ref:`section-a6` and :ref:`section-a7` for a
+description of :file:`GENPARM.TBL`).
+
+.. rubric:: Relevant code modules:
+
+:file:`Routing/Noah_distr_routing.F90`
+
+.. rubric:: Relevant namelist options:
+
+:file:`hydro.namelist`:
+
+- ``SUBRTSWCRT`` - Switch to activate subsurface flow routing.
+
+- ``DXRT`` - Specification of the routing grid cell spacing
+
+- ``AGGFACTR`` - Subgrid aggregation factor, defined as the ratio of the
+ subgrid resolution to the native land model resolution
+
+- ``DTRT_TER`` - Terrain routing grid time step (used for overland and
+ subsurface routing)
+
+.. rubric:: Relevant domain and parameter files/variables:
+
+- ``TOPOGRAPHY`` in :file:`Fulldom_hires.nc` - Terrain grid or Digital Elevation
+ Model (DEM). Note: this grid may be provided at resolutions equal to
+ or finer than the native land model resolution.
+
+- ``LKSATFAC`` in :file:`Fulldom_hires.nc` - Multiplier on saturated hydraulic
+ conductivity in lateral flow direction.
+
+- ``SATDK``, ``SMCMAX``, ``SMCREF`` in :file:`HYDRO.TBL` or :file:`hydro2dtbl.nc` - Soil properties
+ (saturated hydraulic conductivity, porosity, field capacity) used in
+ lateral flow routing.
+
+.. _section-3.5:
+
+3.5 Surface Overland Flow Routing
+---------------------------------
+
+Overland flow in WRF-Hydro is calculated using a fully-unsteady,
+explicit, finite-difference, diffusive wave formulation similar to that
+of *Julien et al. (1995)* and *Ogden et al. (1997)*. The diffusive wave
+equation, while slightly more complicated, is, under some conditions,
+superior to the simpler and more traditionally used kinematic wave
+equation, because it accounts for backwater effects and allows for flow
+on adverse slopes. The overland flow routine described below can be
+implemented in either a 2-dimensional (x and y direction) or 1-dimensional
+(steepest descent or “D8”) method. While the 2-dimensional method may
+provide a more accurate depiction of water movement across some complex
+surfaces it is more expensive in terms of computational time compared
+with the 1-dimensional method. While the physics of both methods are
+identical we have presented the formulation of the flow in equation form
+below using the 2-dimensional methodology.
+
+.. figure:: media/overland-flow.png
+ :align: center
+ :scale: 50%
+
+ **Figure 3.5:** Conceptual representation of terrain elements. Flow is
+ routed across terrain elements until it intersects a “channel” grid cell
+ indicated by the blue line where it becomes “in-flow” to the stream
+ channel network.
+
+The diffusive wave formulation is a simplification of the more general
+St. Venant equations of continuity and momentum for a shallow water
+wave. The two-dimensional continuity equation for a flood wave flowing
+over the land surface is:
+
+.. _eqn3.8:
+.. rst-class:: center
+.. math::
+
+ \frac{\partial h}{\partial t} + \frac{\partial q_x}{\partial x} + \frac{\partial q_y}{\partial y} = i_e
+
+ (3.8)
+
+where, `h` is the surface flow depth; `q_x` and `q_y` are the unit
+discharges in the `x`- and `y`-directions, respectively; and `i_e` is the
+infiltration excess. The momentum equation used in the diffusive wave
+formulation for the `x`-dimension is:
+
+.. _eqn3.9:
+.. rst-class:: center
+.. math::
+
+ S_{fx} = S_{ox} - \frac{\partial h}{\partial x}
+
+ (3.9)
+
+where, `S_{fx}` is the friction slope (or slope of the energy grade line)
+in the `x`-direction, `S_{ox}` is the terrain slope in the `x`-direction and
+`\partial h/\partial x` is the change in depth of the water surface above the land
+surface in the `x`-direction.
+
+In the 2-dimensional option, flow across the terrain grid is calculated
+first in the `x`- then in the `y`-direction. In order to solve :ref:`Eq. 3.8 <eqn3.8>`
+values for `q_x` and `q_y` are required. In most hydrological models
+they are typically calculated by the use of a resistance equation such
+as Manning's equation or the Chezy equation, which incorporates the
+expression for momentum losses given in :ref:`Eq. 3.9 <eqn3.9>`.
+In WRF-Hydro, a form of Manning's equation is implemented:
+
+.. _eqn3.10:
+.. rst-class:: center
+.. math::
+
+ q_x = \alpha_x h^\beta
+
+ (3.10)
+
+where,
+
+.. _eqn3.11:
+.. rst-class:: center
+.. math::
+
+ \alpha_x = \frac{S_{fx}^{1/2}}{n_{OV}}; \qquad \beta = \frac{5}{3}
+
+ (3.11)
+
+where, `n_{OV}` is the roughness coefficient of the land surface and is a
+tunable parameter and `\beta` is a unit-dependent coefficient expressed here
+for SI units.
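+
+For illustration, the following is a minimal sketch of the unit discharge
+calculation in Eqs. :ref:`3.10 <eqn3.10>` and :ref:`3.11 <eqn3.11>` in SI
+units; the parameter values are hypothetical examples only:
+
+.. code-block:: python
+
+   import math
+
+   n_ov = 0.1     # overland flow roughness coefficient
+   S_fx = 0.01    # friction slope in the x-direction
+   h = 0.005      # surface flow depth (m)
+
+   alpha_x = math.sqrt(S_fx) / n_ov   # Eq. (3.11)
+   beta = 5.0 / 3.0
+   q_x = alpha_x * h ** beta          # Eq. (3.10), unit discharge (m^2/s)
+   print(q_x)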
+
+The overland flow formulation has been used effectively at fine terrain
+scales ranging from 30-300 `m`. There has not been rigorous testing in
+WRF-Hydro to date at larger length-scales (> 300 `m`). This is due to
+the fact that typical overland flood waves possess length scales much
+smaller than 1 `km`. Micro-topography can also influence the behavior of a
+flood wave. Correspondingly, at larger grid sizes (e.g. > 300 `m`) there
+will be poor resolution of the flood wave and the small-scale features
+that affect it. Also, at coarser resolutions, terrain slopes between
+grid cells are lower due to an effective smoothing of topography as grid
+size resolution is decreased. Each of these features will degrade the
+performance of dynamic flood wave models to accurately simulate overland
+flow processes. Hence, it is generally considered that finer resolutions
+yield superior results.
+
+The selected model time step is directly tied to the grid resolution. In
+order to prevent numerical diffusion of a simulated flood wave (where
+numerical diffusion is the artificial dissipation and dispersion of a
+flood wave) a proper time step must be selected to match the selected
+grid size. This match is dependent upon the assumed wave speed or
+celerity (`c`). The Courant Number, `C_n = c(\Delta t/\Delta x)`, should be
+close to 1.0 in order to prevent numerical diffusion. The value of the
+`C_n` also affects the stability of the routing routine such that
+values of `C_n` should always be less than 1.0. Therefore the
+following model time steps are suggested as a function of model grid
+size as shown in Table 3.2.
+
+.. table::
+ **Table 3.2:** Suggested routing time steps for various grid spacings
+ :align: center
+ :width: 50%
+
+ +---------+---------+
+ | X (`m`) | T (`s`) |
+ +=========+=========+
+ | 30 | 2 |
+ +---------+---------+
+ | 100 | 6 |
+ +---------+---------+
+ | 250 | 15 |
+ +---------+---------+
+ | 500 | 30 |
+ +---------+---------+
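+
+The suggested pairings in Table 3.2 can be reproduced approximately from the
+Courant relation `\Delta t \leq \Delta x/c`; the short sketch below assumes a
+wave celerity of roughly 15 `m/s`, which is an illustrative value only:
+
+.. code-block:: python
+
+   c = 15.0   # assumed flood wave celerity (m/s) -- hypothetical
+   for dx in (30.0, 100.0, 250.0, 500.0):
+       dt_max = dx / c   # time step giving a Courant number of 1.0
+       print(dx, round(dt_max, 1))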
+
+.. rubric:: Relevant code modules:
+
+:file:`Routing/Noah_distr_routing.F90`
+
+.. rubric:: Relevant namelist options:
+
+:file:`hydro.namelist`:
+
+- ``OVRTSWCRT`` - Switch to activate overland flow routing.
+
+- ``DXRT`` - Specification of the routing grid cell spacing
+
+- ``AGGFACTR`` - Subgrid aggregation factor, defined as the ratio of the
+ subgrid resolution to the native land model resolution
+
+- ``DTRT_TER`` - Terrain routing grid time step (used for overland and
+ subsurface routing)
+
+.. rubric:: Relevant domain and parameter files/variables:
+
+- ``TOPOGRAPHY`` in :file:`Fulldom_hires.nc` - Terrain grid or Digital Elevation
+ Model (DEM). Note: this grid may be provided at resolutions equal to
+ or finer than the native land model resolution.
+
+- ``RETDEPRTFAC`` in :file:`Fulldom_hires.nc` - Multiplier on maximum retention
+ depth before flow is routed as overland flow.
+
+- ``OVROUGHRTFAC`` in :file:`Fulldom_hires.nc` - Multiplier on Manning's roughness
+ for overland flow.
+
+- ``OV_ROUGH`` in :file:`HYDRO.TBL` or ``OV_ROUGH2D`` in
+ :file:`hydro2dtbl.nc` - Manning's roughness for overland flow (by
+ default a function of land use type).
+
+.. _section-3.6:
+
+3.6 Channel and Lake Routing
+----------------------------
+
+There are multiple channel routing algorithms available in version 5.0
+of WRF-Hydro. These algorithms operate either on the resolution of the
+fine grid (gridded routing) or on a vectorized network of channel
+reaches (linked routing, also referred to as reach-based routing), which
+maps the fine grid to the vector network (:ref:`Figure 3.6 <figure3.6>`).
+The following section describes the routing methods and their
+implementation in the WRF-Hydro model code.
+
+In general, inflow to the channel is based on a mass balance
+calculation, where the channel routes water when the ponded water depth
+(or surface head, `SFCHEADRT`) of the channel grid cells exceeds a
+predefined retention depth (`RETDEPRT`). As described in `Section
+3.5 <#surface-overland-flow-routing>`__, the depth of surface head on
+any grid cell is a combination of the local infiltration excess, the
+amount of water flowing onto the grid cell from overland flow, and
+exfiltration from groundwater flow. The quantity of surface head in
+excess of the retention depth is accumulated as stream channel inflow
+and is effectively “discharged” to the channel routing routine
+(described below). For calibration purposes, gridded values of a scaling
+factor for `RETDEPRT` can be specified in the main :file:`hydro2dtbl.nc` netCDF
+input file. Increases in the `RETDEPRT` scaling factor on channel pixels
+can encourage more local infiltration near the river channel leading to
+wetter soils that better emulate riparian conditions. Values of “channel
+inflow” are accumulated on the channel grid and can be output for
+visualization and analysis (see :ref:`Section 6 <section-6>` for a
+description of model outputs).
+
+.. _figure3.6:
+.. figure:: media/channel-routing-grid-link.png
+ :align: center
+
+ **Figure 3.6** Channel routing via the high resolution grid (left) or on
+ a vector/link network (right).
+
+The channel routing module :file:`module_channel_routing.F90` allows for the
+one-dimensional, distributed routing of streamflow across the domain. An
+optional, switch-activated, level-pool lake/reservoir algorithm is also
+available and is described below in Sections :ref:`3.7 <section-3.7>`
+and :ref:`3.8 <section-3.8>`. Within each channel grid cell there is an
+assumed channel reach of trapezoidal geometry as depicted in Figure 3.7.
+Channel parameters side slope (`z`), bottom width (`B_w`) and roughness (`n`)
+are currently prescribed as functions of Strahler stream order for defaults.
+Details on how each routing method reads these parameters are specified
+in the subsections below.
+
+.. _figure3.7:
+.. figure:: media/channel-terms.png
+ :align: center
+ :figwidth: image
+
+ **Figure 3.7** Schematic of Channel Routing Terms
+
+.. table::
+ :align: center
+
+ +---------------------------------+------------------+
+ | Channel Slope | `S_o` |
+ +---------------------------------+------------------+
+ | Channel Length | `\Delta x` (`m`) |
+ +---------------------------------+------------------+
+ | Channel side slope | `z` (`m`) |
+ +---------------------------------+------------------+
+ | Constant bottom width | `B_w` (`m`) |
+ +---------------------------------+------------------+
+ | Manning's roughness coefficient | (`n`) |
+ +---------------------------------+------------------+
+
+As discussed above, channel elements receive lateral inflow from
+overland flow. There is currently no overbank flow back to the
+fine-grid, so flow into the channel model is effectively one-way.
+Therefore, WRF-Hydro does not explicitly represent inundation areas from
+overbank flow from the channel back to the terrain. This will be an
+upcoming enhancement, though currently there are methods for
+post-processing an inundation surface. Uncertainties in channel geometry
+parameters and the lack of an overbank flow representation result in a
+measure of uncertainty for users wishing to compare model flood
+inundation versus those from observations. It is strongly recommended
+that users compare model versus observed streamflow discharge values and
+use observed stage-discharge relationships or “rating curves” when
+wishing to relate modeled/predicted streamflow values to actual river
+levels and potential inundation areas.
+
+.. rubric:: Relevant code modules:
+
+:file:`Routing/module_channel_routing.F90`
+
+.. rubric:: Relevant namelist options for gridded and reach-based routing:
+
+:file:`hydro.namelist`:
+
+- ``CHANRTSWCRT`` - Switch to activate channel routing.
+
+- ``channel_option`` - Specification of the type of channel routing to
+ activate
+
+- ``DTRT_CH`` - Channel routing time step, applies to both gridded and
+ reach-based channel routing methods
+
+- ``route_link_f`` (optional) - a :file:`Route_Link.nc` file is required for
+ reach-based routing methods. Example header in :ref:`Appendix A9 <section-a9>`.
+
+3.6.1. Gridded Routing using Diffusive Wave
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+Channel flow down through the gridded channel network is performed using
+an explicit, one-dimensional, variable time-stepping diffusive wave
+formulation. As mentioned above the diffusive wave formulation is a
+simplification of the more general St. Venant equations for shallow
+water wave flow. Similarly, for channel routing, the mass and momentum
+continuity equations are given as:
+
+ Continuity:
+
+.. math::
+ \frac{\partial A}{\partial t} + \frac{\partial Q}{\partial x} = q_{lat}
+ \qquad (3.12)
+
+\
+ Momentum:
+
+.. math::
+ \frac{\partial Q}{\partial t} + \frac{\partial(\beta Q^2 / A)}{\partial x} +
+ gA\frac{\partial Z}{\partial x} = -gAS_f
+ \qquad (3.13)
+
+where `t` is the time, `x` is the streamwise coordinate, `A` is
+the flow area of the cross section, and `q_{lat}` is the lateral
+inflow rate into the channel. In the momentum equation, `Q` is the flow
+rate, `\beta` is a momentum correction coefficient, `Z` is the water surface
+elevation, `g` is gravity and `S_f` is the friction slope which is
+computed as:
+
+.. math::
+ S_f = \left( \frac{Q}{K} \right)^2
+ \qquad (3.14)
+
+where `K` is the conveyance, computed from the Manning's equation:
+
+.. math::
+ K = \frac{C_m}{n} AR^{2/3}
+ \qquad (3.15)
+
+where `n` is the Manning's roughness coefficient, `A` is the
+cross-sectional area, `R` is the hydraulic radius (`A/P`), `P` is the
+wetted perimeter, and `C_m` is a dimensional constant (1.486 for English
+units or 1.0 for SI units).
+
+Ignoring the inertial terms (the first two terms in the momentum
+equation) gives the diffusive wave approximation of open channel flow.
+The momentum equation then simplifies to:
+
+.. math::
+ Q = -SIGN \left( \frac{\partial Z}{\partial x} \right) K \sqrt{\left| \frac{\partial Z}{\partial x}\right|}
+ \qquad (3.16)
+
+where the substitution for friction slope has been made and the `SIGN`
+function is `1` for `\partial Z / \partial x > 0` and `-1` for `\partial Z / \partial x < 0`.
+
+The numerical solution is obtained by discretizing the continuity
+equation over a raster cell as:
+
+.. math::
+ A^{n+1} - A^n = \frac{\Delta t}{\Delta x} \left( Q^n_{i+\frac{1}{2}} - Q^n_{i-\frac{1}{2}} \right)
+ + \Delta tq^n_{lat}
+ \qquad (3.17)
+
+where `Q^n_{i+\frac{1}{2}}` is the flux across the cell face between point `i` and
+`i+1`, and is computed as:
+
+.. math::
+ Q^n_{i+\frac{1}{2}} = -SIGN\left(\Delta Z^n_{i+1}\right) K_{i+\frac{1}{2}}
+ \sqrt{\frac{\left|\Delta Z^n_{i+1}\right|}{\Delta x}}
+ \qquad (3.18)
+
+where:
+
+.. math::
+ \Delta Z^n_{i+1} = Z^n_{i+1} - Z^n_i &\qquad (3.19) \\
+ K^n_{i+\frac{1}{2}} = \frac{1}{2} \left[ \left( 1 + SIGN\left(\Delta Z^n_{i+1}\right)\right) K_i
+ + \left( 1- SIGN\left( \Delta Z^n_{i+1} \right)\right) K_{i+1} \right] &\qquad (3.20)
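+
+The following Python sketch illustrates the conveyance and cell-face flux
+calculations in Eqs. (3.15) and (3.18)-(3.20) for a pair of adjacent channel
+cells with an assumed trapezoidal cross section; the numeric values are
+hypothetical and this is not the model's solver:
+
+.. code-block:: python
+
+   import math
+
+   def trapezoid(h, Bw, z):
+       """Flow area, wetted perimeter and hydraulic radius at depth h."""
+       A = (Bw + z * h) * h
+       P = Bw + 2.0 * h * math.sqrt(1.0 + z * z)
+       return A, P, A / P
+
+   def conveyance(h, Bw, z, n, Cm=1.0):
+       A, _, R = trapezoid(h, Bw, z)
+       return (Cm / n) * A * R ** (2.0 / 3.0)   # Eq. (3.15)
+
+   # Water surface elevations and depths at two adjacent channel cells
+   dx = 250.0
+   Z_i, Z_ip1 = 100.00, 99.95      # water surface elevation (m)
+   h_i, h_ip1 = 0.8, 0.7           # flow depth (m)
+   Bw, z, n = 5.0, 1.0, 0.035
+
+   dZ = Z_ip1 - Z_i                                  # Eq. (3.19)
+   sign = 1.0 if dZ > 0 else -1.0
+   K_face = 0.5 * ((1 + sign) * conveyance(h_i, Bw, z, n)
+                   + (1 - sign) * conveyance(h_ip1, Bw, z, n))   # Eq. (3.20)
+   Q_face = -sign * K_face * math.sqrt(abs(dZ) / dx)             # Eq. (3.18)
+   print(Q_face)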
+
+A first-order, Newton-Raphson (N-R) solver is used to integrate the
+diffusive wave flow equations. Under certain streamflow conditions (e.g.
+typically low gradient channel reaches) the first-order solver method
+can produce some instabilities resulting in numerical oscillations in
+calculated streamflow values. To address this issue, higher order solver
+methods will be implemented in future versions of WRF-Hydro.
+
+Unlike typical overland flow flood waves which have very shallow flow
+depths, on the order of millimeters or less, channel flood waves have
+appreciably greater flow depths and wave amplitudes, which can
+potentially result in strong momentum gradients and strong accelerations
+of the propagating wave. To properly characterize the dynamic
+propagation of such highly variable flood waves it is often necessary to
+decrease model time-steps in order to satisfy Courant conditions.
+Therefore WRF-Hydro utilizes variable time-stepping in the diffusive
+wave channel routing module in order to satisfy Courant constraints and
+avoid numerical dispersion and instabilities in the solutions. The
+initial value of the channel routing time-step is set equal to that of
+the overland flow routing timestep which is a function of grid spacing.
+If, during model integration the N-R convergence criteria for
+upstream-downstream streamflow discharge values is not met, the channel
+routing time-step is decreased by a factor of one-half and the N-R
+solver is called again.
+
+It is important to note that the use of variable time-stepping can
+affect model computational performance resulting in slower solution
+times for rapidly evolving streamflow conditions such as those occurring
+during significant flood events. Therefore, selection of the time-step
+decrease factor (default value set to 0.5) and the N-R convergence
+criteria can each affect model computational performance.
+
+Uncertainty in channel routing parameters can also impact the accuracy
+of the model solution which implies that model calibration is often
+required upon implementation in a new domain. Presently, all of the
+channel routing parameters are prescribed as functions of stream order
+in a channel routing parameter table :file:`CHANPARM.TBL`. The structure of this
+file is described in detail in Appendix :ref:`A9 <section-a9>`.
+It should be noted that prescription of channel flow parameters as
+functions of stream order is likely to be a valid assumption over
+relatively small catchments and not over large regions.
+
+3.6.2. Linked Routing using Muskingum and Muskingum-Cunge
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+The gridded catchment and drainage network of the land surface model
+(Noah/Noah-MP LSM) are mapped to the one-dimensional vectorized channel
+network, with a unique set of channel properties defined as constant for
+each reach. The flow out of each channel reach is determined based on
+flow hydraulics, channel storage and the lateral inflow contribution
+from each grid cell that is mapped to the individual link element. Since
+reach lengths are not constant, the number of contributing grid cells to
+the link depends on the link length (:ref:`Figure 3.6 <figure3.6>`).
+Flow is assumed always upstream-to-downstream, and channel junctions
+accommodate the merging of flows through the linked network. The simultaneous
+transformation of the often complex drainage network, source areas, and
+channel flow hydrographs in these large, complex networks necessitates
+a practical and efficient solution to the routing problem (*Brunner and
+Gorbrecht, 1991*).
+
+On the linked network, WRF-Hydro makes use of a fairly standard
+implementation of the Muskingum-Cunge (MC) method of hydrograph routing
+which uses time-varying parameter estimates. The scheme is a
+practical approach to characterizing watershed runoff over large
+networks and for large watershed flow integration. However, as a
+one-dimensional explicit scheme, it does not allow for backwater or
+localized effects. Channel flows are routed upstream to downstream in a
+cascade routing manner (*Brunner and Gorbrecht, 1991*) with the assumption
+that there are negligible backwater effects. The MC routing scheme
+relates inflow and outflow using a storage relationship, where:
+
+.. math::
+ S = K[XI + (1-X) Q] \qquad (3.21)
+
+where `S` is the reach storage, `I` is the inflow, `Q` is the outflow,
+`K` is a storage constant (the reach travel time), and `X` is a
+weighting factor with a range of `0 ≤ X ≤ 0.5`. `X` takes a value of `0`
+for reservoir-type storage, while an advancing floodwave
+produces a wedge of storage and thus a value of `X` greater than `0` (*Chow
+et al., 1982*). The finite difference formulation of the storage
+relationship results in the Muskingum Equation,
+
+.. math::
+ Q_{d}^{c} = C1\, Q_{u}^{p} + C2\, Q_{u}^{c} + C3\, Q_{d}^{p} + \left( \frac{q_{l}\, dt}{D} \right)
+ \qquad (3.22)
+
+where `D = K(1-X) + dt/2`, and the final term represents the contribution
+of lateral inflow to the reach. The subscripts `u` and `d` denote the
+upstream and downstream nodes of each reach, respectively; and the `p`
+and `c` superscripts denote the previous and current time step,
+respectively.
+
+.. _figure3.8:
+.. figure:: media/channel-props.svg
+ :align: center
+ :scale: 150%
+
+ **Figure 3.8** Channel Properties
+
+Static hydraulic properties are used to describe each
+channel reach, with each reach assumed trapezoidal; these include bottom
+width (`B_w`), channel length (`dx`), channel top width before bankfull
+(`T_w`), Manning's roughness coefficient (`n`), channel side slope (`z`,
+in meters), and the longitudinal slope of the channel (`S_o`). If a user
+is running the model with reach-based routing (``channel_option`` =
+``1`` or ``2``), the `B_w`, `n`, and `z` parameters can be modified
+through the :file:`Route_Link.nc` file. Note: the :file:`CHANPARM.TBL`
+file will not be used in this configuration.
+
+Simulated state variables include estimates of mean water depth in the
+channel (`h`), steady-state velocity (`v`) and flow rate (`q`) in the
+reach at the current timestep. An initial depth estimate is made based
+on the depth from the previous time step. Time varying properties
+include the hydraulic area, `Area = (B_w + z\,h)\,h` (3.23), the wetted
+perimeter `W_p = B_w + 2h\sqrt{1+z^2}` (3.24), and the hydraulic radius,
+`R = Area/W_p` (3.25). With an initial estimate of water depth in the channel,
+the wave celerity for the trapezoidal channel is estimated as:
+
+.. math::
+ Ck = \frac{\sqrt{S_o}}{n}\left[\frac{5}{3}R^{\frac{2}{3}} -
+ \frac{2}{3}R^{\frac{5}{3}}\left(\frac{2\sqrt{1+z^2}}{B_w + 2hz}\right)\right]
+ \qquad (3.26)
+
+Wave celerity is used to estimate the MC routing parameters, where
+`K = dx/Ck` (3.27) is the time required for an incremental flood wave to
+propagate through the channel reach, and the storage shape weighting
+factor is given as
+:math:`X = \frac{1}{2}\left( 1 - \frac{Q}{T_w\, Ck\, S_o\, dx} \right)`
+(3.28), where `Q` is the estimated discharge, `T_w` is the water surface
+width, `S_o` is the channel slope and `dx` is the channel length.
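+
+A minimal sketch tying together the time-varying Muskingum-Cunge estimates in
+Eqs. (3.23)-(3.28) for a single reach is given below; all input values are
+hypothetical examples and this is not the model implementation:
+
+.. code-block:: python
+
+   import math
+
+   Bw, z, n, So = 5.0, 1.0, 0.035, 0.002   # trapezoidal channel properties
+   Tw = 12.0                               # top width before bankfull (m)
+   dx = 2000.0                             # reach length (m)
+   h = 0.6                                 # estimated water depth (m)
+   Q = 4.0                                 # estimated discharge (m^3/s)
+
+   Area = (Bw + z * h) * h                         # Eq. (3.23)
+   Wp = Bw + 2.0 * h * math.sqrt(1.0 + z * z)      # Eq. (3.24)
+   R = Area / Wp                                   # Eq. (3.25)
+
+   Ck = (math.sqrt(So) / n) * (
+       (5.0 / 3.0) * R ** (2.0 / 3.0)
+       - (2.0 / 3.0) * R ** (5.0 / 3.0)
+       * (2.0 * math.sqrt(1.0 + z * z) / (Bw + 2.0 * h * z)))   # Eq. (3.26)
+
+   K = dx / Ck                                     # Eq. (3.27), travel time (s)
+   X = 0.5 * (1.0 - Q / (Tw * Ck * So * dx))       # Eq. (3.28)
+   print(Ck, K, X)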
+
+.. rubric:: Relevant domain and parameter files/variables:
+
+- ``TOPOGRAPHY`` in :file:`Fulldom_hires.nc` - Terrain grid or Digital Elevation
+ Model (DEM). Note: this grid may be provided at resolutions equal to
+ or finer than the native land model resolution.
+
+- ``CHANNELGRID`` in :file:`Fulldom_hires.nc` - Channel network grid identifying
+ the location of stream channel grid cells
+
+- ``STREAMORDER`` in :file:`Fulldom_hires.nc` - Strahler stream order grid
+ identifying the stream order for all channel pixels within the
+ channel network.
+
+- ``FLOWDIRECTION`` in :file:`Fulldom_hires.nc` - Flow direction grid, which
+ explicitly defines flow directions along the channel network in
+ gridded routing. This variable dictates where water flows into
+ channels from the land surface as well as in the channel. This should
+ not be modified independently because it is tied to the DEM.
+
+- ``frxst_pts`` (optional) in :file:`Fulldom_hires.nc` - Forecast point grid, which
+ specified selected channel pixels for which channel discharge and
+ flow depth are to be output within a netcdf point file (CHANOBS)
+ and/or an ASCII timeseries file (frxstpts_out.txt).
+
+- :file:`CHANPARM.TBL` text or :file:`Route_Link.nc` netcdf file - Specifies channel
+ parameters by stream order (:file:`CHANPARM.TBL`, for gridded channel
+ routing) or individual reaches (``route_link_f``, for reach-based routing
+ methods)
+
+.. note:: Reach-based routing is highly sensitive to time step.
+
+3.6.3 Compound Channel (currently only functional in NWM)
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+In order to represent a simplification of the behavior of a flood wave
+when it exceeds the channel bank, a compound channel formulation was
+added to the NWM (i.e. only active when ``channel_option=2`` and ``udmp=1``).
+A visual representation is shown in Figure 3.9. When the depth of the
+flow exceeds bankfull (`d > d_b`), then the wave celerity is given
+as the weighted celerity of the trapezoidal flow and the overbank
+portion of flow. This weighting is based on the cross sectional area of
+each, and allows water to enter the conceptual compound channel, where
+the Manning's coefficient of the compound channel portion, `n_{cc}`,
+is assumed rougher than the channel `n` by an unknown factor,
+`n_{cc}`. Based on a set of sensitivity experiments described in
+*Read et al., (forthcoming)*, the default value for in NWMv2.0 and v2.1 is
+`n_{cc}=2n`, such that the floodplain roughness is twice that of
+the channel. The introduction of compound channel requires values for
+three more parameters: bankfull depth (`d_b`), top widths of the
+trapezoid and the compound channel, `T_w` and `T_{w\_cc}`,
+respectively. These parameters, in addition to `n_{cc}`, are defined
+in the Route_Link.nc file for the NWM. The default values in NWMv2.0 and
+v2.1 are defined as: (1) was determined using a published equation from
+*Blackburn-Lynch et al., 2017*, who gathered regional USGS estimations of
+channel parameters and developed coefficients to describe the
+relationship of drainage area (`DA`) to `T_w` and to channel area
+(`A`). The aggregated CONUS equation is: `T_w = 2.44(DA)^{0.34}`
+and `A = 0.75(DA)^{0.53}`. Given these, `d_b` is determined
+using the standard equation for a trapezoid. As a default value,
+`T_{w\_cc}` is a multiplier on `T_w`. Sensitivity experiments
+presented in *Read et al. (forthcoming)* found that `T_{w\_cc}=3*T_w`
+yielded the best streamflow performance, all else being equal.
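+
+A rough Python sketch of this default-parameter derivation is given below
+(illustrative only, not NWM source code). The side slope `z` and channel
+roughness `n` used here are placeholders, and backing out `d_b` from the
+trapezoid area relation is one interpretation of "the standard equation for
+a trapezoid".
+
+.. code-block:: python
+
+   import math
+
+   def compound_channel_defaults(DA, z=1.0, n=0.05):
+       """Default Tw, A, d_b, Tw_cc and n_cc from drainage area DA."""
+       Tw = 2.44 * DA**0.34              # top width regression (Blackburn-Lynch et al., 2017)
+       A = 0.75 * DA**0.53               # channel cross-sectional area regression
+       # Trapezoid area A = d_b*(Tw - z*d_b); solve the quadratic for d_b.
+       disc = Tw**2 - 4.0 * z * A
+       d_b = (Tw - math.sqrt(disc)) / (2.0 * z) if disc > 0 else A / Tw
+       Tw_cc = 3.0 * Tw                  # default compound-channel top width
+       n_cc = 2.0 * n                    # default floodplain roughness
+       return Tw, A, d_b, Tw_cc, n_cc
+
+   print(compound_channel_defaults(DA=250.0))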
+
+.. _figure3.9:
+.. figure:: media/trapezoid-compound-channel.png
+ :align: center
+ :scale: 50%
+
+ **Figure 3.9** Cross-sectional schematic of trapezoidal channel and compound
+ channel in National Water Model, where the dashed lines represent roughness
+ of the channel n, and of the compound channel, `n_{cc}`
+
+.. _section-3.7:
+
+3.7 Lake and Reservoir Routing Description
+------------------------------------------
+
+A simple mass balance, level-pool lake/reservoir routing module allows
+for an estimate of the inline impact of small and large reservoirs on
+hydrologic response. A lake/reservoir or series of lakes/reservoirs are
+identified in the channel routing network, and lake/reservoir storage
+and outflow are estimated using a level-pool routing scheme. The only
+conceptual difference between lakes and reservoirs as represented in
+WRF-Hydro is that reservoirs contain both orifice and weir outlets for
+reservoir discharge while lakes only contain weirs. Note that the user
+must adjust these parameters accordingly - the model makes no other
+distinction between a reservoir and a lake.
+
+Fluxes into a lake/reservoir object occur through the channel network
+and when surface overland flow intersects a lake object. Fluxes from
+lake/reservoir objects are made only through the channel network and no
+fluxes from lake/reservoir objects to the atmosphere or the land surface
+are currently represented (i.e. there is currently no lake evaporation
+or subsurface exchange between the land surface and lakes and
+reservoirs). The Level Pool scheme tracks water elevation changes over
+time, `h(t)` where water from the reservoir can exit either through weir
+overflow (`Q_w`) and/or a gate-controlled flow (`Q_o`), where
+these outflows are functions of the water elevation and spillway
+parameters. Weir flow is given as `Q_w(t) = C_wLh^{\frac{3}{2}}; (3.29)`
+when `h>h_{max}` or `Q_w(t) = 0.0` when `h \leq h_{max}`, where `h_{max}` is
+the maximum height before the weir begins to spill (`m`), `C_w` is a weir
+coefficient, and `L` is the length of the weir (`m`). Orifice flow is
+given as `Q_o(t) = C_oO_a\sqrt{2gh}; (3.30)` where `C_o` is the orifice
+coefficient, `O_a` is the orifice area (`m^2`), and `g` is the
+acceleration of gravity (`m/s^2`). In addition, the level pool scheme
+is designed to track each reservoir's surface area, `S_a` (`km^2`) as
+a function of water depth and the area at full storage, `A_s`
+(`km^2`). Presently, a lake/reservoir object is assumed to have
+vertical side walls, such that the surface area is always constant.
+
+.. _figure-3.10:
+.. figure:: media/level-pool.png
+ :align: center
+ :scale: 40%
+.. rst-class:: center
+.. math::
+
+ S_a(t) &= f(h, A_s) \\
+ Q_t(t) &= f(h)
+
+.. rst-class:: center
+
+ **Figure 3.10** Schematic of Level Pool Routing
+
+The following lake/reservoir parameters are required for level-pool
+routing and are defined in the :file:`LAKEPARM.nc` parameter file. The GIS
+pre-processing tool can make either of these files and the model will
+read the one specified in the :file:`hydro.namelist` file:
+
+ - Weir and Orifice Coefficients (`Co`, `Cw`)
+ - Weir Length, `L` (`m`)
+ - Orifice Area, `O_a` (`m^2`)
+ - Reservoir Area, `A_s` (`km^2`)
+ - Maximum reservoir height at full storage, `h_{max}` (`m`)
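+
+A minimal Python sketch of one level-pool update, using Equations 3.29 and
+3.30 and the parameters listed above, is given below (illustrative only; it
+is not the Fortran implementation and the parameter values are arbitrary).
+
+.. code-block:: python
+
+   import math
+
+   G = 9.81  # acceleration of gravity (m/s^2)
+
+   def level_pool_outflow(h, h_max, Cw, L, Co, Oa):
+       """Total outflow (m^3/s) for water elevation h, per Eqs. 3.29 and 3.30."""
+       Qw = Cw * L * h**1.5 if h > h_max else 0.0   # weir flow, Eq. 3.29
+       Qo = Co * Oa * math.sqrt(2.0 * G * h)        # orifice flow, Eq. 3.30
+       return Qw + Qo
+
+   def level_pool_step(h, Qin, dt, As_km2, **params):
+       """Advance elevation one step, assuming constant surface area (vertical walls)."""
+       Qout = level_pool_outflow(h, **params)
+       h_new = h + (Qin - Qout) * dt / (As_km2 * 1.0e6)   # simple mass balance
+       return h_new, Qout
+
+   h, Qout = level_pool_step(h=4.0, Qin=150.0, dt=300.0, As_km2=2.0,
+                             h_max=5.0, Cw=0.4, L=10.0, Co=0.1, Oa=1.5)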
+
+The lake/reservoir flow routing option is activated when lake objects
+are defined and properly indexed as a data field in the high resolution
+terrain routing grid file. If lake/reservoir objects are present in the
+lake grid (and also within the channel network) then routing through
+those objects will occur if the channel is active AND if ``channel_option
+= 3`` (gridded routing). There are several special requirements for the
+lake grid and channel routing grids when lakes/reservoirs are to be
+represented and these are discussed in Sections :ref:`5.4 <section-5.4>`
+and :ref:`5.6 <section-5.6>`.
+
+.. rubric:: Relevant code modules:
+
+:file:`Routing/module_channel_routing.F90`
+
+.. rubric:: Relevant namelist options:
+
+:file:`hydro.namelist`:
+
+- ``route_lake_f`` (optional) - Path to lake parameter file to support
+ level-pool reservoir routing methods.
+
+.. note:: As mentioned in the paragraph above, if in the
+ GIS-Preprocessing the user created a “gridded” routing stack for
+ ``channel_option = 3`` (i.e. did *not* select to create a
+ Route_Link.nc file for ``channel_option=1`` or ``=2``) AND specified
+ a lake file (user provided a reservoir/lake input file), then
+ the :file:`Fulldom_hires.nc` file will populate the ``LAKEGRID`` variable.
+ For this case, the user **must** specify the route_lake_f file.
+ To turn lakes “off” with ``channel_option=3``, create another set
+ of :file:`Fulldom_hires.nc` (“domain”) files without a reservoir input
+ file specified.
+
+.. rubric:: Relevant domain and parameter files/variables:
+
+- ``CHANNELGRID`` in :file:`Fulldom_hires.nc` - Channel network grid identifying
+ the location of stream channel grid cells
+
+- ``LAKEGRID`` in :file:`Fulldom_hires.nc` (optional) - Specifies lake locations on
+ the channel grid (for gridded channel routing methods, i.e.
+ ``channel_option=3``).
+
+- :file:`Route_Link.nc` netCDF file (optional) - Specifies lake associations
+ with channel reaches.
+
+- :file:`LAKEPARM.nc` netCDF file - Specifies lake parameters for each lake
+ object specified.
+
+.. _section-3.8:
+
+3.8 Conceptual base flow model description
+------------------------------------------
+
+Aquifer processes contributing baseflow often operate at depths well
+below ground surface. As such, there are often conceptual shortcomings
+in current land surface models in their representation of groundwater
+processes. Because these processes contribute to streamflow (typically
+as “baseflow”) a parameterization is often used in order to simulate
+total streamflow values that are comparable with observed streamflow
+from gauging stations. Therefore, a switch-activated baseflow module
+:file:`module_GW_baseflow.F90` has been created which conceptually
+(i.e. *not* physically-explicit) represents baseflow contributions to
+streamflow. This model option is particularly useful when WRF-Hydro is
+used for long-term streamflow simulation/prediction and baseflow or
+“low flow” processes must be properly accounted for. Besides potential
+calibration of the land surface model parameters the conceptual baseflow
+model does not directly impact the performance of the land surface model
+scheme. The new baseflow module is linked to WRF-Hydro through the
+discharge of “deep drainage” from the land surface soil column (sometimes
+referred to as “underground runoff”).
+
+The baseflow parameterization in WRF-Hydro uses spatially-aggregated
+drainage from the soil profile as recharge to a conceptual groundwater
+reservoir (:ref:`Fig. 3.11 <figure-3.11>`). The unit of spatial
+aggregation is often taken to be that of a catchment or sub-basin within
+a watershed. Each sub-basin has a groundwater reservoir “bucket” with a
+conceptual depth and associated conceptual volumetric capacity. The
+reservoir operates as a simple bucket where outflow (= “baseflow” or
+“stream inflow”) is estimated using an empirically-derived function of
+recharge. The functional type and parameters are determined empirically
+from offline tests using an estimation of baseflow from stream gauge
+observations and model-derived estimates of bucket recharge provided by
+WRF-Hydro. Presently, WRF-Hydro uses either a direct output-equals-input
+"pass-through" relationship or an exponential storage-discharge function
+for estimating the bucket discharge as a function of a conceptual depth
+of water in the bucket (the "exponential bucket"). Note that, because this is
+a highly conceptualized formulation, the depth of water in the bucket in
+no way reflects the actual depth of water in a real aquifer system.
+However, the volume of water that exists in the bucket needs to be
+tracked in order to maintain mass conservation. Estimated baseflow
+discharged from the bucket model is then combined with lateral inflow
+from overland routing (if active) and input directly into the stream
+network as channel inflow, as referred to above in Section
+:ref:`3.5 <section-3.5>`. Presently, the total basin
+baseflow flux to the stream network is equally distributed among all
+channel pixels within a basin for gridded channel routing options or
+dumped into the top of the reach to be routed downstream for reach-based
+methods. Lacking more specific information on regional groundwater
+basins, the groundwater/baseflow basins in WRF-Hydro are often assumed
+to match those of the surface topography. However, this is not a strict
+requirement. Buckets can be derived in a number of ways such as where
+true aquifers are defined or from a third-party hydrographic
+dataset such as the USGS NHDPlus or Hydrosheds.
+
+.. _figure-3.11:
+.. figure:: media/groundwater.svg
+ :align: center
+ :width: 70%
+
+ **Figure 3.11** Hypothetical map of groundwater/baseflow sub-basins
+ within a watershed and conceptualization of baseflow “bucket”
+ parameterization in WRF-Hydro.
+
+\
+ | `z = z_{previous} + \frac{Q_{in}*dt}{A}; \qquad (3.40)`
+ |
+ | **if** `z > z_{max}` :
+ | `z_{spill} = z - z_{max}`
+ | `z = z_{max}`
+ | `Q_{spill} = \frac{A*z_{spill}}{dt}`
+ |
+ | **else** :
+ | `Q_{spill} = 0`
+ |
+ | `Q_{exp} = C(e^{E\frac{z}{zmax}}-1)`
+ | `Q_{out} = Q_{spill} + Q_{exp}`
+ | `z = z - \frac{Q_{exp}*dt}{A}`
+ |
+ | **where** :
+ | `Q_{in}` is the inflow to the bucket aggregated from the bottom of the LSM in `m^3/s`
+ | `z` is the height of the water level in the bucket in `mm`
+ | `z_{max}` is the total height of the bucket in `mm`
+ | `A` is the area of the catchment or groundwater basin in `m^2`
+ | `E` is a unitless parameter
+ | `C` is a parameter with units of `m^3/s`
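+
+The pseudocode above translates directly into the following Python sketch
+(illustrative only; unit conversion between `z` in `mm` and `Q_{in}*dt/A` in
+`m` is omitted here, as it is in the pseudocode, and the parameter values in
+the example call are arbitrary).
+
+.. code-block:: python
+
+   import math
+
+   def bucket_step(z, Qin, dt, A, zmax, C, E):
+       """One groundwater-bucket update; returns new level z and outflow Qout."""
+       z = z + Qin * dt / A                    # add recharge from the LSM
+       if z > zmax:                            # bucket is full: spill the excess
+           z_spill = z - zmax
+           z = zmax
+           Q_spill = A * z_spill / dt
+       else:
+           Q_spill = 0.0
+       Q_exp = C * (math.exp(E * z / zmax) - 1.0)   # exponential storage-discharge
+       Q_out = Q_spill + Q_exp
+       z = z - Q_exp * dt / A                  # remove the exponential release
+       return z, Q_out
+
+   z, Qout = bucket_step(z=10.0, Qin=0.5, dt=3600.0, A=1.0e7, zmax=50.0, C=0.05, E=3.0)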
+
+A groundwater/baseflow bucket model parameter file (:file:`GWBUCKPARM.nc`)
+specifies the empirical parameters governing the behavior of the bucket
+model parameterization for each groundwater/baseflow basin specified
+within the model domain. These files are created by the WRF-Hydro GIS
+Preprocessing System and documented in :ref:`Appendix A10 <section-a10>`.
+The parameters include: the bucket model coefficient, the bucket model
+exponent, the initial depth of water in the bucket model, and the maximum
+storage in the bucket before "spilling" occurs.
+
+It is important to remember that a simple bucket model is a highly
+abstracted and conceptualized representation of groundwater processes
+and therefore the depth of water values in the bucket and the parameters
+themselves have no real physical basis. As mentioned above, initial
+values of the groundwater bucket model parameters are typically derived
+analytically or \'offline\' from WRF-Hydro and then are fine-tuned through
+model calibration.
+
+.. rubric:: Relevant code modules:
+
+:file:`Routing/module_GW_baseflow.F90`
+
+.. rubric:: Relevant namelist options:
+
+:file:`hydro.namelist`:
+
+- ``GWBASESWCRT`` - Switch to activate groundwater bucket module.
+
+- ``GWBUCKPARM_file`` - Path to groundwater bucket parameter file.
+
+- ``gwbasmskfil`` (optional) - Path to netcdf groundwater basin mask file
+ if using an explicit groundwater basin 2d grid.
+
+- ``UDMP_OPT`` (optional) - Switch to activate user-defined mapping between
+ land surface model grid and conceptual basins.
+
+- ``udmap_file`` (optional) - If user-defined mapping is active, path to
+ spatial-weights file.
+
+.. rubric:: Relevant domain and parameter files/variables:
+
+- :file:`GWBUCKPARM.nc` netCDF file - Specifies the parameters for each
+ groundwater bucket/basin. More information regarding the groundwater
+  bucket model parameters is provided in :ref:`Section 5.5 <section-5.5>` and
+  :ref:`Appendix A10 <section-a10>`.
+
+- :file:`GWBASINS.nc` netCDF file - The 2d grid of groundwater basin IDs.
+
+- :file:`spatialweights.nc` - netCDF file specifying the weights to map
+ between the land surface grid and the pre-defined groundwater basin
+ boundaries.
diff --git a/docs/userguide/nudging.rest b/docs/userguide/nudging.rest
new file mode 100644
index 000000000..6a031251b
--- /dev/null
+++ b/docs/userguide/nudging.rest
@@ -0,0 +1,356 @@
+.. vim: syntax=rst
+.. include:: meta.rest
+
+4. Streamflow Nudging Data Assimilation
+=======================================
+
+This chapter describes streamflow nudging and data assimilation in
+Version 5.0 and beyond of WRF-Hydro. Streamflow nudging was introduced
+in v1.1 of the National Water Model (NWM). The community WRF-Hydro model
+source code and the NWM source code have merged as of Version 5.0 of
+WRF-Hydro. See Appendix :ref:`A15 <section-a15>` for more
+information on the NWM.
+
+4.1 Streamflow Nudging Data Assimilation Overview
+-------------------------------------------------
+
+For the National Water Model (NWM), a simple nudging data assimilation
+(DA) scheme has been developed to correct modeled stream flows to (or
+towards) observed values. The capability is only currently supported
+under the NWM configuration, but could be extended to NCAR reach-based
+routing, and potentially other kinds of routing, in the future.
+Specifically, the nudging capability introduces an interface for stream
+discharge observations to be applied to the Muskingum-Cunge streamflow
+routing solution.
+
+.. _section-4.2:
+
+4.2 Nudging Formulation
+-----------------------
+
+There are several motivations for performing DA. For the NWM analysis
+and assimilation cycle, the motivation is to improve model simulation
+and forecast initial conditions. Nudging is a simple and computationally
+inexpensive method of data assimilation where an observed state is
+inserted into the model with some uncertainty. When the observed value
+is inserted into the model without uncertainty, the method is referred
+to as “direct insertion”.
+
+Nudging works well locally to observations, both in space and time. Away
+from observations, in space and time, the method has limited success.
+For example, our application applies nudging data assimilation on a
+channel network with the advantage that the corrections are propagated
+downstream with the network flow. However, if no spatial or temporal
+smoothing of the corrections are included with the nudging method,
+upstream errors soon propagate past observed points when in the forecast
+(away from the observations, into the future). Various assumptions can
+be made to smooth the nudge (or correction) in space and/or time but
+these are highly parameterized and require tuning. In the NWM we have
+avoided spatial smoothing and have opted to pursue a very limited
+temporal-interpolation approach.
+
+The basic nudging equation solves for the nudge `e_j` on a spatial
+element `j`,
+
+.. _equation-4.1:
+.. rst-class:: center
+.. math::
+   e_{j} = \frac{\sum_{n=1}^{N_{j}}{q_{n}\, w_{n}^{2}(j,t)\, \left( Q_{n} - {\widehat{Q}}_{n} \right)}}{\sum_{n=1}^{N_{j}}{w_{n}^{2}(j,t)}}
+   \qquad (4.1)
+
+The numerator is the sum, over the `N_j` observations affecting element
+`j`, of the product of each observation's quality coefficient, `q_n`,
+the model error, :math:`Q_{n} - {\widehat{Q}}_{n}`, and the squared
+weights. The weights are where most of the action happens.
+
+The weights determine how the nudge is interpolated in both space and
+time (`j,t`). The weights term `w_n(j,t)` in the above equation is
+solved for observation `n` as a function of both space, `j`, and time,
+`t`. It is expressed as the product of separate spatial and temporal
+weight terms:
+
+.. _equation-4.2:
+.. rst-class:: center
+.. math::
+ w_{n}(j,t) = w_{n_{s}}(j)\ *\ w_{n_{t}}(t,j)
+ \qquad (4.2)
+
+The temporal weight term takes the following piecewise form in our
+application:
+
+.. _equation-4.3:
+.. rst-class:: center
+.. math::
+ w_{n_t}(t,j) = w_{n_s} \begin{cases}
+ 10^{10} * (1/10)^{\frac{\left| t-\widehat{t} \right|}{tau_j / 10}} &:\text{if} \ \left| t-\widehat{t} \right| \leq tau_j \\
+ e^{-a_j*(t-\widehat{t})} &:\text{if} \ \left| t-\widehat{t} \right| \gt tau_j
+ \end{cases} \qquad (4.3)
+
+The spatial weight term is of the following form:
+
+.. _equation-4.4:
+.. rst-class:: center
+.. math::
+ w_{n_s} = \begin{cases}
+ \frac{R_{n}^2 - d_{jn}^{2}}{R_{n}^2 + d_{jn}^2} &: \text{if} R_n > d_{jn} \\
+ 0 &: \text{otherwise}
+ \end{cases} \qquad (4.4)
+
+
+The parameters specified in version 1.2 of the NWM (equivalent to this
+WRF-Hydro version 5) are:
+
+.. rst-class:: center
+
+ | *tau = 15 minutes*
+ | *a = 120 minutes*
+ | *R = 0.25 meters*
+
+for all gages (all `j`) in CONUS (the parameter files are discussed
+below). The very short `R` means that nudging is applied locally, only
+to the reach where the observation is associated. There is currently no
+spatial smoothing. This is partly because spatial smoothing is assumed
+to be computationally intensive and has not been completely implemented.
+The `tau = 15` means that within 15 minutes of an observation we are
+heavily weighting the observation and `a = 120` means that nudging to
+the observation relaxes with e-folding time of two hours moving away
+from the observation.
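+
+A small Python sketch of how Equations 4.1, 4.2 and 4.4 combine is given
+below (illustrative only, not the Fortran nudging code); the temporal
+weights are supplied by the caller and the input values are arbitrary.
+
+.. code-block:: python
+
+   def spatial_weight(R, d):
+       """Eq. 4.4: weight of an observation at along-network distance d, radius R."""
+       return (R**2 - d**2) / (R**2 + d**2) if R > d else 0.0
+
+   def nudge(obs, model, quality, w_spatial, w_temporal):
+       """Eq. 4.1, with w_n = w_spatial * w_temporal per Eq. 4.2."""
+       num = den = 0.0
+       for Q_obs, Q_mod, q, ws, wt in zip(obs, model, quality, w_spatial, w_temporal):
+           w2 = (ws * wt)**2
+           num += q * w2 * (Q_obs - Q_mod)
+           den += w2
+       return num / den if den > 0.0 else 0.0
+
+   e_j = nudge(obs=[12.0], model=[10.5], quality=[1.0],
+               w_spatial=[spatial_weight(R=0.25, d=0.0)], w_temporal=[1.0])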
+
+The Muskingum-Cunge equation in Section 3.6.2 has the form:
+
+.. _equation-4.5:
+.. rst-class:: center
+.. math::
+ Q_{d}^{c} = C1{\,Q}_{u}^{p} + C2\,Q_{u}^{c} + {C3\,Q}_{d}^{p} + \left( \frac{q_{l}\,dt}{D} \right)
+ \qquad (4.5)
+
+In V1.0 of the NWM, nudging was applied into this equation in the
+following manner
+
+.. _equation-4.6:
+.. rst-class:: center
+.. math::
+ Q_{d}^{c} = C1{\,Q}_{u}^{p} + C2\,Q_{u}^{c} + {C3\,(Q}_{d}^{p} + N_{d}^{p}) + \left( \frac{q_{l}\,dt}{D} \right)
+ \qquad (4.6)
+
+where the discharge solution (`Q`) at the current time (`c`) at the
+downstream (`d`) reach was solved by applying the nudge from the
+previous timestep (`N_{d}^{p}`) to adjust the discharge of
+downstream reach at the previous (`p`) time. Experiments indicated
+undesirable side effects of introducing a discontinuity (the previous
+nudge) in discharge between the upstream (`u`) link and the downstream
+(`d`) link in this solution. With v1.2 of the NWM (equivalent to v5.0
+WRF-Hydro), the equation was modified to include the nudge in the
+upstream terms of the solution as well, at both the previous and current
+times:
+
+.. _equation-4.7:
+.. rst-class:: center
+.. math::
+ Q_{d}^{c} = C1{(Q}_{u}^{p} + N_{d}^{p}) + C2{(Q}_{u}^{c} + N_{d}^{p}) + {C3(Q}_{d}^{p} + N_{d}^{p}) + \left( \frac{q_{l}\,dt}{D} \right)
+ \qquad (4.7)
+
+This is the form of the equation currently used for nudging, which aims
+to reduce the discontinuity in space between the upstream and downstream
+reaches. Experiments revealed that this formulation significantly
+reduced the difference between modeled and observed discharge and hence
+the nudging magnitudes (over long time series of assimilated
+observations). Note that the nudge is only applied to the upstream reach
+during the solution of the downstream reach and is not applied in the
+output values of the upstream reach.
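+
+The difference between Equations 4.6 and 4.7 can be seen in the following
+Python sketch (illustrative only); `C1`, `C2`, `C3` and the lateral inflow
+term follow the Muskingum-Cunge form of Section 3.6.2, and the values in the
+example call are arbitrary.
+
+.. code-block:: python
+
+   def mc_with_nudge(C1, C2, C3, Qu_prev, Qu_curr, Qd_prev, N_prev, ql, dt, D):
+       """Compare the V1.0 (Eq. 4.6) and V1.2 (Eq. 4.7) placement of the nudge."""
+       lateral = ql * dt / D
+       Qd_eq46 = C1 * Qu_prev + C2 * Qu_curr + C3 * (Qd_prev + N_prev) + lateral
+       Qd_eq47 = (C1 * (Qu_prev + N_prev) + C2 * (Qu_curr + N_prev)
+                  + C3 * (Qd_prev + N_prev) + lateral)
+       return Qd_eq46, Qd_eq47
+
+   print(mc_with_nudge(C1=0.3, C2=0.4, C3=0.3, Qu_prev=10.0, Qu_curr=11.0,
+                       Qd_prev=9.5, N_prev=0.8, ql=0.001, dt=300.0, D=2000.0))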
+
+This change in the nudging formulation also promotes the previous
+downstream nudge to a prognostic variable. Whereas
+`Q_{d}^{p} + N_{d}^{p}` was simply the previous downstream
+streamflow value after the nudge (already a prognostic model variable),
+adding the previous downstream nudge to the upstream solutions requires
+having the previous nudge available. Therefore, the previous downstream
+nudge values get written to the nudgingLastObs files (described in
+:ref:`Section 4.3 <section-4.3>`), which are the restart files for
+the nudging routine.
+
+There are a variety of experimental nudging options and features in the
+nudging section of the hydro.namelist which are incomplete or unused at
+this time. There are also nudging features used in a limited capacity by
+the NWM which are not described here. As development of these options
+evolves, they will be documented in future versions of WRF-Hydro.
+
+.. _section-4.3:
+
+4.3 Nudging Workflow
+--------------------
+
+Figure 4.1 provides an overview of the nudging workflow at the file
+level. Descriptions are provided for each of the files shown in the
+diagram.
+
+.. figure:: media/nudging-workflow.svg
+ :align: center
+ :scale: 125%
+
+ **Figure 4.1:** The nudging workflow at the file level.
+
+4.3.1 Input Files
+~~~~~~~~~~~~~~~~~
+
+.. rubric:: Discharge observations and :file:`nudgingTimeSliceObs/` :
+
+Discharge observations from the real world enter the WRF-Hydro system through the
+:file:`nudgingTimeSliceObs/` directory.
+
+The individual observation files used for streamflow nudging are stored
+in this directory, each with the the following naming convention
+:file:`YYYY-mm-dd_HH:MM:SS.RRmin.usgsTimeSlice.ncdf`.
+
+The first part of the filename, ``YYYY-mm-dd_HH:MM:SS``, identifies the
+center of the “slice of time” in which observations are found (from year
+down to second). ``RR`` indicates the resolution, or total width of the time
+slice. Currently this resolution is a **hard-coded** parameter in the model.
+It is set to 15 minutes as this is the most common reporting frequency
+for USGS gages in the USA. The “usgsTimeSlice” part of the filename is
+fixed and is legacy of the fact that these files were originally
+designed for USGS observations. However, any discharge observations can
+be placed into this format.
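+
+As a simple illustration, the naming convention can be reproduced with
+Python's datetime formatting (assuming the default 15-minute resolution):
+
+.. code-block:: python
+
+   from datetime import datetime
+
+   def timeslice_filename(center_time, resolution_min=15):
+       """Build a YYYY-mm-dd_HH:MM:SS.RRmin.usgsTimeSlice.ncdf file name."""
+       stamp = center_time.strftime("%Y-%m-%d_%H:%M:%S")
+       return "{0}.{1}min.usgsTimeSlice.ncdf".format(stamp, resolution_min)
+
+   print(timeslice_filename(datetime(2018, 6, 1, 12, 0, 0)))
+   # 2018-06-01_12:00:00.15min.usgsTimeSlice.ncdf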
+
+The format of an example timeslice observation file is given by an
+:program:`ncdump -h` in :ref:`Figure A15.1 ` in
+:ref:`Appendix A15 `. Of the three-dimensional variables,
+two are character lengths and only the ``stationIdInd`` dimension is
+variable (unlimited). The ``stationIdInd`` variable has dimensions of the
+number of individual stream gages contained in the file by the fixed
+width of the strings (``stationIdStrLen=15``). The variable metadata
+describes the contents. The ``stationId`` variable is the “USGS station
+identifier of length 15”. While the character type of the variable and
+the length of 15 are both fixed, identifiers are not limited to those
+used by the USGS. Custom identifiers can be used and are described later
+in this section when the gages variable in the :file:`Route_Link.nc` file is
+described. The variable ``discharge_quality`` is simply a multiplier. This
+value is stored as a short integer for space concerns and only takes
+values from zero to one hundred. Internally in the model, this variable
+is divided by 100 and used as a floating point variable between zero and
+one. The ``queryTime`` variable is not used by the model and is optional. It
+may be useful in situations when the files are updated in real-time.
+Similarly, the metadata field ``fileUpdateTimeUTC`` can be useful but is not
+required by the model. The remaining two metadata fields are both
+required by the model: ``sliceCenterTimeUTC`` and ``sliceTimeResolutionMinutes``
+ensure that the file and the model are operating under the same time
+location and resolution assumptions. An example of generating timeslice
+files from USGS observations using the R language is given in the help
+for the rwrfhydro function :command:`WriteNcTimeSlice`.
+
+.. rubric:: :file:`hydro.namelist`:
+
+When WRF-Hydro is compiled with nudging on, the :file:`hydro.namelist`
+file is required to contain ``&NUDGING_nlist``. The nudging namelist is
+found at the bottom of the :file:`hydro.namelist` file either in the :file:`Run/`
+directory after compilation or in the :file:`template/HYDRO/` directory.
+The namelist governs the run-time options to the nudging code. These run-time
+options are detailed in :ref:`Section 4.5 <section-4.5>` below and in
+:ref:`Appendix A5 <section-a5>`.
+
+.. rubric:: :file:`Route_Link.nc`:
+
+Collocation of streamflow gages and reach elements is achieved
+by the ``gages`` field in the :file:`Route_Link.nc` file (see Sections
+:ref:`3.6 <section-3.6>` and :ref:`5.4 <section-5.4>`). Each reach
+element may have a single gage identified with it, as specified by a
+fixed-width, 15-character string in the ``gages`` field. A blank entry
+indicates no gage is present on the reach. The ``gages`` field in
+:file:`Route_Link.nc` tells the nudging module where to apply the observed
+values to the stream network. Gages which appear in the observation
+files but not in the :file:`Route_Link.nc` file do not cause a problem; they are
+simply skipped and their identifiers collected and printed to the
+:file:`NWIS_gages_not_in_RLAndParams.txt` file, described later. The number of
+non-blank Route_Link gages must match the number of gages supplied in the
+nudging parameters file (described below).
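+
+As a simple illustration, a ``gages`` entry can be prepared in Python as a
+fixed-width, 15-character string (the right-justified padding shown here is
+an assumption about the convention, not a requirement stated above):
+
+.. code-block:: python
+
+   def gage_entry(gage_id=None, width=15):
+       """Return a fixed-width gage string; blank when no gage is on the reach."""
+       return (gage_id or "").rjust(width)[:width]
+
+   print(repr(gage_entry("01234567")))   # '       01234567'
+   print(repr(gage_entry()))             # fifteen blanks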
+
+.. rubric:: :file:`nudgingParams.nc`:
+
+:ref:`Figure A15.2 ` in Appendix :ref:`A15 ` shows
+the structure of the :file:`nudgingParams.nc` file for a small domain. Some
+of the parameters in the file are explained in detail in Section
+:ref:`4.2 ` and some are placeholders for capabilities
+which have not been developed.
+
+4.3.2 Output Files
+~~~~~~~~~~~~~~~~~~
+
+When the code is compiled with the nudging compile-time option on (see
+next section), four types of output files contain nudging information.
+Some files are different than when compiled without nudging while other
+files are unique outputs for the nudging option.
+
+.. rubric:: :file:`YYYYmmddHHMM.CHRTOUT_DOMAIN1`:
+
+The nudging code affects the values written to the “CHRTOUT” files.
+If valid observations are available, the (modeled) streamflow variable is
+changed by the assimilated observations. When the model is compiled to enable
+nudging, the variable ``nudge`` also appears in the file. The nudge value is
+calculated as in Section :ref:`4.2 `.
+
+.. rubric:: :file:`nudgingLastObs.YYYY-mm-dd_HH:MM:SS.nc`:
+
+These files are unique to the nudging compile and are the restart files
+for the nudging component of the model. A restart file is not required;
+nudging can be run from a cold start. This file can contain multiple variables;
+only the ``nudge`` variable is described in this documentation.
+
+.. rubric:: :file:`NWIS_gages_not_in_RLAndParams.txt`:
+
+These files are unique to nudging and report the unique gages found in the
+observation time slice files which were not present in the :file:`Route_Link.nc`
+file. These are gages which may have the opportunity to be assimilated (provided
+they could be located in the domain). There is only one such file per run, written
+at the end of the run.
+
+.. rubric:: Standard output and :file:`hydro_diag.*` files:
+
+The nudging routines write various messages. The ultimate destination of these
+can be compiler dependent. The nudging code aims to preface all its messages with ``Ndg:``
+and all its warnings with ``Ndg: WARNING:``.
+
+4.4 Nudging compile-time option
+-------------------------------
+
+The nudging capability is only available when the code is compiled with
+the environment variable set:
+
+``WRF_HYDRO_NUDGING=1``
+
+.. _section-4.5:
+
+4.5 Nudging run-time options
+----------------------------
+
+:ref:`Appendix A5 <section-a5>` presents an annotated :file:`hydro.namelist`
+file. There are two Fortran namelists contained within that file. The nudging
+run-time options are contained in the ``&NUDGING_nlist``, which is the second
+namelist in the document. Only some run-time options listed in the namelist are
+documented at this time.
+
++-----------------------------------+---------------------------------------+
+| **Documented/Supported Options** | **Undocumented/Unsupported Options** |
++===================================+=======================================+
+| ``timeSlicePath`` | ``nLastObs`` |
+| | |
+| ``nudgingParamFile`` | ``persistBias`` |
+| | |
+| ``nudgingLastObsFile`` | ``biasWindowBeforeT0`` |
+| | |
+| ``readTimesliceParallel`` | ``maxAgePairsBiasPersist`` |
+| | |
+| ``temporalPersistence`` | ``minNumPairsBiasPersist`` |
+| | |
+| | ``invDistTimeWeightBias`` |
+| | |
+| | ``noConstInterfBias`` |
++-----------------------------------+---------------------------------------+
+
+Details on the meaning and usage of the options are given in
+:ref:`Appendix A5 <section-a5>`, in both the comments which are part of the
+namelist itself and the blue annotations added to the namelists. The supported
+options are fairly straightforward in their usage. It is worth noting that the
+specification of ``nudgingLastObsFile`` does not behave the same way as
+the specification of the LSM or hydro restart files. The unsupported
+nudging options mostly have to do with experimental methods for forecast
+bias correction which have been investigated.
diff --git a/docs/userguide/other/Makefile b/docs/userguide/other/Makefile
new file mode 100644
index 000000000..ba8f621bb
--- /dev/null
+++ b/docs/userguide/other/Makefile
@@ -0,0 +1,76 @@
+SPHINXOPTS = -c .
+SPHINXBLD = sphinx-build
+SPHINXPROJ = wrfhydro
+DOCDIR = doc/
+DOCBACK = ../
+DCXFROMDOC = ../
+BLDDIR = build/doc/
+STPLS = $(wildcard $(DOCDIR)*.stpl)
+STPLTGTS = $(STPLS:%.stpl=%)
+SRCS = $(filter-out $(DOCDIR)index.rest,$(wildcard $(DOCDIR)*.rest))
+SRCSTPL = $(wildcard $(DOCDIR)*.rest.stpl)
+IMGS = \
+ $(wildcard $(DOCDIR)*.pyg)\
+ $(wildcard $(DOCDIR)*.eps)\
+ $(wildcard $(DOCDIR)*.tikz)\
+ $(wildcard $(DOCDIR)*.svg)\
+ $(wildcard $(DOCDIR)*.uml)\
+ $(wildcard $(DOCDIR)*.dot)\
+ $(wildcard $(DOCDIR)*.eps.stpl)\
+ $(wildcard $(DOCDIR)*.tikz.stpl)\
+ $(wildcard $(DOCDIR)*.svg.stpl)\
+ $(wildcard $(DOCDIR)*.uml.stpl)\
+ $(wildcard $(DOCDIR)*.dot.stpl)
+PNGS=$(subst $(DOCDIR),$(DOCDIR)_images/,\
+ $(patsubst %.eps,%.png,\
+ $(patsubst %.pyg,%.png,\
+ $(patsubst %.tikz,%.png,\
+ $(patsubst %.svg,%.png,\
+ $(patsubst %.uml,%.png,\
+ $(patsubst %.dot,%.png,\
+ $(patsubst %.eps.stpl,%.png,\
+ $(patsubst %.dot.stpl,%.png,\
+ $(patsubst %.tikz.stpl,%.png,\
+ $(patsubst %.svg.stpl,%.png,\
+ $(patsubst %.uml.stpl,%.png,$(IMGS)))))))))))))
+DOCXS = $(subst $(DOCDIR),$(BLDDIR)docx/,$(SRCS:%.rest=%.docx))\
+ $(subst $(DOCDIR),$(BLDDIR)docx/,$(SRCSTPL:%.rest.stpl=%.docx))
+PDFS = $(subst $(DOCDIR),$(BLDDIR)pdf/,$(SRCS:%.rest=%.pdf))\
+ $(subst $(DOCDIR),$(BLDDIR)pdf/,$(SRCSTPL:%.rest.stpl=%.pdf))
+.PHONY: docx help Makefile docxdir pdfdir stpl index imgs
+stpl: $(STPLTGTS)
+%:%.stpl
+ @cd $(DOCDIR) && stpl "$(/conf.py``
+or opposite to the extension of the included ``_links_sphinx.r?st`` file:
+
+- if you have ``.. include:: /_links_sphinx.rest``,
+ then the main file extension is ``.rst``
+
+``rstdoc`` creates documentation (PDF, HTML, DOCX)
+from restructuredText (``.rst``, ``.rest``) using either
+
+- `Pandoc `__
+- `Sphinx `__
+- Docutils
+ `configurable `__
+
+``rstdoc`` and ``rstdcx`` command line tools call ``dcx.py``,
+which
+
+- creates ``.tags`` to jump around with the editor
+
+- handles `.stpl `__ files
+
+- processes ``gen`` files (see examples produced by --rest)
+
+- creates links files (``_links_docx.r?st``, ``_links_sphinx.r?st``, ...)
+
+- forwards known files to either Pandoc, Sphinx or Docutils
+
+See example at the end of ``dcx.py``.
+It is supposed to be used with a build tool.
+``make`` and ``waf`` examples are included.
+
+- Initialize example tree (add ``--rstrest`` to make ``.rst`` main and ``.rest`` included files)::
+
+ $ ./dcx.py --rest repo #repo/doc/{sy,ra,sr,dd,tp}.rest files OR
+ $ ./dcx.py --stpl repo #repo/doc/{sy,ra,sr,dd,tp}.rest.stpl files
+ $ ./dcx.py --ipdt repo #repo/pdt/AAA/{i,p,d,t}.rest.stpl files
+ $ ./dcx.py --over repo #.rest all over
+
+- Only create .tags and ``_links_xxx.r?st``::
+
+ $ cd repo
+ $ rstdoc
+
+- Create the docs (and .tags and ``_links_xxx.r?st``) with **make**::
+
+ $ make html #OR
+ $ make epub #OR
+ $ make latex #OR
+ $ make docx #OR
+ $ make pdf
+
+ The latter two are done by Pandoc, the others by Sphinx.
+
+- Create the docs (and .tags and ``_links_xxx.r?st``) with
+ `waf `__:
+
+ Instead of using ``make`` one can load ``dcx.py`` (``rstdoc.dcx``) in
+ `waf `__.
+ ``waf`` also considers all recursively included files,
+ such that a change in any of them results in a rebuild.
+ All files can have an additional ``.stpl`` extension to use
+ `SimpleTemplate `__.
+
+ $ waf configure #also copies the latest version of waf in here
+ $ waf --docs docx,sphinx_html,rst_odt
+ $ #or you provide --docs during configure to always compile the docs
+
+ - ``rst_xxx``: via
+ `rst2xxx.py `__
+ - ``sphinx_xxx``: via `Sphinx `__ and
+ - just ``xxx``: via `Pandoc `__.
+
+
+The following image language files should be parallel to the ``.r?st`` files.
+They are automatically converted to ``.png``
+and placed into ``./_images`` or ``/_images`` or else parallel to the file.
+
+- ``.tikz`` or ``.tikz.stpl``.
+ This needs LaTex.
+
+- `.svg `__ or ``.svg.stpl``
+
+- ``.dot`` or ``.dot.stpl``
+
+ This needs `graphviz `__.
+
+- `.uml `__ or ``.uml.stpl``
+
+ This needs `plantuml `__ .
+ Provide either
+
+ - ``plantuml.bat`` with e.g. ``java -jar "%~dp0plantuml.jar" %*`` or
+ - ``plantuml`` sh script with
+ ``java -jar `dirname $BASH_SOURCE`/plantuml.jar "$@"``
+
+- ``.eps`` or ``.eps.stpl`` embedded postscript files.
+
+ This needs `inkscape `__.
+
+- ``.pyg`` contains python code that produces a graphic.
+ If the python code defines a ``to_svg`` or a ``save_to_png`` function,
+ then that is used, to create a png.
+  then that is used to create a png.
+
+ - ``pyx.canvas.canvas`` from the
+ `pyx `__ library or
+ - ``cairocffi.Surface`` from
+ `cairocffi `__
+ - `matplotlib `__.
+ If ``matplotlib.pyplot.get_fignums()>1``
+ the figures result in ``.png``
+
+ The same code or the file names can be used in a ``.r?st.stpl`` file
+ with ``pngembed()`` or ``dcx.svgembed()`` to embed in html output.
+
+ ::
+
+ {{!svgembed("egpyx.pyg",outinfo)}}
+ <%
+ ansvg=svgembed('''
+ from svgwrite import cm, mm, drawing
+ d=drawing.Drawing(viewBox=('0 0 300 300'))
+ d.add(d.circle(center=(2*cm, 2*cm), r='1cm', stroke='blue', stroke_width=9))
+ '''.splitlines(),outinfo)
+ %>
+ {{!ansvg}}
+
+
+Conventions
+-----------
+
+Files
+
+ - main files and included files: ``.rest``, ``.rst`` or vice versa.
+ ``.txt`` are for literally included files (use :literal: option).
+ - templates separately rendered : ``*.rest.stpl`` and ``*.rst.stpl``
+ template included: ``*.rst.tpl``
+ Template lookup is done in
+ ``.`` and ``..`` with respect to the current file.
+
+ - with ``%include('some.rst.tpl', param="test")`` with optional parameters
+ - with ``%globals().update(include('utility.rst.tpl'))``
+ if it contains only definitions
+
+Links
+
+- ``.. _`id`:`` are *reST targets*.
+ reST targets should not be template-generated.
+ The template files should have a higher or equal number of targets
+ than the generated file,
+ in order for tags to jump to the template original.
+ If one wants to generate reST targets,
+ then this should better happen in a previous step,
+ e.g. with ``gen`` files mentioned above.
+
+- References use replacement
+ `substitutions `__:
+ ``|id|``.
+
+- If you want an overview of the linking (traceability),
+ add ``.. include:: _traceability_file.rst``
+ to ``index.rest`` or another ``.rest`` parallel to it.
+ It is there in the example project, to include it in tests.
+ ``_traceability_file.{svg,png,rst}`` are all in the same directory.
+
+Link files are created in link roots, which are folders where the first main file
+(``.rest`` or ``.rst``) is encountered during depth-first traversal.
+Non-overlapping link root paths produce separately linked file sets.
+
+``.. include:: /_links_sphinx.r?st``, with the one initial ``/``
+instead of a relative or absolute path,
+will automatically search upward for the ``_links_xxx.r?st`` file
+(``_sphinx`` is replaced by what is needed by the wanted target when the docs are generated).
+
+Sphinx ``conf.py`` is augmented by configuration for Pandoc and Docutils.
+It should be where the input file is, or better at the project root
+to be usable with `waf `__.
+
+See the example project created with ``--rest/stpl/ipdt/over``
+and the sources of the documentation of
+`rstdoc `__.
+
+
+"""
+
+'''
+API
+---
+
+.. code-block:: py
+
+ import rstdoc.dcx as dcx
+
+
+The functions in ``dcx.py``
+are available to the ``gen_xxx(lns,**kw)`` functions (|dhy|).
+
+'''
+
+try:
+ import svgwrite.drawing
+except:
+ svgwrite = None
+
+try:
+ import pyfca
+except:
+ pyfca = None
+
+try:
+ import cairocffi
+ import cairosvg
+except:
+ cairocffi = None
+ cairosvg = None
+
+try:
+ import sphinx_bootstrap_theme
+ html_theme_path = sphinx_bootstrap_theme.get_html_theme_path()
+except:
+ html_theme_path = ''
+
+
+class RstDocError(Exception):
+ pass
+
+
+_plus = '+'
+_indent = ' '
+_indent_text = lambda txt: '\n'.join(_indent+x for x in txt.splitlines())
+
+
+class _ToolRunner:
+ def svg2png(self, *args, **kwargs):
+ try:
+ cairosvg.svg2png(*args, **kwargs)
+ except Exception as e:
+ print('Trying inkscape for ',kwargs['write_to'])
+ fn = normjoin(tempdir(),'svg.svg')
+ with open(fn,'w',encoding='utf-8') as f:
+ f.write(kwargs['bytestring'])
+ run_inkscape(fn, kwargs['write_to'],
+ dpi=kwargs.get('DPI', DPI))
+
+ def run(self, *args, **kwargs):
+ if 'outfile' in kwargs:
+ del kwargs['outfile']
+ return sp.run(*args, **kwargs)
+
+
+_toolrunner = _ToolRunner()
+
+def opnwrite(filename):
+ return open(filename, 'w', encoding='utf-8', newline='\n')
+
+def opnappend(filename):
+ return open(filename, 'a', encoding='utf-8', newline='\n')
+
+def opn(filename):
+ return open(filename, encoding='utf-8')
+
+
+isfile = os.path.isfile
+isdir = os.path.isdir
+islink = os.path.islink
+
+def abspath(x):
+ return os.path.abspath(x).replace('\\', '/')
+
+
+isabs = os.path.isabs
+
+
+def relpath(x, start=None):
+ try:
+ return os.path.relpath(x, start=start).replace('\\', '/')
+ except ValueError:
+ return abspath(x)
+
+def dirname(x):
+ return os.path.dirname(x).replace('\\', '/')
+
+base = os.path.basename
+
+def dir_base(x):
+ return [e.replace('\\', '/') for e in os.path.split(x)]
+
+
+def stem(x):
+ return os.path.splitext(x)[0].replace('\\', '/')
+
+
+def stem_ext(x):
+ return [e.replace('\\', '/') for e in os.path.splitext(x)]
+
+
+exists = os.path.exists
+
+
+def cwd():
+ return os.getcwd().replace('\\', '/')
+
+
+mkdir = partial(os.makedirs, exist_ok=True)
+cd = os.chdir
+cp = shutil.copy
+
+
+def ls(x='.'):
+ return [e for e in sorted(os.listdir(x))]
+
+
+def rmrf(x):
+ try:
+ if isdir(x):
+ shutil.rmtree(x)
+ else:
+ os.remove(x)
+ except:
+ pass
+
+
+def filenewer(infile, outfile):
+ res = True
+ try:
+ res = os.path.getmtime(infile) > os.path.getmtime(outfile)
+ except:
+ pass
+ return res
+
+
+def normjoin(*x):
+ return os.path.normpath(os.path.join(*x)).replace("\\", "/")
+
+
+def updir(fn):
+ return normjoin(dirname(fn), '..', base(fn))
+
+
+# fn='x/y/../y/a.b'
+# updir(fn) # x\a.b
+# updir('a.b') # ..\a.b
+# updir('a.b/a.b') # a.b
+# normjoin(fn) # x\y\a.b
+
+def is_project_root_file(filename):
+ '''
+ Identifies the root of the project by a file name contained there.
+
+ '''
+ return filename=='.git' or filename=='waf' or filename=='Makefile' or filename.lower().startswith('readme')
+
+
+'''
+Used for png creation.
+'''
+DPI = 600
+
+# text files
+_stpl = '.stpl'
+_tpl = '.tpl'
+_rest = '.rest'
+_rst = '.rst'
+_txt = '.txt'
+
+_grepinc = lambda inc: grep(r'^\.\. include:: .*_links_sphinx'+inc+'$',
+ exts=set(['.rst','.rest','.stpl','.tpl']))
+
+#a_rest is the main extension (default .rest)
+def _set_rstrest(a_rest):
+ global _rest
+ global _rst
+ assert isinstance(a_rest,str)
+ _rest = a_rest
+ _rst = '.rest' if _rest=='.rst' else '.rst'
+
+def _get_rstrest(config=None):
+ if config is None:
+ config = conf_py(cwd())
+ if 'source_suffix' not in config:
+ try:
+ next(_grepinc('.rest'))
+            _set_rstrest('.rst') #found included .rest, so main .rst
+ if '__file__' in config:
+ with opnappend(config['__file__']) as f:
+ f.write('\nsource_suffix = "%s"'%_rest)
+ else:
+ conf_file = up_dir(is_project_root_file)
+ if conf_file:
+ conf_file = normjoin(conf_file,'conf.py')
+ if not exists(conf_file):
+ config['__file__'] = conf_file
+ with opnwrite(conf_file) as f:
+ f.write('\nsource_suffix = "%s"'%_rest)
+ return _rest
+ except:
+ pass
+ else:
+ _set_rstrest(config['source_suffix'])
+ return _rest
+ _set_rstrest('.rest')
+ return _rest
+
+def is_rest(x):
+ return x.endswith(_rest) or x.endswith(_rest + _stpl)
+
+def is_rst(x):
+ return x.endswith(_rst) or x.endswith(_rst + _stpl) or x.endswith(
+ _rst + _tpl)
+
+
+# graphic files
+_svg = '.svg'
+_tikz = '.tikz'
+_tex = '.tex'
+_dot = '.dot'
+_uml = '.uml'
+_eps = '.eps'
+_pyg = '.pyg'
+_png = '.png' # target of all others
+
+
+def _is_graphic(t):
+ return t != '' and any(x.endswith(t) for x in graphic_extensions)
+
+
+rextgt = re.compile(
+ r'(?:^|^[^\.\%\w]*\s|^\s*\(?\w+[\)\.]\s)\.\. _`?(\w[^:`]*)`?:\s*$')
+# no need to consider those not starting with \w,
+# because rexlinksto starts with \w
+rexsubtgt = re.compile(
+ r'(?:^|^[^\.\%\w]*\s|^\s*\(?\w+[\)\.]\s)\.\. \|(\w[^\|]*)\|\s\w+::')
+
+rextitle = re.compile(r'^([!"#$%&\'()*+,\-./:;<=>?@[\]^_`{|}~])\1+$')
+#rextitle.match('===')
+#rextitle.match('==')
+#rextitle.match('=') #NO
+
+rexitem = re.compile(r'^\s*:?\**(\w[^:\*]*)\**:\s*.*$')
+#rexitem.match(":linkname: words").groups()[0]
+#rexitem.match("linkname: words").groups()[0]
+#rexitem.match("a linkname: words").groups()[0]
+#rexitem.match("``a`` linkname: words") #NO
+
+rexoneword = re.compile(r'^\s*(\w+)\s*$')
+rexname = re.compile(r'^\s*:name:\s*(\w.*)*$')
+rexlnks = re.compile(r'(?:^|[^a-zA-Z`])\|(\w+)\|(?:$|[^a-zA-Z`])')
+reximg = re.compile(r'(?:image|figure):: ((?:\.|/|\\|\w).*)')
+rerstinclude = re.compile(r'\.\. include::\s*([\./\w\\].*)')
+restplinclude = re.compile(r"""%\s*include\s*\(\s*["']([^'"]+)['"].*\)\s*""")
+
+#... combined:
+cmmnt = r"""[\.:#%/';"-]"""
+rexkw = re.compile(r'^\s*('+cmmnt+cmmnt+r' {|%?\s*{*_+\w+_*\()|^\s*:\w+:\s')
+#rexkw.search("{{_A30('kw1 kw2')}}")
+#rexkw.search("{{_U00___('kw1 kw2')}}")
+#rexkw.search(" % __0A0('kw1 kw2')")
+#rexkw.search(" % __123_('kw1 kw2')")
+#rexkw.search(":A30: kw1 kw2")
+#rexkw.search(" :U00: kw1 kw2")
+#rexkw.search("\t:123: kw1 kw2")
+#rexkw.search(" .. {kw1 kw2}")
+#rexkw.search(" -- {kw1 kw2}")
+#rexkw.search(" // {kw1 kw2}")
+#rexkw.search(" ## {kw1 kw2}")
+#rexkw.search(" '' {kw1 kw2}")
+#rexkw.search(" ''' {kw1 kw2}") #NO
+#rexkw.search(" % .. {kw1 kw2}") #NO
+#rexkw.search(" % fun('kw1 kw2')") #NO
+rexkwsplit = re.compile(r'[\W_]+')
+
+rextrace_target_id=re.compile("^[^-_]\w+$")
+#rextrace_target_id.match("_000") #NO
+#rextrace_target_id.match("-000") #NO
+#rextrace_target_id.match("sd 00") #NO
+#rextrace_target_id.match("a-d-00") #NO
+#rextrace_target_id.match("A_d_00")
+#rextrace_target_id.match("sD00")
+#rextrace_target_id.match("000")
+
+#https://sourceforge.net/p/docutils/mailman/message/36453416/
+_rstlinkfixer = re.compile('#[^>]+>')
+def _rst_id_fixer(matchobj):
+ return matchobj.group(0).replace(' ', '-').replace('_', '-')
+def _rst_id_fix(linktxt):
+ return _rstlinkfixer.sub(_rst_id_fixer, linktxt, re.MULTILINE)
+
+@lru_cache()
+def _here_or_updir(fldr, file):
+ filepth = normjoin(fldr, file)
+ there = 1
+ if not exists(filepth):
+ there = 0
+ filedir = up_dir(lambda x:x==file or is_project_root_file(x), start=abspath(fldr))
+ if filedir:
+ filepth = normjoin(filedir, file)
+ if exists(filepth):
+ rthere = relpath(filedir,start = fldr)
+ #rthere = '../..'
+ there = len(rthere.split('..'))
+ else:
+ there = 0
+ return (filepth, there)
+
+# master_doc and latex_documents is determined automatically
+sphinx_config_keys = """
+ project
+ author
+ copyright
+ version
+ release
+ html_theme
+ html_theme_path
+ latex_elements
+ html_extra_path
+ source_suffix
+ """.split()
+
+latex_elements = {
+ 'preamble':
+ r"""
+\usepackage{pgfplots}
+\usepackage{unicode-math}
+\usepackage{tikz}
+\usepackage{caption}
+\captionsetup[figure]{labelformat=empty}
+\usetikzlibrary{
+ arrows,snakes,backgrounds,patterns,matrix,shapes,
+ fit,calc,shadows,plotmarks,intersections
+ }
+"""
+}
+
+tex_wrap = r"""
+\documentclass[12pt, tikz]{standalone}
+\usepackage{amsmath}
+""" + latex_elements['preamble'] + r"""
+\pagestyle{empty}
+\begin{document}
+%s
+\end{document}
+"""
+
+
+def target_id_group(targetid):
+ return targetid[0]
+
+
+target_id_color = {
+ "ra": ("r", "lightblue"),
+ "sr": ("s", "red"),
+ "dd": ("d", "yellow"),
+ "tp": ("t", "green")
+}
+
+_images = '_images'
+# used for _traceability_file.rst and _traceability_file.svg
+_traceability_file = '_traceability_file'
+html_extra_path = [_traceability_file + '.svg']
+pandoc_doc_optref = {
+ 'latex': '--template reference.tex',
+ 'html': {}, # each can also be dict of file:template
+ 'pdf': '--template reference.tex',
+ 'docx': '--reference-doc reference.docx',
+ 'odt': '--reference-doc reference.odt'
+}
+_pandoc_latex_pdf = [
+ '--listings', '--number-sections', '--pdf-engine', 'xelatex', '-V',
+ 'titlepage', '-V', 'papersize=a4', '-V', 'toc', '-V', 'toc-depth=3', '-V',
+ 'geometry:margin=2.5cm'
+]
+pandoc_opts = {
+ 'pdf': _pandoc_latex_pdf,
+ 'latex': _pandoc_latex_pdf,
+ 'docx': [],
+ 'odt': [],
+ 'html': ['--mathml', '--highlight-style', 'pygments']
+}
+rst_opts = { # http://docutils.sourceforge.net/docs/user/config.html
+ 'strip_comments': True,
+ 'report_level': 3,
+ 'raw_enabled': True
+}
+
+# ``list-table`` and ``code-block`` are converted to ``table`` and ``code``
+
+
+def make_counters():
+ return {".. figure": 1, ".. math": 1, ".. table": 1, ".. code": 1}
+
+
+def name_from_directive(directive, count):
+ return directive[0].upper() + directive[1:] + ' ' + str(count)
+
+
+config_defaults = {
+ 'project': 'Project',
+ 'author': 'Project Team',
+ 'copyright': '2019, Project Team',
+ 'version': '1.0',
+ 'release': '1.0.0',
+ 'html_theme': 'bootstrap',
+ 'html_theme_path': html_theme_path,
+ 'latex_elements': latex_elements,
+ 'tex_wrap': tex_wrap,
+ 'target_id_group': target_id_group,
+ 'target_id_color': target_id_color,
+ 'rextrace_target_id': rextrace_target_id,
+ 'pandoc_doc_optref': pandoc_doc_optref,
+ 'pandoc_opts': pandoc_opts,
+ 'rst_opts': rst_opts,
+ 'name_from_directive': name_from_directive
+}
+
+sphinx_enforced = {
+ 'numfig': 0,
+ 'smartquotes': 0,
+ 'templates_path': [],
+ 'language': None,
+ 'highlight_language': "none",
+ 'default_role': 'math',
+ 'latex_engine': 'xelatex',
+ 'pygments_style': 'sphinx',
+ 'exclude_patterns': ['_build', 'Thumbs.db', '.DS_Store']
+}
+
+
+'''
+``g_config`` can be used to inject a global config.
+This overrides the defaults
+and is overriden by an updir ``conf.py``.
+'''
+g_config = None
+
+@lru_cache()
+def conf_py(fldr):
+ """
+ ``defaults``, ``g_config``, updir ``conf.py``
+
+ """
+ config = {}
+ config.update(config_defaults)
+ if g_config:
+ config.update(g_config)
+ confpydir = up_dir(lambda x:x=='conf.py' or is_project_root_file(x),start=abspath(fldr))
+ if confpydir:
+ confpypath = normjoin(confpydir,'conf.py')
+ if exists(confpypath):
+ with opn(confpypath) as f:
+ config['__file__'] = abspath(confpypath)
+ eval(compile(f.read(), abspath(confpypath), 'exec'), config)
+ config.update(sphinx_enforced)
+ return config
+    if g_include:
+        for gi in g_include:
+            confpypath = normjoin(gi,'conf.py')
+            if exists(confpypath):
+                config['__file__'] = abspath(confpypath)
+                with opn(confpypath) as f:
+                    eval(compile(f.read(), abspath(confpypath), 'exec'), config)
+                break
+ config.update(sphinx_enforced)
+ return config
+
+def _fillwith(u, v):
+ return [x or v for x in u]
+
+
+def _joinlines(lns):
+ if lns[0].endswith('\n'):
+ tmp = ''.join(lns)
+ else:
+ tmp = '\n'.join(lns)
+ return tmp.replace('\r\n', '\n')
+
+
+# x=b'a\r\nb'
+# _nbstr(x)==b'a\nb'
+def _nbstr(x):
+ return x and x.replace(b'\r\n', b'\n') or b''
+
+# x=x.decode()
+# _nstr(x)=='a\nb'
+
+
+def _nstr(x):
+ return x and x.replace('\r\n', '\n') or ''
+
+
+def cmd(cmdlist, **kwargs):
+ '''
+ Runs ``cmdlist`` via subprocess.run and return stdout.
+ In case of problems RstDocError is raised.
+
+ :param cmdlist: command as list
+ :param kwargs: arguments forwarded to ``subprocess.run()``
+
+ '''
+
+ cmdstr = ' '.join(cmdlist)
+ try:
+ for x in 'out err'.split():
+ kwargs['std' + x] = sp.PIPE
+ r = _toolrunner.run(cmdlist, **kwargs)
+ try:
+ stdout, stderr = _nstr(r.stdout), _nstr(r.stderr)
+ except:
+ stdout, stderr = _nbstr(r.stdout).decode('utf-8'), _nbstr(
+ r.stderr).decode('utf-8')
+ if r.returncode != 0:
+ raise RstDocError('Error code %s returned from \n%s\nin\n%s\n' % (
+ r.returncode, cmdstr,
+ cwd()) + '\n[stdout]\n%s\n[stderr]\n%s' % (stdout, stderr))
+ return stdout
+ except OSError as err:
+ raise RstDocError(
+ 'Error: Cannot run ' + cmdstr + ' in ' + cwd() + str(err))
+
+
+def _imgout(inf):
+ inp, inname = dir_base(inf)
+ infn, infe = stem_ext(inname)
+ if not _is_graphic(infe) and not _is_graphic(stem_ext(infn)[1]):
+ raise ValueError('%s is not an image source' % inf)
+ outp, there = _here_or_updir(inp, _images)
+ if not there:
+ outp = inp
+ outname = infn + _png
+ nout = normjoin(outp, outname)
+ return nout
+
+
+def _unioe(args):
+ i, o = [None] * 2
+ try:
+ (i, o), a = args[:2], args[2:]
+ except:
+ (i,), a = args[:1], args[1:]
+ return i, o, a
+
+
+def png_post_process_if_any(f):
+ @wraps(f)
+ def png_post_processor(*args, **kwargs):
+ infile, outfile, args = _unioe(args)
+ if isinstance(infile, str):
+ config = conf_py(dirname(infile))
+ pp = config.get('png_post_processor', None)
+ pngfile = f(infile, outfile, *args, **kwargs)
+ if pp and exists(pngfile):
+ return pp(pngfile)
+ else:
+ return pngfile
+ return f(infile, outfile, *args, **kwargs)
+
+ return png_post_processor
+
+
+def _ext(x):
+ return x[0] == '.' and x or '.' + x
+
+
+_cdlock = RLock()
+
+
+@contextlib.contextmanager
+def new_cwd(apth):
+ '''
+ Use as::
+
+ with new_cwd(dir):
+ #inside that directory
+
+ '''
+
+ _cdlock.acquire()
+ prev_cwd = cwd()
+ cd(apth)
+ try:
+ yield
+ finally:
+ cd(prev_cwd)
+ _cdlock.release()
+
+
+def startfile(filepath):
+ '''
+ Extends the Python startfile to non-Windows platforms
+
+ '''
+
+ if sys.platform.startswith('darwin'):
+ sp.call(('open', filepath))
+ elif os.name == 'nt': # For Windows
+ os.startfile(filepath)
+ elif os.name == 'posix': # For Linux, Mac, etc.
+ sp.call(('xdg-open', filepath))
+
+def up_dir(match,start=None):
+ '''
+ Find a parent path producing a match on one of its entries.
+ Without a match an empty string is returned.
+
+ :param match: a function returning a bool on a directory entry
+ :param start: absolute path or None
+ :return: directory with a match on one of its entries
+
+ >>> up_dir(lambda x: False)
+ ''
+
+ '''
+
+ if start is None:
+ start = os.getcwd()
+ if any(match(x) for x in os.listdir(start)):
+ return start
+ parent = os.path.dirname(start)
+ if start == parent:
+ rootres = start.replace('\\','/').strip('/').replace(':','')
+ if len(rootres)==1 and sys.platform=='win32':
+ rootres = ''
+ return rootres
+ return up_dir(match,start=parent)
+
+def tempdir():
+ '''
+ Make temporary directory and register it to be removed with ``atexit``.
+
+ This can be used inside a ``.stpl`` file
+ to create images from inlined images source,
+ place them in temporary file,
+ and include them in the final ``.docx`` or ``.odt``.
+
+ '''
+
+ atmpdir = tempfile.mkdtemp()
+ atexit.register(rmrf, atmpdir)
+ return atmpdir
+
+
+def infile_cwd(f):
+ """
+ Changes into the directory of the infile if infile is a file name string.
+ """
+
+ @wraps(f)
+ def infilecwder(*args, **kwargs):
+ infile, outfile, args = _unioe(args)
+ if isinstance(infile, str):
+ ndir, inf = dir_base(infile)
+ else:
+ ndir, inf = '', infile
+ if ndir:
+ if isinstance(outfile, str) and outfile != '-':
+ outfile = relpath(outfile, start=ndir)
+ with new_cwd(ndir):
+ return f(inf, outfile, *args, **kwargs)
+ return f(infile, outfile, *args, **kwargs)
+
+ return infilecwder
+
+
+def normoutfile(f, suffix=None):
+ """
+ Make outfile from infile by appending suffix, or, if None,
+ ``.png`` in ``./_images``, ``/_images`` or parallel to infile.
+ The outfile is returned.
+ """
+
+ @wraps(f)
+ def normoutfiler(*args, **kwargs):
+ infile, outfile, args = _unioe(args)
+ if isinstance(infile, str):
+ if not outfile:
+ if not suffix or _is_graphic(suffix):
+ outfile = _imgout(infile)
+ elif suffix:
+ infn, infe = stem_ext(infile)
+ outinfo = kwargs.get('outinfo', None)
+ if outinfo.startswith('sphinx'):
+ outfile = "{1}/{0}/{2}".format(
+ outinfo, *dir_base(infn)
+ ) + '.' + outinfo[outinfo.find('_') + 1:]
+ else:
+ if _stpl.endswith(infe):
+ infn, infe = stem_ext(infn)
+ outfile = infn
+ f(infile, outfile, *args, **kwargs)
+ return outfile
+
+ return normoutfiler
+
+def _suffix(outinfo):
+ try:
+ _, suf = outinfo.split('_')
+ except: #noqa
+ suf = outinfo
+ return suf or 'html'
+
+def _in_2_out_name(inname,outinfo):
+ instem = stem(inname)
+ if instem.endswith(_rest) or instem.endswith(_rst):
+ instem = stem(instem)
+ res = base(instem) + '.' + _suffix(outinfo).strip('.')
+ return res
+
+def in_temp_if_list(
+ f,
+ suffix='stpl'
+ ):
+ """
+
+ to produce a temporary directory/file for when infile is a list of strings.
+ The temporary directory/file is removed via atexit.
+
+ :param suffix: .dot, .uml, ... or rest.stpl,...
+ default it will assume stpl and use outinfo
+
+ To make this have an effect use after ``readin``
+
+ - includes ``normoutfile``
+
+ If outfile is None, outfile is derived from suffix,
+ which can be ``rest.stpl``, ``png.svg``;
+ If suffix is ``.svg``, ...,
+ png is assumed and will be placed into ``_images``.
+
+ """
+
+ @wraps(f)
+ def intmpiflister(*args, **kwargs):
+ infile, outfile, args = _unioe(args)
+ outinfo = None
+ try:
+ suf0, suf1 = suffix.split('.', 1)
+ except: #noqa
+ outinfo = kwargs.get('outinfo', 'rest')
+ if _is_graphic(outinfo):
+ suf0, suf1 = outinfo, suffix
+ else:
+ # see infile_outinfo
+ _, outi = dir_base(outinfo)
+ suf0, suf1 = _suffix(outi) + _rest, suffix
+ if not isinstance(infile, str) and infile:
+ if outfile and isinstance(outfile, str):
+ outfile = abspath(outfile)
+ atmpdir = tempdir()
+ content = _joinlines(infile).encode('utf-8')
+ infnfromoutinfo,outi = outinfo and dir_base(outinfo) or (None,None)
+ if outfile and isinstance(outfile, str):
+ infn = stem(base(outfile))
+ elif infnfromoutinfo:
+ kwargs['outinfo'] = outi
+ infn = infnfromoutinfo
+ else:
+ infn = sha(content).hexdigest()
+ if suf0:
+ infile = normjoin(atmpdir, '.'.join([infn, suf0, suf1]))
+ else:
+ infile = normjoin(atmpdir, '.'.join([infn, suf1]))
+ with open(infile, 'bw') as ff:
+ ff.write(content)
+ return normoutfile(f, suf0)(infile, outfile, *args, **kwargs)
+ return normoutfile(f, suf0)(infile, outfile, *args, **kwargs)
+
+ return intmpiflister
+
+
+def readin(f):
+ """
+ Decorator to read in file content and pass it on to the wrapped function.
+
+ The config is forwarded via parameters, if the file is read in.
+ """
+
+ @wraps(f)
+ def readiner(*args, **kwargs):
+ infile, outfile, args = _unioe(args)
+ if isinstance(infile, str):
+ config = conf_py(dirname(infile))
+ with opn(infile) as inf:
+ return f(inf.readlines(), outfile, *args, **config, **kwargs)
+ return f(infile, outfile, *args, **kwargs)
+
+ return readiner
+
+
+def run_inkscape(infile, outfile, dpi=DPI):
+ '''
+ Uses ``inkscape`` commandline to convert to ``.png``
+
+ :param infile: .svg, .eps, .pdf filename string
+ (for list with actual .eps or .svg data use |dcx.svgpng| or |dcx.epspng|)
+ :param outfile: .png file name
+
+ '''
+
+ cmd([
+ 'inkscape', '--export-dpi=%s' % dpi, '--export-area-drawing',
+ '--export-background-opacity=0', infile,
+ '--export-filename='+outfile
+ ],
+ outfile=outfile)
+
+
+@infile_cwd
+def rst_sphinx(
+ infile, outfile, outtype=None, **config
+ ):
+ '''
+ Run Sphinx on infile.
+
+ :param infile: .txt, .rst, .rest filename
+ :param outfile: the path to the target file (not target directory)
+ :param outtype: html, latex,... or any other sphinx writer
+ :param config: keys from config_defaults
+
+ ::
+
+ >>> olddir = os.getcwd()
+ >>> cd(dirname(__file__))
+ >>> cd('../doc')
+
+ >>> infile, outfile = ('index.rest',
+ ... '../build/doc/sphinx_html/index.html')
+ >>> rst_sphinx(infile, outfile) #doctest: +ELLIPSIS
+ >>> exists(outfile)
+ True
+
+ >>> infile, outfile = ('dd.rest',
+ ... '../build/doc/sphinx_html/dd.html')
+ >>> rst_sphinx(infile, outfile) #doctest: +ELLIPSIS
+ >>> exists(outfile)
+ True
+
+ >>> infile, outfile = ('dd.rest',
+ ... '../build/doc/sphinx_latex/dd.tex')
+ >>> rst_sphinx(infile, outfile) #doctest: +ELLIPSIS
+ >>> exists(outfile)
+ True
+
+ >>> cd(olddir)
+
+ '''
+
+ cfgt = {}
+ cfgt.update(config_defaults)
+ cfgt.update(config)
+
+ indr, infn = dir_base(infile)
+ outdr, outn = dir_base(outfile)
+ outnn, outne = stem_ext(outn)
+ samedir = False
+ if outdr == indr:
+ samedir = True
+ if not indr:
+ indr = '.'
+ cfg = {}
+ cfg.update({
+ k: v
+ for k, v in cfgt.items() if k in sphinx_config_keys
+ and 'latex' not in k and k != 'html_extra_path'
+ })
+ cfg.setdefault('source_suffix','.rest')
+ #cfg['source_suffix'] = '.rest'
+ if not outtype or outtype=='html':
+ if outne == '.html':
+ if infn.startswith('index.'):
+ outtype = 'html'
+ else:
+ outtype = 'singlehtml'
+ elif outne == '.tex':
+ outtype = 'latex'
+ else:
+ outtype = outne.strip('.')
+ cfg.update({k: v for k, v in sphinx_enforced.items() if 'latex' not in k})
+ cfg['master_doc'] = stem(infn) #.rest.rest -> .rest (see rest_rest)
+ # .rest.rest contains temporary modification and is used instead of .rest
+ if outtype == 'html' and is_rest(cfg['master_doc']): #... not for index.rest
+ cfg['master_doc'] = stem(cfg['master_doc'])
+ if exists(cfg['master_doc']):
+ cfg['master_doc'] = stem(cfg['master_doc'])
+ if samedir:
+ outdr = normjoin(outdr, 'sphinx_'+outtype)
+ outfn = normjoin(outdr, outn)
+ print('Warning: Sphinx output cannot be in the same directory. Using '
+ + outdr)
+ else:
+ outfn = outfile
+ latex_elements = []
+ latex_documents = []
+ if 'latex' in outtype:
+ cfg.update({
+ k: v
+ for k, v in cfgt.items()
+ if k in sphinx_config_keys and 'latex' in k
+ })
+ cfg.update({k: v for k, v in sphinx_enforced.items() if 'latex' in k})
+ try:
+ latex_elements = ([
+ ['-D', "latex_elements.%s=%s" % (k, v.replace('\n', ''))]
+ for k, v in cfg['latex_elements'].items()
+ ] + [['-D', 'latex_engine=xelatex']])
+ except:
+ pass
+ del cfg['latex_elements']
+ del cfg['latex_engine']
+ latex_documents = []
+ extras = ['-C'] + reduce(lambda x, y: x + y, [[
+ '-D', "%s=%s" % (k, (','.join(v) if isinstance(v, list) else v))
+ ] for k, v in cfg.items()] + latex_elements + latex_documents)
+ sphinxcmd = ['sphinx-build', '-b', outtype, indr, outdr] + extras
+ cmd(sphinxcmd, outfile=outfn)
+ if outtype == 'html':
+ #undo duplication via temp files; see: rest_rest
+ rmrf(normjoin(outdr,cfg['master_doc']+_rest+'.html'))
+ if 'latex' in outtype:
+ texfile = next(x for x in os.listdir(outdr) if x.endswith('.tex'))
+ os.rename(normjoin(outdr, texfile), outfn)
+ if 'html' in outtype and 'html_extra_path' in cfgt:
+ for epth in cfgt['html_extra_path']:
+ try:
+ if isabs(epth):
+ epth = relpath(epth, start=indr)
+ cp(epth, normjoin(outdr, epth))
+ except:
+ pass
+
+
+def _copy_images_for(infile, outfile, with_trace):
+ indr = dirname(infile)
+ imgdir, there = _here_or_updir(indr, _images)
+ imgdir_tgt = outfile
+ while there:
+ imgdir_tgt = dirname(imgdir_tgt)
+ there = there - 1
+ if imgdir_tgt == outfile:
+ return
+ imgdir_tgt = normjoin(imgdir_tgt,_images)
+ outdr = dirname(outfile)
+ if with_trace:
+ tracesvg = normjoin(indr, _traceability_file + _svg)
+ if exists(tracesvg):
+ try:
+ cp(tracesvg, normjoin(outdr, _traceability_file + _svg))
+ except:
+ pass
+ if exists(imgdir) and imgdir != imgdir_tgt:
+ if not exists(imgdir_tgt):
+ mkdir(imgdir_tgt)
+ for x in os.listdir(imgdir):
+ frm, twd = normjoin(imgdir, x), normjoin(imgdir_tgt, x)
+ docpy = filenewer(frm, twd)
+ if docpy:
+ try:
+ cp(frm, twd)
+ except:
+ pass
+
+'''
+One can append paths to ``rstdoc.dcx.g_include`` for stpl expansion
+or finding other files.
+'''
+g_include = []
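+
+# Minimal sketch (hypothetical path): a build script could extend the lookup
+# list before expanding templates, e.g.
+#
+#   import rstdoc.dcx as dcx
+#   dcx.g_include.append('/path/to/shared/templates')
+#   dcx.dostpl('doc/ra.rest.stpl', 'doc/ra.rest')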
+
+@infile_cwd
+def rst_pandoc(
+ infile, outfile, outtype, **config
+ ):
+ '''
+ Run Pandoc on infile.
+
+ :param infile: .txt, .rst, .rest filename
+ :param outfile: the path to the target document
+ :param outtype: html,...
+ :param config: keys from config_defaults
+
+ '''
+
+ cfg = {}
+ cfg.update(config_defaults)
+ cfg.update(config)
+ pandoccmd = ['pandoc', '--standalone', '-f', 'rst'] + cfg.get(
+ 'pandoc_opts', {}).get(outtype, []) + [
+ '-t', 'latex'
+ if outtype == 'pdf' else outtype.replace('rest','rst'), infile, '-o', outfile
+ ]
+ opt_refdoc = cfg.get('pandoc_doc_optref', {}).get(outtype, '')
+ if opt_refdoc:
+ if isinstance(opt_refdoc, dict):
+ opt_refdoc = opt_refdoc.get(base(infile), '')
+ if opt_refdoc:
+ refoption, refdoc = opt_refdoc.split()
+ refdocfound, there = _here_or_updir('.', refdoc)
+ if there:
+ pandoccmd.append(refoption)
+ pandoccmd.append(abspath(refdocfound))
+ elif g_include:
+ refdoc = dir_base(refdoc)[1]
+ for gi in g_include:
+ refdoctry = normjoin(gi,refdoc)
+ if exists(refdoctry):
+ pandoccmd.append(refoption)
+ pandoccmd.append(refdoctry)
+ break
+ stdout = cmd(pandoccmd, outfile=outfile)
+ if outtype.endswith('html') or outtype.endswith('latex'):
+ _copy_images_for(infile, outfile, outtype.endswith('html'))
+ elif outtype.endswith('odt'):
+ PageBreakHack(outfile)
+ return stdout
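+
+# Config sketch: ``pandoc_doc_optref`` as read above maps an output type to an
+# option/file pair that is split and passed on to pandoc. A conf.py entry could
+# look like this (the file name is an assumption, not a shipped default):
+#
+#   pandoc_doc_optref = {'docx': '--reference-doc reference.docx'}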
+
+
+def _indented_default_role_math(filelines):
+ """
+
+ .. _`x`:
+
+ xlabel:
+
+ hello
+
+ """
+ indent = ''
+ i = 0
+ try:
+ while not filelines[i].strip():
+ i = i+1
+ indent = ' '*filelines[i].index(filelines[i].lstrip())
+ except:
+ pass
+ return indent + '.. default-role:: math\n\n'
+
+
+@infile_cwd
+def rst_rst2(
+ infile, outfile, outtype, **config
+ ):
+ '''
+ Run the rst2xxx docutils frontend tool on infile.
+
+ :param infile: .txt, .rst, .rest filename
+ :param outfile: the path to the target document
+ :param outtype: html,...
+ :param config: keys from config_defaults
+
+ '''
+
+ cfg = {}
+ cfg.update(config_defaults)
+ cfg.update(config)
+ destination_path = outfile if outfile != '-' else None
+ if outtype == 'odt':
+ outtype = 'odf_odt'
+ stdout = None
+ if isinstance(infile, str):
+ publish_file(
+ source_path=infile,
+ destination_path=destination_path,
+ writer_name=outtype,
+ settings_overrides=cfg['rst_opts'])
+ else:
+ source = _indented_default_role_math(infile) + _joinlines(infile)
+ stdout = publish_string(
+ source,
+ destination_path=destination_path,
+ writer_name=outtype,
+ settings_overrides=cfg['rst_opts'])
+ if destination_path:
+ if outtype.endswith('html') or outtype.endswith('latex'):
+ _copy_images_for(infile, outfile, outtype.endswith('html'))
+ elif outtype.endswith('odt'):
+ PageBreakHack(destination_path)
+ return stdout
+
+
+def PageBreakHack(destination_path):
+ '''
+ This introduces a ``PageBreak`` style into ``content.xml``
+ to allow the following raw page break of opendocument odt::
+
+ .. raw:: odt
+
+
+
+ This is not a good solution,
+ as it introduces an empty line at the top of the new page.
+
+ Unfortunately the following does not work
+ with or without ``text:use-soft-page-breaks="true"``
+
+ ::
+
+ .. for docutils
+ .. raw:: odt
+
+
+
+ .. for pandoc
+ .. raw:: opendocument
+
+
+
+ According to
+ `C066363e.pdf `__
+ it should work.
+
+ See ``utility.rst.tpl`` in the ``--stpl`` created example project tree.
+
+ '''
+
+ from zipfile import ZipFile
+ odtzip = OrderedDict()
+ with ZipFile(destination_path) as z:
+ for n in z.namelist():
+ with z.open(n) as f:
+ content = f.read()
+ if n == 'content.xml':
+ # break-after produces two page breaks
+ content = content.replace(
+ b'', b' '.join(
+ x.strip() for x in b"""
+
+
+ """.splitlines()))
+ content = content.replace(
+ b'',
+ b'')
+ odtzip[n] = content
+ with ZipFile(destination_path, 'w') as z:
+ for n, content in odtzip.items():
+ with z.open(n, mode='w', force_zip64=True) as f:
+ f.write(content)
+
+
+# sphinx_html, rst_html, [pandoc_]html
+rst_tools = {'pandoc': rst_pandoc, 'sphinx': rst_sphinx, 'rst': rst_rst2}
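+
+# Reading of the prefixes above, as used by dorst() further down (file names
+# are illustrative only):
+#
+#   dorst('dd.rest', 'dd.html', 'sphinx_html')  # sphinx-build writer
+#   dorst('dd.rest', 'dd.html', 'rst_html')     # docutils rst2html writer
+#   dorst('dd.rest', 'dd.html', 'html')         # pandoc, the default tool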
+
+
+@png_post_process_if_any
+@normoutfile
+@readin
+def svgpng(infile, outfile=None, *args, **kwargs):
+ '''
+ Converts a .svg file to a png file.
+
+ :param infile: a .svg file name or list of lines
+ :param outfile: if not provided, the input file with new extension
+ ``.png`` in ``./_images``, ``/_images`` or parallel to infile.
+
+ '''
+
+ _toolrunner.svg2png(
+ bytestring=_joinlines(infile),
+ write_to=outfile,
+ dpi=kwargs.get('DPI', DPI))
+
+
+@png_post_process_if_any
+@partial(in_temp_if_list, suffix='.tex')
+@infile_cwd
+def texpng(infile, outfile=None, *args, **kwargs):
+ '''
+ Latex has several graphic packages, like
+
+ - tikz
+ - chemfig
+
+ that can be converted to .png with this function.
+
+ For ``.tikz`` file use |dcx.tikzpng|.
+
+ :param infile: a .tex file name or list of lines
+ (provide outfile in the latter case)
+ :param outfile: if not provided, the input file with
+ ``.png`` in ``./_images``, ``/_images`` or parallel to infile.
+
+ '''
+
+ pdffile = stem(infile) + '.pdf'
+ try:
+ cmd(['xelatex', '-interaction=nonstopmode', infile], outfile=pdffile)
+ except RstDocError as err:
+ with opn(infile) as latex:
+ raise RstDocError(str(err) + '\n[latex]\n' + latex.read())
+ run_inkscape(pdffile, outfile, dpi=kwargs.get('DPI', DPI))
+
+
+def _texwrap(f):
+ @wraps(f)
+ def _texwraper(*args, **kwargs):
+ texlns, outfile, args = _unioe(args)
+ content = _joinlines(texlns)
+ latex = kwargs.get('tex_wrap', tex_wrap) % content
+ return f(latex.splitlines(), outfile, *args, **kwargs)
+
+ return _texwraper
+
+
+def _tikzwrap(f):
+ @wraps(f)
+ def _tikzwraper(*args, **kwargs):
+ tikzlns, outfile, args = _unioe(args)
+ content = _joinlines(tikzlns).strip()
+ tikzenclose = [r'\begin{tikzpicture}', '%s', r'\end{tikzpicture}']
+ if not content.startswith(tikzenclose[0]):
+ content = _joinlines(tikzenclose) % content
+ return f(content.splitlines(), outfile, *args, **kwargs)
+
+ return _tikzwraper
+
+
+'''
+Converts a .tikz file to a png file.
+
+See |dcx.texpng|.
+'''
+tikzpng = normoutfile(readin(_tikzwrap(_texwrap(texpng))))
+
+
+@png_post_process_if_any
+@partial(in_temp_if_list, suffix='.dot')
+@infile_cwd
+def dotpng(
+ infile,
+ outfile=None,
+ *args,
+ **kwargs
+ ):
+ '''
+ Converts a .dot file to a png file.
+
+ :param infile: a .dot file name or list of lines
+ (provide outfile in the latter case)
+ :param outfile: if not provided, the input file with new extension
+ ``.png`` in ``./_images``, ``/_images`` or parallel to infile.
+
+ '''
+
+ cmd(['dot', '-Tpng', infile, '-o', outfile], outfile=outfile)
+
+
+@png_post_process_if_any
+@partial(in_temp_if_list, suffix='.uml')
+@infile_cwd
+def umlpng(
+ infile,
+ outfile=None,
+ *args,
+ **kwargs
+ ):
+ '''
+ Converts a .uml file to a png file.
+
+ :param infile: a .uml file name or list of lines
+ (provide outfile in the latter case)
+ :param outfile: if not provided, the input file with new extension
+ ``.png`` in ``./_images``, ``/_images`` or parallel to infile.
+
+ '''
+
+ cmd(['plantuml', '-tpng', infile, '-o' + dirname(outfile)],
+ shell=sys.platform == 'win32',
+ outfile=outfile)
+
+
+@png_post_process_if_any
+@partial(in_temp_if_list, suffix='.eps')
+@infile_cwd
+def epspng(
+ infile,
+ outfile=None,
+ *args,
+ **kwargs):
+ '''
+ Converts an .eps file to a png file using inkscape.
+
+ :param infile: a .eps file name or list of lines
+ (provide outfile in the latter case)
+ :param outfile: if not provided, the input file with new extension
+ ``.png`` in ``./_images``, ``/_images`` or parallel to infile.
+
+ '''
+
+ run_inkscape(infile, outfile, dpi=kwargs.get('DPI', DPI))
+
+
+@png_post_process_if_any
+@normoutfile
+@readin
+@infile_cwd
+def pygpng(
+ infile, outfile=None, *args,
+ **kwargs
+ ):
+ '''
+ Converts a .pyg file to a png file.
+
+ ``.pyg`` contains python code that produces a graphic.
+ If the python code defines a ``to_svg`` or a ``save_to_png`` function,
+ then that is used.
+ Else the following is tried
+
+ - ``pyx.canvas.canvas`` from the
+ `pyx `__ library or
+ - ``svgwrite.drawing.Drawing`` from the
+ `svgwrite `__ library or
+ - ``cairocffi.Surface`` from
+ `cairocffi `__
+ - `matplotlib `__.
+ If ``len(matplotlib.pyplot.get_fignums()) > 1``,
+ each figure is saved to its own numbered ``.png``.
+
+ :param infile: a .pyg file name or list of lines
+ (provide outfile in the latter case)
+ :param outfile: if not provided, the input file with new extension
+ ``.png`` in ``./_images``, ``/_images`` or parallel to infile.
+
+ '''
+
+ pygcode = _joinlines(infile)
+ pygvars = {}
+ dpi = kwargs.get('DPI', DPI)
+ eval(compile(pygcode, outfile, 'exec'), pygvars)
+ if 'save_to_png' in pygvars:
+ pygvars['save_to_png'](outfile)
+ elif 'to_svg' in pygvars:
+ _toolrunner.svg2png(bytestring=pygvars['to_svg'](),
+ write_to=outfile, dpi=dpi)
+ else:
+ for _, v in pygvars.items():
+ if hasattr(v,'_repr_svg_'):
+ _toolrunner.svg2png(
+ bytestring=v._repr_svg_(), write_to=outfile, dpi=dpi)
+ break
+ elif cairocffi and isinstance(v, cairocffi.Surface):
+ v.write_to_png(target=outfile)
+ break
+ elif svgwrite and isinstance(v, svgwrite.drawing.Drawing):
+ _toolrunner.svg2png(bytestring=v.tostring(),
+ write_to=outfile, dpi=dpi)
+ break
+ else: # try matplotlib.pyplot
+ try:
+ fignums = plt.get_fignums()
+ if len(fignums) == 0:
+ continue
+ if len(fignums) > 1:
+ # makename('a.b', 1) # a1.b
+ def makename(x, i):
+ return ('{0}%s{1}' % i).format(*stem_ext(x))
+ else:
+
+ def makename(x, i):
+ return x
+ for i in fignums:
+ plt.figure(i).savefig(
+ makename(outfile, i), format='png')
+ plt.close()
+ break
+ except:
+ continue
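+
+# Sketch of a minimal ``.pyg`` payload accepted by pygpng() (assumes matplotlib
+# is installed; purely illustrative):
+#
+#   import matplotlib
+#   matplotlib.use('Agg')
+#   import matplotlib.pyplot as plt
+#   plt.plot([1, 2, 3], [4, 5, 6])
+#
+# Alternatively the code can define ``save_to_png(outfile)`` or ``to_svg()``.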
+
+@readin
+@infile_cwd
+def pygsvg(infile, *args, **kwargs):
+ '''
+ Converts a .pyg file or the corresponding python code to an SVG string.
+
+ ``.pyg`` contains python code that produces an SVG graphic.
+ Either there is a ``to_svg()`` function or
+ the following is tried
+
+ - ``io.BytesIO`` containing SVG, e.g. via ``cairo.SVGSurface(ioobj,width,height)``
+ - ``io.StringIO`` containing SVG
+ - object with attribute ``_repr_svg_``
+ - ``svgwrite.drawing.Drawing`` from the
+ `svgwrite `__ library or
+ - ``cairocffi.SVGSurface`` from
+ `cairocffi `__
+ - `matplotlib `__.
+
+ :param infile: a .pyg file name or list of lines
+
+ '''
+
+ onlysvg = lambda x: ''.format(b64.decode("utf-8"))
+
+
+def pngembed(
+ pyg_or_pngfile, outinfo, *args, **kwargs
+ ):
+ '''
+ If ``outinfo`` ends with ``html``, the PNG is embedded.
+ Else the PNG is included in the DOCX or ODT zip.
+
+ '''
+
+ pngfn = normjoin(tempdir(),'png.png')
+ pygpng(pyg_or_pngfile,pngfn,*args,**kwargs)
+ if outinfo.endswith('html') or outinfo.endswith('rest') or outinfo.endswith('rest'):
+ return '.. raw:: html\n\n'+_indent_text(_png64(pngfn))
+ else:
+ return ".. image:: {}".format(pngfn)
+
+
+@infile_cwd
+def dostpl(
+ infile,
+ outfile=None,
+ lookup=None,
+ **kwargs
+ ):
+ '''
+ Expands an `.stpl `__ file.
+
+ The whole ``rstdoc.dcx`` namespace is forwarded to the template code.
+
+ ``.stpl`` is unrestrained python:
+
+ - e.g. one can create temporary images,
+ which are then included in the final .docx or .odt
+ See |dcx.tempdir|.
+
+ :param infile: a .stpl file name or list of lines
+ :param outfile: if not provided, the expanded lines are returned
+ :param lookup: lookup paths can be absolute or relative to infile
+
+ ::
+
+ >>> infile = ['hi {{2+3}}!']
+ >>> dostpl(infile)
+ ['hi 5!']
+
+ '''
+
+ if not lookup:
+ lookup = ['.', '..'] + g_include
+ if isinstance(infile, str):
+ lookup = [abspath(normjoin(dirname(infile), x)) for x in lookup if not isabs(x)
+ ]+[x for x in lookup if isabs(x)]
+ filename = abspath(infile)
+ else:
+ lookup = [abspath(x) for x in lookup if not isabs(x)
+ ]+[x for x in lookup if isabs(x)]
+ try:
+ filename = abspath(outfile)
+ except:
+ filename = None
+ infile = _joinlines(infile)
+ variables = {}
+ variables.update(globals())
+ variables.update(kwargs)
+ variables.update({'__file__': filename})
+ if 'outinfo' not in variables and outfile:
+ _, variables['outinfo'] = stem_ext(outfile)
+ stpl.TEMPLATES.clear()
+ st = stpl.template(
+ infile,
+ template_settings={'escape_func': lambda x: x},
+ template_lookup=lookup,
+ **variables
+ )
+ if outfile:
+ with opnwrite(outfile) as f:
+ f.write(st)
+ else:
+ return st.replace('\r\n', '\n').splitlines(keepends=True)
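+
+# Keyword arguments end up as template variables (see ``variables.update(kwargs)``
+# above), so a call like the following is possible (illustrative, not a doctest):
+#
+#   dostpl(['Hello {{name}}!'], name='World')   # -> ['Hello World!']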
+
+
+def dorst(
+ infile,
+ outfile=io.StringIO,
+ outinfo=None,
+ fn_i_ln=None,
+ **kwargs
+ ):
+ r'''
+ The default interpreted text role is set to math.
+ The link lines are added to the rest file or rst lines.
+
+ :param infile: a .rest, .rst, .txt file name or list of lines
+
+ :param outfile: None and '-' mean standard out.
+
+ If io.StringIO, then the lines are returned.
+ ``|xxx|`` substitutions for reST link targets
+ in infile are appended if no ``_links_sphinx.rst`` is included there
+
+ :param outinfo: specifies the tool to use.
+
+ - ``html``, ``docx``, ``odt``,... via pandoc
+ - ``sphinx_html``,... via sphinx
+ - ``rst_html``,... via rst2xxx frontend tools
+
+ General format of outinfo::
+
+ [infile/][tgtfile.]docx[.]
+
+ ``infile`` is used if the function's infile parameter is a list of lines.
+
+ ``tgtfile`` is the target file used in links
+ and also the target file to create.
+ A final dot tells not to create the target file.
+ This is of use in the command line
+ if piping a file to rstdoc then to pandoc.
+ The doc will only be generated by pandoc,
+ but the links need to know the target doc before it is created.
+
+ :param fn_i_ln: ``(fn, i, ln)`` of the ``.stpl``
+ with all stpl includes sequenced (used by |dcx.convert|)
+
+ ::
+
+ >>> olddir = os.getcwd()
+ >>> cd(dirname(__file__))
+ >>> cd('../doc')
+
+ >>> dorst('dd.rest') #doctest: +ELLIPSIS
+ ['.. default-role:: math\n', ...
+
+ >>> dorst('ra.rest.stpl') #doctest: +ELLIPSIS
+ ['.. default-role:: math\n', ...
+
+ >>> dorst(['hi there']) #doctest: +ELLIPSIS
+ ['.. default-role:: math\n', '\n', 'hi there\n', ...
+
+ >>> dorst(['hi there'], None,'html') #doctest: +ELLIPSIS
+
+ ...
+
+ >>> drst=lambda x,y: dorst(x,y,None,pandoc_doc_optref={'docx':'--reference-doc doc/reference.'+y.split('.')[1]})
+ >>> dorst('ra.rest.stpl','ra.docx') #doctest: +ELLIPSIS
+ >>> exists('ra.docx')
+ True
+ >>> rmrf('ra.docx')
+ >>> exists('ra.docx')
+ False
+ >>> rmrf('ra.rest.stpl.rest')
+ >>> exists('ra.rest.stpl.rest')
+ False
+
+ >>> dorst(['hi there'],'test.html') #doctest: +ELLIPSIS
+ >>> exists('test.html')
+ True
+ >>> rmrf('test.html')
+ >>> exists('test.html')
+ False
+ >>> rmrf('rest.rest.rest')
+ >>> exists('rest.rest.rest')
+ False
+
+ >>> dorst(['hi there'],'test.odt','rst') #doctest: +ELLIPSIS
+ >>> exists('rest.rest.rest')
+ True
+ >>> rmrf('rest.rest.rest')
+ >>> exists('rest.rest.rest')
+ False
+ >>> exists('test.odt')
+ True
+ >>> rmrf('test.odt')
+ >>> exists('test.odt')
+ False
+ >>> cd(olddir)
+
+
+ '''
+
+ tool = 'pandoc'
+ rsttool = rst_tools[tool]
+ dinfo, binfo = None, None
+ if outinfo:
+ # see infile_outinfo
+ dinfo, binfo = dir_base(outinfo)
+ outinfo = binfo
+ if not isinstance(outfile, str) and outinfo in rst_tools:
+ rsttool = rst_tools[outinfo]
+ outinfo = 'html'
+ else:
+ try:
+ tool, outinfo = outinfo.split('_')
+ try:
+ rsttool = rst_tools[tool]
+ except:
+ rsttool = None
+ except:
+ pass
+ if isinstance(infile, str):
+ infile = abspath(infile)
+ with opn(infile) as f:
+ filelines = f.readlines()
+ else:
+ filelines = infile
+ infile = dinfo
+ if not infile:
+ infile = 'rest'
+ infile = abspath(infile + _rest)
+ sysout = None
+ finalsysout = None
+ try:
+ if outfile is None or outfile == '-':
+ if not outinfo:
+ outinfo = 'rest'
+ # outinfo='docx.'
+ if outinfo.strip('.').find('.') < 0:
+ outfile = stem(base(infile))+'.' + outinfo.strip('.')
+ #=> outfile=infile.docx
+ #equivalent to input params: infile.rest - docx
+ else: # - - otherfile.docx
+ outfile = outinfo.strip('.')
+ if any(outinfo.endswith(x) for x in ['docx', 'odt', 'pdf']):
+ sysout = None # create a file in these cases
+ else:
+ try:
+ sys.stdout = codecs.getwriter("utf-8")(sys.stdout.detach())
+ except: #noqa
+ pass
+ sysout = sys.stdout
+ elif callable(outfile):
+ sysout = outfile()
+ else:
+ _, ofext = stem_ext(outfile)
+ ofext = ofext.strip('.')
+ if not outinfo: # x.rst a/b/c.docx
+ outinfo = ofext
+ elif outinfo in rst_tools: # x.rst a/b/c.docx pandoc
+ tool = outinfo
+ rsttool = rst_tools[outinfo]
+ outinfo = ofext
+ try:
+ if outinfo.endswith('.'): # x.rest - docx.
+ rsttool = None # ... output the rest code with links for docx
+ # drop file information from outinfo
+ outinfo = outinfo.strip('.')
+ t, outinfo = stem_ext(outinfo)
+ if not outinfo:
+ outinfo = t
+ outinfo = outinfo.strip('.')
+ except:
+ outinfo = 'rest'
+
+ if _rest.replace('rest','rst').endswith(outinfo.replace('rest','rst')):
+ rsttool = None # no further processing wanted, sysout is final
+ if not rsttool and not sysout:
+ sysout = opnwrite(outfile)
+ tmprestindir = None
+
+ if rsttool:
+ finalsysout = sysout
+ tmprestindir = infile + _rest # .rest->rest_rest
+ sysout = opnwrite(tmprestindir)
+ infile = tmprestindir
+ atexit.register(rmrf, tmprestindir)
+ if sysout:
+ sysout.write(_indented_default_role_math(filelines))
+ links_done = False
+ _links_re = r'^\.\. include:: (.*)(_links_sphinx)(.re?st)'
+ rexincludelinks = re.compile(_links_re)
+ for x in filelines:
+ #x = '.. include:: _links_sphinx.rest' #1
+ #x = '.. include:: ../_links_sphinx.rest' #2
+ #x = '.. include:: /_links_sphinx.rst' #3
+ lim = rexincludelinks.match(x)
+ if lim:
+ limg0 = normjoin(lim.groups()[0])
+ limg2 = normjoin(lim.groups()[2])
+ if tool == 'sphinx':
+ links_done = True
+ else:
+ #infile='a/b/c'
+ #outinfo='docx'
+ if limg0.startswith('/'):
+ if limg0 == '/': #find linkroot
+ linkroot = up_dir(lambda x: x.startswith('_links_') or is_project_root_file(x),
+ abspath(dirname(infile)))
+ linksfilename = normjoin(linkroot, '_links_' + outinfo + limg2)
+ else:
+ linksfilename = normjoin(limg0, '_links_' + outinfo + limg2)
+ else:
+ linksfilename = normjoin(dirname(infile), limg0, '_links_' + outinfo + limg2)
+ #a/b/_links_docx.rst #1
+ #a/_links_docx.rst #2
+ if not exists(linksfilename):
+ linksfilename = stem(linksfilename)+(_rst if limg2==_rest else _rest)
+ if exists(linksfilename):
+ with opn(linksfilename) as f:
+ if tool == 'rst' and outinfo == 'html':
+ sysout.write(_rst_id_fix(f.read()))
+ else:
+ sysout.write(f.read())
+ links_done = True
+ else:
+ sysout.write(x if x.endswith('\n') else x+'\n')
+ if not links_done:
+ sysout.write('\n')
+ try:
+ reststem = stem(outfile)
+ except:
+ reststem = ''
+ for tgt in RstFile.make_tgts(filelines, infile,
+ make_counters(), fn_i_ln):
+ sysout.write(
+ tgt.create_link(
+ outinfo.replace('rest', 'html'),
+ reststem, tool))
+
+ if rsttool:
+ config = conf_py(dirname(infile))
+ config.update(kwargs)
+ if sysout:
+ sysout.close()
+ sysout = None
+ stdout = rsttool(infile, '-' if finalsysout else outfile,
+ outinfo, **config)
+ if stdout is not None and finalsysout:
+ finalsysout.write(stdout)
+ finally:
+ for x in [sysout, finalsysout]:
+ if x is not None and x != sys.stdout and not isinstance(
+ x, io.StringIO):
+ x.close()
+ for x in [sysout, finalsysout]:
+ if isinstance(x, io.StringIO):
+ x.seek(0)
+ return x.readlines()
+
+
+converters = {
+ _svg: svgpng,
+ _tikz: tikzpng,
+ _tex: texpng,
+ _dot: dotpng,
+ _uml: umlpng,
+ _eps: epspng,
+ _pyg: pygpng,
+ _stpl: dostpl,
+ _rst: dorst,
+ _rest: dorst,
+ _txt: dorst
+}
+graphic_extensions = {_svg, _tikz, _tex, _dot, _uml, _eps, _pyg}
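+
+# How the table above is walked by convert() below: extensions are consumed from
+# right to left, each converter feeding the next one (file names illustrative):
+#
+#   'fig.dot.stpl'  -> dostpl -> dotpng  -> _images/fig.png
+#   'ra.rest.stpl'  -> dostpl -> dorst   -> pandoc/sphinx/docutils output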
+
+def convert(
+ infile,
+ outfile=io.StringIO,
+ outinfo=None,
+ **kwargs
+ ):
+ r'''
+ Converts any of the known files.
+
+ Stpl files are forwarded to the next converter.
+
+ The main job is to normalize the input params,
+ because this is called from |dcx.main| and via Python.
+ It forwards to the right converter.
+
+ Examples::
+
+ >>> olddir = os.getcwd()
+ >>> cd(dirname(__file__))
+ >>> cd('../doc')
+
+ >>> convert([' ',' hi {{2+3}}!'], outinfo='rest')
+ [' .. default-role:: math\n', '\n', ' \n', ' hi 5!\n', '\n']
+
+ >>> convert([' ',' hi {{2+3}}!']) #doctest: +ELLIPSIS
+ ['\n', ...]
+ >>> rmrf('rest.rest.rest')
+
+ >>> infile, outfile, outinfo = ([
+ ... "newpath {{' '.join(str(i)for i in range(4))}} rectstroke showpage"
+ ... ],'tst.png','eps')
+ >>> 'tst.png' in convert(infile, outfile, outinfo) #doctest: +ELLIPSIS
+ True
+ >>> exists('tst.png')
+ True
+ >>> rmrf('tst.png')
+ >>> exists('tst.png')
+ False
+
+ >>> convert('ra.rest.stpl') #doctest: +ELLIPSIS
+ ['\n', ...
+
+ >>> cnvrt=lambda x,y: convert(x,y,None,pandoc_doc_optref={'docx':'--reference-doc doc/reference.'+y.split('.')[1]})
+ >>> cnvrt('ra.rest.stpl','ra.docx')
+ >>> exists('ra.rest.rest')
+ True
+ >>> rmrf('ra.rest.rest')
+ >>> exists('ra.rest.rest')
+ False
+ >>> exists('ra.docx')
+ True
+ >>> rmrf('ra.docx')
+ >>> exists('ra.docx')
+ False
+
+ >>> convert('dd.rest', None,'html') #doctest: +ELLIPSIS
+
+ ...
+ >>> exists('dd.rest.rest')
+ True
+ >>> rmrf('dd.rest.rest')
+ >>> exists('dd.rest.rest')
+ False
+ >>> cd(olddir)
+
+
+ :param infile:
+ any of ``.tikz``, ``.svg``, ``.dot``, ``.uml``, ``.eps``, ``.pyg``
+ or else stpl is assumed. Can be a list of lines, too.
+
+ :param outfile: ``-`` means standard out,
+ else a file name, or None for automatic (using outinfo),
+ or io.StringIO to return lines instead of stdout
+
+ :param outinfo:
+ ``html``, ``sphinx_html``, ``docx``, ``odt``, ``file.docx``,...
+ interpret the input as rest; else it specifies the graphic type
+
+ '''
+
+ afile = False
+ try:
+ afile = infile and isfile(infile) or False
+ except:
+ pass
+ if not afile and (infile == '-' or infile is None):
+ try:
+ sys.stdin = codecs.getreader("utf-8")(sys.stdin.detach())
+ except:
+ pass
+ infile = sys.stdin.readlines()
+ if not outinfo:
+ if outfile == '-':
+ outinfo = 'rest'
+ elif outfile is None or callable(outfile):
+ outinfo = 'html'
+ else:
+ _,outinfo = stem_ext(outfile)
+ outinfo = outinfo.strip('.')
+ fext = None
+ if isinstance(infile, str):
+ nextinfile, fext = stem_ext(infile)
+ else:
+ fext = _stpl
+ if outinfo and _is_graphic(outinfo):
+ soi = outinfo.strip('.')
+ nextinfile = soi + '.' + soi
+ else:
+ nextinfile = 'rest' + _rest
+ fn_i_ln = None
+ while fext in converters:
+ if (outfile is None or callable(outfile)) and _is_graphic(fext):
+ outfile = _imgout(nextinfile + fext)
+ try:
+ nextinfile, fextnext = stem_ext(nextinfile)
+ if fextnext not in converters:
+ fextnext = None
+ except:
+ fextnext = None
+ out_ = lambda:outfile if not fextnext else None
+ thisconverter = converters[fext]
+ if thisconverter == dorst:
+ kwargs['fn_i_ln'] = fn_i_ln
+ kwargs.pop('outfile',None)
+ kwargs.pop('outinfo',None)
+ infile = thisconverter(infile, out_(), outinfo, **kwargs)
+ else:
+ if thisconverter == dostpl:
+ kwargs['outinfo'] = outinfo
+ # save infile for dorst() in outinfo as "infile/outinfo"
+ if fextnext in converters and converters[fextnext] == dorst:
+ if isinstance(infile, str):
+ fn_i_ln = _flatten_stpl_includes(infile)
+ else:
+ fn_i_ln = list(_flatten_stpl_includes_it(infile))
+ outinfo = nextinfile + '/' + (outinfo or '') # infile_outinfo
+ infile = thisconverter(infile, out_(), **kwargs)
+ if not infile:
+ break
+ if not fextnext:
+ break
+ fext = fextnext
+ return infile
+
+
+'''
+Same as |dcx.convert|,
+but creates a temporary dir for a list-of-lines infile argument.
+
+::
+
+ >>> tmpfile = convert_in_tempdir("""digraph {
+ ... %for i in range(3):
+ ... "From {{i}}" -> "To {{i}}";
+ ... %end
+ ... }""".splitlines(), outinfo='dot')
+ >>> stem_ext(tmpfile)[1]
+ '.png'
+ >>> tmpfile = convert_in_tempdir("""
+ ... This is re{{'st'.upper()}}
+ ...
+ ... .. _`xx`:
+ ...
+ ... xx:
+ ... text
+ ...
+ ... """.splitlines(), outinfo='rst_html')
+ >>> stem_ext(tmpfile)[1]
+ '.html'
+
+'''
+convert_in_tempdir = in_temp_if_list(infile_cwd(convert))
+
+
+def rindices(regex, lns):
+ r'''
+ Return the indices matching the regular expression ``regex``.
+
+ :param regex: regular expression string or compiled
+ :param lns: lines
+
+ ::
+
+ >>> lns=['a','ab','b','aa']
+ >>> [lns[i] for i in rindices(r'^a\w*', lns)]==['a', 'ab', 'aa']
+ True
+
+ '''
+
+ regex = re.compile(regex)
+ for i, ln in enumerate(lns):
+ if regex.search(ln):
+ yield i
+
+
+def rlines(regex, lns):
+ '''
+ Return the lines matched by ``regex``.
+
+ :param regex: regular expression string or compiled
+ :param lns: lines
+
+ '''
+
+ return [lns[i] for i in rindices(regex, lns)]
+
+
+def intervals(nms # list of indices
+ ):
+ """
+ Return intervals between numbers.
+
+ ::
+
+ >>> intervals([1, 2, 3])==[(1, 2), (2, 3)]
+ True
+
+ """
+ return list(zip(nms[:], nms[1:]))
+
+
+def in2s(nms # list of indices
+ ):
+ """
+ Convert the list into a list of couples of two elements.
+
+ ::
+
+ >>> in2s([1, 2, 3, 4])==[(1, 2), (3, 4)]
+ True
+
+ """
+ return list(zip(nms[::2], nms[1::2]))
+
+
+# re.search(reid,'OpenDevices = None').groups()
+# re.search(reid,'def OpenDevices(None)').groups()
+# re.search(reid,'class OpenDevices:').groups()
+# re.search(reid,' def __init__(a, b):').groups()
+# re.search(relim," '''prefix. ").groups()
+# re.search(relim," '''").groups()
+
+
+def doc_parts(
+ lns,
+ relim=r"^\s*r?'''([\w.:]*)\s*\n*$",
+ reid=r"\s(\w+)[(:]|(\w+)\s\=",
+ reindent=r'[^#/\s]',
+ signature=None,
+ prefix=''
+ ):
+ r'''
+ ``doc_parts()`` yields doc parts delimited by the ``relim`` regular expression,
+ possibly with an id, if ``reid`` matches.
+
+ If start and stop differ, use the regular expression ``|`` in ``relim``.
+
+ - There is no empty line between doc string
+ and preceding code lines that should be included.
+ - There is no empty line between doc string
+ and succeeding code lines that should be included.
+ - Included code lines end with an empty line.
+
+ In case of ``__init__()`` the ID can come from the ``class`` line
+ and the included lines can be those of ``__init__()``,
+ if there is no empty line between the doc string
+ and ``class`` above as well as ``__init__()`` below.
+
+ If the included code comes only from one side of the doc string,
+ have an empty line at the other side.
+
+ Immediately after the initial doc string marker
+ there can be a prefix, e.g. ``classname.``.
+
+ :param lns: list of lines
+ :param relim: regular expression marking lines enclosing the documentation.
+ The group is a prefix.
+ :param reid: extract id from preceding or succeeding non-empty lines
+ :param reindent: determines start of text
+ :param signature: if a signature language is given, the preceding
+ or succeeding lines will be included
+ :param prefix: prefix to make id unique, e.g. module name. Include the dot.
+
+ ::
+
+ >>> with open(__file__) as f:
+ ... lns = f.readlines()
+ ... docparts = list(doc_parts(lns, signature='py'))
+ ... doc_parts_line = rlines('doc_parts', docparts)
+ >>> doc_parts_line[1]
+ ':doc_parts:\n'
+
+ '''
+
+ rlim = re.compile(relim)
+ rid = re.compile(reid)
+ rindent = re.compile(reindent)
+
+ def foundid(lnsi):
+ if not lnsi.strip(): # empty
+ return False
+ id = rid.search(lnsi)
+ if id and id.groups():
+ ids = [x for x in id.groups() if x is not None]
+ if len(ids) > 0:
+ return ids[0]
+
+ ids = []
+
+ def checkid(rng):
+ i = None
+ for i in rng:
+ testid = foundid(lns[i])
+ if testid is False:
+ break
+ elif not ids and isinstance(testid, str):
+ ids.append(testid)
+ return i
+
+ for a, b in in2s(list(rindices(rlim, lns))):
+ try:
+ thisprefix = rlim.search(lns[a]).groups()[0]
+ except:
+ thisprefix = ''
+ ids.clear()
+ i = checkid(range(a - 1, 0, -1))
+ j = checkid(range(b + 1, len(lns)))
+ if ids:
+ yield ''
+ yield '.. _`' + prefix + thisprefix + ids[0] + '`:\n'
+ yield ''
+ yield ':' + prefix + thisprefix + ids[0] + ':\n'
+ yield ''
+ if signature:
+ if i is not None and i < a and i > 0:
+ if not lns[i].strip(): # empty
+ i = i + 1
+ if i < a:
+ yield '.. code-block:: ' + signature + '\n'
+ yield ''
+ yield from (' ' + x for x in lns[i:a])
+ yield ''
+ if j is not None and j > b + 1 and j < len(lns):
+ if not lns[j].strip(): # empty
+ j = j - 1
+ if j > b:
+ yield '.. code-block:: ' + signature + '\n'
+ yield ''
+ yield from (' ' + x for x in lns[b + 1:j + 1])
+ yield ''
+ indent = 0
+ for ln in lns[a + 1:b]:
+ lnst = rindent.search(ln)
+ if lnst and lnst.span():
+ indent = lnst.span()[0]
+ break
+ yield from (x[indent:] for x in lns[a + 1:b])
+
+
+# for generator function, instead of lru_cache()
+_Tee = tee([], 1)[0].__class__
+
+
+def _memoized(f):
+ cache = {}
+
+ def ret(*args):
+ if args not in cache:
+ cache[args] = f(*args)
+ if isinstance(cache[args], (GeneratorType, _Tee)):
+ cache[args], r = tee(cache[args])
+ return r
+ return cache[args]
+
+ return ret
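+
+# Minimal sketch of why tee() is used above: a memoized generator function can be
+# consumed more than once, every call replaying the cached items (the function
+# name is hypothetical):
+#
+#   @_memoized
+#   def _numbers():
+#       yield from range(3)
+#
+#   assert list(_numbers()) == [0, 1, 2]
+#   assert list(_numbers()) == [0, 1, 2]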
+
+
+@lru_cache()
+def _read_lines(fn):
+ lns = []
+ with opn(fn) as f:
+ lns = list(f.readlines())
+ return lns
+
+
+@_memoized
+def rstincluded(
+ fn,
+ paths=(),
+ withimg=False,
+ withrest=False
+ ):
+ '''
+ Yield the files recursively included from an RST file.
+
+ :param fn: file name without path
+ :param paths: paths where to look for fn
+ :param withimg: also yield image files, not just other RST files
+ :param withrest: also yield ``.rest`` files (normally they are not included)
+
+ ::
+
+ >>> olddir = os.getcwd()
+ >>> cd(dirname(__file__))
+ >>> list(rstincluded('ra.rest',('../doc',)))
+ ['ra.rest.stpl', '_links_sphinx.rst']
+ >>> list(rstincluded('sr.rest',('../doc',)))
+ ['sr.rest', '_links_sphinx.rst']
+ >>> list(rstincluded('meta.rest',('../doc',)))
+ ['meta.rest', 'files.rst', '_traceability_file.rst', '_links_...']
+ >>> 'dd.rest' in list(rstincluded(
+ ... 'index.rest',('../doc',), False, True))
+ True
+ >>> cd(olddir)
+
+ '''
+
+ p = ''
+ for p in paths:
+ nfn = normjoin(p, fn)
+ if exists(nfn + _stpl): # first, because original
+ nfn = nfn + _stpl
+ yield fn + _stpl
+ break
+ elif exists(nfn): # while this might be generated
+ yield fn
+ break
+ else:
+ nfn = fn
+ yield fn
+ lns = _read_lines(nfn)
+ toctree = False
+ if lns:
+ for aln in lns:
+ if toctree:
+ toctreedone = False
+ if aln.startswith(' '):
+ fl = aln.strip()
+ if fl.endswith(_rest) and exists(normjoin(p, fl)):
+ toctreedone = True
+ yield from rstincluded(fl, paths)
+ continue
+ elif toctreedone:
+ toctree = False
+ if aln.startswith('.. toctree::'):
+ if withrest:
+ toctree = True
+ elif aln.strip().startswith('.. '):
+ # aln = ' .. include:: some.rst'
+ # aln = ' .. include:: ../some.rst'
+ # aln = '.. include:: some.rst'
+ # aln = '.. include:: ../some.rst'
+ # aln = ' .. image:: some.png'
+ # aln = '.. image:: some.png'
+ # aln = ' .. figure:: some.png'
+ # aln = ' .. |x y| image:: some.png'
+ try:
+ f, t, _ = rerstinclude.split(aln)
+ nf = not f.strip() and t
+ if nf:
+ if is_rest(nf) and not withrest:
+ continue
+ yield from rstincluded(nf.strip(), paths)
+ except:
+ if withimg:
+ m = reximg.search(aln)
+ if m:
+ yield m.group(1)
+ elif restplinclude.match(aln):
+ # aln="%include('some.rst.tpl', v='param')"
+ # aln=" %include('some.rst.tpl', v='param')"
+ f, t, _ = restplinclude.split(aln)
+ nf = not f.strip() and t
+ if nf:
+ thisnf = normjoin(p, nf)
+ if not exists(thisnf):
+ parntnf = normjoin(p, '..', nf)
+ if exists(parntnf):
+ nf = parntnf
+ else:
+ continue
+ yield from rstincluded(nf.strip(), paths)
+
+
+_traceability_instance = None
+
+
+class Traceability:
+ def __init__(self, tracehtmltarget):
+ self.tracehtmltarget = tracehtmltarget
+ self.fcaobjsets = []
+ global _traceability_instance
+ _traceability_instance = self
+ self.counters = None
+
+ def appendobject(self, aset):
+ self.fcaobjsets.append(aset)
+
+ def isempty(self):
+ return len(self.fcaobjsets) == 0
+
+ # returns the rst lines of _traceability_file
+ def create_traceability_file(self, linkroot):
+ if not pyfca:
+ return []
+ if not self.fcaobjsets:
+ return []
+ config = conf_py(linkroot)
+ target_id_group = config['target_id_group']
+ target_id_color = config['target_id_color']
+ rextrace_target_id = re.compile(config['rextrace_target_id'])
+
+ def _drawnode(canvas, node, parent, center, radius):
+ fillcolors = []
+ nodetgtgrps = {target_id_group(x) for x in node.intent}
+ for _, (groupid, groupcolor) in target_id_color.items():
+ if groupid in nodetgtgrps:
+ fillcolors.append(groupcolor)
+ n_grps = len(fillcolors)
+ for i in range(n_grps - 1, -1, -1):
+ rr = int(radius * (i + 1) / n_grps)
+ parent.add(
+ canvas.circle(
+ center, rr, fill=fillcolors[i], stroke='black'))
+
+ fca = pyfca.Lattice(self.fcaobjsets,
+ lambda x: set(xe for xe in x if rextrace_target_id.match(xe)))
+ tr = 'tr'
+
+ # |trXX|, |trYY|, ...
+
+ def reflist(x, pfx=tr):
+ return (
+ '|' + pfx +
+ ('|, |' + pfx).join([str(e)
+ for e in sorted(x)]) + '|') if x else ''
+
+ fcanodes = [(".. _`" + tr + "{0}`:\n\n:" + tr +
+ "{0}:\n\n{1}\n\nUp: {2}\n\nDown: {3}\n\n").format(
+ n.index, reflist(n.intent, ''), reflist(n.up),
+ reflist(n.down)) for n in fca.nodes]
+ tlines = ''.join(fcanodes).splitlines(keepends=True)
+ # fig_traceability_file target
+ tlines.extend([
+ '.. _`fig' + _traceability_file + '`:\n', '\n',
+ '.. figure:: ' + _traceability_file + '.png\n', ' :name:\n', '\n',
+ ' |fig' + _traceability_file + '|: `FCA <%s>`__ %s' % (
+ "https://en.wikipedia.org/wiki/Formal_concept_analysis",
+ "diagram of dependencies"
+ )
+ ])
+ if target_id_color is not None:
+ legend = ', '.join(
+ [fnm + " " + clr for fnm, (_, clr) in target_id_color.items()])
+ tlines.extend([': ' + legend, '\n'])
+ tlines.append('\n')
+ with opnwrite(normjoin(linkroot, _traceability_file + _rst)) as f:
+ f.write('.. raw:: html\n\n')
+ f.write(' \n')
+ if target_id_color is not None:
+ f.write(
+ '''
+ FCA
+ diagram of dependencies with clickable nodes: ''' % (
+ "https://en.wikipedia.org/wiki/Formal_concept_analysis"
+ )
+ + legend + '
+ \n\n')
+ f.writelines(tlines)
+ ld = pyfca.LatticeDiagram(fca, 4 * 297, 4 * 210)
+
+ tracesvg = abspath(normjoin(linkroot, _traceability_file + _svg))
+
+ def ttgt():
+ return self.tracehtmltarget.endswith(_rest) and stem(
+ self.tracehtmltarget) or self.tracehtmltarget
+
+ ld.svg(
+ target=ttgt() + '.html#' + tr, drawnode=_drawnode).saveas(tracesvg)
+ if exists(tracesvg):
+ tracepng = abspath(normjoin(linkroot, _traceability_file + _png))
+ svgpng(tracesvg, tracepng)
+ return tlines
+
+
+def pair(alist, blist, cmp):
+ '''
+ Pair two sorted lists,
+ where the second must be at least as long as the first.
+
+ :param alist: first list
+ :param blist: second list longer or equal to alist
+ :param cmp: compare function
+
+ ::
+
+ >>> alist=[1,2,4,7]
+ >>> blist=[1,2,3,4,5,6,7]
+ >>> cmp = lambda x,y: x==y
+ >>> list(pair(alist,blist,cmp))
+ [(1, 1), (2, 2), (None, 3), (4, 4), (None, 5), (None, 6), (7, 7)]
+
+ >>> alist=[1,2,3,4,5,6,7]
+ >>> blist=[1,2,3,4,5,6,7]
+ >>> cmp = lambda x, y: x==y
+ >>> list(pair(alist, blist, cmp))
+ [(1, 1), (2, 2), (3, 3), (4, 4), (5, 5), (6, 6), (7, 7)]
+
+ '''
+
+ i = 0
+ for aa, bb in zip(alist, blist):
+ if not cmp(aa, bb):
+ break
+ i = i + 1
+ yield aa, bb
+ alen = len(alist)
+ tlen = max(alen, len(blist))
+ d = 0
+ for j in range(i, alen):
+ for dd in range(tlen - j - d):
+ bb = blist[j + d + dd]
+ if not cmp(alist[j], bb):
+ yield None, bb
+ else:
+ yield alist[j], bb
+ d = d + dd
+ break
+ else:
+ return
+
+
+def gen(
+ source,
+ target=None,
+ fun=None,
+ **kw
+ ):
+ '''
+ Take the ``gen_[fun]`` functions
+ enclosed by ``#def gen_[fun](lns,**kw)`` to create a new file.
+
+ :param source: either a list of lines or a path to the source code
+ :param target: either save to this file
+ or return the generated documentation
+ :param fun: use ``#def gen_[fun](lns,**kw):`` to extract the documentation
+ :param kw: kw arguments to the ``gen_()`` function
+
+ ::
+
+ >>> source=[i+'\\n' for i in """
+ ... #def gen(lns,**kw):
+ ... # return [l.split('#@')[1] for l in rlines(r'^\s*#@', lns)]
+ ... #def gen
+ ... #@some lines
+ ... #@to extract
+ ... """.splitlines()]
+ >>> [l.strip() for l in gen(source)]
+ ['some lines', 'to extract']
+
+ '''
+
+ if isinstance(source, str):
+ lns = []
+ try:
+ lns = _read_lines(source)
+ except:
+ sys.stderr.write("ERROR: {} cannot be opened\n".format(source))
+ return
+ else:
+ lns = source
+ source = ""
+ if '.' not in sys.path:
+ sys.path.append('.')
+ if fun:
+ # fun ='sdf'
+ gen_regex = r'#\s*def gen_' + fun + r'(\w*(lns,\*\*kw):)*'
+ # re.compile(gen_regex).search('#def gen_sdf(lns,**kw):') #begin
+ # re.compile(gen_regex).search('#def gen_sdf') #end
+ else:
+ gen_regex = r'#\s*def gen(\w*(lns,\*\*kw):)*'
+ # re.compile(gen_regex).search('# def gen(lns,**kw):') #begin
+ # re.compile(gen_regex).search('# def gen') #end
+ iblks = list(rindices(gen_regex, lns))
+ py3 = [
+ lns[k][lns[i].index('#') + 1:] for i, j in in2s(iblks)
+ for k in range(i, j)
+ ]
+ indent = py3[0].index(py3[0].lstrip())
+ py3 = '\n'.join(x[indent:] for x in py3)
+ eval(compile(py3, source + r'#\s*gen', 'exec'), globals())
+ if fun:
+ gened = list(eval('gen_' + fun + '(lns,**kw)'))
+ else: # else eval all gen_ funtions
+ gened = []
+ for i in iblks[0::2]:
+ gencode = re.split(r"#\s*def |:", lns[i])[1] # gen(lns,**kw)
+ gened += list(eval(gencode))
+ if target:
+ drn = dirname(target)
+ if drn and not exists(drn):
+ mkdir(drn)
+ with opnwrite(target) as o:
+ o.write(''.join(((x or '\n') for x in gened)))
+ else:
+ return gened
+
+
+def parsegenfile(genpth):
+ '''
+ Parse the file ``genpth`` which is either
+
+ - python code or
+
+ - has format ::
+
+ sourcefile | targetfile | suffix | kw params or {}
+
+ ``suffix`` refers to ``gen_[suffix]``.
+
+ The yields are used for the |dcx.gen| function.
+
+ :param genpth: path to gen file
+
+ '''
+
+ try:
+ genfilelns = _read_lines(genpth)
+ except: #noqa
+ sys.stderr.write("ERROR: {} cannot be opened\n".format(genpth))
+ return
+
+ try: #python code return [[from,to,fun,kw],...]?
+ genconfig= {'__file__':abspath(genpth)}
+ gencode = '\n'.join(genfilelns)
+ eval(compile(gencode, genpth, 'exec'), genconfig)
+ for f,t,d,kw in genconfig['from_to_fun_kw']:
+ yield f,t,d,kw # if python code, the last entry is not a string as below
+ except:
+ for ln in genfilelns:
+ if ln[0] != '#':
+ try:
+ f, t, d, a = [x.strip() for x in ln.split('|')]
+ kw = eval(a)
+ yield f, t, d, kw
+ except:
+ pass
+
+
+def _flatten_stpl_includes_it(fn):
+ """
+ This flattens the .stpl includes
+ to have all targets align to those in the RST file.
+ Targets must be *explicit* in all ``.stpl`` and ``.tpl``,
+ i.e. they must not be created by stpl code.
+ This is needed to make the .tags jump to the original
+ and not the generated file.
+ """
+ flns = []
+ if isinstance(fn, str):
+ if exists(fn):
+ flns = _read_lines(fn)
+ else:
+ parnt = updir(fn)
+ if exists(parnt):
+ flns = _read_lines(parnt)
+ else:
+ flns = fn
+ fn = '-'
+ for i, ln in enumerate(flns):
+ # ln = '% include("../test.rst.stpl", v="aparam")'
+ m = restplinclude.match(ln)
+ if m:
+ includedtpl = m.group(1)
+ yield from _flatten_stpl_includes(
+ normjoin(dirname(fn), includedtpl))
+ else:
+ yield fn, i, ln
+
+
+@lru_cache()
+def _flatten_stpl_includes(fn):
+ return list(_flatten_stpl_includes_it(fn))
+
+
+class Tgt:
+
+ line_search_range = 8
+
+ def __init__(
+ self,
+ lnidx, # line index
+ target # target name
+ ):
+ self.lnidx = lnidx
+ self.target = target
+ self.tagentry = None # (path, line index)
+ self.lnkname = None # link name
+
+ def is_inside_literal(self, lns):
+ try: # skip literal blocks
+ indentation = re.search(r'\w', lns[self.lnidx]).span()[0] - 3
+ if indentation > 0:
+ for iprev in range(self.lnidx - 1, 0, -1):
+ prev = lns[iprev]
+ if prev:
+ newspc, _ = next((ich, ch)
+ for ich, ch in enumerate(prev)
+ if ch != ' ' and ch != '\t')
+ if newspc < indentation:
+ prev = prev.strip()
+ if prev:
+ if not prev.startswith(
+ '.. ') and prev.endswith('::'):
+ return True
+ return False
+ except:
+ pass
+
+ def find_lnkname(self,
+ lns,
+ counters
+ ):
+ """Tgt.
+
+ Determines the link name for this target.
+ It searches the following lines for either
+
+ - a title
+ - ``:name:`` immediately below a directive
+ (a counter is used if no name is given)
+ - a ``:xxx:`` or ``xxx:`` or
+ - a single word ``xxx``
+
+ The link name must not contain formatting,
+ e.g. "``link name``:" is not possible.
+
+ :param lns: the rest lines
+ :param counters: the counters for the directives (see make_counters())
+
+ """
+ lenlns = len(lns)
+ lnkname = self.target
+ for j in range(self.lnidx + 2, self.lnidx + self.line_search_range):
+ # j=i+2
+ if j > lenlns - 1:
+ break
+ lnj = lns[j]
+ if rextitle.match(lnj):
+ lnkname = lns[j - 1].strip()
+ if not lnkname:
+ lnkname = lns[j + 1].strip()
+ break
+ # j, lns=1,".. figure::\n :name: linkname".splitlines();lnj=lns[j]
+ # j, lns=1,".. figure::\n :name:".splitlines();lnj=lns[j]
+ # j, lns=1,".. math::\n :name: linkname".splitlines();lnj=lns[j]
+ itm = rexname.match(lnj)
+ if itm:
+ lnkname, = itm.groups()
+ lnj1 = lns[j - 1].split('::')[0].replace(
+ 'list-table', 'table').replace('code-block',
+ 'code').strip()
+ if counters and not lnkname and lnj1 in counters:
+ lnkname = name_from_directive(
+ lnj1.strip('. '), counters[lnj1])
+ counters[lnj1] += 1
+ break
+ elif lnkname:
+ lnkname = lnkname.strip()
+ break
+ itm = rexitem.match(lnj)
+ if itm:
+ lnkname, = itm.groups()
+ break
+ itm = rexoneword.match(lnj)
+ if itm:
+ lnkname, = itm.groups()
+ break
+ lnkname = self.target
+ self.lnkname = lnkname
+
+ def create_link(self,
+ linktype,
+ reststem,
+ tool
+ ):
+ """Tgt.
+
+ Creates a link.
+ If both linktype and reststem are empty,
+ then this is an internal link.
+
+ :param linktype: file extension:
+ one of rest, html, docx, odt, latex, pdf
+ :param reststem: the file name without extension
+ (not used for linktype='sphinx' or 'rest')
+ :param tool: pandoc, sphinx or rst
+
+ """
+ if reststem and linktype:
+ targetfile = reststem + '.' + linktype
+ else:
+ targetfile = ''
+ id = self.target
+ if linktype == 'latex':
+ linktype = 'pdf'
+ if tool == 'sphinx':
+ tgte = ".. |{0}| replace:: :ref:`{1}<{2}>`\n".format(
+ self.target, self.lnkname, id)
+ else:
+ if linktype == 'odt':
+ # https://github.com/jgm/pandoc/issues/3524
+ tgte = ".. |{0}| replace:: `{1} `__\n".format(
+ self.target, self.lnkname, targetfile, id)
+ else:
+ # https://sourceforge.net/p/docutils/bugs/378/
+ tgte = ".. |{0}| replace:: `{1} `__\n".format(
+ self.target, self.lnkname, targetfile, id)
+ if tool == 'rst' and linktype == 'html':
+ return _rst_id_fix(tgte)
+ else:
+ return tgte
+
+ def create_tag(self):
+ return r'{0} {1} /\.\. _`\?{0}`\?:/;" line:{2}'.format(
+ self.target, self.tagentry[0], self.tagentry[1])
+
+
+class RstFile:
+ def __init__(self, reststem, doc, tgts, lnks, nlns):
+ '''RstFile.
+
+ Contains the targets for a ``.rst`` or ``.rest`` file.
+
+ :param reststem: rest file this doc belongs to (without extension)
+ :param doc: doc belonging to reststem,
+ either included or itself (.rest, .rst, .stpl)
+ :param tgts: list of Tgt objects yielded by |dcx.RstFile.make_tgts|.
+ :param lnks: list of (line index, target name (``|target|``)) tuples
+ :param nlns: number of lines of the doc
+
+ '''
+
+ self.reststem = reststem
+ self.doc = doc
+ self.tgts = tgts
+ self.lnks = lnks
+ self.nlns = nlns
+
+ def __str__(self):
+ return str((self.doc, self.reststem))
+
+ def add_links_and_tags(self, add_tgt, add_linksto):
+ iterlnks = iter(self.lnks)
+ prevtgt = None
+ # unknowntgts = []
+ tgt = None
+ for tgt in self.tgts:
+ if tgt.lnidx is not None:
+ add_linksto(prevtgt, tgt.lnidx, iterlnks) # , unknowntgts)
+ add_tgt(tgt, self.reststem)
+ prevtgt = tgt
+ if tgt:
+ add_linksto(prevtgt, tgt.lnidx, iterlnks) # , unknowntgts)
+
+ @staticmethod
+ def make_lnks(lns # lines of the document
+ ):
+ """RestFile.
+
+ Yields (index, link name) for ``lns``.
+
+ """
+
+ for i, ln in enumerate(lns):
+ mo = rexlnks.findall(ln)
+ for g in mo:
+ yield i, g
+
+ @staticmethod
+ def make_tgts(
+ lns,
+ doc,
+ counters=None,
+ fn_i_ln=None
+ ):
+ '''RstFile.
+
+ Yields ``((line index, tag address), target, link name)``
+ of ``lns`` of a restructureText file.
+ For a .stpl file the linkname comes from the generated RST file.
+
+ :param lns: lines of the document
+ :param doc: the rst/rest document for tags
+ :param counters: if None, the starts with
+ {".. figure":1,".. math":1,".. table":1,".. code":1}
+ :param fn_i_ln: (fn, i, ln) of the .stpl with all stpl includes sequenced
+
+ '''
+
+ if counters is None:
+ counters = make_counters()
+ itgts = list(rindices(rextgt, lns))
+ if fn_i_ln:
+ lns1 = [x[2] for x in fn_i_ln]
+ itgts1 = list(rindices(rextgt, lns1))
+ else:
+ lns1 = lns
+ itgts1 = itgts
+ if len(itgts) < len(itgts1):
+ paired_itgts_itgts1 = pair(itgts, itgts1,
+ lambda x, y: lns[x] == lns1[y])
+ elif len(itgts) > len(itgts1):
+ paired_itgts_itgts1 = ((i, j) for (
+ j, i) in pair(itgts1, itgts, lambda x, y: lns1[x] == lns[y]))
+ else:
+ paired_itgts_itgts1 = zip(itgts, itgts1)
+ lenlns = len(lns)
+ lenlns1 = len(lns1)
+ for i, i1 in paired_itgts_itgts1:
+ ii, iis, _ = (i, lns, lenlns) if i else (i1, lns1, lenlns1)
+ cur = iis[ii]
+ tgt = Tgt(ii, rextgt.search(cur).group(1))
+ if tgt.is_inside_literal(iis):
+ continue
+ tgt.find_lnkname(iis, counters)
+ tgt.lnkidx = i
+ if i1:
+ if fn_i_ln:
+ tgt.tagentry = fn_i_ln[i1][:2]
+ else:
+ tgt.tagentry = (doc, ii)
+ else:
+ tgt.tagentry = (doc.replace(_stpl, ''), ii)
+ yield tgt
+
+ @staticmethod
+ def substs(lns # lines of the rst document
+ ):
+ """RestFile.
+
+ Return all substitution targets in the rst lns
+
+ ::
+
+ >>> list(RstFile.substs('''
+ ... .. |sub| image:: xx
+ ... .. |s-b| date::
+ ... '''.splitlines()))
+ ['sub', 's-b']
+
+ """
+
+ for i, ln in enumerate(lns):
+ asub = rexsubtgt.search(ln)
+ if asub:
+ yield asub.group(1)
+
+
+g_links_types = "sphinx latex html pdf docx odt".split()
+class Fldr(OrderedDict):
+ def __init__(
+ self,
+ folder,
+ linkroot,
+ scanroot='.'
+ ):
+ """
+ Represents a directory.
+
+ It is an ordered list of {rest file: RstFile object}.
+
+ :self.folder: is the directory path
+ :self.linkroot: is the directory relative to which links are made
+ :self.scanroot: is the directory where the scan began
+ :self.allfiles: set of all files in the directory
+ :self.alltgts: set of all targets in the directory
+ :self.allsubsts: set of all substitutions in the directory
+ :self.counters: the counters for each rest file
+
+ """
+
+ self.folder = folder
+ self.linkroot = linkroot
+ self.scanroot = scanroot
+ self.allfiles = set()
+ self.alltgts = set()
+ self.allsubsts = set()
+ self.rest_counters = defaultdict(dict)
+
+ def __str__(self):
+ return str(list(sorted(self.keys())))
+
+ def scanfiles(
+ self,
+ fs
+ ):
+ """Fldr.
+
+ Scans the directory for rest files.
+ All files (.rest and included .rst)
+ are added if there is at least one ``.rest[.stpl]``.
+
+ :param fs: all files in the directory as returned by ``os.walk()``
+
+ Sphinx index.rest is processed last.
+
+ ``allfiles``, ``alltgts`` and ``allsubsts`` get filled.
+
+ """
+
+ sofar = set([])
+ sphinx_index = None
+ # reversed puts the rest.stpl before the .rest
+ for afs in reversed(sorted(fs)):
+ fullpth = normjoin(self.folder, afs).replace("\\", "/")
+ if is_rest(afs):
+ if afs.startswith('index'+_rest):
+ sphinx_index = afs
+ continue
+ fullpth_nostpl = fullpth.replace(_stpl, '')
+ if fullpth_nostpl in sofar:
+ continue
+ sofar.add(fullpth_nostpl)
+ self.add_rest(afs)
+ if sphinx_index:
+ self.add_rest(sphinx_index)
+
+ def add_rest(self,
+ restfile,
+ exclude_paths_substrings=['_links_', _traceability_file]):
+ """Fldr.
+
+ Scans a rest file for included files and constructs all the targets.
+
+ """
+
+ pths = []
+ has_traceability = False
+ for restinc in rstincluded(restfile, (self.folder, )):
+ pth = normjoin(self.folder, restinc).replace("\\", "/")
+ if _traceability_file + _rst in restinc:
+ if pyfca and _traceability_instance is None:
+ Traceability(stem(restfile))
+ has_traceability = True
+ continue
+ if any(x in pth for x in exclude_paths_substrings):
+ continue
+ pths.append(pth)
+
+ assert pths, "No file for "+restfile+" due to excluded " + str(exclude_paths_substrings)
+ reststem = pths[0]
+ reststem = stem(stem(reststem))
+ if reststem not in self.rest_counters:
+ self.rest_counters[reststem] = make_counters()
+ counters = self.rest_counters[reststem]
+ if has_traceability:
+ _traceability_instance.counters = counters
+
+ self.allfiles |= set(pths)
+
+ for doc in pths:
+ rstpath = doc.replace(_stpl, '')
+ if doc.endswith(_stpl) and exists(rstpath):
+ lns = _read_lines(doc.replace(_stpl, ''))
+ fn_i_ln = _flatten_stpl_includes(doc)
+ tgts = list(RstFile.make_tgts(lns, doc, counters, fn_i_ln))
+ elif not doc.endswith(_tpl) and not doc.endswith(_txt) and exists(
+ doc):
+ lns = _read_lines(doc)
+ tgts = list(RstFile.make_tgts(lns, doc, counters))
+ else:
+ continue
+ lnks = list(RstFile.make_lnks(lns))
+ relp = relpath(reststem,start=self.linkroot)
+ rstfile = RstFile(relp, doc, tgts, lnks, len(lns))
+ self[doc] = rstfile
+ self.alltgts |= set([t.target for t in rstfile.tgts])
+ self.allsubsts |= set(RstFile.substs(lns))
+
+ def create_links_and_tags(self):
+ """Fldr.
+
+ Appends to ``_links_xxx.rst`` in linkroot and to ``.tags`` in scanroot for files in this folder.
+
+ The target IDs are grouped using target_id_group().
+ To every group a color is associated. See ``conf.py``.
+ This is used to color an FCA lattice diagram
+ in "_traceability_file.rst".
+ The diagram nodes are clickable in HTML.
+
+ """
+
+ tagentries = []
+ lnkrelfolder = ''
+ if self.folder.strip():
+ lnkrelfolder = relpath(self.linkroot, start=self.folder)
+ linkfiles = [(linktype, []) for linktype in g_links_types]
+
+ def add_tgt(tgt, reststem):
+ for linktype, linklines in linkfiles:
+ linklines.append(
+ tgt.create_link(
+ linktype, normjoin(lnkrelfolder,reststem),
+ linktype if linktype == 'sphinx' else 'pandoc'))
+ if isabs(tgt.tagentry[0]):
+ tgt.tagentry = (relpath(tgt.tagentry[0], start=self.scanroot),
+ tgt.tagentry[1])
+ tgt.tagentry = (tgt.tagentry[0], tgt.tagentry[1])
+ newtag = tgt.create_tag()
+ tagentries.append(newtag)
+
+ def add_links_comments(comment):
+ for _, linklines in linkfiles:
+ linklines.append(comment)
+
+ def add_linksto(prevtgt, lnidx, iterlnks, ojlnk=[0, None]):
+ # all the links from the block following prevtgt up to this tgt
+ linksto = []
+
+ def chkappend(x):
+ if not prevtgt or x != prevtgt.target:
+ linksto.append(x)
+
+ if ojlnk[1] and ojlnk[0] < lnidx: # first link in the new prevtgt
+ if ojlnk[1] in self.alltgts:
+ chkappend(ojlnk[1])
+ elif ojlnk[1] not in self.allsubsts:
+ linksto.append('-' + ojlnk[1])
+ # unknowntgts.append(ojlnk[1])
+ ojlnk[1] = None
+ if ojlnk[1] is None: # remaining links in prevtgt up to this tgt
+ for j, lnk in iterlnks:
+ if j > lnidx: # links up to to this target
+ ojlnk[:] = j, lnk
+ break
+ else:
+ if lnk in self.alltgts:
+ chkappend(lnk)
+ elif lnk not in self.allsubsts:
+ linksto.append('-' + lnk)
+ # unknowntgts.append(lnk)
+ if _traceability_instance:
+ if prevtgt and linksto:
+ _traceability_instance.appendobject(linksto+[prevtgt.target])
+ if linksto:
+ linksto = '.. .. ' + ','.join(linksto) + '\n\n'
+ add_links_comments(linksto)
+
+ for rstfile in self.values():
+ add_links_comments('\n.. .. {0}\n\n'.format(rstfile.doc))
+ rstfile.add_links_and_tags(add_tgt, add_linksto)
+ if _traceability_instance and self.linkroot==self.folder:
+ tlines = _traceability_instance.create_traceability_file(self.linkroot)
+ trcrst = normjoin(self.linkroot, _traceability_file + _rst)
+ if tlines:
+ for tgt in RstFile.make_tgts(tlines, trcrst,
+ _traceability_instance.counters):
+ add_tgt(tgt, _traceability_instance.tracehtmltarget)
+ for linktype, linklines in linkfiles:
+ with opnappend(normjoin(self.linkroot,
+ '_links_'+linktype+_rst)) as f:
+ f.write('\n'.join(linklines))
+ ctags_python = ""
+ try:
+ ctags_python = cmd(
+ [
+ 'ctags', '-R', '--sort=0', '--fields=+n',
+ '--languages=python', '--python-kinds=-i', '-f', '-', '*'
+ ],
+ cwd=self.scanroot)
+ finally:
+ with opnappend(normjoin(self.scanroot, '.tags')) as f:
+ if ctags_python: f.write(ctags_python)
+ if tagentries: f.write('\n'.join(tagentries)+'\n')
+
+
+class Fldrs(OrderedDict):
+ def __init__(
+ self,
+ scanroot='.'
+ ):
+ """
+ Represents a directory hierarchy below ``scanroot``.
+
+ :param scanroot: root path to start scanning
+ for independent doc directories
+
+ .tags: paths are relative to ``scanroot``.
+
+ _links_xxx.rst: paths are relative to the first directory with a ``.rest``.
+ Place e.g. index.rest in a folder above, to link between folders.
+
+ """
+
+ self.scanroot = scanroot
+
+ def __str__(self):
+ return super().__str__()
+
+ def scandirs(self):
+ #_images, and dot files excluded
+ notexcluded = lambda d: not d.startswith('_') and not (len(d)>1 and d[0]=='.' and d[1]!='.')
+ linkroot = None
+ for p, ds, fs in os.walk(self.scanroot):
+ ds[:] = [d for d in ds if notexcluded(d)]
+ if notexcluded(base(p)):
+ njp = normjoin(p)
+ fldr = Fldr(njp,linkroot or njp,self.scanroot)
+ fldr.scanfiles(fs)
+ if len(fldr):
+ self[njp] = fldr
+ if not linkroot or not abspath(njp).startswith(abspath(linkroot)):
+ linkroot = njp
+ for linktype in g_links_types:
+ with opnwrite(normjoin(linkroot,
+ '_links_'+linktype+_rst)) as f:
+ f.write('.. .. .. %s'%linkroot)
+ rmrf(normjoin(self.scanroot, '.tags'))
+
+def links_and_tags(
+ scanroot='.'
+ ):
+ '''
+ Creates ``_links_xxx.rst`` files and a ``.tags``.
+
+ :param scanroot: directory for which to create links and tags
+
+ ::
+
+ >>> olddir = os.getcwd()
+ >>> cd(dirname(__file__))
+ >>> rmrf('../doc/_links_sphinx.rst')
+ >>> '_links_sphinx.rst' in ls('../doc')
+ False
+
+ >>> links_and_tags('../doc')
+ >>> '_links_sphinx.rst' in ls('../doc')
+ True
+ >>> cd(olddir)
+
+ '''
+
+ fldrs = Fldrs(scanroot)
+ fldrs.scandirs()
+ #reversed to do create_traceability_file at the end
+ for folder,fldr in reversed(fldrs.items()):
+ fldr.create_links_and_tags()
+
+def _kw_from_path(kwpth,rexkwsplit=rexkwsplit):
+ """use names of path up to project root as keywords
+
+ ::
+
+ >>> kwpth="/projects/me_about-this-1.rst"
+ >>> _kw_from_path(kwpth)==frozenset({'me', 'this', '1', 'about'})
+ True
+
+ """
+ fr = kwpth
+ fn = None
+ while True:
+ fr,fn = dir_base(fr)
+ if not fn:
+ break
+ if exists(fr):
+ ipr = any(is_project_root_file(x) for x in os.listdir(fr))
+ if ipr:
+ break
+ if fn:
+ fn = relpath(kwpth,fr)
+ else:
+ fn = base(kwpth)
+ fpth = stem(fn)
+ if fpth.endswith(_rst) or fpth.endswith(_rest):
+ fpth = stem(fpth)
+ res = re.split(rexkwsplit,fpth)
+ return frozenset(res)
+
+def _kw_from_line(ln,rexkwsplit=rexkwsplit):
+ """make a frozenset out of keyword line
+
+ ::
+
+ >>> ln='.. {kw1,kw2-kw3.kw4}'
+ >>> _kw_from_line(ln) == frozenset({'kw1','kw2','kw3','kw4'})
+ True
+ >>> ln=' .. {kw1,trag}'
+ >>> _kw_from_line(ln) == frozenset({'kw1', 'trag'})
+ True
+
+ """
+ return frozenset(x for x in re.split(rexkwsplit,ln.lower()) if x)
+
+def grep(
+ regexp=rexkw,
+ dir=None,
+ exts=set(['.rst','.rest','.stpl','.tpl','.adoc','.md','.wiki','.py','.jl','.lua','.tex',
+ '.js', '.h','.c','.hpp','.cpp','.java','.cs','.vb','.r','.sh','.vim','.el',
+ '.php','.sql','.swift','.go','.rb','.m','.pl','.rs','.f90','.dart',
+ '.yml','.mm','.d','.lsp','.kt','.hs','.lhs','.ex','.scala','.clj']),
+ **kwargs
+):
+ '''
+ .. {grep}
+
+ Uses python re to find ``regexp`` and yield
+ ``(file, 1-based line number, line)`` tuples
+ in *dir* (default: os.getcwd()) for files whose extension is in ``exts``
+
+ :param regexp: default is '^\s*\.\. {'
+ :param dir: default is current dir
+ :param exts: the extensions of the files searched
+
+
+ '''
+
+ if dir is None:
+ dir = os.getcwd()
+ regexp = re.compile(regexp)
+ for root, dirs, files in os.walk(dir):
+ for name in files:
+ if any(name.endswith(ext) for ext in exts):
+ f = normjoin(root,name)
+ if not f.endswith('.py') and not f.endswith(_stpl) and exists(f+_stpl):
+ continue
+ with open(f,encoding="utf-8") as fb:
+ lines=[l.strip() for l in fb.readlines()]
+ res = [(i,lines[i]) for i in rindices(regexp, lines)]
+ for (i,l) in res:
+ yield (f,i+1,l)
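+
+ # Usage sketch (illustrative only): grep() yields lazily, so results can be
+ # consumed in a loop, e.g. to list all keyword lines below the current dir:
+ #
+ # for fn, lineno, line in grep(r'^\s*\.\. \{'):
+ #     print('{}:{}: {}'.format(fn, lineno, line))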
+
+def yield_with_kw (kws, fn_ln_kw=None, **kwargs):
+ '''
+ Find keyword lines, either in the ``fn_ln_kw`` list or via grep(),
+ that contain the keywords in ``kws``.
+
+ Keyword lines are either of::
+
+ .. {{{kw1,kw2
+ .. {kw1,kw2}
+ {{_ID3('kw1 kw2')}}
+ %__ID3('kw1 kw2')
+ :ID3: kw1 kw2
+
+ ``..`` can also be two comment chars of popular programming languages.
+ This is due to ``dcx.rexkw``, which you can change.
+ See also ``dcx.grep()`` for the keyword parameters.
+
+ :param kws: string of keywords; it is split at non-word characters
+ :param fn_ln_kw: list of (file, line, keywords) tuples
+ or ``regexp`` for grep()
+
+ ::
+
+ >>> list(yield_with_kw('a',[('a/b',1,'a b'),('c/d',1,'c d')]))
+ [(0, ['a/b', 1, 'a b'])]
+ >>> list(yield_with_kw('a c',[('a/b',1,'a b'),('c/d',1,'c d')]))
+ []
+ >>> list(yield_with_kw('a',[('a/b',1,'a b'),('c/d',1,'a c d')]))
+ [(0, ['a/b', 1, 'a b']), (1, ['c/d', 1, 'a c d'])]
+ >>> kwargs={'dir':normjoin(dirname(__file__),'../test/fixtures')}
+ >>> kws = 'svg'
+ >>> len(list(yield_with_kw(kws,**kwargs)))
+ 6
+ >>> kws = 'png'
+ >>> len(list(yield_with_kw(kws,**kwargs)))
+ 7
+
+ '''
+
+ if fn_ln_kw is None:
+ fn_ln_kw = grep(**kwargs)
+ elif isinstance(fn_ln_kw,str) or isinstance(fn_ln_kw,re.Pattern):
+ fn_ln_kw = grep(fn_ln_kw, **kwargs)
+ arexkwsplit=kwargs.get('rexkwsplit',rexkwsplit)
+ oldfn = None
+ qset = _kw_from_line(kws,rexkwsplit=arexkwsplit)
+ for i,(fn,ln,kw) in enumerate(fn_ln_kw):
+ #i,(fn,ln,kw) = next(enumerate(fn_ln_kw))
+ if fn != oldfn:
+ fnkw = _kw_from_path(fn,rexkwsplit=arexkwsplit)
+ oldfn = fn
+ kws = _kw_from_line(kw,rexkwsplit=arexkwsplit)|fnkw
+ if kws and qset<=kws:
+ yield i,[fn,ln,kw]
+
+
+# ==============> pdt
+
+class Counter:
+ def __init__(self, before_first=0):
+ '''Counter.
+
+ A simple incrementing counter.
+
+ :param before_first: value just before the first count (i.e. first - 1)
+
+ ::
+
+ >>> myc = Counter()
+ >>> myc()
+ 1
+ >>> myc()
+ 2
+
+ '''
+
+ self.cnt = before_first
+ def __call__(self):
+ self.cnt += 1
+ return self.cnt
+
+class PdtItem(Counter):
+ def __init__(self, AAA
+ ,level=0
+ ):
+ """
+ Used in pdtAAA
+
+ ``PdtItem`` numbers items in a ``pdt`` document.
+
+ :param AAA: A ``pdt`` is identified by a base36 number (AAA).
+ :param level: level=0 numbers content items, each with its own ID ``AAABB``;
+ level>0 produces headers of the form ``AAA text``
+
+ ::
+
+ >>> pdt=PdtItem('032')
+ >>> pdt()
+ '\\n03201:'
+ >>> pdt('kw1 kw2','kw3')
+ '\\n03202: **kw1 kw2 kw3**'
+ >>> hdr2=PdtItem('032',level=2)
+ >>> hdr2('header text')
+ '\\n032 header text\\n---------------'
+
+ """
+
+ super().__init__()
+ self.AAA = AAA
+ self.level = level
+ def __call__(self, *args):
+ super().__call__()
+ if self.level==0:
+ BB = np.base_repr(self.cnt,36)
+ Id = "{}{:0>2}".format(self.AAA,BB)
+ if args:
+ lines = ['\n{}: **{}**'.format(Id,' '.join(args))]
+ else:
+ lines = ['\n{}:'.format(Id)]
+ else:
+ c = title_some[self.level-1]
+ assert args, "a header cannot be empty"
+ lin = ' '.join([self.AAA]+list(args))
+ lenlin = len(lin)
+ lines = ['',lin,c*lenlin]
+ return "\n".join(lines)
+
+def _pdtok(fid):
+ assert fid.upper() == fid
+ assert len(fid) == 3
+ assert int(fid,base=36) < 36**3
+
+def pdtid(pdtfile,pdtok=_pdtok):
+ """
+ ``pdtid`` takes the path of the current file and extracts an ID
+
+ - either from the directory or
+ - from the file name
+
+ depending on ``pdtok``, which raises if not OK.
+
+ ::
+
+ >>> pdtid('/a/b/3A2/0sA.rest.stpl')
+ '3A2'
+ >>> pdtid('/a/b/3A2/0SA.rest.stpl')
+ '0SA'
+ >>> pdtid('/a/b/3A2/AS-A.rest.stpl')
+ '3A2'
+
+ """
+
+ fid = base(pdtfile)
+ while True:
+ fido = fid
+ fid = stem(fid)
+ if fid == fido:
+ break
+ try:
+ pdtok(fid)
+ except:
+ fid = stem(stem(base(dirname(pdtfile))))
+ pdtok(fid)
+ return fid
+
+gpdtid = pdtid
+def pdtAAA(pdtfile,dct,pdtid=pdtid,
+ pdtfileid=lambda x:'ipdt'[int(x[0])]):
+ '''
+ ``pdtAAA`` is for use in an ``.stpl`` document::
+
+ % pdtAAA(__main_file__,globals())
+
+ See the example generated with::
+
+ rstdoc --ipdt
+
+ :param pdtfile: file path of pdt
+ :param dct: dict to take up the generated defines
+ :param pdtid: a function returning the ID for the ``pdt`` cycle,
+ or a regular expression with a group, applied to the full path,
+ or a regular expression without a group, matched against just the base name
+ without extension (used as ``pdtok``)
+ :param pdtfileid: extracts/maps a file base name to one of the letters ipdt.
+ E.g. to keep the files in order one could name them {0,1,2,3}.rest.stpl
+ and map each to one of 'ipdt'.
+
+ A ``pdt`` is a project enhancement cycle with its own documentation.
+ ``pdt`` stands for
+
+ - plan: why
+ - do: specification
+ - test: tests according to the specification
+
+ Additionally there should be an
+
+ - inform: the non-technical purpose, for or from external people.
+
+ There can also be *only* the ``inform`` document, if the ``pdt`` item is only informative.
+
+ The repo looks like this (preferred)::
+
+ project repo
+ pdt
+ ...
+ AAA
+ i*.rest.stpl
+ p*.rest.stpl
+ d*.rest.stpl
+ t*.rest.stpl
+
+ or::
+
+ project repo
+ pdt
+ ...
+ AAA.rst.stpl
+
+ In the first case, the ``UID`` starts with ``{i,p,d,t}AAA``.
+ This is useful to trace related items by their plan-do-test aspect.
+
+ Further reading: `pdt `__
+
+ ``pdtAAA`` makes these Python defines:
+
+ - ``_[x]AAA`` returns next item number as AAABB. Use: ``{{_[x]AAA('kw1')}}``
+ - ``_[x]AAA_``, ``_[x]AAA__``, ``_[x]AAA___``, ... returns headers. Use: ``{{_[x]AAA_('header text')}}``
+ - ``__[x]AAA``, same as ``_[x]AAA``, but use: ``%__[x]AAA('kw1')`` (needs _printlist in dct)
+ - ``__[x]AAA_``, ``__[x]AAA__``, ``__[x]AAA___``, ... Use: ``%__[x]AAA_('header text')``
+
+ A, B are base36 letters and x is the initial of the file.
+ The generated macros do not work for indented text, as they produce line breaks in RST text.
+
+ ::
+
+ >>> dct={'_printlist':str}
+ >>> pdtfile = "a/b/a.rest.stpl"
+ >>> pdtAAA(pdtfile,dct,pdtid=r'.*/(.)\.rest\.stpl')
+ >>> dct['_a']('x y').strip()
+ 'a01: **x y**'
+ >>> dct['__a']('x y').strip() #needs _printlist
+ "['\\\\na02: **x y**', '\\\\n']"
+ >>> dct={}
+ >>> pdtfile = "pdt/000/d.rest.stpl"
+ >>> pdtAAA(pdtfile,dct)
+ >>> dct['_d000']('x y').strip()
+ 'd00001: **x y**'
+ >>> dct={}
+ >>> pdtfile = "a/b/003.rest.stpl"
+ >>> pdtAAA(pdtfile,dct)
+ >>> dct['_003']('x y').strip()
+ '00301: **x y**'
+ >>> dct['_003_']('x y')
+ '\\n003 x y\\n======='
+ >>> pdtfile="a/b/003/d.rest.stpl"
+ >>> pdtAAA(pdtfile,dct)
+ >>> dct['_003']('x y').strip()
+ '00301: **x y**'
+ >>> dct['_d003']('x y').strip()
+ 'd00301: **x y**'
+ >>> dct['_003_']('x y')
+ '\\n003 x y\\n======='
+ >>> dct['_d003_']('x y')
+ '\\nd003 x y\\n========'
+
+ '''
+
+ try:
+ AAA = pdtid(pdtfile)
+ except TypeError:
+ try:
+ AAA = re.match(pdtid,pdtfile).group(1)
+ except AttributeError:
+ def repdtok(fid):
+ assert re.match(pdtid,fid)
+ try:
+ AAA = gpdtid(pdtfile,pdtok=repdtok)
+ except:
+ return
+ except:
+ return
+ pdtfn = base(pdtfile)
+ x = ''
+ dct['AAA']=AAA
+ if not pdtfn.startswith(AAA):
+ try:
+ x = pdtfileid(pdtfn)
+ except:
+ x = pdtfn[0]
+ dct[x+'AAA']=x+AAA
+ dct['xAAA']=x+AAA
+ dct['PdtItem']=PdtItem
+ dfns = "\n".join("_{0}"+"_"*i+"=PdtItem('{0}',"+str(i)+")" for i in range(10))
+ if '_printlist' in dct: #define __AAA for direct output
+ #_printlist should come from the separate pkg stpl/stpl.py
+ dfns += "\n"
+ dfns += "\n".join("__{0}"+"_"*i+"=lambda *args: _printlist([_{0}"+"_"*i+"(*args),'\\n'])" for i in range(10))
+ eval(compile(dfns.format(AAA), "", "exec"),dct)
+ if x:
+ eval(compile(dfns.format(x+AAA), "", "exec"),dct)
+
+def index_toctree(index_file):
+ '''
+ Construct::
+
+ .. toctree::
+ file1
+ file2
+
+ for the sphinx index file,
+ i.e. ``index.rest.stpl`` or ``index.rst.stpl``.
+ Use like::
+
+ {{! index_toctree(__file__) }}
+
+ '''
+
+ from pathlib import Path
+ thisdir = Path(index_file).parent
+ indexlines = open(index_file).readlines()
+ alreadyi = lambda x: rlines(r'\.\. include::.*'+stem(stem(x)),indexlines)
+ toctree = ['.. toctree::']
+ totoctree = lambda x: alreadyi(x) or toctree.append(' '+x)
+ _get_rstrest()
+ toglob = '*'+_rest+"*"
+ pdtdirs = list(sorted(set(y.parent for y in thisdir.rglob(toglob) if
+ not y.name.startswith('index'+_rest) and
+ not y.name.endswith(_tpl) and
+ not any(x.endswith('build') for x in str(y).split(os.sep))
+ )))
+ for apdtd in pdtdirs:
+ fs = [f for f in sorted(Path(apdtd).glob(toglob)) if
+ not f.name.startswith('index'+_rest) and
+ not exists(str(f.absolute())+_stpl) and
+ not f.name.endswith(_tpl)]
+ fsdict = dict((f.name[0],f) for f in fs)
+ fsdone = set()
+ for i in "0i1p2d3t":
+ if i in fsdict:
+ fsi = fsdict[i]
+ fsi0 = fsi.name.split('.')[0]
+ ipdtf = any(x.startswith(fsi0) for x in 'inform plan do test'.split())
+ if ipdtf or '0123'.find(fsi0)>=0:
+ relpth = stem(fsi.relative_to(thisdir))
+ totoctree(relpth)
+ fsdone.add(fsi)
+ for f in fs:
+ if f not in fsdone:
+ relpth = stem(f.relative_to(thisdir))
+ totoctree(relpth)
+ return '\n'.join(toctree)
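+
+ # For illustration (a sketch; the actual entries depend on the folder layout):
+ # for the ``--ipdt`` example tree, ``{{! index_toctree(__file__) }}`` inside
+ # ``index.rest.stpl`` expands to roughly
+ #
+ # .. toctree::
+ #    000/i.rest
+ #    000/p.rest
+ #    000/d.rest
+ #    000/t.rest
+ #    001/i.rest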
+
+
+# ==============> for building with WAF
+
+try:
+ from waflib import TaskGen, Task
+
+ gensrc = {}
+
+ @TaskGen.feature('gen_files')
+ @TaskGen.before('process_rule')
+ def gen_files_now(tskgen):
+ global gensrc
+ gensrc = {}
+ rootpth = tskgen.bld.path.abspath()
+ if rootpth not in sys.path:
+ sys.path.append(rootpth)
+ for gen in tskgen.path.ant_glob("**/gen"):
+ genpth = gen.abspath()
+ relgen = relpath(gen.parent.abspath(),start=tskgen.path.abspath())
+ if gen.exists():
+ for f, t, fun, kw in parsegenfile(genpth):
+ gensrc[normjoin(relgen,t)] = normjoin(relgen,f)
+ genfrom = gen.parent.find_resource(f)
+ assert genfrom, "%s rel to %s not found"%(f,genpth)
+ gento = gen.parent.make_node(t)
+ assert gento, "%s rel to %s not found"%(t,genpth)
+ tskgen.create_task('GENTSK', genfrom, gento, fun=fun, kw=kw)
+
+ class GENTSK(Task.Task):
+ def run(self):
+ genfrom = self.inputs[0]
+ gento = self.outputs[0]
+ gento.parent.mkdir()
+ gen(genfrom.abspath(), gento.abspath(), fun=self.fun, **self.kw)
+
+ def get_docs_param(bld):
+ docs = [x.lower() for x in bld.options.docs]
+ if not docs:
+ docs = [x.lower() for x in bld.env.docs]
+ return docs
+
+ @lru_cache()
+ def get_files_in_doc(path, node):
+ srcpath = node.parent.get_src()
+ orgd = node.parent.abspath()
+ d = srcpath.abspath()
+ n = node.name
+ nod = None
+ if node.is_bld(
+ ) and not node.name.endswith(
+ _stpl
+ ) and not node.name.endswith(_tpl):
+ nod = srcpath.find_node(node.name + _stpl)
+ if not nod:
+ nod = node
+ ch = rstincluded(n, (d, orgd), True, True)
+ deps = []
+ nodeitself = True
+ for x in ch:
+ if nodeitself:
+ nodeitself = False
+ continue
+ isrst = is_rst(x)
+ # else cyclic dependency for _links_xxx.rst
+ if isrst and x.startswith('_links_'):
+ continue
+ nd = srcpath.find_node(x)
+ if not nd:
+ if isrst and not x.endswith(_stpl) and not x.endswith(_tpl):
+ nd = srcpath.find_node(x + _stpl)
+ deps.append(nd)
+ depsgensrc = [
+ path.find_node(gensrc[x]) for x in deps if x and x in gensrc
+ ]
+ rs = [x for x in deps if x] + depsgensrc
+ return (list(sorted(set(rs), key=lambda a: a.name)), [])
+
+ @TaskGen.feature('gen_links')
+ @TaskGen.after('gen_files')
+ def gen_links_now(tskgen):
+ docs = get_docs_param(tskgen.bld)
+ if docs:
+ for so in tskgen.path.ant_glob('**/*.stpl'):
+ tsk = Namespace()
+ tsk.inputs = (so, )
+ tsk.env = tskgen.env
+ tsk.generator = tskgen
+ render_stpl(tsk, tskgen.bld)
+ links_and_tags(tskgen.path.abspath())
+
+ @TaskGen.feature('gen_docs')
+ @TaskGen.after('gen_links')
+ def gen_docs_now(tskgen):
+ docs = get_docs_param(tskgen.bld)
+ if docs:
+ bldpth=relpath(tskgen.bld.bldnode.abspath(),start=tskgen.path.abspath())
+ for anext in 'tikz svg dot uml pyg eps'.split():
+ source=tskgen.path.ant_glob('**/*.'+anext,excl=bldpth+"/**")
+ tskgen.bld(name='build '+anext, source=[x for x in source if _traceability_file not in x.abspath()])
+ foundfiles = list(tskgen.path.ant_glob('**/*'+_rest,excl=bldpth+"/**"))
+ tskgen.bld(name='build all rest', source=foundfiles)
+
+ def render_stpl(tsk, bld):
+ bldpath = bld.path.get_bld().abspath()
+ ps = tsk.inputs[0].abspath()
+ try:
+ pt = tsk.outputs[0].abspath()
+ except:
+ if ps.endswith(_stpl):
+ pt = stem(ps)
+ else:
+ raise RstDocError('No target for %s' % ps)
+ env = dict(tsk.env)
+ env.update(tsk.generator.__dict__)
+ env['bldpath'] = bldpath
+ dostpl(ps, pt, **env)
+
+ class STPL(Task.Task):
+ always_run = True
+
+ def run(self):
+ render_stpl(self, self.generator.bld)
+
+ @TaskGen.extension(_stpl)
+ def stpl_taskgen(tskgen, node): # expand into same directory
+ nn = node.parent.make_node(stem(node.name))
+ tskgen.create_task('STPL', node, nn)
+ try:
+ tskgen.get_hook(nn)(tskgen, nn)
+ except:
+ pass
+
+ def gen_ext_tsk(tskgen, node,
+ ext): # into _images or /_images in source path
+ srcfldr = node.parent.get_src()
+ _imgpath, there = _here_or_updir(srcfldr.abspath(), _images)
+ if not there:
+ _imgpath = normjoin(srcfldr.abspath(), _images)
+ mkdir(_imgpath)
+ imgpath = relpath(_imgpath, start=srcfldr.abspath())
+ outnode = srcfldr.make_node(
+ normjoin(imgpath,
+ stem(node.name) + '.png'))
+ tskgen.create_task(ext[1:].upper(), node, outnode)
+
+ @TaskGen.extension(_tikz)
+ def tikz_to_png_taskgen(tskgen, node):
+ gen_ext_tsk(tskgen, node, _tikz)
+
+ class TIKZ(Task.Task):
+ def run(self):
+ tikzpng(self.inputs[0].abspath(), self.outputs[0].abspath())
+
+ @TaskGen.extension(_svg)
+ def svg_to_png_taskgen(tskgen, node):
+ gen_ext_tsk(tskgen, node, _svg)
+
+ class SVG(Task.Task):
+ def run(self):
+ svgpng(self.inputs[0].abspath(), self.outputs[0].abspath())
+
+ @TaskGen.extension('.dot')
+ def dot_to_png_taskgen(tskgen, node):
+ gen_ext_tsk(tskgen, node, '.dot')
+
+ class DOT(Task.Task):
+ run_str = "${dot} -Tpng ${SRC} -o${TGT}"
+
+ @TaskGen.extension('.uml')
+ def uml_to_png_taskgen(tskgen, node):
+ gen_ext_tsk(tskgen, node, '.uml')
+
+ class UML(Task.Task):
+ run_str = "${plantuml} ${SRC} -o${TGT[0].parent.abspath()}"
+
+ @TaskGen.extension('.eps')
+ def eps_to_png_taskgen(tskgen, node):
+ gen_ext_tsk(tskgen, node, '.eps')
+
+ class EPS(Task.Task):
+ run_str = ("${inkscape} --export-dpi=${DPI} --export-area-drawing" +
+ " --export-background-opacity=0 ${SRC} " +
+ " --export-filename=${TGT}")
+
+ @TaskGen.extension('.pyg')
+ def pyg_to_png_taskgen(tskgen, node):
+ gen_ext_tsk(tskgen, node, '.pyg')
+
+ class PYG(Task.Task):
+ def run(self):
+ pygpng(self.inputs[0].abspath(), self.outputs[0].abspath())
+
+ @TaskGen.extension(_get_rstrest())
+ def docs_taskgen(tskgen, node):
+ docs = get_docs_param(tskgen.bld)
+ d = get_files_in_doc(tskgen.path, node)
+
+ def rstscan():
+ return d
+
+ def out_node(doctgt,doctype):
+ relnode = relpath(stem(node.abspath())+'.'+doctype,start=tskgen.path.abspath())
+ bldpath = tskgen.path.get_bld()
+ return bldpath.find_or_declare(normjoin(doctgt,relnode))
+
+ if node.name != "index"+_rest:
+ for doctgt in docs:
+ if doctgt.startswith('sphinx_'):
+ continue
+ doctype = _suffix(doctgt)
+ tskgen.create_task(
+ 'NonSphinxTask', [node],
+ out_node(doctgt,doctype),
+ scan=rstscan,
+ doctgt=doctgt)
+ else:
+ for doctgt in docs:
+ if not doctgt.startswith('sphinx_'):
+ continue
+ doctype = _suffix(doctgt.replace('_tex','_latex'))
+ tskgen.create_task(
+ 'SphinxTask', [node],
+ out_node(doctgt,doctype.replace('latex','tex')),
+ scan=rstscan,
+ doctype=doctype)
+
+ class NonSphinxTask(Task.Task):
+ def run(self):
+ dorst(self.inputs[0].abspath(), self.outputs[0].abspath(),
+ self.doctgt)
+
+ class SphinxTask(Task.Task):
+ always_run = True
+
+ def run(self):
+ inpth = self.inputs[0].abspath()
+ confpypath, _ = _here_or_updir(dirname(inpth), 'conf.py')
+ config = conf_py(dirname(confpypath))
+ # rst_sphinx needs it relative to infile
+ if 'html_extra_path' in config:
+ config['html_extra_path'] = [
+ normjoin(dirname(confpypath), x)
+ for x in config['html_extra_path']
+ ]
+ else:
+ config['html_extra_path'] = html_extra_path
+ rst_sphinx(inpth, self.outputs[0].abspath(),
+ self.doctype, **config)
+
+ def options(opt):
+ def docscb(option, opt, value, parser):
+ setattr(parser.values, option.dest, value.split(','))
+
+ opt.add_option(
+ "--docs",
+ type='string',
+ action="callback",
+ callback=docscb,
+ dest='docs',
+ default=[],
+ help="""Comma-separated list of
+html, docx, pdf, sphinx_html (default)
+or any other of http://www.sphinx-doc.org/en/master/usage/builders"""
+ )
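+
+ # Invocation sketch (assuming the waf wrapper from the example trees):
+ #
+ # ./waf configure --docs sphinx_html,docx
+ # ./waf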
+
+ def configure(cfg):
+ cfg.env['docs'] = cfg.options.docs
+ for x in 'plantuml dot inkscape'.split():
+ try:
+ cfg.env[x] = cfg.find_program(x)
+ except cfg.errors.ConfigurationError:
+ cfg.to_log(x + ' was not found (ignoring)')
+ root=cfg.path.abspath()
+ config = conf_py(root)
+ cfg.env['DPI'] = str(config.get('DPI', DPI))
+ cfg.env['rstrest'] = _get_rstrest()
+ assert isinstance(cfg.env['rstrest'],str)
+ #VERSION
+ try: # repo?
+ from git import Repo
+ repo = Repo(root)
+ tags = repo.tags
+ taglast = tags and tags[-1].name or '0.0.0'
+ tagint = int(taglast.replace('.',''),10)
+ tagfix = tagint%10
+ tagminor = tagint//10 % 10
+ tagmajor = tagint//100 % 10
+ tagnew = f'{tagmajor}.{tagminor}.{tagfix+1}'
+ cfg.env['VERSION'] = tagnew
+ except: # no repo
+ try:
+ # VERSION file must exist when no git repo available
+ cfg.env['VERSION'] = next(filter(
+ lambda x:x.strip(),opn(normjoin(root,'VERSION')).readlines()))
+ except FileNotFoundError:
+ cfg.env['VERSION'] = '0.0.0'
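+
+ # Note on the tag arithmetic above (illustrative): it assumes single-digit
+ # version components; a latest tag '1.2.3' yields tagint 123, hence
+ # tagmajor 1, tagminor 2, tagfix 3, and VERSION becomes '1.2.4'.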
+
+ def build(bld):
+ _set_rstrest(bld.env['rstrest'])
+
+ bld.src2bld = lambda f: bld(
+ features='subst', source=f, target=f, is_copy=True)
+
+ def gen_files():
+ bld(name="process gen file", features="gen_files")
+ bld.gen_files = gen_files
+
+ def gen_links():
+ bld(name="create links and .tags", features="gen_links")
+ bld.gen_links = gen_links
+
+ def gen_docs():
+ bld(name="create docs", features="gen_docs")
+ bld.gen_docs = gen_docs
+
+ # use like bld(rule=bld.stpl, source='x.h.stpl')
+ # to compile stpl only, else do without rule
+ bld.stpl = lambda tsk: render_stpl(tsk, bld)
+
+ global g_config
+ if exists(normjoin(bld.srcnode.abspath(), 'conf.py')):
+ g_config = conf_py(bld.srcnode.abspath())
+
+ def build_docs():
+ bld.gen_files()
+ bld.gen_links()
+ bld.gen_docs()
+ bld.add_group()
+ #call bld.build_docs in root wscript to have .tags there
+ bld.build_docs = build_docs
+
+except:
+ pass
+
+# ==============< for building with WAF
+
+# pandoc --print-default-data-file reference.docx > reference.docx
+# pandoc --print-default-data-file reference.odt > reference.odt
+# pandoc --print-default-template=latex
+# then modified in format and not to use figure labels
+# this is for mktree(): first line of file content must not be empty!
+example_rest_tree = r'''
+ build/
+ dcx.py << file:///__dcx__
+ reference.tex << file:///__tex_ref__
+ reference.docx << file:///__docx_ref__
+ reference.odt << file:///__odt_ref__
+ wafw.py << file:///__wafw__
+ waf
+ #!/usr/bin/env sh
+ shift
+ ./wafw.py "$@"
+ waf.bat
+ @setlocal
+ @set PYEXE=python
+ @where %PYEXE% 1>NUL 2>NUL
+ @if %ERRORLEVEL% neq 0 set PYEXE=py
+ @%PYEXE% -x "%~dp0wafw.py" %*
+ @exit /b %ERRORLEVEL%
+ wscript
+ #vim: ft=python
+ from waflib import Logs
+ Logs.colors_lst['BLUE']='\x1b[01;36m'
+ top='.'
+ out='build'
+ def options(opt):
+ opt.load('dcx', tooldir='.')
+ def configure(cfg):
+ cfg.load('dcx', tooldir='.')
+ def build(bld):
+ bld.load('dcx', tooldir='.')
+ bld.build_docs()
+ docutils.conf
+ [general]
+ halt_level: severe
+ report_level: error
+ conf.py
+ project = 'sample'
+ author = project+' Project Team'
+ copyright = '2019, '+author
+ version = '1.0'
+ release = '1.0.0'
+ try:
+ import sphinx_bootstrap_theme
+ html_theme_path = sphinx_bootstrap_theme.get_html_theme_path()
+ html_theme = 'bootstrap'
+ except:
+ pass
+ #these are enforced by rstdoc, but keep them for sphinx-build
+ numfig = 0
+ smartquotes = 0
+ source_suffix = '.rest'
+ templates_path = []
+ language = None
+ highlight_language = "none"
+ default_role = 'math'
+ pygments_style = 'sphinx'
+ exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store']
+ master_doc = 'index'
+ html_extra_path=['doc/_traceability_file.svg'] #relative to conf.py
+ import os
+ on_rtd = os.environ.get('READTHEDOCS') == 'True'
+ if not on_rtd:
+ latex_engine = 'xelatex'
+ #You can postprocess pngs.default: png_post_processor = None
+ def png_post_processor(filename):
+ from PIL import Image, ImageChops
+ def trim(im):
+ bg = Image.new(im.mode, im.size, im.getpixel((0, 0)))
+ diff = ImageChops.difference(im, bg)
+ diff = ImageChops.add(diff, diff, 2.0, -100)
+ bbox = diff.getbbox()
+ if bbox:
+ return im.crop(bbox)
+ return im
+ im = Image.open(filename)
+ im = trim(im)
+ im.save(filename)
+ return filename
+ #the following are default and can be omitted
+ latex_elements = {'preamble':r"""
+ \usepackage{pgfplots}
+ \usepackage{unicode-math}
+ \usepackage{tikz}
+ \usepackage{caption}
+ \captionsetup[figure]{labelformat=empty}
+ \usetikzlibrary{arrows,snakes,backgrounds,patterns,matrix,shapes,fit,calc,shadows,plotmarks,intersections}
+ """
+ }
+ #new in rstdcx/dcx/py
+ tex_wrap = r"""
+ \documentclass[12pt,tikz]{standalone}
+ \usepackage{amsmath}
+ """+latex_elements['preamble']+r"""
+ \pagestyle{empty}
+ \begin{document}
+ %s
+ \end{document}
+ """
+ DPI = 600
+ # |targetid| refs are grouped by the first letter (determining whether r(iskanalysis), s(pecification), d(esign) or t(est))
+ target_id_group = lambda targetid: targetid[0]
+ target_id_color={"ra":("r","lightblue"), "sr":("s","red"),
+ "dd":("d","yellow"), "tp":("t","green")}
+ pandoc_doc_optref={'latex': '--template reference.tex',
+ 'html': {},#each can also be dict of file:template
+ 'pdf': '--template reference.tex',
+ 'docx': '--reference-doc reference.docx',
+ 'odt': '--reference-doc reference.odt'
+ }
+ _pandoc_latex_pdf = ['--listings','--number-sections','--pdf-engine',
+ 'xelatex','-V','titlepage','-V','papersize=a4',
+ '-V','toc','-V','toc-depth=3','-V','geometry:margin=2.5cm']
+ pandoc_opts = {'pdf':_pandoc_latex_pdf,'latex':_pandoc_latex_pdf,
+ 'docx':[],'odt':[],
+ 'html':['--mathml','--highlight-style','pygments']}
+ rst_opts = { #http://docutils.sourceforge.net/docs/user/config.html
+ 'strip_comments':True
+ ,'report_level':3
+ ,'raw_enabled':True
+ }
+ def name_from_directive(directive,count):
+ return directive[0].upper() + directive[1:] + ' ' + str(count)
+ Makefile
+ SPHINXOPTS = -c .
+ SPHINXBLD = sphinx-build
+ SPHINXPROJ = sample
+ DOCDIR = doc/
+ DOCBACK = ../
+ DCXFROMDOC = ../
+ BLDDIR = build/doc/
+ STPLS = $(wildcard $(DOCDIR)*.stpl)
+ STPLTGTS = $(STPLS:%.stpl=%)
+ SRCS = $(filter-out $(DOCDIR)index.rest,$(wildcard $(DOCDIR)*.rest))
+ SRCSTPL = $(wildcard $(DOCDIR)*.rest.stpl)
+ IMGS = \
+ $(wildcard $(DOCDIR)*.pyg)\
+ $(wildcard $(DOCDIR)*.eps)\
+ $(wildcard $(DOCDIR)*.tikz)\
+ $(wildcard $(DOCDIR)*.svg)\
+ $(wildcard $(DOCDIR)*.uml)\
+ $(wildcard $(DOCDIR)*.dot)\
+ $(wildcard $(DOCDIR)*.eps.stpl)\
+ $(wildcard $(DOCDIR)*.tikz.stpl)\
+ $(wildcard $(DOCDIR)*.svg.stpl)\
+ $(wildcard $(DOCDIR)*.uml.stpl)\
+ $(wildcard $(DOCDIR)*.dot.stpl)
+ PNGS=$(subst $(DOCDIR),$(DOCDIR)_images/,\
+ $(patsubst %.eps,%.png,\
+ $(patsubst %.pyg,%.png,\
+ $(patsubst %.tikz,%.png,\
+ $(patsubst %.svg,%.png,\
+ $(patsubst %.uml,%.png,\
+ $(patsubst %.dot,%.png,\
+ $(patsubst %.eps.stpl,%.png,\
+ $(patsubst %.dot.stpl,%.png,\
+ $(patsubst %.tikz.stpl,%.png,\
+ $(patsubst %.svg.stpl,%.png,\
+ $(patsubst %.uml.stpl,%.png,$(IMGS)))))))))))))
+ DOCXS = $(subst $(DOCDIR),$(BLDDIR)docx/,$(SRCS:%.rest=%.docx))\
+ $(subst $(DOCDIR),$(BLDDIR)docx/,$(SRCSTPL:%.rest.stpl=%.docx))
+ PDFS = $(subst $(DOCDIR),$(BLDDIR)pdf/,$(SRCS:%.rest=%.pdf))\
+ $(subst $(DOCDIR),$(BLDDIR)pdf/,$(SRCSTPL:%.rest.stpl=%.pdf))
+ .PHONY: docx help Makefile docxdir pdfdir stpl index imgs
+ stpl: $(STPLTGTS)
+ %:%.stpl
+ @cd $(DOCDIR) && stpl "$(
+ @#include "some.h"
+ @int main()
+ @{
+ */
+
+ /**Test add1()
+ @assert(add1(1)==2);
+ */
+ int add1(int a)
+ {
+ return a+1;
+ }
+
+ /**Test add2()
+ @assert(add2(1)==3);
+ */
+ int add2(int a)
+ {
+ return a+2;
+ }
+
+ /*
+ @}
+ */
+ doc
+ ├ _images/
+ ├ index.rest
+ ============
+ Project Name
+ ============
+
+ .. toctree::
+ ra.rest
+ sr.rest
+ dd.rest
+ tp.rest
+
+ One can also have a
+
+ - issues.rest for issues
+
+ - pp.rest for the project plan
+ (with backlog, epics, stories, tasks)
+
+ .. REMOVE THIS IF NO LINKING OVERVIEW WANTED
+ .. include:: _traceability_file.rst
+
+ .. include:: _links_sphinx.rst
+
+ ├ ra.rest
+ Risk Analysis
+ =============
+
+ .. _`rz7`:
+
+ :rz7: risk calculations
+
+ Risk calculations are done with python in the ``.stpl`` file.
+
+ .. include:: _links_sphinx.rst
+
+ ├ sr.rest
+ Software/System Requirements
+ ============================
+
+ .. _`s97`:
+
+ Requirements are testable (see |t9a|).
+
+ .. _`su7`:
+
+ ``dcx.py`` produces its own labeling
+ consistent across DOCX, PDF, HTML.
+
+ .. _`sy7`:
+
+ A Requirement Group
+ -------------------
+
+ .. _`s3a`:
+
+ :s3a: brief description
+
+ Don't count the ID, since the order will change.
+ The IDs have the first letter of the file
+ and 2 or more random letters of ``[0-9a-z]``.
+ Use an editor macro to generate IDs.
+
+ A link: |s3a|
+
+ If one prefers ordered IDs, one can use templates::
+
+ %id = lambda x=[0]: x.append(x[-1]+1) or "s{:0>2}".format(x[-1])
+
+ .. _`soi`:
+
+ :{{id()}}: auto numbered.
+
+ The disadvantage is that the id will differ
+ between rst and final doc.
+ When this is needed in an included file,
+ use a template include: ``%include('x.rst.tpl')``.
+ See the ``test/stpl`` directory.
+
+ Every ``.rest`` has this line at the end::
+
+ .. include:: _links_sphinx.rst
+
+ .. include:: _links_sphinx.rst
+
+ ├ dd.rest
+ Design Description
+ ==================
+
+ .. _`d97`:
+
+ :d97: Independent DD IDs
+
+ The relation with RS IDs is m-n.
+ Links like |s3a| can be scattered over several DD entries.
+
+ .. _`dx3`:
+
+ .. figure:: _images/egtikz1.png
+ :name:
+ :width: 30%
+
+ |dx3|: Create from egtikz1.tikz
+
+ .. _`dz3`:
+
+ .. figure:: _images/egtikz.png
+ :name:
+ :width: 50%
+
+ |dz3|: Create from egtikz.tikz
+
+ The usage of ``:name:`` produces:
+ ``WARNING: Duplicate explicit target name: ""``. Ignore.
+
+ Reference via |dz3|.
+
+ ``.tikz``, ``.svg``, ``.dot``, ``.uml``, ``.eps`` or ``.stpl``
+ thereof and ``.pyg``, are converted to ``.png``.
+
+ .. _`dz4`:
+
+ .. figure:: _images/egsvg.png
+ :name:
+
+ |dz4|: Created from egsvg.svg.stpl
+
+ .. _`dz5`:
+
+ .. figure:: _images/egdot.png
+ :name:
+
+ |dz5|: Created from egdot.dot.stpl
+
+ .. _`dz6`:
+
+ .. figure:: _images/eguml.png
+ :name:
+
+ |dz6|: Created from eguml.uml
+
+ .. _`dz7`:
+
+ .. figure:: _images/egplt.png
+ :name:
+ :width: 30%
+
+ |dz7|: Created from egplt.pyg
+
+ .. _`dz8`:
+
+ .. figure:: _images/egpyx.png
+ :name:
+
+ |dz8|: Created from egpyx.pyg
+
+ .. _`dr8`:
+
+ .. figure:: _images/egcairo.png
+ :name:
+
+ |dr8|: Created from egcairo.pyg
+
+ .. _`ds8`:
+
+ .. figure:: _images/egpygal.png
+ :name:
+ :width: 30%
+
+ |ds8|: Created from egpygal.pyg
+
+ .. _`dsx`:
+
+ .. figure:: _images/egother.png
+ :name:
+
+ |dsx|: Created from egother.pyg
+
+ .. _`d98`:
+
+ .. figure:: _images/egeps.png
+ :name:
+
+ |d98|: Created from egeps.eps
+
+ .. _`dua`:
+
+ |dua|: Table legend
+
+ .. table::
+ :name:
+
+ +--------+--------+
+ | A | B |
+ +========+========+
+ | |eps| | |eps| |
+ +--------+--------+
+
+ .. _`dta`:
+
+ |dta|: Table legend
+
+ .. list-table::
+ :name:
+ :widths: 20 80
+ :header-rows: 1
+
+ * - Bit
+ - Function
+
+ * - 0
+ - afun
+
+ Reference |dta| does not show ``dta``.
+
+ .. _`dyi`:
+
+ |dyi|: Listing showing struct.
+
+ .. code-block:: cpp
+ :name:
+
+ struct astruct{
+ int afield; //afield description
+ }
+
+ .. _`d9x`:
+
+ .. math::
+ :name:
+
+ V = \frac{K}{r^2}
+
+ ``:math:`` is the default inline role: `mc^2`
+
+ .. _`d99`:
+
+ :OtherName: Keep names the same all over.
+
+ Here instead of ``d99:`` we use ``:OtherName:``,
+ but now we have two synonyms for the same item.
+ This is no good. If possible, keep ``d99`` in the source
+ and in the final docs.
+
+ The item target must be in the same file as the item content.
+ The following would not work::
+
+ .. _`dh5`:
+
+ .. include:: somefile.rst
+
+ .. |eps| image:: _images/egeps.png
+
+ .. include:: _links_sphinx.rst
+
+ ├ tp.rest
+ Test Plan
+ =========
+
+ .. _`t9a`:
+
+ Requirement Tests
+ -----------------
+
+ No duplication. Only reference the requirements to be tested.
+
+ - |s97|
+ - |su7|
+ - |s3a|
+ - |rz7|
+
+ Or better: reference the according SR chapter,
+ else changes there would need an update here.
+
+ - Test |sy7|
+
+ Unit Tests
+ ----------
+
+ Use ``.rst`` for included files
+ and start the file with ``_`` if generated.
+
+ - |d97|
+ - |dx3|
+ - |dz4|
+ - |dz5|
+ - |dz6|
+ - |dz7|
+ - |dz8|
+ - |dsx|
+ - |d98|
+ - |dua|
+ - |dta|
+ - |dyi|
+ - |d9x|
+ - |d99|
+
+ .. include:: _sometst.rst
+
+ .. include:: _links_sphinx.rst
+ ├ egtikz.tikz
+ [thick,red]
+ \draw (0,0) grid (3,3);
+ \foreach \c in {(0,0), (1,0), (2,0), (2,1), (1,2)}
+ \fill \c + (0.5,0.5) circle (0.42);
+ ├ egtikz1.tikz
+ \begin{scope}[blend group = soft light]
+ \fill[red!30!white] ( 90:1.2) circle (2);
+ \fill[green!30!white] (210:1.2) circle (2);
+ \fill[blue!30!white] (330:1.2) circle (2);
+ \end{scope}
+ \node at ( 90:2) {Typography};
+ \node at ( 210:2) {Design};
+ \node at ( 330:2) {Coding};
+ \node [font=\Large] {\LaTeX};
+ ├ egsvg.svg.stpl
+
+
+ ├ egdot.dot.stpl
+ digraph {
+ %for i in range(3):
+ "From {{i}}" -> "To {{i}}";
+ %end
+ }
+ ├ eguml.uml
+ @startuml
+ 'style options
+ skinparam monochrome true
+ skinparam circledCharacterRadius 0
+ skinparam circledCharacterFontSize 0
+ skinparam classAttributeIconSize 0
+ hide empty members
+ Class01 <|-- Class02
+ Class03 *-- Class04
+ Class05 o-- Class06
+ Class07 .. Class08
+ Class09 -- Class10
+ @enduml
+ ├ egplt.pyg
+ #vim: ft=python
+ import matplotlib.pyplot as plt
+ import numpy as np
+ x = np.random.randn(1000)
+ plt.hist( x, 20)
+ plt.grid()
+ plt.title(r'Normal: $\mu=%.2f, \sigma=%.2f$'%(x.mean(), x.std()))
+ plt.show()
+ ├ egpyx.pyg
+ import pyx
+ c = pyx.canvas.canvas()
+ c.stroke(pyx.path.circle(0,0,2),
+ [pyx.style.linewidth.Thick,pyx.color.rgb.red])
+ c.text(1, 1, 'Hi', [pyx.color.rgb.red])
+ ├ egpygal.pyg
+ import pygal
+ diagram=pygal.Bar()(1, 3, 3, 7)(1, 6, 6, 4)
+ def to_svg():
+ return diagram.render().decode('utf-8')
+ ├ egother.pyg
+ from PIL import Image, ImageDraw, ImageFont
+ im = Image.new("RGBA",size=(50,50),color=(155,0,100))
+ draw = ImageDraw.Draw(im)
+ draw.rectangle(((0, 0), (40, 40)), fill="red")
+ draw.text((20, 20), "123")
+ save_to_png = lambda out_file: im.save(out_file, "PNG")
+ ├ egeps.eps
+ newpath 6 2 36 54 rectstroke
+ showpage
+ ├ egcairo.pyg
+ import cairocffi as cairo
+ surface = cairo.SVGSurface(None, 200, 200)
+ context = cairo.Context(surface)
+ x, y, x1, y1 = 0.1, 0.5, 0.4, 0.9
+ x2, y2, x3, y3 = 0.6, 0.1, 0.9, 0.5
+ context.set_source_rgba(1, 0.2, 0.2, 0.6)
+ context.scale(200, 200)
+ context.set_line_width(0.04)
+ context.move_to(x, y)
+ context.curve_to(x1, y1, x2, y2, x3, y3)
+ context.stroke()
+ context.set_line_width(0.02)
+ context.move_to(x, y)
+ context.line_to(x1, y1)
+ context.move_to(x2, y2)
+ context.line_to(x3, y3)
+ context.stroke()
+ ├ gen
+ ##from|to|gen_xxx|kwargs
+ #../__code__/some.h | _sometst.rst | tstdoc | {}
+ #../__code__/some.h | ../build/__code__/some_tst.c | tst | {}
+ #or
+ from_to_fun_kw = [ #fun and gen_fun() or gen()
+ ['../__code__/some.h','_sometst.rst','tstdoc',{}],
+ ['../__code__/some.h','../build/__code__/some_tst.c','tst',{}]
+ ]'''
+
+# replaces from '├ index.rest' to '├ egtikz.tikz'
+example_stpl_subtree = r'''
+ ├ model.py
+ """
+ Contains definitions used in
+ - template files (``.rst.tpl`` or standalone ``.rest.stpl``)
+ - test programs
+ """
+ from pint import UnitRegistry
+ u = UnitRegistry()
+ u.define('percent = 0.01*count = %')
+ def U(*k,sep=", "):
+ """
+ Returns string of quantity, with units if possible.
+ """
+ try:
+ return sep.join(["{:~P}"]*len(k)).format(*k)
+ except:
+ res = sep.join(["{}"]*len(k)).format(*k)
+ if res == 'None':
+ res = '-'
+ return res
+ # Definitions e.g. x_some = 3.5*u.hour #see |x_some_doc|
+ ├ utility.rst.tpl
+ % import sys
+ % import os
+ % sys.path.append(os.path.dirname(__file__))
+ % from model import *
+ % cntr = lambda alist0,prefix='',width=2: alist0.append(
+ % alist0[-1]+1) or ("{}{:0>%s}"%width).format(prefix,alist0[-1])
+ % II = lambda prefix,alist0,short:':{}: **{}**'.format(
+ % cntr(alist0,prefix),short)
+ % #define in file e.g.
+ % #SR=lambda short,alist0=[0]:II('SR',alist0,short)
+ % #and use like {{SR('Item Title')}}
+ %def pagebreak():
+ .. raw:: openxml
+
+
+
+
+
+
+
+ .. raw:: html
+
+
+
+ .. raw:: latex
+
+ \pagebreak
+
+ .. for docutils
+ .. raw:: odt
+
+
+
+ .. for pandoc
+ .. raw:: opendocument
+
+
+
+ %end
+ ├ index.rest
+ .. encoding: utf-8
+ .. vim: ft=rst
+
+ ============
+ Project Name
+ ============
+
+ .. toctree::
+ sy.rest
+ ra.rest
+ sr.rest
+ dd.rest
+ tp.rest
+
+ One can also have a
+
+ - issues.rest for issues
+
+ - pp.rest for the project plan
+ (with backlog, epics, stories, tasks)
+
+ .. REMOVE THIS IF NO LINKING OVERVIEW WANTED
+ .. include:: _traceability_file.rst
+
+ .. include:: _links_sphinx.rst
+
+ ├ sy.rest.stpl
+ .. encoding: utf-8
+ .. vim: ft=rst
+
+ % globals().update(include('utility.rst.tpl'))
+ % SY=lambda short,alist0=[0]:II('SY',alist0,short)
+
+ .. _`sy_system_scope`:
+
+ ############
+ System Scope
+ ############
+
+ .. _`sy_general_idea`:
+
+ {{SY('General Idea')}}
+
+ Source code is text written in a good editor.
+ Use the same editor also for documentation.
+ Jump around like in hypertext.
+
+ .. include:: _links_sphinx.rst
+ ├ ra.rest.stpl
+ .. encoding: utf-8
+ .. vim: ft=rst
+
+ % globals().update(include('utility.rst.tpl'))
+ % RA=lambda short,alist0=[0]:II('RA',alist0,short)
+
+ .. _`ra_risk_analysis`:
+
+ #############
+ Risk Analysis
+ #############
+
+ .. _`r_restructured_text`:
+
+ Advantages
+ ==========
+
+ {{RA('Restructured Text')}}
+
+ We use
+ `restructuredText `_
+ together with
+ `SimpleTemplate `_.
+
+ This is very flexible:
+
+ - it allows generating boilerplate text with python
+ - it allows linking the text across documents
+ - it allows producing many final formats (html, docx, pdf, odt, ...)
+ - ...
+
+ .. _`ra_risks`:
+
+ Risks
+ =====
+
+ .. _`r_editor`:
+
+ {{RA('Wrong Editor')}}
+
+ This is not for people who only know how to edit in MS Word.
+ Users should have embraced a good text editor.
+ Software guys normally have.
+
+ One needs a good text editor that supports ctags.
+ These have been tested
+
+ - `atom `_
+ - `vim `_
+
+ .. include:: _links_sphinx.rst
+ ├ sr.rest.stpl
+ .. encoding: utf-8
+ .. vim: ft=rst
+
+ % globals().update(include('utility.rst.tpl'))
+ % SR=lambda short,alist0=[0]:II('SR',alist0,short)
+
+ .. _`sr_software_system_requirements`:
+
+ ############################
+ Software/System Requirements
+ ############################
+
+ .. _`sr_general`:
+
+ General
+ =======
+
+ .. _`sr_testable`:
+
+ {{SR('Testable')}}
+
+ Requirements are testable (see |tp_requirement_tests|).
+
+ .. _`sr_style`:
+
+ {{SR('Style')}}
+
+ In a restructuredText file, have one sentence in one line.
+
+ Make long sentences into
+
+ - lists
+
+ - with sub items, if needed
+
+ - or simply make more sentences out of it
+
+ A link: |sr_style|.
+
+ .. _`sr_a_requirement_group`:
+
+ A Requirement Group
+ ===================
+
+ .. _`sr_id`:
+
+ {{SR('ID')}}
+
+ The ID seen in the final document is numbered
+ by a python function.
+ In the restructuredText files there is no numbering.
+ The targets use keywords instead.
+ This way one can rearrange the items
+ keeping the items sorted and still referentially consistent.
+
+ The ID shall not contain any hyphens
+ or dots or other non-identifier characters,
+ as some final formats, like DOCX, demand that.
+
+ .. include:: _links_sphinx.rst
+ ├ dd.rest.stpl
+ .. encoding: utf-8
+ .. vim: ft=rst
+
+ % globals().update(include('utility.rst.tpl'))
+ % DD=lambda short,alist0=[0]:II('DD',alist0,short)
+
+ .. _`dd_design_description`:
+
+ ##################
+ Design Description
+ ##################
+
+ .. _`dd_traceability`:
+
+ {{DD('Traceability')}}
+
+ ``dcx.py`` associates all links between two targets
+ to the first target.
+ This can be used as traceability.
+
+ Warnings issued during conversion to final documents
+ help to keep the documents consistent.
+
+ .. _`dd_name`:
+
+ {{DD('Name')}}
+
+ For targeted
+
+ - ``.. table::``
+ - ``.. list-table::``
+ - ``.. figure::``
+ - ``.. code-block::``
+ - ``.. math::``
+
+ use ``:name:``.
+ In the legend use the same ID as in the target definition.
+
+ .. _`dd_figure`:
+
+ .. figure:: _images/egtikz.png
+ :name:
+ :width: 50%
+
+ |dd_figure|: Caption here.
+ Reference this via ``|dd_figure|``.
+
+ For testing purpose the following is rendered via include files.
+
+ Include the normal ``.rst`` way, where the ``.rst`` might be generated from a ``.rst.stpl``.
+
+ .. include:: dd_included.rst
+
+ Include the stpl way
+
+ %include('dd_diagrams.tpl',DD=DD) # you can optionally provide python definitions
+
+ Pandoc does not know about `definitions in included files `__.
+
+ .. |eps| image:: _images/egeps.png
+
+ .. include:: _links_sphinx.rst
+
+ ├ dd_included.rst.stpl
+ .. encoding: utf-8
+ .. vim: ft=rst
+
+ .. _`dd_code`:
+
+ |dd_code|: Listing showing struct.
+
+ .. code-block:: cpp
+ :name:
+
+ struct xxx{
+ int yyy; //yyy for zzz
+ }
+
+ Include normal ``.rst``.
+
+ .. include:: dd_tables.rst
+
+ Again include the stpl way.
+
+ %include('dd_math.tpl')
+
+ ├ dd_tables.rst
+ .. encoding: utf-8
+ .. vim: ft=rst
+
+ .. _`dd_table`:
+
+ |dd_table|: Table legend
+
+ .. table::
+ :name:
+
+ +--------+--------+
+ | A | B |
+ +========+========+
+ | |eps| | |eps| |
+ +--------+--------+
+
+ .. _`dd_list_table`:
+
+ |dd_list_table|: Table legend
+
+ .. list-table::
+ :name:
+ :widths: 20 80
+ :header-rows: 1
+
+ * - Bit
+ - Function
+
+ * - 0
+ - xxx
+
+ Reference |dd_table| or |dd_list_table| does not show
+ ``dd_table`` or ``dd_list_table``.
+
+ ├ dd_math.tpl
+ .. encoding: utf-8
+ .. vim: ft=rst
+
+ .. _`dd_math`:
+
+ .. math::
+ :name:
+
+ V = \frac{K}{r^2}
+
+ ``:math:`` is the default inline role: `mc^2`
+
+ With `sympy `_ one can have formulas in ``some.py``
+ that are usable for calculation.
+ The formulas can be converted to latex
+ in the ``.stpl`` or ``.tpl`` file.
+
+ %def hyp(a,b):
+ % return a**2+b**2
+ %end
+
+ The long side of a rectangular triangle with legs
+ {{3}} and {{4}} is {{hyp(3,4)**0.5}}. See |hyp|.
+
+ .. _`hyp`:
+
+ .. math::
+ :name:
+
+ %import sympy
+ %from sympy.abc import a,b,c
+ {{sympy.latex(sympy.Eq(c,hyp(a,b)))}}
+
+ ├ dd_diagrams.tpl
+ .. encoding: utf-8
+ .. vim: ft=rst
+
+ .. _`dd_diagrams`:
+
+ {{DD('Diagrams')}}
+
+ .. _`exampletikz1`:
+
+ .. figure:: _images/egtikz1.png
+ :name:
+ :width: 30%
+
+ |exampletikz1|: Create from egtikz1.tikz
+
+ The usage of ``:name:`` produces: ``WARNING:
+ Duplicate explicit target name: ""``. Ignore.
+
+ Reference via |exampletikz1|.
+
+ ``.tikz``, ``.svg``, ``.dot``, ``.uml``, ``.eps`` or ``.stpl``
+ thereof and ``.pyg``, are converted to ``.png``.
+
+ .. _`examplesvg`:
+
+ .. figure:: _images/egsvg.png
+ :name:
+
+ |examplesvg|: Created from egsvg.svg.stpl
+
+ .. _`exampledot`:
+
+ .. figure:: _images/egdot.png
+ :name:
+
+ |exampledot|: Created from egdot.dot.stpl
+
+ .. _`exampleuml`:
+
+ .. figure:: _images/eguml.png
+ :name:
+
+ |exampleuml|: Created from eguml.uml
+
+ .. _`exampleplt`:
+
+ .. figure:: _images/egplt.png
+ :name:
+ :width: 30%
+
+ |exampleplt|: Created from egplt.pyg
+
+ .. _`examplepyx`:
+
+ .. figure:: _images/egpyx.png
+ :name:
+
+ |examplepyx|: Created from egpyx.pyg
+
+ .. _`examplecairo`:
+
+ .. figure:: _images/egcairo.png
+ :name:
+
+ |examplecairo|: Created from egcairo.pyg
+
+ .. _`examplepygal`:
+
+ .. figure:: _images/egpygal.png
+ :name:
+ :width: 30%
+
+ |examplepygal|: Created from egpygal.pyg
+
+ .. _`exampleother`:
+
+ .. figure:: _images/egother.png
+ :name:
+
+ |exampleother|: Created from egother.pyg
+
+ .. _`exampleeps`:
+
+ .. figure:: _images/egeps.png
+ :name:
+
+ |exampleeps|: Created from egeps.eps
+
+ %if False:
+ .. _`target_more_than_in_rest`:
+
+ It is OK to have more targets in the .stpl file.
+ %end
+
+ ├ tp.rest.stpl
+ .. encoding: utf-8
+ .. vim: ft=rst
+
+ % globals().update(include('utility.rst.tpl'))
+ % TP=lambda short,alist0=[0]:II('TP',alist0,short)
+
+ .. _`tp_test_plan`:
+
+ #########
+ Test Plan
+ #########
+
+ .. _`tp_requirement_tests`:
+
+ Requirement Tests
+ =================
+
+ .. _`tp_test_types`:
+
+ {{TP('Test Types')}}
+
+ Performance tests are only one kind of tests.
+
+ .. _`tp_no_duplication`:
+
+ {{TP('No duplication')}}
+
+ Since items in other documents are phrased as tests,
+ there is no need to repeat the text here.
+
+ - |sr_id|
+
+ Or better: Reference the according chapter:
+
+ - Test |sr_a_requirement_group|
+
+ .. _`tp_unit_tests`:
+
+ Unit Tests
+ ==========
+
+ .. _`tp_gen_file`:
+
+ {{TP('gen file')}}
+
+ Use ``.rst`` for included files
+ and start the file with ``_`` if generated.
+ How test documentation files are generated
+ from test source code can be specified in the ``gen`` file.
+
+ .. include:: _links_sphinx.rst'''
+
+
+example_ipdt_tree = r'''
+ wafw.py << file:///__wafw__
+ waf
+ #!/usr/bin/env sh
+ shift
+ ./wafw.py "$@"
+ waf.bat
+ @setlocal
+ @set PYEXE=python
+ @where %PYEXE% 1>NUL 2>NUL
+ @if %ERRORLEVEL% neq 0 set PYEXE=py
+ @%PYEXE% -x "%~dp0wafw.py" %*
+ @exit /b %ERRORLEVEL%
+ wscript
+ #vim: ft=python
+ from waflib import Logs
+ Logs.colors_lst['BLUE']='\x1b[01;36m'
+ top='.'
+ out='build'
+ def options(opt):
+ opt.load('rstdoc.dcx')
+ def configure(cfg):
+ cfg.load('rstdoc.dcx')
+ def build(bld):
+ bld.load('rstdoc.dcx')
+ bld.build_docs()
+ c
+ └ some.h
+ /*
+ #def gen_tst(lns,**kw):
+ # return [l.split('@')[1] for l in rlines(r'^\s*@',lns)]
+ #def gen_tst
+ #def gen_tstdoc(lns,**kw):
+ # return ['#) '+l.split('**')[1] for l in rlines(r'^/\*\*',lns)]
+ #def gen_tstdoc
+
+ @//generated from some.h
+ @#include <assert.h>
+ @#include "some.h"
+ @int main()
+ @{
+ */
+
+ /**Test add1()
+ @assert(add1(1)==2);
+ */
+ int add1(int a)
+ {
+ return a+1;
+ }
+
+ /**Test add2()
+ @assert(add2(1)==3);
+ */
+ int add2(int a)
+ {
+ return a+2;
+ }
+
+ /*
+ @}
+ */
+ pdt
+ ├ conf.py
+ project = 'PDT'
+ author = project+' Project Team'
+ copyright = '2019, '+author
+ version = '0.0.0'
+ release = version
+ try:
+ import sphinx_bootstrap_theme
+ html_theme_path = sphinx_bootstrap_theme.get_html_theme_path()
+ html_theme = 'bootstrap'
+ except:
+ pass
+ #these are enforced by rstdoc, but keep them for sphinx-build
+ numfig = 0
+ smartquotes = 0
+ source_suffix = '.rest'
+ templates_path = []
+ language = None
+ highlight_language = "none"
+ default_role = 'math'
+ pygments_style = 'sphinx'
+ exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store']
+ master_doc = 'index'
+ html_extra_path=['_traceability_file.svg'] #relative to conf.py
+ import os
+ on_rtd = os.environ.get('READTHEDOCS') == 'True'
+ if not on_rtd:
+ latex_engine = 'xelatex'
+ #You can postprocess pngs.default: png_post_processor = None
+ def png_post_processor(filename):
+ from PIL import Image, ImageChops
+ def trim(im):
+ bg = Image.new(im.mode, im.size, im.getpixel((0, 0)))
+ diff = ImageChops.difference(im, bg)
+ diff = ImageChops.add(diff, diff, 2.0, -100)
+ bbox = diff.getbbox()
+ if bbox:
+ return im.crop(bbox)
+ return im
+ im = Image.open(filename)
+ im = trim(im)
+ im.save(filename)
+ return filename
+ #the following are default and can be omitted
+ latex_elements = {'preamble':r"""
+ \usepackage{pgfplots}
+ \usepackage{unicode-math}
+ \usepackage{tikz}
+ \usepackage{caption}
+ \captionsetup[figure]{labelformat=empty}
+ \usetikzlibrary{arrows,snakes,backgrounds,patterns,matrix,shapes,fit,calc,shadows,plotmarks,intersections}
+ """
+ }
+ #new in rstdcx/dcx/py
+ tex_wrap = r"""
+ \documentclass[12pt,tikz]{standalone}
+ \usepackage{amsmath}
+ """+latex_elements['preamble']+r"""
+ \pagestyle{empty}
+ \begin{document}
+ %s
+ \end{document}
+ """
+ DPI = 600
+ #rst-internal target IDs to include in trace
+ rextrace_target_id="^[ipdt][0-9A-Z]{3}[a-z]+$"
+ target_id_color={"inform":("i","lightblue"), "plan":("p","red"),
+ "do":("d","yellow"), "test":("t","green")}
+ pandoc_doc_optref={'latex': '--template reference.tex',
+ 'html': {},#each can also be dict of file:template
+ 'pdf': '--template reference.tex',
+ 'docx': '--reference-doc reference.docx',
+ 'odt': '--reference-doc reference.odt'
+ }
+ _pandoc_latex_pdf = ['--listings','--number-sections','--pdf-engine',
+ 'xelatex','-V','titlepage','-V','papersize=a4',
+ '-V','toc','-V','toc-depth=3','-V','geometry:margin=2.5cm']
+ pandoc_opts = {'pdf':_pandoc_latex_pdf,'latex':_pandoc_latex_pdf,
+ 'docx':[],'odt':[],
+ 'html':['--mathml','--highlight-style','pygments']}
+ rst_opts = { #http://docutils.sourceforge.net/docs/user/config.html
+ 'strip_comments':True
+ ,'report_level':3
+ ,'raw_enabled':True
+ }
+ def name_from_directive(directive,count):
+ return directive[0].upper() + directive[1:] + ' ' + str(count)
+ ├ docutils.conf
+ [general]
+ halt_level: severe
+ report_level: error
+ ├ 000
+ ├ gen
+ # vim: ft=python
+ from os.path import relpath, dirname
+ from_to_fun_kw = [ #fun and gen_fun() or gen()
+ ['../../c/some.h','_sometst.rst','tstdoc',{}],
+ ['../../c/some.h','../../build/c/some_tst.c','tst',{}]
+ ]
+ ├ i.rest.stpl
+ .. vim: ft=rst
+
+ .. _`i000`:
+
+ % globals().update(include('pdt.rst.tpl',
+ % Title="pdt inform"
+ % ))
+
+ .. _`i000inform`:
+ %__i000('inform')
+
+ Purpose is non-technical, from or for externals.
+
+ The purpose of the ``__i000`` python function is to count the entries.
+ If you don't care about counted items, you can use normal RST, or provide your own ``__i000``.
+
+ plan-do-test: `pdt `__.
+
+ %epilog()
+ ├ p.rest.stpl
+ .. vim: ft=rst
+
+ .. _`p000`:
+
+ % globals().update(include('pdt.rst.tpl',
+ % Title="pdt plan"
+ % ))
+
+
+ .. _`p000grouping`:
+ %__p000_('Grouping') ############################################################
+
+ .. _`p000headers`:
+ %__p000('headers')
+ |i000inform|
+
+ Headers are groupings of content items,
+ but are not as important as content items.
+
+ .. _`p000subproject`:
+ %__p000('sub-project')
+ |i000inform|
+
+ For a sub-project prefer a new ``pdt`` over headers.
+
+
+ %epilog()
+ ├ d.rest.stpl
+ .. vim: ft=rst
+
+ .. _`d000`:
+
+ % globals().update(include('pdt.rst.tpl',
+ % Title="pdt do"
+ % ))
+
+
+ .. _`d000repo`:
+ %__d000('repo')
+ |p000headers| is an example of a link to another item.
+
+ - ``pdt`` documents the development
+ - ``doc`` documents the SW
+
+ .. _`000repofigure`:
+
+ .. figure:: _images/repo.png
+ :name:
+ :width: 50%
+
+ |000repofigure|: Example project repo.
+
+ .. _`d000notrace`:
+ %__d000('notrace')
+ |p000subproject|
+
+ The figure target above does not start with 'd'.
+ ``rextrace_target_id`` is set to ignore such targets.
+
+ %epilog()
+ ├ t.rest.stpl
+ .. vim: ft=rst
+
+ .. _`t000`:
+
+ % globals().update(include('pdt.rst.tpl',
+ % Title="pdt test"
+ % ))
+
+
+ .. _`t000testitem1`:
+ %__t000('test item 1')
+
+ Link *plan* and *do* items that are tested here, e.g.
+
+ - |p000headers| fits to |d000repo|
+
+ .. _`t000testitem2`:
+ %__t000('test item 2')
+ |d000notrace|
+
+ Tests manually.
+
+ .. _`t000codegeneratedtestitems`:
+ %__t000('code generated test items')
+
+ .. include:: _sometst.rst
+
+ %epilog()
+
+
+ └ repo.pyg
+ # vim: ft=python ts=4
+ import drawSvg
+ d = drawSvg.Drawing(400, 800, origin=(0,0))
+ draw = lambda what,*args,**kwargs: d.append(getattr(drawSvg,what)(*args,**kwargs))
+ for e in "Image Text Rectangle Circle ArcLine Path Lines Line Arc".split():
+ eval(compile("{}=lambda *args, **kwargs: draw('{}',*args,**kwargs)".format(e.lower(),e),"repo","exec"),globals())
+ p=dict(fill='red', stroke_width=2, stroke='black')
+ th=20
+ y=[d.height]
+ dx,dy=2*th,-2*th
+ x=lambda : len(y)*dx
+ indent=lambda: y.append(0)
+ def back(n=1):
+ yy = sum(y[:-n])
+ y[-n-1]+=sum(y[-n:])
+ del y[-n:]
+ line(x(),yy,x(),sum(y),**p)
+ def entry(t):
+ yy = sum(y)
+ line(x(),yy,x(),yy+dy,**p)
+ y[-1]=y[-1]+dy
+ yy += dy
+ line(x(),yy,x()+dx,yy,**p)
+ text(t,th,x()+dx,yy)
+ entry("sw")
+ indent()
+ entry("pdt")
+ indent()
+ entry("000")
+ indent()
+ entry("{i,p,d,t}.rest.stpl")
+ back()
+ entry("001")
+ indent()
+ entry("{i,p,d,t}.rest.stpl")
+ back(2)
+ entry('doc')
+ indent()
+ entry('sw_{x,y,z}.rest.stpl')
+ back()
+ entry('c')
+ indent()
+ entry('sw_{x,y,z}.c')
+ back()
+ entry('python')
+ indent()
+ entry('sw')
+ indent()
+ entry('__init__.py')
+ back(2)
+ entry('test')
+ indent()
+ entry('test_{x,y,z}.py')
+ back()
+ entry('waf')
+ entry('wscript')
+ ├ 001
+ __imgs__
+ ├ i.rest.stpl
+ .. vim: ft=rst
+
+ .. _`i001`:
+
+ %globals().update(include('pdt.rst.tpl',
+ %Title="Information on Diagrams",
+ %Type="inform"
+ %))
+
+ .. _`i001figure`:
+ %__i001('figure')
+
+ An item is not included in the traceability diagram unless it has links to other items.
+ The reference to |i000inform| here tests inclusion.
+
+ .. _`001fig1`:
+
+ .. figure:: _images/egtikz.png
+ :name:
+ :width: 50%
+
+ |001fig1|: Caption here.
+ Reference this via ``|001fig1|``.
+
+ .. _`i001rstinclude`:
+ %__i001('rst include')
+ |i000inform|
+
+ .. include:: i_included.rst
+
+ .. _`i001stplincludetpl`:
+ %__i001('stpl include (tpl)')
+ |i000inform|
+
+ %include('i_diagrams.tpl',_i001=_i001)
+
+ The following definitions are placed here, as
+ Pandoc does not accept
+ `definitions in included files `__.
+
+ .. |eps| image:: _images/egeps.png
+
+ %epilog()
+
+ ├ i_diagrams.tpl
+ .. vim: ft=rst
+
+ .. _`i001diagrams`:
+ %__i001('diagrams')
+ |i000inform|
+
+ .. _`exampletikz1`:
+
+ .. figure:: _images/egtikz1.png
+ :name:
+ :width: 30%
+
+ |exampletikz1|: Create from egtikz1.tikz
+
+ ``.tikz``, ``.svg``, ``.dot``, ``.uml``, ``.eps`` or ``.stpl``
+ thereof and ``.pyg``, are converted to ``.png``.
+
+ .. _`examplesvg`:
+
+ .. figure:: _images/egsvg.png
+ :name:
+
+ |examplesvg|: Created from egsvg.svg.stpl
+
+ .. _`exampledot`:
+
+ .. figure:: _images/egdot.png
+ :name:
+
+ |exampledot|: Created from egdot.dot.stpl
+
+ .. _`exampleuml`:
+
+ .. figure:: _images/eguml.png
+ :name:
+
+ |exampleuml|: Created from eguml.uml
+
+ .. _`exampleplt`:
+
+ .. figure:: _images/egplt.png
+ :name:
+ :width: 30%
+
+ |exampleplt|: Created from egplt.pyg
+
+ .. _`examplepyx`:
+
+ .. figure:: _images/egpyx.png
+ :name:
+
+ |examplepyx|: Created from egpyx.pyg
+
+ .. _`examplecairo`:
+
+ .. figure:: _images/egcairo.png
+ :name:
+
+ |examplecairo|: Created from egcairo.pyg
+
+ .. _`examplepygal`:
+
+ .. figure:: _images/egpygal.png
+ :name:
+ :width: 30%
+
+ |examplepygal|: Created from egpygal.pyg
+
+ .. _`exampleother`:
+
+ .. figure:: _images/egother.png
+ :name:
+
+ |exampleother|: Created from egother.pyg
+
+ .. _`exampleeps`:
+
+ .. figure:: _images/egeps.png
+ :name:
+
+ |exampleeps|: Created from egeps.eps
+
+ %if False:
+ .. _`target_more_than_in_rest`:
+
+ It is OK to have more targets in the .stpl file.
+ %end
+
+ Make a reference to |exampletikz1|.
+
+ ├ i_included.rst
+ .. vim: ft=rst
+
+
+
+ .. _`i001code`:
+
+ |i001code|: Listing showing struct.
+
+ .. code-block:: cpp
+ :name:
+
+ struct xxx{
+ int yyy; //yyy for zzz
+ }
+
+ .. _`i001table`:
+
+ i00101: **table**
+
+ Include normal ``.rst``.
+
+ .. include:: i_tables.rst
+
+ .. _`i001math`:
+
+ i00102: **math**
+
+ Again include the stpl way.
+
+ .. vim: ft=rst
+
+ .. _`i001math1`:
+
+ .. math::
+ :name:
+
+ V = \frac{K}{r^2}
+
+ ``:math:`` is the default inline role: `mc^2`
+
+
+ ├ i_included.rst.stpl
+ .. vim: ft=rst
+
+ %globals().update(include('pdt.rst.tpl'))
+
+ .. _`i001code`:
+
+ |i001code|: Listing showing struct.
+
+ .. code-block:: cpp
+ :name:
+
+ struct xxx{
+ int yyy; //yyy for zzz
+ }
+
+ .. _`i001table`:
+ %__001('table')
+ |i000inform|
+
+ Include normal ``.rst``.
+
+ .. include:: i_tables.rst
+
+ .. _`i001math`:
+ %__i001('math')
+ |i000inform|
+
+ Again include the stpl way.
+
+ %include('i_math.tpl')
+
+ ├ i_math.tpl
+ .. vim: ft=rst
+
+ .. _`i001math1`:
+
+ .. math::
+ :name:
+
+ V = \frac{K}{r^2}
+
+ ``:math:`` is the default inline role: `mc^2`
+
+ └ i_tables.rst
+ .. vim: ft=rst
+
+ .. _`i001table1`:
+
+ |i001table1|: Table legend
+
+ .. table::
+ :name:
+
+ +--------+--------+
+ | A | B |
+ +========+========+
+ | |eps| | |eps| |
+ +--------+--------+
+
+ .. _`i001table2`:
+
+ |i001table2|: Table legend
+
+ .. list-table::
+ :name:
+ :widths: 20 80
+ :header-rows: 1
+
+ * - Bit
+ - Function
+
+ * - 0
+ - xxx
+
+ Reference |i001table1| or |i001table2| does not show
+ ``i001table1`` or ``i001table2``.
+ ├ index.rest.stpl
+ .. vim: ft=rst
+
+ %globals().update(include('pdt.rst.tpl'
+ %,Title="rstdoc - pdt example"
+ %,Type="inform"
+ %))
+
+ %from pathlib import Path
+ %thisdir=Path(__file__).parent
+ %stem = lambda x:os.path.splitext(x)[0].replace('\\', '/')
+ %from os.path import dirname, basename
+
+ .. toctree::
+
+ %for x in sorted(set(y.parent for y in thisdir.rglob("*.rest*") if not y.name.startswith('index.rest'))):
+ % fs = dict((f.name[0],f) for f in Path(x).rglob("*.rest*"))
+ % for i in "ipdt":
+ % if i in fs:
+ {{stem(fs[i].relative_to(thisdir))}}
+ % del fs[i]
+ % end
+ % end
+ % for i in fs:
+ {{stem(fs[i].relative_to(thisdir))}}
+ % end
+ % end
+
+ .. REMOVE THIS IF NO LINKING OVERVIEW WANTED
+ .. include:: _traceability_file.rst
+
+ .. include:: _links_sphinx.rst
+
+ ├ pdt.rst.tpl
+ % #expects Title and, optionally, Contact, Type and Status (defaulted below)
+ % setdefault('Contact','roland.puntaier@gmail.com')
+ % setdefault('Type','pdt')
+ % setdefault('Status','draft')
+ % assert Type in "pdt inform".split(), "Wrong PDT Type"
+ % assert Status in "draft final replaced deferred rejected withdrawn".split(), "Wrong PDT Status"
+ %
+ % #see pdtAAA (__main_file__ is the main stpl file)
+ % from rstdoc.dcx import pdtAAA
+ % pdtAAA(__main_file__,globals()) #,pdtid=".*/(.).\w+.stpl")
+ %
+ %if defined('Title'):
+ %ttl=AAA+" - "+Title if defined('AAA') else Title
+ {{'#'*len(ttl)}}
+ {{ttl}}
+ {{'#'*len(ttl)}}
+ %end
+
+ %if defined('iAAA'):
+ :PDT: {{AAA}}
+ :Contact: {{!Contact}}
+ :Type: {{!Type}}
+ :Status: {{Status}}
+ %end
+ %def pagebreak():
+ .. raw:: openxml
+
+
+
+
+
+
+
+ .. raw:: html
+
+
+
+ .. raw:: latex
+
+ \pagebreak
+
+ .. for docutils
+ .. raw:: odt
+
+
+
+ .. for pandoc
+ .. raw:: opendocument
+
+
+
+ %end
+ %def epilog():
+ .. include:: /_links_sphinx.rst
+ %end #epilog'''
+
+example_over_tree = r'''
+ wafw.py << file:///__wafw__
+ waf
+ #!/usr/bin/env sh
+ shift
+ ./wafw.py "$@"
+ waf.bat
+ @setlocal
+ @set PYEXE=python
+ @where %PYEXE% 1>NUL 2>NUL
+ @if %ERRORLEVEL% neq 0 set PYEXE=py
+ @%PYEXE% -x "%~dp0wafw.py" %*
+ @exit /b %ERRORLEVEL%
+ wscript
+ #vim: ft=python
+ from waflib import Logs
+ Logs.colors_lst['BLUE']='\x1b[01;36m'
+ top='.'
+ out='build'
+ def options(opt):
+ opt.load('rstdoc.dcx')
+ def configure(cfg):
+ cfg.load('rstdoc.dcx')
+ def build(bld):
+ bld.load('rstdoc.dcx')
+ bld.build_docs()
+ org
+ ├ process
+ │ └ SOP
+ │ └ purchase.rest
+ │ Purchase
+ │ ========
+ │
+ │ .. _`sop_purchase`:
+ │
+ │ :sop_purchase:
+ │
+ │ A contributor places a link under the purchase folder.
+ │
+ │
+ │ .. include:: /_links_sphinx.rst
+ ├ discussion
+ │ └ topic1.rest
+ │ Topic1
+ │ ======
+ │
+ │ .. _`topic_merge_delay`:
+ │
+ │ :topic_merge_delay:
+ │
+ │ Can someone take over review of |000|, as I'm busy with ...
+ │
+ │ .. include:: /_links_sphinx.rst
+ ├ mediation
+ │ └ conflict1.rest
+ │ Conflict1
+ │ =========
+ │
+ │ .. _`conflict_000`:
+ │
+ │ :conflict_000:
+ │
+ │ Conflicting view on |000| between ...
+ │
+ │ .. include:: /_links_sphinx.rst
+ ├ contributor
+ └ c1
+ ├ assigned.rest
+ │ Assigned for c1
+ │ ===============
+ │
+ │ Planning and coordinating |000|
+ │
+ │ Implementation of |fw_000|
+ │
+ │ .. include:: /_links_sphinx.rst
+ ├ log
+ └ 2019.rest
+ 2019
+ ====
+
+ .. _`c1_20191101`
+
+ |issue1|
+
+ .. _`c1_20191102`
+
+ |issue1|
+ It was necessary to refactor ...
+
+ .. include:: /_links_sphinx.rst
+ doc
+ ├ index.rest
+ │ Documentation
+ │ =============
+ │
+  │     .. toctree::
+ │
+ │ tutorial.rest
+ │
+ │ .. include:: /_links_sphinx.rst
+ └ tutorial.rest
+ │ Tutorial
+ │ ========
+ │
+ │ Example API usage
+ │
+ │ .. include:: /_links_sphinx.rst
+ pdt
+ └ 000
+ ├ info.rest
+ │ .. _`000`:
+ │
+ │ Feature Info
+ │ ============
+ │
+ │ .. _`i000a`:
+ │
+ │ :i000a: info
+ │
+ │ .. include:: /_links_sphinx.rst
+ ├ plan.rest
+ │ Feature Plan
+ │ ============
+ │
+ │ .. _`p000a`:
+ │
+ │ :p000a: plan
+ │
+ │ .. include:: /_links_sphinx.rst
+ ├ do.rest
+ │ Feature Spec
+ │ ============
+ │
+ │ .. _`d000a`:
+ │
+ │ :d000a: spec
+ │
+ │ .. include:: /_links_sphinx.rst
+ └ test.rest
+ │ Feature Test
+ │ ============
+ │
+ │ .. _`t000a`:
+ │
+ │ :t000a: test
+ │
+ │ .. include:: /_links_sphinx.rst
+ dev
+ ├ issues
+  │  ├ issue1.rest
+ │ Issue1 Title
+ │ ============
+ │
+ │ .. _`issue1`:
+ │
+ │ :issue1:
+ │
+ │ SW does not link to device, if ...
+ │
+ │ .. include:: /_links_sphinx.rst
+  │  └ issue2.rest
+ │ Issue2 Title
+ │ ============
+ │
+ │ .. _`issue2`:
+ │
+ │ :issue2:
+ │
+ │ Test xyz fails.
+ │
+ │ .. include:: /_links_sphinx.rst
+ ├ hw
+ │ ├ casing
+ │ │ ├ plan.rest
+ │ .. _`case001`:
+ │
+ │ :case001:
+ │
+ │ According |d000a| ...
+ │
+ │ .. include:: /_links_sphinx.rst
+ │ │ ├ scad/
+ │ │ └ test
+ │ │ └ stability.rest
+ │ Casing Stability Tests
+ │ ======================
+ │
+ │ .. _`fall_test`:
+ │
+ │ :fall_test:
+ │
+ │ The casing is pushed from a table.
+ │
+ │ .. include:: /_links_sphinx.rst
+ │ ├ pcb1
+ │ │ ├ plan.rest
+ │ PCB1 Implementation
+ │ ===================
+ │
+ │ .. _`pcb1_000`:
+ │
+ │ :pcb1_000:
+ │
+  │       Overview of functional units of pcb1.
+ │
+ │ .. include:: /_links_sphinx.rst
+ │ │ ├ pcb1.sch
+ │ │ └ test/
+ │ └ test/
+ ├ sw
+ │ ├ fw
+ │ │ ├ plan.rest
+ │ Firmware
+ │ ========
+ │
+ │ .. _`fw_000`:
+ │
+ │ :fw_000:
+ │
+ │ To satisfy |000| these steps need to be taken.
+ │
+ │ .. include:: /_links_sphinx.rst
+ │ │ ├ controller1/
+ │ │ └ C
+ │ │ └ init.c
+ // just an example
+ │ │ ├ test/
+ │ ├ android/
+ │ │ ├ plan.rest
+ │ Android App
+ │ ===========
+ │
+ │ .. _`appplan`:
+ │
+ │ :appplan:
+ │
+ │ Implementation plan satisfying |000|.
+ │
+ │ .. include:: /_links_sphinx.rst
+ │ │ ├ app/
+ │ │ ├ testapp/
+ │ └ test/
+ └ test/
+ contribution.rest
+ Contributing
+ ============
+
+ .. _`how_to_contribute`:
+
+ :how_to_contribute:
+
+ - |general_goal|
+ - pdt about plans
+
+ .. _`unassigned_issues`:
+
+ :unassigned_issues:
+
+ These issues are still unassigned and need new contributors.
+
+ |issue2|
+
+ .. include:: /_links_sphinx.rst
+ readme.rest
+ Project Entry Point
+ ===================
+
+ .. _`general_goal`:
+
+ :general_goal:
+
+ Overview of goal and links to further information.
+
+ .. include:: /_links_sphinx.rst
+ index.rest
+ .. toctree::
+
+ .. vim: ft=rst
+
+ ############
+ Example Tree
+ ############
+
+ .. toctree::
+
+ readme.rest
+ contribution.rest
+ org/discussion/topic1.rest
+ org/mediation/conflict1.rest
+ org/process/SOP/purchase.rest
+ org/contributor/c1/assigned.rest
+ org/contributor/c1/log/2019.rest
+ pdt/000/info.rest
+ pdt/000/do.rest
+ pdt/000/plan.rest
+ pdt/000/test.rest
+ doc/tutorial.rest
+ dev/hw/casing/plan.rest
+ dev/hw/casing/test/stability.rest
+ dev/hw/pcb1/plan.rest
+ dev/sw/android/plan.rest
+ dev/sw/fw/plan.rest
+ dev/issues/issue1.rest
+ dev/issues/issue2.rest
+
+ .. include:: /_links_sphinx.rst'''
+
+
+def initroot(
+ rootfldr
+ ,sampletype
+ ):
+ '''
+ Creates a sample tree in the file system.
+
+ :param rootfldr: directory name that becomes root of the sample tree
+ :param sampletype: either 'ipdt' or 'stpl' for templated sample trees, or 'rest' or 'over' for non-templated
+
+ See ``example_rest_tree``, ``example_stpl_subtree``, ``example_ipdt_tree``, ``example_over_tree`` in dcx.py.
+
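+    A minimal usage sketch (assuming this module is importable as
+    ``rstdoc.dcx``; the folder name ``myproj`` is just an example, and the
+    optional ``txdir`` package must be installed or the function returns
+    without creating anything)::
+
+        from rstdoc.dcx import initroot
+
+        # create ./myproj with a doc/ folder of .rest.stpl sample files
+        initroot('myproj', 'stpl')
+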
+ '''
+
+ if txdir is None:
+ return
+
+ def rR(instr):
+ if _rest == '.rst':
+ #>instr='x.rst y.rest z.rst'
+ instr = instr.replace('.rst','.rrrr')
+ instr = instr.replace('.rest','.rst')
+ instr = instr.replace('.rrrr','.rest')
+ #>instr == 'x.rest y.rst z.rest'
+ return instr
+
+ thisfile = __file__.replace('\\', '/')
+ thisdir = dirname(thisfile)
+ tex_ref = normjoin(thisdir, 'reference.tex')
+ docx_ref = normjoin(thisdir, 'reference.docx')
+ odt_ref = normjoin(thisdir, 'reference.odt')
+ wafw = normjoin(thisdir, 'wafw.py')
+ if sampletype == 'ipdt':
+ imglines = example_rest_tree.splitlines()
+ imglines = imglines[
+ list(rindices('├ egtikz.tikz',imglines))[0]:
+ list(rindices('├ gen',imglines))[0]]
+ imglines = [' '*4+x for x in imglines]
+ example_tree = example_ipdt_tree.replace('__imgs__',('\n'.join(imglines)+'\n').lstrip())
+ elif sampletype == 'over':
+ example_tree=example_over_tree
+ else:
+ example_tree=example_rest_tree
+ example_tree = rR(example_tree)
+ inittree = [
+ l for l in example_tree.replace(
+ '__dcx__', thisfile).replace(
+ '__tex_ref__', tex_ref).replace(
+ '__docx_ref__', docx_ref).replace(
+ '__odt_ref__', odt_ref).replace(
+ '__wafw__', wafw).replace(
+ '__code__', rootfldr.strip()=='.' and base(cwd()) or rootfldr
+ ).splitlines()
+ ]
+ if sampletype == 'stpl':
+ def _replace_lines(origlns, start, stop, insertlns):
+ return origlns[:list(rindices(start, origlns))
+ [0]] + insertlns + origlns[list(
+ rindices(stop, origlns))[0]:]
+ inittree = _replace_lines(inittree, rR('├ index.rest'), '├ egtikz.tikz',
+ rR(example_stpl_subtree).lstrip('\n').splitlines())
+ mkdir(rootfldr)
+ with new_cwd(rootfldr):
+ txdir.view_to_tree(inittree)
+
+def index_dir(
+ root='.'
+ ):
+ '''
+ Index a directory.
+
+ :param root: All subdirectories of ``root`` that contain a ``.rest`` or ``.rest.stpl`` file are indexed.
+
+ - expands the .stpl files
+ - generates the files as defined in the ``gen`` file (see example in dcx.py)
+ - generates ``_links_xxx.rst`` for xxx = {sphinx latex html pdf docx odt}
+ - generates ``.tags`` with jumps to reST targets
+
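+    For example, to index a project tree programmatically rather than via the
+    ``rstdcx`` command line (a sketch; assumes the current directory is the
+    project root)::
+
+        from rstdoc.dcx import index_dir
+
+        # expand .stpl files, then write _links_xxx.rst and .tags per folder
+        index_dir('.')
+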
+ '''
+
+ # do all stpl's upfront. ``create_links_and_tags()`` needs them
+ from pathlib import Path
+ # reversed evaluates deeper stpls first
+ for f in reversed(sorted([str(x) for x in Path(root).rglob('*.stpl')])):
+ dpth = normjoin(root, f)
+ if isfile(dpth):
+ outpth = stem(dpth)
+ try:
+ dostpl(dpth, outpth)
+ except Exception as err:
+ print('Error expanding %s: %s' % (dpth, str(err)))
+ # link, gen and tags per directory
+ fldrs = Fldrs(root)
+ fldrs.scandirs()
+ #reversed to do create_traceability_file at the end
+ for folder, fldr in reversed(fldrs.items()):
+ # generated files need to be there to be indexed
+ genpth = normjoin(folder, 'gen')
+ if exists(genpth):
+ try:
+ for f, t, d, kw in parsegenfile(genpth):
+ gen(normjoin(folder, f),
+ target=normjoin(folder, t),
+ fun=d,
+ **kw)
+ except Exception as err:
+ print('Generating files in %s seems not meant to be done: %s' %
+ (genpth, str(err)))
+ fldr.create_links_and_tags()
+
+
+description = (
+
+"""
+``rstdcx`` CLI
+--------------
+
+Without parameters: creates ``|substitution|`` links and .tags ctags for reST targets.
+
+With two or three parameters: process a file or directory to an output file or directory
+through Pandoc, Sphinx or Docutils (chosen by the third parameter):
+
+- ``html``, ``docx``, ``odt``, ``pdf``, ... use Pandoc.
+
+- ``rst_html``, ``rst_odt``, ``rst_pdf``, ... use Docutils
+  (``rst2html``, ...).
+
+- ``sphinx_html``, ``sphinx_pdf``, ... use Sphinx.
+  Sphinx provides a nice entry point via the
+  sphinx bootstrap theme.
+
+Parameters from the 4th onward are Python code that defines variables usable in ``.stpl`` files.
+
+PDF output needs LaTeX. Otherwise you can make odt or docx and use
+
+- win: ``swriter.exe --headless --convert-to pdf Untitled1.odt``
+- linux: ``lowriter --headless --convert-to pdf Untitled1.odt``
+
+Inkscape (.eps, .svg), Dot (.dot), PlantUML (.uml), LaTeX (.tex, .tikz)
+are converted to .png into ``./_images`` or ``/_images`` or '.'.
+Any of the files can be a SimpleTemplate template (xxx.yyy.stpl).
+
+Configuration is in ``conf.py`` or ``../conf.py``.
+
+``rstdoc --stpl|--rest|--ipdt|--over`` creates a sample project tree:
+
+``--stpl`` with ``.rest.stpl`` template files,
+``--rest`` with only a doc folder with ``.rest`` files,
+``--ipdt`` with inform-plan-do-test enhancement cycles,
+``--over`` with ``.rest`` files all over the project tree, including symbolic links.
+
+Examples
+--------
+
+Example folders (see wscript and Makefile there)::
+
+ rstdoc --rest [--rstrest]
+ rstdoc --stpl [--rstrest]
+ rstdoc --ipdt [--rstrest]
+ rstdoc --over [--rstrest]
+
+Use ``--rstrest`` to produce ``.rst`` for the main files,
+as ``.rest`` is not rendered by github/gitlab,
+which also don't support file inclusion,
+so there is no need for two extensions there anyway.
+
+Example usages with the files generated by ``rstdoc --stpl tmp``:
+
+.. code-block:: sh
+
+ cd tmp/doc
+ rstdcx #expand .stpl and produce .tag and _links_xxx files
+
+ #expand stpl and append substitutions (for simple expansion use ``stpl .``)
+ rstdcx dd.rest.stpl - rest # expand to stdout, appending dd.html substitutions, to pipe to Pandoc
+ rstdcx dd.rest.stpl - html. # as before
+ rstdcx dd.rest.stpl - docx. # expand to stdout, appending dd.docx substitutions, to pipe to Pandoc
+ rstdcx dd.rest.stpl - newname.docx. # expand template, appending substitutions for target newname.docx
+ rstdcx dd.rest.stpl - html # expand to stdout, already process through Pandoc to produce html on stdout
+ rstdcx dd.rest.stpl # as before
+ rstdcx sy.rest.stpl - rst_html # expand template, already process through Docutils to produce html on stdout
+ stpl sy.rest.stpl | rstdcx - - sy.html. # appending sy.html substitutions, e.g. to pipe to Pandoc
+   stpl dd.rest.stpl | rstdcx - - dd.html # appending dd.html substitutions and produce html on stdout via Pandoc
+ rstdcx dd.rest.stpl dd.rest # expand into dd.rest, appending substitutions for target dd.html
+ rstdcx dd.rest.stpl dd.html html # expand template, process through Pandoc to produce dd.html
+ rstdcx dd.rest.stpl dd.html # as before
+ rstdcx dd.rest.stpl dd.html rst_html # expand template, already process through Docutils to produce dd.html
+ rstdcx dd.rest.stpl dd.docx # expand template, process through Pandoc to produce dd.docx
+ rstdcx dd.rest.stpl dd.odt pandoc # expand template, process through Pandoc to produce dd.odt
+ rstdcx dd.rest.stpl dd.odt # as before
+ rstdcx dd.rest.stpl dd.odt rst_odt # expand template, process through Docutils to produce dd.odt
+ rstdcx dd.rest.stpl dd.odt rst # as before
+ rstdcx . build html # convert current dir to build output dir using pandoc
+ rstdcx . build sphinx_html # ... using sphinx (if no index.rest, every file separately)
+
+ #Sphinx is not file-oriented
+   #but with rstdcx you need to provide the file that serves as the Sphinx ``master_doc`` (normally: index.rest)
+ #Directly from ``.stpl`` does not work with Sphinx
+ rstdcx index.rest ../build/index.html sphinx_html # via Sphinx the output directory must be different
+
+   #convert the graphics and place them into _images or /_images
+ #if no _images directory exists they will be placed into the same directory
+ rstdcx egcairo.pyg
+ rstdcx egdot.dot.stpl
+ rstdcx egeps.eps
+ rstdcx egother.pyg
+ rstdcx egplt.pyg
+ rstdcx egpygal.pyg
+ rstdcx egpyx.pyg
+ rstdcx egsvg.svg.stpl
+ rstdcx egtikz.tikz
+ rstdcx egtikz1.tikz
+ rstdcx eguml.uml
+
+ #Convert graphics to a png (even if _images directory exists):
+ rstdcx eguml.uml eguml.png
+
+ #Files to other files:
+
+ rstdoc dd.rest.stpl dd.rest
+ rstdoc dd.rest.stpl dd.html html
+ rstdoc dd.rest.stpl dd.html
+ rstdoc sr.rest.stpl sr.html rst_html
+ rstdoc dd.rest.stpl dd.docx
+ rstdoc dd.rest.stpl dd.odt pandoc
+ rstdoc dd.rest.stpl dd.odt
+ rstdoc sr.rest.stpl sr.odt rst_odt
+ rstdoc sr.rest.stpl sr.odt rst
+ rstdoc index.rest build/index.html sphinx_html
+
+   #Directories to other directories with output type info:
+
+ rstdoc . build html
+ rstdoc . build sphinx_html
+
+Grep with python re in .py, .rst, .rest, .stpl, .tpl::
+
+ rstdoc --pygrep inline
+
+Grep for keyword lines containing 'png'::
+
+ rstdoc --kw png
+
+Default keyword lines::
+
+ .. {{{kw1,kw2
+ .. {kw1,kw2}
+ {{_ID3('kw1 kw2')}}
+ %__ID3('kw1 kw2')
+ :ID3: kw1 kw2
+
+"""
+
+)
+
+
+def main(**args):
+ '''
+ This corresponds to the |rstdcx| shell command.
+
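+    Besides the command line, it can be called programmatically with keyword
+    arguments matching the CLI parameters, e.g. (a sketch)::
+
+        from rstdoc.dcx import main
+
+        # same as: rstdcx dd.rest.stpl dd.html html
+        main(infile='dd.rest.stpl', outfile='dd.html', outtype='html')
+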
+ '''
+
+ import argparse
+
+ if not args:
+ parser = argparse.ArgumentParser(description=description,
+ formatter_class=argparse.RawDescriptionHelpFormatter)
+ parser.add_argument('--version', action='version', version = __version__)
+ parser.add_argument(
+ '--rstrest',
+ dest='rstrest',
+ action='store_true',
+ default=False,
+        help='For the sample projects, make .rst the main files and .rest the included files, reversing the default.')
+ parser.add_argument(
+ '--rest',
+ dest='restroot',
+ action='store',
+ help='Create a sample directory with `/doc/{sy,ra,sr,dd,tp}.rest` files.')
+ parser.add_argument(
+ '--stpl',
+ dest='stplroot',
+ action='store',
+ help='Create a sample directory with `/doc/{sy,ra,sr,dd,tp}.rest.stpl` files.')
+ parser.add_argument(
+ '--ipdt',
+ dest='ipdtroot',
+ action='store',
+ help='Create a sample directory with `/pdt/AAA/{i,p,d,t}.rest.stpl` for inform-plan-do-test cycles (A is base 36).')
+ parser.add_argument(
+ '--over',
+ dest='overroot',
+ action='store',
+ help='Create a sample directory with `.rest` files all over the project tree.')
+ parser.add_argument(
+ '--pygrep',
+ dest='pygrep',
+ action='store',
+ help='Grep rst doc using python regular expressions.')
+ parser.add_argument(
+ '--kw',
+ dest='kw',
+ action='store',
+        help='List keyword lines (.. {kw1,kw2,...}) that contain all keywords given as parameter, e.g. kw1,kw2.')
+ parser.add_argument(
+ '-I',
+ action='append',
+ metavar='folder',
+ nargs=1,
+ help='Add folders to look for ``conf.py``, ``.[s]tpl`` and reference.docx/odt/tex')
+ parser.add_argument(
+ 'infile',
+ nargs='?',
+ help='''\
+Integrates Sphinx, Pandoc and Docutils to produce output supported by any of them.
+To use all three, the reStructuredText must not use Sphinx extensions.
+Input file, dir or - for stdin.''')
+ parser.add_argument(
+ 'outfile',
+ nargs='?',
+ help='Output file, dir or - or nothing to print to std out.')
+ parser.add_argument(
+ 'outtype',
+ nargs='?',
+ default=None,
+ help="""One of {pandoc,sphinx,}x{html,docx,...}
+or omitted for default (pandoc) (- if further code parameters are given)."""
+ )
+ parser.add_argument(
+ 'code',
+ nargs='*',
+ help="""Further parameters are python code,
+to define variables that can be used in templates."""
+ )
+ args = parser.parse_args().__dict__
+
+ if 'code' in args and args['code'] is not None and args['code'] != []:
+ code = '\n'.join(args['code'])
+ eval(compile(code, '', 'exec'), globals())
+
+ _chk_rstrest = lambda :'stplroot' in args and args['rstrest'] and _set_rstrest('.rst')
+ if 'stplroot' in args and args['stplroot']:
+ _chk_rstrest()
+ initroot(args['stplroot'], 'stpl')
+ return
+ elif 'restroot' in args and args['restroot']:
+ _chk_rstrest()
+ initroot(args['restroot'], 'rest')
+ return
+ elif 'ipdtroot' in args and args['ipdtroot']:
+ _chk_rstrest()
+ initroot(args['ipdtroot'], 'ipdt')
+ return
+ elif 'overroot' in args and args['overroot']:
+ _chk_rstrest()
+ initroot(args['overroot'], 'over')
+ return
+
+ if 'pygrep' in args and args['pygrep']:
+ for f,i,l in grep(args['pygrep']):
+ print('"{}":{} {}'.format(f,i,l))
+ elif 'kw' in args and args['kw']:
+ for _, (f,i,l) in yield_with_kw(args['kw']):
+ print('"{}":{} {}'.format(f,i,l))
+ elif 'infile' in args and args['infile']:
+ for x in 'infile outfile outtype'.split():
+ if x not in args:
+ args[x] = None
+ if 'I' in args and args['I']:
+ g_include[:] = reduce(lambda x,y:x+y,args['I'],[])
+ outinfo = args['outtype'] if args['outtype'] != '-' else None
+ outfile = args['outfile']
+ infiles = [args['infile']]
+ outfiles = [args['outfile']]
+ notexistsout = outfile and outfile!='-' and not exists(outfile)
+ imgfiles = []
+ if isdir(args['infile']):
+ with new_cwd(args['infile']):
+ _get_rstrest()
+ index_dir(args['infile'])
+ if outfile is None:
+ return
+ imgfiles = [x for x in os.listdir(args['infile']) if _is_graphic(stem_ext(x)[1])]
+ infiles = [x for x in os.listdir(args['infile']) if is_rest(x)]
+ if notexistsout:
+ mkdir(outfile)
+ elif outfile:
+ bout = base(outfile)
+ fdo = bout.find('.')
+            if notexistsout and not (fdo>0 and fdo<len(bout)-1):
+                mkdir(outfile)
+        if outinfo and 'sphinx' in outinfo:
+            onlyindex = [x for x in infiles if x.find('index'+_rest)>=0]
+ if len(onlyindex)>0:
+ infiles = onlyindex
+ outfiles = [normjoin(outfile, _in_2_out_name(inf,outinfo)) for inf in infiles]
+ for i in imgfiles:
+ convert(i,None)
+ for i,o in zip(infiles,outfiles):
+ for rxt in [_rst,_rest]:
+ if i.endswith(rxt) or i.endswith(rxt+_stpl):
+ _set_rstrest(rxt)
+ convert(i,o,outinfo)
+ else:
+ myroot = up_dir(is_project_root_file)
+ if myroot:
+ with new_cwd(myroot):
+ _get_rstrest()
+ index_dir()
+ return
+ _get_rstrest()
+ index_dir()
+
+if __name__ == '__main__':
+ main()
+
+# vim: ts=4 sw=4 sts=4 et noai nocin nosi
diff --git a/docs/userguide/references.rest b/docs/userguide/references.rest
new file mode 100644
index 000000000..56b01c388
--- /dev/null
+++ b/docs/userguide/references.rest
@@ -0,0 +1,160 @@
+.. vim: syntax=rst
+
+REFERENCES
+==========
+
+Below are cited references in an alphabetical listing by author.
+
+Ball, J. T., I. E. Woodrow, and J. A. Berry (1987), A model predicting
+stomatal conductance and its contribution to the control of
+photosynthesis under different environmental conditions, in Process in
+Photosynthesis Research, vol. 1, edited by J. Biggins, pp. 221–234,
+Martinus Nijhoff, Dordrecht, Netherlands.
+
+Bowling, L.C., D.P. Lettenmaier, B. Nijssen, L.P. Graham and
+co-authors, 2003: Simulation of high latitude hydrological processes in
+the Torne-Kalix basin: PILPS Phase 2(c) 1: Experiment description and
+summary intercomparisons. Global and Planet. Change
+
+Brun, E., Martin, E., Simon, V., Gendre, C., and Coléou, C.: An energy and
+mass model of snow cover suitable for operational avalanche forecasting, J.
+Glaciol., 35, 333-342, https://doi.org/10.3189/S0022143000009254, 1989.
+
+Brun, E., David, P., Sudul, M., and Brunot, G.: A numerical model to simulate
+snow-cover stratigraphy for operational avalanche forecasting, J. Glaciol., 38,
+13-22, https://doi.org/10.3189/S0022143000009552, 1992.
+
+Brun, E., Martin, E., and Spiridonov, V.: Coupling a multi-layered snow model
+with a GCM, Ann. Glaciol., 25, 66-72,
+https://doi.org/10.3189/S0260305500013811, 1997.
+
+Bryan, F. O., B. G. Kauffman, W. G. Large, and P. R. Gent (1996). The
+NCAR CSM flux coupler, NCAR Tech. Note 424, 50 pp. [Available from NCAR,
+Boulder, CO 80307]
+
+Chen, F., K.E. Mitchell, J. Schaake, Y. Xue, H.-L. Pan, V. Koren,
+Q.Y. Duan, M. Ek and A. Betts, 1996: Modeling of land-surface
+evaporation by four schemes and comparison with FIFE observations. J.
+Geophys. Res., 101, 7251-7268.
+
+Chen, F., Z. Janic and K.E. Mitchell, 1997: Impact of atmospheric
+surface-layer parameterizations in the new land-surface scheme of the
+NCEP mesoscale Eta model. Bound.-Layer Meteorol., 85, 391-421.
+
+Chen, F. and K.E. Mitchell, 1999: Using the GEWEX/ISLSCP forcing
+data to simulate global soil moisture fields and hydrological cycle for
+1987-1988, J. Meteorol. Soc. Japan, 77, 167-182.
+
+Dickinson, R. E., M. Shaikh, R. Bryant, and L. Graumlich (1998),
+Interactive canopies for a climate model, J. Clim., 11, 2823-2836,
+doi:10.1175/1520-0442.
+
+Dirmeyer, P.A., A.J. Dolman, N. Sato, 1999: The pilot phase of
+the Global Soil Wetness Project. Bull. Am. Meteorol. Soc., 80(5),
+851-878.
+
+Eidhammer, T., A. Booth, S. Decker, L. Li, M. Barlage, D. Gochis, R.
+Rasmussen, K. Melvold, A. Nesje and S. Sobolowski (2021), Mass balance and
+hydrological modeling of the Hardangerjøkulen ice cap in south-central Norway,
+Hydrol. Earth Syst. Sci. 25, 4275-4297,
+https://doi.org/10.5194/hess-25-4275-2021.
+
+Ek, M.B., K.E. Mitchell, Y. Lin, E. Rogers, P. Grunmann, V.
+Koren, G. Gayno, and J.D. Tarpley, 2003: Implementation of Noah land
+surface model advances in the NCEP operational mesoscale Eta model.
+Submitted to J. Geophys. Res., Aug., 2003.
+
+Gerbaux, M., Genthon, C., Etchevers, P., Vincent, C., and Dedieu, J.: Surface
+mass balance of glacier in the French Alps: Distributed modeling sensitivity
+to climate change, J. Glaciol., 175, 561-572,
+https://doi.org/10.3189/172756505781829133, 2005.
+
+Gochis, D.J. and F. Chen, 2003: Hydrological enhancements to the
+community Noah land surface model. NCAR Technical Note, NCAR/TN-454+STR,
+68 pgs.
+
+Jones, P. W. (1999). First- and Second-Order Conservative Remapping
+Schemes for Grids in Spherical Coordinates, Monthly Weather Review,
+Volume 127, 2204-2210.
+
+Julien, P.Y., B. Saghafian and F.L. Ogden, 1995: Raster-based
+hydrological modeling of spatially-varied surface runoff. Water Resour.
+Bull., AWRA, 31(3), 523-536.
+
+Koren, V., J.C. Schaake, K.E. Mitchell, Q.Y. Duan, F. Chen and J.
+Baker, 1999: A parameterization of snowpack and frozen ground intended for
+NCEP weather and climate models. J. Geophys. Res., 104(D16),
+19,569-19,585.
+
+LDAS, 2003: Land Data Assimilation Systems (LDAS). World Wide Web
+Homepage. Available online at: http://ldas.gsfc.nasa.gov/
+
+Mahrt, L. and H.-L. Pan, 1984: A two-layer model of soil
+hydrology. Bound.-Layer Meteorol., 29, 1-20, 1984.
+
+Niu, G.-Y., et al. (2011), The community Noah land surface model with
+multiparameterization options (Noah-MP): 1. Model description and
+evaluation with local-scale measurements, J. Geophys. Res. 116, D12109,
+doi: 10.1029/2010JD015139.
+
+Niu, G.-Y., and Z.-L. Yang (2004), The effects of canopy processes on
+snow surface energy and mass balances, J. Geophys. Res., 109, D23111,
+doi:10.1029/2004JD004884.
+
+Niu, G.-Y., Z.-L. Yang, R. E. Dickinson, L. E. Gulden, and H. Su (2007),
+Development of a simple groundwater model for use in climate models and
+evaluation with Gravity Recovery and Climate Experiment data, J.
+Geophys. Res., 112, D07103, doi:10.1029/2006JD007522.
+
+Ogden, F.L., 1997: CASC2D Reference Manual. Dept. of Civil and
+Environ. Eng. U-37, U. Connecticut, 106 pp.
+
+Pan, H.-L. and L. Mahrt, 1987: Interaction between soil hydrology
+and boundary-layer development, Bound.-Layer Meteorol., 38, 185-202.
+
+Skamarock, W. C., J. B. Klemp, J. Dudhia, D. O. Gill, D. M.
+Barker, W. Wang, and J. G. Powers, 2005: A Description of the Advanced
+Research WRF Version 2. NCAR Technical Note NCAR/TN-468+STR,
+doi:10.5065/D6DZ069T.
+
+Vionnet, V., Brun, E., Morin, S., Boone, A., Faroux, S., Le Moigne, P., Martin,
+E., and Willemet, J.-M.: The detailed snowpack scheme Crocus and its
+implementation in SURFEX v7.2, Geosci. Model Dev., 5, 773-791,
+https://doi.org/10.5194/gmd-5-773-2012, 2012.
+
+Wigmosta, M.S. L.W. Vail and D.P. Lettenmaier, 1994: A
+distributed hydrology-vegetation model for complex terrain. Water
+Resour. Res., 30(6), 1665-1679.
+
+Wigmosta, M.S. and D.P. Lettenmaier, 1999: A comparison of
+simplified methods for routing topographically driven subsurface flow.
+Water Resour. Res., 35(1), 255-264.
+
+Wood, E.F., D.P. Lettenmaier, X. Liang, D. Lohmann, A. Boone, S.
+Chang, F. Chen, Y. Dai, R.E. Dickinson, Q. Duan, M. Ek, Y.M. Gusev, F.
+Habets, P. Irannejad, R. Koster, K.E. Mitchell, O.N. Nasonova, J.
+Noilhan, J. Schaake, A. Schlosser, Y. Shao, A.B. Shmakin, D. Verseghy,
+K. Warrach, P. Wetzel, Y. Xue, Z.-L. Yang, and Q.-C. Zeng, 1998: The
+project for intercomparison of land-surface parameterization schemes
+(PILPS) phase 2(c) Red-Arkansas river basin experiment: 1. Experiment
+description and summary intercomparisons, Global Planet. Change, 19,
+115-135.
+
+Yang, Z.-L., and G.-Y. Niu (2003), The versatile integrator of surface
+and atmosphere processes (VISA) part I: Model description, Global
+Planet. Change, 38, 175–189, doi:10.1016/S0921-8181(03)00028-6.
+
+ESMF (Earth System Modeling Framework)
+https://www.earthsystemcog.org/projects/esmf/
+
+USGS Global Land Cover Characterization
+https://www.usgs.gov/centers/eros/science/usgs-eros-archive-landcover-products-global-land-cover-characterization-glcc-0?qt-science_center_objects=0#qt-science_center_objects
+Table Legend (Appendix 3):
+https://edcftp.cr.usgs.gov/project/glcc/globdoc2_0.html#app3
+
+IGBP_MODIS_BU+tundra Landcover Class Legend
+ftp://ftp.emc.ncep.noaa.gov/mmb/gcp/ldas/noahlsm/README
+
+NCL (NCAR Command Language) https://www.ncl.ucar.edu/
+
+CDO (Climate Data Operators) https://code.mpimet.mpg.de/projects/cdo/