From b63883d531909d745221dd21d340cd1ff81b8ad5 Mon Sep 17 00:00:00 2001 From: Tuomas Koskela Date: Wed, 18 Dec 2024 15:01:08 +0000 Subject: [PATCH] Some refactoring of duplicate content --- .gitignore | 5 +- docs/tutorial/excalibur-tests_tutorial.md | 59 +++------------- docs/tutorial/getting-started.md | 2 +- docs/tutorial/reframe_tutorial.md | 67 +------------------ docs/tutorial/setup-python.md | 28 ++++++++ .../tutorial/stream-sanity-and-performance.md | 39 +++++++++++ mkdocs.yml | 2 + 7 files changed, 86 insertions(+), 116 deletions(-) create mode 100644 docs/tutorial/setup-python.md create mode 100644 docs/tutorial/stream-sanity-and-performance.md diff --git a/.gitignore b/.gitignore index 78fdda97..39def363 100644 --- a/.gitignore +++ b/.gitignore @@ -9,6 +9,7 @@ reframe.log reframe.out stage/ output/ +perflogs/ apps/castep/downloads/ apps/wrf/downloads/ apps/gromacs/downloads/ @@ -24,7 +25,9 @@ outdated/ #ignore virtual environment myvenv/ -perflogs/ # docs site/ + +#emacs backups +*~ \ No newline at end of file diff --git a/docs/tutorial/excalibur-tests_tutorial.md b/docs/tutorial/excalibur-tests_tutorial.md index deab5f66..3a41e3c7 100644 --- a/docs/tutorial/excalibur-tests_tutorial.md +++ b/docs/tutorial/excalibur-tests_tutorial.md @@ -16,22 +16,19 @@ In this tutorial you will set up the excalibur-tests benchmarking framework on a ---- -### Set up python +### Set up python environment -=== "ARCHER2" +{!tutorial/setup-python.md!} - We are going to use `python` and the `pip` package installer to install and run the framework. Load the `cray-python` module to get a python version that fills the requirements. - ```bash - module load cray-python - ``` - You can check with `python3 --version` that your python version is `3.8` or greater. You will have to load this module every time you login. - - (at the time of writing, the default version was `3.9.13`). +--- ----- ### Change to work directory +=== "Cosma" + + Move on to the next step. + === "ARCHER2" On ARCHER2, the compute nodes do not have access to your home directory, therefore it is important to install everything in a [work file system](https://docs.archer2.ac.uk/user-guide/data/#work-file-systems). @@ -225,7 +222,7 @@ In this section you will create a ReFrame benchmark by writing a python class th For simplicity, we use the [`STREAM`](https://www.cs.virginia.edu/stream/ref.html) benchmark. It is a simple memory bandwidth benchmark with minimal build dependencies. -If you've already gone through the [ReFrame tutorial](reframe_tutorial.md), the only difference you should focus on is the [build system](excalibur-tests_tutorial.md#add-build-recipe). +If you've already gone through the [ReFrame tutorial](reframe_tutorial.md), some of the steps in creating the STREAM benchmark are repeated. However, pay attention to the [`Create a Test Class`](excalibur-tests_tutorial.md#create-a-test-class) and [`Add Build Recipe`](excalibur-tests_tutorial.md#add-build-recipe) steps. ---- @@ -312,45 +309,7 @@ env_vars['OMP_PLACES'] = 'cores' ---- -### Add Sanity Check - -The rest of the benchmark follows the [Writing a Performance Test ReFrame Tutorial](https://reframe-hpc.readthedocs.io/en/latest/tutorial_basics.html#writing-a-performance-test). First we need a sanity check that ensures the benchmark ran successfully. A function decorated with the `@sanity_function` decorator is used by ReFrame to check that the test ran successfully.
The sanity function can perform a number of checks, in this case we want to match a line of the expected standard output. - -```python -@sanity_function -def validate_solution(self): - return sn.assert_found(r'Solution Validates', self.stdout) -``` - ----- - -### Add Performance Pattern Check - -To record the performance of the benchmark, ReFrame should extract a figure of merit from the output of the test. A function decorated with the `@performance_function` decorator extracts or computes a performance metric from the test’s output. - -> In this example, we extract four performance variables, namely the memory bandwidth values for each of the “Copy”, “Scale”, “Add” and “Triad” sub-benchmarks of STREAM, where each of the performance functions use the [`extractsingle()`](https://reframe-hpc.readthedocs.io/en/latest/deferrable_functions_reference.html#reframe.utility.sanity.extractsingle) utility function. For each of the sub-benchmarks we extract the “Best Rate MB/s” column of the output (see below) and we convert that to a float. - ----- - -### Performance Pattern Check - -```python -@performance_function('MB/s', perf_key='Copy') -def extract_copy_perf(self): - return sn.extractsingle(r'Copy:\s+(\S+)\s+.*', self.stdout, 1, float) - -@performance_function('MB/s', perf_key='Scale') -def extract_scale_perf(self): - return sn.extractsingle(r'Scale:\s+(\S+)\s+.*', self.stdout, 1, float) - -@performance_function('MB/s', perf_key='Add') -def extract_add_perf(self): - return sn.extractsingle(r'Add:\s+(\S+)\s+.*', self.stdout, 1, float) - -@performance_function('MB/s', perf_key='Triad') -def extract_triad_perf(self): - return sn.extractsingle(r'Triad:\s+(\S+)\s+.*', self.stdout, 1, float) -``` +{!tutorial/stream-sanity-and-performance.md!} ---- diff --git a/docs/tutorial/getting-started.md b/docs/tutorial/getting-started.md index 37ad5199..b9d2a701 100644 --- a/docs/tutorial/getting-started.md +++ b/docs/tutorial/getting-started.md @@ -1,4 +1,4 @@ -## Getting Started on ARCHER2 +## Connecting to ARCHER2 To complete this tutorial, you need to [connect to ARCHER2 via ssh](https://docs.archer2.ac.uk/user-guide/connecting/). You will need diff --git a/docs/tutorial/reframe_tutorial.md b/docs/tutorial/reframe_tutorial.md index c7a7a8a1..54ffdc2b 100644 --- a/docs/tutorial/reframe_tutorial.md +++ b/docs/tutorial/reframe_tutorial.md @@ -29,34 +29,7 @@ You can customise the behaviour of each stage or add a hook before or after each ## Set up python environment -=== "Cosma" - - This tutorial is run on the [Cosma](https://cosma.readthedocs.io/en/latest/) supercomputer. - It should be straightforward to run on a different platform, the requirements are `gcc`, `git` and `python3`. (for the later parts you also need `make`, `autotools`, `cmake` and `spack`). - Before proceeding to install ReFrame, we recommend creating a python virtual environment to avoid clashes with other installed python packages. - First load a newer python module. - ```bash - module swap python/3.10.12 - ``` - -=== "ARCHER2" - - This tutorial is run on ARCHER2, you should have signed up for a training account before starting. - It can be ran on other HPC systems with a batch scheduler but will require making some changes to the config. - Before proceeding to install ReFrame, we recommend creating a python virtual environment to avoid clashes with other installed python packages. - First load the system python module. 
- ```bash - module load cray-python - ``` - -Then create an environment and activate it with - -```bash -python3 -m venv reframe_tutorial -source reframe_tutorial/bin/activate -``` - -You will have to activate the environment each time you login. To deactivate the environment run `deactivate`. +{!tutorial/setup-python.md!} ---- @@ -325,7 +298,7 @@ We can set environment variables in the `env_vars` dictionary. ---- -### Building +### Building the STREAM benchmark Recall the pipeline ReFrame executes when running a test. We can insert arbitrary functions between any steps in in the pipeline by decorating them with `@run_before` or `@run_after` @@ -346,41 +319,7 @@ It should be large enough to overflow all levels of cache so that there is no da ---- -### Sanity function - -Similar to before, we can check a line in stdout for validation. - -```python - @sanity_function - def validate_solution(self): - return sn.assert_found(r'Solution Validates', self.stdout) -``` - ----- - -### Add Performance Pattern Check - -To record the performance of the benchmark, ReFrame should extract a figure of merit from the output of the test. A function decorated with the `@performance_function` decorator extracts or computes a performance metric from the test’s output. - -> In this example, we extract four performance variables, namely the memory bandwidth values for each of the “Copy”, “Scale”, “Add” and “Triad” sub-benchmarks of STREAM, where each of the performance functions use the [`extractsingle()`](https://reframe-hpc.readthedocs.io/en/latest/deferrable_functions_reference.html#reframe.utility.sanity.extractsingle) utility function. For each of the sub-benchmarks we extract the “Best Rate MB/s” column of the output (see below) and we convert that to a float. - -```python -@performance_function('MB/s', perf_key='Copy') -def extract_copy_perf(self): - return sn.extractsingle(r'Copy:\s+(\S+)\s+.*', self.stdout, 1, float) - -@performance_function('MB/s', perf_key='Scale') -def extract_scale_perf(self): - return sn.extractsingle(r'Scale:\s+(\S+)\s+.*', self.stdout, 1, float) - -@performance_function('MB/s', perf_key='Add') -def extract_add_perf(self): - return sn.extractsingle(r'Add:\s+(\S+)\s+.*', self.stdout, 1, float) - -@performance_function('MB/s', perf_key='Triad') -def extract_triad_perf(self): - return sn.extractsingle(r'Triad:\s+(\S+)\s+.*', self.stdout, 1, float) -``` +{!tutorial/stream-sanity-and-performance.md!} ---- diff --git a/docs/tutorial/setup-python.md b/docs/tutorial/setup-python.md new file mode 100644 index 00000000..e0c68d29 --- /dev/null +++ b/docs/tutorial/setup-python.md @@ -0,0 +1,28 @@ +=== "Cosma" + + This tutorial is run on the [Cosma](https://cosma.readthedocs.io/en/latest/) supercomputer. + It should be straightforward to run on a different platform; the requirements are `gcc`, `git` and `python3` (for the later parts you also need `make`, `autotools`, `cmake` and `spack`). + Before proceeding to install ReFrame, we recommend creating a python virtual environment to avoid clashes with other installed python packages. + First load a newer python module. + ```bash + module swap python/3.10.12 + ``` + +=== "ARCHER2" + + This tutorial is run on ARCHER2; you should have signed up for a training account before starting. + It can be run on other HPC systems with a batch scheduler, but this will require some changes to the config. + Before proceeding to install ReFrame, we recommend creating a python virtual environment to avoid clashes with other installed python packages.
+ First load the system python module. + ```bash + module load cray-python + ``` + +Then create an environment and activate it with + +```bash +python3 -m venv reframe_tutorial +source reframe_tutorial/bin/activate +``` + +You will have to activate the environment each time you log in. To deactivate the environment, run `deactivate`. diff --git a/docs/tutorial/stream-sanity-and-performance.md b/docs/tutorial/stream-sanity-and-performance.md new file mode 100644 index 00000000..0522ba88 --- /dev/null +++ b/docs/tutorial/stream-sanity-and-performance.md @@ -0,0 +1,39 @@ +### Add Sanity Check + +The rest of the benchmark follows the [Writing a Performance Test ReFrame Tutorial](https://reframe-hpc.readthedocs.io/en/latest/tutorial_basics.html#writing-a-performance-test). First we need a sanity check that ensures the benchmark ran successfully. A function decorated with the `@sanity_function` decorator is used by ReFrame to check that the test ran successfully. The sanity function can perform a number of checks; in this case we want to match a line of the expected standard output. + +```python +@sanity_function +def validate_solution(self): + return sn.assert_found(r'Solution Validates', self.stdout) +``` + +---- + +### Add Performance Pattern Check + +To record the performance of the benchmark, ReFrame should extract a figure of merit from the output of the test. A function decorated with the `@performance_function` decorator extracts or computes a performance metric from the test’s output. + +> In this example, we extract four performance variables, namely the memory bandwidth values for each of the “Copy”, “Scale”, “Add” and “Triad” sub-benchmarks of STREAM, where each of the performance functions use the [`extractsingle()`](https://reframe-hpc.readthedocs.io/en/latest/deferrable_functions_reference.html#reframe.utility.sanity.extractsingle) utility function. For each of the sub-benchmarks we extract the “Best Rate MB/s” column of the output (see below) and we convert that to a float. + +---- + +### Performance Pattern Check + +```python +@performance_function('MB/s', perf_key='Copy') +def extract_copy_perf(self): + return sn.extractsingle(r'Copy:\s+(\S+)\s+.*', self.stdout, 1, float) + +@performance_function('MB/s', perf_key='Scale') +def extract_scale_perf(self): + return sn.extractsingle(r'Scale:\s+(\S+)\s+.*', self.stdout, 1, float) + +@performance_function('MB/s', perf_key='Add') +def extract_add_perf(self): + return sn.extractsingle(r'Add:\s+(\S+)\s+.*', self.stdout, 1, float) + +@performance_function('MB/s', perf_key='Triad') +def extract_triad_perf(self): + return sn.extractsingle(r'Triad:\s+(\S+)\s+.*', self.stdout, 1, float) +``` diff --git a/mkdocs.yml b/mkdocs.yml index 765c8816..2364d0bc 100644 --- a/mkdocs.yml +++ b/mkdocs.yml @@ -65,6 +65,8 @@ theme: name: Switch to light mode markdown_extensions: - admonition + - markdown_include.include: + base_path: docs - pymdownx.details - pymdownx.highlight: anchor_linenums: true
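For reference, the sanity and performance snippets gathered into `stream-sanity-and-performance.md` above are written to sit inside a single ReFrame test class. A minimal sketch of how they might fit together is shown below; the class name, `sourcepath` and compiler flags are illustrative assumptions rather than part of this patch, while the `OMP_PLACES` setting and the sanity/performance functions mirror the snippets above.

```python
# Illustrative sketch only: class name, sourcepath and build settings are assumptions.
import reframe as rfm
import reframe.utility.sanity as sn


@rfm.simple_test
class StreamBenchmark(rfm.RegressionTest):
    valid_systems = ['*']
    valid_prog_environs = ['*']
    build_system = 'SingleSource'
    sourcepath = 'stream.c'              # assumed location of the STREAM source file
    env_vars = {'OMP_PLACES': 'cores'}   # matches the env_vars setting shown in the tutorial diff

    @run_before('compile')
    def set_compiler_flags(self):
        # Assumed flags: compile STREAM with optimisation and OpenMP enabled
        self.build_system.cflags = ['-O3', '-fopenmp']

    @sanity_function
    def validate_solution(self):
        # Passes only if STREAM reports that its results validated
        return sn.assert_found(r'Solution Validates', self.stdout)

    # Scale and Add follow the same pattern as Copy and Triad
    @performance_function('MB/s', perf_key='Copy')
    def extract_copy_perf(self):
        return sn.extractsingle(r'Copy:\s+(\S+)\s+.*', self.stdout, 1, float)

    @performance_function('MB/s', perf_key='Triad')
    def extract_triad_perf(self):
        return sn.extractsingle(r'Triad:\s+(\S+)\s+.*', self.stdout, 1, float)
```

A class like this would normally be picked up and run through ReFrame's check loader (for example `reframe -c <path-to-test> -r`), which is how both tutorials exercise the included snippets.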