From 8a77ffa5d2db4400aa2f3cb808a8c48ae7ba4aea Mon Sep 17 00:00:00 2001 From: tommoral Date: Mon, 22 Apr 2024 03:40:37 +0200 Subject: [PATCH 01/24] DOC improve organisation --- docs/docs/contribute.md | 6 --- docs/docs/examples | 1 + docs/docs/index.md | 94 +++++++++++++++--------------------- docs/docs/publications.md | 42 ++++++++++++++++ docs/docs/reference/index.md | 21 ++++++++ docs/docs/tutorials | 1 + docs/mkdocs.yml | 42 +++++++++------- pyproject.toml | 7 +-- 8 files changed, 132 insertions(+), 82 deletions(-) create mode 120000 docs/docs/examples create mode 100644 docs/docs/publications.md create mode 100644 docs/docs/reference/index.md create mode 120000 docs/docs/tutorials diff --git a/docs/docs/contribute.md b/docs/docs/contribute.md index ae7e91089..4ce3456f4 100644 --- a/docs/docs/contribute.md +++ b/docs/docs/contribute.md @@ -217,9 +217,3 @@ and open a browser on the page proposed by `mkdocs`. Now, whenever you make changes to the markdown files of the documentation, you can see the results almost immediately in the browser. -Note that the tutorials and examples are initially written in jupyter notebooks -and then converted to markdown programatically. To do so locally, you should run -``` -jupyter nbconvert --to markdown ../tutorials/*.ipynb --output-dir docs/tutorial/ -jupyter nbconvert --to markdown ../examples/*.ipynb --output-dir docs/examples/ -``` diff --git a/docs/docs/examples b/docs/docs/examples new file mode 120000 index 000000000..da7b19653 --- /dev/null +++ b/docs/docs/examples @@ -0,0 +1 @@ +../../examples/ \ No newline at end of file diff --git a/docs/docs/index.md b/docs/docs/index.md index c0cfa4695..357b318b3 100644 --- a/docs/docs/index.md +++ b/docs/docs/index.md @@ -1,39 +1,40 @@ # `sbi`: simulation-based inference -`sbi`: A Python toolbox for simulation-based inference. 
+`sbi` is a toolbox that let you run simulation-based inference methods simply using a high level API: ![using sbi](static/infer_demo.gif) -Inference can be run in a single line of code - -```python -posterior = infer(simulator, prior, method='SNPE', num_simulations=1000) -``` +## Overview -or in a few lines for more flexibility: +**To get started, install the `sbi` package with:** -```python -inference = SNPE(prior=prior) -_ = inference.append_simulations(theta, x).train() -posterior = inference.build_posterior() +```commandline +pip install sbi ``` +for more advances install options, see our [Install Guide](install.md). -`sbi` lets you choose from a variety of _amortized_ and _sequential_ SBI methods: +Then, checkout our material: -Amortized methods return a posterior that can be applied to many different observations without retraining, -whereas sequential methods focus the inference on one particular observation to be more simulation-efficient. -For an overview of implemented methods see below, or checkout or [GitHub page](https://github.com/mackelab/sbi). +
-## Overview +- :dart: [__Motivation and approach__](#motivation-and-approach) +

+ *General motivation for the SBI framework and methods included in `sbi`.* + +- :rocket: [__Tutorials__](tutorials/) +

+ *Various examples illustrating how to use the `sbi` package.* -- To learn about the general motivation behind simulation-based inference, and the - inference methods included in `sbi`, read on below. +- :building_construction: [__Reference API__](reference/) +

+ *The detailed description of the package classes and functions.* -- For example applications to canonical problems in neuroscience, browse the recent - research article [Training deep neural density estimators to identify mechanistic models of neural dynamics](https://doi.org/10.7554/eLife.56261). +- :book: [__Citation__]() +

+ *References for the implemented methods.* + +
-- If you want to get started using `sbi` on your own problem, jump to - [installation](install.md) and then check out the [tutorial](tutorial/00_getting_started.md). ## Motivation and approach @@ -86,48 +87,31 @@ The methods then proceed by 4. If needed, an initial estimate of the posterior can be used to adaptively generate additional informative simulations. -## Publications See [Cranmer, Brehmer, Louppe (2020)](https://doi.org/10.1073/pnas.1912789117) for a recent review on simulation-based inference. -The following papers offer additional details on the inference methods implemented in `sbi`. -You can find a tutorial on how to run each of these methods [here](https://sbi-dev.github.io/sbi/tutorial/16_implemented_methods/). - -### Posterior estimation (`(S)NPE`) - -- **Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation**
by Papamakarios & Murray (NeurIPS 2016)
[[PDF]](https://papers.nips.cc/paper/6084-fast-free-inference-of-simulation-models-with-bayesian-conditional-density-estimation.pdf) [[BibTeX]](https://papers.nips.cc/paper/6084-fast-free-inference-of-simulation-models-with-bayesian-conditional-density-estimation/bibtex) - -- **Flexible statistical inference for mechanistic models of neural dynamics**
by Lueckmann, Goncalves, Bassetto, Öcal, Nonnenmacher & Macke (NeurIPS 2017)
[[PDF]](https://papers.nips.cc/paper/6728-flexible-statistical-inference-for-mechanistic-models-of-neural-dynamics.pdf) [[BibTeX]](https://papers.nips.cc/paper/6728-flexible-statistical-inference-for-mechanistic-models-of-neural-dynamics/bibtex) - -- **Automatic posterior transformation for likelihood-free inference**
by Greenberg, Nonnenmacher & Macke (ICML 2019)
[[PDF]](http://proceedings.mlr.press/v97/greenberg19a/greenberg19a.pdf) [[BibTeX]](data:text/plain;charset=utf-8,%0A%0A%0A%0A%0A%0A%40InProceedings%7Bpmlr-v97-greenberg19a%2C%0A%20%20title%20%3D%20%09%20%7BAutomatic%20Posterior%20Transformation%20for%20Likelihood-Free%20Inference%7D%2C%0A%20%20author%20%3D%20%09%20%7BGreenberg%2C%20David%20and%20Nonnenmacher%2C%20Marcel%20and%20Macke%2C%20Jakob%7D%2C%0A%20%20booktitle%20%3D%20%09%20%7BProceedings%20of%20the%2036th%20International%20Conference%20on%20Machine%20Learning%7D%2C%0A%20%20pages%20%3D%20%09%20%7B2404--2414%7D%2C%0A%20%20year%20%3D%20%09%20%7B2019%7D%2C%0A%20%20editor%20%3D%20%09%20%7BChaudhuri%2C%20Kamalika%20and%20Salakhutdinov%2C%20Ruslan%7D%2C%0A%20%20volume%20%3D%20%09%20%7B97%7D%2C%0A%20%20series%20%3D%20%09%20%7BProceedings%20of%20Machine%20Learning%20Research%7D%2C%0A%20%20address%20%3D%20%09%20%7BLong%20Beach%2C%20California%2C%20USA%7D%2C%0A%20%20month%20%3D%20%09%20%7B09--15%20Jun%7D%2C%0A%20%20publisher%20%3D%20%09%20%7BPMLR%7D%2C%0A%20%20pdf%20%3D%20%09%20%7Bhttp%3A%2F%2Fproceedings.mlr.press%2Fv97%2Fgreenberg19a%2Fgreenberg19a.pdf%7D%2C%0A%20%20url%20%3D%20%09%20%7Bhttp%3A%2F%2Fproceedings.mlr.press%2Fv97%2Fgreenberg19a.html%7D%2C%0A%20%20abstract%20%3D%20%09%20%7BHow%20can%20one%20perform%20Bayesian%20inference%20on%20stochastic%20simulators%20with%20intractable%20likelihoods%3F%20A%20recent%20approach%20is%20to%20learn%20the%20posterior%20from%20adaptively%20proposed%20simulations%20using%20neural%20network-based%20conditional%20density%20estimators.%20However%2C%20existing%20methods%20are%20limited%20to%20a%20narrow%20range%20of%20proposal%20distributions%20or%20require%20importance%20weighting%20that%20can%20limit%20performance%20in%20practice.%20Here%20we%20present%20automatic%20posterior%20transformation%20(APT)%2C%20a%20new%20sequential%20neural%20posterior%20estimation%20method%20for%20simulation-based%20inference.%20APT%20can%20modify%20the%20posterior%20estimate%20using%20arbitrary%2C
%20dynamically%20updated%20proposals%2C%20and%20is%20compatible%20with%20powerful%20flow-based%20density%20estimators.%20It%20is%20more%20flexible%2C%20scalable%20and%20efficient%20than%20previous%20simulation-based%20inference%20techniques.%20APT%20can%20operate%20directly%20on%20high-dimensional%20time%20series%20and%20image%20data%2C%20opening%20up%20new%20applications%20for%20likelihood-free%20inference.%7D%0A%7D%0A) - -- **Truncated proposals for scalable and hassle-free simulation-based inference**
by Deistler, Goncalves & Macke (NeurIPS 2022)
[[Paper]](https://arxiv.org/abs/2210.04815) +## Getting started with the `sbi` package +Once `sbi` is installed, inference can be run in a single line of code -### Likelihood-estimation (`(S)NLE`) - -- **Sequential neural likelihood: Fast likelihood-free inference with autoregressive flows**
by Papamakarios, Sterratt & Murray (AISTATS 2019)
[[PDF]](http://proceedings.mlr.press/v89/papamakarios19a/papamakarios19a.pdf) [[BibTeX]](https://gpapamak.github.io/bibtex/snl.bib) - -- **Variational methods for simulation-based inference**
by Glöckler, Deistler, Macke (ICLR 2022)
[[Paper]](https://arxiv.org/abs/2203.04176) - -- **Flexible and efficient simulation-based inference for models of decision-making**
by Boelts, Lueckmann, Gao, Macke (Elife 2022)
[[Paper]](https://elifesciences.org/articles/77220) - - -### Likelihood-ratio-estimation (`(S)NRE`) - -- **Likelihood-free MCMC with Amortized Approximate Likelihood Ratios**
by Hermans, Begy & Louppe (ICML 2020)
[[PDF]](http://proceedings.mlr.press/v119/hermans20a/hermans20a.pdf) - -- **On Contrastive Learning for Likelihood-free Inference**
Durkan, Murray & Papamakarios (ICML 2020)
[[PDF]](http://proceedings.mlr.press/v119/durkan20a/durkan20a.pdf) +```python +posterior = infer(simulator, prior, method='SNPE', num_simulations=1000) +``` -- **Towards Reliable Simulation-Based Inference with Balanced Neural Ratio Estimation**
by Delaunoy, Hermans, Rozet, Wehenkel & Louppe (NeurIPS 2022)
[[PDF]](https://arxiv.org/pdf/2208.13624.pdf) +or in a few lines for more flexibility: -- **Contrastive Neural Ratio Estimation**
Benjamin Kurt Miller, Christoph Weniger, Patrick Forré (NeurIPS 2022)
[[PDF]](https://arxiv.org/pdf/2210.06170.pdf) +```python +inference = SNPE(prior=prior) +_ = inference.append_simulations(theta, x).train() +posterior = inference.build_posterior() +``` -### Utilities +`sbi` lets you choose from a variety of _amortized_ and _sequential_ SBI methods: -- **Restriction estimator**
by Deistler, Macke & Goncalves (PNAS 2022)
[[Paper]](https://www.pnas.org/doi/10.1073/pnas.2207632119) +Amortized methods return a posterior that can be applied to many different observations without retraining, +whereas sequential methods focus the inference on one particular observation to be more simulation-efficient. -- **Simulation-based calibration**
by Talts, Betancourt, Simpson, Vehtari, Gelman (arxiv 2018)
[[Paper]](https://arxiv.org/abs/1804.06788)) -- **Expected coverage (sample-based)**
as computed in Deistler, Goncalves, Macke [[Paper]](https://arxiv.org/abs/2210.04815) and in Rozet, Louppe [[Paper]](https://matheo.uliege.be/handle/2268.2/12993) + +For an overview of implemented methods see [the Inference API's reference]( +reference/inference/), or check out our [GitHub page](https://github.com/mackelab/sbi). diff --git a/docs/docs/publications.md b/docs/docs/publications.md new file mode 100644 index 000000000..44cfcbe36 --- /dev/null +++ b/docs/docs/publications.md @@ -0,0 +1,42 @@ +# Publications + +This page lists papers that provide additional details on the inference methods implemented in `sbi`. +You can find a tutorial on how to run each of these methods [here](../tutorials/16_implemented_methods/). + +## Posterior estimation (`(S)NPE`) + +- **Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation**
by Papamakarios & Murray (NeurIPS 2016)
[[PDF]](https://papers.nips.cc/paper/6084-fast-free-inference-of-simulation-models-with-bayesian-conditional-density-estimation.pdf) [[BibTeX]](https://papers.nips.cc/paper/6084-fast-free-inference-of-simulation-models-with-bayesian-conditional-density-estimation/bibtex) + +- **Flexible statistical inference for mechanistic models of neural dynamics**
by Lueckmann, Goncalves, Bassetto, Öcal, Nonnenmacher & Macke (NeurIPS 2017)
[[PDF]](https://papers.nips.cc/paper/6728-flexible-statistical-inference-for-mechanistic-models-of-neural-dynamics.pdf) [[BibTeX]](https://papers.nips.cc/paper/6728-flexible-statistical-inference-for-mechanistic-models-of-neural-dynamics/bibtex) + +- **Automatic posterior transformation for likelihood-free inference**
by Greenberg, Nonnenmacher & Macke (ICML 2019)
[[PDF]](http://proceedings.mlr.press/v97/greenberg19a/greenberg19a.pdf) [[BibTeX]](data:text/plain;charset=utf-8,%0A%0A%0A%0A%0A%0A%40InProceedings%7Bpmlr-v97-greenberg19a%2C%0A%20%20title%20%3D%20%09%20%7BAutomatic%20Posterior%20Transformation%20for%20Likelihood-Free%20Inference%7D%2C%0A%20%20author%20%3D%20%09%20%7BGreenberg%2C%20David%20and%20Nonnenmacher%2C%20Marcel%20and%20Macke%2C%20Jakob%7D%2C%0A%20%20booktitle%20%3D%20%09%20%7BProceedings%20of%20the%2036th%20International%20Conference%20on%20Machine%20Learning%7D%2C%0A%20%20pages%20%3D%20%09%20%7B2404--2414%7D%2C%0A%20%20year%20%3D%20%09%20%7B2019%7D%2C%0A%20%20editor%20%3D%20%09%20%7BChaudhuri%2C%20Kamalika%20and%20Salakhutdinov%2C%20Ruslan%7D%2C%0A%20%20volume%20%3D%20%09%20%7B97%7D%2C%0A%20%20series%20%3D%20%09%20%7BProceedings%20of%20Machine%20Learning%20Research%7D%2C%0A%20%20address%20%3D%20%09%20%7BLong%20Beach%2C%20California%2C%20USA%7D%2C%0A%20%20month%20%3D%20%09%20%7B09--15%20Jun%7D%2C%0A%20%20publisher%20%3D%20%09%20%7BPMLR%7D%2C%0A%20%20pdf%20%3D%20%09%20%7Bhttp%3A%2F%2Fproceedings.mlr.press%2Fv97%2Fgreenberg19a%2Fgreenberg19a.pdf%7D%2C%0A%20%20url%20%3D%20%09%20%7Bhttp%3A%2F%2Fproceedings.mlr.press%2Fv97%2Fgreenberg19a.html%7D%2C%0A%20%20abstract%20%3D%20%09%20%7BHow%20can%20one%20perform%20Bayesian%20inference%20on%20stochastic%20simulators%20with%20intractable%20likelihoods%3F%20A%20recent%20approach%20is%20to%20learn%20the%20posterior%20from%20adaptively%20proposed%20simulations%20using%20neural%20network-based%20conditional%20density%20estimators.%20However%2C%20existing%20methods%20are%20limited%20to%20a%20narrow%20range%20of%20proposal%20distributions%20or%20require%20importance%20weighting%20that%20can%20limit%20performance%20in%20practice.%20Here%20we%20present%20automatic%20posterior%20transformation%20(APT)%2C%20a%20new%20sequential%20neural%20posterior%20estimation%20method%20for%20simulation-based%20inference.%20APT%20can%20modify%20the%20posterior%20estimate%20using%20arbitrary%2C
%20dynamically%20updated%20proposals%2C%20and%20is%20compatible%20with%20powerful%20flow-based%20density%20estimators.%20It%20is%20more%20flexible%2C%20scalable%20and%20efficient%20than%20previous%20simulation-based%20inference%20techniques.%20APT%20can%20operate%20directly%20on%20high-dimensional%20time%20series%20and%20image%20data%2C%20opening%20up%20new%20applications%20for%20likelihood-free%20inference.%7D%0A%7D%0A) + +- **Truncated proposals for scalable and hassle-free simulation-based inference**
by Deistler, Goncalves & Macke (NeurIPS 2022)
[[Paper]](https://arxiv.org/abs/2210.04815) + + +## Likelihood-estimation (`(S)NLE`) + +- **Sequential neural likelihood: Fast likelihood-free inference with autoregressive flows**
by Papamakarios, Sterratt & Murray (AISTATS 2019)
[[PDF]](http://proceedings.mlr.press/v89/papamakarios19a/papamakarios19a.pdf) [[BibTeX]](https://gpapamak.github.io/bibtex/snl.bib) + +- **Variational methods for simulation-based inference**
by Glöckler, Deistler, Macke (ICLR 2022)
[[Paper]](https://arxiv.org/abs/2203.04176) + +- **Flexible and efficient simulation-based inference for models of decision-making**
by Boelts, Lueckmann, Gao, Macke (eLife 2022)
[[Paper]](https://elifesciences.org/articles/77220) + + +## Likelihood-ratio-estimation (`(S)NRE`) + +- **Likelihood-free MCMC with Amortized Approximate Likelihood Ratios**
by Hermans, Begy & Louppe (ICML 2020)
[[PDF]](http://proceedings.mlr.press/v119/hermans20a/hermans20a.pdf) + +- **On Contrastive Learning for Likelihood-free Inference**
by Durkan, Murray & Papamakarios (ICML 2020)
[[PDF]](http://proceedings.mlr.press/v119/durkan20a/durkan20a.pdf) + +- **Towards Reliable Simulation-Based Inference with Balanced Neural Ratio Estimation**
by Delaunoy, Hermans, Rozet, Wehenkel & Louppe (NeurIPS 2022)
[[PDF]](https://arxiv.org/pdf/2208.13624.pdf) + +- **Contrastive Neural Ratio Estimation**
by Benjamin Kurt Miller, Christoph Weniger, Patrick Forré (NeurIPS 2022)
[[PDF]](https://arxiv.org/pdf/2210.06170.pdf) + +## Utilities + +- **Restriction estimator**
by Deistler, Macke & Goncalves (PNAS 2022)
[[Paper]](https://www.pnas.org/doi/10.1073/pnas.2207632119) + +- **Simulation-based calibration**
by Talts, Betancourt, Simpson, Vehtari, Gelman (arXiv 2018)
[[Paper]](https://arxiv.org/abs/1804.06788) + +- **Expected coverage (sample-based)**
as computed in Deistler, Goncalves, Macke [[Paper]](https://arxiv.org/abs/2210.04815) and in Rozet, Louppe [[Paper]](https://matheo.uliege.be/handle/2268.2/12993) diff --git a/docs/docs/reference/index.md b/docs/docs/reference/index.md new file mode 100644 index 000000000..657f28252 --- /dev/null +++ b/docs/docs/reference/index.md @@ -0,0 +1,21 @@ +# API Reference: + +
+ +- [Inference](inference.md) +
+ *XXX* +- [Neural Networks](models.md) +
+ *Models to perform posterior approximation and signal embeddings.* +- [Posteriors](posteriors.md) +
+ *XXX* +- [Potentials](potentials.md) +
+ *XXX* +- [Analysis](analysis.md) +
+ *XXX* + +
\ No newline at end of file diff --git a/docs/docs/tutorials b/docs/docs/tutorials new file mode 120000 index 000000000..478337109 --- /dev/null +++ b/docs/docs/tutorials @@ -0,0 +1 @@ +../../tutorials/ \ No newline at end of file diff --git a/docs/mkdocs.yml b/docs/mkdocs.yml index 817f03eaa..531e77b9c 100644 --- a/docs/mkdocs.yml +++ b/docs/mkdocs.yml @@ -3,27 +3,26 @@ site_url: "https://sbi-dev.github.io/sbi/" nav: - Home: index.md - - Installation: install.md - Tutorials and Examples: - Introduction: - - Getting started: tutorial/00_getting_started_flexible.md - - Amortized inference: tutorial/01_gaussian_amortized.md - - Implemented algorithms: tutorial/16_implemented_methods.md + - Getting started: tutorials/00_getting_started_flexible.ipynb + - Amortized inference: tutorials/01_gaussian_amortized.ipynb + - Implemented algorithms: tutorials/16_implemented_methods.md - Advanced: - - Multi-round inference: tutorial/03_multiround_inference.md - - Sampling algorithms in sbi: tutorial/11_sampler_interface.md - - Custom density estimators: tutorial/04_density_estimators.md - - Embedding nets for observations: tutorial/05_embedding_net.md - - SBI with trial-based data: tutorial/14_iid_data_and_permutation_invariant_embeddings.md - - Handling invalid simulations: tutorial/08_restriction_estimator.md - - Crafting summary statistics: tutorial/10_crafting_summary_statistics.md + - Multi-round inference: tutorials/03_multiround_inference.md + - Sampling algorithms in sbi: tutorials/11_sampler_interface.md + - Custom density estimators: tutorials/04_density_estimators.md + - Embedding nets for observations: tutorials/05_embedding_net.md + - SBI with trial-based data: tutorials/14_iid_data_and_permutation_invariant_embeddings.md + - Handling invalid simulations: tutorials/08_restriction_estimator.md + - Crafting summary statistics: tutorials/10_crafting_summary_statistics.md - Diagnostics: - - Posterior predictive checks: 
tutorial/12_diagnostics_posterior_predictive_check.md - - Simulation-based calibration: tutorial/13_diagnostics_simulation_based_calibration.md - - Density plots and MCMC diagnostics with ArviZ: tutorial/15_mcmc_diagnostics_with_arviz.md + - Posterior predictive checks: tutorials/12_diagnostics_posterior_predictive_check.md + - Simulation-based calibration: tutorials/13_diagnostics_simulation_based_calibration.md + - Density plots and MCMC diagnostics with ArviZ: tutorials/15_mcmc_diagnostics_with_arviz.md - Analysis: - - Conditional distributions: tutorial/07_conditional_distributions.md - - Posterior sensitivity analysis: tutorial/09_sensitivity_analysis.md + - Conditional distributions: tutorials/07_conditional_distributions.md + - Posterior sensitivity analysis: tutorials/09_sensitivity_analysis.md - Examples: - Hodgkin-Huxley example: examples/00_HH_simulator.md - Decision making model: examples/01_decision_making_model.md @@ -37,6 +36,7 @@ nav: - How to contribute: contribute.md - Code of Conduct: code_of_conduct.md - FAQ: faq.md + - SBI Publications: publications - Credits: credits.md repo_name: 'sbi-dev/sbi' @@ -64,6 +64,8 @@ markdown_extensions: - extra - smarty - admonition + - attr_list + - md_in_html - codehilite: guess_lang: false - toc: @@ -78,8 +80,7 @@ markdown_extensions: - pymdownx.caret - pymdownx.critic - pymdownx.details - - pymdownx.emoji: - emoji_generator: tag:yaml.org,2002:python/name:pymdownx.emoji.to_svg + - pymdownx.emoji - pymdownx.inlinehilite - pymdownx.magiclink - pymdownx.mark @@ -89,8 +90,13 @@ markdown_extensions: custom_checkbox: true - pymdownx.tilde + plugins: - search + - mkdocs-jupyter: + include: ["*.ipynb"] # Default: ["*.py", "*.ipynb"] + ignore: [".ipynb_checkpoints/*.ipynb"] + no_input_tag: True - mkdocstrings: default_handler: python handlers: diff --git a/pyproject.toml b/pyproject.toml index 027a83f08..9b1f5bb70 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -47,15 +47,16 @@ dependencies = [ ] 
[project.optional-dependencies] -dev = [ +doc = [ # Documentation "mkdocs", "mkdocs-material", "markdown-include", "mkdocs-redirects", + "mkdocs-jupyter", "mkdocstrings[python] >= 0.18", - "jupyter", - "nbconvert", +] +dev = [ # Lint "pre-commit == 3.5.0", "pyyaml", From 14dbf9cf7094b2c57153d3ddc9cb80c0b76d7aa9 Mon Sep 17 00:00:00 2001 From: tommoral Date: Mon, 22 Apr 2024 08:43:58 +0200 Subject: [PATCH 02/24] FIX pre-commit hooks --- docs/docs/contribute.md | 1 - docs/docs/index.md | 2 +- docs/docs/reference/index.md | 2 +- 3 files changed, 2 insertions(+), 3 deletions(-) diff --git a/docs/docs/contribute.md b/docs/docs/contribute.md index 4ce3456f4..9fe406c0d 100644 --- a/docs/docs/contribute.md +++ b/docs/docs/contribute.md @@ -216,4 +216,3 @@ mkdocs serve and open a browser on the page proposed by `mkdocs`. Now, whenever you make changes to the markdown files of the documentation, you can see the results almost immediately in the browser. - diff --git a/docs/docs/index.md b/docs/docs/index.md index 357b318b3..c66fa5265 100644 --- a/docs/docs/index.md +++ b/docs/docs/index.md @@ -32,7 +32,7 @@ Then, checkout our material: - :book: [__Citation__]()

*References for the implemented methods.* - + diff --git a/docs/docs/reference/index.md b/docs/docs/reference/index.md index 657f28252..54009a003 100644 --- a/docs/docs/reference/index.md +++ b/docs/docs/reference/index.md @@ -18,4 +18,4 @@
*XXX* - \ No newline at end of file + From 89b45a0eca6d64fdb133d7e205e5a97de3d06c51 Mon Sep 17 00:00:00 2001 From: tommoral Date: Mon, 22 Apr 2024 08:47:34 +0200 Subject: [PATCH 03/24] add index for tutorials --- tutorials/index.md | 43 +++++++++++++++++++++++++++++++++++++++++++ 1 file changed, 43 insertions(+) create mode 100644 tutorials/index.md diff --git a/tutorials/index.md b/tutorials/index.md new file mode 100644 index 000000000..8b8d988c8 --- /dev/null +++ b/tutorials/index.md @@ -0,0 +1,43 @@ +## Introduction: + +
+- [Getting started](00_getting_started_flexible) +- [Amortized inference](01_gaussian_amortized) +- [Implemented algorithms](16_implemented_methods) +
+ + +## Advanced + +
+- [Multi-round inference](03_multiround_inference) +- [Sampling algorithms in sbi](11_sampler_interface) +- [Custom density estimators](04_density_estimators) +- [Embedding nets for observations](05_embedding_net) +- [SBI with trial-based data](14_iid_data_and_permutation_invariant_embeddings) +- [Handling invalid simulations](08_restriction_estimator) +- [Crafting summary statistics](10_crafting_summary_statistics) +
+ +## Diagnostics: + +
+- [Posterior predictive checks](12_diagnostics_posterior_predictive_check) +- [Simulation-based calibration](13_diagnostics_simulation_based_calibration) +- [Density plots and MCMC diagnostics with ArviZ](15_mcmc_diagnostics_with_arviz) +
+ + +## Analysis: + +
+- [Conditional distributions](07_conditional_distributions) +- [Posterior sensitivity analysis](09_sensitivity_analysis) +
+ +## Examples: + +
+- [Hodgkin-Huxley example](../examples/00_HH_simulator) +- [Decision making model](../examples/01_decision_making_model) +
From 8818d70b54d418b5d4fb8d847e907b36328ebc08 Mon Sep 17 00:00:00 2001 From: Jan Boelts Date: Tue, 7 May 2024 15:33:56 +0200 Subject: [PATCH 04/24] refactor new landing page. --- docs/docs/index.md | 72 +++++++++++++++++++++++++++------------------- 1 file changed, 43 insertions(+), 29 deletions(-) diff --git a/docs/docs/index.md b/docs/docs/index.md index c66fa5265..8fb59c329 100644 --- a/docs/docs/index.md +++ b/docs/docs/index.md @@ -1,8 +1,21 @@ -# `sbi`: simulation-based inference +# `sbi`: simulation-based inference toolkit -`sbi` is a toolbox that let you run simulation-based inference methods simply using a high level API: +`sbi` provides access to simulation-based inference methods via a user-friendly +interface: -![using sbi](static/infer_demo.gif) +```python +# simulation +theta = prior.sample((1000,)) +x = simulator(theta) + +# training +inference = SNPE(prior).append_simulations(theta, x) +inference.train() + +# inference +posterior = inference.build_posterior() +posterior_samples = posterior.sample((1000,), x=x_o) +``` ## Overview @@ -11,9 +24,10 @@ ```commandline pip install sbi ``` -for more advances install options, see our [Install Guide](install.md). -Then, checkout our material: +for more advanced install options, see our [Install Guide](install.md). + +Then, check out our material:
@@ -29,13 +43,12 @@ Then, checkout our material:

*The detailed description of the package classes and functions.* -- :book: [__Citation__]() +- :book: [__Citation__](citation.md)

- *References for the implemented methods.* + *How to cite the `sbi` package.*
- ## Motivation and approach Many areas of science and engineering make extensive use of complex, stochastic, @@ -43,9 +56,9 @@ numerical simulations to describe the structure and dynamics of the processes be investigated. A key challenge in simulation-based science is constraining these simulation models' -parameters, which are intepretable quantities, with observational data. Bayesian +parameters, which are interpretable quantities, with observational data. Bayesian inference provides a general and powerful framework to invert the simulators, i.e. -describe the parameters which are consistent both with empirical data and prior +describe the parameters that are consistent both with empirical data and prior knowledge. In the case of simulators, a key quantity required for statistical inference, the @@ -64,29 +77,29 @@ parameter space and the observation space, one of the methods will be more suita ![](./static/goal.png) -**Goal: Algorithmically identify mechanistic models which are consistent with data.** +**Goal: Algorithmically identify mechanistic models that are consistent with data.** -Each of the methods above needs three inputs: A candidate mechanistic model, prior -knowledge or constraints on model parameters, and observational data (or summary statistics -thereof). +Each of the methods above needs three inputs: A candidate mechanistic model, +prior knowledge or constraints on model parameters, and observational data (or +summary statistics thereof). The methods then proceed by 1. sampling parameters from the prior followed by simulating synthetic data from these parameters, -2. learning the (probabilistic) association between data (or - data features) and underlying parameters, i.e. to learn statistical inference from - simulated data. The way in which this association is learned differs between the - above methods, but all use deep neural networks. -3. 
This learned neural network is then applied to empirical data to derive the full - space of parameters consistent with the data and the prior, i.e. the posterior - distribution. High posterior probability is assigned to parameters which are - consistent with both the data and the prior, low probability to inconsistent - parameters. While SNPE directly learns the posterior distribution, SNLE and SNRE need - an extra MCMC sampling step to construct a posterior. -4. If needed, an initial estimate of the posterior can be used to adaptively generate - additional informative simulations. - +2. learning the (probabilistic) association between data (or data features) and + underlying parameters, i.e. to learn statistical inference from simulated + data. How this association is learned differs between the above methods, but + all use deep neural networks. +3. This learned neural network is then applied to empirical data to derive the + full space of parameters consistent with the data and the prior, i.e. the + posterior distribution. The posterior assigns high probability to parameters + that are consistent with both the data and the prior, and low probability to + inconsistent parameters. While SNPE directly learns the posterior + distribution, SNLE and SNRE need an extra MCMC sampling step to construct a + posterior. +4. If needed, an initial estimate of the posterior can be used to adaptively + generate additional informative simulations. See [Cranmer, Brehmer, Louppe (2020)](https://doi.org/10.1073/pnas.1912789117) for a recent review on simulation-based inference. @@ -109,8 +122,9 @@ posterior = inference.build_posterior() `sbi` lets you choose from a variety of _amortized_ and _sequential_ SBI methods: -Amortized methods return a posterior that can be applied to many different observations without retraining, -whereas sequential methods focus the inference on one particular observation to be more simulation-efficient. 
+Amortized methods return a posterior that can be applied to many different +observations without retraining, whereas sequential methods focus the inference +on one particular observation to be more simulation-efficient. For an overview of implemented methods see [the Inference API's reference]( From 867355a792c1cc342ba34514a55ef5436745a400 Mon Sep 17 00:00:00 2001 From: Jan Boelts Date: Tue, 7 May 2024 15:34:21 +0200 Subject: [PATCH 05/24] update docs dependencies --- pyproject.toml | 4 ++++ 1 file changed, 4 insertions(+) diff --git a/pyproject.toml b/pyproject.toml index 9b1f5bb70..241c939bc 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -49,6 +49,10 @@ dependencies = [ [project.optional-dependencies] doc = [ # Documentation + "jupyter_contrib_nbextensions", + "notebook <= 6.4.12", + "traitlets <= 5.9.0", + "ipython <= 8.9.0", "mkdocs", "mkdocs-material", "markdown-include", From b20136fdf9727acaf0a2209fe116659f6ac0fcd8 Mon Sep 17 00:00:00 2001 From: Jan Boelts Date: Mon, 3 Jun 2024 16:28:56 +0200 Subject: [PATCH 06/24] wip: replace nav-bar dropdown with index. 
--- docs/docs/index.md | 9 +++++ docs/docs/reference/index.md | 10 +++--- docs/docs/reference/inference.md | 21 ++++++------ docs/mkdocs.yml | 58 ++++++++++++++++---------------- tutorials/index.md | 4 +-- 5 files changed, 55 insertions(+), 47 deletions(-) diff --git a/docs/docs/index.md b/docs/docs/index.md index 8fb59c329..39cf3aa04 100644 --- a/docs/docs/index.md +++ b/docs/docs/index.md @@ -4,6 +4,14 @@ interface: ```python +import torch +from sbi.inference import SNPE +from sbi.utils import BoxUniform + +# define prior and simulator +prior = BoxUniform(torch.tensor([0]), torch.tensor([1])) +simulator = lambda theta: torch.randn_like(theta) + theta + # simulation theta = prior.sample((1000,)) x = simulator(theta) @@ -13,6 +21,7 @@ inference = SNPE(prior).append_simulations(theta, x) inference.train() # inference +x_o = torch.tensor([1.5]) posterior = inference.build_posterior() posterior_samples = posterior.sample((1000,), x=x_o) ``` diff --git a/docs/docs/reference/index.md b/docs/docs/reference/index.md index 54009a003..92762f052 100644 --- a/docs/docs/reference/index.md +++ b/docs/docs/reference/index.md @@ -4,18 +4,18 @@ - [Inference](inference.md)
- *XXX* + *SBI algorithms and helper functions.* - [Neural Networks](models.md)
- *Models to perform posterior approximation and signal embeddings.* + *Utilities to build neural network-based density estimators and feature extractors.* - [Posteriors](posteriors.md)
- *XXX*
+ *Posterior classes for sampling and evaluation.*

- [Potentials](potentials.md)
- *XXX* + *Potential function classes for posterior sampling.* - [Analysis](analysis.md)
- *XXX* + *Utilities for SBI visualizations and analyses.* diff --git a/docs/docs/reference/inference.md b/docs/docs/reference/inference.md index b248facca..748a27e92 100644 --- a/docs/docs/reference/inference.md +++ b/docs/docs/reference/inference.md @@ -1,16 +1,5 @@ # Inference -## Helpers - -::: sbi.inference.base.infer - -::: sbi.inference.base.simulate_for_sbi - -::: sbi.utils.user_input_checks.process_prior - -::: sbi.utils.user_input_checks.process_simulator - - ## Algorithms ::: sbi.inference.snpe.snpe_a.SNPE_A @@ -57,3 +46,13 @@ selection: filters: [ "!^_", "^__", "!^__class__" ] inherited_members: true + +## Helpers + +::: sbi.inference.base.infer + +::: sbi.inference.base.simulate_for_sbi + +::: sbi.utils.user_input_checks.process_prior + +::: sbi.utils.user_input_checks.process_simulator diff --git a/docs/mkdocs.yml b/docs/mkdocs.yml index 531e77b9c..03d57a8e6 100644 --- a/docs/mkdocs.yml +++ b/docs/mkdocs.yml @@ -3,35 +3,35 @@ site_url: "https://sbi-dev.github.io/sbi/" nav: - Home: index.md - - Tutorials and Examples: - - Introduction: - - Getting started: tutorials/00_getting_started_flexible.ipynb - - Amortized inference: tutorials/01_gaussian_amortized.ipynb - - Implemented algorithms: tutorials/16_implemented_methods.md - - Advanced: - - Multi-round inference: tutorials/03_multiround_inference.md - - Sampling algorithms in sbi: tutorials/11_sampler_interface.md - - Custom density estimators: tutorials/04_density_estimators.md - - Embedding nets for observations: tutorials/05_embedding_net.md - - SBI with trial-based data: tutorials/14_iid_data_and_permutation_invariant_embeddings.md - - Handling invalid simulations: tutorials/08_restriction_estimator.md - - Crafting summary statistics: tutorials/10_crafting_summary_statistics.md - - Diagnostics: - - Posterior predictive checks: tutorials/12_diagnostics_posterior_predictive_check.md - - Simulation-based calibration: tutorials/13_diagnostics_simulation_based_calibration.md - - Density plots and MCMC 
diagnostics with ArviZ: tutorials/15_mcmc_diagnostics_with_arviz.md - - Analysis: - - Conditional distributions: tutorials/07_conditional_distributions.md - - Posterior sensitivity analysis: tutorials/09_sensitivity_analysis.md - - Examples: - - Hodgkin-Huxley example: examples/00_HH_simulator.md - - Decision making model: examples/01_decision_making_model.md - - API Reference: - - Inference: reference/inference.md - - Neural Networks: reference/models.md - - Posteriors: reference/posteriors.md - - Potentials: reference/potentials.md - - Analysis: reference/analysis.md + - Tutorials and Examples: tutorials/index.md + # - Introduction: + # - Getting started: tutorials/00_getting_started_flexible.ipynb + # - Amortized inference: tutorials/01_gaussian_amortized.ipynb + # - Implemented algorithms: tutorials/16_implemented_methods.md + # - Advanced: + # - Multi-round inference: tutorials/03_multiround_inference.md + # - Sampling algorithms in sbi: tutorials/11_sampler_interface.md + # - Custom density estimators: tutorials/04_density_estimators.md + # - Embedding nets for observations: tutorials/05_embedding_net.md + # - SBI with trial-based data: tutorials/14_iid_data_and_permutation_invariant_embeddings.md + # - Handling invalid simulations: tutorials/08_restriction_estimator.md + # - Crafting summary statistics: tutorials/10_crafting_summary_statistics.md + # - Diagnostics: + # - Posterior predictive checks: tutorials/12_diagnostics_posterior_predictive_check.md + # - Simulation-based calibration: tutorials/13_diagnostics_simulation_based_calibration.md + # - Density plots and MCMC diagnostics with ArviZ: tutorials/15_mcmc_diagnostics_with_arviz.md + # - Analysis: + # - Conditional distributions: tutorials/07_conditional_distributions.md + # - Posterior sensitivity analysis: tutorials/09_sensitivity_analysis.md + # - Examples: + # - Hodgkin-Huxley example: examples/00_HH_simulator.md + # - Decision making model: examples/01_decision_making_model.md + - API Reference: 
reference/index.md + # - Inference: reference/inference.md + # - Neural Networks: reference/models.md + # - Posteriors: reference/posteriors.md + # - Potentials: reference/potentials.md + # - Analysis: reference/analysis.md - Contributing: - How to contribute: contribute.md - Code of Conduct: code_of_conduct.md diff --git a/tutorials/index.md b/tutorials/index.md index 8b8d988c8..c36d6d6c3 100644 --- a/tutorials/index.md +++ b/tutorials/index.md @@ -38,6 +38,6 @@ ## Examples:
-- [Hodgkin-Huxley example](../examples/00_HH_simulator) -- [Decision making model](../examples/01_decision_making_model) +- [Hodgkin-Huxley model](../examples/00_HH_simulator) +- [Decision-making model](../examples/01_decision_making_model)
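The quickstart snippet this patch series adds to `index.md` (uniform prior, Gaussian simulator, posterior sampling at an observation) can also be illustrated without neural networks. The following standard-library sketch approximates the same toy posterior with plain rejection ABC — illustrative only and not part of the patched docs; `sbi`'s methods instead train neural density estimators, and the tolerance `eps` below is an assumed choice:

```python
# Illustrative stdlib-only sketch (not sbi code): approximate the toy
# posterior from the quickstart snippet with rejection ABC.
import random

random.seed(0)

x_o = 0.9    # the observation we condition on
eps = 0.1    # acceptance tolerance (assumed; smaller -> more accurate, slower)
accepted = []

while len(accepted) < 1000:
    theta = random.uniform(0.0, 1.0)   # draw a parameter from the prior
    x = random.gauss(theta, 1.0)       # simulate: x = theta + standard normal noise
    if abs(x - x_o) < eps:             # keep parameters whose simulation
        accepted.append(theta)         # lands close to the observation

posterior_mean = sum(accepted) / len(accepted)
print(f"approximate posterior mean: {posterior_mean:.2f}")
```

Because the simulator noise is wide (unit variance), the accepted samples stay spread over the prior support and are only mildly tilted toward `x_o`; neural methods such as SNPE replace this brute-force filtering with a trained conditional density estimator.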
From 5891f8290aa27eb964083e253f7da52df762f6cf Mon Sep 17 00:00:00 2001 From: Jan Boelts Date: Tue, 11 Jun 2024 11:47:35 +0200 Subject: [PATCH 07/24] shorter snippet, add publications to main page. --- docs/docs/index.md | 114 ++++++++++++++++++------- tutorials/16_implemented_methods.ipynb | 30 ++++++- 2 files changed, 112 insertions(+), 32 deletions(-) diff --git a/docs/docs/index.md b/docs/docs/index.md index 39cf3aa04..bd33ae2fc 100644 --- a/docs/docs/index.md +++ b/docs/docs/index.md @@ -8,22 +8,17 @@ import torch from sbi.inference import SNPE from sbi.utils import BoxUniform -# define prior and simulator -prior = BoxUniform(torch.tensor([0]), torch.tensor([1])) -simulator = lambda theta: torch.randn_like(theta) + theta +# uniform prior and gaussian simulator +theta = torch.rand(1000) +x = torch.randn_like(theta) + theta -# simulation -theta = prior.sample((1000,)) -x = simulator(theta) - -# training -inference = SNPE(prior).append_simulations(theta, x) -inference.train() +# choose sbi method and train +inference = SNPE() +inference.append_simulations(theta, x).train() # inference -x_o = torch.tensor([1.5]) posterior = inference.build_posterior() -posterior_samples = posterior.sample((1000,), x=x_o) +samples = posterior.sample((1000,), x=torch.tensor([1.5])) ``` ## Overview @@ -110,31 +105,88 @@ The methods then proceed by 4. If needed, an initial estimate of the posterior can be used to adaptively generate additional informative simulations. -See [Cranmer, Brehmer, Louppe (2020)](https://doi.org/10.1073/pnas.1912789117) for a recent -review on simulation-based inference. +See [Cranmer, Brehmer, Louppe (2020)](https://doi.org/10.1073/pnas.1912789117) +for a recent review on simulation-based inference. -## Getting started with the `sbi` package +## Implemented algorithms -Once `sbi` is installed, inference can be run in a single line of code +`sbi` implements a variety of _amortized_ and _sequential_ SBI methods. 
-```python -posterior = infer(simulator, prior, method='SNPE', num_simulations=1000) -``` +Amortized methods return a posterior that can be applied to many different +observations without retraining (e.g., NPE), whereas sequential methods focus +the inference on one particular observation to be more simulation-efficient +(e.g., SNPE). -or in a few lines for more flexibility: +Below, we list all implemented methods and the corresponding publications. To see +how to access these methods in `sbi`, check out our [Inference API's reference]( +reference/inference/) and the [tutorial on implemented +methods](tutorials/16_implemented_methods.ipynb). -```python -inference = SNPE(prior=prior) -_ = inference.append_simulations(theta, x).train() -posterior = inference.build_posterior() -``` +### Posterior estimation (`(S)NPE`) -`sbi` lets you choose from a variety of _amortized_ and _sequential_ SBI methods: +- **Fast ε-free Inference of Simulation Models with Bayesian Conditional Density + Estimation**
by Papamakarios & Murray (NeurIPS 2016) +
[[PDF]](https://papers.nips.cc/paper/6084-fast-free-inference-of-simulation-models-with-bayesian-conditional-density-estimation.pdf) + [[BibTeX]](https://papers.nips.cc/paper/6084-fast-free-inference-of-simulation-models-with-bayesian-conditional-density-estimation/bibtex) -Amortized methods return a posterior that can be applied to many different -observations without retraining, whereas sequential methods focus the inference -on one particular observation to be more simulation-efficient. +- **Flexible statistical inference for mechanistic models of neural dynamics** +
by Lueckmann, Goncalves, Bassetto, Öcal, Nonnenmacher & Macke (NeurIPS + 2017) +
[[PDF]](https://papers.nips.cc/paper/6728-flexible-statistical-inference-for-mechanistic-models-of-neural-dynamics.pdf) + [[BibTeX]](https://papers.nips.cc/paper/6728-flexible-statistical-inference-for-mechanistic-models-of-neural-dynamics/bibtex) + +- **Automatic posterior transformation for likelihood-free inference**
by Greenberg, Nonnenmacher & Macke (ICML 2019)
[[PDF]](http://proceedings.mlr.press/v97/greenberg19a/greenberg19a.pdf) [[BibTeX]](data:text/plain;charset=utf-8,%0A%0A%0A%0A%0A%0A%40InProceedings%7Bpmlr-v97-greenberg19a%2C%0A%20%20title%20%3D%20%09%20%7BAutomatic%20Posterior%20Transformation%20for%20Likelihood-Free%20Inference%7D%2C%0A%20%20author%20%3D%20%09%20%7BGreenberg%2C%20David%20and%20Nonnenmacher%2C%20Marcel%20and%20Macke%2C%20Jakob%7D%2C%0A%20%20booktitle%20%3D%20%09%20%7BProceedings%20of%20the%2036th%20International%20Conference%20on%20Machine%20Learning%7D%2C%0A%20%20pages%20%3D%20%09%20%7B2404--2414%7D%2C%0A%20%20year%20%3D%20%09%20%7B2019%7D%2C%0A%20%20editor%20%3D%20%09%20%7BChaudhuri%2C%20Kamalika%20and%20Salakhutdinov%2C%20Ruslan%7D%2C%0A%20%20volume%20%3D%20%09%20%7B97%7D%2C%0A%20%20series%20%3D%20%09%20%7BProceedings%20of%20Machine%20Learning%20Research%7D%2C%0A%20%20address%20%3D%20%09%20%7BLong%20Beach%2C%20California%2C%20USA%7D%2C%0A%20%20month%20%3D%20%09%20%7B09--15%20Jun%7D%2C%0A%20%20publisher%20%3D%20%09%20%7BPMLR%7D%2C%0A%20%20pdf%20%3D%20%09%20%7Bhttp%3A%2F%2Fproceedings.mlr.press%2Fv97%2Fgreenberg19a%2Fgreenberg19a.pdf%7D%2C%0A%20%20url%20%3D%20%09%20%7Bhttp%3A%2F%2Fproceedings.mlr.press%2Fv97%2Fgreenberg19a.html%7D%2C%0A%20%20abstract%20%3D%20%09%20%7BHow%20can%20one%20perform%20Bayesian%20inference%20on%20stochastic%20simulators%20with%20intractable%20likelihoods%3F%20A%20recent%20approach%20is%20to%20learn%20the%20posterior%20from%20adaptively%20proposed%20simulations%20using%20neural%20network-based%20conditional%20density%20estimators.%20However%2C%20existing%20methods%20are%20limited%20to%20a%20narrow%20range%20of%20proposal%20distributions%20or%20require%20importance%20weighting%20that%20can%20limit%20performance%20in%20practice.%20Here%20we%20present%20automatic%20posterior%20transformation%20(APT)%2C%20a%20new%20sequential%20neural%20posterior%20estimation%20method%20for%20simulation-based%20inference.%20APT%20can%20modify%20the%20posterior%20estimate%20using%20arbitrary%2C
%20dynamically%20updated%20proposals%2C%20and%20is%20compatible%20with%20powerful%20flow-based%20density%20estimators.%20It%20is%20more%20flexible%2C%20scalable%20and%20efficient%20than%20previous%20simulation-based%20inference%20techniques.%20APT%20can%20operate%20directly%20on%20high-dimensional%20time%20series%20and%20image%20data%2C%20opening%20up%20new%20applications%20for%20likelihood-free%20inference.%7D%0A%7D%0A) + +- **Truncated proposals for scalable and hassle-free simulation-based + inference**
by Deistler, Goncalves & Macke (NeurIPS 2022) +
[[Paper]](https://arxiv.org/abs/2210.04815) + +- **BayesFlow: Learning complex stochastic models with invertible neural + networks**
by Radev, Mertens, Voss, Ardizzone & Köthe (IEEE Transactions on Neural
  Networks and Learning Systems 2020)
+ [Paper](https://ieeexplore.ieee.org/abstract/document/9298920) + +### Likelihood-estimation (`(S)NLE`) + +- **Sequential neural likelihood: Fast likelihood-free inference with + autoregressive flows**
by Papamakarios, Sterratt & Murray (AISTATS 2019) +
[[PDF]](http://proceedings.mlr.press/v89/papamakarios19a/papamakarios19a.pdf) + [[BibTeX]](https://gpapamak.github.io/bibtex/snl.bib) + +- **Variational methods for simulation-based inference**
by Glöckler, + Deistler, Macke (ICLR 2022)
[[Paper]](https://arxiv.org/abs/2203.04176) + +- **Flexible and efficient simulation-based inference for models of + decision-making**
by Boelts, Lueckmann, Gao, Macke (eLife 2022)
[[Paper]](https://elifesciences.org/articles/77220) + + +### Likelihood-ratio-estimation (`(S)NRE`) + +- **Likelihood-free MCMC with Amortized Approximate Likelihood Ratios**
by + Hermans, Begy & Louppe (ICML 2020) +
[[PDF]](http://proceedings.mlr.press/v119/hermans20a/hermans20a.pdf) + +- **On Contrastive Learning for Likelihood-free Inference**
by Durkan, Murray &
  Papamakarios (ICML 2020)
[[PDF]](http://proceedings.mlr.press/v119/durkan20a/durkan20a.pdf) + +- **Towards Reliable Simulation-Based Inference with Balanced Neural Ratio + Estimation**
by Delaunoy, Hermans, Rozet, Wehenkel & Louppe (NeurIPS 2022) +
[[PDF]](https://arxiv.org/pdf/2208.13624.pdf) + +- **Contrastive Neural Ratio Estimation**
by Benjamin Kurt Miller, Christoph
  Weniger, Patrick Forré (NeurIPS 2022)
[[PDF]](https://arxiv.org/pdf/2210.06170.pdf) + +### Utilities + +- **Restriction estimator**
by Deistler, Macke & Goncalves (PNAS 2022) +
[[Paper]](https://www.pnas.org/doi/10.1073/pnas.2207632119) +- **Simulation-based calibration**
by Talts, Betancourt, Simpson, Vehtari,
  Gelman (arXiv 2018)
[[Paper]](https://arxiv.org/abs/1804.06788)) -For an overview of implemented methods see [the Inference API's reference]( -reference/inference/), or checkout or [GitHub page](https://github.com/mackelab/sbi). +- **Expected coverage (sample-based)**
as computed in Deistler, Goncalves, + Macke [[Paper]](https://arxiv.org/abs/2210.04815) and in Rozet, Louppe + [[Paper]](https://matheo.uliege.be/handle/2268.2/12993) diff --git a/tutorials/16_implemented_methods.ipynb b/tutorials/16_implemented_methods.ipynb index c56e71393..a4598d909 100644 --- a/tutorials/16_implemented_methods.ipynb +++ b/tutorials/16_implemented_methods.ipynb @@ -131,6 +131,34 @@ " proposal = RestrictedPrior(prior, accept_reject_fn, sample_with=\"rejection\")" ] }, + { + "cell_type": "markdown", + "id": "3642634d", + "metadata": {}, + "source": [ + "**BayesFlow: Learning complex stochastic models with invertible neural\n", + "networks**
by Radev, S. T., Mertens, U. K., Voss, A., Ardizzone, L., & Köthe,\n", + "U. (2020) (IEEE transactions on neural networks and learning systems 2020)
\n", + "[Paper](https://ieeexplore.ieee.org/abstract/document/9298920)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "da7554dd", + "metadata": {}, + "outputs": [], + "source": [ + "# The BayesFlow functionality can be obtained via single-round SNPE.\n", + "from sbi.inference import SNPE_A\n", + "\n", + "inference = SNPE_A(prior)\n", + "theta = prior.sample((num_sims,))\n", + "x = simulator(theta)\n", + "inference.append_simulations(theta, x).train()\n", + "posterior = inference.build_posterior().set_default_x(x_o)" + ] + }, { "cell_type": "markdown", "id": "d13f84e2-d35a-4f54-8cbf-0e4be1a38fb3", @@ -468,7 +496,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.8.12" + "version": "3.10.0" }, "vscode": { "interpreter": { From 4552d154d4c28ac22b972a778ccffe97b078fba0 Mon Sep 17 00:00:00 2001 From: Jan Boelts Date: Tue, 11 Jun 2024 12:07:01 +0200 Subject: [PATCH 08/24] show faq as ordered list, format install.md. --- docs/docs/faq.md | 20 +++++++------------- docs/docs/install.md | 14 +++++++++----- 2 files changed, 16 insertions(+), 18 deletions(-) diff --git a/docs/docs/faq.md b/docs/docs/faq.md index eacf4f0a0..8f0beb872 100644 --- a/docs/docs/faq.md +++ b/docs/docs/faq.md @@ -1,15 +1,9 @@ # Frequently asked questions -[Can the algorithms deal with invalid data, e.g., NaN or inf?](faq/question_02_nans.md) - -[What should I do when my 'posterior samples are outside of the prior support' in SNPE?](faq/question_01_leakage.md) - -[When using multiple workers, I get a pickling error. 
Can I still use multiprocessing?](faq/question_03_pickling_error.md) - -[Can I use the GPU for training the density estimator?](faq/question_04_gpu.md) - -[How should I save and load objects in `sbi`?](faq/question_05_pickling.md) - -[Can I stop neural network training and resume it later?](faq/question_06_resume_training.md) - -[How can I use a prior that is not defined in PyTorch?](faq/question_07_custom_prior.md) +1. [What should I do when my 'posterior samples are outside of the prior support' in SNPE?](faq/question_01_leakage.md) +2. [Can the algorithms deal with invalid data, e.g., NaN or inf?](faq/question_02_nans.md) +3. [When using multiple workers, I get a pickling error. Can I still use multiprocessing?](faq/question_03_pickling_error.md) +4. [Can I use the GPU for training the density estimator?](faq/question_04_gpu.md) +5. [How should I save and load objects in `sbi`?](faq/question_05_pickling.md) +6. [Can I stop neural network training and resume it later?](faq/question_06_resume_training.md) +7. [How can I use a prior that is not defined in PyTorch?](faq/question_07_custom_prior.md) diff --git a/docs/docs/install.md b/docs/docs/install.md index 20ad9eedb..ecc9a50b3 100644 --- a/docs/docs/install.md +++ b/docs/docs/install.md @@ -1,22 +1,26 @@ # Installation `sbi` requires Python 3.8 or higher. A GPU is not required, but can lead to -speed-up in some cases. We recommend to use a [`conda`](https://docs.conda.io/en/latest/miniconda.html) virtual -environment ([Miniconda installation instructions](https://docs.conda.io/en/latest/miniconda.html)). -If `conda` is installed on the system, an environment for installing `sbi` can be created as follows: +speed-up in some cases. We recommend using a +[`conda`](https://docs.conda.io/en/latest/miniconda.html) virtual environment +([Miniconda installation +instructions](https://docs.conda.io/en/latest/miniconda.html)). 
If `conda` is +installed on the system, an environment for installing `sbi` can be created as +follows: ```commandline # Create an environment for sbi (indicate Python 3.8 or higher); activate it $ conda create -n sbi_env python=3.10 && conda activate sbi_env ``` -Independent of whether you are using `conda` or not, `sbi` can be installed using `pip`: +Independent of whether you are using `conda` or not, `sbi` can be installed +using `pip`: ```commandline pip install sbi ``` -To test the installation, drop into a python prompt and run +To test the installation, drop into a Python prompt and run ```python from sbi.examples.minimal import simple From b19e4d28d34caf4a0a888cafccf97909c929a6ad Mon Sep 17 00:00:00 2001 From: Jan Boelts Date: Tue, 11 Jun 2024 12:10:23 +0200 Subject: [PATCH 09/24] join tutorial instructions from README and tutorials/index. --- tutorials/README.md | 168 -------------------------------------------- tutorials/index.md | 25 +++++-- 2 files changed, 20 insertions(+), 173 deletions(-) delete mode 100644 tutorials/README.md diff --git a/tutorials/README.md b/tutorials/README.md deleted file mode 100644 index 88b650551..000000000 --- a/tutorials/README.md +++ /dev/null @@ -1,168 +0,0 @@ -# Tutorials for using the `sbi` toolbox - -These `sbi` tutorials are aimed at two sepatate groups -1. _users_, e.g., domain scientists that aim to get an introduction to the method to then apply it to their (mechanistic) models -2. _contributers_ who develop methods and/or plan to contribute to the `sbi` toolbox - -Before running the notebooks, follow our instructions to [install sbi](../README.md). -The numbers of the notebooks are not informative of the order, please follow this structure depending on which group you identify with. - -## I want to start applying `sbi` (_user_) -Before going through the tutorial notebooks, make sure to read through the **Overview, Motivation and Approach** below. 
- -- [Getting started](00_getting_started_flexible.ipynb) introduces the `sbi` package and its core functionality. -- [Inferring parameters for multiple observations](01_gaussian_amortized.ipynb) introduces the concept of amortization, i.e., that we do not need to retrain our inference procedure for different observations. -- [The example for a scientific simulator from neuroscience (Hodgkin-Huxley)](../examples/00_HH_simulator.ipynb), shows how `sbi` can be applied to scientific use cases building on the previous two examples. -- [Inferring parameters for a single observation ](03_multiround_inference.ipynb) introduces the concept of multi round inference for a single observation to be more sampling efficient. - -[All implemented methods](16_implemented_methods.ipynb) provides an overview of the implemented inference methods and how to call them. - -Once you have familiarised yourself with the methods and identified how to apply SBI to your use case, ensure you work through the **Diagnostics** tutorials linked below, to identify failure cases and assess the quality of your inference. - - -## I develop methods for `sbi` (_contributer_) - -### Introduction -- [Getting started](00_getting_started_flexible.ipynb) introduces the `sbi` package and its core functionality. -- [Inferring parameters for multiple observations ](01_gaussian_amortized.ipynb)introduces the concept of amortization. -- [All implemented methods](16_implemented_methods.ipynb) provides an overview of the implemented inference methods and how to call them. 
- -### Advanced: -- [Multi-round inference](03_multiround_inference.ipynb) -- [Sampling algorithms in sbi](11_sampler_interface.ipynb) -- [Custom density estimators](04_density_estimators.ipynb) -- [Learning summary statistics](05_embedding_net.ipynb) -- [SBI with trial-based data](14_iid_data_and_permutation_invariant_embeddings.ipynb) -- [Handling invalid simulations](08_restriction_estimator.ipynb) -- [Crafting summary statistics](10_crafting_summary_statistics.ipynb) - -### Diagnostics: -- [Posterior predictive checks](12_diagnostics_posterior_predictive_check.ipynb) -- [Simulation-based calibration](13_diagnostics_simulation_based_calibration.ipynb) -- [Density plots and MCMC diagnostics with ArviZ](15_mcmc_diagnostics_with_arviz.ipynb) - -### Analysis: -- [Conditional distributions](07_conditional_distributions.ipynb) -- [Posterior sensitivity analysis](09_sensitivity_analysis.ipynb) shows how to perform a sensitivity analysis of a model. - -### Examples: -- [Hodgkin-Huxley example](../examples/00_HH_simulator.ipynb) -- [Decision making model](../examples/01_decision_making_model.ipynb) - -Please first read our [contributer guide](../CONTRIBUTING.md) and our [code of conduct](../CODE_OF_CONDUCT.md). - - - - -## Overview - - -`sbi` lets you choose from a variety of _amortized_ and _sequential_ SBI methods: - -Amortized methods return a posterior that can be applied to many different observations without retraining, -whereas sequential methods focus the inference on one particular observation to be more simulation-efficient. -For an overview of implemented methods see below, or checkout our [GitHub page](https://github.com/mackelab/sbi). - -- To learn about the general motivation behind simulation-based inference, and the - inference methods included in `sbi`, read on below. 
- -- For example applications to canonical problems in neuroscience, browse the recent - research article [Training deep neural density estimators to identify mechanistic models of neural dynamics](https://doi.org/10.7554/eLife.56261). - - - -## Motivation and approach - -Many areas of science and engineering make extensive use of complex, stochastic, -numerical simulations to describe the structure and dynamics of the processes being -investigated. - -A key challenge in simulation models for science, is constraining the parameters of these models, which are intepretable quantities, with observational data. Bayesian -inference provides a general and powerful framework to invert the simulators, i.e. -describe the parameters which are consistent both with empirical data and prior -knowledge. - -In the case of simulators, a key quantity required for statistical inference, the -likelihood of observed data given parameters, $\mathcal{L}(\theta) = p(x_o|\theta)$, is -typically intractable, rendering conventional statistical approaches inapplicable. - -`sbi` implements powerful machine-learning methods that address this problem. Roughly, -these algorithms can be categorized as: - -- Neural Posterior Estimation (amortized `NPE` and sequential `SNPE`), -- Neural Likelihood Estimation (`(S)NLE`), and -- Neural Ratio Estimation (`(S)NRE`). - -Depending on the characteristics of the problem, e.g. the dimensionalities of the -parameter space and the observation space, one of the methods will be more suitable. - -![](./static/goal.png) - -**Goal: Algorithmically identify mechanistic models which are consistent with data.** - -Each of the methods above needs three inputs: A candidate mechanistic model, prior -knowledge or constraints on model parameters, and observational data (or summary statistics -thereof). - -The methods then proceed by - -1. sampling parameters from the prior followed by simulating synthetic data from - these parameters, -2. 
learning the (probabilistic) association between data (or - data features) and underlying parameters, i.e. to learn statistical inference from - simulated data. The way in which this association is learned differs between the - above methods, but all use deep neural networks. -3. This learned neural network is then applied to empirical data to derive the full - space of parameters consistent with the data and the prior, i.e. the posterior - distribution. High posterior probability is assigned to parameters which are - consistent with both the data and the prior, low probability to inconsistent - parameters. While SNPE directly learns the posterior distribution, SNLE and SNRE need - an extra MCMC sampling step to construct a posterior. -4. If needed, an initial estimate of the posterior can be used to adaptively generate - additional informative simulations. - -## Publications - -See [Cranmer, Brehmer, Louppe (2020)](https://doi.org/10.1073/pnas.1912789117) for a recent -review on simulation-based inference. - -The following papers offer additional details on the inference methods implemented in `sbi`. -You can find a tutorial on how to run each of these methods [here](https://sbi-dev.github.io/sbi/tutorial/16_implemented_methods/). - -### Posterior estimation (`(S)NPE`) - -- **Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation**
by Papamakarios & Murray (NeurIPS 2016)
[[Paper]](https://papers.nips.cc/paper/6084-fast-free-inference-of-simulation-models-with-bayesian-conditional-density-estimation.pdf) [[BibTeX]](https://papers.nips.cc/paper/6084-fast-free-inference-of-simulation-models-with-bayesian-conditional-density-estimation/bibtex) - -- **Flexible statistical inference for mechanistic models of neural dynamics**
by Lueckmann, Goncalves, Bassetto, Öcal, Nonnenmacher & Macke (NeurIPS 2017)
[[PDF]](https://papers.nips.cc/paper/6728-flexible-statistical-inference-for-mechanistic-models-of-neural-dynamics.pdf) [[BibTeX]](https://papers.nips.cc/paper/6728-flexible-statistical-inference-for-mechanistic-models-of-neural-dynamics/bibtex) - -- **Automatic posterior transformation for likelihood-free inference**
by Greenberg, Nonnenmacher & Macke (ICML 2019)
[[Paper]](http://proceedings.mlr.press/v97/greenberg19a/greenberg19a.pdf) - -- **Truncated proposals for scalable and hassle-free simulation-based inference**
by Deistler, Goncalves & Macke (NeurIPS 2022)
[[Paper]](https://arxiv.org/abs/2210.04815) - - -### Likelihood-estimation (`(S)NLE`) - -- **Sequential neural likelihood: Fast likelihood-free inference with autoregressive flows**
by Papamakarios, Sterratt & Murray (AISTATS 2019)
[[PDF]](http://proceedings.mlr.press/v89/papamakarios19a/papamakarios19a.pdf) [[BibTeX]](https://gpapamak.github.io/bibtex/snl.bib) - -- **Variational methods for simulation-based inference**
by Glöckler, Deistler, Macke (ICLR 2022)
[[Paper]](https://arxiv.org/abs/2203.04176) - -- **Flexible and efficient simulation-based inference for models of decision-making**
by Boelts, Lueckmann, Gao, Macke (Elife 2022)
[[Paper]](https://elifesciences.org/articles/77220) - - -### Likelihood-ratio-estimation (`(S)NRE`) - -- **Likelihood-free MCMC with Amortized Approximate Likelihood Ratios**
by Hermans, Begy & Louppe (ICML 2020)
[[PDF]](http://proceedings.mlr.press/v119/hermans20a/hermans20a.pdf) - -- **On Contrastive Learning for Likelihood-free Inference**
Durkan, Murray & Papamakarios (ICML 2020)
[[PDF]](http://proceedings.mlr.press/v119/durkan20a/durkan20a.pdf) - -- **Towards Reliable Simulation-Based Inference with Balanced Neural Ratio Estimation**
by Delaunoy, Hermans, Rozet, Wehenkel & Louppe (NeurIPS 2022)
[[PDF]](https://arxiv.org/pdf/2208.13624.pdf) - -- **Contrastive Neural Ratio Estimation**
Benjamin Kurt Miller, Christoph Weniger, Patrick Forré (NeurIPS 2022)
[[PDF]](https://arxiv.org/pdf/2210.06170.pdf) - -### Utilities - -- **Restriction estimator**
by Deistler, Macke & Goncalves (PNAS 2022)
[[Paper]](https://www.pnas.org/doi/10.1073/pnas.2207632119) - -- **Simulation-based calibration**
by Talts, Betancourt, Simpson, Vehtari, Gelman (arxiv 2018)
[[Paper]](https://arxiv.org/abs/1804.06788)) - -- **Expected coverage (sample-based)**
as computed in Deistler, Goncalves, Macke [[Paper]](https://arxiv.org/abs/2210.04815) and in Rozet, Louppe [[Paper]](https://matheo.uliege.be/handle/2268.2/12993) diff --git a/tutorials/index.md b/tutorials/index.md index c36d6d6c3..56d2c298a 100644 --- a/tutorials/index.md +++ b/tutorials/index.md @@ -1,12 +1,27 @@ -## Introduction: + +# Tutorials for using the `sbi` toolbox + +Before running the notebooks, follow our instructions to [install +sbi](../install.md). Alternatively, you can also open a [codespace on +GitHub](https://codespaces.new/sbi-dev/sbi) and work through the tutorials in +the browser. The numbers of the notebooks are not informative of the order, +please follow this structure depending on which group you identify with. + +Once you have familiarised yourself with the methods and identified how to apply +SBI to your use case, ensure you work through the **Diagnostics** tutorials +linked below, to identify failure cases and assess the quality of your +inference. + +## Introduction
- [Getting started](00_getting_started_flexible) - [Amortized inference](01_gaussian_amortized) - [Implemented algorithms](16_implemented_methods) +- [Example application with a simulator from neuroscience + (Hodgkin-Huxley)](../examples/00_HH_simulator.ipynb)
- ## Advanced
@@ -19,7 +34,7 @@ - [Crafting summary statistics](10_crafting_summary_statistics)
-## Diagnostics: +## Diagnostics
- [Posterior predictive checks](12_diagnostics_posterior_predictive_check) @@ -28,14 +43,14 @@
-## Analysis: +## Analysis
- [Conditional distributions](07_conditional_distributions) - [Posterior sensitivity analysis](09_sensitivity_analysis)
-## Examples: +## Examples
- [Hodgkin-Huxley model](../examples/00_HH_simulator) From d388422d333946d928e2d73428903183823cee28 Mon Sep 17 00:00:00 2001 From: Jan Boelts Date: Tue, 11 Jun 2024 13:21:42 +0200 Subject: [PATCH 10/24] shorten snippet, remove navigation bar dropdown details --- docs/docs/index.md | 6 ++-- docs/docs/publications.md | 42 -------------------------- docs/mkdocs.yml | 31 ++----------------- tutorials/16_implemented_methods.ipynb | 4 +-- tutorials/index.md | 2 +- 5 files changed, 8 insertions(+), 77 deletions(-) delete mode 100644 docs/docs/publications.md diff --git a/docs/docs/index.md b/docs/docs/index.md index bd33ae2fc..ea5e6384a 100644 --- a/docs/docs/index.md +++ b/docs/docs/index.md @@ -8,7 +8,7 @@ import torch from sbi.inference import SNPE from sbi.utils import BoxUniform -# uniform prior and gaussian simulator +# use dummy prior and simulator theta = torch.rand(1000) x = torch.randn_like(theta) + theta @@ -16,7 +16,7 @@ x = torch.randn_like(theta) + theta inference = SNPE() inference.append_simulations(theta, x).train() -# inference +# do inference posterior = inference.build_posterior() samples = posterior.sample((1000,), x=torch.tensor([1.5])) ``` @@ -39,7 +39,7 @@ Then, check out our material:

*General motivation for the SBI framework and methods included in `sbi`.* -- :rocket: [__Tutorials__](tutorials/) +- :rocket: [__Tutorials and Examples__](tutorials/)

*Various examples illustrating how to use the `sbi` package.* diff --git a/docs/docs/publications.md b/docs/docs/publications.md deleted file mode 100644 index 44cfcbe36..000000000 --- a/docs/docs/publications.md +++ /dev/null @@ -1,42 +0,0 @@ -# Publications - -This page references papers to get additional details on the inference metods implemented in `sbi`. -You can find a tutorial on how to run each of these methods [here](../tutorial/16_implemented_methods/). - -## Posterior estimation (`(S)NPE`) - -- **Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation**
by Papamakarios & Murray (NeurIPS 2016)
[[PDF]](https://papers.nips.cc/paper/6084-fast-free-inference-of-simulation-models-with-bayesian-conditional-density-estimation.pdf) [[BibTeX]](https://papers.nips.cc/paper/6084-fast-free-inference-of-simulation-models-with-bayesian-conditional-density-estimation/bibtex) - -- **Flexible statistical inference for mechanistic models of neural dynamics**
by Lueckmann, Goncalves, Bassetto, Öcal, Nonnenmacher & Macke (NeurIPS 2017)
[[PDF]](https://papers.nips.cc/paper/6728-flexible-statistical-inference-for-mechanistic-models-of-neural-dynamics.pdf) [[BibTeX]](https://papers.nips.cc/paper/6728-flexible-statistical-inference-for-mechanistic-models-of-neural-dynamics/bibtex) - -- **Automatic posterior transformation for likelihood-free inference**
by Greenberg, Nonnenmacher & Macke (ICML 2019)
[[PDF]](http://proceedings.mlr.press/v97/greenberg19a/greenberg19a.pdf) [[BibTeX]](data:text/plain;charset=utf-8,%0A%0A%0A%0A%0A%0A%40InProceedings%7Bpmlr-v97-greenberg19a%2C%0A%20%20title%20%3D%20%09%20%7BAutomatic%20Posterior%20Transformation%20for%20Likelihood-Free%20Inference%7D%2C%0A%20%20author%20%3D%20%09%20%7BGreenberg%2C%20David%20and%20Nonnenmacher%2C%20Marcel%20and%20Macke%2C%20Jakob%7D%2C%0A%20%20booktitle%20%3D%20%09%20%7BProceedings%20of%20the%2036th%20International%20Conference%20on%20Machine%20Learning%7D%2C%0A%20%20pages%20%3D%20%09%20%7B2404--2414%7D%2C%0A%20%20year%20%3D%20%09%20%7B2019%7D%2C%0A%20%20editor%20%3D%20%09%20%7BChaudhuri%2C%20Kamalika%20and%20Salakhutdinov%2C%20Ruslan%7D%2C%0A%20%20volume%20%3D%20%09%20%7B97%7D%2C%0A%20%20series%20%3D%20%09%20%7BProceedings%20of%20Machine%20Learning%20Research%7D%2C%0A%20%20address%20%3D%20%09%20%7BLong%20Beach%2C%20California%2C%20USA%7D%2C%0A%20%20month%20%3D%20%09%20%7B09--15%20Jun%7D%2C%0A%20%20publisher%20%3D%20%09%20%7BPMLR%7D%2C%0A%20%20pdf%20%3D%20%09%20%7Bhttp%3A%2F%2Fproceedings.mlr.press%2Fv97%2Fgreenberg19a%2Fgreenberg19a.pdf%7D%2C%0A%20%20url%20%3D%20%09%20%7Bhttp%3A%2F%2Fproceedings.mlr.press%2Fv97%2Fgreenberg19a.html%7D%2C%0A%20%20abstract%20%3D%20%09%20%7BHow%20can%20one%20perform%20Bayesian%20inference%20on%20stochastic%20simulators%20with%20intractable%20likelihoods%3F%20A%20recent%20approach%20is%20to%20learn%20the%20posterior%20from%20adaptively%20proposed%20simulations%20using%20neural%20network-based%20conditional%20density%20estimators.%20However%2C%20existing%20methods%20are%20limited%20to%20a%20narrow%20range%20of%20proposal%20distributions%20or%20require%20importance%20weighting%20that%20can%20limit%20performance%20in%20practice.%20Here%20we%20present%20automatic%20posterior%20transformation%20(APT)%2C%20a%20new%20sequential%20neural%20posterior%20estimation%20method%20for%20simulation-based%20inference.%20APT%20can%20modify%20the%20posterior%20estimate%20using%20arbitrary%2C
%20dynamically%20updated%20proposals%2C%20and%20is%20compatible%20with%20powerful%20flow-based%20density%20estimators.%20It%20is%20more%20flexible%2C%20scalable%20and%20efficient%20than%20previous%20simulation-based%20inference%20techniques.%20APT%20can%20operate%20directly%20on%20high-dimensional%20time%20series%20and%20image%20data%2C%20opening%20up%20new%20applications%20for%20likelihood-free%20inference.%7D%0A%7D%0A) - -- **Truncated proposals for scalable and hassle-free simulation-based inference**
by Deistler, Goncalves & Macke (NeurIPS 2022)
[[Paper]](https://arxiv.org/abs/2210.04815) - - -## Likelihood-estimation (`(S)NLE`) - -- **Sequential neural likelihood: Fast likelihood-free inference with autoregressive flows**
by Papamakarios, Sterratt & Murray (AISTATS 2019)
[[PDF]](http://proceedings.mlr.press/v89/papamakarios19a/papamakarios19a.pdf) [[BibTeX]](https://gpapamak.github.io/bibtex/snl.bib) - -- **Variational methods for simulation-based inference**
by Glöckler, Deistler, Macke (ICLR 2022)
[[Paper]](https://arxiv.org/abs/2203.04176) - -- **Flexible and efficient simulation-based inference for models of decision-making**
by Boelts, Lueckmann, Gao, Macke (Elife 2022)
[[Paper]](https://elifesciences.org/articles/77220) - - -## Likelihood-ratio-estimation (`(S)NRE`) - -- **Likelihood-free MCMC with Amortized Approximate Likelihood Ratios**
by Hermans, Begy & Louppe (ICML 2020)
[[PDF]](http://proceedings.mlr.press/v119/hermans20a/hermans20a.pdf) - -- **On Contrastive Learning for Likelihood-free Inference**
Durkan, Murray & Papamakarios (ICML 2020)
[[PDF]](http://proceedings.mlr.press/v119/durkan20a/durkan20a.pdf) - -- **Towards Reliable Simulation-Based Inference with Balanced Neural Ratio Estimation**
by Delaunoy, Hermans, Rozet, Wehenkel & Louppe (NeurIPS 2022)
[[PDF]](https://arxiv.org/pdf/2208.13624.pdf) - -- **Contrastive Neural Ratio Estimation**
Benjamin Kurt Miller, Christoph Weniger, Patrick Forré (NeurIPS 2022)
[[PDF]](https://arxiv.org/pdf/2210.06170.pdf) - -## Utilities - -- **Restriction estimator**
by Deistler, Macke & Goncalves (PNAS 2022)
[[Paper]](https://www.pnas.org/doi/10.1073/pnas.2207632119) - -- **Simulation-based calibration**
by Talts, Betancourt, Simpson, Vehtari, Gelman (arxiv 2018)
[[Paper]](https://arxiv.org/abs/1804.06788)) - -- **Expected coverage (sample-based)**
as computed in Deistler, Goncalves, Macke [[Paper]](https://arxiv.org/abs/2210.04815) and in Rozet, Louppe [[Paper]](https://matheo.uliege.be/handle/2268.2/12993) diff --git a/docs/mkdocs.yml b/docs/mkdocs.yml index 03d57a8e6..b25bc3e9e 100644 --- a/docs/mkdocs.yml +++ b/docs/mkdocs.yml @@ -4,39 +4,12 @@ site_url: "https://sbi-dev.github.io/sbi/" nav: - Home: index.md - Tutorials and Examples: tutorials/index.md - # - Introduction: - # - Getting started: tutorials/00_getting_started_flexible.ipynb - # - Amortized inference: tutorials/01_gaussian_amortized.ipynb - # - Implemented algorithms: tutorials/16_implemented_methods.md - # - Advanced: - # - Multi-round inference: tutorials/03_multiround_inference.md - # - Sampling algorithms in sbi: tutorials/11_sampler_interface.md - # - Custom density estimators: tutorials/04_density_estimators.md - # - Embedding nets for observations: tutorials/05_embedding_net.md - # - SBI with trial-based data: tutorials/14_iid_data_and_permutation_invariant_embeddings.md - # - Handling invalid simulations: tutorials/08_restriction_estimator.md - # - Crafting summary statistics: tutorials/10_crafting_summary_statistics.md - # - Diagnostics: - # - Posterior predictive checks: tutorials/12_diagnostics_posterior_predictive_check.md - # - Simulation-based calibration: tutorials/13_diagnostics_simulation_based_calibration.md - # - Density plots and MCMC diagnostics with ArviZ: tutorials/15_mcmc_diagnostics_with_arviz.md - # - Analysis: - # - Conditional distributions: tutorials/07_conditional_distributions.md - # - Posterior sensitivity analysis: tutorials/09_sensitivity_analysis.md - # - Examples: - # - Hodgkin-Huxley example: examples/00_HH_simulator.md - # - Decision making model: examples/01_decision_making_model.md - API Reference: reference/index.md - # - Inference: reference/inference.md - # - Neural Networks: reference/models.md - # - Posteriors: reference/posteriors.md - # - Potentials: reference/potentials.md - # - Analysis: 
reference/analysis.md + - FAQ: faq.md - Contributing: - How to contribute: contribute.md - Code of Conduct: code_of_conduct.md - - FAQ: faq.md - - SBI Publications: publications + - Citation: citation.md - Credits: credits.md repo_name: 'sbi-dev/sbi' diff --git a/tutorials/16_implemented_methods.ipynb b/tutorials/16_implemented_methods.ipynb index a4598d909..9ea26279e 100644 --- a/tutorials/16_implemented_methods.ipynb +++ b/tutorials/16_implemented_methods.ipynb @@ -150,9 +150,9 @@ "outputs": [], "source": [ "# The BayesFlow functionality can be obtained via single-round SNPE.\n", - "from sbi.inference import SNPE_A\n", + "from sbi.inference import SNPE\n", "\n", - "inference = SNPE_A(prior)\n", + "inference = SNPE(prior)\n", "theta = prior.sample((num_sims,))\n", "x = simulator(theta)\n", "inference.append_simulations(theta, x).train()\n", diff --git a/tutorials/index.md b/tutorials/index.md index 56d2c298a..76be358bd 100644 --- a/tutorials/index.md +++ b/tutorials/index.md @@ -19,7 +19,7 @@ inference. - [Amortized inference](01_gaussian_amortized) - [Implemented algorithms](16_implemented_methods) - [Example application with a simulator from neuroscience - (Hodgkin-Huxley)](../examples/00_HH_simulator.ipynb) + (Hodgkin-Huxley)](../examples/00_HH_simulator)
## Advanced From ca32ce177610329387bce3d5e0e36c20dc124cd9 Mon Sep 17 00:00:00 2001 From: Jan Boelts Date: Tue, 11 Jun 2024 13:38:34 +0200 Subject: [PATCH 11/24] fix snippet --- docs/docs/index.md | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/docs/docs/index.md b/docs/docs/index.md index ea5e6384a..f37010644 100644 --- a/docs/docs/index.md +++ b/docs/docs/index.md @@ -6,11 +6,11 @@ interface: ```python import torch from sbi.inference import SNPE -from sbi.utils import BoxUniform # use dummy prior and simulator -theta = torch.rand(1000) +theta = torch.randn(1000, 2) x = torch.randn_like(theta) + theta +print(theta.shape, x.shape) # choose sbi method and train inference = SNPE() @@ -18,7 +18,7 @@ inference.append_simulations(theta, x).train() # do inference posterior = inference.build_posterior() -samples = posterior.sample((1000,), x=torch.tensor([1.5])) +samples = posterior.sample((1000,), x=torch.ones(2)) ``` ## Overview From 8d93d5e9ca5675be6e5fe39dcf822cc7fc999fea Mon Sep 17 00:00:00 2001 From: Jan Boelts Date: Thu, 20 Jun 2024 13:09:35 +0200 Subject: [PATCH 12/24] fix: links and md headings. --- docs/docs/contribute.md | 11 ++++++----- docs/docs/index.md | 6 +++--- docs/mkdocs.yml | 2 +- 3 files changed, 10 insertions(+), 9 deletions(-) diff --git a/docs/docs/contribute.md b/docs/docs/contribute.md index 9fe406c0d..3939c62b2 100644 --- a/docs/docs/contribute.md +++ b/docs/docs/contribute.md @@ -38,12 +38,13 @@ the end of every year. Additionally, we mention all contributors in the releases propose and then working on your pull request after getting some feedback from others. -### How to contribute +### Contribution workflow -The following steps describe all parts of the workflow for doing a contribution such as -installing locally `sbi` from source, creating a `conda` environment, setting up -your `git` repository, etc. 
We've taken strong inspiration from the guides for -contribution of [`scikit-learn`](https://scikit-learn.org/stable/developers/contributing.html) +The following steps describe all parts of the workflow for doing a contribution +such as installing locally `sbi` from source, creating a `conda` environment, +setting up your `git` repository, etc. We've taken strong inspiration from the +contribution guides of +[`scikit-learn`](https://scikit-learn.org/stable/developers/contributing.html) and [`mne`](https://mne.tools/stable/development/contributing.html): **Step 1**: [Create an account](https://github.com/) on GitHub if you do not diff --git a/docs/docs/index.md b/docs/docs/index.md index f37010644..88ecf78f0 100644 --- a/docs/docs/index.md +++ b/docs/docs/index.md @@ -39,11 +39,11 @@ Then, check out our material:

*General motivation for the SBI framework and methods included in `sbi`.* -- :rocket: [__Tutorials and Examples__](tutorials/) +- :rocket: [__Tutorials and Examples__](tutorials/index.md)

*Various examples illustrating how to use the `sbi` package.* -- :building_construction: [__Reference API__](reference/) +- :building_construction: [__Reference API__](reference/index.md)

*The detailed description of the package classes and functions.* @@ -119,7 +119,7 @@ the inference on one particular observation to be more simulation-efficient Below, we list all implemented methods and the corresponding publications. To see how to access these methods in `sbi`, check out our [Inference API's reference]( -reference/inference/) and the [tutorial on implemented +reference/inference.md) and the [tutorial on implemented methods](tutorials/16_implemented_methods.ipynb). ### Posterior estimation (`(S)NPE`) diff --git a/docs/mkdocs.yml b/docs/mkdocs.yml index b25bc3e9e..b91fe1a00 100644 --- a/docs/mkdocs.yml +++ b/docs/mkdocs.yml @@ -69,7 +69,7 @@ plugins: - mkdocs-jupyter: include: ["*.ipynb"] # Default: ["*.py", "*.ipynb"] ignore: [".ipynb_checkpoints/*.ipynb"] - no_input_tag: True + no_input: True - mkdocstrings: default_handler: python handlers: From c7480c1807e6fd7dabc4c2303a49bbd485b5dc92 Mon Sep 17 00:00:00 2001 From: tommoral Date: Tue, 30 Jul 2024 13:06:07 +0200 Subject: [PATCH 13/24] FIX small changes in index.md --- docs/docs/index.md | 7 +++---- 1 file changed, 3 insertions(+), 4 deletions(-) diff --git a/docs/docs/index.md b/docs/docs/index.md index 88ecf78f0..812dfebcc 100644 --- a/docs/docs/index.md +++ b/docs/docs/index.md @@ -41,7 +41,7 @@ Then, check out our material: - :rocket: [__Tutorials and Examples__](tutorials/index.md)

- *Various examples illustrating how to use the `sbi` package.* + *Various examples illustrating how to
[get started](tutorials/00_getting_started_flexible/) or use the `sbi` package.* - :building_construction: [__Reference API__](reference/index.md)

@@ -142,9 +142,8 @@ methods](tutorials/16_implemented_methods.ipynb).
[[Paper]](https://arxiv.org/abs/2210.04815) - **BayesFlow: Learning complex stochastic models with invertible neural - networks**
by Radev, S. T., Mertens, U. K., Voss, A., Ardizzone, L., & Köthe, - U. (2020) (IEEE transactions on neural networks and learning systems 2020)
- [Paper](https://ieeexplore.ieee.org/abstract/document/9298920) + networks**
by Radev, S. T., Mertens, U. K., Voss, A., Ardizzone, L., & Köthe, U. (IEEE transactions on neural networks and learning systems 2020)
+ [[Paper]](https://ieeexplore.ieee.org/abstract/document/9298920) ### Likelihood-estimation (`(S)NLE`) From e5d7209ad478e4f1915639689e625bb00a52bd40 Mon Sep 17 00:00:00 2001 From: tommoral Date: Tue, 30 Jul 2024 15:21:01 +0200 Subject: [PATCH 14/24] DOC expose doc on PR --- .github/workflows/check_doc.yml | 25 +++++++++++++++++++++++++ 1 file changed, 25 insertions(+) create mode 100644 .github/workflows/check_doc.yml diff --git a/.github/workflows/check_doc.yml b/.github/workflows/check_doc.yml new file mode 100644 index 000000000..1b163088d --- /dev/null +++ b/.github/workflows/check_doc.yml @@ -0,0 +1,25 @@ +name: "docs" +on: + pull_request: + branches: [ master ] + +jobs: + docs: + runs-on: ubuntu-latest + steps: + - uses: actions/checkout@v2 + - name: Set up Python 3.10 + uses: actions/setup-python@v2 + with: + python-version: '3.10' + - name: Install sbi and dependencies + run: | + python -m pip install --upgrade pip + python -m pip install .[doc] + - name: Build documentation + run: | + mkdocs build + - uses: actions/upload-artifact@v1 + with: + name: DocumentationHTML + path: site/ \ No newline at end of file From 2f6397679dbee5aa02b9d4520c76d36aa773f0a4 Mon Sep 17 00:00:00 2001 From: tommoral Date: Tue, 30 Jul 2024 15:26:26 +0200 Subject: [PATCH 15/24] FIX linter --- .github/workflows/check_doc.yml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/.github/workflows/check_doc.yml b/.github/workflows/check_doc.yml index 1b163088d..63a1d3cd3 100644 --- a/.github/workflows/check_doc.yml +++ b/.github/workflows/check_doc.yml @@ -22,4 +22,4 @@ jobs: - uses: actions/upload-artifact@v1 with: name: DocumentationHTML - path: site/ \ No newline at end of file + path: site/ From d2a7b28377d8e66a4c534b672651a1312a40ab7c Mon Sep 17 00:00:00 2001 From: tommoral Date: Tue, 30 Jul 2024 15:27:11 +0200 Subject: [PATCH 16/24] FIX workflow target --- .github/workflows/check_doc.yml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git 
a/.github/workflows/check_doc.yml b/.github/workflows/check_doc.yml index 63a1d3cd3..e961a3531 100644 --- a/.github/workflows/check_doc.yml +++ b/.github/workflows/check_doc.yml @@ -1,7 +1,7 @@ name: "docs" on: pull_request: - branches: [ master ] + branches: [ main ] jobs: docs: From d1c555d14d55831cec3367c3b705b5c026921797 Mon Sep 17 00:00:00 2001 From: tommoral Date: Tue, 30 Jul 2024 15:32:08 +0200 Subject: [PATCH 17/24] FIX check doc workflow --- .github/workflows/check_doc.yml | 25 +++++++++++++++++++++---- 1 file changed, 21 insertions(+), 4 deletions(-) diff --git a/.github/workflows/check_doc.yml b/.github/workflows/check_doc.yml index e961a3531..4f6f20929 100644 --- a/.github/workflows/check_doc.yml +++ b/.github/workflows/check_doc.yml @@ -5,21 +5,38 @@ on: jobs: docs: + name: Check Documentation runs-on: ubuntu-latest steps: - - uses: actions/checkout@v2 - - name: Set up Python 3.10 + - name: Checkout + uses: actions/checkout@v4 + with: + fetch-depth: 0 + lfs: false + + - name: Set up Python uses: actions/setup-python@v2 with: - python-version: '3.10' + python-version: '3.8' + + - name: Cache dependency + id: cache-dependencies + uses: actions/cache@v4 + with: + path: ~/.cache/pip + key: ${{ runner.os }}-pip + - name: Install sbi and dependencies run: | python -m pip install --upgrade pip python -m pip install .[doc] + - name: Build documentation run: | + cd docs mkdocs build + - uses: actions/upload-artifact@v1 with: name: DocumentationHTML - path: site/ + path: docs/site/ From b732f6c3454e33021015ccdf46533c1de3ae7314 Mon Sep 17 00:00:00 2001 From: Jan Boelts Date: Tue, 6 Aug 2024 11:54:42 +0200 Subject: [PATCH 18/24] refactor: improve landing page and credits; update methods --- docs/docs/credits.md | 39 ++---- docs/docs/index.md | 63 +++++---- tutorials/16_implemented_methods.ipynb | 176 ++++++++++++++++--------- tutorials/index.md | 2 - 4 files changed, 165 insertions(+), 115 deletions(-) diff --git a/docs/docs/credits.md b/docs/docs/credits.md index 
23da46007..12f4ef074 100644 --- a/docs/docs/credits.md +++ b/docs/docs/credits.md @@ -8,7 +8,8 @@ direct contributions to the codebase have been instrumental in the development o ## License -`sbi` is licensed under the [Apache License (Apache-2.0)](https://www.apache.org/licenses/LICENSE-2.0) and +`sbi` is licensed under the [Apache License +(Apache-2.0)](https://www.apache.org/licenses/LICENSE-2.0) and > Copyright (C) 2020 Álvaro Tejero-Cantero, Jakob H. Macke, Jan-Matthis Lückmann, > Michael Deistler, Jan F. Bölts. @@ -20,35 +21,23 @@ direct contributions to the codebase have been instrumental in the development o ## Support `sbi` has been supported by the German Federal Ministry of Education and Research (BMBF) -through the project ADIMEM (FKZ 01IS18052 A-D). -[ADIMEM](https://fit.uni-tuebingen.de/Project/Details?id=9199) is a collaborative -project between the groups of Jakob Macke (Uni Tübingen), Philipp Berens (Uni Tübingen), -Philipp Hennig (Uni Tübingen), and Marcel Oberlaender (caesar Bonn), which aims to develop -inference methods for mechanistic models. +through project ADIMEM (FKZ 01IS18052 A-D), project SiMaLeSAM (FKZ 01IS21055A) and the +Tübingen AI Center (FKZ 01IS18039A). Since 2024, `sbi` has been supported by the +appliedAI Institute for Europe gGmbH. ![](static/logo_bmbf.svg) ## Important dependencies and prior art -* `sbi` is the successor to [`delfi`](https://github.com/mackelab/delfi), a Theano-based - toolbox for sequential neural posterior estimation developed at [mackelab](https://uni-tuebingen.de/en/research/core-research/cluster-of-excellence-machine-learning/research/research/cluster-research-groups/professorships/machine-learning-in-science/). If you were - using `delfi`, we strongly recommend to move your inference over to `sbi`. Please open - issues if you find unexpected behaviour or missing features. We will consider these - bugs and give them priority. 
- -* `sbi` as a PyTorch-based toolbox started as a fork of +- `sbi` is the successor to [`delfi`](https://github.com/mackelab/delfi), a Theano-based + toolbox for sequential neural posterior estimation developed at + [mackelab](https://www.mackelab.org). If you were using `delfi`, we strongly recommend + moving your inference over to `sbi`. Please open issues if you find unexpected + behavior or missing features. We will consider these bugs and give them priority. +- `sbi` as a PyTorch-based toolbox started as a fork of [conormdurkan/lfi](https://github.com/conormdurkan/lfi), by [Conor M.Durkan](https://conormdurkan.github.io/). - -* `sbi` uses density estimators from -[bayesiains/nflows](https://github.com/bayesiains/nsf) by [Conor -M.Durkan](https://conormdurkan.github.io/), [George -Papamakarios](https://gpapamak.github.io/) and [Artur -Bekasov](https://arturbekasov.github.io/). These are proxied through -[`pyknos`](https://github.com/mackelab/pyknos), a package focused on density estimation. - -* `sbi` uses `PyTorch` and tries to align with the interfaces (e.g. for probability +- `sbi` uses `PyTorch` and tries to align with the interfaces (e.g. for probability distributions) adopted by `PyTorch`. - -* See [README.md](https://github.com/mackelab/sbi/blob/master/README.md) for a list of - publications describing the methods implemented in `sbi`. +- See [README.md](https://github.com/mackelab/sbi/blob/master/README.md) for a + list of publications describing the methods implemented in `sbi`. diff --git a/docs/docs/index.md b/docs/docs/index.md index 812dfebcc..2649d28b0 100644 --- a/docs/docs/index.md +++ b/docs/docs/index.md @@ -7,18 +7,21 @@ interface: import torch from sbi.inference import SNPE -# use dummy prior and simulator -theta = torch.randn(1000, 2) -x = torch.randn_like(theta) + theta -print(theta.shape, x.shape) +# define shifted Gaussian simulator. +def simulator(θ): return θ + torch.randn_like(θ) +# draw parameters from Gaussian prior. 

+θ = torch.randn(1000, 2) +# simulate data +x = simulator(θ) # choose sbi method and train inference = SNPE() -inference.append_simulations(theta, x).train() +inference.append_simulations(θ, x).train() -# do inference +# do inference given observed data +x_o = torch.ones(2) posterior = inference.build_posterior() -samples = posterior.sample((1000,), x=torch.ones(2)) +samples = posterior.sample((1000,), x=x_o) ``` ## Overview @@ -41,7 +44,8 @@ Then, check out our material: - :rocket: [__Tutorials and Examples__](tutorials/index.md)
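The snippet fixed in this hunk relies on `theta` and `x` sharing a batched `(num_sims, num_params)` shape so that `append_simulations` can pair each parameter row with its simulation. A minimal pure-PyTorch sanity check of that convention (names are illustrative; no `sbi` install required):

```python
import torch

def simulator(theta: torch.Tensor) -> torch.Tensor:
    # Shifted-Gaussian toy simulator: x ~ N(theta, I), same shape as theta.
    return theta + torch.randn_like(theta)

theta = torch.randn(1000, 2)  # (num_sims, num_params), the batch convention SNPE expects
x = simulator(theta)

# Each row of x corresponds to the same row of theta.
assert theta.shape == x.shape == torch.Size([1000, 2])
```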

- *Various examples illustrating how to
[get started](tutorials/00_getting_started_flexible/) or use the `sbi` package.* + *Various examples illustrating how to
[get + started](tutorials/00_getting_started_flexible/) or use the `sbi` package.* - :building_construction: [__Reference API__](reference/index.md)

@@ -135,20 +139,20 @@ methods](tutorials/16_implemented_methods.ipynb).
[[PDF]](https://papers.nips.cc/paper/6728-flexible-statistical-inference-for-mechanistic-models-of-neural-dynamics.pdf) [[BibTeX]](https://papers.nips.cc/paper/6728-flexible-statistical-inference-for-mechanistic-models-of-neural-dynamics/bibtex) -- **Automatic posterior transformation for likelihood-free inference**
by Greenberg, Nonnenmacher & Macke (ICML 2019)
[[PDF]](http://proceedings.mlr.press/v97/greenberg19a/greenberg19a.pdf) [[BibTeX]](data:text/plain;charset=utf-8,%0A%0A%0A%0A%0A%0A%40InProceedings%7Bpmlr-v97-greenberg19a%2C%0A%20%20title%20%3D%20%09%20%7BAutomatic%20Posterior%20Transformation%20for%20Likelihood-Free%20Inference%7D%2C%0A%20%20author%20%3D%20%09%20%7BGreenberg%2C%20David%20and%20Nonnenmacher%2C%20Marcel%20and%20Macke%2C%20Jakob%7D%2C%0A%20%20booktitle%20%3D%20%09%20%7BProceedings%20of%20the%2036th%20International%20Conference%20on%20Machine%20Learning%7D%2C%0A%20%20pages%20%3D%20%09%20%7B2404--2414%7D%2C%0A%20%20year%20%3D%20%09%20%7B2019%7D%2C%0A%20%20editor%20%3D%20%09%20%7BChaudhuri%2C%20Kamalika%20and%20Salakhutdinov%2C%20Ruslan%7D%2C%0A%20%20volume%20%3D%20%09%20%7B97%7D%2C%0A%20%20series%20%3D%20%09%20%7BProceedings%20of%20Machine%20Learning%20Research%7D%2C%0A%20%20address%20%3D%20%09%20%7BLong%20Beach%2C%20California%2C%20USA%7D%2C%0A%20%20month%20%3D%20%09%20%7B09--15%20Jun%7D%2C%0A%20%20publisher%20%3D%20%09%20%7BPMLR%7D%2C%0A%20%20pdf%20%3D%20%09%20%7Bhttp%3A%2F%2Fproceedings.mlr.press%2Fv97%2Fgreenberg19a%2Fgreenberg19a.pdf%7D%2C%0A%20%20url%20%3D%20%09%20%7Bhttp%3A%2F%2Fproceedings.mlr.press%2Fv97%2Fgreenberg19a.html%7D%2C%0A%20%20abstract%20%3D%20%09%20%7BHow%20can%20one%20perform%20Bayesian%20inference%20on%20stochastic%20simulators%20with%20intractable%20likelihoods%3F%20A%20recent%20approach%20is%20to%20learn%20the%20posterior%20from%20adaptively%20proposed%20simulations%20using%20neural%20network-based%20conditional%20density%20estimators.%20However%2C%20existing%20methods%20are%20limited%20to%20a%20narrow%20range%20of%20proposal%20distributions%20or%20require%20importance%20weighting%20that%20can%20limit%20performance%20in%20practice.%20Here%20we%20present%20automatic%20posterior%20transformation%20(APT)%2C%20a%20new%20sequential%20neural%20posterior%20estimation%20method%20for%20simulation-based%20inference.%20APT%20can%20modify%20the%20posterior%20estimate%20using%20arbitrary%2C
%20dynamically%20updated%20proposals%2C%20and%20is%20compatible%20with%20powerful%20flow-based%20density%20estimators.%20It%20is%20more%20flexible%2C%20scalable%20and%20efficient%20than%20previous%20simulation-based%20inference%20techniques.%20APT%20can%20operate%20directly%20on%20high-dimensional%20time%20series%20and%20image%20data%2C%20opening%20up%20new%20applications%20for%20likelihood-free%20inference.%7D%0A%7D%0A) - -- **Truncated proposals for scalable and hassle-free simulation-based - inference**
by Deistler, Goncalves & Macke (NeurIPS 2022) -
[[Paper]](https://arxiv.org/abs/2210.04815) +- **Automatic posterior transformation for likelihood-free inference**
by Greenberg, Nonnenmacher & Macke (ICML 2019)
[[PDF]](http://proceedings.mlr.press/v97/greenberg19a/greenberg19a.pdf) [[BibTeX]](data:text/plain;charset=utf-8,%0A%0A%0A%0A%0A%0A%40InProceedings%7Bpmlr-v97-greenberg19a%2C%0A%20%20title%20%3D%20%09%20%7BAutomatic%20Posterior%20Transformation%20for%20Likelihood-Free%20Inference%7D%2C%0A%20%20author%20%3D%20%09%20%7BGreenberg%2C%20David%20and%20Nonnenmacher%2C%20Marcel%20and%20Macke%2C%20Jakob%7D%2C%0A%20%20booktitle%20%3D%20%09%20%7BProceedings%20of%20the%2036th%20International%20Conference%20on%20Machine%20Learning%7D%2C%0A%20%20pages%20%3D%20%09%20%7B2404--2414%7D%2C%0A%20%20year%20%3D%20%09%20%7B2019%7D%2C%0A%20%20editor%20%3D%20%09%20%7BChaudhuri%2C%20Kamalika%20and%20Salakhutdinov%2C%20Ruslan%7D%2C%0A%20%20volume%20%3D%20%09%20%7B97%7D%2C%0A%20%20series%20%3D%20%09%20%7BProceedings%20of%20Machine%20Learning%20Research%7D%2C%0A%20%20address%20%3D%20%09%20%7BLong%20Beach%2C%20California%2C%20USA%7D%2C%0A%20%20month%20%3D%20%09%20%7B09--15%20Jun%7D%2C%0A%20%20publisher%20%3D%20%09%20%7BPMLR%7D%2C%0A%20%20pdf%20%3D%20%09%20%7Bhttp%3A%2F%2Fproceedings.mlr.press%2Fv97%2Fgreenberg19a%2Fgreenberg19a.pdf%7D%2C%0A%20%20url%20%3D%20%09%20%7Bhttp%3A%2F%2Fproceedings.mlr.press%2Fv97%2Fgreenberg19a.html%7D%2C%0A%20%20abstract%20%3D%20%09%20%7BHow%20can%20one%20perform%20Bayesian%20inference%20on%20stochastic%20simulators%20with%20intractable%20likelihoods%3F%20A%20recent%20approach%20is%20to%20learn%20the%20posterior%20from%20adaptively%20proposed%20simulations%20using%20neural%20network-based%20conditional%20density%20estimators.%20However%2C%20existing%20methods%20are%20limited%20to%20a%20narrow%20range%20of%20proposal%20distributions%20or%20require%20importance%20weighting%20that%20can%20limit%20performance%20in%20practice.%20Here%20we%20present%20automatic%20posterior%20transformation%20(APT)%2C%20a%20new%20sequential%20neural%20posterior%20estimation%20method%20for%20simulation-based%20inference.%20APT%20can%20modify%20the%20posterior%20estimate%20using%20arbitrary%2C
%20dynamically%20updated%20proposals%2C%20and%20is%20compatible%20with%20powerful%20flow-based%20density%20estimators.%20It%20is%20more%20flexible%2C%20scalable%20and%20efficient%20than%20previous%20simulation-based%20inference%20techniques.%20APT%20can%20operate%20directly%20on%20high-dimensional%20time%20series%20and%20image%20data%2C%20opening%20up%20new%20applications%20for%20likelihood-free%20inference.%7D%0A%7D%0A) - **BayesFlow: Learning complex stochastic models with invertible neural networks**
by Radev, S. T., Mertens, U. K., Voss, A., Ardizzone, L., & Köthe, U. (IEEE transactions on neural networks and learning systems 2020)
[[Paper]](https://ieeexplore.ieee.org/abstract/document/9298920) +- **Truncated proposals for scalable and hassle-free simulation-based + inference**
by Deistler, Goncalves & Macke (NeurIPS 2022) +
[[Paper]](https://arxiv.org/abs/2210.04815) + ### Likelihood-estimation (`(S)NLE`) - **Sequential neural likelihood: Fast likelihood-free inference with - autoregressive flows**
by Papamakarios, Sterratt & Murray (AISTATS 2019) + autoregressive flows**
by Papamakarios, Sterratt & Murray (AISTATS 2019)
[[PDF]](http://proceedings.mlr.press/v89/papamakarios19a/papamakarios19a.pdf) [[BibTeX]](https://gpapamak.github.io/bibtex/snl.bib) @@ -162,30 +166,33 @@ methods](tutorials/16_implemented_methods.ipynb). ### Likelihood-ratio-estimation (`(S)NRE`) -- **Likelihood-free MCMC with Amortized Approximate Likelihood Ratios**
by +- **Likelihood-free MCMC with Amortized Approximate Likelihood Ratios**
by Hermans, Begy & Louppe (ICML 2020)
[[PDF]](http://proceedings.mlr.press/v119/hermans20a/hermans20a.pdf) -- **On Contrastive Learning for Likelihood-free Inference**
Durkan, Murray & - Papamakarios (ICML 2020) +- **On Contrastive Learning for Likelihood-free Inference**
by Durkan, + Murray & Papamakarios (ICML 2020)
[[PDF]](http://proceedings.mlr.press/v119/durkan20a/durkan20a.pdf) - **Towards Reliable Simulation-Based Inference with Balanced Neural Ratio - Estimation**
by Delaunoy, Hermans, Rozet, Wehenkel & Louppe (NeurIPS 2022) + Estimation**
by Delaunoy, Hermans, Rozet, Wehenkel & Louppe (NeurIPS 2022)
[[PDF]](https://arxiv.org/pdf/2208.13624.pdf) -- **Contrastive Neural Ratio Estimation**
Benjamin Kurt Miller, Christoph - Weniger, Patrick Forré (NeurIPS 2022) +- **Contrastive Neural Ratio Estimation**
by Benjamin Kurt Miller, Christoph + Weniger & Patrick Forré (NeurIPS 2022)
[[PDF]](https://arxiv.org/pdf/2210.06170.pdf) -### Utilities +### Diagnostics + +- **Simulation-based calibration**
by Talts, Betancourt, Simpson, Vehtari, + Gelman (arxiv 2018)
[[Paper]](https://arxiv.org/abs/1804.06788) -- **Restriction estimator**
by Deistler, Macke & Goncalves (PNAS 2022) -
[[Paper]](https://www.pnas.org/doi/10.1073/pnas.2207632119) +- **Expected coverage (sample-based)**
as computed in Deistler, Goncalves, & + Macke (NeurIPS 2022)
[[Paper]](https://arxiv.org/abs/2210.04815) and in + Rozet & Louppe [[Paper]](https://matheo.uliege.be/handle/2268.2/12993) -- **Simulation-based calibration**
by Talts, Betancourt, Simpson, Vehtari, - Gelman (arxiv 2018)
[[Paper]](https://arxiv.org/abs/1804.06788)) +- **Local C2ST**
by Linhart, Gramfort & Rodrigues (NeurIPS + 2023)
[[Paper]](https://arxiv.org/abs/2306.03580) -- **Expected coverage (sample-based)**
as computed in Deistler, Goncalves, - Macke [[Paper]](https://arxiv.org/abs/2210.04815) and in Rozet, Louppe - [[Paper]](https://matheo.uliege.be/handle/2268.2/12993) +- **TARP**
by Lemos, Coogan, Hezaveh & Perreault-Levasseur (ICML + 2023)
[[Paper]](https://arxiv.org/abs/2302.03026) diff --git a/tutorials/16_implemented_methods.ipynb b/tutorials/16_implemented_methods.ipynb index 9ea26279e..585e79812 100644 --- a/tutorials/16_implemented_methods.ipynb +++ b/tutorials/16_implemented_methods.ipynb @@ -103,60 +103,66 @@ }, { "cell_type": "markdown", - "id": "b4de2b24-94ce-4cbf-a675-b0c19b5200ca", + "id": "e1df5c9d", "metadata": {}, "source": [ - "**Truncated proposals for scalable and hassle-free simulation-based inference**
by Deistler, Goncalves & Macke (NeurIPS 2022)
[[Paper]](https://arxiv.org/abs/2210.04815)\n" + "**BayesFlow: Learning complex stochastic models with invertible neural\n", + "networks**
by Radev, S. T., Mertens, U. K., Voss, A., Ardizzone, L., & Köthe,\n", + "U. (2020) (IEEE transactions on neural networks and learning systems 2020)
\n", + "[Paper](https://ieeexplore.ieee.org/abstract/document/9298920)\n", + "\n", + "The density estimation part of BayesFlow is equivalent to single-round NPE. The\n", + "additional contribution of the paper are several embedding networks for high-dimensional\n", + "data including permutation invariant embeddings. Similar embeddings networks are\n", + "implemented in `sbi` as well, under `sbi.neural_nets.embedding_nets.py`." ] }, { "cell_type": "code", "execution_count": null, - "id": "ae54b1a9-c3a6-4ee9-b687-bf8c046023c2", + "id": "836f58e9", "metadata": {}, "outputs": [], "source": [ + "# Posterior estimation with BayesFlow is equivalent to single-round SNPE.\n", "from sbi.inference import SNPE\n", - "from sbi.utils import RestrictedPrior, get_density_thresholder\n", "\n", "inference = SNPE(prior)\n", - "proposal = prior\n", - "for _ in range(num_rounds):\n", - " theta = proposal.sample((num_sims,))\n", - " x = simulator(theta)\n", - " _ = inference.append_simulations(theta, x).train(force_first_round_loss=True)\n", - " posterior = inference.build_posterior().set_default_x(x_o)\n", - "\n", - " accept_reject_fn = get_density_thresholder(posterior, quantile=1e-4)\n", - " proposal = RestrictedPrior(prior, accept_reject_fn, sample_with=\"rejection\")" + "theta = prior.sample((num_sims,))\n", + "x = simulator(theta)\n", + "inference.append_simulations(theta, x).train()\n", + "posterior = inference.build_posterior()\n", + "samples = posterior.sample((1000,), x=x_o)" ] }, { "cell_type": "markdown", - "id": "3642634d", + "id": "b4de2b24-94ce-4cbf-a675-b0c19b5200ca", "metadata": {}, "source": [ - "**BayesFlow: Learning complex stochastic models with invertible neural\n", - "networks**
by Radev, S. T., Mertens, U. K., Voss, A., Ardizzone, L., & Köthe,\n", - "U. (2020) (IEEE transactions on neural networks and learning systems 2020)
\n", - "[Paper](https://ieeexplore.ieee.org/abstract/document/9298920)" + "**Truncated proposals for scalable and hassle-free simulation-based inference**
by Deistler, Goncalves & Macke (NeurIPS 2022)
[[Paper]](https://arxiv.org/abs/2210.04815)\n" ] }, { "cell_type": "code", "execution_count": null, - "id": "da7554dd", + "id": "ae54b1a9-c3a6-4ee9-b687-bf8c046023c2", "metadata": {}, "outputs": [], "source": [ - "# The BayesFlow functionality can be obtained via single-round SNPE.\n", "from sbi.inference import SNPE\n", + "from sbi.utils import RestrictedPrior, get_density_thresholder\n", "\n", "inference = SNPE(prior)\n", - "theta = prior.sample((num_sims,))\n", - "x = simulator(theta)\n", - "inference.append_simulations(theta, x).train()\n", - "posterior = inference.build_posterior().set_default_x(x_o)" + "proposal = prior\n", + "for _ in range(num_rounds):\n", + " theta = proposal.sample((num_sims,))\n", + " x = simulator(theta)\n", + " _ = inference.append_simulations(theta, x).train(force_first_round_loss=True)\n", + " posterior = inference.build_posterior().set_default_x(x_o)\n", + "\n", + " accept_reject_fn = get_density_thresholder(posterior, quantile=1e-4)\n", + " proposal = RestrictedPrior(prior, accept_reject_fn, sample_with=\"rejection\")" ] }, { @@ -353,7 +359,6 @@ "\n", "from sbi.inference import SNRE_C\n", "\n", - "# Amortized inference\n", "inference = SNRE_C(prior)\n", "proposal = prior\n", "theta = proposal.sample((num_sims,))\n", @@ -370,7 +375,7 @@ "id": "6271d3b2-1d64-45b8-93b7-b640ab7dafc5", "metadata": {}, "source": [ - "## Utilities\n" + "## Diagnostics and utilities\n" ] }, { @@ -388,7 +393,8 @@ "metadata": {}, "outputs": [], "source": [ - "from sbi.analysis import run_sbc, sbc_rank_plot\n", + "from sbi.diagnostics import run_sbc\n", + "from sbi.analysis import sbc_rank_plot\n", "\n", "thetas = prior.sample((1000,))\n", "xs = simulator(thetas)\n", @@ -404,7 +410,7 @@ " thetas, xs, posterior, num_posterior_samples=1_000\n", ")\n", "\n", - "_ = sbc_rank_plot(\n", + "fig, axes = sbc_rank_plot(\n", " ranks=ranks,\n", " num_posterior_samples=1000,\n", " plot_type=\"hist\",\n", @@ -414,69 +420,106 @@ }, { "cell_type": "markdown", - "id": 
"8f1c75d9-4139-45ba-ad60-49d8ce8ec2ec", + "id": "48853668-7b6f-4cfd-9d93-7d62f0e77de8", "metadata": {}, "source": [ - "**Restriction estimator**
by Deistler, Macke & Goncalves (PNAS 2022)
[[Paper]](https://www.pnas.org/doi/10.1073/pnas.2207632119)\n" + "**Expected coverage (sample-based)**
as computed in Deistler, Goncalves, Macke (Neurips 2022) [[Paper]](https://arxiv.org/abs/2210.04815) and in Rozet, Louppe (2021) [[Paper]](https://matheo.uliege.be/handle/2268.2/12993)\n" ] }, { "cell_type": "code", "execution_count": null, - "id": "bdb39216-d0f0-41f6-a065-579d309ee1eb", + "id": "60e3d581-8a7f-4133-8756-9750f0174c88", "metadata": {}, "outputs": [], "source": [ - "from sbi.inference import SNPE\n", - "from sbi.utils import RestrictionEstimator\n", - "\n", - "restriction_estimator = RestrictionEstimator(prior=prior)\n", - "proposal = prior\n", - "\n", - "for _ in range(num_rounds):\n", - " theta = proposal.sample((num_sims,))\n", - " x = simulator(theta)\n", - " restriction_estimator.append_simulations(theta, x)\n", - " classifier = restriction_estimator.train()\n", - " proposal = restriction_estimator.restrict_prior()\n", + "thetas = prior.sample((1_000,))\n", + "xs = simulator(thetas)\n", "\n", - "all_theta, all_x, _ = restriction_estimator.get_simulations()\n", + "ranks, dap_samples = run_sbc(\n", + " thetas,\n", + " xs,\n", + " posterior,\n", + " num_posterior_samples=1_000,\n", + " reduce_fns=posterior.log_prob # Difference to SBC.\n", + ")\n", "\n", - "inference = SNPE(prior)\n", - "density_estimator = inference.append_simulations(all_theta, all_x).train()\n", - "posterior = inference.build_posterior()" + "# NOTE: Here we obtain a single rank plot because ranks are calculated\n", + "# for the entire posterior and not for each marginal like in SBC.\n", + "fig, axes = sbc_rank_plot(\n", + " ranks=ranks,\n", + " num_posterior_samples=1000,\n", + " plot_type=\"hist\",\n", + " num_bins=None,\n", + ")" ] }, { "cell_type": "markdown", - "id": "48853668-7b6f-4cfd-9d93-7d62f0e77de8", + "id": "3962b175", "metadata": {}, "source": [ - "**Expected coverage (sample-based)**
as computed in Deistler, Goncalves, Macke (Neurips 2022) [[Paper]](https://arxiv.org/abs/2210.04815) and in Rozet, Louppe (2021) [[Paper]](https://matheo.uliege.be/handle/2268.2/12993)\n" + "**TARP: Sampling-Based Accuracy Testing of Posterior Estimators for General Inference**\n", + "\n", + "Lemos, Coogan, Hezaveh & Perreault-Levasseur (ICML 2023)
[[Paper]](https://arxiv.org/abs/2302.03026)" ] }, { "cell_type": "code", "execution_count": null, - "id": "60e3d581-8a7f-4133-8756-9750f0174c88", + "id": "7de26848", "metadata": {}, "outputs": [], "source": [ - "from sbi.diagnostics import run_sbc, sbc_rank_plot\n", + "from sbi.diagnostics.tarp import run_tarp, plot_tarp\n", "\n", "thetas = prior.sample((1_000,))\n", "xs = simulator(thetas)\n", "\n", - "ranks, dap_samples = run_sbc(\n", - " thetas, xs, posterior, num_posterior_samples=1_000, reduce_fns=posterior.log_prob\n", + "expected_coverage, ideal_coverage = run_tarp(\n", + " thetas,\n", + " xs,\n", + " posterior,\n", + " references=None, # optional, defaults to uniform samples across parameter space.\n", + " num_posterior_samples=1_000,\n", ")\n", "\n", - "_ = sbc_rank_plot(\n", - " ranks=ranks,\n", - " num_posterior_samples=1000,\n", - " plot_type=\"hist\",\n", - " num_bins=None,\n", - ")" + "fix, axes = plot_tarp(expected_coverage, ideal_coverage)" + ] + }, + { + "cell_type": "markdown", + "id": "54a88026", + "metadata": {}, + "source": [ + "**Restriction estimator**
by Deistler, Macke & Goncalves (PNAS 2022)
[[Paper]](https://www.pnas.org/doi/10.1073/pnas.2207632119)\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "bc5e4c30", + "metadata": {}, + "outputs": [], + "source": [ + "from sbi.inference import SNPE\n", + "from sbi.utils import RestrictionEstimator\n", + "\n", + "restriction_estimator = RestrictionEstimator(prior=prior)\n", + "proposal = prior\n", + "\n", + "for _ in range(num_rounds):\n", + " theta = proposal.sample((num_sims,))\n", + " x = simulator(theta)\n", + " restriction_estimator.append_simulations(theta, x)\n", + " classifier = restriction_estimator.train()\n", + " proposal = restriction_estimator.restrict_prior()\n", + "\n", + "all_theta, all_x, _ = restriction_estimator.get_simulations()\n", + "\n", + "inference = SNPE(prior)\n", + "density_estimator = inference.append_simulations(all_theta, all_x).train()\n", + "posterior = inference.build_posterior()" ] } ], @@ -496,7 +539,20 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.10.0" + "version": "3.12.4" + }, + "toc": { + "base_numbering": 1, + "nav_menu": {}, + "number_sections": true, + "sideBar": true, + "skip_h1_title": false, + "title_cell": "Table of Contents", + "title_sidebar": "Contents", + "toc_cell": false, + "toc_position": {}, + "toc_section_display": true, + "toc_window_display": false }, "vscode": { "interpreter": { diff --git a/tutorials/index.md b/tutorials/index.md index 76be358bd..b45382f15 100644 --- a/tutorials/index.md +++ b/tutorials/index.md @@ -18,8 +18,6 @@ inference. 
- [Getting started](00_getting_started_flexible) - [Amortized inference](01_gaussian_amortized) - [Implemented algorithms](16_implemented_methods) -- [Example application with a simulator from neuroscience - (Hodgkin-Huxley)](../examples/00_HH_simulator) ## Advanced From 7f22346782ededef94e33f5026fdbec6581144d4 Mon Sep 17 00:00:00 2001 From: Jan Boelts Date: Wed, 7 Aug 2024 11:20:25 +0200 Subject: [PATCH 19/24] docs: change gh action to convert nbs and deploy docs upon release --- .github/workflows/check_doc.yml | 19 +++++++++++++------ 1 file changed, 13 insertions(+), 6 deletions(-) diff --git a/.github/workflows/check_doc.yml b/.github/workflows/check_doc.yml index 4f6f20929..15c554009 100644 --- a/.github/workflows/check_doc.yml +++ b/.github/workflows/check_doc.yml @@ -1,11 +1,11 @@ -name: "docs" +name: "build docs" on: - pull_request: - branches: [ main ] + release: + types: [ published ] jobs: docs: - name: Check Documentation + name: Build Documentation runs-on: ubuntu-latest steps: - name: Checkout @@ -31,10 +31,17 @@ jobs: python -m pip install --upgrade pip python -m pip install .[doc] - - name: Build documentation + - name: convert notebooks to markdown run: | cd docs - mkdocs build + mkdir -p docs/examples/ && mkdir -p docs/tutorials/ + jupyter nbconvert --to markdown ../examples/*.ipynb --output-dir docs/examples/ + jupyter nbconvert --to markdown ../tutorials/*.ipynb --output-dir docs/tutorials/ + + - name: Build and deploy documentation + run: | + cd docs + mkdocs gh-deploy - uses: actions/upload-artifact@v1 with: From 1a178462814cdbd60aec055df6c9934480699602 Mon Sep 17 00:00:00 2001 From: tommoral Date: Wed, 7 Aug 2024 14:45:58 +0200 Subject: [PATCH 20/24] CLN remove mkdocs-jupyter pluggin --- docs/docs/examples | 1 - docs/docs/tutorials | 1 - docs/mkdocs.yml | 4 ---- pyproject.toml | 1 - 4 files changed, 7 deletions(-) delete mode 120000 docs/docs/examples delete mode 120000 docs/docs/tutorials diff --git a/docs/docs/examples b/docs/docs/examples 
deleted file mode 120000 index da7b19653..000000000 --- a/docs/docs/examples +++ /dev/null @@ -1 +0,0 @@ -../../examples/ \ No newline at end of file diff --git a/docs/docs/tutorials b/docs/docs/tutorials deleted file mode 120000 index 478337109..000000000 --- a/docs/docs/tutorials +++ /dev/null @@ -1 +0,0 @@ -../../tutorials/ \ No newline at end of file diff --git a/docs/mkdocs.yml b/docs/mkdocs.yml index b91fe1a00..44862d393 100644 --- a/docs/mkdocs.yml +++ b/docs/mkdocs.yml @@ -66,10 +66,6 @@ markdown_extensions: plugins: - search - - mkdocs-jupyter: - include: ["*.ipynb"] # Default: ["*.py", "*.ipynb"] - ignore: [".ipynb_checkpoints/*.ipynb"] - no_input: True - mkdocstrings: default_handler: python handlers: diff --git a/pyproject.toml b/pyproject.toml index 9fc2cf41d..5ff9cd12b 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -56,7 +56,6 @@ doc = [ "mkdocs-material", "markdown-include", "mkdocs-redirects", - "mkdocs-jupyter", "mkdocstrings[python] >= 0.18", ] dev = [ From 1393d8a590ee2a9e5794ed21040002df176ee948 Mon Sep 17 00:00:00 2001 From: tommoral Date: Wed, 7 Aug 2024 15:50:22 +0200 Subject: [PATCH 21/24] DOC remove mkdocs-jupyter+add doc version control --- .../{check_doc.yml => build_docs.yml} | 22 +++++++++++-------- docs/docs/contribute.md | 9 +++++++- docs/docs/tutorials/.gitignore | 2 ++ {tutorials => docs/docs/tutorials}/index.md | 0 docs/mkdocs.yml | 3 +++ pyproject.toml | 1 + 6 files changed, 27 insertions(+), 10 deletions(-) rename .github/workflows/{check_doc.yml => build_docs.yml} (67%) create mode 100644 docs/docs/tutorials/.gitignore rename {tutorials => docs/docs/tutorials}/index.md (100%) diff --git a/.github/workflows/check_doc.yml b/.github/workflows/build_docs.yml similarity index 67% rename from .github/workflows/check_doc.yml rename to .github/workflows/build_docs.yml index 15c554009..a8df81af8 100644 --- a/.github/workflows/check_doc.yml +++ b/.github/workflows/build_docs.yml @@ -1,5 +1,8 @@ -name: "build docs" +name: "Build docs 
and deploy" on: + push: + branches: + - main release: types: [ published ] @@ -17,7 +20,7 @@ jobs: - name: Set up Python uses: actions/setup-python@v2 with: - python-version: '3.8' + python-version: '3.10' - name: Cache dependency id: cache-dependencies @@ -34,16 +37,17 @@ jobs: - name: convert notebooks to markdown run: | cd docs - mkdir -p docs/examples/ && mkdir -p docs/tutorials/ jupyter nbconvert --to markdown ../examples/*.ipynb --output-dir docs/examples/ jupyter nbconvert --to markdown ../tutorials/*.ipynb --output-dir docs/tutorials/ - - name: Build and deploy documentation + - name: Build and deploy dev documentation + if: ${{ github.event_name == 'push' }} run: | cd docs - mkdocs gh-deploy + mike deploy dev --push - - uses: actions/upload-artifact@v1 - with: - name: DocumentationHTML - path: docs/site/ + - name: Build and deploy the lastest documentation + if: ${{ github.event_name == 'release' }} + run: | + cd docs + mike deploy ${{ github.event.release.name }} latest -u --push diff --git a/docs/docs/contribute.md b/docs/docs/contribute.md index 3939c62b2..784ceb1e6 100644 --- a/docs/docs/contribute.md +++ b/docs/docs/contribute.md @@ -210,8 +210,15 @@ fails (xfailed). ## Contributing to the documentation Most of the documentation for `sbi` is written in markdown and the website is generated using `mkdocs` with `mkdocstrings`. To work on improvements of the -documentation, you should first run the command on your terminal +documentation, you should first install the `doc` dependencies: ``` +pip install -e ".[doc]" +``` +Then, you can run the command on your terminal +``` +cd docs +jupyter nbconvert --to markdown ../examples/*.ipynb --output-dir docs/examples/ +jupyter nbconvert --to markdown ../tutorials/*.ipynb --output-dir docs/tutorials/ mkdocs serve ``` and open a browser on the page proposed by `mkdocs`. 
Now, whenever you diff --git a/docs/docs/tutorials/.gitignore b/docs/docs/tutorials/.gitignore new file mode 100644 index 000000000..922aa335e --- /dev/null +++ b/docs/docs/tutorials/.gitignore @@ -0,0 +1,2 @@ +/**/*.md +*.png diff --git a/tutorials/index.md b/docs/docs/tutorials/index.md similarity index 100% rename from tutorials/index.md rename to docs/docs/tutorials/index.md diff --git a/docs/mkdocs.yml b/docs/mkdocs.yml index 44862d393..f9f0e4c84 100644 --- a/docs/mkdocs.yml +++ b/docs/mkdocs.yml @@ -32,6 +32,8 @@ extra: social: - icon: 'fontawesome/brands/github-alt' link: 'https://github.com/sbi-dev/sbi' + version: + provider: mike markdown_extensions: - extra @@ -66,6 +68,7 @@ markdown_extensions: plugins: - search + - mike - mkdocstrings: default_handler: python handlers: diff --git a/pyproject.toml b/pyproject.toml index 5ff9cd12b..aed79242c 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -57,6 +57,7 @@ doc = [ "markdown-include", "mkdocs-redirects", "mkdocstrings[python] >= 0.18", + "mike" ] dev = [ # Lint From d4675c67de8976d2a8044945f2c5d7c67ecef3be Mon Sep 17 00:00:00 2001 From: tommoral Date: Wed, 7 Aug 2024 16:02:44 +0200 Subject: [PATCH 22/24] MTN update .gitignore --- .gitignore | 2 -- docs/docs/examples/.gitignore | 2 ++ 2 files changed, 2 insertions(+), 2 deletions(-) create mode 100644 docs/docs/examples/.gitignore diff --git a/.gitignore b/.gitignore index c13f30a13..bf40436a9 100644 --- a/.gitignore +++ b/.gitignore @@ -1,8 +1,6 @@ # Project specific .sbi_env/ *sbi-logs/ -/docs/docs/tutorial/* -/docs/docs/examples/* /docs/site/* # Development files and python cache diff --git a/docs/docs/examples/.gitignore b/docs/docs/examples/.gitignore new file mode 100644 index 000000000..ecce1be80 --- /dev/null +++ b/docs/docs/examples/.gitignore @@ -0,0 +1,2 @@ +*.md +*.png From 9af62951a798faf9ac3eeed34f9d9c377115f59f Mon Sep 17 00:00:00 2001 From: Jan Boelts Date: Thu, 8 Aug 2024 16:50:50 +0200 Subject: [PATCH 23/24] fix: griffe warnings about .md 
links; refactoring text. --- docs/docs/contribute.md | 21 +++++++++++--------- docs/docs/faq.md | 4 ++++ docs/docs/index.md | 4 ++-- docs/docs/tutorials/index.md | 37 +++++++++++++++++++----------------- 4 files changed, 38 insertions(+), 28 deletions(-) diff --git a/docs/docs/contribute.md b/docs/docs/contribute.md index 784ceb1e6..ab82ef40a 100644 --- a/docs/docs/contribute.md +++ b/docs/docs/contribute.md @@ -179,19 +179,22 @@ to also run them without `-n auto`. When you create a PR onto `main`, our Continuous Integration (CI) actions on GitHub will perform the following checks: -- **`ruff`** for linting and formatting (including `black`, `isort`, and `flake8`) +- **[`ruff`](https://docs.astral.sh/ruff/formatter/)** for linting and formatting + (including `black`, `isort`, and `flake8`) - **[`pyright`](https://github.com/Microsoft/pyright)** for static type checking. -- **`pytest`** for running a subset of fast tests from our test suite. +- **[`pytest`](https://docs.pytest.org/en/stable/index.html)** for running a subset of + fast tests from our test suite. If any of these fail, try reproducing and solving the error locally: -- **`ruff`**: Make sure you have `pre-commit` installed locally with the same - version as specified in the [requirements](pyproject.toml). Execute it - using `pre-commit run --all-files`. `ruff` tends to give informative error - messages that help you fix the problem. Note that pre-commit only detects - problems with `ruff` linting and formatting, but does not fix them. You can - fix them either by running `ruff check . --fix(linting)`, followed by - `ruff format . --fix(formatting)`, or by hand. +- **`ruff`**: Make sure you have `pre-commit` installed locally with the same version as + specified in the + [`pyproject.toml`](https://github.com/sbi-dev/sbi/blob/main/pyproject.toml). Execute it + using `pre-commit run --all-files`. `ruff` tends to give informative error messages + that help you fix the problem. 
Note that pre-commit only detects problems with `ruff` + linting and formatting, but does not fix them. You can fix them either by running + `ruff check . --fix(linting)`, followed by `ruff format . --fix(formatting)`, or by + hand. - **`pyright`**: Run it locally using `pyright sbi/` and ensure you are using the same `pyright` version as used in the CI (which is the case if you have installed diff --git a/docs/docs/faq.md b/docs/docs/faq.md index 8f0beb872..741de8393 100644 --- a/docs/docs/faq.md +++ b/docs/docs/faq.md @@ -7,3 +7,7 @@ 5. [How should I save and load objects in `sbi`?](faq/question_05_pickling.md) 6. [Can I stop neural network training and resume it later?](faq/question_06_resume_training.md) 7. [How can I use a prior that is not defined in PyTorch?](faq/question_07_custom_prior.md) + +See also [discussion page](https://github.com/sbi-dev/sbi/discussions) and [issue +tracker](https://github.com/sbi-dev/sbi/issues) on the `sbi` GitHub repository for +recent questions and problems. diff --git a/docs/docs/index.md b/docs/docs/index.md index 2649d28b0..d9f7052a3 100644 --- a/docs/docs/index.md +++ b/docs/docs/index.md @@ -45,7 +45,7 @@ Then, check out our material: - :rocket: [__Tutorials and Examples__](tutorials/index.md)

*Various examples illustrating how to
[get - started](tutorials/00_getting_started_flexible/) or use the `sbi` package.* + started](tutorials/00_getting_started_flexible.md) or use the `sbi` package.* - :building_construction: [__Reference API__](reference/index.md)

@@ -124,7 +124,7 @@ the inference on one particular observation to be more simulation-efficient Below, we list all implemented methods and the corresponding publications. To see how to access these methods in `sbi`, check out our [Inference API's reference]( reference/inference.md) and the [tutorial on implemented -methods](tutorials/16_implemented_methods.ipynb). +methods](tutorials/16_implemented_methods.md). ### Posterior estimation (`(S)NPE`) diff --git a/docs/docs/tutorials/index.md b/docs/docs/tutorials/index.md index b45382f15..785b646e2 100644 --- a/docs/docs/tutorials/index.md +++ b/docs/docs/tutorials/index.md @@ -15,42 +15,45 @@ inference. ## Introduction
-- [Getting started](00_getting_started_flexible) -- [Amortized inference](01_gaussian_amortized) -- [Implemented algorithms](16_implemented_methods) +- [Getting started](00_getting_started_flexible.md) +- [Amortized inference](01_gaussian_amortized.md) +- [Implemented algorithms](16_implemented_methods.md)
## Advanced
-- [Multi-round inference](03_multiround_inference) -- [Sampling algorithms in sbi](11_sampler_interface) -- [Custom density estimators](04_density_estimators) -- [Embedding nets for observations](05_embedding_net) -- [SBI with trial-based data](14_iid_data_and_permutation_invariant_embeddings) -- [Handling invalid simulations](08_restriction_estimator) -- [Crafting summary statistics](10_crafting_summary_statistics) +- [Multi-round inference](03_multiround_inference.md) +- [Sampling algorithms in sbi](11_sampler_interface.md) +- [Custom density estimators](04_density_estimators.md) +- [Embedding nets for observations](05_embedding_net.md) +- [SBI with trial-based data](14_iid_data_and_permutation_invariant_embeddings.md) +- [Handling invalid simulations](08_restriction_estimator.md) +- [Crafting summary statistics](10_crafting_summary_statistics.md) +- [Importance sampling posteriors](17_importance_sampled_posteriors.md)
## Diagnostics
-- [Posterior predictive checks](12_diagnostics_posterior_predictive_check) -- [Simulation-based calibration](13_diagnostics_simulation_based_calibration) -- [Density plots and MCMC diagnostics with ArviZ](15_mcmc_diagnostics_with_arviz) +- [Posterior predictive checks](12_diagnostics_posterior_predictive_check.md) +- [Simulation-based calibration](13_diagnostics_simulation_based_calibration.md) +- [Density plots and MCMC diagnostics with ArviZ](15_mcmc_diagnostics_with_arviz.md) +- [Local-C2ST coverage checks](18_diagnostics_lc2st.md)
## Analysis
-- [Conditional distributions](07_conditional_distributions) -- [Posterior sensitivity analysis](09_sensitivity_analysis) +- [Conditional distributions](07_conditional_distributions.md) +- [Posterior sensitivity analysis](09_sensitivity_analysis.md) +- [Plotting functionality](19_plotting_functionality.md)
## Examples
-- [Hodgkin-Huxley model](../examples/00_HH_simulator) -- [Decision-making model](../examples/01_decision_making_model) +- [Hodgkin-Huxley model](../examples/00_HH_simulator.md) +- [Decision-making model](../examples/01_decision_making_model.md)
From 988afa4a1c0fee48afeec26914ebf60bb4ba62d9 Mon Sep 17 00:00:00 2001 From: Jan Boelts Date: Thu, 8 Aug 2024 17:23:47 +0200 Subject: [PATCH 24/24] fix: configure gh user in action for pushing to gh-pages --- .github/workflows/build_docs.yml | 9 +++++++-- 1 file changed, 7 insertions(+), 2 deletions(-) diff --git a/.github/workflows/build_docs.yml b/.github/workflows/build_docs.yml index a8df81af8..d06db6064 100644 --- a/.github/workflows/build_docs.yml +++ b/.github/workflows/build_docs.yml @@ -40,13 +40,18 @@ jobs: jupyter nbconvert --to markdown ../examples/*.ipynb --output-dir docs/examples/ jupyter nbconvert --to markdown ../tutorials/*.ipynb --output-dir docs/tutorials/ - - name: Build and deploy dev documentation + - name: Configure Git user for bot + run: | + git config --local user.email "github-actions[bot]@users.noreply.github.com" + git config --local user.name "github-actions[bot]" + + - name: Build and deploy dev documentation upon push to main if: ${{ github.event_name == 'push' }} run: | cd docs mike deploy dev --push - - name: Build and deploy the lastest documentation + - name: Build and deploy the lastest documentation upon new release if: ${{ github.event_name == 'release' }} run: | cd docs