diff --git a/docs/explanations/architecture.md b/docs/explanations/architecture.md index 4c3a97d1a..c829ebc73 100644 --- a/docs/explanations/architecture.md +++ b/docs/explanations/architecture.md @@ -1,46 +1,34 @@ -Architecture -============ +# Architecture + Blueapi performs a number of tasks: -* Managing the Bluesky RunEngine_, giving it instructions and handling its errors. Traditionally this job has been done by a human with an IPython_ terminal, so it requires automating. +* Managing the Bluesky [RunEngine](https://nsls-ii.github.io/bluesky/run_engine_api.html), giving it instructions and handling its errors. Traditionally this job has been done by a human with an [IPython](https://ipython.org/) terminal, so it requires automating. * Maintaining a registry of plans and devices. In the aforementioned [IPython](https://ipython.org/) case, these would have just been global variables. * Communicating with the outside world, accepting instructions to run plans, providing updates on plan progress etc. These responsibilities are kept separate in the codebase to ensure a clean, maintainable architecture. -Key Components -------------- +## Key Components -.. figure:: ../images/blueapi-architecture.png - :width: 600px - :align: center - main components +![blueapi architecture main components](../images/blueapi-architecture.png) -The ``BlueskyContext`` Object -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ +### The `BlueskyContext` Object Holds registries of plans and devices as well as a number of helper methods for registering en masse from a normal Python module. +### The Worker Object -The Worker Object -^^^^^^^^^^^^^^^^^ - -Wraps the Bluesky ``RunEngine`` and accepts requests to run plans. The requests include the name +Wraps the Bluesky `RunEngine` and accepts requests to run plans. The requests include the name of the plan and a dictionary of parameters to pass. The worker validates the parameters against -the known expectations of the plan, passes it to the ``RunEngine`` and handles any errors.
+the known expectations of the plan, passes them to the `RunEngine` and handles any errors. -The Service Object -^^^^^^^^^^^^^^^^^^ +### The Service Object Handles communications and the API layer. This object holds a reference to the worker and can interrogate it/give it instructions in response to messages it receives from the message bus. It can also forward the various events generated by the worker to topics on the bus. - - -.. _RunEngine: https://nsls-ii.github.io/bluesky/run_engine_api.html -.. _IPython: https://ipython.org/ diff --git a/docs/explanations/decisions/0003-no-queues.md b/docs/explanations/decisions/0003-no-queues.md index 9a3cd152b..5f8c8f383 100644 --- a/docs/explanations/decisions/0003-no-queues.md +++ b/docs/explanations/decisions/0003-no-queues.md @@ -1,27 +1,22 @@ -3. No Queues -============ +# 3. No Queues Date: 2023-05-22 -Status ------- +## Status Accepted -Context ------- +## Context The question under consideration was whether this service should hold and execute a queue of tasks. -Decision -------- +## Decision We will not hold any queues. The worker can execute one task at a time and will return an error if asked to execute one task while another is running. Queueing should be the responsibility of a different service. -Consequences ------------ +## Consequences The API must be kept queue-free, although transactions are permitted where the server caches requests. diff --git a/docs/explanations/decisions/0004-api-case.md b/docs/explanations/decisions/0004-api-case.md index 6ee520574..459c156a6 100644 --- a/docs/explanations/decisions/0004-api-case.md +++ b/docs/explanations/decisions/0004-api-case.md @@ -1,26 +1,21 @@ -4. API Model Case -================= +# 4. API Model Case Date: 2023-05-23 -Status ------ +## Status Accepted -Context ------- +## Context Considering whether keys in JSON blobs from the API should be in snake_case or camelCase. This includes plan parameters which may be user-defined.
-Decision -------- +## Decision The priority is not to confuse users, so we will not alias any field names defined in Python. -Consequences ------------ +## Consequences Most code will be written with PEP 8 enforcers, which means most field names will be snake_case. Some user-defined ones may differ. diff --git a/docs/explanations/events.md b/docs/explanations/events.md index ae134ea9b..990d0b36b 100644 --- a/docs/explanations/events.md +++ b/docs/explanations/events.md @@ -1,12 +1,11 @@ -Events Emitted by the Worker -============================ +# Events Emitted by the Worker -Blueapi adds new events on top of the `bluesky event model`_. +Blueapi adds new events on top of the [`bluesky event model`](https://blueskyproject.io/event-model/main/index.html). -Reasons ------- +## Reasons - -Since the ``RunEngine`` is traditionally used by a human in front of an IPython terminal, it + +Since the `RunEngine` is traditionally used by a human in front of an IPython terminal, it sometimes assumes intuitive behaviour. The worker replaces the human and so must fill in the gaps. @@ -14,23 +13,18 @@ The base engine programmatically emits data events conforming to the `bluesky eve are meant to be handled by other subscribing code (e.g. databroker) and are decoupled from concerns such as whether a plan has started, finished, paused, errored etc. See the example below: -..
figure:: ../images/bluesky-events.png - :width: 600px - :align: center - - sequence of event emission compared to plan start/finish, in a complicated case +![sequence of event emission compared to plan start/finish, in a complicated case](../images/bluesky-events.png) -Note the gap between the start of the plan and the issue of the first `run start document`_, and the similar gap +Note the gap between the start of the plan and the issue of the first [`run start document`](https://blueskyproject.io/event-model/main/user/explanations/data-model.html#run-start-document), and the similar gap for the stop document vs the end of the plan; these gaps are typically used for setup and cleanup. Also note that a plan can produce an arbitrary number of runs. This decoupling is fine in an IPython terminal because a human user can see when a plan has started, can see when it's finished and can see which runs are associated with which plans. -New Events ---------- +## New Events For the case of automation, we introduce a new set of events outside of the event model, specifically -pertaining to the running of the plan and state of the ``RunEngine``. At a mimimum, an event is emitted +pertaining to the running of the plan and state of the `RunEngine`. At a minimum, an event is emitted every time the engine: * Starts a new plan @@ -40,28 +34,18 @@ every time the engine: In the latter case, information about the error is also included. -Correlation ID ------------- +## Correlation ID + When controlling plans programmatically, it can be useful to verify that event model documents really are related to the plan you just asked the worker to run. The worker will therefore bundle a correlation ID into the headers of messages containing documents. -..
seealso:: `Microsoft Playbook on Correlation IDs`_ +See also: [Microsoft Playbook on Correlation IDs](https://microsoft.github.io/code-with-engineering-playbook/observability/correlation-id/) ActiveMQ will give this header a different name depending on the protocol you use. -.. list-table:: Correlation ID Headers - :widths: 25 25 - :header-rows: 1 - - * - Protocol - - Header name - * - JMS - - jms_correlationID - * - STOMP - - correlation-id - -.. _`bluesky event model`: https://blueskyproject.io/event-model/main/index.html -.. _`run start document`: https://blueskyproject.io/event-model/main/user/explanations/data-model.html#run-start-document -.. _`Microsoft Playbook on Correlation IDs`: https://microsoft.github.io/code-with-engineering-playbook/observability/correlation-id/ +| Protocol | Header name | |----------|-------------------| | JMS | jms_correlationID | | STOMP | correlation-id | diff --git a/docs/explanations/lifecycle.md b/docs/explanations/lifecycle.md index e80f2eaa7..286350c93 100644 --- a/docs/explanations/lifecycle.md +++ b/docs/explanations/lifecycle.md @@ -1,11 +1,8 @@ -Lifecycle of a Plan -=================== +# Lifecycle of a Plan The following demonstrates exactly what the code does with a plan through its lifecycle of being written, loaded and run. Take the following plan. - -.. code:: python - +```python from typing import Any, List, Mapping, Optional, Union import bluesky.plans as bp @@ -42,23 +39,21 @@ of being written, loaded and run. Take the following plan. """ yield from bp.count(detectors, num, delay=delay, md=metadata) +``` - -Loading and Registration ------------------------ +## Loading and Registration Blueapi will load this plan into its context if configured to load either this module or a module that -imports it.
The `BlueskyContext` will go through all global variables in the module and register them if it detects that they are plans. At the point of registration it will inspect the plan's parameters and their type hints, from which it -will build a pydantic_ model of the parameters to validate against. In other words, it will build something +will build a [pydantic](https://docs.pydantic.dev/) model of the parameters to validate against. In other words, it will build something like this: -.. code:: python - +```python from pydantic import BaseModel class CountParameters(BaseModel): @@ -71,31 +66,26 @@ like this: arbitrary_types_allowed = True validate_all = True +``` -.. note:: - +> **_NOTE:_** This is for illustrative purposes only, this code is not actually generated, but an object resembling this class is constructed in memory. The default arguments will be validated by the context to inject the "det" device when the plan is run. The existence of the "det" default device is not checked until this time. The model is also stored in the context. -Startup ------- +## Startup On startup, the context is passed to the worker, which is passed to the service. -The worker also holds a reference to the ``RunEngine`` that can run the plan. +The worker also holds a reference to the `RunEngine` that can run the plan. -Request ------- +## Request A user can send a request to run the plan to the service, which includes values for the parameters. It takes the form of JSON and may look something like this: - -.. code:: json - +```json { "name": "count", "params": { @@ -107,33 +97,32 @@ It takes the form of JSON and may look something like this: "delay": 0.1 } } +``` -The ``Service`` receives the request and passes it to the worker, which holds it in an internal queue +The `Service` receives the request and passes it to the worker, which holds it in an internal queue and executes it as soon as it can.
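To make the request handling concrete, here is a stand-alone sketch of what happens to such a JSON blob. This is illustrative only, not blueapi's actual code: the `PLAN_SCHEMAS` registry and `validate_task` helper are invented for this example, standing in for the generated pydantic models described above.

```python
import json

# Hypothetical registry mapping plan names to expected parameter types.
# In blueapi this information comes from the generated pydantic models.
PLAN_SCHEMAS = {
    "count": {"detectors": list, "num": int, "delay": float},
}

def validate_task(raw: str) -> dict:
    """Parse a task request and check its parameters against the plan's schema."""
    task = json.loads(raw)
    schema = PLAN_SCHEMAS.get(task["name"])
    if schema is None:
        raise KeyError(f"unknown plan: {task['name']}")
    for key, value in task["params"].items():
        if key not in schema:
            raise ValueError(f"unexpected parameter: {key}")
        if not isinstance(value, schema[key]):
            raise TypeError(f"{key} should be {schema[key].__name__}")
    return task
```

A request like the JSON above would pass this check, while an unknown plan name or a wrongly typed parameter would be rejected before anything reaches the `RunEngine`.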
-Validation ---------- +## Validation The pydantic model from earlier, as well as the plan function itself, is loaded out of the registry. The parameter values in the request are validated against the model; this includes looking up devices -with names ``andor`` and ``pilatus`` or, if detectors was not passed ``det``. +with names `andor` and `pilatus` or, if `detectors` was not passed, `det`. +See also: [type validators](./type_validators.md) -.. seealso:: `./type_validators` -Execution --------- +## Execution The validated parameter values are then passed to the plan function, which is passed to the `RunEngine`. -The plan is executed. While it is running, the ``Worker`` will publish +The plan is executed. While it is running, the `Worker` will publish: -* Changes to the state of the ``RunEngine`` +* Changes to the state of the `RunEngine` * Changes to any device statuses running within a plan (e.g. when a motor changes position) -* Event model documents emitted by the ``RunEngine`` +* Event model documents emitted by the `RunEngine` * When a plan starts, finishes or fails. If an error occurs during any of the stages from "Request" onwards it is sent back to the user over the message bus. -.. _pydantic: https://docs.pydantic.dev/ + diff --git a/docs/explanations/type_validators.md b/docs/explanations/type_validators.md index 945ff5881..3f38a37e9 100644 --- a/docs/explanations/type_validators.md +++ b/docs/explanations/type_validators.md @@ -1,13 +1,9 @@ -Type Validators -=============== +# Type Validators -Requirement ----------- - -Blueapi takes the parameters of a plan and internally creates a pydantic_ model for future validation e.g. - -.. code:: python +## Requirement +Blueapi takes the parameters of a plan and internally creates a [pydantic](https://docs.pydantic.dev/) model for future validation, e.g. +```python def my_plan(a: int, b: str = "b") -> Plan: ...
@@ -16,25 +12,24 @@ Blueapi takes the parameters of a plan and internally creates a pydantic_ model class MyPlanModel(BaseModel): a: int b: str = "b" +``` That way, when the plan parameters are sent in JSON form, they can be parsed and validated by pydantic. However, it must also cover the case where a plan doesn't take a simple dictionary, list or primitive but instead a device, such as a detector. -.. code:: python - +```python def my_plan(a: int, b: Readable) -> Plan: ... - +``` An Ophyd object cannot be passed over the network as JSON because it has state. -Instead, a string is passed, representing an ID of the object known to the ``BlueskyContext``. +Instead, a string is passed, representing an ID of the object known to the `BlueskyContext`. At the time a plan's parameters are validated, blueapi must take all the strings that are supposed to be devices and look them up against the context. For example, with the request: -.. code:: json - +```json { "name": "count", "params": { @@ -46,12 +41,13 @@ to be devices and look them up against the context. For example with the request "delay": 0.1 } } +``` + +`andor` and `pilatus` should be looked up and replaced with Ophyd objects. -``andor`` and ``pilatus`` should be looked up and replaced with Ophyd objects. +## Solution -Solution -------- When the context loads available plans, it iterates through the type signature and replaces any reference to a bluesky protocol (or instance of a protocol) with a new class that extends the original type. Defining a class validator on @@ -63,13 +59,12 @@ object returned from validator method is not checked by pydantic so it can be the actual instance and the plan never sees the runtime generated reference type, only the type it was expecting. -.. note:: This uses the fact that the new types generated at runtime have access to the context that required them via their closure. This circumvents the usual problem of pydantic validation not being able to access external state when validating or deserialising. +> **_NOTE:_** This uses the fact that the new types generated at runtime have access to the context that required them via their closure. This circumvents the usual problem of pydantic validation not being able to access external state when validating or deserialising. -.. code:: python - +```python def my_plan(a: int, b: Readable) -> Plan: ... @@ -78,12 +73,11 @@ type, only the type it was expecting. class MyPlanModel(BaseModel): a: int b: Reference[Readable] +``` -This also allows ``Readable`` to be placed at various type levels. For example: - -.. code:: python - +This also allows `Readable` to be placed at various type levels. For example: +```python def my_weird_plan( a: Readable, b: List[Readable], @@ -91,6 +85,4 @@ This also allows ``Readable`` to be placed at various type levels. For example: d: List[List[Readable]], e: List[Dict[str, Set[Readable]]]) -> Plan: ... - - -.. _pydantic: https://docs.pydantic.dev/ +``` diff --git a/docs/how-to/add-plans-and-devices.md b/docs/how-to/add-plans-and-devices.md index 4b4717a42..a2c12e0b2 100644 --- a/docs/how-to/add-plans-and-devices.md +++ b/docs/how-to/add-plans-and-devices.md @@ -1,38 +1,31 @@ -Add Plans and Devices to your Blueapi Environment ================================================= +# Add Plans and Devices to your Blueapi Environment Custom plans and devices, tailored to individual experimental environments, can be added via configuration. In both cases the relevant code must be in your Python path, for example, part of a library installed in your Python environment. For editing and tweaking you can use an editable install, see below. -Home of Code ------------ +## Home of Code The code can be in any pip-installable package, such as: * A package on pypi * A Github repository -* A local directory with a ``pyproject.toml`` file or similar. +* A local directory with a `pyproject.toml` file or similar. -The easiest place to put the code is a repository created with the `python-copier-template`_. Which can then become any of the above.
+The easiest place to put the code is a repository created with the [`python-copier-template`](https://diamondlightsource.github.io/python-copier-template/main/index.html), which can then become any of the above. -.. seealso:: Guide to setting up a new Python project with an environment and a standard set of tools: `Create a new repo from the template`_ +See also: Guide to setting up a new Python project with an environment and a standard set of tools: [`Create a new repo from the template`](https://diamondlightsource.github.io/python-copier-template/main/tutorials/create-new.html) For development purposes this code should be installed into your environment with -.. code:: shell - +```shell pip install -e path/to/package +``` -Format ------ +## Format Plans in Python files look like this: -.. note:: The type annotations (e.g. ``: str``, ``: int``, ``-> MsgGenerator``) are required as blueapi uses them to generate an API! -You can define as many plans as you like in a single Python file or spread them over multiple files. - -.. code:: python - +> **_NOTE:_** The type annotations (e.g. `: str`, `: int`, `-> MsgGenerator`) are required as blueapi uses them to generate an API! +You can define as many plans as you like in a single Python file or spread them over multiple files. +```python from bluesky.protocols import Readable, Movable from blueapi.core import MsgGenerator from typing import Mapping, Any @@ -46,43 +39,31 @@ You can define as many plans as you like in a single Python file or spread them # logic goes here ... +``` +Devices are made using the [dodal](https://github.com/DiamondLightSource/dodal) style available through factory functions like this: -Devices are made using the dodal_ style available through factory functions like this: - -.. note:: The return type annotation ``-> MyTypeOfDetector`` is required as blueapi uses it to determine that this function creates a device. Meaning you can have a Python file where only some functions create devices and they will be automatically picked up. +> **_NOTE:_** The return type annotation `-> MyTypeOfDetector` is required as blueapi uses it to determine that this function creates a device. This means you can have a Python file where only some functions create devices, and they will be automatically picked up. Similarly, these functions can be organized into separate files as preferred. - -.. code:: python - +```python from my_facility_devices import MyTypeOfDetector def my_detector(name: str) -> MyTypeOfDetector: return MyTypeOfDetector(name, {"other_config": "foo"}) +``` -.. seealso:: dodal_ for many more examples +See also: [dodal](https://github.com/DiamondLightSource/dodal) for many more examples. An extra function to create the device is used to preserve side-effect-free imports. Each device must have its own factory function. -Configuration ------------- +## Configuration + -.. seealso:: `./configure-app` +See also: [configure app](./configure-app.md) First determine the import path of your code. If you were going to import it in a Python file, what would you put? For example: - -.. code:: python - +```python import my_plan_library.tomography.plans +``` You would add the following into your configuration file: - -.. code:: yaml - +```yaml env: sources: - kind: dodal @@ -92,27 +81,23 @@ You would add the following into your configuration file: module: dodal.my_beamline - kind: planFunctions module: my_plan_library.tomography.plans +``` You can have as many sources for plans and devices as are needed. -Scratch Area on Kubernetes ------------------------- +## Scratch Area on Kubernetes -Sometimes in-the-loop development of plans and devices may be required. If running blueapi out of a virtual environment local packages can be installed with ``pip install -e path/to/package``, but there is also a way to support editable packages on Kubernetes with a shared filesystem. +Sometimes in-the-loop development of plans and devices may be required. If running blueapi out of a virtual environment, local packages can be installed with `pip install -e path/to/package`, but there is also a way to support editable packages on Kubernetes with a shared filesystem. Blueapi can be configured to install editable Python packages from a chosen directory; the helm chart can mount this directory from the -host machine, include the following in your ``values.yaml``: - -.. code:: yaml - +host machine. Include the following in your `values.yaml`: +```yaml scratch: hostPath: path/to/scratch/area # e.g. /dls_sw//software/blueapi/scratch -You can then clone projects into the scratch directory and blueapi will automatically incorporate them on startup. You must still include configuration to load the plans and devices from specific modules within those packages, see above. +``` +You can then clone projects into the scratch directory and blueapi will automatically incorporate them on startup. You must still include configuration to load the plans and devices from specific modules within those packages; see above. -.. _dodal: https://github.com/DiamondLightSource/dodal -.. _`python-copier-template`: https://diamondlightsource.github.io/python-copier-template/main/index.html -.. _`Create a new repo from the template`: https://diamondlightsource.github.io/python-copier-template/main/tutorials/create-new.html diff --git a/docs/how-to/configure-app.md b/docs/how-to/configure-app.md index 093b159fa..a42fd8740 100644 --- a/docs/how-to/configure-app.md +++ b/docs/how-to/configure-app.md @@ -1,19 +1,18 @@ -Configure the application ========================= +# Configure the application By default, configuration options are ingested from pydantic BaseModels; however, the option exists to override these by defining a yaml file which -can be passed to the ``blueapi`` command. +can be passed to the `blueapi` command.
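As an illustration, an override file can contain just the sections you want to change; this sketch reuses the `env.sources` structure shown in the add-plans guide (the module name here is a placeholder, and any option not set in the file should fall back to its default):

```yaml
env:
  sources:
    - kind: planFunctions
      module: my_plan_library.tomography.plans
```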
-An example of this yaml file is found in ``config/defaults.yaml``, which follows -the schema defined in ``src/blueapi/config.py`` in the ``ApplicationConfig`` +An example of this yaml file is found in `config/defaults.yaml`, which follows +the schema defined in `src/blueapi/config.py` in the `ApplicationConfig` object. To set your own application configuration, edit this file (or write your own) and simply pass it to the CLI by typing: - $ blueapi --config path/to/file.yaml +```shell blueapi --config path/to/file.yaml ``` -where ``path/to/file.yaml`` is the relative path to the configuration file you +where `path/to/file.yaml` is the relative path to the configuration file you wish to use. Then, any subsequent calls to child commands of blueapi will use this file. diff --git a/docs/how-to/run-cli.md b/docs/how-to/run-cli.md index 430ae816b..58d8f2960 100644 --- a/docs/how-to/run-cli.md +++ b/docs/how-to/run-cli.md @@ -1,49 +1,39 @@ -Control the Worker via the CLI ============================== +# Control the Worker via the CLI The worker comes with a minimal CLI client for basic control. It should be noted that this is a test/development/debugging tool and not meant for production! -.. seealso:: - You must have access to the ``blueapi`` command either via the container or pip installing. `./run-container` and `../tutorials/installation` -.. seealso:: - In a developer environment, the worker can also be run from vscode: `../tutorials/dev-run`. +See also: you must have access to the `blueapi` command, either via [the container](./run-container.md) or [a pip install](../tutorials/installation.md). +See also: in a developer environment, the worker can also be run [from vscode](../tutorials/dev-run.md). -Basic Introspection ------------------ +## Basic Introspection The worker can tell you which plans and devices are available via: - -.. code:: shell - +```shell blueapi controller plans blueapi controller devices +``` -By default, the CLI will talk to the worker via a message broker on ``tcp://localhost:61613``, +By default, the CLI will talk to the worker via a message broker on `tcp://localhost:61613`, but you can customize this. -..
code:: shell - +```shell blueapi controller -h my.host -p 61614 plans +``` -Running Plans ------------- +## Running Plans You can run a plan and pass arbitrary JSON parameters. - -.. code:: shell - +```shell # Run the sleep plan blueapi controller run sleep '{"time": 5.0}' # Run the count plan blueapi controller run count '{"detectors": ["current_det", "image_det"]}' +``` The command will block until the plan is finished and will forward error/status messages from the server. -.. seealso:: Full reference: `../reference/cli` +See also: [full CLI reference](../reference/cli.md) diff --git a/docs/how-to/run-container.md b/docs/how-to/run-container.md index 857f0caee..dd86a779a 100644 --- a/docs/how-to/run-container.md +++ b/docs/how-to/run-container.md @@ -1,19 +1,17 @@ -Run in a container ================== +# Run in a container Pre-built containers with blueapi and its dependencies already installed are available on GitHub Container Registry. -Starting the container --------------------- +## Starting the container To pull the container from GitHub Container Registry and run: - ``docker run ghcr.io/diamondlightsource/blueapi:main --version`` + `docker run ghcr.io/diamondlightsource/blueapi:main --version` -with ``podman``:: +with `podman`: - ``podman run ghcr.io/diamondlightsource/blueapi:main --version`` + `podman run ghcr.io/diamondlightsource/blueapi:main --version` -To get a released version, use a numbered release instead of ``main``. +To get a released version, use a numbered release instead of `main`. diff --git a/docs/how-to/write-plans.md b/docs/how-to/write-plans.md index 1520baf7b..d71798672 100644 --- a/docs/how-to/write-plans.md +++ b/docs/how-to/write-plans.md @@ -1,93 +1,75 @@ -Writing Bluesky plans for Blueapi ================================= +# Writing Bluesky plans for Blueapi -.. warning:: - Please read the following completely and carefully, as you risk losing data if plans are written incorrectly! +> **_WARNING:_** Please read the following completely and carefully, as you risk losing data if plans are written incorrectly!
For an introduction to bluesky plans and general forms/advice, see the [bluesky documentation](https://blueskyproject.io/bluesky/). Blueapi has some additional requirements, which are explained below. -Plans -~~~~~ +## Plans -While the bluesky project uses ``plan`` in a general sense to refer to any ``Iterable`` of ``Msg``\ s which may be run by the ``RunEngine``, blueapi distinguishes between a ``plan`` and a ``stub``. This distinction is made to allow for a subset of ``stub``\ s to be exposed and run, as ``stub``\ s may not make sense to run alone. +While the bluesky project uses `plan` in a general sense to refer to any `Iterable` of `Msg`s which may be run by the `RunEngine`, blueapi distinguishes between a `plan` and a `stub`. This distinction is made to allow for a subset of `stub`s to be exposed and run, as `stub`s may not make sense to run alone. -Generally, a ``plan`` includes at least one ``open_run`` and ``close_run`` and is a complete description of an experiment. If it does not, it is a ``stub``. This distinction is made in the bluesky core library between the ``plan``\ s and ``plan_stub``\ s modules. +Generally, a `plan` includes at least one `open_run` and `close_run` and is a complete description of an experiment. If it does not, it is a `stub`. This distinction is made in the bluesky core library between the `plans` and `plan_stubs` modules. -Type Annotations -^^^^^^^^^^^^^^^^ +### Type Annotations -To be imported into the blueapi context, ``plan``\ s and ``stub``\ s must be the in the form of a ``PlanGenerator``: any function that return a ``MsgGenerator`` (a python ``Generator`` that yields ``Msg``\ s). ``PlanGenerator`` and ``MsgGenerator`` types are available to import from ``dodal``. - -.. note:: - ``PlanGenerator``\ s must be annotated with the return type ``MsgGenerator`` to be added to the blueapi context. - -.. code:: python +To be imported into the blueapi context, `plan`s and `stub`s must be in the form of a `PlanGenerator`: any function that returns a `MsgGenerator` (a Python `Generator` that yields `Msg`s). `PlanGenerator` and `MsgGenerator` types are available to import from `dodal`. +> **_NOTE:_** `PlanGenerator`s must be annotated with the return type `MsgGenerator` to be added to the blueapi context. +```python def foo() -> MsgGenerator: # The minimum PlanGenerator acceptable to blueapi yield from {} +``` -.. note:: - ``PlanGenerator`` arguments must be annotated to enable blueapi to generate their schema +> **_NOTE:_** `PlanGenerator` arguments must be annotated to enable blueapi to generate their schema. **Input annotations should be as broad as possible**: the least specific implementation that is sufficient to accomplish the requirements of the plan. -For example, if a plan is written to drive a specific implementation of Movable, but never calls any methods on the device and only yields bluesky ``'set'`` Msgs, it can be generalised to instead use the base protocol ``Movable``. -.. code:: python +For example, if a plan is written to drive a specific implementation of Movable, but never calls any methods on the device and only yields bluesky `'set'` Msgs, it can be generalised to instead use the base protocol `Movable`. +```python def move_to_each_position(axis: Movable, positions: List[float]) -> MsgGenerator: # Originally written for SpecificImplementationMovable for position in positions: yield from abs_set(axis, position) +``` -Allowed Argument Types -^^^^^^^^^^^^^^^^^^^^^^ +### Allowed Argument Types -When added to the blueapi context, ``PlanGenerator``\ s are formalised into their schema- `a Pydantic BaseModel `__ with the expected argument types and their defaults. +When added to the blueapi context, `PlanGenerator`s are formalised into their schema: a Pydantic `BaseModel` with the expected argument types and their defaults.
-Therefore, ``PlanGenerator``\ s must only take as arguments `those types which are valid Pydantic fields `__ or Device types which implement ``BLUESKY_PROTOCOLS`` defined in dodal, which are fetched from the context at runtime. +Therefore, `PlanGenerator`s must only take as arguments types which are valid Pydantic fields, or device types which implement `BLUESKY_PROTOCOLS` defined in dodal, which are fetched from the context at runtime. -.. note:: +> **_NOTE:_** Allowed argument types for Pydantic BaseModels include the primitives, types that extend `BaseModel`, and `dict`s, `list`s and other `sequence`s of supported types. Blueapi will deserialise these types from JSON, so `dict`s should use `str` keys. - Allowed argument types for Pydantic BaseModels include the primitives, types that extend ``BaseModel`` and ``dict``\ s, ``list``\ s and other ``sequence``\ s of supported types. Blueapi will deserialise these types from JSON, so ``dict``\ s should use ``str`` keys. -Injecting defaults -^^^^^^^^^^^^^^^^^^ +### Injecting defaults Often when writing a plan, it is known which device the plan will mostly or always be run with, but at the time of writing the plan the device object has not been instantiated: dodal defines device factory functions, but these cannot be injected as default arguments to plans. -Dodal defines an ``inject`` function which bypasses the type checking of the constructed schemas, defering to the blueapi contexting fetching of the device when the plan is imported. This allows defaulting devices, so long as there is a device of that name in the context which conforms to the type annotation. -.. code:: python +Dodal defines an `inject` function which bypasses the type checking of the constructed schemas, deferring to the blueapi context's fetching of the device when the plan is imported. This allows devices to be used as defaults, so long as there is a device of that name in the context which conforms to the type annotation. +```python def touch_synchrotron(sync: Synchrotron = inject("synchrotron")) -> MsgGenerator: # There is only one Synchrotron device, so we know which one it will always be. # If there is no device named "synchrotron" in the blueapi context, it will raise an exception. sync.specific_function() yield from {} +``` -Metadata -^^^^^^^^ +### Metadata The bluesky event model allows for rich structured metadata to be attached to a scan. To enable this to be used consistently, blueapi encourages a standard form. -.. note:: - - Plans **should** include ``metadata`` as their final argument, if they do it **must** have the type Optional[Mapping[str, Any]], `and a default of None `__\, with the plan defaulting to an empty dict if passed ``None``. If the plan calls to a stub/plan which takes metadata, the plan **must** pass down its metadata, which may be a differently named argument. -.. code:: python +> **_NOTE:_** Plans **should** include `metadata` as their final argument; if they do, it **must** have the type `Optional[Mapping[str, Any]]` and a default of `None`, with the plan defaulting to an empty dict if passed `None`. If the plan calls a stub/plan which takes metadata, the plan **must** pass down its metadata, which may be a differently named argument. +```python def pass_metadata(x: Movable, metadata: Optional[Mapping[str, Any]] = None) -> MsgGenerator: yield from bp.count([x], md=metadata or {}) +``` -Docstrings -^^^^^^^^^^ +### Docstrings Blueapi plan schemas include the docstrings of imported Plans. **These should therefore explain as much about the scan as cannot be ascertained from its arguments and name**. This may include units of arguments (e.g. seconds or microseconds), its purpose in the function, the purpose of the plan etc. -.. code:: python - +```python def temp_pressure_snapshot( detectors: List[Readable], temperature: Movable = inject("sample_temperature"), @@ -114,26 +96,20 @@ Blueapi plan schemas include the docstrings of imported Plans.
**These
 """
     yield from move({temperature: target_temperature, pressure: target_pressure})
     yield from count(detectors, 1, metadata or {})
+```

-Decorators
-^^^^^^^^^^
-
-Dodal defines a decorator for configuring any ``ophyd-async`` devices- which will be the majority of devices at Diamond- to write to a common location.
-
-.. warning::
+### Decorators

-   **This is an absolute requirement to write data onto the Diamond Filesystem**.
+Dodal defines a decorator for configuring any `ophyd-async` devices (which will be the majority of devices at Diamond) to write to a common location.

-   This decorator must be used every time a new data collection is intended to begin. For an example, see below.
-
-.. code:: python
+> **_WARNING:_** **This is an absolute requirement to write data onto the Diamond Filesystem**. This decorator must be used every time a new data collection is intended to begin. For an example, see below.

+```python
 @attach_metadata
 def ophyd_async_snapshot(
     detectors: List[Readable],
     metadata: Optional[Mapping[str, Any]] = None,
-    ) -> MsgGenerator:
-    """
+) -> MsgGenerator:
+    """
     Configures a number of devices, which may be Ophyd-Async detectors and require knowledge of where to write their files, then takes a snapshot with them.

     Args:
@@ -141,15 +117,13 @@ Dodal defines a decorator for configuring any ``ophyd-async`` devices- which wil
     Returns:
         MsgGenerator: Plan
     Yields:
-        Iterator[MsgGenerator]: Bluesky messages
-    """
+        Iterator[MsgGenerator]: Bluesky messages
+    """
     yield from count(detectors, 1, metadata or {})

 def repeated_snapshot(
     detectors: List[Readable],
     metadata: Optional[Mapping[str, Any]] = None,
-    ) -> MsgGenerator:
-    """
+) -> MsgGenerator:
+    """
     Configures a number of devices, which may be Ophyd-Async detectors and require knowledge of where to write their files, then takes multiple snapshots with them.
Args:
@@ -157,8 +131,7 @@ Dodal defines a decorator for configuring any ``ophyd-async`` devices- which wil
     Returns:
         MsgGenerator: Plan
     Yields:
-        Iterator[MsgGenerator]: Bluesky messages
-    """
+        Iterator[MsgGenerator]: Bluesky messages
+    """
     @attach_metadata
     def inner_function():
         yield from count(detectors, 1, metadata or {})
@@ -166,23 +139,22 @@ Dodal defines a decorator for configuring any ``ophyd-async`` devices- which wil
     for _ in range(5):
         yield from inner_function()

+```

-Stubs
-~~~~~
+#### Stubs

 Some functionality in your plans may make sense to factor out to allow re-use. These pieces of functionality may or may not make sense outside of the context of a plan. Some will, such as nudging a motor, but others may not, such as waiting to consume data from the previous position, or opening a run without an equivalent closure.

-To enable blueapi to expose the stubs that it makes sense to, but not the others, blueapi will only expose a subset of ``MsgGenerator``\ s under the following conditions:
-
-| ``__init__.py`` in directory has ``__exports__``: List[str]: only
-  those named in ``__exports__``
-| ``__init__.py`` in directory has ``__all__``: List[str] but no
-  ``__exports__``: only those named in ``__all__``
+To expose only the stubs that make sense standalone, blueapi exposes a subset of `MsgGenerator`s under the following conditions:
+
+* If `__init__.py` in the directory has `__exports__: List[str]`: only those named in `__exports__`
+* If `__init__.py` in the directory has `__all__: List[str]` but no `__exports__`: only those named in `__all__`

-This allows other python packages (such as ``plans``) to access every function in ``__all__``, while only allowing a subset to be called from blueapi as standalone.
-
-.. code:: python
+This allows other python packages (such as `plans`) to access every function in `__all__`, while only allowing a subset to be called from blueapi as standalone.

+```python
 # Rehomes all of the beamline's devices.
May require to be run standalone from .package import rehome_devices # Awaits a standard callback from analysis. Should not be run standalone @@ -198,3 +170,4 @@ This allows other python packages (such as ``plans``) to access every function i __exports__ = [ "rehome_devices", ] +``` diff --git a/docs/index.md b/docs/index.md index 6bfac05b7..28eb29127 100644 --- a/docs/index.md +++ b/docs/index.md @@ -1,55 +1,60 @@ ---- -html_theme.sidebar_secondary.remove: true ---- +# How the documentation is structured ```{include} ../README.md :end-before: ::::{grid} 2 :gutter: 4 :::{grid-item-card} {material-regular}`directions_walk;2em` + ```{toctree} :maxdepth: 2 tutorials ``` + +++ Tutorials for installation and typical usage. New users start here. ::: :::{grid-item-card} {material-regular}`directions;2em` + ```{toctree} :maxdepth: 2 how-to ``` + +++ Practical step-by-step guides for the more experienced user. ::: :::{grid-item-card} {material-regular}`info;2em` + ```{toctree} :maxdepth: 2 explanations ``` + +++ Explanations of how it works and why it works that way. ::: :::{grid-item-card} {material-regular}`menu_book;2em` + ```{toctree} :maxdepth: 2 reference ``` + +++ Technical reference material including APIs and release notes. ::: diff --git a/docs/reference/api.md b/docs/reference/api.md index 02bf6568d..1d29fbcd7 100644 --- a/docs/reference/api.md +++ b/docs/reference/api.md @@ -1,17 +1,10 @@ # API -```{eval-rst} -.. automodule:: blueapi - - ``blueapi`` - ----------------------------------- -``` - This is the internal API reference for blueapi -```{eval-rst} -.. 
data:: blueapi.__version__
-    :type: str
-
-    Version number as calculated by https://github.com/pypa/setuptools_scm
+```python
+import blueapi
+
+print(blueapi.__version__)
+```
+
+Version number as calculated by https://github.com/pypa/setuptools_scm
diff --git a/docs/reference/cli.md b/docs/reference/cli.md
index 47dcc4b9b..59980ffaf 100644
--- a/docs/reference/cli.md
+++ b/docs/reference/cli.md
@@ -1,9 +1,5 @@
-Command-Line Interface
-======================
+# Command-Line Interface

 Full reference for the CLI:
-
-.. click:: blueapi.cli:main
-   :prog: blueapi
-   :show-nested:
diff --git a/docs/reference/messaging-spec.md b/docs/reference/messaging-spec.md
index ef3b13687..40bb8544b 100644
--- a/docs/reference/messaging-spec.md
+++ b/docs/reference/messaging-spec.md
@@ -1,10 +1,7 @@
-Messaging Specification
-=======================
+# Messaging Specification

 The Blueapi service publishes Bluesky documents and other events to the message bus, allowing subscribers to keep track of the status of plans, as well as other status changes. This page documents the channels to which clients can subscribe to receive these messages and their structure.

-.. literalinclude:: ./asyncapi.yaml
-   :language: yaml
+The full message schema is in the [asyncapi specification](./asyncapi.yaml).
diff --git a/docs/reference/rest-spec.md b/docs/reference/rest-spec.md
index ded888033..8db73b848 100644
--- a/docs/reference/rest-spec.md
+++ b/docs/reference/rest-spec.md
@@ -1,8 +1,6 @@
-REST Specification
-==================
+# REST Specification

 Blueapi runs a FastAPI server through which the blueapi worker can be
-interacted with. This page documents all possible endpoints of this
-server.
+interacted with. The [openapi specification](./openapi.yaml) documents all possible endpoints of this server.

-.. 
openapi:: ./openapi.yaml
diff --git a/docs/tutorials/dev-run.md b/docs/tutorials/dev-run.md
index c9db6b5b5..8d60c0dd6 100644
--- a/docs/tutorials/dev-run.md
+++ b/docs/tutorials/dev-run.md
@@ -1,32 +1,23 @@
-Run/Debug in a Developer Environment
-====================================
+# Run/Debug in a Developer Environment

 Assuming you have set up a developer environment, you can run a development version of the bluesky worker.

-.. seealso:: https://diamondlightsource.github.io/python-copier-template/main/how-to/dev-install.html
-
-Start Bluesky Worker
---------------------
+See also: [developer environment setup](https://diamondlightsource.github.io/python-copier-template/main/how-to/dev-install.html)
+
+## Start Bluesky Worker

 Ensure you are inside your virtual environment:
-
-.. code:: shell
-
+```shell
 source venv/bin/activate
+```

-You will need to follow the instructions for setting up ActiveMQ as in `../how-to/run-cli`.
+You will need to follow the instructions for setting up ActiveMQ as in the [run cli instructions](../how-to/run-cli.md).

-The worker will be available from the command line (``blueapi serve``), but can be started from vscode with additional
+The worker will be available from the command line (`blueapi serve`), but can be started from vscode with additional
 debugging capabilities.

 1. Navigate to "Run and Debug" in the left hand menu.
 2. Select "Worker Service" from the debug configuration.
 3. Click the green "Run" button.

-.. figure:: ../images/debug-vscode.png
-   :align: center
-
-   debug in vscode
-
+![debug in vscode](../images/debug-vscode.png)
diff --git a/docs/tutorials/installation.md b/docs/tutorials/installation.md
index 903984bf2..effcec222 100644
--- a/docs/tutorials/installation.md
+++ b/docs/tutorials/installation.md
@@ -4,39 +4,34 @@ You will need python 3.8 or later. 
You can check your version of python by typing into a terminal:
-
-```
-$ python3 --version
+```shell
+python3 --version
 ```

 ## Create a virtual environment

 It is recommended that you install into a “virtual environment” so this installation will not interfere with any existing Python software:
-
-```
-$ python3 -m venv /path/to/venv
-$ source /path/to/venv/bin/activate
+```shell
+python3 -m venv /path/to/venv
+source /path/to/venv/bin/activate
 ```

 ## Installing the library

 You can now use `pip` to install the library and its dependencies:
-
-```
-$ python3 -m pip install blueapi
+```shell
+python3 -m pip install blueapi
 ```

 If you require a feature that is not currently released you can also install from github:
-
-```
-$ python3 -m pip install git+https://github.com/DiamondLightSource/blueapi.git
+```shell
+python3 -m pip install git+https://github.com/DiamondLightSource/blueapi.git
 ```

 The library should now be installed and the commandline interface on your path. You can check the version that has been installed by typing:
-
-```
-$ blueapi --version
+```shell
+blueapi --version
 ```
diff --git a/docs/tutorials/quickstart.md b/docs/tutorials/quickstart.md
index f8fd71a0f..1eb1c42b8 100644
--- a/docs/tutorials/quickstart.md
+++ b/docs/tutorials/quickstart.md
@@ -1,60 +1,44 @@
-Quickstart Guide
-================
-
-.. seealso:: Assumes you have completed `./installation`.
+# Quickstart Guide
+
+This guide assumes you have completed the [installation tutorial](./installation.md).

 Blueapi acts as a worker that can run bluesky plans against devices for a specific laboratory setup. It can control devices to collect data and export events to tell downstream services about the data it has collected.

-Start ActiveMQ
---------------
+## Start ActiveMQ

 The worker requires a running instance of ActiveMQ; the simplest way to start it is to run it via a container:

-.. tab-set::
-
-    .. tab-item:: Docker
-
-        .. code:: shell
-
-            docker run -it --rm --net host rmohr/activemq:5.15.9-alpine
-
-    .. tab-item:: Podman
-
-        .. 
code:: shell
+Using Docker:
+
+```shell
+docker run -it --rm --net host rmohr/activemq:5.15.9-alpine
+```

-            podman run -it --rm --net host rmohr/activemq:5.15.9-alpine
+Or using Podman:
+
+```shell
+podman run -it --rm --net host rmohr/activemq:5.15.9-alpine
+```

-
-Start Worker
-------------
+## Start Worker

 To start the worker:

-.. code:: shell
-
+```shell
 blueapi serve
+```

 The worker can also be started using a custom config file:

-.. code:: shell
-
+```shell
 blueapi --config path/to/file serve
+```

-
-Test that the Worker is Running
--------------------------------
+## Test that the Worker is Running

 Blueapi comes with a CLI so that you can query and control the worker from the terminal.

-.. code:: shell
-
+```shell
 blueapi controller plans
+```

 The above command should display all plans the worker is capable of running.

-.. seealso:: Full CLI reference: `../reference/cli`
+See also the [full CLI reference](../reference/cli.md).
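The `__exports__`/`__all__` rule described in the Stubs section above can be sketched as follows. This is illustrative only, with hypothetical module and plan names, and is not blueapi's actual implementation:

```python
import types
from typing import List

def exposed_names(module: types.ModuleType) -> List[str]:
    # Prefer __exports__ when present; otherwise fall back to __all__.
    exports = getattr(module, "__exports__", None)
    if exports is not None:
        return list(exports)
    return list(getattr(module, "__all__", []))

# A stand-in for a plans package defining both attributes:
pkg = types.ModuleType("plans")
pkg.__all__ = ["rehome_devices", "await_callback"]
pkg.__exports__ = ["rehome_devices"]

# Only the __exports__ members are exposed as standalone stubs:
only_exports = exposed_names(pkg)

# Without __exports__, everything in __all__ is exposed:
pkg2 = types.ModuleType("plans2")
pkg2.__all__ = ["nudge_motor"]
all_fallback = exposed_names(pkg2)
```

Other Python packages importing `plans` still see everything in `__all__`; only the blueapi-facing subset is narrowed by `__exports__`.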