Commit 9539dcb

updated for dynamic env

AlexPatrie committed Jan 6, 2025
1 parent bb799b2 commit 9539dcb
Showing 9 changed files with 197 additions and 218 deletions.
102 changes: 62 additions & 40 deletions Dockerfile
@@ -1,4 +1,5 @@
FROM condaforge/miniforge3:latest
# FROM continuumio/miniconda3:main

LABEL org.opencontainers.image.title="bio-compose-server-base" \
org.opencontainers.image.description="Base Docker image for BioCompose REST API management, job processing, and datastorage with MongoDB, ensuring scalable and robust performance." \
@@ -15,53 +16,74 @@ COPY assets/docker/config/.pys_usercfg.ini /Pysces/.pys_usercfg.ini
COPY assets/docker/config/.pys_usercfg.ini /root/Pysces/.pys_usercfg.ini
COPY tests/test_fixtures /test_fixtures

- WORKDIR /app
+ WORKDIR /bio-compose-server

# copy container libs
- COPY ./gateway /app/gateway
- COPY ./shared /app/shared
- COPY ./worker /app/worker
+ COPY ./gateway /bio-compose-server/gateway
+ COPY ./shared /bio-compose-server/shared
+ COPY ./worker /bio-compose-server/worker

# copy env configs
- COPY ./environment.yml /app/environment.yml
- COPY ./pyproject.toml /app/pyproject.toml
+ COPY ./environment.yml /bio-compose-server/environment.yml
+ COPY ./pyproject.toml /bio-compose-server/pyproject.toml
+ RUN echo "Server" > /bio-compose-server/README.md


# install deps
RUN mkdir config \
- && apt-get update \
- && apt install -y \
- meson \
- g++ \
- gfortran \
- libblas-dev \
- liblapack-dev \
- libgfortran5 \
- libhdf5-dev \
- libhdf5-serial-dev \
- libatlas-base-dev \
- cmake \
- make \
- git \
- build-essential \
- python3-dev \
- swig \
- libc6-dev \
- libx11-dev \
- libc6 \
- libgl1-mesa-dev \
- pkg-config \
- curl \
- tar \
- libgl1-mesa-glx \
- libice6 \
- libsm6 \
- gnupg \
- nano \
- libstdc++6 \
- && conda update -n base -c conda-forge conda \
- && conda env create -f environment.yml -y
+ && apt-get update \
+ && apt-get install -y \
+ meson \
+ g++ \
+ gfortran \
+ libblas-dev \
+ liblapack-dev \
+ libgfortran5 \
+ libhdf5-dev \
+ libhdf5-serial-dev \
+ libatlas-base-dev \
+ cmake \
+ make \
+ git \
+ build-essential \
+ python3-dev \
+ swig \
+ libc6-dev \
+ libx11-dev \
+ libc6 \
+ libgl1-mesa-dev \
+ pkg-config \
+ curl \
+ tar \
+ libgl1-mesa-glx \
+ libice6 \
+ libsm6 \
+ gnupg \
+ libstdc++6
+
+ RUN conda update -n base -c conda-forge conda \
+ && conda env create -f /bio-compose-server/environment.yml -y \
+ && echo "conda activate server" >> /.bashrc
+
+ RUN conda run -n server poetry config virtualenvs.create false \
+ && conda run -n server poetry lock \
+ && conda run -n server poetry install \
+ && conda run -n server poetry run pip install biosimulator-processes[cobra,copasi,smoldyn] \
+ && conda install -n server pymem3dg -y


# && conda run -n server poetry config virtualenvs.create false \
# && conda run -n server poetry lock \
# && conda run -n server poetry install \
# && poetry run pip install

# && conda create -n server python=3.10 -y \
# && conda run -n server pip install --upgrade pip
# && conda run -n server pip install -e .

# && conda config --env --add channels conda-forge \
# && conda config --set channel_priority strict \
# && conda install readdy \
# && conda env create -f environment.yml -y
# && conda install -c conda-forge pymem3dg -y \
# && echo 'export LD_LIBRARY_PATH=$CONDA_PREFIX/lib:\$LD_LIBRARY_PATH' >> ~/.bashrc

# expose for gateway
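
In sum, the rebuilt Dockerfile keeps system packages on apt, creates a single conda environment named `server` from environment.yml, and runs poetry inside that environment with `virtualenvs.create false`, so poetry installs into the conda env rather than a private virtualenv. A minimal smoke test of that layout — the script name and package list here are illustrative, not part of the repo:

```python
# check_env.py (hypothetical) — run inside the image with:
#   conda run -n server python check_env.py
import importlib

# A few packages the build is expected to provide via poetry/pip;
# adjust the list to match pyproject.toml.
required = ["fastapi", "uvicorn", "pymongo", "process_bigraph"]

for name in required:
    try:
        importlib.import_module(name)
        print(f"ok: {name}")
    except ImportError as exc:
        raise SystemExit(f"missing dependency: {name} ({exc})")
```
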
56 changes: 1 addition & 55 deletions README.md
@@ -1,55 +1 @@
- ![Deploy API](https://github.com/biosimulators/bio-check/actions/workflows/deploy-gateway.yml/badge.svg)
- ![Deploy Worker](https://github.com/biosimulators/bio-check/actions/workflows/deploy-worker.yml/badge.svg)
-
- # BioCompose Server: A Biological Simulation Verification Service
- ### __This service utilizes separate containers for REST API management, job processing, and datastorage with MongoDB, ensuring scalable and robust performance.__
-
- ## **The REST API can be accessed via Swagger UI here: [https://compose.biosimulations.org/docs](https://compose.biosimulations.org/docs)
-
- ## **For Developers:**
-
- ### This application ("BioCompose") uses a microservices architecture which presents the following libraries:
-
- - `gateway`: This library handles all requests including saving uploaded files, pending job creation, fetching results, and contains the user-facing endpoints.
- - `shared`: A library of common objects/pointers used by both `gateway` and `worker`.
- - `worker`: This library handles all job processing tasks for verification services such as job status adjustment, job retrieval, and more.
-
- ### The simulators used by this application consist of multiple python language bindings of C/C++ libraries. Given this fact, is it helpful to be aware of the dependency network required by each simulator. See the following documentation for simulators used in this application:
-
- - [AMICI](https://amici.readthedocs.io/en/latest/python_installation.html)
- - [COPASI(basico)](https://basico.readthedocs.io/en/latest/quickstart/get-started.html#installation)
- - [PySCes](https://pyscesdocs.readthedocs.io/en/latest/userguide_doc.html#installing-and-configuring)
- - [Tellurium](https://tellurium.readthedocs.io/en/latest/installation.html)
- - [Simulator-specific implementations of the Biosimulators-Utils interface](https://docs.biosimulations.org/users/biosimulators-packages)
- - [Smoldyn](https://www.smoldyn.org/SmoldynManual.pdf)
- - *(Coming soon:)* [ReaDDy](https://readdy.github.io/installation.html)
-
-
- ### Dependency management scopes are handled as follows:
-
- #### _*Locally/Dev*_:
- - Anaconda via `environment.yml` - the closest to local development at root level which mimics what actually happens in the containers (conda deps tend to break more frequently than poetry.)
-
- _*Remotely in microservice containers*_:
- - Remote microservice container management is handled by `conda` via `environment.yml` files for the respective containers.
-
- ### The installation process is outlined as follows:
-
- 1. `git clone https://github.com/biosimulators/bio-compose-server.git`
- 2. `cd bio-compose-server/shared`
- 3. `mv .env_template .env`
- 4. Enter the following fields into the `.env` file:
-
- MONGO_URI=<uri of your mongo instance. In this case we use the standard mongodb image with the app name bio-check>
- GOOGLE_APPLICATION_CREDENTIALS=<path to your gcloud credentials .json file. Contact us for access>
- BUCKET_NAME=bio-check-requests-1 # name of the bucket used in this app
- 5. `cd ..`
- 6. Pull and run the latest version of Mongo from the Docker Hub. (`docker run -d -it mongo:latest` or similar.)
- 7. `sudo chmod +x ./assets/dev/scripts/install.sh`
- 8. `./install.sh`
-
-
- ## Notes:
- - This application currently uses MongoDB as the database store in which jobs are read/written. Database access is given to both the `api` and `worker` libraries. Such database access is
- executed/implemented with the use of a `Supervisor` singleton.
-
+ #BioCompose Server
6 changes: 4 additions & 2 deletions docker-compose.yaml
@@ -8,6 +8,7 @@ services:
- appnetwork

gateway:
+ platform: linux/amd64
build:
context: .
dockerfile: Dockerfile
@@ -17,19 +18,20 @@
- "3001:3001"
networks:
- appnetwork
- command: poetry run uvicorn gateway.main:app --host 0.0.0.0 --port 3001 --reload
+ command: conda run -n server uvicorn gateway.main:app --host 0.0.0.0 --port 3001 --reload
depends_on:
- mongodb

compose-worker:
+ platform: linux/amd64
build:
context: .
dockerfile: Dockerfile
image: ghcr.io/biosimulators/bio-compose-server-worker:0.0.1
container_name: worker
networks:
- appnetwork
- command: poetry run python worker/main.py
+ command: conda run -n server python worker/main.py
depends_on:
- mongodb

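
Both services now start through `conda run -n server ...`, which executes the command inside the conda environment without needing `conda activate` in a non-interactive shell — the same pattern the Dockerfile bakes in. A quick readiness check after `docker compose up -d`; this sketch assumes only the 3001 port mapping above and FastAPI's default `/docs` route:

```python
# Poll the gateway's Swagger UI until the container answers (sketch).
import time

import requests

URL = "http://localhost:3001/docs"  # port mapped in docker-compose.yaml

for _ in range(30):
    try:
        if requests.get(URL, timeout=2).status_code == 200:
            print("gateway is up")
            break
    except requests.ConnectionError:
        pass  # container still starting
    time.sleep(2)
else:
    raise SystemExit("gateway did not come up in time")
```
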
63 changes: 33 additions & 30 deletions environment.yml
@@ -11,33 +11,36 @@ dependencies:
- poetry
# - readdy
# - pymem3dg
- - pip:
- - uvicorn
- - fastapi
- - mypy
- - pytest
- - pip-autoremove
- - networkx
- - rustworkx
- - qiskit
- - qiskit-ibm-runtime
- - qiskit-nature
- - pylatexenc
- - numpy
- - pandas
- - process-bigraph==0.0.22
- - copasi-basico
- - tellurium
- - python-libsbml
- - smoldyn
- - requests-toolbelt
- - python-dotenv
- - google-cloud-storage
- - python-multipart
- - toml
- - typing-extensions
- - pymongo
- - pydantic
- - pydantic-settings
- - chardet
- - pyyaml
+ # - pip:
+ # - google-cloud-storage
+ # - typing-extensions
+ # - uvicorn
+ # - fastapi
+ # - mypy
+ # - pytest
+ # - pip-autoremove
+ # - networkx
+ # - rustworkx
+ # - qiskit
+ # - qiskit-ibm-runtime
+ # - qiskit-nature
+ # - pylatexenc
+ # - numpy
+ # - pandas
+ # - process-bigraph==0.0.22
+ # - copasi-basico
+ # - tellurium
+ # - python-libsbml
+ # - smoldyn
+ # - requests-toolbelt
+ # - python-dotenv
+ # - google-cloud-storage
+ # - python-multipart
+ # - toml
+ # - typing-extensions
+ # - pymongo
+ # - pydantic
+ # - pydantic-settings
+ # - chardet
+ # - pyyaml
+ # - biosimulator-processes
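
With the pip block commented out, environment.yml now pins only the conda-level pieces (python, poetry, and conda-only packages such as pymem3dg), while the Python dependencies are resolved by poetry from pyproject.toml during the image build. Since the Dockerfile disables poetry's virtualenvs, both tiers should resolve to the same interpreter; a short sketch for confirming that inside the container, assuming fastapi made it into pyproject.toml:

```python
# Run with: conda run -n server python verify_prefix.py  (hypothetical script)
import sys

import fastapi

print(sys.prefix)        # expect the conda 'server' env prefix
print(fastapi.__file__)  # should live under that same prefix
```
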
44 changes: 32 additions & 12 deletions gateway/main.py
@@ -21,10 +21,15 @@
DbClientResponse,
AgentParameters,
BigraphRegistryAddresses,
- IncompleteJob
+ IncompleteJob,
+ DB_TYPE,
+ DB_NAME,
+ BUCKET_NAME,
+ JobStatus,
+ DatabaseCollections
)
- from shared_api import MongoDbConnector, DB_NAME, DB_TYPE, BUCKET_NAME, JobStatus, DatabaseCollections, file_upload_prefix
- from io_api import write_uploaded_file, download_file_from_bucket
+ from shared.database import MongoDbConnector
+ from shared.io import write_uploaded_file, download_file_from_bucket
from shared.log_config import setup_logging


@@ -509,7 +514,7 @@ async def generate_simularium_file(
):
job_id = "files-generate-simularium-file" + str(uuid.uuid4())
_time = db_connector.timestamp()
- upload_prefix, bucket_prefix = file_upload_prefix(job_id)
+ # upload_prefix, bucket_prefix = file_upload_prefix(job_id)
uploaded_file_location = await write_uploaded_file(job_id=job_id, uploaded_file=uploaded_file, bucket_name=BUCKET_NAME, extension='.txt')

# new simularium job in db
@@ -542,21 +547,36 @@ async def generate_simularium_file(
# raise HTTPException(status_code=404, detail=f"A simularium file cannot be parsed from your input. Please check your input file and refer to the simulariumio documentation for more details.")


+ # @app.get(
+ # "/get-process-bigraph-addresses",
+ # operation_id="get-process-bigraph-addresses",
+ # response_model=BigraphRegistryAddresses,
+ # tags=["Composition"],
+ # summary="Get process bigraph implementation addresses for composition specifications.")
+ # async def get_process_bigraph_addresses() -> BigraphRegistryAddresses:
+ # registry = await db_connector.read(collection_name="bigraph_registry", version="latest")
+ # if registry is not None:
+ # addresses = registry.get('registered_addresses')
+ # version = registry.get('version')
+ #
+ # return BigraphRegistryAddresses(registered_addresses=addresses, version=version)
+ # else:
+ # raise HTTPException(status_code=500, detail="Addresses not found.")


@app.get(
"/get-process-bigraph-addresses",
operation_id="get-process-bigraph-addresses",
response_model=BigraphRegistryAddresses,
tags=["Composition"],
summary="Get process bigraph implementation addresses for composition specifications.")
async def get_process_bigraph_addresses() -> BigraphRegistryAddresses:
- registry = await db_connector.read(collection_name="bigraph_registry", version="latest")
- if registry is not None:
- addresses = registry.get('registered_addresses')
- version = registry.get('version')
-
- return BigraphRegistryAddresses(registered_addresses=addresses, version=version)
- else:
- raise HTTPException(status_code=500, detail="Addresses not found.")
+ from bsp import app_registrar
+ addresses = list(app_registrar.core.process_registry.registry.keys())
+ version = "latest"
+ return BigraphRegistryAddresses(registered_addresses=addresses, version=version)
+ # else:
+ # raise HTTPException(status_code=500, detail="Addresses not found.")


@app.get(
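
The substantive change in this file: `get-process-bigraph-addresses` no longer reads a versioned `bigraph_registry` document from Mongo — it now pulls the address list from the in-process registry that `bsp` populates at import time and pins the reported version to `"latest"`. The lookup in isolation, assuming `bsp` exposes `app_registrar` exactly as imported above:

```python
# Standalone sketch of the new registry lookup — no database round-trip.
from bsp import app_registrar

# Process implementations register themselves when bsp is imported, so the
# registry keys are the bigraph addresses available for compositions.
addresses = sorted(app_registrar.core.process_registry.registry.keys())
print(addresses)
```

Since Python caches imports, the per-request `from bsp import app_registrar` in the endpoint body only pays the registration cost on the first call.
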
(The remaining 4 of the 9 changed files are not shown.)
