Remove vLLM test dependency
vLLM can currently only be run on a GPU (unless you want to go to extreme
lengths to make it work on CPU), so we cannot run its tests in CI.
We therefore separate the test dependencies into "with GPU" and "without
GPU" (a subset of "with GPU") sets that the user has to pick manually.
rlouf committed Feb 17, 2025
1 parent a7096e1 commit 63cf9ee
Showing 2 changed files with 30 additions and 16 deletions.
44 changes: 29 additions & 15 deletions docs/community/contribute.md
@@ -18,7 +18,7 @@ Note that the [issue tracker][issues] is only intended for actionable items. In

First, [fork the repository on GitHub](https://github.com/dottxt-ai/outlines/fork) and clone the fork locally:

```shell
git clone git@github.com:YourUserName/outlines.git
cd outlines
```
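Optionally, you can also register the upstream repository as a remote so you can pull in its changes later; this is a common fork workflow rather than a step required by this guide:

```shell
git remote add upstream https://github.com/dottxt-ai/outlines.git
git fetch upstream
```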
@@ -27,35 +27,49 @@ Create a new virtual environment:

*If you are using `uv`*:

```shell
uv venv
source .venv/bin/activate
alias pip="uv pip" # ... or just remember to prepend any pip command with uv in the rest of this guide
```

*If you are using `venv`*:

```shell
python -m venv .venv
source .venv/bin/activate
```

*If you are using `conda`*:

```shell
conda env create -f environment.yml
```
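Remember to activate the environment `conda` just created; the name below is only a placeholder, use whatever name `environment.yml` defines:

```shell
conda activate outlines  # replace "outlines" with the environment name from environment.yml
```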

Then install the dependencies in editable mode, and install the `pre-commit` hooks:

```shell
pip install -e ".[test]"
pre-commit install
```
If you own a GPU and want to run the vLLM tests, you will have to run:

```shell
pip install -e ".[test-gpu]"
```

instead.
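If you are unsure whether your machine exposes a usable GPU, a quick sanity check is the one-liner below; it assumes PyTorch is installed in your environment, which the GPU test dependencies pull in anyway:

```shell
python -c "import torch; print(torch.cuda.is_available())"
```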

Outlines provides optional dependencies for different supported backends, which you can install with

```shell
pip install ".[vllm]"
```

@@ -85,13 +99,13 @@ You will not have access to a GPU, but you'll be able to make basic contribution

Run the tests:

```shell
pytest
```
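While iterating on a change you can restrict pytest to a subset of the suite; the keyword expression below is only an example, adjust it to the tests you are touching:

```shell
pytest -x -k "vllm"  # stop at the first failure and only run tests whose names match "vllm"
```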

And run the code style checks:

```shell
pre-commit run --all-files
```
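You can also run a single hook rather than all of them; the hook id below is only an illustration and has to match an id defined in the repository's `.pre-commit-config.yaml`:

```shell
pre-commit run ruff --all-files
```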

@@ -101,7 +115,7 @@ Outlines uses [asv](https://asv.readthedocs.io) for automated benchmark testing.

You can run the benchmark test suite locally with the following command:

```shell
asv run --config benchmarks/asv.conf.json
```

@@ -112,19 +126,19 @@ Caveats:

#### Run a specific test:

```shell
asv run --config benchmarks/asv.conf.json -b bench_json_schema.JsonSchemaBenchmark.time_json_schema_to_fsm
```

#### Profile a specific test:

```shell
asv run --config benchmarks/asv.conf.json --profile -b bench_json_schema.JsonSchemaBenchmark.time_json_schema_to_fsm
```

#### Compare to `origin/main`

```shell
git fetch origin
asv continuous origin/main HEAD --config benchmarks/asv.conf.json
```
@@ -140,13 +154,13 @@

To work on the *documentation* you will need to install the related dependencies:

```shell
pip install -r requirements-doc.txt
```

To build the documentation and serve it locally, run the following command in the repository's root folder:

```shell
mkdocs serve
```

@@ -157,7 +171,7 @@ It will be updated every time you make a change.
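If you only want to verify that the documentation builds, for example before pushing, you can build it without serving it; the `--strict` flag, which turns warnings into errors, is optional:

```shell
mkdocs build --strict
```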

Create a new branch on your fork, commit and push the changes:

```shell
git checkout -b new-branch
git add .
git commit -m "Changes I made"
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -69,12 +69,12 @@ test = [
"huggingface_hub",
"openai>=1.0.0",
"datasets",
"vllm; sys_platform == 'linux'",
"transformers",
"pillow",
"exllamav2",
"jax",
]
+test-gpu=["outlines[test]", "vllm; sys_platform == 'linux'"]
serve = [
"vllm>=0.3.0",
"uvicorn",
