
Commit

update README commit tags
tstescoTT committed Dec 31, 2024
1 parent 5ab1816 commit 0e5b67a
Showing 1 changed file with 1 addition and 1 deletion.
vllm-tt-metal-llama3-70b/README.md (1 addition, 1 deletion)
@@ -36,7 +36,7 @@ docker run \
--volume ${PERSISTENT_VOLUME?ERROR env var PERSISTENT_VOLUME must be set}:/home/user/cache_root:rw \
--shm-size 32G \
--publish 7000:7000 \
-ghcr.io/tenstorrent/tt-inference-server/tt-metal-llama3-70b-src-base-vllm:v0.0.1-tt-metal-3ef683762eaa-953161188c50
+ghcr.io/tenstorrent/tt-inference-server/tt-metal-llama3-70b-src-base-vllm:v0.0.1-tt-metal-v0.54.0-rc2-953161188c50
```

By default the Docker container will start running the entrypoint command wrapped in `src/run_vllm_api_server.py`.
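For context, the README section being edited publishes port 7000 and starts the server through `src/run_vllm_api_server.py`. As a minimal sketch, assuming that wrapper exposes vLLM's standard OpenAI-compatible endpoints on the published port (the model name below is a placeholder, not taken from this commit), a completion request might look like:

```bash
# Hypothetical example: query the served model once the container is running.
# Assumes the entrypoint exposes vLLM's OpenAI-compatible /v1/completions route
# on the published port 7000; the model name is a placeholder.
curl http://localhost:7000/v1/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "meta-llama/Llama-3.1-70B-Instruct",
        "prompt": "Hello, world!",
        "max_tokens": 32
      }'
```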
