# Updated README.md to refer to 23.05 instead of 23.04 #159

Open · wants to merge 1 commit into base: `main`
10 changes: 5 additions & 5 deletions README.md
@@ -26,7 +26,7 @@
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-->

-**NOTE: Fastertransformer backend is currently undergoing restructuring. Build instructions are only tested with Triton container versions <= `23.04`**.
+**NOTE: Fastertransformer backend is currently undergoing restructuring. Build instructions are only tested with Triton container versions <= `23.05`**.

# FasterTransformer Backend

@@ -105,7 +105,7 @@ For the issue of running the model with multi-gpu and multi-node, FasterTransfor
git clone https://github.com/triton-inference-server/fastertransformer_backend.git
cd fastertransformer_backend
export WORKSPACE=$(pwd)
-export CONTAINER_VERSION=23.04
+export CONTAINER_VERSION=23.05
export TRITON_DOCKER_IMAGE=triton_with_ft:${CONTAINER_VERSION}
```
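Before building, the exported variables can be sanity-checked with a quick echo (purely illustrative, not part of this PR's diff):

```shell
# Illustrative check: confirm the image tag resolves as expected
export CONTAINER_VERSION=23.05
export TRITON_DOCKER_IMAGE=triton_with_ft:${CONTAINER_VERSION}
echo "Image tag: ${TRITON_DOCKER_IMAGE}"   # prints "Image tag: triton_with_ft:23.05"
```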

@@ -117,14 +117,14 @@ FasterTransformer backend, thus the users must prepare own docker image either b
Note that `--is-multistage-build` is optional. It installs only the minimal dependencies needed for fastertransformer_backend to run.
```bash
# Create your own Triton container. You can skip this step (done in tritonserver/server)
-python3 compose.py --backend pytorch --container-version 23.04 --output-name tritonserver_pytorch_only
+python3 compose.py --backend pytorch --container-version 23.05 --output-name tritonserver_pytorch_only
# In tritonserver/fastertransformer_backend. This will overwrite the current Dockerfile
python3 docker/create_dockerfile_and_build.py --base-image tritonserver_pytorch_only --image-name tritonserver_with_ft --is-multistage-build

```
Alternatively, you can simply run
```bash
-python3 create_dockerfile_and_build.py --triton-version 23.04
+python3 create_dockerfile_and_build.py --triton-version 23.05
```
to generate a fastertransformer backend image, as in option 2.
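Once an image is built, it can be launched along these lines (a sketch only: the image tag, ports, and `all_models` model-repository path are assumptions, not specified by this PR):

```shell
# Hypothetical launch of the built image; adjust tag, ports, and model path
docker run --gpus=all --rm --shm-size=4g \
  -p 8000:8000 -p 8001:8001 -p 8002:8002 \
  -v $(pwd)/all_models:/models \
  triton_with_ft:23.05 \
  tritonserver --model-repository=/models
```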

@@ -307,4 +307,4 @@ Sep 2021

Apr 2021
- **Release the FasterTransformer backend 1.0**.
-- Support Multi-GPU on GPT.
+- Support Multi-GPU on GPT.