
Commit

Merge branch 'main' into patch-1

MiNeves00 authored Nov 12, 2024
2 parents d95c337 + 3834975 commit b324888
Showing 198 changed files with 14,371 additions and 17,723 deletions.
8 changes: 4 additions & 4 deletions .gitignore
@@ -53,11 +53,11 @@ node_modules
 # Environments
 env
 env3
-.env
+.env*
 .env*.local
-.venv
-env/
-venv/
+.venv*
+env*/
+venv*/
 ENV/
 env.bak/
 venv.bak/
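The broadened patterns above turn exact names into globs, so ad-hoc variants such as `.env.production` or `.venv-py311` are ignored too. A quick illustrative sketch (gitignore matching is approximated here with Python's `fnmatch`, which is close enough for these simple patterns; the directory-only trailing slashes of `env*/` and `venv*/` are omitted, and the variant names are hypothetical):

```python
from fnmatch import fnmatch

# Hypothetical environment-file/directory names checked against the new globs.
cases = [
    (".env.production", ".env*"),
    (".venv-py311", ".venv*"),
    ("venv310", "venv*"),
]
for name, pattern in cases:
    # fnmatch approximates gitignore's glob matching for simple patterns
    print(f"{name!r} matches {pattern!r}: {fnmatch(name, pattern)}")
```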
36 changes: 15 additions & 21 deletions .pre-commit-config.yaml
@@ -3,43 +3,37 @@ repos:
     rev: v4.4.0
     hooks:
       - id: trailing-whitespace
-        files: llmstudio/
-        exclude: ^llmstudio/ui/
+        files: libs/
       - id: end-of-file-fixer
-        files: llmstudio/
-        exclude: ^llmstudio/ui/
+        files: libs/
       - id: check-yaml
-        files: llmstudio/
-        exclude: ^llmstudio/ui/
+        files: libs/
       - id: check-added-large-files
-        files: llmstudio/
-        exclude: ^llmstudio/ui/
+        files: libs/

   - repo: https://github.com/psf/black
     rev: 22.12.0
     hooks:
       - id: black
-        files: llmstudio/
-        exclude: ^llmstudio/ui/
+        files: libs/

   - repo: https://github.com/pycqa/isort
     rev: 5.12.0
     hooks:
       - id: isort
-        files: llmstudio/
-        exclude: ^llmstudio/ui/
+        files: libs/
         args:
           - "--profile=black"

-  - repo: https://github.com/myint/autoflake
-    rev: v1.4
+  - repo: https://github.com/PyCQA/autoflake
+    rev: v2.3.1
     hooks:
       - id: autoflake
-        files: llmstudio/
-        exclude: ^llmstudio/ui/
+        files: libs/
+        exclude: 'libs/core/llmstudio_core/providers/__init__.py|libs/llmstudio/llmstudio/providers/__init__.py'
         args:
-          # - --remove-all-unused-imports
-          - --recursive
-          - --remove-unused-variables
-          - --in-place
-        additional_dependencies: [flake8]
+          - --remove-all-unused-imports
+          - --recursive
+          - --remove-unused-variables
+          - --in-place
+        additional_dependencies: [flake8]
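pre-commit treats an `exclude` value as a Python regular expression searched against each candidate file path, so the alternation added above skips exactly the two provider `__init__.py` files while the rest of `libs/` is still processed. A small sketch of that behavior (the second, non-excluded path is hypothetical, for illustration only):

```python
import re

# The autoflake hook's new exclude value: two alternated literal paths,
# interpreted by pre-commit as a regular expression.
EXCLUDE = re.compile(
    "libs/core/llmstudio_core/providers/__init__.py"
    "|libs/llmstudio/llmstudio/providers/__init__.py"
)

for path in [
    "libs/core/llmstudio_core/providers/__init__.py",  # matched -> skipped
    "libs/core/llmstudio_core/providers/provider.py",  # hypothetical -> checked
]:
    print(path, "excluded" if EXCLUDE.search(path) else "checked")
```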
15 changes: 0 additions & 15 deletions Dockerfile

This file was deleted.

4 changes: 2 additions & 2 deletions Makefile
@@ -1,2 +1,2 @@
-start-db:
-	docker compose up --build
+format:
+	pre-commit run --all-files
13 changes: 0 additions & 13 deletions docker-compose.yml

This file was deleted.

22 changes: 13 additions & 9 deletions docs/how-to/deploy-on-gke/deploy-on-google-kubernetes-engine.mdx
@@ -1,3 +1,7 @@
+---
+title: "Deploy on Google Kubernetes Engine"
+---
+
 Learn how to deploy LLMstudio as a containerized application on Google Kubernetes Engine and make calls from a local repository.


@@ -18,20 +22,20 @@ This example demonstrates a public deployment. For a private service accessible
 <Step title="Select Deploy">
   Go to **Workloads** and **Create a new Deployment**.
   <Frame>
-    <img src="how-to/deploy-on-gke/images/step-2.png" />
+    <img src="images/step-2.png" />
   </Frame>
 </Step>
 <Step title="Name Your Deployment">
   Rename your project. We will call the one in this guide **llmstudio-on-gcp**.
   <Frame>
-    <img src="how-to/deploy-on-gke/images/step-3.png" />
+    <img src="images/step-3.png" />
   </Frame>
 </Step>
 <Step title="Select Your Cluster">
   Choose between **creating a new cluster** or **using an existing cluster**.
   For this guide, we will create a new cluster and use the default region.
   <Frame>
-    <img src="how-to/deploy-on-gke/images/step-4.png" />
+    <img src="images/step-4.png" />
   </Frame>
 </Step>
 <Step title="Proceed to Container Details">
@@ -47,7 +51,7 @@ This example demonstrates a public deployment. For a private service accessible
   ```
   Set it as the **Image path** to your container.
   <Frame>
-    <img src="how-to/deploy-on-gke/images/step-6.png" />
+    <img src="images/step-6.png" />
   </Frame>
 </Step>
 <Step title="Set Environment Variables">
@@ -63,7 +67,7 @@ Additionally, set the `GOOGLE_API_KEY` environment variable to enable calls to G
 <Tip>Refer to **SDK/LLM/Providers** for instructions on setting up other providers.</Tip>

 <Frame>
-  <img src="how-to/deploy-on-gke/images/step-7.png" />
+  <img src="images/step-7.png" />
 </Frame>

 </Step>
@@ -74,13 +78,13 @@ Additionally, set the `GOOGLE_API_KEY` environment variable to enable calls to G
 Select **Expose deployment as a new service** and leave the first item as is.

 <Frame>
-  <img src="how-to/deploy-on-gke/images/step-9-1.png" />
+  <img src="images/step-9-1.png" />
 </Frame>

 Add two other items, and expose the ports defined in the **Set Environment Variables** step.

 <Frame>
-  <img src="how-to/deploy-on-gke/images/step-9-2.png" />
+  <img src="images/step-9-2.png" />
 </Frame>
 </Step>
 <Step title="Deploy">
@@ -108,7 +112,7 @@ Now let's make a call to our LLMstudio instance on GCP!

 Go to your newly deployed **Workload**, scroll to the **Exposing services** section, and take note of the Host of your endpoint.
 <Frame>
-  <img src="how-to/deploy-on-gke/images/step-env.png" />
+  <img src="images/step-env.png" />
 </Frame>

 Create your `.env` file with the following:
@@ -141,7 +145,7 @@ Now let's make a call to our LLMstudio instance on GCP!
 ```

 <Frame>
-  <img src="how-to/deploy-on-gke/images/step-llmstudio-call.png" />
+  <img src="images/step-llmstudio-call.png" />
 </Frame>
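The image edits above switch each `src` from a repo-rooted path to a path relative to the page. Assuming the docs renderer resolves relative `src` values against the page's own directory (as MDX-based documentation sites typically do), both forms point at the same file:

```python
import posixpath

# Directory of the deploy-on-gke page, relative to the docs root.
page_dir = "how-to/deploy-on-gke"

# Resolve the new page-relative src against the page's directory;
# the result is the old repo-rooted path.
resolved = posixpath.normpath(posixpath.join(page_dir, "images/step-2.png"))
print(resolved)  # how-to/deploy-on-gke/images/step-2.png
```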
Binary file removed docs/how-to/deploy-on-gke/images/step-7-1.png
Binary file not shown.
Binary file removed docs/how-to/deploy-on-gke/images/step-7-2.png
Binary file not shown.
Binary file added docs/how-to/deploy-on-gke/images/step-7.png
Binary file not shown.
2 changes: 1 addition & 1 deletion docs/quickstart.mdx
@@ -45,7 +45,7 @@ description: "Let's have you setup LLMstudio"
 Using it in a Python notebook is also fairly simple! Just run the following cell:

 ```python
-from llmstudio import LLM
+from llmstudio.providers import LLM
 model = LLM("anthropic/claude-2.1")
 model.chat("What are Large Language Models?")
 ```