VERSION 1.0.0b1 (#147)
## LLMstudio Version 1.0.0b1

### What was done in this PR:

- **New libraries**: `llmstudio-core`, `llmstudio-tracker`,
`llmstudio-proxy`, and `llmstudio` (the former monolith).
- **Modularization and Refactoring**: Shifted the structure towards
modularization to enhance maintainability and scalability. Significant
changes were made in the core components to make them more modular,
including separating server components and organizing imports.
- **LLM Provider Updates**: Implemented changes in the LLM provider
instantiation. The provider now requires additional parameters such as
`api_key` and accepts a structured `chat_request`. This restructuring
also includes making the proxy and tracking optional for more flexible
usage.
- **Feature Enhancements**:
- Added asynchronous chat support, with and without streaming,
alongside equivalent synchronous methods.
- Introduced `session_id` tracking for chat sessions and logging support
for better traceability.
- Incorporated dynamic API versioning and the ability to handle multiple
API providers, including OpenAI, Azure, and VertexAI.
- Added support for LangChain agents and tool calling with parallel
execution.
- Adapted VertexAI integration for both regular and tool-calling
functions.
- Added a legacy UTC fallback to maintain compatibility with older
Python versions.
- **Automated Testing and Documentation**:
- Enhanced test automation for development processes, including adding
tests for modularized components.
- Updated the documentation, added docstrings to LLM classes, and
provided a tutorial on using `langgraph`.
- **Formatting and Configuration**:
- Applied consistent formatting, added `isort` and `flake8`
configurations, and updated pre-commit hooks.
- Updated `config.yaml` for easier configuration management and enhanced
initialization for Azure-based clients.
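
The legacy UTC compatibility mentioned above can be sketched as a small
fallback shim. This is an illustrative pattern only, not LLMstudio's
actual code:

```python
# Illustrative sketch of a legacy-UTC shim; not LLMstudio's actual code.
from datetime import datetime, timezone

try:
    from datetime import UTC  # available from Python 3.11 onward
except ImportError:
    UTC = timezone.utc  # fallback for older Python versions

# Timezone-aware "now" that works on both old and new interpreters.
now = datetime.now(UTC)
```

The same code then runs unchanged wherever `datetime.UTC` is missing,
which is one way to keep a single code path across Python versions.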

### How it was tested:

- Conducted automated tests across development environments to verify
the correctness of modularized components and new functionality.
- Validated the functionality of asynchronous and synchronous chat
capabilities with various LLM providers, including OpenAI, Azure, and
VertexAI.
- Tested new `session_id` integration for accuracy in tracking sessions
and seamless functionality with logging tools.
- Verified compatibility across Python versions, particularly with UTC
handling in legacy versions.
- Reviewed server component separation and ensured compatibility with
modular server deployments.
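
As a rough illustration of the `session_id` tracking pattern tested
above (hypothetical names, not the `llmstudio-tracker` API):

```python
# Hypothetical sketch of session-scoped tracking; the real
# llmstudio-tracker API may differ.
import uuid


class SessionTracker:
    """Attaches one session_id to every event it records."""

    def __init__(self):
        self.session_id = str(uuid.uuid4())
        self.events = []

    def log(self, event: str) -> None:
        # Tag each event so all logs from one chat session can be correlated.
        self.events.append({"session_id": self.session_id, "event": event})


tracker = SessionTracker()
tracker.log("chat_started")
tracker.log("chat_completed")
```

Because every event carries the same identifier, a logging backend can
group all records for a session with a single filter.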

### Additional notes:

- **Any breaking changes?**  
Yes, breaking changes were introduced in LLM provider instantiation,
which now requires parameters in the following format:
  ```python
  llm = LLM(provider=provider, api_key=api_key, **kwargs)
  chat_request = {
      "chat_input": "Hello, my name is Json",
      "model": model,
      "is_stream": False,
      "retries": 0,
      "parameters": {
          "temperature": 0,
          "max_tokens": 100,
          "response_format": {"type": "json_object"},
          "functions": None,
      },
  }
  # achat is a coroutine and must be awaited;
  # llm.chat(**chat_request) is the synchronous equivalent.
  await llm.achat(**chat_request)
  ```

- **Any new dependencies added?**  
New dependencies are introduced to support modularization, API
versioning, and enhanced testing, though specific package details should
be checked in `pyproject.toml`.

- **Any performance improvements?**  
Optimized API calls with parallel execution for tool calls, streamlined
server components, and minimized function calls within the LLM provider,
which should contribute to overall performance improvements.
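
The parallel tool-call execution mentioned above can be illustrated
with a generic `asyncio.gather` pattern (hypothetical tool functions,
not the library's internals):

```python
# Generic asyncio.gather sketch of parallel tool calls; hypothetical
# tools, not LLMstudio's internals.
import asyncio


async def call_tool(name: str, arg: int) -> int:
    await asyncio.sleep(0.01)  # simulate I/O-bound tool latency
    return arg * 2


async def run_tool_calls(calls):
    # All tool calls run concurrently, so total latency is roughly the
    # slowest single call rather than the sum of all calls.
    return await asyncio.gather(*(call_tool(n, a) for n, a in calls))


results = asyncio.run(run_tool_calls([("a", 1), ("b", 2), ("c", 3)]))
print(results)  # [2, 4, 6]
```

`asyncio.gather` preserves input order in its results, so tool outputs
can be matched back to the model's tool-call requests positionally.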
claudiolemos authored Oct 28, 2024
2 parents 9642d65 + d51a5d1 commit 2ab49bc
Showing 194 changed files with 14,249 additions and 17,637 deletions.
8 changes: 4 additions & 4 deletions .gitignore
```diff
@@ -53,11 +53,11 @@ node_modules
 # Environments
 env
 env3
-.env
+.env*
 .env*.local
-.venv
-env/
-venv/
+.venv*
+env*/
+venv*/
 ENV/
 env.bak/
 venv.bak/
```
36 changes: 15 additions & 21 deletions .pre-commit-config.yaml
```diff
@@ -3,43 +3,37 @@ repos:
     rev: v4.4.0
     hooks:
       - id: trailing-whitespace
-        files: llmstudio/
-        exclude: ^llmstudio/ui/
+        files: libs/
       - id: end-of-file-fixer
-        files: llmstudio/
-        exclude: ^llmstudio/ui/
+        files: libs/
       - id: check-yaml
-        files: llmstudio/
-        exclude: ^llmstudio/ui/
+        files: libs/
       - id: check-added-large-files
-        files: llmstudio/
-        exclude: ^llmstudio/ui/
+        files: libs/
 
   - repo: https://github.com/psf/black
     rev: 22.12.0
     hooks:
       - id: black
-        files: llmstudio/
-        exclude: ^llmstudio/ui/
+        files: libs/
 
   - repo: https://github.com/pycqa/isort
     rev: 5.12.0
     hooks:
       - id: isort
-        files: llmstudio/
-        exclude: ^llmstudio/ui/
+        files: libs/
         args:
           - "--profile=black"
 
-  - repo: https://github.com/myint/autoflake
-    rev: v1.4
+  - repo: https://github.com/PyCQA/autoflake
+    rev: v2.3.1
     hooks:
       - id: autoflake
-        files: llmstudio/
-        exclude: ^llmstudio/ui/
+        files: libs/
+        exclude: 'libs/core/llmstudio_core/providers/__init__.py|libs/llmstudio/llmstudio/providers/__init__.py'
         args:
-          # - --remove-all-unused-imports
-          - --recursive
-          - --remove-unused-variables
-          - --in-place
-        additional_dependencies: [flake8]
+          - --remove-all-unused-imports
+          - --recursive
+          - --remove-unused-variables
+          - --in-place
+        additional_dependencies: [flake8]
```
15 changes: 0 additions & 15 deletions Dockerfile

This file was deleted.

4 changes: 2 additions & 2 deletions Makefile
```diff
@@ -1,2 +1,2 @@
-start-db:
-	docker compose up --build
+format:
+	pre-commit run --all-files
```
13 changes: 0 additions & 13 deletions docker-compose.yml

This file was deleted.

2 changes: 1 addition & 1 deletion docs/quickstart.mdx
````diff
@@ -45,7 +45,7 @@ description: "Let's have you setup LLMstudio"
 Using it in a Python notebook is also fairly simple! Just run the following cell:
 
 ```python
-from llmstudio import LLM
+from llmstudio.providers import LLM
 model = LLM("anthropic/claude-2.1")
 model.chat("What are Large Language Models?")
 ```
````
242 changes: 0 additions & 242 deletions examples/01_intro_to_llmstudio.ipynb

This file was deleted.
