Simple Chat UI to chat about a private codebase using LLMs locally.
Technology:
- Ollama running llama3:8b as the large language model
- jina-embeddings-v2-base-code as the embedding model
- LangChain as the LLM application framework
- Chainlit for the chat UI
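Together these pieces form a retrieval-augmented generation (RAG) pipeline: code is embedded into vectors, the most relevant chunks are retrieved for a question, and a prompt built from them is sent to the LLM. A minimal pure-Python sketch of the retrieval step, using toy two-dimensional vectors in place of real jina embeddings and stopping before the actual LLM call:

```python
import math

def cosine(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve(query_vec, index, k=2):
    # index: list of (chunk_text, embedding) pairs; return the k closest chunks.
    ranked = sorted(index, key=lambda item: cosine(query_vec, item[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

def build_prompt(question, chunks):
    # Stuff the retrieved code chunks into the prompt as context.
    context = "\n---\n".join(chunks)
    return f"Answer using only this code context:\n{context}\n\nQuestion: {question}"

# Toy index -- in the real app the vectors come from jina-embeddings-v2-base-code.
index = [
    ("def add(a, b): return a + b", [1.0, 0.0]),
    ("def sub(a, b): return a - b", [0.0, 1.0]),
]
chunks = retrieve([0.9, 0.1], index, k=1)
prompt = build_prompt("What does add do?", chunks)
```

In the real app, `prompt` would be sent to llama3:8b through LangChain's Ollama integration rather than printed.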
- Make sure you have Python 3.9 or later installed
- Download and install Ollama
- Pull the model:
  ollama pull llama3:8b
- Create a Python virtual environment and activate it:
  python3 -m venv .venv && source .venv/bin/activate
- Install Python dependencies:
  pip install -r requirements.txt
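The repository ships its own requirements.txt, which is authoritative. For orientation, a hypothetical minimal dependency list matching the technology above might look like this (package names are illustrative; check the shipped file for the real pins):

```text
langchain
langchain-community
chainlit
```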
- Clone an example repository to question the chat bot about:
  git clone https://github.com/discourse/discourse
- Set up the vector database:
  python ingest-code.py
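Conceptually, an ingestion script like ingest-code.py walks the cloned repository, splits source files into chunks, embeds each chunk, and stores the vectors. The actual script's logic is not shown here; a pure-Python sketch of just the chunking step, using overlapping line windows so a definition cut at a boundary still appears whole in a neighboring chunk:

```python
from pathlib import Path

def chunk_file(text, max_lines=40, overlap=5):
    # Split one source file into overlapping line-based chunks.
    lines = text.splitlines()
    chunks = []
    step = max_lines - overlap
    for start in range(0, max(len(lines), 1), step):
        chunk = "\n".join(lines[start:start + max_lines])
        if chunk.strip():
            chunks.append(chunk)
    return chunks

def collect_source_files(root, suffixes=(".rb", ".js", ".py")):
    # Gather files by extension; a real ingester would also skip
    # vendored directories, binaries, and very large files.
    return [p for p in Path(root).rglob("*") if p.suffix in suffixes]

# A 100-line fake file splits into three overlapping chunks.
sample = "\n".join(f"line {i}" for i in range(100))
chunks = chunk_file(sample, max_lines=40, overlap=5)
```

Each chunk would then be embedded with jina-embeddings-v2-base-code and written to the vector store for retrieval at chat time.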
- Start the chat bot:
  chainlit run main.py
- To exit the Python virtual environment when you are done, run:
  deactivate
Modify the .env file to run the chat bot on your own codebase and language.
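The exact variable names in .env depend on the project's scripts and are not shown here. A hypothetical example of the kind of settings it might hold (all keys below are illustrative, not the real ones; check the shipped .env):

```text
# Hypothetical keys -- consult the repository's .env for the real names
REPO_PATH=./discourse
LANGUAGE=ruby
LLM_MODEL=llama3:8b
EMBEDDING_MODEL=jina-embeddings-v2-base-code
```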