StreamLlama is a standalone Streamlit wrapper for Ollama, implemented as a single Python script.
Video demo: https://youtu.be/4QTL_lFeVTE
To install the required dependencies, run the following command in your terminal:
pip install -r requirements.txt
Please note that the script will automatically install Ollama and Phi3 if they are not already present.
Before running the script for the first time, launch it from an elevated terminal so the automatic installation can complete successfully:
streamlit run app.py
That's it! You're ready to start using StreamLlama with Ollama.
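Under the hood, a wrapper like this talks to the Ollama server that listens on localhost once Ollama is installed. The sketch below is not the actual StreamLlama code; it is a minimal illustration of one way to send a chat history to Ollama's local REST endpoint (`/api/chat`) with only the standard library. The `build_payload` and `chat` names are hypothetical.

```python
import json
import urllib.request

# Ollama's default local endpoint; nothing leaves the device.
OLLAMA_URL = "http://localhost:11434/api/chat"


def build_payload(messages, model="phi3"):
    """Assemble the JSON body Ollama's /api/chat endpoint expects."""
    return {"model": model, "messages": messages, "stream": False}


def chat(messages, model="phi3", url=OLLAMA_URL):
    """Send the chat history to the local Ollama server and
    return the assistant's reply text."""
    data = json.dumps(build_payload(messages, model)).encode()
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]
```

In a Streamlit app, `chat` would be called with the message list kept in `st.session_state`, so the whole conversation is replayed to the model on each turn.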
- Chat with locally running LLMs, completely offline
- Your data stays yours: nothing leaves your device
- Automatic installation of Ollama and Phi3
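The auto-install step above boils down to checking whether the `ollama` binary is on the PATH before pulling the model. The snippet below is a minimal sketch of that detection logic, not the script's actual implementation; `ensure_ollama` is a hypothetical helper name.

```python
import shutil
import subprocess


def ensure_ollama(model="phi3", pull=False):
    """Return True if the `ollama` binary is available on PATH.

    If it is missing, a wrapper would install Ollama here before
    continuing. With pull=True, the model is fetched via `ollama pull`.
    """
    if shutil.which("ollama") is None:
        return False  # binary missing; trigger the installer at this point
    if pull:
        subprocess.run(["ollama", "pull", model], check=True)
    return True
```

Keeping the pull behind a flag lets the app check availability cheaply on every startup and only download the model when it is actually missing.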
The next development goal for StreamLlama is a standalone desktop app with a more user-friendly interface.