CITChat: AI and Machine Learning Powered Support for the CIT Community
CITChat is a chat helpdesk system designed to streamline and enhance communication about CIT information services. It leverages machine learning and a Large Language Model (LLM) to provide on-demand responses to queries about CIT operations, enrollment, accounts, and more, available anytime and anywhere.
- LLM Integration: Provides intelligent responses using a large language model.
- Real-time Ingestion: Continuously updates with new information.
- Customizable AI Personality: Easily adjustable to match the desired characteristics and tone.
- Image Support: Capable of displaying images in responses.
- Trainable AI: Allows for easy training on new topics.
- Question Topic Management: Identifies and categorizes unknown questions for admin review.
- Kiosk Integration: Easily integrates into institution kiosks for quick access.
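As a sketch of how the question-topic management feature might work, questions that match no known topic can be flagged for admin review by a simple keyword classifier. The topics, keywords, and helper name below are illustrative assumptions, not the project's actual implementation.

```python
# Illustrative sketch of question-topic management (topics, keywords,
# and function names are assumptions, not CITChat's actual code).

TOPIC_KEYWORDS = {
    "enrollment": ["enroll", "register", "admission"],
    "accounts": ["account", "password", "login", "email"],
    "operations": ["office hours", "schedule", "holiday"],
}

def categorize_question(question: str) -> str:
    """Return the first topic whose keywords appear in the question,
    or 'unknown' so an admin can review it and train the bot on it."""
    text = question.lower()
    for topic, keywords in TOPIC_KEYWORDS.items():
        if any(kw in text for kw in keywords):
            return topic
    return "unknown"
```

For example, `categorize_question("How do I reset my password?")` returns `"accounts"`, while an unmatched question returns `"unknown"` and would be queued for review.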
We aim to streamline the school's information helpdesk by providing a framework for an automated digital helpdesk assistant, a valuable tool for freshmen, transferees, parents, and current students alike, accessible anytime and anywhere.
Developed to address the need for efficient and effective communication within the CIT community, CITChat focuses on:
- Privacy: Ensuring sensitive information is handled securely.
- Accessibility: Providing information anytime, anywhere.
- Technologies:
  - FastAPI - for the backend
  - React.js - for the frontend
- Components:
  - Llama3.1 - for the LLM
  - MySQL - for the database
  - Qdrant - for the vector database
  - Nomic - for the embedding model
  - Ollama - bridges the complexities of deploying and serving the LLM locally
  - PrivateGPT - for RAG, API, and LLM inference abstractions
  - Anaconda
  - Python
  - JavaScript
  - HTML and CSS
  - MaterialUI
  - react-three
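How these components fit together can be sketched from the backend's side: a user's question is forwarded to PrivateGPT's OpenAI-style chat endpoint, which runs retrieval over the ingested documents and queries Llama3.1 through Ollama. The port, payload fields, and helper names below are assumptions for illustration, not CITChat's actual code; verify them against your PrivateGPT deployment.

```python
# Hypothetical sketch: forwarding a question to PrivateGPT's
# OpenAI-style chat endpoint (URL, port, and payload fields are
# assumptions; check them against your local PrivateGPT API docs).
import json
from urllib import request

PRIVATEGPT_URL = "http://localhost:8001/v1/chat/completions"  # assumed port

def build_payload(question: str) -> dict:
    """Build a chat-completion request asking for RAG-grounded answers."""
    return {
        "messages": [{"role": "user", "content": question}],
        "use_context": True,       # ground the answer in ingested documents
        "include_sources": True,   # report which documents were used
    }

def extract_answer(response_body: dict) -> str:
    """Pull the assistant's reply out of an OpenAI-style response."""
    return response_body["choices"][0]["message"]["content"]

def ask(question: str) -> str:
    """Send the question to PrivateGPT and return the answer text."""
    req = request.Request(
        PRIVATEGPT_URL,
        data=json.dumps(build_payload(question)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return extract_answer(json.load(resp))
```

The backend can wrap `ask()` in a FastAPI route, keeping the React frontend decoupled from the LLM stack.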
- Features:
  - Real-time ingestion
  - Customizable AI characteristics on the fly
  - Image display support
  - Statistics dashboard
  - Advanced logging
  - Natural language support
- Ensure you have Git, Python 3.11, NPM, VS Code, and MySQL installed.
- Clone the repository.
- Download Ollama:
  - Run `ollama pull llama3.1`
  - Run `ollama pull nomic-embed-text`
  - Start Ollama with `ollama serve`
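Once `ollama serve` is running, a quick sanity check can confirm the required models were pulled. The sketch below reads Ollama's `/api/tags` model-listing endpoint; the port and model names are the defaults this guide assumes.

```python
# Sanity check that Ollama is serving and the required models are pulled.
# /api/tags is Ollama's model-listing endpoint; the port and model names
# below are the defaults assumed by this guide.
import json
from urllib import request

REQUIRED = {"llama3.1", "nomic-embed-text"}

def missing_models(tags_response: dict) -> set:
    """Return required models absent from an Ollama /api/tags response.
    The response looks like {"models": [{"name": "llama3.1:latest"}, ...]}."""
    installed = {m["name"].split(":")[0] for m in tags_response.get("models", [])}
    return REQUIRED - installed

def check(base_url: str = "http://localhost:11434") -> set:
    """Query the running Ollama server and report any missing models."""
    with request.urlopen(base_url + "/api/tags") as resp:
        return missing_models(json.load(resp))
```

If `check()` returns a non-empty set, re-run the corresponding `ollama pull` commands above.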
- Install Make: Download from GNUWin32 and add to your system PATH.
- Create New Virtual Environment:
  - Use `conda create -n {env Name} python=3.11`
- Install Poetry:
  - Run `pip install pipx`
  - Run `pipx install poetry`
  - Run `pipx ensurepath`
- Install PrivateGPT:
  - Navigate to the PrivateGPT directory.
  - Activate your conda environment.
  - Run `poetry install --extras "ui llms-ollama embeddings-ollama vector-stores-qdrant"`
- Configure Settings: Copy `settings-ollama.yaml` to the PrivateGPT folder.
- Run PrivateGPT:
  - Activate your environment and run `make run` with the appropriate profile settings.
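For reference, a `settings-ollama.yaml` profile typically points PrivateGPT's LLM and embedding modes at Ollama. The fragment below is illustrative only; verify the keys and values against the file shipped with your PrivateGPT version.

```yaml
# Illustrative fragment only; compare with your PrivateGPT's settings-ollama.yaml.
llm:
  mode: ollama

embedding:
  mode: ollama

ollama:
  llm_model: llama3.1
  embedding_model: nomic-embed-text
  api_base: http://localhost:11434
```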
- Create New Virtual Environment:
  - Use `conda create -n {env Name} python=3.11`
- Install dependencies using `pip install -r requirements.txt`.
- Run the application using `uvicorn main:app --reload`.
- Use `npm install` followed by `npm start`.
- Access the chat helpdesk via the institution’s kiosk or web interface.
- Ask any questions related to CIT services and receive real-time responses.
- Fork the repository and create a new branch for your features.
- Submit a pull request with detailed descriptions of your changes.
- Report issues or suggest features via the issue tracker.
CITChat is licensed under the MIT License.