This project sets up a home AI stack for machine learning and AI tasks, including Open WebUI, Ollama, Stable Diffusion, and Pipelines services. The stack uses Docker for containerization and Tailscale for secure access.
Before you begin, ensure you have:
- Docker installed.
- Docker Compose installed.
- An OAuth client token generated in Tailscale for the Docker and Tailscale integration.
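As a quick sanity check, the first two prerequisites can be verified from the shell. This is only a convenience sketch: it confirms the commands exist, not that the Docker daemon is running.

```shell
#!/bin/sh
# Sanity-check prerequisites: the Docker CLI and the Compose plugin.
# (A sketch -- only verifies the commands are on PATH.)
missing=""
command -v docker >/dev/null 2>&1 || missing="$missing docker"
docker compose version >/dev/null 2>&1 || missing="$missing docker-compose-plugin"
if [ -n "$missing" ]; then
  echo "missing:$missing"
else
  echo "all prerequisites found"
fi
```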
The folder structure of the project is as follows:

```
ai-stack
├── config
│   └── open-webui.json
├── state
├── .env
├── docker-compose.yaml
├── ollama
├── open-webui
├── stable-diffusion-webui-docker
```
- `.env`: Environment variables file.
- `docker-compose.yaml`: Docker Compose configuration file.
- `ollama`, `open-webui`, `stable-diffusion-webui-docker`: Directories for the respective services and their configurations.
```
git clone https://github.com/cwilliams001/ai-stack.git
cd ai-stack
```
Create a `.env` file in the root directory if it does not already exist:

```
touch .env
```
Populate the `.env` file with the necessary environment variables:

```
PUID=1000
PGID=1000
BRAVE_SEARCH_API_KEY=your-brave-api-key
TS_AUTHKEY=your-tailscale-auth-key
```
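Before bringing the stack up, it can be handy to confirm that all four variables actually made it into the file. A minimal sketch (this helper is not part of the repo; the variable names are the ones listed above):

```shell
#!/bin/sh
# check_env -- hypothetical helper: confirm that each required variable
# from this README is present in a given env file.
check_env() {
  envfile="$1"
  missing=""
  for var in PUID PGID BRAVE_SEARCH_API_KEY TS_AUTHKEY; do
    grep -q "^${var}=" "$envfile" 2>/dev/null || missing="$missing $var"
  done
  if [ -z "$missing" ]; then
    echo "env OK"
  else
    echo "missing:$missing"
  fi
}

# Usage: check_env .env
```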
Before starting the stack, clone the Stable Diffusion repo into the `ai-stack` directory (or just copy the necessary files). Cloning will create the folder for you:

```
git clone https://github.com/AbdBarho/stable-diffusion-webui-docker.git
```
After cloning, you'll want to make a change to the Dockerfile:

```
nano stable-diffusion-webui-docker/services/comfy/Dockerfile
```

I commented out the pinning to a commit hash and just grabbed the latest ComfyUI:
```Dockerfile
FROM pytorch/pytorch:2.3.0-cuda12.1-cudnn8-runtime

ENV DEBIAN_FRONTEND=noninteractive PIP_PREFER_BINARY=1

RUN apt-get update && apt-get install -y git && apt-get clean

ENV ROOT=/stable-diffusion
RUN --mount=type=cache,target=/root/.cache/pip \
  git clone https://github.com/comfyanonymous/ComfyUI.git ${ROOT} && \
  cd ${ROOT} && \
  git checkout master && \
  # git reset --hard 276f8fce9f5a80b500947fb5745a4dde9e84622d && \
  pip install -r requirements.txt

WORKDIR ${ROOT}
COPY . /docker/
RUN chmod u+x /docker/entrypoint.sh && cp /docker/extra_model_paths.yaml ${ROOT}

ENV NVIDIA_VISIBLE_DEVICES=all PYTHONPATH="${PYTHONPATH}:${PWD}" CLI_ARGS=""
EXPOSE 7860
ENTRYPOINT ["/docker/entrypoint.sh"]
CMD python -u main.py --listen --port 7860 ${CLI_ARGS}
```
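For context on how this container fits into the stack: the real service definition lives in the stack's `docker-compose.yaml`, so treat the following as an illustration only — the service name, build context, and volume paths here are assumptions, not the repo's actual compose file. A comfy-style service entry with GPU access generally looks something like:

```yaml
# Illustrative sketch only -- names and paths are assumptions.
comfy:
  build: ./stable-diffusion-webui-docker/services/comfy
  ports:
    - "7860:7860"
  volumes:
    - ./stable-diffusion-webui-docker/data:/data
  deploy:
    resources:
      reservations:
        devices:
          - driver: nvidia
            count: all
            capabilities: [gpu]
```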
You’ll want to grab any models you like from Hugging Face. I am using `stabilityai/stable-diffusion-3-medium`. Download the models, transfer them to your server, and put them in the appropriate folders. Models need to be placed in the `Stable-diffusion` folder:

```
stable-diffusion-webui-docker/data/models/Stable-diffusion
```

Models are any files in the root of `stable-diffusion-3-medium` that have the extension `*.safetensors`.
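The copy step above can be scripted. A minimal sketch, assuming the source directory is wherever you downloaded the model repo (the helper name is hypothetical, not part of the repo):

```shell
#!/bin/sh
# copy_models -- hypothetical helper: copy every *.safetensors file from a
# local model download (e.g. stable-diffusion-3-medium) into the WebUI
# models folder shown above.
copy_models() {
  src="$1"
  dest="$2"
  mkdir -p "$dest"
  for f in "$src"/*.safetensors; do
    [ -e "$f" ] || continue   # glob matched nothing; skip
    cp "$f" "$dest/"
  done
  echo "copied to $dest"
}

# Usage (destination path is the one from this README):
# copy_models ./stable-diffusion-3-medium \
#   ./stable-diffusion-webui-docker/data/models/Stable-diffusion
```

When the server is remote, the same idea applies with `scp` or `rsync` toward the same destination path.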
For CLIP encoders, you’ll need to create this folder (because it doesn’t exist by default):

```
mkdir stable-diffusion-webui-docker/data/models/CLIPEncoder
```
You’ll also need to download the example workflows to the machine that accesses ComfyUI so you can import them in the browser. Example workflows are available on Hugging Face in the Stable Diffusion 3 Medium repo. ComfyUI should show up as a service on your tailnet on port 7860.
**Connect to Open WebUI**

1. Navigate to the Settings > Connections > OpenAI API section in Open WebUI.
2. Set the API URL to `http://localhost:9099` and the API key to `0p3n-w3bu!`.
3. If your Open WebUI is running in a Docker container, replace `localhost` with `host.docker.internal` in the API URL.

**Manage Configurations**

1. Go to the Admin Settings > Pipelines tab.
2. Select your desired pipeline and modify the valve values directly from the WebUI.

**Add Anthropic Manifold Pipeline Plugin**

Add the plugin from `anthropic_manifold_pipeline.py` to your pipelines directory.
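Installing the plugin is just a file copy into the directory the Pipelines container mounts. A sketch — the `pipelines/` directory name here is an assumption, so match it to your compose volume mapping:

```shell
#!/bin/sh
# install_pipeline -- hypothetical install step: drop a plugin file into the
# mounted pipelines directory so the Pipelines service picks it up on restart.
install_pipeline() {
  plugin="$1"
  dir="$2"
  mkdir -p "$dir"
  cp "$plugin" "$dir/"
  echo "installed $(basename "$plugin") into $dir"
}

# install_pipeline anthropic_manifold_pipeline.py ./pipelines
```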
```
docker compose up -d --build --force-recreate --remove-orphans
```

This will start all services defined in the `docker-compose.yaml` file.
- **Tailscale**: Provides a secure VPN connection to access services.
- **Ollama**: A general-purpose AI service.
- **Open WebUI**: A user interface for interacting with various AI-related tasks.
- **Stable Diffusion**: Provides image-generation capabilities.
- **Pipelines**: Provides the ability to add other API endpoints for AI services.
- **Accessing Services**: Services are accessed securely through Tailscale. Make sure your device is connected to the Tailscale network with the appropriate tags.
- **Interacting with the AI Stack**: Each exposed service is available through its respective endpoint. Check the output of `docker compose up -d` to see which ports are forwarded through Tailscale.
Refer to the `.env` file for configuring environment-specific variables:

- `PUID`: User ID for permissions.
- `PGID`: Group ID for permissions.
- `BRAVE_SEARCH_API_KEY`: API key for Brave web search.
- `TS_AUTHKEY`: Auth key for Tailscale.
The services are networked through Tailscale and do not expose ports directly to your local machine. Ensure your Tailscale configuration allows access to these services.
- **Logs**: Check the logs for each service to debug issues:

  ```
  docker logs <container_name>
  ```

- **Connectivity Issues**: Ensure your device is properly connected to the Tailscale network.
- **Docker Compose**: If you encounter issues starting the containers, bring the stack down and up again:

  ```
  docker compose down
  docker compose up -d
  ```