Ollama #79
Comments
```yaml
  &name open-webui:
    container_name: *name
    hostname: *name
    build:
      context: https://github.com/open-webui/open-webui.git#main
      dockerfile: Dockerfile
    volumes:
      - /mnt/llm/open-webui/data:/app/backend/data
    depends_on:
      ollama:
        condition: service_started
        restart: false
    links:
      - ollama
    environment:
      OLLAMA_BASE_URL: http://ollama:11434
      SCARF_NO_ANALYTICS: true
      ANONYMIZED_TELEMETRY: false
      DO_NOT_TRACK: true
      ENABLE_SIGNUP: false
      WEBUI_AUTH: false
      RAG_EMBEDDING_MODEL_AUTO_UPDATE: true
      RAG_EMBEDDING_MODEL: "mixedbread-ai/mxbai-embed-large-v1"
    networks:
      - traefik-servicenet
      - default
    labels:
      traefik.enable: true
      traefik.http.routers.open-webui.rule: "Host(`open-webui.my.internal.domain`) || Host(`ollama-webui.my.internal.domain`) || Host(`openwebui.my.internal.domain`)"
      traefik.http.routers.open-webui.tls.certresolver: le
      traefik.http.routers.open-webui.entrypoints: websecure
      traefik.http.routers.open-webui.tls.domains[0].main: "*.my.internal.domain"
      traefik.http.routers.open-webui.service: open-webui-service
      traefik.http.services.open-webui-service.loadbalancer.server.port: 8080
      traefik.http.routers.open-webui.middlewares: authentik
```
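Assuming this fragment sits under a top-level `services:` key alongside an `ollama` service, and that the `traefik-servicenet` network already exists, bringing the stack up and checking that the UI container can actually reach Ollama over the compose network might look like this (a sketch; service names are taken from the config above, and `/api/version` is Ollama's version endpoint):

```shell
# Build the image from the git context and start both services in the background.
docker compose up -d --build

# From inside the open-webui container, hit the Ollama API at the URL the UI
# itself uses (OLLAMA_BASE_URL). A JSON version response means the link works.
docker compose exec open-webui curl -sf http://ollama:11434/api/version
```

If the second command fails, check that both containers share a network and that `OLLAMA_HOST` inside the ollama container is set to listen on `0.0.0.0` rather than only on localhost.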
Most likely Ollama is overkill when llama.cpp is there 🗡
```shell
export OLLAMA_HOST=0.0.0.0 WEBUI_AUTH=false ENABLE_SIGNUP=false
ollama serve > /tmp/ollama.log 2>&1 &
uvx --python 3.11 open-webui serve > /tmp/ollamaui.log 2>&1 &
```
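Launching both processes in parallel races the UI against the Ollama API coming up. A slightly more robust sketch waits for the API first (the polling loop is my addition; the port and `/api/version` endpoint are Ollama defaults):

```shell
export OLLAMA_HOST=0.0.0.0 WEBUI_AUTH=false ENABLE_SIGNUP=false
ollama serve > /tmp/ollama.log 2>&1 &

# Poll the Ollama API until it answers before starting the web UI.
until curl -sf "http://127.0.0.1:11434/api/version" > /dev/null; do
  sleep 1
done

uvx --python 3.11 open-webui serve > /tmp/ollamaui.log 2>&1 &
```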