Running only the UI part locally while using external inference #16771
Unanswered
gpt4thewin
asked this question in Q&A
Replies: 1 comment 3 replies
-
Is this even a possibility? How dependent is AUTOMATIC1111 on Nvidia?
I would also like to run it on:
Environment: Raspberry Pi (ARM platform, no Nvidia GPU) with Docker
-
You can run it elsewhere and access it remotely with a Raspberry Pi.
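The reply describes a split setup: run AUTOMATIC1111 on a machine that has an Nvidia GPU, launched with `--listen --api` so its web UI and REST API are reachable over the LAN, and use the Raspberry Pi purely as a thin client (browser or script), with no local GPU or CUDA dependency. A minimal client sketch the Pi could run is below; the host address `192.168.1.50:7860` is a hypothetical example, and the payload shows only a couple of the endpoint's many optional fields.

```python
import json
import urllib.request

# Hypothetical LAN address of the machine actually running AUTOMATIC1111
# (launched there with: ./webui.sh --listen --api).
WEBUI_HOST = "http://192.168.1.50:7860"

def build_payload(prompt: str, steps: int = 20) -> dict:
    """Minimal request body for the /sdapi/v1/txt2img endpoint."""
    return {"prompt": prompt, "steps": steps}

def txt2img(prompt: str) -> dict:
    """POST a generation request to the remote webui and return its JSON reply."""
    req = urllib.request.Request(
        f"{WEBUI_HOST}/sdapi/v1/txt2img",
        data=json.dumps(build_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)  # response includes base64-encoded images
```

Since all heavy lifting happens on the remote host, this client runs fine on ARM; alternatively, skip the script entirely and just open `http://192.168.1.50:7860` in the Pi's browser.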