Adds support for on-device LLMs with SpeziLLMLocal #39
Conversation
Codecov Report
Attention: Patch coverage is
Additional details and impacted files
@@ Coverage Diff @@
## main #39 +/- ##
==========================================
- Coverage 93.03% 92.96% -0.07%
==========================================
Files 24 27 +3
Lines 832 922 +90
==========================================
+ Hits 774 857 +83
- Misses 58 65 +7
Continue to review full report in Codecov by Sentry.
Thank you @vishnuravi; this looks great!
♻️ Current situation & Problem
Currently, the app can only use OpenAI models such as GPT-4. However, users may prefer to run an LLM on-device for increased privacy. This is now supported via the SpeziLLMLocal target of the SpeziLLM module and can be enabled in HealthGPT.
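As a rough illustration of the approach described above, the sketch below shows how a feature flag could select between an on-device and an OpenAI-backed model. All type and function names here are hypothetical; the actual SpeziLLM / SpeziLLMLocal APIs used in the PR differ.

```swift
import Foundation

// Hypothetical abstraction over an LLM backend; the real SpeziLLM
// types and initializers are not reproduced here.
protocol LLMRunner {
    func generate(prompt: String) async throws -> String
}

// Placeholder for an on-device model (e.g. via the SpeziLLMLocal target).
struct LocalLLMRunner: LLMRunner {
    func generate(prompt: String) async throws -> String {
        // A real implementation would run inference against a local model.
        "local response to: \(prompt)"
    }
}

// Placeholder for a remote, OpenAI-backed model such as GPT-4.
struct OpenAIRunner: LLMRunner {
    func generate(prompt: String) async throws -> String {
        "remote response to: \(prompt)"
    }
}

// Select the backend from a launch argument, mirroring the PR's
// `--llmLocal` feature flag.
func makeRunner(arguments: [String] = CommandLine.arguments) -> any LLMRunner {
    arguments.contains("--llmLocal") ? LocalLLMRunner() : OpenAIRunner()
}
```

Keeping the flag check in a single factory like `makeRunner` means the rest of the app (e.g. a data interpreter) depends only on the `LLMRunner` abstraction and never needs to know which backend is active.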
⚙️ Release Notes
- Added a `--llmLocal` feature flag for toggling local execution
- Updated `HealthDataInterpreter` to use the local LLM if the flag is set
📚 Documentation
Updated documentation
✅ Testing
Code of Conduct & Contributing Guidelines
By submitting this pull request, you agree to follow our Code of Conduct and Contributing Guidelines.