
Ollama llama3.1 loading fix #1400

Merged

Conversation

CyanideByte
Contributor

Describe the changes you have made:

  • Fixed an issue when loading llama3.1 with Ollama (likely affecting other models too), which failed with "json.decoder.JSONDecodeError: Unterminated string starting at: line 1 column 2 (char 1)" (see the sketch after this list)
  • Refreshed the models listed for download ("llama3.1", "phi3", "mistral-nemo", "gemma2", "codestral")
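
For context, this JSONDecodeError is characteristic of parsing a streamed newline-delimited JSON (NDJSON) response one network chunk at a time: a chunk can end in the middle of a JSON line, so json.loads sees a truncated string. Below is a minimal sketch of that failure mode and a line-buffering workaround against Ollama's /api/generate stream; the function name, endpoint, and parameters are illustrative assumptions, not the actual code changed by this PR.

```python
import json
import requests

def stream_ollama(model="llama3.1", prompt="Hello",
                  url="http://localhost:11434/api/generate"):
    """Yield parsed NDJSON events from an Ollama generate stream.

    Illustrative sketch only -- not the code modified in this PR.
    """
    response = requests.post(url, json={"model": model, "prompt": prompt},
                             stream=True)
    response.raise_for_status()

    # Naive version that can raise "Unterminated string starting at ...":
    #     for chunk in response.iter_content(chunk_size=1024):
    #         event = json.loads(chunk)  # chunk may end mid-JSON-line
    #
    # Buffering fix: accumulate raw bytes and only parse complete lines,
    # keeping any trailing partial line in the buffer for the next chunk.
    buffer = b""
    for chunk in response.iter_content(chunk_size=1024):
        buffer += chunk
        while b"\n" in buffer:
            line, buffer = buffer.split(b"\n", 1)
            if line.strip():
                yield json.loads(line)

for event in stream_ollama():
    print(event.get("response", ""), end="", flush=True)
```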

Reference any relevant issues (e.g. "Fixes #000"):

Pre-Submission Checklist (optional but appreciated):

  • I have included relevant documentation updates (stored in /docs)
  • I have read docs/CONTRIBUTING.md
  • I have read docs/ROADMAP.md

OS Tests (optional but appreciated):

  • Tested on Windows
  • Tested on macOS
  • Tested on Linux

@CyanideByte changed the title from "Ollama llama3.1 fix" to "Ollama llama3.1 loading fix" on Aug 15, 2024
@MikeBirdTech
Contributor

Tested and confirmed this resolves the issue.

@KillianLucas changed the base branch from main to development on August 20, 2024 at 16:56
@KillianLucas merged commit f831887 into OpenInterpreter:development on Aug 20, 2024
0 of 2 checks passed
@CyanideByte deleted the ollama-llama3.1-fix branch on August 21, 2024 at 01:57
Development

Successfully merging this pull request may close these issues.

[llama-3.1 70B]Open Interpreter's Preps did not complete after setting the model