
Proper Configuration and Use of Ollama's Vector Model in RAGLite, and Fixing the "Event loop is closed" Error #85

FlandreBlood opened this issue Jan 7, 2025 · 0 comments
Description

When configuring RAGLite to use Ollama as the vector model (embedder), I encountered the following issues:

  1. While configuring Ollama, an error occurred in litellm. The fix described in litellm Issue #7451 resolved it, but a RuntimeError: Event loop is closed error appeared afterwards.
  2. After some debugging, I found that nest_asyncio must be imported in the script to patch the event loop before Ollama's vector model can be called correctly.

Currently, the RAGLite README does not explain how to properly configure an Ollama vector model, so new users are likely to run into the same issues.

Configuration Steps

To correctly call Ollama's vector model and avoid the event loop error, it is recommended to follow these steps:

1. Set the Ollama API base URL:

import os
from config import SERVER_HOST  # Ensure SERVER_HOST is defined in config.py

os.environ["OLLAMA_API_BASE"] = f"http://{SERVER_HOST}:11434"
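Optionally, you can confirm the Ollama server is reachable before going further. A minimal sketch, assuming the requests package is installed; /api/tags is Ollama's endpoint for listing locally pulled models:

import os

import requests

# List the models available on the Ollama server.
response = requests.get(f"{os.environ['OLLAMA_API_BASE']}/api/tags", timeout=5)
response.raise_for_status()
print([model["name"] for model in response.json()["models"]])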

2. Configure RAGLite:

from raglite import RAGLiteConfig

my_config = RAGLiteConfig(
    db_url="sqlite:///raglite.db",
    llm="ollama/{chat_model}",  # Replace with your Ollama chat model name
    embedder="ollama/{vector_model_full_name}",  # Replace with your Ollama vector model's full name
    chunk_max_size=300,  # Adjust as needed; the default is 1440. Around 512 is recommended for Chinese vector models.
)
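As a concrete example with hypothetical model names (substitute whatever chat and embedding models you have pulled locally with ollama pull):

my_config = RAGLiteConfig(
    db_url="sqlite:///raglite.db",
    llm="ollama/llama3.1",     # a hypothetical chat model
    embedder="ollama/bge-m3",  # a hypothetical embedding model
    chunk_max_size=300,
)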

Error Log:

Initially, you may encounter an error like the following:

litellm.exceptions.APIConnectionError: litellm.APIConnectionError: Event loop is closed

This is caused by how litellm handles its asynchronous operations.

Solution:

  1. Fix the litellm issue:
    The first error is caused by litellm itself. According to litellm Issue #7451, the fix is to update the litellm library (for example, pip install --upgrade litellm) or to apply the workaround recommended in that issue.

  2. Fix the event loop error:
    After resolving the above, you may still encounter RuntimeError: Event loop is closed. This error typically occurs when asynchronous event loops are not handled correctly. The fix is to patch the event loop with nest_asyncio by adding the following at the beginning of your script:

import nest_asyncio
nest_asyncio.apply()  # Patch the event loop so it can be re-entered safely
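Why this helps: litellm's synchronous entry points drive an asyncio event loop internally, and creating and closing loops across repeated calls can leave a closed loop behind (presumably the source of the error above). nest_asyncio makes the loop re-entrant, so nested or repeated synchronous calls reuse one live loop. A minimal sketch of the re-entrancy it enables; purely illustrative, not RAGLite or litellm code:

import asyncio

import nest_asyncio

nest_asyncio.apply()

async def embed():
    # Stand-in for an async embedding call.
    return "fake embedding"

def sync_embed():
    # A synchronous wrapper that drives the event loop, similar in spirit
    # to how synchronous APIs wrap async clients.
    return asyncio.get_event_loop().run_until_complete(embed())

async def pipeline():
    # Without nest_asyncio.apply(), this nested run_until_complete raises
    # "RuntimeError: This event loop is already running".
    return sync_embed()

print(asyncio.run(pipeline()))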

Complete Example:

Here is a simplified example demonstrating how to properly configure and use Ollama's vector model while avoiding the event loop error:

import os
from pathlib import Path

import nest_asyncio

from config import SERVER_HOST  # Ensure SERVER_HOST is defined in config.py
from raglite import RAGLiteConfig, insert_document

# Patch the event loop to avoid "RuntimeError: Event loop is closed"
nest_asyncio.apply()

# Set the Ollama API base URL
os.environ["OLLAMA_API_BASE"] = f"http://{SERVER_HOST}:11434"

# Configure RAGLite
my_config = RAGLiteConfig(
    db_url="sqlite:///raglite.db",
    llm="ollama/{chat_model}",  # Replace with your Ollama chat model
    embedder="ollama/{vector_model_full_name}",  # Replace with your Ollama vector model's full name
    chunk_max_size=300,  # Around 512 is recommended for Chinese vector models
)

# Insert a document (e.g. Special Relativity.pdf)
insert_document(Path("Special Relativity.pdf"), config=my_config)
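After inserting a document, you can verify that the embedder works end to end by running a search. A sketch assuming RAGLite's hybrid_search and retrieve_chunks functions as shown in its README; exact signatures may differ across versions:

from raglite import hybrid_search, retrieve_chunks

# Searching exercises the Ollama embedder on the query.
chunk_ids, scores = hybrid_search("What is time dilation?", num_results=5, config=my_config)
chunks = retrieve_chunks(chunk_ids, config=my_config)
print(chunks[0])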

Conclusion:

When configuring Ollama's vector model in RAGLite, two issues may arise:

  1. litellm errors causing connection failures, for which the solution is detailed in litellm Issue #7451.
  2. The Event loop is closed error, which can be resolved by patching the event loop with nest_asyncio.

It would be helpful if the RAGLite documentation clearly explained how to configure Ollama's vector model and how to resolve these common issues.

Versions:

litellm: 1.56.5
raglite: 0.51
