How can I evaluate Ollama models such as Llama 3.3 using RAGAS? #1857
@dosu Can you solve this?
You can use RAGAS with Ollama models through their LangChain wrappers:

```python
from langchain_community.chat_models import ChatOllama
from langchain_community.embeddings import OllamaEmbeddings

ollama_llm = "mistral"
model = ChatOllama(model=ollama_llm)
embeddings_ollama = OllamaEmbeddings(model="mistral")
```

This solution was suggested in a similar closed issue and received positive feedback [1].
I successfully ran RAGAS to evaluate an Ollama model. However, I get a timeout error during evaluation and NaN values for the metrics, even after increasing the timeout in the RunConfig:

```
Evaluating:   0%| | 0/1 [00:00<?, ?it/s]ERROR:ragas.executor:Exception raised in Job[0]: TimeoutError()
```

**** This is the function I used to calculate the evaluation metrics
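The NaN metrics follow directly from the timeout: when a metric job exceeds its time budget, the executor records no score for it, and the missing value surfaces as NaN. A minimal stdlib sketch of that pattern (this is an illustration of the mechanism, not the actual ragas executor code; the job delays and timeout values are made up):

```python
import asyncio
import math

async def score_job(delay: float, score: float) -> float:
    # Stand-in for one metric computation; `delay` simulates LLM latency.
    await asyncio.sleep(delay)
    return score

async def run_with_timeout(jobs, timeout: float):
    # Run each job under a per-job timeout; a job that exceeds the budget
    # yields NaN instead of a score, which is why slow local models
    # produce NaN metrics in the evaluation results.
    results = []
    for delay, score in jobs:
        try:
            results.append(await asyncio.wait_for(score_job(delay, score), timeout))
        except asyncio.TimeoutError:
            results.append(float("nan"))
    return results

jobs = [(0.01, 0.9), (0.5, 0.8)]  # second job exceeds the 0.1 s budget
results = asyncio.run(run_with_timeout(jobs, timeout=0.1))
print(results)  # → [0.9, nan]
```

So raising the RunConfig timeout only helps if the model can actually answer within the new budget; with slow local models the per-request latency itself may need to come down (smaller model, fewer concurrent workers).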
Sadly, this is a duplicate of #1170.
[ ] I checked the documentation and related resources and couldn't find an answer to my question.