
bug: request json not same with python api #326

Open
1 task done
jamesvren opened this issue Feb 27, 2025 · 4 comments
Comments

@jamesvren

  • I have looked for existing issues (including closed) about this

Bug Report

The request JSON is not the same as what the Python openai client sends. This causes my local DeepSeek deployment to think some fields are missing, and it does not chat.

Reproduction

    let openai_client = openai::Client::from_url(&key, &url);

    let sentiment_classifier = openai_client
        .extractor::<SentimentClassification>("xxxx")
        .build();

    let text = "I like this house, it is awesome!";

    match sentiment_classifier.extract(text).await {
        Ok(result) => pretty_print_result(text, &result),
        Err(e) => eprintln!("Got wrong answer: {}", e),
    }

Expected behavior

I expect it to send the same request as the Python openai client.

Screenshots

This is what rig sends (note that the message content is not a string):

[screenshot]

It gives me the following error:

[screenshot]

This is what the Python openai client sends (and what I expect):

[screenshot]

Additional context

@0xMochan
Contributor

Hey! Thanks for this issue. We actually have proper deepseek support via our rig-core/providers/deepseek.rs module. Using that will ensure the proper format that deepseek accepts.

    let deepseek_client = deepseek::Client::from_env();

    let sentiment_classifier = deepseek_client
        .extractor::<SentimentClassification>("xxxx")
        .build();

    let text = "I like this house, it is awesome!";

    match sentiment_classifier.extract(text).await {
        Ok(result) => pretty_print_result(text, &result),
        Err(e) => eprintln!("Got wrong answer: {}", e),
    }

Even though DeepSeek and other providers say they accept the OpenAI-compatible format, we've now learned that they only accept a subset of the actual format. We might adjust our openai client to serialize in a similar manner, but it's a bit of an odd situation.
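Based on the screenshots above, the mismatch appears to be in the shape of `messages[].content`: rig's generic openai client serializes it as an array of typed content parts, while the Python openai client sends a plain string. A hedged sketch of the two shapes side by side (the top-level keys here are just labels for this illustration, not part of either payload):

```json
{
  "what_rig_sends": {
    "role": "user",
    "content": [ { "type": "text", "text": "I like this house, it is awesome!" } ]
  },
  "what_python_openai_sends": {
    "role": "user",
    "content": "I like this house, it is awesome!"
  }
}
```

Both shapes are valid against the OpenAI chat completions API, but servers that implement only a subset of the spec may reject the array form.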

@jamesvren
Author


I deployed DeepSeek with Mistral, and it only responds in the OpenAI format. I tried the deepseek provider, but it failed to parse the response. I also looked at the OpenAI documentation: the examples for the default and streaming cases both use a string for content, and only use a dict with a type field for images. So I think it would be better to support the same format :)

@jamesvren
Author

jamesvren commented Feb 28, 2025

By the way, it would be better to have debug output of the request JSON before it is sent, and of the response text before deserialization, since either step can fail in some cases. That would save us time when debugging.
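A minimal sketch of what such debug logging could look like (this is a hypothetical helper, not rig's actual internals; the HTTP call itself is omitted):

```rust
// Hypothetical sketch: log the outgoing request body and the raw response
// text before deserialization, so format mismatches like the one in this
// issue are visible without attaching a proxy.
fn send_with_debug(body: &str) -> String {
    eprintln!("request json: {body}");
    // ... the actual HTTP call would go here (omitted in this sketch);
    // pretend the server returned this raw text:
    let raw_response = String::from("{\"choices\":[]}");
    eprintln!("raw response (pre-deserialization): {raw_response}");
    raw_response
}

fn main() {
    let raw = send_with_debug("{\"model\":\"deepseek-chat\"}");
    assert_eq!(raw, "{\"choices\":[]}");
}
```

In a real client this would typically sit behind a log level (e.g. `debug!`/`trace!` from the `tracing` or `log` crates) so the payloads are only printed when requested.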

@0xMochan
Contributor

0xMochan commented Feb 28, 2025 via email
