o3-mini API endpoint not supported #3916
Comments
I also see there's an issue when using o3-mini: #3914.

Developers of Continue, please add o3 support to the o1 handling in `core/llm/llms/OpenAI.ts` as soon as possible. 😢
Fixes continuedev#3916 — add support for the o3-mini model in `core/llm/llms/OpenAI.ts`:

* Add "o3-mini" to the list of supported models.
* Implement an `isO3Model` method to check whether the model is o3-mini.
* Update the `supportsPrediction` method to include "o3-mini".
* Modify the `_convertArgs` method to handle `max_completion_tokens` for o3-mini.
* Update the `modifyChatBody` method to handle `max_completion_tokens` for o3-mini.

For more details, open the [Copilot Workspace session](https://copilot-workspace.githubnext.com/continuedev/continue/issues/3916?shareId=XXXX-XXXX-XXXX-XXXX).
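The `max_completion_tokens` handling described above can be sketched as follows. The function and field names here are illustrative, not the actual Continue source; the only API fact assumed is that OpenAI's o-series reasoning models reject `max_tokens` and expect `max_completion_tokens` instead.

```typescript
// Minimal sketch of a body-rewriting step for o-series models.
// Not the actual Continue implementation.
interface ChatBody {
  model: string;
  max_tokens?: number;
  max_completion_tokens?: number;
  [key: string]: unknown;
}

function isOSeriesModel(model: string): boolean {
  // o1, o1-mini, o3-mini, etc.
  return model.startsWith("o1") || model.startsWith("o3");
}

function modifyChatBody(body: ChatBody): ChatBody {
  if (isOSeriesModel(body.model) && body.max_tokens !== undefined) {
    // Rename max_tokens -> max_completion_tokens for reasoning models.
    const { max_tokens, ...rest } = body;
    return { ...rest, max_completion_tokens: max_tokens };
  }
  return body;
}
```

Non-o-series requests pass through unchanged, so the rewrite is safe to apply unconditionally before sending the request.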
Seems that the model works now, but it would be good to have reasoning effort added for the o1/o3 models too.
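For reference, OpenAI's Chat Completions API exposes a `reasoning_effort` parameter for its reasoning models. A request body using it might look like the sketch below; the token limit and message content are placeholders, and the parameter's accepted values may change, so check the current API reference.

```typescript
// Illustrative Chat Completions request body for an o-series model.
const requestBody = {
  model: "o3-mini",
  reasoning_effort: "medium", // "low" | "medium" | "high"
  max_completion_tokens: 2048, // o-series replacement for max_tokens
  messages: [{ role: "user", content: "Hello" }],
};
```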
Yes, it seems it works now.
I tried the latest version that was merged, but it only "works" — it is not fully operational.
Hi, I was wondering if this is working now. Is there an example config that works?

I'd love an example of a working config as well.
Yesterday, after many months, I deposited $ into my OpenAI account, only to find out Continue does not support o3-mini.
@summersonnn you're being super demanding of an open source project. Anyway, just install the pre-release version and set o3-mini as your model type (following the openai examples in the docs). It works. |
By the way, regarding the issue where o3-mini can't handle long contexts when used with Continue, has the problem been resolved? |
Before submitting your bug report
Relevant environment info
Description
Continue does not currently support o3-mini. It may take some time before support is added, so we will distribute a config.ts that will allow you to run o3-mini flawlessly.
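Since the actual config.ts contents aren't reproduced in this thread, here is a hypothetical sketch of what such a workaround might look like. The `modifyConfig` hook follows the pattern Continue documents for config.ts; the simplified `Config` interface below is a stand-in for the type Continue supplies, and the model settings are assumptions.

```typescript
// Hypothetical config.ts sketch — not the author's actual file.
// In a real config.ts, Continue provides the Config type and the
// modifyConfig function is exported; this stand-in keeps the sketch
// self-contained.
interface ModelDescription {
  title: string;
  provider: string;
  model: string;
  apiKey?: string;
}

interface Config {
  models: ModelDescription[];
}

function modifyConfig(config: Config): Config {
  // Register o3-mini as an OpenAI model; replace the placeholder key.
  config.models.push({
    title: "o3-mini",
    provider: "openai",
    model: "o3-mini",
    apiKey: "sk-...", // your OpenAI API key
  });
  return config;
}
```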
config.ts for o3-mini

Open config.ts with:

code %USERPROFILE%\.continue\config.ts

and replace its contents with the following.

To reproduce
No response
Log output