
(docs) Update W&B integration with LiteLLM #8052

Open
wants to merge 5 commits into main

Conversation


@ayulockin ayulockin commented Jan 28, 2025

New W&B Weave instrumentation with LiteLLM

The current documentation showing W&B's integration with LiteLLM is outdated. We no longer support that feature, so I opened this PR to bring the docs up to date.

W&B Weave is our new product for GenAI workflows. We have an auto-logging integration in our weave codebase, and this PR updates the documentation to reflect it and avoid further confusion.

The PR also adds a print_verbose statement indicating that WeightsBiasesLogger is deprecated. Let me know if there is a better way to surface a deprecation warning to users.
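On surfacing the deprecation: one standard alternative to a print_verbose message is Python's warnings module, which users see through normal tooling and can filter or escalate to errors. A minimal sketch (the function name below is a hypothetical stand-in, not litellm's actual entry point):

```python
import warnings

def weights_biases_logger(*args, **kwargs):
    """Hypothetical stand-in for the deprecated WeightsBiasesLogger entry point."""
    warnings.warn(
        "WeightsBiasesLogger is deprecated; use the W&B Weave integration "
        "(https://weave-docs.wandb.ai/guides/integrations/litellm) instead.",
        DeprecationWarning,
        stacklevel=2,
    )
```

With `stacklevel=2` the warning points at the caller's code rather than the logger internals.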

Type

📖 Documentation

vercel bot commented Jan 28, 2025:

litellm preview deployment: ✅ Ready (updated Jan 28, 2025 1:06pm UTC)

@ayulockin ayulockin marked this pull request as ready for review January 28, 2025 14:16

Install W&B Weave
```shell
pip install weave
```
@krrishdholakia (Contributor) commented:

@ayulockin I'm confused: why is the weave SDK required here?

If this is patching litellm then that would be bad, as it can cause unexpected errors.

It would be preferable for weave to be written as a custom logger with no SDK requirement (e.g. using pure httpx, like the langsmith integration - https://github.com/BerriAI/litellm/blob/9644e197f760c5cb87ae7662dbedbbce1b04c6a8/litellm/integrations/langsmith.py)
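For reference, the SDK-free style suggested above can be sketched with only the standard library. The class name, endpoint, and payload fields below are hypothetical; the `log_success_event` method name follows the shape of litellm's custom-logger callback interface, and the sender is injectable so the class can be exercised without network I/O:

```python
import json
from urllib import request

class HTTPTraceLogger:
    """Sketch of an SDK-free logger: builds a JSON payload per LLM call
    and POSTs it to a tracing backend (endpoint is a placeholder)."""

    def __init__(self, endpoint, send=None):
        self.endpoint = endpoint
        # Injectable sender so the logger can be tested without network I/O.
        self.send = send or self._post

    def _post(self, payload):
        # Plain stdlib HTTP POST; no vendor SDK import required.
        req = request.Request(
            self.endpoint,
            data=json.dumps(payload).encode(),
            headers={"Content-Type": "application/json"},
        )
        request.urlopen(req)

    def log_success_event(self, kwargs, response_obj, start_time, end_time):
        payload = {
            "model": kwargs.get("model"),
            "messages": kwargs.get("messages"),
            "response": response_obj,
            "latency_s": end_time - start_time,
        }
        self.send(payload)
```

A real integration would add auth headers, batching, and async delivery, but the point is that nothing above imports a vendor SDK or patches litellm internals.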

@ayulockin (Author) replied Jan 29, 2025:
Hey @krrishdholakia, weave is required because we are patching litellm; the patching happens when weave.init() is called.

It's a light patch: essentially decorating the completion and acompletion functions with weave.op() (docs).

We would love to write a custom logger, but for now this patching-based integration lives in our weave repo, and this PR only updates documentation for an outdated product feature. We no longer support the "Traces" product that the current documentation shows, which causes confusion for users.

We also have documentation on our site if you want to take a look: https://weave-docs.wandb.ai/guides/integrations/litellm. This PR aligns the capability described in both docs.
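The decorator-based patching described above (weave.init() wrapping completion/acompletion in weave.op()) can be sketched in miniature. This is illustrative only, not Weave's actual implementation; the `op` decorator and the toy `completion` function below are hypothetical stand-ins:

```python
import functools

def op(fn):
    """Illustrative stand-in for weave.op(): wraps a function so every
    call (inputs and output) is recorded. Not Weave's actual code."""
    recorded = []

    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        result = fn(*args, **kwargs)
        recorded.append({"args": args, "kwargs": kwargs, "output": result})
        return result

    wrapper.calls = recorded
    return wrapper

def completion(model, messages):
    # Toy stand-in for litellm.completion.
    return {"model": model, "content": "hello"}

# "Patching": rebind the module-level name to the wrapped function,
# which is roughly what happens on weave.init().
completion = op(completion)
```

Callers still invoke `completion(...)` as before; the wrapper just records each call on the side. In the real integration the recorded data is sent to the Weave backend as a trace instead of kept in memory.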

@krrishdholakia (Contributor) commented:

@ayulockin can you point me to how weave works with litellm? Code would be helpful here

@ayulockin replied:

I am sharing the line where the fn is decorated with weave.op(); the whole integration lives in this single file:

https://github.com/wandb/weave/blob/9ff97cab61e62b84ab7f1760d7968b6e61ef4b87/weave/integrations/litellm/litellm.py#L93
