
Return run IDs from invoke/batch endpoints #148

Merged
merged 7 commits into from
Nov 1, 2023

Conversation

jakerachleff
Contributor

@jakerachleff jakerachleff commented Oct 31, 2023

Clients of LangServe may be interested in the run_id of each parent LangChain run. This can be used, for example, by external tracing applications like LangSmith. In this PR, we introduce a new dictionary in the response for the invoke/batch endpoints called "metadata", which contains the run_id (or run_ids for batch).

Notes

  • Stream is not currently supported, since we do not have callbacks set up around streaming; we can add that later if needed
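For clients, the new field can be read straight off the response body. A minimal sketch, assuming the payload shapes implied by the description above (`metadata.run_id` for invoke, `metadata.run_ids` for batch); the example payloads are illustrative, not captured from a real server:

```python
from typing import Dict, List


def extract_run_ids(payload: Dict) -> List[str]:
    """Pull the run ID(s) out of an invoke or batch response body."""
    metadata = payload.get("metadata", {})
    if "run_ids" in metadata:  # batch endpoint: one run_id per input
        return [str(r) for r in metadata["run_ids"]]
    if "run_id" in metadata:  # invoke endpoint: a single parent run
        return [str(metadata["run_id"])]
    return []


# Illustrative payloads (field names taken from the PR description):
invoke_body = {"output": 2, "metadata": {"run_id": "9f3c0f7e-1111-4222-8333-444455556666"}}
batch_body = {"output": [2, 4], "metadata": {"run_ids": ["a-1", "b-2"]}}
```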


cla-bot bot commented Oct 31, 2023

Thank you for your pull request and welcome to our community. We require contributors to sign our Contributor License Agreement, and we don't seem to have the username @jakerachleff on file. In order for us to review and merge your code, please complete the Individual Contributor License Agreement here https://forms.gle/Ljhqvt9Gdi1N385W6 .

For more details about why we have a CLA and other contribution guidelines please see: https://github.com/langchain-ai/langserve/blob/main/CONTRIBUTING.md.

@jakerachleff jakerachleff force-pushed the 2023-10-31-run-id-return branch from c20c234 to 860a9a1 on October 31, 2023 23:26



    ):
        return str(event_aggregator.callback_events[0].get("run_id"))
    else:
        return None
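Expanded into a self-contained helper, the fragment above looks roughly like this. A sketch only: the function name and the event shape are assumptions, not the PR's actual code.

```python
from typing import Dict, List, Optional


def get_parent_run_id(callback_events: List[Dict]) -> Optional[str]:
    # The first aggregated callback event belongs to the parent run;
    # return its run_id as a string, or None if it is missing.
    if callback_events and callback_events[0].get("run_id") is not None:
        return str(callback_events[0]["run_id"])
    return None
```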
Contributor Author


Any thoughts on the error handling we'd want here? I just returned None for now, but that's obviously not ideal. Curious what we'd prefer.

Collaborator


I think this should never occur, so an assertion error is reasonable; we could also log an error once.

async with get_async_test_client(app, raise_app_exceptions=True) as async_client:
    response = await async_client.post("/invoke", json={"input": 1})
    run_id = response.json()["metadata"]["run_id"]
    assert _is_valid_uuid(run_id)
Contributor Author


These tests are kinda lame, since they don't actually match up the proper run ID. They just assert that some run ID was returned.

Do you guys know how to mock out the generation of run_ids for a given runnable run? If I could do that, then my tests would be much better. Not sure how to reach into the underlying infra like that, though.

Collaborator


There's a fake tracer (FakeTracer) -- check if it does what's needed; if not, we could use mock.patch (though that's brittle).
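The mock.patch approach mentioned here would look roughly like the sketch below: pin `uuid.uuid4` so that any run IDs generated during the test are deterministic. It's brittle precisely because it assumes the tracing internals generate run IDs by calling `uuid.uuid4`.

```python
import uuid
from unittest import mock

FIXED_ID = uuid.UUID("00000000-0000-0000-0000-000000000001")

# While the patch is active, every uuid.uuid4() call returns FIXED_ID,
# so a run ID surfaced in the response could be compared exactly.
with mock.patch("uuid.uuid4", return_value=FIXED_ID):
    assert uuid.uuid4() == FIXED_ID
```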

Collaborator


Took a quick look at the code; FakeTracer won't help here -- I think the current test is fine.



@jakerachleff
Contributor Author

I signed the verification form earlier... not sure why it's still being marked as failed :/

@eyurtsev
Collaborator

eyurtsev commented Nov 1, 2023

@jakerachleff it's not automated I need to add you manually 😞

@cla-bot cla-bot bot added the cla-signed label Nov 1, 2023
@eyurtsev
Collaborator

eyurtsev commented Nov 1, 2023

@jakerachleff you're approved now :)

Collaborator

@eyurtsev eyurtsev left a comment


@jakerachleff this looks good to me; let's touch base with @nfcampos to get his take on putting this in the response body vs. in response headers

@jakerachleff jakerachleff merged commit cc2f9b8 into main Nov 1, 2023
6 checks passed
@jakerachleff jakerachleff deleted the 2023-10-31-run-id-return branch November 1, 2023 16:36