Exposing trace context to python backend #346
Conversation
LGTM, especially on the shm part, great work 🚀
I understand that this is only stage 1, but I was wondering if we plan to add some documentation on how to interact with the trace context? (Maybe we'd want to wait until stage 2 is completed?) I think the demo code in the server PR would be really helpful, at least for my own understanding.
@krishung5, yes, good point. I'll update the docs in a separate PR so that this one can be cherry-picked faster.
struct InferenceTraceShm {
  bi::managed_external_buffer::handle_t trace_context_shm_handle;
  // The address of the 'TRITONSERVER_InferTrace' object.
  void* triton_trace;
Can you remind me why this is `void*`? To avoid C API references out of process in the stub?
Yes, the stub does not have access to `TRITONSERVER_InferTrace`.
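Since the stub runs in a separate process, the serialized trace context travels through shared memory while the raw `TRITONSERVER_InferTrace` pointer stays meaningful only in the parent. A minimal Python sketch of that IPC pattern using the standard `multiprocessing.shared_memory` module (an illustration of the idea, not the backend's actual C++ code):

```python
from multiprocessing import shared_memory

# Parent side: serialize the trace context into a shared-memory segment.
# (The JSON payload below is a made-up example carrier.)
ctx = b'{"traceparent": "00-0af7651916cd43dd8448eb211c80319c-b7ad6b7169203331-01"}'
shm = shared_memory.SharedMemory(create=True, size=len(ctx))
shm.buf[:len(ctx)] = ctx

# Stub side: attach to the same segment by name and read the bytes back.
# The raw C pointer never crosses this boundary, which is why the shm
# struct stores it as an opaque void* only the parent dereferences.
stub_view = shared_memory.SharedMemory(name=shm.name)
restored = bytes(stub_view.buf[:len(ctx)])

stub_view.close()
shm.close()
shm.unlink()
```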
Looks great! I had a question, but it's unrelated to approval: https://github.com/triton-inference-server/python_backend/pull/346/files#r1526885602
As discussed offline, it would be good to have documentation and an example for this new feature.
@@ -1611,7 +1611,14 @@ PYBIND11_EMBEDDED_MODULE(c_python_backend_utils, module)
       .export_values();

   py::class_<InferenceTrace, std::shared_ptr<InferenceTrace>>(
-      module, "InferenceTrace");
+      module, "InferenceTrace")
+          .def("get_context", [](InferenceTrace& self) -> py::object {
Can we remove `get_` from here? We usually don't include `get_` in public API names unless there is a name conflict.
.def("get_context", [](InferenceTrace& self) -> py::object {
  auto context = self.Context();
  if (context != "") {
    return py::str(context);
Is it possible to convert the context into the object that the Python OTel API requires?
It would be good to have a Python object wrapper for the context. It would give us more flexibility for future changes.
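A hypothetical sketch of such a wrapper (the class name, the JSON carrier format, and the `traceparent` key are all assumptions for illustration, not this PR's API):

```python
import json

class TraceContext:
    """Hypothetical wrapper around the raw context string handed to the
    stub; wrapping it lets the representation change later without
    breaking user code."""

    def __init__(self, raw: str):
        self._raw = raw
        # Assumed format: a JSON object of W3C trace-context headers.
        self._carrier = json.loads(raw) if raw else {}

    @property
    def raw(self) -> str:
        return self._raw

    def get(self, key, default=None):
        # Dict-like access to individual propagation headers.
        return self._carrier.get(key, default)

ctx = TraceContext(
    '{"traceparent": "00-0af7651916cd43dd8448eb211c80319c-b7ad6b7169203331-01"}')
```

A wrapper like this could later grow a method that feeds the carrier into OpenTelemetry's propagator API without changing the stub-facing interface.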
  if (context != "") {
    return py::str(context);
  }
  return py::none();
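The binding maps an empty context string to Python `None`, so user code must handle both cases. A stand-in class (not the real `InferenceTrace`) showing the resulting Python-side contract:

```python
class FakeInferenceTrace:
    """Stand-in mimicking the pybind11 binding above: get_context()
    returns the context string, or None when the string is empty."""

    def __init__(self, context: str = ""):
        self._context = context

    def get_context(self):
        # Mirrors the C++ lambda: "" becomes py::none().
        return self._context or None

# Callers should branch on None before using the context.
present = FakeInferenceTrace("traceparent=00-abc-def-01").get_context()
absent = FakeInferenceTrace().get_context()
```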
Do we have a plan for what the context is going to look like for non-OTel scenarios?
Added the logic needed to pass the trace context to the stub process, plus accessing it from the stub process.
Current discussion: does `get_trace` need a `mode` field? https://github.com/triton-inference-server/server/pull/6985/files#r1525367279 Most likely the `mode` field will be removed from the `get_trace` API.

Related PRs:
core: triton-inference-server/core#334
server: triton-inference-server/server#6985