Adds quick start to docs and links to it from UI
Adds quick start cut & paste section to additional visibility docs.
Fixes some code in the docs too.

Links to it from the new insights section when there is no LLM
calls instrumented.
skrawcz committed Aug 14, 2024
1 parent 0e8138c commit a91d5df
Showing 2 changed files with 55 additions and 3 deletions.
46 changes: 45 additions & 1 deletion docs/concepts/additional-visibility.rst
@@ -7,6 +7,48 @@ Additional Visibility
Burr comes with the ability to see inside your actions. This is a very pluggable framework
that comes with the default tracking client, but can also be hooked up to tools such as `OpenTelemetry <https://opentelemetry.io/>`_

----------
Quickstart
----------
Below is a quick start. For more in-depth documentation, see the next few sections.

If you want to:
(a) automatically instrument LLM API calls, and
(b) see them show up in the Burr UI,

you can do the following:

1. Determine the LLM API you want to instrument (e.g. OpenAI, Anthropic, etc.). \
   See the `openllmetry repo <https://github.com/traceloop/openllmetry/tree/main/packages>`_ for available options.
2. Use the local tracker and set the ``use_otel_tracing`` flag to ``True`` in the ``ApplicationBuilder``.

Here's an example to instrument OpenAI:

.. code-block:: bash

    # install the appropriate openllmetry package
    pip install opentelemetry-instrumentation-openai

.. code-block:: python

    # add the right imports
    from opentelemetry.instrumentation.openai import OpenAIInstrumentor

    from burr.core import ApplicationBuilder
    from burr.tracking import LocalTrackingClient

    OpenAIInstrumentor().instrument()  # this instruments openai clients

    # create the local tracker
    tracker = LocalTrackingClient(project="your-project")
    app = (
        ApplicationBuilder()
        .with_graph(base_graph)
        # ... whatever you do normally here
        .with_tracker(tracker, use_otel_tracing=True)  # set use_otel_tracing to True
        .build()
    )
    # use your app as you normally would -- go to the Burr UI and see additional spans!

-------
Tracing
-------
@@ -143,10 +185,11 @@ example:
.with_actions(my_action, ...)
.with_state(...)
.with_transitions(...)
- .with_tracker("local", project="my_projet", cuse_otel_tracing=True)
+ .with_tracker("local", project="my_project", use_otel_tracing=True)
.with_entrypoint("prompt", "my_action")
.build()
)

While this is contrived, it illustrates that you can mix and match Burr and OTel. This is valuable
when you have a Burr action that calls out to a function that is instrumented via OTel
(of which there are a host of integrations).
@@ -166,6 +209,7 @@ log all spans to the OTel provider of choice (and you are responsible for initializing
it as you see fit).

.. code-block:: python

    from opentelemetry import trace

    from burr.integrations.opentelemetry import OpenTelemetryBridge

    otel_tracer = trace.get_tracer(__name__)
12 changes: 10 additions & 2 deletions telemetry/ui/src/components/routes/app/InsightsView.tsx
@@ -41,7 +41,7 @@ export const InsightsView = (props: { steps: Step[] }) => {
// Display the total sum
// Skip cost for
return (
- <div>
+ <div className="pt-2">
<h2>Total Prompt Tokens: {totalPromptTokens}</h2>
<h2>Total Completion Tokens: {totalCompletionTokens}</h2>
{/*<h2>Total Cost: ${totalCost}</h2>*/}
@@ -51,7 +51,15 @@ export const InsightsView = (props: { steps: Step[] }) => {
} else {
return (
<div>
- <h2>No LLM calls found.</h2>
+ <h2 className="pt-2">
+   No LLM calls instrumented. To instrument{' '}
+   <a
+     className="text-dwlightblue"
+     href={'https://burr.dagworks.io/concepts/additional-visibility/#quickstart'}
+   >
+     see docs.
+   </a>
+ </h2>
</div>
);
}
