Commit

initial commit
cswatt committed Feb 12, 2025
1 parent 281cbc5 commit bd44628
Showing 6 changed files with 48 additions and 1 deletion.
3 changes: 3 additions & 0 deletions content/en/data_streams/dotnet.md
@@ -27,6 +27,9 @@ environment:
- DD_DATA_STREAMS_ENABLED: "true"
```
### Monitoring connectors
{{% dsm_connectors %}}
### Monitoring SQS pipelines
Data Streams Monitoring uses one [message attribute][2] to track a message's path through an SQS queue. Because Amazon SQS allows a maximum of 10 message attributes per message, any message sent through your data pipelines must have 9 or fewer message attributes set, leaving one attribute available for Data Streams Monitoring.
3 changes: 3 additions & 0 deletions content/en/data_streams/java.md
@@ -51,6 +51,9 @@ To set up Data Streams Monitoring from the Datadog UI without needing to restart

{{< img src="data_streams/enable_dsm_service_catalog.png" alt="Enable the Data Streams Monitoring from the Dependencies section of the APM Service Page" >}}

### Monitoring connectors
{{% dsm_connectors %}}

### Monitoring SQS pipelines
Data Streams Monitoring uses one [message attribute][3] to track a message's path through an SQS queue. Because Amazon SQS allows a maximum of 10 message attributes per message, any message sent through your data pipelines must have 9 or fewer message attributes set, leaving one attribute available for Data Streams Monitoring.

3 changes: 3 additions & 0 deletions content/en/data_streams/nodejs.md
@@ -33,6 +33,9 @@ environment:
- DD_DATA_STREAMS_ENABLED: "true"
```
### Monitoring connectors
{{% dsm_connectors %}}
### Monitoring SQS pipelines
Data Streams Monitoring uses one [message attribute][4] to track a message's path through an SQS queue. Because Amazon SQS allows a maximum of 10 message attributes per message, any message sent through your data pipelines must have 9 or fewer message attributes set, leaving one attribute available for Data Streams Monitoring.
5 changes: 4 additions & 1 deletion content/en/data_streams/python.md
@@ -33,8 +33,11 @@ environment:
- DD_DATA_STREAMS_ENABLED: "true"
```
### Monitoring connectors
{{% dsm_connectors %}}
### Monitoring SQS pipelines
Data Streams Monitoring uses one [message attribute][4] to track a message's path through an SQS queue. As Amazon SQS has a maximum limit of 10 message attributes allowed per message, all messages streamed through the data pipelines must have 9 or less message attributes set, allowing the remaining attribute for Data Streams Monitoring.
Data Streams Monitoring uses one [message attribute][4] to track a message's path through an SQS queue. Because Amazon SQS allows a maximum of 10 message attributes per message, any message sent through your data pipelines must have 9 or fewer message attributes set, leaving one attribute available for Data Streams Monitoring.
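The attribute budget described above can be sketched as a simple pre-send check. This is an illustrative sketch only; the helper name and attribute values are hypothetical, not part of any Datadog or AWS API:

```python
# Illustrative sketch only: helper name and attribute values are hypothetical.
# Amazon SQS allows at most 10 message attributes per message; Data Streams
# Monitoring uses one of them, so application code can set at most 9.

SQS_MAX_MESSAGE_ATTRIBUTES = 10
DSM_RESERVED_ATTRIBUTES = 1

def leaves_room_for_dsm(message_attributes: dict) -> bool:
    """Return True if the message still has a free slot for the DSM tracking attribute."""
    return len(message_attributes) <= SQS_MAX_MESSAGE_ATTRIBUTES - DSM_RESERVED_ATTRIBUTES

nine_attrs = {f"attr_{i}": {"DataType": "String", "StringValue": str(i)} for i in range(9)}
ten_attrs = {f"attr_{i}": {"DataType": "String", "StringValue": str(i)} for i in range(10)}

fits = leaves_room_for_dsm(nine_attrs)         # True: one attribute slot remains
does_not_fit = leaves_room_for_dsm(ten_attrs)  # False: all 10 slots are used
```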
### Monitoring Kinesis pipelines
Kinesis has no message attributes that can propagate context and track a message's full path through a stream. As a result, Data Streams Monitoring approximates end-to-end latency by summing the latency measured on each segment of a message's path: from the producing service, through a Kinesis stream, to a consumer service. Throughput metrics are likewise based on these segments. The full topology of your data streams can still be visualized by instrumenting your services.
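The approximation described above amounts to summing per-segment latencies. A minimal sketch, with made-up numbers (this is not Datadog's implementation):

```python
# Illustrative sketch (not Datadog's implementation): with no message
# attributes available in Kinesis, end-to-end latency is approximated by
# summing the latency measured on each segment of the message's path.

segment_latencies_ms = [
    250.0,  # producing service -> Kinesis stream
    175.0,  # Kinesis stream -> consumer service
]

approx_end_to_end_latency_ms = sum(segment_latencies_ms)
```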
35 changes: 35 additions & 0 deletions layouts/shortcodes/dsm_connectors.md
@@ -0,0 +1,35 @@
#### Confluent Cloud connectors

Data Streams Monitoring can automatically discover your [Confluent Cloud][101] connectors and visualize them within the context of your end-to-end streaming data pipeline.

##### Setup

1. Install and configure the [Datadog-Confluent Cloud integration][102].
1. In Datadog, open the [Confluent Cloud integration tile][102].

<figure class="text-center">
<img src="{{ .Site.Params.img_url}}images/data_streams/confluent_cloud_connectors.png" alt="The Confluent Cloud integration tile, listing detected clusters and connectors" width="80%">
</figure>

Under **Actions**, a list of resources populates with detected clusters and connectors. Datadog attempts to discover new connectors every time you view this integration tile.
1. Select the resources you want to add.
1. Click **Add Resources**.

#### Self-hosted Kafka connectors

<div class="alert alert-info">This feature is in Preview.</div>

Data Streams Monitoring can collect information from your self-hosted Kafka connectors. In Datadog, these connectors are shown as services connected to Kafka topics, and Datadog collects throughput to and from all Kafka topics. Datadog does not collect connector status, and does not collect the sinks and sources of self-hosted Kafka connectors.

##### Setup

Use Datadog's Java tracer, [`dd-trace-java`][103], to collect information from your Kafka Connect workers.

1. [Add the `dd-java-agent.jar` file][104] to your Kafka Connect workers. Ensure that you are using `dd-trace-java` [v1.44+][105].
1. Modify your Java options to include the Datadog Java tracer on your worker nodes. For example, on Strimzi, modify `STRIMZI_JAVA_OPTS` to add `-javaagent:/path/to/dd-java-agent.jar`.
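On a Strimzi-managed worker, step 2 might look like the following. The jar path is an example only; use the location where you placed `dd-java-agent.jar`:

```shell
# Example only: the jar location is deployment-specific.
# Add the Datadog Java tracer to the Kafka Connect worker's JVM options.
export STRIMZI_JAVA_OPTS="-javaagent:/opt/datadog/dd-java-agent.jar ${STRIMZI_JAVA_OPTS:-}"
```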

[101]: https://www.confluent.io/confluent-cloud/
[102]: https://app.datadoghq.com/integrations/confluent-cloud
[103]: https://github.com/DataDog/dd-trace-java
[104]: /tracing/trace_collection/automatic_instrumentation/dd_libraries/java/?tab=wget
[105]: https://github.com/DataDog/dd-trace-java/releases/tag/v1.44.0
