Update sl-jdbc.md #4102

Merged
merged 19 commits on Oct 13, 2023
Changes from 14 commits

19 commits
132668e
Update sl-jdbc.md
rpourzand Sep 20, 2023
1cc4c03
Update sl-jdbc.md
rpourzand Sep 20, 2023
6911c26
Merge branch 'current' into rpourzand-order-by-and-where-updates
mirnawong1 Sep 22, 2023
438568b
Update website/docs/docs/dbt-cloud-apis/sl-jdbc.md
rpourzand Sep 22, 2023
85f1772
Update sl-jdbc.md
rpourzand Sep 22, 2023
daad977
Merge branch 'current' into rpourzand-order-by-and-where-updates
mirnawong1 Sep 25, 2023
c7271ef
Merge branch 'current' into rpourzand-order-by-and-where-updates
mirnawong1 Sep 25, 2023
690e5bc
Update website/docs/docs/dbt-cloud-apis/sl-jdbc.md
mirnawong1 Sep 25, 2023
caff001
Merge branch 'current' into rpourzand-order-by-and-where-updates
mirnawong1 Oct 2, 2023
904d209
update tabs
mirnawong1 Oct 2, 2023
6cf965b
Merge branch 'rpourzand-order-by-and-where-updates' of https://github…
mirnawong1 Oct 2, 2023
cc5eb65
Merge branch 'current' into rpourzand-order-by-and-where-updates
mirnawong1 Oct 2, 2023
c7ad961
Merge branch 'current' into rpourzand-order-by-and-where-updates
mirnawong1 Oct 2, 2023
0ba8510
clarify integrations
mirnawong1 Oct 2, 2023
a6a1d67
Update website/snippets/_sl-partner-links.md
mirnawong1 Oct 13, 2023
d6b21b5
Merge branch 'current' into rpourzand-order-by-and-where-updates
mirnawong1 Oct 13, 2023
12d3e7e
Update avail-sl-integrations.md
mirnawong1 Oct 13, 2023
86ec8b6
Merge branch 'current' into rpourzand-order-by-and-where-updates
mirnawong1 Oct 13, 2023
8180f4b
Merge branch 'current' into rpourzand-order-by-and-where-updates
mirnawong1 Oct 13, 2023
33 changes: 20 additions & 13 deletions website/docs/docs/dbt-cloud-apis/sl-jdbc.md
@@ -5,7 +5,6 @@ description: "Integrate and use the JDBC API to query your metrics."
tags: [Semantic Layer, API]
---


<VersionBlock lastVersion="1.5">

import LegacyInfo from '/snippets/_legacy-sl-callout.md';
@@ -59,11 +58,13 @@ jdbc:arrow-flight-sql://semantic-layer.cloud.getdbt.com:443?&environmentId=20233

## Querying the API for metric metadata

The Semantic Layer JDBC API has built-in metadata calls which can provide a user with information about their metrics and dimensions. Here are some metadata commands and examples:
The Semantic Layer JDBC API has built-in metadata calls that provide information about your metrics and dimensions.

Refer to the following tabs for metadata commands and examples:

<Tabs>

<TabItem value="allmetrics" label="Fetch all defined metrics">
<TabItem value="allmetrics" label="Fetch defined metrics">

Use this query to fetch all defined metrics in your dbt project:

@@ -74,7 +75,7 @@ select * from {{
```
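The body of the call is collapsed in this view; a minimal sketch of the full query, assuming the metadata function is the `semantic_layer.metrics()` call described on this page:

```bash
select * from {{
        semantic_layer.metrics()
}}
```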
</TabItem>
<TabItem value="alldimensions" label="Fetch all dimensions for a metric">
<TabItem value="alldimensions" label="Fetch dimensions for a metric">
Use this query to fetch all dimensions for a metric.
@@ -87,7 +88,7 @@ select * from {{
</TabItem>
<TabItem value="dimensionvalueformetrics" label="Fetch dimension values metrics">
<TabItem value="dimensionvalueformetrics" label="Fetch dimension values">
Use this query to fetch dimension values for one or multiple metrics and a single dimension.
@@ -100,7 +101,7 @@ semantic_layer.dimension_values(metrics=['food_order_amount'], group_by=['custom
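Sketched in full, this call shape looks like the following (the `customer__customer_type` dimension is borrowed from examples later on this page; treat both names as illustrative):

```bash
select * from {{
        semantic_layer.dimension_values(metrics=['food_order_amount'],
                group_by=['customer__customer_type'])
}}
```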
</TabItem>
<TabItem value="queryablegranularitiesformetrics" label="Fetch queryable primary time granularities for metrics">
<TabItem value="queryablegranularitiesformetrics" label="Fetch queryable granularities for metrics">
Use this query to fetch queryable granularities for a list of metrics. This call shows only the time granularities that make sense for the primary time dimension of the metrics (such as `metric_time`). If you want queryable granularities for other time dimensions, use the `dimensions()` call and inspect the `queryable_granularities` column.
@@ -113,6 +114,9 @@ select * from {{
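The query body is collapsed above; a sketch of the full call, assuming the metadata function is named `semantic_layer.queryable_granularities()` and takes a `metrics` list like the other calls on this page:

```bash
select * from {{
        semantic_layer.queryable_granularities(metrics=['food_order_amount'])
}}
```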
</TabItem>
</Tabs>
<Tabs>
<TabItem value="metricsfordimensions" label="Fetch available metrics given dimensions">
@@ -144,9 +148,10 @@ select NAME, QUERYABLE_GRANULARITIES from {{
</TabItem>
<TabItem value="fetchprimarytimedimensionnames" label="Determine what time dimension(s) make up metric_time for your metric(s)">
<TabItem value="fetchprimarytimedimensionnames" label="Fetch primary time dimension names">
It may be useful in your application to expose the names of the time dimensions that represent `metric_time`, the common thread across all metrics.
You can first query `metrics()` to fetch a list of measures, then use the `measures()` call, which returns the name(s) of the time dimensions that make up metric time.
```bash
@@ -167,12 +172,13 @@ To query metric values, the following parameters are available:
| `metrics` | The metric name as defined in your dbt metric configuration | `metrics=['revenue']` | Required |
| `group_by` | Dimension names or entities to group by. We require a reference to the entity of the dimension (other than for the primary time dimension), which is pre-appended to the front of the dimension name with a double underscore. | `group_by=['user__country', 'metric_time']` | Optional |
| `grain` | A parameter specific to time dimensions that changes the grain of the data from the metric's default. | `group_by=[Dimension('metric_time')` <br/> `grain('week\|day\|month\|quarter\|year')]` | Optional |
| `where` | A where clause that allows you to filter on dimensions and entities using parameters - comes with `TimeDimension`, `Dimension`, and `Entity` objects. Granularity is required with `TimeDimension` | `"{{ where=Dimension('customer__country') }} = 'US')"` | Optional |
| `where` | A where clause that allows you to filter on dimensions and entities using parameters. This takes a filter list OR string. Inputs come with `Dimension` and `Entity` objects. Granularity is required if the `Dimension` is a time dimension | `where="{{ Dimension('customer__country') }} = 'US'"` | Optional |
| `limit` | Limit the data returned | `limit=10` | Optional |
|`order` | Order the data returned | `order_by=['-order_gross_profit']` (remove `-` for ascending order) | Optional |
| `compile` | If true, returns generated SQL for the data platform but does not execute | `compile=True` | Optional |
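Several of these parameters can be combined in a single query. A sketch, reusing the metric and dimension names from this page's examples (they are illustrative, not required values):

```bash
select * from {{
        semantic_layer.query(metrics=['food_order_amount'],
                group_by=[Dimension('metric_time').grain('month')],
                limit=10,
                order_by=['-food_order_amount'])
}}
```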
## Note on time dimensions and `metric_time`
You will notice that in the list of dimensions for all metrics, there is a dimension called `metric_time`. `Metric_time` is a reserved keyword for the measure-specific aggregation time dimensions. For any time-series metric, the `metric_time` keyword should always be available for use in queries. This is a common dimension across *all* metrics in a semantic graph.
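For example, a time-series query grouped by `metric_time` might look like the following sketch (again using this page's example metric):

```bash
select * from {{
        semantic_layer.query(metrics=['food_order_amount'],
                group_by=[Dimension('metric_time').grain('week')])
}}
```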
@@ -246,13 +252,13 @@ select * from {{
Where filters in the API allow for a filter list or string. We recommend using the filter list for production applications, as this format realizes all benefits from the <Term id="predicate-pushdown" /> where possible.
Where filters have the following components that you can use:
Where Filters have a few objects that you can use:
- `Dimension()` - This is used for any categorical or time dimensions. If used for a time dimension, granularity is required - `Dimension('metric_time').grain('week')` or `Dimension('customer__country')`
- `TimeDimension()` - This is used for all time dimensions and requires a granularity argument - `TimeDimension('metric_time', 'MONTH')`
- `Entity()` - Used for entities like primary and foreign keys - `Entity('order_id')`
- `Entity()` - This is used for entities like primary and foreign keys - `Entity('order_id')`
Note: If you prefer a more explicit path to create the `where` clause, you can optionally use the `TimeDimension` feature. This helps separate categorical dimensions from time-related ones. The `TimeDimension` input takes the time dimension name and also requires granularity, like this: `TimeDimension('metric_time', 'MONTH')`.
Use the following example to query using a `where` filter with the string format:
@@ -261,7 +267,7 @@ select * from {{
select * from {{
semantic_layer.query(metrics=['food_order_amount', 'order_gross_profit'],
group_by=[Dimension('metric_time').grain('month'),'customer__customer_type'],
where="{{ TimeDimension('metric_time', 'MONTH') }} >= '2017-03-09' AND {{ Dimension('customer__customer_type' }} in ('new') AND {{ Entity('order_id') }} = 10")
where="{{ Dimension('metric_time').grain('month') }} >= '2017-03-09' AND {{ Dimension('customer__customer_type') }} in ('new') AND {{ Entity('order_id') }} = 10")
}}
```
@@ -271,7 +277,7 @@ Use the following example to query using a `where` filter with a filter list for
select * from {{
semantic_layer.query(metrics=['food_order_amount', 'order_gross_profit'],
group_by=[Dimension('metric_time').grain('month'),'customer__customer_type'],
where=[{{ TimeDimension('metric_time', 'MONTH')}} >= '2017-03-09', {{ Dimension('customer__customer_type' }} in ('new'), {{ Entity('order_id') }} = 10])
where=[{{ Dimension('metric_time').grain('month') }} >= '2017-03-09', {{ Dimension('customer__customer_type') }} in ('new'), {{ Entity('order_id') }} = 10])
}}
```
@@ -287,6 +293,7 @@ semantic_layer.query(metrics=['food_order_amount', 'order_gross_profit'],
order_by=['order_gross_profit'])
}}
```
### Query with compile keyword
Use the following example to query using a `compile` keyword:
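The example itself is collapsed in this view; a minimal sketch, reusing this page's example metric:

```bash
select * from {{
        semantic_layer.query(metrics=['food_order_amount'],
                group_by=[Dimension('metric_time')],
                compile=True)
}}
```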
4 changes: 2 additions & 2 deletions website/docs/guides/migration/sl-migration.md
@@ -65,8 +65,8 @@ This step is only relevant to users who want the legacy and new semantic layer t
1. Create a new deployment environment in dbt Cloud and set the dbt version to 1.6 or higher.
2. Choose `Only run on a custom branch` and point to the branch that has the updated metric definition
3. Set the deployment schema to a temporary migration schema, such as `tmp_sl_migration`. Optional, you can create a new database for the migration.
4. Create a job to parse your project, such as `dbt parse`, and run it. Make sure this job succeeds, There needs to be a successful job in your environment in order to set up the semantic layer
5. In Account Settings > Projects > Project details click `Configure the Semantic Layer`. Under **Environment**select the deployment environment you created in the previous step. Save your configuration.
4. Create a job to parse your project, such as `dbt parse`, and run it. Make sure this job succeeds; a successful job in your environment is required to set up the semantic layer
5. In Account Settings > Projects > Project details, click `Configure the Semantic Layer`. Under **Environment**, select the deployment environment you created in the previous step. Save your configuration.
6. In the Project details page, click `Generate service token` and grant it `Semantic Layer Only` and `Metadata Only` permissions. Save this token securely - you will need it to connect to the semantic layer.
At this point, both the new semantic layer and the old semantic layer will be running. The new semantic layer will be pointing at your migration branch with the updated metrics definitions.
26 changes: 19 additions & 7 deletions website/snippets/_sl-partner-links.md
@@ -1,11 +1,23 @@
<!-- turn this list into cards or sections once more docs are provided -->
The dbt Semantic Layer integrations are capable of querying dbt metrics, importing definitions, surfacing the underlying data in partner tools, and more. These are the following tools that integrate with the dbt Semantic Layer:
## Integrations

1. **Mode** &mdash; To learn more about integrating with Mode, check out their [documentation](https://mode.com/help/articles/supported-databases/#dbt-semantic-layer) for more info.
2. **Hex** &mdash; To learn more about integrating with Hex, check out their [documentation](https://learn.hex.tech/docs/connect-to-data/data-connections/dbt-integration#dbt-semantic-layer-integration) for more info. Additionally, refer to [dbt Semantic Layer cells](https://learn.hex.tech/docs/logic-cell-types/transform-cells/dbt-metrics-cells) to set up SQL cells in Hex.
3. **Google Sheets** &mdash; Google Sheets integration coming soon.
4. **Tools that allows you to write SQL** &mdash; They must meet one of the two criteria:
* Supports a generic JDBC driver option (such as DataGrip) or
* Supports Dremio and uses ArrowFlightSQL driver version 12.0.0 or higher.
The dbt Semantic Layer integrations can do things like query dbt metrics, import definitions, surface the underlying data in partner tools, and more.

The tools that work with the dbt Semantic Layer include:

1. **Mode**<br />
To learn more about integrating with Mode, check out their [documentation](https://mode.com/help/articles/supported-databases/#dbt-semantic-layer) for more info.

1. **Hex**<br />
To learn more about integrating with Hex, check out their [documentation](https://learn.hex.tech/docs/connect-to-data/data-connections/dbt-integration#dbt-semantic-layer-integration) for more info. Additionally, refer to [dbt Semantic Layer cells](https://learn.hex.tech/docs/logic-cell-types/transform-cells/dbt-metrics-cells) to set up SQL cells in Hex.

1. **Google Sheets**<br />
Google Sheets integration coming soon.

1. **Tools that allow you to write SQL**<br />
To connect to tools that allow you to write SQL, they must meet one of two criteria:
- Supports a generic JDBC driver option (such as DataGrip) or
- Supports Dremio and uses ArrowFlightSQL driver version 12.0.0 or higher.

Before you connect to these tools, you'll need to first [set up the dbt Semantic Layer](/docs/use-dbt-semantic-layer/setup-sl) and [generate a service token](/docs/dbt-cloud-apis/service-tokens) to create a Semantic Layer Only and Metadata Only service token.