Merge branch 'current' into sl-access-url-st-docs-update
mirnawong1 authored Apr 26, 2024
2 parents 4990950 + 03b9bc8 commit ba0bda5
Showing 14 changed files with 190 additions and 44 deletions.
4 changes: 2 additions & 2 deletions website/docs/docs/build/incremental-models.md
Original file line number Diff line number Diff line change
@@ -70,7 +70,7 @@ from {{ ref('app_data_events') }}

-- this filter will only be applied on an incremental run
-- (uses >= to include records whose timestamp occurred since the last run of this model)
where event_time >= (select max(event_time) from {{ this }})
where event_time >= (select coalesce(max(event_time), '1900-01-01') from {{ this }})

{% endif %}
```
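Taken in context, the filter above sits inside an `is_incremental()` block. A minimal sketch of the full pattern (model and column names are illustrative) might look like:

```sql
-- illustrative incremental model sketch; names are not from the source
{{ config(materialized='incremental') }}

select
    event_id,
    event_time,
    user_id
from {{ ref('app_data_events') }}

{% if is_incremental() %}

-- coalesce guards against a target table that exists but is empty:
-- there, max(event_time) returns NULL and the comparison would
-- otherwise filter out every row
where event_time >= (select coalesce(max(event_time), '1900-01-01') from {{ this }})

{% endif %}
```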
@@ -141,7 +141,7 @@ from {{ ref('app_data_events') }}

-- this filter will only be applied on an incremental run
-- (uses >= to include records arriving later on the same day as the last run of this model)
where date_day >= (select max(date_day) from {{ this }})
where date_day >= (select coalesce(max(date_day), '1900-01-01') from {{ this }})

{% endif %}

8 changes: 6 additions & 2 deletions website/docs/docs/cloud/cloud-cli-installation.md
@@ -84,9 +84,13 @@ Refer to the [FAQs](#faqs) if your operating system runs into path conflicts.
:::info
Advanced users can configure multiple projects to use the same dbt Cloud CLI by placing the executable in the Program Files folder and [adding it to their Windows PATH environment variable](https://medium.com/@kevinmarkvi/how-to-add-executables-to-your-path-in-windows-5ffa4ce61a53).
Advanced users can configure multiple projects to use the same dbt Cloud CLI by:
Note that if you are using VS Code, you must restart it to pick up modified environment variables.
1. Placing the executable file (`.exe`) in the "Program Files" folder
2. [Adding it to their Windows PATH environment variable](https://medium.com/@kevinmarkvi/how-to-add-executables-to-your-path-in-windows-5ffa4ce61a53)
3. Saving it where needed
Note that if you're using VS Code, you must restart it to pick up modified environment variables.
:::
3. Verify your installation by running `./dbt --help` in the command line. If you see the following output, your installation is correct:
2 changes: 1 addition & 1 deletion website/docs/docs/cloud/manage-access/mfa.md
@@ -55,7 +55,7 @@ Choose the next steps based on your preferred enrollment selection:
4. Store the backup passcode in a secure location. This key will be useful if the MFA method fails (like a lost or broken phone).


## Disclaimer
## Terms of Use

The terms below apply to dbt Cloud’s MFA via SMS program, which dbt Labs (“dbt Labs”, “we”, or “us”) uses to facilitate the automatic sending of authorization codes to users via SMS for dbt Cloud log-in requests.

Original file line number Diff line number Diff line change
@@ -88,6 +88,8 @@ We will begin deprecating support for spaces in dbt model names in v1.8 (raising
- Spaces in a model name make it impossible to `--select` the model name because the argument gets split into pieces over spaces very early in the pipeline.
- Most warehouses do not accept a table, or other object, with a space in its name.

To upgrade, replace any spaces in the model file name with an underscore and update any associated YAML that contains the model name to match. You can keep spaces in the database table name by configuring a [custom `alias`](/docs/build/custom-aliases#usage).

## Quick hits

- [Global config flags](/reference/global-configs/about-global-configs) are deprecated from the [`profiles.yml`](/docs/core/connect-data-platform/profiles.yml) file and should be moved to the [`dbt_project.yml`](/reference/dbt_project.yml).
Original file line number Diff line number Diff line change
@@ -112,6 +112,8 @@ The built-in [collect_freshness](https://github.com/dbt-labs/dbt-core/blob/1.5.l

Finally: The [built-in `generate_alias_name` macro](https://github.com/dbt-labs/dbt-core/blob/1.5.latest/core/dbt/include/global_project/macros/get_custom_name/get_custom_alias.sql) now includes logic to handle versioned models. If your project has reimplemented the `generate_alias_name` macro with custom logic, and you want to start using [model versions](/docs/collaborate/govern/model-versions), you will need to update the logic in your macro. Note that, while this is **not** a prerequisite for upgrading to v1.5—only for using the new feature—we recommend that you do this during your upgrade, whether you're planning to use model versions tomorrow or far in the future.

Likewise, if your project has reimplemented the `ref` macro with custom logic, you will need to update the logic in your macro as described [here](https://docs.getdbt.com/reference/dbt-jinja-functions/builtins).

### For consumers of dbt artifacts (metadata)

The [manifest](/reference/artifacts/manifest-json) schema version will be updated to `v9`. Specific changes:
84 changes: 74 additions & 10 deletions website/docs/guides/sl-snowflake-qs.md
@@ -273,7 +273,7 @@ There are two ways to connect dbt Cloud to Snowflake. The first option is Partne

Using Partner Connect allows you to create a complete dbt account with your [Snowflake connection](/docs/cloud/connect-data-platform/connect-snowflake), [a managed repository](/docs/collaborate/git/managed-repository), [environments](/docs/build/custom-schemas#managing-environments), and credentials.

1. In the Snowflake UI, click on the home icon in the upper left corner. In the left sidebar, select **Admin**. Then, select **Partner Connect**. Find the dbt tile by scrolling or by searching for dbt in the search bar. Click the tile to connect to dbt.
1. In the Snowflake UI, click on the home icon in the upper left corner. In the left sidebar, select **Data Products**. Then, select **Partner Connect**. Find the dbt tile by scrolling or by searching for dbt in the search bar. Click the tile to connect to dbt.

<Lightbox src="/img/snowflake_tutorial/snowflake_partner_connect_box.png" title="Snowflake Partner Connect Box" />

@@ -347,7 +347,11 @@ If you used Partner Connect, you can skip to [initializing your dbt project](#in
<Snippet path="tutorial-managed-repo" />

## Initialize your dbt project and start developing
Now that you have a repository configured, you can initialize your project and start development in dbt Cloud:
This guide assumes you use the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) to develop your dbt project and define metrics. However, the dbt Cloud IDE doesn't support using [MetricFlow commands](/docs/build/metricflow-commands) to query or preview metrics (support coming soon).

To query and preview metrics in your development tool, you can use the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation) to run the [MetricFlow commands](/docs/build/metricflow-commands).
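For example, once the dbt Cloud CLI is set up, metrics can be listed and previewed with MetricFlow commands like the following (the metric name here is illustrative, not from this guide):

```shell
# list the metrics defined in the project
dbt sl list metrics

# preview a metric grouped by time (metric name is illustrative)
dbt sl query --metrics order_total --group-by metric_time
```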

Now that you have a repository configured, you can initialize your project and start development in dbt Cloud using the IDE:

1. Click **Start developing in the dbt Cloud IDE**. It might take a few minutes for your project to spin up for the first time as it establishes your git connection, clones your repo, and tests the connection to the warehouse.
2. Above the file tree to the left, click **Initialize your project**. This builds out your folder structure with example models.
@@ -378,6 +382,8 @@ Name the new branch `build-project`.
2. Name the file `staging/jaffle_shop/src_jaffle_shop.yml` , then click **Create**.
3. Copy the following text into the file and click **Save**.

<File name='models/staging/jaffle_shop/src_jaffle_shop.yml'>

```yaml
version: 2
@@ -390,6 +396,8 @@ sources:
- name: orders
```

</File>

:::tip
In your source file, you can also use the **Generate model** button to create a new model file for each source. This creates a new file in the `models` directory with the given source name and fills in the SQL code of the source definition.
:::
@@ -398,6 +406,8 @@ In your source file, you can also use the **Generate model** button to create a
5. Name the file `staging/stripe/src_stripe.yml` , then click **Create**.
6. Copy the following text into the file and click **Save**.

<File name='models/staging/stripe/src_stripe.yml'>

```yaml
version: 2
@@ -408,13 +418,16 @@ sources:
tables:
- name: payment
```
</File>

### Add staging models
[Staging models](/best-practices/how-we-structure/2-staging) are the first transformation step in dbt. They clean and prepare your raw data, making it ready for more complex transformations and analyses. Follow these steps to add your staging models to your project.

1. Create the file `models/staging/jaffle_shop/stg_customers.sql`. Or, you can use the **Generate model** button to create a new model file for each source.
1. In the `jaffle_shop` sub-directory, create the file `stg_customers.sql`. Or, you can use the **Generate model** button to create a new model file for each source.
2. Copy the following query into the file and click **Save**.

<File name='models/staging/jaffle_shop/stg_customers.sql'>

```sql
select
id as customer_id,
@@ -423,9 +436,13 @@ sources:
from {{ source('jaffle_shop', 'customers') }}
```

3. Create the file `models/staging/jaffle_shop/stg_orders.sql`
</File>

3. In the same `jaffle_shop` sub-directory, create the file `stg_orders.sql`
4. Copy the following query into the file and click **Save**.

<File name='models/staging/jaffle_shop/stg_orders.sql'>

```sql
select
id as order_id,
@@ -435,9 +452,13 @@ from {{ source('jaffle_shop', 'customers') }}
from {{ source('jaffle_shop', 'orders') }}
```

5. Create the file `models/staging/stripe/stg_payments.sql`.
</File>

5. In the `stripe` sub-directory, create the file `stg_payments.sql`.
6. Copy the following query into the file and click **Save**.

<File name='models/staging/stripe/stg_payments.sql'>

```sql
select
id as payment_id,
@@ -452,6 +473,8 @@ select
from {{ source('stripe', 'payment') }}
```

</File>

7. Enter `dbt run` in the command prompt at the bottom of the screen. You should get a successful run and see the three models.

### Add business-defined entities
@@ -463,6 +486,8 @@ This phase is the [marts layer](/best-practices/how-we-structure/1-guide-overvie
1. Create the file `models/marts/fct_orders.sql`.
2. Copy the following query into the file and click **Save**.
<File name='models/marts/fct_orders.sql'>
```sql
with orders as (
select * from {{ ref('stg_orders' )}}
@@ -504,9 +529,13 @@ select * from final
```
3. Create the file `models/marts/dim_customers.sql`.
</File>
3. In the `models/marts` directory, create the file `dim_customers.sql`.
4. Copy the following query into the file and click **Save**.
<File name='models/marts/dim_customers.sql'>
```sql
with customers as (
select * from {{ ref('stg_customers')}}
@@ -539,18 +568,26 @@ final as (
select * from final
```
5. Create the file `packages.yml` in your main directory
</File>
5. In your main directory, create the file `packages.yml`.
6. Copy the following text into the file and click **Save**.
<File name='packages.yml'>
```yaml
packages:
- package: dbt-labs/dbt_utils
version: 1.1.1
```
7. Create the file `models/metrics/metricflow_time_spine.sql` in your main directory.
</File>
7. In the `models` directory, create the file `metrics/metricflow_time_spine.sql`.
8. Copy the following query into the file and click **Save**.
<File name='models/metrics/metricflow_time_spine.sql'>
```sql
{{
config(
@@ -574,6 +611,8 @@ select * from final
```
</File>
9. Enter `dbt run` in the command prompt at the bottom of the screen. You should get a successful run message and also see in the run details that dbt has successfully built five models.
## Create semantic models
@@ -587,9 +626,11 @@ select * from final
In the following steps, semantic models enable you to define how to interpret the data related to orders. It includes entities (like ID columns serving as keys for joining data), dimensions (for grouping or filtering data), and measures (for data aggregations).
1. Create a new file `models/metrics/fct_orders.yml`
1. In the `metrics` sub-directory, create a new file `fct_orders.yml`.
2. Add the following code to that newly created file:
<File name='models/metrics/fct_orders.yml'>
```yaml
semantic_models:
- name: orders
@@ -600,6 +641,8 @@ semantic_models:
model: ref('fct_orders')
```
</File>
The following sections explain [dimensions](/docs/build/dimensions), [entities](/docs/build/entities), and [measures](/docs/build/measures) in more detail, showing how they each play a role in semantic models.
- [Entities](#entities) act as unique identifiers (like ID columns) that link data together from different tables.
@@ -612,6 +655,8 @@ The following sections explain [dimensions](/docs/build/dimensions), [entities](
Add entities to your `fct_orders.yml` semantic model file:
<File name='models/metrics/fct_orders.yml'>
```yaml
semantic_models:
- name: orders
@@ -628,12 +673,16 @@ semantic_models:
type: foreign
```
</File>
### Dimensions
[Dimensions](/docs/build/semantic-models#entities) are a way to group or filter information based on categories or time.
Add dimensions to your `fct_orders.yml` semantic model file:
<File name='models/metrics/fct_orders.yml'>
```yaml
semantic_models:
- name: orders
@@ -655,12 +704,16 @@ semantic_models:
time_granularity: day
```
</File>
### Measures
[Measures](/docs/build/semantic-models#measures) are aggregations performed on columns in your model. Often, you’ll find yourself using them as final metrics themselves. Measures can also serve as building blocks for more complicated metrics.
Add measures to your `fct_orders.yml` semantic model file:
<File name='models/metrics/fct_orders.yml'>
```yaml
semantic_models:
- name: orders
@@ -701,6 +754,8 @@ semantic_models:
use_approximate_percentile: False
```
</File>
## Define metrics
[Metrics](/docs/build/metrics-overview) are the language your business users speak and how they measure business performance. They are an aggregation over a column in your warehouse that you enrich with dimensional cuts.
@@ -717,6 +772,8 @@ Once you've created your semantic models, it's time to start referencing those m
Add metrics to your `fct_orders.yml` semantic model file:
<File name='models/metrics/fct_orders.yml'>
```yaml
semantic_models:
- name: orders
@@ -805,6 +862,8 @@ metrics:
- name: order_count
```
</File>
## Add second semantic model to your project
Great job, you've successfully built your first semantic model! It has all the required elements: entities, dimensions, measures, and metrics.
@@ -813,9 +872,11 @@ Let’s expand your project's analytical capabilities by adding another semantic
After setting up your orders model:
1. Create the file `models/metrics/dim_customers.yml`.
1. In the `metrics` sub-directory, create the file `dim_customers.yml`.
2. Copy the following query into the file and click **Save**.
<File name='models/metrics/dim_customers.yml'>
```yaml
semantic_models:
- name: customers
@@ -862,6 +923,9 @@ metrics:
measure: customers
```
</File>
This semantic model uses simple metrics to focus on customer metrics and emphasizes customer dimensions like name, type, and order dates. It uniquely analyzes customer behavior, lifetime value, and order patterns.
## Test and query metrics
2 changes: 1 addition & 1 deletion website/docs/guides/snowflake-qs.md
@@ -144,7 +144,7 @@ There are two ways to connect dbt Cloud to Snowflake. The first option is Partne

Using Partner Connect allows you to create a complete dbt account with your [Snowflake connection](/docs/cloud/connect-data-platform/connect-snowflake), [a managed repository](/docs/collaborate/git/managed-repository), [environments](/docs/build/custom-schemas#managing-environments), and credentials.

1. In the Snowflake UI, click on the home icon in the upper left corner. In the left sidebar, select **Admin**. Then, select **Partner Connect**. Find the dbt tile by scrolling or by searching for dbt in the search bar. Click the tile to connect to dbt.
1. In the Snowflake UI, click on the home icon in the upper left corner. In the left sidebar, select **Data Products**. Then, select **Partner Connect**. Find the dbt tile by scrolling or by searching for dbt in the search bar. Click the tile to connect to dbt.

<Lightbox src="/img/snowflake_tutorial/snowflake_partner_connect_box.png" title="Snowflake Partner Connect Box" />

4 changes: 0 additions & 4 deletions website/docs/reference/dbt_project.yml.md
@@ -42,8 +42,6 @@ The following example is a list of all available configurations in the `dbt_proj
[docs-paths](/reference/project-configs/docs-paths): [directorypath]
[asset-paths](/reference/project-configs/asset-paths): [directorypath]

[target-path](/reference/global-configs/json-artifacts): directorypath
[log-path](/reference/global-configs/logs): directorypath
[packages-install-path](/reference/project-configs/packages-install-path): directorypath

[clean-targets](/reference/project-configs/clean-targets): [directorypath]
@@ -123,8 +121,6 @@ vars:
[docs-paths](/reference/project-configs/docs-paths): [directorypath]
[asset-paths](/reference/project-configs/asset-paths): [directorypath]

[target-path](/reference/global-configs/json-artifacts): directorypath
[log-path](/reference/global-configs/logs): directorypath
[packages-install-path](/reference/project-configs/packages-install-path): directorypath

[clean-targets](/reference/project-configs/clean-targets): [directorypath]
4 changes: 2 additions & 2 deletions website/docs/reference/global-configs/about-global-configs.md
@@ -73,13 +73,13 @@ Because the values of `flags` can differ across invocations, we strongly advise
| [log_format](/reference/global-configs/logs#log-formatting) | enum | default (text) | ✅ | `DBT_LOG_FORMAT` | `--log-format` | ❌ |
| [log_level_file](/reference/global-configs/logs#log-level) | enum | debug | ✅ | `DBT_LOG_LEVEL_FILE` | `--log-level-file` | ❌ |
| [log_level](/reference/global-configs/logs#log-level) | enum | info | ✅ | `DBT_LOG_LEVEL` | `--log-level` | ❌ |
| [log_path](/reference/global-configs/logs) | path | None (uses `logs/`) | ❌ | `DBT_PROFILES_DIR` | `--profiles-dir` | ❌ |
| [log_path](/reference/global-configs/logs) | path | None (uses `logs/`) | ❌ | `DBT_LOG_PATH` | `--log-path` | ❌ |
| [partial_parse](/reference/global-configs/parsing#partial-parsing) | boolean | True | ✅ | `DBT_PARTIAL_PARSE` | `--partial-parse`, `--no-partial-parse` | ✅ |
| [populate_cache](/reference/global-configs/cache) | boolean | True | ✅ | `DBT_POPULATE_CACHE` | `--populate-cache`, `--no-populate-cache` | ✅ |
| [print](/reference/global-configs/print-output#suppress-print-messages-in-stdout) | boolean | True | ❌ | `DBT_PRINT` | `--print` | ❌ |
| [printer_width](/reference/global-configs/print-output#printer-width) | int | 80 | ✅ | `DBT_PRINTER_WIDTH` | `--printer-width` | ❌ |
| [profile](/docs/core/connect-data-platform/connection-profiles#about-profiles) | string | None | ✅ (as top-level key) | `DBT_PROFILE` | `--profile` | ❌ |
| [profiles_dir](/docs/core/connect-data-platform/connection-profiles#about-profiles) | path | None (current dir, then HOME dir) | ❌ | `DBT_PROFILES_DIR` | `--profiles-dir` | ❌ |
| [profiles_dir](/docs/core/connect-data-platform/connection-profiles#about-profiles) | path | None (current dir, then HOME dir) | ❌ | `DBT_PROFILES_DIR` | `--profiles-dir` | ❌ |
| [project_dir](/reference/dbt_project.yml) | path | | ❌ | `DBT_PROJECT_DIR` | `--project-dir` | ❌ |
| [quiet](/reference/global-configs/logs#suppress-non-error-logs-in-output) | boolean | False | ❌ | `DBT_QUIET` | `--quiet` | ✅ |
| [resource-type](/reference/global-configs/resource-type) (v1.8+) | string | None | ❌ | `DBT_RESOURCE_TYPES` <br></br> `DBT_EXCLUDE_RESOURCE_TYPES` | `--resource-type` <br></br> `--exclude-resource-type` | ✅ |
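As a sketch of how the columns in this table interact, a flag such as `log_path` can typically be set as an environment variable or as a CLI flag, with the CLI flag taking precedence when both are set (the directory name here is illustrative):

```shell
# two equivalent ways to redirect dbt's log directory;
# the CLI flag wins if both are provided
DBT_LOG_PATH=custom_logs dbt run
dbt run --log-path custom_logs
```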
20 changes: 11 additions & 9 deletions website/docs/reference/node-selection/graph-operators.md
@@ -3,13 +3,13 @@ title: "Graph operators"
---

### The "plus" operator
If placed at the front of the model selector, `+` will select all parents of the selected model and the model itself. If placed at the end of the string, `+` will select all children of the selected model and the model itself.
If placed at the front of the model selector, `+` will select all ancestors of the selected model and the model itself. If placed at the end of the string, `+` will select all descendants of the selected model and the model itself.


```bash
dbt run --select "my_model+" # select my_model and all children
dbt run --select "+my_model" # select my_model and all parents
dbt run --select "+my_model+" # select my_model, and all of its parents and children
dbt run --select "my_model+" # select my_model and all descendants
dbt run --select "+my_model" # select my_model and all ancestors
dbt run --select "+my_model+" # select my_model, and all of its ancestors and descendants
```


@@ -20,17 +20,19 @@ to step through.


```bash
dbt run --select "my_model+1" # select my_model and its first-degree children
dbt run --select "2+my_model" # select my_model, its first-degree parents, and its second-degree parents ("grandparents")
dbt run --select "3+my_model+4" # select my_model, its parents up to the 3rd degree, and its children down to the 4th degree
dbt run --select "my_model+1" # select my_model and its first-degree descendants
dbt run --select "2+my_model" # select my_model, its first-degree ancestors ("parents"), and its second-degree ancestors ("grandparents")
dbt run --select "3+my_model+4" # select my_model, its ancestors up to the 3rd degree, and its descendants down to the 4th degree
```


### The "at" operator
The `@` operator is similar to `+`, but will also include _the parents of the children of the selected model_. This is useful in continuous integration environments where you want to build a model and all of its children, but the _parents_ of those children might not exist in the database yet. The selector `@snowplow_web_page_context` will build all three models shown in the diagram below.
The `@` operator is similar to `+`, but will also include _all ancestors of all descendants of the selected model_. This is useful in continuous integration environments where you want to build a model and all of its descendants, but the _ancestors_ of those descendants might not exist in the schema yet. The `@` operator (which can only be placed at the front of the model name) will select as many degrees of ancestors ("parents," "grandparents," and so on) as is needed to successfully build all descendants of the specified model.

The selector `@snowplow_web_page_context` will build all three models shown in the diagram below.

<Lightbox src="/img/docs/running-a-dbt-project/command-line-interface/1643e30-Screen_Shot_2019-03-11_at_7.18.20_PM.png" title="@snowplow_web_page_context will select all of the models shown here"/>

```bash
dbt run --models @my_model # select my_model, its children, and the parents of its children
dbt run --select "@my_model" # select my_model, its descendants, and the ancestors of its descendants
```
84 changes: 80 additions & 4 deletions website/docs/reference/resource-configs/materialized.md
@@ -1,7 +1,83 @@
:::danger
---
resource_types: [models]
description: "Materialized - Read this in-depth guide to learn about materializations in dbt."
datatype: "string"
---

These docs are a placeholder for a yet-to-be-written reference section.
<Tabs
groupId="config-languages"
defaultValue="project-yaml"
values={[
{ label: 'Project file', value: 'project-yaml', },
{ label: 'Property file', value: 'property-yaml', },
{ label: 'Config block', value: 'config', },
]
}>

Please refer to the [guide on materializations](/docs/build/materializations) for current documentation.

:::
<TabItem value="project-yaml">

<File name='dbt_project.yml'>

```yaml
[config-version](/reference/project-configs/config-version): 2

models:
[<resource-path>](/reference/resource-configs/resource-path):
+materialized: [<materialization_name>](https://docs.getdbt.com/docs/build/materializations#materializations)
```
</File>
</TabItem>
<TabItem value="property-yaml">
<File name='models/properties.yml'>
```yaml
version: 2

models:
- name: <model_name>
config:
materialized: [<materialization_name>](https://docs.getdbt.com/docs/build/materializations#materializations)

```

</File>

</TabItem>


<TabItem value="config">

<File name='models/<model_name>.sql'>

```jinja
{{ config(
materialized="[<materialization_name>](https://docs.getdbt.com/docs/build/materializations#materializations)"
) }}
select ...
```

</File>

</TabItem>

</Tabs>

## Definition

[Materializations](/docs/build/materializations#materializations) are strategies for persisting dbt models in a warehouse. These are the materialization types built into dbt:

- `ephemeral` &mdash; [ephemeral](/docs/build/materializations#ephemeral) models are not directly built into the database
- `table` &mdash; a model is rebuilt as a [table](/docs/build/materializations#table) on each run
- `view` &mdash; a model is rebuilt as a [view](/docs/build/materializations#view) on each run
- `materialized_view` &mdash; allows the creation and maintenance of [materialized views](/docs/build/materializations#materialized-view) in the target database
- `incremental` &mdash; [incremental](/docs/build/materializations#incremental) models allow dbt to insert or update records into a table since the last time that model was run

You can also configure [custom materializations](/guides/create-new-materializations?step=1) in dbt. Custom materializations are a powerful way to extend dbt's functionality to meet your specific needs.
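As a concrete sketch of the `<resource-path>` form shown in the tabs above, materializations are often set per folder in `dbt_project.yml` (the project and folder names here are illustrative):

```yaml
# dbt_project.yml — folder-level materializations; names are illustrative
models:
  jaffle_shop:
    staging:
      +materialized: view
    marts:
      +materialized: table
```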

4 changes: 2 additions & 2 deletions website/docs/terms/cte.md
@@ -175,14 +175,14 @@ CTEs are likely to be supported across most, if not all, [modern data warehouses

## Conclusion

CTEs are essentially temporary views that can be used throughout a query. They are a great way to give your SQL more structure and readability, and offer simplified ways to debug your code. You can leverage appropriately-named CTEs to easily identify upstream dependencies and code functionality. CTEs also support recursiveness and reusability in the same query. Overall, CTEs can be an effective way to level-up your SQL to be more organized and understandable.
CTEs are essentially temporary views that can be used throughout a query. They are a great way to give your SQL more structure and readability, and offer simplified ways to debug your code. You can leverage appropriately named CTEs to easily identify upstream dependencies and code functionality. CTEs also support recursiveness and reusability in the same query. Overall, CTEs can be an effective way to level-up your SQL to be more organized and understandable.
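As a brief illustration of the "temporary view" framing, a CTE names an intermediate result set that the rest of the query can reference (table and column names are illustrative):

```sql
-- paid_orders acts as a named, query-scoped "temporary view"
with paid_orders as (
    select order_id, customer_id, amount
    from orders
    where status = 'paid'
)

select
    customer_id,
    sum(amount) as lifetime_value
from paid_orders
group by customer_id
```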

## Further Reading

If you’re interested in reading more about CTE best practices, check out some of our favorite content around model refactoring and style:

- [Refactoring Legacy SQL to dbt](/guides/refactoring-legacy-sql?step=5#implement-cte-groupings)
- [dbt Labs Style Guide](https://github.com/dbt-labs/corp/blob/main/dbt_style_guide.md#ctes)
- [dbt Labs Style Guide](https://docs.getdbt.com/best-practices/how-we-style/0-how-we-style-our-dbt-projects)
- [Modular Data Modeling Technique](https://www.getdbt.com/analytics-engineering/modular-data-modeling-technique/)

Want to know why dbt Labs loves CTEs? Check out the following pieces:
4 changes: 2 additions & 2 deletions website/docusaurus.config.js
@@ -71,14 +71,14 @@ var siteSettings = {
},
announcementBar: {
id: "biweekly-demos",
content: "Join our bi-weekly demos and see dbt Cloud in action!",
content: "Join us on May 14th for the dbt Cloud Launch Showcase event. Discover the latest innovations and watch live demos! ",
backgroundColor: "#047377",
textColor: "#fff",
isCloseable: true,
},
announcementBarActive: true,
announcementBarLink:
"https://www.getdbt.com/resources/dbt-cloud-demos-with-experts?utm_source=docs&utm_medium=event&utm_campaign=q1-2024_cloud-demos-with-experts_awareness",
"https://www.getdbt.com/resources/webinars/dbt-cloud-launch-showcase",
// Set community spotlight member on homepage
// This is the ID for a specific file under docs/community/spotlight
communitySpotlightMember: "alison-stanton",
10 changes: 5 additions & 5 deletions website/snippets/_sl-test-and-query-metrics.md
@@ -1,16 +1,16 @@
To work with metrics in dbt, you have several tools to validate or run commands. Here's how you can test and query metrics depending on your setup:

- [**dbt Cloud IDE users**](#dbt-cloud-ide-users) &mdash; Currently, running MetricFlow commands directly in the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) isn't supported, but is coming soon. You can still validate metrics using the **Preview** or **Compile** options, or visually through the DAG for semantic checks. This ensures your metrics are correctly defined without directly running commands.
- [**dbt Cloud CLI users**](#dbt-cloud-cli-users) &mdash; The [dbt Cloud CLI](/docs/cloud/cloud-cli-installation) enables you to run [MetricFlow commands](/docs/build/metricflow-commands#metricflow-commands) for direct interaction with metrics.
- **dbt Core users** &mdash; Use the MetricFlow CLI for command execution. While this guide focuses on dbt Cloud users, dbt Core users can find detailed MetricFlow CLI setup instructions in the [MetricFlow commands](/docs/build/metricflow-commands#metricflow-commands) page. Note that to use the dbt Semantic Layer, you need to have a Team or Enterprise account.
- [**dbt Cloud IDE users**](#dbt-cloud-ide-users) &mdash; Currently, running MetricFlow commands directly in the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) isn't supported, but is coming soon. You can view metrics visually through the DAG in the **Lineage** tab without directly running commands.
- [**dbt Cloud CLI users**](#dbt-cloud-cli-users) &mdash; The [dbt Cloud CLI](/docs/cloud/cloud-cli-installation) enables you to run [MetricFlow commands](/docs/build/metricflow-commands#metricflow-commands) to query and preview metrics directly in your command line interface.
- **dbt Core users** &mdash; Use the MetricFlow CLI for command execution. While this guide focuses on dbt Cloud users, dbt Core users can find detailed MetricFlow CLI setup instructions in the [MetricFlow commands](/docs/build/metricflow-commands#metricflow-commands) page. Note that to use the dbt Semantic Layer, you need to have a [Team or Enterprise account](https://www.getdbt.com/).

Alternatively, you can run commands with SQL client tools like DataGrip, DBeaver, or RazorSQL.

### dbt Cloud IDE users

You can validate your metrics in the dbt Cloud IDE by selecting the metric you want to validate and viewing it in the **Lineage** tab.
You can view your metrics in the dbt Cloud IDE by viewing them in the **Lineage** tab. The dbt Cloud IDE **Status button** (located in the bottom right of the editor) displays an **Error** status if there's an error in your metric or semantic model definition. You can click the button to see the specific issue and resolve it.

Once validated, make sure you commit and merge your changes in your project.
Once you've viewed your metrics, make sure you commit and merge your changes in your project.

<Lightbox src="/img/docs/dbt-cloud/semantic-layer/sl-ide-dag.jpg" title="Validate your metrics using the Lineage tab in the IDE." />
