# Saved queries
Saved queries are a way to save commonly used queries in MetricFlow. You can group metrics, dimensions, and filters that are logically related into a saved query. Saved queries are nodes and visible in the dbt DAG (a Directed Acyclic Graph, a type of graph whose nodes are directionally related to each other and don't form a directional closed loop).
Saved queries serve as the foundational building block, allowing you to configure exports in your saved query configuration. Exports take this functionality a step further by enabling you to schedule and write saved queries directly within your data platform using dbt Cloud's job scheduler.
## Parameters
To create a saved query, refer to the parameters in the following table.
Note that we use the double colon (`::`) to indicate whether a parameter is nested within another parameter. For example, `query_params::metrics` means the `metrics` parameter is nested under `query_params`.
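For example, `query_params::metrics` corresponds to the following nesting in YAML (a minimal sketch; the saved query name is illustrative):

```yaml
saved_queries:
  - name: example_saved_query   # illustrative name
    query_params:               # parent parameter
      metrics:                  # query_params::metrics
        - simple_metric
```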
Parameter | Type | Required | Description |
---|---|---|---|
`name` | String | Required | Name of the saved query object. |
`description` | String | Required | A description of the saved query. |
`label` | String | Required | The display name for your saved query. This value will be shown in downstream tools. |
`config` | String | Optional | Use the `config` property to specify configurations for your saved query. Supports `cache`, `enabled`, `export_as`, `group`, `meta`, and `schema` configurations. |
`config::cache::enabled` | Object | Optional | An object with a sub-key used to specify if a saved query should populate the cache. Accepts sub-key `true` or `false`. Defaults to `false`. |
`query_params` | Structure | Required | Contains the query parameters. |
`query_params::metrics` | List or String | Optional | A list of the metrics to be used in the query as specified in the command line interface. |
`query_params::group_by` | List or String | Optional | A list of the entities and dimensions to be used in the query, specified with the `Dimension` or `TimeDimension` objects. |
`query_params::where` | List or String | Optional | A list of strings that may include the `Dimension` or `TimeDimension` objects. |
`exports` | List or Structure | Optional | A list of exports to be specified within the `exports` structure. |
`exports::name` | String | Required | Name of the export object. |
`exports::config` | List or Structure | Required | A `config` section for any parameters specifying the export. |
`exports::config::export_as` | String | Required | The type of export to run. Options currently include `table` or `view`, with `cache` coming in the near future. |
`exports::config::schema` | String | Optional | The schema for creating the table or view. This option cannot be used for caching. |
`exports::config::alias` | String | Optional | The table alias used to write to the table or view. This option cannot be used for caching. |
If you use multiple metrics in a saved query, you can only reference the dimensions these metrics have in common in the `group_by` or `where` clauses. Use the entity name prefix with the `Dimension` object, like `Dimension('user__ds')`.
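For example, in the following sketch (hypothetical metric names), both metrics share the `user` entity, so the query can only group by dimensions prefixed with that entity:

```yaml
saved_queries:
  - name: shared_dimensions_example   # hypothetical name
    query_params:
      metrics:
        - metric_a   # hypothetical metrics that both share the user entity
        - metric_b
      group_by:
        - "Dimension('user__ds')"   # a dimension common to both metrics, prefixed with the entity name
```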
## Configure saved query
Use saved queries to define and manage common Semantic Layer queries in YAML, including metrics and dimensions. Saved queries enable you to organize and reuse common MetricFlow queries within dbt projects. For example, you can group related metrics together and include commonly used dimensions and filters.
In your saved query config, you can also leverage caching with the dbt Cloud job scheduler to cache common queries, speed up performance, and reduce compute costs.
In the following example, you can set the saved query in the `semantic_model.yml` file:
```yaml
saved_queries:
  - name: test_saved_query
    description: "{{ doc('saved_query_description') }}"
    label: Test saved query
    config:
      cache:
        enabled: true # Or false if you want it disabled by default
    query_params:
      metrics:
        - simple_metric
      group_by:
        - "Dimension('user__ds')"
      where:
        - "{{ Dimension('user__ds', 'DAY') }} <= now()"
        - "{{ Dimension('user__ds', 'DAY') }} >= '2023-01-01'"
    exports:
      - name: my_export
        config:
          alias: my_export_alias
          export_as: table
          schema: my_export_schema_name
```
Note that you can set `export_as` in both the saved query config and the exports config, with the exports config value taking precedence. If a key isn't set in the exports config, it inherits the saved query config value.
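As a sketch of that precedence (export, alias, and schema names are illustrative), the first export below inherits `export_as` and `schema` from the saved query config, while the second overrides `export_as`:

```yaml
saved_queries:
  - name: test_saved_query
    config:
      export_as: view                 # default for every export in this saved query
      schema: my_export_schema_name
    query_params:
      metrics:
        - simple_metric
    exports:
      - name: my_view_export          # illustrative; inherits export_as: view and the schema above
        config:
          alias: my_export_alias
      - name: my_table_export         # illustrative; overrides the saved query-level export_as
        config:
          export_as: table
```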
## Where clause
Use the following syntax to reference entities, dimensions, time dimensions, or metrics in filters. Refer to Metrics as dimensions for details on how to use metrics as dimensions with metric filters:
```yaml
filter: |
  {{ Entity('entity_name') }}

filter: |
  {{ Dimension('primary_entity__dimension_name') }}

filter: |
  {{ TimeDimension('time_dimension', 'granularity') }}

filter: |
  {{ Metric('metric_name', group_by=['entity_name']) }}
```
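For example, here's a sketch of those templates inside a saved query's `where` list (the metric, entity, and dimension names reuse examples from this page and are illustrative):

```yaml
saved_queries:
  - name: filtered_orders   # illustrative name
    query_params:
      metrics:
        - order_total
      group_by:
        - "Entity('order_id')"
      where:
        - "{{ TimeDimension('metric_time', 'day') }} >= '2023-01-01'"
        - "{{ Dimension('customer__customer_name') }} IS NOT NULL"
        - "{{ Metric('order_total', group_by=['order_id']) }} > 100"   # metric used as a dimension in a filter
```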
## Project-level saved queries
To enable saved queries at the project level, you can set the `saved-queries` configuration in the `dbt_project.yml` file. This saves you time by avoiding the need to configure saved queries in each file:
```yaml
saved-queries:
  my_saved_query:
    +cache:
      enabled: true
```
For more information on `dbt_project.yml` and config naming conventions, see the dbt_project.yml reference page.
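If you want to set other configs at the project level, a hedged sketch (assuming the other configs listed in the parameters table accept the same `+` prefix) might look like this:

```yaml
saved-queries:
  my_saved_query:
    +cache:
      enabled: true
    +export_as: table                # assumption: project-level equivalent of config::export_as
    +schema: my_export_schema_name   # assumption: project-level equivalent of config::schema
```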
To build `saved_queries`, use the `--resource-type` flag and run the command `dbt build --resource-type saved_query`.
## Configure exports
Exports are an additional configuration added to a saved query. They define how to write a saved query, along with the schema and table name.
Once you've configured your saved query as the foundational building block, you can configure exports in the `saved_queries` YAML configuration file (the same file as your metric definitions). This also allows you to run exports automatically within your data platform using dbt Cloud's job scheduler.
The following is an example of a saved query with an export:
```yaml
saved_queries:
  - name: order_metrics
    description: Relevant order metrics
    query_params:
      metrics:
        - orders
        - large_order
        - food_orders
        - order_total
      group_by:
        - Entity('order_id')
        - TimeDimension('metric_time', 'day')
        - Dimension('customer__customer_name')
        - ... # Additional group_by
      where:
        - "{{TimeDimension('metric_time')}} > current_timestamp - interval '1 week'"
        - ... # Additional where clauses
    exports:
      - name: order_metrics
        config:
          export_as: table # Options available: table, view
          schema: YOUR_SCHEMA # Optional - defaults to deployment schema
          alias: SOME_TABLE_NAME # Optional - defaults to Export name
```
## Run exports
Once you've configured exports, you can take things a step further by running exports to automatically write saved queries within your data platform using dbt Cloud's job scheduler. This feature is only available with the dbt Cloud Semantic Layer.
For more information on how to run exports, refer to the Exports documentation.
## Related docs
- Validate semantic nodes in a CI job
- Configure caching