Last updated: July 22, 2025
Date values in future percent data quality checks, SQL examples
Detects dates in the future in date, datetime, and timestamp columns. Measures the percentage of dates in the future. Raises a data quality issue when too many future dates are found.
The date values in future percent data quality check has the following variants for each type of data quality check supported by DQOps.
profile date values in future percent
Check description
Detects dates in the future in date, datetime, and timestamp columns. Measures the percentage of dates in the future. Raises a data quality issue when too many future dates are found.
| Data quality check name | Friendly name | Category | Check type | Time scale | Quality dimension | Sensor definition | Quality rule | Standard |
|---|---|---|---|---|---|---|---|---|
| profile_date_values_in_future_percent | Maximum percentage of rows containing dates in future | datetime | profiling | | Validity | date_values_in_future_percent | max_percent | |
Command-line examples
Please expand the section below to see the DQOps command-line examples to run or activate the profile date values in future percent data quality check.
Managing profile date values in future percent check from DQOps shell
Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. The following command activates the warning rule with its default parameters.
dqo> check activate -c=connection_name -t=schema_name.table_name -col=column_name -ch=profile_date_values_in_future_percent --enable-warning
You can also use patterns to activate the check on all matching tables and columns.
dqo> check activate -c=connection_name -t=schema_prefix*.fact_* -col=column_name -ch=profile_date_values_in_future_percent --enable-warning
Additional rule parameters for the warning rule are passed using the -Wrule_parameter_name=value syntax.
Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. The following command activates the error rule with its default parameters.
dqo> check activate -c=connection_name -t=schema_name.table_name -col=column_name -ch=profile_date_values_in_future_percent --enable-error
You can also use patterns to activate the check on all matching tables and columns.
dqo> check activate -c=connection_name -t=schema_prefix*.fact_* -col=column_name -ch=profile_date_values_in_future_percent --enable-error
Additional rule parameters for the error rule are passed using the -Erule_parameter_name=value syntax.
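For example, the sketch below raises a warning above 2% and an error above 5% of future dates. The thresholds are illustrative; max_percent is the rule parameter used by this check, as shown in the YAML configuration further down.

dqo> check activate -c=connection_name -t=schema_name.table_name -col=column_name -ch=profile_date_values_in_future_percent --enable-warning -Wmax_percent=2.0

dqo> check activate -c=connection_name -t=schema_name.table_name -col=column_name -ch=profile_date_values_in_future_percent --enable-error -Emax_percent=5.0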
Run this data quality check using the check run CLI command by providing the check name and all other targeting filters. The following example shows how to run the profile_date_values_in_future_percent check on all tables and columns on a single data source.
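A likely form of that command, reusing the -c and -ch flags shown in the activation examples above (connection_name is a placeholder), is:

dqo> check run -c=connection_name -ch=profile_date_values_in_future_percent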
It is also possible to run this check on a specific connection and table. In order to do this, use the connection name and the full table name parameters.
dqo> check run -c=connection_name -t=schema_name.table_name -ch=profile_date_values_in_future_percent
You can also run this check on all tables (and columns) on which the profile_date_values_in_future_percent check is enabled using patterns to find tables.
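An illustrative command using the same table search pattern as the activation examples above:

dqo> check run -c=connection_name -t=schema_prefix*.fact_* -col=column_name -ch=profile_date_values_in_future_percent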
YAML configuration
The sample schema_name.table_name.dqotable.yaml file with the check configured is shown below.
# yaml-language-server: $schema=https://cloud.dqops.com/dqo-yaml-schema/TableYaml-schema.json
apiVersion: dqo/v1
kind: table
spec:
  columns:
    target_column:
      profiling_checks:
        datetime:
          profile_date_values_in_future_percent:
            parameters:
              max_future_days: 0.0
            warning:
              max_percent: 0.0
            error:
              max_percent: 1.0
            fatal:
              max_percent: 5.0
      labels:
      - This is the column that is analyzed for data quality issues
Samples of generated SQL queries for each data source type
Please expand the database engine name section to see the SQL query rendered by a Jinja2 template for the date_values_in_future_percent data quality sensor.
BigQuery
{% import '/dialects/bigquery.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS INT64) SECOND)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATE(), INTERVAL CAST({{(parameters.max_future_days)}} AS INT64) DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATETIME_ADD(CURRENT_DATETIME(), INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS INT64) SECOND)
{% else -%}
SAFE_CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS INT64) SECOND)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN SAFE_CAST(analyzed_table.`target_column` AS TIMESTAMP) > TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL CAST(0.0 * 86400 AS INT64) SECOND)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value
FROM `your-google-project-id`.`<target_schema>`.`<target_table>` AS analyzed_table
ClickHouse
{% import '/dialects/clickhouse.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > toDateTime64(now(), 3) + INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS Int64) SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > toDate(now()) + INTERVAL CAST({{(parameters.max_future_days)}} AS Int64) DAY
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > toDateTime(now()) + INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS Int64) SECOND
{% else -%}
toDateTime64({{ lib.render_target_column('analyzed_table') }}, 3) > toDateTime64(now(), 3) + INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS Int64) SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN toDateTime64(analyzed_table."target_column", 3) > toDateTime64(now(), 3) + INTERVAL CAST(0.0 * 86400 AS Int64) SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value
FROM "<target_schema>"."<target_table>" AS analyzed_table
Databricks
{% import '/dialects/databricks.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > (CURRENT_DATE() + INTERVAL {{((parameters.max_future_days) * 1) | int}} DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL 0 SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value
FROM `<target_schema>`.`<target_table>` AS analyzed_table
DB2
{% import '/dialects/db2.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INT))
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_DAYS(CURRENT_DATE, CAST({{(parameters.max_future_days)}} AS INT))
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INT))
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INT))
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN CAST(analyzed_table."target_column" AS TIMESTAMP) > ADD_SECONDS(CURRENT_TIMESTAMP, CAST(0.0 * 86400 AS INT))
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value
FROM (
SELECT
original_table.*
FROM "<target_schema>"."<target_table>" original_table
) analyzed_table
DuckDB
{% import '/dialects/duckdb.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + INTERVAL ({{(parameters.max_future_days)}} * 1) DAY
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > CURRENT_TIMESTAMP + INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
HANA
{% import '/dialects/hana.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER))
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_DAYS(CURRENT_DATE, CAST({{(parameters.max_future_days)}} AS INTEGER))
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER))
{% else -%}
TO_TIMESTAMP({{ lib.render_target_column('analyzed_table') }}) > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER))
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
TO_TIMESTAMP(analyzed_table."target_column") > ADD_SECONDS(CURRENT_TIMESTAMP, CAST(0.0 * 86400 AS INTEGER))
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value
FROM (
SELECT
original_table.*
FROM "<target_schema>"."<target_table>" original_table
) analyzed_table
MariaDB
{% import '/dialects/mariadb.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATE(), INTERVAL ({{(parameters.max_future_days)}}) DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATETIME(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL (0.0 * 86400) SECOND)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value
FROM `<target_table>` AS analyzed_table
MySQL
{% import '/dialects/mysql.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATE(), INTERVAL ({{(parameters.max_future_days)}}) DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATETIME(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL (0.0 * 86400) SECOND)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value
FROM `<target_table>` AS analyzed_table
Oracle
{% import '/dialects/oracle.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + numToDSInterval( CAST( ({{(parameters.max_future_days)}} * 86400) AS INTEGER), 'second' )
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + numToDSInterval( CAST( ({{(parameters.max_future_days)}}) AS INTEGER), 'day' )
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME + numToDSInterval( CAST( ({{(parameters.max_future_days)}} * 86400) AS INTEGER), 'second' )
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP + numToDSInterval( CAST( ({{(parameters.max_future_days)}} * 86400) AS INTEGER), 'second' )
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table."target_column" AS TIMESTAMP) > CURRENT_TIMESTAMP + numToDSInterval( CAST( (0.0 * 86400) AS INTEGER), 'second' )
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value
FROM (
SELECT
original_table.*
FROM "<target_schema>"."<target_table>" original_table
) analyzed_table
PostgreSQL
{% import '/dialects/postgresql.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + make_interval(days => ({{(parameters.max_future_days)}})::int)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => (0.0 * 86400)::int)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value
FROM "your_postgresql_database"."<target_schema>"."<target_table>" AS analyzed_table
Presto
{% import '/dialects/presto.sql.jinja2' as lib with context -%}
{% macro render_value_in_future() -%}
{%- endmacro -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('DAY', CAST({{(parameters.max_future_days)}} AS INTEGER), CURRENT_DATE)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_DATETIME)
{% else -%}
TRY_CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% endif -%}
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
TRY_CAST(analyzed_table."target_column" AS TIMESTAMP) > DATE_ADD('SECOND', CAST(0.0 * 86400 AS INTEGER), CURRENT_TIMESTAMP)
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT(analyzed_table."target_column")
END AS actual_value
FROM (
SELECT
original_table.*
FROM "your_trino_database"."<target_schema>"."<target_table>" original_table
) analyzed_table
QuestDB
{% import '/dialects/questdb.sql.jinja2' as lib with context -%}
SELECT
COALESCE(100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD('s', ({{(parameters.max_future_days)}})::int * 86400, NOW())
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD('d', ({{(parameters.max_future_days)}})::int, TODAY())
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD('s', ({{(parameters.max_future_days)}})::int * 86400, NOW())
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > DATEADD('s', ({{(parameters.max_future_days)}})::int * 86400, NOW())
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }}), 0.0)
AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM(
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
Redshift
{% import '/dialects/redshift.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + make_interval(days => ({{(parameters.max_future_days)}})::int)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => (0.0 * 86400)::int)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value
FROM "your_redshift_database"."<target_schema>"."<target_table>" AS analyzed_table
Snowflake
{% import '/dialects/snowflake.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > TIMESTAMPADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(DAY, CAST({{(parameters.max_future_days)}} AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% else -%}
TRY_TO_TIMESTAMP({{ lib.render_target_column('analyzed_table') }}) > TIMESTAMPADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
TRY_TO_TIMESTAMP(analyzed_table."target_column") > TIMESTAMPADD(SECOND, CAST(0.0 * 86400 AS INTEGER), CURRENT_TIMESTAMP)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value
FROM "your_snowflake_database"."<target_schema>"."<target_table>" AS analyzed_table
Spark
{% import '/dialects/spark.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > (CURRENT_DATE() + INTERVAL {{((parameters.max_future_days) * 1) | int}} DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL 0 SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value
FROM `<target_schema>`.`<target_table>` AS analyzed_table
SQL Server
{% import '/dialects/sqlserver.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT_BIG({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INT), SYSDATETIME())
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(DAY, CAST({{(parameters.max_future_days)}} AS INT), GETDATE())
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INT), GETDATE())
{% else -%}
TRY_CAST({{ lib.render_target_column('analyzed_table') }} AS DATETIME) > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INT), SYSDATETIME())
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT_BIG({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT_BIG(analyzed_table.[target_column]) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
TRY_CAST(analyzed_table.[target_column] AS DATETIME) > DATEADD(SECOND, CAST(0.0 * 86400 AS INT), SYSDATETIME())
THEN 1
ELSE 0
END
) / COUNT_BIG(analyzed_table.[target_column])
END AS actual_value
FROM [your_sql_server_database].[<target_schema>].[<target_table>] AS analyzed_table
Teradata
{% import '/dialects/teradata.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > (CURRENT_DATE + INTERVAL {{((parameters.max_future_days) * 1) | int}} DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table."target_column" AS TIMESTAMP) > CURRENT_TIMESTAMP + INTERVAL 0 SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value
FROM "<target_schema>"."<target_table>" AS analyzed_table
Trino
{% import '/dialects/trino.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('DAY', CAST({{(parameters.max_future_days)}} AS INTEGER), CURRENT_DATE)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_DATETIME)
{% else -%}
TRY_CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% endif -%}
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
TRY_CAST(analyzed_table."target_column" AS TIMESTAMP) > DATE_ADD('SECOND', CAST(0.0 * 86400 AS INTEGER), CURRENT_TIMESTAMP)
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT(analyzed_table."target_column")
END AS actual_value
FROM (
SELECT
original_table.*
FROM "your_trino_catalog"."<target_schema>"."<target_table>" original_table
) analyzed_table
Expand the Configure with data grouping section to see additional examples for configuring this data quality check to use data grouping (GROUP BY).
Configuration with data grouping
Sample configuration with data grouping enabled (YAML). The sample below shows how to configure data grouping and how it affects the generated SQL query.
# yaml-language-server: $schema=https://cloud.dqops.com/dqo-yaml-schema/TableYaml-schema.json
apiVersion: dqo/v1
kind: table
spec:
  default_grouping_name: group_by_country_and_state
  groupings:
    group_by_country_and_state:
      level_1:
        source: column_value
        column: country
      level_2:
        source: column_value
        column: state
  columns:
    target_column:
      profiling_checks:
        datetime:
          profile_date_values_in_future_percent:
            parameters:
              max_future_days: 0.0
            warning:
              max_percent: 0.0
            error:
              max_percent: 1.0
            fatal:
              max_percent: 5.0
      labels:
      - This is the column that is analyzed for data quality issues
    country:
      labels:
      - column used as the first grouping key
    state:
      labels:
      - column used as the second grouping key
Please expand the database engine name section to see the SQL query rendered by a Jinja2 template for the date_values_in_future_percent sensor.
BigQuery
{% import '/dialects/bigquery.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS INT64) SECOND)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATE(), INTERVAL CAST({{(parameters.max_future_days)}} AS INT64) DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATETIME_ADD(CURRENT_DATETIME(), INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS INT64) SECOND)
{% else -%}
SAFE_CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS INT64) SECOND)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN SAFE_CAST(analyzed_table.`target_column` AS TIMESTAMP) > TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL CAST(0.0 * 86400 AS INT64) SECOND)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
analyzed_table.`country` AS grouping_level_1,
analyzed_table.`state` AS grouping_level_2
FROM `your-google-project-id`.`<target_schema>`.`<target_table>` AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
ClickHouse
{% import '/dialects/clickhouse.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > toDateTime64(now(), 3) + INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS Int64) SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > toDate(now()) + INTERVAL CAST({{(parameters.max_future_days)}} AS Int64) DAY
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > toDateTime(now()) + INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS Int64) SECOND
{% else -%}
toDateTime64({{ lib.render_target_column('analyzed_table') }}, 3) > toDateTime64(now(), 3) + INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS Int64) SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN toDateTime64(analyzed_table."target_column", 3) > toDateTime64(now(), 3) + INTERVAL CAST(0.0 * 86400 AS Int64) SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table."country" AS grouping_level_1,
analyzed_table."state" AS grouping_level_2
FROM "<target_schema>"."<target_table>" AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
Databricks
{% import '/dialects/databricks.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > (CURRENT_DATE() + INTERVAL {{((parameters.max_future_days) * 1) | int}} DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL 0 SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
analyzed_table.`country` AS grouping_level_1,
analyzed_table.`state` AS grouping_level_2
FROM `<target_schema>`.`<target_table>` AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
DB2
{% import '/dialects/db2.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INT))
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_DAYS(CURRENT_DATE, CAST({{(parameters.max_future_days)}} AS INT))
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INT))
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INT))
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN CAST(analyzed_table."target_column" AS TIMESTAMP) > ADD_SECONDS(CURRENT_TIMESTAMP, CAST(0.0 * 86400 AS INT))
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table.grouping_level_1,
analyzed_table.grouping_level_2
FROM (
SELECT
original_table.*,
original_table."country" AS grouping_level_1,
original_table."state" AS grouping_level_2
FROM "<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
DuckDB
{% import '/dialects/duckdb.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + INTERVAL ({{(parameters.max_future_days)}} * 1) DAY
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > CURRENT_TIMESTAMP + INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > CURRENT_TIMESTAMP + INTERVAL (0.0 * 86400) SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table."country" AS grouping_level_1,
analyzed_table."state" AS grouping_level_2
FROM AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
HANA
{% import '/dialects/hana.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER))
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_DAYS(CURRENT_DATE, CAST({{(parameters.max_future_days)}} AS INTEGER))
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER))
{% else -%}
TO_TIMESTAMP({{ lib.render_target_column('analyzed_table') }}) > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER))
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
TO_TIMESTAMP(analyzed_table."target_column") > ADD_SECONDS(CURRENT_TIMESTAMP, CAST(0.0 * 86400 AS INTEGER))
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table.grouping_level_1,
analyzed_table.grouping_level_2
FROM (
SELECT
original_table.*,
original_table."country" AS grouping_level_1,
original_table."state" AS grouping_level_2
FROM "<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
MariaDB
{% import '/dialects/mariadb.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATE(), INTERVAL ({{(parameters.max_future_days)}}) DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATETIME(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL (0.0 * 86400) SECOND)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
analyzed_table.`country` AS grouping_level_1,
analyzed_table.`state` AS grouping_level_2
FROM `<target_table>` AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
MySQL
{% import '/dialects/mysql.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATE(), INTERVAL ({{(parameters.max_future_days)}}) DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATETIME(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL (0.0 * 86400) SECOND)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
analyzed_table.`country` AS grouping_level_1,
analyzed_table.`state` AS grouping_level_2
FROM `<target_table>` AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
Oracle
{% import '/dialects/oracle.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + numToDSInterval( CAST( ({{(parameters.max_future_days)}} * 86400) AS INTEGER), 'second' )
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + numToDSInterval( CAST( ({{(parameters.max_future_days)}}) AS INTEGER), 'day' )
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME + numToDSInterval( CAST( ({{(parameters.max_future_days)}} * 86400) AS INTEGER), 'second' )
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP + numToDSInterval( CAST( ({{(parameters.max_future_days)}} * 86400) AS INTEGER), 'second' )
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table."target_column" AS TIMESTAMP) > CURRENT_TIMESTAMP + numToDSInterval( CAST( (0.0 * 86400) AS INTEGER), 'second' )
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table.grouping_level_1,
analyzed_table.grouping_level_2
FROM (
SELECT
original_table.*,
original_table."country" AS grouping_level_1,
original_table."state" AS grouping_level_2
FROM "<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
PostgreSQL
{% import '/dialects/postgresql.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + make_interval(days => ({{(parameters.max_future_days)}})::int)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => (0.0 * 86400)::int)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table."country" AS grouping_level_1,
analyzed_table."state" AS grouping_level_2
FROM "your_postgresql_database"."<target_schema>"."<target_table>" AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
Presto
{% import '/dialects/presto.sql.jinja2' as lib with context -%}
{% macro render_value_in_future() -%}
{%- endmacro -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('DAY', CAST({{(parameters.max_future_days)}} AS INTEGER), CURRENT_DATE)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_DATETIME)
{% else -%}
TRY_CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% endif -%}
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
TRY_CAST(analyzed_table."target_column" AS TIMESTAMP) > DATE_ADD('SECOND', CAST(0.0 * 86400 AS INTEGER), CURRENT_TIMESTAMP)
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table.grouping_level_1,
analyzed_table.grouping_level_2
FROM (
SELECT
original_table.*,
original_table."country" AS grouping_level_1,
original_table."state" AS grouping_level_2
FROM "your_trino_database"."<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
QuestDB
{% import '/dialects/questdb.sql.jinja2' as lib with context -%}
SELECT
COALESCE(100.0 * SUM(
CASE
WHEN
                {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD('s', ({{(parameters.max_future_days)}})::int * 86400, NOW())
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD('d', ({{(parameters.max_future_days)}})::int, TODAY())
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD('s', ({{(parameters.max_future_days)}})::int * 86400, NOW())
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > DATEADD('s', ({{(parameters.max_future_days)}})::int * 86400, NOW())
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }}), 0.0)
AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM(
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
COALESCE(100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > DATEADD('s', (0.0)::int * 86400, NOW())
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column"), 0.0)
AS actual_value,
analyzed_table.grouping_level_1,
analyzed_table.grouping_level_2
FROM(
SELECT
original_table.*,
original_table."country" AS grouping_level_1,
original_table."state" AS grouping_level_2
FROM "<target_table>" original_table
) analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
Redshift
{% import '/dialects/redshift.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
                {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + make_interval(days => ({{(parameters.max_future_days)}})::int)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => (0.0 * 86400)::int)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table."country" AS grouping_level_1,
analyzed_table."state" AS grouping_level_2
FROM "your_redshift_database"."<target_schema>"."<target_table>" AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
Snowflake
{% import '/dialects/snowflake.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > TIMESTAMPADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(DAY, CAST({{(parameters.max_future_days)}} AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% else -%}
TRY_TO_TIMESTAMP({{ lib.render_target_column('analyzed_table') }}) > TIMESTAMPADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
TRY_TO_TIMESTAMP(analyzed_table."target_column") > TIMESTAMPADD(SECOND, CAST(0.0 * 86400 AS INTEGER), CURRENT_TIMESTAMP)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table."country" AS grouping_level_1,
analyzed_table."state" AS grouping_level_2
FROM "your_snowflake_database"."<target_schema>"."<target_table>" AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
Spark
{% import '/dialects/spark.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > (CURRENT_DATE() + INTERVAL {{((parameters.max_future_days) * 1) | int}} DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL 0 SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
analyzed_table.`country` AS grouping_level_1,
analyzed_table.`state` AS grouping_level_2
FROM `<target_schema>`.`<target_table>` AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
SQL Server
{% import '/dialects/sqlserver.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT_BIG({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INT), SYSDATETIME())
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(DAY, CAST({{(parameters.max_future_days)}} AS INT), GETDATE())
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INT), GETDATE())
{% else -%}
TRY_CAST({{ lib.render_target_column('analyzed_table') }} AS DATETIME) > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INT), SYSDATETIME())
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT_BIG({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT_BIG(analyzed_table.[target_column]) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
TRY_CAST(analyzed_table.[target_column] AS DATETIME) > DATEADD(SECOND, CAST(0.0 * 86400 AS INT), SYSDATETIME())
THEN 1
ELSE 0
END
) / COUNT_BIG(analyzed_table.[target_column])
END AS actual_value,
analyzed_table.[country] AS grouping_level_1,
analyzed_table.[state] AS grouping_level_2
FROM [your_sql_server_database].[<target_schema>].[<target_table>] AS analyzed_table
GROUP BY analyzed_table.[country], analyzed_table.[state]
ORDER BY analyzed_table.[country], analyzed_table.[state]
Teradata
{% import '/dialects/teradata.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > (CURRENT_DATE + INTERVAL {{((parameters.max_future_days) * 1) | int}} DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table."target_column" AS TIMESTAMP) > CURRENT_TIMESTAMP + INTERVAL 0 SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table."country" AS grouping_level_1,
analyzed_table."state" AS grouping_level_2
FROM "<target_schema>"."<target_table>" AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
Trino
{% import '/dialects/trino.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('DAY', CAST({{(parameters.max_future_days)}} AS INTEGER), CURRENT_DATE)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_DATETIME)
{% else -%}
TRY_CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% endif -%}
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
TRY_CAST(analyzed_table."target_column" AS TIMESTAMP) > DATE_ADD('SECOND', CAST(0.0 * 86400 AS INTEGER), CURRENT_TIMESTAMP)
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table.grouping_level_1,
analyzed_table.grouping_level_2
FROM (
SELECT
original_table.*,
original_table."country" AS grouping_level_1,
original_table."state" AS grouping_level_2
FROM "your_trino_catalog"."<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
daily date values in future percent
Check description
Detects dates in the future in date, datetime and timestamp columns. Measures a percentage of dates in the future. Raises a data quality issue when too many future dates are found. Stores the most recent captured value for each day when the data quality check was evaluated.
Data quality check name | Friendly name | Category | Check type | Time scale | Quality dimension | Sensor definition | Quality rule | Standard |
---|---|---|---|---|---|---|---|---|
daily_date_values_in_future_percent | Maximum percentage of rows containing dates in future | datetime | monitoring | daily | Validity | date_values_in_future_percent | max_percent |
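To give the measured value a concrete sense, here is a purely illustrative calculation (hypothetical row counts, with max_future_days left at its default of 0.0): if 5 out of 200 non-null values in the analyzed column are later than the current timestamp, the sensor reports

actual_value = 100.0 * 5 / 200 = 2.5

Compared against the warning/error/fatal thresholds used in the sample configuration below (max_percent of 0.0, 1.0 and 5.0), this result would raise a warning and an error, but not a fatal issue.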
Command-line examples
Please expand the section below to see the DQOps command-line examples to run or activate the daily date values in future percent data quality check.
Managing daily date values in future percent check from DQOps shell
Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. Activates the warning rule with the default parameters.
dqo> check activate -c=connection_name -t=schema_name.table_name -col=column_name -ch=daily_date_values_in_future_percent --enable-warning
You can also use patterns to activate the check on all matching tables and columns.
dqo> check activate -c=connection_name -t=schema_prefix*.fact_* -col=column_name -ch=daily_date_values_in_future_percent --enable-warning
Additional rule parameters are passed using the -Wrule_parameter_name=value.
Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. Activates the error rule with the default parameters.
dqo> check activate -c=connection_name -t=schema_name.table_name -col=column_name -ch=daily_date_values_in_future_percent --enable-error
You can also use patterns to activate the check on all matching tables and columns.
dqo> check activate -c=connection_name -t=schema_prefix*.fact_* -col=column_name -ch=daily_date_values_in_future_percent --enable-error
Additional rule parameters are passed using the -Erule_parameter_name=value.
Run this data quality check using the check run CLI command by providing the check name and all other targeting filters. The following example shows how to run the daily_date_values_in_future_percent check on all tables and columns on a single data source.
It is also possible to run this check on a specific connection and table. In order to do this, use the connection name and the full table name parameters.
You can also run this check on all tables (and columns) on which the daily_date_values_in_future_percent check is enabled using patterns to find tables.
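A minimal sketch of these check run invocations, assuming the same connection_name, schema_name.table_name, and schema_prefix*.fact_* placeholders used in the activation examples above:

dqo> check run -c=connection_name -ch=daily_date_values_in_future_percent
dqo> check run -c=connection_name -t=schema_name.table_name -ch=daily_date_values_in_future_percent
dqo> check run -c=connection_name -t=schema_prefix*.fact_* -ch=daily_date_values_in_future_percent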
YAML configuration
The sample schema_name.table_name.dqotable.yaml file with the check configured is shown below.
# yaml-language-server: $schema=https://cloud.dqops.com/dqo-yaml-schema/TableYaml-schema.json
apiVersion: dqo/v1
kind: table
spec:
columns:
target_column:
monitoring_checks:
daily:
datetime:
daily_date_values_in_future_percent:
parameters:
max_future_days: 0.0
warning:
max_percent: 0.0
error:
max_percent: 1.0
fatal:
max_percent: 5.0
labels:
- This is the column that is analyzed for data quality issues
Samples of generated SQL queries for each data source type
Please expand the database engine name section to see the SQL query rendered by a Jinja2 template for the date_values_in_future_percent data quality sensor.
BigQuery
{% import '/dialects/bigquery.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS INT64) SECOND)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATE(), INTERVAL CAST({{(parameters.max_future_days)}} AS INT64) DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATETIME_ADD(CURRENT_DATETIME(), INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS INT64) SECOND)
{% else -%}
SAFE_CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS INT64) SECOND)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN SAFE_CAST(analyzed_table.`target_column` AS TIMESTAMP) > TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL CAST(0.0 * 86400 AS INT64) SECOND)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value
FROM `your-google-project-id`.`<target_schema>`.`<target_table>` AS analyzed_table
ClickHouse
{% import '/dialects/clickhouse.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > toDateTime64(now(), 3) + INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS Int64) SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > toDate(now()) + INTERVAL CAST({{(parameters.max_future_days)}} AS Int64) DAY
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > toDateTime(now()) + INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS Int64) SECOND
{% else -%}
toDateTime64({{ lib.render_target_column('analyzed_table') }}, 3) > toDateTime64(now(), 3) + INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS Int64) SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN toDateTime64(analyzed_table."target_column", 3) > toDateTime64(now(), 3) + INTERVAL CAST(0.0 * 86400 AS Int64) SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value
FROM "<target_schema>"."<target_table>" AS analyzed_table
Databricks
{% import '/dialects/databricks.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > (CURRENT_DATE() + INTERVAL {{((parameters.max_future_days) * 1) | int}} DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL 0 SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value
FROM `<target_schema>`.`<target_table>` AS analyzed_table
DB2
{% import '/dialects/db2.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INT))
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_DAYS(CURRENT_DATE, CAST({{(parameters.max_future_days)}} AS INT))
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INT))
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INT))
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN CAST(analyzed_table."target_column" AS TIMESTAMP) > ADD_SECONDS(CURRENT_TIMESTAMP, CAST(0.0 * 86400 AS INT))
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value
FROM (
SELECT
original_table.*
FROM "<target_schema>"."<target_table>" original_table
) analyzed_table
DuckDB
{% import '/dialects/duckdb.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
                {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + INTERVAL ({{(parameters.max_future_days)}} * 1) DAY
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > CURRENT_TIMESTAMP + INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
HANA
{% import '/dialects/hana.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER))
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_DAYS(CURRENT_DATE, CAST({{(parameters.max_future_days)}} AS INTEGER))
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER))
{% else -%}
TO_TIMESTAMP({{ lib.render_target_column('analyzed_table') }}) > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER))
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
TO_TIMESTAMP(analyzed_table."target_column") > ADD_SECONDS(CURRENT_TIMESTAMP, CAST(0.0 * 86400 AS INTEGER))
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value
FROM (
SELECT
original_table.*
FROM "<target_schema>"."<target_table>" original_table
) analyzed_table
MariaDB
{% import '/dialects/mariadb.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATE(), INTERVAL ({{(parameters.max_future_days)}}) DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATETIME(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL (0.0 * 86400) SECOND)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value
FROM `<target_table>` AS analyzed_table
MySQL
{% import '/dialects/mysql.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATE(), INTERVAL ({{(parameters.max_future_days)}}) DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATETIME(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL (0.0 * 86400) SECOND)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value
FROM `<target_table>` AS analyzed_table
Oracle
{% import '/dialects/oracle.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + numToDSInterval( CAST( ({{(parameters.max_future_days)}} * 86400) AS INTEGER), 'second' )
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + numToDSInterval( CAST( ({{(parameters.max_future_days)}}) AS INTEGER), 'day' )
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME + numToDSInterval( CAST( ({{(parameters.max_future_days)}} * 86400) AS INTEGER), 'second' )
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP + numToDSInterval( CAST( ({{(parameters.max_future_days)}} * 86400) AS INTEGER), 'second' )
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table."target_column" AS TIMESTAMP) > CURRENT_TIMESTAMP + numToDSInterval( CAST( (0.0 * 86400) AS INTEGER), 'second' )
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value
FROM (
SELECT
original_table.*
FROM "<target_schema>"."<target_table>" original_table
) analyzed_table
PostgreSQL
{% import '/dialects/postgresql.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
                {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + make_interval(days => ({{(parameters.max_future_days)}})::int)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => (0.0 * 86400)::int)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value
FROM "your_postgresql_database"."<target_schema>"."<target_table>" AS analyzed_table
Presto
{% import '/dialects/presto.sql.jinja2' as lib with context -%}
{% macro render_value_in_future() -%}
{%- endmacro -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('DAY', CAST({{(parameters.max_future_days)}} AS INTEGER), CURRENT_DATE)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_DATETIME)
{% else -%}
TRY_CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% endif -%}
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
TRY_CAST(analyzed_table."target_column" AS TIMESTAMP) > DATE_ADD('SECOND', CAST(0.0 * 86400 AS INTEGER), CURRENT_TIMESTAMP)
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT(analyzed_table."target_column")
END AS actual_value
FROM (
SELECT
original_table.*
FROM "your_trino_database"."<target_schema>"."<target_table>" original_table
) analyzed_table
QuestDB
{% import '/dialects/questdb.sql.jinja2' as lib with context -%}
SELECT
COALESCE(100.0 * SUM(
CASE
WHEN
                {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD('s', ({{(parameters.max_future_days)}})::int * 86400, NOW())
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD('d', ({{(parameters.max_future_days)}})::int, TODAY())
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD('s', ({{(parameters.max_future_days)}})::int * 86400, NOW())
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > DATEADD('s', ({{(parameters.max_future_days)}})::int * 86400, NOW())
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }}), 0.0)
AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM(
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
Redshift
{% import '/dialects/redshift.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
                {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + make_interval(days => ({{(parameters.max_future_days)}})::int)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => (0.0 * 86400)::int)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value
FROM "your_redshift_database"."<target_schema>"."<target_table>" AS analyzed_table
Snowflake
{% import '/dialects/snowflake.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > TIMESTAMPADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(DAY, CAST({{(parameters.max_future_days)}} AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% else -%}
TRY_TO_TIMESTAMP({{ lib.render_target_column('analyzed_table') }}) > TIMESTAMPADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
TRY_TO_TIMESTAMP(analyzed_table."target_column") > TIMESTAMPADD(SECOND, CAST(0.0 * 86400 AS INTEGER), CURRENT_TIMESTAMP)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value
FROM "your_snowflake_database"."<target_schema>"."<target_table>" AS analyzed_table
Spark
{% import '/dialects/spark.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > (CURRENT_DATE() + INTERVAL {{((parameters.max_future_days) * 1) | int}} DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL 0 SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value
FROM `<target_schema>`.`<target_table>` AS analyzed_table
SQL Server
{% import '/dialects/sqlserver.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT_BIG({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INT), SYSDATETIME())
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(DAY, CAST({{(parameters.max_future_days)}} AS INT), GETDATE())
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INT), GETDATE())
{% else -%}
TRY_CAST({{ lib.render_target_column('analyzed_table') }} AS DATETIME) > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INT), SYSDATETIME())
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT_BIG({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT_BIG(analyzed_table.[target_column]) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
TRY_CAST(analyzed_table.[target_column] AS DATETIME) > DATEADD(SECOND, CAST(0.0 * 86400 AS INT), SYSDATETIME())
THEN 1
ELSE 0
END
) / COUNT_BIG(analyzed_table.[target_column])
END AS actual_value
FROM [your_sql_server_database].[<target_schema>].[<target_table>] AS analyzed_table
Teradata
{% import '/dialects/teradata.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > (CURRENT_DATE + INTERVAL {{((parameters.max_future_days) * 1) | int}} DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table."target_column" AS TIMESTAMP) > CURRENT_TIMESTAMP + INTERVAL 0 SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value
FROM "<target_schema>"."<target_table>" AS analyzed_table
Trino
{% import '/dialects/trino.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('DAY', CAST({{(parameters.max_future_days)}} AS INTEGER), CURRENT_DATE)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_DATETIME)
{% else -%}
TRY_CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% endif -%}
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
TRY_CAST(analyzed_table."target_column" AS TIMESTAMP) > DATE_ADD('SECOND', CAST(0.0 * 86400 AS INTEGER), CURRENT_TIMESTAMP)
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT(analyzed_table."target_column")
END AS actual_value
FROM (
SELECT
original_table.*
FROM "your_trino_catalog"."<target_schema>"."<target_table>" original_table
) analyzed_table
Expand the Configuration with data grouping section to see additional examples for configuring this data quality check to use data grouping (GROUP BY).
Configuration with data grouping
Sample configuration with data grouping enabled (YAML)
The sample below shows how to configure data grouping and how it affects the generated SQL query.
# yaml-language-server: $schema=https://cloud.dqops.com/dqo-yaml-schema/TableYaml-schema.json
apiVersion: dqo/v1
kind: table
spec:
default_grouping_name: group_by_country_and_state
groupings:
group_by_country_and_state:
level_1:
source: column_value
column: country
level_2:
source: column_value
column: state
columns:
target_column:
monitoring_checks:
daily:
datetime:
daily_date_values_in_future_percent:
parameters:
max_future_days: 0.0
warning:
max_percent: 0.0
error:
max_percent: 1.0
fatal:
max_percent: 5.0
labels:
- This is the column that is analyzed for data quality issues
country:
labels:
- column used as the first grouping key
state:
labels:
- column used as the second grouping key
Please expand the database engine name section to see the SQL query rendered by a Jinja2 template for the date_values_in_future_percent sensor.
BigQuery
{% import '/dialects/bigquery.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS INT64) SECOND)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATE(), INTERVAL CAST({{(parameters.max_future_days)}} AS INT64) DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATETIME_ADD(CURRENT_DATETIME(), INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS INT64) SECOND)
{% else -%}
SAFE_CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS INT64) SECOND)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN SAFE_CAST(analyzed_table.`target_column` AS TIMESTAMP) > TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL CAST(0.0 * 86400 AS INT64) SECOND)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
analyzed_table.`country` AS grouping_level_1,
analyzed_table.`state` AS grouping_level_2
FROM `your-google-project-id`.`<target_schema>`.`<target_table>` AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
ClickHouse
{% import '/dialects/clickhouse.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > toDateTime64(now(), 3) + INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS Int64) SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > toDate(now()) + INTERVAL CAST({{(parameters.max_future_days)}} AS Int64) DAY
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > toDateTime(now()) + INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS Int64) SECOND
{% else -%}
toDateTime64({{ lib.render_target_column('analyzed_table') }}, 3) > toDateTime64(now(), 3) + INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS Int64) SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN toDateTime64(analyzed_table."target_column", 3) > toDateTime64(now(), 3) + INTERVAL CAST(0.0 * 86400 AS Int64) SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table."country" AS grouping_level_1,
analyzed_table."state" AS grouping_level_2
FROM "<target_schema>"."<target_table>" AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
Databricks
{% import '/dialects/databricks.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > (CURRENT_DATE() + INTERVAL {{((parameters.max_future_days) * 1) | int}} DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL 0 SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
analyzed_table.`country` AS grouping_level_1,
analyzed_table.`state` AS grouping_level_2
FROM `<target_schema>`.`<target_table>` AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
DB2
{% import '/dialects/db2.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INT))
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_DAYS(CURRENT_DATE, CAST({{(parameters.max_future_days)}} AS INT))
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INT))
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INT))
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN CAST(analyzed_table."target_column" AS TIMESTAMP) > ADD_SECONDS(CURRENT_TIMESTAMP, CAST(0.0 * 86400 AS INT))
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table.grouping_level_1,
analyzed_table.grouping_level_2
FROM (
SELECT
original_table.*,
original_table."country" AS grouping_level_1,
original_table."state" AS grouping_level_2
FROM "<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
DuckDB
{% import '/dialects/duckdb.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + INTERVAL ({{(parameters.max_future_days)}} * 1) DAY
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > CURRENT_TIMESTAMP + INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > CURRENT_TIMESTAMP + INTERVAL (0.0 * 86400) SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table."country" AS grouping_level_1,
analyzed_table."state" AS grouping_level_2
FROM AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
HANA
{% import '/dialects/hana.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER))
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_DAYS(CURRENT_DATE, CAST({{(parameters.max_future_days)}} AS INTEGER))
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER))
{% else -%}
TO_TIMESTAMP({{ lib.render_target_column('analyzed_table') }}) > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER))
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
TO_TIMESTAMP(analyzed_table."target_column") > ADD_SECONDS(CURRENT_TIMESTAMP, CAST(0.0 * 86400 AS INTEGER))
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table.grouping_level_1,
analyzed_table.grouping_level_2
FROM (
SELECT
original_table.*,
original_table."country" AS grouping_level_1,
original_table."state" AS grouping_level_2
FROM "<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
MariaDB
{% import '/dialects/mariadb.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATE(), INTERVAL ({{(parameters.max_future_days)}}) DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATETIME(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL (0.0 * 86400) SECOND)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
analyzed_table.`country` AS grouping_level_1,
analyzed_table.`state` AS grouping_level_2
FROM `<target_table>` AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
MySQL
{% import '/dialects/mysql.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATE(), INTERVAL ({{(parameters.max_future_days)}}) DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATETIME(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL (0.0 * 86400) SECOND)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
analyzed_table.`country` AS grouping_level_1,
analyzed_table.`state` AS grouping_level_2
FROM `<target_table>` AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
Oracle
{% import '/dialects/oracle.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + numToDSInterval( CAST( ({{(parameters.max_future_days)}} * 86400) AS INTEGER), 'second' )
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + numToDSInterval( CAST( ({{(parameters.max_future_days)}}) AS INTEGER), 'day' )
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME + numToDSInterval( CAST( ({{(parameters.max_future_days)}} * 86400) AS INTEGER), 'second' )
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP + numToDSInterval( CAST( ({{(parameters.max_future_days)}} * 86400) AS INTEGER), 'second' )
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table."target_column" AS TIMESTAMP) > CURRENT_TIMESTAMP + numToDSInterval( CAST( (0.0 * 86400) AS INTEGER), 'second' )
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table.grouping_level_1,
analyzed_table.grouping_level_2
FROM (
SELECT
original_table.*,
original_table."country" AS grouping_level_1,
original_table."state" AS grouping_level_2
FROM "<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
PostgreSQL
{% import '/dialects/postgresql.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + make_interval(days => ({{(parameters.max_future_days)}})::int)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => (0.0 * 86400)::int)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table."country" AS grouping_level_1,
analyzed_table."state" AS grouping_level_2
FROM "your_postgresql_database"."<target_schema>"."<target_table>" AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
Presto
{% import '/dialects/presto.sql.jinja2' as lib with context -%}
{% macro render_value_in_future() -%}
{%- endmacro -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('DAY', CAST({{(parameters.max_future_days)}} AS INTEGER), CURRENT_DATE)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_DATETIME)
{% else -%}
TRY_CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% endif -%}
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
TRY_CAST(analyzed_table."target_column" AS TIMESTAMP) > DATE_ADD('SECOND', CAST(0.0 * 86400 AS INTEGER), CURRENT_TIMESTAMP)
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table.grouping_level_1,
analyzed_table.grouping_level_2
FROM (
SELECT
original_table.*,
original_table."country" AS grouping_level_1,
original_table."state" AS grouping_level_2
FROM "your_trino_database"."<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
QuestDB
{% import '/dialects/questdb.sql.jinja2' as lib with context -%}
SELECT
COALESCE(100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD('s', ({{(parameters.max_future_days)}})::int * 86400, NOW())
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD('d', ({{(parameters.max_future_days)}})::int, TODAY())
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD('s', ({{(parameters.max_future_days)}})::int * 86400, NOW())
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > DATEADD('s', ({{(parameters.max_future_days)}})::int * 86400, NOW())
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }}), 0.0)
AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM(
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
COALESCE(100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > DATEADD('s', (0.0)::int * 86400, NOW())
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column"), 0.0)
AS actual_value,
analyzed_table.grouping_level_1,
analyzed_table.grouping_level_2
FROM(
SELECT
original_table.*,
original_table."country" AS grouping_level_1,
original_table."state" AS grouping_level_2
FROM "<target_table>" original_table
) analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
Redshift
{% import '/dialects/redshift.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + make_interval(days => ({{(parameters.max_future_days)}})::int)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => (0.0 * 86400)::int)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table."country" AS grouping_level_1,
analyzed_table."state" AS grouping_level_2
FROM "your_redshift_database"."<target_schema>"."<target_table>" AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
Snowflake
{% import '/dialects/snowflake.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > TIMESTAMPADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(DAY, CAST({{(parameters.max_future_days)}} AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% else -%}
TRY_TO_TIMESTAMP({{ lib.render_target_column('analyzed_table') }}) > TIMESTAMPADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
TRY_TO_TIMESTAMP(analyzed_table."target_column") > TIMESTAMPADD(SECOND, CAST(0.0 * 86400 AS INTEGER), CURRENT_TIMESTAMP)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table."country" AS grouping_level_1,
analyzed_table."state" AS grouping_level_2
FROM "your_snowflake_database"."<target_schema>"."<target_table>" AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
Spark
{% import '/dialects/spark.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > (CURRENT_DATE() + INTERVAL {{((parameters.max_future_days) * 1) | int}} DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL 0 SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
analyzed_table.`country` AS grouping_level_1,
analyzed_table.`state` AS grouping_level_2
FROM `<target_schema>`.`<target_table>` AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
SQL Server
{% import '/dialects/sqlserver.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT_BIG({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INT), SYSDATETIME())
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(DAY, CAST({{(parameters.max_future_days)}} AS INT), GETDATE())
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INT), GETDATE())
{% else -%}
TRY_CAST({{ lib.render_target_column('analyzed_table') }} AS DATETIME) > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INT), SYSDATETIME())
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT_BIG({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT_BIG(analyzed_table.[target_column]) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
TRY_CAST(analyzed_table.[target_column] AS DATETIME) > DATEADD(SECOND, CAST(0.0 * 86400 AS INT), SYSDATETIME())
THEN 1
ELSE 0
END
) / COUNT_BIG(analyzed_table.[target_column])
END AS actual_value,
analyzed_table.[country] AS grouping_level_1,
analyzed_table.[state] AS grouping_level_2
FROM [your_sql_server_database].[<target_schema>].[<target_table>] AS analyzed_table
GROUP BY analyzed_table.[country], analyzed_table.[state]
ORDER BY level_1, level_2
Teradata
{% import '/dialects/teradata.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > (CURRENT_DATE + INTERVAL {{((parameters.max_future_days) * 1) | int}} DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table."target_column" AS TIMESTAMP) > CURRENT_TIMESTAMP + INTERVAL 0 SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table."country" AS grouping_level_1,
analyzed_table."state" AS grouping_level_2
FROM "<target_schema>"."<target_table>" AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
Trino
{% import '/dialects/trino.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('DAY', CAST({{(parameters.max_future_days)}} AS INTEGER), CURRENT_DATE)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_DATETIME)
{% else -%}
TRY_CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% endif -%}
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
TRY_CAST(analyzed_table."target_column" AS TIMESTAMP) > DATE_ADD('SECOND', CAST(0.0 * 86400 AS INTEGER), CURRENT_TIMESTAMP)
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table.grouping_level_1,
analyzed_table.grouping_level_2
FROM (
SELECT
original_table.*,
original_table."country" AS grouping_level_1,
original_table."state" AS grouping_level_2
FROM "your_trino_catalog"."<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
monthly date values in future percent
Check description
Detects dates in the future in date, datetime and timestamp columns. Measures a percentage of dates in the future. Raises a data quality issue when too many future dates are found. Stores the most recent check result for each month when the data quality check was evaluated.
Data quality check name | Friendly name | Category | Check type | Time scale | Quality dimension | Sensor definition | Quality rule | Standard |
---|---|---|---|---|---|---|---|---|
monthly_date_values_in_future_percent | Maximum percentage of rows containing dates in future | datetime | monitoring | monthly | Validity | date_values_in_future_percent | max_percent | |
Command-line examples
Please expand the section below to see the DQOps command-line examples to run or activate the monthly date values in future percent data quality check.
Managing monthly date values in future percent check from DQOps shell
Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. This activates the warning rule with its default parameters.
dqo> check activate -c=connection_name -t=schema_name.table_name -col=column_name -ch=monthly_date_values_in_future_percent --enable-warning
You can also use patterns to activate the check on all matching tables and columns.
dqo> check activate -c=connection_name -t=schema_prefix*.fact_* -col=column_name -ch=monthly_date_values_in_future_percent --enable-warning
Additional rule parameters are passed using the -Wrule_parameter_name=value syntax.
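For example, an illustrative command that sets the warning threshold so the check fails when more than 1% of rows contain future dates (the 1.0 value is only a sample threshold):
dqo> check activate -c=connection_name -t=schema_name.table_name -col=column_name -ch=monthly_date_values_in_future_percent --enable-warning -Wmax_percent=1.0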
Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. This activates the error rule with its default parameters.
dqo> check activate -c=connection_name -t=schema_name.table_name -col=column_name -ch=monthly_date_values_in_future_percent --enable-error
You can also use patterns to activate the check on all matching tables and columns.
dqo> check activate -c=connection_name -t=schema_prefix*.fact_* -col=column_name -ch=monthly_date_values_in_future_percent --enable-error
Additional rule parameters are passed using the -Erule_parameter_name=value syntax.
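For example, an illustrative command that sets the error threshold to 2% of rows with future dates (the 2.0 value is only a sample threshold):
dqo> check activate -c=connection_name -t=schema_name.table_name -col=column_name -ch=monthly_date_values_in_future_percent --enable-error -Emax_percent=2.0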
Run this data quality check using the check run CLI command by providing the check name and all other targeting filters. The following example shows how to run the monthly_date_values_in_future_percent check on all tables and columns on a single data source.
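The command below is a minimal sketch of that invocation; it assumes the data source connection is named connection_name and omits the table filter so that the check runs on every table where it is enabled.
dqo> check run -c=connection_name -ch=monthly_date_values_in_future_percent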
It is also possible to run this check on a specific connection and table. In order to do this, use the connection name and the full table name parameters.
dqo> check run -c=connection_name -t=schema_name.table_name -ch=monthly_date_values_in_future_percent
You can also run this check on all tables (and columns) on which the monthly_date_values_in_future_percent check is enabled, using patterns to find the tables.
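For example, a sketch that reuses the table name pattern shown in the activation examples above:
dqo> check run -c=connection_name -t=schema_prefix*.fact_* -col=column_name -ch=monthly_date_values_in_future_percent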
YAML configuration
The sample schema_name.table_name.dqotable.yaml file with the check configured is shown below.
# yaml-language-server: $schema=https://cloud.dqops.com/dqo-yaml-schema/TableYaml-schema.json
apiVersion: dqo/v1
kind: table
spec:
columns:
target_column:
monitoring_checks:
monthly:
datetime:
monthly_date_values_in_future_percent:
parameters:
max_future_days: 0.0
warning:
max_percent: 0.0
error:
max_percent: 1.0
fatal:
max_percent: 5.0
labels:
- This is the column that is analyzed for data quality issues
Samples of generated SQL queries for each data source type
Please expand the database engine name section to see the SQL query rendered by a Jinja2 template for the date_values_in_future_percent data quality sensor.
BigQuery
{% import '/dialects/bigquery.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS INT64) SECOND)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATE(), INTERVAL CAST({{(parameters.max_future_days)}} AS INT64) DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATETIME_ADD(CURRENT_DATETIME(), INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS INT64) SECOND)
{% else -%}
SAFE_CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS INT64) SECOND)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN SAFE_CAST(analyzed_table.`target_column` AS TIMESTAMP) > TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL CAST(0.0 * 86400 AS INT64) SECOND)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value
FROM `your-google-project-id`.`<target_schema>`.`<target_table>` AS analyzed_table
ClickHouse
{% import '/dialects/clickhouse.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > toDateTime64(now(), 3) + INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS Int64) SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > toDate(now()) + INTERVAL CAST({{(parameters.max_future_days)}} AS Int64) DAY
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > toDateTime(now()) + INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS Int64) SECOND
{% else -%}
toDateTime64({{ lib.render_target_column('analyzed_table') }}, 3) > toDateTime64(now(), 3) + INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS Int64) SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN toDateTime64(analyzed_table."target_column", 3) > toDateTime64(now(), 3) + INTERVAL CAST(0.0 * 86400 AS Int64) SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value
FROM "<target_schema>"."<target_table>" AS analyzed_table
Databricks
{% import '/dialects/databricks.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > (CURRENT_DATE() + INTERVAL {{((parameters.max_future_days) * 1) | int}} DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL 0 SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value
FROM `<target_schema>`.`<target_table>` AS analyzed_table
DB2
{% import '/dialects/db2.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INT))
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_DAYS(CURRENT_DATE, CAST({{(parameters.max_future_days)}} AS INT))
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INT))
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INT))
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN CAST(analyzed_table."target_column" AS TIMESTAMP) > ADD_SECONDS(CURRENT_TIMESTAMP, CAST(0.0 * 86400 AS INT))
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value
FROM (
SELECT
original_table.*
FROM "<target_schema>"."<target_table>" original_table
) analyzed_table
DuckDB
{% import '/dialects/duckdb.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + INTERVAL ({{(parameters.max_future_days)}} * 1) DAY
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > CURRENT_TIMESTAMP + INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
HANA
{% import '/dialects/hana.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER))
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_DAYS(CURRENT_DATE, CAST({{(parameters.max_future_days)}} AS INTEGER))
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER))
{% else -%}
TO_TIMESTAMP({{ lib.render_target_column('analyzed_table') }}) > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER))
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
TO_TIMESTAMP(analyzed_table."target_column") > ADD_SECONDS(CURRENT_TIMESTAMP, CAST(0.0 * 86400 AS INTEGER))
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value
FROM (
SELECT
original_table.*
FROM "<target_schema>"."<target_table>" original_table
) analyzed_table
MariaDB
{% import '/dialects/mariadb.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATE(), INTERVAL ({{(parameters.max_future_days)}}) DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATETIME(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL (0.0 * 86400) SECOND)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value
FROM `<target_table>` AS analyzed_table
MySQL
{% import '/dialects/mysql.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATE(), INTERVAL ({{(parameters.max_future_days)}}) DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATETIME(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL (0.0 * 86400) SECOND)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value
FROM `<target_table>` AS analyzed_table
Oracle
{% import '/dialects/oracle.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + numToDSInterval( CAST( ({{(parameters.max_future_days)}} * 86400) AS INTEGER), 'second' )
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + numToDSInterval( CAST( ({{(parameters.max_future_days)}}) AS INTEGER), 'day' )
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME + numToDSInterval( CAST( ({{(parameters.max_future_days)}} * 86400) AS INTEGER), 'second' )
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP + numToDSInterval( CAST( ({{(parameters.max_future_days)}} * 86400) AS INTEGER), 'second' )
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table."target_column" AS TIMESTAMP) > CURRENT_TIMESTAMP + numToDSInterval( CAST( (0.0 * 86400) AS INTEGER), 'second' )
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value
FROM (
SELECT
original_table.*
FROM "<target_schema>"."<target_table>" original_table
) analyzed_table
PostgreSQL
{% import '/dialects/postgresql.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + make_interval(days => ({{(parameters.max_future_days)}})::int)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => (0.0 * 86400)::int)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value
FROM "your_postgresql_database"."<target_schema>"."<target_table>" AS analyzed_table
Presto
{% import '/dialects/presto.sql.jinja2' as lib with context -%}
{% macro render_value_in_future() -%}
{%- endmacro -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('DAY', CAST({{(parameters.max_future_days)}} AS INTEGER), CURRENT_DATE)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_DATETIME)
{% else -%}
TRY_CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% endif -%}
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
TRY_CAST(analyzed_table."target_column" AS TIMESTAMP) > DATE_ADD('SECOND', CAST(0.0 * 86400 AS INTEGER), CURRENT_TIMESTAMP)
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT(analyzed_table."target_column")
END AS actual_value
FROM (
SELECT
original_table.*
FROM "your_trino_database"."<target_schema>"."<target_table>" original_table
) analyzed_table
QuestDB
{% import '/dialects/questdb.sql.jinja2' as lib with context -%}
SELECT
COALESCE(100.0 * SUM(
CASE
WHEN
            {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD('s', ({{(parameters.max_future_days)}})::int * 86400, NOW())
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD('d', ({{(parameters.max_future_days)}})::int, TODAY())
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD('s', ({{(parameters.max_future_days)}})::int * 86400, NOW())
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > DATEADD('s', ({{(parameters.max_future_days)}})::int * 86400, NOW())
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }}), 0.0)
AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM(
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
Redshift
{% import '/dialects/redshift.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
            {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + make_interval(days => ({{(parameters.max_future_days)}})::int)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => (0.0 * 86400)::int)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value
FROM "your_redshift_database"."<target_schema>"."<target_table>" AS analyzed_table
Snowflake
{% import '/dialects/snowflake.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > TIMESTAMPADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(DAY, CAST({{(parameters.max_future_days)}} AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% else -%}
TRY_TO_TIMESTAMP({{ lib.render_target_column('analyzed_table') }}) > TIMESTAMPADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
TRY_TO_TIMESTAMP(analyzed_table."target_column") > TIMESTAMPADD(SECOND, CAST(0.0 * 86400 AS INTEGER), CURRENT_TIMESTAMP)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value
FROM "your_snowflake_database"."<target_schema>"."<target_table>" AS analyzed_table
Spark
{% import '/dialects/spark.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > (CURRENT_DATE() + INTERVAL {{((parameters.max_future_days) * 1) | int}} DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL 0 SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value
FROM `<target_schema>`.`<target_table>` AS analyzed_table
SQL Server
{% import '/dialects/sqlserver.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT_BIG({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INT), SYSDATETIME())
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(DAY, CAST({{(parameters.max_future_days)}} AS INT), GETDATE())
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INT), GETDATE())
{% else -%}
TRY_CAST({{ lib.render_target_column('analyzed_table') }} AS DATETIME) > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INT), SYSDATETIME())
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT_BIG({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT_BIG(analyzed_table.[target_column]) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
TRY_CAST(analyzed_table.[target_column] AS DATETIME) > DATEADD(SECOND, CAST(0.0 * 86400 AS INT), SYSDATETIME())
THEN 1
ELSE 0
END
) / COUNT_BIG(analyzed_table.[target_column])
END AS actual_value
FROM [your_sql_server_database].[<target_schema>].[<target_table>] AS analyzed_table
Teradata
{% import '/dialects/teradata.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > (CURRENT_DATE + INTERVAL {{((parameters.max_future_days) * 1) | int}} DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table."target_column" AS TIMESTAMP) > CURRENT_TIMESTAMP + INTERVAL 0 SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value
FROM "<target_schema>"."<target_table>" AS analyzed_table
Trino
{% import '/dialects/trino.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('DAY', CAST({{(parameters.max_future_days)}} AS INTEGER), CURRENT_DATE)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_DATETIME)
{% else -%}
TRY_CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% endif -%}
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
TRY_CAST(analyzed_table."target_column" AS TIMESTAMP) > DATE_ADD('SECOND', CAST(0.0 * 86400 AS INTEGER), CURRENT_TIMESTAMP)
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT(analyzed_table."target_column")
END AS actual_value
FROM (
SELECT
original_table.*
FROM "your_trino_catalog"."<target_schema>"."<target_table>" original_table
) analyzed_table
Expand the Configure with data grouping section to see additional examples for configuring this data quality check to use data grouping (GROUP BY).
Configuration with data grouping
Sample configuration with data grouping enabled (YAML)
The sample below shows how to configure data grouping and how it affects the generated SQL query.
# yaml-language-server: $schema=https://cloud.dqops.com/dqo-yaml-schema/TableYaml-schema.json
apiVersion: dqo/v1
kind: table
spec:
default_grouping_name: group_by_country_and_state
groupings:
group_by_country_and_state:
level_1:
source: column_value
column: country
level_2:
source: column_value
column: state
columns:
target_column:
monitoring_checks:
monthly:
datetime:
monthly_date_values_in_future_percent:
parameters:
max_future_days: 0.0
warning:
max_percent: 0.0
error:
max_percent: 1.0
fatal:
max_percent: 5.0
labels:
- This is the column that is analyzed for data quality issues
country:
labels:
- column used as the first grouping key
state:
labels:
- column used as the second grouping key
Please expand the database engine name section to see the SQL query rendered by a Jinja2 template for the date_values_in_future_percent sensor.
BigQuery
{% import '/dialects/bigquery.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS INT64) SECOND)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATE(), INTERVAL CAST({{(parameters.max_future_days)}} AS INT64) DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATETIME_ADD(CURRENT_DATETIME(), INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS INT64) SECOND)
{% else -%}
SAFE_CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS INT64) SECOND)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN SAFE_CAST(analyzed_table.`target_column` AS TIMESTAMP) > TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL CAST(0.0 * 86400 AS INT64) SECOND)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
analyzed_table.`country` AS grouping_level_1,
analyzed_table.`state` AS grouping_level_2
FROM `your-google-project-id`.`<target_schema>`.`<target_table>` AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
ClickHouse
{% import '/dialects/clickhouse.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > toDateTime64(now(), 3) + INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS Int64) SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > toDate(now()) + INTERVAL CAST({{(parameters.max_future_days)}} AS Int64) DAY
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > toDateTime(now()) + INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS Int64) SECOND
{% else -%}
toDateTime64({{ lib.render_target_column('analyzed_table') }}, 3) > toDateTime64(now(), 3) + INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS Int64) SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN toDateTime64(analyzed_table."target_column", 3) > toDateTime64(now(), 3) + INTERVAL CAST(0.0 * 86400 AS Int64) SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table."country" AS grouping_level_1,
analyzed_table."state" AS grouping_level_2
FROM "<target_schema>"."<target_table>" AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
Databricks
{% import '/dialects/databricks.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > (CURRENT_DATE() + INTERVAL {{((parameters.max_future_days) * 1) | int}} DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL 0 SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
analyzed_table.`country` AS grouping_level_1,
analyzed_table.`state` AS grouping_level_2
FROM `<target_schema>`.`<target_table>` AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
DB2
{% import '/dialects/db2.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INT))
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_DAYS(CURRENT_DATE, CAST({{(parameters.max_future_days)}} AS INT))
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INT))
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INT))
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN CAST(analyzed_table."target_column" AS TIMESTAMP) > ADD_SECONDS(CURRENT_TIMESTAMP, CAST(0.0 * 86400 AS INT))
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table.grouping_level_1,
analyzed_table.grouping_level_2
FROM (
SELECT
original_table.*,
original_table."country" AS grouping_level_1,
original_table."state" AS grouping_level_2
FROM "<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
DuckDB
{% import '/dialects/duckdb.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
            {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + INTERVAL ({{(parameters.max_future_days)}} * 1) DAY
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > CURRENT_TIMESTAMP + INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > CURRENT_TIMESTAMP + INTERVAL (0.0 * 86400) SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table."country" AS grouping_level_1,
analyzed_table."state" AS grouping_level_2
    FROM "<target_schema>"."<target_table>" AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
HANA
{% import '/dialects/hana.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER))
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_DAYS(CURRENT_DATE, CAST({{(parameters.max_future_days)}} AS INTEGER))
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER))
{% else -%}
TO_TIMESTAMP({{ lib.render_target_column('analyzed_table') }}) > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER))
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
TO_TIMESTAMP(analyzed_table."target_column") > ADD_SECONDS(CURRENT_TIMESTAMP, CAST(0.0 * 86400 AS INTEGER))
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table.grouping_level_1,
analyzed_table.grouping_level_2
FROM (
SELECT
original_table.*,
original_table."country" AS grouping_level_1,
original_table."state" AS grouping_level_2
FROM "<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
MariaDB
{% import '/dialects/mariadb.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATE(), INTERVAL ({{(parameters.max_future_days)}}) DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATETIME(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL (0.0 * 86400) SECOND)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
analyzed_table.`country` AS grouping_level_1,
analyzed_table.`state` AS grouping_level_2
FROM `<target_table>` AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
MySQL
{% import '/dialects/mysql.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATE(), INTERVAL ({{(parameters.max_future_days)}}) DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATETIME(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL (0.0 * 86400) SECOND)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
analyzed_table.`country` AS grouping_level_1,
analyzed_table.`state` AS grouping_level_2
FROM `<target_table>` AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
Oracle
{% import '/dialects/oracle.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + numToDSInterval( CAST( ({{(parameters.max_future_days)}} * 86400) AS INTEGER), 'second' )
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + numToDSInterval( CAST( ({{(parameters.max_future_days)}}) AS INTEGER), 'day' )
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME + numToDSInterval( CAST( ({{(parameters.max_future_days)}} * 86400) AS INTEGER), 'second' )
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP + numToDSInterval( CAST( ({{(parameters.max_future_days)}} * 86400) AS INTEGER), 'second' )
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table."target_column" AS TIMESTAMP) > CURRENT_TIMESTAMP + numToDSInterval( CAST( (0.0 * 86400) AS INTEGER), 'second' )
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table.grouping_level_1,
analyzed_table.grouping_level_2
FROM (
SELECT
original_table.*,
original_table."country" AS grouping_level_1,
original_table."state" AS grouping_level_2
FROM "<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
PostgreSQL
{% import '/dialects/postgresql.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
            {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + make_interval(days => ({{(parameters.max_future_days)}})::int)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => (0.0 * 86400)::int)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table."country" AS grouping_level_1,
analyzed_table."state" AS grouping_level_2
FROM "your_postgresql_database"."<target_schema>"."<target_table>" AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
Presto
{% import '/dialects/presto.sql.jinja2' as lib with context -%}
{% macro render_value_in_future() -%}
{%- endmacro -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('DAY', CAST({{(parameters.max_future_days)}} AS INTEGER), CURRENT_DATE)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_DATETIME)
{% else -%}
TRY_CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% endif -%}
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
TRY_CAST(analyzed_table."target_column" AS TIMESTAMP) > DATE_ADD('SECOND', CAST(0.0 * 86400 AS INTEGER), CURRENT_TIMESTAMP)
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table.grouping_level_1,
analyzed_table.grouping_level_2
FROM (
SELECT
original_table.*,
original_table."country" AS grouping_level_1,
original_table."state" AS grouping_level_2
FROM "your_trino_database"."<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
QuestDB
{% import '/dialects/questdb.sql.jinja2' as lib with context -%}
SELECT
COALESCE(100.0 * SUM(
CASE
WHEN
            {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD('s', ({{(parameters.max_future_days)}})::int * 86400, NOW())
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD('d', ({{(parameters.max_future_days)}})::int, TODAY())
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD('s', ({{(parameters.max_future_days)}})::int * 86400, NOW())
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > DATEADD('s', ({{(parameters.max_future_days)}})::int * 86400, NOW())
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }}), 0.0)
AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM(
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
COALESCE(100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > DATEADD('s', (0.0)::int * 86400, NOW())
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column"), 0.0)
AS actual_value,
analyzed_table.grouping_level_1,
analyzed_table.grouping_level_2
FROM(
SELECT
original_table.*,
original_table."country" AS grouping_level_1,
original_table."state" AS grouping_level_2
FROM "<target_table>" original_table
) analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
Redshift
{% import '/dialects/redshift.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
            {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + make_interval(days => ({{(parameters.max_future_days)}})::int)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => (0.0 * 86400)::int)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table."country" AS grouping_level_1,
analyzed_table."state" AS grouping_level_2
FROM "your_redshift_database"."<target_schema>"."<target_table>" AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
Snowflake
{% import '/dialects/snowflake.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > TIMESTAMPADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(DAY, CAST({{(parameters.max_future_days)}} AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% else -%}
TRY_TO_TIMESTAMP({{ lib.render_target_column('analyzed_table') }}) > TIMESTAMPADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
TRY_TO_TIMESTAMP(analyzed_table."target_column") > TIMESTAMPADD(SECOND, CAST(0.0 * 86400 AS INTEGER), CURRENT_TIMESTAMP)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table."country" AS grouping_level_1,
analyzed_table."state" AS grouping_level_2
FROM "your_snowflake_database"."<target_schema>"."<target_table>" AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
Spark
{% import '/dialects/spark.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > (CURRENT_DATE() + INTERVAL {{((parameters.max_future_days) * 1) | int}} DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL 0 SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
analyzed_table.`country` AS grouping_level_1,
analyzed_table.`state` AS grouping_level_2
FROM `<target_schema>`.`<target_table>` AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
SQL Server
{% import '/dialects/sqlserver.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT_BIG({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INT), SYSDATETIME())
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(DAY, CAST({{(parameters.max_future_days)}} AS INT), GETDATE())
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INT), GETDATE())
{% else -%}
TRY_CAST({{ lib.render_target_column('analyzed_table') }} AS DATETIME) > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INT), SYSDATETIME())
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT_BIG({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT_BIG(analyzed_table.[target_column]) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
TRY_CAST(analyzed_table.[target_column] AS DATETIME) > DATEADD(SECOND, CAST(0.0 * 86400 AS INT), SYSDATETIME())
THEN 1
ELSE 0
END
) / COUNT_BIG(analyzed_table.[target_column])
END AS actual_value,
analyzed_table.[country] AS grouping_level_1,
analyzed_table.[state] AS grouping_level_2
FROM [your_sql_server_database].[<target_schema>].[<target_table>] AS analyzed_table
GROUP BY analyzed_table.[country], analyzed_table.[state]
    ORDER BY grouping_level_1, grouping_level_2
Teradata
{% import '/dialects/teradata.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > (CURRENT_DATE + INTERVAL {{((parameters.max_future_days) * 1) | int}} DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table."target_column" AS TIMESTAMP) > CURRENT_TIMESTAMP + INTERVAL 0 SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table."country" AS grouping_level_1,
analyzed_table."state" AS grouping_level_2
FROM "<target_schema>"."<target_table>" AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
Trino
{% import '/dialects/trino.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('DAY', CAST({{(parameters.max_future_days)}} AS INTEGER), CURRENT_DATE)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_DATETIME)
{% else -%}
TRY_CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% endif -%}
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
TRY_CAST(analyzed_table."target_column" AS TIMESTAMP) > DATE_ADD('SECOND', CAST(0.0 * 86400 AS INTEGER), CURRENT_TIMESTAMP)
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table.grouping_level_1,
analyzed_table.grouping_level_2
FROM (
SELECT
original_table.*,
original_table."country" AS grouping_level_1,
original_table."state" AS grouping_level_2
FROM "your_trino_catalog"."<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY grouping_level_1, grouping_level_2
ORDER BY grouping_level_1, grouping_level_2
daily partition date values in future percent
Check description
Detects dates in the future in date, datetime and timestamp columns. Measures a percentage of dates in the future. Raises a data quality issue when too many future dates are found. Stores a separate data quality check result for each daily partition.
Data quality check name | Friendly name | Category | Check type | Time scale | Quality dimension | Sensor definition | Quality rule | Standard |
---|---|---|---|---|---|---|---|---|
daily_partition_date_values_in_future_percent | Maximum percentage of rows containing dates in future | datetime | partitioned | daily | Validity | date_values_in_future_percent | max_percent |
Command-line examples
Please expand the section below to see the DQOps command-line examples to run or activate the daily partition date values in future percent data quality check.
Managing daily partition date values in future percent check from DQOps shell
Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. Activates the warning rule with the default parameters.
dqo> check activate -c=connection_name -t=schema_name.table_name -col=column_name -ch=daily_partition_date_values_in_future_percent --enable-warning
You can also use patterns to activate the check on all matching tables and columns.
dqo> check activate -c=connection_name -t=schema_prefix*.fact_* -col=column_name -ch=daily_partition_date_values_in_future_percent --enable-warning
Additional rule parameters are passed using the -Wrule_parameter_name=value.
Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. Activates the error rule with the default parameters.
dqo> check activate -c=connection_name -t=schema_name.table_name -col=column_name -ch=daily_partition_date_values_in_future_percent --enable-error
You can also use patterns to activate the check on all matching tables and columns.
dqo> check activate -c=connection_name -t=schema_prefix*.fact_* -col=column_name -ch=daily_partition_date_values_in_future_percent --enable-error
Additional rule parameters are passed using the -Erule_parameter_name=value.
Run this data quality check using the check run CLI command by providing the check name and all other targeting filters. The following example shows how to run the daily_partition_date_values_in_future_percent check on all tables and columns on a single data source.
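For example, assuming the data source connection is named connection_name, the following command runs the check on every table and column where it is activated on that data source.
dqo> check run -c=connection_name -ch=daily_partition_date_values_in_future_percent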
It is also possible to run this check on a specific connection and table. In order to do this, use the connection name and the full table name parameters.
dqo> check run -c=connection_name -t=schema_name.table_name -ch=daily_partition_date_values_in_future_percent
You can also run this check on all tables (and columns) on which the daily_partition_date_values_in_future_percent check is enabled using patterns to find tables.
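For example, the command below reuses the table and column name patterns from the activation examples above; adjust the patterns to match your own tables and columns.
dqo> check run -c=connection_name -t=schema_prefix*.fact_* -col=column_name -ch=daily_partition_date_values_in_future_percent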
YAML configuration
The sample schema_name.table_name.dqotable.yaml file with the check configured is shown below.
# yaml-language-server: $schema=https://cloud.dqops.com/dqo-yaml-schema/TableYaml-schema.json
apiVersion: dqo/v1
kind: table
spec:
  timestamp_columns:
    partition_by_column: date_column
    incremental_time_window:
      daily_partitioning_recent_days: 7
      monthly_partitioning_recent_months: 1
  columns:
    target_column:
      partitioned_checks:
        daily:
          datetime:
            daily_partition_date_values_in_future_percent:
              parameters:
                max_future_days: 0.0
              warning:
                max_percent: 0.0
              error:
                max_percent: 1.0
              fatal:
                max_percent: 5.0
      labels:
      - This is the column that is analyzed for data quality issues
    date_column:
      labels:
      - "date or datetime column used as a daily or monthly partitioning key, dates\
        \ (and times) are truncated to a day or a month by the sensor's query for\
        \ partitioned checks"
Samples of generated SQL queries for each data source type
Please expand the database engine name section to see the SQL query rendered by a Jinja2 template for the date_values_in_future_percent data quality sensor.
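In every dialect below, the sensor adds max_future_days to the current date or timestamp (converted to seconds as max_future_days * 86400 for datetime and timestamp columns, or used directly as a number of days for date columns) and counts the rows whose value lies beyond that boundary; with max_future_days: 0.0, any value later than the current timestamp counts as a future date.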
BigQuery
{% import '/dialects/bigquery.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS INT64) SECOND)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATE(), INTERVAL CAST({{(parameters.max_future_days)}} AS INT64) DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATETIME_ADD(CURRENT_DATETIME(), INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS INT64) SECOND)
{% else -%}
SAFE_CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS INT64) SECOND)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN SAFE_CAST(analyzed_table.`target_column` AS TIMESTAMP) > TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL CAST(0.0 * 86400 AS INT64) SECOND)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
CAST(analyzed_table.`date_column` AS DATE) AS time_period,
TIMESTAMP(CAST(analyzed_table.`date_column` AS DATE)) AS time_period_utc
FROM `your-google-project-id`.`<target_schema>`.`<target_table>` AS analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
ClickHouse
{% import '/dialects/clickhouse.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > toDateTime64(now(), 3) + INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS Int64) SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > toDate(now()) + INTERVAL CAST({{(parameters.max_future_days)}} AS Int64) DAY
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > toDateTime(now()) + INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS Int64) SECOND
{% else -%}
toDateTime64({{ lib.render_target_column('analyzed_table') }}, 3) > toDateTime64(now(), 3) + INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS Int64) SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN toDateTime64(analyzed_table."target_column", 3) > toDateTime64(now(), 3) + INTERVAL CAST(0.0 * 86400 AS Int64) SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
CAST(analyzed_table."date_column" AS DATE) AS time_period,
toDateTime64(CAST(analyzed_table."date_column" AS DATE), 3) AS time_period_utc
FROM "<target_schema>"."<target_table>" AS analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
Databricks
{% import '/dialects/databricks.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > (CURRENT_DATE() + INTERVAL {{((parameters.max_future_days) * 1) | int}} DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL 0 SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
CAST(analyzed_table.`date_column` AS DATE) AS time_period,
TIMESTAMP(CAST(analyzed_table.`date_column` AS DATE)) AS time_period_utc
FROM `<target_schema>`.`<target_table>` AS analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
DB2
{% import '/dialects/db2.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INT))
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_DAYS(CURRENT_DATE, CAST({{(parameters.max_future_days)}} AS INT))
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INT))
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INT))
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN CAST(analyzed_table."target_column" AS TIMESTAMP) > ADD_SECONDS(CURRENT_TIMESTAMP, CAST(0.0 * 86400 AS INT))
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
time_period,
time_period_utc
FROM (
SELECT
original_table.*,
CAST(original_table."date_column" AS DATE) AS time_period,
TIMESTAMP(CAST(original_table."date_column" AS DATE)) AS time_period_utc
FROM "<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
DuckDB
{% import '/dialects/duckdb.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + INTERVAL ({{(parameters.max_future_days)}} * 1) DAY
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > CURRENT_TIMESTAMP + INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > CURRENT_TIMESTAMP + INTERVAL (0.0 * 86400) SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
CAST(analyzed_table."date_column" AS date) AS time_period,
CAST((CAST(analyzed_table."date_column" AS date)) AS TIMESTAMP WITH TIME ZONE) AS time_period_utc
FROM "<target_schema>"."<target_table>" AS analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
HANA
{% import '/dialects/hana.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER))
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_DAYS(CURRENT_DATE, CAST({{(parameters.max_future_days)}} AS INTEGER))
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER))
{% else -%}
TO_TIMESTAMP({{ lib.render_target_column('analyzed_table') }}) > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER))
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
TO_TIMESTAMP(analyzed_table."target_column") > ADD_SECONDS(CURRENT_TIMESTAMP, CAST(0.0 * 86400 AS INTEGER))
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
time_period,
time_period_utc
FROM (
SELECT
original_table.*,
CAST(original_table."date_column" AS DATE) AS time_period,
TO_TIMESTAMP(CAST(original_table."date_column" AS DATE)) AS time_period_utc
FROM "<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
MariaDB
{% import '/dialects/mariadb.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATE(), INTERVAL ({{(parameters.max_future_days)}}) DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATETIME(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL (0.0 * 86400) SECOND)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
DATE_FORMAT(analyzed_table.`date_column`, '%Y-%m-%d 00:00:00') AS time_period,
FROM_UNIXTIME(UNIX_TIMESTAMP(DATE_FORMAT(analyzed_table.`date_column`, '%Y-%m-%d 00:00:00'))) AS time_period_utc
FROM `<target_table>` AS analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
MySQL
{% import '/dialects/mysql.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATE(), INTERVAL ({{(parameters.max_future_days)}}) DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATETIME(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL (0.0 * 86400) SECOND)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
DATE_FORMAT(analyzed_table.`date_column`, '%Y-%m-%d 00:00:00') AS time_period,
FROM_UNIXTIME(UNIX_TIMESTAMP(DATE_FORMAT(analyzed_table.`date_column`, '%Y-%m-%d 00:00:00'))) AS time_period_utc
FROM `<target_table>` AS analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
Oracle
{% import '/dialects/oracle.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + numToDSInterval( CAST( ({{(parameters.max_future_days)}} * 86400) AS INTEGER), 'second' )
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + numToDSInterval( CAST( ({{(parameters.max_future_days)}}) AS INTEGER), 'day' )
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME + numToDSInterval( CAST( ({{(parameters.max_future_days)}} * 86400) AS INTEGER), 'second' )
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP + numToDSInterval( CAST( ({{(parameters.max_future_days)}} * 86400) AS INTEGER), 'second' )
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table."target_column" AS TIMESTAMP) > CURRENT_TIMESTAMP + numToDSInterval( CAST( (0.0 * 86400) AS INTEGER), 'second' )
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
time_period,
time_period_utc
FROM (
SELECT
original_table.*,
TRUNC(CAST(original_table."date_column" AS DATE)) AS time_period,
CAST(TRUNC(CAST(original_table."date_column" AS DATE)) AS TIMESTAMP WITH TIME ZONE) AS time_period_utc
FROM "<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
PostgreSQL
{% import '/dialects/postgresql.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + make_interval(days => ({{(parameters.max_future_days)}})::int)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => (0.0 * 86400)::int)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
CAST(analyzed_table."date_column" AS date) AS time_period,
CAST((CAST(analyzed_table."date_column" AS date)) AS TIMESTAMP WITH TIME ZONE) AS time_period_utc
FROM "your_postgresql_database"."<target_schema>"."<target_table>" AS analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
Presto
{% import '/dialects/presto.sql.jinja2' as lib with context -%}
{% macro render_value_in_future() -%}
{%- endmacro -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('DAY', CAST({{(parameters.max_future_days)}} AS INTEGER), CURRENT_DATE)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_DATETIME)
{% else -%}
TRY_CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% endif -%}
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
TRY_CAST(analyzed_table."target_column" AS TIMESTAMP) > DATE_ADD('SECOND', CAST(0.0 * 86400 AS INTEGER), CURRENT_TIMESTAMP)
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT(analyzed_table."target_column")
END AS actual_value,
time_period,
time_period_utc
FROM (
SELECT
original_table.*,
CAST(original_table."date_column" AS date) AS time_period,
CAST(CAST(original_table."date_column" AS date) AS TIMESTAMP) AS time_period_utc
FROM "your_trino_database"."<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
QuestDB
{% import '/dialects/questdb.sql.jinja2' as lib with context -%}
SELECT
COALESCE(100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD('s', ({{(parameters.max_future_days)}})::int * 86400, NOW())
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD('d', ({{(parameters.max_future_days)}})::int, TODAY())
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD('s', ({{(parameters.max_future_days)}})::int * 86400, NOW())
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > DATEADD('s', ({{(parameters.max_future_days)}})::int * 86400, NOW())
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }}), 0.0)
AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM(
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
COALESCE(100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > DATEADD('s', (0.0)::int * 86400, NOW())
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column"), 0.0)
AS actual_value,
time_period,
time_period_utc
FROM(
SELECT
original_table.*,
CAST(DATE_TRUNC('day', original_table."date_column") AS DATE) AS time_period,
CAST((CAST(DATE_TRUNC('day', original_table."date_column") AS DATE)) AS TIMESTAMP WITH TIME ZONE) AS time_period_utc
FROM "<target_table>" original_table
) analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
Redshift
{% import '/dialects/redshift.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + make_interval(days => ({{(parameters.max_future_days)}})::int)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => (0.0 * 86400)::int)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
CAST(analyzed_table."date_column" AS date) AS time_period,
CAST((CAST(analyzed_table."date_column" AS date)) AS TIMESTAMP WITH TIME ZONE) AS time_period_utc
FROM "your_redshift_database"."<target_schema>"."<target_table>" AS analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
Snowflake
{% import '/dialects/snowflake.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > TIMESTAMPADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(DAY, CAST({{(parameters.max_future_days)}} AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% else -%}
TRY_TO_TIMESTAMP({{ lib.render_target_column('analyzed_table') }}) > TIMESTAMPADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
TRY_TO_TIMESTAMP(analyzed_table."target_column") > TIMESTAMPADD(SECOND, CAST(0.0 * 86400 AS INTEGER), CURRENT_TIMESTAMP)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
CAST(analyzed_table."date_column" AS date) AS time_period,
TO_TIMESTAMP(CAST(analyzed_table."date_column" AS date)) AS time_period_utc
FROM "your_snowflake_database"."<target_schema>"."<target_table>" AS analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
Spark
{% import '/dialects/spark.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > (CURRENT_DATE() + INTERVAL {{((parameters.max_future_days) * 1) | int}} DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL 0 SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
CAST(analyzed_table.`date_column` AS DATE) AS time_period,
TIMESTAMP(CAST(analyzed_table.`date_column` AS DATE)) AS time_period_utc
FROM `<target_schema>`.`<target_table>` AS analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
SQL Server
{% import '/dialects/sqlserver.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT_BIG({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INT), SYSDATETIME())
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(DAY, CAST({{(parameters.max_future_days)}} AS INT), GETDATE())
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INT), GETDATE())
{% else -%}
TRY_CAST({{ lib.render_target_column('analyzed_table') }} AS DATETIME) > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INT), SYSDATETIME())
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT_BIG({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT_BIG(analyzed_table.[target_column]) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
TRY_CAST(analyzed_table.[target_column] AS DATETIME) > DATEADD(SECOND, CAST(0.0 * 86400 AS INT), SYSDATETIME())
THEN 1
ELSE 0
END
) / COUNT_BIG(analyzed_table.[target_column])
END AS actual_value,
CAST(analyzed_table.[date_column] AS date) AS time_period,
CAST((CAST(analyzed_table.[date_column] AS date)) AS DATETIME) AS time_period_utc
FROM [your_sql_server_database].[<target_schema>].[<target_table>] AS analyzed_table
GROUP BY CAST(analyzed_table.[date_column] AS date), CAST(analyzed_table.[date_column] AS date)
ORDER BY CAST(analyzed_table.[date_column] AS date)
Teradata
{% import '/dialects/teradata.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > (CURRENT_DATE + INTERVAL {{((parameters.max_future_days) * 1) | int}} DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table."target_column" AS TIMESTAMP) > CURRENT_TIMESTAMP + INTERVAL 0 SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
CAST(analyzed_table."date_column" AS DATE) AS time_period,
CAST(CAST(analyzed_table."date_column" AS DATE) AS TIMESTAMP) AS time_period_utc
FROM "<target_schema>"."<target_table>" AS analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
Trino
{% import '/dialects/trino.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('DAY', CAST({{(parameters.max_future_days)}} AS INTEGER), CURRENT_DATE)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_DATETIME)
{% else -%}
TRY_CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% endif -%}
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
TRY_CAST(analyzed_table."target_column" AS TIMESTAMP) > DATE_ADD('SECOND', CAST(0.0 * 86400 AS INTEGER), CURRENT_TIMESTAMP)
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT(analyzed_table."target_column")
END AS actual_value,
time_period,
time_period_utc
FROM (
SELECT
original_table.*,
CAST(original_table."date_column" AS date) AS time_period,
CAST(CAST(original_table."date_column" AS date) AS TIMESTAMP) AS time_period_utc
FROM "your_trino_catalog"."<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
Expand the Configure with data grouping section to see additional examples for configuring this data quality check to use data grouping (GROUP BY).
Configuration with data grouping
Sample configuration with data grouping enabled (YAML)
The sample below shows how to configure the data grouping and how it affects the generated SQL query.
# yaml-language-server: $schema=https://cloud.dqops.com/dqo-yaml-schema/TableYaml-schema.json
apiVersion: dqo/v1
kind: table
spec:
  timestamp_columns:
    partition_by_column: date_column
    incremental_time_window:
      daily_partitioning_recent_days: 7
      monthly_partitioning_recent_months: 1
  default_grouping_name: group_by_country_and_state
  groupings:
    group_by_country_and_state:
      level_1:
        source: column_value
        column: country
      level_2:
        source: column_value
        column: state
  columns:
    target_column:
      partitioned_checks:
        daily:
          datetime:
            daily_partition_date_values_in_future_percent:
              parameters:
                max_future_days: 0.0
              warning:
                max_percent: 0.0
              error:
                max_percent: 1.0
              fatal:
                max_percent: 5.0
      labels:
      - This is the column that is analyzed for data quality issues
    date_column:
      labels:
      - "date or datetime column used as a daily or monthly partitioning key, dates\
        \ (and times) are truncated to a day or a month by the sensor's query for\
        \ partitioned checks"
    country:
      labels:
      - column used as the first grouping key
    state:
      labels:
      - column used as the second grouping key
Please expand the database engine name section to see the SQL query rendered by a Jinja2 template for the date_values_in_future_percent sensor.
BigQuery
{% import '/dialects/bigquery.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS INT64) SECOND)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATE(), INTERVAL CAST({{(parameters.max_future_days)}} AS INT64) DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATETIME_ADD(CURRENT_DATETIME(), INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS INT64) SECOND)
{% else -%}
SAFE_CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS INT64) SECOND)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN SAFE_CAST(analyzed_table.`target_column` AS TIMESTAMP) > TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL CAST(0.0 * 86400 AS INT64) SECOND)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
analyzed_table.`country` AS grouping_level_1,
analyzed_table.`state` AS grouping_level_2,
CAST(analyzed_table.`date_column` AS DATE) AS time_period,
TIMESTAMP(CAST(analyzed_table.`date_column` AS DATE)) AS time_period_utc
FROM `your-google-project-id`.`<target_schema>`.`<target_table>` AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ClickHouse
{% import '/dialects/clickhouse.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > toDateTime64(now(), 3) + INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS Int64) SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > toDate(now()) + INTERVAL CAST({{(parameters.max_future_days)}} AS Int64) DAY
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > toDateTime(now()) + INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS Int64) SECOND
{% else -%}
toDateTime64({{ lib.render_target_column('analyzed_table') }}, 3) > toDateTime64(now(), 3) + INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS Int64) SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN toDateTime64(analyzed_table."target_column", 3) > toDateTime64(now(), 3) + INTERVAL CAST(0.0 * 86400 AS Int64) SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table."country" AS grouping_level_1,
analyzed_table."state" AS grouping_level_2,
CAST(analyzed_table."date_column" AS DATE) AS time_period,
toDateTime64(CAST(analyzed_table."date_column" AS DATE), 3) AS time_period_utc
FROM "<target_schema>"."<target_table>" AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
Databricks
{% import '/dialects/databricks.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > (CURRENT_DATE() + INTERVAL {{((parameters.max_future_days) * 1) | int}} DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL 0 SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
analyzed_table.`country` AS grouping_level_1,
analyzed_table.`state` AS grouping_level_2,
CAST(analyzed_table.`date_column` AS DATE) AS time_period,
TIMESTAMP(CAST(analyzed_table.`date_column` AS DATE)) AS time_period_utc
FROM `<target_schema>`.`<target_table>` AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
DB2
{% import '/dialects/db2.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INT))
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_DAYS(CURRENT_DATE, CAST({{(parameters.max_future_days)}} AS INT))
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INT))
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INT))
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN CAST(analyzed_table."target_column" AS TIMESTAMP) > ADD_SECONDS(CURRENT_TIMESTAMP, CAST(0.0 * 86400 AS INT))
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table.grouping_level_1,
analyzed_table.grouping_level_2,
time_period,
time_period_utc
FROM (
SELECT
original_table.*,
original_table."country" AS grouping_level_1,
original_table."state" AS grouping_level_2,
CAST(original_table."date_column" AS DATE) AS time_period,
TIMESTAMP(CAST(original_table."date_column" AS DATE)) AS time_period_utc
FROM "<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
DuckDB
{% import '/dialects/duckdb.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + INTERVAL ({{(parameters.max_future_days)}} * 1) DAY
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > CURRENT_TIMESTAMP + INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > CURRENT_TIMESTAMP + INTERVAL (0.0 * 86400) SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table."country" AS grouping_level_1,
analyzed_table."state" AS grouping_level_2,
CAST(analyzed_table."date_column" AS date) AS time_period,
CAST((CAST(analyzed_table."date_column" AS date)) AS TIMESTAMP WITH TIME ZONE) AS time_period_utc
FROM "<target_schema>"."<target_table>" AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
HANA
{% import '/dialects/hana.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER))
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_DAYS(CURRENT_DATE, CAST({{(parameters.max_future_days)}} AS INTEGER))
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER))
{% else -%}
TO_TIMESTAMP({{ lib.render_target_column('analyzed_table') }}) > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER))
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
TO_TIMESTAMP(analyzed_table."target_column") > ADD_SECONDS(CURRENT_TIMESTAMP, CAST(0.0 * 86400 AS INTEGER))
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table.grouping_level_1,
analyzed_table.grouping_level_2,
time_period,
time_period_utc
FROM (
SELECT
original_table.*,
original_table."country" AS grouping_level_1,
original_table."state" AS grouping_level_2,
CAST(original_table."date_column" AS DATE) AS time_period,
TO_TIMESTAMP(CAST(original_table."date_column" AS DATE)) AS time_period_utc
FROM "<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
MariaDB
{% import '/dialects/mariadb.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATE(), INTERVAL ({{(parameters.max_future_days)}}) DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATETIME(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL (0.0 * 86400) SECOND)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
analyzed_table.`country` AS grouping_level_1,
analyzed_table.`state` AS grouping_level_2,
DATE_FORMAT(analyzed_table.`date_column`, '%Y-%m-%d 00:00:00') AS time_period,
FROM_UNIXTIME(UNIX_TIMESTAMP(DATE_FORMAT(analyzed_table.`date_column`, '%Y-%m-%d 00:00:00'))) AS time_period_utc
FROM `<target_table>` AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
MySQL
{% import '/dialects/mysql.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATE(), INTERVAL ({{(parameters.max_future_days)}}) DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATETIME(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL (0.0 * 86400) SECOND)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
analyzed_table.`country` AS grouping_level_1,
analyzed_table.`state` AS grouping_level_2,
DATE_FORMAT(analyzed_table.`date_column`, '%Y-%m-%d 00:00:00') AS time_period,
FROM_UNIXTIME(UNIX_TIMESTAMP(DATE_FORMAT(analyzed_table.`date_column`, '%Y-%m-%d 00:00:00'))) AS time_period_utc
FROM `<target_table>` AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
Oracle
{% import '/dialects/oracle.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + numToDSInterval( CAST( ({{(parameters.max_future_days)}} * 86400) AS INTEGER), 'second' )
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + numToDSInterval( CAST( ({{(parameters.max_future_days)}}) AS INTEGER), 'day' )
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME + numToDSInterval( CAST( ({{(parameters.max_future_days)}} * 86400) AS INTEGER), 'second' )
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP + numToDSInterval( CAST( ({{(parameters.max_future_days)}} * 86400) AS INTEGER), 'second' )
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table."target_column" AS TIMESTAMP) > CURRENT_TIMESTAMP + numToDSInterval( CAST( (0.0 * 86400) AS INTEGER), 'second' )
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table.grouping_level_1,
analyzed_table.grouping_level_2,
time_period,
time_period_utc
FROM (
SELECT
original_table.*,
original_table."country" AS grouping_level_1,
original_table."state" AS grouping_level_2,
TRUNC(CAST(original_table."date_column" AS DATE)) AS time_period,
CAST(TRUNC(CAST(original_table."date_column" AS DATE)) AS TIMESTAMP WITH TIME ZONE) AS time_period_utc
FROM "<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
PostgreSQL
{% import '/dialects/postgresql.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + make_interval(days => ({{(parameters.max_future_days)}})::int)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => (0.0 * 86400)::int)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table."country" AS grouping_level_1,
analyzed_table."state" AS grouping_level_2,
CAST(analyzed_table."date_column" AS date) AS time_period,
CAST((CAST(analyzed_table."date_column" AS date)) AS TIMESTAMP WITH TIME ZONE) AS time_period_utc
FROM "your_postgresql_database"."<target_schema>"."<target_table>" AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
Presto
{% import '/dialects/presto.sql.jinja2' as lib with context -%}
{% macro render_value_in_future() -%}
{%- endmacro -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('DAY', CAST({{(parameters.max_future_days)}} AS INTEGER), CURRENT_DATE)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_DATETIME)
{% else -%}
TRY_CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% endif -%}
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
TRY_CAST(analyzed_table."target_column" AS TIMESTAMP) > DATE_ADD('SECOND', CAST(0.0 * 86400 AS INTEGER), CURRENT_TIMESTAMP)
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table.grouping_level_1,
analyzed_table.grouping_level_2,
time_period,
time_period_utc
FROM (
SELECT
original_table.*,
original_table."country" AS grouping_level_1,
original_table."state" AS grouping_level_2,
CAST(original_table."date_column" AS date) AS time_period,
CAST(CAST(original_table."date_column" AS date) AS TIMESTAMP) AS time_period_utc
FROM "your_trino_database"."<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
QuestDB
{% import '/dialects/questdb.sql.jinja2' as lib with context -%}
SELECT
COALESCE(100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD('s', ({{(parameters.max_future_days)}})::int * 86400, NOW())
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD('d', ({{(parameters.max_future_days)}})::int, TODAY())
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD('s', ({{(parameters.max_future_days)}})::int * 86400, NOW())
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > DATEADD('s', ({{(parameters.max_future_days)}})::int * 86400, NOW())
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }}), 0.0)
AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM(
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
COALESCE(100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > DATEADD('s', (0.0)::int * 86400, NOW())
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column"), 0.0)
AS actual_value,
analyzed_table.grouping_level_1,
analyzed_table.grouping_level_2,
time_period,
time_period_utc
FROM(
SELECT
original_table.*,
original_table."country" AS grouping_level_1,
original_table."state" AS grouping_level_2,
CAST(DATE_TRUNC('day', original_table."date_column") AS DATE) AS time_period,
CAST((CAST(DATE_TRUNC('day', original_table."date_column") AS DATE)) AS TIMESTAMP WITH TIME ZONE) AS time_period_utc
FROM "<target_table>" original_table
) analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
Redshift
{% import '/dialects/redshift.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + make_interval(days => ({{(parameters.max_future_days)}})::int)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => (0.0 * 86400)::int)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table."country" AS grouping_level_1,
analyzed_table."state" AS grouping_level_2,
CAST(analyzed_table."date_column" AS date) AS time_period,
CAST((CAST(analyzed_table."date_column" AS date)) AS TIMESTAMP WITH TIME ZONE) AS time_period_utc
FROM "your_redshift_database"."<target_schema>"."<target_table>" AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
Snowflake
{% import '/dialects/snowflake.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > TIMESTAMPADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(DAY, CAST({{(parameters.max_future_days)}} AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% else -%}
TRY_TO_TIMESTAMP({{ lib.render_target_column('analyzed_table') }}) > TIMESTAMPADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
TRY_TO_TIMESTAMP(analyzed_table."target_column") > TIMESTAMPADD(SECOND, CAST(0.0 * 86400 AS INTEGER), CURRENT_TIMESTAMP)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table."country" AS grouping_level_1,
analyzed_table."state" AS grouping_level_2,
CAST(analyzed_table."date_column" AS date) AS time_period,
TO_TIMESTAMP(CAST(analyzed_table."date_column" AS date)) AS time_period_utc
FROM "your_snowflake_database"."<target_schema>"."<target_table>" AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
Spark
{% import '/dialects/spark.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > (CURRENT_DATE() + INTERVAL {{((parameters.max_future_days) * 1) | int}} DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL 0 SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
analyzed_table.`country` AS grouping_level_1,
analyzed_table.`state` AS grouping_level_2,
CAST(analyzed_table.`date_column` AS DATE) AS time_period,
TIMESTAMP(CAST(analyzed_table.`date_column` AS DATE)) AS time_period_utc
FROM `<target_schema>`.`<target_table>` AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
SQL Server
{% import '/dialects/sqlserver.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT_BIG({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INT), SYSDATETIME())
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(DAY, CAST({{(parameters.max_future_days)}} AS INT), GETDATE())
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INT), GETDATE())
{% else -%}
TRY_CAST({{ lib.render_target_column('analyzed_table') }} AS DATETIME) > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INT), SYSDATETIME())
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT_BIG({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT_BIG(analyzed_table.[target_column]) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
TRY_CAST(analyzed_table.[target_column] AS DATETIME) > DATEADD(SECOND, CAST(0.0 * 86400 AS INT), SYSDATETIME())
THEN 1
ELSE 0
END
) / COUNT_BIG(analyzed_table.[target_column])
END AS actual_value,
analyzed_table.[country] AS grouping_level_1,
analyzed_table.[state] AS grouping_level_2,
CAST(analyzed_table.[date_column] AS date) AS time_period,
CAST((CAST(analyzed_table.[date_column] AS date)) AS DATETIME) AS time_period_utc
FROM [your_sql_server_database].[<target_schema>].[<target_table>] AS analyzed_table
GROUP BY analyzed_table.[country], analyzed_table.[state], CAST(analyzed_table.[date_column] AS date), CAST(analyzed_table.[date_column] AS date)
ORDER BY analyzed_table.[country], analyzed_table.[state], CAST(analyzed_table.[date_column] AS date)
Teradata
{% import '/dialects/teradata.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > (CURRENT_DATE + INTERVAL {{((parameters.max_future_days) * 1) | int}} DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table."target_column" AS TIMESTAMP) > CURRENT_TIMESTAMP + INTERVAL 0 SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table."country" AS grouping_level_1,
analyzed_table."state" AS grouping_level_2,
CAST(analyzed_table."date_column" AS DATE) AS time_period,
CAST(CAST(analyzed_table."date_column" AS DATE) AS TIMESTAMP) AS time_period_utc
FROM "<target_schema>"."<target_table>" AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
Trino
{% import '/dialects/trino.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('DAY', CAST({{(parameters.max_future_days)}} AS INTEGER), CURRENT_DATE)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_DATETIME)
{% else -%}
TRY_CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% endif -%}
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
TRY_CAST(analyzed_table."target_column" AS TIMESTAMP) > DATE_ADD('SECOND', CAST(0.0 * 86400 AS INTEGER), CURRENT_TIMESTAMP)
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table.grouping_level_1,
analyzed_table.grouping_level_2,
time_period,
time_period_utc
FROM (
SELECT
original_table.*,
original_table."country" AS grouping_level_1,
original_table."state" AS grouping_level_2,
CAST(original_table."date_column" AS date) AS time_period,
CAST(CAST(original_table."date_column" AS date) AS TIMESTAMP) AS time_period_utc
FROM "your_trino_catalog"."<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
monthly partition date values in future percent
Check description
Detects dates in the future in date, datetime and timestamp columns. Measures a percentage of dates in the future. Raises a data quality issue when too many future dates are found. Stores a separate data quality check result for each monthly partition.
Data quality check name | Friendly name | Category | Check type | Time scale | Quality dimension | Sensor definition | Quality rule | Standard |
---|---|---|---|---|---|---|---|---|
monthly_partition_date_values_in_future_percent | Maximum percentage of rows containing dates in future | datetime | partitioned | monthly | Validity | date_values_in_future_percent | max_percent |
Command-line examples
Please expand the section below to see the DQOps command-line examples to run or activate the monthly partition date values in future percent data quality check.
Managing monthly partition date values in future percent check from DQOps shell
Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. This activates the warning rule with the default parameters.
dqo> check activate -c=connection_name -t=schema_name.table_name -col=column_name -ch=monthly_partition_date_values_in_future_percent --enable-warning
You can also use patterns to activate the check on all matching tables and columns.
dqo> check activate -c=connection_name -t=schema_prefix*.fact_* -col=column_name -ch=monthly_partition_date_values_in_future_percent --enable-warning
Additional rule parameters are passed using the -Wrule_parameter_name=value.
Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. This activates the error rule with the default parameters.
dqo> check activate -c=connection_name -t=schema_name.table_name -col=column_name -ch=monthly_partition_date_values_in_future_percent --enable-error
You can also use patterns to activate the check on all matching tables and columns.
dqo> check activate -c=connection_name -t=schema_prefix*.fact_* -col=column_name -ch=monthly_partition_date_values_in_future_percent --enable-error
Additional rule parameters are passed using the -Erule_parameter_name=value.
Run this data quality check using the check run CLI command by providing the check name and all other targeting filters. The following example shows how to run the monthly_partition_date_values_in_future_percent check on all tables and columns on a single data source.
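For example, a run limited to a single data source could look like the command below, filtering only by the connection name and the check name (both values are placeholders to replace with your own):
dqo> check run -c=connection_name -ch=monthly_partition_date_values_in_future_percent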
It is also possible to run this check on a specific connection and table. In order to do this, use the connection name and the full table name parameters.
dqo> check run -c=connection_name -t=schema_name.table_name -ch=monthly_partition_date_values_in_future_percent
You can also run this check on all tables (and columns) on which the monthly_partition_date_values_in_future_percent check is enabled using patterns to find tables.
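For example, the following run reuses the table name pattern and column filter shown in the activation examples above; the pattern and column name are illustrative placeholders:
dqo> check run -c=connection_name -t=schema_prefix*.fact_* -col=column_name -ch=monthly_partition_date_values_in_future_percent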
YAML configuration
The sample schema_name.table_name.dqotable.yaml file with the check configured is shown below.
# yaml-language-server: $schema=https://cloud.dqops.com/dqo-yaml-schema/TableYaml-schema.json
apiVersion: dqo/v1
kind: table
spec:
  timestamp_columns:
    partition_by_column: date_column
    incremental_time_window:
      daily_partitioning_recent_days: 7
      monthly_partitioning_recent_months: 1
  columns:
    target_column:
      partitioned_checks:
        monthly:
          datetime:
            monthly_partition_date_values_in_future_percent:
              parameters:
                max_future_days: 0.0
              warning:
                max_percent: 0.0
              error:
                max_percent: 1.0
              fatal:
                max_percent: 5.0
      labels:
      - This is the column that is analyzed for data quality issues
    date_column:
      labels:
      - "date or datetime column used as a daily or monthly partitioning key, dates\
        \ (and times) are truncated to a day or a month by the sensor's query for\
        \ partitioned checks"
Samples of generated SQL queries for each data source type
Please expand the database engine name section to see the SQL query rendered by a Jinja2 template for the date_values_in_future_percent data quality sensor.
BigQuery
{% import '/dialects/bigquery.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS INT64) SECOND)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATE(), INTERVAL CAST({{(parameters.max_future_days)}} AS INT64) DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATETIME_ADD(CURRENT_DATETIME(), INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS INT64) SECOND)
{% else -%}
SAFE_CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS INT64) SECOND)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN SAFE_CAST(analyzed_table.`target_column` AS TIMESTAMP) > TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL CAST(0.0 * 86400 AS INT64) SECOND)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
DATE_TRUNC(CAST(analyzed_table.`date_column` AS DATE), MONTH) AS time_period,
TIMESTAMP(DATE_TRUNC(CAST(analyzed_table.`date_column` AS DATE), MONTH)) AS time_period_utc
FROM `your-google-project-id`.`<target_schema>`.`<target_table>` AS analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
ClickHouse
{% import '/dialects/clickhouse.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > toDateTime64(now(), 3) + INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS Int64) SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > toDate(now()) + INTERVAL CAST({{(parameters.max_future_days)}} AS Int64) DAY
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > toDateTime(now()) + INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS Int64) SECOND
{% else -%}
toDateTime64({{ lib.render_target_column('analyzed_table') }}, 3) > toDateTime64(now(), 3) + INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS Int64) SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN toDateTime64(analyzed_table."target_column", 3) > toDateTime64(now(), 3) + INTERVAL CAST(0.0 * 86400 AS Int64) SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
DATE_TRUNC('month', CAST(analyzed_table."date_column" AS DATE)) AS time_period,
toDateTime64(DATE_TRUNC('month', CAST(analyzed_table."date_column" AS DATE)), 3) AS time_period_utc
FROM "<target_schema>"."<target_table>" AS analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
Databricks
{% import '/dialects/databricks.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > (CURRENT_DATE() + INTERVAL {{((parameters.max_future_days) * 1) | int}} DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL 0 SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
DATE_TRUNC('MONTH', CAST(analyzed_table.`date_column` AS DATE)) AS time_period,
TIMESTAMP(DATE_TRUNC('MONTH', CAST(analyzed_table.`date_column` AS DATE))) AS time_period_utc
FROM `<target_schema>`.`<target_table>` AS analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
DB2
{% import '/dialects/db2.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INT))
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_DAYS(CURRENT_DATE, CAST({{(parameters.max_future_days)}} AS INT))
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INT))
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INT))
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN CAST(analyzed_table."target_column" AS TIMESTAMP) > ADD_SECONDS(CURRENT_TIMESTAMP, CAST(0.0 * 86400 AS INT))
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
time_period,
time_period_utc
FROM (
SELECT
original_table.*,
DATE_TRUNC('MONTH', CAST(original_table."date_column" AS DATE)) AS time_period,
TIMESTAMP(DATE_TRUNC('MONTH', CAST(original_table."date_column" AS DATE))) AS time_period_utc
FROM "<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
DuckDB
{% import '/dialects/duckdb.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + INTERVAL ({{(parameters.max_future_days)}} * 1) DAY
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > CURRENT_TIMESTAMP + INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > CURRENT_TIMESTAMP + INTERVAL (0.0 * 86400) SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
DATE_TRUNC('MONTH', CAST(analyzed_table."date_column" AS date)) AS time_period,
CAST((DATE_TRUNC('MONTH', CAST(analyzed_table."date_column" AS date))) AS TIMESTAMP WITH TIME ZONE) AS time_period_utc
FROM AS analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
HANA
{% import '/dialects/hana.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER))
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_DAYS(CURRENT_DATE, CAST({{(parameters.max_future_days)}} AS INTEGER))
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER))
{% else -%}
TO_TIMESTAMP({{ lib.render_target_column('analyzed_table') }}) > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER))
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
TO_TIMESTAMP(analyzed_table."target_column") > ADD_SECONDS(CURRENT_TIMESTAMP, CAST(0.0 * 86400 AS INTEGER))
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
time_period,
time_period_utc
FROM (
SELECT
original_table.*,
SERIES_ROUND(CAST(original_table."date_column" AS DATE), 'INTERVAL 1 MONTH', ROUND_DOWN) AS time_period,
TO_TIMESTAMP(SERIES_ROUND(CAST(original_table."date_column" AS DATE), 'INTERVAL 1 MONTH', ROUND_DOWN)) AS time_period_utc
FROM "<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
MariaDB
{% import '/dialects/mariadb.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATE(), INTERVAL ({{(parameters.max_future_days)}}) DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATETIME(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL (0.0 * 86400) SECOND)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
DATE_FORMAT(analyzed_table.`date_column`, '%Y-%m-01 00:00:00') AS time_period,
FROM_UNIXTIME(UNIX_TIMESTAMP(DATE_FORMAT(analyzed_table.`date_column`, '%Y-%m-01 00:00:00'))) AS time_period_utc
FROM `<target_table>` AS analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
MySQL
{% import '/dialects/mysql.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATE(), INTERVAL ({{(parameters.max_future_days)}}) DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATETIME(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL (0.0 * 86400) SECOND)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
DATE_FORMAT(analyzed_table.`date_column`, '%Y-%m-01 00:00:00') AS time_period,
FROM_UNIXTIME(UNIX_TIMESTAMP(DATE_FORMAT(analyzed_table.`date_column`, '%Y-%m-01 00:00:00'))) AS time_period_utc
FROM `<target_table>` AS analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
Oracle
{% import '/dialects/oracle.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + numToDSInterval( CAST( ({{(parameters.max_future_days)}} * 86400) AS INTEGER), 'second' )
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + numToDSInterval( CAST( ({{(parameters.max_future_days)}}) AS INTEGER), 'day' )
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME + numToDSInterval( CAST( ({{(parameters.max_future_days)}} * 86400) AS INTEGER), 'second' )
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP + numToDSInterval( CAST( ({{(parameters.max_future_days)}} * 86400) AS INTEGER), 'second' )
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table."target_column" AS TIMESTAMP) > CURRENT_TIMESTAMP + numToDSInterval( CAST( (0.0 * 86400) AS INTEGER), 'second' )
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
time_period,
time_period_utc
FROM (
SELECT
original_table.*,
TRUNC(CAST(original_table."date_column" AS DATE), 'MONTH') AS time_period,
CAST(TRUNC(CAST(original_table."date_column" AS DATE), 'MONTH') AS TIMESTAMP WITH TIME ZONE) AS time_period_utc
FROM "<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
PostgreSQL
{% import '/dialects/postgresql.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + make_interval(days => ({{(parameters.max_future_days)}})::int)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => (0.0 * 86400)::int)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
DATE_TRUNC('MONTH', CAST(analyzed_table."date_column" AS date)) AS time_period,
CAST((DATE_TRUNC('MONTH', CAST(analyzed_table."date_column" AS date))) AS TIMESTAMP WITH TIME ZONE) AS time_period_utc
FROM "your_postgresql_database"."<target_schema>"."<target_table>" AS analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
Presto
{% import '/dialects/presto.sql.jinja2' as lib with context -%}
{% macro render_value_in_future() -%}
{%- endmacro -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('DAY', CAST({{(parameters.max_future_days)}} AS INTEGER), CURRENT_DATE)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_DATETIME)
{% else -%}
TRY_CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% endif -%}
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
TRY_CAST(analyzed_table."target_column" AS TIMESTAMP) > DATE_ADD('SECOND', CAST(0.0 * 86400 AS INTEGER), CURRENT_TIMESTAMP)
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT(analyzed_table."target_column")
END AS actual_value,
time_period,
time_period_utc
FROM (
SELECT
original_table.*,
DATE_TRUNC('MONTH', CAST(original_table."date_column" AS date)) AS time_period,
CAST(DATE_TRUNC('MONTH', CAST(original_table."date_column" AS date)) AS TIMESTAMP) AS time_period_utc
FROM "your_trino_database"."<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
QuestDB
{% import '/dialects/questdb.sql.jinja2' as lib with context -%}
SELECT
COALESCE(100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD('s', ({{(parameters.max_future_days)}})::int * 86400, NOW())
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD('d', ({{(parameters.max_future_days)}})::int, TODAY())
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD('s', ({{(parameters.max_future_days)}})::int * 86400, NOW())
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > DATEADD('s', ({{(parameters.max_future_days)}})::int * 86400, NOW())
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }}), 0.0)
AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM(
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
COALESCE(100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > DATEADD('s', (0.0)::int * 86400, NOW())
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column"), 0.0)
AS actual_value,
time_period,
time_period_utc
FROM(
SELECT
original_table.*,
CAST(DATE_TRUNC('month', original_table."date_column") AS DATE) AS time_period,
CAST((CAST(DATE_TRUNC('month', original_table."date_column") AS DATE)) AS TIMESTAMP WITH TIME ZONE) AS time_period_utc
FROM "<target_table>" original_table
) analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
Redshift
{% import '/dialects/redshift.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + make_interval(days => ({{(parameters.max_future_days)}})::int)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => (0.0 * 86400)::int)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
DATE_TRUNC('MONTH', CAST(analyzed_table."date_column" AS date)) AS time_period,
CAST((DATE_TRUNC('MONTH', CAST(analyzed_table."date_column" AS date))) AS TIMESTAMP WITH TIME ZONE) AS time_period_utc
FROM "your_redshift_database"."<target_schema>"."<target_table>" AS analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
Snowflake
{% import '/dialects/snowflake.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > TIMESTAMPADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(DAY, CAST({{(parameters.max_future_days)}} AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% else -%}
TRY_TO_TIMESTAMP({{ lib.render_target_column('analyzed_table') }}) > TIMESTAMPADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
TRY_TO_TIMESTAMP(analyzed_table."target_column") > TIMESTAMPADD(SECOND, CAST(0.0 * 86400 AS INTEGER), CURRENT_TIMESTAMP)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
DATE_TRUNC('MONTH', CAST(analyzed_table."date_column" AS date)) AS time_period,
TO_TIMESTAMP(DATE_TRUNC('MONTH', CAST(analyzed_table."date_column" AS date))) AS time_period_utc
FROM "your_snowflake_database"."<target_schema>"."<target_table>" AS analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
Spark
{% import '/dialects/spark.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > (CURRENT_DATE() + INTERVAL {{((parameters.max_future_days) * 1) | int}} DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL 0 SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
DATE_TRUNC('MONTH', CAST(analyzed_table.`date_column` AS DATE)) AS time_period,
TIMESTAMP(DATE_TRUNC('MONTH', CAST(analyzed_table.`date_column` AS DATE))) AS time_period_utc
FROM `<target_schema>`.`<target_table>` AS analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
SQL Server
{% import '/dialects/sqlserver.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT_BIG({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INT), SYSDATETIME())
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(DAY, CAST({{(parameters.max_future_days)}} AS INT), GETDATE())
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INT), GETDATE())
{% else -%}
TRY_CAST({{ lib.render_target_column('analyzed_table') }} AS DATETIME) > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INT), SYSDATETIME())
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT_BIG({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT_BIG(analyzed_table.[target_column]) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
TRY_CAST(analyzed_table.[target_column] AS DATETIME) > DATEADD(SECOND, CAST(0.0 * 86400 AS INT), SYSDATETIME())
THEN 1
ELSE 0
END
) / COUNT_BIG(analyzed_table.[target_column])
END AS actual_value,
DATEFROMPARTS(YEAR(CAST(analyzed_table.[date_column] AS date)), MONTH(CAST(analyzed_table.[date_column] AS date)), 1) AS time_period,
CAST((DATEFROMPARTS(YEAR(CAST(analyzed_table.[date_column] AS date)), MONTH(CAST(analyzed_table.[date_column] AS date)), 1)) AS DATETIME) AS time_period_utc
FROM [your_sql_server_database].[<target_schema>].[<target_table>] AS analyzed_table
GROUP BY DATEFROMPARTS(YEAR(CAST(analyzed_table.[date_column] AS date)), MONTH(CAST(analyzed_table.[date_column] AS date)), 1), DATEADD(month, DATEDIFF(month, 0, analyzed_table.[date_column]), 0)
ORDER BY DATEFROMPARTS(YEAR(CAST(analyzed_table.[date_column] AS date)), MONTH(CAST(analyzed_table.[date_column] AS date)), 1)
Teradata
{% import '/dialects/teradata.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > (CURRENT_DATE + INTERVAL {{((parameters.max_future_days) * 1) | int}} DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table."target_column" AS TIMESTAMP) > CURRENT_TIMESTAMP + INTERVAL 0 SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
TRUNC(CAST(analyzed_table."date_column" AS DATE), 'MM') AS time_period,
CAST(TRUNC(CAST(analyzed_table."date_column" AS DATE), 'MM') AS TIMESTAMP) AS time_period_utc
FROM "<target_schema>"."<target_table>" AS analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
Trino
{% import '/dialects/trino.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('DAY', CAST({{(parameters.max_future_days)}} AS INTEGER), CURRENT_DATE)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_DATETIME)
{% else -%}
TRY_CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% endif -%}
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
TRY_CAST(analyzed_table."target_column" AS TIMESTAMP) > DATE_ADD('SECOND', CAST(0.0 * 86400 AS INTEGER), CURRENT_TIMESTAMP)
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT(analyzed_table."target_column")
END AS actual_value,
time_period,
time_period_utc
FROM (
SELECT
original_table.*,
DATE_TRUNC('MONTH', CAST(original_table."date_column" AS date)) AS time_period,
CAST(DATE_TRUNC('MONTH', CAST(original_table."date_column" AS date)) AS TIMESTAMP) AS time_period_utc
FROM "your_trino_catalog"."<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY time_period, time_period_utc
ORDER BY time_period, time_period_utc
Expand the Configure with data grouping section to see additional examples for configuring this data quality check to use data grouping (GROUP BY).
Configuration with data grouping
Sample configuration with data grouping enabled (YAML)
The sample below shows how to configure the data grouping and how it affects the generated SQL query.
# yaml-language-server: $schema=https://cloud.dqops.com/dqo-yaml-schema/TableYaml-schema.json
apiVersion: dqo/v1
kind: table
spec:
timestamp_columns:
partition_by_column: date_column
incremental_time_window:
daily_partitioning_recent_days: 7
monthly_partitioning_recent_months: 1
default_grouping_name: group_by_country_and_state
groupings:
group_by_country_and_state:
level_1:
source: column_value
column: country
level_2:
source: column_value
column: state
columns:
target_column:
partitioned_checks:
monthly:
datetime:
monthly_partition_date_values_in_future_percent:
parameters:
max_future_days: 0.0
warning:
max_percent: 0.0
error:
max_percent: 1.0
fatal:
max_percent: 5.0
labels:
- This is the column that is analyzed for data quality issues
date_column:
labels:
- "date or datetime column used as a daily or monthly partitioning key, dates\
\ (and times) are truncated to a day or a month by the sensor's query for\
\ partitioned checks"
country:
labels:
- column used as the first grouping key
state:
labels:
- column used as the second grouping key
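The per-dialect queries below all follow the same general shape once data grouping is enabled: the two configured grouping columns are projected as grouping_level_1 and grouping_level_2, and both are added to the GROUP BY and ORDER BY clauses next to the time period columns. The following dialect-neutral sketch is an illustration only, not rendered DQOps output; time_period_utc is omitted for brevity and the predicate assumes the default max_future_days of 0.
SELECT
    100.0 * SUM(CASE WHEN analyzed_table.target_column > CURRENT_TIMESTAMP THEN 1 ELSE 0 END)
        / COUNT(analyzed_table.target_column) AS actual_value,    -- percentage of future dates
    analyzed_table.country AS grouping_level_1,                    -- first grouping column
    analyzed_table.state AS grouping_level_2,                      -- second grouping column
    DATE_TRUNC('MONTH', CAST(analyzed_table.date_column AS DATE)) AS time_period
FROM <target_table> AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period
ORDER BY grouping_level_1, grouping_level_2, time_period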
Please expand the database engine name section to see the SQL query rendered by a Jinja2 template for the date_values_in_future_percent sensor.
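Note that every dialect template converts the max_future_days parameter into an interval whose unit depends on the column type: DATE columns get whole days added to the current date, while DATETIME and TIMESTAMP columns (and values cast to a timestamp) get max_future_days * 86400 seconds added to the current timestamp. The fragment below is an illustrative, dialect-neutral sketch of the resulting predicates for max_future_days = 1.0; the column names date_col, ts_col, and text_col are placeholders, not part of the sample table.
-- DATE column: values more than one day after the current date count as future dates
date_col > CURRENT_DATE + INTERVAL '1' DAY
-- DATETIME / TIMESTAMP column: 1.0 * 86400 = 86400 seconds (24 hours) of allowed margin
ts_col > CURRENT_TIMESTAMP + INTERVAL '86400' SECOND
-- other column types: cast to a timestamp first, then apply the same margin
CAST(text_col AS TIMESTAMP) > CURRENT_TIMESTAMP + INTERVAL '86400' SECOND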
BigQuery
{% import '/dialects/bigquery.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS INT64) SECOND)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATE(), INTERVAL CAST({{(parameters.max_future_days)}} AS INT64) DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATETIME_ADD(CURRENT_DATETIME(), INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS INT64) SECOND)
{% else -%}
SAFE_CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS INT64) SECOND)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN SAFE_CAST(analyzed_table.`target_column` AS TIMESTAMP) > TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL CAST(0.0 * 86400 AS INT64) SECOND)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
analyzed_table.`country` AS grouping_level_1,
analyzed_table.`state` AS grouping_level_2,
DATE_TRUNC(CAST(analyzed_table.`date_column` AS DATE), MONTH) AS time_period,
TIMESTAMP(DATE_TRUNC(CAST(analyzed_table.`date_column` AS DATE), MONTH)) AS time_period_utc
FROM `your-google-project-id`.`<target_schema>`.`<target_table>` AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ClickHouse
{% import '/dialects/clickhouse.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > toDateTime64(now(), 3) + INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS Int64) SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > toDate(now()) + INTERVAL CAST({{(parameters.max_future_days)}} AS Int64) DAY
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > toDateTime(now()) + INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS Int64) SECOND
{% else -%}
toDateTime64({{ lib.render_target_column('analyzed_table') }}, 3) > toDateTime64(now(), 3) + INTERVAL CAST({{(parameters.max_future_days)}} * 86400 AS Int64) SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN toDateTime64(analyzed_table."target_column", 3) > toDateTime64(now(), 3) + INTERVAL CAST(0.0 * 86400 AS Int64) SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table."country" AS grouping_level_1,
analyzed_table."state" AS grouping_level_2,
DATE_TRUNC('month', CAST(analyzed_table."date_column" AS DATE)) AS time_period,
toDateTime64(DATE_TRUNC('month', CAST(analyzed_table."date_column" AS DATE)), 3) AS time_period_utc
FROM "<target_schema>"."<target_table>" AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
Databricks
{% import '/dialects/databricks.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > (CURRENT_DATE() + INTERVAL {{((parameters.max_future_days) * 1) | int}} DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL 0 SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
analyzed_table.`country` AS grouping_level_1,
analyzed_table.`state` AS grouping_level_2,
DATE_TRUNC('MONTH', CAST(analyzed_table.`date_column` AS DATE)) AS time_period,
TIMESTAMP(DATE_TRUNC('MONTH', CAST(analyzed_table.`date_column` AS DATE))) AS time_period_utc
FROM `<target_schema>`.`<target_table>` AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
DB2
{% import '/dialects/db2.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN {% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INT))
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_DAYS(CURRENT_DATE, CAST({{(parameters.max_future_days)}} AS INT))
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INT))
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INT))
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN CAST(analyzed_table."target_column" AS TIMESTAMP) > ADD_SECONDS(CURRENT_TIMESTAMP, CAST(0.0 * 86400 AS INT))
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table.grouping_level_1,
analyzed_table.grouping_level_2,
time_period,
time_period_utc
FROM (
SELECT
original_table.*,
original_table."country" AS grouping_level_1,
original_table."state" AS grouping_level_2,
DATE_TRUNC('MONTH', CAST(original_table."date_column" AS DATE)) AS time_period,
TIMESTAMP(DATE_TRUNC('MONTH', CAST(original_table."date_column" AS DATE))) AS time_period_utc
FROM "<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
DuckDB
{% import '/dialects/duckdb.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + INTERVAL ({{(parameters.max_future_days)}} * 1) DAY
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > CURRENT_TIMESTAMP + INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > CURRENT_TIMESTAMP + INTERVAL (0.0 * 86400) SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table."country" AS grouping_level_1,
analyzed_table."state" AS grouping_level_2,
DATE_TRUNC('MONTH', CAST(analyzed_table."date_column" AS date)) AS time_period,
CAST((DATE_TRUNC('MONTH', CAST(analyzed_table."date_column" AS date))) AS TIMESTAMP WITH TIME ZONE) AS time_period_utc
FROM AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
HANA
{% import '/dialects/hana.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER))
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_DAYS(CURRENT_DATE, CAST({{(parameters.max_future_days)}} AS INTEGER))
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER))
{% else -%}
TO_TIMESTAMP({{ lib.render_target_column('analyzed_table') }}) > ADD_SECONDS(CURRENT_TIMESTAMP, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER))
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
TO_TIMESTAMP(analyzed_table."target_column") > ADD_SECONDS(CURRENT_TIMESTAMP, CAST(0.0 * 86400 AS INTEGER))
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table.grouping_level_1,
analyzed_table.grouping_level_2,
time_period,
time_period_utc
FROM (
SELECT
original_table.*,
original_table."country" AS grouping_level_1,
original_table."state" AS grouping_level_2,
SERIES_ROUND(CAST(original_table."date_column" AS DATE), 'INTERVAL 1 MONTH', ROUND_DOWN) AS time_period,
TO_TIMESTAMP(SERIES_ROUND(CAST(original_table."date_column" AS DATE), 'INTERVAL 1 MONTH', ROUND_DOWN)) AS time_period_utc
FROM "<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
MariaDB
{% import '/dialects/mariadb.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATE(), INTERVAL ({{(parameters.max_future_days)}}) DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATETIME(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL (0.0 * 86400) SECOND)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
analyzed_table.`country` AS grouping_level_1,
analyzed_table.`state` AS grouping_level_2,
DATE_FORMAT(analyzed_table.`date_column`, '%Y-%m-01 00:00:00') AS time_period,
FROM_UNIXTIME(UNIX_TIMESTAMP(DATE_FORMAT(analyzed_table.`date_column`, '%Y-%m-01 00:00:00'))) AS time_period_utc
FROM `<target_table>` AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
MySQL
{% import '/dialects/mysql.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATE(), INTERVAL ({{(parameters.max_future_days)}}) DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD(CURRENT_DATETIME(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL ({{(parameters.max_future_days)}} * 86400) SECOND)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS DATETIME) > DATE_ADD(CURRENT_TIMESTAMP(), INTERVAL (0.0 * 86400) SECOND)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
analyzed_table.`country` AS grouping_level_1,
analyzed_table.`state` AS grouping_level_2,
DATE_FORMAT(analyzed_table.`date_column`, '%Y-%m-01 00:00:00') AS time_period,
FROM_UNIXTIME(UNIX_TIMESTAMP(DATE_FORMAT(analyzed_table.`date_column`, '%Y-%m-01 00:00:00'))) AS time_period_utc
FROM `<target_table>` AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
Oracle
{% import '/dialects/oracle.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + numToDSInterval( CAST( ({{(parameters.max_future_days)}} * 86400) AS INTEGER), 'second' )
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + numToDSInterval( CAST( ({{(parameters.max_future_days)}}) AS INTEGER), 'day' )
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME + numToDSInterval( CAST( ({{(parameters.max_future_days)}} * 86400) AS INTEGER), 'second' )
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP + numToDSInterval( CAST( ({{(parameters.max_future_days)}} * 86400) AS INTEGER), 'second' )
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table."target_column" AS TIMESTAMP) > CURRENT_TIMESTAMP + numToDSInterval( CAST( (0.0 * 86400) AS INTEGER), 'second' )
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table.grouping_level_1,
analyzed_table.grouping_level_2,
time_period,
time_period_utc
FROM (
SELECT
original_table.*,
original_table."country" AS grouping_level_1,
original_table."state" AS grouping_level_2,
TRUNC(CAST(original_table."date_column" AS DATE), 'MONTH') AS time_period,
CAST(TRUNC(CAST(original_table."date_column" AS DATE), 'MONTH') AS TIMESTAMP WITH TIME ZONE) AS time_period_utc
FROM "<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
PostgreSQL
{% import '/dialects/postgresql.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + make_interval(days => ({{(parameters.max_future_days)}})::int)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => (0.0 * 86400)::int)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table."country" AS grouping_level_1,
analyzed_table."state" AS grouping_level_2,
DATE_TRUNC('MONTH', CAST(analyzed_table."date_column" AS date)) AS time_period,
CAST((DATE_TRUNC('MONTH', CAST(analyzed_table."date_column" AS date))) AS TIMESTAMP WITH TIME ZONE) AS time_period_utc
FROM "your_postgresql_database"."<target_schema>"."<target_table>" AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
Presto
{% import '/dialects/presto.sql.jinja2' as lib with context -%}
{% macro render_value_in_future() -%}
{%- endmacro -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('DAY', CAST({{(parameters.max_future_days)}} AS INTEGER), CURRENT_DATE)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_DATETIME)
{% else -%}
TRY_CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% endif -%}
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
TRY_CAST(analyzed_table."target_column" AS TIMESTAMP) > DATE_ADD('SECOND', CAST(0.0 * 86400 AS INTEGER), CURRENT_TIMESTAMP)
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table.grouping_level_1,
analyzed_table.grouping_level_2,
time_period,
time_period_utc
FROM (
SELECT
original_table.*,
original_table."country" AS grouping_level_1,
original_table."state" AS grouping_level_2,
DATE_TRUNC('MONTH', CAST(original_table."date_column" AS date)) AS time_period,
CAST(DATE_TRUNC('MONTH', CAST(original_table."date_column" AS date)) AS TIMESTAMP) AS time_period_utc
FROM "your_trino_database"."<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
QuestDB
{% import '/dialects/questdb.sql.jinja2' as lib with context -%}
SELECT
COALESCE(100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD('s', ({{(parameters.max_future_days)}})::int * 86400, NOW())
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD('d', ({{(parameters.max_future_days)}})::int, TODAY())
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD('s', ({{(parameters.max_future_days)}})::int * 86400, NOW())
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > DATEADD('s', ({{(parameters.max_future_days)}})::int * 86400, NOW())
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }}), 0.0)
AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM(
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
COALESCE(100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > DATEADD('s', (0.0)::int * 86400, NOW())
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column"), 0.0)
AS actual_value,
analyzed_table.grouping_level_1,
analyzed_table.grouping_level_2,
time_period,
time_period_utc
FROM(
SELECT
original_table.*,
original_table."country" AS grouping_level_1,
original_table."state" AS grouping_level_2,
CAST(DATE_TRUNC('month', original_table."date_column") AS DATE) AS time_period,
CAST((CAST(DATE_TRUNC('month', original_table."date_column") AS DATE)) AS TIMESTAMP WITH TIME ZONE) AS time_period_utc
FROM "<target_table>" original_table
) analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
Redshift
{% import '/dialects/redshift.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATE + make_interval(days => ({{(parameters.max_future_days)}})::int)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% else -%}
({{ lib.render_target_column('analyzed_table') }})::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => ({{(parameters.max_future_days)}} * 86400)::int)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
(analyzed_table."target_column")::TIMESTAMP > CURRENT_TIMESTAMP + make_interval(secs => (0.0 * 86400)::int)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table."country" AS grouping_level_1,
analyzed_table."state" AS grouping_level_2,
DATE_TRUNC('MONTH', CAST(analyzed_table."date_column" AS date)) AS time_period,
CAST((DATE_TRUNC('MONTH', CAST(analyzed_table."date_column" AS date))) AS TIMESTAMP WITH TIME ZONE) AS time_period_utc
FROM "your_redshift_database"."<target_schema>"."<target_table>" AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
Snowflake
{% import '/dialects/snowflake.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > TIMESTAMPADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(DAY, CAST({{(parameters.max_future_days)}} AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% else -%}
TRY_TO_TIMESTAMP({{ lib.render_target_column('analyzed_table') }}) > TIMESTAMPADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
TRY_TO_TIMESTAMP(analyzed_table."target_column") > TIMESTAMPADD(SECOND, CAST(0.0 * 86400 AS INTEGER), CURRENT_TIMESTAMP)
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table."country" AS grouping_level_1,
analyzed_table."state" AS grouping_level_2,
DATE_TRUNC('MONTH', CAST(analyzed_table."date_column" AS date)) AS time_period,
TO_TIMESTAMP(DATE_TRUNC('MONTH', CAST(analyzed_table."date_column" AS date))) AS time_period_utc
FROM "your_snowflake_database"."<target_schema>"."<target_table>" AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
Spark
{% import '/dialects/spark.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > (CURRENT_DATE() + INTERVAL {{((parameters.max_future_days) * 1) | int}} DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table.`target_column`) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table.`target_column` AS TIMESTAMP) > CURRENT_TIMESTAMP() + INTERVAL 0 SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table.`target_column`)
END AS actual_value,
analyzed_table.`country` AS grouping_level_1,
analyzed_table.`state` AS grouping_level_2,
DATE_TRUNC('MONTH', CAST(analyzed_table.`date_column` AS DATE)) AS time_period,
TIMESTAMP(DATE_TRUNC('MONTH', CAST(analyzed_table.`date_column` AS DATE))) AS time_period_utc
FROM `<target_schema>`.`<target_table>` AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
SQL Server
{% import '/dialects/sqlserver.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT_BIG({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INT), SYSDATETIME())
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(DAY, CAST({{(parameters.max_future_days)}} AS INT), GETDATE())
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INT), GETDATE())
{% else -%}
TRY_CAST({{ lib.render_target_column('analyzed_table') }} AS DATETIME) > DATEADD(SECOND, CAST({{(parameters.max_future_days)}} * 86400 AS INT), SYSDATETIME())
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT_BIG({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT_BIG(analyzed_table.[target_column]) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
TRY_CAST(analyzed_table.[target_column] AS DATETIME) > DATEADD(SECOND, CAST(0.0 * 86400 AS INT), SYSDATETIME())
THEN 1
ELSE 0
END
) / COUNT_BIG(analyzed_table.[target_column])
END AS actual_value,
analyzed_table.[country] AS grouping_level_1,
analyzed_table.[state] AS grouping_level_2,
DATEFROMPARTS(YEAR(CAST(analyzed_table.[date_column] AS date)), MONTH(CAST(analyzed_table.[date_column] AS date)), 1) AS time_period,
CAST((DATEFROMPARTS(YEAR(CAST(analyzed_table.[date_column] AS date)), MONTH(CAST(analyzed_table.[date_column] AS date)), 1)) AS DATETIME) AS time_period_utc
FROM [your_sql_server_database].[<target_schema>].[<target_table>] AS analyzed_table
GROUP BY analyzed_table.[country], analyzed_table.[state], DATEFROMPARTS(YEAR(CAST(analyzed_table.[date_column] AS date)), MONTH(CAST(analyzed_table.[date_column] AS date)), 1), DATEADD(month, DATEDIFF(month, 0, analyzed_table.[date_column]), 0)
ORDER BY analyzed_table.[country], analyzed_table.[state], DATEFROMPARTS(YEAR(CAST(analyzed_table.[date_column] AS date)), MONTH(CAST(analyzed_table.[date_column] AS date)), 1)
Teradata
{% import '/dialects/teradata.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_TIMESTAMP + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > (CURRENT_DATE + INTERVAL {{((parameters.max_future_days) * 1) | int}} DAY)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > CURRENT_DATETIME() + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% else -%}
CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > CURRENT_TIMESTAMP + INTERVAL {{((parameters.max_future_days) * 86400) | int}} SECOND
{% endif -%}
THEN 1
ELSE 0
END
) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections('analyzed_table') }}
{{- lib.render_time_dimension_projection('analyzed_table') }}
FROM {{ lib.render_target_table() }} AS analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE 100.0 * SUM(
CASE
WHEN
CAST(analyzed_table."target_column" AS TIMESTAMP) > CURRENT_TIMESTAMP + INTERVAL 0 SECOND
THEN 1
ELSE 0
END
) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table."country" AS grouping_level_1,
analyzed_table."state" AS grouping_level_2,
TRUNC(CAST(analyzed_table."date_column" AS DATE), 'MM') AS time_period,
CAST(TRUNC(CAST(analyzed_table."date_column" AS DATE), 'MM') AS TIMESTAMP) AS time_period_utc
FROM "<target_schema>"."<target_table>" AS analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
Trino
{% import '/dialects/trino.sql.jinja2' as lib with context -%}
SELECT
CASE
WHEN COUNT({{ lib.render_target_column('analyzed_table') }}) = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
{% if lib.is_instant(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% elif lib.is_local_date(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('DAY', CAST({{(parameters.max_future_days)}} AS INTEGER), CURRENT_DATE)
{% elif lib.is_local_date_time(table.columns[column_name].type_snapshot.column_type) == 'true' -%}
{{ lib.render_target_column('analyzed_table') }} > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_DATETIME)
{% else -%}
TRY_CAST({{ lib.render_target_column('analyzed_table') }} AS TIMESTAMP) > DATE_ADD('SECOND', CAST({{(parameters.max_future_days)}} * 86400 AS INTEGER), CURRENT_TIMESTAMP)
{% endif -%}
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT({{ lib.render_target_column('analyzed_table') }})
END AS actual_value
{{- lib.render_data_grouping_projections_reference('analyzed_table') }}
{{- lib.render_time_dimension_projection_reference('analyzed_table') }}
FROM (
SELECT
original_table.*
{{- lib.render_data_grouping_projections('original_table') }}
{{- lib.render_time_dimension_projection('original_table') }}
FROM {{ lib.render_target_table() }} original_table
) analyzed_table
{{- lib.render_where_clause() -}}
{{- lib.render_group_by() -}}
{{- lib.render_order_by() -}}
SELECT
CASE
WHEN COUNT(analyzed_table."target_column") = 0 THEN 0.0
ELSE CAST(100.0 * SUM(
CASE
WHEN
TRY_CAST(analyzed_table."target_column" AS TIMESTAMP) > DATE_ADD('SECOND', CAST(0.0 * 86400 AS INTEGER), CURRENT_TIMESTAMP)
THEN 1
ELSE 0
END
) AS DOUBLE) / COUNT(analyzed_table."target_column")
END AS actual_value,
analyzed_table.grouping_level_1,
analyzed_table.grouping_level_2,
time_period,
time_period_utc
FROM (
SELECT
original_table.*,
original_table."country" AS grouping_level_1,
original_table."state" AS grouping_level_2,
DATE_TRUNC('MONTH', CAST(original_table."date_column" AS date)) AS time_period,
CAST(DATE_TRUNC('MONTH', CAST(original_table."date_column" AS date)) AS TIMESTAMP) AS time_period_utc
FROM "your_trino_catalog"."<target_schema>"."<target_table>" original_table
) analyzed_table
GROUP BY grouping_level_1, grouping_level_2, time_period, time_period_utc
ORDER BY grouping_level_1, grouping_level_2, time_period, time_period_utc
What's next
- Learn how to configure data quality checks in DQOps
- Look at the examples of running data quality checks that target specific tables and columns