Last updated: July 22, 2025

Import custom result on table data quality checks, SQL examples

A table-level check that uses a custom SQL SELECT statement to import the result of a custom data quality check that is hardcoded in the data pipeline and stores its result in a separate table. The SQL query configured in this external data quality results importer must be a complete SELECT statement that reads a dedicated table (created by the data engineers) storing the results of custom data quality checks. The query must return a severity column with one of the following values: 0 - the data quality check passed, 1 - warning severity issue, 2 - error severity issue, 3 - fatal severity issue.
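
For illustration, the dedicated results table could be shaped as in the sketch below. The table and column names (custom_data_quality_results, analyzed_schema_name, analyzed_table_name, my_actual_value, my_expected_value, error_severity) match the sample query used throughout this page; the data types and the sample row are assumptions.

CREATE TABLE custom_data_quality_results (
    analyzed_schema_name VARCHAR(255) NOT NULL, -- schema of the analyzed table
    analyzed_table_name  VARCHAR(255) NOT NULL, -- name of the analyzed table
    my_actual_value      DOUBLE PRECISION,      -- value measured by the pipeline check
    my_expected_value    DOUBLE PRECISION,      -- value the pipeline check expected
    error_severity       INT NOT NULL           -- 0 = passed, 1 = warning, 2 = error, 3 = fatal
);

-- A pipeline check that found 980 rows instead of the expected 1000
-- could record a warning (severity 1) for the analyzed table:
INSERT INTO custom_data_quality_results
    (analyzed_schema_name, analyzed_table_name, my_actual_value, my_expected_value, error_severity)
VALUES ('schema_name', 'table_name', 980, 1000, 1);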


The import custom result on table data quality check has the following variants for each type of data quality check supported by DQOps.

profile import custom result on table

Check description

Runs a custom query that retrieves the result of a data quality check performed in the data pipeline, whose result (the severity level) is pulled from a separate table.

Data quality check name | Friendly name | Category | Check type | Time scale | Quality dimension | Sensor definition | Quality rule | Standard
profile_import_custom_result_on_table | Import custom data quality results on table | custom_sql | profiling | - | Validity | import_custom_result | import_severity | -

Command-line examples

Please expand the section below to see the DQOps command-line examples to run or activate the profile import custom result on table data quality check.

Managing profile import custom result on table check from DQOps shell

Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. This example activates the warning rule with its default parameters.

dqo> check activate -c=connection_name -t=schema_name.table_name  -ch=profile_import_custom_result_on_table --enable-warning

You can also use patterns to activate the check on all matching tables.

dqo> check activate -c=connection_name -t=schema_prefix*.fact_*  -ch=profile_import_custom_result_on_table --enable-warning

Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. This example activates the error rule with its default parameters.

dqo> check activate -c=connection_name -t=schema_name.table_name  -ch=profile_import_custom_result_on_table --enable-error

You can also use patterns to activate the check on all matching tables.

dqo> check activate -c=connection_name -t=schema_prefix*.fact_*  -ch=profile_import_custom_result_on_table --enable-error

Run this data quality check using the check run CLI command by providing the check name and all other targeting filters. The following example shows how to run the profile_import_custom_result_on_table check on all tables on a single data source.

dqo> check run -c=data_source_name -ch=profile_import_custom_result_on_table

It is also possible to run this check on a specific connection and table. To do so, provide the connection name and the full table name.

dqo> check run -c=connection_name -t=schema_name.table_name -ch=profile_import_custom_result_on_table

You can also use table name patterns to run this check on all tables on which the profile_import_custom_result_on_table check is enabled.

dqo> check run -c=connection_name -t=schema_prefix*.fact_*  -ch=profile_import_custom_result_on_table

YAML configuration

The sample schema_name.table_name.dqotable.yaml file with the check configured is shown below.

# yaml-language-server: $schema=https://cloud.dqops.com/dqo-yaml-schema/TableYaml-schema.json
apiVersion: dqo/v1
kind: table
spec:
  profiling_checks:
    custom_sql:
      profile_import_custom_result_on_table:
        parameters:
          sql_query: |-
            SELECT
              logs.my_actual_value as actual_value,
              logs.my_expected_value as expected_value,
              logs.error_severity as severity
            FROM custom_data_quality_results as logs
            WHERE logs.analyzed_schema_name = '{schema_name}' AND logs.analyzed_table_name = '{table_name}'
        warning: {}
        error: {}
        fatal: {}
  columns: {}
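
As the dialect templates in the next section show, DQOps replaces the {schema_name} and {table_name} placeholders with the coordinates of the analyzed table before the query is executed. Assuming the hypothetical custom_data_quality_results table sketched earlier, the query executed for schema_name.table_name would be:

SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = 'schema_name' AND logs.analyzed_table_name = 'table_name'

With the sample row inserted earlier, this query would return actual_value = 980, expected_value = 1000, and severity = 1, which DQOps would record as a warning.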
Samples of generated SQL queries for each data source type

Please expand the section for a database engine below to see the Jinja2 template of the import_custom_result data quality sensor, followed by a sample SQL query rendered from that template.

BigQuery
{% import '/dialects/bigquery.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
ClickHouse
{% import '/dialects/clickhouse.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
Databricks
{% import '/dialects/databricks.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
DB2
{% import '/dialects/db2.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
DuckDB
{% import '/dialects/duckdb.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
HANA
{% import '/dialects/hana.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
MariaDB
{% import '/dialects/mariadb.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
MySQL
{% import '/dialects/mysql.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
Oracle
{% import '/dialects/oracle.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
PostgreSQL
{% import '/dialects/postgresql.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
Presto
{% import '/dialects/presto.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
QuestDB
{% import '/dialects/questdb.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
Redshift
{% import '/dialects/redshift.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
Snowflake
{% import '/dialects/snowflake.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
Spark
{% import '/dialects/spark.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
SQL Server
{% import '/dialects/sqlserver.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
Teradata
{% import '/dialects/teradata.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
Trino
{% import '/dialects/trino.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'

daily import custom result on table

Check description

Runs a custom query that retrieves the result of a data quality check performed in the data pipeline, whose result (the severity level) is pulled from a separate table.

Data quality check name | Friendly name | Category | Check type | Time scale | Quality dimension | Sensor definition | Quality rule | Standard
daily_import_custom_result_on_table | Import custom data quality results on table | custom_sql | monitoring | daily | Validity | import_custom_result | import_severity | -

Command-line examples

Please expand the section below to see the DQOps command-line examples to run or activate the daily import custom result on table data quality check.

Managing daily import custom result on table check from DQOps shell

Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. This example activates the warning rule with its default parameters.

dqo> check activate -c=connection_name -t=schema_name.table_name  -ch=daily_import_custom_result_on_table --enable-warning

You can also use patterns to activate the check on all matching tables.

dqo> check activate -c=connection_name -t=schema_prefix*.fact_*  -ch=daily_import_custom_result_on_table --enable-warning

Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. This example activates the error rule with its default parameters.

dqo> check activate -c=connection_name -t=schema_name.table_name  -ch=daily_import_custom_result_on_table --enable-error

You can also use patterns to activate the check on all matching tables.

dqo> check activate -c=connection_name -t=schema_prefix*.fact_*  -ch=daily_import_custom_result_on_table --enable-error

Run this data quality check using the check run CLI command by providing the check name and all other targeting filters. The following example shows how to run the daily_import_custom_result_on_table check on all tables on a single data source.

dqo> check run -c=data_source_name -ch=daily_import_custom_result_on_table

It is also possible to run this check on a specific connection and table. To do so, provide the connection name and the full table name.

dqo> check run -c=connection_name -t=schema_name.table_name -ch=daily_import_custom_result_on_table

You can also use table name patterns to run this check on all tables on which the daily_import_custom_result_on_table check is enabled.

dqo> check run -c=connection_name -t=schema_prefix*.fact_*  -ch=daily_import_custom_result_on_table

YAML configuration

The sample schema_name.table_name.dqotable.yaml file with the check configured is shown below.

# yaml-language-server: $schema=https://cloud.dqops.com/dqo-yaml-schema/TableYaml-schema.json
apiVersion: dqo/v1
kind: table
spec:
  monitoring_checks:
    daily:
      custom_sql:
        daily_import_custom_result_on_table:
          parameters:
            sql_query: |-
              SELECT
                logs.my_actual_value as actual_value,
                logs.my_expected_value as expected_value,
                logs.error_severity as severity
              FROM custom_data_quality_results as logs
              WHERE logs.analyzed_schema_name = '{schema_name}' AND logs.analyzed_table_name = '{table_name}'
          warning: {}
          error: {}
          fatal: {}
  columns: {}
Samples of generated SQL queries for each data source type

Please expand the section for a database engine below to see the Jinja2 template of the import_custom_result data quality sensor, followed by a sample SQL query rendered from that template.

BigQuery
{% import '/dialects/bigquery.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
ClickHouse
{% import '/dialects/clickhouse.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
Databricks
{% import '/dialects/databricks.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
DB2
{% import '/dialects/db2.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
DuckDB
{% import '/dialects/duckdb.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
HANA
{% import '/dialects/hana.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
MariaDB
{% import '/dialects/mariadb.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
MySQL
{% import '/dialects/mysql.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
Oracle
{% import '/dialects/oracle.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
PostgreSQL
{% import '/dialects/postgresql.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
Presto
{% import '/dialects/presto.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
QuestDB
{% import '/dialects/questdb.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
Redshift
{% import '/dialects/redshift.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
Snowflake
{% import '/dialects/snowflake.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
Spark
{% import '/dialects/spark.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
SQL Server
{% import '/dialects/sqlserver.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
Teradata
{% import '/dialects/teradata.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
Trino
{% import '/dialects/trino.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'

monthly import custom result on table

Check description

Runs a custom query that retrieves the result of a data quality check performed in the data pipeline, whose result (the severity level) is pulled from a separate table.

Data quality check name | Friendly name | Category | Check type | Time scale | Quality dimension | Sensor definition | Quality rule | Standard
monthly_import_custom_result_on_table | Import custom data quality results on table | custom_sql | monitoring | monthly | Validity | import_custom_result | import_severity | -

Command-line examples

Please expand the section below to see the DQOps command-line examples to run or activate the monthly import custom result on table data quality check.

Managing monthly import custom result on table check from DQOps shell

Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. This example activates the warning rule with its default parameters.

dqo> check activate -c=connection_name -t=schema_name.table_name  -ch=monthly_import_custom_result_on_table --enable-warning

You can also use patterns to activate the check on all matching tables.

dqo> check activate -c=connection_name -t=schema_prefix*.fact_*  -ch=monthly_import_custom_result_on_table --enable-warning

Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. This example activates the error rule with its default parameters.

dqo> check activate -c=connection_name -t=schema_name.table_name  -ch=monthly_import_custom_result_on_table --enable-error

You can also use patterns to activate the check on all matching tables.

dqo> check activate -c=connection_name -t=schema_prefix*.fact_*  -ch=monthly_import_custom_result_on_table --enable-error

Run this data quality check using the check run CLI command by providing the check name and all other targeting filters. The following example shows how to run the monthly_import_custom_result_on_table check on all tables on a single data source.

dqo> check run -c=data_source_name -ch=monthly_import_custom_result_on_table

It is also possible to run this check on a specific connection and table. To do so, provide the connection name and the full table name.

dqo> check run -c=connection_name -t=schema_name.table_name -ch=monthly_import_custom_result_on_table

You can also use table name patterns to run this check on all tables on which the monthly_import_custom_result_on_table check is enabled.

dqo> check run -c=connection_name -t=schema_prefix*.fact_*  -ch=monthly_import_custom_result_on_table

YAML configuration

The sample schema_name.table_name.dqotable.yaml file with the check configured is shown below.

# yaml-language-server: $schema=https://cloud.dqops.com/dqo-yaml-schema/TableYaml-schema.json
apiVersion: dqo/v1
kind: table
spec:
  monitoring_checks:
    monthly:
      custom_sql:
        monthly_import_custom_result_on_table:
          parameters:
            sql_query: |-
              SELECT
                logs.my_actual_value as actual_value,
                logs.my_expected_value as expected_value,
                logs.error_severity as severity
              FROM custom_data_quality_results as logs
              WHERE logs.analyzed_schema_name = '{schema_name}' AND logs.analyzed_table_name = '{table_name}'
          warning: {}
          error: {}
          fatal: {}
  columns: {}
Samples of generated SQL queries for each data source type

Please expand the section for a database engine below to see the Jinja2 template of the import_custom_result data quality sensor, followed by a sample SQL query rendered from that template.

BigQuery
{% import '/dialects/bigquery.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
ClickHouse
{% import '/dialects/clickhouse.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
Databricks
{% import '/dialects/databricks.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
DB2
{% import '/dialects/db2.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
DuckDB
{% import '/dialects/duckdb.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
HANA
{% import '/dialects/hana.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
MariaDB
{% import '/dialects/mariadb.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
MySQL
{% import '/dialects/mysql.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
Oracle
{% import '/dialects/oracle.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
PostgreSQL
{% import '/dialects/postgresql.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
Presto
{% import '/dialects/presto.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
QuestDB
{% import '/dialects/questdb.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
Redshift
{% import '/dialects/redshift.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
Snowflake
{% import '/dialects/snowflake.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
Spark
{% import '/dialects/spark.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
SQL Server
{% import '/dialects/sqlserver.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
Teradata
{% import '/dialects/teradata.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'
Trino
{% import '/dialects/trino.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name) | replace('{schema_name}', target_table.schema_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND logs.analyzed_table_name = '<target_table>'

What's next