Last updated: July 22, 2025

Import custom result on column data quality checks, SQL examples

A column-level check that uses a custom SQL SELECT statement to retrieve the result of a custom data quality check on a column, hardcoded in the data pipeline. The result is retrieved by querying a separate logging table whose schema is not fixed. The logging table should contain columns that identify the table and column for which custom data quality check results are stored, and a severity column describing the severity of the data quality issue. The SQL query configured in this external data quality results importer must be a complete SELECT statement that queries a dedicated logging table created by the data engineering team.
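The shape of such a logging table can be sketched with an in-memory SQLite database. The table and column names below mirror the sample query used throughout this page and are purely illustrative; the actual schema, and the meaning of the severity values, are defined by the data engineering team.

```python
import sqlite3

# Hypothetical logging table, mirroring the columns used in the sample query
# on this page. The table name and column names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE custom_data_quality_results (
        analyzed_schema_name TEXT,
        analyzed_table_name  TEXT,
        analyzed_column_name TEXT,
        my_actual_value      REAL,
        my_expected_value    REAL,
        error_severity       INTEGER  -- severity of the detected issue
    )
""")

# A row written by the data pipeline after it ran its own quality check.
conn.execute(
    "INSERT INTO custom_data_quality_results VALUES (?, ?, ?, ?, ?, ?)",
    ("schema_name", "table_name", "column_name", 42.0, 40.0, 2),
)

# The check runs a complete SELECT statement against this table and reads
# the actual_value, expected_value, and severity columns from the result.
row = conn.execute("""
    SELECT
      logs.my_actual_value   AS actual_value,
      logs.my_expected_value AS expected_value,
      logs.error_severity    AS severity
    FROM custom_data_quality_results AS logs
    WHERE logs.analyzed_schema_name = 'schema_name'
      AND logs.analyzed_table_name  = 'table_name'
      AND logs.analyzed_column_name = 'column_name'
""").fetchone()
print(row)  # (42.0, 40.0, 2)
```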


The import custom result on column data quality check has the following variants for each type of data quality check supported by DQOps.

profile import custom result on column

Check description

Runs a custom query that retrieves the result of a data quality check performed by the data engineering team, whose result (the severity level) is pulled from a separate table.

| Data quality check name | Friendly name | Category | Check type | Time scale | Quality dimension | Sensor definition | Quality rule | Standard |
|---|---|---|---|---|---|---|---|---|
| profile_import_custom_result_on_column | Import custom data quality results on column | custom_sql | profiling | | Validity | import_custom_result | import_severity | |

Command-line examples

Please expand the section below to see the DQOps command-line examples to run or activate the profile import custom result on column data quality check.

Managing profile import custom result on column check from DQOps shell

Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. This activates the warning rule with the default parameters.

dqo> check activate -c=connection_name -t=schema_name.table_name -col=column_name -ch=profile_import_custom_result_on_column --enable-warning

You can also use patterns to activate the check on all matching tables and columns.

dqo> check activate -c=connection_name -t=schema_prefix*.fact_* -col=column_name -ch=profile_import_custom_result_on_column --enable-warning

Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. This activates the error rule with the default parameters.

dqo> check activate -c=connection_name -t=schema_name.table_name -col=column_name -ch=profile_import_custom_result_on_column --enable-error

You can also use patterns to activate the check on all matching tables and columns.

dqo> check activate -c=connection_name -t=schema_prefix*.fact_* -col=column_name -ch=profile_import_custom_result_on_column --enable-error

Run this data quality check using the check run CLI command by providing the check name and all other targeting filters. The following example shows how to run the profile_import_custom_result_on_column check on all tables and columns on a single data source.

dqo> check run -c=data_source_name -ch=profile_import_custom_result_on_column

It is also possible to run this check on a specific connection and table. To do so, provide the connection name and the full table name parameters.

dqo> check run -c=connection_name -t=schema_name.table_name -ch=profile_import_custom_result_on_column

You can also run this check on all tables (and columns) on which the profile_import_custom_result_on_column check is enabled using patterns to find tables.

dqo> check run -c=connection_name -t=schema_prefix*.fact_* -col=column_name_* -ch=profile_import_custom_result_on_column

YAML configuration

The sample schema_name.table_name.dqotable.yaml file with the check configured is shown below.

# yaml-language-server: $schema=https://cloud.dqops.com/dqo-yaml-schema/TableYaml-schema.json
apiVersion: dqo/v1
kind: table
spec:
  columns:
    target_column:
      profiling_checks:
        custom_sql:
          profile_import_custom_result_on_column:
            parameters:
              sql_query: |-
                SELECT
                  logs.my_actual_value as actual_value,
                  logs.my_expected_value as expected_value,
                  logs.error_severity as severity
                FROM custom_data_quality_results as logs
                WHERE logs.analyzed_schema_name = '{schema_name}' AND
                      logs.analyzed_table_name = '{table_name}' AND
                      logs.analyzed_column_name = '{column_name}'
            warning: {}
            error: {}
            fatal: {}
      labels:
      - This is the column that is analyzed for data quality issues

Samples of generated SQL queries for each data source type

Please expand the database engine name section to see the SQL query rendered by a Jinja2 template for the import_custom_result data quality sensor.
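The replace filters in these templates perform plain token substitution, so their effect can be illustrated with ordinary Python string replacement. The render_sensor_query helper below is a hypothetical illustration, not a DQOps API; it only mimics the replace('{table_name}', ...) filter chain shown in the templates.

```python
# The sql_query parameter as configured in the YAML file, containing the
# {schema_name}, {table_name}, and {column_name} placeholder tokens.
sql_query = """\
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '{schema_name}' AND
      logs.analyzed_table_name = '{table_name}' AND
      logs.analyzed_column_name = '{column_name}'"""

def render_sensor_query(sql_query: str, schema: str, table: str, column: str) -> str:
    """Hypothetical helper mimicking the Jinja2 replace() filter chain:
    each placeholder token is swapped for the target identifier."""
    return (sql_query
            .replace("{table_name}", table)
            .replace("{schema_name}", schema)
            .replace("{column_name}", column))

rendered = render_sensor_query(
    sql_query, "<target_schema>", "<target_table>", "target_column")
print(rendered)
```

The rendered output matches the per-engine samples below: the placeholders are gone and the query is ready to run against the logging table as-is.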

BigQuery
{% import '/dialects/bigquery.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
ClickHouse
{% import '/dialects/clickhouse.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
Databricks
{% import '/dialects/databricks.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
DB2
{% import '/dialects/db2.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
DuckDB
{% import '/dialects/duckdb.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
HANA
{% import '/dialects/hana.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
MariaDB
{% import '/dialects/mariadb.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
MySQL
{% import '/dialects/mysql.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
Oracle
{% import '/dialects/oracle.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
PostgreSQL
{% import '/dialects/postgresql.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
Presto
{% import '/dialects/presto.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
QuestDB
{% import '/dialects/questdb.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
Redshift
{% import '/dialects/redshift.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
Snowflake
{% import '/dialects/snowflake.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
Spark
{% import '/dialects/spark.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
SQL Server
{% import '/dialects/sqlserver.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
Teradata
{% import '/dialects/teradata.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
Trino
{% import '/dialects/trino.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'

daily import custom result on column

Check description

Runs a custom query that retrieves the result of a data quality check performed by the data engineering team, whose result (the severity level) is pulled from a separate table.

| Data quality check name | Friendly name | Category | Check type | Time scale | Quality dimension | Sensor definition | Quality rule | Standard |
|---|---|---|---|---|---|---|---|---|
| daily_import_custom_result_on_column | Import custom data quality results on column | custom_sql | monitoring | daily | Validity | import_custom_result | import_severity | |

Command-line examples

Please expand the section below to see the DQOps command-line examples to run or activate the daily import custom result on column data quality check.

Managing daily import custom result on column check from DQOps shell

Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. This activates the warning rule with the default parameters.

dqo> check activate -c=connection_name -t=schema_name.table_name -col=column_name -ch=daily_import_custom_result_on_column --enable-warning

You can also use patterns to activate the check on all matching tables and columns.

dqo> check activate -c=connection_name -t=schema_prefix*.fact_* -col=column_name -ch=daily_import_custom_result_on_column --enable-warning

Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. This activates the error rule with the default parameters.

dqo> check activate -c=connection_name -t=schema_name.table_name -col=column_name -ch=daily_import_custom_result_on_column --enable-error

You can also use patterns to activate the check on all matching tables and columns.

dqo> check activate -c=connection_name -t=schema_prefix*.fact_* -col=column_name -ch=daily_import_custom_result_on_column --enable-error

Run this data quality check using the check run CLI command by providing the check name and all other targeting filters. The following example shows how to run the daily_import_custom_result_on_column check on all tables and columns on a single data source.

dqo> check run -c=data_source_name -ch=daily_import_custom_result_on_column

It is also possible to run this check on a specific connection and table. To do so, provide the connection name and the full table name parameters.

dqo> check run -c=connection_name -t=schema_name.table_name -ch=daily_import_custom_result_on_column

You can also run this check on all tables (and columns) on which the daily_import_custom_result_on_column check is enabled using patterns to find tables.

dqo> check run -c=connection_name -t=schema_prefix*.fact_* -col=column_name_* -ch=daily_import_custom_result_on_column

YAML configuration

The sample schema_name.table_name.dqotable.yaml file with the check configured is shown below.

# yaml-language-server: $schema=https://cloud.dqops.com/dqo-yaml-schema/TableYaml-schema.json
apiVersion: dqo/v1
kind: table
spec:
  columns:
    target_column:
      monitoring_checks:
        daily:
          custom_sql:
            daily_import_custom_result_on_column:
              parameters:
                sql_query: |-
                  SELECT
                    logs.my_actual_value as actual_value,
                    logs.my_expected_value as expected_value,
                    logs.error_severity as severity
                  FROM custom_data_quality_results as logs
                  WHERE logs.analyzed_schema_name = '{schema_name}' AND
                        logs.analyzed_table_name = '{table_name}' AND
                        logs.analyzed_column_name = '{column_name}'
              warning: {}
              error: {}
              fatal: {}
      labels:
      - This is the column that is analyzed for data quality issues

Samples of generated SQL queries for each data source type

Please expand the database engine name section to see the SQL query rendered by a Jinja2 template for the import_custom_result data quality sensor.

BigQuery
{% import '/dialects/bigquery.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
ClickHouse
{% import '/dialects/clickhouse.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
Databricks
{% import '/dialects/databricks.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
DB2
{% import '/dialects/db2.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
DuckDB
{% import '/dialects/duckdb.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
HANA
{% import '/dialects/hana.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
MariaDB
{% import '/dialects/mariadb.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
MySQL
{% import '/dialects/mysql.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
Oracle
{% import '/dialects/oracle.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
PostgreSQL
{% import '/dialects/postgresql.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
Presto
{% import '/dialects/presto.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
QuestDB
{% import '/dialects/questdb.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
Redshift
{% import '/dialects/redshift.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
Snowflake
{% import '/dialects/snowflake.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
Spark
{% import '/dialects/spark.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
SQL Server
{% import '/dialects/sqlserver.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
Teradata
{% import '/dialects/teradata.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
Trino
{% import '/dialects/trino.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'

monthly import custom result on column

Check description

Runs a custom query that retrieves the result of a data quality check performed by the data engineering team, whose result (the severity level) is pulled from a separate table.

| Data quality check name | Friendly name | Category | Check type | Time scale | Quality dimension | Sensor definition | Quality rule | Standard |
|---|---|---|---|---|---|---|---|---|
| monthly_import_custom_result_on_column | Import custom data quality results on column | custom_sql | monitoring | monthly | Validity | import_custom_result | import_severity | |

Command-line examples

Please expand the section below to see the DQOps command-line examples to run or activate the monthly import custom result on column data quality check.

Managing monthly import custom result on column check from DQOps shell

Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. This activates the warning rule with the default parameters.

dqo> check activate -c=connection_name -t=schema_name.table_name -col=column_name -ch=monthly_import_custom_result_on_column --enable-warning

You can also use patterns to activate the check on all matching tables and columns.

dqo> check activate -c=connection_name -t=schema_prefix*.fact_* -col=column_name -ch=monthly_import_custom_result_on_column --enable-warning

Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. This example activates the error rule with the default parameters.

dqo> check activate -c=connection_name -t=schema_name.table_name -col=column_name -ch=monthly_import_custom_result_on_column --enable-error

You can also use patterns to activate the check on all matching tables and columns.

dqo> check activate -c=connection_name -t=schema_prefix*.fact_* -col=column_name -ch=monthly_import_custom_result_on_column --enable-error

Run this data quality check using the check run CLI command by providing the check name and all other targeting filters. The following example shows how to run the monthly_import_custom_result_on_column check on all tables and columns on a single data source.

dqo> check run -c=data_source_name -ch=monthly_import_custom_result_on_column

You can also run this check on a specific connection and table by providing the connection name and the full table name.

dqo> check run -c=connection_name -t=schema_name.table_name -ch=monthly_import_custom_result_on_column

You can also run this check on all tables and columns on which the monthly_import_custom_result_on_column check is enabled, using name patterns to find the tables.

dqo> check run -c=connection_name -t=schema_prefix*.fact_* -col=column_name_* -ch=monthly_import_custom_result_on_column

YAML configuration

The sample schema_name.table_name.dqotable.yaml file with the check configured is shown below.

# yaml-language-server: $schema=https://cloud.dqops.com/dqo-yaml-schema/TableYaml-schema.json
apiVersion: dqo/v1
kind: table
spec:
  columns:
    target_column:
      monitoring_checks:
        monthly:
          custom_sql:
            monthly_import_custom_result_on_column:
              parameters:
                sql_query: |-
                  SELECT
                    logs.my_actual_value as actual_value,
                    logs.my_expected_value as expected_value,
                    logs.error_severity as severity
                  FROM custom_data_quality_results as logs
                  WHERE logs.analyzed_schema_name = '{schema_name}' AND
                        logs.analyzed_table_name = '{table_name}' AND
                        logs.analyzed_column_name = '{column_name}'
              warning: {}
              error: {}
              fatal: {}
      labels:
      - This is the column that is analyzed for data quality issues
Samples of generated SQL queries for each data source type

Please expand the database engine name section to see the SQL query rendered by a Jinja2 template for the import_custom_result data quality sensor.
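All of the dialect templates below apply the same transformation: the user-supplied sql_query is emitted verbatim after the Jinja2 replace filters substitute the {table_name}, {schema_name}, and {column_name} placeholders with the analyzed target. A plain-Python sketch of that substitution, with illustrative target names:

```python
# The user-configured query from the YAML example above, with placeholders.
sql_query = """SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '{schema_name}' AND
      logs.analyzed_table_name = '{table_name}' AND
      logs.analyzed_column_name = '{column_name}'"""

# Mirrors the chained Jinja2 replace filters in the dialect templates;
# the target names are illustrative.
rendered = (sql_query
            .replace("{table_name}", "target_table")
            .replace("{schema_name}", "target_schema")
            .replace("{column_name}", "target_column"))
print(rendered)
```

Because the substitution is plain text replacement, the configured query must be a complete SELECT statement; the placeholders may appear anywhere in it, including identifiers or string literals.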

BigQuery
{% import '/dialects/bigquery.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
ClickHouse
{% import '/dialects/clickhouse.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
Databricks
{% import '/dialects/databricks.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
DB2
{% import '/dialects/db2.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
DuckDB
{% import '/dialects/duckdb.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
HANA
{% import '/dialects/hana.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
MariaDB
{% import '/dialects/mariadb.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
MySQL
{% import '/dialects/mysql.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
Oracle
{% import '/dialects/oracle.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
PostgreSQL
{% import '/dialects/postgresql.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
Presto
{% import '/dialects/presto.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
QuestDB
{% import '/dialects/questdb.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
Redshift
{% import '/dialects/redshift.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
Snowflake
{% import '/dialects/snowflake.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
Spark
{% import '/dialects/spark.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
SQL Server
{% import '/dialects/sqlserver.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
Teradata
{% import '/dialects/teradata.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'
Trino
{% import '/dialects/trino.sql.jinja2' as lib with context -%}
{{ parameters.sql_query | replace('{table_name}', target_table.table_name)
                        | replace('{schema_name}', target_table.schema_name)
                        | replace('{column_name}', column_name) }}
SELECT
  logs.my_actual_value as actual_value,
  logs.my_expected_value as expected_value,
  logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
      logs.analyzed_table_name = '<target_table>' AND
      logs.analyzed_column_name = 'target_column'

What's next