Last updated: July 22, 2025
Import custom result on table data quality checks, SQL examples
A table-level check that uses a custom SQL SELECT statement to retrieve the result of a custom data quality check that was hardcoded in the data pipeline, with its result stored in a separate table. The SQL query configured in this external data quality results importer must be a complete SELECT statement that queries a dedicated table (created by the data engineers) storing the results of custom data quality checks. The query must return a severity column with one of the following values: 0 - the data quality check passed, 1 - warning issue, 2 - error severity issue, 3 - fatal severity issue.
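As a sketch of what such a dedicated results table and query could look like, the example below uses the same hypothetical names (custom_data_quality_results, analyzed_schema_name, error_severity, and so on) as the sample queries later on this page. The exact table layout and column types are an assumption; any layout works as long as the configured SELECT statement returns the severity column.

```sql
-- Hypothetical results table populated by the data pipeline; the layout is
-- illustrative, only the severity contract (0..3) is required by the check.
CREATE TABLE custom_data_quality_results (
    analyzed_schema_name VARCHAR(255),   -- schema of the table that was checked
    analyzed_table_name  VARCHAR(255),   -- name of the table that was checked
    my_actual_value      DOUBLE PRECISION,
    my_expected_value    DOUBLE PRECISION,
    error_severity       INT             -- 0 = passed, 1 = warning, 2 = error, 3 = fatal
);

-- A compliant SELECT statement: it returns the required severity column,
-- plus the actual_value and expected_value columns used in the sample
-- configuration on this page. The {schema_name} and {table_name} tokens
-- are filled in by DQOps for the analyzed table.
SELECT
    logs.my_actual_value   AS actual_value,
    logs.my_expected_value AS expected_value,
    logs.error_severity    AS severity
FROM custom_data_quality_results AS logs
WHERE logs.analyzed_schema_name = '{schema_name}'
  AND logs.analyzed_table_name  = '{table_name}';
```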
The import custom result on table data quality check has the following variants for each type of data quality check supported by DQOps.
profile import custom result on table
Check description
Runs a custom query that retrieves the result of a data quality check performed in the data engineering pipeline; the result (the severity level) is pulled from a separate table.
| Data quality check name | Friendly name | Category | Check type | Time scale | Quality dimension | Sensor definition | Quality rule | Standard |
|---|---|---|---|---|---|---|---|---|
| profile_import_custom_result_on_table | Import custom data quality results on table | custom_sql | profiling | | Validity | import_custom_result | import_severity | |
Command-line examples
The examples below show how to run or activate the profile import custom result on table data quality check from the DQOps command line.
Managing profile import custom result on table check from DQOps shell
Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. This example activates the warning severity rule with its default parameters.
dqo> check activate -c=connection_name -t=schema_name.table_name -ch=profile_import_custom_result_on_table --enable-warning
You can also use patterns to activate the check on all matching tables and columns.
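For example, the following command targets every table whose schema name starts with schema_prefix and whose table name starts with fact_ (the pattern is illustrative):

dqo> check activate -c=connection_name -t=schema_prefix*.fact_* -ch=profile_import_custom_result_on_table --enable-warning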
Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. This example activates the error severity rule with its default parameters.
dqo> check activate -c=connection_name -t=schema_name.table_name -ch=profile_import_custom_result_on_table --enable-error
You can also use patterns to activate the check on all matching tables and columns.
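For example, using the same illustrative wildcard pattern as above:

dqo> check activate -c=connection_name -t=schema_prefix*.fact_* -ch=profile_import_custom_result_on_table --enable-error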
Run this data quality check using the check run CLI command by providing the check name and all other targeting filters. The following example shows how to run the profile_import_custom_result_on_table check on all tables on a single data source.
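dqo> check run -c=connection_name -ch=profile_import_custom_result_on_table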
It is also possible to run this check on a specific connection and table. In order to do this, use the connection name and the full table name parameters.
dqo> check run -c=connection_name -t=schema_name.table_name -ch=profile_import_custom_result_on_table
You can also run this check on all tables on which the profile_import_custom_result_on_table check is enabled using patterns to find tables.
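For example, a wildcard pattern in the table name filter (shown here with illustrative schema_prefix and fact_ placeholders) runs the check on all matching tables:

dqo> check run -c=connection_name -t=schema_prefix*.fact_* -ch=profile_import_custom_result_on_table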
YAML configuration
The sample schema_name.table_name.dqotable.yaml file with the check configured is shown below.
```yaml
# yaml-language-server: $schema=https://cloud.dqops.com/dqo-yaml-schema/TableYaml-schema.json
apiVersion: dqo/v1
kind: table
spec:
  profiling_checks:
    custom_sql:
      profile_import_custom_result_on_table:
        parameters:
          sql_query: |-
            SELECT
              logs.my_actual_value as actual_value,
              logs.my_expected_value as expected_value,
              logs.error_severity as severity
            FROM custom_data_quality_results as logs
            WHERE logs.analyzed_schema_name = '{schema_name}' AND logs.analyzed_table_name = '{table_name}'
        warning: {}
        error: {}
        fatal: {}
  columns: {}
```
Samples of generated SQL queries for each data source type
DQOps renders the import_custom_result sensor from a Jinja2 template into the native SQL dialect of each supported database engine: BigQuery, ClickHouse, Databricks, DB2, DuckDB, HANA, MariaDB, MySQL, Oracle, PostgreSQL, Presto, QuestDB, Redshift, Snowflake, Spark, SQL Server, Teradata, and Trino. For this sensor, the rendered query is the custom SELECT statement from the sql_query parameter, with the {schema_name} and {table_name} tokens filled in.
daily import custom result on table
Check description
Runs a custom query that retrieves the result of a data quality check performed in the data engineering pipeline; the result (the severity level) is pulled from a separate table.
| Data quality check name | Friendly name | Category | Check type | Time scale | Quality dimension | Sensor definition | Quality rule | Standard |
|---|---|---|---|---|---|---|---|---|
| daily_import_custom_result_on_table | Import custom data quality results on table | custom_sql | monitoring | daily | Validity | import_custom_result | import_severity | |
Command-line examples
The examples below show how to run or activate the daily import custom result on table data quality check from the DQOps command line.
Managing daily import custom result on table check from DQOps shell
Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. This example activates the warning severity rule with its default parameters.
dqo> check activate -c=connection_name -t=schema_name.table_name -ch=daily_import_custom_result_on_table --enable-warning
You can also use patterns to activate the check on all matching tables and columns.
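For example, with an illustrative wildcard pattern in the table name filter:

dqo> check activate -c=connection_name -t=schema_prefix*.fact_* -ch=daily_import_custom_result_on_table --enable-warning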
Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. This example activates the error severity rule with its default parameters.
dqo> check activate -c=connection_name -t=schema_name.table_name -ch=daily_import_custom_result_on_table --enable-error
You can also use patterns to activate the check on all matching tables and columns.
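For example, using the same illustrative wildcard pattern:

dqo> check activate -c=connection_name -t=schema_prefix*.fact_* -ch=daily_import_custom_result_on_table --enable-error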
Run this data quality check using the check run CLI command by providing the check name and all other targeting filters. The following example shows how to run the daily_import_custom_result_on_table check on all tables on a single data source.
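dqo> check run -c=connection_name -ch=daily_import_custom_result_on_table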
It is also possible to run this check on a specific connection and table. In order to do this, use the connection name and the full table name parameters.
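dqo> check run -c=connection_name -t=schema_name.table_name -ch=daily_import_custom_result_on_table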
You can also run this check on all tables on which the daily_import_custom_result_on_table check is enabled using patterns to find tables.
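For example, with an illustrative wildcard pattern in the table name filter:

dqo> check run -c=connection_name -t=schema_prefix*.fact_* -ch=daily_import_custom_result_on_table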
YAML configuration
The sample schema_name.table_name.dqotable.yaml file with the check configured is shown below.
```yaml
# yaml-language-server: $schema=https://cloud.dqops.com/dqo-yaml-schema/TableYaml-schema.json
apiVersion: dqo/v1
kind: table
spec:
  monitoring_checks:
    daily:
      custom_sql:
        daily_import_custom_result_on_table:
          parameters:
            sql_query: |-
              SELECT
                logs.my_actual_value as actual_value,
                logs.my_expected_value as expected_value,
                logs.error_severity as severity
              FROM custom_data_quality_results as logs
              WHERE logs.analyzed_schema_name = '{schema_name}' AND logs.analyzed_table_name = '{table_name}'
          warning: {}
          error: {}
          fatal: {}
  columns: {}
```
Samples of generated SQL queries for each data source type
DQOps renders the import_custom_result sensor from a Jinja2 template into the native SQL dialect of each supported database engine: BigQuery, ClickHouse, Databricks, DB2, DuckDB, HANA, MariaDB, MySQL, Oracle, PostgreSQL, Presto, QuestDB, Redshift, Snowflake, Spark, SQL Server, Teradata, and Trino. For this sensor, the rendered query is the custom SELECT statement from the sql_query parameter, with the {schema_name} and {table_name} tokens filled in.
monthly import custom result on table
Check description
Runs a custom query that retrieves the result of a data quality check performed in the data engineering pipeline; the result (the severity level) is pulled from a separate table.
| Data quality check name | Friendly name | Category | Check type | Time scale | Quality dimension | Sensor definition | Quality rule | Standard |
|---|---|---|---|---|---|---|---|---|
| monthly_import_custom_result_on_table | Import custom data quality results on table | custom_sql | monitoring | monthly | Validity | import_custom_result | import_severity | |
Command-line examples
The examples below show how to run or activate the monthly import custom result on table data quality check from the DQOps command line.
Managing monthly import custom result on table check from DQOps shell
Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. This example activates the warning severity rule with its default parameters.
dqo> check activate -c=connection_name -t=schema_name.table_name -ch=monthly_import_custom_result_on_table --enable-warning
You can also use patterns to activate the check on all matching tables and columns.
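For example, with an illustrative wildcard pattern in the table name filter:

dqo> check activate -c=connection_name -t=schema_prefix*.fact_* -ch=monthly_import_custom_result_on_table --enable-warning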
Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. This example activates the error severity rule with its default parameters.
dqo> check activate -c=connection_name -t=schema_name.table_name -ch=monthly_import_custom_result_on_table --enable-error
You can also use patterns to activate the check on all matching tables and columns.
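For example, using the same illustrative wildcard pattern:

dqo> check activate -c=connection_name -t=schema_prefix*.fact_* -ch=monthly_import_custom_result_on_table --enable-error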
Run this data quality check using the check run CLI command by providing the check name and all other targeting filters. The following example shows how to run the monthly_import_custom_result_on_table check on all tables on a single data source.
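dqo> check run -c=connection_name -ch=monthly_import_custom_result_on_table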
It is also possible to run this check on a specific connection and table. In order to do this, use the connection name and the full table name parameters.
dqo> check run -c=connection_name -t=schema_name.table_name -ch=monthly_import_custom_result_on_table
You can also run this check on all tables on which the monthly_import_custom_result_on_table check is enabled using patterns to find tables.
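For example, with an illustrative wildcard pattern in the table name filter:

dqo> check run -c=connection_name -t=schema_prefix*.fact_* -ch=monthly_import_custom_result_on_table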
YAML configuration
The sample schema_name.table_name.dqotable.yaml file with the check configured is shown below.
```yaml
# yaml-language-server: $schema=https://cloud.dqops.com/dqo-yaml-schema/TableYaml-schema.json
apiVersion: dqo/v1
kind: table
spec:
  monitoring_checks:
    monthly:
      custom_sql:
        monthly_import_custom_result_on_table:
          parameters:
            sql_query: |-
              SELECT
                logs.my_actual_value as actual_value,
                logs.my_expected_value as expected_value,
                logs.error_severity as severity
              FROM custom_data_quality_results as logs
              WHERE logs.analyzed_schema_name = '{schema_name}' AND logs.analyzed_table_name = '{table_name}'
          warning: {}
          error: {}
          fatal: {}
  columns: {}
```
Samples of generated SQL queries for each data source type
DQOps renders the import_custom_result sensor from a Jinja2 template into the native SQL dialect of each supported database engine: BigQuery, ClickHouse, Databricks, DB2, DuckDB, HANA, MariaDB, MySQL, Oracle, PostgreSQL, Presto, QuestDB, Redshift, Snowflake, Spark, SQL Server, Teradata, and Trino. For this sensor, the rendered query is the custom SELECT statement from the sql_query parameter, with the {schema_name} and {table_name} tokens filled in.
What's next
- Learn how to configure data quality checks in DQOps
- Look at the examples of running data quality checks, targeting tables and columns