Last updated: July 22, 2025
Import custom result on column data quality checks, SQL examples
A column-level check that uses a custom SQL SELECT statement to retrieve the result of a custom data quality check that is hardcoded in the data pipeline. The result is retrieved by querying a separate logging table whose schema is not fixed. The logging table should have columns that identify the table and column for which custom data quality check results are stored, and a column with the severity of the data quality issue. The SQL query configured in this external data quality results importer must be a complete SELECT statement that queries a dedicated logging table created by the data engineering team.
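Because the logging table is owned by the data engineering team and its schema is not fixed, the layout below is only a minimal sketch of what such a table could look like. The table name custom_data_quality_results and the column names match the sample query used later on this page, but the data types and the optional executed_at column are assumptions that must be adapted to whatever your pipeline actually writes.

-- A minimal sketch of a possible logging table; adapt names and types to your pipeline.
CREATE TABLE custom_data_quality_results (
    analyzed_schema_name VARCHAR(255) NOT NULL,   -- schema of the analyzed table
    analyzed_table_name  VARCHAR(255) NOT NULL,   -- analyzed table
    analyzed_column_name VARCHAR(255),            -- analyzed column (NULL for table-level results)
    my_actual_value      DOUBLE PRECISION,        -- measured value captured by the pipeline
    my_expected_value    DOUBLE PRECISION,        -- expected value, if any
    error_severity       INT NOT NULL,            -- severity of the detected data quality issue
    executed_at          TIMESTAMP                -- assumed optional column: when the pipeline check ran
);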
The import custom result on column data quality check has the following variants for each type of data quality check supported by DQOps.
profile import custom result on column
Check description
Runs a custom query that retrieves the result of a data quality check performed by the data engineering team, whose result (the severity level) is pulled from a separate table.
Data quality check name | Friendly name | Category | Check type | Time scale | Quality dimension | Sensor definition | Quality rule | Standard |
---|---|---|---|---|---|---|---|---|
profile_import_custom_result_on_column | Import custom data quality results on column | custom_sql | profiling | | Validity | import_custom_result | import_severity | |
Command-line examples
Please expand the section below to see the DQOps command-line examples to run or activate the profile import custom result on column data quality check.
Managing profile import custom result on column check from DQOps shell
Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. This activates the warning rule with the default parameters.
dqo> check activate -c=connection_name -t=schema_name.table_name -col=column_name -ch=profile_import_custom_result_on_column --enable-warning
You can also use patterns to activate the check on all matching tables and columns.
Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. This activates the error rule with the default parameters.
dqo> check activate -c=connection_name -t=schema_name.table_name -col=column_name -ch=profile_import_custom_result_on_column --enable-error
You can also use patterns to activate the check on all matching tables and columns.
Run this data quality check using the check run CLI command, providing the check name and all other targeting filters. The following example shows how to run the profile_import_custom_result_on_column check on all tables and columns of a single data source.
dqo> check run -c=connection_name -ch=profile_import_custom_result_on_column
It is also possible to run this check on a specific connection and table. To do this, provide the connection name and the full table name:
dqo> check run -c=connection_name -t=schema_name.table_name -ch=profile_import_custom_result_on_column
You can also run this check on all tables (and columns) on which the profile_import_custom_result_on_column check is enabled using patterns to find tables.
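For example, assuming your tables and columns follow a naming convention, a wildcard-based run could look like the command below; the schema_prefix*, fact_*, and *_id patterns are placeholders to replace with your own names.
dqo> check run -c=connection_name -t=schema_prefix*.fact_* -col=*_id -ch=profile_import_custom_result_on_column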
YAML configuration
The sample schema_name.table_name.dqotable.yaml file with the check configured is shown below.
# yaml-language-server: $schema=https://cloud.dqops.com/dqo-yaml-schema/TableYaml-schema.json
apiVersion: dqo/v1
kind: table
spec:
  columns:
    target_column:
      profiling_checks:
        custom_sql:
          profile_import_custom_result_on_column:
            parameters:
              sql_query: |-
                SELECT
                  logs.my_actual_value as actual_value,
                  logs.my_expected_value as expected_value,
                  logs.error_severity as severity
                FROM custom_data_quality_results as logs
                WHERE logs.analyzed_schema_name = '{schema_name}' AND
                      logs.analyzed_table_name = '{table_name}' AND
                      logs.analyzed_column_name = '{column_name}'
            warning: {}
            error: {}
            fatal: {}
      labels:
      - This is the column that is analyzed for data quality issues
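To make the end-to-end flow concrete, the statement below sketches how a pipeline job could log one result row that the configured query would then pick up. This is illustrative only: the severity coding (for example 1 = warning, 2 = error, 3 = fatal) is an assumption that must match the values expected by the import_severity rule in your environment, and the sample values are made up.

-- Illustrative only: a pipeline step recording one custom check result for target_column.
-- The severity coding (1 = warning, 2 = error, 3 = fatal) is assumed; verify it against
-- the import_severity rule before relying on it.
INSERT INTO custom_data_quality_results
    (analyzed_schema_name, analyzed_table_name, analyzed_column_name,
     my_actual_value, my_expected_value, error_severity)
VALUES
    ('schema_name', 'table_name', 'target_column', 42.0, 100.0, 2);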
Samples of generated SQL queries for each data source type
Please expand the database engine name section to see the SQL query rendered by a Jinja2 template for the import_custom_result data quality sensor.
BigQuery
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
ClickHouse
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
Databricks
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
DB2
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
DuckDB
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
HANA
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
MariaDB
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
MySQL
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
Oracle
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
PostgreSQL
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
Presto
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
QuestDB
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
Redshift
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
Snowflake
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
Spark
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
SQL Server
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
Teradata
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
Trino
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
daily import custom result on column
Check description
Runs a custom query that retrieves the result of a data quality check performed by the data engineering team, whose result (the severity level) is pulled from a separate table.
Data quality check name | Friendly name | Category | Check type | Time scale | Quality dimension | Sensor definition | Quality rule | Standard |
---|---|---|---|---|---|---|---|---|
daily_import_custom_result_on_column | Import custom data quality results on column | custom_sql | monitoring | daily | Validity | import_custom_result | import_severity | |
Command-line examples
Please expand the section below to see the DQOps command-line examples to run or activate the daily import custom result on column data quality check.
Managing daily import custom result on column check from DQOps shell
Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. This activates the warning rule with the default parameters.
dqo> check activate -c=connection_name -t=schema_name.table_name -col=column_name -ch=daily_import_custom_result_on_column --enable-warning
You can also use patterns to activate the check on all matching tables and columns.
Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. This activates the error rule with the default parameters.
dqo> check activate -c=connection_name -t=schema_name.table_name -col=column_name -ch=daily_import_custom_result_on_column --enable-error
You can also use patterns to activate the check on all matching tables and columns.
Run this data quality check using the check run CLI command, providing the check name and all other targeting filters. The following example shows how to run the daily_import_custom_result_on_column check on all tables and columns of a single data source.
dqo> check run -c=connection_name -ch=daily_import_custom_result_on_column
It is also possible to run this check on a specific connection and table. To do this, provide the connection name and the full table name:
dqo> check run -c=connection_name -t=schema_name.table_name -ch=daily_import_custom_result_on_column
You can also run this check on all tables (and columns) on which the daily_import_custom_result_on_column check is enabled using patterns to find tables.
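As with the profiling variant, a wildcard-based run could look like the command below; the patterns are placeholders for your own naming convention.
dqo> check run -c=connection_name -t=schema_prefix*.fact_* -col=*_id -ch=daily_import_custom_result_on_column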
YAML configuration
The sample schema_name.table_name.dqotable.yaml file with the check configured is shown below.
# yaml-language-server: $schema=https://cloud.dqops.com/dqo-yaml-schema/TableYaml-schema.json
apiVersion: dqo/v1
kind: table
spec:
  columns:
    target_column:
      monitoring_checks:
        daily:
          custom_sql:
            daily_import_custom_result_on_column:
              parameters:
                sql_query: |-
                  SELECT
                    logs.my_actual_value as actual_value,
                    logs.my_expected_value as expected_value,
                    logs.error_severity as severity
                  FROM custom_data_quality_results as logs
                  WHERE logs.analyzed_schema_name = '{schema_name}' AND
                        logs.analyzed_table_name = '{table_name}' AND
                        logs.analyzed_column_name = '{column_name}'
              warning: {}
              error: {}
              fatal: {}
      labels:
      - This is the column that is analyzed for data quality issues
Samples of generated SQL queries for each data source type
Please expand the database engine name section to see the SQL query rendered by a Jinja2 template for the import_custom_result data quality sensor.
BigQuery
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
ClickHouse
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
Databricks
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
DB2
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
DuckDB
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
HANA
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
MariaDB
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
MySQL
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
Oracle
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
PostgreSQL
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
Presto
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
QuestDB
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
Redshift
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
Snowflake
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
Spark
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
SQL Server
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
Teradata
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
Trino
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
monthly import custom result on column
Check description
Runs a custom query that retrieves the result of a data quality check performed by the data engineering team, whose result (the severity level) is pulled from a separate table.
Data quality check name | Friendly name | Category | Check type | Time scale | Quality dimension | Sensor definition | Quality rule | Standard |
---|---|---|---|---|---|---|---|---|
monthly_import_custom_result_on_column | Import custom data quality results on column | custom_sql | monitoring | monthly | Validity | import_custom_result | import_severity | |
Command-line examples
Please expand the section below to see the DQOps command-line examples to run or activate the monthly import custom result on column data quality check.
Managing monthly import custom result on column check from DQOps shell
Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. This activates the warning rule with the default parameters.
dqo> check activate -c=connection_name -t=schema_name.table_name -col=column_name -ch=monthly_import_custom_result_on_column --enable-warning
You can also use patterns to activate the check on all matching tables and columns.
Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. This activates the error rule with the default parameters.
dqo> check activate -c=connection_name -t=schema_name.table_name -col=column_name -ch=monthly_import_custom_result_on_column --enable-error
You can also use patterns to activate the check on all matching tables and columns.
Run this data quality check using the check run CLI command, providing the check name and all other targeting filters. The following example shows how to run the monthly_import_custom_result_on_column check on all tables and columns of a single data source.
dqo> check run -c=connection_name -ch=monthly_import_custom_result_on_column
It is also possible to run this check on a specific connection and table. To do this, provide the connection name and the full table name:
dqo> check run -c=connection_name -t=schema_name.table_name -ch=monthly_import_custom_result_on_column
You can also run this check on all tables (and columns) on which the monthly_import_custom_result_on_column check is enabled using patterns to find tables.
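As with the other variants, a wildcard-based run could look like the command below; the patterns are placeholders for your own naming convention.
dqo> check run -c=connection_name -t=schema_prefix*.fact_* -col=*_id -ch=monthly_import_custom_result_on_column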
YAML configuration
The sample schema_name.table_name.dqotable.yaml file with the check configured is shown below.
# yaml-language-server: $schema=https://cloud.dqops.com/dqo-yaml-schema/TableYaml-schema.json
apiVersion: dqo/v1
kind: table
spec:
  columns:
    target_column:
      monitoring_checks:
        monthly:
          custom_sql:
            monthly_import_custom_result_on_column:
              parameters:
                sql_query: |-
                  SELECT
                    logs.my_actual_value as actual_value,
                    logs.my_expected_value as expected_value,
                    logs.error_severity as severity
                  FROM custom_data_quality_results as logs
                  WHERE logs.analyzed_schema_name = '{schema_name}' AND
                        logs.analyzed_table_name = '{table_name}' AND
                        logs.analyzed_column_name = '{column_name}'
              warning: {}
              error: {}
              fatal: {}
      labels:
      - This is the column that is analyzed for data quality issues
Samples of generated SQL queries for each data source type
Please expand the database engine name section to see the SQL query rendered by a Jinja2 template for the import_custom_result data quality sensor.
BigQuery
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
ClickHouse
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
Databricks
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
DB2
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
DuckDB
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
HANA
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
MariaDB
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
MySQL
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
Oracle
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
PostgreSQL
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
Presto
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
QuestDB
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
Redshift
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
Snowflake
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
Spark
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
SQL Server
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
Teradata
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
Trino
SELECT
logs.my_actual_value as actual_value,
logs.my_expected_value as expected_value,
logs.error_severity as severity
FROM custom_data_quality_results as logs
WHERE logs.analyzed_schema_name = '<target_schema>' AND
logs.analyzed_table_name = '<target_table>' AND
logs.analyzed_column_name = 'target_column'
What's next
- Learn how to configure data quality checks in DQOps
- Look at the examples of running data quality checks, targeting tables and columns