Last updated: July 22, 2025
SQL invalid record count on table data quality checks, SQL examples
A table-level check that uses a custom SQL query that returns invalid records from the tested table. Use the {table} token to reference the tested table. This data quality check can be used to compare columns within the same table. For example, when this check is applied to an age column, the condition can find invalid records in which the age is lower than 18 using the SQL query: `SELECT age FROM {table} WHERE age < 18`.
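To illustrate comparing columns on the same table, a query such as the following would return records whose date range is reversed (the start_date and end_date column names are hypothetical, chosen for this example only):

```sql
-- Hypothetical example: find records where the date range is invalid.
-- start_date and end_date are assumed column names;
-- {table} is the DQOps token for the tested table.
SELECT start_date, end_date
FROM {table}
WHERE start_date > end_date
```

Any records returned by such a query are counted as invalid records by this check.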
The sql invalid record count on table data quality check has the following variants for each type of data quality check supported by DQOps.
profile sql invalid record count on table
Check description
Runs a custom query that retrieves invalid records found in a table and returns their count, raising an issue if too many failures are detected. This check is useful for reusing (legacy) SQL queries that users have already written and tested in their own systems. For example, when this check is applied to an age column, the condition can find invalid records in which the age is lower than 18 using the SQL query: `SELECT age FROM {table} WHERE age < 18`.
| Data quality check name | Friendly name | Category | Check type | Time scale | Quality dimension | Sensor definition | Quality rule | Standard |
|---|---|---|---|---|---|---|---|---|
| profile_sql_invalid_record_count_on_table | Custom SELECT SQL that returns invalid records | custom_sql | profiling | | Validity | sql_invalid_record_count | max_count | |
Command-line examples
Please expand the section below to see the DQOps command-line examples to run or activate the profile sql invalid record count on table data quality check.
Managing profile sql invalid record count on table check from DQOps shell
Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. The following example activates the warning rule with the default parameters.
dqo> check activate -c=connection_name -t=schema_name.table_name -ch=profile_sql_invalid_record_count_on_table --enable-warning
You can also use patterns to activate the check on all matching tables and columns.
dqo> check activate -c=connection_name -t=schema_prefix*.fact_* -ch=profile_sql_invalid_record_count_on_table --enable-warning
Additional rule parameters are passed using the `-Wrule_parameter_name=value` syntax.
Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. The following example activates the error rule with the default parameters.
dqo> check activate -c=connection_name -t=schema_name.table_name -ch=profile_sql_invalid_record_count_on_table --enable-error
You can also use patterns to activate the check on all matching tables and columns.
dqo> check activate -c=connection_name -t=schema_prefix*.fact_* -ch=profile_sql_invalid_record_count_on_table --enable-error
Additional rule parameters are passed using the `-Erule_parameter_name=value` syntax.
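For example, because this check uses the max_count rule, the error threshold could be raised when activating the check (the value 10 below is purely illustrative):

```
dqo> check activate -c=connection_name -t=schema_name.table_name -ch=profile_sql_invalid_record_count_on_table --enable-error -Emax_count=10
```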
Run this data quality check using the check run CLI command, providing the check name and all other targeting filters. The following example shows how to run the profile_sql_invalid_record_count_on_table check on a specific connection and table, using the connection name and the full table name parameters.
dqo> check run -c=connection_name -t=schema_name.table_name -ch=profile_sql_invalid_record_count_on_table
You can also run this check on all tables on which the profile_sql_invalid_record_count_on_table check is enabled using patterns to find tables.
YAML configuration
The sample schema_name.table_name.dqotable.yaml file with the check configured is shown below.
```yaml
# yaml-language-server: $schema=https://cloud.dqops.com/dqo-yaml-schema/TableYaml-schema.json
apiVersion: dqo/v1
kind: table
spec:
  profiling_checks:
    custom_sql:
      profile_sql_invalid_record_count_on_table:
        parameters:
          sql_query: |-
            SELECT age AS actual_value
            FROM customers
            WHERE age < 18
        warning:
          max_count: 0
        error:
          max_count: 10
        fatal:
          max_count: 100
  columns: {}
```
Samples of generated SQL queries for each data source type
Please expand the database engine name section to see the SQL query rendered by a Jinja2 template for the sql_invalid_record_count data quality sensor.
BigQuery
ClickHouse
Databricks
DB2
DuckDB
HANA
MariaDB
MySQL
Oracle
PostgreSQL
Presto
QuestDB
Redshift
Snowflake
Spark
SQL Server
Teradata
Trino
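The engine-specific queries were not captured in this copy of the page. Conceptually, though, the sql_invalid_record_count sensor wraps the user-provided query and counts the rows it returns; an illustrative (not engine-specific) rendering of the example configuration might look like:

```sql
-- Illustrative sketch only: the exact rendered SQL depends
-- on the database engine and the Jinja2 sensor template.
SELECT COUNT(*) AS actual_value
FROM (
    SELECT age AS actual_value
    FROM customers
    WHERE age < 18
) analyzed_table
```

The resulting actual_value is then compared against the max_count thresholds of the warning, error, and fatal rules.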
daily sql invalid record count on table
Check description
Runs a custom query that retrieves invalid records found in a table and returns their count, raising an issue if too many failures are detected. This check is useful for reusing (legacy) SQL queries that users have already written and tested in their own systems. For example, when this check is applied to an age column, the condition can find invalid records in which the age is lower than 18 using the SQL query: `SELECT age FROM {table} WHERE age < 18`.
| Data quality check name | Friendly name | Category | Check type | Time scale | Quality dimension | Sensor definition | Quality rule | Standard |
|---|---|---|---|---|---|---|---|---|
| daily_sql_invalid_record_count_on_table | Custom SELECT SQL that returns invalid records | custom_sql | monitoring | daily | Validity | sql_invalid_record_count | max_count | |
Command-line examples
Please expand the section below to see the DQOps command-line examples to run or activate the daily sql invalid record count on table data quality check.
Managing daily sql invalid record count on table check from DQOps shell
Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. The following example activates the warning rule with the default parameters.
dqo> check activate -c=connection_name -t=schema_name.table_name -ch=daily_sql_invalid_record_count_on_table --enable-warning
You can also use patterns to activate the check on all matching tables and columns.
dqo> check activate -c=connection_name -t=schema_prefix*.fact_* -ch=daily_sql_invalid_record_count_on_table --enable-warning
Additional rule parameters are passed using the `-Wrule_parameter_name=value` syntax.
Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. The following example activates the error rule with the default parameters.
dqo> check activate -c=connection_name -t=schema_name.table_name -ch=daily_sql_invalid_record_count_on_table --enable-error
You can also use patterns to activate the check on all matching tables and columns.
dqo> check activate -c=connection_name -t=schema_prefix*.fact_* -ch=daily_sql_invalid_record_count_on_table --enable-error
Additional rule parameters are passed using the `-Erule_parameter_name=value` syntax.
Run this data quality check using the check run CLI command, providing the check name and all other targeting filters. The following example shows how to run the daily_sql_invalid_record_count_on_table check on a specific connection and table, using the connection name and the full table name parameters.
dqo> check run -c=connection_name -t=schema_name.table_name -ch=daily_sql_invalid_record_count_on_table
You can also run this check on all tables on which the daily_sql_invalid_record_count_on_table check is enabled using patterns to find tables.
YAML configuration
The sample schema_name.table_name.dqotable.yaml file with the check configured is shown below.
```yaml
# yaml-language-server: $schema=https://cloud.dqops.com/dqo-yaml-schema/TableYaml-schema.json
apiVersion: dqo/v1
kind: table
spec:
  monitoring_checks:
    daily:
      custom_sql:
        daily_sql_invalid_record_count_on_table:
          parameters:
            sql_query: |-
              SELECT age AS actual_value
              FROM customers
              WHERE age < 18
          warning:
            max_count: 0
          error:
            max_count: 10
          fatal:
            max_count: 100
  columns: {}
```
Samples of generated SQL queries for each data source type
Please expand the database engine name section to see the SQL query rendered by a Jinja2 template for the sql_invalid_record_count data quality sensor.
BigQuery
ClickHouse
Databricks
DB2
DuckDB
HANA
MariaDB
MySQL
Oracle
PostgreSQL
Presto
QuestDB
Redshift
Snowflake
Spark
SQL Server
Teradata
Trino
monthly sql invalid record count on table
Check description
Runs a custom query that retrieves invalid records found in a table and returns their count, raising an issue if too many failures are detected. This check is useful for reusing (legacy) SQL queries that users have already written and tested in their own systems. For example, when this check is applied to an age column, the condition can find invalid records in which the age is lower than 18 using the SQL query: `SELECT age FROM {table} WHERE age < 18`.
| Data quality check name | Friendly name | Category | Check type | Time scale | Quality dimension | Sensor definition | Quality rule | Standard |
|---|---|---|---|---|---|---|---|---|
| monthly_sql_invalid_record_count_on_table | Custom SELECT SQL that returns invalid records | custom_sql | monitoring | monthly | Validity | sql_invalid_record_count | max_count | |
Command-line examples
Please expand the section below to see the DQOps command-line examples to run or activate the monthly sql invalid record count on table data quality check.
Managing monthly sql invalid record count on table check from DQOps shell
Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. The following example activates the warning rule with the default parameters.
dqo> check activate -c=connection_name -t=schema_name.table_name -ch=monthly_sql_invalid_record_count_on_table --enable-warning
You can also use patterns to activate the check on all matching tables and columns.
dqo> check activate -c=connection_name -t=schema_prefix*.fact_* -ch=monthly_sql_invalid_record_count_on_table --enable-warning
Additional rule parameters are passed using the `-Wrule_parameter_name=value` syntax.
Activate this data quality check using the check activate CLI command, providing the connection name, table name, check name, and all other filters. The following example activates the error rule with the default parameters.
dqo> check activate -c=connection_name -t=schema_name.table_name -ch=monthly_sql_invalid_record_count_on_table --enable-error
You can also use patterns to activate the check on all matching tables and columns.
dqo> check activate -c=connection_name -t=schema_prefix*.fact_* -ch=monthly_sql_invalid_record_count_on_table --enable-error
Additional rule parameters are passed using the `-Erule_parameter_name=value` syntax.
Run this data quality check using the check run CLI command, providing the check name and all other targeting filters. The following example shows how to run the monthly_sql_invalid_record_count_on_table check on a specific connection and table, using the connection name and the full table name parameters.
dqo> check run -c=connection_name -t=schema_name.table_name -ch=monthly_sql_invalid_record_count_on_table
You can also run this check on all tables on which the monthly_sql_invalid_record_count_on_table check is enabled using patterns to find tables.
YAML configuration
The sample schema_name.table_name.dqotable.yaml file with the check configured is shown below.
```yaml
# yaml-language-server: $schema=https://cloud.dqops.com/dqo-yaml-schema/TableYaml-schema.json
apiVersion: dqo/v1
kind: table
spec:
  monitoring_checks:
    monthly:
      custom_sql:
        monthly_sql_invalid_record_count_on_table:
          parameters:
            sql_query: |-
              SELECT age AS actual_value
              FROM customers
              WHERE age < 18
          warning:
            max_count: 0
          error:
            max_count: 10
          fatal:
            max_count: 100
  columns: {}
```
Samples of generated SQL queries for each data source type
Please expand the database engine name section to see the SQL query rendered by a Jinja2 template for the sql_invalid_record_count data quality sensor.
BigQuery
ClickHouse
Databricks
DB2
DuckDB
HANA
MariaDB
MySQL
Oracle
PostgreSQL
Presto
QuestDB
Redshift
Snowflake
Spark
SQL Server
Teradata
Trino
What's next
- Learn how to configure data quality checks in DQOps
- Look at the examples of running data quality checks, targeting tables and columns