check


dqo check run

Run data quality checks that match a given condition

Description

Run data quality checks on your dataset that match a given condition. The command outputs a table of results that provides insight into the data quality.
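For instance, assuming a connection named `warehouse` has already been imported (all object names here are illustrative), a basic filtered run from the DQO shell could look like this:

```
dqo> check run -c=warehouse -t=public.customers -ct=profiling
```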

Command-line synopsis

$ dqo [dqo options...] check run [-deh] [--daily-partitioning-include-today] [-fw] [-hl]
           [--monthly-partitioning-include-current-month] [-c=<connection>]
           [-cat=<checkCategory>] [-ch=<check>] [-col=<column>]
           [-ct=<checkType>]
           [--daily-partitioning-recent-days=<dailyPartitioningRecentDays>]
           [-f=<failAt>] [--from-date=<fromDate>]
           [--from-date-time=<fromDateTime>]
           [--from-date-time-offset=<fromDateTimeOffset>] [-m=<mode>]
           [--monthly-partitioning-recent-months=<monthlyPartitioningRecentMonths>]
           [-of=<outputFormat>] [-s=<sensor>] [-t=<table>]
           [--to-date=<toDate>] [--to-date-time=<toDateTime>]
           [--to-date-time-offset=<toDateTimeOffset>] [-ts=<timeScale>]
           [-l=<labels>]... [-tag=<tags>]...
DQO Shell synopsis
dqo> check run [-deh] [--daily-partitioning-include-today] [-fw] [-hl]
           [--monthly-partitioning-include-current-month] [-c=<connection>]
           [-cat=<checkCategory>] [-ch=<check>] [-col=<column>]
           [-ct=<checkType>]
           [--daily-partitioning-recent-days=<dailyPartitioningRecentDays>]
           [-f=<failAt>] [--from-date=<fromDate>]
           [--from-date-time=<fromDateTime>]
           [--from-date-time-offset=<fromDateTimeOffset>] [-m=<mode>]
           [--monthly-partitioning-recent-months=<monthlyPartitioningRecentMonths>]
           [-of=<outputFormat>] [-s=<sensor>] [-t=<table>]
           [--to-date=<toDate>] [--to-date-time=<toDateTime>]
           [--to-date-time-offset=<toDateTimeOffset>] [-ts=<timeScale>]
           [-l=<labels>]... [-tag=<tags>]...

Options

| Command argument | Description | Required | Accepted values |
|------------------|-------------|----------|-----------------|
| `-cat`, `--category` | Check category name (volume, nulls, numeric, etc.) | | |
| `-ch`, `--check` | Data quality check name, supports patterns like `*_id` | | |
| `-ct`, `--check-type` | Data quality check type (profiling, recurring, partitioned) | | profiling, recurring, partitioned |
| `-col`, `--column` | Column name, supports patterns like `*_id` | | |
| `-c`, `--connection` | Connection name, supports patterns like `conn*` | | |
| `--daily-partitioning-include-today` | Also analyze today and later days when running daily partitioned checks. By default, daily partitioned checks do not analyze today or future dates. Setting this to true disables filtering of the end dates. | | |
| `--daily-partitioning-recent-days` | The number of recent days to analyze incrementally by daily partitioned data quality checks. | | |
| `-tag`, `--data-stream-level-tag` | Data stream hierarchy level filter (tag) | | |
| `-d`, `--dummy` | Runs data quality checks in dummy mode: sensors are not executed on the target database, but the rest of the process is performed. | | |
| `-e`, `--enabled` | Runs only enabled or only disabled sensors; by default, only enabled sensors are executed. | | |
| `-f`, `--fail-at` | Lowest data quality issue severity level (warning, error, fatal) that causes the command to return an error code. Use `none` to always return a success code. | | warning, error, fatal, none |
| `-fw`, `--file-write` | Write the command response to a file | | |
| `--from-date` | Analyze the data since the given date (inclusive). The date should be an ISO 8601 date (yyyy-MM-dd). The analyzed table must have a properly configured timestamp column; it is the column used for filtering the date and time ranges. Setting the beginning date overrides recent days and recent months. | | |
| `--from-date-time` | Analyze the data since the given date and time (inclusive). The date and time should be an ISO 8601 local date and time without a time zone (yyyy-MM-dd HH:mm:ss). The analyzed table must have a properly configured timestamp column; it is the column used for filtering the date and time ranges. Setting the beginning date and time overrides recent days and recent months. | | |
| `--from-date-time-offset` | Analyze the data since the given date and time with a time zone offset (inclusive). The date and time should be an ISO 8601 date and time followed by a time zone offset (yyyy-MM-dd HH:mm:ss), for example: 2023-02-20 14:10:00+02. The analyzed table must have a properly configured timestamp column; it is the column used for filtering the date and time ranges. Setting the beginning date and time overrides recent days and recent months. | | |
| `-hl`, `--headless` | Run the command in headless mode (no user input allowed) | | |
| `-h`, `--help` | Show the help for the command and its parameters | | |
| `-l`, `--label` | Label filter | | |
| `-m`, `--mode` | Reporting mode (silent, summary, info, debug) | | silent, summary, info, debug |
| `--monthly-partitioning-include-current-month` | Also analyze the current month and later months when running monthly partitioned checks. By default, monthly partitioned checks do not analyze the current month or future months. Setting this to true disables filtering of the end dates. | | |
| `--monthly-partitioning-recent-months` | The number of recent months to analyze incrementally by monthly partitioned data quality checks. | | |
| `-of`, `--output-format` | Output format for tabular responses | | TABLE, CSV, JSON |
| `-s`, `--sensor` | Data quality sensor name (sensor definition or sensor name), supports patterns like `table/validity/*` | | |
| `-t`, `--table` | Full table name (schema.table), supports patterns like `sch.tab` | | |
| `-ts`, `--time-scale` | Time scale for recurring and partitioned checks (daily, monthly, etc.) | | daily, monthly |
| `--to-date` | Analyze the data until the given date (exclusive; the given date and following dates are not analyzed). The date should be an ISO 8601 date (yyyy-MM-dd). The analyzed table must have a properly configured timestamp column; it is the column used for filtering the date and time ranges. Setting the end date overrides the parameters that disable analyzing today or the current month. | | |
| `--to-date-time` | Analyze the data until the given date and time (exclusive). The date and time should be an ISO 8601 local date and time without a time zone (yyyy-MM-dd HH:mm:ss). The analyzed table must have a properly configured timestamp column; it is the column used for filtering the date and time ranges. Setting the end date and time overrides the parameters that disable analyzing today or the current month. | | |
| `--to-date-time-offset` | Analyze the data until the given date and time with a time zone offset (exclusive). The date and time should be an ISO 8601 date and time followed by a time zone offset (yyyy-MM-dd HH:mm:ss), for example: 2023-02-20 14:10:00+02. The analyzed table must have a properly configured timestamp column; it is the column used for filtering the date and time ranges. Setting the end date and time overrides the parameters that disable analyzing today or the current month. | | |
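To illustrate how the date-window options combine (the connection and table names are hypothetical), the first command below runs daily partitioned checks over a fixed window with an exclusive end date, and the second makes the command's exit code fail only when a fatal issue is detected:

```
$ dqo check run -c=warehouse -t=public.orders -ct=partitioned -ts=daily --from-date=2023-01-01 --to-date=2023-02-01
$ dqo check run -c=warehouse --fail-at=fatal
```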

dqo check enable

Description

Enable data quality checks matching specified filters

Command-line synopsis

$ dqo [dqo options...] check enable [-hno] [-fw] [-hl] [-c=<connection>] [-cat=<checkCategory>]
              [-ch=<check>] [-col=<column>] [-ct=<checkType>]
              [-dt=<datatypeFilter>] [-of=<outputFormat>] [-sn=<sensor>]
              [-t=<table>] [-ts=<timeScale>] [-E=<String=String>]...
              [-F=<String=String>]... [-S=<String=String>]...
              [-W=<String=String>]...
DQO Shell synopsis
dqo> check enable [-hno] [-fw] [-hl] [-c=<connection>] [-cat=<checkCategory>]
              [-ch=<check>] [-col=<column>] [-ct=<checkType>]
              [-dt=<datatypeFilter>] [-of=<outputFormat>] [-sn=<sensor>]
              [-t=<table>] [-ts=<timeScale>] [-E=<String=String>]...
              [-F=<String=String>]... [-S=<String=String>]...
              [-W=<String=String>]...

Options

| Command argument | Description | Required | Accepted values |
|------------------|-------------|----------|-----------------|
| `-cat`, `--category` | Check category name (standard, nulls, numeric, etc.) | | |
| `-ch`, `--check` | Data quality check name, supports patterns like `*_id` | | |
| `-ct`, `--check-type` | Data quality check type (profiling, recurring, partitioned) | | profiling, recurring, partitioned |
| `-col`, `--column` | Column name, supports patterns like `*_id` | | |
| `-c`, `--connection` | Connection name, supports patterns like `conn*` | | |
| `-dt`, `--data-type` | Data type of the columns on which to enable checks. | | |
| `-E`, `--error-rule` | Error level rule options. Usage: `-E<rule_name>=<rule_value>`, `--error-rule=<rule_name>=<rule_value>` | | |
| `-F`, `--fatal-rule` | Fatal level rule options. Usage: `-F<rule_name>=<rule_value>`, `--fatal-rule=<rule_name>=<rule_value>` | | |
| `-fw`, `--file-write` | Write the command response to a file | | |
| `-hl`, `--headless` | Run the command in headless mode (no user input allowed) | | |
| `-h`, `--help` | Show the help for the command and its parameters | | |
| `-n`, `--nullable` | Enable the checks only on nullable columns (false selects explicitly non-nullable columns). | | |
| `-of`, `--output-format` | Output format for tabular responses | | TABLE, CSV, JSON |
| `-o`, `--override` | Override the existing configuration of the selected checks. | | |
| `-sn`, `--sensor` | Data quality sensor name (sensor definition or sensor name), supports patterns like `table/validity/*` | | |
| `-S`, `--sensor-param` | Configuration parameters for the sensor. Usage: `-S<param_name>=<param_value>`, `--sensor-param=<param_name>=<param_value>` | | |
| `-t`, `--table` | Full table name (schema.table), supports patterns like `sch.tab` | | |
| `-ts`, `--time-scale` | Time scale for recurring and partitioned checks (daily, monthly, etc.) | | daily, monthly |
| `-W`, `--warning-rule` | Warning level rule options. Usage: `-W<rule_name>=<rule_value>`, `--warning-rule=<rule_name>=<rule_value>` | | |
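As a sketch of how the key=value rule options fit together (the check name `row_count` and the rule parameter `min_count` are illustrative; consult the sensor reference for the parameters your check actually accepts), the following enables a check with separate warning and fatal thresholds:

```
$ dqo check enable -c=warehouse -ct=profiling -ch=row_count --warning-rule=min_count=10 --fatal-rule=min_count=1
```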

dqo check disable

Description

Disable data quality checks matching specified filters

Command-line synopsis

$ dqo [dqo options...] check disable [-hn] [-fw] [-hl] [-c=<connection>] [-cat=<checkCategory>]
               [-ch=<check>] [-col=<column>] [-ct=<checkType>]
               [-dt=<datatypeFilter>] [-of=<outputFormat>] [-s=<sensor>]
               [-t=<table>] [-ts=<timeScale>]
DQO Shell synopsis
dqo> check disable [-hn] [-fw] [-hl] [-c=<connection>] [-cat=<checkCategory>]
               [-ch=<check>] [-col=<column>] [-ct=<checkType>]
               [-dt=<datatypeFilter>] [-of=<outputFormat>] [-s=<sensor>]
               [-t=<table>] [-ts=<timeScale>]

Options

| Command argument | Description | Required | Accepted values |
|------------------|-------------|----------|-----------------|
| `-cat`, `--category` | Check category name (standard, nulls, numeric, etc.) | | |
| `-ch`, `--check` | Data quality check name, supports patterns like `*_id` | | |
| `-ct`, `--check-type` | Data quality check type (profiling, recurring, partitioned) | | profiling, recurring, partitioned |
| `-col`, `--column` | Column name, supports patterns like `*_id` | | |
| `-c`, `--connection` | Connection name, supports patterns like `conn*` | | |
| `-dt`, `--data-type` | Data type of the columns on which to disable checks. | | |
| `-fw`, `--file-write` | Write the command response to a file | | |
| `-hl`, `--headless` | Run the command in headless mode (no user input allowed) | | |
| `-h`, `--help` | Show the help for the command and its parameters | | |
| `-n`, `--nullable` | Disable the checks only on nullable columns (false selects explicitly non-nullable columns). | | |
| `-of`, `--output-format` | Output format for tabular responses | | TABLE, CSV, JSON |
| `-s`, `--sensor` | Data quality sensor name (sensor definition or sensor name), supports patterns like `table/validity/*` | | |
| `-t`, `--table` | Full table name (schema.table), supports patterns like `sch.tab` | | |
| `-ts`, `--time-scale` | Time scale for recurring and partitioned checks (daily, monthly, etc.) | | daily, monthly |
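For example (all object names are illustrative), the same filters can narrow a disable operation down to a single column and check category:

```
$ dqo check disable -c=warehouse -t=public.orders -col=customer_id -cat=nulls -ct=recurring
```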