Last updated: July 22, 2025

DQOps REST API common models reference

The reference documentation for all objects used as shared REST API models across all operations is listed below.

CheckType

Enumeration of data quality check types: profiling, monitoring, partitioned.

The structure of this object is described below

| Data type | Enum values |
|-----------|-------------|
| string | profiling, monitoring, partitioned |

CheckTimeScale

Enumeration of time scales of monitoring and partitioned data quality checks (daily, monthly, etc.).

The structure of this object is described below

| Data type | Enum values |
|-----------|-------------|
| string | daily, monthly |

FieldModel

Model of a single field that is used to edit a parameter value for a sensor or a rule. Describes the type of the field and the current value.

The structure of this object is described below

| Property name | Description | Data type |
|---------------|-------------|-----------|
| definition | Field name that matches the field name (snake_case) used in the YAML specification. | ParameterDefinitionSpec |
| optional | Field value is optional and may be null; when false, the field is required and must be filled. | boolean |
| string_value | Field value for a string field. | string |
| boolean_value | Field value for a boolean field. | boolean |
| integer_value | Field value for an integer (32-bit) field. | integer |
| long_value | Field value for a long (64-bit) field. | long |
| double_value | Field value for a double field. | double |
| datetime_value | Field value for a date time field. | datetime |
| column_name_value | Field value for a column name field. | string |
| enum_value | Field value for an enum (choice) field. | string |
| string_list_value | Field value for an array (list) of strings. | List[string] |
| integer_list_value | Field value for an array (list) of integers, using 64-bit integers. | List[integer] |
| date_value | Field value for a date. | date |
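
The FieldModel acts as a tagged union: only the value field that matches the parameter's data type is populated. A minimal sketch of a serialized FieldModel for a string parameter, expressed as a Python dictionary (the parameter name, the value, and the contents of the definition object are hypothetical):

```python
# Hypothetical FieldModel payload for a string parameter.
# Only the value field matching the parameter's data type
# (here string_value) carries a value; the other *_value
# fields are absent or null.
field_model = {
    "definition": {                  # assumed ParameterDefinitionSpec content
        "field_name": "expected_value",
        "data_type": "string",
    },
    "optional": False,
    "string_value": "USD",
}
```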

RuleParametersModel

Model that returns the form definition and the form data to edit parameters (thresholds) for a rule at a single severity level (low, medium, high).

The structure of this object is described below

| Property name | Description | Data type |
|---------------|-------------|-----------|
| rule_name | Full rule name. This field is for information purposes and can be used to create additional custom checks that reuse the same data quality rule. | string |
| rule_parameters | List of fields for editing the rule parameters like thresholds. | List[FieldModel] |
| disabled | Disable the rule. The rule will not be evaluated. The sensor will also not be executed if it has no enabled rules. | boolean |
| configured | Returns true when the rule is configured (is not null), so it should be shown in the UI as configured (having values). | boolean |

CheckConfigurationModel

Model containing fundamental configuration of a single data quality check.

The structure of this object is described below

| Property name | Description | Data type |
|---------------|-------------|-----------|
| connection_name | Connection name. | string |
| schema_name | Schema name. | string |
| table_name | Table name. | string |
| column_name | Column name, if the check is set up on a column. | string |
| check_target | Check target (table or column). | CheckTarget |
| check_type | Check type (profiling, monitoring, partitioned). | CheckType |
| check_time_scale | Check timescale (for monitoring and partitioned checks). | CheckTimeScale |
| category_name | Category to which this check belongs. | string |
| check_name | Check name that is used in the YAML file. | string |
| sensor_parameters | List of fields for editing the sensor parameters. | List[FieldModel] |
| table_level_filter | SQL WHERE clause added to the sensor query for every check on this table. | string |
| sensor_level_filter | SQL WHERE clause added to the sensor query for this check. | string |
| warning | Rule parameters for the warning severity rule. | RuleParametersModel |
| error | Rule parameters for the error severity rule. | RuleParametersModel |
| fatal | Rule parameters for the fatal severity rule. | RuleParametersModel |
| disabled | Whether the check has been disabled. | boolean |
| configured | Whether the check is configured (not null). | boolean |

CheckListModel

Simplistic model that returns a single data quality check, its name and "configured" flag.

The structure of this object is described below

| Property name | Description | Data type |
|---------------|-------------|-----------|
| check_category | Check category. | string |
| check_name | Data quality check name that is used in YAML. | string |
| help_text | Help text that describes the data quality check. | string |
| configured | True if the data quality check is configured (not null). When saving the data quality check configuration, set the flag to true for storing the check. | boolean |

CheckContainerListModel

Simplistic model that returns the list of data quality checks, their names, categories and "configured" flag.

The structure of this object is described below

| Property name | Description | Data type |
|---------------|-------------|-----------|
| checks | Simplistic list of all data quality checks. | List[CheckListModel] |
| can_edit | Boolean flag that decides if the current user can edit the check. | boolean |
| can_run_checks | Boolean flag that decides if the current user can run checks. | boolean |
| can_delete_data | Boolean flag that decides if the current user can delete data (results). | boolean |

RuleThresholdsModel

Model that returns the form definition and the form data to edit a single rule with all three threshold levels (low, medium, high).

The structure of this object is described below

| Property name | Description | Data type |
|---------------|-------------|-----------|
| error | Rule parameters for the error severity rule. | RuleParametersModel |
| warning | Rule parameters for the warning severity rule. | RuleParametersModel |
| fatal | Rule parameters for the fatal severity rule. | RuleParametersModel |
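
To make the relationship between RuleThresholdsModel and RuleParametersModel concrete, here is a sketch of a thresholds payload; the rule name "comparison/max_percent" and its values are illustrative only, and the rule_parameters entries are FieldModel objects trimmed to a single value field:

```python
# Hypothetical RuleThresholdsModel payload: one RuleParametersModel per
# severity level; only the configured levels carry parameter values.
rule_thresholds = {
    "warning": {
        "rule_name": "comparison/max_percent",
        "rule_parameters": [{"double_value": 1.0}],
        "disabled": False,
        "configured": True,
    },
    "error": {
        "rule_name": "comparison/max_percent",
        "rule_parameters": [{"double_value": 5.0}],
        "disabled": False,
        "configured": True,
    },
    "fatal": {
        "configured": False,  # the fatal threshold is left unconfigured
    },
}
```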

DefaultRuleSeverityLevel

Default rule severity levels. Matches the severity level name (warning - 1, error - 2, fatal - 3) with a numeric level.

The structure of this object is described below

| Data type | Enum values |
|-----------|-------------|
| string | none, warning, error, fatal |

CronScheduleSpec

Cron job schedule specification.

The structure of this object is described below

| Property name | Description | Data type |
|---------------|-------------|-----------|
| cron_expression | Unix-style cron expression that specifies when to execute scheduled operations like running data quality checks or synchronizing the configuration with the cloud. | string |
| disabled | Disables the schedule. When the value of this 'disabled' field is true, the schedule is stored in the metadata but is not activated to run data quality checks. | boolean |
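
For example, a CronScheduleSpec that runs scheduled operations every day at 8:00 AM could look as follows; the cron expression uses the standard five-field Unix syntax (minute, hour, day of month, month, day of week):

```python
# CronScheduleSpec payload: run every day at 08:00, schedule active.
schedule_spec = {
    "cron_expression": "0 8 * * *",
    "disabled": False,
}
```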

CheckRunScheduleGroup

The run check scheduling group (profiling, daily checks, monthly checks, etc.), which identifies the configuration of a schedule (cron expression) used to schedule these checks on the job scheduler.

The structure of this object is described below

| Data type | Enum values |
|-----------|-------------|
| string | profiling, monitoring_daily, monitoring_monthly, partitioned_daily, partitioned_monthly |

EffectiveScheduleLevelModel

Enumeration of possible levels at which a schedule can be configured.

The structure of this object is described below

| Data type | Enum values |
|-----------|-------------|
| string | connection, table_override, check_override |

EffectiveScheduleModel

Model of a configured schedule (on connection or table) or schedule override (on check). Describes the CRON expression and the time of the upcoming execution, as well as the duration until this time.

The structure of this object is described below

| Property name | Description | Data type |
|---------------|-------------|-----------|
| schedule_group | Field value for a schedule group to which this schedule belongs. | CheckRunScheduleGroup |
| schedule_level | Field value for the level at which the schedule has been configured. | EffectiveScheduleLevelModel |
| cron_expression | Field value for a CRON expression defining the scheduling. | string |
| disabled | Field value stating if the schedule has been explicitly disabled. | boolean |

ScheduleEnabledStatusModel

Enumeration of possible ways a schedule can be configured.

The structure of this object is described below

| Data type | Enum values |
|-----------|-------------|
| string | enabled, disabled, not_configured, overridden_by_checks |

CommentSpec

Comment entry. Comments are added when a change was made and the change should be recorded in a persisted format.

The structure of this object is described below

| Property name | Description | Data type |
|---------------|-------------|-----------|
| date | Comment date and time. | datetime |
| comment_by | Commented by. | string |
| comment | Comment text. | string |

CommentsListSpec

List of comments.

The structure of this object is described below

| Property name | Description | Data type |
|---------------|-------------|-----------|
| self | | List[CommentSpec] |
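
Judging from the single self entry above, CommentsListSpec appears to serialize as a plain array of CommentSpec objects rather than a wrapper object; a sketch under that assumption, with hypothetical comment content:

```python
# Assuming CommentsListSpec serializes as a plain list of CommentSpec objects.
comments = [
    {
        "date": "2025-07-22T10:15:00",
        "comment_by": "data_steward",
        "comment": "Raised the error threshold after the Q3 backfill.",
    },
]
```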

CheckSearchFilters

Target data quality checks filter, identifies which checks on which tables and columns should be executed.

The structure of this object is described below

| Property name | Description | Data type |
|---------------|-------------|-----------|
| column | The column name. This field accepts search patterns in the format: 'fk_*', '*_id', 'prefix*suffix'. | string |
| column_data_type | The column data type that was imported from the data source and is stored in the columns -> column_name -> type_snapshot -> column_type field in the .dqotable.yaml file. | string |
| column_nullable | Optional filter to find only nullable (when the value is true) or not nullable (when the value is false) columns, based on the value of the columns -> column_name -> type_snapshot -> nullable field in the .dqotable.yaml file. | boolean |
| check_target | The target type of object to run checks. Supported values are: table to run only table level checks or column to run only column level checks. | CheckTarget |
| check_type | The target type of checks to run. Supported values are profiling, monitoring and partitioned. | CheckType |
| time_scale | The time scale of monitoring or partitioned checks to run. Supports running only daily or monthly checks. Daily monitoring checks will replace today's value for all captured check results. | CheckTimeScale |
| check_category | The target check category, for example: nulls, volume, anomaly. | string |
| quality_dimension | The target data quality dimension, for example: Completeness, Accuracy, Consistency, Timeliness, Availability. | string |
| table_comparison_name | The name of a configured table comparison. When the table comparison is provided, DQOps will only perform table comparison checks that compare data between tables. | string |
| check_name | The target check name to run only this named check. Uses the short check name, which is the name of the deepest folder in the checks folder. This field supports search patterns such as: 'profiling_*', '*count', 'profiling*_percent'. | string |
| sensor_name | The target sensor name to run only data quality checks that are using this sensor. Uses the full sensor name, which is the full folder path within the sensors folder. This field supports search patterns such as: 'table/volume/row_*', '*count', 'table/volume/prefix*_suffix'. | string |
| connection | The connection (data source) name. Supports search patterns in the format: 'source*', '*_prod', 'prefix*suffix'. | string |
| full_table_name | The schema and table name. It is provided in the form `<schema>.<table>`, for example public.fact_sales. The schema and table name accept patterns both in the schema name and table name parts. Sample patterns are: 'schema_name.tab_prefix_*', 'schema_name.*', '*.*', 'schema_name.*customer', 'schema_name.tab*_suffix'. | string |
| enabled | A boolean flag to target enabled tables, columns or checks. When the value of this field is not set, the default value of this field is true, targeting only tables, columns and checks that are not implicitly disabled. | boolean |
| max_results | Optional limit for the maximum number of results to return. | integer |
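
A sketch of a CheckSearchFilters payload that targets daily monitoring checks from the nulls category on every column ending with _id in one schema; the connection name and patterns are hypothetical:

```python
# Hypothetical CheckSearchFilters payload: run daily monitoring "nulls"
# checks on *_id columns of every table in the "public" schema.
check_search_filters = {
    "connection": "sales_dwh",      # hypothetical connection name
    "full_table_name": "public.*",  # pattern: all tables in the schema
    "column": "*_id",               # pattern: columns ending with _id
    "check_type": "monitoring",
    "time_scale": "daily",
    "check_category": "nulls",
    "enabled": True,
}
```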

CheckTargetModel

Enumeration of possible targets for check model request result.

The structure of this object is described below

| Data type | Enum values |
|-----------|-------------|
| string | table, column |

SimilarCheckModel

Describes a single check that is similar to other checks in other check types.

The structure of this object is described below

| Property name | Description | Data type |
|---------------|-------------|-----------|
| check_target | The check target (table or column). | CheckTarget |
| check_type | The check type. | CheckType |
| time_scale | The time scale (daily, monthly). The time scale is optional and can be null (for profiling checks). | CheckTimeScale |
| category | The check's category. | string |
| check_name | Similar check name in another category. | string |

CheckModel

Model that returns the form definition and the form data to edit a single data quality check.

The structure of this object is described below

| Property name | Description | Data type |
|---------------|-------------|-----------|
| check_name | Data quality check name that is used in YAML. | string |
| help_text | Help text that describes the data quality check. | string |
| display_name | User assigned display name that is shown instead of the original data quality check name. | string |
| friendly_name | An alternative check name that is shown in the check editor as a hint. | string |
| sensor_parameters | List of fields for editing the sensor parameters. | List[FieldModel] |
| sensor_name | Full sensor name. This field is for information purposes and can be used to create additional custom checks that reuse the same data quality sensor. | string |
| quality_dimension | Data quality dimension used for tagging the results of this data quality check. | string |
| rule | Threshold (alerting) rules defined for a check. | RuleThresholdsModel |
| supports_error_sampling | The data quality check supports capturing error samples, because an error sampling template is defined. | boolean |
| supports_grouping | The data quality check supports a custom data grouping configuration. | boolean |
| standard | This is a standard data quality check that is always shown on the data quality checks editor screen. Non-standard data quality checks (when the value is false) are advanced checks that are shown when the user decides to expand the list of checks. | boolean |
| default_check | This is a check that was applied on-the-fly, because it is configured as a default data observability check and can be run, but it is not configured in the table YAML. | boolean |
| default_severity | The severity level (warning, error, fatal) for the default rule that is activated in the data quality check editor when the check is enabled. | DefaultRuleSeverityLevel |
| data_grouping_override | Data grouping configuration for this check. When a data grouping configuration is assigned at a check level, it overrides the data grouping configuration from the table level. Data grouping is configured in two cases: (1) the data in the table should be analyzed with a GROUP BY condition, to analyze different groups of rows using separate time series, for example a table contains data from multiple countries and there is a 'country' column used for partitioning; (2) a static data grouping configuration is assigned to a table, when the data is partitioned at a table level (similar tables store the same information, but for different countries, etc.). | DataGroupingConfigurationSpec |
| schedule_override | Run check scheduling configuration. Specifies the schedule (a cron expression) when the data quality checks are executed by the scheduler. | CronScheduleSpec |
| effective_schedule | Model of the configured schedule enabled on the check level. | EffectiveScheduleModel |
| schedule_enabled_status | State of the scheduling override for this check. | ScheduleEnabledStatusModel |
| comments | Comments for change tracking. Please put comments in this collection because YAML comments may be removed when the YAML file is modified by the tool (serialization and deserialization will remove non-tracked comments). | CommentsListSpec |
| disabled | Disables the data quality check. Only enabled checks are executed. The sensor should be disabled if it should not work, but the configuration of the sensor and rules should be preserved in the configuration. | boolean |
| exclude_from_kpi | Data quality check results (alerts) are included in the data quality KPI calculation by default. Set this field to true in order to exclude this data quality check from the data quality KPI calculation. | boolean |
| include_in_sla | Marks the data quality check as part of a data quality SLA (Data Contract). The data quality SLA is a set of critical data quality checks that must always pass and are considered as a Data Contract for the dataset. | boolean |
| configured | True if the data quality check is configured (not null). When saving the data quality check configuration, set the flag to true for storing the check. | boolean |
| filter | SQL WHERE clause added to the sensor query. Both the table level filter and a sensor query filter are added, separated by an AND operator. | string |
| run_checks_job_template | Configured parameters for the "check run" job that should be pushed to the job queue in order to start the job. | CheckSearchFilters |
| data_clean_job_template | Configured parameters for the "data clean" job that, after being supplied with a time range, should be pushed to the job queue in order to remove stored results connected with this check. | DeleteStoredDataQueueJobParameters |
| data_grouping_configuration | The name of a data grouping configuration defined at a table that should be used for this check. | string |
| always_collect_error_samples | Forces collecting error samples for this check whenever it fails, even if it is a monitoring check that is run by a scheduler, and running an additional query to collect error samples will impose additional load on the data source. | boolean |
| do_not_schedule | Disables running this check by a DQOps CRON scheduler. When a check is disabled from scheduling, it can only be triggered from the user interface or by submitting a "run checks" job. | boolean |
| check_target | Type of the check's target (column, table). | CheckTargetModel |
| configuration_requirements_errors | List of configuration errors that must be fixed before the data quality check can be executed. | List[string] |
| similar_checks | List of similar checks in other check types or in other time scales. | List[SimilarCheckModel] |
| check_hash | The check hash code that identifies the check instance. | long |
| can_edit | Boolean flag that decides if the current user can edit the check. | boolean |
| can_run_checks | Boolean flag that decides if the current user can run checks. | boolean |
| can_delete_data | Boolean flag that decides if the current user can delete data (results). | boolean |
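
To show how the nested models fit together, here is a heavily trimmed, hypothetical CheckModel payload; real responses carry many more of the fields listed above:

```python
# Trimmed, hypothetical CheckModel payload. The check name, rule and
# schedule values are illustrative; the nested objects follow the
# RuleThresholdsModel and CronScheduleSpec shapes described earlier.
check_model = {
    "check_name": "daily_nulls_percent",
    "quality_dimension": "Completeness",
    "rule": {
        "error": {
            "rule_name": "comparison/max_percent",
            "rule_parameters": [{"double_value": 5.0}],
            "configured": True,
        },
    },
    "schedule_override": {"cron_expression": "0 8 * * *", "disabled": False},
    "configured": True,
    "disabled": False,
    "can_edit": True,
}
```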

QualityCategoryModel

Model that returns the form definition and the form data to edit all checks within a single category.

The structure of this object is described below

| Property name | Description | Data type |
|---------------|-------------|-----------|
| category | Data quality check category name. | string |
| comparison_name | The name of the reference table configuration used for a cross table data comparison (when the category is 'comparisons'). | string |
| compare_to_column | The name of the column in the reference table that is compared. | string |
| help_text | Help text that describes the category. | string |
| checks | List of data quality checks within the category. | List[CheckModel] |
| run_checks_job_template | Configured parameters for the "check run" job that should be pushed to the job queue in order to start the job. | CheckSearchFilters |
| data_clean_job_template | Configured parameters for the "data clean" job that, after being supplied with a time range, should be pushed to the job queue in order to remove stored results connected with this quality category. | DeleteStoredDataQueueJobParameters |

CheckContainerModel

Model that returns the form definition and the form data to edit all data quality checks divided by categories.

The structure of this object is described below

| Property name | Description | Data type |
|---------------|-------------|-----------|
| categories | List of all data quality categories that contain data quality checks inside. | List[QualityCategoryModel] |
| effective_schedule | Model of the configured schedule enabled on the check container. | EffectiveScheduleModel |
| effective_schedule_enabled_status | State of the effective scheduling on the check container. | ScheduleEnabledStatusModel |
| partition_by_column | The name of the column that partitioned checks will use for the time period partitioning. Important only for partitioned checks. | string |
| run_checks_job_template | Configured parameters for the "check run" job that should be pushed to the job queue in order to start the job. | CheckSearchFilters |
| data_clean_job_template | Configured parameters for the "data clean" job that, after being supplied with a time range, should be pushed to the job queue in order to remove stored results connected with this check container. | DeleteStoredDataQueueJobParameters |
| can_edit | Boolean flag that decides if the current user can edit the check. | boolean |
| can_run_checks | Boolean flag that decides if the current user can run checks. | boolean |
| can_delete_data | Boolean flag that decides if the current user can delete data (results). | boolean |

CheckContainerTypeModel

Model identifying the check type and timescale of checks belonging to a container.

The structure of this object is described below

| Property name | Description | Data type |
|---------------|-------------|-----------|
| check_type | Check type. | CheckType |
| check_time_scale | Check timescale. | CheckTimeScale |

CheckTemplate

Model depicting a named data quality check that can potentially be enabled, regardless of its position in the hierarchy tree.

The structure of this object is described below

| Property name | Description | Data type |
|---------------|-------------|-----------|
| check_target | Check target (table, column). | CheckTarget |
| check_category | Data quality check category. | string |
| check_name | Data quality check name that is used in YAML. | string |
| help_text | Help text that describes the data quality check. | string |
| check_container_type | Check type with time-scale. | CheckContainerTypeModel |
| sensor_name | Full sensor name. | string |
| check_model | Template of the check model with the sensor parameters and rule parameters. | CheckModel |
| sensor_parameters_definitions | List of sensor parameter field definitions. | List[ParameterDefinitionSpec] |
| rule_parameters_definitions | List of threshold (alerting) rule's parameter definitions (for a single rule, regardless of severity). | List[ParameterDefinitionSpec] |

PhysicalTableName

Physical table name that is a combination of a schema name and a physical table name (without any quoting or escaping).

The structure of this object is described below

| Property name | Description | Data type |
|---------------|-------------|-----------|
| schema_name | Schema name. | string |
| table_name | Table name. | string |

RuleSeverityLevel

Rule severity levels. Matches the severity level name (warning - 1, error - 2, fatal - 3) with a numeric level.

The structure of this object is described below

| Data type | Enum values |
|-----------|-------------|
| string | valid, warning, error, fatal |

CheckResultStatus

Enumeration of check execution statuses. It is the highest severity of the results, or execution_error if the sensor cannot be executed due to a configuration issue.

The structure of this object is described below

| Data type | Enum values |
|-----------|-------------|
| string | valid, warning, error, fatal, execution_error |

CheckCurrentDataQualityStatusModel

The most recent data quality status for a single data quality check. If data grouping is enabled, this model will return the highest data quality issue status from all data quality results for all data groups.

The structure of this object is described below

| Property name | Description | Data type |
|---------------|-------------|-----------|
| current_severity | The data quality issue severity for this data quality check. An additional value execution_error is used to tell that the check, sensor or rule failed to execute due to insufficient permissions to the table or an error in the sensor's template or a Python rule. For partitioned checks, it is the highest severity of all results for all partitions (time periods) in the analyzed time range. | CheckResultStatus |
| highest_historical_severity | The highest severity of previous executions of this data quality issue in the analyzed time range. It can be different from the current_severity if the data quality issue was solved and the most recent execution no longer detects it. For partitioned checks, this field returns the same value as the current_severity, because data quality issues in older partitions are still valid. | RuleSeverityLevel |
| column_name | Optional column name for column-level data quality checks. | string |
| check_type | The check type: profiling, monitoring, partitioned. | CheckType |
| time_scale | The check time scale for monitoring and partitioned check types. The time scales are daily and monthly. The profiling checks do not have a time scale. | CheckTimeScale |
| category | Check category name, such as nulls, schema, strings, volume. | string |
| quality_dimension | Data quality dimension, such as Completeness, Uniqueness, Validity. | string |
| executed_checks | The total number of most recent checks that were executed on the column. Table comparison checks that are comparing groups of data are counted as the number of compared data groups. | integer |
| valid_results | The number of most recent valid data quality checks that passed without raising any issues. | integer |
| warnings | The number of most recent data quality checks that failed by raising a warning severity data quality issue. | integer |
| errors | The number of most recent data quality checks that failed by raising an error severity data quality issue. | integer |
| fatals | The number of most recent data quality checks that failed by raising a fatal severity data quality issue. | integer |
| execution_errors | The number of data quality check execution errors that were reported due to access issues to the data source, invalid mapping in DQOps, invalid queries in data quality sensors or invalid Python rules. When an execution error is reported, the configuration of a data quality check on a column must be updated. | integer |
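
A sketch of a CheckCurrentDataQualityStatusModel for a column-level daily monitoring check whose latest run raised a warning, while an earlier run in the analyzed time range escalated to an error; all names and counts are illustrative:

```python
# Hypothetical CheckCurrentDataQualityStatusModel payload.
check_status = {
    "current_severity": "warning",
    "highest_historical_severity": "error",
    "column_name": "customer_id",
    "check_type": "monitoring",
    "time_scale": "daily",
    "category": "nulls",
    "quality_dimension": "Completeness",
    "executed_checks": 30,
    "valid_results": 25,
    "warnings": 3,
    "errors": 2,
    "fatals": 0,
    "execution_errors": 0,
}
```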

DimensionCurrentDataQualityStatusModel

A model that describes the current data quality status for a single data quality dimension.

The structure of this object is described below

| Property name | Description | Data type |
|---------------|-------------|-----------|
| dimension | Data quality dimension name. The most popular dimensions are: Completeness, Uniqueness, Timeliness, Validity, Consistency, Accuracy, Availability. | string |
| current_severity | The most recent data quality issue severity for this table. When the table is monitored using data grouping, it is the highest issue severity of all recently analyzed data groups. For partitioned checks, it is the highest severity of all results for all partitions (time periods) in the analyzed time range. | RuleSeverityLevel |
| highest_historical_severity | The highest severity of previous executions of this data quality issue in the analyzed time range. It can be different from the current_severity if the data quality issue was solved and the most recent execution no longer detects it. For partitioned checks, this field returns the same value as the current_severity, because data quality issues in older partitions are still valid. | RuleSeverityLevel |
| executed_checks | The total number of most recent checks that were executed on the table for one data quality dimension. Table comparison checks that are comparing groups of data are counted as the number of compared data groups. | integer |
| valid_results | The number of most recent valid data quality checks that passed without raising any issues. | integer |
| warnings | The number of most recent data quality checks that failed by raising a warning severity data quality issue. | integer |
| errors | The number of most recent data quality checks that failed by raising an error severity data quality issue. | integer |
| fatals | The number of most recent data quality checks that failed by raising a fatal severity data quality issue. | integer |
| execution_errors | The number of data quality check execution errors that were reported due to access issues to the data source, invalid mapping in DQOps, invalid queries in data quality sensors or invalid Python rules. When an execution error is reported, the configuration of a data quality check on a table must be updated. | integer |
| data_quality_kpi | Data quality KPI score for the data quality dimension, measured as a percentage of passed data quality checks. DQOps counts data quality issues at a warning severity level as passed checks. The data quality KPI score is a value in the range 0..100. | double |

ColumnCurrentDataQualityStatusModel

The column validity status. It is a summary of the results of the most recently executed data quality checks on the column.

The structure of this object is described below

| Property name | Description | Data type |
|---------------|-------------|-----------|
| current_severity | The most recent data quality issue severity for this column. When the table is monitored using data grouping, it is the highest issue severity of all recently analyzed data groups. For partitioned checks, it is the highest severity of all results for all partitions (time periods) in the analyzed time range. | RuleSeverityLevel |
| highest_historical_severity | The highest severity of previous executions of this data quality issue in the analyzed time range. It can be different from the current_severity if the data quality issue was solved and the most recent execution no longer detects it. For partitioned checks, this field returns the same value as the current_severity, because data quality issues in older partitions are still valid. | RuleSeverityLevel |
| executed_checks | The total number of most recent checks that were executed on the column. Table comparison checks that are comparing groups of data are counted as the number of compared data groups. | integer |
| valid_results | The number of most recent valid data quality checks that passed without raising any issues. | integer |
| warnings | The number of most recent data quality checks that failed by raising a warning severity data quality issue. | integer |
| errors | The number of most recent data quality checks that failed by raising an error severity data quality issue. | integer |
| fatals | The number of most recent data quality checks that failed by raising a fatal severity data quality issue. | integer |
| execution_errors | The number of data quality check execution errors that were reported due to access issues to the data source, invalid mapping in DQOps, invalid queries in data quality sensors or invalid Python rules. When an execution error is reported, the configuration of a data quality check on a column must be updated. | integer |
| data_quality_kpi | Data quality KPI score for the column, measured as a percentage of passed data quality checks. DQOps counts data quality issues at a warning severity level as passed checks. The data quality KPI score is a value in the range 0..100. | double |
| checks | The dictionary of statuses for data quality checks. The keys are data quality check names, the values are the current data quality check statuses that describe the most current status. | Dict[string, CheckCurrentDataQualityStatusModel] |
| dimensions | Dictionary of the current data quality statuses for each data quality dimension. | Dict[string, DimensionCurrentDataQualityStatusModel] |

ColumnListModel

Column list model that returns the basic fields from a column specification, excluding nested nodes like a list of activated checks.

The structure of this object is described below

| Property name | Description | Data type |
|---------------|-------------|-----------|
| connection_name | Connection name. | string |
| table | Physical table name including the schema and table names. | PhysicalTableName |
| column_name | Column name. | string |
| sql_expression | SQL expression for a calculated column, or a column that applies additional data transformation to the original column value. The original column value is referenced by a token {column}. | string |
| column_hash | Column hash that identifies the column using a unique hash code. | long |
| disabled | Disables all data quality checks on the column. Data quality checks will not be executed. | boolean |
| id | Marks columns that are part of a primary or a unique key. DQOps captures their values during error sampling to match invalid values to the rows in which the value was found. | boolean |
| has_any_configured_checks | True when the column has any checks configured (read-only). | boolean |
| has_any_configured_profiling_checks | True when the column has any profiling checks configured (read-only). | boolean |
| has_any_configured_monitoring_checks | True when the column has any monitoring checks configured (read-only). | boolean |
| has_any_configured_partition_checks | True when the column has any partition checks configured (read-only). | boolean |
| type_snapshot | Column data type that was retrieved when the table metadata was imported. | ColumnTypeSnapshotSpec |
| data_quality_status | The current data quality status for the column, grouped by data quality dimensions. DQOps may return a null value when the results were not yet loaded into the cache. In that case, the client should wait a few seconds and retry a call to get the most recent data quality status of the column. | ColumnCurrentDataQualityStatusModel |
| run_checks_job_template | Configured parameters for the "check run" job that should be pushed to the job queue in order to run all checks within this column. | CheckSearchFilters |
| run_profiling_checks_job_template | Configured parameters for the "check run" job that should be pushed to the job queue in order to run profiling checks within this column. | CheckSearchFilters |
| run_monitoring_checks_job_template | Configured parameters for the "check run" job that should be pushed to the job queue in order to run monitoring checks within this column. | CheckSearchFilters |
| run_partition_checks_job_template | Configured parameters for the "check run" job that should be pushed to the job queue in order to run partitioned checks within this column. | CheckSearchFilters |
| collect_statistics_job_template | Configured parameters for the "collect statistics" job that should be pushed to the job queue in order to run all statistics collectors within this column. | StatisticsCollectorSearchFilters |
| data_clean_job_template | Configured parameters for the "data clean" job that, after being supplied with a time range, should be pushed to the job queue in order to remove stored results connected with this column. | DeleteStoredDataQueueJobParameters |
| advanced_properties | A dictionary of advanced properties that can be used, for example, to support mapping data to data catalogs; a key/value dictionary. | Dict[string, string] |
| can_edit | Boolean flag that decides if the current user can update or delete the column. | boolean |
| can_collect_statistics | Boolean flag that decides if the current user can collect statistics. | boolean |
| can_run_checks | Boolean flag that decides if the current user can run checks. | boolean |
| can_delete_data | Boolean flag that decides if the current user can delete data (results). | boolean |

ProviderType

Data source provider type (dialect type). We will use lower case names to avoid issues with parsing, even if the enum names are not named following the Java naming convention.

The structure of this object is described below

| Data type | Enum values |
|-----------|-------------|
| string | bigquery, clickhouse, databricks, db2, duckdb, hana, mariadb, mysql, oracle, postgresql, presto, questdb, redshift, snowflake, spark, sqlserver, teradata, trino |

ConnectionModel

Connection model returned by the REST API that is limited only to the basic fields, excluding nested nodes.

The structure of this object is described below

| Property name | Description | Data type |
|---------------|-------------|-----------|
| connection_name | Connection name. | string |
| connection_hash | Connection hash that identifies the connection using a unique hash code. | long |
| parallel_jobs_limit | The concurrency limit for the maximum number of parallel SQL queries executed on this connection. | integer |
| schedule_on_instance | Limits running scheduled checks (started by a CRON job scheduler) to run only on a named DQOps instance. When this field is empty, data quality checks are run on all DQOps instances. Set a DQOps instance name to run checks on a named instance only. The default name of the DQOps Cloud SaaS instance is "cloud". | string |
| provider_type | Database provider type (required). Accepts: bigquery, snowflake, etc. | ProviderType |
| bigquery | BigQuery connection parameters. Specify parameters in the bigquery section. | BigQueryParametersSpec |
| snowflake | Snowflake connection parameters. | SnowflakeParametersSpec |
| postgresql | PostgreSQL connection parameters. | PostgresqlParametersSpec |
| duckdb | DuckDB connection parameters. | DuckdbParametersSpec |
| redshift | Redshift connection parameters. | RedshiftParametersSpec |
| sqlserver | SqlServer connection parameters. | SqlServerParametersSpec |
| presto | Presto connection parameters. | PrestoParametersSpec |
| trino | Trino connection parameters. | TrinoParametersSpec |
| mysql | MySQL connection parameters. | MysqlParametersSpec |
| oracle | Oracle connection parameters. | OracleParametersSpec |
| spark | Spark connection parameters. | SparkParametersSpec |
| databricks | Databricks connection parameters. | DatabricksParametersSpec |
| hana | HANA connection parameters. | HanaParametersSpec |
| db2 | DB2 connection parameters. | Db2ParametersSpec |
| mariadb | MariaDB connection parameters. | MariaDbParametersSpec |
| clickhouse | ClickHouse connection parameters. | ClickHouseParametersSpec |
| questdb | QuestDB connection parameters. | QuestDbParametersSpec |
| teradata | Teradata connection parameters. | TeradataParametersSpec |
| run_checks_job_template | Configured parameters for the "check run" job that should be pushed to the job queue in order to run all checks within this connection. | CheckSearchFilters |
| run_profiling_checks_job_template | Configured parameters for the "check run" job that should be pushed to the job queue in order to run profiling checks within this connection. | CheckSearchFilters |
| run_monitoring_checks_job_template | Configured parameters for the "check run" job that should be pushed to the job queue in order to run monitoring checks within this connection. | CheckSearchFilters |
| run_partition_checks_job_template | Configured parameters for the "check run" job that should be pushed to the job queue in order to run partitioned checks within this connection. | CheckSearchFilters |
| collect_statistics_job_template | Configured parameters for the "collect statistics" job that should be pushed to the job queue in order to run all statistics collectors within this connection. | StatisticsCollectorSearchFilters |
| data_clean_job_template | Configured parameters for the "data clean" job that, after being supplied with a time range, should be pushed to the job queue in order to remove stored results connected with this connection. | DeleteStoredDataQueueJobParameters |
| advanced_properties | A dictionary of advanced properties that can be used, for example, to support mapping data to data catalogs; a key/value dictionary. | Dict[string, string] |
| can_edit | Boolean flag that decides if the current user can update or delete the connection to the data source. | boolean |
| can_collect_statistics | Boolean flag that decides if the current user can collect statistics. | boolean |
| can_run_checks | Boolean flag that decides if the current user can run checks. | boolean |
| can_delete_data | Boolean flag that decides if the current user can delete data (results). | boolean |
| yaml_parsing_error | Optional parsing error that was captured when parsing the YAML file. This field is null when the YAML file is valid. If an error was captured, this field returns the file parsing error message and the file location. | string |
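
Only the provider parameter section matching provider_type is populated, while the sections for the other providers stay null. A sketch of a minimal ConnectionModel for a PostgreSQL source; the host, database and user values, and the exact fields inside the postgresql section, are assumptions for illustration:

```python
# Hypothetical ConnectionModel payload for a PostgreSQL data source.
# Only the section matching provider_type ("postgresql") is filled in.
connection_model = {
    "connection_name": "sales_dwh",
    "provider_type": "postgresql",
    "postgresql": {                  # assumed PostgresqlParametersSpec fields
        "host": "db.example.com",
        "port": "5432",
        "database": "sales",
        "user": "dqops_reader",
    },
    "parallel_jobs_limit": 4,
    "can_edit": True,
}
```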

DqoQueueJobId

Identifies a single job.

The structure of this object is described below

| Property name | Description | Data type |
|---------------|-------------|-----------|
| job_id | Job id. | long |
| job_business_key | Optional job business key that was assigned to the job. A business key is an alternative, user assigned unique job identifier used to find the status of a job by its business key. | string |
| parent_job_id | Parent job id. Filled only for nested jobs, for example a sub-job that runs data quality checks on a single table. | DqoQueueJobId |
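
A sketch of a nested DqoQueueJobId, where a child job (for example, a sub-job running checks on a single table) points back at its parent; the identifiers and business key are illustrative:

```python
# Hypothetical DqoQueueJobId payload for a nested (child) job.
job_id = {
    "job_id": 1724342700002,
    "job_business_key": "nightly-checks-public-fact_sales",  # optional, user assigned
    "parent_job_id": {
        "job_id": 1724342700001,  # the parent "run checks" job
    },
}
```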

HistogramDailyIssuesCount

A model that stores a daily number of incidents.

The structure of this object is described below

| Property name | Description | Data type |
|---------------|-------------|-----------|
| warnings | The number of failed data quality checks that generated a warning severity data quality issue. | integer |
| errors | The number of failed data quality checks that generated an error severity data quality issue. | integer |
| fatals | The number of failed data quality checks that generated a fatal severity data quality issue. | integer |
| total_count | The total count of failed data quality checks on this day. | integer |

IssueHistogramModel

Model that returns histograms of the data quality issue occurrences related to a data quality incident or a table. The dates in the daily histogram use the default time zone of the DQOps server.

The structure of this object is described below

| Property name | Description | Data type |
|---------------|-------------|-----------|
| has_profiling_issues | True when this data quality incident is based on data quality issues from profiling checks within the filters applied to search for linked data quality issues. | boolean |
| has_daily_monitoring_issues | True when this data quality incident is based on data quality issues from daily monitoring checks within the filters applied to search for linked data quality issues. | boolean |
| has_monthly_monitoring_issues | True when this data quality incident is based on data quality issues from monthly monitoring checks within the filters applied to search for linked data quality issues. | boolean |
| has_daily_partitioned_issues | True when this data quality incident is based on data quality issues from daily partitioned checks within the filters applied to search for linked data quality issues. | boolean |
| has_monthly_partitioned_issues | True when this data quality incident is based on data quality issues from monthly partitioned checks within the filters applied to search for linked data quality issues. | boolean |
| days | A map of the number of data quality issues per day; the dates use the DQOps server time zone. | Dict[date, HistogramDailyIssuesCount] |
| columns | A map of column names with the most data quality issues related to the incident. The map returns the count of issues as the value. | Dict[string, integer] |
| checks | A map of data quality check names with the most data quality issues related to the incident. The map returns the count of issues as the value. | Dict[string, integer] |
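
A sketch of an IssueHistogramModel fragment that combines the daily counts with the column and check maps; the dates, names and counts are illustrative:

```python
# Hypothetical IssueHistogramModel fragment: daily issue counts keyed by
# date (DQOps server time zone), plus the columns and checks with the
# most issues related to the incident.
histogram = {
    "has_daily_monitoring_issues": True,
    "days": {
        "2025-07-20": {"warnings": 2, "errors": 1, "fatals": 0, "total_count": 3},
        "2025-07-21": {"warnings": 0, "errors": 2, "fatals": 0, "total_count": 2},
    },
    "columns": {"customer_id": 4, "order_date": 1},
    "checks": {"daily_nulls_percent": 3, "daily_row_count": 2},
}
```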

ProfilingTimePeriodTruncation

The time period used to truncate stored profiling check results (monthly, weekly, daily, hourly, or no truncation to store all results). By default, a profiling check stores one value per month. When a profiling check is re-executed during the month, the previous value is overwritten and only the most recent value is stored.

The structure of this object is described below

 Data type   Enum values 
string store_the_most_recent_result_per_month
store_the_most_recent_result_per_week
store_the_most_recent_result_per_day
store_the_most_recent_result_per_hour
store_all_results_without_date_truncation

TableListModel

Table list model returned by the REST API that is limited only to the basic fields, excluding nested nodes.

The structure of this object is described below

| Property name | Description | Data type |
|---------------|-------------|-----------|
| connection_name | Connection name. | string |
| table_hash | Table hash that identifies the table using a unique hash code. | long |
| target | Physical table details (a physical schema name and a physical table name). | PhysicalTableName |
| disabled | Disables all data quality checks on the table. Data quality checks will not be executed. | boolean |
| stage | Stage name. | string |
| filter | SQL WHERE clause added to the sensor queries. | string |
| do_not_collect_error_samples_in_profiling | Disable automatic collection of error samples in the profiling section. The profiling checks by default always collect error samples for failed data quality checks. | boolean |
| always_collect_error_samples_in_monitoring | Always collect error samples for failed monitoring checks. By default, DQOps does not collect error samples when monitoring checks are executed by a scheduler or by running checks from the metadata tree; without this flag, error samples are collected only when the checks are run from the check editor. | boolean |
| priority | Table priority (1, 2, 3, 4, ...). Tables can be assigned a priority level. The table priority is copied into each data quality check result and sensor result, enabling efficient grouping of more and less important tables during a data quality improvement project, when the data quality issues on higher priority tables are fixed before data quality issues on less important tables. | integer |
| owner | Table owner information, like the data steward name or the business application name. | TableOwnerSpec |
| profiling_checks_result_truncation | Defines how many profiling check results are stored for the table monthly. By default, DQOps will use the 'one_per_month' configuration and store only the most recent profiling check result executed during the month. By changing this value, it is possible to store one value per day or even store all profiling check results. | ProfilingTimePeriodTruncation |
| file_format | File format for a file based table, such as a CSV or Parquet file. | FileFormatSpec |
| data_quality_status | The current data quality status for the table, grouped by data quality dimensions. DQOps may return a null value when the results were not yet loaded into the cache. In that case, the client should wait a few seconds and retry a call to get the most recent data quality status of the table. | TableCurrentDataQualityStatusModel |
| has_any_configured_checks | True when the table has any checks configured. | boolean |
| has_any_configured_profiling_checks | True when the table has any profiling checks configured. | boolean |
| has_any_configured_monitoring_checks | True when the table has any monitoring checks configured. | boolean |
| has_any_configured_partition_checks | True when the table has any partition checks configured. | boolean |
| partitioning_configuration_missing | True when the table has a missing configuration of the "partition_by_column" column, making any partition checks fail when executed. | boolean |
| run_checks_job_template | Configured parameters for the "check run" job that should be pushed to the job queue in order to run all checks within this table. | CheckSearchFilters |
| run_profiling_checks_job_template | Configured parameters for the "check run" job that should be pushed to the job queue in order to run profiling checks within this table. | CheckSearchFilters |
| run_monitoring_checks_job_template | Configured parameters for the "check run" job that should be pushed to the job queue in order to run monitoring checks within this table. | CheckSearchFilters |
| run_partition_checks_job_template | Configured parameters for the "check run" job that should be pushed to the job queue in order to run partitioned checks within this table. | CheckSearchFilters |
| collect_statistics_job_template | Configured parameters for the "collect statistics" job that should be pushed to the job queue in order to run all statistics collectors within this table. | StatisticsCollectorSearchFilters |
| data_clean_job_template | Configured parameters for the "data clean" job that, after being supplied with a time range, should be pushed to the job queue in order to remove stored results connected with this table. | DeleteStoredDataQueueJobParameters |
| advanced_properties | A dictionary of advanced properties that can be used, for example, to support mapping data to data catalogs; a key/value dictionary. | Dict[string, string] |
| can_edit | Boolean flag that decides if the current user can update or delete this object. | boolean |
| can_collect_statistics | Boolean flag that decides if the current user can collect statistics. | boolean |
| can_run_checks | Boolean flag that decides if the current user can run checks. | boolean |
| can_delete_data | Boolean flag that decides if the current user can delete data (results). | boolean |
| yaml_parsing_error | Optional parsing error that was captured when parsing the YAML file. This field is null when the YAML file is valid. If an error was captured, this field returns the file parsing error message and the file location. | string |