SQL data sources are now available for metric monitoring. This functionality enables joining, filtering, and transforming tables in SQL before defining alert conditions without having to create views in the data warehouse.
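To illustrate the idea, here is a minimal sketch (run against SQLite purely for illustration; the table and column names are invented, and this is not Monte Carlo's API) of the kind of SQL a metric monitor's data source could run: a join plus a filter defined inline, rather than through a pre-built warehouse view.

```python
# Sketch: join and filter tables inline before a metric is computed.
# All table/column names below are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL);
    CREATE TABLE customers (id INTEGER, region TEXT);
    INSERT INTO orders VALUES (1, 1, 50.0), (2, 2, 120.0), (3, 1, 80.0);
    INSERT INTO customers VALUES (1, 'EMEA'), (2, 'AMER');
""")

# Join and filter first; the monitor's metric (here a simple row count
# per region) is then computed over the derived result set.
rows = conn.execute("""
    SELECT c.region, COUNT(*) AS order_count
    FROM orders o
    JOIN customers c ON o.customer_id = c.id
    WHERE o.amount > 60
    GROUP BY c.region
    ORDER BY c.region
""").fetchall()
print(rows)  # [('AMER', 1), ('EMEA', 1)]
```

An alert condition would then be defined on `order_count` per region, with no view created in the warehouse.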

Our new Create data product wizard provides a cleaner, more intuitive process for creating Data Products in Monte Carlo. Start by selecting "Create data product" under the Data product tab in the top menu.
Add a Name and Description that will make this Data Product easily identifiable by other users viewing this across your workspace.
Search for the assets you would like to add to this Data Product. Tables and Reports can be added. Click the "+" in the "Add" column to add them to the Data Product.
As you add Tables and Reports, you will see them appear on the right side.
The number after "Included assets" (e.g., "8" in the screenshot below) represents the total number of unique tables needed to build the Data Product. It includes all tables used directly by the product, as well as their upstream dependencies.
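The counting rule above, selected assets plus the transitive closure of their upstream dependencies, can be sketched as follows (the lineage graph here is invented for illustration; this is not Monte Carlo's implementation):

```python
# Hypothetical sketch: count unique tables in a Data Product as the
# selected assets plus every transitive upstream dependency.
def included_assets(selected, upstream):
    """Return all unique tables: selected plus their transitive upstreams."""
    seen = set()
    stack = list(selected)
    while stack:
        table = stack.pop()
        if table in seen:
            continue
        seen.add(table)
        stack.extend(upstream.get(table, []))  # walk further upstream
    return seen

# Invented lineage: each key lists its direct upstream tables.
lineage = {
    "report_a": ["fact_orders"],
    "fact_orders": ["stg_orders", "stg_customers"],
    "stg_orders": ["raw_orders"],
    "stg_customers": ["raw_customers"],
}
assets = included_assets({"report_a", "fact_orders"}, lineage)
print(len(assets))  # 6 unique tables in this toy graph
```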
For tables and reports where upstream lineage is available, you will see a column for the number of upstream tables connected to that asset that will be included in the Data Product, e.g. "4 upstream tables".
Click "Next: Review".
In the Review step, you will be able to see all Tables and Reports to be included in your Data Product. The tables and reports you included in the previous step and all their upstream tables have been included in the Data Product. Refer to Why Include All Upstream Tables in Data Products.
Use the "Lineage" and "List" views to see the Data Product in different visual layouts.

Use the Filters at the top to filter both of these views by the monitoring status of each of the tables included in the Data Product.
Click "Create and Monitor" to create the Data Product and monitor all "Not monitored" tables. Note: the
Once a Data Product is created, all tables in the Data Product will be automatically tagged with a Table tag.
See more details on Data products in our documentation: https://docs.getmontecarlo.com/docs/using-data-product-dashboards
Similar to the daylight saving time support that already exists on custom monitors, we've now added daylight saving time support to explicit thresholds for Freshness and Volume.
Some customers run their data pipelines in their local timezone. This setting ensures the schedule (or CRON expression) of their explicit thresholds also adheres to that local timezone.
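The effect is easiest to see with a concrete offset shift. In the sketch below (the timezone is just an example), the same 08:00 local schedule maps to a different UTC hour on either side of the daylight saving boundary, which is exactly what a DST-aware schedule has to account for:

```python
# Why local-timezone schedules need DST awareness: the same 08:00
# local time maps to different UTC hours in summer vs. winter.
from datetime import datetime
from zoneinfo import ZoneInfo

tz = ZoneInfo("America/New_York")
summer = datetime(2023, 7, 1, 8, 0, tzinfo=tz)   # EDT, UTC-4
winter = datetime(2023, 12, 1, 8, 0, tzinfo=tz)  # EST, UTC-5

print(summer.astimezone(ZoneInfo("UTC")).hour)  # 12
print(winter.astimezone(ZoneInfo("UTC")).hour)  # 13
```

A schedule stored as a fixed UTC time would drift by an hour twice a year; a DST-aware schedule keeps firing at 08:00 local.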

Custom monitors have a toggle to turn failure notifications on or off. When on, the monitor will send an alert if it fails to complete. This is different from the monitor triggering an anomaly alert; the job of a failure notification is to indicate that the monitor was not able to check the quality of the data. When creating a new monitor, the default setting is 'on'.
We'll now also alert on cases where the "run" of the monitor never even starts.
Previously, we alerted only in cases where the run started but then failed for some reason. To limit noise from repeatedly failing monitors, we'll send no more than one failure notification per monitor per week.
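The once-per-week cap can be sketched as a simple dedup check (this is assumed logic for illustration, not Monte Carlo's implementation):

```python
# Sketch: allow at most one failure notification per monitor per week.
from datetime import datetime, timedelta

last_sent = {}  # monitor_id -> time of last failure notification

def should_notify(monitor_id, now):
    prev = last_sent.get(monitor_id)
    if prev is not None and now - prev < timedelta(weeks=1):
        return False  # already notified within the past week
    last_sent[monitor_id] = now
    return True

t0 = datetime(2023, 9, 25)
r1 = should_notify("m1", t0)                      # first failure -> notify
r2 = should_notify("m1", t0 + timedelta(days=3))  # same week -> suppressed
r3 = should_notify("m1", t0 + timedelta(days=8))  # window passed -> notify
print(r1, r2, r3)  # True False True
```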
For accounts where this will significantly increase the number of failure notifications being sent, a proactive message was delivered to Account Owners on September 25.
A beta version of the Monte Carlo-Informatica integration is now available. The integration creates lineage from Informatica task flows that move data from point A to point B. See the docs for details: https://docs.getmontecarlo.com/docs/informatica-beta

In metric monitors, users can now define segments with a SQL expression. Previously, segmentation could only be configured by picking one or two fields.
This supports a long tail of segmentation use cases. For example, users can now concatenate many fields (e.g., to segment by four or five fields at once), or shorten very long field values that impair usability.
Learn more about segmentation.
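The concatenation case looks something like the sketch below (run against SQLite purely for illustration; the table and the `'/'` separator are invented): a single SQL expression combines three fields into one segment key, where field-based segmentation would have stopped at two.

```python
# Sketch: segment by a SQL expression that concatenates several fields.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (region TEXT, channel TEXT, tier TEXT);
    INSERT INTO events VALUES
        ('EMEA', 'web', 'gold'), ('EMEA', 'web', 'gold'),
        ('AMER', 'app', 'free');
""")

# One expression instead of one or two raw fields: three fields at once.
rows = conn.execute("""
    SELECT region || '/' || channel || '/' || tier AS segment, COUNT(*)
    FROM events
    GROUP BY segment
    ORDER BY segment
""").fetchall()
print(rows)  # [('AMER/app/free', 1), ('EMEA/web/gold', 2)]
```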

Until this week, Cardinality Rules and Referential Integrity Rules were options on the Monitor Menu in Monte Carlo. These were purpose-built monitor creation experiences that produced a SQL Rule, for use cases like:
Existing Cardinality Rules and Referential Integrity Rules will continue to run normally. But the workflows to create these have now been removed from the monitor menu.
Referential Integrity Rules and Cardinality Rules were made redundant by new ways to define a set in Validation Monitors. In Validation Monitors, the recommended way to address these use cases is now with the Is in set and Is not in set operators, which allow a user to define a set:
Read more about this change here.

The Is in set and Is not in set operators now allow users to define a set using three possible methods:
Previously, "From a list" was the only way to define a set. These new options make it easy to generate large sets and keep them automatically updated as your data evolves. They are ideal for referential integrity checks and scenarios with large numbers of allowed values.
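As a sketch of the referential-integrity case (run against SQLite purely for illustration; the tables are invented), the allowed set comes from a query against the parent table rather than a hand-typed list, so it stays current as the data evolves, and child rows outside the set fail the check:

```python
# Sketch: an "is in set" style check whose set is defined by a query.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY);
    CREATE TABLE orders (id INTEGER, customer_id INTEGER);
    INSERT INTO customers VALUES (1), (2);
    INSERT INTO orders VALUES (10, 1), (11, 2), (12, 99);
""")

# The set is generated from the parent table, not typed by hand.
allowed = {row[0] for row in conn.execute("SELECT id FROM customers")}

# Flag child rows whose foreign key falls outside the allowed set.
violations = [
    (oid, cid)
    for oid, cid in conn.execute("SELECT id, customer_id FROM orders")
    if cid not in allowed  # fails the "is in set" check
]
print(violations)  # [(12, 99)]
```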

dbt and Airflow alerts will now be raised on ALL tables in Monte Carlo, regardless of whether the table is muted or monitored.
The alerts can be configured on a Job-level basis for each integration.
To configure which dbt Jobs will raise alerts, go to Settings -> Integrations -> dbt integration -> Edit and find the below configuration.
Learn more at dbt Failures As Monte Carlo Alerts
To configure which Airflow Jobs will raise alerts, there are two options:
Go to Settings -> Integrations -> Airflow integration -> Configure Jobs and find the below configuration:

Under Assets, search and navigate to the Airflow Job that you want to configure alerts for. On the top right, toggle "Generates alerts".

Learn more at Airflow Alerts and Task Observability
dbt snapshot and seed errors are now available in Monte Carlo as alerts, alongside model and test alerts. Users can go to Settings -> Integrations -> dbt integration -> Edit to configure the option to send these alerts. Make sure to add the new alert types to the relevant audiences to receive notifications. (docs)

Snapshot errors in the alert feed

Configure alert options in Settings