
AWS Glue Data Quality announces anomaly detection and dynamic rules


AWS Glue announces a preview of a new Glue Data Quality capability that uses ML-powered anomaly detection algorithms to surface hard-to-find data quality issues. This helps customers proactively identify and fix quality problems so that data users can make confident business decisions.

Data engineers and analysts write data quality rules to measure and monitor their data. Rules work well when they know what to expect from the data, but they cannot catch anomalous patterns such as a sudden increase in missing values or a sudden drop in record counts. With this new capability, data engineers and analysts can enable anomaly detection algorithms that analyze data statistics over time and generate insights about these anomalous patterns. The capability also recommends rules that can be added to data pipelines for ongoing monitoring.

The preview is available in the following [AWS Regions](https://aws.amazon.com/about-aws/global-infrastructure/regional-product-services/): US East (Ohio), US East (N. Virginia), US West (Oregon), Asia Pacific (Tokyo), and Europe (Ireland). To learn more, read the [documentation](https://docs.aws.amazon.com/glue/latest/dg/data-quality-anomaly-detection.html) and the [blog post](https://aws.amazon.com/blogs/aws/use-anomaly-detection-with-aws-glue-to-improve-data-quality-preview/).
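To make "rules plus anomaly detection" concrete, the sketch below shows how a Glue Spark (ETL) job might combine a DQDL ruleset with the `EvaluateDataQuality` transform: static rules, a dynamic rule that compares the current run against recent history, and analyzers with a `DetectAnomalies` entry so statistics are collected over time. This is a minimal sketch, not a definitive implementation: the database, table, column names, S3 prefix, and the exact DQDL keywords for dynamic rules and anomaly detection are illustrative assumptions; the authoritative syntax and transform signature are in the linked documentation.

```python
# Minimal sketch of a Glue Spark job that evaluates data quality rules,
# a dynamic rule, and anomaly detection on a catalog table.
from pyspark.context import SparkContext
from awsglue.context import GlueContext
from awsgluedq.transforms import EvaluateDataQuality

sc = SparkContext()
glueContext = GlueContext(sc)

# Read the dataset to evaluate (hypothetical catalog database and table).
orders = glueContext.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="orders"
)

# DQDL ruleset (illustrative syntax):
#  - IsComplete: a static rule on a column.
#  - RowCount > avg(last(10)) * 0.8: a dynamic rule compared to recent runs.
#  - DetectAnomalies / Analyzers: collect statistics over time and flag
#    anomalous patterns such as sudden drops in record counts.
ruleset = """
Rules = [
    IsComplete "order_id",
    RowCount > avg(last(10)) * 0.8,
    DetectAnomalies "RowCount"
]
Analyzers = [
    Completeness "order_id",
    RowCount
]
"""

# Evaluate the ruleset and publish results so statistics accumulate over time.
dq_results = EvaluateDataQuality.apply(
    frame=orders,
    ruleset=ruleset,
    publishing_options={
        "dataQualityEvaluationContext": "orders_dq_check",
        "enableDataQualityCloudWatchMetrics": True,
        "enableDataQualityResultsPublishing": True,
        "resultsS3Prefix": "s3://example-bucket/dq-results/",
    },
)
```

In this setup the analyzers gather the statistics that the anomaly detection algorithms learn from, while the dynamic rule gives an explicit, history-based threshold for pipelines that need a hard pass/fail gate.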