Data quality monitoring company Lightup has announced a beta program intended to help customers identify issues with their data. 

Lightup tracks data flowing into and out of applications and detects changes that indicate degrading data quality, using metrics such as data availability, data delay, and data volume.

In addition, the solution learns the normal behavior of those metrics by analyzing past patterns, then uses that behavior as a baseline for detecting anomalies.
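Lightup has not published the details of its detection method, but the general idea of baselining a metric from its history and flagging deviations can be sketched with a simple z-score check. Everything below (the function name, the threshold, and the sample data-volume numbers) is illustrative, not Lightup's actual implementation:

```python
from statistics import mean, stdev

def is_anomalous(history, current, z_threshold=3.0):
    """Flag `current` if it deviates more than `z_threshold`
    standard deviations from the historical baseline."""
    baseline = mean(history)
    spread = stdev(history)
    if spread == 0:
        return current != baseline
    return abs(current - baseline) / spread > z_threshold

# Hypothetical hourly row counts for a table (a "data volume" metric).
volumes = [10_120, 9_980, 10_050, 10_200, 9_940, 10_110]

print(is_anomalous(volumes, 10_030))  # normal volume -> False
print(is_anomalous(volumes, 2_400))   # sudden drop   -> True
```

A real monitoring system would also handle seasonality (e.g. weekday vs. weekend volumes) and slowly drifting baselines, which a fixed z-score over raw history does not capture.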

According to the company, the solution is designed for data and analytics engineers building ETL or ELT pipelines. It is priced on a pay-as-you-go basis, so it doesn't require a heavy upfront investment, and it includes a free tier that companies can try before committing.

It integrates with data warehouses and lakehouse platforms including BigQuery, Snowflake, Databricks, and Redshift.

“While it is well understood that data is the oxygen that fuels every application and process in an organization, companies are flying blind when it comes to understanding the health of data driving their applications,” said Manu Bansal, co-founder and CEO, Lightup. “With the data volumes, high cardinality, and complex data flows that we are all dealing with today, it is easy to end up with bad data in the pipeline. Lightup’s data quality monitoring solution provides data teams with a crystal clear understanding of the health and quality of the data fueling their applications. This ensures that data outages don’t silently turn into broken applications that can have a devastating impact on a company’s performance and bottom line.”