Data streaming company Confluent today announced new features being added to Confluent Cloud, geared toward ensuring that data is trustworthy, easily processed, and securely shared.

Among these features is Data Quality Rules, an extension of the Stream Governance suite. With it, users can remediate data quality issues so that their data can be relied on for business-critical decisions.

Furthermore, the new Custom Connectors, Stream Sharing, the Kora Engine, and an early access program for managed Apache Flink are intended to help companies gain insights from their data on a single platform, cutting back on operational burdens and improving performance.

“Real-time data is the lifeblood of every organization, but it’s extremely challenging to manage data coming from different sources in real time and guarantee that it’s trustworthy,” said Shaun Clowes, chief product officer at Confluent. “As a result, many organizations build a patchwork of solutions plagued with silos and business inefficiencies. Confluent Cloud’s new capabilities fix these issues by providing an easy path to ensuring trusted data can be shared with the right people in the right formats.”

Data Quality Rules also allows schemas stored in Schema Registry to be augmented with several types of rules, so that teams can improve data integrity, resolve quality issues quickly, and simplify schema evolution.
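As a rough illustration of how such rules can be attached to a schema, the sketch below registers an Avro schema with a validation rule through the Schema Registry REST API. It assumes the rule-set format from Confluent's data contracts (a CEL condition that rejects bad records at produce time); the endpoint, subject name, credentials, and rule expression are all placeholders, not values from the announcement.

```python
import json
import requests

# Placeholder endpoint and subject; substitute your own Schema Registry details.
SCHEMA_REGISTRY_URL = "https://psrc-xxxxx.us-east-2.aws.confluent.cloud"
SUBJECT = "orders-value"

order_schema = {
    "type": "record",
    "name": "Order",
    "fields": [
        {"name": "order_id", "type": "string"},
        {"name": "amount", "type": "double"},
    ],
}

payload = {
    "schemaType": "AVRO",
    "schema": json.dumps(order_schema),
    # A domain rule written in Common Expression Language (CEL):
    # records whose amount is not positive fail validation on write.
    "ruleSet": {
        "domainRules": [
            {
                "name": "checkAmountPositive",
                "kind": "CONDITION",
                "type": "CEL",
                "mode": "WRITE",
                "expr": "message.amount > 0",
                "onFailure": "ERROR",
            }
        ]
    },
}

resp = requests.post(
    f"{SCHEMA_REGISTRY_URL}/subjects/{SUBJECT}/versions",
    auth=("SR_API_KEY", "SR_API_SECRET"),  # placeholder credentials
    headers={"Content-Type": "application/vnd.schemaregistry.v1+json"},
    json=payload,
)
resp.raise_for_status()
print("Registered schema id:", resp.json()["id"])
```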

Custom Connectors, meanwhile, allow any Kafka connector to run on Confluent Cloud without the need for infrastructure management.

With this, teams can connect to any data system through their own Kafka Connect plugins without code changes, gain high availability and performance through monitoring of the health of their connectors and workers, and reduce the operational burden of managing low-level connector infrastructure.
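For context, the configuration a custom connector takes has the same shape as any standard Kafka Connect deployment. The sketch below submits a hypothetical plugin class to a self-managed Connect worker's REST API; in Confluent Cloud, the equivalent upload and configuration happens through the console or CLI, and the connector class, topic, and endpoint here are assumptions for illustration only.

```python
import requests

# A local, self-managed Connect worker used here purely to show the config shape.
CONNECT_URL = "http://localhost:8083"

connector = {
    "name": "my-custom-source",
    "config": {
        # Any existing Connect plugin class can be supplied unchanged.
        "connector.class": "com.example.connect.MyCustomSourceConnector",
        "tasks.max": "1",
        "kafka.topic": "raw-events",
        "key.converter": "org.apache.kafka.connect.storage.StringConverter",
        "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    },
}

# The open-source Kafka Connect REST API creates a connector from name + config.
resp = requests.post(f"{CONNECT_URL}/connectors", json=connector)
resp.raise_for_status()
print(resp.json()["name"], "created")
```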

Lastly, Confluent’s Stream Sharing allows teams to easily share data with enterprise-grade security. With it, users can exchange real-time data with any Kafka client; share and protect their own data with authenticated sharing, access management, and layered encryption controls; and improve the quality and compatibility of shared data with consistent schemas across users, teams, and organizations.
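Because shared streams are exchanged over the standard Kafka protocol, consuming one looks like any other Kafka consumer. The sketch below uses the confluent-kafka Python client; the bootstrap server, topic name, and credentials are placeholders standing in for the details a provider would pass to a recipient.

```python
from confluent_kafka import Consumer

# Placeholder connection details; in practice these come from the share provider.
conf = {
    "bootstrap.servers": "pkc-xxxxx.us-east-2.aws.confluent.cloud:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "SHARED_API_KEY",
    "sasl.password": "SHARED_API_SECRET",
    "group.id": "shared-topic-reader",
    "auto.offset.reset": "earliest",
}

consumer = Consumer(conf)
consumer.subscribe(["shared-orders"])  # hypothetical shared topic

try:
    while True:
        msg = consumer.poll(1.0)  # wait up to 1s for a record
        if msg is None:
            continue
        if msg.error():
            raise RuntimeError(msg.error())
        print(msg.key(), msg.value())
finally:
    consumer.close()
```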

To learn more, read the blog post.