Poor data quality costs businesses time, money, and customers. For companies conducting business in Europe, the associated costs could rise dramatically when the EU’s General Data Protection Regulation (GDPR) takes effect in May 2018. One small data quality mistake involving the misuse of personally identifiable information (PII) could cost a company 20 million euros or 4 percent of annual turnover, whichever is higher.

Even in the absence of GDPR, the accelerating pace of business means that companies can no longer wait weeks or months to correct data errors, particularly if they want to stay competitive. To overcome these and other challenges, businesses are turning to Melissa’s “active data quality” solutions.

“Active data quality is anything that depends on accurate, real-world reference data,” said Greg Brown, VP of Marketing at Melissa. “People are constantly moving and changing jobs, yet most enterprises don’t realize how inaccurate their data really is.”

For example, recent Melissa research reveals that within 3 1/2 years, half of all customer records will become outdated or otherwise inaccurate. Even more alarming is that 30 percent of the decay occurs within the first 12 months. Keeping pace with these changes using in-house resources alone is both expensive and difficult.

Rule-based versus active data quality
Many businesses already use a rule-based approach to data quality, which works well for corporate or process-specific data. Active data quality helps ensure that customer data is also kept accurate, including names, residential addresses, phone numbers, email addresses, job titles, company names, and more.

“When organizations start building their data quality regimen, they inevitably wonder whether they should develop a solution in-house or buy an off-the-shelf solution,” said Brown. “We advocate a hybrid approach because it gives them the best of both worlds.”

Specifically, a hybrid approach allows development teams to define rule-based processes, metrics, and data controls while taking advantage of Melissa’s active data quality.

Common data quality challenges
Handling all data quality issues in-house is exceptionally challenging. For example, many organizations are ill-equipped to resolve two seemingly different addresses such as 6801 Hollywood Blvd., Hollywood, CA and 6801 Hollywood Blvd., Los Angeles, CA. While the two appear to be different addresses, the former uses a vanity city name and the latter the preferred USPS city name. Other challenges include ensuring accurate directional information, suite information, carrier codes, and ZIP+4 codes, all of which affect mail delivery. Company names pose a similar problem: many businesses lack the standardization necessary to recognize that International Business Machines and IBM are the same company.
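To illustrate the kind of standardization involved, here is a toy Python sketch. The lookup tables and function names are invented for this example; a production service works from full USPS and business reference data, not static maps like these.

```python
# Toy standardization sketch. VANITY_CITIES and COMPANY_ALIASES are
# tiny hand-built tables for illustration only; real systems resolve
# these against complete reference data.

# Vanity city names mapped to the USPS-preferred city name.
VANITY_CITIES = {
    ("HOLLYWOOD", "CA"): "LOS ANGELES",
}

# Common company-name variants collapsed to one canonical form.
COMPANY_ALIASES = {
    "INTERNATIONAL BUSINESS MACHINES": "IBM",
}

def standardize_city(city: str, state: str) -> str:
    """Replace a vanity city name with the USPS-preferred name."""
    return VANITY_CITIES.get((city.upper(), state.upper()), city.upper())

def standardize_company(name: str) -> str:
    """Normalize casing and punctuation, then collapse known aliases
    so records for the same company match during deduping."""
    cleaned = " ".join(name.upper().replace(".", "").split())
    return COMPANY_ALIASES.get(cleaned, cleaned)

print(standardize_city("Hollywood", "CA"))                     # LOS ANGELES
print(standardize_company("International Business Machines"))  # IBM
```

Without this kind of canonicalization step, a dedupe pass would treat the vanity and USPS forms of the same address, or the long and short forms of the same company, as distinct records.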

Companies also struggle to parse inverse or mixed names such as Mr. and Mrs. John and Mary Smith. Fewer still are able to transliterate foreign characters into Latin characters so their customer data can be validated and deduplicated on a global scale faster and more efficiently.
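A minimal sketch of both tasks, assuming a deliberately narrow pattern for couple-style names and simple diacritic stripping in place of full transliteration (the function names and the single regex pattern are invented for this illustration):

```python
import re
import unicodedata

def split_couple(name: str) -> list:
    """Split a combined salutation like 'Mr. and Mrs. John and Mary Smith'
    into one record per person. Handles only this one pattern; real name
    parsers cover many more forms."""
    m = re.match(r"(\w+)\. and (\w+)\. (\w+) and (\w+) (\w+)$", name)
    if not m:
        return [name]
    title1, title2, first1, first2, last = m.groups()
    return [f"{title1}. {first1} {last}", f"{title2}. {first2} {last}"]

def to_latin(text: str) -> str:
    """Strip diacritics so 'Müller' and 'Muller' dedupe together.
    True transliteration (e.g. Cyrillic to Latin) needs a dedicated
    library; NFKD decomposition only handles accented Latin letters."""
    decomposed = unicodedata.normalize("NFKD", text)
    return decomposed.encode("ascii", "ignore").decode("ascii")

print(split_couple("Mr. and Mrs. John and Mary Smith"))
# ['Mr. John Smith', 'Mrs. Mary Smith']
print(to_latin("Müller"))  # Muller
```

Once each person has their own normalized record, standard matching and dedupe logic can treat the couple as two customers rather than one unparseable string.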

“The best model is one that can incorporate ‘smart, sharp tools’ to augment your current processes, as opposed to a monolithic approach that requires you to buy an entire suite of tools that you don’t necessarily need,” said Brown.

Melissa’s new Data Quality Portal
Melissa enables developers to improve their company’s data quality as they see fit. Rather than limiting what developers can accomplish in a 30-day trial, the new portal allows developers to move at their own pace by purchasing small credit packages.

“If your immediate problem is that you need real-time email verification on your e-commerce site, you can purchase a license for only that without any other encumbrances or add-ons,” said Brown. “It’s a low-risk initial investment, and it’s very easy to spin up when you need additional transactions.”
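A real-time check like the one Brown describes might be exercised from code roughly as follows. The endpoint, parameter names, and license key below are placeholders invented for this sketch, not Melissa's documented interface; consult the portal's API reference for the actual request format.

```python
# Hypothetical sketch of a real-time email-verification REST call.
# BASE_URL and all parameter names are placeholders, not a real API.
import json
import urllib.request
from urllib.parse import urlencode

BASE_URL = "https://api.example.com/email/verify"  # placeholder endpoint

def build_request_url(email: str, license_key: str) -> str:
    """Assemble the GET request a client would issue for one address."""
    params = {"email": email, "license": license_key, "format": "json"}
    return f"{BASE_URL}?{urlencode(params)}"

def verify_email(email: str, license_key: str) -> dict:
    """Issue the request and decode the JSON response (network required)."""
    with urllib.request.urlopen(build_request_url(email, license_key)) as resp:
        return json.load(resp)

print(build_request_url("user@example.com", "DEMO-KEY"))
```

Because each verification is a single stateless HTTP call, an e-commerce site can invoke it at form-submission time and spend credits only on the transactions it actually runs.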

Using Melissa’s new Data Quality Portal, developers can try out data cleansing and enrichment tools, as well as access code snippets, examples, and flexible REST APIs. Melissa also offers an audit service that allows developers to determine important data quality metrics, such as the percentage of duplicate customer records. “Most companies realize that poor data quality equals poor analytics,” said Brown.

“With GDPR, garbage in, garbage out comes with a higher level of risk. The portal provides an easy way for developers to proactively start reducing that risk.” For more information, visit www.melissa.com/developer.