Madhukar Kumar is vice president of product management and marketing at Liaison Technologies, and he’s convinced he’s seen the future of data. In the past 10 years, Kumar has served as director of professional services at Autonomy, manager and architect at Interwoven, and, finally, a vice president at HP.

Today, however, he’s strictly concerned with data, and how to help enterprises get their arms around the information stored in their systems. Traditionally, managing that much data has been a difficult task, but Kumar is convinced that the future holds a new market for data platforms.

SD Times: What’s going on in data right now?

Kumar: I would say we live in an age with massive disruption going on. Data integration is one aspect of it. If you look at integration in general, there are classic integrations, there’s data integration, and then there’s integration that is purely about pulling wires between systems, like Enterprise Service Buses (ESBs).

Almost 10 years ago, ESBs came about because of a huge explosion in monolithic applications. SAP and other ERP systems came about, and that triggered a need for integration. But more recently, those monolithic systems have started to deconstruct. There are cloud-based applications, and most have APIs. Now we’re seeing a huge explosion of applications.

A large IT organization is responsible for an average of 250-plus applications. The size of the data is still increasing. When you do integrations, you can no longer afford to leave the data in its place. If you’re doing integration between Salesforce, Workday and NetSuite the classic way, you’d create a new record in Salesforce, a new entry in NetSuite, and so on.

But now, if you want to do something interesting, you have to do data integrations: extract, transform, dedupe, analyze… Finally, you figure something out, but it’s a very long process.
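For readers who want the shape of that work in code, here is a minimal Python sketch of the extract-transform-dedupe step Kumar describes; the sample sources, field names and email-based matching rule are hypothetical examples, not any particular product’s pipeline.

```python
# A minimal sketch of the extract-transform-dedupe flow described above.
# The sources, field names, and email-based matching rule are hypothetical.

def extract(sources):
    """Pull raw records from each system's export (plain lists of dicts here)."""
    for records in sources:
        yield from records

def transform(record):
    """Normalize each record into a common shape."""
    return {
        "email": record.get("email", "").strip().lower(),
        "name": record.get("name", "").strip(),
    }

def dedupe(records):
    """Keep the first record seen for each key (arbitrarily, the email)."""
    seen = set()
    for record in records:
        key = record["email"]
        if key and key not in seen:
            seen.add(key)
            yield record

salesforce = [{"email": "Pat@Example.com", "name": "Pat Lee"}]
netsuite = [{"email": "pat@example.com ", "name": "Pat Lee"}]

merged = list(dedupe(transform(r) for r in extract([salesforce, netsuite])))
print(merged)  # one record instead of two near-duplicates
```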

In data integration, we’re seeing more demand for doing integration with the data management piece involved in the overall use case. This is where data lakes come in. What we’re seeing is customers who say, “When I do the integration, I want a copy, and I want to store it in a data lake where I can apply my business rules. I want to extract actionable insight from that data.” What used to be three different worlds are coming together.

In addition, what we’re seeing is that microservices completely disrupt the entire architecture world. It goes from monolithic, to service-oriented, to a point where you break up your functionality into modular pieces of applications with APIs around them.

You have to subscribe and publish, and by the time you add a new endpoint, it’s seven months down the road. When these new microservices come up, your integration engine has to be agile and integrate them fast. Take that data, harmonize it, dedupe it, and put it in a place where it’s in a fluid state, which can then be used to replay parts of the data.
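As a rough illustration of the publish/subscribe pattern Kumar mentions, here is a minimal Python sketch; the in-memory bus, topic name and handler are hypothetical stand-ins for a real integration engine, where a new endpoint simply subscribes instead of waiting months for a wired-up connection.

```python
# A minimal sketch of publish/subscribe: new endpoints register themselves,
# so the bus doesn't change when a new microservice comes online.
# The topic name and handler below are hypothetical examples.

from collections import defaultdict
from typing import Callable

subscribers: dict[str, list[Callable]] = defaultdict(list)

def subscribe(topic: str, handler: Callable) -> None:
    """Register a handler so a new endpoint can start receiving events."""
    subscribers[topic].append(handler)

def publish(topic: str, event: dict) -> None:
    """Deliver an event to every handler subscribed to the topic."""
    for handler in subscribers[topic]:
        handler(event)

# A new microservice comes online and subscribes without changing the bus.
subscribe("customer.updated", lambda e: print("billing saw:", e))
publish("customer.updated", {"id": "cust-1", "status": "customer"})
```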

What’s the solution to this problem?

That’s where we see a new methodology emerge: data platform as a service. There’s a huge scarcity of technology resources. If I go to Indeed.com, I think the #1 search is data scientist. People are looking for expensive data scientists, and they don’t have time for integration, nor the money to go hire 100 different developers to build out integrations.

With data platform as a service, data integration is handled in a cloud-based platform, so you don’t have to worry about that. But at the same time, data is growing, and a persistence layer stores it, immutable-log fashion, along with its metadata. At the end of the day, what happens is that while the data is integrated, it’s stored like a movie film reel. It could be key-value; it could be a graph. It’s a very adaptive way of looking at data and replaying it.
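The “film reel” idea can be sketched in a few lines of Python: integrated data lands in an append-only log, and any past state can be rebuilt by replaying the log up to a checkpoint. The entity names and offset scheme below are illustrative assumptions, not Liaison’s implementation.

```python
# A minimal sketch of an append-only, replayable log. Entries are never
# updated in place; past state is recovered by replaying up to an offset.

class ImmutableLog:
    def __init__(self):
        self._events = []  # append-only: existing entries never change

    def append(self, entity_id, data):
        """Record a new version of an entity along with its position metadata."""
        offset = len(self._events)  # frame number in the "film reel"
        self._events.append({"offset": offset, "entity_id": entity_id, "data": data})
        return offset

    def replay(self, up_to=None):
        """Rebuild a key-value view of state as of a given offset."""
        state = {}
        for event in self._events:
            if up_to is not None and event["offset"] > up_to:
                break
            state[event["entity_id"]] = event["data"]
        return state

log = ImmutableLog()
checkpoint = log.append("cust-1", {"name": "Acme", "status": "lead"})
log.append("cust-1", {"name": "Acme", "status": "customer"})

print(log.replay(up_to=checkpoint))  # state as it was at the checkpoint
print(log.replay())                  # current state
```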

Data platform as a service takes everything infrastructure as a service can do, but it also handles the integration layer and the big data layer, and provides those capabilities at the application layer.

It’s a methodology, like any PaaS. Other companies are starting to talk about the overall methodology, and a lot more claim they can do integration.

Does this work with streaming data?

You have the need for streaming, but you also have the need for point-in-time data. That’s where data platform as a service works. This is the Kappa Architecture, the next evolution of the Lambda Architecture. It takes data that can be used for streaming, but it’s also persisted as an immutable log. You now have polyglot persistence.
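A minimal sketch of the Kappa idea, assuming an in-memory list standing in for a durable log such as a Kafka topic: the same log serves live consumers and full reprocessing, with no separate batch path. The event types and the running count are hypothetical examples.

```python
# A minimal sketch of the Kappa Architecture's core idea: one immutable log,
# read from any offset. Reading from 0 reprocesses history (the "batch" view);
# reading from the end tails the live stream. Events here are hypothetical.

events = []  # stands in for a durable log such as a Kafka topic

def append(event):
    events.append(event)

def consume(from_offset=0):
    """Read the log from any offset: 0 replays history; len(events) tails."""
    for offset in range(from_offset, len(events)):
        yield events[offset]

def count_by_type(stream):
    """A simple derived view, rebuilt the same way for stream or replay."""
    counts = {}
    for event in stream:
        counts[event["type"]] = counts.get(event["type"], 0) + 1
    return counts

append({"type": "order"})
append({"type": "refund"})
append({"type": "order"})

# Reprocessing: rebuild the point-in-time view from the start of the log.
print(count_by_type(consume(from_offset=0)))  # {'order': 2, 'refund': 1}
```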

That information can then be elevated to knowledge, and then to wisdom. When you take knowledge along with data, metadata and context, it’s wisdom.