With data increasingly locked behind different services, API management is becoming more and more of a data integration challenge.

Currently, most companies view API management as an access problem, but Avadhoot Kulkarni, product manager at Progress, recommends they shift their mindset and view it as a data problem instead.

According to Kulkarni, APIs are “just ways to expose your data in user consumable ways.”

As such, managing APIs raises a number of data management challenges, including how to maintain data quality, data profiling, and data ownership. API developers and maintainers are concerned about data integrity and data consistency across their APIs, and while the emergence of microservices architectures has helped break monolithic applications down into smaller services, it has also created data silos.

“Information, which is critical for organizations for their decision making, is locked behind different services. And it’s not easily accessible to the tooling that helps them integrate that data and get a business decision out of those,” said Kulkarni. 


One way to address that challenge is to give direct access to back-end data, but that comes with its own set of new challenges, according to Kulkarni. It can create issues with data ownership, since the user-role access constraints built into the API logic as a security measure may no longer apply. This can be worked around for a small number of APIs by implementing custom integrations, but as the number of connections needed grows, it becomes less manageable.

In addition, being able to write those custom integrations for data warehouses, data lakes, or business intelligence tools requires a deep knowledge of the API itself. This is another reason why this solution isn’t scalable, according to Kulkarni. 

“You sort of waste your engineering capacity on that instead of putting it on your business. You start spending on this side project, which is most likely not the best avenue for spending your resources,” said Kulkarni.

Progress’ Kulkarni predicts that more and more of the industry will soon accept this idea of API management being a data management concern. AI and machine learning have permeated so much of what is done in the tech space, and data-driven or data-aware decision making is becoming more of the norm. 

“API management will be treated more like a data management problem in the near future. So the questions about data quality, data profiling, how data gets moved between the different components, who has access to this data, as well as what privileges that particular person has on that data (like who can modify versus who can only read), and how that data integrates with different solutions, will not only be considered, but will also be baked into the API architecture going forward,” said Kulkarni.

Data mesh emerges

According to Eric Madariaga, chief marketing officer at CData, data mesh is an approach that is emerging to help companies with this challenge. A data mesh helps to decouple data entry points.

Data mesh was included in ThoughtWorks’ Technology Radar, first in November 2019 in the “Assess” category, and then moved into the “Trial” category in the October 2020 Radar.

ThoughtWorks defines data mesh as “an architectural and organizational paradigm that challenges the age-old assumption that we must centralize big analytical data to use it, have data all in one place or be managed by a centralized data team to deliver value.”

According to ThoughtWorks, the concept is built on four principles:

  1. Decentralized data ownership and architecture
  2. Domain-oriented data served as a product
  3. Self-serve data infrastructure
  4. Federated governance, enabling interoperability between systems

“Different data assets within an organization become surfaced through a mesh-like architecture, so that they can be consumed and integrated from a variety of different resources,” said Madariaga. 

The concept isn’t that far off from the original idea behind APIs, Madariaga explained. The data mesh provides a common interface for communication between different data resources, much like how API infrastructures help applications communicate with each other.

“It becomes kind of an entire architectural paradigm,” said Madariaga. “It’s something that large organizations are using when they have multiple data warehouses and things. Conceptually, you know, it’s the idea of having a common interface for communicating with these resources, and solving the distributed dispersed data asset issues that organizations are facing and dealing with today. In API infrastructure, people are having applications that are trying to communicate with each other. They’re trying to do that in a common and consistent way. Data mesh, similarly, is solving that problem.”
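The mesh idea Madariaga describes can be sketched in a few lines: each domain team exposes its data through the same minimal interface, so any consumer can discover and read any domain’s data the same way. This is an illustrative assumption, not a standard; the names (`DataProduct`, `OrdersProduct`, the registry) are hypothetical.

```python
from typing import Iterable, Protocol

# Hypothetical common interface: every domain exposes its data product
# the same way, regardless of how the domain stores the data internally.
class DataProduct(Protocol):
    name: str
    def read(self) -> Iterable[dict]: ...

class OrdersProduct:
    name = "sales.orders"
    def read(self):
        return [{"order_id": 1, "amount": 120.0}]

class ShipmentsProduct:
    name = "logistics.shipments"
    def read(self):
        return [{"order_id": 1, "carrier": "acme"}]

# The "mesh": a registry that lets consumers discover and integrate
# data products from different domains through one consistent surface.
mesh = {p.name: p for p in (OrdersProduct(), ShipmentsProduct())}
rows = list(mesh["sales.orders"].read())
print(rows[0]["amount"])  # 120.0
```

The point of the sketch is the uniformity: a consumer integrating `logistics.shipments` writes the same code as one integrating `sales.orders`, which is the “common and consistent way” of communicating that Madariaga compares to API infrastructure.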

Event streams also gaining popularity

According to David Mooter, senior analyst at research firm Forrester, event-driven architecture is another technology that is coming into play in the API management equation, specifically event streams. 

Mooter said a number of vendors, such as IBM and Solace, are already applying event streams to API management, and there is demand from clients. REST APIs have opened the doors for a lot of business innovation, but they do have their limitations, and event streams are helping to fill in some of those gaps.

“It’s growing in popularity, but I’ve seen a lot more growth in demand for event streams not as an alternative to REST, but as an additional tool set that complements REST,” said Mooter.
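The complement Mooter describes comes down to interaction style: a REST call returns the current state when asked, while an event stream pushes every change as it happens. A minimal in-process sketch (no particular vendor’s API; the names are illustrative) shows the difference:

```python
# Request/response vs. event stream, side by side.

class EventStream:
    """Toy stand-in for an event broker: push each event to subscribers."""
    def __init__(self):
        self.subscribers = []
    def subscribe(self, handler):
        self.subscribers.append(handler)
    def publish(self, event):
        for handler in self.subscribers:
            handler(event)

# REST-style: the consumer must poll, and only sees the latest state.
state = {"orders": 0}
def rest_get_orders():
    return state["orders"]

# Event-style: the consumer is notified of every change as it happens.
stream = EventStream()
received = []
stream.subscribe(received.append)

for _ in range(3):
    state["orders"] += 1
    stream.publish({"type": "order.created", "total": state["orders"]})

print(rest_get_orders())  # 3: a single poll sees only the final state
print(len(received))      # 3: the stream delivered each intermediate change
```

A poll between the first and third order would have missed the intermediate events; that gap is what event streams fill in alongside REST.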

According to CData’s Madariaga, standardization of APIs is important, yet there are many different API frameworks that are in use today, such as REST and SOAP. 

“So there’s this huge landscape of how applications are talking to one another, and all kinds of different API interface standards,” said Madariaga. 

Madariaga believes it’s important to have a common language for these APIs to communicate through. 

Democratizing data management

“It enables citizen integration and citizen developers and citizen integrators to use their tooling to work with APIs and data … If you want to increase adoption of your APIs, which you as a developer worked hard to build, providing tooling that gets all the way down to the end user is a very popular, and very important, way to enable the broadest usage of your APIs,” said Madariaga.

The beauty of low-code is that it allows non-developers to build applications through a drag-and-drop interface, but according to Forrester’s Mooter, those UI portals aren’t very useful unless they’re able to talk to IT systems. Therefore, it’s important that citizen developers are able to connect via a robust suite of APIs.

According to Madariaga, there can be a lot of complexity in the way citizen developers connect to APIs. If they want to integrate with an API, they must first define inputs and outputs, and may also have to configure the authentication settings. 

This can be a barrier to entry for those without the technical knowledge needed.  “By abstracting that into, say, a common database standard interface, you literally just drop in a driver and start working with back end APIs, like you would a standard traditional database, and every low-code and no-code application knows how to work with a traditional RDBMS database,” said Madariaga.

This abstraction not only benefits citizen developers, but saves traditional developers time as well. 

“Because really, ultimately, what happens is you’re submitting queries and getting back tables of data, and those tables are self describing,” said Madariaga. “So they come back, and they provide the columns of data that are exposing the underlying APIs. You can do things like joins and aggregates, and you could do all that in way less code than it would take to go connect to an API itself, get data, do the transformations, do the integration, or anything else on the back end. It is a lot more complex when you are not using one of these API standards.”
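As a sketch of the pattern Madariaga describes, here `sqlite3` stands in for a hypothetical database-standard driver that surfaces two back-end APIs as tables (the table names and data are invented for illustration). The result set is self-describing, and a join plus an aggregate across both “APIs” fits in one query instead of two fetch-and-merge routines:

```python
import sqlite3

# In-memory stand-in for a driver that exposes two APIs as SQL tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE crm_contacts (id INTEGER, region TEXT);
    CREATE TABLE billing_invoices (contact_id INTEGER, amount REAL);
    INSERT INTO crm_contacts VALUES (1, 'EMEA'), (2, 'APAC');
    INSERT INTO billing_invoices VALUES (1, 100.0), (1, 50.0), (2, 75.0);
""")

# One query joins and aggregates across both "APIs"; doing the same by
# hand would mean two API calls, a merge, and a manual group-by.
cur = conn.execute("""
    SELECT c.region, SUM(i.amount) AS total
    FROM crm_contacts c
    JOIN billing_invoices i ON i.contact_id = c.id
    GROUP BY c.region
    ORDER BY c.region
""")
cols = [d[0] for d in cur.description]  # the table describes itself
result = cur.fetchall()
print(cols)    # ['region', 'total']
print(result)  # [('APAC', 75.0), ('EMEA', 150.0)]
```

Because every low-code tool already speaks this database interface, the same query works without the tool knowing anything about either back-end API.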

Best practices for API creation

According to Forrester’s Mooter, it’s best to develop APIs by looking “outside in” rather than “inside out.” What this means is that rather than looking inwards at how the IT systems are already implemented, API developers should look outwards towards who will actually be using the API and what their needs are. 

He explained that further down this planning process it might be necessary to start considering your internal IT constraints due to factors like cost, but the process “should always begin and largely be driven by end user need, not what’s already in your IT system.”
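A hypothetical contract fragment can make the outside-in idea concrete: the resource below is named and shaped around the question an end user asks (“where is my order?”), not around whichever internal systems happen to hold the records. The path, fields, and titles are illustrative assumptions, not a real API.

```yaml
# Hypothetical outside-in contract: consumer vocabulary, not IT structure.
openapi: 3.0.3
info:
  title: Order Tracking API (illustrative)
  version: 1.0.0
paths:
  /orders/{orderId}/status:
    get:
      summary: Current status of an order, as a customer would ask for it
      parameters:
        - name: orderId
          in: path
          required: true
          schema: { type: string }
      responses:
        "200":
          description: A consumer-facing answer, not raw back-end records
          content:
            application/json:
              schema:
                type: object
                properties:
                  status: { type: string, example: shipped }
                  estimatedDelivery: { type: string, format: date }
```

Internal constraints (which systems feed `status`, what it costs to compute `estimatedDelivery`) come into the design later, as Mooter notes, but the contract itself starts from the consumer’s need.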

Another important consideration for API management is governance. Mooter explained that companies tend to either under-govern or over-govern, neither of which is ideal. Over-governing can slow things down too much, while not having enough governance can result in targets not being met. “Finding that sweet spot is rather challenging for organizations,” said Mooter.