APIs

The role of the library is rapidly morphing into the role of the API. As services-based applications continue to become the norm inside enterprises, the next logical step is often to make them available to the outside world. Today, the API has become the preferred method of interaction with third parties and outside developers, not just a nice-to-have option for automating business-to-business transactions.

This brave new world of intertwined services has made the life of the average developer a bit more interesting. From design, to test, to deployment and maintenance, APIs are now involved in every layer of the development process.

Just how does one design an API to last? How can APIs be scaled effectively, and what will these new standards change about the day-to-day work of API management?


From a testing perspective, APIs bring complexity into the process on both ends. APIs from the outside world must either be tunneled to test environments, or they must be virtualized in order to fit into those tight testing windows. Temil Sanchez, sales engineer at SmartBear Software, said that service virtualization and mocking are both important to the process.

In his sales role, Sanchez often uses APIs in his product demonstrations. During one sales presentation, one of his required demonstration APIs was offline. By virtualizing the service, he was able to replace it on the fly, allowing his demonstration application to work in front of potential customers.

This type of reasoning also holds in the testing environment, where it’s not always possible to bring in outside API calls due to enterprise network restrictions. Service virtualization is an important tool for creating the feedback loop of a Continuous Integration process when building with APIs.

Additionally, virtualizing a line-of-business API ensures developers will always be able to test against it, even when there are turbulent times afoot, such as a major upgrade or an internal systems update.

With proper service virtualization in place, APIs can become a part of the automated testing process, said Sanchez.
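The idea can be sketched in a few lines: stand up a stub that returns a canned response in place of the real, possibly unreachable, service, and point the tests at it. The endpoint name and payload below are hypothetical, and a real virtualization tool would add recording, latency simulation, and stateful behavior on top of this.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Canned response standing in for the real, possibly offline, service.
CANNED = {"status": "shipped", "order_id": 42}

class StubHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps(CANNED).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep test output quiet
        pass

def start_stub(port=0):
    """Start the stub on an ephemeral port in a background thread."""
    server = HTTPServer(("127.0.0.1", port), StubHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

server = start_stub()
url = "http://127.0.0.1:%d/orders/42" % server.server_port
data = json.loads(urlopen(url).read())
print(data["status"])  # the test suite can now run without the real API
server.shutdown()
```

Because the stub binds an ephemeral local port, it fits inside the restricted networks and tight test windows Sanchez describes.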

One step closer to standards
A few years ago, we would have called them Web-based APIs with a capital W. Today, they’re just APIs, RESTful though they may be. And it is within this tangle of HTTP calls that standards have begun to take hold. RAML, WADL and the now most popular Swagger seek to bring some cohesion to the many millions of APIs out there in the wild.

All three of these standards attempt to provide a common, machine-readable way of describing an API, which typically surfaces as automatically generated documentation.
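As a rough illustration, a minimal Swagger 2.0 description of a single endpoint might look like the following (the API name, path, and fields are hypothetical):

```yaml
swagger: "2.0"
info:
  title: Orders API
  version: "1.0.0"
paths:
  /orders/{id}:
    get:
      summary: Fetch a single order
      parameters:
        - name: id
          in: path
          required: true
          type: integer
      responses:
        "200":
          description: The requested order
```

From a file like this, tooling can generate interactive documentation, client SDKs, and test stubs without reading any prose.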

Cedric Monier, vice president and product line manager for API management at Axway, said that the two most popular standards are Swagger and RAML, with Swagger gaining the most mindshare among his customers. While Axway supports both in its API-management platform, he said other vendors are picking Swagger first.

“If you look at the actual adoption of the standards, RAML is primarily pushed by MuleSoft, and other vendors have made the choice of Swagger,” he said. “In the tooling available, there are a lot more tools today supporting Swagger than supporting RAML.”

That’s not to say Swagger is the better standard from a technical perspective, added Monier. “In the process of registering the API and onboarding it, we can start with an API that is defined in Swagger or RAML. We do see RAML 1.0 is a little ahead of the game when we look at the functional scope,” he said.

RAML was created by Uri Sarid, Emiliano Lesende, Santiago Vacas and Damian Martinez, all of whom work for MuleSoft. Sarid, CTO of MuleSoft, said that RAML was created to deal with the ever-increasing number of APIs out there.

“We were saying, ‘It seems that a lot of APIs that are being published are a little bit ad hoc.’ Similarly, API consumption was ad hoc,” he said. “People were writing code after reading documentation of an API and attaching to it that way. There were other somewhat standardized ways of doing things—like WADL or Swagger—that people used as intermediary bridges. If I have a tool chain that generates a Swagger implementation, and I can put a UI on top of that, then I am good.”

Yet Sarid said there was something missing from these other standards. “What we didn’t see was intentionality—someone saying, ‘I intend my API to work in this way.’ If we’re going to make developers and enterprise lives easier in the API economy, we need to get everyone intentionally thinking about APIs,” he said.

Thus, Sarid and the RAML working group set out to design a specification that would force the user to indicate their intention when creating the specifications for their APIs. Such an effort, he said, goes the extra mile to define the purpose of an API, not just delineate its functions in a mechanical manner. Such extra metadata allows the API to be better defined for automated monitoring and administration, he said.

“From that humanly writeable but machine readable specification, you can generate goodness: auto-documentation, auto-governance. We didn’t see Swagger wanted to go in this direction,” said Sarid.

Designing the RAML standard was just the first step, however, said Sarid. “We knew it would make no impact without tools to help you leverage those capabilities, so at the same time we developed an API designer and an API console that was reactive to that. We created an API notebook based on my experience with Mathematica. If you have an API specification, it should be trivial to code against it. The API notebook allows you to point to an API specification so you can start scripting against it. It’s almost like blogging for the API world,” he said.

The world of API standards really heated up at the start of 2016 when the Open API Initiative began operations. This project, working under the umbrella of the Linux Foundation, seeks to develop an API description format based on the Swagger specification. In simpler terms, the OpenAPI 2.0 specification is simply the Swagger 2.0 specification, renamed. The Initiative has attracted many vendors, from Microsoft and Google to IBM and Intuit.

The initiative is currently hard at work on the OpenAPI 3.0 specification. It will be a breaking release, with significant structural improvements, and will follow the semantic versioning conventions that have become popular of late, meaning we’re now expecting version 3.0.0 of the specification instead of simply 3.0.

Root-level properties are more easily understood in OpenAPI 3.0.0, thanks to the new components property, which gathers reusable definitions in one place without applying them to the entire API. The root level of a spec will also contain a servers array of objects, allowing an API to declare multiple root URLs, such as both an HTTP and an HTTPS endpoint.
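A hypothetical OpenAPI 3.0.0 fragment shows both changes: the servers array declaring two root URLs, and a schema defined once under components and referenced from a path.

```yaml
openapi: 3.0.0
info:
  title: Orders API
  version: 3.0.0
servers:
  - url: https://api.example.com/v1
  - url: http://api.example.com/v1
components:
  schemas:
    Order:
      type: object
      properties:
        id:
          type: integer
        status:
          type: string
paths:
  /orders/{id}:
    get:
      parameters:
        - name: id
          in: path
          required: true
          schema:
            type: integer
      responses:
        "200":
          description: The requested order
          content:
            application/json:
              schema:
                $ref: "#/components/schemas/Order"
```

The Order schema lives under components, so other paths can reference it without redefining it, and nothing under components applies to the API until it is explicitly referenced.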

The new specification for OpenAPI 3.0.0 also includes a Path Item Object, which can contain metadata about the objects being manipulated through API methods, providing a simpler way to describe the functionality of those objects where it exists.

Mark Geene, CEO and cofounder of Cloud Elements (an API integration platform), has built a business on making other people’s APIs easier to use. He said that building APIs to a specification and using RAML or Swagger is a major key to success.

“I’m really encouraged with Swagger being contributed to the OpenAPI Initiative,” he said. “There are a lot of companies, including ourselves, supporting that. That’s really broken out. Swagger has taken off significantly because of that, leading the way by far in adoption as a standard way to document an API.

“That’s the most likely area of standards now: how you document them. As for structures, domain models, etc., there’s not going to be a near-term type [standard]. Application developers look at that as their core IP. At least if I can read and see a consistent way to ingest those APIs from a documentation perspective, that’s a big step forward.”

Designing for developers
Cloud Elements offers a place where APIs are consolidated under singular, simpler API calls. Social media platforms, for example, can be accessed with a single API through a Cloud Elements API hub, rather than having to use Facebook’s API, Twitter’s API, LinkedIn’s API, and others individually.

“We have 115 public ’elements.’ They’re connector services to things like Hubspot, Salesforce, QuickBooks, etc. We probably have done another thousand integrations to services that aren’t in our public catalog,” said Geene. “We’ve done a lot of writing to various APIs. We give developers a way to create one set of APIs to rule all the APIs they work with.”

Geene and his team have seen it all in their time working with APIs. There are a lot of ways an API can go bad, and he has some holistic advice for other developers designing APIs.

First and foremost: “Deal with custom data well in your API, because the majority of people using a SaaS application have created custom fields or objects in those applications. If you want to have a dynamic integration experience, you have to let the people integrating to your application consume those objects and understand the metadata of those objects and structure of those objects,” said Geene.

Next on his list is being able to learn about change. “Now that I can access the data at your endpoint, I want to know when it changes. Of the 100-plus APIs we’ve integrated to, only 10% have a comprehensive set of webhooks. Those give me the ability to subscribe and be notified when something changes. If you don’t have those, you have to create an entire polling framework to ask ‘Has it changed? Has it changed?’ That’s very inefficient. We’re seeing more support for that, but it’s slow.”
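The difference Geene describes can be reduced to a toy sketch (all names hypothetical): without webhooks the consumer burns requests asking "has it changed?", while a subscription turns every change into a single push.

```python
# A toy provider contrasting polling with webhook-style subscriptions.
class Resource:
    def __init__(self):
        self.version = 0
        self.subscribers = []

    def subscribe(self, callback):
        # Webhook-style: register once, get pushed on every change.
        self.subscribers.append(callback)

    def update(self):
        self.version += 1
        for callback in self.subscribers:
            callback(self.version)

resource = Resource()

# Polling-style: the consumer keeps asking even when nothing happened;
# every check below is a wasted request against an unchanged resource.
wasted_polls = sum(1 for _ in range(100) if resource.version != 0)

# Webhook-style: one subscription replaces the entire polling loop.
events = []
resource.subscribe(events.append)
resource.update()
print(wasted_polls, events)  # 0 [1]
```

In a real integration the callback would be an HTTPS endpoint the provider POSTs to, but the economics are the same: zero traffic until something actually changes.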

Steve Davis, CTO of Four51 (makers of an e-commerce platform), converted his company’s e-commerce platform to an API-based platform. In doing so, the company learned some lessons about API design. He embraced advice similar to Geene’s during the API development process for its product, Ordercloud.io.

“We added in extended properties,” said Davis. “As complex as our data model is, it’s never enough for an individual customer. You go into a database, you get to customize columns. With an API, you don’t have that possibility. You can extend the data model within our API. That’s been popular with implementers of our system. We were early adopters of the patch method in HTTP, so we could provide partial resource modifications.”
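Partial resource modification with PATCH is commonly implemented as a JSON Merge Patch (RFC 7386), which this sketch implements; the order fields and the "xp" extended-properties bag are hypothetical, and Four51's actual implementation may differ.

```python
def json_merge_patch(target, patch):
    """Apply an RFC 7386 JSON Merge Patch: null deletes a key,
    nested objects merge recursively, anything else replaces."""
    if not isinstance(patch, dict):
        return patch
    if not isinstance(target, dict):
        target = {}
    result = dict(target)
    for key, value in patch.items():
        if value is None:
            result.pop(key, None)       # null means "delete this field"
        else:
            result[key] = json_merge_patch(result.get(key), value)
    return result

# A resource with customer-defined extended properties under "xp".
order = {"id": 42, "status": "open", "xp": {"rush": True, "note": "call first"}}

# PATCH body: change one field, delete one extended property.
patched = json_merge_patch(order, {"status": "shipped", "xp": {"note": None}})
print(patched)  # {'id': 42, 'status': 'shipped', 'xp': {'rush': True}}
```

The appeal for implementers is that clients send only the fields they want to change, so custom fields they never touch survive the update untouched.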

One major pain point for all API developers is around authentication, said Geene. With OAuth 2, he said, there is tremendous confusion around implementation. “You still see, even with OAuth 2, so many variations in the implementation,” he said.

“The standard isn’t consistent enough. There are unique differences in how people interpret that authentication mechanism. Getting through authentication is probably one of the biggest challenges in APIs.”
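The variation Geene describes shows up even in the simplest OAuth 2 flow. This sketch builds a client-credentials token request; the knobs it exposes (credentials in a Basic header, which RFC 6749 recommends, versus in the form body, and scope handling) are exactly the points where real providers diverge. The URL and credentials are hypothetical.

```python
from base64 import b64encode
from urllib.parse import urlencode

def client_credentials_request(token_url, client_id, client_secret,
                               creds_in_body=False, scope=None):
    """Build an OAuth 2 client-credentials token request as a plain dict.
    Providers differ on exactly the options surfaced here."""
    form = {"grant_type": "client_credentials"}
    headers = {"Content-Type": "application/x-www-form-urlencoded"}
    if scope:
        form["scope"] = scope                 # scope formats also vary
    if creds_in_body:
        # Some providers insist on credentials in the form body...
        form["client_id"] = client_id
        form["client_secret"] = client_secret
    else:
        # ...while RFC 6749 prefers HTTP Basic authentication.
        token = b64encode(f"{client_id}:{client_secret}".encode()).decode()
        headers["Authorization"] = "Basic " + token
    return {"url": token_url, "headers": headers, "body": urlencode(form)}

req = client_credentials_request("https://auth.example.com/token", "app", "s3cret")
print(req["body"])  # grant_type=client_credentials
```

A client library has to parameterize all of these choices per provider, which is why "getting through authentication" consumes so much integration time.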

Another holistic development tip for API developers from Geene is to keep backward compatibility. “Writing to so many APIs from so many different endpoints, we’ve seen everything under the sun. I have to say most vendors are doing a good job of not creating backward incompatibility in their APIs, for the most part,” he said.

“When it does happen, it creates havoc. Sometimes you have to do that. Best practices are to maintain different versions of that resource and the method applied to that resource. Do versioning at the method level whenever possible. I think that’s a good practice to follow. It saves developers a significant amount of time on that front.”
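Method-level versioning, as Geene recommends, can be sketched as a router keyed by version as well as by verb and resource: the legacy handler stays registered under its old version while new clients opt into the breaking change. The resources and response shapes below are hypothetical.

```python
# Registry keyed by (version, verb, resource) so old behavior survives.
handlers = {}

def route(version, method, resource):
    def register(fn):
        handlers[(version, method, resource)] = fn
        return fn
    return register

@route("v1", "GET", "orders")
def list_orders_v1():
    return {"orders": [1, 2, 3]}                 # the legacy shape

@route("v2", "GET", "orders")
def list_orders_v2():
    return {"orders": [1, 2, 3], "total": 3}     # breaking change, new version

def dispatch(version, method, resource):
    return handlers[(version, method, resource)]()

print(dispatch("v1", "GET", "orders"))  # old clients keep working unchanged
```

Because only the changed method gets a new version, consumers of every other endpoint are never forced to migrate.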

The real secret to writing and maintaining a good API, said Geene, is proper grammar. “Writing at the API level is really about separating the nouns and the verbs,” he said.

“If you look at the RESTful world, where you get into trouble is where you’re not sticking to the resources being the nouns and sticking to the five verbs that are the common, RESTful verbs: get, post, put, delete, and patch. Doing those verbs in a consistent manner at the most basic level makes a significant impact.”
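Geene's noun-and-verb discipline can be illustrated with a minimal dispatcher: the resource ("orders") stays a noun, and only the five common REST verbs act on it. The store and field names are hypothetical.

```python
# An in-memory "orders" resource acted on only by the five REST verbs.
orders = {}

def handle(verb, resource_id=None, body=None):
    if verb == "POST":                      # create
        new_id = len(orders) + 1
        orders[new_id] = dict(body)
        return new_id
    if verb == "GET":                       # read
        return orders[resource_id]
    if verb == "PUT":                       # full replacement
        orders[resource_id] = dict(body)
        return orders[resource_id]
    if verb == "PATCH":                     # partial update
        orders[resource_id].update(body)
        return orders[resource_id]
    if verb == "DELETE":                    # removal
        return orders.pop(resource_id)
    raise ValueError("non-RESTful verb: " + verb)

oid = handle("POST", body={"status": "open"})
handle("PATCH", oid, {"status": "shipped"})
print(handle("GET", oid))  # {'status': 'shipped'}
```

The consistency is the point: a consumer who knows the noun can guess every operation, instead of hunting for endpoints like /createOrder or /orderUpdate.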

There’s an easy way to measure whether or not an API is truly well implemented and following best practices, said Geene. “With best practice APIs, I can do everything through APIs. A lot of times people will set up eventing, but a lot of times I have to go to the web UI to set up a webhook or create a new user. In the best APIs to work with, I can do everything: I can administrate users, set up events, and access comprehensive metadata. Having a robust set of APIs around metadata is an area that’s underserved, where I can understand your structure and the entire catalog of APIs through that metadata, and do that well.”

About Alex Handy

Alex Handy is the Senior Editor of Software Development Times.