Serverless adoption has been growing rapidly as the market begins to mature and as enterprise applications continue to shift in the direction of containers and microservices.
The most prominent cloud providers including Amazon — which is at the forefront with its AWS Lambda offering — IBM, Microsoft, Google, and others have already released serverless computing capabilities and continue to add more serverless functionalities.
In this year’s State of Serverless report, Datadog found that half of all AWS users have adopted AWS Lambda. The report also found that close to 80% of large enterprises running on AWS, and the vast majority of AWS container users, have adopted Lambda as well.
Despite the name, serverless doesn’t denote the lack of servers, but rather the ability to build and run applications without thinking about servers, a concept praised by many developers.
The primary benefits of serverless are agility, simplicity and cost. With serverless, developers can focus on their code and the platform takes care of the rest. This can reduce the time from idea to production significantly, according to Sachin Pikle, the product strategy director at Oracle.
“It’s a different paradigm. With a serverless functions-as-a-service (FaaS) offering, developers can write code using their favorite programming language, deploy and run it without having to provision, manage, or scale servers,” Pikle said. “It can be a massive productivity boost for developers — you can do a lot more with a lot less.”
In addition, serverless applications are easier to design and implement, as complex concerns like multi-threading, scaling, high availability, and security are pushed down to the serverless platform. Over the long haul, serverless also lowers costs: organizations pay only for the resources an application actually consumes, and nothing for idle resources, since pricing is based on actual consumption rather than on pre-purchased units of capacity, according to Pikle.
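The pay-per-use model can be sketched with some simple arithmetic. The rates below are hypothetical placeholders, not a quote of any provider's actual pricing; the point is that cost scales with invocations and duration, and idle time costs nothing.

```python
# Illustrative pay-per-use cost arithmetic. The rates are hypothetical
# placeholders, not any provider's actual price list.

PRICE_PER_MILLION_REQUESTS = 0.20   # $ per 1M invocations (hypothetical)
PRICE_PER_GB_SECOND = 0.0000166667  # $ per GB-second (hypothetical)

def monthly_cost(invocations, avg_duration_s, memory_gb):
    """Cost tracks actual usage: requests plus compute time consumed."""
    request_cost = invocations / 1_000_000 * PRICE_PER_MILLION_REQUESTS
    compute_cost = invocations * avg_duration_s * memory_gb * PRICE_PER_GB_SECOND
    return round(request_cost + compute_cost, 2)

# 5M invocations a month at 200 ms each with 512 MB of memory:
print(monthly_cost(5_000_000, 0.2, 0.5))
# An application that sits idle all month costs nothing:
print(monthly_cost(0, 0.2, 0.5))
```

With pre-purchased capacity, the idle case would still be billed; here it is zero by construction.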
Oracle found that the most common serverless use cases are event-driven governance, security policy enforcement, data ingestion, log ingestion, running untrusted code in a secure and isolated manner, functions as web and mobile API back-ends — especially for SaaS extensions, reacting to business events in SaaS applications, and machine learning.
Most serverless applications are event-driven, use managed services, and trigger code automatically.
For instance, uploading an image file to an object store bucket can trigger an image-resize function, or an operational alert for high memory or CPU usage can trigger the platform to scale up the virtual machine, Pikle explained.
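The image-upload example can be sketched as a minimal event-driven handler in the style of an AWS Lambda function wired to an object-store upload notification. The event shape loosely follows the S3 notification format; the resize itself is stubbed out, since the point here is the trigger wiring, not image processing.

```python
# Minimal sketch of an event-driven serverless handler, in the style of a
# Lambda function triggered by an object-store upload. The event shape
# loosely follows the S3 notification format; the actual resize is stubbed.

def handler(event, context=None):
    """Invoked automatically by the platform when an object is uploaded."""
    results = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # In a real function this is where you would fetch the object,
        # resize it (e.g. with an imaging library), and write it back out.
        results.append({"bucket": bucket, "key": key, "action": "resize"})
    return results


if __name__ == "__main__":
    # Simulate the platform delivering an upload event.
    event = {"Records": [{"s3": {"bucket": {"name": "photos"},
                                 "object": {"key": "cat.png"}}}]}
    print(handler(event))
```

The developer writes only the handler; provisioning, invocation, and scaling are the platform's job.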
“Like anything else, serverless has its pros and cons. It’s ideal for short-running, event-driven, spiky workloads,” said Pikle. “Looking at several organizations that have successfully adopted serverless, their pros definitely outweigh their cons.”
Another reason why serverless has picked up so much momentum is that there is a growing ecosystem of tools that are being built into the platforms of the large cloud providers. A growing number of languages are also being supported, according to Arun Chandrasekaran, a distinguished analyst at Gartner.
“The biggest thing drawing users to serverless is operational simplicity, because a lot of things like setup, integration, provisioning, and management are abstracted away from the consumer of the service, which is the developer,” Chandrasekaran said.
As serverless architectures mature, providers have been able to minimize the problem of cold starts, the performance penalty incurred when a function is invoked on a fresh instance rather than a warm one. Capabilities such as provisioned concurrency, which keeps a set of function instances initialized and ready to respond, have vastly reduced the latency that cold starts cause.
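The warm-start behavior that cold-start mitigation relies on can be sketched as follows: expensive setup runs once per container, at import time during the cold start, and later "warm" invocations of the same container reuse that state. All names here are illustrative, not a real provider SDK.

```python
# Sketch of the warm-start pattern behind cold-start mitigation: expensive
# setup runs once per container (at import time, during the cold start),
# and subsequent "warm" invocations reuse it. Names are illustrative only.

import time

def _expensive_init():
    """Stands in for loading config, opening SDK clients, and so on."""
    time.sleep(0.01)  # pretend this is the slow part of a cold start
    return {"db_client": "connected"}

# Runs exactly once per container, i.e. only during the cold start.
_STATE = _expensive_init()
_INVOCATION_COUNT = 0

def handler(event, context=None):
    global _INVOCATION_COUNT
    _INVOCATION_COUNT += 1
    # Warm invocations skip _expensive_init entirely and reuse _STATE.
    return {"invocation": _INVOCATION_COUNT, "db": _STATE["db_client"]}
```

Provisioned concurrency, in effect, pre-pays the module-level initialization cost so that the first real request never sees it.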
The growing maturity of these architectures has also expanded their appeal across different use cases.
“[Serverless] infrastructure is so mature and so scalable right now. Most invocations of serverless compute are for data processing jobs, whereas most projects are for APIs right now, whether that’s building out a REST API, GraphQL API, or even a microservice or monolithic application,” said Austen Collins, the founder and CEO of framework provider Serverless Inc.
Collins said that compute is only about half of the picture of the value that can be derived from serverless. The other half comes from all of the managed serverless cloud services that integrate with those functions, he said, and this is what really expands the possibility of use cases.
“At the end of the day, what makes serverless serverless is the rich integrations that serverless frameworks have with data sources, so for example, take Lambda and the rich integrations it has with DynamoDB. And secondly, it has the whole scalability aspect like auto-scaling. Trying to imitate that scalability in a data center is just not possible,” Gartner’s Chandrasekaran said.
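One common form the Lambda/DynamoDB integration takes is a function subscribed to a DynamoDB stream, invoked automatically as table items change. The sketch below parses records in the shape of the DynamoDB Streams event format; it is illustrative and uses no AWS SDK.

```python
# Sketch of a handler for a DynamoDB Streams event, one common form of the
# Lambda/DynamoDB integration described above. The event shape follows the
# DynamoDB Streams record format; no AWS SDK is used here.

def handler(event, context=None):
    """Collect the new image of every inserted item."""
    inserted = []
    for record in event.get("Records", []):
        if record.get("eventName") == "INSERT":
            new_image = record["dynamodb"]["NewImage"]
            # DynamoDB wraps each attribute in a type tag, e.g. {"S": "abc"};
            # unwrap the single tagged value for each attribute.
            item = {k: list(v.values())[0] for k, v in new_image.items()}
            inserted.append(item)
    return inserted
```

The auto-scaling aspect Chandrasekaran mentions is what makes this wiring notable: the platform fans out as many handler instances as the stream's write volume demands.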
The term serverless has evolved significantly in terms of how the marketplace and vendors are trying to define it.
“Serverless kind of became a moniker in 2015 when Amazon launched Lambda as a service. When people said serverless they specifically meant serverless functions, which is a way for you to do application development where you’re decomposing applications into serverless functions and running it in a function as a service environment,” Chandrasekaran said. “Today, serverless is a bit broader, particularly in terms of how vendors talk about it. For example in AWS, a lot of other services such as SQS or Athena, are looped into serverless.”
However, he added that functions as a service is still the most prominent manifestation of serverless today.
At times in the industry, the terms serverless and functions as a service (FaaS) are used interchangeably. However, there is an important distinction between the two.
Serverless refers to an environment where people don’t worry about what happens below the application layer. By contrast, when talking about FaaS or FaaS software, it refers to the software that makes running a serverless environment possible, according to Al Gillen, the group vice president of software development and open source at IDC.
“Think of FaaS as the enabling service, whereas serverless is the composite offering,” Gillen said. “Serverless environments are much more akin to database stored procedures and triggers. Think of it as your bank account. When your bank account goes negative a dollar, it triggers an action where it freezes the ability to pay any more checks out of that account and sends off a message saying you’ve overdrafted or you’re negative and then it pulls up ‘incurs late fee.’ Those things only happen with a certain set of conditions and generally speaking, that’s how serverless environments are set up to operate as well.”
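Gillen's bank-account analogy can be sketched as a condition-triggered action, the same shape as a database trigger or a serverless event rule. Everything below is a toy illustration; no real banking or cloud API is involved.

```python
# Toy sketch of Gillen's bank-account analogy: actions fire only when a
# condition on the event is met, the way serverless triggers do.
# Entirely illustrative; the fee amount is made up.

def on_balance_change(account):
    """Trigger: runs whenever the balance changes; acts only on overdraft."""
    actions = []
    if account["balance"] < 0:
        actions.append("freeze_checks")       # stop paying further checks
        actions.append("notify_overdraft")    # send the overdraft message
        account["balance"] -= 35              # incur a (hypothetical) fee
        actions.append("incur_late_fee")
    return actions


account = {"balance": 20}
account["balance"] -= 21  # a check puts the account a dollar negative
print(on_balance_change(account))  # the overdraft condition fires all three
```

As with serverless, the actions exist only as responses to conditions; nothing runs while the account sits in good standing.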
However, while serverless offers tremendous cost and scalability benefits, it does have limitations for certain super low-latency use cases. Also as a new technology, it comes with certain challenges regarding implementation. One of these challenges is a major skills gap in knowing how to build cloud-native applications.
A different mindset
“Serverless requires a fundamentally different mindset in terms of some of the best practices that they need to know for building more decomposable or composable applications, and that’s challenging for a lot of traditional organizations because they just don’t have enough skills and enough developers that are really trained in these new cloud-native ways,” Gartner’s Chandrasekaran said.
Some organizations find themselves overwhelmed in trying to coordinate their move to serverless, prompting them to seek out serverless frameworks and third-party tooling.
“I think everybody wants to use serverless cloud infrastructure and they want to use all of these next-generation cloud services that are available. Unfortunately, a lot of teams have trouble putting all of those pieces together to make a whole application,” said Serverless’s Collins.
Additionally, not all applications are ideal for serverless environments. The application logic must be capable of being packaged as discrete functions, and the applications themselves need to be event-driven. Organizations need to identify the workloads that genuinely fit this application pattern.
“At the end of the day, in a serverless environment you have a low degree of control over the operational environment. From a developer standpoint, that could be an attractive attribute because there is less to manage and less to worry about,” Chandrasekaran said. “However, if you have maybe a security administrator you may think, ‘I don’t have all of this ability to work underneath it.’”
Chandrasekaran added that it is tough to predict the performance in a serverless environment, which makes it less appealing for super low-latency transactional workloads.
No great database story for serverless
Despite its effectiveness in a variety of use cases, serverless has seen little adoption alongside traditional databases, because those databases limit how far serverless compute can scale.
“A lot of traditional databases require you to establish a connection. Unfortunately, when you have stateless compute like AWS Lambda that can scale massively in parallel, every single one of those functions is going to try to establish a database connection to MySQL or PostgreSQL, and they’re just going to crash that database,” Serverless’ Collins said.
To solve this, database technologies need to expose HTTP APIs so that serverless compute can scale massively and still interact with the data safely. It’s a data gateway concept: middleware sits between the serverless compute and the database and provides that functionality, Collins explained.
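The data-gateway idea Collins describes can be sketched as middleware that holds a small, bounded pool of real database connections while any number of stateless function invocations talk to it. Everything below is illustrative; no real database driver or HTTP server is involved.

```python
# Sketch of the data-gateway concept: stateless function invocations call
# the gateway, while the gateway holds only a bounded pool of database
# connections, so the database never sees one connection per invocation.
# Illustrative only; no real database driver is used.

class ConnectionPool:
    """Stands in for a bounded set of real database connections."""
    def __init__(self, size):
        self.size = size
        self.in_use = 0

    def acquire(self):
        if self.in_use >= self.size:
            # A real gateway would queue or retry rather than crash the DB.
            raise RuntimeError("pool exhausted")
        self.in_use += 1

    def release(self):
        self.in_use -= 1


class DataGateway:
    """The middleware layer: callers use query(); connections are shared."""
    def __init__(self, pool):
        self.pool = pool

    def query(self, sql):
        self.pool.acquire()
        try:
            return f"rows for: {sql}"  # stand-in for running the real query
        finally:
            self.pool.release()


# A thousand function invocations, but at most 5 database connections exist.
gateway = DataGateway(ConnectionPool(size=5))
results = [gateway.query(f"SELECT * FROM orders WHERE id = {i}")
           for i in range(1000)]
```

The database's connection limit becomes the gateway's problem, not the compute layer's, which is what lets the functions scale freely.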
New trends around serverless
As serverless continues to grow, new trends around emerging capabilities have surfaced.
Gartner’s Chandrasekaran said the first notable trend is that serverless providers will continue to improve support for more languages, along with better security monitoring, application debugging, and local testing.
Startup companies will handle many of these additional capabilities in the future. Also, there is a focus on open-source projects that allow organizations to deploy serverless not just in the cloud, but also at the edge. Startups are also working to tackle one of serverless’s biggest pain points up until now: stateful workloads.
“We have started to see some startups that are briefing us on more stateful workloads that could potentially run on serverless function environments. However, it’s very early to say because most of these startups are in beta or other very early stages,” Chandrasekaran said.
Another trend is that some vendors are trying to propagate more open standards. Google is one of the key players in this space with Knative, a platform for running serverless workloads on top of Kubernetes.
“Things that are built on Knative are typically pretty portable from one Kubernetes environment to another, making it attractive,” IDC’s Gillen said, adding that the more universal cross-cloud serverless solutions that can be made, the better.
Adding serverless capabilities
Overall, serverless is going to move increasingly into the mainstream as more types of services add serverless capabilities, according to Serverless’ Collins.
“We think the future cloud is going to be focused on outcomes where you’re going to have managed services, serverless services, API-as-a-service that solve business problems and give you immediate outcomes. These are solutions where you don’t even need to think about the infrastructure at all,” Collins said.
Also, if the economy is headed towards a deep recession, organizations are going to be looking at doing more with less and that’s going to involve a lot of engineering teams potentially outsourcing a lot of what they do in-house over to the cloud providers. “I think it is a remarkably recession-proof architecture,” Collins said.
“All in all, what we’re looking at here is a second wave of cloud as it evolves to be an abstraction over infrastructure,” Collins said. “Software is eating the world, cloud is eating software, and serverless is eating cloud.”
One insurance company’s take on serverless
Branch Insurance, an Ohio-headquartered insurance startup that bundles home and auto insurance online, began developing with serverless in 2018 and said it streamlined the company’s development process.
“The biggest benefit of serverless is that it gives us true infrastructure as code in a way that all of our developers can understand and maintain pretty easily, which is no small feat,” said Joseph Emison, the cofounder and CTO of Branch Insurance. “The standard problems that plague the majority of development teams in the world, we don’t have at all. So things like ‘it works in this environment and not that environment,’ or ‘in order to deploy this thing, I have to do this manual thing every time, and if I forget, it breaks or it doesn’t work.’ All of our developers from junior to senior have no problem implementing new infrastructure.”
Emison said insurance has a sort of simplicity to it that makes it very easy to use a lot of automated and managed services and tooling, such as AWS AppSync or Amplify.
Another reason insurance companies are looking into serverless is that the industry is burdened by very old systems that it has not been able to update successfully.
“I think that the older and creakier your systems are, the more benefits you’re getting in rewriting them,” Emison said.
However, AWS’s standard tooling just wasn’t enough to coordinate developers, prompting the company to adopt a serverless platform and architecture visualization tool called Stackery that offered Branch Insurance the cross-account viewing and additional functionalities that it was seeking.
“Amazon tooling didn’t do it for us because it didn’t have a good cross-account view. It turns out one of the best ways to run your organization when you’re using serverless, or using serverless services, is to have every developer have his or her own account and every environment have its own Amazon account, so there’s this massive benefit in running them all identically. Amazon tends to only think within one account at a time,” Emison added.