The allure of serverless is attracting a number of businesses and development teams.
A recent report from the Cloud Foundry Foundation revealed 51 percent of respondents are either using or evaluating serverless, with 18 percent of them doing it at scale.
For businesses, it is about an on-demand price model as opposed to upfront costs. For development teams, it is about removing operational complexity and focusing on what really matters, which is the code.
“Managing infrastructure is an incidental complexity. You only do it because you have to. It doesn’t make much business sense to manage infrastructure. By being able to move to a more pay-as-you-go style, you are removing responsibility, worry and operational complexity,” said Viktor Klang, deputy chief technology officer at Lightbend, a cloud-native microservices platform company.
But serverless isn’t just a ‘set it and forget it’ solution. It is a different model and a different way of thinking, and businesses have to look beyond all the flashy benefits to evaluate whether moving to serverless makes sense for them, according to Chris Parlette, director of cloud solutions for the cloud services company ParkMyCloud.
“It has its use cases, but it is not going to solve everything, and I don’t think it is a perfect use case for everything you are trying to do. It is important for enterprises and large organizations to evaluate it as an option, but it is just another option. It is another tool in the toolbox,” he said.
Taking a deeper look into serverless
Serverless has become very attractive because it is the next abstraction of technology, according to Tolga Tarhan, chief technology officer for the cloud-native managed services provider Onica.
In the past, teams would put servers in data centers. Then a couple of years later they started using virtual machines for everything. A couple more years later, they started moving to the cloud as the next step away from buying and maintaining hardware. This is the next step in that journey, Tarhan explained.
With serverless, development teams no longer have to worry about the operating system or virtual machine at all. They just have to make sure their code is running at the right time. “They don’t really care about all the noise around an operating system: patching it, securing it, managing it, configuring it…all that falls away. Instead they just write the code they need for their use case and write it up to the right events,” said Tarhan. “It reduces complexity and costs. You are not paying for servers that are idle. You are only paying for the milliseconds you are using compute resources, and it frees up more of a developer’s and development team’s time for feature development.”
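Concretely, an event-driven function in this style is little more than a handler the platform invokes per event. The sketch below is loosely modeled on the AWS Lambda handler convention; the event shape and names are illustrative assumptions, not any provider’s actual API:

```python
import json

def handler(event, context=None):
    """Hypothetical event-driven function: there is no server, OS, or
    process lifecycle to manage. The platform invokes this per event
    and bills only for the milliseconds it runs."""
    # The event shape here is an assumption for illustration.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```

Everything outside the handler (patching, scaling, routing events to it) is the platform’s responsibility, which is the complexity reduction Tarhan describes.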
However, the cost benefits can also be misleading. According to Parlette, there can be hidden costs if you don’t understand where or how to use serverless, and those can become very expensive. Serverless also makes you dependent on your cloud provider, since the provider controls resource provisioning and is responsible for the back-end infrastructure.
The good news is that as more businesses start to use serverless, there is more data around what exactly it is going to cost and what the ongoing maintenance for a serverless environment actually looks like. For instance, “if there are huge spikes in traffic, you can now understand what it means. If there is flat traffic, you can evaluate it more. It is an apples-to-apples comparison with a more traditional server model,” said Parlette.
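That apples-to-apples comparison is, at bottom, simple arithmetic. The figures below are made-up placeholders, not real provider pricing, but they show how traffic shape decides the outcome:

```python
# Hypothetical prices for illustration only; real provider pricing varies.
FLAT_SERVER_PER_MONTH = 70.00   # always-on VM, fixed monthly cost
PER_MILLION_REQUESTS = 0.20     # serverless per-request fee
PER_GB_SECOND = 0.0000166667    # serverless per GB-second of compute

def monthly_serverless_cost(requests, avg_ms, memory_gb=0.128):
    """Estimate a month of serverless spend from traffic volume."""
    request_fee = requests / 1_000_000 * PER_MILLION_REQUESTS
    compute_fee = requests * (avg_ms / 1000) * memory_gb * PER_GB_SECOND
    return request_fee + compute_fee

# Spiky or infrequent traffic: serverless is far cheaper than a flat server.
low_traffic = monthly_serverless_cost(requests=100_000, avg_ms=120)
# Sustained heavy traffic: the flat server pulls ahead.
high_traffic = monthly_serverless_cost(requests=300_000_000, avg_ms=120)
```

With flat, heavy traffic the per-invocation fees overtake the fixed server cost, which is why Parlette stresses evaluating the traffic pattern first.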
Being smart about how you implement the architecture
According to Parlette, the best time to use serverless is when you are starting new applications from scratch; when you are using a microservices-based approach with small services and chunks of code that do not depend on one another; or when you are running infrequently used scripts that don’t need to sit on a constantly running server.
Parlette explained he wouldn’t suggest taking existing applications or monolithic applications and moving them to a serverless approach. “I wouldn’t rewrite a whole application just to use serverless without at least doing a lot of evaluation of the ROI on that. The reason I say that is because a lot of times you are sitting there rewriting your application, you are not making any forward progress. You are not implementing new features. You are not pushing the ball forward,” he said.
If you are spending all this time rewriting apps just to run on a different back end, the customer is not getting any beneficial use out of it. “Just rewriting something for the sake of using serverless doesn’t make sense,” Parlette added. Because of this, he does not see traditional servers being completely replaced by serverless architecture any time soon.
However, Onica’s Tarhan explained that serverless shouldn’t be looked at as a migration path. Instead, it is an option for green-field, net-new application development, or for a major refactor. “You might be able to reuse some of the code, but you are going to do some serious surgery to your application,” he said.
“The question is more about value,” Lightbend’s Klang added. “What is the investment of making the transition for a specific piece of technology and what is the cost of just integrating with existing solutions? You always need to make that decision on a case-by-case basis.”
According to John Graham-Cumming, chief technology officer for the internet security and performance company Cloudflare, application architecture is going to move from a two-tier client-server approach to a three-tier approach. By adding a third tier, or middle ground, businesses will be able to get closer to the user, lower latency and be much more interactive, he explained. This third tier lives on the edge in a serverless platform. Since serverless enables application code to run anywhere, it can run in edge locations, and edge computing lets development teams bring compute as close to the source of data as possible. That is important for reducing latency and bandwidth use, which can sometimes be a problem in serverless. “The edge is very fast and has great connection to the Internet,” so it eliminates any noticeable delays, Graham-Cumming explained.
“People think serverless is only for some tiny point of their application or a little bit of configuration. They are not yet fully appreciating that what is happening here is a fundamental change in how applications get built, and what we will see in the future is people thinking about the three-tier application,” he said.
One of the areas where serverless does not work well is stateful applications. Serverless applications, or functions as a service, are intended to be stateless, meaning they hold no memory between invocations, according to Lightbend’s Klang.
As an example, if you are running an image resize function that takes an image in and emits a resized version, it doesn’t matter if a billion images are being resized in parallel because they do not depend on each other. But if your application is stateful and needs memory in order to know what to do and how to do it, having functions activated at the same time against the same memory doesn’t work in a serverless world, Klang explained. “You get contention on storage. Can functions be processed in parallel or not, and if they do, will the result still be true? Another function could have updated that information before the other one could continue or see the change.”
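Klang’s resize example can be sketched as a pure, stateless function: because the output depends only on the input, any number of activations can run in parallel without coordination. (The subsampling “resize” below is a stand-in for real interpolation.)

```python
from concurrent.futures import ThreadPoolExecutor

def resize(image, factor):
    """Stateless: the output depends only on the input, so parallel
    activations never interfere. Real resizing would interpolate;
    this simply subsamples every factor-th row and column."""
    return [row[::factor] for row in image[::factor]]

# Run several "activations" in parallel: no shared state, no contention.
images = [[[x + y for x in range(8)] for y in range(8)] for _ in range(4)]
with ThreadPoolExecutor() as pool:
    results = list(pool.map(lambda img: resize(img, 2), images))
```

A stateful handler, by contrast, would need the shared memory Klang warns about, and the parallelism would stop being free.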
Another way to look at it is thinking about a shopping cart. If you are an online store, you want your customers to be able to have a consistent view of their shopping cart no matter the device they are shopping on. If they make a change on their phone, you want them to be able to see that change when they log into their computer.
“In order to facilitate that, you need state. You need to manage what is inside that shopping cart and that is something which has clear consistency requirements, but also you have more possible potential points of entry to your state,” said Klang.
Businesses have traditionally tried to get around the issue by pushing the problem into the database, but that just creates more issues. “Whenever you have multiple function activations against the same piece of information, you wrap each in a database transaction to deal with resolving conflict, but the problem with that is the contention part, because now the database has to be responsive to do the transaction coordination,” said Klang. “It also means you have to figure out how to scale out your database, know which data is where, and coordinate access to that data if you split it up.”
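The contention Klang describes can be seen in a toy sketch of the database-transaction workaround, here reduced to an optimistic compare-and-set: every concurrent activation must re-read and retry whenever another one wins the race. All names here are hypothetical:

```python
import threading

class VersionedStore:
    """Toy versioned store standing in for a database that performs
    the transaction coordination."""
    def __init__(self):
        self._lock = threading.Lock()
        self._data = {}  # key -> (version, value)

    def read(self, key):
        with self._lock:
            return self._data.get(key, (0, []))

    def compare_and_set(self, key, expected_version, value):
        with self._lock:
            version, _ = self._data.get(key, (0, []))
            if version != expected_version:
                return False  # another activation updated it first
            self._data[key] = (version + 1, value)
            return True

def add_to_cart(store, user, item):
    # The retry loop is the contention: the more parallel activations,
    # the more re-reads and retries the store must absorb.
    while True:
        version, cart = store.read(user)
        if store.compare_and_set(user, version, cart + [item]):
            return

store = VersionedStore()
threads = [threading.Thread(target=add_to_cart, args=(store, "u1", i))
           for i in range(10)]
for t in threads: t.start()
for t in threads: t.join()
```

All ten writes land, but only by retrying through the store, which is the coordination cost Klang says grows as you scale out the functions.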
Lightbend wants to address this with the release of its open-source project, CloudState.
“Bringing stateful microservices, fast data/streaming, and the power of reactive technologies to the cloud-native ecosystem breaks down the final impediment standing in the way of a serverless platform for general-purpose application development, true elastic scalability, and global deployment in the Kubernetes ecosystem,” said Jonas Bonér, chief technology officer at Lightbend.
CloudState tries to address the state serverless problem by having the state pass into the function instead of the function accessing the state. “What CloudState does is it makes sure that only one activation for that same piece of information is being handled at once,” said Klang. “If you let the database management system deal with the coordination, then once you start scaling out your functions you are actually increasing contention in the database. People think by parallelizing the load they are going to be able to do more, but they are actually only able to do less because it is going to wait to get IO. Coordination takes longer the more things are actually contending for something. By having this data and the data access being handled separately, we can figure out if the database is the bottleneck or if it is the function processing that is the bottleneck, and then making a scaling decision becomes so much easier.”
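The pattern Klang describes can be sketched in miniature. To be clear, this is not the actual CloudState API; it is a hypothetical runtime illustrating the idea that state is passed into the function and only one activation per entity is handled at a time:

```python
class EntityRuntime:
    """Hypothetical sketch (not the CloudState API): the runtime loads
    an entity's state, passes it into the function, and stores the
    returned state, handling one command per entity at a time so the
    function itself never coordinates data access."""
    def __init__(self, handler, initial_state):
        self._handler = handler
        self._initial = initial_state
        self._states = {}

    def invoke(self, entity_id, command):
        state = self._states.get(entity_id, self._initial)
        new_state, reply = self._handler(state, command)  # state passed in
        self._states[entity_id] = new_state
        return reply

def cart_handler(state, command):
    """A pure function of (state, command) -> (new_state, reply)."""
    if command["type"] == "add":
        state = state + [command["item"]]
    return state, state

runtime = EntityRuntime(cart_handler, initial_state=[])
runtime.invoke("u1", {"type": "add", "item": "book"})
runtime.invoke("u1", {"type": "add", "item": "pen"})
```

Because the runtime serializes activations per entity, the scaling question Klang raises (is the bottleneck the data access or the function processing?) can be answered by observing each side independently.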
Currently CloudState targets common stateful use cases such as training and serving machine learning models, low-latency real-time prediction and recommendation serving, user sessions, distributed transactions, shared collaborative workspaces, and workflow management.
“Stateless functions is a great tool that has its place in the cloud computing toolkit, but for Serverless to reach the grand vision that the industry is demanding of a Serverless world while allowing us to build modern data-centric real-time applications, we can’t continue to ignore the hardest problem in distributed systems: managing state—your data,” according to the CloudState website.