The concept of serverless architecture is not new, but it has lately emerged as a notable trend in cloud computing. The reason is simple: it frees developers to spend their time writing code instead of setting up servers. The servers are provisioned and managed by the cloud provider, so the developer need only manage the function code and the environment in which it executes. As the benefits of serverless architecture become widely discussed, businesses are endeavoring to leverage it by uploading their function code and configuring it for maximum output. This whitepaper elaborates on the concept of serverless and its advantages, drawing on our experience of working with serverless technology.
Serverless is a framework that makes deploying applications hassle-free. Some people misunderstand serverless architecture to mean the absence of servers, but that is not so. The servers do exist; they are simply not managed by us. The cloud provider provisions and maintains them, and the code runs in stateless, ephemeral containers that are fully managed by the vendor. An April 2018 report by Gartner states that by 2020 more than 20% of global enterprises will have deployed serverless computing. According to a report by RightScale, serverless has also emerged as the fastest-growing cloud segment, outpacing containers-as-a-service. These figures suggest that the attention around serverless rests on its efficiency and its capacity to transform the future.
Given below are the statistics released by Google Trends on the interest in serverless over time.
Serverless has been an added benefit for those who have moved their business from traditional practices to the cloud. A recent report states that the market could grow from $4.25bn in 2018 to $14.93bn by 2023. The fact that cloud providers are investing heavily in it shows that serverless is bound to create a remarkable disruption in the coming years.
The four main advantages of serverless are as follows:
• Increased developer productivity: The time that would otherwise go into setting up servers can be spent optimizing other areas.
• Adjustable capacity and auto-scaling: The provision to scale individual functions instead of scaling the entire app.
• Cost-effective: With pay-as-you-go pricing, one pays only for the resources actually consumed, not for idle time.
• Improved user experience: With no servers to manage, developers focus on enhancing the features rather than the infrastructure and this ultimately results in better user experience.
With its numerous benefits, serverless framework finds its application in various areas such as building web applications, data processing and so on. Let’s take a look at some of its prominent use cases.
• Web Applications: With serverless, it is possible to build web applications that can handle peak loads by scaling accordingly, even when the load is unexpected. And when there is no traffic, it costs almost nothing. Serverless also lets applications be polyglot, so teams are not locked into the same language as their legacy software.
• Backends: Serverless enables the creation of highly secure, available and perfectly scalable backends. In the case of media and log processing, it becomes simpler to process compute-heavy workloads without the complexity of building multithreaded systems or manually scaling compute fleets.
• Data Processing: Clickstream and other near real-time streaming data processes are taken care of by serverless. Big Data, high-speed video transcoding, stock trade analysis, and compute-intensive Monte Carlo simulations for loan applications are some of the other data processing use cases of serverless.
• Chatbots: Serverless has made it easier to build and scale chatbots and to power the chatbot logic. The ability of serverless functions to scale automatically when there is an influx of requests makes them an excellent choice for hosting a chatbot. Serverless also allows one to move between other cloud services offered by the same provider. With chatbots now in wide use, one can choose among different serverless technologies to simplify building and maintaining them.
• IT Automation: Serverless functions can be attached to alarms and monitors to provide customization when required. Cron jobs are easily handled with the serverless framework; on AWS, for instance, Lambda is combined with CloudWatch Events to execute a cron job. With modern applications being replicated across environments, serverless reduces the complexity involved in writing cron jobs.
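To make the cron use case concrete, here is a minimal sketch of a Lambda handler wired to a CloudWatch Events (EventBridge) schedule. The schedule expression itself (e.g. rate(1 hour) or cron(0 12 * * ? *)) lives in the AWS configuration, not in code; the task performed below is hypothetical, and the event shape is a trimmed version of what the service delivers.

```python
def cron_handler(event, context):
    """Handle the event CloudWatch Events delivers on each schedule tick."""
    if event.get("source") != "aws.events":
        return "ignored: not a scheduled event"
    # The periodic task would go here; we just report when we were fired.
    return f"cron task ran at {event.get('time')}"

# Trimmed shape of a scheduled event, for local experimentation:
sample_event = {
    "source": "aws.events",
    "detail-type": "Scheduled Event",
    "time": "2020-01-01T12:00:00Z",
}
print(cron_handler(sample_event, None))  # → cron task ran at 2020-01-01T12:00:00Z
```

Because the schedule is declared outside the code, changing the cadence is a configuration change rather than a redeploy.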
Some of the serverless platforms available in the market come from AWS, Microsoft Azure, and Google Cloud. The AWS offering is called AWS Lambda; those of Azure and Google Cloud are called Azure Functions and Google Cloud Functions respectively. Here is a comparison of these three platforms and their offerings.
Though there are other serverless frameworks in the market, AWS Lambda has managed to retain its position at the top as the most widely adopted serverless service. This whitepaper therefore focuses on AWS and its serverless services, based on our experience of working with them.
Amazon is one of the most sought-after cloud service providers and offers several features on top of its core cloud. One such feature is its serverless platform: as the AWS Lambda product page states, it "lets you run code without provisioning or managing servers". What sets it apart from other application frameworks is that it manages both the infrastructure and the code, and it supports multiple programming languages such as Java, Python, and the like. Introduced in 2014, AWS Lambda executes code in response to event triggers and can thus be called an event-driven computing platform. One need not pay when the system is idle; one pays only for the compute time consumed. Lambda supports use cases such as reacting to updates to DynamoDB tables, image or object uploads to Amazon S3, website clicks, or sensor readings from an IoT-connected device.
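Concretely, a Lambda function in Python is just a handler that receives the triggering event and returns a result. The sketch below is runnable locally without any AWS setup; the `name` field in the event is a hypothetical payload, not a fixed Lambda convention.

```python
import json

def lambda_handler(event, context):
    """Entry point AWS Lambda invokes once per triggering event.

    `event` carries the trigger's payload (an S3 notification, an API
    request, and so on); `context` exposes runtime metadata such as the
    remaining execution time.
    """
    name = event.get("name", "world")
    return {"statusCode": 200, "body": json.dumps({"message": f"Hello, {name}!"})}

# Locally we call the handler directly with a sample event;
# in production, the Lambda service performs this call for us.
result = lambda_handler({"name": "serverless"}, None)
print(result["body"])  # → {"message": "Hello, serverless!"}
```

The same handler signature serves every trigger type; only the event payload differs.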
The workflow of AWS Lambda is as follows:
Alongside Lambda, Amazon also provides numerous fully managed services that can be used to build and run serverless applications. From storage to developer tooling, AWS takes care of the underlying functions, allowing one to focus more on product innovation. These services and their workflows are as follows:
Amazon Simple Storage Service, otherwise known as Amazon S3, helps in storing any amount of data that can be retrieved anytime from anywhere on the web. The storage system is very durable and highly secure, and it makes web-scale computing easier for developers. The use cases of Amazon S3 include storing data for websites, mobile applications, backup and restore, enterprise applications, Big Data analytics, archives, and IoT devices. A related service, Amazon Elastic File System (EFS), provides simple, scalable, elastic file storage; as files are added and removed, it grows and shrinks on demand. The workflow of Amazon S3 is as follows:
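A common serverless pattern is to trigger a Lambda function whenever an object lands in an S3 bucket. The sketch below parses a trimmed version of the S3 ObjectCreated notification event; the bucket and key names are hypothetical, and the real event carries many more fields than shown here.

```python
def s3_upload_handler(event, context):
    """Collect a (bucket, key) pair for every object in the notification."""
    uploaded = []
    for record in event.get("Records", []):
        s3 = record["s3"]
        uploaded.append((s3["bucket"]["name"], s3["object"]["key"]))
    return uploaded

# Trimmed shape of an S3 notification, for local experimentation:
sample_event = {
    "Records": [
        {"s3": {"bucket": {"name": "photos"}, "object": {"key": "cat.jpg"}}},
        {"s3": {"bucket": {"name": "photos"}, "object": {"key": "dog.jpg"}}},
    ]
}
print(s3_upload_handler(sample_event, None))
# → [('photos', 'cat.jpg'), ('photos', 'dog.jpg')]
```

In a real deployment the handler would go on to process each object, for example generating a thumbnail or indexing its metadata.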
Amazon DynamoDB is a fast and flexible key-value and document NoSQL database service that delivers single-digit-millisecond performance at any scale. As the Amazon webpage states, DynamoDB can handle more than 10 trillion requests per day and supports peaks of more than 20 million requests per second. DynamoDB helps in building powerful web applications, interactive mobile apps, and flexible, reusable microservices. Another service is Amazon Aurora Serverless, part of the Amazon Relational Database Service: an auto-scaling configuration for Amazon Aurora in which the database automatically starts up, shuts down, and scales capacity up or down based on your application's needs.
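As a sketch of how application code writes to DynamoDB: in AWS the `table` argument below would be a boto3 Table resource, e.g. `boto3.resource("dynamodb").Table("Orders")`; the table and field names are hypothetical. Passing the table in as a parameter lets the sketch run locally with a stand-in object.

```python
def save_order(table, order_id, customer, total_cents):
    """Persist one order item; `table` exposes the put_item(Item=...) call shape."""
    item = {"order_id": order_id, "customer": customer, "total_cents": total_cents}
    table.put_item(Item=item)  # same call shape as the boto3 Table resource
    return item

class FakeTable:
    """Local stand-in mimicking the Table resource's put_item keyword call."""
    def __init__(self):
        self.items = []
    def put_item(self, Item):
        self.items.append(Item)

table = FakeTable()
save_order(table, "o-1001", "alice", 4200)
print(table.items[0]["customer"])  # → alice
```

Injecting the table this way also keeps the function unit-testable without any network access.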
Here is a sample workflow of how DynamoDB helps in building a web application.
For developers to create, publish, maintain, monitor, and secure APIs at any scale, Amazon offers the Amazon API Gateway. One can easily create REST and WebSocket APIs that act as the "front door" for applications to access data, business logic, or functionality from backend services. These backend services can include workloads running on Amazon Elastic Compute Cloud, web applications, or real-time communication applications. API Gateway helps in testing and releasing new versions by allowing multiple versions of the same API to run simultaneously. It handles thousands of concurrent API calls along with traffic management, and it also reduces cost: one pays only for the API calls received and the amount of data transferred out.
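Behind API Gateway, the usual arrangement is a Lambda proxy integration: the HTTP request arrives as the `event` dict and the gateway expects a dict with `statusCode`, `headers`, and `body` back. The query-parameter name below is hypothetical; the response shape follows the proxy-integration convention.

```python
import json

def api_handler(event, context):
    """Lambda proxy integration: turn an HTTP request event into a response dict."""
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"greeting": f"Hello, {name}"}),
    }

# Simulate GET /hello?name=dev locally:
response = api_handler({"queryStringParameters": {"name": "dev"}}, None)
print(response["body"])  # → {"greeting": "Hello, dev"}
```

API Gateway serializes this dict straight into the HTTP response the client receives.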
This is how the API Gateway works:
Amazon offers messaging and integration services such as Amazon SNS, Amazon SQS, AWS AppSync, and Amazon EventBridge. The Amazon Simple Notification Service, otherwise known as Amazon SNS, lets publisher systems send messages to a large number of subscriber endpoints for parallel processing; it also helps send notifications to end users through email, mobile push, and SMS, and it is highly durable, secure, available, and easy to use. Amazon SQS (Simple Queue Service) helps send, store, and receive messages between software components at any volume without losing them; it enables one to decouple and scale microservices, distributed systems, and serverless applications. To simplify application development, AWS AppSync helps in creating a flexible API to securely access, manipulate, and combine data from one or more data sources. Finally, to connect applications using data from one's own applications, SaaS applications, and AWS services, Amazon provides a serverless event bus called Amazon EventBridge, which makes building event-driven applications easier by taking care of event ingestion and delivery, security, authorization, and error handling.
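For the queue-consumer side of this picture, here is a sketch of a Lambda function processing a batch of SQS messages. SQS only guarantees that each record carries an opaque string body; the `{"task": ...}` JSON format inside the body is a hypothetical application convention, and the event shape is trimmed.

```python
import json

def sqs_handler(event, context):
    """Process a batch of SQS messages delivered to Lambda in one event."""
    completed = []
    for record in event.get("Records", []):
        payload = json.loads(record["body"])  # body is a JSON string by our convention
        completed.append(payload["task"])
    return completed

# Trimmed shape of an SQS batch event, for local experimentation:
sample_event = {
    "Records": [
        {"body": json.dumps({"task": "resize-image"})},
        {"body": json.dumps({"task": "send-email"})},
    ]
}
print(sqs_handler(sample_event, None))  # → ['resize-image', 'send-email']
```

The queue between producer and consumer is what gives the decoupling described above: either side can scale or fail independently.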
Here is a sample diagram explaining how Amazon SNS works
AWS Step Functions aids application development by translating your workflow into an easy-to-understand state machine diagram. These workflows are made up of a series of steps, with the output of one step acting as the input to the next. By coordinating multiple AWS services into serverless workflows, Step Functions makes building and updating apps quicker. Each step of an execution can be monitored, making it easy to identify and rectify problems.
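State machines for Step Functions are declared in the Amazon States Language, a JSON document. The sketch below builds one as a Python dict with two sequential Lambda tasks, where the output of the first feeds the second; the function ARNs, account number, and state names are all hypothetical.

```python
import json

state_machine = {
    "Comment": "Two sequential Lambda tasks; output of one feeds the next",
    "StartAt": "ValidateOrder",
    "States": {
        "ValidateOrder": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:validate",
            "Next": "ChargeCard",
        },
        "ChargeCard": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:charge",
            "End": True,
        },
    },
}

# The serialized JSON is what gets handed to Step Functions when the
# state machine is created (via the console, CLI, or SDK).
definition = json.dumps(state_machine, indent=2)
```

Because the flow lives in this declaration rather than in application code, retries, branching, and error handling can be adjusted without redeploying the Lambda functions themselves.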
The workflow of AWS Step Functions is as follows:
Amazon Athena and Amazon Kinesis are services that simplify data analysis and streaming. Athena, an interactive query service, makes it easy for anyone with SQL skills to quickly analyze large-scale datasets stored in S3. With Athena there is no infrastructure to manage, as it is serverless, and one pays only for the queries that are run. To collect, process, and analyze real-time streaming data, Amazon provides Kinesis, which processes streaming data cost-effectively and lets one choose the tools best suited to the application. It processes and analyzes data as it arrives rather than waiting for all data to be collected before processing begins. Its use cases include securely streaming video from camera-equipped devices in homes, offices, factories, and public places to AWS.
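On the consuming end of a Kinesis stream, each record's data payload arrives base64-encoded. The sketch below decodes a batch inside a Lambda handler; the payload fields (symbol, price) are hypothetical, and the helper builds a sample record the way Kinesis would deliver it.

```python
import base64
import json

def kinesis_handler(event, context):
    """Decode a batch of Kinesis records: data arrives base64-encoded."""
    trades = []
    for record in event.get("Records", []):
        raw = base64.b64decode(record["kinesis"]["data"])
        trades.append(json.loads(raw))
    return trades

def encode(payload):
    """Build a sample record shaped the way Kinesis delivers it."""
    data = base64.b64encode(json.dumps(payload).encode()).decode()
    return {"kinesis": {"data": data}}

sample_event = {"Records": [encode({"symbol": "AMZN", "price": 101.5})]}
print(kinesis_handler(sample_event, None))  # → [{'symbol': 'AMZN', 'price': 101.5}]
```

This decode-as-you-go loop is what lets processing start on each record as it arrives instead of after the full dataset is collected.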
To speed up the development of serverless applications, Amazon provides developers with frameworks, deployment tools, SDKs, IDE plugins, and monitoring solutions. These aid in rapidly building, testing, deploying, and monitoring serverless applications.
As an example of Lambda's pay-per-use pricing: if you allocated 512 MB of memory to your function, executed it 3 million times in one month, and it ran for 1 second each time, your charges would be calculated as follows:
Monthly compute charges
The monthly compute price is $0.00001667 per GB-second and the free tier provides 400,000 GB-seconds.
Total compute (seconds) = 3M * 1s = 3,000,000 seconds
Total compute (GB-seconds) = 3,000,000 * 512MB/1024 = 1,500,000 GB-seconds
Total compute – free tier compute = monthly billable compute:
1,500,000 GB-seconds – 400,000 free tier GB-seconds = 1,100,000 GB-seconds
Monthly compute charges = 1,100,000 * $0.00001667 = $18.34
Monthly request charges
The monthly request price is $0.20 per 1 million requests and the free tier provides 1M requests per month.
Total requests – free tier requests = monthly billable requests:
3M requests – 1M free tier requests = 2M monthly billable requests
Monthly request charges = 2M * $0.20 per 1M = $0.40
Total monthly charges
Total charges = compute charges + request charges = $18.34 + $0.40 = $18.74 per month
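The worked example above can be reproduced with a short function. The rates and free-tier figures are the ones quoted in the example; the function name and parameter names are our own.

```python
def lambda_monthly_cost(memory_mb, invocations, duration_s,
                        price_per_gb_s=0.00001667, free_gb_s=400_000,
                        price_per_m_req=0.20, free_req=1_000_000):
    """Estimate a monthly Lambda bill from the pricing rules quoted above."""
    # Compute charges: GB-seconds consumed, minus the free tier.
    gb_seconds = invocations * duration_s * (memory_mb / 1024)
    compute = max(gb_seconds - free_gb_s, 0) * price_per_gb_s
    # Request charges: billable requests beyond the free tier, per million.
    requests = max(invocations - free_req, 0) / 1_000_000 * price_per_m_req
    return round(compute, 2) + round(requests, 2)

# 512 MB, 3M invocations, 1 s each — the worked example above:
print(f"${lambda_monthly_cost(512, 3_000_000, 1):.2f}")  # → $18.74
```

Varying the memory or invocation count in the call shows how the bill scales with actual usage rather than with provisioned capacity.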
As mentioned earlier, employing serverless architecture has its own set of advantages, such as being cost-efficient and scalable. Some of the other benefits offered by serverless are as follows:
• Server Management is Unnecessary: The fact that, with serverless, servers are set up and managed by the cloud provider rather than by us is a major advantage. It reduces cost, time, and effort, and it gives developers the freedom to focus on their applications without being held back by server management. It also saves the time that would otherwise be spent chasing infrastructure pitfalls.
• Shorter Time-to-market: Continuing from the previous point, because the serverless framework saves developers time, products hit the market faster than expected. Rather than spending weeks or months developing an application, serverless enables developers to create one in days, or sometimes even hours.
• Improved Scalability: Scalability is one of the prominent advantages that has made organizations switch to the cloud. A successful application needs to be ready to handle a huge influx of data, and serverless ensures this with its flexible scaling. As the app grows, the system accommodates the growth; if it does not, unneeded infrastructure is never provisioned. This ensures both that one stays prepared for a sudden demand for infrastructure and that no infrastructure goes to waste.
• Increased Efficiency: It is evident from the points above that serverless enables apps to run more efficiently than traditional methods allow. With serverless, developers are freed from worrying about scaling, infrastructure, DevOps, capacity planning, and the like. The pay-as-you-go model also ensures that one pays only for the time the system was in use, which in turn reduces waste.
• Reduced Latency: For an app to perform equally well for all its users, it must have access points on a global scale. Serverless ensures that there is a node close to the user, which keeps scaling uncomplicated. In a traditional framework, each request has to travel to wherever the server is hosted, creating latencies that lead to poor user experiences. The serverless framework tackles this, ensuring lower latency and a better experience.
Since 2018, serverless has been one of the most sought-after cloud services owing to its scalability and reliability. The fact that the servers are handled by the cloud provider is both a boon and a bane: while this whitepaper discusses the pros, the cons are that the lack of control over the server makes it difficult to expand or modify resources manually, and the use of third-party APIs can result in vendor lock-in or other security concerns. That said, these limitations do not halt its upward trajectory, and they may well diminish as cloud providers introduce improvements with each passing year. Some of the limitations of serverless are the same as those of the cloud itself, but those did not stop the cloud from being widely adopted; with several influential companies embracing serverless, its future looks promising and bright.