Serverless Architecture: An Easy-To-Understand Guide

Cloud computing has changed the way that modern applications are designed, deployed and maintained. One of the driving forces behind the adoption of cloud is serverless computing.

We can think of software as a collection of smaller pieces of code working together, each one solving a particular task. We refer to these small pieces of code as functions. These can run independently of each other, but when working in unison, they form the application itself. Serverless architecture exploits this trait. An application is broken down to its constituent functions and each is deployed and run individually as opposed to being bundled together. Running these individual functions separately in the cloud is known as Function as a Service (FaaS), and provides a plethora of benefits when designing modern software.
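The idea of software as a set of small, single-purpose functions can be sketched in Python. The function names and the order-processing domain below are hypothetical illustrations, not part of any real FaaS platform:

```python
# Each function solves one narrow task and could be deployed independently.

def validate_order(order: dict) -> bool:
    """Check that an order has the fields the rest of the pipeline needs."""
    return bool(order.get("item")) and order.get("quantity", 0) > 0

def price_order(order: dict, unit_price: float) -> float:
    """Compute the total cost for an order."""
    return order["quantity"] * unit_price

def process_order(order: dict) -> dict:
    """Compose the small functions into the application's overall behaviour."""
    if not validate_order(order):
        return {"status": "rejected"}
    total = price_order(order, unit_price=9.99)
    return {"status": "accepted", "total": round(total, 2)}

print(process_order({"item": "mug", "quantity": 3}))
```

In a serverless deployment, each of these functions could be uploaded and invoked separately, rather than shipped as one bundle.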

Serverless architectures are ideal for use cases where an application revolves around responding to events with code. Some common examples include data processing pipelines, application backends, APIs, chatbots, voicebots, and workflow automation.

Serverless Architecture vs Traditional Architecture

In traditional architecture, developers would run code on a server to complete a specific task. The server could either be owned (usually hardware residing on the customer's premises) or rented from a cloud provider. Either way, the server would be available around the clock, waiting for requests to service. These servers would often sit idle, and even when processing requests, they would rarely use all of their allocated CPU or memory.

Rather than running 24/7 and constantly waiting for requests to come in, serverless computing adopts a usage-based model, spinning up resources when requests are submitted. Once these are completed, the servers are powered down. As such, instead of paying for computing resources such as virtual machines every month regardless of their usage, serverless computing bills per function invoked. You only pay for the resources you use, when you use them.

Event-based triggering

A key differentiator between serverless-based applications and monolithic applications is the way in which code is executed. In serverless architectures, functions are only run as and when they are needed. When a request is received, an event is triggered and the cloud provider runs the relevant function automatically, reducing the amount of time servers are idle. 

In an Infrastructure as a Service (IaaS) environment - which better suits traditional software - applications need to run continuously, always listening for incoming requests to process and output data as required. If the hosting infrastructure or application is not running when a request is sent, the request will not be processed. This always-on approach results in wasted resources, as the code must run on a server even when there are no requests to service. Take as an example a web API designed to process orders from an online store. With an always-on approach, if this web API receives no requests during the night, the resources used overnight are wasted. FaaS allows the web API to always be available, but only actually runs the code as and when it is required.
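A per-event handler in this style might look like the following Python sketch. The `handler(event, context)` signature mirrors the AWS Lambda convention, and the event shape used here is an assumption for illustration:

```python
import json

def handler(event, context=None):
    """Runs only when an order event arrives; there is no always-on server loop.

    `event` mimics an HTTP request delivered by a FaaS platform; its exact
    shape here is hypothetical, not a real platform API.
    """
    body = json.loads(event.get("body", "{}"))
    order_id = body.get("order_id")
    if order_id is None:
        return {"statusCode": 400, "body": json.dumps({"error": "missing order_id"})}
    # In a real deployment this would persist the order; here we just echo it.
    return {"statusCode": 200, "body": json.dumps({"processed": order_id})}

# The platform, not the application, invokes the handler once per request:
print(handler({"body": json.dumps({"order_id": "A-17"})}))
```

Between invocations, nothing runs; the platform only charges for the time the handler actually executes.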

Decentralised Software 

Software written to be run in a serverless environment must be decentralised, with each component running individually in the cloud as a function. These components can still communicate with one another in order to share and process data.

Traditional Monolith vs Modern Serverless Approach

However, the entire application does not need to be running when only a subset of its components are actually servicing user requests. Any code that facilitates the creation of a user interface or other user-facing content is deployed to the users' device, while other functions, such as storing details in a database, are enacted by triggering the exact function needed on the FaaS platform. This decentralisation of code also allows an application to be scaled more easily in order to service a larger than usual number of requests from users.

This contrasts with systems deployed in an IaaS environment, where large applications are deployed as a singular unit, serving all of the functions of the software. Components of the software responsible for rendering user interfaces, storing data in a database and sending requests to other services such as Stripe to manage payments are all deployed together. 

Managed Infrastructure 

Serverless computing removes the need for application developers and businesses to own or rent their own servers and infrastructure. Instead, code is stored and managed by a cloud provider, and the supplied functions are run in the correct environment with the relevant accompanying software as and when required. Application developers no longer need to manage how their code will be run. When a function is triggered, the serverless computing service automatically finds a server to run the code on, runs the code and outputs the result to the relevant destination. 

In traditional environments, developers would need to update and maintain a server they are using to run their code. In the past, having control of this was paramount in ensuring the performance and stability of your software. The need to control and manage your own infrastructure and servers is no longer as essential as it once was, due to new modern standards and the environments provided in FaaS offerings.

Critical Components of Serverless Architecture

Having discussed differences between serverless and traditional architectures, we will now describe the components that make up the serverless model and how they play a role in achieving flexibility and scalability.

API Gateways 

API gateways provide the entry point for clients to send requests to a service and retrieve data. They receive requests from all clients and trigger the relevant function, passing data to the relevant component and returning the resulting data back to the client. API gateways make it easy to change which functions are called for a given request without ever needing to update client code.

API gateways can also be used to aggregate multiple function invocations to achieve composite functionality. A client may send a single request, but that one request can trigger the execution of more than one function. These functions produce a single response that is returned to the user. This approach reduces the number of requests the client needs to make, improving performance and the client experience. For example, consider a user registering an account: a single client request is sent to the API gateway, but multiple functions are executed to store user details, send a welcome email and generate a 'welcome' coupon code.
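That registration flow could be orchestrated roughly as below. The three functions and their names are hypothetical stand-ins for separately deployed FaaS functions, with simple in-process logic replacing real service calls:

```python
def store_user(request: dict) -> dict:
    """Stand-in for a function that persists user details and returns an id."""
    return {"user_id": sum(map(ord, request["email"])) % 10_000}

def send_welcome_email(email: str) -> bool:
    """Stand-in for a call to an email-sending service."""
    return "@" in email

def generate_coupon(user_id: int) -> str:
    """Stand-in for a function that issues a welcome coupon."""
    return f"WELCOME-{user_id:04d}"

def register_endpoint(request: dict) -> dict:
    """One gateway request fans out to several functions; one response returns."""
    user = store_user(request)
    emailed = send_welcome_email(request["email"])
    coupon = generate_coupon(user["user_id"])
    return {"user_id": user["user_id"], "email_sent": emailed, "coupon": coupon}
```

The client sees a single round trip even though three distinct functions ran behind the gateway.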

Function as a Service 

The business logic of a serverless application lives in the functions developers write. These functions are executed on a FaaS platform provided by a cloud provider. They often perform small, focused tasks and execute only for a short period of time; the upper limit before a function times out is typically measured in minutes and varies by provider. This maximum runtime is one of the reasons tasks are broken down into many smaller, interlinked functions that together solve a wider problem.

For a function to run in isolation from others executing on the same physical servers, it runs within a container. Containers are small, lightweight virtual environments that, in a similar manner to virtual machines, abstract physical resources and dedicate them to the applications running inside. They are very inexpensive to spin up and run, allowing functions to execute very quickly after an event is detected.

Functions initialise when they get triggered by an event. Imagine a user updating their contact details in a mobile application. The API gateway receives the request, invokes the relevant functions, and returns the result once the function has completed its task. Functions can also be triggered by passive events such as on a time schedule or when a file is updated in a file storage service. 

Serverless environments are stateless. Functions are not necessarily aware of previous function executions and will often receive a piece of data as a parameter, do something with it and then either store the result somewhere or return a result to the relevant function or client. 
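Statelessness can be illustrated by passing all required state in and writing results out, rather than keeping anything inside the function itself. In this sketch a plain dict stands in for an external database; the function names are hypothetical:

```python
def count_visit(event: dict, store: dict) -> dict:
    """A stateless function: it keeps no memory between calls.

    Anything that must survive the invocation (here, a visit counter)
    is read from and written back to an external store.
    """
    user = event["user"]
    store[user] = store.get(user, 0) + 1
    return {"user": user, "visits": store[user]}

external_store = {}  # stands in for a managed database service
count_visit({"user": "ada"}, external_store)
result = count_visit({"user": "ada"}, external_store)
print(result)  # the count survives because it lives in the store, not the function
```

If the function instance were recycled between the two calls, the result would be unchanged, because no state lives in the function.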

Database Service 

Persisting data such as user contact details, product information and more can be achieved using a database Backend as a Service (BaaS) solution. Functions running in a serverless application need to persist data somewhere for later usage by the same functions or for other components in the wider application. Many cloud providers offer some form of database management service whereby the database is maintained and deployed by the cloud provider, and functions simply leverage the service. In addition to persisting data, some databases can also be used for caching data where repeated reads from the primary database would be inefficient.
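A common shape for that caching idea is a read-through cache in front of the database. In the sketch below, `fetch_from_db` is a hypothetical stand-in for a managed-database read, and the cache is a plain dict:

```python
def fetch_from_db(key: str, calls: list) -> str:
    """Stand-in for a (comparatively slow) managed-database read.

    `calls` records each database hit so the caching effect is visible.
    """
    calls.append(key)
    return f"value-for-{key}"

def cached_get(key: str, cache: dict, calls: list) -> str:
    """Read-through cache: serve from cache, fall back to the database once."""
    if key not in cache:
        cache[key] = fetch_from_db(key, calls)
    return cache[key]

cache, db_calls = {}, []
cached_get("user:42", cache, db_calls)   # misses the cache, reads the database
cached_get("user:42", cache, db_calls)   # served from cache, no database read
print(len(db_calls))  # 1
```

Managed cache services play the role of the dict here; the pattern is the same regardless of which provider hosts the cache.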

Benefits of Serverless Architecture

By design, serverless architecture enables the wider business to inherit technical benefits around efficiency and scalability. Below we explore the benefits as they are applicable to each type of stakeholder.

With serverless architectures, resource capacity matches demand, minimizing waste

Benefits to Businesses

  • Maintain experience during spikes in demand – serverless auto-scaling mechanisms provision computational resources in response to demand spikes. Services deployed in a serverless environment using FaaS offerings are naturally scalable. This allows a product to remain responsive and ensures users are not lost to time-outs and slow service.
  • Lower costs - serverless solutions reduce operational costs by cutting out infrastructure management. The cost of the service is determined by the number of invoked functions.
  • Simpler business management – rather than hiring dedicated operations staff to manage infrastructure, serverless allows for these resources to be allocated to DevOps teams that can deliver real business value in a simpler organisational structure. 
  • Differentiating features – serverless makes it easier to cater to the market's requirements for on-demand or 'instantaneous' services.

Benefits to Developers 

  • No-touch scalability - FaaS deployments are naturally easy to scale and do not require the extensive configuration and management of a traditional auto-scaling service. As a developer, this reduces the effort of designing code for scale and allows more focus on the business logic and code performance.
  • Value-adding tasks - No infrastructure to manage means more time working on business problems. All infrastructure is managed by the cloud provider, which removes the need to update and maintain servers. 
  • Application management - More concise code often comes with serverless computing and FaaS. Code needs to be split into separate functions in order to be deployed; this requirement encourages developers to break code down into smaller business units which are easier to manage. Keep in mind that this is only achievable if the development team follows serverless best practices and is particularly attentive to performance monitoring.

Benefits to Users 

  • Responsiveness - Naturally scalable code leads to a more responsive and seamless user experience. Traditional auto-scaling mechanisms don’t always work correctly, and can sometimes provision fewer resources than required, leading to increased latency and system crashes.
  • Continuous delivery - With no infrastructure to manage, developers can focus on delivering end-user value more quickly.
  • Geographical Proximity – FaaS can achieve an almost instantaneous response when functions are run in a nearby data centre rather than thousands of miles away. With FaaS, cloud providers can run applications in the data centres closest to the end user. 
  • Realtime preferences – some use cases for serverless applications include on-the-fly preference updates for users for a fully tailored experience.

Serverless Architecture Challenges

As with every new technology, serverless comes with a handful of challenges. Most of these can be mitigated with careful planning and preparation.

Challenges for Businesses 

  • Delayed testing - the difficulty of replicating serverless environments locally for testing may result in longer than expected testing phases.
  • Talent sourcing and training - developers need a good knowledge of FaaS and of code designed specifically for serverless environments. This requires businesses to train developers on the technology, or to hire engineers who already have these skills.

Challenges for Developers 

  • Cold starts - this problem occurs when a function runs after it hasn't been requested for a while. The initial run may be slower than usual because the cloud provider has no resources provisioned and the benefits of caching are not realised. This can lead to increased latency for a small number of users from time to time, and can be remedied relatively easily by sending periodic requests to infrequently triggered functions.
  • Difficult testing - testing functions deployed in serverless environments is not as easy as testing code deployed as a single unit, primarily because replicating FaaS locally is not easy and requires extensive setup.
  • Managing memory - serverless solutions execute functions in isolation from each other, and no state is maintained across instances. You can think of state as short-term working memory, which allows functions to keep track of things such as the number of users that have used the service, or to store files that can be returned to a user quickly. Developers need to remember this and design their code accordingly, so that state is not an inherent requirement.
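The periodic warm-up ping mentioned above can be sketched as a scheduled event that the function recognises and returns from early. The `warmup` flag convention here is an assumption, not a platform feature:

```python
def handler(event, context=None):
    """Returns immediately on warm-up pings so they stay cheap."""
    if event.get("warmup"):
        # Keeps the container provisioned without doing real work.
        return {"warmed": True}
    return {"result": event["value"] * 2}

# A scheduler (e.g. a timed trigger) would send this every few minutes:
print(handler({"warmup": True}))
# Real requests then hit an already-warm container:
print(handler({"value": 21}))
```

Some providers also offer built-in mechanisms to keep instances warm, which removes the need for this workaround.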

Challenges for Users 

  • Initial latency – if a user sends a request at a time when resources are not spun up, the user may face some initial latency. This cold start problem can be alleviated using caching alongside other means such as keeping a function ‘warm’ by sending periodic requests to it.

Conclusion 

Serverless computing is solving some critical challenges in application development. It allows engineers to develop their applications as small, individual functions that work together to form a powerful, naturally scalable solution that can serve large numbers of users concurrently. Infrastructure management is handled by the cloud provider, simplifying and streamlining operations.

The caveat to all these benefits, as with all new technologies, is that developers need training and relevant knowledge to design and maintain serverless applications using best practices.