
Published by: International Journal of Engineering Research & Technology (IJERT)
http://www.ijert.org ISSN: 2278-0181
Vol. 11 Issue 09, September 2022

A Research Paper on Serverless Computing


Vaishnavi Kulkarni
Technical Support Engineer
Supermoney, Mumbai, India

Abstract— Serverless computing is a method of providing backend services on an as-used basis. A serverless provider allows users to write and deploy code without the hassle of worrying about the underlying infrastructure.

Keywords— Serverless computing, Cloud computing, front end, back end.

I. INTRODUCTION
Serverless computing is a method of providing backend services on an as-used basis. A serverless provider allows users to write and deploy code without the hassle of worrying about the underlying infrastructure. A company that gets backend services from a serverless vendor is charged based on its computation and does not have to reserve and pay for a fixed amount of bandwidth or number of servers, as the service is auto-scaling. Note that despite the name serverless, physical servers are still used, but developers do not need to be aware of them.
In the early days of the web, anyone who wanted to build a web application had to own the physical hardware required to run a server, which was a cumbersome and expensive undertaking.
Then came cloud computing, where fixed numbers of servers or amounts of server space could be rented remotely. Developers and companies who rent these fixed units of server space generally over-purchase to ensure that a spike in traffic or activity will not exceed their monthly limits and break their applications. This means that much of the server space that gets paid for goes to waste. Cloud vendors have introduced auto-scaling models to address the issue, but even with auto-scaling, an unwanted spike in activity, such as a DDoS attack, can end up being very expensive.

Fig: Cost Benefits of Serverless

Serverless computing allows developers to purchase backend services on a flexible 'pay-as-you-go' basis, meaning that developers only have to pay for the services they use. This is like switching from a cell phone data plan with a monthly fixed limit to one that only charges for each byte of data that actually gets used.
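To make this pay-per-use billing model concrete, here is a small illustrative sketch in Python that compares an always-on server bill with a usage-based bill; every rate and workload figure in it is an assumed placeholder, not a quote from any provider's price list:

# Hypothetical comparison of fixed-server vs. pay-per-use billing.
# All rates and workload figures below are illustrative assumptions.

REQUESTS_PER_MONTH = 2_000_000        # assumed traffic
AVG_DURATION_S = 0.2                  # assumed run time per invocation
MEMORY_GB = 0.5                       # assumed memory allocated per function

# Assumed prices (not any vendor's real price list)
PRICE_PER_GB_SECOND = 0.0000166
PRICE_PER_MILLION_REQUESTS = 0.20
FIXED_SERVER_MONTHLY = 80.0           # assumed cost of an always-on instance

gb_seconds = REQUESTS_PER_MONTH * AVG_DURATION_S * MEMORY_GB
serverless_cost = (gb_seconds * PRICE_PER_GB_SECOND
                   + REQUESTS_PER_MONTH / 1_000_000 * PRICE_PER_MILLION_REQUESTS)

print(f"Pay-per-use estimate: ${serverless_cost:.2f}/month")
print(f"Fixed server estimate: ${FIXED_SERVER_MONTHLY:.2f}/month")

Under these assumed numbers the usage-based bill is a few dollars a month versus a fixed monthly charge, which is the point of the data-plan analogy above; with different traffic the comparison can of course flip.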
The term 'serverless' is somewhat misleading, as there are still servers providing these backend services, but all of the server space and infrastructure concerns are handled by the vendor. Serverless means that developers can do their work without having to worry about servers at all.

II. ARCHITECTURE OF SERVERLESS COMPUTING
Serverless architecture is largely based on a Functions as a Service (FaaS) model that allows cloud platforms to execute code without the need for fully provisioned infrastructure instances. FaaS offerings, also known as Compute as a Service (CaaS), provide stateless, server-side functions that are event-driven, scalable, and fully managed by cloud providers.
DevOps teams write code that focuses on business logic and then define an event that triggers the function to be executed, such as an HTTP request. The cloud provider then executes the code and sends the results to the web application for users to review.
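As a concrete sketch of this flow, the function below follows the handler convention used by AWS Lambda behind an HTTP API gateway; the greeting logic and the query parameter are illustrative assumptions, not part of the paper:

import json

def handler(event, context):
    """Minimal HTTP-triggered function in the style of an AWS Lambda handler.

    The platform invokes this function when the configured event fires
    (here, an HTTP request routed through an API gateway); the developer
    only supplies the business logic.
    """
    # Pull a query-string parameter out of the event, if one was sent.
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")

    # Return a response object the gateway can translate into an HTTP reply.
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

Locally, the same handler can be exercised with handler({"queryStringParameters": {"name": "Ada"}}, None), which returns a 200 response whose body contains the greeting.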
AWS Lambda, Microsoft Azure Functions, Google Cloud Functions and IBM OpenWhisk are all well-known examples of serverless services offered by cloud providers.
The convenience and cost-saving benefits associated with on-demand auto-scaling resources, and with only paying for services as they are needed, make serverless frameworks an appealing option for DevOps teams and business stakeholders alike.

➢ Examples of serverless
The growing popularity of cloud computing and microservices, combined with the demand for greater innovation and agility without increasing costs, has contributed significantly to the prevalence of serverless applications. Notable use cases include:
• Slack: Serverless is ideal for independent, task-based applications such as chatbots and can save on operational costs, since billing is based on the actual number of requests. Slack, a popular cloud-based business communication platform, uses a serverless application called marbot to send notifications from Amazon Web Services (AWS) to DevOps teams through Slack.
• HomeAway: Reducing development time and server costs while simplifying the build process are goals that universally appeal to business teams and IT teams. HomeAway relied on Google Cloud Functions to develop an app that allowed users to search and comment on the recommendations of travelers in real time, even in areas without an internet connection. The cloud services available through Cloud Firestore and Cloud Functions made it possible to set up the infrastructure within minutes and deploy the app within six weeks with just one full-time developer.


• GreenQ: Garbage pick-up and disposal is an industry that may not seem to require innovative technology, but GreenQ took a sophisticated approach to streamlining and improving waste management by using IBM OpenWhisk to create an IoT platform that uses hardware installed on garbage trucks to collect key metrics such as pickup time, location, and load weight. The auto-scaling available through serverless was particularly valuable due to the fluctuation of infrastructure demands based on the number of customers and trucks at any given time.
• Coca-Cola: Soft drink giant Coca-Cola has enthusiastically embraced serverless after its implementation in vending machines resulted in significant savings. Whenever a beverage is purchased, the payment gateway makes a call to the AWS API Gateway and triggers an AWS Lambda function to complete the transaction (a sketch of this flow appears below). Since vending machines must communicate with headquarters for inventory and marketing purposes, the ability to pay per request rather than operating at full capacity had a substantial impact on reducing costs.
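A minimal sketch of the shape of such a pay-per-request flow is shown below: an API-gateway event carrying payment details is handed to a Lambda-style function that records the sale. The payload fields and the record_sale helper are hypothetical, used only to illustrate the pattern; they are not Coca-Cola's actual integration:

import json

def record_sale(machine_id: str, product: str, amount_cents: int) -> str:
    """Hypothetical stand-in for writing the transaction to a datastore."""
    return f"{machine_id}:{product}:{amount_cents}"

def handler(event, context):
    """Lambda-style function invoked once per purchase via an API gateway."""
    # Assumed payload layout sent by the payment gateway.
    payment = json.loads(event.get("body") or "{}")
    receipt_id = record_sale(
        machine_id=payment.get("machine_id", "unknown"),
        product=payment.get("product", "unknown"),
        amount_cents=int(payment.get("amount_cents", 0)),
    )
    # Billing is per invocation, so the machine only pays when a drink is sold.
    return {"statusCode": 200, "body": json.dumps({"receipt": receipt_id})}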
III. FRONTEND VS BACKEND
Application development is generally split into two realms: the frontend and the backend. The frontend is the part of the application that users see and interact with, such as the visual layout. The backend is the part that the user doesn't see; this includes the server where the application's files live and the database where user data and business logic are persisted.

Fig: Frontend vs Backend

For example, let's imagine a website that sells concert tickets. When a user types the website address into the browser window, the browser sends a request to the backend server, which responds with the website data. The user will then see the frontend of the website, which can include content such as text, images, and form fields for the user to fill out. The user can then interact with one of the form fields on the frontend to search for their favorite musical act. When the user clicks 'submit', this triggers another request to the backend. The backend code checks its database to see if a performer with this name exists, and if so, when they will be playing next and how many tickets are available. The backend then passes that data back to the frontend, and the frontend displays the results in a way that makes sense to the user. Similarly, when the user creates an account and enters financial information to buy the tickets, another back-and-forth communication between the frontend and backend will occur.
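To ground this request/response loop, here is a minimal sketch of the backend search step, with an in-memory dictionary standing in for the database; the performer names, record fields, and the search_performer handler are invented for illustration:

import json

# In-memory stand-in for the ticketing database (invented sample data).
PERFORMERS = {
    "the examples": {"next_show": "2022-10-01", "tickets_available": 120},
    "null pointers": {"next_show": "2022-11-15", "tickets_available": 45},
}

def search_performer(event, context):
    """Backend handler for the 'submit' request described above.

    Looks the performer up in the database stand-in and returns the data
    the frontend needs to render the results.
    """
    params = event.get("queryStringParameters") or {}
    name = params.get("performer", "").strip().lower()
    record = PERFORMERS.get(name)

    if record is None:
        return {"statusCode": 404,
                "body": json.dumps({"error": f"No performer named '{name}'"})}
    return {"statusCode": 200, "body": json.dumps(record)}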
IV. ADVANTAGES AND DISADVANTAGES OF SERVERLESS COMPUTING

Advantages
• Lower costs - Serverless computing is generally very cost-effective, as traditional cloud providers of backend services (server allocation) often leave the user paying for unused space or idle CPU time.
• Simplified scalability - Developers using serverless architecture don't have to worry about policies to scale up their code. The serverless vendor handles all of the scaling on demand.
• Simplified backend code - With FaaS, developers can create simple functions that independently perform a single purpose, like making an API call (see the sketch after this list).
• Quicker turnaround - Serverless architecture can significantly cut time to market. Instead of needing a complicated deploy process to roll out bug fixes and new features, developers can add and modify code on a piecemeal basis.
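As an example of such a single-purpose function, the sketch below wraps exactly one outbound API call and nothing else; the target URL is a placeholder assumption and only the Python standard library is used:

import json
import urllib.request

# Placeholder endpoint; in practice this would be whatever API the
# function exists to call.
STATUS_API_URL = "https://example.com/api/status"

def check_upstream_status(event, context):
    """Single-purpose function: call one upstream API and report the result."""
    try:
        with urllib.request.urlopen(STATUS_API_URL, timeout=5) as response:
            upstream_ok = response.status == 200
    except OSError:
        upstream_ok = False
    return {"statusCode": 200, "body": json.dumps({"upstream_ok": upstream_ok})}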
Disadvantages
• Testing and debugging become more challenging
It is difficult to replicate the serverless environment in order to see how code will actually perform once deployed. Debugging is more complicated because developers do not have visibility into backend processes, and because the application is broken up into separate, smaller functions.
• Serverless computing introduces new security concerns
When vendors run the entire backend, it may not be possible to fully vet their security, which can especially be a problem for applications that handle personal or sensitive data. Because companies are not assigned their own discrete physical servers, serverless providers will often be running code from several of their customers on a single server at any given time. This issue of sharing machinery with other parties is known as 'multitenancy' – think of several companies trying to lease and work in a single office at the same time. Multitenancy can affect application performance and, if the multi-tenant servers are not configured properly, could result in data exposure. Multitenancy has little to no impact for networks that sandbox functions correctly and have powerful enough infrastructure.
• Serverless architectures are not built for long-running processes
This limits the kinds of applications that can cost-effectively run in a serverless architecture. Because serverless providers charge for the amount of time code is running, it may cost more to run an application with long-running processes in a serverless infrastructure compared to a traditional one.


• Performance may be affected
Because it is not constantly running, serverless code may need to 'boot up' when it is used. This startup time may degrade performance. However, if a piece of code is used regularly, the serverless provider will keep it ready to be activated – a request for this ready-to-go code is called a 'warm start'. A request for code that hasn't been used in a while is called a 'cold start'.
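One simple way to observe the difference is to time repeated calls against a deployed function: the first request after an idle period typically pays the cold-start penalty, while the ones that follow hit a warm instance. The sketch below does exactly that; the function URL is a placeholder assumption:

import time
import urllib.request

# Placeholder: replace with the URL of a deployed HTTP-triggered function.
FUNCTION_URL = "https://example.com/my-function"

def time_invocation(url: str) -> float:
    """Return the round-trip time of one invocation in seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=30) as response:
        response.read()
    return time.perf_counter() - start

if __name__ == "__main__":
    # After an idle period the first call usually pays the cold-start cost;
    # the following calls hit an already-warm instance.
    for i in range(5):
        print(f"request {i + 1}: {time_invocation(FUNCTION_URL):.3f}s")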
• Vendor lock-in is a risk
Allowing a vendor to provide all backend services for an application inevitably increases reliance on that vendor. Setting up a serverless architecture with one vendor can make it difficult to switch vendors if necessary, especially since each vendor offers slightly different features and workflows.

V. HOW DOES SERVERLESS COMPARE TO OTHER CLOUD BACKEND MODELS?
A couple of technologies that are often conflated with serverless computing are Backend-as-a-Service and Platform-as-a-Service. Although they share similarities, these models do not necessarily meet the requirements of serverless.
Backend-as-a-Service (BaaS) is a service model where a cloud provider offers backend services such as data storage, so that developers can focus on writing front-end code. But while serverless applications are event-driven and run on the edge, BaaS applications may not meet either of these requirements.
Platform-as-a-Service (PaaS) is a model where developers essentially rent all the necessary tools to develop and deploy applications from a cloud provider, including things like operating systems and middleware. However, PaaS applications are not as easily scalable as serverless applications. PaaS offerings also don't necessarily run on the edge and often have a noticeable startup delay that isn't present in serverless applications.
Infrastructure-as-a-Service (IaaS) is a catchall term for cloud vendors hosting infrastructure on behalf of their customers. IaaS providers may offer serverless functionality, but the terms are not synonymous.

CONCLUSION
Serverless computing continues to evolve as serverless providers come up with solutions to overcome some of its drawbacks. One of these drawbacks is cold starts. Typically, when a particular serverless function has not been called in a while, the provider shuts down the function to save energy and avoid over-provisioning. The next time a user runs an application that calls that function, the serverless provider will have to spin it up fresh and start hosting that function again. This startup time adds significant latency, which is known as a 'cold start'.
Once the function is up and running, it will be served much more rapidly on subsequent requests (warm starts), but if the function is not requested again for a while, it will once again go dormant. This means the next user to request that function will experience a cold start. Until fairly recently, cold starts were considered a necessary trade-off of using serverless functions.

(This work is licensed under a Creative Commons Attribution 4.0 International License.)