Cloud Computing Module 3 Notes
Data Center design and inter-connection networks: Warehouse-scale data center, modular
data centers
Cloud Deployment models: Public, private and hybrid clouds, examples.
Cloud security: Cloud security defense strategies, Distributed intrusion/anomaly detection
● The three-tier data center network architecture is a traditional topology that has been widely adopted in many older data centers. Redundancy is a key part of this design: multiple paths from the access layer up through the aggregation layer to the core help the network achieve high availability and efficient resource allocation.
Modular Data Center in Shipping Containers
● A modern data center is structured as a shipyard of server clusters housed in truck-
towed containers.
● Inside the container, hundreds of blade servers are housed in racks surrounding the
container walls.
● An array of fans forces the heated air generated by the server racks to go through a heat
exchanger, which cools the air for the next rack on a continuous loop.
● A large-scale data center built with modular containers appears as a big shipping yard of container trucks. This container-based data center was motivated by demand for lower power consumption, higher computer density, and mobility to relocate data centers to better locations with lower electricity costs, better cooling water supplies, and cheaper housing for maintenance engineers.
● Sophisticated cooling technology enables up to an 80% reduction in cooling costs compared with traditional warehouse data centers. Both chilled-air circulation and cold water flow through the heat-exchange pipes to keep the server racks cool and easy to repair.
● The modular container design includes the network, computer, storage, and cooling gear.
● The container must be designed to be weatherproof and easy to transport.
● The modular data-center approach supports many cloud service applications.
For example, the health care industry will benefit by installing a data center at all clinic
sites.
Cloud Deployment Models
● Clouds constitute the primary outcome of cloud computing.
● Clouds build the infrastructure on top of which services are implemented and
delivered to customers. Such infrastructures can be of different types and
provide useful information about the nature and the services offered by the
cloud.
● A more useful classification is given according to the administrative domain of a
cloud: It identifies the boundaries within which cloud computing services are
implemented, provides hints on the underlying infrastructure adopted to support
such services, and qualifies them.
● Following are the different types of cloud deployment models:
o Public clouds. The cloud is open to the wider public.
o Private clouds. The cloud is implemented within the private premises of an institution and generally made accessible to the members of the institution or a subset of them.
o Hybrid clouds. The cloud is a combination of the two previous solutions and most likely identifies a private cloud that has been augmented with resources or services hosted in a public cloud.
o Community clouds. The cloud is characterized by a multi-administrative domain involving different deployment models (public, private, and hybrid), and it is specifically designed to address the needs of a specific industry/community/context.
Public clouds
● Public clouds constitute the first expression of cloud computing.
● They are a realization of the actual view of cloud computing in which the services
offered are made available to anyone, from anywhere, and at any time through the
Internet.
● From a structural point of view they are a distributed system, most likely composed
of one or more datacenters connected together, on top of which the specific services
offered by the cloud are implemented.
● Any customer can easily sign in with the cloud provider, enter credentials and billing details, and use the services offered.
● Historically, public clouds were the first class of cloud that were implemented and
offered.
● They offer solutions for minimizing IT infrastructure costs and serve as a viable
option for handling peak loads on the local infrastructure.
● They have become an interesting option for small enterprises, which are able to start
their businesses without large up-front investments by completely relying on public
infrastructure for their IT needs.
● The ability to grow or shrink according to the needs of the related business has made
public cloud attractive. By renting the infrastructure or subscribing to application
services, customers were able to dynamically upsize or downsize their IT according
to the demands of their business.
● Currently, public clouds are used both to completely replace the IT infrastructure of
enterprises and to extend it when it is required.
● A fundamental characteristic of public clouds is multitenancy. A public cloud is
meant to serve a multitude of users, not a single customer.
● A public cloud can offer any kind of service : infrastructure, platform, or applications.
● For example, Amazon EC2 is a public cloud that provides infrastructure-as-a-service; Google App Engine is a public cloud that provides an application development platform-as-a-service; and SalesForce.com is a public cloud that provides software-as-a-service.
● What makes public clouds special is the way they are consumed: They are available to everyone and are generally architected to support a large number of users.
● What characterizes public clouds is their natural ability to scale on demand and
sustain peak loads.
● Public clouds can be composed of geographically dispersed data centers to share the load of users and better serve them according to their locations. For example, Amazon Web Services has data centers installed in the United States, Europe, Singapore, Australia, etc., and allows its customers to choose from different regions such as us-west-1, us-east-1, or eu-west-1.
● Each region is priced differently and is further divided into availability zones, which map to specific data centers (a minimal region-selection sketch follows this list). According to the specific class of services delivered by the cloud, a different software stack is installed to manage the infrastructure: virtual machine managers, distributed middleware, or distributed applications.
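● As a small illustration of region selection, the hedged sketch below uses the boto3 SDK; credentials are assumed to be already configured in the environment, and the region names are just the examples mentioned above.

# Minimal sketch: choosing AWS regions explicitly with boto3.
# Assumes AWS credentials are configured (environment variables or
# ~/.aws/credentials); the region names are illustrative examples.
import boto3

# An EC2 client bound to the us-west-1 region.
ec2_us_west = boto3.client("ec2", region_name="us-west-1")

# The same API, but resources created through this client live in eu-west-1,
# which is priced differently and maps to different availability zones.
ec2_eu_west = boto3.client("ec2", region_name="eu-west-1")

# List the availability zones exposed by the chosen region.
zones = ec2_us_west.describe_availability_zones()
for zone in zones["AvailabilityZones"]:
    print(zone["ZoneName"], zone["State"])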
Private clouds
● Public clouds are appealing and provide a viable option to cut IT costs and reduce capital expenses, but they are not applicable in all scenarios. For example, a very common critique of cloud computing in its canonical implementation is the loss of control.
● In the case of public clouds, the provider is in control of the infrastructure and, eventually, of the customers’ core logic and sensitive data. Even though there could be regulatory procedures in place that guarantee fair management and respect for the customer’s privacy, this condition can still be perceived as a threat or as an unacceptable risk that some organizations are not willing to take.
● In particular, institutions such as government and military agencies will not consider
public clouds as an option for processing or storing their sensitive data.
● The risk of a breach in the security infrastructure of the provider could expose such
information to others; this could simply be considered unacceptable.
● In other cases, the loss of control of where your virtual IT infrastructure resides
could open the way to other problematic situations. More precisely, the geographical
location of a datacenter generally determines the regulations that are applied to
management of digital information. As a result, according to the specific location of
data, some sensitive information can be made accessible to government agencies or
even considered outside the law if processed with specific cryptographic techniques.
● Thus, all these aspects make the use of a public computing infrastructure not always
possible.
● The solution lies in private clouds, which are similar to public clouds, but their
resource provisioning model is limited within the boundaries of an organization.
● Private clouds are virtual distributed systems that rely on a private infrastructure
and provide internal users with dynamic provisioning of computing resources.
● Instead of a pay-as-you-go model as in public clouds, there could be other schemes in
place, taking into account the usage of the cloud and proportionally billing the
different departments or sections of an enterprise.
● Private clouds have the advantage of keeping the core business operations in-house
by relying on the existing IT infrastructure and reducing the burden of maintaining it
once the cloud has been set up.
● In this scenario, security concerns are less critical, since sensitive information does
not flow out of the private infrastructure.
● From an architectural point of view, private clouds can be implemented on more
heterogeneous hardware: They generally rely on the existing IT infrastructure
already deployed on the private premises.
● Private clouds can provide in-house solutions for cloud computing, but if compared
to public clouds they exhibit more limited capability to scale elastically on demand.
Hybrid clouds
● Public clouds are large software and hardware infrastructures that have a capability
that is huge enough to serve the needs of multiple users, but they suffer from
security threats and administrative pitfalls.
● Although the option of completely relying on a public virtual infrastructure is appealing for companies that did not incur IT capital costs and have just started considering their IT needs (i.e., start-ups), in most cases the private cloud option prevails because of the existing IT infrastructure.
● Private clouds are the perfect solution when it is necessary to keep the processing of
information within an enterprise’s premises or it is necessary to use the existing
hardware and software infrastructure.
● One of the major drawbacks of private deployments is the inability to scale on demand and to efficiently address peak loads.
● In this case, it is important to leverage capabilities of public clouds as needed. Hence,
a hybrid solution could be an interesting opportunity for taking advantage of the
best of the private and public worlds. This led to the development and diffusion of
hybrid clouds.
● Hybrid clouds allow enterprises to exploit existing IT infrastructures, maintain
sensitive information within the premises, and naturally grow and shrink by
provisioning external resources and releasing them when they are no longer needed.
● Security concerns are then only limited to the public portion of the cloud that can be
used to perform operations with less stringent constraints but that are still part of
the system workload.
● A hybrid cloud is a heterogeneous distributed system resulting from a private cloud that integrates additional services or resources from one or more public clouds.
● Whereas the concept of hybrid cloud is general, it mostly applies to IT infrastructure
rather than software services.
Community Clouds
● Community clouds are distributed systems created by integrating the services of
different clouds to address the specific needs of an industry, a community, or a
business sector.
● In community cloud, the infrastructure is shared by several organizations and
supports a specific community that has shared concerns (e.g., mission, security
requirements, policy, and compliance considerations).
● The following diagram describes the concept of community cloud.
● The users of a specific community could fall into a well-identified community,
sharing the same concerns or needs; they can be government bodies, industries, or
even simple users, but all of them focus on the same issues for their interaction with
the cloud.
● Community clouds differ from public clouds, which serve a multitude of users with different needs. Community clouds are also different from private clouds, where the services are generally delivered within the institution that owns the cloud.
● From an architectural point of view, a community cloud is most likely implemented over multiple administrative domains. This means that different organizations such as government bodies, private enterprises, research organizations, and even public virtual infrastructure providers contribute with their resources to build the cloud infrastructure.
● Candidate sectors for community clouds include Media industry, Healthcare
industry, Public Services, Scientific research etc.
PUBLIC CLOUD PLATFORMS: AWS, AZURE and GAE
Five Major Cloud Platforms and Their Service Offerings
Model | IBM | Amazon | Google | Microsoft | Salesforce
IaaS | - | AWS | Google Cloud Platform (GCP) | - | -
PaaS | BlueCloud, WCA, RC2 | - | Google App Engine (GAE) | Windows Azure | Force.com
SaaS | Lotus Live | - | Gmail, Google Docs | .NET service, Dynamic CRM | Online CRM, Gifttag
Virtualization | - | Hardware, OS and Xen | OS level (Application Container) | OS level / Hyper-V | -
Service Offerings | SOA, B2, TSAM, RAD, Web 2.0 | EC2, S3, SQS, SimpleDB | GFS, Chubby, BigTable, MapReduce | Live, SQL Hotmail | Apex, visual force, record security
Programming Support | - | AMI | Python | .NET Framework | Apex
AWS Service Offerings
Service Category | Offerings
Compute | Elastic Compute Cloud (EC2), Lambda, Elastic MapReduce, Auto Scaling
Messaging | Simple Queue Service (SQS), Simple Notification Service (SNS)
Storage | Simple Storage Service (S3), Elastic Block Storage (EBS), AWS Import/Export
Content Delivery | Amazon CloudFront
Monitoring | Amazon CloudWatch
Support | AWS Premium Support
Database | Amazon SimpleDB, Relational Database Service (RDS), DynamoDB
Networking | Virtual Private Cloud (VPC), Elastic Load Balancing
Web Traffic | Alexa Web Information Service, Alexa Web Sites
E-Commerce | Fulfillment Web Service (FWS)
Payments and Billing | Flexible Payments Service (FPS), Amazon DevPay
Workforce | Amazon Mechanical Turk
● EC2 provides virtualized platforms to host the VMs on which cloud applications can run. VMs can be used to share computing resources both flexibly and safely.
● S3 (Simple Storage Service) provides an object-oriented storage service for users (see the boto3 sketch after this list of services).
● EBS (Elastic Block Service) provides the block storage interface which can be used to
support traditional applications.
● SQS (Simple Queue Service) ensures a reliable message service between two processes.
The message can be kept reliably even when the receiver processes are not running.
● ELB (Elastic Load Balancing) automatically distributes incoming application traffic across multiple Amazon EC2 instances and allows users to avoid nonoperating nodes and to equalize load on functioning images.
● CloudWatch is a web service that provides monitoring for AWS cloud resources, starting
with Amazon EC2. It provides customers with visibility into resource utilization,
operational performance, and overall demand patterns, including metrics such as CPU
utilization, disk reads and writes, and network traffic.
● Lambda: AWS Lambda is a serverless compute service that allows you to run code without provisioning or managing servers. It executes code in response to events and automatically manages the computing resources required.
● CloudFront: Amazon CloudFront is a content delivery web service. It acts as a Content Delivery Network (CDN) that accelerates the delivery of web content to users worldwide by caching content at edge locations near those users. This results in faster loading times and improved performance for websites and applications.
● Users can access their objects through SOAP with either browsers or other client
programs which support the SOAP standard.
● Amazon provides a more flexible cloud computing platform for developers to build
cloud applications. Small and medium-size companies can put their business on the
Amazon cloud platform.
● Using the AWS platform, they can service large numbers of Internet users and make
profits through those paid services.
● Both auto-scaling and ELB are enabled by CloudWatch which monitors running instances.
● Amazon offers a Relational Database Service (RDS) with a messaging interface. RDS
brings the familiarity of SQL engines like MySQL, PostgreSQL, or SQL Server to the cloud,
offering ACID compliance and complex querying for structured data.
● The Elastic MapReduce capability is equivalent to Hadoop running on the basic EC2
offering.
● AWS Import/Export allows one to ship large volumes of data to and from EC2 by
shipping physical disks; it is well known that this is often the highest bandwidth
connection between geographically distant systems.
● Amazon CloudFront implements a content distribution network.
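● The hedged sketch below illustrates, with the boto3 SDK, the style of calls that S3 and SQS expose; the bucket and queue names are made-up examples, the bucket is assumed to already exist, and credentials are assumed to be configured in the environment.

# Minimal boto3 sketch of S3 object storage and SQS messaging.
import boto3

s3 = boto3.client("s3", region_name="us-east-1")
sqs = boto3.client("sqs", region_name="us-east-1")

# S3: store and retrieve an object by key (object-oriented storage).
# "example-notes-bucket" is a hypothetical, pre-existing bucket.
s3.put_object(Bucket="example-notes-bucket", Key="report.txt", Body=b"module 3 notes")
obj = s3.get_object(Bucket="example-notes-bucket", Key="report.txt")
print(obj["Body"].read())

# SQS: reliable messaging between two processes; the message is retained
# even if the receiving process is not currently running.
queue = sqs.create_queue(QueueName="example-jobs")
sqs.send_message(QueueUrl=queue["QueueUrl"], MessageBody="process report.txt")
resp = sqs.receive_message(QueueUrl=queue["QueueUrl"], MaxNumberOfMessages=1)
for msg in resp.get("Messages", []):
    print(msg["Body"])
    sqs.delete_message(QueueUrl=queue["QueueUrl"], ReceiptHandle=msg["ReceiptHandle"])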
Microsoft Windows Azure
● In 2010, Microsoft launched the Windows Azure platform to meet the challenges in cloud
computing.
● This platform is built over Microsoft data centers.
● The Azure platform is divided into three major component platforms:
o IaaS (Infrastructure as a Service)
o PaaS (Platform as a Service)
o SaaS (Software as a Service)
● Windows Azure offers a cloud platform built on Windows OS and based on Microsoft
virtualization technology.
● Azure manages all servers, storage, and network resources of the data center.
● On top of the infrastructure are the various services for building different cloud applications. The following figure shows the overall architecture of Microsoft’s cloud platform.
Google App Engine (GAE)
● GFS (Google File System) is used for storing large amounts of data.
● MapReduce is used for application program development.
● Chubby is used for distributed application lock services.
● BigTable offers a storage service for accessing NoSQL Data.
● Third-party application providers can use GAE to build cloud applications for providing
services.
● The applications all run in data centers under tight management by Google engineers.
Inside each data center, there are thousands of servers forming different clusters.
● The building blocks of Google’s cloud computing application include the Google File
System for storing large amounts of data, the MapReduce programming framework for
application developers, Chubby for distributed application lock services, and BigTable as
a storage service for accessing structural or semi-structural data.
● GAE runs the user program on Google’s infrastructure. Because it is a platform for running third-party programs, application developers do not need to worry about the maintenance of servers (a minimal handler sketch follows this list).
● Functional modules of GAE include:
o Datastore: data storage services built on BigTable.
o Application runtime environment: scalable web programming in Python and Java.
o Software development kit (SDK): local application development.
o Administration console: easy management of user applications.
o GAE web service infrastructure: special interfaces/APIs.
● Well-known GAE applications include the Google Search Engine, Google Docs, Google
Earth, and Gmail. These applications can support large numbers of users
simultaneously.
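● A minimal handler sketch for GAE in Python, assuming the classic webapp2 runtime bundled with the GAE Python SDK; the route and class name are illustrative only.

# Minimal sketch of a GAE (classic Python/webapp2 runtime) request handler.
import webapp2

class MainPage(webapp2.RequestHandler):
    def get(self):
        # Respond to HTTP GET on "/" with plain text.
        self.response.headers['Content-Type'] = 'text/plain'
        self.response.write('Hello from Google App Engine')

# GAE routes incoming requests to handlers through this WSGI application;
# an accompanying app.yaml (not shown) would declare the runtime.
app = webapp2.WSGIApplication([('/', MainPage)], debug=True)

● The SDK's local development server can run such an application for testing before it is deployed to Google's infrastructure.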
Cloud Services
Subject | Amazon Web Services | Microsoft Azure | Google Cloud Platform
DDoS Protection | Shield | - | Cloud Armor
DNS Service | Amazon Route 53 | Azure Traffic Manager | Cloud DNS
Automation | AWS OpsWorks | Azure Automation | Compute Engine Management
Pricing | Per-second pricing with a 60-second minimum | Per-minute basis | Per-minute basis
Security | AWS Security Hub | Azure Security Centre | Cloud Security Command Centre
Cloud Security Defense Strategies
● A healthy cloud ecosystem is desired to free users from cheating, hacking, viruses, spam, privacy violations, etc.
● The security demands of three cloud service models, IaaS, PaaS, and SaaS vary from each
other.
● The security models / strategies are based on various SLAs between providers and users.
● Basically, three kinds of cloud security enforcement are expected:
o On-site security of data centers (Biometric readers, CCTV, motion detection,
and man traps)
o Network security and fault tolerance (external firewalls, intrusion detection
systems (IDSes) and third-party vulnerability assessment)
o Platform security (SSL and data decryption, strict password policies and system
trust certification)
● The following figure shows the mapping of cloud models, where special security
measures are deployed at various cloud service levels.
● Users desire to have a software environment that provides many useful tools to build
cloud applications over large data sets.
● In addition, users also desire to have security and privacy protection software for using
the cloud.
● The software that provides security and privacy protection should offer the following features (a minimal access-control sketch follows this list):
o Special APIs for authenticating users.
o Fine-grained access control to protect data integrity and deter hackers.
o Shared data sets protected from malicious alteration, deletion, or copyright
violation.
o Protection that prevents the cloud service provider from invading users’ privacy.
o Personal firewalls at user ends.
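● As an illustration of user-authentication APIs and fine-grained access control, the hedged Python sketch below checks a per-object access list before allowing an operation; all users, tokens, and object names are hypothetical and not tied to any particular cloud provider.

# Hypothetical sketch of fine-grained access control for shared data objects.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class DataObject:
    name: str
    owner: str
    # Access-control list: user -> set of allowed operations.
    acl: dict = field(default_factory=dict)

def authenticate(token: str, sessions: dict) -> Optional[str]:
    """API for authenticating users: map a session token to a user id."""
    return sessions.get(token)

def authorize(user: str, obj: DataObject, operation: str) -> bool:
    """Fine-grained check: the owner may do anything; others need an ACL grant."""
    if user == obj.owner:
        return True
    return operation in obj.acl.get(user, set())

# Usage example (all values are made up).
sessions = {"token-123": "alice", "token-456": "bob"}
report = DataObject(name="q3-report", owner="alice", acl={"bob": {"read"}})

user = authenticate("token-456", sessions)
print(authorize(user, report, "read"))    # True  - bob was granted read access
print(authorize(user, report, "delete"))  # False - deters malicious alteration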
Data Coloring and Cloud Watermarking:
● With shared files and data sets, privacy, security, and copyright information could be
compromised in a cloud computing environment.
● Users desire to work in a trusted software environment that provides useful tools to
build cloud applications over protected data sets.
● The figure above illustrates how the system generates a special color for each data object.
● Key Concepts of Data Coloring
1. Data Classification
o Sensitive Data (e.g., PII, financial records) → Red
o Confidential Data (e.g., internal documents) → Yellow
o Public Data (e.g., marketing materials) → Green
2. Metadata Tagging
o Each data object is assigned a tag or label indicating its classification.
o Helps enforce security policies when data moves between cloud services.
3. Policy Enforcement
o Access Control: Prevents unauthorized users from accessing classified data.
o Data Loss Prevention (DLP): Detects and restricts sharing of
sensitive information.
o Audit & Compliance: Ensures adherence to GDPR, HIPAA, and other
regulations.
4. Data Tracking & Monitoring
o Helps detect anomalies and prevent data breaches.
o Supports real-time monitoring of sensitive data movement in cloud
environments.
● Data coloring assigns unique "colors" (identifying information) to data fragments, while cloud watermarking embeds invisible identifiers to prove ownership and integrity (a minimal coloring/watermark sketch follows this list).
● Data coloring technique is used to preserve data integrity and user privacy in cloud.
● Watermarking is mainly used for digital copyright management.
● Data coloring means labeling each data object with a unique color; differently colored data objects are thus distinguishable.
● Cloud storage provides a process for the generation, embedding, and extraction of watermarks in colored objects.
● This color-matching process can be applied to implement different trust-management events.
● The user identification can also be colored to be matched with the data colors.
● The data coloring technique requires far fewer calculations to color or decolor data objects than encryption/decryption techniques do.
● Cryptography and coloring can be jointly used in a cloud environment.
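● A hedged Python sketch of the data-coloring idea: each object is labeled with a classification color and an HMAC-based "watermark" derived from the owner's secret, so tampering or ownership disputes can be detected. This only illustrates the concept with standard-library primitives; it is not the actual watermarking scheme, and all keys, owners, and classifications are made up.

# Illustrative sketch of data coloring plus a simple watermark check.
import hmac, hashlib

COLORS = {"sensitive": "red", "confidential": "yellow", "public": "green"}

def color_object(data: bytes, classification: str, owner: str, secret: bytes) -> dict:
    """Label the object with a color and embed an owner-specific watermark tag."""
    watermark = hmac.new(secret, data + owner.encode(), hashlib.sha256).hexdigest()
    return {"data": data, "color": COLORS[classification],
            "owner": owner, "watermark": watermark}

def verify_object(obj: dict, secret: bytes) -> bool:
    """Re-compute the watermark; a mismatch signals tampering or a false owner."""
    expected = hmac.new(secret, obj["data"] + obj["owner"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, obj["watermark"])

# Usage example (hypothetical values).
secret = b"owner-secret-key"                      # kept by the data owner
record = color_object(b"patient #42 lab results", "sensitive", "alice", secret)
print(record["color"])                            # red -> strictest policies apply
print(verify_object(record, secret))              # True
record["data"] = b"altered results"
print(verify_object(record, secret))              # False - alteration detected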
*************************************************************************