CLOUD COMPUTING MID 2 ANSWERS

1. Explain about NumPy and its methods in Python?

NumPy: NumPy stands for Numerical Python; it is a Python package for the computation and processing of single-dimensional and multidimensional array elements.

NumPy methods in Python:

Array Creation

np.array(): Create an array.

import numpy as np
arr = np.array([1, 2, 3])

np.zeros(): Create an array of zeros.

zeros = np.zeros((3, 3))

np.ones(): Create an array of ones.

ones = np.ones((3, 3))

np.empty(): Create an uninitialized array.

empty = np.empty((2, 2))

np.arange(): Create an array with a range of values.

range_arr = np.arange(0, 10, 2)

np.linspace(): Create an array with linearly spaced values.

linspace_arr = np.linspace(0, 1, 5)

np.random.rand(): Create an array with random values.

random_arr = np.random.rand(3, 3)

Array Manipulation

arr.reshape(): Change the shape of an array.

reshaped = arr.reshape((1, 3))

arr.flatten(): Flatten an array.

flat = arr.flatten()

np.transpose(): Transpose an array.

transposed = arr.T

np.concatenate(): Concatenate arrays.

concatenated = np.concatenate((arr, arr), axis=0)

np.split(): Split an array into multiple sub-arrays.

splitted = np.split(arr, 3)

Mathematical Operations

np.add(): Element-wise addition of arrays.

added = np.add(arr, arr)

np.subtract(): Element-wise subtraction of arrays.

subtracted = np.subtract(arr, arr)

np.multiply(): Element-wise multiplication of arrays.

multiplied = np.multiply(arr, arr)

np.divide(): Element-wise division of arrays.

divided = np.divide(arr, arr)

np.dot(): Dot product of two arrays.

dot_product = np.dot(arr, arr)

Statistical Operations

np.mean(): Mean of the array elements.

mean = np.mean(arr)

np.median(): Median of the array elements.

median = np.median(arr)

np.std(): Standard deviation of the array elements.

std_dev = np.std(arr)

np.sum(): Sum of the array elements.

total = np.sum(arr)

np.min(): Minimum value of the array elements.

minimum = np.min(arr)

np.max(): Maximum value of the array elements.

maximum = np.max(arr)

2. Explain about Amazon EC2 and Amazon Dynamo DB in Python for Amazon Web Services?

Amazon EC2:

EC2, which is short for Elastic Compute Cloud, is a cloud computing service offered by the cloud service provider AWS. You can deploy your applications on EC2 servers without worrying about the underlying infrastructure. You can configure an EC2 instance in a very secure manner by using VPCs, subnets, and security groups. You can scale the configuration of the EC2 instance based on the demand of the application by attaching an Auto Scaling group to it, and you can scale the instance up and down based on the incoming traffic of the application.

Features of Amazon EC2

Amazon EC2 provides the following high-level features:

Instances

Virtual servers.

Amazon Machine Images (AMIs)

Preconfigured templates for your instances that package the components you need for your server
(including the operating system and additional software).

Instance types

Various configurations of CPU, memory, storage, networking capacity, and graphics hardware for your
instances.

Amazon EBS volumes

Persistent storage volumes for your data using Amazon Elastic Block Store (Amazon EBS).

Instance store volumes

Storage volumes for temporary data that is deleted when you stop, hibernate, or terminate your
instance.

Key pairs

Secure login information for your instances. AWS stores the public key and you store the private key in a
secure place.

Security groups

A virtual firewall that allows you to specify the protocols, ports, and source IP ranges that can reach your
instances, and the destination IP ranges to which your instances can connect.
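
A minimal sketch of launching an EC2 instance from Python with the boto3 library is shown below. The AMI ID, key pair name, and security group name are placeholder assumptions and would differ in a real account; this is an illustrative sketch, not a production script.

# Sketch: launch a single EC2 instance with boto3 (placeholder AMI/key/group values).
import boto3

ec2 = boto3.resource("ec2", region_name="us-east-1")

instances = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",       # placeholder AMI ID
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
    KeyName="my-key-pair",                 # assumed existing key pair
    SecurityGroups=["my-security-group"],  # assumed existing security group
)

instance = instances[0]
instance.wait_until_running()   # block until the instance is running
instance.reload()               # refresh attributes such as the public IP
print("Launched", instance.id, "at", instance.public_ip_address)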

Amazon Dynamo DB:

➢ Dynamo DB allows users to create databases capable of storing and retrieving any amount of data and comes in handy while serving any amount of traffic.

➢ It dynamically manages each customer's requests and provides high performance by automatically distributing data and traffic over servers.

➢ It is a fully managed NoSQL database service that is fast, predictable in terms of performance, and
seamlessly scalable.

➢ It also eliminates the operational burden and complexity involved in protecting sensitive data by providing encryption at rest.

Advantage of Dynamo DB:


The main advantages of opting for Dynamo DB are listed below:

• It has fast and predictable performance.

• It is highly scalable.

• It offloads the administrative burden of operating and scaling.

• It offers encryption at rest for data protection.

• Its scalability is highly flexible.

• AWS Management Console can be used to monitor resource utilization and performance metrics.

• It provides on-demand backups.

• With point-in-time recovery, you can restore a table to any point in time during the last 35 days.

• It can be highly automated.

Limitations of Dynamo DB

The below list provides us with the limitations of Amazon Dynamo DB:

• Capacity units are small: a read capacity unit covers one read per second for an item of up to 4 KB, and a write capacity unit covers one write per second for an item of up to 1 KB.

• All tables and global secondary indexes must have a minimum of one read and one write capacity unit.

• Table sizes have no limits, but accounts have a 256 table limit unless you request a higher cap.

• Only five local and twenty global secondary indexes (default quota) per table are permitted.

• Dynamo DB does not prevent the use of reserved words as names.

• The partition key value has a minimum length of 1 byte and a maximum length of 2048 bytes; however, Dynamo DB places no limit on the number of distinct partition key values.
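
A minimal sketch of using Dynamo DB from Python with boto3 is shown below: it creates a table, writes one item, and reads it back. The table name, key schema, and item attributes are illustrative assumptions.

# Sketch: create a Dynamo DB table and read/write one item with boto3.
import boto3

dynamodb = boto3.resource("dynamodb", region_name="us-east-1")

table = dynamodb.create_table(
    TableName="Patients",                                # illustrative table name
    KeySchema=[{"AttributeName": "patient_id", "KeyType": "HASH"}],
    AttributeDefinitions=[{"AttributeName": "patient_id", "AttributeType": "S"}],
    BillingMode="PAY_PER_REQUEST",                       # on-demand capacity mode
)
table.wait_until_exists()

# Write an item and read it back by its partition key.
table.put_item(Item={"patient_id": "P001", "name": "John", "age": 42})
response = table.get_item(Key={"patient_id": "P001"})
print(response.get("Item"))
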
3. Explain about Deployment Prototyping?

DEPLOYMENT PROTOTYPING

• Deployment prototyping can help in making deployment architecture design choices.

• By comparing performance of alternative deployment architectures, deployment

prototyping can help in choosing the best and most cost-effective deployment

architecture that can meet the application performance requirements.

• Deployment design is an iterative process that involves the following steps:

• Deployment Design

Create the deployment with various tiers as specified in the deployment

configuration and deploy the application.

• Performance Evaluation

Verify whether the application meets the performance requirements with the

deployment.

• Deployment Refinement

Deployments are refined based on the performance evaluations. Various alternatives can exist in this step, such as vertical scaling and horizontal scaling.
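
The design-evaluate-refine loop described above can be sketched in Python as follows. The candidate deployment configurations, the load-test function, and the 500 ms response-time target are purely illustrative assumptions; a real prototype would deploy each configuration and measure it under load.

# Sketch of the iterative deployment prototyping loop (all values are illustrative).
candidate_deployments = [
    {"name": "small",  "app_servers": 1, "db_servers": 1},   # baseline deployment
    {"name": "medium", "app_servers": 2, "db_servers": 1},   # horizontally scaled app tier
    {"name": "large",  "app_servers": 4, "db_servers": 2},
]

TARGET_RESPONSE_TIME_MS = 500   # assumed performance requirement

def run_load_test(deployment):
    """Placeholder: deploy the configuration, run a load test, and
    return the measured average response time in milliseconds."""
    return 1000 / (deployment["app_servers"] + deployment["db_servers"])

# Deployment Design -> Performance Evaluation -> Deployment Refinement
for deployment in candidate_deployments:
    measured = run_load_test(deployment)
    print(deployment["name"], "->", measured, "ms")
    if measured <= TARGET_RESPONSE_TIME_MS:
        print("Selected deployment:", deployment["name"])
        break   # requirement met; stop refining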

4. Explain the case study for Live video streaming App?

Case Study: Live Video Streaming App

Introduction: A live video streaming application allows on-demand creation of video streaming instances in the cloud. Video streaming applications have become very popular in recent years, with more and more users watching events broadcast live on the internet. Live-streamed events can be viewed by audiences around the world on different types of devices connected to the internet, such as laptops, desktops, tablets, smartphones, internet TVs, etc. This capability to reach a much wider audience across a much larger geographic area is one of the most unique and exciting applications of streaming technology.

Workflow of a Live Video Streaming Application

The diagram below shows the workflow of a live video streaming application that uses a multimedia cloud.


In this workflow:
● The video and audio feeds generated by a number of cameras and microphones are mixed/multiplexed with video/audio mixers and then encoded by a client application, which sends the encoded feeds to the multimedia cloud.
● On the cloud, streaming instances are created on demand and the streams are then broadcast over the internet.
● The streaming instances also record the event streams, which are later moved to cloud storage for video archiving.
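
Within this workflow, the streaming instances themselves can be launched on demand with boto3 in the same way as the EC2 sketch shown earlier. The snippet below sketches only the final archiving step, moving a recorded stream file to cloud storage (Amazon S3); the bucket name and file paths are illustrative assumptions.

# Sketch: archive a recorded event stream to cloud storage (Amazon S3) after the broadcast.
import boto3

s3 = boto3.client("s3")

def archive_recording(local_path, event_name):
    """Upload the recorded stream file to an S3 bucket used as the video archive."""
    bucket = "live-streaming-archive"        # assumed, pre-created bucket
    key = f"recordings/{event_name}.mp4"     # object key inside the archive bucket
    s3.upload_file(local_path, bucket, key)
    return f"s3://{bucket}/{key}"

print(archive_recording("/tmp/live-concert.mp4", "live-concert"))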

5. Explain about Cloud computing for Health Care?


Cloud Computing for Healthcare
• Healthcare Ecosystem
➢ The healthcare ecosystem consists of numerous entities including healthcare providers
(primary care physicians, specialists, hospitals, for instance), payers (government, private
health insurance companies, employers), pharmaceutical, device and medical service
companies, IT solutions and services firms, and patients.
• Healthcare Data
➢ The process of provisioning healthcare involves massive healthcare data that exists in
different forms (structured or unstructured), is stored in disparate data sources (such as
relational databases, file servers, for instance) and in many different formats.
• The cloud can provide several benefits to all the stakeholders in the healthcare ecosystem through systems such as:
➢ Health Information Management System (HIMS)
➢ Laboratory Information System (LIS)
➢ Radiology Information System (RIS)
➢ Pharmacy Information System (PIS)

Benefits of Cloud for Healthcare include:


Providers & Hospitals
➢ With public cloud-based EHR systems, hospitals don't need to spend a significant portion of their budgets on IT infrastructure.
➢ Public cloud service providers provide on-demand provisioning of hardware resources with pay-per-use pricing models.
➢ Thus, hospitals using public cloud-based EHR systems can save on upfront capital investments in hardware and data center infrastructure and pay only for the operating expenses of the cloud resources used.
➢ Hospitals can access patient data stored in the cloud and share the data with other hospitals.
Patients
➢ Patients can provide access to their health history and information stored in the cloud (using SaaS applications) to hospitals so that the admissions, care and discharge processes can be streamlined.
➢ Physicians can upload diagnosis reports (such as pathology reports) to the cloud so that they can be accessed by doctors remotely for diagnosing the illness.
➢ Patients can manage their prescriptions and associated information such as dosage, amount and frequency, and provide this information to their healthcare provider.
Payers
➢ Health payers can increase the effectiveness of their care management programs by providing value-added services and giving access to health information to members.
Electronic Health Records (EHRs)
Though the primary use of EHRs is to maintain all medical data for an individual patient and
to provide efficient access to the stored data at the point of care, EHRs can be the source for
valuable aggregated information about overall patient populations.
• The EHR data can be used for advanced healthcare applications such as population-level health surveillance, disease detection, outbreak prediction, public health mapping, similarity-based clinical decision intelligence, medical prognosis, syndromic diagnosis, and visual-analytics investigation.
Cloud EHRs:
Save Infrastructure Costs
• Traditional client-server EHR systems with dedicated hosting require a team of IT experts to
install, configure, test, run, secure and update hardware and software.
• With cloud-based EHR systems, organizations can save on the upfront capital investments
for setting up the computing infrastructure as well as the costs of managing the
infrastructure.
Data Integration & Interoperability
• Traditional EHR systems use different and often conflicting technical and semantic standards, which leads to data integration and interoperability problems.
• To address interoperability problems, several electronic health record (EHR) standards that
enable structured clinical content for the purpose of exchange are currently under
development.
• Interoperability of EHR systems will contribute to more effective and efficient patient care
by facilitating the retrieval and processing of clinical information about a patient from different sites.
Scalability and Performance
• Traditional EHR systems are built on a client-server model with dedicated hosting that
involves a server which is installed within the organization’s network and multiple clients that
access the server. Scaling up such systems requires additional hardware.
• Cloud computing is a hosting abstraction in which the underlying computing infrastructure
is provisioned on demand and can be scaled up or down based on the workload.
• Scaling up cloud applications is easier as compared to client-server applications.

6. Explain about Authentication in Cloud Security?

Authentication in Cloud Security:


• Authentication refers to confirming the digital identity of the entity requesting access to
some protected information.
• The process of authentication involves, but is not limited to, validating at least one factor of identification of the entity to be authenticated.
• A factor can be something the entity or the user knows (a password or PIN), something the user has (such as a smart card), or something that can uniquely identify the user (such as fingerprints).
• In multi-factor authentication, more than one of these factors is used for authentication.

There are various mechanisms for authentication, including:


❖ SSO
❖ SAML-Token
❖ OTP
Single Sign-on (SSO):
• Single Sign-on (SSO) enables users to access multiple systems or applications after signing
in only once, for the first time.
• When a user signs in, the user identity is recognized and there is no need to sign in again
and again to access related systems or applications.
• Since different systems or applications may internally use different authentication mechanisms, SSO, upon receiving the initial credentials, translates them into the different credentials required by those systems or applications.
• The benefit of using SSO is that it reduces human error and saves time spent in
authenticating with different systems or applications for the same identity.
• There are different implementation mechanisms:
o SAML-Token
o Kerberos
SAML-Token:
• Security Assertion Markup Language (SAML) is an XML-based open standard data format
for exchanging security information (authentication and authorization data) between an
identity provider and a service provider.
SAML-token based SSO authentication
• When a user tries to access the cloud application, a SAML request is generated and the user is redirected to the identity provider.
• The identity provider parses the SAML request and authenticates the user. A SAML token is returned to the user, who then accesses the cloud application with the token.
• SAML prevents man-in-the-middle and replay attacks by requiring the use of SSL
encryption when transmitting assertions and messages.
• SAML also provides a digital signature mechanism that enables the assertion to have a validity time range to prevent replay attacks.
Kerberos:
• Kerberos is an open authentication protocol that was developed at MIT.
• Kerberos uses tickets for authenticating a client to a service when they communicate over an unsecured network.
• Kerberos provides mutual authentication, i.e. both the client and the server authenticate with each other.

One Time Password (OTP):


• A one-time password is another authentication mechanism that uses passwords which are valid for a single use only, for a single transaction or session.
• Authentication mechanisms based on OTP tokens are more secure because they are not vulnerable to replay attacks.
• Text messaging (SMS) is the most common delivery mode for OTP tokens.
• The most common approach for generating OTP tokens is time synchronization.
• Time-based OTP algorithm (TOTP) is a popular time synchronization-based algorithm for
generating OTPs.
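
A minimal sketch of TOTP generation in pure Python is shown below, following the general HMAC-based approach of RFC 6238 (HMAC-SHA1 over a 30-second time counter, followed by dynamic truncation). The shared secret is an illustrative value; in practice the server and the user's token device would hold the same secret.

# Sketch: time-based one-time password (TOTP) generation, in the spirit of RFC 6238.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_base32, digits=6, period=30):
    """Generate a TOTP code: HMAC-SHA1 over the current 30-second time step,
    followed by dynamic truncation to a fixed number of decimal digits."""
    key = base64.b32decode(secret_base32, casefold=True)
    counter = int(time.time() // period)                  # time-synchronized counter
    msg = struct.pack(">Q", counter)                      # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                            # dynamic truncation offset
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

# Illustrative shared secret (base32-encoded).
print(totp("JBSWY3DPEHPK3PXP"))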
