AWS Exam Dumps for Architects

The document provides sample questions and answers related to the AWS Certified Solutions Architect - Associate exam. It includes questions on various AWS topics along with explanations for the answers.

Uploaded by Harsh Gupta

Welcome to download the Newest 2passeasy AWS-Solution-Architect-Associate dumps

https://www.2passeasy.com/dumps/AWS-Solution-Architect-Associate/ (348 New Questions)

Exam Questions AWS-Solution-Architect-Associate


Amazon AWS Certified Solutions Architect - Associate

https://www.2passeasy.com/dumps/AWS-Solution-Architect-Associate/

Passing Certification Exams Made Easy visit - https://www.2PassEasy.com



NEW QUESTION 1
- (Exam Topic 1)
A company is preparing to launch a public-facing web application in the AWS Cloud. The architecture consists of Amazon EC2 instances within a VPC behind an
Elastic Load Balancer (ELB). A third-party service is used for the DNS. The company's solutions architect must recommend a solution to detect and protect against
large-scale DDoS attacks.
Which solution meets these requirements?

A. Enable Amazon GuardDuty on the account.


B. Enable Amazon Inspector on the EC2 instances.
C. Enable AWS Shield and assign Amazon Route 53 to it.
D. Enable AWS Shield Advanced and assign the ELB to it.

Answer: D

Explanation:
https://aws.amazon.com/shield/faqs/

NEW QUESTION 2
- (Exam Topic 1)
A company hosts its multi-tier applications on AWS. For compliance, governance, auditing, and security, the company must track configuration changes on its
AWS resources and record a history of API calls made to these resources.
What should a solutions architect do to meet these requirements?

A. Use AWS CloudTrail to track configuration changes and AWS Config to record API calls
B. Use AWS Config to track configuration changes and AWS CloudTrail to record API calls
C. Use AWS Config to track configuration changes and Amazon CloudWatch to record API calls
D. Use AWS CloudTrail to track configuration changes and Amazon CloudWatch to record API calls

Answer: B

NEW QUESTION 3
- (Exam Topic 1)
A company recently launched Linux-based application instances on Amazon EC2 in a private subnet and launched a Linux-based bastion host on an Amazon EC2 instance in a public subnet of a VPC. A solutions architect needs to connect from the on-premises network, through the company's internet connection, to the bastion host and to the application servers. The solutions architect must make sure that the security groups of all the EC2 instances will allow that access.
Which combination of steps should the solutions architect take to meet these requirements? (Select TWO)

A. Replace the current security group of the bastion host with one that only allows inbound access from the application instances
B. Replace the current security group of the bastion host with one that only allows inbound access from the internal IP range for the company
C. Replace the current security group of the bastion host with one that only allows inbound access from the external IP range for the company
D. Replace the current security group of the application instances with one that allows inbound SSH access from only the private IP address of the bastion host
E. Replace the current security group of the application instances with one that allows inbound SSH access from only the public IP address of the bastion host

Answer: CD

Explanation:
https://digitalcloud.training/ssh-into-ec2-in-private-subnet/
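The two rules from answers C and D can be sketched as boto3-style `IpPermissions` structures. This is a minimal illustration, not the exam's official material: the corporate CIDR range and bastion IP below are hypothetical placeholders.

```python
# Sketch of the two security group ingress rules: answer C restricts the
# bastion to the company's external IP range, and answer D restricts the
# application instances to the bastion's private IP. Values are placeholders.

def bastion_ingress(company_external_cidr):
    """Allow inbound SSH to the bastion only from the company's external range."""
    return {
        "IpProtocol": "tcp",
        "FromPort": 22,
        "ToPort": 22,
        "IpRanges": [{"CidrIp": company_external_cidr,
                      "Description": "Corporate egress range"}],
    }

def app_ingress(bastion_private_ip):
    """Allow inbound SSH to the app instances only from the bastion's private IP."""
    return {
        "IpProtocol": "tcp",
        "FromPort": 22,
        "ToPort": 22,
        "IpRanges": [{"CidrIp": f"{bastion_private_ip}/32",
                      "Description": "Bastion host only"}],
    }

# With boto3 and credentials available, each dict could be passed as
# ec2.authorize_security_group_ingress(GroupId=..., IpPermissions=[rule]).
```

The key point the rules encode: nothing in the public internet other than the company's own range can reach the bastion, and nothing other than the bastion can reach the application tier.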

NEW QUESTION 4
- (Exam Topic 1)
A company needs to store data in Amazon S3 and must prevent the data from being changed. The company wants new objects that are uploaded to Amazon S3 to remain unchangeable for a nonspecific amount of time until the company decides to modify the objects. Only specific users in the company's AWS account can have the ability to delete the objects.
What should a solutions architect do to meet these requirements?

A. Create an S3 Glacier vault. Apply a write-once, read-many (WORM) vault lock policy to the objects.
B. Create an S3 bucket with S3 Object Lock enabled. Enable versioning. Set a retention period of 100 years. Use governance mode as the S3 bucket's default retention mode for new objects.
C. Create an S3 bucket. Use AWS CloudTrail to track any S3 API events that modify the objects. Upon notification, restore the modified objects from any backup versions that the company has.
D. Create an S3 bucket with S3 Object Lock enabled. Enable versioning. Add a legal hold to the objects. Add the s3:PutObjectLegalHold permission to the IAM policies of users who need to delete the objects.

Answer: D

Explanation:
"The Object Lock legal hold operation enables you to place a legal hold on an object version. Like setting a retention period, a legal hold prevents an object version
from being overwritten or deleted. However, a legal hold doesn't have an associated retention period and remains in effect until removed."
https://docs.aws.amazon.com/AmazonS3/latest/userguide/batch-ops-legal-hold.html
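A minimal sketch of what answer D looks like in practice, expressed as the request parameters for the S3 `put_object_legal_hold` call. The bucket and key names are hypothetical; only the parameter shape follows the documented API.

```python
# Build the parameters for s3.put_object_legal_hold. Placing the hold uses
# status "ON"; the specific users who may later remove it call the same API
# with "OFF", which requires the s3:PutObjectLegalHold IAM permission.

def legal_hold_request(bucket, key, status):
    """Parameters for placing ('ON') or removing ('OFF') a legal hold."""
    if status not in ("ON", "OFF"):
        raise ValueError("legal hold status must be 'ON' or 'OFF'")
    return {
        "Bucket": bucket,
        "Key": key,
        "LegalHold": {"Status": status},
    }

# Example (hypothetical names):
#   s3.put_object_legal_hold(**legal_hold_request("audit-data", "tx.json", "ON"))
```

Unlike a retention period, the hold has no expiry date, which matches the "nonspecific amount of time" requirement in the question.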

NEW QUESTION 5
- (Exam Topic 1)
A company recently signed a contract with an AWS Managed Service Provider (MSP) Partner for help with an application migration initiative. A solutions architect
needs to share an Amazon Machine Image (AMI) from an existing AWS account with the MSP Partner's AWS account. The AMI is backed by Amazon Elastic
Block Store (Amazon EBS) and uses a customer managed customer master key (CMK) to encrypt EBS volume snapshots.
What is the MOST secure way for the solutions architect to share the AMI with the MSP Partner's AWS account?

A. Make the encrypted AMI and snapshots publicly available. Modify the CMK's key policy to allow the MSP Partner's AWS account to use the key.
B. Modify the launchPermission property of the AMI. Share the AMI with the MSP Partner's AWS account only. Modify the CMK's key policy to allow the MSP Partner's AWS account to use the key.
C. Modify the launchPermission property of the AMI. Share the AMI with the MSP Partner's AWS account only. Modify the CMK's key policy to trust a new CMK that is owned by the MSP Partner for encryption.
D. Export the AMI from the source account to an Amazon S3 bucket in the MSP Partner's AWS account. Encrypt the S3 bucket with a CMK that is owned by the MSP Partner. Copy and launch the AMI in the MSP Partner's AWS account.

Answer: B

Explanation:
Share the existing KMS key with the MSP external account because it has already been used to encrypt the AMI snapshot.
https://docs.aws.amazon.com/kms/latest/developerguide/key-policy-modifying-external-accounts.html
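The cross-account grant in the key policy can be sketched as a policy statement like the one below. The account ID is a placeholder, and the exact action list is an assumption based on the linked KMS guidance (decrypting shared encrypted snapshots typically needs Decrypt, DescribeKey, and CreateGrant); verify it against the current documentation before use.

```python
import json

# Sketch of the key policy statement that lets the MSP Partner's account
# (hypothetical ID below) use the customer managed key that encrypted the
# AMI's EBS snapshots.

def cross_account_key_statement(external_account_id):
    return {
        "Sid": "AllowUseOfTheKeyByMSPPartner",
        "Effect": "Allow",
        "Principal": {"AWS": f"arn:aws:iam::{external_account_id}:root"},
        "Action": [
            "kms:Decrypt",
            "kms:DescribeKey",
            "kms:CreateGrant",  # assumed: lets EC2 in the target account launch from the AMI
        ],
        "Resource": "*",
    }

policy_json = json.dumps({
    "Version": "2012-10-17",
    "Statement": [cross_account_key_statement("111122223333")],
})
```

Sharing the existing key this way avoids re-encrypting or exporting the AMI, which is why answer B is the most secure option.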

NEW QUESTION 6
- (Exam Topic 1)
A company wants to improve its ability to clone large amounts of production data into a test environment in the same AWS Region. The data is stored in Amazon
EC2 instances on Amazon Elastic Block Store (Amazon EBS) volumes. Modifications to the cloned data must not affect the production environment. The software
that accesses this data requires consistently high I/O performance.
A solutions architect needs to minimize the time that is required to clone the production data into the test environment.
Which solution will meet these requirements?

A. Take EBS snapshots of the production EBS volumes. Restore the snapshots onto EC2 instance store volumes in the test environment.
B. Configure the production EBS volumes to use the EBS Multi-Attach feature. Take EBS snapshots of the production EBS volumes. Attach the production EBS volumes to the EC2 instances in the test environment.
C. Take EBS snapshots of the production EBS volumes. Create and initialize new EBS volumes. Attach the new EBS volumes to EC2 instances in the test environment before restoring the volumes from the production EBS snapshots.
D. Take EBS snapshots of the production EBS volumes. Turn on the EBS fast snapshot restore feature on the EBS snapshots. Restore the snapshots into new EBS volumes. Attach the new EBS volumes to EC2 instances in the test environment.

Answer: D

Explanation:
EBS fast snapshot restore eliminates the initialization latency of volumes created from snapshots, so the cloned volumes deliver consistently high I/O performance immediately, and because the test environment uses new volumes, modifications cannot affect production.

NEW QUESTION 7
- (Exam Topic 1)
A company wants to reduce the cost of its existing three-tier web architecture. The web, application, and database servers are running on Amazon EC2 instances
for the development, test, and production environments. The EC2 instances average 30% CPU utilization during peak hours and 10% CPU utilization during non-
peak hours.
The production EC2 instances run 24 hours a day. The development and test EC2 instances run for at least 8 hours each day. The company plans to implement
automation to stop the development and test EC2 instances when they are not in use.
Which EC2 instance purchasing solution will meet the company's requirements MOST cost-effectively?

A. Use Spot Instances for the production EC2 instances. Use Reserved Instances for the development and test EC2 instances.
B. Use Reserved Instances for the production EC2 instances. Use On-Demand Instances for the development and test EC2 instances.
C. Use Spot blocks for the production EC2 instances. Use Reserved Instances for the development and test EC2 instances.
D. Use On-Demand Instances for the production EC2 instances. Use Spot blocks for the development and test EC2 instances.

Answer: B

NEW QUESTION 8
- (Exam Topic 1)
A company is designing an application where users upload small files into Amazon S3. After a user uploads a file, the file requires one-time simple processing to
transform the data and save the data in JSON format for later analysis.
Each file must be processed as quickly as possible after it is uploaded. Demand will vary. On some days, users will upload a high number of files. On other days,
users will upload a few files or no files.
Which solution meets these requirements with the LEAST operational overhead?

A. Configure Amazon EMR to read text files from Amazon S3. Run processing scripts to transform the data. Store the resulting JSON file in an Amazon Aurora DB cluster.
B. Configure Amazon S3 to send an event notification to an Amazon Simple Queue Service (Amazon SQS) queue. Use Amazon EC2 instances to read from the queue and process the data. Store the resulting JSON file in Amazon DynamoDB.
C. Configure Amazon S3 to send an event notification to an Amazon Simple Queue Service (Amazon SQS) queue. Use an AWS Lambda function to read from the queue and process the data. Store the resulting JSON file in Amazon DynamoDB.
D. Configure Amazon EventBridge (Amazon CloudWatch Events) to send an event to Amazon Kinesis Data Streams when a new file is uploaded. Use an AWS Lambda function to consume the event from the stream and process the data. Store the resulting JSON file in an Amazon Aurora DB cluster.

Answer: C


Explanation:
Amazon S3 sends event notifications about the bucket (for example, object created) to an SQS queue in the same Region.
The SQS queue is configured as the event source for the Lambda function and buffers the event messages.
The Lambda function polls the SQS queue for messages, scales automatically with demand, and processes the Amazon S3 event notifications according to the application's requirements, with no servers to manage.
https://docs.aws.amazon.com/prescriptive-guidance/latest/patterns/subscribe-a-lambda-function-to-event-notific
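The Lambda consumer in answer C can be sketched as below: each SQS record's body is an S3 event notification, and each uploaded object is handed to a processing step. The `transform` function is a hypothetical stand-in for the real transformation and DynamoDB write; only the event structure follows the documented S3/SQS formats.

```python
import json

# Sketch of the SQS-triggered Lambda handler: unwrap the SQS batch, then
# unwrap each S3 event notification carried in a record body.

def transform(bucket, key):
    # Placeholder for the real one-time processing: fetch the object,
    # transform it, and save the JSON result (e.g. to DynamoDB).
    return {"bucket": bucket, "key": key, "status": "processed"}

def handler(event, context=None):
    results = []
    for record in event["Records"]:           # one record per SQS message
        body = json.loads(record["body"])     # body is an S3 event notification
        for s3_record in body.get("Records", []):
            bucket = s3_record["s3"]["bucket"]["name"]
            key = s3_record["s3"]["object"]["key"]
            results.append(transform(bucket, key))
    return results
```

Because Lambda scales to zero on quiet days and fans out on busy days, this path carries the least operational overhead of the four options.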

NEW QUESTION 9
- (Exam Topic 1)
An ecommerce company wants to launch a one-deal-a-day website on AWS. Each day will feature exactly one product on sale for a period of 24 hours. The
company wants to be able to handle millions of requests each hour with millisecond latency during peak hours.
Which solution will meet these requirements with the LEAST operational overhead?

A. Use Amazon S3 to host the full website in different S3 buckets. Add Amazon CloudFront distributions. Set the S3 buckets as origins for the distributions. Store the order data in Amazon S3.
B. Deploy the full website on Amazon EC2 instances that run in Auto Scaling groups across multiple Availability Zones. Add an Application Load Balancer (ALB) to distribute the website traffic. Add another ALB for the backend APIs. Store the data in Amazon RDS for MySQL.
C. Migrate the full application to run in containers. Host the containers on Amazon Elastic Kubernetes Service (Amazon EKS). Use the Kubernetes Cluster Autoscaler to increase and decrease the number of pods to process bursts in traffic. Store the data in Amazon RDS for MySQL.
D. Use an Amazon S3 bucket to host the website's static content. Deploy an Amazon CloudFront distribution. Set the S3 bucket as the origin. Use Amazon API Gateway and AWS Lambda functions for the backend APIs. Store the data in Amazon DynamoDB.

Answer: D

NEW QUESTION 10
- (Exam Topic 1)
A company hosts an application on multiple Amazon EC2 instances. The application processes messages from an Amazon SQS queue, writes to an Amazon RDS table, and deletes the message from the queue. Occasional duplicate records are found in the RDS table. The SQS queue does not contain any duplicate messages.
What should a solutions architect do to ensure messages are being processed once only?

A. Use the CreateQueue API call to create a new queue.
B. Use the AddPermission API call to add appropriate permissions.
C. Use the ReceiveMessage API call to set an appropriate wait time.
D. Use the ChangeMessageVisibility API call to increase the visibility timeout.

Answer: D

Explanation:
The visibility timeout begins when Amazon SQS returns a message. During this time, the consumer processes and deletes the message. However, if the consumer fails before deleting the message and the system doesn't call the DeleteMessage action for that message before the visibility timeout expires, the message becomes visible to other consumers and is received again. If a message must be received only once, the consumer should delete it within the duration of the visibility timeout. https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-visibility-timeout.html
Option D therefore fits best: increasing the visibility timeout gives the consumer enough time to write to Amazon RDS and delete the message before it reappears. Option A merely creates another queue, option B only changes permissions, and option C only configures the wait time for retrieving messages.
FIFO queues are designed to never introduce duplicate messages, although a message producer might still introduce duplicates in certain scenarios: for example, if the producer sends a message, does not receive a response, and then resends the same message. Amazon SQS deduplication removes such producer duplicates within a 5-minute deduplication interval. For standard queues, you might occasionally receive a duplicate copy of a message (at-least-once delivery), so applications that use a standard queue must be designed to be idempotent (that is, they must not be affected adversely when processing the same message more than once).
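The two halves of the fix, a longer visibility timeout plus an idempotent consumer, can be sketched as below. The queue URL and timeout value are hypothetical, and the in-memory set stands in for whatever duplicate-tracking store the real consumer would use.

```python
# Sketch of answer D plus the idempotency advice above.

def visibility_request(queue_url, receipt_handle, processing_seconds):
    """Parameters for sqs.change_message_visibility: keep the message hidden
    long enough for the RDS write and the delete to finish."""
    return {
        "QueueUrl": queue_url,
        "ReceiptHandle": receipt_handle,
        "VisibilityTimeout": processing_seconds,
    }

def process_once(message_id, seen_ids, write_row):
    """Idempotent consumer: a rare duplicate delivery becomes a no-op."""
    if message_id in seen_ids:
        return False          # already processed; write nothing
    write_row(message_id)     # e.g. the INSERT into the RDS table
    seen_ids.add(message_id)
    return True
```

Running `process_once` twice with the same message ID writes only one row, which is exactly the property whose absence produced the duplicate RDS records.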

NEW QUESTION 10
- (Exam Topic 1)
A company has an application that ingests incoming messages. These messages are then quickly consumed by dozens of other applications and microservices.
The number of messages varies drastically and sometimes spikes as high as 100,000 each second. The
company wants to decouple the solution and increase scalability. Which solution meets these requirements?

A. Persist the messages to Amazon Kinesis Data Analytics. All the applications will read and process the messages.
B. Deploy the application on Amazon EC2 instances in an Auto Scaling group, which scales the number of EC2 instances based on CPU metrics.
C. Write the messages to Amazon Kinesis Data Streams with a single shard. All applications will read from the stream and process the messages.
D. Publish the messages to an Amazon Simple Notification Service (Amazon SNS) topic with one or more Amazon Simple Queue Service (Amazon SQS) subscriptions. All applications then process the messages from the queues.

Answer: D

Explanation:
https://aws.amazon.com/sqs/features/
Publishing to an SNS topic with SQS subscriptions implements the fan-out pattern: each incoming message is delivered to every subscribed queue, so the dozens of consumer applications and microservices each receive their own copy and process it at their own pace. The queues buffer spikes of up to 100,000 messages each second, decoupling producers from consumers, and both SNS and SQS scale automatically without capacity management.
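The fan-out behavior can be illustrated with a tiny in-memory model: one published message is copied into every subscribed queue. The class and queue names below are illustrative only, not an AWS API.

```python
# Minimal in-memory model of SNS fan-out to SQS subscriptions (answer D):
# publishing once delivers an independent copy to each subscribed queue.

class FanOutTopic:
    def __init__(self):
        self.queues = {}

    def subscribe(self, queue_name):
        self.queues[queue_name] = []

    def publish(self, message):
        for queue in self.queues.values():
            queue.append(message)   # each subscription gets its own copy

topic = FanOutTopic()
topic.subscribe("billing-app")
topic.subscribe("analytics-app")
topic.publish({"order_id": 1})
```

Because each consumer drains its own queue, a slow microservice never blocks the others, which is the decoupling the question asks for.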

NEW QUESTION 14
- (Exam Topic 1)
A company needs guaranteed Amazon EC2 capacity in three specific Availability Zones in a specific AWS Region for an upcoming event that will last 1 week.
What should the company do to guarantee the EC2 capacity?


A. Purchase Reserved Instances that specify the Region needed.
B. Create an On-Demand Capacity Reservation that specifies the Region needed.
C. Purchase Reserved Instances that specify the Region and three Availability Zones needed.
D. Create an On-Demand Capacity Reservation that specifies the Region and three Availability Zones needed.

Answer: D

Explanation:
https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ec2-capacity-reservations.html
Reserved Instances require a commitment to a full term (1 year or 3 years), which is not cost-effective for a 1-week event. An On-Demand Capacity Reservation can target specific Availability Zones and can be held for only the week it is needed.

NEW QUESTION 17
- (Exam Topic 1)
A company's dynamic website is hosted using on-premises servers in the United States. The company is launching its product in Europe, and it wants to optimize
site loading times for new European users. The site's backend must remain in the United States. The product is being launched in a few days, and an immediate
solution is needed.
What should the solutions architect recommend?

A. Launch an Amazon EC2 instance in us-east-1 and migrate the site to it.
B. Move the website to Amazon S3. Use cross-Region replication between Regions.
C. Use Amazon CloudFront with a custom origin pointing to the on-premises servers.
D. Use an Amazon Route 53 geo-proximity routing policy pointing to on-premises servers.

Answer: C

Explanation:
https://aws.amazon.com/pt/blogs/aws/amazon-cloudfront-support-for-custom-origins/
You can now create a CloudFront distribution using a custom origin. Each distribution can point to an S3 bucket or to a custom origin. This could be another storage service, or it could be something more interesting and more dynamic, such as an EC2 instance or even an Elastic Load Balancer.

NEW QUESTION 21
- (Exam Topic 1)
A solutions architect is developing a multiple-subnet VPC architecture. The solution will consist of six subnets in two Availability Zones. The subnets are defined as
public, private and dedicated for databases. Only the Amazon EC2 instances running in the private subnets should be able to access a database.
Which solution meets these requirements?

A. Create a new route table that excludes the route to the public subnets' CIDR blocks. Associate the route table with the database subnets.
B. Create a security group that denies ingress from the security group used by instances in the public subnets. Attach the security group to an Amazon RDS DB instance.
C. Create a security group that allows ingress from the security group used by instances in the private subnets. Attach the security group to an Amazon RDS DB instance.
D. Create a new peering connection between the public subnets and the private subnets. Create a different peering connection between the private subnets and the database subnets.

Answer: C

Explanation:
Security groups are stateful. All inbound traffic is blocked by default. If you create an inbound rule allowing traffic in, that traffic is automatically allowed back out
again. You cannot block specific IP address using Security groups (instead use Network Access Control Lists).
"You can specify allow rules, but not deny rules." "When you first create a security group, it has no inbound rules. Therefore, no inbound traffic originating from
another host to your instance is allowed until you add inbound rules to the security group." Source:
https://docs.aws.amazon.com/vpc/latest/userguide/VPC_SecurityGroups.html#VPCSecurityGroups

NEW QUESTION 23
- (Exam Topic 1)
A company is developing a two-tier web application on AWS. The company's developers have deployed the application on an Amazon EC2 instance that connects
directly to a backend Amazon RDS database. The company must not hardcode database credentials in the application. The company must also implement a
solution to automatically rotate the database credentials on a regular basis.
Which solution will meet these requirements with the LEAST operational overhead?

A. Store the database credentials in the instance metadata. Use Amazon EventBridge (Amazon CloudWatch Events) rules to run a scheduled AWS Lambda function that updates the RDS credentials and instance metadata at the same time.
B. Store the database credentials in a configuration file in an encrypted Amazon S3 bucket. Use Amazon EventBridge (Amazon CloudWatch Events) rules to run a scheduled AWS Lambda function that updates the RDS credentials and the credentials in the configuration file at the same time. Use S3 Versioning to ensure the ability to fall back to previous values.
C. Store the database credentials as a secret in AWS Secrets Manager. Turn on automatic rotation for the secret. Attach the required permission to the EC2 role to grant access to the secret.
D. Store the database credentials as encrypted parameters in AWS Systems Manager Parameter Store. Turn on automatic rotation for the encrypted parameters. Attach the required permission to the EC2 role to grant access to the encrypted parameters.

Answer: C

Explanation:
https://docs.aws.amazon.com/secretsmanager/latest/userguide/create_database_secret.html
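A sketch of how the application side of answer C would look: instead of hardcoding credentials, the app fetches the secret at connect time, so rotation needs no code changes. The secret name and JSON field layout follow the common format that Secrets Manager uses for RDS secrets, but treat them as assumptions to verify.

```python
import json

# Parse a Secrets Manager SecretString into database connection arguments.
# The field names (host/username/password) are the typical RDS secret layout.

def parse_db_secret(secret_string):
    secret = json.loads(secret_string)
    return {
        "host": secret["host"],
        "user": secret["username"],
        "password": secret["password"],
    }

# With boto3 and an instance role that grants secretsmanager:GetSecretValue
# (hypothetical secret name):
#   raw = client.get_secret_value(SecretId="prod/app/mysql")["SecretString"]
#   conn_args = parse_db_secret(raw)
```

Because the app reads the current secret value on every connection, automatic rotation swaps the password underneath it transparently, which is the "least operational overhead" property the question rewards.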


NEW QUESTION 28
- (Exam Topic 1)
A company needs the ability to analyze the log files of its proprietary application. The logs are stored in JSON format in an Amazon S3 bucket. Queries will be simple and will run on demand. A solutions architect needs to perform the analysis with minimal changes to the existing architecture.
What should the solutions architect do to meet these requirements with the LEAST amount of operational overhead?

A. Use Amazon Redshift to load all the content into one place and run the SQL queries as needed.
B. Use Amazon CloudWatch Logs to store the logs. Run SQL queries as needed from the Amazon CloudWatch console.
C. Use Amazon Athena directly with Amazon S3 to run the queries as needed.
D. Use AWS Glue to catalog the logs. Use a transient Apache Spark cluster on Amazon EMR to run the SQL queries as needed.

Answer: C

Explanation:
Amazon Athena is serverless and can query the JSON data in place in Amazon S3 by using standard SQL, so no data is moved and no infrastructure is managed.
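An on-demand Athena query can be sketched as the parameters for `start_query_execution`. The database, table, and results bucket names below are hypothetical; in practice the table would be defined over the S3 log prefix with a JSON SerDe.

```python
# Build the request for athena.start_query_execution over the JSON logs.
# All names are placeholders for illustration.

def log_query_request(day):
    sql = (
        "SELECT status, COUNT(*) AS hits "
        "FROM app_logs "
        f"WHERE log_date = DATE '{day}' "
        "GROUP BY status"
    )
    return {
        "QueryString": sql,
        "QueryExecutionContext": {"Database": "app_audit"},
        "ResultConfiguration": {"OutputLocation": "s3://query-results-bucket/"},
    }
```

Each query is billed per data scanned and requires no cluster to start or stop, which is what makes Athena the least-overhead choice here.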

NEW QUESTION 29
- (Exam Topic 1)
A development team runs monthly resource-intensive tests on its general purpose Amazon RDS for MySQL DB instance with Performance Insights enabled. The
testing lasts for 48 hours once a month and is the only process that uses the database. The team wants to reduce the cost of running the tests without reducing the
compute and memory attributes of the DB instance.
Which solution meets these requirements MOST cost-effectively?

A. Stop the DB instance when tests are complete. Restart the DB instance when required.
B. Use an Auto Scaling policy with the DB instance to automatically scale when tests are completed.
C. Create a snapshot when tests are complete. Terminate the DB instance and restore the snapshot when required.
D. Modify the DB instance to a low-capacity instance when tests are complete. Modify the DB instance again when required.

Answer: A

NEW QUESTION 31
- (Exam Topic 1)
A company runs multiple Windows workloads on AWS. The company's employees use Windows file shares that are hosted on two Amazon EC2 instances. The
file shares synchronize data between themselves and
maintain duplicate copies. The company wants a highly available and durable storage solution that preserves how users currently access the files.
What should a solutions architect do to meet these requirements?

A. Migrate all the data to Amazon S3. Set up IAM authentication for users to access files.
B. Set up an Amazon S3 File Gateway. Mount the S3 File Gateway on the existing EC2 instances.
C. Extend the file share environment to Amazon FSx for Windows File Server with a Multi-AZ configuration. Migrate all the data to FSx for Windows File Server.
D. Extend the file share environment to Amazon Elastic File System (Amazon EFS) with a Multi-AZ configuration. Migrate all the data to Amazon EFS.

Answer: C

Explanation:
Amazon FSx for Windows File Server provides fully managed SMB file shares, so users keep their existing Windows file access patterns, and a Multi-AZ configuration provides the required high availability and durability. Amazon S3 and Amazon EFS do not provide native SMB access for Windows workloads.

NEW QUESTION 32
- (Exam Topic 1)
A solutions architect is using Amazon S3 to design the storage architecture of a new digital media application. The media files must be resilient to the loss of an Availability Zone. Some files are accessed frequently, while other files are rarely accessed in an unpredictable pattern. The solutions architect must minimize the costs of storing and retrieving the media files.
Which storage option meets these requirements?

A. S3 Standard
B. S3 Intelligent-Tiering
C. S3 Standard-Infrequent Access (S3 Standard-IA)
D. S3 One Zone-Infrequent Access (S3 One Zone-IA)

Answer: B

Explanation:
S3 Intelligent-Tiering - Perfect use case when you don't know the frequency of access or irregular patterns of usage.
Amazon S3 offers a range of storage classes designed for different use cases. These include S3 Standard for general-purpose storage of frequently accessed
data; S3 Intelligent-Tiering for data with unknown or changing access patterns; S3 Standard-Infrequent Access (S3 Standard-IA) and S3 One Zone-Infrequent
Access (S3 One Zone-IA) for long-lived, but less frequently accessed data; and Amazon S3 Glacier (S3 Glacier) and Amazon S3 Glacier Deep Archive (S3 Glacier
Deep Archive) for long-term archive and digital preservation. If you have data residency requirements that can’t be met by an existing AWS Region, you can use
the S3 Outposts storage class to store your S3 data on-premises. Amazon S3 also offers capabilities to manage your data throughout its lifecycle. Once an S3
Lifecycle policy is set, your data will automatically transfer to a different storage class without any changes to your application.
https://aws.amazon.com/getting-started/hands-on/getting-started-using-amazon-s3-intelligent-tiering/?nc1=h_ls
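The lifecycle policy mentioned above can be sketched as the configuration that moves new media objects into S3 Intelligent-Tiering, after which S3 manages tiering automatically. The bucket name and prefix are hypothetical placeholders.

```python
# Build an S3 lifecycle configuration that transitions objects under a prefix
# into the INTELLIGENT_TIERING storage class. Names are placeholders.

def intelligent_tiering_lifecycle(prefix, days=0):
    return {
        "Rules": [
            {
                "ID": "media-to-intelligent-tiering",
                "Status": "Enabled",
                "Filter": {"Prefix": prefix},
                "Transitions": [
                    {"Days": days, "StorageClass": "INTELLIGENT_TIERING"}
                ],
            }
        ]
    }

# Applied (with boto3 and credentials) via:
#   s3.put_bucket_lifecycle_configuration(
#       Bucket="media-bucket",
#       LifecycleConfiguration=intelligent_tiering_lifecycle("media/"))
```

Alternatively, the application could upload objects with the Intelligent-Tiering storage class directly; either way, no per-object access analysis is needed.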

NEW QUESTION 36
- (Exam Topic 1)
A company uses 50 TB of data for reporting. The company wants to move this data from on premises to AWS. A custom application in the company's data center runs a weekly data transformation job. The company plans to pause the application until the data transfer is complete and needs to begin the transfer process as soon as possible.
The data center does not have any available network bandwidth for additional workloads. A solutions architect must transfer the data and must configure the transformation job to continue to run in the AWS Cloud.
Which solution will meet these requirements with the LEAST operational overhead?

A. Use AWS DataSync to move the data. Create a custom transformation job by using AWS Glue.
B. Order an AWS Snowcone device to move the data. Deploy the transformation application to the device.
C. Order an AWS Snowball Edge Storage Optimized device. Copy the data to the device. Create a custom transformation job by using AWS Glue.
D. Order an AWS Snowball Edge Storage Optimized device that includes Amazon EC2 compute. Copy the data to the device. Create a new EC2 instance on AWS to run the transformation application.

Answer: C

NEW QUESTION 41
- (Exam Topic 1)
A company is preparing to deploy a new serverless workload. A solutions architect must use the principle of least privilege to configure permissions that will be
used to run an AWS Lambda function. An Amazon EventBridge (Amazon CloudWatch Events) rule will invoke the function.
Which solution meets these requirements?

A. Add an execution role to the function with lambda:InvokeFunction as the action and * as the principal.
B. Add an execution role to the function with lambda:InvokeFunction as the action and Service:amazonaws.com as the principal.
C. Add a resource-based policy to the function with lambda:* as the action and Service:events.amazonaws.com as the principal.
D. Add a resource-based policy to the function with lambda:InvokeFunction as the action and Service:events.amazonaws.com as the principal.

Answer: D

Explanation:
https://docs.aws.amazon.com/eventbridge/latest/userguide/resource-based-policies-eventbridge.html#lambda-pe
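Answer D can be sketched as the parameters for the Lambda `add_permission` call, which writes the resource-based policy statement that lets the EventBridge rule invoke the function. The function name and rule ARN are hypothetical.

```python
# Build the add_permission request implementing least privilege: a single
# action, the EventBridge service principal, scoped to one specific rule.

def eventbridge_invoke_permission(function_name, rule_arn):
    return {
        "FunctionName": function_name,
        "StatementId": "AllowEventBridgeInvoke",
        "Action": "lambda:InvokeFunction",    # only invoke, not lambda:*
        "Principal": "events.amazonaws.com",  # the EventBridge service principal
        "SourceArn": rule_arn,                # restrict to this one rule
    }

# Example (hypothetical ARN):
#   lambda_client.add_permission(**eventbridge_invoke_permission(
#       "nightly-job",
#       "arn:aws:events:us-east-1:123456789012:rule/nightly"))
```

Scoping with `SourceArn` means no other rule, even in the same account, can invoke the function through this statement.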

NEW QUESTION 44
- (Exam Topic 1)
A company needs to keep user transaction data in an Amazon DynamoDB table. The company must retain the data for 7 years.
What is the MOST operationally efficient solution that meets these requirements?

A. Use DynamoDB point-in-time recovery to back up the table continuously.


B. Use AWS Backup to create backup schedules and retention policies for the table.
C. Create an on-demand backup of the table by using the DynamoDB console. Store the backup in an Amazon S3 bucket. Set an S3 Lifecycle configuration for the S3 bucket.
D. Create an Amazon EventBridge (Amazon CloudWatch Events) rule to invoke an AWS Lambda function. Configure the Lambda function to back up the table and to store the backup in an Amazon S3 bucket. Set an S3 Lifecycle configuration for the S3 bucket.

Answer: B

Explanation:
AWS Backup natively supports Amazon DynamoDB and lets you define backup schedules and retention policies (up to many years) without writing or operating any custom tooling, which makes it the most operationally efficient option. On-demand console backups are manual and cannot be stored directly in an S3 bucket that the user manages.
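The backup schedule and retention policy from option B could be expressed as an AWS Backup plan rule. The plan, rule, and vault names below are hypothetical, and 2555 days approximates 7 years (7 × 365):

```python
# Sketch of an AWS Backup plan rule for a 7-year retention policy (option B).
# Plan, rule, and vault names are hypothetical; 2555 days ~= 7 x 365 days.
backup_plan = {
    "BackupPlanName": "dynamodb-7-year-retention",
    "Rules": [
        {
            "RuleName": "daily-backup",
            "TargetBackupVaultName": "transactions-vault",
            "ScheduleExpression": "cron(0 5 * * ? *)",  # daily at 05:00 UTC
            "Lifecycle": {"DeleteAfterDays": 7 * 365},
        }
    ],
}

retention_days = backup_plan["Rules"][0]["Lifecycle"]["DeleteAfterDays"]
print(f"Backups retained for {retention_days} days")
```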

NEW QUESTION 48
- (Exam Topic 1)
A company has an Amazon S3 bucket that contains critical data. The company must protect the data from accidental deletion.
Which combination of steps should a solutions architect take to meet these requirements? (Choose two.)

A. Enable versioning on the S3 bucket.


B. Enable MFA Delete on the S3 bucket.
C. Create a bucket policy on the S3 bucket.
D. Enable default encryption on the S3 bucket.
E. Create a lifecycle policy for the objects in the S3 bucket.

Answer: AB

NEW QUESTION 52
- (Exam Topic 1)
A company has more than 5 TB of file data on Windows file servers that run on premises. Users and applications interact with the data each day.
The company is moving its Windows workloads to AWS. As the company continues this process, the company requires access to AWS and on-premises file
storage with minimum latency. The company needs a solution that minimizes operational overhead and requires no significant changes to the existing file access
patterns. The company uses an AWS Site-to-Site VPN connection for connectivity to AWS.
What should a solutions architect do to meet these requirements?

A. Deploy and configure Amazon FSx for Windows File Server on AWS. Move the on-premises file data to FSx for Windows File Server. Reconfigure the workloads to use FSx for Windows File Server on AWS.
B. Deploy and configure an Amazon S3 File Gateway on premises. Move the on-premises file data to the S3 File Gateway. Reconfigure the on-premises workloads and the cloud workloads to use the S3 File Gateway.
C. Deploy and configure an Amazon S3 File Gateway on premises. Move the on-premises file data to Amazon S3. Reconfigure the workloads to use either Amazon S3 directly or the S3 File Gateway, depending on each workload's location.
D. Deploy and configure Amazon FSx for Windows File Server on AWS. Deploy and configure an Amazon FSx File Gateway on premises. Move the on-premises file data to the FSx File Gateway. Configure the cloud workloads to use FSx for Windows File Server on AWS. Configure the on-premises workloads to use the FSx File Gateway.

Passing Certification Exams Made Easy visit - https://www.2PassEasy.com


Welcome to download the Newest 2passeasy AWS-Solution-Architect-Associate dumps
https://www.2passeasy.com/dumps/AWS-Solution-Architect-Associate/ (348 New Questions)

Answer: D

NEW QUESTION 53
- (Exam Topic 1)
A company needs to review its AWS Cloud deployment to ensure that its Amazon S3 buckets do not have unauthorized configuration changes.
What should a solutions architect do to accomplish this goal?

A. Turn on AWS Config with the appropriate rules.


B. Turn on AWS Trusted Advisor with the appropriate checks.
C. Turn on Amazon Inspector with the appropriate assessment template.
D. Turn on Amazon S3 server access logging. Configure Amazon EventBridge (Amazon CloudWatch Events).

Answer: A

Explanation:
AWS Config continuously records configuration changes to S3 buckets and can evaluate them against managed rules, which directly detects unauthorized configuration changes. S3 server access logging captures object access requests, not configuration changes.

NEW QUESTION 57
- (Exam Topic 1)
A company is migrating a distributed application to AWS. The application serves variable workloads. The legacy platform consists of a primary server that
coordinates jobs across multiple compute nodes. The company wants to modernize the application with a solution that maximizes resiliency and scalability.
How should a solutions architect design the architecture to meet these requirements?

A. Configure an Amazon Simple Queue Service (Amazon SQS) queue as a destination for the jobs. Implement the compute nodes with Amazon EC2 instances that are managed in an Auto Scaling group. Configure EC2 Auto Scaling to use scheduled scaling.
B. Configure an Amazon Simple Queue Service (Amazon SQS) queue as a destination for the jobs. Implement the compute nodes with Amazon EC2 instances that are managed in an Auto Scaling group. Configure EC2 Auto Scaling based on the size of the queue.
C. Implement the primary server and the compute nodes with Amazon EC2 instances that are managed in an Auto Scaling group. Configure AWS CloudTrail as a destination for the jobs. Configure EC2 Auto Scaling based on the load on the primary server.
D. Implement the primary server and the compute nodes with Amazon EC2 instances that are managed in an Auto Scaling group. Configure Amazon EventBridge (Amazon CloudWatch Events) as a destination for the jobs. Configure EC2 Auto Scaling based on the load on the compute nodes.

Answer: B
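Scaling on queue size, as answer B describes, is usually implemented as a backlog-per-instance calculation: divide the queue depth by the number of queued jobs one node can handle within the latency target. A minimal sketch, where the backlog-per-instance value and the group size limits are assumed tuning values:

```python
import math

def desired_capacity(queue_depth: int, backlog_per_instance: int,
                     min_size: int = 1, max_size: int = 20) -> int:
    """Return the Auto Scaling group size needed to drain the SQS backlog.

    backlog_per_instance is the number of queued jobs one compute node can
    process within the latency target -- a tuning value, not an AWS default.
    The result is clamped to the group's min/max size.
    """
    needed = math.ceil(queue_depth / backlog_per_instance)
    return max(min_size, min(max_size, needed))

print(desired_capacity(queue_depth=950, backlog_per_instance=100))  # -> 10
```

In practice, a scheduled job publishes the queue depth (or backlog per instance) as a custom CloudWatch metric, and a target tracking policy on that metric adjusts the group.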

NEW QUESTION 58
- (Exam Topic 1)
A company is building an application in the AWS Cloud. The application will store data in Amazon S3 buckets in two AWS Regions. The company must use an
AWS Key Management Service (AWS KMS) customer managed key to encrypt all data that is stored in the S3 buckets. The data in both S3 buckets must be
encrypted and decrypted with the same KMS key. The data and the key must be stored in each of the two Regions.
Which solution will meet these requirements with the LEAST operational overhead?

A. Create an S3 bucket in each Region. Configure the S3 buckets to use server-side encryption with Amazon S3 managed encryption keys (SSE-S3). Configure replication between the S3 buckets.
B. Create a customer managed multi-Region KMS key. Create an S3 bucket in each Region. Configure replication between the S3 buckets. Configure the application to use the KMS key with client-side encryption.
C. Create a customer managed KMS key and an S3 bucket in each Region. Configure the S3 buckets to use server-side encryption with Amazon S3 managed encryption keys (SSE-S3). Configure replication between the S3 buckets.
D. Create a customer managed KMS key and an S3 bucket in each Region. Configure the S3 buckets to use server-side encryption with AWS KMS keys (SSE-KMS). Configure replication between the S3 buckets.

Answer: B

Explanation:
A multi-Region KMS key is a set of interoperable KMS keys that share the same key material and key ID across different AWS Regions, so data encrypted in one Region can be decrypted with the related key in the other Region. This satisfies the requirement to use the same customer managed key in both Regions with the least operational overhead.
https://docs.aws.amazon.com/kms/latest/developerguide/multi-region-keys-overview.html
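Creating the multi-Region key from answer B comes down to one flag at key-creation time. A sketch of the parameters that would be passed to the KMS CreateKey and ReplicateKey APIs; the description and Region name are illustrative:

```python
# Parameters for creating a customer managed multi-Region KMS key (answer B).
# The primary key is created with MultiRegion=True, then replicated to the
# second Region so both Regions hold the same key material and key ID.
create_key_params = {
    "Description": "Multi-Region key for cross-Region S3 encryption",
    "KeySpec": "SYMMETRIC_DEFAULT",
    "KeyUsage": "ENCRYPT_DECRYPT",
    "MultiRegion": True,  # distinguishes this from a single-Region key
}

# A subsequent ReplicateKey call (with the primary key's KeyId) creates the
# related replica key in the second Region; the Region name is illustrative.
replicate_key_params = {"ReplicaRegion": "eu-west-1"}

print(create_key_params["MultiRegion"])
```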

NEW QUESTION 63
- (Exam Topic 1)
A bicycle sharing company is developing a multi-tier architecture to track the location of its bicycles during peak operating hours. The company wants to use these
data points in its existing analytics platform. A solutions architect must determine the most viable multi-tier option to support this architecture. The data points must
be accessible from the REST API.
Which action meets these requirements for storing and retrieving location data?

A. Use Amazon Athena with Amazon S3


B. Use Amazon API Gateway with AWS Lambda
C. Use Amazon QuickSight with Amazon Redshift.
D. Use Amazon API Gateway with Amazon Kinesis Data Analytics

Answer: D

Explanation:
https://aws.amazon.com/solutions/implementations/aws-streaming-data-solution-for-amazon-kinesis/


NEW QUESTION 65
- (Exam Topic 1)
A company's website uses an Amazon EC2 instance store for its catalog of items. The company wants to make sure that the catalog is highly available and that
the catalog is stored in a durable location.
What should a solutions architect do to meet these requirements?

A. Move the catalog to Amazon ElastiCache for Redis.


B. Deploy a larger EC2 instance with a larger instance store.
C. Move the catalog from the instance store to Amazon S3 Glacier Deep Archive.
D. Move the catalog to an Amazon Elastic File System (Amazon EFS) file system.

Answer: D

Explanation:
Instance store volumes are ephemeral, and their data is lost when the instance stops or fails. Amazon EFS provides durable, highly available shared file storage across multiple Availability Zones. Amazon ElastiCache for Redis is an in-memory cache rather than a durable primary store, and S3 Glacier Deep Archive does not provide immediate access.

NEW QUESTION 69
- (Exam Topic 1)
A company has thousands of edge devices that collectively generate 1 TB of status alerts each day. Each alert is approximately 2 KB in size. A solutions architect
needs to implement a solution to ingest and store the alerts for future analysis.
The company wants a highly available solution. However, the company needs to minimize costs and does not want to manage additional infrastructure.
Additionally, the company wants to keep 14 days of data available for immediate analysis and archive any data older than 14 days.
What is the MOST operationally efficient solution that meets these requirements?

A. Create an Amazon Kinesis Data Firehose delivery stream to ingest the alerts. Configure the Kinesis Data Firehose stream to deliver the alerts to an Amazon S3 bucket. Set up an S3 Lifecycle configuration to transition data to Amazon S3 Glacier after 14 days.
B. Launch Amazon EC2 instances across two Availability Zones and place them behind an Elastic Load Balancer to ingest the alerts. Create a script on the EC2 instances that will store the alerts in an Amazon S3 bucket. Set up an S3 Lifecycle configuration to transition data to Amazon S3 Glacier after 14 days.
C. Create an Amazon Kinesis Data Firehose delivery stream to ingest the alerts. Configure the Kinesis Data Firehose stream to deliver the alerts to an Amazon Elasticsearch Service (Amazon ES) cluster. Set up the Amazon ES cluster to take manual snapshots every day and delete data from the cluster that is older than 14 days.
D. Create an Amazon Simple Queue Service (Amazon SQS) standard queue to ingest the alerts and set the message retention period to 14 days. Configure consumers to poll the SQS queue, check the age of each message, and analyze the message data as needed. If a message is 14 days old, the consumer should copy the message to an Amazon S3 bucket and delete the message from the SQS queue.

Answer: A

Explanation:
https://aws.amazon.com/kinesis/data-firehose/features/?nc=sn&loc=2#:~:text=into%20Amazon%20S3%2C%20
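The S3 Lifecycle configuration from answer A could look like the following. The rule ID and key prefix are hypothetical placeholders:

```python
# S3 Lifecycle rule that archives Firehose-delivered alert objects to
# S3 Glacier once they are 14 days old (answer A). The rule ID and prefix
# are hypothetical placeholders.
lifecycle_configuration = {
    "Rules": [
        {
            "ID": "archive-alerts-after-14-days",
            "Filter": {"Prefix": "alerts/"},
            "Status": "Enabled",
            "Transitions": [
                {"Days": 14, "StorageClass": "GLACIER"},
            ],
        }
    ]
}

print(lifecycle_configuration["Rules"][0]["Transitions"][0])
```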

NEW QUESTION 70
- (Exam Topic 2)
A large media company hosts a web application on AWS. The company wants to start caching confidential media files so that users around the world will have
reliable access to the files. The content is stored in Amazon S3 buckets. The company must deliver the content quickly, regardless of where the requests originate
geographically.
Which solution will meet these requirements?

A. Use AWS DataSync to connect the S3 buckets to the web application.


B. Deploy AWS Global Accelerator to connect the S3 buckets to the web application.
C. Deploy Amazon CloudFront to connect the S3 buckets to CloudFront edge servers.
D. Use Amazon Simple Queue Service (Amazon SQS) to connect the S3 buckets to the web application.

Answer: C

Explanation:
CloudFront caches content at edge locations close to users, while AWS Global Accelerator does not cache: it proxies every request back to the application.
https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/private-content-restricting-access-to-s3

NEW QUESTION 73
- (Exam Topic 2)
A company is migrating an application from on-premises servers to Amazon EC2 instances. As part of the migration design requirements, a solutions architect
must implement infrastructure metric alarms. The company does not need to take action if CPU utilization increases to more than 50% for a short burst of time.
However, if the CPU utilization increases to more than 50% and read IOPS on the disk are high at the same time, the company needs to act as soon as possible.
The solutions architect also must reduce false alarms.
What should the solutions architect do to meet these requirements?

A. Create Amazon CloudWatch composite alarms where possible.


B. Create Amazon CloudWatch dashboards to visualize the metrics and react to issues quickly.
C. Create Amazon CloudWatch Synthetics canaries to monitor the application and raise an alarm.
D. Create single Amazon CloudWatch metric alarms with multiple metric thresholds where possible.

Answer: A
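A composite alarm, as in answer A, combines existing metric alarms with a boolean rule, so the alarm fires only when both conditions hold at once. The alarm names below are hypothetical:

```python
# CloudWatch composite alarm rule (answer A): trigger only when both the CPU
# alarm and the read-IOPS alarm are in ALARM state, which suppresses false
# alarms caused by short CPU bursts alone. Alarm names are hypothetical.
cpu_alarm = "high-cpu-above-50"
iops_alarm = "high-read-iops"

composite_alarm = {
    "AlarmName": "cpu-and-read-iops",
    "AlarmRule": f"ALARM({cpu_alarm}) AND ALARM({iops_alarm})",
}

print(composite_alarm["AlarmRule"])
```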

NEW QUESTION 78
- (Exam Topic 2)
A company uses a popular content management system (CMS) for its corporate website. However, the required patching and maintenance are burdensome. The
company is redesigning its website and wants a new solution. The website will be updated four times a year and does not need to have any dynamic content
available. The solution must provide high scalability and enhanced security.
Which combination of changes will meet these requirements with the LEAST operational overhead? (Choose two.)

A. Deploy an AWS WAF web ACL in front of the website to provide HTTPS functionality
B. Create and deploy an AWS Lambda function to manage and serve the website content


C. Create the new website and an Amazon S3 bucket. Deploy the website on the S3 bucket with static website hosting enabled.
D. Create the new website. Deploy the website by using an Auto Scaling group of Amazon EC2 instances behind an Application Load Balancer.

Answer: AD

NEW QUESTION 79
- (Exam Topic 2)
A company wants to run applications in containers in the AWS Cloud. These applications are stateless and can tolerate disruptions within the underlying
infrastructure. The company needs a solution that minimizes cost and operational overhead.
What should a solutions architect do to meet these requirements?

A. Use Spot Instances in an Amazon EC2 Auto Scaling group to run the application containers.
B. Use Spot Instances in an Amazon Elastic Kubernetes Service (Amazon EKS) managed node group.
C. Use On-Demand Instances in an Amazon EC2 Auto Scaling group to run the application containers.
D. Use On-Demand Instances in an Amazon Elastic Kubernetes Service (Amazon EKS) managed node group.

Answer: A

Explanation:
https://aws.amazon.com/cn/blogs/compute/cost-optimization-and-resilience-eks-with-spot-instances/

NEW QUESTION 82
- (Exam Topic 2)
A company has a highly dynamic batch processing job that uses many Amazon EC2 instances to complete it. The job is stateless in nature, can be started and
stopped at any given time with no negative impact, and typically takes upwards of 60 minutes total to complete. The company has asked a solutions architect to
design a scalable and cost-effective solution that meets the requirements of the job.
What should the solutions architect recommend?

A. Implement EC2 Spot Instances


B. Purchase EC2 Reserved Instances
C. Implement EC2 On-Demand Instances
D. Implement the processing on AWS Lambda

Answer: A

NEW QUESTION 84
- (Exam Topic 2)
A company wants to build a scalable key management infrastructure to support developers who need to encrypt data in their applications.
What should a solutions architect do to reduce the operational burden?

A. Use multifactor authentication (MFA) to protect the encryption keys.


B. Use AWS Key Management Service (AWS KMS) to protect the encryption keys
C. Use AWS Certificate Manager (ACM) to create, store, and assign the encryption keys
D. Use an IAM policy to limit the scope of users who have access permissions to protect the encryption keys

Answer: B

Explanation:
https://aws.amazon.com/kms/faqs/#:~:text=If%20you%20are%20a%20developer%20who%20needs%20to%20d

NEW QUESTION 85
- (Exam Topic 2)
A company is running a multi-tier web application on premises. The web application is containerized and runs on a number of Linux hosts connected to a
PostgreSQL database that contains user records. The operational overhead of maintaining the infrastructure and capacity planning is limiting the company's
growth. A solutions architect must improve the application's infrastructure.
Which combination of actions should the solutions architect take to accomplish this? (Choose two.)

A. Migrate the PostgreSQL database to Amazon Aurora


B. Migrate the web application to be hosted on Amazon EC2 instances.
C. Set up an Amazon CloudFront distribution for the web application content.
D. Set up Amazon ElastiCache between the web application and the PostgreSQL database.
E. Migrate the web application to be hosted on AWS Fargate with Amazon Elastic Container Service (Amazon ECS).

Answer: AE

NEW QUESTION 89
- (Exam Topic 2)
Organizers for a global event want to put daily reports online as static HTML pages. The pages are expected to generate millions of views from users around the
world. The files are stored in an Amazon S3 bucket. A solutions architect has been asked to design an efficient and effective solution.
Which action should the solutions architect take to accomplish this?

A. Generate presigned URLs for the files.


B. Use cross-Region replication to all Regions.
C. Use the geoproximity feature of Amazon Route 53.
D. Use Amazon CloudFront with the S3 bucket as its origin.

Answer: D


NEW QUESTION 93
- (Exam Topic 2)
An application runs on Amazon EC2 instances across multiple Availability Zones. The instances run in an Amazon EC2 Auto Scaling group behind an Application
Load Balancer. The application performs best when the CPU utilization of the EC2 instances is at or near 40%.
What should a solutions architect do to maintain the desired performance across all instances in the group?

A. Use a simple scaling policy to dynamically scale the Auto Scaling group
B. Use a target tracking policy to dynamically scale the Auto Scaling group
C. Use an AWS Lambda function to update the desired Auto Scaling group capacity.
D. Use scheduled scaling actions to scale up and scale down the Auto Scaling group

Answer: B

Explanation:
https://docs.aws.amazon.com/autoscaling/application/userguide/application-auto-scaling-target-tracking.html
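The target tracking policy from answer B is essentially a one-line configuration against the Auto Scaling group's average CPU metric; EC2 Auto Scaling then adds or removes instances to hold the metric near the target:

```python
# Target tracking scaling policy configuration (answer B): keep the Auto
# Scaling group's average CPU utilization near the desired 40% target.
target_tracking_configuration = {
    "PredefinedMetricSpecification": {
        "PredefinedMetricType": "ASGAverageCPUUtilization"
    },
    "TargetValue": 40.0,
}

print(target_tracking_configuration["TargetValue"])
```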

NEW QUESTION 95
- (Exam Topic 2)
A company wants to migrate its existing on-premises monolithic application to AWS.
The company wants to keep as much of the front-end code and the backend code as possible. However, the company wants to break the application into smaller
applications. A different team will manage each application. The company needs a highly scalable solution that minimizes operational overhead.
Which solution will meet these requirements?

A. Host the application on AWS Lambda. Integrate the application with Amazon API Gateway.
B. Host the application with AWS Amplify. Connect the application to an Amazon API Gateway API that is integrated with AWS Lambda.
C. Host the application on Amazon EC2 instances. Set up an Application Load Balancer with EC2 instances in an Auto Scaling group as targets.
D. Host the application on Amazon Elastic Container Service (Amazon ECS). Set up an Application Load Balancer with Amazon ECS as the target.

Answer: D

Explanation:
https://aws.amazon.com/blogs/compute/microservice-delivery-with-amazon-ecs-and-application-load-balancers/

NEW QUESTION 100


- (Exam Topic 2)
A reporting team receives files each day in an Amazon S3 bucket. The reporting team manually reviews and copies the files from this initial S3 bucket to an
analysis S3 bucket each day at the same time to use with Amazon QuickSight. Additional teams are starting to send more files in larger sizes to the initial S3
bucket.
The reporting team wants to move the files automatically to the analysis S3 bucket as the files enter the initial S3 bucket. The reporting team also wants to use AWS
Lambda functions to run pattern-matching code on the copied data. In addition, the reporting team wants to send the data files to a pipeline in Amazon SageMaker
Pipelines.
What should a solutions architect do to meet these requirements with the LEAST operational overhead?

A. Create a Lambda function to copy the files to the analysis S3 bucket. Create an S3 event notification for the analysis S3 bucket. Configure Lambda and SageMaker Pipelines as destinations of the event notification. Configure s3:ObjectCreated:Put as the event type.
B. Create a Lambda function to copy the files to the analysis S3 bucket. Configure the analysis S3 bucket to send event notifications to Amazon EventBridge (Amazon CloudWatch Events). Configure an ObjectCreated rule in EventBridge (CloudWatch Events). Configure Lambda and SageMaker Pipelines as targets for the rule.
C. Configure S3 replication between the S3 buckets. Create an S3 event notification for the analysis S3 bucket. Configure Lambda and SageMaker Pipelines as destinations of the event notification. Configure s3:ObjectCreated:Put as the event type.
D. Configure S3 replication between the S3 buckets. Configure the analysis S3 bucket to send event notifications to Amazon EventBridge (Amazon CloudWatch Events). Configure an ObjectCreated rule in EventBridge (CloudWatch Events). Configure Lambda and SageMaker Pipelines as targets for the rule.

Answer: A
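The event notification from answer A would be configured on the analysis bucket roughly as follows. The configuration ID and the Lambda ARN are hypothetical placeholders; only the Lambda destination is sketched here:

```python
# S3 event notification (answer A): on each s3:ObjectCreated:Put event in the
# analysis bucket, invoke the pattern-matching Lambda function. The ID and
# ARN are hypothetical placeholders.
notification_configuration = {
    "LambdaFunctionConfigurations": [
        {
            "Id": "run-pattern-matching",
            "LambdaFunctionArn": (
                "arn:aws:lambda:us-east-1:123456789012:function:pattern-match"
            ),
            "Events": ["s3:ObjectCreated:Put"],
        }
    ]
}

print(notification_configuration["LambdaFunctionConfigurations"][0]["Events"])
```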

NEW QUESTION 104


- (Exam Topic 2)
A company runs an Oracle database on premises. As part of the company’s migration to AWS, the company wants to upgrade the database to the most recent
available version. The company also wants to set up disaster recovery (DR) for the database. The company needs to minimize the operational overhead for normal
operations and DR setup. The company also needs to maintain access to the database's underlying operating system.
Which solution will meet these requirements?

A. Migrate the Oracle database to an Amazon EC2 instance. Set up database replication to a different AWS Region.
B. Migrate the Oracle database to Amazon RDS for Oracle. Activate Cross-Region automated backups to replicate the snapshots to another AWS Region.
C. Migrate the Oracle database to Amazon RDS Custom for Oracle. Create a read replica for the database in another AWS Region.
D. Migrate the Oracle database to Amazon RDS for Oracle. Create a standby database in another Availability Zone.

Answer: C


Explanation:
https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/rds-custom.html and https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/working-with-
custom-oracle.html

NEW QUESTION 107


- (Exam Topic 2)
A company has a service that produces event data. The company wants to use AWS to process the event data as it is received. The data is written in a specific
order that must be maintained throughout processing. The company wants to implement a solution that minimizes operational overhead.
How should a solutions architect accomplish this?

A. Create an Amazon Simple Queue Service (Amazon SQS) FIFO queue to hold messages. Set up an AWS Lambda function to process messages from the queue.
B. Create an Amazon Simple Notification Service (Amazon SNS) topic to deliver notifications containing payloads to process. Configure an AWS Lambda function as a subscriber.
C. Create an Amazon Simple Queue Service (Amazon SQS) standard queue to hold messages. Set up an AWS Lambda function to process messages from the queue independently.
D. Create an Amazon Simple Notification Service (Amazon SNS) topic to deliver notifications containing payloads to process. Configure an Amazon Simple Queue Service (Amazon SQS) queue as a subscriber.

Answer: A

Explanation:
FIFO (First-In-First-Out) queues are designed to enhance messaging between applications when the order of operations and events is critical, or where duplicates
can't be tolerated. Examples include making sure that user-entered commands are run in the right order, displaying the correct product price by sending price
modifications in the right order, and preventing a student from enrolling in a course before registering for an account.
https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/FIFO-queues.html
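The guarantee behind answer A can be simulated in a few lines: within one message group, a FIFO queue delivers messages in the order they were sent, and content-based deduplication drops repeats (in real SQS, within a 5-minute window). This is a toy model, not an SQS client:

```python
def fifo_deliver(messages):
    """Simulate SQS FIFO semantics: per-group ordering plus deduplication.

    `messages` is a list of (group_id, body) tuples in send order. Duplicate
    bodies within a group are dropped, mirroring content-based deduplication
    (real SQS applies this within a 5-minute window).
    """
    seen = set()
    delivered = []
    for group_id, body in messages:
        key = (group_id, body)
        if key in seen:
            continue  # deduplicated
        seen.add(key)
        delivered.append((group_id, body))
    return delivered

sent = [("orders", "e1"), ("orders", "e2"), ("orders", "e2"), ("orders", "e3")]
print(fifo_deliver(sent))  # -> [('orders', 'e1'), ('orders', 'e2'), ('orders', 'e3')]
```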

NEW QUESTION 108


- (Exam Topic 2)
A new employee has joined a company as a deployment engineer. The deployment engineer will be using AWS CloudFormation templates to create multiple AWS
resources. A solutions architect wants the deployment engineer to perform job activities while following the principle of least privilege.
Which steps should the solutions architect do in conjunction to reach this goal? (Select two.)

A. Have the deployment engineer use AWS account root user credentials for performing AWS CloudFormation stack operations.
B. Create a new IAM user for the deployment engineer and add the IAM user to a group that has the PowerUsers IAM policy attached.
C. Create a new IAM user for the deployment engineer and add the IAM user to a group that has the AdministratorAccess IAM policy attached.
D. Create a new IAM user for the deployment engineer and add the IAM user to a group that has an IAM policy that allows AWS CloudFormation actions only.
E. Create an IAM role for the deployment engineer to explicitly define the permissions specific to the AWS CloudFormation stack and launch stacks using that IAM role.

Answer: DE

Explanation:
https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles.html https://docs.aws.amazon.com/IAM/latest/UserGuide/id_users.html
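A least-privilege policy along the lines of answer D might look like the following sketch. Note that in practice the stacks also need permissions for the resources the templates create (via the role from answer E or a CloudFormation service role); this covers only the stack operations themselves:

```python
import json

# IAM policy allowing only CloudFormation actions (answer D). Scoping the
# Resource element to specific stack ARNs would tighten this further.
cloudformation_only_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "cloudformation:*",
            "Resource": "*",
        }
    ],
}

print(json.dumps(cloudformation_only_policy, indent=2))
```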

NEW QUESTION 113


- (Exam Topic 2)
A company runs an application using Amazon ECS. The application creates resized versions of an original image and then makes Amazon S3 API calls to store the
resized images in Amazon S3.
How can a solutions architect ensure that the application has permission to access Amazon S3?

A. Update the S3 role in AWS IAM to allow read/write access from Amazon ECS, and then relaunch the container.
B. Create an IAM role with S3 permissions, and then specify that role as the taskRoleArn in the task definition.
C. Create a security group that allows access from Amazon ECS to Amazon S3, and update the launch configuration used by the ECS cluster.
D. Create an IAM user with S3 permissions, and then relaunch the Amazon EC2 instances for the ECS cluster while logged in as this account.

Answer: B
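In the task definition, the role from answer B is referenced through taskRoleArn. The family name, ARNs, and image below are hypothetical placeholders:

```python
# Fragment of an ECS task definition (answer B). taskRoleArn is the IAM role
# the application containers assume for their S3 calls; executionRoleArn, by
# contrast, is what the ECS agent uses to pull images and write logs.
# All names and ARNs here are hypothetical.
task_definition = {
    "family": "image-resizer",
    "taskRoleArn": "arn:aws:iam::123456789012:role/image-resizer-s3-access",
    "containerDefinitions": [
        {
            "name": "resizer",
            "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/resizer:latest",
        }
    ],
}

print(task_definition["taskRoleArn"])
```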

NEW QUESTION 114


- (Exam Topic 2)
A company sells ringtones created from clips of popular songs. The files containing the ringtones are stored in Amazon S3 Standard and are at least 128 KB in
size. The company has millions of files, but downloads are infrequent for ringtones older than 90 days. The company needs to save money on storage while
keeping the most accessed files readily available for its users.
Which action should the company take to meet these requirements MOST cost-effectively?

A. Configure S3 Standard-Infrequent Access (S3 Standard-IA) storage for the initial storage tier of the objects.
B. Move the files to S3 Intelligent-Tiering and configure it to move objects to a less expensive storage tier after 90 days.
C. Configure S3 inventory to manage objects and move them to S3 Standard-Infrequent Access (S3 Standard-IA) after 90 days.
D. Implement an S3 Lifecycle policy that moves the objects from S3 Standard to S3 Standard-Infrequent Access (S3 Standard-IA) after 90 days.

Answer: D
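Answer D's lifecycle policy is a single transition rule. The rule ID is hypothetical; the empty filter applies the rule to the whole bucket:

```python
# S3 Lifecycle rule (answer D): move ringtone objects from S3 Standard to
# S3 Standard-IA once they are 90 days old. The rule ID is hypothetical.
# (Standard-IA has a 128 KB minimum billable object size, which the files
# in this scenario meet.)
lifecycle_configuration = {
    "Rules": [
        {
            "ID": "ringtones-to-standard-ia",
            "Filter": {},
            "Status": "Enabled",
            "Transitions": [
                {"Days": 90, "StorageClass": "STANDARD_IA"},
            ],
        }
    ]
}

print(lifecycle_configuration["Rules"][0]["Transitions"][0])
```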

NEW QUESTION 115


- (Exam Topic 3)
A company wants to deploy a new public web application on AWS. The application includes a web server tier that uses Amazon EC2 instances. The application also
includes a database tier that uses an Amazon RDS for MySQL DB instance.
The application must be secure and accessible for global customers that have dynamic IP addresses.
How should a solutions architect configure the security groups to meet these requirements?


A. Configure the security group for the web servers to allow inbound traffic on port 443 from 0.0.0.0/0. Configure the security group for the DB instance to allow inbound traffic on port 3306 from the security group of the web servers.
B. Configure the security group for the web servers to allow inbound traffic on port 443 from the IP addresses of the customers. Configure the security group for the DB instance to allow inbound traffic on port 3306 from the security group of the web servers.
C. Configure the security group for the web servers to allow inbound traffic on port 443 from the IP addresses of the customers. Configure the security group for the DB instance to allow inbound traffic on port 3306 from the IP addresses of the customers.
D. Configure the security group for the web servers to allow inbound traffic on port 443 from 0.0.0.0/0. Configure the security group for the DB instance to allow inbound traffic on port 3306 from 0.0.0.0/0.

Answer: A
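Answer A's two security groups can be sketched as ingress rules. The key point is that the DB tier references the web tier's security group rather than any IP range; the group ID below is a hypothetical placeholder:

```python
# Security group ingress rules for answer A. The group ID is a hypothetical
# placeholder for the web tier's security group.
web_sg_ingress = {
    "IpProtocol": "tcp",
    "FromPort": 443,
    "ToPort": 443,
    "IpRanges": [{"CidrIp": "0.0.0.0/0"}],  # customers have dynamic IPs
}

db_sg_ingress = {
    "IpProtocol": "tcp",
    "FromPort": 3306,
    "ToPort": 3306,
    # Reference the web tier's security group instead of IP addresses.
    "UserIdGroupPairs": [{"GroupId": "sg-0123456789abcdef0"}],
}

print(db_sg_ingress["UserIdGroupPairs"][0]["GroupId"])
```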

NEW QUESTION 116


- (Exam Topic 3)
A company manages its own Amazon EC2 instances that run MySQL databases. The company is manually managing replication and scaling as demand
increases or decreases. The company needs a new solution that simplifies the process of adding or removing compute capacity to or from its database tier as
needed. The solution also must offer improved performance, scaling, and durability with minimal effort from operations.
Which solution meets these requirements?

A. Migrate the databases to Amazon Aurora Serverless for Aurora MySQL.


B. Migrate the databases to Amazon Aurora Serverless for Aurora PostgreSQL.
C. Combine the databases into one larger MySQL database. Run the larger database on larger EC2 instances.
D. Create an EC2 Auto Scaling group for the database tier. Migrate the existing databases to the new environment.

Answer: A

Explanation:
https://aws.amazon.com/rds/aurora/serverless/

NEW QUESTION 121


- (Exam Topic 3)
A company needs to export its database once a day to Amazon S3 for other teams to access. The exported object size varies between 2 GB and 5 GB. The S3
access pattern for the data is variable and changes rapidly. The data must be immediately available and must remain accessible for up to 3 months. The company
needs the most cost-effective solution that will not increase retrieval time.
Which S3 storage class should the company use to meet these requirements?

A. S3 Intelligent-Tiering
B. S3 Glacier Instant Retrieval
C. S3 Standard
D. S3 Standard-Infrequent Access (S3 Standard-IA)

Answer: D

Explanation:
S3 Intelligent-Tiering automatically moves objects between access tiers as access patterns change, but it adds a per-object monitoring and automation charge, and its optional archive tiers introduce retrieval delays. S3 Standard-Infrequent Access (S3 Standard-IA) offers the same low-latency, high-throughput retrieval as S3 Standard at a lower storage price, so the daily exports remain immediately available for the full 3 months while keeping storage costs down.

NEW QUESTION 123


- (Exam Topic 3)
A company runs a web application on Amazon EC2 instances in multiple Availability Zones. The EC2 instances are in private subnets. A solutions architect
implements an internet-facing Application Load Balancer (ALB) and specifies the EC2 instances as the target group. However, the internet traffic is not reaching
the EC2 instances.
How should the solutions architect reconfigure the architecture to resolve this issue?

A. Replace the ALB with a Network Load Balancer. Configure a NAT gateway in a public subnet to allow internet traffic.
B. Move the EC2 instances to public subnets. Add a rule to the EC2 instances' security groups to allow outbound traffic to 0.0.0.0/0.
C. Update the route tables for the EC2 instances' subnets to send 0.0.0.0/0 traffic through the internet gateway route. Add a rule to the EC2 instances' security groups to allow outbound traffic to 0.0.0.0/0.
D. Create public subnets in each Availability Zone. Associate the public subnets with the ALB. Update the route tables for the public subnets with a route to the private subnets.

Answer: D

Explanation:
https://aws.amazon.com/premiumsupport/knowledge-center/public-load-balancer-private-ec2/

NEW QUESTION 127


- (Exam Topic 3)
A company is implementing new data retention policies for all databases that run on Amazon RDS DB instances. The company must retain daily backups for a
minimum period of 2 years. The backups must be consistent and restorable.
Which solution should a solutions architect recommend to meet these requirements?

Passing Certification Exams Made Easy visit - https://www.2PassEasy.com


Welcome to download the Newest 2passeasy AWS-Solution-Architect-Associate dumps
https://www.2passeasy.com/dumps/AWS-Solution-Architect-Associate/ (348 New Questions)

A. Create a backup vault in AWS Backup to retain RDS backups. Create a new backup plan with a daily schedule and an expiration period of 2 years after creation. Assign the RDS DB instances to the backup plan.
B. Configure a backup window for the RDS DB instances for daily snapshots. Assign a snapshot retention policy of 2 years to each RDS DB instance. Use Amazon Data Lifecycle Manager (Amazon DLM) to schedule snapshot deletions.
C. Configure database transaction logs to be automatically backed up to Amazon CloudWatch Logs with an expiration period of 2 years.
D. Configure an AWS Database Migration Service (AWS DMS) replication task. Deploy a replication instance, and configure a change data capture (CDC) task to stream database changes to Amazon S3 as the target. Configure S3 Lifecycle policies to delete the snapshots after 2 years.

Answer: A

NEW QUESTION 131


- (Exam Topic 3)
A company is designing a cloud communications platform that is driven by APIs. The application is hosted on Amazon EC2 instances behind a Network Load
Balancer (NLB). The company uses Amazon API Gateway to provide external users with access to the application through APIs. The company wants to protect
the platform against web exploits like SQL injection and also wants to detect and mitigate large, sophisticated DDoS attacks.
Which combination of solutions provides the MOST protection? (Select TWO.)

A. Use AWS WAF to protect the NLB.


B. Use AWS Shield Advanced with the NLB.
C. Use AWS WAF to protect Amazon API Gateway.
D. Use Amazon GuardDuty with AWS Shield Standard.
E. Use AWS Shield Standard with Amazon API Gateway.

Answer: BC

Explanation:
AWS Shield Advanced provides expanded DDoS attack protection for your Amazon EC2 instances, Elastic Load Balancing load balancers, CloudFront
distributions, Route 53 hosted zones, and AWS Global Accelerator standard accelerators.
AWS WAF is a web application firewall that lets you monitor the HTTP and HTTPS requests that are forwarded to your protected web application resources. You
can protect the following resource types:
Amazon CloudFront distribution
Amazon API Gateway REST API
Application Load Balancer
AWS AppSync GraphQL API
Amazon Cognito user pool
https://docs.aws.amazon.com/waf/latest/developerguide/what-is-aws-waf.html

NEW QUESTION 133


- (Exam Topic 3)
A company recently deployed a new auditing system to centralize information about operating system versions, patching, and installed software for Amazon EC2 instances. A solutions architect must ensure all instances provisioned through EC2 Auto Scaling groups successfully send reports to the auditing system as soon as they are launched and terminated.
Which solution achieves these goals MOST efficiently?

A. Use a scheduled AWS Lambda function and run a script remotely on all EC2 instances to send data to the audit system.
B. Use EC2 Auto Scaling lifecycle hooks to run a custom script to send data to the audit system when instances are launched and terminated
C. Use an EC2 Auto Scaling launch configuration to run a custom script through user data to send data to the audit system when instances are launched and
terminated
D. Run a custom script on the instance operating system to send data to the audit system Configure the script to be invoked by the EC2 Auto Scaling group when
the instance starts and is terminated

Answer: B
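The lifecycle-hook pattern can be sketched in Python. This is a minimal sketch, not the dump's own code: the audit-report field names and the report shape are assumptions, while the event field names are the ones EC2 Auto Scaling emits for lifecycle action events, and the completion parameters match boto3's autoscaling complete_lifecycle_action call.

```python
def build_audit_report(detail):
    """Shape the payload a hypothetical audit system expects, from the
    `detail` field of an EC2 Auto Scaling lifecycle EventBridge event."""
    transition = detail["LifecycleTransition"]  # e.g. autoscaling:EC2_INSTANCE_LAUNCHING
    return {
        "instance_id": detail["EC2InstanceId"],
        "event": "launched" if transition.endswith("LAUNCHING") else "terminated",
        "asg": detail["AutoScalingGroupName"],
    }

def completion_params(detail, result="CONTINUE"):
    """Arguments for boto3 autoscaling.complete_lifecycle_action, which the
    hook handler must call so the hook does not wait out its full timeout."""
    return {
        "LifecycleHookName": detail["LifecycleHookName"],
        "AutoScalingGroupName": detail["AutoScalingGroupName"],
        "LifecycleActionToken": detail["LifecycleActionToken"],
        "LifecycleActionResult": result,
    }
```

A Lambda function subscribed to the lifecycle events would send the report, then call complete_lifecycle_action with these parameters.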

NEW QUESTION 136


- (Exam Topic 3)
A company has an API that receives real-time data from a fleet of monitoring devices. The API stores this data in an Amazon RDS DB instance for later analysis.
The amount of data that the monitoring devices send to the API fluctuates. During periods of heavy traffic, the API often returns timeout errors.
After an inspection of the logs, the company determines that the database is not capable of processing the volume of write traffic that comes from the API. A
solutions architect must minimize the number of
connections to the database and must ensure that data is not lost during periods of heavy traffic. Which solution will meet these requirements?

A. Increase the size of the DB instance to an instance type that has more available memory.
B. Modify the DB instance to be a Multi-AZ DB instance. Configure the application to write to all active RDS DB instances.
C. Modify the API to write incoming data to an Amazon Simple Queue Service (Amazon SQS) queue. Use an AWS Lambda function that Amazon SQS invokes to write data from the queue to the database.
D. Modify the API to write incoming data to an Amazon Simple Notification Service (Amazon SNS) topic. Use an AWS Lambda function that Amazon SNS invokes to write data from the topic to the database.

Answer: C

Explanation:
Using Amazon SQS will help minimize the number of connections to the database, as the API will write data to a queue instead of directly to the database.
Additionally, using an AWS Lambda function that Amazon SQS invokes to write data from the queue to the database will help ensure that data is not lost during
periods of heavy traffic, as the queue will serve as a buffer between the API and the database.
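The SQS-to-database half of this pattern can be sketched as an SQS-triggered Lambda consumer. This is a minimal sketch, assuming one JSON reading per SQS message; the real database write (e.g. via PyMySQL over a single reused connection) is stood in for by a pluggable `write_row` callable, and the return value uses SQS's partial batch response so only failed messages are redelivered.

```python
import json

def consume_sqs_batch(event, context=None, write_row=None):
    """Lambda handler sketch for an SQS event source: decode each record
    body, hand it to the database writer, and report any failed message
    IDs back to SQS (partial batch response)."""
    failures = []
    for record in event.get("Records", []):
        try:
            reading = json.loads(record["body"])
            write_row(reading)  # real code: INSERT over a reused DB connection
        except Exception:
            failures.append({"itemIdentifier": record["messageId"]})
    return {"batchItemFailures": failures}
```

With this shape, a burst from the API only lengthens the queue; the database sees at most one connection per concurrent Lambda invocation.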

NEW QUESTION 137


- (Exam Topic 3)
A solutions architect has created two IAM policies: Policy1 and Policy2. Both policies are attached to an IAM group.


A cloud engineer is added as an IAM user to the IAM group. Which action will the cloud engineer be able to perform?

A. Deleting IAM users


B. Deleting directories
C. Deleting Amazon EC2 instances
D. Deleting logs from Amazon CloudWatch Logs

Answer: C

Explanation:
https://awscli.amazonaws.com/v2/documentation/api/latest/reference/ds/index.html

NEW QUESTION 138


- (Exam Topic 3)
A company wants to migrate an Oracle database to AWS. The database consists of a single table that contains millions of geographic information systems (GIS)
images that are high resolution and are identified by a geographic code.
When a natural disaster occurs, tens of thousands of images get updated every few minutes. Each geographic code has a single image or row that is associated with it. The company wants a solution that is highly available and scalable during such events.
Which solution meets these requirements MOST cost-effectively?

A. Store the images and geographic codes in a database table Use Oracle running on an Amazon RDS Multi-AZ DB instance
B. Store the images in Amazon S3 buckets Use Amazon DynamoDB with the geographic code as the key and the image S3 URL as the value
C. Store the images and geographic codes in an Amazon DynamoDB table Configure DynamoDB Accelerator (DAX) during times of high load
D. Store the images in Amazon S3 buckets Store geographic codes and image S3 URLs in a database table Use Oracle running on an Amazon RDS Multi-AZ DB
instance.

Answer: B

Explanation:
Storing the high-resolution images in Amazon S3 and keeping only the geographic code and the image's S3 URL in DynamoDB scales to tens of thousands of updates per minute, is highly available, and avoids the cost of storing large binary objects in a relational database.

NEW QUESTION 140


- (Exam Topic 3)
A company hosts a three-tier ecommerce application on a fleet of Amazon EC2 instances. The instances run in an Auto Scaling group behind an Application Load Balancer (ALB). All ecommerce data is stored in an Amazon RDS for MariaDB Multi-AZ DB instance.
The company wants to optimize customer session management during transactions. The application must store session data durably.
Which solutions will meet these requirements? (Select TWO.)

A. Turn on the sticky sessions feature (session affinity) on the ALB


B. Use an Amazon DynamoDB table to store customer session information
C. Deploy an Amazon Cognito user pool to manage user session information


D. Deploy an Amazon ElastiCache for Redis cluster to store customer session information
E. Use AWS Systems Manager Application Manager in the application to manage user session information

Answer: AB

NEW QUESTION 144


- (Exam Topic 3)
A company runs a containerized application on a Kubernetes cluster in an on-premises data center. The company is using a MongoDB database for data storage.
The company wants to migrate some of these environments to AWS, but no code changes or deployment method changes are possible at this time. The company
needs a solution that minimizes operational overhead.
Which solution meets these requirements?

A. Use Amazon Elastic Container Service (Amazon ECS) with Amazon EC2 worker nodes for compute and MongoDB on EC2 for data storage.
B. Use Amazon Elastic Container Service (Amazon ECS) with AWS Fargate for compute and Amazon DynamoDB for data storage.
C. Use Amazon Elastic Kubernetes Service (Amazon EKS) with Amazon EC2 worker nodes for compute and Amazon DynamoDB for data storage.
D. Use Amazon Elastic Kubernetes Service (Amazon EKS) with AWS Fargate for compute and Amazon DocumentDB (with MongoDB compatibility) for data
storage.

Answer: D

Explanation:
Amazon DocumentDB (with MongoDB compatibility) is a fast, reliable, and fully managed database service. Amazon DocumentDB makes it easy to set up,
operate, and scale MongoDB-compatible databases in the cloud. With Amazon DocumentDB, you can run the same application code and use the same drivers
and tools that you use with MongoDB.
https://docs.aws.amazon.com/documentdb/latest/developerguide/what-is.html

NEW QUESTION 147


- (Exam Topic 3)
A company is using Amazon CloudFront with its website. The company has enabled logging on the CloudFront distribution, and logs are saved in one of the company's Amazon S3 buckets. The company needs to perform advanced analyses on the logs and build visualizations.
What should a solutions architect do to meet these requirements?

A. Use standard SQL queries in Amazon Athena to analyze the CloudFront logs in the S3 bucket. Visualize the results with AWS Glue.
B. Use standard SQL queries in Amazon Athena to analyze the CloudFront logs in the S3 bucket. Visualize the results with Amazon QuickSight.
C. Use standard SQL queries in Amazon DynamoDB to analyze the CloudFront logs in the S3 bucket. Visualize the results with AWS Glue.
D. Use standard SQL queries in Amazon DynamoDB to analyze the CloudFront logs in the S3 bucket. Visualize the results with Amazon QuickSight.

Answer: B

Explanation:
Amazon Athena runs standard SQL directly against the CloudFront logs in Amazon S3, and Amazon QuickSight connects to Athena to build visualizations. DynamoDB is not a SQL query engine, and AWS Glue is an ETL service, not a visualization tool.
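Querying the CloudFront logs with Athena can be sketched as follows. This is a minimal sketch: the Athena database name, result bucket, and table name `cloudfront_logs` are placeholder assumptions (the column names follow AWS's documented CloudFront-log DDL for Athena), and the parameter dict matches boto3's athena start_query_execution call.

```python
def athena_query_params(sql, database, output_s3):
    """Arguments for boto3 athena.start_query_execution."""
    return {
        "QueryString": sql,
        "QueryExecutionContext": {"Database": database},
        "ResultConfiguration": {"OutputLocation": output_s3},
    }

# Top URIs by request count over the CloudFront standard-log table.
SQL = """
SELECT uri, COUNT(*) AS requests
FROM cloudfront_logs
GROUP BY uri
ORDER BY requests DESC
LIMIT 10
"""
```

The query results land in the S3 output location, where QuickSight (via its Athena connector) can pick them up for dashboards.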

NEW QUESTION 152


- (Exam Topic 3)
A company wants to configure its Amazon CloudFront distribution to use SSL/TLS certificates. The company does not want to use the default domain name for the distribution. Instead, the company wants to use a different domain name for the distribution.
Which solution will deploy the certificate without incurring any additional costs?

A. Request an Amazon-issued private certificate from AWS Certificate Manager (ACM) in the us-east-1 Region.
B. Request an Amazon-issued private certificate from AWS Certificate Manager (ACM) in the us-west-1 Region.
C. Request an Amazon-issued public certificate from AWS Certificate Manager (ACM) in the us-east-1 Region.
D. Request an Amazon-issued public certificate from AWS Certificate Manager (ACM) in the us-west-1 Region.

Answer: C

Explanation:
CloudFront requires the ACM certificate for an alternate domain name to be in the us-east-1 Region, and Amazon-issued public certificates from ACM are free. A private certificate requires AWS Private CA, which incurs additional costs.

NEW QUESTION 156


- (Exam Topic 3)
A company is designing a shared storage solution for a gaming application that is hosted in the AWS Cloud. The company needs the ability to use SMB clients to access data. The solution must be fully managed.
Which AWS solution meets these requirements?

A. Create an AWS DataSync task that shares the data as a mountable file system Mount the file system to the application server
B. Create an Amazon EC2 Windows instance Install and configure a Windows file share role on the instance Connect the application server to the file share
C. Create an Amazon FSx for Windows File Server file system Attach the file system to the origin server Connect the application server to the file system
D. Create an Amazon S3 bucket Assign an IAM role to the application to grant access to the S3 bucket Mount the S3 bucket to the application server

Answer: C

NEW QUESTION 157


- (Exam Topic 3)
An online retail company has more than 50 million active customers and receives more than 25,000 orders each day. The company collects purchase data for
customers and stores this data in Amazon S3. Additional customer data is stored in Amazon RDS.
The company wants to make all the data available to various teams so that the teams can perform analytics. The solution must provide the ability to manage fine-
grained permissions for the data and must minimize operational overhead.
Which solution will meet these requirements?

A. Migrate the purchase data to write directly to Amazon RDS. Use RDS access controls to limit access.
B. Schedule an AWS Lambda function to periodically copy data from Amazon RDS to Amazon S3. Create an AWS Glue crawler. Use Amazon Athena to query the data. Use S3 policies to limit access.
C. Create a data lake by using AWS Lake Formation. Create an AWS Glue JDBC connection to Amazon RDS. Register the S3 bucket in Lake Formation. Use Lake Formation access controls to limit access.
D. Create an Amazon Redshift cluster. Schedule an AWS Lambda function to periodically copy data from Amazon S3 and Amazon RDS to Amazon Redshift. Use Amazon Redshift access controls to limit access.

Answer: C

Explanation:
https://aws.amazon.com/blogs/big-data/manage-fine-grained-access-control-using-aws-lake-formation/

NEW QUESTION 162


- (Exam Topic 3)
A company has deployed a database in Amazon RDS for MySQL. Due to increased transactions, the database support team is reporting slow reads against the
DB instance and recommends adding a read replica.
Which combination of actions should a solutions architect take before implementing this change? (Choose two.)

A. Enable binlog replication on the RDS primary node.


B. Choose a failover priority for the source DB instance.
C. Allow long-running transactions to complete on the source DB instance.
D. Create a global table and specify the AWS Regions where the table will be available.
E. Enable automatic backups on the source instance by setting the backup retention period to a value other than 0.

Answer: CE

Explanation:
"An active, long-running transaction can slow the process of creating the read replica. We recommend that you wait for long-running transactions to complete
before creating a read replica. If you create multiple read replicas in parallel from the same source DB instance, Amazon RDS takes only one snapshot at the start
of the first create action. When creating a read replica, there are a few things to consider. First, you must enable automatic backups on the source DB instance by
setting the backup retention period to a value other than 0. This requirement also applies to a read replica that is the source DB instance for another read replica"
https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_ReadRepl.html
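The two preparation steps can be sketched as parameter sets for boto3's rds calls. This is a minimal sketch: the instance identifiers and 7-day retention are placeholder assumptions; the parameter names match modify_db_instance and create_db_instance_read_replica.

```python
def enable_backups_params(db_id, retention_days=7):
    """Arguments for boto3 rds.modify_db_instance: a non-zero
    BackupRetentionPeriod is the prerequisite for creating read replicas."""
    assert retention_days > 0, "a retention of 0 disables backups and blocks replicas"
    return {
        "DBInstanceIdentifier": db_id,
        "BackupRetentionPeriod": retention_days,
        "ApplyImmediately": True,
    }

def replica_params(source_id, replica_id):
    """Arguments for boto3 rds.create_db_instance_read_replica, issued only
    after long-running transactions on the source have completed."""
    return {
        "SourceDBInstanceIdentifier": source_id,
        "DBInstanceIdentifier": replica_id,
    }
```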

NEW QUESTION 165


- (Exam Topic 3)
A company has a web application with sporadic usage patterns. There is heavy usage at the beginning of each month, moderate usage at the start of each week, and unpredictable usage during the week. The application consists of a web server and a MySQL database server running inside the data center. The company would like to move the application to the AWS Cloud and needs to select a cost-effective database platform that will not require database modifications.
Which solution will meet these requirements?

A. Amazon DynamoDB
B. Amazon RDS for MySQL
C. MySQL-compatible Amazon Aurora Serverless
D. MySQL deployed on Amazon EC2 in an Auto Scaling group

Answer: C

Explanation:
MySQL-compatible Amazon Aurora Serverless requires no database modifications and automatically scales capacity up and down with the sporadic, unpredictable usage pattern, so the company pays only for the capacity it consumes.

NEW QUESTION 168


- (Exam Topic 3)
A company is running a multi-tier ecommerce web application in the AWS Cloud. The application runs on Amazon EC2 instances with an Amazon RDS for MySQL Multi-AZ DB instance. Amazon RDS is configured with the latest generation DB instance with 2,000 GB of storage in a General Purpose SSD (gp3) Amazon Elastic Block Store (Amazon EBS) volume. The database performance affects the application during periods of high demand.
A database administrator analyzes the logs in Amazon CloudWatch Logs and discovers that the application performance always degrades when the number of read and write IOPS is higher than 20,000.
What should a solutions architect do to improve the application performance?

A. Replace the volume with a magnetic volume.


B. Increase the number of IOPS on the gp3 volume.
C. Replace the volume with a Provisioned IOPS SSD (io2) volume.
D. Replace the 2,000 GB gp3 volume with two 1,000 GB gp3 volumes.

Answer: C

NEW QUESTION 173


- (Exam Topic 3)
A solutions architect observes that a nightly batch processing job is automatically scaled up for 1 hour before the desired Amazon EC2 capacity is reached. The peak capacity is the same every night and the batch jobs always start at 1 AM. The solutions architect needs to find a cost-effective solution that will allow for the desired EC2 capacity to be reached quickly and allow the Auto Scaling group to scale down after the batch jobs are complete.
What should the solutions architect do to meet these requirements?

A. Increase the minimum capacity for the Auto Scaling group.


B. Increase the maximum capacity for the Auto Scaling group.
C. Configure scheduled scaling to scale up to the desired compute level.
D. Change the scaling policy to add more EC2 instances during each scaling operation.

Answer: C

Explanation:


By configuring scheduled scaling, the solutions architect can set the Auto Scaling group to automatically scale up to the desired compute level at a specific time (1 AM) when the batch job starts and then automatically scale down after the job is complete. This allows the desired EC2 capacity to be reached quickly and also helps reduce cost.
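The pair of scheduled actions can be sketched as parameter sets for boto3's autoscaling put_scheduled_update_group_action (one call per action). This is a minimal sketch: the Auto Scaling group name, action names, and the 3 AM scale-down time are assumptions; the 1 AM cron recurrence matches the scenario.

```python
def nightly_batch_schedule(asg_name, desired, scale_up="0 1 * * *", scale_down="0 3 * * *"):
    """Two scheduled actions: reach `desired` capacity at 1 AM UTC for the
    batch job, then return to zero once the job window has passed."""
    return [
        {
            "AutoScalingGroupName": asg_name,
            "ScheduledActionName": "batch-scale-up",
            "Recurrence": scale_up,   # cron expression, evaluated in UTC
            "DesiredCapacity": desired,
        },
        {
            "AutoScalingGroupName": asg_name,
            "ScheduledActionName": "batch-scale-down",
            "Recurrence": scale_down,
            "DesiredCapacity": 0,
        },
    ]
```

Because capacity is set proactively at 1 AM rather than reactively by a metric-based policy, the hour of ramp-up disappears without raising the group's steady-state minimum.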

NEW QUESTION 176


- (Exam Topic 3)
A company recently migrated its web application to AWS by rehosting the application on Amazon EC2 instances in a single AWS Region. The company wants to
redesign its application architecture to be highly available and fault tolerant. Traffic must reach all running EC2 instances randomly.
Which combination of steps should the company take to meet these requirements? (Choose two.)

A. Create an Amazon Route 53 failover routing policy.


B. Create an Amazon Route 53 weighted routing policy.
C. Create an Amazon Route 53 multivalue answer routing policy.
D. Launch three EC2 instances: two instances in one Availability Zone and one instance in another Availability Zone.
E. Launch four EC2 instances: two instances in one Availability Zone and two instances in another Availability Zone.

Answer: CE

Explanation:
https://aws.amazon.com/premiumsupport/knowledge-center/multivalue-versus-simple-policies/
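The multivalue answer records can be sketched as a ChangeBatch for boto3's route53 change_resource_record_sets. This is a minimal sketch: the record name, IPs, and TTL are placeholders; the record structure (one record per value, each with its own SetIdentifier and MultiValueAnswer set to true) is Route 53's documented shape.

```python
def multivalue_change_batch(name, ips, ttl=60):
    """ChangeBatch creating one multivalue answer A record per instance IP.
    Route 53 responds to queries with up to eight healthy values in random
    order, spreading traffic across the instances."""
    return {
        "Changes": [
            {
                "Action": "UPSERT",
                "ResourceRecordSet": {
                    "Name": name,
                    "Type": "A",
                    "SetIdentifier": f"instance-{i}",  # must be unique per record
                    "MultiValueAnswer": True,
                    "TTL": ttl,
                    "ResourceRecords": [{"Value": ip}],
                },
            }
            for i, ip in enumerate(ips)
        ]
    }
```

Pairing each record with a health check (HealthCheckId) would let Route 53 withhold the IPs of failed instances.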

NEW QUESTION 179


- (Exam Topic 3)
A company's web application consists of an Amazon API Gateway API in front of an AWS Lambda function and an Amazon DynamoDB database. The Lambda function handles the business logic, and the DynamoDB table hosts the data. The application uses Amazon Cognito user pools to identify the individual users of the application. A solutions architect needs to update the application so that only users who have a subscription can access premium content.
Which solution will meet this requirement?

A. Enable API caching and throttling on the API Gateway API


B. Set up AWS WAF on the API Gateway API Create a rule to filter users who have a subscription
C. Apply fine-grained IAM permissions to the premium content in the DynamoDB table
D. Implement API usage plans and API keys to limit the access of users who do not have a subscription.

Answer: C

NEW QUESTION 180


- (Exam Topic 3)
A company is moving its data management application to AWS. The company wants to transition to an event-driven architecture. The architecture needs to be more distributed and to use serverless concepts while performing the different aspects of the workflow. The company also wants to minimize operational overhead.
Which solution will meet these requirements?

A. Build out the workflow in AWS Glue. Use AWS Glue to invoke AWS Lambda functions to process the workflow steps.
B. Build out the workflow in AWS Step Functions. Deploy the application on Amazon EC2 instances. Use Step Functions to invoke the workflow steps on the EC2 instances.
C. Build out the workflow in Amazon EventBridge. Use EventBridge to invoke AWS Lambda functions on a schedule to process the workflow steps.
D. Build out the workflow in AWS Step Functions. Use Step Functions to create a state machine. Use the state machine to invoke AWS Lambda functions to process the workflow steps.

Answer: D

Explanation:
A Step Functions state machine that invokes Lambda functions is serverless, distributed, and event-driven with minimal operational overhead. Invoking Lambda functions on an EventBridge schedule is time-driven rather than event-driven.

NEW QUESTION 185


- (Exam Topic 3)
A company is moving its on-premises Oracle database to Amazon Aurora PostgreSQL. The database has several applications that write to the same tables. The
applications need to be migrated one by one with a month in between each migration. Management has expressed concerns that the database has a high number
of reads and writes. The data must be kept in sync across both databases throughout the migration.
What should a solutions architect recommend?

A. Use AWS DataSync for the initial migration. Use AWS Database Migration Service (AWS DMS) to create a change data capture (CDC) replication task and a table mapping to select all tables.
B. Use AWS DataSync for the initial migration. Use AWS Database Migration Service (AWS DMS) to create a full load plus change data capture (CDC) replication task and a table mapping to select all tables.
C. Use the AWS Schema Conversion Tool with AWS Database Migration Service (AWS DMS) using a memory optimized replication instance. Create a full load plus change data capture (CDC) replication task and a table mapping to select all tables.
D. Use the AWS Schema Conversion Tool with AWS Database Migration Service (AWS DMS) using a compute optimized replication instance. Create a full load plus change data capture (CDC) replication task and a table mapping to select the largest tables.

Answer: C

NEW QUESTION 188


- (Exam Topic 3)
A company recently announced the deployment of its retail website to a global audience. The website runs on multiple Amazon EC2 instances behind an Elastic
Load Balancer. The instances run in an Auto Scaling group across multiple Availability Zones.
The company wants to provide its customers with different versions of content based on the devices that the customers use to access the website.
Which combination of actions should a solutions architect take to meet these requirements? (Choose two.)

A. Configure Amazon CloudFront to cache multiple versions of the content.


B. Configure a host header in a Network Load Balancer to forward traffic to different instances.


C. Configure a Lambda@Edge function to send specific objects to users based on the User-Agent header.
D. Configure AWS Global Accelerator. Forward requests to a Network Load Balancer (NLB). Configure the NLB to set up host-based routing to different EC2 instances.
E. Configure AWS Global Accelerator. Forward requests to a Network Load Balancer (NLB). Configure the NLB to set up path-based routing to different EC2 instances.

Answer: AC

Explanation:
For C: IMPROVED USER EXPERIENCE Lambda@Edge can help improve your users' experience with your websites and web applications across the world, by
letting you personalize content for them without sacrificing performance. Real-time Image Transformation You can customize your users' experience by
transforming images on the fly based on the user characteristics. For example, you can resize images based on the viewer's device type—mobile, desktop, or
tablet. You can also cache the transformed images at CloudFront Edge locations to further improve performance when delivering images.
https://aws.amazon.com/lambda/edge/
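A User-Agent-based Lambda@Edge function can be sketched as a viewer-request handler. This is a minimal sketch: the `/mobile/` URI prefix and the crude "Mobi" substring check are assumptions (a production function would use a proper device-detection approach); the event shape is CloudFront's documented request-event structure.

```python
def rewrite_for_device(event, context=None):
    """Viewer-request Lambda@Edge sketch: rewrite the URI so mobile clients
    are served content from a /mobile/ prefix, letting CloudFront cache the
    two versions separately."""
    request = event["Records"][0]["cf"]["request"]
    headers = request.get("headers", {})
    ua = headers.get("user-agent", [{"value": ""}])[0]["value"]
    if "Mobi" in ua and not request["uri"].startswith("/mobile/"):
        request["uri"] = "/mobile" + request["uri"]
    return request
```

Returning the (possibly rewritten) request lets CloudFront continue processing it against the cache and the origin.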

NEW QUESTION 191


- (Exam Topic 3)
A company's facility has badge readers at every entrance throughout the building. When badges are scanned, the readers send a message over HTTPS to
indicate who attempted to access that particular entrance.
A solutions architect must design a system to process these messages from the sensors. The solution must be highly available, and the results must be made
available for the company's security team to analyze.
Which system architecture should the solutions architect recommend?

A. Launch an Amazon EC2 instance to serve as the HTTPS endpoint and to process the messages. Configure the EC2 instance to save the results to an Amazon S3 bucket.
B. Create an HTTPS endpoint in Amazon API Gateway. Configure the API Gateway endpoint to invoke an AWS Lambda function to process the messages and save the results to an Amazon DynamoDB table.
C. Use Amazon Route 53 to direct incoming sensor messages to an AWS Lambda function. Configure the Lambda function to process the messages and save the results to an Amazon DynamoDB table.
D. Create a gateway VPC endpoint for Amazon S3. Configure a Site-to-Site VPN connection from the facility network to the VPC so that sensor data can be written directly to an S3 bucket by way of the VPC endpoint.

Answer: B
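The API Gateway-to-Lambda-to-DynamoDB flow can be sketched in Python. This is a minimal sketch: the badge payload fields (badge_id, entrance, timestamp) and the DynamoDB item layout are assumptions; the real function would pass the item to boto3's dynamodb put_item, which is stubbed out here.

```python
import json

def process_badge_scan(event, context=None):
    """Lambda handler sketch for an API Gateway proxy integration: parse the
    badge scan from the HTTPS POST body and shape the DynamoDB item."""
    scan = json.loads(event["body"])
    item = {
        "entrance": {"S": scan["entrance"]},      # partition key
        "scanned_at": {"S": scan["timestamp"]},   # sort key
        "badge_id": {"S": scan["badge_id"]},
    }
    # dynamodb.put_item(TableName="badge-scans", Item=item) would go here
    return {"statusCode": 200, "body": json.dumps({"stored": scan["badge_id"]})}
```

API Gateway and Lambda scale across Availability Zones automatically, which is what makes this combination highly available without any servers for the security team to manage.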

NEW QUESTION 196


- (Exam Topic 3)
A company is developing a marketing communications service that targets mobile app users. The company needs to send confirmation messages with Short
Message Service (SMS) to its users. The users must be able to reply to the SMS messages. The company must store the responses for a year for analysis.
What should a solutions architect do to meet these requirements?

A. Create an Amazon Connect contact flow to send the SMS messages. Use AWS Lambda to process the responses.
B. Build an Amazon Pinpoint journey. Configure Amazon Pinpoint to send events to an Amazon Kinesis data stream for analysis and archiving.
C. Use Amazon Simple Queue Service (Amazon SQS) to distribute the SMS messages. Use AWS Lambda to process the responses.
D. Create an Amazon Simple Notification Service (Amazon SNS) FIFO topic. Subscribe an Amazon Kinesis data stream to the SNS topic for analysis and archiving.

Answer: B

Explanation:
https://aws.amazon.com/pinpoint/product-details/sms/ Two-Way Messaging: Receive SMS messages from your customers and reply back to them in a chat-like
interactive experience. With Amazon Pinpoint, you can create automatic responses when customers send you messages that contain certain keywords. You can
even use Amazon Lex to create conversational bots. A majority of mobile phone users read incoming SMS messages almost immediately after receiving them. If
you need to be able to provide your customers with urgent or important information, SMS messaging may be the right solution for you. You can use Amazon
Pinpoint to create targeted groups of customers, and then send them campaign-based messages. You can also use Amazon Pinpoint to send direct messages,
such as appointment confirmations, order updates, and one-time passwords.
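Sending the confirmation SMS through Pinpoint can be sketched as the MessageRequest for boto3's pinpoint send_messages (called together with the project's ApplicationId). This is a minimal sketch: the phone numbers are placeholders in E.164 form, and TRANSACTIONAL routing is assumed because these are confirmation messages rather than marketing blasts.

```python
def sms_request(to_number, body, origination_number):
    """MessageRequest for boto3 pinpoint.send_messages. Replies from the
    customer come back through Pinpoint's two-way SMS configuration and can
    be streamed to Kinesis for the one-year retention and analysis."""
    return {
        "Addresses": {to_number: {"ChannelType": "SMS"}},
        "MessageConfiguration": {
            "SMSMessage": {
                "Body": body,
                "MessageType": "TRANSACTIONAL",
                "OriginationNumber": origination_number,
            }
        },
    }
```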

NEW QUESTION 201


- (Exam Topic 3)
A company has implemented a self-managed DNS service on AWS. The solution consists of the following:
• Amazon EC2 instances in different AWS Regions
• Endpomts of a standard accelerator m AWS Global Accelerator
The company wants to protect the solution against DDoS attacks What should a solutions architect do to meet this requirement?

A. Subscribe to AWS Shield Advanced. Add the accelerator as a resource to protect.
B. Subscribe to AWS Shield Advanced. Add the EC2 instances as resources to protect.
C. Create an AWS WAF web ACL that includes a rate-based rule. Associate the web ACL with the accelerator.
D. Create an AWS WAF web ACL that includes a rate-based rule. Associate the web ACL with the EC2 instances.

Answer: A

Explanation:
AWS Shield Advanced can protect AWS Global Accelerator standard accelerators directly. EC2 instances can be added as protected resources only through their associated Elastic IP addresses, so adding the accelerator as the protected resource is the supported way to protect this solution.

NEW QUESTION 205


- (Exam Topic 3)
What should a solutions architect do to ensure that all objects uploaded to an Amazon S3 bucket are encrypted?

A. Update the bucket policy to deny if the PutObject does not have an s3:x-amz-acl header set.
B. Update the bucket policy to deny if the PutObject does not have an s3:x-amz-acl header set to private.


C. Update the bucket policy to deny if the PutObject does not have an aws:SecureTransport header set to true.
D. Update the bucket policy to deny if the PutObject does not have an x-amz-server-side-encryption header set.

Answer: D

Explanation:
https://aws.amazon.com/blogs/security/how-to-prevent-uploads-of-unencrypted-objects-to-amazon-s3/#:~:text=
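For reference, the bucket policy that option D describes can be sketched as follows. This is a minimal illustration using the Null condition operator; the bucket name is a placeholder.

```python
import json

BUCKET = "example-bucket"  # hypothetical bucket name


def build_deny_unencrypted_policy(bucket: str) -> dict:
    """Deny any PutObject request that lacks the
    x-amz-server-side-encryption header (option D)."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "DenyUnencryptedUploads",
                "Effect": "Deny",
                "Principal": "*",
                "Action": "s3:PutObject",
                "Resource": f"arn:aws:s3:::{bucket}/*",
                # "Null": true matches requests where the key is absent.
                "Condition": {
                    "Null": {"s3:x-amz-server-side-encryption": "true"}
                },
            }
        ],
    }


policy_json = json.dumps(build_deny_unencrypted_policy(BUCKET), indent=2)
```

The resulting JSON can be attached with `put_bucket_policy`; uploads without the encryption header are then rejected with Access Denied.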

NEW QUESTION 207


- (Exam Topic 3)
A company is migrating its on-premises workload to the AWS Cloud. The company already uses several Amazon EC2 instances and Amazon RDS DB instances.
The company wants a solution that automatically starts and stops the EC2 instances and DB instances outside of business hours. The solution must minimize cost
and infrastructure maintenance.
Which solution will meet these requirements?

A. Scale the EC2 instances by using elastic resize. Scale the DB instances to zero outside of business hours.
B. Explore AWS Marketplace for partner solutions that will automatically start and stop the EC2 instances and DB instances on a schedule.
C. Launch another EC2 instance. Configure a crontab schedule to run shell scripts that will start and stop the existing EC2 instances and DB instances on a schedule.
D. Create an AWS Lambda function that will start and stop the EC2 instances and DB instances. Configure Amazon EventBridge to invoke the Lambda function on a schedule.

Answer: D
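A sketch of the Lambda function from option D, assuming two EventBridge schedules that pass `{"action": "start"}` in the morning and `{"action": "stop"}` in the evening. The instance IDs and DB identifier are hypothetical placeholders; the scheduling decision is kept in a pure helper.

```python
# Hypothetical resource identifiers for illustration.
EC2_INSTANCE_IDS = ["i-0123456789abcdef0"]
DB_INSTANCE_ID = "app-db"


def plan_calls(action: str) -> list[tuple[str, str]]:
    """Pure helper: map the scheduled action to the AWS API calls to make."""
    if action == "start":
        return [("ec2", "start_instances"), ("rds", "start_db_instance")]
    if action == "stop":
        return [("ec2", "stop_instances"), ("rds", "stop_db_instance")]
    raise ValueError(f"unknown action: {action}")


def handler(event, context):
    """Lambda entry point (requires AWS credentials at runtime)."""
    import boto3  # available in the Lambda runtime
    for service, api in plan_calls(event["action"]):
        client = boto3.client(service)
        if service == "ec2":
            getattr(client, api)(InstanceIds=EC2_INSTANCE_IDS)
        else:
            getattr(client, api)(DBInstanceIdentifier=DB_INSTANCE_ID)
```

EventBridge cron rules (for example, one at 08:00 and one at 19:00 on weekdays) invoke the same function with different payloads, so there is no server to maintain.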

NEW QUESTION 209


- (Exam Topic 3)
A company runs a fleet of web servers using an Amazon RDS for PostgreSQL DB instance. After a routine compliance check, the company sets a standard that
requires a recovery point objective (RPO) of less than 1 second for all its production databases.
Which solution meets these requirements?

A. Enable a Multi-AZ deployment for the DB instance.
B. Enable auto scaling for the DB instance in one Availability Zone.
C. Configure the DB instance in one Availability Zone, and create multiple read replicas in a separate Availability Zone.
D. Configure the DB instance in one Availability Zone, and configure AWS Database Migration Service (AWS DMS) change data capture (CDC) tasks.

Answer: A

NEW QUESTION 210


- (Exam Topic 3)
A solutions architect is designing a two-tiered architecture that includes a public subnet and a database subnet. The web servers in the public subnet must be open
to the internet on port 443. The Amazon RDS for MySQL DB instance in the database subnet must be accessible only to the web servers on port 3306.
Which combination of steps should the solutions architect take to meet these requirements? (Select TWO.)

A. Create a network ACL for the public subnet. Add a rule to deny outbound traffic to 0.0.0.0/0 on port 3306.
B. Create a security group for the DB instance. Add a rule to allow traffic from the public subnet CIDR block on port 3306.
C. Create a security group for the web servers in the public subnet. Add a rule to allow traffic from 0.0.0.0/0 on port 443.
D. Create a security group for the DB instance. Add a rule to allow traffic from the web servers' security group on port 3306.
E. Create a security group for the DB instance. Add a rule to deny all traffic except traffic from the web servers' security group on port 3306.

Answer: CD
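The two ingress rules from options C and D can be expressed as boto3-style `authorize_security_group_ingress` parameters. The security group IDs are hypothetical placeholders; note that the database rule references the web servers' security group rather than a CIDR block.

```python
# Hypothetical security group IDs for illustration.
WEB_SG = "sg-0aaa0aaa0aaa0aaa0"
DB_SG = "sg-0bbb0bbb0bbb0bbb0"

# Option C: web servers accept HTTPS from anywhere.
web_ingress = {
    "GroupId": WEB_SG,
    "IpPermissions": [{
        "IpProtocol": "tcp", "FromPort": 443, "ToPort": 443,
        "IpRanges": [{"CidrIp": "0.0.0.0/0"}],
    }],
}

# Option D: the DB instance accepts MySQL traffic only from the
# web servers' security group.
db_ingress = {
    "GroupId": DB_SG,
    "IpPermissions": [{
        "IpProtocol": "tcp", "FromPort": 3306, "ToPort": 3306,
        "UserIdGroupPairs": [{"GroupId": WEB_SG}],
    }],
}


def apply(params: dict):
    import boto3  # illustration only; requires AWS credentials
    boto3.client("ec2").authorize_security_group_ingress(**params)
```

Security groups deny everything not explicitly allowed, so no extra deny rule (as in option E) is needed.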

NEW QUESTION 211


- (Exam Topic 3)
A company has an application that is backed by an Amazon DynamoDB table. The company's compliance requirements specify that database backups must be
taken every month, must be available for 6 months, and must be retained for 7 years.
Which solution will meet these requirements?

A. Create an AWS Backup plan to back up the DynamoDB table on the first day of each month. Specify a lifecycle policy that transitions the backup to cold storage after 6 months. Set the retention period for each backup to 7 years.
B. Create a DynamoDB on-demand backup of the DynamoDB table on the first day of each month. Transition the backup to Amazon S3 Glacier Flexible Retrieval after 6 months. Create an S3 Lifecycle policy to delete backups that are older than 7 years.
C. Use the AWS SDK to develop a script that creates an on-demand backup of the DynamoDB table. Set up an Amazon EventBridge rule that runs the script on the first day of each month. Create a second script that will run on the second day of each month to transition DynamoDB backups that are older than 6 months to cold storage and to delete backups that are older than 7 years.
D. Use the AWS CLI to create an on-demand backup of the DynamoDB table. Set up an Amazon EventBridge rule that runs the command on the first day of each month with a cron expression. Specify in the command to transition the backups to cold storage after 6 months and to delete the backups after 7 years.

Answer: A
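Option A's backup plan rule can be sketched as boto3-style `create_backup_plan` parameters. The plan and vault names are placeholders; note that AWS Backup requires the delete date to be at least 90 days after the cold-storage transition, which the 6-month/7-year combination satisfies.

```python
COLD_AFTER_DAYS = 180        # ~6 months before cold storage
DELETE_AFTER_DAYS = 7 * 365  # 7-year retention

backup_plan = {
    "BackupPlanName": "dynamodb-monthly",    # hypothetical name
    "Rules": [{
        "RuleName": "monthly-first-day",
        "TargetBackupVaultName": "Default",  # assumption
        # AWS cron fields: minute hour day-of-month month day-of-week year.
        # This fires at 05:00 UTC on the first day of every month.
        "ScheduleExpression": "cron(0 5 1 * ? *)",
        "Lifecycle": {
            "MoveToColdStorageAfterDays": COLD_AFTER_DAYS,
            "DeleteAfterDays": DELETE_AFTER_DAYS,
        },
    }],
}

# AWS Backup constraint: deletion must be >= 90 days after cold storage.
assert DELETE_AFTER_DAYS >= COLD_AFTER_DAYS + 90
```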

NEW QUESTION 216


- (Exam Topic 3)
A company needs to ingest and handle large amounts of streaming data that its application generates. The application runs on Amazon EC2 instances and
sends data to Amazon Kinesis Data Streams, which is configured with default settings. Every other day, the application consumes the data and writes the data to an
Amazon S3 bucket for business intelligence (BI) processing. The company observes that Amazon S3 is not receiving all the data that the application sends to
Kinesis Data Streams.


What should a solutions architect do to resolve this issue?

A. Update the Kinesis Data Streams default settings by modifying the data retention period.
B. Update the application to use the Kinesis Producer Library (KPL) to send the data to Kinesis Data Streams.
C. Update the number of Kinesis shards to handle the throughput of the data that is sent to Kinesis Data Streams.
D. Turn on S3 Versioning within the S3 bucket to preserve every version of every object that is ingested in the S3 bucket.

Answer: A
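The reasoning behind option A: Kinesis Data Streams keeps records for 24 hours by default, so data older than a day expires before the every-other-day consumer reads it. A sketch of the fix (the stream name is a placeholder):

```python
DEFAULT_RETENTION_HOURS = 24   # Kinesis Data Streams default
CONSUME_INTERVAL_HOURS = 48    # "every other day"


def required_retention_hours(consume_interval_hours: int) -> int:
    """Smallest retention that still covers the consumption interval."""
    return max(DEFAULT_RETENTION_HOURS, consume_interval_hours)


def extend_retention(stream_name: str, hours: int):
    import boto3  # illustration only; requires AWS credentials
    boto3.client("kinesis").increase_stream_retention_period(
        StreamName=stream_name, RetentionPeriodHours=hours
    )
```

With the default 24-hour window, half of each two-day interval is lost; raising retention to at least 48 hours lets the batch consumer see everything.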

NEW QUESTION 220


- (Exam Topic 3)
A company runs an application on Amazon EC2 Linux instances across multiple Availability Zones. The application needs a storage layer that is highly available
and Portable Operating System Interface (POSIX) compliant. The storage layer must provide maximum data durability and must be shareable across the EC2
instances. The data in the storage layer will be accessed frequently for the first 30 days and will be accessed infrequently after that time.
Which solution will meet these requirements MOST cost-effectively?

A. Use the Amazon S3 Standard storage class. Create an S3 Lifecycle policy to move infrequently accessed data to S3 Glacier.
B. Use the Amazon S3 Standard storage class. Create an S3 Lifecycle policy to move infrequently accessed data to S3 Standard-Infrequent Access (S3 Standard-IA).
C. Use the Amazon Elastic File System (Amazon EFS) Standard storage class. Create a lifecycle management policy to move infrequently accessed data to EFS Standard-Infrequent Access (EFS Standard-IA).
D. Use the Amazon Elastic File System (Amazon EFS) One Zone storage class. Create a lifecycle management policy to move infrequently accessed data to EFS One Zone-Infrequent Access (EFS One Zone-IA).

Answer: C
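The EFS lifecycle configuration from option C can be sketched as boto3-style parameters. The file system ID is a hypothetical placeholder.

```python
# Transition files not accessed for 30 days to EFS Standard-IA
# (option C). The file system ID is a placeholder.
lifecycle_request = {
    "FileSystemId": "fs-0123456789abcdef0",
    "LifecyclePolicies": [
        {"TransitionToIA": "AFTER_30_DAYS"},
    ],
}


def apply(params: dict):
    import boto3  # illustration only; requires AWS credentials
    boto3.client("efs").put_lifecycle_configuration(**params)
```

EFS gives the POSIX-compliant shared access that S3 cannot, and the Standard class (unlike One Zone) stores data redundantly across Availability Zones for maximum durability.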

NEW QUESTION 223


- (Exam Topic 3)
A solutions architect needs to design a new microservice for a company's application. Clients must be able to call an HTTPS endpoint to reach the microservice. The
microservice also must use AWS Identity and Access Management (IAM) to authenticate calls. The solutions architect will write the logic for this microservice by
using a single AWS Lambda function that is written in Go 1.x.
Which solution will deploy the function in the MOST operationally efficient way?

A. Create an Amazon API Gateway REST API. Configure the method to use the Lambda function. Enable IAM authentication on the API.
B. Create a Lambda function URL for the function. Specify AWS_IAM as the authentication type.
C. Create an Amazon CloudFront distribution. Deploy the function to Lambda@Edge. Integrate IAM authentication logic into the Lambda@Edge function.
D. Create an Amazon CloudFront distribution. Deploy the function to CloudFront Functions. Specify AWS_IAM as the authentication type.

Answer: A

NEW QUESTION 226


- (Exam Topic 3)
A company has a regional subscription-based streaming service that runs in a single AWS Region. The architecture consists of web servers and application
servers on Amazon EC2 instances. The EC2 instances are in Auto Scaling groups behind Elastic Load Balancers. The architecture includes an Amazon Aurora
database cluster that extends across multiple Availability Zones.
The company wants to expand globally and to ensure that its application has minimal downtime. Which solution will meet these requirements?

A. Extend the Auto Scaling groups for the web tier and the application tier to deploy instances in Availability Zones in a second Region. Use an Aurora global database to deploy the database in the primary Region and the second Region. Use Amazon Route 53 health checks with a failover routing policy to the second Region.
B. Deploy the web tier and the application tier to a second Region. Add an Aurora PostgreSQL cross-Region Aurora Replica in the second Region. Use Amazon Route 53 health checks with a failover routing policy to the second Region. Promote the secondary to primary as needed.
C. Deploy the web tier and the application tier to a second Region. Create an Aurora PostgreSQL database in the second Region. Use AWS Database Migration Service (AWS DMS) to replicate the primary database to the second Region. Use Amazon Route 53 health checks with a failover routing policy to the second Region.
D. Deploy the web tier and the application tier to a second Region. Use an Amazon Aurora global database to deploy the database in the primary Region and the second Region. Use Amazon Route 53 health checks with a failover routing policy to the second Region. Promote the secondary to primary as needed.

Answer: D

Explanation:
Auto Scaling groups are regional, so they cannot deploy instances into a second Region as option A describes. Deploying the web and application tiers to the second Region and using an Aurora global database with a Route 53 failover routing policy provides managed cross-Region replication and fast promotion, which keeps downtime minimal.

NEW QUESTION 228


- (Exam Topic 3)
A company is developing an ecommerce application that will consist of a load-balanced front end, a container-based application, and a relational database. A
solutions architect needs to create a highly available solution that operates with as little manual intervention as possible.
Which solutions meet these requirements? (Select TWO.)

A. Create an Amazon RDS DB instance in Multi-AZ mode.


B. Create an Amazon RDS DB instance and one or more replicas in another Availability Zone.


C. Create an Amazon EC2 instance-based Docker cluster to handle the dynamic application load.
D. Create an Amazon Elastic Container Service (Amazon ECS) cluster with a Fargate launch type to handle the dynamic application load.
E. Create an Amazon Elastic Container Service (Amazon ECS) cluster with an Amazon EC2 launch type to handle the dynamic application load.

Answer: AD

Explanation:
https://docs.aws.amazon.com/AmazonECS/latest/developerguide/Welcome.html
* 1. Relational database: RDS
* 2. Container-based applications: ECS
"Amazon ECS enables you to launch and stop your container-based applications by using simple API calls. You can also retrieve the state of your cluster from a
centralized service and have access to many familiar Amazon EC2 features."
* 3. Little manual intervention: Fargate
You can run your tasks and services on a serverless infrastructure that is managed by AWS Fargate. Alternatively, for more control over your infrastructure, you
can run your tasks and services on a cluster of Amazon EC2 instances that you manage.

NEW QUESTION 233


- (Exam Topic 3)
A company wants to use high performance computing (HPC) infrastructure on AWS for financial risk modeling. The company's HPC workloads run on Linux. Each
HPC workflow runs on hundreds of Amazon EC2 Spot Instances, is short-lived, and generates thousands of output files that are ultimately stored in persistent
storage for analytics and long-term future use.
The company seeks a cloud storage solution that permits the copying of on-premises data to long-term persistent storage to make data available for processing by
all EC2 instances. The solution should also be a high performance file system that is integrated with persistent storage to read and write datasets and output files.
Which combination of AWS services meets these requirements?

A. Amazon FSx for Lustre integrated with Amazon S3


B. Amazon FSx for Windows File Server integrated with Amazon S3
C. Amazon S3 Glacier integrated with Amazon Elastic Block Store (Amazon EBS)
D. Amazon S3 bucket with a VPC endpoint integrated with an Amazon Elastic Block Store (Amazon EBS) General Purpose SSD (gp2) volume

Answer: A

Explanation:
https://aws.amazon.com/fsx/lustre/
Amazon FSx for Lustre is a fully managed service that provides cost-effective, high-performance, scalable storage for compute workloads. Many workloads such
as machine learning, high performance computing (HPC), video rendering, and financial simulations depend on compute instances accessing the same set of data
through high-performance shared storage.

NEW QUESTION 235


- (Exam Topic 3)
A company is migrating an old application to AWS. The application runs a batch job every hour and is CPU intensive. The batch job takes 15 minutes on average
with an on-premises server. The server has 64 virtual CPUs (vCPU) and 512 GiB of memory.
Which solution will run the batch job within 15 minutes with the LEAST operational overhead?

A. Use AWS Lambda with functional scaling


B. Use Amazon Elastic Container Service (Amazon ECS) with AWS Fargate
C. Use Amazon Lightsail with AWS Auto Scaling
D. Use AWS Batch on Amazon EC2

Answer: D

Explanation:
Use AWS Batch on Amazon EC2. AWS Batch is a fully managed batch processing service that can be used to easily run batch jobs on Amazon EC2 instances. It
can scale the number of instances to match the workload, allowing the batch job to be completed in the desired time frame with minimal operational overhead.

NEW QUESTION 237


- (Exam Topic 3)
A solutions architect is designing a company's disaster recovery (DR) architecture. The company has a MySQL database that runs on an Amazon EC2 instance in
a private subnet with scheduled backups. The DR design must include multiple AWS Regions.
Which solution will meet these requirements with the LEAST operational overhead?

A. Migrate the MySQL database to multiple EC2 instances. Configure a standby EC2 instance in the DR Region. Turn on replication.
B. Migrate the MySQL database to Amazon RDS. Use a Multi-AZ deployment. Turn on read replication for the primary DB instance in the different Availability Zones.
C. Migrate the MySQL database to an Amazon Aurora global database. Host the primary DB cluster in the primary Region. Host the secondary DB cluster in the DR Region.
D. Store the scheduled backup of the MySQL database in an Amazon S3 bucket that is configured for S3 Cross-Region Replication (CRR). Use the data backup to restore the database in the DR Region.

Answer: C

NEW QUESTION 239


- (Exam Topic 3)
A company offers a food delivery service that is growing rapidly. Because of the growth, the company’s order processing system is experiencing scaling problems


during peak traffic hours. The current architecture includes the following:
• A group of Amazon EC2 instances that run in an Amazon EC2 Auto Scaling group to collect orders from the application
• Another group of EC2 instances that run in an Amazon EC2 Auto Scaling group to fulfill orders
The order collection process occurs quickly, but the order fulfillment process can take longer. Data must not be lost because of a scaling event.
A solutions architect must ensure that the order collection process and the order fulfillment process can both scale properly during peak traffic hours. The solution
must optimize utilization of the company’s AWS resources.
Which solution meets these requirements?

A. Use Amazon CloudWatch metrics to monitor the CPU of each instance in the Auto Scaling groups. Configure each Auto Scaling group's minimum capacity according to peak workload values.
B. Use Amazon CloudWatch metrics to monitor the CPU of each instance in the Auto Scaling groups. Configure a CloudWatch alarm to invoke an Amazon Simple Notification Service (Amazon SNS) topic that creates additional Auto Scaling groups on demand.
C. Provision two Amazon Simple Queue Service (Amazon SQS) queues: one for order collection and another for order fulfillment. Configure the EC2 instances to poll their respective queue. Scale the Auto Scaling groups based on notifications that the queues send.
D. Provision two Amazon Simple Queue Service (Amazon SQS) queues: one for order collection and another for order fulfillment. Configure the EC2 instances to poll their respective queue. Create a metric based on a backlog per instance calculation. Scale the Auto Scaling groups based on this metric.

Answer: D

Explanation:
The number of instances in your Auto Scaling group can be driven by how long it takes to process a message and the acceptable amount of latency (queue delay).
The solution is to use a backlog per instance metric with the target value being the acceptable backlog per instance to maintain.
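The backlog-per-instance calculation described above can be sketched as follows; the example numbers are illustrative only.

```python
import math


def backlog_per_instance(queue_depth: int, running_instances: int) -> float:
    """ApproximateNumberOfMessages divided by in-service instances."""
    return queue_depth / max(running_instances, 1)


def desired_capacity(queue_depth: int, acceptable_backlog: float) -> int:
    """Instances needed to keep the backlog per instance at the target.

    acceptable_backlog = acceptable latency (s) / time to process one message (s).
    """
    return math.ceil(queue_depth / acceptable_backlog)


# Example: 1,500 queued orders, 0.1 s to process each, 10 s acceptable
# delay -> each instance may carry a backlog of 10 / 0.1 = 100 messages,
# so the group should scale to 15 instances.
```

Publishing this value as a custom CloudWatch metric and target-tracking on it scales each group to its own workload, and because the orders wait in SQS, nothing is lost during a scaling event.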

NEW QUESTION 244


- (Exam Topic 3)
A solutions architect is creating a new VPC design. There are two public subnets for the load balancer, two private subnets for web servers, and two private subnets
for MySQL. The web servers use only HTTPS. The solutions architect has already created a security group for the load balancer allowing port 443 from 0.0.0.0/0.
Company policy requires that each resource has the least access required to still be able to perform its tasks.
Which additional configuration strategy should the solutions architect use to meet these requirements?

A. Create a security group for the web servers and allow port 443 from 0.0.0.0/0. Create a security group for the MySQL servers and allow port 3306 from the web servers' security group.
B. Create a network ACL for the web servers and allow port 443 from 0.0.0.0/0. Create a network ACL for the MySQL servers and allow port 3306 from the web servers' security group.
C. Create a security group for the web servers and allow port 443 from the load balancer. Create a security group for the MySQL servers and allow port 3306 from the web servers' security group.
D. Create a network ACL for the web servers and allow port 443 from the load balancer. Create a network ACL for the MySQL servers and allow port 3306 from the web servers' security group.

Answer: C

NEW QUESTION 248


- (Exam Topic 3)
A company stores confidential data in an Amazon Aurora PostgreSQL database in the ap-southeast-3 Region. The database is encrypted with an AWS Key
Management Service (AWS KMS) customer managed key. The company was recently acquired and must securely share a backup of the database with the
acquiring company's AWS account in ap-southeast-3.
What should a solutions architect do to meet these requirements?

A. Create a database snapshot. Copy the snapshot to a new unencrypted snapshot. Share the new snapshot with the acquiring company's AWS account.
B. Create a database snapshot. Add the acquiring company's AWS account to the KMS key policy. Share the snapshot with the acquiring company's AWS account.
C. Create a database snapshot that uses a different AWS managed KMS key. Add the acquiring company's AWS account to the KMS key alias. Share the snapshot with the acquiring company's AWS account.
D. Create a database snapshot. Download the database snapshot. Upload the database snapshot to an Amazon S3 bucket. Update the S3 bucket policy to allow access from the acquiring company's AWS account.

Answer: B

Explanation:
A snapshot that is encrypted with a customer managed KMS key cannot be copied to an unencrypted snapshot, so option A is not possible, and it would also expose confidential data. The supported approach is to share the encrypted snapshot and grant the acquiring account access to the customer managed key in the key policy.
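For reference, granting another account use of a customer managed key (as option B describes) means adding a key-policy statement like the following, then sharing the cluster snapshot. The account ID and snapshot identifier are hypothetical placeholders.

```python
ACQUIRER_ACCOUNT = "111122223333"  # hypothetical account ID

# Key-policy statement granting the acquiring account the permissions
# it needs to copy/restore a shared encrypted snapshot.
share_statement = {
    "Sid": "AllowAcquirerUseOfKey",
    "Effect": "Allow",
    "Principal": {"AWS": f"arn:aws:iam::{ACQUIRER_ACCOUNT}:root"},
    "Action": [
        "kms:Decrypt",
        "kms:DescribeKey",
        "kms:CreateGrant",
    ],
    "Resource": "*",
}


def share_snapshot(snapshot_id: str):
    import boto3  # illustration only; requires AWS credentials
    boto3.client("rds").modify_db_cluster_snapshot_attribute(
        DBClusterSnapshotIdentifier=snapshot_id,
        AttributeName="restore",
        ValuesToAdd=[ACQUIRER_ACCOUNT],
    )
```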

NEW QUESTION 249


- (Exam Topic 3)
A company is planning to store data on Amazon RDS DB instances. The company must encrypt the data at rest.
What should a solutions architect do to meet this requirement?

A. Create an encryption key and store the key in AWS Secrets Manager. Use the key to encrypt the DB instances.
B. Generate a certificate in AWS Certificate Manager (ACM). Enable SSL/TLS on the DB instances by using the certificate.
C. Create a customer master key (CMK) in AWS Key Management Service (AWS KMS). Enable encryption for the DB instances.
D. Generate a certificate in AWS Identity and Access Management (IAM). Enable SSL/TLS on the DB instances by using the certificate.

Answer: C
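Option C expressed as abridged boto3-style `create_db_instance` parameters; the identifiers and key alias are placeholders, and other required parameters are omitted. Note that encryption at rest must be chosen at creation time: it cannot be switched on for an existing unencrypted instance.

```python
# Abridged sketch: enable encryption at rest with a KMS key (option C).
# Identifiers and the key alias are hypothetical placeholders.
create_params = {
    "DBInstanceIdentifier": "app-db",      # assumption
    "DBInstanceClass": "db.t3.medium",     # assumption
    "Engine": "mysql",                     # assumption
    "AllocatedStorage": 100,
    "StorageEncrypted": True,              # the key setting for this question
    "KmsKeyId": "alias/app-db-key",        # customer managed KMS key
}


def create():
    import boto3  # illustration only; requires AWS credentials
    boto3.client("rds").create_db_instance(**create_params)
```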

NEW QUESTION 252


- (Exam Topic 3)
A company hosts its static website by using Amazon S3. The company wants to add a contact form to its webpage. The contact form will have dynamic server-side
components for users to input their name, email address, phone number, and user message. The company anticipates that there will be fewer than 100 site visits
each month.
Which solution will meet these requirements MOST cost-effectively?


A. Host a dynamic contact form page in Amazon Elastic Container Service (Amazon ECS). Set up Amazon Simple Email Service (Amazon SES) to connect to any third-party email provider.
B. Create an Amazon API Gateway endpoint with an AWS Lambda backend that makes a call to Amazon Simple Email Service (Amazon SES).
C. Convert the static webpage to dynamic by deploying Amazon Lightsail. Use client-side scripting to build the contact form. Integrate the form with Amazon WorkMail.
D. Create a t2.micro Amazon EC2 instance. Deploy a LAMP (Linux, Apache, MySQL, PHP/Perl/Python) stack to host the webpage. Use client-side scripting to build the contact form. Integrate the form with Amazon WorkMail.

Answer: B

Explanation:
Create an Amazon API Gateway endpoint with an AWS Lambda backend that makes a call to Amazon Simple Email Service (Amazon SES). With fewer than 100 site visits each month, API Gateway, Lambda, and SES are all billed per request, so this serverless form handler costs almost nothing and requires no server maintenance. Options A, C, and D each keep a continuously running compute resource (an ECS service, a Lightsail instance, or an EC2 instance) for a near-idle workload, which costs more and adds operational overhead.

NEW QUESTION 257


- (Exam Topic 3)
A solutions architect is designing a multi-tier application for a company. The application's users upload images from a mobile device. The application generates a
thumbnail of each image and returns a message to the user to confirm that the image was uploaded successfully.
The thumbnail generation can take up to 60 seconds, but the company wants to provide a faster response time to its users to notify them that the original image
was received. The solutions architect must design the application to asynchronously dispatch requests to the different application tiers.
What should the solutions architect do to meet these requirements?

A. Write a custom AWS Lambda function to generate the thumbnail and alert the user. Use the image upload process as an event source to invoke the Lambda function.
B. Create an AWS Step Functions workflow. Configure Step Functions to handle the orchestration between the application tiers and alert the user when thumbnail generation is complete.
C. Create an Amazon Simple Queue Service (Amazon SQS) message queue. As images are uploaded, place a message on the SQS queue for thumbnail generation. Alert the user through an application message that the image was received.
D. Create Amazon Simple Notification Service (Amazon SNS) notification topics and subscriptions. Use one subscription with the application to generate the thumbnail after the image upload is complete. Use a second subscription to message the user's mobile app by way of a push notification after thumbnail generation is complete.

Answer: C

NEW QUESTION 258


- (Exam Topic 3)
A company stores its data objects in Amazon S3 Standard storage. A solutions architect has found that 75% of the data is rarely accessed after 30 days. The
company needs all the data to remain immediately accessible with the same high availability and resiliency, but the company wants to minimize storage costs.
Which storage solution will meet these requirements?

A. Move the data objects to S3 Glacier Deep Archive after 30 days.
B. Move the data objects to S3 Standard-Infrequent Access (S3 Standard-IA) after 30 days.
C. Move the data objects to S3 One Zone-Infrequent Access (S3 One Zone-IA) after 30 days.
D. Move the data objects to S3 One Zone-Infrequent Access (S3 One Zone-IA) immediately.

Answer: B
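Option B's S3 Lifecycle rule can be sketched as boto3-style parameters for `put_bucket_lifecycle_configuration`; the rule ID and bucket name are placeholders. Standard-IA keeps the same multi-AZ availability and millisecond access as S3 Standard, unlike One Zone-IA or Glacier.

```python
# Transition objects to S3 Standard-IA 30 days after creation (option B).
lifecycle_configuration = {
    "Rules": [{
        "ID": "to-standard-ia-after-30-days",  # hypothetical rule ID
        "Status": "Enabled",
        "Filter": {"Prefix": ""},              # apply to the whole bucket
        "Transitions": [{"Days": 30, "StorageClass": "STANDARD_IA"}],
    }]
}


def apply(bucket: str):
    import boto3  # illustration only; requires AWS credentials
    boto3.client("s3").put_bucket_lifecycle_configuration(
        Bucket=bucket, LifecycleConfiguration=lifecycle_configuration
    )
```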

NEW QUESTION 263


- (Exam Topic 3)
A transaction processing company has weekly scripted batch jobs that run on Amazon EC2 instances. The EC2 instances are in an Auto Scaling group. The
number of transactions can vary, but the baseline CPU utilization that is noted on each run is at least 60%. The company needs to provision the capacity 30
minutes before the jobs run.
Currently, engineers complete this task by manually modifying the Auto Scaling group parameters. The company does not have the resources to analyze the
required capacity trends for the Auto Scaling group counts. The company needs an automated way to modify the Auto Scaling group's capacity.
Which solution will meet these requirements with the LEAST operational overhead?

A. Create a dynamic scaling policy for the Auto Scaling group. Configure the policy to scale based on the CPU utilization metric to 60%.
B. Create a scheduled scaling policy for the Auto Scaling group. Set the appropriate desired capacity, minimum capacity, and maximum capacity. Set the recurrence to weekly. Set the start time to 30 minutes before the batch jobs run.
C. Create a predictive scaling policy for the Auto Scaling group. Configure the policy to scale based on forecast. Set the scaling metric to CPU utilization. Set the target value for the metric to 60%. In the policy, set the instances to pre-launch 30 minutes before the jobs run.
D. Create an Amazon EventBridge event to invoke an AWS Lambda function when the CPU utilization metric value for the Auto Scaling group reaches 60%. Configure the Lambda function to increase the Auto Scaling group's desired capacity and maximum capacity by 20%.

Answer: C
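The predictive scaling policy from option C can be sketched as boto3-style `put_scaling_policy` parameters; the group and policy names are placeholders. `SchedulingBufferTime` is what launches instances 30 minutes ahead of the forecast.

```python
# Predictive scaling policy (option C). Names are hypothetical.
policy_request = {
    "AutoScalingGroupName": "batch-asg",        # assumption
    "PolicyName": "weekly-batch-predictive",    # assumption
    "PolicyType": "PredictiveScaling",
    "PredictiveScalingConfiguration": {
        "MetricSpecifications": [{
            "TargetValue": 60.0,  # keep average CPU near the 60% baseline
            "PredefinedMetricPairSpecification": {
                "PredefinedMetricType": "ASGCPUUtilization"
            },
        }],
        "Mode": "ForecastAndScale",
        "SchedulingBufferTime": 30 * 60,  # pre-launch 30 minutes early
    },
}


def create_policy():
    import boto3  # illustration only; requires AWS credentials
    boto3.client("autoscaling").put_scaling_policy(**policy_request)
```

Predictive scaling learns the weekly pattern from history, so no one has to analyze capacity trends or edit the group by hand.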

NEW QUESTION 267


......


THANKS FOR TRYING THE DEMO OF OUR PRODUCT

Visit Our Site to Purchase the Full Set of Actual AWS-Solution-Architect-Associate Exam Questions With
Answers.

We Also Provide Practice Exam Software That Simulates Real Exam Environment And Has Many Self-Assessment Features. Order the AWS-Solution-Architect-Associate Product From:

https://www.2passeasy.com/dumps/AWS-Solution-Architect-Associate/

Money Back Guarantee

AWS-Solution-Architect-Associate Practice Exam Features:

* AWS-Solution-Architect-Associate Questions and Answers Updated Frequently

* AWS-Solution-Architect-Associate Practice Questions Verified by Expert Senior Certified Staff

* AWS-Solution-Architect-Associate Most Realistic Questions that Guarantee you a Pass on Your First Try

* AWS-Solution-Architect-Associate Practice Test Questions in Multiple Choice Formats and Updates for 1 Year
