• support@dumpspool.com

SPECIAL LIMITED-TIME DISCOUNT OFFER. USE DISCOUNT CODE DP2021 TO GET 20% OFF

PDF Only


$35.00 Free Updates Up to 90 Days

  • SAA-C03 Dumps PDF
  • 879 Questions
  • Updated On January 16, 2025

PDF + Test Engine


$60.00 Free Updates Up to 90 Days

  • SAA-C03 Question Answers
  • 879 Questions
  • Updated On January 16, 2025

Test Engine


$50.00 Free Updates Up to 90 Days

  • SAA-C03 Practice Questions
  • 879 Questions
  • Updated On January 16, 2025
Check Our Free Amazon SAA-C03 Online Test Engine Demo.

How to pass Amazon SAA-C03 exam with the help of dumps?

DumpsPool provides the high-quality resources you have been searching for, so it's time to stop stressing and get ready for the exam. Our Online Test Engine gives you the guidance you need to pass the certification exam. We guarantee top-grade results because we cover each topic in a precise and understandable manner. Our expert team prepared the latest Amazon SAA-C03 Dumps to meet your training needs, and they come in two formats: Dumps PDF and Online Test Engine.

How Do I Know Amazon SAA-C03 Dumps are Worth it?

Did we mention that our latest SAA-C03 Dumps PDF is also available as an Online Test Engine? And that's just the beginning. Of all the features you are offered here at DumpsPool, the money-back guarantee may be the best one: you don't have to worry about your payment. Beyond affordable Real Exam Dumps, you are also offered three months of free updates.

You can easily scroll through our large catalog of certification exams and pick any exam to start your training. That's right: DumpsPool isn't limited to just Amazon exams. We know our customers need the support of an authentic and reliable resource, so we make sure there is never any outdated content in our study resources. Our expert team keeps everything up to the mark by watching every single update. Our main focus is that you understand the real exam format, so you can pass the exam more easily!

IT Students Are Using our AWS Certified Solutions Architect - Associate (SAA-C03) Dumps Worldwide!

It is a well-established fact that certification exams are hard to conquer without some help from experts. That is exactly the point of using AWS Certified Solutions Architect - Associate (SAA-C03) Practice Question Answers: you are surrounded by IT experts who have been through what you are about to face and know better. The 24/7 customer service of DumpsPool ensures you are in touch with these experts whenever needed. Our 100% success rate and worldwide validity make us the most trusted resource candidates use. The updated Dumps PDF helps you pass the exam on the first attempt, and with the money-back guarantee you can buy from us with confidence: you can claim a refund if you do not pass the exam.

How to Get SAA-C03 Real Exam Dumps?

Getting access to real exam dumps is as easy as pressing a button, literally! There are various resources available online, but most of them sell scams or copied content. So, if you are going to attempt the SAA-C03 exam, you need to be sure you are buying the right kind of dumps. All the Dumps PDF available on DumpsPool are unique and as up to date as they can be, and our Practice Question Answers are tested and approved by professionals, making DumpsPool a top authentic resource on the internet. Our experts have made sure the Online Test Engine is free from outdated or fake content, repeated questions, and false or vague information. We make every penny count, and you leave our platform fully satisfied!

Amazon Web Services SAA-C03 Exam Overview:

Exam Cost: $150 USD
Total Time: 130 minutes
Available Languages: English, Japanese, Korean, and Simplified Chinese
Passing Marks: 720 out of 1000
Exam Format: Multiple Choice and Multiple Answer
Exam Type: Associate Level
Prerequisites: At least one year of hands-on experience with AWS services
Exam Registration: Through Pearson VUE
Retake Policy: Every 14 days, up to 3 times in a year
Validity: 3 years

AWS Certified Solutions Architect - Associate (SAA-C03) Exam Topics Breakdown

Domain 1: Design Secure Architectures (30%). Design secure access to AWS resources, secure workloads and applications, and appropriate data security controls.
Domain 2: Design Resilient Architectures (26%). Design scalable, loosely coupled, highly available, and fault-tolerant architectures.
Domain 3: Design High-Performing Architectures (24%). Determine high-performing and scalable storage, compute, database, and networking solutions.
Domain 4: Design Cost-Optimized Architectures (20%). Design cost-optimized storage, compute, database, and network architectures.

Frequently Asked Questions

Amazon SAA-C03 Sample Question Answers

Question # 1

A company has applications that run in an organization in AWS Organizations. The company outsources operational support of the applications. The company needs to provide access for the external support engineers without compromising security. The external support engineers need access to the AWS Management Console. The external support engineers also need operating system access to the company's fleet of Amazon EC2 instances that run Amazon Linux in private subnets. Which solution will meet these requirements MOST securely?

A. Confirm that AWS Systems Manager Agent (SSM Agent) is installed on all instances. Assign an instance profile with the necessary policy to connect to Systems Manager. Use AWS IAM Identity Center to provide the external support engineers console access. Use Systems Manager Session Manager to assign the required permissions.
B. Confirm that AWS Systems Manager Agent (SSM Agent) is installed on all instances. Assign an instance profile with the necessary policy to connect to Systems Manager. Use Systems Manager Session Manager to provide local IAM user credentials in each AWS account to the external support engineers for console access.
C. Confirm that all instances have a security group that allows SSH access only from the external support engineers' source IP address ranges. Provide local IAM user credentials in each AWS account to the external support engineers for console access. Provide each external support engineer an SSH key pair to log in to the application instances.
D. Create a bastion host in a public subnet. Set up the bastion host security group to allow access from only the external engineers' IP address ranges. Ensure that all instances have a security group that allows SSH access from the bastion host. Provide each external support engineer an SSH key pair to log in to the application instances. Provide local IAM user credentials to the engineers for console access.

Question # 2

A company runs an environment where data is stored in an Amazon S3 bucket. The objects are accessed frequently throughout the day. The company has strict data encryption requirements for data that is stored in the S3 bucket. The company currently uses AWS Key Management Service (AWS KMS) for encryption. The company wants to optimize costs associated with encrypting S3 objects without making additional calls to AWS KMS. Which solution will meet these requirements?

A. Use server-side encryption with Amazon S3 managed keys (SSE-S3).
B. Use an S3 Bucket Key for server-side encryption with AWS KMS keys (SSE-KMS) on the new objects.
C. Use client-side encryption with AWS KMS customer managed keys.
D. Use server-side encryption with customer-provided keys (SSE-C) stored in AWS KMS.
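For context on the Bucket Key option, an S3 Bucket Key is enabled per bucket as part of the SSE-KMS server-side encryption configuration, which lets S3 use a short-lived bucket-level key instead of calling AWS KMS for every object. A minimal sketch of the payload shape that boto3's `put_bucket_encryption` expects; the bucket name and KMS key ARN below are placeholders:

```python
# Sketch of an S3 server-side encryption configuration that enables an
# S3 Bucket Key for SSE-KMS. BucketKeyEnabled=True reduces per-object
# AWS KMS requests (and therefore KMS costs). The key ARN is a placeholder.
def bucket_key_encryption_config(kms_key_arn: str) -> dict:
    return {
        "Rules": [
            {
                "ApplyServerSideEncryptionByDefault": {
                    "SSEAlgorithm": "aws:kms",
                    "KMSMasterKeyID": kms_key_arn,
                },
                "BucketKeyEnabled": True,
            }
        ]
    }

# With boto3 this dict would be passed as:
# s3.put_bucket_encryption(Bucket="example-bucket",
#                          ServerSideEncryptionConfiguration=config)
config = bucket_key_encryption_config(
    "arn:aws:kms:us-east-1:111122223333:key/example-key-id"
)
```
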

Question # 3

A company is designing the architecture for a new mobile app that uses the AWS Cloud. The company uses organizational units (OUs) in AWS Organizations to manage its accounts. The company wants to tag Amazon EC2 instances with data sensitivity by using values of sensitive and nonsensitive. IAM identities must not be able to delete a tag or create instances without a tag. Which combination of steps will meet these requirements? (Select TWO.)

A. In Organizations, create a new tag policy that specifies the data sensitivity tag key and the required values. Enforce the tag values for the EC2 instances. Attach the tag policy to the appropriate OU.
B. In Organizations, create a new service control policy (SCP) that specifies the data sensitivity tag key and the required tag values. Enforce the tag values for the EC2 instances. Attach the SCP to the appropriate OU.
C. Create a tag policy to deny running instances when a tag key is not specified. Create another tag policy that prevents identities from deleting tags. Attach the tag policies to the appropriate OU.
D. Create a service control policy (SCP) to deny creating instances when a tag key is not specified. Create another SCP that prevents identities from deleting tags. Attach the SCPs to the appropriate OU.
E. Create an AWS Config rule to check if EC2 instances use the data sensitivity tag and the specified values. Configure an AWS Lambda function to delete the resource if a noncompliant resource is found.
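The SCP pattern in this question can be illustrated as JSON. A sketch under stated assumptions: the tag key `DataSensitivity` is hypothetical, the first statement denies launching an instance whose request omits the tag (the `Null` condition on `aws:RequestTag`), and the second denies deleting that tag key:

```python
import json

# Sketch of a service control policy (SCP) with two Deny statements.
# The tag key "DataSensitivity" is illustrative, not from the question.
scp = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyRunInstancesWithoutTag",
            "Effect": "Deny",
            "Action": "ec2:RunInstances",
            "Resource": "arn:aws:ec2:*:*:instance/*",
            # "Null": true matches requests where the tag key is absent.
            "Condition": {"Null": {"aws:RequestTag/DataSensitivity": "true"}},
        },
        {
            "Sid": "DenyDeletingSensitivityTag",
            "Effect": "Deny",
            "Action": "ec2:DeleteTags",
            "Resource": "arn:aws:ec2:*:*:instance/*",
            # Deny any DeleteTags call that targets the DataSensitivity key.
            "Condition": {
                "ForAnyValue:StringEquals": {"aws:TagKeys": ["DataSensitivity"]}
            },
        },
    ],
}

policy_json = json.dumps(scp, indent=2)
```

The policy document would then be attached to the appropriate OU in AWS Organizations.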

Question # 4

A company is running a media store across multiple Amazon EC2 instances distributed across multiple Availability Zones in a single VPC. The company wants a high-performing solution to share data between all the EC2 instances, and prefers to keep the data within the VPC only. What should a solutions architect recommend?

A. Create an Amazon S3 bucket and call the service APIs from each instance's application.
B. Create an Amazon S3 bucket and configure all instances to access it as a mounted volume.
C. Configure an Amazon Elastic Block Store (Amazon EBS) volume and mount it across all instances.
D. Configure an Amazon Elastic File System (Amazon EFS) file system and mount it across all instances.

Question # 5

A company stores user data in AWS. The data is used continuously with peak usage during business hours. Access patterns vary, with some data not being used for months at a time. A solutions architect must choose a cost-effective solution that maintains the highest level of durability while maintaining high availability. Which storage solution meets these requirements?

A. Amazon S3 Standard
B. Amazon S3 Intelligent-Tiering
C. Amazon S3 Glacier Deep Archive
D. Amazon S3 One Zone-Infrequent Access (S3 One Zone-IA)

Question # 6

A weather forecasting company collects temperature readings from various sensors on a continuous basis. An existing data ingestion process collects the readings and aggregates the readings into larger Apache Parquet files. Then the process encrypts the files by using client-side encryption with KMS managed keys (CSE-KMS). Finally, the process writes the files to an Amazon S3 bucket with separate prefixes for each calendar day. The company wants to run occasional SQL queries on the data to take sample moving averages for a specific calendar day. Which solution will meet these requirements MOST cost-effectively?

A. Configure Amazon Athena to read the encrypted files. Run SQL queries on the data directly in Amazon S3.
B. Use Amazon S3 Select to run SQL queries on the data directly in Amazon S3.
C. Configure Amazon Redshift to read the encrypted files. Use Redshift Spectrum and Redshift query editor v2 to run SQL queries on the data directly in Amazon S3.
D. Configure Amazon EMR Serverless to read the encrypted files. Use Apache Spark SQL to run SQL queries on the data directly in Amazon S3.
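The kind of ad hoc query this scenario describes can be sketched as standard SQL with a window function, the sort of statement Amazon Athena accepts. The table and column names below (`sensor_readings`, `reading_time`, `temperature`, `day`) are hypothetical, not from the question:

```python
# Hypothetical Athena-style SQL computing a 10-reading moving average of
# temperature for one calendar day. All identifiers are illustrative.
query = """
SELECT
    reading_time,
    AVG(temperature) OVER (
        ORDER BY reading_time
        ROWS BETWEEN 9 PRECEDING AND CURRENT ROW
    ) AS moving_avg_temp
FROM sensor_readings
WHERE day = DATE '2025-01-15'
ORDER BY reading_time
"""
```

Filtering on the day column (or day-based S3 prefix) keeps the scan, and therefore the per-query cost, limited to a single calendar day's Parquet files.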

Question # 7

A company has separate AWS accounts for its finance, data analytics, and development departments. Because of costs and security concerns, the company wants to control which services each AWS account can use. Which solution will meet these requirements with the LEAST operational overhead?

A. Use AWS Systems Manager templates to control which AWS services each department can use.
B. Create organizational units (OUs) for each department in AWS Organizations. Attach service control policies (SCPs) to the OUs.
C. Use AWS CloudFormation to automatically provision only the AWS services that each department can use.
D. Set up a list of products in AWS Service Catalog in the AWS accounts to manage and control the usage of specific AWS services.

Question # 8

A company is building a web application that serves a content management system. The content management system runs on Amazon EC2 instances behind an Application Load Balancer (ALB). The EC2 instances run in an Auto Scaling group across multiple Availability Zones. Users are constantly adding and updating files, blogs, and other website assets in the content management system. A solutions architect must implement a solution in which all the EC2 instances share up-to-date website content with the least possible lag time. Which solution meets these requirements?

A. Update the EC2 user data in the Auto Scaling group lifecycle policy to copy the website assets from the EC2 instance that was launched most recently. Configure the ALB to make changes to the website assets only in the newest EC2 instance.
B. Copy the website assets to an Amazon Elastic File System (Amazon EFS) file system. Configure each EC2 instance to mount the EFS file system locally. Configure the website hosting application to reference the website assets that are stored in the EFS file system.
C. Copy the website assets to an Amazon S3 bucket. Ensure that each EC2 instance downloads the website assets from the S3 bucket to the attached Amazon Elastic Block Store (Amazon EBS) volume. Run the S3 sync command once each hour to keep files up to date.
D. Restore an Amazon Elastic Block Store (Amazon EBS) snapshot with the website assets. Attach the EBS snapshot as a secondary EBS volume when a new EC2 instance is launched. Configure the website hosting application to reference the website assets that are stored in the secondary EBS volume.

Question # 9

A company is building an application in the AWS Cloud. The application is hosted on Amazon EC2 instances behind an Application Load Balancer (ALB). The company uses Amazon Route 53 for DNS. The company needs a managed solution with proactive engagement to protect against DDoS attacks. Which solution will meet these requirements?

A. Enable AWS Config. Configure an AWS Config managed rule that detects DDoS attacks.
B. Enable AWS WAF on the ALB. Create an AWS WAF web ACL with rules to detect and prevent DDoS attacks. Associate the web ACL with the ALB.
C. Store the ALB access logs in an Amazon S3 bucket. Configure Amazon GuardDuty to detect and take automated preventative actions for DDoS attacks.
D. Subscribe to AWS Shield Advanced. Configure hosted zones in Route 53. Add ALB resources as protected resources.

Question # 10

A company runs a Node.js function on a server in its on-premises data center. The data center stores data in a PostgreSQL database. The company stores the credentials in a connection string in an environment variable on the server. The company wants to migrate its application to AWS and to replace the Node.js application server with AWS Lambda. The company also wants to migrate to Amazon RDS for PostgreSQL and to ensure that the database credentials are securely managed. Which solution will meet these requirements with the LEAST operational overhead?

A. Store the database credentials as a parameter in AWS Systems Manager Parameter Store. Configure Parameter Store to automatically rotate the secrets every 30 days. Update the Lambda function to retrieve the credentials from the parameter.
B. Store the database credentials as a secret in AWS Secrets Manager. Configure Secrets Manager to automatically rotate the credentials every 30 days. Update the Lambda function to retrieve the credentials from the secret.
C. Store the database credentials as an encrypted Lambda environment variable. Write a custom Lambda function to rotate the credentials. Schedule the Lambda function to run every 30 days.
D. Store the database credentials as a key in AWS Key Management Service (AWS KMS). Configure automatic rotation for the key. Update the Lambda function to retrieve the credentials from the KMS key.
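When a Lambda function retrieves credentials from Secrets Manager, the `SecretString` it receives for an RDS database is a JSON document with fields such as `username`, `password`, `host`, `port`, and `dbname`. A sketch of the parsing step only (the helper name is ours; the actual retrieval would call `secretsmanager.get_secret_value` and the result would feed a PostgreSQL client):

```python
import json

# Parse a Secrets Manager SecretString into PostgreSQL connection keyword
# arguments. Field names follow the JSON structure Secrets Manager uses
# for RDS database secrets; the helper itself is illustrative.
def connection_kwargs(secret_string: str) -> dict:
    secret = json.loads(secret_string)
    return {
        "user": secret["username"],
        "password": secret["password"],
        "host": secret["host"],
        "port": int(secret.get("port", 5432)),
        "dbname": secret["dbname"],
    }

# Example payload shaped like get_secret_value(...)["SecretString"].
sample = json.dumps({
    "username": "app_user",
    "password": "example-password",
    "host": "mydb.example.us-east-1.rds.amazonaws.com",
    "port": 5432,
    "dbname": "appdb",
})
kwargs = connection_kwargs(sample)
```

Because Secrets Manager handles the 30-day rotation, the function only ever reads the current secret value; no credential lives in an environment variable.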

Question # 11

A company runs several websites on AWS for its different brands. Each website generates tens of gigabytes of web traffic logs each day. A solutions architect needs to design a scalable solution to give the company's developers the ability to analyze traffic patterns across all the company's websites. This analysis by the developers will occur on demand once a week over the course of several months. The solution must support queries with standard SQL. Which solution will meet these requirements MOST cost-effectively?

A. Store the logs in Amazon S3. Use Amazon Athena for analysis.
B. Store the logs in Amazon RDS. Use a database client for analysis.
C. Store the logs in Amazon OpenSearch Service. Use OpenSearch Service for analysis.
D. Store the logs in an Amazon EMR cluster. Use a supported open-source framework for SQL-based analysis.

Question # 12

A company runs its production workload on an Amazon Aurora MySQL DB cluster that includes six Aurora Replicas. The company wants near-real-time reporting queries from one of its departments to be automatically distributed across three of the Aurora Replicas. Those three replicas have a different compute and memory specification from the rest of the DB cluster. Which solution meets these requirements?

A. Create and use a custom endpoint for the workload.
B. Create a three-node cluster clone and use the reader endpoint.
C. Use any of the instance endpoints for the selected three nodes.
D. Use the reader endpoint to automatically distribute the read-only workload.
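An Aurora custom endpoint of the kind this question describes is defined by a fixed member list. A sketch of the request parameters that boto3's RDS `create_db_cluster_endpoint` call takes; the cluster and instance identifiers are placeholders, `EndpointType` of `READER` restricts the endpoint to read traffic, and `StaticMembers` pins it to the three reporting replicas:

```python
# Sketch of the parameters for an Aurora custom endpoint that distributes
# read traffic across three specific replicas. Identifiers are placeholders;
# with boto3: rds.create_db_cluster_endpoint(**params)
params = {
    "DBClusterIdentifier": "prod-aurora-cluster",
    "DBClusterEndpointIdentifier": "reporting-endpoint",
    "EndpointType": "READER",
    "StaticMembers": [
        "prod-aurora-replica-4",
        "prod-aurora-replica-5",
        "prod-aurora-replica-6",
    ],
}
```

The reporting department then connects to the custom endpoint's DNS name, and Aurora spreads those connections across only the listed members.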

Question # 13

A company is building a cloud-based application on AWS that will handle sensitive customer data. The application uses Amazon RDS for the database, Amazon S3 for object storage, and S3 Event Notifications that invoke AWS Lambda for serverless processing. The company uses AWS IAM Identity Center to manage user credentials. The development, testing, and operations teams need secure access to Amazon RDS and Amazon S3 while ensuring the confidentiality of sensitive customer data. The solution must comply with the principle of least privilege. Which solution meets these requirements with the LEAST operational overhead?

A. Use IAM roles with least privilege to grant all the teams access. Assign IAM roles to each team with customized IAM policies defining specific permissions for Amazon RDS and S3 object access based on team responsibilities.
B. Enable IAM Identity Center with an Identity Center directory. Create and configure permission sets with granular access to Amazon RDS and Amazon S3. Assign all the teams to groups that have specific access with the permission sets.
C. Create individual IAM users for each member in all the teams with role-based permissions. Assign the IAM roles with predefined policies for RDS and S3 access to each user based on user needs. Implement IAM Access Analyzer for periodic credential evaluation.
D. Use AWS Organizations to create separate accounts for each team. Implement cross-account IAM roles with least privilege. Grant specific permissions for RDS and S3 access based on team roles and responsibilities.

Question # 14

A company is implementing a new application on AWS. The company will run the application on multiple Amazon EC2 instances across multiple Availability Zones within multiple AWS Regions. The application will be available through the internet. Users will access the application from around the world. The company wants to ensure that each user who accesses the application is sent to the EC2 instances that are closest to the user's location. Which solution will meet these requirements?

A. Implement an Amazon Route 53 geolocation routing policy. Use an internet-facing Application Load Balancer to distribute the traffic across all Availability Zones within the same Region.
B. Implement an Amazon Route 53 geoproximity routing policy. Use an internet-facing Network Load Balancer to distribute the traffic across all Availability Zones within the same Region.
C. Implement an Amazon Route 53 multivalue answer routing policy. Use an internet-facing Application Load Balancer to distribute the traffic across all Availability Zones within the same Region.
D. Implement an Amazon Route 53 weighted routing policy. Use an internet-facing Network Load Balancer to distribute the traffic across all Availability Zones within the same Region.

Question # 15

An ecommerce company runs several internal applications in multiple AWS accounts. The company uses AWS Organizations to manage its AWS accounts. A security appliance in the company's networking account must inspect interactions between applications across AWS accounts. Which solution will meet these requirements?

A. Deploy a Network Load Balancer (NLB) in the networking account to send traffic to the security appliance. Configure the application accounts to send traffic to the NLB by using an interface VPC endpoint in the application accounts.
B. Deploy an Application Load Balancer (ALB) in the application accounts to send traffic directly to the security appliance.
C. Deploy a Gateway Load Balancer (GWLB) in the networking account to send traffic to the security appliance. Configure the application accounts to send traffic to the GWLB by using a Gateway Load Balancer endpoint in the application accounts.
D. Deploy an interface VPC endpoint in the application accounts to send traffic directly to the security appliance.

Question # 16

A company stores data in an on-premises Oracle relational database. The company needs to make the data available in Amazon Aurora PostgreSQL for analysis. The company uses an AWS Site-to-Site VPN connection to connect its on-premises network to AWS. The company must capture the changes that occur to the source database during the migration to Aurora PostgreSQL. Which solution will meet these requirements?

A. Use the AWS Schema Conversion Tool (AWS SCT) to convert the Oracle schema to an Aurora PostgreSQL schema. Use an AWS Database Migration Service (AWS DMS) full-load migration task to migrate the data.
B. Use AWS DataSync to migrate the data to an Amazon S3 bucket. Import the S3 data to Aurora PostgreSQL by using the Aurora PostgreSQL aws_s3 extension.
C. Use the AWS Schema Conversion Tool (AWS SCT) to convert the Oracle schema to Aurora PostgreSQL schema. Use AWS Database Migration Service (AWS DMS) to migrate the existing data and replicate the ongoing changes.
D. Use an AWS Snowball device to migrate the data to an Amazon S3 bucket. Import the S3 data to Aurora PostgreSQL by using the Aurora PostgreSQL aws_s3 extension.

Question # 17

A company has an employee web portal. Employees log in to the portal to view payroll details. The company is developing a new system to give employees the ability to upload scanned documents for reimbursement. The company runs a program to extract text-based data from the documents and attach the extracted information to each employee's reimbursement IDs for processing. The employee web portal requires 100% uptime. The document extract program runs infrequently throughout the day on an on-demand basis. The company wants to build a scalable and cost-effective new system that will require minimal changes to the existing web portal. The company does not want to make any code changes. Which solution will meet these requirements with the LEAST implementation effort?

A. Run Amazon EC2 On-Demand Instances in an Auto Scaling group for the web portal. Use an AWS Lambda function to run the document extract program. Invoke the Lambda function when an employee uploads a new reimbursement document.
B. Run Amazon EC2 Spot Instances in an Auto Scaling group for the web portal. Run the document extract program on EC2 Spot Instances. Start document extract program instances when an employee uploads a new reimbursement document.
C. Purchase a Savings Plan to run the web portal and the document extract program. Run the web portal and the document extract program in an Auto Scaling group.
D. Create an Amazon S3 bucket to host the web portal. Use Amazon API Gateway and an AWS Lambda function for the existing functionalities. Use the Lambda function to run the document extract program. Invoke the Lambda function when the API that is associated with a new document upload is called.

Question # 18

A medical company wants to perform transformations on a large amount of clinical trial data that comes from several customers. The company must extract the data from a relational database that contains the customer data. Then the company will transform the data by using a series of complex rules. The company will load the data to Amazon S3 when the transformations are complete. All data must be encrypted where it is processed before the company stores the data in Amazon S3. All data must be encrypted by using customer-specific keys. Which solution will meet these requirements with the LEAST amount of operational effort?

A. Create one AWS Glue job for each customer. Attach a security configuration to each job that uses server-side encryption with Amazon S3 managed keys (SSE-S3) to encrypt the data.
B. Create one Amazon EMR cluster for each customer. Attach a security configuration to each cluster that uses client-side encryption with a custom client-side root key (CSE-Custom) to encrypt the data.
C. Create one AWS Glue job for each customer. Attach a security configuration to each job that uses client-side encryption with AWS KMS managed keys (CSE-KMS) to encrypt the data.
D. Create one Amazon EMR cluster for each customer. Attach a security configuration to each cluster that uses server-side encryption with AWS KMS keys (SSE-KMS) to encrypt the data.

Question # 19

A company needs to optimize its Amazon S3 storage costs for an application that generates many files that cannot be recreated. Each file is approximately 5 MB and is stored in Amazon S3 Standard storage. The company must store the files for 4 years before the files can be deleted. The files must be immediately accessible. The files are frequently accessed in the first 30 days after object creation, but they are rarely accessed after the first 30 days. Which solution will meet these requirements MOST cost-effectively?

A. Create an S3 Lifecycle policy to move the files to S3 Glacier Instant Retrieval 30 days after object creation. Delete the files 4 years after object creation.
B. Create an S3 Lifecycle policy to move the files to S3 One Zone-Infrequent Access (S3 One Zone-IA) 30 days after object creation. Delete the files 4 years after object creation.
C. Create an S3 Lifecycle policy to move the files to S3 Standard-Infrequent Access (S3 Standard-IA) 30 days after object creation. Delete the files 4 years after object creation.
D. Create an S3 Lifecycle policy to move the files to S3 Standard-Infrequent Access (S3 Standard-IA) 30 days after object creation. Move the files to S3 Glacier Flexible Retrieval 4 years after object creation.
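A lifecycle policy of the shape these options describe can be sketched as the configuration dict that boto3's `put_bucket_lifecycle_configuration` expects: one rule that transitions objects to S3 Standard-IA 30 days after creation and expires them after 4 years. The rule ID is a placeholder:

```python
# Sketch of an S3 lifecycle configuration: transition to Standard-IA after
# 30 days, delete after 4 years. 4 * 365 = 1460 days (ignoring leap days).
FOUR_YEARS_DAYS = 4 * 365

lifecycle = {
    "Rules": [
        {
            "ID": "archive-then-delete",          # placeholder rule name
            "Status": "Enabled",
            "Filter": {"Prefix": ""},             # apply to all objects
            "Transitions": [
                {"Days": 30, "StorageClass": "STANDARD_IA"}
            ],
            "Expiration": {"Days": FOUR_YEARS_DAYS},
        }
    ]
}
```

Standard-IA keeps the objects immediately accessible, which is why it fits the "rarely accessed but must stay immediately available" requirement better than a Glacier class with retrieval delays.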

Question # 20

A startup company is hosting a website for its customers on an Amazon EC2 instance. The website consists of a stateless Python application and a MySQL database. The website serves only a small amount of traffic. The company is concerned about the reliability of the instance and needs to migrate to a highly available architecture. The company cannot modify the application code. Which combination of actions should a solutions architect take to achieve high availability for the website? (Select TWO.)

A. Provision an internet gateway in each Availability Zone in use.
B. Migrate the database to an Amazon RDS for MySQL Multi-AZ DB instance.
C. Migrate the database to Amazon DynamoDB, and enable DynamoDB auto scaling.
D. Use AWS DataSync to synchronize the database data across multiple EC2 instances.
E. Create an Application Load Balancer to distribute traffic to an Auto Scaling group of EC2 instances that are distributed across two Availability Zones.

Question # 21

A company has migrated several applications to AWS in the past 3 months. The company wants to know the breakdown of costs for each of these applications. The company wants to receive a regular report that includes this information. Which solution will meet these requirements MOST cost-effectively?

A. Use AWS Budgets to download data for the past 3 months into a .csv file. Look up the desired information.
B. Load AWS Cost and Usage Reports into an Amazon RDS DB instance. Run SQL queries to get the desired information.
C. Tag all the AWS resources with a key for cost and a value of the application's name. Activate cost allocation tags. Use Cost Explorer to get the desired information.
D. Tag all the AWS resources with a key for cost and a value of the application's name. Use the AWS Billing and Cost Management console to download bills for the past 3 months. Look up the desired information.

Question # 22

A company is migrating a legacy application from an on-premises data center to AWS. The application relies on hundreds of cron jobs that run for between 1 and 20 minutes on different recurring schedules throughout the day. The company wants a solution to schedule and run the cron jobs on AWS with minimal refactoring. The solution must support running the cron jobs in response to an event in the future. Which solution will meet these requirements?

A. Create a container image for the cron jobs. Use Amazon EventBridge Scheduler to create a recurring schedule. Run the cron job tasks as AWS Lambda functions.
B. Create a container image for the cron jobs. Use AWS Batch on Amazon Elastic Container Service (Amazon ECS) with a scheduling policy to run the cron jobs.
C. Create a container image for the cron jobs. Use Amazon EventBridge Scheduler to create a recurring schedule. Run the cron job tasks on AWS Fargate.
D. Create a container image for the cron jobs. Create a workflow in AWS Step Functions that uses a Wait state to run the cron jobs at a specified time. Use the RunTask action to run the cron job tasks on AWS Fargate.

Question # 23

A company uses Amazon API Gateway to manage its REST APIs that third-party service providers access. The company must protect the REST APIs from SQL injection and cross-site scripting attacks. What is the MOST operationally efficient solution that meets these requirements?

A. Configure AWS Shield.
B. Configure AWS WAF.
C. Set up API Gateway with an Amazon CloudFront distribution. Configure AWS Shield in CloudFront.
D. Set up API Gateway with an Amazon CloudFront distribution. Configure AWS WAF in CloudFront.

Question # 24

A company is using AWS DataSync to migrate millions of files from an on-premises system to AWS. The files are 10 KB in size on average. The company wants to use Amazon S3 for file storage. For the first year after the migration, the files will be accessed once or twice and must be immediately available. After 1 year, the files must be archived for at least 7 years. Which solution will meet these requirements MOST cost-effectively?

A. Use an archive tool lo group the files into large objects. Use DataSync to migrate the objects. Store the objects in S3 Glacier Instant Retrieval for the first year. Use a lifecycle configuration to transition the files to S3 Glacier Deep Archive after 1 year with a retention period of 7 years.
B. Use an archive tool to group the files into large objects. Use DataSync to copy the objects to S3 Standard-Infrequent Access (S3 Standard-IA). Use a lifecycle configuration to transition the files to S3 Glacier Instant Retrieval after 1 year with a retention period of 7 years.
C. Configure the destination storage class for the files as S3 Glacier Instant Retrieval. Use a lifecycle policy to transition the files to S3 Glacier Flexible Retrieval after 1 year with a retention period of 7 years.
D. Configure a DataSync task to transfer the files to S3 Standard-Infrequent Access (S3 Standard-IA). Use a lifecycle configuration to transition the files to S3 Glacier Deep Archive after 1 year with a retention period of 7 years.
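The 1-year transition described in these options maps to an S3 lifecycle configuration, the document passed to the S3 `PutBucketLifecycleConfiguration` API. A minimal sketch (the rule ID is a placeholder):

```python
# Sketch of an S3 lifecycle configuration that transitions all objects
# to S3 Glacier Deep Archive after 1 year. The rule ID is a placeholder.
def lifecycle_after_one_year():
    return {
        "Rules": [
            {
                "ID": "archive-after-1-year",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},  # empty prefix: apply to all objects
                "Transitions": [
                    {"Days": 365, "StorageClass": "DEEP_ARCHIVE"}
                ],
            }
        ]
    }
```

Note that lifecycle transitions are billed per object, which is why grouping millions of small files into larger objects (as options A and B suggest) matters for cost.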

Question # 25

A company is migrating its on-premises Oracle database to an Amazon RDS for Oracle database. The company needs to retain data for 90 days to meet regulatory requirements. The company must also be able to restore the database to a specific point in time for up to 14 days. Which solution will meet these requirements with the LEAST operational overhead?

A. Create Amazon RDS automated backups. Set the retention period to 90 days.
B. Create an Amazon RDS manual snapshot every day. Delete manual snapshots that are older than 90 days.
C. Use the Amazon Aurora Clone feature for Oracle to create a point-in-time restore. Delete clones that are older than 90 days.
D. Create a backup plan that has a retention period of 90 days by using AWS Backup for Amazon RDS.

Question # 26

A company uses Amazon EC2 instances and stores data on Amazon Elastic Block Store (Amazon EBS) volumes. The company must ensure that all data is encrypted at rest by using AWS Key Management Service (AWS KMS). The company must be able to control rotation of the encryption keys. Which solution will meet these requirements with the LEAST operational overhead?

A. Create a customer managed key. Use the key to encrypt the EBS volumes.
B. Use an AWS managed key to encrypt the EBS volumes. Use the key to configure automatic key rotation.
C. Create an external KMS key with imported key material. Use the key to encrypt the EBS volumes.
D. Use an AWS owned key to encrypt the EBS volumes.
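The customer managed key approach in option A is the one that gives the company control over rotation. A minimal sketch of the flow, with the KMS client injected so it can be shown without AWS credentials (a real caller would pass `boto3.client("kms")`; the description is a placeholder):

```python
# Sketch: create a customer managed KMS key and enable automatic rotation.
# The client is injected for testability; in practice pass boto3.client("kms").
def create_rotating_key(kms, description="EBS encryption key"):
    key = kms.create_key(
        Description=description,
        KeyUsage="ENCRYPT_DECRYPT",
        KeySpec="SYMMETRIC_DEFAULT",
    )
    key_id = key["KeyMetadata"]["KeyId"]
    # Rotation can only be configured on customer managed keys;
    # AWS managed and AWS owned keys rotate on AWS's schedule.
    kms.enable_key_rotation(KeyId=key_id)
    return key_id
```

The returned key ID can then be supplied as the `KmsKeyId` when creating or encrypting EBS volumes.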

Question # 27

A company uses an Amazon DynamoDB table to store data that the company receives from devices. The DynamoDB table supports a customer-facing website to display recent activity on customer devices. The company configured the table with provisioned throughput for writes and reads. The company wants to calculate performance metrics for customer device data on a daily basis. The solution must have minimal effect on the table's provisioned read and write capacity. Which solution will meet these requirements?

A. Use an Amazon Athena SQL query with the Amazon Athena DynamoDB connector to calculate performance metrics on a recurring schedule.
B. Use an AWS Glue job with the AWS Glue DynamoDB export connector to calculate performance metrics on a recurring schedule.
C. Use an Amazon Redshift COPY command to calculate performance metrics on a recurring schedule.
D. Use an Amazon EMR job with an Apache Hive external table to calculate performance metrics on a recurring schedule.

Question # 28

A company sets up an organization in AWS Organizations that contains 10 AWS accounts. A solutions architect must design a solution to provide access to the accounts for several thousand employees. The company has an existing identity provider (IdP). The company wants to use the existing IdP for authentication to AWS. Which solution will meet these requirements?

A. Create IAM users for the employees in the required AWS accounts. Connect the IAM users to the existing IdP. Configure federated authentication for the IAM users.
B. Set up AWS account root users with user email addresses and passwords that are synchronized from the existing IdP.
C. Configure AWS IAM Identity Center. Connect IAM Identity Center to the existing IdP. Provision users and groups from the existing IdP.
D. Use AWS Resource Access Manager (AWS RAM) to share access to the AWS accounts with the users in the existing IdP.

Question # 29

A company currently runs an on-premises stock trading application by using Microsoft Windows Server. The company wants to migrate the application to the AWS Cloud. The company needs to design a highly available solution that provides low-latency access to block storage across multiple Availability Zones. Which solution will meet these requirements with the LEAST implementation effort?

A. Configure a Windows Server cluster that spans two Availability Zones on Amazon EC2 instances. Install the application on both cluster nodes. Use Amazon FSx for Windows File Server as shared storage between the two cluster nodes.
B. Configure a Windows Server cluster that spans two Availability Zones on Amazon EC2 instances. Install the application on both cluster nodes. Use Amazon Elastic Block Store (Amazon EBS) General Purpose SSD (gp3) volumes as storage attached to the EC2 instances. Set up application-level replication to sync data from one EBS volume in one Availability Zone to another EBS volume in the second Availability Zone.
C. Deploy the application on Amazon EC2 instances in two Availability Zones. Configure one EC2 instance as active and the second EC2 instance in standby mode. Use an Amazon FSx for NetApp ONTAP Multi-AZ file system to access the data by using the Internet Small Computer Systems Interface (iSCSI) protocol.
D. Deploy the application on Amazon EC2 instances in two Availability Zones. Configure one EC2 instance as active and the second EC2 instance in standby mode. Use Amazon Elastic Block Store (Amazon EBS) Provisioned IOPS SSD (io2) volumes as storage attached to the EC2 instances. Set up Amazon EBS level replication to sync data from one io2 volume in one Availability Zone to another io2 volume in the second Availability Zone.

Question # 30

A company is migrating its databases to Amazon RDS for PostgreSQL. The company is migrating its applications to Amazon EC2 instances. The company wants to optimize costs for long-running workloads. Which solution will meet this requirement MOST cost-effectively?

A. Use On-Demand Instances for the Amazon RDS for PostgreSQL workloads. Purchase a 1 year Compute Savings Plan with the No Upfront option for the EC2 instances.
B. Purchase Reserved Instances for a 1 year term with the No Upfront option for the Amazon RDS for PostgreSQL workloads. Purchase a 1 year EC2 Instance Savings Plan with the No Upfront option for the EC2 instances.
C. Purchase Reserved Instances for a 1 year term with the Partial Upfront option for the Amazon RDS for PostgreSQL workloads. Purchase a 1 year EC2 Instance Savings Plan with the Partial Upfront option for the EC2 instances.
D. Purchase Reserved Instances for a 3 year term with the All Upfront option for the Amazon RDS for PostgreSQL workloads. Purchase a 3 year EC2 Instance Savings Plan with the All Upfront option for the EC2 instances.

Question # 31

A company hosts an application in a private subnet. The company has already integrated the application with Amazon Cognito. The company uses an Amazon Cognito user pool to authenticate users. The company needs to modify the application so the application can securely store user documents in an Amazon S3 bucket. Which combination of steps will securely integrate Amazon S3 with the application? (Select TWO.)

A. Create an Amazon Cognito identity pool to generate secure Amazon S3 access tokens for users when they successfully log in.
B. Use the existing Amazon Cognito user pool to generate Amazon S3 access tokens for users when they successfully log in.
C. Create an Amazon S3 VPC endpoint in the same VPC where the company hosts the application.
D. Create a NAT gateway in the VPC where the company hosts the application. Assign a policy to the S3 bucket to deny any request that is not initiated from Amazon Cognito. 
E. Attach a policy to the S3 bucket that allows access only from the users' IP addresses.

Question # 32

A company hosts its application on several Amazon EC2 instances inside a VPC. The company creates a dedicated Amazon S3 bucket for each customer to store their relevant information in Amazon S3. The company wants to ensure that the application running on EC2 instances can securely access only the S3 buckets that belong to the company's AWS account. Which solution will meet these requirements with the LEAST operational overhead?

A. Create a gateway endpoint for Amazon S3 that is attached to the VPC. Update the IAM instance profile policy to provide access to only the specific buckets that the application needs.
B. Create a NAT gateway in a public subnet with a security group that allows access to only Amazon S3. Update the route tables to use the NAT gateway.
C. Create a gateway endpoint for Amazon S3 that is attached to the VPC. Update the IAM instance profile policy with a Deny action and the following condition key:
D. Create a NAT gateway in a public subnet. Update the route tables to use the NAT gateway. Assign bucket policies for all buckets with a Deny action and the following condition key:
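The condition keys referenced in options C and D are missing from this dump. As a hedged illustration only: the condition key commonly used to restrict S3 access to a specific gateway endpoint is `aws:sourceVpce`. The endpoint ID and bucket name below are placeholders, not values from the question.

```python
# Hedged illustration of a policy that denies S3 requests not arriving
# through a specific gateway endpoint. The endpoint ID and bucket name
# are placeholders; aws:sourceVpce is the condition key commonly used
# for this pattern (the original question's key text is not preserved).
def deny_outside_vpce(bucket="example-bucket", vpce_id="vpce-0abc123"):
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Deny",
                "Principal": "*",
                "Action": "s3:*",
                "Resource": [
                    f"arn:aws:s3:::{bucket}",
                    f"arn:aws:s3:::{bucket}/*",
                ],
                # Deny any request whose source is not the named endpoint
                "Condition": {"StringNotEquals": {"aws:sourceVpce": vpce_id}},
            }
        ],
    }
```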

Question # 33

A company wants to standardize its Amazon Elastic Block Store (Amazon EBS) volume encryption strategy. The company also wants to minimize the cost and configuration effort required to operate the volume encryption check. Which solution will meet these requirements?

A. Write API calls to describe the EBS volumes and to confirm the EBS volumes are encrypted. Use Amazon EventBridge to schedule an AWS Lambda function to run the API calls.
B. Write API calls to describe the EBS volumes and to confirm the EBS volumes are encrypted. Run the API calls on an AWS Fargate task.
C. Create an AWS Identity and Access Management (IAM) policy that requires the use of tags on EBS volumes. Use AWS Cost Explorer to display resources that are not properly tagged. Encrypt the untagged resources manually.
D. Create an AWS Config rule for Amazon EBS to evaluate if a volume is encrypted and to flag the volume if it is not encrypted.
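Option D corresponds to the AWS Config managed rule `ENCRYPTED_VOLUMES`, which flags attached EBS volumes that are not encrypted. A minimal sketch of the payload shape for `put_config_rule` (the rule name is a placeholder):

```python
# Sketch of an AWS Config rule definition using the managed rule
# ENCRYPTED_VOLUMES to evaluate EBS volume encryption.
# The rule name is a placeholder.
def ebs_encryption_rule(name="ebs-volumes-encrypted"):
    return {
        "ConfigRuleName": name,
        "Scope": {"ComplianceResourceTypes": ["AWS::EC2::Volume"]},
        "Source": {"Owner": "AWS", "SourceIdentifier": "ENCRYPTED_VOLUMES"},
    }
```

Because the rule is fully managed, there is no scheduling or compute to operate, which is what makes it the low-cost, low-effort choice.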

Question # 34

An online gaming company hosts its platform on Amazon EC2 instances behind Network Load Balancers (NLBs) across multiple AWS Regions. The NLBs can route requests to targets over the internet. The company wants to improve the customer playing experience by reducing end-to-end load time for its global customer base. Which solution will meet these requirements?

A. Create Application Load Balancers (ALBs) in each Region to replace the existing NLBs. Register the existing EC2 instances as targets for the ALBs in each Region.
B. Configure Amazon Route 53 to route equally weighted traffic to the NLBs in each Region.
C. Create additional NLBs and EC2 instances in other Regions where the company has large customer bases.
D. Create a standard accelerator in AWS Global Accelerator. Configure the existing NLBs as target endpoints.

Question # 35

A company has stored millions of objects across multiple prefixes in an Amazon S3 bucket by using the Amazon S3 Glacier Deep Archive storage class. The company needs to delete all data older than 3 years except for a subset of data that must be retained. The company has identified the data that must be retained and wants to implement a serverless solution. Which solution will meet these requirements?

A. Use S3 Inventory to list all objects. Use the AWS CLI to create a script that runs on an Amazon EC2 instance that deletes objects from the inventory list.
B. Use AWS Batch to delete objects older than 3 years except for the data that must be retained.
C. Provision an AWS Glue crawler to query objects older than 3 years. Save the manifest file of old objects. Create a script to delete objects in the manifest.
D. Enable S3 Inventory. Create an AWS Lambda function to filter and delete objects. Invoke the Lambda function with S3 Batch Operations to delete objects by using the inventory reports.

Question # 36

A solutions architect needs to connect a company's corporate network to its VPC to allow on-premises access to its AWS resources. The solution must provide encryption of all traffic between the corporate network and the VPC at the network layer and the session layer. The solution also must provide security controls to prevent unrestricted access between AWS and the on-premises systems. Which solution meets these requirements?

A. Configure AWS Direct Connect to connect to the VPC. Configure the VPC route tables to allow and deny traffic between AWS and on premises as required.
B. Create an IAM policy to allow access to the AWS Management Console only from a defined set of corporate IP addresses. Restrict user access based on job responsibility by using IAM policies and roles.
C. Configure AWS Site-to-Site VPN to connect to the VPC. Configure route table entries to direct traffic from on premises to the VPC. Configure instance security groups and network ACLs to allow only required traffic from on premises.
D. Configure AWS Transit Gateway to connect to the VPC. Configure route table entries to direct traffic from on premises to the VPC. Configure instance security groups and network ACLs to allow only required traffic from on premises.

Question # 37

A company currently stores 5 TB of data in on-premises block storage systems. The company's current storage solution provides limited space for additional data. The company runs applications on premises that must be able to retrieve frequently accessed data with low latency. The company requires a cloud-based storage solution. Which solution will meet these requirements with the MOST operational efficiency?

A. Use Amazon S3 File Gateway. Integrate S3 File Gateway with the on-premises applications to store and directly retrieve files by using the SMB file system.
B. Use an AWS Storage Gateway Volume Gateway with cached volumes as iSCSI targets.
C. Use an AWS Storage Gateway Volume Gateway with stored volumes as iSCSI targets.
D. Use an AWS Storage Gateway Tape Gateway. Integrate Tape Gateway with the on-premises applications to store virtual tapes in Amazon S3.

Question # 38

A company has a three-tier web application that processes orders from customers. The web tier consists of Amazon EC2 instances behind an Application Load Balancer. The processing tier consists of EC2 instances. The company decoupled the web tier and processing tier by using Amazon Simple Queue Service (Amazon SQS). The storage layer uses Amazon DynamoDB. At peak times, some users report order processing delays and halts. The company has noticed that during these delays, the EC2 instances are running at 100% CPU usage, and the SQS queue fills up. The peak times are variable and unpredictable. The company needs to improve the performance of the application. Which solution will meet these requirements?

A. Use scheduled scaling for Amazon EC2 Auto Scaling to scale out the processing tier instances for the duration of peak usage times. Use the CPU Utilization metric to determine when to scale.
B. Use Amazon ElastiCache for Redis in front of the DynamoDB backend tier. Use target utilization as a metric to determine when to scale.
C. Add an Amazon CloudFront distribution to cache the responses for the web tier. Use HTTP latency as a metric to determine when to scale.
D. Use an Amazon EC2 Auto Scaling target tracking policy to scale out the processing tier instances. Use the ApproximateNumberOfMessages attribute to determine when to scale.
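The pattern in option D is usually implemented as a target tracking policy on a backlog-per-instance metric: the queue's `ApproximateNumberOfMessagesVisible` divided by the number of running instances. A minimal sketch of that signal (the target value of 100 in the comment is an illustrative assumption):

```python
# Sketch of the scaling signal behind a target tracking policy on SQS
# backlog: approximate visible messages divided by running instances.
def backlog_per_instance(approx_messages, running_instances):
    if running_instances == 0:
        # With no instances running, treat the whole backlog as the signal.
        return float(approx_messages)
    return approx_messages / running_instances

# Example: 1200 queued messages across 4 instances is a backlog of 300
# per instance; against an assumed target of 100, the Auto Scaling group
# would scale out until the backlog falls back toward the target.
```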

Question # 39

A company runs database workloads on AWS that are the backend for the company's customer portals. The company runs a Multi-AZ database cluster on Amazon RDS for PostgreSQL. The company needs to implement a 30-day backup retention policy. The company currently has both automated RDS backups and manual RDS backups. The company wants to maintain both types of existing RDS backups that are less than 30 days old. Which solution will meet these requirements MOST cost-effectively?

A. Configure the RDS backup retention policy to 30 days for automated backups by using AWS Backup. Manually delete manual backups that are older than 30 days.
B. Disable RDS automated backups. Delete automated backups and manual backups that are older than 30 days. Configure the RDS backup retention policy to 30 days for automated backups.
C. Configure the RDS backup retention policy to 30 days for automated backups. Manually delete manual backups that are older than 30 days.
D. Disable RDS automated backups. Delete automated backups and manual backups that are older than 30 days automatically by using AWS CloudFormation. Configure the RDS backup retention policy to 30 days for automated backups.

Question # 40

A company is building an application on AWS that connects to an Amazon RDS database. The company wants to manage the application configuration and to securely store and retrieve credentials for the database and other services. Which solution will meet these requirements with the LEAST administrative overhead?

A. Use AWS AppConfig to store and manage the application configuration. Use AWS Secrets Manager to store and retrieve the credentials.
B. Use AWS Lambda to store and manage the application configuration. Use AWS Systems Manager Parameter Store to store and retrieve the credentials.
C. Use an encrypted application configuration file. Store the file in Amazon S3 for the application configuration. Create another S3 file to store and retrieve the credentials.
D. Use AWS AppConfig to store and manage the application configuration. Use Amazon RDS to store and retrieve the credentials.
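The Secrets Manager side of option A typically looks like the sketch below: the application fetches a JSON secret and parses out the database fields. The client is injected so the flow can be shown without AWS access; the secret name and field names are placeholders following the common layout for RDS secrets.

```python
# Sketch: read database credentials stored as a JSON secret in
# AWS Secrets Manager. Client injected for testability; in practice
# pass boto3.client("secretsmanager"). Secret name/fields are placeholders.
import json

def get_db_credentials(secrets_client, secret_id="prod/app/db"):
    response = secrets_client.get_secret_value(SecretId=secret_id)
    secret = json.loads(response["SecretString"])
    return secret["username"], secret["password"]
```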

Question # 41

A company is planning to migrate a legacy application to AWS. The application currently uses NFS to communicate with an on-premises storage solution to store application data. The application cannot be modified to use any communication protocol other than NFS for this purpose. Which storage solution should a solutions architect recommend for use after the migration?

A. AWS DataSync
B. Amazon Elastic Block Store (Amazon EBS)
C. Amazon Elastic File System (Amazon EFS)
D. Amazon EMR File System (Amazon EMRFS)

Question # 42

A financial services company plans to launch a new application on AWS to handle sensitive financial transactions. The company will deploy the application on Amazon EC2 instances. The company will use Amazon RDS for MySQL as the database. The company's security policies mandate that data must be encrypted at rest and in transit. Which solution will meet these requirements with the LEAST operational overhead?

A. Configure encryption at rest for Amazon RDS for MySQL by using AWS KMS managed keys. Configure AWS Certificate Manager (ACM) SSL/TLS certificates for encryption in transit.
B. Configure encryption at rest for Amazon RDS for MySQL by using AWS KMS managed keys. Configure IPsec tunnels for encryption in transit.
C. Implement third-party application-level data encryption before storing data in Amazon RDS for MySQL. Configure AWS Certificate Manager (ACM) SSL/TLS certificates for encryption in transit.
D. Configure encryption at rest for Amazon RDS for MySQL by using AWS KMS managed keys. Configure a VPN connection to enable private connectivity to encrypt data in transit.

Question # 43

A company's application is deployed on Amazon EC2 instances and uses AWS Lambda functions for an event-driven architecture. The company uses nonproduction development environments in a different AWS account to test new features before the company deploys the features to production. The production instances show constant usage because of customers in different time zones. The company uses nonproduction instances only during business hours on weekdays. The company does not use the nonproduction instances on the weekends. The company wants to optimize the costs to run its application on AWS. Which solution will meet these requirements MOST cost-effectively?

A. Use On-Demand Instances for the production instances. Use Dedicated Hosts for the nonproduction instances on weekends only.
B. Use Reserved Instances for the production instances and the nonproduction instances. Shut down the nonproduction instances when not in use.
C. Use Compute Savings Plans for the production instances. Use On-Demand Instances for the nonproduction instances. Shut down the nonproduction instances when not in use.
D. Use Dedicated Hosts for the production instances. Use EC2 Instance Savings Plans for the nonproduction instances.

Question # 44

A company recently migrated a monolithic application to an Amazon EC2 instance and Amazon RDS. The application has tightly coupled modules. The existing design of the application gives the application the ability to run on only a single EC2 instance. The company has noticed high CPU utilization on the EC2 instance during peak usage times. The high CPU utilization corresponds to degraded performance on Amazon RDS for read requests. The company wants to reduce the high CPU utilization and improve read request performance. Which solution will meet these requirements?

A. Resize the EC2 instance to an EC2 instance type that has more CPU capacity. Configure an Auto Scaling group with a minimum and maximum size of 1. Configure an RDS read replica for read requests.
B. Resize the EC2 instance to an EC2 instance type that has more CPU capacity. Configure an Auto Scaling group with a minimum and maximum size of 1. Add an RDS read replica and redirect all read/write traffic to the replica.
C. Configure an Auto Scaling group with a minimum size of 1 and maximum size of 2. Resize the RDS DB instance to an instance type that has more CPU capacity.
D. Resize the EC2 instance to an EC2 instance type that has more CPU capacity. Configure an Auto Scaling group with a minimum and maximum size of 1. Resize the RDS DB instance to an instance type that has more CPU capacity.

Question # 45

A company runs multiple workloads on virtual machines (VMs) in an on-premises data center. The company is expanding rapidly. The on-premises data center is not able to scale fast enough to meet business needs. The company wants to migrate the workloads to AWS. The migration is time sensitive. The company wants to use a lift-and-shift strategy for noncritical workloads. Which combination of steps will meet these requirements? (Select THREE.)

A. Use the AWS Schema Conversion Tool (AWS SCT) to collect data about the VMs.
B. Use AWS Application Migration Service. Install the AWS Replication Agent on the VMs.
C. Complete the initial replication of the VMs. Launch test instances to perform acceptance tests on the VMs.
D. Stop all operations on the VMs. Launch a cutover instance.
E. Use AWS App2Container (A2C) to collect data about the VMs.
F. Use AWS Database Migration Service (AWS DMS) to migrate the VMs.

Question # 46

An ecommerce company runs its application on AWS. The application uses an Amazon Aurora PostgreSQL cluster in Multi-AZ mode for the underlying database. During a recent promotional campaign, the application experienced heavy read load and write load. Users experienced timeout issues when they attempted to access the application. A solutions architect needs to make the application architecture more scalable and highly available. Which solution will meet these requirements with the LEAST downtime?

A. Create an Amazon EventBridge rule that has the Aurora cluster as a source. Create an AWS Lambda function to log the state change events of the Aurora cluster. Add the Lambda function as a target for the EventBridge rule. Add additional reader nodes to fail over to.
B. Modify the Aurora cluster and activate the zero-downtime restart (ZDR) feature. Use Database Activity Streams on the cluster to track the cluster status.
C. Add additional reader instances to the Aurora cluster. Create an Amazon RDS Proxy target group for the Aurora cluster.
D. Create an Amazon ElastiCache for Redis cache. Replicate data from the Aurora cluster to Redis by using AWS Database Migration Service (AWS DMS) with a write-around approach.

Question # 47

A company runs an AWS Lambda function in private subnets in a VPC. The subnets have a default route to the internet through an Amazon EC2 NAT instance. The Lambda function processes input data and saves its output as an object to Amazon S3. Intermittently, the Lambda function times out while trying to upload the object because of saturated traffic on the NAT instance's network. The company wants to access Amazon S3 without traversing the internet. Which solution will meet these requirements?

A. Replace the EC2 NAT instance with an AWS managed NAT gateway.
B. Increase the size of the EC2 NAT instance in the VPC to a network optimized instance type.
C. Provision a gateway endpoint for Amazon S3 in the VPC. Update the route tables of the subnets accordingly.
D. Provision a transit gateway. Place transit gateway attachments in the private subnets where the Lambda function is running.

Question # 48

A solutions architect is designing an asynchronous application to process credit card data validation requests for a bank. The application must be secure and be able to process each request at least once. Which solution will meet these requirements MOST cost-effectively?

A. Use AWS Lambda event source mapping. Set Amazon Simple Queue Service (Amazon SQS) standard queues as the event source. Use AWS Key Management Service keys (SSE-KMS) for encryption. Add the kms:Decrypt permission for the Lambda execution role.
B. Use AWS Lambda event source mapping. Use Amazon Simple Queue Service (Amazon SQS) FIFO queues as the event source. Use SQS managed encryption keys (SSE-SQS) for encryption. Add the encryption key invocation permission for the Lambda function.
C. Use AWS Lambda event source mapping. Set Amazon Simple Queue Service (Amazon SQS) FIFO queues as the event source. Use AWS KMS keys (SSE-KMS). Add the kms:Decrypt permission for the Lambda execution role.
D. Use AWS Lambda event source mapping. Set Amazon Simple Queue Service (Amazon SQS) standard queues as the event source. Use AWS KMS keys (SSE-KMS) for encryption. Add the encryption key invocation permission for the Lambda function.
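The kms:Decrypt permission mentioned in options A and C is an IAM statement attached to the Lambda execution role, scoped to the key that encrypts the queue. A minimal sketch (the key ARN is a placeholder):

```python
# Sketch of the IAM statement a Lambda execution role needs so the
# SQS event source mapping can read messages from a queue encrypted
# with a customer managed KMS key. The key ARN is a placeholder.
def lambda_kms_decrypt_statement(
    key_arn="arn:aws:kms:us-east-1:111122223333:key/example",
):
    return {
        "Effect": "Allow",
        "Action": "kms:Decrypt",
        "Resource": key_arn,
    }
```

With SQS managed keys (SSE-SQS), no such statement is needed, which is part of why the SSE-SQS options are cheaper to operate.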

Question # 49

A company hosts an application on Amazon EC2 On-Demand Instances in an Auto Scaling group. Application peak hours occur at the same time each day. Application users report slow application performance at the start of peak hours. The application performs normally 2-3 hours after peak hours begin. The company wants to ensure that the application works properly at the start of peak hours. Which solution will meet these requirements?

A. Configure an Application Load Balancer to distribute traffic properly to the instances.
B. Configure a dynamic scaling policy for the Auto Scaling group to launch new instances based on memory utilization.
C. Configure a dynamic scaling policy for the Auto Scaling group to launch new instances based on CPU utilization.
D. Configure a scheduled scaling policy for the Auto Scaling group to launch new instances before peak hours.
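Option D maps to an Auto Scaling scheduled action that raises capacity shortly before the known peak each weekday. A minimal sketch of the parameters (group name, capacity values, and the 07:30 UTC recurrence are illustrative placeholders):

```python
# Sketch of an Auto Scaling scheduled action that scales out before a
# predictable daily peak. Group name, sizes, and the recurrence time
# are illustrative placeholders.
def pre_peak_scheduled_action(group="web-asg", desired=8):
    return {
        "AutoScalingGroupName": group,
        "ScheduledActionName": "scale-out-before-peak",
        "Recurrence": "30 7 * * MON-FRI",  # cron format, evaluated in UTC by default
        "MinSize": desired,
        "DesiredCapacity": desired,
    }
```

Scheduled scaling fits here because dynamic policies only react after load arrives, which is exactly the 2-3 hour lag the users are reporting.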

Question # 50

A company needs a solution to prevent AWS CloudFormation stacks from deploying AWS Identity and Access Management (IAM) resources that include an inline policy or "*" in the statement. The solution must also prohibit deployment of Amazon EC2 instances with public IP addresses. The company has AWS Control Tower enabled in its organization in AWS Organizations. Which solution will meet these requirements?

A. Use AWS Control Tower proactive controls to block deployment of EC2 instances with public IP addresses and inline policies with elevated access or "*".
B. Use AWS Control Tower detective controls to block deployment of EC2 instances with public IP addresses and inline policies with elevated access or "*".
C. Use AWS Config to create rules for EC2 and IAM compliance. Configure the rules to run an AWS Systems Manager Session Manager automation to delete a resource when it is not compliant.
D. Use a service control policy (SCP) to block actions for the EC2 instances and IAM resources if the actions lead to noncompliance.

Question # 51

A company is migrating a document management application to AWS. The application runs on Linux servers. The company will migrate the application to Amazon EC2 instances in an Auto Scaling group. The company stores 7 TiB of documents in a shared storage file system. An external relational database tracks the documents. Documents are stored once and can be retrieved multiple times for reference at any time. The company cannot modify the application during the migration. The storage solution must be highly available and must support scaling over time. Which solution will meet these requirements MOST cost-effectively?

A. Deploy an EC2 instance with enhanced networking as a shared NFS storage system. Export the NFS share. Mount the NFS share on the EC2 instances in the Auto Scaling group.
B. Create an Amazon S3 bucket that uses the S3 Standard-Infrequent Access (S3 Standard-IA) storage class. Mount the S3 bucket on the EC2 instances in the Auto Scaling group.
C. Deploy an SFTP server endpoint by using AWS Transfer for SFTP and an Amazon S3 bucket. Configure the EC2 instances in the Auto Scaling group to connect to the SFTP server.
D. Create an Amazon Elastic File System (Amazon EFS) file system with mount targets in multiple Availability Zones. Use the EFS Standard-Infrequent Access (Standard-IA) storage class. Mount the NFS share on the EC2 instances in the Auto Scaling group.

Question # 52

A company is migrating five on-premises applications to VPCs in the AWS Cloud. Each application is currently deployed in isolated virtual networks on premises and should be deployed similarly in the AWS Cloud. The applications need to reach a shared services VPC. All the applications must be able to communicate with each other. If the migration is successful, the company will repeat the migration process for more than 100 applications. Which solution will meet these requirements with the LEAST administrative overhead?

A. Deploy software VPN tunnels between the application VPCs and the shared services VPC. Add routes between the application VPCs in their subnets to the shared services VPC.
B. Deploy VPC peering connections between the application VPCs and the shared services VPC. Add routes between the application VPCs in their subnets to the shared services VPC through the peering connection.
C. Deploy an AWS Direct Connect connection between the application VPCs and the shared services VPC. Add routes from the application VPCs in their subnets to the shared services VPC and the application VPCs. Add routes from the shared services VPC subnets to the application VPCs.
D. Deploy a transit gateway with associations between the transit gateway and the application VPCs and the shared services VPC. Add routes between the application VPCs in their subnets and the application VPCs to the shared services VPC through the transit gateway.

Question # 53

A company uses an Amazon CloudFront distribution to serve content pages for its website. The company needs to ensure that clients use a TLS certificate when accessing the company's website. The company wants to automate the creation and renewal of the TLS certificates. Which solution will meet these requirements with the MOST operational efficiency?

A. Use a CloudFront security policy to create a certificate.
B. Use a CloudFront origin access control (OAC) to create a certificate.
C. Use AWS Certificate Manager (ACM) to create a certificate. Use DNS validation for the domain.
D. Use AWS Certificate Manager (ACM) to create a certificate. Use email validation for the domain.
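Option C maps to ACM's `RequestCertificate` call with DNS validation; as long as the validation CNAME record stays in place, ACM renews the certificate automatically, which is what makes it more hands-off than email validation. A minimal sketch of the parameters (the domain is a placeholder):

```python
# Sketch of the parameters for ACM's RequestCertificate call using
# DNS validation, which enables automatic renewal. Domain is a placeholder.
def request_tls_certificate(domain="www.example.com"):
    return {
        "DomainName": domain,
        "ValidationMethod": "DNS",  # email validation would require manual renewal approval
    }
```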

Question # 54

A company uses Amazon RDS with default backup settings for its database tier. The company needs to make a daily backup of the database to meet regulatory requirements. The company must retain the backups for 30 days. Which solution will meet these requirements with the LEAST operational overhead?

A. Write an AWS Lambda function to create an RDS snapshot every day.
B. Modify the RDS database to have a retention period of 30 days for automated backups.
C. Use AWS Systems Manager Maintenance Windows to modify the RDS backup retention period.
D. Create a manual snapshot every day by using the AWS CLI. Modify the RDS backup retention period.

Question # 55

A company runs its application on Oracle Database Enterprise Edition. The company needs to migrate the application and the database to AWS. The company can use the Bring Your Own License (BYOL) model while migrating to AWS. The application uses third-party database features that require privileged access. A solutions architect must design a solution for the database migration. Which solution will meet these requirements MOST cost-effectively?

A. Migrate the database to Amazon RDS for Oracle by using native tools. Replace the third-party features with AWS Lambda.
B. Migrate the database to Amazon RDS Custom for Oracle by using native tools. Customize the new database settings to support the third-party features.
C. Migrate the database to Amazon DynamoDB by using AWS Database Migration Service (AWS DMS). Customize the new database settings to support the third-party features.
D. Migrate the database to Amazon RDS for PostgreSQL by using AWS Database Migration Service (AWS DMS). Rewrite the application code to remove the dependency on third-party features.

Question # 56

A company stores several petabytes of data across multiple AWS accounts. The company uses AWS Lake Formation to manage its data lake. The company's data science team wants to securely share selective data from its accounts with the company's engineering team for analytical purposes. Which solution will meet these requirements with the LEAST operational overhead?

A. Copy the required data to a common account. Create an IAM access role in that account. Grant access by specifying a permission policy that includes users from the engineering team accounts as trusted entities.
B. Use the Lake Formation permissions GRANT command in each account where the data is stored to allow the required engineering team users to access the data.
C. Use AWS Data Exchange to privately publish the required data to the required engineering team accounts.
D. Use Lake Formation tag-based access control to authorize and grant cross-account permissions for the required data to the engineering team accounts.

Question # 57

A company has an on-premises SFTP file transfer solution. The company is migrating to the AWS Cloud to scale the file transfer solution and to optimize costs by using Amazon S3. The company's employees will use their credentials for the on-premises Microsoft Active Directory (AD) to access the new solution. The company wants to keep the current authentication and file access mechanisms. Which solution will meet these requirements with the LEAST operational overhead?

A. Configure an S3 File Gateway. Create SMB file shares on the file gateway that use the existing Active Directory to authenticate.
B. Configure an Auto Scaling group with Amazon EC2 instances to run an SFTP solution. Configure the group to scale up at 60% CPU utilization.
C. Create an AWS Transfer Family server with SFTP endpoints. Choose the AWS Directory Service option as the identity provider. Use AD Connector to connect the on-premises Active Directory.
D. Create an AWS Transfer Family SFTP endpoint. Configure the endpoint to use the AWS Directory Service option as the identity provider to connect to the existing Active Directory.

Question # 58

A video game company is deploying a new gaming application to its global users. The company requires a solution that will provide near real-time reviews and rankings of the players. A solutions architect must design a solution to provide fast access to the data. The solution must also ensure the data persists on disks in the event that the company restarts the application. Which solution will meet these requirements with the LEAST operational overhead?

A. Configure an Amazon CloudFront distribution with an Amazon S3 bucket as the origin. Store the player data in the S3 bucket.
B. Create Amazon EC2 instances in multiple AWS Regions. Store the player data on the EC2 instances. Configure Amazon Route 53 with geolocation records to direct users to the closest EC2 instance.
C. Deploy an Amazon ElastiCache for Redis cluster. Store the player data in the ElastiCache cluster.
D. Deploy an Amazon ElastiCache for Memcached cluster. Store the player data in the ElastiCache cluster.
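As context for options C and D: Redis supports disk persistence (RDB snapshots and AOF), while Memcached is purely in-memory, and Redis sorted sets map naturally onto rankings. A plain-Python sketch of the sorted-set operations (ZADD/ZREVRANGE) an ElastiCache for Redis leaderboard would use; the player names and scores are made up:

```python
# Plain-Python model of Redis sorted-set semantics for a leaderboard.
# In Redis these would be ZADD (upsert a score) and ZREVRANGE (top-N query).
scores = {}

def zadd(member, score):
    """Upsert a player's score, as ZADD does."""
    scores[member] = score

def zrevrange(start, stop):
    """Return members ranked by score, highest first (inclusive bounds)."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    return ranked[start:stop + 1]

zadd("alice", 300)
zadd("bob", 450)
zadd("carol", 120)
top_two = zrevrange(0, 1)  # the two highest-scoring players
```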

Question # 59

A company is running a highly sensitive application on Amazon EC2 backed by an Amazon RDS database. Compliance regulations mandate that all personally identifiable information (PII) be encrypted at rest. Which solution should a solutions architect recommend to meet this requirement with the LEAST amount of changes to the infrastructure?

A. Deploy AWS Certificate Manager to generate certificates. Use the certificates to encrypt the database volume.
B. Deploy AWS CloudHSM, generate encryption keys, and use the keys to encrypt database volumes.
C. Configure SSL encryption using AWS Key Management Service (AWS KMS) keys to encrypt database volumes.
D. Configure Amazon Elastic Block Store (Amazon EBS) encryption and Amazon RDS encryption with AWS Key Management Service (AWS KMS) keys to encrypt instance and database volumes.

Question # 60

A company that uses AWS Organizations runs 150 applications across 30 different AWS accounts. The company used AWS Cost and Usage Report to create a new report in the management account. The report is delivered to an Amazon S3 bucket that is replicated to a bucket in the data collection account. The company's senior leadership wants to view a custom dashboard that provides NAT gateway costs each day starting at the beginning of the current month. Which solution will meet these requirements?

A. Share an Amazon QuickSight dashboard that includes the requested table visual. Configure QuickSight to use AWS DataSync to query the new report.
B. Share an Amazon QuickSight dashboard that includes the requested table visual. Configure QuickSight to use Amazon Athena to query the new report.
C. Share an Amazon CloudWatch dashboard that includes the requested table visual. Configure CloudWatch to use AWS DataSync to query the new report.
D. Share an Amazon CloudWatch dashboard that includes the requested table visual. Configure CloudWatch to use Amazon Athena to query the new report.

Question # 61

A company runs containers in a Kubernetes environment in the company's local data center. The company wants to use Amazon Elastic Kubernetes Service (Amazon EKS) and other AWS managed services. Data must remain locally in the company's data center and cannot be stored in any remote site or cloud to maintain compliance. Which solution will meet these requirements?

A. Deploy AWS Local Zones in the company's data center.
B. Use an AWS Snowmobile in the company's data center.
C. Install an AWS Outposts rack in the company's data center.
D. Install an AWS Snowball Edge Storage Optimized node in the data center.

Question # 62

A company runs a self-managed Microsoft SQL Server on Amazon EC2 instances and Amazon Elastic Block Store (Amazon EBS). Daily snapshots are taken of the EBS volumes. Recently, all the company's EBS snapshots were accidentally deleted while running a snapshot cleaning script that deletes all expired EBS snapshots. A solutions architect needs to update the architecture to prevent data loss without retaining EBS snapshots indefinitely. Which solution will meet these requirements with the LEAST development effort?

A. Change the IAM policy of the user to deny EBS snapshot deletion.
B. Copy the EBS snapshots to another AWS Region after completing the snapshots daily.
C. Create a 7-day EBS snapshot retention rule in Recycle Bin and apply the rule for all snapshots.
D. Copy EBS snapshots to Amazon S3 Standard-Infrequent Access (S3 Standard-IA).
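Option C's rule can be created with the Recycle Bin CreateRule API (boto3 client rbin); snapshots deleted while a matching rule exists are retained and recoverable for the rule's period instead of being destroyed. A sketch of the parameters:

```python
# Parameters for Recycle Bin's CreateRule API (boto3: rbin.create_rule).
params = {
    "ResourceType": "EBS_SNAPSHOT",
    "RetentionPeriod": {
        "RetentionPeriodValue": 7,
        "RetentionPeriodUnit": "DAYS",
    },
    "Description": "Keep deleted EBS snapshots recoverable for 7 days",
    # With no ResourceTags filter, the rule applies to all EBS snapshots
    # in the Region.
}
```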

Question # 63

A social media company has workloads that collect and process data. The workloads store the data in on-premises NFS storage. The data store cannot scale fast enough to meet the company's expanding business needs. The company wants to migrate the current data store to AWS. Which solution will meet these requirements MOST cost-effectively?

A. Set up an AWS Storage Gateway Volume Gateway. Use an Amazon S3 Lifecycle policy to transition the data to the appropriate storage class.
B. Set up an AWS Storage Gateway Amazon S3 File Gateway. Use an Amazon S3 Lifecycle policy to transition the data to the appropriate storage class.
C. Use the Amazon Elastic File System (Amazon EFS) Standard-Infrequent Access (Standard-IA) storage class. Activate the infrequent access lifecycle policy.
D. Use the Amazon Elastic File System (Amazon EFS) One Zone-Infrequent Access (One Zone-IA) storage class. Activate the infrequent access lifecycle policy.

Question # 64

A company uses Amazon FSx for NetApp ONTAP in its primary AWS Region for CIFS and NFS file shares. Applications that run on Amazon EC2 instances access the file shares. The company needs a storage disaster recovery (DR) solution in a secondary Region. The data that is replicated in the secondary Region needs to be accessed by using the same protocols as the primary Region. Which solution will meet these requirements with the LEAST operational overhead?

A. Create an AWS Lambda function to copy the data to an Amazon S3 bucket. Replicate the S3 bucket to the secondary Region.
B. Create a backup of the FSx for ONTAP volumes by using AWS Backup. Copy the volumes to the secondary Region. Create a new FSx for ONTAP instance from the backup.
C. Create an FSx for ONTAP instance in the secondary Region. Use NetApp SnapMirror to replicate data from the primary Region to the secondary Region.
D. Create an Amazon Elastic File System (Amazon EFS) volume. Migrate the current data to the volume. Replicate the volume to the secondary Region.

Question # 65

A company has migrated a fleet of hundreds of on-premises virtual machines (VMs) to Amazon EC2 instances. The instances run a diverse fleet of Windows Server versions along with several Linux distributions. The company wants a solution that will automate inventory and updates of the operating systems. The company also needs a summary of common vulnerabilities of each instance for regular monthly reviews. What should a solutions architect recommend to meet these requirements?

A. Set up AWS Systems Manager Patch Manager to manage all the EC2 instances. Configure AWS Security Hub to produce monthly reports.
B. Set up AWS Systems Manager Patch Manager to manage all the EC2 instances. Deploy Amazon Inspector, and configure monthly reports.
C. Set up AWS Shield Advanced, and configure monthly reports. Deploy AWS Config to automate patch installations on the EC2 instances.
D. Set up Amazon GuardDuty in the account to monitor all EC2 instances. Deploy AWS Config to automate patch installations on the EC2 instances.

Question # 66

A large international university has deployed all of its compute services in the AWS Cloud. These services include Amazon EC2, Amazon RDS, and Amazon DynamoDB. The university currently relies on many custom scripts to back up its infrastructure. However, the university wants to centralize management and automate data backups as much as possible by using AWS native options. Which solution will meet these requirements?

A. Use third-party backup software with an AWS Storage Gateway tape gateway virtual tape library.
B. Use AWS Backup to configure and monitor all backups for the services in use.
C. Use AWS Config to set lifecycle management to take snapshots of all data sources on a schedule.
D. Use AWS Systems Manager State Manager to manage the configuration and monitoring of backup tasks.

Question # 67

A company runs a critical data analysis job each week before the first day of the work week. The job requires at least 1 hour to complete the analysis. The job is stateful and cannot tolerate interruptions. The company needs a solution to run the job on AWS. Which solution will meet these requirements?

A. Create a container for the job. Schedule the job to run as an AWS Fargate task on an Amazon Elastic Container Service (Amazon ECS) cluster by using Amazon EventBridge Scheduler.
B. Configure the job to run in an AWS Lambda function. Create a scheduled rule in Amazon EventBridge to invoke the Lambda function.
C. Configure an Auto Scaling group of Amazon EC2 Spot Instances that run Amazon Linux. Configure a crontab entry on the instances to run the analysis.
D. Configure an AWS DataSync task to run the job. Configure a cron expression to run the task on a schedule.

Question # 68

A company has several on-premises Internet Small Computer Systems Interface (iSCSI) network storage servers. The company wants to reduce the number of these servers by moving to the AWS Cloud. A solutions architect must provide low-latency access to frequently used data and reduce the dependency on on-premises servers with a minimal number of infrastructure changes. Which solution will meet these requirements?

A. Deploy an Amazon S3 File Gateway.
B. Deploy Amazon Elastic Block Store (Amazon EBS) storage with backups to Amazon S3.
C. Deploy an AWS Storage Gateway volume gateway that is configured with stored volumes.
D. Deploy an AWS Storage Gateway volume gateway that is configured with cached volumes.

Question # 69

A company uses GPS trackers to document the migration patterns of thousands of sea turtles. The trackers check every 5 minutes to see if a turtle has moved more than 100 yards (91.4 meters). If a turtle has moved, its tracker sends the new coordinates to a web application running on three Amazon EC2 instances that are in multiple Availability Zones in one AWS Region. Recently, the web application was overwhelmed while processing an unexpected volume of tracker data. Data was lost with no way to replay the events. A solutions architect must prevent this problem from happening again and needs a solution with the least operational overhead. What should the solutions architect do to meet these requirements?

A. Create an Amazon S3 bucket to store the data. Configure the application to scan for new data in the bucket for processing.
B. Create an Amazon API Gateway endpoint to handle transmitted location coordinates. Use an AWS Lambda function to process each item concurrently.
C. Create an Amazon Simple Queue Service (Amazon SQS) queue to store the incoming data. Configure the application to poll for new messages for processing.
D. Create an Amazon DynamoDB table to store transmitted location coordinates. Configure the application to query the table for new data for processing. Use TTL to remove data that has been processed.
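For context on option C, a queue buffers bursts so that a slow consumer delays processing instead of dropping events, and unacknowledged messages are redelivered. A sketch of the SQS producer and consumer parameters (via boto3's sqs client; the queue URL, tracker ID, and coordinates are placeholders):

```python
import json

# Shape of an SQS SendMessage call (boto3: sqs.send_message).
# The queue URL, tracker ID, and coordinates are placeholders.
send_params = {
    "QueueUrl": "https://sqs.us-east-1.amazonaws.com/123456789012/turtle-updates",
    "MessageBody": json.dumps({"tracker": "T-001", "lat": 25.76, "lon": -80.19}),
}

# Shape of the matching ReceiveMessage call (boto3: sqs.receive_message).
# The consumer deletes a message only after processing succeeds, so a crash
# means redelivery rather than data loss.
receive_params = {
    "QueueUrl": send_params["QueueUrl"],
    "MaxNumberOfMessages": 10,
    "WaitTimeSeconds": 20,  # long polling reduces empty responses
}
```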

Question # 70

A company is designing a new multi-tier web application that consists of the following components:
• Web and application servers that run on Amazon EC2 instances as part of Auto Scaling groups
• An Amazon RDS DB instance for data storage
A solutions architect needs to limit access to the application servers so that only the web servers can access them. Which solution will meet these requirements?

A. Deploy AWS PrivateLink in front of the application servers. Configure the network ACL to allow only the web servers to access the application servers.
B. Deploy a VPC endpoint in front of the application servers. Configure the security group to allow only the web servers to access the application servers.
C. Deploy a Network Load Balancer with a target group that contains the application servers' Auto Scaling group. Configure the network ACL to allow only the web servers to access the application servers.
D. Deploy an Application Load Balancer with a target group that contains the application servers' Auto Scaling group. Configure the security group to allow only the web servers to access the application servers.

Question # 71

A company has an Amazon S3 data lake. The company needs a solution that transforms the data from the data lake and loads the data into a data warehouse every day. The data warehouse must have massively parallel processing (MPP) capabilities. Data analysts then need to create and train machine learning (ML) models by using SQL commands on the data. The solution must use serverless AWS services wherever possible. Which solution will meet these requirements?

A. Run a daily Amazon EMR job to transform the data and load the data into Amazon Redshift. Use Amazon Redshift ML to create and train the ML models.
B. Run a daily Amazon EMR job to transform the data and load the data into Amazon Aurora Serverless. Use Amazon Aurora ML to create and train the ML models.
C. Run a daily AWS Glue job to transform the data and load the data into Amazon Redshift Serverless. Use Amazon Redshift ML to create and train the ML models.
D. Run a daily AWS Glue job to transform the data and load the data into Amazon Athena tables. Use Amazon Athena ML to create and train the ML models.

Question # 72

A content management system runs on Amazon EC2 instances behind an Application Load Balancer (ALB). The EC2 instances run in an Auto Scaling group across multiple Availability Zones. Users are constantly adding and updating files, blogs, and other website assets in the content management system. A solutions architect must implement a solution in which all the EC2 instances share up-to-date website content with the least possible lag time. Which solution meets these requirements?

A. Update the EC2 user data in the Auto Scaling group lifecycle policy to copy the website assets from the EC2 instance that was launched most recently. Configure the ALB to make changes to the website assets only in the newest EC2 instance.
B. Copy the website assets to an Amazon Elastic File System (Amazon EFS) file system. Configure each EC2 instance to mount the EFS file system locally. Configure the website hosting application to reference the website assets that are stored in the EFS file system.
C. Copy the website assets to an Amazon S3 bucket. Ensure that each EC2 instance downloads the website assets from the S3 bucket to the attached Amazon Elastic Block Store (Amazon EBS) volume. Run the S3 sync command once each hour to keep files up to date.
D. Restore an Amazon Elastic Block Store (Amazon EBS) snapshot with the website assets. Attach the EBS snapshot as a secondary EBS volume when a new EC2 instance is launched. Configure the website hosting application to reference the website assets that are stored in the secondary EBS volume.

Question # 73

A company wants to add its existing AWS usage cost to its operation cost dashboard. A solutions architect needs to recommend a solution that will give the company access to its usage cost programmatically. The company must be able to access cost data for the current year and forecast costs for the next 12 months. Which solution will meet these requirements with the LEAST operational overhead?

A. Access usage cost-related data by using the AWS Cost Explorer API with pagination.
B. Access usage cost-related data by using downloadable AWS Cost Explorer report .csv files.
C. Configure AWS Budgets actions to send usage cost data to the company through FTP.
D. Create AWS Budgets reports for usage cost data. Send the data to the company through SMTP.

Question # 74

A company runs an application in a VPC with public and private subnets. The VPC extends across multiple Availability Zones. The application runs on Amazon EC2 instances in private subnets. The application uses an Amazon Simple Queue Service (Amazon SQS) queue. A solutions architect needs to design a secure solution to establish a connection between the EC2 instances and the SQS queue. Which solution will meet these requirements?

A. Implement an interface VPC endpoint for Amazon SQS. Configure the endpoint to use the private subnets. Add to the endpoint a security group that has an inbound access rule that allows traffic from the EC2 instances that are in the private subnets.
B. Implement an interface VPC endpoint for Amazon SQS. Configure the endpoint to use the public subnets. Attach to the interface endpoint a VPC endpoint policy that allows access from the EC2 instances that are in the private subnets.
C. Implement an interface VPC endpoint for Amazon SQS. Configure the endpoint to use the public subnets. Attach an Amazon SQS access policy to the interface VPC endpoint that allows requests from only a specified VPC endpoint.
D. Implement a gateway endpoint for Amazon SQS. Add a NAT gateway to the private subnets. Attach an IAM role to the EC2 instances that allows access to the SQS queue.
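Background for these options: gateway endpoints exist only for Amazon S3 and DynamoDB, so SQS requires an interface endpoint (PrivateLink). A sketch of the EC2 CreateVpcEndpoint parameters for option A's layout (via boto3's ec2.create_vpc_endpoint; every resource ID is a placeholder):

```python
# Parameters for EC2 CreateVpcEndpoint (boto3: ec2.create_vpc_endpoint)
# creating an interface endpoint for SQS in the private subnets.
# All resource IDs are placeholders.
params = {
    "VpcEndpointType": "Interface",
    "VpcId": "vpc-0abc123",
    "ServiceName": "com.amazonaws.us-east-1.sqs",
    "SubnetIds": ["subnet-0priv-a", "subnet-0priv-b"],  # private subnets
    "SecurityGroupIds": ["sg-0endpoint"],  # inbound rule permits the instances
    "PrivateDnsEnabled": True,  # SQS hostname resolves to the endpoint ENIs
}
```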

Question # 75

A company has an internal application that runs on Amazon EC2 instances in an Auto Scaling group. The EC2 instances are compute optimized and use Amazon Elastic Block Store (Amazon EBS) volumes. The company wants to identify cost optimizations across the EC2 instances, the Auto Scaling group, and the EBS volumes. Which solution will meet these requirements with the MOST operational efficiency?

A. Create a new AWS Cost and Usage Report. Search the report for cost recommendations for the EC2 instances, the Auto Scaling group, and the EBS volumes.
B. Create new Amazon CloudWatch billing alerts. Check the alert statuses for cost recommendations for the EC2 instances, the Auto Scaling group, and the EBS volumes.
C. Configure AWS Compute Optimizer for cost recommendations for the EC2 instances, the Auto Scaling group, and the EBS volumes.
D. Configure AWS Compute Optimizer for cost recommendations for the EC2 instances. Create a new AWS Cost and Usage Report. Search the report for cost recommendations for the Auto Scaling group and the EBS volumes.

Question # 76

A company's near-real-time streaming application is running on AWS. As the data is ingested, a job runs on the data and takes 30 minutes to complete. The workload frequently experiences high latency due to large amounts of incoming data. A solutions architect needs to design a scalable and serverless solution to enhance performance. Which combination of steps should the solutions architect take? (Select TWO.)

A. Use Amazon Kinesis Data Firehose to ingest the data.
B. Use AWS Lambda with AWS Step Functions to process the data.
C. Use AWS Database Migration Service (AWS DMS) to ingest the data.
D. Use Amazon EC2 instances in an Auto Scaling group to process the data.
E. Use AWS Fargate with Amazon Elastic Container Service (Amazon ECS) to process the data.

Question # 77

A solutions architect is creating an application that will handle batch processing of large amounts of data. The input data will be held in Amazon S3 and the output data will be stored in a different S3 bucket. For processing, the application will transfer the data over the network between multiple Amazon EC2 instances. What should the solutions architect do to reduce the overall data transfer costs?

A. Place all the EC2 instances in an Auto Scaling group.
B. Place all the EC2 instances in the same AWS Region.
C. Place all the EC2 instances in the same Availability Zone.
D. Place all the EC2 instances in private subnets in multiple Availability Zones.

Question # 78

A company has two AWS accounts: Production and Development. The company needs to push code changes in the Development account to the Production account. In the alpha phase, only two senior developers on the development team need access to the Production account. In the beta phase, more developers will need access to perform testing. Which solution will meet these requirements?

A. Create two policy documents by using the AWS Management Console in each account. Assign the policy to developers who need access.
B. Create an IAM role in the Development account. Grant the IAM role access to the Production account. Allow developers to assume the role.
C. Create an IAM role in the Production account. Define a trust policy that specifies the Development account. Allow developers to assume the role.
D. Create an IAM group in the Production account. Add the group as a principal in a trust policy that specifies the Production account. Add developers to the group.

Question # 79

A company uses 50 TB of data for reporting. The company wants to move this data from on premises to AWS. A custom application in the company's data center runs a weekly data transformation job. The company plans to pause the application until the data transfer is complete and needs to begin the transfer process as soon as possible. The data center does not have any available network bandwidth for additional workloads. A solutions architect must transfer the data and must configure the transformation job to continue to run in the AWS Cloud. Which solution will meet these requirements with the LEAST operational overhead?

A. Use AWS DataSync to move the data. Create a custom transformation job by using AWS Glue.
B. Order an AWS Snowcone device to move the data. Deploy the transformation application to the device.
C. Order an AWS Snowball Edge Storage Optimized device. Copy the data to the device. Create a custom transformation job by using AWS Glue.
D. Order an AWS Snowball Edge Storage Optimized device that includes Amazon EC2 compute. Copy the data to the device. Create a new EC2 instance on AWS to run the transformation application.

Question # 80

A company serves its website by using an Auto Scaling group of Amazon EC2 instances in a single AWS Region. The website does not require a database. The company is expanding, and the company's engineering team deploys the website to a second Region. The company wants to distribute traffic across both Regions to accommodate growth and for disaster recovery purposes. The solution should not serve traffic from a Region in which the website is unhealthy. Which policy or resource should the company use to meet these requirements?

A. An Amazon Route 53 simple routing policy
B. An Amazon Route 53 multivalue answer routing policy
C. An Application Load Balancer in one Region with a target group that specifies the EC2 instance IDs from both Regions
D. An Application Load Balancer in one Region with a target group that specifies the IP addresses of the EC2 instances from both Regions

Question # 81

A company wants to build a logging solution for its multiple AWS accounts. The company currently stores the logs from all accounts in a centralized account. The company has created an Amazon S3 bucket in the centralized account to store the VPC flow logs and AWS CloudTrail logs. All logs must be highly available for 30 days for frequent analysis, retained for an additional 60 days for backup purposes, and deleted 90 days after creation. Which solution will meet these requirements MOST cost-effectively?

A. Transition objects to the S3 Standard storage class 30 days after creation. Write an expiration action that directs Amazon S3 to delete objects after 90 days.
B. Transition objects to the S3 Standard-Infrequent Access (S3 Standard-IA) storage class 30 days after creation. Move all objects to the S3 Glacier Flexible Retrieval storage class after 90 days. Write an expiration action that directs Amazon S3 to delete objects after 90 days.
C. Transition objects to the S3 Glacier Flexible Retrieval storage class 30 days after creation. Write an expiration action that directs Amazon S3 to delete objects after 90 days.
D. Transition objects to the S3 One Zone-Infrequent Access (S3 One Zone-IA) storage class 30 days after creation. Move all objects to the S3 Glacier Flexible Retrieval storage class after 90 days. Write an expiration action that directs Amazon S3 to delete objects after 90 days.
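The 30/60/90-day timeline in this question translates directly into an S3 Lifecycle configuration (boto3: s3.put_bucket_lifecycle_configuration). A sketch of one rule that transitions logs at day 30 and deletes them at day 90; the rule ID is arbitrary:

```python
# S3 Lifecycle configuration: Standard for the first 30 days, then
# Standard-IA until expiration at day 90.
lifecycle = {
    "Rules": [
        {
            "ID": "log-retention",  # arbitrary rule name
            "Status": "Enabled",
            "Filter": {"Prefix": ""},  # empty prefix: apply to all objects
            "Transitions": [
                {"Days": 30, "StorageClass": "STANDARD_IA"},
            ],
            "Expiration": {"Days": 90},
        }
    ]
}
```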

Question # 82

A company is hosting a high-traffic static website on Amazon S3 with an Amazon CloudFront distribution that has a default TTL of 0 seconds. The company wants to implement caching to improve performance for the website. However, the company also wants to ensure that stale content is not served for more than a few minutes after a deployment. Which combination of caching methods should a solutions architect implement to meet these requirements? (Select TWO.)

A. Set the CloudFront default TTL to 2 minutes.
B. Set a default TTL of 2 minutes on the S3 bucket.
C. Add a Cache-Control private directive to the objects in Amazon S3.
D. Create an AWS Lambda@Edge function to add an Expires header to HTTP responses. Configure the function to run on viewer response.
E. Add a Cache-Control max-age directive of 24 hours to the objects in Amazon S3. On deployment, create a CloudFront invalidation to clear any changed files from edge caches.
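Option E combines object metadata with an invalidation at deploy time. A sketch of both pieces (via boto3; the distribution ID and invalidation path are placeholders):

```python
import time

# 1) Cache-Control metadata set on each S3 object at upload time
#    (e.g. the CacheControl argument of s3.put_object).
cache_control = "max-age=86400"  # cache for 24 hours

# 2) Parameters for CloudFront CreateInvalidation
#    (boto3: cloudfront.create_invalidation), run after each deployment to
#    purge changed files from edge caches. The distribution ID is a placeholder.
invalidation_params = {
    "DistributionId": "E1EXAMPLE",
    "InvalidationBatch": {
        "Paths": {"Quantity": 1, "Items": ["/*"]},
        "CallerReference": str(time.time()),  # must be unique per request
    },
}
```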

Question # 83

A company needs to optimize the cost of its Amazon EC2 instances. The company also needs to change the type and family of its EC2 instances every 2-3 months. What should the company do to meet these requirements?

A. Purchase Partial Upfront Reserved Instances for a 3-year term.
B. Purchase a No Upfront Compute Savings Plan for a 1-year term.
C. Purchase All Upfront Reserved Instances for a 1-year term.
D. Purchase an All Upfront EC2 Instance Savings Plan for a 1-year term.

Question # 84

A company runs an application on Amazon EC2 instances in a private subnet. The application needs to store and retrieve data in Amazon S3 buckets. According to regulatory requirements, the data must not travel across the public internet. What should a solutions architect do to meet these requirements MOST cost-effectively?

A. Deploy a NAT gateway to access the S3 buckets.
B. Deploy AWS Storage Gateway to access the S3 buckets.
C. Deploy an S3 interface endpoint to access the S3 buckets.
D. Deploy an S3 gateway endpoint to access the S3 buckets.
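As background for options C and D: both endpoint types keep S3 traffic on the AWS network, but gateway endpoints for S3 carry no hourly or data processing charge, which is the cost distinction this question turns on. A sketch of the CreateVpcEndpoint parameters for a gateway endpoint (via boto3's ec2.create_vpc_endpoint; all resource IDs are placeholders):

```python
# Parameters for EC2 CreateVpcEndpoint (boto3: ec2.create_vpc_endpoint)
# creating a gateway endpoint for S3. All resource IDs are placeholders.
params = {
    "VpcEndpointType": "Gateway",
    "VpcId": "vpc-0abc123",
    "ServiceName": "com.amazonaws.us-east-1.s3",
    # Gateway endpoints work by adding a prefix-list route to route tables,
    # not by placing ENIs in subnets.
    "RouteTableIds": ["rtb-0private"],
}
```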

Question # 85

A company uses an Amazon S3 bucket as its data lake storage platform. The S3 bucket contains a massive amount of data that is accessed randomly by multiple teams and hundreds of applications. The company wants to reduce the S3 storage costs and provide immediate availability for frequently accessed objects. What is the MOST operationally efficient solution that meets these requirements?

A. Create an S3 Lifecycle rule to transition objects to the S3 Intelligent-Tiering storage class.
B. Store objects in Amazon S3 Glacier. Use S3 Select to provide applications with access to the data.
C. Use data from S3 storage class analysis to create S3 Lifecycle rules to automatically transition objects to the S3 Standard-Infrequent Access (S3 Standard-IA) storage class.
D. Transition objects to the S3 Standard-Infrequent Access (S3 Standard-IA) storage class. Create an AWS Lambda function to transition objects to the S3 Standard storage class when they are accessed by an application.

Question # 86

A company's SAP application has a backend SQL Server database in an on-premises environment. The company wants to migrate its on-premises application and database server to AWS. The company needs an instance type that meets the high demands of its SAP database. On-premises performance data shows that both the SAP application and the database have high memory utilization. Which solution will meet these requirements?

A. Use the compute optimized instance family for the application. Use the memory optimized instance family for the database.
B. Use the storage optimized instance family for both the application and the database.
C. Use the memory optimized instance family for both the application and the database.
D. Use the high performance computing (HPC) optimized instance family for the application. Use the memory optimized instance family for the database.

Question # 87

A company needs to design a hybrid network architecture. The company's workloads are currently stored in the AWS Cloud and in on-premises data centers. The workloads require single-digit latencies to communicate. The company uses an AWS Transit Gateway transit gateway to connect multiple VPCs. Which combination of steps will meet these requirements MOST cost-effectively? (Select TWO.)

A. Establish an AWS Site-to-Site VPN connection to each VPC.
B. Associate an AWS Direct Connect gateway with the transit gateway that is attached to the VPCs.
C. Establish an AWS Site-to-Site VPN connection to an AWS Direct Connect gateway.
D. Establish an AWS Direct Connect connection. Create a transit virtual interface (VIF) to a Direct Connect gateway.
E. Associate AWS Site-to-Site VPN connections with the transit gateway that is attached to the VPCs.

Question # 88

A company has an application that runs on Amazon EC2 instances in a private subnet. The application needs to process sensitive information from an Amazon S3 bucket. The application must not use the internet to connect to the S3 bucket. Which solution will meet these requirements?

A. Configure an internet gateway. Update the S3 bucket policy to allow access from the internet gateway. Update the application to use the new internet gateway.
B. Configure a VPN connection. Update the S3 bucket policy to allow access from the VPN connection. Update the application to use the new VPN connection.
C. Configure a NAT gateway. Update the S3 bucket policy to allow access from the NAT gateway. Update the application to use the new NAT gateway.
D. Configure a VPC endpoint. Update the S3 bucket policy to allow access from the VPC endpoint. Update the application to use the new VPC endpoint.
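The bucket-policy update in option D typically uses the aws:SourceVpce condition key to reject any request that does not arrive through the expected VPC endpoint. A minimal sketch, expressed as a Python dict; the bucket name and endpoint ID are invented placeholders:

```python
import json

# Hypothetical bucket name and VPC endpoint ID, for illustration only.
BUCKET = "example-sensitive-bucket"
VPC_ENDPOINT_ID = "vpce-0123456789abcdef0"

# Deny all S3 actions on the bucket unless the request arrives
# through the expected VPC endpoint (aws:SourceVpce condition key).
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyAccessOutsideVpcEndpoint",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [
                f"arn:aws:s3:::{BUCKET}",
                f"arn:aws:s3:::{BUCKET}/*",
            ],
            "Condition": {"StringNotEquals": {"aws:SourceVpce": VPC_ENDPOINT_ID}},
        }
    ],
}

print(json.dumps(bucket_policy, indent=2))
```

This document would then be attached to the bucket (for example with S3's put-bucket-policy API); the explicit Deny overrides any broader Allow.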

Question # 89

A company is planning to deploy its application on an Amazon Aurora PostgreSQL Serverless v2 cluster. The application will receive large amounts of traffic. The company wants to optimize the storage performance of the cluster as the load on the application increases.
Which solution will meet these requirements MOST cost-effectively?

A. Configure the cluster to use the Aurora Standard storage configuration.
B. Configure the cluster storage type as Provisioned IOPS.
C. Configure the cluster storage type as General Purpose.
D. Configure the cluster to use the Aurora I/O-Optimized storage configuration.

Question # 90

A company plans to rehost an application to Amazon EC2 instances that use Amazon Elastic Block Store (Amazon EBS) as the attached storage.
A solutions architect must design a solution to ensure that all newly created Amazon EBS volumes are encrypted by default. The solution must also prevent the creation of unencrypted EBS volumes.
Which solution will meet these requirements?

A. Configure the EC2 account attributes to always encrypt new EBS volumes.
B. Use AWS Config. Configure the encrypted-volumes identifier. Apply the default AWS Key Management Service (AWS KMS) key.
C. Configure AWS Systems Manager to create encrypted copies of the EBS volumes. Reconfigure the EC2 instances to use the encrypted volumes.
D. Create a customer managed key in AWS Key Management Service (AWS KMS). Configure AWS Migration Hub to use the key when the company migrates workloads.
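Option B's AWS Config encrypted-volumes managed rule flags any EBS volume whose Encrypted attribute is false (it detects, rather than prevents, unencrypted volumes). A minimal sketch of that evaluation logic over volume descriptions; the sample volume IDs are invented:

```python
# Sketch of the check performed by the AWS Config "encrypted-volumes"
# managed rule: a volume is NON_COMPLIANT if its Encrypted attribute
# is false or absent. The sample volumes below are invented.
def evaluate_volumes(volumes):
    return {
        v["VolumeId"]: "COMPLIANT" if v.get("Encrypted") else "NON_COMPLIANT"
        for v in volumes
    }

sample = [
    {"VolumeId": "vol-aaa", "Encrypted": True},
    {"VolumeId": "vol-bbb", "Encrypted": False},
]
print(evaluate_volumes(sample))
```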

Question # 91

A company uses AWS to host its public ecommerce website. The website uses an AWS Global Accelerator accelerator for traffic from the internet. The Global Accelerator accelerator forwards the traffic to an Application Load Balancer (ALB) that is the entry point for an Auto Scaling group.
The company recently identified a DDoS attack on the website. The company needs a solution to mitigate future attacks.
Which solution will meet these requirements with the LEAST implementation effort?

A. Configure an AWS WAF web ACL for the Global Accelerator accelerator to block traffic by using rate-based rules.
B. Configure an AWS Lambda function to read the ALB metrics to block attacks by updating a VPC network ACL.
C. Configure an AWS WAF web ACL on the ALB to block traffic by using rate-based rules.
D. Configure an Amazon CloudFront distribution in front of the Global Accelerator accelerator.

Question # 92

A media company uses an Amazon CloudFront distribution to deliver content over the internet. The company wants only premium customers to have access to the media streams and file content. The company stores all content in an Amazon S3 bucket. The company also delivers content on demand to customers for a specific purpose, such as movie rentals or music downloads.
Which solution will meet these requirements?

A. Generate and provide S3 signed cookies to premium customers.
B. Generate and provide CloudFront signed URLs to premium customers.
C. Use origin access control (OAC) to limit the access of non-premium customers.
D. Generate and activate field-level encryption to block non-premium customers.

Question # 93

A media company has a multi-account AWS environment in the us-east-1 Region. The company has an Amazon Simple Notification Service (Amazon SNS) topic in a production account that publishes performance metrics. The company has an AWS Lambda function in an administrator account to process and analyze log data.
The Lambda function that is in the administrator account must be invoked by messages from the SNS topic that is in the production account when significant metrics are reported.
Which combination of steps will meet these requirements? (Select TWO.)

A. Create an IAM resource policy for the Lambda function that allows Amazon SNS to invoke the function. Implement an Amazon Simple Queue Service (Amazon SQS) queue in the administrator account to buffer messages from the SNS topic that is in the production account. Configure the SQS queue to invoke the Lambda function.
B. Create an IAM policy for the SNS topic that allows the Lambda function to subscribe to the topic.
C. Use an Amazon EventBridge rule in the production account to capture the SNS topic notifications. Configure the EventBridge rule to forward notifications to the Lambda function that is in the administrator account.
D. Store performance metrics in an Amazon S3 bucket in the production account. Use Amazon Athena to analyze the metrics from the administrator account.

Question # 94

A weather forecasting company needs to process hundreds of gigabytes of data with sub-millisecond latency. The company has a high performance computing (HPC) environment in its data center and wants to expand its forecasting capabilities.
A solutions architect must identify a highly available cloud storage solution that can handle large amounts of sustained throughput. Files that are stored in the solution should be accessible to thousands of compute instances that will simultaneously access and process the entire dataset.
What should the solutions architect do to meet these requirements?

A. Use Amazon FSx for Lustre scratch file systems.
B. Use Amazon FSx for Lustre persistent file systems.
C. Use Amazon Elastic File System (Amazon EFS) with Bursting Throughput mode.
D. Use Amazon Elastic File System (Amazon EFS) with Provisioned Throughput mode.

Question # 95

A company uses a Microsoft SQL Server database. The company's applications are connected to the database. The company wants to migrate to an Amazon Aurora PostgreSQL database with minimal changes to the application code.
Which combination of steps will meet these requirements? (Select TWO.)

A. Use the AWS Schema Conversion Tool (AWS SCT) to rewrite the SQL queries in the applications.
B. Enable Babelfish on Aurora PostgreSQL to run the SQL queries from the applications.
C. Migrate the database schema and data by using the AWS Schema Conversion Tool (AWS SCT) and AWS Database Migration Service (AWS DMS).
D. Use Amazon RDS Proxy to connect the applications to Aurora PostgreSQL.
E. Use AWS Database Migration Service (AWS DMS) to rewrite the SQL queries in the applications.

Question # 96

A company has an application that is running on Amazon EC2 instances. A solutions architect has standardized the company on a particular instance family and various instance sizes based on the current needs of the company.
The company wants to maximize cost savings for the application over the next 3 years. The company needs to be able to change the instance family and sizes in the next 6 months based on application popularity and usage.
Which solution will meet these requirements MOST cost-effectively?

A. Compute Savings Plan
B. EC2 Instance Savings Plan
C. Zonal Reserved Instances
D. Standard Reserved Instances

Question # 97

A company stores sensitive data in Amazon S3. A solutions architect needs to create an encryption solution. The company needs to fully control the ability of users to create, rotate, and disable encryption keys with minimal effort for any data that must be encrypted.
Which solution will meet these requirements?

A. Use default server-side encryption with Amazon S3 managed encryption keys (SSE-S3) to store the sensitive data.
B. Create a customer managed key by using AWS Key Management Service (AWS KMS). Use the new key to encrypt the S3 objects by using server-side encryption with AWS KMS keys (SSE-KMS).
C. Create an AWS managed key by using AWS Key Management Service (AWS KMS). Use the new key to encrypt the S3 objects by using server-side encryption with AWS KMS keys (SSE-KMS).
D. Download S3 objects to an Amazon EC2 instance. Encrypt the objects by using customer managed keys. Upload the encrypted objects back into Amazon S3.
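Option B is typically wired up by making the customer managed key the bucket's default encryption. A sketch of the server-side-encryption configuration document such a setup would use (the key ARN is a placeholder); this is the structure passed to S3's put-bucket-encryption API:

```python
# Hypothetical customer managed key ARN, for illustration only.
KMS_KEY_ARN = (
    "arn:aws:kms:us-east-1:111122223333:key/"
    "11111111-2222-3333-4444-555555555555"
)

# Default-encryption document for the bucket: every new object is
# encrypted with SSE-KMS using the customer managed key.
encryption_config = {
    "Rules": [
        {
            "ApplyServerSideEncryptionByDefault": {
                "SSEAlgorithm": "aws:kms",
                "KMSMasterKeyID": KMS_KEY_ARN,
            },
            "BucketKeyEnabled": True,  # S3 Bucket Keys reduce KMS request costs
        }
    ]
}
print(encryption_config)
```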

Question # 98

A company wants to migrate an application to AWS. The company wants to increase the application's current availability. The company wants to use AWS WAF in the application's architecture.
Which solution will meet these requirements?

A. Create an Auto Scaling group that contains multiple Amazon EC2 instances that host the application across two Availability Zones. Configure an Application Load Balancer (ALB) and set the Auto Scaling group as the target. Connect a WAF to the ALB.
B. Create a cluster placement group that contains multiple Amazon EC2 instances that host the application. Configure an Application Load Balancer and set the EC2 instances as the targets. Connect a WAF to the placement group.
C. Create two Amazon EC2 instances that host the application across two Availability Zones. Configure the EC2 instances as the targets of an Application Load Balancer (ALB). Connect a WAF to the ALB.
D. Create an Auto Scaling group that contains multiple Amazon EC2 instances that host the application across two Availability Zones. Configure an Application Load Balancer (ALB) and set the Auto Scaling group as the target. Connect a WAF to the Auto Scaling group.

Question # 99

A development team uses multiple AWS accounts for its development, staging, and production environments. Team members have been launching large Amazon EC2 instances that are underutilized. A solutions architect must prevent large instances from being launched in all accounts.
How can the solutions architect meet this requirement with the LEAST operational overhead?

A. Update the IAM policies to deny the launch of large EC2 instances. Apply the policies to all users.
B. Define a resource in AWS Resource Access Manager that prevents the launch of large EC2 instances.
C. Create an IAM role in each account that denies the launch of large EC2 instances. Grant the developers IAM group access to the role.
D. Create an organization in AWS Organizations in the management account with the default policy. Create a service control policy (SCP) that denies the launch of large EC2 instances, and apply it to the AWS accounts.
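A service control policy like the one option D describes can deny ec2:RunInstances for any instance type outside an allow-list. A minimal sketch, expressed as a Python dict; the allowed instance types are arbitrary examples, not part of the question:

```python
# Sketch of an SCP (option D) that denies launching instance types
# outside a small allow-list. The instance types are examples only.
scp = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyLargeInstanceLaunch",
            "Effect": "Deny",
            "Action": "ec2:RunInstances",
            "Resource": "arn:aws:ec2:*:*:instance/*",
            "Condition": {
                "StringNotEquals": {
                    "ec2:InstanceType": ["t3.micro", "t3.small", "t3.medium"]
                }
            },
        }
    ],
}
print(scp)
```

Attached once at the OU or root level, the SCP applies to every member account, which is what keeps the operational overhead low compared with per-account IAM policies.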

Question # 100

A company is developing an application to support customer demands. The company wants to deploy the application on multiple Amazon EC2 Nitro-based instances within the same Availability Zone. The company also wants to give the application the ability to write to multiple block storage volumes in multiple EC2 Nitro-based instances simultaneously to achieve higher application availability.
Which solution will meet these requirements?

A. Use General Purpose SSD (gp3) EBS volumes with Amazon Elastic Block Store (Amazon EBS) Multi-Attach.
B. Use Throughput Optimized HDD (st1) EBS volumes with Amazon Elastic Block Store (Amazon EBS) Multi-Attach.
C. Use Provisioned IOPS SSD (io2) EBS volumes with Amazon Elastic Block Store (Amazon EBS) Multi-Attach.
D. Use General Purpose SSD (gp2) EBS volumes with Amazon Elastic Block Store (Amazon EBS) Multi-Attach.
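EBS Multi-Attach is only available on Provisioned IOPS SSD volumes, which is why option C is the viable choice. A sketch of the parameters such a volume creation would pass to EC2's CreateVolume API; the Availability Zone, size, and IOPS values are arbitrary examples:

```python
# Parameters for creating a Multi-Attach-capable volume (option C).
# Multi-Attach requires a Provisioned IOPS SSD (io1/io2) volume;
# the concrete values below are examples only.
create_volume_params = {
    "AvailabilityZone": "us-east-1a",
    "VolumeType": "io2",
    "Size": 100,          # GiB
    "Iops": 10000,
    "MultiAttachEnabled": True,
}
print(create_volume_params)
```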

Question # 101

A company runs a stateful production application on Amazon EC2 instances. The application requires at least two EC2 instances to always be running.
A solutions architect needs to design a highly available and fault-tolerant architecture for the application. The solutions architect creates an Auto Scaling group of EC2 instances.
Which set of additional steps should the solutions architect take to meet these requirements?

A. Set the Auto Scaling group's minimum capacity to two. Deploy one On-Demand Instance in one Availability Zone and one On-Demand Instance in a second Availability Zone.
B. Set the Auto Scaling group's minimum capacity to four. Deploy two On-Demand Instances in one Availability Zone and two On-Demand Instances in a second Availability Zone.
C. Set the Auto Scaling group's minimum capacity to two. Deploy four Spot Instances in one Availability Zone.
D. Set the Auto Scaling group's minimum capacity to four. Deploy two On-Demand Instances in one Availability Zone and two Spot Instances in a second Availability Zone.

Question # 102

A company recently migrated its web application to the AWS Cloud. The company uses an Amazon EC2 instance to run multiple processes to host the application. The processes include an Apache web server that serves static content. The Apache web server makes requests to a PHP application that uses a local Redis server for user sessions.
The company wants to redesign the architecture to be highly available and to use AWS managed solutions.
Which solution will meet these requirements?

A. Use AWS Elastic Beanstalk to host the static content and the PHP application. Configure Elastic Beanstalk to deploy its EC2 instance into a public subnet. Assign a public IP address.
B. Use AWS Lambda to host the static content and the PHP application. Use an Amazon API Gateway REST API to proxy requests to the Lambda function. Set the API Gateway CORS configuration to respond to the domain name. Configure Amazon ElastiCache for Redis to handle session information.
C. Keep the backend code on the EC2 instance. Create an Amazon ElastiCache for Redis cluster that has Multi-AZ enabled. Configure the ElastiCache for Redis cluster in cluster mode. Copy the frontend resources to Amazon S3. Configure the backend code to reference the EC2 instance.
D. Configure an Amazon CloudFront distribution with an Amazon S3 endpoint to an S3 bucket that is configured to host the static content. Configure an Application Load Balancer that targets an Amazon Elastic Container Service (Amazon ECS) service that runs AWS Fargate tasks for the PHP application. Configure the PHP application to use an Amazon ElastiCache for Redis cluster that runs in multiple Availability Zones.

Question # 103

A company's software development team needs an Amazon RDS Multi-AZ cluster. The RDS cluster will serve as a backend for a desktop client that is deployed on premises. The desktop client requires direct connectivity to the RDS cluster.
The company must give the development team the ability to connect to the cluster by using the client when the team is in the office.
Which solution provides the required connectivity MOST securely?

A. Create a VPC and two public subnets. Create the RDS cluster in the public subnets. Use AWS Site-to-Site VPN with a customer gateway in the company's office.
B. Create a VPC and two private subnets. Create the RDS cluster in the private subnets. Use AWS Site-to-Site VPN with a customer gateway in the company's office.
C. Create a VPC and two private subnets. Create the RDS cluster in the private subnets. Use RDS security groups to allow the company's office IP ranges to access the cluster.
D. Create a VPC and two public subnets. Create the RDS cluster in the public subnets. Create a cluster user for each developer. Use RDS security groups to allow the users to access the cluster.

Question # 104

A company uses an Amazon Aurora PostgreSQL provisioned cluster with its application. The application's peak traffic occurs several times a day for periods of 30 minutes to several hours.
The database capacity is provisioned to handle peak traffic from the application, but the database has wasted capacity during non-peak hours. The company wants to reduce the database costs.
Which solution will meet these requirements with the LEAST operational effort?

A. Set up an Amazon CloudWatch alarm to monitor database utilization. Scale up or scale down the database capacity based on the amount of traffic.
B. Migrate the database to Amazon EC2 instances in an Auto Scaling group. Increase or decrease the number of instances based on the amount of traffic.
C. Migrate the database to an Amazon Aurora Serverless DB cluster to scale up or scale down the capacity based on the amount of traffic.
D. Schedule an AWS Lambda function to provision the required database capacity at the start of each day. Schedule another Lambda function to reduce the capacity at the end of each day.

Question # 105

A company has applications that run on Amazon EC2 instances in a VPC. One of the applications needs to call the Amazon S3 API to store and read objects. According to the company's security regulations, no traffic from the applications is allowed to travel across the internet.
Which solution will meet these requirements?

A. Configure an S3 gateway endpoint.
B. Create an S3 bucket in a private subnet.
C. Create an S3 bucket in the same AWS Region as the EC2 instances.
D. Configure a NAT gateway in the same subnet as the EC2 instances.
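The gateway endpoint in option A is created as a VPC endpoint of type Gateway that targets the S3 service and is associated with the route tables of the subnets that need access. A sketch of the parameters such a request would pass to EC2's CreateVpcEndpoint API; the VPC ID, route table ID, and Region in the service name are placeholders:

```python
# Parameters for creating an S3 gateway endpoint (option A).
# The IDs and the Region in the service name are placeholders.
endpoint_params = {
    "VpcEndpointType": "Gateway",
    "VpcId": "vpc-0123456789abcdef0",
    "ServiceName": "com.amazonaws.us-east-1.s3",
    "RouteTableIds": ["rtb-0123456789abcdef0"],
}
print(endpoint_params)
```

S3 traffic from the associated subnets is then routed to the endpoint over the AWS network instead of the internet, with no NAT gateway or internet gateway in the path.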

Question # 106

A company needs a secure connection between its on-premises environment and AWS. This connection does not need high bandwidth and will handle a small amount of traffic. The connection should be set up quickly.
What is the MOST cost-effective method to establish this type of connection?

A. Implement a client VPN.
B. Implement AWS Direct Connect.
C. Implement a bastion host on Amazon EC2.
D. Implement an AWS Site-to-Site VPN connection.

Question # 107

A social media company wants to store its database of user profiles, relationships, and interactions in the AWS Cloud. The company needs an application to monitor any changes in the database. The application needs to analyze the relationships between the data entities and to provide recommendations to users.
Which solution will meet these requirements with the LEAST operational overhead?

A. Use Amazon Neptune to store the information. Use Amazon Kinesis Data Streams to process changes in the database.
B. Use Amazon Neptune to store the information. Use Neptune Streams to process changes in the database.
C. Use Amazon Quantum Ledger Database (Amazon QLDB) to store the information. Use Amazon Kinesis Data Streams to process changes in the database.
D. Use Amazon Quantum Ledger Database (Amazon QLDB) to store the information. Use Neptune Streams to process changes in the database.

Question # 108

A company wants to create a mobile app that allows users to stream slow-motion video clips on their mobile devices. Currently, the app captures video clips and uploads the video clips in raw format into an Amazon S3 bucket. The app retrieves these video clips directly from the S3 bucket. However, the videos are large in their raw format.
Users are experiencing issues with buffering and playback on mobile devices. The company wants to implement solutions to maximize the performance and scalability of the app while minimizing operational overhead.
Which combination of solutions will meet these requirements? (Select TWO.)

A. Deploy Amazon CloudFront for content delivery and caching.
B. Use AWS DataSync to replicate the video files across AWS Regions in other S3 buckets.
C. Use Amazon Elastic Transcoder to convert the video files to more appropriate formats.
D. Deploy an Auto Scaling group of Amazon EC2 instances in Local Zones for content delivery and caching.
E. Deploy an Auto Scaling group of Amazon EC2 instances to convert the video files to more appropriate formats.

Question # 109

A marketing company receives a large amount of new clickstream data in Amazon S3 from a marketing campaign. The company needs to analyze the clickstream data in Amazon S3 quickly. Then the company needs to determine whether to process the data further in the data pipeline.
Which solution will meet these requirements with the LEAST operational overhead?

A. Create external tables in a Spark catalog. Configure jobs in AWS Glue to query the data.
B. Configure an AWS Glue crawler to crawl the data. Configure Amazon Athena to query the data.
C. Create external tables in a Hive metastore. Configure Spark jobs in Amazon EMR to query the data.
D. Configure an AWS Glue crawler to crawl the data. Configure Amazon Kinesis Data Analytics to use SQL to query the data.

Question # 110

A company hosts its core network services, including directory services and DNS, in its on-premises data center. The data center is connected to the AWS Cloud using AWS Direct Connect (DX). Additional AWS accounts are planned that will require quick, cost-effective, and consistent access to these network services.
What should a solutions architect implement to meet these requirements with the LEAST amount of operational overhead?

A. Create a DX connection in each new account. Route the network traffic to the on-premises servers.
B. Configure VPC endpoints in the DX VPC for all required services. Route the network traffic to the on-premises servers.
C. Create a VPN connection between each new account and the DX VPC. Route the network traffic to the on-premises servers.
D. Configure AWS Transit Gateway between the accounts. Assign DX to the transit gateway and route network traffic to the on-premises servers.

Question # 111

A company's solutions architect is designing an AWS multi-account solution that uses AWS Organizations. The solutions architect has organized the company's accounts into organizational units (OUs).
The solutions architect needs a solution that will identify any changes to the OU hierarchy. The solution also needs to notify the company's operations team of any changes.
Which solution will meet these requirements with the LEAST operational overhead?

A. Provision the AWS accounts by using AWS Control Tower. Use account drift notifications to identify the changes to the OU hierarchy.
B. Provision the AWS accounts by using AWS Control Tower. Use AWS Config aggregated rules to identify the changes to the OU hierarchy.
C. Use AWS Service Catalog to create accounts in Organizations. Use an AWS CloudTrail organization trail to identify the changes to the OU hierarchy.
D. Use AWS CloudFormation templates to create accounts in Organizations. Use the drift detection operation on a stack to identify the changes to the OU hierarchy.

Question # 112

A company has released a new version of its production application. The company's workload uses Amazon EC2, AWS Lambda, AWS Fargate, and Amazon SageMaker. The company wants to cost optimize the workload now that usage is at a steady state. The company wants to cover the most services with the fewest savings plans.
Which combination of savings plans will meet these requirements? (Select TWO.)

A. Purchase an EC2 Instance Savings Plan for Amazon EC2 and SageMaker.
B. Purchase a Compute Savings Plan for Amazon EC2, Lambda, and SageMaker.
C. Purchase a SageMaker Savings Plan.
D. Purchase a Compute Savings Plan for Lambda, Fargate, and Amazon EC2.
E. Purchase an EC2 Instance Savings Plan for Amazon EC2 and Fargate.

Question # 113

A company deploys Amazon EC2 instances that run in a VPC. The EC2 instances load source data into Amazon S3 buckets so that the data can be processed in the future. According to compliance laws, the data must not be transmitted over the public internet. Servers in the company's on-premises data center will consume the output from an application that runs on the EC2 instances.
Which solution will meet these requirements?

A. Deploy an interface VPC endpoint for Amazon EC2. Create an AWS Site-to-Site VPN connection between the company and the VPC.
B. Deploy a gateway VPC endpoint for Amazon S3. Set up an AWS Direct Connect connection between the on-premises network and the VPC.
C. Set up an AWS Transit Gateway connection from the VPC to the S3 buckets. Create an AWS Site-to-Site VPN connection between the company and the VPC.
D. Set up proxy EC2 instances that have routes to NAT gateways. Configure the proxy EC2 instances to fetch S3 data and feed the application instances.

Question # 114

A company regularly uploads GB-sized files to Amazon S3. After the company uploads the files, the company uses a fleet of Amazon EC2 Spot Instances to transcode the file format. The company needs to scale throughput when the company uploads data from the on-premises data center to Amazon S3 and when the company downloads data from Amazon S3 to the EC2 instances.
Which solutions will meet these requirements? (Select TWO.)

A. Use the S3 bucket access point instead of accessing the S3 bucket directly.
B. Upload the files into multiple S3 buckets.
C. Use S3 multipart uploads.
D. Fetch multiple byte-ranges of an object in parallel.
E. Add a random prefix to each object when uploading the files.
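Options C and D both rest on the same idea: split an object into independent byte ranges so the parts can be transferred in parallel. A minimal sketch of that range calculation (the chunk size and object size below are arbitrary examples); each pair maps to an HTTP `Range: bytes=start-end` header or to one part of a multipart upload:

```python
def byte_ranges(object_size, chunk_size):
    """Split an object of object_size bytes into inclusive (start, end)
    byte-range pairs of at most chunk_size bytes each, suitable for
    parallel fetches or multipart-upload parts."""
    ranges = []
    start = 0
    while start < object_size:
        end = min(start + chunk_size, object_size) - 1
        ranges.append((start, end))
        start = end + 1
    return ranges

# Example: a 25 MB object split into parts of at most 10 MB.
print(byte_ranges(25 * 1024 * 1024, 10 * 1024 * 1024))
```

Each range can then be fetched in its own thread or process, which is how clients scale single-object throughput in both directions.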

Question # 115

A company has a mobile app for customers. The app's data is sensitive and must be encrypted at rest. The company uses AWS Key Management Service (AWS KMS).
The company needs a solution that prevents the accidental deletion of KMS keys. The solution must use Amazon Simple Notification Service (Amazon SNS) to send an email notification to administrators when a user attempts to delete a KMS key.
Which solution will meet these requirements with the LEAST operational overhead?

A. Create an Amazon EventBridge rule that reacts when a user tries to delete a KMS key. Configure an AWS Config rule that cancels any deletion of a KMS key. Add the AWS Config rule as a target of the EventBridge rule. Create an SNS topic that notifies the administrators.
B. Create an AWS Lambda function that has custom logic to prevent KMS key deletion. Create an Amazon CloudWatch alarm that is activated when a user tries to delete a KMS key. Create an Amazon EventBridge rule that invokes the Lambda function when the DeleteKey operation is performed. Create an SNS topic. Configure the EventBridge rule to publish an SNS message that notifies the administrators.
C. Create an Amazon EventBridge rule that reacts when the KMS DeleteKey operation is performed. Configure the rule to initiate an AWS Systems Manager Automation runbook. Configure the runbook to cancel the deletion of the KMS key. Create an SNS topic. Configure the EventBridge rule to publish an SNS message that notifies the administrators.
D. Create an AWS CloudTrail trail. Configure the trail to deliver logs to a new Amazon CloudWatch log group. Create a CloudWatch alarm based on the metric filter for the CloudWatch log group. Configure the alarm to use Amazon SNS to notify the administrators when the KMS DeleteKey operation is performed.
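An EventBridge rule like the one in option C matches the KMS deletion request as recorded by CloudTrail. A sketch of such an event pattern; note that in practice a KMS key deletion is requested through the ScheduleKeyDeletion API call (keys have a mandatory waiting period), so that is the event name matched here:

```python
# Event pattern for an EventBridge rule that fires when a KMS key
# deletion is requested. KMS records the request in CloudTrail as a
# ScheduleKeyDeletion API call, which can still be cancelled during
# the key's waiting period.
event_pattern = {
    "source": ["aws.kms"],
    "detail-type": ["AWS API Call via CloudTrail"],
    "detail": {
        "eventSource": ["kms.amazonaws.com"],
        "eventName": ["ScheduleKeyDeletion"],
    },
}
print(event_pattern)
```

The rule's targets would then be the SNS topic (for the administrator email) and the Systems Manager Automation runbook that cancels the deletion.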

Question # 116

A company has an on-premises business application that generates hundreds of files each day. These files are stored on an SMB file share and require a low-latency connection to the application servers. A new company policy states all application-generated files must be copied to AWS. There is already a VPN connection to AWS.
The application development team does not have time to make the necessary code modifications to move the application to AWS.
Which service should a solutions architect recommend to allow the application to copy files to AWS?

A. Amazon Elastic File System (Amazon EFS)
B. Amazon FSx for Windows File Server
C. AWS Snowball
D. AWS Storage Gateway

Question # 117

A company has a web application in the AWS Cloud and wants to collect transaction data in real time. The company wants to prevent data duplication and does not want to manage infrastructure. The company wants to perform additional processing on the data after the data is collected.
Which solution will meet these requirements?

A. Configure an Amazon Simple Queue Service (Amazon SQS) FIFO queue. Configure an AWS Lambda function with an event source mapping for the FIFO queue to process the data.
B. Configure an Amazon Simple Queue Service (Amazon SQS) FIFO queue. Use an AWS Batch job to remove duplicate data from the queue. Configure an AWS Lambda function to process the data.
C. Use Amazon Kinesis Data Streams to send the incoming transaction data to an AWS Batch job that removes duplicate data. Launch an Amazon EC2 instance that runs a custom script to process the data.
D. Set up an AWS Step Functions state machine to send incoming transaction data to an AWS Lambda function to remove duplicate data. Launch an Amazon EC2 instance that runs a custom script to process the data.

Question # 118

A company wants to isolate its workloads by creating an AWS account for each workload. The company needs a solution that centrally manages networking components for the workloads. The solution also must create accounts with automatic security controls (guardrails).
Which solution will meet these requirements with the LEAST operational overhead?

A. Use AWS Control Tower to deploy accounts. Create a networking account that has a VPC with private subnets and public subnets. Use AWS Resource Access Manager (AWS RAM) to share the subnets with the workload accounts.
B. Use AWS Organizations to deploy accounts. Create a networking account that has a VPC with private subnets and public subnets. Use AWS Resource Access Manager (AWS RAM) to share the subnets with the workload accounts.
C. Use AWS Control Tower to deploy accounts. Deploy a VPC in each workload account.Configure each VPC to route through an inspection VPC by using a transit gatewayattachment.
D. Use AWS Organizations to deploy accounts. Deploy a VPC in each workload account.Configure each VPC to route through an inspection VPC by using a transit gatewayattachment.

Question # 119

A company's web application consists of multiple Amazon EC2 instances that run behind an Application Load Balancer in a VPC. An Amazon RDS for MySQL DB instance contains the data. The company needs the ability to automatically detect and respond to suspicious or unexpected behavior in its AWS environment. The company already has added AWS WAF to its architecture.
What should a solutions architect do next to protect against threats?

A. Use Amazon GuardDuty to perform threat detection. Configure Amazon EventBridge to filter for GuardDuty findings and to invoke an AWS Lambda function to adjust the AWS WAF rules.
B. Use AWS Firewall Manager to perform threat detection. Configure Amazon EventBridge to filter for Firewall Manager findings and to invoke an AWS Lambda function to adjust the AWS WAF web ACL.
C. Use Amazon Inspector to perform threat detection and to update the AWS WAF rules. Create a VPC network ACL to limit access to the web application.
D. Use Amazon Macie to perform threat detection and to update the AWS WAF rules. Create a VPC network ACL to limit access to the web application.

Question # 120

A company is storing petabytes of data in Amazon S3 Standard. The data is stored in multiple S3 buckets and is accessed with varying frequency. The company does not know access patterns for all the data. The company needs to implement a solution for each S3 bucket to optimize the cost of S3 usage.
Which solution will meet these requirements with the MOST operational efficiency?

A. Create an S3 Lifecycle configuration with a rule to transition the objects in the S3 bucket to S3 Intelligent-Tiering.
B. Use the S3 storage class analysis tool to determine the correct tier for each object in the S3 bucket. Move each object to the identified storage tier.
C. Create an S3 Lifecycle configuration with a rule to transition the objects in the S3 bucket to S3 Glacier Instant Retrieval.
D. Create an S3 Lifecycle configuration with a rule to transition the objects in the S3 bucket to S3 One Zone-Infrequent Access (S3 One Zone-IA).

Question # 121

A company needs to optimize its Amazon S3 storage costs for an application that generates many files that cannot be recreated. Each file is approximately 5 MB and is stored in Amazon S3 Standard storage.
The company must store the files for 4 years before the files can be deleted. The files must be immediately accessible. The files are frequently accessed in the first 30 days of object creation, but they are rarely accessed after the first 30 days.
Which solution will meet these requirements MOST cost-effectively?

A. Create an S3 Lifecycle policy to move the files to S3 Glacier Instant Retrieval 30 days after object creation. Delete the files 4 years after object creation.
B. Create an S3 Lifecycle policy to move the files to S3 One Zone-Infrequent Access (S3 One Zone-IA) 30 days after object creation. Delete the files 4 years after object creation.
C. Create an S3 Lifecycle policy to move the files to S3 Standard-Infrequent Access (S3 Standard-IA) 30 days after object creation. Delete the files 4 years after object creation.
D. Create an S3 Lifecycle policy to move the files to S3 Standard-Infrequent Access (S3 Standard-IA) 30 days after object creation. Move the files to S3 Glacier Flexible Retrieval 4 years after object creation.
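A lifecycle policy along the lines of option C pairs a 30-day transition to Standard-IA with an expiration after 4 years. A sketch of that configuration as a Python dict; the rule ID is invented, and 4 years is approximated as 4 * 365 days (leap days ignored):

```python
# Lifecycle configuration matching option C: move objects to
# S3 Standard-IA 30 days after creation and delete them after
# roughly 4 years (approximated here as 4 * 365 days).
lifecycle_config = {
    "Rules": [
        {
            "ID": "standard-ia-then-delete",   # invented rule name
            "Status": "Enabled",
            "Filter": {"Prefix": ""},          # apply to every object
            "Transitions": [{"Days": 30, "StorageClass": "STANDARD_IA"}],
            "Expiration": {"Days": 4 * 365},
        }
    ]
}
print(lifecycle_config)
```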

Question # 122

A company is planning to migrate data to an Amazon S3 bucket. The data must be encrypted at rest within the S3 bucket. The encryption key must be rotated automatically every year.
Which solution will meet these requirements with the LEAST operational overhead?

A. Migrate the data to the S3 bucket. Use server-side encryption with Amazon S3 managed keys (SSE-S3). Use the built-in key rotation behavior of SSE-S3 encryption keys.
B. Create an AWS Key Management Service (AWS KMS) customer managed key. Enable automatic key rotation. Set the S3 bucket's default encryption behavior to use the customer managed KMS key. Migrate the data to the S3 bucket.
C. Create an AWS Key Management Service (AWS KMS) customer managed key. Set the S3 bucket's default encryption behavior to use the customer managed KMS key. Migrate the data to the S3 bucket. Manually rotate the KMS key every year.
D. Use customer key material to encrypt the data. Migrate the data to the S3 bucket. Create an AWS Key Management Service (AWS KMS) key without key material. Import the customer key material into the KMS key. Enable automatic key rotation.
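In options B and C, the bucket's default encryption behavior is what points S3 at the customer managed KMS key. A minimal sketch of the `PutBucketEncryption` payload (the key ARN below is a placeholder, not a real key) might look like:

```python
# Sketch of an S3 default-encryption payload targeting a customer
# managed KMS key (shape follows the S3 PutBucketEncryption API).
kms_key_arn = "arn:aws:kms:us-east-1:111122223333:key/EXAMPLE-KEY-ID"

encryption_config = {
    "Rules": [
        {
            "ApplyServerSideEncryptionByDefault": {
                "SSEAlgorithm": "aws:kms",      # SSE-KMS, not SSE-S3
                "KMSMasterKeyID": kms_key_arn,  # placeholder ARN
            },
            # S3 Bucket Keys reduce the number of KMS requests
            # (and therefore KMS charges) for SSE-KMS objects.
            "BucketKeyEnabled": True,
        }
    ]
}
```

Automatic yearly rotation itself is a property of the KMS key, enabled separately (for example via KMS's `EnableKeyRotation` call), not of the bucket configuration.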

Question # 123

An online photo-sharing company stores its photos in an Amazon S3 bucket that exists in the us-west-1 Region. The company needs to store a copy of all new photos in the us-east-1 Region. Which solution will meet this requirement with the LEAST operational effort?

A. Create a second S3 bucket in us-east-1. Use S3 Cross-Region Replication to copy photos from the existing S3 bucket to the second S3 bucket.
B. Create a cross-origin resource sharing (CORS) configuration of the existing S3 bucket. Specify us-east-1 in the CORS rule's AllowedOrigin element.
C. Create a second S3 bucket in us-east-1 across multiple Availability Zones. Create an S3 Lifecycle rule to save photos into the second S3 bucket.
D. Create a second S3 bucket in us-east-1. Configure S3 event notifications on object creation and update events to invoke an AWS Lambda function to copy photos from the existing S3 bucket to the second S3 bucket.
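For the Cross-Region Replication approach in option A, the source bucket receives a replication configuration naming an IAM role and the destination bucket. A sketch of that payload (bucket names and the role ARN are illustrative; versioning must be enabled on both buckets for CRR to work) could look like:

```python
# Sketch of an S3 Cross-Region Replication configuration (shape
# follows the S3 PutBucketReplication API). ARNs are illustrative.
replication_config = {
    # Role that S3 assumes to read from the source and write to the
    # destination bucket.
    "Role": "arn:aws:iam::111122223333:role/s3-replication-role",
    "Rules": [
        {
            "ID": "replicate-new-photos",  # hypothetical rule name
            "Status": "Enabled",
            "Priority": 1,
            "Filter": {},                  # replicate all new objects
            "DeleteMarkerReplication": {"Status": "Disabled"},
            # Destination bucket in us-east-1 (illustrative name).
            "Destination": {"Bucket": "arn:aws:s3:::photos-us-east-1"},
        }
    ],
}
```

Note that replication applies only to objects created after the configuration is in place, which matches the "copy of all new photos" requirement.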

Question # 124

A robotics company is designing a solution for medical surgery. The robots will use advanced sensors, cameras, and AI algorithms to perceive their environment and to complete surgeries. The company needs a public load balancer in the AWS Cloud that will ensure seamless communication with backend services. The load balancer must be capable of routing traffic based on the query strings to different target groups. The traffic must also be encrypted. Which solution will meet these requirements?

A. Use a Network Load Balancer with a certificate attached from AWS Certificate Manager (ACM). Use query parameter-based routing.
B. Use a Gateway Load Balancer. Import a generated certificate in AWS Identity and Access Management (IAM). Attach the certificate to the load balancer. Use HTTP path-based routing.
C. Use an Application Load Balancer with a certificate attached from AWS Certificate Manager (ACM). Use query parameter-based routing.
D. Use a Network Load Balancer. Import a generated certificate in AWS Identity and Access Management (IAM). Attach the certificate to the load balancer. Use query parameter-based routing.
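Query parameter-based routing of the kind these options mention is expressed as a listener-rule condition with the `query-string` field. A sketch of such a rule definition (shape follows the ELBv2 `CreateRule` API; the ARN, key, and value are made-up placeholders) might be:

```python
# Sketch of an ALB listener rule that forwards requests matching a
# query-string parameter to a specific target group (shape follows the
# Elastic Load Balancing v2 CreateRule API). Values are illustrative.
listener_rule = {
    "Conditions": [
        {
            "Field": "query-string",
            "QueryStringConfig": {
                # Matches requests like /ops?procedure=suture
                "Values": [{"Key": "procedure", "Value": "suture"}]
            },
        }
    ],
    "Actions": [
        {
            "Type": "forward",
            # Placeholder target group ARN.
            "TargetGroupArn": "arn:aws:elasticloadbalancing:us-east-1:111122223333:targetgroup/suture-tg/abc123",
        }
    ],
    "Priority": 10,  # lower numbers are evaluated first
}
```

Only Application Load Balancers evaluate HTTP-level conditions like this; Network and Gateway Load Balancers operate at lower layers and have no concept of query strings.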

Question # 125

A company's application is running on Amazon EC2 instances within an Auto Scaling group behind an Elastic Load Balancing (ELB) load balancer. Based on the application's history, the company anticipates a spike in traffic during a holiday each year. A solutions architect must design a strategy to ensure that the Auto Scaling group proactively increases capacity to minimize any performance impact on application users. Which solution will meet these requirements?

A. Create an Amazon CloudWatch alarm to scale up the EC2 instances when CPU utilization exceeds 90%.
B. Create a recurring scheduled action to scale up the Auto Scaling group before the expected period of peak demand.
C. Increase the minimum and maximum number of EC2 instances in the Auto Scaling group during the peak demand period.
D. Configure an Amazon Simple Notification Service (Amazon SNS) notification to send alerts when there are autoscaling:EC2_INSTANCE_LAUNCH events.
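The scheduled action in option B maps to Auto Scaling's `PutScheduledUpdateGroupAction` call. As a sketch, assuming an illustrative group name, date, and fleet sizes (a second action would restore the normal sizes after the holiday):

```python
# Sketch of the parameters for Auto Scaling's
# PutScheduledUpdateGroupAction, scaling out ahead of a known yearly
# peak. All names, times, and sizes are illustrative.
scheduled_action = {
    "AutoScalingGroupName": "web-asg",
    "ScheduledActionName": "holiday-scale-out",
    # Grow the fleet shortly before the expected traffic spike.
    "StartTime": "2025-12-24T00:00:00Z",
    "MinSize": 10,
    "MaxSize": 40,
    "DesiredCapacity": 20,
}
```

Because the capacity change happens at a fixed time rather than in reaction to load, instances are already warm when the spike arrives, which is what "proactively increases capacity" asks for.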

Question # 126

A company manages a data lake in an Amazon S3 bucket that numerous applications access. The S3 bucket contains a unique prefix for each application. The company wants to restrict each application to its specific prefix and to have granular control of the objects under each prefix. Which solution will meet these requirements with the LEAST operational overhead?

A. Create dedicated S3 access points and access point policies for each application.
B. Create an S3 Batch Operations job to set the ACL permissions for each object in the S3 bucket.
C. Replicate the objects in the S3 bucket to new S3 buckets for each application. Create replication rules by prefix.
D. Replicate the objects in the S3 bucket to new S3 buckets for each application. Create dedicated S3 access points for each application.
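An access point policy of the kind option A describes is an ordinary IAM policy document scoped to one application's prefix. A sketch, with the account ID, role, access point, and prefix names all made up for illustration:

```python
# Sketch of an S3 Access Point policy that confines one application to
# its own prefix. All identifiers below are illustrative placeholders.
access_point_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            # Only this application's role may use the access point.
            "Principal": {"AWS": "arn:aws:iam::111122223333:role/app-a-role"},
            "Action": ["s3:GetObject", "s3:PutObject"],
            # Objects reached through an access point are addressed as
            # <access point ARN>/object/<key>, so the prefix restriction
            # lives in the Resource ARN.
            "Resource": "arn:aws:s3:us-east-1:111122223333:accesspoint/app-a-ap/object/app-a/*",
        }
    ],
}
```

Each application gets its own access point and policy, so permissions can evolve per application without touching a single shared bucket policy.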

Question # 127

A company is migrating its workloads to AWS. The company has sensitive and critical data in on-premises relational databases that run on SQL Server instances. The company wants to use the AWS Cloud to increase security and reduce operational overhead for the databases. Which solution will meet these requirements?

A. Migrate the databases to Amazon EC2 instances. Use an AWS Key Management Service (AWS KMS) AWS managed key for encryption.
B. Migrate the databases to a Multi-AZ Amazon RDS for SQL Server DB instance. Use an AWS Key Management Service (AWS KMS) AWS managed key for encryption.
C. Migrate the data to an Amazon S3 bucket. Use Amazon Macie to ensure data security.
D. Migrate the databases to an Amazon DynamoDB table. Use Amazon CloudWatch Logs to ensure data security.

Question # 128

A company runs workloads in the AWS Cloud. The company wants to centrally collect security data to assess security across the entire company and to improve workload protection. Which solution will meet these requirements with the LEAST development effort?

A. Configure a data lake in AWS Lake Formation. Use AWS Glue crawlers to ingest the security data into the data lake.
B. Configure an AWS Lambda function to collect the security data in CSV format. Upload the data to an Amazon S3 bucket.
C. Configure a data lake in Amazon Security Lake to collect the security data. Upload the data to an Amazon S3 bucket.
D. Configure an AWS Database Migration Service (AWS DMS) replication instance to load the security data into an Amazon RDS cluster.

Question # 129

A company has separate AWS accounts for its finance, data analytics, and development departments. Because of costs and security concerns, the company wants to control which services each AWS account can use. Which solution will meet these requirements with the LEAST operational overhead?

A. Use AWS Systems Manager templates to control which AWS services each department can use.
B. Create organizational units (OUs) for each department in AWS Organizations. Attach service control policies (SCPs) to the OUs.
C. Use AWS CloudFormation to automatically provision only the AWS services that each department can use.
D. Set up a list of products in AWS Service Catalog in the AWS accounts to manage and control the usage of specific AWS services.

Question # 130

A global company runs its workloads on AWS. The company's application uses Amazon S3 buckets across AWS Regions for sensitive data storage and analysis. The company stores millions of objects in multiple S3 buckets daily. The company wants to identify all S3 buckets that are not versioning-enabled. Which solution will meet these requirements?

A. Set up an AWS CloudTrail event that has a rule to identify all S3 buckets that are not versioning-enabled across Regions.
B. Use Amazon S3 Storage Lens to identify all S3 buckets that are not versioning-enabled across Regions.
C. Enable IAM Access Analyzer for S3 to identify all S3 buckets that are not versioning-enabled across Regions.
D. Create an S3 Multi-Region Access Point to identify all S3 buckets that are not versioning-enabled across Regions.

Question # 131

A company is designing an event-driven order processing system. Each order requires multiple validation steps after the order is created. An independent AWS Lambda function performs each validation step. Each validation step is independent from the other validation steps. Individual validation steps need only a subset of the order event information. The company wants to ensure that each validation step Lambda function has access to only the information from the order event that the function requires. The components of the order processing system should be loosely coupled to accommodate future business changes. Which solution will meet these requirements?

A. Create an Amazon Simple Queue Service (Amazon SQS) queue for each validation step. Create a new Lambda function to transform the order data to the format that each validation step requires and to publish the messages to the appropriate SQS queues. Subscribe each validation step Lambda function to its corresponding SQS queue.
B. Create an Amazon Simple Notification Service (Amazon SNS) topic. Subscribe the validation step Lambda functions to the SNS topic. Use message body filtering to send only the required data to each subscribed Lambda function.
C. Create an Amazon EventBridge event bus. Create an event rule for each validation step. Configure the input transformer to send only the required data to each target validation step Lambda function.
D. Create an Amazon Simple Queue Service (Amazon SQS) queue. Create a new Lambda function to subscribe to the SQS queue and to transform the order data to the format that each validation step requires. Use the new Lambda function to perform synchronous invocations of the validation step Lambda functions in parallel on separate threads.

Question # 132

A company uses Amazon API Gateway to manage its REST APIs that third-party service providers access. The company must protect the REST APIs from SQL injection and cross-site scripting attacks. What is the MOST operationally efficient solution that meets these requirements?

A. Configure AWS Shield.
B. Configure AWS WAF.
C. Set up API Gateway with an Amazon CloudFront distribution. Configure AWS Shield in CloudFront.
D. Set up API Gateway with an Amazon CloudFront distribution. Configure AWS WAF in CloudFront.

Question # 133

A company has multiple VPCs across AWS Regions to support and run workloads that are isolated from workloads in other Regions. Because of a recent application launch requirement, the company's VPCs must communicate with all other VPCs across all Regions. Which solution will meet these requirements with the LEAST amount of administrative effort?

A. Use VPC peering to manage VPC communication in a single Region. Use VPC peering across Regions to manage VPC communications.
B. Use AWS Direct Connect gateways across all Regions to connect VPCs across Regions and manage VPC communications.
C. Use AWS Transit Gateway to manage VPC communication in a single Region and Transit Gateway peering across Regions to manage VPC communications.
D. Use AWS PrivateLink across all Regions to connect VPCs across Regions and manage VPC communications.

Question # 134

A company is creating a prototype of an ecommerce website on AWS. The website consists of an Application Load Balancer, an Auto Scaling group of Amazon EC2 instances for web servers, and an Amazon RDS for MySQL DB instance that runs with the Single-AZ configuration. The website is slow to respond during searches of the product catalog. The product catalog is a group of tables in the MySQL database that the company does not update frequently. A solutions architect has determined that the CPU utilization on the DB instance is high when product catalog searches occur. What should the solutions architect recommend to improve the performance of the website during searches of the product catalog?

A. Migrate the product catalog to an Amazon Redshift database. Use the COPY command to load the product catalog tables.
B. Implement an Amazon ElastiCache for Redis cluster to cache the product catalog. Use lazy loading to populate the cache.
C. Add an additional scaling policy to the Auto Scaling group to launch additional EC2 instances when database response is slow.
D. Turn on the Multi-AZ configuration for the DB instance. Configure the EC2 instances to throttle the product catalog queries that are sent to the database.

Question # 135

A global ecommerce company runs its critical workloads on AWS. The workloads use an Amazon RDS for PostgreSQL DB instance that is configured for a Multi-AZ deployment. Customers have reported application timeouts when the company undergoes database failovers. The company needs a resilient solution to reduce failover time. Which solution will meet these requirements?

A. Create an Amazon RDS Proxy. Assign the proxy to the DB instance.
B. Create a read replica for the DB instance. Move the read traffic to the read replica.
C. Enable Performance Insights. Monitor the CPU load to identify the timeouts.
D. Take regular automatic snapshots. Copy the automatic snapshots to multiple AWS Regions.

Question # 136

A company wants to use Amazon Elastic Container Service (Amazon ECS) to run its on-premises application in a hybrid environment. The application currently runs on containers on premises. The company needs a single container solution that can scale in an on-premises, hybrid, or cloud environment. The company must run new application containers in the AWS Cloud and must use a load balancer for HTTP traffic. Which combination of actions will meet these requirements? (Select TWO.)

A. Set up an ECS cluster that uses the AWS Fargate launch type for the cloud application containers. Use an Amazon ECS Anywhere external launch type for the on-premises application containers.
B. Set up an Application Load Balancer for cloud ECS services.
C. Set up a Network Load Balancer for cloud ECS services.
D. Set up an ECS cluster that uses the AWS Fargate launch type. Use Fargate for the cloud application containers and the on-premises application containers.
E. Set up an ECS cluster that uses the Amazon EC2 launch type for the cloud application containers. Use Amazon ECS Anywhere with an AWS Fargate launch type for the on-premises application containers.

Question # 137

A company runs an application that uses Amazon RDS for PostgreSQL. The application receives traffic only on weekdays during business hours. The company wants to optimize costs and reduce operational overhead based on this usage. Which solution will meet these requirements?

A. Use the Instance Scheduler on AWS to configure start and stop schedules.
B. Turn off automatic backups. Create weekly manual snapshots of the database.
C. Create a custom AWS Lambda function to start and stop the database based on minimum CPU utilization.
D. Purchase All Upfront reserved DB instances.

Question # 138

A company is preparing to store confidential data in Amazon S3. For compliance reasons, the data must be encrypted at rest. Encryption key usage must be logged for auditing purposes. Keys must be rotated every year. Which solution meets these requirements and is the MOST operationally efficient?

A. Server-side encryption with customer-provided keys (SSE-C)
B. Server-side encryption with Amazon S3 managed keys (SSE-S3)
C. Server-side encryption with AWS KMS keys (SSE-KMS) with manual rotation
D. Server-side encryption with AWS KMS keys (SSE-KMS) with automatic rotation

Question # 139

A news company that has reporters all over the world is hosting its broadcast system on AWS. The reporters send live broadcasts to the broadcast system. The reporters use software on their phones to send live streams through the Real-Time Messaging Protocol (RTMP). A solutions architect must design a solution that gives the reporters the ability to send the highest quality streams. The solution must provide accelerated TCP connections back to the broadcast system. What should the solutions architect use to meet these requirements?

A. Amazon CloudFront
B. AWS Global Accelerator
C. AWS Client VPN
D. Amazon EC2 instances and Elastic IP addresses

Question # 140

A solutions architect is creating an application. The application will run on Amazon EC2 instances in private subnets across multiple Availability Zones in a VPC. The EC2 instances will frequently access large files that contain confidential information. These files are stored in Amazon S3 buckets for processing. The solutions architect must optimize the network architecture to minimize data transfer costs. What should the solutions architect do to meet these requirements?

A. Create a gateway endpoint for Amazon S3 in the VPC. In the route tables for the private subnets, add an entry for the gateway endpoint.
B. Create a single NAT gateway in a public subnet. In the route tables for the private subnets, add a default route that points to the NAT gateway.
C. Create an AWS PrivateLink interface endpoint for Amazon S3 in the VPC. In the route tables for the private subnets, add an entry for the interface endpoint.
D. Create one NAT gateway for each Availability Zone in public subnets. In each of the route tables for the private subnets, add a default route that points to the NAT gateway in the same Availability Zone.
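The gateway-endpoint approach in option A corresponds to a single EC2 `CreateVpcEndpoint` call that also wires the endpoint into the private subnets' route tables. A sketch of those parameters (the IDs and the Region embedded in the service name are illustrative):

```python
# Sketch of the parameters for EC2's CreateVpcEndpoint call that
# creates an S3 gateway endpoint. All IDs below are placeholders.
endpoint_params = {
    "VpcEndpointType": "Gateway",
    "VpcId": "vpc-0123456789abcdef0",
    # Gateway endpoints exist for S3 and DynamoDB; the service name
    # embeds the Region.
    "ServiceName": "com.amazonaws.us-east-1.s3",
    # Listing the private subnets' route tables makes EC2 add the S3
    # prefix-list route automatically, so S3 traffic never crosses a
    # NAT gateway (and incurs no NAT data-processing charges).
    "RouteTableIds": ["rtb-aaaa1111", "rtb-bbbb2222"],
}
```

Gateway endpoints themselves carry no hourly or per-GB charge, which is why they beat both NAT gateways and (per-hour, per-GB billed) interface endpoints on cost for heavy S3 traffic.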

Question # 141

A company plans to run a high performance computing (HPC) workload on Amazon EC2 instances. The workload requires low-latency network performance and high network throughput with tightly coupled node-to-node communication. Which solution will meet these requirements?

A. Configure the EC2 instances to be part of a cluster placement group
B. Launch the EC2 instances with Dedicated Instance tenancy.
C. Launch the EC2 instances as Spot Instances.
D. Configure an On-Demand Capacity Reservation when the EC2 instances are launched.

Question # 142

A company uses Amazon EC2 instances and Amazon Elastic Block Store (Amazon EBS) to run its self-managed database. The company has 350 TB of data spread across all EBS volumes. The company takes daily EBS snapshots and keeps the snapshots for 1 month. The daily change rate is 5% of the EBS volumes. Because of new regulations, the company needs to keep the monthly snapshots for 7 years. The company needs to change its backup strategy to comply with the new regulations and to ensure that data is available with minimal administrative effort. Which solution will meet these requirements MOST cost-effectively?

A. Keep the daily snapshot in the EBS snapshot standard tier for 1 month. Copy the monthly snapshot to Amazon S3 Glacier Deep Archive with a 7-year retention period.
B. Continue with the current EBS snapshot policy. Add a new policy to move the monthly snapshot to Amazon EBS Snapshots Archive with a 7-year retention period.
C. Keep the daily snapshot in the EBS snapshot standard tier for 1 month. Keep the monthly snapshot in the standard tier for 7 years. Use incremental snapshots.
D. Keep the daily snapshot in the EBS snapshot standard tier. Use EBS direct APIs to take snapshots of all the EBS volumes every month. Store the snapshots in an Amazon S3 bucket in the Infrequent Access tier for 7 years.

Question # 143

A company has an application that serves clients that are deployed in more than 20,000 retail storefront locations around the world. The application consists of backend web services that are exposed over HTTPS on port 443. The application is hosted on Amazon EC2 instances behind an Application Load Balancer (ALB). The retail locations communicate with the web application over the public internet. The company allows each retail location to register the IP address that the retail location has been allocated by its local ISP. The company's security team recommends increasing the security of the application endpoint by restricting access to only the IP addresses registered by the retail locations. What should a solutions architect do to meet these requirements?

A. Associate an AWS WAF web ACL with the ALB. Use IP rule sets on the ALB to filter traffic. Update the IP addresses in the rule to include the registered IP addresses.
B. Deploy AWS Firewall Manager to manage the ALB. Configure firewall rules to restrict traffic to the ALB. Modify the firewall rules to include the registered IP addresses.
C. Store the IP addresses in an Amazon DynamoDB table. Configure an AWS Lambda authorization function on the ALB to validate that incoming requests are from the registered IP addresses.
D. Configure the network ACL on the subnet that contains the public interface of the ALB. Update the ingress rules on the network ACL with entries for each of the registered IP addresses.

Question # 144

A company has an application that customers use to upload images to an Amazon S3 bucket. Each night, the company launches an Amazon EC2 Spot Fleet that processes all the images that the company received that day. The processing for each image takes 2 minutes and requires 512 MB of memory. A solutions architect needs to change the application to process the images when the images are uploaded. Which change will meet these requirements MOST cost-effectively?

A. Use S3 Event Notifications to write a message with image details to an Amazon Simple Queue Service (Amazon SQS) queue. Configure an AWS Lambda function to read the messages from the queue and to process the images.
B. Use S3 Event Notifications to write a message with image details to an Amazon Simple Queue Service (Amazon SQS) queue. Configure an EC2 Reserved Instance to read the messages from the queue and to process the images.
C. Use S3 Event Notifications to publish a message with image details to an Amazon Simple Notification Service (Amazon SNS) topic. Configure a container instance in Amazon Elastic Container Service (Amazon ECS) to subscribe to the topic and to process the images.
D. Use S3 Event Notifications to publish a message with image details to an Amazon Simple Notification Service (Amazon SNS) topic. Configure an AWS Lambda function to subscribe to the topic and to process the images.

Question # 145

A company has a web application that has thousands of users. The application uses 8-10 user-uploaded images to generate AI images. Users can download the generated AI images once every 6 hours. The company also has a premium user option that gives users the ability to download the generated AI images anytime. The company uses the user-uploaded images to run AI model training twice a year. The company needs a storage solution to store the images. Which storage solution meets these requirements MOST cost-effectively?

A. Move uploaded images to Amazon S3 Glacier Deep Archive. Move premium user-generated AI images to S3 Standard. Move non-premium user-generated AI images to S3 Standard-Infrequent Access (S3 Standard-IA).
B. Move uploaded images to Amazon S3 Glacier Deep Archive. Move all generated AI images to S3 Glacier Flexible Retrieval.
C. Move uploaded images to Amazon S3 One Zone-Infrequent Access (S3 One Zone-IA). Move premium user-generated AI images to S3 Standard. Move non-premium user-generated AI images to S3 Standard-Infrequent Access (S3 Standard-IA).
D. Move uploaded images to Amazon S3 One Zone-Infrequent Access (S3 One Zone-IA). Move all generated AI images to S3 Glacier Flexible Retrieval.

Question # 146

A company wants to build a map of its IT infrastructure to identify and enforce policies on resources that pose security risks. The company's security team must be able to query data in the IT infrastructure map and quickly identify security risks. Which solution will meet these requirements with the LEAST operational overhead?

A. Use Amazon RDS to store the data. Use SQL to query the data to identify security risks.
B. Use Amazon Neptune to store the data. Use SPARQL to query the data to identify security risks.
C. Use Amazon Redshift to store the data. Use SQL to query the data to identify security risks.
D. Use Amazon DynamoDB to store the data. Use PartiQL to query the data to identify security risks.

Question # 147

A company maintains about 300 TB in Amazon S3 Standard storage month after month. The S3 objects are each typically around 50 GB in size and are frequently replaced with multipart uploads by their global application. The number and size of S3 objects remain constant, but the company's S3 storage costs are increasing each month. How should a solutions architect reduce costs in this situation?

A. Switch from multipart uploads to Amazon S3 Transfer Acceleration.
B. Enable an S3 Lifecycle policy that deletes incomplete multipart uploads.
C. Configure S3 inventory to prevent objects from being archived too quickly.
D. Configure Amazon CloudFront to reduce the number of objects stored in Amazon S3.
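Option B's cleanup of incomplete multipart uploads is also expressed as a lifecycle rule. As a sketch (the 7-day window is an illustrative choice, not from the question):

```python
# Sketch of an S3 Lifecycle rule that aborts incomplete multipart
# uploads (shape follows the S3 PutBucketLifecycleConfiguration API).
mpu_cleanup = {
    "Rules": [
        {
            "ID": "abort-stale-multipart-uploads",  # illustrative name
            "Status": "Enabled",
            "Filter": {"Prefix": ""},  # apply bucket-wide
            # Parts of uploads that were started but never completed
            # keep accruing storage charges until they are aborted.
            "AbortIncompleteMultipartUpload": {"DaysAfterInitiation": 7},
        }
    ]
}
```

This matters here because frequently replaced 50 GB objects mean many multipart uploads, and any upload that fails partway leaves billable parts that never appear in object listings.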

Question # 148

A company is building a microservices-based application that will be deployed on Amazon Elastic Kubernetes Service (Amazon EKS). The microservices will interact with each other. The company wants to ensure that the application is observable to identify performance issues in the future. Which solution will meet these requirements?

A. Configure the application to use Amazon ElastiCache to reduce the number of requests that are sent to the microservices.
B. Configure Amazon CloudWatch Container Insights to collect metrics from the EKS clusters. Configure AWS X-Ray to trace the requests between the microservices.
C. Configure AWS CloudTrail to review the API calls. Build an Amazon QuickSight dashboard to observe the microservice interactions.
D. Use AWS Trusted Advisor to understand the performance of the application.

Question # 149

A company has a multi-tier payment processing application that is based on virtual machines (VMs). The communication between the tiers occurs asynchronously through a third-party middleware solution that guarantees exactly-once delivery. The company needs a solution that requires the least amount of infrastructure management. The solution must guarantee exactly-once delivery for application messaging. Which combination of actions will meet these requirements? (Select TWO.)

A. Use AWS Lambda for the compute layers in the architecture.
B. Use Amazon EC2 instances for the compute layers in the architecture.
C. Use Amazon Simple Notification Service (Amazon SNS) as the messaging component between the compute layers.
D. Use Amazon Simple Queue Service (Amazon SQS) FIFO queues as the messaging component between the compute layers.
E. Use containers that are based on Amazon Elastic Kubernetes Service (Amazon EKS) for the compute layers in the architecture.

Question # 150

A company has a mobile game that reads most of its metadata from an Amazon RDS DB instance. As the game increased in popularity, developers noticed slowdowns related to the game's metadata load times. Performance metrics indicate that simply scaling the database will not help. A solutions architect must explore all options that include capabilities for snapshots, replication, and sub-millisecond response times. What should the solutions architect recommend to solve these issues?

A. Migrate the database to Amazon Aurora with Aurora Replicas.
B. Migrate the database to Amazon DynamoDB with global tables.
C. Add an Amazon ElastiCache for Redis layer in front of the database.
D. Add an Amazon ElastiCache for Memcached layer in front of the database.

Question # 151

A financial company needs to handle highly sensitive data. The company will store the data in an Amazon S3 bucket. The company needs to ensure that the data is encrypted in transit and at rest. The company must manage the encryption keys outside the AWS Cloud. Which solution will meet these requirements?

A. Encrypt the data in the S3 bucket with server-side encryption (SSE) that uses an AWS Key Management Service (AWS KMS) customer managed key.
B. Encrypt the data in the S3 bucket with server-side encryption (SSE) that uses an AWS Key Management Service (AWS KMS) AWS managed key.
C. Encrypt the data in the S3 bucket with the default server-side encryption (SSE).
D. Encrypt the data at the company's data center before storing the data in the S3 bucket.

Question # 152

A company has an on-premises data center that is running out of storage capacity. The company wants to migrate its storage infrastructure to AWS while minimizing bandwidth costs. The solution must allow for immediate retrieval of data at no additional cost. How can these requirements be met?

A. Deploy Amazon S3 Glacier Vault and enable expedited retrieval. Enable provisioned retrieval capacity for the workload.
B. Deploy AWS Storage Gateway using cached volumes. Use Storage Gateway to store data in Amazon S3 while retaining copies of frequently accessed data subsets locally.
C. Deploy AWS Storage Gateway using stored volumes to store data locally. Use Storage Gateway to asynchronously back up point-in-time snapshots of the data to Amazon S3.
D. Deploy AWS Direct Connect to connect with the on-premises data center. Configure AWS Storage Gateway to store data locally. Use Storage Gateway to asynchronously back up point-in-time snapshots of the data to Amazon S3.

Question # 153

A company has a business-critical application that runs on Amazon EC2 instances. The application stores data in an Amazon DynamoDB table. The company must be able to revert the table to any point within the last 24 hours. Which solution meets these requirements with the LEAST operational overhead?

A. Configure point-in-time recovery for the table.
B. Use AWS Backup for the table.
C. Use an AWS Lambda function to make an on-demand backup of the table every hour.
D. Turn on streams on the table to capture a log of all changes to the table in the last 24 hours. Store a copy of the stream in an Amazon S3 bucket.
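Point-in-time recovery from option A is a single setting on the table, enabled through DynamoDB's `UpdateContinuousBackups` call. A sketch of those parameters (the table name is illustrative):

```python
# Sketch of the parameters for DynamoDB's UpdateContinuousBackups
# call, which turns on point-in-time recovery (PITR). PITR allows a
# restore to any second within its retention window, which comfortably
# covers the last 24 hours. The table name is a placeholder.
pitr_params = {
    "TableName": "orders",
    "PointInTimeRecoverySpecification": {
        "PointInTimeRecoveryEnabled": True,
    },
}
```

Once enabled, restores are performed with `RestoreTableToPointInTime` into a new table, with no backup jobs or Lambda schedules to maintain.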

Question # 154

A company's website hosted on Amazon EC2 instances processes classified data stored in Amazon S3. Due to security concerns, the company requires a private and secure connection between its EC2 resources and Amazon S3. Which solution meets these requirements?

A. Set up S3 bucket policies to allow access from a VPC endpoint.
B. Set up an IAM policy to grant read-write access to the S3 bucket.
C. Set up a NAT gateway to access resources outside the private subnet.
D. Set up an access key ID and a secret access key to access the S3 bucket.

Question # 155

A company wants to use NAT gateways in its AWS environment. The company's Amazon EC2 instances in private subnets must be able to connect to the public internet through the NAT gateways. Which solution will meet these requirements?

A. Create public NAT gateways in the same private subnets as the EC2 instances.
B. Create private NAT gateways in the same private subnets as the EC2 instances.
C. Create public NAT gateways in public subnets in the same VPCs as the EC2 instances.
D. Create private NAT gateways in public subnets in the same VPCs as the EC2 instances.

Question # 156

A company wants to migrate an on-premises legacy application to AWS. The application ingests customer order files from an on-premises enterprise resource planning (ERP) system. The application then uploads the files to an SFTP server. The application uses a scheduled job that checks for order files every hour. The company already has an AWS account that has connectivity to the on-premises network. The new application on AWS must support integration with the existing ERP system. The new application must be secure and resilient and must use the SFTP protocol to process orders from the ERP system immediately. Which solution will meet these requirements?

A. Create an AWS Transfer Family SFTP internet-facing server in two Availability Zones. Use Amazon S3 storage. Create an AWS Lambda function to process order files. Use S3 Event Notifications to send s3:ObjectCreated:* events to the Lambda function.
B. Create an AWS Transfer Family SFTP internet-facing server in one Availability Zone. Use Amazon Elastic File System (Amazon EFS) storage. Create an AWS Lambda function to process order files. Use a Transfer Family managed workflow to invoke the Lambda function.
C. Create an AWS Transfer Family SFTP internal server in two Availability Zones. Use Amazon Elastic File System (Amazon EFS) storage. Create an AWS Step Functions state machine to process order files. Use Amazon EventBridge Scheduler to invoke the state machine to periodically check Amazon EFS for order files.
D. Create an AWS Transfer Family SFTP internal server in two Availability Zones. Use Amazon S3 storage. Create an AWS Lambda function to process order files. Use a Transfer Family managed workflow to invoke the Lambda function.

Question # 157

A company needs to create an AWS Lambda function that will run in a VPC in the company's primary AWS account. The Lambda function needs to access files that the company stores in an Amazon Elastic File System (Amazon EFS) file system. The EFS file system is located in a secondary AWS account. As the company adds files to the file system, the solution must scale to meet the demand. Which solution will meet these requirements MOST cost-effectively?

A. Create a new EFS file system in the primary account. Use AWS DataSync to copy the contents of the original EFS file system to the new EFS file system.
B. Create a VPC peering connection between the VPCs that are in the primary account and the secondary account.
C. Create a second Lambda function in the secondary account that has a mount that is configured for the file system. Use the primary account's Lambda function to invoke the secondary account's Lambda function.
D. Move the contents of the file system to a Lambda layer. Configure the Lambda layer's permissions to allow the company's secondary account to use the Lambda layer.

Question # 158

A company uses Amazon S3 to store high-resolution pictures in an S3 bucket. To minimize application changes, the company stores the pictures as the latest version of an S3 object. The company needs to retain only the two most recent versions of the pictures. The company wants to reduce costs. The company has identified the S3 bucket as a large expense. Which solution will reduce the S3 costs with the LEAST operational overhead?

A. Use S3 Lifecycle to delete expired object versions and retain the two most recent versions.
B. Use an AWS Lambda function to check for older versions and delete all but the two most recent versions.
C. Use S3 Batch Operations to delete noncurrent object versions and retain only the two most recent versions.
D. Deactivate versioning on the S3 bucket and retain the two most recent versions.
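The lifecycle rule behind option A could be sketched as the following configuration. The rule ID is an assumption; note that retaining "the two most recent versions" means the current version plus one noncurrent version, so `NewerNoncurrentVersions` is 1.

```python
# Sketch of an S3 Lifecycle rule that expires noncurrent object versions
# while always keeping the newest noncurrent copy (so two versions total:
# the current version plus one noncurrent version).
lifecycle_config = {
    "Rules": [
        {
            "ID": "keep-two-most-recent-versions",  # assumed rule ID
            "Status": "Enabled",
            "Filter": {"Prefix": ""},  # apply to the whole bucket
            "NoncurrentVersionExpiration": {
                # Delete noncurrent versions after 1 day, except the
                # newest noncurrent version, which is retained.
                "NoncurrentDays": 1,
                "NewerNoncurrentVersions": 1,
            },
        }
    ]
}
```

A configuration like this would typically be applied with the boto3 `put_bucket_lifecycle_configuration` S3 API call; S3 then deletes the older versions automatically with no ongoing operational work.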

Question # 159

A company is creating an application. The company stores data from tests of the application in multiple on-premises locations. The company needs to connect the on-premises locations to VPCs in an AWS Region in the AWS Cloud. The number of accounts and VPCs will increase during the next year. The network architecture must simplify the administration of new connections and must provide the ability to scale. Which solution will meet these requirements with the LEAST administrative overhead?

A. Create a peering connection between the VPCs. Create a VPN connection between the VPCs and the on-premises locations.
B. Launch an Amazon EC2 instance. On the instance, include VPN software that uses a VPN connection to connect all VPCs and on-premises locations.
C. Create a transit gateway. Create VPC attachments for the VPC connections. Create VPN attachments for the on-premises connections.
D. Create an AWS Direct Connect connection between the on-premises locations and a central VPC. Connect the central VPC to other VPCs by using peering connections.

Question # 160

A company is developing a new mobile app. The company must implement proper traffic filtering to protect its Application Load Balancer (ALB) against common application-level attacks, such as cross-site scripting or SQL injection. The company has minimal infrastructure and operational staff. The company needs to reduce its share of the responsibility in managing, updating, and securing servers for its AWS environment. What should a solutions architect recommend to meet these requirements?

A. Configure AWS WAF rules and associate them with the ALB.
B. Deploy the application using Amazon S3 with public hosting enabled.
C. Deploy AWS Shield Advanced and add the ALB as a protected resource.
D. Create a new ALB that directs traffic to an Amazon EC2 instance running a third-party firewall, which then passes the traffic to the current ALB.
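Option A could be sketched as a regional web ACL that attaches AWS managed rule groups. The ACL name and visibility settings below are assumptions, while `AWSManagedRulesCommonRuleSet` and `AWSManagedRulesSQLiRuleSet` are AWS-published rule groups covering cross-site scripting and SQL injection patterns.

```python
# Sketch of a WAFv2 web ACL definition that attaches AWS managed rule
# groups. The ACL name and metric names are hypothetical.
def managed_rule(name: str, priority: int) -> dict:
    """Build a rule entry that references an AWS managed rule group."""
    return {
        "Name": name,
        "Priority": priority,
        "OverrideAction": {"None": {}},  # let the group's actions apply
        "Statement": {
            "ManagedRuleGroupStatement": {"VendorName": "AWS", "Name": name}
        },
        "VisibilityConfig": {
            "SampledRequestsEnabled": True,
            "CloudWatchMetricsEnabled": True,
            "MetricName": name,
        },
    }

web_acl = {
    "Name": "mobile-app-alb-acl",  # hypothetical name
    "Scope": "REGIONAL",           # regional scope is required for an ALB
    "DefaultAction": {"Allow": {}},
    "Rules": [
        managed_rule("AWSManagedRulesCommonRuleSet", 0),  # XSS and common attacks
        managed_rule("AWSManagedRulesSQLiRuleSet", 1),    # SQL injection
    ],
    "VisibilityConfig": {
        "SampledRequestsEnabled": True,
        "CloudWatchMetricsEnabled": True,
        "MetricName": "mobile-app-alb-acl",
    },
}
```

After the ACL is created with the WAFv2 CreateWebACL API, it would be associated with the load balancer via AssociateWebACL using the ALB's ARN, keeping rule maintenance on AWS rather than on company-managed servers.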

Question # 161

A company’s security team requests that network traffic be captured in VPC Flow Logs. The logs will be frequently accessed for 90 days and then accessed intermittently. What should a solutions architect do to meet these requirements when configuring the logs?

A. Use Amazon CloudWatch as the target. Set the CloudWatch log group with an expiration of 90 days.
B. Use Amazon Kinesis as the target. Configure the Kinesis stream to always retain the logs for 90 days.
C. Use AWS CloudTrail as the target. Configure CloudTrail to save to an Amazon S3 bucket, and enable S3 Intelligent-Tiering.
D. Use Amazon S3 as the target. Enable an S3 Lifecycle policy to transition the logs to S3 Standard-Infrequent Access (S3 Standard-IA) after 90 days.
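The S3 Lifecycle transition behind option D could look like the following sketch. Unlike an expiration rule, a transition rule keeps the objects but moves them to a cheaper storage class; the rule ID and the `AWSLogs/` prefix are assumptions about how the flow logs are laid out in the bucket.

```python
# Sketch of a lifecycle rule that keeps VPC Flow Logs in S3 Standard for
# the 90 days of frequent access, then moves them to S3 Standard-IA
# for intermittent access.
flow_log_lifecycle = {
    "Rules": [
        {
            "ID": "flow-logs-to-standard-ia",  # assumed rule ID
            "Status": "Enabled",
            "Filter": {"Prefix": "AWSLogs/"},  # assumed log prefix
            "Transitions": [
                {"Days": 90, "StorageClass": "STANDARD_IA"}
            ],
        }
    ]
}
```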

Question # 162

A company is developing a mobile game that streams score updates to a backend processor and then posts results on a leaderboard. A solutions architect needs to design a solution that can handle large traffic spikes, process the mobile game updates in order of receipt, and store the processed updates in a highly available database. The company also wants to minimize the management overhead required to maintain the solution. What should the solutions architect do to meet these requirements?

A. Push score updates to Amazon Kinesis Data Streams. Process the updates in Kinesis Data Streams with AWS Lambda. Store the processed updates in Amazon DynamoDB.
B. Push score updates to Amazon Kinesis Data Streams. Process the updates with a fleet of Amazon EC2 instances set up for Auto Scaling. Store the processed updates in Amazon Redshift.
C. Push score updates to an Amazon Simple Notification Service (Amazon SNS) topic. Subscribe an AWS Lambda function to the SNS topic to process the updates. Store the processed updates in a SQL database running on Amazon EC2.
D. Push score updates to an Amazon Simple Queue Service (Amazon SQS) queue. Use a fleet of Amazon EC2 instances with Auto Scaling to process the updates in the SQS queue. Store the processed updates in an Amazon RDS Multi-AZ DB instance.
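Option A's consumer could be sketched as a short Lambda handler. Kinesis delivers records within a shard in order, so iterating the batch preserves order of receipt; the payload fields below are hypothetical.

```python
import base64
import json

def handler(event, context):
    """Sketch of the Lambda consumer in option A. Kinesis record data
    arrives base64-encoded; decoding each record in batch order preserves
    the order of receipt within a shard."""
    updates = []
    for record in event["Records"]:
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        # A real function would write each update to DynamoDB (for example
        # with a put_item call); here we just return the decoded updates.
        updates.append((payload["player"], payload["score"]))
    return updates
```

Because both Kinesis and Lambda are fully managed and DynamoDB is highly available, this pairing also keeps the maintenance burden low.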

Question # 163

A company runs an SMB file server in its data center. The file server stores large files that the company frequently accesses for up to 7 days after the file creation date. After 7 days, the company needs to be able to access the files with a maximum retrieval time of 24 hours. Which solution will meet these requirements?

A. Use AWS DataSync to copy data that is older than 7 days from the SMB file server to AWS.
B. Create an Amazon S3 File Gateway to increase the company's storage space. Create an S3 Lifecycle policy to transition the data to S3 Glacier Deep Archive after 7 days.
C. Create an Amazon FSx File Gateway to increase the company's storage space. Create an Amazon S3 Lifecycle policy to transition the data after 7 days.
D. Configure access to Amazon S3 for each user. Create an S3 Lifecycle policy to transition the data to S3 Glacier Flexible Retrieval after 7 days.

Question # 164

A company has an organization in AWS Organizations that has all features enabled. The company requires that all API calls and logins in any existing or new AWS account must be audited. The company needs a managed solution to prevent additional work and to minimize costs. The company also needs to know when any AWS account is not compliant with the AWS Foundational Security Best Practices (FSBP) standard. Which solution will meet these requirements with the LEAST operational overhead?

A. Deploy an AWS Control Tower environment in the Organizations management account. Enable AWS Security Hub and AWS Control Tower Account Factory in the environment.
B. Deploy an AWS Control Tower environment in a dedicated Organizations member account. Enable AWS Security Hub and AWS Control Tower Account Factory in the environment.
C. Use AWS Managed Services (AMS) Accelerate to build a multi-account landing zone (MALZ). Submit an RFC to self-service provision Amazon GuardDuty in the MALZ.
D. Use AWS Managed Services (AMS) Accelerate to build a multi-account landing zone (MALZ). Submit an RFC to self-service provision AWS Security Hub in the MALZ.

Question # 165

A solutions architect is designing a user authentication solution for a company. The solution must invoke two-factor authentication for users that log in from inconsistent geographic locations, IP addresses, or devices. The solution must also be able to scale up to accommodate millions of users. Which solution will meet these requirements?

A. Configure Amazon Cognito user pools for user authentication. Enable the risk-based adaptive authentication feature with multi-factor authentication (MFA).
B. Configure Amazon Cognito identity pools for user authentication. Enable multi-factor authentication (MFA).
C. Configure AWS Identity and Access Management (IAM) users for user authentication. Attach an IAM policy that allows the AllowManageOwnUserMFA action.
D. Configure AWS IAM Identity Center (AWS Single Sign-On) authentication for user authentication. Configure the permission sets to require multi-factor authentication (MFA).

Question # 166

A solutions architect needs to design the architecture for an application that a vendor provides as a Docker container image. The container needs 50 GB of storage available for temporary files. The infrastructure must be serverless. Which solution meets these requirements with the LEAST operational overhead?

A. Create an AWS Lambda function that uses the Docker container image with an Amazon S3 mounted volume that has more than 50 GB of space.
B. Create an AWS Lambda function that uses the Docker container image with an Amazon Elastic Block Store (Amazon EBS) volume that has more than 50 GB of space.
C. Create an Amazon Elastic Container Service (Amazon ECS) cluster that uses the AWS Fargate launch type. Create a task definition for the container image with an Amazon Elastic File System (Amazon EFS) volume. Create a service with that task definition.
D. Create an Amazon Elastic Container Service (Amazon ECS) cluster that uses the Amazon EC2 launch type with an Amazon Elastic Block Store (Amazon EBS) volume that has more than 50 GB of space. Create a task definition for the container image. Create a service with that task definition.
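The Fargate task definition fragment behind option C could be sketched as follows. The family name, image, file system ID, and mount path are all hypothetical placeholders; the relevant parts are the `FARGATE` compatibility, the `efsVolumeConfiguration` volume, and the mount point wiring it into the container.

```python
# Sketch of an ECS task definition (option C): a Fargate task with an
# EFS volume mounted into the vendor container for temporary files.
task_definition = {
    "family": "vendor-app",                 # hypothetical family name
    "requiresCompatibilities": ["FARGATE"],
    "networkMode": "awsvpc",                # required for Fargate
    "cpu": "1024",
    "memory": "2048",
    "volumes": [
        {
            "name": "scratch",
            "efsVolumeConfiguration": {
                "fileSystemId": "fs-0123456789abcdef0",  # hypothetical
                "transitEncryption": "ENABLED",
            },
        }
    ],
    "containerDefinitions": [
        {
            "name": "vendor-container",
            "image": "vendor/app:latest",   # hypothetical image
            "mountPoints": [
                # Mount the EFS volume where the app writes temp files.
                {"sourceVolume": "scratch", "containerPath": "/tmp/work"}
            ],
        }
    ],
}
```

Because EFS grows elastically, the 50 GB of temporary space needs no capacity management, which keeps the solution serverless end to end.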

Question # 167

A company uses AWS Organizations to run workloads within multiple AWS accounts. A tagging policy adds department tags to AWS resources when the company creates tags. An accounting team needs to determine spending on Amazon EC2 consumption. The accounting team must determine which departments are responsible for the costs regardless of AWS account. The accounting team has access to AWS Cost Explorer for all AWS accounts within the organization and needs to access all reports from Cost Explorer. Which solution meets these requirements in the MOST operationally efficient way?

A. From the Organizations management account billing console, activate a user-defined cost allocation tag named department. Create one cost report in Cost Explorer grouping by tag name, and filter by EC2.
B. From the Organizations management account billing console, activate an AWS-defined cost allocation tag named department. Create one cost report in Cost Explorer grouping by tag name, and filter by EC2.
C. From the Organizations member account billing console, activate a user-defined cost allocation tag named department. Create one cost report in Cost Explorer grouping by the tag name, and filter by EC2.
D. From the Organizations member account billing console, activate an AWS-defined cost allocation tag named department. Create one cost report in Cost Explorer grouping by tag name, and filter by EC2.

Question # 168

A company is building an Amazon Elastic Kubernetes Service (Amazon EKS) cluster for its workloads. All secrets that are stored in Amazon EKS must be encrypted in the Kubernetes etcd key-value store. Which solution will meet these requirements?

A. Create a new AWS Key Management Service (AWS KMS) key. Use AWS Secrets Manager to manage, rotate, and store all secrets in Amazon EKS.
B. Create a new AWS Key Management Service (AWS KMS) key. Enable Amazon EKS KMS secrets encryption on the Amazon EKS cluster.
C. Create the Amazon EKS cluster with default options. Use the Amazon Elastic Block Store (Amazon EBS) Container Storage Interface (CSI) driver as an add-on.
D. Create a new AWS Key Management Service (AWS KMS) key with the alias/aws/ebs alias. Enable default Amazon Elastic Block Store (Amazon EBS) volume encryption for the account.
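Option B maps to the `encryptionConfig` parameter of the EKS CreateCluster API, which tells EKS to envelope-encrypt Kubernetes Secrets in etcd with the given KMS key. A sketch, with a hypothetical KMS key ARN:

```python
# Sketch of the EKS secrets-encryption setting behind option B, in the
# shape the CreateCluster API expects. The key ARN is a hypothetical
# placeholder.
encryption_config = [
    {
        # "secrets" is the resource type that enables envelope encryption
        # of Kubernetes Secrets stored in etcd.
        "resources": ["secrets"],
        "provider": {
            "keyArn": "arn:aws:kms:us-east-1:111122223333:key/EXAMPLE"  # hypothetical
        },
    }
]
```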

Question # 169

A retail company has several businesses. The IT team for each business manages its own AWS account. Each team account is part of an organization in AWS Organizations. Each team monitors its product inventory levels in an Amazon DynamoDB table in the team's own AWS account. The company is deploying a central inventory reporting application into a shared AWS account. The application must be able to read items from all the teams' DynamoDB tables. Which authentication option will meet these requirements MOST securely?

A. Integrate DynamoDB with AWS Secrets Manager in the inventory application account. Configure the application to use the correct secret from Secrets Manager to authenticate and read the DynamoDB table. Schedule secret rotation for every 30 days.
B. In every business account, create an IAM user that has programmatic access. Configure the application to use the correct IAM user access key ID and secret access key to authenticate and read the DynamoDB table. Manually rotate IAM access keys every 30 days.
C. In every business account, create an IAM role named BU_ROLE with a policy that gives the role access to the DynamoDB table and a trust policy to trust a specific role in the inventory application account. In the inventory account, create a role named APP_ROLE that allows access to the STS AssumeRole API operation. Configure the application to use APP_ROLE and assume the cross-account role BU_ROLE to read the DynamoDB table.
D. Integrate DynamoDB with AWS Certificate Manager (ACM). Generate identity certificates to authenticate DynamoDB. Configure the application to use the correct certificate to authenticate and read the DynamoDB table.
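The trust policy attached to BU_ROLE in each business account (option C) could be sketched as follows; the account ID is a hypothetical placeholder for the shared inventory application account.

```python
# Sketch of the BU_ROLE trust policy (option C): only APP_ROLE in the
# shared inventory account may assume this role. The account ID is a
# hypothetical placeholder.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::111122223333:role/APP_ROLE"  # hypothetical
            },
            "Action": "sts:AssumeRole",
        }
    ],
}
```

At runtime the application would call the STS AssumeRole API with each BU_ROLE ARN and use the returned temporary credentials to read that team's DynamoDB table, so no long-lived secrets exist anywhere.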

Question # 170

A company built an application with Docker containers and needs to run the application in the AWS Cloud. The company wants to use a managed service to host the application. The solution must scale in and out appropriately according to demand on the individual container services. The solution also must not result in additional operational overhead or infrastructure to manage. Which solutions will meet these requirements? (Select TWO)

A. Use Amazon Elastic Container Service (Amazon ECS) with AWS Fargate.
B. Use Amazon Elastic Kubernetes Service (Amazon EKS) with AWS Fargate.
C. Provision an Amazon API Gateway API. Connect the API to AWS Lambda to run the containers.
D. Use Amazon Elastic Container Service (Amazon ECS) with Amazon EC2 worker nodes.
E. Use Amazon Elastic Kubernetes Service (Amazon EKS) with Amazon EC2 worker nodes.

Question # 171

A company uses Amazon S3 as its data lake. The company has a new partner that must use SFTP to upload data files. A solutions architect needs to implement a highly available SFTP solution that minimizes operational overhead. Which solution will meet these requirements?

A. Use AWS Transfer Family to configure an SFTP-enabled server with a publicly accessible endpoint. Choose the S3 data lake as the destination.
B. Use Amazon S3 File Gateway as an SFTP server. Expose the S3 File Gateway endpoint URL to the new partner. Share the S3 File Gateway endpoint with the new partner.
C. Launch an Amazon EC2 instance in a private subnet in a VPC. Instruct the new partner to upload files to the EC2 instance by using a VPN. Run a cron job script on the EC2 instance to upload files to the S3 data lake.
D. Launch Amazon EC2 instances in a private subnet in a VPC. Place a Network Load Balancer (NLB) in front of the EC2 instances. Create an SFTP listener port for the NLB. Share the NLB hostname with the new partner. Run a cron job script on the EC2 instances to upload files to the S3 data lake.

Question # 172

A company hosts an application used to upload files to an Amazon S3 bucket. Once uploaded, the files are processed to extract metadata, which takes less than 5 seconds. The volume and frequency of the uploads varies from a few files each hour to hundreds of concurrent uploads. The company has asked a solutions architect to design a cost-effective architecture that will meet these requirements. What should the solutions architect recommend?

A. Configure AWS CloudTrail trails to log S3 API calls. Use AWS AppSync to process the files.
B. Configure an object-created event notification within the S3 bucket to invoke an AWS Lambda function to process the files.
C. Configure Amazon Kinesis Data Streams to process and send data to Amazon S3. Invoke an AWS Lambda function to process the files.
D. Configure an Amazon Simple Notification Service (Amazon SNS) topic to process the files uploaded to Amazon S3. Invoke an AWS Lambda function to process the files.
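Option B's event notification could be sketched as the following bucket notification configuration; the configuration ID and the function ARN are hypothetical placeholders.

```python
# Sketch of an S3 bucket notification configuration (option B): invoke a
# Lambda function for every object-created event. The ID and function
# ARN are hypothetical.
notification_config = {
    "LambdaFunctionConfigurations": [
        {
            "Id": "extract-metadata",  # hypothetical
            "LambdaFunctionArn": (
                "arn:aws:lambda:us-east-1:111122223333:function:extract-metadata"  # hypothetical
            ),
            # Fire on any object-created variant (Put, Post, Copy, ...).
            "Events": ["s3:ObjectCreated:*"],
        }
    ]
}
```

With this in place, S3 invokes the function once per uploaded object, which scales naturally from a few files per hour to hundreds of concurrent uploads and costs nothing while idle.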

Question # 173

A company runs analytics software on Amazon EC2 instances. The software accepts job requests from users to process data that has been uploaded to Amazon S3. Users report that some submitted data is not being processed. Amazon CloudWatch reveals that the EC2 instances have a consistent CPU utilization at or near 100%. The company wants to improve system performance and scale the system based on user load. What should a solutions architect do to meet these requirements?

A. Create a copy of the instance. Place all instances behind an Application Load Balancer.
B. Create an S3 VPC endpoint for Amazon S3. Update the software to reference the endpoint.
C. Stop the EC2 instances. Modify the instance type to one with a more powerful CPU andmore memory. Restart the instances.
D. Route incoming requests to Amazon Simple Queue Service (Amazon SQS). Configure an EC2 Auto Scaling group based on queue size. Update the software to read from the queue.
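The scaling logic behind option D is often expressed as backlog per instance: size the Auto Scaling group so that each instance owns a target share of the SQS backlog. A sketch, where the function name and target value are assumptions:

```python
import math

def desired_capacity(queue_depth: int, target_backlog_per_instance: int) -> int:
    """Compute the Auto Scaling group size so each instance handles
    roughly target_backlog_per_instance queued messages, keeping at
    least one instance running."""
    return max(1, math.ceil(queue_depth / target_backlog_per_instance))
```

For example, with a target of 100 messages per instance, a backlog of 1,000 messages would scale the group to 10 instances, and the buffer in SQS ensures that no submitted job is lost while instances catch up.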

Question # 174

A company is deploying an application that processes streaming data in near-real time. The company plans to use Amazon EC2 instances for the workload. The network architecture must be configurable to provide the lowest possible latency between nodes. Which combination of network solutions will meet these requirements? (Select TWO)

A. Enable and configure enhanced networking on each EC2 instance.
B. Group the EC2 instances in separate accounts.
C. Run the EC2 instances in a cluster placement group.
D. Attach multiple elastic network interfaces to each EC2 instance.
E. Use Amazon Elastic Block Store (Amazon EBS) optimized instance types.

Question # 175

A company runs a container application on a Kubernetes cluster in the company's data center. The application uses Advanced Message Queuing Protocol (AMQP) to communicate with a message queue. The data center cannot scale fast enough to meet the company's expanding business needs. The company wants to migrate the workloads to AWS. Which solution will meet these requirements with the LEAST operational overhead?

A. Migrate the container application to Amazon Elastic Container Service (Amazon ECS). Use Amazon Simple Queue Service (Amazon SQS) to retrieve the messages.
B. Migrate the container application to Amazon Elastic Kubernetes Service (Amazon EKS). Use Amazon MQ to retrieve the messages.
C. Use highly available Amazon EC2 instances to run the application. Use Amazon MQ to retrieve the messages.
D. Use AWS Lambda functions to run the application. Use Amazon Simple Queue Service (Amazon SQS) to retrieve the messages.

Question # 176

A company runs a real-time data ingestion solution on AWS. The solution consists of the most recent version of Amazon Managed Streaming for Apache Kafka (Amazon MSK). The solution is deployed in a VPC in private subnets across three Availability Zones. A solutions architect needs to redesign the data ingestion solution to be publicly available over the internet. The data in transit must also be encrypted. Which solution will meet these requirements with the MOST operational efficiency?

A. Configure public subnets in the existing VPC. Deploy an MSK cluster in the public subnets. Update the MSK cluster security settings to enable mutual TLS authentication.
B. Create a new VPC that has public subnets. Deploy an MSK cluster in the public subnets. Update the MSK cluster security settings to enable mutual TLS authentication.
C. Deploy an Application Load Balancer (ALB) that uses private subnets. Configure an ALB security group inbound rule to allow inbound traffic from the VPC CIDR block for HTTPS protocol.
D. Deploy a Network Load Balancer (NLB) that uses private subnets. Configure an NLB listener for HTTPS communication over the internet.
