PDF Only
$35.00 Free Updates Up to 90 Days
- SAA-C03 Dumps PDF
- 824 Questions
- Updated On September 13, 2024
PDF + Test Engine
$60.00 Free Updates Up to 90 Days
- SAA-C03 Question Answers
- 824 Questions
- Updated On September 13, 2024
Test Engine
$50.00 Free Updates Up to 90 Days
- SAA-C03 Practice Questions
- 824 Questions
- Updated On September 13, 2024
How to pass the Amazon SAA-C03 exam with the help of dumps?
DumpsPool provides the high-quality resources you have been searching for elsewhere without success. So it's time to stop stressing and get ready for the exam. Our Online Test Engine gives you the guidance you need to pass the certification exam. We guarantee top-grade results because we have covered each topic in a precise and understandable manner. Our expert team prepared the latest Amazon SAA-C03 Dumps to meet your training needs, and they come in two formats: Dumps PDF and Online Test Engine.
How Do I Know Amazon SAA-C03 Dumps are Worth it?
Did we mention our latest SAA-C03 Dumps PDF is also available as an Online Test Engine? And that's just where the benefits begin. Of all the features offered at DumpsPool, the money-back guarantee has to be the best one, so you don't have to worry about your payment. Let us explore the other reasons you would want to buy from us: in addition to affordable Real Exam Dumps, you get three months of free updates.
You can easily scroll through our large catalog of certification exams and pick any exam to start your training. That's right, DumpsPool isn't limited to Amazon exams. We know our customers need the support of an authentic and reliable resource, so we make sure there is never any outdated content in our study materials. Our expert team keeps everything up to the mark by monitoring every single update. Our main focus is that you understand the real exam format, so you can pass the exam more easily.
IT Students Are Using our AWS Certified Solutions Architect - Associate (SAA-C03) Dumps Worldwide!
It is a well-established fact that certification exams can't be conquered without some help from experts. That is exactly the point of using AWS Certified Solutions Architect - Associate (SAA-C03) Practice Question Answers. You are supported by IT experts who have been through what you are about to face and know it well. DumpsPool's 24/7 customer service ensures you can reach these experts whenever needed. Our 100% success rate and worldwide validity make us the most trusted resource for candidates. The updated Dumps PDF helps you pass the exam on the first attempt, and the money-back guarantee lets you buy with confidence: you can claim a refund if you do not pass the exam.
How to Get SAA-C03 Real Exam Dumps?
Getting access to the real exam dumps is as easy as pressing a button, literally! There are various resources available online, but many of them sell scams or copied content. So, if you are going to attempt the SAA-C03 exam, you need to be sure you are buying the right kind of dumps. All the Dumps PDF files available on DumpsPool are as unique and up to date as they can be, and our Practice Question Answers are tested and approved by professionals, making them an authentic resource on the internet. Our experts have made sure the Online Test Engine is free from outdated or fake content, repeated questions, and false or vague information. We make every penny count, and you leave our platform fully satisfied!
Amazon Web Services SAA-C03 Exam Overview:
| Aspect | Details |
|---|---|
| Exam Cost | $150 USD |
| Total Time | 130 minutes |
| Available Languages | English, Japanese, Korean, and Simplified Chinese |
| Passing Marks | 720 out of 1000 |
| Exam Format | Multiple choice and multiple answer |
| Exam Type | Associate level |
| Prerequisites | At least one year of hands-on experience with AWS services |
| Exam Registration | Through Pearson VUE |
| Retake Policy | Every 14 days, up to 3 times in a year |
| Validity | 3 years |
AWS Certified Solutions Architect - Associate (SAA-C03) Exam Topics Breakdown
| Content Area | Percentage | Description |
|---|---|---|
| Domain 1: Design Secure Architectures | 30% | Design secure access to AWS resources, secure workloads and applications, and appropriate data security controls. |
| Domain 2: Design Resilient Architectures | 26% | Design scalable, loosely coupled, highly available, and fault-tolerant architectures, including disaster recovery. |
| Domain 3: Design High-Performing Architectures | 24% | Select high-performing and scalable storage, compute, database, networking, and data ingestion solutions. |
| Domain 4: Design Cost-Optimized Architectures | 20% | Design cost-optimized storage, compute, database, and network architectures while meeting business objectives. |
Frequently Asked Questions
Question # 1
A company runs an AWS Lambda function in private subnets in a VPC. The subnets have a default route to the internet through an Amazon EC2 NAT instance. The Lambda function processes input data and saves its output as an object to Amazon S3. Intermittently, the Lambda function times out while trying to upload the object because of saturated traffic on the NAT instance's network. The company wants to access Amazon S3 without traversing the internet. Which solution will meet these requirements?
A. Replace the EC2 NAT instance with an AWS managed NAT gateway.
B. Increase the size of the EC2 NAT instance in the VPC to a network optimized instance type
C. Provision a gateway endpoint for Amazon S3 in the VPC. Update the route tables of the subnets accordingly.
D. Provision a transit gateway. Place transit gateway attachments in the private subnets where the Lambda function is running.
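If you want to see what the gateway endpoint approach from option C looks like in practice, here is a minimal boto3 sketch. The Region, VPC ID, and route table ID are hypothetical placeholders, not values from the question.

```python
import boto3

# Minimal sketch: create an S3 gateway endpoint and associate it with the route
# table of the private subnets (IDs below are placeholders, not real resources).
ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.create_vpc_endpoint(
    VpcId="vpc-0123456789abcdef0",             # hypothetical VPC
    ServiceName="com.amazonaws.us-east-1.s3",  # S3 gateway endpoint service name
    VpcEndpointType="Gateway",
    RouteTableIds=["rtb-0123456789abcdef0"],   # route table used by the private subnets
)
print(response["VpcEndpoint"]["VpcEndpointId"])
```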
Question # 2
A solutions architect is designing an asynchronous application to process credit card data validation requests for a bank. The application must be secure and be able to process each request at least once. Which solution will meet these requirements MOST cost-effectively?
A. Use AWS Lambda event source mapping. Set Amazon Simple Queue Service (Amazon SQS) standard queues as the event source. Use AWS Key Management Service keys (SSE-KMS) for encryption. Add the kms:Decrypt permission for the Lambda execution role.
B. Use AWS Lambda event source mapping. Use Amazon Simple Queue Service (Amazon SQS) FIFO queues as the event source. Use SQS managed encryption keys (SSE-SQS) for encryption. Add the encryption key invocation permission for the Lambda function.
C. Use the AWS Lambda event source mapping. Set Amazon Simple Queue Service (Amazon SQS) FIFO queues as the event source. Use AWS KMS keys (SSE-KMS). Add the kms:Decrypt permission for the Lambda execution role.
D. Use the AWS Lambda event source mapping. Set Amazon Simple Queue Service (Amazon SQS) standard queues as the event source. Use AWS KMS keys (SSE-KMS) for encryption. Add the encryption key invocation permission for the Lambda function.
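As a study aid, the sketch below shows how an SQS queue with SQS managed encryption (SSE-SQS) can be wired to a Lambda function through an event source mapping. The queue and function names are hypothetical.

```python
import boto3

sqs = boto3.client("sqs")
lam = boto3.client("lambda")

# Create a queue with SQS managed server-side encryption (SSE-SQS).
queue_url = sqs.create_queue(
    QueueName="card-validation-queue",
    Attributes={"SqsManagedSseEnabled": "true"},
)["QueueUrl"]

queue_arn = sqs.get_queue_attributes(
    QueueUrl=queue_url, AttributeNames=["QueueArn"]
)["Attributes"]["QueueArn"]

# Map the queue to a (hypothetical) Lambda function so messages invoke it.
lam.create_event_source_mapping(
    EventSourceArn=queue_arn,
    FunctionName="validate-card-request",
    BatchSize=10,
)
```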
Question # 3
A company hosts an application on Amazon EC2 On-Demand Instances in an Auto Scaling group. Application peak hours occur at the same time each day. Application users report slow application performance at the start of peak hours. The application performs normally 2-3 hours after peak hours begin. The company wants to ensure that the application works properly at the start of peak hours. Which solution will meet these requirements?
A. Configure an Application Load Balancer to distribute traffic properly to the Instances.
B. Configure a dynamic scaling policy for the Auto Scaling group to launch new instances based on memory utilization.
C. Configure a dynamic scaling policy for the Auto Scaling group to launch new instances based on CPU utilization.
D. Configure a scheduled scaling policy for the Auto Scaling group to launch new instances before peak hours.
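To make the scheduled scaling idea in option D concrete, here is a minimal boto3 sketch that raises capacity shortly before a daily peak. The Auto Scaling group name and schedule are assumptions for illustration.

```python
import boto3

autoscaling = boto3.client("autoscaling")

# Scale out every day at 08:30 UTC, before the daily peak begins.
autoscaling.put_scheduled_update_group_action(
    AutoScalingGroupName="web-app-asg",        # hypothetical Auto Scaling group
    ScheduledActionName="pre-peak-scale-out",
    Recurrence="30 8 * * *",                   # cron expression in UTC
    MinSize=4,
    DesiredCapacity=8,
    MaxSize=12,
)
```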
Question # 4
A company needs a solution to prevent AWS CloudFormation stacks from deploying AWS Identity and Access Management (IAM) resources that include an inline policy or "*" in the statement. The solution must also prohibit deployment of Amazon EC2 instances with public IP addresses. The company has AWS Control Tower enabled in its organization in AWS Organizations. Which solution will meet these requirements?
A. Use AWS Control Tower proactive controls to block deployment of EC2 instances with public IP addresses and inline policies with elevated access or "*".
B. Use AWS Control Tower detective controls to block deployment of EC2 instances with public IP addresses and inline policies with elevated access or "*".
C. Use AWS Config to create rules for EC2 and IAM compliance. Configure the rules to run an AWS Systems Manager Session Manager automation to delete a resource when it is not compliant.
D. Use a service control policy (SCP) to block actions for the EC2 instances and IAM resources if the actions lead to noncompliance.
Question # 5
A company is migrating a document management application to AWS. The application runs on Linux servers. The company will migrate the application to Amazon EC2 instances in an Auto Scaling group. The company stores 7 TiB of documents in a shared storage file system. An external relational database tracks the documents. Documents are stored once and can be retrieved multiple times for reference at any time. The company cannot modify the application during the migration. The storage solution must be highly available and must support scaling over time. Which solution will meet these requirements MOST cost-effectively?
A. Deploy an EC2 instance with enhanced networking as a shared NFS storage system. Export the NFS share. Mount the NFS share on the EC2 instances in the Auto Scaling group.
B. Create an Amazon S3 bucket that uses the S3 Standard-Infrequent Access (S3 Standard-IA) storage class. Mount the S3 bucket on the EC2 instances in the Auto Scaling group.
C. Deploy an SFTP server endpoint by using AWS Transfer for SFTP and an Amazon S3 bucket. Configure the EC2 instances in the Auto Scaling group to connect to the SFTP server.
D. Create an Amazon Elastic File System (Amazon EFS) file system with mount points in multiple Availability Zones. Use the EFS Standard-Infrequent Access (Standard-IA) storage class. Mount the NFS share on the EC2 instances in the Auto Scaling group.
Question # 6
A company is migrating five on-premises applications to VPCs in the AWS Cloud. Each application is currently deployed in isolated virtual networks on premises and should be deployed similarly in the AWS Cloud. The applications need to reach a shared services VPC. All the applications must be able to communicate with each other. If the migration is successful, the company will repeat the migration process for more than 100 applications. Which solution will meet these requirements with the LEAST administrative overhead?
A. Deploy software VPN tunnels between the application VPCs and the shared services VPC. Add routes between the application VPCs in their subnets to the shared services VPC.
B. Deploy VPC peering connections between the application VPCs and the shared services VPC. Add routes between the application VPCs in their subnets to the shared services VPC through the peering connection.
C. Deploy an AWS Direct Connect connection between the application VPCs and the shared services VPC. Add routes from the application VPCs in their subnets to the shared services VPC and the application VPCs. Add routes from the shared services VPC subnets to the application VPCs.
D. Deploy a transit gateway with associations between the transit gateway and the application VPCs and the shared services VPC. Add routes between the application VPCs in their subnets and the application VPCs to the shared services VPC through the transit gateway.
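For reference, the following boto3 sketch shows the transit gateway hub-and-spoke building block mentioned in option D: create the gateway and attach one application VPC. The VPC and subnet IDs are placeholders.

```python
import boto3

ec2 = boto3.client("ec2")

# Create the hub transit gateway.
tgw_id = ec2.create_transit_gateway(
    Description="shared-services-hub"
)["TransitGateway"]["TransitGatewayId"]

# Attach one application VPC to the transit gateway (placeholder IDs).
ec2.create_transit_gateway_vpc_attachment(
    TransitGatewayId=tgw_id,
    VpcId="vpc-0123456789abcdef0",
    SubnetIds=["subnet-0123456789abcdef0"],  # typically one subnet per Availability Zone
)
```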
Question # 7
A company uses an Amazon CloudFront distribution to serve content pages for its website. The company needs to ensure that clients use a TLS certificate when accessing the company's website. The company wants to automate the creation and renewal of the TLS certificates. Which solution will meet these requirements with the MOST operational efficiency?
A. Use a CloudFront security policy to create a certificate.
B. Use a CloudFront origin access control (OAC) to create a certificate.
C. Use AWS Certificate Manager (ACM) to create a certificate. Use DNS validation for the domain.
D. Use AWS Certificate Manager (ACM) to create a certificate. Use email validation for the domain.
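The sketch below shows how a public certificate with DNS validation could be requested in ACM, assuming a hypothetical domain name; certificates used with CloudFront must be requested in us-east-1.

```python
import boto3

# CloudFront only uses ACM certificates from the us-east-1 Region.
acm = boto3.client("acm", region_name="us-east-1")

response = acm.request_certificate(
    DomainName="www.example.com",               # hypothetical domain
    ValidationMethod="DNS",                     # enables automatic renewal once validated
    SubjectAlternativeNames=["example.com"],
)
print(response["CertificateArn"])
```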
Question # 8
A company uses Amazon RDS with default backup settings for its database tier. The company needs to make a daily backup of the database to meet regulatory requirements. The company must retain the backups for 30 days. Which solution will meet these requirements with the LEAST operational overhead?
A. Write an AWS Lambda function to create an RDS snapshot every day.
B. Modify the RDS database to have a retention period of 30 days for automated backups.
C. Use AWS Systems Manager Maintenance Windows to modify the RDS backup retention period.
D. Create a manual snapshot every day by using the AWS CLI. Modify the RDS backup retention period.
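As a quick illustration of changing the automated backup retention period on an RDS instance, here is a minimal boto3 sketch with a hypothetical DB instance identifier.

```python
import boto3

rds = boto3.client("rds")

# Keep automated daily backups for 30 days.
rds.modify_db_instance(
    DBInstanceIdentifier="prod-database",  # hypothetical instance name
    BackupRetentionPeriod=30,
    ApplyImmediately=True,
)
```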
Question # 9
A company runs its application on Oracle Database Enterprise Edition. The company needs to migrate the application and the database to AWS. The company can use the Bring Your Own License (BYOL) model while migrating to AWS. The application uses third-party database features that require privileged access. A solutions architect must design a solution for the database migration. Which solution will meet these requirements MOST cost-effectively?
A. Migrate the database to Amazon RDS for Oracle by using native tools. Replace the third-party features with AWS Lambda.
B. Migrate the database to Amazon RDS Custom for Oracle by using native tools. Customize the new database settings to support the third-party features.
C. Migrate the database to Amazon DynamoDB by using AWS Database Migration Service (AWS DMS). Customize the new database settings to support the third-party features.
D. Migrate the database to Amazon RDS for PostgreSQL by using AWS Database Migration Service (AWS DMS). Rewrite the application code to remove the dependency on third-party features.
Question # 10
A company stores several petabytes of data across multiple AWS accounts. The company uses AWS Lake Formation to manage its data lake. The company's data science team wants to securely share selective data from its accounts with the company's engineering team for analytical purposes. Which solution will meet these requirements with the LEAST operational overhead?
A. Copy the required data to a common account. Create an IAM access role in that account. Grant access by specifying a permission policy that includes users from the engineering team accounts as trusted entities.
B. Use the Lake Formation permissions Grant command in each account where the data is stored to allow the required engineering team users to access the data.
C. Use AWS Data Exchange to privately publish the required data to the required engineering team accounts.
D. Use Lake Formation tag-based access control to authorize and grant cross-account permissions for the required data to the engineering team accounts.
Question # 11
A company has an on-premises SFTP file transfer solution. The company is migrating to the AWS Cloud to scale the file transfer solution and to optimize costs by using Amazon S3. The company's employees will use their credentials for the on-premises Microsoft Active Directory (AD) to access the new solution. The company wants to keep the current authentication and file access mechanisms. Which solution will meet these requirements with the LEAST operational overhead?
A. Configure an S3 File Gateway. Create SMB file shares on the file gateway that use the existing Active Directory to authenticate.
B. Configure an Auto Scaling group with Amazon EC2 instances to run an SFTP solution. Configure the group to scale up at 60% CPU utilization.
C. Create an AWS Transfer Family server with SFTP endpoints. Choose the AWS Directory Service option as the identity provider. Use AD Connector to connect the on-premises Active Directory.
D. Create an AWS Transfer Family SFTP endpoint. Configure the endpoint to use the AWS Directory Service option as the identity provider to connect to the existing Active Directory.
Question # 12
A video game company is deploying a new gaming application to its global users. The company requires a solution that will provide near real-time reviews and rankings of the players. A solutions architect must design a solution to provide fast access to the data. The solution must also ensure the data persists on disks in the event that the company restarts the application. Which solution will meet these requirements with the LEAST operational overhead?
A. Configure an Amazon CloudFront distribution with an Amazon S3 bucket as the origin. Store the player data in the S3 bucket.
B. Create Amazon EC2 instances in multiple AWS Regions. Store the player data on the EC2 instances. Configure Amazon Route 53 with geolocation records to direct users to the closest EC2 instance.
C. Deploy an Amazon ElastiCache for Redis cluster. Store the player data in the ElastiCache cluster.
D. Deploy an Amazon ElastiCache for Memcached cluster. Store the player data in the ElastiCache cluster.
Question # 13
A company is running a highly sensitive application on Amazon EC2 backed by an Amazon RDS database. Compliance regulations mandate that all personally identifiable information (PII) be encrypted at rest. Which solution should a solutions architect recommend to meet this requirement with the LEAST amount of changes to the infrastructure?
A. Deploy AWS Certificate Manager to generate certificates. Use the certificates to encrypt the database volume.
B. Deploy AWS CloudHSM, generate encryption keys, and use the keys to encrypt database volumes.
C. Configure SSL encryption using AWS Key Management Service (AWS KMS) keys to encrypt database volumes.
D. Configure Amazon Elastic Block Store (Amazon EBS) encryption and Amazon RDS encryption with AWS Key Management Service (AWS KMS) keys to encrypt instance and database volumes.
Question # 14
A company that uses AWS Organizations runs 150 applications across 30 different AWS accounts. The company used AWS Cost and Usage Report to create a new report in the management account. The report is delivered to an Amazon S3 bucket that is replicated to a bucket in the data collection account. The company's senior leadership wants to view a custom dashboard that provides NAT gateway costs each day starting at the beginning of the current month. Which solution will meet these requirements?
A. Share an Amazon QuickSight dashboard that includes the requested table visual. Configure QuickSight to use AWS DataSync to query the new report.
B. Share an Amazon QuickSight dashboard that includes the requested table visual. Configure QuickSight to use Amazon Athena to query the new report.
C. Share an Amazon CloudWatch dashboard that includes the requested table visual. Configure CloudWatch to use AWS DataSync to query the new report.
D. Share an Amazon CloudWatch dashboard that includes the requested table visual. Configure CloudWatch to use Amazon Athena to query the new report.
Question # 15
A company runs containers in a Kubernetes environment in the company's local data center. The company wants to use Amazon Elastic Kubernetes Service (Amazon EKS) and other AWS managed services. Data must remain locally in the company's data center and cannot be stored in any remote site or cloud to maintain compliance. Which solution will meet these requirements?
A. Deploy AWS Local Zones in the company's data center
B. Use an AWS Snowmobile in the company's data center
C. Install an AWS Outposts rack in the company's data center.
D. Install an AWS Snowball Edge Storage Optimized node in the data center
Question # 16
A company runs a self-managed Microsoft SQL Server on Amazon EC2 instances and Amazon Elastic Block Store (Amazon EBS). Daily snapshots are taken of the EBS volumes. Recently, all the company's EBS snapshots were accidentally deleted while running a snapshot cleaning script that deletes all expired EBS snapshots. A solutions architect needs to update the architecture to prevent data loss without retaining EBS snapshots indefinitely. Which solution will meet these requirements with the LEAST development effort?
A. Change the IAM policy of the user to deny EBS snapshot deletion.
B. Copy the EBS snapshots to another AWS Region after completing the snapshots daily.
C. Create a 7-day EBS snapshot retention rule in Recycle Bin and apply the rule for all snapshots.
D. Copy EBS snapshots to Amazon S3 Standard-Infrequent Access (S3 Standard-IA).
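If you want to experiment with the Recycle Bin idea from option C, the sketch below creates a Region-wide retention rule for deleted EBS snapshots. The 7-day period mirrors the option; the rule description is an arbitrary placeholder.

```python
import boto3

# Recycle Bin has its own service client ("rbin") in boto3.
rbin = boto3.client("rbin")

rule = rbin.create_rule(
    ResourceType="EBS_SNAPSHOT",
    RetentionPeriod={"RetentionPeriodValue": 7, "RetentionPeriodUnit": "DAYS"},
    Description="Recover accidentally deleted EBS snapshots for 7 days",
)
print(rule["Identifier"])
```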
Question # 17
A social media company has workloads that collect and process data. The workloads store the data in on-premises NFS storage. The data store cannot scale fast enough to meet the company's expanding business needs. The company wants to migrate the current data store to AWS. Which solution will meet these requirements MOST cost-effectively?
A. Set up an AWS Storage Gateway Volume Gateway. Use an Amazon S3 Lifecycle policy to transition the data to the appropriate storage class.
B. Set up an AWS Storage Gateway Amazon S3 File Gateway. Use an Amazon S3 Lifecycle policy to transition the data to the appropriate storage class.
C. Use the Amazon Elastic File System (Amazon EFS) Standard-Infrequent Access (Standard-IA) storage class. Activate the infrequent access lifecycle policy.
D. Use the Amazon Elastic File System (Amazon EFS) One Zone-Infrequent Access (One Zone-IA) storage class. Activate the infrequent access lifecycle policy.
Question # 18
A company uses Amazon FSx for NetApp ONTAP in its primary AWS Region for CIFS and NFS file shares. Applications that run on Amazon EC2 instances access the file shares. The company needs a storage disaster recovery (DR) solution in a secondary Region. The data that is replicated in the secondary Region needs to be accessed by using the same protocols as the primary Region. Which solution will meet these requirements with the LEAST operational overhead?
A. Create an AWS Lambda function to copy the data to an Amazon S3 bucket. Replicate the S3 bucket to the secondary Region.
B. Create a backup of the FSx for ONTAP volumes by using AWS Backup. Copy the volumes to the secondary Region. Create a new FSx for ONTAP instance from the backup.
C. Create an FSx for ONTAP instance in the secondary Region. Use NetApp SnapMirror to replicate data from the primary Region to the secondary Region.
D. Create an Amazon Elastic File System (Amazon EFS) volume. Migrate the current data to the volume. Replicate the volume to the secondary Region.
Question # 19
A company has migrated a fleet of hundreds of on-premises virtual machines (VMs) to Amazon EC2 instances. The instances run a diverse fleet of Windows Server versions along with several Linux distributions. The company wants a solution that will automate inventory and updates of the operating systems. The company also needs a summary of common vulnerabilities of each instance for regular monthly reviews. What should a solutions architect recommend to meet these requirements?
A. Set up AWS Systems Manager Patch Manager to manage all the EC2 instances. Configure AWS Security Hub to produce monthly reports.
B. Set up AWS Systems Manager Patch Manager to manage all the EC2 instances. Deploy Amazon Inspector, and configure monthly reports.
C. Set up AWS Shield Advanced, and configure monthly reports. Deploy AWS Config to automate patch installations on the EC2 instances.
D. Set up Amazon GuardDuty in the account to monitor all EC2 instances. Deploy AWS Config to automate patch installations on the EC2 instances.
Question # 20
A large international university has deployed all of its compute services in the AWS Cloud. These services include Amazon EC2, Amazon RDS, and Amazon DynamoDB. The university currently relies on many custom scripts to back up its infrastructure. However, the university wants to centralize management and automate data backups as much as possible by using AWS native options. Which solution will meet these requirements?
A. Use third-party backup software with an AWS Storage Gateway tape gateway virtual tape library.
B. Use AWS Backup to configure and monitor all backups for the services in use.
C. Use AWS Config to set lifecycle management to take snapshots of all data sources on a schedule.
D. Use AWS Systems Manager State Manager to manage the configuration and monitoring of backup tasks.
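To see what centralized backups with AWS Backup can look like, here is a hedged boto3 sketch of a daily backup plan plus a tag-based resource selection. The vault name, IAM role ARN, and tag key/value are placeholders.

```python
import boto3

backup = boto3.client("backup")

# Daily backup rule with a 35-day retention window.
plan_id = backup.create_backup_plan(
    BackupPlan={
        "BackupPlanName": "university-daily-backups",
        "Rules": [
            {
                "RuleName": "daily-0300-utc",
                "TargetBackupVaultName": "Default",          # placeholder vault
                "ScheduleExpression": "cron(0 3 * * ? *)",
                "Lifecycle": {"DeleteAfterDays": 35},
            }
        ],
    }
)["BackupPlanId"]

# Back up every resource tagged backup=daily (EC2, RDS, DynamoDB, and so on).
backup.create_backup_selection(
    BackupPlanId=plan_id,
    BackupSelection={
        "SelectionName": "tagged-resources",
        "IamRoleArn": "arn:aws:iam::123456789012:role/service-role/AWSBackupDefaultServiceRole",
        "ListOfTags": [
            {"ConditionType": "STRINGEQUALS", "ConditionKey": "backup", "ConditionValue": "daily"}
        ],
    },
)
```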
Question # 21
A company runs a critical data analysis job each week before the first day of the work week. The job requires at least 1 hour to complete the analysis. The job is stateful and cannot tolerate interruptions. The company needs a solution to run the job on AWS. Which solution will meet these requirements?
A. Create a container for the job. Schedule the job to run as an AWS Fargate task on an Amazon Elastic Container Service (Amazon ECS) cluster by using Amazon EventBridge Scheduler.
B. Configure the job to run in an AWS Lambda function. Create a scheduled rule in Amazon EventBridge to invoke the Lambda function.
C. Configure an Auto Scaling group of Amazon EC2 Spot Instances that run Amazon Linux. Configure a crontab entry on the instances to run the analysis.
D. Configure an AWS DataSync task to run the job. Configure a cron expression to run the task on a schedule.
Question # 22
A company has several on-premises Internet Small Computer Systems Interface (iSCSI) network storage servers. The company wants to reduce the number of these servers by moving to the AWS Cloud. A solutions architect must provide low-latency access to frequently used data and reduce the dependency on on-premises servers with a minimal number of infrastructure changes. Which solution will meet these requirements?
A. Deploy an Amazon S3 File Gateway
B. Deploy Amazon Elastic Block Store (Amazon EBS) storage with backups to Amazon S3
C. Deploy an AWS Storage Gateway volume gateway that is configured with stored volumes.
D. Deploy an AWS Storage Gateway volume gateway that is configured with cached volumes.
Question # 23
A company uses GPS trackers to document the migration patterns of thousands of sea turtles. The trackers check every 5 minutes to see if a turtle has moved more than 100 yards (91.4 meters). If a turtle has moved, its tracker sends the new coordinates to a web application running on three Amazon EC2 instances that are in multiple Availability Zones in one AWS Region. Recently, the web application was overwhelmed while processing an unexpected volume of tracker data. Data was lost with no way to replay the events. A solutions architect must prevent this problem from happening again and needs a solution with the least operational overhead. What should the solutions architect do to meet these requirements?
A. Create an Amazon S3 bucket to store the data. Configure the application to scan for new data in the bucket for processing.
B. Create an Amazon API Gateway endpoint to handle transmitted location coordinates. Use an AWS Lambda function to process each item concurrently.
C. Create an Amazon Simple Queue Service (Amazon SQS) queue to store the incoming data. Configure the application to poll for new messages for processing.
D. Create an Amazon DynamoDB table to store transmitted location coordinates. Configure the application to query the table for new data for processing. Use TTL to remove data that has been processed.
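The following sketch illustrates the general queue-buffering pattern from option C: tracker updates are written to an SQS queue, and the application polls and deletes messages at its own pace. The queue name and payload are made up for illustration.

```python
import json
import boto3

sqs = boto3.client("sqs")
queue_url = sqs.create_queue(QueueName="turtle-tracker-updates")["QueueUrl"]

# Producer side: a tracker update is enqueued instead of hitting the web app directly.
sqs.send_message(
    QueueUrl=queue_url,
    MessageBody=json.dumps({"turtle_id": "T-1001", "lat": 25.76, "lon": -80.19}),
)

# Consumer side: the application long-polls, processes, and then deletes messages.
messages = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=10, WaitTimeSeconds=20)
for msg in messages.get("Messages", []):
    print(json.loads(msg["Body"]))
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])
```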
Question # 24
A company is designing a new multi-tier web application that consists of the following components:
- Web and application servers that run on Amazon EC2 instances as part of Auto Scaling groups
- An Amazon RDS DB instance for data storage
A solutions architect needs to limit access to the application servers so that only the web servers can access them. Which solution will meet these requirements?
A. Deploy AWS PrivateLink in front of the application servers. Configure the network ACL to allow only the web servers to access the application servers.
B. Deploy a VPC endpoint in front of the application servers. Configure the security group to allow only the web servers to access the application servers.
C. Deploy a Network Load Balancer with a target group that contains the application servers' Auto Scaling group. Configure the network ACL to allow only the web servers to access the application servers.
D. Deploy an Application Load Balancer with a target group that contains the application servers' Auto Scaling group. Configure the security group to allow only the web servers to access the application servers.
Question # 25
A company has an Amazon S3 data lake. The company needs a solution that transforms the data from the data lake and loads the data into a data warehouse every day. The data warehouse must have massively parallel processing (MPP) capabilities. Data analysts then need to create and train machine learning (ML) models by using SQL commands on the data. The solution must use serverless AWS services wherever possible. Which solution will meet these requirements?
A. Run a daily Amazon EMR job to transform the data and load the data into Amazon Redshift. Use Amazon Redshift ML to create and train the ML models.
B. Run a daily Amazon EMR job to transform the data and load the data into Amazon Aurora Serverless. Use Amazon Aurora ML to create and train the ML models.
C. Run a daily AWS Glue job to transform the data and load the data into Amazon Redshift Serverless. Use Amazon Redshift ML to create and train the ML models.
D. Run a daily AWS Glue job to transform the data and load the data into Amazon Athena tables. Use Amazon Athena ML to create and train the ML models.
Question # 26
A content management system runs on Amazon EC2 instances behind an Application Load Balancer (ALB). The EC2 instances run in an Auto Scaling group across multiple Availability Zones. Users are constantly adding and updating files, blogs, and other website assets in the content management system. A solutions architect must implement a solution in which all the EC2 instances share up-to-date website content with the least possible lag time. Which solution meets these requirements?
A. Update the EC2 user data in the Auto Scaling group lifecycle policy to copy the website assets from the EC2 instance that was launched most recently. Configure the ALB to make changes to the website assets only in the newest EC2 instance.
B. Copy the website assets to an Amazon Elastic File System (Amazon EFS) file system. Configure each EC2 instance to mount the EFS file system locally. Configure the website hosting application to reference the website assets that are stored in the EFS file system.
C. Copy the website assets to an Amazon S3 bucket. Ensure that each EC2 instance downloads the website assets from the S3 bucket to the attached Amazon Elastic Block Store (Amazon EBS) volume. Run the S3 sync command once each hour to keep files up to date.
D. Restore an Amazon Elastic Block Store (Amazon EBS) snapshot with the website assets. Attach the EBS snapshot as a secondary EBS volume when a new EC2 instance is launched. Configure the website hosting application to reference the website assets that are stored in the secondary EBS volume.
Question # 27
A company wants to add its existing AWS usage cost to its operation cost dashboard. A solutions architect needs to recommend a solution that will give the company access to its usage cost programmatically. The company must be able to access cost data for the current year and forecast costs for the next 12 months. Which solution will meet these requirements with the LEAST operational overhead?
A. Access usage cost-related data by using the AWS Cost Explorer API with pagination.
B. Access usage cost-related data by using downloadable AWS Cost Explorer report csv files.
C. Configure AWS Budgets actions to send usage cost data to the company through FTP.
D. Create AWS Budgets reports for usage cost data. Send the data to the company through SMTP.
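For readers who want to try the Cost Explorer API mentioned in option A, here is a minimal boto3 sketch that pages through year-to-date daily costs and requests a 12-month forecast. The date ranges are placeholders.

```python
import boto3

ce = boto3.client("ce")

# Page through daily unblended costs for the current year using NextPageToken.
results, token = [], None
while True:
    kwargs = {"NextPageToken": token} if token else {}
    page = ce.get_cost_and_usage(
        TimePeriod={"Start": "2024-01-01", "End": "2024-09-13"},  # placeholder dates
        Granularity="DAILY",
        Metrics=["UnblendedCost"],
        **kwargs,
    )
    results.extend(page["ResultsByTime"])
    token = page.get("NextPageToken")
    if not token:
        break

# Forecast the next 12 months of spend.
forecast = ce.get_cost_forecast(
    TimePeriod={"Start": "2024-09-14", "End": "2025-09-13"},      # placeholder dates
    Metric="UNBLENDED_COST",
    Granularity="MONTHLY",
)
print(len(results), forecast["Total"]["Amount"])
```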
Question # 28
A company runs an application in a VPC with public and private subnets. The VPC extends across multiple Availability Zones. The application runs on Amazon EC2 instances in private subnets. The application uses an Amazon Simple Queue Service (Amazon SQS) queue. A solutions architect needs to design a secure solution to establish a connection between the EC2 instances and the SQS queue. Which solution will meet these requirements?
A. Implement an interface VPC endpoint for Amazon SQS. Configure the endpoint to use the private subnets. Add to the endpoint a security group that has an inbound access rule that allows traffic from the EC2 instances that are in the private subnets.
B. Implement an interface VPC endpoint for Amazon SQS. Configure the endpoint to use the public subnets. Attach to the interface endpoint a VPC endpoint policy that allows access from the EC2 instances that are in the private subnets.
C. Implement an interface VPC endpoint for Amazon SQS. Configure the endpoint to use the public subnets. Attach an Amazon SQS access policy to the interface VPC endpoint that allows requests from only a specified VPC endpoint.
D. Implement a gateway endpoint for Amazon SQS. Add a NAT gateway to the private subnets. Attach an IAM role to the EC2 instances that allows access to the SQS queue.
Question # 29
A company has an internal application that runs on Amazon EC2 instances in an Auto Scaling group. The EC2 instances are compute optimized and use Amazon Elastic Block Store (Amazon EBS) volumes. The company wants to identify cost optimizations across the EC2 instances, the Auto Scaling group, and the EBS volumes. Which solution will meet these requirements with the MOST operational efficiency?
A. Create a new AWS Cost and Usage Report. Search the report for cost recommendations for the EC2 instances, the Auto Scaling group, and the EBS volumes.
B. Create new Amazon CloudWatch billing alerts. Check the alert statuses for cost recommendations for the EC2 instances, the Auto Scaling group, and the EBS volumes.
C. Configure AWS Compute Optimizer for cost recommendations for the EC2 instances, the Auto Scaling group, and the EBS volumes.
D. Configure AWS Compute Optimizer for cost recommendations for the EC2 instances. Create a new AWS Cost and Usage Report. Search the report for cost recommendations for the Auto Scaling group and the EBS volumes.
Question # 30
A company's near-real-time streaming application is running on AWS. As the data is ingested, a job runs on the data and takes 30 minutes to complete. The workload frequently experiences high latency due to large amounts of incoming data. A solutions architect needs to design a scalable and serverless solution to enhance performance. Which combination of steps should the solutions architect take? (Select TWO.)
A. Use Amazon Kinesis Data Firehose to ingest the data.
B. Use AWS Lambda with AWS Step Functions to process the data.
C. Use AWS Database Migration Service (AWS DMS) to ingest the data
D. Use Amazon EC2 instances in an Auto Scaling group to process the data.
E. Use AWS Fargate with Amazon Elastic Container Service (Amazon ECS) to process the data.
Question # 31
A solutions architect is creating an application that will handle batch processing of large amounts of data. The input data will be held in Amazon S3, and the output data will be stored in a different S3 bucket. For processing, the application will transfer the data over the network between multiple Amazon EC2 instances. What should the solutions architect do to reduce the overall data transfer costs?
A. Place all the EC2 instances in an Auto Scaling group.
B. Place all the EC2 instances in the same AWS Region.
C. Place all the EC2 instances in the same Availability Zone.
D. Place all the EC2 instances in private subnets in multiple Availability Zones.
Question # 32
A company has two AWS accounts: Production and Development. The company needs to push code changes in the Development account to the Production account. In the alpha phase, only two senior developers on the development team need access to the Production account. In the beta phase, more developers will need access to perform testing. Which solution will meet these requirements?
A. Create two policy documents by using the AWS Management Console in each account. Assign the policy to developers who need access.
B. Create an IAM role in the Development account. Grant the IAM role access to the Production account. Allow developers to assume the role.
C. Create an IAM role in the Production account. Define a trust policy that specifies the Development account. Allow developers to assume the role.
D. Create an IAM group in the Production account. Add the group as a principal in a trust policy that specifies the Production account. Add developers to the group.
Question # 33
A company uses 50 TB of data for reporting. The company wants to move this data from on premises to AWS. A custom application in the company's data center runs a weekly data transformation job. The company plans to pause the application until the data transfer is complete and needs to begin the transfer process as soon as possible. The data center does not have any available network bandwidth for additional workloads. A solutions architect must transfer the data and must configure the transformation job to continue to run in the AWS Cloud. Which solution will meet these requirements with the LEAST operational overhead?
A. Use AWS DataSync to move the data. Create a custom transformation job by using AWS Glue.
B. Order an AWS Snowcone device to move the data. Deploy the transformation application to the device.
C. Order an AWS Snowball Edge Storage Optimized device. Copy the data to the device. Create a custom transformation job by using AWS Glue.
D. Order an AWS Snowball Edge Storage Optimized device that includes Amazon EC2 compute. Copy the data to the device. Create a new EC2 instance on AWS to run the transformation application.
Question # 34
A company serves its website by using an Auto Scaling group of Amazon EC2 instances in a single AWS Region. The website does not require a database. The company is expanding, and the company's engineering team deploys the website to a second Region. The company wants to distribute traffic across both Regions to accommodate growth and for disaster recovery purposes. The solution should not serve traffic from a Region in which the website is unhealthy. Which policy or resource should the company use to meet these requirements?
A. An Amazon Route 53 simple routing policy
B. An Amazon Route 53 multivalue answer routing policy
C. An Application Load Balancer in one Region with a target group that specifies the EC2 instance IDs from both Regions
D. An Application Load Balancer in one Region with a target group that specifies the IP addresses of the EC2 instances from both Regions
Question # 35
A company wants to build a logging solution for its multiple AWS accounts. The company currently stores the logs from all accounts in a centralized account. The company has created an Amazon S3 bucket in the centralized account to store the VPC flow logs and AWS CloudTrail logs. All logs must be highly available for 30 days for frequent analysis, retained for an additional 60 days for backup purposes, and deleted 90 days after creation. Which solution will meet these requirements MOST cost-effectively?
A. Transition objects to the S3 Standard storage class 30 days after creation. Write an expiration action that directs Amazon S3 to delete objects after 90 days.
B. Transition objects to the S3 Standard-Infrequent Access (S3 Standard-IA) storage class 30 days after creation. Move all objects to the S3 Glacier Flexible Retrieval storage class after 90 days. Write an expiration action that directs Amazon S3 to delete objects after 90 days.
C. Transition objects to the S3 Glacier Flexible Retrieval storage class 30 days after creation. Write an expiration action that directs Amazon S3 to delete objects after 90 days.
D. Transition objects to the S3 One Zone-Infrequent Access (S3 One Zone-IA) storage class 30 days after creation. Move all objects to the S3 Glacier Flexible Retrieval storage class after 90 days. Write an expiration action that directs Amazon S3 to delete objects after 90 days.
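As a study aid, the sketch below shows an S3 Lifecycle configuration with a 30-day transition and a 90-day expiration, which is the general mechanism all four options rely on. The bucket name and the chosen storage class are assumptions, not an indication of the intended answer.

```python
import boto3

s3 = boto3.client("s3")

# Lifecycle rule: move log objects to a colder tier after 30 days, expire at 90 days.
s3.put_bucket_lifecycle_configuration(
    Bucket="central-logging-bucket",  # hypothetical bucket
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "logs-30-90",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},  # apply to every object in the bucket
                "Transitions": [{"Days": 30, "StorageClass": "STANDARD_IA"}],
                "Expiration": {"Days": 90},
            }
        ]
    },
)
```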
Question # 36
A company is hosting a high-traffic static website on Amazon S3 with an Amazon CloudFront distribution that has a default TTL of 0 seconds. The company wants to implement caching to improve performance for the website. However, the company also wants to ensure that stale content is not served for more than a few minutes after a deployment. Which combination of caching methods should a solutions architect implement to meet these requirements? (Select TWO.)
A. Set the CloudFront default TTL to 2 minutes.
B. Set a default TTL of 2 minutes on the S3 bucket
C. Add a Cache-Control private directive to the objects in Amazon S3.
D. Create an AWS Lambda@Edge function to add an Expires header to HTTP responses. Configure the function to run on viewer response.
E. Add a Cache-Control max-age directive of 24 hours to the objects in Amazon S3. On deployment, create a CloudFront invalidation to clear any changed files from edge caches.
Question # 37
A company needs to optimize the cost of its Amazon EC2 instances. The company also needs to change the type and family of its EC2 instances every 2-3 months. What should the company do to meet these requirements?
A. Purchase Partial Upfront Reserved Instances for a 3-year term.
B. Purchase a No Upfront Compute Savings Plan for a 1-year term.
C. Purchase All Upfront Reserved Instances for a 1-year term.
D. Purchase an All Upfront EC2 Instance Savings Plan for a 1-year term.
Question # 38
A company runs an application on Amazon EC2 instances in a private subnet. The application needs to store and retrieve data in Amazon S3 buckets. According to regulatory requirements, the data must not travel across the public internet. What should a solutions architect do to meet these requirements MOST cost-effectively?
A. Deploy a NAT gateway to access the S3 buckets.
B. Deploy AWS Storage Gateway to access the S3 buckets.
C. Deploy an S3 interface endpoint to access the S3 buckets.
D. Deploy an S3 gateway endpoint to access the S3 buckets.
Question # 39
A company uses an Amazon S3 bucket as its data lake storage platform. The S3 bucket contains a massive amount of data that is accessed randomly by multiple teams and hundreds of applications. The company wants to reduce the S3 storage costs and provide immediate availability for frequently accessed objects. What is the MOST operationally efficient solution that meets these requirements?
A. Create an S3 Lifecycle rule to transition objects to the S3 Intelligent-Tiering storage class.
B. Store objects in Amazon S3 Glacier. Use S3 Select to provide applications with access to the data.
C. Use data from S3 storage class analysis to create S3 Lifecycle rules to automatically transition objects to the S3 Standard-Infrequent Access (S3 Standard-IA) storage class.
D. Transition objects to the S3 Standard-Infrequent Access (S3 Standard-IA) storage class. Create an AWS Lambda function to transition objects to the S3 Standard storage class when they are accessed by an application.
Question # 40
A company's SAP application has a backend SQL Server database in an on-premises environment. The company wants to migrate its on-premises application and database server to AWS. The company needs an instance type that meets the high demands of its SAP database. On-premises performance data shows that both the SAP application and the database have high memory utilization. Which solution will meet these requirements?
A. Use the compute optimized instance family for the application. Use the memory optimized instance family for the database.
B. Use the storage optimized instance family for both the application and the database.
C. Use the memory optimized instance family for both the application and the database.
D. Use the high performance computing (HPC) optimized instance family for the application. Use the memory optimized instance family for the database.
Question # 41
A company needs to design a hybrid network architecture. The company's workloads are currently stored in the AWS Cloud and in on-premises data centers. The workloads require single-digit latencies to communicate. The company uses an AWS Transit Gateway transit gateway to connect multiple VPCs. Which combination of steps will meet these requirements MOST cost-effectively? (Select TWO.)
A. Establish an AWS Site-to-Site VPN connection to each VPC.
B. Associate an AWS Direct Connect gateway with the transit gateway that is attached to the VPCs.
C. Establish an AWS Site-to-Site VPN connection to an AWS Direct Connect gateway.
D. Establish an AWS Direct Connect connection. Create a transit virtual interface (VIF) to a Direct Connect gateway.
E. Associate AWS Site-to-Site VPN connections with the transit gateway that is attached to the VPCs
Question # 42
A company has an application that runs on Amazon EC2 instances in a private subnet. The application needs to process sensitive information from an Amazon S3 bucket. The application must not use the internet to connect to the S3 bucket. Which solution will meet these requirements?
A. Configure an internet gateway. Update the S3 bucket policy to allow access from the internet gateway. Update the application to use the new internet gateway.
B. Configure a VPN connection. Update the S3 bucket policy to allow access from the VPN connection. Update the application to use the new VPN connection.
C. Configure a NAT gateway. Update the S3 bucket policy to allow access from the NAT gateway. Update the application to use the new NAT gateway.
D. Configure a VPC endpoint. Update the S3 bucket policy to allow access from the VPC endpoint. Update the application to use the new VPC endpoint.
Question # 43
A company is planning to deploy its application on an Amazon Aurora PostgreSQL Serverless v2 cluster. The application will receive large amounts of traffic. The company wants to optimize the storage performance of the cluster as the load on the application increases. Which solution will meet these requirements MOST cost-effectively?
A. Configure the cluster to use the Aurora Standard storage configuration.
B. Configure the cluster storage type as Provisioned IOPS.
C. Configure the cluster storage type as General Purpose.
D. Configure the cluster to use the Aurora I/O-Optimized storage configuration.
Question # 44
A company plans to rehost an application to Amazon EC2 instances that use Amazon Elastic Block Store (Amazon EBS) as the attached storage. A solutions architect must design a solution to ensure that all newly created Amazon EBS volumes are encrypted by default. The solution must also prevent the creation of unencrypted EBS volumes. Which solution will meet these requirements?
A. Configure the EC2 account attributes to always encrypt new EBS volumes.
B. Use AWS Config. Configure the encrypted-volumes identifier. Apply the default AWS Key Management Service (AWS KMS) key.
C. Configure AWS Systems Manager to create encrypted copies of the EBS volumes. Reconfigure the EC2 instances to use the encrypted volumes.
D. Create a customer managed key in AWS Key Management Service (AWS KMS). Configure AWS Migration Hub to use the key when the company migrates workloads.
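The sketch below shows the account-attribute approach from option A: enabling EBS encryption by default for a Region and optionally pointing it at a specific KMS key. The Region and key alias are hypothetical.

```python
import boto3

# EBS encryption by default is a per-Region account setting.
ec2 = boto3.client("ec2", region_name="us-east-1")

ec2.enable_ebs_encryption_by_default()
print(ec2.get_ebs_encryption_by_default()["EbsEncryptionByDefault"])  # expected: True

# Optionally use a specific KMS key instead of the AWS managed aws/ebs key.
ec2.modify_ebs_default_kms_key_id(KmsKeyId="alias/ebs-default-key")  # hypothetical alias
```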
Question # 45
A company uses AWS to host its public ecommerce website. The website uses an AWS Global Accelerator accelerator for traffic from the internet. The Global Accelerator accelerator forwards the traffic to an Application Load Balancer (ALB) that is the entry point for an Auto Scaling group. The company recently identified a DDoS attack on the website. The company needs a solution to mitigate future attacks. Which solution will meet these requirements with the LEAST implementation effort?
A. Configure an AWS WAF web ACL for the Global Accelerator accelerator to block traffic by using rate-based rules.
B. Configure an AWS Lambda function to read the ALB metrics to block attacks by updating a VPC network ACL.
C. Configure an AWS WAF web ACL on the ALB to block traffic by using rate-based rules.
D. Configure an Amazon CloudFront distribution in front of the Global Accelerator accelerator.
Question # 46
A media company uses an Amazon CloudFront distribution to deliver content over the internet. The company wants only premium customers to have access to the media streams and file content. The company stores all content in an Amazon S3 bucket. The company also delivers content on demand to customers for a specific purpose, such as movie rentals or music downloads. Which solution will meet these requirements?
A. Generate and provide S3 signed cookies to premium customers
B. Generate and provide CloudFront signed URLs to premium customers.
C. Use origin access control (OAC) to limit the access of non-premium customers
D. Generate and activate field-level encryption to block non-premium customers.
Question # 47
A media company has a multi-account AWS environment in the us-east-1 Region. The company has an Amazon Simple Notification Service (Amazon SNS) topic in a production account that publishes performance metrics. The company has an AWS Lambda function in an administrator account to process and analyze log data. The Lambda function that is in the administrator account must be invoked by messages from the SNS topic that is in the production account when significant metrics are reported. Which combination of steps will meet these requirements? (Select TWO.)
A. Create an IAM resource policy for the Lambda function that allows Amazon SNS to invoke the function. Implement an Amazon Simple Queue Service (Amazon SQS) queue in the administrator account to buffer messages from the SNS topic that is in the production account. Configure the SQS queue to invoke the Lambda function.
B. Create an IAM policy for the SNS topic that allows the Lambda function to subscribe to the topic.
C. Use an Amazon EventBridge rule in the production account to capture the SNS topic notifications. Configure the EventBridge rule to forward notifications to the Lambda function that is in the administrator account.
D. Store performance metrics in an Amazon S3 bucket in the production account. Use Amazon Athena to analyze the metrics from the administrator account.
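To illustrate the cross-account wiring between an SNS topic and a Lambda function, here is a hedged boto3 sketch. The account IDs and ARNs are placeholders, and each call must run with credentials for the account noted in the comments.

```python
import boto3

topic_arn = "arn:aws:sns:us-east-1:111111111111:performance-metrics"     # production account
function_arn = "arn:aws:lambda:us-east-1:222222222222:function:analyze"  # administrator account

# Run with administrator-account credentials: allow the SNS topic to invoke the function.
boto3.client("lambda").add_permission(
    FunctionName=function_arn,
    StatementId="allow-sns-from-production",
    Action="lambda:InvokeFunction",
    Principal="sns.amazonaws.com",
    SourceArn=topic_arn,
)

# Run with production-account credentials: subscribe the Lambda function to the topic.
boto3.client("sns").subscribe(TopicArn=topic_arn, Protocol="lambda", Endpoint=function_arn)
```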
Question # 48
A weather forecasting company needs to process hundreds of gigabytes of data with sub-millisecond latency. The company has a high performance computing (HPC) environment in its data center and wants to expand its forecasting capabilities. A solutions architect must identify a highly available cloud storage solution that can handle large amounts of sustained throughput. Files that are stored in the solution should be accessible to thousands of compute instances that will simultaneously access and process the entire dataset. What should the solutions architect do to meet these requirements?
A. Use Amazon FSx for Lustre scratch file systems
B. Use Amazon FSx for Lustre persistent file systems.
C. Use Amazon Elastic File System (Amazon EFS) with Bursting Throughput mode.
D. Use Amazon Elastic File System (Amazon EFS) with Provisioned Throughput mode.
Question # 49
A company uses a Microsoft SQL Server database. The company's applications are connected to the database. The company wants to migrate to an Amazon Aurora PostgreSQL database with minimal changes to the application code. Which combination of steps will meet these requirements? (Select TWO.)
A. Use the AWS Schema Conversion Tool (AWS SCT) to rewrite the SQL queries in the applications.
B. Enable Babelfish on Aurora PostgreSQL to run the SQL queries from the applications.
C. Migrate the database schema and data by using the AWS Schema Conversion Tool (AWS SCT) and AWS Database Migration Service (AWS DMS).
D. Use Amazon RDS Proxy to connect the applications to Aurora PostgreSQL.
E. Use AWS Database Migration Service (AWS DMS) to rewrite the SQL queries in the applications.
Question # 50
A company has an application that is running on Amazon EC2 instances. A solutions architect has standardized the company on a particular instance family and various instance sizes based on the current needs of the company. The company wants to maximize cost savings for the application over the next 3 years. The company needs to be able to change the instance family and sizes in the next 6 months based on application popularity and usage. Which solution will meet these requirements MOST cost-effectively?
A. Compute Savings Plan
B. EC2 Instance Savings Plan
C. Zonal Reserved Instances
D. Standard Reserved Instances
Question # 51
A company stores sensitive data in Amazon S3. A solutions architect needs to create an encryption solution. The company needs to fully control the ability of users to create, rotate, and disable encryption keys with minimal effort for any data that must be encrypted. Which solution will meet these requirements?
A. Use default server-side encryption with Amazon S3 managed encryption keys (SSE-S3) to store the sensitive data.
B. Create a customer managed key by using AWS Key Management Service (AWS KMS). Use the new key to encrypt the S3 objects by using server-side encryption with AWS KMS keys (SSE-KMS).
C. Create an AWS managed key by using AWS Key Management Service (AWS KMS). Use the new key to encrypt the S3 objects by using server-side encryption with AWS KMS keys (SSE-KMS).
D. Download S3 objects to an Amazon EC2 instance. Encrypt the objects by using customer managed keys. Upload the encrypted objects back into Amazon S3.
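For practice with the building blocks referenced in options B and C, the sketch below creates a customer managed KMS key and uploads an S3 object with SSE-KMS. The bucket name, alias, and object key are hypothetical.

```python
import boto3

kms = boto3.client("kms")
s3 = boto3.client("s3")

# Customer managed key: the company controls creation, rotation, and disabling.
key_id = kms.create_key(Description="S3 sensitive-data key")["KeyMetadata"]["KeyId"]
kms.create_alias(AliasName="alias/s3-sensitive-data", TargetKeyId=key_id)

# Upload an object encrypted server-side with the new KMS key (SSE-KMS).
s3.put_object(
    Bucket="sensitive-data-bucket",   # hypothetical bucket
    Key="reports/2024/q3.csv",
    Body=b"example,data",
    ServerSideEncryption="aws:kms",
    SSEKMSKeyId=key_id,
)
```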
Question # 52
A company wants to migrate an application to AWS. The company wants to increase the application's current availability. The company wants to use AWS WAF in the application's architecture. Which solution will meet these requirements?
A. Create an Auto Scaling group that contains multiple Amazon EC2 instances that host the application across two Availability Zones. Configure an Application Load Balancer (ALB) and set the Auto Scaling group as the target. Connect a WAF to the ALB.
B. Create a cluster placement group that contains multiple Amazon EC2 instances that host the application. Configure an Application Load Balancer and set the EC2 instances as the targets. Connect a WAF to the placement group.
C. Create two Amazon EC2 instances that host the application across two Availability Zones. Configure the EC2 instances as the targets of an Application Load Balancer (ALB). Connect a WAF to the ALB.
D. Create an Auto Scaling group that contains multiple Amazon EC2 instances that host the application across two Availability Zones. Configure an Application Load Balancer (ALB) and set the Auto Scaling group as the target. Connect a WAF to the Auto Scaling group.
Question # 53
A development team uses multiple AWS accounts for its development, staging, and production environments. Team members have been launching large Amazon EC2 instances that are underutilized. A solutions architect must prevent large instances from being launched in all accounts. How can the solutions architect meet this requirement with the LEAST operational overhead?
A. Update the IAM policies to deny the launch of large EC2 instances. Apply the policies to all users.
B. Define a resource in AWS Resource Access Manager that prevents the launch of large EC2 instances.
C. Create an IAM role in each account that denies the launch of large EC2 instances. Grant the developers IAM group access to the role.
D. Create an organization in AWS Organizations in the management account with the default policy. Create a service control policy (SCP) that denies the launch of large EC2 instances, and apply it to the AWS accounts.
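To show what an SCP like the one in option D can look like, here is a minimal sketch that denies launches of a few large instance-type patterns and attaches the policy with AWS Organizations. The instance-type patterns and the root ID are assumptions, and SCPs must already be enabled in the organization.

```python
import json
import boto3

org = boto3.client("organizations")

# Deny RunInstances for a few (assumed) "large" instance-type patterns.
scp = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Deny",
            "Action": "ec2:RunInstances",
            "Resource": "arn:aws:ec2:*:*:instance/*",
            "Condition": {
                "StringLike": {"ec2:InstanceType": ["*.8xlarge", "*.12xlarge", "*.16xlarge", "*.24xlarge"]}
            },
        }
    ],
}

policy_id = org.create_policy(
    Name="deny-large-ec2-instances",
    Description="Block launches of very large EC2 instance types",
    Type="SERVICE_CONTROL_POLICY",
    Content=json.dumps(scp),
)["Policy"]["PolicySummary"]["Id"]

org.attach_policy(PolicyId=policy_id, TargetId="r-examp")  # placeholder organization root ID
```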
Question # 54
A company is developing an application to support customer demands. The company wants to deploy the application on multiple Amazon EC2 Nitro-based instances within the same Availability Zone. The company also wants to give the application the ability to write to multiple block storage volumes in multiple EC2 Nitro-based instances simultaneously to achieve higher application availability. Which solution will meet these requirements?
A. Use General Purpose SSD (gp3) EBS volumes with Amazon Elastic Block Store (Amazon EBS) Multi-Attach.
B. Use Throughput Optimized HDD (st1) EBS volumes with Amazon Elastic Block Store (Amazon EBS) Multi-Attach.
C. Use Provisioned IOPS SSD (io2) EBS volumes with Amazon Elastic Block Store (Amazon EBS) Multi-Attach.
D. Use General Purpose SSD (gp2) EBS volumes with Amazon Elastic Block Store (Amazon EBS) Multi-Attach.
Question # 55
A company runs a stateful production application on Amazon EC2 instances. The application requires at least two EC2 instances to always be running. A solutions architect needs to design a highly available and fault-tolerant architecture for the application. The solutions architect creates an Auto Scaling group of EC2 instances. Which set of additional steps should the solutions architect take to meet these requirements?
A. Set the Auto Scaling group's minimum capacity to two. Deploy one On-Demand Instance in one Availability Zone and one On-Demand Instance in a second Availability Zone.
B. Set the Auto Scaling group's minimum capacity to four. Deploy two On-Demand Instances in one Availability Zone and two On-Demand Instances in a second Availability Zone.
C. Set the Auto Scaling group's minimum capacity to two. Deploy four Spot Instances in one Availability Zone.
D. Set the Auto Scaling group's minimum capacity to four. Deploy two On-Demand Instances in one Availability Zone and two Spot Instances in a second Availability Zone.
Question # 56
A company recently migrated its web application to the AWS Cloud. The company uses an Amazon EC2 instance to run multiple processes to host the application. The processes include an Apache web server that serves static content. The Apache web server makes requests to a PHP application that uses a local Redis server for user sessions. The company wants to redesign the architecture to be highly available and to use AWS managed solutions. Which solution will meet these requirements?
A. Use AWS Elastic Beanstalk to host the static content and the PHP application. Configure Elastic Beanstalk to deploy its EC2 instance into a public subnet. Assign a public IP address.
B. Use AWS Lambda to host the static content and the PHP application. Use an Amazon API Gateway REST API to proxy requests to the Lambda function. Set the API Gateway CORS configuration to respond to the domain name. Configure Amazon ElastiCache for Redis to handle session information.
C. Keep the backend code on the EC2 instance. Create an Amazon ElastiCache for Redis cluster that has Multi-AZ enabled. Configure the ElastiCache for Redis cluster in cluster mode. Copy the frontend resources to Amazon S3. Configure the backend code to reference the EC2 instance.
D. Configure an Amazon CloudFront distribution with an Amazon S3 endpoint to an S3 bucket that is configured to host the static content. Configure an Application Load Balancer that targets an Amazon Elastic Container Service (Amazon ECS) service that runs AWS Fargate tasks for the PHP application. Configure the PHP application to use an Amazon ElastiCache for Redis cluster that runs in multiple Availability Zones.
Question # 57
A company's software development team needs an Amazon RDS Multi-AZ cluster. The RDS cluster will serve as a backend for a desktop client that is deployed on premises. The desktop client requires direct connectivity to the RDS cluster. The company must give the development team the ability to connect to the cluster by using the client when the team is in the office. Which solution provides the required connectivity MOST securely?
A. Create a VPC and two public subnets. Create the RDS cluster in the public subnets. Use AWS Site-to-Site VPN with a customer gateway in the company's office.
B. Create a VPC and two private subnets. Create the RDS cluster in the private subnets. Use AWS Site-to-Site VPN with a customer gateway in the company's office.
C. Create a VPC and two private subnets. Create the RDS cluster in the private subnets. Use RDS security groups to allow the company's office IP ranges to access the cluster.
D. Create a VPC and two public subnets. Create the RDS cluster in the public subnets. Create a cluster user for each developer. Use RDS security groups to allow the users to access the cluster.
Question # 58
A company uses an Amazon Aurora PostgreSQL provisioned cluster with its application. The application's peak traffic occurs several times a day for periods of 30 minutes to several hours. The database capacity is provisioned to handle peak traffic from the application, but the database has wasted capacity during non-peak hours. The company wants to reduce the database costs. Which solution will meet these requirements with the LEAST operational effort?
A. Set up an Amazon CloudWatch alarm to monitor database utilization. Scale up or scale down the database capacity based on the amount of traffic.
B. Migrate the database to Amazon EC2 instances in an Auto Scaling group. Increase or decrease the number of instances based on the amount of traffic.
C. Migrate the database to an Amazon Aurora Serverless DB cluster to scale up or scale down the capacity based on the amount of traffic.
D. Schedule an AWS Lambda function to provision the required database capacity at the start of each day. Schedule another Lambda function to reduce the capacity at the end of each day.
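For context on the serverless option above, Aurora Serverless capacity is expressed as a range of Aurora capacity units (ACUs) that the cluster scales within. The sketch below assumes an Aurora Serverless v2-compatible cluster; the cluster identifier and capacity values are placeholder assumptions.

    # Minimal sketch: set a Serverless v2 scaling range on an Aurora cluster (placeholder values).
    import boto3

    rds = boto3.client("rds")

    rds.modify_db_cluster(
        DBClusterIdentifier="app-aurora-cluster",    # placeholder cluster name
        ServerlessV2ScalingConfiguration={
            "MinCapacity": 0.5,    # ACUs retained during quiet periods
            "MaxCapacity": 32.0,   # ACUs available for peak traffic
        },
        ApplyImmediately=True,
    )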
Question # 59
A company has applications that run on Amazon EC2 instances in a VPC. One of the applications needs to call the Amazon S3 API to store and read objects. According to the company's security regulations, no traffic from the applications is allowed to travel across the internet. Which solution will meet these requirements?
A. Configure an S3 gateway endpoint.
B. Create an S3 bucket in a private subnet.
C. Create an S3 bucket in the same AWS Region as the EC2 instances.
D. Configure a NAT gateway in the same subnet as the EC2 instances
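As background for the gateway endpoint option, the sketch below creates an S3 gateway endpoint and associates it with the route table used by the application subnets so that S3 traffic stays on the AWS network. The VPC ID, Region in the service name, and route table ID are placeholders.

    # Minimal sketch: create an S3 gateway endpoint for a VPC (placeholder IDs and Region).
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    ec2.create_vpc_endpoint(
        VpcId="vpc-0123456789abcdef0",
        ServiceName="com.amazonaws.us-east-1.s3",
        VpcEndpointType="Gateway",
        RouteTableIds=["rtb-0123456789abcdef0"],   # route table(s) used by the application subnets
    )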
Question # 60
A company needs a secure connection between its on-premises environment and AWS. This connection does not need high bandwidth and will handle a small amount of traffic. The connection should be set up quickly. What is the MOST cost-effective method to establish this type of connection?
A. Implement a client VPN
B. Implement AWS Direct Connect.
C. Implement a bastion host on Amazon EC2.
D. Implement an AWS Site-to-Site VPN connection.
Question # 61
A social media company wants to store its database of user profiles, relationships, and interactions in the AWS Cloud. The company needs an application to monitor any changes in the database. The application needs to analyze the relationships between the data entities and to provide recommendations to users. Which solution will meet these requirements with the LEAST operational overhead?
A. Use Amazon Neptune to store the information. Use Amazon Kinesis Data Streams to process changes in the database.
B. Use Amazon Neptune to store the information. Use Neptune Streams to process changes in the database.
C. Use Amazon Quantum Ledger Database (Amazon QLDB) to store the information. Use Amazon Kinesis Data Streams to process changes in the database.
D. Use Amazon Quantum Ledger Database (Amazon QLDB) to store the information. Use Neptune Streams to process changes in the database.
Question # 62
A company wants to create a mobile app that allows users to stream slow-motion video clips on their mobile devices. Currently, the app captures video clips and uploads the video clips in raw format into an Amazon S3 bucket. The app retrieves these video clips directly from the S3 bucket. However, the videos are large in their raw format. Users are experiencing issues with buffering and playback on mobile devices. The company wants to implement solutions to maximize the performance and scalability of the app while minimizing operational overhead. Which combination of solutions will meet these requirements? (Select TWO.)
A. Deploy Amazon CloudFront for content delivery and caching.
B. Use AWS DataSync to replicate the video files across AWS Regions in other S3 buckets.
C. Use Amazon Elastic Transcoder to convert the video files to more appropriate formats.
D. Deploy an Auto Scaling group of Amazon EC2 instances in Local Zones for content delivery and caching.
E. Deploy an Auto Scaling group of Amazon EC2 instances to convert the video files to more appropriate formats.
Question # 63
A marketing company receives a large amount of new clickstream data in Amazon S3 from a marketing campaign. The company needs to analyze the clickstream data in Amazon S3 quickly. Then the company needs to determine whether to process the data further in the data pipeline. Which solution will meet these requirements with the LEAST operational overhead?
A. Create external tables in a Spark catalog. Configure jobs in AWS Glue to query the data.
B. Configure an AWS Glue crawler to crawl the data. Configure Amazon Athena to query the data.
C. Create external tables in a Hive metastore. Configure Spark jobs in Amazon EMR to query the data.
D. Configure an AWS Glue crawler to crawl the data. Configure Amazon Kinesis Data Analytics to use SQL to query the data.
Question # 64
A company hosts its core network services, including directory services and DNS, in its on-premises data center. The data center is connected to the AWS Cloud using AWS Direct Connect (DX). Additional AWS accounts are planned that will require quick, cost-effective, and consistent access to these network services. What should a solutions architect implement to meet these requirements with the LEAST amount of operational overhead?
A. Create a DX connection in each new account. Route the network traffic to the on-premises servers.
B. Configure VPC endpoints in the DX VPC for all required services. Route the network traffic to the on-premises servers.
C. Create a VPN connection between each new account and the DX VPC. Route the network traffic to the on-premises servers.
D. Configure AWS Transit Gateway between the accounts. Assign DX to the transit gateway and route network traffic to the on-premises servers.
Question # 65
A company's solutions architect is designing an AWS multi-account solution that uses AWS Organizations. The solutions architect has organized the company's accounts into organizational units (OUs). The solutions architect needs a solution that will identify any changes to the OU hierarchy. The solution also needs to notify the company's operations team of any changes. Which solution will meet these requirements with the LEAST operational overhead?
A. Provision the AWS accounts by using AWS Control Tower. Use account drift notifications to identify the changes to the OU hierarchy.
B. Provision the AWS accounts by using AWS Control Tower. Use AWS Config aggregated rules to identify the changes to the OU hierarchy.
C. Use AWS Service Catalog to create accounts in Organizations. Use an AWS CloudTrail organization trail to identify the changes to the OU hierarchy.
D. Use AWS CloudFormation templates to create accounts in Organizations. Use the drift detection operation on a stack to identify the changes to the OU hierarchy.
Question # 66
A company has released a new version of its production application. The company's workload uses Amazon EC2, AWS Lambda, AWS Fargate, and Amazon SageMaker. The company wants to cost optimize the workload now that usage is at a steady state. The company wants to cover the most services with the fewest savings plans. Which combination of savings plans will meet these requirements? (Select TWO.)
A. Purchase an EC2 Instance Savings Plan for Amazon EC2 and SageMaker.
B. Purchase a Compute Savings Plan for Amazon EC2, Lambda, and SageMaker.
C. Purchase a SageMaker Savings Plan.
D. Purchase a Compute Savings Plan for Lambda, Fargate, and Amazon EC2.
E. Purchase an EC2 Instance Savings Plan for Amazon EC2 and Fargate.
Question # 67
A company deploys Amazon EC2 instances that run in a VPC. The EC2 instances load source data into Amazon S3 buckets so that the data can be processed in the future. According to compliance laws, the data must not be transmitted over the public internet. Servers in the company's on-premises data center will consume the output from an application that runs on the EC2 instances. Which solution will meet these requirements?
A. Deploy an interface VPC endpoint for Amazon EC2. Create an AWS Site-to-Site VPN connection between the company and the VPC.
B. Deploy a gateway VPC endpoint for Amazon S3. Set up an AWS Direct Connect connection between the on-premises network and the VPC.
C. Set up an AWS Transit Gateway connection from the VPC to the S3 buckets. Create an AWS Site-to-Site VPN connection between the company and the VPC.
D. Set up proxy EC2 instances that have routes to NAT gateways. Configure the proxy EC2 instances to fetch S3 data and feed the application instances.
Question # 68
A company regularly uploads GB-sized files to Amazon S3. After the company uploads the files, the company uses a fleet of Amazon EC2 Spot Instances to transcode the file format. The company needs to scale throughput when the company uploads data from the on-premises data center to Amazon S3 and when the company downloads data from Amazon S3 to the EC2 instances. Which solutions will meet these requirements? (Select TWO.)
A. Use the S3 bucket access point instead of accessing the S3 bucket directly.
B. Upload the files into multiple S3 buckets.
C. Use S3 multipart uploads.
D. Fetch multiple byte-ranges of an object in parallel.
E. Add a random prefix to each object when uploading the files.
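Two of the options above refer to multipart uploads and parallel byte-range fetches. The sketch below shows both with boto3; the bucket name, object key, local file name, and part sizes are placeholder assumptions.

    # Minimal sketch: multipart upload via TransferConfig, then a byte-range GET.
    # Bucket, key, file name, and sizes are placeholders.
    import boto3
    from boto3.s3.transfer import TransferConfig

    s3 = boto3.client("s3")

    # Upload in 100 MB parts with several parts in flight at once.
    config = TransferConfig(
        multipart_threshold=100 * 1024 * 1024,
        multipart_chunksize=100 * 1024 * 1024,
        max_concurrency=8,
    )
    s3.upload_file("video-raw.mov", "example-upload-bucket", "uploads/video-raw.mov", Config=config)

    # Download only the first 100 MB of the object; additional ranges can be fetched in parallel threads.
    part = s3.get_object(
        Bucket="example-upload-bucket",
        Key="uploads/video-raw.mov",
        Range="bytes=0-104857599",
    )
    data = part["Body"].read()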
Question # 69
A company has a mobile app for customers. The app's data is sensitive and must be encrypted at rest. The company uses AWS Key Management Service (AWS KMS). The company needs a solution that prevents the accidental deletion of KMS keys. The solution must use Amazon Simple Notification Service (Amazon SNS) to send an email notification to administrators when a user attempts to delete a KMS key. Which solution will meet these requirements with the LEAST operational overhead?
A. Create an Amazon EventBridge rule that reacts when a user tries to delete a KMS key. Configure an AWS Config rule that cancels any deletion of a KMS key. Add the AWS Config rule as a target of the EventBridge rule. Create an SNS topic that notifies the administrators.
B. Create an AWS Lambda function that has custom logic to prevent KMS key deletion. Create an Amazon CloudWatch alarm that is activated when a user tries to delete a KMS key. Create an Amazon EventBridge rule that invokes the Lambda function when the DeleteKey operation is performed. Create an SNS topic. Configure the EventBridge rule to publish an SNS message that notifies the administrators.
C. Create an Amazon EventBridge rule that reacts when the KMS DeleteKey operation is performed. Configure the rule to initiate an AWS Systems Manager Automation runbook. Configure the runbook to cancel the deletion of the KMS key. Create an SNS topic. Configure the EventBridge rule to publish an SNS message that notifies the administrators.
D. Create an AWS CloudTrail trail. Configure the trail to deliver logs to a new Amazon CloudWatch log group. Create a CloudWatch alarm based on the metric filter for the CloudWatch log group. Configure the alarm to use Amazon SNS to notify the administrators when the KMS DeleteKey operation is performed.
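As background for the EventBridge-based options above: the KMS call that actually schedules key deletion is ScheduleKeyDeletion, and it reaches EventBridge as an "AWS API Call via CloudTrail" event, so a matching rule might look like the sketch below. The rule name and SNS topic ARN are placeholders, and the topic's resource policy is assumed to already allow EventBridge to publish.

    # Minimal sketch: EventBridge rule that matches KMS ScheduleKeyDeletion calls and targets an SNS topic.
    # Rule name and topic ARN are placeholders; the topic policy must allow events.amazonaws.com to publish.
    import json
    import boto3

    events = boto3.client("events")

    events.put_rule(
        Name="alert-on-kms-key-deletion",
        EventPattern=json.dumps({
            "source": ["aws.kms"],
            "detail-type": ["AWS API Call via CloudTrail"],
            "detail": {
                "eventSource": ["kms.amazonaws.com"],
                "eventName": ["ScheduleKeyDeletion"],
            },
        }),
    )

    events.put_targets(
        Rule="alert-on-kms-key-deletion",
        Targets=[{"Id": "notify-admins", "Arn": "arn:aws:sns:us-east-1:111122223333:kms-deletion-alerts"}],
    )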
Question # 70
A company has an on-premises business application that generates hundreds of files each day. These files are stored on an SMB file share and require a low-latency connection to the application servers. A new company policy states all application-generated files must be copied to AWS. There is already a VPN connection to AWS. The application development team does not have time to make the necessary code modifications to move the application to AWS. Which service should a solutions architect recommend to allow the application to copy files to AWS?
A. Amazon Elastic File System (Amazon EFS)
B. Amazon FSx for Windows File Server
C. AWS Snowball
D. AWS Storage Gateway
Question # 71
A company has a web application in the AWS Cloud and wants to collect transaction data in real time. The company wants to prevent data duplication and does not want to manage infrastructure. The company wants to perform additional processing on the data after the data is collected. Which solution will meet these requirements?
A. Configure an Amazon Simple Queue Service (Amazon SQS) FIFO queue. Configure an AWS Lambda function with an event source mapping for the FIFO queue to process the data.
B. Configure an Amazon Simple Queue Service (Amazon SQS) FIFO queue. Use an AWS Batch job to remove duplicate data from the queue. Configure an AWS Lambda function to process the data.
C. Use Amazon Kinesis Data Streams to send the incoming transaction data to an AWS Batch job that removes duplicate data. Launch an Amazon EC2 instance that runs a custom script to process the data.
D. Set up an AWS Step Functions state machine to send incoming transaction data to an AWS Lambda function to remove duplicate data. Launch an Amazon EC2 instance that runs a custom script to process the data.
Question # 72
A company wants to isolate its workloads by creating an AWS account for each workload. The company needs a solution that centrally manages networking components for the workloads. The solution also must create accounts with automatic security controls (guardrails). Which solution will meet these requirements with the LEAST operational overhead?
A. Use AWS Control Tower to deploy accounts. Create a networking account that has a VPC with private subnets and public subnets. Use AWS Resource Access Manager (AWS RAM) to share the subnets with the workload accounts.
B. Use AWS Organizations to deploy accounts. Create a networking account that has a VPC with private subnets and public subnets. Use AWS Resource Access Manager (AWS RAM) to share the subnets with the workload accounts.
C. Use AWS Control Tower to deploy accounts. Deploy a VPC in each workload account. Configure each VPC to route through an inspection VPC by using a transit gateway attachment.
D. Use AWS Organizations to deploy accounts. Deploy a VPC in each workload account. Configure each VPC to route through an inspection VPC by using a transit gateway attachment.
Question # 73
A company's web application consists of multiple Amazon EC2 instances that run behind an Application Load Balancer in a VPC. An Amazon RDS for MySQL DB instance contains the data. The company needs the ability to automatically detect and respond to suspicious or unexpected behavior in its AWS environment. The company already has added AWS WAF to its architecture. What should a solutions architect do next to protect against threats?
A. Use Amazon GuardDuty to perform threat detection. Configure Amazon EventBridge to filter for GuardDuty findings and to invoke an AWS Lambda function to adjust the AWS WAF rules.
B. Use AWS Firewall Manager to perform threat detection. Configure Amazon EventBridge to filter for Firewall Manager findings and to invoke an AWS Lambda function to adjust the AWS WAF web ACL.
C. Use Amazon Inspector to perform threat detection and to update the AWS WAF rules. Create a VPC network ACL to limit access to the web application.
D. Use Amazon Macie to perform threat detection and to update the AWS WAF rules. Create a VPC network ACL to limit access to the web application.
Question # 74
A company is storing petabytes of data in Amazon S3 Standard. The data is stored in multiple S3 buckets and is accessed with varying frequency. The company does not know access patterns for all the data. The company needs to implement a solution for each S3 bucket to optimize the cost of S3 usage. Which solution will meet these requirements with the MOST operational efficiency?
A. Create an S3 Lifecycle configuration with a rule to transition the objects in the S3 bucket to S3 Intelligent-Tiering.
B. Use the S3 storage class analysis tool to determine the correct tier for each object in the S3 bucket. Move each object to the identified storage tier.
C. Create an S3 Lifecycle configuration with a rule to transition the objects in the S3 bucket to S3 Glacier Instant Retrieval.
D. Create an S3 Lifecycle configuration with a rule to transition the objects in the S3 bucket to S3 One Zone-Infrequent Access (S3 One Zone-IA).
Question # 75
A company needs to optimize its Amazon S3 storage costs for an application that generates many files that cannot be recreated. Each file is approximately 5 MB and is stored in Amazon S3 Standard storage. The company must store the files for 4 years before the files can be deleted. The files must be immediately accessible. The files are frequently accessed in the first 30 days of object creation, but they are rarely accessed after the first 30 days. Which solution will meet these requirements MOST cost-effectively?
A. Create an S3 Lifecycle policy to move the files to S3 Glacier Instant Retrieval 30 days after object creation. Delete the files 4 years after object creation.
B. Create an S3 Lifecycle policy to move the files to S3 One Zone-Infrequent Access (S3 One Zone-IA) 30 days after object creation. Delete the files 4 years after object creation.
C. Create an S3 Lifecycle policy to move the files to S3 Standard-Infrequent Access (S3 Standard-IA) 30 days after object creation. Delete the files 4 years after object creation.
D. Create an S3 Lifecycle policy to move the files to S3 Standard-Infrequent Access (S3 Standard-IA) 30 days after object creation. Move the files to S3 Glacier Flexible Retrieval 4 years after object creation.
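A lifecycle configuration of the kind described in these options can be expressed as a single rule, as in the sketch below. The bucket name is a placeholder, and 4 years is approximated as 1,460 days.

    # Minimal sketch: transition objects to S3 Standard-IA after 30 days and expire them after ~4 years.
    # Bucket name is a placeholder.
    import boto3

    s3 = boto3.client("s3")

    s3.put_bucket_lifecycle_configuration(
        Bucket="example-files-bucket",
        LifecycleConfiguration={
            "Rules": [{
                "ID": "ia-after-30-days-delete-after-4-years",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},   # apply to every object in the bucket
                "Transitions": [{"Days": 30, "StorageClass": "STANDARD_IA"}],
                "Expiration": {"Days": 1460},
            }]
        },
    )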
Question # 76
A company is planning to migrate data to an Amazon S3 bucket. The data must be encrypted at rest within the S3 bucket. The encryption key must be rotated automatically every year. Which solution will meet these requirements with the LEAST operational overhead?
A. Migrate the data to the S3 bucket. Use server-side encryption with Amazon S3 managed keys (SSE-S3). Use the built-in key rotation behavior of SSE-S3 encryption keys.
B. Create an AWS Key Management Service (AWS KMS) customer managed key. Enable automatic key rotation. Set the S3 bucket's default encryption behavior to use the customer managed KMS key. Migrate the data to the S3 bucket.
C. Create an AWS Key Management Service (AWS KMS) customer managed key. Set the S3 bucket's default encryption behavior to use the customer managed KMS key. Migrate the data to the S3 bucket. Manually rotate the KMS key every year.
D. Use customer key material to encrypt the data. Migrate the data to the S3 bucket. Create an AWS Key Management Service (AWS KMS) key without key material. Import the customer key material into the KMS key. Enable automatic key rotation.
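The customer managed key approach described above combines three API calls: create the key, enable automatic rotation, and set the bucket's default encryption. A sketch with placeholder names follows.

    # Minimal sketch: customer managed KMS key with automatic rotation as the bucket's default encryption.
    # Bucket name and key description are placeholders.
    import boto3

    kms = boto3.client("kms")
    s3 = boto3.client("s3")

    key = kms.create_key(Description="Default encryption key for the migration bucket")
    key_id = key["KeyMetadata"]["KeyId"]

    kms.enable_key_rotation(KeyId=key_id)   # KMS rotates the key material automatically every year

    s3.put_bucket_encryption(
        Bucket="example-migration-bucket",
        ServerSideEncryptionConfiguration={
            "Rules": [{
                "ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "aws:kms", "KMSMasterKeyID": key_id},
                "BucketKeyEnabled": True,   # reduces KMS request costs
            }]
        },
    )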
Question # 77
An online photo-sharing company stores its photos in an Amazon S3 bucket that exists in the us-west-1 Region. The company needs to store a copy of all new photos in the us-east-1 Region. Which solution will meet this requirement with the LEAST operational effort?
A. Create a second S3 bucket in us-east-1. Use S3 Cross-Region Replication to copy photos from the existing S3 bucket to the second S3 bucket.
B. Create a cross-origin resource sharing (CORS) configuration of the existing S3 bucket. Specify us-east-1 in the CORS rule's AllowedOrigin element.
C. Create a second S3 bucket in us-east-1 across multiple Availability Zones. Create an S3 Lifecycle rule to save photos into the second S3 bucket.
D. Create a second S3 bucket in us-east-1. Configure S3 event notifications on object creation and update events to invoke an AWS Lambda function to copy photos from the existing S3 bucket to the second S3 bucket.
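For the Cross-Region Replication option, both buckets need versioning enabled and an IAM role that S3 can assume; the replication rule itself might look like the sketch below. The bucket names, account ID, and role are placeholders.

    # Minimal sketch: replicate new objects from a us-west-1 bucket to a us-east-1 bucket.
    # Bucket names, account ID, and the replication role are placeholders; versioning must
    # already be enabled on both buckets.
    import boto3

    s3 = boto3.client("s3")

    s3.put_bucket_replication(
        Bucket="photos-us-west-1",
        ReplicationConfiguration={
            "Role": "arn:aws:iam::111122223333:role/s3-replication-role",
            "Rules": [{
                "ID": "replicate-new-photos",
                "Priority": 1,
                "Status": "Enabled",
                "Filter": {"Prefix": ""},                        # replicate the whole bucket
                "DeleteMarkerReplication": {"Status": "Disabled"},
                "Destination": {"Bucket": "arn:aws:s3:::photos-us-east-1"},
            }],
        },
    )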
Question # 78
A robotics company is designing a solution for medical surgery. The robots will use advanced sensors, cameras, and AI algorithms to perceive their environment and to complete surgeries. The company needs a public load balancer in the AWS Cloud that will ensure seamless communication with backend services. The load balancer must be capable of routing traffic based on the query strings to different target groups. The traffic must also be encrypted. Which solution will meet these requirements?
A. Use a Network Load Balancer with a certificate attached from AWS Certificate Manager (ACM). Use query parameter-based routing.
B. Use a Gateway Load Balancer. Import a generated certificate in AWS Identity and Access Management (IAM). Attach the certificate to the load balancer. Use HTTP path-based routing.
C. Use an Application Load Balancer with a certificate attached from AWS Certificate Manager (ACM). Use query parameter-based routing.
D. Use a Network Load Balancer. Import a generated certificate in AWS Identity and Access Management (IAM). Attach the certificate to the load balancer. Use query parameter-based routing.
Question # 79
A company's application is running on Amazon EC2 instances within an Auto Scaling group behind an Elastic Load Balancing (ELB) load balancer. Based on the application's history, the company anticipates a spike in traffic during a holiday each year. A solutions architect must design a strategy to ensure that the Auto Scaling group proactively increases capacity to minimize any performance impact on application users. Which solution will meet these requirements?
A. Create an Amazon CloudWatch alarm to scale up the EC2 instances when CPU utilization exceeds 90%.
B. Create a recurring scheduled action to scale up the Auto Scaling group before the expected period of peak demand.
C. Increase the minimum and maximum number of EC2 instances in the Auto Scaling group during the peak demand period.
D. Configure an Amazon Simple Notification Service (Amazon SNS) notification to send alerts when there are autoscaling:EC2_INSTANCE_LAUNCH events.
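A recurring scheduled action of the kind described above can be created ahead of the expected peak, as in the sketch below. The group name, cron schedule, and capacity values are placeholder assumptions.

    # Minimal sketch: scale the Auto Scaling group up ahead of a known yearly peak.
    # Group name, cron schedule, and capacities are placeholders.
    import boto3

    autoscaling = boto3.client("autoscaling")

    autoscaling.put_scheduled_update_group_action(
        AutoScalingGroupName="web-app-asg",
        ScheduledActionName="holiday-peak-scale-up",
        Recurrence="0 8 20 11 *",   # 08:00 UTC on November 20 every year (example schedule)
        MinSize=4,
        MaxSize=20,
        DesiredCapacity=10,
    )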
Question # 80
A company manages a data lake in an Amazon S3 bucket that numerous applications access. The S3 bucket contains a unique prefix for each application. The company wants to restrict each application to its specific prefix and to have granular control of the objects under each prefix. Which solution will meet these requirements with the LEAST operational overhead?
A. Create dedicated S3 access points and access point policies for each application.
B. Create an S3 Batch Operations job to set the ACL permissions for each object in the S3 bucket.
C. Replicate the objects in the S3 bucket to new S3 buckets for each application. Create replication rules by prefix.
D. Replicate the objects in the S3 bucket to new S3 buckets for each application. Create dedicated S3 access points for each application.
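Dedicated access points as referenced above are created per application and carry their own policies scoped to a prefix. The sketch below uses placeholder account, bucket, role, Region, and prefix names.

    # Minimal sketch: one access point per application, scoped to that application's prefix.
    # Account ID, bucket, role, Region, and prefix names are placeholders.
    import json
    import boto3

    s3control = boto3.client("s3control")
    account_id = "111122223333"

    s3control.create_access_point(AccountId=account_id, Name="app-a-ap", Bucket="company-data-lake")

    s3control.put_access_point_policy(
        AccountId=account_id,
        Name="app-a-ap",
        Policy=json.dumps({
            "Version": "2012-10-17",
            "Statement": [{
                "Effect": "Allow",
                "Principal": {"AWS": f"arn:aws:iam::{account_id}:role/app-a-role"},
                "Action": ["s3:GetObject", "s3:PutObject"],
                "Resource": f"arn:aws:s3:us-east-1:{account_id}:accesspoint/app-a-ap/object/app-a/*",
            }],
        }),
    )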
Question # 81
A company is migrating its workloads to AWS. The company has sensitive and critical data in on-premises relational databases that run on SQL Server instances. The company wants to use the AWS Cloud to increase security and reduce operational overhead for the databases. Which solution will meet these requirements?
A. Migrate the databases to Amazon EC2 instances. Use an AWS Key Management Service (AWS KMS) AWS managed key for encryption.
B. Migrate the databases to a Multi-AZ Amazon RDS for SQL Server DB instance. Use an AWS Key Management Service (AWS KMS) AWS managed key for encryption.
C. Migrate the data to an Amazon S3 bucket. Use Amazon Macie to ensure data security.
D. Migrate the databases to an Amazon DynamoDB table. Use Amazon CloudWatch Logs to ensure data security.
Question # 82
A company runs workloads in the AWS Cloud. The company wants to centrally collect security data to assess security across the entire company and to improve workload protection. Which solution will meet these requirements with the LEAST development effort?
A. Configure a data lake in AWS Lake Formation. Use AWS Glue crawlers to ingest the security data into the data lake.
B. Configure an AWS Lambda function to collect the security data in CSV format. Upload the data to an Amazon S3 bucket.
C. Configure a data lake in Amazon Security Lake to collect the security data. Upload the data to an Amazon S3 bucket.
D. Configure an AWS Database Migration Service (AWS DMS) replication instance to load the security data into an Amazon RDS cluster.
Question # 83
A company has separate AWS accounts for its finance, data analytics, and development departments. Because of costs and security concerns, the company wants to control which services each AWS account can use. Which solution will meet these requirements with the LEAST operational overhead?
A. Use AWS Systems Manager templates to control which AWS services each department can use.
B. Create organizational units (OUs) for each department in AWS Organizations. Attach service control policies (SCPs) to the OUs.
C. Use AWS CloudFormation to automatically provision only the AWS services that each department can use.
D. Set up a list of products in AWS Service Catalog in the AWS accounts to manage and control the usage of specific AWS services.
Question # 84
A global company runs its workloads on AWS. The company's application uses Amazon S3 buckets across AWS Regions for sensitive data storage and analysis. The company stores millions of objects in multiple S3 buckets daily. The company wants to identify all S3 buckets that are not versioning-enabled. Which solution will meet these requirements?
A. Set up an AWS CloudTrail event that has a rule to identify all S3 buckets that are not versioning-enabled across Regions.
B. Use Amazon S3 Storage Lens to identify all S3 buckets that are not versioning-enabled across Regions.
C. Enable IAM Access Analyzer for S3 to identify all S3 buckets that are not versioning-enabled across Regions.
D. Create an S3 Multi-Region Access Point to identify all S3 buckets that are not versioning-enabled across Regions.
Question # 85
A company is designing an event-driven order processing system. Each order requires multiple validation steps after the order is created. An independent AWS Lambda function performs each validation step. Each validation step is independent from the other validation steps. Individual validation steps need only a subset of the order event information. The company wants to ensure that each validation step Lambda function has access to only the information from the order event that the function requires. The components of the order processing system should be loosely coupled to accommodate future business changes. Which solution will meet these requirements?
A. Create an Amazon Simple Queue Service (Amazon SQS) queue for each validation step. Create a new Lambda function to transform the order data to the format that each validation step requires and to publish the messages to the appropriate SQS queues. Subscribe each validation step Lambda function to its corresponding SQS queue.
B. Create an Amazon Simple Notification Service (Amazon SNS) topic. Subscribe the validation step Lambda functions to the SNS topic. Use message body filtering to send only the required data to each subscribed Lambda function.
C. Create an Amazon EventBridge event bus. Create an event rule for each validation step. Configure the input transformer to send only the required data to each target validation step Lambda function.
D. Create an Amazon Simple Queue Service (Amazon SQS) queue. Create a new Lambda function to subscribe to the SQS queue and to transform the order data to the format that each validation step requires. Use the new Lambda function to perform synchronous invocations of the validation step Lambda functions in parallel on separate threads.
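The EventBridge option above relies on input transformers, which pass each validation Lambda function only the fields it needs. The bus name, event pattern, field paths, and function ARN in the sketch below are placeholder assumptions.

    # Minimal sketch: an EventBridge rule whose target receives only selected fields from the order event.
    # Bus name, event pattern, field paths, and the Lambda ARN are placeholders.
    import json
    import boto3

    events = boto3.client("events")

    events.put_rule(
        Name="order-created-address-validation",
        EventBusName="orders-bus",
        EventPattern=json.dumps({"source": ["orders.service"], "detail-type": ["OrderCreated"]}),
    )

    events.put_targets(
        Rule="order-created-address-validation",
        EventBusName="orders-bus",
        Targets=[{
            "Id": "address-validator",
            "Arn": "arn:aws:lambda:us-east-1:111122223333:function:validate-address",
            "InputTransformer": {
                "InputPathsMap": {"orderId": "$.detail.orderId", "address": "$.detail.shippingAddress"},
                "InputTemplate": '{"orderId": <orderId>, "address": <address>}',
            },
        }],
    )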
Question # 86
A company uses Amazon API Gateway to manage its REST APIs that third-party service providers access. The company must protect the REST APIs from SQL injection and cross-site scripting attacks. What is the MOST operationally efficient solution that meets these requirements?
A. Configure AWS Shield.
B. Configure AWS WAF.
C. Set up API Gateway with an Amazon CloudFront distribution. Configure AWS Shield in CloudFront.
D. Set up API Gateway with an Amazon CloudFront distribution. Configure AWS WAF in CloudFront.
Question # 87
A company has multiple VPCs across AWS Regions to support and run workloads that are isolated from workloads in other Regions. Because of a recent application launch requirement, the company's VPCs must communicate with all other VPCs across all Regions. Which solution will meet these requirements with the LEAST amount of administrative effort?
A. Use VPC peering to manage VPC communication in a single Region. Use VPC peering across Regions to manage VPC communications.
B. Use AWS Direct Connect gateways across all Regions to connect VPCs across Regions and manage VPC communications.
C. Use AWS Transit Gateway to manage VPC communication in a single Region and Transit Gateway peering across Regions to manage VPC communications.
D. Use AWS PrivateLink across all Regions to connect VPCs across Regions and manage VPC communications.
Question # 88
A company is creating a prototype of an ecommerce website on AWS. The website consists of an Application Load Balancer, an Auto Scaling group of Amazon EC2 instances for web servers, and an Amazon RDS for MySQL DB instance that runs with the Single-AZ configuration. The website is slow to respond during searches of the product catalog. The product catalog is a group of tables in the MySQL database that the company does not update frequently. A solutions architect has determined that the CPU utilization on the DB instance is high when product catalog searches occur. What should the solutions architect recommend to improve the performance of the website during searches of the product catalog?
A. Migrate the product catalog to an Amazon Redshift database. Use the COPY command to load the product catalog tables.
B. Implement an Amazon ElastiCache for Redis cluster to cache the product catalog. Use lazy loading to populate the cache.
C. Add an additional scaling policy to the Auto Scaling group to launch additional EC2 instances when database response is slow.
D. Turn on the Multi-AZ configuration for the DB instance. Configure the EC2 instances to throttle the product catalog queries that are sent to the database.
Question # 89
A global ecommerce company runs its critical workloads on AWS. The workloads use an Amazon RDS for PostgreSQL DB instance that is configured for a Multi-AZ deployment. Customers have reported application timeouts when the company undergoes database failovers. The company needs a resilient solution to reduce failover time. Which solution will meet these requirements?
A. Create an Amazon RDS Proxy. Assign the proxy to the DB instance.
B. Create a read replica for the DB instance. Move the read traffic to the read replica.
C. Enable Performance Insights. Monitor the CPU load to identify the timeouts.
D. Take regular automatic snapshots. Copy the automatic snapshots to multiple AWS Regions.
Question # 90
A company wants to use Amazon Elastic Container Service (Amazon ECS) to run its on-premises application in a hybrid environment. The application currently runs on containers on premises. The company needs a single container solution that can scale in an on-premises, hybrid, or cloud environment. The company must run new application containers in the AWS Cloud and must use a load balancer for HTTP traffic. Which combination of actions will meet these requirements? (Select TWO.)
A. Set up an ECS cluster that uses the AWS Fargate launch type for the cloud application containers. Use an Amazon ECS Anywhere external launch type for the on-premises application containers.
B. Set up an Application Load Balancer for cloud ECS services.
C. Set up a Network Load Balancer for cloud ECS services.
D. Set up an ECS cluster that uses the AWS Fargate launch type. Use Fargate for the cloud application containers and the on-premises application containers.
E. Set up an ECS cluster that uses the Amazon EC2 launch type for the cloud application containers. Use Amazon ECS Anywhere with an AWS Fargate launch type for the on-premises application containers.
Question # 91
A company runs an application that uses Amazon RDS for PostgreSQL. The application receives traffic only on weekdays during business hours. The company wants to optimize costs and reduce operational overhead based on this usage. Which solution will meet these requirements?
A. Use the Instance Scheduler on AWS to configure start and stop schedules.
B. Turn off automatic backups. Create weekly manual snapshots of the database.
C. Create a custom AWS Lambda function to start and stop the database based on minimum CPU utilization.
D. Purchase All Upfront reserved DB instances.
Question # 92
A company is preparing to store confidential data in Amazon S3. For compliance reasons, the data must be encrypted at rest. Encryption key usage must be logged for auditing purposes. Keys must be rotated every year. Which solution meets these requirements and is the MOST operationally efficient?
A. Server-side encryption with customer-provided keys (SSE-C)
B. Server-side encryption with Amazon S3 managed keys (SSE-S3)
C. Server-side encryption with AWS KMS keys (SSE-KMS) with manual rotation
D. Server-side encryption with AWS KMS keys (SSE-KMS) with automatic rotation
Question # 93
A news company that has reporters all over the world is hosting its broadcast system on AWS. The reporters send live broadcasts to the broadcast system. The reporters use software on their phones to send live streams through the Real Time Messaging Protocol (RTMP). A solutions architect must design a solution that gives the reporters the ability to send the highest quality streams. The solution must provide accelerated TCP connections back to the broadcast system. What should the solutions architect use to meet these requirements?
A. Amazon CloudFront
B. AWS Global Accelerator
C. AWS Client VPN
D. Amazon EC2 instances and AWS Elastic IP addresses
Question # 94
A solutions architect is creating an application. The application will run on Amazon EC2 instances in private subnets across multiple Availability Zones in a VPC. The EC2 instances will frequently access large files that contain confidential information. These files are stored in Amazon S3 buckets for processing. The solutions architect must optimize the network architecture to minimize data transfer costs. What should the solutions architect do to meet these requirements?
A. Create a gateway endpoint for Amazon S3 in the VPC. In the route tables for the private subnets, add an entry for the gateway endpoint.
B. Create a single NAT gateway in a public subnet. In the route tables for the private subnets, add a default route that points to the NAT gateway.
C. Create an AWS PrivateLink interface endpoint for Amazon S3 in the VPC. In the route tables for the private subnets, add an entry for the interface endpoint.
D. Create one NAT gateway for each Availability Zone in public subnets. In each of the route tables for the private subnets, add a default route that points to the NAT gateway in the same Availability Zone.
Question # 95
A company plans to run a high performance computing (HPC) workload on Amazon EC2 instances. The workload requires low-latency network performance and high network throughput with tightly coupled node-to-node communication. Which solution will meet these requirements?
A. Configure the EC2 instances to be part of a cluster placement group.
B. Launch the EC2 instances with Dedicated Instance tenancy.
C. Launch the EC2 instances as Spot Instances.
D. Configure an On-Demand Capacity Reservation when the EC2 instances are launched.
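A cluster placement group such as the one referenced above is created once and then referenced at launch, as in the sketch below. The group name, AMI ID, instance type, and counts are placeholder assumptions; HPC workloads typically choose instance types with enhanced networking support.

    # Minimal sketch: create a cluster placement group and launch instances into it.
    # Group name, AMI ID, instance type, and counts are placeholders.
    import boto3

    ec2 = boto3.client("ec2")

    ec2.create_placement_group(GroupName="hpc-cluster-pg", Strategy="cluster")

    ec2.run_instances(
        ImageId="ami-0123456789abcdef0",
        InstanceType="c5n.18xlarge",
        MinCount=4,
        MaxCount=4,
        Placement={"GroupName": "hpc-cluster-pg"},
    )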
Question # 96
A company uses Amazon EC2 instances and Amazon Elastic Block Store (Amazon EBS) to run its self-managed database. The company has 350 TB of data spread across all EBS volumes. The company takes daily EBS snapshots and keeps the snapshots for 1 month. The daily change rate is 5% of the EBS volumes. Because of new regulations, the company needs to keep the monthly snapshots for 7 years. The company needs to change its backup strategy to comply with the new regulations and to ensure that data is available with minimal administrative effort. Which solution will meet these requirements MOST cost-effectively?
A. Keep the daily snapshot in the EBS snapshot standard tier for 1 month. Copy the monthly snapshot to Amazon S3 Glacier Deep Archive with a 7-year retention period.
B. Continue with the current EBS snapshot policy. Add a new policy to move the monthly snapshot to Amazon EBS Snapshots Archive with a 7-year retention period.
C. Keep the daily snapshot in the EBS snapshot standard tier for 1 month. Keep the monthly snapshot in the standard tier for 7 years. Use incremental snapshots.
D. Keep the daily snapshot in the EBS snapshot standard tier. Use EBS direct APIs to take snapshots of all the EBS volumes every month. Store the snapshots in an Amazon S3 bucket in the Infrequent Access tier for 7 years.
Question # 97
A company has an application that serves clients that are deployed in more than 20,000 retail storefront locations around the world. The application consists of backend web services that are exposed over HTTPS on port 443. The application is hosted on Amazon EC2 instances behind an Application Load Balancer (ALB). The retail locations communicate with the web application over the public internet. The company allows each retail location to register the IP address that the retail location has been allocated by its local ISP. The company's security team recommends increasing the security of the application endpoint by restricting access to only the IP addresses registered by the retail locations. What should a solutions architect do to meet these requirements?
A. Associate an AWS WAF web ACL with the ALB. Use IP rule sets on the ALB to filter traffic. Update the IP addresses in the rule to include the registered IP addresses.
B. Deploy AWS Firewall Manager to manage the ALB. Configure firewall rules to restrict traffic to the ALB. Modify the firewall rules to include the registered IP addresses.
C. Store the IP addresses in an Amazon DynamoDB table. Configure an AWS Lambda authorization function on the ALB to validate that incoming requests are from the registered IP addresses.
D. Configure the network ACL on the subnet that contains the public interface of the ALB. Update the ingress rules on the network ACL with entries for each of the registered IP addresses.
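For the AWS WAF option above, the registered addresses typically live in an IP set that a web ACL rule references; the web ACL itself (created separately) is then associated with the ALB. The names, CIDR entries, and ARNs in the sketch below are placeholder assumptions.

    # Minimal sketch: create an IP set of registered retail addresses and associate an existing
    # web ACL (which references the IP set in an allow rule) with the ALB. Names and ARNs are placeholders.
    import boto3

    wafv2 = boto3.client("wafv2")

    wafv2.create_ip_set(
        Name="registered-retail-locations",
        Scope="REGIONAL",                      # REGIONAL scope is used for ALBs
        IPAddressVersion="IPV4",
        Addresses=["203.0.113.10/32", "198.51.100.0/24"],
        Description="IP addresses registered by retail locations",
    )

    wafv2.associate_web_acl(
        WebACLArn="arn:aws:wafv2:us-east-1:111122223333:regional/webacl/retail-acl/EXAMPLE-ID",
        ResourceArn="arn:aws:elasticloadbalancing:us-east-1:111122223333:loadbalancer/app/retail-alb/1234567890abcdef",
    )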
Question # 98
A company has an application that customers use to upload images to an Amazon S3 bucket. Each night, the company launches an Amazon EC2 Spot Fleet that processes all the images that the company received that day. The processing for each image takes 2 minutes and requires 512 MB of memory. A solutions architect needs to change the application to process the images when the images are uploaded. Which change will meet these requirements MOST cost-effectively?
A. Use S3 Event Notifications to write a message with image details to an Amazon Simple Queue Service (Amazon SQS) queue. Configure an AWS Lambda function to read the messages from the queue and to process the images.
B. Use S3 Event Notifications to write a message with image details to an Amazon Simple Queue Service (Amazon SQS) queue. Configure an EC2 Reserved Instance to read the messages from the queue and to process the images.
C. Use S3 Event Notifications to publish a message with image details to an Amazon Simple Notification Service (Amazon SNS) topic. Configure a container instance in Amazon Elastic Container Service (Amazon ECS) to subscribe to the topic and to process the images.
D. Use S3 Event Notifications to publish a message with image details to an Amazon Simple Notification Service (Amazon SNS) topic. to subscribe to the topic and to process the images.
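For the SQS-based option, the wiring is an S3 event notification to the queue plus a Lambda event source mapping on that queue. The bucket, queue ARN, and function name below are placeholder assumptions, and the queue policy is assumed to already allow S3 to send messages.

    # Minimal sketch: notify an SQS queue on object creation, then let Lambda poll the queue.
    # Bucket, queue ARN, and function name are placeholders; the queue policy must permit s3.amazonaws.com.
    import boto3

    s3 = boto3.client("s3")
    lambda_client = boto3.client("lambda")

    s3.put_bucket_notification_configuration(
        Bucket="customer-image-uploads",
        NotificationConfiguration={
            "QueueConfigurations": [{
                "QueueArn": "arn:aws:sqs:us-east-1:111122223333:image-processing-queue",
                "Events": ["s3:ObjectCreated:*"],
            }]
        },
    )

    lambda_client.create_event_source_mapping(
        EventSourceArn="arn:aws:sqs:us-east-1:111122223333:image-processing-queue",
        FunctionName="process-uploaded-image",
        BatchSize=10,
    )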
Question # 99
A company has a web application that has thousands of users. The application uses 8-10 user-uploaded images to generate AI images. Users can download the generated AI images once every 6 hours. The company also has a premium user option that gives users the ability to download the generated AI images anytime. The company uses the user-uploaded images to run AI model training twice a year. The company needs a storage solution to store the images. Which storage solution meets these requirements MOST cost-effectively?
A. Move uploaded images to Amazon S3 Glacier Deep Archive. Move premium user-generated AI images to S3 Standard. Move non-premium user-generated AI images to S3 Standard-Infrequent Access (S3 Standard-IA).
B. Move uploaded images to Amazon S3 Glacier Deep Archive. Move all generated AI images to S3 Glacier Flexible Retrieval.
C. Move uploaded images to Amazon S3 One Zone-Infrequent Access (S3 One Zone-IA). Move premium user-generated AI images to S3 Standard. Move non-premium user-generated AI images to S3 Standard-Infrequent Access (S3 Standard-IA).
D. Move uploaded images to Amazon S3 One Zone-Infrequent Access (S3 One Zone-IA). Move all generated AI images to S3 Glacier Flexible Retrieval.
Question # 100
A company wants to build a map of its IT infrastructure to identify and enforce policies on resources that pose security risks. The company's security team must be able to query data in the IT infrastructure map and quickly identify security risks. Which solution will meet these requirements with the LEAST operational overhead?
A. Use Amazon RDS to store the data. Use SQL to query the data to identify security risks.
B. Use Amazon Neptune to store the data. Use SPARQL to query the data to identifysecurity risks.
C. Use Amazon Redshift to store the data. Use SQL to query the data to identify securityrisks.
D. Use Amazon DynamoDB to store the data. Use PartiQL to query the data to identifysecurity risks.
Question # 101
A company maintains about 300 TB in Amazon S3 Standard storage month after month. The S3 objects are each typically around 50 GB in size and are frequently replaced with multipart uploads by their global application. The number and size of S3 objects remain constant, but the company's S3 storage costs are increasing each month. How should a solutions architect reduce costs in this situation?
A. Switch from multipart uploads to Amazon S3 Transfer Acceleration.
B. Enable an S3 Lifecycle policy that deletes incomplete multipart uploads.
C. Configure S3 inventory to prevent objects from being archived too quickly.
D. Configure Amazon CloudFront to reduce the number of objects stored in Amazon S3.
Question # 102
A company is building a microservices-based application that will be deployed on Amazon Elastic Kubernetes Service (Amazon EKS). The microservices will interact with each other. The company wants to ensure that the application is observable to identify performance issues in the future. Which solution will meet these requirements?
A. Configure the application to use Amazon ElastiCache to reduce the number of requests that are sent to the microservices.
B. Configure Amazon CloudWatch Container Insights to collect metrics from the EKS clusters. Configure AWS X-Ray to trace the requests between the microservices.
C. Configure AWS CloudTrail to review the API calls. Build an Amazon QuickSight dashboard to observe the microservice interactions.
D. Use AWS Trusted Advisor to understand the performance of the application.
Question # 103
A company has a multi-tier payment processing application that is based on virtual machines (VMs). The communication between the tiers occurs asynchronously through a third-party middleware solution that guarantees exactly-once delivery. The company needs a solution that requires the least amount of infrastructure management. The solution must guarantee exactly-once delivery for application messaging. Which combination of actions will meet these requirements? (Select TWO.)
A. Use AWS Lambda for the compute layers in the architecture.
B. Use Amazon EC2 instances for the compute layers in the architecture.
C. Use Amazon Simple Notification Service (Amazon SNS) as the messaging component between the compute layers.
D. Use Amazon Simple Queue Service (Amazon SQS) FIFO queues as the messaging component between the compute layers.
E. Use containers that are based on Amazon Elastic Kubernetes Service (Amazon EKS) for the compute layers in the architecture.
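As background for the FIFO option above, exactly-once-style processing in SQS comes from combining a FIFO queue, message deduplication, and message group ordering. The queue name, group ID, and payload in the sketch below are placeholder assumptions.

    # Minimal sketch: a FIFO queue with content-based deduplication and an ordered message group.
    # Queue name, group ID, and message body are placeholders.
    import boto3

    sqs = boto3.client("sqs")

    queue = sqs.create_queue(
        QueueName="payments.fifo",
        Attributes={"FifoQueue": "true", "ContentBasedDeduplication": "true"},
    )

    sqs.send_message(
        QueueUrl=queue["QueueUrl"],
        MessageBody='{"paymentId": "p-1001", "amount": 25.00}',
        MessageGroupId="payments",   # messages in the same group are delivered in order
    )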
Question # 104
A company has a mobile game that reads most of its metadata from an Amazon RDS DB instance. As the game increased in popularity, developers noticed slowdowns related to the game's metadata load times. Performance metrics indicate that simply scaling the database will not help. A solutions architect must explore all options that include capabilities for snapshots, replication, and sub-millisecond response times. What should the solutions architect recommend to solve these issues?
A. Migrate the database to Amazon Aurora with Aurora Replicas.
B. Migrate the database to Amazon DynamoDB with global tables.
C. Add an Amazon ElastiCache for Redis layer in front of the database.
D. Add an Amazon ElastiCache for Memcached layer in front of the database.
Question # 105
A financial company needs to handle highly sensitive data. The company will store the data in an Amazon S3 bucket. The company needs to ensure that the data is encrypted in transit and at rest. The company must manage the encryption keys outside the AWS Cloud. Which solution will meet these requirements?
A. Encrypt the data in the S3 bucket with server-side encryption (SSE) that uses an AWS Key Management Service (AWS KMS) customer managed key.
B. Encrypt the data in the S3 bucket with server-side encryption (SSE) that uses an AWS Key Management Service (AWS KMS) AWS managed key.
C. Encrypt the data in the S3 bucket with the default server-side encryption (SSE).
D. Encrypt the data at the company's data center before storing the data in the S3 bucket.
Question # 106
A company has an on-premises data center that is running out of storage capacity. The company wants to migrate its storage infrastructure to AWS while minimizing bandwidth costs. The solution must allow for immediate retrieval of data at no additional cost. How can these requirements be met?
A. Deploy Amazon S3 Glacier Vault and enable expedited retrieval. Enable provisioned retrieval capacity for the workload.
B. Deploy AWS Storage Gateway using cached volumes. Use Storage Gateway to store data in Amazon S3 while retaining copies of frequently accessed data subsets locally.
C. Deploy AWS Storage Gateway using stored volumes to store data locally. Use Storage Gateway to asynchronously back up point-in-time snapshots of the data to Amazon S3.
D. Deploy AWS Direct Connect to connect with the on-premises data center. Configure AWS Storage Gateway to store data locally. Use Storage Gateway to asynchronously back up point-in-time snapshots of the data to Amazon S3.
Question # 107
A company has a business-critical application that runs on Amazon EC2 instances. The application stores data in an Amazon DynamoDB table. The company must be able to revert the table to any point within the last 24 hours. Which solution meets these requirements with the LEAST operational overhead?
A. Configure point-in-time recovery for the table.
B. Use AWS Backup for the table.
C. Use an AWS Lambda function to make an on-demand backup of the table every hour.
D. Turn on streams on the table to capture a log of all changes to the table in the last 24 hours. Store a copy of the stream in an Amazon S3 bucket.
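Point-in-time recovery as referenced above is a single table setting; a restore then creates a new table at a chosen timestamp. The table names and restore time in the sketch below are placeholder assumptions.

    # Minimal sketch: enable point-in-time recovery, then restore to a new table at a given timestamp.
    # Table names and the restore time are placeholders.
    from datetime import datetime, timezone
    import boto3

    dynamodb = boto3.client("dynamodb")

    dynamodb.update_continuous_backups(
        TableName="orders",
        PointInTimeRecoverySpecification={"PointInTimeRecoveryEnabled": True},
    )

    dynamodb.restore_table_to_point_in_time(
        SourceTableName="orders",
        TargetTableName="orders-restored",
        RestoreDateTime=datetime(2024, 9, 1, 12, 0, tzinfo=timezone.utc),
    )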
Question # 108
A company's website hosted on Amazon EC2 instances processes classified data stored in Amazon S3. Due to security concerns, the company requires a private and secure connection between its EC2 resources and Amazon S3. Which solution meets these requirements?
A. Set up S3 bucket policies to allow access from a VPC endpoint.
B. Set up an IAM policy to grant read-write access to the S3 bucket.
C. Set up a NAT gateway to access resources outside the private subnet.
D. Set up an access key ID and a secret access key to access the S3 bucket.
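A bucket policy that limits access to a specific VPC endpoint, as in the first option above, commonly uses the aws:SourceVpce condition key. The bucket name and endpoint ID in the sketch below are placeholder assumptions.

    # Minimal sketch: deny S3 access unless requests arrive through a specific VPC endpoint.
    # Bucket name and endpoint ID are placeholders.
    import json
    import boto3

    s3 = boto3.client("s3")

    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "DenyAccessOutsideVpcEndpoint",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::classified-data-bucket",
                "arn:aws:s3:::classified-data-bucket/*",
            ],
            "Condition": {"StringNotEquals": {"aws:SourceVpce": "vpce-0123456789abcdef0"}},
        }],
    }

    s3.put_bucket_policy(Bucket="classified-data-bucket", Policy=json.dumps(policy))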
Question # 109
A company wants to use NAT gateways in its AWS environment. The company's Amazon EC2 instances in private subnets must be able to connect to the public internet through the NAT gateways. Which solution will meet these requirements?
A. Create public NAT gateways in the same private subnets as the EC2 instances.
B. Create private NAT gateways in the same private subnets as the EC2 instances.
C. Create public NAT gateways in public subnets in the same VPCs as the EC2 instances.
D. Create private NAT gateways in public subnets in the same VPCs as the EC2 instances.
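As context for the options above, a public NAT gateway lives in a public subnet, and the private subnets' route table points its default route at it, as sketched below with placeholder subnet, allocation, and route table IDs.

    # Minimal sketch: public NAT gateway in a public subnet plus a default route from the private subnets.
    # Subnet, allocation, and route table IDs are placeholders.
    import boto3

    ec2 = boto3.client("ec2")

    eip = ec2.allocate_address(Domain="vpc")

    nat = ec2.create_nat_gateway(
        SubnetId="subnet-0a1b2c3d4e5f67890",       # a public subnet
        AllocationId=eip["AllocationId"],
        ConnectivityType="public",
    )

    ec2.get_waiter("nat_gateway_available").wait(NatGatewayIds=[nat["NatGateway"]["NatGatewayId"]])

    ec2.create_route(
        RouteTableId="rtb-0a1b2c3d4e5f67890",      # route table of the private subnets
        DestinationCidrBlock="0.0.0.0/0",
        NatGatewayId=nat["NatGateway"]["NatGatewayId"],
    )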
Question # 110
A company wants to migrate an on-premises legacy application to AWS. The application ingests customer order files from an on-premises enterprise resource planning (ERP) system. The application then uploads the files to an SFTP server. The application uses a scheduled job that checks for order files every hour. The company already has an AWS account that has connectivity to the on-premises network. The new application on AWS must support integration with the existing ERP system. The new application must be secure and resilient and must use the SFTP protocol to process orders from the ERP system immediately. Which solution will meet these requirements?
A. Create an AWS Transfer Family SFTP internet-facing server in two Availability Zones. Use Amazon S3 storage. Create an AWS Lambda function to process order files. Use S3 Event Notifications to send s3:ObjectCreated:* events to the Lambda function.
B. Create an AWS Transfer Family SFTP internet-facing server in one Availability Zone. Use Amazon Elastic File System (Amazon EFS) storage. Create an AWS Lambda function to process order files. Use a Transfer Family managed workflow to invoke the Lambda function.
C. Create an AWS Transfer Family SFTP internal server in two Availability Zones. Use Amazon Elastic File System (Amazon EFS) storage. Create an AWS Step Functions state machine to process order files. Use Amazon EventBridge Scheduler to invoke the state machine to periodically check Amazon EFS for order files.
D. Create an AWS Transfer Family SFTP internal server in two Availability Zones. Use Amazon S3 storage. Create an AWS Lambda function to process order files. Use a Transfer Family managed workflow to invoke the Lambda function.
Question # 111
A company needs to create an AWS Lambda function that will run in a VPC in the company's primary AWS account. The Lambda function needs to access files that the company stores in an Amazon Elastic File System (Amazon EFS) file system. The EFS file system is located in a secondary AWS account. As the company adds files to the file system, the solution must scale to meet the demand. Which solution will meet these requirements MOST cost-effectively?
A. Create a new EFS file system in the primary account. Use AWS DataSync to copy the contents of the original EFS file system to the new EFS file system.
B. Create a VPC peering connection between the VPCs that are in the primary account and the secondary account.
C. Create a second Lambda function in the secondary account that has a mount that is configured for the file system. Use the primary account's Lambda function to invoke the secondary account's Lambda function.
D. Move the contents of the file system to a Lambda layer. Configure the Lambda layer's permissions to allow the company's secondary account to use the Lambda layer.
Question # 112
A company uses Amazon S3 to store high-resolution pictures in an S3 bucket. To minimize application changes, the company stores the pictures as the latest version of an S3 object. The company needs to retain only the two most recent versions of the pictures. The company wants to reduce costs. The company has identified the S3 bucket as a large expense. Which solution will reduce the S3 costs with the LEAST operational overhead?
A. Use S3 Lifecycle to delete expired object versions and retain the two most recent versions.
B. Use an AWS Lambda function to check for older versions and delete all but the two most recent versions.
C. Use S3 Batch Operations to delete noncurrent object versions and retain only the two most recent versions.
D. Deactivate versioning on the S3 bucket and retain the two most recent versions.
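The lifecycle-based option above maps to a noncurrent-version expiration rule that keeps one newer noncurrent version, so the current version plus the previous one remain. The bucket name and the grace period in the sketch below are placeholder assumptions.

    # Minimal sketch: keep the current version plus one noncurrent version; expire older versions.
    # Bucket name and the NoncurrentDays value are placeholders.
    import boto3

    s3 = boto3.client("s3")

    s3.put_bucket_lifecycle_configuration(
        Bucket="high-res-pictures",
        LifecycleConfiguration={
            "Rules": [{
                "ID": "retain-two-most-recent-versions",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},
                "NoncurrentVersionExpiration": {
                    "NoncurrentDays": 1,             # wait at least a day before expiring
                    "NewerNoncurrentVersions": 1,    # keep only the most recent noncurrent version
                },
            }]
        },
    )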
Question # 113
A company is creating an application. The company stores data from tests of the application in multiple on-premises locations. The company needs to connect the on-premises locations to VPCs in an AWS Region in the AWS Cloud. The number of accounts and VPCs will increase during the next year. The network architecture must simplify the administration of new connections and must provide the ability to scale. Which solution will meet these requirements with the LEAST administrative overhead?
A. Create a peering connection between the VPCs. Create a VPN connection between the VPCs and the on-premises locations.
B. Launch an Amazon EC2 instance. On the instance, include VPN software that uses a VPN connection to connect all VPCs and on-premises locations.
C. Create a transit gateway. Create VPC attachments for the VPC connections. Create VPN attachments for the on-premises connections.
D. Create an AWS Direct Connect connection between the on-premises locations and a central VPC. Connect the central VPC to other VPCs by using peering connections.
Question # 114
A company is developing a new mobile app. The company must implement proper traffic filtering to protect its Application Load Balancer (ALB) against common application-level attacks, such as cross-site scripting or SQL injection. The company has minimal infrastructure and operational staff. The company needs to reduce its share of the responsibility in managing, updating, and securing servers for its AWS environment. What should a solutions architect recommend to meet these requirements?
A. Configure AWS WAF rules and associate them with the ALB.
B. Deploy the application using Amazon S3 with public hosting enabled.
C. Deploy AWS Shield Advanced and add the ALB as a protected resource.
D. Create a new ALB that directs traffic to an Amazon EC2 instance running a third-partyfirewall, which then passes the traffic to the current ALB.
Question # 115
A company's security team requests that network traffic be captured in VPC Flow Logs. The logs will be frequently accessed for 90 days and then accessed intermittently. What should a solutions architect do to meet these requirements when configuring the logs?
A. Use Amazon CloudWatch as the target. Set the CloudWatch log group with an expiration of 90 days
B. Use Amazon Kinesis as the target. Configure the Kinesis stream to always retain the logs for 90 days.
C. Use AWS CloudTrail as the target. Configure CloudTrail to save to an Amazon S3 bucket, and enable S3 Intelligent-Tiering.
D. Use Amazon S3 as the target. Enable an S3 Lifecycle policy to transition the logs to S3 Standard-Infrequent Access (S3 Standard-IA) after 90 days.
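To illustrate option D, the sketch below publishes VPC Flow Logs to an S3 bucket and adds a lifecycle rule that transitions the logs to S3 Standard-IA after 90 days. The VPC ID and bucket name are placeholders.

```python
import boto3

ec2 = boto3.client("ec2")
s3 = boto3.client("s3")

# Deliver flow logs for the VPC (placeholder ID) directly to S3.
ec2.create_flow_logs(
    ResourceIds=["vpc-0123456789abcdef0"],
    ResourceType="VPC",
    TrafficType="ALL",
    LogDestinationType="s3",
    LogDestination="arn:aws:s3:::example-flow-logs-bucket",
)

# Transition the log objects to Standard-IA once they are 90 days old.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-flow-logs-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "flow-logs-to-standard-ia",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},
                "Transitions": [{"Days": 90, "StorageClass": "STANDARD_IA"}],
            }
        ]
    },
)
```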
Question # 116
A company is developing a mobile game that streams score updates to a backend processor and then posts results on a leaderboard. A solutions architect needs to design a solution that can handle large traffic spikes, process the mobile game updates in order of receipt, and store the processed updates in a highly available database. The company also wants to minimize the management overhead required to maintain the solution. What should the solutions architect do to meet these requirements?
A. Push score updates to Amazon Kinesis Data Streams. Process the updates in Kinesis Data Streams with AWS Lambda. Store the processed updates in Amazon DynamoDB.
B. Push score updates to Amazon Kinesis Data Streams. Process the updates with a fleet of Amazon EC2 instances set up for Auto Scaling. Store the processed updates in Amazon Redshift.
C. Push score updates to an Amazon Simple Notification Service (Amazon SNS) topic. Subscribe an AWS Lambda function to the SNS topic to process the updates. Store the processed updates in a SQL database running on Amazon EC2.
D. Push score updates to an Amazon Simple Queue Service (Amazon SQS) queue. Use a fleet of Amazon EC2 instances with Auto Scaling to process the updates in the SQS queue. Store the processed updates in an Amazon RDS Multi-AZ DB instance.
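A rough sketch of the processing step in option A: a Lambda function configured with a Kinesis event source decodes each record and writes it to DynamoDB. The table and field names are illustrative placeholders.

```python
import base64
import json

import boto3

dynamodb = boto3.resource("dynamodb")
leaderboard = dynamodb.Table("Leaderboard")  # placeholder table name


def handler(event, context):
    """Invoked by the Kinesis event source mapping; records arrive in shard order."""
    for record in event["Records"]:
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        # Store the processed score update (illustrative attribute names).
        leaderboard.put_item(
            Item={
                "player_id": payload["player_id"],
                "game_id": payload["game_id"],
                "score": int(payload["score"]),
                "received_at": int(record["kinesis"]["approximateArrivalTimestamp"]),
            }
        )
    return {"processed": len(event["Records"])}
```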
Question # 117
A company runs an SMB file server in its data center. The file server stores large files that the company frequently accesses for up to 7 days after the file creation date. After 7 days, the company needs to be able to access the files with a maximum retrieval time of 24 hours. Which solution will meet these requirements?
A. Use AWS DataSync to copy data that is older than 7 days from the SMB file server to AWS.
B. Create an Amazon S3 File Gateway to increase the company's storage space. Create an S3 Lifecycle policy to transition the data to S3 Glacier Deep Archive after 7 days.
C. Create an Amazon FSx File Gateway to increase the company's storage space. Create an Amazon S3 Lifecycle policy to transition the data after 7 days.
D. Configure access to Amazon S3 for each user. Create an S3 Lifecycle policy to transition the data to S3 Glacier Flexible Retrieval after 7 days.
Question # 118
A company has an organization in AWS Organizations that has all features enabled. The company requires that all API calls and logins in any existing or new AWS account must be audited. The company needs a managed solution to prevent additional work and to minimize costs. The company also needs to know when any AWS account is not compliant with the AWS Foundational Security Best Practices (FSBP) standard. Which solution will meet these requirements with the LEAST operational overhead?
A. Deploy an AWS Control Tower environment in the Organizations management account. Enable AWS Security Hub and AWS Control Tower Account Factory in the environment.
B. Deploy an AWS Control Tower environment in a dedicated Organizations member account. Enable AWS Security Hub and AWS Control Tower Account Factory in the environment.
C. Use AWS Managed Services (AMS) Accelerate to build a multi-account landing zone (MALZ). Submit an RFC to self-service provision Amazon GuardDuty in the MALZ.
D. Use AWS Managed Services (AMS) Accelerate to build a multi-account landing zone (MALZ). Submit an RFC to self-service provision AWS Security Hub in the MALZ.
Question # 119
A solutions architect is designing a user authentication solution for a company. The solution must invoke two-factor authentication for users that log in from inconsistent geographic locations, IP addresses, or devices. The solution must also be able to scale up to accommodate millions of users. Which solution will meet these requirements?
A. Configure Amazon Cognito user pools for user authentication. Enable the risk-based adaptive authentication feature with multi-factor authentication (MFA).
B. Configure Amazon Cognito identity pools for user authentication. Enable multi-factor authentication (MFA).
C. Configure AWS Identity and Access Management (IAM) users for user authentication. Attach an IAM policy that allows the AllowManageOwnUserMFA action.
D. Configure AWS IAM Identity Center (AWS Single Sign-On) authentication for user authentication. Configure the permission sets to require multi-factor authentication (MFA).
Question # 120
A solutions architect needs to design the architecture for an application that a vendor provides as a Docker container image. The container needs 50 GB of storage available for temporary files. The infrastructure must be serverless. Which solution meets these requirements with the LEAST operational overhead?
A. Create an AWS Lambda function that uses the Docker container image with an Amazon S3 mounted volume that has more than 50 GB of space.
B. Create an AWS Lambda function that uses the Docker container image with an Amazon Elastic Block Store (Amazon EBS) volume that has more than 50 GB of space.
C. Create an Amazon Elastic Container Service (Amazon ECS) cluster that uses the AWS Fargate launch type. Create a task definition for the container image with an Amazon Elastic File System (Amazon EFS) volume. Create a service with that task definition.
D. Create an Amazon Elastic Container Service (Amazon ECS) cluster that uses the Amazon EC2 launch type with an Amazon Elastic Block Store (Amazon EBS) volume that has more than 50 GB of space. Create a task definition for the container image. Create a service with that task definition.
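A minimal sketch of option C: a Fargate task definition that mounts an EFS file system for the container's temporary files. The image, role ARN, and file system ID are placeholders.

```python
import boto3

ecs = boto3.client("ecs")

ecs.register_task_definition(
    family="vendor-app",
    requiresCompatibilities=["FARGATE"],
    networkMode="awsvpc",
    cpu="1024",
    memory="2048",
    executionRoleArn="arn:aws:iam::123456789012:role/ecsTaskExecutionRole",  # placeholder
    containerDefinitions=[
        {
            "name": "vendor-app",
            "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/vendor-app:latest",  # placeholder
            "essential": True,
            "mountPoints": [
                {"sourceVolume": "scratch", "containerPath": "/scratch"}
            ],
        }
    ],
    volumes=[
        {
            "name": "scratch",
            # EFS is elastic, so the 50 GB of temporary files fit without pre-provisioning.
            "efsVolumeConfiguration": {"fileSystemId": "fs-0123456789abcdef0"},
        }
    ],
)
```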
Question # 121
A company uses AWS Organizations to run workloads within multiple AWS accounts. A tagging policy adds department tags to AWS resources when the company creates tags. An accounting team needs to determine spending on Amazon EC2 consumption. The accounting team must determine which departments are responsible for the costs regardless of AWS account. The accounting team has access to AWS Cost Explorer for all AWS accounts within the organization and needs to access all reports from Cost Explorer. Which solution meets these requirements in the MOST operationally efficient way?
A. From the Organizations management account billing console, activate a user-defined cost allocation tag named department. Create one cost report in Cost Explorer grouping by tag name, and filter by EC2.
B. From the Organizations management account billing console, activate an AWS-defined cost allocation tag named department. Create one cost report in Cost Explorer grouping by tag name, and filter by EC2.
C. From the Organizations member account billing console, activate a user-defined cost allocation tag named department. Create one cost report in Cost Explorer grouping by the tag name, and filter by EC2.
D. From the Organizations member account billing console, activate an AWS-defined cost allocation tag named department. Create one cost report in Cost Explorer grouping by tag name, and filter by EC2.
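To show what the Cost Explorer report in option A looks like programmatically, the sketch below queries monthly EC2 costs grouped by the department cost allocation tag through the Cost Explorer API. The date range and service filter value are illustrative.

```python
import boto3

ce = boto3.client("ce")  # Cost Explorer

response = ce.get_cost_and_usage(
    TimePeriod={"Start": "2024-08-01", "End": "2024-09-01"},  # illustrative month
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    # Group results by the user-defined cost allocation tag "department".
    GroupBy=[{"Type": "TAG", "Key": "department"}],
    # Restrict the report to EC2 compute usage.
    Filter={
        "Dimensions": {
            "Key": "SERVICE",
            "Values": ["Amazon Elastic Compute Cloud - Compute"],
        }
    },
)

for result in response["ResultsByTime"]:
    for group in result["Groups"]:
        print(group["Keys"], group["Metrics"]["UnblendedCost"]["Amount"])
```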
Question # 122
A company is building an Amazon Elastic Kubernetes Service (Amazon EKS) cluster for its workloads. All secrets that are stored in Amazon EKS must be encrypted in the Kubernetes etcd key-value store. Which solution will meet these requirements?
A. Create a new AWS Key Management Service (AWS KMS) key. Use AWS Secrets Manager to manage, rotate, and store all secrets in Amazon EKS.
B. Create a new AWS Key Management Service (AWS KMS) key. Enable Amazon EKS KMS secrets encryption on the Amazon EKS cluster.
C. Create the Amazon EKS cluster with default options. Use the Amazon Elastic Block Store (Amazon EBS) Container Storage Interface (CSI) driver as an add-on.
D. Create a new AWS Key Management Service (AWS KMS) key with the alias/aws/ebs alias. Enable default Amazon Elastic Block Store (Amazon EBS) volume encryption for the account.
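For option B, envelope encryption of Kubernetes secrets can be enabled at cluster creation (or added to an existing cluster with associate_encryption_config). The sketch below uses boto3 with placeholder ARNs and subnet IDs.

```python
import boto3

eks = boto3.client("eks")

eks.create_cluster(
    name="workloads-cluster",
    roleArn="arn:aws:iam::123456789012:role/eks-cluster-role",  # placeholder
    resourcesVpcConfig={
        "subnetIds": ["subnet-0123456789abcdef0", "subnet-0fedcba9876543210"]  # placeholders
    },
    # Encrypt Kubernetes secrets in etcd with the customer managed KMS key.
    encryptionConfig=[
        {
            "resources": ["secrets"],
            "provider": {
                "keyArn": "arn:aws:kms:us-east-1:123456789012:key/1234abcd-12ab-34cd-56ef-1234567890ab"
            },
        }
    ],
)
```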
Question # 123
A retail company has several businesses. The IT team for each business manages its own AWS account. Each team account is part of an organization in AWS Organizations. Each team monitors its product inventory levels in an Amazon DynamoDB table in the team's own AWS account. The company is deploying a central inventory reporting application into a shared AWS account. The application must be able to read items from all the teams' DynamoDB tables. Which authentication option will meet these requirements MOST securely?
A. Integrate DynamoDB with AWS Secrets Manager in the inventory application account. Configure the application to use the correct secret from Secrets Manager to authenticate and read the DynamoDB table. Schedule secret rotation for every 30 days.
B. In every business account, create an IAM user that has programmatic access. Configure the application to use the correct IAM user access key ID and secret access key to authenticate and read the DynamoDB table. Manually rotate IAM access keys every 30 days.
C. In every business account, create an IAM role named BU_ROLE with a policy that gives the role access to the DynamoDB table and a trust policy to trust a specific role in the inventory application account. In the inventory account, create a role named APP_ROLE that allows access to the STS AssumeRole API operation. Configure the application to use APP_ROLE and assume the cross-account role BU_ROLE to read the DynamoDB table.
D. Integrate DynamoDB with AWS Certificate Manager (ACM). Generate identity certificates to authenticate DynamoDB. Configure the application to use the correct certificate to authenticate and read the DynamoDB table.
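A short sketch of the cross-account pattern in option C: the application (running as APP_ROLE) calls STS to assume BU_ROLE in a business account and reads that account's table with the temporary credentials. The account ID and table name are placeholders; the role names come from the option text.

```python
import boto3

sts = boto3.client("sts")

# Assume the business unit role in the member account (placeholder account ID).
assumed = sts.assume_role(
    RoleArn="arn:aws:iam::222233334444:role/BU_ROLE",
    RoleSessionName="inventory-report",
)
creds = assumed["Credentials"]

# Use the temporary credentials to read the business unit's DynamoDB table.
dynamodb = boto3.resource(
    "dynamodb",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)
items = dynamodb.Table("ProductInventory").scan()["Items"]  # placeholder table name
print(len(items), "inventory records read")
```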
Question # 124
A company built an application with Docker containers and needs to run the application in the AWS Cloud. The company wants to use a managed service to host the application. The solution must scale in and out appropriately according to demand on the individual container services. The solution also must not result in additional operational overhead or infrastructure to manage. Which solutions will meet these requirements? (Select TWO.)
A. Use Amazon Elastic Container Service (Amazon ECS) with AWS Fargate.
B. Use Amazon Elastic Kubernetes Service (Amazon EKS) with AWS Fargate.
C. Provision an Amazon API Gateway API. Connect the API to AWS Lambda to run the containers.
D. Use Amazon Elastic Container Service (Amazon ECS) with Amazon EC2 worker nodes.
E. Use Amazon Elastic Kubernetes Service (Amazon EKS) with Amazon EC2 worker nodes.
Question # 125
A company uses Amazon S3 as its data lake. The company has a new partner that must use SFTP to upload data files. A solutions architect needs to implement a highly available SFTP solution that minimizes operational overhead. Which solution will meet these requirements?
A. Use AWS Transfer Family to configure an SFTP-enabled server with a publicly accessible endpoint. Choose the S3 data lake as the destination.
B. Use Amazon S3 File Gateway as an SFTP server. Expose the S3 File Gateway endpoint URL to the new partner. Share the S3 File Gateway endpoint with the new partner.
C. Launch an Amazon EC2 instance in a private subnet in a VPC. Instruct the new partner to upload files to the EC2 instance by using a VPN. Run a cron job script on the EC2 instance to upload files to the S3 data lake.
D. Launch Amazon EC2 instances in a private subnet in a VPC. Place a Network Load Balancer (NLB) in front of the EC2 instances. Create an SFTP listener port for the NLB. Share the NLB hostname with the new partner. Run a cron job script on the EC2 instances to upload files to the S3 data lake.
Question # 126
A company hosts an application used to upload files to an Amazon S3 bucket. Once uploaded, the files are processed to extract metadata, which takes less than 5 seconds. The volume and frequency of the uploads varies from a few files each hour to hundreds of concurrent uploads. The company has asked a solutions architect to design a cost-effective architecture that will meet these requirements. What should the solutions architect recommend?
A. Configure AWS CloudTrail trails to log S3 API calls. Use AWS AppSync to process the files.
B. Configure an object-created event notification within the S3 bucket to invoke an AWS Lambda function to process the files.
C. Configure Amazon Kinesis Data Streams to process and send data to Amazon S3. Invoke an AWS Lambda function to process the files.
D. Configure an Amazon Simple Notification Service (Amazon SNS) topic to process the files uploaded to Amazon S3. Invoke an AWS Lambda function to process the files.
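For option B, the bucket notification that wires object-created events to a Lambda function can be set up as sketched below; the function must also allow s3.amazonaws.com to invoke it. The bucket name and function ARN are placeholders.

```python
import boto3

s3 = boto3.client("s3")
lambda_client = boto3.client("lambda")

function_arn = "arn:aws:lambda:us-east-1:123456789012:function:extract-metadata"  # placeholder

# Allow the S3 bucket to invoke the function.
lambda_client.add_permission(
    FunctionName="extract-metadata",
    StatementId="allow-s3-invoke",
    Action="lambda:InvokeFunction",
    Principal="s3.amazonaws.com",
    SourceArn="arn:aws:s3:::example-upload-bucket",
)

# Invoke the function for every object-created event in the bucket.
s3.put_bucket_notification_configuration(
    Bucket="example-upload-bucket",
    NotificationConfiguration={
        "LambdaFunctionConfigurations": [
            {"LambdaFunctionArn": function_arn, "Events": ["s3:ObjectCreated:*"]}
        ]
    },
)
```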
Question # 127
A company runs analytics software on Amazon EC2 instances. The software accepts job requests from users to process data that has been uploaded to Amazon S3. Users report that some submitted data is not being processed. Amazon CloudWatch reveals that the EC2 instances have a consistent CPU utilization at or near 100%. The company wants to improve system performance and scale the system based on user load. What should a solutions architect do to meet these requirements?
A. Create a copy of the instance. Place all instances behind an Application Load Balancer.
B. Create an S3 VPC endpoint for Amazon S3. Update the software to reference the endpoint.
C. Stop the EC2 instances. Modify the instance type to one with a more powerful CPU andmore memory. Restart the instances.
D. Route incoming requests to Amazon Simple Queue Service (Amazon SQS). Configure an EC2 Auto Scaling group based on queue size. Update the software to read from the queue.
Question # 128
A company is deploying an application that processes streaming data in near-real time. The company plans to use Amazon EC2 instances for the workload. The network architecture must be configurable to provide the lowest possible latency between nodes. Which combination of network solutions will meet these requirements? (Select TWO.)
A. Enable and configure enhanced networking on each EC2 instance
B. Group the EC2 instances in separate accounts
C. Run the EC2 instances in a cluster placement group
D. Attach multiple elastic network interfaces to each EC2 instance
E. Use Amazon Elastic Block Store (Amazon EBS) optimized instance types.
Question # 129
A company runs a container application on a Kubernetes cluster in the company's data center. The application uses Advanced Message Queuing Protocol (AMQP) to communicate with a message queue. The data center cannot scale fast enough to meet the company's expanding business needs. The company wants to migrate the workloads to AWS. Which solution will meet these requirements with the LEAST operational overhead?
A. Migrate the container application to Amazon Elastic Container Service (Amazon ECS). Use Amazon Simple Queue Service (Amazon SQS) to retrieve the messages.
B. Migrate the container application to Amazon Elastic Kubernetes Service (Amazon EKS). Use Amazon MQ to retrieve the messages.
C. Use highly available Amazon EC2 instances to run the application. Use Amazon MQ to retrieve the messages.
D. Use AWS Lambda functions to run the application. Use Amazon Simple Queue Service (Amazon SQS) to retrieve the messages.
Question # 130
A company runs a real-time data ingestion solution on AWS. The solution consists of the most recent version of Amazon Managed Streaming for Apache Kafka (Amazon MSK). The solution is deployed in a VPC in private subnets across three Availability Zones. A solutions architect needs to redesign the data ingestion solution to be publicly available over the internet. The data in transit must also be encrypted. Which solution will meet these requirements with the MOST operational efficiency?
A. Configure public subnets in the existing VPC. Deploy an MSK cluster in the public subnets. Update the MSK cluster security settings to enable mutual TLS authentication.
B. Create a new VPC that has public subnets. Deploy an MSK cluster in the public subnets. Update the MSK cluster security settings to enable mutual TLS authentication.
C. Deploy an Application Load Balancer (ALB) that uses private subnets. Configure an ALB security group inbound rule to allow inbound traffic from the VPC CIDR block for HTTPS protocol.
D. Deploy a Network Load Balancer (NLB) that uses private subnets. Configure an NLB listener for HTTPS communication over the internet.
Question # 131
A company runs a Java-based job on an Amazon EC2 instance. The job runs every hour and takes 10 seconds to run. The job runs on a scheduled interval and consumes 1 GB of memory. The CPU utilization of the instance is low except for short surges during which the job uses the maximum CPU available. The company wants to optimize the costs to run the job. Which solution will meet these requirements?
A. Use AWS App2Container (A2C) to containerize the job. Run the job as an Amazon Elastic Container Service (Amazon ECS) task on AWS Fargate with 0.5 virtual CPU (vCPU) and 1 GB of memory.
B. Copy the code into an AWS Lambda function that has 1 GB of memory. Create an Amazon EventBridge scheduled rule to run the code each hour.
C. Use AWS App2Container (A2C) to containerize the job. Install the container in the existing Amazon Machine Image (AMI). Ensure that the schedule stops the container when the task finishes.
D. Configure the existing schedule to stop the EC2 instance at the completion of the job and restart the EC2 instance when the next job starts.
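For option B, the hourly trigger is simply an EventBridge rule with the Lambda function as its target, plus a resource-based permission on the function. The function name and ARN below are placeholders.

```python
import boto3

events = boto3.client("events")
lambda_client = boto3.client("lambda")

function_arn = "arn:aws:lambda:us-east-1:123456789012:function:hourly-java-job"  # placeholder

# Run once per hour.
rule = events.put_rule(Name="hourly-job-schedule", ScheduleExpression="rate(1 hour)")

# Point the rule at the Lambda function.
events.put_targets(
    Rule="hourly-job-schedule",
    Targets=[{"Id": "hourly-job-target", "Arn": function_arn}],
)

# Let EventBridge invoke the function.
lambda_client.add_permission(
    FunctionName="hourly-java-job",
    StatementId="allow-eventbridge-invoke",
    Action="lambda:InvokeFunction",
    Principal="events.amazonaws.com",
    SourceArn=rule["RuleArn"],
)
```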
Question # 132
An ecommerce company runs applications in AWS accounts that are part of an organization in AWS Organizations. The applications run on Amazon Aurora PostgreSQL databases across all the accounts. The company needs to prevent malicious activity and must identify abnormal failed and incomplete login attempts to the databases. Which solution will meet these requirements in the MOST operationally efficient way?
A. Attach service control policies (SCPs) to the root of the organization to identify the failed login attempts.
B. Enable the Amazon RDS Protection feature in Amazon GuardDuty for the member accounts of the organization.
C. Publish the Aurora general logs to a log group in Amazon CloudWatch Logs. Export the log data to a central Amazon S3 bucket.
D. Publish all the Aurora PostgreSQL database events in AWS CloudTrail to a central Amazon S3 bucket.
Question # 133
A company needs to provide customers with secure access to its data. The company processes customer data and stores the results in an Amazon S3 bucket. All the data is subject to strong regulations and security requirements. The data must be encrypted at rest. Each customer must be able to access only their data from their AWS account. Company employees must not be able to access the data. Which solution will meet these requirements?
A. Provision an AWS Certificate Manager (ACM) certificate for each customer. Encrypt the data client-side. In the private certificate policy, deny access to the certificate for all principals except an IAM role that the customer provides.
B. Provision a separate AWS Key Management Service (AWS KMS) key for each customer. Encrypt the data server-side. In the S3 bucket policy, deny decryption of data for all principals except an IAM role that the customer provides.
C. Provision a separate AWS Key Management Service (AWS KMS) key for each customer. Encrypt the data server-side. In each KMS key policy, deny decryption of data for all principals except an IAM role that the customer provides.
D. Provision an AWS Certificate Manager (ACM) certificate for each customer. Encrypt the data client-side. In the public certificate policy, deny access to the certificate for all principals except an IAM role that the customer provides.
Question # 134
A company has a nightly batch processing routine that analyzes report files that an on-premises file system receives daily through SFTP. The company wants to move the solution to the AWS Cloud. The solution must be highly available and resilient. The solution also must minimize operational effort. Which solution meets these requirements?
A. Deploy AWS Transfer for SFTP and an Amazon Elastic File System (Amazon EFS) file system for storage. Use an Amazon EC2 instance in an Auto Scaling group with a scheduled scaling policy to run the batch operation.
B. Deploy an Amazon EC2 instance that runs Linux and an SFTP service. Use an Amazon Elastic Block Store (Amazon EBS) volume for storage. Use an Auto Scaling group with the minimum number of instances and desired number of instances set to 1.
C. Deploy an Amazon EC2 instance that runs Linux and an SFTP service. Use an Amazon Elastic File System (Amazon EFS) file system for storage. Use an Auto Scaling group with the minimum number of instances and desired number of instances set to 1.
D. Deploy AWS Transfer for SFTP and an Amazon S3 bucket for storage. Modify the application to pull the batch files from Amazon S3 to an Amazon EC2 instance for processing. Use an EC2 instance in an Auto Scaling group with a scheduled scaling policy to run the batch operation.
Question # 135
A company uses high concurrency AWS Lambda functions to process a constantly increasing number of messages in a message queue during marketing events. The Lambda functions use CPU-intensive code to process the messages. The company wants to reduce the compute costs and to maintain service latency for its customers. Which solution will meet these requirements?
A. Configure reserved concurrency for the Lambda functions. Decrease the memory allocated to the Lambda functions.
B. Configure reserved concurrency for the Lambda functions. Increase the memory according to AWS Compute Optimizer recommendations.
C. Configure provisioned concurrency for the Lambda functions. Decrease the memory allocated to the Lambda functions.
D. Configure provisioned concurrency for the Lambda functions. Increase the memory according to AWS Compute Optimizer recommendations.
Question # 136
A company runs applications on AWS that connect to the company's Amazon RDS database. The applications scale on weekends and at peak times of the year. The company wants to scale the database more effectively for its applications that connect to the database. Which solution will meet these requirements with the LEAST operational overhead?
A. Use Amazon DynamoDB with connection pooling with a target group configuration for the database. Change the applications to use the DynamoDB endpoint.
B. Use Amazon RDS Proxy with a target group for the database. Change the applications to use the RDS Proxy endpoint.
C. Use a custom proxy that runs on Amazon EC2 as an intermediary to the database. Change the applications to use the custom proxy endpoint.
D. Use an AWS Lambda function to provide connection pooling with a target group configuration for the database. Change the applications to use the Lambda function.
Question # 137
A company wants to run its payment application on AWS. The application receives payment notifications from mobile devices. Payment notifications require a basic validation before they are sent for further processing. The backend processing application is long running and requires compute and memory to be adjusted. The company does not want to manage the infrastructure. Which solution will meet these requirements with the LEAST operational overhead?
A. Create an Amazon Simple Queue Service (Amazon SQS) queue. Integrate the queue with an Amazon EventBridge rule to receive payment notifications from mobile devices. Configure the rule to validate payment notifications and send the notifications to the backend application. Deploy the backend application on Amazon Elastic Kubernetes Service (Amazon EKS) Anywhere. Create a standalone cluster.
B. Create an Amazon API Gateway API. Integrate the API with an AWS Step Functions state machine to receive payment notifications from mobile devices. Invoke the state machine to validate payment notifications and send the notifications to the backend application. Deploy the backend application on Amazon Elastic Kubernetes Service (Amazon EKS). Configure an EKS cluster with self-managed nodes.
C. Create an Amazon Simple Queue Service (Amazon SQS) queue. Integrate the queue with an Amazon EventBridge rule to receive payment notifications from mobile devices. Configure the rule to validate payment notifications and send the notifications to the backend application. Deploy the backend application on Amazon EC2 Spot Instances. Configure a Spot Fleet with a default allocation strategy.
D. Create an Amazon API Gateway API. Integrate the API with AWS Lambda to receive payment notifications from mobile devices. Invoke a Lambda function to validate payment notifications and send the notifications to the backend application. Deploy the backend application on Amazon Elastic Container Service (Amazon ECS). Configure Amazon ECS with an AWS Fargate launch type.
Question # 138
A company has multiple AWS accounts with applications deployed in the us-west-2 Region. Application logs are stored within Amazon S3 buckets in each account. The company wants to build a centralized log analysis solution that uses a single S3 bucket. Logs must not leave us-west-2, and the company wants to incur minimal operational overhead. Which solution meets these requirements and is MOST cost-effective?
A. Create an S3 Lifecycle policy that copies the objects from one of the application S3 buckets to the centralized S3 bucket.
B. Use S3 Same-Region Replication to replicate logs from the S3 buckets to another S3 bucket in us-west-2. Use this S3 bucket for log analysis.
C. Write a script that uses the PutObject API operation every day to copy the entire contents of the buckets to another S3 bucket in us-west-2. Use this S3 bucket for log analysis.
D. Write AWS Lambda functions in these accounts that are triggered every time logs are delivered to the S3 buckets (s3:ObjectCreated:* event). Copy the logs to another S3 bucket in us-west-2. Use this S3 bucket for log analysis.
Question # 139
A company runs a highly available web application on Amazon EC2 instances behind an Application Load Balancer. The company uses Amazon CloudWatch metrics. As the traffic to the web application increases, some EC2 instances become overloaded with many outstanding requests. The CloudWatch metrics show that the number of requests processed and the time to receive the responses from some EC2 instances are both higher compared to other EC2 instances. The company does not want new requests to be forwarded to the EC2 instances that are already overloaded. Which solution will meet these requirements?
A. Use the round robin routing algorithm based on the RequestCountPerTarget and ActiveConnectionCount CloudWatch metrics.
B. Use the least outstanding requests algorithm based on the RequestCountPerTarget and ActiveConnectionCount CloudWatch metrics.
C. Use the round robin routing algorithm based on the RequestCount and TargetResponseTime CloudWatch metrics.
D. Use the least outstanding requests algorithm based on the RequestCount and TargetResponseTime CloudWatch metrics.
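The least outstanding requests behavior is configured as a target group attribute on the ALB. A minimal sketch (target group ARN is a placeholder):

```python
import boto3

elbv2 = boto3.client("elbv2")

# Switch the target group from round robin to least outstanding requests so new
# requests avoid targets that already have many in-flight requests.
elbv2.modify_target_group_attributes(
    TargetGroupArn="arn:aws:elasticloadbalancing:us-east-1:123456789012:targetgroup/web/abc123",  # placeholder
    Attributes=[
        {"Key": "load_balancing.algorithm.type", "Value": "least_outstanding_requests"}
    ],
)
```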
Question # 140
An analytics company uses Amazon VPC to run its multi-tier services. The company wants to use RESTful APIs to offer a web analytics service to millions of users. Users must be verified by using an authentication service to access the APIs. Which solution will meet these requirements with the MOST operational efficiency?
A. Configure an Amazon Cognito user pool for user authentication. Implement Amazon API Gateway REST APIs with a Cognito authorizer.
B. Configure an Amazon Cognito identity pool for user authentication. Implement Amazon API Gateway HTTP APIs with a Cognito authorizer.
C. Configure an AWS Lambda function to handle user authentication. Implement Amazon API Gateway REST APIs with a Lambda authorizer.
D. Configure an IAM user to handle user authentication. Implement Amazon API Gateway HTTP APIs with an IAM authorizer.
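For option A, a Cognito user pool authorizer can be attached to an API Gateway REST API as sketched below. The REST API ID and user pool ARN are placeholders.

```python
import boto3

apigw = boto3.client("apigateway")

# Create a Cognito user pool authorizer on an existing REST API (placeholder IDs).
apigw.create_authorizer(
    restApiId="a1b2c3d4e5",
    name="cognito-user-pool-authorizer",
    type="COGNITO_USER_POOLS",
    providerARNs=[
        "arn:aws:cognito-idp:us-east-1:123456789012:userpool/us-east-1_EXAMPLE"
    ],
    # API Gateway validates the JWT passed in the Authorization header.
    identitySource="method.request.header.Authorization",
)
```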
Question # 141
A company has an AWS Direct Connect connection from its on-premises location to an AWS account. The AWS account has 30 different VPCs in the same AWS Region. The VPCs use private virtual interfaces (VIFs). Each VPC has a CIDR block that does not overlap with other networks under the company's control. The company wants to centrally manage the networking architecture while still allowing each VPC to communicate with all other VPCs and on-premises networks. Which solution will meet these requirements with the LEAST amount of operational overhead?
A. Create a transit gateway and associate the Direct Connect connection with a new transit VIF. Turn on the transit gateway's route propagation feature.
B. Create a Direct Connect gateway. Recreate the private VIFs to use the new gateway. Associate each VPC by creating new virtual private gateways.
C. Create a transit VPC. Connect the Direct Connect connection to the transit VPC. Create a peering connection between all other VPCs in the Region. Update the route tables.
D. Create AWS Site-to-Site VPN connections from on premises to each VPC. Ensure that both VPN tunnels are UP for each connection. Turn on the route propagation feature.
Question # 142
A solutions architect is designing a shared storage solution for a web application that is deployed across multiple Availability Zones. The web application runs on Amazon EC2 instances that are in an Auto Scaling group. The company plans to make frequent changes to the content. The solution must have strong consistency in returning the new content as soon as the changes occur. Which solutions meet these requirements? (Select TWO.)
A. Use AWS Storage Gateway Volume Gateway Internet Small Computer Systems Interface (iSCSI) block storage that is mounted to the individual EC2 instances.
B. Create an Amazon Elastic File System (Amazon EFS) file system. Mount the EFS file system on the individual EC2 instances.
C. Create a shared Amazon Elastic Block Store (Amazon EBS) volume. Mount the EBS volume on the individual EC2 instances.
D. Use AWS DataSync to perform continuous synchronization of data between EC2 hosts in the Auto Scaling group.
E. Create an Amazon S3 bucket to store the web content. Set the metadata for the Cache-Control header to no-cache. Use Amazon CloudFront to deliver the content.
Question # 143
A company needs to extract the names of ingredients from recipe records that are stored as text files in an Amazon S3 bucket. A web application will use the ingredient names to query an Amazon DynamoDB table and determine a nutrition score. The application can handle non-food records and errors. The company does not have any employees who have machine learning knowledge to develop this solution. Which solution will meet these requirements MOST cost-effectively?
A. Use S3 Event Notifications to invoke an AWS Lambda function when PutObject requests occur. Program the Lambda function to analyze the object and extract the ingredient names by using Amazon Comprehend. Store the Amazon Comprehend output in the DynamoDB table.
B. Use an Amazon EventBridge rule to invoke an AWS Lambda function when PutObject requests occur. Program the Lambda function to analyze the object by using Amazon Forecast to extract the ingredient names. Store the Forecast output in the DynamoDB table.
C. Use S3 Event Notifications to invoke an AWS Lambda function when PutObject requests occur. Use Amazon Polly to create audio recordings of the recipe records. Save the audio files in the S3 bucket. Use Amazon Simple Notification Service (Amazon SNS) to send a URL as a message to employees. Instruct the employees to listen to the audio files and calculate the nutrition score. Store the ingredient names in the DynamoDB table.
D. Use an Amazon EventBridge rule to invoke an AWS Lambda function when a PutObject request occurs. Program the Lambda function to analyze the object and extract the ingredient names by using Amazon SageMaker. Store the inference output from the SageMaker endpoint in the DynamoDB table.
Question # 144
A company has a new mobile app. Anywhere in the world, users can see local news on topics they choose. Users also can post photos and videos from inside the app. Users access content often in the first minutes after the content is posted. New content quickly replaces older content, and then the older content disappears. The local nature of the news means that users consume 90% of the content within the AWS Region where it is uploaded. Which solution will optimize the user experience by providing the LOWEST latency for content uploads?
A. Upload and store content in Amazon S3. Use Amazon CloudFront for the uploads.
B. Upload and store content in Amazon S3. Use S3 Transfer Acceleration for the uploads.
C. Upload content to Amazon EC2 instances in the Region that is closest to the user. Copy the data to Amazon S3.
D. Upload and store content in Amazon S3 in the Region that is closest to the user. Use multiple distributions of Amazon CloudFront.
Question # 145
An ecommerce application uses a PostgreSQL database that runs on an Amazon EC2 instance. During a monthly sales event, database usage increases and causes database connection issues for the application. The traffic is unpredictable for subsequent monthly sales events, which impacts the sales forecast. The company needs to maintain performance when there is an unpredictable increase in traffic. Which solution resolves this issue in the MOST cost-effective way?
A. Migrate the PostgreSQL database to Amazon Aurora Serverless v2.
B. Enable auto scaling for the PostgreSQL database on the EC2 instance to accommodate increased usage.
C. Migrate the PostgreSQL database to Amazon RDS for PostgreSQL with a larger instance type.
D. Migrate the PostgreSQL database to Amazon Redshift to accommodate increased usage.
Question # 146
A company's marketing data is uploaded from multiple sources to an Amazon S3 bucket. A series of data preparation jobs aggregate the data for reporting. The data preparation jobs need to run at regular intervals in parallel. A few jobs need to run in a specific order later. The company wants to remove the operational overhead of job error handling, retry logic, and state management. Which solution will meet these requirements?
A. Use an AWS Lambda function to process the data as soon as the data is uploaded to the S3 bucket. Invoke other Lambda functions at regularly scheduled intervals.
B. Use Amazon Athena to process the data. Use Amazon EventBridge Scheduler to invoke Athena on a regular interval.
C. Use AWS Glue DataBrew to process the data. Use an AWS Step Functions state machine to run the DataBrew data preparation jobs.
D. Use AWS Data Pipeline to process the data. Schedule Data Pipeline to process the data once at midnight.
Question # 147
A research company uses on-premises devices to generate data for analysis. The company wants to use the AWS Cloud to analyze the data. The devices generate .csv files and support writing the data to an SMB file share. Company analysts must be able to use SQL commands to query the data. The analysts will run queries periodically throughout the day. Which combination of steps will meet these requirements MOST cost-effectively? (Select THREE.)
A. Deploy an AWS Storage Gateway on premises in Amazon S3 File Gateway mode.
B. Deploy an AWS Storage Gateway on premises in Amazon FSx File Gateway mode.
C. Set up an AWS Glue crawler to create a table based on the data that is in Amazon S3.
D. Set up an Amazon EMR cluster with EMR File System (EMRFS) to query the data that is in Amazon S3. Provide access to analysts.
E. Set up an Amazon Redshift cluster to query the data that is in Amazon S3. Provide access to analysts.
F. Set up Amazon Athena to query the data that is in Amazon S3. Provide access to analysts.
Question # 148
A company website hosted on Amazon EC2 instances processes classified data. The application writes data to Amazon Elastic Block Store (Amazon EBS) volumes. The company needs to ensure that all data that is written to the EBS volumes is encrypted at rest. Which solution will meet this requirement?
A. Create an IAM role that specifies EBS encryption. Attach the role to the EC2 instances.
B. Create the EBS volumes as encrypted volumes. Attach the EBS volumes to the EC2 instances.
C. Create an EC2 instance tag that has a key of Encrypt and a value of True. Tag all instances that require encryption at the EBS level.
D. Create an AWS Key Management Service (AWS KMS) key policy that enforces EBS encryption in the account. Ensure that the key policy is active.
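For option B, volumes are simply created with encryption turned on (optionally combined with account-level encryption by default). A short sketch with placeholder IDs:

```python
import boto3

ec2 = boto3.client("ec2")

# Optional: make every new EBS volume in this Region encrypted by default.
ec2.enable_ebs_encryption_by_default()

# Create an encrypted volume and attach it to the instance (placeholder IDs).
volume = ec2.create_volume(
    AvailabilityZone="us-east-1a",
    Size=100,
    VolumeType="gp3",
    Encrypted=True,  # data at rest on the volume is encrypted with the default KMS key
)
ec2.get_waiter("volume_available").wait(VolumeIds=[volume["VolumeId"]])
ec2.attach_volume(
    VolumeId=volume["VolumeId"],
    InstanceId="i-0123456789abcdef0",
    Device="/dev/sdf",
)
```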
Question # 149
A company has Amazon EC2 instances that run nightly batch jobs to process data. The EC2 instances run in an Auto Scaling group that uses On-Demand billing. If a job fails on one instance, another instance will reprocess the job. The batch jobs run between 12:00 AM and 06:00 AM local time every day. Which solution will provide EC2 instances to meet these requirements MOST cost-effectively?
A. Purchase a 1-year Savings Plan for Amazon EC2 that covers the instance family of the Auto Scaling group that the batch job uses.
B. Purchase a 1-year Reserved Instance for the specific instance type and operating system of the instances in the Auto Scaling group that the batch job uses.
C. Create a new launch template for the Auto Scaling group. Set the instances to Spot Instances. Set a policy to scale out based on CPU usage.
D. Create a new launch template for the Auto Scaling group. Increase the instance size. Set a policy to scale out based on CPU usage.
Question # 150
A company hosts a three-tier web application in the AWS Cloud. A Multi-AZ Amazon RDS for MySQL server forms the database layer. Amazon ElastiCache forms the cache layer. The company wants a caching strategy that adds or updates data in the cache when a customer adds an item to the database. The data in the cache must always match the data in the database. Which solution will meet these requirements?
A. Implement the lazy loading caching strategy
B. Implement the write-through caching strategy.
C. Implement the adding TTL caching strategy.
D. Implement the AWS AppConfig caching strategy.
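To make the write-through strategy (option B) concrete: every database write is immediately followed by a cache write, so the cache never serves stale data. The sketch below uses the redis-py client against an ElastiCache for Redis endpoint and a hypothetical save_item_to_database helper; the endpoint and key naming are illustrative.

```python
import json

import redis

# ElastiCache for Redis endpoint (placeholder hostname).
cache = redis.Redis(host="my-cache.abc123.ng.0001.use1.cache.amazonaws.com", port=6379)


def save_item_to_database(item: dict) -> None:
    """Hypothetical helper that persists the item to the RDS for MySQL database."""
    ...


def add_item(item: dict) -> None:
    # Write-through: update the database first, then write the same value to the
    # cache so reads always see data that matches the database.
    save_item_to_database(item)
    cache.set(f"item:{item['id']}", json.dumps(item))
```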
Question # 151
A company wants to analyze and troubleshoot Access Denied errors and Unauthorized errors that are related to IAM permissions. The company has AWS CloudTrail turned on. Which solution will meet these requirements with the LEAST effort?
A. Use AWS Glue and write custom scripts to query CloudTrail logs for the errors
B. Use AWS Batch and write custom scripts to query CloudTrail logs for the errors
C. Search CloudTrail logs with Amazon Athena queries to identify the errors
D. Search CloudTrail logs with Amazon QuickSight. Create a dashboard to identify the errors.
Question # 152
A global company runs its applications in multiple AWS accounts in AWS Organizations. The company's applications use multipart uploads to upload data to multiple Amazon S3 buckets across AWS Regions. The company wants to report on incomplete multipart uploads for cost compliance purposes. Which solution will meet these requirements with the LEAST operational overhead?
A. Configure AWS Config with a rule to report the incomplete multipart upload object count.
B. Create a service control policy (SCP) to report the incomplete multipart upload object count.
C. Configure S3 Storage Lens to report the incomplete multipart upload object count.
D. Create an S3 Multi-Region Access Point to report the incomplete multipart upload object count.
Question # 153
A company has stored 10 TB of log files in Apache Parquet format in an Amazon S3 bucket. The company occasionally needs to use SQL to analyze the log files. Which solution will meet these requirements MOST cost-effectively?
A. Create an Amazon Aurora MySQL database. Migrate the data from the S3 bucket into Aurora by using AWS Database Migration Service (AWS DMS). Issue SQL statements to the Aurora database.
B. Create an Amazon Redshift cluster. Use Redshift Spectrum to run SQL statements directly on the data in the S3 bucket.
C. Create an AWS Glue crawler to store and retrieve table metadata from the S3 bucket. Use Amazon Athena to run SQL statements directly on the data in the S3 bucket.
D. Create an Amazon EMR cluster. Use Apache Spark SQL to run SQL statements directly on the data in the S3 bucket.
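A minimal sketch of option C: a Glue crawler catalogs the Parquet files, and Athena then queries the resulting table in place. The bucket names, database name, IAM role, and the query itself are placeholders.

```python
import boto3

glue = boto3.client("glue")
athena = boto3.client("athena")

# Crawl the Parquet log files and register the table in the Glue Data Catalog.
glue.create_crawler(
    Name="log-files-crawler",
    Role="arn:aws:iam::123456789012:role/glue-crawler-role",  # placeholder
    DatabaseName="logs_db",
    Targets={"S3Targets": [{"Path": "s3://example-log-bucket/parquet/"}]},
)
glue.start_crawler(Name="log-files-crawler")

# Once the crawler has finished, query the data in place with Athena.
athena.start_query_execution(
    QueryString="SELECT status_code, COUNT(*) FROM logs GROUP BY status_code",  # illustrative
    QueryExecutionContext={"Database": "logs_db"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
```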
Question # 154
A pharmaceutical company is developing a new drug. The volume of data that the company generates has grown exponentially over the past few months. The company's researchers regularly require a subset of the entire dataset to be immediately available with minimal lag. However, the entire dataset does not need to be accessed on a daily basis. All the data currently resides in on-premises storage arrays, and the company wants to reduce ongoing capital expenses. Which storage solution should a solutions architect recommend to meet these requirements?
A. Run AWS DataSync as a scheduled cron job to migrate the data to an Amazon S3 bucket on an ongoing basis.
B. Deploy an AWS Storage Gateway file gateway with an Amazon S3 bucket as the target storage. Migrate the data to the Storage Gateway appliance.
C. Deploy an AWS Storage Gateway volume gateway with cached volumes with an Amazon S3 bucket as the target storage. Migrate the data to the Storage Gateway appliance.
D. Configure an AWS Site-to-Site VPN connection from the on-premises environment to AWS. Migrate data to an Amazon Elastic File System (Amazon EFS) file system.
Question # 155
A company runs a three-tier web application in a VPC across multiple Availability Zones. Amazon EC2 instances run in an Auto Scaling group for the application tier. The company needs to make an automated scaling plan that will analyze each resource's daily and weekly historical workload trends. The configuration must scale resources appropriately according to both the forecast and live changes in utilization. Which scaling strategy should a solutions architect recommend to meet these requirements?
A. Implement dynamic scaling with step scaling based on average CPU utilization from the EC2 instances.
B. Enable predictive scaling to forecast and scale. Configure dynamic scaling with target tracking.
C. Create an automated scheduled scaling action based on the traffic patterns of the web application.
D. Set up a simple scaling policy. Increase the cooldown period based on the EC2 instance startup time.
Question # 156
A company deployed a serverless application that uses Amazon DynamoDB as a database layer. The application has experienced a large increase in users. The company wants to improve database response time from milliseconds to microseconds and to cache requests to the database. Which solution will meet these requirements with the LEAST operational overhead?
A. Use DynamoDB Accelerator (DAX).
B. Migrate the database to Amazon Redshift.
C. Migrate the database to Amazon RDS.
D. Use Amazon ElastiCache for Redis.
Question # 157
An online video game company must maintain ultra-low latency for its game servers. The game servers run on Amazon EC2 instances. The company needs a solution that can handle millions of UDP internet traffic requests each second. Which solution will meet these requirements MOST cost-effectively?
A. Configure an Application Load Balancer with the required protocol and ports for the internet traffic. Specify the EC2 instances as the targets.
B. Configure a Gateway Load Balancer for the internet traffic. Specify the EC2 instances as the targets.
C. Configure a Network Load Balancer with the required protocol and ports for the internet traffic. Specify the EC2 instances as the targets.
D. Launch an identical set of game servers on EC2 instances in separate AWS Regions. Route internet traffic to both sets of EC2 instances.
Question # 158
A company maintains an Amazon RDS database that maps users to cost centers. The company has accounts in an organization in AWS Organizations. The company needs a solution that will tag all resources that are created in a specific AWS account in the organization. The solution must tag each resource with the cost center ID of the user who created the resource. Which solution will meet these requirements?
A. Move the specific AWS account to a new organizational unit (OU) in Organizations from the management account. Create a service control policy (SCP) that requires all existing resources to have the correct cost center tag before the resources are created. Apply the SCP to the new OU.
B. Create an AWS Lambda function to tag the resources after the Lambda function looks up the appropriate cost center from the RDS database. Configure an Amazon EventBridge rule that reacts to AWS CloudTrail events to invoke the Lambda function.
C. Create an AWS CloudFormation stack to deploy an AWS Lambda function. Configure the Lambda function to look up the appropriate cost center from the RDS database and to tag resources. Create an Amazon EventBridge scheduled rule to invoke the CloudFormation stack.
D. Create an AWS Lambda function to tag the resources with a default value. Configure an Amazon EventBridge rule that reacts to AWS CloudTrail events to invoke the Lambda function when a resource is missing the cost center tag.
Question # 159
A company is designing a tightly coupled high performance computing (HPC) environment in the AWS Cloud. The company needs to include features that will optimize the HPC environment for networking and storage. Which combination of solutions will meet these requirements? (Select TWO.)
A. Create an accelerator in AWS Global Accelerator. Configure custom routing for the accelerator.
B. Create an Amazon FSx for Lustre file system. Configure the file system with scratch storage.
C. Create an Amazon CloudFront distribution. Configure the viewer protocol policy to be HTTP and HTTPS.
D. Launch Amazon EC2 instances. Attach an Elastic Fabric Adapter (EFA) to the instances.
E. Create an AWS Elastic Beanstalk deployment to manage the environment.
Question # 160
A company is running a photo hosting service in the us-east-1 Region. The service enables users across multiple countries to upload and view photos. Some photos are heavily viewed for months, and others are viewed for less than a week. The application allows uploads of up to 20 MB for each photo. The service uses the photo metadata to determine which photos to display to each user. Which solution provides the appropriate user access MOST cost-effectively?
A. Store the photos in Amazon DynamoDB. Turn on DynamoDB Accelerator (DAX) to cache frequently viewed items.
B. Store the photos in the Amazon S3 Intelligent-Tiering storage class. Store the photo metadata and its S3 location in DynamoDB.
C. Store the photos in the Amazon S3 Standard storage class. Set up an S3 Lifecycle policy to move photos older than 30 days to the S3 Standard-Infrequent Access (S3 Standard-IA) storage class. Use the object tags to keep track of metadata.
D. Store the photos in the Amazon S3 Glacier storage class. Set up an S3 Lifecycle policy to move photos older than 30 days to the S3 Glacier Deep Archive storage class. Store the photo metadata and its S3 location in Amazon OpenSearch Service.
Question # 161
A company is designing a new web application that will run on Amazon EC2 instances. The application will use Amazon DynamoDB for backend data storage. The application traffic will be unpredictable. The company expects that the application read and write throughput to the database will be moderate to high. The company needs to scale in response to application traffic. Which DynamoDB table configuration will meet these requirements MOST cost-effectively?
A. Configure DynamoDB with provisioned read and write by using the DynamoDB Standard table class. Set DynamoDB auto scaling to a maximum defined capacity.
B. Configure DynamoDB in on-demand mode by using the DynamoDB Standard table class.
C. Configure DynamoDB with provisioned read and write by using the DynamoDB Standard-Infrequent Access (DynamoDB Standard-IA) table class. Set DynamoDB auto scaling to a maximum defined capacity.
D. Configure DynamoDB in on-demand mode by using the DynamoDB Standard-Infrequent Access (DynamoDB Standard-IA) table class.
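A short sketch of option B: creating a Standard table class table in on-demand capacity mode, which scales with unpredictable traffic and bills per request. The table and key names are placeholders.

```python
import boto3

dynamodb = boto3.client("dynamodb")

dynamodb.create_table(
    TableName="WebAppData",  # placeholder
    AttributeDefinitions=[{"AttributeName": "pk", "AttributeType": "S"}],
    KeySchema=[{"AttributeName": "pk", "KeyType": "HASH"}],
    BillingMode="PAY_PER_REQUEST",   # on-demand mode: no capacity planning needed
    TableClass="STANDARD",           # DynamoDB Standard table class
)
```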
Question # 162
A company's web application that is hosted in the AWS Cloud recently increased in popularity. The web application currently exists on a single Amazon EC2 instance in a single public subnet. The web application has not been able to meet the demand of the increased web traffic. The company needs a solution that will provide high availability and scalability to meet the increased user demand without rewriting the web application. Which combination of steps will meet these requirements? (Select TWO.)
A. Replace the EC2 instance with a larger compute optimized instance.
B. Configure Amazon EC2 Auto Scaling with multiple Availability Zones in private subnets.
C. Configure a NAT gateway in a public subnet to handle web requests.
D. Replace the EC2 instance with a larger memory optimized instance.
E. Configure an Application Load Balancer in a public subnet to distribute web traffic
Question # 163
A company is designing a web application on AWS. The application will use a VPN connection between the company's existing data centers and the company's VPCs. The company uses Amazon Route 53 as its DNS service. The application must use private DNS records to communicate with the on-premises services from a VPC. Which solution will meet these requirements in the MOST secure manner?
A. Create a Route 53 Resolver outbound endpoint. Create a resolver rule. Associate the resolver rule with the VPC.
B. Create a Route 53 Resolver inbound endpoint. Create a resolver rule. Associate the resolver rule with the VPC.
C. Create a Route 53 private hosted zone. Associate the private hosted zone with the VPC.
D. Create a Route 53 public hosted zone. Create a record for each service to allow service communication.
Question # 164
A media company stores movies in Amazon S3. Each movie is stored in a single video file that ranges from 1 GB to 10 GB in size. The company must be able to provide the streaming content of a movie within 5 minutes of a user purchase. There is higher demand for movies that are less than 20 years old than for movies that are more than 20 years old. The company wants to minimize hosting service costs based on demand. Which solution will meet these requirements?
A. Store all media content in Amazon S3. Use S3 Lifecycle policies to move media data into the Infrequent Access tier when the demand for a movie decreases.
B. Store newer movie video files in S3 Standard. Store older movie video files in S3 Standard-Infrequent Access (S3 Standard-IA). When a user orders an older movie, retrieve the video file by using standard retrieval.
C. Store newer movie video files in S3 Intelligent-Tiering. Store older movie video files in S3 Glacier Flexible Retrieval. When a user orders an older movie, retrieve the video file by using expedited retrieval.
D. Store newer movie video files in S3 Standard. Store older movie video files in S3 Glacier Flexible Retrieval. When a user orders an older movie, retrieve the video file by using bulk retrieval.
Question # 165
A business application is hosted on Amazon EC2 and uses Amazon S3 for encrypted object storage. The chief information security officer has directed that no application traffic between the two services should traverse the public internet. Which capability should the solutions architect use to meet the compliance requirements?
A. AWS Key Management Service (AWS KMS)
B. VPC endpoint
C. Private subnet
D. Virtual private gateway
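To illustrate option B, a gateway VPC endpoint for S3 keeps EC2-to-S3 traffic on the AWS network instead of the public internet. The sketch below creates one with boto3; the VPC ID, route table ID, and Region in the service name are placeholders.

```python
import boto3

ec2 = boto3.client("ec2")

# Gateway endpoint for S3: the route table sends S3 traffic to the endpoint
# rather than out through an internet gateway.
ec2.create_vpc_endpoint(
    VpcId="vpc-0123456789abcdef0",                    # placeholder
    ServiceName="com.amazonaws.us-east-1.s3",         # placeholder Region
    VpcEndpointType="Gateway",
    RouteTableIds=["rtb-0123456789abcdef0"],          # placeholder
)
```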