
Amazon Web Services SAA-C03 AWS Certified Solutions Architect - Associate (SAA-C03) Exam Practice Test


AWS Certified Solutions Architect - Associate (SAA-C03) Questions and Answers

Question 1

A company wants to publish a private website for its on-premises employees. The website consists of several HTML pages and image files. The website must be available only through HTTPS and must be available only to on-premises employees. A solutions architect plans to store the website files in an Amazon S3 bucket.

Which solution will meet these requirements?

Options:

A.

Create an S3 bucket policy to deny access when the source IP address is not the public IP address of the on-premises environment. Set up an Amazon Route 53 alias record to point to the S3 bucket. Provide the alias record to the on-premises employees to grant the employees access to the website.

B.

Create an S3 access point to provide website access. Attach an access point policy to deny access when the source IP address is not the public IP address of the on-premises environment. Provide the S3 access point alias to the on-premises employees to grant the employees access to the website.

C.

Create an Amazon CloudFront distribution that includes an origin access control (OAC) that is configured for the S3 bucket. Use AWS Certificate Manager for SSL. Use AWS WAF with an IP set rule that allows access for the on-premises IP address. Set up an Amazon Route 53 alias record to point to the CloudFront distribution.

D.

Create an Amazon CloudFront distribution that includes an origin access control (OAC) that is configured for the S3 bucket. Create a CloudFront signed URL for the objects in the bucket. Set up an Amazon Route 53 alias record to point to the CloudFront distribution. Provide the signed URL to the on-premises employees to grant the employees access to the website.
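For context, the source-IP restriction that options A and B describe is expressed as a bucket policy condition on aws:SourceIp; option C achieves a similar allow-list with an AWS WAF IP set in front of CloudFront. A minimal boto3 sketch of the policy mechanism, where the bucket name and office CIDR are hypothetical:

```python
import json

import boto3

s3 = boto3.client("s3")

# Hypothetical values for illustration only.
BUCKET = "internal-website-bucket"
OFFICE_CIDR = "203.0.113.0/24"  # public CIDR of the on-premises environment

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyOutsideOffice",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": f"arn:aws:s3:::{BUCKET}/*",
            # Deny any request whose source IP is outside the office range.
            "Condition": {"NotIpAddress": {"aws:SourceIp": OFFICE_CIDR}},
        }
    ],
}

s3.put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(policy))
```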

Question 2

A company runs a content management system on an Amazon Elastic Container Service (Amazon ECS) cluster. The system allows visitors to provide feedback about the company's products by uploading documents and photos of the products to an Amazon S3 bucket.

The company has a workflow on AWS that processes uploaded documents to perform sentiment analysis of photos and text. The processing workflow calls multiple AWS services.

The company needs a solution to automate the processing workflow. The solution must handle any failed uploads.

Which solution will meet these requirements with the LEAST effort?

Options:

A.

Use S3 Event Notifications to publish events to an Amazon Simple Notification Service (Amazon SNS) topic. Deploy a web application on the Amazon ECS cluster to subscribe to the SNS topic and listen for events to orchestrate the processing workflow.

B.

Use S3 Event Notifications to publish events to an Amazon Simple Queue Service (Amazon SQS) queue. Configure long polling. Deploy an Amazon EC2 instance that runs a script to orchestrate the processing workflow.

C.

Use S3 Event Notifications to publish events to an Amazon Simple Queue Service (Amazon SQS) queue. Create an ECS cluster that scales based on the number of messages in the queue. Configure the cluster to orchestrate the processing workflow.

D.

Use S3 Event Notifications to invoke an Amazon EventBridge rule. Configure the rule to initiate an AWS Step Functions workflow that orchestrates the processing workflow.
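For reference, option D's wiring is a small amount of configuration: the bucket publishes object-created events to EventBridge, and a rule starts a Step Functions execution, which then supplies retries and error handling. A boto3 sketch, where the bucket name, state machine ARN, and role ARN are hypothetical:

```python
import json

import boto3

s3 = boto3.client("s3")
events = boto3.client("events")

# Hypothetical names and ARNs for illustration.
BUCKET = "feedback-uploads"
STATE_MACHINE_ARN = "arn:aws:states:us-east-1:111122223333:stateMachine:ProcessUpload"
RULE_ROLE_ARN = "arn:aws:iam::111122223333:role/EventBridgeStartExecution"

# Turn on EventBridge delivery for the bucket's S3 events.
s3.put_bucket_notification_configuration(
    Bucket=BUCKET,
    NotificationConfiguration={"EventBridgeConfiguration": {}},
)

# Match "Object Created" events from the upload bucket.
events.put_rule(
    Name="feedback-upload-created",
    EventPattern=json.dumps({
        "source": ["aws.s3"],
        "detail-type": ["Object Created"],
        "detail": {"bucket": {"name": [BUCKET]}},
    }),
)

# Target the Step Functions workflow that orchestrates the processing.
events.put_targets(
    Rule="feedback-upload-created",
    Targets=[{
        "Id": "start-workflow",
        "Arn": STATE_MACHINE_ARN,
        "RoleArn": RULE_ROLE_ARN,
    }],
)
```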

Question 3

A company is building an ecommerce application that uses a relational database to store customer data and order history. The company also needs a solution to store 100 GB of product images. The company expects the traffic flow for the application to be predictable. Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Use Amazon RDS for MySQL for the database. Store the product images in an Amazon S3 bucket.

B.

Use Amazon DynamoDB for the database. Store the product images in an Amazon S3 bucket.

C.

Use Amazon RDS for MySQL for the database. Store the product images in an Amazon Aurora MySQL database.

D.

Create three Amazon EC2 instances. Install MongoDB software on the instances to use as the database. Store the product images in an Amazon RDS for MySQL database with a Multi-AZ deployment.

Question 4

A company has a VPC with multiple private subnets that host multiple applications. The applications must not be accessible to the internet. However, the applications need to access multiple AWS services. The applications must not use public IP addresses to access the AWS services.

Which solution will meet these requirements?

Options:

A.

Configure interface VPC endpoints for the required AWS services. Route traffic from the private subnets through the interface VPC endpoints.

B.

Deploy a NAT gateway in each private subnet. Route traffic from the private subnets through the NAT gateways.

C.

Deploy internet gateways in each private subnet. Route traffic from the private subnets through the internet gateways.

D.

Set up an AWS Direct Connect connection between the private subnets. Route traffic from the private subnets through the Direct Connect connection.

Question 5

A company is migrating some workloads to AWS. However, many workloads will remain on premises. The on-premises workloads require secure and reliable connectivity to AWS with consistent, low-latency performance.

The company has deployed the AWS workloads across multiple AWS accounts and multiple VPCs. The company plans to scale to hundreds of VPCs within the next year.

The company must establish connectivity between each of the VPCs and from the on-premises environment to each VPC.

Which solution will meet these requirements?

Options:

A.

Use an AWS Direct Connect connection to connect the on-premises environment to AWS. Configure VPC peering to establish connectivity between VPCs.

B.

Use multiple AWS Site-to-Site VPN connections to connect the on-premises environment to AWS. Create a transit gateway to establish connectivity between VPCs.

C.

Use an AWS Direct Connect connection with a Direct Connect gateway to connect the on-premises environment to AWS. Create a transit gateway to establish connectivity between VPCs. Associate the transit gateway with the Direct Connect gateway.

D.

Use an AWS Site-to-Site VPN connection to connect the on-premises environment to AWS. Configure VPC peering to establish connectivity between VPCs.

Question 6

An adventure company has launched a new feature on its mobile app. Users can use the feature to upload their hiking and rafting photos and videos anytime. The photos and videos are stored in Amazon S3 Standard storage in an S3 bucket and are served through Amazon CloudFront.

The company needs to optimize the cost of the storage. A solutions architect discovers that most of the uploaded photos and videos are accessed infrequently after 30 days. However, some of the uploaded photos and videos are accessed frequently after 30 days. The solutions architect needs to implement a solution that maintains millisecond retrieval availability of the photos and videos at the lowest possible cost.

Which solution will meet these requirements?

Options:

A.

Configure S3 Intelligent-Tiering on the S3 bucket.

B.

Configure an S3 Lifecycle policy to transition image objects and video objects from S3 Standard to S3 Glacier Deep Archive after 30 days.

C.

Replace Amazon S3 with an Amazon Elastic File System (Amazon EFS) file system that is mounted on Amazon EC2 instances.

D.

Add a Cache-Control: max-age header to the S3 image objects and S3 video objects. Set the header to 30 days.
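As background for option A, S3 Intelligent-Tiering can be applied to existing objects with a lifecycle transition; the storage class then moves each object between frequent- and infrequent-access tiers based on observed access while keeping millisecond retrieval. A sketch, assuming a hypothetical bucket name:

```python
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="hiking-media",  # hypothetical bucket
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "all-media-to-intelligent-tiering",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},  # apply to every object
                # Intelligent-Tiering tiers objects automatically, so items
                # that become hot again move back to the frequent tier.
                "Transitions": [
                    {"Days": 0, "StorageClass": "INTELLIGENT_TIERING"}
                ],
            }
        ]
    },
)
```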

Question 7

A company has a static website that is hosted on Amazon CloudFront in front of Amazon S3. The static website uses a database backend. The company notices that the website does not reflect updates that have been made in the website's Git repository. The company checks the continuous integration and continuous delivery (CI/CD) pipeline between the Git repository and Amazon S3. The company verifies that the webhooks are configured properly and that the CI/CD pipeline is sending messages that indicate successful deployments.

A solutions architect needs to implement a solution that displays the updates on the website.

Which solution will meet these requirements?

Options:

A.

Add an Application Load Balancer.

B.

Add Amazon ElastiCache for Redis or Memcached to the database layer of the web application.

C.

Invalidate the CloudFront cache.

D.

Use AWS Certificate Manager (ACM) to validate the website's SSL certificate.

Question 8

A developer used the AWS SDK to create an application that aggregates and produces log records for 10 services. The application delivers data to an Amazon Kinesis Data Streams stream.

Each record contains a log message with a service name, creation timestamp, and other log information. The stream has 15 shards in provisioned capacity mode. The stream uses service name as the partition key.

The developer notices that when all the services are producing logs, ProvisionedThroughputExceededException errors occur during PutRecord requests. The stream metrics show that the write capacity the application uses is below the provisioned capacity.

How should the developer resolve this issue?

Options:

A.

Change the capacity mode from provisioned to on-demand.

B.

Double the number of shards until the throttling errors stop occurring.

C.

Change the partition key from service name to creation timestamp.

D.

Use a separate Kinesis stream for each service to generate the logs.
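A note on the symptom: a partition key maps every record with the same value to the same shard, so 10 service names spread across 15 shards leave some shards idle while others exceed their per-shard write limit, which is why throttling appears even though aggregate usage is below the provisioned total. Option A's capacity-mode change is a single API call; a sketch with a hypothetical stream ARN:

```python
import boto3

kinesis = boto3.client("kinesis")

# Switch the stream from provisioned shards to on-demand capacity.
kinesis.update_stream_mode(
    StreamARN="arn:aws:kinesis:us-east-1:111122223333:stream/service-logs",
    StreamModeDetails={"StreamMode": "ON_DEMAND"},
)
```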

Question 9

A company is deploying a critical application by using Amazon RDS for MySQL. The application must be highly available and must recover automatically. The company needs to support interactive users (transactional queries) and batch reporting (analytical queries) with no more than a 4-hour lag. The analytical queries must not affect the performance of the transactional queries.

Which solution will meet these requirements?

Options:

A.

Configure Amazon RDS for MySQL in a Multi-AZ DB instance deployment with one standby instance. Point the transactional queries to the primary DB instance. Point the analytical queries to a secondary DB instance that runs in a different Availability Zone.

B.

Configure Amazon RDS for MySQL in a Multi-AZ DB cluster deployment with two standby instances. Point the transactional queries to the primary DB instance. Point the analytical queries to the reader endpoint.

C.

Configure Amazon RDS for MySQL to use multiple read replicas across multiple Availability Zones. Point the transactional queries to the primary DB instance. Point the analytical queries to one of the replicas in a different Availability Zone.

D.

Configure Amazon RDS for MySQL as the primary database for the transactional queries with automated backups enabled. Each night, create a read-only database from the most recent snapshot to support the analytical queries. Terminate the previously created database.

Question 10

An ecommerce company is migrating its on-premises workload to the AWS Cloud. The workload currently consists of a web application and a backend Microsoft SQL database for storage.

The company expects a high volume of customers during a promotional event. The new infrastructure in the AWS Cloud must be highly available and scalable.

Which solution will meet these requirements with the LEAST administrative overhead?

Options:

A.

Migrate the web application to two Amazon EC2 instances across two Availability Zones behind an Application Load Balancer. Migrate the database to Amazon RDS for Microsoft SQL Server with read replicas in both Availability Zones.

B.

Migrate the web application to an Amazon EC2 instance that runs in an Auto Scaling group across two Availability Zones behind an Application Load Balancer. Migrate the database to two EC2 instances across separate AWS Regions with database replication.

C.

Migrate the web application to Amazon EC2 instances that run in an Auto Scaling group across two Availability Zones behind an Application Load Balancer. Migrate the database to Amazon RDS with Multi-AZ deployment.

D.

Migrate the web application to three Amazon EC2 instances across three Availability Zones behind an Application Load Balancer. Migrate the database to three EC2 instances across three Availability Zones.

Question 11

A company is launching a new application that will be hosted on Amazon EC2 instances. A solutions architect needs to design a solution that does not allow public IPv4 access that originates from the internet. However, the solution must allow the EC2 instances to make outbound IPv4 internet requests.

Which solution will meet these requirements?

Options:

A.

Deploy a NAT gateway in public subnets in both Availability Zones. Create and configure one route table for each private subnet.

B.

Deploy an internet gateway in public subnets in both Availability Zones. Create and configure a shared route table for the private subnets.

C.

Deploy a NAT gateway in public subnets in both Availability Zones. Create and configure a shared route table for the private subnets.

D.

Deploy an egress-only internet gateway in public subnets in both Availability Zones. Create and configure one route table for each private subnet.

Question 12

A company tracks customer satisfaction by using surveys that the company hosts on its website. The surveys sometimes reach thousands of customers every hour. Survey results are currently sent in email messages to the company so company employees can manually review results and assess customer sentiment.

The company wants to automate the customer survey process. Survey results must be available for the previous 12 months.

Which solution will meet these requirements in the MOST scalable way?

Options:

A.

Send the survey results data to an Amazon API Gateway endpoint that is connected to an Amazon Simple Queue Service (Amazon SQS) queue. Create an AWS Lambda function to poll the SQS queue, call Amazon Comprehend for sentiment analysis, and save the results to an Amazon DynamoDB table. Set the TTL for all records to 365 days in the future.

B.

Send the survey results data to an API that is running on an Amazon EC2 instance. Configure the API to store the survey results as a new record in an Amazon DynamoDB table, call Amazon Comprehend for sentiment analysis, and save the results in a second DynamoDB table. Set the TTL for all records to 365 days in the future.

C.

Write the survey results data to an Amazon S3 bucket. Use S3 Event Notifications to invoke an AWS Lambda function to read the data and call Amazon Rekognition for sentiment analysis. Store the sentiment analysis results in a second S3 bucket. Use S3 Lifecycle policies on each bucket to expire objects after 365 days.

D.

Send the survey results data to an Amazon API Gateway endpoint that is connected to an Amazon Simple Queue Service (Amazon SQS) queue. Configure the SQS queue to invoke an AWS Lambda function that calls Amazon Lex for sentiment analysis and saves the results to an Amazon DynamoDB table. Set the TTL for all records to 365 days in the future.
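For reference, the Lambda stage that option A describes reduces to a few calls: read the SQS batch, ask Amazon Comprehend for sentiment, and write the result with a TTL attribute. A sketch, where the table name, TTL attribute name, and language code are assumptions:

```python
import time

import boto3

comprehend = boto3.client("comprehend")
table = boto3.resource("dynamodb").Table("SurveyResults")  # hypothetical table

def handler(event, context):
    # The SQS event source delivers survey messages in batches.
    for record in event["Records"]:
        result = comprehend.detect_sentiment(Text=record["body"], LanguageCode="en")
        table.put_item(
            Item={
                "SurveyId": record["messageId"],
                "Sentiment": result["Sentiment"],
                # DynamoDB TTL expects epoch seconds; expire after 365 days.
                "ExpiresAt": int(time.time()) + 365 * 24 * 60 * 60,
            }
        )
```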

Question 13

A company hosts an application on AWS. The application has generated approximately 2.5 TB of data over the previous 12 years. The company currently stores the data on Amazon EBS volumes.

The company wants a cost-effective backup solution for long-term storage. The company must be able to retrieve the data within minutes when required for audits.

Which solution will meet these requirements?

Options:

A.

Create EBS snapshots to back up the data.

B.

Create an Amazon S3 bucket. Use the S3 Glacier Deep Archive storage class to back up the data.

C.

Create an Amazon S3 bucket. Use the S3 Glacier Flexible Retrieval storage class to back up the data.

D.

Create an Amazon Elastic File System (Amazon EFS) file system to back up the data.

Question 14

A company has an application that processes information from documents that users upload. When a user uploads a new document to an Amazon S3 bucket, an AWS Lambda function is invoked. The Lambda function processes information from the documents.

The company discovers that the application did not process many recently uploaded documents. The company wants to ensure that the application processes each document with retries if there is an error during the first attempt to process the document.

Which solution will meet these requirements?

Options:

A.

Create an Amazon API Gateway REST API that has a proxy integration to the Lambda function. Update the application to send requests to the REST API.

B.

Configure a replication policy on the S3 bucket to stage the documents in another S3 bucket that an AWS Batch job processes on a daily schedule.

C.

Deploy an Application Load Balancer in front of the Lambda function that processes the documents.

D.

Configure an Amazon Simple Queue Service (Amazon SQS) queue as an event source for the Lambda function. Configure an S3 event notification on the S3 bucket to send new document upload events to the SQS queue.
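As background for option D, the two pieces of wiring are an S3 event notification that targets the queue and an event source mapping that lets Lambda poll it; messages from failed invocations return to the queue and are retried automatically. A boto3 sketch with hypothetical names and ARNs:

```python
import boto3

s3 = boto3.client("s3")
lambda_client = boto3.client("lambda")

# Hypothetical values for illustration.
QUEUE_ARN = "arn:aws:sqs:us-east-1:111122223333:document-uploads"
FUNCTION_NAME = "process-document"

# 1. Send new-object events from the bucket to the SQS queue.
s3.put_bucket_notification_configuration(
    Bucket="document-bucket",
    NotificationConfiguration={
        "QueueConfigurations": [
            {"QueueArn": QUEUE_ARN, "Events": ["s3:ObjectCreated:*"]}
        ]
    },
)

# 2. Let Lambda poll the queue; a failed batch becomes visible again
#    after the visibility timeout and is retried.
lambda_client.create_event_source_mapping(
    EventSourceArn=QUEUE_ARN,
    FunctionName=FUNCTION_NAME,
    BatchSize=10,
)
```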

Question 15

A developer is creating an ecommerce workflow in an AWS Step Functions state machine that includes an HTTP Task state. The task passes shipping information and order details to an endpoint.

The developer needs to test the workflow to confirm that the HTTP headers and body are correct and that the responses meet expectations.

Which solution will meet these requirements?

Options:

A.

Use the TestState API to invoke only the HTTP Task. Set the inspection level to TRACE.

B.

Use the TestState API to invoke the state machine. Set the inspection level to DEBUG.

C.

Use the data flow simulator to invoke only the HTTP Task. View the request and response data.

D.

Change the log level of the state machine to ALL. Run the state machine.
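For context, the Step Functions TestState API exercises a single state without running the whole state machine, and the TRACE inspection level returns the raw HTTP request and response. A sketch, where the endpoint, connection ARN, role ARN, and input are all hypothetical:

```python
import json

import boto3

sfn = boto3.client("stepfunctions")

# Hypothetical definition of only the HTTP Task state under test.
definition = json.dumps({
    "Type": "Task",
    "Resource": "arn:aws:states:::http:invoke",
    "Parameters": {
        "ApiEndpoint": "https://example.com/orders",  # placeholder endpoint
        "Method": "POST",
        "Authentication": {
            "ConnectionArn": "arn:aws:events:us-east-1:111122223333:connection/shipping/abcd"
        },
        "RequestBody.$": "$.order",
    },
    "End": True,
})

# TRACE inspection returns the raw HTTP request and response so the
# headers and body can be checked against expectations.
response = sfn.test_state(
    definition=definition,
    roleArn="arn:aws:iam::111122223333:role/StepFunctionsTestRole",
    input=json.dumps({"order": {"id": "123", "address": "123 Main St"}}),
    inspectionLevel="TRACE",
)
print(response["status"], response["inspectionData"])
```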

Question 16

A company recently migrated a data warehouse to AWS. The company has an AWS Direct Connect connection to AWS. Company users query the data warehouse by using a visualization tool. The average size of the query results that the data warehouse returns is 50 MB. The average visualization that the visualization tool produces is 500 KB in size. The result sets that the data warehouse returns are not cached.

The company wants to optimize costs for data transfers between the data warehouse and the company.

Which solution will meet this requirement?

Options:

A.

Host the visualization tool on premises. Connect to the data warehouse directly through the internet.

B.

Host the visualization tool in the same AWS Region as the data warehouse. Access the visualization tool through the internet.

C.

Host the visualization tool on premises. Connect to the data warehouse through the Direct Connect connection.

D.

Host the visualization tool in the same AWS Region as the data warehouse. Access the visualization tool through the Direct Connect connection.

Question 17

A company runs an AWS Lambda function in private subnets in a VPC. The subnets have a default route to the internet through an Amazon EC2 NAT instance. The Lambda function processes input data and saves its output as an object to Amazon S3.

Intermittently, the Lambda function times out while trying to upload the object because of saturated traffic on the NAT instance's network. The company wants to access Amazon S3 without traversing the internet.

Which solution will meet these requirements?

Options:

A.

Replace the EC2 NAT instance with an AWS managed NAT gateway.

B.

Increase the size of the EC2 NAT instance in the VPC to a network optimized instance type.

C.

Provision a gateway endpoint for Amazon S3 in the VPC. Update the route tables of the subnets accordingly.

D.

Provision a transit gateway. Place transit gateway attachments in the private subnets where the Lambda function is running.
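For reference, option C's gateway endpoint is created with one call; attaching the route tables adds a managed prefix-list route so S3 traffic from the private subnets bypasses the NAT instance entirely. A sketch with hypothetical VPC and route table IDs:

```python
import boto3

ec2 = boto3.client("ec2")

ec2.create_vpc_endpoint(
    VpcEndpointType="Gateway",
    VpcId="vpc-0abc1234",                      # hypothetical VPC
    ServiceName="com.amazonaws.us-east-1.s3",  # S3 service in the VPC's Region
    # Listing the route tables inserts a prefix-list route to S3, so the
    # Lambda function's traffic never reaches the NAT instance.
    RouteTableIds=["rtb-0abc1234"],
)
```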

Question 18

A company runs game applications on AWS. The company needs to collect, visualize, and analyze telemetry data from the company's game servers. The company wants to gain insights into the behavior, performance, and health of game servers in near real time. Which solution will meet these requirements?

Options:

A.

Use Amazon Kinesis Data Streams to collect telemetry data. Use Amazon Managed Service for Apache Flink to process the data in near real time and publish custom metrics to Amazon CloudWatch. Use Amazon CloudWatch to create dashboards and alarms from the custom metrics.

B.

Use Amazon Data Firehose to collect, process, and store telemetry data in near real time. Use AWS Glue to extract, transform, and load (ETL) data from Firehose into required formats for analysis. Use Amazon QuickSight to visualize and analyze the data.

C.

Use Amazon Kinesis Data Streams to collect, process, and store telemetry data. Use Amazon EMR to process the data in near real time into required formats for analysis. Use Amazon Athena to analyze and visualize the data.

D.

Use Amazon DynamoDB Streams to collect and store telemetry data. Configure DynamoDB Streams to invoke AWS Lambda functions to process the data in near real time. Use Amazon Managed Grafana to visualize and analyze the data.

Question 19

A company is designing a new Amazon Elastic Kubernetes Service (Amazon EKS) deployment to host multi-tenant applications that use a single cluster. The company wants to ensure that each pod has its own hosted environment. The environments must not share CPU, memory, storage, or elastic network interfaces.

Which solution will meet these requirements?

Options:

A.

Use Amazon EC2 instances to host self-managed Kubernetes clusters. Use taints and tolerations to enforce isolation boundaries.

B.

Use Amazon EKS with AWS Fargate. Use Fargate to manage resources and to enforce isolation boundaries.

C.

Use Amazon EKS and self-managed node groups. Use taints and tolerations to enforce isolation boundaries.

D.

Use Amazon EKS and managed node groups. Use taints and tolerations to enforce isolation boundaries.

Question 20

A company runs its production workload on Amazon EC2 instances with Amazon Elastic Block Store (Amazon EBS) volumes. A solutions architect needs to analyze the current EBS volume cost and to recommend optimizations. The recommendations need to include estimated monthly savings opportunities.

Which solution will meet these requirements?

Options:

A.

Use Amazon Inspector reporting to generate EBS volume recommendations for optimization.

B.

Use AWS Systems Manager reporting to determine EBS volume recommendations for optimization.

C.

Use Amazon CloudWatch metrics reporting to determine EBS volume recommendations for optimization.

D.

Use AWS Compute Optimizer to generate EBS volume recommendations for optimization.

Question 21

A company uses Amazon Elastic Container Service (Amazon ECS) to run workloads that belong to service teams. Each service team uses an owner tag to specify the ECS containers that the team owns. The company wants to generate an AWS Cost Explorer report that shows how much each service team spends on ECS containers on a monthly basis.

Which combination of steps will meet these requirements in the MOST operationally efficient way? (Select TWO.)

Options:

A.

Create a custom report in Cost Explorer. Apply a filter for Amazon ECS.

B.

Create a custom report in Cost Explorer. Apply a filter for the owner resource tag.

C.

Set up AWS Compute Optimizer. Review the rightsizing recommendations.

D.

Activate the owner tag as a cost allocation tag. Group the Cost Explorer report by linked account.

E.

Activate the owner tag as a cost allocation tag. Group the Cost Explorer report by the owner cost allocation tag.

Question 22

A company wants to visualize its AWS spend and resource usage. The company wants to use an AWS managed service to provide visual dashboards.

Which solution will meet these requirements?

Options:

A.

Configure an export in AWS Data Exports. Use Amazon QuickSight to create a cost and usage dashboard. View the data in QuickSight.

B.

Configure one custom budget in AWS Budgets for costs. Configure a second custom budget for usage. Schedule daily AWS Budgets reports by using the two budgets as sources.

C.

Configure AWS Cost Explorer to use user-defined cost allocation tags with hourly granularity to generate detailed data.

D.

Configure an export in AWS Data Exports. Use the standard export option. View the data in Amazon Athena.

Question 23

A company plans to use an Amazon S3 bucket to archive backup data. Regulations require the company to retain the backup data for 7 years.

During the retention period, the company must prevent users, including administrators, from deleting the data. The company can delete the data after 7 years.

Which solution will meet these requirements?

Options:

A.

Create an S3 bucket policy that denies delete operations for 7 years. Create an S3 Lifecycle policy to delete the data after 7 years.

B.

Create an S3 Object Lock default retention policy that retains data for 7 years in governance mode. Create an S3 Lifecycle policy to delete the data after 7 years.

C.

Create an S3 Object Lock default retention policy that retains data for 7 years in compliance mode. Create an S3 Lifecycle policy to delete the data after 7 years.

D.

Create an S3 Batch Operations job to set a legal hold on each object for 7 years. Create an S3 Lifecycle policy to delete the data after 7 years.
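As background, S3 Object Lock must be enabled when the bucket is created, and a compliance-mode default retention blocks deletes and overwrites by every identity, including administrators, until the period ends. A sketch with a hypothetical bucket name:

```python
import boto3

s3 = boto3.client("s3")

# Object Lock must be enabled at bucket creation time.
s3.create_bucket(Bucket="backup-archive", ObjectLockEnabledForBucket=True)

# Compliance mode prevents deletion or overwrite by any user, including
# administrators and the root user, until the retention period expires.
s3.put_object_lock_configuration(
    Bucket="backup-archive",
    ObjectLockConfiguration={
        "ObjectLockEnabled": "Enabled",
        "Rule": {"DefaultRetention": {"Mode": "COMPLIANCE", "Years": 7}},
    },
)
```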

Question 24

A company runs its workloads on Amazon Elastic Container Service (Amazon ECS). The container images that the ECS task definition uses need to be scanned for Common Vulnerabilities and Exposures (CVEs). New container images that are created also need to be scanned.

Which solution will meet these requirements with the FEWEST changes to the workloads?

Options:

A.

Use Amazon Elastic Container Registry (Amazon ECR) as a private image repository to store the container images. Specify scan on push filters for the ECR basic scan.

B.

Store the container images in an Amazon S3 bucket. Use Amazon Macie to scan the images. Use an S3 Event Notification to initiate a Macie scan for every event with an s3:ObjectCreated:Put event type.

C.

Deploy the workloads to Amazon Elastic Kubernetes Service (Amazon EKS). Use Amazon Elastic Container Registry (Amazon ECR) as a private image repository. Specify scan on push filters for the ECR enhanced scan.

D.

Store the container images in an Amazon S3 bucket that has versioning enabled. Configure an S3 Event Notification for s3:ObjectCreated:* events to invoke an AWS Lambda function. Configure the Lambda function to initiate an Amazon Inspector scan.

Question 25

A company has an application that uses an Amazon RDS for PostgreSQL database. The company is developing an application feature that will store sensitive information for an individual in the database.

During a security review of the environment, the company discovers that the RDS DB instance is not encrypting data at rest. The company needs a solution that will provide encryption at rest for all the existing data and for any new data that is entered for an individual.

Which combination of steps should the company take to meet these requirements? (Select TWO.)

Options:

A.

Create a snapshot of the DB instance. Enable encryption on the snapshot. Use the encrypted snapshot to create a new DB instance. Adjust the application configuration to use the new DB instance.

B.

Create a snapshot of the DB instance. Create an encrypted copy of the snapshot. Use the encrypted snapshot to create a new DB instance. Adjust the application configuration to use the new DB instance.

C.

Modify the configuration of the DB instance by enabling encryption. Create a snapshot of the DB instance. Use the snapshot to create a new DB instance. Adjust the application configuration to use the new DB instance.

D.

Use AWS Key Management Service (AWS KMS) to create a new default AWS managed aws/rds key. Select this key as the encryption key for operations with Amazon RDS.

E.

Use AWS Key Management Service (AWS KMS) to create a new customer managed key. Select this key as the encryption key for operations with Amazon RDS.
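For context, an existing unencrypted RDS instance cannot be encrypted in place; the usual path is snapshot, encrypted copy, then restore, which is the sequence option B describes. A boto3 sketch with hypothetical identifiers and key alias (waiters between the steps are omitted for brevity):

```python
import boto3

rds = boto3.client("rds")

# 1. Snapshot the unencrypted instance.
rds.create_db_snapshot(
    DBInstanceIdentifier="app-postgres",
    DBSnapshotIdentifier="app-postgres-snap",
)

# 2. Copy the snapshot with a KMS key; the copy is encrypted even though
#    the source snapshot is not.
rds.copy_db_snapshot(
    SourceDBSnapshotIdentifier="app-postgres-snap",
    TargetDBSnapshotIdentifier="app-postgres-snap-encrypted",
    KmsKeyId="alias/rds-data-key",  # hypothetical customer managed key
)

# 3. Restore a new, encrypted DB instance from the encrypted copy, then
#    point the application at the new instance.
rds.restore_db_instance_from_db_snapshot(
    DBInstanceIdentifier="app-postgres-encrypted",
    DBSnapshotIdentifier="app-postgres-snap-encrypted",
)
```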

Question 26

A company has applications that run on Amazon EC2 instances in a VPC. One of the applications needs to call the Amazon S3 API to store and read objects. According to the company's security regulations, no traffic from the applications is allowed to travel across the internet.

Which solution will meet these requirements?

Options:

A.

Configure an S3 gateway endpoint.

B.

Create an S3 bucket in a private subnet.

C.

Create an S3 bucket in the same AWS Region as the EC2 instances.

D.

Configure a NAT gateway in the same subnet as the EC2 instances.

Question 27

A company creates operations data and stores the data in an Amazon S3 bucket. For the company's annual audit, an external consultant needs to access an annual report that is stored in the S3 bucket. The external consultant needs to access the report for 7 days.

The company must implement a solution to allow the external consultant access to only the report.

Which solution will meet these requirements with the MOST operational efficiency?

Options:

A.

Create a new S3 bucket that is configured to host a public static website. Migrate the operations data to the new S3 bucket. Share the S3 website URL with the external consultant.

B.

Enable public access to the S3 bucket for 7 days. Remove access to the S3 bucket when the external consultant completes the audit.

C.

Create a new IAM user that has access to the report in the S3 bucket. Provide the access keys to the external consultant. Revoke the access keys after 7 days.

D.

Generate a presigned URL that grants the required access to the report in the S3 bucket. Share the presigned URL with the external consultant.
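For reference, a presigned URL inherits the signer's permissions and expires on its own; 604,800 seconds (7 days) is the maximum lifetime for a SigV4-signed URL, which matches the consultant's access window. A sketch with hypothetical bucket and key names:

```python
import boto3

s3 = boto3.client("s3")

# The URL carries the signer's permissions and expires automatically.
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "operations-data", "Key": "reports/annual-report.pdf"},
    ExpiresIn=7 * 24 * 60 * 60,  # 7 days, the SigV4 maximum
)
print(url)  # share this URL with the external consultant
```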

Question 28

A company is planning to deploy a managed MySQL database solution for its non-production applications. The company plans to run the system for several years on AWS. Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Create an Amazon RDS for MySQL instance. Purchase a Reserved Instance.

B.

Create an Amazon RDS for MySQL instance. Use the instance on an on-demand basis.

C.

Create an Amazon Aurora MySQL cluster with writer and reader nodes. Use the cluster on an on-demand basis.

D.

Create an Amazon EC2 instance. Manually install and configure MySQL Server on the instance.

Question 29

A company needs a solution to back up and protect critical AWS resources. The company needs to regularly take backups of several Amazon EC2 instances and Amazon RDS for PostgreSQL databases. To ensure high resiliency, the company must have the ability to validate and restore backups.

Which solution meets these requirements with the LEAST operational overhead?

Options:

A.

Use AWS Backup to create a backup schedule for the resources. Use AWS Backup to create a restoration testing plan for the required resources.

B.

Take snapshots of the EC2 instances and RDS DB instances. Create AWS Batch jobs to validate and restore the snapshots.

C.

Create a custom AWS Lambda function to take snapshots of the EC2 instances and RDS DB instances. Create a second Lambda function to restore the snapshots periodically to validate the backups.

D.

Take snapshots of the EC2 instances and RDS DB instances. Create an AWS Lambda function to restore the snapshots periodically to validate the backups.

Question 30

A company has multiple Amazon RDS DB instances that run in a development AWS account. All the instances have tags to identify them as development resources. The company needs the development DB instances to run on a schedule only during business hours.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Create an Amazon CloudWatch alarm to identify RDS instances that need to be stopped. Create an AWS Lambda function to start and stop the RDS instances.

B.

Create an AWS Trusted Advisor report to identify RDS instances to be started and stopped. Create an AWS Lambda function to start and stop the RDS instances.

C.

Create AWS Systems Manager State Manager associations to start and stop the RDS instances.

D.

Create an Amazon EventBridge rule that invokes AWS Lambda functions to start and stop the RDS instances.
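As a sketch of option D's moving parts, an EventBridge schedule rule can invoke a function like the following at the close of business, with a mirror function calling start_db_instance each morning. The tag key and value are assumptions:

```python
import boto3

rds = boto3.client("rds")

def handler(event, context):
    # Stop every DB instance that carries the development tag.
    for db in rds.describe_db_instances()["DBInstances"]:
        tags = rds.list_tags_for_resource(ResourceName=db["DBInstanceArn"])["TagList"]
        if {"Key": "environment", "Value": "development"} in tags:
            if db["DBInstanceStatus"] == "available":
                rds.stop_db_instance(
                    DBInstanceIdentifier=db["DBInstanceIdentifier"]
                )
```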

Question 31

A finance company collects streaming data for a real-time search and visualization system. The company wants to migrate to AWS and use an AWS native solution for ingest, search, and visualization.

Which solution will meet these requirements?

Options:

A.

Use Amazon EC2 instances to ingest and process the data streams to Amazon S3 buckets for storage. Use Amazon Athena to search the data. Use Amazon Managed Grafana to create visualizations.

B.

Use Amazon EMR to ingest and process the data streams to Amazon Redshift for storage. Use Amazon Redshift Spectrum to search the data. Use Amazon QuickSight to create visualizations.

C.

Use Amazon Elastic Kubernetes Service (Amazon EKS) to ingest and process the data streams to Amazon DynamoDB for storage. Use Amazon CloudWatch dashboards to search and visualize the data.

D.

Use Amazon Kinesis Data Streams to ingest and process the data streams to Amazon OpenSearch Service. Use OpenSearch Service to search the data. Use Amazon QuickSight to create visualizations.

Question 32

A company has a large fleet of vehicles that are equipped with internet connectivity to send telemetry to the company. The company receives over 1 million data points every 5 minutes from the vehicles. The company uses the data in machine learning (ML) applications to predict vehicle maintenance needs and to preorder parts. The company produces visual reports based on the captured data. The company wants to migrate the telemetry ingestion, processing, and visualization workloads to AWS. Which solution will meet these requirements?

Options:

A.

Use Amazon Timestream for LiveAnalytics to store the data points. Grant Amazon SageMaker permission to access the data for processing. Use Amazon QuickSight to visualize the data.

B.

Use Amazon DynamoDB to store the data points. Use DynamoDB Connector to ingest data from DynamoDB into Amazon EMR for processing. Use Amazon QuickSight to visualize the data.

C.

Use Amazon Neptune to store the data points. Use Amazon Kinesis Data Streams to ingest data from Neptune into an AWS Lambda function for processing. Use Amazon QuickSight to visualize the data.

D.

Use Amazon Timestream for LiveAnalytics to store the data points. Grant Amazon SageMaker permission to access the data for processing. Use Amazon Athena to visualize the data.

Question 33

A company stores data for multiple business units in a single Amazon S3 bucket that is in the company's payer AWS account. To maintain data isolation, the business units store data in separate prefixes in the S3 bucket by using an S3 bucket policy.

The company plans to add a large number of dynamic prefixes. The company does not want to rely on a single S3 bucket policy to manage data access at scale. The company wants to develop a secure access management solution in addition to the bucket policy to enforce prefix-level data isolation.

Which solution will meet these requirements?

Options:

A.

Configure the S3 bucket policy to deny s3:GetObject permissions for all users. Configure the bucket policy to allow s3:* access to individual business units.

B.

Enable default encryption on the S3 bucket by using server-side encryption with Amazon S3 managed keys (SSE-S3).

C.

Configure resource-based permissions on the S3 bucket by creating an S3 access point for each business unit.

D.

Use pre-signed URLs to provide access to the S3 bucket.

Question 34

A company runs its production workload on an Amazon Aurora MySQL DB cluster that includes six Aurora Replicas. The company wants near-real-time reporting queries from one of its departments to be automatically distributed across three of the Aurora Replicas. Those three replicas have a different compute and memory specification from the rest of the DB cluster.

Which solution meets these requirements?

Options:

A.

Create and use a custom endpoint for the workload.

B.

Create a three-node cluster clone and use the reader endpoint.

C.

Use any of the instance endpoints for the selected three nodes.

D.

Use the reader endpoint to automatically distribute the read-only workload.
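For context, an Aurora custom endpoint (option A) load-balances connections across only the replicas listed as static members, keeping the reporting traffic off the other replicas. A sketch with hypothetical cluster and instance identifiers:

```python
import boto3

rds = boto3.client("rds")

# The custom endpoint distributes connections across only the listed
# replicas, so the reporting department's queries stay on those nodes.
rds.create_db_cluster_endpoint(
    DBClusterIdentifier="prod-aurora-mysql",       # hypothetical cluster
    DBClusterEndpointIdentifier="reporting",
    EndpointType="READER",
    StaticMembers=[
        "aurora-replica-4",
        "aurora-replica-5",
        "aurora-replica-6",
    ],
)
```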

Question 35

A company hosts an application on AWS that gives users the ability to download photos. The company stores all photos in an Amazon S3 bucket that is located in the us-east-1 Region. The company wants to provide the photo download application to global customers with low latency.

Which solution will meet these requirements?

Options:

A.

Find the public IP addresses that Amazon S3 uses in us-east-1. Configure an Amazon Route 53 latency-based routing policy that routes to all the public IP addresses.

B.

Configure an Amazon CloudFront distribution in front of the S3 bucket. Use the distribution endpoint to access the photos that are in the S3 bucket.

C.

Configure an Amazon Route 53 geoproximity routing policy to route the traffic to the S3 bucket that is closest to each customer's location.

D.

Create a new S3 bucket in the us-west-1 Region. Configure an S3 Cross-Region Replication rule to copy the photos to the new S3 bucket.

Question 36

A company has developed an API by using an Amazon API Gateway REST API and AWS Lambda functions. The API serves static content and dynamic content to users worldwide. The company wants to decrease the latency of transferring the content for API requests. Which solution will meet these requirements?

Options:

A.

Deploy the REST API as an edge-optimized API endpoint. Enable caching. Enable content encoding in the API definition to compress the application data in transit.

B.

Deploy the REST API as a Regional API endpoint. Enable caching. Enable content encoding in the API definition to compress the application data in transit.

C.

Deploy the REST API as an edge-optimized API endpoint. Enable caching. Configure reserved concurrency for the Lambda functions.

D.

Deploy the REST API as a Regional API endpoint. Enable caching. Configure reserved concurrency for the Lambda functions.

Question 37

A company runs multiple workloads on virtual machines (VMs) in an on-premises data center. The company is expanding rapidly. The on-premises data center is not able to scale fast enough to meet business needs. The company wants to migrate the workloads to AWS.

The migration is time sensitive. The company wants to use a lift-and-shift strategy for non-critical workloads.

Which combination of steps will meet these requirements? (Select THREE.)

Options:

A.

Use the AWS Schema Conversion Tool (AWS SCT) to collect data about the VMs.

B.

Use AWS Application Migration Service. Install the AWS Replication Agent on the VMs.

C.

Complete the initial replication of the VMs. Launch test instances to perform acceptance tests on the VMs.

D.

Stop all operations on the VMs. Launch a cutover instance.

E.

Use AWS App2Container (A2C) to collect data about the VMs.

F.

Use AWS Database Migration Service (AWS DMS) to migrate the VMs.

Question 38

A software company needs to upgrade a critical web application. The application is hosted on an Amazon EC2 instance in a public subnet. The instance runs a MySQL database. The application's DNS records are published in an Amazon Route 53 zone.

A solutions architect must reconfigure the application to be scalable and highly available. The solutions architect must also reduce MySQL read latency.

Which combination of solutions will meet these requirements? (Select TWO.)

Options:

A.

Launch a second EC2 instance in a second AWS Region. Use a Route 53 failover routing policy to redirect the traffic to the second EC2 instance.

B.

Create and configure an Auto Scaling group to launch private EC2 instances in multiple Availability Zones. Add the instances to a target group behind a new Application Load Balancer.

C.

Migrate the database to an Amazon Aurora MySQL cluster. Create the primary DB instance and reader DB instance in separate Availability Zones.

D.

Create and configure an Auto Scaling group to launch private EC2 instances in multiple AWS Regions. Add the instances to a target group behind a new Application Load Balancer.

E.

Migrate the database to an Amazon Aurora MySQL cluster with cross-Region read replicas.

Question 39

A company runs a web application in a single AWS Region. A solutions architect wants to ensure that the web application can continue to operate if the application becomes unavailable in the Region.

Which solution will meet this requirement?

Options:

A.

Deploy the application in multiple Regions. Use Amazon Route 53 DNS health checks to route traffic to a healthy Region.

B.

Deploy the application in multiple Availability Zones within a single Region. Use Amazon Route 53 DNS health checks to route traffic to healthy application resources.

C.

Deploy the application in multiple Regions. Use an Amazon Route 53 simple routing record to route traffic to a healthy Region.

D.

Deploy the application in multiple Availability Zones within a single Region. Use an Amazon Route 53 latency record in each Availability Zone to route traffic to a healthy Availability Zone.

Question 40

A developer is creating a serverless application that performs video encoding. The encoding process runs as background jobs and takes several minutes to encode each video. The process must not send an immediate result to users.

The developer is using Amazon API Gateway to manage an API for the application. The developer needs to run test invocations and request validations. The developer must distribute API keys to control access to the API.

Which solution will meet these requirements?

Options:

A.

Create an HTTP API. Create an AWS Lambda function to handle the encoding jobs. Integrate the function with the HTTP API. Use the Event invocation type to call the Lambda function.

B.

Create a REST API with the default endpoint type. Create an AWS Lambda function to handle the encoding jobs. Integrate the function with the REST API. Use the Event invocation type to call the Lambda function.

C.

Create an HTTP API. Create an AWS Lambda function to handle the encoding jobs. Integrate the function with the HTTP API. Use the RequestResponse invocation type to call the Lambda function.

D.

Create a REST API with the default endpoint type. Create an AWS Lambda function to handle the encoding jobs. Integrate the function with the REST API. Use the RequestResponse invocation type to call the Lambda function.

Question 41

A company uses an Amazon EC2 instance to run a script to poll for and process messages in an Amazon Simple Queue Service (Amazon SQS) queue. The company wants to reduce operational overhead while maintaining its ability to process an increasing number of messages that are added to the queue. Which solution will meet these requirements?

Options:

A.

Increase the size of the EC2 instance to process messages in the SQS queue faster.

B.

Configure an Amazon EventBridge rule to turn off the EC2 instance when the SQS queue is empty.

C.

Migrate the script on the EC2 instance to an AWS Lambda function with an event source of the SQS queue.

D.

Configure an AWS Systems Manager Run Command to run the script on demand.

Question 42

A financial service company has a two-tier consumer banking application. The frontend serves static web content. The backend consists of APIs. The company needs to migrate the frontend component to AWS. The backend of the application will remain on premises. The company must protect the application from common web vulnerabilities and attacks.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Migrate the frontend to Amazon EC2 instances. Deploy an Application Load Balancer (ALB) in front of the instances. Use the instances to invoke the on-premises APIs. Associate AWS WAF rules with the instances.

B.

Deploy the frontend as an Amazon CloudFront distribution that has multiple origins. Configure one origin to be an Amazon S3 bucket that serves the static web content. Configure a second origin to route traffic to the on-premises APIs based on the URL pattern. Associate AWS WAF rules with the distribution.

C.

Migrate the frontend to Amazon EC2 instances. Deploy a Network Load Balancer (NLB) in front of the instances. Use the instances to invoke the on-premises APIs. Create an AWS Network Firewall instance. Route all traffic through the Network Firewall instance.

D.

Deploy the frontend as a static website based on an Amazon S3 bucket. Use an Amazon API Gateway REST API and a set of Amazon EC2 instances to invoke the on-premises APIs. Associate AWS WAF rules with the REST API and the S3 bucket.

Question 43

A company is developing a content sharing platform that currently handles 500 GB of user-generated media files. The company expects the amount of content to grow significantly in the future. The company needs a storage solution that can automatically scale, provide high durability, and allow direct user uploads from web browsers.

Which solution will meet these requirements?

Options:

A.

Store the data in an Amazon Elastic Block Store (Amazon EBS) volume with Multi-Attach enabled.

B.

Store the data in an Amazon Elastic File System (Amazon EFS) Standard file system.

C.

Store the data in an Amazon S3 Standard bucket.

D.

Store the data in an Amazon S3 Express One Zone bucket.

Question 44

A company is implementing a new application on AWS. The company will run the application on multiple Amazon EC2 instances across multiple Availability Zones within multiple AWS Regions. The application will be available through the internet. Users will access the application from around the world.

The company wants to ensure that each user who accesses the application is sent to the EC2 instances that are closest to the user's location.

Which solution will meet these requirements?

Options:

A.

Implement an Amazon Route 53 geolocation routing policy. Use an internet-facing Application Load Balancer to distribute the traffic across all Availability Zones within the same Region.

B.

Implement an Amazon Route 53 geoproximity routing policy. Use an internet-facing Network Load Balancer to distribute the traffic across all Availability Zones within the same Region.

C.

Implement an Amazon Route 53 multivalue answer routing policy. Use an internet-facing Application Load Balancer to distribute the traffic across all Availability Zones within the same Region.

D.

Implement an Amazon Route 53 weighted routing policy. Use an internet-facing Network Load Balancer to distribute the traffic across all Availability Zones within the same Region.

Question 45

A company needs to ingest and analyze telemetry data from vehicles at scale for machine learning and reporting.

Which solution will meet these requirements?

Options:

A.

Use Amazon Timestream for LiveAnalytics to store data points. Grant Amazon SageMaker permission to access the data. Use Amazon QuickSight to visualize the data.

B.

Use Amazon DynamoDB to store data points. Use DynamoDB Connector to ingest data into Amazon EMR for processing. Use Amazon QuickSight to visualize the data.

C.

Use Amazon Neptune to store data points. Use Amazon Kinesis Data Streams to ingest data into a Lambda function for processing. Use Amazon QuickSight to visualize the data.

D.

Use Amazon Timestream for LiveAnalytics to store data points. Grant Amazon SageMaker permission to access the data. Use Amazon Athena to visualize the data.

Question 46

A company has a batch processing application that runs every day. The process typically takes an average of 3 hours to complete. The application can handle interruptions and can resume the process after a restart. Currently, the company runs the application on Amazon EC2 On-Demand Instances. The company wants to optimize costs while maintaining the same performance level. Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Purchase a 1-year EC2 Instance Savings Plan for the appropriate instance family and size to meet the requirements of the application.

B.

Use EC2 On-Demand Capacity Reservations based on the appropriate instance family and size to meet the requirements of the application. Run the EC2 instances in an Auto Scaling group.

C.

Determine the appropriate instance family and size to meet the requirements of the application. Convert the application to run on AWS Batch with EC2 On-Demand Instances. Purchase a 1-year Compute Savings Plan.

D.

Determine the appropriate instance family and size to meet the requirements of the application. Convert the application to run on AWS Batch with EC2 Spot Instances.
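As background for option D, an AWS Batch managed compute environment can be pointed at Spot capacity with a few fields; because the job is interruptible and resumable, Spot interruptions only delay completion. A sketch where every name, ARN, and sizing value is a placeholder:

```python
import boto3

batch = boto3.client("batch")

# Spot Instances suit this workload because the job tolerates
# interruptions and resumes after a restart.
batch.create_compute_environment(
    computeEnvironmentName="nightly-batch-spot",
    type="MANAGED",
    computeResources={
        "type": "SPOT",
        "allocationStrategy": "SPOT_CAPACITY_OPTIMIZED",
        "minvCpus": 0,           # scale to zero between runs
        "maxvCpus": 64,
        "instanceTypes": ["optimal"],
        "subnets": ["subnet-0abc1234"],
        "securityGroupIds": ["sg-0abc1234"],
        "instanceRole": "arn:aws:iam::111122223333:instance-profile/ecsInstanceRole",
    },
    serviceRole="arn:aws:iam::111122223333:role/AWSBatchServiceRole",
)
```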

Question 47

A company is creating a low-latency payment processing application that supports TLS connections from IPv4 clients. The application requires outbound access to the public internet. Users must access the application from a single entry point.

The company wants to use Amazon Elastic Container Service (Amazon ECS) tasks to deploy the application. The company wants to enable the awsvpc network mode.

Which solution will meet these requirements MOST securely?

Options:

A.

Create a VPC that has an internet gateway, public subnets, and private subnets. Deploy a Network Load Balancer and a NAT gateway in the public subnets. Deploy the ECS tasks in the private subnets.

B.

Create a VPC that has an outbound-only internet gateway, public subnets, and private subnets. Deploy an Application Load Balancer and a NAT gateway in the public subnets. Deploy the ECS tasks in the private subnets.

C.

Create a VPC that has an internet gateway, public subnets, and private subnets. Deploy an Application Load Balancer in the public subnets. Deploy the ECS tasks in the public subnets.

D.

Create a VPC that has an outbound-only internet gateway, public subnets, and private subnets. Deploy a Network Load Balancer in the public subnets. Deploy the ECS tasks in the public subnets.

Question 48

A company's SAP application has a backend SQL Server database in an on-premises environment. The company wants to migrate its on-premises application and database server to AWS. The company needs an instance type that meets the high demands of its SAP database. On-premises performance data shows that both the SAP application and the database have high memory utilization.

Which solution will meet these requirements?

Options:

A.

Use the compute optimized instance family for the application. Use the memory optimized instance family for the database.

B.

Use the storage optimized instance family for both the application and the database.

C.

Use the memory optimized instance family for both the application and the database.

D.

Use the high performance computing (HPC) optimized instance family for the application. Use the memory optimized instance family for the database.

Question 49

A company has a production Amazon RDS for MySQL database. The company needs to create a new application that will read frequently changing data from the database with minimal impact on the database's overall performance. The application will rarely perform the same query more than once.

What should a solutions architect do to meet these requirements?

Options:

A.

Set up an Amazon ElastiCache cluster. Query the results in the cluster.

B.

Set up an Application Load Balancer (ALB). Query the results in the ALB.

C.

Set up a read replica for the database. Query the read replica.

D.

Set up querying of database snapshots. Query the database snapshots.

Question 50

A company is using Amazon DocumentDB global clusters to support an ecommerce application. The application serves customers across multiple AWS Regions. To ensure business continuity, the company needs a solution to minimize downtime during maintenance windows or other disruptions.

Which solution will meet these requirements?

Options:

A.

Regularly create manual snapshots of the DocumentDB instance in the primary Region.

B.

Perform a managed failover to a secondary Region when needed.

C.

Perform a failover to a replica DocumentDB instance within the primary Region.

D.

Configure increased replication lag to manage cross-Region replication.

Question 51

A finance company uses an on-premises search application to collect streaming data from various producers. The application provides real-time updates to search and visualization features. The company is planning to migrate to AWS and wants to use an AWS native solution. Which solution will meet these requirements?

Options:

A.

Use Amazon EC2 instances to ingest and process the data streams to Amazon S3 buckets for storage. Use Amazon Athena to search the data. Use Amazon Managed Grafana to create visualizations.

B.

Use Amazon EMR to ingest and process the data streams to Amazon Redshift for storage. Use Amazon Redshift Spectrum to search the data. Use Amazon QuickSight to create visualizations.

C.

Use Amazon Elastic Kubernetes Service (Amazon EKS) to ingest and process the data streams to Amazon DynamoDB for storage. Use Amazon CloudWatch to create graphical dashboards to search and visualize the data.

D.

Use Amazon Kinesis Data Streams to ingest and process the data streams to Amazon OpenSearch Service. Use OpenSearch Service to search the data. Use Amazon QuickSight to create visualizations.

Question 52

A company hosts a multi-tier inventory reporting application on AWS. The company needs a cost-effective solution to generate inventory reports on demand. Admin users need to have the ability to generate new reports. Reports take approximately 5-10 minutes to finish. The application must send reports to the email address of the admin user who generates each report.

Which solution will meet these requirements?

Options:

A.

Use Amazon Elastic Container Service (Amazon ECS) to host the report generation code. Use an Amazon API Gateway HTTP API to invoke the code. Use Amazon Simple Email Service (Amazon SES) to send the reports to admin users.

B.

Use Amazon EventBridge to invoke a scheduled AWS Lambda function to generate the reports. Use Amazon Simple Notification Service (Amazon SNS) to send the reports to admin users.

C.

Use Amazon Elastic Kubernetes Service (Amazon EKS) to host the report generation code. Use an Amazon API Gateway REST API to invoke the code. Use Amazon Simple Notification Service (Amazon SNS) to send the reports to admin users.

D.

Create an AWS Lambda function to generate the reports. Use a function URL to invoke the function. Use Amazon Simple Email Service (Amazon SES) to send the reports to admin users.

Question 53

A company needs to accommodate traffic for a web application that the company hosts on AWS, especially during peak usage hours.

The application uses Amazon EC2 instances as web servers, an Amazon RDS DB instance for database operations, and an Amazon S3 bucket to store transaction documents. The application struggles to scale effectively and experiences performance issues.

The company wants to improve the scalability of the application and prevent future performance issues. The company also wants to improve global access speeds to the transaction documents for the company's global users.

Which solution will meet these requirements?

Options:

A.

Place the EC2 instances in Auto Scaling groups to scale appropriately during peak usage hours. Use Amazon RDS read replicas to improve database read performance. Deploy an Amazon CloudFront distribution that uses Amazon S3 as the origin.

B.

Increase the size of the EC2 instances to provide more compute capacity. Use Amazon ElastiCache to reduce database read loads. Use AWS Global Accelerator to optimize the delivery of the transaction documents that are in the S3 bucket.

C.

Transition workloads from the EC2 instances to AWS Lambda functions to scale in response to the usage peaks. Migrate the database to an Amazon Aurora global database to provide cross-Region reads. Use AWS Global Accelerator to deliver the transaction documents that are in the S3 bucket.

D.

Convert the application architecture to use Amazon Elastic Container Service (Amazon ECS) containers. Configure a Multi-AZ deployment of Amazon RDS to support database operations. Replicate the transaction documents that are in the S3 bucket across multiple AWS Regions.

Question 54

A company wants to create an API to authorize users by using JSON Web Tokens (JWTs). The company needs to support dynamic access to multiple AWS services by using path-based routing.

Which solution will meet these requirements?

Options:

A.

Deploy an Application Load Balancer behind an Amazon API Gateway REST API. Configure IAM authorization.

B.

Deploy an Application Load Balancer behind an Amazon API Gateway HTTP API. Use Amazon Cognito for authorization.

C.

Deploy a Network Load Balancer behind an Amazon API Gateway REST API. Use an AWS Lambda function as a custom authorizer.

D.

Deploy a Network Load Balancer behind an Amazon API Gateway HTTP API. Use Amazon Cognito for authorization.

Question 55

A company wants to provide a third-party system that runs in a private data center with access to its AWS account. The company wants to call AWS APIs directly from the third-party system. The company has an existing process for managing digital certificates. The company does not want to use SAML or OpenID Connect (OIDC) capabilities and does not want to store long-term AWS credentials.

Which solution will meet these requirements?

Options:

A.

Configure mutual TLS to allow authentication of the client and server sides of the communication channel.

B.

Configure AWS Signature Version 4 to authenticate incoming HTTPS requests to AWS APIs.

C.

Configure Kerberos to exchange tickets for assertions that can be validated by AWS APIs.

D.

Configure AWS Identity and Access Management (IAM) Roles Anywhere to exchange X.509 certificates for AWS credentials to interact with AWS APIs.

Question 56

Question:

A company runs an application on several Amazon EC2 instances that store persistent data on an Amazon Elastic File System (Amazon EFS) file system. The company needs to replicate the data to another AWS Region by using an AWS managed service solution. Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Use the EFS-to-EFS backup solution to replicate the data to an EFS file system in another Region.

B.

Run a nightly script to copy data from the EFS file system to an Amazon S3 bucket. Enable S3 Cross-Region Replication on the S3 bucket.

C.

Create a VPC in another Region. Establish a cross-Region VPC peer. Run a nightly rsync to copy data from the original Region to the new Region.

D.

Use AWS Backup to create a backup plan with a rule that takes a daily backup and replicates it to another Region. Assign the EFS file system resource to the backup plan.
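
If option D were implemented, the configuration could look like the following minimal boto3 sketch, which creates a backup plan whose daily rule copies each recovery point to a vault in another Region and then assigns the EFS file system to the plan. All names, ARNs, and the schedule are illustrative placeholders.

    import boto3

    backup = boto3.client("backup", region_name="us-east-1")

    # Daily rule that also copies each recovery point to a vault in eu-west-1.
    plan = backup.create_backup_plan(
        BackupPlan={
            "BackupPlanName": "efs-daily-cross-region",
            "Rules": [{
                "RuleName": "daily-with-copy",
                "TargetBackupVaultName": "Default",
                "ScheduleExpression": "cron(0 5 * * ? *)",  # once per day
                "CopyActions": [{
                    "DestinationBackupVaultArn":
                        "arn:aws:backup:eu-west-1:111122223333:backup-vault:Default"
                }],
            }],
        }
    )

    # Assign the EFS file system (hypothetical ARN) to the plan.
    backup.create_backup_selection(
        BackupPlanId=plan["BackupPlanId"],
        BackupSelection={
            "SelectionName": "efs-selection",
            "IamRoleArn": "arn:aws:iam::111122223333:role/BackupServiceRole",
            "Resources": [
                "arn:aws:elasticfilesystem:us-east-1:111122223333:file-system/fs-0123456789abcdef0"
            ],
        },
    )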

Question 57

A finance company is migrating its trading platform to AWS. The trading platform processes a high volume of market data and processes stock trades. The company needs to establish a consistent, low-latency network connection from its on-premises data center to AWS.

The company will host resources in a VPC. The solution must not use the public internet.

Which solution will meet these requirements?

Options:

A.

Use AWS Client VPN to connect the on-premises data center to AWS.

B.

Use AWS Direct Connect to set up a connection from the on-premises data center to AWS.

C.

Use AWS PrivateLink to set up a connection from the on-premises data center to AWS.

D.

Use AWS Site-to-Site VPN to connect the on-premises data center to AWS.

Question 58

A company wants to share data that is collected from self-driving cars with the automobile community. The data will be made available from within an Amazon S3 bucket. The company wants to minimize its cost of making this data available to other AWS accounts.

What should a solutions architect do to accomplish this goal?

Options:

A.

Create an S3 VPC endpoint for the bucket.

B.

Configure the S3 bucket to be a Requester Pays bucket.

C.

Create an Amazon CloudFront distribution in front of the S3 bucket.

D.

Require that the files be accessible only with the use of the BitTorrent protocol.

Question 59

A company hosts an application on AWS. The application gives users the ability to upload photos and store the photos in an Amazon S3 bucket. The company wants to use Amazon CloudFront and a custom domain name to upload the photo files to the S3 bucket in the eu-west-1 Region.

Which solution will meet these requirements? (Select TWO.)

Options:

A.

Use AWS Certificate Manager (ACM) to create a public certificate in the us-east-1 Region. Use the certificate in CloudFront.

B.

Use AWS Certificate Manager (ACM) to create a public certificate in eu-west-1. Use the certificate in CloudFront.

C.

Configure Amazon S3 to allow uploads from CloudFront. Configure S3 Transfer Acceleration.

D.

Configure Amazon S3 to allow uploads from CloudFront origin access control (OAC).

E.

Configure Amazon S3 to allow uploads from CloudFront. Configure an Amazon S3 website endpoint.

Question 60

A company has several on-premises Internet Small Computer Systems Interface (iSCSI) network storage servers. The company wants to reduce the number of these servers by moving to the AWS Cloud. A solutions architect must provide low-latency access to frequently used data and reduce the dependency on on-premises servers with a minimal number of infrastructure changes.

Which solution will meet these requirements?

Options:

A.

Deploy an Amazon S3 File Gateway.

B.

Deploy Amazon Elastic Block Store (Amazon EBS) storage with backups to Amazon S3.

C.

Deploy an AWS Storage Gateway volume gateway that is configured with stored volumes.

D.

Deploy an AWS Storage Gateway volume gateway that is configured with cached volumes.

Question 61

A company is redesigning a static website. The company needs a solution to host the new website in the company's AWS account. The solution must be secure and scalable.

Which combination of solutions will meet these requirements? (Select THREE.)

Options:

A.

Configure an Amazon CloudFront distribution. Set the Amazon S3 bucket as the origin.

B.

Associate an AWS Certificate Manager (ACM) TLS certificate to the Amazon CloudFront distribution.

C.

Enable static website hosting for the Amazon S3 bucket.

D.

Create an Amazon S3 bucket to store the static website content.

E.

Export the website's SSL/TLS certificate from AWS Certificate Manager (ACM) to the root of the Amazon S3 bucket.

F.

Turn off Block Public Access for the Amazon S3 bucket.

Question 62

A company plans to rehost an application to Amazon EC2 instances that use Amazon Elastic Block Store (Amazon EBS) as the attached storage.

A solutions architect must design a solution to ensure that all newly created Amazon EBS volumes are encrypted by default. The solution must also prevent the creation of unencrypted EBS volumes.

Which solution will meet these requirements?

Options:

A.

Configure the EC2 account attributes to always encrypt new EBS volumes.

B.

Use AWS Config. Configure the encrypted-volumes rule identifier. Apply the default AWS Key Management Service (AWS KMS) key.

C.

Configure AWS Systems Manager to create encrypted copies of the EBS volumes. Reconfigure the EC2 instances to use the encrypted volumes.

D.

Create a customer managed key in AWS Key Management Service (AWS KMS). Configure AWS Migration Hub to use the key when the company migrates workloads.
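
As an illustration of the account attribute that option A refers to, the following boto3 sketch turns on EBS encryption by default. The setting is per Region, and the KMS key alias is a hypothetical example.

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Every EBS volume created in this Region from now on is encrypted.
    ec2.enable_ebs_encryption_by_default()

    # Optionally point the default at a specific KMS key (hypothetical alias).
    ec2.modify_ebs_default_kms_key_id(KmsKeyId="alias/my-ebs-key")

    # Verify the setting.
    print(ec2.get_ebs_encryption_by_default()["EbsEncryptionByDefault"])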

Question 63

A company maintains its accounting records in a custom application that runs on Amazon EC2 instances. The company needs to migrate the data to an AWS managed service for development and maintenance of the application data. The solution must require minimal operational support and provide immutable, cryptographically verifiable logs of data changes.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Copy the records from the application into an Amazon Redshift cluster.

B.

Copy the records from the application into an Amazon Neptune cluster.

C.

Copy the records from the application into an Amazon Timestream database.

D.

Copy the records from the application into an Amazon Quantum Ledger Database (Amazon QLDB) ledger.

Question 64

A company needs to migrate its customer transactions database from on-premises to AWS. The database resides on an Oracle DB instance that runs on a Linux server. According to a new security requirement, the company must rotate the database password each year.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Convert the database to Amazon DynamoDB by using AWS Schema Conversion Tool (AWS SCT). Store the password in AWS Systems Manager Parameter Store. Create an Amazon CloudWatch alarm to invoke an AWS Lambda function for yearly password rotation.

B.

Migrate the database to Amazon RDS for Oracle. Store the password in AWS Secrets Manager. Turn on automatic rotation. Configure a yearly rotation schedule.

C.

Migrate the database to an Amazon EC2 instance. Use AWS Systems Manager Parameter Store to keep and rotate the connection string by using an AWS Lambda function on a yearly schedule.

D.

Migrate the database to Amazon Neptune by using AWS Schema Conversion Tool (AWS SCT). Create an Amazon CloudWatch alarm to invoke an AWS Lambda function for yearly password rotation.
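
To show the rotation mechanism that option B describes, here is a minimal boto3 sketch that attaches a rotation function and a yearly schedule to a secret. The secret name and Lambda ARN are placeholders, and the rotation function itself must implement the standard Secrets Manager rotation steps.

    import boto3

    secrets = boto3.client("secretsmanager")

    secrets.rotate_secret(
        SecretId="prod/oracle/app-user",
        RotationLambdaARN=(
            "arn:aws:lambda:us-east-1:111122223333:function:OracleRotation"
        ),
        RotationRules={"ScheduleExpression": "rate(365 days)"},  # yearly
    )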

Question 65

A company provides a trading platform to customers. The platform uses an Amazon API Gateway REST API, AWS Lambda functions, and an Amazon DynamoDB table. Each trade that the platform processes invokes a Lambda function that stores the trade data in Amazon DynamoDB. The company wants to ingest trade data into a data lake in Amazon S3 for near real-time analysis. Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Use Amazon DynamoDB Streams to capture the trade data changes. Configure DynamoDB Streams to invoke a Lambda function that writes the data to Amazon S3.

B.

Use Amazon DynamoDB Streams to capture the trade data changes. Configure DynamoDB Streams to invoke a Lambda function that writes the data to Amazon Data Firehose. Write the data from Data Firehose to Amazon S3.

C.

Enable Amazon Kinesis Data Streams on the DynamoDB table to capture the trade data changes. Configure Kinesis Data Streams to invoke a Lambda function that writes the data to Amazon S3.

D.

Enable Amazon Kinesis Data Streams on the DynamoDB table to capture the trade data changes. Configure a data stream to be the input for Amazon Data Firehose. Write the data from Data Firehose to Amazon S3.
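
Option B's middle step, a Lambda function that forwards DynamoDB Streams records to a Firehose stream, could look like the sketch below. The delivery stream name is a placeholder, and the record shape assumes the stream is configured to include new item images.

    import json
    import boto3

    firehose = boto3.client("firehose")
    STREAM_NAME = "trades-to-s3"  # hypothetical Firehose delivery stream

    def handler(event, context):
        # Forward each new item image from the DynamoDB stream to Firehose.
        records = []
        for record in event["Records"]:
            new_image = record["dynamodb"].get("NewImage", {})
            records.append({"Data": (json.dumps(new_image) + "\n").encode("utf-8")})
        if records:
            # put_record_batch accepts up to 500 records per call.
            firehose.put_record_batch(DeliveryStreamName=STREAM_NAME, Records=records)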

Question 66

A company wants to implement new security compliance requirements for its development team to limit the use of approved Amazon Machine Images (AMIs).

The company wants to provide access to only the approved operating system and software for all its Amazon EC2 instances. The company wants the solution to have the least amount of lead time for launching EC2 instances.

Which solution will meet these requirements?

Options:

A.

Create a portfolio by using AWS Service Catalog that includes only EC2 instances launched with approved AMIs. Ensure that all required software is preinstalled on the AMIs. Create the necessary permissions for developers to use the portfolio.

B.

Create an AMI that contains the approved operating system and software by using EC2 Image Builder. Give developers access to that AMI to launch the EC2 instances.

C.

Create an AMI that contains the approved operating system. Tell the developers to use the approved AMI. Create an Amazon EventBridge rule to run an AWS Systems Manager script when a new EC2 instance is launched. Configure the script to install the required software from a repository.

D.

Create an AWS Config rule to detect the launch of EC2 instances with an AMI that is not approved. Associate a remediation rule to terminate those instances and launch the instances again with the approved AMI. Use AWS Systems Manager to automatically install the approved software on the launch of an EC2 instance.

Question 67

A company collects data from sensors. The company needs a cloud-based solution to store and transform the sensor data to make critical decisions. The solution must store the data for up to 2 days. After 2 days, the solution must delete the data. The company needs to use the transformed data in an automated workflow that has manual approval steps.

Which solution will meet these requirements?

Options:

A.

Load the data into an Amazon Simple Queue Service (Amazon SQS) queue that has a retention period of 2 days. Use an Amazon EventBridge pipe to retrieve data from the queue, transform the data, and pass the data to an AWS Step Functions workflow.

B.

Load the data into AWS DataSync. Delete the DataSync task after 2 days. Invoke an AWS Lambda function to retrieve the data, transform the data, and invoke a second Lambda function that performs the remaining workflow steps.

C.

Load the data into an Amazon Simple Notification Service (Amazon SNS) topic. Use an Amazon EventBridge pipe to retrieve the data from the topic, transform the data, and send the data to Amazon EC2 instances to perform the remaining workflow steps.

D.

Load the data into an Amazon Simple Notification Service (Amazon SNS) topic. Use an Amazon EventBridge pipe to retrieve the data from the topic and transform the data into an appropriate format for an Amazon SQS queue. Use an AWS Lambda function to poll the queue to perform the remaining workflow steps.

Question 68

A company needs to grant a team of developers access to the company's AWS resources. The company must maintain a high level of security for the resources.

The company requires an access control solution that will prevent unauthorized access to the sensitive data.

Which solution will meet these requirements?

Options:

A.

Share the IAM user credentials for each development team member with the rest of the team to simplify access management and to streamline development workflows.

B.

Define IAM roles that have fine-grained permissions based on the principle of least privilege. Assign an IAM role to each developer.

C.

Create IAM access keys to grant programmatic access to AWS resources. Allow only developers to interact with AWS resources through API calls by using the access keys.

D.

Create an Amazon Cognito user pool. Grant developers access to AWS resources by using the user pool.

Question 69

A company is using microservices to build an ecommerce application on AWS. The company wants to preserve customer transaction information after customers submit orders. The company wants to store transaction data in an Amazon Aurora database. The company expects sales volumes to vary throughout each year.

Which solution will meet these requirements?

Options:

A.

Use an Amazon API Gateway REST API to invoke an AWS Lambda function to send transaction data to the Aurora database. Send transaction data to an Amazon Simple Queue Service (Amazon SQS) queue that has a dead-letter queue. Use a second Lambda function to read from the SQS queue and to update the Aurora database.

B.

Use an Amazon API Gateway HTTP API to send transaction data to an Application Load Balancer (ALB). Use the ALB to send the transaction data to Amazon Elastic Container Service (Amazon ECS) on Amazon EC2. Use ECS tasks to store the data in the Aurora database.

C.

Use an Application Load Balancer (ALB) to route transaction data to Amazon Elastic Kubernetes Service (Amazon EKS). Use Amazon EKS to send the data to the Aurora database.

D.

Use Amazon Data Firehose to send transaction data to Amazon S3. Use AWS Database Migration Service (AWS DMS) to migrate the data from Amazon S3 to the Aurora database.

Question 70

A company launches a new web application that uses an Amazon Aurora PostgreSQL database. The company wants to add new features to the application that rely on AI. The company requires vector storage capability to use AI tools.

Which solution will meet this requirement MOST cost-effectively?

Options:

A.

Use Amazon OpenSearch Service to create an OpenSearch service. Configure the application to write vector embeddings to a vector index.

B.

Create an Amazon DocumentDB cluster. Configure the application to write vector embeddings to a vector index.

C.

Create an Amazon Neptune ML cluster. Configure the application to write vector embeddings to a vector graph.

D.

Install the pgvector extension on the Aurora PostgreSQL database. Configure the application to write vector embeddings to a vector table.
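
For option D, the pgvector workflow could look like the following sketch (using psycopg2; the endpoint, credentials, and table are placeholders, and pgvector availability depends on the Aurora PostgreSQL version).

    import psycopg2

    # Placeholder connection details for the Aurora PostgreSQL endpoint.
    conn = psycopg2.connect(
        host="my-cluster.cluster-abc123.us-east-1.rds.amazonaws.com",
        dbname="app", user="app_user", password="example-password",
    )
    cur = conn.cursor()

    # Enable the extension and create a table with a vector column.
    cur.execute("CREATE EXTENSION IF NOT EXISTS vector;")
    cur.execute("""
        CREATE TABLE IF NOT EXISTS product_embeddings (
            product_id bigint PRIMARY KEY,
            embedding  vector(1536)
        );
    """)

    # Nearest-neighbor lookup by Euclidean distance; a real query must pass
    # a full 1536-dimension vector literal, truncated here for brevity.
    cur.execute(
        "SELECT product_id FROM product_embeddings "
        "ORDER BY embedding <-> %s::vector LIMIT 5;",
        ("[0.1, 0.2, 0.3]",),
    )
    print(cur.fetchall())
    conn.commit()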

Question 71

An ecommerce company runs its application on AWS. The application uses an Amazon Aurora PostgreSQL cluster in Multi-AZ mode for the underlying database. During a recent promotional campaign, the application experienced heavy read load and write load. Users experienced timeout issues when they attempted to access the application.

A solutions architect needs to make the application architecture more scalable and highly available.

Which solution will meet these requirements with the LEAST downtime?

Options:

A.

Create an Amazon EventBridge rule that has the Aurora cluster as a source. Create an AWS Lambda function to log the state change events of the Aurora cluster. Add the Lambda function as a target for the EventBridge rule. Add additional reader nodes to fail over to.

B.

Modify the Aurora cluster and activate the zero-downtime restart (ZDR) feature. Use Database Activity Streams on the cluster to track the cluster status.

C.

Add additional reader instances to the Aurora cluster. Create an Amazon RDS Proxy target group for the Aurora cluster.

D.

Create an Amazon ElastiCache for Redis cache. Replicate data from the Aurora cluster to Redis by using AWS Database Migration Service (AWS DMS) with a write-around approach.

Question 72

Question:

An ecommerce company hosts an API that handles sales requests. The company hosts the API frontend on Amazon EC2 instances that run behind an Application Load Balancer (ALB). The company hosts the API backend on EC2 instances that perform the transactions. The backend tiers are loosely coupled by an Amazon Simple Queue Service (Amazon SQS) queue.

The company anticipates a significant increase in request volume during a new product launch event. The company wants to ensure that the API can handle increased loads successfully.

Which solution will meet these requirements?

Options:

A.

Double the number of frontend and backend EC2 instances to handle the increased traffic during the product launch event. Create a dead-letter queue to retain unprocessed sales requests when the demand exceeds the system capacity.

B.

Place the frontend EC2 instances into an Auto Scaling group. Create an Auto Scaling policy to launch new instances to handle the incoming network traffic.

C.

Place the frontend EC2 instances into an Auto Scaling group. Add an Amazon ElastiCache cluster in front of the ALB to reduce the amount of traffic the API needs to handle.

D.

Place the frontend and backend EC2 instances into separate Auto Scaling groups. Create a policy for the frontend Auto Scaling group to launch instances based on incoming network traffic. Create a policy for the backend Auto Scaling group to launch instances based on the SQS queue backlog.
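
The backend policy in option D can key off the SQS backlog. A minimal boto3 sketch follows; the group name, queue name, and target value are placeholders, and AWS's recommended refinement is to track a published "backlog per instance" custom metric rather than the raw queue depth.

    import boto3

    autoscaling = boto3.client("autoscaling")

    autoscaling.put_scaling_policy(
        AutoScalingGroupName="api-backend-asg",
        PolicyName="scale-on-queue-backlog",
        PolicyType="TargetTrackingScaling",
        TargetTrackingConfiguration={
            "CustomizedMetricSpecification": {
                "MetricName": "ApproximateNumberOfMessagesVisible",
                "Namespace": "AWS/SQS",
                "Dimensions": [{"Name": "QueueName", "Value": "sales-requests"}],
                "Statistic": "Average",
            },
            "TargetValue": 100.0,  # illustrative backlog target
        },
    )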

Question 73

A company uses AWS Cost Explorer to monitor its AWS costs. The company notices that Amazon Elastic Block Store (Amazon EBS) storage and snapshot costs increase every month. However, the company does not purchase additional EBS storage every month. The company wants to optimize monthly costs for its current storage usage.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Use logs in Amazon CloudWatch Logs to monitor the storage utilization of Amazon EBS. Use Amazon EBS Elastic Volumes to reduce the size of the EBS volumes.

B.

Use a custom script to monitor space usage. Use Amazon EBS Elastic Volumes to reduce the size of the EBS volumes.

C.

Delete all expired and unused snapshots to reduce snapshot costs.

D.

Delete all nonessential snapshots. Use Amazon Data Lifecycle Manager to create and manage the snapshots according to the company's snapshot policy requirements.

Question 74

A company discovers that an Amazon DynamoDB Accelerator (DAX) cluster for the company's web application workload is not encrypting data at rest. The company needs to resolve the security issue.

Which solution will meet this requirement?

Options:

A.

Stop the existing DAX cluster. Enable encryption at rest for the existing DAX cluster, and start the cluster again.

B.

Delete the existing DAX cluster. Recreate the DAX cluster, and configure the new cluster to encrypt the data at rest.

C.

Update the configuration of the existing DAX cluster to encrypt the data at rest.

D.

Integrate the existing DAX cluster with AWS Security Hub to automatically enable encryption at rest.

Question 75

A company is developing a new application that will run on Amazon EC2 instances. The application needs to access multiple AWS services.

The company needs to ensure that the application will not use long-term access keys to access AWS services.

Which solution will meet these requirements?

Options:

A.

Create an IAM user. Assign the IAM user to the application. Create programmatic access keys for the IAM user. Embed the access keys in the application code.

B.

Create an IAM user that has programmatic access keys. Store the access keys in AWS Secrets Manager. Configure the application to retrieve the keys from Secrets Manager when the application runs.

C.

Create an IAM role that can access AWS Systems Manager Parameter Store. Associate the role with each EC2 instance profile. Create IAM access keys for the AWS services, and store the keys in Parameter Store. Configure the application to retrieve the keys from Parameter Store when the application runs.

D.

Create an IAM role that has permissions to access the required AWS services. Associate the IAM role with each EC2 instance profile.

Question 76

A consulting company provides professional services to customers worldwide. The company provides solutions and tools for customers to expedite gathering and analyzing data on AWS. The company needs to centrally manage and deploy a common set of solutions and tools for customers to use for self-service purposes.

Which solution will meet these requirements?

Options:

A.

Create AWS CloudFormation templates for the customers.

B.

Create AWS Service Catalog products for the customers.

C.

Create AWS Systems Manager templates for the customers.

D.

Create AWS Config items for the customers.

Question 77

A company has Amazon EC2 instances in multiple AWS Regions. The instances all store and retrieve confidential data from the same Amazon S3 bucket. The company wants to improve the security of its current architecture.

The company wants to ensure that only the Amazon EC2 instances within its VPC can access the S3 bucket. The company must block all other access to the bucket.

Which solution will meet this requirement?

Options:

A.

Use IAM policies to restrict access to the S3 bucket.

B.

Use server-side encryption (SSE) to encrypt data in the S3 bucket at rest. Store the encryption key on the EC2 instances.

C.

Create a VPC endpoint for Amazon S3. Configure an S3 bucket policy to allow connections only from the endpoint.

D.

Use AWS Key Management Service (AWS KMS) with customer-managed keys to encrypt the data before sending the data to the S3 bucket.
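
The bucket policy in option C typically uses the aws:SourceVpce condition key. A minimal sketch, with a placeholder bucket name and endpoint ID:

    import json
    import boto3

    s3 = boto3.client("s3")

    # Deny every request that does not arrive through the named VPC endpoint.
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "AllowOnlyFromVpcEndpoint",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::confidential-bucket",
                "arn:aws:s3:::confidential-bucket/*",
            ],
            "Condition": {
                "StringNotEquals": {"aws:SourceVpce": "vpce-0123456789abcdef0"}
            },
        }],
    }
    s3.put_bucket_policy(Bucket="confidential-bucket", Policy=json.dumps(policy))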

Question 78

A company recently migrated its application to AWS. The application runs on Amazon EC2 Linux instances in an Auto Scaling group across multiple Availability Zones. The application stores data in an Amazon Elastic File System (Amazon EFS) file system that uses EFS Standard-Infrequent Access storage. The application indexes the company's files, and the index is stored in an Amazon RDS database.

The company needs to optimize storage costs with some application and services changes.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Create an Amazon S3 bucket that uses an Intelligent-Tiering lifecycle policy. Copy all files to the S3 bucket. Update the application to use Amazon S3 API to store and retrieve files.

B.

Deploy Amazon FSx for Windows File Server file shares. Update the application to use CIFS protocol to store and retrieve files.

C.

Deploy Amazon FSx for OpenZFS file system shares. Update the application to use the new mount point to store and retrieve files.

D.

Create an Amazon S3 bucket that uses S3 Glacier Flexible Retrieval. Copy all files to the S3 bucket. Update the application to use Amazon S3 API to store and retrieve files as standard retrievals.

Question 79

A medical company wants to perform transformations on a large amount of clinical trial data that comes from several customers. The company must extract the data from a relational database that contains the customer data. Then the company will transform the data by using a series of complex rules. The company will load the data to Amazon S3 when the transformations are complete.

All data must be encrypted where it is processed before the company stores the data in Amazon S3. All data must be encrypted by using customer-specific keys.

Which solution will meet these requirements with the LEAST amount of operational effort?

Options:

A.

Create one AWS Glue job for each customer. Attach a security configuration to each job that uses server-side encryption with Amazon S3 managed keys (SSE-S3) to encrypt the data.

B.

Create one Amazon EMR cluster for each customer. Attach a security configuration to each cluster that uses client-side encryption with a custom client-side root key (CSE-Custom) to encrypt the data.

C.

Create one AWS Glue job for each customer. Attach a security configuration to each job that uses client-side encryption with AWS KMS managed keys (CSE-KMS) to encrypt the data.

D.

Create one Amazon EMR cluster for each customer. Attach a security configuration to each cluster that uses server-side encryption with AWS KMS keys (SSE-KMS) to encrypt the data.

Question 80

A company runs a critical public application on Amazon Elastic Kubernetes Service (Amazon EKS) clusters. The application has a microservices architecture. The company needs to implement a solution that collects, aggregates, and summarizes metrics and logs from the application in a centralized location.

Which solution will meet these requirements in the MOST operationally efficient way?

Options:

A.

Run the Amazon CloudWatch agent in the existing EKS cluster. Use a CloudWatch dashboard to view the metrics and logs.

B.

Configure a data stream in Amazon Kinesis Data Streams. Use Amazon Kinesis Data Firehose to read events and to deliver the events to an Amazon S3 bucket. Use Amazon Athena to view the events.

C.

Configure AWS CloudTrail to capture data events. Use Amazon OpenSearch Service to query CloudTrail.

D.

Configure Amazon CloudWatch Container Insights in the existing EKS cluster. Use a CloudWatch dashboard to view the metrics and logs.

Question 81

A company hosts an application in an Amazon EC2 Auto Scaling group. The company has observed that during periods of high demand, new instances take too long to join the Auto Scaling group and serve the increased demand. The company determines that the root cause of the issue is the long boot time of the instances in the Auto Scaling group. The company needs to reduce the time required to launch new instances to respond to demand. Which solution will meet this requirement?

Options:

A.

Increase the maximum capacity of the Auto Scaling group by 50%.

B.

Create a warm pool for the Auto Scaling group. Use the default specification for the warm pool size.

C.

Increase the health check grace period for the Auto Scaling group by 50%.

D.

Create a scheduled scaling action. Set the desired capacity equal to the maximum capacity of the Auto Scaling group.
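
Creating the warm pool from option B is a one-call operation. In the sketch below, omitting the size parameters keeps the default specification, so the pool tracks the gap between the group's maximum and desired capacity; the group name is a placeholder.

    import boto3

    autoscaling = boto3.client("autoscaling")

    autoscaling.put_warm_pool(
        AutoScalingGroupName="web-asg",
        PoolState="Stopped",  # pre-initialized instances wait in a stopped state
    )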

Question 82

A solutions architect is designing an application that helps users fill out and submit registration forms. The solutions architect plans to use a two-tier architecture that includes a web application server tier and a worker tier.

The application needs to process submitted forms quickly. The application needs to process each form exactly once. The solution must ensure that no data is lost.

Which solution will meet these requirements?

Options:

A.

Use an Amazon Simple Queue Service (Amazon SQS) FIFO queue between the web application server tier and the worker tier to store and forward form data.

B.

Use an Amazon API Gateway HTTP API between the web application server tier and the worker tier to store and forward form data.

C.

Use an Amazon Simple Queue Service (Amazon SQS) standard queue between the web application server tier and the worker tier to store and forward form data.

D.

Use an AWS Step Functions workflow. Create a synchronous workflow between the web application server tier and the worker tier that stores and forwards form data.
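
Sending to the FIFO queue from option A involves a message group ID for ordering and a deduplication ID for exactly-once delivery semantics; consumers should delete a message only after processing it successfully. The queue name and field values below are placeholders.

    import boto3

    sqs = boto3.client("sqs")
    queue_url = sqs.get_queue_url(QueueName="registration-forms.fifo")["QueueUrl"]

    sqs.send_message(
        QueueUrl=queue_url,
        MessageBody='{"form_id": "12345", "name": "Jane Doe"}',
        MessageGroupId="registration-forms",
        MessageDeduplicationId="form-12345",  # unique per submission
    )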

Question 83

A company is migrating a large amount of data from on-premises storage to AWS. Windows, Mac, and Linux based Amazon EC2 instances in the same AWS Region will access the data by using SMB and NFS storage protocols. The company will access a portion of the data routinely. The company will access the remaining data infrequently.

The company needs to design a solution to host the data.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Create an Amazon Elastic File System (Amazon EFS) volume that uses EFS Intelligent-Tiering. Use AWS DataSync to migrate the data to the EFS volume.

B.

Create an Amazon FSx for ONTAP instance. Create an FSx for ONTAP file system with a root volume that uses the auto tiering policy. Migrate the data to the FSx for ONTAP volume.

C.

Create an Amazon S3 bucket that uses S3 Intelligent-Tiering. Migrate the data to the S3 bucket by using an AWS Storage Gateway Amazon S3 File Gateway.

D.

Create an Amazon FSx for OpenZFS file system. Migrate the data to the new volume.

Question 84

A company uses an Amazon S3 bucket as its data lake storage platform. The S3 bucket contains a massive amount of data that is accessed randomly by multiple teams and hundreds of applications. The company wants to reduce the S3 storage costs and provide immediate availability for frequently accessed objects.

What is the MOST operationally efficient solution that meets these requirements?

Options:

A.

Create an S3 Lifecycle rule to transition objects to the S3 Intelligent-Tiering storage class.

B.

Store objects in Amazon S3 Glacier. Use S3 Select to provide applications with access to the data.

C.

Use data from S3 storage class analysis to create S3 Lifecycle rules to automatically transition objects to the S3 Standard-Infrequent Access (S3 Standard-IA) storage class.

D.

Transition objects to the S3 Standard-Infrequent Access (S3 Standard-IA) storage class. Create an AWS Lambda function to transition objects to the S3 Standard storage class when they are accessed by an application.

Question 85

A company runs an application in a VPC on AWS. The company's on-premises data center has a DNS server. The data center is connected to AWS through an AWS Direct Connect connection with a private virtual interface (VIF). The on-premises DNS server needs to resolve the DNS name of the application in the VPC.

Which solution will meet this requirement?

Options:

A.

Set up AWS Verified Access endpoints in the VPC. Configure DNS forwarding rules in Verified Access. Configure the on-premises DNS server to forward DNS queries through the Verified Access endpoints.

B.

Configure the Direct Connect connection to enable DNS resolution between the on-premises DNS server and the application in the VPC.

C.

Create an Amazon Route 53 Resolver outbound endpoint and a Resolver rule in the VPC. Configure the on-premises DNS server to send requests for the application to the outbound endpoint.

D.

Create an Amazon Route 53 Resolver inbound endpoint in the VPC. Configure the on-premises DNS server to send requests for the application to the inbound endpoint.
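
Option D's inbound endpoint can be created as in the sketch below. The endpoint exposes resolver IP addresses inside the VPC that the on-premises DNS server forwards queries to over the Direct Connect link; subnet and security group IDs are placeholders, and endpoints are typically given IP addresses in at least two subnets for availability.

    import boto3

    resolver = boto3.client("route53resolver")

    resolver.create_resolver_endpoint(
        CreatorRequestId="app-dns-inbound-001",
        Name="app-inbound",
        Direction="INBOUND",
        SecurityGroupIds=["sg-0123456789abcdef0"],
        IpAddresses=[{"SubnetId": "subnet-0aaa"}, {"SubnetId": "subnet-0bbb"}],
    )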

Question 86

A company runs a Windows-based ecommerce application on Amazon EC2 instances. The application has a very high transaction rate. The company requires a durable storage solution that can deliver 200,000 IOPS for each EC2 instance.

Which solution will meet these requirements?

Options:

A.

Host the application on EC2 instances that have Provisioned IOPS SSD (io2) Block Express Amazon Elastic Block Store (Amazon EBS) volumes attached.

B.

Install the application on an Amazon EMR cluster. Use Hadoop Distributed File System (HDFS) with General Purpose SSD (gp3) Amazon Elastic Block Store (Amazon EBS) volumes.

C.

Use Amazon FSx for Lustre as shared storage across the EC2 instances that run the application.

D.

Host the application on EC2 instances that have SSD instance store volumes and General Purpose SSD (gp3) Amazon Elastic Block Store (Amazon EBS) volumes attached.

Question 87

An ecommerce company runs an application that uses an Amazon DynamoDB table in a single AWS Region. The company wants to deploy the application to a second Region. The company needs to support multi-active replication with low latency reads and writes to the existing DynamoDB table in both Regions.

Which solution will meet these requirements in the MOST operationally efficient way?

Options:

A.

Create a DynamoDB global secondary index (GSI) for the existing table. Create a new table in the second Region. Convert the existing DynamoDB table to a global table. Specify the new table as the secondary table.

B.

Enable Amazon DynamoDB Streams for the existing table. Create a new table in the second Region. Create a new application that uses the DynamoDB Streams Kinesis Adapter and the Amazon Kinesis Client Library (KCL). Configure the new application to read data from the DynamoDB table in the first Region and to write the data to the new table in the second Region.

C.

Convert the existing DynamoDB table to a global table. Choose the appropriate second Region to achieve active-active write capabilities in both Regions.

D.

Enable Amazon DynamoDB Streams for the existing table. Create a new table in the second Region. Create an AWS Lambda function in the first Region that reads data from the table in the first Region and writes the data to the new table in the second Region. Set a DynamoDB stream as the input trigger for the Lambda function.
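
Converting the table as option C describes can be done by adding a replica, as in this minimal sketch. The table must use the current global tables version (2019.11.21) and have DynamoDB Streams enabled with new and old images; the names and Regions are placeholders.

    import boto3

    dynamodb = boto3.client("dynamodb", region_name="us-east-1")

    dynamodb.update_table(
        TableName="orders",
        ReplicaUpdates=[{"Create": {"RegionName": "eu-west-1"}}],
    )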

Question 88

A company needs to run its external website on Amazon EC2 instances and on-premises virtualized servers. The AWS environment has a 1 Gbps AWS Direct Connect connection to the data center. The application has IP addresses that will not change. The on-premises and AWS servers are able to restart themselves while maintaining the same IP address if a failure occurs. Some website users have to add their vendors to an allow list, so the solution must have a fixed IP address. The company needs a solution with the lowest operational overhead to handle this split traffic.

What should a solutions architect do to meet these requirements?

Options:

A.

Deploy an Amazon Route 53 Resolver with rules pointing to the on-premises and AWS IP addresses.

B.

Deploy a Network Load Balancer on AWS. Create target groups for the on-premises and AWS IP addresses.

C.

Deploy an Application Load Balancer on AWS. Register the on-premises and AWS IP addresses with the target group.

D.

Deploy Amazon API Gateway to direct traffic to the on-premises and AWS IP addresses based on the header of the request.

Question 89

A company has AWS Lambda functions that use environment variables. The company does not want its developers to see environment variables in plaintext.

Which solution will meet these requirements?

Options:

A.

Deploy code to Amazon EC2 instances instead of using Lambda functions.

B.

Configure SSL encryption on the Lambda functions to use AWS CloudHSM to store and encrypt the environment variables.

C.

Create a certificate in AWS Certificate Manager (ACM). Configure the Lambda functions to use the certificate to encrypt the environment variables.

D.

Create an AWS Key Management Service (AWS KMS) key. Enable encryption helpers on the Lambda functions to use the KMS key to store and encrypt the environment variables.
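
With the encryption helpers from option D, the environment variable holds ciphertext and the function decrypts it at runtime. The sketch below mirrors the pattern the Lambda console generates; the variable name is illustrative.

    import os
    import base64
    import boto3

    kms = boto3.client("kms")

    DB_PASSWORD = kms.decrypt(
        CiphertextBlob=base64.b64decode(os.environ["DB_PASSWORD"]),
        EncryptionContext={
            "LambdaFunctionName": os.environ["AWS_LAMBDA_FUNCTION_NAME"]
        },
    )["Plaintext"].decode("utf-8")

    def handler(event, context):
        # ... use DB_PASSWORD to open a connection ...
        return {"status": "ok"}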

Question 90

Question:

A company wants to migrate an application that uses a microservice architecture to AWS. The services currently run on Docker containers on-premises. The application has an event-driven architecture that uses Apache Kafka. The company configured Kafka to use multiple queues to send and receive messages. Some messages must be processed by multiple services. Which solution will meet these requirements with the LEAST management overhead?

Options:

A.

Migrate the services to Amazon Elastic Container Service (Amazon ECS) with the Amazon EC2 launch type. Deploy a Kafka cluster on EC2 instances to handle service-to-service communication.

B.

Migrate the services to Amazon Elastic Container Service (Amazon ECS) with the AWS Fargate launch type. Create multiple Amazon Simple Queue Service (Amazon SQS) queues to handle service-to-service communication.

C.

Migrate the services to Amazon Elastic Container Service (Amazon ECS) with the AWS Fargate launch type. Deploy an Amazon Managed Streaming for Apache Kafka (Amazon MSK) cluster to handle service-to-service communication.

D.

Migrate the services to Amazon Elastic Container Service (Amazon ECS) with the Amazon EC2 launch type. Use Amazon EventBridge to handle service-to-service communication.

Question 91

An ecommerce company is launching a new marketing campaign. The company anticipates that the campaign will generate ten times the normal number of daily orders through the company's ecommerce application. The campaign will last 3 days.

The ecommerce application architecture is based on Amazon EC2 instances in an Auto Scaling group and an Amazon RDS for MySQL database. The application writes order transactions to an Amazon Elastic File System (Amazon EFS) file system before the application writes orders to the database. During normal operations, the application write operations peak at 5,000 IOPS.

A solutions architect needs to ensure that the application can handle the anticipated workload during the marketing campaign.

Which solution will meet this requirement?

Options:

A.

For the duration of the campaign, increase the provisioned IOPS for the RDS for MySQL database. Set the Amazon EFS throughput mode to Bursting throughput.

B.

For the duration of the campaign, increase the provisioned IOPS for the RDS for MySQL database. Set the Amazon EFS throughput mode to Elastic throughput.

C.

Convert the database to a Multi-AZ deployment. Set the Amazon EFS throughput mode to Elastic throughput for the duration of the campaign.

D.

Use AWS Database Migration Service (AWS DMS) to convert the database to RDS for PostgreSQL. Set the Amazon EFS throughput mode to Bursting throughput.
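
The two adjustments in option B map to two API calls, sketched below. The identifiers and the IOPS figure are placeholders, and raising provisioned IOPS assumes the DB instance uses a storage type that supports it.

    import boto3

    efs = boto3.client("efs")
    rds = boto3.client("rds")

    # Let EFS throughput scale automatically with the campaign's spikes.
    efs.update_file_system(
        FileSystemId="fs-0123456789abcdef0",
        ThroughputMode="elastic",
    )

    # Raise provisioned IOPS on the database for the campaign window.
    rds.modify_db_instance(
        DBInstanceIdentifier="orders-mysql",
        Iops=50000,
        ApplyImmediately=True,
    )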

Question 92

A company needs to migrate a MySQL database from an on-premises data center to AWS within 2 weeks. The database is 180 TB in size. The company cannot partition the database.

The company wants to minimize downtime during the migration. The company's internet connection speed is 100 Mbps.

Which solution will meet these requirements?

Options:

A.

Order an AWS Snowball Edge Storage Optimized device. Use AWS Database Migration Service (AWS DMS) and the AWS Schema Conversion Tool (AWS SCT) to migrate the database to Amazon RDS for MySQL and replicate ongoing changes. Send the Snowball Edge device back to AWS to finish the migration. Continue to replicate ongoing changes.

B.

Establish an AWS Site-to-Site VPN connection between the data center and AWS. Use AWS Database Migration Service (AWS DMS) and the AWS Schema Conversion Tool (AWS SCT) to migrate the database to Amazon RDS for MySQL and replicate ongoing changes.

C.

Establish a 10 Gbps dedicated AWS Direct Connect connection between the data center and AWS. Use AWS DataSync to replicate the database to Amazon S3. Create a script to import the data from Amazon S3 to a new Amazon RDS for MySQL database instance.

D.

Use the company's existing internet connection. Use AWS DataSync to replicate the database to Amazon S3. Create a script to import the data from Amazon S3 to a new Amazon RDS for MySQL database instance.

Question 93

A company is building a serverless application to process large video files that users upload. The application performs multiple tasks to process each video file. Processing can take up to 30 minutes for the largest files.

The company needs a scalable architecture to support the processing application.

Which solution will meet these requirements?

Options:

A.

Store the uploaded video files in Amazon Elastic File System (Amazon EFS). Configure a schedule in Amazon EventBridge Scheduler to invoke an AWS Lambda function periodically to check for new files. Configure the Lambda function to perform all the processing tasks.

B.

Store the uploaded video files in Amazon Elastic File System (Amazon EFS). Configure an Amazon EFS event notification to start an AWS Step Functions workflow that uses AWS Fargate tasks to perform the processing tasks.

C.

Store the uploaded video files in Amazon S3. Configure an Amazon S3 event notification to send an event to Amazon EventBridge when a user uploads a new video file. Configure an AWS Step Functions workflow as a target for an EventBridge rule. Use the workflow to manage AWS Fargate tasks to perform the processing tasks.

D.

Store the uploaded video files in Amazon S3. Configure an Amazon S3 event notification to invoke an AWS Lambda function when a user uploads a new video file. Configure the Lambda function to perform all the processing tasks.
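
Wiring up option C takes two pieces: EventBridge delivery on the bucket and a rule that starts the state machine. A minimal sketch, with placeholder names and ARNs:

    import json
    import boto3

    s3 = boto3.client("s3")
    events = boto3.client("events")

    # Turn on EventBridge delivery for the upload bucket.
    s3.put_bucket_notification_configuration(
        Bucket="video-uploads",
        NotificationConfiguration={"EventBridgeConfiguration": {}},
    )

    # Route "Object Created" events for the bucket to a Step Functions workflow.
    events.put_rule(
        Name="video-upload-created",
        EventPattern=json.dumps({
            "source": ["aws.s3"],
            "detail-type": ["Object Created"],
            "detail": {"bucket": {"name": ["video-uploads"]}},
        }),
    )
    events.put_targets(
        Rule="video-upload-created",
        Targets=[{
            "Id": "start-processing",
            "Arn": "arn:aws:states:us-east-1:111122223333:stateMachine:VideoPipeline",
            "RoleArn": "arn:aws:iam::111122223333:role/EventBridgeInvokeSfn",
        }],
    )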

Question 94

A company has developed an API by using an Amazon API Gateway REST API and AWS Lambda. How can the company reduce latency for users worldwide?

Options:

A.

Deploy the REST API as an edge-optimized API endpoint. Enable caching. Enable content encoding to compress data in transit.

B.

Deploy the REST API as a Regional API endpoint. Enable caching. Enable content encoding to compress data in transit.

C.

Deploy the REST API as an edge-optimized API endpoint. Enable caching. Configure reserved concurrency for Lambda functions.

D.

Deploy the REST API as a Regional API endpoint. Enable caching. Configure reserved concurrency for Lambda functions.

Question 95

A company wants to migrate applications from its on-premises servers to AWS. As a first step, the company is modifying and migrating a non-critical application to a single Amazon EC2 instance. The application will store information in an Amazon S3 bucket. The company needs to follow security best practices when deploying the application on AWS.

Which approach should the company take to allow the application to interact with Amazon S3?

Options:

A.

Create an IAM role that has administrative access to AWS. Attach the role to the EC2 instance.

B.

Create an IAM user. Attach the AdministratorAccess policy. Copy the generated access key and secret key. Within the application code, use the access key and secret key along with the AWS SDK to communicate with Amazon S3.

C.

Create an IAM role that has the necessary access to Amazon S3. Attach the role to the EC2 instance.

D.

Create an IAM user. Attach a policy that provides the necessary access to Amazon S3. Copy the generated access key and secret key. Within the application code, use the access key and secret key along with the AWS SDK to communicate with Amazon S3.

Question 96

A company has an organization in AWS Organizations that has all features enabled. The company has multiple Amazon S3 buckets in multiple AWS Regions around the world. The S3 buckets contain sensitive data.

The company needs to ensure that no personally identifiable information (PII) is stored in the S3 buckets. The company also needs a scalable solution to identify PII.

Which solution will meet these requirements?

Options:

A.

In the Organizations management account, configure an Amazon Macie administrator IAM user as the delegated administrator for the global organization. Use the Macie administrator user to configure Macie settings to scan for PII.

B.

For each Region in the Organizations management account, designate a delegated Amazon Macie administrator account. In the Macie administrator account, add all accounts in the organization. Use the Macie administrator account to enable Macie. Configure automated sensitive data discovery for all accounts in the organization.

C.

For each Region in the Organizations management account, configure a service control policy (SCP) to identify PII. Apply the SCP to the organization root.

D.

In the Organizations management account, configure AWS Lambda functions to scan for PII in each Region.

Question 97

A company is building a gaming application that needs to send unique events to multiple leaderboards, player matchmaking systems, and authentication services concurrently. The company requires an AWS-based event-driven system that delivers events in order and supports a publish-subscribe model. The gaming application must be the publisher, and the leaderboards, matchmaking systems, and authentication services must be the subscribers.

Which solution will meet these requirements?

Options:

A.

Amazon EventBridge event buses

B.

Amazon Simple Notification Service (Amazon SNS) FIFO topics

C.

Amazon Simple Notification Service (Amazon SNS) standard topics

D.

Amazon Simple Queue Service (Amazon SQS) FIFO queues
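
Publishing to the FIFO topic in option B looks like the sketch below; every subscribed SQS FIFO queue receives the event in order within its message group. The topic ARN and IDs are placeholders.

    import boto3

    sns = boto3.client("sns")

    sns.publish(
        TopicArn="arn:aws:sns:us-east-1:111122223333:game-events.fifo",
        Message='{"event": "match_won", "player": "p-42"}',
        MessageGroupId="player-p-42",         # ordering scope
        MessageDeduplicationId="evt-000123",  # unique per event
    )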

Question 98

A company runs a Microsoft Windows SMB file share on-premises to support an application. The company wants to migrate the application to AWS. The company wants to share storage across multiple Amazon EC2 instances.

Which solutions will meet these requirements with the LEAST operational overhead? (Select TWO.)

Options:

A.

Create an Amazon Elastic File System (Amazon EFS) file system with elastic throughput.

B.

Create an Amazon FSx for NetApp ONTAP file system.

C.

Use Amazon Elastic Block Store (Amazon EBS) to create a self-managed Windows file share on the instances.

D.

Create an Amazon FSx for Windows File Server file system.

E.

Create an Amazon FSx for OpenZFS file system.

Question 99

A company has an application that uses a MySQL database that runs on an Amazon EC2 instance. The instance currently runs in a single Availability Zone. The company requires a fault-tolerant database solution that provides a recovery time objective (RTO) and a recovery point objective (RPO) of 2 minutes or less. Which solution will meet these requirements?

Options:

A.

Migrate the MySQL database to Amazon RDS. Create a read replica in a second Availability Zone. Create a script that detects availability interruptions and promotes the read replica when needed.

B.

Migrate the MySQL database to Amazon RDS for MySQL. Configure the new RDS for MySQL database to use a Multi-AZ deployment.

C.

Create a second MySQL database in a second Availability Zone. Use native MySQL commands to sync the two databases every 2 minutes. Create a script that detects availability interruptions and promotes the second MySQL database when needed.

D.

Create a copy of the EC2 instance that runs the MySQL database. Deploy the copy in a second Availability Zone. Create a Network Load Balancer. Add both instances as targets.

Question 100

A company runs an order management application on AWS. The application allows customers to place orders and pay with a credit card. The company uses an Amazon CloudFront distribution to deliver the application.

A security team has set up logging for all incoming requests. The security team needs a solution to generate an alert if any user modifies the logging configuration.

Which combination of steps will meet this requirement? (Select TWO.)

Options:

A.

Configure an Amazon EventBridge rule that is invoked when a user creates or modifies a CloudFront distribution. Add the AWS Lambda function as a target of the EventBridge rule.

B.

Create an Application Load Balancer (ALB). Enable AWS WAF rules for the ALB. Configure an AWS Config rule to detect security violations.

C.

Create an AWS Lambda function to detect changes in CloudFront distribution logging. Configure the Lambda function to use Amazon Simple Notification Service (Amazon SNS) to send notifications to the security team.

D.

Set up Amazon GuardDuty. Configure GuardDuty to monitor findings from the CloudFront distribution. Create an AWS Lambda function to address the findings.

E.

Create a private API in Amazon API Gateway. Use AWS WAF rules to protect the private API from common security problems.
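
The rule from option A matches the CloudTrail management events that CloudFront emits when a distribution is created or updated. Because CloudFront is a global service, those events are recorded in us-east-1. Names and the target ARN in this sketch are placeholders.

    import json
    import boto3

    events = boto3.client("events", region_name="us-east-1")

    events.put_rule(
        Name="cloudfront-config-change",
        EventPattern=json.dumps({
            "source": ["aws.cloudfront"],
            "detail-type": ["AWS API Call via CloudTrail"],
            "detail": {"eventName": ["CreateDistribution", "UpdateDistribution"]},
        }),
    )
    events.put_targets(
        Rule="cloudfront-config-change",
        Targets=[{
            "Id": "alert",
            "Arn": "arn:aws:lambda:us-east-1:111122223333:function:NotifySecurity",
        }],
    )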

Question 101

A solutions architect is designing the network architecture for an application that runs on Amazon EC2 instances in an Auto Scaling group. The application needs to access data that is in Amazon S3 buckets.

Traffic to the S3 buckets must not use public IP addresses. The solutions architect will deploy the application in a VPC that has public and private subnets.

Which solutions will meet these requirements? (Select TWO.)

Options:

A.

Deploy the EC2 instances in a private subnet. Configure a default route to an egress-only internet gateway.

B.

Deploy the EC2 instances in a public subnet. Create a gateway endpoint for Amazon S3. Associate the endpoint with the subnet's route table.

C.

Deploy the EC2 instances in a public subnet. Create an interface endpoint for Amazon S3. Configure DNS hostnames and DNS resolution for the VPC.

D.

Deploy the EC2 instances in a private subnet. Configure a default route to a NAT gateway in a public subnet.

E.

Deploy the EC2 instances in a private subnet. Configure a default route to a customer gateway.

Question 102

A company is migrating a new application from an on-premises data center to a new VPC in the AWS Cloud. The company has multiple AWS accounts and VPCs that share many subnets and applications. The company wants to have fine-grained access control for the new application. The company wants to ensure that all network resources across accounts and VPCs that are granted permission to access the new application can access the application.

Which solution will meet these requirements?

Options:

A.

Set up a VPC peering connection for each VPC that needs access to the new application VPC. Update route tables in each VPC to enable connectivity.

B.

Deploy a transit gateway in the account that hosts the new application. Share the transit gateway with each account that needs to connect to the application. Update route tables in the VPC that hosts the new application and in the transit gateway to enable connectivity.

C.

Use an AWS PrivateLink endpoint service to make the new application accessible to other VPCs. Control access to the application by using an endpoint policy.

D.

Use an Application Load Balancer (ALB) to expose the new application to the internet. Configure authentication and authorization processes to ensure that only specified VPCs can access the application.

Question 103

A company is implementing a new policy to enhance the security of its AWS environment. The policy requires all administrative actions that users perform on the AWS Management Console to be secured by multi-factor authentication (MFA).

Which solution will allow the company to enforce this policy in the MOST operationally efficient way?

Options:

A.

Enable MFA on the root account. Ensure that all administrators use the root account to perform administrative actions.

B.

Create an IAM policy that requires MFA to be enabled for the IAM roles that administrators assume to perform administrative actions.

C.

Configure an Amazon CloudWatch alarm that sends an email notification when an administrator performs an administrative action without MFA.

D.

Use AWS Config to periodically audit IAM users and to automatically attach an IAM policy that requires MFA when AWS Config detects administrative actions.
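
The policy in option B is usually expressed as a deny statement conditioned on MFA, similar to the well-known fragment below (illustrative only); attached to the administrators' roles, it blocks every action in sessions that did not authenticate with MFA.

    import json

    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "DenyAllWithoutMFA",
            "Effect": "Deny",
            "Action": "*",
            "Resource": "*",
            "Condition": {
                "BoolIfExists": {"aws:MultiFactorAuthPresent": "false"}
            },
        }],
    }
    print(json.dumps(policy, indent=2))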

Question 104

A company that has multiple AWS accounts maintains an on-premises Microsoft Active Directory. The company needs a solution to implement Single Sign-On for its employees. The company wants to use AWS IAM Identity Center.

The solution must meet the following requirements:

Allow users to access AWS accounts and third-party applications by using existing Active Directory credentials.

Enforce multi-factor authentication (MFA) to access AWS accounts.

Centrally manage permissions to access AWS accounts and applications.

Which solution will meet these requirements?

Options:

A.

Create an IAM identity provider for Active Directory in each AWS account. Ensure that Active Directory users and groups access AWS accounts directly through IAM roles. Use IAM Identity Center to enforce MFA in each account for all users.

B.

Use AWS Directory Service to create a new AWS Managed Microsoft AD Active Directory. Configure IAM Identity Center in each account to use the new AWS Managed Microsoft AD Active Directory as the identity source. Use IAM Identity Center to enforce MFA for all users.

C.

Use IAM Identity Center with the existing Active Directory as the identity source. Enforce MFA for all users. Use AWS Organizations and Active Directory groups to manage access permissions for AWS accounts and application access.

D.

Use AWS Lambda functions to periodically synchronize Active Directory users and groups with IAM users and groups in each AWS account. Use IAM roles and policies to manage application access. Create a second Lambda function to enforce MFA.

Question 105

A company is designing an application to maintain a record of customer orders. The application will generate events. The company wants to use an Amazon EventBridge event bus to send the application's events to an Amazon DynamoDB table. Which solution will meet these requirements?

Options:

A.

Use the EventBridge default event bus. Configure DynamoDB Streams for the DynamoDB table that hosts the customer order data.

B.

Create an EventBridge custom event bus. Create an AWS Lambda function as a target. Configure the Lambda function to forward the customer order data to the DynamoDB table.

C.

Create an EventBridge partner event bus. Create an Amazon Simple Notification Service (Amazon SNS) topic. Subscribe an AWS Lambda function to the SNS topic. Configure the Lambda function to read the customer order data and to forward the data to the DynamoDB table.

D.

Create an EventBridge partner event bus. Create an AWS Lambda function as a target. Configure the Lambda function to forward the customer order data to the DynamoDB table.

Question 106

A company runs a payment processing system in the AWS Cloud. Sometimes, when a payment fails because of insufficient funds or technical issues, users attempt to resubmit the payment. Payment resubmissions sometimes generate multiple payment messages for the same payment ID.

A solutions architect needs to ensure that the payment processing system receives payment messages that have the same payment ID sequentially, according to when the messages were generated. The processing system must process the messages in the order in which the messages are received. The solution must retain all payment messages for 10 days for analytics.

Which solutions will meet these requirements? (Select TWO.)

Options:

A.

Write the payment messages to an Amazon DynamoDB table that uses the payment ID as the partition key.

B.

Write the payment messages to an Amazon Kinesis data stream that uses the payment ID as the partition key.

C.

Write the payment messages to an Amazon ElastiCache for Memcached cluster that uses the payment ID as the key.

D.

Write the payment messages to an Amazon Simple Queue Service (Amazon SQS) queue. Set the message attribute to use the payment ID.

E.

Write the payment messages to an Amazon Simple Queue Service (Amazon SQS) FIFO queue. Set the message group ID to the payment ID.
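
Option B's approach hinges on the partition key: records that share a key land on the same shard and are read back in write order. The sketch below also extends retention to the required 10 days; the stream name is a placeholder.

    import boto3

    kinesis = boto3.client("kinesis")

    kinesis.put_record(
        StreamName="payment-messages",
        PartitionKey="payment-9f8e7d",  # the payment ID
        Data=b'{"payment_id": "9f8e7d", "status": "RETRY"}',
    )

    # Keep messages for 10 days (240 hours) for analytics.
    kinesis.increase_stream_retention_period(
        StreamName="payment-messages",
        RetentionPeriodHours=240,
    )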

Question 107

A company is designing an IPv6 application that is hosted on Amazon EC2 instances in a private subnet within a VPC. The application will store user-uploaded content in Amazon S3 buckets. The application will save each S3 object's URL link and metadata in Amazon DynamoDB.

The company must not use public internet connections to transmit user-uploaded content or metadata.

Which solution will meet these requirements?

Options:

A.

Implement a gateway VPC endpoint for Amazon S3 and an interface VPC endpoint for Amazon DynamoDB.

B.

Implement interface VPC endpoints for both Amazon S3 and Amazon DynamoDB.

C.

Implement gateway VPC endpoints for both Amazon S3 and Amazon DynamoDB.

D.

Implement a gateway VPC endpoint for Amazon DynamoDB and an interface VPC endpoint for Amazon S3.

Question 108

A company hosts an application in a private subnet. The company has already integrated the application with Amazon Cognito. The company uses an Amazon Cognito user pool to authenticate users.

The company needs to modify the application so the application can securely store user documents in an Amazon S3 bucket.

Which combination of steps will securely integrate Amazon S3 with the application? (Select TWO.)

Options:

A.

Create an Amazon Cognito identity pool to generate secure Amazon S3 access tokens for users when they successfully log in.

B.

Use the existing Amazon Cognito user pool to generate Amazon S3 access tokens for users when they successfully log in.

C.

Create an Amazon S3 VPC endpoint in the same VPC where the company hosts the application.

D.

Create a NAT gateway in the VPC where the company hosts the application. Assign a policy to the S3 bucket to deny any request that is not initiated from Amazon Cognito.

E.

Attach a policy to the S3 bucket that allows access only from the users' IP addresses.

Question 109

A company is developing a serverless web application that gives users the ability to interact with real-time analytics from online games. The data from the games must be streamed in real time. The company needs a durable, low-latency database option for user data. The company does not know how many users will use the application. The design must provide single-digit millisecond response times as the application scales.

Which combination of AWS services will meet these requirements? (Select TWO.)

Options:

A.

Amazon CloudFront

B.

Amazon DynamoDB

C.

Amazon Kinesis

D.

Amazon RDS

E.

AWS Global Accelerator

Question 110

A company is preparing to store confidential data in Amazon S3. For compliance reasons, the data must be encrypted at rest. Encryption key usage must be logged for auditing purposes. Keys must be rotated every year.

Which solution meets these requirements and is the MOST operationally efficient?

Options:

A.

Server-side encryption with customer-provided keys (SSE-C)

B.

Server-side encryption with Amazon S3 managed keys (SSE-S3)

C.

Server-side encryption with AWS KMS keys (SSE-KMS) with manual rotation

D.

Server-side encryption with AWS KMS keys (SSE-KMS) with automatic rotation
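
A minimal sketch of option D with boto3 (the bucket name is an assumption): create a customer managed KMS key, turn on automatic yearly rotation, and make the key the bucket's default encryption key. KMS key usage is recorded in AWS CloudTrail, which covers the auditing requirement:

    import boto3

    kms = boto3.client("kms")
    s3 = boto3.client("s3")

    key_id = kms.create_key(Description="S3 data key")["KeyMetadata"]["KeyId"]
    kms.enable_key_rotation(KeyId=key_id)  # rotates the key yearly by default

    # Encrypt new objects with the KMS key by default (SSE-KMS).
    s3.put_bucket_encryption(
        Bucket="example-confidential-bucket",  # assumed bucket name
        ServerSideEncryptionConfiguration={
            "Rules": [{
                "ApplyServerSideEncryptionByDefault": {
                    "SSEAlgorithm": "aws:kms",
                    "KMSMasterKeyID": key_id,
                }
            }]
        },
    )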

Question 111

A company wants to create a long-term storage solution that will allow users to upload terabytes of images and videos. The company will use the images and videos to train machine learning (ML) models. The storage solution must be scalable and cost-optimized.

Which solution will meet these requirements?

Options:

A.

Provision an Amazon S3 bucket for users to upload images and videos. Copy the data from the S3 bucket to an Amazon FSx for Lustre file system to make the data available for ML model training.

B.

Provision an Amazon S3 bucket for users to upload images and videos. Configure the S3 bucket to make the data available to Amazon SageMaker AI training. Store the data in the S3 Intelligent-Tiering storage class.

C.

Configure an Amazon SageMaker AI notebook instance with 16 GB of storage. Create a custom application to allow users to upload images and videos directly to the notebook instance.

D.

Provision an Amazon S3 bucket for users to upload images and videos. Copy the data from the S3 bucket to an Amazon Elastic File System (Amazon EFS) file system to make the data available for ML model training.

Question 112

A company is storing data in Amazon S3 buckets. The company needs to retain any objects that contain personally identifiable information (PII) that might need to be reviewed.

A solutions architect must develop an automated solution to identify objects that contain PII and apply the necessary controls to prevent deletion before review.

Which combination of steps should the solutions architect take to meet these requirements? (Select THREE.)

Options:

A.

Create a job in Amazon Macie to scan the S3 buckets for the relevant sensitive data identifiers.

B.

Move the identified objects to the S3 Glacier Deep Archive storage class.

C.

Create an AWS Lambda function that performs an S3 Object Lock legal hold operation on the identified objects.

D.

Create an AWS Lambda function that applies an S3 Object Lock retention period to the identified objects in governance mode.

E.

Create an Amazon EventBridge rule that invokes the AWS Lambda function when Amazon Macie detects sensitive data.

F.

Configure multi-factor authentication (MFA) delete on the S3 buckets.
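
To show how options C and E fit together, here is a sketch of a Lambda handler that applies an S3 Object Lock legal hold when an EventBridge rule forwards a Macie sensitive-data finding. It assumes Object Lock is enabled on the buckets and that the event carries the standard Macie finding detail:

    import boto3

    s3 = boto3.client("s3")

    def handler(event, context):
        # Invoked by an EventBridge rule matching Macie findings; the field
        # paths below follow the Macie finding event shape.
        affected = event["detail"]["resourcesAffected"]
        bucket = affected["s3Bucket"]["name"]
        key = affected["s3Object"]["key"]

        # A legal hold blocks deletion until it is explicitly removed
        # after the review is complete.
        s3.put_object_legal_hold(
            Bucket=bucket, Key=key, LegalHold={"Status": "ON"}
        )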

Question 113

An ecommerce company runs several internal applications in multiple AWS accounts. The company uses AWS Organizations to manage its AWS accounts.

A security appliance in the company's networking account must inspect interactions between applications across AWS accounts.

Which solution will meet these requirements?

Options:

A.

Deploy a Network Load Balancer (NLB) in the networking account to send traffic to the security appliance. Configure the application accounts to send traffic to the NLB by using an interface VPC endpoint in the application accounts.

B.

Deploy an Application Load Balancer (ALB) in the application accounts to send traffic directly to the security appliance.

C.

Deploy a Gateway Load Balancer (GWLB) in the networking account to send traffic to the security appliance. Configure the application accounts to send traffic to the GWLB by using Gateway Load Balancer endpoints in the application accounts.

D.

Deploy an interface VPC endpoint in the application accounts to send traffic directly to the security appliance.

Question 114

A company needs a solution to prevent photos with unwanted content from being uploaded to the company’s web application. The solution must not involve training a machine learning (ML) model.

Which solution will meet these requirements?

Options:

A.

Create and deploy a model by using Amazon SageMaker Autopilot. Create a real-time endpoint that the web application invokes when new photos are uploaded.

B.

Create an AWS Lambda function that uses Amazon Rekognition to detect unwanted content. Create a Lambda function URL that the web application invokes when new photos are uploaded.

C.

Create an Amazon CloudFront function that uses Amazon Comprehend to detect unwanted content. Associate the function with the web application.

D.

Create an AWS Lambda function that uses Amazon Rekognition Video to detect unwanted content. Create a Lambda function URL that the web application invokes when new photos are uploaded.

Question 115

A company is building an application on AWS that connects to an Amazon RDS database. The company wants to manage the application configuration and to securely store and retrieve credentials for the database and other services.

Which solution will meet these requirements with the LEAST administrative overhead?

Options:

A.

Use AWS AppConfig to store and manage the application configuration. Use AWS Secrets Manager to store and retrieve the credentials.

B.

Use AWS Lambda to store and manage the application configuration. Use AWS Systems Manager Parameter Store to store and retrieve the credentials.

C.

Use an encrypted application configuration file. Store the file in Amazon S3 for the application configuration. Store the credentials in a second encrypted file in Amazon S3.

D.

Use AWS AppConfig to store and manage the application configuration. Use Amazon RDS to store and retrieve the credentials.

Question 116

A solutions architect creates an Auto Scaling group for a memory-intensive application. The solutions architect wants to scale up and scale down based on memory usage.

Which solution will meet this requirement?

Options:

A.

Install and configure the AWS Systems Manager Agent (SSM Agent). Create a step scaling policy that has step adjustments based on the memory usage trend.

B.

Install and configure the Amazon CloudWatch agent. Create a target tracking policy to scale based on the mem_used_percent CloudWatch metric.

C.

Install and configure the AWS Systems Manager Agent (SSM Agent). Create a target tracking policy to scale based on the mem_used_percent Amazon CloudWatch metric.

D.

Install and configure the Amazon CloudWatch agent. Create a scheduled scaling policy to scale based on the memory usage trend.
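
A sketch of the target tracking policy from option B (the ASG name and target value are assumptions). The CloudWatch agent publishes mem_used_percent in the CWAgent namespace once memory collection is enabled in the agent configuration file:

    import boto3

    autoscaling = boto3.client("autoscaling")

    autoscaling.put_scaling_policy(
        AutoScalingGroupName="memory-app-asg",  # assumed ASG name
        PolicyName="memory-target-tracking",
        PolicyType="TargetTrackingScaling",
        TargetTrackingConfiguration={
            "CustomizedMetricSpecification": {
                "MetricName": "mem_used_percent",
                "Namespace": "CWAgent",
                "Dimensions": [
                    {"Name": "AutoScalingGroupName", "Value": "memory-app-asg"}
                ],
                "Statistic": "Average",
            },
            "TargetValue": 60.0,  # assumed target: hold memory near 60 percent
        },
    )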

Question 117

A company runs a mobile game app that stores session data (up to 256 KB) for up to 48 hours. The data updates frequently and must be deleted automatically after expiration. The data must also be restorable.

Which solution will meet these requirements?

Options:

A.

Use an Amazon DynamoDB table to store the session data. Enable point-in-time recovery (PITR) and TTL.

B.

Use Amazon MemoryDB and enable PITR and TTL.

C.

Store session data in S3 Standard. Enable Versioning and a Lifecycle rule to expire objects after 48 hours.

D.

Store data in S3 Intelligent-Tiering with Versioning and a Lifecycle rule to expire after 48 hours.
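
For reference, a sketch of enabling TTL and point-in-time recovery on a DynamoDB table as in option A (the table and attribute names are assumptions). TTL deletes items after the epoch timestamp in the named attribute passes, and PITR allows restores to any point within the preceding 35 days:

    import boto3

    ddb = boto3.client("dynamodb")

    # Items expire automatically once the epoch time in "expires_at" passes.
    ddb.update_time_to_live(
        TableName="GameSessions",  # assumed table name
        TimeToLiveSpecification={"Enabled": True, "AttributeName": "expires_at"},
    )

    # PITR covers the restorability requirement.
    ddb.update_continuous_backups(
        TableName="GameSessions",
        PointInTimeRecoverySpecification={"PointInTimeRecoveryEnabled": True},
    )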

Question 118

A solutions architect is provisioning an Amazon Elastic File System (Amazon EFS) file system to provide shared storage across multiple Amazon EC2 instances. The instances all exist in the same VPC across multiple Availability Zones. There are two instances in each Availability Zone. The solutions architect must make the file system accessible to each instance with the lowest possible latency.

Which solution will meet these requirements?

Options:

A.

Create a mount target for the EFS file system in the VPC. Use the mount target to mount the file system on each of the instances.

B.

Create a mount target for the EFS file system in one Availability Zone of the VPC. Use the mount target to mount the file system on the instances in that Availability Zone. Share the directory with the other instances.

C.

Create a mount target for each instance. Use each mount target to mount the EFS file system on each respective instance.

D.

Create a mount target in each Availability Zone of the VPC. Use the mount target to mount the EFS file system on the instances in the respective Availability Zone.
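
A sketch of the per-Availability-Zone mount targets described in option D (all IDs are assumptions). Each instance then mounts the file system through the target in its own Availability Zone, which keeps NFS traffic local and latency low:

    import boto3

    efs = boto3.client("efs")

    # One mount target per Availability Zone, created in a subnet of that AZ.
    for subnet_id in ("subnet-aaa1111", "subnet-bbb2222"):  # assumed subnet IDs
        efs.create_mount_target(
            FileSystemId="fs-0123456789abcdef0",       # assumed file system ID
            SubnetId=subnet_id,
            SecurityGroups=["sg-0123456789abcdef0"],   # must allow NFS port 2049
        )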

Question 119

A company uses an Amazon DynamoDB table to store data that the company receives from devices. The DynamoDB table supports a customer-facing website that displays recent activity on customer devices. The company configured the table with provisioned throughput for writes and reads.

The company wants to calculate performance metrics for customer device data on a daily basis. The solution must have minimal effect on the table's provisioned read and write capacity.

Which solution will meet these requirements?

Options:

A.

Use an Amazon Athena SQL query with the Amazon Athena DynamoDB connector to calculate performance metrics on a recurring schedule.

B.

Use an AWS Glue job with the AWS Glue DynamoDB export connector to calculate performance metrics on a recurring schedule.

C.

Use an Amazon Redshift COPY command to calculate performance metrics on a recurring schedule.

D.

Use an Amazon EMR job with an Apache Hive external table to calculate performance metrics on a recurring schedule.

Question 120

An insurance company runs an application on premises to process contracts. The application processes jobs that consist of many tasks. Individual tasks run for up to 5 minutes, and some jobs take up to 24 hours in total to finish. If a task fails, the task must be reprocessed.

The company wants to migrate the application to AWS. The company will use Amazon S3 as part of the solution. The company wants to configure jobs to start automatically when a contract is uploaded to an S3 bucket.

Which solution will meet these requirements?

Options:

A.

Use AWS Lambda functions to process individual tasks. Create a primary Lambda function to handle the overall job processing by calling individual Lambda functions in sequence. Configure the S3 bucket to send an event notification to invoke the primary Lambda function to begin processing.

B.

Use a state machine in AWS Step Functions to handle the overall contract processing job. Configure the S3 bucket to send an event notification to Amazon EventBridge. Create a rule in Amazon EventBridge to target the state machine.

C.

Use an AWS Batch job to handle the overall contract processing job. Configure the S3 bucket to send an event notification to initiate the Batch job.

D.

Use an S3 event notification to notify an Amazon Simple Queue Service (Amazon SQS) queue when a contract is uploaded. Configure an AWS Lambda function to read messages from the queue and to run the contract processing job.
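
A sketch of the wiring in option B (the bucket name, rule name, and ARNs are assumptions): enable EventBridge delivery on the bucket, then route Object Created events to the Step Functions state machine. Step Functions standard workflows can run for up to a year, so 24-hour jobs with retryable tasks fit naturally:

    import json
    import boto3

    s3 = boto3.client("s3")
    events = boto3.client("events")

    # Turn on EventBridge delivery for the bucket's S3 events.
    s3.put_bucket_notification_configuration(
        Bucket="contract-uploads",  # assumed bucket name
        NotificationConfiguration={"EventBridgeConfiguration": {}},
    )

    # Route "Object Created" events to the state machine.
    events.put_rule(
        Name="start-contract-processing",
        EventPattern=json.dumps({
            "source": ["aws.s3"],
            "detail-type": ["Object Created"],
            "detail": {"bucket": {"name": ["contract-uploads"]}},
        }),
    )
    events.put_targets(
        Rule="start-contract-processing",
        Targets=[{
            "Id": "contract-state-machine",
            "Arn": "arn:aws:states:us-east-1:111122223333:stateMachine:Contracts",  # assumed
            "RoleArn": "arn:aws:iam::111122223333:role/EventBridgeToStepFunctions",  # assumed
        }],
    )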

Question 121

A company stores user data in AWS. The data is used continuously with peak usage during business hours. Access patterns vary, with some data not being used for months at a time. A solutions architect must choose a cost-effective solution that maintains the highest level of durability while maintaining high availability.

Which storage solution meets these requirements?

Options:

A.

Amazon S3 Standard

B.

Amazon S3 Intelligent-Tiering

C.

Amazon S3 Glacier Deep Archive

D.

Amazon S3 One Zone-Infrequent Access (S3 One Zone-IA)

Question 122

A company hosts an application on AWS that stores files that users need to access. The application uses two Amazon EC2 instances. One instance is in Availability Zone A, and the second instance is in Availability Zone B. Both instances use Amazon Elastic Block Store (Amazon EBS) volumes.

Users must be able to access the files at any time without delay. Users report that the two instances occasionally contain different versions of the same file. Users occasionally receive HTTP 404 errors when they try to download files. The company must address the customer issues. The company cannot make changes to the application code.

Which solution will meet these requirements in the MOST operationally efficient way?

Options:

A.

Run the robocopy command on one of the EC2 instances on a schedule to copy files from the Availability Zone A instance to the Availability Zone B instance.

B.

Configure the application to store the files on both EBS volumes each time a user writes or updates a file.

C.

Mount an Amazon Elastic File System (Amazon EFS) file system to the EC2 instances. Copy the files from the EBS volumes to the EFS file system. Configure the application to store files in the EFS file system.

D.

Create an EC2 instance profile that allows the instance in Availability Zone A to access the S3 bucket. Re-associate the instance profile to the instance in Availability Zone B when needed.

Question 123

A company has a large data workload that runs for 6 hours each day. The company cannot lose any data while the process is running. A solutions architect is designing an Amazon EMR cluster configuration to support this critical data workload.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Configure a long-running cluster that runs the primary node and core nodes on On-Demand Instances and the task nodes on Spot Instances.

B.

Configure a transient cluster that runs the primary node and core nodes on On-Demand Instances and the task nodes on Spot Instances.

C.

Configure a transient cluster that runs the primary node on an On-Demand Instance and the core nodes and task nodes on Spot Instances.

D.

Configure a long-running cluster that runs the primary node on an On-Demand Instance, the core nodes on Spot Instances, and the task nodes on Spot Instances.

Question 124

A company manages multiple AWS accounts in an organization in AWS Organizations. The company's applications run on Amazon EC2 instances in multiple AWS Regions. The company needs a solution to simplify the management of security rules across the accounts in its organization. The solution must apply shared security group rules, audit security groups, and detect unused and redundant rules in VPC security groups across all AWS environments.

Which solution will meet these requirements with the MOST operational efficiency?

Options:

A.

Use AWS Firewall Manager to create a set of rules based on the security requirements. Replicate the rules to all the AWS accounts and Regions.

B.

Use AWS CloudFormation StackSets to provision VPC security groups based on the specifications across multiple accounts and Regions. Deploy AWS Network Firewall to define the firewall rules to control network traffic across multiple accounts and Regions.

C.

Use AWS CloudFormation StackSets to provision VPC security groups based on the specifications across multiple accounts and Regions. Configure AWS Config and AWS Lambda to evaluate compliance information and to automate enforcement across all accounts and Regions.

D.

Use AWS Network Firewall to build policies based on the security requirements. Centrally apply the new policies to all the VPCs and accounts.

Question 125

A weather forecasting company collects temperature readings from various sensors on a continuous basis. An existing data ingestion process collects the readings and aggregates the readings into larger Apache Parquet files. Then the process encrypts the files by using client-side encryption with KMS managed keys (CSE-KMS). Finally, the process writes the files to an Amazon S3 bucket with separate prefixes for each calendar day.

The company wants to run occasional SQL queries on the data to take sample moving averages for a specific calendar day.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Configure Amazon Athena to read the encrypted files. Run SQL queries on the data directly in Amazon S3.

B.

Use Amazon S3 Select to run SQL queries on the data directly in Amazon S3.

C.

Configure Amazon Redshift to read the encrypted files. Use Redshift Spectrum and Redshift query editor v2 to run SQL queries on the data directly in Amazon S3.

D.

Configure Amazon EMR Serverless to read the encrypted files. Use Apache SparkSQL to run SQL queries on the data directly in Amazon S3.

Question 126

A company is designing a microservice-based architecture for a new application on AWS. Each microservice will run on its own set of Amazon EC2 instances. Each microservice will need to interact with multiple AWS services such as Amazon S3 and Amazon Simple Queue Service (Amazon SQS).

The company wants to manage permissions for each EC2 instance based on the principle of least privilege.

Which solution will meet this requirement?

Options:

A.

Assign an IAM user to each microservice. Use access keys stored within the application code to authenticate AWS service requests.

B.

Create a single IAM role that has permission to access all AWS services. Associate the IAM role with all EC2 instances that run the microservices.

C.

Use AWS Organizations to create a separate account for each microservice. Manage permissions at the account level.

D.

Create individual IAM roles based on the specific needs of each microservice. Associate the IAM roles with the appropriate EC2 instances.

Question 127

A company has separate AWS accounts for its finance, data analytics, and development departments. Because of costs and security concerns, the company wants to control which services each AWS account can use.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Use AWS Systems Manager templates to control which AWS services each department can use.

B.

Create organizational units (OUs) for each department in AWS Organizations. Attach service control policies (SCPs) to the OUs.

C.

Use AWS CloudFormation to automatically provision only the AWS services that each department can use.

D.

Set up a list of products in AWS Service Catalog in the AWS accounts to manage and control the usage of specific AWS services.
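
To illustrate option B, a sketch that creates a deny-list SCP and attaches it to a department OU (the blocked services and OU ID are assumptions). Accounts under the OU cannot use the denied services regardless of their own IAM policies:

    import json
    import boto3

    org = boto3.client("organizations")

    # Deny-list SCP: everything stays allowed except the listed services.
    scp = org.create_policy(
        Name="finance-deny-unneeded-services",
        Description="Blocks services the finance OU should not use",
        Type="SERVICE_CONTROL_POLICY",
        Content=json.dumps({
            "Version": "2012-10-17",
            "Statement": [{
                "Effect": "Deny",
                "Action": ["ec2:*", "sagemaker:*"],  # illustrative service list
                "Resource": "*",
            }],
        }),
    )

    org.attach_policy(
        PolicyId=scp["Policy"]["PolicySummary"]["Id"],
        TargetId="ou-ab12-cdef3456",  # assumed OU ID for the finance department
    )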

Question 128

A company plans to use AWS to run high-performance computing (HPC) workloads and analytics workloads. The company will run HPC workloads on Amazon EC2 instances. The workloads require a high-performance file system that can scale to millions of input/output operations per second (IOPS).

Which combination of steps will meet these requirements? (Select TWO.)

Options:

A.

Use Amazon Elastic File System (Amazon EFS) as a high-performance file system.

B.

Use Amazon FSx for Lustre as a high-performance file system.

C.

Create an Auto Scaling group of Amazon EC2 instances. Use Reserved Instances. Configure a spread placement group. Use AWS Batch to run the analytics workloads.

D.

Use Mountpoint for Amazon S3 as a high-performance file system.

E.

Create an Auto Scaling group of Amazon EC2 instances. Use a mix of On-Demand Instances, Reserved Instances, and Spot Instances. Configure a cluster placement group. Use Amazon EMR to run the analytics workloads.

Question 129

A company wants to use a data lake that is hosted on Amazon S3 to provide analytics services for historical data. The data lake consists of 800 tables but is expected to grow to thousands of tables. More than 50 departments use the tables, and each department has hundreds of users. Different departments need access to specific tables and columns.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Create an IAM role for each department. Use AWS Lake Formation-based access control to grant each IAM role access to specific tables and columns. Use Amazon Athena to analyze the data.

B.

Create an Amazon Redshift cluster for each department. Use AWS Glue to ingest into the Redshift cluster only the tables and columns that are relevant to that department. Create Redshift database users. Grant the users access to the relevant department's Redshift cluster. Use Amazon Redshift to analyze the data.

C.

Create an IAM role for each department. Use AWS Lake Formation tag-based access control to grant each IAM role access to only the relevant resources. Create LF-tags that are attached to tables and columns. Use Amazon Athena to analyze the data.

D.

Create an Amazon EMR cluster for each department. Configure an IAM service role for each EMR cluster to access relevant S3 files. For each department's users, create an IAM role that provides access to the relevant EMR cluster. Use Amazon EMR to analyze the data.

Question 130

A company needs to optimize its Amazon S3 storage costs for an application that generates many files that cannot be recreated. Each file is approximately 5 MB and is stored in Amazon S3 Standard storage.

The company must store the files for 4 years before the files can be deleted. The files must be immediately accessible. The files are frequently accessed in the first 30 days after object creation but are rarely accessed after the first 30 days.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Create an S3 Lifecycle policy to move the files to S3 Glacier Instant Retrieval 30 days after object creation. Delete the files 4 years after object creation.

B.

Create an S3 Lifecycle policy to move the files to S3 One Zone-Infrequent Access (S3 One Zone-IA) 30 days after object creation. Delete the files 4 years after object creation.

C.

Create an S3 Lifecycle policy to move the files to S3 Standard-Infrequent Access (S3 Standard-IA) 30 days after object creation. Delete the files 4 years after object creation.

D.

Create an S3 Lifecycle policy to move the files to S3 Standard-Infrequent Access (S3 Standard-IA) 30 days after object creation. Move the files to S3 Glacier Flexible Retrieval 4 years after object creation.
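
A sketch of the lifecycle configuration in option C (the bucket name is an assumption): transition to S3 Standard-IA after 30 days, then expire the objects after roughly 4 years (1,460 days):

    import boto3

    s3 = boto3.client("s3")

    s3.put_bucket_lifecycle_configuration(
        Bucket="example-files-bucket",  # assumed bucket name
        LifecycleConfiguration={
            "Rules": [{
                "ID": "ia-then-delete",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},  # apply to every object in the bucket
                "Transitions": [
                    {"Days": 30, "StorageClass": "STANDARD_IA"}
                ],
                "Expiration": {"Days": 1460},  # 4 years, ignoring leap days
            }]
        },
    )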

Question 131

A machine learning (ML) team is building an application that uses data that is in an Amazon S3 bucket. The ML team needs a storage solution for its model training workflow on AWS. The ML team requires high-performance storage that supports frequent access to training datasets. The storage solution must integrate natively with Amazon S3.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Use Amazon Elastic Block Store (Amazon EBS) volumes to provide high-performance storage. Use AWS DataSync to migrate data from the S3 bucket to EBS volumes.

B.

Use Amazon EC2 ML instances to provide high-performance storage. Store training data on Amazon EBS volumes. Use the S3 Copy API to copy data from the S3 bucket to EBS volumes.

C.

Use Amazon FSx for Lustre to provide high-performance storage. Store training datasets in Amazon S3 Standard storage.

D.

Use Amazon EMR to provide high-performance storage. Store training datasets in Amazon S3 Glacier Instant Retrieval storage.

Question 132

A company is migrating a legacy application from an on-premises data center to AWS. The application relies on hundreds of cron jobs that run for between 1 and 20 minutes on different recurring schedules throughout the day.

The company wants a solution to schedule and run the cron jobs on AWS with minimal refactoring. The solution must support running the cron jobs in response to an event in the future.

Which solution will meet these requirements?

Options:

A.

Create a container image for the cron jobs. Use Amazon EventBridge Scheduler to create a recurring schedule. Run the cron job tasks as AWS Lambda functions.

B.

Create a container image for the cron jobs. Use AWS Batch on Amazon Elastic Container Service (Amazon ECS) with a scheduling policy to run the cron jobs.

C.

Create a container image for the cron jobs. Use Amazon EventBridge Scheduler to create a recurring schedule. Run the cron job tasks on AWS Fargate.

D.

Create a container image for the cron jobs. Create a workflow in AWS Step Functions that uses a Wait state to run the cron jobs at a specified time. Use the RunTask action to run the cron job tasks on AWS Fargate.
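
A sketch of one schedule from option C (the names, ARNs, and cron expression are assumptions). Fargate tasks have no fixed runtime ceiling, so jobs of 1 to 20 minutes run without the refactoring that AWS Lambda's 15-minute limit would force:

    import boto3

    scheduler = boto3.client("scheduler")

    # One schedule per cron job; this expression mirrors a classic
    # "*/15 * * * *" crontab entry.
    scheduler.create_schedule(
        Name="report-job-every-15-minutes",
        ScheduleExpression="cron(0/15 * * * ? *)",
        FlexibleTimeWindow={"Mode": "OFF"},
        Target={
            "Arn": "arn:aws:ecs:us-east-1:111122223333:cluster/cron-cluster",   # assumed
            "RoleArn": "arn:aws:iam::111122223333:role/SchedulerRunTaskRole",   # assumed
            "EcsParameters": {
                "TaskDefinitionArn": "arn:aws:ecs:us-east-1:111122223333:task-definition/cron-job:1",  # assumed
                "LaunchType": "FARGATE",
                "NetworkConfiguration": {
                    "awsvpcConfiguration": {"Subnets": ["subnet-0123456789abcdef0"]}
                },
            },
        },
    )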

Question 133

A company is designing an advertisement distribution application to run on AWS. The company wants to deploy the application as a container to Amazon Elastic Container Service (Amazon ECS).

Advertisements must be displayed to users around the world with low latency. The company needs to optimize data transfer costs.

Which solution will meet these requirements?

Options:

A.

Deploy the application in a single AWS Region. Use an Application Load Balancer (ALB) to distribute traffic. Create an Amazon CloudFront distribution, and set the ALB as the origin.

B.

Deploy the application in multiple AWS Regions. Create an Application Load Balancer (ALB) in each Region. Use Amazon Route 53 with a latency-based routing policy to distribute traffic to the ALBs.

C.

Deploy the application in multiple AWS Regions. Create an Application Load Balancer (ALB) in each Region. Create a transit gateway in each Region. Route traffic between the ALBs and Amazon ECS through the transit gateways.

D.

Deploy the application in a single AWS Region. Use an Application Load Balancer (ALB) to distribute traffic. Create an accelerator in AWS Global Accelerator. Associate the accelerator with the ALB.

Question 134

A company runs multiple workloads in separate AWS environments. The company wants to optimize its AWS costs but must maintain the same level of performance for the environments.

The company's production environment requires resources to be highly available. The other environments do not require highly available resources.

Each environment has the same set of networking components, including the following:

• 1 VPC

• 1 Application Load Balancer

• 4 subnets distributed across 2 Availability Zones (2 public subnets and 2 private subnets)

• 2 NAT gateways (1 in each public subnet)

• 1 internet gateway

Which solution will meet these requirements?

Options:

A.

Do not change the production environment workload. For each non-production workload, remove one NAT gateway and update the route tables for private subnets to target the remaining NAT gateway for the destination 0.0.0.0/0.

B.

Reduce the number of Availability Zones that all workloads in all environments use.

C.

Replace every NAT gateway with a t4g.large NAT instance. Update the route tables for each private subnet to target the NAT instance that is in the same Availability Zone for the destination 0.0.0.0/0.

D.

In each environment, create one transit gateway and remove one NAT gateway. Configure routing on the transit gateway to forward traffic for the destination 0.0.0.0/0 to the remaining NAT gateway. Update private subnet route tables to target the transit gateway for the destination 0.0.0.0/0.

Question 135

A company stores sensitive customer data in an Amazon DynamoDB table. The company frequently updates the data. The company wants to use the data to personalize offers for customers.

The company's analytics team has its own AWS account. The analytics team runs an application on Amazon EC2 instances that needs to process data from the DynamoDB tables. The company needs to follow security best practices to create a process to regularly share data from DynamoDB to the analytics team.

Which solution will meet these requirements?

Options:

A.

Export the required data from the DynamoDB table to an Amazon S3 bucket as multiple JSON files. Provide the analytics team with the necessary IAM permissions to access the S3 bucket.

B.

Allow public access to the DynamoDB table. Create an IAM user that has permission to access DynamoDB. Share the IAM user with the analytics team.

C.

Allow public access to the DynamoDB table. Create an IAM user that has read-only permission for DynamoDB. Share the IAM user with the analytics team.

D.

Create a cross-account IAM role. Create an IAM policy that allows the AWS account ID of the analytics team to access the DynamoDB table. Attach the IAM policy to the IAM role. Establish a trust relationship between accounts.

Question 136

A company is using an Amazon Redshift cluster to run analytics queries for multiple sales teams. In addition to the typical workload, on the last Monday morning of each month, thousands of users run reports. Users have reported slow response times during the monthly surge.

The company must improve query performance without impacting the availability of the Redshift cluster.

Which solution will meet these requirements?

Options:

A.

Resize the Redshift cluster by using the classic resize capability of Amazon Redshift before every monthly surge. Reduce the cluster to its original size after each surge.

B.

Resize the Redshift cluster by using the elastic resize capability of Amazon Redshift before every monthly surge. Reduce the cluster to its original size after each surge.

C.

Enable the concurrency scaling feature for the Redshift cluster for specific workload management (WLM) queues.

D.

Enable Amazon Redshift Spectrum for the Redshift cluster before every monthly surge.
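
A sketch of option C, assuming the cluster uses manual workload management (WLM); setting "concurrency_scaling" to "auto" on a queue lets Amazon Redshift add transient capacity for queued queries during the monthly surge without resizing the main cluster. The parameter group name and queue layout are assumptions:

    import json
    import boto3

    redshift = boto3.client("redshift")

    # Enable concurrency scaling on the reporting queue only.
    wlm_config = [
        {"query_concurrency": 5, "concurrency_scaling": "auto"},  # reporting queue
        {"short_query_queue": True},
    ]

    redshift.modify_cluster_parameter_group(
        ParameterGroupName="sales-analytics-params",  # assumed parameter group
        Parameters=[{
            "ParameterName": "wlm_json_configuration",
            "ParameterValue": json.dumps(wlm_config),
        }],
    )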

Question 137

A company's reporting system delivers hundreds of .csv files to an Amazon S3 bucket each day. The company must convert these files to Apache Parquet format and must store the files in a transformed data bucket.

Which solution will meet these requirements with the LEAST development effort?

Options:

A.

Create an Amazon EMR cluster with Apache Spark installed. Write a Spark application to transform the data. Use EMR File System (EMRFS) to write files to the transformed data bucket.

B.

Create an AWS Glue crawler to discover the data. Create an AWS Glue extract, transform, and load (ETL) job to transform the data. Specify the transformed data bucket in the output step.

C.

Use AWS Batch to create a job definition with Bash syntax to transform the data and output the data to the transformed data bucket. Use the job definition to submit a job. Specify an array job as the job type.

D.

Create an AWS Lambda function to transform the data and output the data to the transformed data bucket. Configure an event notification for the S3 bucket. Specify the Lambda function as the destination for the event notification.
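
A sketch of the core of the Glue ETL job from option B, assuming the crawler has already cataloged the .csv files into an assumed database and table. The script runs inside an AWS Glue job, where the awsglue library is available:

    from awsglue.context import GlueContext
    from pyspark.context import SparkContext

    glue_context = GlueContext(SparkContext.getOrCreate())

    # Read the cataloged CSV data (database and table names are what the
    # crawler is assumed to have created).
    frame = glue_context.create_dynamic_frame.from_catalog(
        database="reports", table_name="daily_csv"
    )

    # Write the same records to the transformed data bucket as Parquet.
    glue_context.write_dynamic_frame.from_options(
        frame=frame,
        connection_type="s3",
        connection_options={"path": "s3://transformed-data-bucket/reports/"},
        format="parquet",
    )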

Question 138

A company is creating a web application that will store a large number of images in Amazon S3. The images will be accessed by users over variable periods of time. The company wants to:

• Retain all the images.

• Incur no cost for retrieval.

• Have minimal management overhead.

• Have the images available with no impact on retrieval time.

Which solution meets these requirements?

Options:

A.

Implement S3 Intelligent-Tiering.

B.

Implement S3 storage class analysis.

C.

Implement an S3 Lifecycle policy to move data to S3 Standard-Infrequent Access (S3 Standard-IA).

D.

Implement an S3 Lifecycle policy to move data to S3 One Zone-Infrequent Access (S3 One Zone-IA).

Question 139

A company wants to optimize costs for its AWS infrastructure. The company wants to receive notifications when actual costs or forecasted costs exceed a specified budget. The company does not want to develop a custom solution.

Which solution will meet these requirements?

Options:

A.

Use AWS Trusted Advisor to set up budget notifications. Configure Amazon CloudWatch to monitor costs. Export CloudWatch data to Amazon S3. Use machine learning (ML) to estimate future trends based on the CloudWatch data.

B.

Create a budget in AWS Budgets that has a specified cost threshold. Create an AWS Lambda function that sends a notification to the company when costs reach the specified threshold. Use AWS Billing and Cost Management reports to monitor costs.

C.

Use AWS Cost Explorer to set a specified budget threshold. Create an AWS Lambda function to calculate cost estimates. Configure the Lambda function to send a notification to an Amazon Simple Notification Service (Amazon SNS) topic if estimated costs exceed the specified threshold.

D.

Create a budget in AWS Budgets that has a specified cost threshold. Configure AWS Budgets to send budget alerts to an Amazon Simple Notification Service (Amazon SNS) topic. Use AWS Cost Explorer to monitor costs.
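
A sketch of option D (the account ID, budget amount, and topic ARN are assumptions): one budget with two notifications, one on actual spend and one on forecasted spend, both publishing to an SNS topic:

    import boto3

    budgets = boto3.client("budgets")
    topic_arn = "arn:aws:sns:us-east-1:111122223333:budget-alerts"  # assumed topic

    budgets.create_budget(
        AccountId="111122223333",  # assumed account ID
        Budget={
            "BudgetName": "monthly-infra-budget",
            "BudgetLimit": {"Amount": "10000", "Unit": "USD"},  # assumed limit
            "TimeUnit": "MONTHLY",
            "BudgetType": "COST",
        },
        NotificationsWithSubscribers=[
            {  # alert when actual spend reaches 100% of the budget
                "Notification": {
                    "NotificationType": "ACTUAL",
                    "ComparisonOperator": "GREATER_THAN",
                    "Threshold": 100.0,
                    "ThresholdType": "PERCENTAGE",
                },
                "Subscribers": [{"SubscriptionType": "SNS", "Address": topic_arn}],
            },
            {  # alert when the forecast exceeds the budget
                "Notification": {
                    "NotificationType": "FORECASTED",
                    "ComparisonOperator": "GREATER_THAN",
                    "Threshold": 100.0,
                    "ThresholdType": "PERCENTAGE",
                },
                "Subscribers": [{"SubscriptionType": "SNS", "Address": topic_arn}],
            },
        ],
    )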

Question 140

A company uses Amazon RDS for PostgreSQL databases for its data tier. The company must implement password rotation for the databases.

Which solution meets this requirement with the LEAST operational overhead?

Options:

A.

Store the password in AWS Secrets Manager. Enable automatic rotation on the secret.

B.

Store the password in AWS Systems Manager Parameter Store. Enable automatic rotation on the parameter.

C.

Store the password in AWS Systems Manager Parameter Store. Write an AWS Lambda function that rotates the password.

D.

Store the password in AWS Key Management Service (AWS KMS). Enable automatic rotation on the AWS KMS key.
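
A sketch of option A (the secret name and Lambda ARN are assumptions). AWS publishes rotation function templates for Amazon RDS for PostgreSQL, so once the rotation function is deployed, enabling rotation is a single call:

    import boto3

    sm = boto3.client("secretsmanager")

    sm.rotate_secret(
        SecretId="prod/postgres/app-user",  # assumed secret name
        RotationLambdaARN="arn:aws:lambda:us-east-1:111122223333:function:PostgreSQLRotation",  # assumed
        RotationRules={"AutomaticallyAfterDays": 30},  # assumed cadence
    )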
