
Amazon Web Services AIF-C01 AWS Certified AI Practitioner Exam Practice Test

Page: 1 / 9
Total 93 questions

AWS Certified AI Practitioner Exam Questions and Answers

Question 1

Which option is a benefit of using Amazon SageMaker Model Cards to document AI models?

Options:

A.

Providing a visually appealing summary of a model's capabilities.

B.

Standardizing information about a model's purpose, performance, and limitations.

C.

Reducing the overall computational requirements of a model.

D.

Physically storing models for archival purposes.

Question 2

A financial institution is using Amazon Bedrock to develop an AI application. The application is hosted in a VPC. To meet regulatory compliance standards, the VPC is not allowed access to any internet traffic.

Which AWS service or feature will meet these requirements?

Options:

A.

AWS PrivateLink

B.

Amazon Macie

C.

Amazon CloudFront

D.

Internet gateway
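
For background on the kind of private connectivity this scenario describes, the following is a minimal boto3 sketch (not the scenario's actual setup) that creates a VPC interface endpoint for the Amazon Bedrock runtime so API traffic stays off the public internet; all resource IDs are placeholders.

    import boto3

    # Sketch: create an interface VPC endpoint for the Amazon Bedrock runtime
    # so calls from the VPC never traverse the public internet.
    # All resource IDs below are placeholders for illustration only.
    ec2 = boto3.client("ec2", region_name="us-east-1")
    response = ec2.create_vpc_endpoint(
        VpcId="vpc-0123456789abcdef0",
        VpcEndpointType="Interface",
        ServiceName="com.amazonaws.us-east-1.bedrock-runtime",
        SubnetIds=["subnet-0123456789abcdef0"],
        SecurityGroupIds=["sg-0123456789abcdef0"],
        PrivateDnsEnabled=True,
    )
    print(response["VpcEndpoint"]["VpcEndpointId"])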

Question 3

An AI practitioner has a database of animal photos. The AI practitioner wants to automatically identify and categorize the animals in the photos without manual human effort.

Which strategy meets these requirements?

Options:

A.

Object detection

B.

Anomaly detection

C.

Named entity recognition

D.

Inpainting

Question 4

A company wants to deploy a conversational chatbot to answer customer questions. The chatbot is based on a fine-tuned Amazon SageMaker JumpStart model. The application must comply with multiple regulatory frameworks.

For which capabilities can the company show compliance? (Select TWO.)

Options:

A.

Auto scaling inference endpoints

B.

Threat detection

C.

Data protection

D.

Cost optimization

E.

Loosely coupled microservices

Question 5

A company wants to use a large language model (LLM) on Amazon Bedrock for sentiment analysis. The company wants to know how much information can fit into one prompt.

Which consideration will inform the company's decision?

Options:

A.

Temperature

B.

Context window

C.

Batch size

D.

Model size

Question 6

A company wants to create an application by using Amazon Bedrock. The company has a limited budget and prefers flexibility without long-term commitment.

Which Amazon Bedrock pricing model meets these requirements?

Options:

A.

On-Demand

B.

Model customization

C.

Provisioned Throughput

D.

Spot Instance

Question 7

A company wants to use large language models (LLMs) with Amazon Bedrock to develop a chat interface for the company's product manuals. The manuals are stored as PDF files.

Which solution meets these requirements MOST cost-effectively?

Options:

A.

Use prompt engineering to add one PDF file as context to the user prompt when the prompt is submitted to Amazon Bedrock.

B.

Use prompt engineering to add all the PDF files as context to the user prompt when the prompt is submitted to Amazon Bedrock.

C.

Use all the PDF documents to fine-tune a model with Amazon Bedrock. Use the fine-tuned model to process user prompts.

D.

Upload PDF documents to an Amazon Bedrock knowledge base. Use the knowledge base to provide context when users submit prompts to Amazon Bedrock.
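
As background on the knowledge base option mentioned above, here is a minimal boto3 sketch of querying an existing Amazon Bedrock knowledge base with the RetrieveAndGenerate API; the knowledge base ID, model ARN, and question text are placeholders.

    import boto3

    # Sketch: answer a question using documents already indexed in a Bedrock
    # knowledge base. The knowledge base ID and model ARN are placeholders.
    client = boto3.client("bedrock-agent-runtime", region_name="us-east-1")
    response = client.retrieve_and_generate(
        input={"text": "How do I reset the device to factory settings?"},
        retrieveAndGenerateConfiguration={
            "type": "KNOWLEDGE_BASE",
            "knowledgeBaseConfiguration": {
                "knowledgeBaseId": "KBEXAMPLE123",
                "modelArn": "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-3-haiku-20240307-v1:0",
            },
        },
    )
    print(response["output"]["text"])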

Question 8

A company wants to create a chatbot by using a foundation model (FM) on Amazon Bedrock. The FM needs to access encrypted data that is stored in an Amazon S3 bucket. The data is encrypted with Amazon S3 managed keys (SSE-S3). The FM encounters a failure when attempting to access the S3 bucket data.

Which solution will meet these requirements?

Options:

A.

Ensure that the role that Amazon Bedrock assumes has permission to decrypt data with the correct encryption key.

B.

Set the access permissions for the S3 buckets to allow public access to enable access over the internet.

C.

Use prompt engineering techniques to tell the model to look for information in Amazon S3.

D.

Ensure that the S3 data does not contain sensitive information.
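
For context on the permissions theme in this question, a hedged boto3 sketch of attaching an S3 read policy to the service role that Amazon Bedrock assumes; with SSE-S3, Amazon S3 decrypts objects transparently once read access is granted. The role, policy, and bucket names are placeholders.

    import json

    import boto3

    # Sketch: grant the Bedrock service role read access to the bucket.
    # With SSE-S3, S3 handles decryption once s3:GetObject is allowed.
    # Role, policy, and bucket names are placeholders.
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": ["s3:GetObject", "s3:ListBucket"],
                "Resource": [
                    "arn:aws:s3:::example-chatbot-data",
                    "arn:aws:s3:::example-chatbot-data/*",
                ],
            }
        ],
    }

    iam = boto3.client("iam")
    iam.put_role_policy(
        RoleName="BedrockChatbotServiceRole",
        PolicyName="AllowReadChatbotData",
        PolicyDocument=json.dumps(policy),
    )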

Question 9

A company deployed an AI/ML solution to help customer service agents respond to frequently asked questions. The questions can change over time. The company wants to give customer service agents the ability to ask questions and receive automatically generated answers to common customer questions.

Which strategy will meet these requirements MOST cost-effectively?

Options:

A.

Fine-tune the model regularly.

B.

Train the model by using context data.

C.

Pre-train and benchmark the model by using context data.

D.

Use Retrieval Augmented Generation (RAG) with prompt engineering techniques.

Question 10

A company is developing a new model to predict the prices of specific items. The model performed well on the training dataset. When the company deployed the model to production, the model's performance decreased significantly.

What should the company do to mitigate this problem?

Options:

A.

Reduce the volume of data that is used in training.

B.

Add hyperparameters to the model.

C.

Increase the volume of data that is used in training.

D.

Increase the model training time.

Question 11

What are tokens in the context of generative AI models?

Options:

A.

Tokens are the basic units of input and output that a generative AI model operates on, representing words, subwords, or other linguistic units.

B.

Tokens are the mathematical representations of words or concepts used in generative AI models.

C.

Tokens are the pre-trained weights of a generative AI model that are fine-tuned for specific tasks.

D.

Tokens are the specific prompts or instructions given to a generative AI model to generate output.
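
To make the notion of tokens concrete, a purely illustrative Python sketch that splits text into made-up subword pieces; real generative AI models use learned vocabularies, so their token boundaries differ from this toy example.

    # Purely illustrative: a toy "tokenizer" that breaks a sentence into word
    # and subword units. Real models use learned vocabularies, so actual token
    # boundaries will differ.
    def toy_tokenize(text: str) -> list[str]:
        tokens = []
        for word in text.lower().split():
            # Arbitrarily split long words into 4-character chunks to mimic subwords.
            if len(word) <= 4:
                tokens.append(word)
            else:
                tokens.extend(word[i:i + 4] for i in range(0, len(word), 4))
        return tokens

    print(toy_tokenize("Generative models operate on tokens"))
    # ['gene', 'rati', 've', 'mode', 'ls', 'oper', 'ate', 'on', 'toke', 'ns']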

Question 12

A company is implementing the Amazon Titan foundation model (FM) by using Amazon Bedrock. The company needs to supplement the model by using relevant data from the company's private data sources.

Which solution will meet this requirement?

Options:

A.

Use a different FM

B.

Choose a lower temperature value

C.

Create an Amazon Bedrock knowledge base

D.

Enable model invocation logging

Question 13

A company is building a solution to generate images for protective eyewear. The solution must have high accuracy and must minimize the risk of incorrect annotations.

Which solution will meet these requirements?

Options:

A.

Human-in-the-loop validation by using Amazon SageMaker Ground Truth Plus

B.

Data augmentation by using an Amazon Bedrock knowledge base

C.

Image recognition by using Amazon Rekognition

D.

Data summarization by using Amazon QuickSight

Question 14

A company makes forecasts each quarter to decide how to optimize operations to meet expected demand. The company uses ML models to make these forecasts.

An AI practitioner is writing a report about the trained ML models to provide transparency and explainability to company stakeholders.

What should the AI practitioner include in the report to meet the transparency and explainability requirements?

Options:

A.

Code for model training

B.

Partial dependence plots (PDPs)

C.

Sample data for training

D.

Model convergence tables
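
For readers unfamiliar with partial dependence plots, a brief scikit-learn sketch that trains a simple regressor on synthetic data and plots partial dependence for two features; the dataset and feature indices are placeholders, not the company's forecasting model.

    import matplotlib.pyplot as plt
    from sklearn.datasets import make_regression
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.inspection import PartialDependenceDisplay

    # Sketch: fit a simple model on synthetic data and plot how its predictions
    # depend on two individual features (partial dependence).
    X, y = make_regression(n_samples=500, n_features=5, random_state=0)
    model = RandomForestRegressor(random_state=0).fit(X, y)

    PartialDependenceDisplay.from_estimator(model, X, features=[0, 1])
    plt.show()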

Question 15

A company has thousands of customer support interactions per day and wants to analyze these interactions to identify frequently asked questions and develop insights.

Which AWS service can the company use to meet this requirement?

Options:

A.

Amazon Lex

B.

Amazon Comprehend

C.

Amazon Transcribe

D.

Amazon Translate
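
As an illustration of this kind of text analysis, a minimal boto3 sketch that runs one support interaction through Amazon Comprehend to extract key phrases and sentiment; the sample text is invented, and a real workload would batch-process the transcripts.

    import boto3

    # Sketch: analyze a single support interaction with Amazon Comprehend.
    # In practice, thousands of transcripts would be processed in batches.
    comprehend = boto3.client("comprehend", region_name="us-east-1")
    text = "My order arrived late and the tracking page never updated."

    phrases = comprehend.detect_key_phrases(Text=text, LanguageCode="en")
    sentiment = comprehend.detect_sentiment(Text=text, LanguageCode="en")

    print([p["Text"] for p in phrases["KeyPhrases"]])
    print(sentiment["Sentiment"])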

Question 16

A company built an AI-powered resume screening system. The company used a large dataset to train the model. The dataset contained resumes that were not representative of all demographics.

Which core dimension of responsible AI does this scenario present?

Options:

A.

Fairness

B.

Explainability

C.

Privacy and security

D.

Transparency

Question 17

A company wants to use a large language model (LLM) on Amazon Bedrock for sentiment analysis. The company wants to know how much information can fit into one prompt.

Which consideration will inform the company's decision?

Options:

A.

Temperature

B.

Context window

C.

Batch size

D.

Model size

Question 18

A company manually reviews all submitted resumes in PDF format. As the company grows, the company expects the volume of resumes to exceed the company's review capacity. The company needs an automated system to convert the PDF resumes into plain text format for additional processing.

Which AWS service meets this requirement?

Options:

A.

Amazon Textract

B.

Amazon Personalize

C.

Amazon Lex

D.

Amazon Transcribe
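
For background on automated PDF-to-text conversion, a minimal boto3 sketch that starts an asynchronous Amazon Textract text detection job on a PDF stored in Amazon S3 and polls for the result; the bucket and object names are placeholders, and result pagination is omitted for brevity.

    import time

    import boto3

    # Sketch: extract plain text from a PDF resume stored in S3 with Amazon
    # Textract. Multipage PDFs require the asynchronous API.
    # Bucket and object names are placeholders.
    textract = boto3.client("textract", region_name="us-east-1")

    job = textract.start_document_text_detection(
        DocumentLocation={"S3Object": {"Bucket": "example-resumes", "Name": "candidate-001.pdf"}}
    )

    while True:
        result = textract.get_document_text_detection(JobId=job["JobId"])
        if result["JobStatus"] in ("SUCCEEDED", "FAILED"):
            break
        time.sleep(5)

    # Pagination via NextToken is omitted in this sketch.
    lines = [b["Text"] for b in result.get("Blocks", []) if b["BlockType"] == "LINE"]
    print("\n".join(lines))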

Question 19

A company is using a pre-trained large language model (LLM) to build a chatbot for product recommendations. The company needs the LLM outputs to be short and written in a specific language.

Which solution will align the LLM response quality with the company's expectations?

Options:

A.

Adjust the prompt.

B.

Choose an LLM of a different size.

C.

Increase the temperature.

D.

Increase the Top K value.
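
To show what adjusting the prompt and inference settings can look like in practice, a hedged boto3 sketch that uses the Amazon Bedrock Converse API to request a short answer in a specific language and cap the output length; the model ID and prompt are placeholders.

    import boto3

    # Sketch: steer a pre-trained LLM toward short, Spanish-language answers by
    # adjusting the prompt and capping output tokens. The model ID is a placeholder.
    bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

    response = bedrock.converse(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",
        messages=[
            {
                "role": "user",
                "content": [
                    {"text": "Recommend a laptop stand. Answer in Spanish, in no more than two sentences."}
                ],
            }
        ],
        inferenceConfig={"maxTokens": 150, "temperature": 0.2},
    )

    print(response["output"]["message"]["content"][0]["text"])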

Question 20

A company is using domain-specific models. The company wants to avoid creating new models from scratch. The company instead wants to adapt pre-trained models to create models for new, related tasks.

Which ML strategy meets these requirements?

Options:

A.

Increase the number of epochs.

B.

Use transfer learning.

C.

Decrease the number of epochs.

D.

Use unsupervised learning.
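
For context on transfer learning itself (independent of any AWS service), a brief PyTorch sketch that reuses a pretrained image classifier, freezes its backbone, and trains only a new classification head for a related task; the 10-class target is an assumption for illustration.

    import torch.nn as nn
    from torchvision import models

    # Sketch: adapt a pretrained ResNet-18 to a new 10-class task by freezing
    # the backbone and training only a new classification head.
    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

    for param in model.parameters():
        param.requires_grad = False  # keep the pretrained weights fixed

    model.fc = nn.Linear(model.fc.in_features, 10)  # new head for the related task
    # During training, only model.fc.parameters() are passed to the optimizer.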

Question 21

A company wants to make a chatbot to help customers. The chatbot will help solve technical problems without human intervention. The company chose a foundation model (FM) for the chatbot. The chatbot needs to produce responses that adhere to company tone.

Which solution meets these requirements?

Options:

A.

Set a low limit on the number of tokens the FM can produce.

B.

Use batch inferencing to process detailed responses.

C.

Experiment and refine the prompt until the FM produces the desired responses.

D.

Define a higher number for the temperature parameter.

Question 22

A company is building a solution to generate images for protective eyewear. The solution must have high accuracy and must minimize the risk of incorrect annotations.

Which solution will meet these requirements?

Options:

A.

Human-in-the-loop validation by using Amazon SageMaker Ground Truth Plus

B.

Data augmentation by using an Amazon Bedrock knowledge base

C.

Image recognition by using Amazon Rekognition

D.

Data summarization by using Amazon QuickSight

Question 23

Which functionality does Amazon SageMaker Clarify provide?

Options:

A.

Integrates a Retrieval Augmented Generation (RAG) workflow

B.

Monitors the quality of ML models in production

C.

Documents critical details about ML models

D.

Identifies potential bias during data preparation
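
For background, a hedged sketch of configuring a pre-training bias analysis with the SageMaker Python SDK's Clarify processor; the S3 paths, column names, facet, and instance settings are placeholders, and exact argument names can vary across SDK versions.

    import sagemaker
    from sagemaker import clarify

    # Sketch: run a pre-training bias report with SageMaker Clarify.
    # S3 paths, column names, and the facet are placeholders; exact arguments
    # may differ across versions of the SageMaker Python SDK.
    session = sagemaker.Session()
    role = sagemaker.get_execution_role()  # assumes execution inside SageMaker

    processor = clarify.SageMakerClarifyProcessor(
        role=role,
        instance_count=1,
        instance_type="ml.m5.xlarge",
        sagemaker_session=session,
    )

    data_config = clarify.DataConfig(
        s3_data_input_path="s3://example-bucket/train.csv",
        s3_output_path="s3://example-bucket/clarify-output/",
        label="approved",
        headers=["approved", "age", "income", "gender"],
        dataset_type="text/csv",
    )

    bias_config = clarify.BiasConfig(
        label_values_or_threshold=[1],
        facet_name="gender",
    )

    processor.run_pre_training_bias(data_config=data_config, data_bias_config=bias_config)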

Question 24

An AI practitioner trained a custom model on Amazon Bedrock by using a training dataset that contains confidential data. The AI practitioner wants to ensure that the custom model does not generate inference responses based on confidential data.

How should the AI practitioner prevent responses based on confidential data?

Options:

A.

Delete the custom model. Remove the confidential data from the training dataset. Retrain the custom model.

B.

Mask the confidential data in the inference responses by using dynamic data masking.

C.

Encrypt the confidential data in the inference responses by using Amazon SageMaker.

D.

Encrypt the confidential data in the custom model by using AWS Key Management Service (AWS KMS).

Question 25

A large retailer receives thousands of customer support inquiries about products every day. The customer support inquiries need to be processed and responded to quickly. The company wants to implement Agents for Amazon Bedrock.

What are the key benefits of using Amazon Bedrock agents that could help this retailer?

Options:

A.

Generation of custom foundation models (FMs) to predict customer needs

B.

Automation of repetitive tasks and orchestration of complex workflows

C.

Automatically calling multiple foundation models (FMs) and consolidating the results

D.

Selecting the foundation model (FM) based on predefined criteria and metrics

Question 26

Which option is a benefit of ongoing pre-training when fine-tuning a foundation model (FM)?

Options:

A.

Helps decrease the model's complexity

B.

Improves model performance over time

C.

Decreases the training time requirement

D.

Optimizes model inference time

Question 27

A company wants to develop a large language model (LLM) application by using Amazon Bedrock and customer data that is uploaded to Amazon S3. The company's security policy states that each team can access data for only the team's own customers.

Which solution will meet these requirements?

Options:

A.

Create an Amazon Bedrock custom service role for each team that has access to only the team's customer data.

B.

Create a custom service role that has Amazon S3 access. Ask teams to specify the customer name on each Amazon Bedrock request.

C.

Redact personal data in Amazon S3. Update the S3 bucket policy to allow team access to customer data.

D.

Create one Amazon Bedrock role that has full Amazon S3 access. Create IAM roles for each team that have access to only each team's customer folders.
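
To illustrate folder-scoped access for a single team, a hedged sketch of an IAM policy document (expressed as a Python dict) that limits one team's role to its own prefix in the shared bucket; the bucket, prefix, and policy names are placeholders.

    import json

    # Sketch: an IAM policy document restricting one team's role to that team's
    # customer prefix in the shared bucket. Bucket and prefix names are
    # placeholders for illustration only.
    team_a_policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": "s3:ListBucket",
                "Resource": "arn:aws:s3:::example-customer-data",
                "Condition": {"StringLike": {"s3:prefix": "team-a/*"}},
            },
            {
                "Effect": "Allow",
                "Action": "s3:GetObject",
                "Resource": "arn:aws:s3:::example-customer-data/team-a/*",
            },
        ],
    }

    print(json.dumps(team_a_policy, indent=2))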
