amazon · AWS-Certified-AI-Practitioner-AIF-C01 · Q604 · hot_area · llm_inference_modes, batch_vs_realtime

HOTSPOT - A company has developed a large language model (LLM) and wants to make the LLM available to multiple internal teams. The company needs to select the appropriate inference mode for each team. Select the correct inference mode from the list for each use case. Each inference mode may be selected one or more times. [Image: hot-area table pairing team use cases with inference-mode choices; not recoverable from this extract.]
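The distinction this question tests is the standard one between the two common LLM inference modes: real-time (on-demand) inference for latency-sensitive, interactive workloads, and batch inference for large offline datasets where throughput and cost per token matter more than per-request latency. As a study aid only, here is a minimal sketch of that rule of thumb; the function name, parameters, and thresholds are my own illustrative assumptions, not the exam's answer key or AWS guidance:

```python
# Hypothetical study aid: a minimal rule of thumb for choosing between the
# two common LLM inference modes. Names and logic here are illustrative
# assumptions, not the exam's hidden answer.

def choose_inference_mode(needs_low_latency: bool, large_offline_dataset: bool) -> str:
    """Pick an inference mode for an internal team's workload.

    - "real-time": interactive apps that need a response per request
      (chatbots, copilots, live document Q&A).
    - "batch": large datasets processed offline, where throughput and
      cost matter more than per-request latency (nightly scoring,
      bulk summarization, embedding backfills).
    """
    if needs_low_latency:
        return "real-time"
    if large_offline_dataset:
        return "batch"
    # Default to on-demand when requirements are unclear.
    return "real-time"

# Example internal-team use cases (hypothetical):
print(choose_inference_mode(needs_low_latency=True, large_offline_dataset=False))   # e.g. a support chatbot
print(choose_inference_mode(needs_low_latency=False, large_offline_dataset=True))   # e.g. nightly document summarization
```

When answering hot-area items like this, mapping each use case to "interactive / per-request" versus "large offline job" usually determines the mode directly.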