# AWS-Certified-Data-Analytics---Specialty — Question 427

**Type:** multiple_choice
**Topics:** topic_1

## Question

A company wants to collect and process event data from different departments in near-real time. Before storing the data in Amazon S3, the company needs to clean the data by standardizing the format of the address and timestamp columns. The data volume varies with the overall load at any given point in time. A single data record can be 100 KB-10 MB.
How should a data analytics specialist design the solution for data ingestion?

## Correct Answer

C (per the explanation below).

## Explanation

C should be the right answer because of the key requirement: "A single data record can be 100 KB-10 MB."

- Kinesis Data Firehose: the maximum size of a record sent to Kinesis Data Firehose, before base64 encoding, is 1,000 KiB, so a 10 MB record does not fit.
- Kinesis Data Streams: the maximum size of a record's data payload, before base64 encoding, is 1 MB, so a 10 MB record does not fit either.
- Amazon SQS: supports payloads of up to 2 GB via the Extended Client Library, which stores the payload in Amazon S3 and sends a pointer in the message: https://aws.amazon.com/pt/about-aws/whats-new/2015/10/now-send-payloads-up-to-2gb-with-amazon-sqs/
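To make the size comparison concrete, here is a minimal sketch (the service names and helper function are illustrative, not from any AWS SDK) that checks a record size against the per-record limits quoted above:

```python
# Per-record payload limits quoted in the explanation (before base64 encoding).
KIB = 1024
MIB = 1024 * 1024

LIMITS = {
    "kinesis_firehose": 1000 * KIB,      # 1,000 KiB per Firehose record
    "kinesis_streams": 1 * MIB,          # 1 MB data payload per Streams record
    "sqs_extended": 2 * 1024 * MIB,      # up to 2 GB via the SQS Extended Client Library
}

def services_that_fit(record_size_bytes):
    """Return the ingestion options whose per-record limit covers this size."""
    return [name for name, limit in LIMITS.items() if record_size_bytes <= limit]

# The scenario's worst case: a single 10 MB record.
print(services_that_fit(10 * MIB))  # → ['sqs_extended']
```

Running the check for the 100 KB lower bound would return all three services; it is only the 10 MB upper bound that rules out both Kinesis options and leaves the SQS extended-client path.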

**Reference:** examtopics_top_comment

---
Source: https://hiexam.net/q/amazon/AWS-Certified-Data-Analytics---Specialty/427  