Hard · Multiple choice · Objective-mapped

A company ingests millions of IoT events per second from sensors around the world. Each event is a JSON message with timestamp, device ID, and readings. They need to support real-time analytics dashboards and also store all raw data for long-term historical analysis. They want to minimize operational overhead. Which Azure data storage solution should they recommend?

Answer choices

Why each option matters

Good practice is not just about finding the correct option: the wrong answers often reveal the exact trap the exam wants you to fall into.

A

Distractor review

Azure Data Lake Storage Gen2 for all data.

Data Lake Storage is a great store for historical data but does not provide real-time stream ingestion. You would still need an ingestion service. Additionally, storing every event directly to Data Lake without batching could be inefficient.

B

Best answer

Azure Event Hubs with Capture to Azure Data Lake Storage.

Event Hubs can handle millions of events per second. The Capture feature automatically writes ingested events to Data Lake Storage in Avro format, preserving the original JSON payload in each record. For real-time dashboards, you can use Stream Analytics to query the Event Hubs stream. This combination gives a seamless, low-operational-overhead solution.
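
If it helps to see the ingestion side concretely, the sketch below publishes sample JSON events to an event hub with the azure-eventhub Python SDK (v5). The connection string, hub name and event shape are placeholders for illustration, and Capture is assumed to be enabled separately on the hub (portal, CLI or ARM template) rather than in this client code.

```python
# Minimal sketch: sending JSON IoT events to Azure Event Hubs in batches.
# Connection string, hub name and event shape are hypothetical placeholders.
import json
import time

from azure.eventhub import EventHubProducerClient, EventData

CONNECTION_STR = "<event-hubs-namespace-connection-string>"  # hypothetical
EVENT_HUB_NAME = "iot-telemetry"                             # hypothetical

def sample_event(device_id: str) -> dict:
    """Event shape mirrors the scenario: timestamp, device ID and readings."""
    return {
        "timestamp": time.time(),
        "deviceId": device_id,
        "readings": {"temperature": 21.7, "humidity": 48.2},
    }

producer = EventHubProducerClient.from_connection_string(
    conn_str=CONNECTION_STR, eventhub_name=EVENT_HUB_NAME
)

with producer:
    batch = producer.create_batch()        # batching keeps throughput high
    for i in range(100):
        event = EventData(json.dumps(sample_event(f"device-{i}")))
        try:
            batch.add(event)
        except ValueError:                 # batch full: send it and start a new one
            producer.send_batch(batch)
            batch = producer.create_batch()
            batch.add(event)
    producer.send_batch(batch)             # flush the final batch
```

Once Capture is enabled on the hub, these events are written to Data Lake Storage automatically; nothing extra runs on the client side.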

C

Distractor review

Azure Cosmos DB for both real-time and historical data.

Cosmos DB can support real-time queries, but at this write throughput it becomes very expensive. Storing billions of raw events long term in Cosmos DB is not cost-effective and does not play to its strengths as a low-latency operational NoSQL document store.

D

Distractor review

Azure Time Series Insights (TSI) Standard.

TSI is purpose-built for IoT time-series data and can handle high cardinality. However, it is not ideal for storing raw JSON events long term at massive scale, and its warm/cold store costs can exceed those of Data Lake Storage. TSI also does not offer the same flexibility to point other analytics tools at the raw data.

Common exam trap

Common exam trap: assuming one service can be both the ingestion tier and the archive

The scenario needs high-throughput ingestion, real-time analytics and low-cost long-term storage. The trap is picking a single store (Data Lake, Cosmos DB or Time Series Insights) and assuming it can also absorb millions of events per second, or picking an ingestion service and forgetting where the raw history lands.

Technical deep dive

How to think about this question

Questions like this test whether you can map each requirement to the right service: millions of events per second points to Event Hubs, real-time dashboards point to Stream Analytics over the live stream, long-term raw retention points to Data Lake Storage Gen2, and "minimize operational overhead" points to managed features such as Capture rather than custom pipelines. Read the throughput and overhead wording carefully before committing to an option.

Key Concepts to Remember

  • Azure Event Hubs is built for high-throughput event ingestion and scales to millions of events per second.
  • Event Hubs Capture automatically lands ingested events in Blob Storage or Data Lake Storage with no code to run or maintain.
  • Azure Stream Analytics can query the Event Hubs stream directly to feed real-time dashboards.
  • Data Lake Storage Gen2 provides low-cost, durable storage for long-term historical analysis with any analytics engine.

Exam Day Tips

  • List the requirements first: ingestion throughput, real-time analytics, historical retention and operational overhead.
  • When the scenario says "minimize operational overhead", favour managed, built-in capabilities such as Event Hubs Capture over custom pipelines.
  • Eliminate options that press an operational database such as Cosmos DB into service as a long-term archive for raw events.

Related practice questions

Related AZ-305 practice-question pages

Use these pages to review the topic behind this question. This is how one missed question becomes focused revision.

More questions from this exam

Keep practising from the same exam bank, or move into a focused topic page if this question exposed a weak area.

FAQ

Questions learners often ask

What does this AZ-305 question test?

It tests whether you can design a data storage solution for high-throughput IoT ingestion: choosing services that can ingest millions of events per second, feed real-time analytics dashboards, retain all raw data for long-term historical analysis and keep operational overhead low.

What is the correct answer to this question?

The correct answer is: Azure Event Hubs with Capture to Azure Data Lake Storage. Azure Event Hubs is designed for high-throughput event ingestion, and Capture automatically writes every event to Azure Blob Storage or Azure Data Lake Storage in Avro format, preserving the original JSON payload for cost-effective, durable historical analysis. For real-time analytics, Azure Stream Analytics can process the stream and feed a dashboard. The combination minimizes operational overhead because ingestion, capture and storage are all managed services. Time Series Insights is optimized for IoT time series but can be more expensive for petabyte-scale history, and Cosmos DB is a poor fit for storing billions of raw events because of the cost and performance profile of write-heavy workloads.
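
For the historical-analysis half of the answer, the sketch below reads Capture output back from Data Lake Storage Gen2 with the azure-storage-file-datalake SDK and fastavro. The account URL, filesystem and path are placeholders, and it assumes the standard Capture Avro layout in which the original event payload sits in each record's "Body" field.

```python
# Minimal sketch: loading Event Hubs Capture Avro files from Data Lake Storage
# Gen2 for historical analysis. All names below are hypothetical placeholders.
import io
import json

from azure.storage.filedatalake import DataLakeServiceClient
from fastavro import reader

ACCOUNT_URL = "https://<storage-account>.dfs.core.windows.net"  # hypothetical
CREDENTIAL = "<account-key-or-azure-credential>"                # hypothetical
FILESYSTEM = "capture"                                          # hypothetical container
CAPTURE_PREFIX = "iot-namespace/iot-telemetry"                  # hypothetical capture path

service = DataLakeServiceClient(account_url=ACCOUNT_URL, credential=CREDENTIAL)
fs = service.get_file_system_client(FILESYSTEM)

events = []
for path in fs.get_paths(path=CAPTURE_PREFIX, recursive=True):
    if path.is_directory or not path.name.endswith(".avro"):
        continue
    avro_bytes = fs.get_file_client(path.name).download_file().readall()
    for record in reader(io.BytesIO(avro_bytes)):
        # Capture wraps each event; the JSON payload sent by the device is
        # assumed to be the bytes in the "Body" field.
        events.append(json.loads(record["Body"]))

print(f"Loaded {len(events)} historical events for analysis")
```

In practice a Spark or Synapse job would scan these files at scale; the point here is simply that the raw JSON events remain fully accessible to any analytics tool once Capture has landed them in the lake.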

What should I do if I get this AZ-305 question wrong?

Revisit the distractor reviews to understand why the wrong options are tempting, then try more questions from the same exam bank to reinforce the pattern.
