A company ingests millions of IoT events per second from sensors around the world. Each event is a JSON message with timestamp, device ID, and readings. They need to support real-time analytics dashboards and also store all raw data for long-term historical analysis. They want to minimize operational overhead. Which Azure data storage solution should they recommend?
Answer choices
Why each option matters
Good practice is not just finding the correct option. The wrong answers often show the exact trap the exam wants you to fall into.
Distractor review
Azure Data Lake Storage Gen2 for all data.
Data Lake Storage is an excellent store for historical data, but it does not ingest real-time streams on its own; you would still need an ingestion service in front of it. Writing every event to the lake individually, without batching, would also produce millions of tiny files and be inefficient for analytics.
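To make the small-file problem concrete, here is a minimal Python sketch of time-based batching, the strategy a capture process uses so the lake receives a few larger files per window instead of one object per event. The five-minute window and field names are illustrative assumptions, not Azure defaults.

```python
from datetime import datetime, timedelta

def batch_events(events, window_seconds=300):
    """Group events into time-window batches, mimicking how a capture
    process rolls files (e.g. one file per five-minute window) instead
    of writing one tiny object per event."""
    batches = {}
    for event in events:
        ts = datetime.fromisoformat(event["timestamp"])
        # Floor the timestamp to the start of its window.
        window_start = ts - timedelta(seconds=ts.timestamp() % window_seconds)
        batches.setdefault(window_start.isoformat(), []).append(event)
    return batches

# Three events across two 5-minute windows -> two batch files, not three.
events = [
    {"deviceId": "sensor-1", "timestamp": "2024-01-01T00:00:10+00:00", "temperature": 21.5},
    {"deviceId": "sensor-2", "timestamp": "2024-01-01T00:01:30+00:00", "temperature": 19.0},
    {"deviceId": "sensor-1", "timestamp": "2024-01-01T00:06:00+00:00", "temperature": 22.0},
]
batches = batch_events(events)
```

With Event Hubs Capture this batching happens automatically on a configurable time or size window, which is exactly the operational overhead the scenario wants to avoid building by hand.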
Best answer
Azure Event Hubs with Capture to Azure Data Lake Storage.
Event Hubs can ingest millions of events per second. The Capture feature automatically writes ingested events to Data Lake Storage in Avro format, with no custom code. For real-time dashboards, Azure Stream Analytics can query the Event Hubs stream directly. Together these provide a seamless, low-operational-overhead solution.
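To illustrate the real-time path, the following Python sketch simulates what a Stream Analytics tumbling-window query (for example, AVG(temperature) per device over 60 seconds) would emit to a dashboard. The field names and window length are assumptions for illustration; in Azure, Stream Analytics would run the equivalent query over the live Event Hubs stream.

```python
from collections import defaultdict
from datetime import datetime

def tumbling_window_avg(readings, window_seconds=60):
    """Per-device average per tumbling window -- the shape of result a
    Stream Analytics query grouping by deviceId and a 60-second
    tumbling window would emit to a dashboard."""
    acc = defaultdict(lambda: [0.0, 0])  # (deviceId, window_start) -> [sum, count]
    for r in readings:
        epoch = datetime.fromisoformat(r["timestamp"]).timestamp()
        # Floor the event time to the start of its tumbling window.
        window_start = int(epoch // window_seconds) * window_seconds
        entry = acc[(r["deviceId"], window_start)]
        entry[0] += r["temperature"]
        entry[1] += 1
    return {key: total / count for key, (total, count) in acc.items()}

# Two sensor-1 readings in the same 60-second window average to 21.0.
readings = [
    {"deviceId": "sensor-1", "timestamp": "2024-01-01T00:00:05+00:00", "temperature": 20.0},
    {"deviceId": "sensor-1", "timestamp": "2024-01-01T00:00:50+00:00", "temperature": 22.0},
    {"deviceId": "sensor-2", "timestamp": "2024-01-01T00:00:30+00:00", "temperature": 18.0},
]
averages = tumbling_window_avg(readings)
```

The point of the sketch is the division of labour: the stream processor computes small windowed aggregates for the dashboard, while Capture lands the full raw stream in the lake for historical queries.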
Distractor review
Azure Cosmos DB for both real-time and historical data.
Cosmos DB can serve real-time queries, but at this write throughput it becomes very expensive. Storing billions of raw events in Cosmos DB for the long term is not cost-effective and does not play to its strengths as a NoSQL document database.
Distractor review
Azure Time Series Insights (TSI) Standard.
TSI is purpose-built for IoT time-series data and handles high cardinality well. However, it is not ideal for storing raw JSON events long-term at massive scale, and its warm/cold store can cost more than Data Lake Storage. It also offers less flexibility for running other analytics tools over the data.
Common exam trap
Common exam trap: a storage service is not an ingestion service
Data Lake Storage, Blob Storage and Cosmos DB can all hold event data, but none of them ingests millions of events per second on its own. The exam expects you to pair a durable store with a streaming ingestion service such as Event Hubs.
Technical deep dive
How to think about this question
Questions like this usually test whether you can separate three concerns: high-throughput ingestion, real-time processing and long-term storage. Read the throughput figure and the "minimize operational overhead" requirement carefully; they point to managed, built-in features rather than custom pipelines.
Key Concepts to Remember
- Azure Event Hubs ingests millions of events per second and is the front door for streaming data.
- Event Hubs Capture automatically lands raw events in Blob Storage or Data Lake Storage with no custom code.
- Azure Stream Analytics queries the live stream to feed real-time dashboards.
- Data Lake Storage Gen2 provides low-cost, analytics-friendly storage for raw historical data.
Exam Day Tips
- Separate the ingestion, real-time and historical-storage requirements before choosing services.
- "Millions of events per second" points to Event Hubs; "long-term raw data" points to Data Lake Storage.
- "Minimize operational overhead" favours built-in features such as Capture over custom ingestion pipelines.
Related practice questions
Related AZ-305 practice-question pages
Use these pages to review the topic behind this question. This is how one missed question becomes focused revision.
SAA-C03 VPC practice questions
Practise AZ-305 questions linked to SAA-C03 VPC.
SAA-C03 S3 lifecycle policy questions
Practise AZ-305 questions linked to SAA-C03 S3 lifecycle policy questions.
SAA-C03 RDS Multi-AZ questions
Practise AZ-305 questions linked to SAA-C03 RDS Multi-AZ questions.
SAA-C03 IAM policy practice questions
Practise AZ-305 questions linked to SAA-C03 IAM policy.
SAA-C03 Route 53 failover questions
Practise AZ-305 questions linked to SAA-C03 Route 53 failover questions.
SAA-C03 CloudFront practice questions
Practise AZ-305 questions linked to SAA-C03 CloudFront.
SAA-C03 NAT gateway questions
Practise AZ-305 questions linked to SAA-C03 NAT gateway questions.
SAA-C03 VPC endpoint questions
Practise AZ-305 questions linked to SAA-C03 VPC endpoint questions.
SAA-C03 Auto Scaling practice questions
Practise AZ-305 questions linked to SAA-C03 Auto Scaling.
SAA-C03 disaster recovery questions
Practise AZ-305 questions linked to SAA-C03 disaster recovery questions.
SAA-C03 high availability questions
Practise AZ-305 questions linked to SAA-C03 high availability questions.
SAA-C03 cost optimization questions
Practise AZ-305 questions linked to SAA-C03 cost optimization questions.
More questions from this exam
Keep practising from the same exam bank, or move into a focused topic page if this question exposed a weak area.
Question 1
A company is designing hub-and-spoke networking. Spoke VNets must use a central Azure Firewall for outbound internet traffic. Which two configurations are required?
Question 2
A company is designing private access to a PaaS database from workloads in a VNet. The database should not be reachable over its public endpoint. What should be recommended?
Question 3
A data platform must support analytical queries over petabytes of files in a data lake, while preserving hierarchical namespaces and fine-grained ACLs. Which storage service should you design around?
Question 4
A database workload has an RPO of 15 minutes and an RTO of 4 hours. Cost is more important than near-zero data loss. Which design is usually more appropriate than synchronous multi-region replication?
Question 5
A hub-and-spoke Azure network must centralize outbound inspection and still allow spokes to resolve private endpoint DNS names. Which two components are commonly required? (Choose 2.)
Question 6
A multinational company uses Microsoft Entra ID and several Azure subscriptions. Security administrators need to review privileged role assignments every month and require justification for continued access. Which design should be recommended?
FAQ
Questions learners often ask
What does this AZ-305 question test?
It tests whether you can design a high-throughput IoT ingestion and storage architecture: pairing Azure Event Hubs for stream ingestion with its Capture feature to land raw events in Azure Data Lake Storage for long-term analysis, while Stream Analytics serves real-time dashboards.
What is the correct answer to this question?
The correct answer is: Azure Event Hubs with Capture to Azure Data Lake Storage. Azure Event Hubs is designed for high-throughput event ingestion, and its Capture feature automatically lands all events in Azure Blob Storage or Azure Data Lake Storage in Avro format. This provides cost-effective, durable storage for historical analysis. For real-time analytics, Azure Stream Analytics can process the stream and feed a dashboard. The combination minimizes operational overhead by using managed ingestion and storage. Time Series Insights is optimized for IoT time series but can be more expensive at petabyte scale for raw historical data, and Cosmos DB is not ideal for storing billions of raw events because of the cost of write-heavy workloads at this scale.
What should I do if I get this AZ-305 question wrong?
Then try more questions from the same exam bank and focus on understanding why the wrong options are tempting.
Discussion
Sign in to join the discussion.