A manufacturing company collects temperature and vibration data from thousands of sensors. The data is streamed to Azure Event Hubs. The company wants to store all this raw data in Azure Data Lake Storage Gen2 for future batch analytics. They need a solution that automatically writes the streaming data to the data lake in near real-time, without requiring any custom code for the write operation. Which Azure feature should they use?
Answer choices
Why each option matters
Good practice is not just about finding the correct option. The wrong answers often reveal the exact trap the exam wants you to fall into.
Distractor review
Azure Stream Analytics job output to Azure Data Lake Storage Gen2
While Stream Analytics can write to ADLS Gen2, it requires you to define and run a streaming job; it is not an automatic, built-in feature of Event Hubs itself.
Best answer
Azure Event Hubs Capture
Event Hubs Capture automatically writes streaming data to Azure Blob Storage or Azure Data Lake Storage Gen2 without any custom code. It delivers the data in Avro format on a configurable time or size window, which makes it well suited to long-term storage and batch analytics.
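To show what "no custom code for the write operation" means in practice, here is a minimal sketch, not a definitive implementation, using the azure-mgmt-eventhub and azure-identity Python packages to enable Capture on an existing event hub. Every resource name and ID below is a placeholder.

```python
# A minimal sketch: enable Event Hubs Capture so the service itself
# writes to the data lake. Assumes the azure-identity and
# azure-mgmt-eventhub packages; every name and ID is a placeholder.
from azure.identity import DefaultAzureCredential
from azure.mgmt.eventhub import EventHubManagementClient
from azure.mgmt.eventhub.models import (
    CaptureDescription,
    Destination,
    EncodingCaptureDescription,
    Eventhub,
)

client = EventHubManagementClient(DefaultAzureCredential(), "<subscription-id>")

client.event_hubs.create_or_update(
    resource_group_name="sensors-rg",      # placeholder resource group
    namespace_name="sensors-ns",           # placeholder namespace
    event_hub_name="telemetry",            # placeholder event hub
    parameters=Eventhub(
        partition_count=4,
        capture_description=CaptureDescription(
            enabled=True,
            encoding=EncodingCaptureDescription.AVRO,  # Capture writes Avro
            interval_in_seconds=300,        # flush at most every 5 minutes
            size_limit_in_bytes=314572800,  # or whenever 300 MB accumulate
            destination=Destination(
                name="EventHubArchive.AzureBlockBlob",
                storage_account_resource_id="<adls-gen2-account-resource-id>",
                blob_container="raw-telemetry",
                archive_name_format=(
                    "{Namespace}/{EventHub}/{PartitionId}/"
                    "{Year}/{Month}/{Day}/{Hour}/{Minute}/{Second}"
                ),
            ),
        ),
    ),
)
```

With that single configuration change, Event Hubs delivers Avro files to the container whenever the time or size window fills, whichever comes first; no consumer application is needed on the write path.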
Distractor review
Azure Data Factory Copy Activity
Data Factory Copy Activity is a scheduled data movement tool, not a real-time automatic capture mechanism. It requires an orchestration trigger and is not directly integrated with Event Hubs for continuous capture.
Distractor review
Azure Synapse Pipelines
Synapse Pipelines are similar to Data Factory and are used for orchestrating data movement and transformation on a schedule, not for automatic real-time capture from Event Hubs.
Common exam trap
Common exam trap: answer the scenario, not the keyword
Many certification questions include familiar terms but test a specific constraint. Read the exact wording before choosing an answer that is generally true but wrong for this case.
Technical deep dive
How to think about this question
This question should be treated as a scenario, not a definition check. Identify the problem, the constraint and the best action. Then compare each option against those facts.
Key Concepts to Remember
- Read the scenario before looking for a memorised answer.
- Find the constraint that changes the correct option.
- Eliminate answers that are true in general but not in this case.
- Use explanations to understand the rule behind the answer.
Exam Day Tips
- Underline the problem statement mentally.
- Watch for qualifying words such as "best", "first", "most likely" and "least administrative effort".
- Review why wrong options are wrong, not only why the correct option is correct.
Related practice questions
Related DP-900 practice-question pages
Use these pages to review the topic behind this question. This is how one missed question becomes focused revision.
More questions from this exam
Keep practising from the same exam bank, or move into a focused topic page if this question exposed a weak area.
Question 1
A data engineer needs to process streaming data from IoT devices and store the results in Azure Data Lake Storage for long-term analytics. The data must be processed in near real-time to detect anomalies and trigger alerts. Which Azure service should the engineer use for stream processing?
Question 2
A data engineer needs to query data stored in CSV files in Azure Data Lake Storage Gen2 using T-SQL in Azure Synapse Analytics, without loading the data into the database. Which feature should they use?
Question 3
A data engineer needs to process raw clickstream data from multiple websites that is stored in Azure Blob Storage as JSON files. The processing must run automatically every hour, transform the data into a structured format for reporting, and handle schema changes in the source data without manual intervention. Which Azure service should be used?
Question 4
A data engineer is designing a data lake architecture in Azure. They plan to first ingest raw data from various sources into a landing zone in Azure Data Lake Storage Gen2. Then they will clean, validate, and deduplicate that data in a second zone. Finally, they will create aggregated, business-ready datasets in a third zone for analysts. This layered approach is known as which architecture?
Question 5
A data engineer needs to transform large datasets stored in Azure Data Lake Storage Gen2 using Python and Apache Spark. They want a serverless compute option that automatically scales and requires no cluster management. Which Azure service should they use?
Question 6
A company collects customer feedback forms. Each form contains always-present fields like CustomerID and SubmissionDate, but also a free-text Comments field and optional fields like Rating or ProductCategory that vary between forms. How should this data be classified?
FAQ
Questions learners often ask
What does this DP-900 question test?
It tests whether you can match a streaming-ingestion scenario to the built-in Event Hubs feature that writes to Azure Data Lake Storage Gen2 automatically, rather than a service that is only generally related. Read the scenario before reaching for a memorised answer.
What is the correct answer to this question?
The correct answer is Azure Event Hubs Capture. Capture automatically writes streaming data from Event Hubs to Azure Blob Storage or Azure Data Lake Storage Gen2 without any custom code, storing it in Avro format for long-term retention and batch analytics. In contrast, Azure Stream Analytics requires defining a streaming job, Azure Data Factory Copy Activity is a scheduled data movement tool, and Azure Synapse Pipelines are orchestration pipelines; none of these provide automatic, code-free capture directly from Event Hubs.
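Because Capture stores events in Avro, downstream batch jobs can read the files directly. Here is a minimal sketch, assuming the fastavro package and a Capture-produced file already downloaded from the data lake (the filename below is hypothetical):

```python
# A minimal sketch: read an Avro file written by Event Hubs Capture.
# Assumes the fastavro package; the filename is hypothetical.
from fastavro import reader

with open("telemetry-capture.avro", "rb") as f:
    for record in reader(f):
        # Each Capture record wraps one event: the original payload lives
        # in Body (raw bytes), alongside metadata such as SequenceNumber,
        # Offset and EnqueuedTimeUtc.
        payload = record["Body"]
        print(record["EnqueuedTimeUtc"], payload[:80])
```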
What should I do if I get this DP-900 question wrong?
Reread the explanation first, then try more questions from the same exam bank and focus on understanding why the wrong options are tempting.