Medium · Multiple choice · Objective-mapped

A company stores large video files in Azure Blob Storage. The files are accessed frequently for the first 30 days after upload, then rarely for the next 180 days, and after that they are only needed for compliance but never accessed. The company wants to minimize storage costs while ensuring the files remain durable and accessible. Which strategy should they implement?


Answer choices

Why each option matters

Good practice means more than finding the correct option: the wrong answers often reveal the exact trap the exam wants you to fall into.

A

Distractor review

Store all files in the Cool access tier and apply lifecycle management to move files to the Archive tier after 30 days.

The Cool tier is not optimal for the first 30 days of frequent access because it has higher read costs and lower availability than Hot. Moving to Archive after only 30 days is too early: during the 180-day rare-access period, every read would require rehydrating the blob, incurring high retrieval costs and hours of latency.

B

Best answer

Store files initially in the Hot tier, then use lifecycle management to move files to Cool after 30 days and to Archive after 210 days.

This strategy matches the access patterns: Hot for the frequent first 30 days, Cool for the rare next 180 days, and Archive for the never-accessed compliance period. Lifecycle management automates transitions, minimizing costs.
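The schedule in option B maps directly onto a lifecycle management policy. The sketch below builds the policy document in the JSON shape Azure expects; the rule name and the `videos/` prefix filter are illustrative placeholders, not values from the question.

```python
import json

# Lifecycle management policy implementing the tiering schedule from option B.
# The rule name and the "videos/" prefix are hypothetical examples.
policy = {
    "rules": [
        {
            "enabled": True,
            "name": "tier-video-files",  # illustrative rule name
            "type": "Lifecycle",
            "definition": {
                "filters": {
                    "blobTypes": ["blockBlob"],
                    "prefixMatch": ["videos/"],  # assumed blob prefix
                },
                "actions": {
                    "baseBlob": {
                        # Hot -> Cool once a blob has gone 30 days unmodified
                        "tierToCool": {"daysAfterModificationGreaterThan": 30},
                        # Cool -> Archive at 210 days (30 frequent + 180 rare)
                        "tierToArchive": {"daysAfterModificationGreaterThan": 210},
                    }
                },
            },
        }
    ]
}

print(json.dumps(policy, indent=2))
```

A policy like this can then be attached to the storage account, for example with something along the lines of `az storage account management-policy create --account-name <name> --resource-group <rg> --policy @policy.json`.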

C

Distractor review

Store files in the Archive tier from the beginning to maximize cost savings.

Archive tier has the lowest storage cost but high retrieval costs and latency (up to 15 hours to rehydrate). This is unsuitable for the first 30 days when files are frequently accessed.

D

Distractor review

Store files in the Premium tier for fast access, then manually delete files after 30 days.

The Premium tier is designed for high-performance, latency-sensitive workloads and carries the highest storage cost, making it a poor fit for bulk video storage. Manually deleting files after 30 days also fails the long-term compliance requirement.
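The trade-off between the options can be made concrete with a back-of-the-envelope cost model. All prices, the dataset size, and the read counts below are illustrative assumptions, not real Azure pricing; they only capture the relative shape of the tiers (Hot: dearest storage, cheap reads; Archive: cheapest storage, dearest retrieval).

```python
# Illustrative, assumed figures only -- not real Azure prices.
GB = 1024                                                    # 1 TiB of video
STORAGE = {"hot": 0.018, "cool": 0.010, "archive": 0.002}    # $/GB-month
RETRIEVAL = {"hot": 0.0, "cool": 0.01, "archive": 0.02}      # $/GB read

def month_cost(tier: str, full_reads: float) -> float:
    """One month of storage plus `full_reads` full passes over the data."""
    return GB * STORAGE[tier] + full_reads * GB * RETRIEVAL[tier]

def year_cost(tiers_by_month, reads_by_month):
    return sum(month_cost(t, r) for t, r in zip(tiers_by_month, reads_by_month))

# Assumed access pattern: ~20 full reads in month 1, ~2 spread over the
# rare-access months 2-7, none afterwards.
reads = [20] + [2 / 6] * 6 + [0] * 5

option_a = year_cost(["cool"] + ["archive"] * 11, reads)              # A
option_b = year_cost(["hot"] + ["cool"] * 6 + ["archive"] * 5, reads) # B
option_c = year_cost(["archive"] * 12, reads)                         # C

for name, cost in [("A", option_a), ("B", option_b), ("C", option_c)]:
    print(f"Option {name}: ${cost:,.2f}")
```

With these assumed numbers, option B comes out cheapest and option C dearest, because Archive's retrieval charges dominate during the frequent-access month. Real figures depend on region and pricing tier, but the ordering illustrates why matching tiers to access phases wins.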

Common exam trap

Common exam trap: answer the scenario, not the keyword

Many certification questions include familiar terms but test a specific constraint. Read the exact wording before choosing an answer that is generally true but wrong for this case.

Technical deep dive

How to think about this question

This question should be treated as a scenario, not a definition check. Identify the problem, the constraint and the best action. Then compare each option against those facts.

Key Concepts to Remember

  • Read the scenario before looking for a memorised answer.
  • Find the constraint that changes the correct option.
  • Eliminate answers that are true in general but not in this case.
  • Use explanations to understand the rule behind the answer.

Exam Day Tips

  • Underline the problem statement mentally.
  • Watch for qualifier words such as "best", "first", "most likely", and "least administrative effort".
  • Review why wrong options are wrong, not only why the correct option is correct.

Related practice questions

Related DP-900 practice-question pages

Use these pages to review the topic behind this question. This is how one missed question becomes focused revision.

More questions from this exam

Keep practising from the same exam bank, or move into a focused topic page if this question exposed a weak area.

Question 1

A data engineer needs to process streaming data from IoT devices and store the results in Azure Data Lake Storage for long-term analytics. The data must be processed in near real-time to detect anomalies and trigger alerts. Which Azure service should the engineer use for stream processing?

Question 2

A data engineer needs to query data stored in CSV files in Azure Data Lake Storage Gen2 using T-SQL in Azure Synapse Analytics, without loading the data into the database. Which feature should they use?

Question 3

A data engineer needs to process raw clickstream data from multiple websites that is stored in Azure Blob Storage as JSON files. The processing must run automatically every hour, transform the data into a structured format for reporting, and handle schema changes in the source data without manual intervention. Which Azure service should be used?

Question 4

A data engineer is designing a data lake architecture in Azure. They plan to first ingest raw data from various sources into a landing zone in Azure Data Lake Storage Gen2. Then they will clean, validate, and deduplicate that data in a second zone. Finally, they will create aggregated, business-ready datasets in a third zone for analysts. This layered approach is known as which architecture?

Question 5

A data engineer needs to transform large datasets stored in Azure Data Lake Storage Gen2 using Python and Apache Spark. They want a serverless compute option that automatically scales and requires no cluster management. Which Azure service should they use?

Question 6

A company collects customer feedback forms. Each form contains always-present fields like CustomerID and SubmissionDate, but also a free-text Comments field and optional fields like Rating or ProductCategory that vary between forms. How should this data be classified?

FAQ

Questions learners often ask

What does this DP-900 question test?

It tests whether you can map Azure Blob Storage access tiers (Hot, Cool, Archive) to a changing access pattern and automate the transitions with lifecycle management, rather than recall a memorised definition.

What is the correct answer to this question?

The correct answer is: Store files initially in the Hot tier, then use lifecycle management to move files to Cool after 30 days and to Archive after 210 days.

The Hot access tier is optimal for data accessed frequently (first 30 days). The Cool tier is cost-effective for infrequently accessed data (next 180 days). The Archive tier offers the lowest storage cost for rarely accessed data (after 210 days). Azure Blob Storage lifecycle management can automate transitions between tiers based on blob age, so option B matches each phase of the access pattern.

Option A moves to Archive too early (at 30 days), incurring high retrieval costs during the rare-access period. Option C uses Archive from the start, which would make frequent reads expensive and slow. Option D deletes data and therefore fails the long-term compliance requirement; it is not a cost-optimization strategy.

What should I do if I get this DP-900 question wrong?

Review why the correct option fits the scenario, then try more questions from the same exam bank and focus on understanding why the wrong options are tempting.
