Question 1 · Hard · Multiple choice · Objective-mapped

A company is building a petabyte-scale data lake for analytics. They need a storage solution that supports a hierarchical namespace, POSIX-like permissions (ACLs), and is optimized for big data analytics workloads using Apache Spark and Hive. The data must be accessible over the Azure Blob Storage API. Which Azure data service should they use?


Answer choices

Why each option matters

Good practice is not just finding the correct option: the wrong answers often reveal the exact trap the exam wants you to fall into.

A

Distractor review

Azure Blob Storage (with flat namespace)

Standard Blob Storage does not support a hierarchical namespace or native POSIX ACLs, making it less suitable for Hadoop/Spark analytics that rely on directory structures.

B

Best answer

Azure Data Lake Storage Gen2

ADLS Gen2 combines Blob Storage with a hierarchical namespace and ACLs, enabling Hadoop-compatible access and high-performance analytics with Spark, Hive, and other tools.
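A point worth internalizing is that ADLS Gen2 is not a separate service: it is a StorageV2 account with the hierarchical namespace enabled at creation time. A minimal Azure CLI sketch of that (account, resource group, and filesystem names are placeholders):

```shell
# Create a StorageV2 account with the hierarchical namespace enabled,
# which is what makes it ADLS Gen2 (names below are placeholders).
az storage account create \
  --name mydatalakeacct \
  --resource-group my-rg \
  --location westeurope \
  --sku Standard_LRS \
  --kind StorageV2 \
  --hns true

# Create a filesystem (the Gen2 equivalent of a blob container).
az storage fs create \
  --name raw \
  --account-name mydatalakeacct \
  --auth-mode login
```

Because the namespace flag is set per account, "Blob Storage vs ADLS Gen2" distractors usually hinge on whether this flag was enabled, not on which product was bought.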

C

Distractor review

Azure NetApp Files

NetApp Files provides NFS and SMB file shares, but it is not designed for petabyte-scale data lakes and does not offer Blob API compatibility for analytics frameworks.

D

Distractor review

Azure HPC Cache

HPC Cache is a caching service for high-performance computing, not a primary storage solution. It reduces latency but does not provide the namespace or ACL features required.

Common exam trap

Common exam trap: assuming "ACLs" means network ACLs

The ACLs in this question are POSIX-style file and directory ACLs in ADLS Gen2, not network access lists. Also note the evaluation order: Azure RBAC role assignments are checked first, and the ACLs on a path are only consulted when no RBAC role grants the requested access.
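For this storage question, the relevant lists are ADLS Gen2's POSIX-style ACLs, which Azure evaluates only after RBAC role assignments. A simplified model of that order (the names `is_authorized`, `rbac_grants`, and `acl_entries` are illustrative, not part of any Azure SDK):

```python
# Simplified model of ADLS Gen2 authorization order: Azure RBAC role
# assignments are evaluated first; path ACLs are consulted only when
# no RBAC role grants the requested permission.

def is_authorized(principal, permission, rbac_grants, acl_entries):
    """rbac_grants: set of (principal, permission) pairs from role assignments.
    acl_entries: dict mapping principal -> set of granted bits ('r', 'w', 'x')."""
    if (principal, permission) in rbac_grants:
        return True  # an RBAC grant wins; ACLs are never checked
    return permission in acl_entries.get(principal, set())

rbac = {("alice", "r")}
acls = {"bob": {"r", "x"}}

print(is_authorized("alice", "r", rbac, acls))  # True, via RBAC
print(is_authorized("bob", "r", rbac, acls))    # True, via the ACL entry
print(is_authorized("bob", "w", rbac, acls))    # False: no role, no 'w' bit
```

The practical consequence: tightening ACLs alone does not revoke access that a broad RBAC role (such as Storage Blob Data Contributor) already grants.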

Technical deep dive

How to think about this question

Map every requirement in the stem to a concrete service capability: hierarchical namespace, POSIX-like ACLs, Blob API access, and optimization for Spark and Hive workloads. A service that satisfies most requirements can still be wrong if it misses even one, which is exactly how the distractors here are built.

Key Concepts to Remember

  • ADLS Gen2 is Blob Storage with the hierarchical namespace enabled at account creation.
  • ACLs are POSIX-style: read (r), write (w), and execute (x) bits for the owner, owning group, named users and groups, and others.
  • Reading a file requires r on the file and x on every parent directory.
  • Data is reachable through both the Blob API and the Hadoop-compatible ABFS (DFS) endpoint.
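The permissions at issue here are ADLS Gen2's POSIX-style ACLs, where reading a file requires execute (x) on every ancestor directory plus read (r) on the file itself. A hypothetical sketch of that traversal rule (`can_read_file` and `path_perms` are illustrative names, not an Azure API):

```python
# Illustrative model of the POSIX traversal rule in ADLS Gen2:
# to read /raw/2024/events.csv the caller needs x on /raw and
# /raw/2024, plus r on the file itself.

def can_read_file(path_perms, file_path):
    """path_perms: dict mapping a path to the caller's permission bits."""
    parts = file_path.strip("/").split("/")
    # Execute is required on each ancestor directory to traverse it.
    for i in range(len(parts) - 1):
        directory = "/" + "/".join(parts[: i + 1])
        if "x" not in path_perms.get(directory, set()):
            return False
    return "r" in path_perms.get(file_path, set())

perms = {
    "/raw": {"x"},
    "/raw/2024": {"x"},
    "/raw/2024/events.csv": {"r"},
}
print(can_read_file(perms, "/raw/2024/events.csv"))  # True

perms["/raw/2024"] = set()  # drop execute on one ancestor
print(can_read_file(perms, "/raw/2024/events.csv"))  # False
```

This is why granting r on a deeply nested file is not enough on its own: a missing x anywhere along the path blocks access.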

Exam Day Tips

  • Underline each requirement in the stem and eliminate any option that misses one.
  • "Hierarchical namespace" plus "POSIX ACLs" together point to ADLS Gen2.
  • Watch for caching and file-share services offered as data-lake storage; they are classic distractors.

Related practice questions

Related AZ-305 practice-question pages

Use these pages to review the topic behind this question. This is how one missed question becomes focused revision.

More questions from this exam

Keep practising from the same exam bank, or move into a focused topic page if this question exposed a weak area.

FAQ

Questions learners often ask

What does this AZ-305 question test?

It tests whether you can map data-lake requirements (hierarchical namespace, POSIX-style ACLs, Blob API access, and Spark/Hive analytics) to the right Azure storage service.

What is the correct answer to this question?

The correct answer is: Azure Data Lake Storage Gen2 — Azure Data Lake Storage Gen2 (ADLS Gen2) is built on Azure Blob Storage and provides a hierarchical namespace, ACLs, and full integration with Hadoop and Spark. It is designed for large-scale analytics. Standard Blob Storage does not offer a hierarchical namespace or POSIX permissions. Azure NetApp Files is a file share service, not optimized for big data analytics using Blob APIs. Azure HPC Cache accelerates access but is not a primary storage solution.

What should I do if I get this AZ-305 question wrong?

Review why each distractor fails, then try more questions from the same exam bank, focusing on why the wrong options are tempting.
