Microsoft · Free Practice Questions · Last reviewed May 2026
30 real exam-style questions organised by domain, each with the correct answer highlighted and a plain-English explanation of why it's right — and why the others are wrong.
You are implementing an Azure Durable Functions application that processes orders. The function must call three external APIs (payment gateway, inventory system, and shipping calculator) in parallel, then aggregate the results once all three have completed. Which Durable Functions pattern should you use?
Function chaining
Fan-out/Fan-in
Correct. Fan-out calls multiple activity functions in parallel, and fan-in waits for all of them to complete before aggregating the results.
Monitor
Human interaction
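The fan-out/fan-in idea can be sketched with plain Python asyncio (an illustration only, not the Durable Functions SDK — in a real orchestrator you would call the activities with context.call_activity and aggregate with Task.WhenAll in C# or context.task_all in Python; the three API functions below are hypothetical stand-ins):

```python
import asyncio

# Hypothetical stand-ins for the three external API calls.
async def call_payment_gateway(order_id):
    await asyncio.sleep(0)  # simulate I/O
    return {"payment": "authorized"}

async def call_inventory_system(order_id):
    await asyncio.sleep(0)
    return {"inventory": "reserved"}

async def call_shipping_calculator(order_id):
    await asyncio.sleep(0)
    return {"shipping_cost": 7.50}

async def orchestrate(order_id):
    # Fan-out: start all three calls in parallel.
    results = await asyncio.gather(
        call_payment_gateway(order_id),
        call_inventory_system(order_id),
        call_shipping_calculator(order_id),
    )
    # Fan-in: aggregate only after every call has completed.
    merged = {}
    for r in results:
        merged.update(r)
    return merged

print(asyncio.run(orchestrate("order-42")))
```

The key property is that the aggregation step cannot run until all parallel branches have finished, which is exactly what fan-in means.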
A company uses Azure Functions with a consumption plan. The function processes messages from a queue. During peak hours, the function takes longer to execute, and some messages are processed twice. What is the most likely cause?
The function timeout is set too low.
The queue message visibility timeout is shorter than the function processing time.
Correct. If the visibility timeout expires, the message becomes visible again and can be processed by another instance, resulting in duplicates.
The function uses blob output binding incorrectly.
The function app is using a premium plan instead of consumption.
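The duplicate-delivery behaviour can be shown with a toy queue model (pure Python, not the Azure Storage SDK): whenever processing runs longer than the visibility timeout, the message becomes visible again and is delivered once more.

```python
def simulate(processing_time, visibility_timeout):
    """Return how many times a single queue message is delivered.

    Toy model: the message is delivered, hidden for `visibility_timeout`
    seconds, and redelivered every time the timeout elapses before the
    first worker finishes processing and deletes it.
    """
    deliveries = 1
    elapsed = visibility_timeout
    # Each time the timeout expires before processing completes,
    # the message reappears on the queue and is delivered again.
    while elapsed < processing_time:
        deliveries += 1
        elapsed += visibility_timeout
    return deliveries

# Processing fits inside the timeout: exactly one delivery.
print(simulate(processing_time=20, visibility_timeout=30))  # 1
# Peak-hour processing (70 s) against a 30 s visibility timeout: duplicates.
print(simulate(processing_time=70, visibility_timeout=30))  # 3
```

The fix in practice is to set the visibility timeout comfortably above the worst-case processing time, or to make the handler idempotent.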
You are deploying a Node.js application to Azure Web Apps for Containers. The application needs to read configuration settings from Azure App Configuration. What is the recommended method to securely connect the app to the configuration store?
Store connection string in environment variables.
Use Key Vault references in App Settings.
Use managed identity.
Correct. Managed identity provides secure authentication without secrets.
Hardcode the connection string.
You are implementing an order processing system using Azure Durable Functions. The function must send notifications to multiple channels (email, SMS, push) in parallel and wait for all to complete before sending a confirmation. Which Durable Functions feature should you utilize?
Orchestration trigger with fan-out/fan-in pattern
Correct. The orchestrator can call multiple activity functions in parallel using Task.WhenAll, then aggregate results before proceeding.
Entity trigger
Activity trigger with retry policy
Timer trigger
You are deploying a sensitive configuration to Azure Container Instances. The configuration must be encrypted at rest and not visible in the container logs. What should you use?
Environment variables in the container group
Azure Key Vault with managed identity and secret volumes
Correct. This approach ensures secrets are encrypted in Key Vault, mounted as volumes, and not exposed in logs.
Azure Files volume mounted into the container
ConfigMap in a Kubernetes cluster
A company deploys a web application to Azure App Service. They want to deploy a new version of the app with zero downtime and the ability to quickly roll back if needed. Which deployment feature should they use?
Auto-scaling
Deployment slots
Correct. Deployment slots enable staging, warm-up, and swapping with immediate rollback, providing zero-downtime deployments.
Traffic Manager
Application Insights
A company stores archival data in Azure Blob Storage. The data is accessed only a few times per year, and retrieval can take up to 15 hours. Which blob access tier minimizes storage costs while meeting these requirements?
Hot tier
Cool tier
Archive tier
Correct. The Archive tier offers the lowest storage cost; rehydration with standard priority can take up to 15 hours, which fits the scenario.
Premium tier
You are building a serverless application that needs to react to insertions and updates in an Azure Cosmos DB container. You want to process these changes using an Azure Function. Which trigger should you configure for the function?
Cosmos DB trigger
Correct. The Cosmos DB trigger uses the change feed to respond to inserts and updates in the container.
Blob trigger
Event Grid trigger
Service Bus trigger
You are developing an application that writes logs to Azure Blob Storage. Each log entry is small (less than 1 KB) and you need to store millions of entries per day. You want to minimize storage costs and maximize write throughput. Which blob type should you use?
Block blobs with a large block size.
Append blobs.
Correct. Append blobs are designed for frequent append operations and are ideal for logging.
Page blobs.
Block blobs with a small block size.
You need to upload large files (up to 100 GB) to Azure Blob Storage from a web application. The upload must be resilient to network failures and support pausing/resuming. Which approach should you use?
Upload the blob as a single PUT operation.
Use block blob with multiple blocks and parallel upload.
Correct. Block blobs support chunked upload with retry and resume capability.
Use append blob.
Use AzCopy from the server.
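The block-blob approach can be sketched as follows (a generic illustration, not the Azure SDK — the real service exposes Put Block / Put Block List, surfaced as stage_block and commit_block_list in azure-storage-blob, and real block IDs are base64-encoded strings; `put_block` here is a hypothetical transport function):

```python
def upload_in_blocks(data: bytes, block_size: int, put_block, max_retries: int = 3):
    """Split `data` into blocks, upload each with retries, return the block IDs.

    `put_block(block_id, chunk)` may raise on a transient network failure;
    each block is retried independently, so one failure never restarts
    the whole upload -- this is what makes the upload resumable.
    """
    block_ids = []
    for offset in range(0, len(data), block_size):
        chunk = data[offset:offset + block_size]
        block_id = f"block-{offset:010d}"
        for attempt in range(max_retries):
            try:
                put_block(block_id, chunk)
                break
            except ConnectionError:
                if attempt == max_retries - 1:
                    raise
        block_ids.append(block_id)
    # In the real API you would now commit the block list (Put Block List)
    # to assemble the final blob from the staged blocks.
    return block_ids

# Flaky transport: fails on its first call, then succeeds.
calls = {"n": 0}
def flaky_put_block(block_id, chunk):
    calls["n"] += 1
    if calls["n"] == 1:
        raise ConnectionError("transient failure")

ids = upload_in_blocks(b"x" * 10, block_size=4, put_block=flaky_put_block)
print(ids)  # ['block-0000000000', 'block-0000000004', 'block-0000000008']
```

Because uncommitted blocks persist server-side, a paused or interrupted upload can resume by staging only the blocks that have not yet succeeded.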
You need to store millions of small log entries (each <1 KB) per day from an IoT device. The logs are rarely read. Which storage solution is most cost-effective?
Azure Blob Storage Block Blob
Correct. Block blobs support high-volume storage with low-cost tiers (Cool/Archive) and can handle billions of objects.
Azure SQL Database
Azure Table Storage
Azure Files
You are developing an application that writes log entries to Azure Blob Storage. Each log entry is approximately 500 bytes, and you expect to generate millions of entries per day. The logs are rarely read, and when they are read, you need to retrieve ranges of logs sequentially. Which blob type should you use to minimize storage costs and maximize write throughput?
Block blobs
Append blobs
Correct. Append blobs are specifically designed for append operations, providing high write throughput and low cost per write. They are ideal for streaming log data where new entries are continuously added.
Page blobs
Azure Files shares
You have multiple Azure virtual machines that need to access the same Azure Key Vault to retrieve certificates. You want to minimize administrative overhead while ensuring each VM can authenticate without managing credentials. Which identity type should you use?
System-assigned managed identity on each VM
User-assigned managed identity assigned to each VM
Correct. A single user-assigned managed identity can be assigned to all of the VMs, so you grant Key Vault access once, reducing administrative overhead.
Service principal with client secret stored in each VM
Storage account key
A developer accidentally deleted a secret from Azure Key Vault. Soft-delete is enabled with a retention period of 90 days. After 60 days, you attempt to recover the secret. What should you do?
Run the Azure CLI command: az keyvault secret recover
Correct. This command restores the secret while it is still within the 90-day soft-delete retention window (day 60 of 90).
Enable purge protection on the Key Vault first, then recover the secret.
Recover is not possible because the retention period of 90 days has not elapsed.
Run the Azure CLI command: az keyvault secret undelete
A company stores sensitive data in an Azure Storage account. They need to restrict access based on the client's IP address and require that clients use a valid SAS token. Which mechanism should they use?
Microsoft Entra ID authentication.
Shared Key.
SAS token with IP ACL.
Correct. A SAS token can specify an allowed IP address range.
Firewall and virtual networks.
You are developing an application that stores user secrets. You need to ensure that the secrets are encrypted at rest and rotated automatically. Which Azure service should you integrate?
Azure Storage.
Azure Key Vault.
Correct. Key Vault is designed for secret management with encryption and rotation capabilities.
Azure Security Center.
Microsoft Entra ID.
You have an Azure Function app that needs to retrieve a secret from Azure Key Vault at runtime. You want to avoid storing any credentials in code or configuration. Which mechanism should you use?
Service principal with client secret
Managed identity
Correct. Managed identity allows the Function app to authenticate to Azure Key Vault without any stored credentials.
Access key
Shared access signature (SAS)
A developer deleted a secret from Azure Key Vault with soft-delete and purge protection enabled (retention 90 days). After 50 days, the secret is needed again. What is the correct recovery method?
Purge the secret and then restore from a backup
Recover the secret using Azure CLI 'az keyvault secret recover'
Correct. Soft-delete allows recovery within the retention period using the recover command.
Recreate the secret with the same name
Use an Azure Resource Manager template to undelete the secret
An e-commerce application emits a high volume of telemetry data to Azure Application Insights. You need to reduce the cost of data ingestion while preserving statistical accuracy for performance metrics. Which sampling technique should you use?
Adaptive sampling
Correct. Adaptive sampling dynamically tunes the sampling rate to keep data volume manageable while preserving statistical validity.
Fixed-rate sampling with a 1% rate
Ingestion sampling
Head-based sampling
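The intuition behind adaptive sampling can be sketched with a simple feedback loop (an illustration only — the actual algorithm in the Application Insights SDK is more sophisticated): the sampling percentage is repeatedly adjusted so the exported telemetry volume tracks a target rate.

```python
def adjust_rate(current_rate, observed_items_per_sec, target_items_per_sec):
    """One step of a toy adaptive-sampling controller.

    Scales the sampling rate so the exported telemetry volume moves
    toward the target budget; the rate is clamped to (0, 1].
    """
    if observed_items_per_sec == 0:
        return 1.0  # no traffic: sample everything
    proposed = current_rate * target_items_per_sec / observed_items_per_sec
    return max(0.001, min(1.0, proposed))

rate = 1.0
# Traffic spikes to 500 exported items/sec against a 100 items/sec budget:
rate = adjust_rate(rate, observed_items_per_sec=500, target_items_per_sec=100)
print(rate)  # 0.2 -> keep roughly 1 in 5 items
# Traffic drops (only 10 items/sec now exported): the rate recovers.
rate = adjust_rate(rate, observed_items_per_sec=10, target_items_per_sec=100)
print(rate)  # 1.0
```

Because each retained item is later re-weighted by 1/rate, aggregate statistics such as request rates stay approximately correct despite the reduced volume.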
You need to monitor the real-time CPU utilization of an Azure virtual machine. Which Azure Monitor feature is designed for this purpose?
Metrics
Correct. Metrics provide near real-time numerical values such as CPU usage, which makes them ideal for monitoring performance.
Logs
Alerts
Workbooks
You have an Azure App Service web app that experiences intermittent slowness. You enable Application Insights and notice that the "Failed Requests" metric is low, but "Server Response Time" is high for a subset of requests. You want to identify the specific code path causing the delay. Which feature should you use?
Live Metrics.
Snapshot Debugger.
Profiler.
Correct. Profiler traces requests and identifies slow code paths.
Availability tests.
An Azure Function processes events from Event Hubs. You need to monitor the number of events that were successfully processed and those that were dropped due to processing errors. Which approach should you use?
Custom metrics in Application Insights.
Correct. You can use the Application Insights SDK to log custom events or metrics for processed and dropped events.
Event Hubs metrics.
Stream Analytics job.
Log Analytics query on function logs.
Your e-commerce application sends telemetry to Application Insights. You need to reduce ingestion costs while preserving the ability to detect trends in performance metrics. Which sampling type should you configure?
Fixed-rate sampling
Adaptive sampling
Correct. Adaptive sampling dynamically adjusts to keep the telemetry volume within a budget while preserving statistical accuracy for trends.
Ingestion sampling
Head-based sampling
You need to monitor the CPU utilization of an Azure VM in real-time and set up an alert when it exceeds 90%. Which Azure Monitor feature should you use?
Log Analytics Workspace
Metrics Explorer
Correct. Metrics Explorer provides near real-time platform metrics and supports creating metric alerts.
Application Insights
Azure Monitor for VMs
A retail system uses Azure Service Bus to process orders. Each order has multiple messages (e.g., payment, shipping, confirmation) that must be processed in sequence. You need to guarantee that all messages belonging to the same order are handled by the same consumer in order. Which Service Bus feature should you use?
Sessions
Correct. Sessions ensure FIFO ordering and guarantee that messages with the same session ID are processed by a single consumer.
Scheduled messages
Dead-letter queue
Auto-forwarding
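The session guarantee can be illustrated with a toy dispatcher (pure Python, not the Service Bus SDK): every message carrying the same session ID lands on the same consumer, in arrival order.

```python
from collections import defaultdict

def dispatch(messages, consumers):
    """Toy session-aware dispatcher.

    Each distinct session ID is pinned to one consumer (round-robin on
    first sight), so all of that session's messages are handled by a
    single consumer in FIFO order -- the property Service Bus sessions provide.
    """
    session_owner = {}
    assigned = defaultdict(list)
    next_consumer = 0
    for session_id, body in messages:
        if session_id not in session_owner:
            session_owner[session_id] = consumers[next_consumer % len(consumers)]
            next_consumer += 1
        assigned[session_owner[session_id]].append(body)
    return dict(assigned)

messages = [
    ("order-1", "payment"), ("order-2", "payment"),
    ("order-1", "shipping"), ("order-1", "confirmation"),
    ("order-2", "shipping"),
]
print(dispatch(messages, consumers=["worker-a", "worker-b"]))
# {'worker-a': ['payment', 'shipping', 'confirmation'], 'worker-b': ['payment', 'shipping']}
```

In Service Bus itself, setting the order ID as the SessionId on each message and using a session-enabled queue gives you this pinning and ordering without any dispatcher code of your own.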
You manage an API in Azure API Management. You need to cache API responses such that different responses are returned based on the product subscription key used by the caller. Which set of policies should you implement?
Set a 'cache-lookup' policy in the inbound section and a 'cache-store' policy in the outbound section, using the subscription key as a cache vary-by parameter.
Correct. This is the right pattern: look the cache up on the request (inbound), store the response on the way out (outbound), and vary the cache key by subscription key.
Set a 'cache-store' policy in the inbound section and a 'cache-lookup' policy in the outbound section.
Set both 'cache-lookup' and 'cache-store' policies in the inbound section.
Set only a 'cache-store' policy in the backend section.
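A sketch of the policy shape (attribute and duration values are illustrative; varying by subscription key can be done with a vary-by-header on the standard Ocp-Apim-Subscription-Key header):

```xml
<policies>
    <inbound>
        <base />
        <!-- Look for a cached response before calling the backend. -->
        <cache-lookup vary-by-developer="false" vary-by-developer-groups="false">
            <!-- Keep a separate cache entry per subscription key. -->
            <vary-by-header>Ocp-Apim-Subscription-Key</vary-by-header>
        </cache-lookup>
    </inbound>
    <outbound>
        <base />
        <!-- Store the backend response for 5 minutes. -->
        <cache-store duration="300" />
    </outbound>
</policies>
```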
A company uses Azure Logic Apps to integrate with a third-party REST API. The API has a rate limit of 100 requests per minute. You need to ensure that the Logic App respects this limit. Which connector feature should you configure?
Retry policy.
Concurrency control.
Correct. Concurrency control limits the number of in-flight requests, helping to stay within rate limits.
Swagger connector.
API Management.
You are building an API that needs to send notifications to multiple subscribers. Each subscriber has a different callback URL, and you need reliable delivery with automatic retries on failure. Which Azure service should you use?
Azure Event Grid.
Correct. Event Grid pushes events to multiple webhook subscribers with built-in retry and at-least-once delivery.
Azure Service Bus.
Azure Notification Hubs.
Azure Queue Storage.
You manage an API in Azure API Management. The API response varies depending on the caller's subscription key. You need to cache responses per subscription key to reduce backend load. Which policy configuration should you use?
Set cache key to include the subscription key
Correct. Include the subscription key in the cache key — for example by adding <vary-by-header>Ocp-Apim-Subscription-Key</vary-by-header> inside the cache-lookup policy — so each subscription gets its own cached response.
Use a global cache with no variation
Disable caching and rely on the backend
Use rate limiting policy
You have an order processing system using Azure Service Bus. Each order generates multiple messages that must be processed in order and by the same consumer. Which Service Bus feature ensures this?
Message sessions
Correct. Sessions guarantee ordered, first-in-first-out (FIFO) delivery and that messages in a session are handled by a single consumer.
Topics and subscriptions
Dead-letter queues
Auto-forwarding
The AZ-204 exam has up to 60 questions and must be completed in 100 minutes. The passing score is 700/1000.
The AZ-204 exam uses multiple-choice, multiple-select, drag-and-drop, and exhibit-based questions. Exhibit questions show CLI output, network diagrams, or routing tables and ask you to interpret them — exactly the format Courseiva uses.
The exam covers 5 domains: Develop Azure compute solutions; Develop for Azure storage; Implement Azure security; Monitor, troubleshoot, and optimize Azure solutions; and Connect to and consume Azure services and third-party services. Questions are weighted by domain — higher-weight domains appear more on your actual exam.
Are these real AZ-204 exam questions? No. These are original exam-style practice questions written against the official Microsoft AZ-204 exam objectives. They are not copied from the real exam. Courseiva focuses on genuine understanding, not memorisation of braindumps.
Courseiva tracks your accuracy per domain and routes you toward weak areas automatically. Free, no account required.