
An S3 bucket stores user-uploaded media. Most objects are never read again, but compliance requires keeping them for at least 18 months. Retrieval is rare and typically only needed during investigations. The current design keeps everything in S3 Standard, increasing storage cost. Which configuration best optimizes cost while meeting the retention and rare-access requirements?


Answer choices

Why each option matters

Good practice is about more than finding the correct option. The wrong answers often reveal the exact trap the exam wants you to fall into.

A

Distractor review

Move all objects to S3 Glacier Instant Retrieval immediately upon upload and disable lifecycle policies.

Glacier Instant Retrieval provides millisecond access for archived data that still needs fast retrieval, but it is priced higher than the deep-archive tiers and is not the most cost-optimized option for data that is rarely, if ever, read. Disabling lifecycle policies also removes the ability to move older data to cheaper tiers.

B

Best answer

Use an S3 lifecycle policy to transition objects to S3 Glacier Deep Archive after 30 days, and expire them after 18 months.

Lifecycle policies can automatically move data to lower-cost storage classes after it becomes infrequently accessed. Because reads are rare and required only during investigations, Glacier Deep Archive is a strong cost-optimization choice. Setting expiration after 18 months ensures compliance retention is met.
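The rule described above can be sketched as the configuration document that boto3's `put_bucket_lifecycle_configuration` accepts. This is a minimal sketch; the rule ID and bucket name are hypothetical placeholders, and the day counts mirror the scenario (30 days to Deep Archive, roughly 18 months to expiration).

```python
# Lifecycle rule for the scenario above: transition to Deep Archive
# after 30 days, expire after ~18 months. Rule ID is a hypothetical name.
lifecycle_configuration = {
    "Rules": [
        {
            "ID": "archive-then-expire",            # hypothetical rule name
            "Status": "Enabled",
            "Filter": {"Prefix": ""},               # apply to every object
            "Transitions": [
                {
                    "Days": 30,                     # move after 30 days
                    "StorageClass": "DEEP_ARCHIVE"  # S3 Glacier Deep Archive
                }
            ],
            "Expiration": {"Days": 548},            # ~18 months in days
        }
    ]
}

# Applying it would look roughly like this (requires AWS credentials):
# import boto3
# s3 = boto3.client("s3")
# s3.put_bucket_lifecycle_configuration(
#     Bucket="user-uploaded-media",  # hypothetical bucket name
#     LifecycleConfiguration=lifecycle_configuration,
# )
```

Note the expiration is set safely past the retention floor: 18 months is about 547.5 days, so 548 (or more) keeps the rule compliant.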

C

Distractor review

Keep objects in S3 Standard but compress them with a custom process to reduce storage size.

While compression may reduce object size, it introduces additional compute complexity and operational overhead. The biggest driver of storage cost is usually the storage class; transitioning to an archive tier provides a more direct and predictable cost reduction.

D

Distractor review

Enable S3 Intelligent-Tiering for all objects and delete any object not accessed within 24 hours.

Intelligent-Tiering can help when access patterns are unpredictable, but deleting objects after 24 hours directly violates the stated compliance requirement to retain data for at least 18 months.
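The compliance conflict in option D comes down to simple arithmetic: a deletion rule must not fire before the mandated retention window has passed. A minimal sketch of that check, with the day counts approximated for illustration:

```python
# Compliance check that rules out option D: objects must live at least
# as long as the retention requirement before any expiration rule fires.
REQUIRED_RETENTION_DAYS = 18 * 30  # ~18 months, approximated as 540 days

def expiration_is_compliant(expire_after_days: int) -> bool:
    """Return True if objects survive at least the required retention window."""
    return expire_after_days >= REQUIRED_RETENTION_DAYS

print(expiration_is_compliant(1))    # option D: delete after 24 hours -> False
print(expiration_is_compliant(548))  # option B: expire after ~18 months -> True
```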

Common exam trap

Common exam trap: NAT rules depend on direction and matching traffic

NAT is not only about the public address. The inside/outside interface roles and the ACL or rule that matches traffic are just as important.

Technical deep dive

How to think about this question

NAT questions usually test address translation, overload/PAT behaviour, static mappings and whether the right traffic is being translated. Read the interface direction and address terms carefully.

Key Concepts to Remember

  • Static NAT maps one inside address to one outside address.
  • PAT allows many inside hosts to share one public address using ports.
  • Inside local and inside global describe the private and translated addresses.
  • NAT ACLs identify which traffic to translate; they are not security filters.
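The PAT behaviour in the second bullet can be illustrated with a toy translation table: many inside-local (private) addresses share one inside-global (public) address, and the router keeps sessions apart by rewriting source ports. The addresses and port numbers below are illustrative, not from any real configuration.

```python
# Toy model of PAT (NAT overload): many private hosts share one public
# IP, distinguished by translated source ports. Values are illustrative.
PUBLIC_IP = "203.0.113.10"  # documentation-range address

class PatTable:
    def __init__(self, public_ip, first_port=1024):
        self.public_ip = public_ip
        self.next_port = first_port
        # (inside_ip, inside_port) -> (public_ip, translated_port)
        self.mappings = {}

    def translate(self, inside_ip, inside_port):
        key = (inside_ip, inside_port)
        if key not in self.mappings:  # allocate a fresh public-side port
            self.mappings[key] = (self.public_ip, self.next_port)
            self.next_port += 1
        return self.mappings[key]

pat = PatTable(PUBLIC_IP)
print(pat.translate("10.0.0.5", 51000))  # first host gets port 1024
print(pat.translate("10.0.0.6", 51000))  # second host gets port 1025
print(pat.translate("10.0.0.5", 51000))  # repeat flow reuses port 1024
```

The inside-local addresses (10.0.0.x) map to one inside-global address, which is exactly the local/global terminology in the third bullet.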

Exam Day Tips

  • Identify inside and outside interfaces first.
  • Check whether the scenario needs static NAT, dynamic NAT or PAT.
  • Do not confuse NAT matching ACLs with normal packet-filtering intent.

Related practice questions

Related SAA-C03 practice-question pages

Use these pages to review the topic behind this question. This is how one missed question becomes focused revision.

More questions from this exam

Keep practising from the same exam bank, or move into a focused topic page if this question exposed a weak area.

FAQ

Questions learners often ask

What does this SAA-C03 question test?

This question tests whether you can choose the most cost-effective S3 storage class for rarely accessed data while honouring a compliance retention period, using lifecycle transitions and expiration rules.

What is the correct answer to this question?

The correct answer is: Use an S3 lifecycle policy to transition objects to S3 Glacier Deep Archive after 30 days, and expire them after 18 months.

Use S3 lifecycle policies to move objects to cheaper storage classes once they become infrequently accessed, while honoring the compliance retention period. In this scenario, most objects are rarely read after upload, and retrieval is only needed during investigations. Transitioning to S3 Glacier Deep Archive after an initial short period (for example, 30 days) provides substantial storage cost savings. The lifecycle rule should expire (delete) objects only after 18 months to maintain compliance.

Moving everything immediately to Glacier Instant Retrieval is usually more expensive than deep archive options for rarely accessed data. Compression does not address the primary cost driver as effectively as storage class transitions and adds operational complexity. Intelligent-Tiering is compatible with variable access patterns, but the immediate deletion rule conflicts with the 18-month compliance requirement.

What should I do if I get this SAA-C03 question wrong?

Revisit the explanation above, then try more questions from the same exam bank, focusing on why the wrong options are tempting.
