
An application stores user-uploaded binaries in S3. Access is unpredictable for the first month, then most objects become cold. The team wants the cheapest approach that avoids manually guessing access patterns. Which two actions are best? Select two.


Answer choices

Why each option matters

Good practice is not just finding the correct option. The wrong answers often show the exact trap the exam wants you to fall into.

A

Best answer

Enable S3 Intelligent-Tiering on the bucket.

Correct. Intelligent-Tiering is designed for objects with uncertain or changing access patterns. It automatically moves data between access tiers, reducing the need for manual guessing and avoiding overpaying for standard storage.
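As a concrete illustration, new uploads can opt into Intelligent-Tiering at write time by setting the storage class on the PutObject call. This is a minimal boto3-style sketch; the bucket and key names are hypothetical, and the actual AWS call is left commented out:

```python
# Sketch: opt new uploads into S3 Intelligent-Tiering at upload time.
# Bucket and key names here are illustrative assumptions.

def intelligent_tiering_put_args(bucket, key, body):
    """Build PutObject arguments that place the object in Intelligent-Tiering."""
    return {
        "Bucket": bucket,
        "Key": key,
        "Body": body,
        "StorageClass": "INTELLIGENT_TIERING",  # S3 storage class for automatic tiering
    }

args = intelligent_tiering_put_args("user-uploads", "binaries/app-v1.bin", b"...")

# With credentials configured, this would be applied with:
#   import boto3
#   boto3.client("s3").put_object(**args)
print(args["StorageClass"])
```

Setting the storage class per object is one option; an S3 Lifecycle rule can also transition existing objects into Intelligent-Tiering without touching the upload path.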

B

Distractor review

Keep all objects in S3 Standard because lifecycle transitions add too much management.

Incorrect. S3 Standard is the most expensive common general-purpose class, so keeping everything there defeats the cost objective. The workload description specifically suggests that access patterns change over time.

C

Best answer

Add a lifecycle rule to move very old objects to S3 Glacier Deep Archive when minute-level retrieval is no longer required.

Correct. Deep Archive is the cheapest destination for very cold binaries that do not need fast restores. Pairing lifecycle transitions with Intelligent-Tiering gives a good cost profile across both uncertain and truly cold phases.
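A hedged sketch of such a lifecycle rule follows; the 365-day threshold, rule ID, and key prefix are assumptions for illustration, not values given in the question:

```python
# Sketch: a lifecycle rule that moves objects older than a year
# to S3 Glacier Deep Archive. Day count, ID, and prefix are assumptions.

def deep_archive_lifecycle_rule(days=365, prefix=""):
    """Build one lifecycle rule transitioning cold objects to Deep Archive."""
    return {
        "ID": "archive-cold-binaries",
        "Status": "Enabled",
        "Filter": {"Prefix": prefix},
        "Transitions": [
            {"Days": days, "StorageClass": "DEEP_ARCHIVE"},
        ],
    }

config = {"Rules": [deep_archive_lifecycle_rule()]}

# With credentials configured, this would be applied with:
#   import boto3
#   boto3.client("s3").put_bucket_lifecycle_configuration(
#       Bucket="user-uploads", LifecycleConfiguration=config)
print(config["Rules"][0]["Transitions"])
```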

D

Distractor review

Copy all binaries to Amazon EFS so retrieval is faster.

Incorrect. EFS is a shared file system, not a low-cost object archive. Copying binary uploads to EFS would usually increase cost and introduce the wrong storage model for this access pattern.

E

Distractor review

Disable versioning because S3 Intelligent-Tiering needs it to work.

Incorrect. Intelligent-Tiering does not require versioning to function. Turning off versioning also does not address the core issue, which is matching storage cost to changing access frequency.

Common exam trap

Common exam trap: a fixed schedule cannot handle unpredictable access

Lifecycle transitions move objects on a calendar you define in advance, so they only help once access patterns are known. Intelligent-Tiering monitors each object and moves it automatically, which is why the two mechanisms cover different phases of this workload.

Technical deep dive

How to think about this question

Storage-class questions usually test matching cost to access frequency: which class suits unknown or changing patterns, which suits deep archives, and what retrieval trade-offs each tier carries. Read the access-pattern and retrieval-time wording carefully.

Key Concepts to Remember

  • S3 Intelligent-Tiering moves objects between access tiers automatically based on observed access, with no retrieval charges for tier changes.
  • S3 Standard carries the highest per-GB storage price of the common general-purpose classes.
  • S3 Glacier Deep Archive is the lowest-cost storage class, with retrievals measured in hours, not minutes.
  • Lifecycle rules transition objects on a fixed schedule; Intelligent-Tiering reacts to actual access.

Exam Day Tips

  • When access patterns are described as unknown or changing, think Intelligent-Tiering.
  • When data is very cold and retrieval time is flexible, think the Glacier classes.
  • Watch for distractors that swap in the wrong storage model, such as EFS or EBS for object workloads.

Related practice questions

Related SAA-C03 practice-question pages

Use these pages to review the topic behind this question. This is how one missed question becomes focused revision.

More questions from this exam

Keep practising from the same exam bank, or move into a focused topic page if this question exposed a weak area.

FAQ

Questions learners often ask

What does this SAA-C03 question test?

Choosing S3 storage classes when access patterns are unpredictable: S3 Intelligent-Tiering automates tiering during the uncertain phase, and a lifecycle transition to Glacier Deep Archive handles long-term cold data.

What is the correct answer to this question?

The correct answers are: enable S3 Intelligent-Tiering on the bucket, and add a lifecycle rule that moves very old objects to S3 Glacier Deep Archive. Intelligent-Tiering handles the first month without requiring the team to predict access behavior. Once the binaries become very cold and minute-level retrieval is no longer needed, a lifecycle transition to Glacier Deep Archive cuts storage cost further. This approach minimizes manual tuning while keeping the archive affordable over time.

Why the others are wrong: keeping everything in Standard wastes money once most objects become cold; EFS is the wrong service because it is a file system, not a low-cost object archive; and disabling versioning is unrelated to cost optimization and does nothing for Intelligent-Tiering. The right approach is to let S3 manage changing access patterns and then move deep-cold objects to an archive tier.
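The combined approach can be sketched as a single lifecycle configuration with two transitions. This is a minimal sketch: the bucket name, rule ID, prefix, and the 365-day threshold are assumptions, not values from the question:

```python
# Sketch: one lifecycle configuration covering both phases.
# New objects transition into Intelligent-Tiering immediately, then
# move to Glacier Deep Archive once they are a year old.
# All names and day counts here are illustrative assumptions.

lifecycle_configuration = {
    "Rules": [
        {
            "ID": "auto-tier-then-archive",
            "Status": "Enabled",
            "Filter": {"Prefix": "uploads/"},
            "Transitions": [
                # Day 0: let Intelligent-Tiering manage the uncertain phase.
                {"Days": 0, "StorageClass": "INTELLIGENT_TIERING"},
                # Day 365: truly cold; minute-level retrieval no longer needed.
                {"Days": 365, "StorageClass": "DEEP_ARCHIVE"},
            ],
        }
    ]
}

# With credentials configured, this would be applied with:
#   import boto3
#   boto3.client("s3").put_bucket_lifecycle_configuration(
#       Bucket="user-uploads",
#       LifecycleConfiguration=lifecycle_configuration)

transitions = lifecycle_configuration["Rules"][0]["Transitions"]
print([t["StorageClass"] for t in transitions])
```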

What should I do if I get this SAA-C03 question wrong?

Review why the correct pair of options works, then try more questions from the same exam bank and focus on understanding why the wrong options are tempting.
