60-Second Summary
If you are in a rush, here is the executive briefing:
The Problem: Engineers see the lowest price label and move operational backups there. But they forget that this data is effectively frozen in carbonite.
The Result: A critical restore incident where the team has to explain to the CTO why the data won't be available until tomorrow morning.
The Simple Fix: Use Deep Archive exclusively for data you are legally required to keep but almost certainly will never read (Compliance, Regulatory, Legal Hold).
The Catch: The fastest you can get data back is 12 hours. There is no expedited button here.
The Outcome: If you respect the retrieval limits, Deep Archive is the closest thing to free storage you will find in the cloud.
If you look at the AWS pricing page, S3 Glacier Deep Archive looks like a typo. At $0.00099 per GB per month, it is ridiculously cheap, and it promises to slash your storage bill by roughly 96% compared to S3 Standard. For many companies, this is the difference between a storage bill that looks like a mortgage payment and one that looks like a rounding error.
But Deep Archive is not just storage. It is a vault. And vaults are designed to keep things in, not to let you take them out easily.
This post is here to give you clarity on S3 Glacier Deep Archive. You will get an explainer of the pricing that makes it work for your business, the 180-day minimum you need to plan around, and why this is the only storage class where you plan your retrieval strategy in days, not minutes.
S3 Glacier Deep Archive is the end of the line for data. It is the cold storage meant for the digital dust that you cannot delete but do not need.
Like the other S3 classes, it offers 11 nines of durability (99.999999999%). Your data is safe; it is replicated across three Availability Zones. It isn't going anywhere.
But here is the trade: You are sacrificing accessibility for price.
To make this work, you have to look at the Pricing Trio: the storage price, the retrieval price, and the 180-day minimum storage duration.
Let’s stop guessing and look at why this tier is revolutionary for long-term retention. The math gets staggering when you look at scale.
Imagine you are a media company, a hospital, or a financial firm holding 1 PB (1,000,000 GB) of data that you need to keep for 10 years for compliance.
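Here is the back-of-the-envelope math behind the headline number, assuming S3 Standard at roughly $0.023 per GB-month (the first-tier list price; your blended rate at petabyte scale will be slightly lower):

```python
GB = 1_000_000           # 1 PB expressed in GB
MONTHS = 10 * 12         # 10-year retention window

standard_cost = GB * 0.023 * MONTHS        # ~ $2,760,000
deep_archive_cost = GB * 0.00099 * MONTHS  # ~ $118,800

print(f"S3 Standard over 10 years:  ${standard_cost:,.0f}")
print(f"Deep Archive over 10 years: ${deep_archive_cost:,.0f}")
print(f"Savings:                    ${standard_cost - deep_archive_cost:,.0f}")
```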
The Result: You saved roughly $2.6 million. That is a 96% cost reduction just for changing the storage class. A full petabyte held for a decade is an extreme example, but the same math scales down to real-world workloads.
Even if you have to do a massive audit and retrieve 1% of that data (10 TB):
Retrieval Cost: 10,000 GB × $0.02 = $200. The retrieval cost is negligible compared to the millions you saved in storage.
Ready to see how much your storage bill could drop?
The storage is cheap, but the handcuffs are tight. There are two specific traps that burn teams who move too fast.
Deep Archive has a 180-day minimum storage duration.
If you upload a file today and delete it tomorrow, AWS charges you for the remaining 179 days. For data with changing access patterns, S3 Intelligent-Tiering might be safer. I have seen teams use Deep Archive for monthly backups that they rotate (delete) every 30 days. The Result: they are paying a 150-day penalty on every single file. They are effectively paying for six months of storage for files that only exist for one month.
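If your data genuinely is write-once and long-lived, the transition is a one-time lifecycle rule. Here is a minimal sketch using boto3; the bucket name, prefix, and 90-day transition delay are placeholder assumptions you would adjust to your own retention policy:

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical bucket and prefix: only objects under compliance/ are
# transitioned, and only after 90 days, so short-lived files that get
# deleted quickly never land in Deep Archive in the first place.
s3.put_bucket_lifecycle_configuration(
    Bucket="my-archive-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "compliance-to-deep-archive",
                "Filter": {"Prefix": "compliance/"},
                "Status": "Enabled",
                "Transitions": [
                    {"Days": 90, "StorageClass": "DEEP_ARCHIVE"}
                ],
            }
        ]
    },
)
```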
Stop paying early deletion penalties. Let us audit your lifecycle rules first.
Deep Archive does not have an Expedited option.
In S3 Glacier Flexible Retrieval, you can pay extra to get data in minutes. In Deep Archive, you cannot. The fastest retrieval is 12 hours. No amount of money will make AWS move faster.
If your Disaster Recovery plan relies on Deep Archive, your Recovery Time Objective (RTO) is effectively Tomorrow. If your business cannot survive being offline for 24 hours, this is the wrong place for your DR data.
Use Deep Archive when the data is effectively dead, but the law says you have to keep the body.
It is perfect for:
Compliance and regulatory archives (medical records, financial records)
Legal hold data you keep only because a lawyer says you must
Raw media masters and old project archives you will almost certainly never reopen
The backup of the backup
Because you can't get data instantly, you need a workflow for when you do need it.
The Audit Scenario: Imagine you receive a legal notice requiring emails from 8 years ago. You kick off a Standard or Bulk restore on the relevant prefix, plan for the data to land 12 to 48 hours later, and work from the temporary restored copies. A minimal sketch follows below.
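Here is what starting that restore can look like with boto3. The bucket, key, and seven-day restore window are placeholder assumptions; Bulk is the cheap tier, Standard is the 12-hour tier:

```python
import boto3

s3 = boto3.client("s3")

# Start the restore job. Nothing is readable yet; this only queues the request.
s3.restore_object(
    Bucket="my-archive-bucket",
    Key="email-archive/2017/mailbox.tar.gz",
    RestoreRequest={
        "Days": 7,  # how long the temporary restored copy stays readable
        "GlacierJobParameters": {"Tier": "Bulk"},  # or "Standard" for ~12 hours
    },
)

# Hours later, check whether the restore has finished.
head = s3.head_object(
    Bucket="my-archive-bucket",
    Key="email-archive/2017/mailbox.tar.gz",
)
print(head.get("Restore"))  # e.g. 'ongoing-request="false", expiry-date="..."'
```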
Pro Tip: The Hybrid Strategy. For mixed-access archives, use a Restore and Promote strategy (sketched after this list):
1. Restore the objects you need from Deep Archive (Bulk if you can wait, Standard if you cannot).
2. Promote the restored data by copying it into S3 Standard so the audit team can read it instantly for as long as the review runs.
3. When the audit is over, delete the hot copies; the originals never left Deep Archive.
This gives you operational speed during the audit without sacrificing long-term savings.
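The promote step is just a copy once the restore completes. Again, the bucket and key names here are placeholders:

```python
import boto3

s3 = boto3.client("s3")

# Once head_object shows ongoing-request="false", copy the restored object
# into a hot working bucket in S3 Standard for the duration of the audit.
s3.copy_object(
    Bucket="my-audit-workspace",
    Key="restored/mailbox.tar.gz",
    CopySource={
        "Bucket": "my-archive-bucket",
        "Key": "email-archive/2017/mailbox.tar.gz",
    },
    StorageClass="STANDARD",
)
```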
Deep Archive is the ultimate commitment. Once you put data there, you are committed for 6 months. You need to be sure before you click the button.
This is where a tool like Costimizer is essential.
Don't guess your ROI, simulate it with your actual data.
Before you apply a lifecycle rule to move data to Deep Archive, ask these questions:
Will this data sit untouched for at least 180 days?
Can the people who need it back wait 12 to 48 hours for a restore?
Are the objects comfortably larger than the 40 KB minimum?
If you answered YES, you are about to unlock the biggest savings in the AWS ecosystem.
Should you use Deep Archive for Disaster Recovery? Only if your business can tolerate being down for 24+ hours. For most modern businesses, the answer is no. Use S3 Standard-IA or S3 Glacier Instant Retrieval for DR so you can recover quickly. Deep Archive is for the Backup of the Backup.
Is there a cheaper retrieval option? Yes, but it is slow. Bulk retrieval typically completes within 48 hours. Use Bulk ($0.0025/GB) when you have a massive amount of data to restore (petabytes) and time is not an issue; use Standard ($0.02/GB) when you need it within 12 hours. Restoring the 10 TB audit from the earlier example would cost about $25 with Bulk versus $200 with Standard.
What is the minimum billable object size? Like S3 Glacier Flexible Retrieval, Deep Archive charges a minimum of 40 KB per object. If you archive millions of 1 KB files, AWS will bill each one as if it were 40 KB.
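The small-file penalty adds up fast. A quick illustration of the doc's own numbers:

```python
files = 1_000_000                    # one million 1 KB files
actual_gb = files * 1 / 1_000_000    # ~1 GB of real data
billed_gb = files * 40 / 1_000_000   # ~40 GB billed at the 40 KB minimum

print(f"Actual data: {actual_gb:,.0f} GB")
print(f"Billed as:   {billed_gb:,.0f} GB")
```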