The myth of the cheap archive. Why are the hidden costs of Tiering draining IT budgets?

Although storage tiering has been considered the foundation of cloud cost optimization for years, in the age of AI and real-time analytics, it is becoming a technological liability. Instead of generating savings, traditional tiering increasingly surprises IT departments with hidden fees and crippling delays, forcing them to seek more predictable access models.


For almost two decades, cloud architecture has been based on one seemingly inviolable dogma: data that is rarely used should be ‘frozen’. The Cloud Object Storage model, shaped in the mid-2000s by Amazon (S3), defined the standard for thinking about infrastructure costs. But in 2025, in the age of real-time analytics, AI and rigorous compliance, this logic is beginning to crack. What looks like a saving in Excel becomes an unpredictable cost trap in operational practice.

Only a decade ago, dividing data into classes (Hot, Warm, Cold/Glacier) was not only logical but necessary. Storage media were expensive and bandwidth was limited. Offloading rarely used data to cheaper, slower storage tiers (Tiering) promised CFOs and CIOs clear savings. The principle was simple: you pay a lot for what you use now, and pennies for what ‘lies and gathers dust’.

On paper, this approach still seems rational. However, the reality of modern IT brutally verifies this model. Infrastructure teams are increasingly struggling with complex lifecycle policies, operational delays and – most importantly – costs that cannot be budgeted for annually. So is the era of Tiering coming to an end?

Logic of the 2000s versus digital reality

Data tiering had a strong economic rationale at a time when data was static. The archive existed to be forgotten. Today, however, data has become fuel. The rise of machine learning, Big Data analytics and the need for real-time reporting has made the concept of ‘rarely used data’ fluid.

A file that has not been opened for 180 days can become critical within minutes to a predictive algorithm, an audit process or an urgent GDPR request. In the classic S3 model, IT systems hit a wall: the data has been ‘pushed out’ to a low-cost tier by a lifecycle management policy, and immediate restoration is either impossible or extremely expensive.
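To make that ‘pushing out’ concrete, here is a minimal sketch of such a lifecycle rule, assuming AWS S3 and the boto3 SDK; the bucket name, prefix and 180-day threshold are illustrative assumptions, not taken from any real deployment (note that S3 lifecycle transitions count days from object creation, not from last access).

```python
# Minimal sketch: a lifecycle rule that quietly moves objects to a deep
# archive tier roughly 180 days after creation. Bucket name, prefix and
# threshold are hypothetical.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="example-company-data",              # hypothetical bucket
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-stale-objects",
                "Status": "Enabled",
                "Filter": {"Prefix": "clients/"},   # hypothetical prefix
                "Transitions": [
                    # From this point on, reading the object requires an
                    # asynchronous restore measured in hours.
                    {"Days": 180, "StorageClass": "DEEP_ARCHIVE"}
                ],
            }
        ]
    },
)
```

The rule itself is a one-off configuration call; the cost surprises only appear later, when something needs to read those objects again.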

The sharp drop in the price of storage itself in recent years means that the difference in price per TB between hot and cold tiers is no longer the sole determinant of cost-effectiveness. In the new economic calculus, it is access costs, rather than at-rest storage costs, that become decisive.

The maths that hurts – the hidden costs of ‘cold’ data

Many IT managers fall into the trap of looking solely at the price of storage at rest. This, however, is only the tip of the TCO (Total Cost of Ownership) iceberg. Traditional Tiering carries a number of charges, written in the fine print of cloud providers’ price lists, that hit companies when they least expect it.

The main problem is a lack of transparency. Companies often omit from their calculations:

  • Retrieval Fees: The cost of ‘retrieving’ data from an archive can be many times the annual cost of storing it.
  • Minimum retention period: Many ‘low-cost’ storage classes enforce the retention of an object for, say, 90 or 180 days. Deleting or moving it earlier incurs a financial penalty.
  • Exit costs (Egress Fees): Charges for transferring data out of the provider’s cloud.

The scenario repeats itself: a company moves terabytes of legacy client data to a ‘cold’ storage class to save budget. Months later, the legal department orders an audit or a historical review, and the IT department has to ‘unfreeze’ these resources. Suddenly, the process generates an invoice that ‘eats up’ all the savings made so far and further blocks the budget for new investments. Cost unpredictability becomes enemy number one for business stability.
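The arithmetic behind that invoice is easy to reproduce. The sketch below uses purely illustrative per-GB rates (they are not any provider’s actual price list) to show how a single bulk retrieval plus egress can wipe out months of storage savings.

```python
# Illustrative back-of-the-envelope TCO comparison. All rates are made-up
# placeholders, NOT real provider prices.
ARCHIVE_STORAGE = 0.004   # $/GB-month, "cold" tier (assumed)
HOT_STORAGE     = 0.023   # $/GB-month, "hot" tier (assumed)
RETRIEVAL_FEE   = 0.02    # $/GB to pull data out of the archive (assumed)
EGRESS_FEE      = 0.09    # $/GB transferred out of the cloud (assumed)

data_gb = 50_000          # 50 TB of legacy client data
months  = 6               # how long it sat in the cold tier

savings_at_rest = data_gb * (HOT_STORAGE - ARCHIVE_STORAGE) * months
audit_retrieval = data_gb * (RETRIEVAL_FEE + EGRESS_FEE)   # audit pulls everything

print(f"Savings from tiering over {months} months: ${savings_at_rest:,.0f}")
print(f"One-off cost of the audit retrieval:      ${audit_retrieval:,.0f}")
# With these assumed rates, the single retrieval (~$5,500) roughly cancels
# out the entire six months of 'savings' (~$5,700).
```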

Time is money – operational paralysis

The financial aspect is one thing, but Tiering also introduces operational risk. In deep archives (Deep Archive-class storage), the time to restore access to data is measured in hours, sometimes days.

For modern applications that expect millisecond responses, this is unacceptable. When an analytics tool or reporting system encounters archived data, workflows are interrupted: timeouts and error messages appear, and business processes come to a standstill. In time-critical environments – such as banking, e-commerce or manufacturing – such a delay can mean real reputational and financial losses.
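In practical terms, ‘unfreezing’ is an asynchronous job the application has to request and then wait out. A minimal sketch, assuming AWS S3 Glacier Deep Archive and boto3; bucket, key and the chosen retrieval tier are hypothetical:

```python
# Minimal sketch: requesting a restore from a deep-archive tier and polling
# for completion. Bucket and key are hypothetical; restores in this tier are
# documented in hours, not milliseconds.
import time
import boto3

s3 = boto3.client("s3")
bucket, key = "example-company-data", "clients/2019/report.parquet"  # assumed

# Kick off the asynchronous restore (Bulk is the slower, cheaper option).
s3.restore_object(
    Bucket=bucket,
    Key=key,
    RestoreRequest={"Days": 7, "GlacierJobParameters": {"Tier": "Bulk"}},
)

# The object is NOT readable yet. The caller has to poll until a temporary
# copy materialises, which can take many hours.
while True:
    head = s3.head_object(Bucket=bucket, Key=key)
    if 'ongoing-request="false"' in head.get("Restore", ""):
        break                      # temporary copy is ready to GET
    time.sleep(3600)               # check back in an hour
```

Any reporting pipeline sitting behind that loop is, by definition, stalled for the duration.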

In addition, data lifecycle management (Lifecycle Policies) is becoming increasingly complex. A rule such as ‘move to archive after 30 days without access’ sounds reasonable, but in practice it is a blunt tool. IT teams spend hundreds of hours configuring exceptions, monitoring rules and manually restoring data at the request of the business. Instead of working on innovation, administrators become custodians of the digital archive, fighting a system that was supposed to make their job easier.
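Part of that administrative burden is simply knowing where the lifecycle engine has put things. A minimal monitoring sketch, again assuming boto3 and a hypothetical bucket, that tallies objects per storage class:

```python
# Minimal sketch: counting objects per storage class so administrators can
# see what the lifecycle rules have quietly moved. Bucket name is assumed.
from collections import Counter
import boto3

s3 = boto3.client("s3")
classes = Counter()

paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket="example-company-data"):
    for obj in page.get("Contents", []):
        # StorageClass tells us which tier the object currently sits in.
        classes[obj.get("StorageClass", "STANDARD")] += 1

for storage_class, count in classes.most_common():
    print(f"{storage_class:20s} {count:>10d} objects")
```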

The “Always-Hot” trend – predictability instead of gambling

In response to these challenges, a new trend is crystallising in the storage market: a move away from class-based logic towards Always-Hot architectures.

More and more IT decision-makers are questioning the relevance of Tiering. Instead of juggling data between different tiers, companies are opting for models in which all objects – regardless of age or frequency of use – are maintained in instant access mode.

The advantages of this approach go beyond simple convenience:

1. Financial predictability: in the Always-Hot model, the variable costs of data recovery disappear. The company pays for capacity and transfer, but is not penalised for wanting to use its own information. Budgeting becomes simple and precise.

2. Efficiency: the absence of ‘unfreezing’ processes means that every application, script or analyst has access to the full spectrum of data at all times.

3. Simpler architecture: eliminating complex retention and data-movement rules frees up human resources.

Security and Compliance in a flat structure

A data store that makes everything available instantly, however, requires a different security philosophy. Classic S3 mechanisms, such as ACLs (Access Control Lists) or policies at the level of individual buckets, become unmanageable and confusing at large scale.

Modern Object Storage systems rely on IAM (Identity and Access Management). Since data is always available (“hot”), access control must be surgical. Rights are assigned to the identity of the user or application, rather than being “stuck” to folders. This allows precise identification of who can read, write or delete objects, which is crucial in multi-tenancy environments.
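As a rough illustration of what ‘rights assigned to the identity’ means in practice, the sketch below attaches an inline policy to a hypothetical application identity, assuming an AWS-style IAM API via boto3; the user name, policy name and bucket ARN are assumptions.

```python
# Minimal sketch: identity-based access control instead of per-bucket ACLs.
# User name, policy name and resource ARN are hypothetical.
import json
import boto3

iam = boto3.client("iam")

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            # The analytics service identity may read and write reports,
            # but has no right to delete anything.
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject"],
            "Resource": "arn:aws:s3:::example-company-data/reports/*",
        }
    ],
}

iam.put_user_policy(
    UserName="analytics-service",          # hypothetical identity
    PolicyName="reports-read-write",
    PolicyDocument=json.dumps(policy),
)
```

The permissions follow the identity wherever it reads or writes, rather than being pinned to a particular bucket or folder.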

The legal aspect is equally important. Compliance with GDPR, European data sovereignty and protection against extraterritorial regulations (such as the US CLOUD Act) are priorities today. Companies need to know where their data is and be confident that they can permanently delete or export it at a regulator’s request. In a tiered model, where data is spread across different archive classes, implementing the ‘right to be forgotten’ can be technically difficult and time-consuming. A flat architecture, with no tiers, drastically simplifies auditability and compliance management.
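In a flat, always-accessible store, the mechanics of an erasure request reduce to a list-and-delete pass with no restore-and-wait step. A minimal sketch, with a hypothetical bucket and data-subject prefix:

```python
# Minimal sketch: a GDPR-style erasure request in a flat, always-hot store.
# Bucket and prefix are hypothetical; no prior 'unfreezing' step is needed
# because every object is immediately addressable.
import boto3

s3 = boto3.client("s3")
bucket, prefix = "example-company-data", "clients/contract-4711/"  # assumed

paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
    objects = [{"Key": obj["Key"]} for obj in page.get("Contents", [])]
    if objects:
        # delete_objects accepts up to 1,000 keys per call, which matches
        # the page size returned by list_objects_v2.
        s3.delete_objects(Bucket=bucket, Delete={"Objects": objects})
```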

Resilience through accessibility

Looking ahead, it is clear that data volumes will grow exponentially while tolerance for access delays shrinks. Companies cannot afford to have their digital assets held hostage by complex pricing and slow archive media.

The Always-Hot approach fits into a broader strategy of business resilience. It is a model that prioritises business continuity and responsiveness over theoretical savings on storage media. The classic Tiering model, whatever its contribution to the development of the cloud, has reached its limits in many scenarios. Its complexity and hidden costs make it a relic of a previous IT era.

For CIOs and system architects, the lesson is clear: choosing storage today is a strategic decision, not just a purchasing one. Those who opt for direct availability and cost transparency are building the foundation for IT that is ready for the unpredictable challenges of the future – from sudden audits to the AI revolution.
