Vendor lock-in as operational risk. What does the case of VMware and Broadcom teach us?

The drastic licensing changes following Broadcom's acquisition of VMware are a rude awakening for managers who have been paying a “peace of mind tax” for years for sticking with a single vendor. This market shock definitively ends the era of convenient monoliths, proving that true business security today requires escaping vendor lock-in in favor of open, sovereign ecosystems.


Recent months have felt like a cold shower for the IT industry. The acquisition of VMware by Broadcom and the resulting radical changes to licensing are not just a price-tag adjustment. It is an earthquake that has exposed a systemic error in the strategies of many companies. For years, managers have been paying a kind of ‘peace of mind tax’ by choosing monolithic solutions from a single supplier. Today, it turns out that this peace of mind was illusory and that convenience has become a trap.

Until recently, the ‘one-stop-shop’ strategy – basing key infrastructure on a single, dominant supplier – seemed rational. It simplified management, service integration and contract negotiations. However, the current situation in the virtualisation market shows how thin the line is between a stable partnership and a dangerous dependency, known in the industry as vendor lock-in. When a company’s technological foundation, until now taken for granted, suddenly becomes a source of budgetary and operational uncertainty, boards must ask themselves a difficult question: can we afford the risk of having no alternative?

Catalyst for change – when the ‘standard’ becomes a burden

Market dominance comes at a price, which is ultimately paid by the customer. The changes introduced following the VMware acquisition, including the move to a subscription model and product bundling, have shaken up the European market. As industry organisations, including CISPE, point out, the new terms affect almost every organisation that uses the cloud, directly or indirectly.

For chief financial officers (CFOs) and chief information officers (CIOs), the problem is not just the cost increase itself, although this can be drastic. Far more dangerous is the loss of flexibility. New licensing models often force the purchase of extensive software packages that the company does not really use, but which it has to pay for in order to keep critical systems running. This contradicts the idea of cost optimisation and agility that modern companies strive for.

As the market enters a phase of economic downturn and the search for savings, a sudden, uncontrolled increase in the cost of maintaining infrastructure (‘Run the Business’) becomes unacceptable. This demonstrates emphatically that relying on the closed ecosystem of a single giant is a strategy that only works in times of peace. In times of market turbulence, it becomes a ball and chain.

Moving forward – Open Source as a new definition of stability

Open source software is no longer seen as a ‘cheaper alternative for enthusiasts’; it has been promoted to a strategic ‘Plan A’, even among the biggest players. A redefinition of what security means in business is under way.

Security today means independence. Companies are turning to solutions built on open code (such as KVM, Linux or containerisation platforms) because they offer something the commercial market has just taken away: predictability. In an open source model, there is no risk that a vendor will ‘switch off’ a licence overnight, drastically change the product roadmap or force an unwanted upgrade.
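
For teams weighing such a move, the first practical question is often whether the existing hardware can even host an open hypervisor. The snippet below is a minimal, illustrative sketch – it assumes a Linux host with the standard /proc/cpuinfo and /dev/kvm paths – of how one might check KVM readiness before planning a migration; it is not taken from any specific migration guide.

```python
# Minimal sketch: check whether a Linux host can run KVM-based virtual machines.
# Assumptions: a Linux host exposing the standard /proc/cpuinfo and /dev/kvm paths.
import os

def kvm_ready() -> bool:
    """Return True if the CPU exposes hardware virtualisation and /dev/kvm exists."""
    try:
        with open("/proc/cpuinfo") as f:
            cpuinfo = f.read()
    except OSError:
        return False
    has_hw_virt = "vmx" in cpuinfo or "svm" in cpuinfo  # Intel VT-x or AMD-V flags
    has_kvm_device = os.path.exists("/dev/kvm")         # kvm kernel module loaded
    return has_hw_virt and has_kvm_device

if __name__ == "__main__":
    print("KVM-ready:", kvm_ready())
```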

This is where the concept of digital sovereignty comes into play. In the context of increasing regulatory requirements – such as the EU’s NIS2 directive or the DORA regulation for the financial sector – having full control over the technology stack is key. Open code provides transparency. A company knows what happens to its data, where it is processed and how secure it is. In a world of ‘black boxes’ provided by global corporations, such auditability is becoming a luxury that only those who have opted for open standards can afford.

It is also a matter of pure economies of scale. Open source allows you to pay for real use and support, not artificial licensing barriers. It allows the infrastructure to scale at a pace dictated by business growth rather than the quarterly sales performance of the software vendor.

The architecture of the future does not like a rigid framework

The decision to move away from monolithic solutions has another key dimension: innovation. We live in an age where hybrid cloud and multicloud are becoming the standard. According to Bitkom, 29% of companies already use a hybrid cloud and 41% combine services from multiple providers. In such an environment, interoperability – the ability of systems to work together – is the real currency. Open ecosystems are inherently geared towards integration; closed, proprietary platforms tend to isolate and build walls.

What is more, we stand at the threshold of the AI revolution. Investment in AI, forecast to surge around 2026, requires massive computing power. Modelling the cost of AI is difficult enough on its own, and adding expensive, complex licences charged ‘per CPU core’ can kill the profitability (ROI) of innovation projects right from the start.
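
To make the per-core argument concrete, here is a back-of-the-envelope calculation. Every figure in it – the core count, the per-core subscription price and the expected benefit of the AI project – is hypothetical and serves only to show how quickly such fees erode a business case.

```python
# Hypothetical, illustrative calculation: how per-CPU-core licensing
# eats into the expected return of an AI infrastructure project.
# All numbers below are made up for illustration only.

cores_per_host = 64                # hypothetical dual-socket server
hosts = 20                         # hypothetical cluster size
licence_per_core_year = 350        # hypothetical per-core subscription price (EUR/year)
expected_annual_benefit = 600_000  # hypothetical value the AI project is meant to deliver (EUR)

licensing_cost = cores_per_host * hosts * licence_per_core_year
roi = (expected_annual_benefit - licensing_cost) / licensing_cost

print(f"Annual licensing cost: {licensing_cost:,} EUR")  # 448,000 EUR
print(f"ROI on the licence spend alone: {roi:.0%}")      # ~34%
```

In this toy model, doubling either the cluster size or the per-core price pushes the licence bill past the project’s entire expected benefit.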

The infrastructure underpinning AI, Edge Computing or modern cloud-native applications is developing fastest in open communities. Waiting for a commercial provider to implement these innovations in its paid package means losing valuable time (Time-to-Market). Migrating to open platforms is therefore not just an escape from costs, but above all a way of building a highway for future innovation.

Resilience through diversification

The upheaval caused by the VMware changes should be a wake-up call for managers. The future of IT belongs to modular, sovereign and flexible systems. The days when a single vendor guaranteed peace of mind in exchange for total control over a customer’s infrastructure are definitively coming to an end.

Today, in an era of geopolitical and market uncertainty, technology diversification is a new form of insurance policy. Managers and boards should no longer ask the question: “Can we afford the hardship of migrating to new solutions?”. The right question is: “Can we afford the risk of staying with the current model?”. The answer, looking at recent market developments, seems obvious.
