2191 applications in one company: The new reality of the SaaS market

Companies are facing unprecedented software growth, managing an average of over two thousand applications, which are becoming increasingly difficult to control. Artificial intelligence has further accelerated this process, with employees implementing new tools on their own faster than IT departments can verify them.


The digital architecture of a large enterprise resembles a vibrant, ever-expanding ecosystem that is increasingly difficult to control using traditional oversight methods. According to the latest data in the Torii 2026 SaaS Benchmark report, the average corporation now operates 2,191 applications. This figure signals a fundamental shift in the way technology permeates the fabric of business. It paints a picture of a digital jungle in which every new initiative, every project and every attempt by an employee to optimise their work brings another influx of software. In this reality, the role of the COO and IT leader is evolving from that of a strict controller to that of an architect of fluid structures who, instead of building dams, must learn to manage the current of the river.

The foundation for this rapid expansion has been laid by artificial intelligence, which has acted as a catalyst for the phenomenon hitherto known as Shadow IT. While the self-implementation of IT solutions by business departments is not a new phenomenon, it is AI that has given it an unprecedented dynamic and, more significantly, increased the so-called blast radius of possible incidents. Tools based on language models and automation differ from classic SaaS platforms in one key feature: their ability to integrate instantly and deeply with company resources. One click is all it takes for an unverified algorithm to gain access to company mail, calendars or confidential knowledge bases. Not only do these tools appear in an organisation at the speed of light, but they demonstrate an astonishing longevity, persisting in system structures long after the users’ initial enthusiasm has died down.

From a security perspective, this situation poses a number of challenges. With more than 61% of discovered applications operating outside the formal radar of IT departments, a parallel technology ecosystem has effectively emerged. In this reality, traditional firewalls and static security policies are becoming anachronistic tools. Modern threats no longer arise solely from external attacks, but from the careless intertwining of authorised data with unauthorised processing. It is a dangerous symbiosis in which a lack of visibility translates directly into an organisation's exposure to legal risk and loss of intellectual property.

It is worth examining the position of the employee who, in 2026, becomes the de facto manager of his or her own technological micro-environment. Interacting with 40 applications in a single working day has become the new standard of efficiency, although in practice it can lead to decision paralysis and digital fatigue. An employee seeking to maximise personal performance rarely analyses the long-term effects of installing a free browser plug-in or subscribing to an image generator. What counts is the immediate benefit, which makes the mechanism of technology adoption completely decentralised and unpredictable. Combating this trend through restrictive bans usually proves ineffective, pushing innovation even deeper underground.

This expansion also has tangible financial consequences, which should become a priority for boards. The lack of consistent oversight of thousands of applications leads to a systematic drain on capital. Inactive licences, duplicated functionality and forgotten subscriptions generate losses running into millions of dollars a year. This is financial noise that often escapes attention when annual budgets are constructed, because the individual amounts seem insignificant. Only when viewed at enterprise-wide scale is the enormity of the waste revealed: capital that could be redirected to strategic investments in in-house AI models or the modernisation of core systems.
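To make the scale effect concrete, here is a minimal sketch of how such waste might be tallied from a licence inventory. All app names, seat counts and prices below are invented for illustration; in practice the figures would come from a SaaS management platform's usage data.

```python
from collections import Counter

# Hypothetical inventory: (app, category, seats_paid, monthly_cost_per_seat, seats_active)
inventory = [
    ("NoteTakerPro", "notes", 500, 8.0, 120),
    ("QuickNotes",   "notes", 300, 6.0, 40),   # overlaps with NoteTakerPro
    ("ChartMaker",   "bi",    200, 20.0, 180),
]

# Annual spend on seats nobody is using: small per app, large in aggregate.
idle_waste = sum((seats - active) * cost * 12
                 for _, _, seats, cost, active in inventory)

# Categories served by more than one tool: candidates for consolidation.
categories = Counter(cat for _, cat, *_ in inventory)
duplicates = sorted(c for c, n in categories.items() if n > 1)

print(f"Idle-seat spend per year: ${idle_waste:,.0f}")
print("Overlapping categories:", duplicates)
```

Even in this three-app toy example, idle seats alone add up to tens of thousands of dollars a year; multiplied across two thousand applications, the "insignificant" line items become a budget category of their own.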

It therefore becomes necessary to move away from a management model based on periodic reviews. A traditional quarterly or annual software audit is about as useful as last month's weather forecast. Organisations need models built for continuous discovery: systems that monitor data flows and new points of contact with software on the corporate network in real time. This proactivity balances the need for innovation with security requirements. Rather than blocking access to new technologies, IT departments should act as process facilitators that validate tools on the fly and integrate the most valuable ones into the company's official ecosystem.
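The difference between a periodic audit and continuous discovery can be sketched as a running inventory diff. The data source is a hypothetical stand-in (real deployments would ingest SSO logs, OAuth grants or network telemetry continuously), and the approved catalogue below is invented for illustration.

```python
# Applications already known to IT, whether formally approved or previously flagged.
approved = {"salesforce", "slack", "jira"}
seen_before = set(approved)

def review(observed_apps):
    """Return apps observed for the first time and mark them as seen.

    In a real workflow, newly discovered apps would be routed to a
    validation queue rather than blocked outright, matching the
    facilitator role described above.
    """
    new = set(observed_apps) - seen_before
    seen_before.update(new)
    return sorted(new)

# Each call represents one telemetry window, not a quarterly audit.
print(review(["slack", "ai-summarizer", "jira", "imagegen"]))
# → ['ai-summarizer', 'imagegen']
```

The point of the sketch is the cadence, not the mechanism: because `review` runs on every observation window, an unvetted AI tool surfaces within hours of first use instead of months later in an annual audit.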

However, this reorientation requires a cultural shift within the organisation. Technology management is becoming part of every manager’s business hygiene. The modern approach assumes that since software adoption no longer follows centralised paths, responsibility for its use must be appropriately dispersed, while maintaining central visibility. This is the paradox of modern management: in order to maintain control, one must first accept that full, restrictive dominance over every byte of data is a myth.
