When artificial intelligence is mentioned, attention is usually drawn to breakthrough generative models, autonomous agents or visions of computers that understand the world as well as humans. Meanwhile, the biggest winner of this revolution is a solution that is not new at all. Chatbots – often seen as a boring customer support tool – are back in the spotlight. And paradoxically, they are the ones that best demonstrate how the distant history of artificial intelligence meets its most practical application today.
From ELIZA to ChatGPT
The history of chatbots is a story of evolution, not revolution. As early as 1966, ELIZA was created at MIT – a program based on simple rules that allowed text-based human-machine dialogue. Its responses were fully predefined and selected by matching keywords in the user’s input. Although it seems primitive from today’s perspective, for users it was the first substitute for a conversation with a computer.
Two decades later came Jabberwacky, which learned from its conversations with users and, in later incarnations, added voice interaction. What is obvious today thanks to Siri or the Google Assistant sounded like science fiction then. The next step was taken by A.L.I.C.E. in the 1990s – a system whose pattern-matching rules recombined stored response templates into new replies. In practice, this was not yet real intelligence, but for many researchers it opened up the question of where programming ends and intelligence begins.
Over the following decades, more complex systems emerged, but they were all based on the same foundation: rules, keywords and sets of predetermined responses. It was not until natural language processing and large language models overturned this convention, allowing chatbots to break away from narrow frameworks.
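The rule-and-keyword foundation described above can be sketched in a few lines. The rules below are illustrative stand-ins, not ELIZA’s actual script:

```python
# A minimal sketch of the rule-based approach behind early chatbots:
# responses are fully predefined and chosen by scanning the input for
# known keywords. The rule list here is purely illustrative.

RULES = [
    ("mother", "Tell me more about your family."),
    ("sad",    "Why do you feel sad?"),
    ("hello",  "Hello! What would you like to talk about?"),
]
DEFAULT = "Please, go on."

def respond(user_input: str) -> str:
    text = user_input.lower()
    for keyword, reply in RULES:
        if keyword in text:   # first matching keyword wins
            return reply
    return DEFAULT            # fallback when nothing matches

print(respond("I feel sad today"))  # -> Why do you feel sad?
```

However many rules are added, the system can never say anything its authors did not write down – which is exactly the ceiling the article describes.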
The data and computing power revolution
The fact that chatbots really took off in the 21st century was not the result of a brilliant idea, but of a combination of computing power and data. The development of GPUs made it possible to process huge collections of information, and the internet provided those collections. When open-source libraries such as TensorFlow and PyTorch appeared, the barrier to entry for companies dropped dramatically. Creating your own chatbot was no longer the domain of research labs and technology corporations.
The turning point came in 2022 with ChatGPT, built on the Transformer architecture introduced in 2017. From a simple text-completion model, AI evolved into a conversational system that can respond naturally and flexibly. Fine-tuning on dialogue examples, reinforced by human feedback, allowed chatbots to break the pattern of rule repetition. From then on, it was no longer about a set of possible answers, but about conversational skill.
Today’s challenges: technology versus costs
Today’s chatbots no longer need pre-programmed responses. They can use billions of examples and conversational context to respond more consistently than ever before. However, the biggest challenge is no longer technology, but economics.
Companies deploying chatbots in customer service, operating 24 hours a day, run up against the issue of cost. Every interaction consumes computing resources, and with large models this means high bills. In practice, organisations are increasingly opting for smaller, specialised models – cheaper and sufficient for specific tasks. Paradoxically, in a world of ‘bigger is better’, the business advantage may come from the optimised model rather than the most advanced one.
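The economics are easy to see with back-of-the-envelope arithmetic. The traffic figures and per-million-token prices below are hypothetical assumptions, not real vendor rates, but they show how quickly a 24/7 deployment scales:

```python
# Rough monthly-cost comparison between a large and a small model for a
# round-the-clock support chatbot. All numbers are illustrative
# assumptions, not actual vendor pricing.

def monthly_cost(chats_per_day: int, tokens_per_chat: int,
                 usd_per_million_tokens: float) -> float:
    tokens_per_month = chats_per_day * 30 * tokens_per_chat
    return tokens_per_month / 1_000_000 * usd_per_million_tokens

CHATS_PER_DAY = 5_000
TOKENS_PER_CHAT = 1_500  # prompt + response combined (assumed)

large = monthly_cost(CHATS_PER_DAY, TOKENS_PER_CHAT, 10.0)  # assumed rate
small = monthly_cost(CHATS_PER_DAY, TOKENS_PER_CHAT, 0.5)   # assumed rate

print(f"large model: ${large:,.2f}/month, small model: ${small:,.2f}/month")
```

At these assumed rates the large model costs roughly twenty times as much per month, which is why ‘sufficient for the task’ often beats ‘state of the art’ on the balance sheet.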
This shifts the focus from the question ‘what is possible’ to ‘what pays off’. And it puts technology companies and IT integrators in the role of advisors who need to help clients balance innovation and budget.
A multimodal future
The next wave of change, which is already beginning, concerns multimodality. Where chatbots once understood only text, today they are learning to analyse speech, images and even video. The combination of these modalities creates new usage scenarios: from generating marketing material, to automated internal reports, to personalised presentations based on company data.
Retrieval-Augmented Generation (RAG) architectures are a particularly interesting direction. With RAG, a chatbot not only draws on general knowledge, but can also refer to the organisation’s internal databases. This paves the way for advanced question-and-answer systems or corporate search engines that understand the business context better than traditional tools.
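The RAG idea can be sketched under toy assumptions: here a bag-of-words score stands in for a real embedding model, and the returned prompt is what would be sent to whatever chat model the organisation uses:

```python
# Minimal sketch of Retrieval-Augmented Generation: retrieve the most
# relevant internal passages, then prepend them to the model's prompt.
import math
import re
from collections import Counter

# Toy knowledge base standing in for an organisation's internal documents.
DOCUMENTS = [
    "Invoices are processed within 14 days of receipt.",
    "The VPN requires two-factor authentication for remote staff.",
    "Annual leave requests go through the HR portal.",
]

def embed(text: str) -> Counter:
    # Bag-of-words "embedding"; a real system would use a neural encoder.
    return Counter(re.findall(r"\w+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def retrieve(query: str, k: int = 1) -> list[str]:
    # Rank documents by similarity to the query and keep the top k.
    q = embed(query)
    ranked = sorted(DOCUMENTS, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query: str) -> str:
    # Retrieved passages are prepended so the model answers from them,
    # not from its general training data alone.
    context = "\n".join(retrieve(query))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

print(build_prompt("How are invoices processed?"))
```

A production system would swap in vector embeddings and a vector database, but the retrieve-then-generate loop stays the same – and it is this loop that turns a generic chatbot into a front end for the company’s own knowledge.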
Forecasts indicate that from 2025 onwards, RAG and AI agents will be one of the main drivers of productivity growth in many industries. The chatbot will cease to be a simple interface in customer service and will become part of a company’s knowledge infrastructure.
What this means for business and the IT channel
For companies using chatbots, this means thinking of them not just as a tool that automates simple customer queries, but as a strategic data handling layer. A chatbot can become an organisation’s knowledge access point, a reporting channel and a creative tool.
For IT suppliers and resellers, on the other hand, the coming era of chatbots is an opportunity to develop new services. The integration of RAG systems, the design of multimodal solutions or advice on optimising the cost of models are all areas that can build real competitive advantages.
Looking more broadly, chatbots are an interesting case showing that in technology, it is not always what is most futuristic that wins, but what is most useful. After years of being underestimated, they are becoming central to the AI revolution, combining the simple function of communication with the most advanced artificial intelligence algorithms.
Old player, new deal
The history of chatbots is a reminder that, in the IT world, many ideas return in new guises. The ELIZA of the 1960s was a scientific experiment; ChatGPT, a commercial breakthrough. Six decades have passed between them, but the need remains the same: how to make a machine understand a human.
Today, the answer is more advanced than ever, but the challenges are just as real. Companies need to decide how to harness the potential of multimodal AI agents while controlling costs. Technology providers are becoming partners in this decision, not just tool vendors.
The paradox of the generative revolution is that the oldest technology may be its biggest beneficiary. The chatbot, until recently treated as a digital automaton answering the most frequently asked questions, is today growing into a strategic player in the AI ecosystem. And this is only the beginning of its new hand.