Until recently, edge computing sat somewhere between research projects and industrial deployment. Its potential was obvious, but the impetus to turn it into a widely implemented solution was lacking. Today, the situation is different. A leap in the performance of chips designed for on-device computing, combined with the growing availability of 5G networks, means that Edge AI – artificial intelligence running on end devices – is entering the mainstream of digital transformation.
Unlike AI running in cloud environments, edge AI processes data directly where it is generated – on a surveillance camera, industrial gateway, warehouse cart or portable diagnostic device. It is faster, cheaper and – crucially for many industries – easier to reconcile with regulatory and privacy requirements. For technology partners and the IT sales channel, this is a very tangible opportunity: edge AI creates a whole new segment of hardware, integration and services that is only now starting to grow rapidly.
The acceleration of edge AI is the result of several parallel developments. First, chipmakers have brought into mass production low-cost, low-power AI accelerators that can be fitted even into small devices. Second, 5G networks and modern Wi-Fi have significantly improved the quality and reliability of local connections. Third, pressure has grown to cut the operational costs of transferring data to the cloud. As a result, more and more companies are recognising that on-premise computing delivers immediate results while minimising bandwidth consumption and reducing the risks associated with transferring sensitive data.
Edge AI is no longer an experimental concept. It is already working in real-world settings – and in many industries. In retail, cameras analyse customer behaviour in real time, monitor shelf stocking and alert staff to possible shortages. In industry, line-mounted vision systems detect product defects without the need to send images to a data centre. Logistics fleets use devices that analyse sensor data and support route planning or vehicle diagnostics. In healthcare, portable medical devices can analyse ultrasound images or ECG data locally and flag abnormalities without a connection to the cloud. Cities are experimenting with local analysis of footage from street surveillance cameras to respond more quickly to traffic incidents or acts of vandalism.
The common denominator of these applications is autonomy and speed. Decisions are made where the data is created – without waiting for analysis in the cloud. For the IT channel, this represents a fundamental change: the importance of physical devices, integration competence and distributed infrastructure management is growing. Edge AI is creating a new business space where partners can offer not only hardware, but also deployment, maintenance and security services.
Hardware sales are becoming the starting point. Customers are asking for microservers, AI-enabled cameras, sensors with device-level data analytics, and systems that enable local deployment and updating of machine learning models. Increasingly, they are interested not only in what a device can do, but also in how it can be integrated into existing infrastructure – ERP, CRM, BI or MES. AI model lifecycle management services are also coming into play – updating, testing, monitoring performance and ensuring compliance with security requirements.
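To make the lifecycle part of that offering more concrete, the short sketch below shows what an edge-side model update check might look like. It is a minimal illustration only: the registry URL, JSON fields and file paths are hypothetical assumptions, not any vendor's actual model-management API.

```python
# Minimal sketch of an edge-side model update check (Python standard library only).
# The registry URL, JSON fields and file paths are hypothetical assumptions for
# illustration; real platforms expose their own model-management APIs.
import hashlib
import json
import urllib.request
from pathlib import Path

REGISTRY_URL = "https://models.example.com/defect-detector/latest"  # hypothetical endpoint
MODEL_PATH = Path("/opt/edge/models/defect-detector.onnx")
VERSION_PATH = MODEL_PATH.with_suffix(".version")


def check_and_update() -> bool:
    """Fetch registry metadata and replace the local model if a newer version exists."""
    with urllib.request.urlopen(REGISTRY_URL, timeout=10) as resp:
        meta = json.load(resp)  # expected shape: {"version": ..., "url": ..., "sha256": ...}

    current = VERSION_PATH.read_text().strip() if VERSION_PATH.exists() else None
    if meta["version"] == current:
        return False  # device already runs the latest model

    with urllib.request.urlopen(meta["url"], timeout=60) as resp:
        blob = resp.read()

    # Verify integrity before swapping the model that serves production traffic.
    if hashlib.sha256(blob).hexdigest() != meta["sha256"]:
        raise ValueError("Checksum mismatch; keeping the current model")

    MODEL_PATH.write_bytes(blob)
    VERSION_PATH.write_text(meta["version"])
    return True


if __name__ == "__main__":
    print("model updated" if check_and_update() else "model already current")
```

In practice this kind of check would be scheduled on the device and wrapped in the monitoring and compliance reporting that customers expect as part of a managed service.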
From a sales perspective, edge AI is a segment that is growing faster than classic cloud solutions, primarily because the number of endpoints – places where a customer can deploy an AI device – is many times greater than in a centralised cloud model. Each shop, warehouse, vehicle, production floor or medical facility may have separate needs and separate infrastructure, which translates into more opportunities for collaboration and service sales.
IT partners who want to enter this segment should start by identifying the areas with the greatest potential. In practice, these include manufacturing companies, logistics centres, brick-and-mortar retail chains, local authorities developing smart city projects and medical facilities offering on-site diagnostics. When talking to clients, it is worth focusing on simple business outcomes – savings, faster response times, improved data security or operational autonomy. In many cases, the best starting point will be a local pilot – implementing edge AI in a single facility, warehouse or service point, which can then be scaled up.
At the same time, customer awareness should not be overestimated. Many customers do not yet distinguish between AI running in the cloud and AI embedded locally on devices. That makes it all the more important to speak the language of benefits rather than technology: not accelerators, frameworks and ML libraries, but real return on investment and improved operational performance.
Getting into edge AI does not require a complete redefinition of an IT partner’s business. New competencies are needed, however – a basic understanding of how machine learning models are deployed, experience in IoT integration, and partnerships around edge-ready hardware and platforms such as Nvidia Jetson, Hailo, Intel, AWS Panorama or Azure Percept. It is also good practice to prepare ready-to-use scenarios that can be quickly shown to the customer – oriented not to the technology, but to solving a specific problem.
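For partners weighing up what that basic competence involves, the sketch below shows one common way of running a model directly on a device: local inference with ONNX Runtime. The model file, input shape and class labels are illustrative assumptions and are not tied to any of the platforms named above.

```python
# Minimal sketch of on-device inference with ONNX Runtime.
# The model file, input shape and labels are illustrative assumptions only.
import numpy as np
import onnxruntime as ort

MODEL_PATH = "defect-detector.onnx"  # hypothetical exported vision model
LABELS = ["ok", "defect"]            # hypothetical output classes

# The session loads and executes the model entirely on the local device.
session = ort.InferenceSession(MODEL_PATH)
input_name = session.get_inputs()[0].name


def classify(frame: np.ndarray) -> str:
    """Classify one preprocessed frame (1x3x224x224, float32) without leaving the device."""
    scores = session.run(None, {input_name: frame})[0]
    return LABELS[int(np.argmax(scores))]


if __name__ == "__main__":
    # A random array stands in for a preprocessed camera frame in this sketch.
    dummy_frame = np.random.rand(1, 3, 224, 224).astype(np.float32)
    print(classify(dummy_frame))
```

A few dozen lines like these, paired with a customer's own camera feed or sensor data, are often enough for the kind of quick, problem-oriented pilot described above.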
Edge AI is not an experiment. It is a new market that is just now gaining momentum. Partners who start building their competencies and offerings around this technology today have a real opportunity to gain a competitive advantage – not only in terms of revenue, but also in terms of advisory standing with clients. It is worth acting quickly. In two years’ time, edge could already be the standard – the question is who will implement it.