As we look ahead to 2025, the Edge Monsters see a number of key themes emerging at the edge. While some trends build on years of momentum, others represent significant shifts in how organizations deploy, manage, and scale edge infrastructure with a focus on quick business value. Whether driven by technological advances, operational demands, or economic realities, these trends will shape the future of edge computing in the year ahead.
1. Zero Trust at the Edge
2025 will be the year we begin to hear a lot of talk about Zero Trust at the Edge. This will be driven by organizations moving away from traditional authentication methods, such as passwords and two-factor authentication, toward a Zero Trust paradigm. The result will be more tools, more standards, and broader overall adoption of Zero Trust. While this shift is not driven by challenges at the edge, we expect the edge will be swept up in the larger enterprise moves that take place.
2. Hyper-constrained Edge
In 2025, we expect to see increased adoption of the “hyper-constrained edge.” We define this as an edge environment that faces the same constraints as the traditional edge, only to a significantly greater degree. A good example of a hyper-constrained edge device is a camera running model inference while participating in a business solution.
This shift is driven by advancements in miniaturization—computing devices are getting smaller, more powerful, and more cost-effective all at once.
The hyper-constrained edge will explode the number of devices that are participating in business solutions and create a new management challenge for enterprises, specifically with managing distributed operations via many control planes.
3. Miniaturization
We expect the trend of shrinking form factors with increasing capability to continue, unlocking new use cases at both the traditional edge and the hyper-constrained edge. This is paired with a decrease in costs, which creates a trifecta for large-scale edge deployments. A key area to watch is NVIDIA’s advancements, particularly with devices like the Jetson Orin Nano, which exemplify this shift in the AI arena. As more high-performance, small-footprint, and cost-effective devices emerge, organizations seeking to deploy workloads at the edge will adopt them at an increasing pace. Smaller, better, and cheaper is becoming a reality.
4. Friction in deploying K8s at the Edge continues to fall
The friction in deploying Kubernetes on bare metal has been dramatically reduced through key ecosystem improvements. What was once a frontier that was settled with primitive tactics and simple solutions has matured into robust patterns and architectures.
Cluster API has evolved into a robust foundation for infrastructure management, letting operators treat bare metal servers as declarative resources. The container storage interface (CSI) ecosystem has matured significantly, with providers like Rook offering stable integration with local storage and existing infrastructure. Meanwhile, networking improvements through projects like MetalLB and Cilium have simplified load balancing and routing with better BGP support and eBPF capabilities. These advancements have enabled edge computing at scale, allowing organizations to manage hundreds of remote locations using lightweight distributions like K3s without requiring specialized expertise at each site.
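As an illustration of how little configuration BGP-based load balancing now requires, here is a minimal MetalLB setup sketch. The pool name, address range, and ASNs are placeholders chosen for this example, not values from any particular deployment:

```yaml
# Illustrative MetalLB configuration; names, addresses, and ASNs are placeholders.
apiVersion: metallb.io/v1beta1
kind: IPAddressPool
metadata:
  name: edge-pool
  namespace: metallb-system
spec:
  addresses:
    - 192.168.10.200-192.168.10.220   # handed out to LoadBalancer Services
---
apiVersion: metallb.io/v1beta2
kind: BGPPeer
metadata:
  name: edge-router
  namespace: metallb-system
spec:
  myASN: 64512                # ASN the cluster advertises from
  peerASN: 64513              # upstream router's ASN
  peerAddress: 192.168.10.1   # upstream router
---
apiVersion: metallb.io/v1beta1
kind: BGPAdvertisement
metadata:
  name: edge-adv
  namespace: metallb-system
spec:
  ipAddressPools:
    - edge-pool
```

With a few resources like these applied to a lightweight distribution such as K3s, service IPs at a remote site are advertised to the local router without any site-specific networking expertise.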
Looking ahead to 2025, we’ll likely see even more sophisticated bare metal provisioning through Cluster API, improved storage operators designed specifically for edge scenarios, enhanced eBPF-based networking capabilities, and more automated operations with advanced self-healing and failure-detection mechanisms. Bulletproof architecture patterns for provisioning and managing clusters from “sidecar” devices are prevailing in production. The future of K8s distributions at the edge is still bright.
5. Rise of alternatives to K8s at the Edge
Over the past eight years, Kubernetes has become the default assumption at the Edge, largely through edge-friendly distributions like K3s and MicroK8s. Many companies are continuing to develop and launch new Kubernetes-based deployments this year. However, 2025 is shaping up to be a year where this assumption is challenged.
This shift is being driven by the growing need to avoid operating multiple control planes for different types of edge artifacts, including containers, virtual machines, bare metal applications, and WebAssembly (WASM). While Kubernetes can support some of these (such as WASM with additional tools and frameworks), others—like virtual machines—require more effort. The underlying bet is that edge platform teams will find enough business value in running legacy, greenfield, and special-purpose applications together at the Edge to justify the added complexity but not want to manage these systems through discrete control planes.
We don’t expect a single dominant technology to emerge in 2025, but we do anticipate greater experimentation, the rise of new projects, and more mature solutions from vendors tackling this challenge.
That said, the counterpoint is worth considering—there are already tools that extend Kubernetes to manage a variety of workloads, including virtual machines (via KubeVirt or LF Edge EVE), micro-VMs (via Kata Containers), and WASM. The ongoing evolution of AI hardware—whether in the form of GPUs (CUDA vs. PSX), dedicated accelerators, or specialized edge AI chips—will also play a significant role in shaping this landscape. As compute demands shift and architectures optimize for specific AI workloads, the choice of underlying hardware could influence whether Kubernetes remains the dominant orchestration layer or whether alternative approaches gain traction.
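To make the counterpoint concrete, a VM can already live beside containers under a single Kubernetes control plane. The sketch below assumes KubeVirt is installed; the VM name, image, and sizing are illustrative placeholders:

```yaml
# Illustrative KubeVirt manifest; name, image, and sizing are placeholders.
apiVersion: kubevirt.io/v1
kind: VirtualMachine
metadata:
  name: legacy-app-vm
spec:
  running: true               # start the VM when the manifest is applied
  template:
    spec:
      domain:
        devices:
          disks:
            - name: rootdisk
              disk:
                bus: virtio   # paravirtualized disk for better performance
        resources:
          requests:
            memory: 1Gi
      volumes:
        - name: rootdisk
          containerDisk:      # boot image shipped as a container image
            image: quay.io/containerdisks/fedora:latest
```

Once applied, the VM is scheduled, monitored, and deleted with the same kubectl workflow as any pod, which is exactly the single-control-plane argument for keeping Kubernetes at the center.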
6. Automation will lead the way
We expect to see more architectures where the traditional Edge serves as the central nervous system (CNS) of the business—processing information, making decisions, and triggering actions like automation (likely with increasing AI involvement). Meanwhile, hyper-constrained devices—sometimes connected, sometimes not—act as the senses (CV as eyes, sensors as touch, audio as ears) and limbs (executing actions in the physical world).
And then there’s the continued evolution of humanoid robotics, adding another dimension to this shift.
Put simply, “Automation will lead the way” at the Edge in 2025. We expect to see new Edge deployments emerge that empower a modern approach to automation—leveraging years of advancements in AI, computer vision, IoT, and edge architecture. With these technologies converging, the focus shifts to developing software “brains” that can interpret the environment and take meaningful action. For organizations capable of building their own software, this will be a relatively easy leap.
7. Edge and Datacenter Paradigms Converge
The convergence of edge and data center architectures isn’t entirely new—many organizations already deploy cloud-native tools at the edge. But in 2025, we expect this convergence to accelerate and evolve in new ways. The shift is no longer just about extending cloud-native tools to the edge—it’s about creating a true hybrid model, where edge locations behave as first-class extensions of the broader enterprise infrastructure, unified through a hybrid control plane that spans edge, cloud, and on-prem environments.
This shift is being driven by new pressures and capabilities. As AI workloads begin to move to the edge, organizations are rethinking data gravity, leading to more persistent storage and long-term data retention rather than just temporary caching. At the same time, edge hardware is closing the performance gap with data centers, with high-performance GPUs, AI accelerators, and DPUs enabling more complex workloads to run locally. With unified control planes and shared operational models, the distinction between edge, cloud, and datacenter continues to fade.
8. The cost of model training will continue to fall
The cost of developing models will continue to fall in 2025. Advances in model efficiency from techniques like sparse architectures and weight-sharing techniques are reducing the computational burden of both training and inference.
Investment continues to pour into hardware innovation in the form of AI accelerators and bigger/better GPUs. Open-source models are also lowering entry barriers, allowing companies to build competitive AI systems without the massive upfront investment previously required. Techniques like model distillation (transfer of knowledge from larger models to smaller) are lowering compute costs for inference and making it possible to run intelligent models on smaller infrastructure footprints (even laptops or edge devices).
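The distillation idea can be shown with a toy calculation: the student is trained to match the teacher’s temperature-softened output distribution, typically via a KL-divergence term. This is a minimal sketch with made-up logit values; real training adds the hard-label loss and runs over batches:

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature > 1 softens the distribution, exposing the teacher's
    # "dark knowledge" about relative class similarities.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL divergence from the student's softened distribution to the teacher's.
    t = softmax(teacher_logits, temperature)
    s = softmax(student_logits, temperature)
    return sum(p * math.log(p / q) for p, q in zip(t, s))

teacher = [4.0, 1.0, 0.2]   # large model's raw logits (illustrative)
student = [3.5, 1.2, 0.4]   # smaller student's logits (illustrative)
loss = distillation_loss(teacher, student)  # student trains to minimize this
```

The loss is zero when the student exactly reproduces the teacher’s distribution, which is why a much smaller network can inherit most of a larger model’s behavior at a fraction of the inference cost.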
Since the Edge Monsters discussed this trend, we’ve already seen the arrival of DeepSeek R1 in late January, which appears to validate this trend and showcases how innovative techniques have the potential to further lower training and inference costs.
What will this mean for the edge? If nothing else, the falling cost of training means use cases are likely to become more accessible, and those that need to be at the edge for latency or reliability reasons are increasingly likely to become a reality as required hardware specs shrink and model capabilities increase. We’re also excited to watch developments like NVIDIA’s Project DIGITS evolve over the year.
9. Companies will continue to “kick the can” on tech debt and app modernization
Given ongoing economic headwinds, we don’t expect a wave of modernization efforts in 2025. Organizations will likely continue to “kick the can” on addressing tech debt and legacy applications, prioritizing only initiatives with a clear and immediate business payoff. While new Edge deployments will continue, they will likely be greenfield / transformational projects or extensions of core business functions (e.g., POS in retail).
For companies with existing Edge footprints, this means managing an increasingly heterogeneous environment—a mix of VMs, containers, bare metal, and legacy systems. As a result, many organizations will look for vendor solutions that offer a unified control plane, simplifying management across disparate infrastructure and workloads. At the same time, open-source tools that help bridge the gap between old and new architectures may see increased adoption as companies seek pragmatic solutions rather than full-scale modernization.
10. Edge Computing will have a breakout year
Through our network of relationships, the Edge Monsters continue to hear new and evolving use cases for Edge deployments, and we expect 2025 to be a breakout year. This is especially true in retail, where traditional applications like Point-of-Sale are shifting to modern edge infrastructure. Beyond retail, we see a maturing and increasingly unified edge across manufacturing, industrials, automotive, and healthcare as industries move toward more standardized, scalable edge architectures. This acceleration is being fueled by the rapid growth of IoT devices, advancements in edge hardware, AI-driven decision-making, and industry-specific innovations that demand real-time processing at the edge. Combined with the themes we have shared above, these forces set the stage for that breakout.
Be sure to subscribe for updates and follow us on LinkedIn.
The Edge Monsters: Brian Chambers, Michael Henry, Erik Nordmark, Joe Pearson, Jim Teal, Dillon TenBrink, Tilly Gilbert, Anna Boyle & Michael Maxey
Prepared by STL Partners