6 Edge Computing Trends To Watch In 2022


While many aspects of edge computing are not new, the big picture is evolving rapidly. “Edge computing” covers, for example, the branch office systems that have been distributed for decades. The term has also absorbed all kinds of computing systems on local factory floors and at telecommunications providers, albeit in a more networked and less proprietary form than has historically been the norm.

Edge computing helps IT and business leaders solve problems as both sensor data and machine learning data grow.

But even if we see echoes of older architectures in certain edge computing implementations, we also see edge trends emerging that are genuinely new, or at least quite different from what came before. And they help IT and business leaders solve problems in industries ranging from telecommunications to automotive as both sensor data and machine learning data grow.

[ How can automation free up more staff time for innovation? Get the free eBook: Managing IT with Automation. ]

Edge Computing Trends That You Should Have On Your Radar

Here, edge experts examine six trends IT and business leaders should focus on in 2022:

1. Edge workloads are getting thicker

One big change we’re seeing is that there is more compute and more storage at the edge. Decentralized systems have often existed to reduce dependence on network links rather than to perform tasks that could not practically be done in a central location, assuming reasonably reliable communication. But that’s changing.

IoT has always involved at least collecting data, almost by definition. But what was once a trickle has become a flood as the data needed for machine learning (ML) applications streams in from large numbers of sensors. And although ML models are often trained in a central data center, the ongoing application of those models is increasingly pushed out to the network edge. This limits network bandwidth requirements and enables rapid local action, such as shutting down a machine in response to anomalous sensor readings. The goal is to deliver insights and take action at the moment they’re needed.
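The pattern described above, applying a centrally trained model locally so action can be taken without a round trip to the data center, can be sketched roughly as follows. The baseline values, threshold, and shutdown hook are all hypothetical, not from any specific product:

```python
# Minimal sketch of edge-side inference: a "model" shipped from the
# central data center (here just a baseline mean/stddev learned from
# historical vibration data) is applied locally to each reading.

BASELINE_MEAN = 0.42   # hypothetical values from central training
BASELINE_STDEV = 0.05
Z_THRESHOLD = 3.0

def is_anomalous(reading: float) -> bool:
    """Flag readings more than Z_THRESHOLD deviations from baseline."""
    return abs(reading - BASELINE_MEAN) / BASELINE_STDEV > Z_THRESHOLD

def shut_down_machine() -> str:
    # In a real deployment this would call the machine's control API.
    return "machine stopped"

def handle_reading(reading: float) -> str:
    """Act locally, with no round trip to the data center."""
    if is_anomalous(reading):
        return shut_down_machine()
    return "ok"
```

Only the raw sensor stream stays local; the heavy lifting of training happens centrally, which is what keeps the bandwidth requirements down.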

[ Need to talk edge with colleagues, customers, or partners? Get a shareable primer: How to explain edge computing in plain English. ]

2. RISC-V is gaining ground

Of course, both data-intensive and compute-intensive workloads need hardware to run on. The specifics vary with the application and the required tradeoffs among performance, power, cost, and so on. Traditionally, the choices have come down to something custom, ARM, or x86. None of these is fully open, although ARM and x86 have developed large ecosystems of supporting hardware and software over time, largely driven by the leading makers of processor components.

But RISC-V is a new and intriguing open instruction set architecture (ISA).

Why is it intriguing? Red Hat global emerging technology evangelist Yan Fisher puts it this way: “The unique aspect of RISC-V is that its design process and specification are truly open. The design reflects the community’s decisions based on collective experience and research.”

This open approach, and the active ecosystem that goes with it, is already helping to drive RISC-V design wins across a range of industries. Calista Redmond, CEO of RISC-V International, says: “With the shift to edge computing, we are seeing massive investment in RISC-V across the ecosystem, from multinationals such as Alibaba, Andes Technology, and NXP to startups such as SiFive, Esperanto Technologies, and GreenWaves Technologies developing innovative edge AI RISC-V solutions.”

3. Virtual Radio Access Networks (vRAN) are becoming an increasingly important edge use case

As part of the 5G implementation, network operators are switching to a more flexible vRAN approach.

A radio access network is responsible for enabling and connecting devices such as smartphones or Internet of Things (IoT) devices to a mobile network. As part of 5G deployments, carriers are shifting to a more flexible vRAN approach in which the high-level logical RAN components are disaggregated by decoupling hardware and software, and by using cloud technology for automated deployment, scaling, and workload placement.

Hanen Garcia, Red Hat telco solutions manager, and Ishu Verma, Red Hat emerging technology evangelist, note: “One study indicates that deploying virtual RAN (vRAN)/open RAN (oRAN) solutions can yield savings of up to 44% compared to traditional distributed/centralized RAN configurations.” They add: “Through this modernization, communications service providers (CSPs) can simplify network operations and improve flexibility, availability, and efficiency, all while serving a growing number of use cases. Cloud-native and container-based RAN solutions offer lower costs, easier upgrades and modifications, horizontal scalability, and less vendor lock-in than proprietary or VM-based solutions.”

4. Scaling drives operational approaches

Many aspects of an edge computing architecture can differ from what would be implemented within the walls of a data center. Devices and computers may have weak physical security and no IT staff on site. Network connectivity may be unreliable. Good bandwidth and low latencies can’t be taken for granted. However, many of the most pressing challenges relate to scale: there can be thousands (or more) of network endpoints.

“Ruthlessly standardize and minimize operational ‘surface.’”

Kris Murphy, Senior Principal Software Engineer at Red Hat, names four main steps you need to take to deal with scaling: “Ruthlessly standardize, minimize operational ‘surface’, pull whenever possible and automate small things.”

For example, she recommends making updates transactional, or atomic, so that a system can never be left only partially updated and end up in an ill-defined state. She also argues that it’s good practice for endpoints to pull updates rather than have them pushed, because “outbound connectivity is more likely to be available.” Take care as well to limit peak loads by not performing all updates at the same time.
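Two of the practices above can be sketched in a few lines; the file paths and intervals here are hypothetical, and a real fleet-management agent would wrap much more around them:

```python
# Sketch of (1) an atomic update: write to a temp file in the same
# directory, then rename, so a crash mid-update leaves either the old
# file or the new one, never a half-written mix; and (2) a jittered
# pull schedule, so thousands of endpoints don't all fetch at once.
import os
import random
import tempfile

def apply_update_atomically(path: str, new_contents: bytes) -> None:
    """Replace the file at `path` with new_contents in one atomic step."""
    directory = os.path.dirname(os.path.abspath(path))
    fd, tmp_path = tempfile.mkstemp(dir=directory)
    try:
        with os.fdopen(fd, "wb") as tmp:
            tmp.write(new_contents)
            tmp.flush()
            os.fsync(tmp.fileno())      # make sure bytes hit disk first
        os.replace(tmp_path, path)      # atomic swap on POSIX filesystems
    except BaseException:
        os.unlink(tmp_path)             # clean up the temp file on failure
        raise

def next_pull_delay(base_seconds: int = 3600, jitter: float = 0.25) -> float:
    """Spread endpoint pulls over a window to limit peak load."""
    return base_seconds * (1 + random.uniform(-jitter, jitter))
```

The rename-based swap is the classic way to get transactional semantics from an ordinary filesystem, and the random jitter is a common trick for keeping a large fleet from creating a thundering herd against the update server.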

[  Want to learn more about implementing edge computing? Read the blog: How to implement edge infrastructure in a maintainable and scalable way. ]

5. Edge computing requires attestation

With edge resources scarce, capabilities that demand little or no local resources are the pragmatic options to consider. Furthermore, any approach here must be highly scalable; otherwise, its uses and benefits will be severely limited. One option that stands out is the Keylime project. “Technologies such as Keylime, which can verify that computing devices boot up and remain in a trusted operating state, should be considered for broad deployment, especially in resource-constrained environments,” says Ben Fischer, Red Hat emerging technology evangelist.

Keylime provides remote boot and runtime attestation using Integrity Measurement Architecture (IMA) and leverages the Trusted Platform Modules (TPMs) common on most laptop, desktop, and server motherboards. If no hardware TPM is available, a virtual TPM (vTPM) can be loaded to provide the required TPM functionality. Boot and runtime attestation is a means of verifying that an edge device boots into a known trusted state and maintains that state while running. In other words, if something unexpected happens, such as a rogue process, the expected state changes; this is reflected in the measurements, and the edge device is taken offline because it has entered an untrusted state. The device can then be investigated and repaired, and returned to service in a trusted state.
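The core attestation logic described above, comparing measurements of what actually ran against known-good values and quarantining the device on any mismatch, can be illustrated conceptually. This is a simplified sketch, not Keylime’s actual API or measurement format; the component names and “golden” values are invented:

```python
# Conceptual sketch of boot/runtime attestation: hash each measured
# component and compare against golden values published out-of-band
# (e.g., by the build pipeline). Any mismatch marks the device
# untrusted so it can be taken offline and investigated.
import hashlib

# Hypothetical known-good SHA-256 digests for two boot components.
GOLDEN_VALUES = {
    "bootloader": hashlib.sha256(b"trusted bootloader v1").hexdigest(),
    "kernel": hashlib.sha256(b"trusted kernel v5.x").hexdigest(),
}

def measure(component_bytes: bytes) -> str:
    """Hash a component, standing in for a TPM/IMA measurement."""
    return hashlib.sha256(component_bytes).hexdigest()

def attest(measurements: dict) -> str:
    """Return 'trusted' only if every golden value is matched."""
    for name, golden in GOLDEN_VALUES.items():
        if measurements.get(name) != golden:
            return "untrusted"  # quarantine: take the device offline
    return "trusted"
```

In a real deployment the measurements would be signed by the TPM and verified remotely, so a compromised device cannot simply report the expected values.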

6. Confidential Computing becomes more important at the edge

Security at the edge requires broad preparation. The availability of resources such as network connectivity, power, staff, equipment, and functionality varies widely but falls far short of what a data center provides. These limited resources constrain the options for ensuring availability and security. Beyond encrypting local storage and connections to centralized systems, confidential computing offers the ability to encrypt data while it is in use on the edge computing device.

This protects both the data being processed and the software processing it from being captured or tampered with. Fischer argues that “confidential computing on edge computing devices will become a fundamental security technology for edge computing due to limited edge resources.”

According to the Confidential Computing Consortium (CCC) report by Everest Group, Confidential Computing – The Next Frontier in Data Security: “Confidential computing in a distributed edge network can also help realize new efficiencies without affecting data or IP privacy by building a secure foundation on which to scale analytics at the edge without compromising data security. In addition, confidential computing ensures that only authorized commands and code are executed by edge and IoT devices. Using confidential computing at the edge and in IoT devices, as well as in the back end, helps control critical infrastructure by preventing tampering with the code and data communicated across interfaces.”

Confidential computing applications at the edge range from autonomous vehicles to the collection of sensitive information.

Diverse applications in all industries

The diversity of these edge computing trends reflects the diversity and scale of edge workloads themselves. There are some common threads: multiple physical footprints, the use of cloud-native and container technologies, and increasing use of machine learning. Telco applications, however, often have little in common with industrial IoT use cases, which in turn differ from those in the automotive sector. But whatever industry you look at, interesting things will be happening at the edge in 2022.

[ Want to learn more about edge and data-intensive applications? Get the details on how to build and manage data-intensive intelligent applications in a hybrid cloud blueprint. ]

