Comprehending Edge Intelligence with Regard to IT Infrastructure

Moving closer to the ‘edge’ of the network unlocks a variety of use cases and benefits for users as well as operations. Putting ‘intelligence’ at the edge of the network is pushing edge computing vendors to offer ever more compelling solutions. Let’s look at the impact this has, and the benefits it brings, with regard to the IT infrastructure needed to facilitate edge intelligence.

We have seen how the evolution of the cloud radically transformed enterprises by providing infrastructure, platforms, and applications on demand. It’s time to move beyond the cloud. Advances in IT infrastructure technologies are taking cloud capabilities much closer to where data is produced (smart devices and appliances, industrial equipment, vehicles, etc.), which opens up a vast range of opportunities for organizations to transform further and become more competitive. We call this edge computing. The core idea behind edge computing architecture is to facilitate real-time responses from computing platforms for use cases like smart cities, autonomous vehicles, AR, VR, etc.

Generating intelligent insights, predicting future actions, and automating operations are key for organizations to streamline routine work and stay on top of catastrophic situations. For businesses, this can significantly reduce OPEX and help them stay ahead in the market. What if we could get such intelligence at the edge of the overall network and reduce the overhead of communicating with network and cloud infrastructure? Many technology vendors are exploiting the potential of this idea, and development by these vendors and by open communities is underway as well.

I got in touch with Simon Crosby, CTO of Swim.ai, to understand the core concepts around machine learning at the edge. We discussed various IT infrastructure aspects of enabling intelligence at the edge of the network. Here is the transcript of our discussion.

Q – Tell us about the status of edge intelligence and how enterprises are leveraging it.

Organizations are increasingly flooded with data from their customers, edge devices, sensors, production systems, partners, and assets. This data contains information that can enable faster responses, better decisions, greater efficiency, and better customer service, but the challenge is finding critical insights in all this edge data. The emerging category of ‘edge intelligence’ must involve real-time analysis and machine learning at the ‘edge’ to handle the volume of data and take full advantage of it.

With data context and understanding now available at the edge, using a ‘stateful’ architecture, you can identify relevant insights, reduce the data and discard the noise. When combined with innovations in edge computing, analytics, and machine learning, algorithms suddenly have access to much larger datasets and can apply them to streaming data in real-time. By deploying machine learning algorithms at the edge, they’re no longer bound to learning on historical datasets and can instead train and iterate on live data as it is created. The utility of applying machine learning with post hoc batch analysis versus applying machine learning in-the-stream is the difference between analysing why it rained yesterday and knowing that you’ll need an umbrella today.
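To make that contrast concrete, here is a minimal sketch in Python of learning in-the-stream rather than on a historical batch. All names are illustrative (this is not Swim.ai’s API); a running mean/variance tracker stands in for a real learner:

```python
# A minimal in-the-stream learner: the model updates with every reading.
# All names here are illustrative, not any vendor's actual API.

class StreamingStats:
    """Tracks the running mean/variance of a sensor stream incrementally
    (Welford's algorithm), so each new reading trains the model on arrival."""

    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared deviations

    def learn_one(self, x: float) -> None:
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def is_anomalous(self, x: float, k: float = 3.0) -> bool:
        """Flag readings more than k standard deviations from the running mean."""
        if self.n < 2:
            return False
        std = (self.m2 / (self.n - 1)) ** 0.5
        return abs(x - self.mean) > k * std

model = StreamingStats()
for reading in [20.1, 19.8, 20.3, 20.0, 35.7]:  # simulated live stream
    if model.is_anomalous(reading):
        print(f"outlier right now: {reading}")   # the 'umbrella today' moment
    model.learn_one(reading)
```

Because the model updates with every reading, the outlier is flagged the moment it arrives, not in tomorrow’s batch report.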

Q – How is open source helpful for edge intelligence systems and infrastructure?

The field of ML is advancing rapidly – you should ensure your applications and solutions are “ML agile” and able to upgrade to newer algorithms easily. An application architecture that lets you plug in the major open source toolkits can save years and get your solution into production quicker.
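One way to picture that “ML agility” is a thin contract the application codes against, so toolkits can be swapped without rewriting the pipeline. A hypothetical sketch in Python (scikit-learn estimators, for instance, already satisfy this fit/predict shape):

```python
from typing import Protocol

class Learner(Protocol):
    """The minimal contract the application codes against. Any toolkit whose
    models expose fit/predict (scikit-learn estimators, for example) plugs in
    without changes to the pipeline."""
    def fit(self, X, y): ...
    def predict(self, X): ...

def run_pipeline(model: Learner, X_train, y_train, X_new):
    # The pipeline never imports a specific ML framework.
    model.fit(X_train, y_train)
    return model.predict(X_new)
```

Upgrading to a newer algorithm then means writing a small adapter that satisfies the same contract, not rewriting the application.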

The community development model appeals to researchers, end users and developers – users can be sure they won’t be stranded with a dead-end proprietary stack, and developers can confidently invest time and effort into widely used code bases – developing skill sets that are portable across projects, employers and even clouds. The open source ML tools are not only the cutting edge of algorithm development but also embody the de-facto work practice of many data scientists.

Open source ML fuels a race favoring execution and development efficiency. Winners will capitalize on the ready availability of powerful tools to deliver economic benefits quickly and at a reasonable cost. Those that adhere to the mantra of a proprietary algorithm ‘secret sauce’ may succeed tactically but are doomed to eventual failure – slow adoption of their technologies and a lack of ongoing developer support.

Q – What is a digital twin? Any implementation to elaborate on?

A digital twin is a software representation (model) of a physical object, such as a machine, sensor, or real-world process, constructed from data and other contextual information from that object. Digital twins simplify data handling and application development by providing a simple way of getting data from multiple heterogeneous objects in a system, abstracting away the notion of an individual device and giving the application access to all the different objects. Paired with data processing, analytics, and machine learning, digital twins can enable predictive maintenance for industrial equipment, or efficiently monitor individual assets flowing through a complex supply chain. The concept of digital twins has been around for years, but now, with an increased focus on IoT technologies and innovations in edge computing, the digital twin is poised for a breakout.
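As a deliberately simplified illustration, the sketch below models a digital twin in Python. The class, fields, and thresholds are hypothetical; the point is that each twin holds the latest state and context of one physical object, and the application queries the collection of twins rather than individual devices:

```python
import time
from dataclasses import dataclass

@dataclass
class PumpTwin:
    """A toy digital twin of one physical pump: it holds the latest observed
    state plus context, so applications never touch the device directly."""
    pump_id: str
    vibration_mm_s: float = 0.0
    temperature_c: float = 0.0
    last_seen: float = 0.0

    def update(self, vibration_mm_s: float, temperature_c: float) -> None:
        # Called whenever fresh telemetry arrives from the real pump.
        self.vibration_mm_s = vibration_mm_s
        self.temperature_c = temperature_c
        self.last_seen = time.time()

    def needs_maintenance(self) -> bool:
        # Hypothetical thresholds standing in for a learned model.
        return self.vibration_mm_s > 7.1 or self.temperature_c > 85.0

# The application works with the fleet of twins, not individual devices:
fleet = {pid: PumpTwin(pid) for pid in ("pump-1", "pump-2")}
fleet["pump-1"].update(vibration_mm_s=9.4, temperature_c=62.0)
print([t.pump_id for t in fleet.values() if t.needs_maintenance()])  # ['pump-1']
```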

Companies focused on industrial automation require an extensive amount of programming effort to run operations at complex process-manufacturing facilities. By adopting machine learning and data analytics software, companies can use digital twins as an ‘autonomous agent architecture’ to improve the supervisory control loop, responding to operational changes with real-time speed, accurate decisions, and robustness.

Embedding this type of software on a company’s standard industrial computers enables systems within a manufacturing facility to collaborate in real time, across distributed devices, and implement new system configurations that support optimization strategies. These services act as digital twins, observing real-time data streams and operating states from process controllers and providing a simple model for understanding the state of the system as a whole.

The stateful architecture of the software allows the services to collaborate in real time, across distributed devices, and implement system configurations that support various optimization strategies, enabling an architecture that can independently react to changes and reconfigure a system appropriately at a fraction of the time and cost of traditional automation solutions.
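A rough sketch of such a supervisory loop (names and thresholds are hypothetical, not a particular vendor’s product): a stateful agent observes a controller’s stream and emits reconfiguration commands the moment the observed state drifts out of band:

```python
# Illustrative supervisory-loop agent; names and thresholds are hypothetical.

def supervise(readings, setpoint=100.0, tolerance=5.0):
    """Observe a process controller's stream and yield reconfiguration
    commands whenever the observed state drifts out of band."""
    for value in readings:
        if abs(value - setpoint) > tolerance:
            # React in-stream, rather than waiting for a batch cycle.
            yield {"cmd": "adjust", "target": setpoint, "observed": value}

stream = [99.0, 101.2, 112.5, 100.4]  # simulated controller readings
for command in supervise(stream):
    print(command)  # {'cmd': 'adjust', 'target': 100.0, 'observed': 112.5}
```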

The result is an edge-based, distributed data processing and analytics capability that reduces implementation costs for customer projects and enables advanced analytics.

Q – How will 5G-enabled features like network slicing and MEC be useful?

The term “edge” in MEC may be slightly misleading. While it’s now cost-effective to deploy datacenters into the field (for example at 5G base stations), what MEC really seeks to accomplish is to extend the cloud closer to where data originates. That’s only halfway to true edge computing. There are certainly good reasons for bringing servers closer (in the network topology) to end users, such as for video processing and NLP, but the main challenge lies in building applications that can make use of edge data.

The real leap forward will come when applications can create cohesive meshes across all servers (wherever they are in the network) AND other edge compute resources (mobile phones, connected machinery, network equipment, sensors…) thereby abstracting away the notion of network infrastructure altogether for application developers.

Q – What kinds of real-time analytics are useful for enterprises (case by case)?

Service providers: Machine learning insights from ‘full fidelity’ edge data let service providers optimize efficiently. Better insights into services, applications, network performance, handset/CPE performance, and the impacts on bandwidth, latency, and data will be vital. They can transform raw data from their equipment into a real-time streaming GUI/API, transmitting information at a sub-second rate with only a fraction of the network costs that come with a REST-based architecture.

Telecommunications Service Providers: Edge intelligence software running on a Wi-Fi set-top box can access Wi-Fi channel performance statistics and make real-time recommendations or changes. With instances running on handsets and at cell sites (or local POPs), digital twins can be constructed of handset performance, cell-site performance, and network performance. These provide real-time visibility into the current state of the devices and network, and can further be used to build a digital twin model of actual performance.

Enterprise Applications: Providers can deploy edge intelligence software on local machines to process millions of tag readings, creating context-rich data streams that can be used to build real-time analytics and dashboards. This provides real-time visibility into asset flow and supply chain operations and makes insights available via real-time APIs.

Q – How secure is analytics data? Tell us how vendors are securing analytics. Any special methodologies or countermeasures used?

Any effort where data can be transformed, obfuscated, and encrypted closer to where it originates is a security win. “Dumb” sensors, which only forward data downstream to big data lakes in the cloud, can be susceptible to compromise. However, agent-based architectures, which are stateful, can be deployed locally to intelligently monitor individual nodes in the system, quarantining nodes that have potentially been compromised. Furthermore, deploying stateful agents alongside sensors and connected devices makes it possible to encrypt edge data at the origin, before it is ever transmitted over a network, thus ensuring end-to-end encryption.
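For example, a stateful agent running beside a sensor can encrypt each reading before it ever leaves the device. Here is a minimal sketch using the widely used Python cryptography package (key handling is simplified; a real deployment would provision and rotate keys through proper key management):

```python
from cryptography.fernet import Fernet

# In practice the key is provisioned securely (e.g., via an HSM or a
# key-management service), not generated inline like this.
key = Fernet.generate_key()
cipher = Fernet(key)

def publish(reading: dict) -> bytes:
    """Encrypt a sensor reading at the origin, before the first network hop."""
    plaintext = repr(reading).encode("utf-8")
    return cipher.encrypt(plaintext)  # only ciphertext ever crosses the network

token = publish({"sensor": "temp-7", "value": 21.4})
# Downstream, only holders of the key can recover the reading:
print(cipher.decrypt(token).decode("utf-8"))
```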

Q – Tell us about the role of evolving storage technologies for edge intelligence. How does I/O performance matter?

Storage continues to become cheaper and more widely available throughout the network, from the edge to the cloud. Many of the cheapest industrial sensors now have compute resources similar to those of a Raspberry Pi. This has created profound opportunities for new edge applications that can reduce costs, improve productivity, monitor security, and more. However, even considering the coming 5G era and improvements in streaming technology, bandwidth will remain a limiting resource, and network latency will continue to inhibit the creation of real-time applications.

I/O performance is critical to ensure maximum efficiency of real-time applications. By transforming data before the first network hop (or as close to it as possible), applications transmit significantly lower volumes of data over the network while retaining the full resolution of that data. Furthermore, downstream analytics in the cloud can process highly refined data, since the housekeeping was taken care of upstream, at the edge.
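As a rough illustration of that ‘transform before the first hop’ pattern, an edge node can reduce a window of raw samples to a compact summary and transmit only that (the function and payload shape below are hypothetical):

```python
def summarize_window(samples):
    """Reduce a window of raw samples to a compact summary before the first
    network hop; downstream analytics then work on refined data only."""
    return {
        "count": len(samples),
        "min": min(samples),
        "max": max(samples),
        "mean": sum(samples) / len(samples),
    }

raw = [20.1, 20.3, 19.9, 20.0, 35.7, 20.2]  # e.g., one second of readings
print(summarize_window(raw))  # four numbers transmitted instead of the window
```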
