Archived edge series: More answers about edge computing architectures, advantages

An archived version of the webcast “Edge series: Edge computing architectures, advantages” is available for viewing online. A summary of the information offered follows, along with more answers to audience questions about industrial edge computing architectures.

By Alan Raveling and Aditya Agrawal, July 1, 2023
Courtesy: Control Engineering webcasts

Learning Objectives

  • “Edge computing architectures, advantages,” a May 25 Control Engineering webcast, explained how edge computing fits into automation and controls.
  • The webcast discussed how edge computing can make cloud resources more effective and reviewed benefits, examples and lessons learned about edge computing as used with automation and controls.
  • Additional audience questions are answered.

Edge architecture insights

  • Edge computing can help applications of automation and controls.
  • Industrial edge computing can make cloud resources more effective and provide benefits. Learn from examples and system integrators below and in the May 25 webcast, “Edge computing architectures, advantages.”

“Edge computing architectures, advantages,” a May 25 Control Engineering webcast, examined edge computing architectures for automation and control applications. Webcast attendees had additional questions on edge computing, answered below. The webcast and information below also reviewed resources and research on edge computing. See how system integrators and other experts consider it a means to advance, rather than replace, the benefits controls bring to many mission-critical industry applications. Architecture interactions among edge computing and cloud resources were discussed.

Learning objectives covered in the 1-hour educational webcast are:

  • Understand how edge computing fits into automation and controls.

  • Explore applications that benefit from edge computing.

  • Learn how edge computing can make cloud resources more effective.

  • Review benefits, examples and lessons learned about edge computing as used with automation and controls.

Edge computing experts, poll and survey results

Expert presenters in the webcast are Alan Raveling, OT architect and cybersecurity leader, Interstates; and Aditya Agrawal, 5G CTO, L&T Technology Services. In an exit poll of webcast attendees, 100% of the audience attending live said the speakers helped them understand the edge computing topics covered.

In a poll question during the presentation, audience members attending live were asked how they are using edge computing for automation, controls and instrumentation applications. Prompted replies showed supervisory control and data acquisition (SCADA) applications were the highest, followed by distributed control system or process applications and monitoring remote assets, then control processes, monitoring local assets, human-machine interface and least use with analytics and decision support at 9% (Figure 1).

Figure 1, audience poll for the May 25 Control Engineering webcast, “Edge computing architectures, advantages”: Audience members attending the webcast live were asked how they are using edge computing for automation, controls and instrumentation applications. Supervisory control and data acquisition (SCADA) applications were the highest, followed by distributed control system or process applications and monitoring remote assets, then control processes, monitoring local assets, human-machine interface and least use with analytics and decision support at 9%. Courtesy: Control Engineering webcasts

In related research of Control Engineering subscribers earlier this year on artificial intelligence and edge computing, SCADA also ranked highest in a similar question, with results as follows.

Of the users responding, 98% were using edge computing in some way. The top three uses were control processes (51%), SCADA software (50%) and human-machine interface software (46%), a three-way tie considering the margin of error for that research.

Figure 2: Many applications need edge computing, including manufacturing, smart infrastructure, energy and mining, aerospace and others, said Aditya Agrawal, 5G CTO, L&T Technology Services, in the May 25 Control Engineering webcast, “Edge computing architectures, advantages.” Courtesy: Control Engineering webcasts

During the webcast, Raveling said line-based machines, along with connected input/output (I/O) devices, sensors, programmable logic controllers (PLCs) and other control systems, are the richest application data sources, require the quickest responses and often have limited visibility to other lines and machines. Local data centers, central or distributed, can send fast actions to lines and machines and potentially serve as the staging area for cloud-based applications that provide added capabilities; Raveling cautioned about latency and cybersecurity risks with cloud connections. Possible edge computing applications, he said, include predictive analytics, enhanced quality monitoring and workplace safety monitoring, among others. Lessons learned, Raveling said, include ensuring the architecture and staff training can support industrial edge computing applications (Figure 3).

Figure 3: Alan Raveling, OT architect and cybersecurity leader, Interstates, discussed lessons learned about edge computing in the May 25 Control Engineering webcast, “Edge computing architectures, advantages.” To ensure the edge computing application succeeds, prework may be necessary, including infrastructure enhancements and personnel training. Courtesy: Control Engineering webcasts
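Following the layered pattern Raveling described, the sketch below shows one way an edge node might collect readings from line-level devices, act on them locally for fast responses, and batch non-urgent data for a cloud application. All names, thresholds and functions (read_plc_tag, trip_local_interlock, send_batch_to_cloud, the alarm limit) are illustrative assumptions, not APIs or values from the webcast.

```python
# Minimal sketch (illustrative only): an edge node polls a line-level device,
# handles time-critical responses locally, and forwards batched data to a
# cloud application. All functions and values are hypothetical placeholders.
import time
from collections import deque

LOCAL_ALARM_LIMIT_C = 85.0   # hypothetical temperature limit acted on at the edge
CLOUD_BATCH_SIZE = 100       # readings buffered before forwarding to the cloud

def read_plc_tag() -> float:
    """Placeholder for a real PLC or OPC UA read; returns a temperature in deg C."""
    return 72.0

def trip_local_interlock() -> None:
    """Placeholder for a fast local response that must not wait on the cloud."""
    print("Local interlock tripped")

def send_batch_to_cloud(batch: list) -> None:
    """Placeholder for an authenticated, batched upload to a cloud analytics app."""
    print(f"Forwarded {len(batch)} readings to the cloud")

def run_edge_gateway(cycles: int = 300) -> None:
    buffer = deque()
    for _ in range(cycles):              # bounded for the sketch; a real gateway runs continuously
        value = read_plc_tag()
        if value > LOCAL_ALARM_LIMIT_C:
            trip_local_interlock()       # fast path stays at the edge
        buffer.append(value)
        if len(buffer) >= CLOUD_BATCH_SIZE:
            send_batch_to_cloud(list(buffer))   # slower path tolerates cloud latency
            buffer.clear()
        time.sleep(0.1)

if __name__ == "__main__":
    run_edge_gateway()
```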

More answers to audience questions about industrial edge computing architecture

Question: In what timeframe is a return on investment (ROI) expected for an edge computing project?

Agrawal: There is no fixed answer to this. Edge compute is typically part of the overall system, and enterprises should look for ROI on the total cost of ownership, not just the edge compute component.

Q: What is latency, and how can edge compute provide lower latency?

Agrawal: Latency is the time between when a client device starts an action and when it receives a processed result. If a client device is sending packets and receiving a response back from the network without significant compute, then the latency is dominated by network latency and is easily measured using the ping command. Typical LTE network latencies are about 25 to 50 ms for well-provisioned private networks. Wi-Fi latencies are in a similar range for lightly loaded Wi-Fi networks. 5G network latencies with Release 16 gear and ultra-low-latency features are said to be close to 1 ms. On top of network latency, applications that require significant compute, such as artificial intelligence (AI) and machine learning (ML) computer-vision applications, can add significant latency when going to the cloud and back, perhaps hundreds of milliseconds or even seconds. With right-sized edge compute and transport, this latency can be reduced by an order of magnitude.
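As a rough illustration of the measurement Agrawal mentions, the short sketch below times round trips with the system ping command and compares a local edge host against a cloud endpoint. The hostnames are hypothetical placeholders, not endpoints from the webcast, and the output parsing assumes the Linux/macOS ping summary format.

```python
# Minimal sketch: compare average round-trip latency to a local edge host
# versus a cloud endpoint using the operating system's ping command.
# Hostnames are hypothetical placeholders.
import re
import subprocess

def average_ping_ms(host: str, count: int = 10) -> float:
    """Run ping and return the average round-trip time in milliseconds."""
    out = subprocess.run(
        ["ping", "-c", str(count), host],   # use "-n" instead of "-c" on Windows
        capture_output=True, text=True, check=True
    ).stdout
    # Linux/macOS summary line: "rtt min/avg/max/mdev = 0.5/1.2/3.4/0.2 ms"
    match = re.search(r"= [\d.]+/([\d.]+)/", out)
    if match is None:
        raise ValueError(f"Could not parse ping output for {host}")
    return float(match.group(1))

if __name__ == "__main__":
    edge_host = "edge-gateway.local"        # placeholder: on-premises edge node
    cloud_host = "app.example-cloud.com"    # placeholder: cloud-hosted endpoint
    print(f"Edge round trip:  {average_ping_ms(edge_host):.1f} ms")
    print(f"Cloud round trip: {average_ping_ms(cloud_host):.1f} ms")
```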

Q: Can a similar computer vision application be run in the cloud instead of on the edge? Are there trade-offs of running the compute part of a vision application from the cloud?

Agrawal: In addition to the latency trade-off mentioned, another trade-off of running an application in the cloud is data privacy/security. Keeping compute local (edge compute) can reduce the threat footprint, since the video or image stream does not leave the enterprise premises.

Q: What about edge computing do you wish those familiar with automation and controls would understand?

Agrawal: It may not be reasonable to expect automation and controls experts to become experts in the vast areas of networking, AI/ML and compute at the same time. That said, it would be good if automation and controls experts could articulate requirements for data privacy, latency, throughput and the different application use cases that provide high economic value, and hence high ROI, and then let a system integrator use that to propose the right networking, compute and AI/ML system solution.

Q: How quickly is edge computing being adopted?

Agrawal: Many enterprises are adopting edge compute organically, sometimes without even calling it edge compute, for reasons of data privacy and low latency, and sometimes because the cost is lower for static workloads that do not need the autoscale capability of the cloud.

Q: Can local edge systems using AI work when they must access internet data?

Agrawal: This is an evolving field. At the moment, it looks like even the inferencing compute for large language models would require an expensive edge compute cluster, so the cloud is a better option. This may change over time.

Q: How do you manage access to and connect edge devices?

Agrawal: Access can be managed via the control network with a common operational technology/information technology (OT/IT) switch.

Q: How are the costs for hardware and ongoing personnel support justified to management?

Agrawal: Support personnel can be mostly remote and outsourced offshore to a system integrator to manage costs and present a business-case justification to management. For hardware, if the compute workload is non-bursty, as in many OT applications, the total cost of ownership over two years for right-sized on-premises compute can be lower than cloud operating expenditures (OPEX). For bursty workloads, cloud can be the lower-cost option. A system integrator can help right-size the compute; this is where most enterprises falter, either over-provisioning compute so the cost is too high or under-provisioning it to the point where performance does not meet requirements.
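One way to frame that justification is a side-by-side total cost of ownership calculation over the support period. The sketch below compares a right-sized on-premises edge deployment against cloud OPEX for a steady workload and for a bursty one; every dollar figure is a hypothetical placeholder, not data from the webcast.

```python
# Minimal sketch of a two-year total-cost-of-ownership comparison: a steady
# (non-bursty) workload can favor right-sized on-premises edge hardware, while
# a bursty workload may favor cloud pay-per-use. All figures are hypothetical.

MONTHS = 24

def edge_tco(hardware_capex: float, monthly_support: float) -> float:
    """Up-front hardware plus steady support cost over the period."""
    return hardware_capex + monthly_support * MONTHS

def cloud_tco(monthly_base: float, burst_hours: float, burst_rate: float) -> float:
    """Recurring cloud OPEX plus pay-per-use charges for bursty demand."""
    return (monthly_base + burst_hours * burst_rate) * MONTHS

if __name__ == "__main__":
    # Steady workload: edge hardware sized close to the constant demand.
    steady_edge = edge_tco(hardware_capex=40_000, monthly_support=1_500)
    steady_cloud = cloud_tco(monthly_base=3_500, burst_hours=0, burst_rate=0)
    print(f"Steady workload - edge: ${steady_edge:,.0f}, cloud: ${steady_cloud:,.0f}")

    # Bursty workload: edge hardware must be sized for peak demand.
    bursty_edge = edge_tco(hardware_capex=90_000, monthly_support=2_000)
    bursty_cloud = cloud_tco(monthly_base=2_000, burst_hours=60, burst_rate=12)
    print(f"Bursty workload - edge: ${bursty_edge:,.0f}, cloud: ${bursty_cloud:,.0f}")
```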

Q: What is the role of the currently hyped “AI on the edge”?

Agrawal: AI is not just one thing. Computer-vision AI on the edge is a reality, not hype, and is providing real benefits, such as employee safety, product quality inspection and other use cases. Large language model (LLM) AI on the edge is a work in progress.

Q: How can edge computing enable more resilient and secure systems in the face of natural disasters or cyberattacks?

Agrawal: Edge can provide resilience even when cloud connectivity is disrupted due to natural disasters; local environments can continue to function normally. Also, by using contained local environments (edge), the threat footprint for cyberattacks is reduced, with the right access controls in place for the local data environments. Defense installations have long used the concept of an “air gap” between sensitive information environments and networked systems to reduce the cyberattack threat footprint. Edge is an extension of that concept.

Q: Which manufacturers provide the best edge computing?

Agrawal: Commodity servers typically form edge-compute clusters and a number of smaller, innovative original equipment manufacturers (OEMs) are providing increasingly better solutions. Typically, a system integrator can recommend the right combination of solutions.

Figure 4: Alan Raveling, OT architect and cybersecurity leader, Interstates and Aditya Agrawal, 5G CTO, L&T Technology Services were expert speakers in the May 25 Control Engineering webcast, “Edge computing architectures, advantages.” Courtesy: Control Engineering webcasts

– Edited by Mark T. Hoske, content manager and webcast moderator, Control Engineering, CFE Media and Technology, mhoske@cfemedia.com.

KEYWORDS: Industrial edge computing

CONSIDER THIS

Have you done the right preparation to support an industrial edge computing architecture?


Author Bio: Alan Raveling, OT architect and cybersecurity leader, Interstates; and Aditya Agrawal, 5G CTO, L&T Technology Services.