https://medium.com/google-developers/computing-at-the-edge-of-iot-140a888007b
We’ve seen that demand for low latency, offline access, and enhanced machine learning capabilities is fueling a move towards decentralization with more powerful computing devices at the edge.
Nevertheless, many distributed applications benefit more from a centralized architecture and the lowest-cost hardware powered by MCUs.
Let’s examine how hardware choice and use case requirements factor into different IoT system architectures.
Tomi Engdahl says:
Automation controllers, edge computing
https://www.controleng.com/articles/automation-controllers-edge-computing/?oly_enc_id=0462E3054934E2U
Which devices to use? A multi-tasking controller combines functions of a PC-based software controller with visualization, PC applications, and input/output connections. Combining a scalable human-machine interface (HMI) system, supervisory control and data acquisition, and other functions can add edge functionality.
Tomi Engdahl says:
MESA smart manufacturing model to detail 8 areas
https://www.controleng.com/articles/mesa-smart-manufacturing-model-to-detail-8-areas/?oly_enc_id=0462E3054934E2U
The MESA International “Model for Smart Manufacturing” intends to cover business intelligence, product lifecycle management (PLM), value chain management, manufacturing operations, the Industrial Internet of Things (IIoT), asset management, workforce and cybersecurity. MESA told Control Engineering how it will help.
Tomi Engdahl says:
Winners And Losers At The Edge
No company owns this market yet — and won’t for a very long time.
https://semiengineering.com/winners-and-losers-at-the-edge/
The edge is a vast collection of niches tied to narrow vertical markets, and it is likely to stay that way for years to come. This is both good and bad for semiconductor companies, depending upon where they sit in the ecosystem and their ability to adapt to a constantly shifting landscape.
Some segments will see continued or new growth, including EDA, manufacturing equipment, IP, security and data analytics. Others will likely see their market share erode as the cost of designing advanced-node chips skyrockets and the end markets beneath them continue to fragment. So rather than selling hundreds of millions or even billions of units to turn a profit, they will have to become far more nimble and compete for smaller volumes at the edge, where, so far, there are no clear winners.
This already is causing churn across multiple markets. Among the examples:
The biggest prizes for chipmakers have been designs for servers, smartphones and, increasingly, automotive applications. But systems companies such as Google, Facebook and Apple, and giant automotive OEMs such as Volkswagen, Daimler and BMW are now designing their own chips to take advantage of their proprietary AI/ML/DL algorithms or software. That leaves standalone chipmakers vying for accelerator and control logic designs, which carry significantly lower average selling prices (ASPs).
Edge markets are becoming more narrowly focused, particularly as intelligence is added into devices to provide specific solutions. To maximize performance and power efficiency, hardware needs to be co-designed with the software, which makes it difficult to develop one extremely complex SoC that plays across multiple markets.
The rollout of the edge coincides with the slowdown in Moore’s Law and the rising cost of developing highly customized SoCs. This makes development of base platforms that work across multiple segments much more important, but it also requires chiplets and other IP developed by multiple vendors. That tends to dilute profits and change the business model.
Tomi Engdahl says:
Data flow is no longer hierarchical
Can industrial edge computing fit into the Purdue model?
https://www.controleng.com/articles/data-flow-is-no-longer-hierarchical/?oly_enc_id=0462E3054934E2U
Since its introduction in 1992, the Purdue model has remained virtually unchanged. Considering the blazing speed of technological change characteristic of today’s modern business landscape, is it time to re-evaluate the model’s relevancy, especially given the advent of the Industrial Internet of Things (IIoT)?
When the Purdue Model for Control Hierarchy was published by Theodore J. Williams and the Industry-Purdue University Consortium for Computer Integrated Manufacturing, it quickly became the de facto standard for how manufacturing teams thought about, architected, and implemented industrial control systems. The Purdue model became the barometer of what good manufacturing looks like, the reference point for conversations about systems and data flows, and the defining snapshot of where operational and plant-floor applications sit relative to the rest of the business. In short, it defined the landscape.
With the advent of IIoT, the Purdue model may be starting to show its age. Today’s technology stack is vastly different from what it was back in the 1990s, and a host of new and exciting methods are being deployed to unlock business capabilities in ways that were previously impractical. Most notably, the rapid acceleration in the number of disparate connected devices and the mass democratization of computing power introduce new requirements not addressed by the linear hierarchy of the model in its current form.
The Purdue model was created to ensure security. This is accomplished by taking a layered view of how machines and processes function and interact with each other, and how data is produced, transferred and consumed at the various levels.
The model, shaped as a pyramid, represents how information flows from the shop floor upward into high-level enterprise systems. It separates the enterprise and operational domains into different zones, isolated by an industrial demilitarized zone (DMZ) in between. This built-in segmentation prevents security breaches between Level 0 and Level 5.
The model also keeps computing and networks deterministic, i.e., it ensures that networks on the shop floor remain dedicated to the control systems and do not become “flooded” with non-production-related data that could cause capacity issues and stop the manufacturing process.
The Purdue model also serves as a blueprint for IT systems to acquire shop floor data via the DMZ without compromising production or allowing capture of plant floor mechanical equipment for nefarious purposes. Cybersecurity concerns were also addressed by firewalls placed between industrial and enterprise zones, isolating data within the zones absent explicit data sharing rules.
What are the limitations?
The Purdue model fit the world of 1992 nicely. Cloud computing was just a dream. The bulk of compute capability to run the facility and manufacturing processes was found on-premises. Data sharing between manufacturing facilities and central offices was limited to order placement and fulfillment.
These layers and zones contributed to a controlled flow of data, mostly originating from the bottom of the Purdue pyramid upwards or planning data pushed down into the model for consumption at lower levels.
The model dictated that data be organized to be hierarchical and purpose driven. Data required to run processes came into the system top down and was processed and consumed as needed at each level.
Today’s data flow is no longer hierarchical. Manufacturers have added intelligence at the sensors (Level 1), at controllers (Level 2), and at the “edge,” which can sit anywhere along Levels 1 to 3 depending on where the edge device is placed. All of this is to say that points of exposure occur much further down the pyramid than the Purdue model ever considered. Thanks to the expanded power of edge computing devices, large amounts of data can be collected at Level 1, processed, and sent directly to the cloud.
Critics say Industry 4.0 has made the Purdue model at best outdated and at worst obsolete. These outdated applications of the model are seen in use cases where sensor data is being collected at Level 0 and is required to be sent to the cloud to enable predictive maintenance capabilities. Sending Level 0 data to Level 5 directly violates the segmentation aspects of the Purdue model.
Stay or go?
Scrapping the Purdue model, however, doesn’t work either. The Purdue model still serves the segmentation requirements for both wireless and wired networks and protects the operational technology (OT) network from unwarranted traffic and exploits.
What is needed is a hybrid solution that integrates into the Purdue model to maintain segmentation for traditional instances of IT and OT data flow, but also provides the flexibility needed as Industrial IoT use cases become more prevalent.
This level of IIoT flexibility can be attained by adding an industrial edge computing platform software layer. With this layer, an Industrial IoT project can adhere to each level in the Purdue model. This platform layer can sit at either Level 2 or Level 3 and provide data collection capability from OT devices at Levels 0 through 3, while also facilitating data collection from IT layers at Levels 4 and 5. The benefit is that the traditional hierarchies inherent in the Purdue model can be bypassed where needed (e.g., sensors sending data from Level 0 to Level 5) by piping the data through the platform to ensure control and security.
The industrial edge computing platform sits inside the Purdue model, facilitating communications between any level as required. It is the data quarterback. It is the orchestration platform that makes it easy for systems to communicate amongst themselves.
The Purdue model has benefits still valuable in today’s manufacturing environment. Implementing an industrial edge computing platform into the model preserves the integrity of the system while allowing flexibility that drives the foundation of a flat data collection and analytic environment that accelerates continuous improvement.
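As a rough illustration of that “data quarterback” role, here is a minimal Python sketch of an edge platform layer that samples field data locally (Levels 0-3) and forwards only compact summaries upward (Levels 4-5). The broker host, topic name, and read_sensor() helper are hypothetical placeholders, not details from the article:

import json
import random
import time

import paho.mqtt.client as mqtt  # paho-mqtt 1.x API; pip install "paho-mqtt<2"

CLOUD_BROKER = "cloud.example.com"  # hypothetical Level 4/5 endpoint
TOPIC = "plant/line1/summary"       # hypothetical topic

def read_sensor() -> float:
    # Stand-in for a Level 0/1 field-device read (e.g., via OPC UA or Modbus).
    return 20.0 + random.gauss(0, 0.5)

client = mqtt.Client()
client.connect(CLOUD_BROKER, 1883)

while True:
    # Edge platform (Level 2/3): sample fast locally, send only a summary up.
    window = [read_sensor() for _ in range(60)]
    summary = {"mean": sum(window) / len(window), "max": max(window), "ts": time.time()}
    client.publish(TOPIC, json.dumps(summary))
    time.sleep(60)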
Tomi Engdahl says:
New Edge network appliance designed ready for 5G and Wi-Fi 6
https://www.electropages.com/2020/07/new-edge-network-appliance-designed-ready-5g-and-wi-fi-6?utm_campaign=2020-07-07-Latest-Product-News&utm_source=newsletter&utm_medium=email&utm_term=article&utm_content=New+Edge+network+appliance+designed+ready+for+5G+and+Wi-Fi+6
Advantech has launched its new FWA-1112VC network appliance which offers a series of innovative features to its range of entry and mid-level white boxes for SD-WAN and uCPE. Major upgrades incorporate extended connectivity choice, with dual 10GbE SFP+ and PoE+ support, as well as a future-proof design able to adopt coming 5G and Wi-Fi 6 technologies.
The device is a fanless, compact platform that fits working environments where noise levels must be kept down.
“The uCPE market is rapidly expanding to include additional applications such as IoT and 5G,” commented James Buchanan, senior vice president and general manager Edge Cloud at ADVA. “The new Advantech FWA-1112VC supports that evolution with its enhanced support for wireless technologies and extended temperature range. It will be a welcome addition for those looking to expand the deployment of virtualised applications.”
Tomi Engdahl says:
How edge computing will unleash the potential of IIoT
https://www.controleng.com/articles/how-edge-computing-will-unleash-the-potential-of-iiot/?oly_enc_id=0462E3054934E2U
Combining the potential of Industrial Internet of Things (IIoT) devices with the processing power of edge computing, automation solutions and analytics is giving manufacturing production data more value. See five ways to make edge IIoT deployment more effective.
Tomi Engdahl says:
https://www.eetimes.com/will-blaize-trailblaze-edge-ai-market/
https://www.uusiteknologia.fi/2020/08/17/verkon-reunalaskentaan-ai-kiihdytinkortteja/
Tomi Engdahl says:
From Cloud To Cloudlets
Why the intersection of 5G and the edge is driving a new compute model.
https://semiengineering.com/from-cloud-to-cloudlets/
Cloudlets, or mini-clouds, are starting to roll out closer to the sources of data in an effort to reduce latency and improve overall processing performance. But as this approach gains steam, it also is creating some new challenges involving data distribution, storage and security.
The growing popularity of distributed clouds is a recognition that the cloud model has limitations. Sending the growing volume of end-device data to the cloud for processing is resource-intensive, time-consuming, and highly inefficient.
“There is a huge amount of data being created every day,” said Lip-Bu Tan, CEO of Cadence, in a presentation at the recent Cadence Live. “All of this data needs to be transmitted, stored and processed. All of this requires high performance compute, high-bandwidth transmission and high-density storage. This is an exciting time for innovation in semiconductors in architecture, design, EDA, IP and manufacturing ecosystem.”
Tan noted that 90% of all data that exists today was generated in the past two years, and 80% of that is video or images. Only about 2% of that data is being analyzed today. “There’s a huge opportunity to analyze that data,” he said. “That will drive new business models for all the different verticals.”
Much of that data needs to be analyzed locally. This is a marked departure from the original IoT concept, which assumed that 5G millimeter-wave technology would provide enough bandwidth and speed for tens of billions of IoT devices to connect to the cloud for nearly instantaneous results. Even under perfect conditions using mmWave, it takes too long. And as engineers working with 5G mmWave have learned, that technology isn’t just a speedier version of 4G. Signals don’t go through windows or around corners, they are easily interrupted, and they attenuate quickly.
As a result, the focus for mmWave has shifted from nearly ubiquitous small-cell implementations outside and inside of buildings, to better-defined infrastructure closer to the data sources. It also has forced a rethinking of what exactly 5G will be used for, namely line-of-sight communication for shorter distances, with some ability to bend around objects using beamforming. That makes it a viable option for connecting many more devices to edge-based servers, and one that is being heavily promoted by telecommunications companies, cloud providers, and chipmakers alike.
“We’re seeing the need for really high connectivity for up to 2.5 million devices in 1 square mile,” said Mallik Tatpamula, chief product and technology officer at Ericsson, during a Simulation World presentation. “This is a paradigm shift toward micro-data centers. This is the decentralized cloud, where you transfer data to the closest locations.”
Alongside 5G, there is a push to reduce the amount of data by sorting out what is useless and what is valuable closer to the source. That requires a fair amount of intelligence at the edge, particularly in safety- or mission-critical applications, where data needs to be scrubbed much more carefully so that important data isn’t discarded. From there, the data can be further processed locally or remotely and stored wherever it makes sense.
“When it comes to the pure economics of storage, a centralized cloud is nearly always going to deliver the best result,”
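One hedged sketch of what that edge-side triage can look like in practice, assuming a simple rolling-statistics heuristic (the threshold value is an illustrative choice, not from the article):

from collections import deque

history = deque(maxlen=500)  # recent samples forming a rolling baseline

def worth_transmitting(sample: float, threshold: float = 3.0) -> bool:
    # Keep a sample only if it deviates strongly from the recent baseline.
    history.append(sample)
    if len(history) < 30:  # not enough context yet: keep everything
        return True
    mean = sum(history) / len(history)
    std = (sum((x - mean) ** 2 for x in history) / len(history)) ** 0.5 or 1e-9
    return abs(sample - mean) / std > threshold

In a safety- or mission-critical deployment the filter would need to be far more conservative, since the point quoted above is precisely that important data must not be discarded.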
Tomi Engdahl says:
back and forth with Linux cloud server http://turnoff.us/geek/edge/
Tomi Engdahl says:
https://www.nokia.com/networks/products/airframe-open-edge-server/
Tomi Engdahl says:
https://www.uusiteknologia.fi/2020/08/31/teollisuus-4-0-reunalaskenta-avaimena-uudenaikaisuuteen/
Tomi Engdahl says:
https://www.eetimes.com/ai-and-vision-at-the-edge/
Tomi Engdahl says:
What Enables AI at the Edge?
https://www.eetimes.com/what-enables-ai-at-the-edge/
Tomi Engdahl says:
Making Machine-Learning Design Practical for the Edge
Bringing AI to embedded devices at the edge hasn’t been for the faint-hearted. However, that’s about to change.
https://www.electronicdesign.com/technologies/iot/article/21136192/making-machinelearning-design-practical-for-the-edge?utm_source=EG+ED+IoT+for+Engineers&utm_medium=email&utm_campaign=CPS200903036&o_eid=7211D2691390C9R&rdx.ident%5Bpull%5D=omeda%7C7211D2691390C9R&oly_enc_id=7211D2691390C9R
Tomi Engdahl says:
https://www.codemotion.com/magazine/dev-hub/machine-learning-dev/edge-machine-learning/
Tomi Engdahl says:
Integrity Problems For Edge Devices
https://semiengineering.com/integrity-problems-for-edge-devices/
Noise becomes a significant issue at older nodes when the supply voltage is sharply reduced, a serious problem for battery-powered devices.
Battery-powered edge devices need to save every picojoule of energy they can, which often means running at very low voltages. This can create signal and power integrity issues normally seen at the very latest technology nodes. But because these tend to be lower-volume, lower-cost devices, developers often cannot afford to perform the same level of analysis on these devices.
Noise can come in many forms. Some of it is static, some is dynamic.
Static noise may be due to things like process variation. “Variation is a huge issue,” says Mo Faisal, president and CEO of Movellus. “If you are running your chip near threshold, your variation from slow corner to fast is no longer 2 or 3 times. It could be up to 20 times.”
Dynamic noise comes from thermal shot noise, interference, imperfect power supplies, and many other sources. “It’s not necessarily that low voltage designs are more susceptible. It is that the noise that they are susceptible to is likely to cause the product to fail,” says Brad Griffin, product management group director for multi-physics system analysis at Cadence. “If the voltage swing is 5 volts, you have a lot of margin for noise. But if the voltage swing is 0.8 volts, you don’t have nearly as much margin.”
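A quick back-of-the-envelope calculation makes the quoted margin argument concrete: the same absolute noise consumes a far larger fraction of a 0.8 V swing than of a 5 V swing. The 200 mV noise figure below is an assumed example value, not from the article:

NOISE_V = 0.2  # assumed peak noise, volts

for swing_v in (5.0, 0.8):
    print(f"{swing_v} V swing: noise consumes {100 * NOISE_V / swing_v:.0f}% of the swing")

# Output:
# 5.0 V swing: noise consumes 4% of the swing
# 0.8 V swing: noise consumes 25% of the swing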
Analog circuitry can be more susceptible to noise than digital. “The big challenge for us is around the analog, especially that associated with wireless interfaces,”
Tomi Engdahl says:
Shining a Light on Edge Computing in Industrial IoT
From machine learning to deterministic latency, find out how edge computing fits with industrial IoT.
https://www.electronicdesign.com/technologies/iot/article/21141188/shining-a-light-on-edge-computing-in-industrial-iot?utm_source=EG+ED+IoT+for+Engineers&utm_medium=email&utm_campaign=CPS200911058&o_eid=7211D2691390C9R&rdx.ident%5Bpull%5D=omeda%7C7211D2691390C9R&oly_enc_id=7211D2691390C9R
What you’ll learn
The concept of edge computing and how it benefits the IIoT.
Leveraging machine learning in edge computing.
Tomi Engdahl says:
Hyperscale And Edge Computing: The What, Where And How
Networks are adapting to meet the upcoming explosion of latency-sensitive data.
https://semiengineering.com/hyperscale-and-edge-computing-the-what-where-and-how/
Back to the edge. Karim Arabi, in his DAC 2014 keynote, “Mobile Computing Opportunities, Challenges and Technology Drivers,” defined edge computing broadly as “all computing outside the cloud happening at the edge of the network.” Cloud computing would operate on big data, while edge computing operates on “instant data,” the real-time data generated by sensors or users. In the tutorial “Next-Generation Verification for the Era of AI/ML and 5G” that I organized at DVCon 2020, I referenced industry data that timed “devices/things” at <5ms, “edge computing nodes” at <5ms, “network hubs and regional data centers” at <10-40ms, the “core network” at <60ms and the “cloud data center” at ~100ms. In contrast, more recently I heard a definition from analysts that defines edge computing as everything within 20ms latency.
So, what’s the bottom line?
Just five years from now, by 2025, sensors will create exabytes of data per day that will be transmitted through next-generation networks with the lowest latencies possible—where zettabytes of data need to be stored in the global datasphere. When combined with consumer expectations for instantaneous responses to all their needs, networks, storage and compute must “hyper-scale” to speeds and capacities that are hard to comprehend, coining the term hyperscale computing.
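Those quoted latency figures can be read as a tier budget. The small Python helper below, an illustrative sketch rather than anything from the article, picks the most centralized tier that still fits a given latency budget using the numbers cited above:

TIERS = [  # (latency bound in ms, tier name), from the figures quoted above
    (5, "device / edge computing node"),
    (40, "network hub / regional data center"),
    (60, "core network"),
    (100, "cloud data center"),
]

def most_centralized_viable_tier(budget_ms: float) -> str:
    # Return the most centralized tier whose latency still fits the budget.
    viable = [name for bound, name in TIERS if bound <= budget_ms]
    return viable[-1] if viable else "must run on-device"

print(most_centralized_viable_tier(20))   # -> device / edge computing node
print(most_centralized_viable_tier(100))  # -> cloud data center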
Tomi Engdahl says:
Prototype nodes are designed to improve latency, reliability, and performance of IoT deployments where resources are scarce.
Raspberry Pi-Powered FUDGE Frugal Edge Nodes Aim to Improve the Efficiency of Resource-Limited IoT
https://www.hackster.io/news/raspberry-pi-powered-fudge-frugal-edge-nodes-aim-to-improve-the-efficiency-of-resource-limited-iot-a2eca257192b
Researchers at the Universitat Politècnica de València and the International Centre for Theoretical Physics (ICTP) have discovered a way to reduce the resource requirements of computing at the edge for the Internet of Things: FUDGE, the frugal edge node.
“The growing connection between the Internet of Things (IoT) and artificial intelligence (AI) poses many challenges that require novel approaches and even a rethinking of the entire communication and processing architecture to meet new requirements for latency, reliability, power consumption and resource usage,” the team explains in the paper’s abstract. “Edge computing is a promising approach to meet these challenges that can also be beneficial in delivering advanced AI-based IoT solutions in areas where connectivity is scarce and resources are generally limited.”
Tomi Engdahl says:
Artificial Intelligence Takes on The Edge
https://www.sealevel.com/2020/09/17/artificial-intelligence-takes-on-the-edge/
Edge devices are used in IoT to perform computing functions. Artificial intelligence (AI) employs machine learning to power digital assistants, smartphones and autonomous factories. Previously, much of the data processing for AI applications occurred in the cloud, requiring massive amounts of bandwidth. But with new and continually improving microprocessors, AI processing can occur within edge devices: meet edge AI.
Edge AI Advantages
Processing data in edge devices reduces latency and offloads the cloud. Monetary costs of bandwidth and cloud services are also reduced. Edge devices increase security, since data is processed locally instead of being sent over an internet connection. Edge AI improves real-time decision-making, since data is processed close to the device that acts on the result.
Edge AI Applications
Smart phones and digital assistants already use edge AI. Consumers use it when they ask Siri or Alexa a question or give a verbal command. But it’s taken advancements in microprocessors to bring edge AI to larger applications.
Tomi Engdahl says:
Hailo challenges Intel and Google with its new AI modules for edge devices
https://techcrunch.com/2020/09/30/hailo-challenges-intel-and-google-with-its-new-ai-modules-for-edge-devices/?tpcc=ECFB2020&fbclid=IwAR0gOZ2iGEXVIzcsAX73J_cvIUMREF8di14g3TRUBjnlGF0tOTxWIQVsMVw
Hailo, a Tel Aviv-based startup best known for its high-performance AI chips, today announced the launch of its M.2 and Mini PCIe high-AI acceleration modules. Based around its Hailo-8 chip, these new modules are meant to be used in edge devices for anything from smart city and smart home solutions to industrial applications.
“Manufacturers across industries understand how crucial it is to integrate AI capabilities into their edge devices. Simply put, solutions without AI can no longer compete,” said Orr Danon, CEO of Hailo, in today’s announcement. “Our new Hailo-8 M.2 and Mini PCIe modules will empower companies worldwide to create new powerful, cost-efficient, innovative AI-based products with a short time-to-market – while staying within the systems’ thermal constraints. The high efficiency and top performance of Hailo’s modules are a true gamechanger for the edge market.”
Tomi Engdahl says:
Security At The Edge
Experts at the Table: How to keep devices that last longer secure, particularly when AI/ML are added in.
https://semiengineering.com/security-at-the-edge/
Tomi Engdahl says:
End users, OEMs and technology partners engage on IIoT
IIoT-enabled predictive maintenance maximizes uptime, with machinery end users and OEMs working together to determine best practices
https://www.controleng.com/articles/end-users-oems-and-technology-partners-engage-on-iiot/?oly_enc_id=0462E3054934E2U
The Industrial Internet of Things (IIoT) solutions and methods enable collection of machine data and monitoring of machine performance and reliability. Both end users and original equipment manufacturers (OEMs) can act on this data to achieve their goals, improving asset uptime through predictive maintenance and asset efficiency through production analytics.
Because of the benefits, digital transformation and incorporation of IIoT concepts have become business priorities, requiring more collaboration than ever to ensure success. This is because digital transformation isn’t just a one-time event but instead a journey involving both technology and people.
To ensure they are travelling down the same path, manufacturing plant end users, OEMs, and IIoT technology suppliers are partnering in the design and implementation of equipment to ensure value can be realized, while overcoming the perceived risks of sharing data. Manufacturing plant end users and machinery OEMs also demand robust and secure platforms, which are now available in the form of edge computing.
Tomi Engdahl says:
Is ‘Datafication’ the New Mantra for Smart Everything?
https://www.eetimes.com/is-datafication-the-new-mantra-for-smart-everything/
What is ‘Datafication’? Is that even proper English? Apparently, it is now.
In the technology industry, we’re used to lots of acronyms. We’re also used to hearing new phrases that both startups and established companies would love us to adopt as industry naming. It gives a kind of hidden sense of pride in having invented the term.
So, when reviewing the presentations submitted for next week’s Boards and Solutions 2020 virtual conference (13-14 October 2020), the one term I picked up as a possible new mantra is ‘datafication’. More specifically, as Charbel Aoun, EMEA business development director for smart cities at Nvidia, describes the four megatrends that will significantly impact our lives, he explains that with 8,000 new IoT devices connected every 60 seconds, “Digitization has enabled datafication.”
He adds, “IoT could turn the world into data that could be used to make macro decisions on resource utilization. Information is a great way to reduce waste and increase efficiencies. That is really what the internet of things provides. This was the vision of Kevin Ashton back in 1999, the father of the term IoT. Today, this vision is becoming a reality.”
Indeed, data is everything, and not just at the edge, but also in the data center, as Nvidia revealed more details of its data processing unit (DPU) at its GTC conference.
Aoun describes the march of datafication in his talk, using the big challenge for smart cities as an example. “There are around one billion cameras worldwide, recording 24/7, generating a huge amount of data. It is basically impossible for humans to process such amounts of data. To give you an idea, one 1080p camera recording H.264 at 34 fps will generate 47 Gbytes of data in 24 hours and 17 terabytes of data in one year. On the other hand, one CCTV operator can focus for 30 minutes while looking at 4-16 video streams at the same time. That means for every 100 screens or streams you want to monitor, you need six operators. To understand the scale of the challenge, look at the number of CCTVs per city: Shanghai has more than one million; London, 500,000; Moscow, 200,000; L.A., 25,000; Berlin, 20,000.”
“Now you get the picture of the volume of data that can be generated from all the cameras in a city, and the amount of resources required to maintain and monitor them.” In his paper, “How AI can make cities smarter – Powering AI City with IVA,” Aoun talks about how AI is helping make sense of the information overload in very effective and efficient ways, providing insight and enabling real-time decision-making to enhance the lives of citizens. He illustrates how AI offers city managers new solutions to 21st-century urban challenges with some practical examples.
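Aoun’s figures are easy to sanity-check. Working backward from 47 GB per 24 hours gives the implied per-camera bitrate, and scaling to a year reproduces the ~17 TB figure:

GB_PER_DAY = 47
bytes_per_second = GB_PER_DAY * 1e9 / 86_400
print(f"Implied bitrate: {8 * bytes_per_second / 1e6:.1f} Mbit/s")  # ~4.4 Mbit/s
print(f"Per year: {GB_PER_DAY * 365 / 1000:.1f} TB")                # ~17.2 TB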
Value will come from edge autonomy
While all this data being generated needs to be processed and analyzed to provide insights and enable actions, what if the edge devices themselves were able to make decisions intelligently? This is the premise of the paper, “Insights into edge autonomy – the future of edge computing.”
Tomi Engdahl says:
Microsoft debuts portable data center to bring cloud computing to remote environments
https://techxplore.com/news/2020-10-microsoft-debuts-portable-center-cloud.html
Microsoft Corporation has announced on its website the development of a portable data center that can be used to bring cloud computing to remote environments. In the announcement, Microsoft describes its Azure Modular Datacenter (MDC) as a solution for customers who need cloud computing capabilities in both hybrid and challenging environments. It also notes that many places around the world still face immense hurdles in connecting to internet services, and claims the new systems will help overcome those problems in such regions.
Microsoft Azure is a cloud computing service that allows customers to build, test, deploy and manage cloud-based applications. With MDC, Microsoft hopes to expand its customer base by offering the same types of services to users who may reside in challenging environments.
Tomi Engdahl says:
OneSpin is contributing to a German government-funded project to build a secure and scalable ecosystem that makes RISC-V-based AI edge systems more straightforward and secure. As a contributing partner in the Scalable Infrastructure for Edge Computing (Scale4Edge) project, OneSpin is providing verification tools.
OneSpin Contributes Processor Integrity Solution for ZuSE-Scale4Edge Government-Funded Project to Assure Integrity of Edge Computing Processors
https://www.onespin.com/press-events/press-releases/details/onespin-contributes-processor-integrity-solution-for-zuse-scale4edge-government-funded-project-to-assure-integrity-of-edge-computing-processors
Tomi Engdahl says:
Edge Computing Market Size, Share & Trends Analysis Report By Component (Hardware, Software, Services, Edge-managed Platforms), By Industry Vertical (Healthcare, Agriculture), By Region, And Segment Forecasts, 2020 – 2027
https://www.grandviewresearch.com/industry-analysis/edge-computing-market
Report Overview
The global edge computing market size was valued at USD 3.5 billion in 2019 and is expected to register a CAGR exceeding 37% from 2020 to 2027. The IoT-edge partnership is expected to revolutionize data computing and yield corporate gains for those looking to leverage and harness the power of data analytics in developing solutions for major industry verticals. Moreover, the challenges of network latency and the need for immediate real-time insights have led to the evolution of multi-location hybrid data architectures that store data locally at the edge. Furthermore, surging enterprise demand for more powerful computing at the edge has led companies to offer AI-enabled edge solutions.
Edge computing for the Internet of Things (IoT) is expected to bring several advantages to many IoT deployments compared with using the cloud to store and process data. For instance, many IoT processors deliver an increased level of automation at the edge, resulting in low latency and rapid data processing. Edge IoT capability can reside in an operator’s regional or local data center, on a dedicated server, or at a base station on the customer’s premises. Furthermore, companies such as Cisco Systems, Inc., Amazon Web Services, and Marlabs Inc. are innovating to offer IoT platforms based on edge technology.
The latest on-device approach reduces dependency on the cloud and better manages the massive deluge of data generated by edge IoT products. For instance, the Nest Cam IQ indoor security camera built by Google Inc. features on-device vision processing to detect motion, recognize familiar faces, and send real-time alerts about specific events. Moreover, Google Inc. has announced a Cloud IoT Edge platform that extends Google Cloud’s data processing and machine learning (ML) capabilities to edge devices. This software platform comes with an ML inferencing engine that can take advantage of Edge Tensor Processing Units (TPUs).
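For a flavor of what on-device inferencing with an Edge TPU looks like, here is a minimal sketch using the public tflite_runtime API. The model file name is a hypothetical placeholder; a real deployment needs a model compiled for the Edge TPU:

import numpy as np
import tflite_runtime.interpreter as tflite

interpreter = tflite.Interpreter(
    model_path="model_edgetpu.tflite",  # hypothetical, Edge TPU-compiled model
    experimental_delegates=[tflite.load_delegate("libedgetpu.so.1")],
)
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

frame = np.zeros(inp["shape"], dtype=inp["dtype"])  # stand-in camera frame
interpreter.set_tensor(inp["index"], frame)
interpreter.invoke()                                # runs on the Edge TPU
scores = interpreter.get_tensor(out["index"])       # e.g., per-class scores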
Furthermore, edge computing’s video analytics use cases, such as real-time traffic monitoring and security and surveillance, have given impetus to smart city development. The integration of video analytics with edge platforms offers a mesh of fog nodes to facilitate intelligent video processing, enabling anomaly detection, real-time tracking, and data insights. Additionally, declining prices of hardware components and computation have enabled wide-scale adoption of edge computing methodologies. For instance, NTT Docomo Inc. in Japan recently tested a new video IoT solution for video analytics that leverages the technology.
Tomi Engdahl says:
Edge-native Linux
https://ubuntu.com/blog/edge-native-linux
Tomi Engdahl says:
Here’s how Intel is collaborating with top medical device firms on AI and edge innovations that improve medical imaging in three areas: accuracy, efficiency and speed.
https://www.forbes.com/sites/insights-inteliot/2020/12/09/medical-imagings-next-frontier-ai-and-the-edge/?utm_source=FBPAGE&utm_medium=social&utm_content=4302851662&utm_campaign=sprinklrForbesMainFB&sh=6ef1f9ce7a80
Tomi Engdahl says:
https://new.siemens.com/global/en/products/automation/topic-areas/industrial-edge/simatic-edge.html
Tomi Engdahl says:
What Is Industry 4.0 and How Did We Get Here? with MIT Professor David Hardt
https://www.youtube.com/watch?v=ttfMEXGdh1s
Tomi Engdahl says:
Red Hat tunes up RHEL and OpenShift for life on computing’s edge
https://www.zdnet.com/article/red-hat-tunes-up-rhel-and-openshift-for-life-on-computings-edge/
As edge computing grows in importance, Red Hat is tweaking its leading Linux and Kubernetes to make the most of it.
At the virtual KubeCon, leading Linux and cloud company Red Hat showed off new edge computing capabilities for Red Hat Enterprise Linux (RHEL) and Red Hat OpenShift, its Kubernetes platform. With these, RHEL will be more stable than ever in even smaller hardware footprints. Meanwhile, OpenShift will support a remote worker node architecture to help deliver Kubernetes to space-constrained and remote deployments.
Why? Because, as Arpit Joshipura, The Linux Foundation’s general manager of networking, predicted: “Edge computing will overtake cloud computing” by 2025. Time will tell if Joshipura was right, but according to the Worldwide Edge Spending Guide from IDC, the worldwide edge computing market is estimated to reach $250.6 billion in 2024, with edge-related software predicted to be roughly 21% of this spend. Red Hat wants its share of this market.
Tomi Engdahl says:
https://www.uusiteknologia.fi/2021/02/05/reunalaskenta-tuo-uutta-suorituskykya-koneisiin-seminaari-9-2-2021/
Tomi Engdahl says:
The Intelligent Edge: An Increasing Target for Bad Actors
https://www.securityweek.com/intelligent-edge-increasing-target-bad-actors
The traditional network perimeter has been replaced with multiple edge environments. These include WAN, multi-cloud, IoT, home offices, the new device edge, and more. Each edge environment comes with its own set of unique risks and vulnerabilities, which is why they have become a prime target for cybercriminals, who are shifting significant resources to strategically target and exploit emerging network edge environments. Organizations need the right knowledge and the right resources to remain protected as these and newer threats emerge.
The rise of the intelligent edge
The new “intelligent edge” is one of the biggest trends impacting businesses across industries. The intelligent edge is widely defined as the combination of advanced wireless connectivity, compact processing power, and AI to analyze and aggregate data in a location as close as possible to where it is captured in a network. One outcome of this is the emergence of the distributed cloud, where ad hoc networks are created dynamically by groups of endpoint devices running a common virtual platform. This intelligent edge, sometimes known as “intelligence at the edge,” has huge ramifications for the interaction between mobile and IoT devices and the rest of the network.
Deloitte predicts the global market for the intelligent edge will reach $12 billion in 2021, driven in part by expanding 5G networks and hyperscale cloud. There is great potential for those organizations able to harness the potential of the intelligent edge, but there’s also increased opportunity for cybercriminals to ply their trade in new ways.
Tomi Engdahl says:
Hannes Niederhauser, CEO S&T AG, on trends 2021: “New standards accelerate developments for high-performance computing and safety”
https://www.kontron.com/about-kontron/news-events/detail/21_01_13_hannes-niederhauser-trends-2021?_cldee=dG9taS5lbmdkYWhsQG5ldGNvbnRyb2wuZmk%3d&recipientid=contact-fb9fc356b7ede71180d6005056971118-faa43add8ff04941a8ca2e070374ae6b&esid=cf87ffda-2b6d-eb11-ba4e-00155d40c117
Hannes Niederhauser, CEO of S&T AG, has identified five defining trends in the industry for the coming year: Predictive Maintenance, High Performance Computing/AI, 5G and 10G-PON, Functional Safety and the SDC standard in the medical sector.
“Corona has changed the world and given a boost to digitalization in many areas. This is particularly evident in the industrial environment. Processes around production have also changed. We see a trend from off-shoring to Asia back to in-shoring to Europe. However, competitive production requires a high degree of automation. This requires state-of-the-art technologies.
By 2025, around 75 billion machines will be integrated into the Internet of Things (IoT) and the Industrial Internet of Things (IIoT). The leading market research institutes forecast annual growth of 31%. Due to this development, gigantic amounts of data are being generated around the globe, which must be evaluated and distributed reliably, quickly and securely. Kontron, which is responsible for the IoT Solutions division within the S&T Group, has already set the course for corresponding solutions with the development of numerous key technologies.
Tomi Engdahl says:
In case of emergencies: Kontron’s KISS Rackmount Systems in the Gotthard Base Tunnel
https://www.kontron.com/blog/embedded/swiss-telematix-ag-kontron-s-kiss-rackmount-systems-in-the-gotthard-base-tunnel?_cldee=dG9taS5lbmdkYWhsQG5ldGNvbnRyb2wuZmk%3d&recipientid=contact-fb9fc356b7ede71180d6005056971118-faa43add8ff04941a8ca2e070374ae6b&esid=cf87ffda-2b6d-eb11-ba4e-00155d40c117
At 57 kilometers, the Gotthard Base Tunnel through the central Swiss Alps is the longest and, with a rock overburden of up to 2,450 meters, also the deepest underground railway tunnel in the world. If you add all the connecting and access tunnels as well as shafts, the entire tunnel system even measures around 152 kilometers. The tunnel is a true masterpiece of engineering, but it goes without saying that even in such a sophisticated structure, provision must be made for emergencies.
The tunnel is equipped with control electronics and monitoring equipment connected to the Train Control Center via fibre optic cables. Over 70,000 data points and more than 200,000 sensors register every change and detect potentially dangerous incidents fully automatically. And deep under the Swiss mountains, our KISS rackmount systems also fulfil a function that is vital in case of an emergency: their robustness, availability and reliability guarantee the functionality of the Telematix emergency call system. The Kontron KISS 19″ IPC serves as a hardware platform in the Telematix SXR node on which the software solutions for emergency call systems and public address systems can be based. The SXR-Node in turn integrates software solutions into existing communication networks, for example into standard fixed-line network telephony, ISDN or mobile radio.
Tomi Engdahl says:
Launching with pre-built models and a no-code model generator, Azure Percept aims to put secure AI firmly at the edge of everywhere.
Microsoft Announces Azure Percept, a Pilot-Ready “Complete Edge AI Platform”
https://www.hackster.io/news/microsoft-announces-azure-percept-a-pilot-ready-complete-edge-ai-platform-00e91a7b7c18
Microsoft has announced the launch, in public “pilot-ready” preview, of Azure Percept, a new edge AI platform combining certified hardware and software services to bring artificial intelligence to as wide a market as possible.
“With Azure Percept, we have simplified AI on the edge from silicon to service,” claims Microsoft’s Roanne Sones of the new platform, announced today at the company’s Ignite conference. “You can start prototyping in minutes, getting real-time edge AI insights for quick decision-making and analytics when and where the action occurs to help speed the development of your solution.”
Tomi Engdahl says:
https://www.uusiteknologia.fi/2021/03/03/microsoft-toi-reunalaskentaan-uuden-pilvialustan/
Tomi Engdahl says:
Edge computing’s importance in Industry 4.0
Edge computing links on-the-floor devices and the cloud and pre-processes data for use. Four common edge computing mistakes are highlighted.
https://www.controleng.com/articles/edge-computings-importance-in-industry-4-0/
Why is the edge important?
If you’re looking to make sense of Big Data, these edge computing devices are critical. An incredible number of data points can be generated by your process, but you likely don’t want to send them all to the cloud. Edge computing devices limit the amount of data that needs to be sent to the cloud (or servers) for analysis.
Why limit data sent to the cloud?
There are three main reasons why you might want to limit the amount of data you send to the cloud.
Security: When data is exceptionally sensitive, it might be prudent to add an extra layer of security by not sending raw data off the line. By preprocessing on the edge, users can “anonymize” the data a bit.
Cost: Some cloud providers charge by the amount of data. When you reduce the data sent, users can reduce overall costs.
Network speed: As users start looking at additional data points, or increasing the frequency of data collection, the amount of data can increase exponentially. For example, one model we ran recently generated over 8,000 raw data points in 18 seconds. To avoid taxing the internet connection, users may want to reduce the amount of data sent.
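A minimal sketch of the reduction idea behind all three reasons above: aggregate raw points at the edge and send only periodic summaries, so a burst like the 8,000 points in 18 seconds collapses into a handful of uplink messages. The send_to_cloud() helper is a hypothetical stand-in for the actual uplink call:

def send_to_cloud(payload: dict) -> None:
    print("uplink:", payload)  # placeholder for an MQTT/HTTP publish

def summarize_and_send(raw: list, batch: int = 1000) -> None:
    # Collapse each batch of raw points into one small summary record.
    for i in range(0, len(raw), batch):
        chunk = raw[i:i + batch]
        send_to_cloud({
            "n": len(chunk),
            "min": min(chunk),
            "max": max(chunk),
            "mean": sum(chunk) / len(chunk),
        })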
Four common mistakes when using the edge
1. Undersizing edge devices.
2. Undersizing system architecture.
3. Not preparing the network.
4. Underestimating the importance of edge devices.
Tomi Engdahl says:
https://etn.fi/index.php/tekniset-artikkelit/12143-palvelinteho-tulee-kayttajan-luo
Tomi Engdahl says:
A multi-tier approach to #MachineLearning at the edge can help streamline both development and deployment for the #AIoT
#EVS21 #AI #IoT
https://buff.ly/3ioEPHD
Tomi Engdahl says:
Multi-tier machine learning helps speed edge AI deployment
https://www.edn.com/multi-tier-machine-learning-helps-speed-edge-ai-deployment/
Tomi Engdahl says:
Cabinet-level data centers for edge and IoT solutions
https://www.uusiteknologia.fi/2021/06/23/kaappitason-datakeskukset-edge-ja-iot-ratkaisuihin/
Tomi Engdahl says:
Edge computing for industrial AIoT applications
https://www.controleng.com/articles/edge-computing-for-industrial-aiot-applications/?oly_enc_id=0462E3054934E2U
Artificial intelligence (AI) applications in Industrial Internet of Things (IIoT) and edge computing provide advantages for real-time decisions and smarter actions in the field. See advice on edge computer selection and tools for building Artificial Intelligence of Things (AIoT) applications.
Tomi Engdahl says:
https://www.etteplan.com/stories/towards-less-vulnerable-embedded-electronics-new-regulation-cybersecurity
Tomi Engdahl says:
https://www.distrelec.ch/en/knowhow-arduino-edge-control/cms/knowhow-arduino-edge-control
Tomi Engdahl says:
Transforming health care at the edge
https://www.technologyreview.com/2021/06/10/1026038/transforming-health-care-at-the-edge/
Edge computing is reshaping health care by bringing big data processing and storage closer to the source, to support game-changing technologies such as the internet of things, artificial intelligence, and robotics.
Before 2020, digital transformation in health care was frustratingly slow, even as providers dreamed of boosting efficiency, increasing flexibility, and reining in spiraling costs. In a risk-averse industry known for lagging behind technology trends, doctors still had not fully adopted electronic health records, for example. And the use of digital tools for diagnosis, tracking, and treatment was emerging but limited. Yet, health-care data often needs to be accessed by collaborative teams across institutions, sometimes around the globe. System downtime or slow app performance is not an option.
These challenges were highlighted during the covid-19 pandemic. Not surprisingly, the health-care industry has accelerated its efforts to embrace digital over the past year, as well as experiment with exploding trends including telehealth, patient-generated health data, and remote surgery. These efforts, in turn, have evolved and expanded thanks to improvements in advanced technologies, including the internet of things, artificial intelligence, and robotics. The global market for connected medical devices, for instance, is expected to swell to $158 billion in 2022, up from $41 billion in 2017.
“The real-time feedback loop required for things like remote monitoring of a patient’s heart and respiratory metrics is only possible with something like edge computing.”
Arun Mirchandani, Executive Advisor on Health-Care Digital Transformation
To scale virtual patient services, manage medical devices, and support smart hospital applications, modern health systems must handle massive data sets closer to the data-gathering devices—to reduce the delay in the transfer of data, called latency, and enable real-time decision-making, says Arun Mirchandani, an advisor on health-care digital transformation. Whether it’s on a health worker’s tablet, a wearable device, an ingestible sensor, or a mobile app, computing at the “edge” of the network is essential for speed, scale, and performance.
Edge computing, through on-site sensors and devices, as well as last-mile edge equipment that connects to those devices, allows data processing and analysis to happen close to the digital interaction. Rather than using centralized cloud or on-premises infrastructure, these distributed tools at the edge offer the same quality of data processing but without latency issues or massive bandwidth use.
“The real-time feedback loop required for things like remote monitoring of a patient’s heart and respiratory metrics is only possible with something like edge computing,” Mirchandani says. “If all that information took several seconds or a minute to get processed somewhere else, it’s useless.”
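A minimal sketch of that local feedback loop, under the assumption of a simple threshold check (the limits and the read_heart_rate() helper are illustrative placeholders, not clinical guidance): evaluate each sample at the edge and alert immediately, with no cloud round trip in the critical path.

import time

def read_heart_rate() -> int:
    # Stand-in for a wearable or bedside sensor read.
    return 72

def trigger_local_alarm(bpm: int) -> None:
    print(f"ALERT: heart rate {bpm} bpm out of range")

def monitor(low: int = 40, high: int = 150) -> None:
    while True:
        bpm = read_heart_rate()
        if not low <= bpm <= high:
            trigger_local_alarm(bpm)  # immediate, no network dependency
        time.sleep(1)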
Opportunities and challenges at the health-care edge
The sky’s the limit when it comes to the opportunities to use edge computing in health care, says Paul Savill, senior vice president of product management and services at technology company Lumen, especially as health systems work to reduce costs by shifting testing and treatment out of hospitals and into clinics, retail locations, and homes.
“A lot of patient care now happens at retail drugstores, whether it is blood work, scans, or other assessments,” Savill says. “With edge computing capabilities and tools, that can now take place on-site, on a real-time basis, so you don’t have to send things to a lab and wait a day or week to get results back.”
The arrival of 5G technology, the new standard for broadband cellular networks, will also drive opportunities, as it works with edge computing tools to support the internet of things and machine learning, adds Mirchandani. “It’s the combination of this super-low-latency network and computing at the edge that will help these powerful new applications take flight,” he says. Take robotic surgeries—it’s crucial for the surgeon to have nearly instant, sub-millisecond sensory feedback. “That’s not possible in any other way than through technologies such as edge computing and 5G,” he says.
“A lot of patient care now happens at retail drugstores. With edge computing capabilities and tools, that can now take place on-site, on a real-time basis, so you don’t have to send things to a lab and wait a day or week to get results back.”
Paul Savill, Senior Vice President, Product Management and Services, Lumen
Data security, however, is a particular challenge for any health-care-related technology because of HIPAA, the US health information privacy law, and other regulations. The real-time data transmission edge computing provides will be under significant scrutiny, Mirchandani explains, which may affect widespread adoption. “There needs to be an almost 100% guarantee that the information you generate from a heart monitor, pulse oximeter, blood glucose monitor, or any other device will not be intercepted or disrupted in any way,” he says.
The health-care edge: Not just hype
There is no doubt that health care is drowning in data: a 2020 Dell survey found that health-care and life sciences data grew almost 900% over the previous two years, with 10 to 15 connected devices at the typical hospital bedside. Big data sets can quickly become overwhelming to manage, label, and share efficiently. To be useful in the form of wearable devices and streamlined services, the data needs to be processed and analyzed with real-time insights, closer to the edge, and transferred to the cloud or an on-premises network if necessary.
That is not simply buzzword-driven hype: covid-19, for example, has laid bare the need for health-care options outside the doctor’s office or hospital.
“In the short term, edge computing will solve thorny health-care challenges around real-time data analytics and speech recognition applications,” says Mirchandani. “But while it is early days, many of the other exciting edge computing use cases and applications are coming as well, especially as systems become more interoperable.”