https://medium.com/google-developers/computing-at-the-edge-of-iot-140a888007b
We’ve seen that demand for low latency, offline access, and enhanced machine learning capabilities is fueling a move towards decentralization with more powerful computing devices at the edge.
Nevertheless, many distributed applications benefit more from a centralized architecture and the lowest cost hardware powered by MCUs.
Let’s examine how hardware choice and use case requirements factor into different IoT system architectures.
Tomi Engdahl says:
Key drivers and benefits of edge computing for smart manufacturing
http://www.controleng.com/single-article/key-drivers-and-benefits-of-edge-computing-for-smart-manufacturing/737ac1286236a66c3d74e38740142d1c.html
Edge computing means faster response times, increased reliability and security. Five edge computing advantages are highlighted.
A lot has been said about how the Internet of Things (IoT) is revolutionizing the manufacturing world. Many studies have already predicted that more than 50 billion devices will be connected by 2020. It is also expected that over 1.44 billion data points will be collected per plant per day. This data will be aggregated, sanitized, processed, and used for critical business decisions.
This means unprecedented demand and expectations on connectivity, computational power, and speed of service—quality of service. Can we afford any latency in critical operations? This is the biggest driver for edge computing: more power closer to the data source, the “Thing” in IoT.
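To put that per-plant figure in perspective, a quick conversion to a sustained rate (my arithmetic, not the article's):

points_per_day = 1.44e9
seconds_per_day = 24 * 60 * 60
print(f"{points_per_day / seconds_per_day:,.0f} points per second")
# -> about 16,667 points per second, sustained, from a single plant;
#    one reason round-tripping every point to a distant cloud is costly.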
Edge computing and drivers
Rather than a conventional central control system, this distributed control architecture is gaining popularity as an alternative: a light version of the data center, in which control functions are placed closer to the devices.
Edge computing means data processing power at the edge of the network, closer to the source of data. With edge computing, each device—whether it be a sensor, robotic arm, HVAC unit, connected car, or any other intelligent device—collects data, performs the kind of processing that would otherwise be handed off to the cloud, and packages the results up for further processing and analysis.
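As a rough Python sketch of that collect-process-package pattern (the sensor stub and the cloud ingest endpoint are invented for illustration; the article names neither):

import json
import random
import time
import urllib.request

def read_temperature() -> float:
    """Stub standing in for an HVAC temperature sensor."""
    return 21.0 + random.random()

def summarize(samples: list[float]) -> dict:
    """On-device processing: reduce a minute of raw samples to one record."""
    return {"n": len(samples), "mean": sum(samples) / len(samples),
            "max": max(samples), "ts": time.time()}

def upload(summary: dict) -> None:
    """Package the result and hand it upstream for analysis."""
    req = urllib.request.Request(
        "https://cloud.example.com/ingest",  # hypothetical ingest endpoint
        data=json.dumps(summary).encode(),
        headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)

samples = [read_temperature() for _ in range(60)]  # raw data stays local
upload(summarize(samples))

The shape is what matters here: the raw samples never leave the device, and only the compact summary travels upstream.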
IDC research predicts that within three years, 45% of IoT-created data will be stored, processed, analyzed, and acted upon close to, or at the edge of, the network, and that over 6 billion devices will be connected to edge computing solutions. Inherent challenges of cloud infrastructure, such as network latency, the cost of network bandwidth and data storage, security, and compliance issues, are minimized by edge computing infrastructure; these are the key drivers of edge technology.
Tomi Engdahl says:
Defining the value of edge computing
Explore the three important factors of deploying an edge computing solution.
http://www.controleng.com/single-article/defining-the-value-of-edge-computing/526e16bad24079046e7071636146222d.html
With the growth in connected devices and machines, industrial enterprises are realizing the need for an efficient way to manage large amounts of data, which in turn escalates the importance of edge computing. As industrial technology becomes more complex and devices become more powerful, edge computing is emerging as a valuable solution for harnessing all this computing power for business value. Today, edge devices go beyond basic automation, enabling industrial enterprises to perform an expanding array of advanced computing and analytical tasks.
When evaluating these technologies, companies should look at these three critical components to help ensure a successful, “cloud-like” edge computing deployment:
1. The elimination of production downtime
2. The ability to analyze, act on and protect data in real-time
3. The simplification of operations
With the rise of the IIoT, industrial enterprises are blurring the line that separates the enterprise data center and business systems (IT) from production automation systems (OT) and their respective networks. The resulting converged “hybrid OT professional” has a unique combination of skills to bridge the gap separating the IT and OT worlds, reducing the burden on IT teams throughout their organization’s IIoT transformation.
Tomi Engdahl says:
How To Secure The Network Edge
The risk of breaches is growing, and so is the potential damage.
https://semiengineering.com/how-to-secure-the-network-edge/
Microcontrollers, sensors, and other devices that live at the edge of the Internet must be protected against cyberattacks and intrusions just as much as the chips in data centers, network routers, and PCs. But securing those edge devices presents a swath of unique challenges, including the cost and availability of technology resources, as well as varying levels of motivation to solve these problems by both vendors and end users.
But securing the edge takes on new urgency as safety issues enter the picture. Assisted and autonomous driving essentially transform cars into Internet edge devices, where real-time responsiveness is required for accident avoidance and cloud-based connectivity is needed for such things as traffic and weather alerts. Likewise, embedded systems are being used to monitor and control critical infrastructure, and that data is being read by external monitors or devices at the edge of the network that are directly connected to those systems.
All of this raises the stakes for security. So how exactly do these issues get solved, and by whom?
“That’s a tricky question,” observed Robert Bates, chief safety officer for the Embedded Software Division at Mentor, a Siemens Business. “In some sense, those kinds of smart devices can be as secure as anything else connected to the network. But theory and reality are two different things.”
“The same problems exist across industry,” said Bates. “Industry buys something, and they just kind of want to forget about it. If they’re not updating these devices themselves, or they’re not thinking about updating them, they’re going to be exposed—even if their security was top-notch at the point of the link. That’s one problem.”
Tomi Engdahl says:
In-cabinet thermal management in edge computing environments
http://www.cablinginstall.com/articles/print/volume-26/issue-1/features/cabling-management/in-cabinet-thermal-management-in-edge-computing-environments.html
Providers of cable management systems offer resources, systems for edge computing applications.
Edge challenges
The company then provides the other side of that scenario: “As worthy as these benefits may be, IT will face new challenges and tasks in edge computing implementation.”
In a similar vein, CPI’s August 2017 blog post reminds us, “As the Internet of Things (IoT) continues to evolve and edge computing—which pushes applications, data, and computing services away from centralized data centers—becomes more common, managing assets and white space remotely becomes more challenging. A comprehensive and reliable solution that works together to simplify operation, costs and labor, as well as allows for network expansion, is key.”
Rittal adds, “Edge computing, by definition, exposes hardware to an environment that may be challenging—in footprint, ambient temperatures, contaminants, particulates, vibration or accessibility. Solutions abound for each of these concerns: micro data centers, NEMA-appropriate enclosures, thermal management and filtration systems, and shock-absorbing designs.”
Tomi Engdahl says:
Woe Canada: Rather than rise from the ashes, IBM-built C$1bn Phoenix payroll system is going down in flames
https://www.theregister.co.uk/2018/03/02/canada_payroll_system_phoenix/
Canucks to pull plug on ill-fated mismanaged govt IT project
Canada is about ready to pull the plug on its IBM-built error-plagued Phoenix payroll system that has cost the nation nearly CAN$1bn ($790m).
Launched in 2016, Phoenix was an IBM implementation of the Oracle PeopleSoft platform that was supposed to handle payroll for 46 Canadian government agencies and departments.
Unfortunately for the Great White North, the system was almost immediately beset with problems. Nearly two years later, officials have had to expand their payroll support staff from 550 heads to more than 1,500 to cope with the cockups, and some CAN$460m has been spent on support and fixes.
Now, America’s Hat says it’s time to cut bait and move on. The administration plans to begin a two-year CAN$16m project to design a new system to replace Phoenix.
“Over the last year and a half, the government has hired several hundred people to rebuild capacity that was lost due to the previously flawed business plan,” the budget report said of Phoenix.
Systems nominal
IBM is taking the news in stride, and insisted it has held up its end of the bargain.
“As the government has repeatedly acknowledged, IBM is fulfilling its obligations on the Phoenix contract, and the software is functioning as intended,” a spokesperson told El Reg on Thursday.
Regardless of who is at fault, the situation is a bad look for all parties involved. For IBM, the ordeal is another major government project failure its name is attached to, coming at a time when Big Blue was just righting its financial ship.
The Canadian government, meanwhile, said this week that on top of the nearly CAN$900m support costs, it will have to cough up about CAN$5.5m in charges to smooth out tax headaches caused by botched payments to employees and additional costs to support the legal fallout.
Tomi Engdahl says:
IoT Security Concerns Push Vendors to the Edge
https://www.eetimes.com/document.asp?doc_id=1333044
NUREMBERG, Germany — Doing more processing at the edge to avoid sending sensitive data to the cloud emerged as a common theme among vendors at the Embedded World conference here last week. Whether this is a result of the forthcoming GDPR (General Data Protection Regulation) rules coming into force across the European Union on May 25, or whether it is simply a lack of sufficient security in current devices, is difficult to tell.
Tomi Engdahl says:
Netflix could pwn 2020s IT security – they need only reach out and take
Workload isolation is niche, but they’re rather good at it
https://www.theregister.co.uk/2018/03/08/will_serverless_kill_the_container_star/
The container is doomed, killed by serverless. Containers are killing Virtual Machines (VM). Nobody uses bare metal servers. Oh, and tape is dead. These, and other clichés, are available for a limited time, printed on a coffee mug of your choice alongside a complimentary moon-on-a-stick for $24.99.
Snark aside, what does the future of containers really look like?
Recently, Red Hat’s CEO casually mentioned that containers still don’t power most of the workloads run by enterprises. Some people have seized on this data point to proclaim the death of the container. Some champion the “death” of containers because they believe serverless is the future. Some believe in the immutable glory of virtual machines and wish for the end of this upstart workload encapsulation mechanism.
Tomi Engdahl says:
AI Core – Artificial Intelligence On The Edge
https://www.eeweb.com/profile/eeweb/news/ai-core-artificial-intelligence-on-the-edge
UP Bridge the Gap – a brand of AAEON Europe – is proud to launch AI Core: the first embedded ultra-compact Artificial Intelligence processing cards for edge computing.
AI Core is a mini-PCIe module powered by Intel® Movidius™ Myriad™ 2 technology. This low-power module enhances industrial IoT edge devices with hardware accelerated deep learning and enhanced machine vision functionality. AAEON Technology is one of the first IPC manufacturers to address the growing need for Artificial Intelligence on the edge with dedicated hardware.
Most of the available IoT solutions are focused on connecting edge devices to the cloud and these deployments face challenges related to latency, network bandwidth, reliability and security. Experts in this field agree that not all the tasks and decision making processes can be addressed in cloud-only models. AI Core is the solution for cloud limitations by bringing AI performance and hardware acceleration not “at” but “ON” the edge of the Internet of Things.
Tomi Engdahl says:
Energy Requirements And Challenges For IoT Autonomous Intelligence At The Edge
https://semiengineering.com/energy-requirements-and-challenges-for-iot-autonomous-intelligence-at-the-edge/
Today’s computational landscape is vast and power hungry. Can it be sustainable?
Tomi Engdahl says:
Intelligence At The Edge Is Transforming Our World
https://semiengineering.com/intelligence-at-the-edge-is-transforming-our-world/
Machine learning already plays a part in everyday life, but efficient inference will keep it moving forward.
Tomi Engdahl says:
Exponentials At The Edge
https://semiengineering.com/exponentials-at-the-edge/
The revolution that started in mobile phones will continue in other devices, but much faster.
Tomi Engdahl says:
Tom Krazit / GeekWire:
Cloudflare launches Cloudflare Workers, an edge computing service for developers using its network, charging devs $0.50 for every 1M tasks used by their apps — Cloudflare is ready to take the wraps off a new service designed for developers creating Internet-of-Things apps that want to capitalize …
Cloudflare to open an edge computing service for developers using its network
https://www.geekwire.com/2018/cloudflare-open-edge-computing-service-developers-using-network/
Cloudflare is ready to take the wraps off a new service designed for developers creating Internet-of-Things apps that want to capitalize on the proximity benefits provided by edge computing.
Cloudflare Workers was first introduced last September, and Cloudflare is expected to announce Tuesday that it is now generally available for developers to check out. The new service runs on hardware that Cloudflare has installed in more than 125 data centers around the world to power its anti-DDoS (distributed denial of service) attack service, and it allows developers to write JavaScript applications through the Service Worker API that will run much closer to their users than might otherwise be possible with standard cloud services.
“For quite some time, we have understood that there is real power in deploying applications that ran incredibly close to where users are on the internet,” said Cloudflare CEO Matthew Prince.
About 1,000 users have been playing with Cloudflare Workers since the company opened the service up to a broader beta program in January following the September announcement. “I’ve been surprised by how dramatically different all of the applications people have built are; it doesn’t feel like there is a bound to them yet,” Prince said.
The benefits of edge computing are just starting to make their way into the world, although lots of folks have been talking about it for a while. It’s a recognition of the fact that as connected devices spread throughout the world, it quickly makes more sense to execute a lot of the code running those devices as physically close to them as possible, as waiting for instructions from a remote cloud data center won’t always cut it for real-time IoT devices.
Tomi Engdahl says:
Tom Warren / The Verge:
Microsoft unveils cloud gaming division led by Microsoft vet Kareem Choudhry who says the company wants content available across all devices, hints at streaming
Microsoft’s new gaming cloud division readies for a future beyond Xbox
Cloud services seen as the future of games
https://www.theverge.com/2018/3/15/17123452/microsoft-gaming-cloud-xbox-future
Tomi Engdahl says:
The FPGA manufacturer Xilinx today introduced its new vision and, at the same time, a new product category it calls ACAP, the Adaptive Compute Acceleration Platform.
“ACAP computing capabilities go far beyond the capabilities of traditional FPGAs. It’s a genuinely new product category that can be altered to fit different applications and workloads at a device level,” Xilinx CEO Victor Peng said at a press conference.
These are not empty words: with an ACAP, functions can be dynamically changed at runtime. The change takes milliseconds, after which the new application-specific computation achieves much higher performance per watt than a general-purpose processor or graphics processor.
According to Peng, ACAP is ideally suited to new big data and artificial intelligence applications. These include video processing, database processing, data compression, searches, calculation of AI models, machine vision, and many of the network acceleration functions.
The first ACAP family is called Everest and is implemented in TSMC’s 7-nanometer process. The first chips are expected to tape out this year. “Everest circuits will radically differ from what Xilinx and Altera have done so far.”
Source: http://www.etn.fi/index.php/13-news/7724-pc-laskennan-aika-on-ohi
Tomi Engdahl says:
9 hidden risks of telecommuting policies
https://www.cio.com/article/3261950/hiring-and-staffing/hidden-risks-of-telecommuting-policies.html
As the boundaries of the enterprise shift, IT’s ability to support and protect remote work environments must shift correspondingly. Here’s how to develop a comprehensive telecommuting policy to mitigate potential liabilities.
How I Learned to Stop Worrying and Love Telecommuting
https://www.cio.com/article/2436957/it-organization/how-i-learned-to-stop-worrying-and-love-telecommuting.html
CareGroup CIO John Halamka takes an in-depth look at the policies and technologies necessary for supporting flexible work arrangements.
Tomi Engdahl says:
Tech Giants Set to Face 3% Tax on Revenue Under New EU Plan
https://www.bloomberg.com/news/articles/2018-03-17/tech-giants-set-to-face-3-tax-on-revenue-under-new-eu-plan
Tomi Engdahl says:
Microsoft’s new gaming cloud division readies for a future beyond Xbox
Cloud services seen as the future of games
https://www.theverge.com/2018/3/15/17123452/microsoft-gaming-cloud-xbox-future
Microsoft shipped its first video game in 1981, appropriately named Microsoft Adventure. It was an MS-DOS game that booted directly from a floppy disk, and set the stage for Microsoft’s adventures in gaming. A lot has changed over the past 37 years, and when you think of Microsoft’s efforts in gaming these days you’ll immediately think of Xbox. It’s fair to say a lot is about to change over the next few decades too, and Microsoft is getting ready. Today, the software giant is unveiling a new gaming cloud division that’s ready for a future where consoles and gaming itself are very different to today.
Tomi Engdahl says:
Xilinx to bust ACAP in the dome of data centres all over with uber FPGA
That’s an Adaptive Compute Acceleration Platform btw
https://www.theregister.co.uk/2018/03/19/xilinx_everest_acap_super_fpga/
Xilinx is developing a monstrous FPGA that can be dynamically changed at the hardware level.
The biz’s “Everest” project is the development of what Xilinx termed an Adaptive Compute Acceleration Platform (ACAP), an integrated multi-core heterogeneous design that goes way beyond your bog-standard FPGA, apparently. It is being built with TSMC’s 7nm process technology and tapes out later this year.
Xilinx Unveils Revolutionary Adaptable Computing Product Category
https://www.xilinx.com/news/press/2018/xilinx-unveils-revolutionary-adaptable-computing-product-category.html
ACAP TECHNICAL DETAILS
An ACAP has – at its core – a new generation of FPGA fabric with distributed memory and hardware-programmable DSP blocks, a multicore SoC, and one or more software programmable, yet hardware adaptable, compute engines, all connected through a network on chip (NoC). An ACAP also has highly integrated programmable I/O functionality, ranging from integrated hardware programmable memory controllers, advanced SerDes technology and leading edge RF-ADC/DACs, to integrated High Bandwidth Memory (HBM) depending on the device variant.
Software developers will be able to target ACAP-based systems using tools like C/C++, OpenCL and Python. An ACAP can also be programmable at the RTL level using FPGA tools.
“This is what the future of computing looks like,” says Patrick Moorhead, founder, Moor Insights & Strategy. “We are talking about the ability to do genomic sequencing in a matter of a couple of minutes, versus a couple of days. We are talking about data centers being able to program their servers to change workloads depending upon compute demands, like video transcoding during the day and then image recognition at night. This is significant.”
ACAP has been under development for four years at an accumulated R&D investment of over one billion dollars (USD). There are currently more than 1,500 hardware and software engineers at Xilinx designing “ACAP and Everest.” Software tools have been delivered to key customers. “Everest” will tape out in 2018 with customer shipments in 2019.
Tomi Engdahl says:
No future-oriented IT megatrends without real time capable fiber optics
http://www.rosenberger-osi.com/en/main/news/microblog/edge-computing.html
Digital transformation is casting its shadow ahead. The world of applications is being turned upside down. The purpose and size of a project take a back seat; users, with their expectations of a modern, digitized world, are the focus of attention. Edge computing and the new 5G mobile network are on everyone’s lips. They are the prerequisite for real-time applications and low latency. But are these pure hype topics, or concrete solutions that make future-oriented applications possible in the first place?
Experts agree: without a modern, real-time-capable infrastructure, neither the Internet of Things (IoT), autonomous driving, nor smart cities can be realized. The fact is that data volumes around the globe are exploding. Real-time applications need to be processed within seconds, in the area where the data is created and used.
In the case of autonomous cars, for example, all data processing is carried out directly in the vehicle. Reactions must occur within milliseconds, for example to prevent accidents. The necessary real-time data processing is only possible with 5G mobile communications and edge computing. Building the infrastructure is no magic trick: network nodes for edge computing can be integrated into the next street light or advertising pillar, or placed near a mobile radio cell, even in the middle of such a cell. Data processing at its best.
A dream of the future? Not at all! At the 2018 Winter Olympics in South Korea, for example, visitors and athletes at the venues were already able to test a 5G installation and immerse themselves in the world of new applications. At the Mobile World Congress 2018 in Barcelona, 5G was also one of the central themes. Commercial projects based on 5G are already planned for 2018 in the EU. The Federal Ministry of Transport and Digital Infrastructure estimates that by 2020, this rapid mobile communications technology will be available everywhere.
Tomi Engdahl says:
How to get started with edge computing
http://www.controleng.com/single-article/how-to-get-started-with-edge-computing/e0c9a8e2d5090f2071b2191170cc784a.html
Implementing edge devices into a system is powerful: the devices are easy to use and install, cost-effective, and they optimize data collection and reliability. Find your way to the edge.
Edge computing is designed to enhance the Industrial Internet of Things (IIoT) and provides many potential advantages for users. Edge computing speeds up data flow and extends knowledge of what’s happening on a network. It also improves data reliability by making device data the one source of truth. And there’s less latency. If there are local human-machine interfaces (HMIs), there is still local access and control even if network connectivity is lost, which helps prevent data loss. Edge devices are more powerful, easier to use, and less expensive, making it very affordable to put powerful computers at the edge of a network.
Getting started with edge computing
With all the edge products on the market, there are a lot of choices for a company to make.
Think about the entire system and how the edge devices are going to fit into the larger architecture. Find the devices that work best for the system and the company’s overall goals.
Ask specific questions about the devices. How can they be maintained and upgraded? Can the data be moved to a central location? Can the devices be used for other functions at the edge?
The architecture should allow plug-and-play functionality. Individual components should be replaceable without affecting the whole system. Older architecture requiring configurations in multiple places inhibits the ability to make future changes.
Many edge devices work well with message queuing telemetry transport (MQTT), which is the perfect messaging protocol for the IIoT. MQTT was designed about 20 years ago for the industrial space. In recent years, it has become more popular because of its low bandwidth requirements and publish/subscribe model.
MQTT reports by exception and communicates data only when there’s a change. It also makes data available for applications such as supervisory control and data acquisition (SCADA), enterprise resource planning (ERP), information technology (IT), business intelligence, and more. MQTT provides high availability and scalability.
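As a concrete illustration of report by exception, here is a minimal publisher sketch in Python using the paho-mqtt client library (the broker hostname, topic, and sensor stub are invented for illustration):

import time
import random

import paho.mqtt.client as mqtt

def read_valve_state() -> str:
    """Stub standing in for a real sensor read."""
    return random.choice(["open", "closed"])

client = mqtt.Client()
client.connect("broker.example.local", 1883)  # hypothetical local broker
client.loop_start()  # handle network traffic on a background thread

last_state = None
while True:
    state = read_valve_state()
    if state != last_state:  # report by exception: publish only on change
        # retain=True keeps the last known value at the broker, so SCADA,
        # ERP, or BI subscribers that connect later get it immediately
        client.publish("site/pipeline/valve1/state", state, qos=1, retain=True)
        last_state = state
    time.sleep(1)

Because the broker retains the last value per topic, a subscriber such as a SCADA system sees the current state immediately on connect instead of polling every device, which is the behavior behind the pipeline example below.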
Results with edge computing
Edge computing is expanding along with the IIoT because it provides numerous benefits. For example, an oil and gas pipeline used traditional polling, which usually takes 30-45 minutes to hear back from all the remote locations. If operators pressed a button to open a valve, they’d have to wait 15 minutes to get confirmation the valve had opened. After installing edge devices and MQTT, the process now takes less than 15 seconds.
Tomi Engdahl says:
Fog computing for industrial automation
http://www.controleng.com/single-article/fog-computing-for-industrial-automation/78d79cad4ec97b8f4c6ebf4dbc9a3d74.html
How to develop a secure, distributed automation architecture in a data-driven world: Two examples and five advantages of fog computing are highlighted.
The manufacturing industry is experiencing substantial benefits as industrial operators use the Industrial Internet of Things (IIoT) to automate systems, deploy sensors to measure, monitor, and analyze data, improve efficiencies, and increase revenue opportunities for manufacturing operations. Using eight pillars of a fog computing architecture can help.
The amount of data from these newly-connected plants can be measured in the petabytes (1 million gigabytes): Millions of streaming, connected sensors on industrial control systems (ICSs), dozens of autonomous drones, industrial robots, video surveillance cameras covering plants, and so on.
Traditional information technology (IT) approaches to operational technology (OT) environments cannot keep up with the necessary volume, latency, mobility, reliability, security, privacy, and network bandwidth challenges in controlled, supplier-connected, or rugged operational environments. It’s time for a new architectural approach to allow IIoT to reach its potential with fog computing.
Defining fog computing
Fog computing is designed for data-dense, high-performance computing, high-stakes environments. Fog is an emerging, distributed architecture that bridges the continuum between cloud and connected devices that doesn’t require persistent cloud connectivity in the field and factory. Fog works by selectively moving compute, storage, communication, control, and decision making closer to IoT sensors and actuators, where the data is generated and used. It augments, not replaces, investments in the cloud to enable an efficient, cost-effective, secure, and constructive use of the IIoT in manufacturing environments.
Fog is sometimes referred to as edge computing, but there are key differences. Fog is a superset of edge functionality. The fog architecture pools the resources and data sources between devices residing at the edge in north-south (cloud-to-sensor), east-west (function-to-function or peer-to-peer) hierarchies working with the cloud for maximum efficiency. Edge computing tends to be limited to a small number of north-south layers often associated with simple protocol gateway functions.
Fog nodes are foundational elements of the fog architecture. A fog node is any device that provides computational, networking, storage, and acceleration elements of the fog architecture. Examples include industrial controllers, switches, routers, embedded servers, sophisticated gateways, programmable logic controllers (PLCs), and intelligent IoT endpoints such as video surveillance cameras.
Tomi Engdahl says:
The Case That Never Ends: Oracle Wins Latest Round vs. Google
https://www.wired.com/story/the-case-that-never-ends-oracle-wins-latest-round-vs-google
Tomi Engdahl says:
Nvidia wants AI to Get Out of the Cloud and Into a Camera, Drone, or Other Gadget Near You
https://spectrum.ieee.org/view-from-the-valley/computing/embedded-systems/nvidia-wants-ai-to-get-out-of-the-cloud-into-a-camera-drone-or-other-gadget-near-you
People are just now getting comfortable with the idea that data from many electronic gadgets they use flies up to the cloud. But going forward, much of that data will stick closer to Earth, processed in hardware that lives at the so-called edge—for example, inside security cameras or drones.
That’s why Nvidia, the processor company whose graphics processing units (GPUs) are powering much of the boom in deep learning, is now focused on the edge
Why the move to the edge? At a press event held Tuesday in San Francisco, Talla gave four main reasons: bandwidth, latency, privacy, and availability. Bandwidth is becoming an issue for cloud processing, he indicated, particularly for video, because cameras in video applications such as public safety are moving to 4K resolution and increasing in numbers. “By 2020, there will be 1 billion cameras in the world doing public safety and streaming data,” he said. “There’s not enough upstream bandwidth available to send all this to the cloud.” So, processing at the edge will be an absolute necessity.
Tomi Engdahl says:
Tom Warren / The Verge:
Overview of gaming laptops using Intel 6-core CPUs, including Samsung Notebook Odyssey Z, Acer Nitro 5, Gigabyte AERO 15/15x, Asus ROG Zephyrus M, Asus ROG G703 — Gaming laptops are getting faster and thinner — Intel unveiled its first Core i9 chip for laptops earlier today …
These are the gaming laptops using Intel’s new 6-core processors
Gaming laptops are getting faster and thinner
https://www.theverge.com/circuitbreaker/2018/4/3/17192022/intel-6-core-processor-gaming-laptops-list
Intel unveiled its first Core i9 chip for laptops earlier today, along with updated Core i5 and i7 processors that include six cores instead of the usual four. All of these new processors are primarily designed for gaming and high-performance laptops, so expect to see a number of new notebooks shipping with them. While processors don’t matter as much to gaming (outside of VR) as a modern GPU, a number of OEMs are announcing new laptops today that pair Intel’s latest processors with Nvidia graphics cards.
Here’s everything from Acer, Asus, Gigabyte, and Samsung that use Intel’s latest processors.
Tomi Engdahl says:
Sean White / The Mozilla Blog:
Mozilla says it is building Firefox Reality, a new cross-platform web browser designed from the ground up to work on standalone VR and AR headsets
Mozilla Brings Firefox to Augmented and Virtual Reality
https://blog.mozilla.org/blog/2018/04/03/mozilla-brings-firefox-augmented-virtual-reality/
Sean White April 3, 2018
Today, we primarily access the Internet through our phones, tablets and computers. But how will the world access the web in five years, or in ten years, and how will the web itself grow and change?
We believe that the future of the web will be heavily intertwined with virtual and augmented reality, and that future will live through browsers. That’s why we’re building Firefox Reality, a new kind of web browser that has been designed from the ground up to work on stand-alone virtual and augmented reality (or mixed reality) headsets.
Tomi Engdahl says:
IoT and Industrie 4.0 creating M&A hotspots
https://www.controleng.com/single-article/iot-and-industrie-40-creating-m-a-hotspots/597b5e01a9a86d57b91143176a606800.html
According to a report from Hampleton Partners, the Internet of Things (IoT) and Industrie 4.0 technologies are driving a new wave of mergers and acquisitions in the technology sector.
The Internet of Things (IoT) and Industrie 4.0 technologies are driving a new wave of mergers and acquisitions in the technology sector according to a report from Hampleton Partners, an M&A consultancy. Technologies such as sensors, blockchain, artificial intelligence (AI), machine learning, and big data analytics are the key hotspots across a range of industry verticals, including automotive, healthtech, and high-tech industrial applications.
Traditional companies are under pressure to quickly and effectively integrate these technologies into their product and service offerings. Hampleton Partners’ 2020 Tech Outlook report says the alternative is being rendered obsolete by new market entrants that are causing “profound shifts in value chains and customer behaviors.”
Tomi Engdahl says:
Artificial intelligence in the industrial enterprise
https://www.controleng.com/single-article/artificial-intelligence-in-the-industrial-enterprise/2e0569050f5a89c15ab4c1b898fa313e.html
Analytics can deliver insight as to how things are going, but artificial intelligence (AI) doesn’t become a thing until you start using machine learning and semantics for insight.
Automation can improve a process. Productivity can gain from examination of workflows and leading indicators. And analytics deliver insight as to how things are going. But it isn’t until you step over into the cognitive, with things like machine learning and semantics, that you enter the realm of artificial intelligence (AI).
For the Industrial Internet of Things (IIoT), predictive maintenance of machinery and equipment is the first application demonstrating wide commercial acceptance. “This can be done with classic regression and predictive analytics. With artificial intelligence, however, you go beyond the structured deterministic to the fuzzier stochastic,” said Jeff Kavanaugh, vice president, senior partner, Infosys. “With machine learning based on input such as audio signatures, the computer learns as a human would, by first paying attention to how a machine sounds when it’s healthy and then understanding anomalies.”
Sample set asymmetry
A question often asked is whether companies have the data needed to enable machine learning, and whether the data is in a form suitable for such use. “People have more data than they think, but less than they hope,” said Kavanaugh. “While there are a lot of data stores that don’t lend themselves to machine learning, there are instances where great amounts of data simply aren’t needed. At other times, companies can build on the power of accumulated data. Industrial manufacturers do have deep troves of simple data which can be converted to use cases, where they can go deep.”
“We’re talking about things that are inherently cognitive, in other words fuzzy. While the earlier transformation was from full analog to computerized operations, the current one is more pervasive, more connected, more intelligent—and ultimately—more profound.”
Tomi Engdahl says:
Processing Moves To The Edge
https://semiengineering.com/processing-moves-to-the-edge/
Definitions vary by market and by vendor, but an explosion of data requires more processing to be done locally.
Edge computing is evolving from a relatively obscure concept into an increasingly complex component of a distributed computing architecture, in which processing is being shifted toward end devices and satellite data facilities and away from the cloud.
Edge computing has gained attention in two main areas. One is the industrial IoT, where it serves as a do-it-yourself infrastructure for on-site data centers. The second involves autonomous vehicles, where there is simply not enough time to ask the cloud for solutions.
But ask two people to describe it and you are likely to get two very different answers. On one hand, it is understood well enough that it can be used in satellite IIoT data centers and in machine learning-enabled iPhones. On the other, most of those designing it can’t say what it looks like.
Tomi Engdahl says:
Nokia and Telia brought 5G to the factory
Telia and Nokia, in cooperation with Intel, took a major step toward the next industrial revolution with a trial using 5G, cloud, and data-center technology in a real industrial environment as an example of a new kind of digital process.
The experiment demonstrated how companies can take advantage of core 5G features such as low latency and top speeds, in conjunction with video analytics, to streamline manufacturing. At the Nokia factory in Oulu, video from the assembly line was monitored and analyzed with an application from the Finnish video-analytics startup Finwe.
The video was carried over Intel’s 5G mobile test platform on Nokia’s 5G test network at 28 GHz, and over Telia’s fiber-optic network to Telia’s data center in Helsinki. Thanks to the high-speed connections and low latency, the video analytics application was able to react immediately and report deviations to those responsible for the assembly line, so problems could be fixed in real time, improving the quality, reliability, and efficiency of the process. The data-center service was also brought to the edge of the network, closer to the point of use, on the Nokia Multi-access Edge Computing (MEC) platform, further reducing the video analytics delay.
Thanks to high-speed 5G and fiber connections, the quality and efficiency of industrial processes like the one in this trial can be improved with real-time cloud computing.
Source: http://etn.fi/index.php/13-news/7844-nokia-ja-telia-toivat-5g-n-tehtaaseen
Tomi Engdahl says:
Navigating The Foggy Edge Of Computing
https://semiengineering.com/navigating-the-foggy-edge-of-computing/
It’s not just cloud and edge anymore as a new layer of distributed computing closer to end devices picks up steam.
The National Institute of Standards and Technology (NIST) defines fog computing as a horizontal, physical or virtual resource paradigm that resides between smart end-devices and traditional cloud or data centers. This model supports vertically-isolated, latency-sensitive applications by providing ubiquitous, scalable, layered, federated and distributed computing, storage and network connectivity. Put simply, fog computing extends the cloud to be closer to the things that produce and act on Internet of Things (IoT) data.
According to Business Matters, moving computing and storage resources closer to the user is critical to the success of the Internet of Everything (IoE), with new processes decreasing response time and working more efficiently in a fog environment. Indeed, as Chuck Byers of the OpenFog Consortium confirms, fog computing is “rapidly gaining momentum” as the architecture that bridges the current gap in IoT, 5G and embedded AI systems.
As mentioned above, 5G networking is one area in which fog computing is expected to play a major role. As RCR Wireless reports, the convergence of 5G and fog computing is anticipated to be an “inevitable consequence” of bringing processing tasks closer to the edge of an enterprise’s network.
Tomi Engdahl says:
DWDM Optical Modules Take It to the Edge
http://www.lightwaveonline.com/articles/2018/04/dwdm-optical-modules-take-it-to-the-edge.html
The need for low latency and quality of service is driving cloud traffic ever closer to the edge of the network. In response, cloud providers are moving toward a new distributed data center architecture of multiple edge data centers rather than a single mega-data center in a geographic market. This distributed data center model requires an orders-of-magnitude increase in optical connectivity among the edge data centers to ensure reliable and robust service quality for the end users.
As a result, the industry is clamoring for low-cost, high-bandwidth transceivers between network elements. The advent of pluggable 100G Ethernet DWDM modules in the QSFP28 form factor holds the promise of superior performance, tremendous cost savings, and scalability.
Moving data to the edge
According to Cisco, global IP traffic will increase nearly threefold over the next 5 years, and will have increased 127-fold from 2005 to 2021. In addition, almost half a billion (429 million) mobile devices and connections were added in 2016. Smartphones accounted for most of that growth, followed by machine-to-machine (M2M) modules. As these devices continue to multiply, the need to bring the data center closer to the sources, devices, and networks all producing data is driving the shift to the network’s edge.
With 5G on the horizon, bandwidth will continue to be a major challenge. Cisco predicts that although 5G will only be 0.2% of connections (25 million) by 2021, it will generate 4.7 times more traffic than the average 4G connection.
Data center virtualization
Applications and virtualization are driving the need for low-latency network requirements, further propelling the need for data to be stored closer to the user. For example, with the increased popularity of software as a service (SaaS) applications such as Microsoft 365 and Salesforce.com, enterprises are replacing proprietary, on-site applications and workloads with third-party alternatives hosted in public cloud data centers. This shift requires optical connections in both private buildings and data centers in addition to the external workloads and applications being processed, effectively creating a virtual enterprise campus. This rise in migrating application workloads is increasing the demand for a fast, reliable, and cost-effective optical connectivity approach.
Overcoming the fiber bottleneck: 100G DWDM
The recent bandwidth surge often leads to available fiber pairs becoming fully consumed. The result is fiber exhaustion, a condition that can be a particular issue in dense urban areas where the data centers tend to be smaller and segmented over several discrete sites.
Adding more fiber may be prohibited by conduit size, permit requirements (right of way), service startup time, or most importantly construction cost, which can add up to millions of dollars depending on location and distance. Any one or combination of these factors can prevent operators from scaling their network quickly and efficiently to meet their user’s growing demands.
Wavelengths in the C-Band range from 1520 nm to 1577 nm, and typical low-cost multiplexers/demultiplexers operate on a 100-GHz grid supporting up to 48 independent channels in a single fiber. The 1550-nm C-Band window also leverages the capabilities and cost of Erbium-doped fiber amplifiers (EDFAs) to account for optical system losses. Although an investment in a DWDM line system is required, the payback can be in the order of months depending on the bandwidth and fiber availability.
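A back-of-the-envelope check on those figures (my arithmetic; the article quotes only the wavelength window and the channel counts):

C = 299_792_458.0  # speed of light, m/s

def freq_thz(wavelength_nm: float) -> float:
    """Optical carrier frequency in THz for a wavelength in nm."""
    return C / (wavelength_nm * 1e-9) / 1e12

span_ghz = (freq_thz(1520.0) - freq_thz(1577.0)) * 1e3  # about 7,100 GHz
print(f"~{span_ghz:.0f} GHz of spectrum, {int(span_ghz // 100)} slots at 100 GHz spacing")
# The window itself holds about 71 grid slots; the typical low-cost
# mux/demux described here populates 48 of them, and the PAM4 QSFP28
# modules below use 40 (40 x 100 Gbps = 4 Tbps per fiber pair).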
Traditionally, 100G DWDM technology is optimized for transport applications that can connect data centers at hundreds to thousands of kilometers. These 100G DWDM offerings require up to 25 W per 100G and are available in large-chassis transport boxes or telecom CFP/CFP2 modules, rather than the data center industry standard 100G QSFP28 form factor.
Recently, a new breed of DWDM QSFP28 module, based upon silicon photonics and PAM4 modulated transmission, has been introduced in the market. This transceiver enables IP over DWDM (IPoDWDM), a paradigm for cost-effective, scalable DWDM interconnect for distributed data center architecture. Use of such pluggable modules enables convergence of the optical layer inside as well as between edge data centers, enabling switch-to-switch connectivity up to 80 km without the need for a dedicated transport layer.
These modules can also support up to 40 DWDM channels on a single fiber, giving network operators a 40x increase in fiber utilization or spectral efficiency (4 Tbps versus 100 Gbps in a single fiber pair).
Tomi Engdahl says:
The 4 primary edge networking archetypes, plus technology requirements
http://www.cablinginstall.com/articles/2018/04/vertiv-edge-reqs.html
Vertiv, formerly Emerson Network Power, this month released Defining Four Edge Archetypes and their Technology Requirements, a global, research-based analysis of network edge use cases, resulting in the identification of four main archetypes for edge applications and the technology required to support them.
According to a press statement, for the project Vertiv’s experts identified data-centric sets of workload requirements for each edge use case and corresponding needs for performance, availability and security. They examined specific performance requirements, including latency, availability, scalability and security, in conjunction with the need for encryption, authentication and regulatory compliance. They also looked at the need to integrate with existing or legacy applications and other data sources, while considering the number of edge locations in a given network.
According to Vertiv, the four edge networking archetypes are:
Data Intensive – This includes use cases where the amount of data makes it impractical to transfer over the network directly to the cloud or from the cloud to point-of-use due to data volume, cost or bandwidth issues. Examples include smart cities, smart factories, smart homes/buildings, high-definition content distribution, high-performance computing, restricted connectivity, virtual reality, and oil and gas digitization. The most widely used example is high-definition content delivery, where major content providers such as Amazon and Netflix actively partner with colocation providers to expand delivery networks to bring data-intensive streaming video closer to users to reduce costs and latency.
Human-Latency Sensitive – This archetype includes use cases where services are optimized for human consumption, and it is all about speed. Delayed data delivery negatively impacts a user’s technology experience, potentially reducing a retailer’s sales and profitability. Use cases include smart retail, augmented reality, website optimization, and natural language processing.
Machine-to-Machine Latency Sensitive – Speed also is the defining characteristic of this archetype, which includes the arbitrage market, smart grid, smart security, real-time analytics, low-latency content distribution, and defense force simulation. Because machines are able to process data much faster than humans, the consequences for slow delivery are higher than in the Human-Latency Archetype. For example, delays in commodities and stock trading, where prices fluctuate within fractions of a second, may turn potential gains into losses.
Life Critical – This archetype encompasses use cases that directly impact human health and safety. Consequently, speed and reliability are vital. Use cases include smart transportation, digital health, connected/autonomous cars, autonomous robots, and drones. Autonomous vehicles, for example, must have updated data to operate safely, as is the case with drones that may be used for e-commerce and package delivery.
DEFINING FOUR EDGE ARCHETYPES AND THEIR TECHNOLOGY REQUIREMENTS
https://www.vertivco.com/globalassets/documents/white-papers/vertiv-edgearchetypes-wp-en-na-sl-11490_229156_1.pdf
Tomi Engdahl says:
Edge Computing May Increase Attack Surface
http://www.datacenterknowledge.com/edge-computing/edge-computing-may-increase-attack-surface
To protect the edge, enterprises should move toward architectures that will protect applications even if the infrastructure is compromised.
Edge computing can increase computing power and lower latency, but it poses the risk of expanding the attack surface, experts say.
For example, some enterprises are deploying compute clusters or small edge data centers closer to end users or production facilities to minimize network latency and reduce the volume of network traffic, said Bob Peterson, CTO architect at Sungard Availability Services.
“However, many times they are putting systems in areas that may not have the same logical and physical controls as their larger data centers,” he said.
In addition, restoring physical control or services can be more difficult with remote centers, and the risk of systems being breached or tampered with increases when devices are placed in locations with little or no staff.
“I think it’s not that security teams are overlooking the risks, but more so that security teams are unable to keep up with the rapid evolution of technology,” he said. “I think we are still too far away from information security being a fundamental part of everyone’s role.”
Tomi Engdahl says:
Beyond collaboration: Old, new manufacturing companies cooperate on solutions
https://www.controleng.com/single-article/beyond-collaboration-old-new-manufacturing-companies-cooperate-on-solutions/a8f8ae3369dc274e27aab559d6d75394.html
How an enclosures company, a computer maker, and a device supplier worked together to provide a plant-floor data center.
There are several official marketing themes at Hannover Messe 2018 in Germany—Connected Enterprise, Factory of the Future, and the ubiquitous Industrie 4.0. But a less visible theme is emerging as industry suppliers look to accelerate growth and deliver on the promise of the Industrial Internet of Things (IIoT): Good old-fashioned cooperation.
As big-name brands such as Microsoft, Intel, Oracle, and SAP try to make their presence felt in the industrial space, they are partnering with traditional manufacturing suppliers on some innovative solutions to allow not just a plug-and-play feel for the software, but a comprehensive solution for some of the hardware issues.
One good example is the partnership between device giant ABB, computer expert Hewlett Packard Enterprise, and enclosures leader Rittal. They announced the Secure Edge Data Center so manufacturers could locate their information technology (IT) assets on the plant floor and deliver secure and reliable data storage and management.
As the use of data continues to rise, more manufacturers are deciding between cloud computing for high-power enterprise analytics and the edge computing needs that will deliver fast sensor data to the plant floor in real time.
By contributing their individual areas of expertise, ABB, Hewlett Packard Enterprise, and Rittal have created the Secure Edge Data Center: a secure box with fast computing and leading-edge sensor technology.
Tomi Engdahl says:
Challenges At The Edge
https://semiengineering.com/challenges-at-the-edge/
Real products are starting to hit the market, but this is just the beginning of a whole new wave of technology issues.
Edge computing is inching toward the mainstream as the tech industry begins grappling with the fact that far too much data will be generated by sensors to send everything back to the cloud for processing.
The initial idea behind the IoT/IIoT, as well as other connected devices, was that simple sensors would relay raw data to the cloud for processing through one or more gateways. Those gateways could be inside a company, a home, an industrial operation, or even a connected car. But it’s becoming apparent that approach is untenable because there is far too much data to process—even with higher-speed communications technology such as 5G.
“A PC will generate 90 megabytes of data a day,” said Tien Shiah, who runs HBM marketing at Samsung. “An autonomous car will generate 4 terabytes a day. A connected plane will generate 50 terabytes a day.”
Most of that data is useless. An autonomous vehicle will collect data from radar, LiDAR, and an assortment of cameras. Some of it will be used in the training algorithms to improve safety and overall performance of vehicles, which will be relayed into the cloud. Some of it will need an instant response to avoid an accident or to address a problem in real time, which needs to be processed and acted upon immediately and locally. And most of it will be discarded, like the video footage from security cameras.
If that pre-processing is done locally, far less data needs to be further processed in the cloud or some mid-range servers. The result is far better performance for less money and less power, enabling the kind of rapid response required by autonomous cars, drones, or even robots.
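Here is a minimal sketch of that local triage in Python, with an invented threshold and frame type (the article doesn't specify a mechanism): frames are scored on-device and only the interesting ones are passed upstream.

from dataclasses import dataclass
from typing import Iterable, Iterator

@dataclass
class Frame:
    timestamp: float
    motion_score: float  # assumed output of an on-device detector
    payload: bytes

def triage_at_edge(frames: Iterable[Frame],
                   threshold: float = 0.8) -> Iterator[Frame]:
    """Yield only frames worth sending upstream; drop the rest locally."""
    for frame in frames:
        if frame.motion_score >= threshold:
            yield frame  # candidate for cloud processing or training data
        # everything below threshold is discarded on-device, like the
        # idle security-camera footage the article mentions

Even a crude filter like this is the difference between shipping terabytes of raw video upstream and shipping a handful of events.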
Defining the edge
One of the biggest problems with edge computing, however, is that it’s a technology in transition. It’s being defined as it evolves. Today, you can’t actually order up purpose-built edge-computing products able to support a specific mix of IoT devices, infrastructure and computing requirements.
“Right now, edge computing is mostly just a lot of talk, but there is progress,” said Zeus Kerravala, principal analyst at ZK Research. “The partnership Nvidia announced with Arm, and the edge processors announced by Intel are both fundamental products purpose-built for the edge where you need to add power to do the processing on a device or gateway or other facility rather than sending it to the cloud.”
Tomi Engdahl says:
Challenges At The Edge
https://semiengineering.com/challenges-at-the-edge/
This also is important for AI, machine learning, and deep learning applications, which may or may not be connected to a system in motion. The key with AI/ML/DL is being able to do the inferencing piece on a local device, which improves security as well as performance. That puts a whole new spin on the edge computing model, though: the GPUs used to train these systems run in parallel on floating-point weights, while the edge device that runs the resulting model has a far tighter power and compute budget for inference.
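As a rough illustration of that training-versus-inference split, the toy sketch below shows the arithmetic change only: float32 weights (the kind GPU training produces) are quantized to int8 so an edge device can do cheap integer multiply-accumulates. This is an assumed simplification, not any vendor's toolchain; real frameworks handle calibration and zero-points far more carefully.

import numpy as np

def quantize(values):
    """Map a float32 array to int8 plus a single scale factor."""
    scale = np.abs(values).max() / 127.0
    return np.round(values / scale).astype(np.int8), scale

rng = np.random.default_rng(0)
w_float = rng.standard_normal((4, 8)).astype(np.float32)  # stand-in for trained weights
x = rng.standard_normal(8).astype(np.float32)             # one input vector

w_q, w_scale = quantize(w_float)
x_q, x_scale = quantize(x)

# Integer multiply-accumulate on-device, one float rescale at the end.
y_int = w_q.astype(np.int32) @ x_q.astype(np.int32)
y_approx = y_int * (w_scale * x_scale)

print("max quantization error:", np.abs(y_approx - w_float @ x).max())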
Tomi Engdahl says:
Edge Computing: Explained
https://hardware.slashdot.org/story/18/05/07/230249/edge-computing-explained
The advent of edge computing as a buzzword you should perhaps pay attention to is the realization by these companies that there isn’t much growth left in the cloud space. Almost everything that can be centralized has been centralized. Most of the new opportunities for the “cloud” lie at the “edge.” The word edge in this context means literal geographic distribution. Edge computing is computing that’s done at or near the source of the data, instead of relying on the cloud at one of a dozen data centers to do all the work. It doesn’t mean the cloud will disappear. It means the cloud is coming to you.
What is edge computing?
The future of software will be managed
https://www.theverge.com/circuitbreaker/2018/5/7/17327584/edge-computing-cloud-google-microsoft-apple-amazon
Tomi Engdahl says:
Will edge data centers save the internet?
https://www.cablinginstall.com/articles/pt/2018/05/will-edge-data-centers-save-the-internet.html
In a new industry perspective piece for Data Center Knowledge, Cole Crawford, CEO and founder of Vapor IO, explains how building data centers at the edge of the last-mile network will allow the wired and wireless worlds to operate in tandem and will enable new classes of applications.
“The wireless world and the wireline world are on a crash course,” warns Crawford. “We built each of these nationwide infrastructures with different goals and restraints, using vastly different principles and geographic footprints. Without fundamental changes to converge these two communications systems, the internet itself will eventually break.”
As explained by Vapor IO, “As the world moves toward the augmented reality, virtual reality, autonomous driving and the internet of things, data centers must be viewed as a distributed collection of resources that evolve to address the changing requirements of these new workloads. Large, centralized data centers need to be extended with hybrid and edge technologies that can be remotely monitored and operated.”
http://www.datacenterknowledge.com/industry-perspectives/how-edge-data-centers-will-save-internet
Tomi Engdahl says:
Rugged IT in a Box
Taking a “safe” approach—Rittal has a number of solutions for putting IT on the edge, where ruggedness is a plus.
http://www.electronicdesign.com/industrial-automation/rugged-it-box
Everything is moving to the cloud, but sometimes that enterprise information technology (IT) hardware needs to be on the edge. Oftentimes, that edge computing isn't as lightweight as an embedded gateway. Developers who need to deliver more powerful solutions must find a way to keep the IT resources cool and clean. Rugged solutions like those built around OpenVPX for military and avionics represent one approach. Although such COTS solutions exist, they typically require customization, making them much more expensive.
Another approach is to use conventional IT hardware, but wrap it within a protected structure. This is where Rittal’s custom data centers come into play (Fig. 1). Built off the company’s Modular Data Center solutions for IT, these systems are based on Rittal’s popular TS 8 and TS IT series and new VX25 series enclosures. In addition, cooling-related solutions allow the IT hardware to operate in industrial environments closer to the industrial Internet of Things (IIoT).
Tomi Engdahl says:
20 Most Ethical High-Tech Companies
https://www.eetimes.com/document.asp?doc_id=1333281
Good ethics are simply good business. In light of that reality, more and more organizations are putting time and energy into perfecting their compliance and ethics practices and procedures.
Especially in the electronics and high-technology sectors, there are some excellent examples of organizations that have built stellar reputations and grown their business as a result.
Organizations in the high-tech sector, including defense, distribution, manufacturing, logistics, automotive, and more, were well represented on the list. The awards were based on the following criteria:
Ethics and compliance program (35%)
Corporate citizenship and responsibility (20%)
Culture of ethics (20%)
Governance (15%)
Leadership, innovation and reputation (10%).
Increasingly, organizations of all kinds are extending their business strategy to include a community focus that takes diversity/inclusion, investments, and the company voice into account to mold the company brand and win the loyalty and respect of employees, customers, and stakeholders alike. “At Microsoft, trust and integrity are core to our values and critical to our success. We’re passionate about applying the power of technology to improve our world, and that starts with doing business in a way that builds and maintains trust with our customers,” added Microsoft President, Brad Smith.
Tomi Engdahl says:
AI Benchmark Targets Inference
EEMBC to measure power-constrained chips
https://www.eetimes.com/document.asp?doc_id=1333297
The EEMBC trade group has started an effort to define a machine-learning benchmark for running inference jobs on devices at the edge of the network. The effort spun out of a separate benchmark that the group plans to release in June for chips used in advanced driver assistance systems (ADAS).
The work marks at least the third major initiative in six months to measure performance of neural-network jobs. It may be the first to focus on chips for power-constrained embedded systems.
Last month, Baidu and Facebook announced work with a handful of chipmakers on MLPerf, initially focused on training jobs in data centers. The Transaction Processing Performance Council formed an effort in December that also likely will focus on training.
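None of these benchmarks are final yet, but the general shape of an inference benchmark is easy to sketch: warm up, time many runs, and report throughput and tail latency (a power-constrained benchmark would also meter energy, which plain Python cannot do). The harness below is a generic toy under those assumptions, not EEMBC's actual methodology.

import time
import statistics

def benchmark(infer, sample, warmup=10, runs=100):
    """Time repeated inference calls; report throughput and tail latency."""
    for _ in range(warmup):            # let caches and JITs settle
        infer(sample)
    latencies = []
    for _ in range(runs):
        start = time.perf_counter()
        infer(sample)
        latencies.append(time.perf_counter() - start)
    latencies.sort()
    return {
        "inferences_per_second": 1.0 / statistics.mean(latencies),
        "p95_latency_ms": 1000 * latencies[int(0.95 * runs) - 1],
    }

# Usage with any callable model, e.g. benchmark(model.predict, test_input)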
Tomi Engdahl says:
Autonomous micro data centers: Hyperscale extension infrastructure for the edge of the cloud
https://www.cablinginstall.com/articles/2018/05/autonomous-micro-rm.html
Tomi Engdahl says:
The first 14-terabyte server disk (article in Finnish):
http://www.etn.fi/index.php/13-news/8032-ensimmainen-14-teratavun-palvelinlevy
Tomi Engdahl says:
“AI” is in the air. Is it on your board?
https://www.edn.com/electronics-products/electronic-product-reviews/other/4460695/-AI–is-in-the-air–Is-it-on-your-board-
It seems you can’t swing a Turing Test these days without hitting an “AI”. And while a small number of projects are gunning for the Turing, the vast majority of AI in the air refers to neural networks and image (or other pattern) recognition. In fact, neural networks have pretty much crashed the party, and, for better or worse, are what most people mean by “AI” these days.
AI has been in the cloud for some years now: Voice recognition and machine (e.g., Google) translation accuracy is way up. But what if it’s a sunny day? No cloud. That’s where today’s second-hottest buzzword comes in: The Edge.
Edge computing means that you do lots of crunching, neural networking, or what have you, locally, at the edge of the network. If you still need to call home, the amount of data involved is much less than if you were feeding raw video, say, over the network.
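The data reduction is easy to put numbers on. Here is a back-of-envelope comparison, with all camera and message sizes assumed purely for illustration (none come from the article):

frame_bytes = 1920 * 1080 * 3          # one raw 1080p RGB frame, ~6.2 MB
fps = 30
raw_per_hour = frame_bytes * fps * 3600        # ~672 GB/hour of raw video

event_bytes = 200                      # one small JSON detection message
events_per_hour = 120                  # assume a detection every 30 seconds
edge_per_hour = event_bytes * events_per_hour  # 24 KB/hour

print(f"{raw_per_hour / edge_per_hour:,.0f}x less data sent upstream")
# -> roughly a 28,000,000x reduction when only results "call home"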
Lattice is one company taking the AI plunge. Their just-introduced sensAI stack encompasses devkits, FPGAs, software, and IP. And while the learning curve for any new technology tends to be daunting, Lattice is trying to make it easy, whether by offering most of the technology for free or by partnering with expert third parties who can help you with some of the newer, trickier bits, like neural-net training.
Tomi Engdahl says:
Bloomberg:
Sources: Microsoft has agreed to acquire GitHub; the deal could be announced as soon as Monday — companies are said to plan an announcement as soon as Monday — the San Francisco-based startup was valued at $2 billion in 2015 — Microsoft Corp. has agreed to acquire GitHub Inc. …
Microsoft Will Acquire Coding Site GitHub
https://www.bloomberg.com/news/articles/2018-06-03/microsoft-is-said-to-have-agreed-to-acquire-coding-site-github
For Microsoft Corp., acquiring GitHub Inc. would be both a return to the company’s earliest roots and a sharp turnaround from where it was a decade ago.
The software maker has agreed to acquire GitHub, the code-repository company popular with many software developers, and could announce the deal as soon as Monday, according to people familiar with the matter.
Redmond, Washington-based Microsoft is now one of the biggest contributors to GitHub, and as Nadella moves the company away from complete dependence on the Windows operating system to more in-house development on Linux, the company needs new ways to connect with the broader developer community.
Tomi Engdahl says:
Understanding data security concerns in remote data centers
https://www.cablinginstall.com/articles/print/volume-26/issue-5/features/data-center/understanding-data-security-concerns-in-remote-data-centers.html
With security breaches on the rise, compliance with regulations keeps a tight leash on enterprises.
In 2017, recorded U.S. data breaches hit a new all-time high of 1,579, up almost 50 percent over the previous year, according to the Identity Theft Resource Center. This should come as no surprise, considering that, also last year, data overtook oil as the world’s most valuable resource.
For data centers, privacy and physical security of servers and switches have always been a critical priority, but increased migration toward remote edge compute sites and multitenant data centers (MTDC) has made remote management and access control of the data center cabinet more complex and challenging.
Furthermore, growing data privacy regulations such as the Health Insurance Portability and Accountability Act (HIPAA), Payment Card Industry Data Security Standard (PCI-DSS), Federal Information Security Management Act (FISMA), and the upcoming General Data Protection Regulation (GDPR) are driving the need for more-stringent cybersecurity measures, including closely controlled access to cabinets where servers and switches reside.
Regulations and physical security compliance
Certain segments of the industry—particularly healthcare and financials—look at cabinet access control more strictly, requiring a detailed report of who accessed the cabinet, when, and why. Generally, though, all regulations simply require physical access control measures to be in place, but it is up to enterprises to decide which specific method or technology to use.
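As a sketch of what such a “who, when, and why” report implies, here is the kind of record an access-control system might append to its audit log. All field names and values are illustrative assumptions, not drawn from any standard or product.

from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class CabinetAccessEvent:
    cabinet_id: str   # which cabinet
    person: str       # who
    timestamp: str    # when (UTC, ISO 8601)
    reason: str       # why
    method: str       # badge, PIN, physical key, ...

event = CabinetAccessEvent(
    cabinet_id="MTDC-07-R42",
    person="j.doe",
    timestamp=datetime.now(timezone.utc).isoformat(),
    reason="scheduled switch firmware update",
    method="badge+PIN",
)
print(asdict(event))  # in practice, appended to a tamper-evident log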
Tomi Engdahl says:
A fast 4G connection is coming soon to Windows 10 laptops
Today, if your laptop is connected to a network, the connection is made either over Wi-Fi or by tethering to your phone’s 4G connection. Now is the time when Windows 10 laptops gain built-in, ready-to-use 4G/LTE network connections.
Qualcomm, together with Samsung, has launched the new Snapdragon 850 mobile platform at the Computex show in Taiwan. It combines an X20 modem and an AI processor on the same chip.
For laptops, this means roughly tripled AI performance, up to 1.2-gigabit LTE network connectivity, and 25 hours of battery life in normal use.
Snapdragon 850 processors are manufactured in a 10-nanometer process.
Source: http://www.etn.fi/index.php/13-news/8102-windows-10-lapparissa-on-pian-nopea-4g-yhteys-valmiina
Tomi Engdahl says:
Optane DIMMs Are Worth the Wait
https://www.eetimes.com/author.asp?section_id=36&doc_id=1333361
We may need to wait for broad availability of Intel’s Optane on DIMMs, but when it arrives it should bring an important cost/performance improvement to servers.
At a May 30 event, Intel said it is already sampling its Optane DIMMs. The modules, branded Optane DC, will ship for revenue this year.
Most of this event focused not on the DIMM itself, but on the framework Intel has been building for the past five years to enable 3D XPoint adoption. The company helped establish software standards in an effort to assure that all of the necessary support tools are in place to make the new memories useful.
Intel presented benchmarks showing servers with Optane DC boasted nine times more read transactions per second and 11 times more users per system than DRAM-only servers. This is all great. The question is when will we get it?
It may not be all that soon. Recent history with this technology suggests there may be some delays before the product reaches broad availability.