Prediction articles:
2020: A consumer electronics forecast for the year(s) ahead
AI Chips: What Will 2020 Bring?
CEO Outlook: 2020 Vision: 5G, China and AI are prominent, but big changes are coming everywhere
Top 10 Tech Failures From 2019 That Hint At 2020 Trends – Last year’s tech failures often turn into next year’s leading trends
Trends:
AMD’s 7nm Ryzen 4000 CPUs are here to take on Intel’s 10nm Ice Lake laptop chips
Top 9 challenges IT leaders will face in 2020: From skills shortages to privacy concerns
From the oil rig to the lake: a shift in perspective on data
In December 2020, the new IEC/EN 62368-1 will replace the existing safety standards EN 60950-1 and EN 60065-1
Use of technology money outside the company IT department is the new normal
Tech to try:
12 Alternative Operating Systems You Can Use In 2020
CONTINUOUS INTEGRATION: WHAT IT IS AND WHY YOU NEED IT
Research:
Universal memory coming? A new type of non-volatile general-purpose memory is under research; some call it UltraRAM.
1,318 Comments
Tomi Engdahl says:
AMD Ryzen 7 5800X Emerges As A Serious Rival For The Intel Core i9-10900K
https://www.tomshardware.com/news/amd-ryzen-7-5800x-emerges-as-a-serious-rival-for-the-intel-core-i910900k
Tomi Engdahl says:
AMD Ryzen 5800X and 5900X CPUs could be on sale and gunning for Intel on October 20
https://www.techradar.com/uk/news/amd-ryzen-5800x-and-5900x-cpus-could-be-on-sale-and-gunning-for-intel-on-october-20
Tomi Engdahl says:
AMD Zen 3 CPUs listed as Ryzen 5000-series chips in benchmark leak
https://www.pcgamer.com/amd-ryzen-7-5800x-5000-series-cpu-aots/
Tomi Engdahl says:
Lenovo’s 2-pound Thinkpad X1 Nano features Intel’s latest processors
You can get up to Intel’s 11th-generation Core i7 in the X1 Nano.
https://www.engadget.com/lenovo-thinkpad-x1-nano-100042926.html
Tomi Engdahl says:
Third-Party GPU Makers and Nvidia Respond to Nvidia RTX 30-Series Crash to Desktop Issues (More Updates)
https://www.tomshardware.com/news/third-party-gpu-makers-respond-to-nvidia-rtx-30-series-crash-to-desktop-issues
Tomi Engdahl says:
https://www.pcworld.com/article/3573079/is-it-still-worth-it-to-buy-a-used-xeon-for-a-diy-pc-build-ask-an-expert.html
Tomi Engdahl says:
Python programming in the final frontier: Microsoft and NASA release
student learning portal
https://www.techrepublic.com/article/python-programming-in-the-final-frontier-microsoft-and-nasa-release-student-learning-portal/
Overall, the project includes three different NASA-inspired lessons.
These learning pathways were created by computer scientist and
entrepreneur Sarah Guthals to teach programming fundamentals using
space exploration challenges and themes.
Tomi Engdahl says:
Online avatar service Gravatar allows mass collection of user info
https://www.bleepingcomputer.com/news/security/online-avatar-service-gravatar-allows-mass-collection-of-user-info/
A user enumeration technique discovered by security researcher Carlo
Di Dato demonstrates how Gravatar can be abused for mass data
collection of its profiles by web crawlers and bots.
Tomi Engdahl says:
51% of Developers Say They’re Managing 100 Times More Code Than a Decade Ago
https://developers.slashdot.org/story/20/10/04/0157214/51-of-developers-say-theyre-managing-100-times-more-code-than-a-decade-ago?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+Slashdot%2Fslashdot%2Fto+%28%28Title%29Slashdot+%28rdf%29%29
When asked how the size of the codebase across their entire company, measured in megabytes and the number of repositories, has changed in the past decade, over half (51%) of software development stakeholders reported they have more than 100 times the volume of code they had 10 years ago. And a staggering 18% say they have 500 times more code.
Sourcegraph: Devs are managing 100x more code now than they did in 2010
Developers manage more code, in more languages, for more platforms than ever.
https://arstechnica.com/gadgets/2020/10/sourcegraph-devs-are-managing-100x-more-code-now-than-they-did-in-2010/
Tomi Engdahl says:
NVIDIA Delays GeForce RTX 3070 Launch to October 29th
by Ryan Smith on October 2, 2020 3:30 PM EST
https://www.anandtech.com/show/16135/nvidia-delays-geforce-rtx-3070-launch-to-october-29th
In a brief news post made to their GeForce website last night, NVIDIA has announced that they have delayed the launch of the upcoming GeForce RTX 3070 video card. The high-end video card, which was set to launch on October 15th for $499, has been pushed back by two weeks. It will now be launching on October 29th.
Indirectly referencing the launch-day availability concerns for the RTX 3080 and RTX 3090 last month, NVIDIA is citing a desire to have “more cards available on launch day” for the delay. NVIDIA does not disclose their launch supply numbers
https://www.nvidia.com/en-us/geforce/news/geforce-rtx-3070-available-october-29/
Tomi Engdahl says:
GitHub users speak their brains on Microsoft’s open-source efforts: ASP.NET shines, but WPF is ‘a disaster’
An open-source project that can’t handle pull requests isn’t a good look
https://www.theregister.com/2020/10/02/microsoft_open_source_github_survey/
Tomi Engdahl says:
5 keys to next-generation IC packaging design
https://www.edn.com/5-keys-to-next-generation-ic-packaging-design/?utm_source=newsletter&utm_campaign=link&utm_medium=EDNFunFriday-20201002
Tomi Engdahl says:
Raymond Zhong / New York Times:
How Taiwan’s biggest chip maker TSMC is caught in a tough spot, forced to heed the dictates of Trump’s tech policy, while trying to keep many customers in China — The island’s biggest chip maker has been a coveted partner to both battling giants. But rising nationalism is making it harder to keep the middle ground.
In U.S.-China Tech Feud, Taiwan Feels Heat From Both Sides
https://www.nytimes.com/2020/10/01/technology/taiwan-china-tsmc-huawei.html
The island’s biggest chip maker has been a coveted partner to both battling giants. But rising nationalism is making it harder to keep the middle ground.
Tomi Engdahl says:
DevOps helps when software development gets stuck
https://talentree.fi/softa/devops-auttaa-kun-ohjelmiston-kehitys-jumittaa/?utm_source=facebook&utm_medium=paid+social&utm_campaign=FB+Softa+2020-06&utm_content=blogi
If any of the following problems are still unfixed, now is a good time to fix them:
Developing new features starts to take a long time, and the time seems to go to everything except building those features.
Unexpected bugs appear in the software, even though it was tested on the developer’s machine.
Releases are laborious.
Maintenance is laborious.
Human errors happen in manual procedures.
News of defects in the software comes from its users.
Tomi Engdahl says:
Nvidia Presents the DPU, a New Type of Data Center Processor
https://www.eetimes.com/nvidia-presents-the-dpu-a-new-type-of-data-center-processor/
Nvidia announced a new type of processor, the data processing unit (DPU), essentially a network interface card (NIC) with built-in Arm CPU cores to offload and accelerate networking, storage and security tasks which would previously have been done on another CPU. The DPU will eventually replace the NIC in data center systems.
“As NICs have advanced, you will also find acceleration engines for different kinds of I/O activities, such as RDMA embedded there as well,” he said. “As this has happened, it’s created a greater load on the x86 CPU hosts of the server, leaving less room to run the applications.”
The Bluefield-2 DPU comes on a PCIe card, and combines a Mellanox ConnectX-6 Dx SmartNIC with 8x 64-bit Arm A72 cores and 2x VLIW acceleration engines on the same silicon. A single Bluefield-2 DPU can deliver the same data center services that could consume up to 125 CPU cores. This frees up valuable CPU cores to run a wide range of other enterprise applications, said Das.
Nvidia also announced the Bluefield-2X, which adds an Ampere family AI accelerator GPU to the same card as the Bluefield 2. This adds 60 TOPS of AI acceleration which can be used to do intelligent analysis of what is going on in the network. For example, it could be used for intrusion detection, where AI can tell the difference between normal and abnormal behavior so that anything abnormal can be proactively identified and blocked.
Tomi Engdahl says:
Frederic Lardinois / TechCrunch:
Google rebrands G Suite as Google Workspace, integrates Meet, Chat, and Rooms across all its applications, and introduces new logos for all apps in the suite — Google is rebranding G Suite, its set of online productivity and collaboration tools for businesses that include the likes of Gmail, Drive, Docs and Meet.
G Suite is now Google Workspace
https://techcrunch.com/2020/10/06/g-suite-is-now-google-workspace/
Google is rebranding G Suite, its set of online productivity and collaboration tools for businesses that include the likes of Gmail, Drive, Docs and Meet. The new name is Google Workspace, a name the company already hinted at when it first introduced a set of new collaboration tools and Google Meet integrations for the service earlier this year. Now those new tools are coming out of preview and with that, the company decided to also give the service a new name and introduce new logos for all the included productivity apps, which are now being used — and paid for — by more than 6 million businesses.
Tomi Engdahl says:
Excel Hell: It’s not just blame for pandemic pandemonium being spread between the sheets
Some things simply don’t belong in regulatory environments
https://www.theregister.com/2020/10/06/excel/
Column The howls of disbelieving, horrified laughter caused by the news of the latest pandemic data cock-up yesterday were well deserved.
16,000 cases lost – purportedly in a blunder involving CSV data, row limits, and an out-of-date Excel file format? In a multibillion-pound, “world-beating” contact-tracing system? Unnoticed for a week of rising infection? In a system known to be broken for months but still not fixed?
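For anyone wondering how a file format quietly loses records: the legacy .xls worksheet format caps out at 65,536 rows (the modern .xlsx at 1,048,576), so rows pushed past that limit simply vanish. A minimal sketch of the kind of row-count sanity check that would flag it, assuming pandas (with an xlrd/openpyxl engine) is available; the file names here are hypothetical:

    # Hedged sketch: verify no rows were lost when CSV data is pushed
    # through a legacy .xls workbook (65,536-row limit per sheet).
    import pandas as pd

    XLS_MAX_ROWS = 65_536        # legacy .xls sheet limit
    XLSX_MAX_ROWS = 1_048_576    # modern .xlsx sheet limit

    csv_rows = sum(1 for _ in open("lab_results.csv")) - 1   # minus header row
    sheet_rows = len(pd.read_excel("lab_results.xls"))

    if csv_rows > XLS_MAX_ROWS - 1:
        print(f"CSV has {csv_rows} data rows: cannot fit in a single .xls sheet")
    if sheet_rows < csv_rows:
        print(f"Row count mismatch: {csv_rows - sheet_rows} records silently dropped")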
Ridicule and despair, those shagged-out nags of our Johnsonian apocalypse, once again trudged exhaustedly across the plaguelands of England.
But the true horror is rooted much deeper and the underlying sins stretch much wider than Number 10. Of that sad catalogue of fail, one item which was widely blamed for the chaos deserves our very finest scorn. One alleged piece of the jagged little jigsaw is a global pandemic all its own, poisoning our data and sickening our businesses for 35 years. In five years’ time, with some luck and much work, vaccines and social changes will see COVID-19 demoted to flu status, yet Excel will be with us still.
It is an execrable design, from any angle. It has forced generations of untrained business people to operate without help on their data through a letterbox using Lego bricks as scalpels. It knows not any modern data techniques – of structure, robustness, verification, documentation, modularity, versioning, variable typing, variable naming, bound labels. How the Hell is anyone supposed to build and use and communicate and maintain any sort of model where you have to build it cell by cell, with tricksy little links and kiddy algebraic naming conventions?
It is a sixth-form programming project grown to the size of Godzilla.
If you mention VBA, I shall scream.
If you mention security, I shall scream louder.
For this, I blame Microsoft. People have to use Excel because it is the only data manipulation tool in Office, and Office is the only game in town. And because of this, Microsoft hasn’t done an innovation worth a damn in personal data management for the common business user in decades.
The result is an unending history of misery, at a corporate and a personal level, because Microsoft doesn’t care. It’s got your money, you’ve got to use its software, off you go.
If there were health and safety rules for software, Excel would be up there with radium cigarettes and arsenic gobstoppers.
In fact, where there are rules – such as in some medical regulatory environments – you can’t just use Excel. You have to certify your particular application.
People and data do not mix well, but in the sacred names of Turing, Shannon, and Lovelace, we deserve a better answer than Excel.
Use spreadsheets for their intended purpose
For your edification, I present this list – 44 pages long – of spreadsheet horror stories, of data entry, calculating, modelling, and analytic chaos that has cost hundreds of millions of pounds and put endeavour of all sorts at risk.
I say spreadsheets, because all are bad, but I mean Excel, because that’s the only one that matters. That’s the compulsory one. The compatibility enforcer, the one that sets the rules. The one that misrules.
http://www.eusprig.org/horror-stories.htm
Tomi Engdahl says:
5 Graphics Settings Worth Tweaking in Every PC Game
Sure, you can settle for the default presets, but even small changes can mean better performance—and a much better gaming experience.
https://www.wired.com/story/five-graphics-settings-to-change-every-pc-game/
Tomi Engdahl says:
Now it’s here: the first DDR5 memory
https://etn.fi/index.php/13-news/11244-nyt-se-tuli-ensimmainen-ddr5-muisti
The standards body JEDEC published the DDR5 specification for DRAM in July 2020. Since then, DRAM manufacturers have been racing to be first to market with DDR5 memory. South Korea’s SK Hynix won the race ahead of Samsung and Micron.
SK Hynix has of course been working on its DDR5 memory for a long time. It announced the start of development in November 2018 and has been supplying samples, to Intel among others, for quite some time. With the standard finalized and compatibility tests completed, SK Hynix is now ready for commercial shipments.
SK Hynix’s DDR5 memory supports transfer rates of 4,800 to 5,600 megabits per second. Compared with DDR4 memory, the speed grows by a factor of 1.8. At this bandwidth, the memory can transfer nine Full HD movies per second.
Tomi Engdahl says:
Xbox’s Phil Spencer Isn’t Sure 8K Will Ever Be Standard in Video Games
https://games.slashdot.org/story/20/10/06/181201/xboxs-phil-spencer-isnt-sure-8k-will-ever-be-standard-in-video-games?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+Slashdot%2Fslashdot%2Fto+%28%28Title%29Slashdot+%28rdf%29%29
Xbox boss Phil Spencer has said that he isn’t sure if 8K resolution will ever be standard for video games, calling it “aspirational technology.” From a report:
Talking to Wired, Spencer said, “I think 8K is aspirational technology. The display capabilities of devices are not really there yet. I think we’re years away from 8K being — if it ever is — standard in video games.” Spencer’s comments come despite the Xbox Series X being able to support 8K output. However, while it may technically be able to push video at a resolution of 7680 x 4320, there are more factors to consider, chiefly being if anyone even has an 8K television or monitor to view such visuals on. According to Wired’s chat with Liz Hamren, head of gaming engineering at Xbox, Microsoft’s data suggests that 4K TV adoption is less than what publishers may think, and so that suggests 8K adoption is still years away at least.
Xbox’s Phil Spencer Isn’t Sure 8K Will Ever Be Standard in Video Games
Too many pixels?
https://www.ign.com/articles/xbox-phil-spencer-8k-video-games-standard
If you aspire to 8K gaming, you’re best off taking a look at PCs and the new Nvidia RTX 3090 GPU, which costs a cool $1499 (and requires the rest of a PC and an 8K display, too). If that’s out of your budget, then you’re out of luck on 8K for now.
Tomi Engdahl says:
Is ‘Datafication’ the New Mantra for Smart Everything?
https://www.eetimes.com/is-datafication-the-new-mantra-for-smart-everything/
What is ‘Datafication’? Is that even proper English? Apparently, it is now.
In the technology industry, we’re used to lots of acronyms. We’re also used to hearing new phrases that both startups and established companies would love us to adopt as industry naming. It gives a kind of hidden sense of pride in having invented the term.
So, when reviewing the presentations submitted for next week’s Boards and Solutions 2020 virtual conference (13-14 October 2020), the one term I picked up as a possible new mantra is ‘datafication’. More specifically, as Charbel Aoun, EMEA business development director for smart cities at Nvidia, describes the four megatrends that will significantly impact our lives, he explains that with 8,000 new IoT devices connected every 60 seconds, “Digitization has enabled datafication.”
He adds, “IoT could turn the world into data that could be used to make macro decisions on resource utilization. Information is a great way to reduce waste and increase efficiencies. That is really what the internet of things provides. This was the vision of Kevin Ashton back in 1999, the father of the term IoT. Today, this vision is becoming a reality.”
Indeed, data is everything, and not just at the edge, but also in the data center, as Nvidia revealed more details of its data processing unit (DPU) at its GTC conference.
Aoun describes the march of datafication in his talk, as he talks about the big challenge for smart cities as an example. “There’s around one billion cameras worldwide, recording 24/7, generating a huge amount of data. It is basically impossible for humans to process such amounts of data. To give you an idea, with one 1080P resolution camera, H.264 I at 34 fps, will generate 47Gbytes of data in 24 hours and 17 Terabytes of data in one year. On the other hand, one CCTV operator can focus for 30 minutes while looking at 4-16 video streams at the same time. Which means for every 100 screens or 100 streams you want to monitor, you need six operators. To understand the scale of the challenge, let us look at the number of CCTVs in the city. In Shanghai, one million plus CCTVs, London, 500,000, Moscow, 200,000, L.A. 25,000, Berlin 20,000.”
“Now you get the picture of the volume of data that can be generated from all the cameras in a city, and the amount of resources required to maintain and monitor.” In his paper, “How AI can make cities smarter – Powering AI City with IVA”, Aoun talks about how AI is helping make sense of the information overload in very effective and efficient ways, provides insight and enables real time decision making to enhance the lives of citizens. He illustrates how AI offers city managers new solutions to 21st century urban challenges with some practical examples.
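The arithmetic behind the figures Aoun quotes is easy to check. A quick back-of-the-envelope sketch using only the numbers cited above (the per-camera rate is taken at face value, not measured):

    # Back-of-the-envelope check of the figures quoted above:
    # one 1080p camera ~47 GB/day, one operator can watch 4-16 streams.
    gb_per_day = 47
    tb_per_year = gb_per_day * 365 / 1000          # ~17 TB per camera per year
    operators_per_100_streams = 100 / 16           # ~6 operators, at best

    shanghai_cameras = 1_000_000                   # figure quoted above
    shanghai_eb_per_year = shanghai_cameras * tb_per_year / 1_000_000
    print(tb_per_year, operators_per_100_streams, shanghai_eb_per_year)
    # -> ~17.2 TB, ~6.25 operators, ~17 exabytes/year for Shanghai's cameras alone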
Value will come from edge autonomy
While all this data being generated needs to be processed and analyzed to provide the insights and enable actions, how about if the edge devices themselves are able to intelligently make decisions? This is the premise of the paper, “Insights into edge autonomy – the future of edge computing”
Tomi Engdahl says:
Low and no-code are wonderful, but a ‘big code’ world lurks underneath
For developers, code releases are “emotional” events. Many have fear and anxiety at the moment they release code or submit it for review – and fear breaking dependencies
https://www.zdnet.com/article/low-and-no-code-are-wonderful-but-a-big-code-world-lurks-underneath/
While most applications, online services and even automobiles now run on hundreds of thousands, or even millions of lines of code behind the scenes, many applications and services can be written with relatively few lines, thanks to abstractions available through today’s platforms. Serverless offerings and low-code or no-code solutions take this even a step further. Still, lurking underneath all those pain-free interfaces is a huge hairball of volume and dependencies being created in today’s enterprises.
Tomi Engdahl says:
Nvidia promises once again to let Arm keep its Switzerland-of-chips biz model – and even license some Nv GPU tech
We’ll be nice post-merger, bosses pledge at confab
https://www.theregister.com/2020/10/07/nvidia_arm_plans/
Tomi Engdahl says:
DDR5 is Coming: First 64GB DDR5-4800 Modules from SK Hynix
by Dr. Ian Cutress on October 6, 2020 8:00 AM EST
https://www.anandtech.com/show/16142/ddr5-is-coming-first-64gb-ddr5-4800-modules-from-sk-hynix
DDR5 is the next stage of platform memory for use in the majority of major compute platforms. The specification (as released in July 2020) brings the main voltage down from 1.2 V to 1.1 V, increases the maximum silicon die density by a factor of 4, doubles the maximum data rate, doubles the burst length, and doubles the number of bank groups. Simply put, the JEDEC DDR5 specification allows for a 128 GB unbuffered module running at DDR5-6400. RDIMMs and LRDIMMs should be able to go much higher, power permitting.
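To put those transfer rates into bandwidth terms, here is a small sketch assuming the standard 64-bit (8-byte) data bus per module; these are peak theoretical figures, not benchmarks:

    # Peak theoretical bandwidth per DIMM: transfer rate (MT/s) x 8 bytes
    # per transfer (64-bit data bus). Illustrative numbers only.
    def peak_gb_s(mt_per_s: int, bus_bytes: int = 8) -> float:
        return mt_per_s * bus_bytes / 1000

    print(peak_gb_s(3200))   # DDR4-3200: 25.6 GB/s
    print(peak_gb_s(4800))   # DDR5-4800 (SK Hynix launch parts): 38.4 GB/s
    print(peak_gb_s(6400))   # DDR5-6400 (top of the JEDEC spec): 51.2 GB/s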
Insights into DDR5 Sub-timings and Latencies
by Dr. Ian Cutress on October 6, 2020 11:00 AM EST
https://www.anandtech.com/show/16143/insights-into-ddr5-subtimings-and-latencies
Tomi Engdahl says:
Time for a virtual love affair: ESXi-Arm Fling flung onto the web for peeps to test drive with Raspberry Pi 4, other kit
Totally not-safe-for-production 64-bit Arm hypervisor port released for evaluation, partly to sell the SmartNIC idea, etc
https://www.theregister.com/2020/10/07/vmware_esxi_arm/
Tomi Engdahl says:
https://etn.fi/index.php/13-news/11247-tekoaly-tuotiin-tukemaan-vahakoodista-kehitysta
Tomi Engdahl says:
US takes chip wars to new level by targeting SMIC
America’s closest allies will have to play a balancing act
https://asiatimes.com/2020/10/us-takes-chip-wars-to-new-level-by-targeting-smic/
By targeting Semiconductor Manufacturing International Corporation, the US has used the nuclear option. Unlike Huawei or ByteDance, SMIC is foundational for China. From self-driving cars to artificial intelligence, China’s future depends on the success of SMIC (and its domestic chip industry). By taking action against SMIC, Washington has attacked the heart of China.
Now, it’s war over chips.
Except this won’t just take place between the US and China.
Of the many implications that come with sanctioning SMIC, one is that other nations will be dragged into the tech war.
Tomi Engdahl says:
IFTLE 463: DoD Focuses on “Reshoring” Electronics to the US
https://www.3dincites.com/2020/10/iftle-463-dod-focuses-on-reshoring-electronics-to-the-us/
The Defense Advanced Research Projects Agency (DARPA) launched its Electronic Resurgence Initiative (ERI) in 2017 with the intention of reshoring a domestic chip industry that has been moving steadily offshore for decades. Microelectronics are a foundational building block of most of our defense systems.
Tomi Engdahl says:
IFTLE 462: If Not a Node then What?
https://www.3dincites.com/2020/09/iftle-462-if-not-a-node-then-what/
Since 1971, the linear dimensions of a MOS transistor have shrunk by a factor of roughly 1,000, and the number of transistors on a single chip has increased roughly 15-million-fold. The metrics used to gauge this progress in integration density have been dimensions called the metal half-pitch and gate length. These are defined below.
Metal half-pitch is half the distance from the start of one metal interconnect to the start of the next on a chip.
In the planar transistor design, gate length measures the space between the transistor’s source and drain electrodes.
In that space sits the device’s gate stack, which controls the flow of electrons between the source and drain. It has been the most important dimension for determining transistor performance because a shorter gate length suggested a faster-switching device.
For a long time, these dimensions were roughly equivalent and became known as the “node”. These features on the chip were typically made 30 percent smaller with each successive generation. Such a reduction enabled the doubling of transistor density because reducing both the x and y dimensions of a rectangle by 30 percent means a halving in area. Presto, we got Moore’s Law!
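That scaling claim is just geometry; a trivial worked check (not from the article itself):

    # Shrinking both x and y by 30% leaves 0.7 * 0.7 = 0.49 of the area,
    # i.e. roughly twice as many transistors fit in the same footprint.
    linear_shrink = 0.70
    area_ratio = linear_shrink ** 2        # 0.49
    density_gain = 1 / area_ratio          # ~2.04x
    print(area_ratio, density_gain)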
Using the gate length and half-pitch as the “node number” served its purpose from the 1970s through the mid-1990s, but then the two features began to uncouple. Chipmakers began to shrink the gate length more aggressively than the other features. For example, transistors made using the so-called 130nm node actually had 70nm gates. The result was the continuation of the Moore’s Law density-doubling pathway, but with a disproportionately shrinking gate length. Yet industry, for the most part, stuck to the old node-naming convention.
In the early 2000s, other processes were devised to increase performance. For instance, the industry put part of the transistor’s silicon under strain, increasing the speed and power efficiency of CMOS devices without making the gate length much smaller.
By 2011, at the “22nm node”, Intel switched to the FinFET transistor architecture. The devices had 26-nm gate lengths, a 40-nm half-pitch, and 8-nm-wide fins. Thus at the 22nm node, the original concept of “node” had absolutely no meaning, “…because it had nothing to do with any dimension that you could find on the die…”
Time to Rename the Node?
Thus, Moore and many others contend we need a better way to describe the technology that we have developed/are developing. The nomenclature should reflect the sizes of actual features important to the transistor. One suggestion has been to use measures that describe the real limit on the area needed to make a transistor:
The contacted gate pitch (G), i.e the minimum distance from one transistor’s gate to another’s;
The metal pitch (M), which measures the minimum distance between two horizontal interconnects;
The number of tiers of devices on the chip (T) since we are quickly approaching the time when further shrinkage is impossible, and the only way forward is to stack.
The coming 5nm chips will have a contacted gate pitch of 48 nm, a metal pitch of 36 nm, and a single tier, thus making the GMT metric G48M36T1.
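As a small illustration, here is how such a GMT label could be composed from the three measures above; the helper function is hypothetical, and only the 48/36/1 example comes from the article:

    # Compose a GMT density label from contacted gate pitch (G, nm),
    # metal pitch (M, nm) and number of stacked device tiers (T).
    def gmt_label(gate_pitch_nm: int, metal_pitch_nm: int, tiers: int) -> str:
        return f"G{gate_pitch_nm}M{metal_pitch_nm}T{tiers}"

    print(gmt_label(48, 36, 1))   # "G48M36T1" -- the coming 5nm-class node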
Tomi Engdahl says:
https://www.electropages.com/blog/2020/10/how-chiplets-may-help-future-semiconductor-technology?utm_campaign=2020-10-07-Latest-Product-News&utm_source=newsletter&utm_medium=email&utm_term=article&utm_content=How+%E2%80%9CChiplets%E2%80%9D+May+Help+the+Future+of+Semiconductor+Technology
Tomi Engdahl says:
Facing the Threat of a Potential FPGA Shortage: a Call to Action
http://www.cotsjournalonline.com/index.php/2020/09/28/facing-the-threat-of-a-potential-fpga-shortage-a-call-to-action/
Consider this simple question. Can the defense and space industries count on a continued stable supply of mission-critical Field Programmable Gate Array (FPGA) components 10, 20, or even 30 years from now? Recent global events remind us that failing to adequately prepare for a catastrophe can lead to unintended consequences in domino-like succession. Some see the wisdom in seeking shelter before a tornado strikes, while others may argue that allowing events to play out is the preferred course of action. Planning for an eventual potential shortage of a critical military component may not be tops on everyone’s priority list. If an electronic guidance system for a warfighter is missing one critical part, that plane or missile will not fly. Let’s view the whole paradigm from a distance to see what’s missing, such as in the case of FPGA devices.
A sudden shortage of mission-critical FPGA devices could result in warfighters not flying and rockets not launching. This is not an exaggeration. Makers of ruggedized FPGA devices currently depend on a single subcontractor to attach the essential copper-wrapped solder columns to the body of the FPGA. Not just anyone can perform solder column attachment, for reasons that we will explain.
Past production shortages in the semiconductor industry have been short-lived because multiple vendors have been able to quickly step in to fill voids in the supply chain. Today, only a single subcontractor is designated on the Qualified Manufacturer List (QML-38535) as a provider of copper-wrapped solder column attachment services for the entire FPGA industry. Any supply chain dependent on a single supplier is inherently vulnerable. Action is needed to resolve this vulnerability.
Any number of unfortunate causes, from natural disasters to internal business problems, could interrupt business continuation for this current monopoly supplier.
An existential threat could arise if a hostile foreign actor acquired or otherwise took control and, for example, relocated production offshore. A facility relocation typically results in the loss of QML status, pending requalification. It can take up to 24 months for a new candidate to undergo the arduous approval process before attaining QML status to provide the aforementioned services. A prolonged production shutdown of FPGA devices directly impacts US national security, affecting thousands of downstream customers who would be unable to complete systems and black box builds. Proactive steps taken now to identify and monitor the risks could mitigate such a threat.
Tomi Engdahl says:
Intel says its 11th-gen Rocket Lake desktop CPUs will arrive in Q1 2021, with PCIe 4.0 support; AMD is set to unveil new CPUs at an event tomorrow — Just ahead of AMD’s own next-gen Zen 3 announcement — Intel has confirmed that its 11th Gen Rocket Lake desktop processors will be out sometime …
Intel confirms 11th Gen Rocket Lake desktop processors coming in early 2021
Just ahead of AMD’s own next-gen Zen 3 announcement
https://www.theverge.com/2020/10/7/21505926/intel-11th-gen-rocket-lake-desktop-processors-q1-2021-amd?scrolla=5eb6d68b7fedc32c19ef33b4
Tomi Engdahl says:
Tom Warren / The Verge:
Sony publishes a full PS5 teardown, revealing removable sides, dust catchers, storage expansion, and more
Sony’s PS5 teardown video reveals removable sides, dust catchers, and storage expansion
A closer look inside the PlayStation 5
https://www.theverge.com/2020/10/7/21505598/sony-ps5-playstation-5-tear-down-hardware?scrolla=5eb6d68b7fedc32c19ef33b4
Tomi Engdahl says:
Official Teardown Gives Unexpected Look Into PS5
https://hackaday.com/2020/10/07/official-teardown-gives-unexpected-look-into-ps5/
With Sony and Microsoft still a month away from the public release of their next-generation game consoles, you’d expect technical details of their respective systems to still be under a veil of secrecy. But both companies look to be taking things a bit differently this generation, as it becomes increasingly clear that modern consumers are interested in what makes their devices tick. Today, Sony really threw down the gauntlet by beating the tech media to the punch and posting their own in-depth teardown on the new PlayStation 5.
https://blog.playstation.com/2020/10/07/ps5-teardown-an-inside-look-at-our-most-transformative-console-yet/
Tomi Engdahl says:
https://etn.fi/index.php/13-news/11250-gan-kutistaa-laturit-ja-tekee-ne-nopeammiksi
Tomi Engdahl says:
European Semiconductor Sales Drop, Global Sales Rise
https://www.eetimes.eu/european-semiconductor-sales-drop-global-sales-rise/
The European semiconductor market continues to decline sharply on a year-to-year basis, according to figures from the Semiconductor Industry Association (SIA).
Year-on-year semiconductor sales have been on a downward trend in Europe. They declined by 7.1 percent in April 2020. By 12.9 percent in May. By 17.1 percent in June. By 14.7 percent in July. And by 10.1 percent in August. No doubt the Covid-19 crisis and the uncertainties in the automotive industry are correlated with these successive sales drops.
The gloomy picture in Europe contrasts with the sales dynamics in the rest of the world. In August 2020, SIA announced global sales of semiconductors reached $36.2 billion, up 4.9 percent from August 2019 total of $34.5 billion and up 3.6 percent from the July 2020 total of $35.0 billion.
“Global semiconductor sales increased year-to-year in August for the seventh consecutive month, demonstrating the global market so far has remained largely insulated from ongoing global macroeconomic headwinds, but there is still substantial uncertainty for the months ahead,” commented John Neuffer, SIA president and CEO, in a statement.
Tomi Engdahl says:
Death of the PC? Do me a favour, says Lenovo bigwig: ‘I’m expecting the biggest growth in a decade… for 2021′
Exec adopts a build-and-they-will-come mentality
https://www.theregister.com/2020/10/07/lenovo_shipment_predictions/
Canalys Forum 2020 Forecasting tech sales is a dark art at the best of times but in a pandemic it takes on a whole new level of complexity. Unperturbed, Lenovo’s president and COO is predicting shipment growth not seen in a decade for 2021.
Gianfranco Lanci said the outbreak of the virus has “accelerated a remote revolution. Over the past several months we have seen the resurgence of PC sales. Talk of the PC dying and being taken over by tablets and smartphones are over.”
Tomi Engdahl says:
Intel Confirms Rocket Lake on Desktop for Q1 2021, with PCIe 4.0
by Dr. Ian Cutress on October 7, 2020 12:00 PM EST
https://www.anandtech.com/show/16145/intel-confirms-rocket-lake-on-desktop-for-q1-2021-with-pcie-40
Tomi Engdahl says:
From
https://www.facebook.com/126000117413375/posts/3918719971474685/
//70s hipster programming languages:
Fortran
Cobol
ALGOL
Assembly
Basic
APL
Lisp
Pascal
//Today’s hipster programming languages:
Python
JS
Java
C#
C++
C
R
Swift
Objective C
Kotlin
Will update this list in another 50 years
Tomi Engdahl says:
Russian Company Tapes Out 16-Core Elbrus CPU: 2.0 GHz, 16 TB of RAM in 4-Way System
By Anton Shilov a day ago
https://www.tomshardware.com/news/russian-company-tapes-out-16-core-elbrus-cpu-20-ghz-16-tb-of-ram-in-4-way-system
MCST’s Elbrus-16C contains 12 billion transistors and hits 750 GFLOPS FP64
MCST, a microprocessor developer from Russia, has demonstrated the first engineering sample of its 16-core Elbrus-16C CPU. The processor is an evolution of MCST’s proprietary VLIW architecture that adds features like virtualization. The Elbrus-16C is designed primarily for desktops and servers that have to comply with Russia’s governmental requirements for security and reliability.
The system-on-chip packs 16 cores running at 2 GHz, has an eight-channel DDR4 memory controller, supports 32 PCIe Gen 3 lanes, four SATA 3.0 ports, and integrated 2.5 GbE as well as 10 GbE interfaces. The CPU can address up to 4TB of DDR4 memory, the same capacity as AMD’s EPYC 7002-series processors, but the developer does not disclose which memory modules — RDIMMs or LRDIMMs — it uses. The CPU has a 110W TDP.
As far as performance is concerned, the manufacturer says that its 16-core processor can offer 1.5 FP32 TFLOPS as well as 0.75 FP64 TFLOPS.
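Dividing the quoted peak numbers by the core count and clock gives the implied per-core width of the VLIW design; this is simple arithmetic on the figures above, nothing more:

    # Implied per-core throughput from the quoted figures:
    cores, ghz = 16, 2.0
    fp64_tflops, fp32_tflops = 0.75, 1.5
    cycles_per_s = cores * ghz * 1e9
    print(fp64_tflops * 1e12 / cycles_per_s)   # ~23 FP64 FLOPs per core per cycle
    print(fp32_tflops * 1e12 / cycles_per_s)   # ~47 FP32 FLOPs per core per cycle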
Virtualization and 4-way SMP support will allow Russian server makers to build cloud servers based on the MCST Elbrus-16C parts over time when there are appropriate operating systems available.
So far, MCST has managed to run its Elbrus Linux operating system on a prototype Elbrus-16C-based server. The company will evaluate samples of the CPU in the coming quarters and expects the chip to be ready for mass production by late 2021.
Tomi Engdahl says:
Huawei’s 24-Core 7nm Kunpeng CPU Allegedly Beats Core i9-9900K In Multi-Core Performance
https://www.tomshardware.com/news/huaweis-24-core-7nm-kunpeng-920-cpu-allegedly-outmatches-core-i9-9900k-in-multi-core-performance
Chinese news outlet IThome received word that Huawei is on the brink of launching the brand’s new desktop PC (internally known as Pangu) for the domestic market. The system utilizes a variant of the company’s Kunpeng 920, which is also known as the Hi1620. The report claims that the Kunpeng 920 3211K’s multi-core performance is slightly better than that of the Intel Core i9-9900K Coffee Lake processor.
The Kunpeng 920, which is based on Arm’s Neoverse N1 (codename Ares) microarchitecture, boasts core configurations that span from 24 up to 64 cores, running between 2.4 GHz and 3 GHz. TSMC used to produce the Kunpeng 920 for Huawei on its 7nm process node before cutting off all ties with the Chinese tech giant due to new U.S. regulations.
Tomi Engdahl says:
Linus Torvalds Tosses Intel CPU Aside To Make Way For a Ryzen Threadripper 3970X
By Zhiye Liu May 25, 2020
https://www.tomshardware.com/news/linus-torvalds-tosses-intel-cpu-aside-ryzen-threadripper-3970x
Tomi Engdahl says:
While technology provided the possibility for us to work, study, or shop remotely during the quarantine, this fast digital leap also had human rights implications.
COVID-19 tracing: technology is not the savior. Or is it?
https://cybernews.com/privacy/covid-19-tracing-technology-is-not-the-savior-or-is-it/?utm_source=facebook&utm_medium=cpc&utm_campaign=rm&utm_content=covid_19_tracing
Some governments used the health crisis to cement their power and limit human rights both online and offline, argue human rights watchdogs. Meanwhile, companies, such as Google, brag about helping combat the pandemic while protecting people’s privacy.
“Past 6-7 months have proven that people don’t have any reason to trust most governments, that at (…) worst have used the crisis to centralize and cement their power, and limit human rights online and offline,” European Policy Manager at Access Now Fanny Hidvégi said in a United Nations discussion about protecting human rights during the pandemic.
Technologies opened various opportunities for people during the quarantine. Children were able to study, while their parents could work remotely.
“Technologies facilitated access to culture at a time when all the cultural establishments were closed,” he said.
Tracing apps have been at the heart of a very heated debate around the world. Concerns emerged regarding potential misuse and potential data privacy breaches, Silvio Gonzato said.
Countries like France, Finland, or Germany, developed contact tracing apps, and the European Union developed a toolbox and continues to update technical guidance.
“Tracing apps must be voluntary, secure, and interoperable, and respect privacy. Apps should avoid the identification of users and should not use the geolocation. All the applications must be temporary only, and will have to be dismantled as soon as the pandemic is over, and should retain data only for the minimum period of time,” he explained.
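To see how an app can detect contacts without identifying users or using geolocation, here is a minimal sketch of rotating pseudonymous identifiers in the spirit of the decentralized designs several European apps use; it is an illustration only, not the actual GAEN/DP-3T protocol, and the names are made up:

    # Minimal illustration of privacy-preserving contact IDs: a device keeps a
    # random daily key and broadcasts short-lived pseudonymous IDs derived from
    # it. No identity, no location; keys can be deleted after the retention window.
    import hmac, hashlib, os

    daily_key = os.urandom(32)                      # rotated daily, kept on-device

    def ephemeral_id(daily_key: bytes, interval: int) -> bytes:
        """Pseudonymous ID for one ~15-minute broadcast interval."""
        msg = interval.to_bytes(4, "big")
        return hmac.new(daily_key, msg, hashlib.sha256).digest()[:16]

    # A phone that later tests positive uploads only its daily keys; other phones
    # re-derive the ephemeral IDs locally and compare against what they heard.
    heard = {ephemeral_id(daily_key, 42)}           # simulated nearby broadcast
    exposed = any(ephemeral_id(daily_key, i) in heard for i in range(96))
    print("exposure detected:", exposed)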
Tomi Engdahl says:
#LegacyIT systems are everywhere—power grids, water treatment plants, telephone exchanges, and air traffic control, and so much more—and they need help. In Radio Spectrum’s latest podcast, IT expert Bob Charette explains the problems of legacy code—and legacy data @RadioSpectrum1
The Problem of Old Code and Older Coders
Legacy IT systems are everywhere—and they need help
https://spectrum.ieee.org/podcast/computing/it/the-problem-of-old-code-and-older-coders
The coronavirus pandemic has exposed any number of weaknesses in our technologies, business models, medical systems, media, and more.
Water treatment plants, telephone exchanges, power grids, and air traffic control are just a few of the systems controlled by antiquated code.
In 2005, Bob Charette wrote a seminal article, entitled “Why Software Fails.” Now, fifteen years later, he strikes a similar nerve with another cover story that shines a light at the vast and largely hidden problem of legacy IT. Bob is a 30-year veteran of IT consulting, a fellow IEEE Spectrum contributing editor, and I’m happy to say my good friend as well as my guest today. He joins us by Skype.
Why Software Fails
https://spectrum.ieee.org/computing/software/why-software-fails
Steven Cherry Bob, the numbers are staggering. In the past 10 years, at least $2.5 trillion has been spent trying to replace legacy IT systems, of which some seven hundred and twenty billion dollars was utterly wasted on failed replacement efforts. And that’s before the last of the COBOL generation retires. Just how big a problem is this?
Bob Charette That’s a really good question. The size of the problem really is unknown. We have no clear count of the number of systems that are legacy in government, where we should be able to have a pretty good idea. We have really no insight into what’s happening in industry. The only thing that we do know is that we’re spending trillions of dollars annually in terms of operations and maintenance of these systems, and as you mentioned, we’re spending hundreds of billions per year in trying to modernize them, with large numbers failing. This is one of the things that, when I was doing the research and you try to find some authoritative number, there just isn’t any there at all.
Some of that is record-keeping problems, some of that is secrecy, especially on the corporate side. A little bit of that might be definitional.
Does everybody agree on what legacy IT is? What counts as legacy?
Bob Charette No. And that’s another problem. What happens is there’s different definitions in different organizations and in different government agencies, even in the US government. And no one has a standard definition. About the closest that we come to is that it’s a system that does not meet the business need for some reason. Now, I want to make it very clear: The definition doesn’t say that it has to be old, or past a certain point in time. Nor does it mean that it’s COBOL. There are systems that have been built and are less than 10 years old that are considered legacy because they no longer meet the business need. So the idea is that there’s lots of reasons why it may not meet the business needs—there may be obsolescent hardware, the software may not be usable or feasible to be improved. There may be bugs in the system that just can’t be fixed at any reasonable cost. So there’s a lot of reasons why a system may be termed legacy, but there’s really no general definition that everybody agrees with.
Do we keep building new systems, seemingly without a second thought that we’re going to have to maintain them?
Bob Charette Yes, and for good reason. When we build a system and it actually works, it works usually for a fairly long time. There’s kind of an irony and a paradox. The irony is that the longer these systems live, the harder they are to replace. Paradoxically, because they’re so important, they also don’t receive any attention in terms of spend. Typically, for every dollar that’s spent on developing a system, there’s somewhere between eight and 10 dollars that’s being spent to maintain it over its life. But very few systems actually are retired before their time. Almost every system that I know of, of any size tends to last a lot longer than what the designers ever intended.
Tomi Engdahl says:
https://spectrum.ieee.org/riskfactor/computing/it/it-failures-2018-all-the-old-familiar-faces
Tomi Engdahl says:
Definitely not Windows 95: What operating systems keep things running in space?
The updates don’t come every spring and fall, but space operating systems keep evolving.
https://arstechnica.com/features/2020/10/the-space-operating-systems-booting-up-where-no-one-has-gone-before/?utm_medium=social&utm_social-type=owned&utm_source=facebook&utm_brand=ars
Tomi Engdahl says:
Why AMD is acquiring #FPGA powerhouse Xilinx, Inc. #DataCenters #5G #AI
Why AMD is acquiring FPGA powerhouse Xilinx
https://www.edn.com/why-amd-is-acquiring-the-fpga-powerhouse-xilinx/?utm_content=bufferc97f6&utm_medium=social&utm_source=edn_facebook&utm_campaign=buffer
After Intel bought Altera, the reports about AMD in talks to acquire Xilinx bring an end to the long reign of what many semiconductor industry watchers called the Coke and Pepsi era of the FPGA world. Is it about AI workloads in the data center? Or is it about Xilinx’s massive push into the automotive and 5G networking worlds?
Industry observers will debate the rationale behind the acquisition in the coming days, but it seems to be a no-brainer at the outset. AMD, Intel’s alter ego in the x86 processor realm, appears to be responding to Intel’s acquisition of Altera in 2015. While Altera’s purchase made Xilinx the king of the FPGA world, many wondered for how long? Now The Wall Street Journal reports that AMD is in talks with Xilinx for a whopping $30 billion acquisition deal.
AMD Is in Advanced Talks to Buy Xilinx
https://www.wsj.com/articles/amd-is-in-advanced-talks-to-buy-xilinx-11602205553
Deal that could be worth more than $30 billion would mark the latest big tie-up in the rapidly consolidating industry
Tomi Engdahl says:
There is an agreement under which NVIDIA will acquire Arm Limited from SBG and the SoftBank Vision Fund (together, “SoftBank”) in a transaction valued at $40 billion.
They will soon go up or down together unless some big regulators do something nasty.
https://nvidianews.nvidia.com/news/nvidia-to-acquire-arm-for-40-billion-creating-worlds-premier-computing-company-for-the-age-of-ai
Tomi Engdahl says:
Gartner Says Worldwide PC Shipments Grew 3.6% in Third Quarter of 2020
Ongoing Consumer Demand for Home Entertainment and Distance Learning Drove Strongest U.S. Market Growth in 10 Years
https://www.gartner.com/en/newsroom/press-releases/2020-10-12-gartner-says-worldwide-pc-shipments-grew-3-point-six-percent-in-the-third-quarter-of-2020
Worldwide PC shipments totaled 71.4 million units in the third quarter of 2020, a 3.6% increase from the third quarter of 2019, according to preliminary results by Gartner, Inc. Consumer demand for PCs due to home entertainment and distance learning needs during the ongoing pandemic, along with the strongest growth the U.S. PC market has seen in 10 years, drove the global market momentum.
“This quarter had the strongest consumer PC demand that Gartner has seen in five years,” said Mikako Kitagawa, research director at Gartner. “The market is no longer being measured in the number of PCs per household; rather, the dynamics have shifted to account for one PC per person. While PC supply chain disruptions tied to the COVID-19 pandemic have been largely resolved, this quarter saw shortages of key components, such as panels, as a result of this high consumer demand.
“The business PC market had a more cautious dynamic this quarter. Businesses have continued to buy PCs for remote work, but the focus has shifted from urgent device procurement towards cost optimization. However, enterprise spending remained strong where government funding for distance learning and remote work has fueled device purchases, such as in the U.S. and Japan.”