Computers and component trends 2020

Prediction articles:

2020: A consumer electronics forecast for the year(s) ahead

AI Chips: What Will 2020 Bring?

CEO Outlook: 2020 Vision: 5G, China and AI are prominent, but big changes are coming everywhere

Top 10 Tech Failures From 2019 That Hint At 2020 Trends – Last year’s tech failures often turn into next year’s leading trends

Trends:

AMD’s 7nm Ryzen 4000 CPUs are here to take on Intel’s 10nm Ice Lake laptop chips

Top 9 challenges IT leaders will face in 2020: From skills shortages to privacy concerns

Linux in 2020: 27.8 million lines of code in the kernel, 1.3 million in systemd
Systemd? It’s the proper technical solution, says kernel maintainer

Hero programmers do exist, do all the work, do chat a lot – and do need love and attention from project leaders

From the oil rig to the lake: a shift in perspective on data

In December 2020, the new IEC/EN 62368-1 will replace the existing safety standards EN 60950-1 and EN 60065

Use of technology money outside the company IT department is the new normal

Tech to try:

12 Alternative Operating Systems You Can Use In 2020

CONTINUOUS INTEGRATION: WHAT IT IS AND WHY YOU NEED IT

Research:

Universal memory coming? A new type of non-volatile general-purpose memory is under research; some call it UltraRAM.

1,318 Comments

  1. Tomi Engdahl says:

    TSMC to build 2nm fab in Hsinchu
    Monica Chen, Hsinchu; Rodney Chan, DIGITIMES
    Tuesday 25 August 2020
    https://www.digitimes.com/news/a20200825PD210.html

    TSMC plans to build its 2nm wafer fab in Hsinchu, where land has already been obtained for the facility, according to YP Chin, senior vice president for operations at the foundry house.

    Reply
  2. Tomi Engdahl says:

    Moore’s Law Enters The 4th Dimension
    There’s no shortage of demand for more compute power, but the underlying assumptions have to change.
    https://semiengineering.com/moores-law-enters-the-4th-dimension/

    Put simply, if costs cannot be reduced every couple of years, the economic model needs to be spread out over longer periods of time. Making chips that are faster and more customized costs more money, but if the lifecycle of those chips can be extended, then the basic economics don’t change. This isn’t exactly Moore’s Law as it was written, but it’s an interesting market adaptation of the basic concept. Why be confined to just three dimensions when you can add a fourth?
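
    A rough way to see that “fourth dimension” (symbols are illustrative, mine rather than the article’s) is to amortize a chip’s total cost over its service life:

    \[ C_{\mathrm{eff}} = \frac{C_{\mathrm{design}} + C_{\mathrm{manufacture}}}{T_{\mathrm{service}}} \]

    If the lifecycle T_service is extended in step with rising design and manufacturing costs, the effective cost per year of compute stays flat even though the per-chip price goes up.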

    Reply
  3. Tomi Engdahl says:

    China’s semiconductor drive stalls in Wuhan, exposing gap in hi-tech production capabilities
    https://www.scmp.com/economy/china-economy/article/3099100/chinas-semiconductor-drive-stalls-wuhan-exposing-gap-hi-tech?utm_source=Facebook&utm_medium=share_widget&utm_campaign=3099100

    Construction on a US$20 billion state-of-the-art semiconductor manufacturing plant in Wuhan has stalled due to a lack of funding
    It’s the latest example of a Chinese chip factory hitting funding problems, underlining how far the nation has to go to boost production capabilities

    Reply
  4. Tomi Engdahl says:

    Then: It works on my system.

    Now: It works on my Linux container, err I mean my Kubernetes cluster.

    There are a few things in a programmer’s life that remain the same.

    Reply
  5. Tomi Engdahl says:

    U.S. Tech Stocks Are Now Worth More Than $9 Trillion, Eclipsing The Entire European Stock Market
    https://www.forbes.com/sites/sergeiklebnikov/2020/08/28/us-tech-stocks-are-now-worth-more-than-9-trillion-eclipsing-the-entire-european-stock-market/?utm_campaign=forbes&utm_source=facebook&utm_medium=social&utm_term=Gordie/#676f7264696

    U.S. tech stocks have been pushing the stock market to record highs, and the sector has now become more valuable than the entire European stock market for the first time in history, according to the latest research from Bank of America.

    Reply
  6. Tomi Engdahl says:

    The IT systems that governments and businesses depend on are often decades old and prone to failure. Why do they cost trillions to maintain but are so hard to modernize?

    Inside the Hidden World of Legacy IT Systems
    https://spectrum.ieee.org/computing/it/inside-hidden-world-legacy-it-systems

    “Fix the damn unemployment system!”

    This past spring, tens of millions of Americans lost their jobs due to lockdowns aimed at slowing the spread of the SARS-CoV-2 virus. And untold numbers of the newly jobless waited weeks for their unemployment benefit claims to be processed, while others anxiously watched their bank accounts for an extra US $600 weekly payment from the federal government.

    Delays in processing unemployment claims in 19 states—Alaska, Arizona, Colorado, Connecticut, Hawaii, Iowa, Kansas, Kentucky, New Jersey, New York, Ohio, Oklahoma, Oregon, Pennsylvania, Rhode Island, Texas, Vermont, Virginia, and Wisconsin—are attributed to problems with antiquated and incompatible state and federal unemployment IT systems. Most of those systems date from the 1980s, and some go back even further.

    Things were so bad in New Jersey that Governor Phil Murphy pleaded in a press conference for volunteer COBOL programmers to step up to fix the state’s Disability Automated Benefits System.

    Similar problems have emerged at the federal level.

    The pandemic has acted as a powerful outgoing tide that has exposed government’s dependence on aging legacy IT systems.

    But governments aren’t the only ones struggling under the weight of antiquated IT. It is equally easy to find airlines, banks, insurance companies, and other commercial entities that continue to rely on old IT, contending with software or hardware that is no longer supported by the supplier or has defects that are too costly to repair. These systems are prone to outages and errors, vulnerable to cyberintrusions, and progressively more expensive and difficult to maintain.

    Since 2010, corporations and governments worldwide have spent an estimated $35 trillion on IT products and services. Of this amount, about three-quarters went toward operating and maintaining existing IT systems. And at least $2.5 trillion was spent on trying to replace legacy IT systems, of which some $720 billion was wasted on failed replacement efforts.

    But it’s astonishing how seldom people notice these IT systems, even with companies and public institutions spending hundreds of billions of dollars every year on them.

    Infrastructure like wastewater treatment plants, power grids, air traffic control, telecommunications services, and government administration depends on hundreds of thousands of unseen IT systems that form another, hidden infrastructure. Commercial organizations rely on IT systems to manage payroll, order supplies, and approve cashless sales, to name but three of thousands of automated tasks necessary to the smooth functioning of a modern economy. Though these systems run practically every aspect of our lives, we don’t give them a second thought because, for the most part, they function. It doesn’t even occur to us that IT is something that needs constant attention to be kept in working order.

    Indeed, the very invisibility of legacy IT is a kind of testament to how successful these systems are. Except, of course, when they’re not.

    There’s no formal definition of “legacy system,” but it’s commonly understood to mean a critical system that is out of date in some way.

    To modernize a computing system or not is a question that bedevils nearly every organization. Given the many problems caused by legacy IT systems, you’d think that modernization would be a no-brainer. But that decision isn’t nearly as straightforward as it appears. Some legacy IT systems end up that way because they work just fine over a long period. Others stagger along because the organization either doesn’t want to or can’t afford to take on the cost and risk associated with modernization.

    Obviously, a legacy system that’s critical to day-to-day operations cannot be replaced or enhanced without major disruption. And so even though that system contributes mightily to the organization’s operations, management tends to ignore it and defer modernization. On most days, nothing goes catastrophically wrong, and so the legacy system remains in place.

    This “kick the can” approach is understandable. Most IT systems, whether new or modernized, are expensive affairs that go live late and over budget, assuming they don’t fail partially or completely. These situations are not career-enhancing experiences, as many former chief information officers and program managers can attest. Therefore, once an IT system is finally operating reliably, there’s little motivation to plan for its eventual retirement.

    What management does demand, however, is for any new IT system to provide a return on investment and to cost as little as possible for as long as possible. Such demands often lead to years of underinvestment in routine maintenance. Of course, those same executives who approved the investment in the new system probably won’t be with the organization a decade later, when that system has legacy status.

    Similarly, the developers of the system, who understand in detail how it operates and what its limitations are, may well have moved on to other projects or organizations. For especially long-lived IT systems, most of the developers have likely retired.

    IT systems quietly age into legacy status.

    Millions of people every month experience the frustrations and inconveniences of decrepit legacy IT.

    U.K. bank customers know this frustration only too well.

    Over the past several years, U.S. air carriers have experienced on average nearly one IT-related outage per month, many of them attributable to legacy IT. Some have lasted days and caused the delay or cancellation of thousands of flights.

    Poorly maintained legacy IT systems are also prone to cybersecurity breaches. At the credit reporting agency Equifax, the complexity of its legacy systems contributed to a failure to patch a critical vulnerability in the company’s Automated Consumer Interview System, a custom-built portal developed in the 1970s to handle consumer disputes. This failure led, in 2017, to the loss of 146 million individuals’ sensitive personal information.

    Take COBOL, a programming language that dates to 1959. Computer science departments stopped teaching COBOL some decades ago. And yet the U.S. Social Security Administration reportedly still runs some 60 million lines of COBOL. The IRS has nearly as much COBOL programming, along with 20 million lines of assembly code. And, according to a 2016 GAO report, the departments of Commerce, Defense, Treasury, Health and Human Services, and Veterans Affairs are still “using 1980s and 1990s Microsoft operating systems that stopped being supported by the vendor more than a decade ago.”

    Given the vast amount of outdated software that’s still in use, the cost of maintaining it will likely keep climbing not only for government, but for commercial organizations, too.

    The first step in fixing a massive problem is to admit you have one.

    Part of the modernization push by governments in the United States and abroad has been to provide more effective administrative controls, increase the reliability and speed of delivering benefits, and improve customer service. In the commercial sector, by contrast, IT modernization is being driven more by competitive pressures and the availability of newer computing technologies like cloud computing and machine learning.

    “Everyone understands now that IT drives organization innovation,”

    Companies saddled with legacy IT systems won’t be able to compete on the expected rapid delivery of improved features or customer service, and therefore “are going to find themselves forced into a box canyon, unable to get out,” Salvaggio says.

    This is already happening in the banking industry. Existing firms are having a difficult time competing with new businesses that are spending most of their IT budgets on creating new offerings instead of supporting legacy systems.

    Modernization creates its own problems. Take the migration of legacy data to a new system. When TSB moved to its new IT platform in 2018, some 1.9 million online and mobile customers discovered they were locked out of their accounts for nearly two weeks. And modernizing one legacy system often means having to upgrade other interconnecting systems, which may also be legacy. At the IRS, for instance, the original master tax file systems installed in the 1960s have become buried under layers of more modern, interconnected systems, each of which made it harder to replace the preceding system. The agency has been trying to modernize its interconnected legacy tax systems since 1968 at a cumulative cost of at least $20 billion in today’s money, so far with very little success. It plans to spend up to another $2.7 billion on modernization over the next five years.

    Massive duplication and data silos sound ridiculous, but they are shockingly common. Here’s one way it often happens: The government issues a new mandate that includes a requirement for some type of automation, and the policy comes with fresh funding to implement it. Rather than upgrade an existing system, which would be disruptive, the department or agency finds it easier to just create a new IT system, even if some or most of the new system duplicates what the existing system is doing. The result is that different units within the same organization end up deploying IT systems with overlapping functions.

    “The shortage of thinking about systems engineering” along with the lack of coordinating IT developments to avoid duplication have long plagued government and corporations alike, Salvaggio says.

    The best way to deal with legacy IT is to never let IT become legacy. Growing recognition of legacy IT systems’ many costs has sparked a rethinking of the role of software maintenance.

    Currently, software development, operations, and support are considered separate activities. But if you fuse those activities into a single integrated activity—employing what is called DevOps—the operational system is then always “under development,” continuously and incrementally being improved, tested, and deployed, sometimes many times a day.

    DevOps is just one way to keep core IT systems from turning into legacy systems.

    Since 2015, DARPA has funded research aimed at making software that will be viable for more than 100 years. The Building Resource Adaptive Software Systems (BRASS) program is trying to figure out how to build “long-lived software systems that can dynamically adapt to changes in the resources they depend upon and environments in which they operate,” according to program manager Sandeep Neema.

    Creating such timeless systems will require a “start from scratch” approach to software design that doesn’t make assumptions about how an IT system should be designed, coded, or maintained.

    The goal is to be able to update or upgrade applications without the need for extensive intervention by a human programmer, Neema told Spectrum, thereby “buying down the cost of maintenance.”

    The COVID-19 pandemic has exposed the debilitating consequences of relying on antiquated IT systems for essential services. Unfortunately, that dependence, along with legacy IT’s enormous and increasing costs, will still be with us long after the pandemic has ended.

    The problems associated with legacy systems will only worsen as the Internet of Things, with its billions of interconnected computing devices, matures. These devices are already being connected to legacy IT, which will make it even more difficult to replace and modernize those systems. And eventually the IoT devices will become legacy. Just as with legacy systems today, those devices likely won’t be replaced as long as they continue to work, even if they are no longer supported. The potential cybersecurity risk of vast numbers of obsolete but still operating IoT devices is a huge unknown. Already, many IoT devices have been deployed without basic cybersecurity built into them…

    Now imagine a not-too-distant future where hundreds of millions or even billions of legacy IoT devices are deeply embedded into government and commercial offices, schools, hospitals, factories, homes, and even people. Further imagine that their cybersecurity or technical flaws are not being fixed and remain connected to legacy IT systems that themselves are barely supported. In such a world, the pervasive dependence upon increasing numbers of interconnected, obsolete systems will have created something far grimmer and murkier than the historian David Edgerton’s twilight world.

    Reply
  7. Tomi Engdahl says:

    It has to do with scheduled drive maintenance.

    A bug in Windows 10 could be slowly wrecking your SSD
    By Paul Lilly
    Fortunately a fix is on the way.
    https://www.pcgamer.com/windows-10-bug-wrecking-ssd/?utm_content=bufferb1fea&utm_medium=social&utm_source=facebook&utm_campaign=buffer_pcgamerfb

    Microsoft is currently testing a fix for a Windows 10 bug that could cause the operating system to defragment solid state drives (SSDs) more often than is needed. While periodic defragging of a mechanical hard disk drive (HDD) is a good thing, doing it too often on SSDs can actually degrade their integrity and shorten their lifespan.

    As spotted by Bleeping Computer, when Microsoft rolled out the May 2020 update for Windows 10, it introduced a bug in the Optimize Drives feature that causes it to incorrectly determine the last time a drive was optimized. When you open it up, you might notice your SSD says “Needs optimization” even if the routine was recently run (Windows 10 handles this automatically).

    What ends up happening is that Windows 10 defrags your SSD each time you reboot your system.

    According to our friends at TechRadar, Windows 10 is usually able to discern whether to defrag or run a harmless TRIM process on a drive, depending on its type. But if volume snapshots are enabled (so you can revert to a backup using System Restore), it will in fact defrag the drive even if it is an SSD.
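
    As a minimal sketch of how such a drive-type check can work on Windows (assuming drive 0 and omitting error handling; this is not Microsoft’s actual Optimize Drives code), a program can ask the storage stack whether a drive incurs a seek penalty. No seek penalty implies flash, so TRIM rather than defragment:

    // C++ sketch; compile against the Windows SDK
    #include <windows.h>
    #include <winioctl.h>
    #include <cstdio>

    int main() {
        // Open the raw device; an access mask of 0 suffices for property queries.
        HANDLE h = CreateFileW(L"\\\\.\\PhysicalDrive0", 0,
                               FILE_SHARE_READ | FILE_SHARE_WRITE,
                               nullptr, OPEN_EXISTING, 0, nullptr);
        if (h == INVALID_HANDLE_VALUE) return 1;

        STORAGE_PROPERTY_QUERY q = {};
        q.PropertyId = StorageDeviceSeekPenaltyProperty;
        q.QueryType  = PropertyStandardQuery;

        DEVICE_SEEK_PENALTY_DESCRIPTOR d = {};
        DWORD bytes = 0;
        if (DeviceIoControl(h, IOCTL_STORAGE_QUERY_PROPERTY,
                            &q, sizeof(q), &d, sizeof(d), &bytes, nullptr)) {
            // Rotational media incurs a seek penalty; flash does not.
            std::printf(d.IncursSeekPenalty ? "HDD: defragmenting is fine\n"
                                            : "SSD: TRIM, don't defragment\n");
        }
        CloseHandle(h);
        return 0;
    }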

    Regardless, Microsoft has a fix in place, which has been implemented in the Windows Insider program.

    Reply
  8. Tomi Engdahl says:

    Will automation eliminate data science positions?
    https://techcrunch.com/2020/08/27/will-automation-eliminate-data-science-positions/

    “Will automation eliminate data science positions?”

    This is a question I’m asked at almost every conference I attend, and it usually comes from someone from one of two groups with a vested interest in the answer: The first is current or aspiring practitioners who are wondering about their future employment prospects. The second consists of executives and managers who are just starting on their data science journey.

    Understanding the business problem is the biggest challenge
    The most important question in data science is not which machine learning algorithm to choose or even how to clean your data. It is the questions you need to ask before even one line of code is written: What data do you choose and what questions do you choose to ask of that data?

    What is missing (or wishfully assumed) from the popular imagination is the ingenuity, creativity and business understanding that goes into those tasks.

    Making your assumptions
    After formulating a data science question, data scientists need to outline their assumptions. This often manifests itself in the form of data munging, data cleaning and feature engineering. Real-world data are notoriously dirty and many assumptions have to be made to bridge the gap between the data we have and the business or policy questions we are seeking to address. These assumptions are also highly dependent on real-world knowledge and business context.

    A historical analogy
    There is a clear precedent in history to suggest data science will not be automated away. There is another field where highly trained humans are crafting code to make computers perform amazing feats. These humans are paid a significant premium over others who are not trained in this field and (perhaps not surprisingly) there are education programs specializing in training this skill. The resulting economic pressure to automate this field is equally, if not more, intense. This field is software engineering.

    Indeed, as software engineering has become easier, the demand for programmers has only grown. This paradox — that automation increases productivity, driving down prices and ultimately driving up demand — is not new; we’ve seen it again and again in fields ranging from software engineering to financial analysis to accounting. Data science is no exception, and automation will likely drive up demand for this skillset, not down.

    Reply
  9. Tomi Engdahl says:

    Lightmatter’s Mars SoC Bends Light to Process Data
    By Paul Alcorn
    The era of laser-powered computing draws near
    https://www.tomshardware.com/news/lightmatter-mars-soc-bends-light-to-process-data-silicon-photonics

    Reply
  10. Tomi Engdahl says:

    Linux Kernel 5.8 “The Biggest Release of All Time” is Finally Available Now
    https://itsfoss.com/kernel-5-8-release/

    Reply
  11. Tomi Engdahl says:

    MIT Develops Integrated Lightwave Electronic Circuits
    https://scitechdaily.com/mit-develops-integrated-lightwave-electronic-circuits/

    MIT researchers develop integrated lightwave electronic circuits to detect the phase of ultrafast optical fields.

    Reply
  12. Tomi Engdahl says:

    Systems design for advanced beginners
    https://robertheaton.com/2020/04/06/systems-design-for-advanced-beginners/

    You’ve started yet another company with your good friend, Steve Steveington. It’s an online marketplace where people can buy and sell things and where no one asks too many questions. It’s basically a rip-off of Craigslist, but with Steve’s name instead of Craig’s.

    Reply
  13. Tomi Engdahl says:

    Relying on plain-text email is a ‘barrier to entry’ for kernel development, says Linux Foundation board member
    Microsoft’s ‘open source wonk’ Sarah Novotny wants to see easier ways for people to get involved
    https://www.theregister.com/2020/08/25/linux_kernel_email/

    Reply
  14. Tomi Engdahl says:

    WHY EPIC CAN’T AFFORD TO LOSE THE UNREAL ENGINE IN ITS LEGAL FIGHT WITH APPLE
    Apple and Epic’s Fortnite fight has spiraled into something much more
    https://www.theverge.com/2020/8/26/21402443/epic-fortnite-apple-unreal-engine-ios-game-developers-lawsuit

    Reply
  15. Tomi Engdahl says:

    Can A “Polite Font” Stop Cyberbullying And Restore Civility?
    https://www.iflscience.com/brain/can-a-polite-font-stop-cyberbullying-and-restore-civility/

    Finnish software company tietoEVRY has created what it hopes is a helpful tool: a font that smooths over the worst abuses.

    The secret to tietoEVRY’s project, known as The Polite Type, is not the soothing shape of its letters, although the designers hope people will appreciate those as well. Rather, it works like a writing-assistance program, turning abusive messages into something more gentle. Anyone using The Polite Type to write “I hate you” will find their words changed to “I disagree with you.” Similarly, words associated with hate speech such as racism or sexism will be swapped with a less loaded alternative or blurred out where none is suitable.

    https://www.thepolitetype.com/
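
    The real product does the swapping inside the font itself; the following only mimics the visible effect, and the replacement table is hypothetical apart from the “I hate you” example quoted above:

    // C++17 sketch of phrase substitution, imitating The Polite Type's effect
    #include <iostream>
    #include <map>
    #include <string>

    int main() {
        const std::map<std::string, std::string> softer = {
            {"I hate you", "I disagree with you"},  // example from the article
            {"stupid", "silly"},                    // hypothetical entry
        };
        std::string msg = "I hate you";
        // Replace the first occurrence of each rude phrase with its gentler form.
        for (const auto& [rude, polite] : softer) {
            if (auto pos = msg.find(rude); pos != std::string::npos)
                msg.replace(pos, rude.size(), polite);
        }
        std::cout << msg << '\n';  // prints: I disagree with you
    }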

    Reply
  16. Tomi Engdahl says:

    TSMC is so ahead of the game, Samsung might not catch up until 2030
    TSMC has nailed 7nm, prepping 6nm, has 5nm and 3nm designs on their way — and is already heading into 2nm, and beyond…

    Read more: https://www.tweaktown.com/news/74768/tsmc-is-so-ahead-of-the-game-samsung-might-not-catch-up-until-2030/index.html?utm_source=dlvr.it&utm_medium=facebook

    Reply
  17. Tomi Engdahl says:

    TSMC on 3nm: 3x silicon density over 7nm, 51% less power, 32% faster
    TSMC begins detailing its next-gen 3nm node, mass production in 2H 2022 — with 3.06x increase in silicon density over 7nm.

    Read more: https://www.tweaktown.com/news/74691/tsmc-on-3nm-3x-silicon-density-over-7nm-51-less-power-32-faster/index.html

    Reply
  18. Tomi Engdahl says:

    What the PowerPC to Intel transition can tell us about Apple Silicon release dates and which Macs will come first – or last.
    https://appleinsider.com/articles/20/08/28/what-the-power-pc-to-intel-transition-tells-us-about-apple-silicon-release-dates

    The past is another country, and the old Apple of a decade and a half ago has long since been replaced by the behemoth it’s become. Yet the decisions Apple made over Intel in 2005 are being repeated now — and they give us a guide to what we’ll get and when we’ll see Apple Silicon Macs.

    Not including entire platform migrations, Apple has gone through a major processor transition in the Mac twice before, but the company in 2020 is barely recognizable from what it was in 1994 or 2005.

    Reply
  19. Tomi Engdahl says:

    Applied Materials Signals It Has Weathered the Worst of the Virus
    https://www.electronicdesign.com/technologies/embedded-revolution/article/21140541/applied-materials-signals-it-has-weathered-the-worst-of-the-virus?utm_source=EG+ED+Analog+%26+Power+Source&utm_medium=email&utm_campaign=CPS200828017&o_eid=7211D2691390C9R&rdx.ident%5Bpull%5D=omeda%7C7211D2691390C9R&oly_enc_id=7211D2691390C9R

    Applied Materials is the world’s largest maker of manufacturing tools for ICs. The company’s results serve as a barometer for demand in the global chip business, which has been dented by supply chain delays and other fallout from Covid-19.

    Reply
  20. Tomi Engdahl says:

    Understanding Advanced Packaging Technologies And Their Impact On The Next Generation Of Electronics
    https://semiengineering.com/understanding-advanced-packaging-technologies-and-their-impact-on-the-next-generation-of-electronics/
    A primer on wafer-level packaging and the technologies used alongside it.

    Reply
  21. Tomi Engdahl says:

    New Architectures, Much Faster Chips
    Massive innovation to drive orders of magnitude improvements in performance.
    https://semiengineering.com/new-architectures-much-faster-chips/

    Reply
  22. Tomi Engdahl says:

    Battery-Free Low-Cost Printable Flexible Circuits Turn Paper, Card Into Interactive Gadgets
    https://www.hackster.io/news/battery-free-low-cost-printable-flexible-circuits-turn-paper-card-into-interactive-gadgets-5b5ffd259cb4

    Printed directly onto paper or card, these sub-$0.25 circuits can withstand moisture, flexing, and harvest energy from a finger-press.

    Reply
  23. Tomi Engdahl says:

    IFTLE 460: On Lost Years and Gang Bonding for Multi-die Stacking
    Is 2020 A Lost Year?
    https://www.3dincites.com/2020/08/iftle-460-on-lost-years-and-gang-bonding-for-multi-die-stacking/

    Reply
  24. Tomi Engdahl says:

    Intel announces its new 11th Gen Tiger Lake CPUs, available on laptops this fall
    The ‘best processor for thin-&-light’ laptops
    https://www.theverge.com/2020/9/2/21408718/intel-11th-gen-tiger-lake-cpu-processor-announcement-laptops-fall?scrolla=5eb6d68b7fedc32c19ef33b4

    Intel has officially announced its first 11th Gen Tiger Lake processors for laptops, which will feature the company’s new integrated Xe graphics, Thunderbolt 4 support, Wi-Fi 6, and a big leap in performance and battery life over the previous Ice Lake chips. The company claims that the new 11th Gen lineup offers the “best processor for thin-&-light” laptops.

    Intel is launching nine new 11th Gen designs for both its U-series (which Intel is now referring to as UP3) and Y-series class chips (aka UP4), led by the Core i7-1185G7, which offers a base speed of 3.0GHz, a maximum single-core turbo boost of up to 4.8GHz, and a maximum all-core boost of up to 4.3GHz. It also features the most powerful version of Intel’s Iris Xe integrated graphics, with 96 EUs and a maximum graphics speed of 1.35GHz.

    Reply
  25. Tomi Engdahl says:

    Tom Warren / The Verge:
    Nvidia announces $699 RTX 3080 GPU, launching on Sept. 17, boasting up to two times the performance of the RTX 2080, as well as a $499 RTX 3070

    Nvidia announces new RTX 3080 GPU, priced at $699 and launching September 17th
    Nvidia promises a PC gaming breakthrough
    https://www.theverge.com/2020/9/1/21409953/nvidia-geforce-rtx-3080-specs-price-release-date-features?scrolla=5eb6d68b7fedc32c19ef33b4

    Reply
  26. Tomi Engdahl says:

    Purdue University researchers’ new technology can transform paper from a notebook into a music player interface and even make food packaging interactive.

    Battery-Free Low-Cost Printable Flexible Circuits Turn Paper, Card Into Interactive Gadgets
    https://www.hackster.io/news/battery-free-low-cost-printable-flexible-circuits-turn-paper-card-into-interactive-gadgets-5b5ffd259cb4

    Printed directly onto paper or card, these sub-$0.25 circuits can withstand moisture, flexing, and harvest energy from a finger-press.

    Reply
  27. Tomi Engdahl says:

    Why cloud costs get out of control: Too much lift and shift, and pricing that is ‘screwy and broken’
    The Reg talks to the experts about how to manage spend
    https://www.theregister.com/2020/09/03/cloud_control_costs/

    Spinning up services on public clouds is dead easy, but what about staying in control of the bill?

    Organisations are “over budget for cloud spend by an average of 23 per cent, and expect cloud spend to increase by 47 per cent next year,” according to the “State of the Cloud 2020” report by Flexera, based on a survey of 750 technical professionals.

    As if that weren’t bad enough, respondents self-estimate that 30 per cent of cloud spend is wasted. COVID-19 has, if anything, made the problem worse, with most respondents saying the pandemic has increased planned cloud usage.

    Adrian Bradley is a CIO Advisory director at KPMG advising customers on cloud cost management. “Our clients find three things going wrong,” he told The Register. “The first is that it costs them more than they anticipated to get to the cloud. Second, when they get to the cloud they find that they’re spending more than they expected, and quite often more than they historically spent. Third, they don’t feel they’re getting the value from that spend.”

    The biggest problem, said Bradley, is that organisations “make a lot of compromises” moving to the cloud because the level of digital transformation needed to get the full benefit is not there.

    Reply
  28. Tomi Engdahl says:

    Intel races to defend US chip leadership from Asian rivals
    Chipmaker pins hopes on 10-nm chipset but is struggling to overcome delays
    https://asia.nikkei.com/Business/Technology/Intel-races-to-defend-US-chip-leadership-from-Asian-rivals

    Intel has released a new line of laptop CPUs that it hopes will drive revenue growth into 2021, but the company is struggling to maintain its — and America’s — leadership in chip manufacturing against Asian rivals.

    Intel is the biggest U.S. chip company by revenue and the only American microprocessor maker that still produces its own advanced designs domestically, but it has struggled with production issues and lengthy delays in recent years.

    The company is set to release its latest chipset, codenamed Tiger Lake, on Wednesday. It will be Intel’s second central processing unit for notebook computers using its 10-nanometer process technology, the most advanced by a U.S. company.

    Intel’s latest laptop CPU platform is faster and much more powerful in terms of computing and graphics processing than the latest offerings this year by smaller U.S. rivals Advanced Micro Devices and Nvidia, said Ksenia Chistyakova, product marketing engineer for AI and Media, at a news conference on Wednesday.

    Reply
  29. Tomi Engdahl says:

    ARM Co-Founder Hermann Hauser: ‘It’s In Nvidia’s Interests To Destroy Arm’
    https://slashdot.org/story/20/09/03/2019200/arm-co-founder-hermann-hauser-its-in-nvidias-interests-to-destroy-arm?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+Slashdot%2Fslashdot%2Fto+%28%28Title%29Slashdot+%28rdf%29%29

    SoftBank is in advanced talks with US chip company Nvidia to sell Arm — with a price in the region of 32 billion euros reportedly being thrown around. But Nvidia’s purchase of the Cambridge-based chip designer would not only strike a blow to the UK’s technological sovereignty, but would result in the destruction of Arm itself, Arm co-founder Hermann Hauser has claimed. Nvidia recently overtook Intel as the most valuable microprocessor company in the world, and its great wealth right now provides it a unique opportunity, says Hauser. “They are the semiconductor company that can buy Arm to destroy it — and it is very much in its interest to destroy Arm because they [would] gain a lot more than the 40 billion that they pay for it,” he claims.

    Hermann Hauser: ‘It’s in Nvidia’s interests to destroy Arm’
    https://tech.newstatesman.com/business/hermann-hauser-nvidia-destroy-arm

    Reply
  30. Tomi Engdahl says:

    This QR code contains a complete Windows executable that will run on Windows 7 and up. It’s a simple but fully playable implementation of Snake.

    Snake in a QR code
    https://itsmattkc.com/etc/snakeqr/
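
    For scale, assuming a standard QR symbol: the largest version, 40, holds at most 2,953 bytes in binary mode at the lowest error-correction level, so the entire executable has to squeeze into under 3 KB.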

    Reply
  31. Tomi Engdahl says:

    Digital pregnancy tests are almost as powerful as the original IBM PC
    A lot of computer to read an old-fashioned pee strip
    https://www.theverge.com/tldr/2020/9/4/21422628/digital-pregnancy-test-teardown-processor-ram-ibm-pc

    Reply
  32. Tomi Engdahl says:

    The first battery-free Game Boy wants to power a gaming revolution
    Solar energy and mechanical triggers power the Engage, a console at the cutting edge of computer engineering.
    https://www.cnet.com/features/the-first-battery-free-game-boy-wants-to-power-a-gaming-revolution/

    Reply
  33. Tomi Engdahl says:

    Guy runs ‘Doom’ on a pregnancy test and wait, what?
    https://mashable.com/article/pregnancy-test-doom/?utm_campaign=trueAnthem%3A+Trending+Content&utm_medium=trueAnthem&utm_source=facebook

    Foone, a programmer who likes reverse-engineering things and typically works with ancient hardware and software, has managed to run a fully functional game of Doom on an electronic pregnancy test.

    Reply
  34. Tomi Engdahl says:

    Programmer Has Made 1993′s Doom Playable on a Pregnancy Test
    It’s a Doomguy!
    https://nordic.ign.com/news/39466/programmer-has-made-1993s-doom-playable-on-a-pregnancy-test

    It’s important to note that Foone replaced the display and microcontroller, so the only part of the original tester that remains is the shell. However, getting Doom running and playable on a 128×32 pixel monochrome display at 1bpp is still an impressive feat.

    https://mobile.twitter.com/Foone/status/1302234777894883329
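
    To put the display constraint in perspective, a 128×32 screen at 1bpp is only 128 × 32 / 8 = 512 bytes of framebuffer. A minimal sketch of pixel addressing on such a buffer (row-major, MSB-first; real OLED controllers often use page-based layouts instead):

    // C++ sketch: 1bpp framebuffer for a 128x32 monochrome display
    #include <cstdint>
    #include <cstddef>

    constexpr int W = 128, H = 32;
    uint8_t fb[W * H / 8];  // 4096 pixels -> 512 bytes

    void set_pixel(int x, int y, bool on) {
        std::size_t bit = std::size_t(y) * W + x;        // linear pixel index
        uint8_t mask = uint8_t(0x80u >> (bit % 8));      // MSB-first within a byte
        if (on) fb[bit / 8] |= mask;
        else    fb[bit / 8] &= ~mask;
    }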

    Reply
  35. Tomi Engdahl says:

    Skyrim is Now on a Pregnancy Test
    A hardware hacker on Twitter manages to port the opening prologue of The Elder Scrolls 5: Skyrim to a pregnancy test in a hilarious viral video.
    https://gamerant.com/skyrim-pregnancy-test/

    To be clear, the version of the game on the stick is actually just a recording of the opening moments of the game, but it’s undeniably an absolute feat to behold. Rendered in dots on the tiny OLED screen fixed onto the pregnancy test, the video is unmistakably the notorious “you’re finally awake” prologue that introduces players to the world of Skyrim. Many will surely agree the whole thing is utterly insane.

    Reply
  36. Tomi Engdahl says:

    The 25 greatest Java apps ever written
    From space exploration to genomics, from reverse compilers to robotic controllers, Java is at the heart of today’s world. Here are a few of the countless Java apps that stand out from the crowd.

    https://blogs.oracle.com/javamagazine/the-top-25-greatest-java-apps-ever-written

    Reply
  37. Tomi Engdahl says:

    No, Kubernetes doesn’t make applications portable, say analysts. Good luck avoiding lock-in, too
    K8s may even make it hard to use the cloud’s best bits
    https://www.theregister.com/2020/09/08/kubernetes_app_portability_problems/

    Do not make application portability your primary driver for adopting Kubernetes, say Gartner analysts Marco Meinardi, Richard Watson and Alan Waite, because while the tool theoretically improves portability, in practice it also locks you in while potentially denying you access to the best bits of the cloud.

    The three advance that theory in a recent “Technical Professional Advice” document that was summarised last week in a blog post.

    Why Adopting Kubernetes for Application Portability Is Not a Good Idea
    https://blogs.gartner.com/marco-meinardi/2020/09/04/adopting-kubernetes-application-portability-not-good-idea/

    Reply
  38. Tomi Engdahl says:

    Classy move: C++ 20 wins final approval in ISO technical ballot, formal publication expected by end of year
    ‘Best approximation of C++ ideals so far,’ says Stroustrup – but is it too big and complex?
    https://www.theregister.com/2020/09/07/c_20_wins_final_approval/

    C++ 20, the latest version of the venerable object-oriented programming language, has been unanimously endorsed in ISO’s final technical approval ballot.

    It awaits a “final round of editorial work,” following which it will be formally published, expected to be “towards the end of 2020.”

    A major new version of C++ comes every three years, named after the year, so C++ 20 is the successor to C++ 17. C++ 20 is a big release, bigger than its predecessors. There are four key new features, often called the “big four”:

    Modules.
    Concepts: “a concept is a compile-time predicate.”
    Ranges library.
    Coroutines: functions that “can suspend execution to be resumed later,” used for asynchronous programming.
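
    A minimal sketch of two of the four in action, concepts and ranges (needs a C++20 compiler, e.g. g++ -std=c++20):

    #include <concepts>
    #include <ranges>
    #include <iostream>
    #include <vector>

    // A concept is a compile-time predicate on template arguments.
    template <typename T>
    concept Numeric = std::integral<T> || std::floating_point<T>;

    // Constrained auto return type; ranges views compose lazily.
    Numeric auto sum_of_evens(const std::vector<int>& v) {
        int total = 0;
        for (int x : v | std::views::filter([](int n) { return n % 2 == 0; }))
            total += x;
        return total;
    }

    int main() {
        std::cout << sum_of_evens({1, 2, 3, 4}) << '\n';  // prints 6
    }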

    Reply
