Computers and component trends 2020

Prediction articles:

2020: A consumer electronics forecast for the year(s) ahead

AI Chips: What Will 2020 Bring?

CEO Outlook: 2020 Vision: 5G, China and AI are prominent, but big changes are coming everywhere

Top 10 Tech Failures From 2019 That Hint At 2020 Trends – Last year’s tech failures often turn into next year’s leading trends

Trends:

AMD’s 7nm Ryzen 4000 CPUs are here to take on Intel’s 10nm Ice Lake laptop chips

Top 9 challenges IT leaders will face in 2020: From skills shortages to privacy concerns

Linux in 2020: 27.8 million lines of code in the kernel, 1.3 million in whole system
Systemd? It’s the proper technical solution, says kernel maintainer

Hero programmers do exist, do all the work, do chat a lot – and do need love and attention from project leaders

From the oil rig to the lake: a shift in perspective on data

In December 2020, the new IEC/EN 62368-1 will replace the existing safety standards EN 60950-1 and EN 60065-1

Use of technology money outside the company IT department is the new normal

Tech to try:

12 Alternative Operating Systems You Can Use In 2020

Continuous Integration: What It Is and Why You Need It

Research:

Universal memory coming? A new type of non-volatile general-purpose memory is under research; some call it UltraRAM.

Comments:

  1. Tomi Engdahl says:

    IFTLE 453: No, This Ain’t Your Father’s Microelectronic Packaging
    https://www.3dincites.com/2020/06/iftle-453-no-this-aint-your-fathers-microelectronic-packaging/

    In a recent IMAPS webinar, John Park (Figure 1), product management director of Cadence Design Systems, gave a tutorial entitled “This Is Not Your Father’s Advanced Semiconductor Packaging…an EDA Perspective”.

    With the demise of Moore’s Law (I’m with the group that thinks it happened around 28nm), things have significantly changed. I certainly have used my platform to become a campanologist (bell ringer) for the premise that the importance of packaging has increased to the point where it IS now being used to differentiate products in the market because of the increased performance it can bring when done correctly.

    The Design Community Heeds the Call

    The design community, which also used to ignore packaging because that’s not where the money was being spent, has now stood up and is paying attention. In fact, Park, indicating that “heterogeneously integrated” system-in-package (SiP) (here we go with the buzzwords again) will be leveraged to design next-generation electronic products, stated flat out: “SiP will replace SoC”.

    The IFTLE message to old-time packaging engineers is “better get on board and start learning about these design tools because they are quickly becoming a part of your future!”

  2. Tomi Engdahl says:

    “When your customer starts finding almost as many bugs as you found yourself, you’re not leading into the right place.”

    Intel insider claims it finally lost Apple because Skylake QA ‘was abnormally bad’
    By Dave James
    https://www.pcgamer.com/intel-skylake-why-apple-left/?utm_content=buffer9f5c7&utm_medium=social&utm_source=facebook&utm_campaign=buffer_pcgamerfb

    Ex-Intel principal engineer, François Piednoël, believes he witnessed the inflection point three years ago with Skylake.

    The “bad quality assurance of Skylake” was responsible for Apple finally making the decision to ditch Intel and focus on its own ARM-based processors for high-performance machines. That’s the claim made by outspoken former Intel principal engineer, François Piednoël.

    It’s been one of the big stories from this last week: Apple finally announcing its two-year transition away from Intel for its Mac desktop and notebook lines. There has been a lot of speculation about why this has happened, with the main consideration being that it’s aiming to consolidate the architectures across all its different platforms, from iPhone, through iPad, and finally into its laptop and desktop range.

    That makes complete sense from a business and an architectural point of view, but while it’s something Piednoël says was always under consideration by Apple, he believes if the company hadn’t found so many issues within the Skylake architecture it would still be onboard the Intel chip train. It was the straw that broke the Apple’s back, so to speak.

    “The quality assurance of Skylake was more than a problem,” says Piednoël during a casual Xplane chat and stream session. “It was abnormally bad. We were getting way too much citing for little things inside Skylake. Basically our buddies at Apple became the number one filer of problems in the architecture. And that went really, really bad.

    “When your customer starts finding almost as many bugs as you found yourself, you’re not leading into the right place.”

    “For me this is the inflection point,” says Piednoël. “This is where the Apple guys who were always contemplating to switch, they went and looked at it and said: ‘Well, we’ve probably got to do it.’ Basically the bad quality assurance of Skylake is responsible for them to actually go away from the platform.”

  3. Tomi Engdahl says:

    Tom Warren / The Verge:
    Apple confirms it is not planning to support Boot Camp on ARM-based Macs, and says “purely virtualization is the route” — Virtualization is the route forward — Apple will start switching its Macs to its own ARM-based processors later this year, but you won’t be able to run Windows in Boot Camp mode on them.

    Apple’s new ARM-based Macs won’t support Windows through Boot Camp
    https://www.theverge.com/2020/6/24/21302213/apple-silicon-mac-arm-windows-support-boot-camp?scrolla=5eb6d68b7fedc32c19ef33b4

    Virtualization is the route forward

  4. Tomi Engdahl says:

    Los Angeles Times:
    Stories from tech workers who are women, people of color, and LGBTQ-identifying describe discrimination, microaggressions, feelings of isolation, and more

    Black and brown tech workers share their experiences of racism on the job
    https://www.latimes.com/business/technology/story/2020-06-24/diversity-in-tech-tech-workers-tell-their-story

    Of the 68 tech workers who responded to the survey, half said they felt tech was not inclusive to people from diverse backgrounds.

    “Tech is inclusive because they need our talent and manpower regardless of whether they acknowledge it or not,” one Hulu employee who asked not to be named said. “It’s a spectrum of tolerance and acceptance.”

    People whose identities are underrepresented in their field find themselves judged through a different set of lenses, one that ignores the question of privilege, the Hulu employee said. “A single mom that didn’t go to college is seen as uneducated whereas a white college dropout is seen as a ‘genius,’ as if he’s too good for college.”

    Despite the hardships of navigating largely white, male spaces, fewer than a third of respondents said they’ve ever left a company in response to discrimination or non-inclusiveness.

  5. Tomi Engdahl says:

    AfterShip makes its automated shipping API, Postmen, available for free
    https://techcrunch.com/2020/06/24/aftership-makes-its-automated-shipping-api-postmen-available-for-free/?tpcc=ECFB2020

    Co-founder Andrew Chan told TechCrunch that the company decided to make the Postmen API, previously offered as a SaaS subscription for enterprises, free in response to the massive jump in online shopping caused by the COVID-19 pandemic. About 60% of Postmen’s users are in the United States.

    Since February, AfterShip has seen an 85% increase in shipping volume

  6. Tomi Engdahl says:

    Dual Boot is Dead: Windows and Linux are now One.
    Turn your Windows machine into a developer workstation with WSL 2.
    https://towardsdatascience.com/dual-boot-is-dead-windows-and-linux-are-now-one-27555902a128

  7. Tomi Engdahl says:

    “Is it unethical to not tell my employer I’ve automated my job?”
    A message board question turns out to be a case study in tech-related ethics.
    https://www.fastcompany.com/90357964/is-it-unethical-to-not-tell-my-employer-ive-automated-my-job

    Roughly two years ago, a post on workforce question-and-answer site Workplace StackExchange generated a lively debate. The post, which has since been viewed nearly a half-million times, was by a user named Etherable, who designed an algorithm that transformed the substance of a 40-hour workweek into a two-hour project. The poster was using the remaining time to tend to personal issues and spend more time with their son. Etherable asked, “Is it unethical for me to not tell my employer I’ve automated my job?”

    The question seems straightforward: yes or no. But the responses were often qualified. Some thought that the lack of disclosure was unethical but warranted given the worry that the poster would lose their job. Others thought that the answer hinged on whether the company was paying the poster for hours or results. And at least a few gave pats on the back: One of technology’s great promises was that it would free people from rote tasks and give them back time that could be devoted to more meaningful pursuits. This was just an illustration of that promise.

    While the employment contract or company policies should be the deciding factor in the situation described, the post also illustrates some of the more complex ethical issues emerging as technology advances.

    “Rather than a transfer of tasks from humans to automation, we’re going to see a fusion of intelligent agents, computers, [and] algorithms working together with humans,” he says. That fusion is going to shine a light on some ethical and other problem areas companies and workers already struggle to navigate.

    Part of the question relates to metrics

    If the individual is being paid by output rather than hours, then they are fulfilling their agreement.

    Based on the individual’s employment status and agreement, the company may own the algorithm because it owns the employee’s work product, Lynch says. Also, Etherable built in “bugs” to make the work imperfect, as it would be if a human did it. That indicates an effort to mislead the employer. Plus, if the employer is relying on algorithms of which it is unaware and did not vet, the employee may be leaving the company open to security breaches or other liability.

    Etherable’s primary concern about disclosure was that it may lead to the company simply replacing them with the program. It’s a valid concern, because that’s what many companies might do. However, this viewpoint shows a fundamental lack of trust between employer and employee, and that’s a problem

    The fear that people are disposable if their productivity can be bested by technology may lead to an erosion of trust, as well as unpleasant and unintended consequences. High-pressure environments with a focus on results rather than people can lead to workplace cheating and unethical behavior

    “There are ways to manage moving to automation,” he says. “They don’t necessarily displace employees. You can reassign people for the job, retrain them for the job,”

    But without seriously addressing the trust and culture issues fueled by technology and automation, companies will incur costs of which they may not even be aware.

    “I think two things are going on,” he says. “One is a lack of trust in the employer. And two is the employees themselves, not having the foresight to realize that they’ll likely be the ones that get recognized. Like you turn up to your employer and say, ‘Check out what I’ve done.’ Most places I know, you’d be getting a promotion off the back of that.” As long as they got rid of the bugs, of course.

  8. Tomi Engdahl says:

    Ex-Intel engineer: Apple turned away from Intel over Skylake CPU bugs
    https://www.zdnet.com/article/ex-intel-engineer-apple-turned-away-from-intel-over-skylake-cpu-bugs/?ftag=COS-05-10aaa0h&utm_campaign=trueAnthem%3A+Trending+Content&utm_medium=trueAnthem&utm_source=facebook

    “The quality assurance in SkyLake was abnormally bad,” says former Intel engineer François Piednoël.

  9. Tomi Engdahl says:

    Steven Sinofsky / Learning By Shipping:
    Apple’s two-year timeline for Mac’s transition to Apple Silicon is a demonstration of some of the most remarkable product engineering over time in history — Apple’s announcement of “Apple Silicon” is important for many reasons. Delivering on such an undertaking is the result of remarkable product engineering.

    Apple’s Relentless Strategy, Execution, and Point of View
    https://medium.learningbyshipping.com/apples-relentless-strategy-and-execution-7544a76aa26

    Apple’s announcement of “Apple Silicon” is important for many reasons. Delivering on such an undertaking is the result of remarkable product engineering. An annotated thread…

  10. Tomi Engdahl says:

    Todd Haselton / CNBC:
    Microsoft says it will permanently close all Microsoft Store retail locations and focus on its online store instead — Microsoft on Friday announced it will permanently close its Microsoft Store retail locations. — In the past decade or so, Microsoft began to expand its retail presence …

    Microsoft is permanently closing its retail stores
    https://www.cnbc.com/2020/06/26/microsoft-to-close-retail-stores.html

    Microsoft on Friday announced it will permanently close its 83 Microsoft Store retail locations.

    In the past decade or so, Microsoft began to expand its retail presence in an effort to create a shopping experience similar to Apple’s.

    Microsoft said the closing of its physical locations will “result in a pre-tax charge of approximately $450 million, or $0.05 per share,” which it will record in the current quarter that ends June 30.

    Microsoft said its retail team members will help on the website instead of in-store.

  11. Tomi Engdahl says:

    Lauren Goode / Wired:
    Virtual labs at WWDC and Microsoft Build lowered barriers to entry and for some felt more personal than live labs, but the conferences lacked communal elements

    Virtual Conferences Mean All-Access—Except When They Don’t
    https://www.wired.com/story/wwdc-virtual-conferences-inclusion/

    The end of WWDC marks the end of Big Tech’s conference season. What did this virtual experiment reveal about the meaning of community?

    The conclusion of Apple’s big software shindig this week wraps up a months-long experiment in virtual tech conferences. The experiment isn’t over—far from it, as the coronavirus pandemic shows no real signs of easing up in the US, and most of the tech events from now through the end of 2020 are being marketed as “virtual” events.

    Typically, between the months of April and June each year, tech giants like Microsoft, Amazon, Facebook, Google, and Apple corral thousands of people in one massive space to give previews of new software and to get app makers excited to make apps for them. This year, those rousing keynotes, coding sessions, hallway conversations, and after-hours meetups all happened online.

    Only, some of those interactions didn’t happen at all. Events like Google I/O, Facebook F8, and Amazon re:MARS were cancelled entirely. Microsoft and Apple forged ahead with carefully produced CEO keynotes and virtual coding labs, but couldn’t replicate the serendipitous run-ins or casual gatherings that are sometimes the most valuable part of conferences. Virtual attendees told me that online-only events have lowered the barriers to entry; people no longer have to spend thousands on tickets and travel to get access to information that may be critical to their livelihoods. But people who spoke to me were pretty straightforward about what’s still lacking from virtual events: They miss the hang.

    “Conferences aren’t just about what’s on the schedule, but the side conversations and the other social aspects,”

    “I don’t think we’ve quite figured out as an entire industry what the best way is to bring in some of those social interactions when an event is virtual.”

    Warren points out that online-only events have some very real benefits for communities that are normally underrepresented at tech events. Virtual keynotes and coding lessons can be translated into a dozen different languages for people watching at home

    Still, there were no virtual events that were adequate substitutes for in-person meetups.

    If this season’s virtual tech events have offered a kind of blueprint for what may come—a beta test, if you will—they’ve proven there’s still blank space for the communal elements humans crave.

    Despite the sorely missed in-person meetups and the still-disappointing lack of diversity at tech events, plenty of developers gave Microsoft and Apple kudos for managing to put on robust and technically sound virtual conferences this spring.

    At this point, no one knows how long it will be before large tech conferences happen in person again, if at all.

    “Personally, I hope tech companies continue the virtual conferences when it comes to the sessions, because it’s nice to be able to pause and absorb that information,” says Kaya Thomas. “But maybe they could do one or two days of in-person things, the keynote or the meetups, when it’s available in the future.”

  12. Tomi Engdahl says:

    https://www.omgubuntu.co.uk/2020/06/linux-marketshare-increased-again-last-month-and-do-did-ubuntus

    Linux’s share of all desktop OSes grew from a new high of 2.87 percent in April 2020 to an even higher high of 3.17 percent in May 2020.

    Now, this is relatively unusual. Linux marketshare — based on past trends — typically hovers below the 2 percent mark and doesn’t fluctuate widely

  13. Tomi Engdahl says:

    List of Free Python Resources [Updated June 2020]
    https://hakin9.org/list-of-free-python-resources/

  14. Tomi Engdahl says:

    Self-Healing Devices Can Regain Abilities After Becoming Damaged
    Devices made from the self-healing material could even gain new functionality after they’ve been cut
    https://www.hackster.io/news/self-healing-devices-can-regain-abilities-after-becoming-damaged-ae5fc11cece4

  15. Tomi Engdahl says:

    Godot provides a huge set of common tools, so you can just focus on making your game without reinventing the wheel.

    Godot is completely free and open-source under the very permissive MIT license. No strings attached, no royalties, nothing. Your game is yours, down to the last line of engine code.

    https://godotengine.org/

  16. Tomi Engdahl says:

    What Is Confidential Computing?
    https://spectrum.ieee.org/computing/hardware/what-is-confidential-computing

    A handful of major technology companies are going all in on a new security model they’re calling confidential computing in an effort to better protect data in all its forms.

    The three pillars of data security involve protecting data at rest, in transit, and in use. Protecting data at rest means using methods such as encryption or tokenization so that even if data is copied from a server or database, a thief can’t access the information. Protecting data in transit means making sure unauthorized parties can’t see information as it moves between servers and applications. There are well-established ways to provide both kinds of protection.

    Protecting data while in use, though, is especially tough because applications need to have data in the clear—not encrypted or otherwise protected—in order to compute. But that means malware can dump the contents of memory to steal information. It doesn’t really matter if the data was encrypted on a server’s hard drive if it’s stolen while exposed in memory.
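
    To make that distinction concrete, here is a minimal Python sketch, assuming the third-party cryptography package (pip install cryptography). The ciphertext is safe to store on disk, but the moment it is decrypted for computation the plaintext sits exposed in memory, which is precisely the gap confidential computing aims to close with hardware-protected enclaves.

        # Minimal sketch: protecting data at rest with symmetric encryption.
        # Assumes the third-party "cryptography" package (pip install cryptography).
        from cryptography.fernet import Fernet

        key = Fernet.generate_key()        # the key itself must be stored securely
        f = Fernet(key)

        ciphertext = f.encrypt(b"customer record")  # at rest: safe to write to disk
        plaintext = f.decrypt(ciphertext)           # in use: exposed in RAM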

  17. Tomi Engdahl says:

    Are Quantum-Tunnel Transistors Real, and What Do They Mean for Power Tech?
    https://www.electronicdesign.com/power-management/whitepaper/21134181/are-quantumtunnel-transistors-real-and-what-do-they-mean-for-power-tech?utm_source=EG+ED+Analog+%26+Power+Source&utm_medium=email&utm_campaign=CPS200619059&o_eid=7211D2691390C9R&rdx.ident%5Bpull%5D=omeda%7C7211D2691390C9R&oly_enc_id=7211D2691390C9R

    Did the electronics industry take the wrong turn when it embraced CMOS 50 years ago? If quantum tunneling was the path that should have been taken, can an upstart R&D firm convince us to embrace it?

    It’s still too early to tell whether SFN’s announcement is a promising, but ultimately fruitless development, or the beginning of a second semiconductor revolution.

  18. Tomi Engdahl says:

    Chip Sector Provides a Lifeline During Pandemic
    https://www.eetimes.com/chip-sector-provides-a-lifeline-during-pandemic/

    It could be worse: Despite a pandemic and an economic downturn rivaling the Great Depression, chip makers managed to eke out revenue gains during the first quarter of the Plague Year 2020.

    Despite wrenching economic dislocations, the novel coronavirus appears to have stimulated global computer and server sales as workers set up home offices while home-schooling their kids. Both require connections to datacenters and public cloud services. The result, observed Hewlett-Packard Enterprise CEO Antonio Neri, himself recovering from Covid-19: “Connectivity is going to be [like] electricity and water.”

    While overall semiconductor revenues declined slightly during the first three months of 2020, the top 10 chip makers recorded revenue growth of just over 2 percent, according to market tracker Omdia.

    Omdia reported that HiSilicon registered a whopping 40.3 percent jump in 1Q 2020 revenues, catapulting the Chinese chip maker into eighth place in its rankings of global semiconductor suppliers. HiSilicon “is doing this by building up sufficient inventory to ride out the impact of the revised U.S. trade restrictions, which are planned to go into effect in September,” Omdia said.

    Meanwhile, Qualcomm’s first quarter revenues jumped on the strength of 5G wireless deployments, boosting them a healthy 14.6 percent. “Qualcomm is benefitting from the Chinese government’s move to kickstart the economy by emphasizing the deployment of 5G infrastructure,” said Ron Ellwanger, Omdia’s senior analyst for semiconductor manufacturing. “The company is taking advantage of China’s subsidized 5G handset market, along with moves to accelerate the building of 5G infrastructure” across China.

  19. Tomi Engdahl says:

    Frederic Lardinois / TechCrunch:
    AWS makes CodeGuru, a set of tools that use machine learning to automatically review code for bugs and suggest potential optimizations, generally available

    CodeGuru, AWS’s AI code reviewer and performance profiler, is now generally available
    https://techcrunch.com/2020/06/29/codeguru-awss-ai-code-reviewer-and-performance-profiler-is-now-generally-available/

    AWS today announced that CodeGuru, a set of tools that use machine learning to automatically review code for bugs and suggest potential optimizations, is now generally available. The tool launched into preview at AWS re:Invent last December.

    CodeGuru consists of two tools, Reviewer and Profiler, and those names pretty much describe exactly what they do. To build Reviewer, the AWS team actually trained its algorithm with the help of code from more than 10,000 open source projects on GitHub, as well as reviews from Amazon’s own internal codebase.

    https://aws.amazon.com/codeguru/

    Amazon CodeGuru is a developer tool powered by machine learning that provides intelligent recommendations for improving code quality and identifying an application’s most expensive lines of code. Integrate Amazon CodeGuru into your existing software development workflow where you will experience built-in code reviews to detect and optimize the expensive lines of code to reduce costs.
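
    For readers who want to try it, the first step with CodeGuru Reviewer is associating a repository. Below is a minimal sketch using boto3; it assumes AWS credentials are already configured, and the repository name “my-service” is a placeholder.

        # Minimal sketch: associate a CodeCommit repository with CodeGuru Reviewer
        # so it can start reviewing pull requests. Assumes configured AWS
        # credentials; "my-service" is a placeholder repository name.
        import boto3

        client = boto3.client("codeguru-reviewer")

        response = client.associate_repository(
            Repository={"CodeCommit": {"Name": "my-service"}}
        )
        print(response["RepositoryAssociation"]["State"])  # e.g. "Associating"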

  20. Tomi Engdahl says:

    David Streitfeld / New York Times:
    Even as many companies adopt work from home policies during the pandemic, history shows that such initiatives have consistently failed and were abandoned

    The Long, Unhappy History of Working From Home
    https://www.nytimes.com/2020/06/29/technology/working-from-home-failure.html

    As the coronavirus keeps spreading, employers are convinced remote work has a bright future.
    Decades of setbacks suggest otherwise.

    Three months after the coronavirus pandemic shut down offices, corporate America has concluded that working from home is working out. Many employees will be tethered to Zoom and Slack for the rest of their careers, their commute accomplished in seconds.

    Richard Laermer has some advice for all the companies rushing pell-mell into this remote future: Don’t be an idiot.

    “Every weekend became a three-day holiday,” he said. “I found that people work so much better when they’re all in the same physical space.”

    IBM came to a similar decision. In 2009, 40 percent of its 386,000 employees in 173 countries worked remotely. But in 2017, with revenue slumping, management called thousands of them back to the office.

    Even as Facebook, Shopify, Zillow, Twitter and many other companies are developing plans to let employees work remotely forever, the experiences of Mr. Laermer and IBM are a reminder that the history of telecommuting has been strewn with failure. The companies are barreling forward but run the risk of the same fate.

    “Working from home is a strategic move, not just a tactical one that saves money,” said Kate Lister, president of Global Workplace Analytics. “A lot of it comes down to trust. Do you trust your people?”

    Companies large and small have been trying for decades to make working from home work. As long ago as 1985, the mainstream media was using phrases like “the growing telecommuting movement.” Peter Drucker, the management guru, declared in 1989 that “commuting to office work is obsolete.”

    Telecommuting was a technology-driven innovation that seemed to offer benefits to both employees and executives.

    Facebook expects up to half its workers to be remote as soon as 2025. The chief executive of Shopify, a Canadian e-commerce company that employs 5,000 people, tweeted in May that most of them “will permanently work remotely. Office centricity is over.” Walmart’s tech chief told his workers that “working virtually will be the new normal.”

    Quora, a question-and-answer site, said last week that “all existing employees can immediately relocate to anywhere we can legally employ them.” Those who do not want to go anywhere can still use the Silicon Valley headquarters, which would become a co-working space.

    Quora said 60 percent of its workers expressed a preference for remote work, in line with national surveys.

    “Anyone who has led a team knows that delegation is not always the most effective leadership style,”

    The coronavirus shutdown, which means 95 percent of Best Buy’s corporate campus workers are currently remote, might now be prompting another shift in company philosophy. “We expect to continue on a permanent basis some form of flexible work options,” a spokeswoman said.

    Remote workers might be free of commuting costs, but they are traditionally more vulnerable.

    started seeing his newly remote staff in a new light.

    “I kind of learned who was really doing the work and who was not really doing as much work as it looked like on paper that they might have been doing,” he said. With “some of the supervisory, middle-management people,” he added, “I’m starting to wonder if I really need them.”

    “When people are in turmoil, you take advantage of them,” said John Sullivan, a professor of management at San Francisco State University.

    “The data over the last three months is so powerful,” he said. “People are shocked. No one found a drop in productivity. Most found an increase. People have been going to work for a thousand years, but it’s going to stop and it’s going to change everyone’s life.”

    Innovation, Dr. Sullivan added, might even catch up eventually.

    “When you hire remotely, you can get the best talent around and not just the best talent that wants to live in California or New York,” he said. “You get true diversity. And it turns out that affects innovation.”

    “Companies are saying working from home is working so well we’re going to let people work from home forever,” he said. “It’s good P.R., and very romantic, and very unrealistic. We’ll be back in the office as soon as there’s a vaccine.”

  21. Tomi Engdahl says:

    It’s really hard to find maintainers, says the Linux creator. Linus developed a bad reputation early on among people working on the Linux kernel. He has since gone through therapy to calm down, but I think the damage was already done. I hope he finds a successor, though.

    ‘It’s really hard to find maintainers’: Linus Torvalds ponders the future of Linux
    Will code move on to a language such as Rust? ‘I’m convinced it’s going to happen’ says kernel colonel
    https://www.theregister.com/2020/06/30/hard_to_find_linux_maintainers_says_torvalds/

    Linux creator Linus Torvalds spoke about the challenge of finding future maintainers for the open-source operating system, at the (virtual) Open Source Summit and Embedded Linux conference under way this week.

  22. Tomi Engdahl says:

    Intel was once the unrivaled leader in CPU technology, but after half a decade of 10nm process woes, it faces a resurgent AMD and Arm-based Macs.

    Arm Macs and AMD rising: How Intel’s endless 10nm struggles cost it so much
    https://www.pcworld.com/article/3563786/arm-macs-and-amd-rising-how-intels-endless-10nm-struggles-cost-it-so-much.html

    Intel’s endless 10nm nightmare has cost it so, so much.

    It all started on September 5, 2014. That’s the day Intel introduced 5th-gen Core M chips based on “Broadwell,” the company’s first processors built using the 14-nanometer manufacturing process. Despite some manufacturing woes that pushed Broadwell back from its expected 2013 release, Intel’s offering served as the vanguard of processor technology. AMD remained stuck on the 28nm process with its abysmal Bulldozer architecture. A mere month later, the Apple iPad Air 2 launched with a custom A8X chip that couldn’t quite hang with Intel’s older Haswell CPUs in Geekbench—but it was getting close.

    Nearly six years later, the tables have turned. Intel’s 10th-gen Core processors remain on an (upgraded) 14nm process. AMD’s Ryzen chips have snatched the computing crown, and Apple’s doing the unthinkable: switching Macs away from x86 CPUs onto its own custom Arm silicon.

    How did Intel get here? Let’s look at how the company lost its way, starting with the death of tick-tock.

    The long road to 10nm
    It wasn’t supposed to be like this. Intel’s original roadmaps expected 10nm chips to launch in 2016, with more advanced 7nm chips coming in 2018. Then the delays began.

    In early 2016, Intel confirmed that tick-tock was dead, adding a third leg to the process dubbed “optimization.”

    Those 10nm Sunny Cove cores indeed hit laptops in the form of 10th-gen “Ice Lake” processors in August 2019. Yes, 10nm was finally, truly here—at least in notebooks. Intel’s desktop offerings remain on the 14nm process. And even after the three-year delay, the actual 10nm CPU cores came with lower clock speeds and didn’t impress much.

    Intel hasn’t sat still for half a decade; it’s been fine-tuning the performance of its 14nm processors

    Intel’s also been pushing what’s possible with 14nm hard to keep up with the competitive landscape.

    AMD struck back big-time with its new Ryzen processors, built using TSMC’s most advanced processing nodes. Ryzen debuted in 2017 as a core-loaded 14nm monster that slaughtered Intel in multi-threaded tasks and overall value, but lagged in gaming performance.

    Lower prices and significant IPC improvements helped 2nd-gen Ryzen supplant Intel’s 8th-gen Core i7 as our recommended flagship processor. Then, with Intel mired at 14nm, AMD took the technological lead with 3rd-gen Ryzen CPUs built using an advanced 7nm process with support for blazing-fast PCIe 4.0 storage. (Intel’s latest 10th-gen chips remain on PCIe 3.0.)

    “For probably 9 out of 10 consumers looking at a high-end CPU, they’ll want to buy the Ryzen 9 3900X [over the Core i9-9900K],”

    Ryzen processors dominate our list of the best CPUs

    more than 50-percent share of premium processor sales at many top global retailers

    Making matters worse for Intel, 7nm Ryzen 4000 mobile chips introduced in 2020 enable performance that all but the highest-end Intel-based systems just can’t match. “To put AMD’s Ryzen 4000 in perspective, you have to understand that in AMD’s 50-year history, it has never beaten Intel in laptops,”

    AMD expects over 100 laptops with Ryzen 4000 to launch in 2020. It also expects to release next-gen Ryzen desktop processors later this year

  23. Tomi Engdahl says:

    Your Own Open Source ASIC: SkyWater-PDK Plans First 130 nm Wafer In 2020
    https://hackaday.com/2020/06/30/your-own-open-source-asic-skywater-pdf-plans-first-130-nm-wafer-in-2020/

    Who Needs Open Source ASICs?

    Of course, FPGAs and ASICs aren’t the answer to every problem. We can’t help but notice that some examples you see — including ours — are sometimes better for learning than actually practical. For example, the classic sample for learning about state machines on an FPGA is a traffic light. Why not? Everyone sort of understands what it is supposed to do, it has clear state logic, and you can make it as simple as you like or quite complex if it senses vehicles and pedestrian crosswalk buttons or changes based on schedules.
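
    The transition table for that canonical traffic light really is tiny. Here is a minimal sketch of the state logic, in Python for readability; an FPGA version would encode the same table in registers and a clock divider.

        # Minimal sketch of the classic traffic-light state machine.
        # Durations are illustrative; an HDL version encodes the same table.
        TRANSITIONS = {"green": "yellow", "yellow": "red", "red": "green"}
        DURATIONS = {"green": 30, "yellow": 5, "red": 25}  # seconds

        state = "red"
        for _ in range(6):  # simulate a few light changes
            print(f"{state} for {DURATIONS[state]} s")
            state = TRANSITIONS[state]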

    On the negative side, ASICs are not for the sloppy. Historically, taping out an ASIC has been very expensive. So you have a run of parts but — oops — you forgot that the counter needs to reset to a non-zero number. In an FPGA, that’s a minor annoyance; you simply change the configuration — especially now that one-time programmable FPGAs are rare outside of certain applications. Even if you have to trash an FPGA and program another one, they are generally not very expensive unless they are radiation hardened or very large devices.

    If you make that mistake on an ASIC, you are in big trouble. You can’t change anything on the parts you have. You have to have a new batch built with new upfront costs. In the commercial world, that kind of mistake can be career-ending.

    Tim makes it clear that his target audience isn’t the professional building custom ASICs, though. It is us. The hackers and tinkerers that want to create custom ICs. There may be some student market, too, although schools often have deals to make that feasible already.

    https://github.com/google/skywater-pdk

  24. Tomi Engdahl says:

    European Commission Energy Efficiency Label Reform
    Newly rescaled EU energy efficiency labels will go into effect on March 1, 2021 for five product categories. Products currently bearing top grades could receive as low as a C rating on the updated label. To continue attracting consumers with high grades, many manufacturers have already started redesigning their appliances to drastically improve efficiency.
    https://ac-dc.power.com/green-room/regulations-agency/european-commission-energy-efficiency-label-reform/?utm_campaign=EU_Labeling&utm_medium=email&utm_source=PI&utm_content=EU_Labeling&utm_term=AC-DC+Multiple+Appliances+PI+EN&mkt_tok=eyJpIjoiTUdNM1pUZzFaV0prTmpCbCIsInQiOiJydHZlMjJpVXV5TVwvNU9jeEk3SndEa0dGQkw3WU1mazNWWEFoRkxIcnBRbXh5SFVkc0RQSEFVSnZMZGpaQktzdCtrVHRaNDZuSG83Z09BeXJrc3RQOGVTTUFRS08rdXphbkxwcFwvWDlSZTNXT0hOR1QxdkY5dUVSYUFTRzQzeGFiIn0%3D

    With a color-coded scale from A (most efficient) to G (least efficient), the energy label is recognized by 93% of European consumers, and 79% consider it when buying energy efficient products, according to the European Commission. A product labelled in the highest available category can gain an advantage over its competitors. After years of extending the top rating from A+ to A+++ as energy efficiency improved, the label is in need of a complete rescale.

    https://ec.europa.eu/commission/presscorner/detail/en/MEMO_19_1596

  25. Tomi Engdahl says:

    A plan to redesign the internet could make apps that no one controls
    https://www.technologyreview.com/2020/07/01/1004725/redesign-internet-apps-no-one-controls-data-privacy-innovation-cloud/

    Dfinity wants to allow the creation of apps that can run on the network itself rather than on servers owned by Facebook, Google or Amazon. Can it succeed where others have failed?

    In 1996 John Perry Barlow, cofounder of internet rights group the Electronic Frontier Foundation, wrote “A declaration of the independence of cyberspace.” It begins: “Governments of the Industrial World, you weary giants of flesh and steel, I come from Cyberspace, the new home of Mind. On behalf of the future, I ask you of the past to leave us alone. You are not welcome among us. You have no sovereignty where we gather.”

    Barlow was reacting to the US Communications Decency Act, an early attempt to regulate online content, which he saw as overreaching. But the broad vision he put forward of a free and open internet controlled by its users was one that many internet pioneers shared.

    Fast-forward a quarter-century and that vision feels naïve. Governments may have struggled to regulate the internet, but new sovereigns have taken over instead. Barlow’s “home of Mind” is ruled today by the likes of Google, Facebook, Amazon, Alibaba, Tencent, and Baidu—a small handful of the biggest companies on earth.

    Yet listening to the mix of computer scientists and tech investors speak at an online event on June 30 hosted by the Dfinity Foundation, a not-for-profit organization headquartered in Zurich, Switzerland, it is clear that a desire for revolution is brewing. “We’re taking the internet back to a time when it provided this open environment for creativity and economic growth, a free market where services could connect on equal terms,” says Dominic Williams, Dfinity’s founder and chief scientist. “We want to give the internet its mojo back.”

    Dfinity is building what it calls the internet computer, a decentralized technology spread across a network of independent data centers that allows software to run anywhere on the internet rather than in server farms that are increasingly controlled by large firms, such as Amazon Web Services or Google Cloud. This week Dfinity is releasing its software to third-party developers, who it hopes will start making the internet computer’s killer apps. It is planning a public release later this year.

    Rewinding the internet is not about nostalgia. The dominance of a few companies, and the ad-tech industry that supports them, has distorted the way we communicate—pulling public discourse into a gravity well of hate speech and misinformation—and upended basic norms of privacy.

    There is an economic problem too. The effective monopoly of these firms stifles the kind of innovation that spawned them in the first place. It is no coincidence that Google, Facebook, and Amazon were founded back when Barlow’s cyberspace was still a thing.

    On the normal internet, both data and software are stored on specific computers—servers at one end and laptops, smartphones, and game consoles at the other. When you use an app, such as Zoom, software running on Zoom’s servers sends data to your device and requests data from it.

    Dfinity is introducing a new standard, which it calls the internet computer protocol (ICP). These new rules let developers move software around the internet as well as data. All software needs computers to run on, but with ICP the computers could be anywhere. Instead of running on a dedicated server in Google Cloud, for example, the software would have no fixed physical address, moving between servers owned by independent data centers around the world. “Conceptually, it’s kind of running everywhere,” says Dfinity engineering manager Stanley Jones.

    In practice, it means that apps can be released that nobody owns or controls. Data centers will be paid a fee, in crypto tokens, by the app developers for running their code, but they won’t have access to the data, making it hard for advertisers to track your activity across the internet.

    A less welcome upshot is that a free-for-all internet could also make it difficult to hold app makers accountable.

    In fact, a decentralized internet may lead to a decentralized form of governance, in which developers and users all have a say in how it is regulated—much as Barlow wanted. This is the ideal adopted in the crypto world. But as we’ve seen with Bitcoin and Ethereum, it can lead to infighting between cliques. It is not clear that mob rule would be better than recalcitrant CEOs.

    This week, Dfinity showed off a TikTok clone called CanCan. In January it demoed a LinkedIn-alike called LinkedUp. Neither app is being made public, but they make a convincing case that apps made for the internet computer can rival the real things.

    Remaking the internet

    But Dfinity is not the first to try to remake the internet. It joins a list of organizations developing a range of alternatives, including Solid, SAFE Network, InterPlanetary File System, Blockstack, and others. All draw on the techno-libertarian ideals embodied by blockchains, anonymized networks like Tor and peer-to-peer services like BitTorrent.

    Some, like Solid, also have all-star backing. The brainchild of Tim Berners-Lee, who came up with the basic design for the web in 1989

    Even when Solid is ready for full release, Kagal expects that only people who really worry about what happens to their personal data will make the switch. “We’ve been talking about privacy for 20 years and people care about it,” she says. “But when it comes to actually taking action, nobody wants to leave Facebook.”

    Even within the niche communities of developers working to make a new internet, there is little awareness of rival projects.

    It’s possible that the internet may be forced to change whether the average user cares or not. “Privacy regulations could become so restrictive that companies will be forced to move to a more decentralized model,” says Kagal. “They might realize that storing and collecting all this personal information is just not worth their while anymore.”

    In the years since Barlow wrote his polemic, the data economy has sunk deep roots. “It would be great if it was replaced with Solid,” says Kagal. “But it would be great if it was replaced with something else as well. It just needs to be done.”

  26. Tomi Engdahl says:

    ASML, TSMC, Intel tout EUV, advanced packaging at CSTIC 2020
    https://www.digitimes.com/news/a20200701PD204.html

  27. Tomi Engdahl says:

    You can build Linus Torvalds’ PC: Here’s all the hardware and where to buy it
    https://www.zdnet.com/article/you-can-build-linus-torvalds-pc-heres-all-the-hardware-and-where-to-buy-it/

    Normally, Torvalds would pop into his local Fry’s. But in these pandemic times, he ordered everything from Amazon. Here’s the complete list of parts.

    Linus Torvalds is the most famous programmer in the world, father of the Linux operating system, and maker of the near-universal Git distributed version control system. He also builds his own developer workstation and recently upgraded his PC to a speedy AMD Threadripper 3970x-based processor. But a computer is more than a CPU.

    In an exclusive conversation, Torvalds revealed what he used in his latest programming PC. Total price? About $3,500, which, for a high-end desktop computer, is darn cheap.

    https://www.zdnet.com/article/look-whats-inside-linus-torvalds-latest-linux-development-pc/

  28. Tomi Engdahl says:

    Linus builds Linus’ new PC!
    https://m.youtube.com/watch?v=Kua9cY8q_EI

    Linus has built a new PC for himself, and now Linus is going to replicate it as best he can – Wait, which Linus are we talking about? Who cares, let’s build a Threadripper workstation!

  29. Tomi Engdahl says:

    5 Myths About Cloud Native Servers
    https://www.eetimes.com/5-myths-about-cloud-native-servers/?utm_source=newsletter&utm_campaign=link&utm_medium=EDNFunFriday-20200703&oly_enc_id=2359J2998023G8W

    Myth 1: Innovation in server CPU design has closely followed Moore’s law

    While Moore’s Law delivered on its promise for many years, effectively doubling transistor count every two years, x86 architecture innovation has actually been slowing down over the last five years.

    Clearly, data centers will not be able to meet their increasing performance demands by simply scaling up existing processors without increasing their power consumption even faster. As the industry moves toward more modern services based on containers and microservices, which are meant to help applications rapidly scale up and down based on demand, performance and power efficiency will be absolutely critical and will require new cloud-native server processors built specifically to handle these innovative cloud workloads.

    Myth 2: Threads are the same as CPU cores

    Most x86 processors and some Arm processors utilize simultaneous multithreading (SMT) to increase the number of logical cores available. These threads are often perceived or marketed as if they are independent physical cores in the processor. They are not the same.

    Today’s x86-based processors have up to 64 cores, with two threads that can run on each core
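
    You can see the thread-versus-core distinction on your own machine with a minimal sketch like the one below, assuming the third-party psutil package (pip install psutil).

        # Minimal sketch: logical CPUs (hardware threads) vs. physical cores.
        import os
        import psutil  # third-party: pip install psutil

        logical = os.cpu_count()                    # counts SMT threads
        physical = psutil.cpu_count(logical=False)  # counts physical cores only

        print(f"{physical} physical cores exposed as {logical} logical CPUs")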

    Myth 3: More CPU cores don’t matter as much because most applications are single threaded

    Highlighting the growth of these new models, analyst firm IDC has predicted that by 2023, 80% of workloads will shift to or be created with containers/microservices. CSPs and hyperscalers require a new class of cloud-native processors to address the needs of the modern data center — most importantly, processors with the maximum number of cores as opposed to legacy processors with few cores that were optimized for yesterday’s software.

    Myth 4: One CPU architecture can meet the demands of all workloads

    Processor vendors that service multiple markets typically design a single architecture and use it across all the markets they serve. For instance, a single x86 architecture is used across multiple markets such as clients (laptops/servers), high performance computing, enterprise, edge computing, and cloud computing. However, features that are optimized for some markets, such as client laptops, do not directly apply to other markets such as cloud computing.

    For example, the multi-core x86 processor can run faster when few cores are active and slows down as more cores are utilized. While this works well in a client computing environment, the same feature introduces challenges to predictable performance in cloud computing environments.

    Myth 5: High-performance CPUs can’t be power efficient

    Today’s legacy technology comes at a cost — it’s power hungry. Unfortunately, this issue has caused many people to think it’s simply not possible to have a high-performance CPU that is also power efficient. The reality is, all that’s needed is an architecture that can do both, and that’s designed to address the needs of a single market. This is exactly what happened in the mobile and tablet markets with the Arm architecture

    Now, the same innovation is being driven in server-class processors with a focus on hyperscale and cloud service provider markets. Arm’s RISC architecture provides high performance, better performance per watt efficiency, and server class RAS for data centers.

  30. Tomi Engdahl says:

    3,650 respondents from 21 countries spoke about their DevOps successes, challenges, and ongoing struggles. See what they have to say.

    Mapping the DevSecOps Landscape
    https://about.gitlab.com/developer-survey/?utm_medium=paidsocial&utm_source=facebook&utm_campaign=2020surveydevsecops_emea_pr_static_x_x&utm_content=developer-survey_corpmkt_239_english_

  31. Tomi Engdahl says:

    Semiconductor Sales Increase 5.8%

    The Semiconductor Industry Association (SIA) announced worldwide sales of semiconductors were $35.0 billion in May 2020, an increase of 5.8% from the May 2019 total of $33.0 billion and 1.5% more than the April 2020 total of $34.4 billion. Monthly sales are compiled by the World Semiconductor Trade Statistics (WSTS) organization and represent a three-month moving average.

    Global Semiconductor Sales Increase 5.8 Percent Year-to-Year in May; Annual Sales Projected to Increase 3.3 Percent in 2020, 6.2 Percent in 2021
    https://www.semiconductors.org/global-semiconductor-sales-increase-5-8-percent-year-to-year-in-may-annual-sales-projected-to-increase-3-3-percent-in-2020-6-2-percent-in-2021/
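
    As a quick illustration of what a three-month moving average does to monthly figures (the numbers below are made up, not WSTS data):

        # Minimal sketch: a three-month moving average smooths monthly sales.
        monthly = [33.0, 34.4, 35.0, 36.2]  # $ billions, illustrative only

        moving_avg = [sum(monthly[i - 2 : i + 1]) / 3 for i in range(2, len(monthly))]
        print([round(x, 1) for x in moving_avg])  # each month averaged with prior two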

