Computer trends 2018

IT spending seems to be growing again. Gartner forecasts worldwide IT spending will increase 4.5% this year to $3.68 trillion, driven by artificial intelligence, big data analytics, blockchain technology, and the IoT.

Digital transformations are fashionable. You won’t find an enterprise that isn’t leveraging some combination of cloud, analytics, artificial intelligence and machine learning to better serve customers or streamline operations. But here’s a hard truth about digital transformations: many are failing outright or are in danger of failing. Typical reasons for failure are not understanding what digital transformation actually is (different people understand it differently), lack of CEO sponsorship, a talent deficit, and resistance to change. A technology-first approach to digital transformation is usually a recipe for disaster, and trying to push through a technically unfeasible transformation idea is another way to fail.

The digital era requires businesses to move with speed, and that is causing IT organizations to rethink how they work. A lot of IT is moving off premises to SaaS providers and the public cloud. The standout finding from research outfit 451 Research was that 60 per cent of surveyed enterprises say they will run the majority of their IT outside the confines of enterprise data centres by the end of 2019. From cost containment to hybrid strategies, CIOs are getting more creative in taking advantage of the latest offerings and the cloud’s economies of scale.

In 2018 there seems to be a growing software engineering talent shortage, in both quantity and quality. For the past nine years, software engineer has been among the hardest jobs to fill in the United States, and the same applies to many other countries, including Finland. Forrester projects that firms will pay 20% above market rates for quality engineering talent in 2018. Particularly in demand are data scientists, high-end software developers and information security analysts. There is a real need for well-educated, experienced engineers with a formal and deep understanding of software engineering. Recruiting and retaining tech talent remains IT’s biggest challenge today. Most CIOs are migrating applications to public cloud services, offloading the operation and maintenance of computing, storage and other capabilities so they can reallocate staff to focus on what’s strategic to their business.

The enterprise is no longer at the center of the IT universe. Reports of the PC’s demise have been greatly exaggerated, and the long and painful decline in PC sales of the last half-decade has tailed off, at least momentarily. As sales of smartphones and tablets have risen, consumers have not stopped using PCs, they have merely replaced them less often. The FT reports that the PC is set to stage a comeback in 2018, after the rise of smartphones sent sales of desktop and laptop computers into decline in recent years. If that does not happen, the PC market could still return to growth in 2019. Either way, the PC is no longer seen as the biggest growth driver for chip makers: an extreme economic shift has chipmakers focused on hyperscale clouds.

Microservices are talked about a lot. Software built using microservices is easier to deliver and maintain than the big and brittle architectures of old, which were difficult to scale and might take years to build and deliver. Microservices are small and self-contained, and therefore easy to wrap up in a virtual machine or a container (though they don’t have to live in containers). Public cloud providers increasingly differentiate themselves through the features and services they provide. But it turns out that microservices are far from a one-size-fits-all silver bullet for IT challenges.
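
As a concrete illustration (my own sketch, not taken from any of the sources above), here is roughly what a single “small and self-contained” service can look like; the service name, route and port are arbitrary choices:

```python
# A minimal self-contained "microservice": one tiny HTTP endpoint with no
# shared state, easy to wrap in a container image. Standard library only;
# the /health route, the "inventory" name and port 8080 are illustrative.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "ok", "service": "inventory"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), HealthHandler).serve_forever()
```

Wrapped in a container image, dozens of services like this can be deployed, scaled and replaced independently, which is where the architectural appeal (and the operational overhead) of microservices comes from.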

Containers will try to make a breakthrough again in 2018. Year 2017 was supposed to be the year of containers! It wasn’t? Oops. Maybe 2018 will be better. The still-immature technology has a bunch of growing up to do. The Linux Foundation’s Open Container Initiative (OCI) finally released two specifications that standardise how containers operate at a low level. In 2018 the needle will move towards containers running separately from VMs, or entirely in place of VMs. Kubernetes is gaining traction. Still, containers seem to be at the point where the enterprise is waiting to embrace them.

Serverless will be talked about. Serverless computing is a cloud computing execution model in which the cloud provider dynamically manages the allocation of machine resources. Serverless architectures refer to applications that significantly depend on third-party services (known as Backend as a Service, or “BaaS”) or on custom code that’s run in ephemeral containers (Function as a Service, or “FaaS”), the best-known host for which is currently AWS Lambda.
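
To make the FaaS idea concrete, here is a minimal sketch in the AWS Lambda Python handler style; the event field used here is hypothetical, and a real deployment would still need packaging plus a configured trigger such as an API gateway:

```python
# Minimal FaaS-style function using the AWS Lambda Python handler convention.
# The "name" event field is hypothetical; in production the platform spins up
# an ephemeral container, invokes this function, and tears it down afterwards,
# so there is no server for the developer to manage.
import json

def lambda_handler(event, context):
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"})
    }

if __name__ == "__main__":
    # Local smoke test; in the cloud the provider supplies event and context.
    print(lambda_handler({"name": "serverless"}, None))
```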

Automation is what everybody with many computers wants. Infrastructure automation creates and destroys basic IT resources such as compute instances, storage, networking, DNS, and so forth. Security automation helps keep systems secure. IT bosses want to create self-driving private clouds. The journey to a self-driving cloud needs to be gradual: the vision makes sense, but the task of getting from here to there can seem daunting. DevOps automation with customer control includes automatic installation and configuration, integration that brings together AWS and VMware, workflow migration controlled by users, self-service provisioning based on user-defined templates, advanced machine learning to automate processes, and automated upgrades.
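
As a small illustration of what “creates and destroys basic IT resources” means in practice, here is a sketch using the widely used boto3 SDK; it assumes working AWS credentials, and the region and AMI ID are placeholders:

```python
# A minimal infrastructure-automation sketch with boto3: create one compute
# instance, wait for it, then destroy it again. Assumes AWS credentials are
# configured; the AMI ID and region below are placeholders.
import boto3

ec2 = boto3.resource("ec2", region_name="eu-west-1")

instance = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder image ID
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
)[0]
instance.wait_until_running()
instance.reload()
print("created", instance.id, instance.state["Name"])

# The other half of infrastructure automation: tearing the resource down.
instance.terminate()
instance.wait_until_terminated()
print("terminated", instance.id)
```

In real automation the same create/destroy logic would be driven by templates or pipelines rather than run by hand, which is exactly what the self-driving-cloud vision is about.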

Linux is at the center of many cloud operations. Google and Facebook started building their own gear and loading it with their own software. Google has its own Linux called gLinux. Facebook networking uses the Linux-based FBOSS operating system. Even Microsoft has developed its own Linux for cloud operations. Software-defined networking (SDN) is a very fine idea.

The memory business boomed in 2017 for both NAND and DRAM. The drivers for DRAM are smartphones and servers. Solid-state drives (SSDs) and smartphones are fueling the demand for NAND. The NAND market is expected to cool in Q1 after the crazy year 2017, but it is still growing well because demand keeps increasing. Memory — particularly DRAM — was long considered a commodity business.

Lots of 3D NAND will go into solid-state drives in 2018. IDC forecasts strong growth for the solid-state drive (SSD) industry as it transitions to 3D NAND. SSD industry revenue is expected to reach $33.6 billion in 2021, growing at a CAGR of 14.8%. Memory chip capacities increase as more layers are added to 3D NAND. The traditional mechanical hard disk based on magnetic storage is in a hard place in this competition, as the speed of flash-based SSDs is so superior.

There is a search for faster memory, because modern computers, especially data-center servers that skew heavily toward in-memory databases, data-intensive analytics, and increasingly toward machine-learning and deep-neural-network training, depend on large amounts of high-speed, high-capacity memory to keep the wheels turning. Memory speed has not increased as fast as capacity: the access bandwidth of DRAM-based computer memory has improved by a factor of 20x over the past two decades, while capacity increased 128x during the same period. For 2018, DRAM remains a near-universal choice when performance is the priority. The search for a viable DRAM replacement goes on, but whether it’s STT-RAM, phase-change memory or resistive RAM, none of the candidates can yet match the speed or endurance of DRAM.
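
Those two factors translate into very different annual growth rates; a quick back-of-the-envelope check (my arithmetic, assuming a 20-year window):

```python
# Back-of-the-envelope check of the DRAM trend above: bandwidth up ~20x and
# capacity up ~128x over roughly two decades, converted to annual growth rates.
def cagr(total_factor: float, years: int) -> float:
    return total_factor ** (1 / years) - 1

YEARS = 20
print(f"bandwidth growth: {cagr(20, YEARS):.1%} per year")   # ~16% per year
print(f"capacity growth:  {cagr(128, YEARS):.1%} per year")  # ~27% per year
```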

PCI Express 4.0 is ramping up. The PCI standards consortium PCI-SIG (Special Interest Group) has ratified and released the PCIe 4.0, Version 1 specification. Doubling PCIe 3.0’s 8 GT/s (~1 GB/s) of bandwidth per lane, PCIe 4.0 offers a transfer rate of 16 GT/s. The newest version of PCI Express will start appearing on motherboards soon. PCI-SIG has targeted Q2 2019 for releasing the finalized PCIe 5.0 specification, so PCIe 4.0 won’t be quite as long-lived as PCIe 3.0 has been. In short, we’ll see PCIe 4.0 in use this year and PCIe 5.0 in 2019.
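
For a rough feel of what the doubling means in bytes, here is a small calculation using the 128b/130b line encoding that PCIe 3.0 and 4.0 share (a simplification: packet and protocol overhead are ignored):

```python
# Per-lane and x16 throughput for PCIe 3.0 vs 4.0, assuming 128b/130b line
# encoding and ignoring packet/protocol overhead.
def pcie_gb_per_s(gt_per_s: float, lanes: int = 1) -> float:
    encoding = 128 / 130                      # payload bits per transferred bit
    return gt_per_s * encoding / 8 * lanes    # GT/s -> GB/s, per direction

print(f"PCIe 3.0 x1:  {pcie_gb_per_s(8):.2f} GB/s")       # ~0.98 GB/s
print(f"PCIe 4.0 x1:  {pcie_gb_per_s(16):.2f} GB/s")      # ~1.97 GB/s
print(f"PCIe 4.0 x16: {pcie_gb_per_s(16, 16):.1f} GB/s")  # ~31.5 GB/s
```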

USB Type-C is on the way to becoming the most common PC and peripheral interface. The USB-C connector has become commonplace faster than any earlier interface. USB-C is very common on smartphones, and the interface is also spreading on laptops. Sure, it will take some time before it is the most common connector overall: by 2021 the C-type USB connector will number almost five billion units, IHS estimates.

It seems that the after-shocks of the Meltdown and Spectre processor vulnerabilities will be haunting us for quite a long time this year. It is now three weeks since The Register revealed the chip design flaws that Google later confirmed, and the world still awaits certainty about what it will take to get over the silicon slip-ups. The latest pieces of the farce: Intel halted its Spectre/Meltdown CPU patches over unstable code, and Linux creator Linus Torvalds criticised Intel’s ‘garbage’ patches. Computer security will not be the same after all this has been sorted out.

What’s next with computing? IBM discusses AI, neural nets and quantum computing, and many can agree that those technologies will be important. Public cloud providers increasingly provide sophisticated flavours of data analysis, and increasingly machine learning (ML) and artificial intelligence (AI). Even central banks are using big data to help shape policy. Over the past few years, machine learning has evolved from an interesting new approach that allows computers to beat champions at chess and Go into one that is touted as a panacea for almost everything. 2018 will be the start of what could be a long-standing battle between chipmakers to determine who creates the hardware that artificial intelligence lives on.

ARM processor based PCs are coming. Microsoft and Qualcomm jointly announced in early December that the first Windows 10 notebooks with ARM-based Snapdragon 835 processors will be officially launched in early 2018, so more and more PCs with the ARM processor architecture will be hitting the market. Digitimes Research expects that ARM-based models may dominate the lower-end PC market, but don’t hold your breath on this. It is rumoured that wireless LTE connectivity will be incorporated into all the entry-level Windows 10 notebooks with ARM processors, branded by Microsoft as “always-connected devices.” HP and Asustek have already released some ARM-based notebooks running Windows 10 S.

Sources:
Ohjelmistoalan osaajapula pahenee – kasvu jatkuu

PC market set to return to growth in 2018

PC market could return to growth in 2019

PC sales grow for the first time in five years

USBC yleistyy nopeasti

PCI-SIG Finalizes and Releases PCIe 4.0, Version 1 Specification: 2x PCIe Bandwidth and More

Hot Chips 2017: We’ll See PCIe 4.0 This Year, PCIe 5.0 In 2019

Serverless Architectures

Outsourcing remains strategic in the digital era

8 hot IT hiring trends — and 8 going cold

EDA Challenges Machine Learning

The Battle of AI Processors Begins in 2018

How to create self-driving private clouds

ZeroStack Lays Out Vision for Five-Step Journey to Self-Driving Cloud

2017 – the year of containers! It wasn’t? Oops. Maybe next year

Hyperscaling The Data Center

Electronics trends for 2018

2018's Software Engineering Talent Shortage — It’s quality, not just quantity

Microservices 101

How Central Banks Are Using Big Data to Help Shape Policy

Digitimes Research: ARM-based models may dominate lower-end PC market

Intel Halts Spectre, Meltdown CPU Patches Over Unstable Code

Spectre and Meltdown: Linux creator Linus Torvalds criticises Intel’s ‘garbage’ patches

Meltdown/Spectre week three: World still knee-deep in something nasty

What’s Next With Computing? IBM discusses AI, neural nets and quantum computing.

The Week in Review: IoT

PCI Express 4.0 as Fast As Possible

Microsoft has developed its own Linux!

Microsoft Built Its Own Linux Because Everyone Else Did

Facebook has built its own switch. And it looks a lot like a server

Googlella on oma sisäinen linux

Is the writing on the wall for on-premises IT? This survey seems to say so

12 reasons why digital transformations fail

7 habits of highly effective digital transformations

857 Comments

  1. Tomi Engdahl says:

    Firing up 750 Raspberry Pis
    https://hackaday.com/2018/01/24/firing-up-750-raspberry-pis/

    Creating Raspberry Pi clusters is a popular hacker activity. Bitscope has been commercializing these clusters for a bit now and last year they created a cluster of 750 Pis for Los Alamos National Labs. You might wonder what an institution known for supercomputers wants with a cluster of Raspberry Pis. Turns out it is tough to justify taking a real high-speed cluster down just to test software. Now developers can run small test programs with a large number of CPU cores without requiring time on the big iron.

    The system is modular with each module holding 144 active nodes, 6 spares, and a single cluster manager. This all fits in a 6U rack enclosure. Bitscope points out that you could field 1,000 nodes in 42U and the power draw — including network fabric and cooling — would be about 6 kilowatts. That sounds like a lot, but for a 1,000 node device, that’s pretty economical. The cost isn’t bad, either, running about $150,000 for 1,000 nodes. Sure, that’s a lot too but not compared to the alternatives.
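
    A quick back-of-the-envelope check of those per-node figures (my own arithmetic, not from the article):

    ```python
    # Sanity check of the cluster figures quoted above: ~1,000 nodes,
    # ~6 kW including network fabric and cooling, ~$150,000 total.
    nodes = 1000
    total_power_w = 6000
    total_cost_usd = 150_000

    print(f"power per node: {total_power_w / nodes:.1f} W")   # ~6 W
    print(f"cost per node:  ${total_cost_usd / nodes:.0f}")   # ~$150
    ```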

    Scalable clusters make HPC R&D easy as Raspberry Pi.
    http://www.bitscope.com/blog/FM/?p=GF13L

    Denver 13th November 2017, BitScope Designs, developer of BitScope Blade, an infrastructure platform for Raspberry Pi available globally via element14, has built a large Raspberry Pi cluster for a pilot conceived at Los Alamos National Laboratory (LANL).

    The 750 node cluster, comprising five rack mount BitScope Cluster Modules, each with 150 x 64 bit quad-core Raspberry Pi ARM boards and integrated network switches is the first step in a program run by the New Mexico Consortium (NMC), an organisation of three NM Universities and led by LANL.

  2. Tomi Engdahl says:

    Report: 80’s kids started programming at an earlier age than today’s millennials
    https://thenextweb.com/dd/2018/01/23/report-80s-kids-started-programming-at-an-earlier-age-than-todays-millennials/

    HackerRank has published its 2018 Developer Skills Report. The paper looks at a number of things essential to understanding the developer landscape, and explores things like the perks coders demand from their workplaces, the technologies they prefer to use, and how they entered the software development industry in the first place.

    Those in the 18 to 24 age group overwhelmingly started their programming journey in their late teens.

    When you look at older generations, you notice another striking trend: a comparatively larger proportion started programming between the ages of five and ten. 12.2 percent of those aged between 35 and 44 started programming then.

    2018 Developer Skills Report
    http://research.hackerrank.com/developer-skills/2018/

  3. Tomi Engdahl says:

    Ops: It’s everyone’s job now
    https://opensource.com/article/17/7/state-systems-administration?sc_cid=70160000001273HAAQ

    The last decade was all about teaching sysadmins to write code. The next challenge will be teaching operations to software developers.

    “Ops is over.”
    “Sysadmins? That’s so old school.”
    “All the good engineering teams are automating operations out of existence.”

    Do you hear this a lot? I do. People love to say that ops is dead.

    Ops is how you get stuff done.
    Here’s my definition of operations: “Operations is the constellation of your org’s technical skills, practices, and cultural values around designing, building, scaling and maintaining systems.” Ops is the process of delivering value to users. Ops is where beautiful theory meets stubborn reality.

    In other words, ops is how you get stuff done. It’s not optional. You ship software, you do ops. If business is the “why” and dev is the “what,” ops is the “how.” We are all interwoven and we all participate in each other’s mandates.

  4. Tomi Engdahl says:

    Cya Windows. You Can Now Run PowerShell on Linux & macOS
    https://www.bleepingcomputer.com/news/microsoft/cya-windows-you-can-now-run-powershell-on-linux-and-macos/

    An open-source cross-platform version of PowerShell, called PowerShell Core 6.0, has been released by Microsoft that not only runs on Windows, but runs on macOS and Linux as well. Going forward, this version is going to be the actively developed one, with the original PowerShell that we have been using for the past 10 years only getting security updates from now on.

  5. Tomi Engdahl says:

    Micron Puts 64-Layer 3D in Enterprise SSDs
    https://www.eetimes.com/document.asp?doc_id=1332883

    Micron Technology’s first enterprise SATA SSD using its 64-layer 3D NAND isn’t a whole lot different than its predecessor, and that’s the point.

    The company just introduced its 5200 series SSDs designed for virtualized workloads that rotating media can’t handle, such as online transaction processing, virtual desktop infrastructure and media streaming.

    the company believes it’s the first 64-layer 3D NAND SSD for the enterprise market

    Despite a lot of discussion that this year will see a tipping point in NVMe adoption, there’s still a strong demand for SATA SSDs, said Wong, with prices even going up, thanks in part to NAND flash shortages. “Even with the price increases, on the enterprise side the total cost of ownership still makes sense,” he said.

    And although 8TB is now the highest capacity available for SATA SSDs, the mainstream is still using one or two terabyte drives, he said. “When you get to four and eight, companies are thinking of moving to NVMe,” Wong said.

  6. Tomi Engdahl says:

    AI Silicon Preps for 2018 Debuts
    A dozen startups chase deep learning
    https://www.eetimes.com/document.asp?doc_id=1332877

    Deep neural networks are like a tsunami on the distant horizon.

    Given their still-evolving algorithms and applications, it’s unclear what changes deep neural nets (DNNs) ultimately will bring. But their successes thus far in translating text and recognizing images and speech make it clear they will reshape computer design, and the changes are coming at a time of equally profound disruptions in how semiconductors are designed and manufactured.

    The first merchant chips tailored for training DNNs will ship this year.

  7. Tomi Engdahl says:

    Chipmakers Rally in Talent War
    Cloud giants dominate with STEM grads
    https://www.eetimes.com/document.asp?doc_id=1332865

    The semiconductor industry needs a lot of good engineers and a makeover to attract them. A veteran executive put out a call to action at the Industry Strategy Summit here to help launch an initiative to do it.

    “We were a clearing house for the best and brightest once, but today it’s a war for talent, and we are a step or two behind,” said Dan Durn, the chief financial officer of Applied Materials. “Today, kids dream about Google, Facebook, and Apple; they don’t dream about us, and we need to change that.”

    About 85% of chip vendors need new kinds of talent to keep pace with the rise of digital operations powered by automated systems, big data, and machine learning. However, 77% of them report a shortage of talent

    Intel and Samsung have high ratings with recent grads, but so do AirBnB, Netflix, LinkedIn, and other tech companies. Meanwhile, “the list of the bottom 15 companies in brand recognition by students is dominated by semiconductor companies that most college grads have not heard about,

  8. Tomi Engdahl says:

    20 years on, open source hasn’t changed the world as promised
    https://www.itworld.com/article/3246274/open-source-tools/20-years-on-open-source-hasnt-changed-the-world-as-promised.html

    Most code remains closed and proprietary, even though open source now dominates enterprise platforms. How can that be?

  9. Tomi Engdahl says:

    Why DevSecOps matters to IT leaders
    https://enterprisersproject.com/article/2018/1/why-devsecops-matters-it-leaders?sc_cid=7016000000127ECAAY

    DevSecOps may not be an elegant term, but the results are attractive: Stronger security, earlier in the development cycle. Consider one IT leader’s tussle with Meltdown

  10. Tomi Engdahl says:

    Ted Nelson on What Modern Programmers Can Learn From the Past
    https://spectrum.ieee.org/video/geek-life/profiles/ted-nelson-on-what-modern-programmers-can-learn-from-the-past

    The inventor of hypertext talks about the birth of personal computing, the web, and how to think beyond the currently possible

  11. Tomi Engdahl says:

    In 1988, A College Kid’s Screw-Up Changed The Internet Forever
    http://www.iflscience.com/technology/in-1988-a-college-kids-screwup-changed-the-internet-forever/

    On the evening of November 2, 1988, in a quiet computer lab at MIT, a student majorly screwed up.

    Robert Tappan Morris, a 23-year-old computer science student at Cornell University, had written 99 lines of code and launched the program onto the ARPANET, the early foundation of the Internet. Unbeknownst to him, he had just unleashed one of the Internet’s first self-replicating, self-propagating worms – “the Morris Worm” – and it would change the way we saw the Internet forever.

    But why would a nerdy college kid unleash this beast? Even after 30 years, a criminal trial, and countless retellings of his story, it remains unclear.

  12. Tomi Engdahl says:

    AI Fuels Next-Gen HBM Demand
    https://www.eetimes.com/document.asp?doc_id=1332846

    High bandwidth memory gained some momentum last week as Samsung Electronics announced it started mass production of its second-generation technology, dubbed Aquabolt.

    Designed for use with next-gen supercomputers, artificial intelligence (AI) and graphics systems, Tien Shiah, product marketing manager for High Bandwidth Memory at Samsung, said the 8 GB High Bandwidth Memory-2 (HBM2) offers the highest DRAM performance levels and the fastest data transmission rates available today with a 2.4 gigabits-per-second (Gbps) data transfer speed per pin at 1.2V. That’s nearly a 50 percent performance improvement per package, he said, compared with Samsung’s previous generation HBM2 package, Flarebolt, with its 1.6Gbps pin speed at 1.2V and 2.0Gbps at 1.35V.

    In a telephone interview with EE Times from CES, Shiah said a single Aquabolt package will offer a 307GBps data bandwidth, achieving 9.6 times faster data transmission than an 8Gb GDDR5 chip, which provides a 32GBps data bandwidth. This means using four packages in a system will enable a 1.2 terabytes-per-second (TBps) bandwidth, he said, boosting overall system performance by as much as 50 percent.
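
    The quoted figures line up if you assume the standard 1,024-bit HBM2 interface per stack (my arithmetic, not Samsung’s):

    ```python
    # Verifying the quoted HBM2 numbers, assuming the standard 1,024-bit
    # interface per HBM2 stack.
    pin_rate_gbps = 2.4          # per-pin transfer rate
    pins = 1024                  # HBM2 interface width per stack

    per_stack_gbytes = pin_rate_gbps * pins / 8
    print(f"per stack: {per_stack_gbytes:.0f} GB/s")             # ~307 GB/s
    print(f"4 stacks:  {4 * per_stack_gbytes / 1000:.2f} TB/s")  # ~1.23 TB/s
    print(f"vs 32 GB/s GDDR5: {per_stack_gbytes / 32:.1f}x")     # ~9.6x
    ```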

  13. Tomi Engdahl says:

    Micron, Rambus, & Others Team Up To Spur GDDR6 Adoption in Non-GPU Products
    by Ryan Smith on January 23, 2018 9:00 AM EST
    https://www.anandtech.com/show/12362/micron-rambus-others-team-up-to-spur-gddr6-adoption

    the drums of GDDR6 have been beating loudly for most of the last year now. The new memory standard replaces the venerable GDDR5 memory, which, to make long-time readers feel old, launched 10 years ago. While GDDR5 has evolved well beyond its initially planned lifecycle to meet the needs of the industry, it’s finally begun to reach its apex, and a new memory standard has been needed to take its place. GDDR6 then promises to be a big deal, offering a significant jump in memory bandwidth over GDDR5 – and even GDDR5X – giving processors of all sorts a much-needed boost.

  14. Tomi Engdahl says:

    Ever wondered why tech products fail so frequently? No, me neither
    Celestial choirs are coming to your screen soon
    https://www.theregister.co.uk/2018/01/26/ever_wondered_why_tech_products_fail_so_frequently_no_me_neither/

  15. Tomi Engdahl says:

    Matryoshki of news: Tech giants flash code to Russia, Dutch hack Kremlin spies, and more
    It’s all kicking off
    https://www.theregister.co.uk/2018/01/26/tech_russia_source_code_dnc_hack/

    Technology companies can’t decide whether to take Russian money or run from it – not that they’ve ever been much good at turning down cash.

    McAfee, SAP, and Symantec, which make software used by the US government, allowed Russian authorities to scan their source code for backdoors and other flaws, according to Reuters on Thursday, as has HPE.

    Like China and other nations, the Russian government requires a look under the hood before it will consider spending cash on enterprise software as the applications could be compromised.

    The fear is that foreign governments may stash backdoors in the code, effectively turning the apps into bugs – as in, spying bugs. Look no further than the US government, which refuses to run software from Moscow-based Kaspersky on its machines over concerns the antivirus tools can be abused to beam Uncle Sam’s secrets to the Kremlin. Kaspersky denies any impropriety.

    Knowing that Russian officials have potentially glimpsed exploitable security bugs in applications used by US government departments will freak out American officials.

    This is, don’t forget, the same Russian government implicated in the compromise of government agency networks, and the 2016 presidential election, in the US.

    McAfee, SAP, and Symantec, along with Micro Focus which took over ArcSight, the HPE product audited, told Reuters that the code reviews were done under controlled conditions. No code was allowed to be copied, taken away, or altered by the Russians, we’re told.

  16. Tomi Engdahl says:

    Dell considering strategic options including IPO – CNBC, citing DJ
    https://www.cnbc.com/2018/01/25/reuters-america-dell-considering-strategic-options-including-ipo–cnbc-citing-dj.html

    Dell Technologies Inc is considering strategic options including a public share offering or a deal with its majority owned New York-listed unit, VMware Inc , CNBC said in a tweet, citing Dow Jones.

  17. Tomi Engdahl says:

    PCIe PIPE 4.4.1: Enabler for PCIe Gen4
    https://blogs.synopsys.com/vip-central/2018/01/17/pcie-pipe-4-4-1-enabler-for-pcie-gen4/

    PCIe is a multi-layered serial bus protocol which implements dual-simplex link. It provides high speed data transfer and low latency owing to its dedicated point to point topology. To accelerate verification and device development time for PCIe based sub-systems, PIPE (PHY Interface for the PCI Express) architecture was defined by Intel. PIPE is a standard interface defined between PHY sub-layer (PCS – Physical Coding sub-layer) and MAC (Media Access Layer).

    The first stable version of PIPE was published as PIPE 2.0 in 2007. Over the time, PIPE has evolved to support higher speeds and the added functionalities of next generation PCIe specifications. PIPE 4.4.1 specification, released in early 2017, is fully compliant with PCIe 4.0 base specification supporting 16GT/s speed. It has major improvements over PIPE 4.3, while maintaining backward compatibility. Following diagram illustrates PIPE interface, and the partitioning of PHY layer of PCIe.

  18. Tomi Engdahl says:

    Wanna design a chip that talks to silly-fast GDDR6? You’ll have to talk to Rambus, too
    Blueprints touted to ASIC, SoC makers to take on GPUs
    https://www.theregister.co.uk/2018/01/25/rambus_gddr6_phy_ip_core/

    Semiconductor licensing giant Rambus announced this week a physical layer design for accessing GDDR6 – aka double data rate type six synchronous graphics random-access memory.

    This GDDR6 PHY blueprint is aimed at hooking up high-speed, high-bandwidth GDDR6 SGRAM to hardware accelerators and processors to rapidly crunch through stuff like crypto-mining and machine learning.

    GDDR technology is usually aimed at plumbing fast memory into graphics processors to make games and math-heavy workloads run faster. These Rambus designs are aimed at non-GPU semiconductor engineers who want to connect super-fast dual-channel GDDR6 SGRAM to their custom accelerators and system-on-chips.

  19. Tomi Engdahl says:

    When Will the Tech Bubble Burst?
    https://www.nytimes.com/2017/08/05/opinion/sunday/when-will-the-tech-bubble-burst.html

    The dot-com era saw the rise of big companies that were building the nuts and bolts of the internet — including Dell, Microsoft, Cisco and Intel — and of start-ups that promised to tap its revolutionary potential. The current boom lacks a popular name because the innovations — from the internet of things to artificial intelligence and machine learning — are sprawling and hard to label. If there is a single thread, it is the expanding capacity to harness data, which the Alibaba founder, Jack Ma, calls the “electricity of the 21st century.”

    Market excitement about authentic technology innovations enters the manic phase when stock prices rise faster than justified by underlying economic growth. Since the crisis of 2008, the United States economy has been recovering at the rate of around 2 percent, roughly half the rate seen for much of the past century. The areas of growth are limited in this environment.

    These new private funding channels are creating “unicorns,” companies that haven’t gone public but are valued at $1 billion or more. Unicorns barely existed in 1999. Now there are more than 260 worldwide, with technology companies dominating the list. And if signs emerge that the privately owned unicorns are faltering, the value of publicly owned tech companies is not likely to hold up either.

  20. Tomi Engdahl says:

    The IT project failed again – why?

    It is a misconception that failed IT projects are a relic of the era before DevOps and other agile development models. Large software deployments used to blow their schedules and budgets, and they still do.

    Most of the current failures are still different. Agile development, devops, continuous delivery, and fast failing have changed the nature of IT projects.

    These iterative management methods are designed to minimize catastrophic failure. It is still possible for IT projects to fail, it just looks different than before.

    A cautionary example is a SaaS CRM deployment project. The IT administration drew up the specification together with the sales management side of the business.

    “We thought we had all the support, and that we knew what was expected of the project. But when we finished the project, sales did not want it. The resistance was really strong. Top management was on board, but users were skeptical.”

    The cloud-based CRM was declared unsuccessful and rejected. So a project can stay on schedule and on budget, but still fail.

    “The product may be wonderful and valuable, but if we cannot meet end-user expectations, we have failed.”

    According to McMasters, success would have been more likely if IT had marketed the benefits of the new system instead of focusing on the project itself. “We were not very dedicated. Co-operation with the business could have been closer.”

    There are plenty of similar projects. The Project Management Institute (PMI) carried out a survey of three hundred project management professionals. 28 percent of the strategic projects managed by the respondents failed completely. About 37% of the respondents reported that the failure was mostly due to a lack of clearly defined and/or achievable milestones and goals for measuring the progress of the project. The next most common causes were poor communication (19% of respondents), inadequate communication by senior management (18%), employee resistance (14%) and inadequate funding (9%).

    According to the same survey, organizations lose large sums for every billion dollars invested in badly executed projects.

    PwC asked 2,216 business and IT leaders from 53 countries what stands in the way of digitalisation. Approximately 64% mentioned lack of cooperation between business and IT, 58% inflexible or slow processes, 41% lack of integration between new and existing technologies, 38% obsolete technologies and 37% lack of skilled teams.

    The definition of project success or failure has also expanded. PMI’s report says that the definition of success is changing: “Traditional indicators such as schedule and cost are not enough in a competitive environment. It is equally important whether projects meet the objectives set for them.”

    “In the current customer-centered environment, I would call it a failure if the company’s reputation, results or revenue are affected,” he says. “Failure nowadays involves business processes more than actual technology problems.”

    “If you get something done on time and on budget, but it does not do what customers or users want, the work has gone to waste.”

    Agile development, DevOps and similar approaches help to reduce the probability of a massive IT project failure. For example, the team codes small pieces at a time, each piece is automatically tested and iterated until everything works, and only then does the team move on to the next piece.

    “It works in theory [as a safety net]. Mistakes are looked for more often, so the result should be of higher quality. When done this way, there should be fewer defects,” says Stephen Elliott.

    It also reduces the risk of failure if software development and testing make more use of automation.

    “Most failures are still associated with human factors. Poor code, false network settings, or unmanaged load balancing. Errors are possible because the operating environment is complicated. Increasing automation should still reduce human errors, especially in software development and network management, ”

    “Nowadays, adjustments are being made more and more rapidly, as required.”

    Chris McMasters tries to control the risk of failure by focusing on the intended outcome. He breaks contracts down, DevOps-style, into smaller parts, so that potential problems surface more quickly. In addition, he favors pilot programs where ideas can safely fail. They can innovate without endangering the business.

    There is a problem in the basic nature of agile and DevOps. “Some small problems get caught during development, but potentially bigger problems can only be detected when the pieces are integrated into a whole system,”

    For example, teams using iterative practices may consider the features of a new software to be functional when viewed one by one. In the final application, however, they do not work as well together as a whole.

    “These models, in a way, allow the situation to escalate to the point that system failures only emerge at a high level,”

    Breaking down the silos between business and IT can also increase the risk of failure. Business management may then go for the latest cutting-edge technologies without understanding them or evaluating other alternatives.

    PwC’s survey of 2015 indicates that 68% of the technology costs of organizations are paid outside the IT budget.

    “Business executives buy these products directly and then realize that they also need access to business data or to other parts of the company’s IT infrastructure. Thus, projects are delayed, redefined or cancelled altogether.”

    In other studies, PwC has noticed in recent years that the most common cause of project failures is an “inflexible or slow process”. Other issues include lack of skilled teams and problems with third parties.

    “The risk is still there,” says Stephen Elliott. Even if a new application works well, its deployment can cause problems in a complex IT environment where new and old technologies sit side by side. “The applications are accessed over networks all over the world, and they often run on top of third-party hardware environments. There are really many levels here, all of which are prone to problems.”

    It’s important to remember that whatever the cause of a technology project’s failure, IT is most likely to get the blame, whether it is the cause or not.

    “At the end of the game, IT is usually the one left holding the bag,” says James Stanger. “Therefore, it must always be on guard.”

    Fail fast, it’s worth it

    Now that projects which stay on schedule and on budget, but whose results end-users do not like, also count as failures, companies’ attitudes to risk have changed.

    “Messing up is now acceptable, as long as you learn from your mistakes,” says IDC analyst Stephen Elliott. “Some companies appreciate failures, as long as the situation improves and people learn from them.”

    Elliott points out that organizations that are inclined to accept failures also work diligently to reduce the risks. Sandbox environments, pilot projects and iterative development limit the damage if something goes wrong.

    “They reduce the risk of major failures,”

    The potential of the project and the consequences of its failure vary.

    “We decide where it is acceptable to fail, and where it is not,”

    “We became experts in certain failures. I was pleasantly surprised by the project that took two years because we became very clever, ”

    The attitude that allows for intermittent failure is also vital to other organizations who want to innovate and remain competitive.

    “If you learn all the time and improve your performance, you may also fail,”

    Source: https://www.tivi.fi/Kaikki_uutiset/taas-meni-it-hanke-monkaan-miksi-6698622

  21. Tomi Engdahl says:

    7 hot IT career trends — and 7 going cold
    https://www.cio.com/article/3234364/careers-staffing/hot-and-cold-technology-career-trends.html

    The growing IT skills gap and demand for data pros and hybrid roles are disrupting the traditional IT career path. The following heat map of career trends with help you cash in and avoid dead ends.

    Hot: IT pros taking leadership roles
    Cold: Dev and ops in silos
    Hot: Soft skills
    Cold: Ability to pursue soft skills
    Hot: Analytics certifications
    Cold: Vendor-specific certifications for security
    Hot: Personal relationships with contacts
    Cold: Padding LinkedIn connections
    Hot: Business skills
    Cold: Moving from tech to finance
    Hot: Hybrid roles
    Cold: Jumping ship (vs. moving up)
    Hot: Developing security skills
    Cold: Traditional benefits (vs. work-life balance)

  22. Tomi Engdahl says:

    PC not dead, Apple single-handedly propping up mobe market, says Gartner
    Yes, folks, it’s crystal ball time again
    https://www.theregister.co.uk/2018/01/29/gartner_device_shipment_predictions/

    PC shipments will continue sliding south, reckon Gartner’s mystic mages – but, like Monty Python’s Black Knight, they still refuse to lay down and die.

    Figures from everyone’s favourite analyst haus suggest that shipments of desktop computers will decline from 220 million worldwide in 2016 to 187 million units by 2019.

    Overall, the firm predicts that shipments of mobes, fondleslabs and old-fashioned PC-style boxen will increase by 2.1 per cent this year.

    “The market is on a downward trend for the next couple of years,” research director Ranjit Atwal told The Register. “As we go into 2018 we are looking at a flat PC market… the consumer market is weak.”

    Business PC use has held up, however, with the projection for biz-focused PC shipments set to flatline, Atwal said – a pleasant change from its previous downward trajectory. Atwal reckoned it was business PC shipments that “had been holding up the PC market” as companies migrate from elderly operating systems to Windows 10.

    “The consumer decline was more severe,”

  23. Tomi Engdahl says:

    Bionic Beaver 18.04 LTS to use Xorg by default
    https://insights.ubuntu.com/2018/01/26/bionic-beaver-18-04-lts-to-use-xorg-by-default/

    Bionic Beaver, the codename for the next Ubuntu LTS release, is due in April 2018 and will ship with both the traditional Xorg graphics stack as well as the newer Wayland based stack, but Xorg will be the default.

    17.10, released in October 2017, ships with the Wayland based graphics server as the default and the Xorg based equivalent is available as an option from the login screen.

    Why opt for Xorg by default? There are three main reasons:

    1. Screen sharing in software like WebRTC services, Google Hangouts, Skype, etc works well under Xorg.
    2. Remote Desktop control for example RDP & VNC works well under Xorg.
    3. Recoverability from Shell crashes is less dramatic under Xorg.

    The architecture of GNOME Shell and Mutter is such that a GNOME Shell crash will end your whole session, killing running applications and returning you to the login screen. When using Xorg, the shell can restart independently of the display server and running applications. This means that once the shell is restarted, you can pretty much pick up your session from where you left off, with your applications still running.
    There are two solutions to this problem when using Wayland: make sure the shell doesn’t crash or change the architecture. Both of these are work in progress and we continue to contribute to this work upstream.

  24. Tomi Engdahl says:

    Mark Gurman / Bloomberg:
    Sources: Apple working on three Mac models, including laptops and a new desktop, with custom co-processors; new iPad to be released toward the end of the year

    How Apple Built a Chip Powerhouse to Threaten Qualcomm and Intel
    https://www.bloomberg.com/graphics/2018-apple-custom-chips/

    The company already makes many of the chips for its iPhones, iPads, Macs and Watches.

    For several years, Apple has been steadily designing more and more of the chips powering its iPhones, iPads, Macs and Apple Watches. This creates a better user experience and helps trump rivals. Recently the company got a fresh incentive to go all-in on silicon: revelations that microprocessors with components designed by Intel Corp., Arm Holdings Plc and Advanced Micro Devices, Inc. are vulnerable to hacking.

    Steve Jobs long believed Apple should own the technologies inside its products rather than rely on mashups of components from other chip makers, including Samsung, Intel and Imagination Technologies.

    That original “system-on-a-chip” has since been succeeded by increasingly powerful processors.

    So far, only two Mac lines include custom Apple processors: the MacBook Pro with Touch Bar and the iMac Pro. Apple is working on at least three updated Mac models with custom co-processors for release as soon as this year, including updated laptops and a new desktop, according to a person familiar with the plan.

  25. Tomi Engdahl says:

    Been bugging the boss for a raise? Now’s the time to go into infosec
    Security specialists to command 7% salary hikes, survey finds
    https://www.theregister.co.uk/2018/01/31/it_salary_survey/

    Cybersecurity specialists will enjoy the highest salary increases among IT professionals with rises of 7 per cent – compared to 2 per cent for devs and 3 per cent for infrastructure experts – according to a survey by recruitment consultancy Robert Walters.

    Infosec bods have become ever more highly sought in the wake of high-profile data leaks and cyber attacks. Developers have been in demand to support digitalisation projects.

    “At this point, salaries for IT professionals are highly inflated, with employers having to compete to secure top talent. In this context, the increases for cybersecurity specialists are particularly noteworthy.”

    Companies want more than technical know-how, Iqbal added.

    “Employers are keen to secure professionals who can demonstrate communication and project management skills as they look to more closely integrate their IT function into the wider business.”

  26. Tomi Engdahl says:

    Gartner Says Worldwide Device Shipments Will Increase 2.1 Percent in 2018
    By 2021, 9 Percent of Smartphones Sold Will Support 5G
    https://www.gartner.com/newsroom/id/3849063

    Worldwide shipments of devices — PCs, tablets and mobile phones — totaled 2.28 billion units in 2017, according to Gartner, Inc. Shipments are on course to reach 2.32 billion units in 2018, an increase of 2.1 percent.

  27. Tomi Engdahl says:

    Bitcoin mania is hurting PC gamers by pushing up GPU prices
    https://www.theverge.com/2018/1/30/16949550/bitcoin-graphics-cards-pc-prices-surge

    Bitcoin and other cryptocurrencies like Ethereum, Ripple, and Litecoin have soared in value over the past year, thanks to continued interest from a range of investors. As the price of these cryptocurrencies has increased, graphics cards have also seen big price increases thanks to retail stock shortages. A range of mid- or high-end graphics cards from AMD or Nvidia are in short supply, mostly due to cryptocurrency miners buying them in bulk to build machines to mine bitcoin and similar cryptocurrencies.

    Polygon reports that pricing for Nvidia’s GeForce GTX 1070 should be around $380 (depending on the model), but that some cards are now being sold for more than $700 due to the stock shortages – an increase of more than 80 percent. Cryptocurrency miners use stacks of graphics cards to solve the mathematical problems needed to authenticate payments on the network and create new bitcoin.

  28. Tomi Engdahl says:

    Fujifilm, Xerox throw each other a US$6.1 billion lifeline
    We’ll buy the J/V from you so you can use the cash to buy us, goddit?
    https://www.theregister.co.uk/2018/02/01/fujifilm_xerox_deal/

    Fujifilm has announced a $US6.1 billion deal to take control of troubled Xerox.

    It’s a complex deal: the Fuji Xerox joint venture (75 per cent owned by Fujifilm) will spend $US6.1 billion buying Fujifilm’s stake in the J/V, and Fujifilm will use those billions to buy a 50.1 per cent stake in Xerox.

    The combined company will have revenue of $18 billion, and the companies believe they’ll be able to trim costs by around $1.7bn by 2022.

    Reuters reported that Fujifilm has already announced layoffs of around 10,000 among its Asia-Pacific staff.

  29. Tomi Engdahl says:

    The Document Foundation announced the release of LibreOffice 6 today: “a major release and a dramatically improved free office suite, which celebrates the 7th anniversary of the availability of the very first version of LibreOffice.”

    The Document Foundation announces LibreOffice 6.0: power, simplicity, security and interoperability from desktop to cloud
    https://blog.documentfoundation.org/blog/2018/01/31/libreoffice-6/

    The Document Foundation announces LibreOffice 6.0, a major release and a dramatically improved free office suite, which celebrates the 7th anniversary of the availability of the very first version of LibreOffice. Today LibreOffice is more powerful, simple and secure, and offers superior interoperability with Microsoft Office documents.

    LibreOffice 6.0 is immediately available for Windows, macOS and Linux, and for the cloud. The new major release adds a large number of significant new features to the core engine and to individual modules (Writer, Calc and Impress/Draw), with the objective of providing users with the best in terms of personal productivity.

  30. Tomi Engdahl says:

    GDC Rescinds Award For Atari Founder Nolan Bushnell After Criticisms of Sexually Inappropriate Behavior
    https://games.slashdot.org/story/18/01/31/2133204/gdc-rescinds-award-for-atari-founder-nolan-bushnell-after-criticisms-of-sexually-inappropriate-behavior?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+Slashdot%2Fslashdot%2Fto+%28%28Title%29Slashdot+%28rdf%29%29

    The organizers of the Game Developers Choice Awards announced today that they have rescinded the Pioneer Award for Atari founder Nolan Bushnell, and announced the award will not be given this year entirely. “The decision follows a day of outcry after GDC organizers announced that Bushnell, 74, had been tapped for the GDCA’s lifetime achievement honor,”

    GDC cancels achievement award for Atari founder after outcry (update)
    Nolan Bushnell’s past prompts #NotNolan social media campaign
    https://www.polygon.com/2018/1/31/16955152/nolan-bushnell-gdc-pioneer-award-notnolan-metoo

    The organizers of the Game Developers Choice Awards announced today that they have rescinded the Pioneer Award for Atari founder Nolan Bushnell, and announced the award will not be given this year entirely.

    The decision follows a day of outcry after GDC organizers announced that Bushnell, 74, had been tapped for the GDCA’s lifetime achievement honor. News accounts and histories over the past several years have documented a history of workplace misconduct and sexist behavior toward women by Bushnell, during Atari’s early days.

  31. Tomi Engdahl says:

    RAID 5 vs RAID 10: Recommended RAID For Safety and Performance
    https://www.cyberciti.biz/tips/raid5-vs-raid-10-safety-performance.html

    in Categories File system, FreeBSD, Hardware, Linux, OpenBSD, RedHat/Fedora Linux, Storage, Suse Linux, UNIX, Windows server

    RAID 5 vs RAID 10

    RAID 10 = Combining features of RAID 0 + RAID 1. It provides optimization for fault tolerance.

    RAID 0 helps to increase performance by striping volume data across multiple disk drives.

    RAID 1 provides disk mirroring which duplicates your data.

    In some cases, RAID 10 offers faster data reads and writes than RAID 5 because it does not need to manage parity.

    What is the cost of reduced performance and possibly reduced customer satisfaction? Finally what is the cost of lost business if data is unrecoverable? I maintain that the drives are FAR cheaper! Hence my mantra:
    NO RAID5! NO RAID5! NO RAID5!
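
    To make the capacity side of that trade-off concrete, a small illustrative calculation (my own sketch, not from the article):

    ```python
    # Usable capacity for RAID 5 vs RAID 10 with n identical disks.
    # Illustrative arithmetic only; both levels survive a single disk failure,
    # and RAID 10 may survive more if failures hit different mirror pairs.
    def raid5_usable(n_disks: int, disk_tb: float) -> float:
        return (n_disks - 1) * disk_tb    # one disk's worth of space goes to parity

    def raid10_usable(n_disks: int, disk_tb: float) -> float:
        return (n_disks // 2) * disk_tb   # half the disks hold mirror copies

    n, size = 8, 4.0                      # e.g. 8 x 4 TB drives
    print(f"RAID 5:  {raid5_usable(n, size):.0f} TB usable "
          f"({raid5_usable(n, size) / (n * size):.0%} efficiency)")
    print(f"RAID 10: {raid10_usable(n, size):.0f} TB usable "
          f"({raid10_usable(n, size) / (n * size):.0%} efficiency)")
    ```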

    A note about backup
    No RAID level will protect you from multiple disk failures. While one disk is offline for any reason, your disk array is not fully redundant. Therefore, good old tape backups are always recommended.

  32. Tomi Engdahl says:

    Bloomberg:
    Sources: Intel plans to sell a majority stake in its AR business, seeking as much as $350M, which has been working on smart glasses for smartphones — Smart specs will allow users to see contextual information — Division valued at as much as $350 million; Intel keeps stake

    Intel Is Said to Plan Sale of Majority Stake in AR Glasses Unit
    https://www.bloomberg.com/news/articles/2018-02-01/intel-is-said-to-plan-sale-of-majority-stake-in-ar-glasses-unit

  33. Tomi Engdahl says:

    Chrome OS 64 rolling out w/ new screenshot shortcut, Android app updates, security patches
    https://9to5google.com/2018/02/01/google-chrome-os-64-features/

    Following updates to Android, Mac, Windows, and Linux last week, version 64 is now rolling out to Chrome OS. In addition to several browser enhancements, Chromebooks add a new screenshot shortcut and a number of changes for Android apps.

    Convertible Chrome OS devices add a new Android-inspired shortcut for taking screenshots. Especially convenient in tablet mode, pressing the power and volume down buttons will capture your current screen.

    Meanwhile, this version contains a bevy of Android app-related improvements. Release notes reveal a revamped Intent Picker for Play applications with a “same window by default with override” behavior. It is now possible to enable VPN for Play apps, while there are also “Android Container auto update optimizations.”

  34. Tomi Engdahl says:

    Besides the XPoint: Persistent memory tech is cool, but the price tag… OUCH
    No economies of scale = piss-poor adoption
    https://www.theregister.co.uk/2018/02/02/price_preventing_persistent_memory_popularity/

    The prospects of XPoint and other persistent memory technologies becoming a standard part of servers’ design is being held up because the darn stuff costs too much, an analyst has said.

    That’s because it is made in small quantities so no economies of scale occur which would make it cheaper.

    Object Analysis analyst Jim Handy explained this at the SNIA’s Flash Memory Summit in January and it starts with the memory hierarchy idea.

    There is a sweet spot diagonal running up the chart from left to right, from tape (slow/cheap) at the bottom through disk, SSD, DRAM and the cache levels to L1 cache, which is the fastest and most expensive item on the chart.

    Any new technology aiming to punch its way into the memory hierarchy at any point needs to be better performing than things below it and less expensive than items above it on the chart.

    We have seen NVDIMMs trying to push their way into the SSD-DRAM gap and generally failing – witness Diablo Technologies.

    Handy said NAND suffered from the same issue until around 2004. Before then, in its then SLC (1 bit/cell) form, it was more expensive than DRAM ($/GB) even though a 100mm die using a 44nm process stored 8GB compared to an equivalent DRAM die storing 4GB. Twice the bits should mean half the cost, but it didn’t – because not enough of the stuff was made to bring in economies of scale.

    Ever since, NAND and DRAM prices have been diverging, with, El Reg suggests, MLC (2 bits/cell), TLC (3 bits/cell) and 3D NAND (many more bits/die) increasing the separation.

    Make more, support more

    The NAND and NVDIMM-N lessons for XPoint, and other persistent memory technologies aimed at the same DRAM-NAND gap, is that their manufacturing volume needs to be high enough to provide a cost-performance profile matching that of the gap placement on the memory hierarchy chart

    Their manufacturing volume needs to approach that of DRAM, in Handy’s view. They also need software support, particularly for persistence, and this is coming on Linux, Windows and VMware. The initial PM take-up will be for performance, and require faster-than-NAND performance and lower-than-DRAM pricing.

  35. Tomi Engdahl says:

    Dawn Of The Data-Centric Era
    What markets and technologies will drive growth in 2018.
    https://semiengineering.com/dawn-of-the-data-centric-era/

  36. Tomi Engdahl says:

    Hadoop Has Failed Us, Tech Experts Say
    https://www.datanami.com/2017/03/13/hadoop-failed-us-tech-experts-say/

    The Hadoop dream of unifying data and compute in a distributed manner has all but failed in a smoking heap of cost and complexity, according to technology experts and executives who spoke to Datanami.

    “I can’t find a happy Hadoop customer. It’s sort of as simple as that,” says Bob Muglia, CEO of Snowflake Computing, which develops and runs a cloud-based relational data warehouse offering. “It’s very clear to me, technologically, that it’s not the technology base the world will be built on going forward.”

    “The number of customers who have actually successfully tamed Hadoop is probably less than 20 and it might be less than 10,” Muglia says. “That’s just nuts given how long that product, that technology has been in the market and how much general industry energy has gone into it.”

    One of the companies that supposedly tamed Hadoop is Facebook, which developed relational database technologies like Hive and Presto to enable SQL querying for data stored in HDFS.

    Hadoop’s strengths lie in serving as a cheap storage repository and for processing ETL batch workloads, Johnson says. But it’s ill-suited for running interactive, user-facing applications

    Hadoop is great if you’re a data scientist who knows how to code in MapReduce or Pig, Johnson says, but as you go higher up the stack, the abstraction layers have mostly failed to deliver on the promise of enabling business analysts to get at the data.

    The Hadoop community has so far failed to account for the poor performance and high complexity of Hadoop, Johnson says. “The Hadoop ecosystem is still basically in the hands of a small number of experts,”

    A better architecture for enabling people to be productive with big data can be found in Apache Kafka, Johnson says.

    While Kafka is included in many Hadoop distributions, Kreps purposely avoided building any Hadoop dependencies into Kafka, and he strived to make it as simple to use as possible.

    “Kafka is definitely its own thing. It runs standalone. It has no connection to Hadoop. I think that’s absolutely a good thing for people who are trying to build production application,”

  37. Tomi Engdahl says:

    10 bad DevOps habits to break
    https://enterprisersproject.com/article/2018/1/10-bad-devops-habits-break?sc_cid=7016000000127ECAAY

    Some common DevOps practices can turn into bad habits. Nip them in the bud now for greater success in 2018

  38. Tomi Engdahl says:

    Google architecture
    https://plus.google.com/+RipRowan/posts/eVeouesvaVX

    The best article I’ve ever read about architecture and the management of IT.

  39. Tomi Engdahl says:

    The Windows 10 operating system has finally become the world’s most popular computer operating system.

    According to Statcounter, Windows 10 is now installed on 42.78 percent of the world’s PCs. Windows 7 share fell to 41.86 percent. The share of other versions of Windows is already well below 10 percent.

    Windows 10 was released in July 2015.

    Source: http://etn.fi/index.php?option=com_content&view=article&id=7506&via=n&datum=2018-02-05_15:18:57&mottagare=31202

  40. Tomi Engdahl says:

    Surviving Your Digital Transformation
    http://www.securityweek.com/surviving-your-digital-transformation

    Digital Transformation Without an Equivalent Security Transformation is Leaving Organizations More Vulnerable

    2018 is lining up to be the year of Digital Transformation. Just about every organization looking to remain viable in the growing digital marketplace has some sort of digital transformation in progress or one in the planning stages for this year. These projects range from implementing basic applications to better interact with online consumers, to converging OT and IT networks, or even pushing their entire infrastructure to the cloud.

    But digital transformation without an equivalent security transformation is leaving organizations more vulnerable than ever. The results are alarming. According to Gartner, nearly $90 billion was spent on information security in 2017 and is expected to top a trillion dollars over the next five years. But cybercrime over that same period is expected to continue to rise. In spite of our efforts, we are falling further and further behind.

    An Outside-In Look at Digital Transformation
    http://www.securityweek.com/outside-look-digital-transformation

    Digital Transformation is a Massive Undertaking and Must be Entered into With Equal Thought to Security and Business Strategy

  41. Tomi Engdahl says:

    Small business IT spending to pass $600 billion in 2018: IDC
    http://www.zdnet.com/article/small-business-it-spending-to-pass-600-billion-in-2018-idc/

    IDC expects SMBs to spend the most on devices, including personal computing devices, peripherals, and mobile phones.

  42. Tomi Engdahl says:

    Bloomberg:
    US defense contractor General Dynamics to acquire government IT services provider CSRA for $6.8B, making it the second largest provider of federal IT services

    Tank-Maker General Dynamics Renews IT Bet With $6.8 Billion Deal
    https://www.bloomberg.com/news/articles/2018-02-12/general-dynamics-to-buy-csra-for-6-8-billion-to-add-it-services

    General Dynamics Corp., the maker of Abrams tanks and nuclear submarines, is making a record bet on reinvigorating its information-technology business.

    The U.S. defense contractor agreed to buy CSRA Inc. for about $6.8 billion to expand its computer-services offerings for government agencies and military customers.

  43. Tomi Engdahl says:

    Intel’s Microprocessor Share Slips Below 60%
    https://www.eetimes.com/document.asp?doc_id=1332968

    With PC shipments in continual decline and smartphone growth exploding over the past few years, Intel’s share of the overall microprocessor market has slipped to below 60 percent, according to market watcher IC Insights.

    While PC processors for computers and servers still account for more than half of the MPU market — a projected 52 percent in 2018 — ARM-based mobile SoCs and embedded processors for automotive, IoT and other applications grew faster than other categories of MPUs over the past five years, IC Insights said.

    Intel has long been the dominant supplier of PC microprocessors, nearly all of which are based on Intel’s x86 architecture and supplied by Intel and its significantly smaller rival, AMD. Intel’s share of the overall MPU market had been more than 75 percent for most of the last decade, IC Insights noted.

