The PC market seems to be stabilizing in 2016. I expect the PC market to shrink slightly. While mobile devices have been named as culprits for the fall of PC shipments, IDC said that other factors may be in play. It is still pretty hard to make any decent profit building PC hardware unless you are one of the biggest players – so again Lenovo, HP, and Dell are increasing their collective dominance of the PC market like they did in 2015. I expect changes like spin-offs and maybe some mergers with smaller players like Fujitsu, Toshiba and Sony. The EMEA server market looks to be a two-horse race between Hewlett Packard Enterprise and Dell, according to Gartner. HPE, Dell and Cisco “all benefited” from Lenovo’s acquisition of IBM’s EMEA x86 server organisation.
The tablet market is no longer a high-growth market – tablet sales have started to decline, and the decline continues in 2016 as owners are holding onto their existing devices for more than 3 years. iPad sales are set to continue to decline, and the iPad Air 3 to be released in the first half of 2016 does not change that. IDC predicts that the detachable tablet market is set for growth in 2016 as more people are turning to hybrid devices. Two-in-one tablets have been popularized by offerings like the Microsoft Surface, with options ranging dramatically in price and specs. I am not myself convinced that the growth will be as strong as IDC forecasts, even though companies have started to purchase tablets for workers in jobs such as retail sales or field work (Apple iPads, Windows and Android tablets managed by the company). Combined volume shipments of PCs, tablets and smartphones are expected to increase only in the single digits.
All your consumer tech gear should be cheaper come July, as there will be fewer import tariffs on IT products: a World Trade Organization (WTO) deal agrees that tariffs on imports of consumer electronics will be phased out over 7 years starting in July 2016. The agreement affects around 10 percent of the world trade in information and communications technology products and will eliminate around $50 billion in tariffs annually.
In 2015 storage was rocked to its foundations, and those new innovations will be taken into wider use in 2016. The storage market in 2015 went through strategic foundation-shaking turmoil as the external shared disk array storage playbook was torn to shreds: The all-flash data centre idea has definitely taken off as a vision that could be achieved, so that primary data is stored in flash with the rest being held in cheap and deep storage. Flash drives generally solve the disk drive latency access problem, so there is not so much need for hybrid drives. There is conviction that storage should be located as close to servers as possible (virtual SANs, hyper-converged infrastructure appliances and NVMe fabrics). The existing hybrid cloud concept was adopted/supported by everybody. Flash started out in 2-bits/cell MLC form, this rapidly became standard, and TLC (3-bits/cell, or triple-level cell) had started appearing. Industry-standard NVMe drivers for PCIe flash cards appeared. Intel and Micron blew non-volatile memory preconceptions out of the water in the second half of the year with their joint 3D XPoint memory announcement. Boring old disk tech got shingled magnetic recording (SMR) and helium-filled drive technology; the drive industry is focused on capacity-optimizing its drives. We got key:value store disk drives with an Ethernet NIC on-board, and basic GET and PUT object storage facilities came into being. The tape industry developed a 15TB LTO-7 format.
The use of SSDs will increase and their price will drop. SSDs were in more than 25% of new laptops sold in 2015. SSDs are expected to be in 31% of new consumer laptops in 2016 and more than 40% by 2017. The prices of mainstream consumer SSDs have fallen dramatically every year over the past three years, while HDD prices have not changed much. SSD prices will decline to 24 cents per gigabyte in 2016. In 2017 they’re expected to drop to 11-17 cents per gigabyte (meaning a 1TB SSD on average would retail for $170 or less).
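To put those $/GB forecasts in concrete terms, here is a quick back-of-the-envelope calculation (the cent figures are the forecast numbers above, not real quotes):

```python
def ssd_retail_price_usd(capacity_gb, cents_per_gb):
    # Straight-line cost: capacity times forecast price per gigabyte
    return capacity_gb * cents_per_gb / 100

# 2016 forecast: 24 cents/GB -> a 1 TB drive around $240
price_2016 = ssd_retail_price_usd(1000, 24)
# 2017 forecast upper bound: 17 cents/GB -> $170 or less for 1 TB
price_2017 = ssd_retail_price_usd(1000, 17)
print(price_2016, price_2017)  # 240.0 170.0
```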
Hard disk sales will decrease, but this technology is not dead. Sales of hard disk drives have been decreasing for several years now (118 million units in the third quarter of 2015), but according to Seagate hard disk drives (HDDs) are set to stay relevant for at least 15 to 20 years. HDDs remain the most popular data storage technology as they are the cheapest in terms of per-gigabyte costs. While SSDs are generally getting more affordable, high-capacity solid-state drives are not going to become as inexpensive as hard drives any time soon.
Because all-flash storage systems with homogeneous flash media are still too expensive to serve as a solution for every enterprise application workload, enterprises will increasingly turn to performance-optimized storage solutions that use a combination of multiple media types to deliver cost-effective performance. The speed advantage of Fibre Channel over Ethernet has evaporated. Enterprises are also starting to seek alternatives to snapshots that are simpler and easier to manage, and that will allow data and application recovery to a second before the data error or logical corruption occurred.
Local storage and the cloud finally make peace in 2016 as the decision-makers across the industry have now acknowledged the potential for enterprise storage and the cloud to work in tandem. Over 40 percent of data worldwide is expected to live on or move through the cloud by 2020 according to IDC.
Open standards for data center development are now a reality thanks to advances in cloud technology. Facebook’s Open Compute Project has served as the industry’s leader in this regard. This allows more consolidation for those that want it. Consolidation used to refer to companies moving all of their infrastructure to the same facility. However, some experts have begun to question this strategy, as the rapid increase in data quantities and apps in the data center has made centralized facilities more difficult to operate than ever before. Server virtualization, more powerful servers and an increasing number of enterprise applications will continue to drive higher IO requirements in the datacenter.
Cloud consolidation starts in earnest in 2016: the number of options for general infrastructure-as-a-service (IaaS) cloud services and cloud management software will be much smaller at the end of 2016 than at the beginning. The major public cloud providers will gain strength, with Amazon, IBM SoftLayer, and Microsoft capturing a greater share of the business cloud services market. Lock-in is a real concern for cloud users, because PaaS players have the ancient imperative to find ways to tie customers to their platforms and aren’t afraid to use them, so advanced users want to establish reliable portability across PaaS products in a multi-vendor, multi-cloud environment.
The year 2016 will be harder for legacy IT providers than 2015. In its report, IDC states that “By 2020, More than 30 percent of the IT Vendors Will Not Exist as We Know Them Today.” Many enterprises are turning away from traditional vendors and toward cloud providers. They’re increasingly leveraging open source. In short, they’re becoming software companies. The best companies will build cultures of performance and doing the right thing – and will make data and the processes around it self-service for all their employees. Design thinking will guide companies that want to change the lives of their customers and employees. 2016 will see a lot more work in trying to manage services that simply aren’t designed to work together, or even to be managed – for example, getting whatever-as-a-service cloud systems to play nicely with existing legacy systems. So competent developers are the scarce commodity. Some companies are starting to see the cloud as a form of outsourcing that is fast burning up in-house IT ops jobs, with varying success.
There are still too many old-fashioned companies that just can’t understand what digitalization will mean to their business. In 2016, some companies’ boards still think the web is just for brochures and porn and don’t believe their business models can be disrupted. It gets worse for many traditional companies. For example, Amazon is a retailer both on the web and, increasingly, for things like food deliveries. Amazon and others are playing to win. Digital disruption has happened and will continue.
There will be more Windows 10 in 2016. If 2015 was a year of revolution, 2016 promises to be a year of consolidation for Microsoft’s operating system. I expect that Windows 10 adoption in companies will start in 2016. Windows 10 is likely to be a success for the enterprise, but I expect that the word from heavyweights like Gartner, Forrester and Spiceworks, suggesting that half of enterprise users plan to switch to Windows 10 in 2016, is more than a bit optimistic. Windows 10 will also be used in China, as Microsoft played the game with it better than with Windows 8, which was banned in China.
Windows is now delivered “as a service”, meaning incremental updates with new features as well as security patches, but Microsoft still seems to work internally to a schedule of milestone releases. Next up is Redstone, rumoured to arrive around the anniversary of Windows 10, midway through 2016. Windows servers will also get an update: 2016 should include the release of Windows Server 2016. Server 2016 includes updates to the Hyper-V virtualisation platform, support for Docker-style containers, and a new cut-down edition called Nano Server.
Windows 10 will get some of the features promised but not delivered in 2015. Windows 10 was promised to come to PCs and mobile devices in 2015 to deliver a unified user experience. Continuum is a new, adaptive user experience offered in Windows 10 that optimizes the look and behavior of apps and the Windows shell for the physical form factor and the customer’s usage preferences. The promise was the same unified interface for PCs, tablets and smartphones – but in 2015 it was delivered only for PCs and some tablets. Mobile Windows 10 for smartphones is expected to finally ship in 2016 – the release of Microsoft’s new Windows 10 operating system may be the last roll of the dice for its struggling mobile platform. Microsoft’s Plan A is to get as many apps and as much activity as it can on Windows on all form factors with the Universal Windows Platform (UWP), which enables the same Windows 10 code to run on phone and desktop. Despite a steady inflow of new well-known apps, it remains unclear whether the Universal Windows Platform can maintain momentum with developers. Can Microsoft keep the developer momentum going? I am not sure. In addition there are also plans for tools for porting iOS apps and an Android runtime, so expect delivery of some or all of the Windows Bridges (iOS, web app, desktop app, Android) announced at the April 2015 Build conference, in the hope of getting more apps into the unified Windows 10 app store. Windows 10 does hold out some promise for Windows Phone, but it’s not going to make an enormous difference. Losing the battle for the web and mobile computing is a brutal loss for Microsoft. When you consider the size of those two markets combined, the desktop market seems like a stagnant backwater.
Older Windows versions will not die in 2016 as fast as Microsoft and security people would like. Expect Windows 7 diehards to continue holding out in 2016 and beyond. And there are still many companies that run their critical systems on Windows XP, as “there are some people who don’t have an option to change.” Many times the OS is running in automation and process control systems that run business- and mission-critical systems, both in private sector and government enterprises. For example, the US Navy is using the obsolete Microsoft Windows XP operating system to run critical tasks. It all comes down to money and resources, but if someone is obliged to keep something running on an obsolete system, it’s completely the wrong approach to information security.
Virtual reality has grown immensely over the past few years, but 2016 looks like the most important year yet: it will be the first time that consumers can get their hands on a number of powerful headsets for viewing alternate realities in immersive 3-D. Virtual reality will move toward the mainstream when Sony, Oculus and Samsung bring consumer products to market in 2016. The whole virtual reality hype cycle could be rebooted as early builds of the final Oculus Rift hardware start shipping to developers. Maybe HTC’s and Valve’s Vive VR headset will suffer delays in the next few months. Expect a banner year for virtual reality.
GPU and FPGA acceleration will be used widely in high-performance computing. Both Intel and AMD have products with a CPU and GPU on the same chip, and there is software support for using the GPU (learn CUDA and/or OpenCL). Many mobile processors also have a CPU and GPU on the same chip. FPGAs are circuits that can be baked into a specific application, but can also be reprogrammed later. There was lots of interest in 2015 in using FPGAs for accelerating computations as the next step after GPUs, and I expect that interest will grow even more in 2016. FPGAs are not quite as efficient as a dedicated ASIC, but they are about as close as you can get without translating the actual source code directly into a circuit. Intel bought Altera (a big FPGA company) in 2015 and plans in 2016 to begin selling products with a Xeon chip and an Altera FPGA in a single package – possibly available in early 2016.
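To show the programming model CUDA and OpenCL share, here is a pure-Python sketch of a SAXPY kernel: each “work item” computes one output element independently, which is exactly what lets a GPU run thousands of them in parallel. The serial launch loop below only stands in for the hardware scheduler; this is an illustration of the model, not real GPU code.

```python
def saxpy_kernel(gid, alpha, x, y, out):
    # One work-item: gid plays the role of get_global_id(0) in OpenCL
    # (or blockIdx.x * blockDim.x + threadIdx.x in CUDA)
    out[gid] = alpha * x[gid] + y[gid]

def launch(kernel, n, *args):
    # On a GPU these n invocations run concurrently; here we serialize them
    for gid in range(n):
        kernel(gid, *args)

x = [1.0, 2.0, 3.0]
y = [10.0, 10.0, 10.0]
out = [0.0] * len(x)
launch(saxpy_kernel, len(x), 2.0, x, y, out)
print(out)  # [12.0, 14.0, 16.0]
```

Because no work-item reads another's output, the loop order does not matter – that independence is what makes the computation trivially parallel.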
Artificial intelligence, machine learning and deep learning will be talked about a lot in 2016. Neural networks, which have been academic exercises (but little more) for decades, are increasingly becoming mainstream success stories: Heavy (and growing) investment in the technology – which enables the identification of objects in still and video images, words in audio streams, and the like after an initial training phase – comes from the formidable likes of Amazon, Baidu, Facebook, Google, Microsoft, and others. So-called “deep learning” has been enabled by the combination of the evolution of traditional neural network techniques, the steadily increasing processing “muscle” of CPUs (aided by algorithm acceleration via FPGAs, GPUs, and, more recently, dedicated co-processors), and the steadily decreasing cost of system memory and storage. There were many interesting releases on this at the end of 2015: Facebook Inc. released portions of its Torch software in February, while Alphabet Inc.’s Google division open-sourced parts of its TensorFlow system. IBM also turned up the heat under competition in artificial intelligence by making SystemML freely available to share and modify through the Apache Software Foundation. So I expect that 2016 will be the year these are tried in practice. I expect that deep learning will be hot at CES 2016. Several respected scientists issued a letter warning about the dangers of artificial intelligence (AI) in 2015, but I don’t worry about a rogue AI exterminating mankind. I worry about an inadequate AI being given control over things that it’s not ready for. How will machine learning affect your business? MIT has a good free intro to AI and ML.
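As a minimal, self-contained taste of what frameworks like Torch and TensorFlow automate at scale, here is a single sigmoid neuron trained by gradient descent in plain Python. This is only a toy sketch – real deep learning stacks millions of these units and differentiates them automatically – but the training loop below is the same idea in miniature.

```python
import math
import random

def train_neuron(data, epochs=2000, lr=0.5):
    # One sigmoid neuron with two inputs, trained by gradient descent
    # on squared error -- the basic building block deep networks stack.
    random.seed(0)
    w = [random.uniform(-1, 1) for _ in range(2)]
    b = 0.0
    for _ in range(epochs):
        for x, target in data:
            z = w[0] * x[0] + w[1] * x[1] + b
            out = 1 / (1 + math.exp(-z))
            # Chain rule: error times the sigmoid's derivative
            grad = (out - target) * out * (1 - out)
            w[0] -= lr * grad * x[0]
            w[1] -= lr * grad * x[1]
            b -= lr * grad
    return w, b

def predict(w, b, x):
    return 1 / (1 + math.exp(-(w[0] * x[0] + w[1] * x[1] + b)))

# "Training phase": learn logical AND from four labeled examples
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_neuron(data)
```

After training, `predict(w, b, (1, 1))` is close to 1 and the other three inputs score close to 0 – the neuron has learned the decision boundary from examples rather than explicit rules.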
Computers, which excel at big data analysis, can help doctors deliver more personalized care. Can machines outperform doctors? Not yet. But in some areas of medicine, they can make the care doctors deliver better. Humans repeatedly fail where computers — or humans behaving a little bit more like computers — can help. Computers excel at searching and combining vastly more data than a human so algorithms can be put to good use in certain areas of medicine. There are also things that can slow down development in 2016: To many patients, the very idea of receiving a medical diagnosis or treatment from a machine is probably off-putting.
The Internet of Things (IoT) was talked about a lot in 2015, and it will be a hot topic for IT departments in 2016 as well. Many companies will notice that security issues are important in it. The newest wearable technology – smart watches and other smart devices – responds to voice commands and interprets the data we produce: it learns from its users and generates appropriate responses in real time. Interest in the Internet of Things (IoT) will also bring interest in real-time business systems: not only real-time analytics, but real-time everything. This will start in earnest in 2016, but the trend will take years to play out.
Connectivity and networking will be hot. And it is not just about IoT. CES will focus on how connectivity is proliferating everything from cars to homes, realigning diverse markets. The interest will affect job markets: Network jobs are hot; salaries expected to rise in 2016 as wireless network engineers, network admins, and network security pros can expect above-average pay gains.
Linux will stay big in the network server market in 2016. The web server marketplace is one arena where Linux has had the greatest impact. Today, the majority of web servers are Linux boxes. This includes most of the world’s busiest sites. Linux will also run many parts of our Internet infrastructure that moves the bits from server to user. Linux will also continue to rule the smartphone market, being at the core of Android. New IoT solutions will most likely be built using Linux in many parts of the systems.
Microsoft and Linux are not such enemies as they were a few years ago. Common sense says that Microsoft and the FOSS movement should be perpetual enemies. It looks like Microsoft is waking up to the fact that Linux is here to stay. Microsoft cannot feasibly wipe it out, so it has to embrace it. Microsoft is already partnering with Linux companies to bring popular distros to its Azure platform. In fact, Microsoft has even gone so far as to create its own Linux distro for its Azure data center.
Web browsers are increasingly going 64-bit, as Firefox started the 64-bit era on Windows and Google is killing Chrome for 32-bit Linux. At the same time web browsers are losing old legacy features like NPAPI and Silverlight. Who will miss them? The venerable NPAPI plugin standard, which dates back to the days of Netscape, is now showing its age and causing more problems than it solves, and will see native support removed from Firefox by the end of 2016. It was already removed from Google Chrome with very little impact. The biggest issue was the lack of support for Microsoft’s Silverlight, which brought down several top streaming media sites – but they are actively switching to HTML5 in 2016. I don’t miss Silverlight. Flash will continue to be available owing to its popularity for web video.
SHA-1 will be at least partially retired in 2016. Due to recent research showing that SHA-1 is weaker than previously believed, Mozilla, Microsoft and now Google are all considering bringing the deadline forward by six months to July 1, 2016.
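The migration is easy to see (and to make) in code: Python’s standard `hashlib` exposes both the algorithm being retired and its recommended SHA-2 replacement.

```python
import hashlib

msg = b"certificate contents to be signed"

# SHA-1: 160-bit digest (40 hex chars), being phased out for signatures
print(hashlib.sha1(msg).hexdigest())

# SHA-256: the SHA-2 family member browsers and CAs are moving to
print(hashlib.sha256(msg).hexdigest())
```

The longer digest is only part of the story – the real reason for the switch is that collision attacks against SHA-1 are becoming practical, while SHA-256 has no known weakness of that kind.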
Adobe’s Flash has been under attack from many quarters over security as well as for slowing down web pages. If you wish that Flash would finally be dead in 2016, you might be disappointed. Adobe seems to be trying to kill the name with a rebranding trick: Adobe Flash Professional CC is now Adobe Animate CC. In practice it probably does not mean much, but Adobe seems to acknowledge the inevitability of an HTML5 world. Adobe wants to remain a leader in interactive tools, and the pivot to HTML5 requires new messaging.
The trend of trying to use the same language and tools on both the user end and the server back-end continues. Microsoft is pushing its .NET and Azure cloud platform tools. Amazon, Google and IBM have their own sets of tools. Java is in decline. JavaScript is going strong on both the web browser and server ends with node.js, React and many other JavaScript libraries. Apple is also trying to bend its Swift programming language, now used mainly to make iOS applications, to run on servers with project Perfect.
Java will still stick around, but Java’s decline as a language will accelerate as new stuff isn’t being written in Java, even if it runs on the JVM. We will not see Java 9 in 2016, as Oracle has delayed the release of Java 9 by six months. The Register reports that Java 9 is delayed until Thursday March 23rd, 2017, just after tea-time.
Containers will rule the world as Docker continues to develop, gains security features, and adds various forms of governance. Until now Docker has been tire-kicking, used in production by the early-adopter crowd only, but that can change as vendors are starting to claim that they can do proper management of big data and container farms.
NoSQL databases will take hold as they can be billed as “highly scalable” or “cloud-ready.” Expect 2016 to be the year when a lot of big brick-and-mortar companies publicly adopt NoSQL for critical operations. Basically NoSQL can be seen as a key:value store, and this idea has also expanded to storage systems: key:value store disk drives with an Ethernet NIC on-board and basic GET and PUT object storage facilities have come into being.
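The key:value model is simple enough to sketch in a few lines. Here is a toy object store with the same GET/PUT verbs those Ethernet-connected drives expose – purely illustrative, not any vendor’s actual API:

```python
class ToyObjectStore:
    """Minimal key:value store: opaque string keys map to binary blobs."""

    def __init__(self):
        self._objects = {}

    def put(self, key, blob):
        # PUT: store (or overwrite) the object under the given key
        self._objects[key] = bytes(blob)

    def get(self, key):
        # GET: return the object, or None if the key is unknown
        return self._objects.get(key)

store = ToyObjectStore()
store.put("user:42", b"profile data")
print(store.get("user:42"))  # b'profile data'
```

The appeal for scalability is that keys carry no schema and no cross-key relationships, so the namespace can be partitioned across many nodes (or drives) with a simple hash of the key.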
In the database world Big Data will still be big, but it needs to be analyzed in real time. A typical big data project usually involves some semi-structured data, a bit of unstructured data (such as email), and a whole lot of structured data (stuff stored in an RDBMS). While the cost of Hadoop on a per-node basis is pretty inconsequential, the cost of understanding all of the schemas, getting them into Hadoop, and structuring them well enough to perform the analytics is still considerable. Remember that you’re not “moving” to Hadoop, you’re adding a downstream repository, so you need to worry about systems integration and latency issues. Apache Spark will also attract interest, as Spark’s multi-stage in-memory primitives provide more performance for certain applications. Big data brings with it responsibility – digital consumer confidence must be earned.
IT security continues to be a huge issue in 2016. You might be able to achieve adequate security against hackers and internal threats, but every attempt to make systems idiot-proof just means the idiots get upgraded. Firms are ever more connected to each other and to the general outside world. So in 2016 we will see even more service firms accidentally leaking critical information and a lot more firms having their reputations scorched by incompetence-fuelled security screw-ups. Good security people are needed more and more – a joke doing the rounds of IT execs doing interviews is “if you’re a decent security bod, why do you need to look for a job?”
There will still be unexpected single points of failure in big distributed networked systems. The cloud behind the silver lining is that Amazon or any other cloud vendor can be as fault-tolerant, distributed and well supported as you like, but if a service like Akamai or Cloudflare were to die, you would still stop. That’s not a single point of failure in the classical sense, but it’s really hard to manage unless you go for full cloud agnosticism – which is costly. This is hard to justify when their failure rate is so low, so the irony is that the reliability of the content delivery networks means fewer businesses work out what to do if they fail. Oh, and no one seems to test their mission-critical data centre properly, because it’s mission-critical. So they just over-specify where they can and cross their fingers (= pay twice and get half the coverage for other vulnerabilities).
For IT start-ups it seems that Silicon Valley’s cash party is coming to an end. Silicon Valley is cooling, not crashing. Valuations are falling. The era of cheap money could be over and valuation expectations are re-calibrating down. The cheap capital party is over. It could mean trouble for weaker startups.
933 Comments
Tomi Engdahl says:
Why Java is not so popular in software development startup projects?
http://startups.stackexchange.com/questions/9131/why-java-is-not-so-popular-in-software-development-startup-projects
You need considerably more experience to write a java backend, and AngularJS frontend, and a RESTful service, than to hack together a few pages using PHP. If you’re a startup with little money, hire some cheap part-time developing CS undergrads, and they’ll hack something together. Finding some experienced, good, developers and making them give up their full-time jobs for a startup that may or may not fly is way more difficult.
I’ve seen plenty of startups using Java. Might be just a local cluster rather than anything statistically significant.
I believe the popularity of dynamically typed languages is that they overcome the limitations of weak statically typed languages, including Java.
Tomi Engdahl says:
NVMe Test Tools, Ecosystem Grow
http://www.eetimes.com/document.asp?doc_id=1329687&
The NVM Express (NVMe) specification continues to gain momentum with the growth of the ecosystem around it, including tools to support protocol testing. Last year, a number of vendors announced new solid-state drives (SSDs) using the specification for both servers and workstations.
NVMe is a standardized register interface, command and feature set for PCIe-based storage technologies such as SSDs, designed specifically for non-volatile memory. It is optimized for high performance and low latency, scaling from client to enterprise segments.
In a telephone interview with EE Times, John Wiedemeier, Teledyne LeCroy’s product marketing manager for the protocol solutions group, said the past few years have seen SSD vendors further expand their use of NVMe with products using the M.2 “stick of gum” form factor in devices such as laptops and tablets, and the standard drive size that has begun to be dubbed U.2.
“Systems companies are trying to test dual ports for the various conditions that could cause a drive to go down and what will happen to the system,” he said, including various scenarios of ports dropping out and what effect they will have on mission-critical systems. “This is going to make things much easier because they now have a dedicated tool.”
Tomi Engdahl says:
Stephen Hall / 9to5Google:
Android apps and the Google Play Store are coming to Chrome OS — It looks like there’s a little tidbit of information that might have been originally planned for the keynote (pulled because of time restraints, maybe?). According to a session description now on the Google I/O website …
It’s official: Android apps and the Play Store are coming to Chrome
http://9to5google.com/2016/05/18/its-official-android-apps-and-the-play-store-are-coming-to-chrome/
According to a session description now on the Google I/O website, Google “announced” today that the Google Play Store is coming to Chrome…
Today we announced that we’re adding the best mobile app experiences in the world, Android apps and the Google Play store, to the best browser in the world, Chrome! Come to this session and test your Android apps for Chrome OS
This isn’t exactly surprising as we saw evidence that this was in the cards all the way back in April, but it’s cool nonetheless to see it become official.
Tomi Engdahl says:
Ina Fried / Recode:
Google will compete with its partners and sell its own Daydream virtual reality headsets — Think Nexus, but for VR. — While focusing yesterday on its new virtual reality headset as a design that will be licensed to partners, Google also plans to sell a version of Daydream itself.
Google will compete with its partners and sell its own Daydream virtual reality headsets
Think Nexus, but for VR.
http://www.recode.net/2016/5/19/11713830/google-daydream-vr-headsets-partners-
The Daydream headset is designed as an evolution of the low-end Cardboard, relying on a phone to provide the display, brains and head-tracking abilities. Unlike Cardboard, though, Daydream is designed to be far more comfortable so it can be used for longer periods of time.
A separate controller does have electronics, including a bunch of sensors, several buttons and a clickable trackpad.
VR head Clay Bavor confirmed Google will sell its version of the hardware.
The move is similar to what Google did with Cardboard, showing Google wants to make sure lots of these headsets get out. If other makers get enough devices out, great, but if not, Google wants to make sure lots of people have access to Daydream.
Google and partners like Epic Games and Unity focused a lot of their attention on the motion-sensing controller that accompanies the headset. Oculus, for example, has plans for a motion controller shipping later this year.
The headset and controller aren’t the only components for Daydream. Google is also certifying a range of phones as Daydream-ready.
Google is also working with video partners including Hulu, Netflix, IMAX, the National Basketball Association and Major League Baseball.
Unreal Engine adds full Google Daydream VR support, native Unity support coming this summer
http://techcrunch.com/2016/05/19/unreal-engine-adds-full-google-daydream-vr-support-native-unity-support-coming-this-summer/
Tomi Engdahl says:
Tom Warren / The Verge:
IDC analyst says Chromebooks outsold Macs for the first time in the US in the latest quarter — Google’s low-cost Chromebooks outsold Apple’s range of Macs for the first time in the US recently. While IDC doesn’t typically break out Windows vs. Chromebook sales, IDC analyst Linn Huang confirmed the milestone to The Verge.
Chromebooks outsold Macs for the first time in the US
http://www.theverge.com/2016/5/19/11711714/chromebooks-outsold-macs-us-idc-figures
Google’s low-cost Chromebooks outsold Apple’s range of Macs for the first time in the US recently. While IDC doesn’t typically break out Windows vs. Chromebook sales, IDC analyst Linn Huang confirmed the milestone to The Verge. “Chrome OS overtook Mac OS in the US in terms of shipments for the first time in 1Q16,” says Huang. “Chromebooks are still largely a US K-12 story.”
IDC estimates Apple’s US Mac shipments to be around 1.76 million in the latest quarter, meaning Dell, HP, and Lenovo sold nearly 2 million Chromebooks in Q1 combined. Chromebooks have been extremely popular in US schools
Google’s milestone will undoubtedly unnerve Microsoft, at a time when PC shipments are in an overall decline. IDC predicts a “modest rebound” in the coming months, thanks to some IT buyers considering Windows 10 transitions and an uptick in Chromebook sales in the US.
Emil Protalinski / VentureBeat:
Developers can test Android apps on select Chrome OS devices starting in June, wide release planned for September — Google is finally bringing the Google Play store, including its more than 1.5 million Android apps, to Chrome OS. Google Play will first arrive with Chrome OS version 53 …
Google Play is coming to Chrome OS in September
http://venturebeat.com/2016/05/19/google-play-is-coming-to-chrome-os-in-september/
Tomi Engdahl says:
2016 Open Source Jobs Report
Fifth annual report reveals that open source talent is in high demand
http://go.linuxfoundation.org/download-2016-open-source-jobs-report?utm_source=edx_newsletter&utm_medium=email&utm_campaign=2016-open-source-jobs-report
Key findings:
87% of hiring managers say it’s difficult to find open source talent, and 79% have even increased incentives to hold on to their current open source professionals.
58% of hiring managers are looking for DevOps talent, making DevOps the most in-demand role in open source today.
For job seekers: even though 86% of tech professionals say open source has advanced their careers, only 2% say money and perks are the best part of their job.
Tomi Engdahl says:
Google developed its own AI processor
Google’s I/O event also presented an interesting new device: a processor dedicated to running artificial intelligence algorithms, the TPU.
TPU stands for tensor processing unit. The processor runs the TensorFlow algorithms Google announced last year. In a blog post, Norm Jouppi, who is responsible for the processor’s design at the company, says the TPU is a self-developed Google ASIC that has been in development for years.
What is surprising in Jouppi’s account is that TPU processors have already been tested for years in Google’s data centers. According to Jouppi, the processor delivers an order of magnitude higher performance per watt for machine learning computation.
Source: http://etn.fi/index.php?option=com_content&view=article&id=4466:google-kehitti-oman-keinoalyprosessorin&catid=13&Itemid=101
Tomi Engdahl says:
IBM’s Articles
Machine Learning in Every Application
https://www.eeweb.com/company-blog/ibm/machine-learning-in-every-application/
Machine learning aids almost everything an engineer can imagine.
It is not surprising that many design engineers are using internet connectivity to enable new features in their system designs. Whether an engineer is designing medical equipment, an industrial machine, a field service tool, or a vending machine, the design can almost always benefit from being connected to the web (with the required security elements, of course). Today’s technology makes it easy to find sensors and IoT tools to collect information about an asset’s health, enabling design engineers to significantly enhance their products.
Users will immediately benefit when an asset is connected. They can read the asset’s status any time they choose, whether it’s in their facility or remote some 1,000 miles away, and perform maintenance based on what they know about the asset’s behavior. That is, they can define rules on the behavior of the asset and, if the asset behaves beyond the known thresholds, schedule a maintenance appointment. However, much more can be done with the help of machine learning algorithms.
If the asset is connected and sensory data is available, the asset manager can take advantage of machine learning to leverage continuous feedback, solve operational challenges, and transform customer experiences.
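The progression the excerpt describes, from fixed threshold rules to behavior learned from the asset's own data, can be sketched in a few lines. This is a minimal illustration only; the sensor name, the temperature figures, and the 3-sigma rule are assumptions for the example, not anything from IBM's platform.

```python
# Hypothetical sketch: a fixed-threshold maintenance rule vs. a simple
# baseline learned from the asset's own sensor history.

def rule_based_alert(temperature_c, max_temp=80.0):
    """Fixed-threshold rule: flag the asset if a known limit is exceeded."""
    return temperature_c > max_temp

def learned_alert(reading, history):
    """Data-driven rule: flag readings more than 3 standard deviations
    away from the asset's own historical behavior."""
    n = len(history)
    mean = sum(history) / n
    variance = sum((x - mean) ** 2 for x in history) / n
    std = variance ** 0.5
    return abs(reading - mean) > 3 * std

history = [70.1, 69.8, 70.4, 70.0, 69.9, 70.2, 70.3, 69.7]
print(rule_based_alert(75.0))        # below the fixed limit
print(learned_alert(75.0, history))  # but far outside normal behavior
```

A reading of 75 °C passes the fixed rule yet is flagged by the learned baseline, which is the gap machine learning closes over hand-written thresholds.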
Tomi Engdahl says:
Google’s TPU Hit a Tight Sked
Short schedule “answers a lot of questions”
http://www.eetimes.com/document.asp?doc_id=1329722&
Google’s new processors for accelerating machine learning were built under a tight schedule, said Norm Jouppi, a veteran chip designer and distinguished hardware engineer who led the effort.
“I joined Google 2.5 years ago, and there were already some efforts underway…The emphasis was on getting something out quickly,” said Jouppi in an interview with EE Times.
A decision was made about three years ago to make a custom chip to accelerate Google’s machine learning algorithms.
The chips, called tensor processing units (TPUs) after the Google TensorFlow algorithm they accelerate, have been running for more than a year in Google’s data centers. “The sked was pretty challenging, the team did really well on that—a short schedule can help focus you, it answers a lot of questions,” Jouppi said.
Before Jouppi arrived in September 2013, Google engineers had evaluated using CPUs, GPUs and FPGAs. “They decided the benefits of customization were great enough that they wanted to go straight to custom silicon,”
Google may reveal some details about the TPUs in the fall, but for now it is keeping mum on their inner workings.
Tomi Engdahl says:
Chrome OS to get Android apps via the magic of containers
Long-expected marriage of operating systems
http://www.theregister.co.uk/2016/05/19/chrome_os_gets_android_apps_via_containers/
Google I/O 2016 Google has pulled the move the software market has been waiting ages for, and built a system to run Android apps on its desktop operating system.
The system works by setting up a Linux container in the Chrome operating system that runs a complete version of Android in a locked-down environment to minimize security issues. It’s not an emulated version of Android, so there should be a minimum number of issues, Chrome OS team leader Kan Liu told developers at the Google I/O conference.
There is a little bit of processor and memory load to pull this off, but it’s well within the scope of Chromebooks that use either Intel or ARM processors, he explained. Chromebooks are more powerful than the smartphones that Android apps are designed for, so processors will have little problem running the mobile code.
The system does have its limits, however. Android apps that require specific hardware, like an always-on cellular link, won’t run on a laptop that doesn’t have the necessary kit. But that’s not as big an issue as you might think, Liu opined.
Tomi Engdahl says:
Forget high-powered PCs, mobile is the future of VR, says Google
But you’ll need a new phone to do it
http://www.theregister.co.uk/2016/05/19/mobile_is_the_future_of_vr_says_google/
Google I/O 2016 Google has been outlining plans for kickstarting its virtual reality portfolio this year, including new hardware, software tools, and developer support.
Unlike Facebook’s Oculus platform, which requires a high-end PC to crunch the code, Google thinks that you can get a perfectly decent VR experience just using a smartphone, like Samsung’s Gear VR setup. As a result, Android N is going to be built with Google’s Daydream VR platform in mind.
If you want to slip into Google’s virtual world, it’s going to cost you, however. Its reference designs require the highest-powered processors, lots of extra motion sensors for tracking head movements, and a low-latency screen capable of displaying 60fps graphics with no ghosting.
Clay Bavor, Google’s VP of VR, said the Chocolate Factory was going to be building its own Android N smartphones to use for virtual reality, but other manufacturers have also promised hardware, he said, including Chinese firm Xiaomi. Don’t expect the price of such high-end kit to be cheap.
Tomi Engdahl says:
Adobe launches Spark: Amateur graphical fun!
We can’t publish the visual storytelling version of this due to the community guidelines
http://www.theregister.co.uk/2016/05/19/adobe_launches_spark_for_graphical_blah/
Adobe has launched Spark in a hope that its graphics software can be tooled for the mobile age.
Launched as part of the backup-gobbling service Creative Cloud, Spark is intended to embiggen the San Jose-based business’s animation suite, although it’s still only available on iOS.
It rebrands some of the company’s apps from yesteryear, with Slate and Post arriving as Spark Video, Spark Page, and Spark, er, Post, all iOS mobile apps for various forms of graphics, as well as the Spark web app itself “for creating social posts and graphics, web stories and animated videos.”
Tomi Engdahl says:
Being an IT trainer is like performing the bullet-catching trick
You’ll like this (but not a lot)
http://www.theregister.co.uk/2016/05/20/being_an_it_trainer_is_like_performing_the_bulletcatching_trick/
Tomi Engdahl says:
Google’s Tensor Processing Unit: What We Know
by Joshua Ho on May 20, 2016 6:00 AM EST
http://www.anandtech.com/show/10340/googles-tensor-processing-unit-what-we-know
If you’ve followed Google’s announcements at I/O 2016, one stand-out from the keynote was the mention of a Tensor Processing Unit, or TPU (not to be confused with thermoplastic urethane). I was hoping to learn more about this TPU, however Google is currently holding any architectural details close to their chest.
More will come later this year, but for now what we know is that this is an actual processor with an ISA of some kind. What exactly that ISA entails isn’t something Google is disclosing at this time – and I’m curious as to whether it’s even Turing complete – though in their blog post on the TPU, Google did mention that it uses “reduced computational precision.” It’s a fair bet that unlike GPUs there is no ISA-level support for 64 bit data types, and given the workload it’s likely that we’re looking at 16 bit floats or fixed point values, or possibly even 8 bits.
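Since Google has not disclosed the TPU's actual number format, the following is only one plausible reading of "reduced computational precision": linear quantization of floating-point values down to 8-bit integers with a shared scale factor. The function names and values are illustrative.

```python
# Sketch of reduced-precision arithmetic: quantize floats to int8.
# This is an assumed scheme for illustration; Google has not published
# the TPU's real format.

def quantize_int8(values):
    """Map floats onto the int8 range [-127, 127] with one shared scale."""
    scale = max(abs(v) for v in values) / 127.0
    return [round(v / scale) for v in values], scale

def dequantize(quantized, scale):
    return [q * scale for q in quantized]

weights = [0.5, -1.27, 0.003, 1.0]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)

# Worst-case rounding error is half a quantization step (scale / 2).
max_err = max(abs(a - b) for a, b in zip(weights, restored))
print(q)
print(max_err <= scale / 2 + 1e-12)
```

The appeal for machine learning workloads is that 8-bit multipliers are far smaller and cheaper in silicon than 32- or 64-bit floating-point units, and trained models tolerate the small rounding error.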
Tomi Engdahl says:
eBay is betting shoppers will embrace virtual reality as much as gamers
http://mashable.com/2016/05/18/ebay-virtual-reality-shopping/#BKonSADiRGq9
Virtual reality could be the next big thing after the mobile shopping boom, and the brands want in.
Not just for gaming, the technology could also support retail and browsing experiences, and eBay is one of the first companies to take the leap.
Once the iOS or Android eBay Virtual Reality Department Store app is downloaded, it works with headsets like Samsung’s Gear VR. eBay and Myer are also offering 20,000 free “shopticals” — basically just Google Cardboard headsets — to shoppers.
Tomi Engdahl says:
Don’t Use Allo
http://motherboard.vice.com/read/dont-use-google-allo
The buzziest thing Google announced at its I/O conference Wednesday was Allo, a chatbot-enabled smartphone messaging app that looks to take on iMessage, Facebook Messenger, and the Facebook-owned WhatsApp.
Early sentiment about Allo is overwhelmingly positive: It looks beautiful, lets you doodle on images before you send them, comes with stickers as well as emojis, and it’s the first Google product to offer end-to-end encryption, which is certainly a good thing.
But if you care at all about your privacy, you should not use Google Allo.
Allo’s big innovation is “Google Assistant,” a Siri competitor that will give personalized suggestions and answers to your questions on Allo as well as on the newly announced Google Home, which is a competitor to Amazon’s Echo.
On Allo, Google Assistant will learn how you talk to certain friends and offer suggested replies to make responding easier. Let that sink in for a moment: The selling point of this app is that Google will read your messages, for your convenience.
Tomi Engdahl says:
Google Designing AI Processors
TensorFlow accelerators used in AlphaGo
http://www.eetimes.com/document.asp?doc_id=1329715
Google has developed its own accelerator chips for artificial intelligence it calls tensor processing units (TPUs) after the open source TensorFlow algorithms it released last year. The news was the big surprise saved for the end of a two-hour keynote at the search giant’s annual Google IO event in the heart of Silicon Valley.
“We have started building tensor processing units…TPUs are an order of magnitude higher performance per Watt than commercial FPGAs and GPUs, they powered the AlphaGo system,” said Sundar Pichai, Google’s chief executive, citing the Google computer that beat a human Go champion.
The accelerators have been running in Google’s data centers for more than a year, according to a blog by Norm Jouppi, a distinguished hardware engineer at Google. “TPUs already power many applications at Google, including RankBrain, used to improve the relevancy of search results and Street View, to improve the accuracy and quality of our maps and navigation,” he said.
Tomi Engdahl says:
Graphics Rivals in Epic Battle
Pascal vs. Polaris a rare clash of titans
http://www.eetimes.com/author.asp?section_id=36&doc_id=1329707&
Advanced Micro Devices and Nvidia go head-to-head this year in a rare battle of new GPUs, memory types and processes, says a veteran scorekeeper.
“This could be one of most interesting second halfs in graphics for many years,” says Dean McCarron, principal of market watcher Mercury Research of the looming battle between Nvidia’s Pascal and AMD’s Polaris.
“It’s been awhile since we’ve seen a simultaneous launch of this magnitude for both companies,” said McCarron who has been tracking the market for some 20 years.
Upping the ante, the new chips pack new memory types and use new process nodes, too. AMD is using the 14nm process Globalfoundries licensed from Samsung. Nvidia’s Pascal is its first chip made using TSMC’s 16FF+ process.
As for memory, the first Pascal chips will use GDDR5x, a tweak of what otherwise appeared to be the end of the line in Jedec memory interfaces. Polaris will be AMD’s second-generation chip to use high-bandwidth memory (HBM), which offers greater bandwidth but comes at a higher cost.
Jedec had a GDDR6 on the drawing board that was supposed to come out three years ago but never materialized.
Tomi Engdahl says:
Peter Bright / Ars Technica:
How OneCore, the modularized OS at the heart of Windows 10, fulfills Microsoft’s quest to enable apps that work well across PCs, phones, tablets, Xbox, and more — Microsoft promised developers that Windows would run anywhere. This summer, it finally will.
OneCore to rule them all: How Windows Everywhere finally happened
Microsoft promised developers that Windows would run anywhere. This summer, it finally will.
http://arstechnica.com/information-technology/2016/05/onecore-to-rule-them-all-how-windows-everywhere-finally-happened/
The Windows 10 Anniversary update, due later this summer, represents a major landmark for Microsoft. As well as being a significant update for Windows 10 on the desktop and Windows 10 Mobile on phones, the release is also coming to the Xbox One. For the first time, the Xbox One will be running essentially the same operating system as desktop Windows. Critically, it will also be able to run many of the same applications as desktop Windows.
In a lot of ways, this represents the realization of a vision that Microsoft has been promoting for more than 20 years: Windows Everywhere. Always important to Microsoft’s ambitions for Windows as a platform, the Windows Everywhere ideal has a renewed significance with Windows 10 and CEO Satya Nadella’s promise that Windows 10 will have one billion users within the first three years of its availability. The purpose of that promise is to send a message to developers that Windows is a big platform, a platform that they should still think about and create software for.
Microsoft can now credibly speak of having one operating system (with Windows 10 as its most familiar branding) that can span hardware from little embedded Internet of Things devices to games consoles to PCs to cloud-scale server farms. At its heart is a slimmed down, modularized operating system dubbed OneCore. Windows 10, Windows Server, the Xbox One operating system, Windows 10 Mobile, Windows 10 IoT, and the HoloLens operating system are all built on this same foundation.
It took a long time to reach this point. Along the way, Microsoft built three major operating system families, killed two of them off, and even reorganized the entire company. In the end, all that action was necessary in order to make building a single operating system practical. Apple and Google will probably do something similar with their various operating systems, but Microsoft has managed it first.
Since the early 1990s, when Microsoft broke away from its shared OS/2 development effort with IBM in favor of its own Windows NT, the Redmond firm has been pitching the idea of Windows as a platform that scales from palmtops and handheld computers all the way up to large servers. Spanning the entire range of systems would be the Win32 API, offering developers a single set of tools and skills that could reach systems of any type.
By the late 1990s, Windows Everywhere was a bit more concrete, but it relied on a broader concept of what it meant to be “Windows.” Windows NT was the “real” Windows.
But Windows NT wasn’t the only Windows. Windows NT’s problem was that it was big; it had, for the time, prohibitively high demands for RAM and disk space. It also lacked support for DOS drivers, giving it weaker hardware support.
Like those old operating systems, Windows 95 had relatively low hardware demands and support for DOS drivers and mass-market hardware.
In the ’90s, Microsoft had a third platform for these devices: Windows CE.
With these three operating systems, Microsoft sorta kinda had a way of doing “Windows Everywhere”—albeit a disjointed version. Parts of Win32 were common to every operating system, and even though the NT-95-CE trio shared little or nothing under the hood, they were at least all “Windows” branded.
Having three separate solutions came at some cost. Obviously for Microsoft, there was a great cost in developing these operating systems in parallel, doing everything in triplicate.
Offering something similar to Win32 on each operating system was helpful, but it wasn’t enough to tell developers that there was truly “Windows Everywhere” with a single development platform and API.
The same was true, and worse, for hardware companies.
Pushing the limits
Over on the phone, the wheels were starting to come off Windows CE. After being very early to the idea of “a phone that can run applications” with the Windows CE-based Windows Mobile, Microsoft was rather late to “a phone that can run applications running an operating system that people like to use.” Belatedly realizing that Windows Mobile, with its scaled-down Windows 95 lookalike interface and dependence on itty bitty styluses, was never going to win hearts and minds, Microsoft came up with Windows Phone 7.
Out of necessity, this was built on top of Windows CE.
From MinWin to ServerCore
The MinWin work first shipped in Windows Vista. Subsequent releases have followed its principles—including the stricter approach to how dependencies are taken—and extended them, with the kernel and individual drivers split up into smaller, more manageable, less entangled components.
This extensive clean-up and reorganization work in turn enabled the Windows division to meet those demands for a smaller, more focused server system. With the dependencies known and flowing all in one direction, it became feasible to start carving portions of Windows off.
To develop Windows RT, the Windows 8 build for ARM, Microsoft had to add this extra hardware support.
This Windows RT work came at just the right time for the Phone team. The work to make Windows 8 run on ARM systems-on-chips was just as relevant to phones as it was to tablets, giving the Phone platform the basic infrastructure it needed to build a Windows NT-based phone operating system. The Phone team took the ongoing Windows RT work and used it as the basis for its new operating system.
On the other hand, Windows Phone 8 did use CoreSystem, picking up the same core security, networking, and Windows Update components. It has the same driver model as desktop Windows, the same browser engine as desktop Windows, and the same management and enterprise policy system as desktop Windows.
One Microsoft, OneCore
Windows 8.1 included an updated version of the WinRT API, Store, and application servicing model—collectively, ModernCore. This soon made its way to Windows Phone, as Windows Phone 8.1 synced its code with Windows 8.1, bringing ModernCore to a mobile platform.
In July 2013, about a month before Windows 8.1 was finished, then-CEO Steve Ballmer announced a major reorganization of the company. Ballmer said he wanted “One strategy, one Microsoft,” and the new Microsoft had a singular group, the Operating Systems Engineering Group, that was responsible for all operating system development, from phone to cloud.
With the reorg, that all changed. For the first time, Windows for the phone, Windows for the PC, Windows for the server, and Windows for the Xbox had a common ownership and a unified development process.
With the unification mostly complete, the hundreds of millions of us who use Windows aren’t likely to suffer that same kind of stall in progress. If anything, we should see the reverse: by bringing together smaller teams and letting them share their efforts, Microsoft should be able to move faster.
Perhaps the biggest gains, for both developers and users, come from unexpected new platforms.
It has taken many years for Microsoft to attain its Windows Everywhere goal, but with OneCore and OneCoreUAP the company can credibly say that it has achieved that. The common platform isn’t the one that was envisaged back in 1992—while Win32 is still an important concern, it’s not the API that’s going to power new development on new platforms. But few developers are likely to be too troubled by this.
Apple’s iOS, tvOS, and OS X all have many portions in common, but they don’t yet offer the kind of uniform platform and developer experience that Windows 10 does.
Tomi Engdahl says:
Dave Gershgorn / Popular Science:
Google’s open-source research project Magenta uses AI to generate music, video, other visual arts, makes the process easier with TensorFlow, will launch June 1 — THE MULTI-DISCIPLINARY MACHINE — If Google’s artificial intelligence can paint its dreams, why not make other kinds of art?
‘Magenta’ Is Google’s New Project To Make Art With Artificial Intelligence
The multi-disciplinary machine
http://www.popsci.com/magenta-is-googles-project-to-make-art-with-artificial-intelligence
If Google’s artificial intelligence can paint its dreams, why not make other kinds of art?
Douglas Eck, a researcher on the Magenta project, said that the group will first tackle algorithms that can generate music, then move to video and then other visual arts.
“There’s a couple of things that got me wanting to form Magenta, and one of them was seeing the completely, frankly, astonishing improvements in the state of the art [of creative deep learning]. And I wanted to demystify this a little bit,” Eck said during a panel at Moogfest, a music and technology festival.
The Magenta project will build all of their deep learning models open-source on top of TensorFlow, Google’s open-source artificial intelligence platform, according to Eck. He says that the hope behind open-sourcing the project is that others will be able to take Google’s work and further it themselves. The project’s GitHub page is currently empty (other than a ReadMe file), but will have its first code soon.
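To make the idea of generative music concrete before Magenta's code lands, here is a deliberately tiny sketch: a first-order Markov chain over note names. Magenta's actual models are deep networks built on TensorFlow; the transition table and note names below are invented purely to show the sequence-generation concept.

```python
import random

# Toy melody generator: each note is drawn from the allowed successors
# of the previous note. This only illustrates sequence generation; it is
# not how Magenta's TensorFlow models work.

TRANSITIONS = {
    "C": ["D", "E", "G"],
    "D": ["C", "E"],
    "E": ["D", "F", "G"],
    "F": ["E", "G"],
    "G": ["C", "E", "F"],
}

def generate_melody(start="C", length=8, seed=42):
    rng = random.Random(seed)  # fixed seed for a repeatable melody
    melody = [start]
    for _ in range(length - 1):
        melody.append(rng.choice(TRANSITIONS[melody[-1]]))
    return melody

print(generate_melody())
```

A learned model replaces the hand-written transition table with probabilities estimated from real music, which is where deep learning enters.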
Tomi Engdahl says:
Researchers Use Developer Biometrics to Predict Code Quality
http://motherboard.vice.com/read/researchers-use-developer-biometrics-to-predict-code-quality
Informatics researchers from the University of Zurich have developed a not at all sinister-sounding system capable of predicting the quality of code produced by developers based on their biometric data. By using heart rate information, for example, they were able to quantify the difficulty a given programmer had in producing a piece of software. This information could then be used to identify likely sections of bad code. Pre-crime for software debugging, in other words.
The Zurich researchers, Sebastian C Müller and Thomas Fritz, describe their work in a paper presented this week at the 38th International Conference on Software Engineering in Austin.
A common first line of defense against bugs and just poor quality code is the code review, the duo notes. Basically, one developer writes some code and then passes it on to someone else, who sort of acts like a code editor. The reviewer/editor looks over the completed code for defects and places where the code might be improved.
This is a costly system, however: Code reviews take time and people. Automated review systems exist and sort of work, but they run up against two barriers. “First, they are predominantly based on metrics, such as code churn or module size, that can only be collected after a code change is completed and often require access to further information, such as the history of the code,” the paper notes. “Second, they do not take the individual differences between developers comprehending code into account, such as the ones that exist between novices and experts.”
Enter biometrics. By looking at the programmer as they program, rather than the code after the programmer is done writing it, the system described by the Zurich researchers finds code quality issues as the code is being produced.
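The core idea can be sketched with a single biometric feature. Everything here is hypothetical: the feature (mean heart rate vs. baseline), the 10% threshold, and the function names are invented for illustration; the Zurich study trained real classifiers on much richer biometric data.

```python
# Hypothetical sketch: flag a code section as "likely difficult" when the
# developer's heart rate while writing it was well above their baseline.
# Feature choice and threshold are illustrative assumptions only.

def mean(values):
    return sum(values) / len(values)

def flag_difficult_section(heart_rates, baseline_bpm, threshold=1.10):
    """Flag a section if mean heart rate exceeded baseline by > 10%."""
    return mean(heart_rates) > baseline_bpm * threshold

print(flag_difficult_section([85, 88, 90], baseline_bpm=72))  # elevated
print(flag_difficult_section([70, 73, 71], baseline_bpm=72))  # normal
```

The point of the approach is exactly this timing: the signal is available while the code is being written, before any reviewer or static-analysis tool sees the finished change.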
Tomi Engdahl says:
A New Number Format for Computers Could Nuke Approximation Errors for Good
http://motherboard.vice.com/read/a-new-number-format-for-computers-could-nuke-approximation-errors-for-good?trk_source=recommended
Even to the more mathematically challenged among us, it’s a reasonably easy task to parse a number with a decimal point.
Computers have a funny, uneasy relationship to decimal numbers, however. Whole numbers are easy because it’s easy to represent a whole number in our base-10 counting system as a binary (base-2) number. With 32 bits, or 32 digits, we can represent a huge range of whole numbers (integers or ints, in computer science), all the way up to 2147483647. Usually, that’s more than enough. The problem when we start adding fractional values is that we have to have a way of encoding where exactly within a string of digits a decimal point should be located. It changes number by number.
In computing, a number with a decimal point and corresponding fractional value is represented by the floating-point data type (a float). For a 32 bit floating-point number, it turns out that we really only get 23 binary digits to represent the numerical content of a number, with the rest reserved for representing the position of the decimal point within the number (as an exponent, as in scientific notation).
The problem is less so a relatively limited range of possible values than a fundamental limitation on precision.
John Gustafson, a computer scientist specializing in high-performance computing and the namesake behind Gustafson’s Law, has proposed a new solution to this seemingly unavoidable source of error (read: imprecision). He calls the new format “unum,” for universal number.
“A unum has three additional fields that make the number self-descriptive,” Gustafson explains in an interview with the ACM’s Ubiquity magazine. “The ‘ubit’ that marks whether a number is exact or in between exact values, the size of the exponent in bits, and the size of the fraction in bits. So not only does the binary point float, the number of significant digits also floats.”
The End of (Numeric) Error
An interview with John L. Gustafson
http://ubiquity.acm.org/article.cfm?id=2913029
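The precision limits the article describes are easy to see by unpacking a 32-bit IEEE float into its fields: 1 sign bit, 8 exponent bits, and 23 fraction bits. The sketch below uses only the standard `struct` module.

```python
import struct

# Split a 32-bit IEEE 754 float into its sign, exponent, and fraction
# fields to show where the 23 bits of precision the article mentions live.

def float32_bits(x):
    """Return (sign, biased exponent, fraction) of x stored as float32."""
    (raw,) = struct.unpack(">I", struct.pack(">f", x))
    sign = raw >> 31
    exponent = (raw >> 23) & 0xFF
    fraction = raw & 0x7FFFFF
    return sign, exponent, fraction

print(float32_bits(1.0))  # exactly representable: (0, 127, 0)

# 0.1 has no finite binary expansion, so 32 bits store an approximation:
stored = struct.unpack(">f", struct.pack(">f", 0.1))[0]
print(stored == 0.1)  # the round-trip through float32 loses precision
```

Gustafson's unum proposal attacks exactly this silent rounding: the extra "ubit" records whether a stored value is exact or merely lies between two exact values.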
Tomi Engdahl says:
Programmers Aren’t Writing Green Code Where It’s Most Needed
http://motherboard.vice.com/read/programmers-arent-writing-green-code-where-its-most-needed?trk_source=recommended
Confession? I don’t write green code. I mean, it might be green code just by coincidence, but I’ve never really thought too much about the relative energy consumption demanded by this design pattern or algorithm versus some other. Sadly, this is true even when I’m working with actual hardware and low-level software, such as that written in plain C for embedded devices (in my case, for an Arduino board or other microcontroller platform). What’s more, I don’t think the green code idea has ever come up in my years of computer science classes.
I’m hardly the exception, according to a paper presented this week at the 38th International Conference on Software Engineering. In interviews and surveys conducted with 464 software engineers from a range of disciplines—including mobile, data center, embedded, and traditional software development—researchers found that where green coding most matters, its practice is rare.
Green software development is as it sounds. In their own words, the researchers behind the new paper, a team drawn from IBM, Google, Microsoft, and the University of Delaware, were looking specifically for answers relating to how software engineers think about battery life/energy usage when they write requirements, design, construct, test, and maintain their software.
“Based on our interviews, we initially theorized that practitioners with experience in mobile (‘battery life is very important, especially in mobile devices’), data center (‘any watt that we can save is either a watt we don’t have to pay for, or it’s a watt that we can send to another server’), and embedded (‘maximum power usage is limited so energy has a big influence on not only hardware but also software’) would more often have requirements or goals about energy usage than traditional practitioners (‘we always have access to power, so energy isn’t the highest priority’).”
This turned out to be accurate only for mobile developers, who used green practices more than any other group, with 53 percent reporting that they “almost always” or “often” wrote applications with energy usage requirements.
An empirical study of practitioners’ perspectives on green software engineering
http://dl.acm.org/citation.cfm?id=2884810
The energy consumption of software is an increasing concern as the use of mobile applications, embedded systems, and data center-based services expands. While research in green software engineering is correspondingly increasing, little is known about the current practices and perspectives of software engineers in the field. This paper describes the first empirical study of how practitioners think about energy when they write requirements, design, construct, test, and maintain their software.
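In practice, the most accessible green-coding lever is algorithmic efficiency, since CPU time is a rough proxy for energy drawn. The sketch below (illustrative choice of data structure and sizes) times an O(n) list membership test against an O(1) set lookup with the standard `timeit` module.

```python
import timeit

# CPU time as a rough energy proxy: the same membership question costs
# vastly different amounts of work depending on the data structure.

data_list = list(range(100_000))  # O(n) linear scan per lookup
data_set = set(data_list)         # O(1) hash lookup per lookup

t_list = timeit.timeit(lambda: 99_999 in data_list, number=200)
t_set = timeit.timeit(lambda: 99_999 in data_set, number=200)

print(t_set < t_list)  # the set lookup does far less work per query
```

Wall-clock timing is only a proxy; real energy accounting needs hardware counters or battery instrumentation, which is part of why the surveyed practitioners rarely do it.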
Tomi Engdahl says:
What an API Is and Why It’s Worth Fighting For
http://motherboard.vice.com/read/oracle-vs-google-what-an-api-is-and-why-its-worth-fighting-for?trk_source=recommended
Tomi Engdahl says:
Todd Bishop / GeekWire:
Apple, Microsoft and Google hold 23% of all U.S. non-financial corporate cash
Apple, Microsoft and Google hold 23% of all U.S. corporate cash, as tech sector accumulates wealth
http://www.geekwire.com/2016/apple-microsoft-google-hold-nearly-quarter-u-s-corporate-cash/
Apple, Microsoft and Google are the top three cash-rich U.S. companies across all sectors of business, not including banks and other financial institutions — holding a combined $391 billion in cash as of the end of 2015, or more than 23 percent of the entire $1.68 trillion held by the nation’s non-financial corporations.
Apple leads the pack with $215.7 billion in cash, followed by Microsoft at $102.6 billion, and Google at $73.1 billion.
Tomi Engdahl says:
Google’s Daydream is Silicon Reality
Hitting the 20 millisecond latency target
http://www.eetimes.com/author.asp?section_id=36&doc_id=1329737&
Google’s Daydream VR is very much a reality today for semiconductor managers such as Tim Leland. The head of visual processing at Qualcomm is one of many working with the search giant for some time to bring to all next-generation Android phones an upgraded version of the mobile virtual reality Samsung pioneered with its GearVR.
Leland’s team helped develop an Android framework for optimizing single-buffer rendering. The graphics cores in its latest Snapdragon 820 SoC were tuned to deliver fine grained pre-emption to reduce motion-to-photon latency, a key metric to make sure displays change as fast as a user’s head moves.
“It took a lot of effort” from deep in the SoC through Android to the application to hit the 20 millisecond target, Leland said.
Snapdragon chips needed “to change the way they handshake with sensors” to reduce latency. The sensors themselves need to support fast sampling at rates of 100 MHz to a gigahertz.
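The 20 ms target is a budget shared across every stage from head movement to pixels changing, which is why it takes work "from deep in the SoC through Android to the application." The stage names and millisecond figures below are hypothetical, purely to show the budgeting arithmetic.

```python
# Illustrative motion-to-photon latency budget for the 20 ms target.
# Stage names and numbers are assumptions for the example, not
# Qualcomm's actual figures.

BUDGET_MS = 20.0

stages_ms = {
    "sensor sampling + handshake": 2.0,
    "sensor fusion / head tracking": 2.0,
    "application + render": 11.0,
    "display scan-out": 4.0,
}

total = sum(stages_ms.values())
print(total)               # 19.0
print(total <= BUDGET_MS)  # this budget fits with 1 ms to spare
```

Framed this way, optimizations like single-buffer rendering and fine-grained GPU pre-emption are ways of shrinking individual line items so the total stays under budget.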
Tomi Engdahl says:
Don’t Pick a Programming Language Because It’s the ‘Most Profitable’
http://motherboard.vice.com/read/dont-pick-a-programming-language-because-its-the-most-profitable-java-javascript-python?trk_source=recommended
I’m not trying to nuke the general ideas of in-demand or popular programming languages—because those are things that exist. But they exist within a much larger ecosystem of talent, ability, and, ultimately, employability. Here are a few things that need to be kept in mind as you chew through another “most profitable” programming language list.
0) Popular languages and in-demand languages != the future
You’ll see Java at the top of many “most profitable” lists and it is for sure an in-demand skill.
Java will be around for a while, but not so much by choice.
0.5) People make money, not languages
Java can be expected to be a part of a much larger skill set than what we might normally think of “coding,” e.g. software engineering, systems programming, etc. A professional engineer whose work involves Java is going to have a boatload more qualifications than “knowing Java,”
1) Programming languages are tools
Programming languages are not programming. They are tools used to program. Programming is a set of skills that are mostly language-independent. “Knowing Java” implies to precisely no one that you are a competent programmer.
2) Programming languages depend on problems
Even very general languages like Java fit some domains better than others. Programming is always just solving problems and programming languages all exist to solve some subset of all problems. This is why there are so many languages—some are better at solving certain types of problems than others.
So, if not the “most profitable,” what language should you actually learn? Probably the one best suited to helping you learn other languages, e.g. the one that will teach you to actually program. That might be Python, or it could even be something with a very limited domain (or problem space), like Processing.
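To make the "languages are tools" point concrete, here is a tiny sketch in Python, the suggested starter language. The transferable skill on display is decomposing a problem (split, normalize, count), and the same structure ports almost line-for-line to Java, Go, or Processing:

```python
# The skill being practiced is problem decomposition, not syntax trivia:
# count word frequencies in a text, then ask for the most common word.
from collections import Counter

def word_frequencies(text):
    """Return a Counter mapping each lowercase word to its count."""
    words = text.lower().split()
    return Counter(words)

freqs = word_frequencies("the quick fox jumps over the lazy dog")
print(freqs.most_common(1))  # [('the', 2)]
```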
Tomi Engdahl says:
Future servers in the data center will use a very wide range of processors: traditional host processors, hardware accelerators, graphics processors and programmable FPGA circuits.
Seven companies have now teamed up to define an interface through which they can all talk to each other over shared memory.
The companies are AMD, ARM, Huawei, IBM, Mellanox, Qualcomm and Xilinx. They have formed the CCIX (Cache Coherent Interconnect for Accelerators) consortium. The CCIX interface is not an easy or trivial undertaking – and Intel/Altera is missing from the company list.
At the moment, graphics accelerators and FPGA circuits are connected to systems over a PCI Express interface. It works, but is not necessarily the best solution.
Intel, together with Altera, is trying to push its own QPI bus for data centers.
Source: http://etn.fi/index.php?option=com_content&view=article&id=4482:uusi-liitanta-yhdistaa-kaikki-prosessorit-datakeskuksessa&catid=13&Itemid=101
Tomi Engdahl says:
Josh Constine / TechCrunch:
Facebook now using its own machine learning system instead of Bing for translations, and now supports 40 languages
Facebook ditches Bing, 800M users now see its own AI text translations
http://techcrunch.com/2016/05/23/facebook-translation/
Machine learning is accomplishing Facebook’s mission of connecting the world across language barriers. Facebook is now serving 2 billion text translations per day. Facebook can translate across 40 languages in 1,800 directions, like French to English. And 800 million users, almost half of all Facebook users, see translations each month.
That’s all based on Facebook’s own machine learning translation system. In 2011 it started working with Microsoft Bing to power translations, but has since been working to transition to its own system. In December 2015, Facebook finally completed the shift, and now exclusively uses its own translation tech.
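A quick sanity check on the "40 languages in 1,800 directions" figures: with n languages there are n × (n − 1) ordered translation pairs, so 1,800 directions actually corresponds to slightly more than 40 supported languages:

```python
# Ordered translation pairs (directions) for n mutually translatable languages.
def directions(n_languages):
    return n_languages * (n_languages - 1)

print(directions(40))  # 1560 -- fewer than the 1,800 directions quoted
print(directions(43))  # 1806 -- so "40 languages" is likely rounded down
```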
Tomi Engdahl says:
Jacob Pramuk / CNBC:
HP Enterprise announces spinoff of its enterprise services unit, which will merge with CSC, meets Q2 expectations with $12.71B in sales; stock up 8%+
Hewlett Packard Enterprise announces spinoff as earnings meet Street
http://www.cnbc.com/2016/05/24/hp-enterprise-reports-q2-earnings-results-.html
Hewlett Packard Enterprise on Tuesday reported earnings that met Wall Street’s expectations and announced a spinoff of its enterprise services unit, which will merge with Computer Sciences.
The computing company, itself a spinoff of the former Hewlett-Packard, posted adjusted earnings of 42 cents per share on $12.71 billion in sales for its fiscal 2016 second quarter. Revenue rose about 1 percent as reported from the prior-year period.
The merger, which values HPE’s services business at about $8.5 billion, is expected to be completed by the end of March 2017. HPE shareholders will own shares of both HPE and the combined company.
In November, the former Hewlett-Packard split into HPE and HP Inc. The move separated the legacy hardware business from the enterprise computing segment.
Tomi Engdahl says:
Ben Gilbert / Tech Insider:
Former employees describe what went wrong with Disney’s video game initiative: strong aversion to risk and little understanding of the game industry
Disney just shut down a huge project that was supposed to be worth billions — insiders reveal what went wrong
http://www.techinsider.io/inside-disneys-messy-video-game-business-2016-5
The world’s largest entertainment company doesn’t make its own video games anymore.
Disney — the folks behind “Star Wars,” Mickey Mouse, Disneyland, everything Marvel, ESPN, and hundreds of other iconic characters — believes that the better option is to license its incredibly successful properties out to other companies. It still makes mobile games, sure, but Disney’s out of the “big” game business; the stuff most people play on stuff like the PlayStation 4 and PC.
Tomi Engdahl says:
Accelerators Unite ARM, IBM, X86
Seven vendors forge alternative to Intel, Nvidia
http://www.eetimes.com/document.asp?doc_id=1329734&
Seven chip makers will define a cache-coherent interconnect for server accelerators, providing an alternative to Intel and Nvidia in a red hot sector of cloud computing. The effort is the first hardware collaboration of its type to span ARM, x86 and Power processors.
Advanced Micro Devices, ARM, Huawei, IBM, Mellanox, Qualcomm and Xilinx will define the Cache Coherent Interconnect for Accelerators (CCIX, pronounced C6). The group will release a draft specification by the end of the year, but has so far not released any technical or financial details.
The effort started after Intel bid $16.7 billion to buy Altera last year, in part to use its FPGAs as accelerators for its Xeon server CPUs. Other processor vendors approached Xilinx individually to create a cache-coherent link to their chips, and Xilinx proposed the idea of a single link to serve them all. Intel has already started shipping a device that puts a Xeon server CPU and Altera FPGA in a single package.
The need for accelerator chips to bolster performance has spread across the computer industry like wildfire in the past year. Much of the interest comes from Web giants applying a new class of machine-learning algorithms to a growing set of applications from voice and picture recognition to contextual searches.
Tomi Engdahl says:
Going the Distance with PCIe
http://www.eetimes.com/document.asp?doc_id=1329744&
One of the challenges flash-based storage has faced is that systems designers have been using hard disk architectures and applying them to SSDs, including PCI Express (PCIe).
PCIe, however, was not specifically designed for storage. Although it has a great deal of theoretical bandwidth, it has inherent limitations because it is not a native storage interface. It requires an onboard controller to manage resources between the flash memory and server I/O. And if I/O requests scale beyond controller thresholds, it can dramatically increase latency.
While that has led some vendors to develop alternative technologies, PCIe is still a widely-used interface that has advantages. And despite efforts to build more compact systems that keep everything close together to reduce the distance a signal must travel, sometimes it’s not always possible.
Parade Technologies in Santa Clara, Calif., has opted to address this reality with something called a redriver. The high-speed interface IC supplier just introduced a two-lane 8Gb/s PCI Express and SATA redriver targeted at M.2 SSDs and other high speed peripheral applications. The PS8559 features four redriver channels enabling the support of two bi-directional lanes, and is able to handle PCIe up to 8Gb/sec and SATA up to 6Gb/sec.
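For a sense of what those raw link rates mean in practice, here is a small sketch of usable bandwidth, assuming only the standard line-coding overhead (PCIe 3.0 uses 128b/130b encoding, SATA uses 8b/10b) and ignoring protocol overhead above that:

```python
# Usable data rate after line coding, for the redriver's supported links.
def pcie3_usable_gbps(lanes, raw_gtps=8.0):
    """PCIe 3.0: 8 GT/s per lane with 128b/130b encoding."""
    return lanes * raw_gtps * 128 / 130

def sata_usable_gbps(raw_gbps=6.0):
    """SATA 6Gb/s: 8b/10b encoding, so 20% coding overhead."""
    return raw_gbps * 8 / 10

print(f"2-lane PCIe 3.0: {pcie3_usable_gbps(2):.2f} Gb/s")  # ~15.75
print(f"SATA 6Gb/s:      {sata_usable_gbps():.2f} Gb/s")    # 4.80
```

The gap in usable bandwidth is one reason M.2 SSDs increasingly favor PCIe over SATA despite PCIe not being a native storage interface.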
Tomi Engdahl says:
More CIOs report to the CEO, underscoring IT’s rising importance
http://www.cio.com/article/3074899/cio-role/more-cios-report-to-the-ceo-underscoring-it-s-rising-importance.html
With digital strategies increasingly taking center stage in many businesses, more CIOs are reporting to the CEO. However, the tech talent dearth threatens CIOs ability to conduct their work.
Thirty-four percent of CIOs surveyed report directly to their CEO, further validating how IT has become increasingly strategic as businesses seek to generate more money using digital technologies, according to the new 2016 Harvey Nash/KPMG CIO survey. However, 65 percent of CIOs say the lack of technical talent — particularly for big data analytics — is hampering their efforts to keep up with the pace of change.
Snyder says that 21 percent of respondents say their CEOs are formulating digital strategies but expect CIOs to choreograph the necessary technology and business process changes. As a result, CIOs are focusing more on innovation and less on the operational efforts. CIOs are steering employees through new ways of working, delivering technologies that enable better customer engagement. And as they direct organizational changes they are spending more time getting to know their customers. Four out of 10 respondents indicate they spend at least one day a week on something other than IT.
“CIOs are no longer focused solely on delivering the right technology to enable the enterprise, rather they are now the key agent of change for moving enterprise strategy forward,” says Snyder.
Tomi Engdahl says:
This $5 Billion Software Company Has No Sales Staff
http://www.bloomberg.com/news/articles/2016-05-18/this-5-billion-software-company-has-no-sales-staff
Atlassian sold $320 million worth of business software last year without a single sales employee. Everyone else in the industry noticed.
Brandon Cipes, vice president for information systems at OceanX, has spent enough time in senior IT positions to hate sales calls. “It’s like buying a car—a process that seemingly should be so simple, but every time I have to, it’s like a five- to six-hour ordeal,” he says. “Most of our effort is trying to get the salespeople to leave us alone.” Cipes didn’t always feel that way, though. Back in 2013, he was used to the routine. His conversion began when he e-mailed business-software maker Atlassian, asking the company to send him a sales rep, and it said no.
Atlassian, which makes popular project-management and chat apps such as Jira and HipChat, doesn’t run on sales quotas and end-of-quarter discounts. In fact, its sales team doesn’t pitch products to anyone, because Atlassian doesn’t have a sales team. Initially an anomaly in the world of business software, the Australian company has become a beacon for other businesses counting on word of mouth to build market share. “Customers don’t want to call a salesperson if they don’t have to,” says Scott Farquhar, Atlassian’s co-chief executive officer. “They’d much rather be able to find the answers on the website.”
The way technology companies sell software has changed dramatically in the past decade. The availability of open source alternatives has pushed traditional brands and rising challengers to offer more free trials, free basic versions of their software with paid upgrades, and online promotions.
Incumbents such as IBM, Oracle, and Hewlett Packard Enterprise, which employ thousands of commissioned salespeople, are acquiring open source or cloud companies that sell differently
The idea is to distribute products to individuals or small groups at potential customers big and small and hope interest spreads upstairs.
So far, though, Atlassian remains the most extreme example of this model.
Tomi Engdahl says:
Asia Hotbed of IT Piracy Despite Economic Growth: Report
http://www.securityweek.com/asia-hotbed-it-piracy-despite-economic-growth-report
Unlicensed Software Use Still High Globally Despite Costly Cybersecurity Threats
More than 60 percent of all computer software installed in the Asia-Pacific in 2015 was unlicensed, the worst of any region, despite growing economies and anti-piracy efforts, an industry watchdog said Wednesday.
The Software Alliance — which includes giants like Microsoft, Apple, Intel, Oracle and Adobe — said in a report that the unlicensed software in Asia had a value of $19.1 billion last year.
Piracy rates were most rampant in Bangladesh, Pakistan and Indonesia at more than 80 percent. The global piracy average was 39 percent.
While the worldwide piracy rate decreased by four percentage points from 2013, Asia saw only a one percentage point decline to 61 percent over the two-year period, said the report, which did not cover mobile devices.
Tomi Engdahl says:
Magic Leap partners with messaging startup Twilio
The mixed-reality startup wants us all to interact via holograms.
http://www.engadget.com/2016/05/25/magic-leap-partners-with-messaging-startup-twilio/
We still don’t know all that much about super-secret mixed-reality startup Magic Leap. But today we learned that it will be partnering with communications company Twilio to make chatting with holographic-looking versions of your friends and family eventually happen.
Twilio CEO Jeff Lawson was joined onstage via telepresence robot by Magic Leap CEO Rony Abovitz, who said that the companies will be “working to integrate what I think are amazing services and components for communication.” The two companies also announced that 10 lucky developer teams in the Twilio community will have a chance to build for the mixed-reality hardware via an SDK.
Tomi Engdahl says:
Android Is ‘Fair Use’ As Google Beats Oracle In $9 Billion Lawsuit
https://yro.slashdot.org/story/16/05/26/2030233/android-is-fair-use-as-google-beats-oracle-in-9-billion-lawsuit
Ars Technica writes that Google’s Android OS does not infringe upon Oracle-owned copyrights because its re-implementation of 37 Java APIs is protected by “fair use.” The jury unanimously answered “yes” in response to whether or not Google’s use of Java APIs was a “fair use” under copyright law. The trial is now over, since Google won.
Google beats Oracle—Android makes “fair use” of Java APIs
Oracle has spent many millions trying to get a chunk of Android, to no avail.
http://arstechnica.com/tech-policy/2016/05/google-wins-trial-against-oracle-as-jury-finds-android-is-fair-use/
Following a two-week trial, a federal jury concluded Thursday that Google’s Android operating system does not infringe Oracle-owned copyrights because its re-implementation of 37 Java APIs is protected by “fair use.” The verdict was reached after three days of deliberations.
“Ladies and gentlemen of the jury, listen to your verdict as it will stand recorded,” said the court clerk, before polling each of the ten men and women on the jury.
There was only one question on the special verdict form, asking if Google’s use of the Java APIs was a “fair use” under copyright law. The jury unanimously answered “yes,” in Google’s favor. The verdict ends the trial, which began earlier this month. If Oracle had won, the same jury would have gone into a “damages phase” to determine how much Google should pay. Because Google won, the trial is over.
“We’re grateful for the jury’s verdict,” said Google lead lawyer Robert Van Nest before getting into the elevator with Google’s in-house lawyers. “That’s it.” Oracle attorneys had no comment.
Google said in a statement that its victory was good for everybody. “Today’s verdict that Android makes fair use of Java APIs represents a win for the Android ecosystem, for the Java programming community, and for software developers who rely on open and free programming languages to build innovative consumer products,” a Google spokesperson said via e-mail.
Tomi Engdahl says:
Google Doesn’t Owe Oracle a Cent for Using Java in Android, Jury Finds
http://www.wired.com/2016/05/google-doesnt-owe-oracle-cent-using-java-android-jury-finds/
Google’s use of the Oracle’s Java programming language in the Android operating system is legal, a federal jury found today in a verdict that could have major implications for the future of software development.
The case, which has dragged on for six years, could have cost Google as much as $9 billion in damages had it lost. But the decision affects more than just Google. The case is important because it helps clarify the copyright rules around what programmers can borrow for their own work. Programmers routinely borrow APIs from existing products either to ensure compatibility between products or simply to make it easier to learn a new product. An Oracle victory could have seriously curtailed that practice, hindering the creation of new software.
Tomi Engdahl says:
Linus Torvalds wins the desktop; Chromebooks outsell Macbooks
http://www.cio.com/article/3073897/linux/linus-torvalds-wins-the-desktop-chromebooks-outsell-macbooks.html
Three facts: PC sales continue to decline. Macbooks continue to grow as a share of PC shipments. And in the first quarter of 2016, Chromebooks outsold Macbooks. Yes, you read that right. According to IDC analyst Linn Huang, Chromebooks beat Macs in overall shipments in the U.S.
With that news, Linus Torvalds is ready to declare desktop victory. On Thursday last week, Torvalds posted on his Google+ page: “Hey, either Macs don’t count much on the desktop, or we may have to finally lay the ‘year of the Linux desktop’ joke to rest.”
Torvalds’ response to that commenter hits the nail on the head:
“It’s not a desktop exactly the same way that PCs were not ‘real computers’ when they started showing up?
The arguments were the same back then. ‘Cheap toy, you can’t get real work done.’
The whole ‘it’s not the same thing’ is simply not an argument. Of course it’s not the same thing. Computing changes all the time.”
Tomi Engdahl says:
A Thunderbolt 3 connection transfers data at a rate of 40 gigabits per second. That is double the rate of the second version, while at the same time power consumption drops by half.
Thunderbolt 3’s speed lets the connection carry a display signal to two 60-hertz 4K screens. In addition, it supports the PCIe 3.0 interface, HDMI 2.0, the DisplayPort 1.2 connector and USB 3.1. It can also deliver up to 100 watts of power and supports the USB power delivery profile.
Another challenge for adoption has been the price of the cables needed. Thunderbolt 3 also supports cheaper passive cables, over which data is transferred at a rate of 20 gigabits per second. Since that rate is double that of the latest USB bus, it will be enough for most people.
The NAB trade show in Las Vegas saw a lot of Thunderbolt 3 solutions that have been developed specifically for the transmission and recording of 4K video.
Source: http://etn.fi/index.php?option=com_content&view=article&id=4500:5-vuotias-thunderbolt-valmis-laitevyoryyn&catid=13&Itemid=101
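A back-of-the-envelope check that a 40 Gb/s link really can feed two 4K displays at 60 Hz, assuming uncompressed 24-bit-per-pixel streams and ignoring blanking and protocol overhead:

```python
# Uncompressed display stream bandwidth in gigabits per second.
def display_gbps(width, height, hz, bits_per_pixel=24):
    return width * height * hz * bits_per_pixel / 1e9

one_4k60 = display_gbps(3840, 2160, 60)
print(f"one 4K/60 stream: {one_4k60:.1f} Gb/s")   # ~11.9
print(f"two streams:      {2 * one_4k60:.1f} Gb/s, vs a 40 Gb/s link")
```

Two streams need roughly 24 Gb/s, comfortably inside the 40 Gb/s link; the same arithmetic shows why a 20 Gb/s passive cable can only drive one such display.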
Tomi Engdahl says:
US nuke arsenal runs on 1970s IBM ‘puter waving 8-inch floppies
Uncle Sam blows billions a year on legacy tech
http://www.theregister.co.uk/2016/05/25/us_nuclear_guidance_system_running_on_8inch_floppies/
A US Government Accounting Office (GAO) report has highlighted the parlous state of Uncle Sam’s IT infrastructure.
As an example, the computer used to coordinate America’s nuclear forces is an IBM Series/1 that uses eight‑inch floppy disks capable of storing about 80KB of data each. Meanwhile, the Treasury Department is calculating tax returns on a 56-year-old IBM mainframe using programs written in assembly code; it says it has no plans to update its systems.
“Federal legacy IT investments are becoming increasingly obsolete: many use outdated software languages and hardware parts that are unsupported,” the report [PDF] states.
“Federal IT investments have too frequently failed or incurred cost overruns and schedule slippages while contributing little to mission-related outcomes. The federal government has spent billions of dollars on failed and poorly performing IT investments which often suffered from ineffective management, such as project planning, requirements definition, and program oversight and governance.”
Several of the systems reviewed by the GAO are scheduled for replacement – the US military will have its nuclear control and targeting systems running on a more modern server by 2020, for example – but plenty of departments reported no plans to upgrade.
Tomi Engdahl says:
Are EU having a laugh? Europe passes hopeless cyber-commerce rules
When compromise becomes why bother at all
http://www.theregister.co.uk/2016/05/27/ec_passes_ecommerce_rules/
The European Commission (EC) has approved a series of ecommerce rules designed to make Europe more competitive online.
In true European fashion however, the proposals contain a lengthy series of inconsistent compromises and avoid altogether the most complex policy issues, making them largely worthless.
Vice-President for the Digital Single Market, Andrus Ansip, said of the measures: “All too often people are blocked from accessing the best offers when shopping online, or decide not to buy cross-border because the delivery prices are too high, or they are worried about how to claim their rights if something goes wrong.
“We want to solve the problems that are preventing consumers and businesses from fully enjoying the opportunities of buying and selling products and services online.”
Except the rules don’t do that.
Worst, however, is the fact that the Commission has exempted digital goods from its digital single market, so companies will be able to continue to geo-block videos and other digital files.
Tomi Engdahl says:
The largest PC manufacturer making a loss
Chinese computer and electronics manufacturer Lenovo posted a $128 million loss for the fiscal year that just ended. Over the financial year, net sales declined by three percent from a year ago to $44.9 billion.
The company says in a statement that it swung to a profit of $180 million in the fourth quarter as its reorganization took effect. At the same time, however, net sales decreased by 19 percent compared with the same quarter last year.
Lenovo has kept pole position as the largest PC manufacturer.
Source: http://www.tivi.fi/Kaikki_uutiset/suurimman-pc-valmistajan-tulos-reippaasti-pakkasella-6554619
Tomi Engdahl says:
Chris Birchall’s Re-Engineering Legacy Software (Manning Publications)
http://www.linuxjournal.com/content/chris-birchalls-re-engineering-legacy-software-manning-publications?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+linuxjournalcom+%28Linux+Journal+-+The+Original+Magazine+of+the+Linux+Community%29
Chances are high that you didn’t write the application you’re currently working on. Most developers inherit projects built on an existing codebase that reflects design patterns, usage assumptions, infrastructure and tooling from another time and another team (and the docs are complete rubbish). To help you breathe new life into your legacy project, pick up Chris Birchall’s new book from Manning Publications titled Re-Engineering Legacy Software. Birchall’s book is an experience-driven guide to revitalizing inherited projects, covering refactoring, quality metrics, toolchain and workflow, continuous integration, infrastructure automation and organizational culture.
Tomi Engdahl says:
Mike Masnick / Techdirt:
Google-Oracle verdict: good in that it allows for the re-implementation of APIs in different software, but fair-use was the wrong judicial vehicle
Big Win For Fair Use: Jury Says Google’s Use Of Java API’s Was Fair Use… On To The Appeal
https://www.techdirt.com/articles/20160526/13584834558/big-win-fair-use-jury-says-googles-use-java-apis-was-fair-use-to-appeal.shtml
Overall, a good result of a bad process and a confused judicial system. For now.
Tomi Engdahl says:
Cavium Flexes ARM Server Upgrade
14nm ThunderX2 will pack 54 cores
http://www.eetimes.com/document.asp?doc_id=1329785&
Cavium described significant upgrades planned for its next-generation ARM server SoC. Details of the ThunderX2, which won’t be in volume production until late next year, come at a time when the first-generation chip is gaining traction but has not yet found high volume markets.
It’s been a slow slog bringing ARM-based servers to a market heavily dominated by the Intel x86 and its deep pool of legacy software. To date, Cavium has had as much or more market traction than any of the surviving players, but it is so far only shipping thousands of chips per quarter.
“The primary reason announcing now is our first products are in the market so the market is anxious about what’s next,”
Late last year, rival Applied Micro announced its 32-core X-Gene 3 will ship by the end of this year.
Cavium currently claims the high-end of the ARM server market with the most cores and support for dual-socket systems, competing with Intel’s Xeon line. Some competitors got distracted by the concept of a microserver market that didn’t pan out for 32-bit and low-end parts.
So far, one ODM is about to start production and one large cloud computing provider is using ThunderX in production servers, he added. Lenovo and Hewlett-Packard have designed servers for ARM-based SoCs but it’s unclear if they have any significant customers for them.
A group developing support for ARM servers as part of the Linaro collaboration is finishing work on a standard software platform including the Advanced Configuration and Power Interface.
Broadcom, Huawei and Qualcomm have announced plans for ARM-based server SoCs and are among the most significant players yet to debut parts. Qualcomm’s development platforms are doing a particularly good job adhering to existing server standards
“I suspect there are tire kickers around the world, especially in China and Europe,” Freund said. “Red Hat alone could consume a few thousand units, and you can bet that Baidu, Alibaba, and TenCent are buying more than a few
Tomi Engdahl says:
Jon Peddie Research has examined the graphics processor and graphics card market for the first quarter of the year. As PC sales decline, graphics card sales have fallen as well.
Compared to the previous quarter, graphics processor shipments declined by 15 percent, which is in line with PC market development. Among manufacturers, AMD’s shipments declined five percent, Nvidia’s just over 15 percent and Intel’s 16.9 percent.
AMD fared quite well with its Radeon graphics cards. Card sales grew by 12.9 percent from the previous quarter. At the same time, Nvidia graphics card sales were down nearly 35 percent. In practice, the old ATI brand took market share from Nvidia.
Intel, of course, is the clear leader in graphics processors. In January-March its market share was 70.1 percent, although it shrank by one and a half percentage points. Nvidia’s market share was 16.7 percent, and AMD’s 13.2 percent.
Source: http://etn.fi/index.php?option=com_content&view=article&id=4513:erillisgrafiikka-sinnittelee-edelleen&catid=13&Itemid=101
Tomi Engdahl says:
Disk death: Three-quarters of PCs will run SSDs by 2020
At least, that’s what the analysts say…
http://www.theregister.co.uk/2016/05/31/hdd_revenues_to_plummet_as_ssd_penetration_rises/
Total disk drive shipments are going to plummet by 2020, with raw SSD cost getting cheaper than disk and SSDs taking over from disk in notebooks.
Analyst haus Stifel Nicolaus’s MD, Aaron Rakers, has taken a close look at Gartner’s HDD and SSD projections over the next few years.
He writes: “Gartner is currently forecasting HDD industry shipments to decline at a seven per cent compound annual growth rate (CAGR) from 2015-2020, while revenue is expected to be down two per cent over this timeframe.”
Rakers said:
Total industry capacity shipped is expected to grow at an 18.5 per cent CAGR
PC-related capacity shipped is expected to grow at a five per cent CAGR
Mission-critical (performance-optimized) enterprise HDD capacity shipped is expected to decline at a 12 per cent CAGR
Business-critical (high-cap / nearline) enterprise capacity shipped is estimated to grow at a 41 per cent CAGR
Gartner now thinks total SSD PC penetration rates will increase from 15 per cent and 22 per cent in 2014 and 2015 to 64 per cent in 2019 and nearly 76 per cent by 2020. Tellingly: “SSDs in mobile and ultramobile premium PCs are estimated to grow from 34 per cent penetration in 2015 to 83 per cent penetration by 2020.”
Gartner expects a cross-over $/raw GB cost for enterprise SSDs vs. Mission Critical HDDs occurring in 2018. This excludes the effects of compression and deduplication in raising effective from raw capacity.
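Those CAGR figures compound over the five-year 2015-2020 window; a quick sketch of what they imply for the end of the period:

```python
# Compound a base value by a CAGR over a number of years.
def project(base, cagr, years=5):
    return base * (1 + cagr) ** years

# HDD unit shipments at -7% CAGR: what fraction remains after 5 years?
remaining = project(1.0, -0.07)
print(f"HDD shipments in 2020: {remaining:.1%} of the 2015 level")  # ~69.6%

# Total industry capacity shipped at +18.5% CAGR:
capacity = project(1.0, 0.185)
print(f"Capacity shipped in 2020: {capacity:.2f}x the 2015 level")  # ~2.34x
```

So under Gartner's numbers, unit shipments fall by nearly a third even as total capacity shipped more than doubles, which is consistent with ever-larger nearline drives offsetting the loss of PC-class units.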