It seems that the PC market is stabilizing in 2016. I expect the PC market to shrink slightly. While mobile devices have been named as the culprits for the fall in PC shipments, IDC says that other factors may be in play. It is still pretty hard to make decent profits building PC hardware unless you are one of the biggest players – so Lenovo, HP, and Dell will again increase their collective dominance of the PC market like they did in 2015. I expect changes like spin-offs and maybe some mergers involving smaller players like Fujitsu, Toshiba and Sony. The EMEA server market looks to be a two-horse race between Hewlett Packard Enterprise and Dell, according to Gartner. HPE, Dell and Cisco “all benefited” from Lenovo’s acquisition of IBM’s EMEA x86 server organisation.
The tablet market is no longer a high-growth market – tablet sales have started to decline, and the decline continues in 2016 as owners hold onto their existing devices for more than three years. iPad sales are set to continue declining, and the iPad Air 3 to be released in the first half of 2016 does not change that. IDC predicts that the detachable tablet market is set for growth in 2016 as more people turn to hybrid devices. Two-in-one tablets have been popularized by offerings like the Microsoft Surface, with options ranging dramatically in price and specs. I am not myself convinced that the growth will be as strong as IDC forecasts, even though companies have started to purchase tablets for workers in jobs such as retail sales or field work (Apple iPads, Windows and Android tablets managed by the company). Combined shipment volumes of PCs, tablets and smartphones are expected to increase only in the single digits.
All your consumer tech gear should be cheaper come July, as there will be fewer import tariffs on IT products: a World Trade Organization (WTO) deal agrees that tariffs on imports of consumer electronics will be phased out over seven years starting in July 2016. The agreement affects around 10 percent of world trade in information and communications technology products and will eliminate around $50 billion in tariffs annually.
In 2015 storage was rocked to its foundations, and those new innovations will be taken into wider use in 2016. The storage market in 2015 went through strategic, foundation-shaking turmoil as the external shared disk array storage playbook was torn to shreds: The all-flash data centre idea has definitely taken off as an achievable vision, with primary data stored in flash and the rest held in cheap and deep storage. Flash drives largely solve the disk drive latency problem, so there is less need for hybrid drives. There is conviction that storage should be located as close to servers as possible (virtual SANs, hyper-converged infrastructure appliances and NVMe fabrics). The hybrid cloud concept was adopted and supported by everybody. Flash started out in 2-bits/cell MLC form, which rapidly became standard, and TLC (3-bits/cell, triple-level cell) started appearing. Industry-standard NVMe drivers for PCIe flash cards appeared. Intel and Micron blew non-volatile memory preconceptions out of the water in the second half of the year with their joint 3D XPoint memory announcement. Boring old disk tech got shingled magnetic recording (SMR) and helium-filled drive technology; the drive industry is focused on capacity-optimizing its drives. We got key:value store disk drives with an Ethernet NIC on board, and basic GET and PUT object storage facilities came into being. The tape industry developed a 15TB LTO-7 format.
The use of SSDs will increase and their prices will drop. SSDs were in more than 25% of new laptops sold in 2015, are expected to be in 31% of new consumer laptops in 2016, and in more than 40% by 2017. The prices of mainstream consumer SSDs have fallen dramatically every year over the past three years, while HDD prices have not changed much. SSD prices will decline to 24 cents per gigabyte in 2016. In 2017 they are expected to drop to 11-17 cents per gigabyte (meaning a 1TB SSD would on average retail for $170 or less).
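To put those per-gigabyte figures in perspective, here is a back-of-the-envelope calculation in Python; the prices are just the forecast numbers quoted above, not measured data:

```python
# Rough retail estimates for a 1 TB (= 1000 GB) consumer SSD,
# using the forecast per-gigabyte prices quoted above.
capacity_gb = 1000
price_2016 = 0.24          # $/GB forecast for 2016
price_2017 = (0.11, 0.17)  # $/GB forecast range for 2017

print("1 TB SSD in 2016: ~$%.0f" % (capacity_gb * price_2016))
print("1 TB SSD in 2017: ~$%.0f-%.0f" % (capacity_gb * price_2017[0],
                                         capacity_gb * price_2017[1]))
```

The 2017 upper bound works out to about $170, which is where the “$170 or less” figure comes from.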
Hard disk sales will decrease, but this technology is not dead. Sales of hard disk drives have been decreasing for several years now (118 million units in the third quarter of 2015), but according to Seagate, hard disk drives (HDDs) are set to stay relevant for at least 15 to 20 years. HDDs remain the most popular data storage technology as they are the cheapest in terms of per-gigabyte cost. While SSDs are generally getting more affordable, high-capacity solid-state drives are not going to become as inexpensive as hard drives any time soon.
Because all-flash storage systems with homogeneous flash media are still too expensive to serve as a solution for every enterprise application workload, enterprises will increasingly turn to performance-optimized storage solutions that use a combination of multiple media types to deliver cost-effective performance. The speed advantage of Fibre Channel over Ethernet has evaporated. Enterprises will also start to seek alternatives to snapshots that are simpler and easier to manage and that allow data and application recovery to a second before the data error or logical corruption occurred.
Local storage and the cloud finally make peace in 2016 as the decision-makers across the industry have now acknowledged the potential for enterprise storage and the cloud to work in tandem. Over 40 percent of data worldwide is expected to live on or move through the cloud by 2020 according to IDC.
Open standards for data center development are now a reality thanks to advances in cloud technology. Facebook’s Open Compute Project has served as the industry’s leader in this regard. This allows more consolidation for those that want it. Consolidation used to refer to companies moving all of their infrastructure to the same facility. However, some experts have begun to question this strategy, as the rapid increase in data quantities and apps in the data center has made centralized facilities more difficult to operate than ever before. Server virtualization, more powerful servers and an increasing number of enterprise applications will continue to drive higher IO requirements in the datacenter.
Cloud consolidation starts in earnest in 2016: the number of options for general infrastructure-as-a-service (IaaS) cloud services and cloud management software will be much smaller at the end of 2016 than at the beginning. The major public cloud providers will gain strength, with Amazon, IBM SoftLayer, and Microsoft capturing a greater share of the business cloud services market. Lock-in is a real concern for cloud users, because PaaS players have the age-old imperative to find ways to tie customers to their platforms and aren’t afraid to use them, so advanced users want to establish reliable portability across PaaS products in a multi-vendor, multi-cloud environment.
Year 2016 will be harder for legacy IT providers than 2015. In its report, IDC states that “By 2020, More than 30 percent of the IT Vendors Will Not Exist as We Know Them Today.” Many enterprises are turning away from traditional vendors and toward cloud providers. They’re increasingly leveraging open source. In short, they’re becoming software companies. The best companies will build cultures of performance and doing the right thing – and will make data and the processes around it self-service for all their employees. Design thinking will guide companies that want to change the lives of their customers and employees. 2016 will see a lot more work in trying to manage services that simply aren’t designed to work together or even be managed – for example, getting Whatever-as-a-Service cloud systems to play nicely with existing legacy systems. So competent developers are a scarce commodity. Some companies are starting to see the cloud as a form of outsourcing that is fast burning up in-house IT ops jobs, with varying success.
There are still too many old-fashioned companies that just can’t understand what digitalization will mean for their business. In 2016, some companies’ boards still think the web is just for brochures and porn and don’t believe their business models can be disrupted. It gets worse for many traditional companies. For example, Amazon is a retailer both on the web and increasingly for things like food deliveries. Amazon and others are playing to win. Digital disruption has happened and will continue.
Windows 10 will gain more ground in 2016. If 2015 was a year of revolution, 2016 promises to be a year of consolidation for Microsoft’s operating system. I expect Windows 10 adoption in companies to start in 2016. Windows 10 is likely to be a success in the enterprise, but I expect that the word from heavyweights like Gartner, Forrester and Spiceworks, suggesting that half of enterprise users plan to switch to Windows 10 in 2016, is more than a bit optimistic. Windows 10 will also be used in China, as Microsoft played the game with it better than with Windows 8, which was banned in China.
Windows is now delivered “as a service”, meaning incremental updates with new features as well as security patches, but Microsoft still seems to work internally to a schedule of milestone releases. Next up is Redstone, rumoured to arrive around the anniversary of Windows 10, midway through 2016. Windows servers will also get an update in 2016, which should include the release of Windows Server 2016. Server 2016 includes updates to the Hyper-V virtualisation platform, support for Docker-style containers, and a new cut-down edition called Nano Server.
Windows 10 will get some of the features that were promised but not delivered in 2015. Windows 10 was promised for PCs and mobile devices in 2015 to deliver a unified user experience. Continuum is a new, adaptive user experience in Windows 10 that optimizes the look and behavior of apps and the Windows shell for the physical form factor and the customer’s usage preferences. The promise was the same unified interface for PCs, tablets and smartphones – but in 2015 it was delivered only for PCs and some tablets. Windows 10 Mobile for smartphones is finally expected to arrive in 2016 – the release of Microsoft’s new Windows 10 operating system may be the last roll of the dice for its struggling mobile platform. Microsoft’s Plan A is to get as many apps and as much activity as it can on Windows on all form factors with the Universal Windows Platform (UWP), which enables the same Windows 10 code to run on phone and desktop. Despite a steady inflow of new well-known apps, it remains unclear whether the Universal Windows Platform can maintain momentum with developers. Can Microsoft keep the developer momentum going? I am not sure. In addition there are plans for tools for porting iOS apps and an Android runtime, so expect delivery of some or all of the Windows Bridges (iOS, web app, desktop app, Android) announced at the April 2015 Build conference, in the hope of getting more apps into the unified Windows 10 app store. Windows 10 does hold out some promise for Windows Phone, but it’s not going to make an enormous difference. Losing the battle for the web and mobile computing is a brutal loss for Microsoft. When you consider the size of those two markets combined, the desktop market seems like a stagnant backwater.
Older Windows versions will not die in 2016 as fast as Microsoft and security people would like. Expect Windows 7 diehards to continue holding out in 2016 and beyond. And there are still many companies that run their critical systems on Windows XP because “there are some people who don’t have an option to change.” Many times the OS is running in automation and process control systems that run business- and mission-critical systems, both in private sector and government enterprises. For example, the US Navy is using the obsolete Microsoft Windows XP operating system to run critical tasks. It all comes down to money and resources, but being obliged to keep something running on an obsolete system is completely the wrong approach to information security.
Virtual reality has grown immensely over the past few years, but 2016 looks like the most important year yet: it will be the first time that consumers can get their hands on a number of powerful headsets for viewing alternate realities in immersive 3-D. Virtual reality will move toward the mainstream when Sony, Samsung and Oculus bring consumer products to the market in 2016. The whole virtual reality hype could be rebooted as early builds of the final Oculus Rift hardware start shipping to devs. HTC’s and Valve’s Vive VR headset may suffer delays over the next few months. Expect a banner year for virtual reality.
GPU and FPGA acceleration will be used widely in high performance computing. Both Intel and AMD have products with the CPU and GPU on the same chip, and there is software support for using the GPU (learn CUDA and/or OpenCL). Many mobile processors also have the CPU and GPU on the same chip. FPGAs are circuits that can be configured for a specific application but can also be reprogrammed later. There was lots of interest in 2015 in using FPGAs to accelerate computations as the next step after GPUs, and I expect that interest to grow even more in 2016. FPGAs are not quite as efficient as a dedicated ASIC, but they are about as close as you can get without translating the actual source code directly into a circuit. Intel bought Altera (a big FPGA company) in 2015 and plans to begin selling products with a Xeon chip and an Altera FPGA in a single package – possibly available in early 2016.
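To give a flavour of the GPU offloading that CUDA and OpenCL enable, here is a minimal sketch in Python using the Numba package’s CUDA support. This is my own illustration, not tied to any product mentioned above, and it assumes an NVIDIA GPU plus the numba library are available:

```python
import numpy as np
from numba import cuda  # assumes numba with CUDA support and an NVIDIA GPU

@cuda.jit
def vector_add(a, b, out):
    # Each GPU thread handles one element of the arrays.
    i = cuda.grid(1)
    if i < out.shape[0]:
        out[i] = a[i] + b[i]

n = 1_000_000
a = np.random.rand(n).astype(np.float32)
b = np.random.rand(n).astype(np.float32)
out = np.zeros_like(a)

threads_per_block = 256
blocks = (n + threads_per_block - 1) // threads_per_block
vector_add[blocks, threads_per_block](a, b, out)  # arrays are copied to the GPU and back
print(out[:4], a[:4] + b[:4])                     # results should match
```

The same pattern – split the data across thousands of lightweight threads – is what CUDA C or OpenCL code does natively; an FPGA takes it a step further by turning the inner loop into dedicated logic.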
Artificial intelligence, machine learning and deep learning will be talked about a lot in 2016. Neural networks, which were academic exercises (but little more) for decades, are increasingly becoming mainstream success stories: heavy (and growing) investment in the technology, which enables the identification of objects in still and video images, words in audio streams, and the like after an initial training phase, comes from the formidable likes of Amazon, Baidu, Facebook, Google, Microsoft, and others. So-called “deep learning” has been enabled by the combination of the evolution of traditional neural network techniques, the steadily increasing processing “muscle” of CPUs (aided by algorithm acceleration via FPGAs, GPUs, and, more recently, dedicated co-processors), and the steadily decreasing cost of system memory and storage. There were many interesting releases at the end of 2015: Facebook released portions of its Torch software, while Alphabet’s Google division open-sourced parts of its TensorFlow system. IBM also turned up the heat under competition in artificial intelligence by making SystemML freely available to share and modify through the Apache Software Foundation. So I expect that 2016 will be the year these tools are tried in practice. I expect that deep learning will be hot at CES 2016. Several respected scientists issued a letter warning about the dangers of artificial intelligence (AI) in 2015, but I don’t worry about a rogue AI exterminating mankind. I worry about an inadequate AI being given control over things that it’s not ready for. How will machine learning affect your business? MIT has a good free intro to AI and ML.
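For readers who have not looked under the hood, the toy example below (plain NumPy, not Torch or TensorFlow) shows what the “initial training phase” of a small neural network boils down to: a forward pass, an error measure, and repeated gradient updates. It is a sketch for illustration only and is not tuned:

```python
import numpy as np

# Toy two-layer network learning XOR; real frameworks automate all of this.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)   # hidden layer, 8 units
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)   # output layer
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(10000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # backward pass: gradients of the squared error
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X);   b1 -= lr * d_h.mean(axis=0)

# Outputs should move toward [0, 1, 1, 0] as training converges
# (a toy demo; some random seeds may need more steps).
print(out.round(3).ravel())
```

The “deep” networks used for image and speech recognition are the same idea with many more layers, far more data, and GPU acceleration for the matrix math.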
Computers, which excel at big data analysis, can help doctors deliver more personalized care. Can machines outperform doctors? Not yet. But in some areas of medicine, they can make the care doctors deliver better. Humans repeatedly fail where computers — or humans behaving a little bit more like computers — can help. Computers excel at searching and combining vastly more data than a human so algorithms can be put to good use in certain areas of medicine. There are also things that can slow down development in 2016: To many patients, the very idea of receiving a medical diagnosis or treatment from a machine is probably off-putting.
The Internet of Things (IoT) was talked about a lot in 2015, and it will be a hot topic for IT departments in 2016 as well. Many companies will notice that security issues are important in it. The newest wearable technology – smart watches and other smart devices – responds to voice commands and interprets the data we produce: it learns from its users and generates appropriate responses in real time. Interest in the Internet of Things (IoT) will also bring interest in real-time business systems: not only real-time analytics, but real-time everything. This will start in earnest in 2016, but the trend will take years to play out.
Connectivity and networking will be hot. And it is not just about IoT. CES will focus on how connectivity is proliferating everything from cars to homes, realigning diverse markets. The interest will affect the job market: network jobs are hot and salaries are expected to rise in 2016, as wireless network engineers, network admins, and network security pros can expect above-average pay gains.
Linux will stay big in the network server market in 2016. The web server marketplace is one arena where Linux has had the greatest impact. Today, the majority of web servers are Linux boxes. This includes most of the world’s busiest sites. Linux will also run many parts of the Internet infrastructure that moves the bits from server to user. Linux will also continue to rule the smartphone market as the core of Android. New IoT solutions will most likely be built using Linux in many parts of their systems.
Microsoft and Linux are not such enemies as they were a few years ago. Common sense says that Microsoft and the FOSS movement should be perpetual enemies. It looks like Microsoft is waking up to the fact that Linux is here to stay. Microsoft cannot feasibly wipe it out, so it has to embrace it. Microsoft is already partnering with Linux companies to bring popular distros to its Azure platform. In fact, Microsoft has even gone so far as to create its own Linux distro for its Azure data center.
Web browsers are increasingly going 64-bit: Firefox started the 64-bit era on Windows, and Google is killing Chrome for 32-bit Linux. At the same time web browsers are losing old legacy features like NPAPI and Silverlight. Who will miss them? The venerable NPAPI plugin standard, which dates back to the days of Netscape, is now showing its age and causing more problems than it solves, and native support will be removed from Firefox by the end of 2016. It was already removed from the Google Chrome browser with very little impact. The biggest issue was the lack of support for Microsoft’s Silverlight, which brought down several top streaming media sites – but they are actively switching to HTML5 in 2016. I don’t miss Silverlight. Flash will continue to be available owing to its popularity for web video.
SHA-1 will be at least partially retired in 2016. Due to recent research showing that SHA-1 is weaker than previously believed, Mozilla, Microsoft and now Google are all considering bringing the deadline forward by six months to July 1, 2016.
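For developers the switch is usually just a matter of selecting a different hash function wherever signatures or fingerprints are computed. A quick look at the two digests using Python’s standard hashlib module:

```python
import hashlib

data = b"example data to be signed"
print("SHA-1  :", hashlib.sha1(data).hexdigest())    # 160-bit digest, being phased out
print("SHA-256:", hashlib.sha256(data).hexdigest())  # 256-bit digest, the usual replacement
```

The hard part is not the code but the ecosystem: every certificate, client and middlebox in the chain has to accept the newer algorithm, which is why the browser vendors are coordinating the deadline.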
Adobe’s Flash has been under attack from many quarters over security as well as for slowing down web pages. If you wish that Flash would finally die in 2016, you might be disappointed. Adobe seems to be trying to kill the name with a rebranding trick: Adobe Flash Professional CC is now Adobe Animate CC. In practice it probably does not mean much, but Adobe seems to acknowledge the inevitability of an HTML5 world. Adobe wants to remain a leader in interactive tools, and the pivot to HTML5 requires new messaging.
The trend of trying to use the same language and tools on both the user end and the server back-end continues. Microsoft is pushing its .NET and Azure cloud platform tools. Amazon, Google and IBM have their own sets of tools. Java is in decline. JavaScript is going strong on both the web browser and server side with node.js, React and many other JavaScript libraries. Apple is also trying to bend its Swift programming language, now used mainly to make iOS applications, to run on servers with the Perfect project.
Java will still stick around, but Java’s decline as a language will accelerate as new stuff isn’t being written in Java, even if it runs on the JVM. We will not see a new Java 9 in 2016, as Oracle has delayed the release of Java 9 by six months. The Register reports that Java 9 is delayed until Thursday, March 23rd, 2017, just after tea-time.
Containers will rule the world as Docker continues to develop, gains security features, and adds various forms of governance. Until now Docker has been tire-kicking, used in production only by the early-adopter crowd, but that can change as vendors start to claim that they can do proper management of big data and container farms.
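If you have not tried containers yet, the basic workflow is very lightweight. A minimal sketch (assuming the docker CLI is installed and the daemon is running) that starts a throwaway container, runs one command in it and cleans up afterwards:

```python
import subprocess

# Start a disposable Alpine Linux container, run a command in it,
# and remove the container when it exits (--rm).
result = subprocess.run(
    ["docker", "run", "--rm", "alpine", "echo", "hello from a container"],
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())
```

The whole round trip typically takes a second or two, which is exactly why containers are so attractive compared to spinning up full virtual machines.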
NoSQL databases will take hold as they are marketed as “highly scalable” or “cloud-ready.” Expect 2016 to be the year when a lot of big brick-and-mortar companies publicly adopt NoSQL for critical operations. Basically NoSQL can be seen as a key:value store, and this idea has also expanded to storage systems: we got key:value store disk drives with an Ethernet NIC on board and basic GET and PUT object storage facilities.
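To make the key:value idea concrete, here is a minimal in-memory sketch of the GET/PUT interface that such object stores (and the Ethernet-connected drives mentioned above) expose. This is purely illustrative and not any vendor’s actual API:

```python
class KeyValueStore:
    """Toy in-memory object store exposing the basic PUT/GET/DELETE verbs."""

    def __init__(self):
        self._objects = {}

    def put(self, key, value):
        self._objects[key] = bytes(value)   # store an opaque blob under a key

    def get(self, key):
        return self._objects.get(key)       # returns None if the key is missing

    def delete(self, key):
        self._objects.pop(key, None)


store = KeyValueStore()
store.put("sensor/42/latest", b'{"temp": 21.5}')
print(store.get("sensor/42/latest"))  # b'{"temp": 21.5}'
```

The appeal is exactly this simplicity: no schema, no joins, just keys and blobs, which is easy to shard across many nodes (or many drives).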
In the database world Big Data will still be big, but it needs to be analyzed in real time. A typical big data project usually involves some semi-structured data, a bit of unstructured data (such as email), and a whole lot of structured data (stuff stored in an RDBMS). While the cost of Hadoop on a per-node basis is pretty inconsequential, the cost of understanding all of the schemas, getting them into Hadoop, and structuring them well enough to perform the analytics is still considerable. Remember that you’re not “moving” to Hadoop, you’re adding a downstream repository, so you need to worry about systems integration and latency issues. Apache Spark will also gain interest, as Spark’s multi-stage in-memory primitives provide more performance for certain applications. Big data brings with it responsibility – digital consumer confidence must be earned.
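As an example of what Spark’s in-memory primitives look like in practice, here is a minimal PySpark sketch (the input path is hypothetical) that counts log lines by their first field and caches the intermediate result so a second query does not re-read the data:

```python
from pyspark import SparkContext

sc = SparkContext(appName="log-level-count")

lines = sc.textFile("hdfs:///logs/app.log")            # hypothetical input path
counts = (lines.map(lambda line: (line.split(" ")[0], 1))
               .reduceByKey(lambda a, b: a + b)
               .cache())                                # keep the RDD in memory for re-use

print(counts.collect())
print(counts.filter(lambda kv: kv[1] > 100).count())    # second pass hits the cached data
sc.stop()
```

Keeping working sets in memory between stages is the main reason Spark outperforms classic MapReduce for iterative and interactive workloads.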
IT security continues to be a huge issue in 2016. You might be able to achieve adequate security against hackers and internal threats, but every attempt to make systems idiot-proof just means the idiots get upgraded. Firms are ever more connected to each other and to the general outside world. So in 2016 we will see even more service firms accidentally leaking critical information and a lot more firms having their reputations scorched by incompetence-fuelled security screw-ups. Good security people are needed more and more – a joke doing the rounds among IT execs doing interviews is “if you’re a decent security bod, why do you need to look for a job?”
There will still be unexpected single points of failure in big distributed networked systems. The cloud behind the silver lining is that Amazon or any other cloud vendor can be as fault-tolerant, distributed and well supported as you like, but if a service like Akamai or Cloudflare were to die, you still stop. That’s not a single point of failure in the classical sense, but it’s really hard to manage unless you go for full cloud agnosticism – which is costly. This is hard to justify when their failure rate is so low, so the irony is that the reliability of the content delivery networks means fewer businesses work out what to do if they fail. Oh, and no one seems to test their mission-critical data centre properly, because it’s mission critical – so they just over-specify where they can and cross their fingers (= pay twice and get half the coverage for other vulnerabilities).
For IT start-ups it seems that Silicon Valley’s cash party is coming to an end. Silicon Valley is cooling, not crashing. Valuations are falling. The era of cheap money could be over and valuation expectations are re-calibrating down. The cheap capital party is over. It could mean trouble for weaker startups.
933 Comments
Tomi Engdahl says:
20% of Scientific Papers On Genes Contain Conversion Errors Caused By Excel, Says Report
https://science.slashdot.org/story/16/08/23/2222258/20-of-scientific-papers-on-genes-contain-conversion-errors-caused-by-excel-says-report
A new report from scientists Mark Ziemann, Yotam Eren, and Assam El-Osta says that 20% of scientific papers on genes contain gene name conversion errors caused by Excel. In the abstract of the article, titled “Gene name errors are widespread in the scientific literature,” the scientists explain: “The spreadsheet software Microsoft Excel, when used with default settings, is known to convert gene names to dates and floating-point numbers. A programmatic scan of leading genomics journals reveals that approximately one-fifth of papers with supplementary Excel gene lists contain erroneous gene name conversions.”
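Out of curiosity, a crude check in the same spirit as the paper’s programmatic scan (my own rough approximation, not the authors’ actual method) takes only a few lines of Python: flag gene symbols that Excel’s default settings would silently re-interpret as dates.

```python
import re

# Month prefixes that Excel will happily read as dates when followed by a number,
# e.g. SEPT2 -> 2-Sep, MARCH1 -> 1-Mar, DEC1 -> 1-Dec.
MONTHS = ("JAN", "FEB", "MAR", "MARCH", "APR", "MAY", "JUN", "JUL",
          "AUG", "SEP", "SEPT", "OCT", "NOV", "DEC")
DATE_LIKE = re.compile(r"^(%s)\d{1,2}$" % "|".join(MONTHS))

genes = ["SEPT2", "MARCH1", "DEC1", "TP53", "BRCA1"]
print([g for g in genes if DATE_LIKE.match(g)])   # ['SEPT2', 'MARCH1', 'DEC1']
```

The practical fix is simply to import gene lists as text rather than letting the spreadsheet guess the data type.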
Tomi Engdahl says:
HTC Vive Gives Autonomous Robots Direction
http://hackaday.com/2016/08/23/htc-vive-gives-autonomous-robots-direction/
The HTC Vive is a virtual reality system designed to work with Steam VR. The system seeks to go beyond just a headset in order to make an entire room a virtual reality environment by using two base stations that track the headset and controller in space. The hardware is very exciting because of the potential to expand gaming and other VR experiences, but it’s already showing significant potential for hackers as well — in this case with robotics location and navigation.
RoboSavvy integrated HTC Vive with ROS
http://robosavvy.com/forum/viewtopic.php?f=26&t=14232
Tomi Engdahl says:
Microsoft’s take on simplification: The Acer Aspire One Cloudbook
http://www.edn.com/electronics-blogs/brians-brain/4442590/Microsoft-s-take-on-simplification–The-Acer-Aspire-One-Cloudbook
Maybe I’ve just grown weary of the perpetual software update cycle, but I’m seemingly all about simplicity of late. Recently, I passed along my preliminary impressions of a Toshiba Chromebook 2, which as its name implies is based on Google’s cloud-centric Chrome OS. And today, I’m going to share my initial thoughts on Microsoft’s take on simplification, in the form of Acer’s Aspire One Cloudbook series, initially introduced in August 2015.
The Aspire One Cloudbook targets the low end of the computing market; here are some high-level specs for my particular 11″ model, which I recently snagged for $89.99 refurbished on sale at Newegg
Tomi Engdahl says:
HoloLens NES Emulator For Augmented Retro Gaming
http://hackaday.com/2016/08/24/hololens-nes-emulator-for-augmented-retro-gaming/
[Andrew Peterson] was looking for a way to indulge in his retro gaming passions in a more contemporary manner. His 3D NES emulator “N3S” for Windows brings Nintendo classics to the HoloLens, turning pixels into voxels, and Super Mario into an augmented reality gingerbread man.
To run NES games on the HoloLens, [Andrew’s] emulator uses the Nestopia libretro core. Since AR glasses cry for an augmentation of the game itself, the N3S re-emulates the NES’ picture processing unit (PPU), allowing it to interpret a Nintendo game’s graphics in a 3D space.
N3S
http://n3s.io/index.php?title=N3S
N3S is a 3D NES emulator for Windows that is currently in alpha. It wraps the Nestopia UE libretro core and re-emulates the PPU to draw predefined 3D voxel meshes in place of 2D sprites.
Tomi Engdahl says:
Are Energy Standards for Computers on Horizon?
http://www.eetimes.com/author.asp?section_id=36&doc_id=1330329&
California has put the wheels in motion, and the NRDC says electricity use by computers can be cut in half using off-the-shelf technology with no impact on performance, and at negligible cost.
The California Energy Commission appears to be moving ahead with the nation’s first energy efficiency standards for computers and monitors. Some reports indicate that the standards, which would apply to the power-use settings for desktops, laptops and computer monitors sold in the state, may be adopted by the end of this year; given California’s market size and influence, adoption of these standards could spark industrywide changes, the news report noted.
The standards, which would vary by computer type and possibly be phased in during 2017 and/or 2018, would save consumers hundreds of millions of dollars every year, according to the CEC’s March 2015 press release. For desktop computers alone, it is estimated that a $2 increase in manufacturing costs will return $69 to consumers in energy savings over the five-year life of a desktop, the organization claims.
Tomi Engdahl says:
Minecraft + Oculus Rift Virtual Reality = Mind Goes Boom
http://www.eetimes.com/author.asp?section_id=216&doc_id=1330351&
Installing Minecraft on the Oculus Rift is not as intuitive as one might hope, but the end result is spectacular and well-worth the effort.
As I’ve mentioned before, I’ve never really been much of a games player, so I’ve largely watched game-related things evolve from the sidelines.
Take Minecraft, for example. I was vaguely aware of this little scamp when it first appeared on the scene. Over time, I saw documentaries about it on TV and I saw younger folks playing it, but it never really grabbed my attention.
Tomi Engdahl says:
19 Views of IDF16
PC giant aims to play a new tune
http://www.eetimes.com/document.asp?doc_id=1330332&
Whether Intel can remain the world’s largest semiconductor company in the wake of the PC tsunami is anyone’s guess. But there’s no doubt the company is trying to move fast on multiple fronts to participate in a richly diverse set of opportunities ahead.
This year’s Intel Develop Forum showed the company racing to get traction with a broad set of new platforms in machine vision, the Internet of Things, FPGAs, machine learning and more. None of them will replace the PC, but some collection of them could someday more than fill that gap.
Analysts’ opinions were mixed. In a research note entitled, “Battleship is turning,” Ross Seymore of Deutsche Bank said he was “impressed by Intel’s commitment to move beyond its PC heritage into a wide array of new markets … this transition will become increasingly apparent in 2017 as PC-related revs fall to ~50% of the company’s mix.
He and others praised the company for an update on its 10nm node that showed “its ability to keep pace with Moore’s law when others have found doing so more difficult.”
The PC was probably “the most important design win of all time…but it’s not clear IoT will be the same horn of plenty…I think ARM is going to be the long term winner in IoT,”
Tomi Engdahl says:
Live from LinuxCon – Sharing the latest news and learnings on Microsoft’s open journey
https://azure.microsoft.com/en-us/blog/live-from-linuxcon-sharing-the-latest-news-and-learnings-on-microsoft-s-open-journey/
representing Microsoft as a keynote speaker for the first time! I’m excited to share exciting new open source developments from Microsoft and things we’ve learned from our journey with Linux and open source.
The reality is customers use more than one tool and more than one platform to operate their businesses. They need tools that support Linux and Windows, and they need a cloud that allows them to run any application. One of the things I shared with linux.com recently was how blown away I was to see how large Microsoft’s investment in Linux already is. We brought .NET Core, PowerShell, and SQL Server to Linux. We also open sourced Visual Studio Code and just recently PowerShell. And, we are contributing to and participating in numerous community projects. It’s incredible to be a part of it.
Our latest open source and Linux advancements
One of the areas we are focused on is delivering open management solutions. In today’s multi-cloud, multi-OS world, customers need simple, unified tools to reduce complexity. That’s why just last week, we announced that we’re open sourcing PowerShell and making it available on Linux. Now PowerShell users across Windows and Linux can use our popular command-line shell and scripting language to manage almost everything from almost anywhere. My colleague Jeffrey Snover wrote a fantastic story about the journey to open source PowerShell and how customer-centricity brought us here
Today, I’m also excited to share that OMS Docker Container monitoring is available in preview. By nature, containers are lightweight and easily provisioned, so without a centralized approach to monitoring, customers may find it difficult to manage and respond to critical issues quickly.
Our experiences with Linux in Azure, where nearly 1 in 3 VMs today are Linux, have brought us closer to our customers and what they need to succeed in a rapidly advancing world. We have made significant investments in making Microsoft’s platform a great place to run open source software, and I will be working with my team to accelerate this effort over the coming months.
Choice and flexibility are important tenets of our platform. Also critical are our efforts to contribute to open source projects, integrate open source technologies in our platform, and forge commercial and community partnerships with the ecosystem.
Tomi Engdahl says:
Excel hell messes up ~20 per cent of genetic science papers
Australian boffins say the problem is between users’ ears and in the spreadsheet’s formatting genes
http://www.theregister.co.uk/2016/08/25/excel_hell_messes_up_20_per_cent_of_genetic_science_papers/
Tomi Engdahl says:
Focus Shifting To Photonics
http://semiengineering.com/focus-shifting-to-photonics/
Using light to move data will save power and improve performance; laser built into process technology overcomes huge hurdle.
Silicon photonics finally appears ready for prime time, after years of unfulfilled expectations and a vision that stretches back at least a couple decades.
The biggest challenge has been the ability to build a light source directly into the silicon process, rather than trying to add one onto a chip after manufacturing. Intel today said it has achieved that milestone, setting the stage for building economies of scale into the process. That may take several more years, but it nonetheless represents an important step for this technology.
“We have solved the problem of integrating the laser into the process,” said Alexis Bjorlin, general manager of the Connectivity Group at Intel. “We invested in a methodology to bond light-emitting III-V GaN to silicon so that the lasers are defined in silicon. This is the Holy Grail of silicon photonics.”
The first implementations of this technology will be between systems within a data center, where silicon photonics already is in widespread use. This is a relatively price-insensitive but fast-growing market
“Right now we can drive a 3X per bit power reduction. So you have higher-rate switches, and you get an improvement in power consumption. The core differentiator there is the laser integrated on silicon.”
Tomi Engdahl says:
Power9 Opens IBM to Partners
One chip, four variants and tons of eDRAM
http://www.eetimes.com/document.asp?doc_id=1330350&
IBM’s Power 9 processor, described for the first time at Hot Chips yesterday, could become a break out chip, seeding new OEM and accelerator partners and rejuvenating Big Blue’s bid against archrival Intel in high-end servers.
The 14nm Power 9, first mentioned in March, takes a bold if somewhat fragmented strategy in the hot area of accelerators. It is IBM’s first Power chip to emerge as a family to enable a range of scale up and scale out system designs.
Like past IBM microprocessors, to reach new performance levels it uses a gob of memory—including a whopping 120 Mbyte embedded DRAM in shared L3 cache riding a 7 Tbit/second on-chip fabric.
Tomi Engdahl says:
New microchip demonstrates efficiency and scalable design
http://www.princeton.edu/main/news/archive/S47/19/67G69/?section=topstories
Princeton University researchers have developed a new computer chip that promises to boost the performance of data centers that lie at the core of numerous online services such as email and social media.
The chip — called “Piton” after the metal spikes driven by rock climbers into mountainsides to aid in their ascent — was presented Aug. 23 at Hot Chips, a symposium on high-performance chips held in Cupertino, California.
The Princeton researchers designed their chip specifically for massive computing systems. Piton could substantially increase processing speed while slashing energy usage. The chip architecture is scalable — designs can be built that go from a dozen to several thousand cores, which are the independent processors that carry out the instructions in a computer program. Also, the architecture enables thousands of chips to be connected into a single system containing millions of cores.
“With Piton, we really sat down and rethought computer architecture in order to build a chip specifically for data centers and the cloud,” said David Wentzlaff, a Princeton assistant professor of electrical engineering and associated faculty in the Department of Computer Science. “The chip we’ve made is among the largest chips ever built in academia and it shows how servers could run far more efficiently and cheaply.”
The current version of the Piton chip measures 6 millimeters by 6 millimeters. The chip has more than 460 million transistors, each of which is as small as 32 nanometers
The bulk of these transistors are contained in 25 cores.
computer manufacturers have turned to multi-core chips to squeeze further gains out of conventional approaches to computer hardware
The Piton chip’s design focuses on exploiting commonality among programs running simultaneously on the same chip. One method to do this is called execution drafting. It works very much like the drafting in bicycle racing, when cyclists conserve energy by riding behind a lead rider who cuts through the air, creating a slipstream.
Piton was designed by the Princeton team and manufactured by IBM.
Tomi Engdahl says:
Facebook is giving away the software it uses to understand objects in photos
DeepMask and SharpMask are now open source
http://www.theverge.com/2016/8/25/12630850/facebook-fair-deepmask-sharpmask-ai-image-recognition
Facebook is open sourcing a set of computer vision software tools that can identify both the variety and the shape of objects within photos. The tools, developed by the Facebook AI Research (FAIR) team, are called DeepMask, SharpMask, and MultiPathNet, and all three work in tandem to help break down and contextualize the contents of images. These technologies, though not in active use in consumer Facebook products right now, are similar to the software the company uses to describe photos to blind users, a feature it calls “automatic alternative text” that launched back in April.
Through machine learning, a widely used AI training technique, Facebook is able to teach algorithms how to perform traditional human cognitive tasks by feeding what are called neural networks large sets of data.
The process by which a neural net identifies these objects is called segmentation, which asks the computer a series of yes / no questions about an image in an attempt to classify its contents. That’s DeepMask’s role, whereas SharpMask is used to refine the selection of objects for better accuracy.
Tomi Engdahl says:
Bloomberg:
Sources: Apple working on new pro software features for iPad expected 2017, a 5K monitor, and Mac updates, including a thinner MacBook Pro with flatter keyboard — Deeper stylus integration, faster displays planned for iPads — New MacBooks, iMac, and 5K monitor with LG in product pipeline
Apple Is Working on iPad Upgrades and Refreshed Mac Lineup
http://www.bloomberg.com/news/articles/2016-08-29/apple-said-to-prepare-ipad-upgrades-and-refreshed-mac-lineup
Apple Inc. is developing new features for the iPad to cater to professional users, along with new Mac laptops and desktops, according to people familiar with the matter.
Upcoming software upgrades for the iPad include wider operating-system support for Apple’s stylus accessory, while hardware performance improvements are also in development, according to the people. The refreshed Mac hardware line includes new versions of the iMac desktop, MacBook Air laptop, and a 5K standalone monitor in collaboration with LG Electronics Inc., in addition to a thinner MacBook Pro laptop.
Tomi Engdahl says:
Andrew Cunningham / Ars Technica:
Intel launches “Kaby Lake” 7th-generation CPUs with improved support for 4K video — New 7th-generation Core CPUs have a lot in common with the 6th generation. — Intel’s tick-tock model may be dead, but the PC industry still demands new hardware every year.
Intel unveils Kaby Lake, its first post-“tick-tock” CPU architecture
New 7th-generation Core CPUs have a lot in common with the 6th generation.
http://arstechnica.com/gadgets/2016/08/intel-unveils-kaby-lake-its-first-post-tick-tock-cpu-architecture/
Intel’s tick-tock model may be dead, but the PC industry still demands new hardware every year. Many PC models are refreshed once a year or so, and that means that the PC makers need new stuff to put into those computers whether or not the laws of physics want to comply.
First off, none of these mobile CPUs will include new chipsets, which means you’ll get the same connectivity options as before. The 100-series chipsets should still be more than adequate for most people, but they’re missing things like 10Gbps USB 3.1 gen 2, to say nothing of Thunderbolt. On the plus side, the lack of huge changes means that Kaby Lake chips can easily be dropped into existing Skylake designs, something that will help PC makers get Kaby Lake systems on the shelves in a hurry.
Kaby’s biggest advertised feature is improved support for 4K. All Kaby Lake integrated GPUs will support hardware-accelerated decoding and encoding of 10-bit HEVC/H.265 video streams and decoding of 8-bit VP9 streams. If you don’t already know, supporting hardware acceleration for certain codecs means that the GPU (usually via a dedicated media block) handles all the processing instead of the CPU.
HDMI 2.0 and HDCP 2.2 are also supported, which (respectively) enable 4K output at 60Hz over an HDMI cable and provide the DRM required to carry an encrypted 4K signal from the thing that’s playing it to the screen that will show it. The maximum supported DisplayPort version remains 1.2, however, dashing the hopes of anyone who wants to drive a 5K display at 60Hz over a single cable using DisplayPort 1.3.
The biggest improvement from an architectural/manufacturing standpoint is something Intel is calling “14nm+,” an optimized version of the same process node used for both Broadwell and Skylake. The company isn’t sharing many details—Intel says that 14nm+ has an “improved fin profile” and “improved transistor channel strain,” for what that’s worth.
CPU and GPU speed improvements come mostly from clock speed increases rather than architectural improvements, as they did in the Haswell refresh. Intel promises 12- to 19-percent better CPU performance
Tomi Engdahl says:
Darrell Etherington / TechCrunch:
PlayStation Now streaming service, which lets you play legacy titles on Windows PCs, launches today with a promo price of $100/year, less than half the normal price
PlayStation Now streaming service available today on Windows PCs
https://techcrunch.com/2016/08/30/playstation-now-streaming-service-available-today-on-windows-pcs/
You don’t need a PlayStation to play PlayStation games anymore: Sony’s PlayStation Now subscription-based game streaming service is now out for PC, and you can grab the app and start playing some of PlayStation’s best legacy titles immediately if you’ve got a Windows machine.
It’ll cost you, of course – but not as much as you would’ve paid for the games available individually. A 12-month subscription to PlayStation Now will run you $99.99 as part of a limited-time promotion to celebrate the PC launch. Normally, a PS Now subscription will run you more than double that.
What does PlayStation Now actually provide? Access to a library of over 50 ‘Greatest Hits’ games
Tomi Engdahl says:
Shell Game
http://hackaday.com/2016/08/30/shell-game/
A lot of us spend a lot of time switching between Windows and Linux.
What I hate most about Windows is how hard it is to see what’s going on under the hood.
War is Shell
One place where Linux always used to have an advantage over DOS and Windows was the shell. There are lots of variations available under Linux, but bash seems to be the current pick for most people.
In the old DOS days, some of us went to 4DOS which was nice, but no bash
Windows Power
Microsoft finally addressed the shortcomings of its default command interpreter, first introducing Windows Scripting Host to allow Javascript and VBScript batch files. Eventually, this was supplanted by Monad which later became known as the Windows PowerShell.
Shell Shock
Two things have recently happened that surprised me. First, Microsoft made bash available (and other Linux executables) for Windows 10 as a native application
I’ve used Cygwin and UWIN to have a very full-featured Linux environment under Windows for years
Sure, NT used to have a crippled POSIX subsystem, but it wasn’t practical.
The second piece of news that surprised me is that you can now get PowerShell for Linux or OS/X.
So now you have several options for using Linux and Windows without going crazy switching between the two:
Run Linux and put Windows in a virtual machine
Run Windows and put Linux in a virtual machine
Use bash everywhere (using Cygwin or the Microsoft product)
Use PowerShell everywhere
Tomi Engdahl says:
Bloomberg:
Hospitals experimenting with VR as a pain management tool as studies find sensory overload distracts the brain
Hospitals Try Giving Patients a Dose of VR
With the price of hardware falling, VR equipment has become a more affordable option for doctors
http://www.bloomberg.com/news/articles/2016-08-29/hospitals-try-giving-patients-a-dose-of-vr
The 13-year-old was set on fire when a bonfire exploded on her and her friend. To prevent infection, burn victims need their bandages changed and dead skin scraped away. Sometimes, even morphine isn’t enough to make that tolerable.
At the Shriners Hospital for Children in Galveston, Duke’s doctors gave her a virtual reality headset. Slipping it on, she was immersed in “SnowWorld,” an icy landscape where she got to lob snow at snowmen and igloos. The Texas hospital is one of the few trying out virtual reality to relieve pain.
“I’d never heard of it so I was a little surprised,” she said. “When I first tried it, it distracted me from what they were doing so it helped with the pain.”
It’s still a new and experimental approach, but proponents of virtual reality say that it can be an effective treatment for everything from intense pain to Alzheimer’s disease to arachnophobia to depression. And as Facebook Inc., Sony Corp., HTC Corp. and others race to build a dominant VR set, the price of hardware has fallen, making the equipment a more affordable option for hospitals looking for alternatives for pain relief.
Tomi Engdahl says:
Intel Debuts 14nm+ Processors
Process gives Kaby Lake 12% boost
http://www.eetimes.com/document.asp?doc_id=1330373&
Intel Corp. officially announced Kaby Lake, its seventh-generation Core PC processors made in a 14nm+ process and focused on delivering better 4K video. The family provides the first indication of what more modest product advances may look like as Intel stretches Moore’s law to cover with one process node multiple generations of chips.
Kaby Lake processors gain a 12% improvement from enhancements to Intel’s 14nm process. They also include a modestly updated media engine with hardware support for decoding VP9 video and encoding and decoding 4K 10-bit HEVC video. Otherwise, the chips use the same architecture as the previous Skylake generation, including the existing Skylake x86 pipeline.
Tomi Engdahl says:
Microsoft’s beta language service gets C# dev kit
Yet another chatbot to train
http://www.theregister.co.uk/2016/08/31/microsofts_beta_language_service_gets_c_dev_kit/
Microsoft has pushed out a C# software development kit (SDK) for its in-beta language parsing API, LUIS.
LUIS – the Language Understanding Intelligent Service – is another chunk of the chatbot capability Redmond is so keen on.
It’s a model-making environment which Microsoft reckons helps developers teach existing apps to understand “book tickets to Paris”, “turn on the lights” and so on.
The idea is to relieve developers of as much effort as possible, with models built for Cortana and Bing doing the hard work.
“Turn on the lights” – the “intent” – needs only a simple response (“ok”), but as Microsoft explains, “I’d like to buy a black dress” needs a more nuanced response (“what size?”, for example).
Luis includes pre-built entities; the developer then trains the model, and finally publishes it to an HTTP endpoint as JSON.
Windows (.NET) SDK for the Microsoft Language Understanding Intelligent Service API, part of Cognitive Services http://www.microsoft.com/cognitive-services/en-us/language-understanding-intelligent-service-luis
https://github.com/Microsoft/Cognitive-LUIS-Windows
Language Understanding Intelligent Service (LUIS)
LUIS lets your app understand language
https://www.luis.ai/
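Once a LUIS model is published, calling it is an ordinary HTTP GET that returns JSON. A rough Python sketch – the endpoint URL, app ID and key below are placeholders for illustration; the real values come from the LUIS portal when you publish a model:

```python
import requests

# Placeholder endpoint and credentials -- substitute the values shown in the
# LUIS portal after publishing your own model.
ENDPOINT = "https://api.example.com/luis/v2.0/apps/YOUR_APP_ID"
params = {"subscription-key": "YOUR_SUBSCRIPTION_KEY", "q": "turn on the lights"}

response = requests.get(ENDPOINT, params=params)
result = response.json()
print(result)  # expected to contain the top-scoring intent and any detected entities
```

The app then branches on the returned intent (“turn on the lights” vs. “buy a dress”) and uses the extracted entities to fill in the details.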
Tomi Engdahl says:
Not to be forgotten – Thunderbolt will break through next year
Intel announced this week its new seventh-generation Core processors. The Kaby Lake chips naturally bring PC buyers a bit more performance, but above all the new features point to a change coming next year: PC connections are moving into the Thunderbolt era.
Hybrid laptops based on the new Core processors will go on sale already during the rest of this year. In very many of them the interfaces have been replaced by a single, multipurpose USB Type-C port.
Intel estimates that there are currently about sixty devices on the market that support the Thunderbolt bus. This number will practically double over the next few months.
Externally, Thunderbolt and USB Type-C look identical, and Thunderbolt 3 is designed to work over the new USB connector. But they are not the same thing, because even though USB-C is a standard, Thunderbolt is Intel’s own technology.
The differences between the buses are not insignificant. Thunderbolt is four times faster than USB 3.1, moving data at 40 gigabits per second.
Source: http://etn.fi/index.php?option=com_content&view=article&id=4952:ettei-unohtuisi-thunderbolt-lyo-lapi-ensi-vuonna&catid=13&Itemid=101
Tomi Engdahl says:
Baidu Open-Sources Its Deep Learning Tools
https://news.slashdot.org/story/16/09/01/058247/baidu-open-sources-its-deep-learning-tools
Microsoft, Google, Facebook, and Amazon have all done it — and now Baidu’s doing it, too. The Chinese tech giant has open sourced one of its key machine learning tools, PaddlePaddle, offering the software up to the global community of AI researchers. Baidu’s big claim for PaddlePaddle is that it’s easier to use than rival programs. Like Amazon’s DSSTNE and Microsoft’s CNTK, PaddlePaddle offers a toolkit for deep learning, but Baidu says comparable software is designed to work in too many different situations, making it less approachable to newcomers.
Baidu follows US tech giants and open-sources its deep learning tools
http://www.theverge.com/2016/9/1/12725804/baidu-machine-learning-open-source-paddle
Tomi Engdahl says:
Why Facebook’s Oculus Team Had to Rebuild Some of Its Virtual Reality Software
http://fortune.com/2016/08/31/facebook-oculus-rebuild-virtual-reality-software/
The original Oculus Home screen was slow, buggy, and filled with sloppy code.
What’s the point in strapping a virtual reality headset to your face if you can’t figure out how to use it?
The original Oculus home screen, which he said was slow and cobbled together, was rebuilt from scratch to ensure everything would run as smoothly as possible. The interface’s design didn’t necessarily put off users, Nguyen explained, but rather it was the software plumbing that needed a serious overhaul.
The first version of the Oculus Home, Oculus’s interface, relied on several software services that his team of two other coders could not update because they were discontinued over time
Because his team built the interface in a disorganized way, they ended up with sloppy code, he said.
“Quickly the code became spaghetti,” Nguyen said. “It wasn’t delicious at the end of the day.”
Still, the interface worked
Tomi Engdahl says:
HP builds one desktop PC around a speaker, another in slices
HP is trying to make desktop computers as exciting as laptops and all-in-ones.
http://arstechnica.com/gadgets/2016/09/hp-builds-one-desktop-pc-around-a-speaker-another-in-slices/
HP has announced today two new desktop PCs, both with some unusual form factors, in what it calls the “desktop reinvention.” While laptops and all-in-ones have a long history of novel designs and advanced engineering, the traditional desktop has tended to be a rather less exciting category. Some systems have shrunk to take advantage of the increasing integration and decreasing power requirements that modern processors boast, but the plain-old mini-tower PC, still a corporate staple, has had little thought or attention given to its design over the years.
Tomi Engdahl says:
John Markoff / New York Times:
Sources: Google, Amazon, Facebook, Microsoft, and IBM form group to devise ethics for AI, and discuss its impact on jobs, transportation, warfare, and more
How Tech Giants Are Devising Real Ethics for Artificial Intelligence
http://www.nytimes.com/2016/09/02/technology/artificial-intelligence-ethics.html?_r=0
For years, science-fiction moviemakers have been making us fear the bad things that artificially intelligent machines might do to their human creators. But for the next decade or two, our biggest concern is more likely to be that robots will take away our jobs or bump into us on the highway.
Now five of the world’s largest tech companies are trying to create a standard of ethics around the creation of artificial intelligence. While science fiction has focused on the existential threat of A.I. to humans, researchers at Google’s parent company, Alphabet, and those from Amazon, Facebook, IBM and Microsoft have been meeting to discuss more tangible issues, such as the impact of A.I. on jobs, transportation and even warfare.
Tech companies have long overpromised what artificially intelligent machines can do. In recent years, however, the A.I. field has made rapid advances in a range of areas, from self-driving cars and machines that understand speech, like Amazon’s Echo device, to a new generation of weapons systems that threaten to automate combat.
Tomi Engdahl says:
Reuters:
Sources: Hewlett Packard Enterprise in talks to sell its software division to Thoma Bravo, hopes it can fetch $8B-10B
Exclusive: HP Enterprise in talks to sell software unit to Thoma Bravo – sources
http://www.reuters.com/article/us-hpe-software-thomabravo-idUSKCN1175RV
The negotiations come as HPE Chief Executive Meg Whitman seeks to focus the U.S. company’s strategy on networking, storage, data centers and related technology services, after its separation last year from computer and printer maker HP Inc (HPQ.N).
Tomi Engdahl says:
Lucas Matney / TechCrunch:
Qualcomm unveils Snapdragon VR820, a reference design for an all-in-one, eye-tracking VR headset
Qualcomm unveils a wireless eye-tracking VR headset
https://techcrunch.com/2016/09/01/qualcomm-unveils-standalone-eye-tracking-vr-headset-reference-design/
Another day, another VR headset that you will never be able to buy.
Today at the IFA conference in Berlin, Qualcomm unveiled a reference design for an all-in-one headset built on the company’s new Snapdragon VR820 architecture.
The company’s Snapdragon 820 is already one of the most popular smartphone SoC’s on the market, but Qualcomm believes that mobile VR’s full potential isn’t being reached on the 820 because the headsets aren’t single-minded enough.
This headset design, built in partnership with Shenzhen-based Goertek, isn’t something that consumers are going to be able to try out, Qualcomm unveiled this as a reference design to entice OEMs to build all-in-one HMDs on the new VR820.
Tomi Engdahl says:
The HDMI connector will disappear from laptops in the future
HDMI Licensing, the developer of the HDMI technology, says it has extended the connection with a new alternative mode (the so-called Alt Mode). It allows devices equipped with a USB Type-C connector to send video directly to HDMI displays without any adapters.
If a device supports the HDMI Alt Mode, a movie, for example, can be transferred over it directly to a big screen with a single physical cable. At the same time, small devices of the future can drop one more connector: a USB Type-C port is enough.
According to the association, nearly 290 million displays with an HDMI interface will be sold this year. The range covers projectors, monitors, and practically one hundred percent of flat-panel TVs.
There are some limitations to the new HDMI technology: Alt Mode supports the 1.4b specification but not the newer 2.0b standard. Over the USB Type-C interface it can carry 4K video, 3D video and HDMI Ethernet data, but not, for example, HDR, which is part of the 2.0 specification.
Source: http://etn.fi/index.php?option=com_content&view=article&id=4964&via=n&datum=2016-09-02_09:54:09&mottagare=30929
Tomi Engdahl says:
Facebook Feeds Open Software
VC rides open source startups
http://www.eetimes.com/document.asp?doc_id=1330391&
The annual geek fest aims to encourage software developers to contribute to and collaborate on open source code. Walking the talk, Facebook described code it plans to release for everything from stabilizing 360-degree videos to improving data compression and machine learning. However, the event also showed Facebook’s openness has its limits.
“There is so much happening in the open and we can solve problems faster working together,” said Jay Parikh, head of engineering and infrastructure at Facebook in a keynote talk, noting an estimated 1.5 million engineers follow open source projects.
Companies building products based on open code, many with “proprietary software wrapped around it to make enterprise ready” will make up some of the largest tech public offerings in the next few years, Li said, claiming more than 75% of business users are adopting open source code.
“Open source is a development and licensing model with many flavors, but open adoption software is a broader business model shift in how code is developed, used and monetized, like Salesforce pioneered a new way of delivering software — this is the same thing,” he said. “There’s no way a 12-person R&D shop is going to out innovate this room, so customers are turning toward the networking effect,” he added, pointing to the audience of several hundred developers.
Facebook open sources Zstandard compression algorithm and MyRocks storage engine
https://techcrunch.com/2016/08/31/facebook-open-sources-zstandard-compression-algorithm-and-myrocks-storage-engine/?ncid=rss&cps=gravity_1462_894864682673588316
Today, Facebook is releasing its Zstandard compression algorithm into the wild as open source. The lossless compression technology is aimed at replacing existing libraries like zlib that are powered by the outdated Deflate compression algorithm. In addition to Zstandard, Facebook is also dropping its MyRocks storage engine as open source. MyRocks is currently being used by Facebook to improve the storage efficiency of its MySQL databases.
Both releases occurred in coordination with Facebook’s @Scale conference in San Jose.
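To make the zlib-versus-Zstandard comparison concrete, here is a minimal Python sketch; it assumes the third-party zstandard bindings (pip install zstandard) are available, and the sample payload and compression levels are arbitrary illustration choices rather than Facebook’s own benchmark setup.

# Minimal sketch comparing Zstandard with zlib/Deflate on the same payload.
# Assumes the third-party "zstandard" bindings are installed; the sample
# text and compression levels are arbitrary choices for illustration.
import zlib
import zstandard as zstd

data = b"Log line: user=42 action=view page=/home status=200\n" * 10_000

# zlib uses the older Deflate algorithm that Zstandard is meant to replace.
deflated = zlib.compress(data, 6)

# Zstandard: create a compressor once, then reuse it for many payloads.
cctx = zstd.ZstdCompressor(level=3)
zstd_bytes = cctx.compress(data)

print(f"original : {len(data):>9} bytes")
print(f"zlib -6  : {len(deflated):>9} bytes")
print(f"zstd -3  : {len(zstd_bytes):>9} bytes")

# Decompression round-trip to confirm the data is intact.
dctx = zstd.ZstdDecompressor()
assert dctx.decompress(zstd_bytes) == data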
Tomi Engdahl says:
PC-BSD Operating System Gets Renamed to TrueOS, Follows a Rolling Release Model
TrueOS will track FreeBSD’s “Current” branch
Read more: http://linux.softpedia.com/blog/pc-bsd-operating-system-gets-renamed-to-trueos-follows-a-rolling-release-model-507866.shtml#ixzz4J63qtBe8
Tomi Engdahl says:
New Intel and AMD Chips Will Only Support Windows 10
https://hardware.slashdot.org/story/16/09/01/2031247/new-intel-and-amd-chips-will-only-support-windows-10
Buried in the announcement of the new Kaby Lake (seventh-generation) processors and a rash of incoming notebooks set to use them is the confirmation that they will have a Windows 10 future. Microsoft has been warning people for ages that Kaby Lake will not run on anything older than Windows 10, and it looks like AMD’s upcoming Zen chip will be going the same way. Microsoft said, “As new silicon generations are introduced, they will require the latest Windows platform at that time for support.”
Microsoft made ‘em do it: The latest Kaby Lake, Zen chips will support only Windows 10
No one knows what would happen if you tried to run an older OS on one of the new chips.
http://www.pcworld.com/article/3112663/software/microsoft-made-em-do-it-the-latest-kaby-lake-zen-chips-will-support-only-windows-10.html
Tomi Engdahl says:
Jon Brodkin / Ars Technica:
OpenOffice leaders discussing shutdown due to lack of developers; many switched to LibreOffice
OpenOffice, after years of neglect, could shut down
As LibreOffice soars, OpenOffice management considers retiring the project.
http://arstechnica.com/information-technology/2016/09/openoffice-after-years-of-neglect-could-shut-down/
OpenOffice, once the premier open source alternative to Microsoft Office, could be shut down because there aren’t enough developers to update the office suite. Project leaders are particularly worried about their ability to fix security problems.
“It is my considered opinion that there is no ready supply of developers who have the capacity, capability, and will to supplement the roughly half-dozen volunteers holding the project together,” Hamilton wrote.
No decisions have been made yet, but Hamilton noted that “retirement of the project is a serious possibility,” as the Apache board “wants to know what the project’s considerations are with respect to retirement.”
Many developers have abandoned OpenOffice to work on LibreOffice, a fork that got its first release in January 2011. While LibreOffice issues frequent updates, OpenOffice’s most recent version update was 4.1.2 in October 2015. That was the only OpenOffice release in 2015, and there were only two updates in all of 2014. LibreOffice got 14 version updates in 2015 alone.
“In the case of Apache OpenOffice, needing to disclose security vulnerabilities for which there is no mitigation in an update has become a serious issue,” Hamilton wrote. By the time a new version release incorporates the fix, it will likely be “a year since the release of Apache OpenOffice 4.1.2.”
Tomi Engdahl says:
Sysadmins: Poor capacity planning is not our fault
Let us explain why things are so crap
http://www.theregister.co.uk/2016/09/05/poor_capacity_planning_is_not_our_fault/
What we discovered, however, was that many don’t even have what we might arguably describe as ‘the basics’ properly covered. Indeed, the most common approaches to capacity planning remain overprovisioning and “winging it”, ie, relying on instinct/vigilance and the odd spreadsheet.
Of course some argue that the informal, ad hoc approach is perfectly adequate, and that getting too ‘procedural’ is more trouble than it’s worth. Fair enough if you have a modest and relatively slow-moving IT environment, and a small IT team in which everyone always knows what everyone else is up to. But with nearly 60 per cent reporting downtime and/or service degradation as a result of capacity-related issues, and around 50 per cent talking about costly and disruptive emergency procurements when resource limits are unexpectedly reached, the approach being taken by most clearly isn’t working.
As with all surveys, however, the numbers only tell part of the story.
“A decision to centralize most of our servers, without first looking at what network changes would need to happen to make it work.”
“Developers who don’t think they need to do performance testing, just throw the biggest VM at it and forget about it.”
“Lack of timely communication from application teams about upcoming projects.”
“Snapshots that consume the entire storage a VM is hosted on, and not just once.”
Some readers alluded to poor training within the business, and certainly raising awareness and trying to get people to help themselves will be useful. From experience in other areas such as security and data protection, however, you have to be realistic about the ability of the average user to appreciate the practicalities and become motivated to act appropriately.
“It’s all about cost – no capacity management or appreciation.”
“Repeatedly informing management of system running out of resources until it eventually has exhausted resources resulting in panic procurement.”
“Managers who think the cloud is a means to reduce cost over co-lo, without actually thinking it through.”
“Management, business and 3rd party provider taking 6 months to approve and implement additional storage so production POS went down.”
That last comment underlines the notion that it really is a mistake to regard capacity planning and management as purely an ‘IT thing’ or simply a chore for sysadmins to take care of. With IT systems being so fundamental to so many processes and functions across most organisations today, it really needs to be thought of as an aspect of business risk management.
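By way of contrast with overprovisioning and “winging it”, here is a minimal illustrative sketch (not from the survey) of a trend-based capacity forecast in Python: fit a line to recent disk-usage samples and estimate when the volume runs out. The usage figures and capacity are invented for the example.

# Illustrative sketch only: naive linear-trend capacity forecast.
# The usage numbers are made up; real capacity planning would use monitored
# time series plus business input, as the article argues.

# Daily used-GB samples for one volume over the last two weeks.
usage_gb = [510, 515, 522, 530, 534, 541, 549, 556, 560, 568, 575, 583, 590, 597]
capacity_gb = 800

n = len(usage_gb)
xs = range(n)
mean_x = sum(xs) / n
mean_y = sum(usage_gb) / n

# Least-squares slope = average daily growth in GB.
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, usage_gb)) / \
        sum((x - mean_x) ** 2 for x in xs)

headroom_gb = capacity_gb - usage_gb[-1]
days_left = headroom_gb / slope if slope > 0 else float("inf")

print(f"growth ~{slope:.1f} GB/day, ~{days_left:.0f} days until the volume is full")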
Tomi Engdahl says:
In August, the Linux operating system accounted for 2.11 per cent of desktops
Windows market share has again risen to over 90 per cent.
Mac OS’s market share is now about 7.4 per cent.
Source: http://etn.fi/index.php?option=com_content&view=article&id=4978:linux-edelleen-marginaalissa-tyopoydalla&catid=13&Itemid=101
Tomi Engdahl says:
Nicole Lee / Engadget:
Alcatel’s new standalone VR headset, the Vision, has 2 1080×1020 displays, 3 hours of battery, is likely to retail for $500-600, but won’t use Daydream platform
Alcatel’s standalone VR headset is a tough sell
The idea sounds neat, but Gear VR and Daydream means it has its work cut out for it
https://www.engadget.com/2016/09/05/alcatel-vision-hands-on/
Tomi Engdahl says:
Chrome For Windows To Get Battery Performance Boost
https://hardware.slashdot.org/story/16/09/05/1840228/chrome-for-windows-to-get-battery-performance-boost
Earlier this year, Microsoft claimed that its Edge browser was much lighter on battery than Chrome. Google is now attempting to address that. It has announced that Chrome 53 will contain numerous CPU and GPU power consumption enhancements for video playback, along with other big performance and power improvements.
Chrome for Windows to get battery performance boost
http://www.zdnet.com/article/chrome-for-windows-to-get-battery-performance-boost/
Google responds to Microsoft criticism that Chrome is a battery hog by introducing new power consumption enhancements.
Tomi Engdahl says:
Microsoft thought of the children and decided to ban some browsers
Redmond’s Family Settings now block browsers-without-filters by default, but which ones?
http://www.theregister.co.uk/2016/09/05/microsoft_thought_of_the_children_and_decided_they_must_only_use_edge/
Tomi Engdahl says:
Dell Launches as Private Giant
http://www.eetimes.com/author.asp?section_id=36&doc_id=1330414&
Dell completed the largest merger in the tech sector in a period of massive consolidation, but whether the move creates a breakout opportunity is unclear.
There’s no doubt the $67 billion merger of Dell, EMC and VMware is bold. The Dell Technologies it formally created today claims to be the world’s largest private tech company.
This new giant clearly has a strong position, but it will have to show me it has the strength to pull ahead of the pack in a maturing and changing industry.
Dell’s traditional competitors Hewlett-Packard and IBM have been shedding weight to compete as more nimble companies. Meanwhile, new rivals in cloud computing such as Amazon and Google are demonstrating that it’s not how much gear you sell but how much you can buy and effectively operate that is key to success.
For me, the key number in the deal is 6%. That’s the share of the new company’s $74 billion in revenues that goes into R&D, an estimated $4.5 billion. The figure is well above Dell’s traditional level, on par with IBM and ahead of HP and even Apple (where the wild success of the iPhone skews the ratio), but still below rivals such as Cisco and Google.
Tomi Engdahl says:
Flash-based SSDs are set to quickly replace old mechanical disks in laptops. According to research firm DRAMeXchange, the share of SSDs in new laptops will rise to 50 per cent already in 2018.
Source: http://etn.fi/index.php?option=com_content&view=article&id=4995:ssd-pian-joka-toisessa-lapparissa&catid=13&Itemid=101
Tomi Engdahl says:
Dell yesterday officially became the world’s largest privately held technology company. The $63 billion purchase of EMC may feel like a huge decision, but according to chief executive Michael Dell there were no alternatives. “In this business you either transform or die,” Dell put it dramatically.
With the deal, Dell grows into a company of 130,000 employees. Its name changes to Dell Technologies. The PC side keeps the Dell name, while the storage and networking side was named Dell EMC and will be headed by EMC’s David Goulden.
The new Dell has revenues of $74 billion, clearly more than, for example, the Finnish state budget. Michael Dell clearly sees the company’s future in networks and services, with end-user devices playing a smaller role. PCs will surely not disappear from Dell’s product range, but smartphones, for example, could be dropped.
Source: http://etn.fi/index.php?option=com_content&view=article&id=4998:dellilla-ei-ollut-vaihtoehtoja&catid=13&Itemid=101
Tomi Engdahl says:
IBM lifts lid, unleashes Linux-based x86 killer on unsuspecting world
NVLink, big bandwidth, but is it enough?
http://www.theregister.co.uk/2016/09/08/ibm_releases_x86_killer_s822lc_linux/
IBM is mounting its strongest challenge yet to x86 hegemony with the unveiling of its spanking new Linux based S822LC system. What makes this system different is that it’s based on a new processor/motherboard configuration, complete with a sporty NVIDIA NVLink connector.
According to my research, close to half of you HPC users have a significant number of applications that are starved for more memory bandwidth.
If this describes you, then you might want to take a look at this new IBM box – it has more memory bandwidth than anything else out there.
How much more? Well, the basic Power 8 system has up to 230GB/s of memory bandwidth vs. 102GB/s for Intel’s latest and greatest processors.
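If you want a rough feel for whether your own applications are starved for memory bandwidth, a STREAM-style copy measurement is a common starting point. The following Python/NumPy sketch is only a crude approximation of what dedicated benchmarks measure, and its array size and repeat count are arbitrary assumptions, not anything behind IBM’s figures.

# Rough STREAM-copy-style estimate of sustained memory bandwidth.
# Illustrative sketch only; array size and repeat count are arbitrary.
import time
import numpy as np

n = 20_000_000                      # 20M float64 elements (~160 MB per array)
src = np.random.rand(n)
dst = np.zeros(n)

best = float("inf")
for _ in range(5):                  # keep the best of a few runs
    t0 = time.perf_counter()
    np.copyto(dst, src)             # read src, write dst
    best = min(best, time.perf_counter() - t0)

moved_bytes = 2 * n * 8             # one array read + one array written
print(f"~{moved_bytes / best / 1e9:.1f} GB/s sustained (copy estimate)")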
Tomi Engdahl says:
Bloomberg:
Sources: Dell Technologies will cut 2K-3K jobs after acquiring EMC, mostly in the US; post-acquisition company has 140K employees
Dell Technologies to Cut at Least 2,000 Jobs After EMC Deal
http://www.bloomberg.com/news/articles/2016-09-08/dell-technologies-said-to-cut-at-least-2-000-jobs-after-emc-deal
Dell is looking for cost savings of about $1.7 billion in the first 18 months after the transaction but is largely focused on using the deal to boost sales by several times that amount, the people added. The new company has 140,000 employees.
“As is common with deals of this size, there will be some overlaps we will need to manage and where some employee reduction will occur. We will do everything possible to minimize the impact on jobs,”
Tomi Engdahl says:
Charlie Warzel / BuzzFeed:
How Apple slowly evolves its products, like the iPhone, Watch, and its line of headphones, to draw customers deeper into the Apple ecosystem
Apple’s Strategy Is Innovation By A Thousand Tweaks
https://www.buzzfeed.com/charliewarzel/apples-strategy-is-innovation-by-a-thousand-tweaks?utm_term=.omAv52j8JM#.mfkOmMq6zX
The iPhone is to make you buy the AirPods which are to make you use Siri to make you keep your MacBook to get you to upgrade your iPhone.
Tomi Engdahl says:
Andrew Webster / The Verge:
Sony debuts PlayStation 4 Pro, which supports 4K, launches Nov. 10 for $399; new games will be playable on old PS4 hardware; older games look better on PS4 Pro
Sony announces PlayStation 4 Pro with 4K HDR gaming for $399
Out on November 10th
http://www.theverge.com/2016/9/7/12748316/sony-ps4-pro-announced-price-specs-release-date
Sony has officially unveiled the next big iteration of the PlayStation 4. The console — codenamed Neo, which the company discussed in brief just ahead of E3 in June — upgrades the three-year-old PS4 hardware with a faster processor, better graphics, and support for 4K resolution. It launches November 10th, and it will cost $399. It’s intended to be sold alongside the base PS4 instead of replacing it, and new games will still be playable on the older hardware.
The PS4 Pro can output 4K and HDR video, which is powered by an upgraded GPU. Sony also boosted the clock rate for the new PS4 Pro. It will also come with a 1TB hard drive
Tomi Engdahl says:
Sony wins case over pre-installed Windows software
User refused to sign ‘end-user licence agreement’ at startup
http://www.theregister.co.uk/2016/09/09/sony_wins_case_over_preinstalled_windows_software/
Selling a computer with pre-installed software is not an unfair commercial practice and there is no requirement to list the costs of the software, the Court of Justice of the European Union (CJEU) has ruled.
The CJEU, Europe’s highest court, was ruling on a case between Vincent Deroo-Blanquart and Sony. Deroo-Blanquart bought a Sony laptop in 2008, but refused to subscribe to the operating system’s end-user licence agreement, arguing that he wanted to be reimbursed for the cost of the pre-installed software.
Sony refused to do so, but offered a refund if Deroo-Blanquart returned the computer.
Deroo-Blanquart brought legal proceedings against Sony, looking for €450 for the pre-installed software, and €2,500 in damages.
The French Cour de cassation, which is hearing the case on appeal, had asked the CJEU whether supplying a computer with pre-installed software, with no option to choose a version without it, constitutes an unfair commercial practice. It also asked whether it is a misleading commercial practice to fail to indicate the price of each item of software.
Tomi Engdahl says:
Excel abuse hits new heights as dev uses VBA to code spreadsheet messaging app
At least he didn’t use Excel as a database for credit cards
http://www.theregister.co.uk/2016/09/08/slow_day_inspires_excelvba_instant_messaging_app/
We shouldn’t encourage readers to waste their time like this, but it’s the kind of blend of wonderful insanity that springs from a sysadmin with time on his hands: an enterprise instant messaging platform that runs in Excel.
Of course, nobody would want to deny Microsoft the chance to stub its toe properly by writing its own competitor to Slack, Skype Teams, would they?
This project, by one Tristan Calderbank, is pretty special.
We can almost feel a series coming on: what pointless tasks have sysadmins or devs tortured Excel into performing for fun?
tristancalderbank/excel-messenger
A silly local group messaging app where the server and all clients are excel sheets.
https://github.com/tristancalderbank/excel-messenger/tree/master
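For anyone curious how little is needed for this kind of stunt, here is a hypothetical sketch of the same idea in Python rather than VBA: every client appends to and polls one shared file, which plays the role of the server. The file name and polling interval are invented for illustration; the real project does all of this inside Excel sheets.

# Toy "chat" in the spirit of the Excel messenger: clients append to and poll
# one shared file, which acts as the server. Hypothetical sketch only; the
# real project uses Excel sheets and VBA instead.
import os
import time

SHARED_LOG = "chat.log"   # assumed to sit on a share every client can reach

def send(user, text):
    # Append one line; the shared file is the whole "server".
    with open(SHARED_LOG, "a", encoding="utf-8") as f:
        f.write(f"{user}: {text}\n")

def follow(poll_seconds=1.0):
    # Yield messages from the top of the file, then keep polling for new ones.
    pos = 0
    while True:
        if os.path.exists(SHARED_LOG) and os.path.getsize(SHARED_LOG) > pos:
            with open(SHARED_LOG, "r", encoding="utf-8") as f:
                f.seek(pos)
                chunk = f.read()
                pos = f.tell()
            for line in chunk.splitlines():
                yield line
        time.sleep(poll_seconds)

if __name__ == "__main__":
    send("demo-user", "hello from a spreadsheet-free client")
    for message in follow():
        print(message)
        break   # stop after the first message so the demo terminates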
Tomi Engdahl says:
Next PCIe is the last with copper
PCI-SIG, the organization that manages PCI bus technology, promises that version 4.0 of the PCI Express bus will be ready next year. At the same time the organization says that version 4.0 will be the last PCI bus generation to use copper cabling; after that, the connection is bound to become optical.
At the moment it appears that the physical connector will remain the same, and the 4.0 interface will be backward compatible with earlier PCIe versions.
A single lane of the fourth-generation PCIe bus transfers two gigabytes of data per second, so an x16 connector will pass 32 gigabytes per second. The rate is thus double that of PCIe 3.0.
Source: http://etn.fi/index.php?option=com_content&view=article&id=5015:seuraava-pcie-on-viimeinen-kuparilla&catid=13&Itemid=101
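To make the arithmetic explicit, here is a small worked sketch of per-lane and x16 throughput for PCIe 3.0 and 4.0, using the published transfer rates and the 128b/130b line coding both generations share.

# PCIe throughput arithmetic: per-lane transfer rate times lane count,
# corrected for the 128b/130b encoding used by PCIe 3.0 and 4.0.
GENERATIONS = {"PCIe 3.0": 8.0, "PCIe 4.0": 16.0}   # transfer rate in GT/s per lane

for name, gt_per_s in GENERATIONS.items():
    lane_gb_s = gt_per_s * (128 / 130) / 8          # GB/s per lane, per direction
    print(f"{name}: ~{lane_gb_s:.2f} GB/s per lane, ~{lane_gb_s * 16:.0f} GB/s for x16")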
Tomi Engdahl says:
Microsoft Wants Autistic Coders. Can It Find Them And Keep Them?
http://www.fastcompany.com/3062835/hr/microsoft-autism-hiring
Job interviews can be especially hard if you’re autistic. A Microsoft effort aimed at a wider spectrum of the workforce wants to solve that.
In the past, Adickman had never disclosed his autism when he applied for jobs. Once, a manager had berated him for making a list of tasks on his phone instead of in handwriting, and he’d wanted to explain why he preferred typing to writing: a quirk in fine motor skills, associated with autism, that made for messy penmanship. “I have hypermobility,” he’d blurted. “I don’t care what you have,” his manager had replied. He would soon quit.
Adickman and millions of adults with autism often find themselves in a difficult bind.
The program, which began in May 2015, does away with the typical interview approach, instead inviting candidates to hang out on campus for two weeks and work on projects while being observed and casually meeting managers who might be interested in hiring them. Only at the end of this stage do more formal interviews take place.
The goal is to create a situation that is better suited to autistic people’s styles of communicating and thinking. Microsoft isn’t the first to attempt something like this: The German software firm SAP, among a handful of others, have similar programs—but Microsoft is the highest-profile company to have gone public with its efforts, and autistic adults are hoping it will spark a broader movement.
And yet, being autistic is considered a brain disorder, and it affects the way people process and communicate information—skills that are at the core of many white-collar professions.
Diagnoses of autism spectrum disorder, a catch-all name that includes a range of symptoms
“As a whole, people with autism—even those who are quite bright, and intellectually quite capable—are facing worse job prospects because of their social challenges,”
“The unemployment rate is chronic, which is not a reflection of the talent pool, it’s just a reflection of these people not getting through the door.”
Tomi Engdahl says:
Face of a Robot, Voice of an Angel?
DeepMind’s use of neural networks to synthesize speech could finally make computers sound more human.
https://www.technologyreview.com/s/602343/face-of-a-robot-voice-of-an-angel/
The last time you heard a computer convert a line of text to speech, it probably jarred. Google’s machine-learning division, DeepMind, has developed a new voice synthesis system using artificial intelligence that it thinks will improve the situation.
Having a computer generate the sound of a voice isn’t a new idea. Perhaps the most common approach is simply to use an incredibly large selection of pre-recorded speech fragments from a single person. In a technique called concatenative synthesis, these are pieced together to create larger sounds, words, and sentences. That’s why a lot of computer-generated speech often suffers from glitches, quirky changes in intonation, and pronunciation stumbles.
The other competing approach uses mathematical models to re-create known sounds that are then assembled into words and sentences. While less prone to glitches, this so-called parametric approach does end up sounding robotic. What unites the two approaches, though, is that they both stitch together chunks of sound, rather than creating the whole audio waveform from scratch.
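To illustrate what stitching chunks of sound together means, here is a deliberately toy Python/NumPy sketch of concatenative joining with a short crossfade; the “units” are plain sine tones standing in for recorded speech fragments, and none of this resembles WaveNet, which instead generates the raw waveform sample by sample.

# Toy illustration of concatenative synthesis: pre-recorded sound units are
# glued together with a short crossfade. Real systems pick units from huge
# databases; the audible seams at the joins are the glitches described above.
import numpy as np

RATE = 16_000  # samples per second

def fake_unit(freq_hz, seconds):
    # Stand-in for a pre-recorded speech fragment (here just a sine tone).
    t = np.arange(int(RATE * seconds)) / RATE
    return np.sin(2 * np.pi * freq_hz * t)

def concatenate(units, fade_ms=10.0):
    fade = int(RATE * fade_ms / 1000)
    out = units[0]
    for unit in units[1:]:
        ramp = np.linspace(0.0, 1.0, fade)
        overlap = out[-fade:] * (1 - ramp) + unit[:fade] * ramp   # crossfade join
        out = np.concatenate([out[:-fade], overlap, unit[fade:]])
    return out

# "Say" a three-unit utterance by joining the fragments end to end.
speech = concatenate([fake_unit(220, 0.3), fake_unit(330, 0.3), fake_unit(180, 0.3)])
print(speech.shape)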
Tomi Engdahl says:
A Loud Sound Just Shut Down a Bank’s Data Center for 10 Hours
https://motherboard.vice.com/read/a-loud-sound-just-shut-down-a-banks-data-center-for-10-hours
ING Bank’s main data center in Bucharest, Romania, was severely damaged over the weekend during a fire extinguishing test. In what is a very rare but known phenomenon, it was the loud sound of inert gas being released that destroyed dozens of hard drives. The site is currently offline and the bank relies solely on its backup data center, located within a couple of miles’ proximity.
“The drill went as designed, but we had collateral damage”, ING’s spokeswoman in Romania told
Data centers typically rely on inert gas to protect the equipment in the event of a fire, as the substance does not chemically damage electronics, and the gas only slightly decreases the temperature within the data center.
According to people familiar with the system, the pressure at ING Bank’s data center was higher than expected, and produced a loud sound when rapidly expelled through tiny holes
“It was as high as their equipment could monitor, over 130dB”.
Sound means vibration, and this is what damaged the hard drives. The HDD cases started to vibrate, and the vibration was transmitted to the read/write heads, causing them to go off the data tracks.
“[T]he HDD can tolerate less than 1/1,000,000 of an inch offset from the center of the data track—any more than that will halt reads and writes”,
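For a sense of scale, here is a short back-of-the-envelope calculation: converting 130 dB SPL to sound pressure using the standard 20 µPa reference, and converting the quoted off-track tolerance to nanometres. These are textbook conversions, not measurements from the ING incident itself.

# Back-of-the-envelope numbers: the pressure that 130 dB SPL represents, and
# the quoted off-track tolerance in metric units. Standard conversions only;
# no claim about the actual drives involved.
REF_PRESSURE_PA = 20e-6                                # 0 dB SPL reference, 20 micropascals
spl_db = 130.0
pressure_pa = REF_PRESSURE_PA * 10 ** (spl_db / 20)    # ~63 Pa of sound pressure

offset_limit_inch = 1 / 1_000_000                      # quoted tolerance: a millionth of an inch
offset_limit_nm = offset_limit_inch * 25.4e6           # ~25 nm

print(f"130 dB SPL ~= {pressure_pa:.0f} Pa of sound pressure")
print(f"off-track tolerance ~= {offset_limit_nm:.0f} nanometres")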