The PC market seems to be stabilizing in 2016; I expect it to shrink only slightly. While mobile devices have been named as culprits for the fall in PC shipments, IDC said that other factors may be in play. It is still pretty hard to make decent profits building PC hardware unless you are one of the biggest players – so Lenovo, HP, and Dell will again increase their collective dominance of the PC market like they did in 2015. I expect changes like spin-offs and maybe some mergers involving smaller players like Fujitsu, Toshiba and Sony. The EMEA server market looks to be a two-horse race between Hewlett Packard Enterprise and Dell, according to Gartner. HPE, Dell and Cisco “all benefited” from Lenovo’s acquisition of IBM’s EMEA x86 server organisation.
The tablet market is no longer a high-growth market – tablet sales have started to decline, and the decline will continue in 2016 as owners hold onto their existing devices for more than three years. iPad sales are set to continue declining, and the iPad Air 3, to be released in the first half of 2016, will not change that. IDC predicts that the detachable tablet market is set for growth in 2016 as more people turn to hybrid devices. Two-in-one tablets have been popularized by offerings like the Microsoft Surface, with options ranging dramatically in price and specs. I am not myself convinced that the growth will be as strong as IDC forecasts, even though companies have started to purchase tablets for workers in jobs such as retail sales or field work (Apple iPads and Windows and Android tablets managed by the company). Combined shipments of PCs, tablets and smartphones are expected to increase only in the single digits.
All your consumer tech gear should be cheaper come July, as there will be fewer import tariffs on IT products: a World Trade Organization (WTO) deal agrees that tariffs on imports of consumer electronics will be phased out over seven years starting in July 2016. The agreement affects around 10 percent of world trade in information and communications technology products and will eliminate around $50 billion in tariffs annually.
In 2015 storage was rocked to its foundations, and those new innovations will be taken into wider use in 2016. The storage market in 2015 went through strategic, foundation-shaking turmoil as the external shared disk array storage playbook was torn to shreds: The all-flash data centre idea has definitely taken off as an achievable vision, with primary data stored in flash and the rest held in cheap and deep storage. Flash drives largely solve the disk drive latency access problem, so there is much less need for hybrid drives. There is conviction that storage should be located as close to servers as possible (virtual SANs, hyper-converged industry appliances and NVMe fabrics). The hybrid cloud concept was adopted and supported by everybody. Flash started out in 2-bits/cell MLC form, which rapidly became standard, and TLC (3-bits/cell, or triple-level cell) has started appearing. Industry-standard NVMe drivers for PCIe flash cards appeared. Intel and Micron blew non-volatile memory preconceptions out of the water in the second half of the year with their joint 3D XPoint memory announcement. Boring old disk tech got shingled magnetic recording (SMR) and helium-filled drive technology; the drive industry is focused on capacity-optimizing its drives. We got key:value store disk drives with an Ethernet NIC on board, and basic GET and PUT object storage facilities came into being. The tape industry developed the 15TB LTO-7 format.
The use of SSDs will increase and their price will drop. SSDs were in more than 25% of new laptops sold in 2015, are expected to be in 31% of new consumer laptops in 2016, and in more than 40% by 2017. The prices of mainstream consumer SSDs have fallen dramatically every year over the past three years while HDD prices have not changed much. SSD prices will decline to 24 cents per gigabyte in 2016. In 2017 they’re expected to drop to 11-17 cents per gigabyte (meaning a 1TB SSD on average would retail for $170 or less).
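The price arithmetic above is easy to check with a few lines of Python (the per-gigabyte figures are the forecasts quoted, not authoritative data):

```python
# Forecast consumer SSD prices quoted above, in cents per gigabyte.
price_cents_per_gb = {2016: 24, 2017: 17}  # 2017: upper end of the 11-17 cent range

def ssd_price_usd(capacity_gb, year):
    """Estimated retail price of an SSD at the forecast per-GB rate."""
    return capacity_gb * price_cents_per_gb[year] / 100

# A 1TB (1000 GB) drive at 17 cents/GB comes out at $170, matching the text.
print(ssd_price_usd(1000, 2017))  # → 170.0
```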
Hard disk sales will decrease, but the technology is not dead. Sales of hard disk drives have been decreasing for several years now (118 million units in the third quarter of 2015), but according to Seagate, hard disk drives (HDDs) are set to stay relevant for at least 15 to 20 years. HDDs remain the most popular data storage technology as they are the cheapest in terms of per-gigabyte costs. While SSDs are generally getting more affordable, high-capacity solid-state drives are not going to become as inexpensive as hard drives any time soon.
Because all-flash storage systems with homogeneous flash media are still too expensive to serve as a solution for every enterprise application workload, enterprises will increasingly turn to performance-optimized storage solutions that use a combination of multiple media types to deliver cost-effective performance. The speed advantage of Fibre Channel over Ethernet has evaporated. Enterprises will also start to seek alternatives to snapshots that are simpler and easier to manage, and that allow data and application recovery to the second before a data error or logical corruption occurred.
Local storage and the cloud finally make peace in 2016 as the decision-makers across the industry have now acknowledged the potential for enterprise storage and the cloud to work in tandem. Over 40 percent of data worldwide is expected to live on or move through the cloud by 2020 according to IDC.
Open standards for data center development are now a reality thanks to advances in cloud technology. Facebook’s Open Compute Project has served as the industry’s leader in this regard. This allows more consolidation for those that want it. Consolidation used to refer to companies moving all of their infrastructure to the same facility. However, some experts have begun to question this strategy, as the rapid increase in data quantities and apps in the data center has made centralized facilities more difficult to operate than ever before. Server virtualization, more powerful servers and an increasing number of enterprise applications will continue to drive higher IO requirements in the datacenter.
Cloud consolidation starts in earnest in 2016: the number of options for general infrastructure-as-a-service (IaaS) cloud services and cloud management software will be much smaller at the end of 2016 than at the beginning. The major public cloud providers will gain strength, with Amazon, IBM SoftLayer, and Microsoft capturing a greater share of the business cloud services market. Lock-in is a real concern for cloud users: PaaS players have the ancient imperative to find ways to tie customers to their platforms, and they aren’t afraid to use them, so advanced users want to establish reliable portability across PaaS products in a multi-vendor, multi-cloud environment.
The year 2016 will be harder for legacy IT providers than 2015 was. In its report, IDC states that “By 2020, More than 30 percent of the IT Vendors Will Not Exist as We Know Them Today.” Many enterprises are turning away from traditional vendors and toward cloud providers. They’re increasingly leveraging open source. In short, they’re becoming software companies. The best companies will build cultures of performance and doing the right thing – and will make data and the processes around it self-service for all their employees. Design Thinking will guide companies that want to change the lives of their customers and employees. 2016 will see a lot more work in trying to manage services that simply aren’t designed to work together or even be managed – for example, getting Whatever-as-a-Service cloud systems to play nicely with existing legacy systems. So competent developers are a scarce commodity. Some companies are starting to see the cloud as a form of outsourcing that is fast burning up in-house IT ops jobs, with varying success.
There are still too many old-fashioned companies that just can’t understand what digitalization will mean to their business. In 2016, some companies’ boards still think the web is just for brochures and porn and don’t believe their business models can be disrupted. It gets worse for many traditional companies. For example, Amazon is a retailer both on the web and, increasingly, for things like food deliveries. Amazon and others are playing to win. Digital disruption has happened and will continue.
More Windows 10 is coming in 2016. If 2015 was a year of revolution, 2016 promises to be a year of consolidation for Microsoft’s operating system. I expect Windows 10 adoption in companies to start in 2016. Windows 10 is likely to be a success for the enterprise, but I expect that word from heavyweights like Gartner, Forrester and Spiceworks, suggesting that half of enterprise users plan to switch to Windows 10 in 2016, is more than a bit optimistic. Windows 10 will also be used in China, as Microsoft played the game better with it than with Windows 8, which was banned in China.
Windows is now delivered “as a service”, meaning incremental updates with new features as well as security patches, but Microsoft still seems to work internally to a schedule of milestone releases. Next up is Redstone, rumoured to arrive around the anniversary of Windows 10, midway through 2016. Windows servers will get an update too: 2016 should also see the release of Windows Server 2016. Server 2016 includes updates to the Hyper-V virtualisation platform, support for Docker-style containers, and a new cut-down edition called Nano Server.
Windows 10 will get some of the features promised but not delivered in 2015. Windows 10 was promised for PCs and mobile devices in 2015 to deliver a unified user experience. Continuum is a new, adaptive user experience in Windows 10 that optimizes the look and behavior of apps and the Windows shell for the physical form factor and the customer’s usage preferences. The promise was the same unified interface for PCs, tablets and smartphones – but in 2015 it was delivered only for PCs and some tablets. Windows 10 Mobile for smartphones is expected to finally ship in 2016 – the release of Microsoft’s new operating system may be the last roll of the dice for its struggling mobile platform. Microsoft’s Plan A is to get as many apps and as much activity as it can on Windows on all form factors with the Universal Windows Platform (UWP), which enables the same Windows 10 code to run on phone and desktop. Despite a steady inflow of new well-known apps, it remains unclear whether the Universal Windows Platform can maintain momentum with developers. Can Microsoft keep the developer momentum going? I am not sure. In addition there are plans for tools for porting iOS apps and an Android runtime, so expect delivery of some or all of the Windows Bridges (iOS, web app, desktop app, Android) announced at the April 2015 Build conference, in the hope of getting more apps into the unified Windows 10 app store. Windows 10 does hold out some promise for Windows Phone, but it’s not going to make an enormous difference. Losing the battle for the web and mobile computing is a brutal loss for Microsoft. When you consider the size of those two markets combined, the desktop market seems like a stagnant backwater.
Older Windows versions will not die in 2016 as fast as Microsoft and security people would like. Expect Windows 7 diehards to continue holding out in 2016 and beyond. And there are still many companies that run their critical systems on Windows XP, as “there are some people who don’t have an option to change.” Often the OS is running in automation and process control systems that run business- and mission-critical operations, both in private-sector and government enterprises. For example, the US Navy is using the obsolete Microsoft Windows XP to run critical tasks. It all comes down to money and resources, but if someone is obliged to keep something running on an obsolete system, it is completely the wrong approach to information security.
Virtual reality has grown immensely over the past few years, but 2016 looks like the most important year yet: it will be the first time that consumers can get their hands on a number of powerful headsets for viewing alternate realities in immersive 3-D. Virtual reality will move toward the mainstream when Sony, Samsung and Oculus bring consumer products to market in 2016. The whole virtual reality hype could be rebooted as early builds of the final Oculus Rift hardware start shipping to devs. HTC’s and Valve’s Vive VR headset may slip by a few months. Expect a banner year for virtual reality.
GPU and FPGA acceleration will be used widely in high-performance computing. Both Intel and AMD have products with CPU and GPU on the same chip, and there is software support for using the GPU (learn CUDA and/or OpenCL). Many mobile processors also have CPU and GPU on the same chip. FPGAs are circuits that can be baked into a specific application but can also be reprogrammed later. There was lots of interest in 2015 in using FPGAs to accelerate computations as the next step after GPUs, and I expect that interest to grow even more in 2016. FPGAs are not quite as efficient as a dedicated ASIC, but they are about as close as you can get without translating the actual source code directly into a circuit. Intel bought Altera (a big FPGA company) in 2015 and plans to begin selling products with a Xeon chip and an Altera FPGA in a single package – possibly available in early 2016.
Artificial intelligence, machine learning and deep learning will be talked about a lot in 2016. Neural networks, which were academic exercises (but little more) for decades, are increasingly becoming mainstream success stories: heavy (and growing) investment in the technology – which enables the identification of objects in still and video images, words in audio streams, and the like after an initial training phase – comes from the formidable likes of Amazon, Baidu, Facebook, Google, Microsoft, and others. So-called “deep learning” has been enabled by the combination of the evolution of traditional neural network techniques, the steadily increasing processing “muscle” of CPUs (aided by algorithm acceleration via FPGAs, GPUs, and, more recently, dedicated co-processors), and the steadily decreasing cost of system memory and storage. There were many interesting releases at the end of 2015: Facebook released portions of its Torch software, while Alphabet’s Google division open-sourced parts of its TensorFlow system. IBM also turned up the heat under the competition in artificial intelligence by making SystemML freely available to share and modify through the Apache Software Foundation. So I expect 2016 to be the year these are tried in practice, and deep learning to be hot at CES 2016. Several respected scientists issued a letter warning about the dangers of artificial intelligence (AI) in 2015, but I don’t worry about a rogue AI exterminating mankind. I worry about an inadequate AI being given control over things that it’s not ready for. How will machine learning affect your business? MIT has a good free intro to AI and ML.
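The “initial training phase” mentioned above can be illustrated at toy scale: one artificial neuron learning the AND function by gradient descent. Frameworks like Torch and TensorFlow stack many thousands of such units, but the core training loop is the same; this pure-Python sketch is illustrative only.

```python
import math
import random

random.seed(0)
# Training data for logical AND: inputs and target outputs.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = [random.uniform(-1, 1) for _ in range(2)]  # random initial weights
b = 0.0

def predict(x):
    """Single neuron: weighted sum followed by a sigmoid activation."""
    z = w[0] * x[0] + w[1] * x[1] + b
    return 1.0 / (1.0 + math.exp(-z))

# Training phase: repeatedly nudge weights against the prediction error.
for _ in range(5000):
    for x, target in data:
        err = predict(x) - target  # cross-entropy gradient w.r.t. z
        w[0] -= 0.5 * err * x[0]
        w[1] -= 0.5 * err * x[1]
        b -= 0.5 * err

print([round(predict(x)) for x, _ in data])  # → [0, 0, 0, 1]
```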
Computers, which excel at big data analysis, can help doctors deliver more personalized care. Can machines outperform doctors? Not yet. But in some areas of medicine, they can make the care doctors deliver better. Humans repeatedly fail where computers — or humans behaving a little bit more like computers — can help. Computers excel at searching and combining vastly more data than a human so algorithms can be put to good use in certain areas of medicine. There are also things that can slow down development in 2016: To many patients, the very idea of receiving a medical diagnosis or treatment from a machine is probably off-putting.
The Internet of Things (IoT) was talked about a lot in 2015, and it will be a hot topic for IT departments in 2016 as well. Many companies will notice that security issues are important in it. The newest wearable technology – smart watches and other smart devices – responds to voice commands and interprets the data we produce: it learns from its users and generates appropriate responses in real time. Interest in the Internet of Things will also bring interest in real-time business systems: not only real-time analytics, but real-time everything. This will start in earnest in 2016, but the trend will take years to play out.
Connectivity and networking will be hot, and it is not just about IoT. CES will focus on how connectivity is proliferating in everything from cars to homes, realigning diverse markets. The interest will affect job markets: network jobs are hot, and salaries are expected to rise in 2016, as wireless network engineers, network admins, and network security pros can expect above-average pay gains.
Linux will stay big in the network server market in 2016. The web server marketplace is one arena where Linux has had the greatest impact: today, the majority of web servers are Linux boxes, including most of the world’s busiest sites. Linux also runs many parts of our Internet infrastructure that moves the bits from server to user. Linux will continue to rule the smartphone market as the core of Android. New IoT solutions will most likely be built using Linux in many parts of the system.
Microsoft and Linux are not such enemies as they were a few years ago. Common sense says that Microsoft and the FOSS movement should be perpetual enemies, but it looks like Microsoft is waking up to the fact that Linux is here to stay. Microsoft cannot feasibly wipe it out, so it has to embrace it. Microsoft is already partnering with Linux companies to bring popular distros to its Azure platform. In fact, Microsoft has even gone so far as to create its own Linux distro for its Azure data center.
Web browsers are going more and more 64-bit: Firefox started the 64-bit era on Windows, and Google is killing Chrome for 32-bit Linux. At the same time web browsers are losing old legacy features like NPAPI and Silverlight. Who will miss them? The venerable NPAPI plugin standard, which dates back to the days of Netscape, is now showing its age and causing more problems than it solves, and native support will be removed from Firefox by the end of 2016. It was already removed from Google Chrome with very little impact. The biggest issue was the lack of support for Microsoft’s Silverlight, which brought down several top streaming media sites – but they are actively switching to HTML5 in 2016. I don’t miss Silverlight. Flash will continue to be available owing to its popularity for web video.
SHA-1 will be at least partially retired in 2016. Due to recent research showing that SHA-1 is weaker than previously believed, Mozilla, Microsoft and now Google are all considering bringing the deadline forward by six months to July 1, 2016.
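Retiring SHA-1 in practice means moving certificate signatures to the SHA-2 family; Python’s standard hashlib shows the difference in digest size:

```python
import hashlib

msg = b"hello world"
# SHA-1 produces a 160-bit digest (40 hex chars); research has shown it is
# weaker than its output length suggests, hence the accelerated retirement.
sha1 = hashlib.sha1(msg).hexdigest()
# SHA-256, from the SHA-2 family, is the usual replacement (256 bits, 64 hex chars).
sha256 = hashlib.sha256(msg).hexdigest()
print(len(sha1), len(sha256))  # → 40 64
```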
Adobe’s Flash has been under attack from many quarters over security as well as for slowing down web pages. If you wish that Flash would finally die in 2016, you might be disappointed. Adobe seems to be trying to kill the name with a rebranding trick: Adobe Flash Professional CC is now Adobe Animate CC. In practice it probably does not mean much, but Adobe seems to acknowledge the inevitability of an HTML5 world. Adobe wants to remain a leader in interactive tools, and the pivot to HTML5 requires new messaging.
The trend of trying to use the same language and tools on both the user end and the server back-end continues. Microsoft is pushing its .NET and Azure cloud platform tools. Amazon, Google and IBM have their own sets of tools. Java is in decline. JavaScript is going strong on both the web browser and the server end with node.js, React and many other JavaScript libraries. Apple’s Swift programming language, now used mainly to make iOS applications, is also being bent to run on servers with projects like Perfect.
Java will still stick around, but Java’s decline as a language will accelerate as new stuff isn’t being written in Java, even if it runs on the JVM. We will not see Java 9 in 2016, as Oracle delayed the release by six months. The Register reports that Java 9 is delayed until Thursday, March 23rd, 2017, just after tea-time.
Containers will rule the world as Docker continues to develop, gains security features, and adds various forms of governance. Until now Docker has been tire-kicking territory, used in production only by the early-adopter crowd, but that can change as vendors start to claim they can do proper management of big data and container farms.
NoSQL databases will take hold, whether they are called “highly scalable” or “cloud-ready.” Expect 2016 to be the year when a lot of big brick-and-mortar companies publicly adopt NoSQL for critical operations. Basically, NoSQL can be seen as a key:value store, and this idea has also expanded to storage systems: we got key:value store disk drives with an Ethernet NIC on board, and basic GET and PUT object storage facilities came into being.
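The GET/PUT interface mentioned here (and exposed by Ethernet-attached key:value drives) boils down to a very small API. The class below is a hypothetical in-memory stand-in to show the idea, not any vendor’s actual library:

```python
class KeyValueStore:
    """Minimal sketch of a GET/PUT object store interface."""

    def __init__(self):
        self._objects = {}

    def put(self, key, value):
        # Store an immutable copy of the object bytes under the key.
        self._objects[key] = bytes(value)

    def get(self, key):
        # Return the object, or None if the key is absent.
        return self._objects.get(key)

store = KeyValueStore()
store.put("invoice/2016/001", b"...pdf bytes...")
print(store.get("invoice/2016/001"))  # → b'...pdf bytes...'
```

Real drives add durability, replication and access control behind the same two verbs, which is what makes the interface attractive for object storage.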
In the database world, Big Data will still be big, but it needs to be analyzed in real time. A typical big data project usually involves some semi-structured data, a bit of unstructured data (such as email), and a whole lot of structured data (stuff stored in an RDBMS). While the cost of Hadoop on a per-node basis is pretty inconsequential, the cost of understanding all of the schemas, getting them into Hadoop, and structuring them well enough to perform the analytics is still considerable. Remember that you’re not “moving” to Hadoop, you’re adding a downstream repository, so you need to worry about systems integration and latency issues. Apache Spark will also get interest, as Spark’s multi-stage in-memory primitives provide more performance for certain applications. Big data brings with it responsibility – digital consumer confidence must be earned.
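Spark’s appeal comes from chaining transformation stages that stay in memory and evaluate lazily. The pure-Python generator sketch below imitates that style to show the idea; it is not the actual PySpark API:

```python
# Chained, lazily evaluated stages in the style of Spark's RDD pipeline,
# built from plain Python generators (illustrative only, not PySpark).
def parallelize(data):
    return iter(data)

def map_stage(rdd, fn):
    return (fn(x) for x in rdd)

def filter_stage(rdd, pred):
    return (x for x in rdd if pred(x))

rdd = parallelize(range(10))
rdd = map_stage(rdd, lambda x: x * x)           # stage 1: square each value
rdd = filter_stage(rdd, lambda x: x % 2 == 0)   # stage 2: keep even squares
print(sum(rdd))  # nothing executes until this final action → 120
```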
IT security continues to be a huge issue in 2016. You might be able to achieve adequate security against hackers and internal threats, but every attempt to make systems idiot-proof just means the idiots get upgraded. Firms are ever more connected to each other and to the general outside world. So in 2016 we will see even more service firms accidentally leaking critical information and a lot more firms having their reputations scorched by incompetence-fuelled security screw-ups. Good security people are needed more and more – a joke doing the rounds among IT execs doing interviews is: “if you’re a decent security bod, why do you need to look for a job?”
There will still be unexpected single points of failure in big distributed networked systems. The cloud behind the silver lining is that Amazon or any other cloud vendor can be as fault-tolerant, distributed and well-supported as you like, but if a service like Akamai or Cloudflare were to die, you would still stop. That’s not a single point of failure in the classical sense, but it’s really hard to manage unless you go for full cloud agnosticism – which is costly. This is hard to justify when the failure rate is so low, so the irony is that the reliability of the content delivery networks means fewer businesses work out what to do if they fail. Oh, and no one seems to test their mission-critical data centre properly, because it’s mission-critical. So they just over-specify where they can and cross their fingers (i.e., pay twice and get half the coverage for other vulnerabilities).
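Cloud agnosticism in practice means having a fallback path when one provider dies. A minimal failover wrapper (the fetch functions are hypothetical stand-ins for real CDN or cloud client calls) might look like:

```python
def fetch_with_failover(fetchers):
    """Try each provider in order; return the first successful result.

    `fetchers` is a list of zero-argument callables, one per provider.
    """
    errors = []
    for fetch in fetchers:
        try:
            return fetch()
        except Exception as exc:
            errors.append(exc)  # provider down - try the next one
    raise RuntimeError(f"all {len(errors)} providers failed")

# Hypothetical providers: the primary is down, the fallback works.
def primary_cdn():
    raise IOError("primary CDN unreachable")

def secondary_cdn():
    return "content from fallback provider"

print(fetch_with_failover([primary_cdn, secondary_cdn]))
# → content from fallback provider
```

The hard (and costly) part the text alludes to is not this wrapper but keeping the second provider's configuration, data and billing warm enough that the fallback actually works when needed.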
For IT start-ups it seems that Silicon Valley’s cash party is coming to an end. Silicon Valley is cooling, not crashing. Valuations are falling. The era of cheap money could be over and valuation expectations are re-calibrating down. The cheap capital party is over. It could mean trouble for weaker startups.
933 Comments
Tomi Engdahl says:
‘Jet blast’ noise KOs ING bank’s spinning rust servers
Noisy fire-fighting test cripples Romanian cash operation
http://www.theregister.co.uk/2016/09/12/fire_fighting_vibrations_ing_banking_crash/
Sound waves equal to a jet fighter taking off, generated by fire-fighting equipment during testing, have been blamed for taking down ING’s banking operations.
Card and ATM transactions, internet banking and ING’s website were all felled in Romania for 10 hours on Saturday by the freak occurrence, according to Vice’s Motherboard website.
ING had been running a routine test of the Inergen fire-suppressant system at the Bucharest data centre housing the servers behind its digital operations.
Inergen is an inert gas used to suppress fires. ING blamed the system for causing the outage.
Tomi Engdahl says:
The Samsung printer story ends
HP has announced the acquisition of Korean Samsung’s printer business for $1.05 billion – the largest acquisition in the printer sector by a US manufacturer.
Source: http://etn.fi/index.php?option=com_content&view=article&id=5036:samsungin-tulostinten-taru-loppuu&catid=13&Itemid=101
Tomi Engdahl says:
Data Centers Dominate FPGA Event
http://www.eetimes.com/author.asp?section_id=36&doc_id=1330431&
In a sign of the times, four out of the five keynote presentations at FPL 2016, a major FPGA conference in Europe, were given by large companies such as IBM, Intel and Microsoft focused on the efficient deployment and use of FPGAs in data centers.
Christoph Hagleitner presented IBM’s view of the major applications where FPGAs can provide differentiation, such as cognitive computing, high performance computing and the Internet of Things. He described two complementary approaches to establishing heterogeneous computing systems in cloud datacenters.
The first approach is based on heterogeneous supernodes that tightly couple compute resources to multi-core CPUs and their coherent memory system via high-bandwidth, low latency interconnects like CAPI or NVlink. The second approach is based on the disaggregation of data center resources where the individual compute, memory, and storage resources are connected via the network fabric and can be individually optimized and scaled in line with the cloud paradigm.
Hagleitner described IBM’s SuperVessel platform, a cloud service created by IBM Research in Beijing and IBM Systems Labs that is part of the OpenPower initiative. SuperVessel is based on Power processors, with FPGAs and GPUs providing acceleration services, using OpenStack to manage the whole cloud. He also presented the new interfaces IBM’s Power 9 processor will use for fast communication between the processor and the accelerators.
P. K. Gupta, the general manager of Xeon+FPGA products in Intel’s data center group, said FPGAs can increase the performance of applications such as machine learning, cloud radio-access networks, edge computing and content delivery. Accelerators can increase performance at lower total cost of ownership for targeted workloads, he said.
FPGAs can be deployed in data centers both in discrete platforms and in integrated platforms in multi-chip packages, Gupta said.
Finally, Xilinx presented a new open-source framework called Pynq for designing with its Zynq FPGAs. Pynq enables programmers who design embedded systems to exploit the capabilities of SoCs based on Zynq chips without having to use CAD tools to design programmable logic circuits. Instead the SoCs are programmed in Python and the code is developed and tested directly on the embedded system. The programmable logic circuits are imported as hardware libraries and programmed through their APIs, essentially the same way that software libraries are imported and programmed.
Tomi Engdahl says:
Hyperscale data center market to reach $71.2 billion globally by 2022: Analyst
http://www.cablinginstall.com/articles/pt/2016/09/hyperscale-data-center-market-to-reach-71-2-billion-globally-by-2022-analyst.html
According to a new report by Allied Market Research, the world hyperscale data center market is expected to reach a revenue of $71.2 billion by 2022, with a CAGR of 20.7% from 2016 to 2022.
As stated by the analyst, “Hyperscale data centers are most widely adopted by cloud service providers to house cloud-based resources and cloud services, accounting for a market share of around 63% in 2015. North America is the largest revenue-generating region for hyperscale data centers, followed by Europe and Asia-Pacific in 2015.”
– Increased adoption of hyperscale data centers is anticipated to be witnessed across key sectors such as healthcare, manufacturing, and government utilities from 2016 to 2022.
– In 2015, North America was the highest revenue-generating region, constituting nearly 37% of the total market revenue.
Tomi Engdahl says:
Chris Duckett / ZDNet:
Nvidia debuts new AI-focused processors for data centers based on its Pascal design, says the new chips have 4x speed, 3x efficiency of previous models — Nvidia has launched a pair of GPUs that the chip giant claims are four times faster than last year’s offering.
Nvidia releases Pascal GPUs for neural networks
http://www.zdnet.com/article/nvidia-releases-pascal-gpus-for-neural-networks/
Nvidia has launched a pair of GPUs that the chip giant claims are four times faster than last year’s offering.
Tomi Engdahl says:
Peter Bright / Ars Technica:
Microsoft will now let devs distribute repackaged Win32 desktop apps through the Windows Store, which was previously limited to Universal Windows Platform apps — The Project Centennial “bridge” for Win32 apps is now ready for production. — Traditional desktop Windows applications …
Desktop apps make their way into the Windows Store
The Project Centennial “bridge” for Win32 apps is now ready for production.
http://arstechnica.com/information-technology/2016/09/desktop-apps-make-their-way-into-the-windows-store/
Traditional desktop Windows applications can now be distributed and sold through the Windows Store, with note-taking application Evernote being one of the first to use this new capability.
Until now, applications built for and sold through the Windows Store in Windows 10 have been built for the Universal Windows Platform (UWP), the common set of APIs that spans Windows 10 across all the many devices it supports. This has left one major category of application, the traditional desktop application built using the Win32 API, behind.
Announced at Build 2015, codename Project Centennial—now officially titled the Desktop App Converter—is Microsoft’s solution to this problem. It allows developers to repackage existing Win32 applications with few or no changes and sell them through the store. Applications packaged this way aren’t subject to all the sandbox restrictions that UWP applications are, ensuring that most will work unmodified. But they are also given the same kind of clean installation, upgrading, and uninstallation that we’ve all come to expect from Store-delivered software.
Centennial is designed to provide not just a way of bringing Win32 apps into the store; it also provides a transition path so that developers can add UWP-based functionality to their old applications on a piecemeal basis.
The big downside to Centennial applications is, of course, that they’re not universal. They’ll only run on desktop Windows systems with x86 processors. What this means for UWP going forward isn’t clear.
Tomi Engdahl says:
PC sales have been steadily declining for several years, and Gartner does not believe the sales figures will ever return to their former levels. This means that manufacturers must be able to reform their business if they want to stay alive at all.
Gartner in fact gives manufacturers very little time: 2020 is the year by which manufacturers have to make their decisions. One solution, of course, is leaving the entire PC industry.
If a company wants to survive, Gartner sees a number of models. The company may continue with its present products and business model. This requires large volumes, so that the operation can remain profitable on shrinking margins.
Another option is to try a new business model for existing products. In practice, companies could sell the PC as a service: the PC itself might be free, but tied to, for example, a digital content provider’s services.
A third possible model is to explore new kinds of products, such as sensor-equipped devices, networked home products and smarter PCs.
The fourth option is to actually reinvent themselves: developing new products and new business models.
Source: http://etn.fi/index.php?option=com_content&view=article&id=5049:enta-jos-pc-n-ostaisi-palveluna&catid=13&Itemid=101
Tomi Engdahl says:
Matt Weinberger / Business Insider:
Microsoft now has the most open source contributors on GitHub, surpassing Facebook and Google with 16.4K
Microsoft just edged out Facebook and proved that it’s changed in an important way
http://www.businessinsider.com/microsoft-github-open-source-2016-9?op=1%3fr=US&IR=T&IR=T
Over the course of the ’90s and ’00s, Microsoft became notorious for competing aggressively with the very concept of open source — free software, developed by teams of volunteer programmers from all over the world.
Since Satya Nadella took the CEO job at Microsoft in 2014, the company’s views towards open source have evolved. Microsoft has embraced open source, and even supports the open source Linux operating system on its Microsoft Azure cloud computing platform.
Tomi Engdahl says:
Digital Data Storage is Undergoing Mind-Boggling Growth
http://www.eetimes.com/author.asp?section_id=36&doc_id=1330462&
In 2010, digital data storage requirements hit the “Zetta” prefix, with only one prefix, the “Yotta,” left available.
The evolution of digital data
Let’s take a look at digital data — an area that has seen exponential growth in the past decade or so — which may be classified as either structured or unstructured.
Structured data is highly organized and made up mostly of tables with rows and columns that define their meaning. Examples are Excel spreadsheets and relational databases.
Unstructured data is everything else. Examples include the following:
Email messages, instant messages, text messages…
Text files, including Word documents, PDFs, and other files such as books, letters, written documents, audio and video transcripts…
PowerPoints and SlideShare presentations
Audio files of music, voicemails, customer service recordings…
Video files that include movies, personal videos, YouTube uploads…
Images of pictures, illustrations, memes…
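The structured/unstructured distinction above can be made concrete with a short sketch (the records and the email text below are invented for illustration): structured rows can be queried directly by field, while unstructured text can only be scanned or indexed.

```python
import csv
import io

# Structured: rows and columns whose meaning is defined by the schema.
table = io.StringIO("title,year,size_kb\nThe Divine Comedy,1888,553\n")
rows = list(csv.DictReader(table))
titles_1888 = [r["title"] for r in rows if r["year"] == "1888"]

# Unstructured: free text with no schema; "querying" it means scanning content.
email = "Dear reader, attached is the transcript of yesterday's call."
mentions_transcript = "transcript" in email
```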
The volume of unstructured data exploded in the past decade and half.
Just compare the size of a text file such as The Divine Comedy, translated into English by Henry F. Cary in 1888, at 553kB with the file size of an HD video that stores a movie like The Bourne Identity at 30GB. The difference is nearly five orders of magnitude, or roughly 54,000 times.
Statistics published by venues that track the digital data market are staggering. According to IDC Research, digital data will grow at a compound annual growth rate (CAGR) of 42% through 2020. In the 2010-2020 decade, the world’s data will grow by 50X; i.e., from about 1ZB in 2010 to about 50ZB in 2020.
And IBM found that humans now create 2.5 quintillion bytes of data daily; that’s the equivalent of about half a billion HD movie downloads.
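These figures are easy to sanity-check with back-of-the-envelope arithmetic (the ~5 GB-per-movie assumption below is mine, not IBM's):

```python
import math

# Text file vs. HD movie: 553 kB vs. 30 GB.
ratio = 30e9 / 553e3              # ~54,000 times larger
orders = math.log10(ratio)        # ~4.7 orders of magnitude

# IBM: 2.5 quintillion bytes/day vs. "half a billion HD movie downloads",
# assuming roughly 5 GB per downloaded movie.
downloads_per_day = 2.5e18 / 5e9  # ~500 million
```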
Digital data storage supply and demand
The advent of the computer accelerated our ability to create data, but it also brought a new challenge. Now that we can generate data blazingly fast, how do we store it?
It’s far easier to generate zettabytes of data than to manufacture zettabytes of data storage capacity. A wide gap is emerging between data generation and hard drive and flash production.
By 2020, demand for capacity will outstrip production by six zettabytes, or nearly double the demand of 2013 alone.
Conclusion
At the time of the 19th General Conference on Weights and Measures in 1991, a metric prefix to the power of 24 was considered to be large enough to include virtually all known physical measures for many years to come.
Approximately twenty years later, in 2010, digital data storage hit the “Zetta” prefix, with only one prefix, the “Yotta,” left available. Maybe the time is approaching for another conference to further expand the available prefixes.
Tomi Engdahl says:
New Java delayed – until next summer
Java platform version 9 is still in deep trouble. According to preliminary plans it was supposed to be completed by the end of 2015. Now Oracle has announced that the release is delayed again, this time to July 2017.
Source: http://etn.fi/index.php?option=com_content&view=article&id=5059:uusi-java-myohastyy-ensi-kesaan&catid=13&Itemid=101
Tomi Engdahl says:
The OS X name becomes history as Macs are updated into the new macOS era. This is a big update, known by the code name Sierra.
One big change is that the digital assistant Siri comes to Mac computers. Siri gets its own icon on the desktop, in the menu, or in the Dock.
Logging in to the machine gets easier, at least if you own an Apple Watch wearable. Sierra automatically detects the proximity of the user's Watch and logs them in.
In 2000, Mac computers ran the Mac OS 9 operating system. It was replaced by the Unix-based OS X.
Source: http://etn.fi/index.php?option=com_content&view=article&id=5066:ensimmainen-macos-julkaistaan&catid=13&Itemid=101
The new macOS is still based on the same code base as OS X.
Tomi Engdahl says:
Sandisk (now part of Western Digital) has announced an SD memory card that holds a terabyte of data. Thus a small card has more capacity than any laptop or computer you have ever owned.
The 1TB SD card is only a prototype, presented at the Photokina show. Incidentally, Sandisk introduced a 512-gigabyte SD card at the very same Photokina two years ago.
This capacity will be needed, as 4K and even 8K video use is increasing at a rapid pace.
Source: http://etn.fi/index.php?option=com_content&view=article&id=5074:sd-kortilla-enemman-tilaa-kuin-koneesi-kiintolevylla&catid=13&Itemid=101
Tomi Engdahl says:
Two weeks ago, the world’s largest privately-held IT company was formed when EMC merged with Dell.
At $63 billion, the deal was one of the largest trades ever and by far the biggest acquisition by a private company. The new Dell Technologies has net sales of $74 billion and 140,000 employees.
- It may be that many people’s image of Dell is outdated. Perhaps Dell is still seen as a PC manufacturer, even though we have long been a big player in cloud computing and a supplier of data center equipment. In that sense the change is not so dramatic, says Dell CEO Mika Frankenberg.
The biggest change on the IT side is probably that few people buy plain hardware any more. - Everything is automated and purchased as a service. Finland is one of Europe’s leading markets in this, Frankenberg says.
Source: http://etn.fi/index.php?option=com_content&view=article&id=5083:uusi-dell-on-valmis-datavyoryyn&catid=13&Itemid=101
Tomi Engdahl says:
Zombie Moore’s Law shows hardware is eating software
Customised CPUs are doing things software just can’t do on commodity kit
http://www.theregister.co.uk/2016/09/22/the_evolution_of_moores_law_suggests_hardware_is_eating_software/
After being pronounced dead this past February – in Nature, no less – Moore’s Law seems to be having a very weird afterlife. Within the space of the last thirty days we’ve seen:
Intel announce some next-generation CPUs that aren’t very much faster than the last generation of CPUs;
Intel delay, again, the release of some of its 10nm process CPUs; and
Apple’s new A10 chip, powering the iPhone 7, arrive as one of the fastest CPUs ever.
Intel hasn’t lost the plot. In fact, most of the problems in Moore’s Law have come from Intel’s slavish devotion to a single storyline: more transistors and smaller transistors are what everyone needs. That push toward ‘general purpose computing’ gave us thirty years of Wintel, but that no longer looks to be the main game. The CPU is all grown up.
Tomi Engdahl says:
W3C Set To Publish HTML 5.1, Work Already Started On HTML 5.2
https://developers.slashdot.org/story/16/09/21/2131222/w3c-set-to-publish-html-51-work-already-started-on-html-52
Members of the World Wide Web Consortium (W3C) are getting ready to launch the HTML 5.1 specification and have already started work on the upcoming HTML 5.2 version since mid-August. The HTML 5.1 standard has been promoted from a “Release Candidate” to a “Proposed Recommendation,” the last step before it becomes a “W3C Recommendation,” and officially replaces HTML 5 as the current HTML standard.
HTML 5.1 taking its last step before becoming “the standard”
Read more: http://news.softpedia.com/news/w3c-set-to-publish-html-5-1-already-started-work-on-html-5-2-508512.shtml#ixzz4Kz3goYzZ
Tomi Engdahl says:
Microsoft told to compensate customers for Windows 10 breaking computers
http://www.telegraph.co.uk/technology/2016/09/21/microsoft-told-to-compensate-customers-for-windows-10-breaking-c/
Customers have reported problems with Microsoft’s Windows 10 and the anniversary update
Tomi Engdahl says:
C Deemed Top Programming Language in New Ten-Source Ranking
Java and Python come in second and third
http://webscripts.softpedia.com/blog/c-deemed-top-programming-language-in-new-ten-source-ranking-506712.shtml
The Institute of Electrical and Electronics Engineers (IEEE) has aggregated data from ten sources that evaluated programming languages based on various criteria to provide one of the broadest rankings to date.
According to its findings, the top 10 most popular programming languages today are C, Java, Python, C++, R, C#, PHP, JavaScript, Ruby, and Go.
IEEE included data from Google searches, GitHub active repos, GitHub trends, new GitHub projects, Stack Overflow questions, views on existing Stack Overflow questions, Reddit, Hacker News, Career Builder, Dice, Twitter, and IEEE Xplore.
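Aggregating popularity signals from heterogeneous sources like these is typically done by normalizing each source's scores and averaging them. The sketch below uses invented numbers and a plain average, not IEEE's actual data or weighting scheme:

```python
# Invented per-source popularity scores, keyed by language.
sources = {
    "google_search":  {"C": 100, "Java": 90,  "Python": 85},
    "github_repos":   {"C": 95,  "Java": 100, "Python": 90},
    "stack_overflow": {"C": 100, "Java": 85,  "Python": 95},
}

def composite(sources):
    """Normalize each source to [0, 1] by its top score, then average."""
    totals = {}
    for scores in sources.values():
        top = max(scores.values())
        for lang, s in scores.items():
            totals[lang] = totals.get(lang, 0.0) + s / top
    n = len(sources)
    return sorted(((t / n, lang) for lang, t in totals.items()), reverse=True)

ranking = composite(sources)  # [(score, language), ...], best first
```

With these toy inputs the composite comes out C, Java, Python, matching the top of IEEE's published list.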
Tomi Engdahl says:
Going hyperconverged? Don’t forget to burst into the cloud
Just make sure your team evolves with the technology
http://www.theregister.co.uk/2016/09/22/hyperconvergence_and_cloud_means_what/
Here’s a key benefit of that shiny new hyperconverged box you just bought: it’s supposed to speak the cloud’s language.
After all, hyperconverged storage is sometimes viewed as a private cloud in a box, melding storage, networking and compute into a single package with the storage management happening under the hood.
It offers the ability to provision new resources and control them via software APIs.
That sounds a lot like the public cloud, then, only in a rack somewhere in your data centre. In theory at least, that opens up the possibility for an on-premise hyperconverged box to talk to public cloud services like Azure and AWS. But however push-button hyperconverged kit is supposed to be, rolling it into a hybrid cloud environment is going to take a little effort. There will be speed bumps along the way.
What pressures will you face, and what skills should your team have to safely navigate them?
Jeff Kato, senior analyst and consultant at tech advisory firm Taneja Group, has never seen on-premise vendors scurry so much to make their kit easy to use. “They know they’re competing against public clouds, so now they have to make on-premise infrastructure as inexpensive and easy to use as public cloud,” he said.
The next step is for the two to integrate, he said, and indeed, we’re already seeing many signs of this across the industry.
Tomi Engdahl says:
Chris Brook / Threatpost:
Facebook brings osquery, its open source SQL-powered detection tool for monitoring OS processes and networks, to Windows
Facebook Debuts Open Source Detection Tool for Windows
https://threatpost.com/facebook-debuts-open-source-detection-tool-for-windows/120897/
Facebook successfully ported its SQL-powered detection tool, osquery, to Windows this week, giving users a free and open source method to monitor networks and diagnose problems. The framework, which converts operating systems to relational databases, allows users to write SQL-based queries to detect intrusions and other types of malicious activity across networks. Facebook debuted the open source tool in 2014 as cross-platform, but for the last two years it was only supported on Ubuntu, CentOS, and Mac OS X operating systems. Facebook isn’t the biggest Windows shop, but the company confirmed in March that because so many users were asking for it, it was building a version of the tool for Windows 10.
See more at: Facebook Debuts Open Source Detection Tool for Windows https://wp.me/p3AjUX-vrX
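The core idea, exposing operating-system state as SQL tables, can be illustrated with Python's built-in sqlite3 module. The table and process rows below are invented stand-ins, not osquery's real schema or output:

```python
import sqlite3

# Stand-in for osquery's virtual "processes" table (invented rows).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE processes (pid INTEGER, name TEXT, path TEXT)")
conn.executemany("INSERT INTO processes VALUES (?, ?, ?)", [
    (101, "svchost.exe", r"C:\Windows\System32\svchost.exe"),
    (202, "svchost.exe", r"C:\Users\bob\AppData\Temp\svchost.exe"),
])

# A typical detection-style query: a well-known process name running
# from an unexpected location is worth flagging.
suspicious = conn.execute(
    "SELECT pid, path FROM processes "
    "WHERE name = 'svchost.exe' AND path NOT LIKE 'C:\\Windows\\%'"
).fetchall()
```

The query flags only the process running outside the expected directory, which is the kind of intrusion-hunting pattern osquery is built for.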
Tomi Engdahl says:
Mary Jo Foley / ZDNet:
Windows 10 surpasses 400M active devices, up from 300M in May, and announces new container-based isolation security feature for Edge browser, coming next year
Microsoft: Windows 10 now on 400 million devices
http://www.zdnet.com/article/microsoft-windows-10-now-on-400-million-devices/
Microsoft officials said Windows 10 has hit the 400 million ‘active’ device milestone, up from 300 million in early May.
They also said that Windows Insider testers working with early Windows 10 “Redstone 2” builds soon should get their hands on a new Edge browser security feature that’s been rumored for some time: container-based isolation in the browser.
That container-based isolation is technology codenamed “Barcelona.” While Windows 10 Enterprise currently supports containers for development purposes, Barcelona is specific to the browser baked into the operating system.
Microsoft execs have christened Barcelona “Windows Defender Application Guard,” they said today. The feature will use virtualization-based security, isolating potentially malicious code in containers so it can’t spread across company networks. Starting “early next year,” Microsoft will start testing this feature with enterprise customers who’ve expressed interest, officials said.
Microsoft officials also said at Ignite today that the Windows Defender Advanced Threat Protection (ATP) and Office 365 ATP services now “share intelligence mutually.”
Tomi Engdahl says:
Vladimir Putin Is Replacing Microsoft Programs With Domestic Software
https://tech.slashdot.org/story/16/09/28/0514240/vladimir-putin-is-replacing-microsoft-programs-with-domestic-software
Moscow city will replace Microsoft Corp. programs with domestic software on thousands of computers in answer to President Vladimir Putin’s call for Russia’s authorities to reduce dependence on foreign technology amid tensions with the U.S. and Europe. The city will initially replace Microsoft’s Exchange Server and Outlook on 6,000 computers with an e-mail system installed by state-run carrier Rostelecom PJSC, Artem Yermolaev, head of information technology for Moscow, told reporters Tuesday.
Moscow Drops Microsoft on Putin’s Call for Self-Sufficiency
http://www.bloomberg.com/news/articles/2016-09-27/moscow-drops-microsoft-outlook-as-putin-urges-self-sufficiency
City hall switching to local software installed by Rostelecom
Russia cuts dependence on U.S. tech amid political tensions
Tomi Engdahl says:
Report evaluates global data center technical furniture market
http://www.cablinginstall.com/articles/pt/2016/09/report-evaluates-global-data-center-technical-furniture-market.html?cmpid=Enl_CIM_DataCenters_September272016&eid=289644432&bid=1539275
The Global Data Center Technical Furniture Market (Rack, PDU, General Construction Furniture) – Strategic Assessment and Forecast – Till 2021 report from Beige Market Intelligence
Market research analysts at Beige Market Intelligence forecast the worldwide data center technical furniture market to grow at a CAGR of around 14.51% during the forecast period. The rack segment is expected to witness the highest growth during the period, as the demand for taller racks will drive the market revenue. Market growth of technical furniture directly depends on the number of new data center construction projects and the number of data center renovation projects, notes the analyst.
During 2018-2019, the report forecasts, almost 80% of renovation projects will be completed, which will affect the volume sales of racks as well as PDUs. Most renovation and new-build projects are also opting for fluid leak detection and seismic isolation platforms.
The report covers the companies operating in the entire value chain of the market. Major players identified within the report are Blackbox Network Services, Eaton, Emerson Network Power, Rittal, and Schneider Electric. The report also covers emerging vendors in the market such as Belden, Chatsworth, CyberPower Systems, Dell, Fujitsu, HP, IBM, Pentair, and Raritan
Tomi Engdahl says:
Insider blog: Working for Intel’s Data Center Group
http://www.cablinginstall.com/articles/pt/2016/09/insider-blog-working-for-intel-s-data-center-group.html?cmpid=Enl_CIM_DataCenters_September272016&eid=289644432&bid=1539275
From my perspective, data centers are where all of the most exciting developments are happening with respect to the maturation of IT. In the modern world, everyone uses services that rely on servers—for example, email—but very few people have their own servers. Instead, they use servers hosted in data centers around the world to enable some of the most basic functions we associate with IT devices; they never actually see those servers.
We’re in the middle of a generational shift where more and more of our computing is being done in the data center rather than locally, and we just access a small part of that power and information as we need it. At the same time, the cost of running these data centers has become low enough that the services they can provide can be offered for free by subsidizing the cost with ad revenue.
The Difference of Working at Intel
There are a few key reasons why Intel is in a position to advance this field.
One is our breadth. Because we reach into so many areas, we can get a good understanding of our customers’ problems and then do the hard work of bringing genuinely new technology to market that addresses those problems. We don’t have to be constrained to finding workaround solutions using existing technology; we can actually question the nature of the problem and design new technology from the ground up to address it.
Additionally, when we come up with that new technology, we have what it takes to actually bring that to market. It’s easy to prove something in a lab and create a proof of concept, but making that idea practical and marketable enough to ship is very hard. Intel DCG can actually make this possible because our breadth allows us to understand and work on every aspect of a problem and create a complete solution that integrates the new idea rather than just a component.
Working in DCG
One of the best things about working in this group is the ability to “get in the trenches” and gain a working understanding of the situations our customers have to deal with. That experience permeates our working culture and leads to a genuine desire to find real solutions for real problems rather than just chasing after minute improvements in specs.
Tomi Engdahl says:
Coding as a spectator sport
http://www.edn.com/electronics-blogs/about-embedded/4442743/Coding-as-a-spectator-sport?_mc=NL_EDN_EDT_EDN_today_20160927&cid=NL_EDN_EDT_EDN_today_20160927&elqTrackId=0a6047e153714401b85d7ce8fa5adb59&elq=5bf020b22b2040de82eb81bcb16b2321&elqaid=34035&elqat=1&elqCampaignId=29755
With the rise of low-cost platforms like the Arduino and growing interest in the Internet of Things (IoT), a new category of embedded developers is arising. Many of this new cadre are not coming from electronics or computer science backgrounds, but are forging their own approaches to development unbound by tradition or academic methods. Some are even learning to code not by study, but by watching others write code.
One example of this online learning approach is Twitch TV. Although this site looks on the surface to only be a means of sharing video game excursions, there is more available if you dig a bit. In the Creative channels, search for Programming and you’ll come up with a list of videos on the topic. Some are recordings of presentations, while others are a kind of tutorial. The tutorials take the form of “looking over the shoulder” as the video creator narrates their activity. When originally created, these are streamed live, and have a chat line open for real-time question-and-answer. The recorded version then gets archived for latecomers’ use.
Another learning resource that uses the same “over the shoulder” video format as Twitch is LiveCoding. Unlike Twitch, however, Live Coding focuses exclusively on coding. It is also more organized in its approach to offering instruction than Twitch. LiveCoding organizes its content by programming language (Java, Python, Ruby, C/C++, etc.), some with tens of thousands of videos available. Within each of those language categories, the site offers a choice of beginner, intermediate, or advanced level topics.
There also seems to be a social aspect to this method of knowledge transfer. Alongside the streaming presentation there is a chat box, which allows real-time viewers to post comments, ask questions, and the like.
Perhaps this approach is a natural extension of an increasingly online existence, or a way for developers who live and breathe coding to connect and interact with like-minded cohorts, but coding as a spectator sport simply doesn’t appeal to me. And I worry that participants are exchanging only random tidbits of information and failing to see things in an overall context and structure. Such fragmented knowledge transfer is fine for play and prototyping but, I fear, falls short of providing the kind of instruction needed to achieve reliable, production-ready design.
Tomi Engdahl says:
Marius Nestor / Softpedia News:
Raspberry Pi Foundation unveils PIXEL, a new LXDE-based desktop for Raspbian that is designed to be more appealing with new theme, application icons, more
Raspberry Pi Foundation Unveils New LXDE-Based Desktop for Raspbian Called PIXEL
The new desktop environment can be installed right now
Read more: http://news.softpedia.com/news/raspberry-pi-foundation-unveils-new-lxde-based-desktop-for-raspbian-called-pixel-508756.shtml#ixzz4LeZ6xWer
Tomi Engdahl says:
Roger Parloff / Fortune:
Amazon, Facebook, Google, IBM, Microsoft launch Partnership on AI, a nonprofit to advance best practices in AI; the group isn’t intended to lobby government — Five tech giants announced on Wednesday that they are launching a nonprofit to “advance public understanding” …
AI Partnership Launched by Amazon, Facebook, Google, IBM, and Microsoft
http://fortune.com/2016/09/28/ai-partnership-facebook-google-amazon/
The new group will focus on ethics and best practices
Five tech giants announced on Wednesday that they are launching a nonprofit to “advance public understanding” of artificial intelligence and to formulate “best practices on the challenges and opportunities within the field.”
The Partnership on Artificial Intelligence to Benefit People and Society is being formed by Amazon, Facebook, Google, IBM, and Microsoft, each of which will have a representative on the group’s 10-member board.
The partnership will conduct research and recommend best practices relating to “ethics, fairness and inclusivity; transparency, privacy, and interoperability; collaboration between people and AI systems; and the trustworthiness, reliability and robustness of the technology,” according to the announcement. “It does not intend to lobby government or other policymaking bodies.”
“We’re in a golden age of machine learning and AI,”
Tomi Engdahl says:
Gravity Sketch’s Wild VR App Will Let You Draw in Mid-Air
https://www.wired.com/2016/09/gravity-sketchs-wild-vr-app-will-let-draw-mid-air/
Back in 2014, a London-based startup called Gravity Sketch released a prototype for an impressive virtual reality sketching tool. The tech demo relied on a proprietary tablet-and-VR-headset combo that was eye-catching (it reminded us of something out of Tron), but commercially unavailable. Two and a half years later, the startup has ditched the hardware approach entirely—but it’s also a lot closer to bringing its intuitive 3D design tool to the public.
Tomi Engdahl says:
Google Research Blog:
Google releases YouTube-8M, a dataset of 8M video URLs representing over 500K hours of video, along with 4800 distinct labels, for machine learning — Posted by Sudheendra Vijayanarasimhan and Paul Natsev, Software Engineers — Many recent breakthroughs in machine learning and machine perception …
Announcing YouTube-8M: A Large and Diverse Labeled Video Dataset for Video Understanding Research
http://research.googleblog.com/2016/09/announcing-youtube-8m-large-and-diverse.html
Tomi Engdahl says:
An SSD beats the old mechanical hard disk in almost every respect: it is quieter, faster, and consumes less power, but unfortunately it still costs significantly more. At the moment the price difference is 4-10 times.
An SSD gigabyte currently costs 20-50 cents, depending on the NAND technology and the drive format. At the same time, a traditional terabyte hard disk costs about four cents per gigabyte.
Source: http://etn.fi/index.php?option=com_content&view=article&id=5141:ssd-gigatavu-maksaa-4-10-kertaa-enemman&catid=13&Itemid=101
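Working those per-gigabyte figures through (note that the quoted cent prices actually imply a gap of 5x to 12.5x, slightly wider at the top than the 4-10x headline):

```python
# Prices in cents per gigabyte, as quoted in the snippet.
ssd_low, ssd_high = 20, 50
hdd = 4

gap_low = ssd_low / hdd            # 5.0x at the cheap end
gap_high = ssd_high / hdd          # 12.5x at the expensive end

# What that means for a typical purchase:
ssd_256gb_usd = 256 * ssd_low / 100   # ~$51 for 256 GB of the cheapest SSD
hdd_1tb_usd = 1000 * hdd / 100        # ~$40 for a full 1 TB hard disk
```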
Tomi Engdahl says:
Facebook has built a world-class, murderous, Doom-playing AI
http://www.techinsider.io/facebook-intel-win-vizdoom-contest-build-artificial-intelligence-ai-plays-doom-deathmatch-2016-9
Researchers from Facebook have built a highly efficient piece of artificial intelligence (AI), programmed to slaughter anything in its path.
Luckily, it can only play “Doom”.
A team from the social network blasted its way to glory this week in a contest to build programs capable of autonomously playing the classic first-person shooter game.
Called VizDoom, it’s an irreverent test of skill for AI researchers, focused on eight-player deathmatches. Think your software is smart? Prove it: Defeat your rivals in a bloody eight-player shoot-out.
ViZDoom
http://vizdoom.cs.put.edu.pl/
ViZDoom is a Doom-based AI research platform for reinforcement learning from raw visual information. It allows developing AI bots that play Doom using only the screen buffer. ViZDoom is primarily intended for research in machine visual learning, and deep reinforcement learning, in particular.
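Reinforcement learning of the kind ViZDoom is built for can be shown in miniature. The sketch below is tabular Q-learning on a toy five-cell corridor (my own example; it uses none of ViZDoom's actual API): the agent starts knowing nothing and learns from the reward signal alone to walk toward the goal.

```python
import random

random.seed(0)
N = 5                                   # corridor cells 0..4; reward at cell 4
q = {(s, a): 0.0 for s in range(N) for a in (-1, 1)}
alpha, gamma, eps = 0.5, 0.9, 0.1       # learning rate, discount, exploration

for _ in range(500):                    # episodes
    s = 0
    while s != N - 1:
        if random.random() < eps:       # explore occasionally
            a = random.choice((-1, 1))
        else:                           # otherwise act greedily
            a = max((-1, 1), key=lambda act: q[(s, act)])
        s2 = min(max(s + a, 0), N - 1)  # walls at both ends
        r = 1.0 if s2 == N - 1 else 0.0
        q[(s, a)] += alpha * (r + gamma * max(q[(s2, -1)], q[(s2, 1)]) - q[(s, a)])
        s = s2

# The learned greedy policy should step right (+1) from every cell.
policy = {s: max((-1, 1), key=lambda act: q[(s, act)]) for s in range(N - 1)}
```

ViZDoom agents do the same thing in principle, except the "state" is the raw screen buffer and the policy is a deep network rather than a lookup table.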
Tomi Engdahl says:
Tech Titans Join Forces to Stop AI from Behaving Badly
The new partnership is also designed to head off unwanted regulation.
https://www.technologyreview.com/s/602483/tech-titans-join-forces-to-stop-ai-from-behaving-badly/?utm_campaign=socialflow&utm_source=twitter&utm_medium=post
When it comes to policing artificial intelligence, technology leaders think there is safety in numbers.
A new organization called the Partnership on Artificial Intelligence to Benefit People and Society will seek to foster public dialogue and create guidelines for developing AI so that systems do not misbehave. The companies involved include Google and its subsidiary DeepMind, Facebook, Amazon, Microsoft, and IBM. The partnership is founded on eight tenets or principles, including the idea that AI should benefit as many people as possible; that the public should be involved in its development; that research should be conducted in an open way; and that AI systems should be able to explain their reasoning.
Partners on AI
http://www.partnershiponai.org/
Tomi Engdahl says:
Building Chips That Can Learn
Machine learning, AI, require more than just power and performance.
http://semiengineering.com/building-chips-that-can-learn/
The idea that devices can learn optimal behavior rather than relying on more generalized hardware and software is driving a resurgence in artificial intelligence, machine learning, and cognitive computing. But architecting, building and testing these kinds of systems will require broad changes that ultimately could impact the entire semiconductor ecosystem.
“We’ve been having a lot of discussions lately about cognitive computing,” said Wally Rhines, chairman and CEO of Mentor Graphics. “When we’re born, all the cells in our brain are the same. Over time, those cells specialize into regions such as eyesight. The thinking is that if you start with identical (semiconductor) memory cells, you can specialize them with time. And based on the applications you expose the chip to, stored memory accumulates more data over time. The brain is largely about pattern recognition. What differentiates us from animals is predictive pattern recognition. That requires hierarchical memory and invariant memory. So you don’t store every pattern, but if you see a face in the shadows you can still recognize it. The human brain does this much more effectively than technology.”
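The "identical cells that specialize through exposure" idea has a direct software analogue: a classifier whose weights start identical and differentiate as patterns are presented. A minimal perceptron sketch (a toy example of mine, not anything from Mentor Graphics):

```python
# A perceptron learning the AND pattern. The weights start identical
# (all zero) and specialize as training examples are presented.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b, lr = [0.0, 0.0], 0.0, 0.1

for _ in range(25):                              # training passes
    for (x1, x2), target in data:
        out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
        err = target - out                       # -1, 0, or +1
        w[0] += lr * err * x1
        w[1] += lr * err * x2
        b += lr * err

predictions = [1 if w[0] * x1 + w[1] * x2 + b > 0 else 0 for (x1, x2), _ in data]
```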
Tomi Engdahl says:
Amazon Offers $2.5M To Make Alexa Your Friend
http://hackaday.com/2016/09/30/amazon-offers-2-5m-to-make-alexa-your-friend/
Amazon has unveiled the Alexa Prize, a $2.5 Million purse for the first team to turn Alexa, the voice service that powers the Amazon Echo, into a ‘socialbot’ capable of, “conversing coherently and engagingly with humans on popular topics for 20 minutes”.
The Alexa Prize is only open to teams from colleges or universities, with the winning team taking home $500,000 USD, with $1M awarded to the team’s college or university in the form of a research grant. Of course, the Alexa Prize grants Amazon a perpetual, irrevocable, worldwide, royalty-free license to make use of the winning socialbot.
It may be argued the Alexa Prize is a competition to have a chat bot pass a Turing Test. This is a false equivalency;
The Alexa Prize
$2.5 Million to Advance Conversational Artificial Intelligence
https://developer.amazon.com/alexaprize
Tomi Engdahl says:
Wish Minecraft were open source? Meet Minetest.
A free, open source voxel game engine and game. Fully extendable. You are in control.
http://www.minetest.net/
Tomi Engdahl says:
Firefox to doctor Pepper so it can run Chrome’s PDF, Flash plugins
Mozilla decides to crib some of Google’s browser interfaces
http://www.theregister.co.uk/2016/10/01/firefox_chrome_apis/
Mozilla is investigating hooking up Google Chrome’s builtin plugins to Firefox.
The foundation’s Project Mortar hopes to spare its developers from building and improving non-core components of Firefox by instead providing the same software interfaces that Chromium, the open-source engine of Chrome, provides.
That will allow Firefox to run Chrome’s PDF viewer and Flash player, saving Moz’s programmers from having to develop and maintain their own.
“Project Mortar seeks to reduce the time Mozilla spends on technologies that are required to provide a complete web browsing experience, but are not a core piece of the web platform,” explained Mozilla senior director of engineering Johnny Stenback on Friday.
Specifically, under Moz’s plan, Firefox will support Google’s Pepper API so it can run pdfium – Chrome’s open-source native PDF viewer – and the Adobe-Google-built Pepper Flash player – which runs inside a sandbox to limit the damage malicious code can do if it exploits a security hole in the plugin.
Tomi Engdahl says:
With HDDs On The Ropes, Samsung Predicts SSD Price Collisions As NVMe Takes Over
https://hardware.slashdot.org/story/16/10/01/2127209/with-hdds-on-the-ropes-samsung-predicts-ssd-price-collisions-as-nvme-takes-over
At its Global SSD Summit, Samsung shared its vision of the current state of the SSD market and also outlined future trends. The company noted that SSDs are steadily displacing HDDs in more applications, but NVMe is shaping up to be the dark horse that may put the venerable HDD to rest.
With HDDs On The Ropes, Samsung Predicts SSD Price Collisions As NVMe Takes Over
http://www.tomshardware.com/news/samsung-ssd-hdd-sata-nvme,32762.html
SSDs are steadily displacing HDDs in more applications, but NVMe is shaping up to be the dark horse that may put the venerable HDD to rest.
SSD Rumblings As Prices Drop
Samsung loves Google, and not just because it probably buys plenty of its SSDs. Samsung outlined its rather intense focus on Google Analytics for marketing purposes last year, and this year it pointed out that recent Google searches for “SSD upgrades” outweighed searches for “CPU upgrades.” The historical trend indicates that this wasn’t always the case (of course), but with 40 million searches for SSD upgrades this year, it is clear that SSDs are on the move.
Mount SSD Spews Forth Capacious SSDs, 3D TLC NAND To Blame
The wave of TLC NAND in both 2D and 3D flavors helps to push pricing down. Samsung cited a Forward Insights (a market analysis firm) report that indicates the industry, as a whole, reached a crossover point this year that sees more TLC NAND making its way to market than MLC NAND.
HDDs On The Ropes
HDDs have a serious pricing problem for low-capacity applications. The majority of notebook users don’t need more than 1TB of storage (most need much less), and the rise of the cloud continues to reduce the need for high-capacity local storage. SSDs don’t really have to beat HDDs on the price versus capacity front (and they probably never will); the price just needs to be “close enough” to become a viable HDD alternative. The popular 256GB SSD capacity is already close to the price of a 1TB HDD. Samsung predicts that the price of a 256GB SSD will sink beneath 1TB HDDs in mid-2017, and 512GB will follow in 2020.
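The crossover claim above is easy to model. A minimal sketch, with assumed inputs (the $75 SSD price, $50 HDD price, and 20% annual decline are illustrative numbers, not figures from Samsung):

```python
# Assumed, illustrative inputs: a 256GB SSD at $75, a 1TB HDD holding
# steady near $50, and SSD prices eroding ~20% per year.
ssd_price, hdd_price, annual_decline = 75.0, 50.0, 0.20
year = 2016
while ssd_price >= hdd_price:
    ssd_price *= 1 - annual_decline  # one year of price erosion
    year += 1
print(year)  # crossover year under these assumed inputs: 2018
```

Under these made-up inputs the crossover lands in 2018; Samsung’s own mid-2017 prediction implies a steeper decline or a narrower starting gap.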
Tomi Engdahl says:
How to steal the mind of an AI: Machine-learning models vulnerable to reverse engineering
Think SQL injections on steroids
http://www.theregister.co.uk/2016/10/01/steal_this_brain/
Amazon, Baidu, Facebook, Google and Microsoft, among other technology companies, have been investing heavily in artificial intelligence and related disciplines like machine learning because they see the technology enabling services that become a source of revenue.
Consultancy Accenture earlier this week quantified this enthusiasm, predicting that AI “could double annual economic growth rates by 2035 by changing the nature of work and spawning a new relationship between man and machine” and by boosting labor productivity by 40 per cent.
But the machine learning algorithms underpinning this harmonious union of people and circuits aren’t secure.
In a paper [PDF] presented in August at the 25th Annual Usenix Security Symposium, researchers at École Polytechnique Fédérale de Lausanne, Cornell University, and The University of North Carolina at Chapel Hill showed that machine learning models can be stolen and that basic security measures don’t really mitigate attacks.
Machine learning models may, for example, accept image data and return predictions about what’s in the image.
Stealing Machine Learning Models via Prediction APIs
https://regmedia.co.uk/2016/09/30/sec16_paper_tramer.pdf
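The paper’s simplest attack — equation-solving against an API that returns confidence scores — can be sketched in a few lines. Everything here (the victim weights, the `predict_api` function) is a made-up stand-in for a real prediction API; for a logistic-regression model with d features, d+1 queries suffice:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5

# Hypothetical victim: a logistic-regression model hidden behind an API.
w_true = rng.normal(size=d)
b_true = 0.7

def predict_api(X):
    """Black box: returns confidence scores P(y=1|x), as many ML APIs do."""
    return 1.0 / (1.0 + np.exp(-(X @ w_true + b_true)))

# Attack: logit(p) = w.x + b is linear in the unknowns (w, b), so d+1
# queries that return confidences give an exactly solvable linear system.
X = rng.normal(size=(d + 1, d))
p = predict_api(X)
logits = np.log(p / (1.0 - p))
A = np.hstack([X, np.ones((d + 1, 1))])  # columns for w, plus one for b
theta = np.linalg.solve(A, logits)
w_stolen, b_stolen = theta[:d], theta[-1]
```

With exact confidence outputs the recovered weights match the victim’s to floating-point precision, which is why the paper finds that rounding or withholding confidence scores is one of the few mitigations that helps.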
Tomi Engdahl says:
Linus Torvalds Officially Announces the Release of Linux Kernel 4.8
https://linux.slashdot.org/story/16/10/03/026220/linus-torvalds-officially-announces-the-release-of-linux-kernel-48
Today, Linus Torvalds proudly announced the release and availability for download of the Linux 4.8 kernel branch, which is now the latest stable and most advanced one.
Linux Kernel 4.8 Officially Released, Merge Window for Kernel 4.9 Now Open
Read more: http://news.softpedia.com/news/linux-kernel-4-8-officially-released-merged-window-for-kernel-4-9-now-open-508873.shtml
Tomi Engdahl says:
Roger Parloff / Fortune:
A history of deep learning and how it’s being used in more tech products than ever — Decades-old discoveries are now electrifying the computing industry and will soon transform corporate America. — Over the past four years, readers have doubtlessly noticed quantum leaps in the quality of a wide range of everyday technologies.
http://fortune.com/ai-artificial-intelligence-deep-machine-learning/
Tomi Engdahl says:
Khari Johnson / VentureBeat:
45K developers are using Microsoft Bot Framework to make bots for Skype and other chat platforms, while 34K developers are making bots for Facebook Messenger
Microsoft’s bot platform is more popular than Facebook’s among developers
http://venturebeat.com/2016/09/26/microsofts-bot-platform-is-more-popular-than-facebooks-among-developers/
Tomi Engdahl says:
Kaleao’s KMAX ARM-based server has legs. How fast can it run?
Hyper-converged version has 192 servers in 3U
http://www.theregister.co.uk/2016/10/04/kaleao_kmax_armbased_server/
Kaleao is a startup developing ARM-based servers and hyper-converged appliances under a KMAX brand. Its marketing-speak says it has a “true convergence” approach, it involves “physicalization” and there is a “microvisor” – oh dear, what does this mean?
The KMAX product comes in server and appliance forms.
The servers use 64-bit ARMv8-compatible CPUs and employ the big.LITTLE architecture – ARM’s form of CPU tiering with one or more beefy cores spinning up to handle heavy workloads and smaller lightweight cores (which don’t need quite so much power) taking on other work.
FPGAs – reprogrammable logic chips – are employed and Kaleao says that one “can create the virtual function NIC as an actual PCI device. Each VM can directly map a unique virtual function as a PCI device, so each VM has real hardware resources that can change dynamically like virtual ones.”
What is “true convergence?” Kaleao states: “Traditional converged systems are pre-integrated assembly of storage, servers and network equipment, which normally are separated devices interconnected and provided. True convergence is the technology that allows a native, board-level convergence. With true convergence, any device can be a compute, storage and network server or any of these functions.”
The net net is that Kaleao offers deeper convergence, not true or even untrue convergence.
Tomi Engdahl says:
Lucas Matney / TechCrunch:
Samsung waits to see whether VR is hype or mainstream before moving forward on standalone VR headsets, 10K displays
Samsung waiting to see where VR hype cycle lands before moving on standalone headsets, next-gen 10K displays
https://techcrunch.com/2016/09/27/samsung-waiting-to-see-where-vr-hype-cycle-lands-before-moving-on-standalone-headsets/amp/
At a company event today in San Francisco, Samsung President & Chief Strategy Officer Young Sohn detailed that the company is actively pursuing both smartphone-focused VR headsets and standalone solutions. The decision to market and ship a dedicated all-in-one device would rely largely on where the VR market goes in the upcoming months and years, he says, and whether the clunky headsets can gain wider adoption.
“Is [virtual reality] hype or mainstream? I don’t have a good answer for you today,” Sohn said.
Sohn detailed that he believes the industry is at the peak of its hype cycle and that “there’s a bit of a chicken and egg problem right now” for shipping all-in-one headsets when the market hasn’t entirely proven itself thus far.
Samsung is currently one of the largest manufacturers of mobile VR headsets. There are over a million Galaxy and Note owners utilizing the company’s $99 Gear VR headset, though a large chunk of that user base likely received those headsets for free based on pre-order promotions for the company’s handsets. The company is also one of Google Daydream’s first partners and is more than likely going to release a separate mobile headset for that platform.
The QuadHD displays currently available on Samsung’s Galaxy and Note smartphone lines may be more than adequate for regular smartphone usage, but when the device is slotted into a Gear VR headset and placed inches away from the user’s eyes, the display’s limits become much more visible.
Sohn said that virtual reality technologies would definitely be a driving incentive for the company to “move faster” in building next-gen displays, but also posited that building a 10K mobile display would likely require an investment of “at least $5 billion to $10 billion” from the company.
These standalone headsets differ from mobile solutions in that no secondary compute device is required with all of the compute, display and sensor tech baked into a single device. Intel and Qualcomm have both shown off reference designs for all-in-one VR devices but are not looking to immediately market these devices to consumers.
Tomi Engdahl says:
Enhancing Communication Between Security and DevOps
http://www.securityweek.com/enhancing-communication-between-security-and-devops
Security teams and DevOps teams aren’t always on the same page, and the lack of communication often results in misaligned priorities that significantly inhibit productivity. Developers need enhanced communication and instruction from the risk management team to remediate vulnerabilities that are being discovered in applications.
The shift to DevOps is inevitable in many organizations. With businesses focused on higher and faster performance, and fearful of falling behind competitors, the allure of being more responsive to customers and stakeholders will inevitably overwhelm security teams’ concerns. This means that security organizations must learn to meaningfully insert themselves into this transition. A key part of this change is evolving to effectively work with, not against, application development teams – the “Dev” side of DevOps.
The majority of developers do not have a strong background in secure coding or secure design. This is unfortunate and is the result of a variety of factors, not least that the university computer science programs that train developers rarely teach secure coding topics. Many software development projects also treat security as an afterthought – limiting their focus to security features like encryption rather than secure coding fundamentals such as input validation, output encoding, and so on.
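Those two fundamentals are cheap to demonstrate. A minimal sketch (the username rule is an arbitrary example policy, not something from the article):

```python
import html
import re

# Input validation: an allow-list beats trying to enumerate bad input.
USERNAME_RE = re.compile(r"[A-Za-z0-9_]{3,20}")

def validate_username(name: str) -> str:
    """Reject anything that falls outside the strict allow-list."""
    if not USERNAME_RE.fullmatch(name):
        raise ValueError("invalid username")
    return name

def render_greeting(name: str) -> str:
    """Output encoding: escape user data for the context it is emitted
    into (here HTML), so markup in the input cannot execute."""
    return "<p>Hello, " + html.escape(name) + "!</p>"
```

For example, `render_greeting("<script>alert(1)</script>")` emits the tags as inert `&lt;script&gt;` text rather than runnable script – the encoding step does the work even if validation is bypassed.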
Tomi Engdahl says:
The decline is over – the PC will not die after all
PC sales will fall by eight per cent this year. The good news for hardware manufacturers in Gartner’s forecast is that this year will go down in history as the worst in terms of sales; next year, sales are set to recover, albeit very slowly.
Gartner has long forecast sales figures for the different categories of computing devices.
PC sales are no longer shrinking, and the mobile phone market is no longer growing.
Still, this is a huge market: all told, between 2.3 and 2.4 billion computing devices – smartphones included in this count – will be sold this year.
Sales of traditional PCs, i.e. desktops and laptops, will continue to slow, from 216 million units this year to 199 million in 2018. At the same time, sales of new ultra-portable devices are growing slowly (49 million now, expected to reach 75 million in two years).
Sales of ultra-mobile devices, including tablets, are no longer growing on an annual basis and will stay below 200 million units sold.
Source: http://etn.fi/index.php?option=com_content&view=article&id=5176:lasku-ohi-pc-ei-kuolekaan&catid=13&Itemid=101
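The Gartner figures quoted above imply only a gentle slide for traditional PCs. A quick check of the implied average annual decline (the two-year horizon to 2018 is taken from those figures):

```python
# 216M traditional PCs this year, 199M forecast for 2018 (two years out).
units_now, units_2018, years = 216e6, 199e6, 2
annual_change = (units_2018 / units_now) ** (1 / years) - 1
print(f"{annual_change:.1%}")  # about -4% per year
```

A roughly 4% annual decline squares with the “slow down, not die” thesis.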
Tomi Engdahl says:
Adi Robertson / The Verge:
PlayStation VR review: a decently priced and comfortable headset with some good launch titles, but held back by the outdated and imprecise Move controllers
PlayStation VR review: When good enough is great
http://www.theverge.com/2016/10/5/13167954/playstation-vr-review-ps4-psvr-virtual-reality-headset-controllers
This was supposed to be the year virtual reality broke out. The Oculus Rift and HTC Vive, the first two high-end consumer devices on the market, arrived this spring to critical praise and preorders that sold out within minutes. Then… they plateaued. Despite some great experiences, months of near-total unavailability dulled the post-release buzz for both headsets, particularly the Rift. Neither the Rift nor the Vive ecosystem produced a killer app that was big enough to push VR out of the margins, especially given the high cost of a headset and gaming PC. While 360-degree video has at least gotten a toehold in popular culture, the dream of sophisticated VR gaming — which arguably resurrected virtual reality in the first place — remains far away for most people.
But there are three months left in the year, and one thing that could change that: PlayStation VR.
PlayStation VR is Sony’s attempt at bringing virtual reality to its PlayStation 4 console, starting next week. Arriving right in time for the holidays, it’s being positioned as a (relatively) cheap, unintimidating gaming headset, designed for a device that might already be sitting in your living room. The Rift and Vive had to be judged on a sort of abstract scale of quality
Tomi Engdahl says:
Breaking compression, one year at a time
DIY is hard
http://www.theregister.co.uk/2016/10/07/breaking_compression_one_year_at_a_time/
Computers physically last a lot longer than vendors would like. The idea of the three-year refresh cycle is considered sacred amongst a certain crowd, but when pressed most will admit that refreshes of that nature are exceptionally rare. While we can keep equipment running for a decade or beyond, there are hidden issues in doing so of which we should all be aware.
Data centres are organic; equipment is added and removed as needed. For the small business it isn’t odd to see units that are five, six or even 10 years old. Even enterprises will often have an “if it ain’t broke, don’t fix it” policy for various workloads. This can lead to some truly astonishing finds when picking through inventory reports.
Most sysadmins who operate in a post-refresh economy will be able to quote some well-known headline issues with keeping old gear around.
Storage arrays are often the most tempting devices to push past refresh.
The storage wars changed some of this. Oh, the big storage vendors still treat their customers poorly, but there are plenty of alternatives to those out-of-date fossils. Software that turns whitebox servers into storage superheroes now comes in many flavours, and a lot of the nanoNAS vendors have stepped up to provide resilient storage for the mid market.
Software vendors have focused on making it even easier to scale up and scale out whitebox solutions. RAID cards are no longer welcome: the simpler the HBA the better. This has led to fewer hardware restrictions, only to see sweating your IT assets run up against software barriers.
Seven years ago the 200GB “operating system drive” of that 16TB VM would have been considered a large VM in its own right. Today, I don’t even think about that 200GB disk when talking about the file server; 200GB virtual drives are almost functionally an irrelevance.
In an effort to stave off the purchase of those 4TB drives, software compression was enabled a few years ago. The archival servers do nothing all day, and their CPUs go unutilized. Compression on cold storage wasn’t thought to be a problem.
It turns out that you really shouldn’t enable compression on folders in which you intend to dump gigantic VMs. It leads to interesting scenarios where, for example, attempting to copy over files simply stops at some arbitrary point and neither fails nor proceeds.
We could have this same conversation about weirdness regarding deduplication
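One way to avoid that trap is to check how compressible the data actually is before flipping the switch on a whole folder. A hedged sketch (the chunk-sampling approach and thresholds are my own, not from the article):

```python
import os
import zlib

def sample_compressibility(path, chunk_size=1 << 20, samples=8):
    """Estimate a large file's compression ratio by compressing a few
    evenly spaced chunks instead of reading the whole VM image."""
    size = os.path.getsize(path)
    if size == 0:
        return 1.0
    ratios = []
    step = max(size // samples, 1)
    with open(path, "rb") as f:
        for offset in range(0, size, step):
            f.seek(offset)
            chunk = f.read(chunk_size)
            if not chunk:
                break
            # Ratio of compressed to raw size for this chunk.
            ratios.append(len(zlib.compress(chunk, 6)) / len(chunk))
    return sum(ratios) / len(ratios)
```

A ratio near 1.0 means filesystem compression buys you nothing on that data and only adds the failure modes described above; already-compressed VM disks usually land there.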
Tomi Engdahl says:
A leader’s digital expertise brings significant business and financial benefits to a company.
This is shown by a fresh study from the research institute Oxford Economics.
Today, only one company in five can say it has a so-called digital leader at the helm.
A leader’s digital skills should bring clear advantages:
1. Better financial performance.
2. Satisfied and committed employees.
3. A more open culture and a richer breeding ground for new leaders.
4. A simpler decision-making culture.
5. A more diverse organization, with more female employees.
6. A better attitude toward young leaders.
“It is clear that success in digital business requires a new type of leadership. Young workers expect from their employers a more inclusive and more social style of leadership, diverse management teams and less hierarchy.”
Source: http://www.tivi.fi/Kaikki_uutiset/johtaja-panosta-digiosaamiseen-naista-syista-se-kannattaa-6588580
Tomi Engdahl says:
Device-as-a-Service to make life simple? Nope
Still, where’s there mystery there’s margin and PC vendors need that desperately
http://www.channelregister.co.uk/2016/10/07/deviceasaservice_to_make_life_simple_nope/
There isn’t much mystery in PCs these days – and not much margin either – but that could all be about to change if HP Inc’s prophecy on the bewildering range of payment options for device services proves right.
The firm rolled out a “from birth to burial” device-as-a-service package for customers in the summer
“You put one machine into a customer and depending on what they are doing, they use more or less commute power… the business model changes, and now you could offer device-as-a-service based on the complexity of the task,”
With a full estate of PCs managed contractually in this way, the whole situation could “become even more complex”, and where there’s complexity there is some mystery, and where there’s mystery….
The HP exec was challenged to describe DaaS in fewer than 10 words
Tomi Engdahl says:
Tech Billionaires Are Asking Scientists For Help To Break Humans Out of Computer Simulation
https://science.slashdot.org/story/16/10/06/1352205/tech-billionaires-are-asking-scientists-for-help-to-break-humans-out-of-computer-simulation
Many believe that we live in a computer simulation. But it takes a billionaire and his money to ask scientists to help break us out of the simulation.
Tech billionaires are asking scientists for help breaking humans out of the computer simulation
http://nordic.businessinsider.com/tech-billionaires-want-to-break-humans-out-of-a-computer-simulation-2016-10?r=US&IR=T
The theory that we might all be living in a computer simulation has gotten so popular among Silicon Valley’s tech elites that two billionaires are now apparently pouring money into breaking us out of the simulation.
Here’s how The New Yorker tells it:
Many people in Silicon Valley have become obsessed with the simulation hypothesis, the argument that what we experience as reality is in fact fabricated in a computer; two tech billionaires have gone so far as to secretly engage scientists to work on breaking us out of the simulation.