Computer technology trends for 2016

The PC market seems to be stabilizing in 2016; I expect it to shrink slightly. While mobile devices have been named as culprits for the fall in PC shipments, IDC said that other factors may be in play. It is still pretty hard to make any decent profit building PC hardware unless you are one of the biggest players – so again Lenovo, HP, and Dell are increasing their collective dominance of the PC market, as they did in 2015. I expect changes like spin-offs and maybe some mergers involving smaller players like Fujitsu, Toshiba and Sony. The EMEA server market looks to be a two-horse race between Hewlett Packard Enterprise and Dell, according to Gartner. HPE, Dell and Cisco “all benefited” from Lenovo’s acquisition of IBM’s EMEA x86 server organisation.

The tablet market is no longer a high-growth market – tablet sales have started to decline, and the decline will continue in 2016 as owners hold onto their existing devices for more than three years. iPad sales are set to continue declining, and the iPad Air 3, due in the first half of 2016, will not change that. IDC predicts that the detachable tablet market is set for growth in 2016 as more people turn to hybrid devices. Two-in-one tablets have been popularized by offerings like the Microsoft Surface, with options ranging dramatically in price and specs. I am not myself convinced that the growth will be as strong as IDC forecasts, even though companies have started to purchase tablets for workers in jobs such as retail sales or field work (Apple iPads, Windows and Android tablets managed by the company). Combined volume shipments of PCs, tablets and smartphones are expected to increase only in the single digits.

All your consumer tech gear should be cheaper come July, as there will be fewer import tariffs on IT products: a World Trade Organization (WTO) deal agrees that tariffs on imports of consumer electronics will be phased out over seven years starting in July 2016. The agreement affects around 10 percent of world trade in information and communications technology products and will eliminate around $50 billion in tariffs annually.


In 2015 storage was rocked to its foundations, and those new innovations will be taken into wider use in 2016. The storage market in 2015 went through strategic, foundation-shaking turmoil as the external shared disk array storage playbook was torn to shreds: the all-flash data centre idea has definitely taken off as an achievable vision, with primary data stored in flash and the rest held in cheap-and-deep storage. Flash drives generally solve the disk drive latency access problem, so there is not so much need for hybrid drives. There is conviction that storage should be located as close to servers as possible (virtual SANs, hyper-converged industry appliances and NVMe fabrics). The existing hybrid cloud concept was adopted and supported by everybody. Flash started out in 2-bits/cell MLC form, which rapidly became standard, and TLC (3-bits/cell, or triple-level cell) started appearing. Industry-standard NVMe drivers for PCIe flash cards appeared. Intel and Micron blew non-volatile memory preconceptions out of the water in the second half of the year with their joint 3D XPoint memory announcement. Boring old disk tech got shingled magnetic recording (SMR) and helium-filled drive technology; the drive industry is focused on capacity-optimizing its drives. We got key:value store disk drives with an Ethernet NIC on board, and basic GET and PUT object storage facilities came into being. The tape industry developed a 15TB LTO-7 format.

The use of SSDs will increase and their price will drop. SSDs were in more than 25% of new laptops sold in 2015. SSDs are expected to be in 31% of new consumer laptops in 2016 and more than 40% by 2017. The prices of mainstream consumer SSDs have fallen dramatically every year over the past three years, while HDD prices have not changed much. SSD prices will decline to 24 cents per gigabyte in 2016. In 2017 they are expected to drop to 11-17 cents per gigabyte (meaning a 1TB SSD on average would retail for $170 or less).
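As a sanity check on that arithmetic, the projected 2017 per-gigabyte price can be worked out in a throwaway Python snippet (my own illustration, not from any source):

```python
# At the upper end of the 2017 forecast, 17 cents per gigabyte,
# a 1 TB (here taken as 1000 GB) SSD would retail for about $170.
price_cents_per_gb = 17
capacity_gb = 1000
price_usd = price_cents_per_gb * capacity_gb // 100  # integer cents avoid float noise
print(price_usd)  # 170
```

At the lower 11-cent figure the same drive would be about $110, which is why the article says "$170 or less".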

Hard disk sales will decrease, but the technology is not dead. Sales of hard disk drives have been decreasing for several years now (118 million units in the third quarter of 2015), but according to Seagate, hard disk drives (HDDs) are set to stay relevant for at least 15 to 20 years. HDDs remain the most popular data storage technology because they are the cheapest in per-gigabyte terms. While SSDs are generally getting more affordable, high-capacity solid-state drives are not going to become as inexpensive as hard drives any time soon.

Because all-flash storage systems with homogeneous flash media are still too expensive to serve as a solution for every enterprise application workload, enterprises will increasingly turn to performance-optimized storage solutions that use a combination of multiple media types to deliver cost-effective performance. The speed advantage of Fibre Channel over Ethernet has evaporated. Enterprises will also start to seek alternatives to snapshots that are simpler and easier to manage, and that allow data and application recovery to a second before the data error or logical corruption occurred.

Local storage and the cloud finally make peace in 2016, as decision-makers across the industry have now acknowledged the potential for enterprise storage and the cloud to work in tandem. Over 40 percent of data worldwide is expected to live on or move through the cloud by 2020, according to IDC.


Open standards for data center development are now a reality thanks to advances in cloud technology. Facebook’s Open Compute Project has served as the industry’s leader in this regard. This allows more consolidation for those that want it. Consolidation used to refer to companies moving all of their infrastructure to the same facility. However, some experts have begun to question this strategy, as the rapid increase in data quantities and apps in the data center has made centralized facilities more difficult to operate than ever before. Server virtualization, more powerful servers and an increasing number of enterprise applications will continue to drive higher IO requirements in the datacenter.

Cloud consolidation starts in earnest in 2016: the number of options for general infrastructure-as-a-service (IaaS) cloud services and cloud management software will be much smaller at the end of 2016 than at the beginning. The major public cloud providers will gain strength, with Amazon, IBM SoftLayer, and Microsoft capturing a greater share of the business cloud services market. Lock-in is a real concern for cloud users, because PaaS players have the age-old imperative to find ways to tie customers to their platforms and aren’t afraid to use them, so advanced users want to establish reliable portability across PaaS products in a multi-vendor, multi-cloud environment.

Year 2016 will be harder for legacy IT providers than 2015. In its report, IDC states that “By 2020, More than 30 percent of the IT Vendors Will Not Exist as We Know Them Today.” Many enterprises are turning away from traditional vendors and toward cloud providers. They’re increasingly leveraging open source. In short, they’re becoming software companies. The best companies will build cultures of performance and doing the right thing – and will make data and the processes around it self-service for all their employees. Design Thinking will guide companies that want to change the lives of their customers and employees. 2016 will see a lot more work in trying to manage services that simply aren’t designed to work together or even be managed – for example, getting Whatever-as-a-Service cloud systems to play nicely with existing legacy systems. So competent developers are the scarce commodity. Some companies are starting to see the cloud as a form of outsourcing that is fast burning up in-house IT ops jobs, with varying success.

There are still too many old-fashioned companies that just can’t understand what digitalization will mean to their business. In 2016, some companies’ boards still think the web is just for brochures and porn and don’t believe their business models can be disrupted. It gets worse for many traditional companies. For example, Amazon is a retailer both on the web and, increasingly, for things like food deliveries. Amazon and others are playing to win. Digital disruption has happened and will continue.

Windows 10 gains more ground in 2016. If 2015 was a year of revolution, 2016 promises to be a year of consolidation for Microsoft’s operating system. I expect Windows 10 adoption in companies to start in 2016. Windows 10 is likely to be a success in the enterprise, but I expect that word from heavyweights like Gartner, Forrester and Spiceworks, suggesting that half of enterprise users plan to switch to Windows 10 in 2016, is more than a bit optimistic. Windows 10 will also be used in China, as Microsoft played the game better with it than with Windows 8, which was banned in China.

Windows is now delivered “as a service”, meaning incremental updates with new features as well as security patches, but Microsoft still seems to work internally to a schedule of milestone releases. Next up is Redstone, rumoured to arrive around the anniversary of Windows 10, midway through 2016. Windows servers will also get an update: 2016 should include the release of Windows Server 2016. Server 2016 includes updates to the Hyper-V virtualisation platform, support for Docker-style containers, and a new cut-down edition called Nano Server.

Windows 10 will get some of the features promised but not delivered in 2015. Windows 10 was promised to come to PCs and mobile devices in 2015 to deliver a unified user experience. Continuum is a new, adaptive user experience offered in Windows 10 that optimizes the look and behavior of apps and the Windows shell for the physical form factor and the customer’s usage preferences. The promise was the same unified interface for PCs, tablets and smartphones – but in 2015 it was delivered only for PCs and some tablets. Mobile Windows 10 for smartphones is expected to finally arrive in 2016 – the release may be the last roll of the dice for Microsoft’s struggling mobile platform. Microsoft’s Plan A is to get as many apps and as much activity as it can on Windows on all form factors with the Universal Windows Platform (UWP), which enables the same Windows 10 code to run on phone and desktop. Despite a steady inflow of new well-known apps, it remains unclear whether the Universal Windows Platform can maintain momentum with developers. Can Microsoft keep the developer momentum going? I am not sure. In addition there are also plans for tools for porting iOS apps and an Android runtime, so expect delivery of some or all of the Windows Bridges (iOS, web app, desktop app, Android) announced at the April 2015 Build conference, in the hope of getting more apps into the unified Windows 10 app store. Windows 10 does hold out some promise for Windows Phone, but it’s not going to make an enormous difference. Losing the battle for the web and mobile computing is a brutal loss for Microsoft. When you consider the size of those two markets combined, the desktop market seems like a stagnant backwater.

Older Windows versions will not die in 2016 as fast as Microsoft and security people would like. Expect Windows 7 diehards to continue holding out in 2016 and beyond. And there are still many companies that run their critical systems on Windows XP, as “there are some people who don’t have an option to change.” Many times the OS is running in automation and process control systems that run business- and mission-critical systems, in both private-sector and government enterprises. For example, the US Navy is using the obsolete operating system Microsoft Windows XP to run critical tasks. It all comes down to money and resources, but being obliged to keep something running on an obsolete system is completely the wrong approach to information security.


Virtual reality has grown immensely over the past few years, but 2016 looks like the most important year yet: it will be the first time that consumers can get their hands on a number of powerful headsets for viewing alternate realities in immersive 3-D. Virtual reality will move toward the mainstream as Sony, Samsung and Oculus bring consumer products to the market in 2016. The whole virtual reality hype could be rebooted as early builds of the final Oculus Rift hardware start shipping to devs. Maybe HTC‘s and Valve‘s Vive VR headset will suffer in the next few months. Expect a banner year for virtual reality.

GPU and FPGA acceleration will be widely used in high-performance computing. Both Intel and AMD have products with CPU and GPU on the same chip, and there is software support for using the GPU (learn CUDA and/or OpenCL). Many mobile processors also have CPU and GPU on the same chip. FPGAs are circuits that can be baked into a specific application but can also be reprogrammed later. There was lots of interest in 2015 in using FPGAs to accelerate computations as the next step after GPUs, and I expect that interest to grow even more in 2016. FPGAs are not quite as efficient as a dedicated ASIC, but they are about as close as you can get without translating the actual source code directly into a circuit. Intel bought Altera (a big FPGA company) in 2015 and plans to begin selling products with a Xeon chip and an Altera FPGA in a single package, possibly available in early 2016.

Artificial intelligence, machine learning and deep learning will be talked about a lot in 2016. Neural networks, which have been academic exercises (but little more) for decades, are increasingly becoming mainstream success stories: heavy (and growing) investment in the technology – which enables the identification of objects in still and video images, words in audio streams, and the like, after an initial training phase – comes from the formidable likes of Amazon, Baidu, Facebook, Google, Microsoft, and others. So-called “deep learning” has been enabled by the combination of the evolution of traditional neural network techniques, the steadily increasing processing “muscle” of CPUs (aided by algorithm acceleration via FPGAs, GPUs, and, more recently, dedicated co-processors), and the steadily decreasing cost of system memory and storage. There were many interesting releases in this area at the end of 2015: Facebook Inc. released portions of its Torch software, while Alphabet Inc.’s Google division open-sourced parts of its TensorFlow system. IBM also turned up the heat under competition in artificial intelligence, making SystemML freely available to share and modify through the Apache Software Foundation. So I expect that 2016 will be the year these are tried in practice, and that deep learning will be hot at CES 2016. Several respected scientists issued a letter warning about the dangers of artificial intelligence (AI) in 2015, but I don’t worry about a rogue AI exterminating mankind. I worry about an inadequate AI being given control over things that it’s not ready for. How will machine learning affect your business? MIT has a good free intro to AI and ML.
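For readers wondering what that "initial training phase" actually looks like, here is a deliberately tiny sketch: a single artificial neuron learning the logical AND function in pure Python. Frameworks like Torch and TensorFlow stack millions of such units and train them on GPUs, but the principle is the same; all names and numbers below are my own illustration.

```python
import math

def sigmoid(x):
    """Squashing activation function used by classic neural networks."""
    return 1.0 / (1.0 + math.exp(-x))

def train(samples, epochs=10000, lr=0.5):
    """Fit one neuron (two weights + a bias) by gradient descent."""
    w1, w2, b = 0.0, 0.0, 0.0
    for _ in range(epochs):
        for x1, x2, target in samples:
            out = sigmoid(w1 * x1 + w2 * x2 + b)
            grad = (target - out) * out * (1 - out)  # squared-error gradient
            w1 += lr * grad * x1
            w2 += lr * grad * x2
            b += lr * grad
    return w1, w2, b

# The "training set": the truth table of logical AND.
AND = [(0, 0, 0), (0, 1, 0), (1, 0, 0), (1, 1, 1)]
w1, w2, b = train(AND)
for x1, x2, target in AND:
    print(x1, x2, round(sigmoid(w1 * x1 + w2 * x2 + b)))
```

After training, the rounded outputs reproduce the AND table; the "deep" in deep learning comes from chaining many layers of such neurons.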

Computers, which excel at big data analysis, can help doctors deliver more personalized care. Can machines outperform doctors? Not yet. But in some areas of medicine, they can make the care doctors deliver better. Humans repeatedly fail where computers — or humans behaving a little bit more like computers — can help. Computers excel at searching and combining vastly more data than a human so algorithms can be put to good use in certain areas of medicine. There are also things that can slow down development in 2016: To many patients, the very idea of receiving a medical diagnosis or treatment from a machine is probably off-putting.

The Internet of Things (IoT) was talked about a lot in 2015, and it will be a hot topic for IT departments in 2016 as well. Many companies will notice that security issues are important in it. The newest wearable technology – smart watches and other smart devices – responds to voice commands and interprets the data we produce; it learns from its users and generates appropriate responses in real time. Interest in the Internet of Things will also bring interest in real-time business systems: not only real-time analytics, but real-time everything. This will start in earnest in 2016, but the trend will take years to play out.

Connectivity and networking will be hot, and it is not just about IoT. CES will focus on how connectivity is proliferating everything from cars to homes, realigning diverse markets. The interest will affect job markets: network jobs are hot, and salaries are expected to rise in 2016, as wireless network engineers, network admins, and network security pros can expect above-average pay gains.

Linux will stay big in the network server market in 2016. The web server marketplace is one arena where Linux has had the greatest impact. Today, the majority of web servers are Linux boxes, including most of the world’s busiest sites. Linux will also run many parts of the Internet infrastructure that moves the bits from server to user. Linux will also continue to rule the smartphone market, being at the core of Android. New IoT solutions will most likely be built mainly using Linux in many parts of the system.

Microsoft and Linux are not such enemies as they were a few years ago. Common sense says that Microsoft and the FOSS movement should be perpetual enemies, but it looks like Microsoft is waking up to the fact that Linux is here to stay. Microsoft cannot feasibly wipe it out, so it has to embrace it. Microsoft is already partnering with Linux companies to bring popular distros to its Azure platform. In fact, Microsoft has even gone so far as to create its own Linux distro for its Azure data center.


Web browsers are becoming more and more 64-bit, as Firefox started the 64-bit era on Windows and Google is killing Chrome for 32-bit Linux. At the same time web browsers are losing old legacy features like NPAPI and Silverlight. Who will miss them? The venerable NPAPI plugin standard, which dates back to the days of Netscape, is now showing its age, causing more problems than it solves, and will see native support removed from Firefox by the end of 2016. It was already removed from Google Chrome with very little impact. The biggest issue was the lack of support for Microsoft’s Silverlight, which brought down several top streaming media sites – but they are actively switching to HTML5 in 2016. I don’t miss Silverlight. Flash will continue to be available owing to its popularity for web video.

SHA-1 will be at least partially retired in 2016. Due to recent research showing that SHA-1 is weaker than previously believed, Mozilla, Microsoft and now Google are all considering bringing the deadline forward by six months, to July 1, 2016.
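For application code (as opposed to certificates), migrating off SHA-1 is often just a constructor swap. Python's standard hashlib module, for example, exposes both the retiring algorithm and its common replacement:

```python
import hashlib

data = b"abc"
# 160-bit SHA-1 digest: the algorithm being phased out.
print(hashlib.sha1(data).hexdigest())
# 256-bit SHA-256 digest: the commonly recommended replacement.
print(hashlib.sha256(data).hexdigest())
```

The hard part of the retirement is not the code change but tracking down every certificate, signature and stored checksum that still assumes SHA-1.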

Adobe’s Flash has been under attack from many quarters over security as well as for slowing down web pages. If you wish that Flash would finally die in 2016, you might be disappointed. Adobe seems to be trying to kill the name with a rebranding trick: Adobe Flash Professional CC is now Adobe Animate CC. In practice it probably does not mean much, but Adobe seems to acknowledge the inevitability of an HTML5 world. Adobe wants to remain a leader in interactive tools, and the pivot to HTML5 requires new messaging.

The trend of trying to use the same language and tools on both the user end and the server back-end continues. Microsoft is pushing its .NET and Azure cloud platform tools. Amazon, Google and IBM have their own sets of tools. Java is in decline. JavaScript is going strong on both the web browser and the server end with Node.js, React and many other JavaScript libraries. Apple is also trying to bend its Swift programming language, now used mainly to make iOS applications, to run on servers with project Perfect.

Java will still stick around, but its decline as a language will accelerate, as new stuff isn’t being written in Java even if it runs on the JVM. We will not see Java 9 in 2016, as Oracle has delayed its release by six months. The Register tells us that Java 9 is delayed until Thursday March 23rd, 2017, just after tea-time.

Containers will rule the world as Docker continues to develop, gain security features, and add various forms of governance. Until now Docker has been tire-kicking, used in production by the early-adopter crowd only, but that can change as vendors start to claim that they can do proper management of big data and container farms.

NoSQL databases will take hold as they are called “highly scalable” or “cloud-ready.” Expect 2016 to be the year when a lot of big brick-and-mortar companies publicly adopt NoSQL for critical operations. Basically NoSQL can be seen as a key:value store, and this idea has also expanded to storage systems: we got key:value store disk drives with an Ethernet NIC on board, and basic GET and PUT object storage facilities came into being.
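The key:value idea itself is simple enough to sketch in a few lines. This toy in-memory store (class and method names are my own illustration, not any vendor's API) shows the basic GET and PUT verbs that both NoSQL databases and the Ethernet-attached drives mentioned above build on:

```python
class KVStore:
    """Minimal in-memory key:value store with PUT/GET semantics."""

    def __init__(self):
        self._data = {}

    def put(self, key, value):
        # PUT: store the value under the key, overwriting any old value.
        self._data[key] = value

    def get(self, key, default=None):
        # GET: fetch the value for a key, or a default if absent.
        return self._data.get(key, default)

store = KVStore()
store.put("user:42", {"name": "Alice"})
print(store.get("user:42"))
```

Real systems add persistence, replication and sharding on top, but the interface stays this small, which is exactly what makes key:value stores "highly scalable".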

In the database world Big Data will still be big, but it needs to be analyzed in real time. A typical big data project usually involves some semi-structured data, a bit of unstructured data (such as email), and a whole lot of structured data (stuff stored in an RDBMS). While the cost of Hadoop on a per-node basis is pretty inconsequential, the cost of understanding all of the schemas, getting them into Hadoop, and structuring them well enough to perform the analytics is still considerable. Remember that you’re not “moving” to Hadoop, you’re adding a downstream repository, so you need to worry about systems integration and latency issues. Apache Spark will also attract interest, as Spark’s multi-stage in-memory primitives provide more performance for certain applications. Big data brings with it responsibility – digital consumer confidence must be earned.
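The map/shuffle/reduce pattern that Hadoop and Spark implement at cluster scale can be sketched in plain Python over an in-memory list of lines. The classic word-count illustrates it (a sketch of the pattern only, not Hadoop's or Spark's actual API):

```python
from collections import defaultdict

def word_count(lines):
    # Map: emit a (word, 1) pair for every word in every line.
    mapped = [(word, 1) for line in lines for word in line.split()]
    # Shuffle: group the pairs by key (word).
    groups = defaultdict(list)
    for word, count in mapped:
        groups[word].append(count)
    # Reduce: sum the counts for each word.
    return {word: sum(counts) for word, counts in groups.items()}

print(word_count(["big data", "big deal"]))
```

Hadoop distributes each of these three phases over many machines and spills to disk, while Spark keeps the intermediate results in memory, which is where its performance edge for iterative workloads comes from.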

IT security continues to be a huge issue in 2016. You might be able to achieve adequate security against hackers and internal threats, but every attempt to make systems idiot-proof just means the idiots get upgraded. Firms are ever more connected to each other and to the general outside world. So in 2016 we will see even more service firms accidentally leaking critical information, and a lot more firms having their reputations scorched by incompetence-fuelled security screw-ups. Good security people are needed more and more – a joke doing the rounds of IT execs doing interviews is “if you’re a decent security bod, why do you need to look for a job?”

There will still be unexpected single points of failure in big distributed networked systems. The cloud behind the silver lining is that Amazon or any other cloud vendor can be as fault-tolerant, distributed and well supported as you like, but if a service like Akamai or Cloudflare were to die, you would still stop. That’s not a single point of failure in the classical sense, but it’s really hard to manage unless you go for full cloud agnosticism – which is costly. This is hard to justify when the failure rate is so low, so the irony is that the reliability of the content delivery networks means fewer businesses work out what to do if they fail. Oh, and no one seems to test their mission-critical data centre properly, because it’s mission-critical. So they just over-specify where they can and cross their fingers (= pay twice and get half the coverage for other vulnerabilities).

For IT start-ups it seems that Silicon Valley’s cash party is coming to an end. Silicon Valley is cooling, not crashing. Valuations are falling. The era of cheap money could be over and valuation expectations are re-calibrating down. The cheap capital party is over. It could mean trouble for weaker startups.

 

933 Comments

  1. Tomi Engdahl says:

    Almost Two-Thirds of Software Companies Contributing To Open Source, Says Survey
    https://news.slashdot.org/story/16/04/28/168248/almost-two-thirds-of-software-companies-contributing-to-open-source-says-survey

    Open source’s march toward preeminence in business software continued over the past year, according to a survey released by open source management provider Black Duck Software and venture capital firm North Bridge. Roughly two-thirds of respondents to the survey — which was administered online and drew 1,300 respondents — said that their companies encouraged developers to contribute to open-source projects, and a similar proportion said that they were actively engaged in doing so already. That’s a 5% increase from the previous year’s survey.

    Almost two-thirds of software companies contributing to open source
    http://www.networkworld.com/article/3062031/open-source-tools/almost-two-thirds-of-software-companies-contributing-to-open-source.html

    Open source’s march toward preeminence in business software continued over the past year, according to a survey released today by open source management provider Black Duck Software and venture capital firm North Bridge.

    “Open source today is unequivocally the engine of innovation,” he said in a statement. “[W]hether that’s powering technology like operating systems, cloud, big data or IoT, or powering a new generation of open source companies delivering compelling solutions to the market.”

    About 60 percent of respondents said that open-source participation is a competitive advantage, and roughly a third said they had full-time resources dedicated to open-source projects, the survey found. Increasingly, open-source software is viewed as highly competitive on features, in addition to having lower total cost of ownership and the ability to avoid vendor lock-in.

    The Tenth Annual Future of Open Source Survey
    https://www.blackducksoftware.com/2016-future-of-open-source

    Open source viewed as today’s preeminent architecture and an engine for innovation, but significant challenges remain in open source security and management practices

    Open Source Participation on the Rise

    The survey revealed an active corporate open source community that spurs innovation, delivers exponential value and shares camaraderie:

    67 percent of respondents report actively encouraging developers to engage in and contribute to open source projects.
    65 percent of companies are contributing to open source projects.
    One in three companies have a full-time resource dedicated to open source projects.
    59 percent of respondents participate in open source projects to gain competitive edge.

  2. Tomi Engdahl says:

    The death of Intel’s Atom casts a dark shadow over the rumored Surface Phone
    http://www.pcworld.com/article/3063672/windows/the-death-of-intels-atom-casts-a-dark-shadow-over-the-rumored-surface-phone.html

    Microsoft still reportedly remains committed to Windows 10 Mobile running on ARM chips. And what about the HoloLens?

    Intel’s plans to discontinue its Atom chips for phones and some tablets may not have killed the dream of a Microsoft Surface phone—just the piece of it that made it so enticing.

    In the wake of a restructuring that relegated the PC to just another connected device, Intel confirmed Friday that it has cancelled its upcoming SoFIA and Broxton chips. That leaves Intel with just one Atom chip, Apollo Lake, which it had slated for convertible tablets.

    Microsoft has never formally commented on its future phone plans, save for a leaked email that suggests that Microsoft is committed to the Windows 10 Mobile platform and phones running ARM processors. But fans of the platform have long hoped for a phone that could run native Win32 legacy apps as well as the new UWP platform that Microsoft has made a central platform of Windows 10. The assumption was that this would require a phone running on an Intel Atom processor. Intel’s decision eliminates that option.

    Intel gives up on the smartphone

    Intel’s decision was first reported by analyst Patrick Moorhead, and confirmed by IDG News Service and PCWorld. Intel told PCWorld that it plans to kill the “Broxton” Atom platform as well as all the flavors of its SoFIA chips, which combined Atom cores with 3G and LTE modems for smartphones. The company said it will continue to support tablets with a 3G derivative of the SoFIA chip, the older Bay Trail and Cherry Trail, as well as some upcoming Core chips.

    Microsoft uses a Cherry Trail chip inside of the HoloLens, but it’s unclear whether or not that will be affected.

    Intel cuts Atom chips, basically giving up on the smartphone and tablet markets
    Intel is refocusing on ‘products that deliver higher returns.’
    http://www.pcworld.com/article/3063508/components/intel-is-on-the-verge-of-exiting-the-smartphone-and-tablet-markets-after-cutting-atom-chips.html

    Intel could be on the verge of exiting the market for smartphones and standalone tablets, wasting billions of dollars it spent trying to expand in those markets.

    The company is immediately canceling Atom chips, code-named Sofia and Broxton, for mobile devices, an Intel spokeswoman confirmed.

    These are the first products on the chopping block as part of Intel’s plan to reshape operations after announcing plans this month to cut 12,000 jobs.

    The news of the chip cuts was first reported by analyst Patrick Moorhead in an article on Forbes’ website.

  3. Tomi Engdahl says:

    Build the future of apps with Xamarin.
    Xamarin SDK is now fully available under the MIT license
    http://open.xamarin.com/

    Xamarin brings open source .NET to mobile development, enabling every developer to build truly native apps for any device in C# and F#. We’re excited for your contributions in continuing our mission to make it fast, easy, and fun to build great mobile apps.

  4. Tomi Engdahl says:

    GCC 6.1 has been released. Its most prominent new feature for developers is that the default C++ standard changes from C++98 to C++14.

    C++14, now used by default, is an iterated version of the C++11 standard, which was a huge step in the development of the C++ language. Developers can see this, among other things, in the convenient definition of variables by means of automatic type deduction.

    Source: http://www.tivi.fi/Vinkit/gnu-kaantajat-oppivat-uusia-c-temppuja-6546062

  5. Tomi Engdahl says:

    If you work on Seagate’s performance drives, time to find another job
    Lousy three months ends multi-year stream of profit
    http://www.theregister.co.uk/2016/04/30/seagate_q3_fy2016_first_loss/

    As expected, Seagate’s revenues totaled $2.6bn in the three months to April 1, the third quarter [PDF] in its fiscal 2016, down 21 per cent year-on-year.

    And as anticipated, it made a $21m loss – its first loss in many many quarters and way down from the year-ago’s $291m profit.

    “As we look forward, our strategic focus is unchanged”

    Hard drive revenue was $2.37bn (down 22 per cent from last year’s $3.1bn), while enterprise systems, flash and other tech brought in $224m (down 2.2 per cent year-on-year) in spite of a Dot Hill contribution.

  6. Tomi Engdahl says:

    Stephanie Condon / ZDNet:
    Michael Dell reveals new name for post-merger Dell-EMC to be Dell Technologies, client services business will be branded Dell and enterprise biz Dell EMC

    Michael Dell reveals new name for Dell-EMC
    http://www.zdnet.com/article/michael-dell-reveals-new-name-structure-for-dell-emc/

    After Dell completes its acquisition of EMC, the merged company will be called Dell Technologies, Michael Dell announced at EMC World in Las Vegas on Monday.

    The company’s new name is meant to “convey a sense of being a family of businesses and aligned capabilities,” Dell said. “As far as family names go, I’m kind of attached to Dell.”

    The Dell CEO laid out the structure of the new company, which will comprise several brands, including Dell, EMC Information Infrastructure, VMWare, Pivotal, RSA, and Virtustream.

    While other companies are “shrinking their way to success,” Dell and EMC are doing just the opposite, Dell said, positioning the merged company to provide “essential technology infrastructure for the next industrial revolution.”

    The new company’s client services business will be branded Dell, the CEO said, touting the brand equity of the Dell PC. Meanwhile, its combined enterprise business will be called Dell EMC.

    Reply
  7. Tomi Engdahl says:

    Cadence DSP Targets Neural Network Development
    http://www.eetimes.com/document.asp?doc_id=1329588&

    Neural networks—artificial intelligence processing systems inspired by the human brain—are a hot topic in technology, as large companies like Facebook, Google and Microsoft are developing them and putting them into use.

    Most neural network technology in place today runs on graphics processing units (GPUs) from Nvidia Corp. and others. EDA and intellectual property vendor Cadence Design Systems Inc. stepped into the fray on Monday (May 2), rolling out a new version of its Tensilica Vision processing core optimized specifically for vision/deep learning applications.

    “Everybody is spending a lot of time developing a lot of research and producing a lot of technology,” said Pulin Desai, director of product marketing for Cadence’s Imaging/Vision Group, in an interview with EE Times. “The market is very hot. Maybe it’s hot because everything is being run on GPUs.”

    Reply
  8. Tomi Engdahl says:

    Munich switched to Linux and drifted into problems

    Three years ago, the city of Munich made a bold move and switched its workers' machines from Windows to Linux. Now an internal report reveals that the new IT structure is a nightmare for administrators.

    According to the report, the problems seem to stem in part from the city machines that have had to be kept on the old Windows XP and Windows 2000 platforms.

    Munich's employees have 20,000 machines for which an operating system, LiMux, has been developed on top of Ubuntu. In addition, more than 4,000 employee machines continue to run Windows, because certain applications used by the city only work on Windows.

    Some of the problems are attributed to the aging software versions the city uses: LibreOffice 4.1.2 and the KDE 4 desktop.

    Earlier this year the city of Munich decided against removing Windows XP completely. The cost of this decision was estimated at more than $12,000 per XP user.

    The report for the City of Munich was produced by Accenture; its results can be used both against and in support of the use of an open source environment.

    Source: http://etn.fi/index.php?option=com_content&view=article&id=4373:m-nchen-vaihtoi-linuxiin-ja-ajautui-ongelmiin&catid=13&Itemid=101

    Reply
  9. Tomi Engdahl says:

    Ubuntu Founder Pledges No Back Doors in Linux
    http://www.eweek.com/enterprise-apps/ubuntu-founder-pledges-no-back-doors-in-linux.html

    VIDEO: Mark Shuttleworth, founder of Canonical and Ubuntu, discusses what might be coming in Ubuntu 16.10 later this year and why security is something he will never compromise.

    One thing that Ubuntu Linux users will also continue to rely on is the strong principled stance that Shuttleworth has on encryption. With the rapid growth of the Linux Foundation’s Let’s Encrypt free Secure Sockets Layer/Transport Layer Security (SSL/TLS) certificate platform this year, Shuttleworth noted that it’s a good idea to consider how that might work in an integrated way with Ubuntu.

    Overall, he said, the move to encryption as a universal expectation is really important.

    “We don’t do encryption to hide things; we do encryption so we can choose what to share,” Shuttleworth said. “That’s a profound choice we should all be able to make.”

    Shuttleworth emphasized that on the encryption debate, Canonical and Ubuntu are crystal clear.

    “We will never backdoor Ubuntu; we will never weaken encryption,” he said.

    Reply
  10. Tomi Engdahl says:

    Creators Of Siri Demo Their Next AI Assistant Viv, It’s Far More Open Platform
    https://apple.slashdot.org/story/16/05/09/1610207/creators-of-siri-demo-their-next-ai-assistant-viv-its-far-more-open-platform?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+Slashdot%2Fslashdot%2Fto+%28%28Title%29Slashdot+%28rdf%29%29

    A small company called Viv on Monday unveiled a “frictionless”, artificially intelligent software also called Viv, which understands complicated human queries and connects with other apps to get your work done more conveniently and efficiently. Viv was demonstrated live at TechCrunch’s Disrupt NY conference on Monday.

    The team behind Siri debuts its next-gen AI “Viv” at #TCDisrupt
    https://twitter.com/TechCrunch/status/729698387260383234

    Reply
  11. Tomi Engdahl says:

    Chris O’Brien / VentureBeat:
    Sales of PCs, laptops, and tablets fell 13% in Q1; reaching lowest point since 2011
    http://venturebeat.com/2016/05/09/sales-of-pcs-laptops-and-tablets-fell-13-in-q1-reaching-lowest-point-since-2011/

    The consumer electronics industry was dealt another punch in the face during the first three months of this year.

    According to the latest report from market research firm Canalys, shipments of PC devices (including desktops, notebooks, two-in-ones, and tablets) amounted to 101 million units in the first quarter of 2016. That represents a decline of 13 percent from the same period a year ago — the lowest volume since the second quarter of 2011.

    Apple is still the overall leader in device shipments, but the company saw its unit volume fall 17 percent to 14 million. That puts it just barely ahead of Lenovo (by 25,000 units), which was getting whacked hard in China.

    About the only cheery news was that sales of two-in-ones grew 13 percent.

    “The global PC market had a bad start to 2016, and it is difficult to see any bright spots for vendors in the coming quarters,”

    Reply
  12. Tomi Engdahl says:

    Spark: a replacement for Hadoop?

    Data scientists analyze big data – data that is, by definition, too large, too complex, or generated too fast to be examined with traditional business analytics software.

    You might already know Hadoop by name.
    This popular mass data analysis system originated from an idea inspired by Google's MapReduce algorithm and became an open source project of the Apache Foundation in 2007.

    The basic idea is to divide an oversized mass of data across the local mass storage of networked computers and to split the processing and analysis into tasks, so that each node in the network handles its own data independently and submits its interim results to be combined at the upper level, which returns the final results.
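
    The divide-process-combine idea above can be sketched in plain Python (a toy illustration of the MapReduce model, not Hadoop's actual API): each "node" maps its own chunk of data, interim results are grouped by key, and a reduce step combines them into the final answer.

```python
from collections import defaultdict

# Map phase: each "node" independently turns its local chunk of text
# into (word, 1) pairs.
def map_phase(chunk):
    return [(word, 1) for word in chunk.split()]

# Shuffle phase: group the interim results by key before reduction.
def shuffle(mapped_pairs):
    groups = defaultdict(list)
    for pairs in mapped_pairs:
        for word, count in pairs:
            groups[word].append(count)
    return groups

# Reduce phase: combine the interim results into the final answer.
def reduce_phase(groups):
    return {word: sum(counts) for word, counts in groups.items()}

chunks = ["big data big", "data big"]   # data split across two "nodes"
result = reduce_phase(shuffle([map_phase(c) for c in chunks]))
print(result)                           # {'big': 3, 'data': 2}
```

    In real Hadoop the map and reduce tasks run on separate machines and the shuffle moves data over the network; the structure of the computation is the same.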

    An entire ecosystem of software projects has developed around Hadoop. It is based on HDFS, the redundant, fault-tolerant Hadoop Distributed File System.
    In 2012 the Apache projects were joined by YARN, Yet Another Resource Negotiator, a resource manager.
    Around MapReduce a patchwork of competing and complementary subprojects has grown, such as the SQL-style data warehouse software Hive, the Pig scripting language, the machine learning library Mahout, a number of search engines and more.

    However, MapReduce has its limitations – it works well for batch processing, but it is not very suitable for iterative processing. You always have to go through all the data, and if the data changes, it all needs to be stored back to the nodes before new processing can be done.

    To counteract this, yet another software project, Spark, was born in 2009; it became a top-level Apache project in 2014. Spark (version 1.6 at the time of writing) keeps intermediate results in the main memory of the machines doing the calculation, eliminating slow mass storage operations. Its resilient distributed datasets (RDDs) carry their lineage – the chain of operations applied – along with the data, so their values can always be recalculated from the original input data stored on disk.
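
    The RDD idea – record the transformations rather than the results, so everything can be recomputed from the original input – can be illustrated with a toy class (a sketch of the concept only, not Spark's actual API):

```python
# A toy "RDD": it stores the original data plus the chain of
# transformations applied to it (its lineage), not the computed
# values, so any result can be recomputed from the original input.
class ToyRDD:
    def __init__(self, data, lineage=()):
        self.data = list(data)    # original input, kept on "disk"
        self.lineage = lineage    # transformations recorded lazily

    def map(self, fn):
        return ToyRDD(self.data, self.lineage + (("map", fn),))

    def filter(self, fn):
        return ToyRDD(self.data, self.lineage + (("filter", fn),))

    def collect(self):
        # Replay the lineage over the original data; this is what makes
        # lost in-memory partitions recoverable in the real Spark.
        values = self.data
        for op, fn in self.lineage:
            if op == "map":
                values = [fn(v) for v in values]
            else:
                values = [v for v in values if fn(v)]
        return values

rdd = ToyRDD(range(5)).map(lambda x: x * x).filter(lambda x: x > 3)
print(rdd.collect())   # [4, 9, 16]
```

    Note that `map` and `filter` build new ToyRDD objects without computing anything; work happens only at `collect`, mirroring Spark's lazy evaluation.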

    A number of add-ons have been developed for Spark; to the programmer they appear as libraries fitted on top of the Spark core. Spark SQL, as the name suggests, gives an SQL interface to mass data. Spark Streaming deals with data that is not already stored in files but arrives in one or more continuous flows from network-connected devices – the model of the Internet of Things, or of the monitoring and optimization of base stations in a large WLAN network. Spark GraphX handles graph database operations, and Spark MLlib is a collection of machine learning algorithms.

    Spark’s marketing factor is speed. Many Spark demonstrations promise that it is 10-100 times faster than Hadoop – but in real-life applications you usually don’t get that much of a speedup.

    A number of independent tests have been run to compare, for example, the leading Hadoop SQL interfaces: Hive, Spark SQL and Impala. No clear winner has been found; the results vary depending on the data, the usage patterns, the data volumes and the number of concurrent users.

    MapReduce’s home language is Java, but a Spark programmer can choose, in addition to Spark’s home language Scala, also Java or Python – and, since last summer, the statistical programming language R.

    Spark’s leader in commercialization is the company Databricks.
    Cloudera, Hortonworks, MapR and Pentaho have also become known as Spark promoters, and support is found from IBM and SAP as well.

    Even Spark is not necessarily the ultimate solution for mass data processing. Critics have pointed out that its real-time component is in fact not fully real-time.

    True real time is the claimed advantage of yet another Apache project, Flink (based on the Stratosphere software developed at the Berlin University of Technology since 2010).

    Source: http://www.tivi.fi/Kaikki_uutiset/korvaako-spark-hadoopin-6548817

    Reply
  13. Tomi Engdahl says:

    65 percent of companies say they will hire more open source programmers over the next six months than experts from other areas. According to this report, commissioned jointly by the Linux Foundation and the recruitment company Dice, open source experts are currently in huge demand.

    Last year, half of the companies said they would invest in hiring Linux talent. Now open source experts are sought even more, and at the top of the wish list are developers.

    Extensive know-how in cloud technologies is also in demand: 51 percent stressed expertise in OpenStack, CloudStack and other similar technologies.

    Source: http://etn.fi/index.php?option=com_content&view=article&id=4384:avoimen-koodin-ohjelmoijat-kovassa-nosteessa&catid=13&Itemid=101

    Reply
  14. Tomi Engdahl says:

    2016 Open Source Jobs Report
    Fifth annual report reveals that open source talent is in high demand
    http://go.linuxfoundation.org/download-2016-open-source-jobs-report

    Key findings:

    87% of hiring managers say it’s difficult to find open source talent, and 79% have even increased incentives to hold on to their current open source professionals.
    58% of hiring managers are looking for DevOps talent, making DevOps the most in-demand role in open source today.
    For jobs seekers, even though 86% of tech professionals say open source has advanced their careers, only 2% say money and perks are the best part of their job.

    Reply
  15. Tomi Engdahl says:

    Linux Mint development speeds up

    Linux Mint is one of the most popular Linux distributions – by some statistics even the most popular. Now the project website reports that development will become clearer and faster. New versions will no longer be published as separate OEM disk images or as revisions with the various multimedia codecs missing.

    This is achieved by leaving the multimedia codec packages out of Mint 18, which appears next month. After upgrading the operating system, each user can install the multimedia codecs they need directly from the welcome screen.

    In the future, each Linux Mint release will consist of 12 separate ISO image files (previously 18).

    Linux Mint 18 is based on Ubuntu 16.04

    Source: http://etn.fi/index.php?option=com_content&view=article&id=4378:linux-mintin-kehitys-nopeutuu&catid=13&Itemid=101

    Reply
  16. Tomi Engdahl says:

    The Qt Company’s Qt Start-Up
    http://www.linuxjournal.com/content/qt-companys-qt-start

    The Qt Company is proud to offer a new version of the Qt for Application Development package called Qt Start-Up. Qt, the company’s C++-based framework of libraries and tools, enables the development of powerful, interactive and cross-platform applications and devices and is now used by around one million developers worldwide; the Qt Company seeks to expand its user base by targeting smaller enterprises.

    The new Qt Start-Up, available to companies with an annual revenue of less than $100,000, enables small and start-up companies to harness the full power of the Qt application and UI development framework in their products. The Qt Start-Up package is just as powerful as the regular Qt for Application Development package, but it’s available at a much lower price.

    http://www.qt.io/

    Reply
  17. Tomi Engdahl says:

    Linux Mint to go DIY for multimedia
    Users to choose own codecs
    http://www.theregister.co.uk/2016/05/09/linux_mint_to_go_diy_for_multimedia/

    The Linux Mint project has decided version 18, scheduled for June 2016, will end out-of-the-box installation of multimedia codecs.

    The reasoning is straightforward: shipping with codecs involves a lot of work that other mainstream distributions don’t bother with, instead leaving users to choose what they want post-install.

    As the maintainers explain in this Friday post, pre-configuring codecs was “very costly and only slightly improved our distribution”, so out they go, with the post providing instructions on codec installation.

    Snipping the codecs out of the default images means the project will be able to cull its release cycle to four events with 12 ISO images to test in each event.

    Reply
  18. Tomi Engdahl says:

    Microsoft Hits $1 Trillion In Total Cumulative Revenue: Reports
    https://apple.slashdot.org/story/16/05/09/1724243/microsoft-hits-1-trillion-in-total-cumulative-revenue-reports

    Microsoft has hit a major milestone: $1 trillion in all-time cumulative revenue. The finding was first spotted by Jeff Reifman, a tech consultant. According to him, Microsoft hit the milestone in its last quarter. Interestingly, Apple also hit $1 trillion in revenue in 2015.

    Microsoft Revenue Quietly Surpasses $1 Trillion
    http://jeffreifman.com/2016/05/08/microsoft-revenue-quietly-surpasses-1-trillion/

    Reply
  19. Tomi Engdahl says:

    Jon Fingas / Engadget:
    Disney ends in-house console game production with cancellation of its Infinity series — So much for Disney remaining a big player in the video game world. As part of its second quarter earnings release, the media giant has revealed that it’s getting out of the self-published video game business …

    Disney cancels ‘Infinity’ as it quits video games
    It’s not the first time the House of Mouse has backed out of gaming.
    http://www.engadget.com/2016/05/10/disney-cancels-infinity-quits-video-games/

    So much for Disney remaining a big player in the video game world. As part of its second quarter earnings release, the media giant has revealed that it’s getting out of the self-published video game business… and canceling its Infinity game series in the process. Disney hasn’t said much about why it’s jumping ship, but it notes that “lower results” (read: poor sales) for Infinity prompted the move. It’s a fairly costly move: Disney is taking on a $147 million charge to axe the division.

    This isn’t the first time Disney has bailed on in-house games. In 2013, it both closed Epic Mickey developer Junction Point and the legendary LucasArts studio. The interactive group has regularly struggled since then, too, leading to Disney cutting 700 jobs in 2014.

    Disney will still have a toehold in gaming through licenses, and there are two last Infinity releases coming in the next few weeks (an Alice release in May and Finding Dory in June).

    Reply
  20. Tomi Engdahl says:

    Microsoft Adds V-Sync Control And Adaptive Framerate Support To The Universal Windows Platform
    by Brett Howse on May 10, 2016 6:05 PM EST
    http://www.anandtech.com/show/10312/microsoft-adds-vsync-control-and-adaptive-framerate-support-to-the-universal-windows-platform

    Microsoft has certainly gone all-in on their Universal Windows Platform (UWP) and have been updating it, including the name, for several years now. What originally started as WinRT apps has morphed into a much more powerful platform which can support a wide variety of apps. There’s no doubt that traditional Win32 apps made for the desktop are not going away, but with Project Centennial, Microsoft hopes to bring at least actively developed Win32 apps over to the UWP platform.

    One of the main areas that Microsoft has focused on in their marketing is UWP’s support for DirectX 12 in Windows 10, and to that end they have promoted several big budget games which have come to the Windows Store as a UWP app. But the move from tablet style games to high demand PC games was not entirely smooth. The UWP platform operates in a different way than traditional Win32 games, and it lacked several features that PC gamers had become accustomed to. Some of those features were as simple as the ability to control V-Sync in a game, and exclusive fullscreen.

    In today’s Patch Tuesday, Microsoft has addressed some of these complaints.

    UWP now supports controllable V-Sync, as well as adaptive framerate displays through AMD’s FreeSync and NVIDIA’s G-SYNC.

    Reply
  21. Tomi Engdahl says:

    Abner Li / 9to5Google:
    Google will reportedly launch a standalone “Android VR” headset at I/O, will be less powerful than Vive or Rift

    Android VR will reportedly be announced next week as a standalone headset
    http://9to5google.com/2016/05/11/android-vr-dedicated-headset/

    With Google I/O starting next week, veteran tech journalist Peter Rojas has tweeted that Android VR will launch as a dedicated, standalone headset. This corroborates earlier reports and the mention of “AndroidVR” we saw yesterday in the latest Unreal Engine preview.

    Reply
  22. Tomi Engdahl says:

    Linaro’s ARM-Based Developer Cloud
    http://www.linuxjournal.com/content/linaros-arm-based-developer-cloud

    As the adoption of ARM-based servers accelerates and IoT applications rapidly evolve, software developers are demanding access to requisite hardware and software-reference platforms. In response, Linaro released Linaro Developer Cloud, a new cloud-based native ARMv8 development environment, which can be used to design, develop, port and test server, cloud and IoT applications without substantial upfront hardware investment.

    The Developer Cloud is the combination of ARM-based silicon vendors’ server hardware platforms, emerging cloud technologies and many Linaro member-driven projects, including server-class boot architecture, kernel and virtualization. The Developer Cloud is based on OpenStack, leveraging both Debian and CentOS, as the underlying cloud OS infrastructure.

    Reply
  23. Tomi Engdahl says:

    Dean Takahashi / VentureBeat:
    Nvidia beats with Q1 revenue of $1.3B, up 13% YoY, credits growth to gaming, automotive, and deep learning

    Nvidia zooms by earnings targets thanks to graphics chip demand from cars and games
    http://venturebeat.com/2016/05/12/nvidia-zooms-by-earnings-targets-thanks-to-graphics-chip-demand-from-cars-and-games/

    Graphics chip maker Nvidia reported earnings for its first fiscal quarter ended April 30 that beat Wall Street’s expectations.

    Nvidia’s results are a bellwether for the PC industry, as the company is one of the largest makers of graphics chips. Its results are also indicators of the health of sectors such as PC gaming hardware, graphics-enhanced data center computing, deep learning, and car computing. The PC market isn’t growing like it once did, but Nvidia still did well.

    “We are enjoying growth in all of our platforms — gaming, professional visualization, data center and auto,” said Jen-Hsun Huang, CEO of Nvidia, in a statement. “Accelerating our growth is deep learning, a new computing model that uses the graphics processing unit’s (GPU’s) massive computing power to learn artificial intelligence algorithms. Its adoption is sweeping one industry after another, driving demand for our GPUs.”

    He added, “Our new Pascal GPU architecture will give a giant boost to deep learning, gaming and VR. We are excited to bring a new wave of innovations to the markets we serve. Pascal processors are in full production and will be available later this month.”

    The Nvidia GeForce GTX 1080 and 1070 GPUs represent the newest generation of graphics for consumer computers, and they come at a time when 3D graphics is being pushed to its limit by virtual reality headsets.

    The consumer GPUs come a month after Nvidia unveiled the very first Pascal-based GPU, the P100, which was targeted at deep learning neural networks. Pascal is a new master design, dubbed an architecture, for a whole generation of chips that also come with a new manufacturing process, based on Taiwan Semiconductor Manufacturing’s 16-nanometer FinFET node technology.

    The previous generation of chips used the 28-nanometer TSMC process that has been available to Nvidia and rival Advanced Micro Devices since 2012. With the new process, Nvidia was able to create the P100 with 15 billion transistors on a single chip. The GTX 1080 and 1070 are expected to be slimmed-down versions of the P100. The GPUs are expected to have about 3,584 CUDA cores.

    Reply
  24. Tomi Engdahl says:

    Ethan Baron / Mercury News:
    Bay Area tech layoffs more than doubled in first four months of 2016 to 3,135 compared to same period in 2015

    Tech layoffs more than double in Bay Area
    http://www.mercurynews.com/business/ci_29880696/tech-layoffs-more-than-double-bay-area

    In yet another sign of a slowdown in the booming Bay Area economy, tech layoffs more than doubled in the first four months of this year compared to the same period last year.

    Yahoo’s 279 workers let go this year contributed to the 3,135 tech jobs lost in the four-county region of Santa Clara, San Mateo, Alameda and San Francisco counties from January through April, as did the 50 workers axed at Toshiba America in Livermore and the 71 at Autodesk in San Francisco. In the first four months of last year, just 1,515 Bay Area tech workers were laid off, according to mandatory filings under California’s WARN Act. For that period in 2014, the region’s tech layoffs numbered 1,330.

    Reply
  25. Tomi Engdahl says:

    Slav Petrov / Google Research Blog:
    Google open sources SyntaxNet, a neural network framework for parsing natural language, and releases Parsey McParseface, a SyntaxNet-trained English parser — At Google, we spend a lot of time thinking about how computer systems can read and understand human language in order to process it in intelligent ways.

    Announcing SyntaxNet: The World’s Most Accurate Parser Goes Open Source
    http://googleresearch.blogspot.fi/2016/05/announcing-syntaxnet-worlds-most.html

    At Google, we spend a lot of time thinking about how computer systems can read and understand human language in order to process it in intelligent ways. Today, we are excited to share the fruits of our research with the broader community by releasing SyntaxNet, an open-source neural network framework implemented in TensorFlow that provides a foundation for Natural Language Understanding (NLU) systems. Our release includes all the code needed to train new SyntaxNet models on your own data, as well as Parsey McParseface, an English parser that we have trained for you and that you can use to analyze English text. Parsey McParseface is built on powerful machine learning algorithms that learn to analyze the linguistic structure of language, and that can explain the functional role of each word in a given sentence. Because Parsey McParseface is the most accurate such model in the world, we hope that it will be useful to developers and researchers interested in automatic extraction of information, translation, and other core applications of NLU.

    How does SyntaxNet work? SyntaxNet is a framework for what’s known in academic circles as a syntactic parser, which is a key first component in many NLU systems. Given a sentence as input, it tags each word with a part-of-speech (POS) tag that describes the word’s syntactic function, and it determines the syntactic relationships between words in the sentence, represented in the dependency parse tree. These syntactic relationships are directly related to the underlying meaning of the sentence in question.
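
    As a toy illustration of the kind of output such a parser produces (hypothetical tags and indices for a made-up sentence, not SyntaxNet’s actual API), each token can be represented as a word, a part-of-speech tag, the index of its syntactic head, and a relation label:

```python
# Each token: (word, POS tag, 1-based index of its syntactic head, relation).
# Head index 0 means the token is the root of the dependency tree.
parse = [
    ("Alice",  "NOUN", 2, "nsubj"),   # subject of "saw"
    ("saw",    "VERB", 0, "root"),
    ("Bob",    "NOUN", 2, "dobj"),    # direct object of "saw"
]

def children(parse, head_index):
    """Return the words whose head is the token at head_index (1-based)."""
    return [word for i, (word, _, head, _) in enumerate(parse, 1)
            if head == head_index]

print(children(parse, 2))   # ['Alice', 'Bob']
```

    The dependency parse tree is exactly this head/relation structure; a real parser’s job is to predict the tags and head indices from raw text.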

    Reply
  26. Tomi Engdahl says:

    James Vincent / The Verge:
    Western Digital officially completes acquisition of SanDisk for around $16B — Western Digital has now officially purchased flash memory manufacturer SanDisk after regulator approval was passed earlier this week. The agreement between the two storage companies was originally announced last October in a deal worth nearly $19 billion.

    Western Digital officially closes SanDisk acquisition
    http://www.theverge.com/2016/5/12/11662018/western-digital-sandisk-deal-complete

    Western Digital has now officially purchased flash memory manufacturer SanDisk after regulator approval was passed earlier this week. The agreement between the two storage companies was originally announced last October in a deal worth nearly $19 billion. However, Chinese company Unisplendour, which was planning to buy a 15 percent stake in Western Digital as part of the merger, backed out in February this year, lowering the value of the deal to around $16 billion. At the time, Western Digital reaffirmed its confidence in the purchase, and this week said it was looking forward to creating “the leading storage solutions company.”

    Reply
  27. Tomi Engdahl says:

    Google Has Open Sourced SyntaxNet, Its AI for Understanding Language
    http://www.wired.com/2016/05/google-open-sourced-syntaxnet-ai-natural-language/

    Reply
  28. Tomi Engdahl says:

    Seagate ready for the HAMR blow: First drives out in 2017
    Market reducing manufacturing but driving up capacity
    http://www.theregister.co.uk/2016/05/12/how_will_hamr_technology_affect_seagate_in_derry/

    Seagate is reducing its manufacturing capacity while still focusing on high-capacity disk drives for cloud and hyper-scale storage of unstructured data. This means it needs higher capacity drives, requiring new read-write head technology.

    It is a high-technology, nano-scale, clean room manufacturing process, and drive read-write head technology development goes in lockstep with drive recording technology, such as perpendicular magnetic recording (PMR) and its areal density increases, shingled magnetic recording (SMR), and the developing HAMR (heat-assisted magnetic recording) which sees laser heating elements added to the write head part.

    PMR was a development from the previous longitudinal magnetic recording technology, in which the magnetised domains in the disk platter’s recording medium were laid out flat on the platter’s surface. In a quest to increase the capacity of the drives by decreasing the surface area of the recording domain the domains were flipped upright and stored vertically, perpendicular to the recording medium’s surface, so that only their tips were visible, as it were, on that surface.

    This is measured in bits per square inch and is called the areal density of the drive. Seagate had 400 Gbit/in2 in 2009, 500 Gbit/in2 in 2011, 650 Gbit/in2 in 2013/14, and 800 Gbit/in2 in 2015/16. Each generation lasts about 18 months before the engineering technologists, operating at the boundaries of manufacturing physics and chemistry, reach the next step.

    However, the next PMR step is soon going to be a dead end.

    There will be so few grains in a shrunken PMR bit that their stability will suffer and they will experience random bit value changes because of the influence of neighbouring bits and random events. If we want higher capacity disk drives in a standard-sized 3.5-inch or 2.5-inch disk drive enclosure then some other technology will have to be used.

    Hopefully HAMR can provide an areal density of between 1.2 and 5 terabits per square inch, with initial product integration slated for 2016. That is a large increase over PMR. Development of HAMR is costly and complex, and Seagate’s Northern Ireland operation is deeply involved in it.
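
    A quick check of what that range means relative to the 800 Gbit/in2 PMR generation cited above:

```python
# Rough multiplier the promised HAMR areal density range gives over
# the last PMR generation mentioned in the article.
pmr = 800                        # Gbit per square inch (2015/16 PMR)
hamr_low, hamr_high = 1200, 5000 # 1.2-5 Tbit/in2 expressed in Gbit/in2
print(hamr_low / pmr, hamr_high / pmr)   # 1.5 6.25
```

    So HAMR promises roughly 1.5x to 6x the areal density of current PMR drives.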

    Seagate manufacturing capacity reduction

    Recently, in the face of falling disk drive unit sales, Seagate has decided to reduce its drive manufacturing capacity. Chairman and CEO Steve Luczo said: “The company is in the process of prioritising our strategic positioning, manufacturing footprint and operating expense investments to achieve the appropriate level of normalised earnings. We anticipate that these actions will be implemented over the next several quarters.”

    According to Stifel MD Aaron Rakers, Seagate will reduce its HDD manufacturing capacity footprint by 35 per cent, saving approximately $20m per quarter. It is currently making 40 million disk drives a quarter, say 200 million a year, meaning, at a 3-platter/drive average 600 million read-write heads are needed.

    It peaked at 66 million drives/quarter or c240 million/year in early 2014, which would have required roughly 720 million read-write heads.

    Reply
  29. Tomi Engdahl says:

    No objections to object stores: Everyone’s going smaller and faster
    Yes, they really are. Enrico Signoretti takes a look at what firms are up to
    http://www.theregister.co.uk/2016/04/29/object_stores_more_faster_smaller/

    A couple of weeks ago I published an article about high performance object storage. Reactions have been quite diverse: some think that object stores can only be huge and slow, while others think quite the opposite. In fact, they can also be fast and small.

    In the last year I’ve had a lot of interesting conversations with end users and vendors concerning this topic. Having just covered the part about “fast object stores”, again I’d like to point out that by fast I mean “faster and with better latency than traditional object stores, but not as fast as block storage.”

    This time round I’d like to talk about smaller object stores.

    Talking to some of my colleagues (both analysts and bloggers), they say that object storage makes no sense under one petabyte or so. But my point of view is that they are dead wrong. It all depends on the applications and on the strategy your organization is adopting.

    Let me work with examples here.

    It depends on the application

    HDS was one of the first in the market to think about object storage as an enabler for cloud and data-driven applications and not just as a more affordable form of storage for cold data. They invested in building an ecosystem which is now very robust and seems quite successful with their customers.

    Two pieces of this ecosystem are the remote NAS gateway and Sync&Share (HDI, HCP Anywhere in HDS nomenclature). HDS claims that more than 1,500 customers are running HCP now and there’s something like 400+ PB of on-premises storage under management. Just by doing the simple maths (400/1500), this falls into the range of 260TB per user on average… without considering that some of these customers are huge and use HCP for the traditional archiving/content management use cases.
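    The "simple maths" in the paragraph above works out to a little more than the quoted figure; a one-liner check of the per-customer average:

```python
# Back-of-the-envelope check of the "~260TB per user" figure above,
# using the article's own numbers (400+ PB across 1,500 customers).
total_pb = 400
customers = 1500
avg_tb = total_pb * 1000 / customers  # PB -> TB, decimal units

print(round(avg_tb))  # 267
```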

    Other vendors, such as Cloudian for example, have a license that starts as low as 10TB. I have personally met some of their (happy) customers in the range of 100/300TB. These end users have embraced object storage for NAS gateways, file distribution, and, lately, backup. For each new application they add more capacity and more cluster nodes.

    Caringo is another good example. It has always worked with ISVs and many of their customers are quite small. And now, thanks to FileFly they have a compelling solution for file server consolidation/remotisation.

    But there is more

    Some startups are intentionally working on smaller object storage systems. They want to build small object storage systems by design – or, better still, small-footprint object storage systems.

    Minio is working hard on an S3-compatible object store that can run in a single virtual machine or a container. The product is open source and has been designed with developers in mind. I think of it as the MySQL of object stores. And they are not alone: Open.IO takes a similar approach to building an object storage system that can serve single applications. The right back-end for developers in the cloud era.

    The idea behind this object storage system is that developers are asking for S3-compatible storage to build their applications. The small footprint is necessary to embed it within a container and distribute the application in the easiest possible way. But this also means that the S3 engine is very small and fast (yes, again, fast!), security is simplified, and multi-tenancy is no longer a problem since you have an S3 repository dedicated to your application. For better or worse, the developer takes control of the overall “micro-infrastructure”.

    Closing the circle

    Thinking about object storage as being suited only for huge multi-petabyte installations is passé. Examples supporting this are everywhere and most enterprises are choosing object storage, not for its characteristics of durability or scalability, but because they want to implement cloud storage systems with applications that take advantage of protocols like S3.

    Reply
  30. Tomi Engdahl says:

    We’re calling it: World hits peak Namey McNameface
    Googly McSearchface releases natural language tool called Parsey McParseface
    http://www.theregister.co.uk/2016/05/13/google_parsey_mcparseface_nlu_tool/

    Googly McSearchface has released SyntaxNet, “an open-source neural network framework” and an open source tool for parsing the English language called Parsey McParseface.

    The company hopes that Codey McCoderfaces will put the two tools to work doing Natural Language Understanding (NLU), the art of helping computers to understand spoken language.

    The Alphabet subsidiary’s Researchy McResearchface branch thinks that SyntaxNet is just the tool to help Parsey McParseface do all the heavy lifting needed to process language. Googly McSearchface senior staff research scientist Slav Petrov says a neural network is needed because “human languages show remarkable levels of ambiguity.”

    “It is not uncommon for moderate length sentences – say 20 or 30 words in length – to have hundreds, thousands, or even tens of thousands of possible syntactic structures,” Petrov writes, offering the sentence “Alice drove down the street in her car” as an example of the challenges parsers face: one interpretation is that Alice drives down a street while driving her car. The other is that the street in question is inside Alice’s car.

    Reply
  31. Tomi Engdahl says:

    Kobayashi Maru gets real: VR and AR in meatspace today
    Outside the Silicon Valley hype cycle
    http://www.theregister.co.uk/2016/05/13/virtual_reality_meets_reality/

    Three years ago few people other than hardcore gamers and those working in specialist industrial fields were still talking about VR.

    It was a gimmicky technology cursed by the “P” word (“potential”) and huge, ungainly headsets that generated a flurry of interest in the late 1990s that quickly evaporated.

    Today, it’s big news. Arguably much of the resurgence in interest is down to Oculus, the company that surprised everyone in 2012 with a fabulously successful Kickstarter campaign to build its Rift VR headset – or using the modern parlance, head-mounted display (HMD) – for just $300.

    By the time Oculus had released a second, more compact version of Rift to developers in July 2014, the company had been acquired by Facebook for the previously inconceivable sum of $2bn. The consumer version was duly announced last summer, and when pre-orders opened in January this year at $599 per unit, all hell broke loose.

    The Oculus Rift is not a self-contained system: it’s a 3D display device for hooking up to a Windows PC.

    This isn’t to say that the new wave of VR is restricted to the gaming elite. Samsung partnered with Oculus in 2014 to create the £96 Gear VR, a lightweight HMD designed to be driven by a Galaxy smartphone running Android

    Paring this concept down even further, Google conceived an ultra-lo-fi HMD constructed from pizza-box cardboard, selling for under £10.

    It’s an experience that sells VR to the masses: indeed, Google claims that more than five million people own Cardboard viewers and over 1,000 apps have been created for them. Even if you dismiss it as a short-term consumer electronics fad, modern VR is inexpensive and accessible – two accusations you could never make about 3D TV.

    Nor should VR be seen principally as gaming tech. “VR has a range of fascinating applications that range beyond the entertainment industry,”

    Medical training is another emerging application for VR but not necessarily in the way you might expect. The UK Government’s Defence Science and Technology Laboratory (DSTL) is using VR to teach soldiers how to assist wounded comrades in the field.

    Grimly evocative of Star Trek’s Kobayashi Maru test, the system allows trainers to insert complications along the way, such as unexpected gunfire, smoke limiting visibility and sudden changes in the patient’s condition, which will influence the trainees’ responses and affect the eventual outcome. As Collette Johnson of Plextek explains: “When you do something wrong, the scenario changes and follows through according to the decisions you made.”

    The system makes use of Oculus Rift HMDs, but Johnson is keen to develop it further for the £689 HTC Vive headset, calling it “a game changer for real-world applications outside gaming, such as disaster relief, by allowing room for movement and enhanced graphic quality.”

    Less surprisingly, heavy industry has also warmed to the opportunities that VR presents, not least because it has been using 3D design tools for decades.

    “We believe the AR market will grow considerably in the commercial market, such as arts and culture,” says Valerie Riffaud-Cangelosi, new markets development manager at Epson. “Museums were an early adopter. Also healthcare, retail and applications that have become really streamlined… like drones.”

    Reply
  32. Tomi Engdahl says:

    Linux is a prisoner of its own development

    The Linux kernel now supports 35 different file systems and every conceivable architecture. More than 10,000 lines of new code are added to the kernel every day. This could become a problem for the operating system, which turns 25 years old this autumn, if it is not one already.

    The code growth is partly a consequence of Linux’s excellent update pace: bugs found in the weekly kernel updates are fixed quickly, but this inevitably increases the amount of code. From an ordinary user’s perspective, it also means ever larger installation packages.

    This is despite the fact that Linux development has been relatively stable for the past five years; completely new subsystems are no longer being added.

    Linux version 4.5 includes more than 21 million lines of code. A laptop runs around 1.6 million lines of it, while a smartphone runs about 2.5 million.

    The code growth also means that Linux may not suit small IoT devices. At the very least, it requires a company capable of developing its own stripped-down version just for its application.

    Source: http://etn.fi/index.php?option=com_content&view=article&id=4417:linux-on-oman-kehityksensa-vanki&catid=13&Itemid=101

    Reply
  33. Tomi Engdahl says:

    Microsoft’s Surface Makes the iPad Feel Outdated, Research Shows
    The Surface Book grows in the “laplets” product category
    Read more: http://news.softpedia.com/news/microsoft-s-surface-makes-the-ipad-feel-outdated-research-shows-504019.shtml#ixzz48dVFUh16

    Reply
  34. Tomi Engdahl says:

    Jacob Kastrenakes / The Verge:
    Acer partners with Starbreeze on StarVR headset with 210-degree field of view for arcades and theme parks — Acer is joining forces with the game studio Starbreeze to develop a high-end virtual reality headset designed for arcades and theme parks. Starbreeze has already been showcasing the headset …

    Acer joins work on high-end VR headset for theme parks
    StarVR is meant to be more immersive than other headsets
    http://www.theverge.com/2016/5/15/11677870/starvr-starbreeze-acer-join-forces

    Acer is joining forces with the game studio Starbreeze to develop a high-end virtual reality headset designed for arcades and theme parks. Starbreeze has already been showcasing the headset, called StarVR, for the past year; going forward, development of StarVR will be a collaboration between the two companies.

    StarVR’s standout feature is its wide field of vision, which is supposed to more closely mimic what a person can see in real life, down to their peripheral vision. Starbreeze designed its headset to have a 210-degree field of view — far wider than the 110-degree field of view on the Oculus Rift and HTC Vive.

    Virtual-reality theme parks are already becoming a reality; and Acer and Starbreeze seemingly hope to tap into that market by offering a far more immersive VR experience than what you’d find at home.

    Virtual reality theme park The Void opening its first outpost in Times Square
    The space will host an experience based around the new Ghostbusters film
    http://www.theverge.com/2016/5/9/11603622/the-void-virtual-reality-ghost-busters-times-square

    The most talked about experience in technology over the last six months has to be The Void, an immersive virtual reality theme park based in Utah that transports users to another world they can see, feel, and interact with. It uses a VR headset powered by a supercomputer backpack to allow for untethered walking.

    Participants wear a haptic suit that tracks their movement and provides a sensory experience. Moving podiums, fog machines, and heat lamps give the VR world a physically convincing atmosphere. The sales pitch is that this is not just virtual reality, it’s hyper reality, and all the reviews so far have testified to an experience that lives up to this moniker.

    The company’s business plan is to create entertainment centers around the world, small-scale versions of what it built in Utah powered by the same tech. The Void will create its own content, but it will also serve as an immersive version of major films and video games. If you go to see Star Wars in theaters and you love it, you can pay an additional $10 or $20 to spend 15 minutes actually walking around in the universe interacting with its characters and environment

    Reply
  35. Tomi Engdahl says:

    Flash and mechanical disc price gap shrinks

    Few people nowadays want an old-fashioned mechanical, rotating hard drive in a new laptop. For most buyers, the price difference between traditional drives and flash-based disks is becoming practically non-existent.

    According to memory market researcher DRAMExchange, the price gap between a 128-gigabyte flash-based solid state disk and a 500 GB hard disk will shrink to three dollars this year. If you do not need much local storage, you should already go for a solid state disk.

    A 128 GB solid state disk currently costs around EUR 35–45 in online stores, depending on the model. The per-gigabyte price of a solid state disk is still four times that of magnetic recording.
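    A rough per-gigabyte comparison implied by the figures above (the prices are the article's ballpark numbers, not current quotes):

```python
# Per-gigabyte price sketch from the article's ballpark figures:
# a 128 GB SSD at roughly EUR 40, with SSDs ~4x pricier per GB than HDDs.
ssd_price_eur = 40.0
ssd_gb = 128
ssd_per_gb = ssd_price_eur / ssd_gb  # EUR per GB for flash
hdd_per_gb = ssd_per_gb / 4          # implied EUR per GB for magnetic recording

print(round(ssd_per_gb, 3))  # 0.312
print(round(hdd_per_gb, 3))  # 0.078
```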

    Source: http://etn.fi/index.php?option=com_content&view=article&id=4420:flashin-ja-mekaanisen-levyn-hintaero-kutistuu&catid=13&Itemid=101

    Reply
  36. Tomi Engdahl says:

    Linus Torvalds releases Linux 4.6
    Dirty drivers delayed release by a very useful week
    http://www.theregister.co.uk/2016/05/16/linus_torvalds_releases_linux_46/

    Linus Torvalds has loosed version 4.6 of the Linux kernel on the waiting world.

    “It’s just as well I didn’t cut the rc cycle short, since the last week ended up getting a few more fixes than expected,” wrote the Linux overlord.

    “Since rc7, there’s been small noise all over, with driver fixes being the bulk of it, but there is minor noise all over (perf tooling, networking, filesystems, documentation, some small arch fixes.)”

    New this time around is support for a bunch more ARM systems-on-a-chip, including Qualcomm’s Snapdragon 820. IBM’s POWER9 finds its first support, although perhaps prematurely given the silicon won’t arrive until late 2016.

    Reply
  37. Tomi Engdahl says:

    Microsoft Kills Its Game-Building Platform Spark
    https://games.slashdot.org/story/16/05/15/0150209/microsoft-kills-its-game-building-platform-spark

    “Starting 5/13/16, ‘Project Spark’ will no longer be available for download on the Xbox Marketplace or Windows Store,” Microsoft wrote in a blog post, adding that it will go offline for good on August 12th.

    Ars Technica remembered Spark as the free multi-device, build-your-own game platform that you never knew existed. “Marketing teams never effectively sold the possibilities and power of Spark’s make-your-own-game system,”

    News \ Project Spark Sunset Announcement
    http://forums.projectspark.com/yaf_postst214854.aspx

    Reply
  38. Tomi Engdahl says:

    Marine Corps battles Windows 10 migration woes
    http://www.zdnet.com/article/marine-corps-battle-windows-10-migration-woes/

    Because the DoD purchases “yesterday’s technology tomorrow,” the Marine Corps is having problems installing Windows 10 onto brand new systems.

    The Marine Corps’ plan to roll Windows 10 out across its computer network has hit a snag – even brand new systems are having problems with the upgrade.

    According to a report by Federal News Radio, technicians had initially believed that they would be able to remotely install – therefore eliminating the need to have someone visit each desktop and laptop – Windows 10 on some 60 to 70 percent of the computers within the Marine Corps Enterprise Network (MCEN).

    But according to Brig. Gen. Dennis Crall, CIO of the Marine Corps, in reality that figure is closer to 10 percent.

    “Our challenges are with hardware, and hardware that is older than a couple years is having more difficulty accepting Windows 10 than hardware that is new,” said Brig. Gen. Crall.

    The problem – outdated hardware.

    So why is the DoD investing so much time getting Windows 10 onto its systems? Security.

    “We’ve never had an operating system that’s had this much security baked in from the beginning,” said Terry Halvorsen, the DoD’s CIO.

    Reply
  39. Tomi Engdahl says:

    Michael del Castillo / CoinDesk:
    The DAO, a distributed organization on the Ethereum blockchain that disperses ETH to other startups and projects, has now raised 10.5M+ ETH, worth about $105M

    The DAO: Or How A Leaderless Ethereum-Based Organization Raised $50 Million
    http://www.coindesk.com/the-dao-just-raised-50-million-but-what-is-it/

    A distributed organization with no single leader that could theoretically exist so long as there’s an Internet connection was launched last month, and has since then left many observers and Ethereum community members feeling optimistic – if not a bit confused – about what exactly was created.

    ‘The DAO’, as it’s called, takes its name from the description for a new type of entity: a distributed autonomous organization. Intended to act as a vehicle for supporting Ethereum-related projects, The DAO has garnered over $50m worth of ethers (ETH) – the digital token of the Ethereum network – from investors.

    But what does The DAO do exactly? Think of it as a hub that disperses ETH to other startups and projects. Backers of The DAO receive voting rights by means of a digital token, which can be used to help determine the future direction of the organization and which projects will actually get funded following a voting period.

    Participants stand to receive possible dividends, including ether, in return for supporting the project.

    Yet, there are outstanding questions about the exact origins of The DAO as it exists in the wild today.

    Reply
  40. Tomi Engdahl says:

    Sarah Jeong / Motherboard:
    Oracle v Google trial, which is based on the notion that APIs are copyrightable, hinges on a judge and jurors who don’t have a firm grasp of FOSS philosophy — The problem with Oracle v. Google is that everyone actually affected by the case knows what an API is, but the whole affair …

    In Oracle v. Google, a Nerd Subculture Is on Trial
    http://motherboard.vice.com/read/in-google-v-oracle-the-nerds-are-getting-owned

    The problem with Oracle v. Google is that everyone actually affected by the case knows what an API is, but the whole affair is being decided by people who don’t, from the normals in the jury box to the normals at the Supreme Court—which declined to hear the case in 2015, on the advice of the normals at the Solicitor General’s office, who perhaps did not grasp exactly how software works.

    In a world where Silicon Valley is coming into dominance, Oracle v. Google is an unusual instance in which the nerds are getting totally owned by the normals. Their judgment on the technologies they have birthed is being overridden by old people in black robes; their beloved traditions and mythologies around free and open source software are being scoffed at by corporate stiffs in suits as inconsistent hippie nonsense.

    Silicon Valley wants to live in a world of its own, where it sets its own rules and writes its own laws. Oracle v. Google does little to change its mind that this is only right and fair.

    And to be fair to Oracle attorneys, although the copyleft idealism of the free and open source software movement infects Silicon Valley at its very foundation, Silicon Valley is a capitalist enterprise, and has always had an ambivalent relationship with FOSS.

    Oracle v. Google is the revenge of the normals, bringing a hammer down on the customs and practices that the nerds decided for themselves.

    Reply
  41. Tomi Engdahl says:

    Now you can tailor Swift – on Ubuntu
    Swift has landed on Linux. Repeat: Swift has landed on Linux
    http://www.theregister.co.uk/2016/03/23/apple_swift_ubuntu_linux/

    The latest iteration of Apple’s open-source programming language Swift has taken its first major step towards Linux support.

    The release of Swift 2.2 landed on March 21, and includes its first Linux port in the form of binaries for Ubuntu 14.04 and 15.10.

    As Neowin notes, now that there’s a Linux build, “it won’t be long before it unofficially arrives” on other distributions.

    “The Linux port is still relatively new and in this release does not include the Swift Core Libraries (which will appear in Swift 3). The port does, however, include LLDB and the REPL,” the release post states.

    LLDB is the LLVM project’s debugger, and is also used in Apple’s Xcode; REPL is read–eval–print loop, the language shell.

    Swift programming language update introduces Linux support
    http://www.neowin.net/news/swift-programming-language-update-introduces-linux-support

    Reply
  42. Tomi Engdahl says:

    Cloudy desktops are as mature as cloudy servers … from 2008!
    It’s not a DaaS-aster, but not better or cheaper than VDI, yet
    http://www.theregister.co.uk/2016/05/17/cloudy_desktops_are_as_mature_as_cloudy_servers_from_2008/

    Desktop-as-a-service in 2016 is about as mature as infrastructure-as-a-service was in 2008, so waiting until it matures is more sensible than diving in now.

    So says Gartner for Technical Professionals’ analyst Mark Lockwood, who The Register’s virtualisation desk beheld yesterday at the firm’s Infrastructure Operations & Data Centre Summit in Sydney.

    Lockwood said DaaS currently lags desktop virtualisation (VDI) in many ways, especially on cost. Best-practice VDI costs about US$300/seat/year. DaaS costs more. VDI doesn’t have latency problems. DaaS does and those problems only get worse if your desktops have to come in over the WAN to reach data inside the firewall, pipe that data into the cloudy desktop and then send it to users over the WAN again.

    Current DaaS offerings are also a little unrealistic – the base configuration of a single CPU with 2GB of RAM is not useful for most applications. More realistically-specced machines cost about US$50/month. Suppliers are also thin on the ground. Today only Amazon Web Services and VMware provide a single-throat-to-choke experience. Other providers can split bills so you pay for VDI licences and for the cloudy desktops.
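    Putting the quoted figures side by side shows how far DaaS still lags on cost (a quick sketch using the article's numbers, not a pricing guide):

```python
# Annual cost comparison from the figures quoted above:
# best-practice VDI at ~US$300/seat/year versus a realistically-specced
# DaaS desktop at ~US$50/month.
vdi_per_year = 300
daas_per_year = 50 * 12

print(daas_per_year)                 # 600
print(daas_per_year / vdi_per_year)  # 2.0 -> DaaS costs about twice as much
```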

    Reply
  43. Tomi Engdahl says:

    Google Releases Spaces Group-Sharing App On Android, iOS, and Desktop
    https://tech.slashdot.org/story/16/05/16/1549254/google-releases-spaces-group-sharing-app-on-android-ios-and-desktop

    Google on Monday released Spaces, an app that is designed to make it easier to share links, videos and other things from the Web in group conversations. The app, which has been in private beta for a few months, is available for Android, iOS, desktop and mobile web.

    With Spaces, it’s simple to find and share articles, videos and images without leaving the app, since Google Search, YouTube, and Chrome come built in.

    Introducing Spaces, a tool for small group sharing
    https://googleblog.blogspot.fi/2016/05/introducing-spaces-tool-for-small-group.html

    Group sharing isn’t easy. From book clubs to house hunts to weekend trips and more, getting friends into the same app can be challenging. Sharing things typically involves hopping between apps to copy and paste links. Group conversations often don’t stay on topic, and things get lost in endless threads that you can’t easily get back to when you need them. We wanted to build a better group sharing experience, so we made a new app called Spaces that lets people get people together instantly to share around any topic. With Spaces, it’s simple to find and share articles, videos and images without leaving the app, since Google Search, YouTube, and Chrome come built in.

    Spaces
    Small group sharing for everything in life
    https://get.google.com/spaces/

    Reply
  44. Tomi Engdahl says:

    The Windows 10 future: imagine a boot stamping an upgrade treadmill forever
    Windows-as-a-service will require more frequent testing of everything you run
    http://www.theregister.co.uk/2016/05/17/the_windows_10_future_imagine_a_boot_stamping_an_upgrade_treadmill_forever/

    The advent of Windows-as-a-service means that businesses adopting Windows 10 will need to ensure they can monitor their software portfolio for compatibility with Microsoft’s latest updates.

    So says Annette Jump, a research director at Gartner who today addressed the firm’s Infrastructure Operations & Data Centre Summit in Sydney, on the topic of how to prepare for a Windows 10 migration.

    Jump said Gartner’s research suggests at least 80 per cent of you will have done so by the end of 2018. Most of you will consume Windows 10’s Current Branch for Business (CBB), a strain of Windows 10 that arrives four months after the periodic releases of the consumer version of the OS and which Microsoft says it will maintain for at least eight months.

    Your challenge, Jump said, is that skipping a CBB release could mean skipping important enhancements, including security tweaks. That in turn means that you’ll need to be ready for frequent testing and implementation of new Windows 10 versions.

    Reply
  45. Tomi Engdahl says:

    Romain Dillet / TechCrunch:
    Sketchfab now supports all VR headsets for its 3D model sharing platform — You saw it coming, right? 3D model repository Sketchfab just added support for all major virtual reality headsets out there. You can now view your pretty 3D models in your VR headset.

    Sketchfab now supports all VR headsets for its 3D model sharing platform
    http://techcrunch.com/2016/05/17/sketchfab-now-supports-all-vr-headsets-for-its-3d-model-sharing-platform/

    You saw it coming, right? 3D model repository Sketchfab just added support for all major virtual reality headsets out there. You can now view your pretty 3D models in your VR headset.

    The Sketchfab VR apps work with the Oculus Rift, HTC Vive, Gear VR and Cardboard. The startup already announced support for Google Cardboard back in January thanks to WebVR. But now that other (more powerful) headsets are out, it’s good to see expanded support with native apps.

    Sketchfab is all about viewing 3D models. Think about it as a sort of YouTube or SoundCloud for 3D files. So it makes perfect sense that you’d want to move around 3D models with a VR headsets. It’s much more immersive than having to drag your mouse or finger around the screen.

    Reply
  46. Tomi Engdahl says:

    Jordan Novet / VentureBeat:
    Amazon open sources DSSTNE, its library for building deep learning models, says it’s 2.1X faster than Google’s TensorFlow when run on specific AWS configuration

    Amazon open-sources its own deep learning software, DSSTNE
    http://venturebeat.com/2016/05/11/amazon-open-sources-its-own-deep-learning-software-dsstne/

    Amazon has suddenly made a remarkable entrance into the world of open-source software for deep learning. Yesterday the ecommerce company quietly released a library called DSSTNE on GitHub under an open-source Apache license.

    Deep learning involves training artificial neural networks on lots of data and then getting them to make inferences about new data. Several technology companies are doing it — heck, it even got some air time recently in the show “Silicon Valley.” And there are already several other deep learning frameworks to choose from, including Google’s TensorFlow.

    https://github.com/amznlabs/amazon-dsstne

    Reply
  47. Tomi Engdahl says:

    The fork? Node.js: Code showdown re-opens Open Source wounds
    Left pad chaos highlighted madness behind scenes
    http://www.theregister.co.uk/2016/05/13/open_source_insider_myth_of_opensource/

    Open source software rarely receives the kind of attention that the press lavishes on the latest hot new thing blessed by Silicon Valley venture capitalists. Yet these projects are the foundations of the web world.

    Without open source there would be no Slack, no Medium, no Github. Nor would there be Google, Facebook, or much of anything else.

    Without open source projects like Apache, Nginx, OpenSSL, OpenSSH and others (to say nothing of GNU/Linux, which does get some attention), the latest hot new thing would likely not exist. More fundamentally, the web as we know it would not exist.

    There is a kind of myth that has grown around this lack of attention. It’s the myth of the lone developer creating powerful magic. It’s a myth the open-source community likes to tell itself: that open source software is created by individuals working on labours of love in their spare time.

    This isn’t always a myth; indeed, it’s often surprising how little support key projects get considering how many companies would cease to exist without them.

    However, the myth ignores the fact that much of the money going into open source software is directly and indirectly (in the form of employing developers who contribute to open source projects) coming from corporations.

    There’s a tension in open source between individuals building projects out of love, frustration or other personal motivations and corporations devoting their time and money that help further the bottom line.

    Occasionally the web gets a wake-up call about this tension that exists between individual developers and corporations building fortunes atop their code.

    The kerfuffle at NPM, the default package manager for the very popular Node.js project, nicely illustrates exactly this tension.

    There’s a lesson here for everyone – consider your dependencies carefully – but there’s also a wake-up call here for both developers and corporations.

    There’s really nothing original about this story. It’s part of the tension that seems inherent in software development at this stage. It’s so common, in fact, that open-source software has a simple mechanism for handling this situation – the fork.

    Don’t like where a project is headed or who’s in charge of it? Go make your own. It happens with small projects like Koçulu’s and big ones like the MariaDB fork of MySQL.

    So while the short version of the NPM story has a happy ending – Koçulu’s code is now free of NPM and NPM has forks of it available for developers who depend on it
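    What made the left-pad incident so striking is how tiny the dependency was. A Python sketch of the kind of utility the npm module provided (not the original JavaScript, just the idea):

```python
def left_pad(value, length, fill=" "):
    """Pad `value` on the left with `fill` until it is `length` chars long.

    A Python sketch of the sort of tiny utility the npm left-pad module
    provided -- the point being how little code thousands of packages
    transitively depended on.
    """
    if len(fill) != 1:
        raise ValueError("fill must be a single character")
    s = str(value)
    if len(s) >= length:
        return s
    return fill * (length - len(s)) + s

print(left_pad("foo", 5))    # prints "  foo"
print(left_pad(17, 5, "0"))  # prints "00017"
```

    (Python's standard library already covers this with `str.rjust` and `str.zfill`, which is part of the "consider your dependencies" lesson.)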

    Reply
  48. Tomi Engdahl says:

    Steve Dent / Engadget:
    IBM researchers find a way to store 3 bits in a phase-change memory cell, paving the way for memory technology with DRAM-like speed and a cost closer to flash

    IBM’s optical storage is 50 times faster than flash
    The phase-change memory is now cheaper than RAM, too.
    http://www.engadget.com/2016/05/17/ibm-research-phase-change-memory/

    Flash storage is too slow for your device’s main memory, but RAM is expensive and volatile. Thanks to a breakthrough from IBM, phase-change memory (PCM) might one day replace them both. The crystal-based storage has been used in optical disks and other tech for at least 15 years, but the technology has been limited by the cost and storage density — cells are either “on” or “off.” However, IBM researchers have figured out how to save 3 bits of data per cell, dramatically increasing the capacity of the original tech.

    To store PCM data on a Blu-ray disk, you apply a high current to amorphous (non-crystalline) glass materials, transforming them into a more conductive crystal form. To read it back, you apply a lower voltage to measure conductivity — when it’s high, the state is “1,” and when it’s low, it’s “0.” By heating up the materials, more states can be stored, but the problem is that the crystals can “drift” depending on the ambient temperature. IBM’s team figured out how to track and encode those variations, allowing them to reliably read 3 bits of data per cell long after it was written.
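    Storing 3 bits per cell means distinguishing 2³ = 8 conductivity levels instead of two, and reading a cell back means mapping a measured (possibly drifted) level to the nearest state. A toy illustration of the idea — not IBM's actual encoding scheme:

```python
# Illustrative multi-level-cell readout (not IBM's actual scheme):
# 3 bits per cell means 2**3 = 8 distinguishable conductivity levels,
# and a read maps a measured value to the nearest level centre.

LEVELS = 8  # 2**3 states -> 3 bits per cell

def read_cell(measured: float) -> int:
    """Map a measured conductivity in [0, 1) to a 3-bit state by nearest level."""
    centres = [i / LEVELS + 1 / (2 * LEVELS) for i in range(LEVELS)]
    return min(range(LEVELS), key=lambda i: abs(measured - centres[i]))

# A slightly drifted reading still resolves to the intended state:
# the centre for state 5 is 0.6875, so 0.69 reads back as 5.
print(read_cell(0.69))  # 5
```

    The drift problem the article describes is exactly why this is hard: with eight levels the margins between states are narrow, so the readout has to compensate for temperature-dependent shifts.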

    Reply
  49. Tomi Engdahl says:

    Jessica Guynn / USA Today:
    Report: Google introducing software tools at I/O to help developers build chat bots that run in messaging apps, in strategy similar to Microsoft’s

    Google to make ‘chat bot’ play, report says
    http://www.usatoday.com/story/tech/news/2016/05/17/google-messaging-chatbots-io/84510922/

    Google is creating tools for software developers to build chat bots that run inside messaging apps such as Facebook Messenger and Google’s own messaging products, according to a published report.

    The technology giant plans to discuss some of the details at its annual developer conference, which starts Wednesday, The Information reported.

    Following Facebook and Microsoft, Google Gets Messaging Bots
    https://www.theinformation.com/following-facebook-and-microsoft-google-gets-messaging-bots?token=b4a20d8d1d2c20aafb2f6e627f41d96fcc2563a4

    Google is about to join the “bot” craze gripping Silicon Valley.

    Reports that Google was working on a way for people to use its smartphone messenger to chat with businesses—or Google itself—have circulated for months. But we’ve recently learned more details

    Reply
