Computer trends for 2015

Here comes my long list of computer technology trends for 2015:

Digitalisation is coming to change all business sectors and our daily work even more than before. Digitalisation also changes the IT sector itself: traditional software packages are moving rapidly into the cloud. The need to own or rent your own IT infrastructure is dramatically reduced. Automated configuration and monitoring applications will become truly possible. The workload of software implementation projects will be reduced significantly, as software needs less adjustment. Traditional IT outsourcing is definitely threatened. Security management is one of the key factors to change, as security threats increasingly come from the digital world. For the IT sector, digitalisation simply means: “cheaper and better.”

The phrase “Communications Transforming Business” is becoming the new normal. The pace of change in enterprise communications and collaboration is very fast. A new set of capabilities, empowered by the combination of Mobility, the Cloud, Video, software architectures and Unified Communications, is changing expectations for what IT can deliver.

Global Citizenship: Technology Is Rapidly Dissolving National Borders. Besides your passport, what really defines your nationality these days? Is it where you live? Where you work? The language you speak? The currency you use? If so, then we may see the idea of “nationality” quickly dissolve in the decades ahead. Language, currency and residency are rapidly being disrupted and dematerialized by technology. Increasingly, technological developments will allow us to live and work almost anywhere on the planet… (and even beyond). In my mind, a borderless world will be a more creative, lucrative, healthy and, frankly, exciting one. Especially for entrepreneurs.

The traditional enterprise workflow is ripe for huge change as the focus moves away from working in a single context on a single device to the workflow being portable and contextual. InfoWorld’s executive editor, Galen Gruman, has coined a phrase for this: “liquid computing.” The increase in productivity is promised to be stunning, but the loss of control over data will cross an alarming threshold for many IT professionals.

Mobile will be used more and more. Currently, 49 percent of businesses across North America have adopted between one and ten mobile applications, indicating significant acceptance of these solutions. When properly leveraged, embracing mobility promises to increase visibility and responsiveness in the supply chain. Increased employee productivity and business process efficiencies are seen as the key business impacts.

The Internet of things is a big, confusing field waiting to explode.  Answer a call or go to a conference these days, and someone is likely trying to sell you on the concept of the Internet of things. However, the Internet of things doesn’t necessarily involve the Internet, and sometimes things aren’t actually on it, either.

The next IT revolution will come from an emerging confluence of liquid computing plus the Internet of things. These two trends are connected, or at least they should be. If we are to trust the consultants, we are in a sweet spot for significant change in computing that all companies and users should look forward to.

Cloud will be talked about a lot and taken more into use. Cloud is the next generation of the supply chain for IT. A global survey of executives predicted a growing shift towards third-party providers to supplement internal capabilities with external resources. CIOs are expected to adopt a more service-centric enterprise IT model. Global business spending for infrastructure and services related to the cloud will reach an estimated $174.2 billion in 2014 (up 20% from $145.2 billion in 2013), and growth will continue to be fast (“By 2017, enterprise spending on the cloud will amount to a projected $235.1 billion, triple the $78.2 billion in 2011”).

The rapid growth in mobile, big data, and cloud technologies has profoundly changed market dynamics in every industry, driving the convergence of the digital and physical worlds, and changing customer behavior. It’s an evolution that IT organizations struggle to keep up with. To succeed in this situation, you need to combine traditional IT with agile and web-scale innovation. There is value in both the back-end operational systems and the fast-changing world of user engagement. You are now effectively operating two-speed IT (bimodal IT, two-speed IT, or traditional IT/agile IT). You need a new API-centric layer in the enterprise stack, one that enables two-speed IT.

As Robots Grow Smarter, American Workers Struggle to Keep Up. Although fears that technology will displace jobs are at least as old as the Luddites, there are signs that this time may really be different. The technological breakthroughs of recent years — allowing machines to mimic the human mind — are enabling machines to do knowledge jobs and service jobs, in addition to factory and clerical work. Automation is not only replacing manufacturing jobs, it is displacing knowledge and service workers too.

In many countries the IT recruitment market is flying, having picked up to a post-recession high. Employers beware: after years of relative inactivity, job seekers are gearing up for change. Economic improvements and an increase in business confidence have led to a burgeoning jobs market and an epidemic of itchy feet.

Hopefully the IT department is increasingly being seen as a profit centre rather than a cost centre, with IT budgets commonly split between keeping the lights on and spending on innovation and revenue-generating projects. Historically IT was about keeping the infrastructure running and there was no real understanding outside of that, but the days of IT being locked in a basement are gradually changing. CIOs and CMOs must work more closely to increase focus on customers next year or risk losing market share, Forrester Research has warned.

Good questions to ask: Where do you see the corporate IT department in five years’ time? With the consumerization of IT continuing to drive employee expectations of corporate IT, how will this potentially disrupt the way companies deliver IT? What IT process or activity is the most important in creating superior user experiences to boost user/customer satisfaction?

 

Windows Server 2003 goes end of life in summer 2015 (July 14, 2015). There are millions of servers globally still running the 13-year-old OS, with one in five customers forecast to miss the 14 July deadline when Microsoft turns off extended support. There were estimated to be 2.7 million WS2003 servers in operation in Europe some months back. This will keep system administrators busy, because there is only around half a year left, and updating to Windows Server 2008 or Windows Server 2012 may prove difficult. Microsoft and support companies do not seem to be interested in continuing Windows Server 2003 support, so for those who need it, the custom pricing can be “incredibly expensive.” At this point it seems that many organizations desire a new architecture, and one option they are considering is moving the servers to the cloud.

Windows 10 is coming to PCs and mobile devices. Just a few months back Microsoft unveiled a new operating system, Windows 10. The new Windows 10 OS is designed to run across a wide range of machines, including everything from tiny “internet of things” devices in business offices to phones, tablets, laptops and desktops, all the way up to computer servers. Windows 10 will have exactly the same requirements as Windows 8.1 (the same minimum PC requirements that have existed since 2006: a 1GHz, 32-bit chip with just 1GB of RAM). A technical preview is available. Microsoft says to expect AWESOME things of Windows 10 in January. Microsoft will share more about the Windows 10 ‘consumer experience’ at an event on January 21 in Redmond and is expected to show a Windows 10 mobile SKU at the event.

Microsoft is going to monetize Windows differently than before. Microsoft has made headway in the market for low-end laptops and tablets this year by reducing the price it charges device manufacturers, charging no royalty on devices with screens of 9 inches or less. That has resulted in a new wave of Windows notebooks in the $200 price range and tablets in the $99 price range. The long-term success of the strategy against Android tablets and Chromebooks remains to be seen.

Microsoft is pushing Universal Apps concept. Microsoft has announced Universal Windows Apps, allowing a single app to run across Windows 8.1 and Windows Phone 8.1 for the first time, with additional support for Xbox coming. Microsoft promotes a unified Windows Store for all Windows devices. Windows Phone Store and Windows Store would be unified with the release of Windows 10.

Under new CEO Satya Nadella, Microsoft realizes that, in the modern world, its software must run on more than just Windows. Microsoft has already released Microsoft Office programs for the Apple iPad and iPhone. It also has an email client for both the iOS and Android mobile operating systems.

With Mozilla Firefox and Google Chrome grabbing so much of the desktop market—and Apple Safari, Google Chrome, and Google’s Android browser dominating the mobile market—Internet Explorer is no longer the force it once was. Microsoft May Soon Replace Internet Explorer With a New Web Browser article says that Microsoft’s Windows 10 operating system will debut with an entirely new web browser code-named Spartan. This new browser is a departure from Internet Explorer, the Microsoft browser whose relevance has waned in recent years.

SSD capacity has always lagged well behind hard disk drives (hard disks are in 6TB and 8TB territory while SSDs are primarily 256GB to 512GB). Intel and Micron will try to kill the hard drive with new flash technologies. Intel announced it will begin offering 3D NAND drives in the second half of next year as part of its joint flash venture with Micron. Later (within the next two years) Intel promises 10TB+ SSDs thanks to 3D vertical NAND flash memory. Interfaces to SSDs are also evolving beyond traditional hard disk interfaces. PCIe flash and NVDIMMs will make their way into shared storage devices more in 2015. The ULLtraDIMM™ SSD connects flash storage to the memory channel via standard DIMM slots, in order to close the gap between storage devices and system memory (less than five microseconds write latency at the DIMM level).

Hard disks will still be made in large amounts in 2015. It seems that NAND is not taking over the data centre immediately. The big problem is $/GB. Estimates of shipped disk and SSD capacity out to 2018 show disk growing faster than flash. The world’s ability to make and ship SSDs is falling behind its ability to make and ship disk drives – for SSD capacity to match disk by 2018 we would need roughly eight times more flash foundry capacity than we have. New disk technologies such as shingling, TDMR and HAMR are upping areal density per platter and bringing down cost/GB faster than NAND technology can. At present, solid-state drives with extreme capacities are very expensive. I expect that in 2015 the prices for SSDs will still be so much higher than for hard disks that everybody who needs to store large amounts of data will want to consider SSD + hard disk hybrid storage systems.

PC sales, and even laptop sales, are down, and manufacturers are pulling out of the market. The future is all about the device. We have entered the post-PC era so deeply that even the tablet market seems to be saturating, as most people who want one already have one. The crazy years of huge tablet sales growth are over. Tablet shipment growth in 2014 was already quite low (7.2%, to 235.7M units). There are no great reasons for growth or decline to be seen in the tablet market in 2015, so I expect it to be stable. IDC expects that the iPad will see its first-ever decline, and I expect that too, because the market seems to be more and more taken by Android tablets that have turned out to be “good enough.” Wearables, Bitcoin or messaging may underpin the next consumer computing epoch, after the PC, internet, and mobile.

There will be new tiny PC form factors coming. Intel is shrinking PCs to thumb-sized “compute sticks” that will be out next year. The stick will plug into the back of a smart TV or monitor “and bring intelligence to that”. The compute stick is likened to similar thumb PCs that plug into an HDMI port and are offered by PC makers with the Android OS and an ARM processor (for example the Wyse Cloud Connect and many cheap Android sticks). Such devices typically don’t have internal storage, but can be used to access files and services in the cloud. Intel expects the stick-sized PC market to grow to tens of millions of devices.

We have entered the post-Microsoft, post-PC programming era: the portable REVOLUTION. Tablets and smart phones are fine for consuming information: a great way to browse the web, check email, stay in touch with friends, and so on. But what does a post-PC world mean for creating things? If you’re writing platform-specific mobile apps in Objective-C or Java then no, the iPad alone is not going to cut it. You’ll need some kind of iPad-to-server setup in which your iPad becomes a thin client for the development environment running on your PC or in the cloud. If, however, you’re working with scripting languages (such as Python and Ruby) or building web-based applications, the iPad or another tablet could be a usable development environment. At least it is worth a try.

You need to prepare to learn new languages that are good for specific tasks. Attack of the one-letter programming languages: from D to R, these lesser-known languages tackle specific problems in ways worthy of a cult following. Watch out! The coder in the next cubicle might have been bitten and infected with a crazy-eyed obsession with a programming language that is not Java and goes by a mysterious one-letter name. Each offers compelling ideas that could do the trick in solving a particular problem you need fixed.

HTML5’s “Dirty Little Secret”: It’s Already Everywhere, Even In Mobile. Just look under the hood. “The dirty little secret of native [app] development is that huge swaths of the UIs we interact with every day are powered by Web technologies under the hood.” When people say Web technology lags behind native development, what they’re really talking about is the distribution model. It’s not that the pace of innovation on the Web is slower; it’s just solving a problem that is an order of magnitude more challenging than how to build and distribute trusted apps for a single platform. Efforts like the Extensible Web Manifesto have been largely successful at overhauling the historically glacial pace of standardization. Vine is a great example of a modern JavaScript app. It’s lightning fast on desktop and on mobile, and shares the same codebase for ease of maintenance.

Docker, meet hype. Hype, meet Docker. Docker: sorry, you’re just going to have to learn about it. Containers aren’t a new idea, and Docker isn’t remotely the only company working on productising containers. It is, however, the one that has captured hearts and minds. Docker containers are supported by very many Linux systems. And it is not just Linux anymore, as Docker’s app containers are coming to Windows Server, says Microsoft. What containerization lets you do is launch multiple applications that share the same OS kernel and other system resources but otherwise act as though they’re running on separate machines. Each is sandboxed off from the others so that they can’t interfere with each other. What Docker brings to the table is an easy way to package, distribute, deploy, and manage containerized applications.

Domestic software is on the rise in China. China is planning to purge foreign technology and replace it with homegrown suppliers. China is aiming to purge most foreign technology from banks, the military, state-owned enterprises and key government agencies by 2020, stepping up efforts to shift to Chinese suppliers, according to people familiar with the effort. In tests, workers have replaced Microsoft Corp.’s Windows with a homegrown operating system called NeoKylin (a FreeBSD-based desktop OS). Dell commercial PCs will preinstall NeoKylin in China. The plan is driven by national security concerns and marks an increasingly determined move away from foreign suppliers. There are cases of replacing foreign products at all layers, from applications and middleware down to the infrastructure software and hardware. Foreign suppliers may be able to avoid replacement if they share their core technology or give China’s security inspectors access to their products. The campaign could have lasting consequences for U.S. companies including Cisco Systems Inc. (CSCO), International Business Machines Corp. (IBM), Intel Corp. (INTC) and Hewlett-Packard Co. A key government motivation is to bring China up from low-end manufacturing to the high end.

 

Data center markets will grow. MarketsandMarkets forecasts the data center rack server market to grow from $22.01 billion in 2014 to $40.25 billion by 2019, at a compound annual growth rate (CAGR) of 7.17%. North America (NA) is expected to be the largest region for the market’s growth in terms of revenues generated, but Asia-Pacific (APAC) is also expected to emerge as a high-growth market.

The rising need for virtualized data centers and incessantly increasing data traffic are considered strong drivers for the global data center automation market. The SDDC comprises software-defined storage (SDS), software-defined networking (SDN) and software-defined server/compute, wherein all three components of the data center are empowered by specialized controllers, which abstract the control plane from the underlying physical equipment. These controllers virtualize the network, server and storage capabilities of a data center, thereby giving better visibility into data traffic routing and server utilization.

New software-defined networking apps will be delivered in 2015. And so will software-defined storage. And software-defined almost anything (I am waiting for the day we see software-defined software). Customers are ready to move away from vendor-driven proprietary systems that are overly complex and impede their ability to rapidly respond to changing business requirements.

Large data center operators will be using more and more of their own custom hardware instead of standard PCs from traditional computer manufacturers. Intel is betting on (customized) commodity chips for cloud computing, and it expects that over half the chips it sells to public clouds in 2015 will have custom designs. The biggest public clouds (Amazon Web Services, Google Compute, Microsoft Azure), other big players (like Facebook or China’s Baidu) and other public clouds (like Twitter and eBay) all have huge data centers that they want to run optimally. Companies like A.W.S. “are running a million servers, so floor space, power, cooling, people — you want to optimize everything”. That is why they want specialized chips. Customers are willing to pay a little more for a special run of chips. While most of Intel’s chips still go into PCs, about one-quarter of Intel’s revenue, and a much bigger share of its profits, comes from semiconductors for data centers. In the first nine months of 2014, the average selling price of PC chips fell 4 percent, but the average price of data center chips was up 10 percent.

We have seen GPU acceleration taken into wider use. Special servers and supercomputer systems have long been accelerated by moving calculations to graphics processors. The next step in acceleration will be adding FPGAs to accelerate x86 servers. FPGAs provide a unique combination of highly parallel custom computation, relatively low manufacturing/engineering costs, and low power requirements. FPGA circuits may provide a lot more computing power at a much lower power consumption, but traditionally programming them has been time consuming. This can change with the introduction of new tools (just the next step from techniques learned from GPU acceleration). Xilinx has developed its SDAccel tools to develop algorithms in the C, C++ and OpenCL languages and translate them to FPGA easily. IBM and Xilinx have already demoed FPGA-accelerated systems. Microsoft is also doing research on accelerating applications with FPGAs.


If there is one enduring trend for memory design in 2014 that will carry through to next year, it’s the continued demand for higher performance. The trend toward high performance is never going away. At the same time, the goal is to keep costs down, especially when it comes to consumer applications using DDR4 and mobile devices using LPDDR4. LPDDR4 will gain a strong foothold in 2015, and not just to address mobile computing demands. The reality is that LPDDR3, or even DDR3 for that matter, will be around for the foreseeable future (lowest-cost DRAM, whatever that may be). Designers are looking for subsystems that can easily accommodate DDR3 in the immediate future, but will also be able to support DDR4 when it becomes cost-effective or makes more sense.

Universal Memory for Instant-On Computing will be talked about. New memory technologies promise to be strong contenders for replacing the entire memory hierarchy for instant-on operation in computers. HP is working with memristor memories that are promised to be akin to RAM but can hold data without power.  The memristor is also denser than DRAM, the current RAM technology used for main memory. According to HP, it is 64 and 128 times denser, in fact. You could very well have a 512 GB memristor RAM in the near future. HP has what it calls “The Machine”, practically a researcher’s plaything for experimenting on emerging computer technologies. Hewlett-Packard’s ambitious plan to reinvent computing will begin with the release of a prototype operating system in 2015 (Linux++, in June 2015). HP must still make significant progress in both software and hardware to make its new computer a reality. A working prototype of The Machine should be ready by 2016.

Chip designs that enable everything from a 6 Gbit/s smartphone interface to the world’s smallest SRAM cell will be described at the International Solid State Circuits Conference (ISSCC) in February 2015. Intel will describe a Xeon processor packing 5.56 billion transistors, and AMD will disclose an integrated processor sporting a new x86 core, according to a just-released preview of the event. The annual ISSCC covers the waterfront of chip designs that enable faster speeds, longer battery life, more performance, more memory, and interesting new capabilities. There will be many presentations on first designs made in 16 and 14 nm FinFET processes at IBM, Samsung, and TSMC.

 

1,403 Comments

  1. Tomi Engdahl says:

    Hacking a Universal Assembler
    http://hackaday.com/2015/08/06/hacking-a-universal-assembler/

    I have always laughed at people who keep multitools–those modern Swiss army knives–in their toolbox. To me, the whole premise of a multitool is that they keep me from going to the toolbox. If I’ve got time to go to the garage, I’m going to get the right tool for the job.

    Not that I don’t like a good multitool. They are expedient and great to get a job done. That’s kind of the way I feel about axasm — a universal assembler I’ve been hacking together. To call it a cross assembler hack doesn’t do it justice. It is a huge and ugly hack, but it does get the job done.

    Why A Cross Assembler?

    A cross compiler is a compiler capable of creating executable code for a platform other than the one on which the compiler is running. For example, a compiler that runs on a Windows 7 PC but generates code that runs on an Android smartphone is a cross compiler. You are probably wondering why I wanted to write an assembler. The problem is, I like to design custom CPUs (usually implemented in Verilog on an FPGA). These CPUs have unique instruction sets, and that means there is no off-the-shelf assembler. There are some table-driven assemblers that you can customize, but then you have to learn some arcane specialized syntax for that tool. My goal wasn’t to write an assembler. It was to get some code written for my new CPU.

    The Hack

    The C preprocessor has a bad reputation, probably because it is like dynamite. It is amazingly useful and also incredibly dangerous, especially in the wrong hands. It occurred to me that if my assembly language looked like C macros, I could easily create a custom assembler from a fixed skeleton.

    My plan was simple: Use an awk script to convert conventional assembler format code into macros.

    The Result

    Unlike a normal assembler, the output file from the script isn’t the machine code. It is two sets of C language macros that get included with the standard source code for the assembler. A driver script orchestrates the whole thing. It runs the script, calls the compiler, and then executes the resulting (temporary) program (passing it any options you specified). The standard source code just gets a buffer filled with your machine code and emits it in one of several available formats.

    In case that wasn’t clear enough, the program generated has one function: to print out your specific assembly language program in machine code using some format. That’s it. You don’t save the executables. Once they run, they aren’t useful anymore.

    The key, then, is configuring the macro files.

    Once you have your assembly language program and a suitable processor definition, it is easy to run axasm from the command line

    It may seem a little strange to pervert the C preprocessor this way, but it does give a lot of advantages. First, you can define your CPU instruction set in a comfortable language and use powerful constructs in functions and macros to get the job done. Second, you can use all the features of the C compiler. Constant math expressions work fine, for example.

    Naturally, since AXASM works for custom processors, you can also define standard processors, too. Github has definitions for the RCA1802, the 8080, and the PIC16F84. If you create a new definition, please do a pull request on Github and share.

    wd5gnr/axasm
    Universal Cross Assembler
    https://github.com/wd5gnr/axasm

  2. Tomi Engdahl says:

    Emil Protalinski / VentureBeat:
    Firefox 40 arrives with Windows 10 support, expanded malware protection, and new Android navigation gestures
    http://venturebeat.com/2015/08/11/firefox-40-arrives-with-windows-10-support-expanded-malware-protection-and-new-android-navigation-gestures/

  3. Tomi Engdahl says:

    Fresh flashery from the NANDsters
    Flashing their goods at the Flash Memory Summit crowd
    http://www.theregister.co.uk/2015/08/12/fresh_flashery_from_nandsters/

    The Flash Memory Summit has seen a fresh crop of SSDs and PCIe flash cards appear from NAND fans BiTMICRO, Seagate and Toshiba as they try to wow delegates and get sales momentum started.
    Seagate Nytro flash cards

    Seagate, flush and fresh with its strategic alliance with Micron over flash, has announced three new Nytro (acquired LSI flash card technology) products; the XF1440 2.5-inch NVMe SSD, the XM1440 M.2 NVMe SSD, and the XP6550 flash accelerator card. They all use eMLC flash.

    Seagate says the XP1550 is warranted for 5 years or the end of the NAND’s life, without telling us when that will be on the datasheet; pretty unhelpful.

    Tosh triplets

    Not content with introducing one new PCIe flash product, Tosh is giving birth to PCIe flash triplets. The thrilling threesome use Tosh’s own MLC flash, 19nm possibly, 16nm maybe, and have NVMe drivers.

    Known for a military product focus, BiTMICRO is launching MAXio Z-Series flash PCIe cards for enterprises. They support up to 8.8TB of capacity, in a PCIe edge card format, and use two BiTMICRO-developed ASICS to boost performance. Flash Translation Layer (FTL) processing is offloaded from the host server to the SSD.

    Availability and stuff

    After Samsung’s 48-layer, 3D NAND news, Micron/Intel’s 3D Point non-volatile memory, SanDisk’s 48-layer 3D NAND production piloting, and HGST’s PCM demo, these three sets of SSD announcements are middle-of-the-road normal. BiTMICRO’s has technology interest with the FTL off-loading, but the others are straightforward evolutionary advances on existing MLC flash product. Worthy, of course, that goes without saying, but not sexy.

  4. Tomi Engdahl says:

    Lenovo used a hidden Windows feature to ensure its software could not be deleted
    http://thenextweb.com/insider/2015/08/12/lenovo-used-a-hidden-windows-feature-to-ensure-its-software-could-not-be-deleted/

    A recently uncovered feature – which had been swept under the rug – allowed new Lenovo laptops to use new Windows features to install the company’s software and tools even if the computer was wiped.

    The oddity was first noted by Ars Technica forum user ‘ge814‘ and corroborated by Hacker News user ‘chuckup.’

  5. Tomi Engdahl says:

    Android and Windows make gains in enterprise tablet market at expense of iOS
    The iPad is starting to lose its grip
    By Carly Page
    http://www.theinquirer.net/inquirer/news/2421762/android-and-windows-make-gains-in-enterprise-tablet-market-at-expense-of-ios

    APPLE’S SHARE of the enterprise tablet market saw a sharp decline in the second quarter of 2015, while Android and Windows made “significant gains”.

    “The erosion in iPad dominance points to a change in the tablet market as the long-predicted role of tablets as laptop replacements finally becomes a reality,” Good Technology said in its report.

    Apple’s lost market share has been quickly snapped up by Google and Microsoft, both of which made huge gains during the three-month period.

    Android tablets, despite ongoing security fears, claimed a quarter of all business tablet activations in Q2, up from 15 percent in the first quarter.

    Microsoft is hot on Google’s heels. Windows tablets, including the firm’s own Surface, accounted for 11 percent of activations during the period, up from four percent in the previous quarter.

    Good Technology said: “The growth of Windows, which includes Surface devices and devices from Windows OEMs, was especially impressive given that only two quarters ago Windows made up only one percent of tablet sales.”

  6. Tomi Engdahl says:

    Eva Dou / Wall Street Journal:
    Lenovo to cut 3200 non-manufacturing jobs after its Q1 net profit drops 50.9% YoY to $105M; job cuts part of company’s $650M cost-cutting effort

    Lenovo to Cut Jobs as PC Market Continues to Contract
    Chinese PC maker sees net profit fall for the fiscal first quarter
    http://www.wsj.com/article_email/lenovo-to-cut-jobs-as-pc-market-continues-to-contract-1439424194-lMyQjAxMTA1NjE3MzgxMjMwWj

    BEIJING—Chinese PC maker Lenovo Group Ltd. said it would cut jobs and restructure as it continues the integration of two major acquisitions, after reporting a 51% slump in fiscal first-quarter earnings.

    The news sent shares of Lenovo, which bought smartphone maker Motorola Mobility and International Business Machines Corp. ’s low-end server unit last year, to their lowest level in nearly 18 months.

    The company faces tough conditions as the global personal-computer market continues to contract and China’s smartphone market becomes saturated.

    Chief Executive Yang Yuanqing said in a statement that the past quarter was possibly “the toughest market environment in recent years.”

    The world’s largest PC maker by shipments said Thursday it will cut 10% of its nonmanufacturing positions, or 3,200 people, as part of a $650 million cost-cutting program in the second half of the fiscal year. That equates to 5% of its total head count.

    Lenovo spent roughly $5 billion last year to buy the Motorola smartphone business from Google Inc. and IBM’s low-end server unit.

    Reply
  7. Tomi Engdahl says:

    Microsoft: Surface hub will ship from January 1, 2016
    Fondle-giant to lumber out of Oregon at last
    http://www.theregister.co.uk/2015/08/13/microsoft_surface_hub_will_start_shipping_on_january_1_2016/

    Microsoft has quietly announced when its Surface Hub will ship, and if you’re impatient, you won’t be pleased: it won’t be until January 1, 2016.

    In July, it emerged that manufacturing problems were going to delay the planned September 1 ship date of the product.

    Redmond has now updated its “we’re running late” blog post with the bare information about the ship date.

    The wall-mounted too-big-to-fondleslabs are being built in Oregon, and Microsoft originally said pre-orders meant it was going to have to scale up its manufacturing plans.

    The fondlewalls are Microsoft’s attempt to carve a chunk out of Cisco’s collaboration business, with 55” and 80” units that can bring the true horror of any PowerPoint presentation to enormous life.

    As well as the Universal Apps that also run on PCs and Windows phones, Microsoft wants developers to create large-screen apps for the Surface Hub.

    Reply
  8. Tomi Engdahl says:

    Today’s top stories
    Tested: How Flash destroys your browser’s performance
    http://www.itworld.com/article/2967202/applications/tested-how-flash-destroys-your-browsers-performance.html

    We tested the effects of browsing with and without Flash on several major browsers. Enabling Flash is, in a word, catastrophic.

    In case you needed another reason to uninstall Adobe Flash, we’ve got one: It can drag down your PC by as much as 80 percent. Yes, 80 percent. So not only is Adobe Flash incredibly unsafe, it’s a memory hog. And we’ve got the numbers to prove it.

    As part of an upcoming roundup of the major browsers, we tested their abilities to handle Flash. Two browsers, Mozilla Firefox and Opera, do not include Flash, although you can download a plugin from Adobe to enable it. A third, Microsoft’s new Edge browser, enables Flash by default, although you can manually turn it off. Both Internet Explorer 11 and Google’s Chrome also include Flash, which you can disable or adjust within the Settings menu.

    Reply
  9. Tomi Engdahl says:

    The LibreOffice Story
    http://news.slashdot.org/story/15/08/12/1747204/the-libreoffice-story

    Jono Bacon in his latest column writes about the story of LibreOffice and how it rose out of the ashes of StarOffice and OpenOffice.org. Bacon also touches on why he feels LibreOffice is such a key piece of Open Source for communities across the world.

    Developing LibreOffice has been hard, technically challenging, and at times demotivating work, and contributors’ efforts can be seen by millions of users across the world.

    LibreOffice community achievements
    http://opensource.com/life/15/8/libreoffice-community-achievements

    Recently the LibreOffice project released version 5.0 of their cross-platform office suite. The new release brings together a raft of features for a comprehensive office suite that runs across Linux, Windows, Mac, and increasingly the mobile space.

    Saying LibreOffice or OpenOffice to people can lead to interesting reactions. For some people, LibreOffice is the darling of the open source world, and for others, it is a crappy Microsoft Office alternative that they look down on.

    I believe that LibreOffice plays an important function in the world, and one that spans beyond the mere function of an office suite.

    Reply
  10. Tomi Engdahl says:

    Lenovo CEO: We will axe 3,200 workers as our profits shrink to nowt
    Pre-tax income dives 80 per cent as fewer people find cash for PCs
    http://www.channelregister.co.uk/2015/08/13/lenovo_q1_fy2016_results_80_per_cent_profit_shrink/

    There’s no room for sentiment or emotion in business, as Lenovo showed today when it reported an 80 per cent crash in pre-tax profits and plans to axe 3,200 jobs across the group.

    Revenues for the three months ended 30 June grew three per cent, excluding the impact of forex conversions, to $10.7bn – but the firm saw a 47 per cent hike in expenses to $1.55bn.

    “Last quarter, we faced perhaps the toughest market environment in recent years,” said chairman and CEO Yuanqing Yang.

    The perfect storm included a shrinking PC and tablet market that resulted in some stock write downs, macroeconomic “challenges” in Latin America, and intense competition that Lenovo said “hurt” Motorola’s bottom line. And it saw a “rapidly shifting technology landscape in the enterprise business.”

    In the quarter, Lenovo was also forced to write down PC stocks in Europe, as the company admitted, amid the severe slowdown in sales that hit all vendors.

    “We must be proactive and decisive now,” said Yang, “though very difficult, this action will include a reduction in workforce of about 3,200 people.” This equates to ten per cent of the non-manufacturing headcount or five per cent of all employees.

    Reply
  11. Tomi Engdahl says:

    Dell, Google dangle Chromebooks over IT bosses sick of Windows
    Although they can still run Windows, ish
    http://www.theregister.co.uk/2015/08/13/google_makes_pitch_to_push_chromebooks_into_business_with_dell/

    Google’s Chromebooks are just over four years old and, while the hardware has done well in education, businesses and normal people haven’t been too keen.

    In response, the ad giant has teamed up with Dell to fix this with a line of Chromebooks for business. Rajen Sheth, director of product management for Android and Chrome for Business and Education, said that with businesses updating from Windows XP, there’s an opportunity for Google to pick up some market share.

    “This is a long term move from legacy systems,” Kirk Schell, Dell’s GM for commercial client solutions, told a press conference in San Francisco on Thursday. “It’s going to be a long process, which is why you’ll see Google apps running alongside legacy apps via virtualization.”

    The Dell Chromebook 13 (for the size, not for luck) is a carbon fiber and magalloy machine that has the look of a premium laptop and weighs in at a reasonably portable 3.23lb, with the touchscreen adding a third of a pound to that.

    Reply
  12. Tomi Engdahl says:

    Notebook size decreases

    As school starts, many students buy large laptops with screens of up to 15 inches. Market research, however, indicates that laptop screen sizes are shrinking: sales of larger panels are slowing, and according to IHS this shows that consumers are no longer interested in large, 15-inch screens.

    Sales of large panels fell 14 per cent in January-June from a year earlier. Still, at 38.4 million units sold, 15-inch remains the most popular display size.

    Meanwhile, sales of 11-inch displays grew 35 per cent, from 8 million to 11 million units. IHS attributes much of that growth to the popularity of Chromebook laptops.

    Source: http://etn.fi/index.php?option=com_content&view=article&id=3193:lappareiden-koko-pienenee&catid=13&Itemid=101

    Reply
  13. Tomi Engdahl says:

    Ron Miller / TechCrunch:
    IBM to build LinuxOne mainframes running Ubuntu Linux in partnership with Canonical; IBM joins Linux Foundation’s new Open Mainframe Project

    IBM Teams With Canonical To Put Ubuntu Linux On Mainframes
    http://techcrunch.com/2015/08/16/ibm-teams-with-canonical-on-linux-mainframe/

    You might not think that ‘Linux’ and ‘mainframe’ belong in the same sentence, but IBM has been putting various flavors of Linux on its mainframe computers for 15 years. Today IBM and Canonical announced that the two companies were teaming up to build one running Ubuntu Linux. The new unit is called the LinuxOne.

    The announcement comes as part of a broader strategy from IBM designed to drive mainframe usage to a wider audience. This new approach includes a monthly subscription pricing model, deeper involvement with other open source projects, contributing a huge cache of mainframe code to open source and participating in the newly launched Open Mainframe Project.

    The new mainframes come in two flavors, named for penguins (Linux — penguins — get it?). The first is called Emperor and runs on the IBM z13, which we wrote about in January. The other is a smaller mainframe called the Rockhopper designed for a more “entry level” mainframe buyer.

    Reply
  14. Tomi Engdahl says:

    Filling the void between fast/expensive DRAM and slow/cheaper flash
    Looks to be spin about spin and torque talk
    http://www.theregister.co.uk/2015/08/17/filling_void_between_fast_expensive_dram_slow_cheaper_flash/

    The AupM001 storage module is made of ST-MRAM (Spin Torque Magnetic RAM) which combines DRAM speed with non-volatility and an interface compatible with DDR3 SDRAM.

    IT is a candidate tech to fill the void between fast, expensive and volatile DRAM on the one hand, and slow, cheaper, and persistent flash on the other.

    Intel and Micron have just launched 3D XPoint memory aimed at roughly the same place in the market, while HGST and Mellanox are involved in PCIe-connected PCM storage devices, a third contender for this spot in the DRAM-NAND gap.

    As a DRAM-NAND gap filling candidate, ST-MRAM chips from Everspin are here now and shipping in products, which is much more than can be said for either PCM or XPoint.

    It does not have wide adoption, and its use in high-speed, buffer-type functions suggests its cost profile might work against wider adoption.

    Reply
  15. Tomi Engdahl says:

    Edge browser fails to win over Windows 10 users
    http://www.computerworld.com/article/2971957/web-browsers/edge-browser-fails-to-win-over-windows-10-users.html

    Even with aggressive setup switching, Edge has been adopted by a minority of those running Windows 10, according to two metrics vendors

    Microsoft’s new Edge browser is being used by a minority of those running Windows 10 — between one-sixth and one-third — according to data from a pair of analytics vendors.

    The early returns on Edge not only hint at Microsoft’s failure to get the earliest adopters to rely on the new browser, but also call into question Mozilla’s contention that Windows 10’s setup will result in defections from its own Firefox, or by association, other non-Microsoft browsers.

    Reply
  16. Tomi Engdahl says:

    IBM ‘TrueNorth’ Neuro-Synaptic Chip Promises Huge Changes — Eventually
    http://tech.slashdot.org/story/15/08/18/2257243/ibm-truenorth-neuro-synaptic-chip-promises-huge-changes—-eventually?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+Slashdot%2Fslashdot%2Fto+%28%28Title%29Slashdot+%28rdf%29%29

    Each of IBM’s “TrueNorth” chips contains 5.4 billion transistors and runs on 70 milliwatts. The chips are designed to behave like neurons—the basic building blocks of biological brains. Dharmendra Modha, the head of IBM’s cognitive computing group, says a system of 24 connected chips simulates 48 million neurons, roughly the same number rodents have.

    IBM’s ‘Rodent Brain’ Chip Could Make Our Phones Hyper-Smart
    http://www.wired.com/2015/08/ibms-rodent-brain-chip-make-phones-hyper-smart/

    I can see the computer chips and the circuit boards and the multi-colored lights on the inside. It looks like a prop from a ’70s sci-fi movie, but Modha describes it differently. “You’re looking at a small rodent,” he says.

    He means the brain of a small rodent—or, at least, the digital equivalent. The chips on the inside are designed to behave like neurons—the basic building blocks of biological brains. Modha says the system in front of us spans 48 million of these artificial nerve cells, roughly the number of neurons packed into the head of a rodent.

    Modha oversees the cognitive computing group at IBM, the company that created these “neuromorphic” chips. For the first time, he and his team are sharing their unusual creations with the outside world, running a three-week “boot camp” for academics and government researchers at an IBM R&D lab on the far side of Silicon Valley. Plugging their laptops into the digital rodent brain at the front of the room, this eclectic group of computer scientists is exploring the particulars of IBM’s architecture and beginning to build software for the chip dubbed TrueNorth.

    Some researchers who got their hands on the chip at an engineering workshop in Colorado the previous month have already fashioned software that can identify images, recognize spoken words, and understand natural language. Basically, they’re using the chip to run “deep learning” algorithms, the same algorithms that drive the internet’s latest AI services, including the face recognition on Facebook and the instant language translation on Microsoft’s Skype. But the promise is that IBM’s chip can run these algorithms in smaller spaces with considerably less electrical power, letting us shoehorn more AI onto phones and other tiny devices, including hearing aids and, well, wristwatches.

    “What does a neuro-synaptic architecture give us? It lets us do things like image classification at a very, very low power consumption,”

    The TrueNorth is part of a widespread movement to refine the hardware that drives deep learning and other AI services. Companies like Google and Facebook and Microsoft are now running their algorithms on machines backed with GPUs (chips originally built to render computer graphics), and they’re moving towards FPGAs (chips you can program for particular tasks). For Peter Diehl, a PhD student in the cortical computation group at ETH Zurich and University Zurich, TrueNorth outperforms GPUs and FPGAs in certain situations because it consumes so little power.

    The main difference, says Jason Mars, a professor of computer science at the University of Michigan, is that the TrueNorth dovetails so well with deep-learning algorithms. These algorithms mimic neural networks in much the same way IBM’s chips do, recreating the neurons and synapses in the brain.

    Reply
  17. Tomi Engdahl says:

    Data Storage Intel
    Intel Promises ‘Optane’ SSDs Based On Technology Faster Than Flash In 2016
    http://hardware.slashdot.org/story/15/08/18/2222244/intel-promises-optane-ssds-based-on-technology-faster-than-flash-in-2016

    Intel today announced that it will introduce SSDs based on a new non-volatile memory that is significantly faster than flash in 2016. A prototype was shown operating at around seven times as fast as a high-end SSD available today.

    Scant details have been released, but the technology has similarities with the RRAM and memristor technologies being pursued by other companies.

    Intel’s Reinvention of the Hard Drive Could Make All Kinds of Computers Faster
    http://www.technologyreview.com/news/540411/intels-reinvention-of-the-hard-drive-could-make-all-kinds-of-computers-faster/

    A new kind of hard drive available next year will be able to move your data many times faster than the best today.

    Reply
  18. Tomi Engdahl says:

    China’s Tianhe-1 supercomputer back online after Tianjin blast
    Data centre survives explosion as corruption accusations fly
    http://www.theregister.co.uk/2015/08/19/chinas_tianhe1_supercomputer_back_online_after_tianjin_blast/

    The Tianhe-1 supercomputer in the Chinese city of Tianjin is back online.

    State organs China Youth Daily and People.com.cn are both reporting, with more than a little tinge of pride, that the supercomputer survived the blast thanks to the resilience of the data centre in which it was housed.

    The data centre is a couple of kilometres from the warehouse that last week exploded with extraordinary force.

    The supercomputer, ranked the world’s 24th most-powerful, is widely used by Chinese government agencies, so its return to operations is doubtless welcome. Whether it is fully operational is harder to gauge, as China is keen to ensure its citizens, and the world, don’t perceive disruption to its activities.

    Reply
  19. Tomi Engdahl says:

    Huge savings prompt Italian city to dump OpenOffice for Microsoft after four years
    http://www.zdnet.com/article/huge-savings-prompt-italian-city-to-dump-openoffice-for-microsoft-after-four-years/

    According to a study commissioned by the government body – and Microsoft – the costs of using an open source suite far outweigh those of using Office 365.

    Is it better for local government bodies to use open source or proprietary software? It’s an argument that generates heated opinion on both sides. Supporters of the former highlight the savings that come from getting rid of licenses, while those who prefer the latter argue that proprietary products need little training to use and, generally speaking, boast a wider set of features.

    One case study from Italy illustrates the second point of view. Between 2011 and 2014, the municipality of Pesaro, in the Marche region, trained up its 500 employees to use OpenOffice. However, last year the organization decided to switch back to Microsoft and use its cloud productivity suite Office 365.

    According to a recently published report from the Netics Observatory – commissioned by the municipality and Microsoft itself – the city administration will be able to save up to 80 percent of the software’s total cost of ownership (which includes deployment, IT support, subscription plans cost, and other elements) using Office 365, compared to its previous setup.

    It might sound surprising, but according to the municipality’s head of the statistics and information systems department, Stefano Bruscoli – who was interviewed by the authors of the report – the savings are largely due to the significant and unexpected deployment costs that the administration faced when it decided to rollout OpenOffice and abandon the on-premise version of Microsoft Office that it had used up until 2011.

    “We encountered several hurdles and dysfunctions around the use of specific features,”

    In particular, having to repaginate and tweak a number of documents due to a lack of compatibility between the proprietary and the open source systems translated into a considerable waste of time and productivity. The management estimates that every day roughly 300 employees had to spend up to 15 minutes each sorting out such issues.
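    The scale of the waste the report describes is easy to check with back-of-the-envelope arithmetic, using only the figures quoted above (the person-hours total is my own calculation, not a number from the report):

```python
# Figures quoted in the report: roughly 300 employees losing up to
# 15 minutes each per day to document compatibility fix-ups.
employees = 300
minutes_lost_each = 15

total_minutes_per_day = employees * minutes_lost_each
total_hours_per_day = total_minutes_per_day / 60

print(total_hours_per_day)  # 75.0 person-hours per day at the upper bound
```

    At the upper bound, that is roughly ten full-time positions' worth of work lost each day, which helps explain why deployment costs dominated the TCO comparison.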

    Reply
  20. Tomi Engdahl says:

    Why SharePoint is the last great on-premises application
    http://www.cio.com/article/2970173/collaboration-software/why-sharepoint-is-the-last-great-on-premises-application.html

    While it seems like almost every piece of IT is moving to cloud these days, there are still plenty of reasons to keep SharePoint in your server room – where it belongs.

    At the Worldwide Partner Conference (WPC) last month in Orlando, we heard many of the same grumblings we’ve been hearing about Microsoft for years now: They don’t care about on-premises servers. They’re leaving IT administrators in the dust and hanging them out to dry while forcing Azure and Office 365 content on everyone. They’re ignoring the small and medium business.

    It’s hard to ignore this trend. It’s also true that the cost-to-benefit ratio continues to decrease to the point where common sense favors moving many workloads up to the cloud where you can transform capex and personnel expense to opex that scales up and down very easily.

    But SharePoint Server is such a sticky product with tentacles everywhere in the enterprise that it may well be the last great on-premises application. Let’s explore why.

    The cloud simply means someone else’s computer

    One clear reason is that SharePoint, for so many organizations, hosts a large treasure trove of content, from innocuous memos and agendas for weekly staff meetings to confidential merger and acquisitions documents. In most organizations, human resources uses SharePoint to store employee compensation analysis data and spreadsheets; executives collaborate within their senior leadership teams and any high-level contacts outside the organization on deals that are proprietary and must be secured at all times; and the product planning and management group stores product plans, progress reports and even backups of source code all within SharePoint sites and document libraries.

    No matter how secure Microsoft or any other cloud provider claims it can make its hosted instances of SharePoint, there will always be that nagging feeling in the back of a paranoid administrator’s head: Our data now lives somewhere that is outside of my direct control. It’s an unavoidable truth, and from a security point of view, the cloud is just a fancy term for someone else’s computer.

    Not even Microsoft claims that every piece of data in every client tenant within SharePoint Online is encrypted. Custom Office 365 offerings with dedicated instances for your company can be made to be encrypted, and governmental cloud offerings are encrypted by default, but a standard E3 or E4 plan may or may not be encrypted. Microsoft says it is working on secure defaults, but obviously this is a big task to deploy over the millions of servers they run.

    Nothing is going to stop the FBI, the Department of Justice, the National Security Agency or any other governmental agency in any jurisdiction from applying for and obtaining a subpoena to just grab the physical host that stores your data and walk it right out of Microsoft’s data center into impound and seizure. Who knows when you would get it back? Microsoft famously does not offer regular backup service of SharePoint, relying instead on mirror images and duplicate copies for fault tolerance.

    Worse, you might not even know that the government is watching or taking your data from SharePoint Online.

    It’s tough for many – perhaps even most – Fortune 500 companies to really get their heads around this idea. And while Microsoft touts the idea of a hybrid deployment, it’s difficult and not inexpensive and (at least until SharePoint 2016 is released) a bit kludgy as well.

    It’s (sort of) an application development platform

    Some companies have taken advantage of SharePoint’s application programming interfaces, containers, workflow and other technologies to build in-house applications on top of the document and content management features. Making those systems work on top of Office 365 and SharePoint Online can be a very difficult beast to tame.

    It’s a choice with less obvious benefits – there is lower-hanging fruit

    Email is still the slam dunk of cloud applications. Your organization derives no competitive advantage, no killer differentiation in the marketplace, from running a business email server like Microsoft Exchange. It is simply a cost center.

    Secure email solutions exist now that encrypt transmissions and message stores both at rest and in transit, so security in the email space is much more mature than, say, hosted SharePoint. No wonder Exchange Online is taking off.

    SharePoint is not as clear a case here. While you might choose to put your extranet on SharePoint Online or host a file synchronization solution in the cloud

    Reply
  21. Tomi Engdahl says:

    Why every member of your team should be a UX expert
    http://thenextweb.com/uxdesign/2015/08/11/why-every-member-of-your-team-should-be-a-ux-expert/

    Building a culture of User Experience (UX) awareness requires education and leadership. You’ll need some buy-in from employees across the different teams to get them to believe that they should care about UX, even if it’s not part of their core job description.

    UX is by nature multidisciplinary. It requires the input of designers, information architects, usability specialists, and many others to make it happen. Getting a polished, final product requires more than just the design team.

    If everybody thinks of how they want the user to experience and feel the product, that can go a long way towards ensuring a finished result with excellent design and functionality. Getting there may require a change in thinking, but it can be well worth the effort in the end.

    Avoid the myths

    Jerry Cao, a UX content strategist at UXPin, offers an excellent place to start: avoid the common myths that surround User Interface design. For example, he warns that functional designs don’t automatically create good experiences. User Experience is about creating a good feeling on the part of the visitor. So for a website, for example, customer testimonials can be a powerful way to create trust.

    Reply
  22. Tomi Engdahl says:

    Build a “Virtual SuperComputer” with Process Virtualization
    http://www.linuxjournal.com/content/build-virtual-supercomputer-process-virtualization?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+linuxjournalcom+%28Linux+Journal+-+The+Original+Magazine+of+the+Linux+Community%29

    Build and release is a complicated process. I really don’t need to tell anyone that…but I did anyway. But a rapid, precise feedback cycle that notifies whether the latest build of software passed successfully or not, and one that tells you WHAT failed specifically instead of just “operation failed,” can mean the difference between being able to quickly fix defects, or not discovering those defects until later in the project’s lifecycle, thereby increasing exponentially the cost and difficulty of fixing them. IncrediBuild has been addressing precisely this with Microsoft TFS for years, and is now reinventing their tool for Linux developers.

    The secret, according to Eyal Maor, IncrediBuild’s CEO, is what they call Process Virtualization. In a nutshell, Process Virtualization helps to speed up the build process by allowing the build machine to utilize all the cores in ALL the machines across the network. Maor says that in most cases, “cycles can be reduced from 20 minutes to under 2 minutes, and there are many more cycles available.”

    For Linux Journal readers, perhaps the most interesting part of this is how it works.

    So obviously, operations need to be done in parallel, but the devil is in the details for parallel processing. Remember that, if improving speed by more efficient usage of multi-core CPUs is the goal, attention should be focused on parallelizing CPU-bound operations, not I/O-bound operations. Beyond that, there are a couple of possible parallel processing solutions:

    Clustering – Essentially parallelizing the build process at the workflow level, running one complete workflow on each agent, so that multiple build requests can be handled in parallel.
    HPC – Computers (physical or virtual) that can aggregate computing power in a way that delivers much higher performance than that of an individual desktop computer or workstation. HPCs make use of many CPU cores in order to accomplish this. Ubuntu-based machines, for example, can support up to 256 cores.

    While either of these solutions provides opportunity to speed up processes, both have limitations and drawbacks.

    IncrediBuild, on the other hand, transforms any build machine into a virtual supercomputer by allowing it to harness idle CPU cycles from remote machines across the network even while they’re in use. No changes are needed to source code, no additional hardware is required, no dedicated cluster is necessary and it works on both local and WAN so one can scale limitlessly into the cloud. It does not disturb the normal performance of existing operations on the remote machine, instead making use of idle CPU cycles on any cores on the network.

    Process virtualization consists of two actors – the initiator machine and the helper machines. The initiator distributes computational processes over the network as if the remote machines’ cores were its own, while the processes executed on the helpers

    IncrediBuild is a product of Xoreax Software which launched in 2002 and has since become the de facto standard solution for code-build acceleration.
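    The initiator/helper split described above can be illustrated with a short sketch. This is not IncrediBuild's API (its distribution layer is proprietary and not shown in the article); it is a minimal local analogue using Python's standard concurrent.futures, with worker processes standing in for the cores of remote helper machines:

```python
# Minimal sketch of the initiator/helper idea: an "initiator" farms out
# independent, CPU-bound build steps to a pool of workers. Local processes
# stand in here for remote helper machines on the network.
from concurrent.futures import ProcessPoolExecutor


def compile_unit(source_name: str) -> str:
    """Stand-in for one CPU-bound build step (e.g. compiling one file)."""
    sum(i * i for i in range(100_000))  # simulate CPU-bound work
    return source_name.rsplit(".", 1)[0] + ".o"


def build(sources):
    """The initiator: distribute independent compile jobs across all cores."""
    with ProcessPoolExecutor() as pool:
        return list(pool.map(compile_unit, sources))


if __name__ == "__main__":
    print(build([f"module_{i}.c" for i in range(8)]))
```

    As the article notes, this only pays off for CPU-bound steps; distributing I/O-bound work this way buys little, which is why the attention goes to parallelizing compilation rather than disk access.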

    Reply
  23. Tomi Engdahl says:

    The Flash Storage Revolution Is Here
    http://www.wired.com/2015/08/flash-storage/

    You’ve likely heard about Samsung’s 16TB hard drive, by far the world’s largest. That is an eye-popping number, a large enough leap forward that it’s difficult to fully process. And the most exciting thing about that 16TB hard drive? It’s just a hint of what’s coming next.

    The pace of flash storage development has been slow and steady for decades. Finally, though, we’re starting to see breakthroughs of the last few years result in actual products, starting with one mammoth SSD.

    The Samsung drive, called PM1633a, was first reported by Golem.de and announced at last week’s Flash Memory Summit in California. While its size is impressive, it’s all the more astonishing for being a solid state drive—comprising flash memory chips—as opposed to more conventional (and affordable) hard drives that rely on magnetically coated spinning discs.

    While SSDs have been faster and more rugged than their HDD counterparts, they have until recently been far more limited in capacity. To this point, the largest 2.5-inch (the size of Samsung’s latest) SSD you could buy was 4TB, at a cost of around $6,000. Even high-capacity spinning disc drives top out at around 10TB. While the PM1633a probably hasn’t remedied the cost situation, a four-fold leap in size is incredible.

    What it’s not, though, is unexpected. In fact, Samsung laid the groundwork for this very device years ago.

    Samsung’s solution, called V-NAND, has seen remarkable gains since its introduction. In the first year, the company stacked 24 layers on a single die, while in 2014 it managed 36. The 16TB SSD kicks that up to 48.

    Even more exciting is that when that level of tech does trickle down to consumers, it won’t necessarily even come from Samsung; V-NAND isn’t the only vertical NAND technology out there. Intel and Micron recently announced that they’re working on something quite similar, though they don’t expect to produce consumer devices based on the technology until early next year. Toshiba has dabbled in 3D NAND, with products expected by the end of next year. All of them have the systems in place to produce equally, if not more, impressive drives.

    The implications of storage breakthroughs like this go beyond data centers and laptops, though. “Memory and storage are the two things that are holding up huge innovations in biotech, in design, and for that matter even artificial intelligence,” Moorhead says. “They’ve become a fundamental building block for moving the industry forward. These big innovations at the top trickle their way down into cars, into phones, over a five to seven year period.”

    The innovations he refers to include the manufacturing smarts flexed by Samsung, Intel and Micron, and Toshiba.

    Reply
  24. Tomi Engdahl says:

    Gordon Mah Ung / PCWorld:
    Intel Skylake CPU details revealed: These faster, more power-efficient chips can even drive three 4K monitors
    http://www.pcworld.com/article/2972823/components-processors/intel-skylake-laptop-cpus-should-use-less-power-run-faster-and-can-even-drive-three-4k-monitors.html

    Skylake-based laptops should offer a nice boost over today’s Broadwell laptops, and here’s why.

    After weeks of teasing us, Intel finally lifted the kilt to show us details of its 6th generation CPU. Besides being faster and using less power than its predecessors, Skylake chips can drive multiple 4K displays, feature new instructions to speed up security operations, and even offer hardened memory defenses.

    But those expecting the full monty on how much the chips will cost, how many cores they’ll have, and when you can buy a laptop with them will continue to be disappointed.

    “Today is not the launch of Skylake,” Julius Mandelblat, an Intel senior principal engineer, said to a frustrated room of developers, hardware vendors and engineers who peppered him with specific product questions after his presentation about Skylake. That news, Intel officials said, won’t come for another “couple of weeks.”

    Reply
  25. Tomi Engdahl says:

    Intel CEO Sees A Bright Future For IoT, Developers
    http://www.eetimes.com/document.asp?doc_id=1327458&

    At the Intel Developer Forum(IDF) in San Francisco, Calif., on Tuesday, CEO Brian Krzanich said there’s never been a better time to be a developer.

    “I’ve never seen such diversity of opportunity for developers,” said Krzanich during his keynote at the event.

    That’s not exactly a novel sentiment. In 2013, the Outcast Agency held a media event based on the theme “The developer is king.” The event involved representatives from companies that depend on developers, such as GitHub, Google, Mixpanel, New Relic, and Stripe, talking about why it’s a great time to be a developer. That was also the message coming out of Twilio’s Signal developer conference in May. The web, mobile platforms, and the cloud have all expanded the need for developers and the scope of their work.

    Intel CEO Sees A Bright Future For IoT, Developers
    http://www.informationweek.com/mobile/mobile-devices/intel-ceo-sees-a-bright-future-for-iot-developers/d/d-id/1321809

    Tiny computers, real-time depth sensing, and breakthrough memory technology are among the innovations featured at this year’s Intel Developer Forum. CEO Brian Krzanich detailed how connected devices will change the way we all do business.

    At the Intel Developer Forum (IDF) in San Francisco, Calif., on Tuesday, CEO Brian Krzanich said there’s never been a better time to be a developer.

    Reply
  26. Tomi Engdahl says:

    Firefox preps processor revamp under Project Electrolysis
    Sleepy project hits nightly builds
    http://www.theregister.co.uk/2015/06/18/firefox_electrolysis/

    Mozilla looks ready to revamp its Firefox web browser so tabs and user interfaces can run in separate processes.

    The feature has appeared in a nightly testing version of the browser and has been in lengthy development under Project Electrolysis.

    Developer Dan Mircea says the feature is activated by default in nightly builds and will be part of a transition process so that affected developers whose add-ons rely on accessing web content directly can adapt with minimal shock.

    “In current versions of desktop Firefox, the entire browser runs in a single operating system process. In particular, the JavaScript that runs the browser UI (chrome code) runs in the same process as the code in web pages,” Mircea says.

    “Future versions of Firefox will run the browser UI in a separate process from web content.”
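    The chrome/content split described above is easy to picture with an ordinary OS-process analogy. The sketch below is plain Python multiprocessing, not Mozilla code: the “content” process can only exchange messages with the “chrome” process over an explicit channel, which is the isolation property Electrolysis is after.

```python
from multiprocessing import Process, Pipe

def content_process(conn):
    # Stand-in for a web-content process: it never touches the
    # "chrome" process's memory, only the message channel.
    while True:
        msg = conn.recv()
        if msg == "quit":
            break
        conn.send("rendered:" + msg)

def run_browser():
    # Stand-in for the browser-UI ("chrome") process.
    chrome_end, content_end = Pipe()
    p = Process(target=content_process, args=(content_end,))
    p.start()
    chrome_end.send("example.com")
    result = chrome_end.recv()  # UI talks to content via messages only
    chrome_end.send("quit")
    p.join()
    return result
```

A crash or hang in the content process then cannot corrupt the UI process directly, which is the main robustness argument for the split.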

    Reply
  27. Tomi Engdahl says:

    Google releases version 1.5 of its Go programming language, finally ditches last remnants of C
    http://venturebeat.com/2015/08/19/google-releases-version-1-5-of-its-go-programming-language-finally-ditches-last-remnants-of-c/

    Google today released Go 1.5, the sixth major stable release of its Go programming language. You can download Go 1.5 right now from the Go downloads page.

    This is not a major release, as denoted by the version number and the fact that the only language change is the lifting of a restriction in the map literal syntax to make them more consistent with slice literals. That said, Go 1.5 does include a significant rewrite: The compiler tool chain has been translated from C to Go. This means “the last vestiges of C code” have been finally removed from the Go code base.

    https://golang.org/dl/

    Reply
  28. Tomi Engdahl says:

    Intel’s Compute Sticks stick it to Windows To Go, Chromecast
    Tiny PCs are sysadmin and thin client friendly, to a point
    http://www.theregister.co.uk/2015/08/20/intels_compute_sticks_stick_it_to_windows_to_go_chromecast/

    Review “Little Stick. Big Surprise.” That’s what Intel says about its Compute Sticks, the new ‘smaller than an iPhone’ mini-PCs designed for portability and ease of use.

    The Intel Compute Stick (ICS) is perhaps best thought of as the mutant offspring of a Raspberry Pi on steroids and Google Chromecast. The offspring emerges as a tiny computer – CPU, RAM and storage on a small motherboard – contained within a reasonably well finished case. Protruding from the case is an HDMI male adapter ready to plug into any display boasting its female counterpart.

    The ICS is a full working PC: Windows 8.1 with Bing, a quad-core Atom Z3735F processor running at up to 1.83 GHz, 2 GB memory, 32 GB of on-board storage, b/g/n WiFi, Bluetooth and a microSD card slot. An Ubuntu version shaves the RAM to 1GB and storage to 8GB. The RAM’s soldered on, so forget upgrades and ponder buying the Windows version and installing Ubuntu rather than making do with a wimpily-specced machine. Not many people complain about having ‘too much storage’.

    Reply
  29. Tomi Engdahl says:

    Linux Foundation CII announces best practice badge programme
    Like a Blue Peter badge, but one you helped make earlier
    http://www.theinquirer.net/inquirer/news/2422659/linux-foundation-cii-announces-best-practice-badge-programme

    THE CORE INFRASTRUCTURE INITIATIVE (CII) has announced plans to involve the community in securing open source software.

    The CII, which is run by the Linux Foundation, was formed as a joint venture by the technology sector after the Heartbleed palaver last year, primarily as a curator of the OpenSSL project, but at a wider level to look at overall software security standards.

    Today, it has announced a badge programme, inviting interested parties to contribute on the criteria used to determine the security, quality and stability of open source software.

    The first draft of the criteria is already up on GitHub spearheaded by David Wheeler, coordinator of the CII Census Project.

    The badge programme is described as a “secure open source development maturity model” and will allow developers to certify software with a best practices badge against common standards, such as attention to quality, security and knot-tying. No wait, that’s the Boy Scouts.

    The announcement explains: “Virtually every industry and business leverages open source, and is therefore more connected and dependent on it than ever before. Despite its prevalence, trying to quickly determine the best maintained and most secure open source to use is a complex problem for seasoned CIOs and nimble developers.

    “The self-assessment, and the badges that will follow, are designed to be a simple, fairly basic way for projects to showcase their commitment to security and quality. The criteria is also meant to encourage open source software projects to take positive steps with both in mind and to help users know which projects are taking these positive steps.”

    Reply
  30. Tomi Engdahl says:

    Linux boss Torvalds: Don’t talk to me about containers and other buzzwords
    Just wants to get through next 6 months
    http://www.theregister.co.uk/2015/08/19/linuxcon_torvalds_qa/

    LinuxCon 2015 Linux kernel maintainer Linus Torvalds isn’t thinking about where his creation will be ten years from now – in fact, he claims he doesn’t even think ahead one year.

    “I am a very plodding, pedestrian kind of person,” Torvalds said during a Q&A session with Linux Foundation boss Jim Zemlin at LinuxCon in Seattle on Wednesday. “I look six months ahead. I look ahead at this release and I know what’s coming up in the next one.

    “I don’t think planning 10 years ahead is necessarily very sane. Because if you think about Linux ten years back and where Linux was ten years ago, trying to plan for where we are now would have been completely insane.”

    What’s more, while the popularity of Linux continues unabated, Torvalds has little time for some of the other hot-button technologies that have risen out of the open source community of late.

    “I’m sorry to everybody involved here in containers,” he said. “I’m so happy that the kernel tends to be fairly far removed from all of these issues, all of the buzzwords and all the new technologies. We end up being in a situation where we’re such an infrastructure play that we only care about us working and then how people use the kernel.”

    Similarly, while the industry is abuzz about the so-called Internet of Things, the Finnish coder takes a somewhat pessimistic view about Linux’s ultimate role in it.

    The chief hurdle for running Linux on really tiny devices, he said, is that the kernel itself has grown considerably since its early days. Where once it fit in under a megabyte, these days it’s tens of megabytes in size, and that’s not likely to change much in the future.

    “It’s always really hard to get rid of unnecessary fat. As every developer in this room knows, things tend to grow,”

    And when it comes to security, the Linux main man doesn’t think it’s likely that systems running Linux will ever be completely hardened and bulletproof – at least, not to the satisfaction of the security community, which Torvalds admitted to occasionally being at odds with.

    “Security is bugs,” he said. “Most of the security issues we’ve had in the kernel – and happily they haven’t been that big, or some of them are pretty big but they don’t happen that often – most of them have been just completely stupid bugs that nobody really would have thought of as security issues normally, except for the fact that some clever person comes around and takes advantage of it. And the thing is, you’re never going to get rid of bugs.”

    Reply
  31. Tomi Engdahl says:

    OLPC heir reveals modular laptop design
    ‘Infinity’ allows kids to swap out components, even the screen
    http://www.theregister.co.uk/2015/08/21/modular_offbrand_olpc_heir_goes_on_sale/

    One Education, an Australian offshoot of the One Laptop Per Child (OLPC) project, has opened preview sales for an heir to the XO computer.

    The “Infinity” has a modular design so while the quad-core, 1.5GHz CPU is baked in, it’s possible to replace the battery, camera and just about anything else thanks to the presence of three ports. The screen’s replaceable too, so that one can add a touch panel or upgrade resolution as future modules emerge.

    The first models will ship with Android Lollipop, with Linux and Windows versions promised real soon now. USB-C provides connectivity and power. WiFi and Bluetooth take care of comms.

    Reply
  32. Tomi Engdahl says:

    PEAK TECHNOLOGY? Facebook, Amazon, Netflix, Google, Apple stocks hit the deck
    Billions wiped off the charts – and Twitter falls below its stock-market debut
    http://www.theregister.co.uk/2015/08/21/twitter_and_other_stocks_down/

    Thursday was a rough day on Wall Street for many of the biggest names in the tech industry, as stocks dipped across the board.

    The Dow Jones Industrial Average fell by 2.06 per cent, and the Nasdaq 100 Index was down 2.8 per cent, on the day Facebook, Amazon, Netflix, Google, and Apple all saw their share prices dip – wiping $49bn off the five giants’ total market value. According to Bloomberg data, it was their worst day since January 2013.

    Reply
  33. Tomi Engdahl says:

    Storage Industry Group Tackles SSD Data Recovery
    http://www.eetimes.com/document.asp?doc_id=1327476&

    Solid state storage presents problems when the need arises to recover data, so a new special interest group has been formed to address the challenges.

    Last week at the Flash Memory Summit [LINK], the Storage Networking Industry Association (SNIA) and its Solid State Storage Initiative (SSSI) announced the creation of a new Special Interest Group around Data Recovery/Erase to foster development of tools and standards.

    Data recovery firms have had conversations with makers of solid-state drives (SSDs) for a number of years around the processes and tools for resurrecting data, said Gillware Data Recovery’s Scott Holewinski, chair of the DR/E SIG. “The issue we’ve been having is there are no standards around it.”

    In a telephone interview with EE Times, he said an annual session at the summit wasn’t enough to keep the momentum going; the SIG will formally bring everyone together to improve recovery of data from solid state storage. The decision to include erase with data recovery made sense because the same capabilities are required for both disciplines.

    Reply
  34. Tomi Engdahl says:

    Intel Memory Bus Draws Fire
    SSDs next year, servers in 2017
    http://www.eetimes.com/document.asp?doc_id=1327465&

    SAN FRANCISCO – Intel should open up the proprietary interface for the 3D XPoint memory chips it co-developed with Micron, said several sources. The reactions came as Intel announced at the Intel Developer Forum here new details about the chips it will ship next year in solid-state drives and system memory modules.

    The so-called Intel Optane technology will include “optimized controllers and interfaces to the platform and software IP to make complete product,” said Rob Crooke, general manager of Intel’s non-volatile memory group. He spoke in a keynote where Intel chief executive Brian Krzanich called 3D XPoint “the biggest breakthrough in memory and storage in 30 years.”

    Crooke showed working SSDs with the chips delivering five to seven times the I/O operations/second of Intel’s fastest flash SSDs. The drives will come in versions for everything from “data centers to ultrabooks” as well as dual in-line memory modules (DIMMs) for Xeon servers. They will plug a hole in today’s memory products by delivering lower latency and longer endurance than flash with higher capacity than DRAM, he said.

    Intel has no plans to sell 3D XPoint chips. A Micron executive said the company is about two months away from announcing its separate plans for selling 3D XPoint products.

    Reply
  35. Tomi Engdahl says:

    Autonomous Cars In, Big Data Out In Gartner Hype Cycle
    http://www.eetimes.com/author.asp?section_id=36&doc_id=1327475&

    Gartner’s annual Hype Cycle is out, and IoT and autonomous cars are in this year. Big data, however, is losing some of its luster.

    Just two years ago, big data was at the peak of the hype cycle. It was replaced last year by the Internet of Things, a ranking that IoT still holds in this report. Indeed, big data is nowhere to be found on the current Gartner Hype Cycle.

    Additionally, the absence of any specific cybersecurity technologies from the Hype Cycle is puzzling, although “digital security” and “software-defined security” are mentioned as pre-peak areas.

    Reply
  36. Tomi Engdahl says:

    Businesses Programming
    Do Old Programmers Need To Keep Leaping Through New Hoops?
    http://developers.slashdot.org/story/15/08/20/2242239/do-old-programmers-need-to-keep-leaping-through-new-hoops?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+Slashdot%2Fslashdot%2Fto+%28%28Title%29Slashdot+%28rdf%29%29

    In recent years, it seems as if tech has evolved into an industry that lionizes the young. Despite all the press about 21-year-old rock-star developers and 30-year-old CEOs, though, is there still a significant market for older programmers and developers, especially those with specialized knowledge? The answer is “yes,” of course, and sites like Dice suggest that older tech pros should take steps such as setting up social media accounts and spending a lot of time on GitHub if they want to attract interest from companies and recruiters.

    Job-Seeking Tips for Older Tech Pros
    http://insights.dice.com/2015/08/20/job-seeking-tips-older-tech-pros/?CMPID=AF_SD_UP_JS_AV_OG_DNA_

    Learn New Stuff

    The biggest threat to tech pros’ continued employment is letting their skills atrophy. You may already know the ins and outs of older-but-popular programming languages such as C++ and Java, but it doesn’t hurt to see how other programmers are pushing the boundaries of those platforms (websites such as Hacker News are a huge help with that), as well as explore the possibilities of new languages and technologies. If you’re skilled in Objective-C, for example, you can increase your job-earning potential by learning Swift, which will likely become the primary way of building iOS apps over the next several years.

    With hardware, the need to learn new stuff is even more important, given the segment’s rapid evolution. What you know about servers or PCs will be ancient history sooner than you think; and with more businesses embracing the cloud over on-premises data centers, adapting to new methods of building and administering is more important than ever.

    Where do all the old programmers go?
    http://www.infoworld.com/article/2617093/it-careers/it-careers-where-do-all-the-old-programmers-go.html

    In search of the fabled elephants’ graveyard of software developers over 40

    We all know that software development is a young man’s game. While hotshot young coders get fat raises and promotions to management, older programmers have an ever more difficult time finding work. Right?

    In a recent editorial, Norman Matloff, a professor of computer science at the University of California, Davis, describes software engineering as a career dead end. “Many programmers find that their employability starts to decline at about age 35,” Matloff writes.

    If this were radio, here’s where I’d cue the sound of the needle skipping off the record. Age 35? I thought we were talking about older programmers. Since when is 35 old?

    “Statistics show that most software developers are out of the field by age 40,” Matloff continues, and here my eyebrows really start to rise. Most programmers? As in the majority of them? Gone? (Matloff declines to mention which statistics he’s reading.)

    If that’s true, where do they go?

    Is the sky really falling?
    Now, I’m not going to do an about-face and claim that age discrimination doesn’t exist in software development. It probably is more prevalent in tech fields than in other industries.

    First, the anecdotal evidence: I know quite a lot of people, but I’m at an age when just about everyone in my social circle has reached or is fast approaching 40. That includes a number of software developers. What does it say about me, I wonder, that every single one of my programmer friends also happens to be a statistical outlier?

    Hunkering down
    For starters, some of them don’t go. They become highly specialized in a certain area, industry, tool, or company, and they carve out a lucrative niche sticking to what they do best. These are the coders who go on to become “distinguished engineers” at larger tech businesses. They’re also the true statistical outliers in Matloff’s data, so let’s forget about them.

    Other programmers are inevitably promoted to management.

    Forging new paths, under the radar
    Other developers don’t leave the field, but they do quit their jobs. They go on to found startups, where their titles might be principal or CTO. Entrepreneurs have a way of slipping through the cracks of employment surveys — again, throwing off the statistics.

    Employment surveys have a way of missing independent contractors, too. Yet consulting can be particularly lucrative for software developers, and it tends to favor mature programmers with extensive industry experience.

    Reply
  37. Tomi Engdahl says:

    All aboard the Skylake: How Intel stopped worrying and learned to love overclocking
    6th gen CPU and swanky Asus mobo on test
    http://www.theregister.co.uk/2015/08/21/all_aboard_the_skylake/

    Nope, it’s not the next instalment of the James Bond franchise, but Intel’s eagerly awaited successor to the Broadwell platform. Skylake, like Broadwell, is built on a 14nm process, but this time we get yet another new socket, as the new CPUs use the LGA1151 socket.

    At launch, Intel is introducing two new enthusiast CPUs, as well as a new performance chipset, the Z170, which sees DDR4 memory support brought to the mainstream market for the first time.

    You might be wondering why Intel is launching a new platform with the higher-end CPUs rather than mainstream, and the answer to that is PC gaming. According to Intel, PC gaming is once again driving the desktop platform. With 1.2 billion PC gamers worldwide, Intel is committed to supporting both gaming and overclocking.

    The first of the sixth-generation processors being launched to support the new platform are two Skylake K desktop enthusiast chips: the Core i7-6700K and the Core i5-6600K. The remainder of the family, including mobile variants, are scheduled to be launched just after this year’s IDF (Intel Developer Forum) or later.

    The flagship Core i7-6700K has four cores and eight threads (via Hyper-Threading), a base frequency of 4GHz on all four cores, a maximum Turbo frequency of up to 4.2GHz, and 8MB of cache.

    So what about overclocking? With recent processor releases, Intel has slowly loosened its grip on locking down most parts of the CPU to stop any serious fiddling.

    Yet with Skylake K, the company has fully given itself over to the dark side with all the PCIe, DMI and BCLK straps being thrown out the window. Instead, there is an independent BCLK with a full range of 1MHz increments. The PEG/DMI has its own isolated 100MHz clock, and the set ratios for multipliers have also been given the boot.

    Essential to a stable overclock is striking the right balance between power delivery and heat. For the sixth-generation processors, Intel has taken a step back and returned the voltage regulator from being internally mounted in the CPU to being external and on the motherboard, thereby getting rid of some of the internal heat generation in one fell swoop.

    To support the new processors at launch, there is the flagship Sunrise Point 100 series chipset – the Z170. The Z170 is the successor to the Z97 and, as mentioned earlier, it brings 40 per cent more high-speed I/O lanes than the previous chipset, with up to 20 PCIe 3.0 lanes and up to 10 USB 3.0 ports being supported.
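    The arithmetic behind those BCLK changes is simple: core frequency is base clock times multiplier, so unlocking 1MHz BCLK steps gives far finer-grained frequency targets than multiplier bumps alone. A trivial sketch (illustrative numbers, not Intel tooling):

```python
def core_clock_mhz(bclk_mhz, multiplier):
    """Effective core frequency = base clock x multiplier."""
    return bclk_mhz * multiplier

# Stock Core i7-6700K: 100 MHz x 40 = 4000 MHz.
# With 1 MHz BCLK steps, in-between targets such as 102 x 42 = 4284 MHz
# become reachable instead of jumping a whole 100 MHz per multiplier step.
```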

    Reply
  38. Tomi Engdahl says:

    Yellow and blue circles, red arrows added to Gartner’s Magic Quadrant
    This to there, that to there … try and keep up
    http://www.theregister.co.uk/2015/08/18/yellow_arrows_blue_circles_gartner_magic_quadrant/

    Here’s something to delve into: a look at the changes in Gartner’s integrated systems Magic Quadrant between 2014 and 2015.

    What do we see? There’s been a lot of movement and five new entries. Two previous entries have disappeared as well: Bull and Unisys.

    The newcomers are all in the niche players box: Scale Computing, Pivot3, GridStore, SGI and Nimboxx. Their arrival shows a lot of startup energy, and money, is being invested in hyperconverged systems.

    The biggest mover is IBM. With the sale of its server business to Lenovo, its old entry has been divided into two.

    Reply
  39. Tomi Engdahl says:

    DirectX 12 tested: An early win for AMD and disappointment for Nvidia
    First DX12 gaming benchmark shows R9 290X going toe-to-toe with a GTX 980 Ti.
    http://arstechnica.com/gaming/2015/08/directx-12-tested-an-early-win-for-amd-and-disappointment-for-nvidia/

    Windows 10 brings a slew of features to the table—the return of the Start menu, Cortana, the Xbox App—but the most interesting for gamers is obvious: DirectX 12 (DX12). The promise of a graphics API that allows console-like low-level access to the GPU and CPU, as well as improved performance for existing graphics cards, is tremendously exciting. Yet for all the Windows 10 information to trickle out in the three weeks since the OS launched, DX12 has remained the platform’s most mysterious aspect. There’s literally been no way to test these touted features and see just what kind of performance uplift (if any) there is. Until now, that is.

    Reply
  40. Tomi Engdahl says:

    Kev Needham / Mozilla Add-ons Blog:
    Mozilla to introduce new API for Firefox to make it easier to port extensions from other browsers, to deprecate certain classes of add-ons 12-18 months from now

    The Future of Developing Firefox Add-ons
    https://blog.mozilla.org/addons/2015/08/21/the-future-of-developing-firefox-add-ons/

    Introducing the WebExtensions API

    For some time we’ve heard from add-on developers that our APIs could be better documented and easier to use. In addition, we’ve noticed that many Firefox add-on developers also maintain a Chrome, Safari, or Opera extension with similar functionality. We would like add-on development to be more like Web development: the same code should run in multiple browsers according to behavior set by standards, with comprehensive documentation available from multiple vendors.

    To this end, we are implementing a new, Blink-compatible API in Firefox called WebExtensions. Extension code written for Chrome, Opera, or, possibly in the future, Microsoft Edge will run in Firefox with few changes as a WebExtension. This modern and JavaScript-centric API has a number of advantages, including supporting multi-process browsers by default and mitigating the risk of misbehaving add-ons and malware.

    WebExtensions will behave like other Firefox add-ons; they will be signed by Mozilla, and discoverable through addons.mozilla.org (AMO) or through the developer’s website. With this API, extension developers should be able to make the same extension available on Firefox and Chrome with a minimal number of changes to repackage for each platform.
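    For a sense of what “Blink-compatible” means in practice: Chrome-style extensions are described by a manifest.json, and a WebExtension is meant to use the same shape so the same package can be repackaged for each browser. A minimal, hypothetical skeleton (field values are illustrative; consult Mozilla’s WebExtensions documentation for what is actually supported):

```json
{
  "manifest_version": 2,
  "name": "hello-webextension",
  "version": "1.0",
  "description": "Minimal cross-browser extension skeleton",
  "background": {
    "scripts": ["background.js"]
  },
  "browser_action": {
    "default_title": "Hello"
  }
}
```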

    Reply
  41. Tomi Engdahl says:

    Natasha Singer / New York Times:
    IDC: US desktop, laptop, tablet and 2-in-1 computer sales in education grew 33% YOY in 2014, with Chromebook sales up 310% to 3.9M, iPad sales down 6.9% to 2.7M

    Chromebooks Gaining on iPads in School Sector
    http://bits.blogs.nytimes.com/2015/08/19/chromebooks-gaining-on-ipads-in-school-sector/

    As school districts across the country increasingly invest in technology for their students, the volume of personal computers in the classroom is surging.

    And it is the Chromebook – a notebook computer that runs on Google’s Chrome operating system and an upstart in a sector dominated by Apple and Microsoft – that is largely responsible for the growth trend in schools, according to a new report from IDC, a market research firm.

    Last year, the market for desktops, laptops, tablets and two-in-one computers shipped to kindergarten-through-12th-grade schools and institutions of higher education in the United States amounted to $7 billion, according to estimates from IDC.

    In all, the company said, about 13.2 million systems were shipped in 2014 – about 33 percent more than the year before.

    With its line-up of iPads, MacBooks and other higher-priced products, Apple reaped the greatest revenue in the sector last year, accounting for nearly half of the total dollars spent on personal computers in education, the report said.

    In terms of the sheer numbers of devices sold, however, Microsoft remained in the lead. In 2014, about 4.9 million Windows devices, including notebooks and desktops, shipped to schools, giving Microsoft a roughly 38 percent market share in unit sales, IDC said.

    Last year, about 3.9 million Chromebooks were shipped in the education sector, an increase in unit sales of more than 310 percent compared with the previous year, IDC said. By contrast, iPad unit sales for education fell last year to 2.7 million devices, compared to 2.9 million in 2013, according to IDC data.
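    The IDC percentages are easy to sanity-check: year-over-year growth is just (current − previous) / previous. A quick sketch (the ~9.9M and ~0.95M prior-year figures below are back-calculated from the article’s numbers, not taken from IDC directly):

```python
def yoy_growth_pct(current, previous):
    """Year-over-year growth in percent."""
    return (current - previous) / previous * 100.0

# 13.2M total units vs ~9.9M the year before -> ~33% growth.
# 3.9M Chromebooks vs ~0.95M the year before -> ~310% growth.
```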

    “Even if Microsoft is No. 1 in volume and Apple is No. 1 in revenue, from the growth perspective, nobody can beat Chromebook,”

    Reply
  43. Tomi Engdahl says:

    Samsung May Release an 18″ Tablet
    http://hardware.slashdot.org/story/15/08/23/2247252/samsung-may-release-an-18-tablet?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+Slashdot%2Fslashdot%2Fto+%28%28Title%29Slashdot+%28rdf%29%29

    A report at PC Magazine says that Samsung may soon field a tablet to satisfy people not content with the 7″, 9″, 12″, or even slightly larger tablets that are today’s normal stock in trade. Instead, the company is reported to be working on an 18.4″ tablet aimed at “living rooms, offices, and schools.”

    Report: Samsung Prepping 18.4-Inch Tablet
    http://www.pcmag.com/article2/0,2817,2489877,00.asp

    Reply
  44. Tomi Engdahl says:

    Does Linux need a new file system? Ex-Google engineer thinks so
    Bcache author says it’s nearly ready for prime time
    http://www.theregister.co.uk/2015/08/24/does_linux_need_a_new_file_system_exgoogle_engineer_thinks_so/

    Former Googler Kent Overstreet has announced that a long-term project to craft a new Linux file system is at a point where he’d like other developers to pitch in.

    He wants to “match ext4 and xfs on performance and reliability, but with the features of btrfs/zfs”.

    He says “the bcache codebase has been evolving/metastasizing into a full blown, general purpose posix filesystem – a modern COW [copy-on-write – El Reg] filesystem with checksumming, compression, multiple devices, caching, and eventually snapshots and all kinds of other nifty features”.

    http://evilpiepirate.org/git/linux-bcache.git
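    The copy-on-write idea at the heart of a filesystem like this is compact enough to sketch: updates never overwrite data in place, so a snapshot is just a pinned reference to an old version. A toy in-memory illustration (not bcache code):

```python
class CowStore:
    """Toy copy-on-write key/value store. Writes build a new version
    instead of mutating the old one, so snapshots are nearly free."""

    def __init__(self):
        self.current = {}
        self.snapshots = []

    def put(self, key, value):
        # Copy-on-write: derive a new mapping; the old one is untouched.
        self.current = {**self.current, key: value}

    def snapshot(self):
        # Pin the current version and hand back its id.
        self.snapshots.append(self.current)
        return len(self.snapshots) - 1

    def read(self, key, snap=None):
        version = self.current if snap is None else self.snapshots[snap]
        return version.get(key)
```

Real COW filesystems add checksumming, compression and on-disk trees on top of this, but the invariant is the same: old data stays valid until nothing references it.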

    Reply
  45. Tomi Engdahl says:

    Here are the top 10 programming languages used on GitHub
    http://venturebeat.com/2015/08/19/here-are-the-top-10-programming-languages-used-on-github/

    GitHub today shared a closer look at how the popularity of programming languages used on its code collaboration website has changed over the years. In short, the graph above shows the change in rank for programming languages since GitHub launched in 2008 all the way to what the site’s 10 million users are using for coding today.

    GitHub is a repository hosting service that builds on the distributed revision control and source code management functionality of Git, which is strictly a command-line tool. GitHub provides a web-based graphical interface, as well as desktop and mobile integration.

    Trend lines aside, here are the top 10 programming languages on GitHub today:
    1. JavaScript
    2. Java
    3. Ruby
    4. PHP
    5. Python
    6. CSS
    7. C++
    8. C#
    9. C
    10. HTML
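    GitHub derives these rankings from per-repository language statistics; its REST API exposes the raw byte counts per language for a repository (GET /repos/{owner}/{repo}/languages). Given such a response, ranking is one line (a sketch, not GitHub’s actual aggregation code):

```python
def rank_languages(bytes_by_language):
    """Order languages by the bytes of code attributed to each,
    as returned by GitHub's repository-languages endpoint."""
    return sorted(bytes_by_language, key=bytes_by_language.get, reverse=True)

# rank_languages({"JavaScript": 500, "Ruby": 120, "CSS": 300})
# -> ["JavaScript", "CSS", "Ruby"]
```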

    Reply
  46. Tomi Engdahl says:

    AMD’s Market Share Dips Below 20 Percent
    http://www.ign.com/articles/2015/08/22/amds-market-share-dips-below-20-percent

    When it comes to share of the consumer discrete graphics card market, AMD is being overpowered by rival Nvidia.

    According to a chart from an internal Nvidia report on market share for the two companies, Nvidia controls a whopping 82% of the graphics card market, with AMD continuing to lose ground. AMD cards now hold an 18% slice of the graphics card pie.

    All is not lost for AMD, however. While its share of the desktop market has been declining for some time, Mercury Research issued a release showing AMD made gains in the mobile discrete graphics segment.

    Reply
  47. Tomi Engdahl says:

    Open sourcerer Mirantis taps up Intel for $100m worth of funding
    Chipzilla backs OpenStack vendor as part of collaboration lunge
    http://www.theregister.co.uk/2015/08/24/mirantis_taps_up_intel_for_100m/

    Chipzilla Intel has led a funding injection of $100m (£64m) into open source OpenStack vendor Mirantis, part of its move into the world of OpenStack.

    The deal is apparently part of Intel’s recent announcement intended to create “tens of thousands of new clouds”.

    Mirantis has so far raised $220m, having previously drummed up $100m of funding in 2014.

    Last month, the chipmaker announced its intention to create “scores” of collaborations and investments in software-defined infrastructure. It said it would collaborate with Rackspace on enterprise features and an OpenStack Innovation Centre.

    Diane Bryant, general manager of Intel’s Data Center Group, said the biz intends to bring “open cloud infrastructure to the entire industry”.

    She said: “As enterprises embrace public, private and hybrid cloud strategies, they need choices in their infrastructure software.

    “OpenStack is an ideal open solution for cloud-native applications and services, and our collaboration with Mirantis is well placed to ensure the delivery of critical new enterprise features helping to create tens of thousands of clouds.”

    Reply
  48. Tomi Engdahl says:

    ASUS Announces A 144Hz WQHD Gaming Monitor With FreeSync
    by Brandon Chester on August 19, 2015 12:59 PM EST
    http://www.anandtech.com/show/9545/asus-announces-mg278q-wqhd-gaming-monitor-with-freesync

    Being a gaming-oriented display, the MG278Q’s focus is on a low response time and a high refresh rate rather than color accuracy. Since it’s a TN panel it’s likely that the panel has a native 6bit color depth per subpixel and uses temporal dithering to emulate 16.7 million colors, although this has not been confirmed. In addition to the 144Hz refresh rate, the MG278Q supports AMD’s FreeSync technology which utilizes the Adaptive Sync feature of DisplayPort 1.2a to enable a variable refresh rate synchronized to the GPU’s rendering of frames.

    The AMD FreeSync Review
    by Jarred Walton on March 19, 2015 12:00 PM EST
    http://www.anandtech.com/show/9097/the-amd-freesync-review

    The first time anyone talked about adaptive refresh rates for monitors – specifically applying the technique to gaming – was when NVIDIA demoed G-SYNC back in October 2013. The idea seemed so logical that I had to wonder why no one had tried to do it before.

    The impetus behind adaptive refresh is to overcome visual artifacts and stutter caused by the normal way of updating the screen. Briefly, the display is updated with new content from the graphics card at set intervals, typically 60 times per second. While that’s fine for normal applications, when it comes to games there are often cases where a new frame isn’t ready in time, causing a stall or stutter in rendering. Alternatively, the screen can be updated as soon as a new frame is ready, but that often results in tearing – where one part of the screen has the previous frame on top and the bottom part has the next frame (or frames in some cases).
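    The fixed-interval behaviour described above is easy to sketch in code: with a 60 Hz panel, a frame that misses its refresh slot has to wait for the next one, doubling the gap between displayed frames, while adaptive sync scans the frame out as soon as it is ready. The render times below are hypothetical numbers chosen to show one missed slot:

    ```python
    # Illustrative sketch (not vendor code): when frames reach the screen
    # with a fixed 60 Hz refresh vs. an adaptive refresh.

    REFRESH = 1000 / 60          # ~16.7 ms between fixed refresh slots

    def fixed_display_times(render_times_ms):
        """Each frame waits for the next fixed refresh slot after it is ready."""
        shown, t = [], 0.0
        for r in render_times_ms:
            t += r                        # frame finishes rendering at time t
            slots = -(-t // REFRESH)      # ceiling: next refresh boundary
            shown.append(slots * REFRESH)
        return shown

    def adaptive_display_times(render_times_ms):
        """Adaptive sync: the frame is scanned out as soon as it is ready."""
        shown, t = [], 0.0
        for r in render_times_ms:
            t += r
            shown.append(t)
        return shown

    renders = [16.0, 20.0, 16.0]   # the 20 ms frame misses its 60 Hz slot
    ```

    With these numbers the fixed-refresh gap between the first two displayed frames is two full refresh periods (~33.3 ms) – the visible stutter – while adaptive sync keeps the gap at the actual 20 ms render time.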

    Neither input lag/stutter nor image tearing are desirable, so NVIDIA set about creating a solution: G-SYNC. Perhaps the most difficult aspect for NVIDIA wasn’t creating the core technology but rather getting display partners to create and sell what would ultimately be a niche product – G-SYNC requires an NVIDIA GPU, so that rules out a large chunk of the market. Not surprisingly, the result was that G-SYNC took a bit of time to reach the market as a mature solution, with the first displays that supported the feature requiring modification by the end user.

    Over the past year we’ve seen more G-SYNC displays ship that no longer require user modification, which is great, but pricing of the displays so far has been quite high.

    When AMD demonstrated their alternative adaptive refresh rate technology and cleverly called it FreeSync, it was a clear jab at the added cost of G-SYNC displays. As with G-SYNC, it has taken some time from the initial announcement to actual shipping hardware, but AMD has worked with the VESA group to implement FreeSync as an open standard that’s now part of DisplayPort 1.2a, and they aren’t getting any royalties from the technology. That’s the “Free” part of FreeSync, and while it doesn’t necessarily guarantee that FreeSync enabled displays will cost the same as non-FreeSync displays, the initial pricing looks quite promising.

    The major scaler companies – Realtek, Novatek, and MStar – have all built FreeSync (DisplayPort Adaptive Sync) into their latest products, and since most displays require a scaler anyway there’s no significant price increase.

    FreeSync vs. G-SYNC Performance

    One item that piqued our interest during AMD’s presentation was a claim that there’s a performance hit with G-SYNC but none with FreeSync.
    Things may have changed, but even so the difference was generally quite small – less than 3%. It’s probably safe to say that AMD is splitting hairs when they show a 1.5% performance drop in one specific scenario compared to a 0.2% performance gain.

    Except for a glitch with testing Alien Isolation using a custom resolution, our results basically don’t show much of a difference between enabling/disabling G-SYNC/FreeSync – and that’s what we want to see.

    Closing Thoughts

    It took a while to get here, but if the proof is in the eating of the pudding, FreeSync tastes just as good as G-SYNC when it comes to adaptive refresh rates. Within the supported refresh rate range, I found nothing to complain about. Perhaps more importantly, while you’re not getting a “free” monitor upgrade, the current prices of the FreeSync displays are very close to what you’d pay for an equivalent display that doesn’t have adaptive sync. That’s great news, and with the major scaler manufacturers on board with adaptive sync the price disparity should only shrink over time.

    The FreeSync driver that AMD provided (Catalyst 15.3 Beta 1) only allows FreeSync to work with single-GPU configurations.

    Reply
