Looking for a Computing Reboot

IEEE has an interesting working group and web page called Rebooting Computing. The term “Rebooting Computing” was coined by IEEE Life Fellow Peter Denning as part of his National Science Foundation-sponsored initiative to revamp computing education. The group works from a holistic viewpoint, taking into account both evolutionary and revolutionary approaches.


Why is this important? The article Computing Needs a Reboot argues that the old techniques are running out of gas, so engineers need to explore new computing paradigms to fuel future performance advances. For decades it held that computers roughly doubled in speed every year and a half: Moore’s Law kept doubling performance for the same cost about every 18 months for some 50 years, helped along by architectural tricks such as speculative execution. Those tricks hit a roadblock in 2005: speculative execution and its kin tied higher performance directly to higher power, which shifted the cost to cooling. The industry reacted by putting multiple computers (cores) on the same die, but to light up all of those cores, the burden shifted from the hardware to the programmer – and multicore programming still has many challenges today.
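
To see what that shift means in practice, here is a minimal sketch (an illustration, not taken from the article) of multicore-style programming with POSIX threads: the programmer has to split the work by hand, launch and join the threads, and keep them from racing on shared data. Four threads each sum one slice of an array into their own partial result, which the main thread then combines.

    /* Minimal multicore sketch: partition the work, run it on several
     * threads, and combine the results only after all threads finish. */
    #include <pthread.h>
    #include <stdio.h>

    #define N 1000000
    #define NTHREADS 4

    static double data[N];

    struct chunk { int start, end; double partial; };

    static void *sum_chunk(void *arg)
    {
        struct chunk *c = arg;
        c->partial = 0.0;
        for (int i = c->start; i < c->end; i++)
            c->partial += data[i];      /* each thread writes only its own struct */
        return NULL;
    }

    int main(void)
    {
        for (int i = 0; i < N; i++)
            data[i] = 1.0;

        pthread_t tid[NTHREADS];
        struct chunk chunks[NTHREADS];

        for (int t = 0; t < NTHREADS; t++) {
            chunks[t].start = t * (N / NTHREADS);
            chunks[t].end   = (t + 1) * (N / NTHREADS);
            pthread_create(&tid[t], NULL, sum_chunk, &chunks[t]);
        }

        double total = 0.0;
        for (int t = 0; t < NTHREADS; t++) {
            pthread_join(tid[t], NULL);
            total += chunks[t].partial; /* combine partial sums after joining */
        }
        printf("sum = %f\n", total);
        return 0;
    }

Even this toy shows why the free performance of the clock-speed era was so convenient: none of this partitioning or synchronization used to be the programmer’s problem.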

But things got even worse for the computer industry. The trend that Gordon Moore observed, that the number of transistors per unit area doubled every 18 months, was coming to an end. And so here we are today: microprocessors are no longer getting faster for the same price. The list of computing’s impacts is nearly endless, since computers are almost everywhere, so what will the effect on society and the tech industry be if we can’t keep advancing at the speed we are used to? The Death of Moore’s Law Will Spur Innovation – or at least it should spark interest in searching for fresh ideas.

In late 2012, IEEE began its “Rebooting Computing” initiative, which tries to find approaches for getting back to the historic exponential scaling of computer performance. The candidate approaches are radical: leveraging randomness and letting computers produce approximate results, mimicking the structures of the brain, and keeping lists of potential results around for longer. Each of these approaches is considered lunatic fringe by the industry.
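
As a rough illustration of the first of those ideas, leveraging randomness and accepting approximate results, consider a Monte Carlo estimate of pi (a toy analogy, not one of the IEEE group’s actual designs): the program never computes the exact answer, but the caller can trade accuracy against time and energy simply by choosing how many random samples to spend.

    /* Toy example of "leverage randomness, accept approximate results":
     * estimate pi by sampling random points in the unit square and
     * counting how many land inside the quarter circle. */
    #include <stdio.h>
    #include <stdlib.h>

    static double estimate_pi(long samples)
    {
        long inside = 0;
        for (long i = 0; i < samples; i++) {
            double x = (double)rand() / RAND_MAX;   /* random point in [0,1] x [0,1] */
            double y = (double)rand() / RAND_MAX;
            if (x * x + y * y <= 1.0)
                inside++;                           /* inside the quarter circle */
        }
        return 4.0 * (double)inside / (double)samples;
    }

    int main(void)
    {
        srand(42);
        /* Fewer samples: cheaper but rougher; more samples: costlier but closer. */
        printf("1e4 samples: %f\n", estimate_pi(10000));
        printf("1e7 samples: %f\n", estimate_pi(10000000));
        return 0;
    }

With 10,000 samples the estimate is cheap but rough; with 10,000,000 it is closer but costs a thousand times more work, and that dial is exactly the kind of trade-off approximate computing wants to expose.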

 

3 Comments

  1. Tomi Engdahl says:

    HP haters: Get ready to rage against THE MACHINE ‘next year’
    Prototype version won’t be quite what’s promised
    http://www.theregister.co.uk/2015/06/03/hp_machine_prototype_2016/

    HP Discover: HP is still hard at work on the futuristic computing platform it has dubbed the Machine, and staffers from HP Labs were on hand at the HP Discover conference in Las Vegas this week to give an update on its progress.

    There was even sample hardware on display, which HP Labs’ Jaap Suermondt said was “evidence that we’re actually making significant progress.”

    Progress doesn’t equal product, though, and there’s little evidence that the Machine is anywhere close to being marketable. But then, as HP CTO Martin Fink put it in a presentation for the media, HP has always described the Machine as an “end-of-decade project.”

    HP has branded its current fancy servers “Moonshot.” Think of the Machine as more of a “Mars-shot” – the R&D required will cost about as much and it has nearly as much chance of success.

    In case you’re not familiar with the core idea, the Machine is a new kind of computing device aimed at addressing the new kinds of problems that arise given the current data explosion.

    What makes the Machine different from current-generation computers is that it makes memory central, in an architecture that HP is calling memory-driven computing (MDC). Each Machine has a pool of what HP calls “universal memory,” and you can attach whatever you want to that memory – including CPU cores, GPUs, network interfaces, specialized processing units, and so on.

    “When you boot the Machine, you actually boot all of the memory; only when all of the memory is fully booted do you boot the processors,” Fink explained. “You can reboot all of the processors and not even impact memory.”

    Another unique aspect of the Machine is that all of the memory is non-volatile, ideally using HP’s future-voodoo memristor technology. Fink said he doesn’t believe HP will be alone in this; eventually, the whole industry will move toward non-volatile memory technologies.

    “If the Machine offered absolutely no enhancement to performance for a workload and all it did was accomplish its security goals, that fact alone would make it valuable,” Fink opined.

    Is this ‘Mars-shot’ really feasible?

    Yet problems remain. The most glaring one is that HP still has not managed to bring its fabled memristor technology to market, meaning the universal non-volatile memory at the core of the Machine’s design doesn’t really exist yet.

    Manadhata said the Machine is currently “the biggest problem” being worked on at HP Labs. That seems to be something of an understatement.

  2. Tomi Engdahl says:

    James Niccolai / PCWorld:
    Prototype of HP’s ‘Machine’ server coming next year, but finished product with memristor technology is still years away

    Prototype of HP’s futuristic ‘Machine’ coming next year
    http://www.pcworld.com/article/2931352/prototype-of-hps-futuristic-machine-coming-next-year.html

    A prototype of Hewlett-Packard’s futuristic Machine computer will be ready for partners to develop software on by next year, though the finished product is still half a decade away.

    The single-rack prototype will have 2,500 CPU cores and an impressive 320TB of main memory, CTO and HP Labs Director Martin Fink told reporters at the HP Discover conference Wednesday. This is more than 20 times the amount of any server on the market today, he claimed.

    But there’s a catch: the prototype will use current DRAM memory chips, because the advanced memristor technology that HP eventually plans to use is still under development—one of the big reasons The Machine remains several years away.

    HP is placing a huge bet that it can develop a new type of computer that stores all data in vast pools of non-volatile memory. HP says the Machine will be superior to any computer today. A system the size of a refrigerator will be able to do the work of a whole data center, it claims.

    Say goodbye to disk drives

    In current server architectures, the CPUs lie at the center, with multiple layers of memory and storage attached, including DRAM and hard disk drives. HP’s goal is to do away with disk drives altogether, and replace DRAM with pools of non-volatile memory.

    That type of memory keeps its data when the power is switched off, so the Machine can be highly energy efficient. Non-volatile memory exists today, for example NAND Flash, but its performance is slow, at least in high-performance computing terms, and memristors should offer far greater storage density.

    The Machine makes memory “a first class citizen,” he said, with memory pools linked by high-speed silicon photonics that will carry data at 1.2TB per second.

    “The Machine is driven by making memory the center of the universe, with the processors surrounding it,” he said. And he has a new name for the architecture the Machine is based on: Memory Driven Computing.

    What apps will The Machine run?

    HP is having a “huge debate” about the applications that will run on the machine. Most people want to transfer over existing workloads, which HP says will be possible, but more interesting are the new applications not possible today.

    It’s a great story, but as with any major new technology that’s still five years out, it’s impossible to say if it will pan out. Intel once thought it would take over the world with a new processor architecture called Itanium, and that chip seems headed for the scrap heap.

    But HP is pushing full steam ahead.

    One booth shows an emulation tool HP engineers are using to develop the Machine’s OS and firmware. On a laptop, it can simulate the huge memory pools the system will use even though the hardware itself doesn’t exist yet.

    Known as the Machine Architecture Simulator, it can also simulate compute nodes for the Machine, and engineers can select from x86 or ARM-type processors, indicating The Machine will be processor agnostic.

    In fact, the type of processor isn’t important, Fink said. Large companies could even design their own, application-specific CPUs, or attach GPUs or network interface cards.

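    The description above is architectural rather than an API, but the core idea, data structures living in a large byte-addressable non-volatile pool instead of being serialized to disk, can be roughly sketched today with a memory-mapped file. This is only an illustration of the concept, not HP’s interface; on the Machine the pool would be native universal memory rather than file-backed emulation.

        /* Rough sketch only: byte-addressable, persistent "memory" emulated
         * with a memory-mapped file. Not HP's API; the Machine's universal
         * memory would make this kind of access native instead of file-backed. */
        #include <fcntl.h>
        #include <stdio.h>
        #include <sys/mman.h>
        #include <unistd.h>

        struct record {          /* a structure that lives "in memory" yet persists */
            long counter;
            char note[56];
        };

        int main(void)
        {
            int fd = open("pool.bin", O_RDWR | O_CREAT, 0644);  /* stand-in for a memory pool */
            if (fd < 0 || ftruncate(fd, sizeof(struct record)) < 0)
                return 1;

            struct record *r = mmap(NULL, sizeof *r, PROT_READ | PROT_WRITE,
                                    MAP_SHARED, fd, 0);
            if (r == MAP_FAILED)
                return 1;

            r->counter += 1;                                    /* update the structure in place */
            snprintf(r->note, sizeof r->note, "run number %ld", r->counter);
            msync(r, sizeof *r, MS_SYNC);                       /* make the update durable */

            printf("%s\n", r->note);
            munmap(r, sizeof *r);
            close(fd);
            return 0;
        }
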
  3. Tomi Engdahl says:

    Major Advance Reveals the Limits of Computation
    http://www.wired.com/2015/10/major-advance-reveals-limits-computation/

    At first glance, the big news coming out of this summer’s conference on the theory of computing appeared to be something of a letdown. For more than 40 years, researchers had been trying to find a better way to compare two arbitrary strings of characters, such as the long strings of chemical letters within DNA molecules. The most widely used algorithm is slow and not all that clever: It proceeds step-by-step down the two lists, comparing values at each step. If a better method to calculate this “edit distance” could be found, researchers would be able to quickly compare full genomes or large data sets, and computer scientists would have a powerful new tool with which they could attempt to solve additional problems in the field.

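    For context, the “most widely used algorithm” the article refers to is, presumably, the standard dynamic-programming edit-distance (Levenshtein) computation, which fills in the distance between every pair of prefixes and therefore needs time proportional to the product of the two string lengths, i.e. quadratic for strings of similar size. A compact sketch of it (an illustration, not code from the article):

        /* Classic dynamic-programming edit distance: the distance for prefixes
         * a[0..i) and b[0..j) is built from the three neighbouring subproblems
         * (delete, insert, substitute). Runs in O(m*n) time. */
        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>

        static int min3(int a, int b, int c)
        {
            int m = a < b ? a : b;
            return m < c ? m : c;
        }

        static int edit_distance(const char *a, const char *b)
        {
            size_t m = strlen(a), n = strlen(b);
            int *prev = malloc((n + 1) * sizeof *prev);   /* only two rows are needed */
            int *curr = malloc((n + 1) * sizeof *curr);

            for (size_t j = 0; j <= n; j++)
                prev[j] = (int)j;                         /* "" -> b[0..j) takes j inserts */

            for (size_t i = 1; i <= m; i++) {
                curr[0] = (int)i;                         /* a[0..i) -> "" takes i deletes */
                for (size_t j = 1; j <= n; j++) {
                    int subst = prev[j - 1] + (a[i - 1] != b[j - 1]);
                    curr[j] = min3(prev[j] + 1, curr[j - 1] + 1, subst);
                }
                memcpy(prev, curr, (n + 1) * sizeof *prev);
            }

            int d = prev[n];
            free(prev);
            free(curr);
            return d;
        }

        int main(void)
        {
            printf("%d\n", edit_distance("GATTACA", "GCATGCU"));   /* prints 4 */
            return 0;
        }

    The “major advance” in the headline is, in fact, evidence that nothing dramatically faster than this quadratic approach should be expected in general.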
