Computer trends for 2015

Here comes my long list of computer technology trends for 2015:

Digitalisation will change all business sectors, and through them our daily work, even more than before. Digitalisation also changes the IT sector itself: traditional software packages are moving rapidly into the cloud, and the need to own or rent your own IT infrastructure is dramatically reduced. Automated configuration and monitoring become truly practical. The workload of software implementation projects will shrink significantly, as software needs less adjustment. Traditional IT outsourcing is definitely threatened. Security management is one of the key factors to change, as security threats increasingly target the digital world. For the IT sector, digitalisation simply means “cheaper and better.”

The phrase “Communications Transforming Business” is becoming the new normal. The pace of change in enterprise communications and collaboration is very fast. A new set of capabilities, empowered by the combination of Mobility, the Cloud, Video, software architectures and Unified Communications, is changing expectations for what IT can deliver.

Global Citizenship: Technology Is Rapidly Dissolving National Borders. Besides your passport, what really defines your nationality these days? Is it where you live? Where you work? The language you speak? The currency you use? If it is, then we may see the idea of “nationality” quickly dissolve in the decades ahead. Language, currency and residency are rapidly being disrupted and dematerialized by technology. Increasingly, technological developments will allow us to live and work almost anywhere on the planet… (and even beyond). In my mind, a borderless world will be a more creative, lucrative, healthy, and frankly, exciting one. Especially for entrepreneurs.

The traditional enterprise workflow is ripe for huge change as the focus moves away from working in a single context on a single device to the workflow being portable and contextual. InfoWorld’s executive editor, Galen Gruman, has coined a phrase for this: “liquid computing.” The increase in productivity is promised to be stunning, but the loss of control over data will cross an alarming threshold for many IT professionals.

Mobile will be used more and more. Currently, 49 percent of businesses across North America have adopted between one and ten mobile applications, indicating significant acceptance of these solutions. Properly leveraged, mobility promises to increase visibility and responsiveness in the supply chain. Increased employee productivity and business process efficiencies are seen as the key business impacts.

The Internet of things is a big, confusing field waiting to explode.  Answer a call or go to a conference these days, and someone is likely trying to sell you on the concept of the Internet of things. However, the Internet of things doesn’t necessarily involve the Internet, and sometimes things aren’t actually on it, either.

The next IT revolution will come from an emerging confluence of liquid computing plus the Internet of things. The two trends are connected — or should connect, at least. If we are to trust the consultants, we are at a sweet spot for significant change in computing that all companies and users should look forward to.

Cloud will be talked about a lot and taken more into use. Cloud is the next-generation supply chain for IT. A global survey of executives predicted a growing shift towards third-party providers to supplement internal capabilities with external resources. CIOs are expected to adopt a more service-centric enterprise IT model. Global business spending for infrastructure and services related to the cloud will reach an estimated $174.2 billion in 2014 (up 20% from $145.2 billion in 2013), and growth will continue to be fast (“By 2017, enterprise spending on the cloud will amount to a projected $235.1 billion, triple the $78.2 billion in 2011”).

The rapid growth in mobile, big data, and cloud technologies has profoundly changed market dynamics in every industry, driving the convergence of the digital and physical worlds, and changing customer behavior. It’s an evolution that IT organizations struggle to keep up with. To succeed in this situation, you need to combine traditional IT with agile, web-scale innovation. There is value in both the back-end operational systems and the fast-changing world of user engagement. You are now effectively operating two-speed IT (bimodal IT, two-speed IT, or traditional IT/agile IT). You need a new API-centric layer in the enterprise stack, one that enables two-speed IT.
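
As a minimal sketch of what such an API-centric layer could look like (a hedged illustration: the endpoint, port and helper names below are invented, not taken from any product), a thin Python/Flask facade can give fast-moving front ends a stable, versioned JSON contract while hiding the slower back-end system of record:

    # Minimal sketch of an API layer for two-speed IT (hypothetical names).
    # Fast-changing clients talk only to this stable, versioned JSON API;
    # the traditional back-end system stays hidden behind it.
    from flask import Flask, jsonify

    app = Flask(__name__)

    def fetch_order_from_backend(order_id):
        # Placeholder for a call into the slow system of record
        # (database query, RPC, etc.); hardcoded for illustration.
        return {"id": order_id, "status": "shipped"}

    @app.route("/api/v1/orders/<int:order_id>")
    def get_order(order_id):
        # The version in the URL lets each side evolve at its own speed.
        return jsonify(fetch_order_from_backend(order_id))

    if __name__ == "__main__":
        app.run(port=8080)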

As Robots Grow Smarter, American Workers Struggle to Keep Up. Although fears that technology will displace jobs are at least as old as the Luddites, there are signs that this time may really be different. The technological breakthroughs of recent years — allowing machines to mimic the human mind — are enabling machines to do knowledge jobs and service jobs, in addition to factory and clerical work. Automation is not only replacing manufacturing jobs, it is displacing knowledge and service workers too.

In many countries the IT recruitment market is flying, having picked up to a post-recession high. Employers beware – after years of relative inactivity, job seekers are gearing up for change. Economic improvements and an increase in business confidence have led to a burgeoning jobs market and an epidemic of itchy feet.

Hopefully the IT department is increasingly being seen as a profit centre rather than a cost centre, with IT budgets commonly split between keeping the lights on and spending on innovation and revenue-generating projects. Historically IT was about keeping the infrastructure running and there was no real understanding outside of that, but the days of IT being locked in a basement are gradually changing. CIOs and CMOs must work more closely to increase focus on customers next year or risk losing market share, Forrester Research has warned.

Good questions to ask: Where do you see the corporate IT department in five years’ time? With the consumerization of IT continuing to drive employee expectations of corporate IT, how will this potentially disrupt the way companies deliver IT? What IT process or activity is the most important in creating superior user experiences to boost user/customer satisfaction?


Windows Server 2003 goes end of life in summer 2015 (July 14, 2015). There are millions of servers globally still running the 13-year-old OS, with one in five customers forecast to miss the 14 July deadline when Microsoft turns off extended support. Some months back there were estimated to be 2.7 million WS2003 servers in operation in Europe. This will keep system administrators busy, because there is only around half a year left, and migrating to Windows Server 2008 or Windows Server 2012 may prove difficult. Microsoft and support companies do not seem interested in continuing Windows Server 2003 support, so for those who need it, custom pricing can be “incredibly expensive”. At this point it seems that many organizations want a new architecture, and one option they are considering is moving the servers to the cloud.

Windows 10 is coming to PCs and mobile devices. Just a few months back Microsoft unveiled a new operating system, Windows 10. The new Windows 10 OS is designed to run across a wide range of machines, including everything from tiny “internet of things” devices in business offices to phones, tablets, laptops, and desktops to computer servers. Windows 10 will have exactly the same requirements as Windows 8.1 (the same minimum PC requirements that have existed since 2006: a 1GHz, 32-bit chip with just 1GB of RAM). A technical preview is already available. Microsoft says to expect AWESOME things of Windows 10 in January. Microsoft will share more about the Windows 10 ‘consumer experience’ at an event on January 21 in Redmond and is expected to show a Windows 10 mobile SKU at the event.

Microsoft is going to monetize Windows differently than before. Windows has made headway in the market for low-end laptops and tablets this year because Microsoft reduced the price it charges device manufacturers, charging no royalty on devices with screens of 9 inches or less. That has resulted in a new wave of Windows notebooks in the $200 price range and tablets in the $99 price range. The long-term success of the strategy against Android tablets and Chromebooks remains to be seen.

Microsoft is pushing the Universal Apps concept. Microsoft has announced Universal Windows Apps, allowing a single app to run across Windows 8.1 and Windows Phone 8.1 for the first time, with additional support for Xbox coming. Microsoft promotes a unified Windows Store for all Windows devices; the Windows Phone Store and Windows Store will be unified with the release of Windows 10.

Under new CEO Satya Nadella, Microsoft realizes that, in the modern world, its software must run on more than just Windows. Microsoft has already released Microsoft Office programs for the Apple iPad and iPhone. It also has an email client compatible with both the iOS and Android mobile operating systems.

With Mozilla Firefox and Google Chrome grabbing so much of the desktop market—and Apple Safari, Google Chrome, and Google’s Android browser dominating the mobile market—Internet Explorer is no longer the force it once was. The article “Microsoft May Soon Replace Internet Explorer With a New Web Browser” says that Microsoft’s Windows 10 operating system will debut with an entirely new web browser code-named Spartan. This new browser is a departure from Internet Explorer, the Microsoft browser whose relevance has waned in recent years.

SSD capacity has always lagged well behind that of hard disk drives (hard disks are in 6TB and 8TB territory, while SSDs are primarily 256GB to 512GB). Intel and Micron will try to kill the hard drive with new flash technologies. Intel announced it will begin offering 3D NAND drives in the second half of next year as part of its joint flash venture with Micron. Later (within the next two years) Intel promises 10TB+ SSDs thanks to 3D vertical NAND flash memory. Interfaces to SSDs are also evolving beyond traditional hard disk interfaces: PCIe flash and NVDIMMs will make their way into shared storage devices more in 2015. The ULLtraDIMM™ SSD connects flash storage to the memory channel via standard DIMM slots in order to close the gap between storage devices and system memory (less than five microseconds write latency at the DIMM level).

Hard disks will still be made in large quantities in 2015. It seems that NAND is not taking over the data centre immediately; the huge problem is $/GB. Estimates of shipped disk and SSD capacity out to 2018 show disk growing faster than flash. The world’s ability to make and ship SSDs is falling behind its ability to make and ship disk drives – for SSD capacity to match disk by 2018 we would need roughly eight times more flash foundry capacity than we have. New disk technologies such as shingling, TDMR and HAMR are upping areal density per platter and bringing down cost/GB faster than NAND technology can. At present, solid-state drives with extreme capacities are very expensive. I expect that in 2015 SSD prices will still be so much higher than hard disk prices that everybody who needs to store large amounts of data will want to consider SSD + hard disk hybrid storage systems.
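
A back-of-the-envelope calculation makes the hybrid argument concrete (all prices below are illustrative assumptions, not market data):

    # Rough $/GB comparison; the prices are assumed example figures.
    ssd_price, ssd_gb = 250.0, 512      # hypothetical 512GB SSD
    hdd_price, hdd_gb = 160.0, 4000     # hypothetical 4TB hard disk

    ssd_per_gb = ssd_price / ssd_gb     # ~0.49 $/GB
    hdd_per_gb = hdd_price / hdd_gb     # ~0.04 $/GB
    print(ssd_per_gb / hdd_per_gb)      # SSD costs roughly 12x more per GB

    # Hybrid system: a small SSD tier for hot data, bulk data on disk.
    print(256 * ssd_per_gb + 4000 * hdd_per_gb)   # ~285 USD
    print(4256 * ssd_per_gb)                      # ~2078 USD for all-SSD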

PC sales, and even laptop sales, are down, and manufacturers are pulling out of the market. The future is all about devices. We have entered the post-PC era so deeply that even the tablet market seems to be saturating, as most people who want one already have one. The crazy years of huge tablet sales growth are over. Tablet shipment growth in 2014 was already quite low (7.2%, to 235.7M units). There is no great reason for growth or decline to be seen in the tablet market in 2015, so I expect it to be stable. IDC expects the iPad to see its first-ever decline, and I expect that too, because the market seems to be increasingly taken by Android tablets that have turned out to be “good enough”. Wearables, Bitcoin or messaging may underpin the next consumer computing epoch, after the PC, internet, and mobile.

There will be new tiny PC form factors coming. Intel is shrinking PCs to thumb-sized “compute sticks” that will be out next year. The stick will plug into the back of a smart TV or monitor “and bring intelligence to that”. The compute stick is likened to similar thumb PCs that plug into an HDMI port and are offered by PC makers with the Android OS and ARM processors (for example the Wyse Cloud Connect and many cheap Android sticks). Such devices typically don’t have internal storage, but can be used to access files and services in the cloud. Intel expects the stick-sized PC market to grow to tens of millions of devices.

We have entered the era of post-Microsoft, post-PC programming: the portable REVOLUTION. Tablets and smartphones are fine for consuming information: a great way to browse the web, check email, stay in touch with friends, and so on. But what does a post-PC world mean for creating things? If you’re writing platform-specific mobile apps in Objective-C or Java then no, the iPad alone is not going to cut it. You’ll need some kind of iPad-to-server setup in which your iPad becomes a mythical thin client for the development environment running on your PC or in the cloud. If, however, you’re working with scripting languages (such as Python and Ruby) or building web-based applications, the iPad or another tablet could be a usable development environment. At least it is worth a test.

You need to prepare to learn new languages that are good for specific tasks. Attack of the one-letter programming languages: from D to R, these lesser-known languages tackle specific problems in ways worthy of a cult following. Watch out! The coder in the next cubicle might have been bitten and infected with a crazy-eyed obsession with a programming language that is not Java and goes by a mysterious one-letter name. Each offers compelling ideas that could do the trick in solving a particular problem you need fixed.

HTML5’s “Dirty Little Secret”: It’s Already Everywhere, Even in Mobile. Just look under the hood. “The dirty little secret of native [app] development is that huge swaths of the UIs we interact with every day are powered by Web technologies under the hood.” When people say Web technology lags behind native development, what they’re really talking about is the distribution model. It’s not that the pace of innovation on the Web is slower; it’s just solving a problem that is an order of magnitude more challenging than how to build and distribute trusted apps for a single platform. Efforts like the Extensible Web Manifesto have been largely successful at overhauling the historically glacial pace of standardization. Vine is a great example of a modern JavaScript app. It’s lightning fast on desktop and on mobile, and shares the same codebase for ease of maintenance.

Docker, meet hype. Hype, meet Docker. Docker: sorry, you’re just going to have to learn about it. Containers aren’t a new idea, and Docker isn’t remotely the only company working on productising containers. It is, however, the one that has captured hearts and minds. Docker containers are supported by very many Linux systems. And it is not just Linux anymore, as Docker’s app containers are coming to Windows Server, says Microsoft. What containerization lets you do is launch multiple applications that share the same OS kernel and other system resources but otherwise act as though they’re running on separate machines. Each is sandboxed off from the others so that they can’t interfere with each other. What Docker brings to the table is an easy way to package, distribute, deploy, and manage containerized applications.
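
As a minimal, hypothetical illustration of that packaging idea (the application file and image names are invented), a Dockerfile describes the image once, and any host with a Docker-capable kernel can then run sandboxed instances of it:

    # Dockerfile - a minimal, hypothetical example.
    # Start from a public base image, add the application, declare
    # what to run when a container is started from the image.
    FROM python:2.7
    COPY app.py /app/app.py
    WORKDIR /app
    CMD ["python", "app.py"]

    # Build once, then run as many isolated instances as you like:
    #   docker build -t myapp .
    #   docker run -d --name myapp1 myapp
    #   docker run -d --name myapp2 myapp   # sandboxed, same shared kernel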

Domestic software is on the rise in China. China is planning to purge foreign technology and replace it with homegrown suppliers: China is aiming to purge most foreign technology from banks, the military, state-owned enterprises and key government agencies by 2020, stepping up efforts to shift to Chinese suppliers, according to people familiar with the effort. In tests, workers have replaced Microsoft Corp.’s Windows with a homegrown operating system called NeoKylin (a FreeBSD-based desktop OS), and Dell commercial PCs will preinstall NeoKylin in China. The plan is driven by national security concerns and marks an increasingly determined move away from foreign suppliers. There are cases of foreign products being replaced at all layers, from applications and middleware down to infrastructure software and hardware. Foreign suppliers may be able to avoid replacement if they share their core technology or give China’s security inspectors access to their products. The campaign could have lasting consequences for U.S. companies including Cisco Systems Inc. (CSCO), International Business Machines Corp. (IBM), Intel Corp. (INTC) and Hewlett-Packard Co. A key government motivation is to bring China up from low-end manufacturing to the high end.


Data center markets will grow. MarketsandMarkets forecasts the data center rack server market to grow from $22.01 billion in 2014 to $40.25 billion by 2019, at a compound annual growth rate (CAGR) of 7.17%. North America (NA) is expected to be the largest region for the market’s growth in terms of revenues generated, but Asia-Pacific (APAC) is also expected to emerge as a high-growth market.

The rising need for virtualized data centers and incessantly increasing data traffic are considered strong drivers for the global data center automation market. The SDDC comprises software-defined storage (SDS), software-defined networking (SDN) and software-defined server/compute, wherein all three components of the data center are empowered by specialized controllers, which abstract the control plane from the underlying physical equipment. The controller virtualizes the network, server and storage capabilities of a data center, thereby giving better visibility into data traffic routing and server utilization.
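
The controller idea can be illustrated with a toy Python sketch (all class and rule names are invented; real controllers speak protocols such as OpenFlow to real equipment): the control plane holds policy centrally, while the devices merely execute the rules pushed down to them:

    # Toy illustration of separating the control plane from the devices.
    class VirtualSwitch(object):
        """Data plane: only forwards according to installed rules."""
        def __init__(self):
            self.flow_table = {}            # destination -> output port

        def install_rule(self, dst, port):
            self.flow_table[dst] = port

        def forward(self, packet):
            return self.flow_table.get(packet["dst"], "drop")

    class Controller(object):
        """Control plane: central policy with a view over all devices."""
        def __init__(self, switches):
            self.switches = switches

        def apply_policy(self, policy):
            for dst, port in policy.items():
                for switch in self.switches:
                    switch.install_rule(dst, port)

    sw1, sw2 = VirtualSwitch(), VirtualSwitch()
    Controller([sw1, sw2]).apply_policy({"10.0.0.5": "port2"})
    print(sw1.forward({"dst": "10.0.0.5"}))   # -> port2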

New software-defined networking apps will be delivered in 2015. And so will software-defined storage. And software-defined almost anything else (I am waiting for the day we see software-defined software). Customers are ready to move away from vendor-driven proprietary systems that are overly complex and impede their ability to rapidly respond to changing business requirements.

Large data center operators will be using more and more of their own custom hardware instead of standard PCs from traditional computer manufacturers. Intel is betting on (customized) commodity chips for cloud computing, and it expects that over half the chips it sells to public clouds in 2015 will have custom designs. The biggest public clouds (Amazon Web Services, Google Compute, Microsoft Azure), other big players (like Facebook or China’s Baidu) and other public clouds (like Twitter and eBay) all have huge data centers that they want to run optimally. Companies like AWS “are running a million servers, so floor space, power, cooling, people — you want to optimize everything”. That is why they want specialized chips, and customers are willing to pay a little more for a special run of chips. While most of Intel’s chips still go into PCs, about one-quarter of Intel’s revenue, and a much bigger share of its profits, comes from semiconductors for data centers. In the first nine months of 2014, the average selling price of PC chips fell 4 percent, but the average price of data center chips was up 10 percent.

We have seen GPU acceleration taken into wider use. Special servers and supercomputer systems have long been accelerated by moving calculations to graphics processors. The next step in acceleration will be adding FPGAs to accelerate x86 servers. FPGAs provide a unique combination of highly parallel custom computation, relatively low manufacturing/engineering costs, and low power requirements. FPGA circuits can provide a lot more computing power at much lower power consumption, but traditionally programming them has been time consuming. This can change with the introduction of new tools (the next step on from techniques learned from GPU acceleration). Xilinx has developed its SDAccel tools to develop algorithms in the C, C++ and OpenCL languages and translate them to FPGAs easily. IBM and Xilinx have already demoed FPGA-accelerated systems. Microsoft is also doing research on accelerating applications with FPGAs.
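
As a taste of the programming model involved, here is the classic vector-add written as a standard OpenCL C kernel and driven from Python with the pyopencl library (runnable on any machine with an OpenCL runtime); in an SDAccel-style flow much the same kernel source would be compiled for an FPGA instead of a GPU:

    # Vector addition with OpenCL; requires pyopencl and an OpenCL runtime.
    import numpy as np
    import pyopencl as cl

    a = np.random.rand(1024).astype(np.float32)
    b = np.random.rand(1024).astype(np.float32)

    ctx = cl.create_some_context()
    queue = cl.CommandQueue(ctx)
    mf = cl.mem_flags
    a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
    b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
    out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

    # The kernel itself is plain OpenCL C: one work-item per element.
    prg = cl.Program(ctx, """
    __kernel void vadd(__global const float *a,
                       __global const float *b,
                       __global float *out) {
        int i = get_global_id(0);
        out[i] = a[i] + b[i];
    }
    """).build()

    prg.vadd(queue, a.shape, None, a_buf, b_buf, out_buf)
    out = np.empty_like(a)
    cl.enqueue_copy(queue, out, out_buf)
    assert np.allclose(out, a + b)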


If there is one enduring trend in memory design from 2014 that will carry through to next year, it’s the continued demand for higher performance. The trend toward high performance is never going away. At the same time, the goal is to keep costs down, especially when it comes to consumer applications using DDR4 and mobile devices using LPDDR4. LPDDR4 will gain a strong foothold in 2015, and not just to address mobile computing demands. The reality is that LPDDR3, or even DDR3 for that matter, will be around for the foreseeable future (the lowest-cost DRAM, whatever that may be). Designers are looking for subsystems that can easily accommodate DDR3 in the immediate future, but will also be able to support DDR4 when it becomes cost-effective or makes more sense.

Universal memory for instant-on computing will be talked about. New memory technologies promise to be strong contenders for replacing the entire memory hierarchy and enabling instant-on operation in computers. HP is working on memristor memories that are promised to be akin to RAM but can hold data without power. The memristor is also denser than DRAM, the current technology used for main memory: according to HP it is, in fact, 64 to 128 times denser. You could very well have 512 GB of memristor RAM in the near future. HP has what it calls “The Machine”, practically a researcher’s plaything for experimenting with emerging computer technologies. Hewlett-Packard’s ambitious plan to reinvent computing will begin with the release of a prototype operating system in 2015 (Linux++, in June 2015). HP must still make significant progress in both software and hardware to make its new computer a reality. A working prototype of The Machine should be ready by 2016.

Chip designs that enable everything from a 6 Gbit/s smartphone interface to the world’s smallest SRAM cell will be described at the International Solid State Circuits Conference (ISSCC) in February 2015. Intel will describe a Xeon processor packing 5.56 billion transistors, and AMD will disclose an integrated processor sporting a new x86 core, according to a just-released preview of the event. The annual ISSCC covers the waterfront of chip designs that enable faster speeds, longer battery life, more performance, more memory, and interesting new capabilities. There will be many presentations on first designs made in 16 and 14 nm FinFET processes at IBM, Samsung, and TSMC.


1,403 Comments

  1. Tomi Engdahl says:

    Brad Chacos / PCWorld:
    Nvidia introduces multi-resolution shading to reduce graphics performance needed to create VR scenes, which could help VR games run on less powerful hardware

    Nvidia’s radical multi-resolution shading tech could help VR reach the masses
    http://www.pcworld.com/article/2926083/nvidias-radical-multi-resolution-shading-tech-could-help-vr-reach-the-masses.html

    When Oculus VR revealed the recommended PC specs for the forthcoming consumer release of its highly anticipated Oculus Rift virtual reality headset, the graphics card requirements were shockingly reasonable. Sure, the GeForce GTX 970 and Radeon R9 290 are no slouches in the eye-candy department, but delivering high-resolution visuals to two displays at 90 frames per second takes a lot of firepower. How will developers create top-tier VR games that don’t require the latest and greatest graphics cards (like the newly announced GTX 980 Ti) to run at the blistering frame rates required to avoid the dreaded VR nausea?

    Nvidia may have stumbled onto the answer with its multi-resolution shading (MRS) feature, a new GameWorks VR middleware technology available for developers. MRS takes advantage of a quirk in the way VR headsets render images to drastically reduce the graphics performance needed to create virtual scenes—which could effectively be used to run VR games on less powerful hardware.

    The secret sauce in Nvidia’s multi-resolution shading lies in the way virtual reality headsets, by their very nature, warp on-screen imagery.

    Normally, graphics cards render full-screen images as a straight-ahead, rectangular scene, applying the same resolution across the entire image

    But VR headsets use a pair of over-the-eye lenses to push the focal point of scenes out into the distance.

    The Oculus Rift (and other VR headsets) scrunch the edges of rendered environments together into a roughly oval shape to make them appear correctly when viewed through the lenses.

    “GPUs render straight, not distorted,” says Peterson. “So what we actually have to do is take the original image, then warp it, to account for the fact that it’s going to be re-distorted by those lenses, so that by the end of the day—when you see it—the image is straight again.”

    But that warping compresses the edges of the images, throwing away a lot of the native imagery produced by the GPU. Your graphics card is essentially working harder than it has to.

    Rather than rendering the entire image at the same resolution, MRS splits the screen into separate regions. The center of the image—where your eyes primarily focus in a VR headset, and where the image isn’t distorted— is rendered at full, native resolution. The edges of the screen, however, are rendered at a reduced quality to take advantage of VR’s necessary warping and distortion.

    “It’s between 50 percent and 100 percent less pixel work [compared to traditionally rendered VR scenes],” says Peterson.

    That’s insane. Even more insane: The reduced quality edge regions truly aren’t noticeable in the final image unless the compression quality is cranked to extreme levels.

    At a 30 percent reduction in pixel work, there was no visible difference with MRS enabled or disabled.

    In order to truly make the reduced rendering visible, Peterson had to crank the compression up to 50 percent, or half the workload of the same image rendered at full resolution across the board. Only then was the effect noticeable, as a faint shimmering around the very edges of the image.

    That’s big news for VR developers, and for gamers who want to get into the virtual reality experience without spending the equivalent of a college education on a graphics card.

    “So if you’re a game developer, this means that you can have higher quality games, or that you can have your games run on more GPUs,” says Peterson.
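
    A toy calculation shows the scale of the saving (the per-eye resolution and region sizes below are assumptions for illustration; this is not Nvidia’s implementation):

        # Toy multi-resolution shading arithmetic (all figures assumed).
        def shaded_pixels(width, height, centre_frac=0.5, border_scale=0.5):
            # Centre region at full resolution, border at reduced density.
            cw, ch = int(width * centre_frac), int(height * centre_frac)
            centre = cw * ch
            border = (width * height - centre) * border_scale ** 2
            return centre + border

        full = 2 * 1080 * 1200              # two per-eye views, full resolution
        mrs = 2 * shaded_pixels(1080, 1200)
        print(1.0 - mrs / full)             # ~0.56: roughly 56% less pixel work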

    Reply
  2. Tomi Engdahl says:

    Nvidia’s new graphics card pushes for better 4K
    http://www.cnet.com/news/nvidia-pushes-4k-for-pc-with-new-geforce-gtx-980-ti/

    Nvidia announced a new flagship graphics card delivering smoother 4K graphics for PC gamers, plus new G-Sync screen technology for smoother gaming on laptops.

    Nvidia has announced a souped-up version of its previous flagship graphics card and given it a titanium sheen, pre-empting the formal opening of the Computex 2015 trade show.

    Nvidia says the new card, dubbed the GeForce GTX 980 Ti, will deliver better support for 4K PC gaming, allowing new games such as The Witcher 3 to run at 4K resolution at a reasonable 45 frames per second, compared to 19fps on the older GTX 680. For the latest car racing simulator, Project Cars, Nvidia expects the GTX 980 Ti to deliver 47fps, compared with 18fps on the GTX 680.

    The reference version of the graphics card will retail for $649

    For the technically inclined, it’s interesting that the GTX 980 Ti is clocked slower than previous cards at 1,000MHz.

    Besides the new GTX 980 Ti, Nvidia also announced that new gaming notebooks powered by its graphics cards will have G-Sync. G-Sync is Nvidia’s own solution for the problem of screen tearing, which is a horizontal distortion that happens when the display’s refresh rate is unable to keep up with the output from the graphic card.

    Most users turn on V-Sync to fix the problem, but it comes at the cost of performance. Nvidia’s solution is to synchronize both the monitor’s refresh rate and the render rate of the GPU, so images don’t go out of sync. New laptops from Asus, Clevo, Gigabyte and MSI will pack Nvidia-approved 75Hz displays.

    Nvidia currently controls around 70 percent of the GPU market, with its main rival, Advanced Micro Devices, powering the console hardware for the Xbox One, Sony PlayStation 4 and Nintendo’s Wii U, and the remaining 30 percent of the PC market. This is in stark contrast to 2010, where both players were neck-and-neck.

    Reply
  3. Tomi Engdahl says:

    FPGAs Ride HP’s Moonshot
    SRC goes to data centers with Altera
    http://www.eetimes.com/document.asp?doc_id=1326707&

    SAN JOSE, Calif. – Hewlett-Packard’s processor-agnostic Moonshot server today officially adds another board to its menu of options – the Saturn 1 from SRC Computers. The deal is the first big step into the commercial limelight for SRC, which has been quietly selling its boards mainly to government users since 2002.

    Moonshot is a chassis with a passive backplane that can host a variety of processor and networking cards. HP launched the system in October 2013 using Intel microserver processors and in late September 2014 added options for ARM-based cards using chips from Applied Micro and Texas Instruments.

    HP would not disclose how many of the non-x86 versions of Moonshot it has sold so far. But it did say it has more than 20 reference customers and has run more than 275 customer proofs-of-concept in its four Moonshot labs.

    Reply
  4. Tomi Engdahl says:

    ASUS reveals ‘Zensational’ style-over-substance kit
    This year’s smartmobes and fondleslabs are trying to look like handbags
    http://www.theregister.co.uk/2015/06/01/asus_reveals_its_new_zensations/

    Computex day zero kicked off in Taipei today, and after some pleasantries got down to business with an ASUS keynote that pitched style over substance as a good thing.

    Shih instead declared that “Inspiration is what surrounds us, like the mesmerising beauty and power of nature. The fusion of simplicity and peace. The perfect balance between beauty and strength” before exhorting us all to “Join me on this journey to Zensation!”

    The first step on that journey turned out to be the ZenAiO, “A fusion of art and technology” in the form of an all-in-one PC with a Core i7 and GTX 960M gaming graphics that Shih reckons will make your home more beautiful and make you a fragmeister to reckon with. Voice recognition and Intel’s RealSense both get guernseys, the better to help you chat with Windows 10’s Cortana personal assistant or play controller-free games

    The new phone has image enhancement software Chuang likened to “digital makeup” and 13MP front and rear cameras so you can always look digitally selfie-tastic. A Qualcomm Snapdragon 615 makes the 5.5 incher hum.

    Shih returned with a new range of ZenPad fondleslabs which he deemed “The perfect fusion of fashion and technology.”

    “Just like carrying a bag or a wallet that is both stylish and practical, a tablet can serve as a loyal companion that hosts all your essentials in style,” Shih said. The new range therefore comes with lots of lovely new finishes and cases.

    Reply
  5. Tomi Engdahl says:

    Apple takes aim at Google in VR market with Metaio acquisition
    Firm likely to announce virtual reality plans at next month’s WWDC
    http://www.theinquirer.net/inquirer/news/2410624/apple-takes-aim-at-google-in-vr-market-with-metaio-acquisition

    APPLE HAS ACQUIRED augmented reality startup Metaio just hours after Google detailed its virtual reality (VR) plans during the firm’s I/O keynote.

    Apple hasn’t said much about the reported acquisition and instead gave its usual vague statement: “Apple buys smaller technology companies from time to time, and we generally do not discuss our purpose or plans.”

    Reply
  6. Tomi Engdahl says:

    Apple trumps Windows PC makers for technical support
    80 percent of the time, it works every time
    http://www.theinquirer.net/inquirer/news/2410743/apple-trumps-windows-pc-makers-for-technical-support

    APPLE HAS BEATEN Windows PC makers to be crowned best for technical support for the ninth year running.

    That’s according to the latest figures from Consumer Reports, which ranks Apple top among PC manufacturers for post-purchase technical support.

    The report is put together from survey results gathered from 3,200 American PC buyers, and shows that Apple’s online and phone technical support solved customers’ problems 80 percent of the time.

    In comparison, “with most Windows PCs, there’s only a 50-50 chance that a manufacturer’s tech support will do the trick”, Consumer Reports said.

    Reply
  8. Tomi Engdahl says:

    EMC goes open source: its software will also work with competitors’ hardware

    Storage company EMC moved towards open source with its announcement in May that it will open up some of its software.

    The first of the company’s storage software to be openly distributed will be ViPR and CoprHD.

    The move to open source means that the software is available free of charge and that third parties can develop their own services and applications on top of it.

    “We believe in open source and that it is our future,” the company’s director CJ Desai said at the EMC World conference held in Las Vegas.

    According to him, EMC no longer sees open source as a risk, despite the fact that open source code enables EMC’s storage software to be used with storage servers made by other companies.

    According to EMC Finland’s chief technology officer Salmi, open source gives start-up entrepreneurs access to the software at no additional cost, so the announcement can be expected to influence Finnish companies.

    The company will publish the Project CoprHD source code on the GitHub service in June. It is licensed under the Mozilla Public License 2.0, which means that developers can build on the code.

    EMC’s goal is for the open-source ViPR controller software to reach as many clients as possible, including those using multi-vendor storage products.

    Source: http://www.tivi.fi/Kaikki_uutiset/2015-06-02/EMC-siirtyy-avoimeen-l%C3%A4hdekoodiin-ohjelmistot-toimivat-my%C3%B6s-kilpailijoiden-raudalla-3321952.html

    Reply
  9. Tomi Engdahl says:

    Microsoft announced yesterday that Windows 10 will go into public distribution in late July. Current Windows 7 and Windows 8 users will receive the updated operating system for free.

    Until now, Windows users have been able to freely choose when – if at all – they update the version running on their machine. In the future, Microsoft will force users onto a specific version, and these updates will happen automatically.

    What does this mean for the user? Above all, that Microsoft gets to decide which applications are included in an operating system update.

    Current builds show that Windows 10 ships with, for example, the popular Candy Crush Saga game. No one is asked whether they want the game on their computer or not.

    Analysts do not expect Windows 10 to do much to raise Linux’s market share on desktops and laptops.
    If a laptop is sold with Windows 10 preinstalled, the device manufacturer can force Secure Boot to be enabled. At least for the smaller Linux distributions, installation on such machines is going to be difficult, because they have not been licensed for any kind of Secure Boot support.

    Source: http://etn.fi/index.php?option=com_content&view=article&id=2914:windowsista-tulee-kuin-applen-os-x&catid=13&Itemid=101

    Reply
  10. Tomi Engdahl says:

    The Chromebook is now available for the desktop computer – sort of

    If you have tried Google’s Chrome OS, the operating system Chromebooks are based on, you quickly come to appreciate its speed.

    An operating system called Chromixium is a one-developer project that aims to bring the Chromebook look and feel to the desktop. It has now been published in a stable 1.0 version. It is based on an Ubuntu LTS release, so it is a stable operating system.

    Chromixium starts up, like a genuine Chromebook, into a view of the Chromium browser.

    Chromixium and the real Chrome OS on Chromebooks are very similar. The main difference is that Chromixium can run local programs. They can be installed from the Ubuntu application repositories, and using local programs does not require an internet connection.

    Even on an old 700-megahertz processor it runs smoothly, according to the developer.

    Source: http://etn.fi/index.php?option=com_content&view=article&id=2915:chromebookin-saa-nyt-poytakoneeseen-tavallaan&catid=13&Itemid=101

    http://chromixium.org/

    Reply
  11. Tomi Engdahl says:

    Analyst: Windows 10 will not save the PC market

    PC sales picked up last year when technical support for Windows XP ended. In late July the new Windows 10 goes into distribution, but this time the situation is different. According to analysts, Windows 10 is just another new operating system; no feature of it will by itself revive PC sales.

    Steve Kleynhans, who follows mobile devices and computers at Gartner, says that PC sales depend much more on other factors. “It will depend on exchange rate developments, the features enabled by Intel’s new Skylake processors, and new wireless capabilities,” Kleynhans says.

    “People stopped caring about the operating system a long time ago. They are more interested in the user experience a new device adds. That in turn depends on the hardware of the new device, its size and weight, as well as its elegance,” Steve Kleynhans stresses.

    Source: http://etn.fi/index.php?option=com_content&view=article&id=2916:analyytikko-windows-10-ei-pelasta-pc-markkinaa&catid=13&Itemid=101

    Reply
  12. Tomi Engdahl says:

    Chips of the world, Unite: Intel bakes Iris Pro gfx into new Xeon
    http://www.theregister.co.uk/2015/06/02/intel_bakes_iris_pro_graphics_into_new_xeon/

    Computex 2015 Intel has, for the first time, baked its Iris graphics chipper into its server-centric Xeon CPUs.

    Announced today at Computex 2015, the Xeon E3-1200v4 includes Iris Pro graphics P6300. The result is a CPU tuned for video transcoding.

    If a CPU designed for that workload sounds a bit obscure, consider the many video-on-demand operators who, thanks to device proliferation, need to prepare their moving pictures for all sorts of different situations. The new CPUs are also aimed at desktop virtualisation applications, as Intel says they enable “remote workstations”.

    The new Xeons come in five SKUs, all with 14nm architecture and four cores. Details of the five models’ speeds were hard to come by at the time of writing.

    Attendees were also treated to a tease of the sixth-generation Core CPUs, due in 2015’s second half. Skaugen showed off an all-in-one reference design just 10mm thick, with a 4K screen and the ability to lie flat on a table. There’s even built-in LTE, and Skaugen joked that Asian markets’ liking for very large phones could be satisfied by such a device.

    One more thing: Intel announced a new set of gateway devices for sensors and an alliance that sees Ubuntu’s Snappy join Windows and Wind River as the operating systems that will run on the device.

    Reply
  13. Tomi Engdahl says:

    Sebastian Anthony / Ars Technica UK:
    Thunderbolt 3 embraces USB Type-C connector, doubles bandwidth to 40Gbps
    Thunderbolt 3 integrates USB 3.1, optional 100W power delivery, 5K @ 60Hz display.
    http://arstechnica.co.uk/gadgets/2015/06/thunderbolt-3-embraces-usb-type-c-connector-doubles-bandwidth-to-40gbps/

    At Computex 2015, Intel has unveiled Thunderbolt 3. The headline feature: Thunderbolt 3 has changed connector from Mini DisplayPort to USB Type-C.

    In addition to the new connector, Thunderbolt 3 now also supports USB 3.1 (up to 10Gbps), and the Thunderbolt transport layer sees its max bandwidth doubled from 20Gbps to 40Gbps. Thunderbolt 3 also offers an optional 100W of power, in accordance with the USB Power Delivery spec. Without USB PD, Thunderbolt 3 will provide up to 15 watts.

    Thunderbolt 3 is backed by Intel’s new Alpine Ridge controller. USB 3.1 support is provided by integrating a USB 3.1 host controller into Alpine Ridge. There will be two flavours of the controller, one that uses four PCIe 3.0 lanes to drive two Thunderbolt ports, and another version that only uses two PCIe lanes connected to a single Thunderbolt port.

    With the increase in max bandwidth, Thunderbolt 3 now supports up to two 4K @ 60Hz displays or a single 5K @ 60Hz display running off a single cable. The official Intel slide deck says that Thunderbolt 3 supports DisplayPort 1.2 (not 1.3), but there’s no mention of HDMI. The Alpine Ridge leak back in April said that HDMI 2.0 is supported, however.

    The same leak also suggested that Thunderbolt 3 would be paired with Skylake, Intel’s next chip after Broadwell.

    At launch, there’ll be one passive Thunderbolt 3 cable that supports Thunderbolt, USB 3.1, and DisplayPort 1.2, but with a max bandwidth of only 20Gbps. There’ll also be an active cable that allows for up to 40Gbps, but drops DisplayPort 1.2 connectivity.

    The most exciting aspect of Thunderbolt 3 is its adoption of the USB Type-C connector. Type-C has a much smaller Z-height (about 3mm) than Mini DisplayPort (about 5mm)

    With a total bandwidth of 40Gbps, Thunderbolt 3 offers a tantalising glimpse of “one cable to rule them all.” In theory, you could use Thunderbolt almost everywhere: to power your laptop, to power and drive your 4K monitor, and to power and connect all of your external peripherals.
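
    A quick sanity check on those display claims (raw pixel-rate arithmetic only; real links add blanking and protocol overhead):

        # Uncompressed pixel bandwidth in Gbit/s (24 bits per pixel assumed).
        def display_gbps(w, h, hz, bpp=24):
            return w * h * hz * bpp / 1e9

        print(2 * display_gbps(3840, 2160, 60))   # two 4K@60: ~23.9 Gbit/s
        print(display_gbps(5120, 2880, 60))       # one 5K@60: ~21.2 Gbit/s
        # Both fit within Thunderbolt 3's 40 Gbit/s maximum.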

    Reply
  14. Tomi Engdahl says:

    Mark Hachman / PCWorld:
    AMD’s new 28nm Carrizo chips for mainstream notebooks use half the power of last-gen chips and improve performance 1.5x

    AMD’s Carrizo chip targets the one thing every laptop user wants: Longer battery life
    http://www.pcworld.com/article/2928189/amd-aims-carrizo-chip-at-making-the-most-popular-notebooks-run-longer.html

    Think of Intel’s Core chips as the giant’s massive broadsword, cleaving through the ranks of high, midrange, and low-end PCs with ease. If that’s so, then AMD’s Carrizo chip is AMD’s epee, skewering a very specific notebook segment with a pointed thrust: longer battery life.

    AMD is expected to launch Carrizo, or what the company formally calls the sixth-generation A-series chips, at an event in Computex in Taiwan Tuesday night. There, company executives will explain how they’ve painted a bullseye on the mainstream notebook, specifically the $400 to $700 segment that should represent about 38 percent of all 2015 notebook sales. Partners will include Asus, Acer, HP, Lenovo, and Toshiba, which will ship Carrizo-equipped notebooks this month.

    “We think [the notebook] is a key device for 2015,” said Jason Banta, AMD’s mainstream product line manager. “We think it is a segment to get right, and we think our competition has gotten it wrong.”

    we want to perform our everyday tasks without worrying about digging out our power cord. AMD didn’t exactly lead the pack to this point, but Carrizo could be in the right place at the right time.

    Tops on the wish list: Lowering power

    Intel accomplishes much of its power savings through steady process shrinks in its own fabs, which fabless AMD can’t rely upon. Instead, AMD leaned on its design kung fu: A Carrizo reference system uses half the power of a current “Kaveri” system, while performance has increased 1.5X—all within the same 28-nm manufacturing technology of the prior generation. That took “engineering courage,” according to Joe Macri, AMD’s chief technical officer.

    “When I first heard about this product, and staying at the 28-nm node, when Intel is at 14-nm, you think, ‘My god, they’re really falling behind,’”

    “Most people think of cleverness as how you design the material. But designing the circuit really does matter.”

    By designing the Excavator core with some of the same tools AMD uses to lay out its GPU cores, for example—something that AMD had never considered previously—AMD engineers made the Excavator layout far more efficient, saving power and space, Macri said.

    All-day movies?

    AMD executives said they expect to have about a year’s head start over Intel in adding in specialized decoder logic for movies encoded with the High Efficiency Video Codec, or HEVC. Offloading that task from the main CPU to specialized logic cuts power dramatically. With Kaveri, decoding and playing back a 1080p movie required close to 5 watts; with Carrizo, it’s just under 2 watts, Macri said.

    That’s important in two scenarios: with HEVC-encoded movies that a Carrizo laptop is streaming, or for travelers who may download a few HEVC-encoded movies to their laptop. In both, you’ll get far more playback time with a Carrizo laptop compared to a Kaveri laptop—AMD looped the public “Big Buck Bunny” movie on a 15-watt FX-8800P and eked out 9.5 hours of HD video playback.

    (It’s important to note that while Amazon uses HEVC, and Netflix is moving to HEVC for 4K-encoded movies, Google’s YouTube has chosen to use the VP9 codec instead. Carrizo won’t offer any extra benefit there.)

    AMD also claims that Carrizo’s ability to transcode movie information is several times faster than its older FX chips achieved. In addition, what AMD calls “Perfect Picture” helps improve video quality by upscaling 1080p to 4K-like resolutions.
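
    Rough arithmetic shows what that decode-power drop means (the battery capacity is an assumed figure, and the display and the rest of the system draw power too):

        # Decode-only playback estimate; the 45 Wh battery is an assumption.
        battery_wh = 45.0
        kaveri_w = 5.0     # ~5 W for 1080p playback on Kaveri (per article)
        carrizo_w = 2.0    # just under 2 W on Carrizo (per article)

        print(battery_wh / kaveri_w)    # ~9 hours if decode were the only load
        print(battery_wh / carrizo_w)   # ~22.5 hours under the same assumption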

    Reply
  15. Tomi Engdahl says:

    Peter Bright / Ars Technica:

    Microsoft bringing SSH to Windows and PowerShell
    Will contribute to OpenSSH to make it run well on Windows
    http://arstechnica.com/information-technology/2015/06/microsoft-bringing-ssh-to-windows-and-powershell/

    SSH, or secure shell, is the mainstay of remote access and administration in the Linux world, and the lack of any straightforward equivalent has always been an awkward feature of the Windows world. While there are various third-party options, Windows lacks both a native SSH client, for connecting to Linux machines, and it lacks an SSH server, to support inbound connections from Linux machines.

    The PowerShell team announced that this is going to change: Microsoft is going to work with and contribute to OpenSSH, the de facto standard SSH implementation in the Unix world, to bring its SSH client and server to Windows.

    PowerShell is in some ways an obvious group to do such work; while PowerShell is arguably stronger as a scripting language than it is an interactive shell, it’s nonetheless Microsoft’s preferred tool for command-line Windows management and administration. The ability to connect securely to a Windows machine from a Linux one to use a PowerShell shell is a logical extension of PowerShell’s capabilities.

    Even with a native SSH server, Windows still won’t be as good a platform for remote command-line management as Unix

    Reply
  16. Tomi Engdahl says:

    A Raspberry Pi would be sufficient for most computer users

    When a consumer goes to buy a PC today, he is easily sold a model with the latest Intel processor and a fast SSD, which easily costs at least several hundred euros, often more. Yet the truth is that for most needs the performance of a Raspberry Pi card costing a few tens of euros is sufficient.

    The Raspberry Pi card can run multiple Linux distributions. Datamation’s test is a good example: they tested the Debian-based Raspbian. According to the test, with Raspbian the Raspberry Pi card can do almost everything a thoroughbred laptop or desktop computer can.

    For example, the LibreOffice office suite runs almost flawlessly (documents with lots of big tables were a bit slow) – the biggest challenge comes in saving documents, as the Raspberry Pi does not have a traditional built-in hard disk.

    In Datamation’s test the Raspberry Pi ran the Firefox browser fluently. What surprised the testers most was that the card could even run YouTube videos; the VLC player played videos quite smoothly.

    This suggests that the Raspberry Pi Foundation’s claim that the second-generation card is suitable as a PC is true.

    Source: http://etn.fi/index.php?option=com_content&view=article&id=2918:raspberry-pi-riittaisi-useimmille-tietokoneeksi&catid=13&Itemid=101

    Reply
  17. Tomi Engdahl says:

    Josh Constine / TechCrunch:
    Magic Leap announces augmented reality SDK that supports Unity and Unreal game engines, no release date given but company says “soon”

    Magic Leap Announces Its Augmented Reality Developer Platform
    http://techcrunch.com/2015/06/02/magic-leap-platform/

    Magic Leap wants game makers, filmmakers, and other creators to build augmented reality experiences on its platform, and today on stage at MIT Technology Review’s EmTech Digital conference, it announced how that will happen. Magic Leap is launching a development platform. It’s just opened a Developers section of its website where people can sign up for access to its SDK, which will work with the Unreal and Unity game engines. The company tweets that the SDK will be released “soon”.

    Reply
  18. Tomi Engdahl says:

    How long until you lose the world’s smallest 128GB USB drive?
    SanDisk unveils new hard drives including Type-C portable SSDs
    http://www.theverge.com/2015/6/1/8696287/worlds-smallest-usb-128gb

    How tiny is SanDisk’s new 128GB flash drive? Well, to our eyes it simply looks like a USB connector with a plastic end-cap — there doesn’t seem to be any storage there at all. Regardless of how it looks though, SanDisk says this upgrade to their Ultra Fit series is capable of storing up to 16 hours of full HD video in a form factor “smaller than a dime,” while offering transfer speeds of up to 130MB/s.

    Reply
  19. Tomi Engdahl says:

    VirtualBox 5.0 beta four graduates to become first release candidate
    Off with its paravirtualised head, says Snoracle to desktop hypervisor
    http://www.theregister.co.uk/2015/06/03/virtualbox_50_beta_blends_into_first_release_candidate/

    Snoracle’s new version of desktop hypervisor VirtualBox has progressed sufficiently well that it’s decided the revised tool is out of beta and into release candidate status.

    VirtualBox is a developer favourite because, unlike rival desktop hypervisors from VMware and Parallels, it is free. Oracle’s kept the tool alive and well-maintained, but hadn’t lavished it with much attention until, in early April, a beta for version 5.0 emerged.

    Betas two, three and four have appeared since, and on Monday release candidate one landed.

    Reply
  20. Tomi Engdahl says:

    You’ll never love an appliance like your old database
    What stops you committing to the all-in-one hardware solution?
    http://www.theregister.co.uk/2015/06/02/database_appliances_good_as_they_sound/

    Huawei recently announced a database appliance. The Appliance for Large Database is based on the Chinese data centre usurper’s FusionServer RH8100 beast that targets RISC with a battery of Xeon Intel chips.

    The Huawei machine has been constructed for the newest version of SAP’s Business Suite, S/4HANA, which was announced earlier this year: SAP’s first in-memory-only edition of its flagship suite.

    The marriage of software and metal in the form of an appliance isn’t new, especially in databases. Rather, it’s something database suitors such as Oracle and Microsoft with their relational databases have been pushing for years, to tap the performance power of the raw hardware and the supposed simplicity of installation and configuration of the database. Finally, no need for a separate database.

    It sounds good, but do appliances really mean the end of the discrete database?

    “The big advantage is that they combine storage, networking and processing into a single box, which means that the database administrator has less to worry about,” he says. “As they are dedicated appliances, the hardware has been chosen for its suitability to database tasks, so you know the equipment should be right for the task in hand.”

    This can smooth the path for administrators. Oracle has said that a database admin can roll out its appliance in around two hours.

    “If you talk to an IT decision maker about database appliances, they might think: ‘That will replace half a rack of compute and storage that is specifically just serving my database workload’,” according to Giri Fox, director of customer technology services at Rackspace.

    It all sounds like a slam dunk for IT managers fed up with configuring their own equipment, but does this mean that the appliance will replace the discrete database? It’s unlikely for a variety of reasons, say the experts.

    Reply
  21. Tomi Engdahl says:

    NoSQL champ MongoDB plugs into SQL analytics power
    A ‘huge new realm of possibilities’, apparently
    http://www.theregister.co.uk/2015/06/03/mongodb_swallows_sql_tools/

    NoSQL database maker MongoDB is tapping the power of business intelligence and data visualisation, working on a connector for SQL-compliant data analysis tools which it promises would work with giants such as IBM Cognos Business Intelligence, plus tools from Qlik and Tableau Software.

    As a NoSQL database, MongoDB had been pitched firmly in the camp of Young Turks, who were pitted against relational and seen as the new future of data.

    That future has been recalibrated, however, as actual customers remained wedded to relational and SQL, meaning NoSQL databases had to work with them.

    MongoDB’s SQL bridge has been developed with Tableau Software, working with joint customers on features and performance. Tableau is a data analysis and presentation firm founded in 2003, which in 2013 cashed in on the industry’s fascination with data by going public with an IPO.

    A connector to SQL analysis and BI tools is a big deal for MongoDB. Until now, if you’ve used SQL-based tools and wanted to combine these with MongoDB, you had to move data held in MongoDB into a relational database first.

    Reply
  22. Tomi Engdahl says:

    HP CloudSystem 9.0 includes Helion platform for private clouds
    CloudSystem 9.0 has OpenStack and Eucalyptus integrated
    http://www.theinquirer.net/inquirer/news/2411415/hp-cloudsystem-90-includes-helion-platform-for-private-clouds

    HP IS TO UPDATE its CloudSystem cloud-in-a-box solution by including the full Helion OpenStack and Helion Development Platform to deliver a comprehensive private cloud for customers that can bridge the legacy and cloud native worlds.

    It also now includes the Eucalyptus stack, which enables workloads from AWS to run on CloudSystem, HP said.

    Due for general availability in September, CloudSystem 9.0 is the latest incarnation of HP’s ready-made private cloud platform for enterprise customers or service providers, which can be delivered as just software or included as a complete package with HP infrastructure hardware.

    Reply
  23. Tomi Engdahl says:

    Pocket comes to Firefox!
    https://www.mozilla.org/en-US/firefox/38.0.5/whatsnew/?oldversion=38.0.1

    The world’s most popular save-for-later service is now available in Firefox. Sign in with your Firefox Account and you can save articles, videos and more to enjoy anytime, anywhere.

    Reply
  24. Tomi Engdahl says:

    Microsoft: OpenSSH coming to PowerShell for interoperability between Linux and Windows
    Not the Power Shell from Mario Kart
    http://www.theinquirer.net/inquirer/news/2411425/microsoft-openssh-coming-to-powershell-for-interoperability-between-linux-and-windows

    MICROSOFT HAS ANNOUNCED that OpenSSH, the implementation of the Secure Shell protocol at the heart of remote administration on Linux-based systems, is to get support in its products.

    The move is the latest in a long string of acts of openness as Microsoft steers towards taking its place in a multi-platform world, rather than attempting to recreate the domination that has slipped through its fingers as the landscape has evolved.

    Microsoft has been working to integrate Linux into products like Azure for some time, and it’s getting to the point where it would be pretty idiotic to hold out any further.

    Angel Calvo, group software engineering manager for the PowerShell team, said: “A popular request the PowerShell team has received is to use Secure Shell protocol and Shell session (aka SSH) to interoperate between Windows and Linux – both Linux connecting to and managing Windows via SSH and, vice versa, Windows connecting to and managing Linux via SSH.

    “Thus, the combination of PowerShell and SSH will deliver a robust and secure solution to automate and remotely manage Linux and Windows systems.”
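
    Until that support lands in PowerShell, the manage-Linux-over-SSH half of this pattern is already routine from code. A minimal sketch in TypeScript using the third-party ssh2 package for Node.js; the host name, user and key path are placeholders:

      import { readFileSync } from "fs";
      import { Client } from "ssh2"; // npm install ssh2

      const conn = new Client();

      conn.on("ready", () => {
        // Run one remote command, stream its output back, then disconnect.
        conn.exec("uptime", (err, stream) => {
          if (err) throw err;
          stream
            .on("data", (chunk: Buffer) => process.stdout.write(chunk))
            .on("close", () => conn.end());
        });
      }).connect({
        host: "linux.example.com",                           // placeholder host
        port: 22,
        username: "admin",                                   // placeholder user
        privateKey: readFileSync("/home/admin/.ssh/id_rsa"), // placeholder key
      });

    The promised PowerShell support would make the reverse direction, managing Windows via SSH, just as routine.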

  25. Tomi Engdahl says:

    Second-hand IT alliance forms to combat ‘bully’ vendors
    Free ICT Europe says: vive la resistance!
    http://www.theregister.co.uk/2015/06/03/secondhand_alliance_formed_to_combat_bully_vendors/

    Second-hand IT providers have formed an alliance to combat what they claim are increasingly aggressive tactics by big vendors desperate to claw back falling revenue.

    Tomas O’Leary, secretary of Free ICT Europe and chief executive of Origina, an independent provider of IBM software maintenance, claimed big vendors have stepped up efforts to squeeze out second-hand IT providers.

    The not-for-profit foundation’s principal remit is to address what it deems to be unfair practices by original equipment manufacturers.

    The most common tactics are to perform software audits on companies and to crack down on second-hand repairs as a means of maximising sales, said O’Leary. By doing so they intend to restrict the number of suppliers not directly part of their own ecosystem, he added.

    “It’s purely economic, they are carrying out a land-grab,”

    Vendors do appear to have been tightening their grip on software and maintenance over the last few years.

    In 2013 Hewlett-Packard said it would restrict who is allowed to fix its ProLiant servers. The move means customers will need to be on a full-blown support contract to receive a service agreement ID.

    Cisco also launched an EMEA-wide crackdown in the channel. However, Cisco recently announced plans to move deeper into the second-hand market itself, with the launch of its Cisco Refresh programme in May to “address the growing secondary market”.

  26. Tomi Engdahl says:

    The quest to save today’s gaming history from being lost forever
    Changes in digital distribution, rights management increasingly make preservation tough.
    http://arstechnica.com/gaming/2015/06/the-quest-to-save-todays-gaming-history-from-being-lost-forever/

    “The very nature of digital [history] is that it’s both inherently easy to save and inherently easy to utterly destroy forever.”

    Jason Scott knows what he’s talking about when it comes to the preservation of digital software. At the Internet Archive, he’s collected thousands of classic games, pieces of software, and bits of digital ephemera. His sole goal is making those things widely available through the magic of browser-based emulation.

    Compared to other types of archaeology, this kind of preservation is still relatively easy for now. While the magnetic and optical disks and ROM cartridges that hold classic games and software will eventually be rendered unusable by time, it’s currently pretty simple to copy their digital bits to a form that can be preserved and emulated well into the future.

    But paradoxically, an Atari 2600 cartridge that’s nearly 40 years old is much easier to preserve at this point than many games released in the last decade. Thanks to changes in the way games are being distributed, protected, and played in the Internet era, large parts of what will become tomorrow’s video game history could be lost forever. If we’re not careful, that is.

    Throwing away the layers

    “I totally get that people look at this and say all of this game history stuff is navel-gazing bullshit… an irrelevant, wasteful, trivial topic,” Scott told Ars. “[But] mankind is poorer when you don’t know your history, all of your history, and the culture is poorer for it.”

    And in today’s game industry, being “constantly in the now” often means throwing out masses of current history without a thought. “I use FarmVille often as my go-to example of this because, like it or not, that game is historically significant and will be studied,” gaming historian and Lost Levels creator Frank Cifaldi told Ars. “Keeping an offline game safe is pretty easy, but what do you do for FarmVille, a game that is constantly updated, to the point where Zynga manipulates it server-side?”

    These days, it’s not just Facebook games that are having their internal history slowly peeled away.

    “An analogy here is maybe to a piece of architecture,” Dyson continued. “When you’re seeking to preserve a historic house, there may be layers, it may have been lived in by many different people. Mount Vernon had been lived in by George Washington’s descendants, so they made a decision to restore it to George Washington’s time and erase this later history. Do you make the same kind of decision with games?”

    The death of “accidental ambient archiving”

    Such historical restoration may not be easy with many modern titles. When updates are automatically pushed out and applied over the Internet every time you log on to a console or PC to play, those historical layers are erased en masse without a thought. Where patches may have once gone out via FTP sites, where they could be archived and studied, now the process is hidden. That’s more convenient for the players, but it’s the equivalent of a constant digital purge from a historian’s point of view.

    “For that convenience [of automatic updating], we lose a lot of what you might call accidental ambient archiving,”

  27. Tomi Engdahl says:

    Quentin Hardy / New York Times:
    Meg Whitman confirms HP split for Nov. 1; HP Enterprise gets new logo but future vision remains vague — Whitman Paints a Vague Picture of Hewlett Packard Enterprise

    Whitman Paints a Vague Picture of Hewlett Packard Enterprise
    http://bits.blogs.nytimes.com/2015/06/03/whitman-paints-a-vague-picture-of-hp-enterprise/

    LAS VEGAS — Meg Whitman, chairwoman and chief executive of Hewlett-Packard, explained her dream Tuesday for Hewlett Packard Enterprise, the new company she’s creating after HP is split in half. She also indicated her dilemma.

    Ms. Whitman told an annual meeting of big-business customers that Hewlett Packard Enterprise, which sells servers, storage, software and networking products, as of Nov. 1 will be remade for the growing business of cloud computing, mobility and big data.

    “We’re living in an idea economy,” she said, and it has never been easier for someone to turn a concept into reality.

    Hewlett Packard Enterprise will be the means by which older companies will move into the new world.

    For Hewlett Packard Enterprise, “Our services are the tip of the spear, where we begin the journey together” with customers, Ms. Whitman said.

    Much of the company’s current services business consists of doing things like running call centers cheaply, not leading huge changes for corporate customers.

    HP Inc., where personal computers and printing will go, was little discussed.

    Right now, the company’s most basic function is delivering computer hardware.

  28. Tomi Engdahl says:

    Intel, Altera: Math in Question
    Co-packaged x86, FPGAs ship late 2016
    http://www.eetimes.com/document.asp?doc_id=1326741&

    The math on Intel’s $16.7 billion bid to buy Altera doesn’t add up, although the merged companies could see gains, analysts said. The deal raises more questions than the others in a string of recent mega-mergers for a semiconductor industry that is consolidating as it matures.

    Analysts questioned Intel’s claim that the merger could lift it to seven percent revenue growth. The x86 giant could give Altera the clout to take share from its traditional FPGA rival, Xilinx. However, Intel’s poor track record in mergers made some skeptical about that prospect.

    In a conference call, Intel chief executive Brian Krzanich said the combined companies will ship integrated products starting in late 2016 for servers and some still-undetermined embedded systems. Initial products will pack x86 and FPGA die in a single package, followed “shortly” by products that merge both on SoCs.

    Intel targets a bigger opportunity than current estimates of a $1 billion annual market for stand-alone FPGAs as server co-processors, said Krzanich.

    With such chips, Web giants such as Google or Facebook could rapidly switch the algorithms, such as facial search or encryption, that they run on the same servers, Krzanich suggested. “They want to do multiple workloads [and] the workload will change over time, [so an integrated chip] is ideal for this,” he said.

    “I don’t see that as a fundamental strategy for the server business,” said Linley Gwennap, principal of market watcher The Linley Group (Mountain View, Calif.) and a veteran Intel analyst. “We’ve seen some experimentation using FPGAs in the data center and there have been some promising results, but we haven’t seen much deployment,” Gwennap said.

  29. Tomi Engdahl says:

    JEDEC Announces Support for Hybrid NVDIMM Modules
    http://www.eetimes.com/document.asp?doc_id=1326739&

    JEDEC Solid State Technology Association has approved the first standards for support of hybrid DDR4 memory modules.

    The standards work is being done by JEDEC’s JC-45 Committee for Memory Modules, which developed the non-volatile DIMM (NVDIMM) taxonomy in collaboration with Storage Network Industry Association’s NVDIMM Special Interest Group (SIG), a sub-committee of SNIA’s Solid State Storage Initiative.

    The new standard defines hybrid DDR4 memory modules as those that plug into standard DIMM sockets and appear like a DDR4 SDRAM to the system controller, yet contain non-volatile memories such as NAND flash on the module. These hybrid module families may share the memory channel with other standard DDR4 DIMMs. Publication of the standard is expected later this year.

    The JEDEC standards cover two versions of hybrid modules: the NVDIMM-N, which combines DRAM and NAND flash where the flash provides backup and restore of all DRAM for reliable data persistence in the event of a power failure; and the NVDIMM-F, which provides directly addressable NAND flash that is accessed as a block oriented mass storage device.

    Although the standards work for NVDIMM is still in its early days, there are already products available.

  30. Tomi Engdahl says:

    How CIOs can reduce shadow IT in government
    http://www.cio.com/article/2929782/it-management/how-cios-can-reduce-shadow-it-in-government.html

    A new study highlights the rise of shadow IT and unauthorized applications in government agencies, arguing for greater involvement with the business lines of the enterprise and a better understanding of users’ needs.

    If government CIOs want to bring IT out of the shadows, they need to start by understanding what kind of tools agency personnel need to do their jobs.

    That’s one of the chief takeaways from a new study looking at shadow IT in the government — those unauthorized applications and services that employees use without the permission of the CIO and the tech team.

    The new analysis, conducted by cloud security vendor Skyhigh Networks, identifies a startling number of applications in use in public-sector organizations. According to an analysis of log data tracking the activities of some 200,000 government workers in the United States and Canada, the average agency uses 742 cloud services, on the order of 10 to 20 times more than the IT department manages.

    “The first thing I would say is yes, it’s alarming, but it’s not unique. Some of these issues are what we see in the commercial sector, as well,”

    So the use of unauthorized applications, though a potentially severe security risk, often results simply from employees trying to do their work more efficiently, Gupta says, urging CIOs to connect with the business units of their enterprise to get a better sense of where the needs lie.

    By category, collaboration tools like Office 365 or Gmail are the most commonly used cloud applications, according to Skyhigh’s analysis, with the average organization running 120 such services. Cloud-based software development services such as GitHub and SourceForge are a distant second, followed by content-sharing services. The average government employee uses 16.8 cloud services, according to the report.

    Lack of awareness creates Shadow IT problem

    One of the challenges is that not all storage or collaboration services are created equal. Without guidance from the CIO, users might opt for an application that has comparatively lax security controls, that claims ownership of users’ data, or that is hosted in a country the government has placed trade sanctions on.

    “The problem is our employees are not aware of that and they just use the service that seems most appropriate,” Gupta says.

    “The mindset shift has to move from shadow IT being a real threat and a problem to shadow IT giving me insight,” Gupta says. “Rather than become the department of no, how do I become the department of yes?”

  31. Tomi Engdahl says:

    This Windows 10 PC is the size of a phone charger, and 100% real
    http://www.techradar.com/us/news/computing/pc/this-windows-10-pc-is-the-size-of-a-phone-charger-and-100-real-1295701

    Microsoft has showcased a new form factor for Windows 10 devices, one that can actually be squeezed into a plug. The Quanta Compute Plug was demonstrated by Microsoft’s Nick Parker during his Computex keynote address in Taiwan.

    The mini-PC comes with two USB 3.0 ports and an HDMI port, allowing you to turn your TV into a smart computer. Users can control their TV using Cortana via a Bluetooth remote or headset.

    Parker didn’t mention which version of Windows it will run, but it is safe to assume that it will be full Windows 10 rather than Windows 10 for IoT.

    The Windows ecosystem is following what Android and Linux had done before. Intel recently embraced the stick form factor with the Intel Compute Stick, something that Android users first discovered about four years ago thanks to HDMI dongles.

    As for the plug form factor, Marvell introduced the Linux-powered SheevaPlug back in 2009, which used an ARM system-on-chip.

    Quanta doesn’t sell computers but is one of the biggest so-called original design manufacturers (or ODM) worldwide. In other words, they build stuff for others (think HP, Dell, Apple) to sell.

  32. Tomi Engdahl says:

    Piston Goes To Cisco, Blue Box to IBM As OpenStack Consolidation Accelerates
    http://techcrunch.com/2015/06/03/piston-goes-to-cisco-bluebox-to-ibm-as-openstack-consolidation-accelerates/

    Cisco announced this morning it has purchased private cloud — and OpenStack — specialist Piston Cloud Computing. The acquisition comes on the heels of its Metacloud purchase last fall, and marks the latest OpenStack startup to get scooped up as the market consolidation continues.

    Meanwhile IBM grabbed Blue Box, another OpenStack player offering private-cloud services, as the big companies’ fight for OpenStack dominance continues unabated.

    OpenStack is clearly maturing and part of that process involves the startup dominoes starting to fall fast and furiously as big companies like Oracle, IBM, HP, EMC and Cisco begin to see the value of these companies as they attempt to capture a piece of the growing OpenStack market. When assessing the build versus buy equation and the difficulty finding OpenStack engineering talent, buying often makes the most sense.

    It gives companies like Cisco and IBM the talent they desire and some nice intellectual property to go with it.

  33. Tomi Engdahl says:

    Puppet Labs:
    IT Automation Company, Puppet Labs, Turns 10! — Puppet Labs founder and CEO Luke Kanies talks about the company’s growth and the increasing importance of IT as a competitive advantage.

    Happy 10th Birthday, Puppet Labs!
    https://puppetlabs.com/blog/happy-10th-birthday-puppet-labs?ls=content-syndication&ccn=Techmeme-20150520&cid=701G0000000F68e

    Growing an Automation Company, Growing a Market

    People always ask me, “Has the company played out the way you planned?” or, “Have you grown as quickly as you thought you would, in the way you expected?” etc. Hah! That would imply I had a certain level of planning when I started. I did have three related goals, but really no idea how I was going to accomplish them:

    Build software that makes its users better at their jobs, and more fulfilled doing them. Specifically, build automation software that would enable sysadmins to automate the menial, repetitive parts of their work, and reduce inconsistencies and outages, so they could focus on long-term, larger and more interesting work.
    Enable companies to move faster without sacrificing stability or security, turning technology into a competitive advantage.
    Drive adoption of the best technology throughout the industry by reducing the friction of switching to newer and better technology.

    Puppet Labs started as an open source software company because, really, I didn’t know anything else.

    Then our next challenge was, how do we build upon great open source software and a healthy community to create a software company that survives on the value its customers find in its software — while still maintaining the open source software, supporting its users and at the same time, advancing the commercial product? How do we scale the company, the community, and the product?

    We have found that Puppet’s balance of simplicity and power has scaled really well.

    As we grow in scope, we’ll always seek to maintain that balance, and we’ll always eschew creating fast but insufficiently powerful tools, or crazy-awesome but confusing tools. We’ve made some missteps, and our customers have made these clear.

    Now the challenge in front of us is, how do we move from just 15 percent of the market using automation (yes, it’s true) to 50 percent? That’s the big hill in front of us today. It’s not about taking share in the automation market, but about changing the expectations and habits of the market so that automation itself becomes a default — similar to how VMware has taken the market from essentially 100 percent physical machines to just about entirely virtualized over the course of its growth.

    What’s Changed in the IT Industry over the Past 10 Years

    One of the most important trends we’ve seen is the move away from outsourcing IT, and toward companies trying to turn IT into a competitive advantage. In the early 2000s, many companies thought, “Hey, I’m not a technology company — I’m a media/retail/finance/etc. company,” and partnered with someone to make the tech problem go away.

    By now they’ve all found that, actually, they are a technology company, and technology has to be a core competency. Now they’re bringing it back in house, all while trying to build up a brand new skillset.

    We’ve also seen how important it is to focus on getting software into production as quickly as possible. Companies have been relentlessly finding ways to shorten the time between a developer’s keyboard and production.

    Speaking of virtualization, the last 10 years have seen it move from merely interesting to the de facto substrate on which much innovation and change is done. To name just two big trends, both AWS and Docker are entirely enabled by the pervasive adoption of virtualization, and the rebuilding of everyone’s workflows around that.

  34. Tomi Engdahl says:

    Emil Protalinski / VentureBeat:
    Chrome beta now automatically pauses less important Flash content to boost performance and battery life — Google today detailed a very interesting initiative in partnership with Adobe: The two have been working to make Flash content more power-efficient in Chrome.

    Chrome beta now automatically pauses less important Flash content to boost performance and battery life
    http://venturebeat.com/2015/06/04/chrome-beta-now-automatically-pauses-less-important-flash-content-to-reduce-power-consumption/

    Here’s how the feature works. Chrome beta will automatically pause Flash content that isn’t “central to the webpage” while keeping central content playing without interruption. The company offers an obvious example: Animations on the side will be paused while the video you’re trying to watch will be unaffected.

    That said, Google expects accidents to occur. As such, if Chrome beta pauses something you were interested in, you can resume playback by just clicking on it.

  35. Tomi Engdahl says:

    Aloysius Low / CNET:
    Microdia unveils 512GB microSD card, will cost around $1,000, available in July — Microdia crams 512GB into a microSD card, out in July — Meant for professional photographers with an unquenchable desire for storage space, this microSD card will cost upwards of $1,000.

    Microdia crams 512GB into a microSD card, out in July
    http://www.cnet.com/news/microdia-will-sell-a-1000-ish-512gb-microsd-come-july/

  36. Tomi Engdahl says:

    JavaScript You Need to Know For a Job
    http://insights.dice.com/2015/06/04/javascript-you-need-to-know-for-a-job/

    JavaScript is a programming language that’s easy to pick up, but extremely difficult to master. Even some of its beginner-level functions are decidedly not beginner-friendly. When you land your first JavaScript job, you’re going to want to know as much as possible, if only so you can navigate through some of the language’s trickier aspects without needing to ask for help.

    The Beginner’s List

    Know the different ways to create objects, such as using the “new” keyword, as well as just declaring an object (such as ‘x = {a:1, b:2};’).

    Know what a prototype is, what the “this” variable does, and how to use both.

    Know the difference between a list and an object (and how a list is technically both, and can be used as both).

    Know that functions are objects that can be passed as parameters into other functions and returned from other functions.

    Know what closures are and how to use them. This might seem like an advanced topic, but when working with functions returning functions, it’s easy to introduce bugs if you’re not careful.

    Know how to use functions such as the list’s map and filter functions. With this in mind, I encourage you to read this specification and learn the methods available on all types of objects.

    Understand the built-in objects (they’re constructors!) and how to use them, including Function and Array (with capital F and A).

    Know your way around the developer command line and debugger. All the major browsers provide these now.

    There’s always more to learn. Want to take it even further? Learn about ES5 and the newest features of JavaScript that might not be present in all browsers. Learn the different frameworks, such as Backbone, Ember, Angular, and Knockout. The more you know, the more likely you’ll land that job.
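
    To make a few of the points above concrete, here is a minimal sketch in TypeScript-flavoured JavaScript covering object creation, prototypes and “this”, closures, and map/filter; the names (Counter, makeAdder) are invented purely for illustration:

      // Two ways to create objects: a literal, and a constructor function with 'new'.
      const point = { x: 1, y: 2 };              // object literal, as in 'x = {a:1, b:2};'

      function Counter(this: any) {              // constructor function, capital C by convention
        this.n = 0;
      }
      // Methods on the prototype are shared by every instance; 'this' is bound
      // to whichever instance the method is called on.
      Counter.prototype.increment = function (this: any): number {
        return ++this.n;
      };
      const c = new (Counter as any)();          // cast only to satisfy TypeScript's checker
      console.log(c.increment());                // 1

      // A closure: the returned function keeps access to 'delta' after makeAdder returns.
      function makeAdder(delta: number) {
        return (x: number) => x + delta;
      }
      const addFive = makeAdder(5);

      // Arrays are objects too, and map/filter take functions as ordinary values.
      console.log([1, 2, 3, 4].filter(n => n % 2 === 0).map(addFive)); // [ 7, 9 ]

    Prototypes are also what newer class syntax compiles down to, so understanding this layer pays off even in more modern codebases.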

  37. Tomi Engdahl says:

    Open Source Haxe/OpenFL Platform Will Support Home Game Consoles
    http://developers.slashdot.org/story/15/06/04/1725230/open-source-haxeopenfl-platform-will-support-home-game-consoles

    At last week’s World Wide Haxe conference, a coalition of game developers announced that the open source platform Haxe/OpenFL is coming soon to home game consoles. The first three games that will ship using the technology are Yummy Circus, Defender’s Quest (HD edition), and the award-winning Papers, Please. Haxe is a programming language that compiles to other programming languages (everything from C++ to JavaScript to Python); it has been around for about 10 years and is quite powerful. OpenFL is a hardware-accelerated cross-platform reimplementation of the Flash API, built on top of Haxe.

    OpenFL for Home Game Consoles
    http://www.fortressofdoors.com/openfl-for-home-game-consoles/

  38. Tomi Engdahl says:

    Open source? HP Enterprise will be all-in, post split, says CTO
    It’s in ‘the fabric of everything we do,’ exec says
    http://www.channelregister.co.uk/2015/06/04/hp_enterprise_loves_open_source/

    HP Discover What’s the latest enterprise IT company to proclaim its love of open source? HP, that’s who – or, more specifically, Hewlett Packard Enterprise, one of two companies that will emerge once HP splits this November.

    “We have taken this very, very seriously and we are all-in on the notion of open source,” Fink said, adding that even game-changing big bets like the Machine will be backed by open source software.

    Not that using open source is new for HP, he observed. Behind the scenes, the Palo Alto firm has been a major contributor to a number of open source projects, particularly those that are relevant to its core mission of helping customers build and grow their IT infrastructure.

    “We are the Number One contributor to the OpenStack project,” Fink said. “We contribute large bodies of code to the Cloud Foundry project. We are heavily involved with partners who lead open source projects – like Hortonworks, for example. We are contributing heavily to making the cloud open source and making that real for you.”

    To prove it, on Wednesday HP announced Grommet, a new user interface framework that’s specifically tailored for enterprise applications and that HP has released under the Apache License.

    “I want to stress something here: It is not called HP Grommet. It is called Grommet,” Fink said. “It is HP’s contribution to the IT industry to bring consumer-grade capabilities with an enterprise user experience framework so that all of you can take advantage of it.”

    The genesis of Grommet, Fink said, was when customers would complain that various HP products wouldn’t integrate well with each other, because their UIs were so radically different.

    All Hewlett Packard Enterprise software will ship with UIs based on Grommet from now on, Fink said.

    http://grommet.io/docs/
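
    For a flavour of what a framework like this looks like in use, here is a hypothetical sketch. Grommet is React-based, and the component names below (Grommet, Box, Heading, Button) are assumptions borrowed from a later release of the library, so the API at the time of this announcement may well have differed:

      import React from "react";
      // Component names here are assumptions taken from a later Grommet release.
      import { Grommet, Box, Heading, Button } from "grommet";

      // A minimal "enterprise" page: a themed shell, a padded layout box,
      // a heading and one primary action, all built from stock components.
      export const StatusPage = () => (
        <Grommet plain>
          <Box pad="medium" gap="small">
            <Heading level={2}>Cluster status</Heading>
            <Button primary label="Refresh" onClick={() => window.location.reload()} />
          </Box>
        </Grommet>
      );

    The design goal described above, consistent UIs across products, comes from composing pages out of a shared set of themed components like these rather than hand-rolling each product’s interface.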

  39. Tomi Engdahl says:

    Yahoo Killing Maps, Pipes & More
    http://tech.slashdot.org/story/15/06/04/2229247/yahoo-killing-maps-pipes-more

    Yahoo is shutting down its mapping service and Pipes, and reducing the availability of Yahoo TV and Yahoo Music. The company has decided instead to focus on three major parts of its business: search, communications, and digital content.

    Comment:

    To watch Yahoo slowly die because they got outclassed by all the upstarts in the market.

    Really, they fell prey to the PHB effect before their competitors did. MBAs took over too fast at Yahoo after the founders took their money and ran…

    Yahoo does Spring Cleaning: Shuts down Maps, Pipes & more
    http://www.networkworld.com/article/2931719/collaboration-social/yahoo-does-spring-cleaning-shuts-down-maps-pipes-more.html

    In case you were wondering what it is exactly that Yahoo does these days, the company says its focus is on “search, communications and digital content.” The rest must go, and as such, Yahoo today has announced some things it is getting rid of.

    For starters, the company is doing away with maps.yahoo.com (a.k.a. Yahoo Maps) at the end of June. Though maps will live on within Yahoo search and Flickr in some fashion.

    Assorted Yahoo media properties, including Yahoo Music in France and Canada, will be axed as the company streamlines its market-specific media properties.

  40. Tomi Engdahl says:

    Why businesses are turning to managed IT services
    http://www.cio.com/article/2930498/it-strategy/why-businesses-are-turning-to-managed-it-services.html

    More organizations are turning over certain IT functions to managed service providers, freeing internal IT staff to focus on strategic IT projects.

    Organizations are increasingly turning to managed service providers (MSPs) to handle elements of their IT needs as part of a collaborative arrangement with the internal IT department, according to new research from IT industry trade association CompTIA.

    MSPs have been around for a long time, but adoption has been relatively low. As late as last year CompTIA found that only 3-in-10 organizations had any of their IT in the hands of an MSP, says Carolyn April, senior director, Industry Analysis, at CompTIA. But more than two-thirds of companies surveyed for CompTIA’s Fourth Annual Trends in Managed Services Study, released Monday, say they have used the services of an outside IT firm within the past 12 months.

    Companies have become more familiar with managed services and are turning to them for management of certain IT functions, particularly email hosting, customer relationship management (CRM) applications, storage, backup and recovery and network monitoring.

    “While one-time projects account for some of these engagements, a significant portion is ongoing management of one or more IT functions by a managed services provider,” says April, who is also author of the report. “There is a much higher degree of familiarity with the term ‘managed services’ and greater adoption.”

    It is also important to note that while companies are increasingly relying on outside providers for part of their IT needs, MSPs generally complement rather than replace internal IT.

    “Very few of these companies get rid of their IT staffs just because they join up with an MSP,” April says.

    Instead, especially in larger companies, bringing an MSP into the mix frees up existing IT staff to focus on more strategic projects.

    “It elevates the IT staff and brings them out of the shadows within the organization,” she says. “It allows them to focus on a custom app dev project or cloud initiative — something highly strategic. I think that’s a win-win for your IT staff.”

    The trade association found that six-in-10 respondents that consider their technology usage advanced are using an MSP for physical security services. Also, 63 percent of the same group are using an MSP for application monitoring.

  41. Tomi Engdahl says:

    HP to buy EMC? We think so, say Wall St money men
    Thigh-rubbing financial analysts see ‘so many reasons this makes sense’
    http://www.theregister.co.uk/2015/06/07/hp_emc_tie_up_makes_sense_say_bean_counters/

    “HP to buy EMC? We think so,” said Brian Alexander, director of technology research at investment services biz Raymond James.

    He said a buy would beef up HP’s cloud portfolio with VMware and Virtustream services, while EMC and Pivotal would boost the converged infrastructure and analytics side of the portfolio. Mobility goodness for HP would come from VMware AirWatch.

    “While management’s messaging around the size of M&A in HP Enterprise continues to refer to Aruba as a benchmark, CEO Meg Whitman explained that from an academic perspective, technology hardware is an industry that should consolidate due to declining revenues and slowing growth rates,” he added.

    The storage desk at El Reg has ruminated and cogitated about a coming together between the two rivals, and concluded from a portfolio perspective – 3Par aside – that it makes sense. Analysts agree.

  42. Tomi Engdahl says:

    Alienware Steam Machine
    http://www.alienware.com/landings/steammachine/

    The Alienware Steam Machine takes console gaming to the next level with a massive library of over 1,000 local and online games, all playable in full 1080p HD.

    You’ll experience powerful and immersive gaming with high performance NVIDIA® GeForce® GTX GPU 2GB GDDR5 graphics and Intel® Core™ processors. And with the innovative Steam Controller, you’ll be able to move like never before.

  43. Tomi Engdahl says:

    Why Apple and Google Made Their Own Programming Languages
    http://apple.slashdot.org/story/15/06/07/1532219/why-apple-and-google-made-their-own-programming-languages

    This Business Insider article looks into the state of Google Go and Apple Swift, highlighting what the two languages have in common — and why tech companies would bother involving themselves in the programming language holy wars. From the article: “One fringe benefit for Google and Apple is that making your own programming language makes recruitment easier”

    Why Google and Apple made their own programming languages
    http://www.businessinsider.com/wwdc-apple-swift-google-go-2015-6

    Almost exactly one year ago, Apple released Swift, a new programming language that promises to make it easier than ever to build iPhone and Mac apps.

    The vision behind Swift is right in the name. Apple promises Swift is faster to write, and results in faster applications.

    That vision resonated, and Swift frequently tops developer surveys as the language developers are most interested in.

    But Apple isn’t the first big tech company to release its own programming language — not by a long shot.

    Back in 1991, Microsoft released Visual Basic as a way to streamline development in Basic, one of the earliest popular programming languages, by pairing it with a graphical user interface (GUI).

    More recently, circa 2009, Google released Go, its own programming language.

    Programming languages are a matter of religion for many developers. For Apple, Google, and even Microsoft way back when, releasing a new programming language means an uphill battle against the habits and routines of developers, who are set in their ways.

    The two languages have a lot in common. Both Google’s Go and Apple’s Swift were designed to fix the problems that developers were having with older programming languages.

    Swift is enough like Objective-C that developers picked it up quickly, and Objective-C and Swift code can be used side by side in the same app.

    Similarly, Go was designed to make it easier to build complex systems. The old workhorses of the networked software programming industry, C++ and Python, just couldn’t keep up with the demands that Google was placing on them.

    “Go was born out of frustration with existing languages and environments for systems programming,”

    As an open source project, developers all over the world are contributing to Go and making it better.

    One fringe benefit for Google and Apple is that making your own programming language makes recruitment easier — for instance, since it builds a lot of its own server applications in Go, Google is more likely to hire a developer who’s already proficient in the language, as she would need less training.

    Plus, any cool problems that developers in the wild solve with languages like Swift or Go, Apple and Google can adopt for their own businesses.

    Go hasn’t seen the explosive growth of Swift — not least because it targets more complex, so-called “system level” programming.

    The Go project has described itself as “designed by Google to help solve Google’s problems, and Google has big problems.” You can use Go for a mobile app if you really want, but it’s not what the language is best at.

    Red-hot container startup Docker uses Go for some of its internal projects, as does growing cloud hosting service DigitalOcean. If Swift is for making iPhone apps, Go is for the servers that hold that app’s data.

    The final word here is that Google and Apple are both pushing programming languages because new advances in technology are being underserved by existing languages and techniques.

  44. Tomi Engdahl says:

    On Managing Developers
    http://developers.slashdot.org/story/15/06/07/046233/on-managing-developers

    A columnist at TechCrunch takes a crack at advice on how to manage developers. He has some decent starting points. For example, “Basically a manager’s job is to make other people more productive. What’s one really good way to do that? Do the work that is getting in their way. Which means: find out what kind of important work your developers dislike the most, and do it for them.” Also: “[D]on’t bull$%^& anyone, ever. … Speak the truth as you see it. Speak it diplomatically, don’t get me wrong; but be trustworthy.”

    On Managing Developers
    http://techcrunch.com/2015/06/06/on-managing-developers/

    But I can assure you I am someone who screwed up a lot along the road to being better. Here are some mistakes from which I have learned:

    Just Because You’re In Charge Doesn’t Mean You’re In Control

    The great irony of management is that the higher up you go, the less actual control you have. When you are but a humble coder, you make the computer do exactly what you want; when you’re a manager, you only hope that people understand what you want, and then trust/pray that they do it both correctly and in a timely manner.

    Agile Is Good. What Is Agile?

    People talk about process a lot. Too much, if you ask me. In many places, ‘agile development’ has become codified into a fixed, carefully specified process of “standups” and “scrums” and “sprints,” which is darkly ironic given that the key principles of the Agile Manifesto include “value individuals and interactions over processes and tools” and “value responding to change over following a plan.” So what did companies create and fervently follow? Agile processes, tools, and plans. Sigh. If you are a Certified Scrum Master, you are doing it wrong.

    The Shit Work Is Your Work

    Basically a manager’s job is to make other people more productive. What’s one really good way to do that? Do the work that is getting in their way. Which means: find out what kind of important work your developers dislike the most, and do it for them.

    Few developers like to write documentation. That means it’s your job. Testing is really important, but few developers like to write tests. That means it’s your job.

    Think QA is someone else’s job, or not part of your job description? Then you’re a really really really bad manager.

    The summary of the summary of the summary is: people are a problem

    Unfortunately, as you probably already know, people suck. People flake, screw up, ignore you, pester you, disappear, quit, and/or lose faith in you, frequently for no good reason. Guess what? Now that you’re a manager, their screwups are your screwups, and their problems are your problems. Get used to it. You need to develop an early-warning sense of people problems and have contingency plans in place.

    Also: people work in different ways and at different paces. Which is a polite way to say that hiring well is really important, because, again, most people suck.

    The most surprising lesson of my professional career: once people learn that they can trust you, you can usually trust them back. No, really. I swear.

    You’re basically a mildly glorified technical translator

    Managers translate the project’s Big Picture into individual tasks for developers, with their details and interactions specified in minute detail; translate the work done by developers so that clients/executives/users understand it; and, perhaps most importantly, translate the details of errors, roadblocks, and opportunities to client/executive-speak as well. As any professional translator will tell you, this means you need to understand the work more intimately than the author. To them it’s intuitive, but it’s your job to break it down so it makes sense to anyone.

  45. Tomi Engdahl says:

    Millennials bring consumer shopping tactics to corporate buying
    http://www.cio.com/article/2931767/it-industry/millennials-bring-consumer-shopping-tactics-to-corporate-buying.html

    An IBM study of more than 700 people who make corporate buying decisions, including Gen-Xers, Baby Boomers and millennials, sheds light on notable differences in their motivations to make corporate purchases and offers valuable lessons for B2B marketers.

    Millennials take a different approach than their Gen-X and Baby Boomer counterparts to many things, including social media use, work habits and communication. They’re also shaking things up in the marketing world, according to a recent report from The Boston Consulting Group.

    More than half (51 percent) of the 800 millennial respondents base their personal buying decisions on peer recommendations. They also want to interact more frequently with companies through digital channels. This behavior has started to trickle down into their purchasing decisions at work, and because millennials now make up the majority of the American workforce, it’s time for B2B marketers to take notice.

    When initially researching a potential corporate purchase, millennials prefer straightforward, in-person meetings with vendor representatives, in contrast to Gen-Xers and Boomers, who like to get their information from analysts, articles, blogs and third-party websites.

    Carolyn Baird, global research leader at the IBM Institute for Business Value (IBV) and author of the report, says B2B vendors need to pay attention to these trends. “We know how [millennials] are researching as consumers and so much of it is digital,” she says. “But when you get into the B2B space, the digital [component] is table stakes and what they really want to do is connect with individuals they’d be working with.”

    When moving on to the actual corporate purchase, millennial behavior is evenly split. Thirty-six percent of respondents rely on recommendations from family or friends, another 36 percent use their company’s data analysis, and 34 percent go with gut feelings.

    “If the data shows a promising and competitive product, and the individuals selling it pass my gut-check, then I’m likely to go into persuasive mode and explain why this is the right vendor to my colleagues. If I feel uneasy about either, then I start checking in with others and weigh their past experiences in similar situations.”

    Satisfied millennials spread the word

    B2B vendors should also note that if millennials like their products, they’re more likely than past generations to show love online. Sixty-nine percent of respondents leave compliments on vendor sites, and the same number post positive comments on social media. Millennials are also less likely than the previous two generations to blast companies with negative feedback online.

    “Millennials are very aware of how quickly a brand can be damaged with negative social exposure,” Baird says. They also know the potential impact of negative feedback on their own careers.

  46. Tomi Engdahl says:

    Engineering Shortage Persists
    India still holds sway in software
    http://www.eetimes.com/author.asp?section_id=36&doc_id=1326810&

    Lower engineering salaries mean managers should brace themselves for the revolving door, especially when the 3-7-year itch kicks in.

    “Engineers are impossible to find,” says Frank Kern. He should know: as chief executive of product engineering firm Aricent, he employs 11,000 software developers.

    The cost difference is one big reason why so many companies now have software engineering centers in India, including Aricent, which has 75% of its total staff in India, in centers in Bangalore, Delhi, Chennai, Hyderabad and Pune. India has long been known for churning out lots of engineering grads who are hard workers and polished English speakers. The problem is hanging on to them.

    Aricent loses about 15 percent of its developers every year. Others in India lose up to 19 percent, Kern says. Employers dole out big raises, getting young workers up to nearly $20,000 in the first couple years to keep them around.

    The tough part comes 3-7 years into an engineer’s career. Competition is fierce for developers who are trained and have some experience under their belts. That’s pushing annual salary increases back up into the double digits these days, Kern says.

    To compete, companies like Aricent are looking for new places to establish engineering centers away from the crowd.

    India remains the best place for software engineering on the planet, says Kern who ran IBM’s IT services group with 14,000 engineers before taking the helm at Aricent in October 2012 to focus on product engineering for OEMs and, increasingly, chip vendors. “I can and have put centers anywhere in the world, but there is no place that offers what India does,” he says.

    China is good, but its big east coast centers of Shenzhen and Shanghai are starting to get expensive. Having centers in the European Union to serve Europe provides advantages, and Poland is Kern’s next big target there.

  47. Tomi Engdahl says:

    Nutanix did build ‘Acropolis’ hypervisor, wants you to bury it
    Hyper-upstart wants to manage apps, not virtual machines, without sysadmin help
    http://www.theregister.co.uk/2015/06/09/nutanix_acropolis_and_prism/

    As predicted by The Register, Nutanix has built a hypervisor called Acropolis, but is using it for more than scale-out compute and an assault on VMware.

    Instead, the company wants to abstract all the stuff that makes up modern applications and make managing infrastructure idiot-proof.

    We’ll explain that in a moment.

    First, “Acropolis”, which is now the umbrella name for Nutanix’s compute and storage offerings. The company’s Distributed Storage Fabric is one temple on the Acropolis and does all the fun software-defined storage things that have made Nutanix a $300-million business.

    Also on the Acropolis is the “App Mobility Fabric”, tools that manage VMs but do so from an application-centric point of view. The idea is that instead of managing VMs, IT folk can manage all the hardware and software dedicated to a particular application.

    That’s made possible, in part, by the Acropolis hypervisor, a custom cut of KVM tuned to the company’s app management ambitions. The hypervisor is said to be lightweight, speedy and suited to VMs or containers.

    “We made it lean and we made it secure and easy to deploy,”

    There’s your shot at VMware, in case you’ve been waiting for it.

    Prism is a management tool that offers a “true consumer-grade user experience” and “an end-to-end view of all workflows” so that admins can make applications do their bidding “with no need for specialized training”.

    All of those nice marketing words mean, Gill said, that “if I want to create 100 VMs and the system says I do not have space, I can now do some capacity analysis and some VM sprawl analysis” with the machine learning routines built into Prism.

    Not many clicks later, we’re promised, the problem will be addressed, without a sysadmin needing to get their hands particularly dirty.

    Gill said Nutanix hopes to extend this capability so that workloads tended by Nutanix software can burst into public clouds.

  48. Tomi Engdahl says:

    Mozilla Plans To Build Virtual Reality APIs Into Firefox By the End of 2015
    http://tech.slashdot.org/story/15/06/09/1427241/mozilla-plans-to-build-virtual-reality-apis-into-firefox-by-the-end-of-2015

    Mozilla’s VR research team is hard at work making virtual reality native to the web. The group wants more than a few experimental VR-only websites; they want responsive VR websites that can adapt seamlessly between VR and non-VR, from mobile to desktop, built with HTML and CSS. They aim to have support for the WebVR API by the end of this year.

    How Mozilla Plans to Build VR Into the Foundation of the Web
    http://www.roadtovr.com/mozilla-build-vr-foundation-web-firefox-virtual-reality-oculus-rift/

    Though people speak of the ‘metaverse’—a vast network of interconnected virtual worlds—as a far flung future concept, in reality we already have the underlying structure built and functioning on a massive scale; it’s called the internet and it connects billions of us together. The only thing holding the internet back from becoming the metaverse is a way to step inside of it. Mozilla, it turns out, is working on the solution.

    What Needs to Happen to Make VR Native on the Web

    The biggest things that we need are support for building native-feeling immersive VR sites, users who have access to the necessary hardware to experience VR, and finally a compelling reason for developers to create or adapt existing sites. These are all tied together, and we’re trying to make sure that we can make small steps along all of these fronts without blocking on any one of them. Ultimately, we want users to have a seamless, friction-free experience on the Web, whether browsing existing Web content or new VR content. We also want developers to have a clear path to creating new fully-immersive VR web sites as well as adding VR elements to their current sites. Finally, we want all of this to work on the widest possible range of hardware, as one of the strengths of the Web is its ability to scale from the lowest end mobile phones to the highest end desktops.

    In more depth, there are several key areas that need to be addressed on the Web platform and browser side:

    Support for Building and Viewing VR-optimized Websites in HTML and CSS

    Our initial work with Web VR has focused on creating content using WebGL, which is a full 3D graphics API.
    While WebGL is a good place to get started with VR experiments on the Web, the lingua franca of the Web remains HTML+CSS.
    We also need to view and interact with HTML and CSS websites in virtual reality. For this, we will need to implement VR equivalents of scrolling, clicking links, zooming in, etc. And we will need to determine how to display desktop and mobile sites that were never designed for virtual reality. A historical analogue is Apple’s Mobile Safari browser, which successfully defined a system for displaying and interacting with classic desktop websites on 3.5-inch iPhone touch screens.

    Improved User Experience for Web VR in Current Desktop and Mobile Browsers

    In order for Web VR to be a functional part of the modern Web, desktop and mobile browsers will need to integrate a baseline of user interface support, in some cases updating long-standing security and interface conventions.

    Widespread Adoption of WebVR APIs in Major Browsers

    As many browsers as possible will need to implement support for the new WebVR API (as well as implement the basic platform, performance and UX improvements outlined here).

    The consumption of video is one of the most common use cases of the modern Web. HTML5 gives us the <video> and <audio> elements, which make it easy for Web developers to add media to their websites. For example, in order for VR video to take off on the Web, we may need to extend the <video> element.

    For VR to truly be a first-class citizen on the Web, browsers need to be able to deliver VR content with performance that matches that of native applications. Refresh rates of 75Hz, 90Hz and beyond will be common, and browsers will need efficient paths to communicate the most accurate sensor data.

    As stated earlier, a key strength of the Web is its universality. Websites built in HTML and CSS run everywhere, browsers provide excellent backwards compatibility, and browsers abstract away hardware and OS complexity. Developers know their content will run everywhere, and generally do not need to worry about optimizing for one operating system or another.
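
    As a sketch of what the entry point looked like at the time: the draft WebVR API exposed a device-enumeration call on navigator, which later spec revisions renamed and reshaped. The hand-written typings and the fullscreen option below are therefore assumptions based on experimental Firefox builds of the era, not a stable API:

      // Hand-declared shape of the experimental draft API; no standard
      // typings existed, and names changed in later spec revisions.
      interface VRDevice { deviceName: string; }
      type VRNavigator = Navigator & { getVRDevices?: () => Promise<VRDevice[]> };

      async function enterVRIfAvailable(canvas: HTMLCanvasElement): Promise<void> {
        const nav = navigator as VRNavigator;
        if (!nav.getVRDevices) {
          // Graceful fallback is the "responsive VR" idea: serve the 2D page.
          console.log("No WebVR support detected.");
          return;
        }
        const devices = await nav.getVRDevices();
        if (devices.length === 0) return;
        console.log("VR device found:", devices[0].deviceName);

        // Contemporary demos then went fullscreen against a specific headset;
        // the vrDisplay option name here is an assumption.
        (canvas as any).mozRequestFullScreen?.({ vrDisplay: devices[0] });
      }

    The feature-detection branch is the point: the same page serves flat HTML to ordinary browsers and immersive content where a headset is present.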

  49. Tomi Engdahl says:

    Virty servers’ independence promise has been betrayed
    The rise of the fake cluster – and management indifference
    http://www.theregister.co.uk/2015/06/08/virtualisation_loses_its_independence/

    Server virtualisation in its current state is pretty much done. The market is mature and critical tier one workloads are absolutely at home on tier one virtualised platforms such as VMware and HyperV. Virtualisation tends to be a very useful technology in larger organisations for what most administrators would say are the wrong reasons.

    People, management especially, tend to think they can run anything in a virtual estate. People come up with great ideas like virtualising cluster nodes and “test configurations”, introducing single points of failure in order to get the job done quickly. “We can fix it later” is the usual cry. Getting a VM guest stood up is easy, orders of magnitude quicker than getting bare-metal hosts bought and racked. Politics at work don’t help either.

    One of the killer selling points of virtualisation is that when a physical cluster node needs fixing, upgrading or taking out of service, it is a trivial matter to migrate the host’s virtual machines onto another cluster node.

    These forced implementations tend to be things that break the whole VM-to-host independence and cause problems that impact not only the servers and services but ultimately your reputation with other groups and customers.

    A classic example of this behaviour is when DBAs or other managers want a new application or database cluster stood up quickly because: “The installation engineer arrives tomorrow, we need it yesterday.” This time scale doesn’t really allow for bare metal purchase and racking.

    Virtualised clusters as guests is an age-old issue that has dogged virtualised platforms for several generations of virtualisation.

    Once the faux cluster is up and live, it breaks one of the main tenets of virtualisation. It has the effect of tying specific virtual guests to specific host nodes and preventing migration.

    In big business, where everything is change controlled to the smallest degree, changes that require host outages can cost several hundreds of pounds by the time all the work is completed. Failing a piece of work because there was a faux cluster node on the host is seen as a big issue.

    Another limiting factor is the use of “special” networks that are only available on one host, for example to serve as an internal heartbeat. Usually this is sheer laziness disguised as “It’s only a test.” All the guests that need this “test” network are then tied to the single host in question. It is easy enough to do properly, but people get lazy.

    I know techies are not always comfortable going against management wishes, but if you don’t talk about these issues they will persist. Without management backing when you say no, you will get railroaded into it.

  50. Tomi Engdahl says:

    Apple: Swift is going open source with support for iOS, OS X and Linux
    In other news, hell has frozen over
    http://www.theinquirer.net/inquirer/news/2412166/apple-swift-is-going-open-source-with-support-for-ios-os-x-and-linux

    APPLE HAS ANNOUNCED that its Swift programming language is going open source, supporting iOS, OS X and Linux. In other news, hell has frozen over.

    Swift 2 made its debut during Apple’s Worldwide Developer Conference (WWDC) on Monday, the first version of the programming language having made its debut during last year’s keynote.

    Swift 2 brings with it a bunch of new features for Apple developers, including error handling, protocol extensions, improvements to Apple’s Xcode integrated development environment and new optimization technology, Federighi told the crowd.

    The most notable announcement however was that Swift is going open source, in a huge move for Apple that sees it following in the footsteps of its increasingly open source rivals Google and Microsoft.

    What’s more, as well as being available on iOS and OS X, Apple is also releasing developer tools for Linux, meaning writing apps for Apple platforms will no longer require owning an Apple device.

