Computer technologies for 2012

ARM processors become more and more popular during 2012. Power and Integration—ARM Making More Inroads into More Designs article says it is about power—low power, almost no power. A huge and burgeoning market is opening for devices that are handheld and mobile, have rich graphics, deliver 32-bit multicore compute power, include Wi-Fi, web and often 4G connectivity, and can last up to ten hours on a battery charge. The most obvious among these are smartphones and tablets, but an increasing number of industrial and military devices also fall into this category.

The rivalry between ARM and Intel in this arena is predictably intense because, try as it will, Intel has not been able to bring the power consumption of its Atom CPUs down to the level of ARM-based designs (Atom typically sits in the 1-4 watt range, while a single ARM Cortex-A9 core runs in the 250 mW range). ARM’s East unimpressed with Medfield, design wins article tells that Warren East, CEO of processor technology licensor ARM Holdings plc (Cambridge, England), is unimpressed by chip giant Intel’s announcements about the low-power Medfield system-chip and its design wins. On the other hand, Android will run better on our chips, says Intel. It will be interesting to see how this competition plays out.
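
To put those numbers in perspective, here is a rough back-of-the-envelope runtime calculation (a minimal Python sketch; the battery capacity and average platform power figures are illustrative assumptions for the whole device, not just the CPU core):

    # Rough battery-life estimate: runtime (hours) = battery energy (Wh) / average draw (W).
    def runtime_hours(battery_wh, avg_power_w):
        return battery_wh / avg_power_w

    tablet_battery_wh = 25.0  # assumed tablet battery capacity

    # Assumed average platform power (SoC + display + radios), not just the CPU core:
    for label, watts in (("ARM-based platform", 2.5), ("Atom-based platform", 5.0)):
        print("%s at %.1f W: about %.0f hours per charge"
              % (label, watts, runtime_hours(tablet_battery_wh, watts)))

With those assumptions the ARM-based device lands right around the ten-hour mark quoted above, and doubling the average draw halves it.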

Windows-on-ARM Spells End of Wintel article tells that brokerage house Nomura Equity Research forecasts that the emerging partnership between Microsoft and ARM will likely end the Windows-Intel duopoly. The long-term consequence for the world’s largest chip maker will likely be an exit from the tablet market as ARM makes inroads into notebook computers. As ARM will surely keep pointing out to everyone, it does not have to beat Intel’s raw performance to make a big splash in this market: for these kinds of devices speed isn’t everything, and ARM’s promised power-consumption advantage will surely be a major selling point.

Windows 8 Release Expected in 2012 article says that Windows 8 will be with us in 2012, according to Microsoft roadmaps, and Microsoft is still hinting at an October Windows 8 release date. It remains to be seen what the ramifications of Windows 8, which is supposed to run on either the x86 or ARM architecture, will be. Windows on ARM will not be terribly successful, says one analyst, but it remains to be seen whether he is right. The ARM-based chip vendors Microsoft is working with (TI, Nvidia, Qualcomm) are currently focused on mobile devices (smartphones, tablets, etc.), because this is where the biggest perceived advantages of ARM-based chips lie, and they do not seem to be actively working on PC designs.

Engineering Windows 8 for mobile networks is also under way. Windows 8 Mobile Broadband Enhancements Detailed article tells that using mobile broadband in Windows 8 will no longer require device-specific drivers and third-party software. This is thanks to the new Mobile Broadband Interface Model (MBIM) standard, which hardware makers are reportedly already beginning to adopt, and a generic driver in Windows 8 that can interface with any chip supporting that standard. Windows will automatically detect which carrier it is associated with and download any available mobile broadband app from the Windows Store. MBIM 1.0 is a USB-based protocol for host and device connectivity for desktops, laptops, tablets and mobile devices. The specification supports multiple generations of GSM and CDMA-based 3G and 4G packet data services, including the recent LTE technology.

Consumerization of IT is a hot trend that continues in 2012. Uh-oh, PC: Half of computing device sales are mobile. Mobile App Usage Further Dominates Web, Spurred by Facebook article tells that the era of mobile computing, catalyzed by Apple and Google, is driving one of the largest shifts in consumer behavior of the last forty years. Impressively, its rate of adoption is outpacing both the PC revolution of the 1980s and the Internet boom of the 1990s. By the end of 2012, Flurry estimates that the cumulative number of iOS and Android devices activated will surge past 1 billion, making the rate of iOS and Android smart device adoption more than four times faster than that of personal computers (over 800 million PCs were sold between 1981 and 2000). Smartphones and tablets come with broadband connectivity out of the box, and bring-your-own-device is becoming accepted business practice.

Mobile UIs: It’s developers vs. users article tells that the increased emphasis on distinctive smartphone UIs means even more headaches for cross-platform mobile developers. Whose UI will be the winner? Native apps trump the mobile Web. The increased emphasis on specialized mobile user interface guidelines casts new light on the debate over Web apps versus native development, too.

The Cloud is Not Just for Techies Anymore article tells that cloud computing has achieved mainstream status, so we demand more from it. That’s because our needs and expectations for a mainstream technology and an experimental technology differ: once we depend on a technology to run our businesses, we demand minute-by-minute reliability and performance.

Cloud security is no oxymoron article estimates that in 2013 over $148 billion will be spent on cloud computing. Companies large and small are using the cloud to conduct business and store critical information. The cloud is now mainstream. The paradigm of cloud computing requires cloud consumers to extend their trust boundaries outside their current network and infrastructure to encompass a cloud provider. There are three primary areas of cloud security that apply to almost any cloud implementation: authentication, encryption, and network access control. If you are dealing with those issues in software design, read the Rugged Software Manifesto and the Rugged Software Development presentation.
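
Of those three areas, encryption is the one a cloud customer can most directly take into their own hands, by encrypting data before it ever leaves the local network. A minimal sketch in Python, assuming the third-party cryptography library is available; key management is deliberately left out:

    # Encrypt data locally so the cloud provider only ever stores ciphertext.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()   # in practice: generate once and keep it out of the cloud
    cipher = Fernet(key)

    record = b"customer list - internal only"   # hypothetical sensitive data
    ciphertext = cipher.encrypt(record)         # this blob is what gets uploaded

    # ... later, after fetching the blob back from the provider:
    assert cipher.decrypt(ciphertext) == record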

Enterprise IT’s power shift threatens server-huggers article tells that as more developers take on the task of building, deploying, and running applications on infrastructure outsourced to Amazon and others, traditional roles of system administration and IT operations will morph considerably or evaporate.

Explosion in “Big Data” Causing Data Center Crunch article tells that global business has been caught off-guard by the recent explosion in data volumes and is trying to cope with short-term fixes such as buying in data centre capacity. Oracle also found that the number of businesses looking to build new data centres within the next two years has risen. Data centre capacity and data volumes should be expected to keep going up, and this drives data centre capacity building. Most players active in the “Big Data” field seem to plan to use the Apache Hadoop framework for the distributed processing of large data sets across clusters of computers. At least EMC, Microsoft, IBM, Oracle, Informatica, HP, Dell and Cloudera are using Hadoop.
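
For readers who have not worked with Hadoop yet, its core idea is MapReduce: a mapper turns input records into key-value pairs, the framework shuffles and sorts them by key across the cluster, and a reducer aggregates each key’s values. A minimal word-count sketch using Hadoop Streaming with Python scripts (the file names and job paths are illustrative assumptions):

    #!/usr/bin/env python
    # mapper.py: emit "<word> TAB 1" for every word read from standard input.
    import sys

    for line in sys.stdin:
        for word in line.split():
            print("%s\t%d" % (word.lower(), 1))

    #!/usr/bin/env python
    # reducer.py: sum counts per word; Hadoop delivers mapper output sorted by key.
    import sys

    current_word, total = None, 0
    for line in sys.stdin:
        word, count = line.rsplit("\t", 1)
        if word != current_word and current_word is not None:
            print("%s\t%d" % (current_word, total))
            total = 0
        current_word = word
        total += int(count)
    if current_word is not None:
        print("%s\t%d" % (current_word, total))

The same pair can be smoke-tested locally with cat somefile.txt | python mapper.py | sort | python reducer.py before handing it to the cluster via the hadoop-streaming jar.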

Cloud storage has been a very popular topic lately for handling large amounts of data. The benefits have been talked about a great deal, but now we can also see the risks being realized. Did the Feds Just Kill the Cloud Storage Model? article claims that Megaupload-type shutdowns and the Patriot Act are killing interest in cloud storage. Many innocent Megaupload users have had their data taken away from them. The Megaupload seizure shows how personal files hosted on remote servers operated by a third party can easily be caught up in a government raid targeted at digital pirates. In the wake of the Megaupload crackdown, fear is forcing similar sites to shutter their sharing services. If you use any of these cloud storage sites to store or distribute your own non-infringing files, you are wise to have backups elsewhere, because they may be next on the DOJ’s copyright hit list.

Did the Feds Just Kill the Cloud Storage Model? article also tells that worries have been steadily growing among European IT leaders that the USA Patriot Act would give the U.S. government unfettered access to their data if it is stored on the cloud servers of American providers. Escaping the grasp of the Patriot Act may be more difficult than the marketing suggests: “You have to fence yourself off and make sure that neither you or your cloud service provider has any operations in the United States, otherwise you’re vulnerable to U.S. jurisdiction.” And the cloud computing model is built on the argument that data can and should reside anywhere in the world, freely passing across borders.

Data centers to cut LAN cord? article mentions that 60GHz wireless links are being tested in data centers to ease east-west traffic jams. According to a recent article in The New York Times, data center and networking techies are playing around with 60GHz wireless networking for short-haul links to give rack-to-rack communications some extra bandwidth for when the east-west traffic goes a bit wild. The University of Washington and Microsoft Research published a paper at the Association for Computing Machinery’s SIGCOMM 2011 conference late last year about their tests of 60GHz wireless links in the data center. Their research used prototype links that bear some resemblance to the point-to-point, high-bandwidth technology known as WiGig (Wireless Gigabit), which among other things is being proposed as a means to support wireless links between Blu-ray players and TVs, replacing HDMI cables (Wilocity Demonstrates 60 GHz WiGig (Draft 802.11ad) Chipset at CES). The 60 GHz band is suitable for indoor, high-bandwidth use in information technology. There are still many places for physical wires, though: the wired connections used in a data center are highly reliable, so “why introduce variability in a mission-critical situation?”
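
One reason 60 GHz is attractive indoors but useless for long hauls is how quickly free-space path loss grows with frequency. A quick worked calculation (a Python sketch using the standard free-space path loss formula; the link distances are illustrative assumptions):

    # Free-space path loss: FSPL(dB) = 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)
    # with distance d in metres and frequency f in Hz.
    import math

    def fspl_db(distance_m, freq_hz):
        c = 3.0e8  # speed of light, m/s
        return (20 * math.log10(distance_m)
                + 20 * math.log10(freq_hz)
                + 20 * math.log10(4 * math.pi / c))

    for d in (2, 10, 50):  # assumed rack-to-rack and across-the-room distances
        print("60 GHz over %2d m: %.1f dB path loss" % (d, fspl_db(d, 60e9)))
        print(" 5 GHz over %2d m: %.1f dB path loss" % (d, fspl_db(d, 5e9)))

At any given distance the 60 GHz link loses roughly 22 dB more than a 5 GHz Wi-Fi link, the loss grows another 20 dB for every tenfold increase in distance, and atmospheric oxygen absorption around 60 GHz eats into longer paths, which is why these links are pitched at short, in-room, rack-to-rack hops rather than as a general LAN replacement.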

820 Comments

  1. Tomi Engdahl says:

    IT pros lack recent skills
    http://www.theregister.co.uk/2012/03/13/technology_outpaces_training/

    CompTIA says technology outpaces training

    46% of IT workers struggle to keep their skills up to date with new technology, according to CompTIA, a non-profit trade association advancing the global interests of information technology professionals.

    The resulting mess means that 80% of employers think they have a damaging skills gap, often impacting staff productivity (41% of respondents), customer service or customer engagement (32%), and security (31%).

  2. Tomi Engdahl says:

    As web booms, so does demand for data centers
    http://gigaom.com/cloud/as-web-booms-so-does-demand-for-data-centers/

    A new survey by Campos Research & Analysis on behalf of Digital Realty Trust, a data center real estate company, has some data about the North American market that points to a growing demand for data center services in 2012 and 2013:

    Ninety-two percent of respondents will definitely or probably expand in 2012—the highest percentage in the six years that Digital Realty has sponsored the survey.
    Of those respondents with definite plans to expand in 2012, 38 percent expect to expand in three or more locations. Fifty-four percent expect to pursue projects of 15,000 square feet or more in size.
    Of those companies planning or considering data center projects this year, 92 percent plan to expand in the U.S., approximately 50 percent also expect to expand in Europe or the Asia Pacific region, and 21 percent reported plans for projects in South America.

    And from the way things are going, why do I get a feeling that all these projections will be rethought by the time we get to the end of 2012?

  3. Tomi Engdahl says:

    Consumer devices still cause IT chaos — five years after first iPhone
    http://www.computerworld.com/s/article/9225126/Consumer_devices_still_cause_IT_chaos_five_years_after_first_iPhone

    IT execs offer advice on tough task of embracing consumer devices, and how they can be used to innovate

    The consumerization of IT trend is causing chaos and confusion in IT operations grappling with the demand that they support the latest iPad, iPhone or Android device.

    That’s the opinion of many CIOs and IT managers attending Computerworld’s Premier 100 conference here this week.

    “After all this time, I’m still in crisis mode with Bring Your Own Device,” said Alex Yohn, assistant director of the office of technology at West Virginia University in Morgantown, W.Va.

    Focusing on adapting apps to work with iOS has caused some users of Android machines to complain when things don’t work. But the IT manager said, “We tell them if it doesn’t work, it’s your own fault.”

    “BYOD is happening, so get over it. Embrace it,” said Good Technology CTO Nicko van Someren.

    Tom Soderstrom, CTO at NASA’s Jet Propulsion Laboratory, spent nearly an hour describing how the organization uses consumer technologies internally to excite workers and the public. The IT shop needs to be an innovator, not just the security enforcer, he said.

    How mobile, BYOD and younger workers are reinventing IT
    http://www.computerworld.com/s/article/9224568/How_mobile_BYOD_and_younger_workers_are_reinventing_IT

    Changes are coming to IT, so you’d better be ready

    A combination of forces — the move to mobility, the arrival of a new generation of employees and the bring-your-own-device (BYOD) trend — is changing the world of IT with a speed that might have seemed impossible a few years ago.

    The study found that younger workers:

    Have very high expectations when it comes to getting a response regarding support calls.
    Prefer interactions with IT beyond just calling the help desk, including email, chat and texts.
    Will typically research problems on their own (either before calling IT or while waiting for a response).
    Tend to work outside of typical business hours and off premises.
    Will develop their own solutions and processes with the tools at their disposal, including consumer-oriented cloud services and personal devices.
    Value working collaboratively with colleagues within their organization and beyond it.
    Are often willing to share knowledge about solutions provided to them by IT and solutions and processes they develop on their own.

    For the most part, this means that millennials are assets to an organization. After all, what employer wouldn’t want motivated self-starters that work well with others and can leverage their personal experience as well as that of their professional and social networks?

  4. Tomi Engdahl says:

    The Consoles Are Dying, Says Developer
    http://games.slashdot.org/story/12/03/14/0354237/the-consoles-are-dying-says-developer

    “While you might have often heard that PC gaming is dying — detractors have been claiming this for over a decade — one developer has a different take: that consoles are the ones on the way out. In a 26-minute presentation at GDC, Ben Cousins, who heads mobile/tablet game maker ngmoco, uses statistics of electronic and gaming purchases, along with market shares of developers and publishers from just a few years ago, to come to some surprising conclusions.

    Sony, Nintendo and Microsoft are losing out when compared with the new generation of gaming platform developers: Facebook, Apple and Google.”

    Can $60 Games Survive?
    http://games.slashdot.org/story/12/03/13/227201/can-60-games-survive

    Game budgets continue to rise with each successive console generation, and with the Wii U launching later this year, the industry is on the cusp of yet another costly transition. Publishers have been regularly charging $60 for games this generation, but that model simply cannot survive, Nexon America CEO Daniel Kim said in an interview.

  5. Tomi Engdahl says:

    Media Tablet Shipments Outpace Fourth Quarter Targets; Strong Demand for New iPad and Other Forthcoming Products Leads to Increase in 2012 Forecast, According to IDC
    http://www.idc.com/getdoc.jsp?containerId=prUS23371312

    Worldwide media tablet shipments into sales channels rose by 56.1% on a sequential basis in the fourth calendar quarter of 2011 (4Q11) to 28.2 million units worldwide, according to the International Data Corporation (IDC)

    As predicted, Android made some strong gains in 4Q11, thanks in large part to the Amazon Kindle Fire’s success (the Fire runs a custom version of Google’s Android OS). Android grew its market share from 32.3% in 3Q11 to 44.6% in 4Q11. As a result, iOS slipped from 61.6% market share to 54.7%

    “As the sole vendor shipping iOS products, Apple will remain dominant in terms of worldwide vendor unit shipments,” Mainelli said. “However, the sheer number of vendors shipping low-priced, Android-based tablets means that Google’s OS will overtake Apple’s in terms of worldwide market share by 2015. We expect iOS to remain the revenue market share leader through the end of our 2016 forecast period and beyond.”

  6. Tomi Engdahl says:

    Hybrid IT: How Internal and External Cloud Services are Transforming IT
    http://www.gartner.com/technology/research/technical-professionals/hybrid-cloud.jsp

    Hybrid IT is the result of combining internal and external services, usually from a combination of internal and public clouds, in support of a business outcome. Adoption risks of public clouds have led to architectures that connect internal core services and critical data to external, commoditized services.

    Hybrid IT relies on new technologies to connect clouds, sophisticated approaches to data classification and identity, and service oriented architecture. In addition, the role of IT and its practitioners is undergoing significant change.

  7. Tomi Engdahl says:

    What tech skills gap? Train your workers
    IT professionals are feeling the pressure to boost their tech skills, but they aren’t getting the support and resources
    http://www.infoworld.com/t/it-training/what-tech-skills-gap-train-your-workers-188519

    IT professionals can expect increasing pressure from management to learn new job skills as cloud and mobile computing gain traction and new cyber security threats emerge. Unfortunately, those same IT pros can’t reliably expect much in the way of assistance from their employers to get that training.

    A new research report, “State of IT Skills” by CompTIA, found that around 9 in 10 business managers see gaps in workers’ skill sets, yet organizations are more likely to outsource a task or hire someone new than invest in training an existing staff. Perhaps worse, a significant amount of training received by IT doesn’t translate to skills they actually use on the job — savvy IT pros might need to invest their own time and resources in training for the sake of job security.

    The study points to the fact that organizations strongly appreciate IT’s valuable role in business, even if upper-level executives and HR personnel don’t understand the underlying complexity of learning, deploying, and maintaining new technologies.

  8. Tomi Engdahl says:

    Meet the power challenges of high-speed interconnects in the cloud
    http://www.eetimes.com/design/communications-design/4238105/Meet-the-power-challenges-of-high-speed-interconnects-in-the-cloud?Ecosystem=communications-design

    With the addition of cloud services (for example, Apple iCloud) to the already massively connected Internet, data centers are seeing an unprecedented increase in sheer compute and storage requirements. This growth directly impacts energy consumption. As it grows, engineers are seeking solutions to keep the power under control. In this article we examine specifically the interconnect power budgets as massively connected systems move beyond 10 gigabit per second (Gbps) interconnects and solutions to lower power consumption in these high-speed channels.

    No longer is a “server” really a discrete piece of hardware. In most cases, the actual hardware hosting a service may be anywhere within a service provider’s infrastructure, which introduces a sort of “uncertainty” of where it is at any moment in time. This type of performance throttling is referred to as “virtualization” or the encapsulation of a service within a software framework that allows it to move freely between hardware hosts. This allows service providers the ability to vary the resources on demand and improve the power consumption of the infrastructure.

    As services are throttled, there is a great deal of “machine-to-machine” (M2M) activity. In most data centers, most of the traffic is between machines and not connected to the outside world. The simple addition of virtualization has driven the need to migrate from one gigabit per second interconnects (standard on many mid-decade servers) to 10 Gbps. Today, the demand is driving the move to 25 Gbps interconnects. Many of these connections are less than 5 meters with the majority less than one meter in length

    With one gigabit connections, small gauge wires could easily carry the bits without considerable loss of signal integrity.

    With the move to 10G Ethernet, signal integrity became more of an issue and passive cables started using larger gauge wire to compensate. The airflow / bend radius issues began to show up and installers / designers started looking to fiber interconnects as a way to fix the problem. This move to fiber introduced several issues such as increased cost and power consumption. A typical single 10G Ethernet SFP+ module dissipates about a watt of power. With tens of thousands of ports, the amount of power required just for the fiber interconnects increased significantly
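
    The scale of that is easy to underestimate, so here is a tiny back-of-the-envelope sketch (the port count and facility overhead figure are illustrative assumptions; only the roughly one watt per module comes from the article):

        # Aggregate power drawn by 10G optical modules alone in a large data center.
        ports = 20000             # assumed number of 10G SFP+ ports
        watts_per_module = 1.0    # about a watt per module, per the article
        pue = 1.8                 # assumed facility overhead (cooling, power distribution)

        it_load_kw = ports * watts_per_module / 1000.0
        print("Transceiver IT load: %.1f kW" % it_load_kw)
        print("Facility draw including overhead: %.1f kW" % (it_load_kw * pue))

    Tens of kilowatts around the clock just for the optics is the kind of line item that makes short copper hops attractive again.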

    If passive cable used for high-speed interconnects suffers from bulk and bend radius issues, then fiber solutions suffer from increased power consumption and higher cost.

    The technology of improving the signal integrity of cables is limited to lengths less than 15 meters in most cases for 10 Gbps Ethernet. This is actually very common today for 10 Gbps interconnects.

    In the world of fiber interconnects, there are basically two realms: 1) interconnects for short (less than a kilometer) communications; and 2) long (much greater than a kilometer) communications.

    As cloud computing and storage continues to grow in both scale and capacity, the interconnections between nodes will continue to increase in capacity.

  9. Tomi Engdahl says:

    How Green Is the Cloud?
    http://www.wired.com/cloudline/2012/03/cloud-carbon/

    Moving to the cloud means green in more ways than one, with savings on the order of billions of dollars, as well as by significantly reducing carbon emissions, according to a new study.

    Have a look at the two reports and share your thoughts: Is the cloud key to reducing carbon emissions, or is this spin/greenwashing?

  10. Tomi Engdahl says:

    Does What’s Inside Matter?
    http://spark.qualcomm.com/salon/does-what%E2%80%99s-inside-matter

    Are specs still important? Should we as consumers be expected to know what’s in our devices? My answer to that is yes, but it’s complicated

    No one buys a gadget, computer or TV because they want to own a specific chipset, hard drive or piece of glass. We buy gadgets because we want to enjoy a particular experience, and we want a product that can deliver that experience

    The stuff that goes into our gadgets—the specs—enable those experiences, and can serve as useful shorthand for whether or not a device can actually deliver the experiences we’re looking for.

    But the key is that those specs—processor speed, memory, screen size, resolution, etc.—have to be paired with excellent software. You need both.

    That means in order to create a great gadget experience, you ultimately need a capable device with well-designed software

    Netbooks are a good example of this trade-off.

    Conversely, we’ve seen how even tablets with powerful processors and plenty of RAM can disappoint users when paired with sub-par software.

    You can argue that consumers should be able to buy something without having to worry about whether it’s going to deliver as promised. That might be true, but it’s not the reality of the gadget world.

    And yes, that means knowing a bit more about what specs actually mean and why and when they matter.

    When do specs stop mattering? I think they only become irrelevant when a spec improvement doesn’t lead to an incremental improvement in experience, or when the specs themselves become obsolete. An example of this is the camera megapixel war

    Just ask anyone arguing that specs don’t matter whether they’d want a slower processor in their smartphone, or to give up that 4G radio for a 2G one.

  11. Tomi Engdahl says:

    Intel rolls first processor optimized for datacenters
    http://www.edn.com/article/521123-Intel_rolls_first_processor_optimized_for_datacenters.php

    Intel Corp says it has designed its first processor built from the ground up for the “green” datacenters of the future, claiming a 70% increase in performance for the same energy consumption. The new E5-2600 also features a high-speed bi-directional ring encircling its up to eight cores per socket connecting up to 20 Mbytes of cache, quad DDR3 memory controllers, and 40-lanes of PCI-Express 3 for input/output (I/O).

    “The E5 is our first CPU optimized for the energy-efficient datacenter of 2015,” said Jeff Gilbert, Sandy Bridge architect. “It features twin 32-byte wide ultra-high-speed rings going in opposite directions to encircle the eight [Sandy Bridge] cores and connect them to cache.”

    Besides voltage and frequency scaling for each core, the new power management agent also manages energy efficiency in I/O by dynamically reducing its width in response to workload and thermal management goals

  12. Tomi Engdahl says:

    Estimate: Amazon Cloud Backed by 450,000 Servers
    http://www.datacenterknowledge.com/archives/2012/03/14/estimate-amazon-cloud-backed-by-450000-servers/

    How many servers does it take to power Amazon’s huge cloud computing operation? Like many large Internet companies, Amazon doesn’t disclose details of its infrastructure, including how many servers it uses. But a researcher estimates that Amazon Web Services is using at least 454,400 servers in seven data center hubs around the globe.

  13. Tomi Engdahl says:

    IT Skills Shortages Inside Companies Hamper Profitability, Productivity
    http://www.cio.com/article/702176/IT_Skills_Shortages_Inside_Companies_Hamper_Profitability_Productivity_?taxonomyId=3123

    Data from a recent survey conducted by CompTIA shows that skills shortages inside IT departments negatively impact corporate profitability, productivity, innovation, speed to market, customer service and security.

    The skills that respondents ranked as the most important were core IT skills. The following IT skills received rankings greater than 70 percent:

    networking/infrastructure
    servers/data center management
    storage/back-up
    cybersecurity
    database/information management
    help desk/IT support
    telecom/unified communications
    printers/copiers/faxes
    data analytics/business intelligence
    Web design and development

  14. Tomi Engdahl says:

    Microsoft: No next-gen Xbox in 2012
    http://www.reghardware.com/2012/03/19/microsoft_insists_no_next_gen_xbox_in_2012/

    Those anticipating an Xbox ‘720’ reveal at games industry shindig E3 2012 will be disappointed. Microsoft has insisted it will not discuss next-gen consoles at all this year.

    “There will be no talk of new Xbox hardware at E3 or anytime soon,” said Dennis before insisting: “For us, 2012 is all about Xbox 360.”

    Following the announcement, two unnamed moles said Microsoft is set to squeeze out a least one more year of sales from its current-gen console, Bloomberg reports.

  15. Tomi Engdahl says:

    Microsoft ‘yanked optical drive from Xbox 720’
    http://www.reghardware.com/2012/03/09/microsoft_drops_optical_disc_drive_for_next_xbox/

    Microsoft’s next Xbox – whatever it’s called – will not feature an optical disc drive, moles have claimed. Instead, it will gain games by downloads and possibly on memory cards too.

    Allegedly breaking one of the “strictest NDA” contracts ever encountered, the insiders said the console will indeed be launched in 2013, as rumoured, with a type of removable solid-state storage rather than the DVD format of yesteryear, MCV reports.

    The news that console manufacturers could turn their backs on discs is worrying for already struggling high street outlets

  16. Tomi Engdahl says:

    Moles say Sony eyeing AMD for PlayStation 4 chips
    http://www.reghardware.com/2012/02/23/will_sony_turn_to_amd_for_playstation_4_graphics/

    The PlayStation 3 currently utilises GPU technology from graphics guru and AMD arch-rival Nvidia, but former AMD employees suggested this might not be the case going forward, Forbes reports.

    AMD CFO Thomas Seifert has already identified gaming as a key revenue driver for the chip company in 2012. Seifert may be referring to Microsoft’s impending ‘Xbox 720’, since the Redmond outfit already rolls with AMD for the Xbox 360 console.

    Don’t forget, though, that neither Sony nor Microsoft have said anything about what technology they might be including in future consoles.

  17. Tomi Engdahl says:

    AMD unveils new chip for Web hosting
    The Opteron 3200 Series processor is said to offer better performance while using less power.
    http://news.cnet.com/8301-1001_3-57400465-92/amd-unveils-new-chip-for-web-hosting/

    Specifically, the new AMD Opteron 3200 Series processor is touted to offer a more efficient bang per buck as the enterprise-class platform is said to offer up to 60 percent better performance per dollar and use up to 19 percent less power per core.

    The Opteron 3200 Series also falls in line with AMD’s cloud strategy as these chips are also supposed to boast twice the core density per rack.

    45W to 65W TDP
    2.7 GHz base frequency, up to 3.7 GHz frequency using AMD Turbo CORE technology

  18. Tomi Engdahl says:

    Big data enters open-source hype cycle
    Riches for some, mostly not VCs
    http://www.theregister.co.uk/2012/03/20/who_is_making_money_in_big_data/

    Is it the open source hype cycle, replayed in big-data style?

    Possibly. Open source was all the rage in the tech press for years as it promised to lower costs while improving enterprise IT freedom. Ultimately, a few start-ups cashed out big time (MySQL, JBoss), but for the most part the real value in open source came as both IT vendors and in-house IT organisations turned to open source to provide raw material for their software projects. Open source became less about sales and more about code, which was exactly what it was designed to do.

    Today, venture capitalists are throwing piles of cash into big data start-ups hoping to strike it rich, and some undoubtedly will. But let’s be clear: data analytics has long been part of the tech industry. We may choose to call it “Big Data” now but it has been a staple of forward-thinking industries for at least 20 years, as one blogger notes.

    Call it data warehousing and data mining. Call it business analytics. Call it whatever you want. It’s not new, and it’s not even necessarily a game changer

    However, there are at least two big areas that the new big data, much like open source, trumps its antecedents: cost and scale.

    These two factors, perhaps more than anything else, account for the startling rise in Hadoop’s popularity, even as the more staid “data mining” has lost its lustre. Hadoop makes the collection and analysis of data possible on low-cost, easily scaled, commodity hardware. In the past a financial services company that wanted to run credit analysis jobs had to pay an IBM a huge check to cover the cost of the proprietary hardware and software.

    Not anymore. Hadoop has democratised data, turning it into a competitive market.

    In sum, yes, big data is big. But it’s not really new. What is new is the ability to process immense quantities of data for pennies on the data warehousing dollar.

  19. Tomi Engdahl says:

    In the Cloud, Your Data Can Get Caught Up in Legal Actions
    http://www.cio.com/article/702488/In_the_Cloud_Your_Data_Can_Get_Caught_Up_in_Legal_Actions?taxonomyId=3024

    We all know that the data we rely on to run our businesses can be subject to subpoena and other government actions. Such actions create additional risks when that data is in the cloud.

    With cloud computing, data from multiple customers is typically commingled on the same servers. That means that legal action taken against another customer that is completely unrelated to your business could have a ripple effect. Your data could become unavailable to you just because it was being stored on the same server as data belonging to someone else that was subject to some legal action. For example, a search warrant issued for the data of another customer could result in your data being seized as well.

    The federal indictments against the individuals behind Megaupload are a case in point.
    The catch is that a lot of people were using Megaupload for legitimate purposes. When the government took action against the alleged bad guys, those legitimate users also lost access to their data. It’s a textbook example of how technology continues to outpace the law’s ability to address the new questions that arise with cloud computing. For example, who is responsible for returning data to legitimate users?

    With Megaupload essentially shut down, legitimate users couldn’t retrieve their data directly as in the past. The government wouldn’t release any data while it temporarily had custody of it to gather evidence, and when it was finished with that, it didn’t want responsibility for sorting and returning data.

    How could a legitimate Megaupload customer have avoided getting caught up in this? Thoroughly vetting a cloud provider’s background and business practices before using its service would be a good first step in most cases.

    The cloud also complicates things if your own data is subject to some legal action.
    To reduce the risks that come with this decreased control, it’s important to determine in advance what your cloud provider’s standard policies are regarding such legal requests.

    Your contract should also specify what the provider needs to do if any of your data becomes the subject of a subpoena or other legal or governmental request for access.

  20. Tomi Engdahl says:

    HP finally decides the future of the PC: It’s a printer accessory
    http://www.theregister.co.uk/2012/03/21/hp_psg_ipg_amalgamation/

    HP has confirmed the rumours anent the future of its Personal Systems Group – the globocorp’s PC-making arm, which at one point seemed likely to be sold off – by announcing that it will merge with the company’s printer tentacle.

    HP’s Imaging and Printing Group (IPG) and its Personal Systems Group (PSG) are joining forces to create the Printing and Personal Systems Group.

    “This combination will bring together two businesses where HP has established global leadership,” said CEO Meg Whitman.

  21. Tomi Engdahl says:

    Windows 8 tablet PCs expected to be launched in October, say Taiwan makers
    http://www.digitimes.com/news/a20120320PD213.html

    In line with Microsoft’s release of Windows 8 in September-October, first-tier vendors including Hewlett-Packard, Dell, Lenovo, Acer and Asustek Computer are developing x86-based tablets for launch in October, according to Taiwan-based supply chain makers.

    Vendors including Samsung Electronics, Sony, Toshiba, Lenovo, Acer and Asustek are expected to launch Windows 8 on ARM (WoA) tablet PCs in early 2013, the sources indicated. However, Nokia may take the initiative to launch 10-inch WoA tablets equipped with Qualcomm processors in November 2012, the sources claimed.

  22. Tomi Engdahl says:

    Power your mobile strategy with a cloud
    Use a private cloud to handle security, management and data access for your mobile workforce
    http://www.networkworld.com/research/2012/032012-power-your-mobile-strategy-with-257451.html?page=1

    Mobile devices will soon be driving cloud computing — and vice versa. Here’s why: It’s very sensible to use a private cloud for security, management and other aspects of mobile applications. But getting there will require planning and investment by IT.

    Some have already moved in this direction. In a December 2011 survey of 3,645 IT decision-makers in eight countries, a third of the respondents said that providing information access to multiple devices was their top reason for implementing cloud computing.

    Why adopt a cloud? Top motivating factors (source: TNS/CSC survey, December 2011; 3,645 respondents):

    Accessibility to information via multiple devices – 33%
    Accelerating business speed – 21%
    Cutting costs – 17%

    “The nice part of this is that we get automatic rendering of content to all mobile devices, removing or eliminating the need to write device-specific apps” for iPhone or Android devices, among others, Peltz explains. After the CMS is fully implemented, “it will allow all of our content to be managed by end users or departments or business units,” he says.

    Among the issues his group has wrestled with are whether to build a Web portal that adapts itself based on the device that is coming into it, or to go with a device-specific app. Today the firm is using both approaches.
    But the company also has a web portal “where I can do the exact same thing,” Miller says. The goal is to have “inputs coming in from just about any mobile device.”

    Although mobile computing and mobile cloud computing may sound the same, they are in fact very different. In “regular” mobile computing, applications run on a mobile device in native mode, with the application and data all stored on the device.

    Running a mobile application in native mode has some advantages — most important, no latency or network bandwidth problems. But applications that run on mobile devices are often limited in functionality and are generally not business-class applications; it’s very rare to find native smartphone apps used as serious front ends for database queries, for instance.

    In contrast, mobile cloud computing applications run on servers that reside in the cloud. Application data also lives in the cloud and results are fed back to the mobile device via an over-the-air network such as 3G or 4G. Users access apps and data via the browser on their mobile devices.
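
    To make the contrast concrete, the cloud side of such an application can be as small as a single HTTP endpoint that serves results to whatever browser or thin client the handset runs. A minimal sketch, assuming the Flask web framework and hypothetical policy data (in reality this would sit behind the kind of mobile gateway and VPN discussed below):

        # Cloud-hosted endpoint: application logic and data stay on the server,
        # and the mobile device only receives results over 3G/4G.
        from flask import Flask, jsonify

        app = Flask(__name__)

        # Hypothetical business data that never needs to be stored on the handset.
        POLICIES = {"A-1001": {"holder": "Example Co", "status": "active"}}

        @app.route("/api/policies/<policy_id>")
        def get_policy(policy_id):
            policy = POLICIES.get(policy_id)
            if policy is None:
                return jsonify(error="not found"), 404
            return jsonify(policy)

        if __name__ == "__main__":
            app.run(host="0.0.0.0", port=8080)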

    Because data (and some applications) move between mobile devices and the cloud via off-premises networks, security is a major consideration.

    Jeff Deacon, director of corporate strategy at Verizon Business, says that in most organizations today, mobile devices are coming in straight across the Internet, and this is not a good idea. “If you poke a hole in your firewall for access from a mobile device you have effectively poked a hole in your firewall for anyone in the world. Securing a gateway specific to mobile devices that can support various operating systems — iOS, Android, Windows — is very important.”

    Deacon says that many companies do not allow access to back-office data across the Internet. Access to secured data with smartphones or tablets should be done via a VPN.

    “The usability group wants to make it easier for people to use the phone, while the security folks want to make it more difficult,” says Eric Miller, CIO at Erie Insurance.

    “We rely on the security of the phone to allow people to get into the app, but then you have to authenticate yourself against our back-end system,” he says.

    “During the design of apps we always assume that a phone can be lost,” and they keep in mind what would be lost in case someone cracks the encryption.

    The market for cloud-based mobile applications is expected to grow almost 90% from 2009 to 2014, according to Juniper Research. For its part, ABI Research reports that more than 240 million business customers will access cloud-computing services via mobile devices by 2015, and that number could approach a billion.

  23. Tomi Engdahl says:

    ARM Says New Architecture Could Cut Cost of 32-Bit MCUs
    http://www.designnews.com/document.asp?doc_id=240544&cid=NL_Newsletters+-+DN+Daily

    A new core architecture developed by ARM could cut the cost of a 32-bit microcontroller (MCU) to as little as 30 cents for applications ranging from touchscreens and motor controllers to stoves and refrigerators.

    Known as the Cortex M0+, the new core could potentially enable 32-bit MCUs to be cost competitive with 8- and 16-bit devices in certain situations.

    The new architecture builds on ARM’s existing Thumb instruction set, which was used to improve compiled code density when it came out more than a decade ago. In the case of the M0+, the Thumb scheme compacts the 32-bit instruction set, enabling an MCU to employ less on-board Flash and therefore hit a lower price point.

    “It’s like a 16-bit architecture from the perspective of external memory storage,” said Geoff Lees, vice president and general manager of Freescale’s industrial and multi-market MCU business. “But as soon as it’s in the core, it’s executing as 32-bit instructions.”

    Up to now, the lowest-priced 32-bit devices typically cost about 50 cents. If MCU makers are able to cut that to 30 cents, then the new technology could have a major impact on the makeup of the MCU market, said Massimini of Semico Research.

  24. Tomi Engdahl says:

    Analyst eyes Q3 2013 for Xbox 720 release
    http://www.reghardware.com/2012/03/23/xbox_720_to_launch_in_q3_2013_says_analyst/

    Microsoft’s next Xbox console will launch in Q3 2013 and, contrary to widespread rumours, will not be a download-only platform, one analyst reckons.

    “Although not yet confirmed by Microsoft, we believe the next generation Xbox console could launch in the fall of 2013,” said Doug Creutz, a soothsayer at Cowan & Company, a stockbroker, in a research note circulated this morning.

    The analyst reckons a digital-only Xbox would pose far too many risks to Microsoft’s market share, and any attempt to kill off the used games market would be damaging to the industry’s ecosystem.

    “We think it’s unlikely that there would be that [used-game blocking] next-gen console because the model simply hasn’t been proven,” he said, talking up the value of used games to the industry.

  25. Tomi Engdahl says:

    Has virtualization really ended all your worries?
    The very real limits of happiness
    http://www.theregister.co.uk/2012/03/23/infrastructure_virtualisation/

    OK, so you’ve virtualised a bunch of servers and saved yourself a bit of money on hardware. Life is a little easier because you no longer have to go through the server procurement and provisioning cycle quite so often to meet new requirements.

    But has anything fundamentally changed in the way you manage systems and deliver IT services?

    All of our research, including various Reg reader studies, suggests that for the majority it hasn’t. Server virtualisation initiatives are great for a one-shot payback, but once you have gone through your estate and consolidated what you can, you may not be hugely better off than before.

    Virtual servers need patching, monitoring, managing and troubleshooting, just like physical ones. In fact, some report that the administration and support challenge is actually more difficult because it is all too easy for virtual images to proliferate and create virtual server sprawl.

    Furthermore, many have been caught out by virtualisation’s knock-on impact on the rest of the infrastructure. We frequently hear stories of unanticipated network and storage bottlenecks, as well as of funds not being available for adding more capacity because the necessary upgrades were not budgeted for.

    The truth is that the real enablers of IT systems efficiency, effectiveness and flexibility are joined up management and automation.

    If you examine how the more successful IT departments differ from others, you usually find that they have paid a lot more attention to integration of management tools and processes, and taken steps to cut down on the manual work involved in routine administration.

    While it is easy to talk in principle about cleaning up the fragmented and disjointed mess of tools and processes currently in place, it is incredibly hard to pull off.

  26. Tomi says:

    ‘Intelligent systems’ poised to outsell PCs, smartphones
    IDC: ARM facing shrinking market share
    http://www.theregister.co.uk/2012/03/23/idc_intelligent_systems/

    Analyst house IDC predicts that the traditional embedded-systems market is reaching an inflection point where a new breed of intelligent devices will take over the market and drive the current fashionable terms du jour: Big Data and “the internet of things”.

    IDC defines intelligent systems as those that use a high-level operating system, connect to the internet, run native or cloud-based apps, and have the ability to process their own data. The analysts anticipate that by 2016, these smart systems will account for a third of the embedded processor market, but over two-thirds of its value, and their numbers will outpace the growth of either PCs or smartphones.

    Consumer goods such as vending machines and kiosks are going to be the key growth drivers, he said, with energy systems such as smart meters and solar-power management systems in the energy sector providing major growth. In the enterprise space, this means that a lot more data is going to be flowing that could be analyzed and used to refine business plans.

    The shift will also mean big changes for the British chip wizards at ARM. According to IDC’s figures, ARM had 71 per cent of the processors in embedded systems last year, but by 2016 this share will nearly halve to 38 per cent. By contrast, x86 systems, which currently have 8 per cent of the market, will grow to 41 per cent as manufacturers seek to put more grunt into their systems.

  27. Tomi Engdahl says:

    Why is IT Hiring so Hard?
    http://blogs.cio.com/careers/16899/why-it-hiring-so-hard

    CIOs want to hire IT generalists but admit to a large, financially damaging skills gap. Maybe IT leaders don’t know how to find the right staff.

    The latest data from Robert Half International shows that tech positions are harder to fill than those in accounting and finance, legal, advertising and marketing.

    Skills mismatch and the fast pace of technology change are part of it. Is anyone really a cloud architect yet? A tablet commerce expert? More than 90 percent of companies say there’s a gap between the technical skills their IT staffs possess and the skills their companies need.

    The conventional wisdom is that bosses hire people like themselves. Our State of the CIO research indicates that CIOs lack the very skills they now seek in staff members. Just 9 percent of the 596 CIOs in our latest survey spend time studying customer needs to identify commercial opportunities. Only 17 percent find ways to differentiate their companies against competitors. So maybe CIOs dither in filling open jobs because they don’t recognize the talent they need when they see it.

    Or maybe the problem is unrealistic expectations. It may be too soon to have your pick of cloud, mobile and social computing experts because enterprises have only begun to put in industrial strength systems that use those technologies for real business scenarios. There are relatively few people with that experience in the pool of the IT unemployed.

  28. Tomi Engdahl says:

    Wall Street Beat: Tech Bellwethers Offer View Into IT Trends
    HP, Oracle and Apple make moves that highlight industry shifts
    http://www.cio.com/article/702787/Wall_Street_Beat_Tech_Bellwethers_Offer_View_Into_IT_Trends

    Hewlett-Packard, Oracle and Apple sparked some of the biggest corporate financial news of the week, highlighting sector trends as IT edges toward what could be its best first quarter on the markets since the dot-com bust in 2000, despite some turbulence during the past few days.

    Tech stocks in general have been riding a wave of investor confidence in the economy, as unemployment claims in the U.S. decline and housing market data shows the beginnings of a recovery.

    Oracle’s quarterly report on Tuesday was an improvement over the prior quarter’s and highlighted the importance of cloud computing.
    Highlighting the company’s security features for cloud-based applications
    Oracle has made cloud-oriented acquisitions
    “It’s faux cloud, they’ve made some acquisitions to try to be cloud,”

    Hewlett-Packard Wednesday announced the merger of its Imaging and Printing Group and its Personal Systems Group into a new unit called the Printing and Personal Systems Group.
    “The real growth opportunity for HP and its shareholders is in services, software, and higher margin hardware (e.g., servers, cloud-based systems, networking, storage).”

    Apple CEO Tim Cook finally answered investor and shareholder calls to distribute dividends.
    Cook pointed to skyrocketing iPad sales as an indicator of its recent success, highlighting one of the more important hardware trends of the past year.
    “We believe the tablet market will surpass the PC market in size. It’s a matter of time,” Cook said.

  29. Tomi Engdahl says:

    Don’t let the cloud obscure your software’s performance
    Keeping an eye on SaaS
    http://www.theregister.co.uk/2012/03/26/saas_performance/

    Software as a service (SaaS) can be a great cost saver for companies willing to abandon their own hardware and software, but what happens if productivity leaves the building too?

    Giving control of your business applications to someone else can also mean losing control of performance.

    Application performance management systems that use locally installed software agents to check on things like server and network load are useful when you are running systems on your own premises, but how do you monitor performance when all of the equipment is running on someone else’s turf?

    “There are application monitoring solutions available today that allow the end-to-end response time of a transaction to be measured,”

    “This gives the organisation a means to calibrate the service quality of application performance without having to use any data supplied by the SaaS vendor.”

    Some systems approach this customer experience management problem by monitoring performance from a specific URL.
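
    In practice such a URL probe can be as simple as timing a request from the customer’s side and keeping the samples. A minimal Python sketch, assuming the requests library and a hypothetical status page at the SaaS provider:

        # Time end-to-end responses of a SaaS application from the customer's network.
        import time
        import requests

        URL = "https://app.example-saas.com/status"   # hypothetical endpoint

        samples = []
        for _ in range(10):
            start = time.time()
            resp = requests.get(URL, timeout=10)
            samples.append(time.time() - start)
            print("HTTP %d in %.3f s" % (resp.status_code, samples[-1]))
            time.sleep(5)   # space the probes out a little

        print("average %.3f s, worst %.3f s" % (sum(samples) / len(samples), max(samples)))

    Numbers like these settle the “is it slow on our side or theirs” argument, but they still say nothing about what is happening inside the provider’s own infrastructure.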

    A handy Ping or Traceroute command could soon put disputes about network problems on the customer’s side to rest. But how can you measure performance inside the SaaS provider’s infrastructure?

    “With difficulty,” says Clive Longbottom, service director at analyst Quocirca. “You need an agentless probe that can get past their firewalls and not be seen as an intruder.”

    That raises the question, what do you put in a contract? Service level agreements (SLAs) for SaaS implementations should offer more than just guaranteed uptime. Response times should figure in the equation.

    A responsible customer will work with the SaaS provider to plan for future growth.

    Understanding what parameters of application performance you are trying to monitor is crucial.

    In some SaaS scenarios, file backup and restore might be an important performance indicator

    Marsh has a sage piece of advice. “If you find yourself going down the cloud route and performance isn’t as expected, then make sure you have a strategy for backing out of the cloud and coming back on premise,” he says.

  30. Dulcilene says:

    i loved this text from you, helped me a lot. i am really grateful.http://www.simbolodamusica.com

  31. Tomi Engdahl says:

    Next Xbox No-Show: Why Microsoft’s Keeping Durango Under Wraps
    http://www.gamesindustry.biz/articles/2012-03-22-next-xbox-no-show-why-microsofts-keeping-durango-under-wraps

    Veteran journalist Chris Morris explains what a no-show for the next Xbox at E3 means

    “By pushing Durango’s unveiling back a year, Microsoft could find itself going head to head with Sony in a battle of features, even if the machines don’t hit shelves at the same time”

  32. Tomi Engdahl says:

    ARM-based Xbox ‘lite’ coming in 2013, Xbox 360 successor later, insider claims
    http://www.bgr.com/2012/03/22/arm-based-xbox-lite-coming-in-2013-xbox-360-successor-later-insider-claims/

    Microsoft had intended to unveil its next-generation Xbox console during the annual E3 conference this year according to a BGR source, but it now looks as though that won’t happen — the company has gone on record in stating, “we can confirm that there will be no talk of new Xbox hardware at E3 or anytime soon.”

    The Redmond-based company is currently working on not one but two Xbox consoles that will launch in the coming years.

    “My understanding is that we’ll see a Xbox device in late 2013 which does Arcade-style games & all the current & future media apps with Kinect (with near-mode),” MS Nerd wrote on Reddit while fielding user-submitted questions. “It will be an ARM-based platform price-competitive with the Apple TV (if you own a Kinect already).”

    A low-cost Xbox gaming console that works alongside the company’s popular Kinect motion and voice-based controller could better-position the company to combat the increasing popularity of casual mobile games while Microsoft continues to work on its Xbox 360 successor. Such a device could also feature deep Windows Phone integration.

    “At some point after that, we’ll see a Xbox Next, a true successor to the 360,” MS Nerd continued.

  33. Tomi Engdahl says:

    iPad Retina Display Key to Consumer Demand: Baird
    http://www.eweek.com/c/a/Mobile-and-Wireless/iPad-Retina-Display-Key-to-Consumer-Demand-Baird-201100/

    The high-resolution Retina display, a key feature of Apple’s latest iPad, is driving consumer demand for the tablet, according to a survey from Baird.

    Prospective buyers of Apple’s latest iPad tablet are mainly interested in the high-resolution Retina display new to the device, according to a survey from Baird. According to the results of the online survey, 24 percent of U.S. respondents plan to purchase the new iPad in the next three months, with 29 percent of international respondents planning to purchase it. When asked about reasons for purchasing the new iPad, 28 percent cited the Retina display as the top reason, followed by the A5X processor at 26 percent and Long Term Evolution (LTE) wireless capability at 17 percent.

    Slightly under half (48 percent) of the survey’s U.S. respondents said that they currently own a tablet. The iPad 2 was the most commonly owned device, followed by the Amazon Kindle Fire and the original iPad. Google Android-based tablets (excluding the Fire and Barnes & Noble Nook) had a 5 percent share. The international sample was somewhat less penetrated, and had no HP TouchPad or Nook representation.

    A quarter of the 59 Kindle Fire owners Baird surveyed said that they plan to purchase an iPad in the next three months. The report noted that the Kindle Fire was released in mid-November, marking a “fairly quick” turnaround.

  34. Tomi Engdahl says:

    Goldman Sachs in email muppet hunt
    ‘Toxic and destructive’ leak sparks grep binge
    http://www.theregister.co.uk/2012/03/27/goldman_sachs_email_audit/

    Spencer Allingham, technical director at IT optimisation specialist Condusiv Technologies, commented: “While investigating emails to tap into corporate culture will undoubtedly be revealing for the organisation, the sheer amount of work to recover past or deleted emails will be a vast drain on time and money if appropriate technology is not in place.

    “For many IT departments it is a constant struggle to find the budget to update systems and improve efficiency, and it is at times like these that poor infrastructures are exposed, and can cause reputational damage, even putting companies head to head with legislation, if the investigation is a legal requirement.”

    Allingham said that tighter financial regulations meant that email trawls like the one Goldman Sachs has been obliged to undertake are likely to become more commonplace in future. Failure to put a strategy in place that can accommodate such investigations could prove to be expensive if anything goes awry, he warned.

    “The recent climate of Big Data and virtualisation has only extrapolated the issue of controlling the data deluge common to most corporate environments. Data now varies in content, sensitivity, form and also in how it’s stored, but as investigations such as the Goldman Sachs case prove, speed is key and access to data needs to occur irrelevant of changes in the IT infrastructure.”
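
    The headline’s “grep binge” hints at what such an investigation boils down to in practice: a keyword trawl across archived mail. As a rough illustration only (the archive file name and search terms below are invented, not taken from the article), Python’s standard-library mailbox module can scan an mbox export for flagged terms:

        import mailbox

        # Hypothetical search terms and archive file, purely for illustration.
        TERMS = ("muppet", "toxic", "destructive")
        archive = mailbox.mbox("goldman_archive.mbox")

        for msg in archive:
            text = msg.as_string().lower()
            if any(term in text for term in TERMS):
                # Print enough metadata to hand the hit over to a reviewer.
                print(msg.get("Date"), msg.get("From"), msg.get("Subject"))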

    Reply
  35. Tomi Engdahl says:

    RightScale: Hybrid clouds on the rise
    http://www.theregister.co.uk/2012/03/27/rightscale_hybrid_clouds/

    “The hybrid cloud is a little bit like sex in high school,” explains RightScale CEO Michael Crandell. “Everybody is talking about it, but not everybody is doing it – except us.”

    Of course, when you sell a management service that runs atop a cloud that is used to manage a hodgepodge of different cloud fabrics, as RightScale does, you expect for your customers to be on the cutting edge of mixing and matching private and public cloudy infrastructure. So this is not really much of a surprise.

    The reasons why companies are deploying multiple clouds – even if they are incompatible – is a complicated issue, Crandell tells El Reg. But the fact that they feel they have to do so for performance, disaster recovery, latency, or cost reasons certainly plays into the hand of a company like RightScale, whose management tool is essentially a “super control freak” that rides atop the management APIs of the infrastructure cloud fabrics used to build public and private clouds, and tells them what to do.

    What is often the case is that big companies are now willing to try out a new application on a public cloud first – like Zynga does with new games it is unsure of – and then they move it to their private cloud when they have a better sense of what resources it will need.

    The RightScale service is not just a monitoring and deployment tool, but is used to automate the operations of virtual machine (and soon physical server) images out there in the clouds.

    The RightScale service can link into three cloud fabrics that are commonly used for building private clouds at this point: CloudStack from Citrix Systems, Eucalyptus from Eucalyptus Systems, and OpenStack, the open source alternative championed by NASA and Rackspace Hosting.

    On the traditional public cloud front, where companies have their own proprietary infrastructure cloud tools and controllers, RightScale supports Amazon Web Services, Datapipe, IDC Frontier (a subsidiary of Yahoo! Japan), Logicworks, Rackspace Cloud, and SoftLayer.

    Obviously, Microsoft’s Azure public cloud is not an infrastructure cloud, but a platform cloud, and hence does not belong on the list.
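
    RightScale’s control plane itself is proprietary, but the pattern described here (one management layer speaking to several providers’ APIs and issuing the same commands to each) can be sketched with the open-source Apache Libcloud library. This is only an illustration of the idea, not RightScale’s code; the credentials are placeholders and driver names can differ between Libcloud versions.

        from libcloud.compute.types import Provider
        from libcloud.compute.providers import get_driver

        # Placeholder credentials; each entry is (provider constant, constructor args).
        CLOUDS = [
            (Provider.EC2,       ("MY_AWS_KEY", "MY_AWS_SECRET")),
            (Provider.RACKSPACE, ("my_rackspace_user", "my_rackspace_api_key")),
        ]

        for provider, creds in CLOUDS:
            driver_cls = get_driver(provider)
            conn = driver_cls(*creds)
            # The same call works against every cloud fabric the library supports.
            for node in conn.list_nodes():
                print(provider, node.name, node.state)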

    Reply
  36. Tomi Engdahl says:

    What IT Managers Say to Get the CIO’s OK
    http://www.cio.com/article/702750/What_IT_Managers_Say_to_Get_the_CIO_s_OK?page=1&taxonomyId=3174

    When data center and facility managers meet with the CIO about new equipment, the conversations are rarely easy. The equipment they seek is often expensive, in the six- or seven-figure range, and justifying the expense can be challenging.

    Make the project real

    CIOs today “truly don’t understand from a technical perspective,” said one manager. Another said that “CIOs are less and less technical as time goes on.”

    Keep the presentation short

    One manager recommended summarizing an equipment or project request on one sheet of paper. The manager also suggested arriving at the meeting with a presentation that can be delivered within five minutes.

    Be diplomatic and educate

    Get to know the people in the finance department who work on IT purchasing. One manager said that whenever he installs a new piece of equipment, he invites people from finance to the data center to see it. Building those relationships may help sell future projects. “It makes it more tangible,” he said.

    Offer options

    One manager recommended going into the meeting with three options: a high-end, everything-we-want option; a this-will-get-the-job-done option; and a bare-bones option: “It will work, but it will be a struggle.”

    By giving the CIO options, said an advocate of this approach, it “gives them some control over the solution itself.”

    Reply
  37. Tomi Engdahl says:

    What system builders need to know about solid state drives
    Get the best from flash technology
    http://www.channelregister.co.uk/2012/03/28/enterprise_ssds/

    If you are building systems using solid state drives (SSDs), you need rock-solid reliability and performance – and you won’t get it from consumer-grade flash.

    But how do you know if the drives you choose are enterprise-grade? A supplier may say its SSD is enterprise quality but can you be sure this marketing claim is true? You need to understand the qualities of an enterprise-class SSD and check candidate drives so that the systems you build for your customers have a long, reliable life.

    There are four main attributes of an SSD that mark it out as truly enterprise-class: speed, endurance, data integrity and system builder friendliness.

    Enterprise-class single-level cell SSDs exhibit sequential read and write I/O bandwidth of 300MBps and 360MBps, with generally equal read and write speeds, and random read and write IOPS above 48,000 and 22,000 respectively. They will be able to do this for five years, which brings us to working life.

    An enterprise-class SSD should be expected to have, say, five years of life. It should also carry a formal endurance rating (the total amount of data that can be written to it), for example 14.6PB for an 800GB multi-level cell (MLC) SSD, which equates to more than 10 full drive-capacity writes a day.
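
    As a quick sanity check, the two endurance numbers line up: 14.6PB written over a five-year service life is almost exactly ten full-capacity writes per day for an 800GB drive (decimal units assumed).

        capacity_tb  = 0.8          # 800GB drive
        endurance_pb = 14.6         # rated lifetime writes
        days         = 5 * 365      # five-year service life

        full_drive_writes = endurance_pb * 1000 / capacity_tb    # 18,250 writes
        print(full_drive_writes / days)                           # 10.0 drive writes per day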

    Getting data on and off the SSD reliably, quickly and at a consistent rate are three excellent qualities – but correct data is equally necessary.

    Error checking and correction is vital for SSDs, as it is for hard disk drives. T10 protection information (PI) and I/O error detection codes (IOEDC) are other techniques used to ensure the integrity of data.
    When data is first written, cyclic redundancy check (CRC) data is added to it.
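
    As a minimal illustration of the principle (not the specific codes or on-media layout an SSD controller actually uses), a checksum computed when a block is written can be recomputed on every read to detect corruption; here with Python’s built-in CRC-32:

        import zlib

        def write_block(data: bytes):
            # Store the CRC alongside the data when it is first written.
            return data, zlib.crc32(data)

        def read_block(data: bytes, stored_crc: int) -> bytes:
            # Recompute on read; a mismatch means the block was corrupted at rest or in flight.
            if zlib.crc32(data) != stored_crc:
                raise IOError("CRC mismatch: data integrity error")
            return data

        block, crc = write_block(b"enterprise payload")
        read_block(block, crc)                        # passes silently
        # read_block(block[:-1] + b"X", crc)          # would raise IOError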

    You need to be able to qualify the SSDs and your supplier should have rigorous quality control procedures so that you develop and ship your systems with consistently reliable media. Look for more than a million-and-a-half hours between failures and an annual failure rate of less than 0.55 per cent.
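
    Those two reliability figures are linked: under the usual constant-failure-rate assumption, the annual failure rate follows directly from the mean time between failures, and an MTBF of roughly 1.6 million hours is what it takes to get under 0.55 per cent a year.

        import math

        HOURS_PER_YEAR = 8760

        def annual_failure_rate(mtbf_hours: float) -> float:
            # AFR = 1 - exp(-t / MTBF), with t equal to one year of powered-on hours.
            return 1 - math.exp(-HOURS_PER_YEAR / mtbf_hours)

        print(f"{annual_failure_rate(1_500_000):.2%}")   # 0.58%
        print(f"{annual_failure_rate(1_600_000):.2%}")   # 0.55%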

    Reply
  38. Tomi Engdahl says:

    Nearly 1 Billion Smart Connected Devices Shipped in 2011 with Shipments Expected to Double by 2016, According to IDC
    http://www.idc.com/getdoc.jsp?containerId=prUS23398412

    The universe of smart connected devices, including PCs, media tablets, and smartphones, saw shipments of more than 916 million units and revenues surpassing $489 billion in 2011, according to the International Data Corporation (IDC).

    Looking ahead, unit shipments for smart connected devices should top 1.1 billion worldwide in 2012. By 2016, IDC predicts shipments will reach 1.84 billion units

    In terms of platforms, IDC expects a relatively dramatic shift between 2011 and 2016, with the once-dominant Windows on x86 platform, consisting of PCs running the Windows operating system on any x86-compatible CPU, slipping from a leading 35.9% share in 2011 down to 25.1% in 2016. The number of Android-based devices running on ARM CPUs, on the other hand, will grow modestly from 29.4% share in 2011 to a market-leading 31.1% share in 2016. Meanwhile, iOS-based devices will grow from 14.6% share in 2011 to 17.3% in 2016.

    “Android’s growth is tied directly to the propagation of lower-priced devices,”

    “Smartphone growth will be driven by Asia/Pacific countries, especially China, where mobile operators are subsidizing the purchase of 3G smartphones, thus increasing the total addressable market.”

    Reply
  39. Tomi Engdahl says:

    Munich Has Saved €4M So Far After Switch To Linux
    http://linux.slashdot.org/story/12/03/29/0025239/munich-has-saved-4m-so-far-after-switch-to-linux

    “Mayor Ude reported today that the city of Munich has saved €4 million so far (Google translation of German original) by switching its IT infrastructure from Windows NT and Office to Linux and OpenOffice. At the same time, the number of trouble tickets decreased from 70 to 46 per month. Savings were €2.8M from software licensing and €1.2M from hardware because demands are lower for Linux compared to Windows 7.”

    Reply
  40. Tomi Engdahl says:

    ARM-Android to outship Windows-Anything by 2016
    http://www.theregister.co.uk/2012/03/28/idc_pc_tablet_smartphone_smackdown/

    Windows might be on the rise in the world of embedded systems, but if IDC’s prognostications are right, then Windows is about to get its kernel handed to it with the rise of Android on what the market researcher dubs “smart connected devices.”

    By IDC’s reckoning, makers of PCs, tablets, and smartphones shipped some 916 million units of machinery in 2011, raking in an astounding $489bn in moolah.

    El Reg suspects that more and more of us on planet Earth have all three types of devices. (I know that I do, and in fact, I have a workstation for the office, a netbook for the road, an iPad for amusement and browsing, and a Droid for a phone, a map, and browsing when I am really bored.)

    Add all these devices up, and IDC reckons that the pile of shiny new gear shipped out to consumers and businesses will be 1.1 billion units tall in 2012 and will be 1.84 billion units tall by 2016 – twice the current ship rate and nearly three times the rate set in 2010.

    Those ARM-Android machines will be the largest class of machines shipping in 2016, at least by number. Tablets and smartphones running Apple’s iOS will more than double from just under 134 million devices in 2011 (14.6 per cent of the machines sold) to 318 million machines (17.3 per cent of the pile) by 2016.
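
    The unit counts and market shares quoted from IDC are internally consistent, which is a useful cross-check when the same forecast is cited across several articles:

        # Back out the market totals from the iOS figures quoted above.
        ios_2011_units, ios_2011_share = 134e6, 0.146
        ios_2016_units, ios_2016_share = 318e6, 0.173

        print(ios_2011_units / ios_2011_share / 1e6)   # ~918, matching the ~916M total for 2011
        print(ios_2016_units / ios_2016_share / 1e6)   # ~1838, matching the 1.84B forecast for 2016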

    Reply
  41. Tomi Engdahl says:

    ‘Intelligent systems’ poised to outsell PCs, smartphones
    http://www.theregister.co.uk/2012/03/23/idc_intelligent_systems/

    IDC defines intelligent systems as those that use a high-level operating system, connect to the internet, run native or cloud-based apps, and have the ability to process their own data. The analysts anticipate that by 2016, these smart systems will account for a third of the embedded processor market, but over two-thirds of its value, and their numbers will outpace the growth of either PCs or smartphones.

    Reply
  42. Tomi Engdahl says:

    The Next PlayStation is Called Orbis, Sources Say. Here are the Details.
    http://kotaku.com/5896996

    While the official reveal of Sony’s next home console could still be months away, if not longer, Kotaku has today learned some important details concerning the PlayStation 3’s successor.

    For one, the console’s name—or at least its codename/working title—is apparently Orbis. And it’s being planned for release in time for the 2013 holiday season.

    Our main source supplied some basic specs for the console, but as the future is always in motion, bear in mind these could easily change between now and the Orbis’ retail release. Still, if you’d like to know what developers are being told to plan for now, here you go.

    AMD x64 CPU
    AMD Southern Islands GPU

    The PS4’s GPU in particular, we’re told, will be capable of displaying Orbis games at a resolution of up to 4096×2160, which is far in excess of the needs of most current HDTV sets. It’ll also be capable of playing 3D games in 1080p (the PS3 could only safely manage 3D at 720p).
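
    To put that figure in perspective, 4096×2160 is a little over four times the pixel count of a 1080p HDTV panel:

        orbis_pixels = 4096 * 2160          # 8,847,360
        hdtv_pixels  = 1920 * 1080          # 2,073,600
        print(orbis_pixels / hdtv_pixels)   # ~4.27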

    Reply
  43. Tomi Engdahl says:

    Apples Are Growing in American Homes
    http://www.cnbc.com/id/46857053

    Half of all U.S. households own at least one Apple product, according to CNBC’s All-America Economic survey.

    That’s more than 55 million homes with at least one iPhone, iPad, iPod or Mac computer.

    Homes that own at least one Apple product own an average of three. Overall, the average household has 1.6 Apple devices, with almost one-quarter planning to buy at least one more in the next year.

    “It’s a fantastic business model — the more of our products you own, the more likely you are to buy more,”

    “Planned obsolescence has always been a part of the technology industry’s sales model, but Apple has taken it to a whole new level.”

    Our survey shows Apple buyers tend to be male, college-educated, and younger.

    The poll of 836 Americans was conducted by landline and cellphone from March 19 to 22 and has a margin of error of plus or minus 3.4 percent.

    Reply
  44. Tomi Engdahl says:

    While magnetic tape is about as boring as technology gets, it’s still the cheapest storage medium and among the fastest in sequential reads and writes. And, with the release of LTO-6 with 8TB cartridges around the corner and the relatively new open linear tape file system (LTFS) being embraced by movie and television markets, tape is taking on a new life.

    Source: http://hardware.slashdot.org/story/12/03/29/1926240/after-60-years-tape-reinserts-itself

    Reply
  45. Tomi Engdahl says:

    How to Be Ready for Big Data
    http://www.cio.com/article/702467/How_to_Be_Ready_for_Big_Data?page=1&taxonomyId=3002

    Big Data is coming, but for most organizations it’s three-to-five years away. That doesn’t mean you shouldn’t prepare now. Analyzing Big Data will require reference information like that provided by a semantic data model. And once you mine the data, you need to secure it.

    In the next three to five years, we will see a widening gap between companies that understand and exploit Big Data and companies that are aware of it but don’t know what to do about it, says Kalyan Viswanathan, global head of information management with Tata Consultancy Services’ (TCS) global consulting group. The companies that succeed in turning Big Data into actionable information will have a clear competitive advantage, Viswanathan says.

    “Today, most companies are aware of Big Data,” he says. “There’s a lot written about it. There are conferences about it. Awareness has become quite pervasive. But if you look at actually exploiting Big Data, I would say we’re at the very beginning stages of it.”

    One of the keys to taking unstructured data—audio, video, images, unstructured text, events, tweets, wikis, forums and blogs—and extracting useful data from it is to create a semantic data model as a layer that sits on top of your data stores and helps you make sense of everything.
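
    What a semantic layer over disparate stores looks like in the small can be sketched with the open-source rdflib package; the vocabulary and records below are invented purely to show the pattern of mapping different sources into one graph that can be queried uniformly.

        from rdflib import Graph, Literal, Namespace

        EX = Namespace("http://example.org/")       # hypothetical vocabulary
        g = Graph()

        # Facts extracted from two different sources land in one shared model.
        g.add((EX.tweet42, EX.mentions, EX.ProductX))          # from a tweet
        g.add((EX.ticket7, EX.concerns, EX.ProductX))          # from a support system
        g.add((EX.ProductX, EX.label, Literal("Product X")))

        # One query now spans both sources through the shared model.
        q = """
        SELECT ?doc WHERE {
            ?doc ?rel <http://example.org/ProductX> .
        }
        """
        for row in g.query(q):
            print(row[0])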

    “We have to put data together from disparate sources and make sense of it,” says David Saul, chief scientist at State Street, a financial services provider that serves global institutional investors.

    But collecting all this data and making it more accessible also means organizations need to be serious about securing it. And that requires thinking about security architecture from the beginning, Saul says.

    “I believe the biggest mistake that most people make with security is they leave thinking about it until the very end, until they’ve done everything else: architecture, design and, in some cases, development,” Saul says. “That is always a mistake.”

    Saul says that State Street has implemented an enterprise security framework in which every piece of data in its stores includes with it the kind of credentials required to access that data.

    “By doing that, we get better security,” he says. “We get much finer control. We have the ability to do reporting to satisfy audit requirements. Every piece of data is considered an asset. Part of that asset is who’s entitled to look at it, who’s entitled to change it, who’s entitled to delete it, etc. Combine that with encryption, and if someone does break in and has free rein throughout the organization, once they get to the data, there’s still another protection that keeps them from getting access to the data and the context.”
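
    A toy version of that approach (the names and structure here are invented, not State Street’s actual framework) pairs every stored item with its own entitlement lists and an encrypted payload, so getting at the store alone is not enough; this sketch uses the third-party cryptography package.

        from cryptography.fernet import Fernet

        key = Fernet.generate_key()       # in practice, held in a key-management system
        box = Fernet(key)

        # Every piece of data carries its own entitlements alongside the ciphertext.
        record = {
            "payload": box.encrypt(b"client positions, 2012-03-30"),
            "readers": {"auditor", "risk-desk"},
            "writers": {"risk-desk"},
        }

        def read(record, user):
            if user not in record["readers"]:
                raise PermissionError(user + " is not entitled to view this record")
            return box.decrypt(record["payload"])

        print(read(record, "auditor"))    # entitled: plaintext comes back
        # read(record, "intruder")        # raises PermissionError even with access to the store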

    Gazzang’s Warnock agrees, noting that companies that collect and leverage Big Data very quickly find that they have what Forrester calls ‘toxic data’ on their hands. For instance, imagine a wireless company that is collecting machine data—who’s logged onto which towers, how long they’re online, how much data they’re using, whether they’re moving or staying still—that can be used to provide insight into user behavior.

    “Downstream analytics is the reason you gather all this data in the first place,” he says. But organizations should then follow best practices by encrypting it.

    “Over time, just as it’s best practice to protect the perimeter with firewalls, it will be best practice to encrypt data at rest,” he says.

    Reply
  46. Tomi Engdahl says:

    Sales show tablets and Ultrabooks not rivals
    http://www.reghardware.com/2012/03/30/gfk_numbers_show_table_punters_want_tablets_laptop_buyers_want_ultrabooks/

    Intel’s hope that it can take on and beat the tablet with skinny laptops – Ultrabooks – may prove unfounded, in the UK at least.

    Tablets took a greater share of over-the-counter sales in February compared with February 2011, while notebooks’ share of sales fell.

    So folk after an Ultrabook won’t generally be considering a tablet as an alternative, and vice versa.

    That explains, perhaps, the rising share of desktop computers. To Ashford that suggests punters are keener to buy the products they need and not simply grab the cheapest kit.

    “What manufacturers and retailers will need to achieve in 2012 will be matching new technologies and form-factors to consumers’ needs,” he said, “rather than a race to the bottom in terms of pricing.”

    Reply
  47. Tomi Engdahl says:

    Is it time to take the fight into the Clouds?
    The years to come seem waste of breath, to server minders
    http://www.theregister.co.uk/2012/03/30/rackspace_hosting_server_study/

    Rackspace Hosting has a vested interest in convincing IT shops that they don’t need to own and operate their own servers and that they should leave it to the professionals with “fanatical support.” And it looks like many companies are getting grumpy enough to give clouds a whirl.

    If you drill down into the 2012 report, which is called Cloud Reality Check, you won’t find much variation in the time spent troubleshooting and managing servers as opposed to doing other important activities like talking to vendors and suppliers about new stuff.

    The most damning data to come from the survey is that companies seem to be getting worse at doing server capacity planning. Across all survey respondents, 18 per cent admitted that they bought too many servers, wasting money, and 45 per cent said they didn’t buy enough capacity, which means end users get grumpy because there is not enough iron to drive the apps.

    If this survey is any guide, then cloud providers have their work cut out for them. Companies are more nervous about security, reliability, real cost savings, and transparent pricing when it comes to hosting and cloud computing than they were back in 2009. The positions are hardening between those who will jump to clouds and those who won’t because they are not sure the payoff is real and the pain can be reduced.

    Reply
  48. Tomi Engdahl says:

    Exclusive: Google, Amazon, and Microsoft Swarm China for Network Gear
    http://www.wired.com/wiredenterprise/2012/03/google-microsoft-network-gear/all/1

    Google, Amazon, Microsoft, and Facebook buy more networking hardware than practically anyone else on earth. After all, these are the giants of the internet. But at the same time, they’re buying less and less gear from Cisco, HP, Juniper, and the rest of the world’s largest networking vendors. It’s an irony that could lead to a major shift in the worldwide hardware market.

    “My biggest customers were these big data center [companies], so I know all of them pretty well,” Liao says. “They all have different ways of solving their networking problems, but they have all moved away from big networking companies like Cisco or Juniper or [the Dell-owned] Force10.”

    The move away from U.S. network equipment stalwarts is one of the best-kept secrets in Silicon Valley. Some web giants consider their networking hardware strategy a competitive advantage that must be hidden from rivals.

    J.R. Rivers is one of the arms dealers. He runs a company called Cumulus Networks that helps the giants of the web — and other outfits — buy their networking hardware directly from “original design manufacturers,” or ODMs, in China and Taiwan.

    “When Google looked at their network, they need high-bandwidth connections between their servers and they wanted to be able to manage things — at scale,” Rivers says. “With the traditional enterprise networking vendors, they just couldn’t get there. The cost was too high, and the systems were too closed to be manageable on a network of that size.”

    So Google drew up its own designs — working alongside manufacturers in Taiwan and China — and cut the Ciscos and the Force10s out of the equation. The Ciscos and the Force10s build their gear with many of those same manufacturers. Google removed the middlemen.

    The search giant does much the same with its servers, buying custom-built machines straight from Asia rather than going through traditional sellers such as Dell and HP.

    Reply
  49. Tomi Engdahl says:

    Nearly 1 Billion Smart Connected Devices Shipped in 2011 with Shipments Expected to Double by 2016, According to IDC
    http://www.idc.com/getdoc.jsp?containerId=prUS23398412

    The universe of smart connected devices, including PCs, media tablets, and smartphones, saw shipments of more than 916 million units and revenues surpassing $489 billion in 2011, according to the International Data Corporation (IDC).

    “Whether it’s consumers looking for a phone that can tap into several robust ‘app’ ecosystems, businesses looking at deploying tablet devices into their environments, or educational institutions working to update their school’s computer labs, smart, connected, compute-capable devices are playing an increasingly important role in nearly every individual’s life,” said Bob O’Donnell, vice president, Clients and Displays at IDC.

    In terms of platforms, IDC expects a relatively dramatic shift between 2011 and 2016, with the once-dominant Windows on x86 platform, consisting of PCs running the Windows operating system on any x86-compatible CPU, slipping from a leading 35.9% share in 2011 down to 25.1% in 2016. The number of Android-based devices running on ARM CPUs, on the other hand, will grow modestly from 29.4% share in 2011 to a market-leading 31.1% share in 2016. Meanwhile, iOS-based devices will grow from 14.6% share in 2011 to 17.3% in 2016.

    “Android’s growth is tied directly to the propagation of lower-priced devices,”

    “Smartphone growth will be driven by Asia/Pacific countries, especially China, where mobile operators are subsidizing the purchase of 3G smartphones, thus increasing the total addressable market. In many if not all instances, the smartphone will be the primary connection to the Internet,”

    Reply
