The ARM processor became more and more popular during 2012. Power and Integration—ARM Making More Inroads into More Designs. It’s about power—low power; almost no power. A huge and burgeoning market is opening for devices that are handheld and mobile, have rich graphics, deliver 32-bit multicore compute power, include Wi-Fi, web and often 4G connectivity, and that can last up to ten hours on a battery charge. The most obvious among these are smartphones and tablets, but there is also an increasing number of industrial and military devices that fall into this category.
The rivalry between ARM and Intel in this arena is predictably intense because, try as it might, Intel has not been able to bring the power consumption of its Atom CPUs down to the level of ARM-based designs (Atom is typically in the 1-4 watt range, while a single ARM Cortex-A9 core is in the 250 mW range). The ARM’s East unimpressed with Medfield, design wins article tells that Warren East, CEO of processor technology licensor ARM Holdings plc (Cambridge, England), is unimpressed by the announcements made by chip giant Intel about the low-power Medfield system-chip and its design wins. On the other hand, Android will run better on our chips, says Intel. Watch what happens in this competition.
Windows-on-ARM Spells End of Wintel article tells that brokerage house Nomura Equity Research forecasts that the emerging partnership between Microsoft and ARM will likely end the Windows-Intel duopoly. The long-term consequence for the world’s largest chip maker will likely be an exit from the tablet market as ARM makes inroads in notebook computers. As ARM is surely going to keep pointing out to everyone, they don’t have to beat Intel’s raw performance to make a big splash in this market, because for these kinds of devices, speed isn’t everything, and their promised power consumption advantage will surely be a major selling point.
Windows 8 Release Expected in 2012 article says that Windows 8 will be with us in 2012, according to Microsoft roadmaps. Microsoft is still hinting at an October Windows 8 release date. It remains to be seen what the ramifications of Windows 8, which is supposed to run on either the x86 or ARM architecture, will be. Windows on ARM will not be terribly successful, says one analyst, but it remains to be seen whether he is right. The ARM-based chip vendors that Microsoft is working with (TI, Nvidia, Qualcomm) are currently focused on mobile devices (smartphones, tablets, etc.), because this is where the biggest perceived advantages of ARM-based chips lie, and they do not seem to be actively working on PC designs.
Engineering Windows 8 for mobile networks is under way. The Windows 8 Mobile Broadband Enhancements Detailed article tells that using mobile broadband in Windows 8 will no longer require specific drivers and third-party software. This is thanks to the new Mobile Broadband Interface Model (MBIM) standard, which hardware makers are reportedly already beginning to adopt, and a generic driver in Windows 8 that can interface with any chip supporting that standard. Windows will automatically detect which carrier the device is associated with and download any available mobile broadband app from the Windows Store. MBIM 1.0 is a USB-based protocol for host and device connectivity for desktops, laptops, tablets and mobile devices. The specification supports multiple generations of GSM- and CDMA-based 3G and 4G packet data services, including the recent LTE technology.
Consumerization of IT is a hot trend that continued through 2012. Uh-oh, PC: half of computing device sales are mobile. The Mobile App Usage Further Dominates Web, Spurred by Facebook article tells that the era of mobile computing, catalyzed by Apple and Google, is driving one of the largest shifts in consumer behavior of the last forty years. Impressively, its rate of adoption is outpacing both the PC revolution of the 1980s and the Internet boom of the 1990s. By the end of 2012, Flurry estimates that the cumulative number of iOS and Android devices activated will surge past 1 billion, making the rate of iOS and Android smart device adoption more than four times faster than that of personal computers (over 800 million PCs were sold between 1981 and 2000). Smartphones and tablets come with broadband connectivity out of the box. Bring-your-own-device is becoming accepted business practice.
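As a rough sanity check on Flurry’s “more than four times faster” figure, the numbers above can be compared directly. The five-year activation window for iOS/Android devices (roughly 2008-2012) is an assumption, not a figure from the article:

```python
# Rough sanity check of Flurry's "more than four times faster" claim.
# Figures from the article: ~1 billion iOS/Android activations, and
# 800 million PCs sold between 1981 and 2000. The five-year window for
# smart-device activations is an assumption.

smart_devices = 1_000_000_000
smart_years = 5                      # assumed activation window

pcs = 800_000_000
pc_years = 20                        # 1981-2000, per the article

smart_rate = smart_devices / smart_years   # devices per year
pc_rate = pcs / pc_years

print(f"Smart devices: {smart_rate / 1e6:.0f} M/year")   # 200 M/year
print(f"PCs:           {pc_rate / 1e6:.0f} M/year")      # 40 M/year
print(f"Ratio:         {smart_rate / pc_rate:.1f}x")     # 5.0x
```

Under those assumptions the ratio comes out at 5x, so “more than four times faster” is consistent.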
Mobile UIs: It’s developers vs. users article tells that the increased emphasis on distinctive smartphone UIs means even more headaches for cross-platform mobile developers. Whose UI will be the winner? Native apps trump the mobile Web. The increased emphasis on specialized mobile user interface guidelines casts new light on the debate over Web apps versus native development, too.
The Cloud is Not Just for Techies Anymore article tells that cloud computing has achieved mainstream status. So we demand more from it. That’s because our needs and expectations for a mainstream technology and an experimental technology differ. Once we depend on a technology to run our businesses, we demand minute-by-minute reliability and performance.
The Cloud security is no oxymoron article estimates that in 2013 over $148 billion will be spent on cloud computing. Companies large and small are using the cloud to conduct business and store critical information. The cloud is now mainstream. The paradigm of cloud computing requires cloud consumers to extend their trust boundaries beyond their current network and infrastructure to encompass a cloud provider. There are three primary areas of cloud security that apply to almost any cloud implementation: authentication, encryption, and network access control. If you are dealing with those issues in software design, read the Rugged Software Manifesto and the Rugged Software Development presentation.
Enterprise IT’s power shift threatens server-huggers article tells that as more developers take on the task of building, deploying, and running applications on infrastructure outsourced to Amazon and others, traditional roles of system administration and IT operations will morph considerably or evaporate.
Explosion in “Big Data” Causing Data Center Crunch article tells that global business has been caught off-guard by the recent explosion in data volumes and is trying to cope with short-term fixes such as buying in data centre capacity. Oracle also found that the number of businesses looking to build new data centres within the next two years has risen. Data centre capacity and data volumes should be expected to go up, and this drives data centre capacity building. Most players active in the “Big Data” field seem to plan to use the Apache Hadoop framework for the distributed processing of large data sets across clusters of computers. At least EMC, Microsoft, IBM, Oracle, Informatica, HP, Dell and Cloudera are using Hadoop.
Cloud storage has been a very popular topic lately for handling large amounts of data. The benefits have been widely touted, but now we can also see the risks being realized. The Did the Feds Just Kill the Cloud Storage Model? article claims that Megaupload-type shutdowns and the Patriot Act are killing interest in cloud storage. Many innocent Megaupload users have had their data taken away from them. The MegaUpload seizure shows how personal files hosted on remote servers operated by a third party can easily be caught up in a government raid targeted at digital pirates. In the wake of the Megaupload crackdown, fear is forcing similar sites to shutter their sharing services. If you use any of these cloud storage sites to store or distribute your own non-infringing files, you are wise to have backups elsewhere, because they may be next on the DOJ’s copyright hit list.
Did the Feds Just Kill the Cloud Storage Model? article tells that worries have been steadily growing among European IT leaders that the USA Patriot Act would give the U.S. government unfettered access to their data if stored on the cloud servers of American providers. Escaping the grasp of the Patriot Act may be more difficult than the marketing suggests. “You have to fence yourself off and make sure that neither you or your cloud service provider has any operations in the United States”, “otherwise you’re vulnerable to U.S. jurisdiction.” And the cloud computing model is built on the argument data can and should reside anywhere around the world, freely passing between borders.
Data centers to cut LAN cord? article mentions that 60GHz wireless links are being tested in data centers to ease east-west traffic jams. According to a recent article in The New York Times, data center and networking techies are playing around with 60GHz wireless networking for short-haul links to give rack-to-rack communications some extra bandwidth for when the east-west traffic goes a bit wild. The University of Washington and Microsoft Research published a paper at the Association for Computing Machinery’s SIGCOMM 2011 conference late last year about their tests of 60GHz wireless links in the data center. Their research used prototype links that bear some resemblance to the point-to-point, high-bandwidth technology known as WiGig (Wireless Gigabit), which among other things is being proposed as a means to support wireless links between Blu-ray players and TVs, replacing HDMI cables (Wilocity Demonstrates 60 GHz WiGig (Draft 802.11ad) Chipset at CES). The 60 GHz band is suitable for indoor, high-bandwidth use in information technology. There are still many places for physical wires, though: the wired connections used in a data center are highly reliable, so “why introduce variability in a mission-critical situation?”
820 Comments
Tomi says:
NVIDIA and AMD Launch New High-End Workstation, Virtualization, and HPC GPUs
http://hardware.slashdot.org/story/12/11/13/014241/nvidia-and-amd-launch-new-high-end-workstation-virtualization-and-hpc-gpus
“Nvidia is taking the wraps off a new GPU targeted at HPC and as expected, it’s a monster. The Nvidia K20, based on the GK110 GPU, weighs in at 7.1B transistors, double the previous gen GK104’s 3.54B.”
“Meanwhile, AMD has announced a new FirePro graphics card at SC12 today, and it’s aimed at server workloads and data center deployment.”
“On paper, AMD’s new FirePro S10000 is a serious beast. Single and double-precision rates at 5.9 TFLOPS and 1.48 TFLOPS respectively are higher than anything from Intel or Nvidia”
“The S10000 is aimed at the virtualization market with its dual-GPUs on a single-card”
Comments:
Right now they are all too expensive, and consume too much juice.
Server virtualization doesn’t really need this – running web servers or databases or name servers, which are all essentially fancy timesharing.
But “Desktop Virtualization” emulates your entire desktop as a virtual machine on a shared server, graphics and all, and just ships the rendered screens back to your desktop, accessible from anywhere, with RDP or VNC or whatever, kind of like a clumsy version of X Windows except you get to do full-scale graphics acceleration at the server farm instead of at your desktop. The mainframe IT crowd like it, because the PC on your desk can be dumb and low-powered, and the server back in the server farm they get to maintain can be big and fancy, and they can have better control over it than over your desktop, don’t need to keep every bit of software up to date on everybody’s remote PC, and it’s generally easier to manage.
Those 8 TFLOPS would have landed it somewhere at the top of the Top 500 supercomputer performance list of November 2011 [top500.org].
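For what it’s worth, the figures quoted in the story and comments above can be cross-checked with a couple of lines of arithmetic:

```python
# Quick arithmetic on the figures quoted above (all from the article).
s10000_sp = 5.9      # TFLOPS, single precision
s10000_dp = 1.48     # TFLOPS, double precision
print(f"S10000 SP:DP ratio: {s10000_sp / s10000_dp:.1f}:1")   # 4.0:1

gk110_transistors = 7.1e9     # Nvidia K20 (GK110)
gk104_transistors = 3.54e9    # previous generation (GK104)
print(f"GK110 vs GK104: {gk110_transistors / gk104_transistors:.2f}x")   # 2.01x
```

So the S10000’s single-to-double-precision ratio is roughly 4:1, and the GK110 does indeed double the GK104’s transistor count.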
Tomi Engdahl says:
Exclusive: AMD hires bank to explore options – sources
http://www.reuters.com/article/2012/11/13/us-amd-jpmorgan-idUSBRE8AC14Z20121113
Advanced Micro Devices has hired JPMorgan Chase & Co to explore options, which could include a sale, as the chipmaker struggles to find a role in an industry increasingly focused on mobile and away from traditional PCs, according to three sources familiar with the situation.
Sources told Reuters on Tuesday that an outright sale of the company is not a priority, and other options for AMD could include a sale of its portfolio of patents.
AMD said in an email to Reuters, “AMD’s board and management believe that the strategy the company is currently pursuing to drive long-term growth by leveraging AMD’s highly-differentiated technology assets is the right approach to enhance shareholder value. AMD is not actively pursuing a sale of the company or significant assets at this time.”
Tomi Engdahl says:
“PC sales decline will not stop”
PC sales have become very sluggish. Some people have been waiting for the new Windows to jazz up the situation, but not everyone shares that faith in Windows 8’s potency. A Barclays analyst has presented PC manufacturers with a depressing forecast: sales will shrink from now on, year after year.
Next year, sales are forecast to fall 4 per cent below this year’s poor numbers. Reitzes has revised his previous forecast, which predicted a one per cent decline.
What, then, has changed the market for the worse? Reitzes names four reasons: “The general economy is weak, there is confusion over Windows 8, tablets cannibalize PC sales and machine life cycles are being stretched out for longer.”
Windows 8 and ultrabooks have baffled the market in a way that has exacerbated the plunge.
Source: http://www.tietokone.fi/uutiset/pc_myynnin_alamaki_ei_lopu
Tomi Engdahl says:
Is IT Outsourcing a Dying Concept?
http://www.cio.com/article/721159/Is_IT_Outsourcing_a_Dying_Concept_
Has the concept of IT outsourcing outlived its effectiveness? KPMG’s Cliff Justice says the process that the term was invented to describe has evolved considerably and finding value in IT service providers is taking on new meaning.
According to a recent survey by outsourcing analyst firm HfS Research, 63 percent of IT leaders would like to drop the term “outsourcing” to describe the IT services provided to them by third-parties and 68 percent of IT service providers want to do away with the designation.
Some of that can be attributed to the negative connotations associated with the word “outsourcing,”
But it could also be that the term has outlived its effectiveness. “Like many invented terms, the process it describes has evolved considerably from what it was initially,” says Cliff Justice, partner in KPMG’s Shared Services and Outsourcing Advisory.
In his report earlier this year, “The Death of Outsourcing,” Justice argues that the use of third-party IT service providers has evolved from a simple “lift-and-shift” back-office cost-cutting exercise into a complex ecosystem of smaller deals, higher expectations and more business-centric services.
Tomi Engdahl says:
Windows 8 Euro PC sales SHOCKER: Results actually not bad
‘Real opportunity for MS here’ says gobsmacked analyst
http://www.channelregister.co.uk/2012/11/14/bechtle_results/
Windows 8 sent the PC market’s heartbeat up a tick or two in the period either side of launch but the pulse was still far from racing.
Then again no one expected it to be, save maybe for OEMs, which simply crossed fingers and closed their eyes tightly in the hope of an uplift in sales instead of the downward or flatlining curves of recent years.
“Secondly, despite the uncertain economy, PC sales through IT distribution in the two weeks prior to launch and including the week itself actually rose 7.8 per cent [year-on-year],” said Davies.
Roughly four-fifths of the machines shipped included a Windows 8 64-bit version. Less than 1 per cent of the systems sold were made up by Windows Pro. Windows RT made up 2.5 per cent of sales.
Businesses don’t generally adopt a new OS until about 18 months down the line following a period of testing and when the glitches have been ironed out.
Tomi Engdahl says:
Microsoft updates its Windows Embedded roadmap; Embedded 8 Handheld is alive
http://www.zdnet.com/microsoft-updates-its-windows-embedded-roadmap-embedded-8-handheld-is-alive-7000007405/
Summary: Microsoft has provided yet another update to its Windows Embedded roadmap. Many products have been renamed. And a new version of Embedded Handheld is coming.
Windows Embedded 8 Handheld — about which “more information will be available in early 2013” — is going to be based on Windows Phone 8 technologies, says Microsoft’s new press release. This is a product for the ruggedized/enterprise handheld device market.
What else is on the roadmap?
Windows Embedded 8 Standard, which is based on the Windows 8 core code. A release preview is available now; general availability is slated for March 2013.
Windows Embedded 8 Pro, which also is based on the Windows 8 core. This is the new name for the product formerly known as Windows Embedded 8 Enterprise. General availability is scheduled for March 2013.
Windows Embedded 8 Industry is the renamed Windows Embedded POSReady (with POS standing for point of sale). There will be both a community technology preview test build and details on timing coming in January 2013.
Windows Embedded Compact 2013 is the new name for the product that until now seemed destined to be known as Windows Embedded 8 Compact. Microsoft is acknowledging publicly that it will be generally available in the second quarter of 2013. (Recently, Microsoft officials indicated the coming Embedded Compact release would ship in the first quarter of 2013, a slip from the previous “latter half of 2012” ship target. So it seems it has slipped a bit again.)
Windows Embedded 8 Automotive. Microsoft’s not saying much on this beyond “more information will be available in early 2013.” Microsoft is working with “preselected partners” on this product, which will “be based on Windows 8 technologies.”
Until Windows Phone 8, Microsoft had used Windows Embedded Compact (and its predecessor Windows CE) as the core for its mobile operating-system platform. With Windows Phone 8, Microsoft dropped Embedded Compact and replaced it with the Windows NT core.
Tomi Engdahl says:
Report: 4 suppliers account for half of all containerized data center shipments
http://www.cablinginstall.com/articles/2012/10/ims-containerized-data-center-report.html
A new report from IMS Research reveals that, in the emerging market for containerized data centers, just a few companies hold a significant share. Half of all shipments estimated for 2012 were found by IMS to come from just four suppliers, while 80 percent of all shipments are said to be held by nine suppliers. The remaining 20 percent of the market is largely supplied by local integrators and electrical contractors, according to the firm’s latest research.
The first group is IT companies, which include Dell, HP and IBM. These companies were some of the earliest entrants to the market and specialize in selling IT hardware.
The second group, data center infrastructure (DCI) companies, includes Eaton, Emerson and Schneider Electric.
The final group, electrical contractors and integrators, includes a greater assortment of companies.
Tomi Engdahl says:
HP’s Todd Bradley: Surface is no competition to us
http://www.citeworld.com/business/21072/todd-bradley?page=3
TB: First, I wouldn’t say there’s competition from them. I’d hardly call Surface competition.
CITEworld: But the desktop environment is evolving. The PC is dead, right?
TB: You guys have been writing about how the PC has been dead for 20 years. You and your colleagues have. Desktops are going to morph depending on where you are. All-in-ones in China are clearly a significant piece of the Chinese market. The usage models as you look at these emerging markets, as excited as we are going to get about ultra-mobile, the billion people in rural China still want PCs that have DVD players in them. It’s all about usage.
All-in-ones will be a big driver of usage. All-in-ones with touch, or we’ve created a very thin all-in-one with a pad that goes with it to give you all the touch sensation without having to touch the screen.
Tomi Engdahl says:
94 per cent of the world’s 500 fastest supercomputers run Linux – only three computers run Windows
Of the world’s 500 most powerful supercomputers, 469 run on a Linux operating system.
After Linux, the most popular operating system was Unix, which was found on 20 supercomputers.
The world’s most powerful supercomputer, the Titan, is located in the U.S. Department of Energy’s Oak Ridge laboratory. It reached 17.59 petaflops of computing power in benchmark tests. Theoretically the Titan is capable of 20 petaflops.
Source: http://www.tietoviikko.fi/kaikki_uutiset/94+prosenttia+maailman+500+nopeimmasta+supertietokoneesta+kayttaa+linuxia++vain+kolme+tietokonetta+windowsia/a856735?s=r&wtm=tietoviikko/-16112012&
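The quoted percentages follow directly from the raw counts, and the Titan’s benchmark-to-peak ratio is a standard efficiency measure. A quick check using only the figures given above:

```python
# Check the quoted shares and efficiency against the raw numbers above.
linux_systems = 469
unix_systems = 20
total_systems = 500
print(f"Linux share: {linux_systems / total_systems:.1%}")   # 93.8%

# Titan: measured Linpack benchmark result vs. theoretical peak
rmax = 17.59    # petaflops, benchmark result
rpeak = 20.0    # petaflops, theoretical capability
print(f"Titan efficiency: {rmax / rpeak:.1%}")               # 87.9%
```

So 469 of 500 rounds to the headline’s 94 per cent, and the Titan delivers just under 88 per cent of its theoretical peak in the benchmark.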
Tomi Engdahl says:
WiGig crew to cut DisplayPort cables
60GHz high-speed wireless tech to support screens
http://www.theregister.co.uk/2012/11/16/wigig_and_vesa_partner_for_fresh_wireless_displayport_push/
VESA, the organisation behind DisplayPort and past monitor connection technologies, must have fallen a little out of love with wireless connectivity. Two years after entering into an alliance with WiGig, the 60GHz band high-speed WLAN standard, it this week felt the need to renew its vows.
Highlighting the two institutions’ “renewed collaboration”, the WiGig Alliance and VESA have announced the formation of a working group focused on ensuring DisplayPort signalling streams smoothly over WiGig’s ultrawideband links, a key step, they both believe, in eliminating the cables that currently tie different bits of computer kit together.
The upshot will be the formal certification of WiGig for DisplayPort compatibility, but neither body could say when this might take place.
WiGig is capable of maintaining data transfer rates of up to 7Gb/s in the 60GHz band.
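A back-of-the-envelope calculation suggests why 7Gb/s is plausible for display-cable replacement: an uncompressed 1080p60 stream, assuming 24 bits per pixel and ignoring blanking and protocol overhead, needs well under half of that budget.

```python
# Back-of-the-envelope: does an uncompressed 1080p60 video stream fit
# in WiGig's 7 Gb/s? (24 bits per pixel assumed; blanking intervals and
# protocol overhead are ignored for simplicity.)
width, height = 1920, 1080
fps = 60
bits_per_pixel = 24      # 8 bits per RGB channel

video_gbps = width * height * fps * bits_per_pixel / 1e9
print(f"Uncompressed 1080p60: {video_gbps:.2f} Gb/s")   # 2.99 Gb/s
print(f"Fits in 7 Gb/s: {video_gbps < 7.0}")            # True
```

Higher resolutions or deeper color would eat into the margin, but for a single 1080p display the raw bandwidth is there.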
Tomi Engdahl says:
HP big cheese: Is the cloud even proper IT? Whoops, I said it
http://www.channelregister.co.uk/2012/11/16/msp_channel_fight/
In an industry where the number of vendors and distributors seems to be shrinking by the week, Canalys’ Channels Forum offered dealers and value-added resellers something that felt like choice.
IBM launched a bells-and-whistles managed-service provider (MSP) programme just weeks before the event. The tech titan promised to help any reseller – or other type of business – reposition as a cloud-enabled MSP and remain relevant to the customers of today. Big Blue claimed that for every dollar spent through traditional vendors, another is spent through MSPs.
Presumably if IBM had had its way, the event would have had -as-a-service appended to its title.
Unfortunately for Big Blue, HP was first up at the Canalys conference. Any dealers not willing to wholeheartedly inhale on the cloud would have found HP’s non-pitch on the role of managed-service providers comforting.
A presentation by Peter Ryan, an HP senior vice-president and the general manager for EMEA enterprise, was sceptical, if not scathing, about just how different today’s IT environment and its requirements are from those of last century, last decade or even last year.
It’s a sobering thought, but one that underlies HP’s approach that the ability to deal with legacy IT is as important as a promise to get your accounts department into the cloud within 24 hours.
While MSP-like companies will become important, “traditional delivery models will remain dominant for a long time to come”.
The question is when do you switch from sweating the legacy business you’ve got to either moving with the tide of history or getting out of the game altogether.
Tomi Engdahl says:
28 per cent of the UK adult population regret not having chosen a career in ICT.
The biggest component of IT’s attractiveness was seen to be the money it pays (44 per cent of respondents).
Close second and third were the intellectual challenge it provides (41 per cent) and the wide range of tasks (30 per cent).
Of those who would like to work in the IT sector but do not, 45 per cent cited the lack of a science degree as the reason. One fifth of the respondents thought the field is too competitive, and 13 per cent said the sector is too male-dominated.
Source: http://www.tietoviikko.fi/kaikki_uutiset/yli+neljannes+briteista+katuu+uravalintaansa+quotolisi+pitanyt+valita+italaquot/a856848?s=r&wtm=tietoviikko/-16112012&
Tomi Engdahl says:
Intel Corporation today announced that the company’s president and CEO, Paul Otellini, has decided to retire as an officer and director at the company’s annual stockholders’ meeting in May.
Sources:
http://slashdot.org/story/12/11/20/0054200/intel-ceo-paul-otellini-retiring
http://www.businesswire.com/news/home/20121119005831/en/Intel-CEO-Paul-Otellini-Retire
Tomi Engdahl says:
Intel roadmap leak shows quad-core Atoms for 2014
Shift to 22nm and faster clock speeds for Bay Trail-T
http://www.theregister.co.uk/2012/11/19/intel_atom_roadmap_leak/
Intel plans to release its first 22nm quad-core Atom system on a chip at the start of 2014, according to a leaked roadmap showing the new processor’s specifications.
German tech blog Mobilegeeks.de got hold of the roadmap, which shows the current Clover Trail systems being replaced with the 7th-generation Atom, dubbed the Bay Trail-T line. The new chips will run at up to 2.1GHz, compared to Clover Trail’s 1.5GHz, giving a 50-60 per cent boost to general computing performance.
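Note that the clock bump alone accounts for only part of that quoted 50-60 per cent boost, a quick check:

```python
# The clock bump alone does not explain the quoted 50-60 per cent boost.
bay_trail_ghz = 2.1
clover_trail_ghz = 1.5
clock_gain = bay_trail_ghz / clover_trail_ghz - 1
print(f"Clock-speed gain alone: {clock_gain:.0%}")   # 40%
# The remainder would have to come from the new 22nm microarchitecture
# and the move from dual-core to quad-core.
```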
Tomi Engdahl says:
Design guru: Windows 8 is ‘a monster’ and ‘a tortured soul’
Aaagh: ‘Dozens of carnival barkers yelling at you’
http://www.theregister.co.uk/2012/11/19/windows_8_disappointment/
US usability guru Jakob Nielsen has rubbished “disappointing” Windows 8, savaging the Microsoft OS’s signature Live Tiles and its complicated gestures.
“Windows 8 on mobile devices and tablets is akin to Dr Jekyll: a tortured soul hoping for redemption.
“On a regular PC, Windows 8 is Mr Hyde: a monster that terrorises poor office workers and strangles their productivity.”
The new more complicated gestures also presented problems, requiring users to be highly accurate and learn a bunch of new sequences off by heart.
“The slightest mistake in any of these steps gives you a different result,”
“Icons are flat, monochromatic, and coarsely simplified,” Nielsen added. “[W]e often saw users either not relating to the icons or simply not understanding them.”
“The underlying problem is the idea of recycling a single software UI for two very different classes of hardware devices. It would have been much better to have two different designs: one for mobile and tablets, and one for the PC,” Nielsen concluded.
“I understand why Microsoft likes the marketing message of ‘One Windows, Everywhere’. But this strategy is wrong for users.”
Tomi Engdahl says:
Review: Intel’s Next Unit of Computing
A taste of the future—or maybe just a nifty little PC
http://techreport.com/review/23888/review-intel-next-unit-of-computing
What’s this? A real and reasonably capable PC stuffed into a box that will fit in the palm of your hand?
We’ve heard such claims before, but they’ve never really panned out. Usually such systems have been based on low-power Atom processors or the like, demanding massive performance trade-offs to fit into a small space. Now that most of the world is convinced the PC is doomed and mobile devices are taking over, though, I suppose we should start paying closer attention.
It doesn’t hurt that Intel, the traditional provider of PC performance, has produced this sleek little 4″ by 4″ box and given it a totally-not-pretentious name: the Next Unit of Computing.
Intel calls it NUC, for short, which is incredibly cute.
The firm’s ambitions for this form factor are far more serious. Most of the talk about the NUC mentions obvious applications for a teeny PC, such as digital signage and home theater systems. There’s an undercurrent of suggestion, however, that boxes such as this one may be the future of the PC.
Still, the concept is compelling, instantly spurring the question: what would you do with a little PC of this size? That question comes into sharp focus when you realize that these NUC boxes are on the cusp of broad availability in early December at a pretty darned reasonable price.
Intel anticipates the price to be somewhere around $300-320.
The Cliff’s Notes version is simple: Intel should have called this an Ultrabox, in an obvious play on the Ultrabook name. The guts of the NUC are essentially the same as an Ultrabook’s, right down to the 17W dual-core Ivy Bridge processor.
the system comes with a 65W laptop-style power brick that plugs into the back of the enclosure
Add the NUC’s likely price and the various components, including the 64GB SSD we mentioned above, and the total price tag rings up at just about $450, without shipping. That’s pretty reasonable, all things considered—better than a poke in the eye with a sharp stick, a sensation that’s probably similar to what you’d feel upon forking over 600 bucks for a Mac Mini.
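The roughly $450 total checks out against the quoted $300-320 box price. In the tally below, the component split beyond the box and the 64GB SSD is an illustrative assumption, not a price list from the review:

```python
# Tallying the review's build cost. The $300-320 box price and the ~$450
# total are from the article; the component split is an illustrative
# assumption, not a quoted price list.
nuc_box = 310        # midpoint of Intel's anticipated $300-320
ssd_64gb = 60        # assumed street price for the 64GB mSATA SSD
ram_and_wifi = 80    # assumed: SO-DIMM memory plus a Wi-Fi card
total = nuc_box + ssd_64gb + ram_and_wifi
print(f"Estimated total: ${total}")   # $450
```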
Since the NUC is essentially a full-fledged PC, the possible applications for it are nearly endless.
One notable limitation for any NUC application is the system’s port selection.
Since the NUC is largely a pre-fab system with only a few cards to install, one wouldn’t expect to run into the sort of frustrating problems that sometimes plague DIY PC builds. Unfortunately, my NUC experience was marred by a pretty major issue that I’m still trying to resolve.
NUC’s speed isn’t going to blow your hair back, but this thing is also more than half as fast as a modern desktop system, which is pretty good.
The NUC’s lineage as an Ultrabook-class system means it’s free of some of the worries you’d have about PCs based on Atom-class processors. Rarely does it feel sluggish.
CPU utilization consistently stays well under 50% while streaming HD video via Netflix (which, by the way, doesn’t invoke the lock-up problems in our NUC that file copies can.) Yes, the chip’s IGP includes a hardware H.264 video decode block, but with this class of CPU power on tap, you’re not under threat of choppy video playback should software fail to make use of it.
Tomi Engdahl says:
Vendors must break code of silence on software’s biggest FAILS
Sick of marketing? Let the devs speak
http://www.theregister.co.uk/2012/11/12/open_and_shut/
Developers love to complain about vendor infomercials at conferences and in press articles, and rightly so. No one wants to have marketing pitches shoved down their throats. They’re boring and quite possibly counterproductive.
And yet so much of our media revert to vendor content because developers, so determined not to be marketed to, are too often constrained by their employers from offering interesting details on the technology they use.
This isn’t just bad for the conference organizers and media publications that are forced to sell out to vendors/advertisers to pay their bills. It’s also bad for developers, who are starved for meaningful information from their peers. It’s like an open-source community in which nobody is allowed to share. It just doesn’t work.
“Early adopters will pioneer use cases and fast followers will replicate those use cases. Once more mainstream companies see these use cases and are comfortable that the vanguard has derisked the implementation, the use case goes big time.”
But now imagine an industry in which legal departments prohibit IT professionals from sharing information about their use cases. That’s the industry we live in, one in which the release of information must be negotiated by vendors like Cloudera, Hortonworks and MapR.
Tomi Engdahl says:
Oprah Winfrey too late to save Microsoft’s Windows 8
Signs are that Redmond has produced a turkey this Xmas
http://www.theregister.co.uk/2012/11/20/windows_8_sales_dissappointing/
Early signs are showing that hopes for the overnight success of Microsoft’s Windows 8 are unrealistic, although the tech giant appears to have bet the farm on the brand new operating system with the shiny new interface.
Microsoft blogger Paul Thurrott has quoted one unnamed company source as saying early sales of Windows 8 PCs are disappointing and below expectations.
The question is: is this because of unrealistic expectations set by Microsoft, the dynamics of an ailing PC market, the deeper problems of supply and demand, or did Microsoft just get things wrong?
Certainly, Microsoft has been guilty of setting unrealistic expectations on new versions of Windows over the years.
But it’s a question of scale, and whether expectations were initially set too high – either deliberately by Microsoft’s sales managers or using the soft power of PR and gadget-happy news sites and bloggers hungry for something to finally stick to the smugs of Cupertino. This would seem to be the real problem.
Online tech retailer NewEgg yesterday described sales of Windows 8 as “slow going”, but didn’t provide figures. Merle McIntosh, senior vice president of NewEgg product management, said NewEgg had hoped for an “explosion”.
With PC makers holding back for CES in January, and with Intel-based Surfaces coming next year, all eyes will now be on the full year’s results for 2013.
Tomi Engdahl says:
Engineers: It’s BYOD for Life
http://www.designnews.com/author.asp?section_id=1386&doc_id=254426&cid=NL_Newsletters+-+DN+Daily
Not Adam Smith, Tocqueville, or even Milton Friedman himself could have predicted how electronic consumerism would transform the workplace.
The affordability of smart devices has made them an invaluable part of the employee repertoire. The number of people with smartphones will only increase. For this reason, the concept of “bring your own device” (BYOD) is causing every type of business to do some restructuring for the sake of security.
The trend is clear. The consulting firm Ovum surveyed 4,000 full-time employees and found that 70 percent use their own smart devices to access corporate information.
The fact that employees never disconnect their devices is where the extra time is picked up. MW stated that the 92 percent of people who remain online after hours and on vacation are “content… [with] the job flexibility.”
Privacy and security are the main concerns with the BYOD movement. These studies have also exposed that IT departments are ineffective, oblivious, or simply ignoring the fact that all these extra network connections pose a security risk for malware infecting their systems or data being lost or stolen. But, BYOD helps the bottom line (profit) and is so well received by workers that others are calling for IT departments to simply focus on developing adequate strategies and policies that promote each business’s goals and offer interoperability between devices instead of policing workers. Furthermore, the BYOD trend is helping some businesses expand as they launch services and products to assess and secure companies’ networks.
What’s next?
All of these studies, products, and efforts make it clear that BYOD is here to stay. So the question remains, how will this affect the evolution of engineering alongside BYOD? The bigger and most progressive companies are fully embracing the trend, and building support around the idea. Will everyone else follow suit?
Policies to ensure safe and secure use of personal devices will allow engineers to access more corporate data, which has obvious benefits when working in the era where time means everything.
Times are changing. The consumer world and the corporate business structure are inevitably merging. It will be interesting to see how companies react to these shifts of accessibility and redistribution of authority, as each member of the workforce integrates their work and technology with their daily lives, year after year. Likewise, the evolution of engineering practices and how businesses use them will continue to transform the workforce and economy.
Tomi Engdahl says:
World’s oldest original digital computer is turned back on after 61 years
http://www.extremetech.com/extreme/140953-worlds-oldest-original-digital-computer-is-turned-back-on-after-61-years
The world’s oldest, original, still-working digital computer has been unveiled at the National Museum of Computing in Bletchley Park, the home of the United Kingdom’s Second World War encryption and codebreaking efforts, where, among other luminaries, Alan Turing and co famously broke the German Enigma cipher.
The computer, originally called Harwell but now called the Wolverhampton Instrument for Teaching Computing from Harwell (WITCH), was originally powered up in 1951.
Over the last three years, WITCH has been lovingly restored to its original glory — and now it’s on display at Bletchley, powered up and working its way through some original 1950s computer programs.
Tomi Engdahl says:
Autonomy to HP: bollocks
‘We have been ambushed’
http://www.theregister.co.uk/2012/11/20/autonomy_refutes_hp_allegations/
The HP-Autonomy spat has, predictably, turned into a high-profile slanging match, with Autonomy founder Mike Lynch firing back at Meg Whitman via the Wall Street Journal.
Hewlett-Packard has sensationally written down the value of the software company it acquired last year by nearly $US9 billion, and blamed Autonomy for “misrepresenting” its value during the Leo Apotheker-era $US10.7 billion acquisition.
“It does seem to be coincident with them releasing the worst set of results in their 70 year company history”, he notes. Reloading yet again, Lynch then listed some of HP’s prior performance in acquisitions: EDS and Palm, both of which resulted in write-downs.
He blames internal HP politics for the disaster, saying that in the inter-divisional war between hardware and software, “Autonomy was at odds with the divisions that were in power”.
Tomi Engdahl says:
The Linux Foundation’s UEFI Secure Boot Pre-Bootloader Delayed
http://linux.slashdot.org/story/12/11/21/0543203/the-linux-foundations-uefi-secure-boot-pre-bootloader-delayed
“The Linux Foundation’s plans for releasing a signed pre-bootloader that will enable users to install Linux alongside Windows 8 systems with UEFI have been reportedly delayed.”
“Linux kernel maintainer James Bottomley disclosed that he has been having rather bizarre experiences with Microsoft sysdev centre.”
“I’m not sure how long it will take MS to get their act together”
Tomi Engdahl says:
H.P.’s Misstep Shows the Risk in Big Ideas
http://www.nytimes.com/2012/11/22/technology/hps-misstep-shows-risk-in-the-push-for-big-ideas.html?pagewanted=all&_r=0
When Hewlett-Packard spent roughly $10 billion on the software company Autonomy, it thought it was buying a slice of the future — investing in the hot trend of big data. But the deal turned out to be a debacle, and not only because H.P. wrote down $5 billion of the purchase.
The ill-fated marriage of the companies is a lesson for H.P. and other older technology giants as they throw billions at supposedly game-changing acquisitions, trying to gain a foothold in the future.
In that future, smartphones and tablets, connected to cloud-computing data centers, are the essential tools of work and play. Companies rent software over the air, rather than buying it with expensive maintenance contracts.
These forces threaten older businesses, like H.P.’s traditional personal computer and data storage products. Other companies, like Oracle, Microsoft and Cisco, also face pressure. They are all trying to buy the future — and have the cash to do it.
Tomi Engdahl says:
The Red Flags That Were Obvious — To Some — In the HP-Autonomy Deal
http://allthingsd.com/20121121/the-red-flags-that-were-obvious-to-some-in-the-hp-autonomy-deal/
The one big, glaring question that will probably never be fully answered in the still-developing HP-Autonomy scandal is this: If buying Autonomy has so obviously turned out to be a bad deal 15 months after it was first announced, why weren’t there any red flags that could have warned HP before the deal was consummated?
It turns out there were, and a few smart short-sellers spotted them.
There are others who saw troubles at Autonomy that should have occurred to HP’s due diligence team.
Software is usually sold for cash up front, and then any further payments tend to come from either service and support on an ongoing basis or, as in the case of companies like Workday or Salesforce.com, as subscription revenue. In either case, the company selling it can’t report the ongoing payments as income because it hasn’t been paid yet, and since there are no tangible goods involved, the risk of not being paid is higher.
Tomi Engdahl says:
Supercomputers face growing resilience problems
http://www.computerworld.com.au/article/442703/supercomputers_face_growing_resilience_problems/
As the number of components in large supercomputers grows, so does the possibility of component failure
Today’s high-performance computing (HPC) systems can have 100,000 nodes or more — with each node built from multiple components of memory, processors, buses and other circuitry. Statistically speaking, all these components will fail at some point, and they halt operations when they do so, said David Fiala, a Ph.D student at the North Carolina State University, during a talk at SC12.
The problem is not a new one, of course. When Lawrence Livermore National Laboratory’s 600-node ASCI (Accelerated Strategic Computing Initiative) White supercomputer went online in 2001, it had a mean time between failures (MTBF) of only five hours, thanks in part to component failures. Later tuning efforts had improved ASCI White’s MTBF to 55 hours, Fiala said.
But as the number of supercomputer nodes grows, so will the problem. “Something has to be done about this. It will get worse as we move to exascale,”
Today’s techniques for dealing with system failure may not scale very well, Fiala said. He cited checkpointing, in which a running program is temporarily halted and its state is saved to disk. Should the program then crash, the system is able to restart the job from the last checkpoint.
The problem with checkpointing, according to Fiala, is that as the number of nodes grows, the amount of system overhead needed to do checkpointing grows as well — and grows at an exponential rate. On a 100,000-node supercomputer, for example, only about 35 percent of the activity will be involved in conducting work. The rest will be taken up by checkpointing and — should a system fail — recovery operations, Fiala estimated.
Basically, the researchers’ approach consists of running multiple copies, or “clones” of a program, simultaneously and then comparing the answers. The software, called RedMPI, is run in conjunction with the Message Passing Interface (MPI), a library for splitting running applications across multiple servers so the different parts of the program can be executed in parallel.
RedMPI intercepts and copies every MPI message that an application sends, and sends copies of the message to the clone (or clones) of the program. If different clones calculate different answers, then the numbers can be recalculated on the fly, which will save time and resources from running the entire program again.
“Implementing redundancy is not expensive. It may be high in the number of core counts that are needed, but it avoids the need for rewrites with checkpoint restarts,” Fiala said. “The alternative is, of course, to simply rerun jobs until you think you have the right answer.”
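The redundancy scheme described above can be sketched in a few lines. This is an illustrative toy, not the actual RedMPI code: a real deployment runs the clones on separate nodes and compares intercepted MPI messages, while here the clones simply run sequentially and majority-vote.

```python
# Toy sketch of the clone-and-compare idea behind RedMPI (not the real
# library): run "clones" of a computation, compare their answers, and
# recompute on disagreement instead of restarting the whole job.
from collections import Counter

def run_with_redundancy(task, args, clones=3):
    """Run `task` several times and majority-vote on the result."""
    results = [task(*args) for _ in range(clones)]
    winner, votes = Counter(results).most_common(1)[0]
    if votes < clones:
        # Disagreement detected: recompute just this step on the fly,
        # rather than rerunning the entire program from a checkpoint.
        winner = task(*args)
    return winner

def dot(xs, ys):
    """Stand-in for one unit of parallel numerical work."""
    return sum(x * y for x, y in zip(xs, ys))

print(run_with_redundancy(dot, ([1, 2, 3], [4, 5, 6])))  # 32
```

The cost is the extra core count for the clones, as Fiala notes, traded against the exponential overhead of checkpoint/restart at scale.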
Tomi Engdahl says:
Mozilla quietly kills Firefox 64-bit for Windows, despite an estimated 50% of testers using it
http://thenextweb.com/apps/2012/11/22/mozilla-quietly-kills-firefox-64-bit-for-windows-despite-an-alleged-50-of-testers-using-it/?fromcat=all
Mozilla Engineering Manager Benjamin Smedberg last Friday quietly posted a thread over on the Google Groups mozilla.dev.planning discussion board titled “Turning off win64 builds.” By Wednesday, Smedberg had declared that the 64-bit version of Firefox for Windows would never see the light of day, unless Mozilla decides to revert the decision at some point in the future.
Mozilla is still making a troubling decision here. The company may end up alienating a good chunk of its enthusiasts, or at least those that haven’t yet fled to Google’s Chrome.
Indeed, the decision has resulted in a huge uproar from users of 64-bit Firefox for Windows, as noted on a Hacker News thread pointing to another discussion board. A few users have even shown off screenshots of Firefox using huge amounts of memory, specifically more than a 32-bit Windows process can address.
Firefox users are thus left without much of an option. They can switch to OS X or Linux, both of which have full 64-bit versions of Firefox. Windows 64-bit users can meanwhile only consider Internet Explorer and Opera, since neither Chrome nor Safari offers a 64-bit flavor.
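For context, the address-space ceiling those screenshots are bumping into is simple arithmetic:

```python
# A 32-bit pointer can address at most 2**32 bytes (4 GiB); on 32-bit
# Windows a user-mode process normally gets only half of that by default.
ADDRESS_BITS = 32
max_addressable = 2 ** ADDRESS_BITS      # bytes in the full address space
print(max_addressable // 2 ** 30)        # 4 (GiB)
print(max_addressable // 2 ** 31)        # 2 (GiB default user-mode share)
```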
Tomi Engdahl says:
Black Friday: Catalyst for Tablets to Far Surpass Notebook Shipments in North America
http://www.displaysearchblog.com/2012/11/black-friday-catalyst-for-tablets-to-far-surpass-notebook-shipments-in-north-america/
NPD DisplaySearch’s fourth quarter North American tablet shipment forecast is 21.5 million units, far exceeding the 14.6 million notebooks and mini-notes that are expected to ship in the same period. Starting in 2013 in North America, tablet shipments are expected to exceed notebook shipments on an annual basis for the first time; 80 million tablets versus 63.8 million notebooks. On a worldwide basis, tablet shipments aren’t expected to out-ship notebooks until 2015, when 275.9 million tablets are projected to ship, as compared to 270 million notebooks.
Several factors are enabling the North American tablet-friendly environment. First, over 70% of U.S. households have PCs, putting the U.S. among the countries with the highest PC penetration rates in the world, and making new PC purchases less necessary for consumers. Second, consumer preference has shifted from notebooks to tablets in the U.S., as we have seen from rapid growth of tablet shipments (200% Y/Y or 38.2 million units in 2011, and 46%, or a projected 56 million, in 2012) and the rapid slowdown of notebooks (-2% Y/Y or 54.9 million units in 2011, and 2%, or a projected 55.9 million units, in 2012) over the last two years. Third, major players (i.e. Amazon, Google) started, focused, or emphasized their tablet efforts in the U.S.
Tomi Engdahl says:
Windows 8 — Disappointing Usability for Both Novice and Power Users
http://www.useit.com/alertbox/windows-8.html
Summary:
Hidden features, reduced discoverability, cognitive overhead from dual environments, and reduced power from a single-window UI and low information density. Too bad.
Tomi Engdahl says:
LiMux Project Has Saved Munich €10m So Far
http://linux.slashdot.org/story/12/11/23/1732222/limux-project-has-saved-munich-10m-so-far
“Over €10 million (approximately £8 million or $12.8 million) has been saved by the city of Munich, thanks to its development and use of the city’s own Linux platform. The calculation compares the current overall cost of the LiMux migration with that of two technologically equivalent Windows scenarios: Windows with Microsoft Office and Windows with OpenOffice.”
“The study is based on around 11,000 migrated workplaces within Munich’s city administration as well as 15,000 desktops that are equipped with an open source office suite”
Tomi Engdahl says:
Linux brings over €10 million savings for Munich
http://www.h-online.com/open/news/item/Linux-brings-over-EUR10-million-savings-for-Munich-1755802.html
The calculation of savings follows a question by the city council’s independent Free Voters (Freie Wähler) group, which led to Munich’s municipal LiMux project presenting a comparative budget calculation at the meeting of the city council’s IT committee on Wednesday. The calculation compares the current overall cost of the LiMux migration with that of two technologically equivalent Windows scenarios: Windows with Microsoft Office and Windows with OpenOffice. Reportedly, savings amount to over €10 million.
The study is based on around 11,000 migrated workplaces within Munich’s city administration as well as 15,000 desktops that are equipped with an open source office suite. The comparison with Windows assumes that Windows systems must be on the same technological level; this would, for example, mean that they would have been upgraded to Windows 7 at the end of 2011. Project parameters such as scope, duration, applied methodology or external support were assumed to be the same in all scenarios.
According to the calculation, Windows with Microsoft Office would so far have incurred about €11.6 million (£9.3 million) in operating-system-related costs. Microsoft Office and its upgrades would have cost €4.2 million (£3.3 million), and the Windows system about €2.6 million (£2.1 million).
According to the comparison, the cost of the LiMux scenario was only a fraction of this. The project management says that by September 2012, the project had incurred only €270,000 (£218,000) because it involved no licence fees and no hardware upgrades were necessary as a result of software upgrades. The costs were exclusively generated by migrating applications.
Costs that are not related to the operating system, such as staff and training costs, were identically listed at around €22 million (£17 million) in all three scenarios. Overall, the project says that Windows and Microsoft Office would have cost just over €34 million (£27 million), while Windows with Open Office would have cost about €30 million (£24 million). The LiMux scenario, on the other hand, has reportedly cost less than €23 million (£18 million).
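A quick back-of-the-envelope check of the totals quoted above. The figures are the rounded whole-million amounts reported in the article, so this is a sanity check, not official accounting:

```python
# Reported overall project totals per scenario, in EUR millions.
scenarios = {
    "Windows + MS Office":  34.0,  # "just over EUR 34 million"
    "Windows + OpenOffice": 30.0,  # "about EUR 30 million"
    "LiMux":                23.0,  # "less than EUR 23 million"
}

common_costs = 22.0  # staff and training, identical in all three scenarios

baseline = scenarios["Windows + MS Office"]
for name, total in scenarios.items():
    saving = baseline - total
    print(f"{name}: {total:.0f}m total, saving {saving:.0f}m vs MS Office")
```

The LiMux saving of roughly €11 million versus the Microsoft Office scenario is consistent with the headline claim of "over €10 million".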
Tomi Engdahl says:
Scientists See Promise in Deep-Learning Programs
http://www.nytimes.com/2012/11/24/science/scientists-see-advances-in-deep-learning-a-part-of-artificial-intelligence.html?pagewanted=all
Using an artificial intelligence technique inspired by theories about how the brain recognizes patterns, technology companies are reporting startling gains in fields as diverse as computer vision, speech recognition and the identification of promising new molecules for designing drugs.
The advances have led to widespread enthusiasm among researchers who design software to perform human activities like seeing, listening and thinking.
Tomi Engdahl says:
Applied Micro shows off X-Gene ARM server prototypes
http://www.theregister.co.uk/2012/11/26/applied_micro_x_gene_prototypes/
Applied Micro Circuits is not yet shipping its first X-Gene ARM-based processor aimed at servers, and it is going to be a while yet before it can get the processors into the field. But because there is so much at stake, Applied Micro can’t afford to be left out of any conversations about ARM Holding’s attack on the data center. The reason? It has invested very heavily (at least relative to its size) in this X-Gene project.
And so the top brass at Applied Micro have to keep talking. First, back at the Hot Chips 24 conference in August, Paramesh Gopi, the company’s president and CEO, and Gaurav Singh, vice president of engineering for ARM and PowerPC processors, gave out details of the forthcoming eight-core, 64-bit X-Gene server-on-a-chip processor and its on-chip coherent network. Gopi even whipped out a sample server card just to show what a card using the X-Gene processor might look like.
The plan is for early shipments of X-Gene processors and systems using them towards the end of 2013. And as for Calxeda and its partners – Advanced Micro Devices, Cavium, and possibly Samsung Electronics – when it comes to 64-bit processors, Gopi says they will not be able to catch up.
“Nobody is going to get anything synthesizable very quickly,” asserts Gopi. “We will have a year lead on them, minimum.”
Calxeda has four-core, 32-bit ECX-1000 processors, announced last November and shipping earlier this year
Around this time in 2014, Calxeda will have its “Lago” 64-bit ECX processors in the field, and it is shooting for a fabric interconnect that will be able to span up to 1 million nodes over the next couple of years.
Applied Micro says that Dell is working on a prototype machine with six X-Gene processors, eight cores each, and 192GB of memory. Each sled in the system has one Gigabit Ethernet link, two 10GE links, and a dozen SATA connectors.
Tomi Engdahl says:
Intel’s Haswell Could Be Last Interchangeable Desktop Microprocessors – Report.
http://www.xbitlabs.com/news/cpu/display/20121122022244_Intel_s_Haswell_Could_Be_Last_Interchangeable_Desktop_Microprocessors_Report.html
As personal computers become smaller, their flexibility is decreasing. According to a media report, starting from the code-named Broadwell generation of processors, Intel Corp. will only offer mainstream desktop chips in BGA packaging, which will eliminate upgrade options as well as increase risks for PC makers.
Both Intel and Advanced Micro Devices supply two different desktop platforms these days, drawing a very clear distinction between mainstream and high-end desktop. Still, mainstream PCs with basic processors may easily be upgraded with very fast processors thanks to the fact that the chips are interchangeable and come in the same LGA1155 form-factor. Unfortunately, the ease of upgrade may come to an end in two years: starting from the Broadwell generation of central processing units (CPUs), mainstream chips will cease to use land grid array (LGA) and micro pin grid array (µPGA) packages and will only be available in ball grid array (BGA) form-factors, just like Intel Atom processors.
According to the Japanese PC Watch web site, the code-named Haswell microprocessors may be the last desktop chips in LGA packaging, which enables easy swapping of CPUs on mainboards. Starting from the Broadwell chips, which are due in 2014, all mainstream desktop processors will be available in BGA packaging, which means that they will have to be soldered to mainboards, something that can only be done in relatively sophisticated manufacturing facilities.
Tomi Engdahl says:
“According to a story by Charlie Demerjian, a long-time hardware journalist, Intel’s next generation of x86 CPUs, Broadwell, will not come in a package having pins. Hence manufacturers will have to solder it onto motherboards. That will likely seriously wound the enthusiast PC market. If Intel doesn’t change their plans, the future pasture for enthusiasts looks like it will go to ARM chips or something from offshore manufacturers.”
Source: http://tech.slashdot.org/story/12/11/26/2016228/is-intel-planning-to-kill-enthusiast-pcs
Tomi Engdahl says:
“Windows 8′s Metro UI presents a clean and spiffy new interface for Microsoft’s latest OS. But one of the operating system’s oldest and most hated problems — crapware — still lurks below the surface”
Source: http://tech.slashdot.org/story/12/11/27/0038226/windows-8-pcs-still-throttled-by-crapware
Tomi Engdahl says:
Microsoft’s client-access licensing and pricing changes to hit December 1
http://www.zdnet.com/microsofts-client-access-licensing-and-pricing-changes-to-hit-december-1-7000007916/
Summary: Business customers who purchase CALs from Microsoft for use with Exchange, Lync, SharePoint, Windows Server, System Center and other products should be aware of licensing/pricing changes coming soon.
As of December 1, Microsoft is changing the way it prices the “user” option when purchasing client-access licenses (CALs), which will result in higher prices for some customers.
With the User CAL, customers buy a CAL for every user who accesses the server to use services such as file storage or printing, regardless of the number of devices they use for that access. With a Device CAL, they purchase a CAL for every device that accesses a server, regardless of the number of users who use that device to access the server.
Microsoft has positioned User CALs as being the optimal choice if company employees need to have roaming access to the corporate network using multiple devices
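The User CAL vs Device CAL choice described above reduces to simple arithmetic over headcount and device count. The prices below are made-up placeholders for illustration, not Microsoft's actual list prices:

```python
# Hypothetical illustration of the User CAL vs Device CAL trade-off.
# Prices are invented placeholders, not real Microsoft pricing.
def cheaper_cal(users, devices, user_cal_price, device_cal_price):
    """Return the cheaper licensing option and its total cost."""
    user_total = users * user_cal_price       # one CAL per user, any devices
    device_total = devices * device_cal_price # one CAL per device, any users
    if user_total <= device_total:
        return ("User CALs", user_total)
    return ("Device CALs", device_total)

# 100 roaming employees, each using 3 devices: User CALs win.
print(cheaper_cal(100, 300, 40, 35))  # ('User CALs', 4000)
# 300 shift workers sharing 100 fixed terminals: Device CALs win.
print(cheaper_cal(300, 100, 40, 35))  # ('Device CALs', 3500)
```

This is exactly why Microsoft positions User CALs for roaming multi-device staff, and why raising the User CAL price hits those customers specifically.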
Tomi Engdahl says:
MEMS sensors in gaming: $500M “Call of Duty” revenue will drive what’s next in game platforms
http://www.edn.com/electronics-blogs/anablog/4401967/MEMS-Motion-sensors-in-gaming—500M–Call-of-Duty–revenue-will-help-drive-what-s-next-in-Xbox-360–Wii-and-PlayStation-3
The MEMS suppliers know where the money is with smart phones and tablets plus the gaming industry and you can be sure they are feverishly working on the next level of motion sensing to make gaming more realistic.
Tomi Engdahl says:
Lighting with longevity: Livermore’s longest living light bulb more than a century of light…The Centennial light bulb.
http://www.edn.com/electronics-blogs/looking—electronics/4401954/Lighting-with-longevity–Livermore-s-longest-living-light-bulb-more-than-a-century-of-light-The-Centennial-light-bulb-?cid=Newsletter+-+EDN+on+Analog
June 18, 2013 will mark the 111th anniversary celebration of the Centennial Light Bulb. Suffering for a week without power due to hurricane Sandy acutely reminded me how great an invention the incandescent light bulb was for mankind.
As I researched stories about this amazing Shelby-manufactured light bulb, I found many interesting facts about the technology of making light bulbs, as well as many interesting stories about the invention of the light bulb and the “Phoebus Cartel” that fixed the life of light bulbs at 1,000 hours.
Tomi Engdahl says:
Cloud services arrived promising to fix it all: faster systems, improved quality, and costs kept in check because everything is paid for only by usage. Has this really happened?
It still feels a bit like IT buyers have jumped out of the frying pan into the fire. The old troubles have diminished, but new ones have taken their place. Small companies, which make up the lion’s share of Finnish businesses, are in a particularly difficult position.
The core of the problem is that, at the end of the day, no one may know where in the cloud the data ends up.
Source: http://www.tietoviikko.fi/blogit/uutiskommentti/itostaja+joutuu+ojasta+allikkoon/a859875?s=r&wtm=tietoviikko/-28112012&
Tomi Engdahl says:
HP boffin: Honey! I shrank the PC. To nanometre size, dammit
http://www.theregister.co.uk/2012/11/29/hp_nanostore/
HP boffins have packed layers of RAM, caches and storage in a computer into a combined block of memristors and processor cores to create highly scalable “nanostore” systems. It’s hoped these little monsters will chew through mountains of data with terrific energy efficiency.
The nanostore design appears to place energy efficiency above data retrieval rates. HP Labs fellow Parthasarathy Ranganathan presented the blueprints at the San Jose Server Design Summit this week, according to EE Times.
Ranganathan has a three to five year timescale in mind to develop the technology into a product. The chips will be ideal for crunching through huge datasets that have to be largely or fully in memory to be worked on in parallel. Plus, the nanometre-scale systems should be able to pack more storage capacity into data centres.
A single nanostore chip consists of multiple 3D-stacked layers of dense silicon nonvolatile memories, such as phase change memories or memristors, with a top layer of power-efficient compute cores.
Doron Kempel’s OmniCube is similar to the nanostore, bringing compute, storage and networking together, but is not based on such deep integration at the semiconductor level.
Tomi Engdahl says:
Raspberry Pi daddy: Stroke your hardware at night, land a job easy
You want a career in computers? Start using computers
http://www.theregister.co.uk/2012/11/29/eben_upton_career_advice/
Eben Upton, a key player in the Raspberry Pi’s genesis, said out-of-work graduates should get busy with computers in their spare time if they want to land a job. And he didn’t mean logging into Facebook.
Speaking in a Google Hangout video chat conference call thing, Upton drew on his years of hiring newbies at chip giant Broadcom and his time teaching computer science at Cambridge University.
He said graduates need to wow interviewers with their enthusiasm and proof of their ability to learn and develop – and that will involve spending long evenings hacking away at code, breadboards and pet projects.
“If you walk in the door here and you look bright, you’re going to get hired,” he told a graduate struggling to find work due to the Catch-22 situation inexperienced job-hunters find themselves in: it’s hard to get onto the employment ladder and gain experience if nearly every company wants people with, say, a minimum of five years of C++ programming experience.
“If you’re the right candidate that’s the answer. For the right candidate, people will waive these requirements,” Upton said.
Tomi Engdahl says:
Windows Next: Just call it ‘Blue’?
http://www.zdnet.com/windows-next-just-call-it-blue-7000002535/
Summary: Windows 9 might not be Microsoft’s next version of Windows. Instead, ‘Blue’ could be the interim release that shows up first.
It’s not surprising Microsoft already is working on whatever version of Windows follows Windows 8.
I’ve heard the next version of Windows is not going to be Windows 9. Instead, I’ve heard from a couple of my contacts that some kind of an update is coming next year. The Windows release codenamed “Blue” — mentioned by Win8China last week — is likely the codename of this interim release, my contacts claim.
I’m not clear if Blue is simply what we in the Windows world typically call a service pack, which is a rollup of fixes and updates. Or maybe Blue is more of a feature pack, which could include a rollup of fixes plus some new features.
The word seems to be, whichever it is, that Microsoft is moving away from the big-bang Windows release schedule to which it typically has adhered, and is now attempting to move toward something more like what Apple does, with point releases.
Tomi Engdahl says:
Ballmer: We’re all about devices and services now
http://news.cnet.com/8301-10805_3-57555119-75/ballmer-were-all-about-devices-and-services-now/
Microsoft’s CEO makes the most succinct case yet that the software giant has moved to a new era where it will build devices and produce the services that run on them.
Microsoft may be best known for Windows. Its Office productivity suite runs on hundreds of millions of computers around the world. But Microsoft Chief Executive Steve Ballmer made it abundantly clear at today’s annual shareholder meeting that the software giant should be thought of as a devices and services company going forward.
“This is really a new era for our company,” Ballmer told shareholders today.
“We will relentlessly focus on delightful, seamless services,” he said.
The company will continue to sell services themselves, such as Office 365, an online version of its productivity software. And it will create devices that provide the best experiences when using those services, such as its new Surface tablet computer that runs Windows RT, the lightweight version of its new operating system.
“We’ve come a long way in the last year. I could not be more excited for what’s ahead,” Ballmer said. “We’ve never had a stronger product line.”
Tomi Engdahl says:
Windows 8 Gets off To a Slow Start, According to The NPD Group
https://www.npd.com/wps/portal/npd/us/news/press-releases/windows-8-gets-off-to-a-slow-start-according-to-the-npd-group/
PORT WASHINGTON, NEW YORK, NOVEMBER 29, 2012 – The consumer Windows PC and tablet market* didn’t get the boost it needed from the launch of Microsoft’s Windows 8 in the U.S. Since the Windows 8 launch on October 26, Windows device sales have fallen 21 percent versus the same period last year**, according to leading market research company The NPD Group’s Weekly Tracking Service***. Notebooks, which have been weak throughout most of 2012, saw that trend continue as they fell 24 percent. Desktop sales have fared better this year, dropping just 9 percent.
“After just four weeks on the market, it’s still early to place blame on Windows 8 for the ongoing weakness in the PC market,” said Stephen Baker, vice president of industry analysis at NPD. “We still have the whole holiday selling season ahead of us, but clearly Windows 8 did not prove to be the impetus for a sales turnaround some had hoped for.”
Average selling prices of Windows computing devices have jumped significantly this year.
Tomi Engdahl says:
THE FUTURE OF DIGITAL [SLIDE DECK]
http://www.businessinsider.com/future-of-digital-slides-2012-11
Global Internet Connected Device Shipments
http://static4.businessinsider.com/image/50a50aadecad04f123000006-650/internet-connected-devices.jpg
Tomi Engdahl says:
Embedded systems are evolving, but where are the tools?
http://www.techdesignforums.com/blog/2012/11/13/embedded-systems-tools-gap/?mid=14528057&PC=L&c=2012_11_29_embedded_technical_news
These are exciting times to be an embedded system developer. The landscape is evolving at a rapid pace. That much is evident from the almost daily stream of news from this sector.
However, although the announcements are exciting, they are either about evolution in embedded hardware architectures or about what’s new in the software space.
There is little, if any, mention of happenings in embedded development tools.
That is partly because the evolution of those tools has not kept up with the growth in embedded hardware and software. It’s a problem.
A scenario is quickly developing where the gap could become very large between what the embedded ‘system’ (hardware + software) is capable of delivering and what the embedded developer can actually enable.
The embedded software design methodology, especially the code-compile-debug flow, hasn’t changed much. The arrival of multicore architectures has given rise to new defects (race conditions, deadlocks, stalls, etc.) which are not a part of the traditional embedded design lexicon. Added to that, there are the expectations of ‘high-performance’ from multicore that imply a need for quality performance analysis tools.
Our industry is evolving fast and urgently needs to address the hardware-software gap.
Tomi Engdahl says:
EXTREME computer sports: Meet the cluster war winners
3 Teraflop LINPACK for NUDT, Texas surprises, Utah defends home turf
http://www.theregister.co.uk/2012/11/30/sc12_cluster_competition_winners/
Tomi Engdahl says:
NY Museum of Modern Art embraces 14 video games
http://www.theregister.co.uk/2012/12/01/moma_begins_video_game_collection/
The prestigious Museum of Modern Art (MoMA) in New York has elevated the humble video game into its pantheon of art objects, and has named the first 14 of the 40 or so games that it will eventually add to its collection.
“Are video games art?” MoMA curator Paola Antonelli asked rhetorically when announcing the collection, then answered herself. “They sure are, but they are also design, and a design approach is what we chose for this new foray into this universe.”
The first 14 games chosen by Antonelli and her curatorial staff span a broad range of styles, from the minimalist simplicity of the original Tetris to the ethereal boredom of Myst to the low-res sentimentalism of Passage to the inevitable doom of Canabalt.
Pac-Man (1980)
Tetris (1984)
Another World (1991)
Myst (1993)
SimCity 2000 (1994)
vib-ribbon (1999)
The Sims (2000)
Katamari Damacy (2004)
EVE Online (2003)
Dwarf Fortress (2006)
Portal (2007)
flOw (2006)
Passage (2008)
Canabalt (2009)
Tomi Engdahl says:
Interview with Ryan C. Gordon about Linux Gaming
http://cheerfulghost.com/panickedthumb/posts/771
Thanks for taking the time to answer some questions for us. Let’s start with a very broad one– how do you see the state of Linux gaming today?
It’s making progress. We’re turning out to have a pretty big year, with Unity3D coming to the platform, and Valve preparing to release Steam. These are just good foundations to an awesome 2013.
As a follow up, where do you think Linux gaming is headed?
Ask me again in three months.
The question will be: will everyone’s enthusiasm infect companies like Electronic Arts? Activision? Ubisoft?
Will it bring back Epic and Id?
Time will tell.
Recently many game developers have been quite vocal about their distaste and distrust of Windows 8. Some feel that the Windows Store will effectively shut out competition. How do you feel about the changes in Windows 8, and do you think this will drive more people to Linux for gaming and day-to-day use?
I confess to not knowing much about Windows 8, except that I think I’ve seen more Win8 commercials this month than I saw political ads leading up to the election.
I _do_ think that Valve is making this move to Linux _specifically_ because of the Windows Store. If your product is a store that sells software, can you survive on platforms where the platform maker is concerned with controlling (and getting a cut of) software purchases? Between Apple and Microsoft, Valve has to fight for a less restrictive platform.
If they are moderately successful, that’s great for Linux gamers. If they are wildly successful, that’s great for _everybody_. Someone has to push back on these walled-garden app stores that are popping up on every platform.
Now that Steam is coming to Linux do you think this will prompt other companies to port games that might not have done so previously?
Absolutely, but there’s actually a few factors at work:
- Steam on Linux, as you mentioned,
- Humble Bundle pushing really hard for Linux ports,
- Unity shipping a Linux port of their engine,
- Kickstarter being flooded with Linux customers.
There’s just a lot of data (and specifically, data about money) this year, and it’s motivating a lot of developers to test the waters.
Beyond FatELF and your recent proposal to Gnome, what other changes would you make to the Linux stack to make game development and porting easier?
We need need need a better OpenGL debugger. ApiTrace is a good start, but it’s only a start.