Electronics technologies for 2012

Product engineering organizations face the formidable challenge of ever-shrinking market windows for innovation in 2012. Globalization, increasing competition, and rapidly changing technology put many risks and uncertainties in the path of new product development. Missed opportunities and unmanaged risks can lead to huge costs and overwhelming complexity that compromise quality and end in very expensive recalls. Innovating in the face of these pressures requires organizations to rethink how they work.

If you are working at a real technology company, learn the most important new technologies and start designing next-generation equipment early. The Real technology companies article asks: are Amazon, Facebook, eBay, and Google good technology companies, or good applications-of-technology companies? Applications-of-technology can also be a good position to be in. No matter where you are, differentiate to dominate. No more lame “me too” products. CES is over; it’s time to start designing. Here is some material to fuel your innovation.

EE Times’ 20 hot technologies for 2012 article lists 20 technologies that EE Times editors think can bring big changes and that EE Times will be tracking during 2012. In the Hot technologies: Looking ahead to 2012 article, EDN magazine editors reflect on some of the hot trends and technologies of 2011 and look ahead to 2012.

Top 12 Hot Design Technologies for 2012 article mentions MEMS, wireless sensor networks, an Internet of Things that starts with lightbulbs, new flexibility via organic materials for electronics, Near Field Communication (NFC) becoming available in many mobile phones, printed electronics, power-scavenging methods for low-power electronics, graphene, conversion of solar energy, Ethernet displacing proprietary field buses, 40/100-Gbit/s Ethernet, active-matrix organic light-emitting diode (AMOLED) displays, and Smart Grid technologies (power management and architecture system components). We are nearing the point where some microelectronics systems can be made sufficiently low power – requiring microamps rather than milliamps – that scavenging methods can produce enough power to make them autonomous.
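
To get a feel for that microamp threshold, a rough worked example (my illustrative numbers, not from the article): a small harvester delivering 100 µW at 3 V can continuously supply about

    100 µW / 3 V ≈ 33 µA

which is plenty for a microcontroller that sleeps at a microamp or so and wakes briefly to sample and transmit, but hopeless for a design that draws tens of milliamps all the time.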

Home electronics is expected to become a new status symbol (Kodinelektroniikasta uusi statussymboli) article tells that consumer electronics demand will increase further in 2012, according to market research by Deloitte. The latest digital technology will also become a status symbol in homes. In particular, Deloitte expects record sales of tablets and smartphones.

Mobile phones with advanced features are starting to replace traditional separate devices for different functions. This is already happening to small digital cameras and video cameras. By the end of 2012 there will be more navigation-capable mobile phones than stand-alone GPS navigators, according to Berg Insight, which calculates that sales of separate GPS navigators already started to decline in 2011. Navigation equipment manufacturers have responded by bringing their software to mobile devices.

How apps for your appliances represent the next opportunity article tells that Samsung Electronics pushed not only its smart TVs at CES but a whole line of smart appliances, including washers and refrigerators. The lineup included music apps such as Pandora on the refrigerator and an app on the washer that can ping you when a load is done. If Samsung Electronics is right, developers may flock to smart appliances as the next opportunity.

IPv6 is becoming more important. One of the driving forces behind the move from IPv4 to IPv6 has been low-cost embedded devices, which are going online at an accelerating pace. Support for this technology will be crucial for the success of many forthcoming connected embedded devices. The IPv6 on a microcontroller article gives some tips on how to implement IPv6 on a small microcontroller.
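
As a flavor of what such an implementation involves, here is a minimal sketch (my illustration, not code from the article) that binds an IPv6 UDP socket through the BSD sockets API, which several small-device IP stacks (for example lwIP with its sockets layer enabled) emulate:

    /* Minimal sketch: bind an IPv6 UDP socket using the BSD sockets API.
     * Assumes the embedded IP stack provides this POSIX-style layer. */
    #include <string.h>
    #include <sys/socket.h>
    #include <netinet/in.h>
    #include <arpa/inet.h>

    int open_ipv6_udp(unsigned short port)
    {
        int s = socket(AF_INET6, SOCK_DGRAM, 0);   /* IPv6 datagram socket */
        if (s < 0)
            return -1;

        struct sockaddr_in6 addr;
        memset(&addr, 0, sizeof(addr));
        addr.sin6_family = AF_INET6;
        addr.sin6_addr   = in6addr_any;            /* any local address */
        addr.sin6_port   = htons(port);            /* network byte order */

        if (bind(s, (struct sockaddr *)&addr, sizeof(addr)) < 0)
            return -1;

        return s;   /* ready for recvfrom()/sendto() */
    }

On a stack without the sockets emulation the same steps exist under different names, but the structure of the code stays the same.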

The science fiction future of medical implants is here article tells that semiconductor solutions contained in hand-held consumer product innovations are now finding their way into medical implantables: wireless data and power transmission as well as analog, microcontroller, and transducer capabilities.

App Servers and Lua Scripting Speed Rich Web Applications for Small Devices article tells that with ever more smart devices connecting to the web, even small embedded devices must be able to serve up rich graphical presentations of data to satisfy user expectations; smartphones have set the bar ridiculously high for how sophisticated an application interface should be. This creates a new challenge for designers of small embedded systems. With time and space at a premium, a scripting approach can be invaluable: LAMP (Linux, Apache, MySQL, PHP) setups work well in full-scale web-server implementations (with at least around 65 Mbytes of memory), but a small embedded system needs something that uses far fewer resources.
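
To illustrate why scripting is attractive at this scale, here is a minimal sketch (my own, not from the article) of embedding the Lua interpreter in a C application, the pattern an embedded app server uses to run server-side page scripts in a few hundred kilobytes rather than LAMP’s tens of megabytes:

    /* Minimal sketch: execute a server-side Lua snippet from C using
     * the standard Lua C API (lua.h, lauxlib.h, lualib.h). */
    #include <lua.h>
    #include <lauxlib.h>
    #include <lualib.h>

    int run_page_script(const char *script)
    {
        lua_State *L = luaL_newstate();   /* create an interpreter instance */
        if (L == NULL)
            return -1;

        luaL_openlibs(L);                 /* load Lua's standard libraries */

        /* In a real app server the script would build the HTTP response;
         * here it is simply executed and errors are reported. */
        int err = luaL_dostring(L, script);

        lua_close(L);                     /* free all interpreter memory */
        return err ? -1 : 0;
    }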

We’re on the cusp of an era that offers better-than-ever display technologies for an excitingly immersive viewer experience: 3DTV has already emerged for consumers, and higher-than-HDTV resolutions are to be tested at the 2012 London Olympics. The Xilinx Making Immersive 3D and 4K2K Displays Possible with 7 Series FPGA System Integration press release tells that Xilinx just introduced new 28nm Kintex™-7 Field Programmable Gate Array (FPGA)-based targeted reference designs and a new development baseboard for accelerating the development of next-generation 3D and 4K2K display technologies at 2012 International CES. The network infrastructure will need an overhaul in 2012 due to the increasing amounts of high-definition video and other traffic.

The ARM processor becomes more and more popular during 2012: Power and Integration—ARM Making More Inroads into More Designs. It’s about power—low power, almost no power. A huge and burgeoning market is opening for devices that are handheld and mobile. The most obvious among these are smartphones and tablets, but an increasing number of industrial and military devices also fall into this category. ARM’s East unimpressed with Medfield, design wins article tells that Warren East, CEO of processor technology licensor ARM Holdings plc (Cambridge, England), is unimpressed by the announcements made by chip giant Intel about the low-power Medfield system-chip and its design wins. Android will run better on our chips, says Intel. Watch what happens in this competition.

Bill McClean: Don’t broad-brush the semiconductor market article tells that 2011 started off great, full of optimism and high growth expectations for the semiconductor industry, but that mellowed as the year progressed (the total semiconductor market grew only about 2% for the year, although smartphones grew strongly). Going into 2012, there’s not a lot of optimism. Any thoughts on 2013? A: We’re thinking it’s going to be a little slower than 2012. So we’re looking at a slower market, not a disaster.

Chip sales flat in 2011, will grow (a wee bit) in 2012 article tells that the prognosticators at Gartner have ranked the chip makers of the world by 2011 revenue and are calling the market for 2012, with a reasonably upbeat forecast for next year’s chip sales, but (paradoxically) a dreadful forecast for companies that make the gear to bake the chips. Disk shortages are expected to slow down PC sales for several quarters. Smartphones, tablets, and flash will represent more than three-quarters of the revenue growth between now and 2015.

There are also some more optimistic predictions for chip sales. Malcolm Penn, founder and chief analyst with semiconductor market analysis firm Future Horizons Ltd, is more bullish than most other market analysts. Bullish Penn sees chip market growth of 8% in 2012 article tells that Malcolm Penn has predicted that the global chip market will rise on an annual basis by 8% to $323.2 billion in 2012. Penn said that after a flat first quarter he expected the chip market to bounce back in the second half of the year. He considers that 8% growth is a “safe bet,” and indicated that annual growth “could easily be 20%.” For 2013 Future Horizons forecasts 20% annual growth.

EDN magazine writes in its PC boards: Materials and processing are now a hot technology article that exotic substrates and fabrication methods are now commonplace. A dozen layers, thick copper, fine lines, and buried vias are just the processing side of the modern high-tech PCB, and the many processing options available have made PCBs a genuinely hot technology. The substrates themselves are now high tech as well: traditional FR-2 (phenolic resin bonded paper) and FR-4 (glass-reinforced epoxy laminate) are no longer the only widely supported choices. You could always specify Teflon or polyimide substrates for high-speed circuits, and in addition to old high tech like flex circuits, a host of improvements now enables a whole new set of high-tech PCB designs.

EDA industry predictions for 2012 mentions that 28-nm design starts will increase by 50% in 2012 and more people will be dabbling with 20 nm. The increased design sizes and complexity will create all kinds of pressure in the verification and test fields.

The rise in fake parts is also contributing to engineers’ fears that their products will be corrupted. Counterfeit electronic components were a big issue in 2011, and the problem will not go away this year.

EDA industry predictions for 2012 also mentions a trend that has been going on for some time: a continued migration of functionality from hardware to software. Dr Markus Willems of Synopsys attributed this to “the needs to support multiple standards simultaneously (wireless, multimedia), use the same hardware platform for product derivatives (automotive), quickly adjust to evolving standards (wireless), and react to changing market demands (all applications).” The increased rate of adoption of new technologies such as tablets and ultra-books, with their inherent demand for low-power solutions, will help the EDA industry increase its importance. Electronic system-level design tools (ESL) continue to be an important thrust for the EDA industry, and increased adoption of the TLM 2.0 (transaction-level modeling) standard is a popular theme. Several EDA companies have been busy writing books recently and self-publishing them.

Product Lifecycle Management (PLM) tools are taking product design to the next level (especially in automotive, aerospace, and defense). PLM was launched more than a decade ago with the lofty vision of creating an enterprise-wide, central repository for all product-related data, from the earliest customer requirements to quality and failure data collected in the field by maintenance and support personnel. Product lifecycle management, sometimes written “product life cycle management”, represents an all-encompassing vision for managing all data relating to the design, production, support and ultimate disposal of manufactured goods. The What 2012 holds for Product Lifecycle Management? article tells about current PLM trends.

The prototype comes of age article tells that a radical change is about to happen in the typical development of an electronic system. The hardware-development flow will no longer be the center around which everything else revolves. The rising size and complexity of systems and the limitations of using a single-purpose model—the hardware-design model—have fueled the growth of new prototyping technologies. Among the changes now taking place in this area is the migration to higher levels of abstraction for hardware design. The ability to derive several implementations from a single high-level description is also desirable. Many hardware blocks now come with sophisticated software stacks, and they also must be integrated into the software flow.

‘KISS’ Among Engineers’ Top 2012 Concerns article tells that Rich Merritt agrees that we’ve forgotten the KISS principle, especially in the automation sector. “We’ve made everything so complicated, complex, and convoluted that we’ve entered the age of ‘transoptimal engineering,’ ” he says. “That is, things are so advanced and have so many features, they don’t work anymore.” Business development manager Herat Shah sees the pressures for complexity and price converging in an unhealthy manner. “The biggest issue for the automation and control supplier is to design and engineer something that’s the cheapest and the best,” he says. “Practically, this is not possible.” On top of this there are security concerns: Stuxnet targeted controllers and made engineers realize that factories aren’t immune to security threats.

How do you manage the Internet of you? article claims that electronics has gotten to the point (in the consumer space) where the only innovations are the mundane, the enhancements, the extensions. A computing device today (whether a tablet, a phone or a PC) can do what telephony, typewriters, pen and ink, film (motion and still), cameras, television, radio (basically all major mediums) did a generation ago. And yet… And yet we still innovate. We still build. We still buy. The devices in one sense feed the worst part of a personality: compulsiveness. They suppress pause and reflection. Think about it.

403 Comments

  1. Tomi Engdahl says:

    Do new technologies ease small-scale product innovation?
    http://www.edn.com/article/521760-Do_new_technologies_ease_small_scale_product_innovation_.php

    PC-based simulation and tools, manufacturing setups, and production equipment mean that it is easier to develop lower-volume products.

    The two extremes of product volume have their own attributes. If you’re designing and building a high-volume product, you can justify manufacturing tooling and test fixtures—whether at your facility or at a contract assembly house. At the other end of the volume spectrum, if you are producing only a few units per month, or doing semicustom or full-custom work, you usually must perform many aspects of the manufacture using manual techniques.

    But what about those projects with low to moderate volume of approximately 10 to 50 units per month? They are often caught in the small-scale, in-between zone: too few to afford serious tooling and fixturing but too many to build by hand.

    Machining is not the only technique that has changed radically. Using a variety of high-end plastics, sintered powdered metal, and other materials, along with CAD/CAM software, rapid prototyping lets you build both prototypes and modest production runs, with virtually no tooling cost or lag time.

    For the PCB, you can use modeling tools and software to prepare the layout and then get a batch of boards made outside in 24 to 48 hours.

    If you step back and look at the tools, tooling, components, and processes it takes to develop and produce a lower-volume product, you’ll see that these developments have changed things for the better. You can then market your product directly through the Web, avoiding the need for a more formal channel of distribution until you get some customers and traction.

  2. Tomi Engdahl says:

    The quick-paced and global nature of our economy today is effecting a sea change in the industrial automation industry—a level of change not seen since the automation revolution nearly 40 years ago.

    Manufacturing has become more competitive as extremely agile and low-cost producers come online and undercut long-established vendors.

    Customers meanwhile require ever-faster innovation and shorter product cycles, something traditional manufacturers cannot easily deliver. Along with increasing automation complexity, these trends suggest that vendors need new and more agile processes

  3. Tomi Engdahl says:

    Obsolescence by design: Short-term gain, long-term loss, and an environmental crime
    http://www.edn.com/blog/Brian_s_Brain/41782-Obsolescence_by_design_Short_term_gain_long_term_loss_and_an_environmental_crime.php?cid=EDNToday_20120515

    Anyhoo … the battery in the iPhone 3GS still works passably, but holds notably less charge than it did when new.

    My buddies at iFixit generously provide detailed battery-swap instructions, plus a tutorial video and the replacement part plus tools you’ll need to successfully conduct the iPhone 3GS surgery. But I think you’ll agree that while the procedure might be a no-brainer for engineers like us, it isn’t for the masses. And I’ll also claim that this desired outcome defined an intentional design decision by Apple; when the battery inevitably fails, the company assumes that the affected consumer will just go out and buy a brand new handset.

    That same design decision (explained by the company as a necessity to enable slim system form factors…a claim which I frankly don’t buy) extends to the company’s iPods, none of which have ever offered an easily user-accessible battery. And it also extends to the company’s laptops

    One other related MacBook “feature” irks me, too. Apple makes a habit of regularly obsoleting various products (and generations of products) with each Mac OS X uptick.

    Here’s the thing; I pragmatically ‘get’ why Apple chose to chart these particular design courses, from a business standpoint. Non-removable batteries, as I’ve already mentioned, guarantee obsolescence and replacement of the entire system. And O/S obsolescence not only guarantees system obsolescence but also simplifies both O/S development (by limiting backwards-compatibility necessity) and subsequent O/S support. But Apple’s stubbornness also fills up lots of landfills with lots of otherwise perfectly good hardware. And it irks me every time I discover that some widget I’ve bought seemingly only a short time before is now archaic.

  4. Tomi Engdahl says:

    Flexible Displays Landing in 2012, But Not in Apple Gear
    http://www.wired.com/gadgetlab/2012/05/apple-flexible-displays

    Flexible displays have tickled our imaginations for years. And before the end of 2012, we’ll finally see companies employing flexible displays in their products. But while the possibilities are tantalizing, don’t let your imagination run wild. The earliest iterations of flexible displays won’t be very bendy, and they won’t appear in Apple hardware as some news outlets have recently speculated.

    Such a display could be useful in a number of applications, such as in a device with a gently curved screen. Ultimately, the display could even be deployed in a flexible, bendable phone or tablet. But that’s probably not on the horizon anytime soon.

    In early March, Samsung announced it would be mass-producing its flexible OLED displays, like the one seen above, by the end of this year. Now flash-forward to this Monday: According to a report from the Korea Times, Samsung is seeing “huge” orders for this display

    Samsung’s flexible OLED display certainly has some advantages over current display tech. For one, it’s basically unbreakable because it doesn’t use glass, but rather a type of plastic called polyimide

    “This type of flexible display will be very thin, lightweight, rugged, and unbreakable,”

    Colegrove said there are two reasons why Samsung’s flexible OLED is attractive to device manufacturers. First, the display is thin, lightweight and difficult to break — this offers immediate design benefits. Second, any type of new, novel technology offers marketing benefits. You can hear the commercial spiel now: “We have the first flexible AMOLED display devices in human history!”

    As for Samsung’s flexible OLED technology, it will appear in phones in 2012, and possibly in tablets next year.

    Gillett said bendable displays could ultimately be used in devices that roll up like a newspaper — ideal for reducing the physical footprint of any mobile device. But that’s just the far-out, sci-fi-inspired application of flexible display technology. A truly bendable device “would be an engineering nightmare because the more people flex it, the more you’d wear the components inside it,” Gillett said.

    “It’s completely impossible to see any Apple product with flexible AMOLED this year,” Jennifer Colegrove, NPD DisplaySearch’s vice president of emerging display technology, told Wired.
    And if you’re looking to find a flexible display in an iDevice, you’ll probably have to wait until the 2013-2014 time frame, says Colegrove — with truly bendy iDevices appearing in 2015 at the earliest.

  5. Tomi Engdahl says:

    Analog matters – more than you think
    http://www.fpgagurus.edn.com/blog/fpga-gurus-blog/analog-matters-%E2%80%93-more-you-think

    Now that a good chunk of video and audio traffic is transported on broadband networks end-to-end in a digital format, you will find an increasing number of people who say that analog engineering is no longer important. You found these kinds of analog naysayers in the 1960s, 1980s, and 2000s, and you find them now. And you know what? They were just as wrong in each decade, but there are fewer and fewer analog engineers around to make the case for real-world continuous signals.

    FPGA and analog chip vendors should look to the example of Avnet, ADI, and Xilinx at X-fest, and offer more mixed-signal tutorials, particularly if FPGA vendors want to continue their strategy of eventual world domination.

  6. Tomi Engdahl says:

    CPU and GPU chips account for half of $111bn chip market
    Application specific cores are the next big thing
    http://www.theinquirer.net/inquirer/news/2175750/cpu-gpu-chips-account-half-usd111bn-chip-market

    ANALYST OUTFIT IMS Research claims that half of the processor market is made up of chips that have CPUs and GPUs integrated on the same die.

    The firm’s findings show that chips that include both CPU and GPU cores now account for half of the $111bn processor market. According to its report, this growth has all but destroyed the integrated graphics market, but the company said that discrete GPUs will continue to see growth.

    Tom Hackenberg, semiconductors research manager at IMS Research said, “Through the last decade the mobile and media consumption device markets have been pivotal for this hybridization trend; Apple, Broadcom, Marvell, Mediatek, Nvidia, Qualcomm, Samsung, St Ericsson, Texas Instruments and many other processor vendors have been offering heterogeneous application-specific processors with a microprocessor core integrating a GPU to add value within extremely confined parameters of space, power and cost.”

    Hackenberg made an interesting point as to why both AMD and Intel are pushing deeper into their respective CPU and GPU on-die strategies, suggesting that it is a way to easily design an embedded processor for use in handheld devices.

  7. Tomi Engdahl says:

    Great support is a great product differentiator
    http://www.eetimes.com/electronics-blogs/other/4373250/Great-support-is-a-great-product-differentiator

    Most design engineers in EDA work within flows that come from one big vendor and that offer support along with software. Best-in-class niche software products are integrated into this flow to meet specialized needs. But what, exactly, is a best-in-class product?

    The EDA industry includes an incredible variety of software products to solve complex, but niche, problems. Because each design team has very specific requirements, and the software must be integrated into a design flow, products often need to be customized and scripted for each customer site.

    Yet how many EDA or design managers look at support as part of the software purchase decision?

    A software solution bundles services with product to ensure that design teams achieve their goals. This is not simply a matter of FAQs or manuals, but technical support that will quickly and effectively solve customer problems. The quality and timeliness of customer support will be crucial to the success of projects through the life of the license.

    What do companies famous for their customer service and support have in common? Three key answers are as follows:

    Founders and top managers make excellent technical support a high priority.
    Excellence and timeliness of service don’t reside solely in the Customer Support section, but are goals of the entire organization.
    Customer feedback is actively solicited and the research and development organizations are encouraged to work directly with customers.

    Excellent technical support is a key ingredient for the success of any EDA company and any software development project. And it is a win-win-win situation for vendors, developers and customers.

  8. Tomi Engdahl says:

    The growing use of programmable logic in mobile handsets
    http://www.eetimes.com/design/programmable-logic/4373234/The-growing-use-of-programmable-logic-in-mobile-handsets

    Smart phones, tablets and other battery powered devices have evolved beyond communication devices and now offer personal assistance by unifying “always connected” features such as navigation, email, phone, Internet access and camera. Choosing between the two leading smartphone operating systems, smartphone designers depend on their physical hardware to differentiate their products and position them against competitive products. This is an area where programmable logic devices add direct value by providing mobile handset system architects a way to quickly innovate and add new functionality to their products.

  9. Tomi Engdahl says:

    Plastic Doesn’t Pollute – People Do
    http://www.designnews.com/author.asp?section_id=1386&doc_id=244115&cid=NL_Newsletters+-+DN+Daily

    “Plastic is a huge and innovating world,” said Mario Maggiani, director at Assocomaplast, the association of Italian plastics and rubber machinery producers. Those innovations include processes like injection molding, extrusion of film, blow molding, and rotational molding, though the real steps forward are arguably being taken in material composition.

    “Biomaterials are becoming big in the world of plastic,” said Maggiani, noting that this had come as a result of a “war against plastic,” not just in Italy, but across the globe.

    “It’s not plastic that’s polluting the world… it’s the people throwing plastic around, these people are polluting the world,” said Maggiani, adding that if only plastic was managed appropriately, it wouldn’t pollute, and could even be recycled or used to create energy. Burning one kilo of plastic was the approximate equivalent of one kilo of oil, “so you can burn it and recover energy,”

  10. Tomi Engdahl says:

    Modeling in layers results in increased productivity
    http://www.edn.com/blog/Practical_Chip_Design/41787-Modeling_in_layers_results_in_increased_productivity.php

    Productivity is related to the way in which we can model something and reliably go from that model to an implementation that meets all of the design goals. In the digital world, we have managed to incrementally increase productivity over time by raising the abstraction, improving the quality of the tools and removing tasks that previously had to be performed manually.

    In the analog world, almost none of this has happened. Abstractions have been elusive without leaving too much on the table in terms of overdesign, resulting in larger area or poorer performance as compared to hand crafted. This is perhaps more so in today’s design environment where “improvements” in process fabrication techniques have been working against analog, making the circuits more sensitive, decreasing the signal-to-noise ratio and increasing noise by placing them in close proximity to noisy digital circuitry.

  11. Tomi Engdahl says:

    How Dr. Middlebrook shattered analog paradigms
    http://www.edn.com/blog/Anablog/41777-How_Dr_Middlebrook_shattered_analog_paradigms.php

    Professor R.D. Middlebrook, professor of Electrical Engineering at California Institute of Technology showed young engineers how to increase their productivity by using design-oriented analysis to obtain low entropy expressions.

    Middlebrook stated that the usual negative approach instilled in engineers early on is, “I don’t have enough information, so I can’t solve the problem.” We need to replace this thinking by the positive approach, “Somehow or other I have to find additional information to make the necessary trade-offs and approximations so I can do the design.” This is nowadays called “Thinking outside the box”
    His method was as follows: The result of design-oriented analysis is a low-entropy (simplified, low-complexity) expression, from which more useful design information can be obtained than simply one numerical answer for an assumed set of component values.
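
    A simple illustration of a low-entropy form (my example, not Middlebrook's own): the usual voltage-divider result

        Vout/Vin = R2 / (R1 + R2)

    carries the same information as the low-entropy version

        Vout/Vin = 1 / (1 + R1/R2)

    but the second form exposes the design-relevant ratio R1/R2 directly, so you can see at a glance how to move the gain without grinding through numbers.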

  12. Tomi Engdahl says:

    Slideshow: The Fun Side of 3D Printing
    http://www.designnews.com/author.asp?section_id=1394&doc_id=243488&cid=NL_Newsletters+-+DN+Daily

    3D printing has been around for years and has nestled its way into lots of companies’ product development processes as a more effective way to produce prototype products, test functional parts, and perhaps even pump out limited-run production parts.

    Yet in addition to that so-called serious product development and engineering work, there’s a significant number of less serious, but equally important, efforts underway. These are pushing the limits of 3D printing toward more consumer-friendly — even quirky, some might say — applications. We’re talking 3D-printed chocolate, 3D-printed fabric and clothes, and even 3D-printed body parts.

  13. Tomi Engdahl says:

    Smart Meters Create Opportunity for Electronics Suppliers
    http://www.designnews.com/author.asp?section_id=1395&doc_id=243712&cid=NL_Newsletters+-+DN+Daily

    A dramatic increase in demand for smart meters is going to create a big growth opportunity for makers of embedded electronic products ranging from microcontrollers and analog sensors to operating systems, according to a new study published by Pike Research.

    The study predicts that 55 percent of the world’s 1.5 billion electromechanical meters will be replaced by smart meters as part of an effort to create smart grids before 2020. Each year between now and then, 75 million to 99 million smart meters will be installed, says the report

    “Meter manufacturing was pretty dull for about 100 years,” Bob Gohn, vice president of research for Pike told Design News. “But now we’re having a burst where we’re replacing more than half the world’s meters in less than 10 years.”

    The reason for the large-scale replacement effort is that electric utilities want to reduce their operating costs and set the stage for increased use of renewable energy. Smart meters enable them to reduce operating costs because utilities don’t need to send out workers to read the meters, nor are workers necessarily needed to shut off power or turn it back on, Gohn said.

    “Instead of reading the meter once a month, or once every six months, the utility can take readings every 15 minutes or hour,” Gohn said. “And it can set up different rates for electric usage, depending on the amount of power that’s used at any given time of day.”

    The use of the new meters is already growing in the US and is expected to spread to Europe and Asia over the next eight years, Pike’s study said.

    New smart meters will need microcontrollers for measurement, communications, and management, as well as control of human-machine interfaces (HMIs).

    “Increasingly, we are seeing processors with something as powerful as an ARM 9 or an ARM Cortex-M series core,” Gohn said. “Some manufacturers are even building their own dedicated silicon for the metering market.”

    For suppliers, the large-scale adoption of smart meters won’t necessarily translate to huge economies of scale, Gohn said. Meter designs and standards will vary in different regions of the world, and some semiconductor makers may end up building application-specific integrated circuits (ASICs) for those regions.

    “This isn’t going to last forever,” he said. “Eventually, we’ll get to a terminal penetration rate, and then we’ll be back to a 15- to 20-year replacement cycle.”

  14. Tomi Engdahl says:

    Tech Trend: What does the Smart Connect Ecosystem have in store for your IT team?
    http://www.eetimes.com/design/smart-energy-design/4373448/Tech-Trend–What-does-the-Smart-Connect-Ecosystem-have-in-store-for-your-IT-team-

    Advanced Metering Infrastructure (AMI) is all set to replace traditional electric meters with new smart meters for residential and small business consumers. That means anyone who consumes less than 200KW is about to have access to a variety of pricing plans, real time information on usage patterns and better control of consumption in general.

    The consumer angle of this story has been covered extensively. But the impact on utility companies and more importantly what it will mean in terms of technology is sort of a gray area at the moment. While the actual process of implementing and integrating these meters unfolds, companies need to think ahead to how this will impact their IT requirements in the near future.

    Having your systems properly connected and up and running round the clock is going to be a major challenge. But what’s going to make things even more interesting is that for the first time consumers are going to actually be able to tell when your back-end goes down. How is that possible? Well, that’s where your consumer portal comes in.

    Customers will have real-time access and control of devices within their homes such as smart thermostats, appliances and solar panels. Expectations of what they will see on a portal will be very different from today. More importantly, customers are going to be logging in more often and not just when they have to pay a bill or check on the status of a service request.

    All this activity is going to generate huge volumes of data.
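
    To put a rough number on “huge” (my back-of-the-envelope arithmetic, not from the article): one reading every 15 minutes is 96 readings per meter per day, so

        96 x 365 x 1,000,000 ≈ 35 billion readings per year

    from just one million meters, before any event, alarm, or portal traffic is counted.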

    AMI is going to enable consumers make smarter choices on energy consumption. But to succeed, it will need utility companies to make some very drastic technology decisions!

  15. Tomi Engdahl says:

    IT distributors: The only people adding value to the world economy
    More than just middlemen…
    http://www.channelregister.co.uk/2012/05/22/distribution_electronics_market/

    Why distributors? More specifically, why electronics distributors? Why have these intermediaries in the markets at all?

    Yes, obviously, someone somewhere has to have pieces of kit on a shelf somewhere for when a customer wants to make an order. Someone has to crate it up and ship it off too: but why do we still have distributors as independent companies performing these functions? It’s the sort of thing we might have expected to see being disintermediated away over the years as communications became cheaper, as information became ever easier to keep track of, and as delivery itself became both easier and cheaper.

    It’s entirely possible to think of ways in which that gap between the manufacturer and the final retailer should have closed, killing the space in which distribution is done. Wal-Mart’s logistics system is reputed to have done just that.

    However, this gap hasn’t shrunk. In large part the industry still works on manufacturer to distributor to retailer or VAR/reseller. Some firms have gone direct, Dell being an obvious example, but this hasn’t become the standard across the industry despite what might seem to be obvious benefits of such a model.

    There are reasonable financial reasons why this extra step might not be worthwhile as well: depreciation of value being one of them. It rather shocked me when I first heard it said that electronics are about as perishable as chocolate. While they are sitting on a shelf their value (as an average, of course) goes down by some 1 per cent a week, or 50 odd per cent a year. This means an extra step – where the goods hang about in a warehouse gathering dust – seems like something people would like to cut out.
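
    A quick sanity check on that figure (my arithmetic): 1 per cent a week compounded over a year leaves

        0.99^52 ≈ 0.59

    of the value, a loss of roughly 41 per cent, while simple non-compounded addition gives 52 per cent; so “50 odd per cent a year” is the right ballpark.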

    But just looking at individual companies isn’t going to tell us whether the whole sector is adding economic value or not. Some companies might just not be doing very well in the face of competition while others are doing just fine. What we’d really like is to see the average return on capital for the whole sector.

    For the 36 companies which my low friend considers to be the electronics distribution sector, this gives us a return on capital of 24 per cent. This is well above either our arbitrary 8 per cent or Ingram’s admitted weighted average cost of capital of 9 per cent.

    At this point the economist entirely loses interest in the subject under discussion. Why a certain activity is adding value to the economy is completely uninteresting: we know that people do all sorts of weird things and the very reason that we don’t try to plan an economy is because no one centrally can work out why they do what they do or what value they gain from having done so.

  16. Tomi Engdahl says:

    Boffins develop nanoscale vacuum tube running at .46 THz
    Power hungry but radiation resistant relic could make comeback … in spaaaace
    http://www.theregister.co.uk/2012/05/24/nan_vacuum_tubes/

    Researchers from NASA and Korea’s National Nanofab Center have cooked up nanoscale vacuum tubes, potentially bringing some of the earliest digital devices back into the mainstream of technology.

    As detailed in a new paper from Applied Physics Letters, the tiny tubes were manufactured using the same processes applied to silicon semiconductors. An important tweak sees a small cavity etched into silicon, bordered by a source, a gate and a drain. The cavity does not enclose a vacuum, but at 150 nanometres across is so small that electrons flowing through it are unlikely to bump into any other matter.

    Test rigs researchers have created show that data therefore screams along at up to .46 terahertz. The price of this power is power: firing up a device made of the tiny tubes needs 10 volts of juice, compared to one volt for a conventional transistor

  17. Tomi Engdahl says:

    Memristor is 200 years old, say academics
    http://www.edn.com/article/521878-Memristor_is_200_years_old_say_academics.php

    Professor Leon Chua, the originator of the term memristor for a variable resistance device with pinched hysteresis behavior, has joined forces with Professor Christofer Toumazou and Themistoklis Prodomakis, both of Imperial College London, to demonstrate that the behavior has been observed for more than 200 years.

    Chua originally postulated the existence of a fourth fundamental passive electronic circuit element as a consequence of electrical theory and coined the term memristor in 1971. Hewlett Packard Co picked up use of the terminology and produced an electronic circuit element based on titanium oxide in 2008.

    The conclusion is that the memristor is not an invention but a description of a natural phenomenon that can be found in many dissipative devices made from many materials, internal structures and architectures. It adds that observation of memristor behavior just happens to be older than the first formal discussions of the resistor (Ohm in 1827) and the inductor (Faraday in 1831).
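
    For reference, the defining relation (standard textbook form, not quoted from the article) links charge q and flux φ:

        dφ = M(q) dq,   so with i = dq/dt:   v(t) = M(q(t)) · i(t)

    i.e. the device acts as a resistance M(q) whose value depends on the charge that has flowed through it, which is what produces the pinched hysteresis signature mentioned above.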

  18. Tomi Engdahl says:

    Can advanced tools actually slow us down?
    http://www.eetimes.com/electronics-blogs/pop-blog/4373712/Can-advanced-tools-actually-slow-us-down-

    It’s not just personal projects which now have this “do it fast, and do it over (and over)” mode. The relative ease with which we can now write code, then execute it, test it, edit it, and re-execute it can discourage proper planning, despite all the formal coding procedures which software engineers claim they follow.

    Today’s pressures don’t allow for much of the “first, stop and think” or even “first, slow it down” approach. Plus, the tools we have encourage the “do it fast, then fix” approach. It takes real discipline not to be driven by the pressure from management, especially when it is aided by the programming tools which are literally at our fingertips.

    Maybe this pace is actually counterproductive, and we should remember that sometimes it’s better to go slowly and carefully than to go fast but have to do it over and over again—and the appearance of non-stop activity on a project is not the same as actual progress.

  19. Tomi Engdahl says:

    Is it true? Embedded MCU designers learning FPGAs?
    http://www.eetimes.com/electronics-blogs/other/4373642/Is-it-true–Embedded-MCU-designers-learning-FPGAs-

    These are guys who are black-belt “masters of the mystic arts” when it comes to designing embedded systems using microcontrollers that they program in Assembly, C, C++, or some other programming or scripting language. But when I asked them about FPGAs, I was amazed by the number who knew practically nothing about these little rascals.

    The general level of understanding is that an FPGA is “A silicon chip that can be configured to do whatever you want it to do,” and … actually, there is no “and” … that’s it. A lot of these guys and gals have heard that hardware design engineers use languages like Verilog or VHDL, but … once again that’s pretty much it.

    The surprising thing is how many folks mention that they would really like to learn more about FPGAs … it’s just that they’ve never gotten around to it.

    Anyone can learn FPGAs anytime they like… it’s just a matter of getting round to it.

    We just launched a new website called All Programmable Planet that boasts a bevy of the most incredible bloggers
    http://www.programmableplanet.com/

    The thing is that FPGAs offer all sorts of capabilities that are of interest for all sorts of applications. In some cases you might decide to use an FPGA in conjunction with a MCU; in other cases you might configure a portion of the FPGA to implement one or more MCUs; alternatively you might use an FPGA that has one or more hard core MCUs embedded inside it … the possibilities are endless.

    The bottom line is that FPGAs are seeing exponential growth across the board in all sorts of markets for all sorts of applications. Some MCU designers are going to learn how to ride the crest of the FPGA wave, while others will sit around saying “I wish I could learn how to use FPGAs”

  20. Tomi Engdahl says:

    Power: a significant challenge in EDA design
    http://www.edn.com/article/521840-Power_a_significant_challenge_in_EDA_design.php

    Power has become a primary design consideration over the past decade and is causing some big changes in the way that engineers design and verify systems. Physics no longer provides a free ride.

    Power is the rate at which energy is consumed—not a hot topic 10 years ago but a primary design consideration today. A system’s consumption of energy creates heat, drains batteries, strains power-delivery networks, and increases costs.

    The rise in mobile computing initially drove the desire to reduce energy consumption, but the effects of energy consumption are now far-reaching and may cause some of the largest structural changes in the industry. This issue is important for server farms, the cloud, automobiles, chips, and ubiquitous sensor networks relying on harvested energy.

  21. Tomi Engdahl says:

    Power: a significant challenge in EDA design
    http://www.edn.com/article/521840-Power_a_significant_challenge_in_EDA_design.php?cid=Newsletter+-+EDN+Weekly

    Power adds another layer of complexity that designers must verify. It requires additional tool support that manufacturers are cobbling into those now on the market. Power adds several new devices to the design, such as isolation logic, power switches, level shifters, and retention cells.

    According to Dermott Lynch, vice president of marketing at Silicon Frontline Technology, power devices typically operate at 70 to 90% efficiency, which results in a loss of 10 to 30% of overall system power. Ely Tsern, vice president and chief technology officer for Rambus’ semiconductor-business group, adds that more aggressive power-mode transitions, with finer-grained power domains, will result in faster transition of local supply currents, which in turn can induce greater di/dt supply noise for sensitive local circuits, especially analog circuits.

    Shanmugavel cautions, however, that, under all conditions, the power-delivery network should be able to sustain the load without compromising the voltage integrity. For example, when a global clock transitions and a functional unit turns on to perform a task, a transient-current demand occurs. This transient current can be three to five times that of the nominal current, depending on the functional block, which places an enormous load on the power-delivery network. You must validate the transient voltage noise on the network under these circumstances.
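
    A worked example with illustrative numbers (mine, not the article's) shows why: if a clock edge demands a 4 A transient step within 1 ns, then even 1 nH of supply inductance produces a droop of

        v = L · di/dt = 1 nH × 4 A / 1 ns = 4 V

    far beyond any reasonable noise budget, which is why the transient voltage noise on the network must be validated rather than assumed.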

  22. Tomi Engdahl says:

    Engineers Are Happy in Role as Non-Managers
    http://www.designnews.com/author.asp?section_id=1381&doc_id=242782&cid=NL_Newsletters+-+DN+Daily

    If you’ve remained a technical contributor, eschewing the climb up the corporate ladder, are you happy with your decision?

    Consulting engineer Robert Sander Sr. has no regrets about his decision:
    “I spent four years as a manager and hated it all the time. I am much happier as a technical subject matter expert and acknowledged leader of technology in my field.”

    According to consultant William Ketel II, “The sense of personal achievement that comes with a successful design is quite a reward, and that does make me happy. I can’t imagine doing anything else that would be so rewarding that I would get paid for.”

    Lest we assume that management and engineering are discrete functions, Ketel reminds us of the contrary. When he was chief engineer at a startup, he saw himself as being on the engineering side because he was “the one ultimately responsible for everything.” After that startup cratered, his experience with his manager at a subsequent job wasn’t so sanguine:
    “When one [manager] makes it clear that they regard all engineers as interchangeable commodity resources, morale suffers, loyalty is damaged, and the engineers leave. I am at a loss as to how MBA accountants can design electronic vehicle systems.”

  23. Tomi Engdahl says:

    Beat power management challenges in advanced LTE smartphones
    http://www.eetimes.com/design/communications-design/4373791/Beat-power-management-challenges-in-advanced-LTE-smartphones?Ecosystem=communications-design

    Ask any group of people about their phone, and while several may comment on the cool new features and apps, you are also sure to find several who complain about smartphone battery life.

    Mobile applications are the latest phenomenon causing a larger than normal spike in power demand. Multiple power-hungry functions, including high-speed graphics processing and mobile broadband connections, in addition to other radio connections that manage functions such as location, are all running simultaneously.

    This power spike will have a significant impact on the mobile industry as handsets and user expectations evolve and more advanced air interfaces such as LTE become common. Whether they are mobile business warriors conducting work via the cloud on a tablet or smartphone, the geekiest gamers or somewhere in between, mobile devices users share two things in common. First, they expect their technology to work whenever and wherever they use it. Second, they are frustrated by battery issues.

    A number of factors affect power consumption and the subsequent impact on device battery life. In the case of smaller mobile devices, like smartphones, thermal management is one of the key challenges. Efficient thermal management will help increase battery life, as well as stop the waste heat from leaching out. Reduction of temperature can be achieved through several other methods, including the casing design.

    Reducing the temperature not only conserves the power in the device, but also preserves the device itself: wear and damage due to heat is a real concern.

    The power consumption issue is compounded by the need for higher data rates. Essentially, the more data that needs to be processed at speed, the worse the power consumption problem is going to become. Consider the nature of smartphones: These devices are more than just phones; they’re expected to play music and video files, as well as streaming video. They often also serve as photo storage devices. This multitasking requires immersive graphics and sound, memory access and high-speed broadband network connectivity through HSPA+ and LTE – all of which are power hungry. To complicate the issue, very often the user has applications like email and Facebook running in the background.

    The underlying software has done a lot to address the issues in these devices. Hardware modifications can achieve a great deal as well. The first thing a design architect can offer is a thermal simulation. This enables a manufacturer to assess the temperature limitations and emissions for each component and to test them against use in GSM, 3G and LTE environments

    The right design can make a big impact.

    The smaller the consumer device, the more difficult it is likely to be to route the heat somewhere productive or least damaging as it has fewer places it can go. This is where ASIC and particularly digital baseband design comes into its own, with the ability to switch off some power domains to minimize leakage.

    Smartphone demands for more multitasking and a better overall mobile experience are here to stay, whether it’s using mobile applications, streaming video or another yet-to-be-imagined application. Next-generation devices will face even more critical power considerations

  24. Tomi Engdahl says:

    Renesas outsources top-end chips to TSMC as shake-out looms
    http://www.reuters.com/article/2012/05/28/us-renesas-restructuring-idUSBRE84P03820120528

    Japan’s Renesas Electronics will outsource its top-end chips to Taiwan Semiconductor Manufacturing Co to survive cut-throat global competition after falling behind in investment and as it grapples with a costly restructuring.

  25. Tomi Engdahl says:

    Inside the Schick Hydro microcontroller-powered wet razor
    http://www.edn.com/article/521808-Inside_the_Schick_Hydro_microcontroller_powered_wet_razor.php

    Though the shaving experience may be overrated, a look inside the Schick Hydro reveals another unexpected application for microcontrollers—in this case a less-than-$1 PIC10F222 from Microchip.

  26. Tomi Engdahl says:

    Intel Ivy Bridge Processor Hits 7GHz Overclock Record
    http://hardware.slashdot.org/story/12/05/31/2020204/intel-ivy-bridge-processor-hits-7ghz-overclock-record

    “Renowned Overclocker HiCookie used a Gigabyte Z77X-UD3H motherboard to achieve a fully validated 7.03GHz clock speed on an Intel Core i7 3770K Ivy Bridge processor. As it stands, that’s the highest clockspeed for an Ivy Bridge CPU, and it required a steady dose of liquid nitrogen to get there.”

  27. Tomi Engdahl says:

    Image Sensors to hit $3.7 billion in 2017
    http://www.eetimes.com/electronics-blogs/other/4374489/Image-Sensors-to-hit–3-7-billion-in-2017

    According to a new report by Transparency Market Research, U.S. image sensors including linear, CCD, and x-ray are expected to reach $3.7 billion in 2017, up from $2.0 billion in 2011.

    84.2% of the gain is attributable to linear image sensors

    CMOS image sensor shipments reached a whopping 464.0 million units in 2011, a growth of 27.6% over 2010, not surprising as they account for 90% of the linear image sensor market.

    What’s important to remember is that these revenue numbers are particularly impressive given the sharp decline in sensor prices over the past several years, with the trend expected to continue.

  28. Tomi Engdahl says:

    Design headache: How do you design to minimize the impact of display obsolescence?
    http://www.edn.com/electronics-blogs/powersource/4374238/Design-headache-How-do-you-design-to-minimize-the-impact-of-display-obsolescence-

    Designing with LCDs can be a challenge because of the lack of standardization among the different product offerings. While this is true to some extent for almost all electronic components, most have some degree of commonality in voltage levels, package types, or at least connectors. Not so with LCDs. Not with their dimensions or, most tellingly, their connectors. And don’t even think about second sourcing in most instances – suppliers want you to design in their products and then be captive.

    “In my search for a replacement, I discovered that no two display manufacturers make pin- and function-compatible displays, resulting in a costly re-design.”

    Why are display manufacturers so reluctant to standardize products? What makes it so difficult? How does one design to minimize the impact of display obsolescence?”

    While I have heard grumblings over the years about the lack of standardization in LCDs, I had not realized that there was virtually no second-sourcing of the displays. How can they get buy-in from major device manufacturers or the military?

    Ken’s not naïve: He understands the business decisions of companies can result in product changes which are highly-inconvenient to some of their customers, and that standardization is often not in the LCD supplier’s competitive interests.
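
    One common way to soften the blow, sketched below (my illustration; the article gives no code), is to hide every panel behind a small driver interface so that a display swap means writing one new driver file rather than redesigning the product:

        /* Minimal sketch of a display abstraction layer in C.  Each panel
         * provides one instance of this struct; application code calls only
         * through the pointers, so an obsolete LCD is replaced by adding a
         * new driver, not by touching the application. */
        #include <stdint.h>

        struct display_driver {
            int width, height;                       /* panel geometry, pixels */
            int  (*init)(void);                      /* power up and configure */
            void (*blit)(int x, int y, int w, int h,
                         const uint16_t *px);        /* draw an RGB565 rectangle */
            void (*backlight)(uint8_t level);        /* 0 = off ... 255 = full */
        };

        /* Stub driver standing in for a real panel; a production driver
         * would talk to the LCD controller over SPI or a parallel bus. */
        static int  stub_init(void) { return 0; }
        static void stub_blit(int x, int y, int w, int h, const uint16_t *px)
        { (void)x; (void)y; (void)w; (void)h; (void)px; }
        static void stub_backlight(uint8_t level) { (void)level; }

        static const struct display_driver panel_stub = {
            320, 240, stub_init, stub_blit, stub_backlight
        };

        static const struct display_driver *disp = &panel_stub;

        void ui_show_splash(const uint16_t *image)
        {
            if (disp->init() == 0) {
                disp->blit(0, 0, disp->width, disp->height, image);
                disp->backlight(200);                /* about 80% brightness */
            }
        }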

  30. Tomi Engdahl says:

    An IHS iSuppli Inventory Insider Market Brief this week reinforced a positive outlook on chips. The report found that total semiconductor inventory as a percentage of suppliers’ revenue amounted to 50 percent in the first quarter, up from 47.8 percent in the fourth quarter.

    “The higher inventory numbers among semiconductor suppliers for the first quarter of 2012 represent a signal of better things to come,” said IHS. “There was an increasing level of inventory both among chip suppliers and customers, indicating that both the supply and demand sides of the business believe that the environment in the electronics market has turned positive.”

    Source:
    http://www.cio.com/article/708020/Wall_Street_Beat_Economic_Uncertainty_Continues_to_Plague_Tech

  31. Tomi Engdahl says:

    The Growing Importance of Asymmetric and Asynchronous Processing
    http://www.fpgagurus.edn.com/blog/fpga-gurus-blog/growing-importance-asymmetric-and-asynchronous-processing?cid=Newsletter+-+EDN+on+FPGAs

    There are many more opportunities for multicore chips based on asymmetric cores – where processors might work on the same data stream, but not in lockstep, and maybe not even with a common kernel – or asynchronous cores, where co-processors do not have to wait for a master processor to finish work on a data set before they can begin their own work, regardless of any common clocks.

    As major FPGA vendors turn to 28-nm generations that allow combined suites of control-plane, datapath, and DSP processors, designers will require more and more help with both software and vertical logic blocks

    In the embedded world, SMP multicores will be a special case. The mainstream of design will be complex single-chip platforms stuffed with asynchronous and asymmetric cores.

  32. Tomi Engdahl says:

    When Chips Are Designed to be Sloppy
    http://www.fpgagurus.edn.com/blog/fpga-gurus-blog/when-chips-are-designed-be-sloppy?cid=Newsletter+-+EDN+on+FPGAs

    When The Economist magazine’s Technology Quarterly featured an article on sloppy processing in its June 2 edition (http://www.economist.com/node/21556087), the magazine had to remind readers that professional chip designers prefer to call the new techniques pioneered at Rice University and other institutions something bearing a pleasant euphemism like “inexact hardware” or “probabilistic computing.” Seriously, though, The Economist’s preference for sloppy processing probably is a more apt descriptor.

    DSP processors often have to give a result in video or audio processing where an inexact solution is perfectly OK for the user. In such cases, allowing a certain amount of error can help reduce the overall chip’s power budget by as much as 50 percent.

    The methodology has moved beyond the speculative stage. The Economist article referred to one U.S. Navy contract in which Singular Computing is working with Charles River Associates on a shipboard video-processing chip utilizing some of these ideas.

    Now, most datapath processor cores and virtually all control-plane cores are too critical to use sloppy processing, but the techniques used by the Rice University group can isolate the inexact processing streams from those using critical results.

    It seems as though the development of such IP cores would be far easier, and more practical within a chip design, than the use of fuzzy logic, or quantum computing methods.
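
    The core idea is easy to demonstrate numerically, even though the power savings only appear in silicon. A minimal sketch of the general technique (my illustration, not the Rice group's actual method): emulate an inexact multiplier by discarding low-order bits (the kind of shortcut a video datapath might take), while keeping the error bookkeeping itself on the exact path.

    /* Model the numerical effect of an "inexact" 8-bit multiplier. */
    #include <stdio.h>
    #include <stdlib.h>

    #define DROP 4  /* low-order bits the sloppy datapath ignores */

    static unsigned mul_exact(unsigned a, unsigned b) { return a * b; }

    static unsigned mul_sloppy(unsigned a, unsigned b)
    {
        /* Fewer active bits means a smaller multiplier and less
         * switching activity in hardware; here we model only the
         * numerical error, not the power. */
        return ((a >> DROP) * (b >> DROP)) << (2 * DROP);
    }

    int main(void)
    {
        unsigned worst = 0;
        for (int i = 0; i < 100000; i++) {
            unsigned a = rand() & 0xFF, b = rand() & 0xFF;
            unsigned e = mul_exact(a, b), s = mul_sloppy(a, b);
            unsigned err = e > s ? e - s : s - e;
            if (err > worst) worst = err;   /* exact bookkeeping */
        }
        printf("worst-case error: %u of %u\n", worst, 255u * 255u);
        return 0;
    }

    For pixel or audio samples such bounded errors are often invisible to the user, which is exactly the trade the article describes; the control logic deciding whether to use the sloppy path would, of course, stay exact.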

  33. Tomi Engdahl says:

    Studies on Diffractive Mobile Display Backlights
    http://research.nokia.com/publication/12339

    The conventional mobile liquid-crystal display structure consists of a backlight unit with the associated light sources, the display panel itself, and various optical films that control the state of polarization and viewing characteristics of the display. The backlight unit itself has evolved in the last fifteen years to become a very efficient component to provide uniform illumination to the electro-optic spatial light modulator comprised by the liquid-crystal pixel array. In the conventional structure, the color filter array embedded in the liquid-crystal display panel limits the light throughput in the display system. The backlight unit itself in the conventional configuration is difficult to improve any further, and a system redesign is required to make the display system perform more efficiently than what is currently possible.

    One possible way to redesign a display system more efficiently is to direct the appropriate primary bands of light through the respective subpixels in the display panel, instead of having color filters produce the primary colors from white light. This can be done by diffractive means, i.e. by placing a grating structure on the light guide plate of the backlight unit. Significant improvement in the energy efficiency of a mobile display system can be achieved by this approach, and at the same time cost savings can be expected due to the elimination of many beam-shaping films in the backlight unit, as compared to the conventional mobile liquid-crystal display configuration. Further cost savings can be achieved by removing the color filter array, provided that the color purity of the outcoupled light is good.
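
    For reference, the relation such a grating exploits is the standard diffraction equation (general optics, not a formula quoted from the publication):

        d (sin θ_m − sin θ_i) = m λ

    Because the diffraction angle θ_m depends on the wavelength λ, a grating with a suitably chosen period d can steer the red, green, and blue bands from the light guide toward their respective subpixels instead of filtering them out of white light.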

    This thesis presents a new pixelated color-separating grating array concept that diffracts the light from red, green, and blue light-emitting diodes in the backlight unit through a subpixel array in a prospective mobile display module.

  34. Tomi Engdahl says:

    When GPS Goes Down, Pentagon Still Wants a Way to Fight
    http://www.wired.com/dangerroom/2012/06/darpa-gps/

    The navigational system used by the military for just about everything from guiding drones to dropping bombs is increasingly under threat of attack. Now, the Pentagon’s desperate to replace it. Or, at least, reinforce it enough to stave off a looming storm of strikes.

    That’s the thrust of a new venture from Darpa, the military’s premier research arm and the brains behind GPS’ initial development in the 1950s. On Tuesday, the agency announced the second phase of their program, “All Source Positioning and Navigation (ASPN),” that’s trying to “enable low-cost, robust and seamless navigation solutions … with or without GPS.”

    The program, which Darpa quietly kicked off last year with two awards for theoretical research, is one part of a larger military effort that’s trying to steer the Pentagon away from its GPS dependency.

    Why? First off, there’s the growing risk of GPS signals being jammed by adversarial forces. Enemies on the ground can also “spoof” a GPS system — essentially tricking it into showing an incorrect location. And these are far from hypothetical risks.

    The risks now inherent in GPS are well-known, but it doesn’t look like Darpa’s ready to give up on the system altogether. Instead, they’re after a navigational system that can swiftly move between different combos of devices, using a “plug-and-play” approach.

    Of course, there are already plenty of GPS alternatives available. Radio beacons, which transmit signals from static locations to receiving devices, allow the calculation of location based on proximity to various beacons. Ground feature navigation extracts the positions of tracked objects and then uses them as points of reference to gauge a vessel’s locale. And stellar navigation systems use the coordinates of celestial bodies to assist in a vehicle’s navigation.

    Darpa’s dream navigational system would go beyond those kinds of discrete systems — by incorporating pretty much all of them. The ASPN system

    Among the ton of gadgets that Darpa wants the system to utilize: 3-D imagers, LiDAR, temperature sensors … and good old compasses.
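
    The "plug-and-play" fusion idea can be sketched in a few lines (a toy illustration with made-up numbers, not DARPA's ASPN design): weight whatever position sources are currently available by the inverse of their error variance, so the solution degrades gracefully when GPS drops out.

    /* Inverse-variance fusion of whatever nav sources are available. */
    #include <stdio.h>

    struct source {
        const char *name;
        double pos;       /* 1-D position estimate, metres */
        double var;       /* error variance, metres^2      */
        int    available; /* sources may come and go       */
    };

    static double fuse(const struct source *s, int n)
    {
        double wsum = 0.0, psum = 0.0;
        for (int i = 0; i < n; i++) {
            if (!s[i].available) continue;
            double w = 1.0 / s[i].var;   /* trust ~ 1/variance */
            wsum += w;
            psum += w * s[i].pos;
        }
        return psum / wsum;              /* assumes >= 1 source */
    }

    int main(void)
    {
        struct source s[] = {
            { "GPS",          100.2,  1.0, 1 },
            { "radio beacon", 101.5,  9.0, 1 },
            { "stellar fix",   99.0, 25.0, 1 },
        };
        printf("all sources: %.2f m\n", fuse(s, 3));
        s[0].available = 0;              /* GPS jammed or spoofed */
        printf("GPS denied : %.2f m\n", fuse(s, 3));
        return 0;
    }

    A real system would run this inside a Kalman-style filter over 3-D position, velocity, and sensor biases, but the weighting principle is the same.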

  35. Tomi Engdahl says:

    The sad truth about Nokia’s situation: manufacturing industry is fleeing the West

    “Nokia’s changes have been followed for a couple of years now. The company’s market situation has constantly been expected to return to normal, which does not make the changes any easier to digest,” Aunesluoma says.

    According to him, Nokia is just one player in a global trend in which manufacturing is leaving the West because of its higher costs. What is notable about the move to low-cost countries is that companies today carry out R&D activities in Asian countries as well, for example.

    “In Western countries, manufacturing is specializing in high-tech niche products, in the electronics industry as in other fields.”

    According to Aunesluoma, similar examples can be seen in Sweden as well.

    Source: http://www.tietoviikko.fi/kaikki_uutiset/karu+totuus+nokian+tilanteesta+valmistava+teollisuus+karkaa+lannesta/a816598?s=r&wtm=tietoviikko/-14062012&

  36. Tomi Engdahl says:

    Wall Street Beat: Bad News Rolls in for Tech
    IDC says the software market will slow down, as Nokia faces tough times ahead
    http://www.cio.com/article/708563/Wall_Street_Beat_Bad_News_Rolls_in_for_Tech?page=1&taxonomyId=1375

    “IDC expects the overall software market to return to more conservative growth in the years to come,” said Patrick Melgarejo, director of IDC’s software trackers, in the report.

    IDC also had some troubling news for the storage market.
    “The first quarter saw decidedly mixed results,”

    On the hardware and components side of tech, TI and Nokia gave fresh cause for worry. TI on Monday narrowed its expected ranges for revenue and earnings per share (EPS).

    On Friday, Moody’s ratings agency downgraded Nokia’s debt grade to junk status, noting greater-than-expected pressure on the company’s earnings.

  37. Tomi Engdahl says:

    ARM CTO asks: Are we getting value from verification?
    http://www.edn.com/design/integrated-circuit-design/4375333/ARM-CTO-asks–Are-we-getting-value-from-verification-

    He traced the improvements in design tools and methodologies, noting the impact on complexity and power. As for the impact on productivity, he pointed out that automated tools now handle much of the custom design work that once fell to the hardware team. For example, layout took six months for the ARM1 but only 32 minutes for the Cortex-M0.
    The software team has demonstrated more conservative numbers, Muller said.

    Turning his attention to verification, Muller noted that verification of the ARM1 had taken 200 hours; today, the task requires 1,439,000 hours and occupies the resources of a 1.5-MW data center. Verification effort has effectively increased by a factor of 3,000,000; expressed as a productivity figure, it has fallen by the same factor. Muller then asked the fundamental question: Is this investment in verification well spent?

    He answered the question with the declaration that complexity has spiraled because design is not formal. Formal processes are reserved for verification but should be applied to design, Muller said; we should be able to push a button and get an answer.

    Hardware engineers have the power to constrain the verification problem, but they have been complacent, Muller told his DAC audience.
    Video of the DAC opening session, including Muller’s talk, is available on the DAC Web site.
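
    As a quick sanity check on those figures (my arithmetic, not the talk’s): the raw ratio of verification hours is

        1,439,000 h / 200 h ≈ 7,200

    and the layout speed-up works out to roughly 259,200 min / 32 min ≈ 8,100 (taking six months as about 259,200 minutes). The quoted factor of 3,000,000 is therefore evidently normalized per unit of design complexity rather than being a plain ratio of hours.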

  38. Tomi Engdahl says:

    Low Power is Everywhere
    http://www.synopsys.com/Company/Publications/SynopsysInsight/Pages/Art4-low-power-IssQ2-12.aspx?cmp=Insight-I2-2012-Art4-Email1&elq_mid=3545&elq_cid=303473

    Meeting power budgets for most System-on-Chip (SoC) designs today is no longer a requirement for mobile applications only. Almost every market segment today has some concern with designing in low power features — although the driving factors differ among them. Mary Ann White, Synopsys, explains how different segments have different reasons for making power a primary design requirement.

    The primary impetus for low power design initially came from the mobile market and the need to extend battery life; however, different segments have different reasons for making power a primary design requirement. For example, the growth of the internet and social media drives the server and networking market segments, where large server clouds and compute farms need to work reliably without overheating, so their primary concern is reducing the amount of expensive energy required for operation and air conditioning. Other markets, such as the multimedia and set-top box segments, are plugged into the wall, but ‘green’ initiatives and the high cost of electricity have pushed them to increase energy efficiency through low power techniques similar to those used in the mobile application space.

  39. Tomi Engdahl says:

    Optimizing FPGAs for power: A full-frontal attack
    http://www.eetimes.com/design/military-aerospace-design/4375561/Optimizing-FPGAs-for-power–A-full-frontal-attack?Ecosystem=communications-design

    This piece is a hands-on guide to optimizing FPGA designs for low-power operation. The topic is universal but, of course, especially important for wireless and mobile military and aerospace applications that run on battery power.

    The article is a rundown of low-power design techniques for the latest 7 series FPGA families at each stage of the development cycle.

  40. Tomi Engdahl says:

    Multi-core is coming fast. More than 60% of embedded engineers expect to be using multi-core or multiprocessor architectures in their development projects by 2012, according to VDC Research. Unfortunately, development teams making the transition to multi-core are learning that old techniques and tools are no longer effective.

    Source: EDN’s Resource Center e-mail June 20, 2012

    Data races are risky because they are easy to introduce and difficult to find. A widespread misconception is that some forms of data races are harmless. However, modern optimizing compilers generate code that can cause incorrect execution when data races exist, even for those that are thought to be benign.
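
    A classic example of a "benign-looking" race, sketched in C with POSIX threads: two threads increment a shared counter with no synchronization. The read-modify-write interleaves, and an optimizing compiler is free to keep the counter in a register, so the final value is unpredictable. That is exactly the kind of defect a static analyzer or race detector is meant to flag.

    /* Deliberately buggy: an unsynchronized shared counter. */
    #include <pthread.h>
    #include <stdio.h>

    static long counter = 0;          /* shared, unprotected: the bug */

    static void *bump(void *arg)
    {
        (void)arg;
        for (int i = 0; i < 1000000; i++)
            counter++;                /* racy read-modify-write */
        return NULL;
    }

    int main(void)
    {
        pthread_t a, b;
        pthread_create(&a, NULL, bump, NULL);
        pthread_create(&b, NULL, bump, NULL);
        pthread_join(a, NULL);
        pthread_join(b, NULL);
        printf("counter = %ld (expected 2000000)\n", counter);
        return 0;
    }

    The fix is to make the increment atomic (C11 atomic_fetch_add) or guard it with a mutex; the point is that nothing about the code looks dangerous at a glance.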

    Concurrency carries many risks, and there are surprising ways in which programmers can inadvertently introduce such bugs into their code.

    One way of finding and eliminating such defects automatically is static analysis.

    Source: http://www.edn.com/electrical-engineers/education-training/webinars/4236670/Hazards-of-Multi-threaded-and-Multi-core-Software-Development-

  41. Tomi Engdahl says:

    Energy-efficient MCUs transform consumer and industrial apps
    http://www.edn.com/electronics-products/other/4375699/Energy-efficient-MCUs-transform-consumer-and-industrial-apps

    The Kinetis L Series is the first MCU built on the ARM Cortex-M0+ processor, and is claimed to be the world’s most energy-efficient MCU. The 32-bit devices are designed to transform consumer and industrial apps currently using 8- and 16-bit architectures.

    The product consumes 50 µA/MHz* in very-low-power run (VLPR) mode, and rapidly wakes from a reduced power state to process data and return to sleep.
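
    The reason µA/MHz plus fast wake-up matters is duty cycling. A back-of-the-envelope sketch (only the 50 µA/MHz figure comes from the article; the clock, sleep current, and duty cycle below are assumptions for illustration):

    /* Average current of a duty-cycled MCU: mostly asleep. */
    #include <stdio.h>

    int main(void)
    {
        double run_ua_per_mhz = 50.0;  /* from the article           */
        double f_mhz          = 48.0;  /* assumed clock              */
        double sleep_ua       = 0.5;   /* assumed deep-sleep current */
        double duty           = 0.01;  /* assumed 1% awake time      */

        double run_ua = run_ua_per_mhz * f_mhz;            /* 2400 µA */
        double avg_ua = duty * run_ua + (1.0 - duty) * sleep_ua;

        printf("average current: %.1f uA\n", avg_ua);      /* ~24.5 */
        return 0;
    }

    With numbers in this range a small battery lasts years, which is why rapid wake and return to sleep is as important as the headline run current.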

  42. Tomi Engdahl says:

    Product How-To: FPGAs for speed and flexibility
    http://www.edn.com/design/systems-design/4375713/Product-How-To–FPGAs-modules-for-speed-and-flexibility

    If traditional bus-interfaced I/O is used, you will see, in an ideal situation, a control loop execution time of approximately 1 µs. An example would be a CPU running a real-time OS and talking over the PCI bus to an I/O module that is connected to the external world.

    This same I/O control loop, when built around an FPGA module, is roughly 1000x faster: the control loop is executed in the FPGA, just the other side of the I/O transceivers.
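
    Taking those figures at face value (my arithmetic, not the article’s):

        1 µs / 1000 = 1 ns

    i.e., nanosecond-scale loop latency, a few fabric clock cycles at most. That is plausible only because the loop closes directly at the I/O transceivers instead of crossing a bus, an OS scheduler, and a driver stack.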

    The external world has a wide mix of I/O interfaces. Any FPGA module to be used must meet that interface.

  43. Tomi Engdahl says:

    Situation in the EE job market
    http://www.edn.com/electronics-blogs/other/4375847/Situation-in-the-EE-job-market-?cid=EDNToday

    Back in October, an article appeared in The Wall Street Journal with the headline “Why Companies Aren’t Getting the Employees They Need.” It noted that even with millions of highly educated and highly trained workers sidelined by the worst economic downturn in three generations, companies were reporting shortages of skilled workers.

    The author of the article, Peter Cappelli, an expert on employment and management issues, concluded that although employers are in almost complete agreement about the skills gap, there was no actual evidence of it. Instead, he said, “The real culprits are the employers themselves.”

    Why Good People Can’t Get Jobs: The Skills Gap and What Companies Can Do About It
    http://wdp.wharton.upenn.edu/books/why-good-people-cant-get-jobs/

    Even in a time of perilously high unemployment, companies contend that they cannot find the employees they need. Pointing to a skills gap, employers argue applicants are simply not qualified; schools aren’t preparing students for jobs; the government isn’t letting in enough high-skill immigrants; and even when the match is right, prospective employees won’t accept jobs at the wages offered.

    In this powerful and fast-reading book, Peter Cappelli, Wharton management professor and director of Wharton’s Center for Human Resources, debunks the arguments and exposes the real reasons good people can’t get hired.

    Named one of HR Magazine’s Top 20 Most Influential Thinkers of 2011, Cappelli not only changes the way we think about hiring but points the way forward to rev America’s job engine again and put people back to work.

  44. Tomi Engdahl says:

    Design News Radio Explores Big-Picture Systems Engineering
    http://www.designnews.com/author.asp?section_id=1394&doc_id=245953

    Systems-level thinking: It’s a tried-and-true design principle, yet one that engineering groups have long struggled to make a reality rather than just a classroom concept.

    While most manufacturers see the value in big-picture, systems-level thinking, the realities and limitations of traditional design tools and existing development processes have kept mechanical, electrical, and software engineering groups tethered to siloed tools and balking at cultural changes that demand more cross-functional cooperation.

    That insular mindset is starting to change, thanks to the weight of regulatory and compliance requirements. Companies are starting to demand process changes and new tool capabilities that promote cross-functional collaboration early on in the design process when it’s far more cost-effective to make changes. In response to their demands, the design tool vendors are rolling out a spate of new products and integrations.

  45. Tomi Engdahl says:

    Keep the Focus on Requirements
    http://www.designnews.com/author.asp?section_id=1386&doc_id=246000

    Looks like someone in program management lost track of the requirements.

    Adding unnecessary features just because you can is sloppy engineering.

    For many mission-critical systems, redundancy is an essential part of a reliability strategy.

    You can’t arrive at your destination unless you know where you’re going, which is why setting requirements at the beginning of a development project is so important. The job doesn’t end there, though. You have to manage the requirements and avoid mission creep, which can consume significant amounts of engineering hours without, in a case like this, bringing significant benefit to the user.

    Doctors vow first to do no harm. Engineers should likewise have a first and foremost vow: get the job done. Or, take a cue from Occam’s Razor: choose the simplest solution.

    Is your philosophy “more is more” or “keep it simple, stupid”?

    What’s your technique for staying focused on requirements and not adding unnecessary functionality?

  46. Tomi Engdahl says:

    Innovative design engineers are incorporating a powerful set of sustainable practices in their product designs to improve the integrity of the product – and not just to reduce their carbon footprint. These practices impact every stage in the product lifecycle from ideation to board layout, to material and component selection, manufacturing and assembly, and end-of-life processing. Not only are these practices desirable, they are becoming mandatory at many leading manufacturers as they can significantly reduce both cost and waste. A deep knowledge of sustainable product design fundamentals is essential in today’s competitive product design environment.

    Source: Sustainability: Delivering Value with Product Stewardship mail

  47. Tomi Engdahl says:

    Intel may see $2B non-x86 growth in 2013
    http://www.eetimes.com/electronics-news/4376018/Intel-may-see–2B-non-x86-growth-in-2013

    Next year, Intel is poised to generate $2 billion in revenues—half its expected revenue growth—from chips outside its traditional x86 processors, according to a financial analyst who tracks the company.

    With its relatively new embedded, NAND flash and wireless products, “Intel has dramatically outperformed its competition on revenue growth and/or profitability from 2008-2012 and is poised to extend these gains in 2013,” said Ross Seymore, an analyst with Deutsche Bank Equity Research.

    “These segments have the potential to generate about 50 percent of the $4 billion revenue growth implied in our 2013 estimates”

  48. Tomi Engdahl says:

    Tech Manufacturing Is a Disaster Waiting To Happen
    http://slashdot.org/story/12/06/25/0224219/tech-manufacturing-is-a-disaster-waiting-to-happen

    “Peter Cochrane writes that since globalization took hold, geographic diversity has become distorted along with the resilience of supply so we now have a concentration of limited sourcing and manufacture in the supply chain in just one geographic region, south-east Asia, amounting to a major disaster just waiting to happen. ‘Examples of a growing supply-chain brittleness include manufacturers temporarily denuded of LCD screens, memory chips and batteries by fires, a tsunami, and industrial problems,’ ”

    “Today, PCs, laptops, tablets and smartphones are produced by just 10 dominant contract manufacturers”

    “The bad news is that many of the 10 big players in the IT field are not making good profits, so economic pressure could result in the 10 becoming seven.”

  50. Tomi Engdahl says:

    Machine Safety: Where do I start?
    http://www.controleng.com/single-article/machine-safety-where-do-i-start/f74ba706bb7cd736d4c403013498d93b.html

    Many machine safety experts generally recommend these eight basic steps for better machine safety and machine guarding:

    1. Management needs to assign one executive with machine safety oversight responsibility.

    2. A team needs to be identified for implementation and execution of company safety policy.

    3. Gather all your machine safety standards, OSHA regulations, drawings, manuals, accident and/or maintenance history, etc.

    4. Conduct a plant-wide high-level machine guarding/safety assessment to establish hazard level priorities.

    5. Immediately install any fixed/moveable guards as necessary based on the above.

    6. Conduct machine specific risk assessments based on the priorities established above.

    7. Mitigate all identified hazards to acceptable levels.

    8. Document the entire process.

    Safety control systems: Essential considerations, costs
    http://www.controleng.com/single-article/safety-control-systems-essential-considerations-costs/2bb2c9af9894f6c9e44c447b697d8159.html

    Safety control systems have four typical architectures, each with advantages and disadvantages, and differing cost (per unit).

    Increased integrity and reliability of safety systems has been promoted in European law with the migration from EN 954-1 to the latest SIL and PL standards: EN ISO 13849 and IEC/EN 62061.

    For machine builders and owners, the ideal safety system architecture must conform with the minimum industry standards while allowing safe machine operation and not preventing operators from efficient production. The systems should be subject to a lifecycle cost analysis and a benefit-to-cost ratio analysis.

    Automation manufacturers that provide safety control equipment include:

    Pilz Safety
    Siemens
    GuardMaster (part of Rockwell Automation)
    Sick
    Schmersal

    By their nature, machine safety systems are an understood element of most control applications and are tailored to meet safe machine operation requirements.

    Selecting safety components for inclusion in a safety system is subject to the same rigors as any design engineering exercise. They must, obviously, be fit for purpose, satisfy the technical (or safety integrity) demands of an application, and be cost competitive. (Safety is probably one of the areas where, of those two, cost is the lower priority.) Technical and product support are, as always, in the mix.

    Option 1: Safety Relays

    All emergency stop buttons are wired in series (with dual channels for redundancy).
    Dual contactors are also used for redundancy.

    This is the most basic safety system configuration, with the cost per unit starting between £100 [~$158] and £300 [~$475].*
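
    The dual-channel idea translates into logic that is easy to show in software terms (a toy sketch for illustration, not certified safety code): either channel opening trips the machine, and sustained disagreement between the channels (a broken wire or a welded contact) is itself reported as a fault, which is why two channels beat one.

    /* Toy dual-channel e-stop evaluation; NOT for real safety use. */
    #include <stdio.h>
    #include <stdbool.h>

    enum state { RUN, STOPPED, FAULT };

    static enum state evaluate(bool ch1_closed, bool ch2_closed,
                               int *disagree_cycles)
    {
        if (ch1_closed && ch2_closed) {
            *disagree_cycles = 0;
            return RUN;
        }
        if (!ch1_closed && !ch2_closed) {
            *disagree_cycles = 0;
            return STOPPED;            /* clean dual-channel stop */
        }
        if (++*disagree_cycles > 3)    /* allow brief contact skew */
            return FAULT;
        return STOPPED;                /* fail safe while undecided */
    }

    int main(void)
    {
        int skew = 0;
        /* Simulated scan cycles: channel 2 sticks closed after a stop. */
        bool ch1[] = { true, true, false, false, false, false, false };
        bool ch2[] = { true, true, false, true,  true,  true,  true  };
        for (int i = 0; i < 7; i++) {
            enum state s = evaluate(ch1[i], ch2[i], &skew);
            printf("cycle %d: %s\n", i,
                   s == RUN ? "RUN" : s == STOPPED ? "STOPPED" : "FAULT");
        }
        return 0;
    }

    A hardwired safety relay implements essentially this comparison in hardware; configurable safety relays and safety PLCs let the same cross-monitoring be expressed in software with certified function blocks.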

    Option 2: Configurable safety relays

    Components are wired individually, making testing easier.
    Reset point is configurable in the software.

    Typical price per unit is between £300 [~$475] and £1000 [~$1582].*

    Option 3: Redundant PLC arrangement

    A typical arrangement for programmable logic controllers (PLCs) used in critical applications is to configure a redundant pair, often with “hot-swap” functionality. The redundant controller is used to support a safe and orderly shutdown in the event the primary controller fails.

    The price for two parallel PLC systems starts around £10 000 [~$15,822].

    Option 4: Safety PLCs

    Specific code for safety applications is written in addition to the normal PLC code.
    Safe I/O modules can be centralized or remote.

    Typical price per unit is from £2000 [~$3,164].*

