Electronics trends for 2013

The electronics industry will hopefully start to glow again after a not-so-good 2012. It’s safe to say that 2012 was a wild ride for all of us. The global semiconductor industry demonstrated impressive resilience in 2012, despite operating in a challenging global macroeconomic environment. Many have already ratcheted back their expectations for 2013. Beyond 2012, the industry is expected to grow steadily and moderately across all regions, according to the WSTS forecast. So we should see moderate growth in 2013 and 2014. I hope this happens.

The non-volatile memory market is growing rapidly. The Underlying technologies for non-volatile memories article tells that non-volatile memory applications can be divided into standalone and embedded system solutions. Standalone applications, which tend to be driven primarily by cost, are dominated by NAND flash technology. The embedded market relies mainly on NOR flash for critical applications and NAND for less critical data storage. Planar CT NAND and 3D NAND could become commercially viable this year or within a few years. MRAM, PCRAM, and RRAM will need more time and new material innovations to become major technologies.

Multicore CPU architectures are a little like hybrid vehicles: Once seen as anomalies, both are now encountered on a regular basis and are widely accepted as possible solutions to challenging problems. Multi-core architectures will find their application but likely won’t force the extinction of single-core MCUs anytime soon. Within the embedded community, a few applications now seem to be almost exclusively multicore, but in many others multicore remains rare. There are concerns over the complexity and uncertainty about the benefits.

The FPGAs as the vanishing foundation article tells that we are entering a new environment in which the FPGA has faded into the wallpaper – not because it is obsolete, but because it is both necessary and ubiquitous. After displacing most functions of ASICs, DSPs, and a few varieties of microcontrollers, it’s fair to ask if there is any realm of electronic products where use of the FPGA is not automatically assumed. Chances are, in the next few years, the very term “FPGA” might be replaced by “that ARM-based system on a chip” from Xilinx, Altera, Lattice, or another vendor.

Software and services have become the soul of consumer technology. Hardware has become increasingly commoditized into blank vessels that do little more than hold Facebook and Twitter and the App Store and Android and iOS.

Are products owned when bought? The trend in recent decades has been an increase in the dependence of the buyer on the seller.

More than 5 billion wireless connectivity chips will ship in 2013, according to market research firm ABI Research. This category includes standalone chips for Bluetooth, Wi-Fi, satellite positioning, near-field communications, and ZigBee, as well as so-called “combo” chips that combine multiple standards. Broadcom is seen retaining its lead in connectivity chips. Bluetooth Smart, WiGig, and NFC are all seeing increased adoption in fitness, automotive, and retail applications. Combo chips are also a growing opportunity based on the popularity of smartphones, tablet computers, and smart televisions.

Signal integrity issues are on the rise as both design complexity and speed keep increasing. The analog world is moving faster than ever. Learning curves are sharper, design cycles are shorter, and systems are more complex. Add to all this the multidisciplinary, analog/digital nature of today’s designs, and your job just gets more complicated.

The High-speed I/O: On the road to disintegration? article tells that increases in data rates, driven by a need for higher bandwidth (10 Gbps, 40 Gbps, 100 Gbps networking), mean the demands on system-level and chip-to-chip interconnects are increasingly challenging design and manufacturing capabilities. For current and future high-performance designs, high-speed serial interfaces featuring equalization could well become the norm, and high levels of SoC integration may no longer be the best solution.


For a long time, the Consumer Electronics Show, which began in 1967, was the Super Bowl of new technology, but now the consumer electronics show as a concept is changing and maybe fading out in some ways. The social web has replaced the trade show as a platform for showcasing and distributing products, concepts, and ideas.

NFC, or near-field communications, has been around for 10 years, battling its own version of the chicken-and-egg question: Which comes first, the enabled devices or the applications? The Near-field communications to go far in 2013 article expects this to be the year for NFC. NFC is going to go down many different paths, not just the mobile wallet.

3-D printing was hot last year and is still hot. We will be seeing much more on this technology in 2013.

Inexpensive tablets and e-readers will find their users. Sub-$100 tablets and e-readers will offer more alternatives to pricey iPads and Kindles. The sub-$200 higher-performance tablet group is also selling well.

User interfaces will evolve. Capacitive sensing is integrating multiple interfaces, and human-machine interfaces are entering the third dimension. Ubiquitous sensors meet the most natural interface: speech.

The use of electronic systems in the automotive industry is accelerating at a furious pace. The automotive industry in the United States is steadily recovering, and nowadays electronics run pretty much everything in a vehicle. Automotive electronics systems trends also impact test and measurement companies. Of course, with new technologies come new challenges: faster transport buses, more wireless applications, higher switching power, and the sheer amount and density of electronics in modern vehicles.

The Next Round: GaN versus Si article tells that wide-bandgap (WBG) power devices have shown up as gallium nitride (GaN) and silicon carbide (SiC). These devices provide low RDS(on) with higher breakdown voltage.

Energy harvesting was talked about quite a lot in 2012, and I expect that it will find more and more applications this year. Four main ambient energy sources are present in our environment: mechanical energy (vibrations, deformations), thermal energy (temperature gradients or variations), radiant energy (sun, infrared, RF), and chemical energy (chemistry, biochemistry). Peel-and-stick solar cells are coming.

Wireless charging of mobile devices is gaining some popularity. The Qi wireless charging technology is becoming the industry standard, as Nokia, HTC, and some other companies use it. There is a competing A4WP wireless charging standard pushed by Samsung and Qualcomm.


In recent years, ‘Low-carbon Green Growth’ has emerged as a very important issue in selling new products. The LED lighting industry analysis and market forecast article tells that ‘Low-carbon Green Growth’ is a global trend, and LED lighting is becoming the most important axis of the ‘Low-carbon Green Growth’ industry. The expectations for industry productivity and job creation are very high.

A record number of dangerous electrical devices have been pulled from the market under the supervision of the Finnish Safety and Chemicals Agency. Poor equipment design has been found in many products, especially in LED light bulbs. Almost 260 items were taken off the market, and very many of them were LED lights. Manufacturers rushed into the new technology with high enthusiasm and forgot basic electrical engineering. The CE marking is not in itself a guarantee that a product is safe.

The “higher density,” “higher dynamics” trend is also challenging traditional power distribution technologies within systems, and some new concepts are being explored today. The AC vs. DC power in the data center discussion is going strong. Redundant power supplies are called for in many demanding applications.

According to IHS, global advanced meter shipments are expected to remain stable from 2012 through 2014. Smart electricity meters are seen doubling by 2016 (to about 35 percent penetration). In the long term, IHS said it anticipates that the global smart meter market will depend on developing economies such as China, Brazil, and India. What’s next after the smart power meter? How about some power backup for the home?

Energy is going digital article claims that graphical system design changes how we manipulate, move, and store energy. What defines the transition from analog to digital and how can we tell when energy has made the jump? First, the digital control of energy, in the form of electricity, requires smart sensors. Second, digital energy systems must be networked and field reconfigurable to send data that makes continuous improvements and bug fixes possible. Third, the system must be modeled and simulated with high accuracy and speed. When an analog technology goes digital, it becomes an information technology — a software problem. The digital energy revolution is enabled by powerful software tools.

The cloud is talked about a lot, both as a design tool and as a service that connected devices connect to. The cloud means many things to many people, but irrespective of how you define it, there are opportunities for engineers to innovate. EDA companies put their hopes on accelerating embedded design with cloud-enabled development platforms. They say that The Future of Design is Cloudy. M2M companies are competing to develop solutions for easily connecting embedded devices to the cloud.

Trend articles worth checking out:
13 Things That Went Obsolete In 2012
Five Technologies to Watch in 2013
Hot technologies: Looking ahead to 2013
Technology predictions for 2013
Prediction for 2013 – Technology
Slideshow: Top Technologies of 2013
10 hot consumer trends for 2013

Popular designer articles from last year that could give hints about what to expect:
Top 10 Communications Design Articles of 2012
Top 10 smart energy articles of 2012
Slideshow: The Top 10 Industrial Control Articles of 2012
Looking at Developer’s Activities – a 2012 Retrospective

626 Comments

  1. Tomi Engdahl says:

    Low-Power Numbers Can Be Deceiving
    http://www.designnews.com/document.asp?doc_id=267375&cid=nl.dn14&dfpPParams=ind_184,aid_267375&dfpLayout=article

    Microcontroller vendors seem to announce a new low-power product on a regular basis. In fact, almost all of the MCUs launched these days are dubbed low power in some way. This may not always be true, since there’s some window dressing and specmanship to make products appear better than reality.

    The devil is in the details. More often than not engineers will discover what the product can really do while testing it in their application, after spinning the board and writing the code. But all the information is usually available. By looking carefully at the electrical parameters in the datasheet the clever designer can analyze beforehand the device’s real behavior and anticipate if a particular microcontroller is truly low power and fits his or her application.

    The first number that clever marketing people will massage is active power, which answers the question, “How low am I when the CPU is running?” and is expressed in mA/MHz or µA/MHz. The lower the number the better. It should give an accurate view of how much power the device will sink while in operation. As a designer, however, you shouldn’t follow the advertised number blindly, but rather dive into the datasheet and check the note that gives the test conditions.

    Most vendors will provide the information about the power with peripherals on, with various degrees of operation.

    Some people will calculate the active power in µA/MHz only for the highest frequency, the most favorable case. You should look at and calculate the active power per MHz at each frequency, especially the one you target.
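    The datasheet comparison described above can be sketched in a few lines of Python. The frequency/current points below are hypothetical, made up purely for illustration; real figures come from the vendor’s electrical-parameters tables and their test-condition notes.

```python
# Hypothetical datasheet points (frequency in MHz, active current in mA)
# for two made-up MCUs -- real values come from vendor datasheets.
datasheet = {
    "mcu_a": [(1, 0.30), (8, 1.60), (32, 5.10)],
    "mcu_b": [(1, 0.20), (8, 1.90), (32, 6.00)],
}

def ua_per_mhz(points):
    """Active power in uA/MHz at each datasheet frequency point."""
    return {f_mhz: (i_ma * 1000.0) / f_mhz for f_mhz, i_ma in points}

for mcu, points in datasheet.items():
    # Advertised figures are often quoted only at the most favorable
    # (usually highest) frequency; check the figure at *your* frequency.
    print(mcu, ua_per_mhz(points))
```

    With these made-up numbers, mcu_b looks better at 1 MHz (200 µA/MHz vs. 300 µA/MHz) while mcu_a looks better at 32 MHz (about 159 µA/MHz vs. 188 µA/MHz), which is exactly why the figure should be computed at the frequency you actually target.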

    While each vendor has his own nicknames for the low power modes, there are few variants functionally. Dubbed halt, stop, standby, low power, sleep, etc., the modes typically fall into four categories, each with a power associated with it, and also a wake-up time and a wake-up mode. Table 1 summarizes these typical modes.

    Parameter variations
    I learned an important lesson a long time ago: Never design based on the typical. When you’re designing boards for the medical and transportation sectors, it’s critical to design in a robust way. This is a basic rule for all hardware design if you want to create things that don’t break, and is very commonly applied for interface timings and I/Os. However, I’m always surprised that so many engineers seem to accept typical data from the vendors. This is a dangerous practice, and the quality-minded engineer knows better.

    There are several ways that parameters can vary from the typical value. In the case of current draw, the main influencers are temperature and manufacturing variation. External voltage has a limited influence, due to the fact that most MCUs embed a voltage regulator that maintains a constant internal voltage. If you don’t take these variations into account, a significant portion of your devices could run out of power prematurely, leading to battery replacement or failure in the field.

    Leakage is the main contributor to current in low-power modes. Because the leakage characteristic is exponential with temperature, if the MCU isn’t well designed, power consumption can skyrocket with temperature. This is a key parameter to check.
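    As a rough illustration of that exponential behavior, a common rule of thumb (my assumption here, not a figure from the article or from any vendor) is that CMOS leakage roughly doubles for every 10 °C of temperature rise. A quick sketch:

```python
def leakage_ua(i_25c_ua, temp_c, doubling_deg=10.0):
    """Estimate leakage at temp_c from a 25 C datasheet value, assuming
    leakage doubles every `doubling_deg` degrees (rule of thumb only;
    never a substitute for the vendor's MAX numbers over temperature)."""
    return i_25c_ua * 2.0 ** ((temp_c - 25.0) / doubling_deg)

# A 0.5 uA stop-mode current at 25 C grows quickly with temperature:
for t in (25, 55, 85):
    print(t, "C:", leakage_ua(0.5, t), "uA")   # 0.5, 4.0, 32.0 uA
```

    Under this assumption, a part that looks comfortably “low power” at room temperature draws 64 times as much at 85 °C, which is why the MAX figure at the top of your temperature range is the one to design against.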

    The second variation that shouldn’t be overlooked is the manufacturing variation. Devices and production lots can vary, which is why manufacturers have MIN and MAX numbers.

  2. Tomi Engdahl says:

    Altera and Micron help propel HMC, but will 3D packaging limit interest?
    http://www.edn.com/electronics-blogs/fpga-gurus/4420847/Altera-and-Micron-help-propel-HMC–but-will-3D-packaging-limit-interest-

    Altera Corp and Micron Technology Inc demonstrated a big step forward for Hybrid Memory Cube, by showing a Stratix V interface to the promising 3D technology. How might packaging limit HMC’s popularity compared to special-purpose nonvolatiles like ferroelectric or magnetoresistive RAM?

    In early August, we talked about the growing importance of memory and system-bus interfaces on next-generation FPGAs. Altera and Micron, both members of the HMC Consortium, are hoping to accelerate the viability and hence the utility of hybrid cubes by making them a mainstream alternative to DRAM and SRAM. Altera not only showed a Stratix V prototype, but indicated it would work on HMC for its upcoming Gen 10 family of FPGAs.

  3. Tomi Engdahl says:

    IBM’s SyNAPSE carves out neural network simulation the company insists FPGAs can’t follow
    http://www.edn.com/electronics-blogs/fpga-gurus/4420462/IBM-s-SyNAPSE-carves-out-neural-network-simulation-the-company-insists-FPGAs-can-t-follow

    Neural network processing is a specialized, massively parallel VLSI world that only occasionally involves FPGA architectures. But when Dharmendra Modha, principal SyNAPSE investigator at IBM Research, began dismissing the potential of FPGAs in August interviews on synaptic chips, some clearing of the neural-network air seemed necessary.

    IBM has turned up the heat on the DARPA-funded SyNAPSE project because a “corelet” programming model was developed and announced at a neural-network conference in Dallas in August. The SyNAPSE hardware appears to be optimal for the parallel tasks and sparse distributed memory seen in neural networks, but it is useful to remember that the renaissance in neural-network studies has been going on for nearly 25 years. Most work involves software simulation, but processors designed as nodes have been promoted every few years or so, since the early 1990s.

  4. Tomi Engdahl says:

    8-bit MCUs halve current consumption for battery operation
    http://www.edn.com/electronics-products/other/4420866/8-bit-MCUs-halve-current-consumption-for-battery-operation

    Lapis Semiconductor reports that microcontrollers in the ML610Q474 family consume just 0.25 µA in halt mode—50% lower than conventional products—making them suitable for devices that require long-term operation using just coin or dry-cell batteries, such as watches, clocks, and security tokens. The microcontrollers, which run on a single 1.5-V battery, also ensure high noise immunity, clearing ±30 kV during IEC61000-4-2 noise-immunity testing.

  5. Tomi Engdahl says:

    According to Forward Concepts, about fifteen companies currently produce 4G (LTE) modems. The growing 4G phone market is of interest to many, but Qualcomm’s position in that market is strikingly strong.

    For example, consider last year’s sales of the frequency-division (FDD) variant of LTE modems: in all, 47 million modems were sold, and Qualcomm’s share of these was 86 percent.

    Only Samsung has made significant inroads in this area, and that only thanks to its own Galaxy smartphones. Samsung’s share of the market last year was 9 percent. Renesas Mobile, whose ownership passed to Broadcom last week, is listed at about a one percent slice of the market.

    Broadcom has said that it would launch the LTE modem acquired from Renesas as a commercial product early next year.

    Ericsson still wants into 4G mobile phones, and expects that its ST-Ericsson modem could take third place in the market, behind Qualcomm and Samsung (but that could be unrealistic).

    Source: http://www.etn.fi/index.php?option=com_content&view=article&id=350:modeemikisa-yha-qualcommin-hallussa&catid=13&Itemid=101

  6. Tomi Engdahl says:

    MEMS gyroscope ICs enable shake-free images
    http://www.edn.com/electronics-products/other/4420925/MEMS-gryroscope-ICs-enable-shake-free-images

    Specifically optimized for OIS (optical image stabilization) in smart phones and digital still cameras, the two-axis L2G3IS and three-axis L3G3IS gyroscopes from STMicroelectronics measure angular rate with a full-scale range of ±65 dps/±130 dps and deliver the rate to the external world through an SPI digital interface.

    By moving the lens in real time to compensate for physical movement of the camera, OIS can significantly improve the sharpness of an image, especially in low light when hand jitter during the long exposure time can blur images.

    The MEMS gyroscopes operate with a resonant frequency of the sensing mass at around 20 kHz.

    The devices are immune to the damage that could be caused by the ultrasonic cleaning equipment (typically operating at about 30 kHz) that many customers use to clean devices before equipment assembly.

  7. Tomi Engdahl says:

    Intel Quark Runs on Roof, Raises Questions
    http://www.eetimes.com/document.asp?doc_id=1319447&itc=eetimes_node_199&cid=NL_EDN_DesignIdeas_20130912&elq=f86be4f0931649b68bba1bd3cdc2abe5&elqCampaignId=1128

    An HVAC system on a rooftop in Minneapolis is running Quark, Intel’s newest and smallest SoC. If all goes well, Daikin McQuay might someday buy millions of the chips.

    Intel announced Quark at its annual developer conference here as its bid to get a jump on the emerging Internet of Things. However, it provided no details on its technical specs or when it will be released, suggesting it is more of a rushed trial balloon than a nailed-down product and strategy.

    In a brief encounter after his first IDF keynote, Intel’s new chief executive, Brian Krzanich, said Quark is x86 compatible. The chip he showed was made in a 32nm process, he added.

    In his keynote, Krzanich described Quark as a fifth the size and a tenth the power consumption of Atom. It’s a synthesizable core Intel will let others use along with third-party silicon blocks in SoCs Intel will make.

    Designers will not be allowed to customize the Quark core. They can only connect third-party blocks to its fabric. Intel will allow some process tweaks for some customers, he added.

    Last week, HVAC giant Daikin got one industrial reference board using a Quark chip and including WiFi and 3G support. Kevin Facinelli, executive vice president for operations at the company, dialed into the board from the IDF event here to show it is working.

    “We looked at Freescale and ARM too but decided on using Quark,” Facinelli said.

    The mechanical engineering company was not concerned about relative silicon performance. It just wanted to offer a remote maintenance capability with high security.

    Security software gave Intel the edge over ARM. The Quark reference board runs a stack of white-listed Wind River embedded operating system supplemented with McAfee security software, the kind of embedded system stack Intel has been touting for embedded systems for more than a year.

    Peter Glaskowsky, a veteran processor analyst, said Quark could be a 386-vintage subset of the x86 for which patents are now expired. “They could be making a virtue of necessity.”

    Alternatively, it could be the world’s smallest 64-bit x86.

    One wrinkle for Intel in this scenario is whether AMD has any outstanding patents on a 64-bit x86. AMD pioneered the architecture with its Opteron, which Intel’s processors later emulated.

  8. Tomi Engdahl says:

    Stop blaming the supply for your dissipation woes
    http://www.edn.com/electronics-blogs/power-points/4420891/Stop-blaming-the-supply-for-your-dissipation-woes

    It’s not news that everyone is interested in efficiency these days. Whether it is because of “green” incentives, run time, or thermal limitations, using less power is a good thing and often a priority. As a consequence, engineers are looking at sources of inefficiency and trying to minimize them, of course.

    Go through some simple numbers and you’ll see why, using a nominal 100 W supply. A modern AC/DC or DC/DC switching supply has efficiency of between 80% and 85% when operating at half load and above. (These are “typical” numbers, and using them can be a little dangerous – but this is an illustrative example to make a point, not a formal design review.) That means the supply itself is dissipating between 15 and 20 W.

    So management says “go get a more efficient supply.” You search and find a supply that is a little more expensive, but has efficiency of 90% at the same load rating. Great: you have cut your supply dissipation down to 10 W. Or looking at it the other way, you have decreased the heat from the supply from between 15 to 20 W down to 10 W – thereby cutting it as much as 50%.

    But wait: even that 20% loss represents only a small fraction of the total heat dissipation of your system’s electronics, which is 80 W. Getting a better supply hasn’t changed that; it has decreased the supply + circuitry dissipation by only a few percent.
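    The arithmetic above can be reproduced in a short sketch. It follows the article’s simplified accounting, where the supply’s loss is taken as total power times (1 − efficiency) on a nominal 100 W budget; a rigorous treatment would distinguish input from output power, but the conclusion is the same.

```python
def supply_loss_w(p_total_w, efficiency):
    """Supply dissipation under the article's simplified accounting:
    loss = total power * (1 - efficiency)."""
    return p_total_w * (1.0 - efficiency)

P_TOTAL_W = 100.0        # nominal 100 W, as in the article
ELECTRONICS_W = 80.0     # heat dissipated by the system's own electronics

for eff in (0.80, 0.85, 0.90):
    loss = supply_loss_w(P_TOTAL_W, eff)
    print(f"eff={eff:.0%}: supply loss {loss:.0f} W, "
          f"supply + circuitry {loss + ELECTRONICS_W:.0f} W")
```

    Going from 80% to 90% efficiency halves the supply’s own loss (20 W down to 10 W), yet the combined dissipation only falls from about 100 W to 90 W, which is the point being made.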

    Maybe you need to work on decreasing the dissipation of the electronics as much as that of the supply.

    The problem is that in most designs that’s a lot of work to achieve. Substituting lower-power components, lower-power circuit topologies, and lower-power I/O is hard work, may require new and untried components or designs, and is risky. It’s easier to try to get a big chunk of savings in one shot, rather than work watt by watt (or mW by mW) through the circuit.

    Don’t be defensive when people start pointing at the supply’s inefficiency as the source of your consumption and dissipation problems. It’s OK to point out that such conventional, simplistic thinking may have you looking for trouble in the wrong places.

  9. Tomi Engdahl says:

    Russia’s electronics market did not grow in the first half of the year

    Many Finnish manufacturers and component distribution companies rely on Russia as an engine of growth while production falls in Finland. At least early in the year, the Russian market was no gold mine, as the component market remained virtually unchanged.

    Source: http://www.etn.fi/index.php?option=com_content&view=article&id=359:venajan-elektroniikka-ei-kasvanut-alkuvuonna&catid=13&Itemid=101

  10. Tomi Engdahl says:

    Researchers Develop Brain-to-Brain Interface for Human Motion Control
    http://www.designnews.com/author.asp?section_id=1386&doc_id=267562&cid=nl.dn14

    Researchers at the University of Washington (UW) have created what they’re calling the first human brain-to-brain interface that allowed one researcher to control another’s hand with his mind.

    Though it’s not exactly the “Jedi” mind trick, at first glance the technology may seem a bit too mind-blowing to comprehend.

    “For brain-to-brain technology to become a viable form of ‘mind control,’ the advances to be made in science and technology need to be so great that we cannot even imagine them at the present moment — and we are not sure it would be possible at all,” he told us. “So, we are not concerned about this application, although we feel that it is certainly positive to begin a discussion on this issue.”

  11. Tomi Engdahl says:

    Medical Device Manufacturers, Be Prepared to Innovate
    http://www.designnews.com/document.asp?doc_id=267640&cid=nl.dn14

    The trend toward “bio-connectivity” is gaining momentum, and medical device manufacturers need to be ready to bring that connectivity to next-generation products, a futurist at the Medical Design & Manufacturing Show said this week.

    “The number of in-person visits to hospitals is decreasing and the number of bio-connective, virtual visits is increasing,” Jim Carroll, futurist and author, told a gathering of engineers at the show.

    Today’s doctors are more likely to do patient consultations over Skype, Carroll said, adding that 40 percent of physicians are now willing to track patients via text messaging, email, and Facebook. He cited examples of such companies as Withings Inc., which makes a blood pressure monitor for use with iPhones and iPads, and MedCottage, which sells one bedroom “granny pods” that can be placed in the backyards of families caring for elderly patients. The cottage incorporates cameras and sensors, enabling patients to be monitored and managed from afar.

    Carroll also pointed to a growing number of diabetes management technologies that enable patients to monitor themselves at home and share their data with physicians.

    Some high-level healthcare executives have gone as far as to say that the need for dedicated central facilities is changing, Carroll said. “One CEO said that the concept of a hospital as a physical place is disappearing,” he told the audience of engineers. “Eventually, it’s going to go virtual.”

  12. Tomi Engdahl says:

    Video: CSR Releases World’s Thinnest Touchscreen
    http://www.designnews.com/author.asp?section_id=1386&doc_id=267660&cid=nl.dn14

    CSR is known for its innovative Bluetooth technologies. Just recently, the company released the newest member of the family, the CSR1010 chip. It is part of the CSR μEnergy range, which has been optimized to use much less power than other current leading Bluetooth chips.

    Partnering with Atmel and Conductive Inkjet Technology (CIT) allowed the engineers to create what is the world’s thinnest touchscreen interface. Atmel contributed its touch silicon, which detects the key input, while CIT provided the printing technology, which allows conductors to be printed on thin, flexible materials. Integrating these technologies with the Bluetooth chip, the touch interface will use only a fraction of the mobile device’s energy while providing a response time of less than 12 milliseconds.

    Overall, the screen measures in at just slightly less than 0.5 mm thick.

  13. Tomi Engdahl says:

    A touch screen Geiger counter without a Geiger tube
    http://hackaday.com/2013/09/14/a-touch-screen-geiger-counter-without-a-geiger-tube/

    We’re assuming [Toumal] was desperately bored one day, because in the depths of the Internet he found some really cool components to build a solid state Geiger counter.

    This device uses a specially-made photodiode made by First Sensor to detect gamma emissions from 5 to 1000 keV.

  14. Tomi Engdahl says:

    DIY manufacturing, open source hardware, and the New Long Tail
    http://www.edn.com/electronics-blogs/powersource/4307824/DIY-manufacturing-open-source-hardware-and-the-New-Long-Tail

    Remember The Long Tail? The Long Tail is the theory Chris Anderson proposed in a 2004 Wired magazine article: in the Internet world, where the cost of information storage has become vanishingly small, it is now profitable to sell (mostly digital) products to much, much smaller markets.

    Anderson has a new article in Wired magazine, Atoms are the New Bits, where he proposes that a similar change is taking place in hardware design and manufacturing: One of the biggest obstacles for start-ups that sell new hardware products (as opposed to software) is the capital required for tooling and parts/labor. Anderson writes, “As ideas go straight into production, no financing or tooling is required.” Hoo-boy, that’s a gross simplification, but his point is that hardware startup businesses are moving into the realm of the garage tinkerer who can make use of low-cost 3D design tools and fast-turn-around prototyping, and China-based manufacturing.

    Unlike most journalists, he has some real-world experience to base his premise upon

    Comment: Old news… Yes, most are only now becoming aware of the possibilities. If you are doing something simple, any good tech can design the product, soup to nuts. If you are doing something complex, you need the skills, education, and experience covering most or all of the issues related to designing, manufacturing, and selling the product in question. The key to being effective as a single person providing your own product is knowing all the tools involved. Most electronics engineers only know the simulation and schematic capture tools. Most electronic PCB designers only know the PCB layout and CAM tools. Most mechanical designers only know their 2D or 3D CAD system. Production people need tools to control BOMs, assembly documents, pick-and-place data, inventory control, etc. If you are doing something complex, you will need to know all of these tools, and more.

  15. Tomi Engdahl says:

    Slideshow: Intel’s CEO Fires Up IDF
    http://www.eetimes.com/document.asp?doc_id=1319489&

    Intel’s new chief executive made his public debut at the company’s annual developer forum here, carving out an image as a straight shooter loaded for bear. “Our plan is to lead in every segment of computing,” he said in his first keynote as the CEO.

  16. Tomi Engdahl says:

    Silicon Sees Schizophrenic Forecast
    http://www.eetimes.com/document.asp?doc_id=1319479&

    Economically, the semiconductor industry is headed for good times as the global economy pulls out of prolonged recession led by slow growth in Europe. Technologically, chip vendors are facing challenges that ultimately could undermine their business model.

    That was the appropriately schizophrenic forecast offered by Bill McClean, president of IC Insights, in his annual fall forecast here. “In general the trend for growth in semiconductors will improve in the next 10 years,” McClean said.

    Overall, he expects that over the current 10-year period a consolidating chip sector could see 8 percent growth and flat to slightly positive average selling prices. That’s significantly better than the 4.7 percent growth and 3 percent ASP declines of the past decade.

    Specifically, McClean forecasts market growth will nudge up to 11 percent and 13 percent in 2015 and 2016, largely on rising worldwide GDP growth. He also pegged 2017 as the start of the next down cycle of the semiconductor industry.

    McClean said he believes the industry as a whole will continue to lower costs per transistor over the next five years despite steep technical challenges. However, he suggested all bets are off at the 10nm level when the industry adopts extreme ultraviolet (EUV) lithography and 450mm wafers.

    Rising transistor costs would undermine systems sales based on traditional replacement cycles that anticipate significantly cheaper, faster systems every two years.

    “The whole infrastructure behind electronics systems sales could fall away — that’s definitely a possibility,” McClean said. “I don’t know exactly what’s going to happen when, but it’s a negative, and the only question is how negative it is,” he said.

    Members of an elite audience of industry forecasters at his presentation here disagreed.

    Reply
  17. Tomi Engdahl says:

    Collaborative Advantage: Blocked From Designing 3D Circuits?
    http://www.eetimes.com/author.asp?section_id=36&doc_id=1319482&

    Although the semiconductor industry has benefited tremendously from 50 years of Moore’s Law, there is also broad recognition that the era of regular process shrinks may be, for many, approaching an end.

    A complex mix of technologies and design methods collectively referred to as “More Than Moore” is replacing that regular miniaturization. One of the most promising technologies is the vertical stacking of dies using through-silicon-vias (TSVs) for die interconnect to create a 3D circuit.

    The promise of 3D includes greater functional density, reduced footprints, and lower power consumption, because the distance and capacitance of interconnect can be reduced. However, as the technology matures, as business models evolve, and as 3D design standards emerge, another issue hangs like a cloud over the landscape. That cloud, which could prevent, or at least delay, the broad market adoption of 3D circuitry is patents.

    A casual Internet search for “design of 3D circuits patents” quickly reveals at least five patents describing methods and systems for the design of 3D ICs, focused on design software tools. Finding further details is left as an exercise for the reader.

    These sets of patent claims address data structures for 3D design software. The claims appear to be fundamental enough that it could be impossible to work around them in any design tool implementation, assuming the claims stand. Furthermore, licensing is exclusively controlled by a non-practicing legal entity that is currently suing both EDA and end-user companies, with no indications of any quick settlement on the horizon. This legal climate, now preventing or delaying 3D-capable EDA tools, will necessarily have a similar impact on a pipeline of 3D designs, constraining the market for 3D manufacturing and commercialization.

    Reply
  18. Tomi Engdahl says:

    Broadband signal capture is “big data”
    http://www.edn.com/electronics-blogs/test-voices/4421043/Broadband-signal-capture-is-big-data

    More than 15 years ago, Dr. Douglas Engelbart, inventor of the mouse, windows, videoconferencing, and hypertext, said that “the digital revolution is far more significant than the invention of writing or even printing.” You could argue this point, but who today could disagree that every function that can be performed digitally either already is or will be? The opinions of hard-core audiophiles notwithstanding, converting analog signals to digital form is “better.” It creates enormous possibilities for storing, manipulating, testing, and many other tasks that would be either impractical or impossible in the analog domain.

    On the other hand, “sometimes enough is too much,” a quote variously attributed in different contexts to Mark Twain and Groucho Marx, is equally fitting today, especially in some of the more obscure areas of test and measurement, and especially as it applies to bandwidth.

    The ability to “ingest” huge swaths of spectrum directly from the antenna without the need for all that nasty intervening microwave hardware makes a lot of people giddy these days, and for good reason. If a digital receiver sporting a high-performance ADC can capture all frequencies from DC to 6 GHz (for example) over long periods with at least reasonable resolution, very interesting opportunities arise. That 6 GHz contains most of the world’s signal activity, from maritime and airborne navigation at the low end to wireless communications upward from VHF.

    Once recorded and stored, a single five-minute signal capture at an urban location over this frequency range will reveal hundreds of thousands of signals.

    Taking this one step further, as the signals are now digital, they can be analyzed in various degrees of detail using specialized analysis and measurement software, rearranged in the signal file, and used to create new signal files containing only the spectrum with signals of interest. Waveforms can be analyzed in agonizing detail to determine their modulation and other characteristics. It’s all possible because of the immense flexibility afforded by the “digital domain”.

    There is, however, one not-so-tiny problem: signal captures, even short ones, produce lots of data, and long ones produce many terabytes, which must be stored somewhere and, worse, analyzed (the latter being both difficult and time-consuming). The greater the bandwidth, sampling rate, duration, and resolution of the signal capture, the more data you wind up with. Thus the conundrum: digital representations of analog signals are essential, but analyzing them is, or can be, a bear. So what to do?

    The answer is to reduce the amount of data you have to analyze. To do that, you need to search for and identify only the spectrum containing signals of interest, as early as possible in the receive path: pre-processing in an FPGA, post-processing in one or more general-purpose processors, and passing the much less data-intensive result back through the processing and analysis chain. This and a lot more is being done today in defense systems and, thanks to huge processing and storage capability, no doubt in something approaching real time by the NSA and other intelligence agencies.
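    The data-volume problem is easy to put numbers on. A rough sketch, assuming Nyquist-rate sampling of the DC-to-6 GHz band; the bandwidth and five-minute duration come from the article, while the 12-bit sample width is an assumed illustrative value:

```python
# Rough data-volume estimate for the DC-to-6 GHz capture described above.
# The 6 GHz bandwidth and 5-minute duration come from the article; the
# 12-bit sample width is an assumed illustrative value.

def capture_bytes(bandwidth_hz, sample_bits, seconds):
    """Raw storage for a real signal sampled at the Nyquist rate."""
    sample_rate = 2 * bandwidth_hz          # Nyquist: at least 2x bandwidth
    return sample_rate * sample_bits / 8 * seconds

size = capture_bytes(6e9, 12, 5 * 60)
print(f"5-minute capture: {size / 1e12:.1f} TB")   # ~5.4 TB
```

    Even a short capture lands in terabyte territory, which is why early spectrum triage in the FPGA matters so much.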

    Reply
  19. Tomi Engdahl says:

    Lipo batteries are extremely dangerous… beware
    http://www.martinmelchior.be/2013/04/lipo-batteries-are-extremely-dangerous.html

    RC helicopters and planes (almost) all use Lipo batteries, and those are DANGEROUS! I nearly got my home destroyed by a Lipo fire. Look at the video below to see what would have happened if I had not gotten there in time.

    Reply
  20. Tomi Engdahl says:

    Chinese DRAM Plant Fire Continues To Drive Up Memory Prices
    http://hardware.slashdot.org/story/13/09/17/0330207/chinese-dram-plant-fire-continues-to-drive-up-memory-prices

    “Damage from an explosion and fire in SK Hynix’s Wuxi, China DRAM fabrication plant will drive up global memory prices for PCs, servers, and other devices, according to new reports. Most of the damage from the Sept. 4 fire was to the air-purification systems and roof of the plant,”

    “The Wuxi plant makes approximately 10 percent of the world’s supply of DRAM chips; its primary customers include Apple, Samsung, Lenovo, Dell and Sony.”

    Reply
  21. Tomi Engdahl says:

    Texas Instruments inductance to digital converter (LDC): Necessity breeds invention
    http://www.edn.com/electronics-products/electronic-product-reviews/other/4421072/Texas-Instruments-inductive-to-digital-converter–LDC–Necessity-breeds-invention

    Texas Instruments designers have developed an entirely new data converter with the LDC1000 inductance-to-digital converter (LDC) designed specifically for inductive sensing applications.

    The LDC uniquely combines on chip all of the external circuitry that is normally required to simultaneously measure the impedance and resonant frequency of an LC resonator. It then regulates the oscillation amplitude in a closed loop to a constant level while it monitors the dissipated energy of the resonator. This leads to an accurate measurement of the inductance of the front-end LC circuit, which enables precise measurement of linear/angular position, displacement, motion, compression, vibration, and metal composition, as well as new applications that designers will conceive. Unlike most other solutions, all of this can be done in the presence of oil, dust, dirt, and moisture.

    Yes, necessity breeds invention, so new design needs will find that this unique device can be used in ways no other device is capable of. The shortfalls of existing sensor technologies can be eliminated with the front-end capabilities, compactness, low cost, low power, and sheer flexibility of this LDC. It may well have you rethinking your use of the currently popular Hall-effect, pressure, ohmic, capacitive, optical, and ultrasonic sensors.

    Inductive sensors can detect metal objects without touching them. They are sometimes used as a proximity detector or position sensor in factory automation and other areas. The operating principle is the use of a coil and an oscillator to create a magnetic field surrounding the sensing surface. The metallic object or “actuator” causes a dampening of the amplitude of the oscillation which can be used and detected in various ways to manage, position and control a process.
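    The coil-and-oscillator principle above can be sketched numerically: the sensor coil and a capacitor form an LC tank, and a metal target near the coil reduces its effective inductance, shifting the resonant frequency the converter measures. The component values and the 10% inductance shift below are illustrative assumptions, not LDC1000 datasheet figures:

```python
import math

# Resonant frequency of the sensor's LC tank. A metal target near the coil
# reduces its effective inductance (eddy currents oppose the field), which
# shifts the resonance. Values are illustrative, not LDC1000 figures.

def resonant_freq(l_henry, c_farad):
    return 1.0 / (2 * math.pi * math.sqrt(l_henry * c_farad))

L, C = 10e-6, 100e-12                  # 10 uH coil, 100 pF capacitor
f_free = resonant_freq(L, C)
f_target = resonant_freq(0.9 * L, C)   # assume target drops inductance 10%

print(f"no target:   {f_free / 1e6:.3f} MHz")
print(f"with target: {f_target / 1e6:.3f} MHz")
```

    Tracking that frequency shift (together with the tank's energy losses) is what lets the converter report target position without contact.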

    There is a great need in the market today for sensing devices that operate in harsh environments and are immune to contaminants. Green solutions are expected, indeed demanded, by the market. Sensors are expected to reduce costs and R&D effort, and new sensing capabilities are needed.

    Existing sensing technologies

    Reply
  22. Tomi Engdahl says:

    ESD Affects People, Too
    http://www.designnews.com/author.asp?section_id=1365&doc_id=267714&cid=nl.dn14&dfpPParams=ind_184,industry_consumer,aid_267714&dfpLayout=blog

    ESD, while not necessarily permanently harmful to people, is still a pain (no pun intended). When possible (and most times, it is), a machine design should handle ESD to prevent discharge through people or other equipment.

    This problem was raised to HR/safety status rather quickly; their solution was to give the operators static discharge wristbands.

    What’s wrong with that solution? Static discharge wristbands prevent a charge from traveling from the human to the sensitive device that needs ESD protection. They do nothing to help the human when the charge moves in the other direction; the human is still part of the electrical circuit.

    They commented that the static discharge problem, while still there, was greatly reduced. The pipe and wire had lower resistance than the human operators; the charge flowed through the copper instead of the humans. It was a somewhat awkward solution, having to wave the pipe along the panel as they peeled, but they weren’t being painfully shocked anymore.

    I recommended to the operations manager that the pipe be replaced with a proper active static eliminator, one that emits positive ions at a negatively charged surface. There are a number of manufacturers that make such systems; any of them would have completely neutralized the static charge, which built up as the paper was peeled.

    Reply
  23. Tomi Engdahl says:

    Implantable medical miracles
    http://www.edn.com/electronics-blogs/anablog/4421014/Implantable-medical-miracles-a-chat-session-on-September-19th

    In the present world of electronic medical implants, human lifespan has been increased, we are living stronger and healthier, and we generally feel better. Think of pacemakers, pacemakers with defibrillators, implanted insulin pumps, spinal stimulators, and brain-machine-interface (BMI) devices, to name just a few of these marvels of electronic innovation that have helped correct ailments in our relatively fragile bodies.

    There are challenges to implanting electronic devices in the body. The devices must use little power — whether they scavenge it from the body, include an internal battery, or use an inductive power transfer technique. The devices must be packaged in a way that the body won’t reject the foreign matter, as it is wont to do. There must be a bidirectional way to pass information. The devices must be extremely reliable, and they usually require very high amounts of integrated functionality.

    Reply
  24. Tomi Engdahl says:

    TSMC Releases 16nm FinFET Design Flows
    http://www.eetimes.com/document.asp?doc_id=1319511&

    Leading pure-play foundry Taiwan Semiconductor Manufacturing Co. Ltd. has announced the existence of three reference design flows for FinFET and 3D-stacked ICs that have been taken to silicon. The silicon validation of these flows signifies the opening up of the manufacturing processes for the design of production volume chips.

    Intel was the pioneer of the FinFET in commercial production and remains the only company with such a manufacturing process. However, TSMC is reported to have signed to supply Apple with processors on a three-year contract that will include some FinFET production (see TSMC signs up Apple for three-year FinFET deal).

    EDA software vendors collaborated with TSMC to develop and validate these design routes using silicon test vehicles, TSMC said in a press release. However, TSMC did not indicate which companies’ tools had been proved effective at which stages of the design process.

    The 16FinFET digital design flow uses the Cortex-A15 multicore processor, licensed from ARM Holdings plc, as its validation vehicle for certification.

    Reply
  25. Tomi Engdahl says:

    Intel also wants in on IoT devices

    With the new Quark processor, Intel is trying to compete against Britain’s ARM in very low power devices.

    New chief executive Brian Krzanich made the Quark circuit family the key point of his keynote; it is intended for the Internet of Things, for example wearable electronics devices.

    Intel promises Quark-based reference boards for device development by the end of the year. Initially the company is aiming at industry, energy, and transport applications, Krzanich said.

    Low power consumption is not the only big new thing about Quark: Intel will also license the processor. The processor giant clearly wants to challenge ARM’s Cortex-M and Cortex-R processor families. Quark uses the Pentium instruction set, so it is x86 architecture. Further details of the architecture have not yet been revealed.

    Source: http://www.etn.fi/index.php?option=com_content&view=article&id=380:intel-haluaa-myos-iot-laitteisiin&catid=13&Itemid=101

    Reply
  26. Tomi Engdahl says:

    Cadstar gets a mobile app

    Zuken is expanding its Cadstar PCB design software to the mobile world. The new smartphone application gives designers access to designs away from their personal workstations.

    The application, released for Apple’s iPhone and iPad and for Android devices, is free to download from the app stores. According to users, the application saves time, as it can be used to quickly take a look at Cadstar files.

    Source: http://www.etn.fi/index.php?option=com_content&view=article&id=377:cadstar-sai-mobiilisovelluksen&catid=13&Itemid=101

    Reply
  27. Tomi Engdahl says:

    Dark, Dirty & Dead-End? Manufacturers Say No
    http://www.designnews.com/author.asp?section_id=1395&doc_id=267776&cid=nl.dn14

    If your son or daughter could go to a community college, receive a two-year degree in manufacturing technology, and earn $58,000/year inside of 18 months, would you encourage it?

    At first glance, the answer seems like a no-brainer, given that many four-year college grads are now making far less than that and paying off huge college loans. But a career in manufacturing apparently doesn’t have the required cachet.

    ”There’s still an issue of image,” noted Maria Coons, vice president of workforce and strategic alliances for Harper College, a two-year institution in Illinois that offers an advanced manufacturing pathway. “We have to repackage and remarket manufacturing, and that’s what we’re attempting to do.”

    ”We have all kinds of manufacturers in our state who can’t find enough skilled workers,” Coons added. “In Illinois alone, there are an estimated 30,000 job openings.”

    Reply
  28. Tomi Engdahl says:

    Smartphones & Tablets as Remote HMIs
    http://www.designnews.com/author.asp?section_id=1386&doc_id=267763&cid=nl.dn14

    Smartphones and tablets as remote HMIs (human-machine interfaces) are becoming more and more of a reality. Real-world usage is mostly limited to remote monitoring, but the new technology still offers an interesting and useful way to track critical plant production information, for example. With new tools that rely on browser technology and limit the amount of development effort required, this is an approach that I think will continue to gain momentum.

    “Small to mid-sized businesses have taken ‘mobile HMI’ and run with it,” David Hill, marketing communications manager at Opto 22, told Design News in an email. “Customers knew exactly what they wanted to do with a mobile HMI, which they usually wanted on a smartphone, but the cost and complexity of getting there had been too great.” For two water industry customers who connected to an HMI using remote desktop software, the existing solution was limited and cumbersome.

    What’s interesting is to see how midsized businesses are using this approach.

    To provide security, the obvious concern with these types of systems, Opto 22 recommends using VPN access and separating an organization’s control and computer networks. And beyond that, the key is careful assignment of user rights. The Groov appliance also implements SSL communications using software developed by the OpenSSL Project for use in the OpenSSL Toolkit.

    While the examples above are obviously not extremely complex manufacturing systems, they do point out the way that connectivity solutions, even with limited capabilities, can offer very significant advances in productivity and flexibility. One thing that amazes me is that controlling the home thermostat, given the availability of wireless technology, has not already gone mainstream with the proliferation of smartphone technology. I know that solutions exist but even the high cost of energy isn’t fueling this trend.

    Reply
  29. Tomi Engdahl says:

    Xilinx joins the OpenCL effort, as part of All Programmable Abstractions initiative
    http://www.edn.com/electronics-blogs/fpga-gurus/4421142/Xilinx-joins-the-OpenCL-effort–as-part-of-All-Programmable-Abstractions-initiative

    OpenCL has found more support among Altera partners, but has needed another major FPGA vendor to support this open language, originally developed by Apple Inc. In mid-September, Xilinx announced it would work with partners MathWorks and National Instruments on OpenCL, as part of a new All Programmable Abstractions initiative.

    Critics might see Xilinx as adding a special superfluous marketing spin to the commitment to OpenCL. For the time being, APA does seem to be an ill-defined paper initiative. But the goal of Xilinx and its two major development partners is admirable.

    The APA initiative wants to place C, C++, System C, and the new OpenCL in a common framework to encourage high-level language programming of FPGAs. This could not happen too soon.

    Current trends in FPGA use suggest that most mid-range and larger FPGAs will rely on at least a single microprocessor core, if not multiple cores – either homogeneous multiprocessing cores, or heterogeneous cores relying on dissimilar threads and programming models.

    Xilinx is not putting a timeline on its work with partners on APA, and frankly, it would be unlikely to see any C++ or System C projects emerge before late 2014. This could push an OpenCL program into 2015 or later.

    Reply
  30. Tomi Engdahl says:

    The smart watch is now the hottest new device category. Development is led by large companies such as Samsung and Sony, but smaller players are active as well. At the same time, the new category provides growth opportunities for semiconductor houses.

    STMicroelectronics says that a microcontroller from its STM32 series works as the brain of the Pebble smart watch. The watch links automatically to iPhone and Android smartphones via Bluetooth. The 32-bit F205 controller suits this application because of its performance and low power consumption, and it measures just 4×4 millimeters.

    Source: http://www.etn.fi/index.php?option=com_content&view=article&id=386:st-n-prosessori-ohjaa-alykelloa&catid=13&Itemid=101

    Reply
  31. Tomi Engdahl says:

    Flexible Curved Displays to Top $27 Billion by 2023
    http://www.eetimes.com/document.asp?doc_id=1319531&

    Flexible and curved displays are a booming market that is just getting off the ground. From flexible displays for mobile devices like e-books to curved wraparound displays that provide an immersive experience for television viewers, the market for flexible and curved displays is growing rapidly, according to a Touch Display Research report released this week.

    “The flexible and curved market will be $388 million this year,” Jennifer Colegrove, president and analyst of the Santa Clara, Calif., research firm, told us. “And we forecast it will grow to $27 billion by 2023 at a compound annual growth rate of 53 percent.”

    Reply
  32. Tomi Engdahl says:

    IMEC Process Supports Germanium-Tin Transistors
    http://www.eetimes.com/document.asp?doc_id=1319529&

    A Belgo-Japanese research team has developed an improved process for the integration of germanium-tin films and MOSFETs on silicon substrates. This opens up the possibility of strained GeSn pMOSFETs at sub-10 nm geometries.

    The team, drawn from Katholieke Universiteit Leuven, the IMEC research institute at Leuven, and Japan’s National Institute of Advanced Industrial Science and Technology (AIST), had developed a solid-phase epitaxy process that overcomes previous limitations in laying down germanium-tin films.

    The interest in germanium-tin is related to the ability to use materials with high electron mobility, such as germanium, and being able to improve that mobility through the use of engineered lattice strain.

    The IMEC team claims to have achieved, for the first time, the operation of a depletion-mode junctionless GeSn pMOSFET on silicon. The use of a silicon substrate is likely to have a significant effect on reducing the cost of production, as well as allowing the integration of conventional CMOS and GeSn transistors.

    Reply
  33. Tomi Engdahl says:

    Sharing the heavy-lifting with supercapacitors
    http://www.edn.com/electronics-blogs/analog-ic-startup/4419595/Sharing-the-heavy-lifting-with-supercapacitors

    Lithium-ion (Li-ion) batteries are common in portable electronics. It is easy to find batteries that can source 1200mAh at a voltage of 3.7V. If for example a cell phone user decides to take a picture with the cell phone camera, the battery will have to source the high-current LED flashes that are common in today’s high megapixel imagers.

    During this time, large pulses of current are required from the battery; these increase the voltage drop across the battery’s ESR and can in turn shut down the complete system.

    One way to address this is to use a supercapacitor.

    A Li-ion battery has an ESR anywhere between 200mΩ and 300mΩ while a supercapacitor can have an ESR as low as 50mΩ. Overall, this sharing of the load reduces the ESR drop and in turn, helps keep the complete system functional.
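    The ESR figures above translate directly into voltage droop during a current pulse. A minimal sketch, treating the supercapacitor as simply paralleling its ESR with the battery’s (a simplification) and assuming a 2 A flash pulse, which is an illustrative value not taken from the article:

```python
# Voltage droop during a flash-LED current pulse. The ESR figures come
# from the article's ranges; the 2 A pulse is an assumed illustrative
# value, and paralleling the two ESRs is a simplification.

def droop(current_a, esr_ohm):
    """IR drop across the source resistance during a current pulse."""
    return current_a * esr_ohm

def parallel(r1, r2):
    return r1 * r2 / (r1 + r2)

I_PULSE = 2.0                     # assumed LED-flash pulse, amps
BATT_ESR, CAP_ESR = 0.25, 0.05    # ohms, per the article's figures

print(f"battery alone: {droop(I_PULSE, BATT_ESR):.2f} V droop")
print(f"with supercap: {droop(I_PULSE, parallel(BATT_ESR, CAP_ESR)):.2f} V droop")
```

    Half a volt of droop on a 3.7 V cell can trip a system's undervoltage lockout; sharing the pulse with the supercapacitor keeps the drop well under a tenth of a volt in this sketch.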

    Reply
  34. Tomi Engdahl says:

    Inductive Sensor/Converter Opens More Doors Than I Could Imagine
    http://www.designnews.com/author.asp?section_id=1386&doc_id=267824&cid=nl.dn14

    In a meeting last week with a vendor releasing an inductive touch sensor (sort of), I got excited by the number of possibilities that were offered by a device like this one. The device I’m referring to is actually an inductance-to-digital converter (LDC), a part that uses coils and springs as inductive sensors to deliver higher resolution, increased reliability, and more flexibility than existing sensing solutions. And this comes at a lower cost, too.

    This contactless sensing technology can be used to measure the position, motion, or composition of a metal or conductive target, as well as detect the compression, extension, or twist of a spring. Developed by Texas Instruments, the first device in the family is dubbed the LDC1000.

    Reply
  35. Tomi Engdahl says:

    Growth in Model-Based Design for Automation Control
    http://www.designnews.com/author.asp?section_id=1386&doc_id=267765&cid=nl.dn14

    If we look back five years at the number of automation system suppliers that could import a control algorithm into their automation software tools from a simulation environment, there were very few companies on the list. Not many automation systems were building targets for importing from a simulation environment into their automation control platform through code generation.

    But now, according to industry experts at The MathWorks, the list has grown much longer. Systems are adding new sophisticated functions and tools.

    “Siemens has a new target for their PC platform, and Omron just created a target for their PLCs that works with Simulink, The MathWorks’ PLC coder product,” Tony Lennon, The MathWorks industry manager for North America, told Design News. “The build-up over time is significant and it is a trend where machine builders know [it] is important to move to system simulations earlier in the design process.”

    Typically, machine builders have been experts at mechanical design and sizing motors for a given static load, for example. But system integration gets more difficult for the OEM when adding software sophistication to machines for controlling more complex systems with multiple axes and for coordinating large numbers of motors.

    “What I can confirm is that in the mechatronic development process, software is growing in importance and is the main part of OEM machines,”

    Lennon said that the aerospace and automotive industries use simulation tools because they realize people can’t always be testing physical prototypes. But now the level of complexity and amount of code going into modern machines is causing machine builders to recognize the value of simulation, as well. Automation vendors are helping to build this bridge between the simulation environments and automation-control products. The code in a system-level simulation can now be easily ported into the actual production software that is used to drive machines.

    “With the challenges going forward, from the CAD point of view, we have a very accurate animation of machines and seeing the 3D motion of the parts, which captures the dynamics of the machine”,

    Lennon said customers are interested and would like to see more of this happen. The MathWorks also has interfaces to SolidWorks, ProEngineer, and Autodesk, and continues to work on marrying together the engineering and simulation environments.

    One point of emphasis has been on developing targets between Simulink and automation software programming environments. The evolution is helping customers who are following this adoption phase gain a new set of skills. The process is similar to their moving from a drafting board to 2D and 3D CAD models. The goal is to see the benefits in testing and software development, where fewer errors occur and the process is streamlined to achieve the targeted machine specifications.

    “From an OEM point of view, these tools would provide an ability to visualize complex motions,”

    “One of the pain points for machinery builders is that they often don’t have a good idea if the software works until they commission their software on a real machine. If it doesn’t work, this can affect delays and impact delivery of machines to the end customer,” added Wallner.

    Reply
  36. Tomi Engdahl says:

    Cell phone flash memory gets faster

    UFS (Universal Flash Storage) is an interface standard from JEDEC. Its first version was introduced in 2011. In summer 2012 the standard advanced to version 1.1, and the interface data rate was increased to 300 megabits per second.

    UFS 2.0 increases the link speed to 600 megabits per second. In addition, more than one link can be used, so data can pass in each direction at up to 1.2 gigabits per second.
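    The aggregate figure follows from the per-lane numbers above:

```python
# Aggregate UFS 2.0 throughput from the per-lane figures above.
LANE_RATE_MBPS = 600          # one UFS 2.0 link, megabits per second
LANES = 2                     # using more than one link
total_gbps = LANE_RATE_MBPS * LANES / 1000
print(f"{total_gbps} Gbit/s per direction")   # 1.2
```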

    Sales of flash circuits for mobile devices are projected to increase considerably this year. Last year the chips sold for 20.2 billion dollars, but this year iSuppli forecasts the market to grow to $23.1 billion, or about 17 billion euros, even though the average chip price keeps falling.

    Source: http://www.etn.fi/index.php?option=com_content&view=article&id=392:kannykan-flashmuisti-nopeutuu&catid=13&Itemid=101

    Reply
  37. Tomi Engdahl says:

    Agilent Splits HP DNA Again
    http://www.eetimes.com/document.asp?doc_id=1319548&

    The old Hewlett-Packard DNA is splitting once again. Agilent Technologies announced a plan to split itself into two companies in a move its chief executive called “the biggest and most profound change in our history.”

    The larger of the two new units — the $3.9 billion group that will retain the Agilent name — will focus on life science instruments and will consider moving into related markets such as gene sequencing.

    The remaining $2.9 billion group — yet to be named — will carry on the traditional test and measurement business started by Dave Packard and Bill Hewlett in 1939.

    The rationale of the move is clear, but the work to make it successful will be hard.

    “The trigger point that we needed to make the separation came in November 2010 when Agilent was reclassified as a health care company in the S&P 500,” explained Bill Sullivan, who will remain as CEO of the new Agilent unit.

    Agilent’s traditional T&M and its growing medical instruments businesses have “two distinct investment and business opportunities,” Sullivan said in a conference call.

    “We have spent months looking at clever ways to separate the companies without adding costs,” Sullivan said. “Obviously you will have costs of two boards and two management teams with two tax filings in every country in the world, but the team has come up with a detailed plan,” he said.

    The company made its largest acquisition — a $2.2 billion deal for Dako in May 2012 — to bolster its life sciences business, a deal on which Agilent has yet to get a full return, Sullivan said. And the company is still behind competitors in academic and research markets, he added.

    Agilent has already identified a path to splitting its IT systems and real estate as part of the plan. “We have a lot of regulatory and approval processes to go through in many countries, and that’s the gating item to the final completion of the deal,” said Sullivan.

    Reply
  38. Tomi Engdahl says:

    Sub-Threshold ARM Chip on Track at Ambiq
    http://www.eetimes.com/document.asp?doc_id=1319543&

    Fabless startup Ambiq Micro Inc. is on track to start sampling an ARM-based microcontroller that operates transistors near and below their threshold voltage in 2014. Volume production of sub-threshold microcontrollers is then expected in 2015, Scott Hanson, chief technology officer of Ambiq, told EE Times in a telephone interview.

    Hanson said Ambiq is designing mixed-signal devices based on the Cortex-M0+ core from ARM, but rather than being general-purpose MCUs, these MCUs are tailored for emerging applications where power consumption is critical. Typical applications would include: wearable devices, smartcards, wireless sensors, and portable medical equipment, he told us.

    On its website, Ambiq describes its sub-threshold power-optimized technology (SPOT) process as turning “microamps into nanoamps.” The threshold at which a transistor turns on and significant current flows is in the region of 0.3 to 0.6V, depending on the manufacturing process, but around the threshold the device characteristics have exponential dependencies on voltage and temperature, making design difficult.
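    The exponential dependence mentioned above is exactly what makes sub-threshold design hard. A minimal sketch of the textbook subthreshold current model; the threshold voltage, leakage scale I0, and ideality factor n are assumed example values, not Ambiq figures:

```python
import math

# Textbook subthreshold MOSFET model: drain current is exponential in gate
# voltage and temperature. Vth, I0, and the ideality factor n below are
# assumed example values, not Ambiq figures.

K_B, Q = 1.380649e-23, 1.602176634e-19  # Boltzmann constant, electron charge

def subthreshold_current(vgs, vth=0.45, i0=1e-7, n=1.5, temp_k=300.0):
    vt = K_B * temp_k / Q               # thermal voltage, ~26 mV at 300 K
    return i0 * math.exp((vgs - vth) / (n * vt))

# Subthreshold slope: the gate-voltage change per decade of current.
slope = 1.5 * (K_B * 300.0 / Q) * math.log(10)   # ~89 mV/decade for n=1.5
print(f"slope: {slope * 1000:.0f} mV/decade")
print(f"current ratio over one slope: "
      f"{subthreshold_current(0.30) / subthreshold_current(0.30 - slope):.1f}x")
```

    A few tens of millivolts of supply or threshold variation thus moves the current by an order of magnitude, which is why characterizing devices in this regime is so demanding.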

    Ambiq was formed in 2010 — with backing from ARM Holdings plc among others — to commercialize sub-threshold research carried out at the University of Michigan. However, the company’s first products were the AM08XX and AM18XX real-time clock (RTC) chips, which the company claimed were the world’s lowest power RTC chips due to their use of the SPOT process.

    Reply
  39. Tomi Engdahl says:

    Power amplifiers – Playing it safe
    http://www.edn.com/electronics-blogs/powersource/4420912/Power-amplifiers—Playing-it-safe

    When Murphy Hits

    When your power amplifier mimics a steam engine, you know that something is wrong – terribly wrong. Using power amplifiers for high voltage or high current applications has its challenges, in particular when the output of the amplifier is used in equipment that is handled by others or that operates in harsh environments. The impact of mishandling can go far beyond direct damage to the power amplifier. It can shut down a production line, damage other portions of the equipment, or even pose the threat of harming an individual.

    I didn’t do anything wrong!?

    It’s common to burn an amplifier or two when you design and test a new circuit. But once you’ve worked out the kinks, protected the inputs against voltage transients, and put in the right compensation to get the desired stability, you expect the design to work reliably.

    When we see returns of our power amplifiers from the field, failure analysis very often reveals operation of the power amplifier outside its safe operating area (SOA). This can be excessive stress from operating the device outside its input or supply voltage boundaries or beyond its temperature limits, from running higher currents than allowed for a certain period of time, or from the end user causing a short in the output circuitry.

    Aim for the Early Bird Special

    While you often can’t control the handling of your equipment, various options are available to protect your system. The best time to start thinking about system protection is right when you write your system specification. It’s a good idea at that time to analyze potential fault and mishandling scenarios and their impact on the overall system. Once that list is complete, you can then plan measures into your design to limit the impact of the faults you identified as critical. Pushing the fault analysis out to a later time will be costly.

    Depending on the device, an external controller will read the temperature of the power amplifier and monitor the health of the system, or the power amplifier will internally shut down its operation once the temperature exceeds a certain threshold.

    Limiting the maximum output current of the power amplifier provides another level of protection. Most power amplifiers provide pins for a current limit resistor. The amplifier measures the voltage across the current limit resistor and throttles the current when that voltage exceeds the set threshold. However, the current limit alone might not be fully comprehensive, as the power dissipation of the device needs to be taken into consideration as well.
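    As a rough illustration of how such a current-limit resistor is sized — assuming a hypothetical amplifier whose internal sense circuit trips at about 0.7 V across the resistor (a common base-emitter-junction threshold; the actual value comes from your device's datasheet):

    ```python
    def current_limit_resistor(i_limit_a, v_sense_v=0.7):
        """Size the external current-limit resistor so the amplifier's
        assumed 0.7 V sense threshold is reached at i_limit_a amps.
        R = V_sense / I_limit (Ohm's law)."""
        return v_sense_v / i_limit_a

    def dissipation_at_limit(i_limit_a, v_sense_v=0.7):
        """Power the sense resistor dissipates at the limit: P = V * I."""
        return v_sense_v * i_limit_a

    r = current_limit_resistor(2.0)   # 0.35 ohm for a 2 A limit
    p = dissipation_at_limit(2.0)     # 1.4 W -- choose a resistor rated well above this
    ```

    Note the second function: as the article says, the current limit is not the whole story, because the resistor (and the amplifier) must also survive the dissipation at the limit.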

    Current limits can be implemented in various ways. One option is a power supply with a built-in current limit. However, when planning to use that option, capacitances in the system have to be considered, as well as how quickly the current limit kicks in. Capacitances on the board that are used to stabilize the power supply can themselves turn into current sources under some fault conditions, allowing the system to exceed the maximum current for a limited period of time.
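    The warning about board capacitances can be made concrete: a charged decoupling capacitor shorted at the output dumps a current limited mainly by the series resistance in the loop, decaying with the RC time constant. A sketch with purely illustrative component values:

    ```python
    import math

    def cap_short_current(v_init, r_series, c_farads, t_seconds):
        """Discharge current of a capacitor into a short through series
        resistance r_series: I(t) = (V0 / R) * exp(-t / (R * C))."""
        return (v_init / r_series) * math.exp(-t_seconds / (r_series * c_farads))

    # Example: 470 uF charged to 24 V, with 0.1 ohm of ESR plus trace resistance.
    # The supply's current limit never sees this spike -- the capacitor delivers it.
    i_peak = cap_short_current(24.0, 0.1, 470e-6, 0.0)      # 240 A initial spike
    i_late = cap_short_current(24.0, 0.1, 470e-6, 200e-6)   # a few amps after ~4 RC
    ```

    This is why a supply-side current limit alone cannot protect against output shorts: the stored charge bypasses it for tens of microseconds, exactly the window the article describes.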

    SOA charts are available that indicate what current the device can handle relative to its supply and output voltage.

    One potential answer is the use of power amplifiers that offer internal short circuit detection and protection. Once a short is detected, the amplifier will shut down immediately – immediately meaning within nanoseconds or microseconds. A reset pulse or a power cycle will bring the amplifier back to life.

    Power amplifiers have become more sophisticated and provide increased protection functionality, like input voltage protection, temperature monitoring, or short circuit detection. However, they can’t cover all the faults that can damage a system. The importance of an early fault analysis can’t be stressed enough; it will save you money in the end.

    Reply
  40. Tomi Engdahl says:

    Wearable Devices Help Parents ‘Guard’ Children
    http://www.designnews.com/author.asp?section_id=1386&doc_id=267903&cid=nl.dn14

    Mobile phones have done a lot for helping parents keep track of where their children are, but what about younger children for whom mobile devices aren’t a practical option?

    Inspired by personal experience, a Taiwanese entrepreneur has come up with an answer to this problem: wearable devices that can help parents locate their kids if they wander off, so that a brief disappearance doesn’t become a permanent one.

    Fang was inspired to invent Guardian when his own child went missing from a department store.

    “Within that half an hour, which felt like a lifetime, Fang was absolutely helpless. He kept blaming himself for not paying better attention. Fang believes that with the aid of technology, the tragedy of missing children can be turned into a thing of the past.”

    The keys to the devices — which come in a buckle design that can be worn on clothing or a bracelet that can go around a child’s ankle or wrist — are Bluetooth 4.0 technology and an iOS-compatible smartphone application, Lu said. Guardian also uses Bluetooth Low Energy (BLE) and a replaceable button-cell lithium battery, CR2032, that’s rated at 3.0V for power and can last from four months to a year before being replaced. BLE — used in sensors and low-power devices — reduces the power the devices need to operate while maintaining Bluetooth’s typical communication range.

    Parents or guardians use a smartphone application to set up a variety of parameters for the devices, including a safety perimeter with a range of zero to 230 feet, Lu said. The Guardian device will send an alert to their smartphones if a child wanders too close to this perimeter so they can retrieve them.
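    The article doesn’t say how Guardian implements its perimeter, but BLE proximity alerts are commonly built on received signal strength (RSSI) with a log-distance path-loss model. A hedged sketch, with hypothetical calibration values that any real device would need to measure per unit:

    ```python
    import math

    def estimate_distance_m(rssi_dbm, tx_power_dbm=-59.0, path_loss_exp=2.0):
        """Log-distance path-loss estimate: d = 10 ** ((TxPower - RSSI) / (10 * n)).
        tx_power_dbm is the calibrated RSSI at 1 m; both values here are
        illustrative assumptions, not Guardian specifications."""
        return 10.0 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exp))

    def perimeter_alert(rssi_dbm, perimeter_m=70.0):
        """Alert when the estimated distance reaches the configured perimeter
        (the article's 230 feet is about 70 m)."""
        return estimate_distance_m(rssi_dbm) >= perimeter_m

    near = estimate_distance_m(-59.0)   # at the 1 m calibration point -> 1.0 m
    alert = perimeter_alert(-100.0)     # a very weak signal trips the alert
    ```

    RSSI-based ranging is coarse — walls, bodies, and multipath easily swing the estimate — which is presumably why such products alert as a child *approaches* the perimeter rather than exactly at it.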

    In the event a child wearing a Guardian device does go missing, the device — via the Bluetooth-based cloud network — sends the child’s location to his or her parents.

    Guardian devices are currently available by preorder through the BeLuvv website for $24.95 and will be sold at retail (starting Nov. 30) for $29.95. BeLuvv also is working on an Android-compatible version of the devices.

    Reply
  41. Tomi says:

    According to research firm Gartner, worldwide semiconductor capital spending will fall 6.8 percent to about $54.8 billion (roughly 40.5 billion euros) in 2013.

    The same forecast sees the market growing rapidly in 2014 and 2015 before falling back slightly in 2016.

    After those two years of growth, semiconductor capital spending should exceed $71 billion, or about 52 billion euros.

    The main reasons for the current dip are an oversupply from the beginning of the year that is slightly softening the mobile device market, and the gradual transition of some chip production from the current 28 nanometers toward 14 nanometers.

    Source: http://www.tietoviikko.fi/kaikki_uutiset/sirukauppa+hiippuu/a931983

    Reply
  42. Tomi Engdahl says:

    Apple M7 From NXP, Says Chipworks
    http://www.eetimes.com/document.asp?doc_id=1319563&

    NXP Semiconductors makes the M7 sensor controller used in the new Apple iPhone 5S and Samsung makes the phone’s A7 processor, according to a teardown by Chipworks (Ottawa). The iPhone 5S hit the market on September 20. Chipworks was among many companies rushing to post teardowns and analyses of its internal workings.

    “The M7 has been a difficult chip to locate on the board and rumors have been going around about the lack of a discrete M7 chip inside the iPhone 5S,” Chipworks analysts said in their online report.

    “Luckily, we’ve been able to locate the M7 in the form [of] the NXP LPC18A1, [part of] the LPC1800 series [of NXP's ARM] Cortex-M3 based microcontrollers,” they said (see above). “This represents a big win for NXP,” they added, given the volume and visibility of iPhone sales.

    The M7 controls functions from a variety of discrete sensors including a gyroscope, an accelerometer, and a compass. Chipworks noted that traditionally, Apple used STMicroelectronics’s accelerometer and gyroscope, and an electro-magnetic compass from Asahi Kasei Microdevices (AKM). “We have since confirmed the compass to be AKM’s AK8963,” they said.

    The A7 is Apple’s first 64-bit mobile SoC. Eventually teardown experts aim to explore the guts of the chip to determine details, such as how many cores it uses.

    “We have confirmed through early analysis that the device is fabricated at Samsung’s foundry,” Chipworks reports.

    A look inside the A7 (APL0698) found a contacted gate pitch of 114 nm (above). That suggests it is made in the Samsung 28nm HKMG process, the same process used for the Exynos 5410, the application processor in the Galaxy S IV, Chipworks said.

    Reply
  43. Tomi Engdahl says:

    News & Analysis
    Soft Mobile Market Hits Chip Capex
    Peter Clarke
    9/20/2013 09:11 AM EDT
    http://www.eetimes.com/document.asp?doc_id=1319555&

    Forecasts for capital expenditure in the chip industry have been reduced in the short term, affecting 2013 totals, due to perceived softness in the smartphone and tablet computer markets. However, a more favorable general economy is prompting a better outlook for 2014 and 2015.

    Market research firm Gartner reckons capital spending in the semiconductor industry is going to fall by 6.8 percent in 2013 to $54.77 billion before jumping by 14.1 and 13.8 percent respectively in 2014 and 2015.

    The sub-market of semiconductor manufacturing equipment is set to decline more steeply in 2013, by 8.5 percent, to $34.63 billion, while equipment specifically for wafer fabs will drop by 9.1 percent to $26.95 billion, the company said in a press release.

    The firm said the reduction in wafer fab spending is due to lowered investment in equipment for 28nm production following a softening in the premium mobile device market.

    Intel, TSMC, and Samsung are responsible for more than half of the chip capex in 2013 and the top ten firms account for 76 percent. Some easing of the situation may come late in the year as Intel prepares for 14nm production in 2014.

    Longer term, Gartner expects the general economic malaise to have worked through globally, resulting in two stronger years of electronic equipment and semiconductor component sales and a concurrent rise in capex spending.

    Reply
  44. Tomi Engdahl says:

    Apple no longer buys the most flash

    Apple became the largest purchaser of flash memory in 2010, driven by demand from its iPhone smartphones and iPad tablets. Now Apple’s position as the number one buyer is under threat.

    According to a recent IDC report, Apple’s purchases of NAND chips are no longer growing unabated.

    The growing popularity of Android and Windows Phone, especially in lower-end smartphones, means that flash chips are increasingly sold into those devices instead.

    NAND chip sales have increased 23.9 percent this year over last year. On this basis, IDC predicts that full-year sales will exceed the $30 billion mark.

    44 percent of NAND memory goes into smartphones and basic phones. One bit in four ends up in solid-state drives.

    Source: http://www.etn.fi/index.php?option=com_content&view=article&id=399:apple-ei-enaa-osta-eniten-flashia&catid=13&Itemid=101

    Reply
  45. Tomi Engdahl says:

    Video: ‘Terminator’ Polymer Has Potential for Device Design
    http://www.designnews.com/author.asp?section_id=1386&doc_id=267960

    Imagine if you dropped your mobile device and didn’t have to worry about breaking it, because you knew the phone would bounce harmlessly off the floor — and, even if it were damaged, it could repair itself.

    That’s the promise of a new material developed by researchers at the IK4-CIDETEC Research Center in Spain that can fuse back together in two hours after being severed. The polymer-based material uses “a poly(urea-urethane) type composition, a material which is widely used in industry,” the researchers said in a press release.

    A YouTube video (below) shows a researcher cutting a solid cylindrical-shaped piece of the material in half. The two halves are put back in contact with each other and left to sit at room temperature for two hours. In that time, according to the video, the material connects back together as a single piece of polymer that doesn’t separate when the researcher stretches it.

    Reply
  46. Tomi Engdahl says:

    NI Controllers Help Modernize Power Grid
    http://www.designnews.com/author.asp?section_id=1386&doc_id=267962

    National Instruments recently announced a new platform for its software-designed cRIO 9068 controller. This new product still uses LabVIEW programming but offers a 4x performance boost by using a 667 MHz ARM dual-core processor and new FPGAs. It also provides the ability to reuse existing code — users can re-compile most existing applications — and adds support for the Eclipse development environment.

    One target for the new platform is digital energy production, including, for example, an application developed by LocalGrid Technologies. The development of microgrid and smart distributed power generation solutions is a prime expansion area for automation and control vendors offering off-the-shelf controller options to help utilities move from electromechanical, one-way communication control to more potent digital solutions that incorporate networking and measurement technology.

    The key with LocalGrid’s eGridOS software is that it allows grid operators unfamiliar with measurement control software tools to develop software and write application-specific algorithms that can be deployed to in-field devices. Even though each microgrid system uses a different topology and each installation is unique, a combination of local grid cells, remote HMIs, and cell asset nodes can be monitored and controlled using a substation controller.

    Through the NI platform, LocalGrid engineers are taking advantage of the toolkits for power measurement, communication, and industry certifications available with the platform. The DNP3 (Distributed Network Protocol) is being used along with the NI Electrical Power Suite to perform online analysis and calculation of power signals. These protocols are required for these installations, and help LocalGrid reduce both costs and effort.

    Reply
  47. Tomi Engdahl says:

    Test equipment, two steps removed
    Martin Rowe – September 19, 2013
    http://www.edn.com/electronics-blogs/rowe-s-and-columns/4421399/Test-Equipment–Two-Steps-Removed

    Agilent Technologies announced today that it is dividing yet again. This time, it’s splitting off its test-and-measurement business into a separate company that will get a new name, as yet unknown. You may recall that HP split off Agilent in 1999, ending years of speculation.

    What does this mean for the test business? The original part of HP that Bill and Dave started will, by the end of 2014, be two steps away from its founders.

    From an engineering perspective, I don’t think the name of the company makes a difference to you, provided the new company keeps developing the test equipment you need to do your job. Any faltering, in engineering or technical support, will be capitalized upon by competitors.

    Since 1999, a generation of engineers has grown up not knowing that HP was ever a test-and-measurement company.

    Reply
  48. Tomi Engdahl says:

    Flexible medical sensors aid diagnoses
    http://www.edn.com/electronics-blogs/anablog/4421437/Flexible-medical-sensors-aid-diagnoses

    Modern day MEMS and integrated circuit advances are now making it possible to integrate a tiny, flexible sensor at the tip of a catheter or guidewire.

    There exists a generic platform for the fabrication and assembly of partially flexible sensors at the tip and around minimally invasive medical devices such as catheters and guidewires. This is called the Flex-to-Rigid (F2R) platform.

    Examples of different IC-based sensors that can be made from this flexible technology are as follows:

    The transducing IC function of a forward looking medical imager needs to be located at the tip of the instrument. In this case, a circular or a ring-shaped transducer is needed

    The difference in diameter between guidewires and catheters plays an important role when a sensing functionality has to be implemented around the instrument. In the case of a catheter, it is possible to use tiny rigid silicon tiles connected with flexible interconnects

    A 360-degree thermal flow sensor can be positioned around a guidewire; it requires a fully flexible sensor foil, and at the same time a pressure sensor can be placed inside it

    Reply
