Electronics trends for 2013

The electronics industry will hopefully start to grow again after a not-so-good 2012. It’s safe to say that 2012 was a wild ride for all of us. The global semiconductor industry demonstrated impressive resilience in 2012, despite operating in a challenging global macroeconomic environment. Many have already ratcheted back their expectations for 2013. Beyond 2012, the industry is expected to grow steadily and moderately across all regions, according to the WSTS forecast. So we should see moderate growth in 2013 and 2014. I hope this happens.

The non-volatile memory market is growing rapidly. The Underlying technologies for non-volatile memories article tells that non-volatile memory applications can be divided into standalone and embedded system solutions. Standalone applications, which tend to be driven primarily by cost, are dominated by NAND flash technology. The embedded market relies mainly on NOR flash for critical applications and NAND for less critical data storage. Planar CT NAND and 3D NAND could become commercially viable this year or within a few years. MRAM, PCRAM, and RRAM will need more time and new material innovations to become major technologies.

Multicore CPU architectures are a little like hybrid vehicles: Once seen as anomalies, both are now encountered on a regular basis and are widely accepted as possible solutions to challenging problems. Multi-core architectures will find their application but likely won’t force the extinction of single-core MCUs anytime soon. Within the embedded community, a few applications now seem to be almost exclusively multicore, but in many others multicore remains rare. There are concerns over the complexity and uncertainty about the benefits.

The FPGAs as the vanishing foundation article tells that we are entering a new environment in which the FPGA has faded into the wallpaper – not because it is obsolete, but because it is both necessary and ubiquitous. After the FPGA has displaced most functions of ASICs, DSPs, and a few varieties of microcontrollers, it’s fair to ask if there is any realm of electronic products where its use is not automatically assumed. Chances are, in the next few years, the very term “FPGA” might be replaced by “that ARM-based system on a chip” from Xilinx, Altera, Lattice, or another vendor.

Software and services have become the soul of consumer technology. Hardware has become increasingly commoditized into blank vessels that do little more than hold Facebook and Twitter and the App Store and Android and iOS.

Are products owned when bought? The trend in recent decades has been an increase in the dependence of the buyer on the seller.

More than 5 billion wireless connectivity chips will ship in 2013, according to market research firm ABI Research. This category includes standalone chips for Bluetooth, Wi-Fi, satellite positioning, near-field communications and ZigBee, as well as so-called “combo” chips that combine multiple standards. Broadcom is seen retaining its lead in connectivity chips. Bluetooth Smart, WiGig and NFC are all seeing increased adoption in fitness, automotive and retail applications. Combo chips are also a growing opportunity, based on the popularity of smartphones, tablet computers and smart televisions.

Signal integrity issues are on the rise as both design complexity and speed increase all the time. The analog world is moving faster than ever. Learning curves are sharper, design cycles are shorter, and systems more complex. Add to all this the multidisciplinary, analog/digital nature of today’s designs, and your job just gets more complicated.

The High-speed I/O: On the road to disintegration? article tells that increases in data rates, driven by the need for higher bandwidth (10Gbps, 40Gbps, 100Gbps networking), mean the demands on system-level and chip-to-chip interconnects are increasingly challenging design and manufacturing capabilities. For current and future high-performance links, high-speed serial interfaces featuring equalization could well become the norm, and high levels of SoC integration may no longer be the best solution.

For a long time, the Consumer Electronics Show, which began in 1967, was the Super Bowl of new technology, but now the consumer electronics show as a concept is changing and maybe fading out in some ways. The social web has replaced the trade show as a platform for showcasing and distributing products, concepts and ideas.

NFC, or near-field communications, has been around for 10 years, battling its own version of the chicken-and-egg question: Which comes first, the enabled devices or the applications? The Near-field communications to go far in 2013 article expects that this is the year for NFC. NFC is going to go down many different paths, not just the mobile wallet.

3-D printing was hot last year and is still hot. We will be seeing much more on this technology in 2013.

Inexpensive tablets and e-readers will find their users. Sub-$100 tablets and e-readers will offer more alternatives to pricey iPads and Kindles. The sub-$200 higher-performance tablet segment is also selling well.

User interfaces will evolve. Watch for capacitive sensing integrating multiple interfaces, human-machine interfaces entering the third dimension, and ubiquitous sensors meeting the most natural interface of all: speech.

Electronics in the automotive industry is advancing at a furious pace. The automotive industry in the United States is steadily recovering, and nowadays electronics run pretty much everything in a vehicle. Automotive electronics system trends also impact test and measurement companies. Of course, with new technologies come new challenges: faster transport buses, more wireless applications, higher switching power, and the sheer amount and density of electronics in modern vehicles.

The Next Round: GaN versus Si article tells that wide-bandgap (WBG) power devices have arrived in the form of gallium nitride (GaN) and silicon carbide (SiC). These devices provide low RDS(on) together with higher breakdown voltage.

Energy harvesting was talked about quite a lot in 2012, and I expect it will find more and more applications this year. Four main ambient energy sources are present in our environment: mechanical energy (vibrations, deformations), thermal energy (temperature gradients or variations), radiant energy (sun, infrared, RF) and chemical energy (chemistry, biochemistry). Peel-and-stick solar cells are coming.

Wireless charging of mobile devices is gaining popularity. Qi wireless charging technology is becoming the industry standard, as Nokia, HTC and some other companies use it. There is a competing A4WP wireless charging standard pushed by Samsung and Qualcomm.

In recent years, ‘Low-carbon Green Growth’ has emerged as a very important issue in selling new products. The LED lighting industry analysis and market forecast article tells that ‘Low-carbon Green Growth’ is a global trend and that LED lighting is becoming the most important axis of the ‘Low-carbon Green Growth’ industry. Expectations for industry productivity and job creation are very high.

A record amount of dangerous electrical equipment has been pulled from the market through the Finnish Safety and Chemicals Agency’s oversight. Poor equipment design has been found in many products, especially in LED light bulbs. Almost 260 items were taken off the market, and very many of them were LED lights. Manufacturers rushed into the new technology with great enthusiasm and forgot basic electrical engineering. CE marking is not in itself a guarantee that a product is safe.

The “higher density,” “higher dynamics” trend is also challenging traditional power distribution technologies within systems. Some new concepts are being explored today. The AC vs DC power discussion in data centers is going strong, and redundant power supplies are called for in many demanding applications.

According to IHS, global advanced meter shipments are expected to remain stable from 2012 through 2014. Smart electricity meter penetration is seen doubling by 2016 (to about 35 percent). In the long term, IHS says it anticipates that the global smart meter market will depend on developing economies such as China, Brazil and India. What’s next after the smart power meter? How about some power backup for the home?

The Energy is going digital article claims that graphical system design changes how we manipulate, move, and store energy. What defines the transition from analog to digital, and how can we tell when energy has made the jump? First, the digital control of energy, in the form of electricity, requires smart sensors. Second, digital energy systems must be networked and field-reconfigurable, sending data that makes continuous improvements and bug fixes possible. Third, the system must be modeled and simulated with high accuracy and speed. When an analog technology goes digital, it becomes an information technology — a software problem. The digital energy revolution is enabled by powerful software tools.

The cloud is talked about a lot, both as a design tool and as a service that connected devices connect to. The cloud means many things to many people, but irrespective of how you define it, there are opportunities for engineers to innovate. EDA companies put their hopes on accelerating embedded design with cloud-enabled development platforms. They say that The Future of Design is Cloudy. M2M companies are competing to develop solutions for easily connecting embedded devices to the cloud.

Trend articles worth checking out:
13 Things That Went Obsolete In 2012
Five Technologies to Watch in 2013
Hot technologies: Looking ahead to 2013
Technology predictions for 2013
Prediction for 2013 – Technology
Slideshow: Top Technologies of 2013
10 hot consumer trends for 2013

Popular designer articles from last year that could give hints about what to expect:
Top 10 Communications Design Articles of 2012
Top 10 smart energy articles of 2012
Slideshow: The Top 10 Industrial Control Articles of 2012
Looking at Developer’s Activities – a 2012 Retrospective

626 Comments

  1. Tomi Engdahl says:

    Latest News on Product, Technologies, Services and Solutions to Help Accelerate Innovation
    http://www.synopsys.com/Company/Publications/SynopsysInsight/Pages/Art9-innovation-update-IssQ1-13.aspx?elq_mid=4289&elq_cid=303473&cmp=Insight-I1-2013-Art9-Email&elq=ed0ed32f05624b27b9d75e948e09e76d

    Synopsys offers a wide range of products, solutions, and services to the global electronics market for designing and manufacturing semiconductors. In this article, we provide a brief overview of key additions to some of our existing solutions as well as showcase recent innovations to Synopsys’ ever-growing lineup of software, IP and services.

    Reply
  2. Tomi Engdahl says:

    Embedded Memory Test & Repair at 20-nm Nodes and Below
    http://www.synopsys.com/Company/Publications/SynopsysInsight/Pages/Art6-star-IssQ1-13.aspx?elq_mid=4289&elq_cid=303473&cmp=Insight-I1-2013-Art6-Email&elq=ed0ed32f05624b27b9d75e948e09e76d

    With embedded memories dominating the SoC area in today’s designs, SoC yield relies heavily on memory yield. In order to meet the stringent requirements of next-generation, high-performance devices, these designs must be larger and use multiple processor cores. The increased design complexity presents a unique set of test and yield challenges including higher test costs, yield implications due to a higher total bit count, higher power consumption during test, and lower design productivity. Additionally, there is greater manufacturing complexity in 20-nm technology nodes, which creates new yield challenges, both in the form of increased defect densities and in the form of new types of failure mechanisms that need to be modeled for accurate detection, diagnosis, and repair. It is essential to have an embedded memory test and repair solution that not only meets the above challenges for today’s designs, especially those at 20-nm and below, but that is also cost-effective.

    This article provides an overview of the newly released DesignWare® STAR Memory System® 5 to specifically address the challenges of designs on 20 nm and below.

    Reply
  3. Tomi Engdahl says:

    Researchers Advance Development of Organic Batteries
    http://www.designnews.com/author.asp?section_id=1386&doc_id=261858&cid=NL_Newsletters+-+DN+Daily

    As the news of overheating lithium-ion batteries continues to surface, the industry is working to develop fuel and energy cells made of different materials. And a group of US and UK researchers has made a breakthrough in the development of batteries that use bacteria as their reactive material.

    Researchers at Pacific Northwest National Laboratory in Washington state and the UK’s University of East Anglia have found that it’s possible to produce an electrical current by touching proteins on the surface of bacteria to a mineral surface, thus paving the way for the creation of microbial fuel cells.

    Reply
  4. Tomi Engdahl says:

    Intel Tries to Secure Its Footing Beyond PCs
    http://www.nytimes.com/2013/04/15/technology/intel-tries-to-find-a-foothold-beyond-pcs.html?pagewanted=all&_r=0

    For the last several months, Andy Bryant, the chairman of Intel, has been trying to put steel in the backs of the company’s employees. At meetings, he tells them that Intel must fundamentally change even though the computer chip maker still has what it takes to succeed in engineering and manufacturing.

    unofficial motto, “Only the paranoid survive.” Intel now finds itself faced with a fundamental question: Can the paranoid also evolve?

    PC sales are now collapsing, as users are relying more on mobile phones and tablets that rarely contain Intel chips.

    Intel’s other mainstay business, chips for computer servers, is also changing. Cloud computing is creating huge demand for basic servers, but its simpler and cheaper designs may drive down prices and profit margins and offer openings to new competitors.

    Intel is also scrambling to find a new leader.

    “In this new world, with smartphones and tablets, and cloud computing, things are moving around fast,” said Hector Ruiz, the former chief executive of Advanced Micro Devices, Intel’s top competitor in making PC chips. “Intel has the talent, engineering, and resources, but they are their own worst enemy.”

    The idea that Intel put the squeeze on its customers has been around Silicon Valley for years, but has never been proved. With Intel controlling 80 percent of the PC market at times, and PC makers facing low profit margins, any supply interruption from Intel could be disastrous.

    Even Microsoft has been moving away from its longtime partner as it tries to adapt to the new, post-PC world.

    Reply
  5. Tomi Engdahl says:

    Memory Effect Discovered In Lithium-Ion Batteries
    http://hardware.slashdot.org/story/13/04/15/1558240/memory-effect-discovered-in-lithium-ion-batteries

    “Lithium-ion batteries have long been thought to be free of the memory effects of other rechargeable batteries. However, this appears to be not the case.”

    Comment: This has actually been theorized for a long time

    ‘Charge memory’ found in Li-Ion batteries
    Better charge management needed
    http://www.theregister.co.uk/2013/04/15/liion_charge_memory_effect/

    The widespread belief that lithium-ion batteries don’t suffer from “charge memory” might be mistaken, according to new research out of Japan and Switzerland.

    The research, published in Nature Materials (abstract), finds that “charge memory” can emerge in the common electrode material lithium-iron phosphate (LiFePO4). As a result, the authors suggest, Li-Ion based systems might mis-report the charge state of the batteries.

    The effect can be overcome, Novak explains, by adapting the battery management software – since idling the batteries for long enough can erase the memory effect.

    Reply
  6. Tomi Engdahl says:

    SiTime enters smartphone market with first MEMS oscillator
    http://www.edn.com/electronics-products/electronic-product-reviews/other/4411944/SiTime-enters-smartphone-market-with-first-MEMS-oscillator

    SiTime Corporation introduced the SiT15xx family of 32 kHz MEMS oscillators that replace legacy quartz crystal resonators. This new family is targeted at mobile applications, such as smartphones and tablets, which require small size and low power. The SiT15xx family overcomes the limitations of quartz-based devices in several ways; SiTime’s solutions offer area savings of 85%, cut power by 50% and are 15 times more reliable, all of which enables smaller, lower power and longer lasting mobile electronics.

    Supply current of only 0.75 microamps (typical) is 50% better than an equivalent quartz solution

    Reply
  7. Tomi Engdahl says:

    Shaped Foil Technology
    http://library.constantcontact.com/download/get/file/1101859139491-127/Shaped+Foil+Technology.pdf

    West Coast Magnetics, in partnership with the Thayer School of Engineering at Dartmouth, has developed new and improved technology for high power SMPS transformers and inductors. This technology uses new foil shaping techniques which result in windings that have the low DC resistance characteristic of a foil winding AND the low AC resistance of a litz wire winding. Shaped foil designs in both inductors and transformers have been shown to result in a significant and verifiable reduction in winding losses.

    Dartmouth and WCM researchers have developed a model to optimize the paralleling of foil strands in order to achieve 20% to 50% lower winding losses than conventional windings.

    Reply
  8. Tomi Engdahl says:

    SanDisk ’2-3 years’ away from mass-producing 3D flash chips
    19nm NAND wafers ready by Q3 2013
    http://www.theregister.co.uk/2013/04/22/sandisk_q1_2013/

    Enterprise flash storage is proceeding inexorably down the process-shrinking road. But what happens when the shrinkage stops and flash devices evolve from using 2D to 3D chips? SanDisk thinks it might have the answer.

    Current SanDisk enterprise SSDs and PCIe flash products are made using 24nm NAND. Mehrotra said: “We expect to launch our next-generation SAS and PCIe SSDs manufactured on our 19-nanometer technology in the second half of this year.”

    The 19nm move is more advanced with client SSDs: “For the client market, we have begun revenue shipment of our new 19-nanometer SSDs to the retail and B2B channels”

    The 19nm process is also called 1X, with 24nm flash being a 2X process in flash manufacturing lingo. The industry sees future 1Y and 1Z processes coming, and these generally refer to 18nm–15nm (1Y) and 14nm–10nm (1Z) processes.

    SanDisk operates flash foundries in partnership with Toshiba. There are basically two ways SanDisk, or any other flash foundry operator, can produce more flash bits. One is to build and commission more flash fabrication foundries, known as fabs, so as to produce more wafers; the other is to shrink the size of flash cells, via a process shrink, and so get more flash dice from a wafer. Process shrinks cost money as equipment has to be re-jigged.

    SanDisk has, Mehrotra says, decided 3D stacking of layers of planar or 2D cells is the way to go

    “NAND’s future technology roadmap beyond 1Z … is not certain, and the 3D technologies of the future are in — still in early stages of development. It’s not clear that how much of the toolset of those 3D technologies will be usable [in] common with the NAND memory production. … the 3D technology [is] 2 to 3 years away from any meaningful production.”

    The indicated timescales are:
    1Y NAND starting in 2013’s third quarter and ramping up through 2014
    1Z production starting, we think, in 2015/16 and ramping up after that
    3D BiCS production starting in 2015/16 with pilot production first, perhaps in 2014
    3D ReRAM starting, we guess, in 2017/18 or maybe later.

    “We do believe that when we transition in the future to 3D technologies, post-planar NAND technologies, that, at some point, we will need to add new capacity for those technologies and that the capital cost of adding that capacity will be high”

    Reply
  9. Tomi Engdahl says:

    The financial situation of the electronics industry is poor: the semiconductor industry’s investments fell by 16.1 percent last year, Gartner estimates in a statement.

    Globally, the industry invested $37.8 billion (about 29 billion euros) in new production machinery.

    Source: http://www.tietoviikko.fi/kaikki_uutiset/puolijohdeteollisuus+viileni+viime+vuonna/a896237?s=r&wtm=tietoviikko/-22042013&

    Reply
  10. Tomi Engdahl says:

    STMicroelectronics satellite-tracking chips work with all world major satellite nav systems
    http://www.edn.com/electronics-products/electronic-product-reviews/other/4412380/STMicroelectronics-satellite-tracking-chips-work-with-all-world-major-satellite-nav-systems

    ST and European Space Agency performed the first-ever position fix using GALILEO satellites

    Introduced as the world’s first single-chip positioning device for multiple global navigation systems in January 2011, ST’s Teseo II is a single-chip standalone satellite receiver able to use signals from all of the satellite navigation systems, including GPS, the European GALILEO system, Russian GLONASS and Japanese QZSS.

    In March 2013, the first position fix of longitude, latitude and altitude using the four Galileo satellites currently in orbit was performed by the European Space Agency (ESA) at its Technology Centre in the Netherlands and by ST at its GNSS (Global Navigation Satellite System) software development labs in Naples, Italy.

    Reply
  11. Tomi Engdahl says:

    While PC sales have plunged, Intel faces another problem in addition to declining sales: its expensive fabs are in danger of sitting idle.

    Thus, the company has started to work as a contract manufacturer for a number of technology companies. But could the chips Intel manufactures, such as Atom, compete with ARM chips?

    “Technically, yes, we can produce anything, as we have the world’s most advanced manufacturing facilities. Under our current policy, however, we do not make competing products. We select our clients.”

    Source: http://www.digitoday.fi/bisnes/2013/04/26/intel-nailla-konsteilla-pc-uudistuu/20136059/66?rss=6

    Reply
  12. Tomi Engdahl says:

    Can EV Batteries Last 20 Years?
    http://www.designnews.com/document.asp?doc_id=261882&cid=NL_Newsletters+-+DN+Daily&dfpPParams=ind_184,industry_auto,aid_261882&dfpLayout=article

    Lithium-ion batteries for electric cars may last far longer than we’ve been led to believe, a battery expert told the American Chemical Society in a speech this week.

    Mikael Cugnet of the French Atomic Energy Commission said current estimates of an eight-year lithium-ion life have been based on accelerated tests that don’t necessarily provide an accurate picture of how long the batteries will really last in electric cars and hybrids. He believes that if managed properly, EV battery packs could operate reliably for 15 years, and possibly as long as 20 years.

    “Up to now, researchers have also based their estimates for lithium batteries on prior experience with lead-acid and nickel-metal hydride,” Cugnet said. But he argues that packs made with those materials tend to use less consistent manufacturing processes.

    The wrong charging techniques can also shorten a battery’s life. Lithium-ion battery packs need to stay as close as possible to a 50 percent charge, he said, usually going no higher than 80 percent and no lower than 20 percent. Moreover, electric car owners should refrain from doing too many “fast charges,” in which an EV battery can be recharged in under an hour.

    “The extrapolation we’ve made from our own tests shows that lithium-ion packs can last 15 or even 20 years,” he said. “It mostly depends on how you charge it and what temperature it operates at.”
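
    A minimal sketch of the charge-window rule Cugnet describes: stay near 50% state of charge, going no higher than about 80% and no lower than 20%. The thresholds and function names below are illustrative, not from any real battery management API.

    #include <stdbool.h>
    #include <stdio.h>

    #define SOC_MAX 0.80  /* stop charging above ~80% state of charge */
    #define SOC_MIN 0.20  /* avoid discharging below ~20% */

    static bool charging_allowed(double soc) { return soc < SOC_MAX; }
    static bool below_window(double soc)     { return soc < SOC_MIN; }

    int main(void)
    {
        double soc = 0.47;  /* example pack at 47% charge */
        printf("charging allowed: %s, below window: %s\n",
               charging_allowed(soc) ? "yes" : "no",
               below_window(soc) ? "yes" : "no");
        return 0;
    }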

    Reply
  13. Tomi Engdahl says:

    Unlocking the Potential of “Big Data” in Low-Power Wireless Sensor Networks
    http://rtcmagazine.com/articles/view/102975

    Wireless networks are connecting millions of sensors and devices into the “Internet of Things.” Collectively these are sending huge amounts of data that must eventually be aggregated and utilized in the cloud.

    By 2020, there could be 50 billion devices that communicate wirelessly. According to the GSM Alliance, just a quarter of these devices will be mobile handsets and personal computers. The rest will be autonomous connected devices that communicate with other machines without user interaction. The Internet we know today is rapidly evolving into a web of connected wireless devices—the Internet of Things (IoT).

    Options for connecting devices wirelessly are numerous, but some of the more popular include Wi-Fi, Bluetooth, ZigBee and proprietary solutions based on sub-GHz technologies. Each solution has its own set of strengths and weaknesses, but in this emerging world of unprecedented connectivity, these wireless technologies will co-exist

    In wireless sensor network environments scalability is vital. An individual sensor may not provide status updates more than once a second and only transmit a few bytes of information each time, but even a single building can have tens of thousands of nodes.
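
    To put the scalability point in numbers, here is a back-of-envelope calculation (my own assumed figures, not from the article) of what one building’s worth of nodes adds up to at the gateway:

    #include <stdio.h>

    int main(void)
    {
        const long   nodes            = 10000; /* "tens of thousands of nodes" */
        const long   bytes_per_update = 8;     /* assumed payload per report */
        const double updates_per_sec  = 1.0;   /* one status update per second */

        double bytes_per_sec = nodes * bytes_per_update * updates_per_sec;
        /* ~80 kB/s of raw payload; protocol overhead multiplies this */
        printf("aggregate payload: %.1f kB/s (%.2f Mbit/s)\n",
               bytes_per_sec / 1e3, bytes_per_sec * 8.0 / 1e6);
        return 0;
    }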

    Reply
  14. Tomi Engdahl says:

    Internal mechanism secures power electrical connections
    http://www.electronicproducts.com/Interconnections/Sockets_and_Pins/Internal_mechanism_secures_power_electrical_connections.aspx

    The IEC Auto-Lock is an internal retention mechanism that hooks to the pins of power supply inlets and resists any axial force unless the mechanism is released.

    Universal fit with any C14 or C20 inlet (snap-in or screw-on)

    Quail Electronics
    http://www.quail.com/iec-auto-lock.aspx

    The Auto-Lock™ is an internal retention mechanism that actually hooks to the pins of power supply inlets and resists any axial force unless the Auto-Lock™ mechanism is released! Tremendous amounts of force cannot get the electrical connector to budge, but as soon as the IEC Auto-Lock™ mechanism is manually deactivated, the connector slips easily out of the inlet.

    With a unique patented locking mechanism on the connector side, fastening and unlocking your cord is just one pull away. After inserting the connector end into your device, it is safely secure. Try to pull on the cord and you won’t be able to disconnect it. When you want to remove the cord, simply pull simultaneously on the red sliding tabs and voila!

    Reply
  15. Tomi Engdahl says:

    IP should stand for Intellectual Partnership
    http://www.edn.com/electronics-blogs/practical-chip-design/4412820/IP-should-stand-for-Intellectual-Partnership

    He talked about how central the IP business model is to the system design industry these days and how it has recreated the entrepreneurial spirit for many designers. It is now possible for them to create something of value either on their own or with a small team of people. He admits that you have to have a stable company behind you because fly-by-night is not a successful way to do business where there is a large element of trust and partnership in an IP related deal.

    He spoke about some of the difficulties that companies have in pricing IP and more specifically deciding which upgrades or modification to do for individual clients. Making a variant of a design can benefit all of the current and future adopters of the IP, but at the same time can also bring in more instability in the product. More variants mean more to verify

    He stated that the biggest issue he sees in the industry today is one of communications. As previously stated, a company does not sell a block of IP and throw it over a wall. A sale means the start of a partnership – one in which the IP company takes on the role of the domain expert for part of the chip.

    Reply
  16. Tomi Engdahl says:

    Building New Materials With Light
    http://cen.acs.org/articles/91/web/2013/04/Building-New-Materials-Light.html

    Nanotechnology: A simple optical system assembles two-dimensional structures of nanoparticles using a laser beam and could lead to new materials for sensors and photonic devices

    In optical traps, a concentrated beam of light puts a force on particles, causing them to move toward the most intense part of the beam. Current systems that can manipulate many particles at once do so by generating complex light fields using space-hogging setups with many lenses.

    Povinelli’s optical trap uses a patterned slab of silicon called a photonic crystal. She and her collaborators etched into the crystal a regular array of 300-nm-diameter holes, spaced 860 nm from one another. They immersed the slab in a suspension of 520-nm-diameter polystyrene particles and illuminated it from below with a laser. The particles floating above the crystal then moved into the holes, forming a square crystal lattice measuring 13 μm on each side.

    The holes in the silicon interact with laser light to create an intense light field above the slab. This field holds the particles in the lattice.

    Reply
  17. Tomi Engdahl says:

    Structural test coverage is declining due to advanced packaging and PCB technologies (HDI) which are limiting test access points. Today’s coverage has fallen to 30-60% for many high density products.

    Declining coverage and increasing clock speeds are resulting in unpredictable OS boot failures and difficult fault identification. Embedded test takes a different approach by verifying component operation at a low level to quickly identify defective parts.

    Source:
    Kozio Technical Webinar On-Demand
    http://info.kozio.com/OD_2013_04_TestingProductionOnDemand.html?mkt_tok=3RkMMJWWfF9wsRoiu6nAZKXonjHpfsX67%2B8uXbHr08Yy0EZ5VunJEUWy3IEDRNQ%2FcOedCQkZHblFnVwIQ629UKINoqcI
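
    As an illustration of what “verifying component operation at a low level” means in practice, here is a textbook walking-ones data-bus test of the kind board-level embedded test runs before the OS boots. This is a generic routine, not Kozio’s actual implementation:

    #include <stdint.h>

    /* Walk a single 1 across the data bus at one test address.
     * Returns 0 on success, or the first pattern that failed,
     * which points to a stuck or shorted data line. */
    uint32_t data_bus_test(volatile uint32_t *addr)
    {
        for (uint32_t pattern = 1; pattern != 0; pattern <<= 1) {
            *addr = pattern;        /* drive one data line high */
            if (*addr != pattern)   /* read back over the same bus */
                return pattern;
        }
        return 0;
    }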

    Reply
  18. Tomi Engdahl says:

    Secure authentication engine targets smart card digital transactions
    http://www.edn.com/electronics-products/other/4413052/Secure-authentication-engine-targets-smart-card-digital-transactions

    Crocus Technology announced a prototype of Match in Place (MiP), a distinctive security technique that strengthens user authentication in digital transactions.

    MiP is specially designed for smart card makers seeking to implement biometrics and other strong user authentication in smart cards, secure microcontrollers, tokens and mobile devices.
    During an authentication process, the direct bit-to-bit comparison of the stored reference pattern with the pattern to be authenticated occurs within the memory array.

    In the new MiP approach, the sensitive data stored in a Magnetic Logic Unit (MLU) array is never revealed to an external bus or register. This makes the security more robust.

    Crocus will make MiP available as an FPGA (Field Programmable Gate Array) demonstrator in Q3 2013.

    Each cell of the MiP technology is a non-volatile memory cell combined with the virtual XOR gate (digital logic gate) of the MLU. Multiple cells are connected in series to a NAND chain acting as a linear MiP engine.
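
    A conceptual software model of that match-in-place logic may make it clearer: each cell XORs a stored reference bit against a candidate bit, and only the combined match/no-match verdict leaves the chain. Note that in the real MLU array the comparison happens inside the memory and the reference bits never reach a bus or register; this sketch of mine only models the logic, not the security property.

    #include <stdbool.h>
    #include <stdint.h>
    #include <stddef.h>

    static bool match_in_place(const uint8_t *stored,
                               const uint8_t *candidate, size_t len)
    {
        uint8_t diff = 0;
        for (size_t i = 0; i < len; i++)
            diff |= stored[i] ^ candidate[i];  /* per-cell virtual XOR */
        return diff == 0;  /* only the verdict is exposed */
    }

    int main(void)
    {
        const uint8_t ref[]  = { 0xDE, 0xAD, 0xBE, 0xEF };
        const uint8_t cand[] = { 0xDE, 0xAD, 0xBE, 0xEF };
        return match_in_place(ref, cand, sizeof ref) ? 0 : 1;
    }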

    Reply
  19. Tomi Engdahl says:

    ARM launches free development tools for embedded Linux
    http://www.edn.com/electronics-products/other/4413035/ARM-launches-free-development-tools-for-embedded-Linux

    ARM has extended the scope of its Development Studio 5 (DS-5) Community Edition (CE) to provide a fully featured, industry standard, and free-to-use software development environment for Embedded Linux applications including Android.

    DS-5 CE provides an integrated solution including an Eclipse IDE, GNU cross-compiler, DS-5 Debugger, Streamline performance analyzer, online help and software examples.

    The graphical DS-5 Debugger only needs an Ethernet connection to the target to enable power debug features typically available only in commercial debuggers. In addition, it integrates Linux-specific functionality, such as a target file system explorer and an automated flow for downloading applications to the target, launching them and connecting the debugger.

    The use of Linux is growing rapidly in the embedded space, fueled by the availability of low-cost, low-power, high-performance ARM processor-based MPUs with working Linux ports and communication stacks. Unfortunately, getting started with Embedded Linux can be a daunting experience, with a number of fragmented open-source development tools with command line interfaces and lack of interoperability. Just getting a Linux cross-development environment up and running may take hours for a Linux expert, or days for an embedded developer from a Microcontroller background.
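
    For anyone who hasn’t been through that setup, the canonical first step once a cross-toolchain works is a target-side hello world. The toolchain triplet below is an assumption (it varies by distribution); the point is just the flow: cross-compile on the host, copy over Ethernet, run on the target.

    /* hello.c - build on the host with something like:
     *   arm-linux-gnueabihf-gcc -o hello hello.c
     * then copy the binary to the ARM target and run it there. */
    #include <stdio.h>

    int main(void)
    {
        printf("hello from the ARM Linux target\n");
        return 0;
    }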

    Reply
  20. Tomi Engdahl says:

    Opinions vary widely on IoT security concern
    http://www.edn.com/electronics-blogs/systems-interface/4413081/Opinions-vary-widely-on-IoT-security-concern

    Will the IoT (Internet of Things) become a hacker’s paradise? Or is concern over security for the embedded systems that define the IoT overblown?

    Opinions about IoT security are as varied as the systems that will make the IoT, according to a study released last week at DESIGN West by UBM Tech (EDN’s parent company) and VDC Research

    27% of survey participants indicated the industry is not very vulnerable or not vulnerable at all to attacks on IoT/M2M devices.

    I have to assume that those who aren’t worried either figure IoT devices a) aren’t penetrable or b) lie below the threshold of interest of bad actors. It’s safe to say that any system can be penetrated

    I’m having a hard time with the “somewhat worried” category: If there’s a basic acknowledgement of a security problem, we all should be very worried. Even under the assumption that the IoT will comprise billions of smart sensors with hardwired operation that can’t be modified remotely, there are too many opportunities for corrupting the data stream – make that deluge – of information flowing through the IoT

    It’s difficult to identify right and wrong answers when it comes to security of devices for a system of systems that isn’t built yet and may take markedly different turns than traditional systems. It stands to reason, however, that if the foundation is vulnerable, the system of systems is vulnerable.

    Reply
  21. Tomi Engdahl says:

    Mobile Technology’s Influence on Data Acquisition
    http://www.designnews.com/author.asp?section_id=1386&doc_id=262720&

    Industrial users are starting to expect mobile access to measurement data, according to an article on mobile technology that is part of the recently released “Data Acquisition Technology Outlook 2013” report from National Instruments

    The report states:

    The worldwide proliferation of mobile devices has given people unlimited and instant access to information. Questions no longer go unanswered, as information is made available from anywhere, anytime. Mobile technology has created a natural expectation to have continuous access to information and it is now influencing the data acquisition market.

    The report also quotes Jessy Cavazos, industry director for test and measurement at the consulting firm Frost & Sullivan:

    Mobile computing devices are evolving and providing opportunities for wireless data acquisition systems. This is going to change the data acquisition market.

    Though we may not be surprised by this trend or even by that bold prediction, they do raise questions about the impact of mobile technology on data acquisition methods and tools.

    The other articles that are part of the report are:

    Big Analog Data and Data Acquisition: “Differentiation is no longer about who can collect the most data; it’s about who can quickly make sense of the data they collect.”

    Moore’s Law at Work in Data Logging: “With the digital world we live in becoming more complex, we are demanding more from the systems recording the physical and electrical phenomena of today and tomorrow.”

    Emerging Bus Technologies: “New bus technologies are poised to evolve data acquisition systems and address the challenges of future measurement applications.”

    Reply
  22. Tomi Engdahl says:

    ‘Black Silicon’ gives CMOS image sensor true nightglow capabilities
    http://www.edn.com/electronics-products/other/4413036/-Black-Silicon–gives-CMOS-image-sensor-true-nightglow-capabilities

    Image sensor developer SiOnyx has demonstrated its XQE family of CMOS image sensors for the first time with sensitivity enhancements as high as 10x for infrared sensing.

    The XQE sensors also deliver true nightglow detection capabilities in extreme, low-light conditions. Infrared (IR) sensitivity is critical in many existing and emerging mass-market applications, including biometrics, eye tracking, natural human interfaces such as gesture recognition, and surveillance. In surveillance, the enhanced IR sensitivity provided by the XQE sensors takes advantage of the naturally occurring IR ‘nightglow’ to enable imaging under conditions that normally require very expensive image-intensified nightvision equipment.

    The ‘Black Silicon’ laser process used by SiOnyx induces structural changes in the CMOS detector materials. This results in increased optoelectronic response in the visible and near infrared (NIR) regions as well as broadening the spectral response of the silicon.

    “XQE sensors provide unparalleled cost-effective, digital nightvision solutions for the fully networked soldier,”

    Reply
  23. Tomi Engdahl says:

    AMD seeks salvation from customized chips

    The chip manufacturer AMD is preparing for the post-PC era with a formal unit that focuses on tailor-made chips. The company has already made a custom chipset for the PlayStation 4 console that uses AMD’s processor and graphics technology.

    “The custom chip unit will build the best products, assisted by an experienced chip architecture team and a broad portfolio that spans mobile, graphics, processor and modem technology,” said AMD director Saeid Moshkelani.

    Source: http://www.tietoviikko.fi/kaikki_uutiset/amd+hakee+pelastusta+raataloidyista+siruista/a898862?s=r&wtm=tietoviikko/-02052013&

    Reply
  24. Tomi Engdahl says:

    New computer chip can test DNA in one hour
    Goal is to provide personalized drug treatments for patients
    http://www.itworld.com/science/351046/new-computer-chip-can-test-dna-one-hour?source=itworldtvthumb

    April 04, 2013, 12:11 PM — In another “the future is now” moment, check out this video news report from DigInfo News that highlights a DNA testing chip that can deliver results in one hour, down from three to four days currently.

    The system, developed by Panasonic and Belgium-based research group IMEC, “is expected to enable personalized, tailor-made therapy to become widespread,” DigInfo News reports.

    Reply
  25. Tomi Engdahl says:

    DNA testing chip delivers results in one hour, paves way for personalized drug treatments
    http://www.diginfo.tv/v/13-0022-r-en.php

    Panasonic, together with the Belgium-based research institution IMEC, has developed a DNA testing chip that automates all stages of obtaining genetic information, including preprocessing.

    “This is the chip we’ve actually developed. As you can see, it’s less than half the size of a business card. It contains everything needed for testing DNA. Once a drop of blood is inserted, the chip completes the entire process, up to SNP detection.”

    SNPs are variations in a single DNA base among individuals.

    Detecting SNPs makes it possible to check whether genetically transmitted diseases are present, evaluate future risks, and identify genes related to illness.

    Testing is done simply by injecting the blood and a chemical into the chip, and setting it in the testing system.

    Reply
  26. Tomi Engdahl says:

    The Intel Opportunity
    http://stratechery.com/2013/the-intel-opportunity/

    A new CEO has taken over Intel. Their core business, upon which the company has been built, is floundering. Does the new CEO, who is not really new at all (he’s the current COO), have the vision to ensure Intel’s continued success?

    Intel originally found success as a memory manufacturer.

    By the 1980s, though, it was the microprocessor business, fueled by the IBM PC, that was driving growth, while the DRAM business was fully commoditized and dominated by Japanese manufacturers.

    By 1986, said high water was rapidly threatening to drag Intel under. In fact, 1986 remains the only year in Intel’s history that they made a loss. Global overcapacity had caused DRAM prices to plummet, and Intel, rapidly becoming one of the smallest players in DRAM, felt the pain severely

    Intel was already the best microprocessor design company in the world. They just needed to accept and embrace their destiny.

    Intel’s Identity Crisis, v2

    Intel reaped the benefit of Grove’s repositioning for 25 years. Their chip designs were the foundation of the PC era, and while they faced nominal competition from AMD, they gained many of the economic benefits of a monopolist. But for a brief spell around the turn of the century, a “good” computer required an Intel chip, and they charged prices befitting their place in the PC value chain.

    Throughout the PC period, Intel invested heavily in their chip design.

    Intel chips have no rival when it comes to PC performance; unfortunately for Intel, PCs are in decline. Mobile devices, such as phones and tablets, are in ascendance, and there Intel’s core strength in all-out performance is a 2nd-order consideration. Power consumption is critical,

    Intel’s identity as a chip designer is increasingly irrelevant.

    Most chip designers are fabless; they create the design, then hand it off to a foundry. AMD, Nvidia, Qualcomm, MediaTek, Apple – none of them own their own factories. This certainly makes sense: manufacturing semiconductors is perhaps the most capital-intensive industry in the world, and AMD, Qualcomm, et al have been happy to focus on higher margin design work.

    Massive demand, limited suppliers, huge barriers to entry. It’s a good time to be a manufacturing company. It is, potentially, a good time to be Intel.

    Reply
  27. Tomi Engdahl says:

    Invisibility cloak hides objects from radio waves
    http://www.edn.com/electronics-blogs/tech-edge/4413109/Invisibility-cloak-hides-objects-from-radio-waves

    Researchers at the University of Texas at Austin have built an “invisibility” cloak that can hide 3D objects from microwaves. Unlike most previous cloaking techniques, which have used bulk metamaterials to bend incoming waves around an object, this latest approach uses a very thin form-fitting “metascreen” – just microns thick – to cancel microwaves scattering off a 3D object, making it invisible to microwave radiation.

    Reply
  28. Tomi Engdahl says:

    Target impedance based solutions for PDN may not provide realistic assessment
    http://www.edn.com/design/test-and-measurement/4413192/Target-impedance-based-solutions-for-PDN-may-not-provide-a-realistic-assessment

    One of the more common design techniques for power distribution networks (PDN) is the determination of the peak impedance that will assure that the voltage excursions on the power rail will be maintained within allowable limits, generally referred to as the target impedance. In theory, this simple evaluation is as basic as Ohm’s law: the allowable target impedance is determined by dividing the allowable voltage excursion by the change in load current. Significant benefits of this method are that it: 1) allows measurements to be made in the frequency domain, 2) is simple, 3) is inexpensive to perform, and 4) is wideband, allowing measurements to be made easily into the GHz range. A drawback of the frequency-domain measurement is that the results are small signal, and they might not represent non-linear controls or boundary conditions that result in large-signal rather than small-signal solutions.

    Measuring in the time domain offers a large signal measurement solution; however, the method is much more difficult because the ability to control very high-speed current steps is very difficult and might not be possible. This article focuses on the fundamental flaws of using target impedance as an assessment method using simple, lumped element models and simulations to highlight some of the key issues. A high-performance optimization simulator is used to determine the best- and worst-case voltage excursions for a given tolerance.
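
    The Ohm’s-law estimate the article starts from is easy to show with numbers. The rail figures below are illustrative, my own rather than the article’s:

    #include <stdio.h>

    int main(void)
    {
        const double vdd        = 1.0;   /* rail voltage, V */
        const double ripple_pct = 0.03;  /* allowed excursion: 3% of rail */
        const double di         = 10.0;  /* worst-case load current step, A */

        /* Z_target = allowable voltage excursion / load current change */
        double z_target = (vdd * ripple_pct) / di;
        printf("target impedance: %.1f milliohms\n", z_target * 1e3);
        return 0;  /* 1.0 V * 3% / 10 A = 3 milliohms */
    }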

    Reply
  29. Tomi Engdahl says:

    IR drop: The ‘gift’ that keeps on taking
    http://www.edn.com/electronics-blogs/power-points/4413178/IR-drop–The–gift–that-keeps-on-taking

    There’s no getting away from it: one of the oldest manifestations of Ohm’s Law still affects us daily. There’s voltage loss (drop) when current flows through resistance, your basic V = IR.

    This phenomenon is so pervasive that I am still amazed that an internal-combustion car can even start

    Yes, there are lots of low-power battery-powered devices in use (you can make your own list), but high-current devices are also part of our world. You’ve got home systems which are drawing tens of amps, all the way to servers with 100 to 200 amps/board and kilowatts per rack. That means you are fighting both IR drop and subsequent thermal dissipation.

    The solution in many cases, of course, is to trade current for voltage.

    I see many otherwise good designs which overlook this consideration. The schematic says “here’s your power” but the physical cabling say “not quite.” It’s easy to inadvertently overlook a fundamental factor such as IR drop when you are worried about hardware issues such as nanosecond signal timing, signal integrity, EMI/RFI, PC board layout, drive currents, I/O, switching, and the many other issues which affect product design.
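
    A quick V = IR check with assumed numbers shows why this bites at server-class currents: even one milliohm of cabling and connector resistance costs real voltage and real heat.

    #include <stdio.h>

    int main(void)
    {
        const double current    = 150.0;  /* A per board, as quoted above */
        const double resistance = 0.001;  /* 1 milliohm of distribution path */

        double drop = current * resistance;            /* V = I*R */
        double heat = current * current * resistance;  /* P = I^2*R */
        printf("IR drop: %.2f V, dissipation: %.1f W\n", drop, heat);
        return 0;  /* 0.15 V lost and 22.5 W of heat in one milliohm */
    }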

    Reply
  30. Tomi says:

    Engineers create semiconductor using kitchen microwave
    Material can be used for photovoltaics, sensors, and heat re-use
    http://www.electronicproducts.com/Discrete_Semiconductors/Power_Semiconductors/Engineers_create_semiconductor_using_kitchen_microwave.aspx

    Your microwave at home might be able to deliver a batch of popcorn in less than five minutes, but did you know that it’s also capable of cooking up a plate full of semiconductors?

    Sarswat says that compared with photovoltaic semiconductors, which use highly toxic materials cadmium and arsenic, ingredients for the photovoltaic material they cooked up in a kitchen microwave “are more environmentally friendly.” This includes the use of different “precursor” chemicals (acetate salts instead of chloride salts) and a different solvent (oleylamine instead of ethylene glycol).

    What’s more, the use of a kitchen microwave to create this material will help expand the number of applications to which the material can be applied. “We hope in the next five years there will be some commercial products from this, and we are continuing to pursue applications and improvements,” says Free. “It’s a good market, but we don’t know exactly where the market will go.”

    The duo’s microwave-ready CZTS material can be used in other applications, including the thermoelectric conversion of heat into electricity, biosensors, circuit components, and solar energy to break down water to produce hydrogen for fuel cells.

    They say that the use of microwaves for the purpose of processing materials is not only fast, it often suppresses unwanted chemical “side reactions.”

    Reply
  32. Tomi Engdahl says:

    SAW, BAW and the future of wireless
    http://www.edn.com/design/wireless-networking/4413442/SAW–BAW-and-the-future-of-wireless

    RF interference has always been an inhibitor of communications, requiring designers to perform major feats to keep it in check. Today’s wireless devices must not only reject signals from other services but from themselves, too, as the number of bands packed inside each device increases.

    A high-end smartphone must filter the transmit and receive paths for 2G, 3G, and 4G wireless access methods in up to 15 bands, as well as Wi-Fi, Bluetooth and the receive path of GPS receivers. Signals in the receive paths must be isolated from one another. They also must reject other extraneous signals whose causes are too diverse to list. To do so, a multi-band smartphone will require eight or nine filters and eight duplexers. Without acoustic filter technology, it would be impossible.

    Surface acoustic wave (SAW) filters are used widely in 2G receiver front ends and in duplexers and receive filters. SAW filters combine low insertion loss with good rejection, can achieve broad bandwidths and are a tiny fraction of the size of traditional cavity and even ceramic filters. Because SAW filters are fabricated on wafers, they can be created in large volumes at low cost.

    SAW filters, however, have limitations. Above about 1 GHz, their selectivity declines, and at about 2.5 GHz their use is limited to applications that have modest performance requirements. SAW devices also are notoriously sensitive to temperature changes

    An alternative approach is to use temperature-compensated (TC-SAW) filters, which include overcoating of the IDT structures with layers that increase stiffness at higher temperatures.

    While SAW and TC-SAW filters are well suited for up to about 1.5 GHz, BAW filters deliver compelling performance advantages above this frequency

    BAW filter size also decreases with higher frequencies, which makes them ideal for the most demanding 3G and 4G applications. In addition, BAW design is far less sensitive to temperature variation even at broad bandwidths, while delivering very low loss and very steep filter skirts.

    Unlike SAW filters, the acoustic wave in a BAW filter propagates vertically

    Reply
  33. Tomi Engdahl says:

    Resolving design chain vs supply chain conflicts
    http://www.edn.com/design/components-and-packaging/4413407/Resolving-design-chain-vs-supply-chain-conflicts

    The modern supply chain could benefit from a 3-D simulation tool. The process is no longer linear: A better model might be the known galaxy, with a bunch of suppliers (planets) circling around a sun (the customer).

    EBN and Velocity have been examining steps and strategies to build an effective supply chain. “Effective” means different things to different constituents: For engineers, it’s a wide choice of products, technical support, and fast prototyping; for buyers, it’s quality, price, and dependability; for customers, it’s all of these, plus aftermarket services. Each partner has a specific role to play and a wish list on how things could work better.

    Design engineers traditionally haven’t been a part of the classic supply chain. The process used to begin when a design was done and purchasing developed the BOM. But engineering responsibilities have expanded over the years. Designers now have to consider the price of the components they’re using, the life cycle of these parts, compliance with environmental mandates such as RoHS, and a host of other issues. Engineers, like everybody else, are being asked to do more with less at a faster pace than ever. So what would make a designer’s job more efficient?

    A study conducted by Technology Forecasters Inc found that engineers visited 25 or more websites before even starting a design. Engineers are looking for new parts, the price and availability of these parts, potential end-of-life (EOL) issues, and environmental sustainability.

    If engineers were designing for their supply chain, they’d pick a widely available, low-priced part that could be sourced anywhere in the world. It wouldn’t be a proprietary part, and it wouldn’t have a lead time. This part would be in no danger of going EOL, and it would be automatically compliant with global environmental regulations. Somehow, though, the engineer would source this component only through a chosen supplier or a chosen distributor and not through a competitor. And even though the part is low-priced, there would be a healthy profit margin built into the sale.

    But engineers don’t design for the supply chain; they design for their customers. The needs of the supply chain and the design engineer aren’t perfectly matched, but they can be better aligned.

    Reply
  34. Tomi Engdahl says:

    The Future of Gaming — It May All Be in Your Head
    http://singularityhub.com/2013/05/12/the-future-of-gaming-it-may-all-be-in-your-head/

    Gaming as a hobby evokes images of lethargic teenagers huddled over their controllers, submerged in their couch surrounded by candy bar wrappers. This image should soon hit the reset button since a more exciting version of gaming is coming. It’s called neurogaming, and it’s riding on the heels of some exponential technologies that are converging on each other. Many of these were on display recently in San Francisco at the NeuroGaming Conference and Expo; a first-of-its-kind conference whose existence alone signals an inflection point in the industry.

    Conference founder, Zack Lynch, summarized neurogaming to those of us in attendance as the interface, “where the mind and body meet to play games.”

    Driven by explosive growth in computer processing, affordable sensors, and new haptic sensation technology, neurogame designers have entirely new toolkits to craft an immersive experience that simulates our waking life. Lucid journeys into the dreamscapes depicted in films like Inception may soon become possible.

    Recently developed platforms like Xbox Kinect and Nintendo Wii don’t require the motor skill to use complex gamepads

    The next step for game designers is to introduce psycho-emotional inputs measuring anything from heart rate, facial analysis, voice measurement, skin conductance, eye tracking, pupil dilation, brain activity, and your ever-changing emotional profile. These games will know the user at a subconscious level and deliver an experience that could forever blur the line between virtual and reality.

    The future of neurogaming depends heavily on continued development of reliable augmented and virtual reality technologies. Chatter about Google Glass was everywhere, and I especially enjoyed sampling the Oculus Rift, a crowd favorite.

    Reply
  35. Tomi Engdahl says:

    Power electronics in renewable energy
    Key power components in solar inverter systems
    http://www.electronicproducts.com/Power_Products/Invertors/Power_electronics_in_renewable_energy.aspx

    Reply
  36. Tomi Engdahl says:

    Power control, cheap and good
    http://www.edn.com/electronics-blogs/power-points/4413682/Power-control–cheap-and-good

    The apparent simplicity of the function of this unit is deceptive, since it has to control a resistive heating element at a level of around 20/40/60 W from the AC line. As such, it must not only meet functional requirements, but also regulatory and safety codes for line-powered products that will take some amount of abuse even in normal use – that heater pad itself endures a lot of bending, folding, crushing, and more. Further, as a two-wire unit with no AC-line ground connection, there is an additional safety challenge in terms of user isolation and insulation.
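
    The column doesn’t say how the controller meters out those power levels, but a common trick for resistive loads on the AC line is burst-fire control: switch whole AC cycles on or off so the average power is a fraction of full power. The sketch below assumes that approach, with illustrative numbers:

    #include <stdio.h>

    int main(void)
    {
        const double full_power    = 60.0;  /* element at 100% duty, W */
        const int    window_cycles = 30;    /* AC cycles per control window */
        const double settings[]    = { 20.0, 40.0, 60.0 };

        for (int i = 0; i < 3; i++) {
            /* duty ratio rounded to whole cycles in the window */
            int on = (int)(settings[i] / full_power * window_cycles + 0.5);
            printf("%2.0f W -> %d of %d cycles on\n",
                   settings[i], on, window_cycles);
        }
        return 0;
    }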

    And here’s the impressive part: you can get all this for under $20 at a local store. When you think about it, that means the price to the store is likely somewhere between $10 and $15, so the BOM and manufacturing costs must be even lower, probably in the $5 to $10 range. Yet that modest cost has to cover the loaded PC board including a few MOSFETs and a basic microcontroller, the plastic enclosure, an AC cord and the wires between the controller and the heating pad, and the heating pad unit itself. That’s a lot of stuff for short money.

    I have conflicted feelings about this situation. On one side, I am proud and impressed with what the aggregation of so many engineering disciplines

    We have seen this over and over as we produce amazing, technology-packed devices, which took years of work, struggle, and uncountable costs, and which sell for a few dollars – yet people still moan, “it costs too much, doesn’t do enough, etc.”

    Have you ever taken a standard, low-cost consumer device and adapted it for your project, to save development time or cost?

  37. Tomi Engdahl says:

    Digital isolators helping to relieve the transformer blues
    http://www.edn.com/electronics-blogs/isolate-me-/4414414/Digital-isolators-helping-to-relieve-the-transformer-blues

    Every time a communications bus or control lines need to be isolated, the power line gets cut at the isolation barrier. So inevitably a designer winds up dealing with isolated power supplies. Fortunately there are a lot of options to get power across an isolation barrier.

    The little transformer that worked so well with the timer can triple in size in all directions, once 5kV transient requirements and double insulation are dropped on it. Now it is probably the largest and tallest component on the board, and the manufacturer wants a lot for it.

    An option that can save a lot of design time and certification time comes from digital isolator technology. For example, chip scale digital isolators now exist to meet the requirements for reinforced isolation and have 5kV withstand voltage.

    The same technology that is used for data isolation has also been applied to chip scale DC-DC converters with isoPower.

  38. Tomi Engdahl says:

    IP no longer the Wild West
    http://www.edn.com/electronics-blogs/practical-chip-design/4414115/IP-no-longer-the-Wild-West

    It is an enticing story. Why do a design for a single chip when you could do the design once and then sell that same design over and over again to everyone who needs it? This is a classic case of not just doing a job for which you get a salary, but building an asset that pays you many times over. At least this was the initial thought of one company that saw the opportunity in IP in the early days. It wanted to do the design as a service for one company and then retain the rights so that it could resell the design. Needless to say, few companies felt happy about that arrangement.

    The IP business has matured a lot since those early wild west days, or has it?

  39. Tomi Engdahl says:

    -40°C to +85°C or 25°C only. What temp range is this part truly guaranteed over?
    http://www.edn.com/electronics-blogs/analog-ic-startup/4414271/-40-C-to–85-C-or-25-C-only–What-temp-range-is-this-part-truly-guaranteed-over-

    A growing trend in the analog IC industry is low-end suppliers playing the game of “guaranteeing” their parts over the extended temp range of -40°C to +85°C. Great, no problem, the part is guaranteed. It’s only when you look at the fine print that you see the problem:

    “Electrical Characteristics: Unless otherwise indicated, VDD = +1.4V to +5.5V, VSS = GND, TA = 25°C”

    Oops, so much for the guarantee.

    How can you “guarantee” your part over temperature when you only guarantee the specs at room temperature? It makes no sense to me, but the schlock houses do this with regularity.

  40. Tomi Engdahl says:

    LabView FPGA becomes tool for system configuration
    http://www.edn.com/electronics-blogs/fpga-gurus/4414577/LabView-FPGA-becomes-tool-for-system-configuration

    More often than not, programmers within National Instruments Corp tend to be the first people to tout the capabilities of LabView FPGA for configuring board-level or system-level products for vertical markets. But recently, OEMs themselves are stepping up to sing the praises of LabView as an alternative to HDL optimization of FPGA logic.

    “It’s not as though LabView will always replace HDLs in defining all FPGA products,” Stratoudakis said. “But those of us who have worked with LabView for several years have learned how flexible the tool is, particularly the extensions developed specifically for the FPGA environment.”

    Stratoudakis is looking at products that can be defined with ease in vertical realms such as communications and military electronics. When a system product can be defined in a constrained environment like an NI RIO card, LabView can be sufficient in and of itself. However, Stratoudakis emphasized that when each FPGA device in a design must be individually optimized, it probably makes sense to learn VHDL or another hardware description language.

    LabView customers have brought up some VHDL features not yet supported within LabView, such as support for arrays or constants. For now, however, few of the NI developers of LabView or its external users promote LabView as a replacement for VHDL. Rather, it is seen as a high-level tool that eases the rapid configuration of FPGAs within a larger system.

  41. Tomi Engdahl says:

    How to validate and analyze complex serial-bus-link models
    http://www.edn.com/design/test-and-measurement/4414476/How-to-validate-and-analyze-complex-serial-bus-link-models-

    Faster signaling speeds and shrinking geometries have created the need for robust serial data link applications to support modeling, measurement, and simulation of live waveforms on a real-time oscilloscope. Designs are evolving to address these challenges with advanced equalization techniques at the transmitter and receiver. Smaller form factors make signal access more difficult, resulting in non-ideal probing points. This can lead to loss and reflections on the acquired signal due to impedance discontinuities that would not be present at the ideal measurement location.

    Serial-data-link analysis applications allow the user to load circuit models for the measurement circuit, which includes the test and measurement fixtures and instruments used to acquire waveforms from the DUT. This allows the loss and reflections caused by fixtures and test equipment (such as the probe and scope) to be de-embedded from the acquired waveforms. De-embedding these effects improves the accuracy of measurements and can make the difference between passing and failing a test. In addition, link analysis applications allow the user to define the simulation circuit by loading channel models for the serial-data-link system in order to evaluate performance without the need for actual link hardware to be present.
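
    At its core, de-embedding amounts to removing the fixture’s modeled or measured frequency response from the acquired signal. The sketch below reduces the fixture to a single complex transfer function per frequency bin, which is a deliberate simplification (real link-analysis tools work with full S-parameter models), and all values are made up:

        /* Minimal frequency-domain de-embedding sketch: divide the measured
         * spectrum by the fixture/probe response, bin by bin.  Illustrative
         * simplification of what S-parameter-based tools do. */
        #include <complex.h>
        #include <stdio.h>

        #define NBINS 4

        int main(void)
        {
            /* Measured spectrum at the (non-ideal) probe point, made-up values. */
            double complex meas[NBINS]  = { 1.0, 0.8 - 0.2*I, 0.5 - 0.4*I, 0.2 - 0.3*I };
            /* Fixture + probe transfer function from a model or calibration. */
            double complex h_fix[NBINS] = { 1.0, 0.9 - 0.1*I, 0.7 - 0.2*I, 0.5 - 0.2*I };
            double complex dut[NBINS];

            for (int k = 0; k < NBINS; k++)
                dut[k] = meas[k] / h_fix[k];   /* remove fixture loss/reflection */

            for (int k = 0; k < NBINS; k++)
                printf("bin %d: %.3f %+.3fj\n", k, creal(dut[k]), cimag(dut[k]));
            return 0;
        }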

  42. Tomi Engdahl says:

    Paul Otellini’s Intel: Can the Company That Built the Future Survive It?
    http://www.theatlantic.com/technology/archive/2013/05/intel-may-have-lost-the-iphone-battle-but-it-could-still-win-the-mobile-war/275825/

    As the CEO steps down, he leaves the Intel machine poised to take on the swarming ecosystem of competitors who make smartphone chips.

    Forty-five years after Intel was founded by Silicon Valley legends Gordon Moore and Bob Noyce, it is the world’s leading semiconductor company. While almost every similar company — and there used to be many — has disappeared or withered away, Intel has thrived through the rise of Microsoft, the Internet boom and the Internet bust, the resurgence of Apple, the laptop explosion that eroded the desktop market, and the wholesale restructuring of the semiconductor industry.

    Under Otellini’s watch since 2005, Intel created the world’s best chips for laptops, assumed a dominant position in the server market, vanquished long-time rival AMD, retained a vertically integrated business model that’s unique in the industry, and maintained profitability throughout the global economic meltdown. The company he ran was far larger, more complex, and more global than anything Bob Noyce and Gordon Moore could have imagined when they founded it in 1968.

    In the last full year before he ascended to chief executive, Intel generated $34 billion in sales. By 2012, that number had grown to $53 billion.

    “By all accounts, the company has been incredibly successful during his tenure on the things that made them Intel.”

    Even Otellini’s natural rival, former AMD CEO Hector Ruiz, had to agree that Intel’s CEO “was more successful than people give him credit for.”

    But Paul Otellini’s Intel spent $19.5 billion on R&D during 2011 and 2012. That’s $8 billion more than Google. And a substantial amount of Intel’s innovation comes from its manufacturing operations, and Intel spent another $20 billion building factories during the last two years. That’s nearly $40 billion dedicated to bringing new products into being in just two years! These investments have continued because of Otellini’s unshakeable faith that eventually, as he told me, “At the end of the day, the best transistors win, no matter what you’re building, a server or a phone.” That’s always the strategy. That’s always the solution.

    Despite the $53 billion in revenue and all the company’s technical and business successes, the question on many a commentator’s mind is, Can Intel thrive in the tablet and smartphone world the way it did during the standard PC era?

    The industry changes ushered in by the surge in these flat-glass computing devices can be seen two ways. Intel’s James prefers to see the continuities with Intel’s existing business. “Everyone wants the tablet to be some mysterious thing that’s killing the PC. What do you think the tablet really is? A PC,” she said. “A PC by any other name is still a personal computer. If it does general purpose computing with multiple applications, it’s a PC.” Sure, she admitted, tablets are a “form factor and user modality change,” but tablets are still “a general purpose computer.”

  43. Tomi Engdahl says:

    Intel could have been inside the original iPhone, says outgoing CEO
    http://www.theverge.com/2013/5/16/4337954/intel-could-have-been-inside-the-original-iphone-says-outgoing-ceo

    Intel CEO Paul Otellini is stepping down today, and The Atlantic has published a lengthy profile of the outgoing CEO. While the article mostly argues that Intel thrived under Otellini’s watch, it also reveals what could have been: Otellini told the publication that he personally shot down a chance to put Intel processors in the original Apple iPhone.

    “We ended up not winning it or passing on it, depending on how you want to view it. And the world would have been a lot different if we’d done it,”

    Intel claimed that Apple was slated to use the company’s upcoming “Silverthorne” chip — which became the Intel Atom. And while Intel admitted in 2008 that Atom couldn’t yet compete on battery life with the ARM-based chips that Apple did indeed put in the iPhone and iPad, it’s also possible that Intel would simply have built ARM chips on Apple’s behalf. That’s what Samsung does for Apple, and as recently as last month rumors have suggested that Intel might still offer foundry services to produce the next Apple system-on-chip.

  44. Tomi Engdahl says:

    New CEO vows Intel will be more responsive in mobile push
    http://www.reuters.com/article/2013/05/16/us-intel-krzanich-idUSBRE94F0YC20130516

    Intel Corp’s new CEO Brian Krzanich said on Thursday that under his leadership the top chipmaker will be more responsive to customers in an intensified focus on the fast-growing smartphone and tablet market where it lags its rivals.

  45. Tomi Engdahl says:

    Today’s LEDs—What’s responsible for the improvements?
    http://www.edn.com/design/led/4414325/Today-s-LEDs-What-s-responsible-for-the-improvements-

    Traditionally LEDs were viewed as an excellent lighting alternative offering significant energy savings, albeit with inferior visual performance compared to some lighting options.

    Over the past few years, the energy savings LEDs provide have continued to grow at an impressive pace. In fact today’s LEDs are more than twice as efficient as LEDs from just five years ago, offering 25-30% energy savings compared to CCFLs and up to 80% savings compared to incandescent bulbs. These eye-catching energy savings have been accompanied by significant space savings and enhanced visual performance.
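
    To put those percentages in concrete terms, here is a back-of-envelope calculation with made-up usage numbers (a 60 W incandescent bulb, three hours of use per day):

        /* What an 80% saving means for one bulb, using made-up wattages
         * and usage hours. */
        #include <stdio.h>

        int main(void)
        {
            double incandescent_w = 60.0;
            double led_w = incandescent_w * (1.0 - 0.80);   /* 80% savings -> 12 W */
            double hours_per_year = 3.0 * 365.0;            /* 3 h per day         */
            double kwh_saved = (incandescent_w - led_w) * hours_per_year / 1000.0;

            printf("LED draw: %.0f W, energy saved: %.0f kWh/year\n", led_w, kwh_saved);
            return 0;   /* prints roughly 12 W and 53 kWh/year */
        }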

    Key developments in LED manufacturing, specifically enhanced equipment, improved processes, and superior materials, have allowed the latest generation of LEDs to provide a powerful combination of excellent light output, visual performance, and space savings. Let’s look at each development more closely.

  46. Tomi Engdahl says:

    10 tips for maximizing battery life
    http://www.edn.com/electronics-blogs/embedded-basics/4413562/10-tips-for-maximizing-battery-life

    Portable, battery-powered devices are sweeping through society like wildfire. Mobile computing and sensor devices are springing up everywhere, providing engineers with a plethora of data as well as applications. Requirements often dictate constraints on size and weight that limit how much capacity the battery can carry. The number of features on a device, combined with the required time between charges, can make the requirements very challenging or nearly impossible to meet. Selecting a low-power microcontroller is an obvious first step, but there are a number of software and hardware tips that can be followed to ensure that every last milliamp-hour of charge is put to good use.

    Early in the design cycle it is highly recommended that a battery budget be put together.

    If more battery capacity is needed than is available, please don’t just push forward with the project! Make the necessary changes up front to spare weeks or months of heartache down the road!
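
    Putting numbers on that advice: a battery budget boils down to a weighted average of the current drawn in each operating mode, compared against the cell’s capacity. A minimal sketch, with made-up mode currents, duty fractions, and cell capacity:

        /* Back-of-envelope battery budget: weighted-average current per mode
         * versus cell capacity.  All numbers below are made-up examples. */
        #include <stdio.h>

        int main(void)
        {
            /* Mode currents (mA) and the fraction of time spent in each mode. */
            double i_active = 12.0,  f_active = 0.05;   /* MCU + radio running */
            double i_idle   = 1.5,   f_idle   = 0.15;   /* peripherals clocked */
            double i_sleep  = 0.005, f_sleep  = 0.80;   /* deep sleep          */
            double capacity_mah = 220.0;                /* e.g. a CR2032 cell  */

            double i_avg = i_active * f_active + i_idle * f_idle + i_sleep * f_sleep;
            double hours = capacity_mah / i_avg;

            printf("average draw: %.3f mA -> runtime: %.0f h (%.1f days)\n",
                   i_avg, hours, hours / 24.0);
            return 0;   /* ~0.83 mA average, roughly 11 days on this budget */
        }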

    It is easy to overlook what should be done with an input/output pin that is not being used. This oversight, however, can be the difference between a marketable product and an expensive paperweight. Each microcontroller has different recommendations for unused pins, and close examination of the datasheet will reveal what should be done. For example, one unnamed silicon vendor’s datasheet recommends that any unused I/O be set as an output and driven low. The purpose is to minimize leakage and quiescent currents. While these currents are tiny, each unused pin adds to the loss, and over the course of a day they can cost a substantial amount of battery life.
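
    As a sketch of that tip, with made-up register addresses standing in for whatever the vendor’s datasheet actually specifies:

        /* Configure unused pins as outputs driven low, per the tip above.
         * DIR_REG/OUT_REG addresses are hypothetical placeholders; always
         * follow the specific vendor's datasheet recommendation. */
        #include <stdint.h>

        #define DIR_REG (*(volatile uint32_t *)0x40004400u)  /* made-up address */
        #define OUT_REG (*(volatile uint32_t *)0x40004404u)  /* made-up address */

        void configure_unused_pins(uint32_t unused_mask)
        {
            OUT_REG &= ~unused_mask;   /* drive low first to avoid glitches */
            DIR_REG |=  unused_mask;   /* then switch the pins to output    */
        }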

    If there is an unused peripheral such as an analog-to-digital converter or a pulse-width modulator, turn it off to save power! Peripherals can be real power hogs!
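
    The same pattern applies here: most MCUs gate power to each peripheral through a clock-enable register. The register and bit names below are made up, but the idea, clear the enable bit and the peripheral stops drawing dynamic power, is common across vendors:

        /* Gate the clocks of peripherals that are not in use.  Register and
         * bit names are hypothetical placeholders for a real part's map. */
        #include <stdint.h>

        #define CLK_EN_REG (*(volatile uint32_t *)0x40010000u) /* made-up address */
        #define CLK_EN_ADC (1u << 4)
        #define CLK_EN_PWM (1u << 7)

        void disable_unused_peripherals(void)
        {
            CLK_EN_REG &= ~(CLK_EN_ADC | CLK_EN_PWM);  /* ADC and PWM off */
        }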

  47. Tomi Engdahl says:

    Teen’s invention could charge your phone in 20 seconds
    http://www.nbcnews.com/technology/teens-invention-could-charge-your-phone-20-seconds-1C9977955

    Waiting hours for a cellphone to charge may become a thing of the past, thanks to an 18-year-old high-school student’s invention. She won a $50,000 prize Friday at an international science fair for creating an energy storage device that can be fully juiced in 20 to 30 seconds.

    The fast-charging device is a so-called supercapacitor. It can last for 10,000 charge-recharge cycles.

    “My cellphone battery always dies,” she told NBC News when asked what inspired her to work on the energy-storage technology. Supercapacitors also allowed her to focus on her interest in nanochemistry — “really working at the nanoscale to make significant advances in many different fields.”

    “It is also flexible”

    Khare’s invention won her the Intel Foundation Young Scientist Award at the Intel International Science and Engineering Fair.

  48. Tomi Engdahl says:

    Maxim Integrated 30th anniversary
    http://www.edn.com/design/analog/4414648/Maxim-Integrated-30th-anniversary

    In 1983, Jack Gifford and other industry experts, with varied experience in semiconductor design and sales, founded Maxim.

    One of Maxim’s innovative achievements was the development of the MAX232, a single 5V-powered RS-232 line driver and receiver, quite possibly the “founding father” of analog system-level integration, according to Len Sherman, senior scientist at Maxim.

  49. Tomi Engdahl says:

    Design for manufacturing and yield
    http://www.edn.com/design/integrated-circuit-design/4412868/Design-for-manufacturing-and-yield

    As designs move to the 28-nm and smaller nodes, the likelihood of a design being manufactured without defects trends toward zero unless a rapidly growing set of rules is adhered to. Those rules are increasing in number and complexity. The absence of extreme ultraviolet light sources means that double patterning has become essential and new devices, such as 3-D transistors, are being adopted. But it does not stop with just manufacturability. Lithographic features affect functionality and performance in such a way that yield has also become a primary concern.

    Here we examine the problems and the ways in which increasingly sophisticated software can be used to overcome the limitations of technology. Representatives from five different companies discuss, from their own perspectives, these issues as well as solutions.

    Major contributors to yield loss are:

    - Geometric variations during the manufacturing process may result in performance variations that push the device outside the allowed 3-sigma window, causing parametric yield loss (a worked sketch follows this list).
    - Specific patterns on the die may not be manufactured as intended because of diffraction during the lithography process, causing catastrophic failures on the dies.
    - Random defects may induce shorts or opens on the wafer, resulting in yield loss.
    - The wafer goes through chemical mechanical polishing (CMP) after every interconnect and dielectric layer is deposited. Metal-density variations cause thickness variations during the CMP process, which can accumulate and alter interconnect parasitics, causing yield loss.
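
    To make the 3-sigma point concrete: for a normally distributed parameter, parametric yield is simply the probability mass inside the spec window. A minimal sketch with made-up numbers; real DFY flows use full statistical transistor models and Monte Carlo, not a single Gaussian:

        /* Parametric yield for a normally distributed parameter that must
         * stay inside its spec limits.  Illustrative only. */
        #include <math.h>
        #include <stdio.h>

        /* Standard normal CDF via erf(). */
        static double phi(double z) { return 0.5 * (1.0 + erf(z / sqrt(2.0))); }

        int main(void)
        {
            double mu = 1.00, sigma = 0.03;   /* e.g. normalized path delay */
            double lo = 0.91, hi = 1.09;      /* spec window, here +/- 3 sigma */

            double yield = phi((hi - mu) / sigma) - phi((lo - mu) / sigma);
            printf("parametric yield: %.4f%%\n", 100.0 * yield);  /* ~99.73% */
            return 0;
        }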

    Engineers can take precautions during the design process that would help reduce these effects. Logic designers can add either redundant logic or memory cells. These can be used to repair faults, which will increase yield even though the die is partly defective. There are tools and techniques used in the diagnosis of failures seen on silicon to ascertain the cause of the failures. This information can be used to correct layouts to improve the yield.

    Semiconductor packaging is evolving from an enabling role to a differentiator in today’s electronic devices.

    Predictions from package modeling can be broadly categorized into performance and reliability.

    Manufacturing improvements via novel materials, processes, and new technologies aren’t keeping up with the market demand for ever-shrinking feature dimensions, increasing performance, and low-power requirements. Software is now, and will remain, the new key enabler, as long as there’s a growing gap between design and manufacturing.

    At 28 nm, the impact of manufacturing variability on performance, power consumption, and yield has become disproportionately larger and more complex. Software analysis is critical for effectively quantifying and mitigating the impact on both the physical integrity and parametric performance of the designs.

    During the transition to the 28-nm node, several leading semiconductor companies struggled with supply: They couldn’t ship enough of their products. Part of the problem was lower-than-expected yield. This situation illustrates how traditional yield learning methods are running out of steam, largely because of the dramatic increase in the number and complexity of design-sensitive defects and longer failure analysis cycle times. These factors have forced fabless semiconductor companies to arm themselves with new technologies such as diagnosis-driven yield analysis (DDYA), which can rapidly identify the root cause of yield loss and effectively separate design- and process-oriented yield loss.

    Process variations, especially local random variations, are making DFY a must-have methodology for sub-65-nm design. A DFY methodology comprises three critical components: statistical transistor model extraction, yield prediction and analysis, and a powerful statistical simulation engine. An integrated solution with all three components provides added efficiency and consistency.

    Designers have had to choose between two types of circuit simulators: the high accuracy and good usability of SPICE, at the cost of performance and capacity, or the high performance and capacity of FastSPICE, with poor accuracy and usability. Neither is suitable as a DFY simulation engine.

    The ideal solution would be to extend the capabilities of SPICE to cover the performance and capacity provided with FastSPICE. Parallel SPICE simulators have taken over some of the space covered by FastSPICE with 10× or more speedup over traditional SPICE.

  50. Tomi Engdahl says:

    AMD slips to fourth spot as Qualcomm and Samsung capitalize on mobile processor sales
    http://www.theverge.com/2013/5/21/4350966/qualcomm-samsung-beat-amd-processor-sales

    A fall in demand for desktop PCs and continued growth of the smartphone and tablet market saw AMD fall from second to fourth place for microprocessor sales in 2012. According to a new report from IC Insights, Qualcomm and Samsung overtook AMD to reach second and third spot respectively after they both posted year-on-year growth, thanks to increased sales of their ARM-based mobile processors. Intel continued to dominate the market — despite seeing a 1 percent decline last year.

    Intel and AMD’s declines mirror the rapid fall in demand for legacy PCs and hardware — in the first quarter of 2013 alone, PC sales saw “the steepest decline ever in a single quarter.” In an attempt to overcome this trend, AMD has said it will include ARM processors inside its x86 Accelerated Processing Units (APUs) later this year.

