Electronics trends for 2013

The electronics industry will hopefully start to grow again after a not-so-good 2012. It’s safe to say that 2012 was a wild ride for all of us. The global semiconductor industry demonstrated impressive resilience in 2012, despite operating in a challenging global macroeconomic environment. Many have already ratcheted back their expectations for 2013. Beyond 2012, the industry is expected to grow steadily and moderately across all regions, according to the WSTS forecast. So we should see moderate growth in 2013 and 2014. I hope this happens.

The non-volatile memory market is growing rapidly. The Underlying technologies for non-volatile memories article tells that non-volatile memory applications can be divided into standalone and embedded system solutions. Standalone applications, which tend to be driven primarily by cost, are dominated by NAND flash technology. The embedded market relies mainly on NOR flash for critical applications and NAND for less critical data storage. Planar CT NAND and 3D NAND could become commercially viable this year or within a few years. MRAM, PCRAM, and RRAM will need more time and new material innovations to become major technologies.

Multicore CPU architectures are a little like hybrid vehicles: Once seen as anomalies, both are now encountered on a regular basis and are widely accepted as possible solutions to challenging problems. Multi-core architectures will find their application but likely won’t force the extinction of single-core MCUs anytime soon. Within the embedded community, a few applications now seem to be almost exclusively multicore, but in many others multicore remains rare. There are concerns over the complexity and uncertainty about the benefits.

The FPGAs as the vanishing foundation article tells that we are entering a new environment in which the FPGA has faded into the wallpaper – not because it is obsolete, but because it is both necessary and ubiquitous. After displacing most functions of ASICs, DSPs, and a few varieties of microcontrollers, it’s fair to ask if there is any realm of electronic products where use of the FPGA is not automatically assumed. Chances are, in the next few years, the very term “FPGA” might be replaced by “that ARM-based system on a chip” from Xilinx, Altera, Lattice, or another vendor.

Software and services have become the soul of consumer technology. Hardware has become increasingly commoditized into blank vessels that do little more than hold Facebook and Twitter and the App Store and Android and iOS.

Are products owned when bought? The trend in recent decades has been an increase in the dependence of the buyer on the seller.

More than 5 billion wireless connectivity chips will ship in 2013, according to market research firm ABI Research. This category includes standalone chips for Bluetooth, Wi-Fi, satellite positioning, near-field communications and ZigBee, as well as so-called “combo” chips that combine multiple standards. Broadcom is seen retaining its lead in connectivity chips. Bluetooth Smart, WiGig and NFC are all seeing increased adoption in fitness, automotive and retail applications. Combo chips are also a growing opportunity based on the popularity of smart phones, tablet computers and smart televisions.

Signal integrity issues are on the rise as both design complexity and speed increase all the time. The analog world is moving faster than ever. Learning curves are sharper, design cycles are shorter, and systems are more complex. Add to all this the multidisciplinary, analog/digital nature of today’s designs, and your job just gets more complicated.

The High-speed I/O: On the road to disintegration? article tells that increases in data rates, driven by a need for higher bandwidth (10Gbps, 40Gbps, 100Gbps networking), mean the demands on system-level and chip-to-chip interconnects are increasingly challenging design and manufacturing capabilities. For current and future high-performance systems, high-speed serial interfaces featuring equalization could well be the norm, and high levels of SoC integration may no longer be the best solution.

For a long time, the Consumer Electronics Show, which began in 1967, was the Super Bowl of new technology, but now the consumer electronics show as a concept is changing and maybe fading out in some ways. The social web has replaced the trade show as a platform for showcasing and distributing products, concepts and ideas.

NFC, or near-field communications, has been around for 10 years, battling its own version of the chicken-and-egg question: Which comes first, the enabled devices or the applications? The Near-field communications to go far in 2013 article expects that this is the year for NFC. NFC is going to go down many different paths, not just the mobile wallet.

3-D printing was hot last year and is still hot. We will be seeing much more on this technology in 2013.

Inexpensive tablets and e-readers will find their users. Sub-$100 tablets and e-readers will offer more alternatives to pricey iPads and Kindles. The sub-$200 higher-performance tablet segment is also selling well.

User interfaces will evolve: capacitive sensing is integrating multiple interfaces, human-machine interfaces are entering the third dimension, and ubiquitous sensors are meeting the most natural interface – speech.

The use of electronic systems in the automotive industry is accelerating at a furious pace. The automotive industry in the United States is steadily recovering, and nowadays electronics run pretty much everything in a vehicle. Automotive electronics systems trends impact test and measurement companies. Of course, with new technologies come new challenges: faster transport buses, more wireless applications, higher switching power, and the sheer amount and density of electronics in modern vehicles.

The Next Round: GaN versus Si article tells that wide-bandgap (WBG) power devices have arrived in the form of Gallium Nitride (GaN) and Silicon Carbide (SiC). These devices provide low RDS(on) with higher breakdown voltage.

Energy harvesting was talked about quite a lot in 2012, and I expect it will find more and more applications this year. Four main ambient energy sources are present in our environment: mechanical energy (vibrations, deformations), thermal energy (temperature gradients or variations), radiant energy (sun, infrared, RF) and chemical energy (chemistry, biochemistry). Peel-and-stick solar cells are coming.

Wireless charging of mobile devices is gaining popularity. The Qi wireless charging technology is becoming the industry standard, as Nokia, HTC and some other companies use it. There is a competing A4WP wireless charging standard pushed by Samsung and Qualcomm.

In recent years, ‘Low-carbon Green Growth’ has emerged as a very important issue in selling new products. The LED lighting industry analysis and market forecast article tells that ‘Low-carbon Green Growth’ is a global trend and that LED lighting is becoming the most important axis of the ‘Low-carbon Green Growth’ industry. Expectations for industry productivity and job creation are high.

A record amount of dangerous electrical equipment has been pulled from the market through the Finnish Safety and Chemicals Agency’s oversight. Poor equipment design has been found in many products, especially in LED light bulbs. Almost 260 items were taken off the market, and very many of them were LED lights. Manufacturers rushed into the new technology with high enthusiasm and then forgot basic electrical engineering. CE marking is not in itself a guarantee that a product is safe.

The “higher density,” “higher dynamic” trend is also challenging traditional power distribution technologies within systems, and some new concepts are being explored today. The AC vs. DC power discussion in data centers is going strong. Redundant power supplies are asked for in many demanding applications.

According to IHS, global advanced meter shipments are expected to remain stable from 2012 through 2014. Smart electricity meter penetration is seen doubling by 2016 (to about 35 percent). In the long term, IHS said it anticipates that the global smart meter market will depend on developing economies such as China, Brazil and India. What’s next after the smart power meter? How about some power backup for the home?

The Energy is going digital article claims that graphical system design changes how we manipulate, move, and store energy. What defines the transition from analog to digital, and how can we tell when energy has made the jump? First, the digital control of energy, in the form of electricity, requires smart sensors. Second, digital energy systems must be networked and field reconfigurable to send the data that makes continuous improvements and bug fixes possible. Third, the system must be modeled and simulated with high accuracy and speed. When an analog technology goes digital, it becomes an information technology — a software problem. The digital energy revolution is enabled by powerful software tools.

The cloud is talked about a lot, both as a design tool and as a service that connected devices connect to. The cloud means many things to many people, but irrespective of how you define it, there are opportunities for engineers to innovate. EDA companies put their hope in accelerating embedded design with cloud-enabled development platforms. They say that The Future of Design is Cloudy. M2M companies are competing to develop solutions for easily connecting embedded devices to the cloud.

Trend articles worth checking out:
13 Things That Went Obsolete In 2012
Five Technologies to Watch in 2013
Hot technologies: Looking ahead to 2013
Technology predictions for 2013
Prediction for 2013 – Technology
Slideshow: Top Technologies of 2013
10 hot consumer trends for 2013

Popular designer articles from last year that could give hints about what to expect:
Top 10 Communications Design Articles of 2012
Top 10 smart energy articles of 2012
Slideshow: The Top 10 Industrial Control Articles of 2012
Looking at Developer’s Activities – a 2012 Retrospective

626 Comments

  1. Tomi Engdahl says:

    28-nm SoC development costs doubled over 40-nm
    Software burden means the cost of 28-nm SoC development is nearly double that of the previous node, according to a market research firm.

    Source: http://www.eetimes.com/electronics-news/4417072/28-nm-SoC-development-costs-doubled-over-previous-node

    Reply
  2. Tomi Engdahl says:

    DAQ system offers high-density I/O in 1U rack space
    http://www.edn.com/electronics-products/other/4417009/DAQ-system-offers-high-density-I-O-in-1U-rack-space

    two independent GigE interfaces and four front-loading I/O slots

    Useful for prototyping, evaluation, and hardware-in-the-loop testing, the 1U Flatrack accommodates up to 100 analog inputs, 128 analog outputs, 192 digital I/O bits, 48 ARINC-429 channels, and 32 RS-232/422/485 serial ports.

    Reply
  3. Tomi Engdahl says:

    Wireless modules enable products for the Internet of Things
    http://www.edn.com/electronics-products/other/4417087/Wireless-modules-enable-products-for-the-Internet-of-Things

    NXP Semiconductors has announced a range of small-footprint modules based on the ultra-low-power JN5168 wireless microcontroller. Supporting multiple network stacks including ZigBee Home Automation, ZigBee Light Link, ZigBee Smart Energy, JenNet-IP and RF4CE, the JN5168 wireless modules measure a mere 16 x 21 mm and offer very low transmit and receive power consumption.

    Reply
  4. Tomi Engdahl says:

    Medical technical breakthroughs: Now it’s personal
    http://www.edn.com/electronics-blogs/brians-brain/4416762/Medical-technical-breakthroughs–Now-it-s-personal

    EEGs and MRIs are fairly commonplace nowadays

    But after we discussed the results in the examination room, the neurologist invited me into her office and fired up her computer. I perused its high-resolution LCD while she cycled through several extended series of images, navigating through my brain in various horizontal and vertical directions.

    Here we were, in front of a several year old PC, rapidly pulling up high-resolution images of my head off a server somewhere, images which had been snapped by a piece of high tech gear only a few days before. And it was all as casual and matter-of-fact as if she was showing off photos to me from her latest vacation. Amazing. Absolutely amazing.

    Reply
  5. Tomi Engdahl says:

    Science
    Quantum transistors at room temp
    http://www.theregister.co.uk/2013/06/24/quantum_transistors_at_room_temp/

    The world might still be 20 years from the end of Moore’s Law, but the hunt for technologies to replace semiconductors is going on right now. A group from Michigan Technological University is offering one such alternative: a quantum tunnelling transistor that operates at room temperature.

    The culmination of work begun in 2007, their demonstration has been published in Advanced Materials

    Quantum properties are seen as a promising replacement for semiconductors on both scores: transistors can be built at the single-atom scale, and they don’t have the same heat dissipation issues. However, most quantum effect transistors need to function at cryogenic temperatures.

    That makes room temperature operation an important goal for development – and that’s what the MTU group, led by MTU physicist Yoke Khin Yap, is claiming.

    Their quantum transistor is fabricated by placing gold quantum dots on boron nitride nanotubes. The three-nanometre gold dots were placed using lasers, while the nanotubes both insulate the dots from each other and confine them.

    Reply
  6. Tomi Engdahl says:

    Disruptions: Medicine That Monitors You
    http://bits.blogs.nytimes.com/2013/06/23/disruptions-medicine-that-monitors-you/

    SAN FRANCISCO — They look like normal pills, oblong and a little smaller than a daily vitamin. But if your doctor writes a prescription for these pills in the not-too-distant future, you might hear a new twist on an old cliché: “Take two of these ingestible computers, and they will e-mail me in the morning.”

    As society struggles with the privacy implications of wearable computers like Google Glass, scientists, researchers and some start-ups are already preparing the next, even more intrusive wave of computing: ingestible computers and minuscule sensors stuffed inside pills.

    Although these tiny devices are not yet mainstream, some people on the cutting edge are already swallowing them to monitor a range of health data and wirelessly share this information with a doctor. And there are prototypes of tiny, ingestible devices that can do things like automatically open car doors or fill in passwords.

    “You will — voluntarily, I might add — take a pill, which you think of as a pill but is in fact a microscopic robot, which will monitor your systems” and wirelessly transmit what is happening, Eric E. Schmidt, the executive chairman of Google, said last fall at a company conference. “If it makes the difference between health and death, you’re going to want this thing.”

    Reply
  7. Tomi Engdahl says:

    Quantum-Tunneling Electrons Could Make Semiconductors Obsolete
    http://hardware.slashdot.org/story/13/06/25/0045245/quantum-tunneling-electrons-could-make-semiconductors-obsolete

    “The powerful, reliable combination of transistors and semiconductors in computer processors could give way to systems built on the way electrons misbehave, all of it contained in circuits that warp even the most basic rules of physics. Rather than relying on a predictable flow of electrons that appear to know whether they are particles or waves, the new approach depends on quantum tunneling, in which electrons given the right incentive can travel faster than light, appear to arrive at a new location before having left the old one, and pass straight through barriers that should be able to hold them back.”

    Reply
  8. Tomi Engdahl says:

    Signal distortion from high-K ceramic capacitors
    http://www.edn.com/design/analog/4416466/Signal-distortion-from-high-K-ceramic-capacitors

    Multilayer ceramic capacitors (MLCCs) are used extensively in modern electronics because they offer high volumetric efficiencies and low equivalent series resistances at attractive prices. These advantages make MLCCs nearly ideal for a wide range of applications.

    Unfortunately, these advantages come with a downside: high-K MLCCs exhibit a substantial voltage coefficient, meaning their capacitance varies depending on the applied voltage. In AC applications this phenomenon manifests itself as waveform distortion and can compromise the overall system performance. When printed circuit board (PCB) area and cost are major design constraints, board and system level designers may be tempted to use high-K MLCCs in circuits where they can introduce significant distortion into the signal path.
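
    To picture the effect, here is a toy calculation (my illustration, not from the article) of how a bias-dependent capacitance behaves. The linear derating coefficient is invented for the sketch; real curves are part-specific and come from the capacitor datasheet.

    ```c
    /* Toy model of a high-K MLCC voltage coefficient: capacitance
     * falls as DC bias rises. The -6%/V figure is hypothetical. */
    #include <stdio.h>

    int main(void)
    {
        double c0 = 1.0e-6;  /* nominal 1 uF at 0 V bias */
        double k  = 0.06;    /* hypothetical derating: -6% per volt */
        for (int v = 0; v <= 5; v++)
            printf("%d V bias: %.2f uF\n", v, c0 * (1.0 - k * v) * 1e6);
        /* A capacitance that swings with the signal voltage presents a
         * signal-dependent impedance -- i.e., waveform distortion. */
        return 0;
    }
    ```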

    Reply
  9. Tomi Engdahl says:

    Wireless eval kit kickstarts IoT designs
    http://www.edn.com/electronics-products/other/4417469/Wireless-eval-kits-speed-time-to-market-of-IoT-designs-

    The SimpleLink Wi-Fi CC3000 BoosterPack released by Texas Instruments is designed to help professional engineers, hobbyists and university students kickstart the development of Wi-Fi-enabled Internet of Things (IoT) applications.

    The Wi-Fi BoosterPack is powered by the SimpleLink Wi-Fi CC3000 module from TI, which offers simplified Wi-Fi connectivity for microcontroller (MCU) based systems. It works with both the MSP430 and Tiva C Series MCU LaunchPad evaluation kits.

    Reply
  10. Tomi Engdahl says:

    Student replaces obsolete memories with PIC32 MCU
    http://www.edn.com/electronics-blogs/dev-monkey-blog/4417212/Student-replaces-obsolete-memories-with-PIC32-MCU-

    Jim wanted to rebuild or replicate a small computer called the “Mark-8” that I designed in the early 1970s. He got to the point of creating the memory section and discovered the Intel 1101 memories (256-by-one bit) had gone obsolete many years ago. Obsolete-component companies might have some 1101 ICs, but at prices that put them well outside Jim’s budget. So he asked if I had ideas about how to substitute newer memory ICs.

    I thought about adapting inexpensive standard read-write memories (SRAM) because a few of them would cover the Mark-8’s entire 16 kbyte address space. Unfortunately, the computer design uses split data buses—a bus for data input and a separate bus for data output.

    Current microcontrollers (MCUs) provide many I/O pins and lots of SRAM, so could we substitute one for the entire memory, buffers, and logic? A few assembly-language instructions just might do the job.

    After I looked at specs for MCU boards on hand I settled on the Digilent chipKIT Max32 that gave me a Microchip PIC32MX795F512L with 128 kbytes of SRAM and as many as 83 I/O ports. And many of those I/O pins can accept 5V logic signals. Add a pull-up resistor and configure them with open-drain outputs and they can provide 5V TTL logic signals. So far, so good.

    This memory-substitution worked well as far as we have tested it. I’ll send Jim my Max32 board so he can try it with the Mark-8 circuits he has built. The Digilent chipKIT Max32 board costs $49.50, which seems like a small price to pay when I consider the time needed to hand wire individual memory chips, buffers, and logic, install this circuitry, and test it. Another win for a microcontroller.
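
    A minimal sketch of the memory-emulation idea (not the author’s actual code): the MCU polls the address bus and the write strobe, stores writes into its internal SRAM, and drives the separate output data bus on reads. The bus-access helpers are hypothetical stand-ins; a real PIC32 implementation would use the port registers (TRISx/PORTx/LATx) directly.

    ```c
    /* Sketch of split-bus SRAM emulation on an MCU; the helper
     * functions are hypothetical stand-ins for direct port accesses. */
    #include <stdint.h>

    static uint8_t ram[16384];                 /* the Mark-8's 16 kbyte space */

    extern uint16_t read_address_bus(void);    /* address lines from the CPU */
    extern uint8_t  read_data_in_bus(void);    /* the CPU's data-output bus */
    extern void     drive_data_out_bus(uint8_t v); /* the CPU's data-input bus */
    extern int      write_strobe_active(void); /* memory-write strobe */

    void emulate_memory(void)
    {
        for (;;) {
            uint16_t addr = read_address_bus() & 0x3FFF;
            if (write_strobe_active())
                ram[addr] = read_data_in_bus();   /* CPU write cycle */
            else
                drive_data_out_bus(ram[addr]);    /* CPU read cycle */
        }
    }
    ```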

    Reply
  11. Tomi Engdahl says:

    Robust crosstalk design rules
    http://www.edn.com/design/test-and-measurement/4417648/Robust-crosstalk-design-rules

    Ask any designer which has less channel-to-channel differential crosstalk: loosely-coupled stripline pairs or tightly-coupled stripline pairs, and 99% of them will say tightly-coupled differential pairs have less crosstalk. They are wrong.

    In high-speed serial links operating above 10 Gbps, losses are the dominant factor influencing interconnect design. No matter how low the loss in the dielectric, conductor loss dominates the attenuation. The only design feature that affects conductor loss is the conductor width.

    This means that in high-data-rate channels, everything should be done to enable the widest line practical. In most dense multilayer boards, the widest practical line width is about 7 mils before interconnect density or total board thickness reach practical limits.

    If the target impedance is 100 Ohms differential impedance, and the line width is 7 mils, changing the coupling between the two lines that make up a differential pair will change the differential impedance. As the lines are brought closer together, the differential impedance will decrease. In a stripline topology, to compensate for the decrease in differential impedance, the dielectric thickness between the planes is increased.

    Fringe electric and magnetic field coupling between two differential pairs causes near-end crosstalk in stripline. The more fringe fields that couple between two pairs, the more the crosstalk.

    In a tightly coupled differential stripline pair, the total dielectric spacing between the planes needs to be about 26 mils for 100 Ohm impedance, 7 mil wide, half ounce copper, in a laminate with dielectric constant of 4. If the traces move apart to a spacing of twice the line width, the total dielectric thickness is reduced to 17 mils for the same differential impedance.

    How much crosstalk is too much? Most channel specs require a signal-to-noise ratio of about 20 dB.

    In a channel without equalization, -30 dB channel-to-channel crosstalk is the acceptable limit. In a channel with the highest loss, -45 dB is the acceptable crosstalk limit.

    As we pull the differential pairs apart, the saturated near-end crosstalk will decrease.

    The channel-to-channel spacing required to achieve no more than 1.5% near-end differential crosstalk is >1.5 x the line width for tightly coupled pairs, and >1 x the line width for loosely coupled pairs.

    From the design curve above, a robust design rule for less than –50 dB near-end crosstalk, for tightly coupled differential pairs is keeping the channel-to-channel spacing >4 x the line width, while for loosely coupled differential pairs, keep the spacing >3 x the line width.

    These are robust design rules that allow you to sleep at night
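
    The spacing rules quoted above reduce to a one-line calculation. A quick sketch (mine, not from the article) for the –50 dB near-end crosstalk target:

    ```c
    /* Minimum channel-to-channel spacing per the quoted -50 dB rules:
     * >4x line width for tightly coupled pairs, >3x for loosely coupled. */
    #include <stdio.h>

    double min_spacing_mils(double line_width_mils, int tightly_coupled)
    {
        return line_width_mils * (tightly_coupled ? 4.0 : 3.0);
    }

    int main(void)
    {
        double w = 7.0;  /* widest practical line width cited above, mils */
        printf("tightly coupled: > %.0f mils\n", min_spacing_mils(w, 1)); /* 28 */
        printf("loosely coupled: > %.0f mils\n", min_spacing_mils(w, 0)); /* 21 */
        return 0;
    }
    ```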

    Reply
  12. Tomi Engdahl says:

    Airport body scanners: Are they hazardous?
    http://www.edn.com/design/analog/4417389/Airport-body-scanners–Are-they-hazardous-

    The truth about airport body scanners lies somewhere between those who want to ban all body scanners and RF in the air around us and those who say they are “absolutely” safe. My view is that we do not have enough long-term effect data yet to help us decide as engineers.

    It was well known to us that human body tissue absorbs RF radiation (non-ionized radiation) very heavily in the frequency ranges of 30-300 MHz where the RF energy is absorbed most efficiently when the whole body is exposed.

    Remember, microwaves work at a frequency of 2.45 GHz and can cook a chicken. Waves in this frequency range are absorbed by water, fats and sugars. Once absorbed, they’re converted directly into atomic motion, that is, heat. So yes, I’m skeptical about millimeter waves.

    There were two main types of scanners at airports over the last few years, “millimeter wave” and “backscatter” machines. Millimeter wave units send radio waves over a person and produce a three-dimensional image by measuring the energy reflected back. Backscatter machines use low-level X-rays to create a two-dimensional image of the body.

    The US government has an ongoing investigation of ionizing radiation (like X-rays) and also non-ionizing radiation (RF energy, of which millimeter waves are of course a subset) and how these might affect our bodies in the short and long term. Right now, the National Council on Radiation Protection (NCRP), Institute of Electrical and Electronics Engineers (IEEE) and International Commission on Non-Ionizing Radiation Protection (ICNIRP) adhere to certain maximum exposure levels for the human body. (The threshold level is a SAR value for the whole body of 4 watts per kilogram.)

    In 2012 there were more than 700 inspections of X-ray backscatter systems, all of which passed radiation tests—however, the machines have now been replaced completely by non-X-ray millimeter wave machines. By the way, the governing body of the European Union has banned the use of X-ray body scanners in order “not to risk jeopardizing citizens’ health and safety.”

    The TSA claims that it was privacy concerns — not radiation — that resulted in the agency canceling its contract with Rapiscan Systems in January. Rapiscan, unlike the maker of the millimeter-wave machines (L-3 Communications Security & Detection Systems), was unable to meet TSA’s deadline to develop software to convert the nude-like images produced by their machines into stick-like figures, the agency has said.

    Millimeter waves are similar in nature to our cell phone signals, though at much higher frequency (the millimeter waves here are at 92 GHz vs. less than 10 GHz for cell phone RF).

    These millimeter wave body scanning machines emit a special type of microwave signal, not X-rays. Two rotating transmitters produce the waves as a passenger stands very still inside the machine with hands over the head. The energy passes through clothing, bounces off the person’s skin — as well as any potential threats — and then returns to two receivers, which send images, front and back, to an operator station.

    Since the wavelengths of millimeter waves are between 1 to 10 mm, they are large relative to natural and synthetic fibers, but they tend to pass through most materials, such as clothing, making them an ideal candidate for scanning technologies.
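
    As a sanity check of those numbers (my arithmetic, not the article’s): wavelength is just λ = c/f, so the 92 GHz scanner frequency mentioned above lands at roughly 3 mm, squarely inside the 1–10 mm band.

    ```c
    /* lambda = c / f for the 92 GHz scanner signal cited above. */
    #include <stdio.h>

    int main(void)
    {
        const double c = 2.998e8;   /* speed of light, m/s */
        const double f = 92e9;      /* scanner frequency, Hz */
        printf("wavelength = %.2f mm\n", c / f * 1e3);  /* ~3.26 mm */
        return 0;
    }
    ```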

    Each transmitter emits a pulse of energy, which travels as a wave to a person standing in the machine, passes through the person’s clothes, reflects off the person’s skin or concealed solid and liquid objects and then travels back, where the transmitter, now acting like a receiver, detects the signal. Because there are several transmitter/receiver discs stacked vertically and because these stacks rotate around the person, the device can form a complete picture, from head to toe and front to back.

    The software then interprets the data and displays an image on an operator screen. It creates a 3-D, black-and-white, whole-body silhouette of the person. Another key feature in the system is known as “automated target recognition” (ATR), which is software that detects threats and highlights in the image.

    Some millimeter wave scanners do not have this type of software, so they form images that reveal a person’s unique topography

    Millimeter wave scanners do not function like a metal detector though. They see through clothing to look for metallic and nonmetallic objects (Plastic guns, explosives in your underwear and knives, for example).

    Reply
  13. Tomi Engdahl says:

    ARM, Imagination team up on UK electronics plan
    http://www.eetimes.com/electronics-news/4417727/ARM-Imagination-collaborate-on-UK-electronics-plan

    The CEOs of the world’s two leading processor IP licensors, ARM Holdings plc and Imagination Technologies Group plc, have set aside their commercial rivalry to work together on a blueprint to boost the electronics business in the UK.

    Warren East of ARM Holdings plc and Sir Hossein Yassaie of Imagination sat together on a committee that has produced the ESCO report which suggests a number of initiatives to help grow electronics business in the UK.

    “ARM has demonstrated that UK businesses can achieve global leadership from a UK base. To make sure that we produce more ARMs in the future, both business and government need to drive change. With the great potential that exists here we must ensure we create the right culture and climate in the UK to foster talent for the technology businesses of tomorrow,”

    Reply
  14. Tomi Engdahl says:

    TI targets Internet of Things set-up
    http://www.eetimes.com/electronics-news/4417512/TI-targets-Internet-of-Things-set-up

    Texas Instruments rolled out new hardware and software to help simplify the process of connecting thousands of different types of devices to the Internet of Things.

    The giant electronics manufacturer’s new solution taps into near-field communications (NFC) protocols as a means of making wireless connections between routers and printers, speakers, sensors, switches, and a multitude of other products. “The whole idea is to make the pairing easy,” Dev Pradhan of Texas Instruments told Design News. “With this, your NFC-enabled phone pairs the information from your router to all your devices with a touch.”

    Reply
  16. Tomi says:

    Would You Let a Robot Stick You With a Needle?
    http://science.slashdot.org/story/13/07/27/059223/would-you-let-a-robot-stick-you-with-a-needle

    “IEEE Spectrum has a story about a robot that uses infra red and ultrasound to image veins, picks the one with best bloodflow, and then sticks a needle in.”

    “aiming for better performance than a human”

    Reply
  17. Tomi says:

    Profile: Veebot
    Making a robot that can draw blood faster and more safely than a human can
    http://spectrum.ieee.org/robotics/medical-robots/profile-veebot

    Veebot, a start-up in Mountain View, Calif., is hoping to automate drawing blood and inserting IVs by combining robotics with image-analysis software. To use the Veebot system, a patient puts his or her arm through an archway over a padded table.

    Currently, Veebot’s machine can correctly identify the best vein to target about 83 percent of the time, says Harris, which is about as good as a human. Harris wants to get that rate up to 90 percent before clinical trials. However, while he expects to achieve this in three to five months, he will then have to secure outside funding to cover the expense of those trials.

    Harris estimates the market for his technology to be about US $9 billion, noting that “blood is drawn a billion times a year in the U.S. alone; IVs are started 250 million times.” Veebot will initially try to sell to large medical facilities.

    Reply
  18. Tomi says:

    A compact approach eliminates voltage ripple in portable devices
    http://www.electronicproducts.com/Analog_Mixed_Signal_ICs/Power_Management/A_compact_approach_eliminates_voltage_ripple_in_portable_devices.aspx

    A new, patented noise attenuation technology enables active filtering to be implemented in compact silicon integrated circuits, producing superior noise attenuation in a smaller board area than an equivalent LC filter

    Reply
  19. Tomi says:

    Scientists Demonstrate Ultra-Fast Magnetite Electrical Switch
    http://science.slashdot.org/story/13/07/30/238205/scientists-demonstrate-ultra-fast-magnetite-electrical-switch

    “Researchers at the U.S. Department of Energy’s SLAC National Accelerator Laboratory recently demonstrated electrical switching thousands of times faster than in transistors now in use thanks to a naturally magnetic mineral called magnetite”

    “The experiment is considered a major step forward in understanding electrical structures at the atomic level”

    Reply
  20. Tomi says:

    World’s Smallest Transistor
    http://www.eeweb.com/company-news/mouser/worlds-smallest-transistor/

    Mouser Electronics, Inc. announces the availability of the smallest transistor package on the market from ROHM Semiconductor which is optimized for thin, compact portable devices.

    ROHM Semiconductor ultra compact MOSFETs & bipolar transistors have the smallest transistor package on the market which is optimized for thin, compact portable devices. The VML0806 case type measures just 0.8mm × 0.6mm with a height of only 0.36mm.

    Surface mount technology limits the smallest conventional transistors to the 1006 size (1.0mm×0.6mm, t=0.37mm).

    Reply
  21. Tomi says:

    Chip-On-Glass (CoG) for LCD Modules
    http://www.eeweb.com/company-blog/nxp/chip-on-glass-cog-for-lcd-modules/

    LCDs are often supplied as LCD modules which have built-in driver circuitry that simplifies installation and improves reliability. However, the addition of packaged driver circuitry in LCD modules also results in a number of disadvantages:

    Increases the thickness of the display
    Raises the costs
    Creates greater vulnerability for failures of the modules

    All of these drawbacks are important considerations when it comes to displays for industrial, automotive and portable equipment. That’s why designers in these areas should strongly consider using Chip-on-Glass (COG) LCD modules. Chip-on-Glass (COG) LCD modules offer a very thin profile, enhanced reliability, and a reasonable price.

    Reply
  23. Tomi says:

    A Review of Corrosion and Environmental Effects on Electronics
    http://www.smtnet.com/library/index.cfm?fuseaction=view_article&article_id=1913

    The electronics industry uses a number of metallic materials in various forms, and new materials and technologies are introduced all the time for increased performance. In recent years, corrosion of electronic systems has been a significant issue. The multiplicity of materials used is one factor limiting corrosion reliability. The reduced spacing between components on a printed circuit board (PCB) due to device miniaturization is another factor that makes it easier for components to interact in corrosive environments.

    Reply
  24. Tomi says:

    Effect of BGA Reballing and its Influence on Ball Shear Strength
    http://www.smtnet.com/library/index.cfm?fuseaction=view_article&article_id=1910

    As more components become lead-free and unavailable in the tin-lead alloy, there is industry-wide interest in reballing and the subsequent effects it has on the strength of those components. This is particularly true for legacy parts needed for military applications, some of which use tin-lead solder. There is cause for concern due to the potential mixing of alloys and the differences in reflow temperatures of the two alloys. Additionally, there are unknown characteristics regarding the intermetallics that are formed due to the potential of mixed alloys.

    This research paper will focus on the effect of various parameters that are used to reball a BGA and their effect on the overall shear strength.

    Reply
  25. Tomi Engdahl says:

    SiTime’s MEMS Resonators: An alternative to Quartz
    http://www.edn.com/electronics-products/electronic-product-reviews/other/4419231/SiTime-s-MEMS-Resonators–An-alternative-to-Quartz

    SiTime Corporation has introduced the TempFlat MEMS. Until recently, all MEMS oscillators used compensation circuitry to stabilize the output frequency over temperature. This new design eliminates temperature compensation, resulting in higher performance, smaller size, lower power and cost.

    The basic architecture of a MEMS oscillator combines a MEMS resonator die together with an oscillator IC.

    With TempFlat MEMS, oscillators can offer ±50 PPM to ±100 PPM frequency stability without the need for temperature sensors and compensation. This simplifies the design of the analog CMOS circuits and reduces system size and power consumption.

    This is a great process achievement by SiTime. They have managed to cure the problem of MEMS oscillators over temperature at the source—the MEMS itself. Kudos to the process engineers!
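
    For perspective, a quick back-of-the-envelope conversion (mine, not SiTime’s) of the quoted ±50 PPM frequency stability into worst-case timekeeping drift:

    ```c
    /* +/-50 ppm frequency error accumulated over one day of timekeeping. */
    #include <stdio.h>

    int main(void)
    {
        double ppm = 50.0;
        double seconds_per_day = 24.0 * 3600.0;
        /* 50 ppm = 50e-6 fractional error -> ~4.3 s/day worst case */
        printf("worst-case drift: %.2f s/day\n", ppm * 1e-6 * seconds_per_day);
        return 0;
    }
    ```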

    Reply
  26. Tomi Engdahl says:

    Crossbar says it will explode the $60B flash memory market with Resistive RAM, which stores a terabyte on a chip
    http://venturebeat.com/2013/08/05/crossbar-says-it-will-explode-the-60b-flash-memory-market-with-resistive-ram-which-stores-a-terabyte-on-a-chip/

    Crossbar is announcing today a new kind of memory chip that can replace flash memory, one of the fundamental building blocks of digital electronics, in a number of applications.

    The Santa Clara, Calif.-based chip startup is announcing Resistive RAM, a technology that can store a terabyte of data on a single chip that is smaller than a postage stamp. It can access that data 20 times faster than the best breed of flash memory. Those features could prove disruptive to the $60 billion flash market that is at the heart of the $1.1 trillion electronics market.

    The company can put a terabyte of data, or about 250 hours of high-definition movies, on a single chip that is smaller than the equivalent flash memory chip

    The company has built a working memory array in the standard manufacturing plant of one of its partners. That’s a big milestone that shows the new technology won’t require a wholesale change in manufacturing technology.

    “What is unique about this is that we have been able to get to manufacturing in just three years,” said George Minassian, chief executive of Crossbar, in an interview with VentureBeat. “It is a technology that is easy to manufacture.”

    Of course, it is always difficult for a new technology to replace an existing one that is operating on a huge economic scale.

    Like flash, RRAM is non-volatile, meaning it can store data permanently, even when the power is turned off. The Crossbar design uses a three-layer structure that can be stacked in three dimensions. The more vertical the structures on the chip, the more terabytes can be stored on that chip.

    “Today’s non-volatile memory technologies are running out of steam, hitting significant barriers as they scale to smaller manufacturing processes,”

    “RRAM is widely considered the obvious leader in the battle for a next generation memory and Crossbar is the company most advanced, showing a working demo that proves the manufacturability of RRAM.”

    Crossbar said it has filed 100 patents, with 30 already issued.

    Reply
  27. Tomi Engdahl says:

    Single chip enables DisplayPort, USB and power over a single cable
    http://www.edn.com/electronics-products/other/4419219/Single-chip-enables-DisplayPort–USB-and-power-over-a-single-cable

    Texas Instruments has launched a single chip that delivers audio/video, USB data and power over a single cable between a notebook, ultrabook or tablet PC and a docking station or dongle.

    DockPort provides a lower cost alternative to proprietary implementations and offers more features than standard USB docking stations. It enables system designers to create smaller, more affordable docking stations that connect and synchronize computers with LCD monitors, dongles, keyboard/mouse, Gigabit Ethernet, storage, audio speakers, DVD/Blu-ray media player and smartphone.

    The HD3SS2521 controller enables DisplayPort, USB 3.0, USB 2.0 and power over a single interconnecting cable, and provides the control logic and automatic switching required on the cable’s host side and dock side.

    A bidirectional 2:1 switch manages DockPort detection, as well as signal and power switching. It enables the display, USB, power, and computer docking interface over a single cable.

    Reply
  28. Tomi Engdahl says:

    Heat-shrink tubing ensures circuit integrity
    http://www.edn.com/electronics-products/other/4419238/Heat-shrink-tubing-ensures-circuit-integrity

    Cross-linked polyolefin tubing from Protostack shrinks to half its original diameter when exposed to temperatures above 90°C to provide color identification, protection, and strain relief for wire connections and terminals.

    Heat-shrink tubing not only helps keep wiring tidy, but also provides electrical insulation to ensure reliable performance.

    Reply
  29. Tomi Engdahl says:

    IBM Gets Allies to Chip Away at Intel
    Google, Others Join Effort to Break Big Blue’s Power Designs Out of a Niche
    http://online.wsj.com/article_email/SB10001424127887323420604578650412719931232-lMyQjAxMTAzMDAwNTEwNDUyWj.html

    International Business Machines Corp. has enlisted Google Inc. and some other high-tech allies for a collective effort to catapult an IBM chip technology out of a shrinking niche.

    The alliance the companies plan to announce Tuesday would allow many companies to license IBM microprocessor designs—based on a technology dubbed Power—that are now only found in Big Blue’s own server systems. Licensees could incorporate IBM-designed circuitry in their own chips, with members of the alliance working on related products such as servers, networking and storage devices, participants said.

    Other initial members of the group, called the OpenPower Consortium, include Silicon Valley chip maker Nvidia Corp., the Israel-based networking-technology maker Mellanox Technologies Ltd. and Taiwan-based Tyan Computer Corp., a server supplier that is a unit of MiTAC International Corp.

    Reply
  30. Tomi Engdahl says:

    Time-tested technical tools and techniques
    http://www.edn.com/electronics-blogs/power-points/4418467/Time-tested-technical-tools-and-techniques

    back in the day, before microprocessors ruled the land, when vacuum tubes were the only amplifier choice, and when “digital” was mostly a relay-based reality, Lissajous figures were the only viable technique for frequency comparison.

    Using the Lissajous-pattern technique, you could also check for basic ×2, ×3, and ×4 frequency relationships, as well as other small-integer ratios (i.e., 3:2), by seeing if the on-screen pattern rotated or not. A 2:1 ratio yielded a “figure 8,” for example.

    But using Lissajous figures for frequency checking has gone the way of the selenium rectifier, and with good reason. Even a low-cost, basic frequency meter is much more versatile, accurate, precise, and convenient for just about every application. Still, you might use the Lissajous-figure technique or an updated version when checking your frequency source against an atomic clock, where you are looking for slight differences and the frequency meter is not good enough.
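
    For the curious, here is a tiny sketch (my illustration) of what the scope is actually drawing: feed one signal to X and the other to Y, and the traced curve reveals the frequency ratio. With a 2:1 ratio the points trace the “figure 8” mentioned above.

    ```c
    /* Generate Lissajous points x = sin(2*pi*f1*t), y = sin(2*pi*f2*t);
     * plotting x vs y shows the pattern (a figure 8 for a 2:1 ratio). */
    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        const double PI = 3.14159265358979;
        double f1 = 2.0, f2 = 1.0;            /* 2:1 frequency ratio */
        for (int i = 0; i < 32; i++) {
            double t = i / 32.0;              /* one period of f2 */
            printf("%6.3f %6.3f\n",
                   sin(2.0 * PI * f1 * t),    /* X channel */
                   sin(2.0 * PI * f2 * t));   /* Y channel */
        }
        return 0;
    }
    ```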

    The fact is that many old-time techniques and tools were born of necessity and ingenuity, of course, but they have been superseded or made obsolete by today’s instrumentation.

    Are there any other relatively old engineering tools and techniques that are still in fairly widespread use?

    Reply
  31. Tomi Engdahl says:

    Global industrial robot sales in 2012 hit second highest mark ever
    http://www.vision-systems.com/articles/2013/07/global-industrial-robot-sales-in-2012-hit-second-highest-mark-ever.html

    More than 159,000 industrial robots were sold globally in 2012, which is the second highest amount of all time, according to statistics released by the International Federation of Robotics (IFR).

    Of the total number of robots sold, 70% went to Japan, China, the United States, South Korea, and Germany.

    Global sales have increased by an average of 9% per year since 2008 as a result of automation trends around the world.

    The overall decrease from 2011 to 2012 was mainly due to the decrease in sales to the electronics industry.

    Reply
  32. Tomi Engdahl says:

    Test trends: Commercial scan compression tools
    http://www.edn.com/design/test-and-measurement/4418539/Test-trends–Commercial-scan-compression-tools

    About a dozen years ago, the world of test had reached an economic impasse: most digital designs had become sufficiently complex that standard scan testing techniques were no longer cost-effective. Because scan chains were too long, it was taking too much time to scan data through them when applying manufacturing tests. Moreover, even though at-speed testing had become essential for screening nanometer defects, many low-cost and legacy testers did not have the memory capacity to store all the data.

    The emergence of commercial scan compression tools at that juncture changed the economics of test. Instead of connecting flops together into a few very long scan chains, they created hundreds of short chains connected to a compressor-decompressor (CODEC). Compression enabled substantial cost savings through test application time reduction (TATR) and test data volume reduction (TDVR). TATR lowered costs for semiconductor manufacturers testing parts in high volume because more parts could be tested in less time. TDVR reduced memory storage requirements to accommodate both stuck-at and at-speed tests to improve defect coverage.

    Since compression’s early successes, designers worldwide have embraced it as a design-for-test (DFT) methodology essential for lowering test costs and enabling high test quality. But compression requirements have evolved in the intervening years, making it necessary to also evolve the compression technology. Four manufacturing test trends have had the most influence on the direction of the technology.

    Reply
  33. Tomi Engdahl says:

    Sensor stickies aren’t a toy story
    http://www.edn.com/electronics-blogs/sensor-ee-perception/4418289/Sensor-stickies-aren-t-a-toy-story?elq=a673ad2124f94309a16befec5e5f7b97&elqCampaignId=170

    Is this cool, or what? A nationwide project in Japan called the Sensor System Development Project to Solve Social Problems has spawned a new type of environmental sensor. Measuring only 2.5 cm across and 1 mm thick, the solution acts like a piece of tape—attaching to surfaces easily, measuring and reporting environmental conditions.

    The sensors have three layers: a highly integrated MEMS sensor, antenna, and power generation/storage.

    The deployed sensor becomes part of a wireless communications network and can measure CO2, temperature, infrared light, dust, and more; data is transmitted from the sensor to a CPU. Targeted applications so far are factories, schools, office buildings, and medical facilities.

    Sounds like it will cost a bundle? Guess again. Expectations are it will be on the market for less than $10 each. Pretty nifty.

    Reply
  34. Tomi Engdahl says:

    Is NASA’s design opportunity for FPGAs in space vanishing in favor of privatized platforms?
    http://www.edn.com/electronics-blogs/fpga-gurus/4418246/Is-NASA-s-design-opportunity-for-FPGAs-in-space-vanishing-in-favor-of-privatized-platforms-?elq=a673ad2124f94309a16befec5e5f7b97&elqCampaignId=170

    Not so long ago, the opportunities for rad-hardened FPGAs used in space applications rested with the Defense Department or NASA. The rise of privatized launches and open-architecture microsatellites like CubeSat, however, has made NASA design-ins the exception rather than the rule.

    This reality was drilled home in late June as 4DSP LLC announced a $42,479 contract from NASA’s Langley Research Center, to use 3U CompactPCI cards based on a Virtex-6 as part of a terrestrial platform to test space instruments. Only a few years ago, NASA and Air Force contracts utilizing FPGAs were commonplace. Now, it’s time for microsatellites designed by academia and private industry.

    NASA, the National Reconnaissance Office, and the Air Force are experimenting with designs like CubeSat and FalconSat. But government contracts have long relied on proprietary satellite buses such as A2100 and STARbus.

    In the new world, FPGAs may have more opportunity to end up in orbit, because designs need not be limited to strapped federal agencies currently facing sequestration.

    Reply
  35. Tomi Engdahl says:

    ARM Architecture Offers Challenges Along with Features to Help Meet Them
    http://rtcmagazine.com/articles/view/103006

    The ARM processor architecture offers a number of hardware features, which, in conjunction with tools that are able to take advantage of them, can greatly enhance understanding, bug avoidance and bug fixing during the development process.

    Embedded developers who create systems using the ARM architecture face challenges similar to developers who use other architectures. Some of these challenges involve initializing the device and its peripherals while others involve the ability to write to peripherals that are attached to the microcontroller core. Moreover, debugging software written for any device can be a daunting affair if the application is complex. Fortunately, ARM Ltd. has worked diligently to foster an ecosystem of partners and services that mitigate many of the difficulties in developing embedded software, both on the software development and the debugging aspects of product development. This ecosystem helps a developer write an application quickly and efficiently, all while minimizing the potential for defects in the code.

    Semiconductor companies generally have different ways to initialize and utilize parts of the core, such as interrupts and condition registers. Because of this, it can be difficult for an embedded developer to learn how to use these parts of the microcontroller effectively and correctly, especially if their organization is using parts from several different silicon vendors. ARM has a unique solution to this problem via their Cortex Microcontroller Software Interface Standard (CMSIS) interface. CMSIS defines an API that an embedded developer can use to set up timers, ITM registers, the Nested Vectored Interrupt Controller (NVIC) and many other things.
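
    A minimal sketch of what that looks like in practice, using standard CMSIS-Core calls (SysTick_Config, NVIC_SetPriority, NVIC_EnableIRQ). The device header name and UART0_IRQn are placeholders; the actual names come from the silicon vendor’s CMSIS device pack.

    ```c
    /* CMSIS-Core sketch: 1 ms system tick plus one peripheral interrupt.
     * "device.h" and UART0_IRQn are vendor/device-specific placeholders. */
    #include "device.h"           /* pulls in core_cm*.h and IRQn numbers */

    volatile uint32_t ticks;

    void SysTick_Handler(void)    /* standard CMSIS handler name */
    {
        ticks++;
    }

    int main(void)
    {
        SysTick_Config(SystemCoreClock / 1000u);  /* 1 ms tick */

        NVIC_SetPriority(UART0_IRQn, 2);          /* device-specific IRQ */
        NVIC_EnableIRQ(UART0_IRQn);

        for (;;)
            __WFI();              /* CMSIS intrinsic: sleep until interrupt */
    }
    ```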

    Reply
  36. Tomi Engdahl says:

    Product How-to: Software selectable serial protocols over a single connector
    http://www.edn.com/design/analog/4419212/Product-How-to–Software-selectable-serial-protocols-over-a-single-connector

    While the RS-232, RS-422, and RS-485 serial standards are hardly new, combining them over a single connector creates challenges unforeseen when these standards were first drafted years ago. Multiprotocol transceivers with integrated termination resistors greatly simplify the development of modern serial controllers, resulting in smaller, lower cost, and quicker-to-market solutions.

    Today RS-232 has largely been replaced by USB in the personal computer market, but RS-232, RS-422, and RS-485 are still used heavily in industrial applications like medical devices, factory and building automation, and robotics. These protocols are simple and low cost to implement, are time tested, and deliver rugged and robust communication in noisy environments.

    Interfacing with an existing network of serial devices may require several of the common protocols, typically with separate external connectors for each. However, with the electronics equipment industry continuously moving in the direction of cheaper, smaller, and quicker to market, designers are being pressed to reduce the number of bulky connectors and deliver a lower cost integrated solution in a shorter time. Since the RS-232 standard specifies the DB9 connector, it is often chosen, requiring the RS-485 and RS-422 standards to co-exist alongside RS-232 in the same connector.

    Integrating multiple serial standards over a single shared connector introduces a host of issues the standards developers did not conceive or plan for. Proper signaling with one protocol requires the drivers and receivers of the others to be disconnected to avoid excessive loading on the bus.

    Older and lower cost RS-485 transceivers require external biasing resistors to hold the bus in a known state when transmitters are idle or powered off. The RS-232 protocol was not designed to drive these resistive loads, so they must be removed from the lines for RS-232 communication.

    Multiprotocol serial transceivers combine RS-232, RS-422, and RS-485 drivers and receivers into a single chip to address these common issues. For example, the SP338 and SP339 devices from Exar include switchable termination resistors for high speed RS-485/422 (up to 20Mbps) and full support of all eight RS-232 signals specified for the popular DB9 connector.
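
    To illustrate the idea (a hypothetical sketch, not Exar’s actual pin map or API), protocol selection typically comes down to driving a few mode and termination-enable pins from software:

    ```c
    /* Hypothetical mode-pin control for a multiprotocol transceiver.
     * Pin assignments and gpio_write() are invented for illustration. */
    #include <stdbool.h>

    typedef enum { MODE_RS232, MODE_RS422, MODE_RS485 } serial_mode_t;

    extern void gpio_write(int pin, bool level);  /* board-specific helper */

    #define PIN_MODE_SEL   1  /* RS-232 vs RS-485/422 drivers */
    #define PIN_DUPLEX_SEL 2  /* full duplex (RS-422) vs half (RS-485) */
    #define PIN_TERM_EN    3  /* switchable 120-ohm termination */

    void set_serial_mode(serial_mode_t mode)
    {
        switch (mode) {
        case MODE_RS232:
            gpio_write(PIN_MODE_SEL, false);
            gpio_write(PIN_TERM_EN, false);  /* RS-232 must not drive termination */
            break;
        case MODE_RS422:
            gpio_write(PIN_MODE_SEL, true);
            gpio_write(PIN_DUPLEX_SEL, true);
            gpio_write(PIN_TERM_EN, true);
            break;
        case MODE_RS485:
            gpio_write(PIN_MODE_SEL, true);
            gpio_write(PIN_DUPLEX_SEL, false);
            gpio_write(PIN_TERM_EN, true);
            break;
        }
    }
    ```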

    Reply
  37. Tomi Engdahl says:

    Connector voltage stress
    http://www.edn.com/electronics-blogs/living-analog/4419213/Connector-voltage-stress-

    I have seen the recommendation in high-voltage assembly work that surfaces spanning the distance between two nodes with high voltage between them be stressed at no greater than 10 volts per mil.

    The 1704-1 connector is rated for service up to 5000 volts.
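
    Applying the 10 V/mil rule of thumb to that 5000 V rating gives a quick feel for the spacing involved (my arithmetic, not from a datasheet):

    ```c
    /* Minimum surface distance implied by the 10 V/mil stress rule. */
    #include <stdio.h>

    int main(void)
    {
        double volts = 5000.0;   /* connector service rating */
        double limit = 10.0;     /* volts per mil rule of thumb */
        printf("min spacing: %.0f mils (%.1f in)\n",
               volts / limit, volts / limit / 1000.0);  /* 500 mils = 0.5 in */
        return 0;
    }
    ```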

    Reply
  38. Tomi Engdahl says:

    Samsung’s “3D Vertical” NAND crams a terabit on a single chip
    Longer life, higher reliability, more performance—what’s not to like?
    http://arstechnica.com/gadgets/2013/08/samsungs-3d-vertical-nand-crams-a-terabit-on-a-single-chip/

    SSD enthusiasts know all about SLC, MLC, and TLC, but there are some new acronyms in SSD town: V-NAND and CTF. Samsung announced in a press release last night that it has begun mass production of “3D Vertical NAND,” a type of flash that it claims overcomes the existing limits on the design and production of existing NAND types. When we looked at those limits about a year ago, they seemed pretty significant; Samsung’s V-NAND aims to neatly sidestep most of the issues.

    The new V-NAND is manufactured at a 10nm process size, and it starts at a density of 128Gb per NAND chip. The NAND chips are constructed in layers, stacking up to 24 individual NAND cells on top of each other. This lets Samsung scale the chip’s capacity up without having to add more NAND cells in a series, or “planar scaling,” as the traditional “just shrink ‘em and add more cells” method is called.

    The other acronym, CTF, stands for “Charge Trap Flash.” Traditional NAND flash records zeros and ones by storing charge in a set of floating gate transistors, with the presence or absence of charge corresponding to a 0 or a 1 in single-level cell NAND, and the amount of charge corresponding to different multibit values in multi- and triple-level cell NAND (we have an extremely in-depth primer on the inner workings of SSDs if you want more details). However, Samsung’s new V-NAND dispenses with floating gate transistors and uses a different method:

    In Samsung’s CTF-based NAND flash architecture, an electric charge is temporarily placed in a holding chamber of the non-conductive layer of flash that is composed of silicon nitride (SiN), instead of using a floating gate, to prevent interference between neighboring cells.

    Samsung predicts that V-NAND will scale up to 1Tb per individual NAND chip. Most SSDs use at least eight NAND chips in parallel, so V-NAND could lead directly to low dollar-per-GB 2.5-inch form factor SSDs of 1TB and beyond—capacities which many Ars commenters have said repeatedly that they desperately want.

    Reply
  39. Tomi Engdahl says:

    Industry’s first 5V Cortex-M0+ MCUs target industrial and white goods
    http://www.edn.com/electronics-products/other/4419320/Industry-s-first-5V-Cortex-M0–MCUs-target-industrial-and-white-goods

    Freescale Semiconductor has developed the industry’s first 5V 32-bit MCUs built around the ARM Cortex-M0+ processor for rugged industrial and consumer environments.

    The sub $1 Kinetis E series MCUs feature electromagnetic noise immunity for systems that traditionally use 8- and 16-bit MCUs, such as white goods and industrial applications, while providing high efficiency and optimal code density.

    The controllers are aimed at applications such as dishwashers, refrigerators, home and building control systems, motor control fans, industrial converters and other equipment commonly operating in high-noise environments.

    “Historically, 32-bit MCUs have been associated with lower voltage operation and considered unreliable in electromagnetically harsh environments such as factories or even many households,” said Brandon Tolany, vice president of marketing and business development for Freescale’s MCU business. “Kinetis E series MCUs break new ground by combining low cost and high performance with advanced electromagnetic compatibility (EMC) and electrostatic discharge (ESD) protection features designed to support compliance with industrial-grade reliability and temperature requirements.”

    The family features high electrical fast transient/electrostatic discharge (EFT/ESD) performance, flash, RAM, register, watchdog and clock tests and operates from 2.7V to 5.5V and across -40°C to +105°C ambient temperature range. The ARM Cortex-M0+ core runs up to 20MHz and the devices include a single-cycle 32-bit x 32-bit multiplier, single-cycle I/O access port, up to 64 KB flash memory, 256 bytes of EEPROM and up to 4 KB RAM.

    “It’s impressive to see a 32-bit MCU family at these price points for this type of environment,”

    The first Kinetis E series device, the 64-pin QFP version of the KE02, is available now. Suggested resale pricing for the KE02 MCU family in 10K unit quantities ranges from 78 cents to $1.13 (USD). The FRDM-KE02Z Freescale Freedom development platform is available now for $12.95 (USD).

  40. Tomi Engdahl says:

    Weightlessness of space brings unforeseen power-management challenges
    http://www.edn.com/electronics-blogs/power-points/4419174/Weightlessness-of-space-brings-unforeseen-power-management-challenges

    For example, engineers know that the standard meltable-link fuse is a simple, passive, reliable, and very effective way to protect against damage due to short circuits and overloads; it’s normal and wise practice to use these on power and signal lines. Their operating principle is simple: when excess current flows through the fuse, the link heats and melts, and then the molten blob falls away, breaking the circuit and current path.

    Whoops… the word “falls” is the key to why a fuse won’t work in the weightless world: there is no force (such as gravity) to cause the blob to go anywhere. It will melt and then stay in place, making and breaking the circuit intermittently. Hmmm, I didn’t think of that reality.

    Another counterintuitive situation has to do with cooling, an omnipresent concern for systems. In the vacuum of space, of course, there is no option of conduction or convection cooling; only radiation cooling is possible. This complicates the design of satellites and must be carefully factored into the thermal planning and system design.

    But what about the Space Shuttle, Skylab, or the International Space Station, all of which have a “normal” air atmosphere? That should allow convection cooling of the electronics as heated air rises, you might assume.

    Whoops, wrong again… another mistaken assumption: the word “rises” has no meaning in this weightless environment. What happens is that the heated air just stays where it is, accumulating around the heat source and acting as a warm – and therefore destructive – blanket.
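
    The gravity dependence is easy to quantify: natural convection only starts when the Rayleigh number Ra = g·β·ΔT·L³/(ν·α) exceeds a critical value of roughly 1700. A short Python sketch, with rough room-temperature air property values assumed purely for illustration:

    # Ra is directly proportional to g, so in microgravity it collapses
    # and the heated air simply stays put around the component.
    def rayleigh(g, dT=20.0, L=0.1, beta=3.3e-3, nu=1.6e-5, alpha=2.2e-5):
        """Ra = g*beta*dT*L^3 / (nu*alpha); air properties are rough values."""
        return g * beta * dT * L**3 / (nu * alpha)

    print("On Earth: Ra = %.1e" % rayleigh(9.81))   # ~1.8e6, well above 1700
    print("On orbit: Ra = %.1e" % rayleigh(1e-5))   # ~1.9, far below: no convection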

  41. Tomi Engdahl says:

    ∑-∆ Isolation amplifier transfers low frequencies across barrier
    http://www.edn.com/design/analog/4419193/–Isolation-amplifier-transfers-low-frequencies-across-barrier

    Figure 1: The floating sigma-delta converter U1 generates a pulse train whose duty cycle depends on the voltage at J1 and J2. The filtered DC level is reconstructed at the output J3.

    The circuit has been used in pH and Redox amplifiers, and to isolate the speed signal in three-phase AC inverters.
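
    The principle is easy to model in a few lines. Below is a toy first-order sigma-delta loop in Python (my own illustration, not the published design): the duty cycle of the 1-bit pulse train equals the normalised input voltage, so only that pulse train has to cross the isolation barrier, and a simple low-pass filter on the far side recovers the DC level.

    # Toy first-order sigma-delta modulator plus reconstruction filter.
    # Input and output are normalised to the 0..1 range.
    def isolate(v_in, n=20000, alpha=0.001):
        integrator, bit, y = 0.0, 0, 0.0
        for _ in range(n):
            integrator += v_in - bit              # accumulate quantisation error
            bit = 1 if integrator >= 0.5 else 0   # 1-bit quantiser feeding the barrier
            y += alpha * (bit - y)                # RC-style low-pass on the other side
        return y

    print(round(isolate(0.30), 2))   # ~0.3: the duty cycle carries the voltage
    print(round(isolate(0.72), 2))   # ~0.72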

  42. Tomi Engdahl says:

    Samsung rolls out first mass-produced 3D NAND flash memory chip
    The reports of NAND’s death, Sammy says, are greatly exaggerated
    http://www.theregister.co.uk/2013/08/07/samsung_rolls_out_first_massproduced_3d_nand_chip/

    “Following the world’s first mass production of 3D Vertical NAND, we will continue to introduce 3D V-NAND products with improved performance and higher density, which will contribute to further growth of the global memory industry,” said Samsung SVP for flash product and technology Jeong-Hyuk Choi in a statement.

    The V-NAND part will provide 128 gigabits of storage, and is based on Samsung’s implementation of 3D Charge Trap Flash (CTF) technology

  43. Tomi Engdahl says:

    Boffins harvest TV, mobile signals for BATTERY-FREE comms
    Powerless tech better than your common-or-garden RFID tag, claim researchers
    http://www.theregister.co.uk/2013/08/14/boffins_hawk_powerless_radio/

    Radio boffins from the University of Washington have created tags and readers which reflect and feed off ambient radio frequency energy for communications – without needing a power source.

    The team calls the technology “ambient backscatter” and reckons it could connect up the much-heralded Internet of Things without either party needing a power supply.

    Two tags can absorb, or reflect, existing transmissions from (for example) a TV broadcaster to convey information to each other, while using the power absorbed to process the signals, as demonstrated in this jolly video.

    Absorbing power from TV transmissions is very old news. Back in the ’60s Practical Wireless ran a feature on the subject. More recently, Intel managed to pull 25µA at 1.5V from a local TV transmitter – apparently just for fun.

    Pulling power from a TV signal generates a shadow, sucking the signal from the surrounding area. The effect is very localised, but it’s that removal of signal which the Ambient Backscatter system perceives as a transmitted Zero. A “One” is sent by reflecting the signal, and thus binary communication is possible.

    Communication by reflected signal is also old news; various forms of RFID tag work this way.
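
    To make the reflect/absorb signalling concrete, here is a hedged toy simulation in Python. All the gain and noise figures are invented for illustration; the real prototype’s levels will differ, but the decoding idea, averaging the received envelope over each bit period and slicing against a threshold, is the same:

    import random

    def channel(bits, spb=100, reflect=1.1, absorb=0.9, noise=0.05):
        """Received envelope: reflecting boosts the level, absorbing shadows it."""
        return [(reflect if b else absorb) + random.gauss(0, noise)
                for b in bits for _ in range(spb)]

    def decode(rx, spb=100):
        means = [sum(rx[i:i + spb]) / spb for i in range(0, len(rx), spb)]
        threshold = sum(means) / len(means)       # crude mid-level slicer
        return [1 if m > threshold else 0 for m in means]

    tx = [1, 0, 1, 1, 0, 0, 1, 0]
    print(decode(channel(tx)) == tx)              # True (almost always)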

  44. Tomi Engdahl says:

    Optical Navigation Systems: The foundation of modern pointing devices
    http://www.edn.com/design/sensors/4419587/Optical-Navigation-Systems–The-foundation-of-modern-pointing-devices

    Optical Navigation Systems (ONS) make use of optical physics to measure relative motion (both magnitude and direction) between a navigation device and the navigation surface. These systems find their major application in pointing and finger-tracking devices. Initially, ONS entered the consumer market through optical mice, an application in which they still enjoy great success. The precision ONS provides in motion sensing, however, has also made it a suitable candidate for finger-tracking applications. This is evident from its widespread use in PC tablets, smart phones, digital cameras, and remote controls.

    Optical physics says that whenever a beam of light is incident on a surface, part of it is absorbed, some of it is scattered, and the rest is reflected back.

    The degree (or the percentage) of absorption, reflection, and scattering depends on both the wavelength of the light and the characteristics of the reflecting surface.

    A variety of optical sources can be used along with the appropriate sensors, including light-emitting diodes (LEDs) and infrared lasers. Laser sources are most commonly used, as they provide high resolution and can be used on a wide variety of surfaces, including ones that are somewhat reflective.

    LEDs, though more susceptible to external vibrations and ambient light noise, offer ease of implementation in applications where high resolution is not required. Unlike laser sources, LEDs are also not required to meet any eye safety standards.

    In most applications, irrespective of the optical source (laser or LED) used, a photodiode array is a fundamental part of any optical sensor. However, the arrangement and orientation of the array depends on the processing technique used and usually varies from sensor to sensor. Each array consists of several tiny photodiodes (pixels), which define the resolution of the sensor. In general, more pixels provide higher resolution.

    For an ONS-based system, the microprocessor unit acts more like a small digital signal processor (DSP). As it receives data from the sensor (photodiode array) in the form of a matrix, analysis is usually complex and requires a DSP. In most systems, the functionality of the DSP and microprocessor is integrated into a single component to reduce system cost and size.
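
    As a concrete sketch of that DSP step, the core job is to find the shift that best aligns two successive frames from the photodiode array. Here is a brute-force Python toy (my own illustration of the general correlation approach, not any vendor’s actual pipeline):

    # Estimate (dx, dy) motion between two small grayscale frames by
    # brute-force correlation over all candidate shifts.
    def estimate_motion(prev, curr, max_shift=2):
        h, w = len(prev), len(prev[0])
        best = (0, 0, float("-inf"))
        for dy in range(-max_shift, max_shift + 1):
            for dx in range(-max_shift, max_shift + 1):
                score = 0
                for y in range(h):
                    for x in range(w):
                        sy, sx = y - dy, x - dx   # sample prev shifted by (dx, dy)
                        if 0 <= sy < h and 0 <= sx < w:
                            score += prev[sy][sx] * curr[y][x]
                if score > best[2]:
                    best = (dx, dy, score)
        return best[0], best[1]

    prev = [[0, 0, 0, 0],
            [0, 9, 0, 0],
            [0, 0, 0, 0],
            [0, 0, 0, 0]]
    curr = [[0, 0, 0, 0],
            [0, 0, 9, 0],   # the bright spot moved one pixel to the right
            [0, 0, 0, 0],
            [0, 0, 0, 0]]
    print(estimate_motion(prev, curr))   # (1, 0)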

    Since every ONS is usually a part of a bigger system, it needs to communicate with other functional blocks of the system. This requires support for external interfaces like USB, SPI, I2C, etc.

  45. Tomi Engdahl says:

    World’s biggest chip equipment manufacturer names Gary Dickerson as CEO
    http://venturebeat.com/2013/08/15/worlds-biggest-chip-equipment-manufacturer-names-gary-dickerson-as-ceo/

    Applied Materials, the world’s largest maker of chip manufacturing equipment, has named Gary Dickerson as president and chief executive of the company. He replaces Mike Splinter, who will remain on the board and retain the title of executive chairman.

    Santa Clara, Calif.-based Applied makes the multi-million-dollar equipment that is used to manufacture semiconductor chips in multi-billion-dollar factories.

    Splinter has worked in chips for 40 years. This fall, he will receive the Semiconductor Industry Association’s 2013 Robert N. Noyce award for his outstanding achievements in the business.

  46. Tomi Engdahl says:

    Tech industry slips into a surprising slump
    http://www.latimes.com/business/la-fi-tech-slump-20130819,0,871392.story

    After a six-year boom ignited by the introduction of the iPhone in 2007, tech firms are in the unusual position of being laggards in the U.S. economy’s recovery.

    In a surprising turn, the tech industry is in a slump even as the U.S. economy picks up steam.

    Though there are still bright spots among companies that help manage data or provide cybersecurity, many of the industry’s biggest companies — Microsoft, Google, IBM and Dell — are struggling to figure out the changes in the way businesses and consumers are buying and using technology.

    Microsoft is among the tech industry’s biggest players struggling to navigate the changes in the way businesses and consumers are buying and using technology. In its disappointing summer earnings report, Microsoft took an ugly $900-million write-down because of poor sales of its Surface tablet.

    It’s not a bust — not yet at least. And it isn’t as serious as the 2000 dot-com crash, when tech’s fortunes quickly deteriorated. Indeed, on the ground in Silicon Valley, there is a bit of a disconnect because competition for hiring remains intense.

    But in recent months, tech earnings have plummeted as tech companies have reported slower growth or declines. Venture capital has fallen almost 7% this year. And tech stocks have lagged the broader stock market this year.

    “What I’ve seen is that a lot of the tech heavyweights are having challenges,” said Patrick Moorhead, principal analyst at Moor Insights and Strategy. “There’s a fundamental shift in the marketplace that many people are grappling with. What we’re seeing is a transitional period.”

    And tech finds itself in the unusual position of being a laggard in the economy’s recovery.

    “Technology remains a big drag on earnings growth,” Zacks Investment Research analyst Sheraz Mian wrote in a recent report. “The sector’s earnings picture is very poor.”

    Companies continue to shift from buying their own hardware and software to renting computing power through cloud-based services in which files are kept at massive data centers in far-flung locations. These save money for buyers but generate less revenue for sellers.

    Consumers, meanwhile, appear to be showing signs of fatigue after embracing so many new gadgets in recent years.

    PC sales have been devastated by tablets. But now tablets are losing steam, with even Apple reporting a decline in iPad sales in the most recent quarter.

    Worldwide tablet shipments fell nearly 10% in the second quarter compared with the first quarter, according to an August study from IDC.

  47. Tomi Engdahl says:

    Are You Designing for Testability?
    http://www.designnews.com/author.asp?section_id=1394&doc_id=265467&cid=nl.dn14

    Are you designing for testability? If you think simple designs don’t need test points, you are wrong.

    Circuit scale continues to follow Moore’s Law, even to this day. When 0402 becomes 0201 and then 01005, the scale is almost beyond direct human manipulation. At these sizes, using component solder connections as test points is possible but unserviceable. “But the design is just a simple RC circuit, no need for a test point.” No, seriously, this is the wrong mindset. Designs change, expand, and conflict, and errors abound in early versions. Catching errors early can save 10 to 1,000 times the cost of fixing them later. Due diligence ahead of time is key.

    There are few inevitabilities in life — death, bills, taxes — and problematic designs should be added to that list. That said, Design for Testability (DFT) is essential in these days of system on a chip (SoC) and components that are smaller than the point of a needle.

    The requirements stage of any design is the most critical. An error in this stage will snowball into complications and cost far more to fix later.

    NASA published a report called “Error Cost Escalation through the Project Life Cycle.” An error at the requirements stage was assigned an arbitrary cost of “1 unit.” The cost to fix that error during the design phase ticks up to “3 to 8 units.” If the same error is not addressed until the manufacturing stage, the cost moves up to “7 to 16 units.”

    Addressing that error during the test and integration phase brings the cost up to “21 to 78 units.” Fixing an error after a product has been released into the market would cost “29 units” to a whopping “1,500 units.”

    At $240 a day ($30 per hour), an engineer could fix a problem for 3 units, or $720. Let it go to the public, and it could reach 1,500 units, or $360,000 to smooth out that wrinkle. The design phase is the gateway point, bridging ideas with the real world. It is extremely critical to stop the dominoes of error from falling any further past this point.
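
    Turning the report’s unit ranges into dollars at that rate is simple arithmetic; here is a short Python sketch using the figures quoted above:

    # NASA's escalation multipliers, with 1 unit = one engineer-day = $240
    # ($30/hour x 8 hours), as in the article.
    UNIT = 240  # dollars

    stages = [
        ("requirements",        1,    1),
        ("design",              3,    8),
        ("manufacturing",       7,   16),
        ("test & integration", 21,   78),
        ("in the field",       29, 1500),
    ]

    for stage, lo, hi in stages:
        print("%-18s $%s to $%s" % (stage, format(lo * UNIT, ","), format(hi * UNIT, ",")))
    # design: $720 to $1,920 ... in the field: $6,960 to $360,000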

