Here is my list of electronics industry trends and predictions for 2016:
A huge set of mega-mergers in the electronics industry was announced in 2015. In 2016 we will see fewer mergers and find out how well the existing ones went. Not all of the major acquisitions will succeed. Probably the biggest challenge in these mega-mergers is “creating merging cultures or–better yet–creating new ones”.
Makers and open hardware will boost innovation in 2016. Open source has worked well in the software community, and it is now coming to the hardware side. Maker culture encourages people to be creators of technology rather than just consumers of it. A combination of the maker movement and robotics is preparing children for a future in which innovation and creativity will be more important than ever: robotics is an effective way for children as young as four years old to get experience in the STEM fields of science, technology, engineering, and mathematics, as well as programming and computer science. The maker movement is inspiring children to tinker-to-learn. Popular DIY electronics platforms include Arduino, Lego Mindstorms, Raspberry Pi, Phiro and LittleBits. Some of these DIY electronics platforms, like Arduino and Raspberry Pi, are finding their way into commercial products, for example in 3D printing, industrial automation and Internet of Things applications.
Open source processor cores gain more traction in 2016. RISC-V is on the march as an open source alternative to ARM and MIPS. Fifteen sponsors, including a handful of high-tech giants, are queuing up to be the first members of its new trade group. Currently RISC-V runs Linux and NetBSD, but not Android, Windows or any major embedded RTOSes. Support for other operating systems is expected in 2016. For other open source processor designs, take a look at OpenCores.org, the world’s largest site/community for development of hardware IP cores as open source.
GaN will be more widely used and talked about in 2016. Gallium nitride (GaN) is a binary III/V direct-bandgap semiconductor that has been commonly used in bright light-emitting diodes since the 1990s. It has special properties for applications in optoelectronic, high-power and high-frequency devices. You will see more GaN power electronics components because GaN – in comparison to the best silicon alternative – enables higher power density through the ability to switch at high frequencies. You can get GaN devices from, for example, GaN Systems, Infineon, Macom, and Texas Instruments. The emergence of GaN as the next leap forward in power transistors gives new life to Moore’s Law in power.
Power electronics is becoming more digital and connected in 2016. Software-defined power addresses a critical need in modern power systems. Digital power, using a microcontroller or a DSP, was the beginning of software-defined power; software-defined power takes this to another level. Connectivity is the key to its success, and the PMBus will enable efficient communication and connection between all power devices in computer systems. It seems that power architectures are becoming software defined, taking advantage of digital power adaptability and introducing software control to manage power continuously as operating conditions change. For example, adaptive voltage scaling (AVS) is supported by the AVSBus, which is contained in the newest PMBus standard, version 1.3. The use of power-optimization software algorithms and the concept of the Software Defined Power Architecture (SDPA) are all being seen as part of a brave new future for advanced board-power management.
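To make the PMBus side concrete, here is a minimal sketch of reading a regulator’s output voltage from Linux, assuming the smbus2 Python library and a hypothetical device at I2C address 0x40 on bus 1. The command codes VOUT_MODE (0x20) and READ_VOUT (0x8B) are from the PMBus specification; everything else here is illustrative, not a specific vendor’s part.

```python
# Minimal PMBus read sketch (assumption: PMBus device at 0x40 on I2C bus 1).
from smbus2 import SMBus

PMBUS_ADDR = 0x40   # hypothetical device address
VOUT_MODE = 0x20    # standard PMBus command: voltage data format
READ_VOUT = 0x8B    # standard PMBus command: read output voltage

with SMBus(1) as bus:
    # The lower 5 bits of VOUT_MODE hold the LINEAR16 exponent
    # as a two's-complement value.
    mode = bus.read_byte_data(PMBUS_ADDR, VOUT_MODE) & 0x1F
    exponent = mode - 32 if mode & 0x10 else mode
    raw = bus.read_word_data(PMBUS_ADDR, READ_VOUT)  # 16-bit mantissa
    print("VOUT = %.3f V" % (raw * 2.0 ** exponent))
```

The same bus lets a host read currents and temperatures and rewrite setpoints on the fly, which is what makes the “software-defined” part more than a buzzword.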
Nanowires and new forms of memory like RRAM (resistive random access memory) and spintronics are also being researched, and could help scale down chips. Many “exotic” memory technologies are in the lab, and some are even in shipping products: Ferroelectric RAM (FRAM), Resistive RAM (ReRAM), Magnetoresistive RAM (MRAM), and Nano-RAM (NRAM).
Nanotube research has been ongoing since 1991, but it has been a long road to a practical nanotube transistor. It seems that we almost have the necessary pieces of the puzzle in 2016. In 2015 IBM reported a successful auto-alignment method for placing nanotubes across the source and drain. Texas Instruments is now capable of growing wafer-scale graphene, and the Chinese have taken the lead in developing both graphene and nanotubes, according to Lux Research.
While nanotubes provide the fastest channel material available today, III-V materials like gallium arsenide (GaAs) and indium gallium arsenide (InGaAs) are all being explored by IBM, Intel, Imec and Samsung as transistor channels on silicon substrates. Dozens of researchers worldwide are experimenting with black phosphorus as an alternative to nanotubes and graphene for the next generation of semiconductors. Black phosphorus has the advantage of having a bandgap and works well alongside silicon photonics devices. Molybdenum disulphide (MoS2) is also a contender for the next generation of semiconductors, due to its novel stacking properties.
Graphene has many fantastic properties and there have been new findings around it. I think it would be a good idea to follow developments around magnetized graphene: researchers have made graphene magnetic, clearing the way for faster everything. I don’t expect practical products in 2016, but maybe something in the next few years.
Optical communications is finally integrating deep into chips. There are many new contenders on the horizon for the true “next generation” of optical communications, with promising technologies in development in labs and research departments around the world. Silicon photonics is the study and application of photonic systems which use silicon as an optical medium, and silicon photonic devices can be made using existing semiconductor fabrication. We now start to have the technology to build optoelectronic microprocessors using existing chip manufacturing: engineers have demonstrated the first processor that uses light for ultrafast communications. Optical communication could also reduce chips’ power consumption on inter-chip links and enable longer very fast links between ICs where needed. Two-dimensional (2D) transition metal dichalcogenides (TMDCs) may enable engineers to exceed the properties of silicon in terms of energy efficiency and speed, moving researchers toward 2D on-chip optoelectronics for high-performance applications in optical communications and computing. To build practical systems with these ICs, we still need to figure out how to make fiber-to-chip coupling easy, or how to manufacture practical optical printed circuit boards (O-PCB).
Look at developments in self-directed assembly. Researchers from the National Institute of Standards and Technology (NIST) and IBM have discovered a trenching capability that could be harnessed for building devices through self-directed assembly. The capability could potentially be used to integrate lasers, sensors, waveguides and other optical components into so-called “lab-on-a-chip” devices.
Smaller chip geometries come to the mainstream in 2016. Intel’s chip advancements and cost savings slowed down with its current 14-nanometer process, which is used to make its latest PC, server and mobile chips. Other manufacturers are catching up to 14nm and beyond. GlobalFoundries starts producing a central processing chip as well as a graphics processing chip using 14nm technology. After a lapse, Intel looks to catch up with Moore’s Law again with upcoming 10-nanometer and 7-nm processes. Samsung revealed that it will soon begin production of a 10nm FinFET node, and that the chip will be in full production by the end of 2016 – expected to be at around the same time as rival TSMC, whose 10nm process will require triple patterning. For mass-market products it seems that the 10nm node is still at least a year away. Intel delayed plans for 10nm processors while TSMC is stepping on the gas, hoping to attract business from the likes of Apple. The first Intel 10-nm chips, code-named Cannonlake, will ship in 2017.
Looks like Moore’s Law has some life in it yet, though for IBM creating a 7nm chip required exotic techniques and materials. IBM Research showed in 2015 a 7nm chip that will hold 20 billion transistors, manufactured by perfecting EUV lithography and using silicon-germanium channels for its finned field-effect transistors (FinFETs). Intel, too, revealed that the end of the road for silicon is nearing, as alternative materials will be required for the 7nm node and beyond. Scaling silicon transistors down has become increasingly difficult and expensive, and at around 7nm it will prove downright impossible. IBM development partner Samsung is in a race to catch up with Intel by 2018, when the first 7nm products are expected. Expect silicon alternatives coming by 2020. One very promising short-term silicon alternative is III-V semiconductors based on two compounds: indium gallium arsenide (InGaAs) and indium phosphide (InP). Intel’s future mobile chips may have some components based on gallium nitride (GaN), which is also an exotic III-V material.
Silicon and traditional technologies continue to be pushed forward successfully in 2016. It seems that the extension of 193nm immersion to 7nm and beyond is possible, yet it would require octuple patterning and other steps that would increase production costs. IBM Research earlier this year beat Intel to the 7nm node by perfecting EUV lithography and using silicon-germanium channels for its finned field-effect transistors (FinFETs). Taiwan Semiconductor Manufacturing Co. (TSMC), the world’s largest foundry, said it has started work on a 5nm process to push ahead its most advanced technology. TSMC’s initial development work at 5nm may be yet another indication that EUV has been set back as an eventual replacement for immersion lithography.
It seems that 2016 could be the year for mass-adoption of 3D ICs and 3D memory. For over a decade, the terms 3D ICs and 3D memory have been used to refer to various technologies. 2016 could see some real advances and traction in the field as some truly 3D products are already shipping and more are promised to come soon. The most popular 3D category is that of 3D NAND flash memory: Samsung, Toshiba, Sandisk, Intel and Micron have all announced or started shipping flash that uses 3D silicon structure (we are currently seeing 128Gb-384Gb parts). Micron’s Hybrid Memory Cube (HMC) uses stacked DRAM die and through-silicon vias (TSVs) to create a high-bandwidth RAM subsystem with an abstracted interface (think DRAM with PCIe). Intel and Micron have announced production of a 3D crosspoint architecture high-endurance (1,000× NAND flash) nonvolatile memory.
The success of Apple’s portable computers, smartphones and tablets means the company will buy as much as 25 per cent of world production of mobile DRAM in 2016. In 2015 Apple bought 16.5 per cent of mobile DRAM output.
After the COP21 climate change summit reached a deal in Paris, environmental compliance will become a stronger business driver in 2016. Increasingly, electronics OEMs are realizing that environmental compliance goes beyond being a good corporate citizen. On the agenda for these businesses: climate change, water safety, waste management, and environmental compliance. Keep in mind environmental compliance requirements that include the Waste Electrical and Electronic Equipment (WEEE) directive, Restriction of Hazardous Substances Directive 2002/95/EC (RoHS 1), and Registration, Evaluation, Authorization and Restriction of Chemicals (REACH). It’s a legal situation: if you do not comply with regulatory aspects of business, you are out of business. Some companies are leading the parade toward environmental compliance; others are learning as they go.
Connectivity is proliferating everything from cars to homes, realigning diverse markets. It needs to be done easily for the user, reliably, efficiently and securely. It is reported that communications technologies are responsible for about 2-4% of the carbon footprint generated by human activity. The need for communications and faster speeds keeps increasing in this ever more connected world: with the penetration of smart devices, there was a tremendous increase in the amount of mobile data traffic from 2010 to 2014. Wi-Fi has become so ubiquitous in homes in so many parts of the world that you can now really start tapping into it with additional devices. With IoT forecast to reach 50 billion connections by 2020, current technologies would increase power consumption considerably. The coming explosion of the Internet of Things (IoT) will also need more efficient data centers, which will be taxed to their limits.
The Internet of Things (IoT) is enabling increased automation on the factory floor and throughout the supply chain, 3D printing is changing how we think about making components, and the cloud and big data are enabling new applications that provide an end-to-end view from the factory floor to the retail store. With all of these technological options converging, it will be hard for CIOs, IT executives, and manufacturing leaders to keep up. IoT will also be hard for R&D: Internet of Things designs mesh together several design domains in order to successfully develop a product. Individually, these design domains are challenging; bringing them all together to create an IoT product can place extreme pressure on design teams. It’s still pretty darn tedious to get all these things connected, and there are standards battles coming. The rise of the Internet of Things and Web services is driving new design principles, as Web services from companies such as Amazon, Facebook and Uber set new standards for user experiences. Designers should think about building their products so they can learn more about their users and be flexible in creating new ways to satisfy them – but in such a way that users don’t feel spied on in what they do.
Subthreshold transistors and MCUs will be hot in 2016 because the Internet of Things will be hot in 2016, and it needs very low power chips. The technology is not new – cheap digital watches have used FETs operating in the subthreshold region for decades – but digital designers have long ignored this operating region because FETs are hard to characterize there. Now subthreshold has invaded the embedded space thanks to Ambiq’s new Apollo MCU. PsiKick Inc. has designed a proof-of-concept wireless sensor node system-chip using conventional EDA tools and a 130nm mixed-signal CMOS process that operates at sub-threshold voltages, opening up the prospect of self-powering Internet of Things (IoT) systems. I expect other sub-threshold designs to emerge as well. ARM Holdings plc (Cambridge, England) is also working on sub- and near-threshold operation of ICs. TSMC has developed a series of processes characterized down to near-threshold voltages (the ULP family of ultra-low-power processes). Intel will focus on its IoT strategy and next-generation low-voltage mobile processors.
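For background on why this region is both attractive and hard to characterize, the textbook first-order model of subthreshold conduction (a generic sketch, not any vendor’s process model) is:

$$I_D \approx I_0\,\frac{W}{L}\,e^{(V_{GS}-V_{th})/(n V_T)}\left(1-e^{-V_{DS}/V_T}\right),\qquad V_T=\frac{kT}{q}\approx 26\ \mathrm{mV\ at\ room\ temperature}$$

where n is the subthreshold slope factor (typically between 1 and 2). Because drain current depends exponentially on gate voltage, threshold voltage and temperature, the currents are tiny (great for battery or harvested power), but even small die-to-die variation in V_th produces large current spreads – exactly why designers have found the region hard to work with.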
FPGAs in various forms will be more widely used in 2016 in many applications. They are no longer limited to high-end aerospace, defense, and industrial applications. There are different ways people use FPGAs. The barrier to entry for FPGA development has lowered so that even home makers can easily use FPGAs with cheap FPGA development boards, free tools and open IP cores. There was already lots of interest in 2015 in using FPGAs for accelerating computations as the next step after GPUs. Intel bought Altera in 2015 and plans to begin selling products with a Xeon chip and an Altera FPGA in a single package, possibly available in early 2016. Applications well suited to ARM-based FPGAs include industrial robots, pumps for medical devices, electric motor controllers, imaging systems, and machine vision systems. Examples of ARM-based FPGAs include Xilinx’s Zynq-7000 and Altera’s Cyclone V families. Some Internet of Things (IoT) applications could start to test ARM-based field programmable gate array (FPGA) technology, enabling the hardware to be adaptable to market and consumer demands – software updates on such systems become hardware updates. Other potential benefits would be design re-use, code portability, and security.
The trend towards module consolidation is applicable in many industries as the complexity of communication, data rates, data exchanges and networks increases. Consolidating ECUs in vehicles has already been a big trend for several years, but the concept is applicable to many markets including medical, industrial and aerospace.
It seems that AXIe nears the tipping point in 2016. AXIe is a modular instrument standard similar to PXI in many respects, but utilizing a larger board format that allows higher-power instruments and greater rack density. It relies chiefly on the same PCI Express fabric for data communication as PXI. AXIe-1 is the high-end modular standard, and there is also a compatible AXIe-0 that aims at being a low-cost alternative. The popular measurement standards AXIe, IVI, LXI, PXI, and VXI have two things in common: they each manage standards for the test and measurement industry, and each of those standards is ruled by a private consortium. Why is this? Right or wrong, it comes down to speed of execution.
These days, a hardware emulator is a stylish, sleek box with fewer cables to manage. The “Big Three” EDA vendors offer hardware emulators in their product portfolios, each with a distinct architecture to give development teams more options. For some offerings emulation has become a datacenter resource through a transaction-based emulation mode or acceleration mode.
LED lighting is expected to become more intelligent, more beautiful, and more affordable in 2016. Everyone agrees that the market for LED lighting will continue to enjoy dramatic year-on-year growth for at least the next few years: the LED lighting market is forecast to reach US$30.5 billion in 2016, and professional lighting markets to see explosive growth. Some companies will win on this growth, but there are also losers. Due to currency fluctuations and the price slide in 2015, end-market demand in different countries has been much lower than expected, so smaller LED companies are facing financial loss pressures. Look at the history of the solar industry to get a good sense of some of the challenges the LED industry will face; a next bankruptcy wave in the LED industry is possible. The LED incandescent replacement bulb market represents only a portion of a much larger market but, in many ways, it is the cutting edge of the industry, currently dealing with many of the challenges other market segments will have to face a few years from now. IoT features are coming to LED lighting, but it seems that one can only hope for interoperability.
Other electronics trends articles to look at:
Hot technologies: Looking ahead to 2016 (EDN)
CES Unveiled NY: What consumer electronics will 2016 bring?
Analysts Predict CES 2016 Trends
LEDinside: Top 10 LED Market Trends in 2016
961 Comments
Tomi Engdahl says:
Uncertainty Rocks Chip Market
http://semiengineering.com/broader-shifts-more-uncertainty/
Semiconductor industry grapples with changes in markets, technologies and economics.
The semiconductor industry is undergoing sweeping changes in every direction, making it far more difficult to figure out which path to take next, when to take it, and how to get there.
The next few years will redefine which semiconductor companies emerge as leaders, which ones get pushed down or out or absorbed into other companies, and which markets will be the most lucrative. And that could change again as technology is used to interconnect markets that have so far never interacted with each other under the broad banner of the Internet of Things.
“Historically, the semiconductor industry replaces about half of the top 10 companies every 20 to 30 years,” said Wally Rhines, chairman and CEO of Mentor Graphics. “Most of those replacements will be new companies.”
Tomi Engdahl says:
Wideband testing of satellites
http://www.edn.com/electronics-blogs/out-of-this-world-design/4442301/Wideband-testing-of-satellites?_mc=NL_EDN_EDT_EDN_analog_20160630&cid=NL_EDN_EDT_EDN_analog_20160630&elqTrackId=ab5ca12a57644c38bce0f201bbbdc9eb&elq=72b36055818a46ddb7f831bd5ac6806e&elqaid=32894&elqat=1&elqCampaignId=28724
To deliver the next generation of satellite services, spacecraft operators are increasingly using larger bandwidths at higher frequencies. Characterising transponder performance such as SNR, SFDR, and flatness over hundreds of MHz or several GHz can be very difficult for OEMs and equally challenging for suppliers of test and measurement equipment.
Wide bandwidths are sometimes split into multiple channels, and dynamic range problems occur when non-linearities (e.g., in amplifiers, ADCs, and DACs) generate intermodulation products between the input frequencies. These new frequencies can appear within the bandwidths of other channels, causing distortion.
Noise power ratio (NPR) is a wideband test which measures the ‘quietness’ of an unused channel, accounting for intermodulation distortion products generated by non-linearities within the signal chain.
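In its usual form (my summary of the standard definition, not taken from the article), NPR compares the noise power spectral density loading the band against the density that intermodulation products leave inside the notched, unused channel:

$$\mathrm{NPR} = 10\log_{10}\!\left(\frac{S_{\mathrm{loaded}}}{S_{\mathrm{notch}}}\right)\ \mathrm{dB}$$

A perfectly linear, noiseless chain would leave the notch empty; the measured NPR is limited by how much intermodulation and noise the signal chain pours into it.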
Tomi Engdahl says:
Not-So Quiet Race in Wide Band Gap Semiconductor
http://www.eetimes.com/author.asp?section_id=36&doc_id=1330000&
Wide band gap semiconductor materials (diamond, silicon carbide, and gallium nitride) are well positioned to play important roles in the next and future generations of consumer and military/defense electronics capability.
With focused initiatives in place such as the ‘Materials Genome Initiative’ and ‘Power America’, and a resurgence in advanced materials manufacturing capability, wide band gap semiconductor materials—such as diamond, silicon carbide (SiC), and gallium nitride (GaN)—are well positioned to play important roles in the next and future generations of consumer and military/defense electronics capability. Despite big backers like the White House and U.S. Department of Energy, the present capability and technology development roadmap continue to remain largely unknown to the broader scientific and engineering community (not to mention main street America).
Tomi Engdahl says:
Averna Acquires Nexjen, Goes Deeper in U.S. Test Market
http://www.eetimes.com/document.asp?doc_id=1330033&
Averna, a Montreal-based test-system integrator and test-services company, has acquired test-system integrator Nexjen Systems, based in Charlotte, North Carolina. The acquisition gives Averna a stronger foothold in automated testing in the southeastern United States. Both companies have had strong affiliations with National Instruments through its Alliance Program.
Tomi Engdahl says:
New ultra-miniature high-density connector
http://www.fischerconnectors.com/us/en/new-ultra-miniature-connector-minimax-06
Fischer Connectors continues to drive the evolution towards combined signal and power connectors as a way of making electronics lighter and smaller. MiniMax 06 integrates fully into the ultra-miniature high-performance Fischer MiniMax™ Series, well-known for its use in limited space and lightweight applications, and for meeting the combined needs of multiple signals and power. Tested for high-speed protocols such as HDMI and data transfer up to 10Gb/s, the series also addresses the growing market need for higher data transmission rates.
Small, smaller, smallest: new ultra-miniature high-density connector
http://www.fischerconnectors.com/global/en/news/small-smaller-smallest-new-ultra-miniature-high-density-connector
Fischer Connectors has just launched MiniMax 06, a new ultra-miniature connector that expands its high-density Fischer MiniMax™ Series.
The new connector can include up to 12 power and signal contacts in a footprint of only 10 mm. This corresponds to a density factor* of 0.83 – a unique technological feature for a connector with standard 0.5 mm contacts. The new MiniMax 06 connector also includes a new unique configuration with 2 signal and 2 high-power 1.3 mm contacts for applications needing 10 Amps or more power, doubling the standard 5 Amps current rating of the existing MiniMax product line.
These connectors are also rugged, with IP68 sealing (2m/24h), both mated and unmated, an unbreakable keying system, and over-molded cable assemblies. Available in three locking systems (push-pull, screw and quick-release), they are easy to connect and disconnect up to 5,000 times.
Tomi Engdahl says:
Dissecting Phase Change Memory: Atom by Atom, Bond by Bond
http://www.eetimes.com/author.asp?section_id=36&doc_id=1330002&
The innovative application of a mix of simulation techniques has provided a team at IBM with a unique ability to view the connection between atomic bond type, drift and electrical conductivity in PCM devices. Results overturn some old ideas of band gap expansion.
In a recently published paper, Federico Zipoli and colleagues Daniel Krebs and Alessandro Curioni of IBM Zurich have adopted a different and more fundamental approach to understanding the problems of drift and resistance changes in phase change memory (PCM).
Tomi Engdahl says:
DRAM and blast it: Micron staff face axe after flash woes
Seagate eSSD deal faces problems
http://www.theregister.co.uk/2016/07/04/micron_staff_face_axe_through_flash_woes/
Micron made a loss of $215m in its latest quarter, job losses are coming, and the Seagate eSSD partnership hasn’t taken off.
Its fiscal 2016 third-quarter revenues were $2.9bn, compared to $3.85bn a year ago and $2.93bn in the previous quarter, when it made a $97m loss. Things have gotten a whole lot worse since then on the price-mix front.
It plans to lower its staff count by about 2,400 jobs – about 7.5 per cent.
The damage was done by a fall in NAND sales, to around $900m, with trade units down 10 per cent compared to the second quarter and average selling price down 6 per cent. Memory (DRAM) sales grew.
Micron CEO Mark Durcan’s canned statement reads: “Although we have made good progress in deploying our advanced DRAM and NAND technologies, we continue to face challenging market conditions.”
Tomi Engdahl says:
Qualcomm remains the clear leader in LTE modems
Forward Concepts has ranked last year’s LTE modem chip suppliers. Qualcomm is still the clear leader, holding 66 percent of the $18.8 billion market. In practice, two out of three LTE modems came from Qualcomm.
The smartphone is still by far the largest target device for LTE modems.
Behind Qualcomm, MediaTek is the second-largest supplier with a 19 per cent market share. Spreadtrum has a market share of six per cent, and Samsung and HiSilicon share fourth place with three per cent each.
Marvell, Leadcore, and Intel each managed last year to capture about a one per cent slice of LTE modem sales.
Source: http://etn.fi/index.php?option=com_content&view=article&id=4655:qualcomm-edelleen-selva-ykkonen-modeemeissa&catid=13&Itemid=101
Tomi Engdahl says:
Push-in and Plug-in Connectors for Circuit Breakers
https://www.eeweb.com/news/push-in-and-plug-in-connectors-for-circuit-breakers
E-T-A Circuit Breakers is pleased to announce the availability of two new connectors for its well-known 3120 circuit breaker product series – making E-T-A the first circuit breaker manufacturer to offer customers a circuit breaker/switch combination with push-in terminal (3120-PT) or plug-in (Y31214001) wire termination options when installing circuit protection into their designs.
Both connectors provide users with tool-free installation and faster equipment wiring which reduces the installation time and production cost.
The push-in terminal connector (3120-PT) is sold as a single component with the appropriate 3120 configurations. It is designed for solid wire, or for wire ferrules on stranded wire, to ensure a correct connection. When the wire is pushed in, the contact spring opens automatically and provides the required force for a secure connection – 100% protected against brush contact. In the event the circuit breaker needs to be replaced, the wires are easily released with a standard screwdriver.
Tomi Engdahl says:
LCD Biasing and Contrast Control Methods
https://www.eeweb.com/company-blog/microchip/lcd-biasing-and-contrast-control-methods/
This article discusses a variety of methods in biasing Liquid Crystal Displays (LCDs) focusing on employing PIC microcontrollers with an LCD controller.
LCD types and LCD waveforms determine the type of biasing that is required. There are different kinds of LCD biasing, based on the construction.
There are two electrodes where the LCD waveforms are driven; they are called SEGMENTs (SEGs) and COMMONs (COMs). The LCD requires an AC waveform to be applied between these electrodes.
Tomi Engdahl says:
Synchronizing sample clocks of a data converter array
http://www.edn.com/design/analog/4442297/Synchronizing-sample-clocks-of-a-data-converter-array?_mc=NL_EDN_EDT_EDN_today_20160704&cid=NL_EDN_EDT_EDN_today_20160704&elqTrackId=9807e163730c459caffcf68389cdee9d&elq=a8b5439cf1f2476a966a3b461a37b8da&elqaid=32931&elqat=1&elqCampaignId=28758
The requirements of higher system bandwidth and resolution in a variety of applications from communications infrastructure to instrumentation drive up the demand for connecting multiple data converters in an array form. Designers must find low noise and high accuracy solutions to clock and synchronize a large array of data converters using the common JESD204B serial data converter interface.
Clock generation devices containing jitter attenuation functions, internal VCOs, a multitude of outputs, and many synchronization management functions are now coming to market to address this system problem. In many real-life applications, however, the sheer number of required clocks in a data converter array exceeds what may be feasible to obtain from a single IC component. Designers often resort to connecting multiple clock generation and clock distribution components together, thus creating a broad “clock tree”.
This article provides a real-life case study of how to build a flexible and re-programmable clock expansion network that not only maintains excellent phase noise/jitter performance, but also passes on the required synchronization information from the first device of the clock tree to the last with deterministic control.
Tomi Engdahl says:
Zero-mask-adder NVM vs. Embedded flash
http://www.edn.com/electronics-blogs/ic-designer-s-corner/4442292/Zero-mask-adder-NVM-vs–Embedded-flash?_mc=NL_EDN_EDT_EDN_today_20160705&cid=NL_EDN_EDT_EDN_today_20160705&elqTrackId=a7e97428c09b4587983f80e8dca27bc6&elq=ea3cd78c37fc481dbd2eaea7b34d6afe&elqaid=32939&elqat=1&elqCampaignId=28766
Statistics show the total amount of on-chip memory continues to grow at a rapid pace, occupying a significant portion of system-on-chip (SoC) die area as well as consuming a big piece of the overall power budget. For example, the automotive memory IC market is expected to more than double, going from $1.6B in 2015 to $3.5B in 2018 (Source: IC Insights 2015). This growth is driven by consumer demand for convenience, additional capabilities, compliance with safety standards, sustainability, etc.
The growth in memory, including reprogrammable non-volatile memory (NVM), is apparent across all types of market segments. Figure 1 shows the two established and mature reprogrammable product categories within NVM – embedded flash and traditional logic-based multi-time programmable. A cost-effective option within the logic-based multi-time programmable category is the zero-mask-adder NVM solution.
Despite several emerging NVM technologies, embedded flash remains the technology of choice for applications that require a few hundred to tens of thousands of cycles of reprogrammability. In addition, embedded flash is an attractive choice for chip designers because of its compact footprint and widespread availability in popular processes and foundries.
Tomi Engdahl says:
3D-printed micro-optics
http://scottadams-tttt.tumblr.com/post/146766594801/top-tech-256-space-sensors-micro-optics
Researchers at the University of Stuttgart in Germany have used ultrashort laser pulses to create optical lenses “which are hardly larger than a human hair.”
The 3D-printed lenses will “permit the construction of novel and extremely small endoscopes which are suited for smallest body openings or machine parts that can be inspected,” Nature reports.
The scientists also printed optical free form surfaces and miniature objectives directly onto CMOS image chips, and combined the optics with LED illumination systems.
Two-photon direct laser writing of ultracompact multi-lens objectives
http://www.nature.com/nphoton/journal/vaop/ncurrent/full/nphoton.2016.121.html
Tomi Engdahl says:
IC Debugging: Simulation vs. Lab validation
http://www.edn.com/electronics-blogs/day-in-the-life-of-a-chip-designer/4442256/IC-Debugging–Simulation-vs–Lab-validation?_mc=NL_EDN_EDT_EDN_today_20160706&cid=NL_EDN_EDT_EDN_today_20160706&elqTrackId=adbd99836cf04db8912ecc67c02a4029&elq=3594814851a6412cb5909f7011d9885d&elqaid=32973&elqat=1&elqCampaignId=28794
Design verification has been the biggest challenge for any IC design company for decades now. EDA tools and verification methods have evolved a lot over the last two decades, but along with that, the size and complexity of chips has also increased. Effectively, we don’t see measurable time savings in the design verification phase of an ASIC or FPGA cycle. The time it takes to complete the functional verification phase is still around 70% for most complex ASICs.
Now the question that arises is: within a given verification cycle, what activities consume the majority of time? The common answer that we get from DV engineers is “debugging”. The kinds of issues faced differ during different sub-phases of the DV cycle. In this article, we will talk about the kind of challenges we have been facing at eInfochips in our projects on post-silicon lab validation phases of ASIC cycles. We will compare it with functional simulation wherever required.
Tomi Engdahl says:
Transmitter FFE makes the channel do the work
http://www.edn.com/electronics-blogs/measure-of-things/4442311/Transmitter-FFE-makes-the-channel-do-the-work?_mc=NL_EDN_EDT_EDN_today_20160706&cid=NL_EDN_EDT_EDN_today_20160706&elqTrackId=3936deee61a042ffa5ff3755929e69f9&elq=3594814851a6412cb5909f7011d9885d&elqaid=32973&elqat=1&elqCampaignId=28794
At high data rates, around and above 10 Gbits/s, we have to face the reality that conducting traces glued to dielectric (a.k.a., printed circuit boards) are truly horrible waveguides. They attenuate the signal, whether it’s NRZ/PAM-2 or PAM-4, and mess up the relationships between the Fourier components’ amplitudes, frequencies, and phases—everything that makes the waveform a signal.
We must do a lot of work to help the receiver recognize the resulting waveform as a signal. In addition to careful layout and use of quality components—all in a cost-optimized way, of course—equalization does a lot of work.
Equalization effectively removes the channel response by applying the inverse transfer function of the channel to the signal.
Here’s another way to think of it that I find more intuitive: the channel distorts the signal, so why not pre-distort the signal in such a way that the channel itself removes that distortion? In other words, pre-distort the transmitted signal in a way that includes the inverse channel frequency response so that the channel cancels the pre-distortion.
TxFFE and the receiver DFE (decision feedback equalizer) work together.
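As a toy illustration of that pre-distortion idea (a minimal sketch with made-up 3-tap coefficients; real tap weights are tuned or negotiated for the actual channel), a transmitter FFE is just an FIR filter applied to the symbol stream before it hits the trace:

```python
# Transmitter feed-forward equalization (FFE) sketch: pre-distort NRZ
# symbols with an FIR filter so the channel itself undoes the distortion.
import numpy as np

def tx_ffe(symbols, taps):
    """Apply pre-cursor/main/post-cursor taps to the symbol stream."""
    return np.convolve(symbols, taps, mode="same")

bits = np.random.randint(0, 2, 16)
nrz = 2.0 * bits - 1.0                 # map {0,1} to {-1,+1} levels
taps = np.array([-0.1, 0.8, -0.1])     # hypothetical pre/main/post taps
print(tx_ffe(nrz, taps))               # pre-emphasized waveform samples
```

The negative pre- and post-cursor taps boost the transitions relative to the steady levels, which is exactly the inverse-channel shaping the article describes.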
Inter-symbol interference and ziplines
http://www.edn.com/electronics-blogs/all-aboard-/4423043/Inter-symbol-interference-and-ziplines
When we think of digital signals, we naturally think of logic highs and lows, signals rooted in the Fourier components of a square wave. Of course, we also know it’s not like that.
At gigabit-per-second rates, conducting traces on PCBs act more like ugly waveguides than tidy paths from transmitter to receiver. Rather than a current propagating along a conductor, these signals are electromagnetic waves that travel through the dielectric, barely clinging to the trace.
But it’s worse than that. As you zip through the forest, your body smears out and interferes with the bits ahead of and behind you. You absorb elements of other bits; you and the other bits on this trace interfere with each other.
Tomi Engdahl says:
Just the facts, STT-MRAM: Your DRAM replacement’s on its way
Fast switching of super-small magnetic tunnel junctions
http://www.theregister.co.uk/2016/07/07/dram_replacement_sttmram_comes_closer/
Spin Transfer-Torque Magnetic RAM (STT-MRAM) is a future DRAM replacement candidate. It uses one of two different spin directions of electrons to signal a binary one or zero.
IBM and Samsung scientists have published an IEEE paper* demonstrating switching MRAM cells for 655 devices with diameters ranging from 50 down to 11 nanometers in just 10 nanoseconds using only 7.5 microamperes. They say it is a significant achievement towards the development of Spin Torque MRAM.
The authors write that “in the early stages of the development of new STT-MRAM materials, the yields of ultra-small devices are low and standard memory tests are difficult.”
Tomi Engdahl says:
Cars, Security, And HW-SW Co-Design
http://semiengineering.com/cars-security-and-hw-sw-co-design/
Experts at the table, part 1: Hardware and software must be developed at the same time these days to shorten the time-to-market for advanced devices and electronics.
SE: Hardware-software co-design has been a topic of discussion for many years, trying to get hardware and software together. The whole hardware/software co-design concept goes back to the 1990s. Software and hardware have to be built concurrently these days. What trends and ideas do you think are important now?
Stahl: It’s interesting that you mention that term, hardware/software co-design. It’s really an old term. We can do both. It’s really people with different job functions that do different pieces of any given embedded device. You have to make people work together, but they have distinct job functions – software developer, there’s a hardware architect, there are hardware designers. They’re using some tools to communicate. What co-design said at the time is let’s try to design things together, so they’ll work well together, but also let’s try to save some time. Today, you find more people saying we want to change the project schedule a little bit by making the software earlier available or developing software earlier. The co-design term is really gone.
Tomi Engdahl says:
Where zero IF wins: 50 percent smaller PCB footprint at 1/3 the cost, Part 1
http://www.edn.com/design/analog/4442330/Where-zero-IF-wins–50-percent-smaller-PCB-footprint-at-1-3-the-cost–Part-1?_mc=NL_EDN_EDT_EDN_analog_20160707&cid=NL_EDN_EDT_EDN_analog_20160707&elqTrackId=38ec636f406044239ac9919caf4a9af1&elq=717ccfd2b3624ee48c1d2745e9cf1859&elqaid=32988&elqat=1&elqCampaignId=28811
Zero-IF (ZIF) architectures have been around since the early days of radio. Today the ZIF architecture can be found in nearly all consumer radios, whether television, cellphones or Bluetooth devices. The key reason for this wide adoption is that it has proven time and again to offer the lowest-cost, lowest-power and smallest-footprint solution in any radio technology. Historically, this architecture has been withheld from applications that demand high performance. However, with the demand for wireless growing around us and the spectrum rapidly crowding, a change is required in order to continue economically deploying radios in the infrastructure that supports our wireless needs.
Contemporary Zero-IF architectures can satisfy these needs as many of the impairments normally associated with these architectures have been resolved through a combination of process, design, partitioning and algorithms.
Zero-IF receivers can also cover a very broad range of RF frequencies simply by changing the local oscillator. Zero-IF transceivers provide a truly broadband experience, with typical coverage continuous from several hundred megahertz up to around 6 GHz. Without fixed filters, truly flexible radios are possible, greatly reducing and possibly eliminating the effort required to develop band variations of the radio design. Because of the flexible digitizers and programmable baseband filters, zero-IF designs not only deliver high performance, but also significant flexibility in adapting to a wide range of frequencies and bandwidths while maintaining nearly flat performance without the need to optimize analog circuits (filters, etc.) for each configuration – true software-defined radio (SDR) technology.
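The core of the architecture fits in a few lines: mix the RF input with a quadrature local oscillator at the carrier frequency, and the wanted signal lands directly at a 0 Hz IF as complex baseband. Here is a minimal numerical sketch with ideal, illustrative values (real designs must also handle DC offset, I/Q imbalance and LO leakage, the classic ZIF impairments the article alludes to):

```python
# Zero-IF (direct-conversion) I/Q downmix sketch with an ideal LO.
import numpy as np

fs, fc = 1e6, 100e3                      # sample rate, carrier (illustrative)
t = np.arange(1000) / fs
rf = np.cos(2 * np.pi * fc * t + 0.5)    # received carrier, 0.5 rad offset

i = rf * np.cos(2 * np.pi * fc * t)      # in-phase mix
q = rf * -np.sin(2 * np.pi * fc * t)     # quadrature mix
baseband = i + 1j * q                    # complex baseband at 0 Hz IF

# Averaging acts as a crude low-pass filter, removing the 2*fc mixing
# product; the recovered angle is the carrier's 0.5 rad phase offset.
print(np.angle(np.mean(baseband)))
```

Retuning is just changing the LO frequency, which is why the same hardware can cover such a broad RF range.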
Tomi Engdahl says:
Financial Times:
SoftBank agrees to acquire ARM Holdings for £24.3B at £17 per share, a 43% premium over last week’s closing price — Japan’s SoftBank has agreed to acquire Arm Holdings, the UK’s pre-eminent technology company, for £24.3bn in an enormous bet by the Japanese telecoms group …
SoftBank to acquire UK’s Arm for £24.3bn
Arash Massoudi, James Fontanella-Khan, Richard Waters, Lauren Fedor
http://www.ft.com/intl/cms/s/0%2F235b1af4-4c7f-11e6-8172-e39ecd3b86fc.html#axzz4EkmYVs8V
Japan’s SoftBank has agreed to acquire Arm Holdings, the UK’s pre-eminent technology company, for £24.3bn, in an enormous bet by the Japanese telecoms group that the smartphone chip designer will make it a leader in one of the next big markets, the “internet of things”.
The takeover of Cambridge-based Arm, which was founded 25 years ago and now employs 4,000 people, will be the largest acquisition of a European technology business.
The fall in sterling following the June 23 referendum has left the UK currency nearly 30 per cent lower against the Japanese yen over the past year, making Arm an attractive target.
“I did not make the investment because of Brexit. The paradigm shift is the opportunity. I would have made this decision regardless,”
Arm’s business model has relied on licensing its technology to other hardware makers including Apple and Samsung Electronics, giving it a near-ubiquitous presence in mobile devices. It reaps a small royalty amount for each device, relying on very large volumes.
As purely a designer of chips rather than a manufacturer, Arm’s intellectual property model leaves it with a high profit margin. However, its revenues of about £1bn last year made it a minnow by global chip standards.
The purchase price is equivalent to 70 times its net income last year (implying net income of roughly £350m against the £24.3bn offer)
Tomi Engdahl says:
SoftBank Bought ARM
http://hackaday.com/2016/07/18/softbank-bought-arm/
$32 billion USD doesn’t buy as much as it used to. Unless you convert it into British Pounds, battered by the UK’s decision to leave the European Union, and make an offer for ARM Holdings. In that case, it will buy you our favorite fabless chip-design company.
The company putting up 32 Really Big Ones is Japan’s SoftBank, a diversified technology conglomerate. SoftBank is most visible as a mobile phone operator in Japan, but their business strategy lately has been latching on to emerging technology and making very good investments. (With the notable exception of purchasing the US’s Sprint Telecom, which they say is turning around.) Recently, they’ve focused on wireless and IoT. And now, they’re going to buy ARM.
We suspect that this won’t mean much for ARM in the long term. SoftBank isn’t a semiconductor firm, they just want a piece of the action. With the Japanese economy relatively stagnant, a strong Yen and a weak Pound, ARM became a bargain.
Tomi Engdahl says:
Cypress Aims Low-Pin Memory at Automotive, IoT
http://www.eetimes.com/document.asp?doc_id=1330151&
Two attributes that are considered highly desirable in memory for the burgeoning automotive and Internet of Things (IoT) markets are a small footprint and low power consumption, but in the automotive market in particular, simpler is also better.
Cypress Semiconductor’s recent update to its HyperRAM DRAM device addresses the automotive industry’s preference for fewer moving pieces with a lower pin count. The company is now sampling its new high-speed, self-refresh DRAM based on its low-pin-count HyperBus interface.
The 64Mb HyperRAM is designed to serve as an expanded scratchpad memory for rendering of high-resolution graphics or calculations of data-intensive firmware algorithms in a wide array of automotive, industrial and consumer applications.
The embedded-systems-focused Cypress is seeing more demand for higher-performance interfaces, said Hoehler, but with lower pin counts, both in automotive applications and IoT, particularly industrial IoT, where more processing power is required from a microcontroller: “They are running out of on-chip RAM, but they want to keep a small pin count and a small PCB with high read and write bandwidth.”
He said designers are looking to reduce overall system cost by combining DRAM and flash in a single package if possible.
Tomi Engdahl says:
By 2040, computers will need more electricity than the world can generate
So says the semiconductor industry’s last ever communal roadmap
http://www.theregister.co.uk/2016/07/25/semiconductor_industry_association_international_technology_roadmap_for_semiconductors/
Without much fanfare, the Semiconductor Industry Association earlier this month published a somewhat-bleak assessment of the future of Moore’s Law – and at the same time, called “last drinks” on its decades-old International Technology Roadmap for Semiconductors (ITRS).
The industry’s been putting together the roadmap every two years since the 1990s, when there were 19 leading-edge chip vendors. Today, there are just four – Intel, TSMC, Samsung and Global Foundries – and there’s too much road to map, so the ITRS issued in July will be the last.
The group suggests that the industry is approaching a point where economics, rather than physics, becomes the Moore’s Law roadblock. The further below 10 nanometres transistors go, the harder it is to make them economically.
That will put a post-2020 premium on stacking transistors in three dimensions without gathering too much heat for them to survive.
Tomi Engdahl says:
Mergers and Acquisitions: Analog and Linear
http://hackaday.com/2016/07/26/mergers-and-acquisitions-analog-and-linear/
Analog Devices and Linear Technology have announced today they will combine forces to create a semiconductor company worth $30 Billion.
This news follows the very recent acquisition of ARM Holdings by Japan’s SoftBank, and the earlier mergers, purchases or acquisitions of ON Semiconductor and Fairchild, Avago and Broadcom, NXP and Freescale, Microchip and Atmel, Intel and Altera, and a few more we’re forgetting at the moment.
Both Analog and Linear address similar markets; Analog Devices is best known for amps, interface, and power management ICs. Linear, likewise, isn’t known for ‘fun’ devices, but without their products the ‘fun’ components wouldn’t work. Because the product lines are so complementary, the resulting company stands to save $150 million annually after the deal closes.
Analog Devices and Linear Technology to Combine Creating the Premier Analog Technology Company
http://www.analog.com/en/about-adi/news-room/press-releases/2016/7-26-2016-adi-and-linear-technology-to-combine.html
Tomi Engdahl says:
Electronics webshop Premier Farnell will not go to Dätwyler, the owner of ELFA, but will instead be transferred to America’s Avnet. Avnet’s offer of £691 million clearly exceeded the Swiss company’s offer announced at the start of the summer.
American Avnet is, alongside Arrow, one of the largest distributors of electronics components and parts. The company has operated in Finland as Avnet itself, but also through its EBV distribution company.
English Premier Farnell began as a mail-order company and has invested heavily in its web shop. The company is also known as the manufacturer of the small Raspberry Pi boards.
Source: http://www.uusiteknologia.fi/2016/07/29/farnell-meneekin-avnetille/
Tomi Engdahl says:
Printed photograph works as a solar cell
Aalto University researchers have developed patterned solar cells. The research results open up new opportunities for the development of photovoltaic cells integrated into products and buildings. Solar cells have long been made from inexpensive materials by various printing techniques.
The most efficient solar cells are jet black in color. Printing techniques are particularly suitable for organic solar cells and dye solar cells.
The Aalto University researchers’ idea of a colored, patterned solar cell is to combine, on the same surface, light harvesting with other features, such as visual information or graphics.
In the inkjet solar cell, the electricity-generating dye serves as the toner used to print the selected image file, so the shape, density and light transmittance of different image regions can be adjusted accurately.
“Colored inkjet solar cells were as effective and durable as the equivalent cells made in the traditional way. They lasted for more than a thousand hours of continuous light and heat stress tests without any sign of decline in efficiency.”
The best-performing dye and electrolyte used in the study were obtained from a research team at the Swiss Ecole Polytechnique Fédérale de Lausanne.
The study results open new possibilities for the development of decorative, product-integrated and building-integrated photovoltaics.
Source: http://www.uusiteknologia.fi/2016/07/26/tulostettu-valokuva-toimii-aurinkokennona/
Tomi Engdahl says:
PADS update integrates more analysis
http://www.edn.com/electronics-products/other/4442352/PADS-update-integrates-more-analysis?_mc=NL_EDN_EDT_EDN_today_20160713&cid=NL_EDN_EDT_EDN_today_20160713&elqTrackId=3db4456a304c4e1dac9fc69f8b06f1f1&elq=183f8c45a0d54dcaaf332a5c0cfea6b3&elqaid=33057&elqat=1&elqCampaignId=28885
The latest PADS update includes DDR and AMS (analog/mixed-signal) simulation capabilities and HyperLynx DRC SI tools.
AMS combines VHDL-AMS, SPICE, and HyperLynx.
Depending on selected options, PADS’ cost ranges from $3,560 to $4,995 (USD).
https://www.pads.com/
Tomi Engdahl says:
10 predictions for the next 60 years in power electronics
http://www.edn.com/design/power-management/4442335/10-predictions-for-next-60-years-in-power-electronics?_mc=NL_EDN_EDT_EDN_today_20160713&cid=NL_EDN_EDT_EDN_today_20160713&elqTrackId=883fd3ce837848868ebd4a29d2233c6d&elq=183f8c45a0d54dcaaf332a5c0cfea6b3&elqaid=33057&elqat=1&elqCampaignId=28885
EDN is 60 years old this year, and so much has happened in electronics in that time. In this article we’ll take a look at what the power electronics world will hold for us in 2076, from alternative energy sources to diamond technology and the next generation of GaN and SiC. Click through and let us know your predictions in the comments below.
Nuclear-powered mustard-seed-sized batteries
Thermoelectric generators have an interesting past.
Of course extreme miniaturization needs to happen as well as protection of humans from radiation, but by 2076, I predict we will have another efficient, long-lasting source of energy as small as a mustard seed to power electronic circuitry.
He3 as an energy source
an unusual element found in great quantities in the lunar soil, known as Helium-3 (He3). I predicted that this would be an excellent solution to Earth’s energy problems. It seems many have joined in my prediction since then.
US officials were not interested in Schmitt’s idea to use Helium-3 as a fuel source on Earth. China, India, and Russia, however, are all interested in mining this element, and the European Space Agency (ESA) has also shown interest.
Tomi Engdahl says:
The next 60 years in consumer electronics
http://www.edn.com/design/consumer/4442358/The-next-60-years-in-consumer-electronics?_mc=NL_EDN_EDT_EDN_consumerelectronics_20160713&cid=NL_EDN_EDT_EDN_consumerelectronics_20160713&elqTrackId=fff640ec5b794bc4a6810c0fdca05800&elq=f746e1aac2e74956812d32464dc02f4f&elqaid=33048&elqat=1&elqCampaignId=28879
Tomi Engdahl says:
Happy anniversary, EDN: you’re old
http://www.edn.com/electronics-blogs/measure-of-things/4442298/Happy-anniversary-EDN–you-re-old
Tomi Engdahl says:
Regulators come in SnPb BGA packages
http://www.edn.com/electronics-products/other/4442347/Regulators-come-in-SnPb-BGA-packages?_mc=NL_EDN_EDT_EDN_today_20160712&cid=NL_EDN_EDT_EDN_today_20160712&elqTrackId=5932943a2a2d4255aa1484f0e6023a71&elq=eb3be2789ccf4affa783eda2c13077d7&elqaid=33041&elqat=1&elqCampaignId=28867
Linear Technology offers 53 µModule power products for use in applications where tin-lead soldering is preferred, such as defense, avionics, and heavy equipment industries. According to the manufacturer, a µModule point-of-load regulator in a SnPb (tin-lead) BGA package simplifies PC board assembly for suppliers in these industries.
Lead-free versions of these devices and more than 100 µModule products in both BGA and LGA packages are also available.
Tomi Engdahl says:
Isolated amplifier minimizes error
http://www.edn.com/electronics-products/other/4442307/Isolated-amplifier-minimizes-error?_mc=NL_EDN_EDT_EDN_productsandtools_20160711&cid=NL_EDN_EDT_EDN_productsandtools_20160711&elqTrackId=f8ebeb4c12a1481aaf031e5397a75b5a&elq=87066f29b8e2450ca51c0a2cea99cb8a&elqaid=33033&elqat=1&elqCampaignId=28859
A 7-kV reinforced isolated amplifier, the AMC1301 from Texas Instruments, provides low offset error and drift of ±200 µV at 25°C and ±3 µV/°C, respectively. This rugged device has a working breakdown voltage of 1000 VRMS for a minimum insulation barrier lifetime of 64 years, exceeding VDE 0884-10 safety requirements.
The AMC1301 comes in a 5.85×7.5-mm SOIC package and costs $2.90 each in lots of 1000 units. A $49 evaluation module is available, as well as a SPICE simulation model and a reference design to help designers verify board-level signal-integrity requirements.
http://www.ti.com/product/AMC1301?HQS=hpa-pa-dsig-amc1301-pr-pf-null-wwe
AMC1301 Evaluation Module
http://www.digikey.it/catalog/en/partgroup/amc1301-evaluation-module/62254
AMC1301EVM Evaluation Module using the AMC1301 Precision, ±250mV Input, 3µs Delay, Reinforced Isolated Amplifier
Tomi Engdahl says:
PCB Design Basics: Example design flow
http://www.edn.com/design/pc-board/4426878/PCB-Design-Basics–Example-design-flow?_mc=NL_EDN_EDT_pcbdesigncenter_20160711&cid=NL_EDN_EDT_pcbdesigncenter_20160711&elqTrackId=f72b276ae8274c27b57ed9e8a0d88efa&elq=be2ca2fef9c044df9eb3879aaa940e63&elqaid=33024&elqat=1&elqCampaignId=28850
This series of articles discusses the different steps of PCB development from the basics of creating a design schematic with specific requirements, to finalizing a board and preparing it for fabrication. The articles are written in the context of the National Instruments circuit design tools NI Multisim and NI Ultiboard.
Tomi Engdahl says:
200mm Equipment Shortfall
http://semiengineering.com/200mm-equipment-shortfall/
Older equipment is now very much in demand due to shifts in end markets and new options for packaging.
A surge in demand for consumer electronics, communications ICs, sensors and other products has created a shortage in 200mm fab capacity that shows no signs of abating.
None of these chips need to be manufactured using the most advanced processes, and there have been enough tweaks to processes at established nodes to eke even more out of existing processes. But that has left chipmakers struggling to procure 200mm equipment for those fabs as demand for chips at these older nodes continues to rise.
“Core” is a term that refers to a used piece of equipment that must be refurbished by an OEM, third-party refurbishing house or end-user.
By some estimates, there are somewhere between 600 and 720 pieces of used 200mm equipment, or cores, in inventory or available for purchase on the open market today. However, there is demand for about 1,000 200mm cores among device makers today.
Tomi Engdahl says:
Wireless networking will cover the world
http://www.edn.com/design/wireless-networking/4442396/Wireless-networking-will-cover-the-world?_mc=NL_EDN_EDT_EDN_weekly_20160721&cid=NL_EDN_EDT_EDN_weekly_20160721&elqTrackId=18e45b258bec4aac9c341b86ceaa4095&elq=9ad6a982e69743dbabc001cd93609f6f&elqaid=33150&elqat=1&elqCampaignId=28979
The next 60 years of wireless and networking technologies will be exponentially more exciting than the first 60 years. As radio frequency (RF) bandwidth becomes consolidated under the banner of the worldwide right of every citizen to connectivity, the technologies of photonic LiFi, peer-to-peer communications, and low-orbit satellite integration for back-haul will unify the Earth.
Tomi Engdahl says:
The future of IC design
http://www.edn.com/design/integrated-circuit-design/4442375/The-future-of-IC-design?_mc=NL_EDN_EDT_EDN_weekly_20160721&cid=NL_EDN_EDT_EDN_weekly_20160721&elqTrackId=b6eced3f466046368fe74ce7ad72178c&elq=9ad6a982e69743dbabc001cd93609f6f&elqaid=33150&elqat=1&elqCampaignId=28979
To celebrate 60 years of EDN, we’re looking into the future to predict what advancements will be made in IC design in the next 60 years. By 2076, 3D room-temperature, superconducting, quantum, neuromorphic, and photonic mixed-signal devices will be the common denominator for all integrated circuit designs. Design tools will be so sophisticated that even novice designers will be able to mix and match these technologies into system-in-package designs that solve all application problems behind the scenes. Users will be so used to extensions of their innate brain capabilities that the technologies which perform the tasks will be taken for granted, leaving the engineering community—and its robotic assistants—on a unique echelon of society that actually understands how the world works.
Tomi Engdahl says:
Who’s Calling The Shots
First of two parts: Systems vendors used to take their lead from chipmakers. Not anymore.
http://semiengineering.com/whos-calling-the-shots/
Throughout the PC era and well into the mobile phone market, it was semiconductor companies that called the shots while OEMs followed their lead and designed systems around chips. That’s no longer the case.
A shift has been underway over the past half decade, and continuing even now, to reverse that trend. The OEM — or systems company as it is more commonly called today — now determines what features go into a system, often based on what software will be needed or what application it will be used for. After that, the specification is developed for the hardware, including the performance characteristics, the power budget and the cost target.
This is a significant change, and it is unfolding over a period of years—sometimes unevenly, sometimes in unexpected ways, and usually in small pieces that don’t provide a clear overall picture of what’s happening. And while it doesn’t necessarily make hardware any less valuable — semiconductors are at the center of every major development and shift in technology — it does affect ecosystems for everything from IoT appliances to consumer electronics to automotive subsystems, and the underlying IP and design strategies that are used to create them.
Shifting business models change the support ecosystem, as well. They affect the chip design strategy, and they affect the entire design flow — and they raise a lot of unanswered questions.
“We are putting ourselves in our customer’s shoes and trying things out,” said Frank Schirrmeister, group director for product marketing of the System Development Suite at Cadence. “We try to replicate that internally, so we have as part of our agreement with ARM, for example, access to their internals so that we can do validation of our tools with it.”
Who’s Calling The Shots
http://semiengineering.com/whos-calling-the/
A large part of this shift involves software, which falls on many plates throughout the ecosystem. Making sure all of the layers of software interoperate and integrate well together is no small feat, and it is growing in complexity at every turn as systems becomes more sophisticated.
“Even if you look back 10 or 20 years ago, there was already a lack of communication between hardware and software teams,” said Simon Rance, senior product manager in the systems and software group at ARM. “Now, the communication is almost completely broken or segmented because there’s an imbalance. There are four to six times as many software designers as there are hardware designers. Hardware is designing its portion having no idea how software is going to go in and program it all. And when something goes wrong for the software engineer, the hardware engineer has no idea what they were trying to do in the first place, so they can’t help them debug. These are the types of issues that are making system schedules longer, not shorter.”
Going forward, a complete solution from the system level down will include the quality and security of the software.
Software challenges are daunting
Andrew Caples, senior product manager for the Nucleus product line in Mentor Graphics’ Embedded Systems Division, observed that software and hardware either collide or converge. “Right now, there’s so much capability on the hardware — there are all sorts of accelerators, crypto acceleration, GPUs, and all sorts of connectivity — but if you look at the capabilities of today’s silicon, it requires an awful lot of low-level code to make it all work. Just bringing up GPU support for graphics acceleration for a display is difficult. Displays used to be cool and novel. Now compelling displays are necessary. Everybody wants their device to look like an iPad, with all sorts of cool displays and icons and lots of capabilities — and those require that type of graphics support.”
Supporting the various graphics engines out there requires a tremendous amount of expertise, he explained. “After that, you can extrapolate out. Everything is connected, we already know that. If you look at the connectivity, supporting 802.11 is kind of passé now. But there are lots of chipsets out there. There are certifications that can be required from the Wi-Fi Alliance to ensure conformance and compliance. And then you add in Bluetooth and 802.15.4 and Zigbee, and it becomes really quite difficult to provide all this support in the chipsets.”
It’s not uncommon today for devices to have many cores and require a substantial amount of low-level support for bring up.
“It’s harder to provide comprehensive support for all of the boards out there and all of the SoCs out there, all the processors out there — you really have to make your bets on which boards and processors are going to be widely embraced,” he said.
The response by OEMs in some cases is to conform to standards and standardized testing and certifications. But just throwing more bodies at a problem isn’t necessarily the best strategy.
ARM has been wrestling with this issue for quite some time and recently rolled out some technology that looks to address this by allowing any designer, whether they are hardware or software or verification, the same viewpoint of the system design information, Rance said.
“It’s not doing it from a hardware point of view, or a software point of view or a verification point of view,”
This type of technology means that as more design teams take this approach, EDA tools will need to work and play nicely with each other, Rance said. And clearly they will have to, because the systems companies require it.
At the end of the day, as part of the gargantuan effort of designing, integrating, and verifying an elegant, sophisticated electronic system today, the systems OEMs are in the driver’s seat. Whether it is choosing partners, IP, foundries, or packaging, the OEM is also driving openness and interoperability among all the players in the game. The successful players will learn where their pieces fit and how to ease integration into the system.
Tomi Engdahl says:
10 predictions for the next 60 years in analog electronics
http://www.edn.com/design/analog/4442479/10-predictions-for-the-next-60-years-in-analog-electronics?_mc=NL_EDN_EDT_EDN_today_20160804&cid=NL_EDN_EDT_EDN_today_20160804&elqTrackId=11f377dd53014f39bc7979185ff29da3&elq=62208d8ecdd94191905d68606125f7ac&elqaid=33323&elqat=1&elqCampaignId=29130
Steve Whalley, Chief Strategy Officer of the MEMS & Sensors Industry Group (MSIG), foresees a single MEMS process with many MEMS silos, not unlike the silicon process, to advance the MEMS industry in the future. There will be multiple designs in the future, not only a single dominant one like the Intel processor.
Proprietary solutions will disappear, allowing trillions or even quadrillions of MEMS devices to proliferate by 2076. Wearables will be non-intrusive: not like today’s smart watches, but virtually invisible arrays of different sensors woven into our clothing and even placed on our skin in ultra-thin, non-invasive form, with sensors integrated with conditioning, processor, battery, radio, antenna, and more. No fabs will be needed, since roll-to-roll printing will be the norm. T-Sensors will be in full swing, addressing world hunger, pollution, and clean, sustainable energy, and may change their name to P-Sensors (peta-sensors) by 2076. Security will begin at the sensor’s edge, and “garbage data” will be eliminated at the sensor.
Tomi Engdahl says:
Why have an internal cal lab?
http://www.edn.com/electronics-blogs/all-things-measured/4442433/Why-have-an-Internal-Cal-Lab-?_mc=NL_EDN_EDT_EDN_today_20160804&cid=NL_EDN_EDT_EDN_today_20160804&elqTrackId=13a63fc3c3354706a514cbf8c94bf7c0&elq=62208d8ecdd94191905d68606125f7ac&elqaid=33323&elqat=1&elqCampaignId=29130
Calibrating your test equipment—multimeters, oscilloscopes, network analyzers, signal sources, etc.—improves the chances that your company’s products will meet specifications. In some instances, calibration is required to comply with industry standards or as part of an agreement between you and your customer. How often you calibrate depends on several factors, such as customer requirements, manufacturer recommendations, or an instrument’s history. Calibration may be performed by the test-equipment manufacturer, by a third-party cal lab, or by an in-house cal lab. Using an in-house lab has several advantages, though it often requires a significant capital investment in equipment and personnel.
What are the advantages of calibrating test equipment in house?
With in-house calibration, you can avoid making bad measurement-based business decisions by reducing measurement risk, i.e., reducing the possibility that measurements used to make a product or process decision are skewed or masked by calibration errors.
In-house calibration lets you maintain compliance with national and international regulatory and industry standards. Many corporations maintain ICLs (internal calibration labs) to satisfy their ISO 9001 calibration requirements. ISO 9001 section 7.6 essentially states, “Where necessary to ensure valid results, monitoring and measuring equipment must be calibrated.”
One of the often overlooked advantages of maintaining an ICL is that ICL personnel are SMEs (subject matter experts) on the usage, functionality and limitations of test and measurement equipment and are available for customers to “pick their brains” at no cost. ICL SMEs understand what type of test-and-measurement equipment may be best suited to make a measurement depending on a customer’s unique application requirements.
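One quantitative habit of a good cal lab, in-house or not, is checking the test uncertainty ratio (TUR) before trusting a calibration. Here is a minimal sketch of the common 4:1 rule; the tolerance and uncertainty figures below are hypothetical, not from the article:

# TUR = unit-under-test tolerance / calibration standard uncertainty.
# A ratio of at least 4:1 is the traditional rule of thumb.
def test_uncertainty_ratio(uut_tolerance, standard_uncertainty):
    """Both arguments in the same units; returns the dimensionless ratio."""
    return uut_tolerance / standard_uncertainty

# Example: a DMM range specified to ±1 mV, checked against a ±0.2 mV standard.
tur = test_uncertainty_ratio(1e-3, 0.2e-3)
print(f"TUR = {tur:.0f}:1 ->", "adequate" if tur >= 4 else "use a better standard")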
Tomi Engdahl says:
PCB layout tips for thermal vias
http://www.edn.com/electronics-blogs/the-workbench/4421218/PCB-layout-tips-for-thermal-vias?_mc=NL_EDN_EDT_pcbdesigncenter_20160808&cid=NL_EDN_EDT_pcbdesigncenter_20160808&elqTrackId=1917ada03ddb4e14a4758b5fa5863da3&elq=d564b62d93804f779a5f4cb83727c4d3&elqaid=33356&elqat=1&elqCampaignId=29157
“To reduce operating temperatures easily, use more layers of solid ground or power planes connected directly to heat sources with multiple vias. Establishing effective heat and high-current routes will optimize heat transfer by means of convection. The use of thermally conductive planes to spread the heat evenly dramatically lowers the temperature by maximizing the area used for heat transfer to the atmosphere.”
Know that there is a lot of caution you need to exercise when trying to get the heat out of a part using just a circuit board. You have to realize the guidelines in the datasheet are usually based on one part making heat, sitting on a standard board of certain dimensions.
Texas Instruments’ WEBENCH is a neat program, especially because it has Mentor Graphics’ FloTherm built in to help you see the hot spots in switching regulators. This is what taught me that a modern buck regulator will have more heat coming out of the catch diode than the pass FET.
http://www.ti.com/lsds/ti/analog/webench/overview.page
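For a first-order feel for why via arrays matter, here is a minimal conduction-only sketch. The numbers are my own assumptions (a 1.6 mm board, 0.3 mm drills, 25 µm plating, copper at 385 W/m·K), not from the article:

import math

# Conduction thermal resistance of one plated thermal via, R = L / (k * A),
# where A is the annular cross-section of the copper barrel.
def via_thermal_resistance(board_thickness_m=1.6e-3, drill_dia_m=0.3e-3,
                           plating_m=25e-6, k_copper=385.0):
    """Thermal resistance through a single via barrel, in K/W."""
    outer = drill_dia_m
    inner = drill_dia_m - 2.0 * plating_m
    barrel_area = math.pi / 4.0 * (outer**2 - inner**2)
    return board_thickness_m / (k_copper * barrel_area)

r_single = via_thermal_resistance()
n_vias = 9  # e.g., a 3x3 grid under a power pad; vias add in parallel
print(f"single via: {r_single:.0f} K/W, {n_vias} in parallel: {r_single / n_vias:.0f} K/W")

This gives roughly 190 K/W for a single via, which is why datasheets recommend arrays of vias under thermal pads rather than one or two.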
Tomi Engdahl says:
Will there still be PCBs in 60 years?
http://www.edn.com/design/pc-board/4442496/Will-there-still-be-PCBs-in-60-years-?_mc=NL_EDN_EDT_pcbdesigncenter_20160808&cid=NL_EDN_EDT_pcbdesigncenter_20160808&elqTrackId=6d379c83a6484928aff613e38b1fa07d&elq=d564b62d93804f779a5f4cb83727c4d3&elqaid=33356&elqat=1&elqCampaignId=29157
Reflecting on 60 years of EDN has us thinking about the future, and wondering how PCBs will change.
If you were around in the ’60s, sing it:
In the year
2076
Will there still be printed circuit boards?
Maybe not, but I predict PCBs will exist for at least a while longer. What should we expect?
The most obvious near-term changes will simply result from trickle-down. High-end technology, like low-loss dielectrics, ultra-smooth copper, HDI (High Density Interconnect), integrated flex, embedded components, and decoupling planes, will invade more and more products. HDI of course is already pretty common given the extremely dense devices we take for granted.
Optical interconnects are widely used for high-speed data. Expect PCB-embedded optical waveguides to become common in backplanes and cards, possibly in conjunction with 3D printing.
PCBs will embrace new forms, such as flexible materials and continuous roll production. Ultra-thin materials will see use in, for example, externally applied medical monitors that “stick on” using only van der Waals force, and internal, dissolvable sensors.
3D PCB printing is inching towards reality. While this might seem questionable technology at first glance – perhaps useful for fast prototyping – it’s not that hard to see how it could become the dominant production method.
3D printers will also enable printing of embedded components like resistors, and embedded structures like optical and microwave waveguides.
Tomi Engdahl says:
RF design in the 21st century
http://www.edn.com/design/analog/4442464/RF-design-in-the-21st-century?_mc=NL_EDN_EDT_EDN_analog_20160804&cid=NL_EDN_EDT_EDN_analog_20160804&elqTrackId=2649da38ea3c4b55b85b5e1e00d60081&elq=2ca2e05316ea4487a0068e88139497c9&elqaid=33320&elqat=1&elqCampaignId=29127
My first job on leaving college was maintaining military radios. I had covered RF theory, but found that the practice was significantly different. The company’s detailed design work was performed at a remote location, and shrouded in mystery. RF design was a “black art” that only a few specialists could understand. I later moved into pure logic design, where the relative simplicity of 1s and 0s held fewer uncertainties. This ultimately led me into two decades of involvement with FPGAs.
It is striking that logic design methodologies have made huge advances in higher levels of abstraction, tool support, and productivity, while RF design has made comparatively slow progress–until now. I recently discovered a couple of new highly integrated and fully programmable wireless transceivers, and was struck by the similarities between these wireless solutions and FPGAs. Viewed from 30,000 feet, both product types are field programmable, highly flexible, and can be used in a wide range of applications.
The highly integrated wireless products are sometimes classified as field programmable radio frequency (FPRF) devices, which is the term I’ll use from now on.
The building blocks of a wireless receiver are fairly generic: a low-noise amplifier (LNA) followed by some filtering. The RF might then be mixed in a tunable superheterodyne (superhet) stage to convert it to an intermediate frequency (IF), followed by a further mixer stage to translate to baseband. Alternatively, direct conversion, also known as zero-IF, is possible using modern semiconductor technology and astute design techniques. The baseband signal is amplified and filtered prior to conversion to in-phase and quadrature (I&Q) digital bit streams. The transmit path converts I&Q data streams to analog signals via DACs, followed by filtering and amplification, before being mixed to modulate the RF carrier and passed on to an RF amplifier.
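To make the zero-IF idea concrete, here is a minimal NumPy sketch. It illustrates the general principle, not any particular device; the sample rate, carrier, and tone frequencies are invented. The RF tone is mixed directly against a local oscillator at the carrier frequency, and the baseband tone is recovered from the I and Q streams with no intermediate frequency stage:

import numpy as np

fs = 1e6                 # sample rate, Hz (hypothetical)
f_rf = 250e3             # RF carrier, Hz
f_bb = 5e3               # baseband message tone, Hz
t = np.arange(4096) / fs

rf = np.cos(2 * np.pi * (f_rf + f_bb) * t)    # received RF signal
lo_i = np.cos(2 * np.pi * f_rf * t)           # in-phase LO
lo_q = -np.sin(2 * np.pi * f_rf * t)          # quadrature LO

# Mixing yields the baseband term plus an image near 2*f_rf; a crude
# moving-average low-pass filter removes the image.
taps = np.ones(64) / 64
i_bb = np.convolve(rf * lo_i, taps, mode="same")
q_bb = np.convolve(rf * lo_q, taps, mode="same")

# The recovered tone frequency is the slope of the I+jQ phase.
phase = np.unwrap(np.angle(i_bb + 1j * q_bb))[100:-100]  # skip filter edges
f_est = np.mean(np.diff(phase)) * fs / (2 * np.pi)
print(f"recovered baseband tone ≈ {f_est:.0f} Hz (expected {f_bb:.0f} Hz)")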
A typical “conventional design style” process would use discrete semiconductor components rather than the new highly integrated chips. The system architect would define a black box for the RF subsystem detailing the full performance specification.
The RF designer will then consider architectural options.
For a discrete implementation, the designer conducts a scan of products available from a range of semiconductor vendors.
Further complications arise when more than one RF frequency is required, and the design becomes considerably more complex if multiple bands or different bandwidths are needed. Different frequencies may require tunable components and an agile antenna.
It is expected that MIMO will become an important addition to meet the growing demand for data throughput.
New design paradigm
Contrast this with using an FPRF.
The transmitter takes digital baseband signals and converts them into modulated RF signals, while the receiver decodes incoming RF and outputs baseband digital streams. The frequency range is programmable over a wide span (0.1 MHz to 3800 MHz for the second-generation FPRF, or 100 kHz to 12 GHz with the addition of an up/down RF frequency shifter) with RF bandwidths up to 120 MHz. The whole RF chain is specified over voltage and temperature at different frequencies.
Moreover, designers can choose from a number of boards (some including FPGAs) to rapidly start product development. FPRF suppliers provide free printed-circuit-board layouts (Gerber files) to take the tricky RF section off the table, reduce risk, and speed time to market.
Designers do not need a formal evaluation of the FPRF devices, as they can program the chips on the bench and check out the performance.
The FPRF is programmed by loading simple address and parameter details into the device over an SPI connection. This scheme allows the design to be modified in seconds, and the revised performance can then be tested, enabling the effects of frequency, filter bandwidth, and gain to be rapidly explored.
The GUI allows full access to all the programmable features, so that different settings can be downloaded in real time.
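To illustrate how lightweight that address/parameter programming model is, here is a hypothetical sketch using the Linux spidev Python bindings. The register addresses, field values, and write-bit convention are invented for illustration and do not correspond to any real FPRF register map:

import spidev  # Linux userspace SPI driver (python spidev package)

TUNE_REG = 0x10      # hypothetical LO-tuning register address
GAIN_REG = 0x22      # hypothetical RX gain register address

def write_reg(spi, addr, value):
    """Write one 8-bit value to a 7-bit register address (MSB = write bit)."""
    spi.xfer2([0x80 | (addr & 0x7F), value & 0xFF])

spi = spidev.SpiDev()
spi.open(0, 0)                   # SPI bus 0, chip select 0
spi.max_speed_hz = 1_000_000

write_reg(spi, TUNE_REG, 0x5A)   # retune the LO in a single transaction
write_reg(spi, GAIN_REG, 0x30)   # adjust gain; effect is immediate
spi.close()

Each reconfiguration is just a two-byte transaction, which is why settings can be explored in real time from a GUI.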
The analog components on FPRF devices are not the only programmable parts. There can be extensive user programmable DSP blocks dispersed throughout some chips in both receive and transmit paths.
FPRFs can also save time and money during the testing and calibration process.
Tomi Engdahl says:
Soon components will shrink to invisibility
http://www.edn.com/design/components-and-packaging/4442492/Soon-components-will-shrink-to-invisibility?_mc=NL_EDN_EDT_pcbdesigncenter_20160808&cid=NL_EDN_EDT_pcbdesigncenter_20160808&elqTrackId=06bac24558584c78bdb05f25e00ae333&elq=d564b62d93804f779a5f4cb83727c4d3&elqaid=33356&elqat=1&elqCampaignId=29157
This month we’re celebrating EDN’s 60th anniversary by looking ahead to technology we might see in 60 years. Components in 2076 will be transparent, harness quantum-mechanical properties, and be integrated with all the devices needed for an application, including their own brain-inspired internal power supply. And 3D packaging will be the norm, with circuit boards resembling a relief map of New York City.
All components in the future will be available in transparent versions, enabling see-through smartphones and other devices that “wow” the consumer. Already available are thin-film transparent conductors and transistors with prototypes of transparent resistors, capacitors, and inductors on the drawing board.
Tomi Engdahl says:
Smart sensor fusion: Analog Devices and Linear Technology
http://www.edn.com/electronics-blogs/sensor-ee-perception/4442453/Smart-sensor-fusion–Analog-Devices-and-Linear-Technology?_mc=NL_EDN_EDT_EDN_today_20160728&cid=NL_EDN_EDT_EDN_today_20160728&elqTrackId=da2559222743448f9ffd4ef17fd1df37&elq=a10b28a33581402b8a4825ba8ba0e613&elqaid=33219&elqat=1&elqCampaignId=29047
I had intended to look at the latest in sensor fusion algorithms this week, but the purchase of Linear Technology by Analog Devices is so logical (ironic, for two analog companies) that it’s hard to ignore the potential from a sensing point of view, particularly for the Internet of Things (IoT).
From a high level, the two companies are typically differentiated by defining ADI as a leader in converter technology, and Linear Technology as leader in power management. This helps Wall St., but anyone who’s worked with either company knows it’s a lot more interesting than that.
Tomi Engdahl says:
The tyranny of numbers: Sensors in the next 60 years
http://www.edn.com/design/sensors/4442436/The-tyranny-of-numbers–Sensors-in-the-next-60-years?_mc=NL_EDN_EDT_EDN_today_20160728&cid=NL_EDN_EDT_EDN_today_20160728&elqTrackId=77c1839c424540b3b548b65bbb38213a&elq=a10b28a33581402b8a4825ba8ba0e613&elqaid=33219&elqat=1&elqCampaignId=29047
Almost 60 years ago, when EDN magazine had just begun, a young newly graduated employee at Texas Instruments spent his summer wrestling the “tyranny of numbers” in an effort to efficiently scale the number of transistors in a circuit.
That young man was Jack Kilby, of course, and now that same rethinking of how to scale devices using fundamental materials science is being done for sensors, and the impact the practical realization of this research will have on how we interface with our world and architect systems over the next 60 years could be just as dramatic.
Of course, it’s important to recognize that long before the Internet of Things (IoT) and microelectromechanical systems (MEMS), electrical and mechanical designers have been using sensors of some form for generations to connect the analog to the digital world.
Here at EDN we even did a recent poll to find out what designers considered to be their favorite (it’s temperature sensors, by the way, followed by … well, check out the survey.)
That survey had a long list of sensor types: humidity, pressure, temperature, infrared, flow, MEMS accelerometers and gyroscopes, and the list is still growing.
So the main issue was packaging and interconnects and making the device comfortable and cost-effective, given the numerous boards.
Still, the packaging problem remains, as more and more sensors, with their electronics, are added per node, and more nodes are added to a human’s personal area network as well as the IoT in general. As Michael Long, Analog Devices Inc (ADI) segment manager for industrial and instrumentation, pointed out at Sensors Expo, the nirvana right now is the concept of zero-pin sensors (ZPS) that are self-powered and maintenance free with ultra-low-power processing. This, he said, requires a combination of optimized processors, sensor, radio, and power management, including energy harvesting.
Long was at ADI’s booth and demonstrated the SNAP Sensor imaging technology, which addresses power consumption by performing many functions on-chip instead of using the main host processor.
STMicroelectronics’ Edoardo Gallizio demonstrated BlueCoin, a miniature MEMS reference platform with four MEMS microphones, inertial MEMS, and a Bluetooth low energy radio.
Vesper recently took low-power MEMS to a new level, announcing the first commercially available quiescent-sensing MEMS device. The VM1010 microphone detects acoustic events while consuming virtually zero power.
In the context of MEMS, the combination of sensor fusion with an increasing number of sensors, from gyroscopes and accelerometers to actuators and acoustics, is now all about the system and creating value.
“Standards are important for IoT interoperability,” said Lightman.
Of course, with standards comes competition for differentiation, as “it can’t be about cost,” said Lightman. “Companies aren’t after high volume, but [instead are after] high margins.” This comes about through more sophisticated performance, particularly in medical, “where there’s no margin for error.”
Tomi Engdahl says:
EDN predictions for 2006, 10 years later
http://www.edn.com/electronics-blogs/rowe-s-and-columns/4442405/EDN-predictions-for-2006–10-years-later
Tomi Engdahl says:
Understand firmware’s total cost
http://www.edn.com/electronics-blogs/embedded-basics/4442394/Understand-firmwares-total-cost?_mc=NL_EDN_EDT_EDN_today_20160726&cid=NL_EDN_EDT_EDN_today_20160726&elqTrackId=43f6f35cac2a4eccb4d33fde7f9bc75c&elq=94c022fd1f6143ffb4733b15876f4a76&elqaid=33182&elqat=1&elqCampaignId=29011
Innovation can be an exciting endeavor, but on occasion management and developers will estimate a project’s cost implications too optimistically. That optimism can come from short-sightedness or a knowledge gap in understanding what the total cost of ownership for developing embedded software involves. Let’s look at the five major cost contributors that affect the total cost of ownership.
Contributor #1 – Software licensing
Contributor #2 – Software development
Contributor #3 – Software maintenance
Contributor #4 – Certifications
Contributor #5 – Sales and marketing
The total cost to own firmware is far larger than just the development costs. In order to truly understand the full investment necessary to be successful, companies and teams need to expand their considerations and understand how software licensing, certifications, and even the maintenance cycle will affect their return on investment. Without all these pieces the story is incomplete and the chances for a product’s financial success may be drastically reduced.
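As a toy illustration of that point, the sketch below sums the five contributors with invented placeholder figures (none of them from the article) to show how development can end up a minority share of the total:

# A toy firmware total-cost-of-ownership model; all dollar figures are
# hypothetical placeholders, chosen only to illustrate the breakdown.
costs = {
    "software licensing":   20_000,   # RTOS/middleware seats, royalties
    "software development": 150_000,  # engineering time to first release
    "software maintenance": 225_000,  # bug fixes, ports, updates over life
    "certifications":       60_000,   # e.g., safety or radio compliance
    "sales and marketing":  45_000,   # demos, support collateral
}
total = sum(costs.values())
for item, cost in costs.items():
    print(f"{item:22s} ${cost:>9,} ({cost / total:5.1%})")
print(f"{'total':22s} ${total:>9,}")

With these placeholder numbers, development is only 30% of the total, which is the article’s point: budgeting for development alone badly understates the real investment.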
Tomi Engdahl says:
RF GaN Gains Steam
http://semiengineering.com/rf-gan-gains-steam/
Technology may be perfect for 5G, but there are still some challenges to overcome.
The RF gallium nitride (GaN) device market is heating up amid the need for more performance with better power densities in a range of systems, such as infrastructure equipment, missile defense and radar.
On one front, for example, RF GaN is beginning to displace a silicon-based technology for the power amplifier sockets in today’s wireless base stations. GaN is making inroads in base stations, mainly because it’s a wide-bandgap technology, meaning that it is faster and provides higher breakdown voltages than silicon and other III-V devices.
Now, seeking to leverage these properties into a bigger market, some RF GaN suppliers are targeting the technology for future handheld systems. RF GaN is overkill and too expensive for today’s smartphones, but it’s a candidate for future handhelds based on a next-generation wireless standard called 5G.
RF GaN is ideal for 5G or even advanced 4G systems, as the technology shines at higher frequency ranges. But for future handhelds, the material has some challenges, such as power consumption, thermal issues and cost.
Still, the industry should keep a close eye on the progress of RF GaN. “GaN devices can handle more power than other high frequency technologies like GaAs and InP, with better frequency performance characteristics than other power technologies like LDMOS and SiC,” said Eric Higham, an analyst with Strategy Analytics.
“GaN devices also have higher instantaneous bandwidths, which becomes important as the industry goes to higher frequencies where the bands are wider and implements more carrier aggregation bands,” Higham said. “This means fewer amplifiers are needed to cover all the bands and channels.”
GaN, gallium arsenide (GaAs) and indium phosphide (InP) are III-V compound semiconductor technologies that are used for RF applications. Meanwhile, laterally diffused metal oxide semiconductor (LDMOS) is a silicon-based RF technology. And silicon carbide (SiC) is used in various applications.
The problem with GaN? It’s expensive. Most RF GaN is produced on smaller and expensive SiC substrates.
GaN has unique wide-bandgap properties, but it’s expensive.
Meanwhile, the first big market for RF GaN was the military/aerospace sector.
Tomi Engdahl says:
What does the future hold for medical technology?
http://www.edn.com/design/medical/4442412/What-does-the-future-hold-for-medical-technology-?_mc=NL_EDN_EDT_EDN_today_20160725&cid=NL_EDN_EDT_EDN_today_20160725&elqTrackId=692b54020de040e99a1afad0ecb9e01e&elq=f3f51723c0664194a2cbcec566dc9a39&elqaid=33166&elqat=1&elqCampaignId=28997
I’m not a doctor, nor do I play one on TV. Nevertheless, it’s fun to think how far medicine has come in EDN’s first 60 years, and imagine what the coming 60 years might hold for medical technology.