Heat and various types of noise can disrupt optical signals in silicon photonics applications, pushing light into frequencies that generally are filtered out.
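The rough magnitude of such a thermally driven shift can be estimated from the thermo-optic effect. The sketch below is illustrative only; the thermo-optic coefficient, group index, operating wavelength, and filter passband are assumed representative values, not figures from this article.

```python
# Rough sketch: thermally induced wavelength drift in a silicon waveguide,
# using the first-order relation  d_lambda = lambda * (dn/dT) * dT / n_g.
# All constants below are assumed, representative values, not from the article.

DN_DT = 1.8e-4      # assumed thermo-optic coefficient of silicon, 1/K
N_G = 4.2           # assumed group index of a silicon waveguide
LAMBDA_NM = 1310.0  # assumed operating wavelength, nm


def wavelength_drift_nm(delta_t_kelvin: float) -> float:
    """First-order estimate of wavelength drift for a temperature swing."""
    return LAMBDA_NM * DN_DT * delta_t_kelvin / N_G


def stays_in_channel(delta_t_kelvin: float, channel_width_nm: float = 20.0) -> bool:
    """Check whether the drift stays inside an assumed filter passband."""
    return wavelength_drift_nm(delta_t_kelvin) <= channel_width_nm / 2.0


if __name__ == "__main__":
    for dt in (10.0, 50.0, 100.0):
        drift = wavelength_drift_nm(dt)
        print(f"dT = {dt:5.1f} K -> drift ~ {drift:.2f} nm, "
              f"in-channel: {stays_in_channel(dt)}")
```

With these assumed values, a 50 K swing moves the wavelength by a few nanometers, which is why filters designed for a fixed grid can stop passing the signal.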
Unless those filters are adjusted, data may be lost or incomplete, and in the case of streaming data it may be impossible to reconstruct. But when and how physical effects will alter light isn't always obvious, which makes it harder to determine the necessary adjustments.

Using light at the chip level is just beginning to ramp up, and it's becoming clear there is still much work left to do. While it may seem as if optical communications should be well understood by this point (the telecommunications industry has used fiber optics for decades), silicon photonics is just beginning to roll out for die-to-die communications inside of, and between, packages.

The slowdown of Moore's Law is driving much of this shift. Feature shrinks beyond 7nm no longer provide significant improvements in performance per watt, and it is taking the chip industry longer to rev each new node. There are more physical effects to contend with, such as RC delay caused by thinner wires. In addition, various types of noise, which used to be shielded by thicker dielectrics, are now first-order effects. And on top of that, thermal dissipation is becoming more difficult and expensive to manage at each new node, due to increases in dynamic power density and more circuitry that is always on.

In response, chipmakers have begun exploring a variety of new options, ranging from architectural approaches such as in-memory or near-memory computing, to heterogeneous designs and new packaging. Light is one more option, but so far it has not been fully characterized or completely understood at the chip level. Of particular concern is the impact of heat on light.

"Noise effects come from many different sources, including thermal, which can cause an index of refraction shift," said Ashkan Seyedi, research scientist at Hewlett Packard Labs.
"Right now, VCSELs (vertical cavity surface emitting lasers) have a 20nm channel spacing, the so-called CWDM (coarse wavelength-division multiplexing) grid. Within that window, the color of the VCSEL is actually fluctuating."

That fluctuation, which in photonics represents a frequency shift, is caused and sometimes amplified by heat and noise. As a result, packages and dies need to be characterized within a certain temperature range, and lasers and filters need to be developed to work within those temperatures.

"When you have a frequency shift, what comes out of the laser may not be what you designed," said Norman Chang, chief technologist for ANSYS' Semiconductor Business Unit. "That needs to go into the waveguide, which is a wavelength-division multiplexer. But with the thermal impact on frequency, it may be filtered out."

For applications like facial recognition on a smartphone, this may be more of an annoyance than a serious problem. But for object or motion detection in a car or a plane, it can be life-threatening.

"A single transfer of a signal is usually not a good idea," said Ajit Paranjpe, CTO of Veeco. "Any extraneous noise can impact light, and if it interferes with sensors, you have to deal with that noise in the design. It may cause frequency drift. In addition, different light sources may have different amounts of variability."

Dissecting photonics
Two main approaches are used in silicon photonics. One is a ring modulator, which pulses the signal using techniques such as phase-shift keying or various types of frequency-division multiplexing. The second involves resonators, in which a cavity is created in silicon by a pair of mirrors facing each other. Light is emitted between those mirrors and bounces back and forth at a defined frequency.

Fig. 1: Micro-ring resonator, about 20 microns in diameter.
Source: HP Labs

VCSELs, which use the resonator approach, started out as a niche technology for computer mice and laser printers. They have since gone mainstream with the introduction of VCSEL-based facial recognition in the iPhone X. Other phone makers are following suit, and VCSELs are finding their way into other applications, including LiDAR for assisted and autonomous vehicles.

All of this has stepped up the pressure to develop tools and methodologies that can address a variety of technical issues in photonics and provide consistency on the design side. The goal is better yields, so manufacturers can achieve the kinds of economies of scale systems vendors expect from semiconductors.

"The promise of silicon photonics is to be able to re-use existing infrastructure at a lower cost than what is available today," said Twan Korthorst, R&D director of photonics solutions at Synopsys. "There are three different approaches. One is to grow these devices on silicon. A second is to die-attach photonics with hybrid integration. And the third is to use a light source at a distance with a 90nm transceiver process."

So far, there is no clear winner. The fundamental issue is that the light source is difficult to integrate with other electronics. In fact, the die-attach approach is basically a connector glued onto a package. Add in frequency shifts, and this can become very messy.

"When you think about the scope of optical interconnect, it's the temperature variation that's going to really hamstring you," said HP Labs' Seyedi. "Let's say you're designing a VCSEL link, and you have a 60-channel optical I/O built around your ASIC, which may be a GPU or switch or CPU. If the heat map is all over the place, then you have lasers that are surfing a very tumultuous wave of thermal fluctuation. Laser 1 may vary from 20°C to 70°C, and laser 27 is only going from 45°C to 55°C.
You have to implement a thermal stabilization mechanism across all of those lasers, and the 't' is what makes that difficult. If someone says this chip is going to run hot, but it's only going to vary 10°, that's a pretty simple thermal mitigation scheme. But if it can be anywhere from 20°C to 80°C, and you have no idea what the average is going to be or how that will vary on a millisecond time scale, you have a much bigger space to accommodate."

Heat can affect designs in other ways, as well. If one or more of those lasers runs hotter than the others, it will burn out more quickly. That increases the number of failures in time, and unless redundancy is built into the laser module, it can ruin an entire package. But redundancy adds area, and in many of these devices that extra area isn't available.

"If you think about a typical data center, you'll have hundreds of thousands of individual lasers," said Seyedi. "If you have a FIT (failures in time) rate under 5, that's still about 10 lasers a month that die out. In VCSEL-based links, that's a sealed package, so when that laser is dead the whole package is dead. If you have a pluggable with eight lasers inside an active optical cable, and one of those parts dies and there's no redundancy, seven healthy lasers go away at the expense of one bad one."

Tools and testing issues
This isn't that surprising to chipmakers and tools vendors. In some respects, this is just another type of variation. But there are a few new twists in photonics that make it more difficult to characterize the impact of heat on a device. One is that there is no "best" approach to building these systems, and each one has its pros and cons. For example, passive interposers are now being used for signal waveguides. While those may help offset some of the thermal effects, that approach makes it significantly harder to test and model these devices.
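The failure-rate arithmetic behind estimates like Seyedi's can be sketched briefly. FIT is defined as expected failures per 10^9 device-hours, and a sealed multi-laser package with no redundancy fails when any one laser fails. The fleet size and FIT value below are assumed inputs for illustration, not figures endorsed by anyone quoted here.

```python
# Sketch of laser failure-rate arithmetic. FIT = expected failures per
# 10^9 device-hours. The fleet size and per-laser FIT below are assumed
# inputs for illustration only.

HOURS_PER_MONTH = 730.0


def expected_failures_per_month(num_devices: int, fit: float) -> float:
    """Expected device failures per month for a fleet at a given FIT rate."""
    return num_devices * fit * HOURS_PER_MONTH / 1e9


def package_fit(lasers_per_package: int, laser_fit: float) -> float:
    """A sealed, non-redundant package fails if any one laser fails, so its
    failure rate is roughly the sum of the per-laser rates (series system)."""
    return lasers_per_package * laser_fit


if __name__ == "__main__":
    fleet = 500_000  # assumed number of lasers across a data center
    fit = 5.0        # assumed per-laser FIT
    print(f"~{expected_failures_per_month(fleet, fit):.1f} laser failures/month")
    print(f"8-laser package FIT ~ {package_fit(8, fit):.0f}")
```

The second function makes the "seven healthy lasers go away" point concrete: an eight-laser pluggable without spares has roughly eight times the failure rate of a single laser.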
"The main issue is that the waveguides have to be a certain dimension and you have to keep space between them, so it takes up a lot of area," said John Ferguson, director of marketing for Calibre DRC applications at Mentor, a Siemens Business. "This doesn't benefit from advanced nodes. In fact, 65nm is the most advanced node right now. You have a chip on one side and nothing underneath except to connect it. The photonics is separate. But testing this is difficult. People are trying to crack that now. It's not like an IC, where you put it on the testbench with probes. You need to measure optical signal processing on the outside, and that's not trivial."

Testing is somewhat simpler with VCSELs than with edge-emitting lasers. With VCSELs, entire arrays can be added to a single die because the light propagates perpendicular to the chip's surface. That also makes them easier to test, because they can be tested at the wafer level, said Veeco's Paranjpe. With edge-emitting lasers, the wafer needs to be diced, and then the rest of the device needs to be constructed using those chips.

"The advantage of VCSELs is that entire arrays can be manufactured on a single die," Paranjpe said. "With the wafer, you can do wafer probing and wafer test. With edge-emitting lasers, you need to test the individual die first, and then the device."

That also adds to the amount of time it takes to manufacture these die, which in turn increases the cost. But the bigger problem is that all of this needs to happen in the context of a system, because heat can spread across a die or throughout a system, and tight tolerances mean that thermal and power budgets can be additive. Add in silicon photonics, and signal integrity becomes a significant concern.

This hasn't been lost on the semiconductor supply chain. Process design kits and design flows are being developed or tweaked to account for these kinds of issues. In effect, these are restrictive design rules that accompany any new manufacturing process.
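The kind of rule checking Ferguson describes, in which waveguides must hold minimum dimensions and keep clear of one another, can be sketched as a toy DRC-style pass. The rule values and the layout data below are invented for illustration; real photonic PDK rules are far richer.

```python
# Toy DRC-style check for photonic waveguides: minimum width and minimum
# edge-to-edge spacing. Rule values and layout data are invented for
# illustration only, not taken from any real PDK.

from dataclasses import dataclass

MIN_WIDTH_NM = 450.0     # assumed minimum waveguide width
MIN_SPACING_NM = 2000.0  # assumed minimum edge-to-edge spacing


@dataclass
class Waveguide:
    name: str
    x_nm: float      # left-edge position
    width_nm: float


def check_rules(guides: list[Waveguide]) -> list[str]:
    """Return human-readable violations of the width and spacing rules."""
    violations = []
    for g in guides:
        if g.width_nm < MIN_WIDTH_NM:
            violations.append(f"{g.name}: width {g.width_nm} < {MIN_WIDTH_NM}")
    ordered = sorted(guides, key=lambda g: g.x_nm)
    for a, b in zip(ordered, ordered[1:]):
        gap = b.x_nm - (a.x_nm + a.width_nm)
        if gap < MIN_SPACING_NM:
            violations.append(f"{a.name}/{b.name}: spacing {gap} < {MIN_SPACING_NM}")
    return violations


if __name__ == "__main__":
    layout = [
        Waveguide("wg0", 0.0, 500.0),
        Waveguide("wg1", 1500.0, 400.0),  # too narrow and too close to wg0
    ]
    for v in check_rules(layout):
        print("violation:", v)
```

Even this toy version shows why photonic layouts consume area at older nodes: the spacing minimum dwarfs the feature size, so density scaling brings little benefit.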
"Foundries and fabless design houses are developing PDKs, which are basically building blocks for design tools," said Jigesh Patel, business development manager for photonics solutions at Synopsys. "In the design flow, PDKs are becoming more and more widespread. There also are PDKs for different processes at different foundries."

What's ironic here is that restrictive design rules are being applied at older geometries, which is where most of the photonics work is happening. The reason is that while process dimensions are bigger in photonics chips, the various elements on those chips need to be as precise as at much smaller nodes, or they can distort or disrupt signals. "Sidewall roughness is the biggest problem," said Patel.

Existing design tools also need to be enhanced to deal with all of these issues. "It will require a finer resolution for thermal simulation," said ANSYS' Chang.

Packaging options
One way around these problems is to leverage advances in packaging, where die-to-die communication can run at the speed of light using lower power, with fewer restrictions on area. This is particularly attractive in 3D-ICs, where heat dissipation has slowed adoption, and in applications such as networking and exascale computing, where performance is critical.

Inside data centers, photonics needs to be more tightly integrated and better characterized than it is today with plug-in modules. This approach also will allow vendors to build in better monitoring tools for lasers, including statistical modeling for laser failures based on a variety of factors. It also will allow them to build in failover capabilities.

"In our VCSEL interconnects, we brand that as four lasers per fiber, but there's actually a fifth one included," said HP Labs' Seyedi. "If one of those four dies out, you kick on the fifth one and it extends the package lifetime. Pluggables can't do that now because the form factor is maxed out. There is no room for spares on that chip.
Every square millimeter is accounted for. But because we're eventually moving to co-packaged photonics, it allows us to build in more redundancy."

In 3D-ICs, where memory is stacked directly on logic and connected with through-silicon vias, light could help reduce thermal issues. But it only will work if the rest of the device is characterized to incorporate the impact of that heat on light.

At this point there is growing recognition that problems exist, but there is still much work to be done to make this as predictable and cost-effective as communication over copper wires.

Related Stories
VCSEL Technology Takes Off
Get Ready For Integrated Silicon Photonics
Silicon Photonics Comes Into Focus

Ed Sperling is the editor in chief of Semiconductor Engineering.
Published on December 5, 2016