Grand Challenges for Photonics
Keynote Remarks at World Technology Mapping Forum
by Lionel Kimerling, MIT, Boston & Executive Director of AIM Photonics Academy
This forum was organised jointly by PhotonDelta, The Netherlands & the AIM Photonics Academy based in Boston.
Save The Dates
We are delighted to announce the dates and location for the second World Technology Mapping Forum. It will be held on the campus of the University of Twente in Enschede, The Netherlands from June 20-22, 2018. More practical details on the forum programme and hotel information will follow at the start of 2018. In the meantime, if you participated in June 2017 and have feedback on what you would like to see in 2018, please get in touch. We suggest signing up for the free WTMF Newsletter to stay informed of developments.
In June 2017, just over 170 scientists, researchers, government representatives and high-tech industry experts from 17 countries converged on the Dutch city of ’s-Hertogenbosch. They came to discuss the next-generation technologies the world is going to need in 2030 and beyond. That’s important now because we’re reaching the economic end of what’s popularly known as “Moore’s Law”. And particles of light (photons) rather than electrons will be the engine driving many new applications in communications and life sciences that we’ll soon take for granted.
The three public keynotes given in June 2017 are being transcribed and published, since each explains the technology challenges that photonics needs to solve, now and two decades from now. This is the first of three.
Thank you, and it's a pleasure to be here. I think this forum marks the beginning of something very important for photonics, and particularly for integrated photonics as a worldwide supply chain. Because unless we can work together, we're self-limiting.
The Grand Challenges for Photonics
I'd like to discuss what we consider to be the grand challenges ahead, and the timelines for meeting them, in the context of electronic-photonic integration. We're talking about a transition from a customized photonics industry that served telecommunications and optics to an industry which is now joining with electronics. We must figure out how that meeting is going to happen, what advantages photonics offers, and how they're going to be implemented at a cost level that doesn't change the value proposition of systems.
I'll talk a little bit about the way JePPIX is looking at it and the way the Integrated Photonics Systems Roadmap (IPSR) is looking at it right now, particularly from the perspective of applications, markets and costs. The IPSR has a heritage of almost 25 years of roadmapping activity. This started well before the telecom boom and bust and has continued to where we are today.
How it Began
We started out looking at where the applications were. We developed a mantra that said good things would happen if we could develop cross-market platforms, standardize components and integrate. It turned out that was a message the photonics industry wasn't happy to hear, because they were doing very well with very high-margin customized products and we were arguing for something quite different.
Yet it's very much in line with what's happening today. In the JePPIX roadmap presentation, they look at each one of those issues in a separate chapter and describe where those things are moving and how to deal with them.
In the IPSR, we've begun to target the supply chain: what do the suppliers need to know to be ready with the products they're going to add, all the way from wafers to design automation to tools to the eventual users at the end? The key message I'd like to leave you with today is that we need to specify in detail what those requirements are, as well as their timelines.
What is the biggest barrier to technology change? Is that the technology itself or is it us? We think it's us. You're taking a risk both at your corporation level and at your personal level to implement a new technology.
Who are the roadmap stakeholders?
In simplest terms, there are three parties: government, industry and academia.
The government is interested in how we make sound investments so that we can create jobs and a thriving society. And secondly, if regulation is needed, how we make that regulation in a way that doesn't slow down the rate of innovation, because innovation is the key to the economic benefit that is being created.
Thinking about industry is quite interesting. Industry needs supply chain coordination. And these days that supply chain is worldwide, which is one of the reasons why we feel (as a US organization) that it's very important to interact with the rest of the world. There's no integral piece of it that can be captured in one country. So, for any of us to think that we can dominate everything with a vertically integrated organization is a thing of the past.
Supply Chain Coordination
Look at supply chain coordination. What are the goals that a vendor must meet, and what are the product timelines, such that there are no roadblocks in the supply chain as new products are rolled out? My second point is about learning. We think of a learning curve as something magic: the more we make of something, the lower its cost becomes. But that learning curve is quite interesting. It isn't just a matter of making things. It's a conversation between suppliers, and between suppliers and customers. The customers inform the suppliers, and they inform back to the people who are making the actual products.
So that conversation and integrating that all the way along the supply chain IS the learning curve. It isn't just a series of S-curves that are creating innovation after innovation. Those S-curves are important and we need to support them. But from an industry point of view we really need to create that conversation along the supply chain.
And lastly, academia. What are the important problems that people should be working on? When I was working at Bell Labs, being in a problem-rich environment was the thing we valued most, because we could always find something important to work on. That's what a roadmap tells academia. It also spurs innovation, because where we don't have a solution we are telling researchers, and they can begin to work on an answer.
Conclusions (given at the end of the video)
Before we go into the technical details, here are the high-level conclusions.
The key driver we started with was the Internet of Things. So have a look at these numbers from the cloud. In 2013 there were more connected devices than people on the planet. Look at what we're expecting in 2020: six devices for every person on earth, and not everybody has a device. Some of you have more than six. This is what's driving all the communications. And so, in general, system architectures will continue to migrate to distributed topologies for performance and energy reasons. Photonic interconnection is going to be the answer, and we need to be able to deal with that.
What is electronics looking at? Electronics is saying: we've been following Moore's Law and we're going to keep doing it, but you can't scale transistors anymore. So let's scale the number of chips in the package; maybe we can just add more chips to the package and that will be a solution. But that runs into the power density issue.
If you go to "More than Moore", they're saying everything is an option to continue scaling, and photonics is just one of those little things they'll consider, along with the quantum-this-and-that and nano-this-and-that. So for photonics to be taken seriously, you've got to get back up into the "Beyond Moore's Law" box. Communication-centric architectures and system-level design are the way to do that.
My final slide I think is the one that you might consider to be the most important.
This is probably a challenge directly to the government and the academic stakeholders working with roadmaps. What is the biggest barrier to technology change? Is it the technology itself, or is it us? We think it's us. You're taking a risk, both at your corporation level and at your personal level, to implement a new technology. There's no guarantee that you're going to succeed, and you have to really be a risk taker to engage in that. There's a general acceptance of the incumbent solution: copper works and copper is easy to connect. We know it isn't going to scale into the future, but we keep doing it because it's easier and we know that it works. And then lastly, cost and standardization. If we can't get the cost down as we move photonics from the board to the package, we're never going to be able to afford to build those components. Even if we can implement them, they'll be too expensive in the eventual system cost, so it doesn't make any sense. So even though the learning curve is built of a thousand S-curve innovations, we need to present those innovations in a way that people are interested in taking risks and know where they want to go with the roadmap.
The Three Levels of the Roadmapping Process
I will present three levels of detail for our roadmapping process, starting tomorrow in the technical workshops.
- Firstly, what are our technology goals and how should we define them?
- Secondly, what are the infrastructure targets that we should meet and their timelines?
- And lastly, how much detail do we have to supply along the supply chain in order to be effective?
These technology goals are important, and they are application specific. So, when I said cross-market platforms: we can really get in a bind if we have a thousand different platforms for a thousand different applications. We need to serve as many applications as possible with the fewest number of platforms.
Steel Industry Comparison
Look at the technology roadmap for the steel industry. They have a different technology roadmap for high-strength steel for car bodies versus high-strength steel used in construction. Those are two separate roadmaps with two different alloys: one has to be formable whereas the other must be castable and machinable.
If we think about the integrated circuit industry in its early years, it was focused on memory as the primary application. That drove all of the technology evolution. Then it focused on microprocessors, and now that the shrink is over, it really doesn't have much focus left. So, it is up to integrated photonics to provide that next step for them.
Let's talk about technology goals. We just saw the Google video, and this is just an illustration of that. Everything goes to a data center. When it comes to the Internet of Things and how it's connected to the datacenter, we're part of that. It's quite impressive how that's expanding.
But if you just look at one application, high-performance computing, the system performance in floating-point operations per second has increased at a steady rate of a thousand times every 10 years. And that's been achieved at more or less constant cost for the last 20 years.
Now if you think about that, it's quite amazing, because Moore's Law and what we're doing at the chip level delivers only 100x every 10 years. So something is happening at the system level that is enabling this one thousand x every 10 years. If we try to project forward from now to the platforms we need 10 years ahead and 20 years ahead, we have to think scalability in every decision we make. We don't want to create a solution that has only a one-generation lifetime; it must keep scaling forward.
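A back-of-envelope sketch makes the gap between the two rates concrete. This assumes both trends are smooth exponentials over a decade, which is a simplification of the actual stepwise progress:

```python
# Annualized growth implied by the per-decade factors quoted above.
# Assumption: smooth exponential growth over the decade.

def annual_factor(growth_per_decade: float) -> float:
    """Annual multiplier implied by a per-decade growth factor."""
    return growth_per_decade ** (1 / 10)

system = annual_factor(1000)  # HPC system performance: ~2.0x per year
chip = annual_factor(100)     # chip-level (Moore's Law): ~1.6x per year

# The extra factor that system-level optimisation must supply each decade:
print(1000 / 100)  # 10x per decade beyond chip scaling
```

In other words, roughly doubling every year at the system level against a ~1.6x per year chip trend, leaving a 10x-per-decade gap that architecture and integration have been closing.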
Something's happened at system level optimization in the past that is only going to be amplified in the future. Because photonics is enabling distributed systems, we need to begin to think about designing, not at the device level, not at the chip level but at the system level and that's going to cause a new economic calculation for how we value the cost of components. Because it's really the cost per function that the system is delivering not the cost of a gigabit or the cost of a chip.
So, these are the constraints that jump out at you when you start thinking about how to scale forward at a thousand X every 10 years and keep the ball rolling.
1. Energy
Energy, and in particular power density, is an issue when you're trying to do integration. But energy is also an issue because every projection you've seen says we're going to use all of the electricity in the world to power our datacenters or IP switches or whatever. The first such projection came out in Japan about 15 years ago, where they looked at just the IP switches in their network and projected the growth of video content. They said: if we just keep distributing this video content through normal internet protocol, we're going to use all the electricity generated in Japan by the year 2035, just for internet switches. Something must happen, one way or the other.
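The shape of that kind of projection can be sketched with a toy compound-growth model. All numbers below are hypothetical placeholders for illustration, not the Japanese study's figures:

```python
# Toy projection: how long until compounding switch power exceeds
# a country's total electricity generation? Numbers are illustrative.

def years_until_exceeds(power_now_gw: float,
                        generation_gw: float,
                        annual_growth: float) -> int:
    """Years until switch power, growing at a fixed rate, exceeds generation."""
    years = 0
    power = power_now_gw
    while power < generation_gw:
        power *= annual_growth
        years += 1
    return years

# e.g. 1 GW of IP-switch load growing 40% per year against 100 GW of generation:
print(years_until_exceeds(1.0, 100.0, 1.40))  # 14 years
```

The point of such models is not the exact year but the inevitability: any fixed exponential traffic growth overtakes a fixed generation capacity within a few decades.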
2. Bandwidth density
Bandwidth density. We can't keep increasing the size of data centers. Everything has to get smaller, so the football-field-sized data center will be more or less the standard, and maybe we'll even do better.
3. Port count
Port count becomes important as things get more parallel. We have to put more ports into the system and figure out how we can do that with reasonable switching.
4. Functional latency
As we build more and more complicated systems, latency becomes a key issue in energy efficiency: if we're spending energy to keep systems going but we're not cranking out any results, then we're wasting energy. The more we reduce latency, the more effective and more energy efficient our systems are.
We're putting capital investment into all of this information hardware, and if we're not using it, then that goes to waste.
5. Reconfiguration
And then lastly, reconfiguration. The key to efficiency is not to make everything the same, with a CPU and memory, and just keep going back and forth. We've learned that lesson. The key is to build the system and configure it for a special purpose, and then, when we're using it for a different purpose, reconfigure it.
While doing a complex calculation or simulation, we often change the purpose in the midst of it. So we need to build in the ability to reconfigure: to apply a neural network or some sort of artificial intelligence to sense what the application is and configure the system for optimum efficiency.
All of these points are going to be enabled by photonic integration and photonic interconnection. We should realize that, for most people, this is a new way of thinking about systems.
We also need to look at the products and how they're moving forward. This is just an example out of the roadmap for silicon photonics. If you look back a year or so, we were mainly interested in interconnects and, to a lesser extent, innovation in packaging. As we go forward it is still interconnection and packaging. By 2020 we're beginning to think about full integration of silicon photonics chips with applications, logic and memory. As we go beyond 2025, systems-on-chip and things of that kind become much more important. Keeping your eye on these things gives you the ability to ask: if I have to scale, what am I scaling toward? And if I create a solution today, will it fulfil the needs I'm thinking about in the future?
Rigidity is different for photonics
Each one of these has a unique solution for photonics, which is somewhat different from what it was for electronics. For instance, you need much more rigidity for photonics than for electronics. So how are we going to implement that at low cost and high density?
So let's look at the infrastructure targets that need to be created to support this.
As we looked at manufacturing as a whole, we came up with a list of key Grand Challenges for Manufacturing. The big one is manufacturing system integration. We're designing for systems. So, do we need to make any changes in the way we make things? The chip is not the goal anymore; it's the system. So the package has to conform to what the system footprint is going to be.
Right now, we're looking at on-board optical interconnection and we have no idea exactly what that footprint is going to be. Is it going to be pluggable? Is it going to be embedded in the board? Is it going to be fly-over fiber? Is it going to be embedded waveguides? These questions shouldn't be open anymore.
We should know the solutions to them, because the applications are there in 2018 and it's a scramble now. And a lot of it is going to depend on whether the throughput from manufacturing can be achieved at a reasonable cost.
Standardization in materials, design, packaging and functionality: I will go into solutions for these in a little bit. Foremost, it is defined by cost, system requirements and platform tradeoffs. Remember: scaling forward involves cross-market platforms. Simplicity is the key to scaling costs. If you look at any learning-curve analysis, the exponent in the learning curve is always related to the introduction of increasing simplicity as you move forward through integration, packaging and production volume.
Particularly important when you're scaling cost is known good die, something that's already well characterized and well known in integrated circuit processing with electronics. But this is something we're just beginning to realize is important in photonics. In the old days, we thought in terms of wafers: we'd keep a good wafer and throw away a bad one. Now we're thinking about line yield, die yield and so forth.
Reliability is important: what are the failure modes? And these are failure modes in long-term tests. Do we need to build in redundancy to have long-term systems? This is particularly important for sensors, because once we deploy a sensor network we expect long-term service.
One interesting thing I want to point out here: if you survey the industry, the 2020 target for a silicon photonics transceiver is less than one cent per gigabit per second (excluding the laser, because you don't know what the solution for the laser is).
But that's very small compared to what we typically get today for telecom transceivers. So how do you make that transition? And once you do, how does that feed back into telecom? Should those companies be thinking more seriously about a different value proposition?
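To see what that target implies in dollars, here is a quick sketch. The 400 Gb/s module rate and the $10/Gb/s telecom comparison figure are illustrative assumptions, not numbers from the talk:

```python
# Implied module cost at a given cost-per-bandwidth figure.

def transceiver_cost(rate_gbps: float, cents_per_gbps: float) -> float:
    """Module cost in dollars for a given data rate and cents/(Gb/s)."""
    return rate_gbps * cents_per_gbps / 100

# A 400 Gb/s module at the 2020 target of 1 cent per Gb/s:
print(transceiver_cost(400, 1.0))   # 4.0  -> a four-dollar module
# The same module priced like a telecom part at an assumed $10 per Gb/s:
print(transceiver_cost(400, 1000))  # 4000.0
```

A three-orders-of-magnitude gap like this is why the datacom target forces a fundamentally different cost structure than the customized telecom business.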
Self Aware Systems
When it comes to scaling power per function, the keys, as I mentioned, are functional latency and self-aware, self-regulating systems. We're already building that in with electronics. Electronics is still very short-range communication. But if you have a multi-core chip, a core can decide whether the core next to it is being used or not, just by measuring the heat in the chip, and you can deploy resources based on that. That's one of the simpler forms of self-aware systems, and those are becoming more and more common now.
Scaling up speed with parallelism. I mentioned the need for network reconfiguration to do that efficiently.
And then lastly, scaling bandwidth density: data rate, port count and spectral bandwidth. Once you get into the idea of spectral bandwidth, you realize everything has to be single mode. That's quite different from pumping pixels into multimode fiber.
Application View Point
So let's look at this from the application point of view. Let me take two applications: a datacenter and Internet of Things sensors.
The data center: We've got to pack more things together and push the data rates faster. So how do we do heterogeneous integration of memory, logic, power control and photonics together? There are interposer solutions or 2D packaging solutions. But everything seems to be moving, driven by the investment on the electronics side where the dollars already are, toward three-dimensional system and package designs. So, we need to look at that and make sure that photonics is, or will be, compatible.
Let’s look at switch routers to enable that reconfiguration. They must be looking at what kind of traffic they're serving. They need to consider how to do a hybrid of packet switching and circuit switching to be most efficient and operate with the lowest latency.
As far as the Internet of Things is concerned, I'll just mention this. We're going to put sensors everywhere. They're going to need to communicate with one another. They're going to have to operate long term. But how are they going to be powered? So, we must consider both tethered power, where they're connected to a power system, and energy-scavenged power from solar, heat or other sources.
So, in general, the packaging evolution is to bring the electronics closer together, because as the bandwidth goes up, electronics can't support it except over shorter and shorter distances. And the thing we need to consider from photonics is: how does this activity in electronics feed into electronic-photonic packaging synergy?
So let's just go through a couple of the infrastructure issues, starting with electronic-photonic design automation. We need seamless compatibility between digital and analog CAD tools, which doesn't exist today; these are two separate worlds. Photonics is basically an analog platform doing digital work. We need to consider how to put those two things together, and do it seriously. If we look down the performance scaling path, we need to be able to get to 128 terabits per second of input/output on a chip. But we need to do that by making tradeoffs in design, and essentially go down a path of "good enough photonics".
Telecom has been perfect photonics. Once we start integrating devices everywhere, we can use more components and settle for good-enough photonics, getting our system performance without each individual device being perfect.
If we look at a timeline, we need validated PDK models for photonic circuits. Probably the most important thing: we need a foundry infrastructure for IP licensing and indemnification. That means a look-up database, licensing fees; all those things need to be there if we're going to have a serious industry.
By 2025 we need to have electronic-photonic design PDK models and an IC package design platform for electronics and photonics. Remember that in the beginning of integrated circuits, packaging was an afterthought. It isn't any longer. And it is important that in photonics we don't make that same mistake.
We want to design for a package from the very beginning.
Wafer assembly and multi-project wafer runs are the key to development. Laser integration is still open. Is it going to be in the package, or hybrid or monolithic on the chip? Or is it going to be in the wall: will we get wall-plug light in the same way we get electricity out of the wall? This must be solved, and there isn't any obvious solution yet.
Athermalisation is important, particularly if we're doing tuning and filters. Changes in temperature change the index of refraction of materials through the thermo-optic effect. And we need a scalable solution for that.
And by 2025, we need to understand where we're going to add gain and how we're going to have pervasive gain blocks in our circuits. Every time you do a split, you lose 50 percent of the light power, and you just can't keep doing that. If we're going to do complex functions with our photonics, we need integrated gain blocks.
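The 50%-per-split power budget is easy to quantify: each ideal 1x2 split halves the optical power on either arm, i.e. about 3 dB per split. A minimal sketch:

```python
import math

# Power budget after cascaded ideal 1x2 waveguide splits.

def power_after_splits(input_mw: float, n_splits: int) -> float:
    """Optical power (mW) on one output arm after n ideal 1x2 splits."""
    return input_mw / 2 ** n_splits

def loss_db(n_splits: int) -> float:
    """Cumulative split loss in dB (~3 dB per split)."""
    return 10 * math.log10(2 ** n_splits)

print(power_after_splits(1.0, 5))  # 0.03125 mW left after 5 splits
print(round(loss_db(5), 1))        # 15.1 dB
```

After only five splits, over 96% of the launched power is gone, which is why complex circuits need gain blocks rather than ever-larger lasers.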
Inline control and test is the key to manufacturing yield. One of the key things is to limit the number of test protocols. Right now, we have different protocols, and within those we have different protocols for hermetic and non-hermetic packages. We need to cut that down so that testing can be simple and fast.
And I mentioned known-good die before. By 2025, there's no reason why we shouldn't have built-in self-test. If it's too hard to add light to a wafer to do a test, why not just do an electronic test as we always have? Ensure that you have your light sources and your testers on the wafer already.
Design for Packaging
As some people have mentioned already, the big issue is packaging. So this just gives you a few targets for packaging:
- The cost of parts has to go from hundreds of dollars to cents.
- The number of parts has to go from 10 to fewer than five.
- Assembly time needs to go from minutes to seconds, with tolerances down to 0.5 micron, and from hours to minutes overall.
- Capital equipment costs must go way down.
- Fiber attach must go to seconds (and I'm saying we need to get rid of fiber attach altogether).
- Test time needs to go from minutes per unit to seconds per unit.
So that's all (laughter)!
If everything we're adding to the system is taking us in a different direction, then we're doing the wrong thing. So, we need a supply chain to do packaging at low cost. We need to understand the failure modes. We need a comprehensive materials database, so we can select the best materials and understand how they're going to behave under processing. We need to replace fiber pigtails. Is it going to be surface mount? Is it going to be an optical pen equivalent? What is it going to be? It can't be fly-over fiber forever, and pigtails are the equivalent of the old breadboarding of circuits we had decades ago, so we know they're going to go. We just have to figure out what replaces them.
And then lastly, scaling the system packaging architecture. There's the CPU and the ASIC: how do we put those together? The interposer looks like the leading way to do that. And then how do we put power into that, and how do we scale the I/O?
We need sub-micron tolerances, rigid mechanical stability and a low-cost, high-accuracy parts supply chain. So, if we're going to do any assembly of parts, it's very important in photonics that the part dimensions be precise.
It is just like the old days when we started to do assembly lines: we loosened up the tolerances as much as possible so everything would fit together. Well, now we're doing the opposite. Everything needs to fit together with sub-micron alignment tolerance, and the ability to do that is something new and a new challenge.
So let me say a few things about detail and then I'll conclude at a broader level.
If you want to make a roadmap, then you must figure out the attributes you're going to examine over time. When we were just looking at the chip platform level, we came up with this list of attributes. So I took it to the roadmap team and said: okay, we came up with this list of key roadmap attributes, so what do we do now? And Bill Bottoms said: you've got to prioritize them.
And then you have to figure out which applications they apply to. Are you going to be doing memory, or microprocessors, or whatever the application is? And you try to figure out whether you can get a generic set of attributes for a limited set of applications. Then you can chart them up. So it's not easy.
I have just a few examples of these charts so you can look at them. These are the things that we distribute to our technical working groups before they start. You put the numbers in the first time; the second time you refine them a little; the third time they get a little better. You keep looking at them over and over again until the industry aligns with them, and this is what informs the supply chain.
So these are the questions asked about waveguides: what is the transparency? What materials are they going to be made of? What's the index contrast going to be? What is the stability with respect to temperature? How much power can they carry? What wafer uniformity do we need in order to make devices out of that material? And the material system is exactly what is going to change, depending on whether we're looking at a mid-IR platform or a telecom platform and so forth.
When it comes to photodetectors and modulators, it's the same kind of thing: the same kind of attributes. And that brings me to the gain blocks, which are one of the big unknowns. How are we going to make these gain blocks? Remember that yield for those gain blocks is important. As someone who worked on laser reliability, I can tell you that this is the most sensitive device we will be integrating into the chip.
As you can see, creating these charts is a difficult task in the beginning. But once you populate them, it's a way of moving forward.
Lionel Kimerling was speaking at the World Technology Mapping Forum, June 14th 2017.