Could you please introduce yourself and your company?
What is the relationship between the photonic chips that you are building and machine learning? Is that what your company is focused on?
Andrea Rocchetto: Initially, the company started mostly in the quantum technology market. But we’ve witnessed an overlap of the requirements for photonic chips in quantum with the requirements for photonic chips in datacentres. Both care a lot about chips that are low loss, which means that you don’t have a degradation in the quality of your photonic signal, and are fast-switching, so that you can do fast light modulation.
In terms of the applications of photonics in datacentres, I would say there are two main areas: one that is already a reality, and whose impact will increase significantly in the short and medium term, and another that is more speculative.
The near-to-medium-term opportunity is using photonics for networking inside datacentres. It's not just about connecting different servers together; it goes down to the level of the processor (CPU), the graphics card (GPU) and memory. There is a push to replace copper wires, and electronic communications in general, with photonic links. For that, you need new components that are extremely miniaturised, and that's where integrated photonic circuits start to play a role: the networking done today with bulky optical components is not suitable for a board carrying a CPU and a GPU.
The goal is to increase data communication speed, which is crucial for the computational efficiency of the datacentre, but also to increase energy efficiency, because moving information around electronically generates a lot of wasted energy.
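As a rough illustration of that energy argument, the toy calculation below compares the cost of moving data over an electrical versus a photonic link. The picojoule-per-bit figures are hypothetical placeholders chosen for illustration, not numbers from the interview:

```python
# Toy interconnect energy comparison. Both pJ/bit values below are
# hypothetical illustrations, not measured figures.

PJ = 1e-12  # joules per picojoule


def transfer_energy_joules(bytes_moved: int, pj_per_bit: float) -> float:
    """Energy needed to move a payload over a link with a given pJ/bit cost."""
    bits = bytes_moved * 8
    return bits * pj_per_bit * PJ


payload = 10**12  # 1 TB moved between, say, GPU and memory

electrical = transfer_energy_joules(payload, pj_per_bit=5.0)  # hypothetical copper link
optical = transfer_energy_joules(payload, pj_per_bit=1.0)     # hypothetical photonic link

print(f"electrical: {electrical:.1f} J, optical: {optical:.1f} J")
```

At these placeholder rates, every terabyte shuttled between chips costs five times more energy over the electrical link than the optical one, and most of that difference ends up as heat to be dissipated.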
A more long-term goal is doing actual computation on a photonic chip, although we don't work on that aspect. It's a more speculative approach. There have been major startups working in this space that then pivoted away because the competition with GPUs is so fierce at the moment: GPU performance is still far better than what can be achieved with photonic chips today.
What is the distinction between photonics and current hardware technologies, especially when it comes to their use inside datacentres? How is it done now with copper wires and connectors, and what is it that makes photonics inherently faster and more energy efficient?
Andrea Rocchetto: The existing technologies for computing are all silicon-based and use electrons to do the information processing. Photonics promises to do almost the same things but with photons – the particles of light. Photons have two big advantages.
First, photons can carry multiple signals simultaneously using different wavelengths of light (a technique called wavelength division multiplexing). This means a single optical fibre can transmit many independent streams of data at once, dramatically increasing the information capacity compared to electronic systems.
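A minimal sketch of the multiplexing arithmetic (the channel count and per-channel rate below are hypothetical, chosen only to illustrate the scaling):

```python
# Back-of-the-envelope WDM capacity: the aggregate rate of a fibre is the
# number of wavelength channels times the rate per channel.
# Numbers are illustrative, not vendor specifications.

def wdm_capacity_gbps(num_wavelengths: int, rate_per_channel_gbps: float) -> float:
    """Aggregate capacity of a single fibre carrying several wavelength channels."""
    return num_wavelengths * rate_per_channel_gbps


single = wdm_capacity_gbps(1, 100)        # one wavelength at 100 Gb/s
multiplexed = wdm_capacity_gbps(64, 100)  # 64 wavelengths on the same fibre

print(single, multiplexed)  # the same physical fibre now carries 64x the data
```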
Second, photons interact weakly with matter. This means that when a photon travels in an optical fibre, it doesn't waste energy as heat; by flowing through the fibre in this non-interacting way, it keeps energy losses to a minimum.
One of the major sources of energy use is cooling, so if datacentres transitioned to photonic communications, presumably the cooling cost would also be reduced?
Andrea Rocchetto: A datacentre is a very complex object, and there's not one solution to the problems we have. I'm sure you've heard about companies trying to build small nuclear reactors next to datacentres. Photonics could also improve the energy efficiency of datacentres. These are all solutions that will have to be implemented and that ultimately tackle slightly different problems.
GPUs will continue to be used in the future, and they will continue to require a lot of power that will have to be dissipated, so better thermal management will continue to be a priority. All the solutions need to be implemented to move forward.
How do these photonic components actually interface with traditional CPUs and GPUs?
Andrea Rocchetto: The key component here is the transceiver, a device that converts an electrical signal into a photonic signal and back. The effort to miniaturise these transceivers is a critical endeavour in the whole area of photonics for datacentres.
At some point, we will want to build these transceivers directly onto a GPU or CPU; this is called co-packaged optics. On the same wafer, you build photonic components alongside the electrical components. We're not yet there in terms of commercial applications, but that's really where the industry will go at some point.
Where do you see the industry going in the next couple of years? What do you think is needed to support research and innovation in this area?
Andrea Rocchetto: The hyperscalers (Google, Microsoft, Amazon) are the leading customers of, and often also the leading developers of, some of these technologies.
So for example, Google has been working on replacing its datacentre network switches with photonic switches for the last ten years, and has done that in house. At the same time, these companies remain the biggest customers of this technology.
However, the technology is also rapidly evolving. There are new materials being used for photonic circuits, for example, lithium niobate, lithium tantalate, barium titanate. All these materials are electro-optic materials that allow for very fast modulation of light with potentially low losses. One big problem of the existing and most widely used chip manufacturing platform, which is silicon photonics, is that it has very high photonic losses. These other materials promise better switching performance by keeping the losses down.
There are also different substrates: we don't build chips in silicon but in glass. What's the big advantage of glass? Glass is the same material that optical fibres are made of, and this allows us to achieve essentially lossless coupling.
Connectivity with the optical fibres is where the majority of the losses happen when you're trying to network things: most of the loss occurs right at the interface between chip and fibre.
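One way to see why the fibre interface dominates is to add up a link's losses in decibels: losses in dB simply add, and the fraction of optical power that survives is 10^(-dB/10). All values below are hypothetical placeholders, not measurements:

```python
# Toy optical link loss budget. All dB values are hypothetical examples
# chosen to illustrate the point that coupling losses dominate.

losses_db = {
    "fibre-to-chip coupling (in)": 3.0,    # hypothetical: interface loss
    "on-chip propagation": 0.5,
    "chip-to-fibre coupling (out)": 3.0,   # hypothetical: interface loss
    "fibre propagation (short run)": 0.1,
}

total_db = sum(losses_db.values())
delivered_fraction = 10 ** (-total_db / 10)

print(f"total loss: {total_db:.1f} dB")
print(f"optical power delivered: {delivered_fraction:.1%}")
```

With these placeholder numbers, the two coupling interfaces account for 6.0 of the 6.6 dB, which is why a chip substrate that couples nearly losslessly to fibre, such as glass, matters so much.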
Europe has a long history of strength in photonics. Innovation in photonics is also pretty widely distributed across the continent. For example, Italy is an extremely strong place for photonics, and that’s why Ephos is based there.
But photonics would significantly benefit from more funding for photonic foundries. Many of the new materials I mentioned, which are the future of the industry, are non-standard for a clean room. Having facilities where one can experiment and work with these materials is critical, and at the moment there is not much capacity for doing so.
I know that there are existing European projects trying to address that, including part of the European Chips Act, and even the Quantum Chips Act. But I think more can be done at the foundry level. I would also want to point out that China has been investing massively in photonics.
You mentioned that your company started out interested in quantum systems as well. Is the expectation that photonics in the form that it is being developed for datacentres would connect back eventually to quantum computing?
Andrea Rocchetto: There are different hardware platforms that can be used to build qubits (the core component of quantum computers), and photonics is one of the platforms that you can use to build a quantum computer.
There is one technology that will have to be part of the quantum ecosystem, not just for computing, but also for quantum sensing and quantum communications. That technology is photonics, because ultimately photons are the only way we have to move quantum information from one place to another.
If you’re building an ion trap based quantum computer, which is a competing platform, you will ultimately need photons to connect the different traps together. If you’re building a neutral atom quantum computer, again, another competing platform, you will need photons to address your atoms. So photonic information processing is needed in quantum tech. There is absolutely no future of quantum technologies without the ability to manipulate and process photons.
There are some research attempts to use photonics in a machine learning context, but from what you've talked about, the most direct application for now is using photonics to improve the efficiency of the datacentres that those programs run in.
What are your thoughts on the relationship between photonic technologies as they are now and machine learning?
Andrea Rocchetto: The only concrete thing that we have today is that photonics can help to increase the speed and reduce the energy cost of datacentres that are used to train machine learning models.
This is why companies like OpenAI are extremely interested in photonics.
However, there is the other component we were describing before: doing the actual training of models on photonic chips. This technology has been demonstrated in the lab, but the performance is ultimately not competitive with existing GPU-based technologies.
I think we should not discount the possibility that photonic processors could become competitive, especially in an industry where chip manufacturers are targeting ever more specialised chips rather than general computing devices like CPUs. So I think it's certainly possible to have specialised photonic devices that address pressing computational needs in a photonic way.
I would put the attempts to do actual computations with photons more in the risk category of quantum computation, which is immensely promising, but not yet there in terms of impact.
What do you think Europe’s place in developing these technologies is?
Andrea Rocchetto: I just want to conclude with a parting thought: It’s the Golden Age of photonics these days because the technology is mature enough, and the needs that photonics is answering are significant and real. When you have this intersection of needs and solutions, that’s when things blossom. This is really what we’re witnessing today in photonics and Europe is very well positioned to be competitive in this space.

