The writeup on phys.org is troublesome at best. It leads with the Ming Hsieh Department of Electrical and Computer Engineering, then buries the rest of the affiliation in paragraph 5: USC (University of Southern California) and the Abbe Center of Photonics, Friedrich Schiller University Jena, Germany.
This team has made a nonlinear lattice that relies on something they call "Joule-Thomson-like expansion." The Joule-Thomson effect is the ideal gas law in beginning science: PV=nRT. Compression heats a gas, expansion cools a gas.
The reason they're studying the equivalent photonics principle [1] is that it focuses an array of inputs, "causing light to condense at a single spot, regardless of the initial excitation position." Usually the problem is that light propagation is linear: two beams blissfully ignore each other. To do useful switching or compute, one of the beams has to be able to act as a control signal.
A photon gas doesn't conserve the number of particles (n) like beginning physics would suggest. This lets the temperature of the gas control the output.
The temperature, driven by certain specific inputs, produces the nonlinear response. I didn't see a specific claim about what gain they achieved.
This paper is more on the theoretical end of photonics research. Compare practical research such as the device at UBC Vancouver [2], which reports a "weight update speed of 60 GHz" and can cluster "112 x 112-pixel images" - the tech still doesn't compete well against electronics.
TSMC and NVidia are attempting photonics plays too. But they're only achieving raw I/O with photons. They can attach the fiber directly to the chip to save watts and boost speeds.
Basic physics gets in the way too. A photon's wavelength at near UV is 400 nanometers, but the transistors in a smartphone are measured at 7 nanometers ish. Electrical conduction is fundamentally smaller than a waveguide for light. Where light could maybe outshine electrons is in switching speed. But this research paper doesn't claim high switching speed.
[1] https://en.wikipedia.org/wiki/Photon_gas
[2] https://www.nature.com/articles/s41467-024-53261-x
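To put rough numbers on the size-mismatch point above - a back-of-the-envelope sketch; the waveguide rule of thumb and the ~50 nm node figure are my own assumptions (the latter matching the "7 nm is a marketing name" comment further down), not anything from the paper:

    # Rough size comparison: optical waveguides vs transistor features.
    # Rule of thumb: a dielectric waveguide can't confine light much below
    # ~lambda / (2 * n), where n is the refractive index of the core.
    lam_telecom_nm = 1550        # typical silicon-photonics wavelength
    lam_near_uv_nm = 400         # the near-UV figure quoted above
    n_si = 3.48                  # refractive index of silicon near 1550 nm

    min_guide_telecom = lam_telecom_nm / (2 * n_si)   # ~220 nm
    min_guide_near_uv = lam_near_uv_nm / (2 * n_si)   # ~57 nm (ignoring that Si absorbs there)
    transistor_nm = 50           # rough physical feature size at a "7 nm" node

    print(round(min_guide_telecom), round(min_guide_near_uv), transistor_nm)

In practice silicon photonics runs at telecom wavelengths, so the ~220 nm figure is the relevant one, and it stays several times larger than the wires and gates it would replace - which is the point being made above.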
Minor nit: Joule-Thomson is not just the ideal gas law - it is a separate thermodynamic effect entirely. Case in point: for certain gases the change in temperature due to Joule-Thomson has the opposite sign to what you would predict from the ideal gas law alone.
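To make the nit precise (standard thermodynamics, not from the article), the Joule-Thomson coefficient is

    \mu_{\mathrm{JT}} \equiv \left(\frac{\partial T}{\partial P}\right)_{H}
        = \frac{1}{C_p}\left[\, T\left(\frac{\partial V}{\partial T}\right)_{P} - V \right]

and with PV = nRT you get T(∂V/∂T)_P = V, so μ_JT = 0: an ideal gas doesn't change temperature at all when throttled. Any cooling or heating in a real gas (the sign flips at the inversion temperature) comes entirely from intermolecular forces - which is also what the helium trick below relies on.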
This has interesting applications. For example, you can exploit this with dilute metal vapor in an expanding helium gas to cool the metal vapor to very low temperature - the Joule-Thomson expansion of helium increases the helium's temperature by converting the energy of the intermolecular forces into heat. This draws out energy from the metal vapor. If done in a vacuum chamber, then in the region before the shockwave formed by the helium, the supercooled metal atoms will form small van der Waals clusters that can be spectroscopically probed in the jet. This was an interesting area of study back in the 80s that advanced our understanding of van der Waals forces.
Light doesn't interact with itself directly without a third non-light partner. So yes, the light of course needs to interact with a lattice made of atoms to make any switching possible here. This is why we can see light from the stars even though it has traveled through other light for millions of years.
> TSMC and NVidia are attempting photonics plays too.
It's probably been six years since I looked at this space. The problem at the time for TSMC and several other people was that their solutions worked fairly well for firing photons vertically out of the chip and not well at all for firing them horizontally through the chip. I don't know whether, in the short and mid term, an optical PCIe or memory bus buys more overall horsepower than faster cross-chip communication in CPUs. But the solutions they were still chasing back then were good between chips, maybe between chiplets. Which could still be an interesting compromise.
> 400 nanometers, but the transistors in a smartphone are measured at 7 nanometers ish
The best EM sensors need to be at least 1/10th of the wavelength they are sending/receiving, right? 40 nm isn't awful, but it does suggest light for communication between functional units, rather than for assembling them.
> A photon's wavelength at near UV is 400 nanometers, but the transistors in a smartphone are measured at 7 nanometers ish.
Not really, "7 nm" is just a marketing name, the actual transistors are around 50 nm:
https://en.wikipedia.org/wiki/7_nm_process
It's true that the transistors are on the order of 50nm, but the conduits for getting the electrons to those transistors are presumably a bit smaller than that.
Probably not 7nm small, but not the full 50 nm either.
I don't think the author of this piece has a clue how this works. I certainly don't, even after reading it slowly.
It's actually quite comprehensible. Nonlinear optical medium + photon gas -> a photon gas which is no longer ideal, so things like the Joule-Thomson effect can happen in it; then they build simple computing mechanisms out of it.
The details are probably fiddly though.
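For anyone who wants slightly more than hand-waving: the "optical thermodynamics" framework this work builds on (as I understand it from the literature, so take the exact sign conventions with a grain of salt) says that in a weakly nonlinear multimode lattice the power in each linear mode thermalizes to a Rayleigh-Jeans-like distribution,

    |c_i|^2 = \frac{T}{\varepsilon_i - \mu}, \qquad U - \mu N = M\,T,

where the ε_i are the lattice's linear eigenvalues, N is the total optical power, U the "internal energy", M the number of modes, and T and μ play the roles of an optical temperature and chemical potential. "Condensation" into the fundamental mode is what happens as that effective temperature drops.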
- my understanding of nonlinear optical mediums is negligible. Something like the crystals that cause quantum entanglement and emitting photon pairs?
- what is a "photon gas"? Is this a state of matter? What is the matter if photons aren't matter?
- ideal gas law, PV=nRT not obeyed? Due to ionization or something? Photon pressure?
- Joule-Thomson effect?
- Building computers out of light?
- Which thermodynamic properties or laws are being obeyed? Is this something like a Carnot cycle, but with photons?
I agree. The article is written quite superficially, and just when it gets interesting it repeats the stuff from before. Imo the subject and news are really hot shit, but the author did his best to hide it in banality.
I found this article extremely hard to understand, and the linked abstract was not much more help. My impression is that the device can take light coming into one of several input ports and, through some magic of nonlinear optics, ensure that it all ends up at a single output port, something like a funnel. I was unable to determine anything about what this routing mechanism is (heating a substrate, maybe?), if the routing is dynamically changeable, or whether it works in reverse, eg light coming in can be routed to one of several output ports. The latter would seem like a breakthrough, but my impression is that what's described here is more proof-of-concept than prototype.
>what this routing mechanism is (heating a substrate, maybe?)
You can engineer a waveguide if you understand the nonlinear theory they propose. There's no heat exchange involved, which is easy to get confused on because the writing in the article does not really understand "optical thermodynamics".
>if the routing is dynamically changeable
At this point, probably not; it requires a finely engineered waveguide which has a well-defined "ground state".
>it works in reverse, eg light coming in can be routed to one of several output ports
In theory it works in reverse, as everything in this system is time-reversible (i.e., the "optical thermodynamics" is just an analogy and not real thermodynamics, which would break time reversibility). This is demonstrated via a simulation in the SI, but experimentally they did not achieve this (it may be difficult, I am not an experimentalist so cannot comment).
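If it helps to make "finely engineered waveguide lattice" concrete: the generic model behind these arrays is the discrete nonlinear Schrödinger equation. Here's a toy sketch (my own illustrative parameters and periodic boundaries, not the authors' device) that propagates a single-waveguide excitation through a Kerr-nonlinear array:

    import numpy as np

    # Toy discrete nonlinear Schrodinger (DNLS) lattice: N coupled waveguides,
    # nearest-neighbour coupling C and Kerr nonlinearity g.
    #   i d(psi_n)/dz + C (psi_{n+1} + psi_{n-1}) + g |psi_n|^2 psi_n = 0
    N, C, g = 21, 1.0, 2.0
    dz, steps = 0.001, 20000

    psi = np.zeros(N, dtype=complex)
    psi[3] = 1.0                      # excite a single off-centre waveguide

    def rhs(psi):
        neighbours = np.roll(psi, 1) + np.roll(psi, -1)   # periodic boundaries
        return 1j * (C * neighbours + g * np.abs(psi) ** 2 * psi)

    for _ in range(steps):            # 4th-order Runge-Kutta propagation in z
        k1 = rhs(psi)
        k2 = rhs(psi + 0.5 * dz * k1)
        k3 = rhs(psi + 0.5 * dz * k2)
        k4 = rhs(psi + dz * k3)
        psi += (dz / 6) * (k1 + 2 * k2 + 2 * k3 + k4)

    power = np.abs(psi) ** 2
    print("total power (should stay ~1):", round(float(power.sum()), 3))
    print("output power per waveguide:", power.round(3))

Playing with C and g shows the two regimes people care about: weak nonlinearity spreads the light across the array (discrete diffraction), strong nonlinearity self-traps it. The lattice described in the paper is apparently engineered so that the thermalized end state lands in one particular "ground state" mode regardless of which waveguide you excite.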
As best I can understand (which is barely, and poorly!), it seems that this new, and interesting, field of optical thermodynamics allows the behavior of non-linear optical systems to be predicted, in this case allowing them to design a "photonic lattice" - some sort of system of waveguides - so that light behaves in a predictable way and can effectively be steered without having to use any active switching components.
What is even less clear than the above is how this is being used. Presumably it's not just about routing light to some fixed location, but rather allowing it to be switched, so perhaps(?!) the photonic lattice has multiple inputs that interact, resulting in light being steered to one of many outputs? Light being used to switch light?
I dunno - it was clear as mud. I'm basically just guessing here.
Sounds great, but I often find myself wondering "where's the catch?". There's not enough info in the abstract to judge for myself whether the idea has legs. I'm sure it'll get more press if there's something to it.
An apt username to be commenting on light-related topics.
Maybe somewhat tangential to the topic, but there are some companies pushing for the use of light as the data bus for the AI training path. Is the data throughput need really that high for training? Do these companies really have a point?
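They probably do have a point, at least on the bandwidth side. A crude data-parallel estimate - all numbers below are illustrative assumptions, not any vendor's figures:

    # Rough per-step gradient traffic in data-parallel training (assumed numbers).
    params = 70e9                      # model size (parameters)
    bytes_per_grad = 2                 # fp16/bf16 gradients
    gpus = 1024
    step_time_s = 1.0                  # one optimizer step per second

    # Ring all-reduce moves roughly 2 * (p - 1) / p of the gradient volume
    # in and out of every GPU per step.
    per_gpu_bytes = 2 * (gpus - 1) / gpus * params * bytes_per_grad
    per_gpu_gbps = per_gpu_bytes * 8 / step_time_s / 1e9
    print(f"~{per_gpu_gbps:.0f} Gbit/s of gradient all-reduce traffic per GPU")

That's terabit-per-second-per-accelerator territory before you even count model- or pipeline-parallel traffic, which is why people keep eyeing optics for the interconnect rather than for the compute itself.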
This would have some amazing implications but they will also need to build the routing mechanism with light-based attenuation or it will never exceed the speed of electricity in a wire.
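Worth noting that raw propagation speed isn't really where the fight is, though - rough textbook numbers (my own comparison, not from the article):

    # Signal propagation speed: light in fiber vs electrical signals.
    c = 299_792_458                    # m/s
    v_fiber = c / 1.47                 # silica fiber, n ~ 1.47  -> ~0.68 c
    v_trace = 0.6 * c                  # ballpark for a PCB trace or coax
    print(f"fiber ~{v_fiber / c:.2f} c, copper trace ~{v_trace / c:.2f} c")

Both are the same order of magnitude, so the attraction of optics is bandwidth density and energy per bit, not beating electrons in a straight-line race.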
I've always wondered if optical transistors and optical processors might be able to kick the crap out of conventional electronic ones even if they're fabricated at a much larger "process node" if the switching could be orders of magnitude faster.
Electronics has topped out in the gigahertz range. We keep cramming more cores and more ALU units and wider vectors onto a chip by making it smaller and we keep making it more energy efficient, but it's not getting faster in terms of linear compute and hasn't for a while. We've started hitting physical limits there.
Optics could, AFAIK, run in the terahertz range. That's thousands of gigahertz. So wouldn't that be like thousands of electronic cores, but it would accelerate all code including non-parallelizable code?
I've wondered if this might not be a bigger deal for general purpose compute than quantum computing.
Or is my understanding way off?
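On the terahertz point above: the naive arithmetic comes out closer to "hundreds of times one core" than "thousands of cores", and it ignores losses, fan-out, and memory entirely - a sketch with assumed numbers:

    # Naive serial-speedup arithmetic; both rates are assumptions.
    optical_switch_hz = 1e12      # if an optical gate really switched at ~1 THz
    cpu_clock_hz = 5e9            # a high-end electronic core today
    print(f"naive speedup: {optical_switch_hz / cpu_clock_hz:.0f}x per serial thread")

Still a big deal for non-parallelizable code if it ever materializes, but memory latency and I/O would likely cap it well before the raw switching rate does.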