Congratulations to the team on the hard engineering effort.
I have had multiple discussions with people who claimed the fundamental limit to be one bit per (received) photon, whereas I insisted such a limit is merely practical: surpassing it is an engineering effort, not a fundamental physics breakthrough.
It is quite understandable why people believe 1 bit per photon to be the fundamental limit: within a given timeframe (set by, say, detector dead time or background noise counts), a photon is either detected or not detected.
This matters because it would set a minimum energy per received bit at a given wavelength, via Planck's constant.
However, I always countered that such a limit is not fundamental with an easy thought experiment. Imagine the following communication scheme:
The transmitter has a number of bits it wishes to transmit; it treats the bit sequence as an integer M and waits M time cycles (after the last transmission) before transmitting another single photon.
The receiver detects photons, counts the time cycles between subsequent photons, converts that count back to binary, and reconstructs the bit sequence.
If the average number of bits encoded per interval is B, then the energy per bit at that wavelength is the single-photon energy divided by B, which can be driven arbitrarily far below one photon's energy per bit.
Disbelief. Every. Time.
The typical retort was that data transmission would be "too slow", which would be true for a single channel.
Now suppose that background photons are mixed into the signal, say stray sunlight scattered by air molecules (think blue sky).
The background has an energy density per unit wavelength. Now consider a spectrometer setup that divides the spectrum into more numerous, finer bins. The background power, and thus the background photon rate, in any specific bin decreases as the bins narrow, while the number of channels increases. This means one could increase the data rate while decreasing the energy per bit, though not indefinitely: as a bin's wavelength (and thus frequency) interval narrows, the achievable modulation rate of that bin eventually falls to the shortest time duration the scheme uses.
I am grateful for this article, because now I can point to a practical demonstration that 1 bit per photon is not a fundamental limit at all.
Interesting phenomenon, but I've found the Wikipedia page much more instructive:
https://en.m.wikipedia.org/wiki/Four-wave_mixing