Quantum Understanding May Spawn Self-Cooling Computers

It’s a simple fact that computers generate heat as they work. Part of this is down to the engineering of the machines, but another part comes from fundamental physics: processing information generates heat. In a mind-blowing new paper, a group of researchers describes how deleting data could actually have a cooling effect, thanks to the mysterious phenomenon of quantum entanglement. Writing in the scientific journal Nature, the team explains how supercomputers, whose performance is often limited by the heat they generate, could benefit from this kind of quantum cooling.

The actual mechanism of the theory can be a little hard to follow, but it centers on the observer and on two different definitions of entropy. In thermodynamics, entropy measures the disorder present in a system. In information theory, entropy measures uncertainty: how much an observer does not know about a set of data. The theory described in the new study holds that the two definitions describe the same underlying quantity, which is essentially a lack of knowledge.
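To make that parallel concrete, here is a sketch of the two textbook formulas (standard results, not quoted from the paper itself; the symbols are my own shorthand). When every outcome is equally likely, the two expressions agree up to a constant factor, which is what allows the theory to treat them as the same thing.

```latex
% Thermodynamic (Boltzmann) entropy: disorder counted as the number W
% of equally likely microstates the system could be in.
S_{\mathrm{th}} = k_B \ln W

% Information (Shannon) entropy of data with outcome probabilities p_i:
% the observer's missing knowledge, measured in bits.
H = -\sum_i p_i \log_2 p_i

% For W equally likely outcomes (p_i = 1/W) the two coincide up to the
% constant k_B \ln 2:
H = \log_2 W
\quad\Longrightarrow\quad
S_{\mathrm{th}} = k_B \ln 2 \; H
```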

Objects don’t possess entropy; observers do. According to this theory, if two observers delete the same amount of data from a memory and one of them knows more about that data, the better-informed observer perceives lower entropy and can complete the deletion using less energy. An observer with perfect “classical” (as in classical physics) knowledge of the data could, in principle, delete it without expending any energy at all. And if the observer and the data share quantum entanglement, the observer can have what amounts to more than complete knowledge, and the entropy drops below zero. In that case the deletion actually removes heat from the system. Of course, the quantum cooling theory is just that: a theory. It has yet to be tested, but conceptually it is entirely possible that quantum entanglement could one day help cool our supercomputers.
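A short worked form of the energy argument helps here. The first line is Landauer’s principle, a standard result; the conditional-entropy version is a sketch of the reasoning described above, with symbol names of my own choosing rather than notation taken from the paper.

```latex
% Landauer's principle: erasing n bits that are completely unknown to
% the observer dissipates at least this much work as heat at
% temperature T.
W \;\ge\; n \, k_B T \ln 2

% If the observer O holds side information about the data A, the
% relevant quantity is the conditional entropy H(A|O), and the bound
% becomes
W \;\ge\; k_B T \ln 2 \; H(A|O)

% Classical knowledge can at best drive H(A|O) to zero, so erasure
% costs no work but extracts none either. With quantum entanglement
% between A and O, H(A|O) can be negative, so W can be negative:
% the erasure draws heat out of the surroundings instead of
% dumping it there.
```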
