The aim of this post is to propose a hypothesis about the collapse of the wave function based on thermodynamic entropy and computational reversibility. This will be done using arguments from statistical mechanics, both quantum and classical, and from the theory of computation and information theory.
In this sense, it is interesting to note that most natural processes exhibit reversible behavior, among which we should highlight the models of gravitation, electromagnetism and quantum physics. In particular, the latter is the basis for all the models of the emerging reality that configure classical (macroscopic) reality.
On the contrary, thermodynamic processes exhibit irreversible behavior, which contrasts with the previous models and poses a contradiction originally pointed out by Loschmidt, since thermodynamic processes are ultimately based on quantum physics, which has a reversible nature. It should also be emphasized that thermodynamic processes are essential to understand the nature of classical reality, since they are present in all macroscopic interactions.
This raises the following question. If the universe as a quantum entity is a reversible system, how is it possible that irreversible behavior exists within it?
This irreversible behavior materializes in the evolution of thermodynamic entropy, in such a way that the dynamics of thermodynamic systems is determined by an increase of entropy as the system evolves in time. As a result, the complexity of the emerging classical reality grows steadily in time, and with it the amount of information in the classical universe.
To answer this question we will hypothesize that the collapse of the wave function is the mechanism that determines how classical reality emerges from the underlying quantum nature, justifying the increase of entropy and, as a consequence, the rise in the amount of information.
In order to go deeper into this topic, we will analyze it from the point of view of the theory of computation and information theory, emphasizing the meaning and nature of the concept of entropy. This point of view is fundamental, since amount of information and entropy are two expressions of the same phenomenon.
Reversible computing
First we must analyze what reversible computation is and how it is implemented. To begin with, it should be emphasized that classical computation has an irreversible nature, which a simple example such as the XOR gate makes clear. (Strictly speaking, the XOR gate alone does not constitute a universal set in classical computation; a gate such as NAND does, meaning that any logical function can be implemented with a set of such gates. But XOR illustrates the loss of information just as well.)
This gate performs the logical function X⊕Y on the logical variables X and Y, and in this process the system loses one bit of information: the input corresponds to two bits of information, while the output carries only one. Therefore, once the function X⊕Y has been computed, it is not possible to recover the values of the variables X and Y.
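This loss of information can be made concrete with a short Python sketch (an illustration added here, not part of the original argument): each output value of the XOR truth table is produced by two different input pairs, so the inputs cannot be recovered from the output.

```python
# Illustrative sketch: the XOR gate maps two input bits to one output bit,
# so distinct inputs collide and the inputs cannot be recovered.
def xor_gate(x, y):
    return x ^ y

outputs = {}
for x in (0, 1):
    for y in (0, 1):
        outputs.setdefault(xor_gate(x, y), []).append((x, y))

# Each output value is produced by two different input pairs,
# so knowing the output does not determine the input: one bit is lost.
print(outputs)  # {0: [(0, 0), (1, 1)], 1: [(0, 1), (1, 0)]}
```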
According to Landauer’s principle [1], this loss of information means that the system dissipates energy into the environment, increasing its entropy: the loss of one bit of information dissipates a minimum energy of k·T·ln 2 into the environment, where k is Boltzmann’s constant and T is the absolute temperature of the system.
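As a numeric illustration of Landauer’s bound (assuming room temperature, T = 300 K, a value chosen here purely as an example):

```python
import math

# Minimum energy dissipated when erasing one bit (Landauer's principle):
# E = k * T * ln 2, with k the Boltzmann constant.
k = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)
T = 300.0         # room temperature, K

E = k * T * math.log(2)
print(f"Landauer limit at {T} K: {E:.3e} J")  # ≈ 2.871e-21 J
```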
Therefore, for a classical system to be reversible it must not lose information, for which two conditions must hold:
- The number of input and output bits must be the same.
- The relationship between inputs and outputs must be bijective.
The following figure illustrates these criteria. However, satisfying them does not by itself make a gate a complete set for implementation in a reversible computational context: if the relationship between inputs and outputs is linear, the gate cannot implement nonlinear functions.

It can be shown that for this to be possible the number of bits must be n ≥ 3, examples being the Toffoli gate (X,Y,Z)→(X,Y,Z⊕XY) and the Fredkin gate (X,Y,Z)→(X, XZ+¬XY, XY+¬XZ), where ¬ denotes logical negation.
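A minimal Python sketch of both gates confirms the two reversibility conditions stated above: the mapping over the 8 possible inputs is bijective, and each gate is in fact its own inverse.

```python
from itertools import product

def toffoli(x, y, z):
    # (X, Y, Z) -> (X, Y, Z XOR (X AND Y))
    return (x, y, z ^ (x & y))

def fredkin(x, y, z):
    # Controlled swap: if X = 1, swap Y and Z
    return (x, z, y) if x else (x, y, z)

for gate in (toffoli, fredkin):
    outputs = {gate(*bits) for bits in product((0, 1), repeat=3)}
    assert len(outputs) == 8               # bijective: no information lost
    for bits in product((0, 1), repeat=3):
        assert gate(*gate(*bits)) == bits  # each gate is its own inverse

print("Toffoli and Fredkin are bijective and self-inverse")
```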

For gates of this type to form a universal reversible set they must also be able to implement nonlinear functions. Both the Toffoli gate and the Fredkin gate satisfy this condition, as their truth tables show, since each can realize the AND function; both are therefore universal for reversible computation. (Universal quantum computation additionally requires gates that create superposition, such as the Hadamard gate.)
One of the reasons for studying universal reversible models of computation, such as the billiard ball model proposed by Fredkin and Toffoli [2], is that they could theoretically lead to real computational systems that consume very low amounts of energy.
But where these models become relevant is in quantum computation, since quantum theory has a reversible nature, which makes it possible to implement reversible algorithms by using reversible logic gates. The reversibility of these algorithms opens up the possibility of reducing the energy dissipated in their execution and approaching the Landauer limit.
Fundamentals of quantum computing
In the case of classical computing, a bit of information can take one of the values {0,1}. In contrast, the state of a quantum variable is a superposition of its eigenstates. Thus, for example, the eigenstates of the spin of a particle with respect to some reference axis are {|0〉,|1〉}, so that the state of the particle |Ψ〉 can be a superposition of the eigenstates, |Ψ〉 = α|0〉 + β|1〉, with |α|² + |β|² = 1. This is what is called a qubit, so that a qubit can simultaneously encode the values {0,1}.
Thus, in a system consisting of n qubits the wave function can be expressed as |Ψ〉 = α0|00…00〉 + α1|00…01〉 + α2|00…10〉 + … + αN−1|11…11〉, with Σi|αi|² = 1 and N = 2^n, such that the system can encode the N possible combinations of n bits and process them simultaneously, which represents an exponential speedup compared to classical computing.
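For illustration, a small numerical sketch (using NumPy; the uniform superposition is just one convenient example) represents an n-qubit state as a vector of N = 2^n amplitudes whose squared moduli sum to 1:

```python
import numpy as np

n = 3         # number of qubits
N = 2 ** n    # dimension of the state space

# A uniform superposition over all N basis states |00..0> ... |11..1>,
# as produced e.g. by a Hadamard gate on each qubit.
psi = np.ones(N) / np.sqrt(N)

# Normalization: the squared amplitudes must sum to 1.
print(np.isclose(np.sum(np.abs(psi) ** 2), 1.0))  # True
```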
The time evolution of the wave function of a quantum system is determined by a unitary transformation, |Ψ’〉 = U|Ψ〉, such that the conjugate transpose of U is its inverse, U†U = UU† = I. The process is therefore reversible, |Ψ〉 = U†|Ψ’〉 = U†U|Ψ〉, keeping the entropy of the system constant throughout, so quantum computing algorithms must be implemented with reversible logic gates. As an example, the inverse of the Fredkin gate is the gate itself, as is easily deduced from its definition.
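This reversibility is easy to check numerically. The sketch below (a random unitary built via QR decomposition, purely as an example) applies U and then U† and recovers the initial state:

```python
import numpy as np

# A random unitary obtained from the QR decomposition of a complex matrix.
rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
U, _ = np.linalg.qr(A)

psi = np.zeros(4, dtype=complex)
psi[0] = 1.0                               # initial state |00>

psi_evolved = U @ psi                      # forward evolution
psi_recovered = U.conj().T @ psi_evolved   # apply U† to reverse it

print(np.allclose(U.conj().T @ U, np.eye(4)))  # True: U†U = I
print(np.allclose(psi_recovered, psi))         # True: evolution undone
```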
The evolution of the state of the quantum system continues until it interacts with a measuring device, in what is defined as the quantum measurement, such that the system collapses into one of its possible states, |Ψ〉 → |i〉, with probability |αi|². Without going into further detail, this behavior raises a philosophical debate, although it has solid empirical confirmation.
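The statistics of quantum measurement can be mimicked classically by sampling outcomes with probabilities |αi|². A minimal sketch, with arbitrarily chosen amplitudes:

```python
import numpy as np

rng = np.random.default_rng(42)

# Qubit state a|0> + b|1>; a measurement yields |i> with probability |alpha_i|^2.
alpha = np.array([np.sqrt(0.8), np.sqrt(0.2)])  # example amplitudes
probs = np.abs(alpha) ** 2

samples = rng.choice([0, 1], size=100_000, p=probs)
print(samples.mean())  # close to 0.2, the probability of outcome |1>
```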
Another fundamental feature of quantum reality is particle entanglement, which plays a fundamental role in the implementation of quantum algorithms, quantum cryptography and quantum teleportation.
To understand what particle entanglement means, let us first analyze the wave function of two independent quantum particles. The wave function of a quantum system consisting of two qubits, |Ψ0〉 = α00|0〉 + α01|1〉 and |Ψ1〉 = α10|0〉 + α11|1〉, can be expressed as their tensor product (⊗):
|Ψ〉 = |Ψ0〉 ⊗ |Ψ1〉 = α00·α10|00〉 + α00·α11|01〉 + α01·α10|10〉 + α01·α11|11〉,
such that both qubits behave as independent systems, since this expression is factorizable into the functions |Ψ0〉 and |Ψ1〉.
However, quantum theory also admits non-factorizable solutions, such as |Ψ〉 = α|00〉 + β|11〉, with |α|² + |β|² = 1, so that if a measurement is performed on one of the qubits, the quantum state of the other collapses instantaneously, regardless of the location of the entangled qubits.
Thus, if one of the qubits collapses into the state |0〉, the other qubit also collapses into the state |0〉. Likewise, if the first collapses into the state |1〉, the other collapses into the state |1〉 as well. This means that the entangled quantum system behaves not as a set of independent qubits but as a single inseparable quantum system, until the measurement is performed.
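These perfect correlations can be illustrated by sampling joint outcomes of the state α|00〉 + β|11〉 with α = β = 1/√2 (a simulation of the measurement statistics only, not of the collapse mechanism itself):

```python
import numpy as np

rng = np.random.default_rng(1)

# Bell state (|00> + |11>)/sqrt(2): amplitudes over the basis 00, 01, 10, 11.
psi = np.array([1, 0, 0, 1]) / np.sqrt(2)
probs = np.abs(psi) ** 2

# Sample joint measurement outcomes; each index encodes both qubits.
outcomes = rng.choice(4, size=10_000, p=probs)
first_qubit = outcomes // 2
second_qubit = outcomes % 2

# The two qubits always agree: measuring one fixes the other.
print(np.all(first_qubit == second_qubit))  # True
```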
This behavior seems to violate the speed limit imposed by the theory of relativity, breaking the principle of locality, which establishes that the state of an object is only influenced by its immediate environment. These inconsistencies gave rise to what is known as the EPR paradox [3], positing that quantum theory was an incomplete theory requiring the existence of hidden local variables in the quantum model.
However, Bell’s theorem [4] proves that quantum physics is incompatible with the existence of local hidden variables. For this purpose, Bell determined what results should be obtained from the measurement of entangled particles, assuming the existence of local hidden variables. This leads to the establishment of a constraint on how the measurement results correlate, known as Bell’s inequalities.
The experimental results obtained by A. Aspect [5] have shown that particle entanglement is a real fact in the world of quantum physics, so that the model of quantum physics is complete and does not require the existence of local hidden variables.
In short, quantum computing is closely linked to the model of quantum physics, based on the concepts of: superposition of states, unitary transformations and quantum measurement. To this we must add particle entanglement, so that a quantum system can be formed by a set of entangled particles, which form a single quantum system.
Based on these concepts, the structure of a quantum computer is as shown in the figure below. Without going into details about the functional structure of each block, the logic gates that constitute the quantum algorithm perform a specific function, for example the product of two variables. In this case, the input qubits would encode all the possible combinations of the input variables, obtaining as a result all the possible products of the input variables, encoded in the superposition of states of the output qubits.

For the information to emerge into the classical world it is necessary to measure the set of output qubits, so that the quantum state randomly collapses into one of its eigenstates, which is embodied in a set of bits that encodes one of the possible outcomes.
But this does not seem to be of practical use. On the one hand, quantum computing offers an exponential speedup by evaluating all the products simultaneously; on the other hand, most of this information is lost when the measurement is performed. For this reason, quantum computing requires algorithm design strategies to overcome this problem.
Shor’s factorization algorithm [6] is a clear example of this. In this case the quantum part of the algorithm does not read out the divisors directly: the input qubits encode, in superposition, the values of a modular exponential function whose period is related to the factors of the number. The quantum measurement yields information about that period, from which a factor can be derived classically, and the remaining divisors follow in polynomial time. This represents a dramatic acceleration with respect to the known classical algorithms, which require super-polynomial time.
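The classical post-processing step can be sketched as follows. Here the period is found by brute force, standing in for the quantum subroutine, and the numbers (N = 15, a = 7) are chosen purely for illustration:

```python
from math import gcd

# Classical post-processing of Shor's algorithm: once the period r of
# f(x) = a^x mod N is known (found by the quantum part), a factor of N
# follows from gcd(a^(r/2) - 1, N). The period search below is brute
# force, standing in for the quantum subroutine.
def find_period(a, N):
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

N, a = 15, 7
r = find_period(a, N)                 # r = 4, since 7^4 = 2401 ≡ 1 (mod 15)
factor = gcd(pow(a, r // 2) - 1, N)
print(r, factor)                      # 4 3
```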
But fundamental questions arise from all this. It seems obvious that the classical reality emerges from the quantum measurement and, clearly, the information that emerges is only a very small part of the information describing the quantum system. Therefore, one of the questions that arise is: What happens to the information describing the quantum system when performing the measurement? But on the other hand, when performing the measurement information emerges at the classical level, so we must ask: What consequences does this behavior have on the dynamics of the classical universe?
Thermodynamic entropy
The impossibility of directly observing the collapse of the wave function has given rise to various interpretations of quantum mechanics, so that the problem of quantum measurement remains an unsolved mystery [7]. However, we can find some clue if we ask what quantum measurement means and what is its physical foundation.
In this sense, it should be noted that the quantum measurement process is based on the interaction of quantum systems exclusively. The fact that quantum measurement is generally associated with measurement scenarios in an experimental context can give the measurement an anthropic character and, as a consequence, a misperception of the true nature of quantum measurement and of what is defined as a quantum observable.
Therefore, if quantum measurement involves only quantum systems, the evolution of these systems will be determined by unitary transformations, so that the quantum entropy will remain constant throughout the whole process. On the other hand, this quantum interaction causes the emergence of the information that constitutes classical reality and ultimately produces an increase in classical entropy. Consequently, what is defined as quantum measurement would be nothing more than the emergence of the information that makes up classical reality.
This abstract view shows up clearly in practical cases. For example, from the interaction between atoms emerge the observable properties that characterize the system they form, such as its mechanical properties. Meanwhile, the quantum system formed by those atoms evolves according to the laws of quantum mechanics, keeping the amount of quantum information constant.
Similarly, the interaction between a set of atoms to form a molecule is determined by the laws of quantum mechanics, and therefore by unitary transformations, so that the complexity of the system remains constant at the quantum level. However, at the classical level the resulting system is more complex, emerging new properties that constitute the laws of chemistry and biology.
The question that arises is how equations at the microscopic level that are invariant under time reversal can lead to a time asymmetry, as exhibited by the heat diffusion equation and by Boltzmann’s H-theorem.
Another objection to this behavior, and to a purely mechanical basis for thermodynamics, is due to the fact that every finite system, however complex it may be, must recover its initial state periodically after the so-called recurrence time, as demonstrated by Poincaré [8]. However, by purely statistical analysis it is shown that the probability of a complex thermodynamic system returning to its initial state is practically zero, with recurrence times much longer than the age of the universe itself.
Perhaps the most significant feature, and the one that most clearly highlights the irreversibility of thermodynamic systems, is the evolution of the entropy S, which measures the complexity of the system and whose temporal dynamics is non-decreasing, Ṡ ≥ 0, with strict growth outside equilibrium. What is more relevant is that this behavior can be derived from the quantum description of the system by means of the Pauli master equation [9].

This shows that classical reality emerges from quantum reality in a natural way, which supports the hypothesis put forward: the interaction between quantum systems results in what is called the collapse of their wave functions, from which classical reality emerges.
Thermodynamic entropy vs. information theory
The analysis of this behavior from the point of view of information theory confirms this idea. The fact that quantum theory is time-reversible means that the complexity of the system is invariant. In other words, the amount of information describing the quantum system is constant in time. However, the classical reality is subject to an increase of complexity in time determined by the evolution of thermodynamic entropy, which means that the amount of information of the classical system is increasing with time.
If we assume that classical reality is a closed system, this poses a contradiction since in such a system information cannot grow over time. Thus, in a reversible computing system the amount of information remains unchanged, while in a non-reversible computing system the amount of information decreases as the execution progresses. Consequently, classical reality cannot be considered as an isolated system, so the entropy increase must be produced by an underlying reality that injects information in a sustained way.
In short, this analysis is consistent with the results obtained from quantum physics by means of the Pauli master equation, which shows that the entropy growth of classical reality follows from its quantum nature.
It is important to note that the thermodynamic entropy can be expressed as a function of the probabilities of the microstates as S = −k Σi pi ln pi, where k is the Boltzmann constant. This expression matches the amount of information in a system if the physical dimensions are chosen such that k = 1. Therefore, it seems clear that the thermodynamic entropy represents the amount of information that emerges from the quantum reality.
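This correspondence is easy to make concrete. The following sketch computes S = −k Σ pi ln pi with k = 1, showing that a uniform distribution over four microstates carries two bits of information, while a fully determined state carries none:

```python
import numpy as np

# Gibbs/Shannon entropy of a probability distribution over microstates,
# S = -k * sum(p_i * ln p_i); with k = 1 it equals the information content
# in nats (divide by ln 2 to convert to bits).
def entropy(p, k=1.0):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                        # convention: 0 * ln 0 = 0
    return -k * np.sum(p * np.log(p))

uniform = [0.25, 0.25, 0.25, 0.25]      # maximal uncertainty
peaked = [1.0, 0.0, 0.0, 0.0]           # fully determined state

print(entropy(uniform) / np.log(2))     # 2.0 bits
print(entropy(peaked))                  # 0.0
```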
But there remains the problem of understanding the physical process by which quantum information emerges into the classical reality layer¹. It should be noted that the analysis that obtains the classical entropy from the quantum state of the system is purely mathematical and does not provide physical criteria on the nature of the process. Something similar happens with the analysis of the system from the point of view of classical statistical mechanics [10], where the entropy is obtained from the microstates of the system (generalized coordinates qi and generalized momenta pi), so it does not provide physical criteria to understand this behavior either.
The inflationary universe
The expansion of the universe [11] is another example of how the entropy of the universe has grown steadily since its beginning, suggesting that the classical universe is an open system. But, unlike thermodynamics, in this case the physical structure involved is the vacuum.
It is important to emphasize that historically physical models integrate the vacuum as a purely mathematical structure of space-time in which physical phenomena occur, so that conceptually it is nothing more than a reference frame. This means that in classical models, the vacuum or space-time is not explicitly considered as a physical entity, as is the case with other physical concepts.
The development of the theory of relativity is the first model in which it is recognized, at least implicitly, that the vacuum must be a complex physical structure. While it continues to be treated as a reference frame, two aspects clearly highlight this complexity: the interaction between space-time and momentum-energy, and its relativistic nature.
Experiments such as the Casimir effect [12] or the Lamb shift show the complexity of the vacuum: quantum mechanics attributes to the ground state of electromagnetic radiation zero-point field fluctuations that pervade empty space at all frequencies. Similarly, the Higgs field is assumed to permeate all of space, such that particles interacting with it acquire mass. But ultimately there is no model that defines spacetime beyond a simple abstract reference frame.
However, it seems obvious that the vacuum must be a physical entity, since physical phenomena occur within it and, above all, its size and complexity grow systematically. This means that its entropy grows as a function of time, so the system must be open, with a source that injects information in a sustained manner. Current cosmology attributes the accelerated expansion to dark energy [13], although its existence and nature are still hypothetical.
Conclusions
From the previous analysis it is deduced that the entropy increase of the classical systems emerges from the quantum reality, which produces a sustained increase of the information of the classical reality. For this purpose different points of view have been used, such as classical and quantum thermodynamic criteria, and mathematical criteria such as classical and quantum computation theory and information theory.
The results obtained by these procedures are concordant, supporting the hypothesis that classical reality emerges in a sustained manner from quantum interaction and providing insight into what is meant by the collapse of the wave function.
What remains a mystery is how this occurs, for while the entropy increase is demonstrated from the quantum state of the system, this analysis does not provide physical criteria for how this occurs.
Evidently, this must be produced by the quantum interaction of the particles involved, so that the collapse of their wave function is a source of information at the classical level. However, it is necessary to confirm this behavior in different scenarios since, for example, in a system in equilibrium there is no increase in entropy and yet there is still a quantum interaction between the particles.
Another factor that must necessarily intervene in this behavior is the vacuum, since the growth of entropy is also determined by variations in the dimensions of the system, which is also evident in the case of the inflationary universe. However, the lack of a model of the physical vacuum describing its true nature makes it difficult to establish hypotheses to explain its possible influence on the sustained increase of entropy.
In conclusion, the increase of information produced by the expansion of the universe is an observable fact that is not yet justified by a physical model. In contrast, the increase of information determined by entropy is a phenomenon that emerges from quantum reality and is justified by the model of quantum physics; as has been proposed in this essay, it would be produced by the collapse of the wave function.
Appendix
¹ The irreversibility of the system is obtained from the quantum density matrix:
ρ(t) = Σi pi(t) |i〉〈i|,
where |i〉 are the eigenstates of the Hamiltonian ℌ0, the general Hamiltonian being ℌ = ℌ0 + V, with the perturbation V the cause of the state transitions. For example, in an ideal gas ℌ0 could be the kinetic energy and V the interaction produced by the collisions between the atoms of the gas.
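For a density matrix that is diagonal in the eigenbasis of ℌ0, the von Neumann entropy S = −Tr(ρ ln ρ) reduces to the classical expression over the probabilities pi. A short numerical sketch (with arbitrarily chosen pi, purely for illustration):

```python
import numpy as np

# Density matrix diagonal in the eigenbasis of H0: rho = sum_i p_i |i><i|.
# Its von Neumann entropy S = -Tr(rho ln rho) reduces, for diagonal rho,
# to the classical entropy of the probabilities p_i.
p = np.array([0.5, 0.3, 0.2])   # example occupation probabilities
rho = np.diag(p)

eigvals = np.linalg.eigvalsh(rho)
eigvals = eigvals[eigvals > 0]   # convention: 0 * ln 0 = 0
S = -np.sum(eigvals * np.log(eigvals))
print(S)  # ≈ 1.0297 nats
```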
Consequently, the Pauli master equation takes into consideration the interaction of the particles with each other and their relation to the volume of the system, but in an abstract way. Thus, the interaction of two particles has a quantum nature, exchanging energy by means of bosons, something that is hidden in the mathematical development.
Similarly, the gas particles interact with the vacuum, and this interaction is fundamental, as is evident in the expansion of the gas shown in the figure. However, the quantum nature of this interaction is hidden in the model. Moreover, it is not possible to establish what this interaction is like, beyond its effect on the motion of the particles, since we lack a model of the vacuum that would allow such an analysis.
References
[1] R. Landauer, “Irreversibility and Heat Generation in the Computing Process,” IBM J. Res. Dev., vol. 5, pp. 183-191, 1961.
[2] E. Fredkin and T. Toffoli, “Conservative logic,” International Journal of Theoretical Physics, vol. 21, pp. 219-253, 1982.
[3] A. Einstein, B. Podolsky and N. Rosen, “Can Quantum-Mechanical Description of Physical Reality Be Considered Complete?,” Physical Review, vol. 47, pp. 777-780, 1935.
[4] J. S. Bell, “On the Einstein Podolsky Rosen Paradox,” Physics, vol. 1, no. 3, pp. 195-200, 1964.
[5] A. Aspect, P. Grangier and G. Roger, “Experimental Tests of Realistic Local Theories via Bell’s Theorem,” Phys. Rev. Lett., vol. 47, pp. 460-463, 1981.
[6] P. W. Shor, “Polynomial-Time Algorithms for Prime Factorization and Discrete Logarithms on a Quantum Computer,” arXiv:quant-ph/9508027v2, 1996.
[7] M. Schlosshauer, J. Kofler and A. Zeilinger, “A Snapshot of Foundational Attitudes Toward Quantum Mechanics,” arXiv:1301.1069, 2013.
[8] H. Poincaré, “Sur le problème des trois corps et les équations de la dynamique,” Acta Math., vol. 13, pp. 1-270, 1890.
[9] F. Schwabl, Statistical Mechanics, pp. 491-494, Springer, 2006.
[10] F. W. Sears, An Introduction to Thermodynamics, the Kinetic Theory of Gases, and Statistical Mechanics, Addison-Wesley, 1953.
[11] A. H. Guth, The Inflationary Universe, Perseus, 1997.
[12] H. B. G. Casimir, “On the Attraction Between Two Perfectly Conducting Plates,” Indag. Math., vol. 10, pp. 261-263, 1948.
[13] P. J. E. Peebles and B. Ratra, “The cosmological constant and dark energy,” Reviews of Modern Physics, vol. 75, no. 2, pp. 559-606, 2003.