The arrow of time
Why do we experience time the way we do? The future seems a very different beast to the past, but as far as we can tell all of nature's fundamental laws are fully reversible. Take a ball that's just been kicked into the air. Newton tells us that the ball will lose speed as it rises to its greatest height, at which point it will start to fall, and that it will trace out a parabola as it does. However, if we film the ball and play the video backwards, it will do exactly the same thing. This is because the laws apply equally well when you start with "final" conditions instead of "initial" conditions and rewrite the equations in terms of "backwards time" $\tau = -t$ instead of time $t$.
Despite this, we do not experience the two directions of time in the same way. In this post I will give an argument for why this is the case - one that borrows from thermodynamics, Hamiltonian classical mechanics, and Landauer's Limit.
Phase Space and Entropy
The state of a classical system can be specified by its coordinates $x_i$ and momenta $p_i$$^{\dagger}$, which together define a single point in "phase space". Hamilton showed that this point evolves according to his equations:
\begin{align}
\dot{x_i} &= \frac{\partial{H}}{\partial{p_i}} \\
\dot{p_i} &= -\frac{\partial{H}}{\partial{x_i}}
\end{align}
where $H$ is the Hamiltonian - the total energy written as a function of the $x_i$ and $p_i$.
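To make this concrete, here's a minimal sketch (my own illustration, using an arbitrarily chosen harmonic-oscillator Hamiltonian) of stepping a system through phase space with Hamilton's equations:

```python
# Hamiltonian for a 1-D harmonic oscillator: H = p**2/(2*m) + k_spring*x**2/2
# (m and k_spring are illustrative made-up values)
m, k_spring = 1.0, 1.0

dt, steps = 0.001, 10_000   # integrate up to t = 10
x, p = 1.0, 0.0             # initial point in phase space
for _ in range(steps):
    p -= (k_spring * x) * dt   # p_dot = -dH/dx
    x += (p / m) * dt          # x_dot =  dH/dp

# The exact solution is x = cos(t), p = -sin(t), so we expect
# roughly (-0.839, 0.544) at t = 10.
print(x, p)
```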
Entropy, meanwhile, comes from thermodynamics, where it is defined by
$$
dS = \frac{Q}{T}
$$
This says that the change in entropy of part of a system is equal to the heat energy added to it divided by its temperature. The context here is that the part in question should start off in a state of homogeneous temperature and pressure and likewise end up in a homogeneous state. And it should be noted that this is the entropy change for just part of the system: clearly that heat came from somewhere, and that somewhere lost entropy, although perhaps not the same amount, since it may be at a different temperature. The famous 2nd law of thermodynamics, $dS \ge 0$, states that for an isolated system the total entropy can never decrease with time. If the isolated system consists of a hot material and a cold material, then heat cannot flow from the cold material to the hot material. If the isolated system is more complex, then it is in fact possible for heat to flow from cold to hot in some parts of the system, as long as entropy gains elsewhere more than compensate for the corresponding entropy loss.
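As a quick sanity check of this bookkeeping, here's a tiny sketch (mine, with made-up temperatures) that adds up the entropy changes when heat $Q$ flows between a hot part and a cold part of an isolated system:

```python
# Heat Q flows out of a body at T_hot and into a body at T_cold.
# Temperatures here are illustrative, not from the post.
Q, T_hot, T_cold = 1.0, 400.0, 300.0   # J, K, K

dS_hot = -Q / T_hot     # the hot body loses entropy
dS_cold = +Q / T_cold   # the cold body gains entropy

# Positive (~8.3e-4 J/K), so hot -> cold is allowed by the 2nd law.
# Reversing the flow flips the sign, so cold -> hot on its own is not.
print(dS_hot + dS_cold)
```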
Boltzmann later discovered that this thermodynamic entropy can also be defined statistically$^{\dagger_2}$, in terms of the volume $\Omega$ of microscopic states that look alike from the macroscopic point of view:
$$
S = k\ \ln(\Omega)
$$
where $k$ is Boltzmann's constant.
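To get a feel for the sizes involved (this example is mine, not from the original post), here is the statistical entropy of a system whose microstates are the configurations of $N$ independent bits, so that $\Omega = 2^N$:

```python
import math

k = 1.380649e-23   # Boltzmann's constant, J/K

def boltzmann_entropy_of_bits(n_bits):
    # S = k * ln(Omega) with Omega = 2**n_bits; computing
    # k * n_bits * ln(2) avoids forming the astronomically large 2**n_bits.
    return k * n_bits * math.log(2)

print(boltzmann_entropy_of_bits(1))          # one bit: ~9.6e-24 J/K
print(boltzmann_entropy_of_bits(8 * 2**30))  # one gigabyte: ~8.2e-14 J/K
```

The $k \ln 2$ per bit here is the same quantity that shows up in Landauer's Limit.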
Penrose's answer
Liouville's Theorem
Hamilton pointed out that you can use his equations to calculate a trajectory for any point in phase space, and so work out how the corresponding system would evolve in time. Liouville was thinking along the same lines, but for regions of phase space. If you start with a blob in phase space and evolve each point in it, you end up with a new blob. The curious thing that Liouville managed to demonstrate is that the volume of the blob remains the same when you do this$^{\dagger_3}$. This surprising result tells us that phase space is in some sense a natural way to describe the space of states for a system.
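Here's a quick numerical illustration of this (my own sketch, using a pendulum Hamiltonian picked for variety): we estimate the Jacobian determinant of the time-$t$ flow map by finite differences, and volume preservation means it should come out as 1.

```python
import math

def flow(x, p, dt=0.001, steps=5_000):
    # Pendulum: H = p**2/2 + (1 - cos(x)), evolved with the symplectic
    # Euler method, which shares the exact flow's volume-preserving property.
    for _ in range(steps):
        p -= math.sin(x) * dt   # p_dot = -dH/dx
        x += p * dt             # x_dot =  dH/dp
    return x, p

# Estimate the Jacobian determinant of the flow map at (1, 0) by
# central finite differences; Liouville says it should equal 1.
eps = 1e-6
xa, pa = flow(1.0 + eps, 0.0)
xb, pb = flow(1.0 - eps, 0.0)
xc, pc = flow(1.0, 0.0 + eps)
xd, pd = flow(1.0, 0.0 - eps)

dx_dx, dp_dx = (xa - xb) / (2 * eps), (pa - pb) / (2 * eps)
dx_dp, dp_dp = (xc - xd) / (2 * eps), (pc - pd) / (2 * eps)
print(dx_dx * dp_dp - dx_dp * dp_dx)   # ~1.0: the blob's volume is unchanged
```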
Memories and other records
What is it that makes the future feel different from the past? It is that we can recollect the past but not the future. That is to say, we have memories only of the past state of the world: structures in our brains that we can play back to regenerate images of the past. Memories are an example of a more general concept: the record. A record could be a memory, but it could also be a photograph, a CD-ROM, some notes written on a scrap of paper, or bytes in a database. In each case the record contains information about the past which - with the right equipment - can be played back to answer questions about the state the world was in.
My goal is to demonstrate that records can in fact only be laid down while entropy is increasing. This then completely answers our question of why the future feels different to the past: we live in a part of the block universe in which entropy is changing with time, and any records that exist must relate to states of the universe in which the entropy was smaller, giving us a very strong sense of "earlier" and of "later".
A Toy Universe
To illustrate, let's imagine a simple universe which has very little in it. Specifically, it has two heat reservoirs connected together by a small amount of heat-conducting material, and it has a computer with a gigabyte of memory. The purpose of the memory is to make records about the state of the heat reservoirs. You can imagine a record being just two numbers - representing the temperatures of each reservoir - which is appended to a long list. Now, obviously you need more than memory to make a computer: you need a CPU, a heat sink, and so forth. However, I'm going to ignore all of those components since - in this ideal computer - their macroscopic state does not change over time, unlike the states of the reservoirs and the memory.
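To pin down what "making a record" means here, a minimal sketch of the record format (all names are my own invention):

```python
# The computer's memory holds a growing list of records, each one
# a pair of reservoir temperatures at some moment (values illustrative).
memory = []

def record_temperatures(t_hot, t_cold):
    memory.append((t_hot, t_cold))

record_temperatures(373.0, 273.0)   # a record of the initial state
record_temperatures(372.9, 273.1)   # a moment later: heat has flowed
print(memory)
```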
Let's assume that one heat reservoir is at 0 Celsius (273 K) and the other at 100 Celsius (373 K), and let's consider an interval of time $\Delta t$ over which 1 Joule is transferred between them. In the direction of time in which the heat "flows" from hot to cold, the entropy $S$ has increased by $\frac{1\ \mathrm{J}}{273\ \mathrm{K}}-\frac{1\ \mathrm{J}}{373\ \mathrm{K}} \approx 0.001\ \mathrm{J\,K^{-1}}$.
Remember that entropy is $k$ times the logarithm of the volume $\Omega$ of the "foam bubble" in phase space in which every point "looks alike" from the macroscopic point of view. So by $\Delta t$ the phase point is in a new "foam bubble" which is larger than its original home by a factor of $e^{0.001/k} \approx 2^{10^{20}}$ - that's the number of configurations you can have in 12.5 billion gigabytes of computer memory - roughly all the RAM that exists in the world!
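The arithmetic behind these numbers is easy to check (again, a sketch of my own):

```python
import math

k = 1.380649e-23                     # Boltzmann's constant, J/K
Q, T_hot, T_cold = 1.0, 373.0, 273.0

dS = Q / T_cold - Q / T_hot          # entropy increase of the reservoirs
n_bits = dS / (k * math.log(2))      # the bubble grows by a factor of 2**n_bits

print(dS)                            # ~9.8e-4 J/K, as computed above
print(n_bits)                        # ~1.0e20, i.e. a factor of ~2**(10**20)
print(n_bits / 8 / 2**30 / 1e9)      # ~12 billion gigabytes' worth of bits
```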
In the picture above I've shown that the original "foam bubble" in phase space maps into the larger one but does not map completely onto it. This is because - according to Liouville - phase space behaves like an incompressible fluid! So the image is very wispy, with very little density within the target bubble. In this time direction there is no reason why the gigabyte of memory could not become populated with records for the period $\Delta t$. But now consider the reverse time direction, in which the phase point travels from the larger bubble into the smaller one. How is this possible when Liouville's Theorem tells us that phase space is incompressible? The answer is that in fact the original larger bubble does not map into just one smaller bubble but into at least $2^{10^{20}}$ different bubbles.
Each of these bubbles represents the same macroscopic state for the reservoirs, so they must represent different states for the memory. But wait! One gigabyte of memory can only have $2^{2^{33}}$ different states, which is much less than $2^{10^{20}}$. This means that only a tiny subset of the larger bubble can possibly map into any of the smaller bubbles, and - within that - the smaller bubble you end up in is random.

- $\dagger$ Actually you can use any generalised co-ordinates $q_i$ instead of $x_i$ so long as the $q_i$ and the $\dot{q_i}$ completely define the state. You end up with different $p_i$ which are known as conjugate momenta instead of just momenta, and of course a different function $H$. But Hamilton's laws still work, and phase space has the same properties.
- $\dagger_2$ For simplicity I've assumed that $x_i$ and $p_i$ are the microscopic degrees of freedom and temperature and pressure are the macroscopic degrees of freedom. The actual definition of statistical entropy is phrased in terms of more general degrees of freedom.
- $\dagger_3$ Proof: Let $V=\prod_i \Delta x_i \Delta p_i$, then $\delta V = \sum_i \left( \frac{\delta \Delta x_i}{\Delta x_i} + \frac{\delta \Delta p_i}{\Delta p_i} \right) V$. But $\delta \Delta x_i = \delta t\, \Delta \dot{x_i} = \delta t\, \Delta x_i \frac{\partial \dot{x_i}}{\partial x_i} = \delta t\, \Delta x_i \frac{\partial^2 H}{\partial x_i \partial p_i}$. Do a similar calculation for $\delta \Delta p_i$ and substitute into the formula for $\delta V$ to see that everything cancels out.