A commentary on "Information flow in a kinetic Ising model peaks in the disordered phase"

Lionel Barnett, Joseph T. Lizier, Michael Harré, Anil K. Seth and Terry Bossomaier, Phys. Rev. Lett. 111(17), 2013.

Download preprint
Download supplemental material

Phase transitions are ubiquitous in nature. The canonical examples stem from the physical sciences: thus materials exhibit sharp transitions from solid to liquid to gaseous form when temperature or pressure crosses some threshold. Magnetisation in some materials is another example. These state transitions may generally be associated with an order parameter - the density, for example, in solid-liquid-gas transitions, or average magnetisation in a metal - which changes abruptly (or undergoes a discontinuity) when some system parameter, like temperature, crosses a critical threshold. An important class of phase transitions, furthermore, may be characterised as order-disorder transitions. Thus, for example, the molecules in a solid are highly ordered in space, while those in a liquid are more disordered. Similarly, in a magnetised material all the electron spins point in the same direction (order), while if the material becomes demagnetised the spins are randomised (disorder).

What these complex dynamical systems have in common is that they comprise large ensembles of interacting elements - the molecules in a gas, the electron spins in a magnetic material. It should come as no surprise, then, that we find phase transitions in a huge variety of complex systems comprising large ensembles of interacting elements, such as financial markets, neural systems, ecosystems, flocking birds, etc. Somewhat counter-intuitively, in a variety of complex systems disorder is associated with a "normal" state, while order is associated with a "pathological" state. So for example in a healthy financial market, prices appear to fluctuate quite randomly, whereas market crashes are associated with more ordered "herding" behaviour, leading to instability. In a similar vein, some types of epileptic seizures are associated with an abnormal degree of synchronisation of neural activity, in contrast to the comparatively desynchronised activity observed in a normally functioning brain.

Obviously, it would be of major importance if we were able to predict when a complex dynamical system is about to transition from healthy disorder to pathological order, and a substantial research effort has been spent on ways to do this. One line of attack is via information theory. In particular, so-called mutual information between individual elements of a complex system - roughly, how much the state of one element tells you about the state of the other - turns out to be a useful characterisation of the order/disorder balance: if the average order in a system is very high, then average mutual information is low, since there is not much uncertainty in the state of individual elements to begin with, and consequently one element doesn't tell you much about the state of another. But if average order is very low then average mutual information between system elements is again low; now system elements behave near-independently, so again one element doesn't tell you much about the state of another.
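
To make this concrete, here is a minimal sketch (in Python - my own illustrative code, not from the paper) of how mutual information between two binary spins might be estimated from their joint state frequencies. The function name and the simple plug-in estimator are my choices. Note how both the fully ordered and the fully disordered extremes give (near-)zero mutual information, while correlated-but-uncertain spins give a clearly positive value:

```python
import numpy as np

def mutual_information(x, y):
    """Plug-in estimate, in bits, of the mutual information I(X;Y)
    between two sequences of binary (+1/-1) spin states."""
    pxy = np.zeros((2, 2))
    for xi, yi in zip(x, y):
        pxy[(xi + 1) // 2, (yi + 1) // 2] += 1   # map -1/+1 to index 0/1
    pxy /= pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)    # marginal distributions
    return sum(pxy[i, j] * np.log2(pxy[i, j] / (px[i] * py[j]))
               for i in range(2) for j in range(2) if pxy[i, j] > 0)

rng = np.random.default_rng(0)
n = 10000
ordered = [1] * n                                   # full order: no uncertainty at all
x = rng.choice([-1, 1], size=n)
y_indep = rng.choice([-1, 1], size=n)               # full disorder: independent spins
y_corr = np.where(rng.random(n) < 0.9, x, -x)       # intermediate: correlated spins

print(mutual_information(ordered, ordered))          # 0 bits
print(mutual_information(x.tolist(), y_indep.tolist()))  # ~0 bits
print(mutual_information(x.tolist(), y_corr.tolist()))   # ~0.53 bits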

Hence we would expect that mutual information attains a peak value at some intermediate balance between order and disorder, and indeed this is exactly what we see in a range of complex systems. But - and here's why mutual information is not a useful predictor for phase transitions - the peak value of mutual information inevitably turns out to fall exactly at the phase transition. So imagine you are tracking the average mutual information in a financial market, and observe it to be rising over time. Then the market crashes - it has undergone a phase transition from disordered to ordered dynamics - and you subsequently observe that average mutual information falls again (this is, in fact, exactly what we do see in real-world financial crashes). Thus, since your mutual information indicator peaked at the crash, it only warned you about the crash after it happened!

So what better information-theoretic predictors for phase transitions might there be? Our own research was motivated by the observation that mutual information is in some sense "static"; it measures the relationships between system elements at a fixed point in time. We felt that it was thus missing out on an essential aspect of complex dynamical systems - namely, the dynamics; that is, how system elements interact, or mediate each other's behaviour, over time. What we really wanted to measure was how, in some sense, information "flows through the system" over time. To this end, we decided to investigate a recent addition to the information-theorist's arsenal, so-called transfer entropy. Transfer entropy is a kind of time-lagged mutual information (adjusted to discount shared historical information) that may be interpreted - although this interpretation is controversial - as a measure of information flow, or (even more controversially) as a measure of "causal influence" between system elements.
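
In its simplest form - two binary time series and a history length of one - the transfer entropy from a source Y to a target X is the conditional mutual information I(X_{t+1}; Y_t | X_t): how much the source's present tells you about the target's next state, over and above what the target's own present already tells you. Here is a hedged sketch of a plug-in estimator (again my own illustrative code, not the paper's), with a toy check in which information flows one way by construction:

```python
import numpy as np
from collections import Counter

def transfer_entropy(src, dst):
    """Plug-in estimate, in bits, of transfer entropy from src to dst
    with history length 1: I(dst_{t+1} ; src_t | dst_t)."""
    n = len(dst) - 1
    triples = Counter(zip(dst[1:], dst[:-1], src[:-1]))  # (next, now, src)
    dst_pairs = Counter(zip(dst[1:], dst[:-1]))          # (next, now)
    src_pairs = Counter(zip(dst[:-1], src[:-1]))         # (now, src)
    singles = Counter(dst[:-1])                          # (now)
    return sum((c / n) * np.log2(c * singles[x0]
                                 / (dst_pairs[x1, x0] * src_pairs[x0, y0]))
               for (x1, x0, y0), c in triples.items())

# Toy check: dst copies src one step later (with a little noise), so
# information should flow src -> dst but not the other way around.
rng = np.random.default_rng(0)
src = rng.choice([-1, 1], size=20000)
noise = rng.choice([1, -1], size=20000, p=[0.9, 0.1])
dst = np.roll(src, 1) * noise
print(transfer_entropy(src.tolist(), dst.tolist()))  # clearly positive
print(transfer_entropy(dst.tolist(), src.tolist()))  # near zero
```

Note that for both time series being independent and identically distributed, or for dst simply tracking its own past, the estimate goes to zero - the "discounting of shared historical information" mentioned above.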

Before attempting to apply transfer entropy in anger to real-world data, we decided, as a proof-of-concept, to investigate its behaviour in a classical and well-understood model from theoretical physics, the Ising model. The Ising model, essentially a simplified mathematical model of ferromagnetism, is the fruit fly of phase transitions in mathematical physics. Indeed, it has been said that almost everything we know about phase transitions (and we know a lot!) stems from studying this model. It is also rather a beautiful construct - simple to describe, but remarkably rich in behaviour (and, it must be said, notoriously difficult to analyse mathematically). In the (2D) Ising model we have an ensemble of binary (up/down) "spins", representing the electron spins in a ferromagnetic material, that sit on the vertices of a 2-dimensional lattice. The spins try to line up with the spins of their lattice neighbours, but thermal fluctuations (mediated by a temperature parameter) cause them sometimes to reverse randomly - the higher the temperature, the more likely they are to reverse. At low temperatures, the lining-up effect wins out, and the model exhibits a magnetised state, where (almost) all spins line up in one direction or the other. At high temperatures, thermal fluctuations win out - the spins are random, and there is zero overall magnetisation. But magnetisation does not change smoothly from full-on to zero as temperature is increased; rather, it is a kind of all-or-nothing effect. At a very precise critical temperature the system as a whole transitions abruptly from magnetised to demagnetised in a decidedly non-smooth manner - and then stays demagnetised at all higher temperatures. This is a classical second-order phase transition.
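
For readers who want to play with this, here is a minimal sketch of one standard choice of single-spin-flip dynamics for the 2D Ising model, the Glauber "heat-bath" rule (a kinetic Ising model, as in the paper's title - though the lattice size, sweep counts and temperatures below are arbitrary illustrative choices, not the simulation setup used in the paper):

```python
import numpy as np

def glauber_sweep(spins, beta, rng):
    """One sweep (L*L single-spin updates) of Glauber heat-bath dynamics
    on an L x L periodic lattice, with coupling J = 1 and k_B = 1."""
    L = spins.shape[0]
    for _ in range(L * L):
        i, j = rng.integers(0, L, size=2)
        # Sum of the four nearest neighbours (periodic boundaries)
        h = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
             + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        # Heat-bath rule: spin points up with probability 1/(1+exp(-2*beta*h))
        spins[i, j] = 1 if rng.random() < 1.0 / (1.0 + np.exp(-2.0 * beta * h)) else -1

# Cool a small lattice through the critical temperature
# Tc = 2 / ln(1 + sqrt(2)) ~ 2.269 and watch the magnetisation switch on.
rng = np.random.default_rng(1)
L = 32
spins = rng.choice([-1, 1], size=(L, L))
for T in (5.0, 3.0, 2.27, 1.5):
    for _ in range(500):          # equilibration sweeps (illustrative only)
        glauber_sweep(spins, 1.0 / T, rng)
    print(f"T = {T:5.2f}   |m| = {abs(spins.mean()):.3f}")
```

Run this and the absolute magnetisation |m| sits near zero at high temperatures, then jumps towards one below the critical temperature - the all-or-nothing effect described above.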

Previous work had already demonstrated that average mutual information in the Ising model peaks (as already mentioned) precisely at the phase transition. We worked out an analytic expression for a kind of global average transfer entropy - a system-wide measure of "information flow density", if you like - for the Ising model. The formula alone was not enough to evaluate the measure in closed form, so we still needed to simulate the dynamics of the Ising model to obtain a final result; this required a good week or so of computing time on a supercomputer. The upshot was that - unlike mutual information - our global transfer entropy measure peaked distinctly on the disordered side of the phase transition - that is, at a higher temperature than the critical transition temperature (this in itself was unusual: in the Ising model, just about anything interesting that happens, happens at the critical temperature).
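
The global transfer entropy measure itself has a precise definition (see the preprint and supplemental material); the following is only a crude, small-scale illustration of the flavour of the computation, not the paper's method. It reuses transfer_entropy() and glauber_sweep() from the sketches above, and estimates, for each spin, the flow into it from its four-neighbour sum (which is what the single-spin Glauber update actually depends on), averaged over the lattice:

```python
import numpy as np

# Assumes transfer_entropy() and glauber_sweep() defined as in the
# sketches above.

def global_te_estimate(history):
    """Average, over all sites, of the transfer entropy from a spin's
    four-neighbour sum into that spin, one sweep apart.  `history` has
    shape (timesteps, L, L).  A rough, sweep-resolution proxy only."""
    _, L, _ = history.shape
    total = 0.0
    for i in range(L):
        for j in range(L):
            nbr_sum = (history[:, (i + 1) % L, j] + history[:, (i - 1) % L, j]
                       + history[:, i, (j + 1) % L] + history[:, i, (j - 1) % L])
            total += transfer_entropy(nbr_sum.tolist(), history[:, i, j].tolist())
    return total / (L * L)

# Record a spin history at one temperature and estimate the flow density.
rng = np.random.default_rng(2)
L, steps = 16, 2000
spins = rng.choice([-1, 1], size=(L, L))
history = np.empty((steps, L, L), dtype=int)
for t in range(steps):
    glauber_sweep(spins, 1.0 / 3.0, rng)   # T = 3.0, above Tc
    history[t] = spins
print(global_te_estimate(history))
```

Sweeping this over a range of temperatures (at far larger lattice sizes and run lengths than shown here - hence the supercomputer) is what locates the peak on the disordered side of the transition.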

Why is this result significant? Imagine that you are monitoring both the average mutual information and global transfer entropy of an Ising system - but you don't have access to the temperature or magnetisation (in a real-world system you probably don't even know what the system and order parameters are). Slowly, over time, you observe both the mutual information and transfer entropy to rise... but then the transfer entropy peaks and starts to fall, while the mutual information continues to rise. A plausible explanation for what you have just observed is that the system started off in a high-temperature, disordered state, and the temperature then fell until it crossed the transfer entropy peak (remember, that peak lies on the high-temperature, disordered side of the phase transition). So we might plausibly conclude that the system is moving towards a phase transition... in fact (assuming the trend continues), we predict an imminent phase transition!

Of course, it is a huge jump to extrapolate this effect from a "toy" model in physics to massively complex real-world systems such as financial markets or brains. But... then again... the Ising model has always come through rather nicely for physicists. Non-scientists are often vexed that scientists will expend huge amounts of effort (and grants!) on investigating apparently simplistic models that seem laughably far from the mess and mayhem of the real world. But there are good reasons for this: simplified models can potentially distil the essence of an effect; they can yield clarity - real-world data, by contrast, is noisy and results frequently equivocal. In our case, if we hadn't got a clean result for this most canonical of models, it would have been back to the drawing board; if it didn't work in the Ising model it almost certainly wasn't going to work anywhere else.

So will information flow, as we conjecture, turn out to be a useful predictor for at least some class of phase transitions in real-world complex systems? Well, we surely won't know without testing it out. Let's face it: it may well not work out, but if it does, it's a big prize.


Lionel Barnett, October 2013
