Thanks for your input, scottb.
I am actually pretty well familiar with the thermodynamic aspects of entropy.
My concern is rather with how useful the term “degree of disorder” is for getting the meaning across to general readers, which is my aim.
The problem being that any series such as those I gave as examples would ordinarily be perceived to be ordered. I also wondered whether Shannon entropy, with which I am not very familiar, could in some way be brought to bear on this.
I think you effectively negated this in the first line of your reply:
" Shannon entropy is a measure of how “expected” a sequence is, given its past history"
This, as I suspected, seems to confirm that Shannon theory has nothing to say about absolute configurations, only about probabilities derived from the previous history of a sequence.
"The problem being that any series such as those I gave as examples would ordinarily be perceived to be ordered."
That’s only because they are. They’re fully specified. If I gave you a specific thermodynamic state, say for a small volume of gas, it would give the same sense of being “ordered”.
In a sense, yes, but only in the same sense that thermodynamic entropy has nothing to say about absolute states: ultimately, information entropy and thermodynamic entropy are exactly the same thing.
In thermodynamics, the dynamic evolution of the system is always the same — the laws of quantum mechanics specify exactly how future states evolve from present states. Shannon’s entropy originally came about in considering communications channels, in which the evolution function is part of what’s being studied, so it’s almost always introduced in terms of the probability distribution of subsequent symbols, given a history.
Note that the probabilities are rarely determined by the specific symbols in the history. The usual model is that we know the probability distribution with which the transmitter is sending, but the channel itself introduces noise in such a way that the receiver sees a different symbol from the one sent.
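If it helps to make that concrete, here’s a minimal sketch in Python of Shannon entropy computed from a source’s symbol distribution (the alphabet and probabilities here are illustrative, my own invention, not anything from this thread). Note that the result depends only on the distribution, not on any particular sequence the source happens to emit:

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy in bits: the sum of -p * log2(p) over all p > 0."""
    return sum(-p * log2(p) for p in probs if p > 0)

# An illustrative source emitting four symbols with these (assumed)
# probabilities. The entropy is a property of the distribution itself.
probs = [0.5, 0.25, 0.125, 0.125]
print(shannon_entropy(probs))  # 1.75 bits per symbol

# A fully specified source (one symbol, probability 1) has zero entropy,
# which is the "fully specified" situation mentioned above.
print(shannon_entropy([1.0]))  # 0.0
```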
The “probabilities derived from previous history of a sequence” that you mention are more aligned with notions like Kolmogorov complexity than with entropy, per se, though they’re somewhat related.
I think maybe the issue has to do with seeing entropy as a “degree of disorder” in the first place. That’s really not a good approach. While it’s true that most configurations people are likely to view as “highly ordered” are low entropy configurations, the reverse isn’t true — the majority of low entropy configurations wouldn’t be considered “highly ordered”. Rocks arranged around the rim of a crater don’t seem any more “ordered” than rocks in a pile at the bottom, despite their having different entropy.
Entropy isn’t about single configurations. It’s about subspaces of the full configuration space. In thermodynamic entropy, your macro-level description of the system (temperature, pressure, etc) doesn’t correspond to a single state, it corresponds to many states, which occupy some volume within the space of all possible configurations. The entropy is, quite literally, the logarithm of that volume (times Boltzmann’s constant, for historical reasons).
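Written out, that’s just Boltzmann’s relation, with $\Omega$ standing for the volume (number of microstates) consistent with the macro-level description:

$$S = k_B \ln \Omega$$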
In other words, it’s a measure of the information you have. If your information goes beyond the ordinary macro-scale measurements, you reduce the volume further. In the limit, your information pins the state down to a single point in configuration space, with zero volume and thus zero entropy. In practice, Heisenberg uncertainty prevents you from getting there (particle position and momentum are part of the state, but they’re a conjugate pair), but the principle is the same.
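To put a number on that: if new information shrinks the set of compatible microstates from $\Omega$ to a smaller $\Omega'$, the entropy drops by

$$\Delta S = k_B \ln \frac{\Omega}{\Omega'}$$

so each bit you learn (halving the compatible volume) is worth $k_B \ln 2 \approx 9.6 \times 10^{-24}$ J/K of entropy reduction.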
Information (Shannon) entropy and thermodynamic entropy aren’t different concepts. They’re the exact same concept discovered in two different ways.