
RE: Shannon Entropy

I’ll give it a go.

Shannon entropy is a measure of how “expected” a sequence is, given its past history. So, if you’ve seen “HTHTHTHTHTHTH”, you expect that it’s more likely the next symbol will be “T” than “H”.

To put that more precisely, you assign a higher probability to the outcome “H” than to “T”. The formula for entropy is -∑p(x)log(p(x)), where p(x) is the probability assigned to symbol x, and the sum is taken over all possible values of x. Since the p(x) have to always sum to 1, when one probability increases, it has to come at the expense of the others.
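To see the formula in action, here is a minimal sketch (my own Python, with a hypothetical function name, not something from the original post) that evaluates it for a fair coin and for a biased one; the more lopsided the probabilities, the lower the entropy:

```python
import math

def shannon_entropy(probs):
    """Return -sum(p * log2(p)) over the outcomes, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit per toss
print(shannon_entropy([0.9, 0.1]))  # biased coin: about 0.47 bits, more predictable
```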

Thermodynamic entropy is a measure of how “expected” a given state is. Normally, we “expect” the state to be pretty mixed; we don’t expect all the molecules of air in a room to gather in one corner (somewhat analogous to the “HHHHHH” case).

To bring the two concepts together, it’s important to realize that we can’t actually examine the details of the thermodynamic state directly — instead, we measure various statistical properties of the system (temperature, pressure, volume, mass, charge, etc.) that are related to the state. The detailed state includes the exact number of particles, their energy states, positions, velocities, and so on.

Note that there are usually a huge number of detailed states that have the same statistical properties. Given any state, you can swap any particle for any other similar particle and the statistical properties stay the same (swap any set of electrons, swap any set of protons, etc.). Temperature is essentially a measure of the average kinetic energy of the particles: speed one up and slow another down by the right amounts and you’ve got the same temperature.

The macro-scale statistical measurements you make (temperature, etc.) let you put constraints on the microstates the system could possibly be in. Of all possible states, only some have a given temperature, pressure, and so on. In the entropy formula, these constraints show up as changes in the probabilities: when you measure the temperature, the probabilities have to be adjusted so that states not at that temperature become very improbable. And, as with adjusting the probabilities of the coin tosses, the more constraints you put on the system (the more you can pare down the possible set of microstates), the lower its entropy.

In the coin-tossing view, this is like not being able to see the outcomes of individual tosses, but instead being told statistical properties like the ratio of heads to tosses, the average length of a run of heads, the length of the longest run, and so on. With just ten tosses, there are 1024 possible “states”. If you know that half of the tosses came up heads, you’ve narrowed it down to one of 252 states.
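Here is a small sketch of that counting, assuming we treat the exact toss sequence as the “microstate” and the head count as the “macrostate” (my framing and variable names, not wording from the post):

```python
# For ten tosses, count how many exact sequences ("microstates") are
# consistent with each head count ("macrostate").
from math import comb

n = 10
total = 2 ** n                              # 1024 possible sequences
for heads in range(n + 1):
    count = comb(n, heads)                  # sequences with this many heads
    print(f"{heads:2d} heads: {count:3d} of {total} sequences")
# 5 heads matches 252 sequences, the largest bucket; 0 or 10 heads
# matches exactly one sequence each, the most constrained case.
```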

With a million tosses, there are a huge number of outcomes with exactly half heads. On the other hand, there’s only one possible outcome with a million heads. So, knowing the ratio of heads to tosses gives some constraint on the possible microstate (the exact sequence of tosses). Low entropy states correspond to outcomes where the ratio is far from 1/2.

Suppose you had tossed a fair coin twenty times and — one in a million chance — they all came up heads, a low-entropy state. As you continue to add tosses to the sequence, it’s very unlikely that you’re going to remain in the low entropy state — there’s a 50% chance that the twenty-first toss will come up tails, bumping up the entropy. This is the essence of the second law of thermodynamics — transitions to higher-entropy states are more likely than transitions to lower-entropy states, so entropy increases.
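As a rough illustration of that argument, the sketch below scores “entropy” as log2 of the number of sequences that share the observed head count (a measure I’m assuming for illustration, not one stated in the post) and watches it climb as fair tosses are appended to an all-heads start:

```python
import random
from math import comb, log2

random.seed(1)
tosses, heads = 20, 20                      # start from the all-heads, low-entropy state
print(f"{tosses} tosses, {heads} heads: {log2(comb(tosses, heads)):.2f} bits")
for _ in range(20):
    tosses += 1
    if random.random() < 0.5:               # a fair coin from here on
        heads += 1
    entropy_bits = log2(comb(tosses, heads))
    print(f"{tosses} tosses, {heads} heads: {entropy_bits:.2f} bits")
# Each tails toss pushes the count toward the half-heads bucket, which
# contains vastly more sequences, so the measured entropy tends upward.
```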

Does that help?


Thanks for your input, scottb.

I am actually pretty well familiar with the thermodynamic aspects of entropy.

My concern is rather with the usefulness of the term “degree of disorder” for getting the meaning across to general readers, which is my aim.

The problem being that any series such as those I gave as examples would ordinarily be perceived to be ordered, and whether Shannon entropy, with which I am not very familiar, could in some way be brought to bear on this.

I think you effectively negated this in the first line of your reply:
" Shannon entropy is a measure of how “expected” a sequence is, given its past history"

Which, as I suspected, seems to confirm that Shannon theory has nothing to say about absolute configurations, only about probabilities derived from the previous history of a sequence.

Right?
