Putting numbers on disorder
Chaos vs order: a classic way to divide our world. Humanity has always been fascinated by the distinction. In fact, our fascination runs so deep that we feel the need to measure the order or chaos of all kinds of systems, and pop science loves to reach for "entropy" as the metric of choice.
But is entropy the right measure? And does it adequately describe the things that matter to us, such as language and biology? Let's talk!
Measuring fuzziness
Before unpacking entropy, we need some context. Roughly speaking, there are two types of measurements. Unit measurements (like length and speed) describe the properties of a specific part. System measurements capture high-level, collective behaviours that emerge only when parts form a whole, and have no meaning at the level of an individual part.
At this point you might be scratching your head, but it's simpler than it sounds. Think of group cohesion: it basically means how well the people in a team get along, and it has no meaning at all if you zoom in on an individual. Likewise, temperature measures the average "movement" (kinetic energy) of a group of particles; an individual particle can be fast or slow, but not hot or cold.
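If it helps to see that in code, here is a minimal Python sketch (with made-up numbers, not real physics): each simulated particle only has a speed, and the temperature-like quantity only appears once we average the kinetic energy over the whole collection.

```python
import random

# A toy "system measurement": each particle has only a speed (arbitrary units).
# "Hot" or "cold" is meaningless for a single particle.
random.seed(0)
speeds = [random.gauss(0, 1) for _ in range(10_000)]

# Kinetic energy per particle, with mass set to 1 for simplicity.
kinetic_energies = [0.5 * v ** 2 for v in speeds]

# The system-level quantity: the average over the whole collection.
avg_ke = sum(kinetic_energies) / len(kinetic_energies)
print(f"average kinetic energy of the group: {avg_ke:.3f} (arbitrary units)")
```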
Entropy, explained simply
Entropy is also a system-level metric. A high-level definition will make this clearer: entropy measures the range of possible states a system could be in. In other words, given a description of a system, entropy measures how much (or how little) that description narrows things down.
Consider two football games. In game A you are told that three goals have been scored and the away team is winning, so the score could be 1-2 or 0-3. In game B you are told that only one goal has been scored and the away team is winning, so the only possible score is 0-1. The description of game A leaves more possibilities open, so its score has higher entropy than game B's. This is information entropy as uncertainty.
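For readers who like to see the arithmetic, here is a minimal sketch using Shannon's formula. One assumption on my part: the post only says which scores remain possible, so I treat game A's two remaining scores as equally likely.

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Game A: "3 goals, away team winning" leaves two scores (1-2 and 0-3),
# assumed equally likely here for illustration.
game_a = [0.5, 0.5]

# Game B: "1 goal, away team winning" pins the score down to 0-1.
game_b = [1.0]

print(f"entropy of game A's score: {shannon_entropy(game_a):.2f} bits")  # 1.00
print(f"entropy of game B's score: {shannon_entropy(game_b):.2f} bits")  # 0.00
```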
Entropy has its limits (or does it?)
So is entropy a good metric for chaos, then? As we have seen, entropy cannot really tell you anything about a fully specified situation (the score is 0-1); it only operates at the level of uncertainty. A clean room and a messy room, once their states are fully known, therefore have the same entropy: zero.
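To make that "thus zero" concrete with Shannon's formula: a fully known room is a single state with probability 1, and a sure thing carries no uncertainty.

```latex
H = -\sum_i p_i \log_2 p_i
\qquad\Rightarrow\qquad
H_{\text{known room}} = -\,1 \cdot \log_2 1 = 0
```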
What entropy does tell you, though, is how vague the statement "messy room" is. Entropy evaluates your information, not hard physical indicators. In that light, the second law of thermodynamics (in a closed system, entropy can only stay the same or increase) doesn't necessarily predict the heat death of the universe, as some scientists claim. It simply tells us that, with the current descriptors, our information about the system will become less precise as time goes on. If "less precise" means "chaotic" to you, then entropy is a good metric; otherwise, we might need a new one.
Applications to everyday life
Entropy also has more to teach us about everyday life. Information describes everything from language to our genes, so what is the role of entropy in these fields? Both genetics and human speech have rules (grammar, syntax and so on) that define which sequences are allowed and which are not. Think about it: aren't those rules entropy-management mechanisms?
These rules define our playground. Allow too much variability and the message becomes hard to parse (speech that is difficult to understand, DNA instructions that cannot be decoded). Allow too little and there is no flair (no mutations or development, and an overly rigid language). Language's built-in rules strike this balance by constraining variability. No wonder pattern-recognition machines (AI) are good at producing sensible sentences: the recipe is in the package!
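Here is a toy comparison in that spirit (a sketch, not a linguistic study: it only looks at character frequencies, while real grammar constrains far more than that). A rule-bound English sentence carries fewer bits per character than uniform noise drawn from the same alphabet.

```python
import math
import random
from collections import Counter

def per_symbol_entropy(text):
    """Empirical Shannon entropy of a string, in bits per character."""
    counts = Counter(text)
    total = len(text)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A rule-bound sequence: ordinary English, shaped by spelling and grammar.
english = "the cat sat on the mat and the dog sat on the log"

# An unconstrained sequence over the same alphabet: every symbol equally likely.
random.seed(0)
alphabet = sorted(set(english))
gibberish = "".join(random.choice(alphabet) for _ in range(len(english)))

print(f"rule-bound text:   {per_symbol_entropy(english):.2f} bits/char")
print(f"uniform gibberish: {per_symbol_entropy(gibberish):.2f} bits/char")
# The rules shave entropy off: constrained text carries fewer bits per symbol,
# which is exactly what makes it possible to parse (and to predict).
```

The gap only widens for real text, since grammar and meaning constrain whole words and sentences, not just letter frequencies.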