What if there were an algorithm which procedurally generated responses to any set of circumstances, one so robust it could be applied to any system? Enter causal entropic forces.
But first, what does “causal entropic forces” mean? Let’s take the words in reverse order.
In physics, we talk about force as mass times acceleration. Force, in our sense, is abstracted: we aren’t talking about objects moving through space, but systems moving through states. The “space” is the set of all system states, and the “mass” is the resistance to change in movement through those states. Imagine a business which orders 100 widgets every month. The states of this system are the possible monthly order quantities, and the resistance arises from several factors, including contracts and scalability. If the business is locked into a contract, it may take some time before it can lower orders to 90 or 80 per month. If the business is processing the widgets before selling them, there is some speed at which it can increase its production capacity – it doesn’t make sense to order more widgets than can be processed. In general, the business will not go from ordering 100 widgets to 50 or 200 overnight. We term this resistance to change inertia.
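As a toy sketch of this idea (all numbers here are hypothetical, not from the paper), inertia can be pictured as a cap on how far the system can move through its state space in one step:

```python
# Hypothetical sketch: "inertia" as a limit on how fast a system can move
# between states. A business ordering 100 widgets/month can only adjust
# its monthly order by a bounded amount.

def step_toward(current: int, target: int, max_change: int = 10) -> int:
    """Move one month's order toward the target, limited by inertia."""
    delta = target - current
    # Clamp the change to the most the system can absorb in one month.
    delta = max(-max_change, min(max_change, delta))
    return current + delta

orders, target = 100, 50
history = [orders]
while orders != target:
    orders = step_toward(orders, target)
    history.append(orders)

print(history)  # [100, 90, 80, 70, 60, 50] -- several months to halve orders
```

The larger the inertia (the smaller `max_change`), the longer the system takes to reach a distant state, which is exactly the "mass" in this abstracted sense of force.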
Entropy is often called the disorder of a system, but the more precise definition is the logarithm of the number of accessible states. Think of the entropy of a system as the amount of information required to represent the system. Read this post for a primer.
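This "amount of information" reading can be made concrete with Shannon entropy, measured in bits (a sketch, not from the paper):

```python
import math

def shannon_entropy(probs):
    """Entropy in bits: the average information needed to name a state."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin has two equally likely accessible states -> 1 bit.
print(shannon_entropy([0.5, 0.5]))  # 1.0
# A biased coin is more predictable, so it carries less information.
print(shannon_entropy([0.9, 0.1]))  # ~0.469
```

When all states are equally likely, this reduces to the logarithm of the number of accessible states, matching the definition above.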
An entropic force is a force which increases entropy. Entropic forces are not fundamental forces of our universe – they are an approximation of a statistical phenomenon. An entropic force would model how food dye spreads through still water. At the microscopic level, each dye particle moves randomly, independent of the other dye particles. Modeling the random movements of billions of billions of dye particles is infeasible (and unnecessary for us to understand the system), so we instead represent the system as a concentration of particles diffusing deterministically under an entropic force.
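The statistical phenomenon being approximated can be seen in a minimal random-walk simulation (an illustration I've constructed, not code from the paper): each "dye particle" moves randomly, yet the population as a whole spreads out in a predictable, deterministic-looking way.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

# Many independent random walkers stand in for dye particles in 1D water.
N_PARTICLES, N_STEPS = 10_000, 100
positions = [0] * N_PARTICLES

for _ in range(N_STEPS):
    for i in range(N_PARTICLES):
        positions[i] += random.choice((-1, 1))  # each particle moves randomly

mean = sum(positions) / N_PARTICLES
variance = sum((x - mean) ** 2 for x in positions) / N_PARTICLES
# The mean stays near 0, but the spread (variance) grows linearly with time:
# the concentration diffuses outward as if pushed by a deterministic force.
print(mean, variance)
```

No individual particle is pushed anywhere; the apparent outward "force" is just the statistics of many random movers, which is what the entropic-force description captures.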
Causal refers to the fact that these are not plain entropic forces, but a causal generalization of them. Causal generalizations are statements such as “public education decreases poverty” or “radiation causes cancer.” They aren’t true in every instance (you don’t get cancer every time a photon interacts with a cell in your body), but they are causally linked (every photon of high enough energy which interacts with a cell in your body has a chance of causing the cell to become cancerous). So a causal entropic force is a force which increases path entropy, rather than just entropy.
Recall that entropy measures the number of states in which a system may be. Similarly, path entropy measures the number of future paths the system may take. So while a single flip of a fair coin has one bit of entropy, if we are going to flip the coin ten times, the path entropy is ten bits – there are 2^10 possible futures.
Now that I’ve got these terms defined, I can finally get to discussing the paper.
This post series was inspired by the paper Causal Entropic Forces, written last year by Dr. Wissner-Gross and Dr. Freer.