A measure of the disorder or randomness in a closed system.
Entropy is the thermodynamic property that drives systems toward equilibrium, averaging, homogenization, and dissipation: hotter, more dynamic regions of a system lose heat and energy while cooler regions (e.g., space) gain it; molecules of a solute or gas tend to distribute evenly; material objects wear out; organisms die; the universe is cooling down. In the observable universe, entropy, like time, runs in one direction only: the process is not reversible.
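The tendency of molecules to distribute evenly can be illustrated with a minimal sketch (all names and parameters here are illustrative, not from the original text): particles start crowded into one bin of a box and hop randomly between neighbouring bins; the Shannon entropy of their distribution rises from zero toward the uniform-distribution maximum, log(number of bins).

```python
import math
import random

def shannon_entropy(counts):
    """Shannon entropy (in nats) of a particle distribution over bins."""
    total = sum(counts)
    s = 0.0
    for c in counts:
        if c > 0:
            p = c / total
            s -= p * math.log(p)
    return s

def bin_counts(positions, bins):
    c = [0] * bins
    for p in positions:
        c[p] += 1
    return c

random.seed(0)
BINS, PARTICLES, STEPS = 10, 500, 50_000

# Start fully "ordered": every particle in bin 0.
positions = [0] * PARTICLES
s_start = shannon_entropy(bin_counts(positions, BINS))  # 0.0: one bin holds everything

# Random walk: each step, one particle hops to a neighbouring bin
# (reflecting walls at the ends of the box).
for _ in range(STEPS):
    i = random.randrange(PARTICLES)
    positions[i] = min(BINS - 1, max(0, positions[i] + random.choice((-1, 1))))

s_end = shannon_entropy(bin_counts(positions, BINS))  # approaches log(BINS) ≈ 2.30
```

No individual hop prefers either direction; the drift toward uniformity, and the rise in entropy, comes purely from there being vastly more evenly spread arrangements than crowded ones.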
In classical thermodynamics, entropy is defined phenomenologically through the second law of thermodynamics, which states that the entropy of an isolated system never decreases: it either increases or remains constant.
These processes reduce the order of the initial systems; entropy is therefore a measure of disorder or randomness.
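A textbook worked example of the second law, sketched below under simple assumptions (fixed reservoir temperatures, all values illustrative): when heat Q flows from a hot reservoir at T_hot to a cold one at T_cold, the hot side loses entropy Q/T_hot while the cold side gains Q/T_cold, and because T_cold < T_hot the total change is positive.

```python
def total_entropy_change(q, t_hot, t_cold):
    """Net entropy change (J/K) when heat q (J) flows from a reservoir
    at t_hot (K) to a reservoir at t_cold (K):
        dS = q / t_cold - q / t_hot
    """
    return q / t_cold - q / t_hot

# 1000 J flowing from a 500 K reservoir to a 300 K reservoir:
ds = total_entropy_change(1000.0, 500.0, 300.0)
# dS = 1000/300 - 1000/500 = 3.333... - 2.0 = 1.333... J/K > 0
```

The sign of the result is the irreversibility: running the same transfer backwards (heat spontaneously flowing cold to hot) would make dS negative, which the second law forbids.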
“The fundamental irreversibility that is the hallmark of the arrow of time”.
It is the increase in entropy, in the disorderliness of the world, which makes these everyday events irreversible and separates the past from the future.
“Understanding the arrow of time is a matter of understanding the origin of the universe.”
But the question remains: “what is time?” The response of the American physicist John Wheeler is worth remembering: “Time is nature’s way of keeping everything from happening at once.”