Entropy

A measure of the disorder or randomness in a closed system.

Entropy is the thermodynamic property that drives systems toward equilibrium, averaging, homogenization, and dissipation: hotter, more energetic regions of a system lose heat while cooler regions (for example, space) gain it; the molecules of a solvent or gas tend to distribute themselves evenly; material objects wear out; organisms die; the universe as a whole cools down. In the observable universe, entropy, like time, runs in one direction only; the process is not reversible.

In classical thermodynamics, the concept of entropy is defined phenomenologically by the second law of thermodynamics, which states that the entropy of an isolated system always increases or remains constant.
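
As a supplement to the entry above, these standard textbook relations express the same idea compactly: Clausius's definition of entropy change through reversible heat exchange, and the second law's constraint on an isolated system.

```latex
% Clausius definition: entropy change via reversible heat transfer at temperature T
dS = \frac{\delta Q_{\mathrm{rev}}}{T}

% Second law: the entropy of an isolated system never decreases
\Delta S_{\mathrm{isolated}} \geq 0
```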

Such processes reduce the degree of order of the initial system, and entropy can therefore be read as an expression of disorder or randomness.
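
In statistical mechanics (Boltzmann's interpretation, supplementary to the entry above), this notion of disorder is made quantitative: entropy counts the number of microscopic arrangements compatible with a given macroscopic state.

```latex
% Boltzmann's relation: S grows with the number of microstates W
% k_B is Boltzmann's constant (about 1.38 x 10^{-23} J/K)
S = k_B \ln W
```

The more microstates compatible with what we observe macroscopically, the higher the entropy, which is why mixed, equilibrated states are overwhelmingly more probable than ordered ones.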

“The fundamental irreversibility that is the hallmark of the arrow of time”.

It is the increase in entropy, in the disorderliness of the world, that makes such everyday events irreversible and separates the past from the future.

“Understanding the arrow of time is a matter of understanding the origin of the universe.”

But the question remains: “What is time?” The response of the American physicist John Wheeler is worth remembering: “Time is nature’s way of keeping everything from happening at once.”
