Entropy (S) is a measure of the disorder or randomness in a system. It tells us how spread out or dispersed the energy and particles are.
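Statistical mechanics makes "disorder" precise: entropy counts the number of microscopic arrangements (microstates) W that are consistent with what we observe macroscopically, via Boltzmann's formula:

```latex
S = k_B \ln W
```

where k_B ≈ 1.381 × 10⁻²³ J/K is the Boltzmann constant. The more microstates available to a system, the higher its entropy.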
Key Points:
- High entropy → particles are more disordered, energy is more spread out.
- Low entropy → particles are more ordered, energy is more concentrated.
- Entropy is a state function, meaning it depends only on the current state, not the path taken.
- The Second Law of Thermodynamics states that in any spontaneous process, the entropy of the universe always increases.
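In symbols, the Second Law for a spontaneous process reads:

```latex
\Delta S_{\text{universe}} = \Delta S_{\text{system}} + \Delta S_{\text{surroundings}} > 0
```

And because entropy is a state function, a change is always just ΔS = S_final − S_initial, no matter how the system got from one state to the other.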
Examples:
- Ice melting → Solid ice (ordered, low entropy) turns into liquid water (disordered, higher entropy); see the worked calculation after this list.
- Perfume spreading in a room → Molecules move randomly, increasing disorder.
- Burning wood → Converts an ordered solid into gases and ash with much higher entropy.
- Deck of cards → A neatly arranged deck has low entropy; a shuffled deck has high entropy.
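As a worked example for the ice-melting case above: at the melting point, the entropy change of fusion is the enthalpy of fusion divided by the temperature. Using the standard values for water (ΔH_fus ≈ 6.01 kJ/mol, T_m = 273.15 K):

```latex
\Delta S_{\text{fus}} = \frac{\Delta H_{\text{fus}}}{T_m}
                      = \frac{6010\ \text{J/mol}}{273.15\ \text{K}}
                      \approx 22.0\ \text{J/(mol·K)}
```

The positive sign confirms that melting increases the entropy of the water.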
In short:
Entropy measures how much disorder or randomness exists in a system. Nature tends to move toward higher entropy because disordered states are overwhelmingly more probable than ordered ones.
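To see why disordered states win on probability alone, here is a minimal Python sketch; the 100-coin model is an illustrative toy (not part of the notes above) in which each arrangement of heads and tails counts as one microstate:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(microstates: int) -> float:
    """Statistical entropy S = k_B * ln(W) for W microstates."""
    return K_B * math.log(microstates)

# Toy model: 100 coin flips. "All heads" (perfectly ordered) corresponds
# to exactly 1 microstate; a 50/50 heads-tails split (maximally
# disordered) can be realized in C(100, 50) different ways.
ordered = 1
disordered = math.comb(100, 50)

print(f"microstates, all heads:   {ordered}")
print(f"microstates, 50/50 split: {disordered:.3e}")
print(f"S, all heads:   {entropy(ordered):.3e} J/K")  # ln(1) = 0, so S = 0
print(f"S, 50/50 split: {entropy(disordered):.3e} J/K")
```

Running it shows the 50/50 macrostate has roughly 10²⁹ microstates versus exactly one for all-heads, so a system wandering randomly among its microstates is almost certain to be found in a disordered macrostate.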