Statistical Interpretation of Entropy and the Second Law of Thermodynamics: The Underlying Explanation
The various ways of formulating the second law of thermodynamics tell what happens rather than why it happens. Why should heat transfer occur only from hot to cold? Why should energy become ever less available to do work? Why should the universe become increasingly disorderly? The answer is that it is a matter of overwhelming probability. Disorder is simply vastly more likely than order.
When you watch an emerging rain storm begin to wet the ground, you will notice that the drops fall in a disorganized manner both in time and in space. Some fall close together, some far apart, but they never fall in straight, orderly rows. It is not impossible for rain to fall in an orderly pattern, just highly unlikely, because there are many more disorderly ways than orderly ones. To illustrate this fact, we will examine some random processes, starting with coin tosses.
Coin Tosses
What are the possible outcomes of tossing 5 coins? Each coin can land either heads or tails. On the large scale, we are concerned only with the total heads and tails and not with the order in which heads and tails appear. The following possibilities exist:

- 5 heads, 0 tails
- 4 heads, 1 tail
- 3 heads, 2 tails
- 2 heads, 3 tails
- 1 head, 4 tails
- 0 heads, 5 tails
These are what we call macrostates. A macrostate is an overall property of a system. It does not specify the details of the system, such as the order in which heads and tails occur or which coins are heads or tails.
Using this nomenclature, a system of 5 coins has the 6 possible macrostates just listed. Some macrostates are more likely to occur than others. For instance, there is only one way to get 5 heads, but there are several ways to get 3 heads and 2 tails, making the latter macrostate more probable. [link] lists all of the ways in which 5 coins can be tossed, taking into account the order in which heads and tails occur. Each sequence is called a microstate—a detailed description of every element of a system.
| Macrostate | Individual microstates | Number of microstates (W) |
|---|---|---|
| 5 heads, 0 tails | HHHHH | 1 |
| 4 heads, 1 tail | HHHHT, HHHTH, HHTHH, HTHHH, THHHH | 5 |
| 3 heads, 2 tails | HHHTT, HHTHT, HHTTH, HTHHT, HTHTH, HTTHH, THHHT, THHTH, THTHH, TTHHH | 10 |
| 2 heads, 3 tails | TTTHH, TTHHT, THHTT, HHTTT, TTHTH, THTHT, HTHTT, THTTH, HTTHT, HTTTH | 10 |
| 1 head, 4 tails | TTTTH, TTTHT, TTHTT, THTTT, HTTTT | 5 |
| 0 heads, 5 tails | TTTTT | 1 |
| Total | | 32 |
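The counts in this table can be verified by brute-force enumeration. The following minimal Python sketch (the variable names are illustrative, not part of the text) generates all 32 possible sequences of 5 tosses and tallies how many fall into each macrostate, reproducing the 1, 5, 10, 10, 5, 1 pattern.

```python
from itertools import product
from collections import Counter

# Every possible sequence (microstate) of 5 coin tosses: 2^5 = 32 of them.
microstates = ["".join(seq) for seq in product("HT", repeat=5)]

# Group microstates by macrostate, i.e., by the total number of heads.
macrostate_counts = Counter(seq.count("H") for seq in microstates)

for heads in sorted(macrostate_counts, reverse=True):
    tails = 5 - heads
    print(f"{heads} heads, {tails} tails: {macrostate_counts[heads]} microstates")

print("Total:", len(microstates))
```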
The macrostate of 3 heads and 2 tails can be achieved in 10 ways and is thus 10 times more probable than the one having 5 heads. Not surprisingly, it is equally probable to have the reverse, 2 heads and 3 tails. Similarly, it is equally probable to get 5 tails as it is to get 5 heads. Note that all of these conclusions are based on the crucial assumption that each microstate is equally probable. With coin tosses, this requires that the coins not be asymmetric in a way that favors one side over the other, as with loaded dice. With any system, the assumption that all microstates are equally probable must be valid, or the analysis will be erroneous.
The two most orderly possibilities are 5 heads or 5 tails. (They are more structured than the others.) They are also the least likely, only 2 out of 32 possibilities. The most disorderly possibilities are 3 heads and 2 tails and its reverse. (They are the least structured.) The most disorderly possibilities are also the most likely, with 20 out of 32 possibilities for the 3 heads and 2 tails and its reverse. If we start with an orderly array like 5 heads and toss the coins, it is very likely that we will get a less orderly array as a result, since 30 out of the 32 possibilities are less orderly. So even if you start with an orderly state, there is a strong tendency to go from order to disorder, from low entropy to high entropy. The reverse can happen, but it is unlikely.
| Heads | Tails | Number of microstates (W) |
|---|---|---|
| 100 | 0 | 1 |
| 99 | 1 | $1.0 \times 10^{2}$ |
| 95 | 5 | $7.5 \times 10^{7}$ |
| 90 | 10 | $1.7 \times 10^{13}$ |
| 75 | 25 | $2.4 \times 10^{23}$ |
| 60 | 40 | $1.4 \times 10^{28}$ |
| 55 | 45 | $6.1 \times 10^{28}$ |
| 51 | 49 | $9.9 \times 10^{28}$ |
| 50 | 50 | $1.0 \times 10^{29}$ |
| 49 | 51 | $9.9 \times 10^{28}$ |
| 45 | 55 | $6.1 \times 10^{28}$ |
| 40 | 60 | $1.4 \times 10^{28}$ |
| 25 | 75 | $2.4 \times 10^{23}$ |
| 10 | 90 | $1.7 \times 10^{13}$ |
| 5 | 95 | $7.5 \times 10^{7}$ |
| 1 | 99 | $1.0 \times 10^{2}$ |
| 0 | 100 | 1 |
| Total | | $1.27 \times 10^{30}$ |
This result becomes dramatic for larger systems. Consider what happens if you have 100 coins instead of just 5. The most orderly arrangements (most structured) are 100 heads or 100 tails. The least orderly (least structured) is that of 50 heads and 50 tails. There is only 1 way (1 microstate) to get the most orderly arrangement of 100 heads. There are 100 ways (100 microstates) to get the next most orderly arrangement of 99 heads and 1 tail (also 100 to get its reverse). And there are $1.0 \times 10^{29}$ ways to get 50 heads and 50 tails, the least orderly arrangement. [link] is an abbreviated list of the various macrostates and the number of microstates for each macrostate. The total number of microstates—the total number of different ways 100 coins can be tossed—is an impressively large $1.27 \times 10^{30}$. Now, if we start with an orderly macrostate like 100 heads and toss the coins, there is a virtual certainty that we will get a less orderly macrostate. If we keep tossing the coins, it is possible, but exceedingly unlikely, that we will ever get back to the most orderly macrostate. If you tossed the coins once each second, you could expect to get either 100 heads or 100 tails once in $2 \times 10^{22}$ years! This period is 1 trillion ($10^{12}$) times longer than the age of the universe, and so the chances are essentially zero. In contrast, there is an 8% chance of getting 50 heads, a 73% chance of getting from 45 to 55 heads, and a 96% chance of getting from 40 to 60 heads. Disorder is highly likely.
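These figures follow directly from binomial counting, since the number of microstates with $k$ heads out of 100 coins is the binomial coefficient $C(100, k)$. As a rough check (a sketch, not part of the text), the snippet below uses Python's `math.comb` to reproduce the table entries, the 8%, 73%, and 96% probabilities, and the roughly $2 \times 10^{22}$-year waiting time.

```python
import math

N = 100
total = 2 ** N                      # total microstates, about 1.27e30

# Probabilities of landing in various macrostate ranges
p_50 = math.comb(N, 50) / total                                # ~0.08
p_45_55 = sum(math.comb(N, k) for k in range(45, 56)) / total  # ~0.73
p_40_60 = sum(math.comb(N, k) for k in range(40, 61)) / total  # ~0.96

# Expected wait for all heads OR all tails at one toss of 100 coins per second
seconds_per_year = 3.156e7
wait_years = (total / 2) / seconds_per_year                    # ~2e22 years

print(f"total microstates  : {total:.2e}")
print(f"P(exactly 50 heads): {p_50:.3f}")
print(f"P(45 to 55 heads)  : {p_45_55:.3f}")
print(f"P(40 to 60 heads)  : {p_40_60:.3f}")
print(f"wait for 100H/100T : {wait_years:.1e} years")
```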
Disorder in a Gas
The fantastic growth in the odds favoring disorder that we see in going from 5 to 100 coins continues as the number of entities in the system increases. Let us now imagine applying this approach to a small sample of gas. Because counting microstates and macrostates involves statistics, this is called statistical analysis. The macrostates of a gas correspond to its macroscopic properties, such as volume, temperature, and pressure; its microstates correspond to the detailed description of the positions and velocities of its atoms. Even a small amount of gas has a huge number of atoms: 1.0 cm³ of an ideal gas at 1.0 atm and 0ºC has $2.7 \times 10^{19}$ atoms. So each macrostate has an immense number of microstates. In plain language, this means that there are an immense number of ways in which the atoms in a gas can be arranged, while still having the same pressure, temperature, and so on.
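The figure of $2.7 \times 10^{19}$ atoms comes from the ideal gas law in the form $N = PV/(kT)$. A minimal check, using standard values for the constants, is sketched below.

```python
# Number of atoms in 1.0 cm^3 of an ideal gas at 1.0 atm and 0 degrees C,
# from the ideal gas law N = PV / (kT).
k = 1.38e-23      # Boltzmann's constant, J/K
P = 1.013e5       # 1.0 atm in Pa
V = 1.0e-6        # 1.0 cm^3 in m^3
T = 273.15        # 0 degrees C in K

N = P * V / (k * T)
print(f"N = {N:.1e} atoms")   # about 2.7e19
```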
The most likely conditions (or macrostates) for a gas are those we see all the time—a random distribution of atoms in space with a Maxwell-Boltzmann distribution of speeds in random directions, as predicted by kinetic theory. This is the most disorderly and least structured condition we can imagine. In contrast, one type of very orderly and structured macrostate has all of the atoms in one corner of a container with identical velocities. There are very few ways to accomplish this (very few microstates corresponding to it), and so it is exceedingly unlikely ever to occur. (See [link](b).) Indeed, it is so unlikely that we have a law saying that it is impossible, which has never been observed to be violated—the second law of thermodynamics.
The disordered condition is one of high entropy, and the ordered one has low entropy. With a transfer of energy from another system, we could force all of the atoms into one corner and have a local decrease in entropy, but at the cost of an overall increase in entropy of the universe. If the atoms start out in one corner, they will quickly disperse and become uniformly distributed and will never return to the orderly original state ([link](b)). Entropy will increase. With such a large sample of atoms, it is possible—but unimaginably unlikely—for entropy to decrease. Disorder is vastly more likely than order.
The arguments that disorder and high entropy are the most probable states are quite convincing. The great Austrian physicist Ludwig Boltzmann (1844–1906)—who, along with Maxwell, made so many contributions to kinetic theory—proved that the entropy of a system in a given state (a macrostate) can be written as

$$S = k \ln W,$$

where $k = 1.38 \times 10^{-23} \text{ J/K}$ is Boltzmann's constant, and $\ln W$ is the natural logarithm of the number of microstates $W$ corresponding to the given macrostate. $W$ is proportional to the probability that the macrostate will occur. Thus entropy is directly related to the probability of a state—the more likely the state, the greater its entropy. Boltzmann proved that this expression for $S$ is equivalent to the definition $\Delta S = Q/T$, which we have used extensively.
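As a quick numerical illustration of $S = k \ln W$ (a sketch using the 100-coin numbers above, not a calculation from the text), even the most probable macrostate, with $W \approx 1.0 \times 10^{29}$ microstates, has an absolutely tiny entropy:

```python
import math

k = 1.38e-23           # Boltzmann's constant, J/K
W = 1.0e29             # microstates of the 50 heads / 50 tails macrostate

S = k * math.log(W)    # S = k ln W
print(f"S = {S:.2e} J/K")   # about 9.2e-22 J/K
```

The smallness of this number reflects the tiny value of Boltzmann's constant; macroscopic entropies become appreciable only because $W$ for real systems is unimaginably larger than $10^{29}$.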
Thus the second law of thermodynamics is explained on a very basic level: entropy either remains the same or increases in every process. This phenomenon is due to the extraordinarily small probability of a decrease, based on the extraordinarily larger number of microstates in systems with greater entropy. Entropy can decrease, but for any macroscopic system, this outcome is so unlikely that it will never be observed.
Entropy Increases in a Coin Toss

Suppose you toss 100 coins starting with 60 heads and 40 tails, and you get the most likely result, 50 heads and 50 tails. What is the change in entropy?
Strategy
Noting that the number of microstates is labeled $W$ in [link] for the 100-coin toss, we can use $\Delta S = S_{\text{f}} - S_{\text{i}} = k \ln W_{\text{f}} - k \ln W_{\text{i}}$ to calculate the change in entropy.
Solution
The change in entropy is

$$\Delta S = S_{\text{f}} - S_{\text{i}} = k \ln W_{\text{f}} - k \ln W_{\text{i}},$$

where the subscript i stands for the initial 60 heads and 40 tails state, and the subscript f for the final 50 heads and 50 tails state. Substituting the values for $W$ from [link] gives

$$\Delta S = (1.38 \times 10^{-23} \text{ J/K}) \left[ \ln(1.0 \times 10^{29}) - \ln(1.4 \times 10^{28}) \right] = 2.7 \times 10^{-23} \text{ J/K}.$$
Discussion
This increase in entropy means we have moved to a less orderly situation. It is not impossible for further tosses to produce the initial state of 60 heads and 40 tails, but it is less likely. There is about a 1 in 90 chance for that decrease in entropy ($-2.7 \times 10^{-23} \text{ J/K}$) to occur. If we calculate the decrease in entropy to move to the most orderly state, we get $\Delta S = -92 \times 10^{-23} \text{ J/K}$. There is about a 1 in $10^{30}$ chance of this change occurring. So while very small decreases in entropy are unlikely, slightly greater decreases are impossibly unlikely. These probabilities imply, again, that for a macroscopic system, a decrease in entropy is impossible. For example, for heat transfer to occur spontaneously from 1.00 kg of 0ºC ice to its 0ºC environment, there would be a decrease in entropy of $1.22 \times 10^{3} \text{ J/K}$. Given that a $\Delta S$ of about $10^{-21} \text{ J/K}$ corresponds to a 1 in $10^{30}$ chance, a decrease of this size ($10^{3} \text{ J/K}$) is an utter impossibility. Even for a milligram of melted ice to spontaneously refreeze is impossible.
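The numbers quoted in this example and discussion can be reproduced with a few lines of Python (an illustrative check; the latent heat of fusion of water, about 334 kJ/kg, is the standard value and is not given in this section):

```python
import math

k = 1.38e-23                       # Boltzmann's constant, J/K

# Coin-toss entropy change: 60 heads / 40 tails -> 50 heads / 50 tails
W_i = math.comb(100, 60)           # about 1.4e28 microstates
W_f = math.comb(100, 50)           # about 1.0e29 microstates
dS_coins = k * (math.log(W_f) - math.log(W_i))
print(f"coin toss: dS = {dS_coins:.2e} J/K")   # about 2.7e-23 J/K, as in the example

# Entropy decrease if 1.00 kg of melted ice (water at 0 degrees C) could
# spontaneously give its heat of fusion back to 0 degree C surroundings
# and refreeze: dS = -Q/T for the water
L_f = 334e3                        # latent heat of fusion of water, J/kg
m = 1.00                           # kg
T = 273.15                         # K
dS_ice = -m * L_f / T
print(f"refreezing ice: dS = {dS_ice:.3e} J/K")   # about -1.22e3 J/K
```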
Problem-Solving Strategies for Entropy

- Examine the situation to determine if entropy is involved.
- Identify the system of interest and draw a labeled diagram of the system showing energy flow.
- Identify exactly what needs to be determined in the problem (identify the unknowns). A written list is useful.
- Make a list of what is given or can be inferred from the problem as stated (identify the knowns). You must carefully identify the heat transfer, if any, and the temperature at which the process takes place. It is also important to identify the initial and final states.
- Solve the appropriate equation for the quantity to be determined (the unknown). Note that the change in entropy can be determined between any states by calculating it for a reversible process.
- Substitute the known values along with their units into the appropriate equation, and obtain numerical solutions complete with units.
- Check the answer to see if it is reasonable: Does it make sense? For example, total entropy should increase for any real process or be constant for a reversible process. Disordered states should be more probable and have greater entropy than ordered states. (A short numerical sketch of these steps follows this list.)
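As a compact illustration of these steps (the numbers here are invented for illustration only), consider 4000 J of heat transferred irreversibly from a reservoir at 600 K to one at 250 K; applying $\Delta S = Q/T$ to each reservoir and summing shows the total entropy increases, as it should for a spontaneous process:

```python
# Illustrative application of the strategy: heat Q flows from a hot
# reservoir at T_hot to a cold reservoir at T_cold (values invented).
Q = 4000.0       # J, heat transferred
T_hot = 600.0    # K
T_cold = 250.0   # K

dS_hot = -Q / T_hot           # hot reservoir loses entropy
dS_cold = Q / T_cold          # cold reservoir gains more entropy
dS_total = dS_hot + dS_cold   # should be positive for a spontaneous process

print(f"dS_hot   = {dS_hot:+.2f} J/K")
print(f"dS_cold  = {dS_cold:+.2f} J/K")
print(f"dS_total = {dS_total:+.2f} J/K")   # positive, so the result is reasonable
```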
Section Summary
- Disorder is far more likely than order, which can be seen statistically.
- The entropy of a system in a given state (a macrostate) can be written as $S = k \ln W$, where $k = 1.38 \times 10^{-23} \text{ J/K}$ is Boltzmann's constant, and $\ln W$ is the natural logarithm of the number of microstates $W$ corresponding to the given macrostate.
Conceptual Questions
Explain why a building made of bricks has smaller entropy than the same bricks in a disorganized pile. Do this by considering the number of ways that each could be formed (the number of microstates in each macrostate).
Problem Exercises
Using [link], verify the contention that if you toss 100 coins each second, you can expect to get 100 heads or 100 tails once in $2 \times 10^{22}$ years; calculate the time to two-digit accuracy.
It should happen twice in every $1.27 \times 10^{30} \text{ s}$, or once in every $6.35 \times 10^{29} \text{ s}$:

$$(6.35 \times 10^{29} \text{ s}) \left( \frac{1 \text{ h}}{3600 \text{ s}} \right) \left( \frac{1 \text{ d}}{24 \text{ h}} \right) \left( \frac{1 \text{ y}}{365.25 \text{ d}} \right) = 2.0 \times 10^{22} \text{ y}$$
What percent of the time will you get something in the range from 60 heads and 40 tails through 40 heads and 60 tails when tossing 100 coins? The total number of microstates in that range is $1.22 \times 10^{30}$. (Consult [link].)
(a) If tossing 100 coins, how many ways (microstates) are there to get the three most likely macrostates of 49 heads and 51 tails, 50 heads and 50 tails, and 51 heads and 49 tails? (b) What percent of the total possibilities is this? (Consult [link].)
(a) $3.0 \times 10^{29}$
(b) 24%
(a) What is the change in entropy if you start with 100 coins in the 45 heads and 55 tails macrostate, toss them, and get 51 heads and 49 tails? (b) What if you get 75 heads and 25 tails? (c) How much more likely is 51 heads and 49 tails than 75 heads and 25 tails? (d) Does either outcome violate the second law of thermodynamics?
(a) What is the change in entropy if you start with 10 coins in the 5 heads and 5 tails macrostate, toss them, and get 2 heads and 8 tails? (b) How much more likely is 5 heads and 5 tails than 2 heads and 8 tails? (Take the ratio of the number of microstates to find out.) (c) If you were betting on 2 heads and 8 tails would you accept odds of 252 to 45? Explain why or why not.
(a) $-2.38 \times 10^{-23} \text{ J/K}$
(b) 5.6 times more likely
(c) If you were betting on 2 heads and 8 tails, odds of 252 to 45 are exactly the fair odds, so on average you would only break even. So, no, you wouldn’t bet on odds of 252 to 45.
| Heads | Tails | Number of microstates (W) |
|---|---|---|
| 10 | 0 | 1 |
| 9 | 1 | 10 |
| 8 | 2 | 45 |
| 7 | 3 | 120 |
| 6 | 4 | 210 |
| 5 | 5 | 252 |
| 4 | 6 | 210 |
| 3 | 7 | 120 |
| 2 | 8 | 45 |
| 1 | 9 | 10 |
| 0 | 10 | 1 |
| Total | | 1024 |
(a) If you toss 10 coins, what percent of the time will you get the three most likely macrostates (6 heads and 4 tails, 5 heads and 5 tails, 4 heads and 6 tails)? (b) You can realistically toss 10 coins and count the number of heads and tails about twice a minute. At that rate, how long will it take on average to get either 10 heads and 0 tails or 0 heads and 10 tails?
(a) Construct a table showing the macrostates and all of the individual microstates for tossing 6 coins. (Use [link] as a guide.) (b) How many macrostates are there? (c) What is the total number of microstates? (d) What percent chance is there of tossing 5 heads and 1 tail? (e) How much more likely are you to toss 3 heads and 3 tails than 5 heads and 1 tail? (Take the ratio of the number of microstates to find out.)
(b) 7
(c) 64
(d) 9.38%
(e) 3.33 times more likely (20 to 6)
In an air conditioner, 12.65 MJ of heat transfer occurs from a cold environment in 1.00 h. (a) What mass of ice melting would involve the same heat transfer? (b) How many hours of operation would be equivalent to melting 900 kg of ice? (c) If ice costs 20 cents per kg, do you think the air conditioner could be operated more cheaply than by simply using ice? Describe in detail how you evaluate the relative costs.