18.2 Entropy


    What is Entropy?

Entropy is often defined as randomness or disorder, but this isn't quite true.  It turns out entropy has a mathematical definition related to the number of possible microstates in which a system can exist; the more possible microstates, the greater the entropy.  But this definition may not be the most satisfying conceptually and can be a little complicated, which is why entropy is often explained to students as being synonymous with randomness or disorder.  That explanation is not completely without merit, but it is better stated that entropy CORRELATES with randomness or disorder.  A system with more possible microstates will have more disorder and more entropy.  So there is a correlation between entropy and disorder, but it's not quite true to state that they are the same thing.

     

According to statistical mechanics, entropy is defined by the following equation:

     

S = kB ln Ω

     

kB = 1.38 × 10⁻²³ m² kg s⁻² K⁻¹ (the Boltzmann constant)

     

    Ω = the number of possible microstates

     

A microstate is simply one arrangement in which all of the particles in a system can exist.  The number of possible microstates is typically greater when there are more particles present, when more rotational configurations are possible for a molecule or around the bonds within a molecule, when the particles are spread out in a greater volume, at higher temperatures, etc.  This definition also provides an explanation for the Third Law of Thermodynamics, which we studied in the last lesson (18.1 The Laws of Thermodynamics) and which states that "a perfect crystal at absolute zero (Kelvin) has zero entropy."  At absolute zero a perfect crystal has only a single possible microstate, and since ln(1) = 0, it follows that S = 0.
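
To see how this works numerically, here is a minimal Python sketch (the values chosen for Ω are arbitrary and purely illustrative):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def boltzmann_entropy(num_microstates):
    """Entropy from the number of possible microstates: S = kB * ln(omega)."""
    return K_B * math.log(num_microstates)

# A perfect crystal at absolute zero has exactly one possible microstate,
# so its entropy is zero (the Third Law), since ln(1) = 0.
print(boltzmann_entropy(1))     # 0.0 J/K
# More possible microstates -> higher entropy.
print(boltzmann_entropy(1e6))   # ~1.9e-22 J/K
print(boltzmann_entropy(1e12))  # ~3.8e-22 J/K
```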

    Factors Affecting Entropy

    There are a variety of factors affecting entropy which can be summarized as follows:

     

    1. Phase of Matter - Gas > Liquid > Solid

    2. Higher Temperature, Higher Entropy

    3. Higher Volume, Higher Entropy

    4. Higher Number of Particles, Higher Entropy

    Phase of Matter

    The most significant factor affecting entropy is the phase of matter:

     

    gas > liquid > solid

     

Solids are often crystalline, in which case the atoms/ions/molecules occupy well-defined positions in a highly ordered structure, resulting in relatively low entropy.

     

    In the liquid phase, atoms/ions/molecules are now free to move around which results in an increase in the number of possible microstates and an increase in entropy relative to the solid phase.

     

In the gas phase, atoms/ions/molecules are not only free to move around, but on average move faster than those in a liquid, and there is a large increase in volume relative to the liquid phase, all of which results in a larger number of possible microstates and a higher entropy.  It should be noted that the gas phase has considerably more entropy than either the solid or liquid phase.

    Higher Temperature, Higher Entropy

    Higher temperatures correspond to higher average kinetic energies.  In the liquid and gas phases this means that atoms/ions/molecules are moving faster resulting in a greater number of possible microstates and higher entropy.  In the solid phase more kinetic energy would correspond to greater vibration of the atoms/ions/molecules in the crystal structure which also results in a greater number of possible microstates and higher entropy. 

    Higher Volume, Higher Entropy

    A higher volume means that there are a greater number of positions for the particles in the system to occupy which results in a greater number of possible microstates for the system and a higher entropy.  As gases have a much larger volume than either the corresponding liquid or solid, this explains in part why gases have higher entropy than the corresponding liquid or solid.

    Higher Number of Particles, Higher Entropy

Entropy, it turns out, is an extensive property and is proportional to the size of the sample.  As the number of particles in a sample increases, so does the number of possible microstates the system can occupy, resulting in a higher entropy.
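
One way to make the volume and particle-number trends concrete is a deliberately oversimplified "particles on lattice sites" picture (an illustrative toy model, not a rigorous statistical-mechanics calculation): if each of N distinguishable particles can sit on any of M available positions, then Ω = M^N and S = kB·N·ln(M), so entropy grows with both the number of available positions (volume) and the number of particles.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def toy_entropy(num_positions, num_particles):
    """Toy lattice model: omega = num_positions**num_particles,
    so S = kB * num_particles * ln(num_positions)."""
    return K_B * num_particles * math.log(num_positions)

# More available positions (larger volume) -> higher entropy.
print(toy_entropy(100, 10))  # ~6.4e-22 J/K
print(toy_entropy(200, 10))  # ~7.3e-22 J/K

# More particles -> higher entropy (entropy is extensive).
print(toy_entropy(100, 20))  # ~1.3e-21 J/K
```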

    What is ΔS?

ΔS, the change in entropy, is often defined as the heat transferred in a reversible process divided by the absolute temperature at which the process is carried out.

     

ΔS = qrev / T

     

    A reversible process is one carried out under equilibrium conditions.
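
As a quick illustration of the definition (the numbers here are made up purely for the arithmetic), dividing a reversibly transferred amount of heat by the absolute temperature gives the entropy change in J/K:

```python
def entropy_change(q_rev, temperature_kelvin):
    """Entropy change for heat transferred reversibly: dS = q_rev / T."""
    return q_rev / temperature_kelvin

# 500 J transferred reversibly into a system held at 300 K
print(entropy_change(500, 300))  # ~1.67 J/K

# The same heat transferred at a lower temperature produces a larger entropy change.
print(entropy_change(500, 150))  # ~3.33 J/K
```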

     

    Enthalpy change was defined back in chapter 5 (5.1 The First Law of Thermodynamics, Enthalpy, and Phase Changes) as being equal to the heat transferred at constant pressure:

     

    ΔH = qP

     

Substituting this into the above expression leads us to a relationship between ΔS and ΔH for phase changes that we will study in the next lesson (18.3 Gibbs Free Energy and the Relationship between Delta G, Delta H, and Delta S):

     

ΔS°fus = ΔH°fus / Tm                ΔS°vap = ΔH°vap / Tb
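
As a sketch of how these expressions get used (the ΔH and temperature values below are the commonly tabulated ones for water; your data table may differ slightly):

```python
def phase_change_entropy(delta_h_j_per_mol, temperature_kelvin):
    """Entropy of a phase change at its equilibrium temperature: dS = dH / T."""
    return delta_h_j_per_mol / temperature_kelvin

# Water: dH_fus ~ 6.01 kJ/mol at 273.15 K, dH_vap ~ 40.7 kJ/mol at 373.15 K
print(phase_change_entropy(6010, 273.15))   # ~22 J/(mol*K) for fusion
print(phase_change_entropy(40700, 373.15))  # ~109 J/(mol*K) for vaporization
```

Note that vaporization produces a much larger entropy change than fusion, consistent with gases having much higher entropy than liquids.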

     

    ΔS of a reaction can also be calculated from absolute entropies as we will learn later in the chapter (18.4 Delta G, Delta H, Delta S and Formation Reactions ):

     

ΔS°rxn = Σ n S°(products) - Σ m S°(reactants)
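
For example, here is a small sketch of this products-minus-reactants calculation for one of the reactions used later in this lesson (the standard molar entropies are approximate textbook values and vary slightly from table to table):

```python
# Approximate standard molar entropies in J/(mol*K); exact values depend on the data table.
S_STANDARD = {
    "H2(g)": 130.7,
    "O2(g)": 205.2,
    "H2O(g)": 188.8,
}

def delta_s_rxn(products, reactants):
    """dS(rxn) = sum(n * S(products)) - sum(m * S(reactants));
    each side is a list of (coefficient, species) pairs."""
    total = lambda side: sum(coeff * S_STANDARD[species] for coeff, species in side)
    return total(products) - total(reactants)

# 2 H2(g) + O2(g) -> 2 H2O(g)
print(delta_s_rxn(products=[(2, "H2O(g)")],
                  reactants=[(2, "H2(g)"), (1, "O2(g)")]))  # ~ -89 J/K
```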

    How to Predict the Sign of ΔS

    There are three important factors for predicting the sign of ΔS:

    1. A change in the number of moles of gas

    2. Phase changes

    3. A change in the complexity of the substances

    A Change in the Number of Moles of Gas

    Gases have significantly more entropy than either liquids or solids and the most common way to predict the sign of ΔS is to identify a change in the number of moles of gas:

     

    An increase in the number of moles of gas corresponds to an increase in entropy (ΔS > 0).

2C(s, graphite) + O2(g) → 2CO(g)    ΔS° = +179 J/K

     

    A decrease in the number of moles of gas corresponds to a decrease in entropy (ΔS < 0).

2H2(g) + O2(g) → 2H2O(g)    ΔS° = -88.5 J/K

     

    A reaction with no change in the number of moles of gas is likely to have a ΔS value of small magnitude.  It could be either positive or negative, but will likely be close to zero either way.

N2(g) + O2(g) → 2NO(g)    ΔS° = +25 J/K  (closer to zero than the previous examples)
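
The rule of thumb from this subsection can be summarized in a short sketch (it only looks at moles of gas, so it is strictly a qualitative first guess):

```python
def predict_delta_s_sign(gas_moles_reactants, gas_moles_products):
    """Qualitative prediction of the sign of dS based only on the change in moles of gas."""
    change = gas_moles_products - gas_moles_reactants
    if change > 0:
        return "dS likely positive (moles of gas increase)"
    if change < 0:
        return "dS likely negative (moles of gas decrease)"
    return "dS likely small in magnitude; the sign isn't obvious from moles of gas alone"

print(predict_delta_s_sign(1, 2))  # 2C(s) + O2(g) -> 2CO(g)
print(predict_delta_s_sign(3, 2))  # 2H2(g) + O2(g) -> 2H2O(g)
print(predict_delta_s_sign(2, 2))  # N2(g) + O2(g) -> 2NO(g)
```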

    Phase Changes

    As the phase of matter has a significant impact on entropy in a predictable way (gas > liquid > solid) it should be no surprise that it is relatively straightforward to predict ΔS for a phase change.

     

    ΔS is positive for fusion, vaporization and sublimation.

     

    ΔS is negative for freezing, condensation, and deposition.

     

    These are summarized below:

[Figure: ΔS is positive for fusion, vaporization, and sublimation; negative for freezing, condensation, and deposition]

    A Change in Complexity of the Substances

    If there is no change in the number of moles of gas and the reaction isn't simply a phase change then a change in the complexity of the substances may allow you to predict the sign of ΔS.

     

    An increase in complexity would lead to a positive ΔS, while a decrease in complexity would lead to a negative ΔS.

     

A change in complexity is marked by substances composed of more atoms and/or more elements.  Take, for example, one of the reactions used above to illustrate no change in the number of moles of gas:

     

N2(g) + O2(g) → 2NO(g)    ΔS° = +25 J/K

     

There are two moles of gas on the reactants' side and two on the products' side, so there is no change in the number of moles of gas.  Nor can the reaction be characterized as a phase change.  So we're left with a comparison of complexity.

     

All of the substances involved in this reaction are diatomic.  But the reactants, N2 and O2, are both symmetrical, being composed of two identical atoms, whereas the product, NO, is not symmetrical, being composed of two different atoms.  We say the symmetrical reactants have fewer degrees of freedom, which is a measure of complexity.  Rotate an N2 or O2 molecule 180° and you couldn't tell the difference due to symmetry, but rotate an NO molecule 180° and you would definitely notice.  This increase in complexity explains the (slightly) positive value for ΔS.

[Figure: the two 180° rotational orientations of N2, O2, and NO]

These two rotational conformations can be distinguished from one another for NO, but not for N2 or O2.  This means there is a greater number of possible microstates for the NO molecules, which explains why greater complexity tends to result in higher entropy.