
Entropy, Lecture notes of Thermodynamics

thermodynamics entropy

Typology: Lecture notes

2015/2016

Uploaded on 05/01/2016

cruise269

Entropy

Chapter 20  Entropy and the Second Law of Thermodynamics

Key contents:
The arrow of time
Entropy and the 2nd law of thermodynamics
Efficiency of engines
Macrostates and microstates
Probability and entropy

20.3 Change in Entropy: Entropy is a State Function

Suppose that an ideal gas is taken through a reversible process, with the gas in an equilibrium state at the end of each step. For each small step, the energy transferred as heat to or from the gas is dQ, the work done by the gas is dW, and the change in internal energy is dE_int. We have:

  dQ = dW + dE_int

Since the process is reversible, dW = p dV and dE_int = n C_V dT. Therefore,

  dQ = p dV + n C_V dT

Using the ideal gas law (pV = nRT), we obtain:

  dQ/T = nR dV/V + n C_V dT/T

Integrating each term from the initial state i to the final state f,

  ∫ dQ/T = nR ln(V_f/V_i) + n C_V ln(T_f/T_i)

Finally,

  ΔS = S_f − S_i = nR ln(V_f/V_i) + n C_V ln(T_f/T_i)

The change in entropy ΔS between the initial and final states of an ideal gas depends only on properties of the initial and final states; ΔS does not depend on how the gas changes between the two states.

Example, Change of Entropy:

Figure 20-5a shows two identical copper blocks of mass m = 1.5 kg: block L at temperature T_iL = 60°C and block R at temperature T_iR = 20°C. The blocks are in a thermally insulated box and are separated by an insulating shutter. When we lift the shutter, the blocks eventually come to the equilibrium temperature T_f = 40°C (Fig. 20-5b). What is the net entropy change of the two-block system during this irreversible process? The specific heat of copper is 386 J/kg·K.

Calculations: For the reversible process, we need a thermal reservoir whose temperature can be changed slowly (say, by turning a knob). We then take the blocks through the following two steps, illustrated in Fig. 20-6.

Step 1: With the reservoir's temperature set at 60°C, put block L on the reservoir. (Since block and reservoir are at the same temperature, they are already in thermal equilibrium.) Then slowly lower the temperature of the reservoir and the block to 40°C.
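Because ΔS is a state function, any two reversible paths between the same endpoints must give the same value. A minimal numeric sketch of this claim (all values illustrative; a monatomic ideal gas with C_V = 3R/2 is assumed, which the notes do not specify):

```python
import math

R = 8.314          # gas constant, J/(mol·K)
n = 1.0            # moles (illustrative)
Cv = 1.5 * R       # molar heat capacity at constant volume, monatomic ideal gas

# Illustrative endpoint states: (T_i, V_i) -> (T_f, V_f)
Ti, Vi = 300.0, 1.0e-3
Tf, Vf = 450.0, 2.0e-3

def dS_isothermal(V1, V2):
    # Isothermal step: dQ/T = nR dV/V, so ΔS = nR ln(V2/V1)
    return n * R * math.log(V2 / V1)

def dS_isochoric(T1, T2):
    # Constant-volume step: dQ/T = n Cv dT/T, so ΔS = n Cv ln(T2/T1)
    return n * Cv * math.log(T2 / T1)

# Path A: heat at constant volume, then expand at constant temperature
path_A = dS_isochoric(Ti, Tf) + dS_isothermal(Vi, Vf)
# Path B: expand at constant temperature first, then heat at constant volume
path_B = dS_isothermal(Vi, Vf) + dS_isochoric(Ti, Tf)
# Closed form from the integration above
closed = n * R * math.log(Vf / Vi) + n * Cv * math.log(Tf / Ti)

print(path_A, path_B, closed)   # all three agree
```

Both orderings of the two reversible steps land on the same ΔS, matching the closed-form result of the integration.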
As the block's temperature changes by each increment dT during this process, energy dQ is transferred as heat from the block to the reservoir. Using Eq. 18-14, we can write this transferred energy as dQ = mc dT, where c is the specific heat of copper. According to Eq. 20-1, the entropy change ΔS_L of block L during the full temperature change from initial temperature T_iL (= 60°C = 333 K) to final temperature T_f (= 40°C = 313 K) is

  ΔS_L = ∫ dQ/T = ∫ (mc dT)/T  (from T_iL to T_f)  = mc ln(T_f/T_iL)

Inserting the given data yields

  ΔS_L = (1.5 kg)(386 J/kg·K) ln(313 K / 333 K) = −35.86 J/K.

(Fig. 20-5: the two blocks, movable shutter, and insulation, (a) before and (b) after the irreversible process.)

Step 2: With the reservoir's temperature now set at 20°C, put block R on the reservoir. Then slowly raise the temperature of the reservoir and the block to 40°C. With the same reasoning used to find ΔS_L, you can show that the entropy change ΔS_R of block R during this process is

  ΔS_R = (1.5 kg)(386 J/kg·K) ln(313 K / 293 K) = +38.23 J/K.

The net entropy change ΔS_rev of the two-block system undergoing this two-step reversible process is then

  ΔS_rev = ΔS_L + ΔS_R = −35.86 J/K + 38.23 J/K = 2.4 J/K.

Thus, the net entropy change ΔS_irrev for the two-block system undergoing the actual irreversible process is

  ΔS_irrev = ΔS_rev = 2.4 J/K.  (Answer)

Example, Change of Entropy, Free Expansion of Gas:

Suppose 1.0 mol of nitrogen gas is confined to the left side of the container of Fig. 20-1a. You open the stopcock, and the volume of the gas doubles. What is the entropy change of the gas for this irreversible process? Treat the gas as ideal.

Calculations: From Table 19-4, the energy Q added as heat to the gas as it expands isothermally at temperature T from an initial volume V_i to a final volume V_f is

  Q = nRT ln(V_f/V_i)

Here n is the number of moles of gas present.
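The two-block numbers can be reproduced with a short sketch that uses only the data given in the example:

```python
import math

m, c = 1.5, 386.0                        # mass (kg), specific heat of copper (J/kg·K)
T_iL, T_iR, T_f = 333.0, 293.0, 313.0    # initial and final temperatures, kelvins

# ΔS = ∫ mc dT/T = mc ln(T_f / T_i) for each block on the slowly varied reservoir
dS_L = m * c * math.log(T_f / T_iL)      # block L cools: negative
dS_R = m * c * math.log(T_f / T_iR)      # block R warms: positive

dS_net = dS_L + dS_R                     # net change of the two-block system
print(round(dS_L, 2), round(dS_R, 2), round(dS_net, 1))   # -35.86 38.23 2.4
```

The net change is positive, as the entropy postulate requires for the actual irreversible equilibration.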
The entropy change for this reversible process, in which the temperature is held constant, is

  ΔS = Q/T = [nRT ln(V_f/V_i)]/T = nR ln(V_f/V_i)
     = (1.0 mol)(8.31 J/mol·K) ln 2 = +5.76 J/K.  (Answer)

20.5 Entropy in the Real World: Perfect Engines

To have a 'perfect' engine, i.e., one in which all the absorbed heat is transferred to work, we require Q_L = 0. With the engine's entropy change over a cycle being zero, the environment's entropy change is

  ΔS_env = −|Q_H|/T_H < 0

so the total entropy change for such an engine to work would be negative, violating the 2nd law. The 2nd law of thermodynamics can be stated as: No perfect engines! (The Kelvin-Planck statement)

20.6 Entropy in the Real World: Perfect Refrigerators

A perfect refrigerator would transfer energy |Q| as heat from a cold reservoir to a warm reservoir with no work input. The entropy change for the cold reservoir is −|Q|/T_L, and that for the warm reservoir is +|Q|/T_H. Thus, the net entropy change for the entire system is:

  ΔS = |Q|/T_H − |Q|/T_L

Since T_H > T_L, the right side of this equation is negative, and thus the net change in entropy per cycle for the closed system refrigerator + reservoirs is also negative. This violates the second law of thermodynamics, and therefore a perfect refrigerator does not exist. The 2nd law of thermodynamics can be stated as: No perfect refrigerators! (The Clausius statement)

Not only are there no perfect engines and refrigerators, but their efficiency also has an upper limit!

Carnot's theorem: (1) All reversible engines have the same efficiency. (2) No engine has an efficiency higher than that of a reversible engine.

20.5 Entropy in the Real World: Engine Efficiency

# Ideal engines = Reversible engines
# A Carnot engine is an ideal engine undergoing a Carnot cycle.

Example, Carnot Engine:

Imagine a Carnot engine that operates between the temperatures T_H = 850 K and T_L = 300 K. The engine performs 1200 J of work each cycle, which takes 0.25 s.

(a) What is the efficiency of this engine?

  ε = 1 − T_L/T_H = 1 − 300 K/850 K = 0.647 ≈ 65%.  (Answer)

(b) What is the average power of this engine?

  P = W/t = 1200 J / 0.25 s = 4800 W = 4.8 kW.  (Answer)

(c) How much energy |Q_H|
is extracted as heat from the high-temperature reservoir every cycle?

  |Q_H| = W/ε = 1200 J / 0.647 = 1855 J.  (Answer)

(d) How much energy is delivered as heat to the low-temperature reservoir every cycle?

  |Q_L| = |Q_H| − W = 1855 J − 1200 J = 655 J.  (Answer)

(e) By how much does the entropy of the working substance change as a result of the energy transferred to it from the high-temperature reservoir? From it to the low-temperature reservoir?

For the positive transfer of energy Q_H from the high-temperature reservoir at T_H,

  ΔS_H = |Q_H|/T_H = 1855 J / 850 K = +2.18 J/K.  (Answer)

Similarly, for the negative transfer of energy Q_L to the low-temperature reservoir at T_L, we have

  ΔS_L = −|Q_L|/T_L = −655 J / 300 K = −2.18 J/K.  (Answer)

Note that the net entropy change of the working substance for one cycle is zero, as we discussed in deriving Eq. 20-10.

Example, Impossible Engine:

An inventor claims to have constructed an engine that has an efficiency of 75% when operated between the boiling and freezing points of water. Is this possible?

KEY IDEA: The efficiency of a real engine must be less than the efficiency of a Carnot engine operating between the same two temperatures.

Calculation: From Eq. 20-13, we find that the efficiency of a Carnot engine operating between the boiling and freezing points of water is

  ε_C = 1 − T_L/T_H = 1 − (0 + 273) K / (100 + 273) K = 0.268 ≈ 27%.

Thus, for the given temperatures, the claimed efficiency of 75% for a real engine (with its irreversible processes and wasteful energy transfers) is impossible.

20.8 A Statistical View of Entropy

Macrostates and Microstates

Example: a two-phase-block case with 6 identical but distinguishable particles. A certain set of numbers of particles in each phase block corresponds to a certain macrostate.

{ab, cdef} and {ab, cedf} are the same macrostate and the same microstate.
{ab, cdef} and {ac, bdef} are the same macrostate but different microstates.
{abc, def} and {def, abc} are the same macrostate but different microstates.
{ab, cdef} and {abcd, ef} are different macrostates.

The thermodynamic probability W of a macrostate is the corresponding number of microstates:

  W = N!/(n1! n2!)

# Every microstate is assumed to be equally probable.
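The Carnot-engine numbers above, and the impossible-engine check, follow from T_H, T_L, W, and t alone. A short sketch reproducing them:

```python
T_H, T_L = 850.0, 300.0   # reservoir temperatures, K
W, t = 1200.0, 0.25       # work per cycle (J) and cycle time (s)

eff = 1.0 - T_L / T_H     # Carnot efficiency
P = W / t                 # average power
Q_H = W / eff             # heat extracted from the hot reservoir per cycle
Q_L = Q_H - W             # heat delivered to the cold reservoir per cycle
dS_H = +Q_H / T_H         # entropy change from the hot-reservoir transfer
dS_L = -Q_L / T_L         # entropy change from the cold-reservoir transfer

print(round(eff, 3), P, round(Q_H), round(Q_L))   # 0.647 4800.0 1855 655
print(dS_H + dS_L)        # net entropy change of the working substance: ~0

# The inventor's claim: 75% between water's boiling and freezing points
eff_max = 1.0 - 273.0 / 373.0
print(round(eff_max, 3))  # 0.268 -- any real engine must do worse than this
```

The printed net entropy change is zero (up to floating-point error), confirming that the working substance returns to its initial state each cycle.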
The arrow of time points in the direction of a state with more microstates, i.e., a more probable state. Sometimes it is called a more disordered state.

Entropy is an extensive (i.e. additive) state variable. Thermodynamic probability is multiplicative. It's likely that

  S = a ln W

One may show in fact that the constant a is equal to the Boltzmann constant k, i.e.,

  S = k ln W  (Boltzmann's entropy equation)

20.8 A Statistical View of Entropy: Probability and Entropy

Stirling's approximation is often used for ln N! when N is large:

  ln N! ≈ N(ln N) − N  (Stirling's approximation)

Example, Entropy Change:

When n moles of an ideal gas doubles its volume in a free expansion, the entropy increase from the initial state i to the final state f is S_f − S_i = nR ln 2. Derive this result with statistical mechanics.

Calculations: The molecules are in a closed container, and the multiplicity W of their microstates can be found by:

  W = N!/(n1! n2!)

Here N is the number of molecules in the n moles of the gas. Initially, with the molecules all in the left half of the container, their (n1, n2) configuration is (N, 0). Finally, with the molecules spread through the full volume, their (n1, n2) configuration is (N/2, N/2). The initial and final entropies are

  S_i = k ln W_i = k ln [N!/(N! 0!)] = k ln 1 = 0
  S_f = k ln W_f = k ln [N!/((N/2)! (N/2)!)]

Now, using Stirling's approximation,

  ln W_f = ln N! − 2 ln (N/2)!
         ≈ [N ln N − N] − 2[(N/2) ln(N/2) − N/2]
         = N ln N − N ln(N/2) = N ln 2

Therefore,

  S_f = k N ln 2

The change in entropy from the initial state to the final state is thus

  S_f − S_i = k N ln 2 − 0 = nR ln 2   (since kN = k n N_A = nR)

which is what we set out to show.

Homework: Problems 12, 17, 34, 43, 46

20.2 Irreversible Processes and Entropy

Examples to show the arrow of time…

Changes in energy within a closed system do not set the direction of irreversible processes. This calls for another state variable to account for the arrow of time: entropy.

Entropy Postulate: If an irreversible process occurs in a closed system, the entropy S of the system always increases; it never decreases.

The entropy change is defined as

  ΔS = S_f − S_i = ∫ dQ/T  (from i to f)

Here Q is the energy transferred as heat to or from the system during the (reversible) process, and T is the temperature of the system in kelvins. The entropy change of an irreversible process can be found with a reversible one connecting the same initial and final states.
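The Stirling step in the derivation above can be checked numerically. This sketch (N = 1000 is an arbitrary illustrative size) compares the exact ln W, computed with the log-gamma function, against the approximation; analytically the Stirling result is exactly N ln 2:

```python
import math

N = 1000  # illustrative molecule count, small enough to handle ln N! exactly

# Exact: ln W = ln N! - 2 ln (N/2)!, using lgamma(n + 1) = ln n!
lnW_exact = math.lgamma(N + 1) - 2 * math.lgamma(N / 2 + 1)

def ln_fact_stirling(n):
    # Stirling's approximation: ln n! ≈ n ln n - n
    return n * math.log(n) - n

lnW_stirling = ln_fact_stirling(N) - 2 * ln_fact_stirling(N / 2)

# Both ratios should be near 1; the Stirling one is exactly 1 analytically
ratio_exact = lnW_exact / (N * math.log(2))
ratio_stirling = lnW_stirling / (N * math.log(2))
print(ratio_exact, ratio_stirling)
```

The exact ratio falls slightly below 1 (the sub-leading terms Stirling drops), and the gap shrinks as N grows, so for N of order 10^24 the approximation S_f = kN ln 2 is effectively exact.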
(Fig.: a free expansion, and the isothermal process used to replace it, connecting the same initial and final states.)

20.4 The Second Law of Thermodynamics

If a process occurs in a closed system, the entropy of the system increases for irreversible processes and remains constant for reversible processes. It never decreases:

  ΔS ≥ 0

Here the greater-than sign applies to irreversible processes and the equals sign to reversible processes. This relation applies only to closed systems. The reversible processes depicted in a p-V diagram, however, can have either sign of entropy change, since they describe only part of a closed system, which also includes the environment.

20.5 Entropy in the Real World: Engine Efficiency

An engine (or a refrigerator) is a system in which some substance undergoes a cycle between two thermal reservoirs of high and low temperatures.
20.5 Entropy in the Real World: Carnot Engine

Heat: |Q_H| = T_H |ΔS| and |Q_L| = T_L |ΔS|, where |ΔS| is the entropy change along each isotherm of the cycle.
Entropy changes: ΔS_H = +|Q_H|/T_H and ΔS_L = −|Q_L|/T_L, which sum to zero over a cycle.
Efficiency: ε_C = W/|Q_H| = 1 − |Q_L|/|Q_H| = 1 − T_L/T_H

The reverse of a Carnot engine is an ideal refrigerator, also called a Carnot refrigerator, whose measure of efficiency, the coefficient of performance, is

  K_C = |Q_L|/W = T_L/(T_H − T_L)

20.7 The Efficiencies of Real Engines

Fig. 20-16 (a) Engine X drives a Carnot refrigerator. (b) If, as claimed, engine X is more efficient than a Carnot engine, then the combination shown in (a) is equivalent to the perfect refrigerator shown here. This violates the second law of thermodynamics, so we conclude that engine X cannot be more efficient than a Carnot engine.

Suppose there is an engine X which has an efficiency ε_X that is greater than ε_C, the Carnot efficiency. When engine X is coupled to a Carnot refrigerator, the work the refrigerator requires per cycle may be made equal to that provided by engine X. Thus, no (external) work is performed on or by the combination engine + refrigerator, which we take as our system.

We have the assumption

  ε_X > ε_C,  i.e.,  W/|Q'_H| > W/|Q_H|

where the primed notation refers to engine X. Therefore,

  |Q_H| > |Q'_H|

which, with the first law applied to each device, finally leads to

  |Q_H| − |Q'_H| = |Q_L| − |Q'_L| = Q > 0

This shows that the net effect of engine X and the Carnot refrigerator working in combination is to transfer energy Q as heat from a low-temperature reservoir to a high-temperature reservoir without the requirement of work. This is a perfect refrigerator, whose existence is a violation of the second law of thermodynamics. Carnot's theorem is proved!
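The engine-X argument can be made concrete with numbers. The sketch below assumes a hypothetical claimed efficiency of 30% against a Carnot efficiency of 25% (all values illustrative, not from the notes):

```python
# Couple a claimed better-than-Carnot engine X to a Carnot refrigerator.
T_H, T_L = 400.0, 300.0    # illustrative reservoir temperatures, K
W = 100.0                  # work per cycle passed from engine X to refrigerator

eff_C = 1.0 - T_L / T_H    # Carnot efficiency = 0.25
eff_X = 0.30               # hypothetical claim: engine X beats Carnot

# Engine X: absorbs Q'_H from the hot reservoir, rejects Q'_L, delivers W
QpH = W / eff_X
QpL = QpH - W

# Carnot refrigerator driven by W: rejects Q_H = W / eff_C to the hot
# reservoir and extracts Q_L = Q_H - W from the cold one
QH = W / eff_C
QL = QH - W

# Net heat moved cold -> hot by the combination, with zero external work
Q_net = QH - QpH
print(Q_net, Q_net > 0)    # positive: a "perfect refrigerator", forbidden
```

The positive Q_net shows heat flowing from cold to hot with no work input, which is exactly the Clausius-statement violation the proof relies on.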
# A system changes its microstate all the time.
# The equilibrium state is the most probable macrostate.

Example, Microstates and Multiplicity:

Suppose that there are 100 indistinguishable molecules in the box of Fig. 20-17. How many microstates are associated with the configuration n1 = 50 and n2 = 50, and with the configuration n1 = 100 and n2 = 0? Interpret the results in terms of the relative probabilities of the two configurations.

KEY IDEA: The multiplicity W of a configuration of indistinguishable molecules in a closed box is the number of independent microstates with that configuration, as given by Eq. 20-20.

Calculations: For the (n1, n2) configuration (50, 50),

  W = N!/(n1! n2!) = 100!/(50! 50!)
    = 9.33 × 10^157 / [(3.04 × 10^64)(3.04 × 10^64)]
    = 1.01 × 10^29.  (Answer)

Similarly, for the configuration (100, 0), we have

  W = N!/(n1! n2!) = 100!/(100! 0!) = 1.  (Answer)

The meaning: A 50-50 distribution is more likely than a 100-0 distribution by the enormous factor of about 1 × 10^29. If you could count, at one per nanosecond, the number of microstates that correspond to the 50-50 distribution, it would take you about 3 × 10^12 years, which is about 200 times longer than the age of the universe. Keep in mind that the 100 molecules used in this sample problem is a very small number. Imagine what these calculated probabilities would be like for a mole of molecules, say about N = 10^24.
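The multiplicities in this example can be reproduced directly with exact integer arithmetic:

```python
import math

N = 100
W_5050 = math.comb(N, 50)    # = 100!/(50! 50!), computed exactly
W_1000 = math.comb(N, 100)   # = 100!/(100! 0!) = 1

print(f"{W_5050:.2e}")       # 1.01e+29
print(W_1000)                # 1

# Counting at one microstate per nanosecond:
years = W_5050 * 1e-9 / (3600 * 24 * 365)
print(f"{years:.0e}")        # 3e+12 years
```

`math.comb` avoids the intermediate factorial overflow and rounding of a floating-point calculation, which matters once N grows past a few hundred.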
Thus, you need never worry about suddenly finding all the air molecules clustering in one corner of your room, with you gasping for air in another corner. So, you can breathe easy because of the physics of entropy.