Thermodynamics of Biomolecular Systems
Instructors: Linda G. Griffith, Kimberly Hamad-Schifferli, Moungi G. Bawendi, Robert W. Field
Lecture 13
5.60/20.110/2.772
U(S, V, n_i):   (dU)_{S,V,n} ≤ 0
A(T, V, n_i):   (dA)_{T,V,n} ≤ 0
H(S, P, n_i):   (dH)_{S,P,n} ≤ 0
G(T, P, n_i):   (dG)_{T,P,n} ≤ 0
These tell you which way a reaction will go, and when it will stop or reach equilibrium.

We now want these thermodynamic properties in terms of a molecular-level picture.

Before: ignore molecular properties.
Now we want to account for them: a molecule exists in states and has energy levels (recall 5.111).

If we have a large number of molecules (N_A = 6 × 10^23), it is nearly impossible to track all of the states. Mind boggling!

How? Use probability to describe the molecules instead of tracking each one.

p_i = probability that a molecule will be in the ith state. This is the link between microscopic and macroscopic:
Boltzmann Equation

S = k ln W,   where W = multiplicity

In terms of probabilities:

S = -k Σ_i p_i ln p_i
Approach:
make model to represent problem
describe the states a system can be in
use statistics and probability to average
relate to a thermodynamic property (U, S, etc)
DEMYSTIFYING S
For N distinguishable objects, the number of arrangements is

W = N (N − 1)(N − 2) ··· 3 · 2 · 1 = N!

Example: 4-mer with A, T, G, C.
How many different ways can these 4 nucleotides be arranged?
Write out all possible arrangements:
ACGT
CAGT
GACT
TACG
ACTG
CATG
GATC
TAGC
AGCT
CGAT
GCAT
TCAG
AGTC
CGTA
GCTA
TCGA
ATCG
CTAG
GTAC
TGAC
ATGC
CTGA
GTCA
TGCA
W = 24 ways

Using the equation, with 4 available slots: __ __ __ __

W = 4 · 3 · 2 · 1 = 4! = 24 possible arrangements
(remember: 0! =1)
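The count above can be checked directly. A small sketch (Python, not part of the original notes) that enumerates every ordering of the 4-mer:

```python
# Enumerate all distinct arrangements of the 4-mer A, C, G, T
# and compare with W = 4!  (a sketch, not part of the notes).
from itertools import permutations
from math import factorial

arrangements = {"".join(p) for p in permutations("ACGT")}
print(len(arrangements))  # 24 distinct orderings
print(factorial(4))       # 4! = 24
```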
Indistinguishability: What if two of the letters are the same?
Example: 3-mer with A, A, G.
Two of the nucleotides are indistinguishable (can't tell them apart).
First: pretend A and A are distinguishable: A1 A2
G A1 A2
G A2 A1
A1 G A2
A2 G A1
A1 A2 G
A2 A1 G
W = 6 ways

Then: take away the labels. Both columns are the same, so W = 3 ways.

To account for indistinguishability, divide by N_A!, where N_A = number of A's = 2:

W = N! / N_A! = (3 · 2 · 1)/(2 · 1) = 3
General equation: N objects, t categories:

W = N! / (n_1! n_2! ... n_t!)

For two categories (n of one kind, N − n of the other), this is the binomial coefficient:

W(n, N) = (N choose n) = N! / (n! (N − n)!)
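The general equation can be sketched in a few lines of Python (the helper name `multiplicity` is mine, not from the notes):

```python
# W = N! / (n1! n2! ... nt!) for N objects in t categories
# (hypothetical helper, not from the notes).
from math import factorial

def multiplicity(*counts):
    """Number of distinguishable arrangements of sum(counts) objects."""
    w = factorial(sum(counts))
    for n in counts:
        w //= factorial(n)  # divide out indistinguishable orderings
    return w

print(multiplicity(2, 1))  # AAG example: 3!/(2! 1!) = 3
print(multiplicity(4, 4))  # two categories: 8!/(4! 4!) = 70
```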
Example: a lattice of 8 sites divided by a wall, with 4 black (B) and 4 white (W) particles. The degree of freedom is the number of black particles and the number of white particles on either side of the wall.

W_tot = N! / (n_B! n_W!) = 8! / (4! 4!) = 70

Now: count the number of ways each configuration (number of B on the left) is possible, using

W = W_left × W_right
A: 4B | 0B    W = (4!/4!0!)(4!/0!4!) = 1
B: 3B | 1B    W = (4!/3!1!)(4!/1!3!) = 16
C: 2B | 2B    W = (4!/2!2!)(4!/2!2!) = 36
D: 1B | 3B    W = (4!/1!3!)(4!/3!1!) = 16
E: 0B | 4B    W = (4!/0!4!)(4!/4!0!) = 1

W_tot = 70 ways
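The configuration count above can be verified with a short sketch (assuming, as in the example, 4 lattice sites on each side of the wall):

```python
# Verify W = W_left * W_right for each configuration and that the
# multiplicities sum to W_tot = 8!/(4! 4!) = 70  (a sketch).
from math import comb

sites_per_side = 4
configs = {}
for n_b_left in range(sites_per_side + 1):
    w_left = comb(sites_per_side, n_b_left)                    # place B's on the left
    w_right = comb(sites_per_side, sites_per_side - n_b_left)  # remaining B's on the right
    configs[n_b_left] = w_left * w_right

print(configs)                # {0: 1, 1: 16, 2: 36, 3: 16, 4: 1}
print(sum(configs.values()))  # 70
```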
Case C has the highest entropy because it has the highest multiplicity; therefore, it is the state the system is most likely to be in. This is also confirmed by our intuition about the problem: C is the most disordered case.
S = k ln W

This is akin to comparing W = 36 (case C) vs. W = 1 (case A): maximizing W maximizes S. Tendency toward disorder.
Probability: the likelihood that the system is occupying a given state.

p_A = n_A / N

where N = number of total outcomes, n_A = number of outcomes of category A, and p_A = probability of obtaining an outcome of type A.

Example: die. N = 6, n_odd = 3:

p_odd = n_odd / N = 3/6 = 1/2
Addition Rule:
Probability of outcome A OR outcome B (mutually exclusive outcomes):

p(A or B) = p_A + p_B

Multiplication Rule:
Probability of outcome A AND outcome B (independent outcomes):

p(A and B) = p_A p_B

If the outcomes are collectively exhaustive:

Σ_{i=1}^{t} p_i = 1
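A quick sketch of the two rules for a fair die (assuming, as above, mutually exclusive outcomes for OR and independent rolls for AND):

```python
# Addition and multiplication rules for a fair die (sketch).
p_face = 1 / 6

# OR: probability of rolling a 1 or a 2 (mutually exclusive outcomes)
p_1_or_2 = p_face + p_face

# AND: probability of rolling a 6 twice in a row (independent rolls)
p_double_six = p_face * p_face

# Collectively exhaustive: the six faces sum to 1
p_total = 6 * p_face

print(round(p_1_or_2, 4))      # 0.3333
print(round(p_double_six, 4))  # 0.0278
print(round(p_total, 10))      # 1.0
```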
Probability Distributions

Plotting p_i vs. i gives you a probability distribution. For example, the probabilities of rolling a die may look like either case A or case B:

[Figure: two bar charts of p_i (0 to 1.00) vs. outcome i for a die. Case A: flat distribution, p_i = 1/6 for every face. Case B: peaked distribution, p = 6/6 = 1 for one face and 0/6 = 0 for the rest. Annotations: case A has S = -k Σ p_i ln p_i = k ln 6 = 1.79k; case B has S = 0.]
Let's use this to calculate S for the two probability distributions above. Of the two, we can guess that case A has the higher entropy: it is more disordered. But let's calculate a number using

S = -k Σ_i p_i ln p_i

Case A (flat):    S = -k · 6 · (1/6) ln(1/6) = k ln 6 = 1.79k
Case B (peaked):  S = -k · 1 · ln 1 = 0

We can see that flatter distributions have higher entropy. What does this mean? If you are distributing particles among energy levels, distributing them evenly leads to the highest entropy.

But what we usually observe is that things tend toward lower energies. Why is that? We have placed no constraints on the system. But don't worry, we will!
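The two entropies can be checked numerically; a sketch in units of k (the function name is mine, not from the notes):

```python
# S/k = -sum(p_i ln p_i) for the flat vs. peaked die distributions
# (sketch; entropy reported in units of k).
from math import log

def entropy_over_k(probs):
    """Gibbs entropy S/k; outcomes with p = 0 contribute nothing."""
    return -sum(p * log(p) for p in probs if p > 0)

uniform = [1 / 6] * 6        # case A: flat distribution
peaked = [1, 0, 0, 0, 0, 0]  # case B: one certain outcome

print(round(entropy_over_k(uniform), 2))  # 1.79 (= ln 6)
print(entropy_over_k(peaked) == 0)        # True
```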
Given a probability distribution, we can compute the average value of some property X:

⟨X⟩ = (Σ_{all states} x_i p_i) / (Σ_{all states} p_i)

Example: die:

⟨x⟩ = 1·(1/6) + 2·(1/6) + 3·(1/6) + 4·(1/6) + 5·(1/6) + 6·(1/6) = 3.5

Similarly, the average energy:

⟨ε⟩ = (Σ_{all states} ε_i p_i) / (Σ_{all states} p_i)
This is our constraint: that the total energy has to sum up to a particular value.
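The die average above can be sketched directly:

```python
# <x> = sum(x_i * p_i) for a fair die (sketch).
p = 1 / 6  # uniform probability for each face

avg = sum(x * p for x in range(1, 7))
print(round(avg, 2))  # 3.5
```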
Stirling's Approximation:

To compute S = k ln W with W = N! or W = N!/(n_1! n_2! ...), we need ln n! for large n. Stirling's approximation:

n! ≈ (n/e)^n,   so   ln n! ≈ n ln n − n
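A sketch comparing ln n! (computed via the log-gamma function) with the approximation; for n = 10 the two differ noticeably, but by n = 10,000 the relative error is below 0.01%:

```python
# ln n! vs. Stirling's approximation n ln n - n (sketch).
from math import lgamma, log

def stirling_ln_factorial(n):
    return n * log(n) - n

for n in (10, 100, 10_000):
    exact = lgamma(n + 1)  # lgamma(n + 1) = ln(n!)
    approx = stirling_ln_factorial(n)
    print(n, round(exact, 1), round(approx, 1))
```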