
MP-1 (SJP, Phys 2020, Fa ‘01)

Modern Physics: (physics from the 20th century...)


What we've been studying so far in 2010/2020 is called classical physics.
Newtonian mechanics, wave phenomena, heat, electricity and magnetism,
optics, wave optics... We have described phenomena of ordinary life (and
sometimes more exotic things!) qualitatively and quantitatively. Classical
physics is a remarkable success story, a deep understanding of nature that
has also led to a significant fraction of the technology in our modern lives.
This physics was developed mostly over the period from the late 1600's
(Newton) through the late 1800's (optics, thermodynamics, and Maxwell's
E+M).

Around 1900, several major revolutions in the way we think about, and
look at, the physical world occurred. The new discoveries are called
modern physics (even though much of it is 70-100 years old!)
Specifically, I'm talking about relativity, and quantum mechanics.
Nuclear and particle physics are spinoffs, in a sense: new exp'tal
observables but using the framework of relativity and quantum mech.

Modern physics doesn't eliminate the need, usefulness, even in some ways
the correctness of classical physics. But it extends our understanding to
regions where classical ideas simply break down - they fail to describe
experiment! Relativity is particularly important when speed (or energy)
gets very high. Quantum mechanics is most useful when size scales are
very small (like inside atoms, or nuclei).

In 1890, some physicists (admittedly, mostly older ones :-) thought
everything in nature could be explained "classically". It seemed just a
question of solving Newton's laws or Maxwell's equations in more
complicated situations. Was physics "done", they wondered -- just a
matter of measuring quantities to more decimal places to improve
accuracy? They were completely wrong! It is impossible to
understand the behavior of, say, atoms, from a purely classical or
Newtonian approach. The world simply does not behave the way Newton (or,
really, most of us!) thinks it does, at extreme scales. Relativity and quantum
mechanics are weird, mind boggling!

``Do not keep saying to yourself, if you can possibly avoid it, `But how can it be like
that?' because you will get `down the drain,' into a blind alley from which nobody has
yet escaped. Nobody knows how it can be like that.''
-- Richard Feynman (one of the greatest physicists of the latter 20th century)

Relativity was pretty much Einstein's idea (though Maxwell, Michelson,
Morley, Lorentz, Mach and many others contributed, of course!) It
involves the idea of how we observe physical events - in particular, how
our description of physics depends on our "reference frame". Galileo had
thought about this long before Einstein, and came up with (correct!)
classical expressions relating observables. The speed or position of an
object depends on the observer in well-defined ways, but the equations
relating the observables are universal and independent of observer. Some
things are relative (like velocity, or the value of a magnetic field), but other
things are not (like F=ma, or Maxwell's equations).

The fundamental principle here, the "relativity principle", was
understood by Galileo as well as Einstein: the basic laws of physics are
the same in all inertial reference frames. The extra step that Einstein took
was small, but had radical implications. Einstein argued, on the basis of
Maxwell's equations (and a lot of thinking!)
that the speed of light, c, is a constant independent of the speed of the
source or observer. This becomes a basic law of physics, with the same
status as, say, Maxwell's equations.

Galileo would have argued otherwise: if you fire a bullet towards me,
YOU say it has a certain speed, say 1000 mi/hr. But if I run away from
you at 999 mi/hr, I see it has a different speed (in this case a harmless 1
mi/hr, the difference 1000 - 999, right?) This works with bullets, but
Einstein says that for light, that's not correct. If you shine a flashlight
towards me, YOU say it has a speed of c=2.99E8 m/s. If I run away from
you at 2.98E8 m/s, I say the light has a speed of .... (drum roll here... are
you about to say the difference, 0.01E8 m/s?) No, the speed I measure is
ALSO c=2.99E8 m/s. That's what Einstein argued, and it has been
checked and verified in countless experiments, including pieces of
technology in use every day today...
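
(If you want to see how the arithmetic changes, here's a rough sketch in Python. The relativistic velocity-addition rule, w = (u - v)/(1 - u*v/c^2), isn't derived in these notes - take it as given for now - but it's what replaces Galileo's simple subtraction.)

```python
# Rough sketch: Galilean vs. relativistic "speed seen by a moving observer".
# The relativistic velocity-addition formula is assumed here, not derived in these notes.
c = 2.99e8  # speed of light, m/s (value used in the text)

def galilean(u, v):
    """Speed of an object (moving at u) as seen by an observer running away at v."""
    return u - v

def relativistic(u, v):
    """Same question, using the relativistic velocity-addition rule."""
    return (u - v) / (1 - u * v / c**2)

# Bullet: 1000 mi/hr, observer running away at 999 mi/hr (1 mi/hr ~ 0.447 m/s)
print(galilean(1000, 999))                            # 1 mi/hr
print(relativistic(1000*0.447, 999*0.447) / 0.447)    # still ~1 mi/hr: the correction is negligible

# Light: u = c, observer running away at 2.98e8 m/s
print(galilean(c, 2.98e8))      # 0.01e8 m/s  <- Galileo's (wrong) answer
print(relativistic(c, 2.98e8))  # 2.99e8 m/s  <- you still measure c!
```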

We're not going to go much further into the theory of relativity, just
because we're running out of lectures this semester! When you start to
think about the consequences of Einstein's postulate, your whole way of
thinking about the world gets shaken. For his postulate to be correct,
neither time nor space can be absolutes. There's not some "cosmic clock"
clicking off seconds out there. Time is relative, the passage of time
depends on the observer!

Closely tied in to that, simultaneity is relative. I might see two events
(say, two stars going nova in different parts of the sky) and say, Hey, they
happened at the same time. But another physicist, moving with respect to
me (perhaps she's an astronomer on "Mars Base") will disagree. She will
argue that one happened first. Who's right? We both are, simultaneity is
NOT an absolute property of events in different places! Weird stuff.
The consequences continue - a particle which decays radioactively has a
certain lifetime. But if it moves quickly with respect to you, it lives
longer. Time is dilated when things move fast. Einstein derived equations
which quantify these effects. They're not important or noticeable unless
the speeds involved are very close to c. But we certainly have observed
them - atomic clocks transported in jets tick at a measurably different rate
than reference clocks on the ground. And GPS systems must correct for
relativistic effects or they won't work (the clocks on GPS satellites move
fast!)
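
(A rough sense of scale, sketched in Python below. The time-dilation factor is gamma = 1/sqrt(1 - v^2/c^2) - we're not deriving it here - and the ~3.9 km/s orbital speed for a GPS satellite is just an approximate number I'm plugging in for illustration.)

```python
import math

c = 3.0e8        # speed of light, m/s
v_gps = 3.9e3    # rough orbital speed of a GPS satellite, m/s (approximate)

gamma = 1.0 / math.sqrt(1.0 - (v_gps / c)**2)   # time-dilation factor (taken as given here)
seconds_per_day = 86400

# How far does the moving clock fall behind, per day? (This is only the special-relativity
# piece; there is also a gravitational effect, which these notes don't cover.)
lag = (gamma - 1.0) * seconds_per_day
print(lag)   # ~7e-6 s per day. Tiny - but light travels ~2 km in 7 microseconds,
             # so an uncorrected GPS fix would drift by roughly kilometers per day.
```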
Einstein does not tell us that everything is relative, he's not throwing away
science! He has unified space and time, into a 4-dimensional "space-time"
that must be considered together to describe these experiments.
Momentum and energy similarly become connected, and the most famous
consequence (derivable just from the simple principle of relativity
described above!) is that E= mc^2, mass and energy are related. Energy
can be converted into mass (and is, regularly, in particle physics
experiments), and vice versa, mass can be converted into energy, and is,
regularly, in the core of our sun, nuclear power plants and bombs, and, in
fact, when you light a match. That's right: E=mc^2 is not just about
nuclear energy. It's about ANY kind of energy, including ordinary
chemical energy like the burning of a match.
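
(Just to see the size of the effect, here's a quick Python estimate. The ~1 kJ of chemical energy for a burning match head is only a ballpark guess on my part, and the solar output is approximate; the point is how tiny the mass change is for chemistry and how huge it is for the sun.)

```python
c = 3.0e8               # m/s
E_match = 1.0e3         # rough guess: ~1 kJ of chemical energy released by a match head
E_sun_per_sec = 3.8e26  # rough solar power output, J/s (approximate)

# E = m c^2  =>  m = E / c^2
print(E_match / c**2)        # ~1e-14 kg: the burned products are lighter by about this much (unmeasurably small!)
print(E_sun_per_sec / c**2)  # ~4e9 kg: the sun converts billions of kg of mass to energy EVERY SECOND
```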
The equations of relativity are not hard - the math involved is nothing
worse than the Pythagorean theorem. Read Chapter 26 to get a
quantitative sense of relativity! At this point, you are all sophisticated and
knowledgeable enough to read about physics on your own, even
something as intimidating as relativity. It is hard to wrap your head
around it, it's not exactly intuitive. But you can understand it! There are
also lots of great books that you could read, e.g. "Relativity Simply
Explained". But we're going to go on to survey a few other topics of
modern physics now...

The other part of modern physics is quantum mechanics. It is, in many
ways, a far more radical theory than relativity. Its consequences are still
being studied, its philosophical implications still a great source of
discussion. It was not developed by one person, but a large group of really
brilliant minds - Max Planck started the process, Einstein was an active
participant (though he never really believed in it!), Niels Bohr, Louis de
Broglie, Werner Heisenberg, Erwin Schrodinger, Max Born, Paul Dirac,
Wolfgang Pauli...

Its development wasn't sudden, like relativity. It began with a series of
puzzles, experiments whose results were in direct conflict with classical
physics predictions. E.g., Maxwell's equations (coupled with some
classical thermal/statistical physics) make a clear prediction about what
the spectrum of radiation from hot objects should be (so-called
"blackbody radiation"; blackbody just means any ordinary material that
absorbs and emits strongly). But the data disagreed. Not by a little bit,
either, it was totally and completely different!

Max Planck was able to explain the data by making a wild and crazy
hypothesis - that light is quantized, it comes in bundles, in chunks, in
"quanta". In particular, if you have light of some color (frequency), there is a
minimum chunk, ONE quantum of light (which we now would call a
photon), and you can't have any LESS light than that (except for none at
all, of course.)

This is a thoroughly preposterous idea. Light is a wave, after all, and you
should be able to make waves with any amplitude, any energy, you like,
right? You can make sound waves of any loudness, can't you? However,
it's not a preposterous idea if light is really "particles" (it doesn't bother
you to say you can only go down to one "quantum" of electricity, namely
e, an electron, does it? You can't have any less electric charge than that,
except for zero. Because electricity comes in chunks, in quanta.) But
light's supposed to be a wave, according to Maxwell, not a particle! So,
this was Planck's wild idea.

He even came up with a formula, a very important one:

E=hf The energy of a photon is a constant times the frequency.
"h" is a new constant of nature, h = 6.63E-34 J*s (h is now called
"Planck's constant"). This formula is strange, not very intuitive!

Planck felt this idea of "quantized light" was mathematical fiction, a way to
describe blackbody data that had nothing to do with reality. But Einstein
found the idea compelling, and asked if there were other consequences of
quantizing light. He used the idea to explain another totally unrelated
puzzle of the day - the "photoelectric effect": light shining on metal
makes electrons fly off. (We use this effect all the time now in solar cells,
photo-detectors, all sorts of applications. Even photosynthesis is a
complex version of this effect...)
Classical physics says waves transmit energy and momentum, (and so can
"kick electrons") but the details are wrong for the photoelectric effect.
Maxwell says low frequency light should be able to lift electrons out of
metal if you wait awhile, or if the light is bright enough. But Einstein says
that if E=hf, then a given color light has a minimum amount of energy. If
that isn't enough to lift an electron out of the "potential well" of the metal,
then NOTHING at all will happen. (If no individual photon can lift out an
electron, none will ever come out...) But, as you change frequency, you
reach a critical color (f) where the energy IS enough, and suddenly lots of
electrons should come popping out, you'll see a "photo-current".
Here is Einstein's prediction for the kinetic energy (KE) of the ejected
electrons as a function of the frequency of light shined on the metal. It's a
simple consequence of conservation of energy!
[Graph: KE of ejected electrons vs. frequency of light, f. No electrons come
out below a threshold frequency f(min); above f(min), KE rises along a
straight line whose slope is h.]
Electrons have a "binding energy" W (also called the "work function" of the
metal; it's the amount of energy required to lift an electron out of the metal.)
That means the electron starts off with NEGATIVE potential energy
(initial electron energy is -W, it's "bound")
Initial energy = hf (from the light) - W (for the electron)
Final energy = KE of the electron.
Conservation of energy says KE = hf - W, the graph shown above. The
slope should be Planck's constant h, which Planck derived in a TOTALLY
different way, looking at glowing metal in ovens.
Completely incorrect according to E+M theory, but absolutely correct in
the lab! A famous experimentalist, Millikan, read Einstein's paper and did
the experiment. He was certain it was all a crock, and worked very hard to
show that Einstein was wrong. But in the end, the prediction of Planck
and Einstein was exactly correct.
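
(If you want to play with the numbers, here's a little sketch in Python. The 2.3 eV work function is a made-up but typical value for an alkali metal; the point is the threshold behavior.)

```python
h = 6.63e-34    # Planck's constant, J*s
c = 3.0e8       # speed of light, m/s
eV = 1.6e-19    # one eV in Joules

W = 2.3 * eV    # assumed work function, typical of an alkali metal (illustrative value)

def KE_ejected(wavelength):
    """Kinetic energy of an ejected electron, KE = hf - W. Negative means NO electron comes out."""
    f = c / wavelength
    return h * f - W

for lam in [700e-9, 500e-9, 400e-9]:   # red, green, violet light
    KE = KE_ejected(lam)
    if KE < 0:
        print(lam, "-> no photoelectrons, no matter how bright the light")
    else:
        print(lam, "->", KE / eV, "eV per ejected electron")
```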

So which is it now, is light a particle or a wave? Countless interference
exp'ts cannot be easily explained by a simple particle model, they require
light to be a wave. Yet, Einstein and Planck show us these new exp'ts that
cannot be easily explained by a wave model! Light has both characters -
you simply can't picture what light really "is" in any classical way! You
really have to use the formulas of quantum mechanics, tied in with
Maxwell's equations. (A rigorous "combined theory" was developed in the 1940's,
by Richard Feynman and others, called "QED" or "Quantum Electrodynamics", but the
basic ideas were all in place from earlier in the century.)

The message is profound - you can't always impose a simple model of the
world on nature. Light is complicated. It was a revolution in physics, and
the model that emerged is still unsatisfying to many people. It's a model
with "wave-particle duality". We describe it accurately, mathematically,
but it's hard to "picture" in any simple, intuitive, classical way.
Classically, you either have waves or particles; you can't have both at the
same time! But you can and must in the real world of quantum physics.
There were many more exp'ts verifying E=hf and the quantum nature of
light. The most famous, and clinching, was in the early 1920's by Arthur
Compton. He managed to shine quanta of light at electrons and watch the
electrons "bounce" off. The light behaved JUST as a particle, a billiard
ball, would! The description of the scattering requires relativity, but
otherwise is just a standard "particle collision", not at ALL what you'd
expect if a light wave were "wiggling" past the electron.

So the idea of "quantizing energy" and quantizing light in particular was
important, but this is not the whole story. Quantum mechanics goes much
further, it's not just about photons. Louis de Broglie (a theorist) argued
that if light can be both wave and particle, why can't electrons also have
both characters? Physicists were getting very wild in their thinking! Come
on, surely electrons are particles... But de Broglie was correct - electrons
do exhibit wavelike behavior in some cases. For example, if you beam
electrons (just like the beam in your TV) towards a pair of slits, you'd
expect two bright spots (where lots of electrons hit) right behind the holes.
But what you see in reality is a diffraction/interference pattern, just like
light going through slits! Beams of particles interfere in a way that only
waves should...

What is the wavelength of a particle?? De Broglie came up with a
formula, based on the conclusion of Einstein (and Compton) that
momentum p = h/lambda for light. De Broglie predicted that

λ = h/p for particles of momentum p.

Let's compute λ for, say, me (a rather large particle) Imagine I'm walking
through some double doors (2 big slits!) Will I diffract? Will I get "bent"
off to some funny angle? Will there be certain angles (the diffraction
minima) that I will never go to? Yikes?! Let's assume I'm walking, so v =
1 m/s, and my momentum is p = mv = 50 kg m/s. (roughly)

λ (Steve) = h/p = 6.63E-34 J*s / (50 kg m/s) = 1.3E-35 m.

Now, remember, you never notice interference effects unless the slit size
is roughly as small as wavelength. So unless the doors are about 10^-35 m
apart, you'll never notice any effect. We're saved from "quantum
weirdness" in the ordinary world! (Atoms are typically 10^-10 m apart,
10^-35 is very small. I wouldn't fit through them if they were that close!)

But what about electrons from a low voltage cathode? Suppose you let an
electron drop through a 100 V difference (no big deal, a couple of car
batteries and a capacitor can do this) Let's have them strike a crystal,
which looks microscopically like a row of atoms about 10^-10 m apart. A
bunch of atoms in a row - it's like a grating, isn't it? The speed of the
electron comes from conservation of energy: 100 eV = 1/2 m v^2.
Solving for v (looking up m = 9.1E-31 kg), I find v = 6E6 m/s.
(Check this yourself, don't forget to convert eV to Joules. Do you see why
one electron dropping through 100 V has 100 eV of energy?)

λ(electron) = h/p = 6.63E-34 J*s / [(9.1E-31 kg)*(6E6 m/s)] = 1.2E-10 m
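
(Here's a quick Python check of both numbers - nothing new, just the arithmetic from this page.)

```python
import math

h = 6.63e-34      # Planck's constant, J*s
m_e = 9.1e-31     # electron mass, kg
eV = 1.6e-19      # one eV in Joules

# "Steve" walking through double doors
p_steve = 50.0 * 1.0            # 50 kg at 1 m/s
print(h / p_steve)              # ~1.3e-35 m: hopelessly tiny wavelength

# Electron accelerated through 100 V: (1/2) m v^2 = 100 eV
v = math.sqrt(2 * 100 * eV / m_e)
print(v)                        # ~6e6 m/s
print(h / (m_e * v))            # ~1.2e-10 m, comparable to atomic spacing in a crystal
```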

Cool. The electron DOES have a wavelength comparable to the spacing of
the "slits" between the atoms. Electrons should (and do!) diffract when
they strike a crystal face. The experiment was done in 1927, and many
more since then. Particles have a "wave nature".

This is the principle of the electron microscope! You can only "look" at
objects if the wavelength of light you are using is smaller than the object.
(If λ is bigger than the object, the light diffracts and you can't get a good
shadow, a good image.) If you want to look at something smaller than
visible light's wavelength (a few hundred nm), then you need to use a
"light" source with smaller wavelength. You could still use EM radiation,
that would mean going into the "X-ray" region. But it's hard to steer X-
rays, there are no good "lenses" to focus them. Instead, you can use
electrons. They're charged, so they're very easy to steer with E and B
fields (you can build "lenses" for electrons with magnets!) and by varying
their energy, you can pick wavelengths that are small enough to image
individual atoms! Very cool practical application of quantum physics.

It turns out that all particles have wavelike properties. Wave-particle
duality extends to everything. Neutrons, e.g., have MORE mass, so more
momentum, hence smaller wavelength. We use "neutron interferometers" if
we want interference effects with very low energy particles... People have
even done experiments where they "interfered" atom beams to get "bright
and dark" bands. (The grating was a standing wave of light, so the BEAM
was particles and the GRATING was light, exactly backwards of Chapter 24!)

The equations here are simple, nothing new is required:

d sin(theta) = m lambda (for maxima) is true for particles too....
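
(For example, here's the arithmetic in Python. The 2E-10 m "slit spacing" for the crystal is just an illustrative atomic spacing, chosen a bit bigger than the electron's wavelength so that a first-order maximum exists.)

```python
import math

# Steve (lambda ~ 1.3e-35 m) walking through doors ~1 m apart:
lam_steve, d_doors = 1.3e-35, 1.0
print(math.degrees(math.asin(lam_steve / d_doors)))   # first maximum at ~7e-34 degrees: utterly unobservable

# 100 eV electron (lambda ~ 1.2e-10 m) hitting a crystal; assume an illustrative
# atomic spacing of 2e-10 m (real crystals are in this ballpark)
lam_e, d_crystal = 1.2e-10, 2.0e-10
print(math.degrees(math.asin(lam_e / d_crystal)))      # first maximum at ~37 degrees: easy to see!
```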

None of these "quantum" effects I'm discussing were noticed until the
20th century because of the numerical scales, the subtlety of the effects -
as our example of "me" diffracting showed. Recall the limit of short
wavelengths in optics gives "ray optics", which is really particle-like
behavior! We talked about that in Chapter 24 - when the slits are big
enough, you don't NOTICE that light is a wave. Niels Bohr came up with
an idea called the correspondence principle that basically says any new
theory of nature (like Einstein's, or de Broglie's) should still give all the
old usual well-tested classical results if applied to ordinary situations. It's
only when you look in new regimes (like high speed, or tiny systems like
atoms) that you start to get these new (even crazy) phenomena. Relativity
is like that too - Newton's laws break down at high speeds, but in the limit
of speeds much less than c, all the formulas reduce to the classical ones.
(You can show this mathematically!) Classical physics is an
approximation to modern physics, but an incredibly accurate one in
ordinary world situations.

I'm afraid quantum mechanics gets much weirder still. It's not just that
particles have wave character - that alone would be something I think we
could accept. It's much worse - their behavior is not fully deterministic! I
think this is the craziest and most radical break from classical physics of
all. When I describe a particle as a "wave", I'm really saying that the
particle is described by a mathematical "wave function" that tells you the
probability of finding the particle somewhere. It's not that the electron
wiggles like a wave - it's more that the location of the electron is not
exactly predetermined - where the wave function is big, the electron is
likely to be. Where the wave function is small, the electron is not likely to
be. The "wave" here is a probability wave. This is hard to think about!

There is an inherent uncertainty in all quantum measurements. If you try
to measure one aspect of a particle (e.g. its position), you can do so, but
the act of measurement "disturbs the wave", and results in an increased
uncertainty about some other aspect (e.g. the momentum). Heisenberg's
uncertainty principle quantified this relationship. It's not just a statement
about our ability to make measurements, it's a more fundamental
statement about the nature of reality. Nature does not know the position
and momentum of an electron at the same time, because an electron is
described only probabilistically! There is an intrinsic element of
uncertainty in certain observables (although not all).

Einstein helped start the theory of quantum mechanics, but he rejected
philosophically the conclusions that it inevitably led to. "God does not
play dice with the Universe", he said. Alas, he was fundamentally
incorrect about this! There is at least as good experimental evidence for
the correctness of quantum physics as there is for relativity - the
consequences of the Heisenberg Uncertainty principle are unavoidable
and observable. (Indeed, scanning tunneling microscopes make use of it to
function!)

Quantum mechanics is a whole new way of thinking about reality. The
world is not exactly "clockwork", there is an element of unpredictability
that runs counter to a lot of people's intuitions about science. But let's
leave this quantum weirdness and look at something a little more
concrete: atoms.

By the turn of the century, people roughly knew the mass of an atom, and
they had estimated the size of atoms to be about 1 Angstrom (10^-10 m)
from chemistry experiments. They knew of the existence of electrons.
(Thomson did experiments, like our class demo with the CRT, to measure
e/m, and Millikan did experiments to measure e.) People had postulated
the existence of nuclei - Rutherford scattered alpha particles (helium nuclei)
off the much heavier atoms in a thin metal foil, and concluded that the
nucleus is tiny - atoms are
mostly empty space. He came up with a "planetary" model: a nucleus at
the center, and an electron orbiting, like earth around the sun. Only it's
electricity that holds the electron in, not gravity. The nucleus is positively
charged and massive: 99.9% of the mass of the atom. The electron is
negatively charged, and orbits at a radius of about 10^-10 m.
The nucleus itself is about 10^-15 m in size, a "femtometer" or "fm" (also
called a Fermi).
The simplest atom, hydrogen, is the lightest. It consists of one proton in
the center, and one electron in orbit. (Helium is the next biggest, with 2
protons, 2 neutrons, to make the mass correct, and 2 electrons in orbit.
And so on up the periodic table.)
It was a neat, simple picture. But it was also wrong!
1) Electrons in orbit are accelerating (centripetally), and Maxwell says
they should radiate. The expected lifetime is calculable: a fraction of a
second. Matter should vanish in a puff of radiation in less than 1 second!
2) Hydrogen emits only certain frequencies of light when excited. Why?
The planetary model should allow a continuum of energy - electrons can
have any energy, atoms should just glow with a broad spectrum of energy.
2b) The light emitted by hydrogen was examined in spectrometers, and
only a few specific "lines" or colors were seen. (You'll study this in lab 6)
There was even a formula found by Balmer (a high school teacher):
1/λ = R ( 1/2^2 − 1/n^2 )
where R was a constant (obtained from the hydrogen data), and n is any
integer. You plug in an integer n, and out pops an experimentally
observed wavelength!

The formula had NO theoretical basis, it was purely "empirical"...



Niels Bohr applied the fledgling ideas of quantized energy and photons to
understand atoms. His model was not full-blown quantum mechanics!
That took another few years. (A solid theory of the atom required Schrodinger's
"quantum wave equation", which put together de Broglie's ideas in a more rigorous
way, and was then improved by Dirac in the late 20's, and finally by QED in the 40's)
But Bohr came first - he constructed a "semi-quantum" picture, a model,
of atoms, that explained some classically inexplicable behavior. It paved
the way to the full blown "Quantum mechanics" still to come.
Here were Bohr's "postulates":
1) Nature only allows certain orbital radii. Electrons cannot sit at any
random radius (with therefore any old energy). (Phys 2010 will tell us
how energy in a circular orbit is determined from radius!) Bohr didn't
know the reason, he just guessed the fact that electron radii, and thus
energies, are quantized. Only certain values are allowed.
(By the way, de Broglie was soon able to explain why! If you want a "wave" to wiggle
in a certain space, the wavelength has to fit. A guitar string only allows certain special
wavelengths: those that are the length of the string, or L/2, or L/3,... They have to fit.
Similarly, if an electron is a wave, the wavelength (hence momentum, hence energy) is
restricted by geometry. But never mind for now, Bohr didn't know this...)

2) Angular momentum, r*p, is ALSO quantized. Only certain values of
r*p are allowed: r*p = n h/(2 pi).
(h is Planck's constant, and n is any integer.)
Where the heck did that come from? Well... the units are right, "h" has units of angular
momentum, and since "h" quantized light energy, maybe it quantizes angular
momentum too? But why the 2 pi? And why quantize angular momentum? For Bohr, it
was purely "ad hoc" - it just explained data. De Broglie's idea that "wavelength has to
fit into the orbital circumference" derives this result too. Schrodinger's wave equation
derives it even more rigorously (and fixes it up a little). But like I said, Bohr had a
crude model, not a theory! So let's go on...

3) If an electron is in an allowed orbit, it won't radiate. This is a
"postulate", not an explanation. It just doesn't radiate, says Bohr.
But if it changes orbits (jumps from one allowed radius, to another),
THEN it will radiate. It will emit a photon. The energy of the photon must
conserve total energy:

E = hf of the emitted photon = (Einitial - Efinal) of the electron in orbit.

Those are Bohr's postulates. Let's look at the consequences of Bohr's
model for the simplest atom, hydrogen:

Consider an electron (mass m) in orbit around a proton.
[Figure: the electron orbits, at radius r, a nucleus of charge +Q, where
Q = Ze, with Z = 1, 2, 3, ...]
Newton says F = ma. (So Bohr is "quasi-classical", he's still invoking
Newton's laws!)
Circular orbits have a = v^2/r
(where r is the radius of the electron orbit: that's just Phys 2010 kinematics!)
The force is purely electrical, Coulomb's law says
F = k Q1 Q2 / r^2 = k (Ze)(e) / r^2.
Put this together: kZe^2/r^2 = m v^2/r. (Eq'n #1)
Now, Bohr argues angular momentum is quantized:
r*p = n h/(2 pi).

(We will define hbar = h/(2π), called "h-bar", simply so we don't have to
keep writing those 2 pi's all the time.)
Anyway, Bohr's quantization rule says r*(mv) = n hbar, or
v = n hbar / (m r) (Eq'n #2)
Plug this into Eq'n #1 above, and then get "r" by itself (try it yourself!)
I get r = (n^2 / Z) * ( hbar^2 / (k e^2 m) ) (Eq'n #3)
The stuff in parentheses is just a bunch of constants of nature you can
look up in Giancoli. It's a calculator problem, I get 0.53 E-10 m.

(Remember, from the figure, Z is an integer: just the number of protons in
the nucleus. For hydrogen, Z=1.)

This formula gives us many different radii - you plug in n (the "quantum
number of the orbit") and you'll get out a radius.
Once you have r, you also have a corresponding v (from Eq'n #2)

Bohr's postulate of quantized angular momentum tells us that the radius of
the orbit is also quantized. But in a funny way, it goes like n^2.

For hydrogen atoms (Z=1),
r1, the minimum radius, is 0.53 E-10 m. This is called the "Bohr radius".
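
(Here's the "calculator problem" done in Python, so you can watch the constants go in and the size of an atom come out.)

```python
hbar = 1.055e-34   # h/(2 pi), J*s
k    = 8.99e9      # Coulomb constant, N*m^2/C^2
e    = 1.60e-19    # electron charge, C
m_e  = 9.11e-31    # electron mass, kg

def bohr_radius(n=1, Z=1):
    """Eq'n #3: r = (n^2/Z) * hbar^2 / (k e^2 m)."""
    return (n**2 / Z) * hbar**2 / (k * e**2 * m_e)

print(bohr_radius())      # ~0.53e-10 m: the size of a hydrogen atom, from constants of nature!
print(bohr_radius(n=2))   # the n=2 orbit is 4x bigger (it goes like n^2)
```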

This is a big deal! Remember, I told you that atoms are experimentally
measured to be about 10^-10 m in size, which is just what the Bohr radius
gives us. The Bohr model is predicting, from fundamental constants, what
the size of atoms should be! Think about that a little. Could you have
imagined that humans could CALCULATE the size of the building blocks
of nature, just by knowing what seem to be COMPLETELY unrelated
constants like "h" and "k"?... Remember, "h" is measured from the energy
of photons, from hot glowing objects, or light shining on metal - it's a
universal constant of nature that has nothing (apparently) to do with
atoms, it's about light. And "k" is the constant from Coulomb's law, it tells
you about sparks, and cat fur. It tells you about the forces between
charges, even just electrons - no atoms required to measure it! Nature is
all tied together - the properties of light, and electricity are intimately
hooked into the objects that make up stuff. It's really very cool!

How about energy? Recall from Ch. 17 the potential V near a point charge
Q is V = kQ/r. And, potential energy U = q*V, so
E(total) = PE + KE = 1/2 m v^2 + k (Ze) (-e) / r.
(Because "Q" = Ze = charge of the nucleus, "q"= -e = charge of electron)
Now plug in "v" and "r" we just found, (try it, if you like algebra puzzles!)

I get E = - (Z^2 / n^2) * ( e^4 m k^2 / (2 hbar^2) )

Once again, the mess in parentheses just involves a bunch of constants of
nature. (m is the electron mass, etc.) When I punch it into my calculator, it
looks nicer if I convert to eV instead of Joules, I get 13.6 eV.

So energy is also quantized, again in a funny way. (Like 1/n^2).

The energy came out negative but that makes sense. Electrons are bound
in hydrogen, you must add energy to get the electron off to infinity.
Indeed, this formula lets us predict that energy, the "binding energy" of
hydrogen.
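
(Again, the arithmetic in Python - same constants as before, and out pops the binding energy of hydrogen.)

```python
hbar = 1.055e-34   # J*s
k    = 8.99e9      # N*m^2/C^2
e    = 1.60e-19    # C
m_e  = 9.11e-31    # kg
eV   = 1.60e-19    # J per eV

def bohr_energy(n=1, Z=1):
    """E = -(Z^2/n^2) * e^4 m k^2 / (2 hbar^2), in Joules."""
    return -(Z**2 / n**2) * e**4 * m_e * k**2 / (2 * hbar**2)

print(bohr_energy() / eV)      # ~ -13.6 eV: the ground state of hydrogen
print(bohr_energy(n=2) / eV)   # ~ -3.4 eV
print(bohr_energy(n=3) / eV)   # ~ -1.5 eV
```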

Bohr has concluded that a single integer, "n", the "principal quantum
number", tells you the radius, speed, energy, and angular momentum.
Everything you'd want to know about the atom, basically.

Let's examine these "allowed" or "quantized" energies.

[Energy level diagram for hydrogen, E(n) = -13.6 eV / n^2:]
n=∞    E(∞) = 0 eV
n=3    -13.6 eV/(3^2) = -1.5 eV
n=2    -13.6 eV/(2^2) = -3.4 eV
n=1    -13.6 eV/(1^2) = -13.6 eV

n=1 is the lowest orbit, the most deeply bound. You need to add 13.6 eV
to it to make the electron escape to infinity. That's the binding energy of
hydrogen, from chemistry. It's bang on correct.
Another great success of the model - the chemical binding energy is
calculated exactly. And now remember Bohr's idea #3, that if you're in
one of these allowed states (called "orbitals" or "energy levels"), you're
stable, and orbit without radiating, but you can also jump to another level.
If you jump down in energy, a photon will be emitted, with
E(photon) = E(init) -E(final) = -13.6 eV( 1/n[init]^2 - 1/n[final]^2)

If you jump down from some excited level to n=2, the photon leaves with
E(photon) = +13.6 eV(1/2^2 - 1/n[init]^2).
Since photons have E = hf (from Planck), and f*λ=c (from Maxwell),
we have 1/lambda = f/c = E/(hc) = 13.6 eV/(hc) (1/2^2 - 1/n^2)

This is Mr. Balmer's formula! And, Bohr has predicted the factor out
front, it comes from fundamental constants. Think about this - Bohr
predicts the specific colors of light that hydrogen should emit, all from
first principles. Another remarkable achievement. Experimentally, the
accuracy is stunning. Although the theory is crude, it works well.
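
(Here's the prediction in Python: plug n(init) = 3, 4, 5, ... into Bohr's formula and out come the visible hydrogen lines - the red, blue-green, and violet lines you'll see in lab.)

```python
h  = 6.63e-34    # J*s
c  = 3.0e8       # m/s
eV = 1.6e-19     # J per eV

def balmer_wavelength(n_init):
    """Photon wavelength for a jump from n_init down to n=2: E = 13.6 eV (1/2^2 - 1/n^2), lambda = hc/E."""
    E = 13.6 * eV * (1/2**2 - 1/n_init**2)
    return h * c / E

for n in [3, 4, 5, 6]:
    print(n, "->", balmer_wavelength(n) * 1e9, "nm")
# ~656, 486, 434, 410 nm: the visible lines hydrogen actually emits
```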

Now, why does nf have to be 2? Can't you jump down to nf=1? Sure!
But if you plug in the numbers, you discover that such wavelengths are
shorter than visible (they're UV). So Bohr is now not just matching data,
he's predicting a whole bunch of additional "spectral lines". When you
take a UV detector and look for them, they're there, and exactly predicted
by the Bohr formula.

Bohr also predicts the absorption spectrum. For cold hydrogen, we expect
the electrons to all start in n=1 (the "ground state", lowest possible
energy), so if you shine light through cold hydrogen, you should absorb
photons if they have E = -13.6 eV ( 1/n^2 - 1/1^2) And that's also right!

Quantum mechanics goes much further than Bohr's idea. As I said before,
Bohr's model is still mostly classical. He's thinking of the electron as a
small object that obeys Newton's law. It has a radius (position) and
velocity (momentum). But in the end, we have learned that's not true. The
electron's position is uncertain, so is its momentum. We can derive and
calculate a "wave function" for the electron, which tells where it is likely
to be if you make a measurement. The energies that Bohr calculated turn
out to be the same in the full quantum calculation, but the model, the
picture, of the atom is different. I think of it more as a "cloud" of electron,
but even that's not right - the electron is not fuzzy, it's just that no one
knows where it is until we look for it. It has a probability that is "spread
out like a cloud"... I'm sorry, it's hard to describe, or picture. But the
mathematics gives spectacular predictions - not just the energies of
photons, but magnetic properties, electric properties, chemical properties.
Everything you can measure about hydrogen atoms is calculable from first
principles, involving only fundamental constants of nature. At the same
time that physics has become indeterministic in some respects, it has also
become grandly unified and accurate in others!

I'm afraid a few pages about quantum physics can't possibly do it justice.
Like relativity, you can read Ch. 27 and 28 to learn some basic formulas
which allow you to understand real life applications - tunneling
microscopes, fluorescence, how lasers work. And, why nuclei are
radioactive... And there are tons of books in any bookstore that try to
"popularize" it. Unlike relativity, which you can really learn and
understand on your own, I think quantum is much harder. A physics major
will take three full courses in quantum mechanics as an undergraduate,
and another 3 as a grad student... (and spend about a week or two on
relativity.) And then you can use quantum mechanics to make wonderful
and accurate predictions, but to really start to understand it? Well, I'm still
working on that myself :-)
