
April 15, 2025 58 mins

Daniel and Kelly try to organize their thoughts about the disorderly topic of entropy.

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:06):
There are so many topics in physics that are hard
to grapple with, to deeply understand, and it's not helped
by the fact that often physicists have given these things
confusing names. Electrons, for example, have quantum spin, but they're
not actually spinning. Moments of inertia have nothing to do
with moments of time. Work and power have very different

(00:28):
meanings in physics and in English. But sometimes even when
physicists invent a brand new word to convey a new idea,
it's still slippery to grasp. So today in the pod,
we're going to try to grab hold of one of
the trickiest concepts in physics, one that's often tossed about
and attached to a simple explanation, but whose subtle power

(00:50):
isn't usually clearly explained. Today in the pod, we are
tackling entropy. What is it? Does it explain why our
teenagers' rooms are so messy, or why coffee spills out
of cups but not back into them. Does it tell
us about the fate of the universe or the nature
of time? Is it about order and chaos? Why did
physicists even devise this concept? Welcome to Daniel and Kelly's

(01:14):
Extraordinarily Disorderly Universe.

Speaker 2 (01:31):
Hello. My name is Kelly Weinersmith, and I often
make jokes about entropy increasing in my home, and today
I'm gonna find out if that joke is scientifically accurate
or not. Hi.

Speaker 1 (01:42):
I'm Daniel. I'm a particle physicist, and I think
that there's always one person in the marriage who's more
orderly and one person who's more chaotic.

Speaker 2 (01:50):
Yeah, it's pretty easy to identify who's who in my marriage.
Are you the more orderly or the more chaotic in
your marriage? Katrina seems pretty orderly, but so do you.

Speaker 1 (02:01):
I don't want to slander my wife on air, especially
if she's not here to defend herself, so I'll say
that in some categories, I'm more orderly, and in some categories,
she's more on top of stuff, and that's why we're
such a good team.

Speaker 2 (02:12):
Ah, I'm going to slander my husband. He's the chaotic
force in our family, he absolutely is. But he's also,
you know, a lot of the creative force. So it
works out.

Speaker 1 (02:20):
Yeah, exactly. Well. The yin and yang is what makes
it exciting, isn't it.

Speaker 2 (02:25):
It keeps things fresh, that's for sure. So I was
thinking about the topic for today, and when I think
about entropy, the first thing that comes to mind is
jazz music. And when I googled entropy and jazz, actually
a lot of things came up. People have like studied
jazz music through the lens of entropy. Do you like
jazz music? What do you think?

Speaker 1 (02:45):
Wow, that's fascinating. I never connected entropy and jazz music.
There's some jazz music I like, but I like things
that are a little bit more melodic. So just like
a wandering sprinkle of notes doesn't do it for me.
I need a little bit of rhythm and some beat
to it and whatever. So I'm more of a blues
guy than jazz, how about you.

Speaker 2 (03:03):
It depends on my mood. There are some moods that
I'm in where jazz is exactly what I want and
it's exactly what I need to listen to while I'm writing.
And then there's other moods where it does make me
feel kind of frustrated and overwhelmed. So, but you know,
I feel that way for a lot of different kinds
of music. I've got like very particular kinds of music
for different kinds of moods.

Speaker 1 (03:21):
I appreciate the jazz nerds though, because they help me
understand how with almost any human endeavor there are so many
layers to it, you know, like people who appreciate wine
and they can tell the difference between like a five
hundred dollar bottle of wine and a five thousand dollar
bottle of wine, whereas like I can't tell the difference
between ten dollar and twenty dollar bottles of wine. Or
like how many levels of skill there are to chess,

(03:42):
you know, where like a level twenty person will beat
the level eighteen person, who will beat the level seventeen person, consistently.
I feel that same way about jazz, because people talk
about it and they're like, Wow, this guy is a
genius and so amazing in this way and that way,
and I'm just like at the beginning of a journey
of appreciating that. But I love when human culture goes
really deep on something and people could appreciate the fine

(04:02):
nuances of it.

Speaker 2 (04:04):
Yeah, you know, if I had endless time and money,
I would love to just like jump into different cultures
and just sort of like appreciate and absorb, you know,
the various features that are exciting to them and just
sort of enjoy their culture for a little while. And
I very briefly jumped into jazz as I've had different
friends who do jazz things and it's a very cool culture.
But I am still at the early phases of like

(04:24):
what is good? I don't know. I just know that
this makes me smile.

Speaker 1 (04:29):
Well, something I do have very strong opinions about because
I spent a lot of time perfecting my tastes about
it is pizza making. I'm an at home pizza maker,
and I have like strong opinions about like, you know,
the stretchiness of the dough and the puffiness of the
crust and the darkness of the sauce and whatever, because
I've made a lot of pizzas in my house, so
I know like exactly what I like in a pizza.

(04:50):
What have you nerded out on? What are you like
a level twenty expert in?

Speaker 2 (04:54):
Hmm? Well, so first I'll say that my husband also
nerds out on pizza, which is making me wonder if
we should make pizza when you're here to visit or
if we shouldn't. Uh oh, Zach and I will have
a discussion about that. What am I a level twenty
nerd on. I don't know. I guess I'm sort of
a generalist. I like lots of things. I mean, I'm
really into the moths in our area, and I spent

(05:17):
two years learning about Russian and Russian culture and like
the language. But I don't know if I'm like a
level twenty on either one of those things.

Speaker 1 (05:25):
Well, I can tell you, moths are not
very good on pizza. I mean they're delicious, but really
not a popular topping.

Speaker 2 (05:32):
No, I totally believe that. But I've pulled us far
from our topic today. We're talking about entropy, and you know,
I think first we should find out what our audience
thinks entropy means.

Speaker 1 (05:43):
That's right, let's try to beat back the chaos and
stay on topic today, Kelly. Good luck. So I went
out there and I asked folks what they thought entropy was,
because I was curious what connections people had made in
their minds. Maybe they'll also connect it to jazz or
blues or pizza making or moth loving. Let's hear what
folks had to say. The tendency of a closed system

(06:04):
to kind of go from order to disorder unless you
add energy. That's the only way I
had my head around it.

Speaker 2 (06:13):
Entropy is a measure of the remaining free energy in
a specific

Speaker 1 (06:17):
volume. Homogeneous, as milk toast. A description of the state
of a system in terms of its energy. The more
different configurations the system can have and be in the
same state, the higher the entropy it has. Matter being
either organized or being in a chaotic state, the degree

(06:39):
of disorder in a physical system.

Speaker 2 (06:41):
The amount of energy in a system that can't

Speaker 1 (06:44):
be recovered. Describes the level of organization.

Speaker 2 (06:47):
As time progresses, entropy progresses.

Speaker 1 (06:49):
I think it's the tendency of energy to spread out
slow crawl towards simplicity that the universe imposes on everything. Disorder,
chaos, the absence of order. Things degrade over time. It
feels weird because that seems made up what disorder is
and what order is. Or if there are more configurations,
the entropy is higher number of micro states of the system,

(07:13):
which can result in the current macro state.

Speaker 2 (07:16):
Things want to move from a higher embodied energy state
to a lower one. Well, no pizza in those answers,
but actually a lot of variety in the answers there.

Speaker 1 (07:25):
Yeah, I think we really captured something here, that there's
a general sense that entropy goes up and that it
has something to do with order and disorder, but also
that there are multiple concepts of entropy. Right, this sense of
energy and entropy, and a sense of organization related to entropy.
And so I think that captures like the big chaotic
confusion that is most people's understanding of what entropy actually is.

Speaker 2 (07:48):
I do think this idea has escaped into like the
general consciousness, and maybe it has disconnected from its physics
definitions in that process.

Speaker 1 (07:56):
Yeah, I think you see a lot of tech bros
on social media using entropy as if they know what it means.

Speaker 2 (08:03):
I mean, it's nowhere near as bad as the word quantum.
But let's clear things up today, exactly.

Speaker 1 (08:08):
It's getting there, though.

Speaker 2 (08:09):
It's getting there, all right. So tell me about the
first time the word entropy was used.

Speaker 1 (08:14):
Yeah, entropy is a fun topic because it's not an
ancient topic. Right. People have been talking about motion and
velocity and energy and time since people have been like
smoking whatever and sitting in caves and looking up at
the night sky and wondering how the universe

Speaker 2 (08:28):
works. The things that bring us all together.

Speaker 1 (08:30):
Yeah, so you can go back and look at the
Sumerians thinking about the path of the planets and the
length of the year, and you know, the structure of
the Solar system and stuff like that. But entropy is
a recent concept. It's less than two hundred years old.
It's a word that was invented fairly recently as people
were puzzling over engines and wheels and energy.

Speaker 2 (08:52):
Oh so was the idea that, like, it's hard to
keep an engine working because entropy sort of over time makes
the system less reliable? Or tell me more about this history.

Speaker 1 (09:01):
Yes, so it's the early eighteen hundreds and there's the
Carnot cycle, and so this French physicist Carnot, actually a
father and son team, were thinking about engines and heat, and
there was sort of a mystery at the time, like
what is heat? Anyway, people had the sense that the
universe had a microscopic explanation that could help you understand
the macroscopic view, like you know, this is around the

(09:24):
time when we're about to get Dalton and thinking about
the existence of atoms. So that idea is out there
that like maybe this microscopic stuff, you know, and also
biologically right, like the germ theory is coming out around
this time, you know, within decades at least, so people
were wondering, like, what is heat? Is heat like a particle,
is it a substance? Does it flow from one thing

(09:44):
to the other. Remember early ideas of like electric charge,
where like there were two of them and they were
a liquid and they flowed. So this is sort of
like an idea that was out there. People were wondering,
how do engines work? What is energy? What is heat?
How are they all related? And in eighteen twenty four
Carnot put his finger on this idea that differences in
heat can be used to do useful stuff like if

(10:06):
you have something that's hot and something that's cold, energy
will flow from the hot thing to the cold thing,
and you can use that to do stuff sort of
like the way water flows from uphill to downhill, and
if you capture that flow, you put like a water
wheel there. You can use that to do stuff like
grind your wheat into flour. Right, very useful. But if

(10:26):
the water is all flat, you can't use it to
do anything. And so heat differences can be used to
produce work. That's Carnot's big insight in the early eighteen

Speaker 2 (10:34):
hundreds. Did we have the steam engine at this point

Speaker 1 (10:37):
yet? Yeah, steam engines existed at this point, and that's
a great example of how you use heat. Right, you
heat the steam, it rises, it turns your turbine. You
can use that to do work or these days to
make electricity. Right, and so Carnot understood that engines can
do this, engines can turn heat differences into work. But
he describes sort of a perfect engine one where you

(10:59):
can turn heat differences into work, and then you could
use that work to create heat differences, so back and
forth and back and forth and sort of perfectly without
any loss. But he also had this idea that, well,
sometimes you have imperfect engines, that something is lost, creeps
out of your cycle. This is an imperfect engine. Eventually
it'll wind down, you'll turn a heat difference into work,

(11:20):
and then you'll turn that work into a heat difference.
But you get a little bit less, sort of like
the way if you drop a ball. In principle, the
potential energy the ball turns into kinetic energy and then
it bounces back up and it regains all of that
potential energy. But in practice there's a little bit of
friction and there's losses in the system, and the ball
doesn't bounce forever, right, in the same way. He understood

(11:41):
that this happened, but he didn't describe it mathematically. It
wasn't for a few more decades that people sort of
described this with equations and sort of more mathematical concepts.

Speaker 2 (11:51):
Okay, but even today, we don't have a perfect engine, right,
all of our engines are imperfect.

Speaker 1 (11:55):
All of our engines are imperfect. Exactly. And it was
Clausius, who sounds like he would be an ancient
Greek guy. He does, I know, yeah, exactly. I imagine
him in a robe, even though he probably just wore
a suit and had a top hat. But Clausius defined
it mathematically, and he thought of it as the stuff
that's flowing, like entropy is leaving the system, it's moving

(12:17):
through it. It's like a physical thing. And he connected it
to temperature. And so his contribution was essentially to invent this
concept of entropy to help us understand why some engines
are imperfect. And he thought of it as a thing
which flows through the system, like a real physical thing,
not just like hey, here's a number, we're calculating it.
We define it like you could invent anything, right, you

(12:39):
could invent the jigiblions and define it as like the
number of apples in the universe minus the number of
ice cream cones. And that doesn't necessarily have to mean anything, right,
But sometimes you invent a number and it means something
in the universe. It like describes something which actually exists
or is important. And so he connected entropy not just
to heat, but also to temperature. Right. Temperature also something

(13:01):
people were trying to understand. And so we're going to
get into the mathematics of what that all means and
how it works a little later on. But Clausius's other
big contribution was the word. He created the word entropy.
Before this, we had heat and we had temperature. But
Clausius created the word entropy to think about energy flow.

Speaker 2 (13:21):
And so at this point he's thinking about the movements
of heat or the movement of energy. But like so
when I think of entropy, I usually think of like
stuff getting lost. Was the idea of stuff getting lost
or getting disordered part of this idea or is he
just tracking the flow of things, not disorder?

Speaker 1 (13:37):
Disorder comes later with Boltzmann. We'll get there in
a minute. But he's thinking about the energy flow and
where does it go, and he's using that to understand
why engines will wind down, right for sure, because entropy
is one of the reasons energy doesn't flow completely perfectly.
But I also love the story of how he came
up with the word. So entropy is like a word

(13:58):
he invented. He took the letters E N from energy
because it's related to energy, and then he took the
root tropē from the Greek word for change. So he's like, oh,
this is cool entropy. And you know, he was cognizant
of the fact that this is something he was inventing,
and it was kind of a recent idea. So that's
why he reached all the way back to Greek, because
he wanted to connect it with like ancient languages

(14:19):
and ancient thoughts, and he said, quote, I prefer going
to the ancient languages for the names of important scientific quantities,
so that they may mean the same thing in all
living tongues. I think he was hoping that, like, if
he uses a Greek root, then even like Romanians and
Bulgarians and English speakers and everybody is going to have
some understanding intuitively of what this concept means.

Speaker 2 (14:42):
Now I am imagining him saying that in a toga.

Speaker 1 (14:46):
He's carving it into marble.

Speaker 2 (14:48):
Right, and has history judged this to have been a
good decision.

Speaker 1 (14:52):
I don't know. I mean, listen to the listeners, Like,
entropy is very confusing. I don't think anybody has the
same idea of what entropy is. And there's a famous physicist from
a few decades ago, Leon Cooper, who won the Nobel
Prize for superconductivity, so like a dude who knows his stuff,
and he says that Clausius quote succeeded in coining
a word that means the same thing to everybody: nothing.

(15:19):
I don't know. If I invent something so pervasive that
people are griping about it, then like, hey, you know
I've done something. All publicity is good publicity, right?

Speaker 2 (15:27):
Never be sure about that, but okay.

Speaker 1 (15:30):
And so at this point we have sort of these
macroscopic handles on it. We're describing temperature and energy and
entropy as things we can measure about the stuff we experience, right,
Macroscopic, that means like stuff in our world, the
human scale, you know, like you can take a thermometer
and you can put it in your water. You can
measure the temperature. It's a number you can measure about like

(15:50):
large amounts of physical things. But people again wanted to
understand these things microscopically, like what happens down below if
you're understanding the particles? What does this all mean? And
on the show we do that a lot, and people
are often writing in to ask me, like, well, what's
really happening at the particle level during Hawking radiation or
when light bounces off a mirror or something like, what's
really going on? As if like the reductionist explanation reveals

(16:14):
something truer, And you know, we'll talk about that in
a minute, but I want people to understand that there
are many layers of the universe, of reality. None of
them are more true than the other. We have this
sense that like, as you go deeper, maybe you're approaching
some fundamental layer of explanation, but every layer is useful.
You know, the macroscopic view, the human level view of
the universe is just as valid and just as useful,

(16:36):
even if there is another layer underneath. Because the amazing
thing is that you can write equations that work to
describe the macroscopic without understanding the microscopic. The universe gives
us that access to many different layers of reality.

Speaker 2 (16:49):
So if every layer is useful, what I'm hearing you
say is that it's okay that I skip chemistry. I
can focus on the other useful layers. They tell me
just as much.

Speaker 1 (16:59):
I'm saying, and unfortunately you can quote me on this,
chemistry has its uses. There are places where there are
problems you can't solve using biology or using physics.
You need that intermediate step where you're thinking
about the stoichiometry and whatever. So yes, chemists
out there, I appreciate you. It's not that chemistry is terrible,
it's just that I can't do it.

Speaker 2 (17:18):
Same same Yep, No, I appreciate you too, all right.

Speaker 1 (17:22):
So then late eighteen hundreds we have Boltzmann. He's one
of the guys that founded statistical mechanics and thinking about
things in terms of the particles. You really want to understand,
like what is temperature in terms of the particles, Like
if I have something hot and something cold, what's going
on microscopically? And he made this huge contribution connecting temperature

(17:43):
to microscopic motion and specifically defining entropy in terms of
what's going on microscopically. This is a huge leap forward
and really one of the only places in physics or
maybe even in science where we have a mathematical bridge
between two different layers of reality, where you can take
the microscopic understanding of like particles whizzing around and use

(18:04):
it to derive the macroscopic rules, right? Different levels of reality
is like when you have different kinds of laws, like
particles have different behavior than liquid flowing, but we know
liquids are made of particles, and so liquids are this
thing that emerge from particles, and usually we don't know
how to derive it. We can like find the laws
for liquids and find the laws for particles, but we
don't know how to connect them, right, Like you can't

(18:26):
derive fluid mechanics from the Standard Model. But here he
developed an understanding of what's going on for particles and
he built the mathematical bridge, like you can derive the
ideal gas law from Boltzmann's description of what's happening with
these particles. It's amazing. It's like the only place I've
ever seen this kind of connection where like, not only
do you have a reductionist ability to see the lower level.

(18:49):
But also there's like a mathematical bridge that shows you
why it works. It's kind of incredible.

Speaker 2 (18:54):
Why is that so rare?

Speaker 1 (18:56):
Because the universe is complicated, you know, like to go
from microscopic to macroscopic, you have to describe a
lot of stuff, and usually there are two hurdles: chaos and approximations.
Chaos because like, sometimes the tiny little details matter, you know,
like butterfly flaps its wings in China, hurricane goes a
different direction, like, so you can't ignore those little details,

(19:16):
which means you have to keep track of lots and
lots and lots of little details, like remember how many
atoms there are in a drop of water, right, like
Avogadro's number is a big, big number. Calculating all those
details is essentially impossible, so you end up making approximations.
And sometimes those approximations work, like Boltzmann's big contribution was
finding ways to calculate these averages that work mostly, but

(19:40):
sometimes they don't, and maybe we just don't have the
right kind of math. So in principle it should be possible,
but the approximations we make along the way and the
sensitivity to the little details make it really, really hard.
That's why we need chemistry.

Speaker 2 (19:53):
All right, Well, on that note, let's take a break
so that we can sort of absorb the fact that
we need chemistry and come to terms with that, and when
we get back we'll talk about the different actual definitions
of entropy. All Right, we're back. We've all come to

(20:23):
terms with the fact that we need chemistry in our life.
And I have a confession to make. I actually minored
in chemistry and my birthday is October twenty third, which
is ten twenty three, Avogadro's number, ten to the twenty third.
So when I was in college, all of my parties
were chemistry themed parties. We would play like periodic table games.
I know, this is confession time.

Speaker 1 (20:43):
Springing this on me now? We've been working together for
so long.

Speaker 2 (20:46):
I felt like I had to tell you I can't
live this lie anymore, Daniel.

Speaker 1 (20:49):
Was this burning a hole in your psyche all
these times?

Speaker 2 (20:54):
But I cannot tell you how large my bill was
at the glassware depot because I broke so much
glass in my chemistry journey. I decided there's no way
I can make a living out of this. I can
maybe just become bankrupt. But anyway, Okay, let's get back
to entropy. We've talked about how you can use heat
to do work and how that heat sort of moves around.

(21:15):
Tell me about the relationship between those phenomena and entropy.

Speaker 1 (21:19):
Yeah, So what we end up with is a description
of entropy from several different perspectives. We have like Carnot
and Clausius. Their description of entropy is like something at the
same level as temperature and energy, a macroscopic quantity, right,
a thermodynamic quantity that relates to like temperature and energy flow.
And then we have Boltzmann. He describes entropy statistically in

(21:42):
terms of like the little particles and how you average
those up and how that emerges from those tiny details.
Later on, a guy named Shannon creates an idea of
information entropy, which is probably what people were talking about
when they connect jazz to entropy. So we have three
different definitions of entropy, and there are actually more. There's like
five different definitions of entropy and they're all related. And

(22:05):
we'll talk about the statistical and the thermodynamic definitions of
entropy today, but they're connected. They're not the same thing,
but they are similar and you might think, like, what
are you talking about. How can you have different definitions
of the same thing, Right, Well, we already have that,
Like for temperature, we have a statistical view of temperature,
and we have a thermodynamic view of temperature. These things

(22:27):
are a little bit fuzzy, and just like we have
different levels of understanding of the universe, some of which
are useful sometimes and not others. Like particle physics great
when you're at the LHC, not so useful when you're
pouring liquid into beakers. Right, they're useful in some cases
and not useful in others. Because we don't have a
complete understanding of the universe. We have these approximate, limited

(22:48):
views into the universe, and you've got to pick which
toolkit you use. So that's why we end up with
several different definitions of entropy. But you know they are connected,
and today we're going to show you some of those connections.

Speaker 2 (23:00):
As a biologist, I'm thinking about the definition of species, of the
word species, right, same sort of situation. We'll have a
whole episode on that at some point.

Speaker 1 (23:07):
Please. That sounds fascinating.

Speaker 2 (23:10):
Yeah, we could drink and discuss this for like weeks
on end if we have enough biologists together, and
somehow we'll start talking about poop.

Speaker 1 (23:15):
But anyway, also, and then I get to have fake
outrage at how ridiculous you guys are.

Speaker 2 (23:22):
That's fair, that's fair, a taste of my own medicine. All right,
let's start with the statistical definition.

Speaker 1 (23:27):
Okay, so the statistical definition is a good entry point because
I think it connects to a lot of people's intuitive
description of entropy as related to chaos or disorder or something.
And you often hear people say entropy is a measure
of disorder in the universe. But that's missing a lot
of really important nuance that I really want people to
grab hold of. Entropy does have to do with order,

(23:50):
but specifically, it's a relative quantity. It's not an absolute
thing where you're like, you measure the disorder and you
get a number. It has to do with how much
information you have about two relative levels of the universe,
and it requires you to define those two levels. So
we have like a macroscopic view, things you can observe
like temperature or energy or density or something that you

(24:12):
can measure sort of at the human level, and then
micro states, things that you can't observe, arrangements of the
particles or something unobservable that would give you that same
macro state. So you have to pick these two levels, right,
macro and micro in order to even define entropy. Entropy
has to do with how many different micro states you

(24:32):
can have that are consistent with the same macro state
that you measure.

Speaker 2 (24:36):
I would love an example.

Speaker 1 (24:38):
All right, So let's do an example. Let's say, for example,
you have ten coins and you flip them. They're either
heads or tails. Okay, and let's say that macroscopically, because
you have limited information about the universe, all you can
know is how many heads there are. You can't tell
which coin is heads and which coin is tails. You
can just know how many heads there are. So maybe
it's five, maybe it's ten. Macroscopically, you always have limited information,

(25:01):
like when you measure the temperature of your coffee. You're
not measuring the speed of every individual particle. You have
some big overall average quantity, right, So that's your macroscopic information,
and then we'll define the microscopic as like, actually, which
coins are heads? Right? So you know microscopically, like maybe
it's the first five are heads and the second five are
tails or whatever. Okay, so we've defined a macro state

(25:25):
and a micro state. And entropy is a measure of
how many micro states you can have for a given
macro state. So say, for example, my macro state, which
is just how many heads there are? I flip all
the coins and I tell you there are zero heads. Well,
how many micro states are there that can give you
zero heads? How many arrangements of those coins can give

(25:45):
you zero heads? One, they're all tails. Yeah, exactly. So that's
a very small number of micro states. What if I
do it again, and this time I can tell you, well,
the macro state is that one of the coins is heads.
How many micro states are there that can give you one
coin having heads? Ten, exactly. Ten choose one, for the

(26:06):
mathematicians out there. Now, if I say, okay, we do
it again, and this time we got five heads, how
many micro states are there?

Speaker 2 (26:13):
I'm gonna give you the middle finger because I can't
calculate that on air.

Speaker 1 (26:18):
It's a big number, right, It's like ten times nine
times eight whatever. It's a big number. So the point
is each macro state has a different number of micro states.
Some of them have only one arrangement of the coins
that will give you the same macro state, that's low entropy.
If you have few micro states that are consistent with
that macro state, it's low entropy. If you have a

(26:38):
lot of micro states that are consistent with your macro state,
like if your macro state is five heads, then there's
lots of different arrangements of that, that's high entropy. Okay,
So the key here it's not just disorder like how
scrambled are the heads and tails. It's relative lack of
knowledge between the micro state and the macro state.
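
To make the counting concrete, here is a minimal Python sketch (ours, not from the episode) that tallies the micro states for the ten-coin example and takes the log, which is how the statistical entropy is defined (as Daniel notes below, entropy is mathematically the log of the number of micro states):

```python
from math import comb, log

# Macro state: how many of the ten coins are heads.
# Micro state: which specific coins are heads.
n = 10
for heads in (0, 1, 5):
    omega = comb(n, heads)  # micro states consistent with this macro state
    print(heads, omega, round(log(omega), 2))

# 0 heads ->   1 micro state (all tails), entropy 0.0: the lowest
# 1 head  ->  10 micro states ("ten choose one"), entropy ~2.3
# 5 heads -> 252 micro states ("ten choose five"), entropy ~5.53
```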

Speaker 2 (26:56):
Okay, I get that.

Speaker 1 (26:58):
If you hold onto that in your head, it actually
makes it very easy to understand why entropy tends to
increase in the universe. You just need one more piece
of information. If you assume that all the micro states
are equally likely, like any particular arrangement is equally likely,
And that's true in this example of the coins, because
like you know, every coin toss is independent. You're just

(27:18):
as likely to get all heads as all tails or
any other particular arrangement like heads tails, heads tails heads tails,
fully specifying them exactly, every micro state is equally likely.
What does that mean? Well, if you just flip all
the coins, you're more likely to get a macro state
that has high entropy, because the macro states that have

(27:39):
low entropy by definition are the ones with few micro states,
Like it's hard to get all heads or hard to
get all tails. There's lots and lots of ways to
get five heads and five tails. So if you keep
flipping coins right, then on average, you're going to get
higher entropy than lower entropy. And so the universe does
this not with coins but with quantum states. If each

(28:03):
quantum state of the universe is equally likely, then the
universe tends towards higher entropy because as you keep flipping coins,
you're more likely to get microscopic configurations that give you
higher entropy, just because there are more of them.
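
A quick simulation (again just a sketch, not from the episode) shows the same tendency: flip the ten coins over and over, and the high-entropy macro states near five heads come up far more often than the low-entropy ones, simply because more micro states produce them:

```python
import random
from collections import Counter

# Flip ten fair coins 100,000 times and tally the macro state (number of heads).
random.seed(0)
tally = Counter(sum(random.randint(0, 1) for _ in range(10))
                for _ in range(100_000))
for heads in range(11):
    print(heads, tally[heads])

# Roughly 24,600 trials land on five heads; only about 100 land on zero heads,
# matching the micro state counts (252/1024 vs 1/1024 of all arrangements).
```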

Speaker 2 (28:17):
So so far, I haven't heard anything that would suggest
that that makes the universe more disordered or anything like that.
Is that right?

Speaker 1 (28:25):
So here we get a little slippery because we have
a nice crisp mathematical definition of micro states and macro
states and numbers, and entropy is mathematically the log of
the number of micro states. So what do we mean
by disorder? Disorder is like one of these intuitive words
that we use that we don't have a crisp definition
of but we can try to connect it, you know.
So for example, if I told you all the coins

(28:48):
are the same, you'd be like, oh, that's nice and ordered.
If I told you, oh, it's a scrambled heads tails tails
heads whatever, that would seem more disordered, right. And so
in that sense it connects with that intuitive definition. But
I think there are other examples that are maybe more intuitive.
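
Written as a formula, this is Boltzmann's entropy, where Ω is the number of micro states consistent with the macro state and k_B, Boltzmann's constant, just sets the units:

```latex
S = k_B \ln \Omega
```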

Speaker 2 (29:02):
When I hear disorder, my connotation of disorder is that
it's something bad is happening, or like we're moving towards
a state of more badness. But what we're really saying
is that as entropy increases at like each individual point
of interest, it's just harder to predict what's happening at

(29:23):
each of those spots.

Speaker 1 (29:25):
Yeah, I don't think you need to connect disorder with badness,
you know, to think that maybe the universe is just
getting jazzier as time goes on.

Speaker 2 (29:33):
That sounds good, That sounds good depending on my mood.

Speaker 1 (29:35):
You know, we're getting rid of the melody, we're kicking
out ancient ideas about keys and whatever, and we're just
wandering up and down the scale without a plan. The
universe is just getting jazzier.

Speaker 2 (29:46):
Is it less planned or is it just you don't
have as good a handle at the microscale of what's
going on, Like, does that necessarily result in something less planned?
I guess you would say that zero heads is a
more planny state to be in than five heads. Is

(30:07):
that what we're saying?

Speaker 1 (30:08):
I have a little bit of trouble with the intuitive
concept of disorder again, because it's not very well defined.
I think it's maybe easier to think about it in
terms of where stuff is physically rather than heads and tails.
So let's take another example that's maybe more intuitive. So
let's say, for example, you have one hundred particles in
a box, and instead of just knowing like the average

(30:31):
energy of those particles, your macroscopic measurement can tell you
the energy distribution, So you can tell the difference between
like all the energy is in one particle or the energy
is shared, okay, And so if all the energy is
in just one of those hundred particles, one of them
is like going crazy whizzing around. The other ones are
just sitting there. How many microstates are there that are
consistent with that? Well, one hundred, because there's one hundred

(30:53):
particles and you can't tell which particle is which, but
there's one hundred ways you could give all the energy
to just one particle. On the other hand, if
you share the energy, right, if you're in a macro
state where the energy is smoothly shared between them, now
you have a lot of different particles that have energy,
so there's like one hundred times ninety nine times ninety
eight whatever. There's lots of different ways to arrange those

(31:13):
particles so that they share the energy. So you know
that seems more disordered because now you have more particles
whizzing around rather than just one particle, and you
can make sort of similar arguments about physical locations of particles.
If your macro state is to measure the distribution of
the particles, right, then having all the particles in one

(31:33):
corner of the box gives you very few ways to
arrange the particles, whereas having the particles all the way
through the box, there's lots of different ways to arrange
those particles. And so the reason that happens is that
those have higher entropy, or another way to say that
is that there are just more ways for that to
happen in the universe. So if all the micro states
are equally likely, the ones with more entropy are more likely.

(31:56):
And in that sense, entropy is connected to disorder because
it tends to share energy, spread energy out and also
spread particles out, so it tends to make things smooth
and even rather than like clumped and tight together.
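
A rough count for this hundred-particle example, under the toy assumption (ours, just for the sake of counting) that the energy comes in one hundred indivisible quanta:

```python
from math import comb

N = 100  # particles in the box
q = 100  # energy quanta to hand out (a toy assumption, just for counting)

# Macro state A: all the energy in a single particle. You can't tell which
# particle it is, so there are exactly N micro states.
all_in_one = N

# Macro state B: the energy split among the particles in any fashion.
# Stars-and-bars counts the ways to split q identical quanta among N
# particles; nearly all of those ways spread the energy out.
spread_out = comb(q + N - 1, N - 1)

print(all_in_one)   # 100
print(spread_out)   # about 4.5e58: overwhelmingly more micro states
```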

Speaker 2 (32:09):
All right, now, I'm with you. That definition landed better
for me, I think. Or that example.

Speaker 1 (32:14):
All right, cool, yeah, but it's important to understand that
the statistical definition of entropy really requires you to define
these two levels, and so there is no absolute sense
of entropy. Like you and I could look at the
same system and have different numbers for entropy. If you
have a different macro state, if you can observe more
fine grain details than I can, we have different macro states.
If we're thinking about different micro states, we have different entropies.

(32:38):
Entropy is a relative thing like velocity, right, So it's
not some fundamental thing in the universe from a statistical
point of view. It's this relative thing. But it's also
connected to energy and temperature and this thermodynamic sense of
entropy that Clausius invented.

Speaker 2 (32:54):
So could we go through this example again, but think
about how Boltzmann identified the macro and the micro states.

Speaker 1 (33:03):
Yeah, absolutely, let's do that, and then let's think about
how that gives us a handle on energy. And it's
going to take us to understanding why energy flows and
the macroscopic sense of energy. And so let's back up
a thing about bolt men. Before we talk about entry,
Let's just talk about temperature, because temperature is this other
thing where we have like a definition of it microscopically
and macroscopically. Temperature macroscopically is like, well, you put a

(33:26):
thermometer in something, right, or you touch something you can
feel it's hot or it's cold. But we also amazingly
have this microscopic sense of temperature. Microscopically, we think about
just particles in their velocities. Like it gets more complicated,
you're talking about solids and liquids and whatever in different
like vibrational states. But just imagine a box of particles
and it's a simple gas and the particles are whizzing around.

(33:46):
What happens when something is hot is those particles have
higher speeds, and when something is cold, those particles have
lower speeds. And I think a lot of people already
have a sense of this, But what maybe you don't
appreciate is that this really is a mapping between the
microscopic and the macroscopic. You're like, hot equals high speed particles,
cold equals low speed particles. That's amazing, it's incredible that

(34:08):
we have this connection. Right, And those are two different
definitions of temperature: one, a statistical microscopic sense of like
particles moving, the other macroscopic thermodynamic definition of temperature where
you're like, it's hot, it's cold, right, I feel hot
and therefore heat is flowing from me. Right, This is incredible,

(34:28):
And this is what Boltzmann did. He connected these two
senses of temperature microscopically and macroscopically.

Speaker 2 (34:35):
So just to nail the point home, high temperature is
more entropy, is that right, because they're moving around and
more disordered.

Speaker 1 (34:41):
Oh, no, great question. And entropy has a slightly more
subtle connection to temperature. When energy flows to erase a
temperature difference, like when something goes from hot to cold,
entropy is the stuff that's being transported. And sort of
the same way that like if you imagine an electric

(35:02):
circuit and you have a voltage difference, what happens when
you have a voltage difference, you have flow of current, right,
you have charges flowing from one to the other to
balance that out. Charge is the stuff that's transported. When energy
flows to erase a temperature difference, you can think of
entropy as like the charge of that system. It's the
stuff that's being transported. And so there's this connection between

(35:24):
energy and heat and temperature and entropy. That's a little
bit subtle, but I think it's important to understand. So
we're familiar with the idea that energy flows, right, things
flow from hot to cold. Why does that happen? Right?
Why do things flow from hot to cold? The answer
is that when things flow from hot to cold, the
number of microstates tends to increase. Right, Just like the

(35:47):
example we talked about a minute ago with the particles.
If you have one really really hot particle and ninety
nine cold ones, that has less entropy than sharing the
energy among the particles. There's more ways to arrange it
if you give it to all the particles
than if you just give it to one. There are more micro states.
So what happens when you have a temperature difference is
an opportunity to increase the entropy. So energy moves to

(36:10):
maximize the micro states, not because like, oh, the universe
likes energy to be spread out or something like that.
It's because all the micro states are equally likely and
energy flows in a way that increases the number of
micro states, right. Maximizing entropy is what causes energy to flow,
or another way to say it is like energy flowing

(36:31):
is increasing the entropy, right, and energy stops flowing when
it no longer will increase the microstates. If you have
like two systems next to each other, A and B,
energy will flow from one to the other if it
increases the number of microstates. Right. So, now as energy
is flowing from A to B, A is losing energy,

(36:51):
It's going to have fewer micro states. You're losing microstates,
but B is gaining them. And this will happen as
long as the gain in B is greater than the loss in A.
So as soon as that equalizes, as soon as moving
energy from one system to another will not increase the
total number of micro states, energy stops flowing, and that's

(37:11):
what we define to be temperature. Temperature is this relationship
between energy and entropy in a material. If the temperatures
are equal, there's no gain from energy flowing. It doesn't
increase the entropy. So that's the definition of temperature thermodynamically.
If the temperatures are equal, no energy flow. And mathematically

(37:31):
we define temperature to be this ratio of a change
in energy to a change in entropy. Chemists out there
probably know it as dE/dS. So you don't have to have
an equal energy between systems. What you need is equal temperature,
which means that any energy that moves will make an
equal change in the entropy. And so when the temperature

(37:52):
is equal between the two objects, no heat will flow
because dE/dS is the same. That's the definition of temperature thermodynamically. Right,
we have the microscopic definition of temperature. It's like particles
whizzing around. It's their speed. Now we have this weird
thermodynamic definition of temperature as the ratio of energy to entropy.
It turns out you can derive one from the other. Right,

(38:13):
you can start with the mathematics of kinetic energy of
particles and derive this definition. That's what Boltzmann did. It's
an incredible mathematical bridge of temperature from one to the other
and also helps us understand entropy from one to the other.
And so that's sort of the thermodynamic sense of entropy,
and I think it's amazing because it tells us why

(38:35):
energy flows. Energy flows to increase the number of microstates,
and it will stop flowing when the number of microstates
will not be increased.
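
In symbols, this is the "dE/dS" mentioned above: temperature is the ratio of a change in energy to a change in entropy, and energy stops flowing between two systems A and B exactly when those ratios match, which is what equal temperature means:

```latex
T = \frac{dE}{dS},
\qquad
\frac{dS_A}{dE_A} = \frac{dS_B}{dE_B}
\;\Longleftrightarrow\;
T_A = T_B \quad \text{(no net heat flow)}
```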

Speaker 2 (38:43):
Okay, I exactly kept up with that explanation. So my
brain has no questions yet. So let's take a break
and when we get back, we'll see what Kelly's brain
has to offer. And we're back. So now we have

(39:14):
two different definitions of entropy. Let's talk about some applications
of this knowledge.

Speaker 1 (39:21):
Yeah, entropy has a lot of really deep consequences, and
it touches so many topics in physics, from time to life,
to black holes, to the Big Bang to the future
of the universe. It's really incredibly pervasive. One reason is
that it's so simple. It just tells us about how
systems evolve. They evolve from less likely to more likely,
and what does that mean, what are the consequences? And one of

(39:44):
the deepest connections with entropy is what we heard the
listeners say that somehow entropy is responsible for why time
moves forwards. And I remember hearing this for the first
time thinking, what, that's crazy, that would explain such a
deep mystery, right, Like, we have these three directions of
space and one of time, and we know space and
time are related and connected by relativity, but time is different.

(40:07):
You can always revisit a location in space, but you
can't revisit a location in time. And you can move
positive and negative in the X axis, but only positive
in the time axis. And why forwards and not backwards?
And like what does this all mean? So there's some
deep mysteries here about what time is.

Speaker 2 (40:23):
Is the idea here that if entropy is increasing, you're never
going to get all the particles back in that same configuration,
and that's why you can't go back in time. Oh no,
you're laughing.

Speaker 1 (40:33):
I love that. I think that is sort of a
summary of how physicists put it. The argument is a
little bit more elaborate. It's that when we look at
the laws of physics, so many of them are reversible.
They don't seem to prefer one direction. You know, for example,
in a vacuum, if you bounce a ball, it hits
the floor, comes back up, It'll come back up to

(40:53):
the same height, and so that path is reversible. Right.
If you play a video of that happening in a
vacuum with no energy loss, then you can't
tell if the video is being played backwards or forwards.
The same laws of physics apply and describe it perfectly,
same with particles, almost completely. And so people were wondering
for a long time, like, well, if the laws of
physics don't care if time goes forwards or backwards, they

(41:14):
work both ways, why does time go forwards? And so
entropy seems to be one of the places where physics
has a preference for one direction or the other, right,
like it likes the micro states to increase, so energy
increases with time. And then there's this leap people make.
They say, oh, energy and time increase together, therefore that's

(41:36):
why time moves forwards. And I'm not sure I follow
that leap of logic, Like I will accept that entropy
and time are connected. Entropy increases as time goes up.
But that same law could tell you, like, well, the
universe could still run backwards. It just wouldn't be symmetric.
It would just mean that if the universe ran backwards,
entropy would decrease. That law predicts that also, it predicts

(41:57):
that if time ran backwards, entropy will decrease. Why don't we
live in that universe? I don't know. It's consistent with
the second law of physics, right, as long as time
runs backwards. So I don't believe that the second law of
physics tells you why time runs forwards. It connects time
and entropy, but it doesn't tell you why time goes
forward or backwards. There could be folks out there living

(42:18):
in another universe where time runs backwards and entropy is decreasing,
and they're claiming that entropy is the reason time runs backwards.
Of course they don't call it backwards. There's definitely some
interesting hints there, but I don't believe that it conclusively
shows us why time goes forwards.

Speaker 2 (42:33):
So maybe I've missed this, but I think that's the
first time we've used the phrase second law.

Speaker 1 (42:37):
The second law is just a statement that entropy increases
as time goes on. Okay, delta S is greater than zero, right,
S is the symbol for entropy, and so it just
says the entropy increases as time marches on.
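
That is, for the whole closed system, with S for entropy:

```latex
\Delta S \geq 0
```

(Usually written with "greater than or equal," since entropy can also hold steady once it has maxed out.)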

Speaker 2 (42:48):
All right, So if I can hijack the conversation for
a second, oh please do so. You know, as someone
who spends a lot of time with evolutionary biologists, I've
been to a couple creationism versus evolution debates, and they get spicy.

Speaker 1 (43:04):
That really bends the meaning of the word debate.

Speaker 2 (43:06):
Also, I felt like I learned a lot by thinking
through the arguments. But anyway, so a common point that
is brought up during these discussions is that the second
law says that things should be getting more disordered with time.
So how do you have evolution creating more complex organisms
over time? And what is the answer that you would

(43:28):
give them? The answer I heard, if this is helpful,
is it has something to do with the second law
being about if you're in a closed system. But the
Earth isn't a closed system because energy is coming in
from the sun, and so if you're not in a
closed system, none of that holds, Is that right?

Speaker 1 (43:43):
Yeah, I'm thinking about it for a minute first, because
I'm wondering what is meant here by complexity, you know,
and whether it even is connected to entropy, statistically,
order, disorder, micro states, macro states, or if it's just
this sort of like intuitive, this seems similar to that,
So let's use one word in place of the other.

(44:03):
I'm not even really sure that complexity is connected to entropy
at all. But it's definitely true that life and entropy
have a close connection because living things tend to decrease
their local entropy. My body is a system that decreases
its entropy. You might wonder, like, well, if entropy is
supposed to always increase, how does that happen? Well, as
you say, I'm not isolated, right, I have a huge

(44:26):
environment around me, and so I exchange energy with the
environment and do all sorts of complicated stuff to locally
make my entropy go down. Overall, I'm increasing the entropy
of the environment I'm interacting with more than my entropy
is decreasing, So overall second law is fine. The point
of the second law is you can't pick out
one part of a system and say the entropy always

(44:46):
has to increase for every sub part of the system.
It's just the whole system where entropy has to increase.
The same way, like you can't apply conservation of energy
to just half of a system and say, oh, the
energy is flowing out of this and so energy is
not conserved. Like energy is conserved for the whole system.
Entropy increases for the whole system. So actually, this is
one way that some physicists think we should define life.

(45:09):
It's like systems that decrease their local entropy at the
expense of the environment. Because you know, biologists spend hours arguing, like,
what is life anyway? Is it a system that can reproduce?
Is it something that passes on genetic information? Whatever, there are
all these definitions of life. And this is sort of
a physics based definition of life. It's something that decreases
its local entropy. It doesn't violate the second law of thermodynamics

(45:32):
to decrease your local entropy at all. And I don't
know how to think about evolution in terms of complexity.
Like I guess evolution has produced systems that tend to
decrease their local entropy more and more as time goes on.
But to me, that's not a violation of the second
law of thermodynamics at all, or really says anything about
micro versus macro states.

Speaker 2 (45:53):
Interesting. That's not an answer I heard at any of
the debates that I attended. I was raised, we don't
need too much of Kelly history here, and I was
raised Catholic, and I know Catholicism
is okay with evolution, but I was raised in a
family where the young Earth hypothesis, that Earth is only
six thousand years old, was held pretty tightly, and so
I went to a lot of these debates to try

(46:13):
to figure out how I felt about it on my own,
and the answer I always heard from the evolutionary biologists
was well, we're not a closed system or whatever. So anyway,
that was really interesting. Would a virus be alive
according to the physicist definition of life?

Speaker 1 (46:27):
That's a good question, I think it would be. I
once had a conversation with Sara Imari Walker, and she
wrote a fascinating book last year about this whole question.
So I should ask her that, but I think so.
But let me ask you, what was it that convinced
you that the Earth is not young, that it's billions
of years old and not thousands of years old, assuming,
of course, that you got there.

Speaker 2 (46:48):
I did. It was, I mean, I took enough classes
where I learned about the various ways we date rocks
and about the fossil record and how complete it was,
and just sort of engaged more with what we actually
know in the science, and it became pretty clear to
me that the data was pointing to an Earth much
much much much, much, much much older than six thousand

(47:10):
years old.

Speaker 1 (47:11):
Yeah. Wow, fascinating. So thinking about deep time and the
history of the universe and the future of the universe.
Entropy is also connected to these ideas of like the
Big Bang and the future of the universe and black holes.
And there's a lot of confusion out there about what
entropy tells us about these things. I think partially because
people are thinking about entropy from a temperature point of

(47:31):
view or a gravitational point of view, which are actually
a little bit different, and people are thinking about no
entropy meaning no temperature. So I thought'd be useful to
go through these a little bit and help untangle some
of the confusion.

Speaker 2 (47:43):
Go for it.

Speaker 1 (47:43):
So let's start the very beginning with the Big Bang. Right.
If the universe is increasing in entropy all the time,
then as you go back in time, the universe is
decreasing in entropy, and that means that entropy gets lower
and lower and lower, which means, you know, if you
go back to the very first moments that we can
think about what we call the Big Bang, when the
universe was very hot and very dense, then that must

(48:05):
have been very low entropy, right, because entropy is increasing,
so entropy must have been very, very low. But it's
hard to get your head around. Like I'm imagining a hot,
dense gas and it's pretty smooth, right, it's not very clumpy.
That doesn't seem to me like a very low entropy situation.
In fact, it seems like it's disorganized and everything's flying around.
How is that low entropy? This is confusing to people,

(48:27):
But the key thing to understand is the dominant force
there is gravity. So instead of thinking about entropy from
a point of view of like the temperature of the particles,
think about the arrangements of the particles and what's a
more likely arrangement.

Speaker 2 (48:39):
So gravity is pulling everything to one spot, whereas without
that it would have been all over the place and
much more disordered and spread out. Is that the idea?
I'm going to stop trying to finish your sentences because
it reveals how little I'm understanding. But I think I'm
getting this.

Speaker 1 (48:55):
From a gravitational point of view, being really spread out is
very low entropy, and being clumped together is higher entropy. Right,
Because gravity is not the same as heat, gravity tends
to clump things together instead of spread things out, and
so from a gravity point of view, being very spread
out is rare. Like if you have a bunch of
matter and you let it sit there, like, it's very

(49:17):
rare for it to be perfectly spread out. For
that to happen, everything would have to be in perfect balance,
like a universe that's completely smooth, where there's no perturbations.
That's what would be required for gravity to not be able
to clump things together. So gravity likes to clump things together.
Clumping things together increases their entropy, from a gravitational point

(49:38):
of view. Remember we said entropy is relative. It's not
like there's a certain number for the universe or certain
number even for a system. It depends on the arrangements
and what you'd define as the macro and the micro states.
And so from a gravitational point of view, an initial
state where everything is very spread out is quite unlikely.
And actually one of the deep mysteries of the universe
is why did the universe begin in such a low

(50:00):
entropy state. If we're going from a low entropy to
high entropy and it turns out there's actually going to
be a maximum entropy of the universe, then the sort
of the time it takes to get to the maximum entropy,
which we call the heat death, which we'll talk about
in a minute, defines sort of the interesting period
of the universe. Once we get to the heat death,
the universe isn't really very interesting anymore. So how low
the entropy is when we begin sort of defines how

(50:22):
long we can do interesting stuff, right? And we're sort
of lucky the universe began with very, very low entropy.
Like if it started with very, very high entropy, not
much would happen. It would just sort of continue that way.
And so one of the mysteries of entropy actually is
like why it started with such low entropy. And as
gravity continues to do its work, it makes black holes, right,

(50:44):
it can clump these things together, and black holes actually
have the maximum entropy. Like, there's no way to arrange
a mass with higher entropy than a black hole. It's
like the maximum entropy arrangement of a system. And the
fact that black holes have an entropy is really fascinating,
and it was one of the ways that Hawking and
his collaborators figured out that black holes must glow a

(51:05):
little bit, because having entropy means you can define temperature
for black holes, and if you can define temperature for
black holes, then you can think about them glowing like
everything else in the universe that has temperature. So you
can derive Hawking radiation thermodynamically, saying like, well, if it
has entropy, it's got to have some temperature and then
it should glow in the universe. And you can think

(51:26):
about the temperature of a black hole. And actually, really,
really massive black holes have lower temperature, which is why
they glow less than small black holes, which have higher
temperature and so glow brighter. And a lot of people
think that Hawking derived his idea of Hawking radiation from
like thinking about the little particles near the edge of
the black hole. But that's not true. Actually, it's not

(51:46):
where the derivation comes from. It comes from thermodynamics, because
we don't understand the gravity for little particles, like there
is no way to think through that little example microscopically
of what happens to the particles. We only have a
macroscopic understanding of Hawking radiation because, as we've said many
times on the show, we don't understand quantum gravity.
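To make those relationships concrete, here's a minimal Python sketch (our illustration, not anything from the episode) that plugs masses into the standard Hawking temperature and Bekenstein-Hawking entropy formulas:

```python
# Minimal sketch (illustration only, not from the episode) of the two
# relations described above: Hawking temperature falls as a black hole's
# mass grows, while its Bekenstein-Hawking entropy grows with horizon area.
import math

G     = 6.674e-11   # gravitational constant (m^3 kg^-1 s^-2)
c     = 2.998e8     # speed of light (m/s)
hbar  = 1.055e-34   # reduced Planck constant (J s)
k_B   = 1.381e-23   # Boltzmann constant (J/K)
M_SUN = 1.989e30    # solar mass (kg)

def hawking_temperature(mass_kg: float) -> float:
    """Hawking temperature in kelvin: T = hbar c^3 / (8 pi G M k_B)."""
    return hbar * c**3 / (8 * math.pi * G * mass_kg * k_B)

def bekenstein_hawking_entropy(mass_kg: float) -> float:
    """Horizon entropy in J/K: S = k_B c^3 A / (4 hbar G)."""
    r_s = 2 * G * mass_kg / c**2      # Schwarzschild radius
    area = 4 * math.pi * r_s**2      # horizon surface area
    return k_B * c**3 * area / (4 * hbar * G)

for solar_masses in (1.0, 10.0, 1e6):
    m = solar_masses * M_SUN
    print(f"{solar_masses:>10.0f} M_sun: "
          f"T = {hawking_temperature(m):.2e} K, "
          f"S = {bekenstein_hawking_entropy(m):.2e} J/K")
```

Running it, a solar-mass black hole comes out around 6e-8 kelvin, far colder than the cosmic microwave background, and a million-solar-mass black hole is a million times colder still, with a trillion times the entropy.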

Speaker 2 (52:05):
I'm trying to decide if I can rescue my comparison
about my house being a mess with entropy. Could I
say my house is like a black hole because that's
where the maximum entropy is, or do I really need to
let this comparison go? I think this is where disordered
has a negative connotation to me, and I think that
maybe that's been holding me back this whole time.

Speaker 1 (52:24):
I'm not going to comment because I feel like that's
going to put me in the middle of your marriage.

Speaker 2 (52:27):
Okay, let's move on then.

Speaker 1 (52:28):
And I want Zach to like me, so all right.

Speaker 2 (52:30):
Let's move on to heat death.

Speaker 1 (52:31):
Yeah, so what's going to happen at the end of
the universe? Well, you know, gravity clumps things together into
black holes eventually, but those black holes also glow, right,
and so you get the universe increasing its entropy, you
get black holes, and those black holes glow out photons.
And so the final end point of the universe is
those black holes evaporate into photons and the universe just

(52:53):
gets filled with this Hawking radiation, sometimes photons, sometimes other particles,
and that's the state of maximum entropy. So how do
we understand the entropy in this whole story? It's low
when the universe begins with matter mostly spread out, and
then grows as the universe gathers together things into massive
objects like black holes, and then keeps growing as the
universe converts those black holes back into a bath of

(53:15):
matter and radiation from black hole evaporation. How does that
make sense in terms of our definition of entropy and
microstates and all that? Yeah, the answer is it really
doesn't because we don't have micro states for gravity. That
would require understanding how gravity works for particles, and we
just don't, not until somebody cracks quantum gravity. So this

(53:36):
concept of black hole entropy is not statistical. It's thermodynamic,
as we mentioned a minute ago. It's derived from arguments
about temperature and about energy. We know that a black
hole's entropy grows with its surface area, and that lines
up with our understanding that entropy grows because gravity will
gather stuff together to make black holes larger. But then

(53:57):
how does it make sense for entropy to keep growing
as the black holes evaporate away to something that
resembles the early universe again? But it turns out the
heat death bath of radiation is not the same as
the early universe conditions. We think that the quantum information
is still there. The whole history of the universe is encoded,
and so the particles that evaporated from the black holes

(54:19):
are all entangled together in a complicated way. People are
still figuring out what holds that information. So the answer
is we still aren't sure about a lot of this;
black hole information and entropy is a very active area
of research. And that's the best explanation I can give you.
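For reference, the area law Daniel alludes to is usually written (standard notation, added here for clarity):

$$ S_{\mathrm{BH}} = \frac{k_B\, c^3 A}{4 \hbar G} = \frac{k_B\, A}{4\, \ell_P^2}, \qquad \ell_P = \sqrt{\frac{\hbar G}{c^3}} $$

where $A$ is the horizon's surface area and $\ell_P$ is the Planck length; the entropy scales with the area of the horizon rather than the volume inside it.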

Speaker 2 (54:33):
Daniel, that was a perfect explanation.

Speaker 1 (54:35):
And so we end up with this situation where energy
is sort of surprisingly spread out again, but it's not cold.
People confuse the heat death of the universe and think, oh,
nothing is moving. They think of it like a deep freeze.
But it's not absolute zero. It just means there's no
way to get anything done the way like Carnot was saying,
that you need energy differences to get stuff done. Like

(54:57):
to run your engine, you need hot and cold so the
energy can flow from hot to cold. You need water
to flow downhill so you can capture it with your
water wheel. If everything is flat and smooth, then there's
no way to do anything useful, right, And so that's
what the heat death is. Not when everything is frozen,
not when particles can't move, but when there's no way
to do anything useful in the universe. And so that's

(55:19):
why you can't have like life or anything else interesting
because everything's maximally spread out. You can't take advantage of
any energy differences because there aren't any anymore.
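Carnot's point fits in one line (a standard thermodynamics result, added for clarity): the best possible efficiency of an engine running between a hot and a cold reservoir is

$$ \eta_{\max} = 1 - \frac{T_{\mathrm{cold}}}{T_{\mathrm{hot}}} $$

At heat death everything sits at one temperature, so $T_{\mathrm{cold}} = T_{\mathrm{hot}}$ and $\eta_{\max} = 0$: the energy is still there, but none of it can be extracted as work.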

Speaker 2 (55:27):
Do we know what temperature the heat death will be, if
it's not absolute zero?

Speaker 1 (55:30):
Yeah, it's a great question. It depends on how long
it takes, because as the universe expands, it cools, and
we don't know actually how quickly the universe will be
expanding in the future. It's accelerating, but you know, the
mechanism for that acceleration is dark energy, which is not understood.
So unfortunately I can't give you a number for that today.
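One part of this can be made precise (a standard cosmology scaling, not a number from the episode): the temperature of a radiation bath drops inversely with the universe's scale factor,

$$ T_{\mathrm{radiation}} \propto \frac{1}{a(t)} $$

so the final temperature depends on how big $a(t)$ eventually gets, which hinges on the unresolved behavior of dark energy.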

Speaker 2 (55:46):
Well, one of the things I love about our conversations
is that so often, halfway through a lesson, which is
what some of these end up feeling like, I feel
like I enter the conversation thinking Okay, I know what
we're talking about, and I leave thinking, wow, that was
a lot different than what I thought the answer was
going to be. And so I always leave and think
about the conversation like much longer into the day. It

(56:09):
sticks with me for a while, and so thank you
for helping me realize that I shouldn't be making entropy
jokes about my house, and now I'll start thinking about
jokes about how no work can get done. I'm gonna
start working on that.

Speaker 1 (56:25):
Yeah, or you know, use jazz instead. Hey, Zach, can
you jazz up the kitchen a little bit? Or the
kitchen has gotten a little too jazzy.

Speaker 2 (56:32):
A little too jazzy, I think is the problem with
the kitchen. It needs more melody.

Speaker 1 (56:38):
I love thinking about these topics, and especially helping people
understand how they really work, because so often the real
understanding of it is more fascinating and more interesting. We're
not throwing a wet blanket on people's idea of entropy.
We're showing them how exciting, how jazzy it actually is.

Speaker 2 (56:53):
Yes, and I'm trying to make myself feel a little
bit better about the multiple times in this conversation where
I got the answer exactly opposite of correct. And I guess
I'm hoping that this is a place where people can
come with their incorrect, preconceived notions Oh yeah, and not
feel self conscious about having it wrong. Absolutely, because if
Kelly can be wrong so often and can continue to

(57:15):
move through her life with any degree of confidence, then
you should also feel welcome to ask us anything, So
reach out.

Speaker 1 (57:22):
Yeah, and if one of our explanations didn't make sense
to you, please do reach out. It's not just Kelly
who gets to ask questions, and not just me. You
get to ask Kelly biology questions. We want to hear
your questions. Please do write to us at questions at
Daniel and Kelly dot org. We'll write back to you.

Speaker 2 (57:37):
We respond to everybody, looking forward to hearing from you.

Speaker 1 (57:39):
All right, until then, everybody keep it jazzy.

Speaker 2 (57:49):
Daniel and Kelly's Extraordinary Universe is produced by iHeartRadio. We
would love to hear from you. We really would.

Speaker 1 (57:56):
We want to know what questions you have about this
extraordinary universe.

Speaker 2 (58:01):
Want to know your thoughts on recent shows, suggestions for
future shows. If you contact us, we will get back
to you.

Speaker 1 (58:07):
We really mean it. We answer every message. Email us
at questions at Daniel and Kelly dot org.

Speaker 2 (58:14):
You can find us on social media. We have accounts
on X, Instagram, and Blue Sky, and on all of those platforms
you can find us at D and K Universe.

Speaker 1 (58:23):
Don't be shy, write to us