Author Topic:   Entropy in Layman's Terms
Son Goku
Inactive Member


Message 5 of 51 (544462)
01-26-2010 3:28 PM
Reply to: Message 1 by Rrhain
01-23-2010 3:33 AM


Entropy - The statistical mechanics viewpoint.
Rather than explain entropy from a thermodynamic point of view, I'll try to explain it from a statistical mechanical perspective. What this means is that instead of talking about it as a consequence of the movement of heat and energy (thermodynamics), I will discuss how it relates to the basically random (statistical) motions (mechanics) of billions of atoms.
Take a pile of hydrogen gas; it's composed of billions and billions of atoms. Let's be specific and say we have 1 kilogram of hydrogen. That's roughly
600,000,000,000,000,000,000,000,000 atoms. If we want to describe the hydrogen gas we need six numbers for each atom: three to indicate where the atom is and three to indicate its momentum in each direction. So that's roughly
3,600,000,000,000,000,000,000,000,000 numbers overall.
However, obviously nobody describes a gas this way. You use one number for its weight, one for its volume and maybe, at most, a few numbers to describe its shape. This is a vastly smaller description.
So to describe the state of the gas on a microscopic scale, you need a huge amount of information and on a macroscopic scale you need far less. This leads to the idea of a microstate and macrostate. The microstate is the information on the current state of the atoms from an atomic point of view. The macrostate is the information needed to describe the gas from a larger/macro point of view.
The macrostate is obviously a much cruder way of describing the gas, so many states of the gas which have different microstates will have identical macrostates.
As an example, say you were looking at the gas. If somebody displaced one single atom 5 nanometers to the right, then the microstate would be different, because the atom is in a different position. However, the macrostate would be unchanged, because there is no visible difference at all on our scale.
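If it helps to see the bookkeeping difference, here is a toy sketch in Python (my own illustration, not real physics), with five atoms standing in for the real 6 x 10^26:

import random

# Toy microstate: six numbers (x, y, z, px, py, pz) per atom.
# A real kilogram of hydrogen would need ~3.6e27 numbers; we use 5 atoms.
N = 5
microstate = [tuple(random.uniform(-1.0, 1.0) for _ in range(6))
              for _ in range(N)]

# Toy macrostate: just a handful of bulk numbers for the whole gas.
macrostate = {"n_atoms": N, "volume_m3": 1.0}

print(6 * N, "numbers in the microstate vs", len(macrostate), "in the macrostate")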
Entropy is defined at the macroscale; it is essentially the quantity that "bridges" the two scales. The entropy of a macrostate is the number of microstates which produce that macrostate.*
You might ask, why does entropy increase and how is it related to disorder and work?
The answer to the first question is that entropy doesn't actually always increase; it's just massively unlikely not to.
Let's take two macrostates of our gas of hydrogen. In one macrostate all the hydrogen is spread across the room evenly. In the other macrostate it's packed into one corner. The first macrostate has a massive number of microstates associated with it. You can move loads of atoms and nothing will change on our scale. For the other, move enough atoms out of the corner and things will start to look different.
To give you an idea, say the room was about one cubic meter in size. If we compare the case where the gas was concentrated in a corner one tenth of a cubic meter in size with the case where the gas is spread evenly around the room, the latter has: (get ready for a massive number)
googol × googol × googol × googol × googol × googol × googol × googol × googol × googol × googol × googol (a googol is 10^100, so that's 10^1,200),
times more microstates than the former.
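Here is a rough sketch in Python of where numbers like this come from (my own crude count: it tracks only positional arrangements and ignores momenta, and it actually comes out even larger than the figure above, which only strengthens the point):

import math

# Crude positional count: each atom can sit anywhere in the available
# volume, so microstate counts scale like V**N and the ratio between
# "spread out" and "packed in the corner" is (V2/V1)**N.
N = 6 * 10**26         # atoms in roughly 1 kg of hydrogen
V1, V2 = 0.1, 1.0      # corner vs. whole room, in cubic meters

# The ratio itself overflows any numeric type, so take its base-10 log.
log10_ratio = N * math.log10(V2 / V1)
print(f"the spread-out state has about 10**{log10_ratio:.1e} times more microstates")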
That means that there are far, far more ways to arrange the atoms in the latter case to have things look identical than there are in the former. You could say the latter macrostate is more generic and the former more special. Entropy is essentially a measure of how generic a macrostate is. The more generic a state is, the bigger the entropy.
This is why entropy will almost certainly increase. As time moves on, the atoms are more likely to randomly wander into a high entropy state: because high entropy states are literally the more generic/common states, the odds are in their favour.
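You can even watch this happen in a toy simulation (my own sketch, an Ehrenfest-style urn model rather than real dynamics; the 1,000 atoms and 10,000 steps are just illustrative choices). All the atoms start in the corner, randomly chosen atoms hop back and forth, and the count drifts toward the even split simply because far more arrangements look that way:

import random

# Ehrenfest-style urn model: each step, a randomly chosen atom hops
# between "corner" and "rest of room". Nothing pushes the atoms out;
# the 50/50 macrostate just has vastly more arrangements.
N, steps = 1000, 10000
in_corner = N                             # all atoms start in the corner
for _ in range(steps):
    if random.randrange(N) < in_corner:   # picked an atom in the corner
        in_corner -= 1
    else:                                 # picked an atom outside it
        in_corner += 1
print(f"after {steps} hops: {in_corner} of {N} atoms in the corner (~{N // 2} expected)")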
Finally, what has this to do with usable work?
Well if you think about it, in order for something to do work, it has to be arranged in a fashion where it can give up its energy to something else. You have to put (or find) it in a work-capable state. To take my hydrogen gas example, you could get it to do work by arranging it to be tightly compressed, so that its pressure does work. Obviously there are macrostates more capable of doing work than others. The point is that a state that can do a lot of work is rarer than one that can do little, simply because one requires a specific arrangement and hence is special/rare, and the other doesn't need any particular arranging and so is common.
Over time, just as I said, you're more likely to move into a common state (because they're common) and so you'll probably end up in a state with little capacity for work.
Hence entropy will probably increase and usable work will probably decrease.
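As a concrete, idealised illustration of that last point (treating the gas as ideal at room temperature is my own added assumption, not part of the argument above): 1 kilogram of hydrogen, about 500 mol of H2, squeezed into a tenth of a cubic meter and allowed to expand isothermally can deliver roughly

$W = nRT\ln(V_2/V_1) \approx 500 \times 8.314 \times 300 \times \ln 10 \approx 2.9 \times 10^6\ \text{J}$

of work, while the same gas already spread evenly through the room at the same temperature can deliver none of it.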
*More accurately, it is the logarithm of the number of microstates.
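In symbols, the footnote is Boltzmann's famous formula, with $\Omega$ the number of microstates and $k_B$ Boltzmann's constant:

$S = k_B \ln \Omega$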

This message is a reply to:
 Message 1 by Rrhain, posted 01-23-2010 3:33 AM Rrhain has not replied

Replies to this message:
 Message 47 by Peepul, posted 04-27-2010 7:03 AM Son Goku has replied

  
Son Goku
Inactive Member


Message 10 of 51 (557151)
04-23-2010 4:37 AM
Reply to: Message 9 by Fiver
04-17-2010 3:38 PM


Entropy and Disorder
Yes, you are correct. It is an example showing that entropy is not really related to what humans call order/disorder. There are fewer arrangements of the marbles which result in a bag with a separation of cold and hot marbles, hence it has lower entropy.
Of course, since all of this is related to statistics, you need a large number of marbles for the statement to be meaningful.
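To put rough numbers on it (my own sketch, treating the bag as 100 fixed slots holding 50 hot and 50 cold marbles):

from math import comb

# Counting sketch: "separated" pins the 50 hot marbles to the top 50
# slots (exactly one way to choose their positions); "mixed" lets them
# occupy any 50 of the 100 slots.
separated = comb(50, 50)    # = 1
mixed = comb(100, 50)       # about 1.01 x 10^29
print("separated:", separated, " mixed:", mixed)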

This message is a reply to:
 Message 9 by Fiver, posted 04-17-2010 3:38 PM Fiver has not replied

  
Son Goku
Inactive Member


Message 49 of 51 (558210)
04-30-2010 11:07 AM
Reply to: Message 47 by Peepul
04-27-2010 7:03 AM


Re: Entropy - The statistical mechanics viewpoint.
Peepul writes:
    Hi Son Goku,
    I understand the microstate / macrostate definition, thanks for that.
    But I'm struggling with the link to 'usable work' and I'm hoping you'll be able to help me out.
    * Which of these concepts is actually the 'definition' of entropy?
The definition I gave is "the" definition of entropy. The thermodynamic version related to usable work is an idealisation. For example, the thermodynamic definition of entropy and the associated second law say entropy never decreases. The statistical mechanical definition says that entropy is extremely unlikely to decrease.
However, the idealisation involved in the thermodynamic definition is the second most accurate idealisation in all of science. More on this below.
Peepul writes:
    * The link you describe between the two views is quite conceptual - can it be demonstrated rigorously with maths? (I'm not asking you to actually show the math, btw)
Good question. Yes, it can be rigorously shown, although it takes an incredible amount of work. Physicists always went by the conceptual link and assumed they were the same. However, it was only in the 1960s, with work by David Ruelle, that people began to check whether it was rigorously true. Basically, you get the "usable work" definition back when you take the limit of an infinite number of particles. In fact, statistical mechanics turns into thermodynamics in the infinite particle limit.
Of course the real world doesn't have an infinite number of particles, so the thermodynamic definition is not the correct one and is really an idealisation. However, the error of this idealisation is so small (and I mean really small) that it doesn't matter at all in practical terms.
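A rough way to see the size of the error (a standard back-of-the-envelope estimate of mine, not something from Ruelle's work specifically): relative fluctuations of macroscopic quantities typically scale like $1/\sqrt{N}$, so for $N \approx 6 \times 10^{26}$ particles they are of order $10^{-13}$, far too small to notice in any everyday measurement.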
Peepul writes:
    * I actually think there is quite a strong link with disorder here - do you think people are wrong to say that entropy is a measure of disorder? If so, why are they wrong?
They are wrong in the technical sense that entropy and disorder are not the same. In a lot of situations an increase in entropy would correspond to an increase in disorder. However, there are cases where they don't really have any relation; for example, increasing entropy might lead to what humans would call increasing order.
The real problem is that disorder is related to our intuitive notions of chaos and order, so it leads to the misconception that our ideas of chaos and order are somehow a principle of the physical world.
However, in a mechanical setting, or in everyday applications (a car rusting, a fridge breaking, etc.), it's perfectly fine to think of entropy as disorder.

This message is a reply to:
 Message 47 by Peepul, posted 04-27-2010 7:03 AM Peepul has replied

Replies to this message:
 Message 50 by Peepul, posted 04-30-2010 11:16 AM Son Goku has not replied

  