Author Topic:   10 Categories of Evidence For ID
paisano
Member (Idle past 6443 days)
Posts: 459
From: USA
Joined: 05-07-2004


Message 111 of 147 (208231)
05-14-2005 8:38 PM
Reply to: Message 109 by Jerry Don Bauer
05-14-2005 8:07 PM


This page is only half right, as this depends on how we calculate Shannon entropy. If we want to calculate it thermodynamically, it's just a matter of adding Boltzmann's constant into the formula to come out in joules per degree Kelvin.
Jerry, I'm sorry, but you're simply incorrect on this point. "Calculating Shannon entropy thermodynamically" is an incoherent statement. Shannon entropy and thermodynamic (Boltzmann) entropy are different quantities, and cannot be conflated like this.
If you want references, consult undergraduate thermo/stat mech texts such as Reif or Kittel, or graduate texts such as Pathria or Landau and Lifshitz.
In none of these texts, much less in the professional literature, will you see Shannon entropy and Boltzmann entropy conflated in the way you assert.
Let's do an illustrative example.
Consider two devices for measuring the quantity of fuel
in an automobile fuel tank:
1) An LED which is in a series circuit with a sending unit in the tank
with a switch that is open until there is only one gallon left in the tank,
at which point the switch is closed, completing the circuit and
lighting the LED.
This system has two possible information states:
LED off (>1 gallon left in the tank)
LED on (< 1 gallon left)
Without a priori information about the amount of fuel in the tank, we may assume either
state has equal probability. In this case, the Shannon entropy is indeed
proportional to ln W, where W is the number of possible information states of the system.
Thus, its Shannon entropy is proportional to ln(2), or about 0.693.
The amount of energy needed to light the LED can be estimated
by noting that a typical LED draws a current of about 20 milliamps
with a voltage drop of about 2.0 volts. (These numbers vary with
LED characteristics, but they are good approximations.)
Thus the power necessary to light the LED and keep it lit
is P = VI = 2.0 * (20e-03) = 40e-03 watts. Since a watt is a joule per second,
we need 40e-03 joules per second to light the LED and keep it lit.
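If anyone wants to check the arithmetic, here is a minimal Python sketch of the LED figures; the 2.0 V drop and 20 mA current are just the ballpark values assumed above, not measured values.

```python
import math

# LED indicator: two equiprobable information states (off = more than 1 gallon, on = less)
num_states_led = 2
shannon_entropy_led = math.log(num_states_led)  # natural log -> ~0.693 (dimensionless)

# Power needed to keep the LED lit, using the ballpark figures above
voltage_led = 2.0      # volts, assumed typical forward voltage drop
current_led = 20e-3    # amps, assumed typical operating current
power_led = voltage_led * current_led  # 0.04 W, i.e. 0.04 joules per second

print(f"LED: Shannon entropy ~ {shannon_entropy_led:.3f}, power ~ {power_led:.3f} W")
```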
2) A fuel gauge, essentially a pointer, bimetal strip, and heating coil connected in a circuit
to a sending unit consisting of a variable resistor connected to a mechanical float,
calibrated such that the total resistance in the circuit (heating coil + variable resistor) varies linearly with the quantity of fuel in the tank, with a voltage regulator to keep the voltage at 12 V.
To estimate the number of information states available to the fuel gauge, assume it has a pointer of length 5 cm and thickness 2 mm, with a full travel through a circular arc of 180 degrees.
Using pi = 3.14, this corresponds to an arc length of about 15.7 cm; dividing by the 2 mm pointer thickness gives 78.5 discrete positions of the pointer.
Let's call it approximately 80.
So again, assuming no a priori knowledge about the amount of fuel in the tank, each state is equiprobable and we have the Shannon entropy of the fuel gauge system proportional to
ln(80), or 4.38.
To estimate the energy difference of the fuel gauge between the empty and full state,
we note that, from the auto.howstuffworks.com entry on GM fuel gauges, the resistance of the sending unit circuit varies from 2 ohms at empty to 88 ohms at full. The voltage in the circuit is of course regulated at 12 volts.
So, the current in the circuit to keep the gauge pointer to "full" is
I= 12/88 = 0.14 amps
And the power supplied the heating coil is then
P = VI = 12*0.14 = 1.68 watts
At "empty" the current in the circuit is I= 12/2 = 6 amps
and the power supplied is
P = VI = 72 watts
So the difference in power supplied between the empty and full states of the fuel gauge is
about 70.3 watts, i.e. about 70.3 joules per second.
(If we knew the resistance of the coil and the thermal expansion coefficients of the
bimetal strip, we could do a more accurate computation, but we are just after an order of magnitude estimate here)
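The corresponding Python sketch for the fuel gauge, using the same rough pointer geometry and sending-unit resistances assumed above (the small difference from 70.3 W is just rounding):

```python
import math

# Fuel gauge: number of distinguishable pointer positions over a 180-degree sweep
pointer_length_cm = 5.0
pointer_thickness_cm = 0.2                       # 2 mm
arc_length_cm = math.pi * pointer_length_cm      # half-circumference ~ 15.7 cm
num_states_gauge = arc_length_cm / pointer_thickness_cm  # ~78.5, call it 80

shannon_entropy_gauge = math.log(80)             # ~4.38 (dimensionless)

# Power at the two end states, from the sending-unit resistances quoted above
voltage = 12.0                                   # regulated supply, volts
power_full = voltage ** 2 / 88.0                 # ~1.6 W (I = 12/88 ~ 0.14 A)
power_empty = voltage ** 2 / 2.0                 # 72 W   (I = 12/2  = 6 A)
power_difference = power_empty - power_full      # ~70.4 W, ~70.3 W with the rounding above

print(f"Fuel gauge: ~{num_states_gauge:.0f} states, "
      f"Shannon entropy ~ {shannon_entropy_gauge:.2f}, "
      f"power difference ~ {power_difference:.1f} W")
```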
Now, on to entropy.
The ratio of Shannon entropy of the fuel gauge system to that of the LED system is:
Shannon entropy ratio = 4.38/0.693 = 6.32
Now what about the thermodynamic or Boltzmann entropy ?
We could compute the thermodynamic entropy change for both systems at their respective end states
from the grand canonical ensemble applied to the molecular structure of the system components, but this is overkill.
Computing the ratio of macroscopic thermal entropy change will be sufficient.
To estimate the ratio of thermodynamic entropy change between the end states of the LED system and
the end states of the fuel gauge, we note that, from the first and second laws,
entropy change will scale linearly with energy change, so the ratio of entropy changes will be proportional
to the ratio of energy changes.
In other words, dS(fuel gauge)/dS(LED) = dE(fuel gauge)/dE(LED)
or, since both power figures apply over the same length of time, Thermodynamic entropy ratio = 70.3/40e-03 = 1758.
Note that since we have taken ratios, both ratios are dimensionless, so any unit differences between Shannon entropy and Boltzmann entropy cancel.
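Putting the two estimates together, a quick Python check of the two ratios, reusing the rounded figures from above:

```python
import math

# Ratios, reusing the rounded figures from the estimates above
shannon_ratio = math.log(80) / math.log(2)   # ~6.3
power_ratio = 70.3 / 40e-3                   # ~1758; over the same time interval this is
                                             # also the ratio of energy changes, and hence
                                             # (at roughly constant temperature) of
                                             # thermodynamic entropy changes

print(f"Shannon entropy ratio       ~ {shannon_ratio:.2f}")
print(f"Thermodynamic entropy ratio ~ {power_ratio:.0f}")
```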
To recap: the Shannon entropy ratio between the two information systems = 6.32, and
the Boltzmann entropy ratio between the two systems = 1758.
So the change in thermodynamic entropy is orders of magnitude larger than the
change in informational or Shannon entropy.
We can see that:
1) "information" does not scale as I=mc**2, in contrast to the rather naive assertion
that it did. Both systems are relatively simple linear circuits, and there is no way the differnece in mass of the electrons passing through them corresponds to the different number of information macrostates.
Not to mention the E= mc**2 relation applies to relativistic mechanics, not fuel gauges or coin ensembles or tea and sugar.
2) Changes in Boltzmann entropy and Shannon entropy for different systems do not scale linearly with each other,
so doing things like adding them is meaningless, and they are not interchangeable.
And here is the takeaway point:
3) The system that provides more information when observed (the fuel gauge) has greater Shannon entropy than the system that provides less information when observed (the LED). This is the opposite of the interpretation you (Jerry) have been asserting.
Now I have certainly made some simplifying assumptions in doing this calculation, but none that, IMO, would affect the final outcome. If anyone disagrees, let them show their work.

This message is a reply to:
 Message 109 by Jerry Don Bauer, posted 05-14-2005 8:07 PM Jerry Don Bauer has replied

Replies to this message:
 Message 112 by Jerry Don Bauer, posted 05-15-2005 2:35 AM paisano has replied

  
paisano
Member (Idle past 6443 days)
Posts: 459
From: USA
Joined: 05-07-2004


Message 116 of 147 (208332)
05-15-2005 9:29 AM
Reply to: Message 112 by Jerry Don Bauer
05-15-2005 2:35 AM


But we can play hard and fast with both of these as many physicists do and use Shannon's formula with Boltzmann's constant in it and these two entropies become blurred, some even calling it Shannon/Boltzmann entropy:
Not quite. What you have here, with the constant, is in fact the generalized Boltzmann equation for thermodynamic entropy when the probabilities of the system being in each of its microstates are not known.
The form S = k ln W is a simplification for the case where the probabilities of the microstates are all equal, i.e. each p_i = 1/W.
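Written out, the reduction is one line: with p_i = 1/W for each of the W equally probable microstates,

S = -k * sum_i p_i ln(p_i) = -k * W * (1/W) * ln(1/W) = k ln W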
Also, perhaps now you can see why you were wrong in your former assertion that Shannon entropy reduces down to Boltzmann's entropy.
I think you've misunderstood something. My point is that they are not equivalent, and you can't just compute one or the other by using or not using Boltzmann's constant.
If you are considering permutations of macroscopic quantities (like LED or fuel gauge states, or heads or tails for coin ensembles), then in either case you are computing Shannon entropy, or at best "Shannon entropy multiplied by Boltzmann's constant", which isn't really a useful quantity.
If you really want to compute Boltzmann entropy, doing the Shannon entropy and multiplying by k isn't the correct way to do it.
You need to deal with microstates. You need to consider the states at the molecular level. You need to compute a partition function to get at how the probabilities of the molecular level microstates are distributed. You then need to apply the canonical or grand canonical ensemble to get a sensible number for the Boltzmann entropy that properly tracks with all of the other thermodynamic variables of the system like Gibbs free energy, heat, etc.
You can also compute the entropy, if you know something about the macroscopic thermal properties of the system in terms of energy and heat, using the first and second laws.
The Boltzmann entropy computed by either of these methods should be consistent and should make sense in terms of macroscopic thermodynamic variables.
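As a minimal sketch of what that microstate route looks like, here is a toy two-level system in Python; the energy levels are invented purely for illustration and have nothing to do with the LED or the fuel gauge circuits above.

```python
import math

k_B = 1.380649e-23           # Boltzmann's constant, J/K
T = 300.0                    # temperature, K
energies = [0.0, 2.0e-21]    # hypothetical microstate energies in joules (toy values)

# Canonical ensemble: partition function and microstate probabilities
Z = sum(math.exp(-E / (k_B * T)) for E in energies)
probs = [math.exp(-E / (k_B * T)) / Z for E in energies]

# Boltzmann (Gibbs) entropy S = -k_B * sum_i p_i ln p_i, in J/K
S = -k_B * sum(p * math.log(p) for p in probs)
print(f"Z = {Z:.3f}, S = {S:.3e} J/K")
```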
Hmmm.....Thank you, as you've got me stymied as to how any of this has anything to do with the discussion.
I'm doing an order of magnitude estimate. I'm trying to show how thermodynamic entropy scales for the two systems, based on a macroscopic consideration of their energy changes.
It doesn't scale the same way as Shannon entropy, which is the point I am trying to make:
You can't compute Boltzmann entropy by computing Shannon entropy and then multiplying by k, or vice versa.
If you could do this, the ratio of Shannon entropies for the two systems and the ratio of Boltzmann entropies would come out equal, because the constants cancel in a ratio.
The ratios don't come out equal, because the Shannon entropy is a macroscopic statistical quantity, and at the fundamental level the Boltzmann entropy is a microscopic statistical quantity, although it can be related to macroscopic thermal quantities like energy and heat.

This message is a reply to:
 Message 112 by Jerry Don Bauer, posted 05-15-2005 2:35 AM Jerry Don Bauer has not replied

  