Author Topic:   10 Categories of Evidence For ID
Limbo
Inactive Member


Message 106 of 147 (208211)
05-14-2005 7:41 PM
Reply to: Message 105 by Jerry Don Bauer
05-14-2005 7:11 PM


Re: Category 1
Whereas Darwinism is based on naturalism, correct?
Why the heck can't we have science based on more than one philosophy when it comes to origin questions?
We are a philosophically diverse people; why can't our science reflect this?
This message has been edited by Limbo, 05-14-2005 07:42 PM

This message is a reply to:
 Message 105 by Jerry Don Bauer, posted 05-14-2005 7:11 PM Jerry Don Bauer has not replied

Replies to this message:
 Message 108 by NosyNed, posted 05-14-2005 8:01 PM Limbo has not replied

  
RAZD
Member (Idle past 1404 days)
Posts: 20714
From: the other end of the sidewalk
Joined: 03-14-2004


Message 107 of 147 (208215)
05-14-2005 7:58 PM
Reply to: Message 99 by Jerry Don Bauer
05-14-2005 6:45 PM


1) When loose information is diffused, entropy will tend to rise.
2) Specificity is inversely proportional to the probability of an event occurring.
And I cannot believe you are missing this one since we've been yammering about it for 50 posts:
3) DNA must be designed by an intelligent agent or by code pre-programmed by an intelligent agent.
Now describe how the experiments are being set up to test these hypotheses.
This is, after all, the way scientists work. They don't stop with the concept and wait.
And continually reasserting your position on DNA does not make it any more factual: describe the evidence that eliminates all other possibilities.

we are limited in our ability to understand
by our ability to understand
RebelAAmerican.Zen[Deist
{{{Buddha walks off laughing with joy}}}

This message is a reply to:
 Message 99 by Jerry Don Bauer, posted 05-14-2005 6:45 PM Jerry Don Bauer has not replied

  
NosyNed
Member
Posts: 8996
From: Canada
Joined: 04-04-2003


Message 108 of 147 (208217)
05-14-2005 8:01 PM
Reply to: Message 106 by Limbo
05-14-2005 7:41 PM


More than one philosophy
Why the heck cant we have science based on more than one philosophy when it comes to origin questions?
We already do. A significant fraction of scientists have a "philosophy" that includes a creator god. They continue to accept this while they try to determine how their god did his work.
As far as "naturalism" is concerned, Darwinism simply takes the same approach as all the sciences. We have a question about the natural world (not the supernatural). We may have alternative ideas about the answer to that question. How are we to settle such a question?
How are we to sort out the alternative views in a way that is most likely to arrive somewhere near the most "real" answer? The philosophical approach taken is that of "methodological naturalism", which has nothing to do with "philosophical naturalism". It says that we examine the available evidence and arrive at a consensus view, with each side in the debate making its best argument based on that evidence. When one view has won over the majority of those engaged in the debate, it becomes the current theory answering that question about the natural world.
If you think there is a better way to answer a natural question, I would love to hear it. After specifying how it would work in general, some examples would be nice.
The correct place for this would be in one of the "Is It Science" threads or you might start a new one.
This message has been edited by NosyNed, 05-14-2005 08:02 PM

This message is a reply to:
 Message 106 by Limbo, posted 05-14-2005 7:41 PM Limbo has not replied

  
Jerry Don Bauer
Inactive Member


Message 109 of 147 (208220)
05-14-2005 8:07 PM
Reply to: Message 96 by Percy
05-13-2005 2:00 PM


quote:
In other words, you're mistaking meaning for information. Meaning is independent from information. Your example actually consists of two possible messages from the set of messages of a channel of information 36 characters wide. The difference in meaning between the two messages is merely a human superimposed one.
I'm looking at this from the aspect of hard science. Once an information channel has been established between a source and a recipient, information flows. That is objective science, and this information can be calculated. Meaning is subjective because it is open to human interpretation. Not on the same planet.
quote:
Information is conceptual. You're confusing the means of representing and transmitting information with information itself. Possibly you arrived at this conclusion because you've confused informational entropy with thermodynamic entropy. This is a common mistake, so common Wikipedia even addresses it in its article on Information Theory:
In other words, obvious but superficial analogies between the two caused the adoption of the term "entropy" by the science of information theory, but the two apply to distinctly different realms and have completely different rules of application.
I wouldn't put too much credibility in Wikipedia, as this is an encyclopedia written by its readers, and you never know if the page you are viewing was written by an informationalist or a 15-year-old high-school dropout. This page is only half right, as this depends on how we calculate Shannon entropy. If we want to calculate it thermodynamically, it's just a matter of adding Boltzmann's constant to the formula to come out in Joules/degree Kelvin.
Information is NOT conceptual. If on is heads and off is tails and I have 3 coins that are heads, I have 2^3 = 8 possible states, and log2(8) = 3 bits of information. No subjective meaning there, just information expressed mathematically.
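Here is a minimal Python sketch of that coin arithmetic (the three-coin case is from the post; the general n-coin formula is the standard base-2 count):

import math

def bits_for_coins(n):
    # n two-state coins have 2**n equally likely patterns;
    # specifying one of them carries log2(2**n) = n bits
    return math.log2(2 ** n)

print(bits_for_coins(3))  # 8 states -> 3.0 bits, as in the post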
quote:
I didn't think you were dissing genetic algorithms. I was addressing the point that was the reason genetic algorithms were originally introduced into the discussion. ID claims that only intelligence can create information. Genetic algorithms are applications of evolutionary principles to design problems. Genetic algorithms create original designs, i.e., new information. Genetic algorithms are not intelligent. ID's claims about intelligence and information are not only unsupported, they're contradicted by the mere existence of genetic algorithms.
And as I said earlier, this is self-evident anyway. All natural processes generate information. People don't create information just by writing down observations.
It doesn't work this way. These programs are NOT creating information in the way the Darwinists think they are and this has been shown time and time again by others. I was discussing Schneider's Ev program with someone. For a good idea of how that program REALLY works:
http://www.iscid.org/papers/Strachan_EvEvaluation_062803.pdf
quote:
I don't really see much difference between a made up god and a made up alien from another universe.
So you think Hawking's musings that our universe was caused by a singularity event in the black hole of another universe deal with made-up aliens and gods? Interesting.
quote:
"Mathematically impossible" is just an unsupported assertion, and genetic algorithms make clear this isn't the case.
No, mathematically impossible means just that:
Shannon relates that certain relays or flip-flop circuits store information. N of these devices will store N bits, since the total number of possible states of these devices is 2^N (off and on) and log2(2^N) = N; a device with two stable positions, such as these switches, can store one bit of information. When base 10 is used, the units may be called decimal digits. Converting decimal digits to bits is accomplished by the following formula:
log2(N) = log10(N) / log10(2)
While particles are particles, they are in one state, and when they change from particles
into non-particles (energy) they are in another state. Thus, we are right in the middle of Shannon's notion of a system with two states of existence.
Since one particle represents one bit when it's "on" as a particle, I would throw out a
question: is it possible to get more information out of a system than there is information
in it? Stephen Hawking and John Preskill have an ongoing bet that Hawking groupies
will be familiar with. Preskill maintains that the information that comes out of a black
hole reflects the information that went in it. Hawking is a bit muddled on the issue, but
one tenet I think we all can agree on is that we cannot get MORE information out of a
black hole than is in the black hole to begin with. It is little more than common sense
(coupled with a law or two of science) to understand that one cannot retrieve more
information from the Encyclopedia Britannica than it contains.
Understanding this, it becomes a matter of mathematics to fathom how much information nature could have produced since the inception of the universe. Each particle cannot produce more information than it contains; so if a particle produces maximum information at every chance it gets, then, using the standard estimate of 10^80 particles in the universe and a 15-billion-year-old universe, we can calculate the total information the universe could produce. We must assume that it might do so every t_P, or every Planck time cycle, the smallest time interval possible--remember, we are calculating a maximum.
As Shannon asserted, when we use base 10 we will come out in decimal digits. So if 10^80 particles are producing maximum information every 10^-45 seconds and there have been 4.73 x 10^17 seconds since the big bang, the actual universal probability bound is:
10^80 x (4.73 x 10^17 / 10^-45) = 4.73 x 10^142, or rounding, 10^143.
And considering Shannon's conversion to bits:
log10(10^143) / log10(2) = 143 / 0.30103 = 475 bits
There's your math, and since the simplest organism contains many times this amount of information, we can safely conclude that these organisms did not come from nature.
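For anyone who wants to check the arithmetic, here is a minimal Python sketch reproducing the numbers above, using the post's own assumptions (10^80 particles, one event per 10^-45 seconds, 4.73 x 10^17 seconds of cosmic history):

import math

particles = 1e80   # the post's estimate of particles in the universe
tick = 1e-45       # the post's per-event time interval, in seconds
age = 4.73e17      # the post's age of the universe, in seconds

events = particles * (age / tick)   # maximum information-producing events
print(f"{events:.3g}")              # ~4.73e142, rounded in the post to 10^143

bits = math.log10(events) / math.log10(2)   # decimal-digit-to-bit conversion
print(round(bits))                          # ~474; the post rounds to 10^143 first and gets 475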

Design Dynamics

This message is a reply to:
 Message 96 by Percy, posted 05-13-2005 2:00 PM Percy has replied

Replies to this message:
 Message 111 by paisano, posted 05-14-2005 8:38 PM Jerry Don Bauer has replied
 Message 115 by Percy, posted 05-15-2005 9:26 AM Jerry Don Bauer has not replied
 Message 119 by niniva, posted 11-23-2005 4:45 PM Jerry Don Bauer has not replied

  
Jerry Don Bauer
Inactive Member


Message 110 of 147 (208227)
05-14-2005 8:27 PM
Reply to: Message 98 by Jazzns
05-14-2005 5:48 PM


Re: Intelligent Selection?
quote:
I am not making any claim. All I am asking is how would one tell the difference? You came to the conclusion that the coin example was a case of intelligent selection. What objective test did you do in order to reach that conclusion that would apply to any instance of selection?
I came to that conclusion because it was inherent in the description to begin with. When one intelligently accepts one outcome and rejects another, how can it be anything other than intelligence?
quote:
Which, first off, is not what modern science considers common descent or the definition of species to be. If anyone was really advocating a literal adherence to the parody you described in that post then I agree that this would not be science. There are many definitions of species, some of which cover non-extant species, so there is no violation of any definitions.
Species - Wikipedia
See in particular the definition of morphological species. Certainly where it is possible the stronger definition of biological species can be used but by no means is it an exhaustive definition or the only one used in science.
There is only one definition for a sexual, biological species. I do understand that Darwinism has tried to get rid of it to suit their agenda, but it hasn't quite been done yet.
A morpho-species is not a definition of a biological species, but a terminology paleontologists use to classify fossils. It has nothing to do with biological systems other than employing morphology to draw similarities and contrast differences in structure.
quote:
We have fossils of creatures whose jawbone functions both as a jaw bone and as a sound-wave receptor. Before them we have similar-looking creatures with just a jaw bone. After them we have similar-looking creatures that have the same jaw bone primarily as a sound receptor. Last we have similar-looking creatures that have the same bone exclusively as a sound receptor.
How is it not scientific to tentatively conclude that this is a transitional sequence as a result of descent with modification?
Because you have no experimental evidence to show this. Lamarck did the same thing. Since little trees look very similar to middle-sized trees, and those to really big trees, he concluded, in a similar vein to you, that the big trees evolved from the little trees.
The main difference today is that he was laughed out of town, while modern Darwinists have managed to infiltrate academia with this stuff.
quote:
How would the mechanism for this sequence change its meaning if it was performed by Intelligent vs Natural Selection?
I don't understand the question. Change the meaning of what?
quote:
What test could I do to tell if this sequence of evolution was intelligently prescribed or a "devolution" as you would call it?
Sequence the genome and look for noise in the form of pseudogenes.

Design Dynamics

This message is a reply to:
 Message 98 by Jazzns, posted 05-14-2005 5:48 PM Jazzns has not replied

  
paisano
Member (Idle past 6422 days)
Posts: 459
From: USA
Joined: 05-07-2004


Message 111 of 147 (208231)
05-14-2005 8:38 PM
Reply to: Message 109 by Jerry Don Bauer
05-14-2005 8:07 PM


This page is only half right, as this depends on how we calculate Shannon entropy. If we want to calculate it thermodynamically, it's just a matter of adding Boltzmann's constant to the formula to come out in Joules/degree Kelvin.
Jerry, I'm sorry, but you're simply incorrect on this point. "Calculating Shannon entropy thermodynamically" is an incoherent statement. Shannon entropy and thermodynamic (Boltzmann) entropy are different quantities, and cannot be conflated like this.
If you want references, consult undergraduate thermo/stat mech texts such as Reif or Kittel, or graduate texts such as Pathria or Landau and Lifshitz.
In none of these texts, much less in the professional literature, will you see Shannon entropy and Boltzmann entropy conflated in the way you assert.
Let's do an illustrative example.
Consider two devices for measuring the quantity of fuel
in an automobile fuel tank:
1) An LED which is in a series circuit with a sending unit in the tank
with a switch that is open until there is one gallon in the tank,
at which point the switch is closed, completing the circuit and
lighting the LED.
This system has two possible information states:
LED off (>1 gallon left in the tank)
LED on (< 1 gallon left)
Without a priori information about the amount of fuel in the tank, we may assume either
state has equal probability. In this case, the Shannon entropy is indeed
proportional to ln W, where W is the number of possible information states of the system.
Thus, its Shannon entropy is proportional to ln(2), or 0.693.
The amount of energy needed to light the LED can be estimated
by noting that a typical LED draws a current of about 20 milliamps
with a voltage drop of about 2.0 volts. (More or less, these numbers vary with
LED characteristics but are good approximations).
Thus the power necessary to light the LED and keep it lighted
is P = VI = 2.0*(20e-03) = 40e-03 watts. Since watts = joules/sec,
we need 40e-03 joules per second to light the LED and keep it lit.
2) A fuel gauge, essentially a pointer, bimetal strip, and heating coil connected in a circuit
to a sending unit consisting of a variable resistor connected to a mechanical float,
calibrated such that the total resistance in the circuit (heating coil + variable resistor) varies linearly with the quantity of fuel in the tank, with a voltage regulator to keep the voltage at 12V.
To estimate the number of information states available to the fuel gauge, assume it has a pointer of length 5 cm and thickness 2 mm, with a full travel through a circular arc of 180 degrees.
Using pi = 3.14, this corresponds to an arc length of 15.7 cm, or 78.5 discrete positions of the pointer.
Let's call it approximately 80.
So again, assuming no a priori knowledge about the amount of fuel in the tank, each state is equiprobable and we have the Shannon entropy of the fuel gauge system proportional to
ln(80), or 4.38.
To estimate the energy difference of the fuel gauge between the empty and full state,
we note that from the auto.howstuffworks.com entry on GM fuel gauges, the resistance of the sending unit circuit varies from 2 ohms at empty, to 88 ohms at full. The voltage in the circuit is of course regulated 12 volts.
So, the current in the circuit to keep the gauge pointer to "full" is
I= 12/88 = 0.14 amps
And the power supplied the heating coil is then
P = VI = 12*0.14 = 1.68 watts
At "empty" the current in the circuit is I= 12/2 = 6 amps
and the power supplied is
P = VI = 72 watts
So the energy supply difference between the empty and full state for the fuel gauge is
about 70.3 watts, i.e., about 70.3 joules per second.
(If we knew the resistance of the coil and the thermal expansion coefficients of the
bimetal strip, we could do a more accurate computation, but we are just after an order of magnitude estimate here)
Now, on to entropy.
The ratio of Shannon entropy of the fuel gauge system to that of the LED system is:
Shannon entropy ratio = 4.38/0.693 = 6.32
Now what about the thermodynamic or Boltzmann entropy ?
We could compute the thermodynamic entropy change for both systems at their respective end states
from the grand canonical ensemble applied to the molecular structure of the system components, but this is overkill.
Computing the ratio of macroscopic thermal entropy change will be sufficient.
To estimate the ratio of thermodynamic entropy change from the end states of the LED system and
the end states of the fuel gauge, we note that , from the first and second laws,
entropy change will scale linearly with energy change, so the ratio of entropy will be proportional
to the ratio of energy.
In other words, dS (fuel gauge)/dS (LED) = dE(fuel gauge)/dE(LED)
or, Thermodynamic entropy ratio = 70.3/40e-03 = 1758.
Note that since we have taken ratios, any unit differences between the Shannon entropy and Boltzmann entropy ratios cancel.
To recap, the Shannon entropy ratio between the two information systems = 6.32
the Boltzmann entropy ratio between the two systems = 1758.
So the change in thermodynamic entropy is orders of magnitude larger than the
change in informational or Shannon entropy.
We can see that:
1) "information" does not scale as I=mc**2, in contrast to the rather naive assertion that it did. Both systems are relatively simple linear circuits, and there is no way the difference in mass of the electrons passing through them corresponds to the different number of information macrostates.
Not to mention the E=mc**2 relation applies to relativistic mechanics, not fuel gauges or coin ensembles or tea and sugar.
2) Changes in Boltzmann entropy and Shannon entropy for different systems do not scale linearly, so doing things like adding them is meaningless, and they are not interchangeable.
And here is the takeaway point:
3) The system that provides more information when observed (the fuel gauge) has greater Shannon entropy than the system that provides less information when observed (the LED). This is the opposite of the interpretation you (Jerry) have been asserting.
Now I have certainly made some simplifying assumptions in doing this calculation, but none that, IMO, would affect the final outcome. If anyone disagrees, let them show their work.
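For anyone who wants to rerun the estimate, here is a minimal Python sketch of the same order-of-magnitude calculation, using the component values assumed above (2.0 V / 20 mA LED, 12 V regulated circuit, 2-88 ohm sender, ~80 pointer positions):

import math

# Shannon entropies, proportional to ln W with equiprobable states
H_led = math.log(2)      # two states (on/off) -> 0.693
H_gauge = math.log(80)   # ~80 resolvable pointer positions -> 4.38

# Power figures from the post's circuit assumptions
P_led = 2.0 * 20e-3      # LED: V*I = 0.04 W
P_full = 12**2 / 88      # gauge at "full": V^2/R ~ 1.64 W
P_empty = 12**2 / 2      # gauge at "empty": 72 W

print(H_gauge / H_led)             # Shannon entropy ratio ~ 6.32
print((P_empty - P_full) / P_led)  # thermal ratio ~ 1759 (the post's rounding gives 1758)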

This message is a reply to:
 Message 109 by Jerry Don Bauer, posted 05-14-2005 8:07 PM Jerry Don Bauer has replied

Replies to this message:
 Message 112 by Jerry Don Bauer, posted 05-15-2005 2:35 AM paisano has replied

  
Jerry Don Bauer
Inactive Member


Message 112 of 147 (208285)
05-15-2005 2:35 AM
Reply to: Message 111 by paisano
05-14-2005 8:38 PM


quote:
Jerry, I'm sorry, but you're simply incorrect on this point. "Calculating Shannon entropy thermodynamically" is an incoherent statement. Shannon entropy and thermodynamic (Boltzmann) entropy are different quantities, and cannot be conflated like this.
This is correct upon first examination. The problem here is that it is not that cut and dried because one can manipulate this mathematically and come out with something between the two.
Boltzmann entropy should always come out in energy/temperature, expressed as Joules/degree Kelvin, because Boltzmann's constant is the gas constant per particle: R = 0.0821 (atm.L)/(mol.K) and 1 atm.L = 101 J, so R = 8.31 J/(mol.K); with NA = 6.02 x 10^23/mol,
k = R/NA = 1.38 x 10^-23 J/K
And Shannon's formula, with no constant in it, is:
S = -Sum over i of Pi log(Pi)
But we can play hard and fast with both of these, as many physicists do, and use Shannon's formula with Boltzmann's constant in it; then these two entropies become blurred, some even calling the result Shannon/Boltzmann entropy:
S = -k Sum over i of Pi log(Pi)
Also, perhaps now you can see why you were wrong in your former assertion that Shannon entropy reduces down to Boltzmann's entropy. In the first example I gave you of Shannon's formula, there was no k in there, so it would have reduced down to S = log(W), not S = k log(W).
quote:
Let's do an illustrative example.
Consider two devices for measuring the quantity of fuel
in an automobile fuel tank:
1) An LED which is in a series circuit with a sending unit in the tank
with a switch that is open until there is one gallon in the tank,
at which point the switch is closed, completing the circuit and
lighting the LED.
This system has two possible information states:
LED off (>1 gallon left in the tank)
LED on (< 1 gallon left)
Without a priori information about the amount of fuel in the tank, we may assume either
state has equal probability. In this case, the Shannon entropy is indeed
proportional to ln W, where W is the number of possible information states of the system.
Thus, its Shannon entropy is proportional to ln(2), or 0.693.
The amount of energy needed to light the LED can be estimated
by noting that a typical LED draws a current of about 20 milliamps
with a voltage drop of about 2.0 volts. (More or less, these numbers vary with
LED characteristics but are good approximations).
Thus the power necessary to light the LED and keep it lighted
is P = VI = 2.0*(20e-03) = 40e-03 watts. Since watts = joules/sec,
we need 40e-03 joules per second to light the LED and keep it lit.
2) A fuel gauge, essentially a pointer, bimetal strip, and heating coil connected in a circuit
to a sending unit consisting of a variable resistor connected to a mechanical float,
calibrated such that the total resistance in the circuit (heating coil + variable resistor) varies linearly with the quantity of fuel in the tank, with a voltage regulator to keep the voltage at 12V.
To estimate the number of information states available to the fuel gauge, assume it has a pointer of length 5 cm and thickness 2 mm, with a full travel through a circular arc of 180 degrees.
Using pi = 3.14, this corresponds to an arc length of 15.7 cm, or 78.5 discrete positions of the pointer.
Let's call it approximately 80.
So again, assuming no a priori knowledge about the amount of fuel in the tank, each state is equiprobable and we have the Shannon entropy of the fuel gauge system proportional to
ln(80), or 4.38.
To estimate the energy difference of the fuel gauge between the empty and full state,
we note that from the auto.howstuffworks.com entry on GM fuel gauges, the resistance of the sending unit circuit varies from 2 ohms at empty, to 88 ohms at full. The voltage in the circuit is of course regulated 12 volts.
So, the current in the circuit to keep the gauge pointer to "full" is
I= 12/88 = 0.14 amps
And the power supplied the heating coil is then
P = VI = 12*0.14 = 1.68 watts
At "empty" the current in the circuit is I= 12/2 = 6 amps
and the power supplied is
P = VI = 72 watts
So the energy supply difference between the empty and full state for the fuel gauge is
about 70.3 watts, i.e., about 70.3 joules per second.
(If we knew the resistance of the coil and the thermal expansion coefficients of the
bimetal strip, we could do a more accurate computation, but we are just after an order of magnitude estimate here)
Hmmm.....Thank you, as you've got me stymied as to how any of this has anything to do with the discussion.
quote:
Now, on to entropy.
The ratio of Shannon entropy of the fuel gauge system to that of the LED system is:
Shannon entropy ratio = 4.38/0.693 = 6.32
Now what about the thermodynamic or Boltzmann entropy ?
We could compute the thermodynamic entropy change for both systems at their respective end states
from the grand canonical ensemble applied to the molecular structure of the system components, but this is overkill.
Computing the ratio of macroscopic thermal entropy change will be sufficient.
To estimate the ratio of thermodynamic entropy change from the end states of the LED system and
the end states of the fuel gauge, we note that , from the first and second laws,
entropy change will scale linearly with energy change, so the ratio of entropy will be proportional
to the ratio of energy.
In other words, dS (fuel gauge)/dS (LED) = dE(fuel gauge)/dE(LED)
or, Thermodynamic entropy ratio = 70.3/40e-03 = 1758.
Note that since we have taken ratios, any unit differences between the Shannon entropy and Boltzmann entropy ratios cancel.
To recap, the Shannon entropy ratio between the two information systems = 6.32
the Boltzmann entropy ratio between the two systems = 1758. So the change in thermodynamic entropy is orders of magnitude larger than the
change in informational or Shannon entropy.
Fine. But maybe you did not understand my point. I stated: "This page is only half right, as this depends on how we calculate Shannon entropy. If we want to calculate it thermodynamically, it's just a matter of adding Boltzmann's constant to the formula to come out in Joules/degree Kelvin."
IOW, he IS HALF RIGHT. And you have just shown an example of that. But not all situations deal with fuel gauges and LED lights. Entropy can be defined as negative information, shown above as proportional to -Sum over i of Pi log(Pi), where Pi is the probability of the ith event.
And here is where he is half wrong. If I shoot an electron into a box with three slits, that particle has a chance of going through 1 of 3 slits, hitting a piece of film at the back of each slit and recording the event on any one of 3 pieces of film. The probability of that electron hitting any one of the films is 1/3, or about .33. We can derive some statistical entropy out of this.
S = -Sum over i of (1/3) log10(1/3) = 3 x .159 = .477.
Cool. There is our Shannon entropy. But watch me mix Shannon entropy with Boltzmann entropy to come out with energy and heat, which is the very definition of thermodynamic entropy:
S = -k Sum over i of (1/3) log10(1/3),
S = 1.38 x 10^-23 J/K x (.477),
S = 6.58 x 10^-24 J/K.............With me? I just showed Shannon entropy thermodynamically.
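A minimal Python sketch of those three-slit numbers (note that the sum runs over all three equiprobable outcomes; a single term contributes only .159):

import math

k = 1.38e-23          # Boltzmann's constant, J/K
p = [1/3, 1/3, 1/3]   # electron equally likely to hit any of 3 films

H = -sum(pi * math.log10(pi) for pi in p)  # base-10 Shannon entropy
print(H)                                   # 0.477 (each of the 3 terms is 0.159)

print(k * H)   # ~6.58e-24 J/K, the post's "Shannon entropy with k inserted"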
quote:
1) "information" does not scale as I=mc**2, in contrast to the rather naive assertion that it did. Both systems are relatively simple linear circuits, and there is no way the difference in mass of the electrons passing through them corresponds to the different number of information macrostates. Not to mention the E=mc**2 relation applies to relativistic mechanics, not fuel gauges or coin ensembles or tea and sugar.
Very true. I was just getting Percy to think a bit deeper. But you probably need to think a bit differently too. You see, atoms are information. Atoms are matter. Matter can be turned into energy, and energy back into matter. I do this every day sitting right here in my office.
Consider our lowly hydrogen atom, basically composed of a single proton. This particle has a mass of about 1.672 x 10^-27 kg. Not much.
In one kilogram of water, the mass of the hydrogen atoms amounts to just slightly more than 111 grams, or 0.111 kg. E = mc^2 tells us the amount of energy this mass would be equivalent to if it were all suddenly turned into energy: multiply the mass by the square of the speed of light, about 300,000,000 meters per second:
E = 0.111 x 300,000,000 x 300,000,000 = about 10^16 Joules. This is some major energy!
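A quick Python check of that figure (0.111 kg of hydrogen and the rounded c = 3 x 10^8 m/s are the post's own numbers):

c = 3.0e8            # speed of light, m/s (rounded, as in the post)
m = 0.111            # kg of hydrogen in 1 kg of water (about 2/18 of the mass)
E = m * c**2         # mass-energy equivalence, E = mc^2
print(f"{E:.3g} J")  # ~9.99e15 J, i.e. about 10^16 joules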
My point with all of this is to be careful when running your mouth and shooting careless information at people around you, because it gives a new meaning to 'loose lips sink ships.'
quote:
2) Changes in Boltzmann entropy and Shannon entropy for different systems do not scale linearly, so doing things like adding them is meaningless, and they are not interchangeable.
Yep. This is true. This is why one must use the same math all the way through.
quote:
The system that provides more information when observed (the fuel gauge) has greater Shannon entropy than the system that provides less information when observed (the LED). This is the opposite of the interpretation you (Jerry) have been asserting.
Now I have certainly made some simplifying assumptions in doing this calculation, but none that, IMO, would affect the final outcome. If anyone disagrees, let them show their work.
I don't disagree. I just think we can go further.

Design Dynamics

This message is a reply to:
 Message 111 by paisano, posted 05-14-2005 8:38 PM paisano has replied

Replies to this message:
 Message 116 by paisano, posted 05-15-2005 9:29 AM Jerry Don Bauer has not replied

  
Mammuthus
Member (Idle past 6474 days)
Posts: 3085
From: Munich, Germany
Joined: 08-09-2002


Message 113 of 147 (208300)
05-15-2005 5:08 AM
Reply to: Message 99 by Jerry Don Bauer
05-14-2005 6:45 PM


quote:
You are no longer arguing science, just throwing out some provocative posts. Here's about the only thing I can answer in this one:
I will take this as an admission on your part that you are not familiar with any of the relevant experiments pertaining to evolutionary biology or the thousands of articles that describe them. This is clearly evident from your posts, but you have now indicated as well that you have no interest in learning any science and are more comfortable pretending that not knowing about science is evidence for your position. Pity...it would have been nice to encounter a so-called IDist who actually has an interest in the subject they debate...maybe we will get one here one day.
quote:
DNA must be designed by an intelligent agent or by code pre-programmed by an intelligent agent.
This is a statement of faith and not a scientific hypothesis. You can neither test for it nor falsify it. That you fail to realize this is why you will never be a scientist nor understand science in any form. You might regurgitate a few definitions here and there (though everything you have stated about genes and genomes has been incorrect), but you really do not understand science at all. That would be OK, but coupled with your lack of interest in actually learning how it works, it makes you a mere internet troll.
You can re-state your position as much as you like, you can misquote and mischaracterize current research as much as you like, you can ignore studies (which are available to anyone with an internet connection) as much as you like. None of it strengthens your position, and without a testable and falsifiable hypothesis for ID you will spin in circles endlessly...Your last two posts to me have been completely evasive, so I will let you go, as you are clearly in over your head with the molecular biology being discussed...have fun, and don't get too dizzy.

This message is a reply to:
 Message 99 by Jerry Don Bauer, posted 05-14-2005 6:45 PM Jerry Don Bauer has not replied

  
Parasomnium
Member
Posts: 2224
Joined: 07-15-2003


Message 114 of 147 (208308)
05-15-2005 6:59 AM
Reply to: Message 100 by Jerry Don Bauer
05-14-2005 6:58 PM


Calculating odds
{Retracted. I've noticed an enormous stupidity in my post. I'm working on a correction.}
This message has been edited by Parasomnium, 15-May-2005 12:06 PM

This message is a reply to:
 Message 100 by Jerry Don Bauer, posted 05-14-2005 6:58 PM Jerry Don Bauer has not replied

  
Percy
Member
Posts: 22388
From: New Hampshire
Joined: 12-23-2000
Member Rating: 5.2


Message 115 of 147 (208331)
05-15-2005 9:26 AM
Reply to: Message 109 by Jerry Don Bauer
05-14-2005 8:07 PM


Hi Jerry,
About your proof of "mathematically impossible", you had claimed in Message 56 that the creation of information found in organisms is "mathematically impossible". I challenged this as an unsupported assertion in Message 96, so you've now replied by calculating "how much information nature could have produced since the inception of the universe," which you figured to be 475 bits. Then you said, "The simplest organism contains many times this amount of information."
Question: How can the "simplest organism" contain "many times" the amount of information "produced since the inception of the universe"? In other words, how can a subset contain more of anything than the set of which it is a member?
Genetic algorithms are one example falsifying your claim that intelligence is required to create information, and another falsification can be illustrated with a simple example from biology. Consider a gene of a population of organisms that has two alleles, call them A and B. This gene is represented by a mere three nucleotides, and A and B are encoded like this:
A   ATC
B   ATG
The amount of information in this gene for the population (assuming equal probability for each allele) is -log2(1/2) = 1 bit.
Now let's say that during reproduction a mutation causes a new offspring of the population to have a mutation in this gene that produces a unique allele, call it C:
A   ATC
B   ATG
C   ACG
The amount of information in this gene for the population (again assuming equal probability for each allele) is now -log2(1/3) = 1.585 bits. The information within the population for this gene has increased from 1 to 1.585 bits, and no intelligence was involved.
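A minimal Python sketch of this example (the alleles and the equal-probability assumption are from the post):

import math

def allele_information(n_alleles):
    # self-information of one allele drawn uniformly from n alleles:
    # -log2(1/n) = log2(n) bits
    return -math.log2(1 / n_alleles)

print(allele_information(2))  # alleles A, B           -> 1.0 bit
print(allele_information(3))  # mutation adds allele C -> ~1.585 bits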
quote:
I don't really see much difference between a made up god and a made up alien from another universe.
So you think Hawking's musings that our universe was caused by a singularity event in the black hole of another universe deal with made-up aliens and gods? Interesting.
But we're talking about your musings, not Hawking's. The point is that there is no more evidence for aliens from another universe than there is for God or gods. You said, "I'm looking at this from the aspect of hard science," but it's hard to see how.
Information is NOT conceptual. If on is heads and off is tails and I have 3 coins that are heads, I have 2^3 = 8 possible states, and log2(8) = 3 bits of information. No subjective meaning there, just information expressed mathematically.
No, Jerry, information is conceptual. Conceptual does not mean subjective. You are confusing the matter or energy used to record and represent information with the information itself. Any entropic changes in the matter in which information is recorded have no effect on the information itself. For example, heat a book (but don't burn it) and the entropy of the matter making up the book will change, but the information entropy will not. Or heat the wires carrying data in your computer, as happens over the first half hour or so after you turn it on. Again, the entropy of the matter changes; the information does not.
It doesn't work this way. These programs are NOT creating information in the way the Darwinists think they are and this has been shown time and time again by others.
Keeping in mind that we're talking about genetic algorithms (your link was about evolution simulations, and anyway, links should only be offered to support your argument, not to make your argument), how about you showing it now?
--Percy
This message has been edited by Percy, 05-15-2005 09:28 AM

This message is a reply to:
 Message 109 by Jerry Don Bauer, posted 05-14-2005 8:07 PM Jerry Don Bauer has not replied

  
paisano
Member (Idle past 6422 days)
Posts: 459
From: USA
Joined: 05-07-2004


Message 116 of 147 (208332)
05-15-2005 9:29 AM
Reply to: Message 112 by Jerry Don Bauer
05-15-2005 2:35 AM


But we can play hard and fast with both of these, as many physicists do, and use Shannon's formula with Boltzmann's constant in it; then these two entropies become blurred, some even calling the result Shannon/Boltzmann entropy: S = -k Sum over i of Pi log(Pi)
Not quite. What you have here, with the constant, is in fact the generalized Boltzmann equation for thermodynamic entropy when the probabilities of the system being in each of its microstates are not known.
The form S = k ln W is a simplification for the case where the probabilities of the microstates are all equal, so Pi = 1/N.
Also, perhaps now you can see why you were wrong in your former assertion that Shannon entropy reduces down to Boltzmann's entropy.
I think you've misunderstood something. My point is that they are not equivalent, and you can't just compute one or the other by using or not using Boltzmann's constant.
If you are considering permutations of macroscopic quantities (like LED or fuel gauge states, or heads or tails for coin ensembles) you are computing Shannon entropy or "Shannon entropy multiplied by Boltzmann's constant" (which isn't really a useful quantity), in either case.
If you really want to compute Boltzmann entropy, doing the Shannon entropy and multiplying by k isn't the correct way to do it.
You need to deal with microstates. You need to consider the states at the molecular level. You need to compute a partition function to get at how the probabilities of the molecular level microstates are distributed. You then need to apply the canonical or grand canonical ensemble to get a sensible number for the Boltzmann entropy that properly tracks with all of the other thermodynamic variables of the system like Gibbs free energy, heat, etc.
You can also compute the entropy, if you know something about the macroscopic thermal properties of the system in terms of energy and heat, using the first and second laws.
The Boltzmann entropy computed by either of these methods should be consistent and should make sense in terms of macroscopic thermodynamic variables.
Hmmm.....Thank you, as you've got me stymied as to how any of this has anything to do with the discussion.
I'm doing an order of magnitude estimate. I'm trying to show how thermodynamic entropy scales for the two systems based on a macroscopic consideration of its energy changes.
It doesn't scale the same way as Shannon entropy, which is the point I am trying to make:
You can't compute Boltzmann entropy by computing Shannon entropy and then multiplying by k, or vice versa.
If you could do this, the ratios of Shannon entropy for the two systems, and the ratios of Boltzmann entropy, should come out equal, because in a ratio, the constants cancel.
The ratios don't come out equal, because the Shannon entropy is a macroscopic statistical quantity, and at the fundamental level the Boltzmann entropy is a microscopic statistical quantity, although it can be related to macroscopic thermal quantities like energy and heat.
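As a toy illustration of that last point, here is a minimal Python sketch (a hypothetical two-level system, not anything from the thread): the Boltzmann entropy comes from a partition function over molecular-level microstates, and it is not simply the Shannon entropy of the macroscopic readout multiplied by k:

import math

k = 1.38e-23   # Boltzmann's constant, J/K

def boltzmann_entropy(energies, T):
    # canonical ensemble: p_i = exp(-E_i/kT)/Z, S = -k * sum(p_i * ln p_i)
    weights = [math.exp(-E / (k * T)) for E in energies]
    Z = sum(weights)                  # partition function
    probs = [w / Z for w in weights]
    return -k * sum(p * math.log(p) for p in probs)

# toy two-level molecule: ground state 0 J, excited state 1e-21 J
print(boltzmann_entropy([0.0, 1e-21], T=300.0))  # ~9.5e-24 J/K
print(k * math.log(2))                           # ~9.6e-24 J/K: "Shannon times k"
# The two agree only in the limit of equiprobable microstates; change the
# level spacing or T and the Boltzmann entropy moves while k*ln(2) does not.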

This message is a reply to:
 Message 112 by Jerry Don Bauer, posted 05-15-2005 2:35 AM Jerry Don Bauer has not replied

  
Parasomnium
Member
Posts: 2224
Joined: 07-15-2003


Message 117 of 147 (208370)
05-15-2005 1:46 PM
Reply to: Message 100 by Jerry Don Bauer
05-14-2005 6:58 PM


Re: Of hammers and men
Jerry Don Bauer writes:
You can't just throw out 500 coins and reason that whatever pattern you get has the odds of 1/2^500 of occurring.
The chance of getting any one particular pattern is 1 in 2^500, whether you specify it in advance or not.
Jerry Don Bauer writes:
You have to pre-conceive the pattern as I did with 500 heads. Only THEN do the odds become 1/2^500 it will occur.
Some questions:
  • Are the chances of something happening dependent on it being preconceived?
  • How do we calculate the chances of something not preconceived?
  • What if we empty the bucket first and only then predict what is on the floor (before looking)?
  • What if we threw just one coin? Would you say (and I'm paraphrasing what you said before): "You can't just throw out one coin and reason that whatever outcome you get has the odds of 1/2 of occurring"?
Anyway, please note that I had already anticipated your notion of preconception and dealt with it:
Parasomnium writes:
Now, if you were to predict the exact configuration before emptying the bucket, I might grant you that to be right would be so infinitesimally improbable as to be well-nigh impossible.
That's where the problem lies. The improbability argument is always accompanied by the false picture of evolution working toward a predefined goal.
Either you forgot about that or you chose to ignore it. But I'd like an answer to it anyway.
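Here is a small Python simulation of the point at issue, with 10 coins instead of 500 so the match rate is observable (the 500-coin case is the same logic at probability 2^-500):

import random

N, TRIALS = 10, 200_000
target = tuple(random.choice("HT") for _ in range(N))  # a "preconceived" pattern

hits = sum(tuple(random.choice("HT") for _ in range(N)) == target
           for _ in range(TRIALS))

print(hits / TRIALS)  # ~1/2**10 = 0.000977, preconceived or not
print(2 ** -N)        # every specific pattern has this same probability
# The probability of getting *some* pattern is, of course, 1.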

We are all atheists about most of the gods that humanity has ever believed in. Some of us just go one god further. - Richard Dawkins

This message is a reply to:
 Message 100 by Jerry Don Bauer, posted 05-14-2005 6:58 PM Jerry Don Bauer has not replied

Replies to this message:
 Message 118 by Peter, posted 11-03-2005 11:21 AM Parasomnium has not replied

  
Peter
Member (Idle past 1478 days)
Posts: 2161
From: Cambridgeshire, UK.
Joined: 02-05-2002


Message 118 of 147 (256493)
11-03-2005 11:21 AM
Reply to: Message 117 by Parasomnium
05-15-2005 1:46 PM


Re: Of hammers and men
Sorry to butt in, but this has all been gone over before on this site.
Probability after the fact is meaningless. The probability of me having typed this is, well, not a probability at all. I already did.
You note that the prob. of 'any one' pattern is whatever -- by selecting one you are predefining the output.
The probability of getting A pattern is 1.

This message is a reply to:
 Message 117 by Parasomnium, posted 05-15-2005 1:46 PM Parasomnium has not replied

  
niniva
Inactive Member


Message 119 of 147 (262735)
11-23-2005 4:45 PM
Reply to: Message 109 by Jerry Don Bauer
05-14-2005 8:07 PM


quote:
It doesn't work this way. These programs are NOT creating information in the way the Darwinists think they are
GAs demonstrate that the evolutionary process can design - that is all that matters. Getting into semantics about "information" is just a cunning trick to sidestep the real issue.

This message is a reply to:
 Message 109 by Jerry Don Bauer, posted 05-14-2005 8:07 PM Jerry Don Bauer has not replied

  
eevans
Inactive Member


Message 120 of 147 (281872)
01-26-2006 11:45 PM


Intelligent Design Video
Greetings all,
I would like to compliment the posts offered up thus far regarding origins. Far too many forums concerned with similar topics are much less civil and fair-minded. As a lay enthusiast of the origins debate, I submit the following video created by Illustra Media. It contains some high-level (relatively general, yet interesting) material on the Intelligent Design movement. Your feedback would be greatly appreciated.
part1
http://www.kaneva.com/checkout/stream.aspx?assetId=2536&f...
part2
http://www.kaneva.com/checkout/stream.aspx?assetId=2538&f...

  