Author Topic:   How do "novel" features evolve?
PaulK
Member
Posts: 17822
Joined: 01-10-2003
Member Rating: 2.2


Message 227 of 314 (661606)
05-08-2012 4:34 PM
Reply to: Message 226 by zaius137
05-08-2012 4:19 PM


Re: creating "information" is either easy or irrelevant
quote:
The mathematics is good to a degree, but I disagree with the author's conclusion, as I show in making my point. I do not support the spontaneous introduction of new information. I maintain that very low probabilities do not happen when they exceed any reasonable probability bound.
Then you fail to understand even the role of specification in Dembski's argument. Low-probability events happen all the time. And where does the paper require any event below your probability bound?
quote:
Here is where you confuse me a bit. I have attended presentations by Meyer and do not see a conflict in his exposition of Specified Complexity. In fact, Dembski's formulation is actually based on Kolmogorov complexity...
No, it isn't. In fact, the lower the Kolmogorov complexity, the better the specification.

This message is a reply to:
 Message 226 by zaius137, posted 05-08-2012 4:19 PM zaius137 has replied

Replies to this message:
 Message 228 by zaius137, posted 05-08-2012 5:24 PM PaulK has replied

  
PaulK
Member
Posts: 17822
Joined: 01-10-2003
Member Rating: 2.2


Message 230 of 314 (661613)
05-08-2012 5:40 PM
Reply to: Message 228 by zaius137
05-08-2012 5:24 PM


Re: creating "information" is either easy or irrelevant
quote:
Show me where specified events of low probability happen all the time. I believe very small probabilities are not comprehended because they are never encountered in our everyday lives.
That's a nice bit of equivocation there. As I said, low-probability events happen all the time, especially when you consider sequences of smaller events (as Dembski does). It is the specification that is vitally important, and you cannot ignore it.
Now, how about your reason for rejecting the conclusions of the paper? You claim that it requires an event with a probability below your probability bound, presumably a specified event. What is that event? What is your probability bound? And how do you know that the event's probability is below it?
quote:
Entropy only specifies the number of bits required to encode information (Shannon entropy). Hence, a lower number (lower entropy) implies more organization; fewer bits are required, thus lower entropy. Entropy 101
Kolmogorov complexity isn't the same as entropy, but it is a closely related concept. The more random a sequence, the greater the Kolmogorov complexity.
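Kolmogorov complexity is uncomputable in general, but compressed size gives a rough, computable proxy for it (an upper bound on description length). Here is a minimal sketch of that idea; the zlib-based proxy is my own illustration, not anything from Dembski or the paper under discussion.
code:
import random
import zlib

def compressed_size(data: bytes) -> int:
    # zlib-compressed length: a crude, computable stand-in for
    # Kolmogorov complexity (an upper bound on description length)
    return len(zlib.compress(data, 9))

ordered = b"ab" * 5000                                      # highly regular
noise = bytes(random.randrange(256) for _ in range(10000))  # random bytes
print(compressed_size(ordered))  # small: the pattern has a short description
print(compressed_size(noise))    # near 10000: essentially incompressible
The regular sequence compresses to a few dozen bytes while the random one barely compresses at all, matching the point above: the more random the sequence, the greater the Kolmogorov complexity.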

This message is a reply to:
 Message 228 by zaius137, posted 05-08-2012 5:24 PM zaius137 has not replied

  
PaulK
Member
Posts: 17822
Joined: 01-10-2003
Member Rating: 2.2


Message 253 of 314 (662077)
05-12-2012 5:55 AM
Reply to: Message 252 by zaius137
05-12-2012 2:57 AM


Re: Information
I'm pretty convinced that you don't understand the subject.
quote:
The uncertainties or probabilities directly produce the resulting entropy (Shannon entropy). For instance, a fair coin toss has an entropy of one (to transmit the outcome of a fair coin toss you need one bit). An unfair coin toss (say 70% heads and 30% tails) has an entropy of about 0.88. The entropy is less because the outcome is more certain. A perfectly predictable outcome has the lowest entropy. As in cybernetics, information can reduce entropy. Consequently, I am implying an inverse relationship between information and entropy.
The information carried by each bit is less because the outcome is more certain. The relationship doesn't seem to be inverse at all: the greater the entropy of the source, the more information is carried by each bit. (This makes sense when you consider compression.)
Wikipedia
Shannon's entropy measures the information contained in a message
In cybernetics, information may be USED to reduce the entropy of a system, where entropy is considered as disorganisation, but that is not the same thing at all.
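For concreteness, the coin figures quoted above are easy to check directly from the standard formula H = -Σ p·log2(p). A minimal sketch (my own illustration):
code:
import math

def shannon_entropy(probs):
    # H = -sum(p * log2(p)) in bits, over outcomes with nonzero probability
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin       -> 1.0 bit
print(shannon_entropy([0.7, 0.3]))  # 70/30 coin      -> ~0.881 bits
print(shannon_entropy([1.0]))       # certain outcome -> 0.0 bits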
quote:
This relationship, as I encountered it, is presented in a book by A.E. Wilder-Smith and is an ultimate test for an intelligent designer.
I suspect that that is the source of your confusion. Creationists do not have a good record of dealing with Information Theory.

This message is a reply to:
 Message 252 by zaius137, posted 05-12-2012 2:57 AM zaius137 has not replied

  
PaulK
Member
Posts: 17822
Joined: 01-10-2003
Member Rating: 2.2


Message 258 of 314 (662253)
05-14-2012 2:23 AM
Reply to: Message 257 by zaius137
05-14-2012 12:59 AM


Re: Information
quote:
I think we are viewing the same elephant from different angles. When you say the randomness of the system has declined, I say the innate information of the system has increased. Yes, then the number of bits needed to quantify the system would decrease. As the information of the system increases, the entropy decreases (it is an inverse relationship from that perspective).
I don't think that that is true at all.
If I understand correctly, your claim is that the less information in the message, the more information in the source of the message ("the system"). It certainly makes no sense to say that the information in the message goes up as the information in the message declines!
But this seems obviously false. A system that is only capable of producing one message can be very simple, simpler than a system which produces two distinct messages. How, then, can we say that the first system has more information in it than the second?
I would argue that the important distinction is between meaningful messages and random noise. But assuming the production of meaningful messages, we come back to the relationship that higher entropy = more information. Shannon information does not deal with the issue of meaning, so it seems that the entropy of the signal is the only useful measure of information that it has to offer.
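The one-message versus two-message comparison can be made concrete with the standard result that a source emitting k equally likely messages has entropy log2(k). A minimal sketch (my own illustration):
code:
import math

def source_entropy(num_messages: int) -> float:
    # Entropy in bits of a source whose messages are all equally likely
    return math.log2(num_messages)

print(source_entropy(1))   # one possible message  -> 0.0 bits
print(source_entropy(2))   # two distinct messages -> 1.0 bit
print(source_entropy(26))  # 26 distinct messages  -> ~4.7 bits
On this measure the one-message system, far from containing the most information, is the only one that can convey none.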

This message is a reply to:
 Message 257 by zaius137, posted 05-14-2012 12:59 AM zaius137 has not replied

  
PaulK
Member
Posts: 17822
Joined: 01-10-2003
Member Rating: 2.2


Message 263 of 314 (662310)
05-14-2012 3:42 PM
Reply to: Message 262 by zaius137
05-14-2012 3:20 PM


Re: Information
quote:
I still think that you are confusing the information in the data source with the method and result of Shannon entropy (the amount of information needed to transmit that information). Remember, I said the entire exercise of using Shannon entropy was to expose a system containing innate information to the power of statistics.
I would say that the confusion is on your part. Complex communication requires high entropy. You say that LOW entropy is a measure of innate information. But that would mean that the INABILITY to communicate complex information would indicate the presence of complex information! That's absurd.
quote:
The entire validity of using Shannon entropy rests on how you define the probability.
Usually it is defined in terms of the predictability of the next term in the sequence, given the previous terms and the structure of the messages. That was the basis used in calculating the entropy of English, and it is useful (very useful).
quote:
In your book example, scrambling the letters of the entire book will not change the entropy if your probability model is broad enough.
Actually, it would increase the entropy on the basis I suggest above. For instance, in English there is a high probability of 'u' following 'q'; you would lose that if you scrambled the letters.
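To make that concrete, here is a minimal sketch that estimates H(next letter | current letter) from bigram counts; the file name book.txt is hypothetical, and the whole thing is my own illustration of the basis described above.
code:
import math
import random
from collections import Counter

def conditional_entropy(text: str) -> float:
    # H(next | current) in bits, estimated from bigram frequencies
    bigrams = Counter(zip(text, text[1:]))
    firsts = Counter(text[:-1])
    total = sum(bigrams.values())
    h = 0.0
    for (a, b), n in bigrams.items():
        p_pair = n / total       # joint probability of the pair (a, b)
        p_next = n / firsts[a]   # probability of b given a
        h -= p_pair * math.log2(p_next)
    return h

text = open("book.txt").read().lower()               # hypothetical input text
scrambled = "".join(random.sample(text, len(text)))  # same letters, no order
print(conditional_entropy(text))       # lower: e.g. 'q' strongly predicts 'u'
print(conditional_entropy(scrambled))  # higher: bigram structure destroyed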
But where is your measure of probability that supports your idea that low entropy means high information? A source that can only give one message has zero entropy by any reasonable standard. How does that indicate a high information content?

This message is a reply to:
 Message 262 by zaius137, posted 05-14-2012 3:20 PM zaius137 has not replied

  
PaulK
Member
Posts: 17822
Joined: 01-10-2003
Member Rating: 2.2


Message 267 of 314 (662370)
05-15-2012 4:40 AM
Reply to: Message 266 by zaius137
05-15-2012 3:13 AM


Re: Information
So you are literally claiming that the less information in the message, the more information in the message.
How can you not notice the contradiction?

This message is a reply to:
 Message 266 by zaius137, posted 05-15-2012 3:13 AM zaius137 has not replied

  
PaulK
Member
Posts: 17822
Joined: 01-10-2003
Member Rating: 2.2


Message 271 of 314 (662418)
05-15-2012 2:06 PM
Reply to: Message 270 by zaius137
05-15-2012 1:48 PM


Re: Information
So a biased coin produces less information. And a two-headed coin would produce no information...
Now, are you ever going to actually explain your point?

This message is a reply to:
 Message 270 by zaius137, posted 05-15-2012 1:48 PM zaius137 has replied

Replies to this message:
 Message 274 by zaius137, posted 05-16-2012 12:29 AM PaulK has replied

  
PaulK
Member
Posts: 17822
Joined: 01-10-2003
Member Rating: 2.2


Message 276 of 314 (662475)
05-16-2012 1:37 AM
Reply to: Message 274 by zaius137
05-16-2012 12:29 AM


Re: Information
quote:
I am not a mathematician by any sense of the word, but I believe it is not that the unfair coin possesses less information; my understanding is that the unfair coin is more predictable.
Then you are really confused. You are confusing the coin with the act of tossing it, and you are confusing pure chance with communication. And even worse, you are confusing predictability with information. It seems very clear to me that you have an idea in your head but you haven't thought it out, or even considered the points I've raised.
quote:
Therefore, it takes less information to transmit the result of the toss. I have received some criticism that I am using the term information incorrectly when I refer to the message possessing more information. This criticism may be justified, so I have been using the term negentropy instead.
And why is this "negentropy" important? Please tell me what it is about a two-headed coin that makes it more complex or more ordered than a normal coin in any significant way. For instance, if we simply consider the coins themselves, shouldn't a normal coin, with its differing designs on obverse and reverse, be considered as offering more information than the two-headed coin rather than less?
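For what it's worth, one common discrete definition of negentropy is the shortfall from maximum entropy, J = log2(k) - H for a source with k outcomes. A minimal sketch under that assumption (my own illustration, not Wilder-Smith's formulation):
code:
import math

def shannon_entropy(probs):
    # H = -sum(p * log2(p)) in bits, skipping zero-probability outcomes
    return -sum(p * math.log2(p) for p in probs if p > 0)

def negentropy(probs):
    # Shortfall from the maximum entropy of a source with the same
    # number of outcomes: J = log2(k) - H
    return math.log2(len(probs)) - shannon_entropy(probs)

print(negentropy([0.5, 0.5]))  # fair coin       -> 0.0
print(negentropy([0.7, 0.3]))  # biased coin     -> ~0.119
print(negentropy([1.0, 0.0]))  # two-headed coin -> 1.0
On this definition the two-headed coin scores highest simply by being perfectly predictable, which only restates the problem: predictability is not the same thing as information.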

This message is a reply to:
 Message 274 by zaius137, posted 05-16-2012 12:29 AM zaius137 has not replied

  