Topic: How do "novel" features evolve?
PaulK:
quote: Then you fail to understand even the role of specification in Dembski's argument. Low-probability events happen all the time. And where in the paper does it require any event below your probability bound?
quote: No, it isn't. In fact, the lower the Kolmogorov complexity, the better the specification.
PaulK:
quote: That's a nice bit of equivocation there. As I said, low-probability events happen all the time - especially when you consider sequences of smaller events (as Dembski does). It's the specification that is vitally important, and you cannot ignore it.

Now, how about your reason for rejecting the conclusions of the paper? You claim that it requires an event with a probability below your probability bound - presumably a specified event. What is it? What is your probability bound? And how do you know that it is below your probability bound?
quote: Kolmogorov complexity isn't the same as entropy, but it is a closely related concept. The more random a sequence, the greater the Kolmogorov complexity.
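To make that concrete: true Kolmogorov complexity is uncomputable, but compressed size gives a rough upper bound on it. A minimal Python sketch of my own (illustrative only, not taken from the paper under discussion):

    import random
    import zlib

    def compressed_size(s: str) -> int:
        # Length of the zlib-compressed string: a crude upper bound
        # on Kolmogorov complexity, which is itself uncomputable.
        return len(zlib.compress(s.encode()))

    regular = "ab" * 500  # highly patterned: describable by a very short program
    random.seed(0)
    scrambled = "".join(random.choice("ab") for _ in range(1000))  # no pattern to exploit

    print(compressed_size(regular))    # small: the repetition compresses away
    print(compressed_size(scrambled))  # much larger: randomness resists compression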
PaulK:
I'm pretty convinced that you don't understand the subject.
quote: The information carried by the bit is less because the outcome is more certain. The relationship doesn't seem to be inverse at all: the greater the entropy of the source, the more information is carried by each bit. (This makes sense when you consider compression.)
As Wikipedia puts it: "Shannon's entropy measures the information contained in a message."
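The numbers bear this out. A quick check with Shannon's formula, H = -Σ p·log2(p) (my own illustration, in Python):

    from math import log2

    def entropy_bits(probs):
        # Shannon entropy H = -sum(p * log2 p), in bits per symbol;
        # outcomes with probability zero contribute nothing.
        return -sum(p * log2(p) for p in probs if p > 0)

    # A source using its four symbols uniformly has maximum entropy...
    print(entropy_bits([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits per symbol

    # ...while a heavily skewed source carries much less per symbol
    # (which is exactly why its output compresses so well).
    print(entropy_bits([0.97, 0.01, 0.01, 0.01]))  # about 0.24 bits per symbol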
In cybernetics, information may be USED to reduce the entropy of a system (where entropy is considered as disorganisation), but that is not the same thing at all.
quote: I suspect that that is the source of your confusion. Creationists do not have a good record of dealing with Information Theory.
PaulK:
quote: I don't think that that is true at all. If I understand correctly, your claim is that the less information in the message, the more information in the source of the message ("the system"). It certainly makes no sense to say that the information in the message goes up as the information in the message declines!

But this seems obviously false. A system that is only capable of producing one message can be very simple - simpler than a system which produces two distinct messages. How can we then say that the first system has more information in it than the second?

I would argue that the important distinction is between meaningful messages and random noise. But assuming the production of meaningful messages, we come back to the relationship that higher entropy = more information. Shannon information does not deal with the issue of meaning, so the entropy of the signal is the only useful measure of information that it has to offer.
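To put numbers on it: a source with N equally likely messages yields log2(N) bits per message. The one-message system gives log2(1) = 0 bits, the two-message system log2(2) = 1 bit - the simpler system carries less information, not more.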
PaulK:
quote: I would say that the confusion is on your part. Complex communication requires high entropy. You say that LOW entropy is a measure of innate information. But that would mean that the INABILITY to communicate complex information would indicate the presence of complex information! That's absurd.
quote: Usually it's defined on the basis of the predictability of the next term in the sequence, given the previous terms and the structure of the messages. That was the basis used in calculating the entropy of English. That's useful (very useful).
quote: Actually, it would increase it, using the basis I suggest above. For instance, in English there is a high probability of 'u' following 'q'. You'd lose that if you scrambled the letters. But where's your measure of probability that supports your idea that low entropy = high information? A source that can only give one message has zero entropy by any reasonable standard. But how does that indicate a high information content?
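This is easy to check. The Python sketch below (my own; sample_english.txt stands in for any plain English text file) estimates both the single-letter entropy and the conditional entropy of the next letter given the previous one. Scrambling leaves the letter frequencies - and so the single-letter figure - untouched, but destroys the 'q'-then-'u' structure and pushes the conditional entropy up:

    import random
    from collections import Counter
    from math import log2

    def unigram_entropy(text):
        # H(X): entropy of single letters, from their frequencies.
        counts = Counter(text)
        n = len(text)
        return -sum(c / n * log2(c / n) for c in counts.values())

    def conditional_entropy(text):
        # H(X_n | X_{n-1}): uncertainty about the next letter given
        # the one before it - the basis described above.
        pairs = Counter(zip(text, text[1:]))
        firsts = Counter(text[:-1])
        total = len(text) - 1
        return -sum(c / total * log2(c / firsts[a])
                    for (a, b), c in pairs.items())

    text = open("sample_english.txt").read().lower()  # hypothetical corpus file
    scrambled = "".join(random.sample(text, len(text)))

    print(unigram_entropy(text), unigram_entropy(scrambled))  # identical
    print(conditional_entropy(text))       # lower: 'q' almost always predicts 'u'
    print(conditional_entropy(scrambled))  # higher: the structure is gone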
PaulK:
So you are literally claiming that the less information in the message, the more information in the message.
How can you not notice the contradiction?
PaulK:
So a biased coin produces less information. And a two-headed coin would produce no information...
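That is just Shannon's formula at work: for a coin with p(heads) = p, H = -(p·log2(p) + (1-p)·log2(1-p)). A fair coin gives 1 bit per toss, a 90/10 biased coin about 0.47 bits, and a two-headed coin (p = 1) exactly 0 bits.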
Now, are you ever going to actually explain your point?
PaulK:
quote: Then you are really confused. You are confusing the coin with the act of tossing it, and you are confusing pure chance with communication. And even worse, you are confusing predictability with information. It seems very clear to me that you have an idea in your head but you haven't thought it out - or even considered the points I've raised.
quote: And why is this "negentropy" important? Please tell me: what is it about a two-headed coin that makes it more complex or more ordered than a normal coin in any significant way? For instance, if we simply consider the coins themselves, shouldn't a normal coin, with its differing designs on obverse and reverse, be considered as offering more information than the two-headed coin rather than less?