Author Topic:   How do "novel" features evolve?
Percy
Member
Posts: 22388
From: New Hampshire
Joined: 12-23-2000
Member Rating: 5.2


Message 154 of 314 (660125)
04-21-2012 7:25 AM
Reply to: Message 147 by intellen
04-21-2012 2:49 AM


Re: how populations evolve - when is it "novel"?
intellen writes:
Jefferinoopolis writes:
Due to this change in environment the bugblatter beast’s favorite game virtually disappears. It fails to adapt to the changes.
1. Ok, I don't know why those population of bugblatter had become "It fails to adapt to the changes." in your post. Why they failed? They had feet, right? They had instinct to protect their lives, right? So, why they failed? Did you never think about it?
It was not the bugblatter that Jefferinoopolis described as failing to adapt to the environmental changes, but rather the bugblatter beast's favorite game. Jefferinoopolis is posing a scenario where the bugblatter population faces a crisis because the food it normally eats for survival is no longer available.
3. They will change but they will never become two different species. Since species is defined as any organism that can mate and reproduce. Maybe, they will never mate themselves at first since they had the instinct of "territorial supremacy" to be protected when the two separated group meet. But no, evolution will never kicks in and there will never be no new species.
Well, yes, we already understand that this is your position. What we need to understand is why, as the bugblatter population responds to changed environmental conditions and becomes smaller and thinner along with a host of related morphological changes, you believe that it could never change to the point where it could no longer mate with the original unchanged population still living on the western island.
--Percy

This message is a reply to:
 Message 147 by intellen, posted 04-21-2012 2:49 AM intellen has not replied

  
Percy
Member
Posts: 22388
From: New Hampshire
Joined: 12-23-2000
Member Rating: 5.2


Message 251 of 314 (661953)
05-11-2012 6:52 AM
Reply to: Message 247 by zaius137
05-10-2012 6:36 PM


Re: Information
zaius137 writes:
Uncertainty goes up implied information goes down.
I'm not sure what you mean by the modifier "implied", but unless it has special significance I think you've got the relationship exactly backwards. The most information is exchanged when uncertainty about the state of the next bit to be communicated is greatest.
--Percy

This message is a reply to:
 Message 247 by zaius137, posted 05-10-2012 6:36 PM zaius137 has replied

Replies to this message:
 Message 252 by zaius137, posted 05-12-2012 2:57 AM Percy has replied

  
Percy
Member
Posts: 22388
From: New Hampshire
Joined: 12-23-2000
Member Rating: 5.2


Message 254 of 314 (662082)
05-12-2012 7:38 AM
Reply to: Message 252 by zaius137
05-12-2012 2:57 AM


Re: Information
zaius137 writes:
The uncertainties or probabilities directly produce resulting entropy (Shannon entropy). For instance, a fair coin toss has entropy of one (to transmit the outcome of a fair coin toss you need one bit). An unfair coin toss (say 70% heads and 30% of the time its tails) has entropy of about (.88). The entropy is less because the outcome is more certain. A perfectly predictable outcome has lowest entropy. As in cybernetics, information can reduce entropy. Consequently, I am implying an inverse relationship between information and entropy.
You described it accurately, but your conclusion is backwards. The lower the entropy the less information can be communicated.
Take your example of, "A perfectly predictable outcome has lowest entropy." A two-headed coin has a "perfectly predictable outcome" and therefore has the lowest possible entropy. When you flip the coin I don't need you to tell me the result because I already know. This means you have communicated no information when you tell me it came up "heads," because I already knew that.
Entropy and information have a positive, not an inverse, relationship.
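Here's a minimal Python sketch of those three coins, assuming only the standard library; the shannon_entropy helper name is just for illustration:

from math import log2

def shannon_entropy(probs):
    # H = sum over outcomes of p * -log2(p), skipping outcomes with probability 0
    return sum(p * -log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # fair coin        -> 1.0 bit
print(shannon_entropy([0.7, 0.3]))   # 70/30 coin       -> ~0.881 bits
print(shannon_entropy([1.0, 0.0]))   # two-headed coin  -> 0.0 bits (nothing left to learn)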
--Percy

This message is a reply to:
 Message 252 by zaius137, posted 05-12-2012 2:57 AM zaius137 has replied

Replies to this message:
 Message 255 by zaius137, posted 05-12-2012 1:22 PM Percy has replied

  
Percy
Member
Posts: 22388
From: New Hampshire
Joined: 12-23-2000
Member Rating: 5.2


Message 256 of 314 (662184)
05-13-2012 7:57 AM
Reply to: Message 255 by zaius137
05-12-2012 1:22 PM


Re: Information
Hi Zaius,
Your quoted definition is telling you the same thing we've already been telling you.
Your inability to understand these definitions and explanations of Shannon entropy stems from your assumption of an inverse relationship between randomness and information content. You think that the greater the randomness the less the information content. In fact the reverse is true.
An informally stated law of information theory is that you can't tell someone something he already knows. For example, if I already know it's lunchtime, then you communicate no information by telling me it's lunchtime.
If you want to know the exact positions of all the atoms in a fixed amount of gas in a square container at a point in time, then because the atoms are randomly positioned it would take a great many bits to communicate this information. The entropy is very high and so the amount of information needed to communicate all this random information is also very high.
But if you want to know the exact positions of the same number of atoms in a square crystal all you need to communicate is the position of one corner and the orientation. This is because the crystal has a regular non-random structure. Its entropy is very low, and the information communicated by sending the position of each and every atom is redundant and unnecessary, communicating little that was not already known.
More generally, if I already know the next bit is going to be a 1, then you have communicated no information when you tell me the next bit is 1. The information content is 0 bits and the entropy is 0 bits.
But if I have no idea whether the next bit will be 1 or 0 and you tell me it is 1, then the information content and the entropy is 1 bit.
If you're communicating the letters of words then the probability of the next letter depends upon the previous letter. If the previous letter is "b" and there are 9 letters that can follow "b", then I have a 1/9 chance of guessing the next letter, and so the entropy is high and the amount of information communicated is high when I receive the next letter.
Or say the previous letter was "z" and there are only 6 letters that can follow "z", then I have a 1/6 chance of guessing the next letter, and so the entropy is lower and the amount of information communicated is lower when I receive the next letter.
But if the previous letter is "q" then there is only one letter that can follow "q", and I have a 1/1 chance of guessing that next letter correctly. The entropy is 0 and the amount of information communicated when I receive "u" as the next letter is also 0.
As my odds of guessing the next letter have risen from 1/9 to 1/6 to 1/1, in other words as the randomness has declined, the entropy of each next letter and the information communicated has also declined. It is a direct relationship.
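A short sketch of that letter-guessing arithmetic, assuming for illustration that each allowed next letter is equally likely (real letter frequencies are not uniform, but the direction of the relationship is the same):

from math import log2

# number of letters assumed able to follow each previous letter, per the example above
for prev, n_choices in [("b", 9), ("z", 6), ("q", 1)]:
    bits = log2(n_choices)  # entropy of the next letter when all choices are equally likely
    print(f"after '{prev}': {n_choices} possible letters -> {bits:.2f} bits")
# prints 3.17, 2.58 and 0.00 bits: as guessing gets easier, entropy and information fall together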
--Percy
Edited by Percy, : Typo.

This message is a reply to:
 Message 255 by zaius137, posted 05-12-2012 1:22 PM zaius137 has replied

Replies to this message:
 Message 257 by zaius137, posted 05-14-2012 12:59 AM Percy has replied

  
Percy
Member
Posts: 22388
From: New Hampshire
Joined: 12-23-2000
Member Rating: 5.2


Message 261 of 314 (662271)
05-14-2012 9:00 AM
Reply to: Message 257 by zaius137
05-14-2012 12:59 AM


Re: Information
zaius137 writes:
I believe I can say we agree here
Except for the fact that your understanding is backwards, sure, we agree. Why don't you respond to the very specific examples I provided? They reveal how precisely backwards your understanding is. Shannon entropy is a measure of the predictability of the next bit. As that predictability declines the entropy increases and the amount of information also increases.
Here's an example of how you're thinking about information. We have a book on our computer that contains information. We run the book through a program that randomly scrambles all the characters. You think the book now has less information, and that's where you've gone wrong.
The fact of the matter is that the book now has more information than it had before because we're less able to predict the next character. For example, if I saw the letter "q" in the original book I would know that the next letter was "u". When I find out that the next letter is "u" I haven't learned anything. No information has been communicated.
But if I saw the letter "q" in the scrambled book I would have no idea what the next letter could be. When I find out the next letter is "f" I have learned something I could not possibly have known. Information has definitely been communicated.
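One rough way to see this effect on a computer is with a general-purpose compressor, since compressors exploit predictability. This is only a practical proxy for entropy, not a measurement of it, and the sample text below is arbitrary:

import random
import zlib

text = ("the quick brown fox jumps over the lazy dog. " * 200).encode()
scrambled = bytes(random.sample(text, len(text)))  # same characters, random order

print(len(zlib.compress(text)))       # small: the repetitive original is highly predictable
print(len(zlib.compress(scrambled)))  # much larger: scrambling destroys the predictability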
Your original point was that "creationists like Myers" have defined "information in the genome", but you have as yet offered no evidence whatsoever of this, and the fact that you yourself misunderstand information underscores this point.
--Percy

This message is a reply to:
 Message 257 by zaius137, posted 05-14-2012 12:59 AM zaius137 has replied

Replies to this message:
 Message 262 by zaius137, posted 05-14-2012 3:20 PM Percy has replied

  
Percy
Member
Posts: 22388
From: New Hampshire
Joined: 12-23-2000
Member Rating: 5.2


Message 264 of 314 (662318)
05-14-2012 4:36 PM
Reply to: Message 262 by zaius137
05-14-2012 3:20 PM


Re: Information
Hi Zaius,
I think you're confusing yourself with your own jargon, and you're drawing a distinction that doesn't exist between information content and information communication. This sentence from the Wikipedia article on Information Theory clearly indicates a direct relationship between information and entropy:
Wikipedia writes:
For example, specifying the outcome of a fair coin flip (two equally likely outcomes) provides less information (lower entropy) than specifying the outcome from a roll of a die (six equally likely outcomes).
Did you get that? Less information == lower entropy.
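As a quick check of the quoted comparison, assuming a fair coin and a fair six-sided die:

from math import log2

print(log2(2))  # fair coin: 1.0 bit
print(log2(6))  # fair die:  ~2.58 bits -- more information and higher entropy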
--Percy

This message is a reply to:
 Message 262 by zaius137, posted 05-14-2012 3:20 PM zaius137 has replied

Replies to this message:
 Message 266 by zaius137, posted 05-15-2012 3:13 AM Percy has replied

  
Percy
Member
Posts: 22388
From: New Hampshire
Joined: 12-23-2000
Member Rating: 5.2


Message 269 of 314 (662390)
05-15-2012 9:12 AM
Reply to: Message 266 by zaius137
05-15-2012 3:13 AM


Re: Information
Hi Zaius,
Let's dispense with this misunderstanding first:
zaius137 writes:
Remember I acknowledge that the information in the message is independent of the amount of information that is required to transmit the message.
The amount of information required to transmit some information content over a lossless channel is equal to the amount of information content. If you have a 2 megabyte disk file on your computer then it will take 2 megabytes of information to transmit that file over a lossless channel.
In reality information content and information transmitted are the same thing. All the same concepts of information measures and entropy and so forth apply to both. They're just slightly different perspectives of the same thing. If you have a book then the measures of its information and entropy are the same whether it's sitting on your hard drive or being transmitted over the Internet.
Now let's dispense with this misunderstanding:
zaius137 writes:
The comparison here is between a dice throw and a coin flip. The coin flip needs only a one-bit transmission to convey the message. Whereas the dice roll takes 2.6 bits of transmission to convey the message portion. The provides less information only refers to the transmission information.
If your message set size is 2, let's say the message set is {0, 1}, and each message has equal probability (.5), then the amount of information conveyed by sending one message is 1 bit:
log2(2) = 1 bit
The entropy of the information (as given by the Wikipedia article on Information Theory) is:
H = - Σi p(xi) log2(p(xi))
Plugging in the values we see that the entropy is (I'll be using log2):
- ((.5)(log2(.5)) + (.5)(log2(.5)))
= - ((.5)(-1) + (.5)(-1))
= - ((-.5) + (-.5))
= - (-1)
= 1
Now let's say your message set size is 4 and that the message set is {00, 01, 10, 11}, and each message again has equal probability (.25), then the amount of information conveyed by sending one message is 2 bits:
log2(4) = 2 bits
Plugging in the values to our entropy equation:
- ((.25)(log2(.25)) + (.25)(log2(.25)) + (.25)(log2(.25)) + (.25)(log2(.25)))
= - ((.25)(-2) + (.25)(-2) + (.25)(-2) + (.25)(-2))
= - ((-.5) + (-.5) + (-.5) + (-.5))
= - (-2)
= 2
When the information content was 1 bit per message then the entropy was 1. When the information content was 2 bits per message then the entropy was 2. See how the entropy is increasing with the information content?
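The same two calculations as a small Python sketch, using the entropy formula directly:

from math import log2

def entropy(probs):
    return sum(p * -log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))                # message set {0, 1}           -> 1.0 bit
print(entropy([0.25, 0.25, 0.25, 0.25]))  # message set {00, 01, 10, 11} -> 2.0 bits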
--Percy
Edited by Percy, : Fix typo in one of the equations.
Edited by Percy, : Putzed with the equations a little bit to improve readability.

This message is a reply to:
 Message 266 by zaius137, posted 05-15-2012 3:13 AM zaius137 has replied

Replies to this message:
 Message 270 by zaius137, posted 05-15-2012 1:48 PM Percy has replied
 Message 273 by NoNukes, posted 05-15-2012 10:37 PM Percy has replied

  
Percy
Member
Posts: 22388
From: New Hampshire
Joined: 12-23-2000
Member Rating: 5.2


Message 272 of 314 (662441)
05-15-2012 9:29 PM
Reply to: Message 270 by zaius137
05-15-2012 1:48 PM


Re: Information
Hi Zaius,
Your position is that entropy and information have an inverse relationship. I showed that for equiprobable messages that this is precisely backwards for message sets of any size. How in your imagination does your example of a 70/30 2-message set contradict this? You didn't even attempt to draw any comparisons to other message set sizes with non-equiprobable distributions.
If you keep other things roughly equal with similar probability distributions then you'll find that for message sets of any size entropy and information have a direct relationship. Here are some examples:
Probabilities              Information (bits)   Entropy
{.5, .5}                   1                    1
{.25, .5, .25}             1.6                  1.5
{.1, .4, .4, .1}           2                    1.7
{.1, .2, .4, .2, .1}       2.3                  2.1
This table just goes from 1 bit to 1.6 bits to 2 bits to 2.3 bits, and the direct relationship between information and entropy is clear. If I instead went from 1 bit to 10 bits to 100 bits to 1000 bits the relationship would become even more clear, as if that's necessary.
Why don't you put together your own table showing us the incredibly different probability distributions you have to use before you can get a consistently declining entropy with increasing message set size?
AbE: Just for the heck of it, the entropy for an equiprobable message set of size 10 is 3.3, for size 100 is 6.6, for size 1000 is 10.0, for size 10,000 is 13.2, for size 100,000 is 16.6, for 1,000,000 is 20.0, etc., etc., etc. Get the idea?
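A sketch that recomputes the table and the AbE figures above; the probability lists are the ones in the table, and the equiprobable entropies come out as log2(n):

from math import log2

def entropy(probs):
    return sum(p * -log2(p) for p in probs if p > 0)

for probs in ([.5, .5], [.25, .5, .25], [.1, .4, .4, .1], [.1, .2, .4, .2, .1]):
    print(probs, round(log2(len(probs)), 1), round(entropy(probs), 1))
# reproduces the table: (1, 1), (1.6, 1.5), (2, 1.7), (2.3, 2.1)

for n in (10, 100, 1000, 10_000, 100_000, 1_000_000):
    print(n, round(entropy([1 / n] * n), 1))  # 3.3, 6.6, 10.0, 13.3, 16.6, 19.9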
--Percy
Edited by Percy, : AbE.

This message is a reply to:
 Message 270 by zaius137, posted 05-15-2012 1:48 PM zaius137 has not replied

  
Percy
Member
Posts: 22388
From: New Hampshire
Joined: 12-23-2000
Member Rating: 5.2


Message 278 of 314 (662497)
05-16-2012 7:31 AM
Reply to: Message 274 by zaius137
05-16-2012 12:29 AM


Re: Information
zaius137 writes:
I am not a mathematician by any sense of the word but believe that it is not that the unfair coin posses less information but my understanding is that the unfair coin is more predictable.
This is true. As predictability increases, entropy decreases. When the probabilities for the two sides of the coin become {1, 0} then the predictability is 1 and the entropy is 0. You can choose probabilities for the two sides of a coin that will cause the entropy to range between 0 and 1. Likewise you can choose probabilities for a three-sided die that will cause the entropy to range between 0 and 1.6, for a four-sided die between 0 and 2, and for a five-sided die between 0 and 2.3. The lowest possible entropy for any message set is always 0. The highest possible entropy for any message set increases with increasing message set size, e.g., increases with increasing information.
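A sketch of those ranges, varying the bias of a coin and taking the equiprobable case for the dice maxima:

from math import log2

def entropy(probs):
    return sum(p * -log2(p) for p in probs if p > 0)

for p in (1.0, 0.9, 0.7, 0.5):               # P(heads) for a coin
    print(p, round(entropy([p, 1 - p]), 2))  # 0.0, 0.47, 0.88, 1.0 bits

for sides in (3, 4, 5):                      # maximum entropy of an n-sided die
    print(sides, round(log2(sides), 1))      # 1.6, 2.0, 2.3 bits (equiprobable case)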
zaius137 writes:
Therefore, it takes less information to transmit the result of the toss. I have received some criticism that I am using the term information incorrectly, when I refer to the message possessing more information. This criticism may be justified so I have been using the term negentropy instead.
According to Wikipedia, in information theory "Negentropy measures the difference in entropy between a given distribution and the Gaussian distribution with the same variance." Is this what you really mean to talk about?
zaius137 writes:
Anyway, I am trying to express that a more predictable message would posses less entropy thus the transmission of that message would require less bits.
Yes, this is true, the lower the entropy the greater can be the information density. How does this support your position about adding information to a genome?
--Percy

This message is a reply to:
 Message 274 by zaius137, posted 05-16-2012 12:29 AM zaius137 has not replied

  
Percy
Member
Posts: 22388
From: New Hampshire
Joined: 12-23-2000
Member Rating: 5.2


Message 279 of 314 (662499)
05-16-2012 7:41 AM
Reply to: Message 273 by NoNukes
05-15-2012 10:37 PM


Re: Information
Hi NoNukes,
I know, I know, I know. You know too much about computers. Forget all the overhead involved in disk storage and data transmission. Maybe I should have stuck with the book example.
Anyway, the point is that measures of information are the same regardless whether the information is in some static form or is being transmitted. Zaius was trying to argue that information and entropy have a direct relationship in one and an inverse relationship in the other.
Zaius has cheered your post and ignored mine, so he must have interpreted your post as a successful rebuttal of my argument for a direct relationship between information and entropy.
--Percy

This message is a reply to:
 Message 273 by NoNukes, posted 05-15-2012 10:37 PM NoNukes has replied

Replies to this message:
 Message 280 by NoNukes, posted 05-16-2012 11:12 AM Percy has replied
 Message 284 by zaius137, posted 05-17-2012 2:25 PM Percy has replied

  
Percy
Member
Posts: 22388
From: New Hampshire
Joined: 12-23-2000
Member Rating: 5.2


Message 283 of 314 (662535)
05-16-2012 2:21 PM
Reply to: Message 280 by NoNukes
05-16-2012 11:12 AM


Re: Information
NoNukes writes:
I wasn't referring to overhead and such. I'm referring to files that are compressible. Like 2 Megabyte files containing all zeroes as an extreme example. The information content of such a file is very small and we can transmit that information across a lossless channel in only a few bytes, even if we count the bytes describing the decompression algorithm.
The example was just to illustrate the simple principle that increasing information and increasing entropy go hand in hand regardless whether the information is static or in the process of being communicated. If it helps, assume a pre-compressed 2 Mb file.
Further fault can be found with the example. A book or file is not really a message but a sequence of many messages where each message is a single character. Compression in this context is not a question of representing each individual message (character) more efficiently (although that possibility could be explored, too) but of encoding the original messages into a new message set so that the resulting sequence of messages is smaller in terms of the number of total bits.
But the topic was the relationship between information and entropy, which is not inverse but direct. If your equiprobable message set size is 4 then the amount of information in each message is 2 bits. If you introduce redundancy so that it takes 100 bits to transmit a single message, the amount of information transmitted is still 2 bits, and the entropy is still 2.
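A sketch of that last point; the repeat-each-bit-50-times encoding below is an arbitrary illustration of redundancy, not anything from the thread:

from math import log2

def entropy(probs):
    return sum(p * -log2(p) for p in probs if p > 0)

messages = ["00", "01", "10", "11"]                # equiprobable message set of size 4
encode = lambda m: "".join(bit * 50 for bit in m)  # 2-bit message -> 100 transmitted bits

print([len(encode(m)) for m in messages])          # [100, 100, 100, 100] bits on the wire
print(entropy([0.25] * len(messages)))             # still 2.0 bits of information per message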
--Percy

This message is a reply to:
 Message 280 by NoNukes, posted 05-16-2012 11:12 AM NoNukes has seen this message but not replied

  
Percy
Member
Posts: 22388
From: New Hampshire
Joined: 12-23-2000
Member Rating: 5.2


Message 294 of 314 (662678)
05-17-2012 8:18 PM
Reply to: Message 284 by zaius137
05-17-2012 2:25 PM


Re: Information
zaius137 writes:
I rather not dwell on entropy but I cannot let you go on thinking I have no point.
If you'd rather not dwell on entropy then stop being wrong about it, which you continued doing in this message, demonstrating that you do indeed have no point and, even more importantly, that you have little comprehension or intuitive feel for the subject.
zaius137 writes:
My point is that lower entropy implies that the message has more information.
You have a stunning lack of interest about resolving the conflict between what you believe and what the math of information theory says.
zaius137 writes:
These two thought experiments relate how information can lower entropy.
I'll address this in a moment, but first it's important that you recognize that this isn't what you were originally wrong about. You originally asserted that the greater the information the less the entropy. This was wrong. You next asserted that static information and transmitted information are different, and that in one the relationship between information and entropy was direct, in the other inverse. This, too, was wrong.
You've now changed realms from information theory to thermodynamics and quantum theory. You've also changed questions from, "How much information and entropy is there?" to "How much entropy will it cost to gain information about a physical system?" This is the 2nd Law of Thermodynamics, which states that the entropy of any closed system can never decrease. If one does work in order to gain information then the cost to you in entropy will always be greater than the information gained.
In other words, if you did work that gained you 2 bits of information with an entropy of 2, then you must have exerted work costing more than 2 units of entropy in order to gain that information. This entropy cost is what Brillouin called negentropy because it is what you have to give up in order to gain information. But the more information you want to gain the more entropy you have to give up. And the more information you gain, the more entropy that information has, though that entropy is less than the entropy you gave up in order to gain the information (2nd Law of Thermodynamics again). It's a positive relationship.
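Written as an inequality, this bookkeeping looks roughly like the following (a sketch, not the post's own notation; k is Boltzmann's constant and I is the information gained in bits):

\[
\Delta S_{\text{cost}} \;\ge\; k \ln 2 \cdot I
\]

so gaining I = 2 bits requires giving up at least 2 k ln 2 of thermodynamic entropy, with equality only in the idealized limit.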
If you really understood this you'd be explaining it instead of combing the Internet for quotes you can misinterpret to agree with you. I suggest you read the Wikipedia article on information theory; it's actually pretty good and pretty simple. Once you can explain your viewpoint without blatantly contradicting the simple math of information theory, come back here and see if you still believe that information and entropy are inversely related.
--Percy
Edited by Percy, : Typo.

This message is a reply to:
 Message 284 by zaius137, posted 05-17-2012 2:25 PM zaius137 has replied

Replies to this message:
 Message 301 by zaius137, posted 05-20-2012 1:20 AM Percy has replied

  
Percy
Member
Posts: 22388
From: New Hampshire
Joined: 12-23-2000
Member Rating: 5.2


Message 307 of 314 (662901)
05-20-2012 7:37 AM
Reply to: Message 301 by zaius137
05-20-2012 1:20 AM


Re: Information
Hi Zaius,
As I said before, if you understood information theory you'd be explaining it instead of combing the Internet for quotes you can misinterpret to agree with you. Since you did not yourself include any interpretive text for the quote, let me do that for you:
quote:
Equivalently, the Shannon entropy is a measure of the average information content one is missing when one does not know the value of the random variable. The concept was introduced by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication".
This quote isn't explicit about the direction of the relationship (leaving me puzzled as to why you chose it), but a little thought fills in the missing points. As the amount of information one does not know increases, so does the randomness of the variable one does not know. Randomness increases with increasing message set size, and you learn more from a message from a larger message set than from a smaller one. This is because the uncertainty also has a direct relationship with message set size:
quote:
The uncertainty for such a set of n outcomes is defined by
log2(n)
The greater the message set size the greater the uncertainty, and the more our uncertainty is reduced when we receive a message, and therefore the greater the entropy of that message.
It's a direct relationship. For an equiprobable message set of size 2 the probability of each message is .5, the information one does not know is 1 bit, the uncertainty is 1, and the entropy is 1 bit.
For an equiprobable message set of size 4 the probability of each message is .25, the information one does not know is 2 bits, the uncertainty is 2, and the entropy is 2 bits.
For an equiprobable message set of size 8 the probability of each message is .125, the information one does not know is 3 bits, the uncertainty is 3, and the entropy is 3 bits.
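A sketch tying the quoted definition to those numbers: entropy is the average of the missing information -log2(p), and for an equiprobable message set it works out to log2(n):

from math import log2

def entropy(probs):
    # average "missing information" -log2(p) over the possible messages
    return sum(p * -log2(p) for p in probs if p > 0)

for n in (2, 4, 8):
    print(n, entropy([1 / n] * n), log2(n))  # entropy equals log2(n): 1.0, 2.0, 3.0 bits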
Entropy is a measure of our uncertainty about what message we might receive next. The larger the message set, the greater the information we receive in a message. And also the larger the message set, the greater the uncertainty of which message from that set we might receive next, which is the entropy. To understand this better you might consider reading the rest of the article beyond the 2nd paragraph that you quoted. Pay particular attention to the section titled Rationale.
It might also help you to think about the effect of a message on the person receiving it. The person now has more information than he had before, so there are now more messages in his message set, so it would take more bits to communicate the state of this person, so his entropy has risen by the amount of entropy in the message he just received.
If at some point you again feel the urge to provide quotes from the Internet that you're misinterpreting, fight it.
For so long as you fail to care whether your understanding conflicts with the intuitive relationship between information and entropy, and even with simple math, you will continue to be wrong, not only in your thinking about information theory but also in any conclusions you reach based upon your misunderstanding, such as the possibility of adding information to the genome.
--Percy

This message is a reply to:
 Message 301 by zaius137, posted 05-20-2012 1:20 AM zaius137 has not replied

  