Author Topic:   Information for MPW
:æ: 
Suspended Member (Idle past 7215 days)
Posts: 423
Joined: 07-23-2003


Message 1 of 27 (82199)
02-02-2004 3:55 PM


:æ: writes:
Wherever a few are selected out of many, information has increased.
MPW writes:
I don't know where you got that from, but common sense tells me it doesn't work. You can lose, but never gain. Information NEVER increases without intelligence, period.
As has been shown numerous times in the past, common sense is not a reliable judge of truth or falsity with regard to the fine-grained details of reality. Yours is no exception, and I shall demonstrate:
Please read: http://home.mira.net/~reynella/debate/shannon.htm
I expect that most of it will be over your head, but it's at least worth a shot for you to try to read it. If you garner nothing else from that page, at least find where it states that the formula for calculating information is:
    I = -log2(pi)
    ...where I is information, and pi is the probability of a selected element of the set of all the specified elements.
    As an example, say we have a set A of elements a1, a2, and a3 as input to a selection machine. This machine is designed to select one element from that set at random and output it. We turn it on, and it pumps out a2.
    Now, since we know that there were 3 elements out of which one was chosen at random, the probability of a2 = 1/3.
    We can now calculate the information resulting from the selection of a2 by using the formula:
    -log2(1/3) = 1.58 bits
    Notice that the value is positive. Selection always results in an information increase.
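    If it helps to see the arithmetic spelled out, here is a minimal sketch of the same calculation in Python (the function name is mine, purely for illustration):

        import math

        def information_bits(p):
            # Shannon self-information, in bits, of an outcome with probability p
            return -math.log2(p)

        # Three equally likely elements a1, a2, a3; the machine selects a2, so p = 1/3.
        print(information_bits(1 / 3))  # ~1.58 bits -- a positive information gain
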
    Now, suppose that the elements a1, a2, and a3 are alleles in the genome of an organism. If natural selection permits one of these alleles to emerge to the extinction of the others, then the calculation would be exactly the same, and information in the genome would have increased. You've already conceded that selection operates on biological organisms, and in any case it has been demonstrated, should you decide to reverse yourself.
    So now, please cease with your ludicrous claims that information does not increase through natural selection. ALL forms of selection produce information increases, as I have just shown.

    Replies to this message:
     Message 2 by Percy, posted 02-02-2004 4:40 PM :æ: has replied

      
    :æ: 
    Suspended Member (Idle past 7215 days)
    Posts: 423
    Joined: 07-23-2003


    Message 3 of 27 (82234)
    02-02-2004 5:36 PM
    Reply to: Message 2 by Percy
    02-02-2004 4:40 PM


    Re: Does Natural Selection Increase Information?
    Hi, Percy!
    Let me give this another shot.
    I approach the problem from a slightly different angle. The total information for any gene in a population of organisms is equal to, keeping it simple, the log base 2 of the number of alleles (the "keeping it simple" part means I'm assuming equal probability for all alleles, and that each allele is a piece of information). The total information in the population for this gene can only change if the number of alleles increases or decreases. The selection by any individual reproductive act of one particular allele neither increases nor decreases the number of alleles in the population.
    This all seems fine except for where you state that "each allele is a piece of information". Information is not a thing, per se. It does not have mass or dimension. It is a measure of the change in the probability of the appearance of a certain thing or symbol. It's like temperature -- something we abstract from the behavior of elements of external reality, but not an external reality itself.
    An increase can only happen through mutation, and a decrease can only happen when no offspring in a generation inherits the allele...
    And this I disagree with.
    The greater the diversity of alleles, the greater the information gained if one is selected for. This is because as there are more and more alleles, the probability of the selection of any one allele gets smaller. Then, when one is selected for, we gain more information. In that sense, the more mutations there are, the more information is gained as one allele establishes itself as the most advantageous. Gradually it will be selected for; however, since it was selected out of a greater number of variants, its probability is necessarily smaller and hence the information has increased by a greater factor.
    But what if a mutation occurs to an allele that also passes the selection filter? Say, a5 is a mutation of a2 yet also passes the selection filter. Then we have:
    -log2(2/5) = 1.32 bits
    If that mutant allele had NOT been fit to survive the selection filter, we would've had:
    -log2(1/5) = 2.32 bits
    So you can see that a greater number of mutations, if not deselected, can actually lower the degree of information increase. However, wherever we have a finite number of alleles at one time, and at a subsequent time a smaller number of alleles which survived based on fitness, selection has operated upon that set of alleles and the remaining ones carry an increase in information.
    It is hypothetically possible to have X number of alleles, pass them all along (i.e. do not select for any), then add, say, 2 mutations to that set of alleles, and the result would be an information decrease. It seems, though, that in reality selection acts much faster than mutation does (i.e. most mutations are disadvantageous and are generally deselected), and consequently information increases over time.
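    To lay the two cases above side by side, here is a small Python sketch (the helper name is mine, purely for illustration):

        import math

        def selection_information(survivors, total):
            # Bits gained when `survivors` out of `total` equally likely alleles
            # pass the selection filter: pi = survivors / total
            return -math.log2(survivors / total)

        print(selection_information(1, 5))  # ~2.32 bits: only one of five alleles survives
        print(selection_information(2, 5))  # ~1.32 bits: the mutant a5 survives as well,
                                            # so the incremental gain is smaller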
    Natural selection without mutation or allele death can neither increase nor decrease information.
    Well, this is strictly true, since once a single allele has been selected and no others remain, the probability that it will be "selected" again is 1. In that case, I think calling it "selection" is a bit of a misnomer.
    Even when you consider multiple genes working in concert permutationally, information cannot be considered to have increased or decreased during the gene mixing of reproduction because the log base 2 of all possible permutations is the total information in the genome, and individual expressions of these permutations do not affect it
    If the number of unique alleles in a subsequent generation is greater than the number of alleles in the previous generation, then information would have decreased, but this is not "selection" per se.

    This message is a reply to:
     Message 2 by Percy, posted 02-02-2004 4:40 PM Percy has replied

    Replies to this message:
     Message 5 by Percy, posted 02-02-2004 6:41 PM :æ: has replied
     Message 6 by Taqless, posted 02-02-2004 6:43 PM :æ: has not replied

      
    :æ: 
    Suspended Member (Idle past 7215 days)
    Posts: 423
    Joined: 07-23-2003


    Message 12 of 27 (82440)
    02-03-2004 1:37 AM
    Reply to: Message 5 by Percy
    02-02-2004 6:41 PM


    Re: Does Natural Selection Increase Information?
    Hi, Percy!
    It seems I'm still struggling to explain how I understand this to work. Please bear with me as I give it another shot.
    In Shannon information terms (see Shannon's Original Paper), the total alleles for a gene in a population represent the total set of messages that can be copied to offspring. Each allele represents a single member of the total message set of all possible alleles for that gene in the population.
    This all makes sense. However, I don't see how this supports your labeling each allele a piece of information. Information is what is produced when a message is selected from the message set; it is not the element of the set itself.
    You're thinking of the individual offspring as a receiver of information received from the parents, but it is only relevant to talk about the information in a population...
    I spoke of the selection of a single element for simplicity's sake. The principle still holds across a population. It's just that instead of 1 being selected from 5, we might have 345 selected from 347, or whatever.
    If it were true that the mere act of reproduction increased the information in a population, then a population could increase genomic information simply by increasing the size of the population. But having multiple copies of the same set of alleles doesn't increase the amount of information a population possesses, any more than possessing two copies of Shannon's paper increases the amount of information available to you.
    Agreed.
    I don't know if this is just too profound for me, or if you're just making it up as you go along, but this makes no sense to me. You're going to have to explain this one.
    Well, I did make the example up on the spot, but the principle is not ad hoc. Let me try to explain more thoroughly.
    If we have 4 elements in the message set {a1, a2, a3, a4} and the element selected is a1, we will assume for the example that each element is equally probable and state the probability of a1 as 1/4. Thus the formula states:
    (1)  -log2(1/4) = 2 bits
    So the selection of element a1 generated 2 bits of information.
      Now, say that these elements are alleles in a genome. Perhaps it will be easier to visualize if we increase the number of elements by a factor of 10. Say there are 4 different alleles for the same gene equally distributed across a population of 40 organisms. So we have 10 of a1, 10 of a2, 10 of a3, and 10 of a4.
      Now say all of the members of the first set pass along their traits to one descendant and then die, except one member, which has two offspring, among which a new allele appears (the offspring are an a1 and an a5). This means we now have 1 of the a5, 10 of a1, 10 of a2, 10 of a3, and 10 of a4. The probabilities shift so that the probability of a5 is 1/41, the probability of a1 is 10/41, a2 is 10/41, and so on.
      Pass them through the filter, and suppose that again only the a1's pass through. Now we have:
      (2)  -log2(10/41) = 2.04 bits
      So we can see that even the mere existence and deselection of the odd mutated allele increases the incremental information gain compared to the previous generation, before it appeared in the population.
        If the a5 had passed the filter with the a1's instead of being deselected, the calculation would have been:
        (3)  -log2(11/41) = 1.90 bits
        So what I was hoping to illustrate here is that a beneficial mutation (one that passes through the selection filter) can increase information once it is selected for (as in (3)), just not by as much as if it had been deselected (as in (2)), and by even less than if it had never existed in the message set (as in (1)).
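        To check the three cases numerically, a sketch-style bit of Python (the helper name is only illustrative) reproduces the values:

            import math

            def selection_information(survivors, total):
                # Bits gained when `survivors` of `total` equally likely alleles pass the filter
                return -math.log2(survivors / total)

            # (1) No mutant present: the 10 a1's are selected out of 40 alleles (p = 1/4)
            print(selection_information(10, 40))  # 2.00 bits
            # (2) Mutant a5 present but deselected: 10 a1's selected out of 41
            print(selection_information(10, 41))  # ~2.04 bits
            # (3) Mutant a5 passes the filter along with the a1's: 11 selected out of 41
            print(selection_information(11, 41))  # ~1.90 bits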
          Does that make sense?
          Clearly wrong, even just intuitively - just try creating information by ripping pages out of a book and burning them, and then by your logic when the book is empty it contains more information than it ever did.
          Disregarding the meaning of the printed text on the pages, the selection of 1 page out of the entire set of pages would result in an increase in information. It seems intuitively wrong because you're conflating meaning with information. Shannon's paper states in the 2nd paragraph of the introduction:
          quote:
          Frequently the messages have meaning: that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem. The significant aspect is that the actual message is one selected from a set of possible messages. (emphasis in original)
          From: http:///DataDropsite/Shannon.pdf
          So I hope you can understand that it is valid, strictly speaking, to ignore that the pages happen to exhibit a meaningful arrangement of letters. I'm only speaking about the pages and their numbers.
          If your population begins with 64 alleles and then later has only 32 alleles, then just do the math. The information present at the beginning for this gene was 6, and later it was 5.
          I don't think you've done the math properly.
          If there were 64 in the message set, and 32 were selected, then the probability that those 32 were selected is 32/64. The equation would thus state:
          -log2(32/64) = 1 bit
          Or a 1 bit increase in information upon the selection of 32 elements of the message set out of 64.
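          To make the difference between our two calculations explicit, here is a rough Python sketch of both:

              import math

              # Your framing: the total information of the message set is log2(M)
              print(math.log2(64), math.log2(32))   # 6.0 bits before, 5.0 bits after

              # My framing: the bits gained when 32 of the 64 alleles are selected
              print(-math.log2(32 / 64))            # 1.0 bit
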
            I said:
            quote:
            It is hypothetically possible to have X number of alleles, pass them all along (i.e. do not select for any), then add, say, 2 mutations to that set of alleles, and the result would be an information decrease.
            Again, clearly wrong. Keep in mind that when you add alleles to a gene you're increasing the message set size M, and that information, which is log2M (or -log2(1/M), whichever you prefer), increases with increasing M.
            I worded that statement poorly. I meant to describe a relative information decrease. In other words, a greater number of alleles can result in a smaller gain of information than if the extra alleles had not existed, but the gain itself would not be negative.
            So, how'd we do?
            [This message has been edited by :æ:, 02-03-2004]

            This message is a reply to:
             Message 5 by Percy, posted 02-02-2004 6:41 PM Percy has replied

            Replies to this message:
             Message 13 by Percy, posted 02-03-2004 12:50 PM :æ: has replied

              
            :æ: 
            Suspended Member (Idle past 7215 days)
            Posts: 423
            Joined: 07-23-2003


            Message 14 of 27 (83034)
            02-04-2004 1:12 PM
            Reply to: Message 13 by Percy
            02-03-2004 12:50 PM


            Re: Does Natural Selection Increase Information?
            BUMP!
            Hi, Percy.
            I'm bumping this up to remind myself to return to it and respond to your post. It's just that I've been short on the kind of time a response like this will require.

            This message is a reply to:
             Message 13 by Percy, posted 02-03-2004 12:50 PM Percy has not replied

              
            :æ: 
            Suspended Member (Idle past 7215 days)
            Posts: 423
            Joined: 07-23-2003


            Message 15 of 27 (85816)
            02-12-2004 1:55 PM
            Reply to: Message 13 by Percy
            02-03-2004 12:50 PM


            Re: Does Natural Selection Increase Information?
            Hi, Percy. My apologies for the wait until I could supply my response. I think I've identified where we disagree with regard to the application of Shannon's Theory, and I'll try to clear this up before going back to explain my previous arguments.
            Percy writes:
            You're measuring the information (number of bits) *after* it's arrived. You must instead look at the total number of messages in the set *before* the message is sent, which is -log2(1/5) = 2.32 bits.
            I don't think that this is valid and I'll explain why.
            In Shannon Theory, "information" is defined as a measure of a change in certainty or uncertainty. When you become more certain of the message, you've acquired information. When you become less certain of the message, you've lost information. Therefore, it is invalid to calculate the information "contained" in the entire message set prior to the selection and transmission of an element because there's been no change in certainty. Information isn't "contained" in the message set as it sits before a message is sent. It is abstracted from the process of selecting and sending a message.
            It could be argued, I suppose, that simply defining the message set reduces uncertainty and therefore supplies information, but if we were to define a message set of 5 elements out of the literally infinite number of possible elements that exist in the universe's message set prior to definition, our calculation would be:
            -log2(5/∞) = ∞
            ...which is not a meaningful, finite measure. It is only when a message is selected out of a defined message set that we can calculate a meaningful probability of that message and therefore measure the reduction of uncertainty.
              Percy writes:
              Your answer of 2.04 bits is incorrect because the receiver of the information did not know which allele he would receive, and so there has to be provision that he could receive any of the 5 alleles in the message set.
              The receiver does not need to know which message will be sent in order to get information; he must only know which messages are possible, or what messages are elements of the defined message set.
              Before a message is sent, the receiver has complete uncertainty. He does not know which of the possible messages will be sent. If he receives 2 of the 5 possible messages, his uncertainty is reduced and he has acquired information. However, if he were to receive only 1 of the 5 possible messages, his uncertainty would be reduced by a greater degree, and that is why the transmission of 1 message out of 5 transmits a greater measure of information than a transmission of 2 messages out of 5.
              This is where the confusion lies, IMHO. Do you see what I'm getting at?
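              One way to picture the "change in certainty" framing is to compute the uncertainty before and after the transmission; a minimal Python sketch (names are only illustrative):

                  import math

                  def uncertainty_bits(n):
                      # Uncertainty, in bits, over n equally likely possibilities
                      return math.log2(n)

                  before = uncertainty_bits(5)          # five possible messages: ~2.32 bits

                  # Narrowed down to exactly 1 of the 5 possibilities:
                  print(before - uncertainty_bits(1))   # ~2.32 bits gained, i.e. -log2(1/5)
                  # Narrowed down only to 2 of the 5 possibilities:
                  print(before - uncertainty_bits(2))   # ~1.32 bits gained, i.e. -log2(2/5)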

              This message is a reply to:
               Message 13 by Percy, posted 02-03-2004 12:50 PM Percy has replied

              Replies to this message:
               Message 16 by Percy, posted 02-12-2004 4:10 PM :æ: has replied

                
              :æ: 
              Suspended Member (Idle past 7215 days)
              Posts: 423
              Joined: 07-23-2003


              Message 17 of 27 (85868)
              02-12-2004 4:41 PM
              Reply to: Message 16 by Percy
              02-12-2004 4:10 PM


              Re: Does Natural Selection Increase Information?
              Percy writes:
              You also need to explain why you keep taking the log base 2 of any number that strikes your fancy, in this case 5 over infinity.
              Let's look at the general formula:
              I = -log2(pi)
              ...where pi is the probability of the transmitted message(s). To calculate pi, you simply express it as the number of elements of the message set that were transmitted divided by the total number of possible messages in the set. If we had a message set of five possible messages, and two messages were included in a transmission, the probability that those two messages would be sent is 2/5, or pi = 2/5. Then we can calculate the information gained in the transmission with the formula like so:
              -log2(2/5) = 1.32 bits
                  I agree that the message set is always predetermined. I think my example of trying to calculate the selection of 5 messages out of infinitely many possible messages distracted from the real point, which is that there is no meaningful calculation of information over an entire message set as a whole. Basically, what I was trying to illustrate is that in defining a message set you are selecting 5 messages out of an infinity of possible messages (because they could be literally anything), and since the definition of the set is somewhat like a selection process, one might think that a meaningful calculation of information could be made over the set as a whole. So in my calculation, 5/∞ was supposed to be the value of pi, the probability that 5 messages would be selected out of an infinity of possible messages.
                  Percy writes:
                  What you're communicating from sender to receiver is not the message set, but a single message.
                  Actually, more than one message can be sent in a single transmission; however, uncertainty is not reduced as much as if only one message were sent. That's why you gain less information taking the negative log base 2 of 2/5 than you do taking the negative log base 2 of 1/5.
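                  Written as a small Python sketch (the function and variable names are only illustrative), the pi = (transmitted)/(possible) step looks like this:

                      import math

                      def transmission_information(message_set, transmitted):
                          # pi is the number of messages actually transmitted divided by the
                          # number of possible messages in the (equally likely) message set
                          p_i = len(transmitted) / len(message_set)
                          return -math.log2(p_i)

                      messages = {"a1", "a2", "a3", "a4", "a5"}
                      print(transmission_information(messages, {"a2"}))        # ~2.32 bits
                      print(transmission_information(messages, {"a2", "a5"}))  # ~1.32 bits
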
                  Does that help?
                  [This message has been edited by :æ:, 02-12-2004]

                  This message is a reply to:
                   Message 16 by Percy, posted 02-12-2004 4:10 PM Percy has replied

                  Replies to this message:
                   Message 19 by Percy, posted 02-12-2004 7:59 PM :æ: has not replied
                   Message 21 by Percy, posted 02-13-2004 9:18 AM :æ: has replied
                   Message 27 by Percy, posted 02-16-2004 11:16 AM :æ: has not replied

                    
                  :æ: 
                  Suspended Member (Idle past 7215 days)
                  Posts: 423
                  Joined: 07-23-2003


                  Message 22 of 27 (86132)
                  02-13-2004 2:33 PM
                  Reply to: Message 21 by Percy
                  02-13-2004 9:18 AM


                  Re: Does Natural Selection Increase Information?
                  Alright, cool... it looks like we're making some good progress toward understanding each other. I'm putting this response here to let you know that I've seen your most recent posts, although I haven't really gone over them in depth. I don't think I'll be able to get back into this until early next week, but I do intend to continue.
                  Have a great weekend and take care,
                  :æ:

                  This message is a reply to:
                   Message 21 by Percy, posted 02-13-2004 9:18 AM Percy has not replied

                    