quote:The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem.
So, yes, random mutation increases Shannon information. Selection reduces it.
You're studying the population of a certain organism. One of the offspring experiences a mutation that gives it two copies of a gene that produces a certain protein. The organism now produces twice as much of that protein as before, resulting in differences from others in the population. So of course gene duplication increases information.
Mathematically, the number of bits required to specify one of an organism's genes is log2(number-of-genes). Naturally log2(number-of-genes + 1) is a larger number and therefore represents more information.
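One hedged way to read that arithmetic in code, assuming n equally likely genes (the count of 1000 is an arbitrary illustration, not any real genome):

```python
import math

# Hypothetical illustration: specifying one of n equally likely genes
# takes log2(n) bits, so a duplication (n -> n + 1) raises that figure.
n_genes = 1000  # arbitrary example count
print(math.log2(n_genes), math.log2(n_genes + 1))
```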
Werner Gitt, in In the Beginning Was Information, distinguishes five levels of information, of which the statistical level (Shannon information) is the lowest.
Werner Gitt confuses information with meaning.
But to repeat the answer to your original question about whether mutation and selection can increase Shannon information: random mutation increases information, selection reduces it.
Random mutation degrades existing information. Take a page from a book and start randomly mutating the letters and pretty soon it is unreadable and any information on that page is lost.
You're confusing information with meaning. In order to measure information mathematically you must use Shannon information.
Here's an example using a single gene of a population. Imagine that in this population the gene has 4 alleles. The amount of information for this gene across the population is:
log2(4) = 2
Now imagine that one of the offspring in this population experiences a mutation in this gene to form an allele that is different from the existing 4 alleles. Now there are 5 alleles for this gene in the population, and the amount of information is:
log2(5) = 2.32
So the amount of information in the population for this gene has increased from 2 bits to 2.32 bits.
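The allele-count measure described above can be sketched in a few lines of Python (the allele counts are just the ones from the example):

```python
import math

# Sketch of the allele-count measure: the information for one gene is
# log2(number of distinct alleles of that gene in the population).
def gene_information(num_alleles):
    return math.log2(num_alleles)

print(gene_information(4))  # 2.0 bits with 4 alleles
print(gene_information(5))  # ~2.32 bits after a novel mutant allele appears
```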
If you have some other way of measuring information, let's hear it.
Actually, I don't think most of us could answer these questions ourselves, so I don't think it's fair to ask them of CRR.
But the reason we can't answer them, or at least that I can't answer them, is that I have no idea how many genes and alleles pigs, cows, ostriches and crocodiles possess. Once provided that information I could answer the question.
So maybe another way to pose the question to CRR is to provide him a simple genome for a hypothetical population and ask him how he measures the amount of information in it, e.g.:
There are various ways of measuring various aspects of information but I don't think any of them is complete, and maybe never will be.
You seem to have lost your way technically. Naturally no scientific field is ever complete, but we know more than enough to measure the information in a population's genome, and Shannon information is the way to do it. Gitt is a pretender. Learn something about actual information theory, then go back to Gitt and you'll see why what he says is nonsense.
I proposed one simple way of measuring that information on the basis of alleles per gene. It isn't the only way to approach the problem, but it's probably the simplest and easiest to understand. If you want to compare the amount of information between populations, that's the easiest way to do it.
We already know that most evolution happens by the loss of information.
How would you know whether evolution causes a gain or loss of information if you don't know how to measure information?
So that would appear to be a qualified yes to the question "Can mutation and selection increase information?".
Mutation unqualifiedly increases information. Nothing else is possible. You have more alleles in the population than you had before. That can't be anything but an increase in information.
Actually, there are a couple of unlikely mutation cases that can leave the population with the same amount of information or less. A mutation leaves the amount of information unchanged if the mutated allele happens to be identical to a preexisting allele (while other copies of the original allele remain in the population). And a mutation leaves the population with less information if the copy that mutated was the last remaining instance of its allele *and* the mutated version is identical to a preexisting allele. These cases are so rare as to not be worth considering; I mention them only for completeness.
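The usual case and the two edge cases can be sketched in Python (the allele names and population are invented purely for illustration):

```python
import math

# Sketch of the three mutation outcomes described above, using the
# allele-count measure: information for one gene = log2(distinct alleles).
def info(population):
    return math.log2(len(set(population)))

def mutate(population, index, new_allele):
    pop = list(population)
    pop[index] = new_allele
    return pop

pop = ["A", "A", "B", "C"]    # 3 distinct alleles -> log2(3) ≈ 1.585 bits

novel = mutate(pop, 0, "D")   # brand-new allele: information increases
same = mutate(pop, 0, "B")    # an "A" remains and "B" already exists: unchanged
lost = mutate(pop, 3, "A")    # the only "C" becomes an existing allele: decreases

print(info(pop), info(novel), info(same), info(lost))
```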
Whether selection results in a gain or loss of information depends upon how you look at it. If selection is considered a pruning process, one that decides which offspring don't survive to reproduce, then selection can never increase information, only decrease it or leave it the same.
But another way of looking at selection is as the pressures responsible for differential reproductive success, in which case it isn't a pruning process but a process that decides which offspring survive to reproduce and which don't. In this case, whether selection increases or decreases information is a function of the offspring that are selected to survive to reproduce combined with whether the mutations they possess represent an increase or decrease in information.
How is a process of deciding who survives to reproduce not a pruning process?
When trying to answer the question about whether selection can increase information, how you look at selection determines the answer. Pruning the unfit is one way to look at selection, a process of elimination, in which case selection could never increase information.
But differential reproductive success is another and more detailed way of looking at selection. For this perspective admittance gates might be a better analogy than pruning. The gates are wide open to the most fit, closed to the least fit, and somewhere in between for those with fitness in between. Admitting organisms that add more information than that lost by those refused admittance can result in increased information.
I think the first view is simpler because it allows one to think about mutation and selection separately (first mutation occurs as part of the reproductive process, then selection of the most fit occurs), but the equations of population genetics include variables for both selection and mutation rate - they're interdependent.
So I don't know. Maybe I haven't come up with good ways of thinking about this issue. I wish HBD were still around.
The theory of evolution claims that all life originated from 1 cell.
Actually, evolution does not say that "all life originated from 1 cell." You're probably thinking of the universal common ancestor, that all life present on earth today evolved from one or a few universal common ancestors through a process of descent with modification and natural selection.
Has a cell ever been observed to create a totally different cell?
That wouldn't be evolution, more like a miracle, though laboratory experimentation has manipulated cells to produce other cell types, for instance, stem cells.
Seeing that would have been required to produce such different types of life forms.
Evolution is about gradual change over many generations, not sudden change.
However in this case the original cell has ALL of the genetic and extra-genetic information for EVERY cell including the information to switch off the un-needed functions in each specialist tissue. That is the difference between a complex organism and a single celled organism.
As I pointed out in another thread, the amoeba genome is a hundred times larger than the human genome. Which is more complex? You can't answer questions like that if you're going to avoid mathematics and information theory.
And are answers to those questions meaningful when all that's important is whether an organism is well adapted to its environment? How would you mathematically measure how well adapted an organism is? If you could do that you could then compare level of adaptation to complexity or amount of information and see if there are any meaningful correlations.
My guess is that there would be little correlation. Evolution is a random walk. Solutions to the adaptation problems faced by organisms in their environment are presented through allele frequency change and random mutation. Evolution makes do with what is available. How much genome an organism ended up with to achieve adaptation is probably random, though perhaps it may serve as a record of how much environmental change its ancestral species have passed through.
My conclusion is that size is not a good measure of information content. I have found that with books. You can't determine the value of the content by weighing a book or by doing a word count.
You're still confusing information with meaning, and you still haven't accepted any way of measuring information (Gitt cannot measure information). Therefore your statements about information gain or loss are meaningless.
No, you're still confusing information with meaning. That's why you have no way of measuring information and why all your statements about gain and loss of information are nonsense.
Let me repeat this simple question. If you know how to measure information, then tell us how much information there is in a simple genome of 4 genes with this many alleles:
Gene 1: 4 alleles
Gene 2: 1 allele
Gene 3: 2 alleles
Gene 4: 3 alleles
The last time I asked this question you responded with a non sequitur reference to a paper on protein functional sequence complexity. The paper is suspect anyway because it equates meaning with function. That's very strange. Oh, I just looked Durston up - intelligent design advocate and head of Power to Change Ministries.
But if you think Durston is on to something then just answer the question: How much information is in that simple genome?
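For the record, the allele-count measure proposed earlier gives a concrete answer for that simple genome; a minimal sketch in Python:

```python
import math

# Total information under the allele-count measure: sum log2(alleles)
# over the genes of the hypothetical 4-gene genome listed above.
alleles_per_gene = {"Gene 1": 4, "Gene 2": 1, "Gene 3": 2, "Gene 4": 3}

total_bits = sum(math.log2(n) for n in alleles_per_gene.values())
print(round(total_bits, 3))  # 2 + 0 + 1 + 1.585 ≈ 4.585 bits
```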