Topic: How do "novel" features evolve?
Dr Adequate, Member (Idle past 284 days), Posts: 16113
quote: I think we are viewing the same elephant from different angles. When you say the randomness of the system declined, I say the innate information of the system has increased. Yes, then the number of bits needed to quantify the system would decrease. As the information of the system increases, entropy decreases (it is an inverse relationship from that perspective).

Well, in that case you're using words like "information" and "entropy" in the exact opposite way to Shannon. And if that doesn't perturb you, consider this: according to your way of doing things, a string consisting of no bases of DNA would have maximal information. Surely this can't be what you intend?

Shannon was a genius; he didn't invent information theory one evening when he was drunk. Perhaps you should consider following his lead.
quote: Remember where Shannon entropy is most appropriate. It gives insight into how many bits are needed to convey an independent variable by communications. As the randomness of that variable increases (less innate information) ...

More information. Look at your citation:

quote: As an example consider some English text, encoded as a string of letters, spaces and punctuation (so our signal is a string of characters). Since some characters are not very likely (e.g. 'z') while others are very common (e.g. 'e') the string of characters is not really as random as it might be. On the other hand, since we cannot predict what the next character will be, it does have some 'randomness'. Entropy is a measure of this randomness, suggested by Claude E. Shannon in his 1949 paper A Mathematical Theory of Communication.

So according to Shannon, an encyclopedia has more "randomness" (entropy, information) than a book of the same length consisting only of millions of instances of the letter A. And the former is indeed a great deal more informative than the latter.
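The encyclopedia-versus-letter-A comparison is easy to check numerically. Here is a minimal sketch (standard library only; the function name and test strings are mine, not from the thread) that estimates Shannon entropy from character frequencies:

```python
from collections import Counter
from math import log2

def shannon_entropy(text: str) -> float:
    """Estimate Shannon entropy in bits per character from character frequencies."""
    counts = Counter(text)
    n = len(text)
    # H = sum over symbols of p * log2(1/p)
    return sum((c / n) * log2(n / c) for c in counts.values())

# A "book" consisting of nothing but the letter A: zero bits per character.
print(shannon_entropy("A" * 1000))  # 0.0
# Ordinary English text: several bits per character.
print(shannon_entropy("the quick brown fox jumps over the lazy dog"))
```

The all-A string scores exactly zero, while any text whose next character cannot be predicted with certainty scores above zero, which is precisely the sense in which the encyclopedia has more entropy.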
In any case, if you want information to be the exact opposite of Shannon entropy, we can accommodate you by answering your original question about adding information to the genome in those terms as well.
After all, in my first post on this subject, I explained how a mutation could increase Shannon entropy, as follows:
Dr A writes: Well, if your choice is Shannon entropy, then creating information is easy. Any insertion would do it, since the insertion increases the number of bits in the genome, and since the content of these bits is not completely predictable from their context. So, for example, consider a "toy genome" (real genomes are of course longer) of the form GTACT_ACTCTA, where the _ represents a base that has just been added by insertion, the identity of which I am not going to tell you. Can you deduce with complete certainty what base is represented by the _, based only on the knowledge that it is preceded by GTACT and followed by ACTCTA? Of course not. Therefore, it makes a non-zero contribution to the total Shannon entropy of the genome.

So if your choice of a measure of information is now the opposite of Shannon entropy, then all I need to do is reverse the argument as follows:

Well, if your choice is the exact opposite of Shannon entropy, then creating information is easy. Any deletion would do it, since the deletion decreases the number of bits in the genome, and since the content of these bits was not completely predictable from their context. So, for example, consider a "toy genome" (real genomes are of course longer) of the form GTACT_ACTCTA, where the _ represents a base that has just been removed by deletion, the identity of which I am not going to tell you. Can you deduce with complete certainty what base was represented by the _, based only on the knowledge that it was preceded by GTACT and followed by ACTCTA? Of course not. Therefore, it made a non-zero contribution to the total Shannon entropy of the old genome, and so the new genome GTACTACTCTA has, by your criterion, more information than the original.

There you go. Your way of quantifying information may be the silliest thing since King Olaf the Silly's "Decree of Custard" back in 947, but clearly "information" so quantified can be increased by mutation.
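The "non-zero contribution" in the toy-genome argument can be made concrete. Assuming (my assumption, not the post's) that the hidden base is equally likely to be any of A, C, G, T, its identity contributes exactly log2(4) = 2 bits:

```python
from math import log2

bases = "ACGT"
# If the inserted (or deleted) base is completely unpredictable from context,
# each of the four bases has probability 1/4 ...
p = {b: 0.25 for b in bases}
# ... and its entropy contribution is sum(p * log2(1/p)) = log2(4) = 2 bits.
contribution = sum(pb * log2(1 / pb) for pb in p.values())
print(contribution)  # 2.0
```

Any skew toward particular bases reduces this below 2 bits, but as long as the base is not fully predictable the contribution stays above zero, which is all the argument needs.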
"Negentropy" is a term coined by Erwin Schrödinger in his popular-science book "What is Life?" (1943).

Note that Shannon entropy and thermodynamic entropy are not the same thing. Note also that since Schrödinger was writing five years before the publication of Shannon's paper, he was talking about the latter and not the former. Note further that there is no sense in which a living system imports negative Shannon entropy and stores it, since that is not even a concept to which one can attach a meaning.

Perhaps you should spend less time quoting from articles for beginners about information theory, and more time actually reading them.
quote: In other words, new evidence in genetic divergence between chimps and humans and the measured mutation rate in humans has pushed the possible divergence back to 33 million years. You cannot change a monkey into a man.

According to your own calculations, you can do so in about 33 million years. Like the woman in the joke, all you're haggling about is the price.

quote: 60 mutations per generation from ...

... a study involving precisely two children.

quote: Commonly the date accepted by evolutionists is 5 million years.

No. They do not, in fact, commonly accept a date.

quote: (u) = mutation rate (9x10^-9)

That's not what I got using your own figure for the number of mutations. It should be mutations/base pairs = 1.76 x 10^-8. This alone halves the number of generations.

But the other thing: have you thought about what you're trying to achieve here? The figures biologists use to estimate the time of divergence are based precisely on the sort of genetic data you're using. You seem to be reasoning: "Evolutionists say it took five million years; if I can show that it would take 33 million years, then it would never have fit into the evolutionists' time-frame." But if it could be shown definitively and to everyone's satisfaction that it would take 33 million years, then evolutionists would immediately, with one voice, say that that was how long it took. Because their estimate for how long it did take is based on their estimate of how long it would take. That's where their time-frame comes from in the first place. If we make a new measurement of the mutation rate, that changes how long it would have taken, and it also, simultaneously, changes how long evolutionists think it took, because it's the same thing.
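The per-site rate in the reply can be reproduced from the post's own figure of 60 mutations per generation, assuming a haploid genome of roughly 3.4 x 10^9 base pairs (the genome size is my assumption; the thread only implies it):

```python
mutations_per_generation = 60
haploid_genome_bp = 3.4e9   # approximate haploid human genome size (assumed)

mu = mutations_per_generation / haploid_genome_bp
print(mu)  # about 1.76e-8 per site per generation, as stated in the reply

# Dividing by 7e9 instead, as the quoted post does, gives a rate about half as large:
mu_quoted = mutations_per_generation / 7e9
print(mu_quoted)  # about 8.6e-9, i.e. the quoted "9x10^-9"
```

Since inferred divergence time scales inversely with the mutation rate, using the rate that is twice as large halves the number of generations, which is the "This alone halves the number of generations" point.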
quote: I'd rather not dwell on entropy ...

I can see why.

quote: So tell me again how information does not correlate inversely to entropy?

Shannon entropy is different from thermodynamic entropy. They are not the same. They are different.

quote: Probability of the message increases, information in the message increases ...

No, it doesn't. How many times does this need to be explained to you? If it is 100% certain that the message I am about to receive will consist of the word "YES", then that message contains no information, because I learn nothing whatsoever by receiving it. Because I already knew what it would say.

How hard can this be to understand? How fucking hard can this be to understand? How obvious does an idea need to be before a creationist can understand it? Really, how simple does it need to be? If we tried to say to a creationist, "The cat sat on the mat", would you guys be all, "So you're saying the mat is on top of the dog, right? Only that's how I understand it"?

What the fuck is wrong with you people?
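The point about the certain "YES" message is just the definition of self-information, I(p) = log2(1/p): a message that was certain to arrive carries zero bits, and rarer messages carry more. A minimal sketch (the function name is mine):

```python
from math import log2

def self_information(p: float) -> float:
    """Bits of information carried by receiving a message that had probability p."""
    return log2(1 / p)

print(self_information(1.0))   # 0.0, a certain "YES" tells you nothing
print(self_information(0.5))   # 1.0, a fair coin flip is worth one bit
print(self_information(0.25))  # 2.0, rarer messages carry more bits, not fewer
```

So as the probability of a message increases, its information content falls, the exact opposite of the quoted claim.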
quote: Remember, Haldane's fixation of new functional genes only happens once every 300 generations. This would provide only about 1700 new beneficial functions fixed in humans since the Pan/Human divergence!

Are there only about 1700 functional mutations between humans and chimps? Quite possibly. Do you have a scrap of a shred of a shard of a scintilla of an iota of actual evidence to the contrary?

quote: This is known as Haldane's dilemma.

And for the benefit of creationists, I should explain that Haldane's dilemma is a real actual thing in real biology as practiced by real biologists. Only some of you seem to think that "Haldane's dilemma" are two magic words that you can recite which will make every fact that you hate about biology magically disappear. You think this because you are grossly and repugnantly ignorant of the very subject that you are pretending to know something about. Do you have no shame?
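For readers wondering where a figure like 1700 comes from, here is the back-of-envelope arithmetic. The input values (5 million years since the split, 20-year generations, substitutions counted along both lineages) are my assumptions for illustration, not figures from the thread:

```python
divergence_years = 5e6    # assumed time since the Pan/Homo split
generation_time = 20      # assumed years per generation
haldane_interval = 300    # Haldane: ~1 substitution fixed per 300 generations

generations_per_lineage = divergence_years / generation_time  # 250,000
# Counting both the human and the chimp lineages since the split:
fixations = 2 * generations_per_lineage / haldane_interval
print(round(fixations))  # 1667, i.e. "about 1700"
```

The result is sensitive to the assumed generation time and divergence date, which is one reason such figures are quoted only to two significant digits.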
You know where I asked:

Dr A writes: Do you have a scrap of a shred of a shard of a scintilla of an iota of actual evidence ...

Maybe you could address yourself to that instead of writing words at random.
quote: I used Haldane because he is an authority ...

Splendid. Then you will please note that he himself did not think that his work magically proved creationists right about everything. Or, indeed, anything. Perhaps this is because he knew something about his work that you don't, like what it was.

You can find his original paper here. Note the complete absence of the words: "And so this proves that creationists are right about something, albeit in some way that they will never adequately be able to explain."
quote: In 1993, Walter ReMine's book "The Biotic Message" upheld and verified Haldane's calculations. The evolutionist has never presented a reasonable objection to Haldane.

I don't need to present a reasonable objection to Haldane, because he did not pretend that his work proved creationists right about something for the first time ever. What I need to do is provide a reasonable objection to the disingenuous creationist halfwits who pretend that he did. And I have done so.
quote: (u) = mutation rate (9x10^-9) (60/7x10^9)

I would still like to know why you're dividing 60 by 7x10^9, which is slightly more than twice the actual figure.

quote: This still gives a divergence time of 680k generations and 13.6 million years, which blows away all of the evolutionist fossil paradigm.

Er ... no. We don't have a fossil known to be the common ancestor of chimps and humans and dated at significantly less than 13.6 million years.

quote: Common descent is nonsense.

Something which has passed actual scientists by, but which is plain to you, an expert in being wrong about things.
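As a sanity check on the quoted numbers: 680k generations and 13.6 million years are consistent with each other under a 20-year generation time (an assumed value; the post never states one):

```python
generations = 680_000
generation_time = 20  # assumed years per generation
years = generations * generation_time
print(years)  # 13600000, matching the quoted "13.6 million years"
```

So the two quoted figures are just restatements of each other, not independent pieces of evidence.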
quote: Yes, there are two different formulations, but they are most certainly relatable.

This is too vague to possess meaning.

quote: Actually, you have not demonstrated any validity to your objections by either citation or any particular principles.

I was kinda relying on the particular principle that Schrödinger didn't own a time machine and could not have been discussing Shannon entropy. What exactly are you looking for? Clearly Shannon entropy is not thermodynamic entropy, because they are measured in different ways, have different domains, and are expressed in different units.

quote: On the other hand, I have shown a relationship between entropy and information that you emphatically denied existed.

What I have emphatically denied is that thermodynamic entropy is the same as Shannon entropy. You have not shown that they are the same, nor will you.

quote: I do not participate in this forum to win an argument; as a matter of fact, I learn more by losing the argument.

Only if you're smart enough to notice that you've lost the argument. Otherwise you learn nothing and look like a conceited idiot.
The source of evolutionary novelty, as such, is mutation.
This is obvious; it is practically true by definition. One wonders if even creationists can rely on their natural inborn stupidity in their ceaseless efforts not to understand this, or whether they resort to artificial means such as drink, drugs, or self-inflicted brain damage.

One pictures the frustrated religion fanatic sitting alone sobbing in a darkened room ... "I've tried and tried to miss the point, but it's so glaringly obvious" ... (he drinks a bottle of vodka) ... "No, it'sh no ushe, I may not be able to shtand up ... and I think I've jusht shat myshelf ... but clearly mutations are the source of evolutionary novelty" ... (he weeps and begins to bang his head against a wall) ... "All I need, all I need is one good big clot in my frontal lobes or shomething ... fucking brain, jusht shtop working!"

I guess that's what creationist websites are for. Somehow they seem to eradicate the victim's capacity for thought while leaving him in possession of his higher motor functions.

Now perhaps this explains their reluctance to discuss the nitty-gritty of genetics, preferring to talk of thermodynamics or information theory or some bizarre melange of the two. These are subjects that they can misunderstand with far greater facility, and they have such a tenuous relationship to what is, after all, a very simple question in a completely different field, that even if a creationist did accidentally learn something about these subjects he would not actually have learned anything about evolution.

Take zaius, for example, and his hopeless incomprehension of information theory. Can it even be said to be a barrier to his understanding of evolution? It cannot, because if (which I doubt) we could get him to understand it, he would not actually be able to get any closer to understanding anything about evolution. It's not a barrier that prevents him from understanding evolution; it's a displacement activity that prevents him from trying to understand evolution.

While he's wandering around trying to get his befuddled head round information theory, he's not even thinking about evolution, and so the chances of him ever being right about anything are reduced to a safe minimum.

That said, he's pretty darn good at not understanding evolution anyway. He has a fine grasp of how to delicately combine the merely retarded with the completely meaningless in a way that would bring a proud tear to the eye of Henry Morris. When one reads (for example) his nonsense about "adaptation", one sees that not only has he carefully avoided learning the most basic concepts in evolution, but also that he has managed to convince himself that he has acquired such expertise in it as to give smug, patronizing lectures on it to others. It is this kind of doublethink, in which ignorance is actively sought and yet profound knowledge is assumed, that produces the most deliciously ridiculous breed of charlatan.
Copyright 2001-2023 by EvC Forum, All Rights Reserved