Have you read the abstract of that paper? Because it is testing the modelling assumptions of population demography, while I am asking about the assumptions behind the calculation of the mutation rates.
Now, if we followed the Creationist fantasy of what Scientists do everyone would now gather round and have a big back patting session about how we've proved the Bible is wrong again.
That is a typical strawman, and I hope you'll cut down on those if you want to discuss this with me. Don't pretend to know how creationists think.
I ask because Mit-Eve is an interesting subject, but I haven't had the time to really read up on it.
I looked into the wiki article, and there seem to be two methods of calculating the mutation rate. The first one, which they call pedigree based, is the one you described earlier: taking groups of parent/offspring pairs and calculating the mutation rate. In other words, you are calculating the mutations ''in real time'' as they happen.
The other method, however, seems to be begging the question. They assume that chimps and humans had a common ancestor 6 million years ago, then calculate the mutation rate from that and apply it to humans. (The circle is completed when someone uses the resulting dates as evidence against the biblical account of human origins.) They call this phylogeny based.
In short, the phylogeny-based method is really just getting the Theory of Evolution to make a prediction: it states that if the ToE is correct, then the mutation rate should be x. The problem is that when we actually calculate the mutation rate ''in real time'', it turns out to be 10x, according to the wiki article.
Now, the 200k date seems to be what comes out when you use the phylogeny-based estimation.
But what comes out when you use a pedigree-based estimation? There's at least one study that gets a mutation rate 20 times higher, which would bring Mit-Eve down to around 6k years old. (Parsons, T.J. et al., ‘A high observed substitution rate in the human mitochondrial DNA control region’, Nature Genetics 15: 363–368, 1997.)
Of course, no researcher in that study actually thinks they got the right answer. But still.
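The arithmetic behind this kind of claim can be sketched: a molecular-clock age is the observed divergence divided by the assumed mutation rate, so multiplying the rate by k divides the age by k. The numbers below are purely illustrative (my own placeholder values, not figures from Parsons or any other study); note that naive 1/20 scaling of a 200k baseline gives 10k, so published figures like the 6k one must rest on different input data.

```python
# Illustrative sketch only: molecular-clock age = divergence / rate,
# so a rate k times higher yields an age k times lower.
# All numbers here are placeholders, not values from any cited study.

def clock_age(divergence, rate):
    """Return the age estimate implied by a divergence and a mutation rate."""
    return divergence / rate

rate_phylo = 1.0        # hypothetical phylogeny-based rate (arbitrary units)
divergence = 200_000.0  # chosen so the phylogeny-based rate yields an age of 200,000

age_phylo = clock_age(divergence, rate_phylo)            # 200000.0
age_pedigree = clock_age(divergence, 20.0 * rate_phylo)  # 10000.0 -- 20x rate, 1/20 age
print(age_phylo, age_pedigree)
```

The point of the sketch is only the inverse proportionality: whichever rate estimate you accept directly scales the resulting date for Mit-Eve.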
I understand what you mean, but as I said, the circle is only truly completed when the 200k age given to Mit-Eve is used as proof that the creationist position is wrong, since that age is derived from assuming evolution occurred, and therefore already assumes the creationist position to be wrong. This is where the question-begging occurs.
Sorry, that wasn't really explicit in the first place. It was badly formulated.
When you can provide evidence refuting the mountains of evidence for recent special creation, then you can propose this as begging the question.
See what I did there?
To put it more explicitly: evolution is an interpretation of the fossil record. The YEC position interprets the fossil record differently. Therefore, it is begging the question when you use evolutionary presuppositions to derive an age, and then use this age as proof that YEC is wrong, since you had already assumed it was wrong when you based your estimate on evolution.
AbE. If you can't see the fallacy here, you should review your understanding of what constitutes the fallacy of begging the question.
Some interpretations follow the data, while others, such as YEC, are absolutely contradicted by the data. To claim that each is a valid interpretation is complete nonsense. You should know better than to try to palm this off here.
Obviously, if I'm a creationist, it's normal that I think the data fits better within a YEC interpretation than an evolutionary one.
Ages are derived from a wide variety of sources, from nuclear chemistry to stratigraphy, to geology, to archaeology, and more.
This is a very large-scale claim that really adds nothing to this particular discussion. That is why we have multiple forums and threads to discuss each of these.
And once again, obviously, since I'm a creationist, I think that all these fields support my position... it's just normal.
The one field where I do think a long-age interpretation seems better, as of 2010, is radiometric dating. This does not mean, however, that creationist ideas on this aren't available or legitimate, nor that I would close my eyes to the other areas of study that I think support YEC quite well.
The fact is, all of the evidence points to an old earth. Your YEC belief has been contradicted and disproved time and time again. You can "interpret" the evidence all you want, but you can't hide from the facts, and the facts show the earth is old and that the YEC belief is wrong.
Fallacy of Reification. Facts do not point towards anything, nor do they show anything. Nor do they talk.
In science, facts are always interpreted by scientists. It is the interpretation of a fact that I disagree with, never the fact itself.
But you reject any answer other than one somewhere roughly around 6000 years ago. And why might that be? Could it be because rather than looking at the evidence you're making an assumption of inerrancy about your interpretation of a story from a 2000 year old book written by desert nomads?
Genetic fallacy. Even if my belief in the Bible were the only reason for me rejecting an answer, it wouldn't mean I would be wrong for doing so. (Although it would probably be ignorant and stupid on my part.)
If I'm wrong about this last part then all you need do is explain why Parsons' estimate of mitochondrial mutation rates should be used in making the calculation for Mit-Eve.
Because it is an actual real-time measurement of the rate. It's operational science. This reason should favor the pedigree-based approach.
The phylogeny-based approach really only gives us a prediction of what the Theory of Evolution says the rate should be. What happens when you go and actually measure what the rate is, and it comes out to be not at all what the theory predicted? Should you still trust the phylogeny-based rate?
Of course, such an answer would be surprising in the extreme. It would require a great deal of research to explain how it fits with all the archeological evidence of human habitation much older than 6000 years, and with what we know of human migration rates and the fact that there was no land connection between Asia and the Americas after about 10,000 years ago. How could anyone who lived a mere 6000 years ago be a common ancestor of all modern people in both the old and new worlds?
The creationist literature seems filled with ways all this can be interpreted in a coherent framework. But I agree that such a dating of Mit-Eve seems irreconcilable with the current evolutionary paradigm of human history.
So, did you read the paper I linked to above? Message 73 Larger survey on mtDNA mutation rates with discussion including Parsons' data. If you did, and that was the only data that you had to go on, you would conclude that the most recent time that all humans (not including neandertals) could have a common female ancestor is very unlikely to be less than 10,000 years ago, and therefore, with that as the only data, the earth is very unlikely to be less than 10,000 years old.
Sorry, I didn't miss your post, but I didn't really have the time to read through that paper since I'm in the middle of moving back to Montreal for school.
I skimmed through it, though, but found no reference to the minimum 10k age you are referring to. In your previous post, you said you calculated a 15k minimum a couple of years back; are those two references one and the same?
There are problems with the actual rate of occurrence as a long term measurement. If all the mutations are neutral, then about half would drift out in the long term. If some are slightly disadvantageous, then those would be liable to negative selection in the long term.
Then there's the problem of hotspots, which means that mutations could reverse back and forth, among other things. Then there's the problem that, on rare occasions, a woman might have a slightly advantageous mutation that wiped out others in an area (becomes fixed in a region).
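The drift argument above can be illustrated with a toy Wright-Fisher simulation (my own sketch, not from the thread or from any of the papers discussed): follow one new neutral mtDNA variant among N mothers and count how often it drifts out rather than fixing. Under neutral drift, only about 1/N of new variants ever fix, which is one proposed reason short-term pedigree counts run higher than long-term phylogenetic rates.

```python
import random

# Toy Wright-Fisher sketch of neutral drift (illustrative assumptions only):
# a single new mtDNA variant appears in one mother out of n_mothers, and each
# generation every mother inherits the variant with probability equal to its
# current frequency. Most such variants are lost; roughly 1/n_mothers fix.

def fate_of_new_variant(n_mothers=50, rng=random):
    count = 1  # one mother carries the new variant
    while 0 < count < n_mothers:
        freq = count / n_mothers
        count = sum(rng.random() < freq for _ in range(n_mothers))
    return count == n_mothers  # True if fixed, False if drifted out

random.seed(0)
trials = 1000
fixed = sum(fate_of_new_variant() for _ in range(trials))
print(f"fixed in {fixed}/{trials} trials")  # expect roughly 1/50 = 2% to fix
```

A pedigree study counts the variant the moment it appears; a phylogenetic comparison only ever sees the small fraction that survived drift, so the two need not agree even if both are measured correctly.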
The paper says this in its conclusion:
quote: We have argued that several of the explanations posited for a systematic difference between phylogenetic and pedigree estimates of mutation rates are more limited than they might first appear to be. Pedigree estimates of this mutation rate are unbiased, regardless of the heterogeneity in rates—or of mutational hot spots—in the CR.
Doesn't this somewhat contradict your claim about plausible explanations for the difference between pedigree- and phylogeny-based estimates?
But even if all mutations survived, the data from that paper would certainly not lead you to believe that humans are only 6,300 years old, and went through a bottleneck 4,300 years ago.
Which has a reference to the paper, but claims (if you read the relevant part of the article) that this paper in fact supports an age of 6,500 years.
I'm not a geneticist, as you seem to be, so maybe you could enlighten me on how you calculated your minimum age and how Dr. Dewitt could have gotten his 6.5k figure.
There's also DNA data on regional stone age people compared to the modern inhabitants that would blow out that view.
Are you talking about Neanderthals?
So, slevesque, ask yourself why it is that you find Parsons' survey alone on some creationist sites, and not the other much larger survey that I've linked to.
Is it the sin of lying by omission? The title of this thread is, ironically, secularists do not want the truth.
As shown above, the paper you've linked can be found on this creationist site, along with references to other surveys using the pedigree approach and also to papers using the phylogeny-based approach.
And also, the paper itself does not give any date. The date you mention is your own calculation from their data. Dewitt seems to have calculated a different date from the same data. How can I know who's right?
You should try to understand why the two approaches differ instead of ceasing your investigation as soon as you find a study you like.
Where in this discussion have you gotten the impression that I had ceased my investigation?
As bluegenes explains in Message 78, a larger study using the same approach as Parsons' found a lower rate.
But it also didn't include heteroplasmy in its calculation of the mutation rate (unlike Parsons), so if I understand this correctly, it's normal that they found a lower rate, without that meaning an age calculated from the two different rates would differ. (Purely intuitively, this may explain the difference between the bluegenes age and the Dewitt age, if one of the two didn't account for this particularity of the Sigurđardottir paper.)
And as he also explains, much work has been done to explain the discrepancy, and we now understand that factors like drift and mutational hotspots and so on are important factors that cause the rate to measure higher when the measurement covers only a tiny number of contemporary generations.
This may turn out to be an unfounded claim, as per my quote from the paper's conclusion, which says mutational hotspots have no effect (if I'm reading it correctly).
So if it really isn't a case of you just liking the 6000 year date better, explain why Parsons' value should be accepted, given the additional information that has been provided to you.
It isn't really Parsons' value I favor; it's the whole pedigree approach in general. My simply referring to the Parsons paper earlier in the discussion doesn't mean I'm focusing solely on their numbers.
This month's issue of American Scientist magazine has an article about the evolution of penguins in Antarctica: Evolution on a Frozen Continent (looks like you need a subscription if you want more than the abstract, sorry). Nesting grounds contain preserved biological material going back more than 40,000 years. The mutation rates can be measured very precisely and are consistent with widely accepted mutation rates in other species.