A YEC sent me this. It seems dead dodgy to me. However, I do NOT pretend to be a scientist, although I do believe in evolution. Any comments???
Radiometric Dating - The Assumptions
Many of the ages derived by radiometric dating techniques are highly
publicized. Nevertheless, the fundamental assumptions employed are not.
Here are the three major assumptions for your consideration:
1. The rate of decay remains constant.
2. There has been no contamination (that is, no daughter or
intermediate elements have been introduced into or leached from the
specimen of rock).
3. We can determine how much daughter there was to begin with (if
we assume there was no initial daughter, yet some daughter was in fact
present when the rock formed, the rock will show a superficial
appearance of age).
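The three assumptions above all feed into the standard radiometric age equation, t = (1/λ) ln(1 + D/P), where P is the parent atoms remaining, D the radiogenic daughter atoms, and λ the decay constant. A minimal sketch (using the well-known 238U half-life of about 4.468 billion years; the atom counts are purely hypothetical) shows how violating assumption 3 skews the result:

```python
import math

# Standard radiometric age equation: t = (1/lam) * ln(1 + D/P).
# Half-life of 238U (decaying ultimately to 206Pb), a well-established
# value; the atom counts below are hypothetical, for illustration only.
HALF_LIFE_U238 = 4.468e9            # years
LAM = math.log(2) / HALF_LIFE_U238  # decay constant, per year

def apparent_age(parent_atoms, daughter_atoms):
    """Age implied by the measured parent/daughter ratio, granting all
    three assumptions: constant decay rate, no contamination, and zero
    daughter at formation."""
    return math.log(1.0 + daughter_atoms / parent_atoms) / LAM

# Equal parent and daughter counts read as exactly one half-life:
t_clean = apparent_age(1000, 1000)

# Assumption 3 violated: 200 daughter atoms already present at
# formation inflate the apparent age of the very same rock.
t_skewed = apparent_age(1000, 1200)

print(t_clean, t_skewed)
```

This is why geochronologists in practice use techniques such as isochron plots, which are designed to detect initial daughter rather than assume it away.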
Are these foundational assumptions reasonable? Recent findings seem to
indicate that, though we have not been able to vary the decay rates by
much in the laboratory, they may have been accelerated in the
unobservable past [1]. If that were the case, the first assumption
would be unreasonable, and our current standardized view of earth's
history would be completely upset. Dr Carl Wieland
summarizes the recent findings: "When uranium decays to lead, a
by-product of this process is the formation of helium, a very light,
inert gas which readily escapes from rock. Certain crystals called
zircons, obtained from drilling into very deep granites, contain uranium
which has partly decayed into lead. By measuring the amount of uranium
and 'radiogenic lead' in these crystals, one can calculate that, if the
decay rate has been constant, about 1.5 billion years must have passed.
(This is consistent with the geologic 'age' assigned to the granites in
which these zircons are found.) There is a significant amount of helium
from that '1.5 billion years of decay' still inside the zircons. This is
at first glance surprising, because of the ease with which one would
expect helium (with its tiny, light, unreactive atoms) to escape from
the spaces within the crystal structure. There should hardly be any
left, because with such a slow buildup, it should be seeping out
continually and not accumulating. Drawing any conclusions from the above
depends, of course, on actually measuring the rate at which helium leaks
out of zircons. This is what one of the recent RATE [2] papers reports
on. The samples were sent to a world-class expert to measure these
rates. The consistent answer: the helium does indeed seep out quickly
over a wide range of temperatures. In fact, the results show that
because of all the helium still in the zircons, these crystals (and
since this is Precambrian basement granite, by implication the whole
earth) could not be older than between 4,000 and 14,000 years. In other
words, in only a few thousand years, 1.5 billion years' worth (at
today's rates) of radioactive decay has taken place. Interestingly, the
data has since been refined and updated to give a date of 5680 (+/-
2000) years." [3]
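The helium bookkeeping behind the quoted argument rests on one piece of standard nuclear physics: each full 238U → 206Pb decay chain sheds 32 mass units as 8 alpha particles, and each alpha becomes a helium atom. A minimal sketch of that production-side arithmetic (the lead-atom count is hypothetical):

```python
# Each complete 238U -> 206Pb decay chain emits 8 alpha particles
# (238 - 206 = 32 mass units = 8 helium-4 nuclei), each of which
# captures electrons and becomes a helium atom trapped in the crystal.
ALPHAS_PER_U238_CHAIN = 8

def helium_atoms_produced(radiogenic_pb206_atoms):
    """Total helium generated over the crystal's history, before any
    diffusive loss. How much of it is *retained* is the contested
    quantity in the quoted argument."""
    return ALPHAS_PER_U238_CHAIN * radiogenic_pb206_atoms

# Hypothetical example: a zircon holding 1e12 radiogenic 206Pb atoms
# has generated 8e12 helium atoms along the way.
n_helium = helium_atoms_produced(1e12)
print(n_helium)
```

The dispute is not over this production arithmetic but over the retention side: mainstream geochronologists attribute the measured helium levels to the thermal history and diffusion behaviour of the zircons, not to accelerated decay.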
Footnotes:
1. D. Russell Humphreys, Steven A. Austin, John R. Baumgardner, Andrew
A. Snelling, "Helium Diffusion Rates Support Accelerated Nuclear
Decay"; article available online at the Institute for Creation
Research website.
2. "RATE" stands for "Radioisotopes and the Age of The Earth".
3. Carl Wieland, "RATE Group Reveal Exciting Breakthroughs", 2003.