Author Topic:   Heat and radiation destroy claims of accelerated nuclear decay
JonF
Member (Idle past 168 days)
Posts: 6174
Joined: 06-23-2003


Message 1 of 2 (677794)
11-01-2012 12:53 PM


From 1997 through 2005, the Radioisotopes and the Age of The Earth (RATE) group, composed of various YECs with appropriate qualifications and knowledge of physics and radiometric dating, tried to invalidate the mainstream conclusions about the age of the Earth and life. They concluded that the evidence of vast amounts of past radioactive decay is inescapable, and that the only possible explanation consistent with a young Earth is Accelerated Nuclear Decay (AND): specifically, approximately 4 billion years' worth in the first three days of Creation, before there was any life to kill, and 500 million years' worth during the Noachic flood.
This thread is not for discussing the various studies which the RATE group claims as evidence that the Earth is truly young; extremely detailed criticisms of these claims are available in many places (I recommend RATE (Radioactivity and the Age of The Earth): Analysis and Evaluation of Radiometric Dating and Radioisotopes and the Age of the Earth). This thread is for discussing the mammoth problems with the hypothesis of accelerated decay during the Noachic flood.
I. Heat
Condensing 5·10⁸ years of decay into one year or less would produce an immense quantity of heat. From the first RATE book, Introduction, page 8, Vardiman writes:
quote:
One major obstacle to accelerated decay is an explanation for the disposal of the great quantities of heat which would be generated by radioactive decay over short periods of time. For example, if most of the radioactive decay implied by fission tracks or quantities of daughter products occurred over the year of the Flood, the amount of heat generated may have been sufficient to vaporize all the waters of the oceans and melt portions of the earth’s crust, given present conditions.
Snelling quantifies this problem in Radiohalos in Granites: Evidence for Accelerated Nuclear Decay, page 183:
quote:
To put this heat problem in perspective we can quickly do a rough estimate of the effect of just the accelerated nuclear decay, say 500 million years worth (at today’s rates), but instead taking place in a single year (the Flood year). The following values of the relevant parameters were obtained from Stacey [1992]:
  • the typical heat production in a granitic pluton from radioactive decay of U, Th, and K is ~10⁻⁹ W/kg,
  • the specific heat of granite is ~700 J/kg-K, and
  • the number of seconds in 500 million years is ~1.6·10¹⁶ sec.
Thus the adiabatic temperature rise = (10⁻⁹ W/kg × 1.6·10¹⁶ sec) / (700 J/kg-K) ≈ 22,900 K.
This is equivalent to a temperature rise of more than 22,000°C, which is sufficient, of course, to vaporize a granitic pluton many times over!
Another approach is to assess the heat production in the zircons themselves within the granitic rocks. Note that the U concentrations in the zircon grains can be on the order of 1% by mass of the grains. If the mass of a zircon grain relative to the mass of the biotite crystal that includes it is 0.01, then with the current heat production from radioactive decay of U of 10⁻⁴ W/kg, the average heat production in the biotite enclosing that zircon grain is 10⁻⁸ W/kg, which is only an order of magnitude higher than the value used above for a typical granite. Thus the adiabatic temperature rise in the biotite as a result of 500 million years worth of accelerated radioactive decay is an order of magnitude higher than the value obtained for the granitic rock as a whole. Of course, the biotite crystal and the zircon grain included in it would be vaporized! So whichever way the calculation is made, there is no denying that there is a genuine heat problem associated with accelerated nuclear decay.
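Snelling's back-of-the-envelope figure is easy to reproduce. Here is a short sketch using only the parameter values quoted above from Stacey [1992]:

```python
# Rough adiabatic temperature rise from compressing 500 Myr of decay heat
# into a short interval. Parameter values are those quoted above from
# Stacey [1992]; this is an order-of-magnitude estimate, nothing more.

HEAT_PRODUCTION = 1e-9      # W/kg, typical granitic pluton (U, Th, K decay)
SPECIFIC_HEAT = 700.0       # J/(kg*K), granite
SECONDS_500_MYR = 1.6e16    # ~500 million years, in seconds

# Total decay energy released per kilogram is the same no matter how fast
# it is released; only the time available to shed it changes.
energy_per_kg = HEAT_PRODUCTION * SECONDS_500_MYR   # J/kg

# If released adiabatically (no heat escapes), the temperature rise is:
delta_T = energy_per_kg / SPECIFIC_HEAT             # K
print(f"Granite adiabatic rise: {delta_T:,.0f} K")  # ~22,857 K

# Snelling's zircon/biotite case: the biotite hosting a U-rich zircon sees
# ~10x the granite-average heat production (1e-8 W/kg), hence ~10x the rise.
delta_T_biotite = (1e-8 * SECONDS_500_MYR) / SPECIFIC_HEAT
print(f"Biotite around zircon:  {delta_T_biotite:,.0f} K")
```

Either way the result is tens of thousands of kelvins, far beyond the vaporization point of any rock.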
Obviously if the Flood is taken to have occurred more recently, the numbers would be different but just as disastrous. The only hypothesis I've seen proposed to solve this problem is Humphreys' cosmic expansion theory, in which the Earth is cooled by the expansion of space. The problems with this hypothesis are discussed in detail at Nonexistence of Humphreys' Volume Cooling for Terrestrial Heat Disposal by Cosmic Expansion and Flaws in a Young-Earth Cooling Mechanism. But without even considering whether the hypothesized mechanism is possible, we can see a major problem with it. The cooling would have to be applied not evenly throughout the Earth, but very selectively: more cooling where there are more radioactive elements (e.g. rocks) and less cooling where there are fewer radioactive elements (e.g. oceans and living creatures). That just isn't going to fly. Humphreys acknowledges the problem in Young Helium Diffusion Age of Zircons Supports Accelerated Nuclear Decay, pp. 73-74:
quote:
The real problem is how to keep non-radioactive materials from getting too cold at the same time. I have not had time to pursue this part of the idea further, so here I can only outline a speculation that may turn out to provide a good explanation later. If the "fabric" of space is a real material, as Scripture implies [Humphreys, 1994, pp. 67-68], then it must have a temperature. I speculate that its temperature might set a minimum on how much heat could be transferred to the fabric during rapid expansion. For example, equation (31) might become:
dT/dt = -2H(T - Tmin)    (32)
where Tmin is a minimum temperature that might depend on the amount of time dilation occurring at the moment. If Tmin were about 300 K during the Genesis Flood, then creatures aboard the Ark could stay warm. Though this is sheer guesswork now, I am confident that a good explanation exists (whether or not we can find it). That is because (a) the evidence convinces me that accelerated nuclear decay did indeed occur, and (b) as one of Noah’s descendants, I know that his family did not freeze to death aboard the Ark!
Note that he's not really presenting a viable hypothesis, and note the Biblical literalism underlying the claimed scientific inquiry.
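For what it's worth, equation (32) just describes exponential relaxation toward the floor temperature Tmin at rate 2H. A quick sketch (H and the starting temperature here are arbitrary illustrative values, not from Humphreys' paper):

```python
import math

# Behavior of Humphreys' speculative equation (32): dT/dt = -2H(T - Tmin).
# Its closed-form solution is exponential decay toward the floor Tmin.
# H and T0 below are arbitrary illustrative values, not from the paper.

def temperature(t, T0, Tmin, H):
    """T(t) = Tmin + (T0 - Tmin) * exp(-2*H*t)."""
    return Tmin + (T0 - Tmin) * math.exp(-2.0 * H * t)

T0, Tmin, H = 5000.0, 300.0, 1.0   # arbitrary units, for illustration only

for t in (0.0, 1.0, 2.0, 5.0):
    print(f"t = {t:3.1f}:  T = {temperature(t, T0, Tmin, H):8.1f}")

# Whatever the starting temperature, T relaxes toward Tmin = 300 K -- which
# is exactly why Humphreys needs the floor: without Tmin, the same equation
# cools everything, Ark included, toward absolute zero.
```

The equation itself is trivial; the unsupported part is the physical mechanism and the hand-picked value of Tmin.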
II. Radiation
Condensing 5·10⁸ years of decay into one year or less would also produce an immense quantity of radiation. Again from the first RATE book, Introduction, page 8, Vardiman writes:
quote:
A second obstacle to accelerated decay is the ability of life to cope with the great quantities of ionizing radiation that would have been generated by accelerated decay over short periods of time. This is particularly so with respect to 40K in animal and human bodies. For example, Noah and his family and the animals would likely have been subjected to deadly concentrations of radiation during their stay on the ark if accelerated rates of decay occurred during the Flood. Although the water beneath the ark would have probably protected him from radiation from the earth below, if Noah had similar concentrations of K in his body as we do today, radioactive decay from within his body would have been very destructive.
Note that this assumes that the heat problem is solved, so there would be water remaining to shield Noah from the radiation from the rocks. I haven't looked into whether this shielding is realistic (there is uranium dissolved in sea water).
I haven't seen any YEC quantifications of this problem, but it turns out it isn't difficult. There have been many studies of radiation dosage due to 40K in humans, e.g. Assessment of the doses received by the Cuban population from 40K contained in the body: modelling based on a neural network, Body potassium content and 40K radiation dose to Iranian subjects, and Body potassium content and radiation dose from 40K to the Slovak population. Note that, for decay that produces beta radiation in a human body, 1 μGy = 1 μSv = 1 microsievert. All these sources agree that the radiation dosage in the human body due to decay of 40K is in the range of 100-200 μSv/year, and I doubt that all the subjects were heavy banana consumers. Let's take 100 μSv/year for simplicity, and see what dosage would result from condensing 5·10⁸ years of decay into one year or less. It's pretty simple:
5·10⁸ yr × 100·10⁻⁶ Sv/yr = 50,000 Sv
Again, a more recent flood would yield a different but essentially similar number. How bad a radiation dose is this? At Lethal dose (LD), 4-5 Sv is listed as LD 50/30, meaning 50% of the people exposed to this die within 30 days. At How Much Radiation can the Human Body Safely Receive?, the external background radiation on Earth is about 2.4 mSv/year, and an exposure of 6 Sv corresponds to a 90% death rate, increasing to 100% at higher levels. Obviously dosing Noah et al. with 10,000 times the LD 50/30 would turn the ark into a casket of rotting flesh (or maybe zombies!!).
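The arithmetic above, spelled out (using the 100 μSv/year figure from the studies cited and the 4-5 Sv LD 50/30 range):

```python
# Internal 40K dose scaled by accelerated decay, using the values above:
# ~100 microsieverts/year at today's decay rate, with 500 Myr worth of
# decay compressed into the single Flood year.

ANNUAL_K40_DOSE_SV = 100e-6    # Sv/year from 40K in the human body
YEARS_COMPRESSED = 5e8         # 500 million years of decay in one year

flood_dose_sv = ANNUAL_K40_DOSE_SV * YEARS_COMPRESSED
print(f"Dose during the Flood year: {flood_dose_sv:,.0f} Sv")  # 50,000 Sv

# Compare with the LD 50/30 of ~4-5 Sv (50% fatality within 30 days):
LD50_30_SV = 4.5
print(f"Multiple of LD 50/30:       {flood_dose_sv / LD50_30_SV:,.0f}x")
```

Even if the annual figure were off by an order of magnitude in Noah's favor, the dose would still be a thousand times the lethal level.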
The only solution I've seen proposed for this problem is that living things didn't have any 40K in their bodies until after the Flood. In Summary of Evidence for a Young Earth from the RATE Project, page 765, Vardiman et al. write:
quote:
One solution has been offered that possibly could mitigate this problem, namely, that the 40K we measure in plants and animals today is the result of the Genesis Flood itself. The RATE team believes an attempt should be made to test for 40K in the bodies of pre-Flood insects which were trapped in amber during the Genesis Flood and were thereby protected from subsequent contamination.
I would sure like to see some YEC try to defend this one!
Those are the two big problems. There are others, e.g. the fact that we see rocks containing U and Th in secular equilibrium with their decay products; this equilibrium would be disturbed by AND and would take on the order of 1.7·10⁶ years to recover.
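That recovery time is consistent with the 238U chain being rate-limited by its longest-lived intermediate daughter, 234U (half-life ≈ 245,500 years): a disturbed daughter activity returns toward equilibrium as 1 − exp(−λt), so getting back to within 1% of equilibrium takes ln(100) mean lives. A quick check of that estimate (the 1% threshold is my own illustrative choice):

```python
import math

# Order-of-magnitude check on the secular-equilibrium recovery time for
# the 238U decay chain. The slowest step is 234U (half-life ~245,500 yr);
# after a disturbance, the daughter activity approaches equilibrium as
# 1 - exp(-lambda*t), so "within 1% of equilibrium" takes t = ln(100)/lambda.
# The 1% threshold is an illustrative assumption, not from the post.

U234_HALF_LIFE_YR = 245_500
decay_const = math.log(2) / U234_HALF_LIFE_YR    # lambda, per year

recovery_yr = math.log(100) / decay_const        # time to within 1%
print(f"Recovery time: {recovery_yr:.2e} yr")    # ~1.6e6 yr
```

That lands right around the 1.7·10⁶-year figure, so undisturbed secular equilibrium in old rocks is itself evidence against any recent episode of accelerated decay.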
Discuss!

Copyright 2001-2023 by EvC Forum, All Rights Reserved