Topic: Which animals would populate the earth if the ark was real?
PurpleYouko | Member | Posts: 714 | From: Columbia, Missouri
Hi Mindspawn.
I just had to get involved here, as this thread has moved firmly into my area of expertise. In case you are unfamiliar with my background (and you may well be, as I don't post here very often), I am an analytical radiochemist working at a research reactor. The reactor I work at was specifically built to test and answer questions such as the ones you have been posting. It has a very tightly monitored and controllable neutron flux into which we deliberately place samples before measuring their decay rates. One of the processes we specialize in is NAA (Neutron Activation Analysis), in which we irradiate a sample and then measure the energies of the prompt and delayed gamma rays that accompany radioactive decay.

So let's address a few of your points.

quote: I feel that for an element to have a half-life of a few thousand years is still a slow process that can be affected by the current neutron background.
So you are suggesting that neutron flux can have an effect on the rate of decay, correct? It doesn't. Samples decay at precisely the same rate whether they are left in the reactor core, left on a lab bench, or completely surrounded by shielding. This has been tested quite extensively here for more than 30 years.

I do find myself wondering if you might be confusing something, though. It would make logical sense for neutron flux to slow down the decay process if the neutron capture path were identical to, but the reverse of, the decay path. Such a situation would indeed result in an equilibrium between neutron capture and decay. However, this is not the case. The decay path is always different from the neutron capture path. To put it another way, the daughter isotope is always different from the original target isotope (the one that captured the neutron).

Let's take iron as an example, since it has been mentioned a lot in this thread. 56Fe is the most common isotope, at about 91% of all natural iron. It is also stable, so it does not decay. If 56Fe (thermal cross section 2.59 barns) captures a neutron, it becomes 57Fe, which is also a stable isotope but one with a much lower natural abundance, about 2%. If 57Fe (thermal cross section 2.48 barns) captures a neutron (it is slightly less likely to do so than 56Fe, since its more tightly bound nucleus gives it a slightly reduced thermal cross section), it becomes 58Fe, which is also stable.
If 58Fe (thermal cross section 1.28 barns) captures a neutron, it becomes 59Fe, which is unstable and decays to 59Co with a half-life of 44.5 days.

Incidentally, the reason boron shielding stops neutrons is that 10B has a thermal cross section of 3,837 barns, so a reasonably thick layer (a couple of inches or so) is enough to capture almost any neutron coming its way. The boron shields need to be replaced periodically, though, because each capture converts a 10B atom into an alpha particle and a 7Li atom.

quote: A good way to measure this would be to arrange two samples of the same consistency of parent/daughter, one shielded and one not. Measure the ratio, and then a few years later measure the ratio again. Depending on the half-life there should be a detectable difference between the two samples a few years later, the protected sample showing a higher proportion of daughter isotope than the unprotected sample.
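The capture chain described above can be sketched as a small lookup table (using only the cross sections and half-life quoted in this post; a toy illustration, not reactor software):

```python
# Neutron-capture chain for natural iron, using the numbers quoted above.
# Each capture raises the mass number by one; only 59Fe is unstable,
# and it decays to a DIFFERENT element (59Co), never back up the chain --
# which is why capture cannot simply "undo" decay.
IRON_CHAIN = {
    "56Fe": {"sigma_barns": 2.59, "stable": True,  "next": "57Fe"},
    "57Fe": {"sigma_barns": 2.48, "stable": True,  "next": "58Fe"},
    "58Fe": {"sigma_barns": 1.28, "stable": True,  "next": "59Fe"},
    "59Fe": {"sigma_barns": None, "stable": False, "decays_to": "59Co",
             "half_life_days": 44.5},
}

def follow_captures(start: str) -> str:
    """Walk successive neutron captures until an unstable isotope is hit."""
    iso = start
    while IRON_CHAIN[iso]["stable"]:
        iso = IRON_CHAIN[iso]["next"]
    return f"{iso} -> {IRON_CHAIN[iso]['decays_to']} (t1/2 = 44.5 d)"

print(follow_captures("56Fe"))  # 59Fe -> 59Co (t1/2 = 44.5 d)
```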
That is actually a very well thought out experiment. Scientists here thought so as well, and tests like this have been performed repeatedly at this and other reactors throughout the several decades of their existence. The effects that you predicted were not observed: the decay rates, and hence the ratios, were found to be the same in all situations.

quote: Iron is mainly stable, and when pushed into an unstable state (Fe59 or Fe60) it rapidly decays back to a stable state within days or within a few years. There are not enough neutrons in the neutron background to overcome the high decay rate and permanently change the iron.
OK, two points here. First, as I have pointed out above, the effect of neutron flux on decay rates has been tested rather thoroughly and categorically ruled out as a possible way to change decay rates. Second, and much more seriously, the kind of thermal or fast neutron flux needed for any significant amount of neutron capture would be many orders of magnitude higher than any form of organic life could survive. Such a flux at the surface of the planet would inevitably leave a sterile radioactive wasteland.
quote: Thanks for your informative reply, I accept what you say at face value.
Thanks for that. I could probably dig up all kinds of references if pushed, but they would all be pretty old now; this question was considered fully resolved many, many decades ago, so it might be kind of hard to find the papers online.

I was aware of some data from Purdue and Stanford, as well as several other groups, showing very tiny (fractions of a percent) cyclic fluctuations, but until now I hadn't really read about it in detail. It's actually extremely interesting reading, and surprisingly well accepted in the scientific community. I half expected to find papers challenging the initial findings, but on the whole every team of researchers that has repeated the same experiments, with and without modifications, has come up with pretty much the same results. The decay rate of whatever isotope is being studied appears to slow down very slightly in response to increased activity in the sun. It has even been used successfully to detect this effect two days before a solar flare. However, there is still the question of what the actual cause of the effect is. It was initially thought to be neutrino flux, but it is still quite possible that some unknown particle is responsible. Some experiments have been carried out to determine whether neutrino flux is indeed the culprit:
http://arxiv.org/pdf/1006.5071v1.pdf

Their conclusion was not entirely conclusive, though. They took two radioactive samples and made them into two different shapes (a flat sheet and a sphere). In the sphere, the neutrino flux produced by the decay of the sample itself was many times greater than the solar neutrino flux, while in the flat sheet it was negligible. They did indeed note a small difference between the two, but it was so small that more experiments are needed to confirm the hypothesis.

So yes, there does seem to be a measurable change in decay rates due to something going on in the sun. But what effect does slowing the decay rate by a fraction of one percent, for short periods during increased solar activity, have on radiometric dating?

First, initial decay rates were probably (I'm not sure about this) measured over a relatively long period, say a few months, due to the inaccuracies of the timers available back in the day. If that is true, those rates would have been an average. No?

Second, if there had been much greater activity in the sun in times past, then the dates as we measure them today would be incorrect (by around 0.1%) in the direction of being too short.

Third, a difference of 0.1% in the measurements would be lost in the experimental noise of most dating methods. In any analytical process it is typical to have a standard deviation of up to 0.5% around the mean value; 0.2% is considered almost perfect. I will grant you that isotope ratio measurements need to be a bit better than that, typically on the order of 0.05% or better, but that applies only to the ratio measurement itself, not to the resulting calculations of sample age. The potential error bars there are well in excess of 0.1%, so as I said, this error would be lost in the noise.

quote: Ok I accept your first point, must still look into your second point.
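The third point follows from a line of algebra: a radiometric age scales as 1/λ, so a 0.1% change in the decay constant shifts the computed age by about 0.1%, well inside typical analytical scatter. A quick sketch (the D/P ratio is an illustrative made-up number, not a real measurement):

```python
import math

# Age from a parent/daughter ratio: age = ln(1 + D/P) / lam.
def age_years(daughter_over_parent: float, lam: float) -> float:
    return math.log(1.0 + daughter_over_parent) / lam

lam = 1.55125e-10        # U-238 decay constant, per year
ratio = 0.05             # hypothetical measured D/P ratio (illustrative)

base = age_years(ratio, lam)
perturbed = age_years(ratio, lam * 1.001)   # decay constant 0.1% higher

shift = (base - perturbed) / base
print(f"age shift: {shift:.4%}")  # ~0.1%, regardless of the ratio chosen
```

The shift is the same for any ratio, because λ enters the age formula only as an overall 1/λ factor.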
It's possible that I misspoke just a little bit; it's a difficult subject to research, apparently. Here is some of what I've been able to find on the subject. We should be able to work out the actual neutron flux at sea level easily enough. First, here is the measured average dose for the USA, in millirems:
http://web.mit.edu/newsoffice/1994/safe-0105.html

Here is a table showing the conversion factor from neutron flux to millirems: http://miscpartsmanuals3.tpub.com/...5-315/TM-55-3150014.htm

I can't get the table to display properly in this post because it seems to be in PDF form, so here is a summary of the information. For the lowest-energy (thermal) neutrons, flux in neutrons per square centimeter per second compares to millirems per hour in the following proportion: 268.0 neutrons/cm^2/second == 1 millirem per hour (note this is measured as a whole-body dose).

The average whole-body dose from background radiation in the USA is about 300 millirems per year. Dividing 268.0 by the number of hours in a year (8,760, assuming exactly 365 days) gives 268 / 8760 = 0.0306 neutrons/cm^2/second, or about one neutron every 30 seconds or so (this effectively assumes the neutron component of that background contributes about 1 millirem per year of the dose). In actual fact a large proportion of the neutrons are higher energy, so the thermal flux would be significantly less than that.

The whole-body dose considered borderline 'safe' is about 50,000 millirems per year. That is 166 times the background dose, so scaling the flux by the same factor means that about 5.1 neutrons/cm^2/second would be approaching the upper limit of survivability. Any more than that and we would need shielding to survive.

I'm not going to take the math any further than that for now. The only point I want to make here is that such a low neutron flux is not going to activate very many atomic nuclei in a given sample. Sure, over a large area it would eventually be lethal to most life forms, but in a one-centimeter cube of iron there will not be many activations, if you get my meaning; most neutrons will pass straight through solid matter.
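The arithmetic above can be reproduced in a few lines (this just replays the post's own numbers; the 268 n/cm^2/s per millirem/hour figure is taken from the linked conversion table):

```python
# Conversion factor from the linked table: a thermal-neutron flux of
# 268.0 n/cm^2/s delivers a whole-body dose of 1 millirem per HOUR.
FLUX_PER_MREM_PER_HOUR = 268.0
HOURS_PER_YEAR = 8760  # exactly 365 days, as in the post

# Flux corresponding to 1 millirem per YEAR of thermal-neutron dose
flux_1_mrem_yr = FLUX_PER_MREM_PER_HOUR / HOURS_PER_YEAR
print(round(flux_1_mrem_yr, 4))  # 0.0306 n/cm^2/s -- one neutron per ~30 s

# Scaling by the ratio of the borderline-'safe' annual dose (50,000 mrem)
# to the average US background (300 mrem) gives the post's upper limit.
ratio = 50_000 / 300                      # ~166x headroom
print(round(flux_1_mrem_yr * ratio, 1))   # 5.1 n/cm^2/s
```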
quote: Okay, so what are the real numbers measured in the atmosphere? On the order of 10^6 neutrons per square cm.
Can you tell me where you got this figure from? My research leads me to conclude that the true value at sea level is closer to 0.05 neutrons/cm^2/second, or even less.

http://www.lanl.gov/science/NSS/issue1_2012/story4full.shtml

That works out to about 0.2 neutrons/cm^2/second at 30,000 feet above sea level. Or is your claimed flux not expressed per second? I notice that there is no time component listed in your post. The figure above (0.2/sec), when converted to neutrons/cm^2/year, is indeed around 6 million: 6,307,200 to be exact.
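The per-second versus per-year distinction is the whole disagreement, so here is the conversion spelled out (using the 0.2 n/cm^2/s figure from the LANL article):

```python
# Converting a per-second neutron flux into a per-year fluence, to show
# how "0.2 per second" and "millions per cm^2" describe the same thing.
SECONDS_PER_YEAR = 60 * 60 * 24 * 365   # 31,536,000 (ignoring leap days)

flux_per_second = 0.2                   # n/cm^2/s at 30,000 ft (LANL figure)
fluence_per_year = flux_per_second * SECONDS_PER_YEAR
print(f"{fluence_per_year:,.0f}")       # 6,307,200 neutrons/cm^2/year
```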
quote: It's all about the penetration of the solar wind through the magnetic field: 1) at midnight the solar wind has better penetration of the magnetic field at the magnetic poles; 2) during July more solar wind penetrates the northern hemisphere because the north pole is tilted towards the sun; 3) during a solar flare the solar wind is stronger, so more solar wind penetrates the magnetic field; 4) during the 11-year solar cycle, the solar wind is cyclically stronger.

All true. However, there has not been a positive correlation between solar wind and the observed decrease in decay rate. In fact, the decrease is always noticed a considerable time before a solar flare, while the solar wind is as yet unchanged.

Source: http://www.purdue.edu/...r-flares,-give-advance-warning.html

From what I am reading, this decreased rate continued until after the solar flare was finished, a period of more than 48 hours: two days and two nights at least. There may have been overlaid diurnal fluctuations in this rate, but it isn't mentioned either way. Solar wind doesn't seem to me to be a strong contender for the cause of this effect. I have no idea what does cause it, and neither does anyone else as far as I know. All we really know is that something inside the sun, through some unknown mechanism, is giving us a couple of days' warning prior to a solar event.

quote: If a slight increase in penetration of the solar wind causes a slight drop in decay rates, what effect will a near complete blockage of most of the solar wind have during past periods of strong magnetic fields? Slight effect? Major effect? We do not know the answer to this because the cause of the effect is unknown. The assumption that this effect would be slight during past periods of strong magnetic fields appears to me just an assumption with no actual empirical foundation. Slight increases change decay slightly; what would a complete blockout of the mystery effect do? Interesting to contemplate.

I agree completely.
We cannot make any such assumption. In fact, I would assume only that whatever effect we are seeing will be proportional to whatever the sun is doing. I would expect a really big solar event to slow radioactive decay more than a small event, though I have no idea whether the correlation is linear. Assuming the effect could be large just means that during solar events in the past, the decay rate could potentially (assuming the maximum possible effect) have almost stopped for short periods. Again I ask: what effect do you think that would have on radiometric dating? Just as a thought exercise.
Hi JonF, just a quick point.
quote: If a significant lowering of earth's magnetic field affected decay rates, the purple crosses at the upper right would be way farther up, far far off the graph. If a significant increase in the Earth's magnetic field affected decay rates, the purple crosses at the lower left would be much nearer to the horizontal axis.
quote: But they aren't.

I don't really agree with your point. Let's say that in the past we had a few massive solar storms that came about once per century and lasted about a week each. Let's also say that the decay rate during each of these massive storms was almost zero. Furthermore, let's assume that the storms didn't strip away the Earth's atmosphere or irradiate it to the point of causing mass extinctions. That gives 7 days of suppressed (zero) rates for every 36,525 days of normal rates: less than 0.02%. I contend that the graph would look precisely the same. What if we had a major event every 10 years that lasted two weeks? That would be a 0.38% deviation. The effects would be SO minor; that graph isn't going to look any different at all. ^_^
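The two duty-cycle fractions in the thought experiment work out like this (both storm schedules are the post's hypotheticals, not data):

```python
# What fraction of time would decay be suppressed under the two
# hypothetical solar-storm schedules from the post?
DAYS_PER_CENTURY = 36_525          # 365.25 days * 100 years

# Scenario 1: one week-long storm per century, decay ~zero during it
frac1 = 7 / DAYS_PER_CENTURY
print(f"{frac1:.4%}")              # 0.0192% -- less than 0.02%

# Scenario 2: a two-week storm every 10 years
frac2 = 14 / (DAYS_PER_CENTURY / 10)
print(f"{frac2:.4%}")              # 0.3833% -- still well under 1%
```

Even the aggressive second schedule shifts accumulated decay by under half a percent, which is why the isochron plot would look unchanged.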
quote: You're probably missing some background information. Mindspawn thinks that the Noachic fludde took place at the Permian-Triassic boundary, about 253 MYa by conventional dating, around 4,400-4,500 years ago in his fantasy.

Yes, I am aware of that. I might not post often here, but I keep up with the threads that interest me. Thanks for the reminder anyway. All I was saying is that even giving these effects a considerable benefit of the doubt, they aren't big enough to make any noticeable difference to the existing dating schemata. Apart from that, they slow decay down rather than speeding it up, so the effect is in the wrong direction to help prove a young Earth: the larger these slowdowns get, the more conventional dating would be underestimating the true ages.

The only valid argument that could possibly support a young Earth while taking these observations into account is that the present flux of whatever the heck is causing the slowdown was much, much smaller in the past, making decay rates faster back then. I don't see that as a very strong hypothesis, but at least it's a "what if" that could actually be explored logically.
quote: I'm going to comment on the disconnect between your comment, which makes perfect sense, and mindspawn's comment, which makes no sense whatsoever, yet with which you just expressed agreement. In particular, I note that your two comments are not about the same thing at all.

Maybe I wasn't reading his post correctly, then. I know he was talking about solar wind just before, and I did address the fact that I do not believe there is an established correlation between that and the observed effects. When I said I agreed with him, I meant about the fact that a slight increase (in the flux of whatever particles are causing the effect) would likely result in a slight change in the effect itself. I also agree that we cannot assume the effect was slight in the past during periods of strong activity in the sun; it might not have been. I have to say, though, that I'm not quite following the last bit of his post:
quote: what would a complete blockout of the mystery effect do? Interesting to contemplate.
The thought crosses my mind that he might be alluding to what would happen if the sun stopped its activity altogether. Would decay rates speed up? Maybe. If so, by how much?

That hypothesis should be testable to some degree. If we could get our hands on a piece of a comet, or something else that has spent most of its lifespan vastly more distant from the sun than we are, we should be able to date it; if the hypothesis is correct, it should show up as much older than moon rock or terrestrial rock (less solar flux should mean faster decay, no?) even though they were all probably formed around the same time.

If, however, the Earth's magnetic field is affecting the flux from the sun (i.e., if the solar wind is responsible), that too is testable, since the hypothesis would predict that the moon is subject to vastly more flux than we get here on the surface of the Earth, so radioactive decay would be much slower there. Moon rocks should therefore appear much younger than terrestrial rock of the same age.

Did they use atomic clocks on moon missions, or on any space missions for that matter? How about GPS satellites? They need some pretty accurate timing. Atomic clocks in space should run slower than those on the ground if the Earth's magnetic field and/or solar wind were involved in any way.

[ABE] Just answered my own question. lol

Source: http://www.astronomy.ohio-state.edu/...Ast162/Unit5/gps.html

Guess that kind of debunks the hypothesis that the solar wind has anything to do with it. Ah well.
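For reference, GPS clock rates are fully accounted for by well-understood relativistic effects, with nothing left over for a solar-wind contribution. A back-of-envelope version of that calculation, using standard textbook values for Earth's gravitational parameter and the GPS orbit (a sketch, not a precise ephemeris):

```python
import math

# Rough relativistic rate offsets for a GPS satellite clock vs. a ground
# clock. Values are standard constants, not taken from the linked page.
GM = 3.986004e14        # Earth's gravitational parameter, m^3/s^2
c = 2.99792458e8        # speed of light, m/s
r_earth = 6.371e6       # mean Earth radius, m
r_orbit = 2.6571e7      # GPS orbital radius (~20,200 km altitude), m

# Gravitational blueshift: the satellite clock runs FAST by this fraction
grav = (GM / c**2) * (1 / r_earth - 1 / r_orbit)

# Velocity time dilation: the satellite clock runs SLOW by this fraction
v = math.sqrt(GM / r_orbit)          # circular orbital speed, ~3.9 km/s
kinematic = v**2 / (2 * c**2)

net_us_per_day = (grav - kinematic) * 86400 * 1e6
print(f"net offset: {net_us_per_day:+.1f} microseconds/day")  # ~ +38 us/day
```

The observed GPS offset matches this relativistic prediction, which is why the post concludes the solar wind is not affecting orbiting clocks.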
quote: I think that the disconnect is between the idea that there might be a significantly larger effect in the past (possibly true) and the idea that the effect might be large enough for the Permian-Triassic boundary to be a mere 4500 years ago (not likely enough to be worth considering).

I certainly wasn't agreeing to that kind of time scale. lol. A couple of tenths of one percent error in the dates is a long way from the factor-of-tens-of-thousands error that would take. As I pointed out earlier in my post about dose rates, the maximum "safe" (i.e. not immediately lethal) dose rate is normally considered to be about 50,000 millirems per year, which is 166 times the average measured dose at sea level. That means that if the decay rate had been faster by a factor of 166 in the past, we could have just about survived. For his proposed time scale, it would need to have been roughly 56,000 times higher (253 million years compressed into 4,500).
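The required speed-up is just a ratio of elapsed times, so it is easy to check (a rough average-rate estimate, not a full decay model):

```python
# How much faster would decay need to be, on average, to compress the
# conventionally dated Permian-Triassic boundary into a flood chronology?
conventional_age_years = 253e6    # ~253 Ma, conventional dating
proposed_age_years = 4_500        # mindspawn's proposed time scale

speedup_needed = conventional_age_years / proposed_age_years
print(f"{speedup_needed:,.0f}x faster")          # 56,222x faster

# Compare with the survivability headroom from the dose-rate post:
survivable_factor = 50_000 / 300                 # 'safe' limit vs background
print(f"{speedup_needed / survivable_factor:,.0f}x beyond survivable")
```

Even granting the full 166x headroom before radiation becomes immediately lethal, the required rate is still a few hundred times beyond that.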
quote: To be abrupt about it, the described effect as explained by Jenkins is an increase of decay rates when the sun is closer to the earth, and not a decrease in decay rates due to more solar activity or more neutrinos. Is there any evidence or experiment suggesting that increased solar activity results in decreased decay rates? Or that decreased solar activity actually increases decay rates? Because neither is consistent with the claims from the actual experiment as I understand it.

Maybe you missed the link that mindspawn posted earlier. Here it is again: http://www.purdue.edu/...2010/100830FischbachJenkinsDec.html

The Purdue team observed a drop in the decay rate a day and a half before a solar flare. This has since been reproduced by dozens of labs around the world, and it is pretty well accepted that it does indeed happen. Nobody knows the cause yet, though; it doesn't appear to be neutrinos or neutrons or any of the other obvious choices.

quote: Except that the operation of atomic clocks has nothing to do with decay rates of radioactive material. So you can keep right on speculating about an effect that to the best of my knowledge has never been observed.

OK, I'll concede that one. I got a little ahead of myself and didn't do the research. I thought I remembered reading somewhere about an atomic clock that actually relied on radioactive emission, but perhaps I'm remembering that wrong. I'm not suggesting that the resonance frequency of atoms would in any way be affected by changing the rate of decay.
quote: I'm not sure what exactly you are arguing here.
The drop in decay rates has been observed repeatedly, and it correlates with various factors related to the sun. The researchers also hypothesize that it fluctuates in time with the rotation of the sun's core, although it was found not to correlate with the apparent rotation of the sun at its surface; there is a difference of some five days in the cycle. The only thing uncertain is how the sun causes decay rates to decrease. It is apparent that something from the sun is causing the effect; the only thing we have no evidence for is what that something actually is.

Note that the effect has also been probed in the laboratory by using radioactive material in a thin flat sheet as opposed to rolled into a sphere. I linked to this experiment in message 957; here is the link again: http://arxiv.org/pdf/1006.5071v1.pdf. Their experiment was somewhat successful but not entirely conclusive.
Well, if the effect is as small as the Purdue team found, then I doubt they would even see a difference with a small nuclear battery unless super-sensitive equipment were specifically looking for it.

It would still be interesting to actually reproduce their experiment in outer space, to see whether the effect is any different outside the atmosphere.
Copyright 2001-2023 by EvC Forum, All Rights Reserved