Your suggestion sounds reasonable to me, but I'm not a geologist. This supposition would be much more solid if someone could find some experimental or observational data to back it up. Anyone?
Coal, along with oil, gas, and rocks containing organic debris, is a reducing environment. U is mobile in oxidizing environments and precipitates out in reducing environments, so any groundwater carrying U through an area will drop that U out of solution wherever it crosses a reducing environment.
This is how a uranium roll-front deposit works: groundwater containing a very small amount of U keeps moving until it hits a reducing environment, where the U precipitates out. Over millions of years you end up with a mineable uranium deposit. There are hundreds if not thousands of these in the western US. A toy sketch of the process is below.
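To make the mechanism concrete, here is a minimal sketch, not a real geochemical model: a 1D column of cells with a redox front partway along, groundwater carrying a trace U concentration, and any U that reaches a reducing cell precipitating there. All the numbers (flow rate, concentration, front position) are made-up illustration values.

```python
# Toy 1D roll-front sketch: groundwater carries trace U down a column of
# cells; U stays in solution in oxidizing cells and precipitates in the
# first reducing cell it reaches. All parameters are illustrative only.

N_CELLS = 20          # cells along the flow path
FRONT = 12            # cells >= FRONT are reducing (e.g., organic-rich coal)
U_PPB = 5e-9          # trace U mass fraction in the water (made-up)
WATER_PER_STEP = 1e6  # kg of water flushed through per time step (made-up)
STEPS = 1_000_000     # many steps standing in for "millions of years"

deposited = [0.0] * N_CELLS  # kg of U precipitated in each cell

for _ in range(STEPS):
    u_in_solution = WATER_PER_STEP * U_PPB  # U riding in with this slug of water
    for i in range(N_CELLS):
        if i >= FRONT:                      # reducing conditions: U drops out
            deposited[i] += u_in_solution
            u_in_solution = 0.0
            break                           # nothing left to carry downstream

print(f"U piled up at the front (cell {FRONT}): {deposited[FRONT]:.1f} kg")
print(f"U in oxidizing cells upstream: {sum(deposited[:FRONT]):.1f} kg")
```

In this toy the front sits still and the ore stacks up in one cell; in a real roll front the reductant gets consumed and the front (and the ore body) slowly migrates downflow, but the accumulation logic is the same.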
What all this means is that you cannot assume the uranium content of a coal has remained static since its formation: at any time, ground/subsurface water carrying U could have passed through it and precipitated out additional U. That makes a coal totally worthless for U-based dating. Seems to me RATE made a big deal out of a Cretaceous coal dating much younger, which doesn't surprise me, since there were a lot of ash falls during the Tertiary that would add U to the system, and that U would easily precipitate in the coals. A quick back-of-the-envelope calculation below shows why late-added parent U makes a sample look younger.
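Here is a hedged sketch of that effect using the generic parent-daughter age equation t = (1/lambda) ln(1 + D/P). The U-238 half-life is the standard value, but the sample amounts and the scenario are illustrative only, not the specific system or data RATE used.

```python
import math

# Generic parent->daughter age equation: t = (1/lam) * ln(1 + D/P).
# Illustrative numbers only; not the specific system or data RATE used.

HALF_LIFE = 4.468e9               # yr, U-238 half-life (standard value)
LAM = math.log(2) / HALF_LIFE     # decay constant, 1/yr

def apparent_age(parent, daughter):
    """Age implied by measured parent and daughter amounts (any consistent unit)."""
    return math.log(1.0 + daughter / parent) / LAM

# A sample that really is 100 Myr old (Cretaceous), starting with 1000 units
# of parent U and no daughter product.
t_true = 100e6
p0 = 1000.0
parent = p0 * math.exp(-LAM * t_true)
daughter = p0 - parent

print(f"undisturbed sample: {apparent_age(parent, daughter)/1e6:.1f} Myr")

# Groundwater later precipitates extra parent U into the coal (say, leached
# from Tertiary ash falls). The daughter is unchanged, but the parent
# inventory doubles, so D/P halves and the apparent age collapses.
parent_contaminated = parent + 1000.0
print(f"after added U:      {apparent_age(parent_contaminated, daughter)/1e6:.1f} Myr")
```

Doubling the parent here cuts the apparent age roughly in half (about 50 Myr instead of 100 Myr), with no change to the rock's actual history.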
I've seen this more times than I can count in subsurface borehole geophysical logs: coals can show just about any gamma ray (GR) reading, from very low to very high, depending on how much U has moved through the system. In oil and gas work, GR is used to estimate the amount of clay in a rock, on the assumption that the signal comes from the K-40 present in most clays. If the GR reading actually comes from U and Th, it screws everything up. Then you run a Schlumberger NGT log (natural gamma ray spectroscopy) to see how much of your GR reading is from K-40 versus U and Th.
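As a concrete illustration of how a U spike skews the clay estimate, here is a minimal sketch of the standard linear shale-volume index, Vsh = (GR - GR_clean) / (GR_shale - GR_clean), computed once from total GR and once from a uranium-free "computed" gamma ray (CGR), which is the kind of curve a spectral tool like the NGT gives you. The endpoint picks and the sample log values are made-up illustration numbers.

```python
# Linear shale-volume index from a gamma ray log:
#   Vsh = (GR - GR_clean) / (GR_shale - GR_clean)
# Spectral GR tools report K, U, and Th contributions separately; CGR
# (computed gamma ray) is the total minus the uranium part. Endpoint
# picks and the sample values below are made-up illustration numbers.

GR_CLEAN = 20.0   # API units in a clean (clay-free) sand, assumed endpoint
GR_SHALE = 120.0  # API units in a nearby shale, assumed endpoint

def vsh_linear(gr, gr_clean=GR_CLEAN, gr_shale=GR_SHALE):
    """Linear shale-volume index, clipped to [0, 1]."""
    v = (gr - gr_clean) / (gr_shale - gr_clean)
    return min(max(v, 0.0), 1.0)

# One depth level in a U-flushed coal: the spectral log splits total GR
# into potassium, uranium, and thorium contributions (API, illustrative).
gr_from_k, gr_from_u, gr_from_th = 20.0, 130.0, 10.0

gr_total = gr_from_k + gr_from_u + gr_from_th  # what a plain GR tool sees
cgr = gr_total - gr_from_u                     # uranium-free curve

print(f"Vsh from total GR ({gr_total:.0f} API): {vsh_linear(gr_total):.2f}")
print(f"Vsh from CGR      ({cgr:.0f} API): {vsh_linear(cgr):.2f}")
# Total GR calls this 100% shale; the U-corrected curve says ~10% clay.
```

Same rock, same tool run: the raw GR flags the coal as pure shale while the U-corrected curve shows it is nearly clay-free, which is exactly why you pull the spectral log when the GR looks suspicious.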
Hope that helps.