If most of the daughter element dissipates, won't samples date much older than they should?
Shouldn't they date much younger than they actually are? Daughter isotopes build up over time, so if daughter atoms escape the sample, the result is an underestimate of the age, not an overestimate.
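To see why, consider the standard age equation: t = ln(1 + D/P) / λ, where P and D are the measured parent and daughter abundances and λ is the decay constant. Removing daughter atoms lowers D, which can only lower the computed age. Here's a minimal sketch in Python (the Rb-87 half-life is real; the sample abundances are made up for illustration):

```python
import math

def radiometric_age(parent, daughter, half_life):
    """Age from the standard decay equation: t = ln(1 + D/P) / lambda."""
    decay_const = math.log(2) / half_life
    return math.log(1 + daughter / parent) / decay_const

# Hypothetical sample dated with Rb-87 -> Sr-87 (half-life ~48.8 billion years).
half_life = 48.8e9  # years

full_age = radiometric_age(parent=1000.0, daughter=30.0, half_life=half_life)

# If half the daughter atoms escaped, D drops and the computed age drops too.
leaky_age = radiometric_age(parent=1000.0, daughter=15.0, half_life=half_life)

print(f"all daughter retained:  {full_age / 1e9:.2f} Gyr")   # ~2.08 Gyr
print(f"half the daughter lost: {leaky_age / 1e9:.2f} Gyr")  # ~1.05 Gyr, younger
```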
And even if that problem were solved, if decay rates were greater in the past, won't samples again date much older than they should? And isn't it impossible to know past decay rates?
As mentioned above, we have direct observations of past decay rates in cosmological events. Decay rates are governed by the fundamental nuclear forces (the weak and strong forces). Changes in these fundamental forces would have far-ranging consequences, including changes in fusion rates in stars, in the power output of stars, in the ratio of ring radii within uranium radiohalos, and in the ratio of products in naturally occurring nuclear reactors (e.g. Oklo), as well as discordant dates between different types of decay and different isotope pairs, and a whole host of other easily observable and testable consequences.
As it is, we have thousands and thousands of observations supporting constant decay rates in the past. For example, we consistently get the same date for the same geologic feature using K/Ar, U/Pb, and Rb/Sr dating, even though the three rely on different decay mechanisms. That wouldn't be possible if decay rates had been different in the past, because a change in the fundamental forces would alter the rates differently for different types of decay and different isotopes.
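Here's an illustrative sketch of that point (the speed-up factors below are invented for the example, not measurements): any accelerated decay in the past would have to affect alpha decay (U/Pb) and beta decay (Rb/Sr, K/Ar) by different amounts, since different interactions govern them, and different factors produce discordant apparent ages:

```python
import math

def apparent_age(true_age, past_rate_factor):
    """Age computed with today's decay constant for a sample whose isotopes
    actually decayed at (past_rate_factor x) the modern rate.
    D/P = exp(f * lambda * T) - 1, so t_apparent = ln(1 + D/P) / lambda = f * T."""
    lam = 1.0  # the decay constant cancels; kept explicit for clarity
    dp_ratio = math.exp(past_rate_factor * lam * true_age) - 1
    return math.log(1 + dp_ratio) / lam

true_age = 1.8  # hypothetical true age of the feature, in Gyr

# Hypothetical mode-dependent speed-ups: there is no reason accelerated decay
# would scale alpha decay (U/Pb) and beta decay (Rb/Sr, K/Ar) identically.
factors = {"K/Ar": 3.0, "U/Pb": 5.0, "Rb/Sr": 2.0}
for system, f in factors.items():
    print(f"{system}: apparent age {apparent_age(true_age, f):.1f} Gyr")
# K/Ar: 5.4 Gyr, U/Pb: 9.0 Gyr, Rb/Sr: 3.6 Gyr -- wildly discordant.
```

Since real samples give concordant ages across all three systems, any past rate change would have had to shift every decay mode by exactly the same factor, which the underlying physics gives no reason to expect.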