There is good reason to believe that radioactive decay was not faster in the past than it is today, at least over the last 6,000 years.
Radioactive decay generates heat, and this heat would be intense indeed if decay were accelerated enough for the decay products to match a 6,000-year-old earth. That we see no evidence of such dramatic and intense heat having been released over that 6,000-year period argues against an accelerated decay rate. Of course, one can simply add another ad hoc hypothesis: that the decay rate was indeed accelerated, and the heat was somehow removed from the process. These are just the sort of ad hoc explanations that tend to multiply without bound in creationist arguments.
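A back-of-envelope calculation makes the scale of the heat problem concrete. The figures below are my own rough assumptions, not measurements from the text: a present-day radiogenic heat output for the earth of about 20 terawatts, and a conventional decay history of about 4.5 billion years compressed into 6,000 years.

```python
# Rough estimate of the "heat problem" with accelerated decay.
# Assumed order-of-magnitude figures (not from the original text):
#   - present radiogenic heat output of Earth: ~20 terawatts
#   - conventional decay history: ~4.5 billion years
#   - compressed timeline: 6,000 years

radiogenic_power_now = 20e12      # W, rough present-day radiogenic heat
conventional_age_yr = 4.5e9       # yr, conventional age of the Earth
compressed_age_yr = 6.0e3         # yr, young-earth timeline

# To produce the same decay products in 6,000 years, decay (and its
# heat) must run faster by this factor:
speedup = conventional_age_yr / compressed_age_yr

accelerated_power = radiogenic_power_now * speedup  # W during accelerated decay

# For comparison, roughly the total sunlight intercepted by Earth:
solar_input = 1.7e17              # W, rough figure

print(f"speedup factor:   {speedup:.2e}")
print(f"accelerated heat: {accelerated_power:.2e} W")
print(f"vs. sunlight:     {accelerated_power / solar_input:.0f}x total solar input")
```

Even with these crude numbers, the implied heat output is tens of times the entire solar energy striking the earth, sustained for millennia; hence the need for a second ad hoc hypothesis to dispose of it.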
I believe the astronomical evidence also points to a constant decay rate in the past. If the light we are seeing from nearby stars has traveled any distance at all (even 6,000 years' worth), our view into the past indicates that their nuclear processes have remained constant over the time the light has traveled. This applies regardless of whether the source of the light is 6,000 years old or several billion years old. There is no evidence of any change in nuclear processes anywhere else in the universe at any point in its history.
Likewise, theories about the nature of the speed of light at the very beginning of the universe notwithstanding, the astronomical evidence also points to the speed of light remaining entirely constant as far out as we can measure, to the very furthest quasars.
We know this because the light arrives with the energy we expect, apart from the redshift; if the light had been compressed by slowing down in transit, we would expect it to arrive with higher energy, and it does not. Also, the processes we observe in that light run at the same rate they would nearby, apart from the time dilation expected from the redshift. Like a record played at the wrong speed, light that had slowed in transit would put distant processes on "fast forward," and they are not.
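The rate argument above can be sketched with one line of arithmetic. Under standard cosmology, a process lasting some time in its own rest frame is observed stretched by a factor of (1 + z) at redshift z, whereas a "light slowed down in transit" model would predict the opposite, a sped-up appearance. The 20-day figure below is an illustrative rest-frame duration (roughly a supernova light curve), not a quoted measurement.

```python
# Time-dilation check: under standard expansion, a distant process
# lasting t_rest in its own frame is observed to take t_rest * (1 + z).
# A model where light slowed down in transit would instead make distant
# processes appear on "fast forward" (observed duration shorter than rest).

def observed_duration(rest_days: float, z: float) -> float:
    """Observed duration of a process at redshift z under standard expansion."""
    return rest_days * (1.0 + z)

# Illustrative example: a ~20-day rest-frame event seen at various redshifts.
for z in (0.0, 0.5, 1.0):
    print(f"z={z}: observed width ~ {observed_duration(20.0, z):.0f} days")
```

Observed supernova light curves do stretch with redshift in just this way, which is the "they aren't on fast forward" point in the paragraph above.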
This clearly argues for a constant speed of light.