PaulK:
My math shows that you're correct. Assuming that all of the bacterial contamination was from the very day that it was tested (the most helpful figure for an old-shroud hypothesis), we get:
C14 halflife: ~5730 years
Shroud's radiocarbon date (~1350 AD): ~650 years elapsed = 11.3% of a half-life, so ~92% of the C14 remains.
Jesus's death (~35 AD): ~1965 years elapsed = 34.3% of a half-life, so ~79% of the C14 remains.
For every gram of initial C14, a 1st-century cloth should have 0.79 grams left. The measured date corresponds to 0.92 grams, which means there is a surplus of 0.13 grams of C14.
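The decay arithmetic above can be sketched in a few lines of Python (the ages assume a reference year of roughly 2000, matching the ~650 and ~1965 year figures; the variable names are mine, not from the post):

```python
# Exponential decay: fraction remaining = (1/2)^(age / half-life).
HALF_LIFE = 5730.0  # C14 half-life in years

def fraction_remaining(age_years: float) -> float:
    """Fraction of the original C14 left after age_years of decay."""
    return 0.5 ** (age_years / HALF_LIFE)

shroud_frac = fraction_remaining(650)   # ~0.92, medieval-date cloth
jesus_frac = fraction_remaining(1965)   # ~0.79, 1st-century cloth
surplus = shroud_frac - jesus_frac      # ~0.13 grams per initial gram
```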
If all the bacterial contamination were instant and new, each gram of its initial C14 would still contribute a full 1 gram of C14. Thus we have the equation (where C is the fraction of the sample that is contamination and 1-C is the fraction that is shroud):
C*1.00 + (1-C)*0.79 = 0.92
C + 0.79 - C*0.79 = 0.92
C*0.21 = 0.13
C ≈ 0.62, i.e. 62%
Thus, the best-case contamination figure they could get is that the material tested was 62% contamination and only 38% shroud.
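The mixing equation above solves in one step; here is a minimal sketch of that algebra (variable names are my own labels for the post's figures):

```python
# Mixing equation from the post: C*1.00 + (1 - C)*0.79 = 0.92,
# where C is the contaminated fraction of the sample's carbon.
observed = 0.92  # C14 fraction implied by the ~1350 AD date
old = 0.79       # C14 fraction expected for a 1st-century cloth
fresh = 1.00     # C14 fraction of brand-new bacterial contamination

C = (observed - old) / (fresh - old)  # ~0.62: ~62% contamination
shroud_fraction = 1 - C               # ~0.38: ~38% shroud
```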
(*Note: this is an oversimplification. Atmospheric C14 levels fluctuate constantly and are calibrated against tree rings and ice cores. Also, nuclear testing since the 1950s has raised atmospheric C14 levels, so the required contamination could probably fall somewhat below 60% if all of it had been acquired "in the lab" (actually, in all three labs that dated it). However, to build up a biofilm, realistically almost all of it would have had to develop before the 1950s.)
"Illuminant light,
illuminate me."