Doesn't thermodynamics just say that it is statistically improbable that information will be produced, not that it can't be produced or is forever lost?
It is more likely that thermodynamics says nothing whatsoever about information. Thermodynamics and information theory both use the term "entropy", and the equations satisfied by entropy are formally the same in both cases. Apart from that, there is probably no relation.
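For what it's worth, the formal similarity is just that Shannon's entropy and the Gibbs entropy of statistical mechanics have the same functional form, differing only in the base of the logarithm and Boltzmann's constant:

```latex
H = -\sum_i p_i \log_2 p_i \quad \text{(Shannon)}
\qquad
S = -k_B \sum_i p_i \ln p_i \quad \text{(Gibbs)}
```

In the first case the $p_i$ are probabilities of symbols or messages; in the second, probabilities of microstates of a physical system. The shared form is what invites the shared name "entropy", but it doesn't by itself establish any physical connection between the two quantities.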
Of course this all depends on what is meant by the term "information". There is Shannon information (from Shannon's theory of communication), and Chaitin/Kolmogorov information, which is not quite the same thing. As far as I know, Dembski uses Dretske's version of information (from Dretske's 1981 book "Knowledge and the Flow of Information").
Shannon information is not conserved. It is created by the transmitter of a message, and disappears when the message signal dies out. I'm not sure about Chaitin information. I suppose it could be said to exist in the platonic world of mathematical forms, and presumably is conserved there. I doubt that it is conserved in the real world.
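To make the Shannon notion concrete, here is a small sketch (my own illustration, not anything from Dembski or Dretske) that estimates the Shannon entropy of a message in bits per symbol, using the message's empirical symbol frequencies as the probability model:

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Shannon entropy in bits per symbol, estimated from the
    empirical symbol frequencies of the message itself."""
    counts = Counter(message)
    n = len(message)
    # H = -sum(p * log2(p)) over the observed symbol probabilities
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

# A perfectly repetitive signal carries no information per symbol...
print(shannon_entropy("aaaaaaaa"))  # 0.0
# ...while a signal using four symbols uniformly carries 2 bits/symbol.
print(shannon_entropy("abcdabcd"))  # 2.0
```

The point of the illustration: the quantity depends entirely on the statistics of the message the transmitter chose to send. Nothing in it is conserved; it exists only as long as the signal does.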
Maybe Dretske information is conserved, or maybe not. I'm not convinced the concept is even workable as defined by Dretske.