Jerry Don Bauer writes:
1) When loose information is spontaneously diffused, entropy will tend to increase.
Keep in mind that we're not as familiar with your terminology as you are, so please keep reminding us what these terms mean. I think you consider genes "loose information" because they can change, but what does it mean for information to "diffuse"? Do you mean decrease or become lost? What is an example of information diffusion? Most importantly, you said this was a testable assertion of ID, so I'm wondering how that could be the case. If we grant for the sake of discussion that you're just stating an accepted principle of information theory in unfamiliar terms, how would that constitute a test of ID? Can you describe such a test?
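For reference, the standard information-theoretic quantity that "tends to increase" under spreading or mixing is Shannon entropy. Here's a minimal sketch (not Bauer's formulation, which he hasn't defined; the function name `shannon_entropy` and the example distributions are mine) showing that "diffusing" a probability distribution toward uniform raises its entropy:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A concentrated distribution has low entropy...
concentrated = [0.97, 0.01, 0.01, 0.01]
# ...and "diffusing" it toward uniform raises the entropy.
diffused = [0.25, 0.25, 0.25, 0.25]

print(shannon_entropy(concentrated))  # ~0.24 bits
print(shannon_entropy(diffused))      # 2.0 bits
```

Whether this standard notion is what Bauer means by "loose information diffusing" is exactly the question on the table.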
2) Specificity is inversely proportional to the probability of an event occurring.
You're simply asserting that s=1/p. ID hasn't yet drawn any correlation between terms like "specificity" and "specified complexity" and the real world. Can you describe how these concepts relate to the real world, and how you would go about testing this relationship?
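For comparison, the nearest standard quantity to "specificity inversely related to probability" is Shannon self-information, or surprisal, I = -log2(p). A minimal sketch (the function name `surprisal_bits` is mine, not a term from the debate):

```python
import math

def surprisal_bits(p):
    """Shannon self-information ("surprisal"): I = -log2(p)."""
    return -math.log2(p)

# Lower-probability events carry more information, but the
# standard relationship is logarithmic in 1/p -- not the
# literal s = 1/p being asserted.
for p in [0.5, 0.25, 0.001]:
    print(p, surprisal_bits(p))
```

Note that even in this standard framework the dependence on probability is logarithmic, so asserting s = 1/p outright is not just restating accepted information theory.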
3) DNA must be designed by an intelligent agent or by code pre-programmed by an intelligent agent.
And how would you test this?
--Percy