You really don't need to explain the relevance of information to us. We understand the argument you are putting forward. What we don't understand is whether you have any conception of 'information' which could actually be measured, so as to show whether it had increased or not.
You have mentioned Shannon information, and there is absolutely no reason to doubt that Shannon information can increase through mutation: in Shannon's terms, a mere increase in message length is sufficient to increase information, provided the message consists of more than one distinct symbol.
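To make that concrete, here is a minimal sketch (my own toy illustration, not anything from Dembski's paper) that scores a sequence as length times Shannon entropy per symbol, using the sequence's own empirical symbol frequencies. An insertion mutation that lengthens the sequence raises the total:

```python
import math
from collections import Counter

def shannon_information(message: str) -> float:
    """Total Shannon information in bits: message length times the
    entropy of the empirical distribution of its symbols."""
    counts = Counter(message)
    n = len(message)
    entropy_per_symbol = -sum(
        (c / n) * math.log2(c / n) for c in counts.values()
    )
    return n * entropy_per_symbol

# A hypothetical insertion mutation lengthens the sequence by one base:
before = "ACGTACGT"   # 8 symbols, 2 bits each -> 16 bits
after = "ACGTACGTA"   # 9 symbols
print(shannon_information(before))  # 16.0
print(shannon_information(after))   # larger than 16.0
```

So by this measure any mutation that lengthens a multi-symbol message increases information, which is exactly why the measure is too weak to settle the argument.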
Dembski's paper therefore dismisses Shannon's theory of information when he concludes that...
For an example in the same spirit consider that there is no more information in two copies of Shakespeare's Hamlet than in a single copy. This is of course patently obvious, and any formal account of information had better agree.
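The Hamlet intuition is easy to illustrate with compression, which is a rough practical proxy for algorithmic (Kolmogorov) information rather than Dembski's own measure; a doubled text compresses to scarcely more than a single copy, because the second copy is almost entirely redundant:

```python
import zlib

# A repetitive stand-in text; any sizeable passage behaves the same way.
text = b"To be, or not to be, that is the question. " * 50

one_copy = len(zlib.compress(text))
two_copies = len(zlib.compress(text + text))

print(one_copy, two_copies)
# Two copies take far less than twice the compressed size of one:
assert two_copies < 2 * one_copy
```

Which illustrates the point: a formal account on which two copies of Hamlet carried twice the information of one would be at odds with this obvious intuition.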
If you think you know a way of applying Dembski's measure of information usefully to a genome which doesn't rely on your making an awfully large number of assumptions about probabilities, then I think we would all like to know what it is.
At the moment your conception of information is far too loose for us to know what you will accept as an increase in information. Is the evolution of a protein binding site sufficient? Is the neo-functionalisation of a duplicated gene sufficient? Without a specific definition from you of what measure of information you go by, there is no way we can usefully try to produce an example which will satisfy you.
TTFN,
WK
P.S. It might make the thread easier to follow if you used the 'reply' button to reply to specific messages rather than just a string of general replies.
Edited by Wounded King: No reason given.