Author Topic:   Evolution of complexity/information
Silent H
Member (Idle past 5845 days)
Posts: 7405
From: satellite of love
Joined: 12-11-2002


Message 3 of 254 (41611)
05-28-2003 11:47 AM
Reply to: Message 1 by Ex_YEC_Er
11-21-2002 2:13 AM


Your assessment is dead on.
Too bad Dembski was not as careful as the creator of the 2nd law in taking into consideration open and closed systems.
I've been waiting for Dembski to address this issue for a while, instead of running around asserting he came up with a new law.
I wonder what his "law" would make out of libraries?
------------------
holmes

This message is a reply to:
 Message 1 by Ex_YEC_Er, posted 11-21-2002 2:13 AM Ex_YEC_Er has not replied

  
Silent H


Message 4 of 254 (41615)
05-28-2003 12:05 PM
Reply to: Message 2 by Peter
05-28-2003 5:15 AM


Peter, I'm not sure if your criticism is totally accurate. I'm not taking a definite position, but I'm currently leaning away from yours.
Dembski talks about complexity and information as pretty much the same thing. While one can certainly call "information in the genome" an analogy, and can criticize Dembski's use of loaded terms (like information) to muddy the waters (it creates an air of equivocation), I don't think that automatically refutes what he is saying.
There is entropy and order in chemical systems and this is essentially what he is talking about, just cleverly reclothing them in terms of complexity and information.
In order for a complex molecule to be a template, and stay a template, there must be a certain order (or lack of entropy) in that system. Its ability to change into a new and workable template can be considered an increase in order (or a further loss of entropy). This requires energy, though of course not too much energy either.
Likewise one can also look at the entropy of the protein "system" itself in coordinated action to form an organism. There is a redistribution of energy and entropy, to build a more ordered system.
So Dembski calls it "information." It can still be evaluated as an increase or decrease of whatever term he uses.
I think his argument falls harder once one realizes how close it really is to the 2nd law, and that he failed to include the concepts of open and closed systems.
------------------
holmes

This message is a reply to:
 Message 2 by Peter, posted 05-28-2003 5:15 AM Peter has replied

Replies to this message:
 Message 7 by Peter, posted 05-29-2003 6:05 AM Silent H has replied
 Message 36 by Loudmouth, posted 07-10-2004 4:59 PM Silent H has not replied

  
Silent H


Message 9 of 254 (41710)
05-29-2003 12:59 PM
Reply to: Message 7 by Peter
05-29-2003 6:05 AM


peter writes:
Doesn't the concept of an increase in 'order' presuppose
a direction or intent for the change?
Absolutely not. This is simply an evaluation of some relative characteristic that is measurable.
For example, you could base "order" on the number of different molecule types found within a mixture (more molecule types meaning less order).
Entropy as a concept in chemistry does not assume intent or direction, just relative differences between molecular structures.
And as it turns out this concept proves quite useful for chemists, especially in thermodynamics.
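To make that concrete: here is a minimal sketch of such an intent-free "order" metric, using the Shannon entropy of the molecule-type distribution in a mixture. The molecule names, counts, and the function name are invented purely for illustration; the point is only that the measure is relative and mechanical, with no direction or purpose assumed.

```python
from collections import Counter
from math import log2

def mixture_entropy(molecules):
    """Shannon entropy (in bits) of the molecule-type distribution.

    More molecule types, more evenly mixed, means higher entropy,
    i.e. less 'order'. No intent or direction is involved: it is
    just a number computed from relative frequencies.
    """
    counts = Counter(molecules)
    total = len(molecules)
    return -sum((n / total) * log2(n / total) for n in counts.values())

# A pure sample (one molecule type) is maximally ordered: entropy 0.
pure = ["H2O"] * 8
# An even mixture of four types is less ordered: entropy 2 bits.
mixed = ["H2O", "CO2", "CH4", "NH3"] * 2

print(mixture_entropy(pure))   # 0.0
print(mixture_entropy(mixed))  # 2.0
```

Any chemical process that changes the composition changes this number up or down, and nothing about the calculation presupposes which way it "should" go.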
peter writes:
The problem with extending the analogy of information in
the genome is that one starts to think of 'increasing'
or 'decreasing' the information content...
I think the problem here is in equivocating between information and "information". There are certainly increasing or decreasing amounts of order, or potential for specific chemical reactions, in biochemical systems.
One can call that order information, and it may very well be a handy analogy when talking about DNA as it is a storehouse of potential chemical reactions. However, you are completely right that in using that term it becomes easily confused with "information", by which we mean something that an intelligent being inputs, interprets, then acts on.
That's one of my criticisms of ID as a whole. It is filled with unnecessarily loaded or misinterpretable terminology. I'm sure this is a calculated effort to win converts, and muddy the waters of debate, but it is not helpful to real discussion.
peter writes:
If someone claimed that there was 'information' involved in
production of water and carbon dioxide from the combustion of
wood or coal would anyone listen?
Yes, people would listen. Especially as computer modeling of chemical systems increases, there is a very real identification of chemical systems as information. The problem, once again, is coming to believe that everything is information.
Just as an economist could use economic analogies to understand chemical equations, and driving forces, it would be an error to come to view chemistry as a form of economics.
So it goes for the computational-philosophic-mathematical information theorists. Dembski has either lost sight of the fact that he is modeling a real-world system, and that information was just a useful term, or he finds that term useful to obfuscate.
Without that mistake, "information content" of chemical systems is a neutral assessment.
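As a hedged illustration of that neutrality: one can compute a per-symbol "information content" for a DNA sequence from base frequencies alone, with no interpreting agent anywhere in the calculation. The sequences and function name below are invented for the example.

```python
from collections import Counter
from math import log2

def entropy_per_base(seq):
    """Per-symbol Shannon entropy of a sequence: a neutral, measurable
    'information content' computed purely from base frequencies."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())

uniform = "ACGT" * 25           # evenly distributed bases: 2 bits/base
skewed = "A" * 90 + "C" * 10    # heavily biased: well under 1 bit/base

print(entropy_per_base(uniform))  # 2.0
print(entropy_per_base(skewed))   # ~0.469
```

The number goes up or down as the composition changes, under mutation or anything else, and the assessment carries no verdict about design, intent, or meaning.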
------------------
holmes

This message is a reply to:
 Message 7 by Peter, posted 05-29-2003 6:05 AM Peter has replied

Replies to this message:
 Message 13 by Peter, posted 05-29-2003 2:41 PM Silent H has replied

  
Silent H


Message 15 of 254 (41795)
05-30-2003 1:10 PM
Reply to: Message 13 by Peter
05-29-2003 2:41 PM


Just because something is arbitrarily defined in general does not make it less helpful, as long as it is consistently defined within the model you are creating.
Of course any model is more helpful if it uses common definitions.
To be honest, though, what is there in science which is not defined arbitrarily? There is no absolute anything, ESPECIALLY in biology. One arbitrarily picks a feature or some measurable quality as a starting point and uses it as a reference point.
Well, you can always be a bit picky and choose a more useful starting reference point, so it's not just "eeny-meeny-miney-mo" arbitrary, but it is still arbitrary in the grand scheme of things.
I agree that anti-evo use of the term information is ethereal, but it doesn't start that way. Dembski starts by using a real, defined term. It's just that by the end of his sermonizing he has equivocated on that original term, and it's easy for anti-evo forces to buy into the equivocation. From that point on it's consistently used in its ethereal sense.
People familiar with computer modeling probably won't be as fazed by its use as a term, as long as it is consistent.
------------------
holmes

This message is a reply to:
 Message 13 by Peter, posted 05-29-2003 2:41 PM Peter has replied

Replies to this message:
 Message 16 by Brad McFall, posted 05-31-2003 2:01 PM Silent H has not replied
 Message 18 by Peter, posted 06-01-2003 10:16 AM Silent H has not replied

  


Copyright 2001-2023 by EvC Forum, All Rights Reserved
