Author Topic:   Intelligent Design in Universities
JustinC
Member (Idle past 4873 days)
Posts: 624
From: Pittsburgh, PA, USA
Joined: 07-21-2003


Message 110 of 310 (205438)
05-05-2005 10:39 PM
Reply to: Message 106 by Jerry Don Bauer
05-05-2005 9:10 PM


quote:
This states that W will equal a factorial relationship of the differences of what we are considering (accumulating deleteriously mutated genes as opposed to the rest of the genome), or W = (41469.4 + 1.6)! / (41469.4)!(1.6)!. So let's just calculate our weight and then we can go to Boltzmann's math to calculate entropy.
Can you go into a little more detail about how that equation, which is apparently used to calculate the entropy of a box filled with gaseous particles, can be applied to the genome the way you did?
Also, can you explain how finding the solution to a state function indicates a change in something? Usually, when I see someone use the Boltzmann equations to show that entropy is increasing or decreasing, they use ΔS = k ln(W2/W1). This is because S is just the entropy associated with a state of matter (or energy), not the change in entropy.
I'm not trying to imply you're wrong, since my background in entropy is limited to chemistry and physics, and I'm not sure how to apply it to information.
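For concreteness, the two formulas in question can be sketched in a few lines of Python. This is purely an illustration of the definitions being discussed, not anyone's actual calculation; lgamma is used so factorials of large (or, as in the quoted post, fractional) counts stay finite:

```python
from math import lgamma

K_B = 1.380649e-23  # Boltzmann constant, J/K

def ln_W(n1, n2):
    # ln of the multiplicity W = (n1 + n2)! / (n1! n2!), computed via
    # log-gamma so large or fractional counts don't overflow.
    return lgamma(n1 + n2 + 1) - lgamma(n1 + 1) - lgamma(n2 + 1)

def entropy(n1, n2, k=K_B):
    # Boltzmann entropy of a single state: S = k ln W.
    return k * ln_W(n1, n2)

def delta_S(before, after, k=K_B):
    # Change in entropy between two states: dS = k ln(W2 / W1).
    return k * (ln_W(*after) - ln_W(*before))
```

With four coins, for instance, ln_W(2, 2) gives ln 6, and delta_S((4, 0), (2, 2)) is positive: the even split is the higher-entropy state.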

This message is a reply to:
 Message 106 by Jerry Don Bauer, posted 05-05-2005 9:10 PM Jerry Don Bauer has replied

Replies to this message:
 Message 116 by Jerry Don Bauer, posted 05-05-2005 11:48 PM JustinC has replied

JustinC
Member (Idle past 4873 days)
Posts: 624
From: Pittsburgh, PA, USA
Joined: 07-21-2003


Message 129 of 310 (205496)
05-06-2005 3:35 AM
Reply to: Message 116 by Jerry Don Bauer
05-05-2005 11:48 PM


quote:
Yep, correct. I was calculating S considering only a single generation. But you can bring an argument that I also could calculate deltaS, or the change in entropy from generation to generation. The formula you threw out is a valid one I am very familiar with. Or it works as well with simple subtraction such as deltaS = Sf - Si, or change in entropy is represented by final entropy minus initial entropy. It doesn't matter as long as we all use the same math to compare when we are quantifying the same system.
I'm still having a little trouble seeing the analogy with the gas molecules. I understand that W can be calculated for a variety of states of matter (i.e., not just gases), but can the formula W = (N1+N2)!/(N1!N2!) be used in this particular situation?
For the gas molecules, using the parameters from the web page you linked to, each molecule can be in one of two states: on the right side or on the left. In your case, each nucleotide can be in one of two states: ancestral or deleteriously mutated (assuming, for simplicity, that ancestral sites aren't deleteriously mutated). Is this correct?
According to that equation, W is highest in the gaseous example when the two states are equal in number, so entropy is highest in that case.
How does this apply to mutations in the genome? Following the analogy, W is highest when half the sites are ancestral and half are deleterious. Is this what you are trying to say? Is there a limit to the number of deleterious mutations that can occur in a genome? Do deleterious mutations increase the genome's entropy only up to a point, and then decrease it as more deleterious mutations accumulate?
What relation are you trying to convey between deleterious mutations and entropy?
I'm honestly not sure.
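The claim that W peaks at an even split is easy to check numerically. The sketch below is my own illustration (not a calculation from the thread), tabulating W for 100 two-state sites:

```python
from math import comb

N = 100  # sites, each either "ancestral" or "deleterious"

# W(d) = N! / ((N - d)! d!): the number of arrangements with exactly
# d deleterious sites among N.
W = [comb(N, d) for d in range(N + 1)]

# The all-ancestral and all-deleterious extremes are unique
# arrangements (W = 1); the 50/50 split maximizes the multiplicity,
# and hence S = k ln W.
peak = max(range(N + 1), key=lambda d: W[d])
```

So under this counting, entropy rises as deleterious sites accumulate only until the halfway point, then falls again.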
This message has been edited by JustinC, 05-06-2005 03:37 AM

This message is a reply to:
 Message 116 by Jerry Don Bauer, posted 05-05-2005 11:48 PM Jerry Don Bauer has replied

Replies to this message:
 Message 135 by Jerry Don Bauer, posted 05-06-2005 5:28 AM JustinC has replied

JustinC
Member (Idle past 4873 days)
Posts: 624
From: Pittsburgh, PA, USA
Joined: 07-21-2003


Message 189 of 310 (205787)
05-07-2005 1:43 AM
Reply to: Message 135 by Jerry Don Bauer
05-06-2005 5:28 AM


quote:
It could appear that way if the system were other than a far from equilibrium, living system. Just looking at the graph above we can see that maximum entropy with the 4 coins is two heads and two tails. So technically, you are correct again.
But the reality is, the organism will not be alive long enough to allow a genome to reach anywhere near a state of (100%)! / (50% ancestral)!(50% deleterious)! because this would mean that 50% of the proteins in our body transcribed by genes, just would not work the way they were initially designed to work.
I'm not really concerned with the practical application, just the theoretical framework you are proposing. What is the actual relationship between functional genes and entropy in your framework?
If I take a human genome (which, according to you, has "devolved" from an ideal ancestral state, and is therefore higher in entropy according to the equation you used) and cause a deleterious mutation in every gene, would I thereby decrease the genome's entropy?
Mathematically, using the formula W = (N1+N2)!/(N1!N2!): say we start with 1000 genes that produce 1000 fully functional proteins. Originally, our entropy is proportional to:
W = (1000!)/((1000!)(0!)) = 1
Then it "devolves" after a couple of generations to:
W1 = (1000!)/((900!)(100!)) > 1
Entropy is increasing, as you state. Next, I cause point mutations in essential DNA triplets that decrease the function of every gene in the organism. The equation becomes:
W2 = (1000!)/((0!)(1000!)) = 1
So, by causing more deleterious mutations, i.e., corrupting the original information, I caused the entropy to decrease.
What, then, is the relationship you are trying to show between entropy and deleterious mutations? Is it that as deleterious mutations accumulate in a population, entropy increases only up to a point and then begins to decrease?
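The three values above can be reproduced directly with Python's exact binomial, assuming the 1000-gene toy numbers from the post:

```python
from math import comb

GENES = 1000

W_initial = comb(GENES, 0)     # (1000!)/((1000!)(0!)): all genes functional
W_partial = comb(GENES, 100)   # (1000!)/((900!)(100!)): 100 deleterious
W_final   = comb(GENES, 1000)  # (1000!)/((0!)(1000!)): all deleterious

# The fully functional and fully broken genomes have the same
# multiplicity (W = 1), so this "entropy" rises and then falls again
# as deleterious mutations accumulate past the halfway point.
```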
Also, though I suspect you are right, I'm not sure that no organism could live with 50 percent of its genome carrying deleterious mutations. Maybe not on its own, but what about with some scaffolding? It's well known that organisms living in symbiotic relationships tend to have smaller genomes than their free-living relatives. So what if we took a simple organism, an archaeon, mutated half its genes to a non-ancestral sequence, and then supplied all the limiting reagents it needs to grow? This would be a man-made symbiotic relationship.
Of course, such an experiment may not be feasible right now, since we don't know the exact function of every gene in any particular genome (I think, though I could be wrong; does anybody know whether we have this knowledge for any species?). But if we did this one day, what would we say about the entropy of the genome? Would we say it is then at its maximum and can only decrease from that point?
This is a digression, though, since it is the hypothetical argument above that I am trying to put forth.
And as a side question, are you using Eyre-Walker as an argument against evolution, as an argument for ID, or both? The former could be attempted, but can the latter?
The reason I ask is that the Eyre-Walker paper makes several assumptions grounded in evolution (e.g., out-group comparison and the age of the chimp-human split) to arrive at its number of deleterious (non-synonymous) mutations, so I don't see how it can be used in an argument for ID, unless the argument is of the general form: not X, therefore Y.
This message has been edited by JustinC, 05-07-2005 01:47 AM

This message is a reply to:
 Message 135 by Jerry Don Bauer, posted 05-06-2005 5:28 AM Jerry Don Bauer has replied

Replies to this message:
 Message 193 by Jerry Don Bauer, posted 05-07-2005 4:10 AM JustinC has replied

JustinC
Member (Idle past 4873 days)
Posts: 624
From: Pittsburgh, PA, USA
Joined: 07-21-2003


Message 190 of 310 (205791)
05-07-2005 1:50 AM
Reply to: Message 185 by Jerry Don Bauer
05-06-2005 10:00 PM


quote:
There is not a whole bunch of probability in there to begin with. And if you guys should attempt to extrapolate those hotspots to the entire genome, you would be attempting to refute a major tenet of Darwinism wherein evolution happens via random mutation and selection. Would you not then be in danger of perhaps being called pseudo-scientists by your own peers? Something to think about.
Random, in the evolutionary context, means irrespective of fitness. That is, mutations don't direct evolution; they only create variation for natural selection to work on.

This message is a reply to:
 Message 185 by Jerry Don Bauer, posted 05-06-2005 10:00 PM Jerry Don Bauer has not replied

JustinC
Member (Idle past 4873 days)
Posts: 624
From: Pittsburgh, PA, USA
Joined: 07-21-2003


Message 211 of 310 (205911)
05-07-2005 6:22 PM
Reply to: Message 193 by Jerry Don Bauer
05-07-2005 4:10 AM


quote:
I'm not sure I understand the question. Functional genes are those genes that continue to translate proteins that fulfill a need in the organism. I've never heard them expressed as extropy, but since extropy is the opposite of entropy, perhaps they could be if we don't take it too far.
I'm not asking about extropy, since that concept isn't as well developed as entropy. In your previous posts, you were trying to show that as more functional genes lost their function, entropy went up, in accordance with the second law of thermodynamics. You were trying to equate loss of information with an increase in entropy, isn't that correct?
quote:
Hmmm....I'm getting rather suspicious that you are trying to set me up with something or another, Justin. What do you think would happen entropically if suddenly every protein in an organism no longer worked?
Yes, the organism wouldn't function; I'm not arguing that. I'm using your equations to derive an absurdity, which would mean your calculation is invalid. I know what would actually happen, but according to your equations the entropy would decrease, as I showed in my previous post.
quote:
No, you keep saying this but I certainly never have. I know exactly what you are attempting here and I'm enjoying see you set it up.
The entropy increases and the further we get to mutational meltdown, it begins to increase even more, arithmetically. The population then goes extinct. Remember the graph I posted?
I know; I saw the graph. But that is beside the point. We are talking about the information in the genome and what it tends to do. You say it tends to decrease as more deleterious mutations arise, and that entropy increases, in accordance with the second law of thermodynamics.
What I am trying to show you, along with the other posters using the coin analogy, is that the second law of thermodynamics doesn't apply to information theory. There is a disconnect.
I am using your equations for this calculation, not mine. According to your equation, as I increase the number of deleterious mutations past a certain point, the entropy actually decreases. I know this isn't what you are trying to say happens, so I don't think the equations, or your calculation, are useful in this scenario.
According to your equations, if I take a fully functional genome, use PCR to amplify it while introducing random point mutations each generation, take a sample, PCR it again, and repeat, the information would decrease and the entropy would go up. This is what you want to say happens, correct?
But you can start from the other end too: according to your equations, if I start with a genome full of deleterious mutations and manually replicate it while introducing random point mutations, information would increase.
I know you are not trying to say this, but the equations you used to calculate the entropy do say it; I showed as much in my previous post. I think this calls into question your previous calculation, and calls into question the equating of Shannon entropy with thermodynamic entropy.
quote:
Additionally, You have completely left the second law of thermodynamics at this point and we are no longer on the same subject. The second law dictates that with spontaneous events, entropy will tend to increase. There is nothing spontaneous about your purposeful interference in the system by adding energy in the form of your actions (work) into it.
Not really. I would merely be taking the place of the cell's replication machinery, so I would be adding no more energy than the cell itself would. I wouldn't be introducing the information, just replicating it and introducing random point mutations (via UV radiation or some mutagen), just as happens in nature.
You are saying the entropy tends to increase in such a system. Well, if I start with every gene mutated deleteriously, the only way for entropy to increase, according to your equations, is for some of them to become functional again.
To summarize: according to your equations, a genome in which every gene is deleteriously mutated has low entropy compared to a genome in which half the genes are deleteriously mutated and half are functional. Don't you see a problem with this?
quote:
Water heaters don't come to equilibrium either if we supply them a source of energy with intelligence in the form of a thermostat to reheat the water every time it starts cooling off.
Again, an organism isn't a closed system either. All I would be doing is taking the place of the replication machinery to ensure replication occurs. The intelligence is not introducing information into the system; it is merely acting as the replication machinery. According to your equations, if I introduce random, spontaneous mutations throughout a genome in which every gene is deleteriously mutated, and entropy must increase, then half the genes will become functional.
Your calculations assumed a far-from-equilibrium state in one direction; I'm just assuming a far-from-equilibrium state in the opposite direction and seeing where your equations take us.
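The mutagenesis thought experiment can be caricatured with a toy simulation. This is entirely my own sketch, under a loudly stated assumption: each gene flips between "functional" and "deleterious" independently with a small per-generation probability, which is a deliberate oversimplification of real mutation:

```python
import random
from math import comb

N = 1000        # genes in the toy genome
P_FLIP = 0.01   # assumed per-gene, per-generation chance of a state flip
rng = random.Random(0)

def generation(deleterious):
    # Each gene flips state independently with probability P_FLIP.
    back = sum(rng.random() < P_FLIP for _ in range(deleterious))
    forward = sum(rng.random() < P_FLIP for _ in range(N - deleterious))
    return deleterious - back + forward

d = N  # start with every gene deleteriously mutated: W = C(1000, 1000) = 1
for _ in range(500):
    d = generation(d)

# Unbiased flipping drifts d toward N/2, where W = C(N, d) peaks.
# Starting from the all-deleterious genome, the multiplicity can only
# rise, which under this counting requires genes to become functional again.
```

The drift toward the 50/50 peak happens from either starting point, which is exactly the symmetry the post is pointing at.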
I'll address the rest of the post later.
This message has been edited by JustinC, 05-07-2005 06:27 PM

This message is a reply to:
 Message 193 by Jerry Don Bauer, posted 05-07-2005 4:10 AM Jerry Don Bauer has replied

Replies to this message:
 Message 243 by Jerry Don Bauer, posted 05-08-2005 12:44 AM JustinC has not replied

Copyright 2001-2023 by EvC Forum, All Rights Reserved
