Topic: Intelligent Design in Universities
JustinC, Member (Idle past 4873 days), Posts: 624, From: Pittsburgh, PA, USA
quote:Can you go into a little more detail about how that equation, which is apparently used to calculate the entropy of a box filled with gaseous particles, can be applied to the genome the way you did? Also, can you explain how solving a state function indicates a change in something? That is, when I see someone showing that entropy is increasing or decreasing using the Boltzmann equations, they usually use ΔS = k ln(W2/W1). This is because S is just the entropy associated with a given state of matter (or energy), not a change in entropy. I'm not trying to imply you're wrong, since my background in entropy is limited to chemistry and physics, and I'm not sure how to apply it to information.
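For concreteness, here's a quick Python sketch (my own illustration, not anything from the posts I'm replying to) of the two formulas I mean: the multiplicity W = (N1+N2)!/(N1!N2!) for a two-state system, and the entropy change ΔS = k ln(W2/W1):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def multiplicity(n1, n2):
    """W = (n1+n2)! / (n1! n2!): number of microstates of a two-state system."""
    return math.comb(n1 + n2, n1)

def delta_S(w1, w2):
    """Entropy change between two macrostates: dS = k ln(W2/W1)."""
    return k_B * math.log(w2 / w1)

# 10 particles: all on the left side vs. evenly split between sides
w_start = multiplicity(10, 0)  # -> 1
w_end = multiplicity(5, 5)     # -> 252
print(delta_S(w_start, w_end) > 0)  # entropy increases toward the even split
```

Note that ΔS depends only on the ratio W2/W1, which is why a single value of W (a state function) tells you nothing about a *change* until you compare two states.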
quote:I'm still having a little trouble seeing the analogy with the gas molecules. I understand that W is a quantity that can be calculated for a variety of states of matter (i.e., not just gases), but can the formula W = (N1+N2)!/(N1!N2!) be used in this particular situation? Using the parameters from the web page you linked to, each gas molecule can be in one of two states: on the right side or on the left. In your case, each nucleotide can be in one of two states: ancestral or deleteriously mutated (assuming, for simplification, that ancestral nucleotides aren't deleteriously mutated). Is this correct? According to that equation, W is highest in the gaseous example when the two states are equal in number; entropy is therefore highest in that case. How does this apply to mutations in the genome? Following the analogy, W is highest when half the nucleotides are ancestral and half are deleterious? Is this what you are trying to say? That there is a limit to the number of deleterious mutations that can occur in a genome? Do deleterious mutations increase the entropy of the genome only up to a point, and then decrease it as more deleterious mutations take place? What relation are you trying to convey between deleterious mutations and entropy? I'm honestly not sure. This message has been edited by JustinC, 05-06-2005 03:37 AM
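To illustrate the claim that W peaks at an even split, here's a small Python check (my own toy calculation; N = 100 is an arbitrary choice):

```python
from math import comb

N = 100  # total molecules (or nucleotides); arbitrary illustration
# Multiplicity W = N!/(n1! n2!) for every possible split n1 / (N - n1)
ws = [comb(N, n1) for n1 in range(N + 1)]

# W (and hence the Boltzmann entropy k ln W) is maximized at the even split,
# and equals 1 at either extreme (all in one state)
print(ws.index(max(ws)))  # → 50
print(ws[0], ws[N])       # → 1 1
```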
quote:I'm not really concerned with the practical application, just the theoretical framework you are proposing. What is the actual relationship between functional genes and entropy according to your framework? If I took a human genome (which, according to you, has "devolved" from an ideal ancestral state and is therefore higher in entropy according to the equation you used) and caused a deleterious mutation in every gene, would I thereby decrease the genome's entropy? Mathematically, using the formula W = (N1+N2)!/(N1!N2!): say we started with 1000 genes that produced 1000 fully functional proteins. So originally our entropy is proportional to

W = 1000!/(1000! 0!) = 1

Then it "devolved" after a couple of generations to

W1 = 1000!/(900! 100!) > 1

Entropy is increasing, as you state. Next, I cause point mutations in essential DNA triplets that destroy the function of every gene in the organism. The equation becomes

W2 = 1000!/(0! 1000!) = 1

So, as I caused more deleterious mutations, i.e., fudged up the original information, I caused the entropy to decrease. What, then, is the relationship you are trying to show between entropy and deleterious mutations? Is it that as deleterious mutations accumulate in a population, entropy increases, but only up to a point, after which it begins to decrease? Also, I'm not sure, though I suspect you are right, when you say that no organism could live with deleterious mutations in 50 percent of its genome. Maybe not on its own, but how about with some scaffolding? It's well known that organisms living in symbiotic relationships tend to have smaller genomes than their free-living relatives. So what if we took a simple organism, an archaean, mutated half its genes to a non-ancestral sequence, and then supplied all the limiting reagents needed for it to grow? This would be a man-made symbiotic relationship.

Of course, at this time it may not be feasible to do such an experiment, since we don't know the exact function of every gene in any particular genome (I think, though I could be wrong. Does anybody know whether we have this knowledge for any species?). But if we did this one day, what would we say about the entropy of the genome? Would we say it is then at its maximum and can only decrease from that point? This is a digression, though, since it is the hypothetical argument above that I am trying to put forth. And as a side question, are you using the Eyre-Walker paper as an argument against evolution, as an argument for ID, or both? The former could be attempted, but can the latter? I ask because the Eyre-Walker paper makes several assumptions grounded in evolution (e.g., out-group comparison and the age of the chimp-human split) to arrive at its number of deleterious mutations (non-synonymous mutations, according to the study), so I don't see how it can be used in an argument for ID. Unless the argument is of the general form: not x, so y. This message has been edited by JustinC, 05-07-2005 01:47 AM
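The arithmetic above is easy to verify. Here's a short Python sketch of the same 1000-gene calculation (purely illustrative; the gene counts are the hypothetical numbers from my example):

```python
from math import comb, log

GENES = 1000  # hypothetical genome: every gene either functional or mutated

def W(functional):
    """Multiplicity W = GENES!/(functional! * mutated!) for a two-state genome."""
    return comb(GENES, functional)

assert W(1000) == 1  # all genes ancestral/functional
assert W(0) == 1     # every gene deleteriously mutated: the SAME multiplicity
assert W(900) > 1    # partway there, W is higher than at either extreme

# ln W rises as mutations accumulate, peaks at the 50/50 point, then falls
ln_w = [log(W(f)) for f in range(0, GENES + 1, 100)]
print(ln_w.index(max(ln_w)))  # → 5, i.e. 500 functional / 500 mutated
```

This is exactly the absurdity I'm pointing at: by this formula, pushing past the halfway mark makes the "entropy" go back down.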
quote:Random, in the evolutionary context, means irrespective of fitness. That is, mutations don't direct evolution; they only create variation for natural selection to work on.
quote:I'm not asking about extropy, since that concept isn't as well developed as entropy. From your previous posts, you were trying to show that as more functional genes lost their function, entropy went up, in accordance with the second law of thermodynamics. You were trying to equate loss of information with entropy, isn't that correct?

quote:Yes, the organism wouldn't function. I'm not arguing that. I'm using your equations to show an absurdity, and therefore that your calculation must be invalid. I know what would actually happen, but according to your equations the entropy would decrease, as I showed in my previous post.

quote:I know, I saw the graph. But that is beside the point. We are talking about the information in the genome and what it tends to do. You say it tends to decrease as more deleterious mutations arise, and that entropy increases, in accordance with the second law of thermodynamics. What I am trying to show you, along with the other posters using the coin analogy, is that the second law of thermodynamics doesn't apply to information theory. There is a disconnect. I am using your equations to do this calculation, not mine. According to your equation, once I push the number of deleterious mutations past a certain point, the entropy actually decreases. I know this isn't what you are trying to say happens, so I don't think the equations, or your calculation, are useful in this scenario. According to your equations, if I take a fully functional genome, use PCR to amplify it and introduce random point mutations every generation, take a sample, PCR it again, and repeat, the information would decrease and the entropy would go up. This is what you want to say happens, correct? But you can start from the other end too: according to your equations, if I start with a genome full of deleterious mutations and manually replicate it while introducing random point mutations, information would increase.

I know you are not trying to say this, but the equations you used to calculate the entropy do say it. I showed this in my previous post. I think this calls your previous calculation into question, and calls into question the equating of Shannon entropy with thermodynamic entropy.

quote:Not really. I would merely be taking the place of the replication machinery of the cell, so I would be adding no more energy than the cell would itself. I wouldn't be introducing the information, just replicating it and introducing random point mutations (via UV radiation or some mutagen), just as happens in nature. You are saying the entropy tends to increase in such a system. Well, if I start with every gene mutated deleteriously, the only way for entropy to increase, according to your equations, is for some of them to become functional again. To summarize: according to your equations, a genome in which every gene is deleteriously mutated has a lower entropy than a genome in which half the genes are deleteriously mutated and half are functional. Don't you see a problem with this?

quote:Again, an organism isn't a closed system either. All I would be doing is taking the place of the replication machinery to ensure that replication occurs. The intelligence is not introducing information into the system, merely acting as the replication machinery. According to your equations, if I introduce random, spontaneous mutations throughout a genome in which every gene is deleteriously mutated, and entropy must increase, then half the genes will become functional. Your calculations assumed a far-from-equilibrium state in one direction; I'm just assuming a far-from-equilibrium state in the opposite direction and seeing where your equations take us. I'll address the rest of the post later. This message has been edited by JustinC, 05-07-2005 06:27 PM
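Here's a toy simulation of that last scenario (entirely my own sketch, and it assumes, unrealistically, that a random point mutation is equally likely to break a functional gene as to restore a broken one — that symmetric two-state picture is just what the multiplicity equation implies, not real mutation biology):

```python
import random
from math import comb

random.seed(1)
GENES = 1000

# Start from the scenario above: every gene deleteriously mutated.
functional = [False] * GENES

# Symmetric toy model: each step, one randomly chosen gene flips state
# (a point mutation breaks a working gene or restores a broken one).
for _ in range(20_000):
    i = random.randrange(GENES)
    functional[i] = not functional[i]

n_func = sum(functional)
print(n_func)  # drifts toward the 50/50 split...
# ...where the multiplicity W = C(1000, n_func) is enormous compared
# with the starting state's W = C(1000, 0) = 1
print(comb(GENES, n_func) > comb(GENES, 0))
```

Under this model the all-mutated genome spontaneously "gains" functional genes simply because the 50/50 macrostate has the most microstates, which is the absurd consequence I'm pointing out.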
Copyright 2001-2023 by EvC Forum, All Rights Reserved