Author Topic:   Jerry's Calculation of Entropy in Genome
JustinC
Member (Idle past 4869 days)
Posts: 624
From: Pittsburgh, PA, USA
Joined: 07-21-2003


Message 1 of 23 (207279)
05-12-2005 1:03 AM


I've been away from the computer for a while, and sadly the topic where Jerry and I were discussing his calculation was closed. The conversation was getting a bit messy anyway, so I would just like to comment on your calculation one more time and get a response. The original conversation started with this message:
EvC Forum: Intelligent Design in Universities
The calculation was
quote:
I then introduced the mathematics to show this deterioration of the human genome in order to quantify it: I began by throwing out a formula from The University of New South Wales, physics department:
This states that W will equal a factorial relationship of the differences of what we are considering (accumulating deleteriously mutated genes as opposed to the rest of the genome), or W = (41469.4 + 1.6)! / [(41469.4)! (1.6)!]. So let's just calculate our weight and then we can go to Boltzmann's math to calculate entropy.
W = (41469.4 + 1.6)! / [(41469.4)! (1.6)!] = 3.66 x 10^173494 / 2.14 x 10^173487
W = 1.71 x 10^7
Now we can do Boltzmann's math:
S = K log W, S = (1.38 x 10^-23) log(1.71 x 10^7)
S = 9.98 x 10^-23
There is more than one way to skin a cat, of course. I can stick joules and degrees Kelvin in Boltzmann's formula for the math purist, but most no longer do this.
This math shows the macroevolution inherent in Darwinism standing refuted both scientifically (the study) and mathematically because our final calculation shows increasing entropy in the human genome and therefore disorganization in that genome for the last 6 million years. There is no evidence it has been any different in the annals of human history.
Jerry is putting nucleotides into two categories for this calculation, ancestral and deleteriously mutated. After this, he uses the equation (N1+N2)!/(N1!N2!) to calculate the supposed gain in entropy of the genome after one generation of deleterious mutations, using the Eyre-Walker estimate of 1.6 deleterious mutations per generation.
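For concreteness, here is a minimal Python sketch (not Jerry's own code) that reproduces the quoted arithmetic; the non-integer "factorials" are evaluated with the gamma function, and the base-10 log is used only because the quote writes S = K log W:

```python
# A sketch (not Jerry's code) that reproduces the quoted arithmetic.
# Non-integer "factorials" are evaluated with the gamma function: x! = gamma(x + 1).
from math import lgamma, exp, log10

N1 = 41469.4   # "ancestral" figure used in the quote
N2 = 1.6       # deleterious mutations per generation (Eyre-Walker estimate)
k = 1.38e-23   # Boltzmann's constant, J/K

# ln W = ln[(N1 + N2)!] - ln[N1!] - ln[N2!]
ln_W = lgamma(N1 + N2 + 1) - lgamma(N1 + 1) - lgamma(N2 + 1)
W = exp(ln_W)        # roughly 1.71e7, matching the quoted value
S = k * log10(W)     # roughly 9.98e-23; log10 only because the quote uses "log"
print(W, S)
```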
So here's my counter, using a reductio ad absurdum type of argument. It's very simple.
Jerry supposes the entropy will go up from some ideal ancestral state, and this will be correlated with an information loss. He uses the calculation above.
N1 will represent ancestral nucleotides and N2 will represent deleteriously mutated nucleotides. The original statistical weight, before any mutations, is:
(N1 + 0)!/(N1! 0!) = 1
After one round of mutations:
(N1 + N2)!/(N1! N2!) > 1
Eventually, we'll reach a point where we have more deleterious mutations than we have ancestral nucleotides. After this, entropy will begin to decrease.
So, the more deleterious mutations that accumulate after that point, the more the entropy will decrease, until W reaches 1 again.
This goes against the original statement that "an increase in entropy is correlated with a loss of information", since now a decrease in entropy will be correlated with a loss of information.
It also goes against the notion that entropy is a measure of disorder, since as disorder goes up (in the sense that information is being lost) the entropy decreases. This seems absurd.
I would like to remind Jerry that entropy is a state function, and doesn't depend on the path taken to get to that particular state. So replying that an organism will be extinct by the time the mutations reach that level is not a counterargument. It's also way off target, since we are not talking about an organism but about the entropy associated with the genome. The fate of the organism after the change in the genome isn't relevant.
So to summarize, if I have understood the calculation, it seems that a decrease in information can result in a decrease in entropy (not an increase).
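Here is a minimal sketch of that behavior (Python; the 100-nucleotide total is purely illustrative):

```python
# A sketch of the turnaround described above; the 100-site total is illustrative.
from math import comb

N = 100                              # total nucleotides (N1 + N2)
for n_mutated in range(0, N + 1, 25):
    W = comb(N, n_mutated)           # (N1 + N2)!/(N1! N2!) with N1 = N - n_mutated
    print(n_mutated, W)
# W climbs until half the sites are mutated (n_mutated = 50), then falls back
# to 1 at n_mutated = 100, so S = k ln W returns to zero as information is lost.
```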
Anyone is welcome to comment, especially if they see an error in my reasoning.
This message has been edited by JustinC, 05-12-2005 06:53 PM

Replies to this message:
 Message 2 by AdminNosy, posted 05-12-2005 1:08 AM JustinC has replied
 Message 5 by PaulK, posted 05-13-2005 3:01 AM JustinC has not replied

  
JustinC
Member (Idle past 4869 days)
Posts: 624
From: Pittsburgh, PA, USA
Joined: 07-21-2003


Message 3 of 23 (207513)
05-12-2005 6:54 PM
Reply to: Message 2 by AdminNosy
05-12-2005 1:08 AM


Re: A link to the old thread?
Link Added

This message is a reply to:
 Message 2 by AdminNosy, posted 05-12-2005 1:08 AM AdminNosy has not replied

  
JustinC
Member (Idle past 4869 days)
Posts: 624
From: Pittsburgh, PA, USA
Joined: 07-21-2003


Message 19 of 23 (208230)
05-14-2005 8:37 PM
Reply to: Message 15 by Jerry Don Bauer
05-14-2005 6:35 PM


quote:
Shouldn't take long to put this whole thread to bed. The math I used simply estimated the entropy increase in the FIRST GENERATION.
I calculated S. This math will not calculate continually changing entropies from generation to generation because that is not S, but deltaS. You are assuming:
W = (41469.4 + 1.6)! / [(41469.4)! (1.6)!] = 3.66 x 10^173494 / 2.14 x 10^173487
W = 1.71 x 10^7
Boltzmann's math:
S = K log W, S = (1.38 x 10^-23) log(1.71 x 10^7)
deltaS = 9.98 x 10^-23,
This is not correct!
To get deltaS you then have to take that first generation down the lineage:
deltaS = S(final) - S(initial)
I honestly have a hard time following this. I'll just calculate W, since S is a monotonically increasing function of W (S = k ln W). Also, I know I was only calculating W, not delta W; but delta W could be found by comparing the different generations. Here is the calculation again, for a 1,000-nucleotide segment that starts out entirely ancestral.
We start off with a W of:
1.) (1000!)/[(1000!)(0!)] = 1
After I mutate a quarter of the nucleotides to deleteriously affect some genes:
2.) (1000!)/[(750!)(250!)]>1
After I mutate half of the nucleotides:
3.) (1000!)/[(500!)(500!)] >>1
After I mutate all of the nucleotides, so no information is left in the 1000 nucleotide segment:
4.) (1000!)/[(0!)(1000!)]=1
As you can see, the change in W (and hence in entropy) is always positive except for the last change, from (3) to (4), which is:
5.) Delta W = 1 - (>>1) = -N, a large negative number
I apologize for the abbreviations since I don't have a calculator handy, but you should get the point.
According to your calculation, if I take a one-thousand-nucleotide DNA sequence full of genes and then mutate every nucleotide to deleteriously affect the gene products, the entropy ends up the same as it was before any mutations. Or, to put it another way, a one-thousand-nucleotide DNA sequence with half of its nucleotides deleteriously mutated will have a higher entropy than the same sequence with all of its nucleotides mutated. The change would be negative going from the former to the latter using your equation.
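For anyone who wants to check the arithmetic, here is a short sketch of steps (1) through (4) (Python; the 1,000-site figure is the one from the example above):

```python
# A sketch spelling out steps (1)-(4) above and the sign of each step's Delta S.
from math import comb, log

k = 1.38e-23                        # Boltzmann's constant, J/K
mutated = [0, 250, 500, 1000]       # stages (1) through (4)
S = [k * log(comb(1000, m)) for m in mutated]

for i in range(1, len(mutated)):
    dS = S[i] - S[i - 1]
    print(f"{mutated[i-1]:4d} -> {mutated[i]:4d} mutated: Delta S = {dS:+.3e} J/K")
# The first two steps are positive; the last (500 -> 1000) is negative.
```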
quote:
As you can see, since the study showed a steady accumulation of 1.6 mutations per generation, entropy will NEVER begin to decrease, so this should communicate to you that you're probably not doing something right.
I'm just using the equations you used to show an absurdity. That should communicate to you that your equations aren't sound. Please show me exactly where my calculation is in error.
This message has been edited by JustinC, 05-14-2005 08:40 PM

This message is a reply to:
 Message 15 by Jerry Don Bauer, posted 05-14-2005 6:35 PM Jerry Don Bauer has replied

Replies to this message:
 Message 20 by Jerry Don Bauer, posted 05-14-2005 10:21 PM JustinC has replied

  
JustinC
Member (Idle past 4869 days)
Posts: 624
From: Pittsburgh, PA, USA
Joined: 07-21-2003


Message 22 of 23 (208453)
05-15-2005 6:51 PM
Reply to: Message 20 by Jerry Don Bauer
05-14-2005 10:21 PM


quote:
No. This is not my calculation. This is yours and your math is simply incorrect. Please cut and paste where I used a deltaW anywhere.
It doesn't matter whether I use W or S, or Delta W or Delta S: S is a monotonically increasing function of W (S = k ln W). All I would be doing is taking the ln of the number and multiplying it by a constant. You can do the extra math if you want, but the results will turn out the same.
quote:
We are calculating entropy, not just the statistical weight. And when you calculate entropy down the lineage you must use deltaS, not just S which will always be the case with this math you are introducing.
Yes, I know I am using W; S rises and falls with W. Do the extra math if you would like, the answer will be the same. For instance, look at my last generation, with all the nucleotides deleteriously mutated:
W = (1000!)/[(0!)(1000!)] = 1
S = k ln W = 0
So delta S going from equation (3) to (4) would be:
Delta S = Sf - Si = 0 - k ln(>>1) = -N
That final entropy would be zero and the entropy before it would be a positive number, giving a negative delta S: a decrease in entropy as more deleterious mutations accumulate.
quote:
You KNOW entropy will not magically begin to decrease when the study clearly shows 1.6 harmful mutations continue to accumulate. So why are you mathematically trying to show something mathematically correct that you know to be mathematically incorrect?
You were the one trying to equate information loss (in the sense of changing the ancestral state of the genome) with entropy increase. You used that equation to show it. I used that equation to show an absurdity, which (if correct) calls into question your whole calculation. Go from (3) to (4) in my calculation, calculate S's for both, and then find delta S. It will be a decrease in entropy as more deleterious mutations occur.
quote:
The combinatorials need only be used once, to calculate the entropic change in an organism with x amount of nucleotides where y mutations are accumulating. Since x and y are always the same, what is there to recalculate using that formula?
I don't see what's so hard to understand. I am increasing y, i.e., mutating more than half of the genes. When I do this, your equation says the entropy will decrease. This calls into question your calculation, since I don't think you want to be saying this.
quote:
Please use deltaS to calculate this changing entropy.
I did above, but I'll do it again for the hell of it. Equation (4) says W=1, so the entropy will be:
Sf = k ln 1 = 0
Equation (3) says the entropy is greater than 0 (equation (3) actually gives an enormous W, but I'll use W = 1.3 as a stand-in for any value greater than 1), so:
Si = k ln(1.3) > 0
So Delta S, going from (3) to (4), would be:
Delta S = Sf - Si = 0 - (>0) = -N.
It will be negative still.
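To put a concrete number on that "-N", here is a quick sketch using the 1,000-nucleotide example from my earlier message (figures illustrative only):

```python
# Putting a concrete number on the "-N" above (illustrative figures only).
from math import comb, log

k = 1.38e-23                      # Boltzmann's constant, J/K
S3 = k * log(comb(1000, 500))     # state (3): half the sites mutated, ~9.5e-21 J/K
S4 = k * log(comb(1000, 1000))    # state (4): every site mutated, W = 1 so S = 0
print(S4 - S3)                    # negative, roughly -9.5e-21 J/K
```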
quote:
Considering change in entropy between 500 and 501 descendants:
Initial entropy in 500th organism = (500)(9.98 x 10^-23) = S(initial)
Final entropy in 501st organism = (501)(9.98 x 10^-23) = S(final)
deltaS = S(final) - S(initial)
deltaS is positive showing the new accumulated mutations we know occurred in that genome. Now THIS is my math.
I have absolutely no idea what that is supposed to show. We are talking about the entropy of a genome if we dichotomize it into ancestral and deleteriously mutated nucleotides, using the equation for statistical weight, (N1+N2)!/((N1!)(N2!)). What does the generation of the organism have to do with that, and how would it factor into the calculation?
The only way I can see it factoring into the equation would be to write the number of deleterious mutations as a function of the generation. The result will be the same: once we get past a certain point, the more deleterious mutations accumulate, the more the entropy will decrease (see the sketch below).
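Here is a rough sketch of that, assuming 1.6 deleterious mutations per generation and the 41,469.4 + 1.6 total quoted earlier; the numbers are illustrative only:

```python
# A sketch of the point above: make the mutated count a function of the
# generation (1.6 per generation) and the same combinatorial "entropy"
# eventually turns around and decreases. Figures are illustrative only.
from math import lgamma

N_TOTAL = 41471.0   # total sites, per the 41469.4 + 1.6 figure quoted earlier
PER_GEN = 1.6       # deleterious mutations per generation (Eyre-Walker)
k = 1.38e-23        # Boltzmann's constant, J/K

def S_of_generation(g):
    """k ln W, with W = N!/(n1! n2!) and n2 = 1.6 * g mutated sites (via lgamma)."""
    n2 = min(PER_GEN * g, N_TOTAL)
    n1 = N_TOTAL - n2
    return k * (lgamma(N_TOTAL + 1) - lgamma(n1 + 1) - lgamma(n2 + 1))

for g in (0, 1000, 12960, 20000, 25920):
    print(f"generation {g:6d}: S = {S_of_generation(g):.3e} J/K")
# S rises until roughly half the sites are mutated (about generation 12,960
# here), then falls back toward zero: the decrease described above.
```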
quote:
No you're not. You are trying to extrapolate a formula I used in a way I did not use it. This is your math, not mine.
The formula should be consistent with the point you are trying to prove. You are trying to prove that deleterious mutations in the genome constitute an increase in entropy. Your equation says they do only up to a point, after which the entropy will decrease.
quote:
Um....I think I did.
I don't. Calculate the entropies. Entropy will decrease after a certain point as more deleterious mutations accumulate.
The reason I am going to the trouble of this is that I think you just pulled that equation off the internet and tried to use it without understanding it.
It works great for a gas in a box, but you can't just extrapolate it to any binary system, as PaulK was saying:
quote:
Jerry is wrong about configurational entropy. Jerry's entropy argument works in exactly the same way for ANY binary classification of genes or mutations, not just "detrimental"/"not detrimental". If you chose to look at beneficial rather than detrimental mutations you would find that each beneficial mutation increased the entropy. This form of entropy depends very much on how the problem is framed. And it is not valid to assume that the entropy will tend towards the maximum for every possible measure, because different measures give different results.
That's the crux.
This message has been edited by JustinC, 05-17-2005 07:45 PM

This message is a reply to:
 Message 20 by Jerry Don Bauer, posted 05-14-2005 10:21 PM Jerry Don Bauer has not replied

  