Author Topic:   Intelligent Design in Universities
Jerry Don Bauer
Inactive Member


Message 131 of 310 (205498)
05-06-2005 3:52 AM
Reply to: Message 130 by PaulK
05-06-2005 3:43 AM


Re: Mike Hager asked
quote:
I can't be blinded by my religion since I don't have one.
Everyone with an IQ of 100 or better has thought metaphysics through and has a religion. Yours just pegs on the negative end of the scale.
quote:
And finally, this is still irrelevant to the point that the ID movement includes Young Earth Creationism.
It certainly does; and it also includes old-earth creationists, Jews, Muslims, agnostics, and atheists. We don't particularly care what your religious views are.

Design Dynamics

This message is a reply to:
 Message 130 by PaulK, posted 05-06-2005 3:43 AM PaulK has replied

Replies to this message:
 Message 132 by Limbo, posted 05-06-2005 4:08 AM Jerry Don Bauer has replied
 Message 133 by PaulK, posted 05-06-2005 4:27 AM Jerry Don Bauer has not replied

Jerry Don Bauer
Inactive Member


Message 135 of 310 (205512)
05-06-2005 5:28 AM
Reply to: Message 129 by JustinC
05-06-2005 3:35 AM


quote:
I'm still having a little trouble seeing the analogy with the gas molecules. I understand that W is a quantity that can be calculated with regard to a variety of states of matter (i.e., not just gases), but can the formula W = (N1+N2)!/(N1! N2!) be used in this particular situation?
Yes, and in fact this is relatively old hat in infodynamics. Here is Durham University using combinatorials to calculate the entropy and macrostates of flipped coins:
(Durham University page; link now dead)
But I think the chart I use to teach this is much clearer (with Boltzmann's constant omitted):
This is really the same formula broken down to (total elements)! / (subset)! (subset)!
It works in any system where we need to compare the statistical weight of a set of subsets against the whole, as in:
(whole)! / (subset 1)! (subset 2)! (subset 3)! ... (subset n)!
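For a quick feel of how these numbers behave, here is a minimal Python sketch of that (whole)!/(subset)!...(subset)! formula (the helper name multiplicity is mine, and Boltzmann's constant is omitted, as in the chart):

from math import factorial, log

def multiplicity(*subsets):
    # W = (whole)! / (subset 1)! (subset 2)! ... (subset n)!
    w = factorial(sum(subsets))
    for n in subsets:
        w //= factorial(n)
    return w

# Four flipped coins, macrostate = number of heads:
for heads in range(5):
    w = multiplicity(heads, 4 - heads)
    print(heads, "heads: W =", w, " ln W =", round(log(w), 3))
# W peaks at 2 heads / 2 tails (W = 6), the maximum-entropy macrostate.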
quote:
For the gas molecules, they can be in two states using the parameters from the web page you linked to: either on the right or left side. In your case, the nucleotides can be in either one of two states: ancestral or deleteriously mutated (assuming ancestral aren't deleteriously mutated for simplification). Is this correct?
It was two states in that particular gas system as there were only two chambers. Obviously systems can be designed with as many chambers as we care to have in them.
In the genome example, for the purpose of simplicity, I was considering only two states, much as Shannon considered his relay circuits either off or on--i.e., genes mutating each generation from healthy to non-healthy. Two states. And in this case I can do so because that study did not track other options per generation--just how many genes had mutated to the deleterious state after selection had acted to remove what it would.
quote:
So, according to that equation for W, W is highest in the gaseous example when both states are equal in number. Therefore, entropy is highest in that case.
Yep. You have studied thermodynamics in a formal setting; I can tell, because you are asking quite intelligent questions. Entropy (in a gas or in other far-from-equilibrium systems, like people) expresses the equilibrium of a system. Once total equilibrium is attained by a system, entropy is at a maximum. You may or may not have read my postings from Schrodinger about this:
"Schrodinger posited that maximum entropy--perfect equilibrium in the organism-- is
achieved at death. [2] And this makes sense. How could there be anything more at
equilibrium with itself and its environment than a cold, dead organism that isn‘t
functioning at all? Furthermore, how could anything be more disorganized than this same
organism? Nothing is organized adequately enough for anything to work. Yet it is
ordered because there is no chaos."
Thus if we have a two-chambered gas system (or anything similar) containing 100 molecules, when these molecules have distributed as far as they can distribute, that system is perfectly ordered but totally disorganized, and entropy is at a maximum.
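A quick numerical check of that 100-molecule picture (a minimal Python sketch; ln W stands in for the entropy, with Boltzmann's constant again omitted):

from math import comb, log

N = 100  # molecules in a two-chambered gas system
for n_left in (0, 25, 50, 75, 100):
    w = comb(N, n_left)  # W = 100! / (n_left)! (n_right)!
    print(f"{n_left:3d} left, {N - n_left:3d} right:  W = {w:.3e}  ln W = {log(w):.2f}")
# ln W is maximal at the 50/50 split: fully distributed, maximum entropy.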
It might help you to read a paper I recently wrote on this terminology:
http://designdynamics.org/order.pdf
quote:
How does this apply to mutations in the genome? Following the analogy, W is highest when half are ancestral and half are deleterious? Is this what you are trying to say? Is there a limit to the amount of deleterious mutations that can occur in a genome?
It could appear that way if the system were anything other than a far-from-equilibrium, living system. Just looking at the graph above, we can see that maximum entropy with the 4 coins is two heads and two tails. So technically, you are correct again.
But the reality is, the organism will not stay alive long enough to allow a genome to reach anywhere near a state of (whole)! / (50% ancestral)! (50% deleterious)!, because this would mean that 50% of the proteins in our body transcribed from genes just would not work the way they were initially designed to work.
The researchers (especially James Crow, an interpreter of the data) toyed with this question: if we are carrying upwards of a thousand harmful mutations, and they are accumulating slightly with each generation, why are we not in mutational meltdown and heading into extinction?
A thousand mutations are not that many when you consider all of the protein-coding regions that COULD mutate.
quote:
Do deleterious mutations only increase entropy in the genome to a point, and then decrease it as more deleterious mutations take place?
No. If deleterious mutations increase arithmetically each generation, they will keep accumulating until the population enters mutational meltdown, which old-schoolers like me might know as error catastrophe. At that point, the population enters a rapid descent off the graph into extinction. Here is the type of graph I am referring to:
[graph not reproduced]
If that population is to be saved, it must receive a fresh source of genes into the gene pool long before it reaches the critical meltdown stage.

Design Dynamics

This message is a reply to:
 Message 129 by JustinC, posted 05-06-2005 3:35 AM JustinC has replied

Replies to this message:
 Message 138 by PaulK, posted 05-06-2005 5:53 AM Jerry Don Bauer has replied
 Message 189 by JustinC, posted 05-07-2005 1:43 AM Jerry Don Bauer has replied

Jerry Don Bauer
Inactive Member


Message 136 of 310 (205513)
05-06-2005 5:31 AM
Reply to: Message 132 by Limbo
05-06-2005 4:08 AM


Re: Mike Hager asked
quote:
Don't forget us Jedi
Oh yeah! lol... I've heard of that one but will have to read up on it.

This message is a reply to:
 Message 132 by Limbo, posted 05-06-2005 4:08 AM Limbo has not replied

Jerry Don Bauer
Inactive Member


Message 137 of 310 (205514)
05-06-2005 5:42 AM
Reply to: Message 134 by Wounded King
05-06-2005 4:35 AM


Thanks for the post, but I just cannot seem to win on this forum.
I suffer through 100 posts containing arguments with no references at all, and then you spring an uber-post on me containing tons of references with no argument.
Now please slow down, back up, and redo that post, as it has much potential for people to learn from it.
But people are not going to let you bring an argument using other people's arguments, because those people are not here to debate them. Only you are.
1) Organize that post.
2) Put a cogent argument into your own words, using those postings as references, making each point you wish to make, cutting and pasting from them as you go to substantiate each point of your argument.
3) Perhaps you are right and I am wrong. We will see where it goes from there, but you need to put a little effort into communication first.
This message has been edited by Jerry Don Bauer, 05-06-2005 05:45 AM

Design Dynamics

This message is a reply to:
 Message 134 by Wounded King, posted 05-06-2005 4:35 AM Wounded King has replied

Replies to this message:
 Message 139 by Wounded King, posted 05-06-2005 6:06 AM Jerry Don Bauer has replied

Jerry Don Bauer
Inactive Member


Message 140 of 310 (205528)
05-06-2005 7:24 AM
Reply to: Message 138 by PaulK
05-06-2005 5:53 AM


quote:
Let us suppose that we mutate a sequence of coin tosses by randomly selecting a coin and tossing it again. If the original sequence has two heads and two tails, there is no chance of the entropy increasing and a 50% chance of the entropy decreasing. How then does the second law of thermodynamics apply to such a case?
Hmmm.... Paul, why are you making me think at 6:10 in the morning? If you have a four-coin configuration, two of which are heads and two tails, and you decide to flip one coin, the second law doesn't care what happens, because probabilities favoring either outcome do not exist.
The second law is a law of statistics, so considering the flipping of one coin, you pays your money and you takes your chances at 50/50 of evolving the system, or not.
We are really getting out of configurational entropy, which is expressed in the chart I posted and defines the macrostate of a system, and back into logical entropy, which defines the microstates, i.e., the number of possible states of the matter (heads or tails) raised to the power of the number of coins in the system.
The second law comes into play and gets stronger as the complexity of our system increases: one coin, 2^1; two coins, 2^2; ten coins, 2^10; 500 coins, 2^500.
In ID, this is called specificity and we can define it as:
Specificity is inversely proportional to the probability of an event occurring.
Dembski's specified information is really just a quantification of the second law, because at 2^500 we are at odds of 1 chance in 10^150 of any particular pattern occurring, and it ain't gonna happen in nature.
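The arithmetic behind that bound is easy to check (a small Python sketch of just the numbers quoted above):

from math import log10

# Microstates of an n-coin system: 2**n equiprobable patterns
for n in (1, 2, 10, 500):
    print(f"{n} coins: 2^{n} ~ 10^{n * log10(2):.1f}")
# 2**500 is about 10**150.5, i.e., any one specified pattern has odds
# of roughly 1 chance in 10**150.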
Hope that did it.

Design Dynamics

This message is a reply to:
 Message 138 by PaulK, posted 05-06-2005 5:53 AM PaulK has replied

Replies to this message:
 Message 143 by PaulK, posted 05-06-2005 8:23 AM Jerry Don Bauer has replied

Jerry Don Bauer
Inactive Member


Message 141 of 310 (205531)
05-06-2005 7:49 AM
Reply to: Message 139 by Wounded King
05-06-2005 6:06 AM


Well you seem to be working toward an argument here, so we will run with it.
quote:
The papers don't provide an argument, they provide evidence.
Then if there are no premises or conclusions in the form of an argument, why would you need evidence to support them? Think about that.
quote:
Do you want me to rephrase all their results in my own words?
No, that would be stupid. I want you to bring an argument and provide evidence to support that argument. But chill out, because you may be getting there, although I'm having to suck it out of you like George Bush treating a rattlesnake bite.
quote:
You said that all mutations were equiprobable. These papers document, or in the first case review, a number of instances where this is clearly not the case. Mutations derived both transcriptionally and from environmental factors show preferential rates for specific nucleotide conversions and in some cases the local DNA structure/sequence. Therefore not all mutations are equiprobable, even in the case of transcriptionally derived mutations.
No. Now carefully read the words that I write, because unless I am drunk, I parse them very carefully. Jerry: ".....Random mutations ARE equiprobable....." Reread the post.
I understand that some mutations are due to certain factors like photon damage, radiation, etc. But if the mutations we are considering are due to the "causes" in the ABSTRACTS (you did not present any papers, just the abstracts), then how do you consider them random?
Now see how this works? We are getting a discussion going. This good. Fire bad.

Design Dynamics

This message is a reply to:
 Message 139 by Wounded King, posted 05-06-2005 6:06 AM Wounded King has replied

Replies to this message:
 Message 144 by Wounded King, posted 05-06-2005 9:00 AM Jerry Don Bauer has replied
 Message 145 by MangyTiger, posted 05-06-2005 9:45 AM Jerry Don Bauer has replied

Jerry Don Bauer
Inactive Member


Message 146 of 310 (205555)
05-06-2005 10:20 AM
Reply to: Message 142 by paisano
05-06-2005 8:17 AM


Re: The Entropy of Flipped Coins
quote:
It doesn't. The 2LOT applies to thermodynamic entropy and heat. Period.
dS = dQ/T
It does not apply to information entropy, which is what is being computed in the coin example. No matter how much IDists want it to, it just does not apply.
Hence the equivocation.
LOL... the good Doctor of physics is reduced to the "is not, is too, is not" argument? You can do better than this. Put more effort and research into your posts. In all honesty, I think you're just lost.
Information entropy is Shannon entropy if you want to get technical. The entropy used in the coin examples is called logical entropy, not thermodynamic entropy. Why don't you plug this stuff into Google and educate yourself before posting? This will help you professionally. You can learn about the different entropies starting here.
And is that d you're using in the above formula (which doesn't apply to anything we are discussing that I can detect so far) meant to denote a differential, as in calculus, or a delta, as in a simple change? You don't state this, and you should, as forum software will not render mathematical notation, in my experience.
If it is integration, you don't have to do this anymore, as the formula deltaS = Q/T was Clausius's original math and it is widely accepted today. Just plug the formula into Google and surf until you're sick of it. But it is normally used to quantify thermodynamic reservoirs rather than what we are discussing.
"when energy Q is added to a reservoir, the entropy of this reservoir increases by Q/T where T is the temperature in some absolute unit (like Kelvin)."
(course solutions formerly at /physics/courses/PHY102.03Spring/hw/102sol4.pdf; link now dead)
We only need integration to show entropy when there is a distinct change in system temperature (there is not in a reservoir), such as a beaker containing 1 kilogram of water pulled from a fridge at 5 degC and allowed to come to equilibrium with a room at 25 degC.
If you are going to use calculus, then do it once and we can be done with it:
Now we are reduced to the formula deltaS = C ln(T2/T1), where deltaS is the change in entropy, C is the heat capacity of the substance, and T2 and T1 are the final and initial temperatures on an absolute scale.
Now that this is reduced, we need no more calculus and can figure the entropy change for any distinct temperature change we care to.
In my water example above, the specific heat of water is 4.185 J/(g K), so for 1 kilogram C = 4185 J/K, and deltaS = 4185 ln(298.15/278.15), giving:
deltaS ≈ 290 J/K
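The same arithmetic in a few lines of Python (a sketch of the beaker example above, assuming the standard value 4.185 J/(g K) for water's specific heat):

from math import log

m = 1000.0    # grams of water (1 kilogram)
c = 4.185     # specific heat of water, J/(g K)
T1 = 278.15   # 5 degC in kelvin
T2 = 298.15   # 25 degC in kelvin
deltaS = m * c * log(T2 / T1)
print(f"deltaS = {deltaS:.0f} J/K")  # about 290 J/K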
Don't throw remedial calculus at an ID theorist. She will throw it right back at you and make you look extremely silly in the process.
And in the meantime, you need only introduce math that is used in living, open systems such as that of Boltzmann, Schrodinger, Gibbs or Prigogine. You should know this at your level.

Design Dynamics

This message is a reply to:
 Message 142 by paisano, posted 05-06-2005 8:17 AM paisano has replied

Replies to this message:
 Message 148 by jar, posted 05-06-2005 10:30 AM Jerry Don Bauer has replied
 Message 150 by paisano, posted 05-06-2005 11:16 AM Jerry Don Bauer has replied

Jerry Don Bauer
Inactive Member


Message 147 of 310 (205557)
05-06-2005 10:26 AM
Reply to: Message 144 by Wounded King
05-06-2005 9:00 AM


quote:
I didn't realise you had never used a web browser before.
Ok, I'm done with you. Ad homs are logical fallacies, and I see no need to address logical fallacies.
Thank you for your input!

Design Dynamics

This message is a reply to:
 Message 144 by Wounded King, posted 05-06-2005 9:00 AM Wounded King has replied

Replies to this message:
 Message 149 by Wounded King, posted 05-06-2005 10:36 AM Jerry Don Bauer has not replied

Jerry Don Bauer
Inactive Member


Message 155 of 310 (205646)
05-06-2005 3:43 PM
Reply to: Message 148 by jar
05-06-2005 10:30 AM


Re: The Entropy of Flipped Coins
quote:
I look on a table and find three groupings of US quarters.
What color is the table?
Oh no, just kidding. Group 1 would be the most specified because it has the lowest odds of occurring. If we look up the word information at dictionary.com, we can see that one definition of it is "the probability of an experimental outcome." So I would feel pretty comfortable going with that.

Design Dynamics

This message is a reply to:
 Message 148 by jar, posted 05-06-2005 10:30 AM jar has replied

Replies to this message:
 Message 156 by jar, posted 05-06-2005 3:45 PM Jerry Don Bauer has replied
 Message 158 by kjsimons, posted 05-06-2005 4:01 PM Jerry Don Bauer has not replied

Jerry Don Bauer
Inactive Member


Message 157 of 310 (205656)
05-06-2005 3:58 PM
Reply to: Message 143 by PaulK
05-06-2005 8:23 AM


quote:
My point is that you can't assume that configurational entropy operates in the same way as thermodynamic entropy under the rules you've defined. In the situation described there is NO tendency for the measure of configurational entropy to increase. It can only decrease.
Yes I can, because it does. What is the difference between matter and energy? Einstein said they are the same, because E = mc^2.
Thermodynamic entropy deals only with energy; configurational entropy, only with the arrangements of matter. In fact, both have been used together in the same formula, expressed as S_total = S_c + S_t. I'm not doing anything new here.
quote:
Let us say that we have m + n coins with m heads and n tails.
The probability of a mutation producing no change is:
0.5 * m/(m + n) + 0.5 * n/(m + n) = 0.5
In any case but the minimum entropy case there is a non-zero probability of a decrease in entropy, so the general tendency is for S2 <= S1 in all but that case.
What is that math, Paul? Can you show me where you got it? It doesn't make a lick of sense. You didn't calculate any entropy, because you didn't take the log of anything. Nothing you plug into n and m will ever equal .5.
quote:
And worse still for your case it is not argued in evolutionary circles that the general tendency is for mutations to be beneficial rather than detrimental.
Back that up because nothing could be further from the truth.

Design Dynamics

This message is a reply to:
 Message 143 by PaulK, posted 05-06-2005 8:23 AM PaulK has replied

Replies to this message:
 Message 163 by PaulK, posted 05-06-2005 5:05 PM Jerry Don Bauer has replied

Jerry Don Bauer
Inactive Member


Message 159 of 310 (205676)
05-06-2005 4:50 PM
Reply to: Message 150 by paisano
05-06-2005 11:16 AM


Re: The Entropy of Flipped Coins
quote:
Speak for yourself. You're just sort of slinging unrelated thermodynamics around, but have yet to construct a coherent case for why your argument is evidence for anything. You've made physics mistakes, and you've made biology mistakes. I think your argument depends on insisting the mistakes aren't really mistakes, and collapses if you admit they are mistakes. But try again, omitting the mistakes, if you like.
There you go again accusing me of making mistakes without being able to state what they are. Why do you repeatedly do this?
quote:
This logical entropy appears to be a creationist/IDist invention. It certainly is not physics. There is thermodynamic entropy, there is Shannon information entropy, and there is Kolmogorov entropy, which has applications in chaos theory.
They aren't interchangeable, and the 2LOT only applies to the thermodynamic entropy, so you can't invoke it if you are going to make arguments based on Shannon entropy, as you appear to be attempting to do.
Perhaps logical entropy represents the tendency of creationist/IDist arguments to become ever more incoherent with their duration?
LOL... You are a case. The site I sent you to was constructed by Brig Klyce, a foaming-at-the-mouth anti-creationist atheist.
Why won't you research this? When I plug the term logical entropy into Google, almost 400 pages come up.
"Landauer realized that if the binary value of a bit is unknown, erasing the bit
changes the logical entropy of the system from klog2 to klog1 = 0 (shrinking the phase space available to the system has decreased the entropy by klog2)."
And look at three PhDs in materials science using thermodynamic entropy and configurational entropy in the same formula:
S = k ln(Omega_th * Omega_c) = k ln Omega_th + k ln Omega_c = S_th + S_c
http://www.ldolphin.org/mystery/chapt8.html
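That additivity is just the log of a product splitting into a sum of logs; here is a two-line Python check (with made-up multiplicities, purely illustrative):

from math import log

k = 1.380649e-23               # Boltzmann's constant, J/K
omega_th, omega_c = 1e6, 1e4   # illustrative multiplicities, not real data
lhs = k * log(omega_th * omega_c)
rhs = k * log(omega_th) + k * log(omega_c)
print(lhs, rhs)  # equal to floating-point precision: S = S_th + S_c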
quote:
Exactly my point.
Oh. The fact that the formula you wanted to use is "normally used to quantify thermodynamic reservoirs rather than what we are discussing" was exactly your point? Then why did you throw it out?
I give up. You have not backed up anything you have posted with references because THERE ARE NONE. I'm afraid you are not well enough versed in this area of physics to even discuss it, yet you think you know it all. I don't know how to deal with that.
So, I will thank you for your time and you may have the last comment.

Design Dynamics

This message is a reply to:
 Message 150 by paisano, posted 05-06-2005 11:16 AM paisano has replied

Replies to this message:
 Message 168 by paisano, posted 05-06-2005 5:36 PM Jerry Don Bauer has replied

Jerry Don Bauer
Inactive Member


Message 160 of 310 (205677)
05-06-2005 4:54 PM
Reply to: Message 156 by jar
05-06-2005 3:45 PM


Re: The Entropy of Flipped Coins
quote:
I asked "Which group contains the greatest amount of information and why?"
Well gee. I told you group one, and then why: because the specificity is higher. Did you misread the post?

Design Dynamics

This message is a reply to:
 Message 156 by jar, posted 05-06-2005 3:45 PM jar has replied

Replies to this message:
 Message 161 by jar, posted 05-06-2005 4:56 PM Jerry Don Bauer has replied

Jerry Don Bauer
Inactive Member


Message 162 of 310 (205680)
05-06-2005 5:00 PM
Reply to: Message 161 by jar
05-06-2005 4:56 PM


Re: The Entropy of Flipped Coins
quote:
Is order important or not?
LOL, it's important to me, I guess. What kind of question is that? What is your point?

This message is a reply to:
 Message 161 by jar, posted 05-06-2005 4:56 PM jar has replied

Replies to this message:
 Message 164 by jar, posted 05-06-2005 5:06 PM Jerry Don Bauer has replied

Jerry Don Bauer
Inactive Member


Message 165 of 310 (205685)
05-06-2005 5:12 PM
Reply to: Message 164 by jar
05-06-2005 5:06 PM


Re: The Entropy of Flipped Coins
quote:
Is the order of the coins important in determining the amount of information in each group of coins?
Depends on what you are looking at. One guy stated that they are all equiprobable, and from one aspect he is correct.
But groups 2 and 3 appeared more random to me. So, looking at specificity of the information, group one would be the highest in information if I have to pick one, which is the game I assume I'm playing.
But looking at configurational entropy, group 1 is the lowest. So you are going to have to get specific.
This message has been edited by Jerry Don Bauer, 05-06-2005 05:13 PM

Design Dynamics

This message is a reply to:
 Message 164 by jar, posted 05-06-2005 5:06 PM jar has replied

Replies to this message:
 Message 166 by jar, posted 05-06-2005 5:17 PM Jerry Don Bauer has replied

Jerry Don Bauer
Inactive Member


Message 167 of 310 (205689)
05-06-2005 5:35 PM
Reply to: Message 163 by PaulK
05-06-2005 5:05 PM


quote:
Oh dear, I think this conversation is taking a turn for the worse.
Well cheer up, man. The sky's not falling yet.
quote:
1) Since in the situation I described the configurational entropy can decrease (and has a probability of 0.5 of doing so) and cannot increase, it follows that there cannot be a law which states that the configurational entropy will tend to increase in that situation. That being the case, your version of the 2LOT does not apply.
I don't have a version of 2LOT; I use the same one everybody else does. So you really think that 2LOT does not apply to matter, based on the logic that when I flip a quarter it is equiprobable to get heads or tails?
Gee, Paul, I kind of admire you, because since 2LOT doesn't apply to your car, you never have to buy a new one. I do. Paint never gets old on your house, so that is nice. And you will never grow old and die. Through this logic, you have found immortality!
quote:
There is no need to calculate any logarithms because all we need to do is observe that the entropy increases the closer the number of heads and the number of tails are to each other. Thus any change which makes the numbers closer will increase entropy and any change which makes them further apart will decrease it.
I see. So the physics of Richard Feynman, who taught us that logical entropy is the way matter is arranged and that "the logarithm of that number of ways is the entropy," is just not correct, in your opinion?
The Second Law of Thermodynamics: Entropy and Evolution. by Brig Klyce
quote:
So, I am generalising my previous example to cover all sequences - taking m as the number of heads and n as the number of tails. Thus if we randomly choose a coin, the probability of it being a head is m/(m + n) (m heads and m + n coins). The probability of it being a tail is n/(m + n) (n tails and m + n coins).
Nope. The probability of that one coin being a head is calculated via the formula:
P(A) = f/n
where the probability P of an event A equals the number of favorable outcomes f divided by the number of all possible outcomes n.
That works out to 1/2 = .5. That's where you're trying to go; you're just not quite sure how to get there.
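Both routes to 0.5 can be checked by brute force (a minimal Python simulation of the choose-a-coin-and-toss-it-again process for the two-heads/two-tails case under discussion; the names are illustrative):

import random

random.seed(1)
trials = 100_000
unchanged = 0
for _ in range(trials):
    coins = ["H", "H", "T", "T"]    # the 2-heads / 2-tails configuration
    i = random.randrange(4)         # randomly select one coin...
    coins[i] = random.choice("HT")  # ...and toss it again
    if coins.count("H") == 2:       # macrostate (2/2) unchanged?
        unchanged += 1
print(unchanged / trials)  # about 0.5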
quote:
When the coin is flipped again the probability of it not changing state is 0.5 in each case.
The probability that the entropy remains unchanged is the probability of choosing a head and getting a head on the flip or of choosing a tail and getting a tail on the flip.
This is therefore 0.5 * m/(m + n) + 0.5 * n/(m + n) = 0.5
To get a decrease in the entropy we need to pick a coin in the less common state and for the flip to change it.
The probability of that is 0.5 * min(m, n)/(m + n) which will be non-zero for all m,n where m > 0 and n > 0
Thus for all cases where both m and n are greater than zero "mutating" the sequence by choosing a coin and flipping it again will not tend to increase the entropy. The probability of an increase in entropy is always less than 0.5.
3) I'm sorry also that you are unfamiliar with basic evolutionary theory. Since a simple reference is in order try this:
Are Mutations Harmful?
See, this is what happens when one chooses to get their science from a religionist atheist apologist site. They don't do science over there; they do secular humanist religion and CALL it science.
quote:
Now why don't you back up YOUR assertion that the opposite is true ?
I did. Just read the study and you will know exactly what happens:
http://homepages.ed.ac.uk/eang33/

Design Dynamics

This message is a reply to:
 Message 163 by PaulK, posted 05-06-2005 5:05 PM PaulK has replied

Replies to this message:
 Message 172 by JonF, posted 05-06-2005 6:01 PM Jerry Don Bauer has replied
 Message 194 by PaulK, posted 05-07-2005 5:34 AM Jerry Don Bauer has replied
