Topic: Intelligent Design in Universities
Jerry Don Bauer (Inactive Member)
quote: Everyone with an IQ of 100 or better has thought metaphysics through and has a religion. Yours just pegs on the negative end of the scale.
quote: It certainly does; and old earth creationists, Jews, Muslims, agnostics and atheists. We don't particularly care what your religious views are. Design Dynamics
Jerry Don Bauer (Inactive Member)
quote: Yes, and in fact this is relatively old hat in infodynamics. Here is Durham University using combinatorials to calculate the entropy and macrostates of flipped coins: (Durham University link, now dead) But I think the chart I use to teach this is much clearer (with Boltzmann's constant omitted):
This is really the same formula broken down to (total elements)! / (subset)!(subset)! This works in any system where we need to compare the statistical weight of a comparison of subsets against the whole as in: (whole)! / (subset)!(subset)!(subset)!(subset)!........(n-subsets)!.
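The (total)! / (subset)!(subset)! multiplicity described above can be computed directly. A minimal sketch of my own (the function name and the 4-coin example are my assumptions, not from the original post):

```python
from math import factorial, log

def multiplicity(total, *subsets):
    """W = (total)! / (subset1)! (subset2)! ... (subsetN)!"""
    assert sum(subsets) == total
    w = factorial(total)
    for s in subsets:
        w //= factorial(s)
    return w

# Macrostates of 4 flipped coins, by heads count 0..4:
for heads in range(5):
    w = multiplicity(4, heads, 4 - heads)
    print(f"{heads} heads: W = {w}, ln W = {log(w):.3f}")
```

The 2-heads/2-tails macrostate has the largest statistical weight (6 of the 16 microstates), matching the chart's peak.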
quote: It was two states in that particular gas system, as there were only two chambers. Obviously systems can be designed with as many chambers as we care to have in them. In the genome example, for the purpose of simplicity, I was considering only two states, much like Shannon considered his relay circuits as either off or on--i.e., genes mutating each generation from healthy to non-healthy. Two states. And in this case I can do so because that study did not include other options per generation, just how many mutated to the deleterious state after selection had acted to remove what it would.
quote: Yep. You have studied thermodynamics in a formal setting. I can tell because you are asking quite intelligent questions. Entropy (in a gas, or in other far-from-equilibrium systems, like people) expresses the equilibrium of a system. Once total equilibrium is attained by a system, entropy is at maximum. You may or may not have read my postings from Schrodinger about this: "Schrodinger posited that maximum entropy--perfect equilibrium in the organism--is achieved at death. [2] And this makes sense. How could there be anything more at equilibrium with itself and its environment than a cold, dead organism that isn't functioning at all? Furthermore, how could anything be more disorganized than this same organism? Nothing is organized adequately enough for anything to work. Yet it is ordered because there is no chaos." Thus if we have a two-chambered gas system (or anything similar) containing 100 molecules, when these molecules have distributed as far as they can distribute, that system is perfectly ordered, but totally disorganized; and entropy is at maximum. It might help you to read a paper I recently wrote on this terminology: http://designdynamics.org/order.pdf
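The two-chambered, 100-molecule claim can be sketched numerically (my own illustration, not from the post): the multiplicity of the macrostate with k molecules in the left chamber is C(100, k), and it peaks at the even 50/50 split, the maximum-entropy state.

```python
from math import comb

N = 100  # molecules in a two-chamber system
# Multiplicity of the macrostate with k molecules in the left chamber:
W = [comb(N, k) for k in range(N + 1)]
most_probable = max(range(N + 1), key=lambda k: W[k])
print(most_probable)   # 50: the even split is the equilibrium macrostate
print(W[50] / sum(W))  # ~0.08: the single likeliest macrostate of the 101
```
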
quote: It could appear that way if the system were other than a far from equilibrium, living system. Just looking at the graph above we can see that maximum entropy with the 4 coins is two heads and two tails. So technically, you are correct again. But the reality is, the organism will not be alive long enough to allow a genome to reach anywhere near a state of (100%)! / (50% ancestral)!(50% deleterious)! because this would mean that 50% of the proteins in our body transcribed by genes, just would not work the way they were initially designed to work. The researchers (especially James Crow, an interpreter) toyed with this question: If we are carrying upwards of a thousand harmful mutations and they are accumulating slightly with each generation, why are we not in mutational meltdown and going into extinction? 1000 mutations are not that many when you consider all of the protein coding regions that COULD mutate.
quote: No, if deleterious mutations are to increase arithmetically each generation, they will increase until a population enters mutational meltdown, which old-schoolers like me might know as error catastrophe. At this point, the population enters a rapid descent off the graph into extinction. Here is the type of graph I am referring to:
If that population is to be saved, it must receive a fresh source of genes into the gene-pool long before it reaches the critical meltdown stage. Design Dynamics
Jerry Don Bauer (Inactive Member)
quote: Oh Yeah! lol....Heard of that one but have to read up on it.
Jerry Don Bauer (Inactive Member)
Thanks for the post, but I just cannot seem to win on this forum.
I suffer through 100 posts containing arguments with no references at all, and then you spring an uber-post on me containing tons of references with no argument. Now please slow down, back up and redo that post, as it has much potential for people to learn from it. But people are not going to let you bring an argument using other people's arguments, because those people are not here to debate them. Only you are. 1) Organize that post. 2) Put a cogent argument into your own words, using those postings as references, making each point you wish to make, cutting and pasting from them as you go to substantiate each point of your argument. 3) Perhaps you are right and I am wrong. We will see where it goes from there, but you need to put a little effort into communication first. This message has been edited by Jerry Don Bauer, 05-06-2005 05:45 AM Design Dynamics
Jerry Don Bauer (Inactive Member)
quote: Hmmm....Paul, why are you making me think at 6:10 in the morning? If you have a configuration of a four-coin system, two of which are heads and two tails, and you decide to flip one coin, the second law doesn't care what happens, because probabilities favoring either outcome do not exist. The second law is a law of statistics, so considering the flipping of one coin, you pays your money and you takes your chances at 50/50 of evolving the system, or not. We are really getting out of configurational entropy, which is expressed in the chart I posted and defines the macrostate of a system, back into logical entropy, which defines the microstates, i.e., the number of possible states of the matter (heads or tails) taken to the power of the number of coins in the system. The second law comes into play and gets stronger as the complexity of our system increases: one coin, 2^1; two coins, 2^2; ten coins, 2^10; 500 coins, 2^500. In ID, this is called specificity, and we can define it as: specificity is inversely proportional to the probability of an event occurring. Dembski's specified information is really just a quantification of the second law, because at 2^500 we are at odds of 1 chance in 10^150 of any particular pattern occurring, and it ain't gonna happen in nature. Hope that did it. Design Dynamics
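The 2^n growth of microstate counts and the 10^150 figure quoted above can be checked directly. A sketch of my own (this is an arithmetic check, not Dembski's derivation):

```python
from math import log10

# Microstate count 2^n for an n-coin system, shown as a power of ten
for n in (1, 2, 10, 500):
    digits = n * log10(2)
    print(f"{n} coins: 2^{n} = about 10^{digits:.1f} microstates")

# 2^500 is a bit over 10^150 -- the "1 chance in 10^150" threshold
assert 10**150 < 2**500 < 10**151
```
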
Jerry Don Bauer (Inactive Member)
Well you seem to be working toward an argument here, so we will run with it.
quote: Then if there are no premises or conclusions in the form of an argument, why would you need evidence to support them? Think about that.
quote: No, that would be stupid. I want you to bring an argument and provide evidence to support that argument. But chill out because you may be getting there, although I'm having to suck it out of you like George Bush treating a rattlesnake bite.
quote: No, now carefully read the words that I write, because unless I am drunk, I parse them very carefully: Jerry: ".....Random mutations ARE equiprobable ....." Reread the post. I understand that some mutations are due to certain factors like photon damage, radiation, etc. But if the mutations we are considering are due to the "causes" in the ABSTRACTS (you did not present any papers, just the abstracts) then how do you then consider them random? Now see how this works, we are getting a discussion going. This good. Fire bad. Design Dynamics
Jerry Don Bauer (Inactive Member)
quote: LOL.. the good Doctor of physics is reduced to the "is not, is too, is not" argument? You can do better than this. Put more effort and research into your posts. In all honesty, I think you're just lost. Information entropy is Shannon entropy, if you want to get technical. The entropy used in the coin examples is called logical entropy, not thermodynamic entropy. Why don't you plug this stuff into Google and educate yourself before posting? This will help you professionally. You can learn about the different entropies starting here.

And is that d you're using in the above formula (which doesn't apply to anything we are discussing that I can detect so far) meant to denote integration as in calculus, or a delta as in simple change? You don't state this, and you should, as forum software will not show mathematical notation in my experience. If it is integration, you don't have to do this anymore: the formula deltaS = Q/T was Clausius's original math and it is widely accepted today. Just plug the formula into Google and surf until you're sick of it. But it is normally used to quantify thermodynamic reservoirs rather than what we are discussing. "when energy Q is added to a reservoir, the entropy of this reservoir increases by Q/T where T is the temperature in some absolute unit (like Kelvin)." (dead link: /physics/courses/PHY102.03Spring/hw/102sol4.pdf)

We only need integration to show entropy when there is a distinct change in system temperature (there is not in reservoirs), such as a beaker containing 1 kilogram of water pulled from a fridge at 5 °C and allowed to come to equilibrium with the room at 25 °C. If you are going to use calculus, then do it once and we can be done with it:
Now we are reduced to the formula deltaS = C ln(T2/T1), where deltaS is the change in entropy, C is the heat capacity of the sample (mass times specific heat), and T2 is the final temperature while T1 is the initial temperature, both in absolute units (kelvin). Now that this is reduced, we need no more calculus and can figure the entropy of any distinct temperature change we care to. In my water example above, the specific heat of water is 4.185 J/(g·K), so for 1 kg of water warming from 278.15 K (5 °C) to 298.15 K (25 °C): deltaS = 4185 ln(298.15/278.15) ≈ 290.6 J/K. Don't throw remedial calculus at an ID theorist. She will throw it right back at you and make you look extremely silly in the process. And in the meantime, you need only introduce math that is used in living, open systems such as that of Boltzmann, Schrodinger, Gibbs or Prigogine. You should know this at your level. Design Dynamics
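The water example can be checked numerically. A sketch of my own; note that the logarithmic formula requires absolute (kelvin) temperatures, so 5 °C and 25 °C are converted below, and the variable names are my assumptions:

```python
from math import log

m = 1000.0   # grams of water
c = 4.185    # specific heat of water, J/(g*K), assumed constant over the range
T1 = 278.15  # 5 C in kelvin
T2 = 298.15  # 25 C in kelvin

# deltaS = m*c*ln(T2/T1): the integral of dQ/T = m*c*dT/T, done once
dS = m * c * log(T2 / T1)
print(f"{dS:.1f} J/K")  # about 290.6 J/K
```
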
Jerry Don Bauer (Inactive Member)
quote: Ok, I'm done with you. Ad homs are logical fallacies, and I see no need to address logical fallacies. Thank you for your input! Design Dynamics
Jerry Don Bauer (Inactive Member)
quote: What color is the table? Oh no, just kidding. Group 1 would be the most specified because it has the lowest odds of occurring. If we look up the word information at dictionary.com we can see that one definition of it is the probability of an experimental outcome. So I would feel pretty comfortable going with that. Design Dynamics
Jerry Don Bauer (Inactive Member)
quote: Yes I can, because it does. What is the difference between matter and energy? Einstein said they are the same, because E = mc^2. Thermodynamic entropy deals only with energy; configurational entropy, only with the arrangements of matter. In fact, both have been used together in the same formula, expressed as S_total = S_c + S_t. I'm not doing anything new here.
quote: What is that math, Paul? Can you show me where you got it? It doesn't make a lick of sense. You didn't calculate any entropy, because you didn't take the log of anything. Nothing you plug into n and m will ever equal .5.
quote: Back that up because nothing could be further from the truth. Design Dynamics
Jerry Don Bauer (Inactive Member)
quote: There you go again accusing me of making mistakes without being able to state what they are. Why do you repeatedly do this?
quote: LOL...You are a case. The site I sent you to was constructed by Brig Klyce, a foaming-at-the-mouth anti-creationist, atheist. Why won't you research this? When I plug the term logical entropy into Google, almost 400 pages come up. "Landauer realized that if the binary value of a bit is unknown, erasing the bit changes the logical entropy of the system from k log 2 to k log 1 = 0 (shrinking the phase space available to the system has decreased the entropy by k log 2)." And look at 3 PhDs in materials science using thermodynamic entropy and configurational entropy in the same formula: S = k ln(Omega_th × Omega_c) = k ln Omega_th + k ln Omega_c = S_th + S_c http://www.ldolphin.org/mystery/chapt8.html
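Since multiplicities multiply, their logarithms add, which is all the quoted formula says. A one-line numerical check of my own (the multiplicity values are made up purely for illustration):

```python
from math import log, isclose

k = 1.380649e-23                 # Boltzmann constant, J/K
omega_th, omega_c = 1e20, 1e12   # hypothetical multiplicities
S_total = k * log(omega_th * omega_c)        # S = k ln(Omega_th * Omega_c)
S_sum = k * log(omega_th) + k * log(omega_c) # S_th + S_c
print(isclose(S_total, S_sum))   # True: the two entropies simply add
```
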
quote: Oh. The fact that the formula you wanted to use is "normally used to quantify thermodynamic reservoirs rather than what we are discussing" was exactly your point? Then why did you throw it out? I give up. You have not backed up anything you have posted with references because THERE ARE NONE. I'm afraid you are not well enough versed in this area of physics to even discuss it, yet you think you know it all. I don't know how to deal with that. So, I will thank you for your time and you may have the last comment. Design Dynamics
Jerry Don Bauer (Inactive Member)
quote: I asked "Which group contains the greatest amount of information and why?"
Well gee. I told you group one and then why, because the specificity is higher. Did you misread the post?
Design Dynamics
Jerry Don Bauer (Inactive Member)
quote: LOL, it's important to me, I guess. What kind of question is that? What is your point?
Jerry Don Bauer (Inactive Member)
quote: Depends on what you are looking at. One guy stated that they are all equiprobable, and from one aspect he is correct. But groups 2 and 3 appeared more random to me. So, looking at specificity of the information, group one would have the highest information if I have to pick one, which is the game I assume I'm playing. But looking at configurational entropy, group 1 is the lowest. So you are going to have to get specific. This message has been edited by Jerry Don Bauer, 05-06-2005 05:13 PM Design Dynamics
Jerry Don Bauer (Inactive Member)
quote: Well cheer up, man. The sky's not falling yet.
quote: I don't have a version of 2LOT, I use the same one everybody else does. So you really think that 2LOT does not apply to matter based on the logic that when I flip a quarter it is equiprobable to get heads or tails? Gee Paul, I kind of admire you because since 2LOT doesn't apply to your car, you never have to buy a new one. I do. Paint never gets old on your house so that is nice. And you will never grow old and die. Through this logic, you have found immortality!
quote: I see. So the physics of Richard Feynman, who taught us that logical entropy is the way matter is arranged and that "The logarithm of that number of ways is the entropy," is just not correct, in your opinion? The Second Law of Thermodynamics: Entropy and Evolution. by Brig Klyce
quote: Nope. The probability of that one coin being a head is calculated via the formula: P(A) = f/n Where the probability (P) of an event (A) equals the number of actual events, (f) divided by all possible outcomes, (n). That works out to 1/2 = .5. That's where you're trying to go, you're just not quite sure how to get there.
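The P(A) = f/n claim for a single coin can be sketched and checked by simulation (my own illustration; the seed and sample size are arbitrary choices):

```python
import random

p_heads = 1 / 2  # f/n: 1 favorable outcome (heads) out of 2 possible
print(p_heads)   # 0.5

# Monte Carlo check: the empirical frequency approaches f/n
random.seed(0)
n = 100_000
freq = sum(random.random() < 0.5 for _ in range(n)) / n
print(abs(freq - 0.5) < 0.01)  # True
```
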
quote: See, this is what happens when one chooses to get their science from a religionist atheist apologist site. They don't do science over there, they do secular humanist religion and CALL it science.
quote: I did. Just read the study and you will know exactly what happens: http://homepages.ed.ac.uk/eang33/ Design Dynamics
Copyright 2001-2023 by EvC Forum, All Rights Reserved