Author Topic:   What exactly is ID?
Smooth Operator
Member (Idle past 5135 days)
Posts: 630
Joined: 07-24-2009


Message 691 of 1273 (543139)
01-15-2010 4:15 PM
Reply to: Message 679 by Taq
01-13-2010 1:38 PM


Re: Nonsensical creationist notions
quote:
Then evolution necessarily requires a loss in information according to your definition. For example, both the chimp and human lineages lost information found in the original common ancestor. Chimps and humans are degraded, according to you. The same for all mammals (including humans), as they have degraded from the common ancestor of mammals. The same for all amniotes, all vertebrates, all eukaryotes, and every single species since the last universal common ancestor. By extension, you are arguing that when an organism evolves from a simple state to a complex state this is a degradation event, because the information for making a simple organism has been degraded.
Exactly. The more time passes, the more genetic entropy increases. All populations are getting worse, not better, because their biological functions are degrading.
quote:
You have argued your way out of the debate. You claim that evolution requires new information, and yet observed instances of evolution do not produce this entity. Therefore, evolution does not require new information. You are out of the game.
No, what this simply means is that evolution doesn't work. I think it's clear to anyone that if evolution requires something to be produced for it to work, and if that something is observed not to be produced, then we are going to conclude that evolution is not working, not that it is working.
quote:
The article claims that mammalian evolution trends towards larger body size in many lineages. Obviously, this isn't so in all lineages. Mice, for example, are quite small and they are . . . hmm, let me think . . . oh yes . . . MAMMALS. According to the paper, their population size prevents the accumulation of deleterious mutations compared to larger mammal lineages.
Actually, it says the amount of deleterious mutations is still accumulating, not decreasing.
quote:
Our detector, in this case, is phylogenetic comparisons of DNA. This detector is capable of telling us "evolved" and "not evolved".
For example, we can compare the genes of Glofish (link), jellyfish, and trout. What we should find, if evolution is true, is that the Glofish genes should more closely resemble that of trout than of jellyfish. What do we find? We find that this is false for certain genes. Why is that? Because these genes DID NOT EVOLVE. Glofish contain genes which are almost exact copies of genes found in jellyfish, but not found in trout. How did this happen? Through intelligent design. Humans moved genes from jellyfish into Glofish to make them fluoresce under UV lights.
There you have it. A detector that can detect both evolution and not evolution. So what detector will tell us ID or not ID?
How do you know the similar genes evolved? Aren't you first assuming common descent to be true, only to then turn around and claim that this is evidence for common descent? Obviously this is circular reasoning. You can't say that similarity is evidence for common descent. Similarity is simply evidence for similarity, nothing more. You need evidence that would show us that their similarity is due to common descent, not just assume it is from the start.
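For reference, the kind of comparison the quoted "detector" relies on can be sketched in a few lines. This is only an illustration with made-up toy sequences, not real GFP, jellyfish, or trout data:

```python
def identity(a, b):
    """Fraction of aligned positions that match (toy comparison, no real alignment)."""
    return sum(x == y for x, y in zip(a, b)) / min(len(a), len(b))

# Hypothetical toy sequences standing in for one gene in each organism.
glofish   = "ATGGCTAGCAAGGGCGAGGA"
jellyfish = "ATGGCTAGCAAAGGCGAGGA"
trout     = "ATGTCAACCTTGGTCGATCA"

print("glofish vs jellyfish:", identity(glofish, jellyfish))
print("glofish vs trout:    ", identity(glofish, trout))
# For an inserted transgene, the glofish copy matches the jellyfish gene far more
# closely than the trout gene, which is the signal the quote describes.
```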

This message is a reply to:
 Message 679 by Taq, posted 01-13-2010 1:38 PM Taq has replied

Replies to this message:
 Message 694 by Coyote, posted 01-15-2010 4:38 PM Smooth Operator has not replied
 Message 718 by Taq, posted 01-19-2010 12:43 PM Smooth Operator has replied

Smooth Operator
Member (Idle past 5135 days)
Posts: 630
Joined: 07-24-2009


Message 692 of 1273 (543140)
01-15-2010 4:15 PM
Reply to: Message 684 by Meddle
01-13-2010 7:52 PM


Re: Nonsensical creationist notions
quote:
Of course the ribosome binds to something else. It binds to mRNA and the anti-codons of tRNA carrying amino acids, in its role in protein synthesis. Streptomycin interferes with the normal role of the ribosome by irreversibly binding to it. This is why streptomycin is an antibiotic, since without protein synthesis the bacterial cell dies, and why it is ridiculous to describe the failure to bind streptomycin as a loss of function. The relevant mutation allows the ribosome to continue functioning in protein synthesis even in the presence of streptomycin, which can be described as a gain in function.
It's not so much a case of loss of a specific function in this case as it is a degradation of structure, which is also a loss of information. A biological function itself does not have to get damaged every time the structure degrades, but information is lost. And with enough degradation of the structure, the function itself will degrade.

This message is a reply to:
 Message 684 by Meddle, posted 01-13-2010 7:52 PM Meddle has not replied

Smooth Operator
Member (Idle past 5135 days)
Posts: 630
Joined: 07-24-2009


Message 693 of 1273 (543141)
01-15-2010 4:16 PM
Reply to: Message 685 by Wounded King
01-14-2010 6:05 AM


Re: Nonsensical creationist notions
quote:
You can't make it sensible to talk about binding sites which have evolved/arisen to bind a particular protein representing CSI in the binding target in all cases. Using your logic every time an antibody is raised to a different epitope on a protein the information content should rise! Every animal with an adaptive immune system is increasing the genetic CSI content all the time!
No, you are missing the point. Fine-tuning of already existing mechanisms is not a gain in information; it's simply a tuning of what already exists. And as I already said, for information to be called CSI it has to have a complexity of at least 400 bits. One mutation is not 400 bits. Therefore, such changes do not equal an increase in CSI.
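To put a rough number on "one mutation is not 400 bits", here is a back-of-the-envelope bound of my own (not Dembski's calculation): specifying which single substitution occurred in a genome of length L takes about log2(3*L) bits.

```python
import math

# Back-of-the-envelope bound (my own, not Dembski's): there are 3*L possible
# single-nucleotide substitutions in a genome of length L, so specifying one of
# them takes about log2(3*L) bits -- nowhere near a 400-bit threshold.
for L in (1_000, 1_000_000, 1_000_000_000):
    print(f"{L} bases: about {math.log2(3 * L):.1f} bits")
```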
quote:
But now you talk about 'original information content' which is quite different. I put it to you that the 'original information content' would have arisen before the full streptomycin biosynthesis pathway, so in fact all you are losing is, as I suggested before, the 'free' informational value imparted by the development of streptomycin biosynthesis rather than any of the 'original information content'.
What's "free" information value?
quote:
But what you really seem to be saying is, once again, that any change from the first sequence derived from a gene is a loss of information, or in Durston et al.'s approach essentially any deviation from the consensus from an alignment of related sequences.
Nope. I said many times that it does not have to constitute a loss of information. Mechanisms inside the cell can modify the DNA and leave it with the same amount of information as before. Random mutations can do the same, but not on average. On average, they decrease the information. But no, not every single time.
Imagine a statement: "NICE WEATHER WE ARE HAVING TODAY."
If a random mutation happened in that statement, we could have something like this: "NIcE WEATHER WE ARE HAVING TODAY."
Instead of a capital C, we now have a small c. So what? Did the meaning of the statement change? Nope. But the letter did change. Yet the change did not cause any loss of meaning.
quote:
The binding specificity, whatever it is for, seems totally unrelated to what you are saying; you are trying to hang some element of functionality on it when there simply isn't any. The only functional effect the mutant has is to allow the bacteria to survive in the presence of streptomycin. It hasn't lost the function of binding to streptomycin because that was never its function.
I agree. As I already said above, this is not really a case of loss of functionality per se. But biological function is a subset of biological structure. If you degrade the structure, you can also degrade the function, though not every single time.
If you scratch the mechanism of your car's ignition, and you do not damage it so badly that it loses its function, did you degrade the function? Nope. But you did damage the structure of the ignition mechanism, and this mechanism is what performs the function. Scratch it enough times and you will damage the structure so much that you will not be able to start your car anymore. And now, as you can clearly see, you have lost the function. Both the structure and the function have degraded.
So like I said, no, you do not degrade the function every time you degrade the structure. But it is still a loss of information, which, when repeated enough times, does lead to loss of function.
quote:
The ribosome is arbitrary in some ways; it is not 100% conserved amongst all species, so clearly there is some allowable variation at different positions. You have decided to arbitrarily decree that any mutation changing the sequence from its initial state is a loss of information. Similarly, Durston et al. arbitrarily decree that any change away from their consensus sequence will be a loss of information.
That is actually a mathematical fact. Any organised system will, on average, degrade if it is changed at random. The odds are simply such that you have a greater chance of hitting a high-entropy sequence than a low-entropy one.
Imagine a simple statement like the one I gave in the last example. If you make simple changes, letter by letter, at random, meaning you do not see which letters you are changing, are you on average going to decrease its organization or increase it? Obviously you will decrease it, which means you will increase its entropy.
Now tell me, why would random changes to the ribosome be any different? Surely they are not. Any random change will on average make the ribosome less organized, not more.
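A quick toy simulation of that claim, using English text as a stand-in for a functional sequence (my own sketch, illustrative only):

```python
import random
import string

REFERENCE = "NICE WEATHER WE ARE HAVING TODAY."
ALPHABET = string.ascii_uppercase + " .'"

def matches(s):
    """Number of positions that still agree with the original statement."""
    return sum(a == b for a, b in zip(s, REFERENCE))

def mutate(s):
    """Replace one randomly chosen character with a random one."""
    i = random.randrange(len(s))
    return s[:i] + random.choice(ALPHABET) + s[i + 1:]

random.seed(0)
toward = away = 0
for _ in range(20_000):
    s = REFERENCE
    for _ in range(5):          # let the copy drift a little first
        s = mutate(s)
    before = matches(s)
    after = matches(mutate(s))
    toward += after > before
    away += after < before

print(f"changes toward the original: {toward}, away from it: {away}")
# Far more single changes move the text away from the original than back toward
# it, simply because there are many more ways to miss a position than to hit it.
```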

This message is a reply to:
 Message 685 by Wounded King, posted 01-14-2010 6:05 AM Wounded King has replied

Replies to this message:
 Message 697 by Wounded King, posted 01-15-2010 5:45 PM Smooth Operator has replied

Smooth Operator
Member (Idle past 5135 days)
Posts: 630
Joined: 07-24-2009


Message 701 of 1273 (543481)
01-18-2010 2:31 PM
Reply to: Message 695 by Dr Jack
01-15-2010 4:58 PM


Re: Please explain E. coli
quote:
Cite an example then, please
This article talks about accumulation of mutations in RNA chains.
Accumulation of Deleterious Mutations in Small Abiotic Populations of RNA - PMC
This one compares the accumulation of slightly deleterious mutations between small and large mammals.
quote:
Why would we need to know the initial population size? I'm talking about in a single gene line - as in those descended from a single individual.
Because the size of the population is one of the factors that determines how many mutations will accumulate. If you do not know the initial size, then you also do not know how many mutations will accumulate.
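As an illustration of the population-size effect, here is a minimal Wright-Fisher-style toy model with assumed numbers (not taken from either article):

```python
import numpy as np

rng = np.random.default_rng(1)

def fixation_rate(pop_size, s=-0.01, runs=5000):
    """Toy Wright-Fisher model: one new, slightly deleterious mutation
    (selection coefficient s) either drifts to fixation or is lost."""
    n = 2 * pop_size                 # number of gene copies in the population
    fixed = 0
    for _ in range(runs):
        copies = 1                   # a single new mutant copy
        while 0 < copies < n:
            freq = copies / n
            w = freq * (1 + s) / (freq * (1 + s) + (1 - freq))  # selection
            copies = rng.binomial(n, w)                         # drift
        fixed += (copies == n)
    return fixed / runs

for pop in (50, 500, 5000):
    print(pop, fixation_rate(pop))
# The slightly deleterious mutation occasionally drifts to fixation in the small
# population; in the larger ones selection removes it essentially every time.
```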

This message is a reply to:
 Message 695 by Dr Jack, posted 01-15-2010 4:58 PM Dr Jack has replied

Replies to this message:
 Message 706 by Dr Jack, posted 01-18-2010 2:40 PM Smooth Operator has replied

Smooth Operator
Member (Idle past 5135 days)
Posts: 630
Joined: 07-24-2009


Message 702 of 1273 (543482)
01-18-2010 2:31 PM
Reply to: Message 696 by PaulK
01-15-2010 5:28 PM


Re: l
quote:
I am afraid that you are incorrect again. You claimed that it had lost all function. I claimed that you did not know that. And as we have seen I was right, although it took an amazingly long series of posts for you to realise it.
That is because it had ONE known function. We will not try to imagine new functions if we are not sure the enzyme has them, now will we? That's unproductive. We know it had ONE function, and it lost that one function; therefore, it lost all functions.
But that's beside the point. Even if it had more functions, which I wouldn't be surprised it did, that still doesn't refute my point that it lost a known function, which, I repeat, was the point of the experiment. The experiment was to see how many mutations an enzyme can take before it loses its ONE known function.
quote:
Amazingly enough your clumsy phrasing has made you say something that is about right, although probably unintentionally. For your calculation to be correct, the E Coli flagellum (plus any allowed variants) must include all "bidirectional rotary motor-driven propellers". And in that sense it would describe it. However, we are certainly in no position to say that that is true, and in fact I know that it is not true.
You are also correct to say that "bidirectional rotary motor-driven propellers" is a valid specification, because there are other things that fit the definition. Unfortunately this means that all those things must be included in D*. And since your calculation uses specifics of the E Coli flagellum that do not necessarily apply to all "bidirectional rotary motor-driven propellers" you are calculating the probability for a different "specification" - and in fact that "specification" is a fabrication.
You simply do not understand the design inference. Why would I have to include ALL existing "bidirectional rotary motor-driven propellers"? That makes no sense. The reason it doesn't make any sense is because some patterns that might fit that description could very well be under 400 bits of information, which would make our calculation useless in the first place. Why? Because we would then not be talking about complex specified information, but simply about specified information, which Dembski said chance can generate.
Therefore, you are wrong. We are only supposed to include the complexity of the event E, which in this case is the 50 proteins of the flagellum. There is no reason to calculate other probabilities. Just imagine a "bidirectional rotary motor-driven propeller" that has a complexity of 1,000,000,000 bits. You do understand that it is vastly less probable to form by chance than one consisting of 1,000 bits? So what good would it do to calculate all possible instances of that pattern? Absolutely no good whatsoever.
quote:
I've told you what to do. Calculate the number of sequences that are no more than 20% different. If you can't even work out how to do that then you don't know even basic probability theory. I might be willing to help you a little more, but to do that sensibly you will need to explain how you did the calculation in the first place. Something you were unwilling to do before.
Like I said, I'm waiting for you to tell me how to do the calculation. I have no reason to stall.
quote:
I am afraid that you have completely misunderstood what Dembski is saying here. Dembski is attempting to deal with the issue that there are many more possible targets that evolution might have hit. He does so by estimating the number of concepts at the same level and including it as a correcting factor (this is why it reduces the specified information content).
Clearly adding in extra details - even if they are legitimate - would mean using a higher level concept, and thus this factor would be increased.
Again, no, you have no idea what Dembski is talking about. Yes, he is saying that evolution could hit many targets. The point is to show that the probability is still too small for that to happen.
Like I said, "bidirectional rotary motor-driven propeller" is a 4-concept pattern, and with roughly 10^5 basic concepts available for each of the four slots, there are about 10^20 patterns of that descriptive complexity. Now how do we use this? Simple. We have to show that this pattern is still too improbable to be hit by chance. Therefore, we multiply 10^20 by 10^120, and then by the probability of the pattern itself, and check whether this number is less than 1/2, which is what Dembski says here:
quote:
We may therefore think of the specificational resources as allowing as many as N = 10^20 possible targets for the chance formation of the bacterial flagellum, where the probability of hitting each target is not more than p. Factoring in these N specificational resources then amounts to checking whether the probability of hitting any of these targets ...
quote:
If, therefore, this number is small (certainly less than 1/2 and preferably close to zero), it follows that it is less likely than not for an event E that conforms to the pattern T to have happened according to the chance hypothesis H. Simply put, if H was operative in the production of some event in S's context of inquiry, something other than E should have happened even with all the replicational and specificational resources relevant to E's occurrence being factored in.
Therefore, if the number we get is smaller than 1/2, it is less likely than not that chance produced the event in question, which means we infer design.
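Putting those numbers together, here is a sketch of the check as described above. The per-pattern probability P(T|H) is only an assumed placeholder (the 10^-2954 figure that comes up later in this exchange):

```python
import math

# Sketch of the check described above, in log10 form to avoid underflow.
# 10^120 is the bound on probabilistic resources, 10^20 the number of
# specificational resources for the 4-concept pattern; P(T|H) is an assumed
# placeholder value, not an established figure.
log10_resources = 120 + 20
log10_p = -2954                      # assumed P(T|H) for the flagellum

log10_product = log10_resources + log10_p
print("log10 of 10^120 * 10^20 * P(T|H) =", log10_product)
print("less than 1/2, so design is inferred?", log10_product < math.log10(0.5))
```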
quote:
Again, you are making a mistake. My point was that the growth mechanisms have a high probability of producing a flagellum. Of course we can say that bacterial reproduction includes producing the physical growth mechanisms with high probability. But again, all this means is that the design proponent has to do is to go back to the origin, which is where you say that the actual design implementation occurred.
Impossible and useless. We can not go back in time. That's obvious. And it's also useless, because natural causes can only transmit CSI, and not add more of it. So whatever CSI we see in nature, we know that that same amount, or more, was inputted in the first place. Therefore, it's fine to calculate it right there.
quote:
Of course the point rests not on what MIGHT be the case in some hypothetical environment but on what actually is the case in the environments which actually exist. And in that environment, the allele for blue eyes is classified as neutral and the sickle cell allele is classed as beneficial overall (in malarial areas), but is very strongly deleterious in individuals who are homozygous for the allele. And that is unusual.
No, it's not unusual. That's how natural selection works! It is going to select anything, even those mutations that degrade biological functions, if they increase reproductive fitness. That's just how things are.
quote:
In other words it doesn't matter if what you say is true or not ? If you really don't care about then say so. You claimed that the sickle-cell allele was spreading. You claimed that it had been fixed in the population. Neither is true.
I said that it has been spread and is fixed in a certain population, just like any other mutation. There is no single mutation that is spread through all life on earth. All mutations are held at certain frequencies.
quote:
That isn't quite true. Your argument is intended to establish what you think will happen to large populations over long periods of time but it doesn't take either point into account. If the other factors are hereditary then they may be included with genes. If they are not then - on the scales we are thinking of - they may be largely ignored unless there is good reason to think that they strongly correlate with the occurrence of a particular allele.
You see, the larger the population (in the statistical sense as well as the biological) the more the chance effects of noise tend to average out. This is why genetic drift is weaker in larger populations. So it is not enough to talk about individual cases or noise. What you need to show is that there is a systematic bias - and that that bias undermines selection to the point that mutational meltdown is inevitable.
No, they will not. Why should they have to be correlated with genetic traits in order to get passed on?
I already gave an example of epigenetic interference. Imagine two individuals, A and B. Individual A has a beneficial mutation, and individual B has a deleterious one. Judging only by genetic traits, individual A would get selected by natural selection. But it just so happens that individual B got his DNA methylated, and this changed his phenotype and made it fitter than individual A. Now, judging by natural selection, individual B gets selected over individual A, even though individual B has a deleterious mutation and individual A has a beneficial one.
Therefore, we see here that a non-genetic trait interfered with positive selection, and a genetically less fit individual got selected for by natural selection. This happens all the time, in all populations, on average, with all individuals. Sometimes more, sometimes less, of course.
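A toy numerical version of that scenario, with assumed fitness values and noise levels (my own sketch, not data):

```python
import random

random.seed(2)

def b_wins(noise_sd):
    """Toy contest: A carries a small genetic advantage, B does not; what gets
    compared is phenotype = genetic fitness + a non-genetic term (epigenetics,
    environment, luck). Returns True when B comes out ahead anyway."""
    a = 1.01 + random.gauss(0, noise_sd)
    b = 1.00 + random.gauss(0, noise_sd)
    return b > a

for noise_sd in (0.0, 0.05, 0.5):
    wins = sum(b_wins(noise_sd) for _ in range(100_000))
    print(f"noise sd {noise_sd}: B beats A in {100 * wins / 100_000:.1f}% of contests")
# With no noise the genetically fitter individual always wins; as the
# non-genetic contribution grows, single contests approach a coin flip.
```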
quote:
In other words you claim that an increase in fitness is caused by an inevitable decline in fitness. That doesn't make a lot of sense. I say that a sustained increase in fitness shows that the decline is not inevitable, which makes a lot more sense.
Yes, because reproductive fitness is not perfectly correlated with biological functions. If an animal loses its sight, and it always lives in dark areas, it will probably gain reproductive fitness because it will not waste its energy on eyes.
quote:
It may be an important question, but it is not relevant to either the genetic entropy discussion or the discussion of Dembski's method. I don't intend to add to the problems of this thread by adding a third subject to the discussion. If you want to talk about it then I suggest that you start a new thread.
I'm sorry, but no. This is the ultimate question and the ultimate point of Sanford's book. The point is that if the genome keeps deteriorating, and we saw from Spiegelman's experiment that it does just that, then it logically could not have evolved in the first place. If it kept getting shorter, then the first RNA chains didn't evolve into people over a period of 3.6 billion years, thus showing that evolution doesn't work. And that's what I've been arguing from the start.

This message is a reply to:
 Message 696 by PaulK, posted 01-15-2010 5:28 PM PaulK has replied

Replies to this message:
 Message 708 by PaulK, posted 01-18-2010 3:36 PM Smooth Operator has replied

Smooth Operator
Member (Idle past 5135 days)
Posts: 630
Joined: 07-24-2009


Message 703 of 1273 (543483)
01-18-2010 2:32 PM
Reply to: Message 697 by Wounded King
01-15-2010 5:45 PM


Re: Nonsensical creationist notions
quote:
You missed the entire point. You claim that the specific binding between streptomycin and the ribosome is due to CSI in the ribosomal sequences. So why is specific binding between antibodies and the epitopes they bind not due to CSI in the sequences which code for that epitope? Did you quickly evaluate the bits required for all possible epitopes and determine that they were all under 400 bits?
Because it's the ribosome, which consists of more than 400 bits of information, that got its structure degraded. And like I said, function is a subset of structure. Deforming the structure might also deform the function.
quote:
The 'free' information, as I have pointed out more than once now, is the information which has to have suddenly magically been added to the coding sequence for the ribosome when the full streptomycin synthesis pathway had developed and it suddenly had a high affinity binding partner.
And when did that happen?
quote:
You say you don't claim mutations are necessarily a loss of information, but you persist with your illogical claim that it is a loss of information in the streptomycin resistance example. Can we just agree that it's not a net change in information? Would you go that far?
Defects are a loss of information. Certain other changes are not. But defects are always a loss of information. And yes, I have always said that not all mutations decrease the amount of information.
quote:
To claim that any change in structure is a 'degradation' is bringing front and centre your assumption that the initial structure is some sort of optimal or ideal structure. It is totally inconsequential to the effect of any particular mutation that if you keep on changing the structure long enough at random its function will disappear.
You do understand that if you mutate anything for long enough, it will lose its function and ultimately its structure?
You do know that if you hit a bike with a hammer for long enough, first you will not be able to ride it anymore, and after a while it won't even be a bike but a pile of metal. The first loss is the complete loss of function, and the second is the complete loss of structure. But the first loss occurred through a partial loss of structure.
quote:
Nice to see your basic understanding of entropy is as faulty as your concept of genetic entropy. You have, as usual, conflated any specific change with the average tendency of all random changes.
The fact that in general non-neutral mutations will tend to have a deleterious effect on a protein's function doesn't mean that beneficial mutations don't exist. You do understand that entropy is a statistical phenomenon?
That is because it's true. Please add random changes to this statement: "SOMEDAY MAYBE YOU'LL LEARN SOMETHING!"
Please do add random changes to it and tell me: over time, will the changes, on average, lead to a new meaningful statement, or will they lead to a meaningless statement?
And I never, ever in my entire life said that beneficial mutations do not exist. I never said that. I simply said that they can also lead to loss of information, and that they occur far less often than other mutations.

This message is a reply to:
 Message 697 by Wounded King, posted 01-15-2010 5:45 PM Wounded King has not replied

Smooth Operator
Member (Idle past 5135 days)
Posts: 630
Joined: 07-24-2009


Message 704 of 1273 (543484)
01-18-2010 2:32 PM
Reply to: Message 699 by Nuggin
01-15-2010 8:29 PM


Re: funny thing happened on the way to nirvana ...
quote:
No, magic is the mechanism for creationism. Just like it's apparently the mechanism for ID. Which is appropriate, because even Dembski admits that ID is Creationism.
He never said that. He has actually always said it's not creationism. And not only that, but that's beside the point.
quote:
No, what I am saying is that your method does not differentiate between things which were designed and things which were not designed.
We've given you a number of examples of things which we all agree are not designed - geodes, snowflakes, ripples on water, etc. The _only_ way you rule them out is by knowing the mechanism for their production.
Yet, you claim that you don't NEED to know the mechanism for your ID theory because you can detect it without knowing the mechanism.
My position is that if you do not know the mechanism, you can not claim design.
Your position is the opposite.
I've asked you for an example APART FROM the one you are claiming to DEMONSTRATE that your claim has validity.
You've now admitted that there are NO EXAMPLES apart from what you are claiming.
That's SPECIAL PLEADING.
It's an invalid argument to come to us and tell us you have a method of design detection which ONLY WORKS on Creationism and therefore you don't have to show how it works.
Nope. First of all, those things are not designed because they do not have the marks of design. Second, I'm not using special pleading, because I never claimed that my method only works on creationism, whatever that is supposed to mean.
quote:
It's amazing to all of us that you see "intelligent input" everywhere in nature, but can't seem to manage to apply any here.
You can not determine "input" if you don't have a MECHANISM for that input to take place.
I simply explained to you that you committed a logical fallacy. That is all. You cannot show me an instance of intelligent design and then conflate it with an undirected process.

This message is a reply to:
 Message 699 by Nuggin, posted 01-15-2010 8:29 PM Nuggin has replied

Replies to this message:
 Message 709 by Nuggin, posted 01-18-2010 7:32 PM Smooth Operator has replied

Smooth Operator
Member (Idle past 5135 days)
Posts: 630
Joined: 07-24-2009


Message 705 of 1273 (543485)
01-18-2010 2:33 PM
Reply to: Message 700 by Larni
01-16-2010 5:25 AM


Re: Genetic Entropy
quote:
Then all you are doing is calling a restricted gene pool (as a result of a small population) 'genetic entropy'.
Like what happens to inbred Southern folks.
This would then (logically) not apply to large population.
Large population= no 'genetic entropy'.
No, what I'm saying is that mutations accumulate in all populations, in some more and in some less. In smaller populations they accumulate more rapidly, and in large populations more slowly. Which means:
Small population = faster increase of genetic entropy.
Large population = slower increase of genetic entropy.
quote:
I'm also a bit confused by your use of the term 'genetic entropy': surely it means more genetic variation?
Why are you using the opposite definition?
No, it means the accumulation of mutations in a population over time, which leads to a reduction in genetic information because natural selection is not able to remove them.

This message is a reply to:
 Message 700 by Larni, posted 01-16-2010 5:25 AM Larni has replied

Replies to this message:
 Message 707 by Larni, posted 01-18-2010 3:35 PM Smooth Operator has replied

Smooth Operator
Member (Idle past 5135 days)
Posts: 630
Joined: 07-24-2009


Message 754 of 1273 (543917)
01-21-2010 10:18 PM
Reply to: Message 706 by Dr Jack
01-18-2010 2:40 PM


Re: Please explain E. coli
quote:
Interesting articles, thank you. So given that you are presenting these articles as reliable sources, why should we not accept their further information on the differential effects of population size on deleterious mutation accumulation?
(Although, I note, neither of those articles supports your assertion about build up of genetic entropy)
The articles talk about the accumulation of slightly deleterious mutations. Accumulation of slightly deleterious mutations equals an increase in genetic entropy. Which means that yes, the articles are talking about an increase in genetic entropy.
quote:
We can still know how many will accumulate in a single genetic line (because E. coli is asexual), that number is at least 7000 - more than one mutation per gene. Why aren't they suffering huge consequences from this problem?
Every species is suffering. The mutations are accumulating. Just not as fast as you think. It takes a long time for a population to go extinct.

This message is a reply to:
 Message 706 by Dr Jack, posted 01-18-2010 2:40 PM Dr Jack has not replied

Smooth Operator
Member (Idle past 5135 days)
Posts: 630
Joined: 07-24-2009


Message 755 of 1273 (543918)
01-21-2010 10:18 PM
Reply to: Message 707 by Larni
01-18-2010 3:35 PM


Re: Genetic Entropy
quote:
But the only deleterious mutations are a problem to the organism and when these are lethal they are weeded out by natural selection.
No. Almost all mutations increase genetic entropy, including beneficial ones. And no, natural selection does not weed them out effectively, because there is noise that interferes with selection. Please read about it in my previous post; I really do not want to repeat myself over and over again.
quote:
The increase in 'entropy' in the genes of the organism means more possible combinations/states and thus more variation, not less.
Using the correct definition of entropy as number of states within a system means that genetic entropy is a good thing for variation: increase in entropy; means increase in states; means more variation; means reduced vulnerability to environmental change.
No, wrong, because not all states are equal. Not all genetic sequences have biological meaning, which means that not all sequences will be biologically functional. When a certain sequence mutates too much, it loses its function. That is because relevant biological sequences are islands of functionality in a sea of meaninglessness.

This message is a reply to:
 Message 707 by Larni, posted 01-18-2010 3:35 PM Larni has replied

Replies to this message:
 Message 762 by Larni, posted 01-22-2010 4:52 AM Smooth Operator has replied

Smooth Operator
Member (Idle past 5135 days)
Posts: 630
Joined: 07-24-2009


Message 756 of 1273 (543919)
01-21-2010 10:19 PM
Reply to: Message 708 by PaulK
01-18-2010 3:36 PM


Re: l
quote:
Well we're not doing that.
Then what are we doing? I know I'm not. But it seems you are, by trying to say that the enzyme didn't lose all its functions. We only know of the one it had, and it lost it.
quote:
That's illogical. Why can't you just admit that we don't know ?
I'm talking about the known functions. We know of the one it had, and as I said, it lost it. So we know how much you have to mutate that enzyme for it to lose the function we knew about.
quote:
And nobody has challenged that. So all it means is that you have put a lot of effort into arguing against a fact that you say doesn't matter. Well, why bother ? Why not just accept it and move on ?
No, I was arguing from the start that this experiment shows how many mutations it takes for an average enzyme to lose its known function.
quote:
So what you are saying is that it makes no sense to follow the method because you think that it might give a result you don't want.
You have to include ALL possible "bidirectional rotary motor-driven propellers" because all of them fit the specification "bidirectional rotary motor-driven propellers". Because for a specification D, D* is EVERY possible thing that is delimited by D. That is how it is defined. That is the point of using a specification, to find the specified information. The specification D tells us ONLY that the event is in D*. So the probability of THAT happening is the probability we use for the specified information.
They are all included in the number 10^20. Nothing else gets added. If you mean the other number, the one that describes the event E, in this case the 50 proteins, then no, it makes no sense to include anything else. Because, as I said earlier, some pattern that describes a "bidirectional rotary motor-driven propeller" might be less than 400 bits, and some other more than 1,000,000 bits. So it's obvious that they are different cases of design, or non-design, because they have different probabilities of arising by chance.
quote:
As I have already shown it is the probability of D*, not E, that matters - according to Dembski himself. The Design Inference p165.
D* is 10^20.
quote:
If most of that is unspecified information, it can be ignored. That's the Design Inference - low probability, unspecified results are attributed to chance.
But what if it's not? What if it is a GIANT flagellum that consists of 1,000,000 bits? What then?
quote:
As I said, if you want me to help, you need to give me the details of your original calculation - I need them to help you. Obviously your failure to do so is a mere oversight - since you have no reason to stall. (But don't worry - you still get to do the work).
I simply took away 20% from 10^2954.
quote:
As I said, Dembski is using a correction factor, based on the number of concepts in the specification. More concepts means using a higher number. (And you will note that your 10^120 figure doesn't show up in either quote. It isn't in the immediate context either.)
Do I have to quote every single letter? The figure 10^120 is mentioned after the quote. If you actually read the whole paper you wouldn't even be mentioning that.
quote:
This is why we don't attribute pulsar signals to design. If we ignored the existence of the pulsar, and only considered pure chance as an explanation for the regular signal pulses we would have to conclude that the regularity of the signal was very unlikely and - inevitably - the sequence of pulses would soon qualify as CSI. If we followed your reasoning we would then conclude that the pulsar had the same "CSI" you calculated and conclude that the signal and the pulsar were designed. But we don't do that. We follow the thinking that I've outlined. We start by working out if the existence of pulsars is likely - and when we decide they are, then that is all we need to attribute the signal to natural causes. The pulsar which produces the signal is "simpler" than the signal would be - if you ignore the existence of the pulsar in calculating the probability of the signal.
Of course not. Pulsars do not correspond to any independently given pattern. Therefore, they are not designed. Now, if they were pulsing out coded messages, like, for example, a cure for cancer in Morse code, now that would be design.
quote:
If it isn't unusual to have a mutation that is strongly beneficial to individuals heterozygous for the allele and very strongly deleterious to those that are homozygous why not produce a few common examples ? Remember to provide evidence that this really is the case.
I'm done providing anything for you anymore. My point stands: that mutation, like almost any other, is beneficial in some environments and deleterious in others. That's how natural selection works.
quote:
I don't know. But since your argument doesn't work unless they are, it is only fair to give you the opportunity to make a case for it.
Obviously you don't know. They are not correlated; they interfere with genetic selection. Non-genetic traits also get evaluated by natural selection. So since the individual gets evaluated as a whole, the genetic traits are just a portion of what gets evaluated. And therefore, positive selection for beneficial mutations and against deleterious mutations suffers from noise.
quote:
And without the correlation - without a systematic bias - it will go the other way and reinforce selection just as often. That is the point. In a large population, noise averages out. That's statistics for you.
But not 100%. Genetic traits are just a small portion of what gets evaluated. You would have to have an infinitely large population for natural selection to be 100% efficient. But you don't have that. Therefore, mutations accumulate.
quote:
Except that it didn't deteriorate. It optimised itself by losing a lot of unnecessary junk. Instead of going into extinction, it was such a great success that it drove it's rivals into extinction. It beat genetic entropy.
Stop dodging my question. How is this RNA chain, or any other, supposed to evolve into a human in 3.6 billion years if it keeps getting shorter!?

This message is a reply to:
 Message 708 by PaulK, posted 01-18-2010 3:36 PM PaulK has replied

Replies to this message:
 Message 761 by PaulK, posted 01-22-2010 2:58 AM Smooth Operator has replied

Smooth Operator
Member (Idle past 5135 days)
Posts: 630
Joined: 07-24-2009


Message 757 of 1273 (543920)
01-21-2010 10:19 PM
Reply to: Message 709 by Nuggin
01-18-2010 7:32 PM


Re: funny thing happened on the way to nirvana ...
quote:
I ALREADY provided you with a quote AND a link to Dembski SPECIFICALLY stating that it's Creationism (including ADAM and EVE).
If you are going to lie, try and do it to someone who hasn't already provided you with the evidence you are denying, you stand a better chance.
Dembski is a Christian. So what? That has nothing to do with ID. A person can accept ID with or without being a Christian. Dembski is a Christian, again, so what?
quote:
Fantastic. So, give us an example OTHER THAN the one you are claiming it works for so we can CHECK your methods.
What, APART FROM CREATIONISM, does your claim work for? Remember, you are the one claiming that we don't need to know the mechanism, so please give us an example where the mechanism is unknown.
The one I gave you is good enough. If you do not like it, calling it "CREATIONISM" is not going to help you.

This message is a reply to:
 Message 709 by Nuggin, posted 01-18-2010 7:32 PM Nuggin has replied

Replies to this message:
 Message 759 by Nuggin, posted 01-21-2010 11:00 PM Smooth Operator has replied
 Message 760 by Coyote, posted 01-21-2010 11:06 PM Smooth Operator has not replied

Smooth Operator
Member (Idle past 5135 days)
Posts: 630
Joined: 07-24-2009


Message 758 of 1273 (543921)
01-21-2010 10:20 PM
Reply to: Message 718 by Taq
01-19-2010 12:43 PM


Re: Nonsensical creationist notions
quote:
So let me get this straight. Simple unicellular organisms are the acme of evolution while multicellular metazoans are just a degraded version of these unicellular organisms. Is that correct?
No, evolution doesn't work on anything. Short RNA chains could never have evolved into anything more complex because they always keep getting shorter. I already discussed the Spiegelman monster in my previous posts; please look them up.
quote:
But we observe evolution occurring and we do not observe what you define as new information. Obviously, evolution does not require this "new information" in order for it to proceed. You have argued your way out of the debate.
Well then, we obviously have different definitions of evolution. What is your definition, anyway?
quote:
I am assuming nothing. I make a prediction. I predict that if a gene descended from a common ancestor that a phylogenetic comparison should produce a nested hierarchy consistent with the morphological trees. As I have already shown the GFP gene in Glofish causes them to fail this test. We are testing FOR common ancestry, not assuming it.
But you do understand that you are simply assuming that one fish can actually evolve into another? Maybe it can, I don't know. Let's say it can; so what? Maybe those genes were designed that way. Just because they form a nested hierarchy doesn't mean they evolved.

This message is a reply to:
 Message 718 by Taq, posted 01-19-2010 12:43 PM Taq has not replied

Smooth Operator
Member (Idle past 5135 days)
Posts: 630
Joined: 07-24-2009


Message 805 of 1273 (544164)
01-24-2010 12:16 PM
Reply to: Message 759 by Nuggin
01-21-2010 11:00 PM


Re: funny thing happened on the way to nirvana ...
quote:
ID is a Christian political movement created because the term "creationism" wasn't winning court cases.
The Discovery Institute and the "cDesign Proponentsists" that work with it, including Dembski, are actively pushing a Creationist agenda.
They've even published the Wedge Document which OUTLINES their strategy.
They are all Creationists.
Everyone who knows anything about the history of the modern ID movement knows this is not true. The majority of ID proponents are Christians; so what? The majority of evolutionists are atheists; again, so what? I could also call modern evolutionary theory a political atheist movement. What good would that do?
The modern ID movement was formed separately from any court case concerning "creationism". The notion, and the term "intelligent design", were in use well before any such court case.
quote:
Opponents of the theory often insist that intelligent design emerged as a conspiracy to circumvent the 1987 Supreme Court decision, Edwards vs. Aguillard. There the Court struck down a Louisiana law promoting the teaching of creation science in public school science classes. The theory of intelligent design, critics insist, is merely a clever end-run around this ruling, biblical creationism in disguise.
The problem with this claim is that intelligent design predates Edwards vs. Aguillard by many years. Its roots stretch back to design arguments made by Socrates and Plato, and even the term intelligent design is more than 100 years old. Oxford scholar F.C.S. Schiller employed it in an 1897 essay, writing that "it will not be possible to rule out the supposition that the process of Evolution may be guided by an intelligent design."
The Origin of Intelligent Design | Discovery Institute
quote:
Find me some prominent ID proponents who are not Christian. People who are published and recognized in the field. Not "My cousin Larry". Real people.
Why? What would be the point? Why are "prominent" and "published" scientists so much more important than the rest? What's your point anyway?
What about Steve Fuller, who is a secular humanist? He supports ID.
Steve Fuller (sociologist) - Wikipedia
quote:
And round and round we go.
No. It's not. You know it's not. I've explained to you why it is not. You've commented on my explanations.
You CAN NOT check something against itself for verification.
I can't give you a newly manufactured ruler with a "1 foot" marking on it and have you VERIFY that it is 1 foot long by simply reading that it says "1 foot".
That's NOT verification.
Likewise, your "magic Creationism Equation" can not be used to VERIFY ITSELF as proof that the Jew Wizards Jew Beams are zipping around poofing everything into existence.
And you KNOW that I'm right.
That's why you are so busy ducking and dodging. If you thought that your explanation worked for other examples, you'd be trotting them out left and right.
But you aren't.
Instead you are hiding.
So here's the situation. You are wrong. You know you are wrong. I know you are wrong. Everyone reading the post knows you are wrong.
So, just admit you can't come up with any more examples cuz you're making it all up
I'm not checking something against itself. I'm checking the validity of CSI in the way it was supposed to be done. I'm not going to bother doing anything else, so you might as well drop it right there.

This message is a reply to:
 Message 759 by Nuggin, posted 01-21-2010 11:00 PM Nuggin has replied

Replies to this message:
 Message 811 by Nuggin, posted 01-24-2010 2:41 PM Smooth Operator has replied
 Message 813 by Coyote, posted 01-24-2010 3:11 PM Smooth Operator has not replied
 Message 862 by Taq, posted 01-25-2010 5:37 PM Smooth Operator has not replied

Smooth Operator
Member (Idle past 5135 days)
Posts: 630
Joined: 07-24-2009


Message 806 of 1273 (544165)
01-24-2010 12:17 PM
Reply to: Message 761 by PaulK
01-22-2010 2:58 AM


Re: l
quote:
I am saying that we DON'T KNOW if it lost all function. How hard is that to understand ?
We know it lost all KNOWN functions. That is the only thing I'm interested in anyway. Like I said before, it may very well be that it's useful for something else. I'm not disputing that. Maybe it is, maybe it isn't. But what we do know is how many mutations it takes for an enzyme to lose its known function.
quote:
Hello! Nobody disagreed with that ! This argument is over whether we KNOW that it lost ALL function. And it seems that you concede that we don't, but you go on arguing and arguing about nothing.
Umm... no. I don't care if it lost all functions, even those we do not know whether it had, because they were not tested for anyway.
quote:
Wrong. The number 10^20 is the estimated number of 4-level concepts. It is Dembski's attempt to compensate for the fact that "bidirectional rotary motor-driven propeller" is chosen in hindsight.
And D* = bidirectional rotary motor-driven propeller = 10^20.
quote:
I have thought about it and it makes no sense to do anything else. As Dembski points out it is P(D*) that needs to be low to conclude design. And if you think about that (and remember the examples discussed before) that makes perfect sense - it makes no sense to use the probability of the exact event - and that is exactly what Dembski does in his handling of the Caputo case. Since D is "bidirectional rotary motor-driven propeller", we need the probability of getting a "bidirectional rotary motor-driven propeller".
Is the probability of a "bidirectional rotary motor-driven propeller" that consists of 10 proteins equal to the probability of a "bidirectional rotary motor-driven propeller" that consists of 100,000,000,000 proteins?
quote:
D* is D considered as an event. In this case that would be "a bidirectional rotary motor-driven propeller". 10^20 is Dembski's estimate of the number of 4-level concepts.
Yes, exactly, D* here is "a bidirectional rotary motor-driven propeller", which, let me repeat, means that D* = bidirectional rotary motor-driven propeller = 10^20. So I was right from the start.
quote:
In that case it won't contribute much to P(D*). That would be obvious to anyone with a basic understanding of the mathematics. Very unlikely events that fall within the specification won't contribute much to the probability of meeting the specification. And 1,000,000 bits is a probability of 2^-1,000,000 - very, very unlikely.
I know it doesn't! But it does matter to the probability of E, which is also important.
Do you or do you not understand that the probability of getting a 6 when throwing a die is 1/6? If you have two dice and you want to get a 6 on both, the probability decreases, and now it's 1/36. Therefore, a flagellum consisting of 50 proteins has a higher probability of forming by chance than a hypothetical one consisting of 10,000,000 proteins. Therefore its complexity is relevant to the calculation!
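Spelling the arithmetic out (illustration only; the assumption that every required part has the same independent chance p is mine, purely for scale):

```python
import math
from fractions import Fraction

# Independent events multiply: the chance of two dice both showing six.
print(Fraction(1, 6) ** 2)           # 1/36

# Same logic, illustration only: assume (my assumption, just for scale) that
# each required part has the same independent chance p of arising; n required
# parts then have chance p**n, which shrinks exponentially with n.
p = 0.05
for n in (2, 50, 10_000_000):
    print(f"{n} parts: about 10^{n * math.log10(p):.0f}")
```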
quote:
I think that you mean that you took 20% away from the exponent. Because taking 20% away from 10^2954 gets you 8*10^2953. Yes, I already worked out that that was how you applied the 20%. And if you remember I showed exactly why that was wrong.
That's not the calculation I need to know. I need to know how you got the figure of 10^2954 in the first place.
Dear God! I already told you! It's in the NFL, which I already said is what Dembski calculated.
quote:
Actually you should quote the relevant stuff. The figure 10^120 is just another way of presenting Dembski's universal probability bound (in this case reduced to 400 bits).
If you know what the number represents, then obviously I shouldn't bother.
quote:
The pulsar signal does. It's a regular series of pulses. It's certainly not random. A lighthouse produces much the same sort of signal.
A regular pattern does not equal a specification. Just because it's regular doesn't mean anything. It has to have some meaning besides itself. Like I said, a cure for cancer, a Guns N' Roses song in binary digits, etc.
quote:
In other words you choose to dodge the issue. I said that the sickle-cell allele was unusual as a beneficial mutation because it is only beneficial in the heterozygous state, while being deleterious in the homozygous state. If you want to argue otherwise you have to produce evidence that that is not unusual. Arguing that it is normal in some other respects simply ignores my point.
WTF am I dodging? Nothing! You are the one who is pretending not to understand what I'm talking about.
I DO NOT CARE IF THE MUTATION IS UNUSUAL!!!!!!
The point is that natural selection selects anything, including the "unusual" mutations, if they confer reproductive fitness, even those mutations that reduce biological functions, such as the sickle-cell mutation. Therefore, it contributes to genetic entropy.
quote:
As I said, it is your argument that would benefit from the existence of the correlation. If we are simply dealing with "noise" then the larger the population, the less effect it has. This is why genetic entropy is only a real problem for small populations.
No! Stop repeating this crap over and over again. There is no noise averaging. For a full removal of all mutations you would need an infinitely large population. Since you don't have one, mutations accumulate. I never said ANYTHING about some correlation. The effects, that is, the noise, are too strong to be averaged out, precisely because genetic changes are not strong enough, and they are just a tiny part of what is being evaluated by natural selection.
This is the last source I'm citing for you, because you are pretending not to understand what I'm talking about. Therefore, this conversation is useless.
http://discovermagazine.com/2006/nov/cover
Look at these two mice. They are from the same parents. They are genetically about as identical as two offspring can be. No mutations have happened in either of them. This is an epigenetic effect known as DNA methylation. The mother first gave birth to a giant, crappy, low-fitness yellow mouse. After the mother was fed food containing a large proportion of methyl groups, those chemicals attached themselves to the backbone of the DNA and regulated it, so the next mouse that was born was small, brown, healthier, and of higher fitness. And ZERO, and I mean ZERO, genetic changes happened. No changes in the genetic sequence occurred.
As anyone can clearly see, these changes are drastic and have absolutely nothing to do with genetics. As a matter of fact, the healthier small mouse could actually have a deleterious mutation for all we know, and it would on average have MORE reproductive fitness than the crappy giant mouse. Therefore, natural selection would on average favor it. No large population is going to average out this kind of noise. So deleterious mutations in those mice, on average, do get passed on.
And the best part is that this is only ONE, yes, only ONE, source of noise! You still have FIVE other sources to consider. It's plain and obvious that the noise is too large for natural selection to effectively remove deleterious mutations.
quote:
Natural selection doesn't have to be anything like 100% efficient to prevent genetic entropy. All it has to do is to maintain a balance where the average fitness is held at a high enough level to maintain the population. If the population is adequately fit, and the rate at deleterious alleles are lost is the same as the rate at which they enter the population then they aren't accumulating. If you want to prove that genetic entropy is inevitable on theoretical grounds you are going to have to crunch the numbers and find out where that balance point is.
LOL! This is simple math! Anything less than 100% equals accumulation of mutations!
Only if every deleterious mutation gets removed are the genetic traits kept in balance.
If 1 in 10 is not removed, genetic entropy increases, because for every 10 mutations 1 will accumulate!
If 1 in 1,000 is not removed, genetic entropy increases, because for every 1,000 mutations 1 will accumulate!
If 1 in 10,000,000 is not removed, genetic entropy increases, because for every 10,000,000 mutations 1 will accumulate!
etc...
Only perfect selection can keep things in balance. Anything else equals accumulation.
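Spelling that out as toy arithmetic (U and the escape fractions are assumed, illustrative values):

```python
# Toy arithmetic for the claim above (assumed, illustrative numbers): if U new
# deleterious mutations arise per genome per generation and a fraction f of
# them escapes removal, the expected number accumulated after t generations
# is simply f * U * t -- any f > 0 keeps growing, just more slowly.
U = 1.0
for f in (1 / 10, 1 / 1_000, 1 / 10_000_000):
    for t in (1_000, 1_000_000):
        print(f"escape fraction {f:g}, {t:,} generations: {f * U * t:g} accumulated")
```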
quote:
I already told you that I do not intend to add another separate topic to this discussion. You've seen Admin ask for focus, and suggest narrowing posts down to a single topic. We're already discussing two topics. I decline to add a third.
Thank you for admitting that it's physically impossible for the chain to evolve into a human, because it's obvious to any reasonable person that a mechanism that makes something constantly get SHORTER cannot at the same time make it LONGER!
It's a physical impossibility!
Edited by Smooth Operator: No reason given.

This message is a reply to:
 Message 761 by PaulK, posted 01-22-2010 2:58 AM PaulK has replied

Replies to this message:
 Message 810 by PaulK, posted 01-24-2010 1:31 PM Smooth Operator has replied
 Message 812 by Nuggin, posted 01-24-2010 2:47 PM Smooth Operator has not replied
