EvC Forum: Understanding through Discussion


Author Topic:   What exactly is ID?
Smooth Operator
Member (Idle past 1544 days)
Posts: 630
Joined: 07-24-2009


Message 691 of 1273 (543139)
01-15-2010 4:15 PM
Reply to: Message 679 by Taq
01-13-2010 1:38 PM


Re: Nonsensical creationist notions
quote:
Then evolution necessarily requires a loss in information according to your definition. For example, both the chimp and human lineages lost information found in the original common ancestor. Chimps and humans are degraded, according to you. The same for all mammals (including humans) as they have degraded from the common ancestor of mammals. The same for all amniotes, all vertebrates, all eukaryotes, and every single species since the last universal common ancestor. By extension, you are arguing that when an organism evolves from a simple state to a complex state this is a degradation event because the information for making a simple organism has been degraded.
Exactly. The more time passes, the more genetic entropy increases. All populations are getting worse, not better, because their biological functions are degrading.

quote:
You have argued your way out of the debate. You claim that evolution requires new information, and yet observed instances of evolution do not produce this entity. Therefore, evolution does not require new information. You are out of the game.
No, what this simply means is that evolution doesn't work. I think it's clear to anyone that if evolution requires something to be produced for it to work, and that something is observed not to be produced, we are going to conclude that evolution is not working, not that it is working.

quote:
The article claims that mammalian evolution trends towards larger body size in many lineages. Obviously, this isn't so in all lineages. Mice, for example, are quite small and they are . . . hmm, let me think . . . oh yes . . . MAMMALS. According to the paper, their population size prevents the accumulation of deleterious mutations compared to larger mammal lineages.
Actually, it says the amount of deleterious mutations is accumulating, not decreasing.

quote:
Our detector, in this case, is phylogenetic comparisons of DNA. This detector is capable of telling us "evolved" and "not evolved".

For example, we can compare the genes of Glofish (link), jellyfish, and trout. What we should find, if evolution is true, is that the Glofish genes should more closely resemble that of trout than of jellyfish. What do we find? We find that this is false for certain genes. Why is that? Because these genes DID NOT EVOLVE. Glofish contain genes which are almost exact copies of genes found in jellyfish, but not found in trout. How did this happen? Through intelligent design. Humans moved genes from jellyfish into Glofish to make them fluoresce under UV lights.

There you have it. A detector that can detect both evolution and not evolution. So what detector will tell us ID or not ID?


How do you know the similar genes are evolved? Aren't you first assuming common descent to be true, only to then turn around and claim that this is evidence for common descent? Obviously this is circular reasoning. You can't say that similarity is evidence for common descent. Similarity is simply evidence for similarity, nothing more. You need evidence that would show us that their similarity is due to common descent, not just assume it is from the start.
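The "detector" described in the quoted post can be sketched with a toy similarity comparison. This is a minimal illustration only: the sequences below are invented, and real phylogenetics uses alignments and substitution models rather than raw identity.

```python
def identity(a, b):
    """Fraction of aligned positions that match (toy metric only;
    real phylogenetics uses alignments and substitution models)."""
    return sum(x == y for x, y in zip(a, b)) / min(len(a), len(b))

# Invented toy sequences, purely for illustration.
trout     = "ATGGCCATTGTAATGGGCCGC"
glofish   = "ATGGCCATCGTAATGGGCCGA"
jellyfish = "ATGTCAATTCTATTGGCCAGC"

# Under common descent, a native Glofish gene should resemble trout
# more than jellyfish; a transgenic fluorescence gene would instead
# match its jellyfish source.
print(identity(glofish, trout), identity(glofish, jellyfish))
```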
This message is a reply to:
 Message 679 by Taq, posted 01-13-2010 1:38 PM Taq has responded

Replies to this message:
 Message 694 by Coyote, posted 01-15-2010 4:38 PM Smooth Operator has not yet responded
 Message 718 by Taq, posted 01-19-2010 12:43 PM Smooth Operator has responded

  
Smooth Operator
Member (Idle past 1544 days)
Posts: 630
Joined: 07-24-2009


Message 692 of 1273 (543140)
01-15-2010 4:15 PM
Reply to: Message 684 by Malcolm
01-13-2010 7:52 PM


Re: Nonsensical creationist notions
quote:
Of course the ribosome binds to something else. It binds to mRNA and the anti-codons of tRNA carrying amino acids, in its role in protein synthesis. Streptomycin interferes with the normal role of the ribosome by irreversibly binding to it. This is why streptomycin is an antibiotic, since without protein synthesis the bacterial cell dies, and why it is ridiculous to describe the failure to bind streptomycin as a loss of function. The relevant mutation allows the ribosome to continue functioning in protein synthesis even in the presence of streptomycin, which can be described as a gain in function.
It's not really a case of loss of a specific function in this case, as it is a degradation of structure, which is also a loss of information. A biological function itself does not have to get damaged every time the structure degrades. But information is lost. And with enough degradation of the structure, the function itself will degrade.
This message is a reply to:
 Message 684 by Malcolm, posted 01-13-2010 7:52 PM Malcolm has not yet responded

  
Smooth Operator
Member (Idle past 1544 days)
Posts: 630
Joined: 07-24-2009


Message 693 of 1273 (543141)
01-15-2010 4:16 PM
Reply to: Message 685 by Wounded King
01-14-2010 6:05 AM


Re: Nonsensical creationist notions
quote:
You can't make it sensible to talk about binding sites which have evolved/arisen to bind a particular protein representing CSI in the binding target in all cases. Using your logic every time an antibody is raised to a different epitope on a protein the information content should rise! Every animal with an adaptive immune system is increasing the genetic CSI content all the time!
No, you are missing the point. Fine tuning of already existing mechanisms is not a gain in information. It's simply a tuning of what already exists. And as I already said, information, to be called CSI, has to have a complexity of at least 400 bits. One mutation is not 400 bits. Therefore, such changes do not equal an increase in CSI.

quote:
But now you talk about 'original information content', which is quite different. I put it to you that the 'original information content' would have arisen before the full streptomycin biosynthesis pathway, so in fact all you are losing is, as I suggested before, the 'free' informational value imparted by the development of streptomycin biosynthesis rather than any of the 'original information content'.
What's "free" information value?

quote:
But what you really seem to be saying is, once again, that any change from the first sequence derived from a gene is a loss of information, or in Durston et al.'s approach essentially any deviation from the consensus from an alignment of related sequences.
Nope. I said many times that it does not have to constitute a loss of information. Mechanisms inside the cell can modify the DNA and leave it with the same amount of information as before. Random mutations can do the same, but they do not on average. On average, they decrease the information. But no, not every single time.

Imagine a statement: "NICE WEATHER WE ARE HAVING TODAY."

If a random mutation happened in that statement, we could have something like this: "NIcE WEATHER WE ARE HAVING TODAY."

Instead of a capital C, we now have a small c. So what? Did the meaning of the statement change? Nope. But the letter did change. Yet the change did not cause any loss of meaning.

quote:
The binding specificity, whatever it is for, seems totally unrelated to what you are saying; you are trying to hang some element of functionality on it when there simply isn't any. The only functional effect the mutant has is to allow the bacteria to survive in the presence of streptomycin. It hasn't lost the function of binding to streptomycin because that was never its function.
I agree. As I already said above, this is not really the same as a loss of functionality per se. But biological function is a subset of biological structure. If you degrade the structure, you can, but will not every single time, also degrade the function.

If you scratch the mechanism of your car's ignition, and you do not damage it so much that it loses its function, did you degrade the function? Nope. But you did damage the structure of the ignition mechanism. And this mechanism is what performs the function. Scratch it enough times and you will damage the structure so much that you will not be able to start your car anymore. And now, as you can clearly see, you have lost the function. Both the structure and the function have degraded.

So like I said, no, you do not degrade the function every time you degrade the structure. But it is still a loss of information, which, when repeated enough times, does lead to loss of function.

quote:
The ribosome is arbitrary in some ways, it is not 100% conserved amongst all species so clearly there is some allowable variation at different positions. You have decided to arbitrarily decree that any mutation changing the sequence from its inital state is a loss of information. Similarly Durston et al. arbitrarily decree that any change away from their consensus sequence will be a loss of information.
That is actually a mathematical fact. Any organized system will, on average, degrade if it is changed at random. The odds are simply such that you have more chance of hitting a high-entropy sequence than a low-entropy one.

Imagine a simple statement like the one I gave in the last example. If you make simple changes, letter by letter, at random, meaning you do not see which letters you are changing, are you on average going to decrease its organization, or increase it? Obviously you will decrease it, which means you will increase its entropy.

Now tell me, why would random changes to the ribosome be any different? Surely they are not. Any random change will, on average, make the ribosome less organized, not more.
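The "on average" claim in the sentence example can be checked with a short simulation. This is a toy sketch only (blind letter substitutions on a sentence, not a model of real mutation); the helper names are invented for illustration.

```python
import random

SENTENCE = "NICE WEATHER WE ARE HAVING TODAY."
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ ."

def mutate(text, n, rng):
    """Apply n random single-character substitutions."""
    chars = list(text)
    for _ in range(n):
        pos = rng.randrange(len(chars))
        chars[pos] = rng.choice(ALPHABET)
    return "".join(chars)

def matches(a, b):
    """Count positions still agreeing with the original."""
    return sum(x == y for x, y in zip(a, b))

rng = random.Random(0)
avg = sum(matches(SENTENCE, mutate(SENTENCE, 10, rng))
          for _ in range(1000)) / 1000
# A blind substitution overwhelmingly hits a 'wrong' letter, so the
# average drifts away from the original, even though an occasional
# change can restore a letter by chance.
print(avg)
```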


This message is a reply to:
 Message 685 by Wounded King, posted 01-14-2010 6:05 AM Wounded King has responded

Replies to this message:
 Message 697 by Wounded King, posted 01-15-2010 5:45 PM Smooth Operator has responded

  
Coyote
Member
Posts: 4702
Joined: 01-12-2008
Member Rating: 2.2


Message 694 of 1273 (543142)
01-15-2010 4:38 PM
Reply to: Message 691 by Smooth Operator
01-15-2010 4:15 PM


Re: Nonsensical creationist notions
The more time passes, the more genetic entropy increases. All populations are getting worse, not better, because their biological functions are degrading.

So in another 3.5 billion years we're in real trouble, eh? Or is it 10 billion years? 20 billion?

I'm not about to worry about something that is going to happen in 20 billion years (if then) when the sun will consume the earth in a quarter that time.

You keep making these claims but you have yet to quantify "genetic entropy" in terms that mean anything. Nor are you able to support it from mainstream science. When the only support for an extreme claim is from a scientist who is also a fundamentalist or a creationist, I tend to want a second opinion.

And you have yet to convince me that your "belief" in genetic entropy is based on something from science and not biblical literalism, i.e. the fall and a young earth. Belief in those two concepts would certainly alter your perception of reality and contribute to your contra-scientific faith in this genetic entropy.

What is this, the third or fourth concept that you have pushed here -- all the while claiming they are based on science -- when mainstream science says the exact opposite and the only real source that agrees with you is the bible? Are we in for a thread on a flat earth or pi=3 next? How about the flood? Let's do the flood next -- that is one area I have some good data to share with you.


Religious belief does not constitute scientific evidence, nor does it convey scientific knowledge.
This message is a reply to:
 Message 691 by Smooth Operator, posted 01-15-2010 4:15 PM Smooth Operator has not yet responded

Mr Jack
Member (Idle past 354 days)
Posts: 3475
From: Leicester, England
Joined: 07-14-2003


Message 695 of 1273 (543144)
01-15-2010 4:58 PM
Reply to: Message 288 by Smooth Operator
12-23-2009 10:03 AM


Re: Please explain E. coli
Yes, we do. We see them in ALL species.

Cite an example then, please.

No. We do not know how many mutations they would have accumulated. Predicting real numbers is not possible, simply because we do not know what the initial population size was.

Why would we need to know the initial population size? I'm talking about in a single gene line - as in those descended from a single individual.


This message is a reply to:
 Message 288 by Smooth Operator, posted 12-23-2009 10:03 AM Smooth Operator has responded

Replies to this message:
 Message 701 by Smooth Operator, posted 01-18-2010 2:31 PM Mr Jack has responded

PaulK
Member
Posts: 10798
Joined: 01-10-2003
Member Rating: 1.5


Message 696 of 1273 (543148)
01-15-2010 5:28 PM
Reply to: Message 688 by Smooth Operator
01-15-2010 4:14 PM


Re: l
quote:

No, that's not the question. That was not the point of the experiment. The point was to show how much mutational load an enzyme can take before it loses its function. The one function that was known to exist was measured. And it went away after some time. So now we know how many changes there can be on average before a certain function is lost.

I am afraid that you are incorrect again. You claimed that it had lost all function. I claimed that you did not know that. And as we have seen I was right, although it took an amazingly long series of posts for you to realise it.

quote:

The only relevant question is. Does the flagellum describe this pattern: "bidirectional rotary motor-driven propeller"? The answer is - yes. So since we know of other objects that exhibit this pattern, we conclude that it's a specification, and not a fabrication.

Amazingly enough your clumsy phrasing has made you say something that is about right, although probably unintentionally. For your calculation to be correct, the E. coli flagellum (plus any allowed variants) must include all "bidirectional rotary motor-driven propellers". And in that sense it would describe it. However, we are certainly in no position to say that that is true, and in fact I know that it is not true.

You are also correct to say that "bidirectional rotary motor-driven propellers" is a valid specification, because there are other things that fit the definition. Unfortunately this means that all those things must be included in D*. And since your calculation uses specifics of the E. coli flagellum that do not necessarily apply to all "bidirectional rotary motor-driven propellers" you are calculating the probability for a different "specification" - and in fact that "specification" is a fabrication.

quote:

I'm willing to do the "correct" calculation. I'm simply waiting for you to tell me what to do.

I've told you what to do. Calculate the number of sequences that are no more than 20% different. If you can't even work out how to do that then you don't know even basic probability theory. I might be willing to help you a little more, but to do that sensibly you will need to explain how you did the calculation in the first place. Something you were unwilling to do before.
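The count asked for here is elementary combinatorics: for a fixed sequence of length n over an alphabet of size a, the number of sequences differing in at most d positions is the sum over k of C(n, k)(a-1)^k. A sketch, assuming a hypothetical 100-residue protein (the actual length is not given in the thread):

```python
from math import comb

def sequences_within(n, d, alphabet_size):
    """Count sequences of length n over an alphabet of the given size
    that differ from a fixed reference in at most d positions."""
    return sum(comb(n, k) * (alphabet_size - 1) ** k
               for k in range(d + 1))

# Hypothetical 100-residue protein over the 20 amino acids,
# allowing up to 20% (20 positions) to differ.
count = sequences_within(100, 20, 20)
print(f"{count:.2e}")  # astronomically many 'allowed' variants
```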

quote:

Let's see what Dembski has to say about that...

quote:
For a less artificial example of specificational resources in action, imagine a dictionary of 100,000 (= 10^5) basic concepts. There are then 10^5 1-level concepts, 10^10 2-level concepts, 10^15 3-level concepts, and so on. If “bidirectional,” “rotary,” “motor-driven,” and “propeller” are basic concepts, then the molecular machine known as the bacterial flagellum can be characterized as a 4-level concept of the form “bidirectional rotary motor-driven propeller.” Now, there are approximately N = 10^20 concepts of level 4 or less, which therefore constitute the specificational resources relevant to characterizing the bacterial flagellum.
http://www.designinference.com/.../2005.06.Specification.pdf

Let me explain briefly what he is saying here.

The ϕs(T) part of the equation describes the specificational resources. That is, the number of all possible specifications relevant to the specification T that is exhibited by the event E. Imagine now a descriptive language D*, which we will say is the English language, consisting of approx. 100,000 words. That's 10^5 basic concepts.

Using the definition of CKS information theory, the flagellum describes the pattern "bidirectional rotary motor-driven propeller". This pattern consists of 4 basic concepts, that is, 4 words. The total complexity of this specification is thus 10^5 × 10^5 × 10^5 × 10^5, which equals 10^20.

This is the complexity of the specification as calculated by Dembski himself. After that we multiply it by 10^120, and by the probability of the event, and then we get the final number...


I am afraid that you have completely misunderstood what Dembski is saying here. Dembski is attempting to deal with the issue that there are many more possible targets that evolution might have hit. He does so by estimating the number of concepts at the same level and including it as a correcting factor (this is why it reduces the specified information content).

Clearly adding in extra details - even if they are legitimate - would mean using a higher level concept, and thus this factor would be increased.
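The arithmetic being argued over can be written out in a few lines. This is a sketch of the formula as quoted (chi = -log2 of 10^120 times phi_S(T) times P(T|H)); the probability values below are placeholders I have invented, not numbers from the thread.

```python
from math import log2

# Specificational resources phi_S(T): concepts of level 4 or less,
# with ~10**5 basic concepts (the figure in the quoted passage).
phi = sum((10 ** 5) ** level for level in range(1, 5))  # ~10**20

def chi(log10_p):
    """Specified complexity chi = -log2(10**120 * phi * P(T|H)),
    computed in log space so tiny probabilities don't underflow.
    log10_p is the base-10 log of a placeholder P(T|H)."""
    return -(120 * log2(10) + log2(phi) + log10_p * log2(10))

# Purely illustrative probabilities: the method reports "design"
# only when chi exceeds 1.
print(chi(-1170) > 1, chi(-100) > 1)
```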

quote:

I agree. The problem I have with your interpretation is that you confuse growth mechanisms with high probability events.

Again, you are making a mistake. My point was that the growth mechanisms have a high probability of producing a flagellum. Of course we can say that bacterial reproduction includes producing the physical growth mechanisms with high probability. But again, all this means is that the design proponent has to go back to the origin, which is where you say that the actual design implementation occurred.

quote:

But this is not a difference. Remember. Mutations are neutral/beneficial/deleterious depending on the environment. Therefore, in some environment, blue eyes could be beneficial, or deleterious. So no, there is no difference, the same natural selection is operating on all traits.

In all cases it depends on the environment and the mutations whether they will be removed or spread, and to what extent. There is nothing special in this case.


Of course the point rests not on what MIGHT be the case in some hypothetical environment but on what actually is the case in the environments which actually exist. And in that environment, the allele for blue eyes is classified as neutral and the sickle cell allele is classed as beneficial overall (in malarial areas), but is very strongly deleterious in individuals who are homozygous for the allele. And that is unusual.

quote:

Neither are a lot of other traits. So what? Are blue eyes fixed in the human population? No, obviously they are not. Only a small minority has them. Does that mean they are deleterious? No, obviously not.

In other words, it doesn't matter whether what you say is true or not? If you really don't care, then say so. You claimed that the sickle-cell allele was spreading. You claimed that it had been fixed in the population. Neither is true.

quote:

That's my argument also. I was just making an example of an individual. Are you telling me that other traits that get selected besides genes are not working on the level of the population? Are you telling me they do not interfere with genetic selection on the level of the population over long periods of time? Of course they do! And that is why natural selection is inefficient.

That isn't quite true. Your argument is intended to establish what you think will happen to large populations over long periods of time but it doesn't take either point into account. If the other factors are hereditary then they may be included with genes. If they are not then - on the scales we are thinking of - they may be largely ignored unless there is good reason to think that they strongly correlate with the occurrence of a particular allele.

You see, the larger the population (in the statistical sense as well as the biological) the more the chance effects of noise tend to average out. This is why genetic drift is weaker in larger populations. So it is not enough to talk about individual cases or noise. What you need to show is that there is a systematic bias - and that that bias undermines selection to the point that mutational meltdown is inevitable.
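The point about noise averaging out in larger populations can be illustrated with a minimal neutral Wright-Fisher sketch. The parameters are arbitrary toy values, invented for illustration only:

```python
import random

def final_freq_variance(pop_size, generations=50, trials=200, seed=1):
    """Simulate neutral Wright-Fisher drift from frequency 0.5 and
    return the variance of the final allele frequency across trials."""
    rng = random.Random(seed)
    finals = []
    for _ in range(trials):
        p = 0.5
        for _ in range(generations):
            # Resample 2N gene copies binomially each generation.
            copies = sum(rng.random() < p for _ in range(2 * pop_size))
            p = copies / (2 * pop_size)
        finals.append(p)
    mean = sum(finals) / trials
    return sum((f - mean) ** 2 for f in finals) / trials

small = final_freq_variance(20)
large = final_freq_variance(500)
# Drift scatters allele frequencies far more in the small population.
print(small > large)
```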

quote:

No, it's the genetic entropy that is causing its shortness.

In other words you claim that an increase in fitness is caused by an inevitable decline in fitness. That doesn't make a lot of sense. I say that a sustained increase in fitness shows that the decline is not inevitable, which makes a lot more sense.

quote:

And the most important question is, how is that chain supposed to evolve into something more complex if it is constantly decreasing? It's not! And that's why evolution doesn't work. If it did, the chain would be getting longer. But it's not getting longer.

It may be an important question, but it is not relevant to either the genetic entropy discussion or the discussion of Dembski's method. I don't intend to add to the problems of this thread by adding a third subject to the discussion. If you want to talk about it then I suggest that you start a new thread.

Edited by PaulK, : Cleaned up a few typos


This message is a reply to:
 Message 688 by Smooth Operator, posted 01-15-2010 4:14 PM Smooth Operator has responded

Replies to this message:
 Message 698 by Tanndarr, posted 01-15-2010 7:27 PM PaulK has not yet responded
 Message 702 by Smooth Operator, posted 01-18-2010 2:31 PM PaulK has responded

  
Wounded King
Member (Idle past 525 days)
Posts: 4149
From: Edinburgh, Scotland
Joined: 04-09-2003


Message 697 of 1273 (543150)
01-15-2010 5:45 PM
Reply to: Message 693 by Smooth Operator
01-15-2010 4:16 PM


Re: Nonsensical creationist notions
You missed the entire point. You claim that the specific binding between streptomycin and the ribosome is due to CSI in the ribosomal sequences. So why is specific binding between antibodies and the epitopes they bind not due to CSI in the sequences which code for that epitope? Did you quickly evaluate the bits required for all possible epitopes and determine that they were all under 400 bits?

The 'free' information, as I have pointed out more than once now, is the information which has to have suddenly magically been added to the coding sequence for the ribosome when the full streptomycin synthesis pathway had developed and it suddenly had a high affinity binding partner.

You say you don't claim mutations are necessarily a loss of information, but you persist with your illogical claim that it is a loss of information in the streptomycin resistance example. Can we just agree that it's not a net change in information? Would you go that far?

To claim that any change in structure is a 'degradation' is bringing front and centre your assumption that the initial structure is some sort of optimal or ideal structure. It is totally inconsequential to the effect of any particular mutation that if you keep on changing the structure long enough at random its function will disappear.

Nice to see your basic understanding of entropy is as faulty as your concept of genetic entropy. You have, as usual, conflated any specific change with the average tendency of all random changes.

The fact that in general non-neutral mutations will tend to have a deleterious effect on a protein's function doesn't mean that beneficial mutations don't exist. You do understand that entropy is a statistical phenomenon?

Going from the general to the specific as you do totally defeats the entire point of this discussion.

TTFN,

WK


This message is a reply to:
 Message 693 by Smooth Operator, posted 01-15-2010 4:16 PM Smooth Operator has responded

Replies to this message:
 Message 703 by Smooth Operator, posted 01-18-2010 2:32 PM Wounded King has not yet responded

  
Tanndarr
Member (Idle past 1612 days)
Posts: 68
Joined: 02-14-2008


Message 698 of 1273 (543160)
01-15-2010 7:27 PM
Reply to: Message 696 by PaulK
01-15-2010 5:28 PM


Rotary Propeller Rotation
You are also correct to say that "bidirectional rotary motor-driven propellers" is a valid specification, because there are other things that fit the definition. Unfortunately this means that all those things must be included in D*. And since your calculation uses specifics of the E. coli flagellum that do not necessarily apply to all "bidirectional rotary motor-driven propellers" you are calculating the probability for a different "specification" - and in fact that "specification" is a fabrication.

Dembski's specification also relies on a redundancy which shows just how easy it is to tweak the concept to make it say what you want. A propeller is a device which converts rotary motion into thrust. 'Rotary' is implied by 'propeller' and should be removed from the description.

Dembski himself refuses to defend his flagellum argument. He's not interested in "materialistic" applications. I'm sure it has nothing to do with his fear of public ridicule. Better to sit on the sidelines and let the second-stringers get sacked.


This message is a reply to:
 Message 696 by PaulK, posted 01-15-2010 5:28 PM PaulK has not yet responded

Nuggin
Member (Idle past 742 days)
Posts: 2962
From: Los Angeles, CA USA
Joined: 08-09-2005


Message 699 of 1273 (543167)
01-15-2010 8:29 PM
Reply to: Message 687 by Smooth Operator
01-15-2010 4:12 PM


Re: funny thing happened on the way to nirvana ...
Creationism is not a mechanism. How exactly is creationism supposed to be a mechanism?

No, magic is the mechanism for creationism. Just like it's apparently the mechanism for ID. Which is appropriate, because even Dembski admits that ID is Creationism.

I'm claiming to have a method of design detection. And when I apply it, you claim that I can't use it because that's "creationism".

No, what I am saying is that your method does not differentiate between things which were designed and things which were not designed.

We've given you a number of examples of things which we all agree are not designed - geodes, snowflakes, ripples on water, etc. The _only_ way you rule them out is by knowing the mechanism for their production.

Yet, you claim that you don't NEED to know the mechanism for your ID theory because you can detect it without knowing the mechanism.

My position is that if you do not know the mechanism, you can not claim design.

Your position is the opposite.

I've asked you for an example APART FROM the one you are claiming to DEMONSTRATE that your claim has validity.

You've now admitted that there are NO EXAMPLES apart from what you are claiming.

That's SPECIAL PLEADING.

It's an invalid argument to come to us and tell us you have a method of design detection which ONLY WORKS on Creationism and therefore you don't have to show how it works.

Change over time caused by what? Random mutations, or intelligent input? Obviously intelligent input.

It's amazing to all of us that you see "intelligent input" everywhere in nature, but can't seem to manage to apply any here.

You can not determine "input" if you don't have a MECHANISM for that input to take place.

If you would grow some balls and just admit that your mechanism is magical Jew Beams shot from the Great Wizard, we could move on to the next part of this discussion.

Until you do, I'm afraid I'm going to have to keep asking you the same question and you're gonna have to keep pretending that you are too stupid to understand it.

Dembski explained this perfectly

What Dembski explained perfectly is that the Fundamentalist Christian God is responsible through the use of MAGIC for Creation AS IS FOUND IN THE BIBLE, __INCLUDING__ ADAM & EVE!!!

You can't keep citing Dembski _AND_ claiming that you don't believe that it's magical Jew Beams.

Those are two different arguments.


This message is a reply to:
 Message 687 by Smooth Operator, posted 01-15-2010 4:12 PM Smooth Operator has responded

Replies to this message:
 Message 704 by Smooth Operator, posted 01-18-2010 2:32 PM Nuggin has responded

  
Larni
Member
Posts: 3745
From: UK
Joined: 09-16-2005


Message 700 of 1273 (543205)
01-16-2010 5:25 AM
Reply to: Message 690 by Smooth Operator
01-15-2010 4:14 PM


Re: Genetic Entropy
Okay, I don't really care why this is. My point is simply that it's happening. It's causing genetic entropy.

Then all you are doing is calling a restricted gene pool (as a result of a small population) 'genetic entropy'.

Like what happens to inbred Southern folks.

This would then (logically) not apply to large populations.

Large population = no 'genetic entropy'.

I'm also a bit confused by your use of the term 'genetic entropy': surely it means more genetic variation?

Why are you using the opposite definition?

Edited by Larni, : Clarity


This message is a reply to:
 Message 690 by Smooth Operator, posted 01-15-2010 4:14 PM Smooth Operator has responded

Replies to this message:
 Message 705 by Smooth Operator, posted 01-18-2010 2:33 PM Larni has responded

  
Smooth Operator
Member (Idle past 1544 days)
Posts: 630
Joined: 07-24-2009


Message 701 of 1273 (543481)
01-18-2010 2:31 PM
Reply to: Message 695 by Mr Jack
01-15-2010 4:58 PM


Re: Please explain E. coli
quote:
Cite an example then, please
This article talks about the accumulation of mutations in RNA chains.

http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1774997/?tool...

This one compares the accumulation of slightly deleterious mutations in small and large mammals.

http://www.pnas.org/content/104/33/13390.abstract

quote:
Why would we need to know the initial population size? I'm talking about in a single gene line - as in those descended from a single individual.
Because the size of the population is one of the factors that determines how many mutations will accumulate. If you do not know the initial size, then you also do not know how many mutations will accumulate.
This message is a reply to:
 Message 695 by Mr Jack, posted 01-15-2010 4:58 PM Mr Jack has responded

Replies to this message:
 Message 706 by Mr Jack, posted 01-18-2010 2:40 PM Smooth Operator has responded

  
Smooth Operator
Member (Idle past 1544 days)
Posts: 630
Joined: 07-24-2009


Message 702 of 1273 (543482)
01-18-2010 2:31 PM
Reply to: Message 696 by PaulK
01-15-2010 5:28 PM


Re: l
quote:
I am afraid that you are incorrect again. You claimed that it had lost all function. I claimed that you did not know that. And as we have seen I was right, although it took an amazingly long series of posts for you to realise it.
That is because it had ONE known function. We will not try to imagine new functions if we are not sure the enzyme has them, now will we? That's unproductive. We know it had ONE function, and it lost that one function; therefore, it lost all functions.

But that's beside the point. Even if it had more functions, which I wouldn't be surprised it did, that still doesn't refute my point that it lost a known function. Which, I repeat, was the point of the experiment. The experiment was to see how many mutations an enzyme can take before it loses its ONE known function.

quote:
Amazingly enough your clumsy phrasing has made you say something that is about right, although probably unintentionally. For your calculation to be correct, the E. coli flagellum (plus any allowed variants) must include all "bidirectional rotary motor-driven propellers". And in that sense it would describe it. However, we are certainly in no position to say that that is true, and in fact I know that it is not true.

You are also correct to say that "bidirectional rotary motor-driven propellers" is a valid specification, because there are other things that fit the definition. Unfortunately this means that all those things must be included in D*. And since your calculation uses specifics of the E. coli flagellum that do not necessarily apply to all "bidirectional rotary motor-driven propellers" you are calculating the probability for a different "specification" - and in fact that "specification" is a fabrication.


You simply do not understand the design inference. Why would I have to include ALL existing "bidirectional rotary motor-driven propellers"? That makes no sense. The reason it doesn't make any sense is because some patterns that might fit that description could very well be under 400 bits of information, making our calculation useless in the first place. Why? Because we would then not be talking about complex specified information, but simply about specified information, which Dembski said chance can generate.

Therefore, you are wrong. We are only supposed to include the complexity of the event E, which in this case is the 50 proteins of the flagellum. There is no reason to calculate other probabilities. Just imagine a "bidirectional rotary motor-driven propeller" that has a complexity of 1,000,000,000 bits. You do understand that it is vastly less probable to form by chance than one consisting of 1,000 bits? So what good would it do to calculate all possible instances of that pattern? Absolutely no good whatsoever.

quote:
I've told you what to do. Calculate the number of sequences that are no more than 20% different. If you can't even work out how to do that then you don't know even basic probability theory. I might be willing to help you a little more, but to do that sensibly you will need to explain how you did the calculation in the first place. Something you were unwilling to do before.
Like I said, I'm waiting for you to tell me how to do the calculation. I have no reason to stall.

quote:
I am afraid that you have completely misunderstood what Dembski is saying here. Dembski is attempting to deal with the issue that there are many more possible targets that evolution might have hit. He does so by estimating the number of concepts at the same level and including it as a correcting factor (this is why it reduces the specified information content).

Clearly adding in extra details - even if they are legitimate - would mean using a higher level concept, and thus this factor would be increased.


Again, no, you have no idea what Dembski is talking about. Yes, he is saying that evolution can hit many targets. The point is to show that the probability is still too small for that to happen.

Like I said, "bidirectional rotary motor-driven propeller" is a 4-concept pattern, and each concept can be one of roughly 10^5 basic concepts. Therefore, the full number of patterns is (10^5)^4 = 10^20. Now how do we use this? Simple. We have to show that this pattern is still too improbable to hit by chance. Therefore, we multiply it by 10^120 and then by the probability of the particular pattern, and check whether this number is less than 1/2. Which is true, because Dembski says it here:

quote:
We may therefore think of the specificational resources as allowing as many as N = 10^20 possible targets for the chance formation of the
bacterial flagellum, where the probability of hitting each target is not more than p. Factoring in these N specificational resources then amounts to checking whether the probability of hitting any of these targets

quote:
If, therefore, this number is small (certainly less than 1/2 and preferably close to zero), it follows that it is less likely than not for an event E that conforms to the pattern T to have happened according to the chance hypothesis H. Simply put, if H was operative in the production of some event in S's context of inquiry, something other than E should have happened even with all the replicational and specificational resources relevant to E's occurrence being factored in.
Therefore, if the number we get is smaller than 1/2, it is less likely than not that chance produced the event in question. Which means we infer design.
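To make the arithmetic above concrete, here is a minimal sketch of that decision rule in Python, working in log10 space so the tiny probabilities don't underflow. The 10^120 replicational and 10^20 specificational figures are the round numbers quoted above; the values of p fed in are only illustrative (Dembski's own flagellum estimate is commonly quoted as on the order of 10^-1170).

```python
import math

# Round figures quoted in the discussion above (assumptions, not derived here):
LOG10_REPLICATIONAL = 120    # 10^120 upper bound on events in the universe
LOG10_SPECIFICATIONAL = 20   # N = 10^20 possible target patterns

def chance_is_rejected(log10_p):
    """Return True when M * N * p < 1/2, i.e. when
    log10(M) + log10(N) + log10(p) < log10(1/2)."""
    total = LOG10_REPLICATIONAL + LOG10_SPECIFICATIONAL + log10_p
    return total < math.log10(0.5)

# An event as improbable as 10^-1170 stays far below 1/2 even after the
# resources are factored in; a 10^-100 event does not.
print(chance_is_rejected(-1170))  # True
print(chance_is_rejected(-100))   # False
```

The log-space form is just the same multiplication written as a sum, so the comparison against 1/2 is exact even for exponents this extreme.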

quote:
Again, you are making a mistake. My point was that the growth mechanisms have a high probability of producing a flagellum. Of course we can say that bacterial reproduction includes producing the physical growth mechanisms with high probability. But again, all this means is that the design proponent has to do is to go back to the origin, which is where you say that the actual design implementation occurred.
Impossible and useless. We cannot go back in time. That's obvious. And it's also useless, because natural causes can only transmit CSI, not add more of it. So whatever CSI we see in nature, we know that the same amount, or more, was input in the first place. Therefore, it's fine to calculate it right there.

quote:
Of course the point rests not on what MIGHT be the case in some hypothetical environment but on what actually is the case in the environments which actually exist. And in that environment, the allele for blue eyes is classified as neutral and the sickle cell allele is classed as beneficial overall (in malarial areas), but is very strongly deleterious in individuals who are homozygous for the allele. And that is unusual.
No, it's not unusual. That's how natural selection works! It is going to select anything, even mutations that degrade biological functions, if they increase reproductive fitness. That's just how things are.

quote:
In other words it doesn't matter if what you say is true or not ? If you really don't care about then say so. You claimed that the sickle-cell allele was spreading. You claimed that it had been fixed in the population. Neither is true.
I said that it has been spread and is fixed in a certain population. Just like any other mutation. There is no single mutation that is spread through all life on Earth. All mutations are held at certain frequencies.

quote:
That isn't quite true. Your argument is intended to establish what you think will happen to large populations over long periods of time but it doesn't take either point into account. If the other factors are hereditary then they may be included with genes. If they are not then - on the scales we are thinking of - they may be largely ignored unless there is good reason to think that they strongly correlate with the occurrence of a particular allele.

You see, the larger the population (in the statistical sense as well as the biological) the more the chance effects of noise tend to average out. This is why genetic drift is weaker in larger populations. So it is not enough to talk about individual cases or noise. What you need to show is that there is a systematic bias - and that that bias undermines selection to the point that mutational meltdown is inevitable.


No, they will not. Why should they be correlated with genetic traits to get passed on?

I already gave an example of epigenetic interference. Imagine two individuals, A and B. Individual A has a beneficial mutation, and individual B has a deleterious one. Judging only by genetic traits, individual A would get selected by natural selection. But it just so happens that individual B got his DNA methylated, and this changed his phenotype and made it fitter than individual A. Now, judging by natural selection, individual B gets selected over individual A. Even though individual B has a deleterious mutation and individual A has a beneficial one.

Therefore, we see here that a non-genetic trait interfered with positive selection, and a genetically less fit individual got selected for by natural selection. This happens all the time in all populations, on average, with all individuals. Sometimes more, sometimes less, of course.
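The scenario above can be sketched as a toy simulation: selection acts on phenotype, and if a non-genetic effect (modeled here simply as random noise added to phenotype) is strong relative to the genetic difference, the carrier of the deleterious allele wins a substantial fraction of pairwise contests. All the parameter values are illustrative assumptions, not taken from the post.

```python
import random

random.seed(42)  # fixed seed so the run is repeatable

def selected_worse(genetic_gap=0.1, noise_sd=0.5, trials=10000):
    """Fraction of pairwise contests won by the genetically LESS fit
    individual, when phenotype = genetic fitness + non-genetic noise."""
    wins = 0
    for _ in range(trials):
        pheno_a = 1.0 + genetic_gap + random.gauss(0, noise_sd)  # beneficial allele
        pheno_b = 1.0 + random.gauss(0, noise_sd)                # deleterious allele
        wins += pheno_b > pheno_a
    return wins / trials

# With noise comparable to the genetic gap, B beats A a large minority
# of the time; with no noise, A always wins.
print(selected_worse())
print(selected_worse(noise_sd=0.0))
```

The design choice worth noting: whether this noise matters in the long run depends on how it averages out across many individuals and generations, which is exactly the point of contention in the surrounding exchange.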

quote:
In other words you claim that an increase in fitness is caused by an inevitable decline in fitness. That doesn't make a lot of sense. I say that a sustained increase in fitness shows that the decline is not inevitable, which makes a lot more sense.
Yes, because reproductive fitness is not perfectly correlated with biological functions. If an animal loses its sight, and it always lives in dark areas, it will probably gain reproductive fitness because it will not waste its energy on eyes.

quote:
It may be an important question, but it is not relevant to either the genetic entropy discussion or the discussion of Dembski's method. I don't intend to add to the problems of this thread by adding a third subject to the discussion. If you want to talk about it then I suggest that you start a new thread.
I'm sorry, but no. This is the ultimate question and the ultimate point of Sanford's book. The point is that if the genome keeps deteriorating, and we saw from Spiegelman's experiment that it does just that, then it logically could not have evolved in the first place. If it kept getting shorter, then the first RNA chains didn't evolve into people over a period of 3.6 billion years. And that shows that evolution doesn't work. And that's what I've been arguing from the start.
This message is a reply to:
 Message 696 by PaulK, posted 01-15-2010 5:28 PM PaulK has responded

Replies to this message:
 Message 708 by PaulK, posted 01-18-2010 3:36 PM Smooth Operator has responded

  
Smooth Operator


Message 703 of 1273 (543483)
01-18-2010 2:32 PM
Reply to: Message 697 by Wounded King
01-15-2010 5:45 PM


Re: Nonsensical creationist notions
quote:
You missed the entire point. You claim that the specific binding between streptomycin and the ribosome is due to CSI in the ribosomal sequences. So why is specific binding between antibodies and the epitopes they bind not due to CSI in the sequences which code for that epitope? Did you quickly evaluate the bits required for all possible epitopes and determine that they were all under 400 bits?
Because it's the ribosome, which consists of more than 400 bits of information, that got its structure degraded. And like I said, function is a subset of structure. Deforming the structure might also deform the function.

quote:
The 'free' information, as I have pointed out more than once now, is the information which has to have suddenly magically been added to the coding sequence for the ribosome when the full streptomycin synthesis pathway had developed and it suddenly had a high affinity binding partner.
And when did that happen?

quote:
You say you don't claim mutations are necessarily a loss of information, but you persist with your illogical claim that it is a loss of information in the streptomycin resistance example. Can we just agree that its not a net change in information? Would you go that far?
Defects are a loss of information. Certain other changes are not. But defects are always a loss of information. And yes, I have always said that not all mutations decrease the amount of information.

quote:
To claim that any change in structure is a 'degradation' is bringing front and centre your assumption that the initial structure is some sort of optimal or ideal structure. It is totally inconsequential to the effect of any particular mutation that if you keep on changing the structure long enough at random its function will disappear.
You do understand that if you mutate anything for long enough, it will lose its function and ultimately its structure?

You do know that if you hit a bike with a hammer for long enough, first you will not be able to ride it anymore, and after a while it won't even be a bike, but a pile of metal. The first loss is the complete loss of function, and the second is the complete loss of structure. But the first loss occurred through a partial loss of structure.

quote:
Nice to see your basic understanding of entropy is as faulty as your concept of genetic entropy. You have, as usual, conflated any specific change with the average tendency of all random changes.

The fact that in general non-neutral mutations will tend to have a deleterious effect on a proteins function doesn't mean that beneficial mutations don't exist. You do understand that entropy is a statistical phenomenon?


That is because it's true. Please add random changes to this statement: "SOMEDAY MAYBE YOU'LL LEARN SOMETHING!"

Please do add random changes to it, and tell me: over time, will the changes, on average, lead to a new meaningful statement, or will they lead to a meaningless one?

And I never, ever in my entire life said that beneficial mutations do not exist. I never said that. I simply said that they can also lead to a loss of information, and that they occur far less often than other mutations.
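The challenge above can be run literally. Here is a minimal sketch that applies random single-character substitutions to the sentence and prints the result after increasing numbers of hits. It is only an illustration of unguided change on a text string; genomes are not English sentences, and there is no selection step in this toy.

```python
import random
import string

random.seed(1)  # fixed seed so the run is repeatable

def mutate(text, n_mutations):
    """Apply n random single-character substitutions to text."""
    chars = list(text)
    for _ in range(n_mutations):
        i = random.randrange(len(chars))
        chars[i] = random.choice(string.ascii_uppercase + " ")
    return "".join(chars)

sentence = "SOMEDAY MAYBE YOU'LL LEARN SOMETHING!"
for n in (0, 5, 20, 100):
    print(n, mutate(sentence, n))
```

Note that the string's length never changes under pure substitution; insertions and deletions would need separate modeling.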


This message is a reply to:
 Message 697 by Wounded King, posted 01-15-2010 5:45 PM Wounded King has not yet responded

  
Smooth Operator


Message 704 of 1273 (543484)
01-18-2010 2:32 PM
Reply to: Message 699 by Nuggin
01-15-2010 8:29 PM


Re: funny thing happened on the way to nirvana ...
quote:
No, magic is the mechanism for creationism. Just like it's apparently the mechanism for ID. Which is appropriate, because even Dembski admits that ID is Creationism.
He never said that. He actually always said it's not creationism. And not only that, but that's beside the point.

quote:
No, what I am saying is that your method does not differentiate between things which were designed and things which were not designed.

We've given you a number of examples of things which we all agree are not designed - geodes, snowflakes, ripples on water, etc. The _only_ way you rule them out is by knowing the mechanism for their production.

Yet, you claim that you don't NEED to know the mechanism for your ID theory because you can detect it without knowing the mechanism.

My position is that if you do not know the mechanism, you can not claim design.

You position is the opposite.

I've asked you for an example APART FROM the one you are claiming to DEMONSTRATE that your claim has validity.

You've now admitted that there are NO EXAMPLES apart from what you are claiming.

That's SPECIAL PLEADING.

It's an invalid argument to come to us and tell us you have a method of design detection which ONLY WORKS on Creationism and therefore you don't have to show how it works.


Nope. First of all, those things are not designed because they do not have the marks of design. Second, I'm not using special pleading, because I never claimed that my method only works on creationism. Whatever that is supposed to mean.

quote:
It's amazing to all of us that you see "intelligent input" everywhere in nature, but can't seem to manage to apply any here.

You can not determine "input" if you don't have a MECHANISM for that input to take place.


I simply explained to you that you committed a logical fallacy. That is all. You cannot show me an instance of intelligent design and then conflate it with an undirected process.
This message is a reply to:
 Message 699 by Nuggin, posted 01-15-2010 8:29 PM Nuggin has responded

Replies to this message:
 Message 709 by Nuggin, posted 01-18-2010 7:32 PM Smooth Operator has responded

  
Smooth Operator


Message 705 of 1273 (543485)
01-18-2010 2:33 PM
Reply to: Message 700 by Larni
01-16-2010 5:25 AM


Re: Genetic Entropy
quote:
Then all you are doing is calling a restricted gene pool (as a result of a small population) 'genetic entropy'.

Like what happens to inbred Southern folks.

This would then (logically) not apply to large population.

Large population= no 'genetic entropy'.


No, what I'm saying is that mutations accumulate in all populations, in some more and in some less. In smaller populations they accumulate more rapidly, and in large populations they accumulate more slowly. Which means:

Small population = faster increase of genetic entropy.
Large population = slower increase of genetic entropy.
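The population-size claim above is the standard drift result, and it can be sketched with a minimal Wright-Fisher-style simulation: a single copy of a slightly deleterious allele (selection coefficient s) fixes by drift far more often in a small population than in a large one. The parameter values are illustrative assumptions, not figures from the thread.

```python
import random

random.seed(0)  # fixed seed so the run is repeatable

def fixation_rate(pop_size, s=0.01, trials=100):
    """Fraction of trials in which one deleterious mutant copy drifts to
    fixation in a haploid Wright-Fisher population of the given size."""
    fixed = 0
    for _ in range(trials):
        count = 1  # one mutant copy to start
        while 0 < count < pop_size:
            # Selection lowers the mutant's effective frequency...
            p = count * (1 - s) / (count * (1 - s) + (pop_size - count))
            # ...and binomial resampling of the next generation adds drift.
            count = sum(random.random() < p for _ in range(pop_size))
        fixed += count == pop_size
    return fixed / trials

print(fixation_rate(10))   # small population: drift often overrides selection
print(fixation_rate(500))  # large population: selection usually purges the allele
```

The contrast, not the exact fractions, is the point: in the small population the deleterious allele fixes at close to the neutral rate of 1/N, while in the large one it is almost always eliminated.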

quote:
I'm also a bit confused by your use of the term 'genetic entropy': surely it means more genetic variation?

Why are you using the opposite definition?


No, it means the accumulation of mutations in a population over time, which leads to a reduction in genetic information because natural selection is not able to remove them.
This message is a reply to:
 Message 700 by Larni, posted 01-16-2010 5:25 AM Larni has responded

Replies to this message:
 Message 707 by Larni, posted 01-18-2010 3:35 PM Smooth Operator has responded

  

Copyright 2001-2014 by EvC Forum, All Rights Reserved