Author Topic:   Evolving New Information
Percy
Member
Posts: 22479
From: New Hampshire
Joined: 12-23-2000
Member Rating: 4.7


Message 283 of 458 (522601)
09-04-2009 8:39 AM
Reply to: Message 280 by LucyTheApe
09-04-2009 5:37 AM


Re: What is information?
LucyTheApe writes:
Ok cavediver, if you want I will quantify the increase in information.
Where did you "quantify the increase in information?"
Percy claims that given 4^6 possibilities or 12 bits of information that only 2 of these possibilities are legitimate until a mutation occurs. Then another possibility is OK.
I won't go into detail over your confusion about the number of bits of information beyond saying that DNA contains a great deal of redundancy. This means that it uses many more bits than necessary to represent and communicate information. The three alleles (not two) of my example require only 1.585 bits, not 12. The extra 10.415 bits are redundant and unnecessary as far as representing information, but they help a great deal with error tolerance.
More importantly, I did not say that only 3 (again, not 2) alleles were "legitimate." I said that only 3 existed within the population. That means that if you checked that gene in every single individual in the population, you'd find only the 3 alleles I listed. The point was that only 3 alleles existed, not that only 3 were legitimate. Other alleles are perfectly possible and legitimate, but no other alleles for that gene happened to exist in the population.
Thus when a mutation produced a 4th allele the amount of information that could be communicated by that gene increased from 1.585 bits to 2 bits.
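If you want to check the arithmetic yourself, here's a minimal Java sketch (the class and method names are just for illustration) that computes the information content of a message set, assuming as in my example that every allele is equally likely, so the information per message is simply log2 of the message set size:
public class AlleleInformation {
  // Information (in bits) of one message drawn from a set of
  // `size` equally likely messages: log2(size).
  static double bits(int size) {
    return Math.log(size) / Math.log(2);
  }

  public static void main(String[] args) {
    System.out.printf("3 alleles: %.3f bits%n", bits(3)); // prints 1.585
    System.out.printf("4 alleles: %.3f bits%n", bits(4)); // prints 2.000
  }
}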
--Percy

This message is a reply to:
 Message 280 by LucyTheApe, posted 09-04-2009 5:37 AM LucyTheApe has replied

Replies to this message:
 Message 291 by LucyTheApe, posted 09-04-2009 5:51 PM Percy has replied

  
Percy
Member
Posts: 22479
From: New Hampshire
Joined: 12-23-2000
Member Rating: 4.7


Message 307 of 458 (522922)
09-06-2009 12:03 PM
Reply to: Message 291 by LucyTheApe
09-04-2009 5:51 PM


Re: What is information?
LucyTheApe writes:
Percy writes:
Where did you "quantify the increase in information?"
As I said to cavediver, I need to know which interpreter he wants to use and on which machine he wants to assemble the instructions.
The physical implementation is an irrelevant detail. Here's your example from Message 232; I've added some formatting for readability:
Lucy the Ape in Message 232 writes:
I'll show you what I mean by adding information. I'll take my example of a piece of code:
void swap(object a, object b) {
  temp = new object();
  temp = b;
  b = a;
  a = temp;
}
Now I'll add some information:
void swap(object a, object b, object c) {
  temp = new object();
  temp = c;
  c = b;
  b = a;
  a = temp;
}
We're wondering how you quantify how much information you added. In my genetic example I quantified how much information was added by a single mutation, and we're asking you to quantify how much information you added by modifying your code.
I won't go into detail over your confusion about the number of bits of information beyond saying that DNA contains a great deal of redundancy. This means that it uses many more bits than necessary to represent and communicate information.
Please Percy, detail my ignorance.
I did detail your ignorance. Again, DNA uses many more bits than necessary to represent and communicate information. The three alleles of my example require only 1.585 bits, not 12. The extra 10.415 bits are redundant and unnecessary as far as representing information. If you'd like clarification on any specific details, just ask. You did ask about one thing; maybe addressing it will help you understand the redundancy in DNA:
You say that the DNA code contains a lot of redundancy, and of course, you can demonstrate this.
Redundancy occurs whenever information is encoded using more bits than necessary. The 3 alleles of my example can be encoded in just 1.585 bits. The additional 10.415 bits are redundant.
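Here's the same arithmetic as a small Java sketch, assuming my simplified model of a 6-base allele (2 bits per base, since there are 4 possible bases) and 3 equally likely alleles:
public class DnaRedundancy {
  public static void main(String[] args) {
    double bitsUsed = 6 * 2.0;                              // 6 bases at 2 bits each = 12 bits
    double bitsOfInformation = Math.log(3) / Math.log(2);   // log2(3) = 1.585 bits for 3 alleles
    double redundantBits = bitsUsed - bitsOfInformation;    // 10.415 bits

    System.out.printf("bits used: %.3f%n", bitsUsed);
    System.out.printf("bits of information: %.3f%n", bitsOfInformation);
    System.out.printf("redundant bits: %.3f%n", redundantBits);
  }
}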
The amount of information that we will get by quantifying my code will seem excessive also, but it's not redundant.
There's a huge amount of redundancy in the syntax of all computer languages. For example, the word "object" has more letters than necessary. We could get by perfectly well with just "obj".
Even just a short snippet of computer code is going to be incredibly complex to analyze from an information-theoretic standpoint. That's why Cavediver asked you to quantify the increase in information: because it was apparent you weren't aware of how difficult this would be, and because in attempting to quantify it you would gradually come to realize this. It would be much simpler if you could keep your focus on examples simple enough to discuss here, such as my very simple genetic example.
Thus when a mutation produced a 4th allele the amount of information that could be communicated by that gene increased from 1.585 bits to 2 bits.
So now you're talking about communication of information, not new information.
All through this thread we've been talking about both communication of information *and* creation of new information. Reproduction requires communication of genetic information from parents to offspring. When noise occurs in that communication channel, new information is introduced in the offspring. In biology this noise-induced information is called a mutation.
The cell itself knows what it needs to do. If the eye colour machine needs to make an eye colour, it takes in its parameters from the DNA. If the parameters are not of the same "object" that it requires the machine will break down, just like any other machine.
In other words the language has to be in place. A mixture of symbols from the alphabet is not good enough.
The machinery you think is lacking is already in place. A mutated allele will still produce a protein, and that protein will still circulate throughout the organism and produce an effect. The degree of desirability of that effect governs the organism's reproductive success and whether that mutation propagates within the population.
--Percy
Edited by Percy, : Fix typo.
Edited by Percy, : Improve phrasing in next to last paragraph.

This message is a reply to:
 Message 291 by LucyTheApe, posted 09-04-2009 5:51 PM LucyTheApe has replied

Replies to this message:
 Message 339 by LucyTheApe, posted 09-11-2009 8:21 AM Percy has replied

  
Percy
Member
Posts: 22479
From: New Hampshire
Joined: 12-23-2000
Member Rating: 4.7


Message 324 of 458 (523246)
09-09-2009 7:02 AM
Reply to: Message 312 by traderdrew
09-08-2009 5:22 PM


Re: and yet you go there
traderdrew writes:
Can't you see that it is not only "god of the gaps", it is also "evolution of the gaps" or an "argument from ignorance of natural causes".
You are correct that we are following a "natural causes of the gaps" approach. We believe that what we don't know will eventually be explained by natural causes. This is because throughout millennia of experience, every scientific question that has been resolved has resolved to natural causes, and not a single one to God.
The ID position is that a certain scientific question, namely the cause of the diversity of life on Earth, has been answered, and that the cause is God, not evolution. They argue that they've eliminated all possible natural causes, and therefore only God remains as a possibility.
But they haven't eliminated all possible natural causes, of course. The argument that our current lack of knowledge about specific events in evolutionary history is evidence that those events would be naturalistically impossible is the IDists' contribution to the "God of the gaps" fallacy, and as already pointed out, not a single scientific question has ever resolved to God.
Before one can begin claiming God as one of the possible answers to scientific questions, one first has to have at least one case where actual evidence has resolved some scientific question to God. Only then would proposing God as the answer to unresolved scientific questions make any sense.
--Percy

This message is a reply to:
 Message 312 by traderdrew, posted 09-08-2009 5:22 PM traderdrew has replied

Replies to this message:
 Message 325 by Peepul, posted 09-09-2009 9:42 AM Percy has seen this message but not replied
 Message 331 by traderdrew, posted 09-09-2009 11:54 AM Percy has seen this message but not replied

  
Percy
Member
Posts: 22479
From: New Hampshire
Joined: 12-23-2000
Member Rating: 4.7


Message 340 of 458 (523584)
09-11-2009 10:30 AM
Reply to: Message 339 by LucyTheApe
09-11-2009 8:21 AM


Re: What is information?
LucyTheApe writes:
But Percy thats what is information is, implementing a thought, or idea... a message into code.
In everyday usage information can mean representations of facts or thoughts or ideas. When used in this way information is understood to have meaning. But Shannon information theory provides a very specific mathematical definition of information that excludes meaning. People can associate meaning with information, but information itself is independent of meaning.
One very simple example of this is the famous message of Paul Revere's ride, "One if by land, two if by sea." The meaning of "land" is not inherent in one lantern, nor is the meaning of "sea" inherent in two lanterns. The code could have been, "One if by sea, two if by land," and it would have worked just as well. In information theory, meaning and information are two different things, and meaning is not part of information theory. Specifically, thoughts and ideas are not included in information theory.
The shell consists of 832 bits the first bit of code consisted of another 1752 bits and the second bit of code added another 376 bits.
You never mentioned a "shell" before, so I don't know what you're referring to, and I can't figure out how you calculated the number of bits of information. Could you describe this for me?
Percy writes:
Redundancy occurs whenever information is encoded using more bits than necessary. The 3 alleles of my example can be encoded in just 1.585 bits. The additional 10.415 bits are redundant.
It depends how we define redundancy. I obviously have a different interpretation than you do.
I'm using the definition of redundancy from information theory. This is the first sentence from the Wikipedia article on "Redundancy (information theory)":
Wikipedia writes:
Redundancy in information theory is the number of bits used to transmit a message minus the number of bits of actual information in the message.
Since a message from a message set of size 3 can be represented in 1.585 bits, and since the DNA representation actually uses 12 bits, the extra 10.415 bits are redundant.
If we want to keep things simple, like I do, I suggest we use models that can be simulated in the lab, like I have, rather than using unsubstantiated, untestable theoretic models that are contrary to observation.
The example I provided is a simplified model of what we observe happening in nature. My simple gene of 3 alleles experienced a single point mutation, something that is observed to happen all the time. This single point mutation caused the number of alleles of this gene to grow from 3 to 4 in our population. This represents a change in information from 1.585 bits to 2 bits, an increase of 0.415 bits.
--Percy

This message is a reply to:
 Message 339 by LucyTheApe, posted 09-11-2009 8:21 AM LucyTheApe has replied

Replies to this message:
 Message 341 by greyseal, posted 09-11-2009 11:06 AM Percy has seen this message but not replied
 Message 365 by LucyTheApe, posted 09-18-2009 9:30 PM Percy has replied

  
Percy
Member
Posts: 22479
From: New Hampshire
Joined: 12-23-2000
Member Rating: 4.7


Message 348 of 458 (523620)
09-11-2009 1:45 PM
Reply to: Message 345 by traderdrew
09-11-2009 11:35 AM


Re: and yet you go there (yes I do)
traderdrew writes:
I thought I couldn't post bare links or links that I don't articulate thoughts from (I'm not sure where the moderators draw the lines)...
If you're making a point you've garnered from some website, make the point in your own words and provide a link as a supporting reference. It's best to provide a link to a webpage rather than to an entire website, and if the webpage is long it helps to describe where on the webpage the relevant information appears.
In other circumstances, such as answering questions like, "What was that link again?" bare links are fine.
...but if you want to find the source, just cut and paste the quote into a google search.
As you discovered with RAZD, this is not a reliable way to provide a link. The order of results in Google changes frequently. It's best to provide actual links rather than procedures to find links.
Need I say more? Now, how can we argue with these two scientists???
So if we find two scientists who disagree with Hoyle and Gingerich, then what? Do we conduct a poll and find the percentage of scientists who support each side? If we did then intelligent design would lose by about a 100-to-1 margin.
Or acknowledging that such approaches are the very embodiment of the fallacy of appeal to authority, we could actually discuss the details of the issues ourselves.
--Percy

This message is a reply to:
 Message 345 by traderdrew, posted 09-11-2009 11:35 AM traderdrew has not replied

  
Percy
Member
Posts: 22479
From: New Hampshire
Joined: 12-23-2000
Member Rating: 4.7


Message 356 of 458 (524708)
09-18-2009 7:45 AM
Reply to: Message 352 by LucyTheApe
09-18-2009 5:19 AM


Re: Genetics of melanism
You missed Message 340.
--Percy

This message is a reply to:
 Message 352 by LucyTheApe, posted 09-18-2009 5:19 AM LucyTheApe has not replied

  
Percy
Member
Posts: 22479
From: New Hampshire
Joined: 12-23-2000
Member Rating: 4.7


Message 367 of 458 (524822)
09-19-2009 8:42 AM
Reply to: Message 365 by LucyTheApe
09-18-2009 9:30 PM


Re: What is information?
LucyTheApe writes:
All of Shannons experiments were the result of a deliberate, intelligent message. Otherwise he couldn't do his maths.
Shannon's experiments? Shannon was a mathematician and did not conduct experiments to develop information theory. Obviously Shannon used his intelligence to do his work; we all do.
Yes he dealt with the efficiency of the transfer of information without considering the meaning and of course he realised that it requires the positive intelligent use of energy to ensure that the transmission was received as expected.
This erroneous conceptual picture you have is at the core of your misunderstanding of information theory. In everyday terms information includes meaning, but information theory is a very specialized way of thinking about information. Information theory is concerned solely with the problem of transmitting sequences of bits from one point to another. In information theory those bits have no meaning. Measures of information are based upon bits alone; meaning plays no part.
All transmissions of electromagnetic radiation are in effect just streams of bits, whether they come from the sun or from the local TV station. Humans attach meanings to those bits after they arrive, but the bits themselves have no inherent meaning.
If you want to know how I calculated the information content then just compile the following files...
You think the amount of information in a program is equal to the number of bits output by its compiler? So when the new, more efficient version of the compiler is released and produces a more compact compiled file, your program now contains less information, even though you made no changes? Or you move the compiled file from a Unix to a DOS system and suddenly the file size is different and now your program has a different amount of information, even though you didn't change your program? Or you run the compiled file through gzip to reduce its size, and now the program has less information? In fact, gzip can only reduce file size when redundancy is present. Care to rethink how you should measure the amount of information in your program?
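If you doubt that last point about gzip, here's a quick Java sketch using the standard java.util.zip.Deflater (which implements the same DEFLATE algorithm gzip is built on); the class and helper names are just for illustration. It compresses a highly redundant byte array and a random one, and only the redundant one shrinks:
import java.util.Random;
import java.util.zip.Deflater;

public class CompressionNeedsRedundancy {
  // Returns the number of compressed bytes produced for the given input.
  static int compressedSize(byte[] data) {
    Deflater deflater = new Deflater();
    deflater.setInput(data);
    deflater.finish();
    byte[] buffer = new byte[data.length * 2 + 64]; // large enough for the worst case here
    int size = deflater.deflate(buffer);
    deflater.end();
    return size;
  }

  public static void main(String[] args) {
    byte[] redundant = new byte[10000];   // all zeros: almost pure redundancy
    byte[] random = new byte[10000];
    new Random(42).nextBytes(random);     // pseudo-random bytes: almost no redundancy

    System.out.println("redundant input compresses to " + compressedSize(redundant) + " bytes");
    System.out.println("random input compresses to    " + compressedSize(random) + " bytes");
  }
}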
As both Cavediver and I suspected, you have no idea how to calculate how much information is in your program, and not even a clue of how complicated a task you have set yourself.
Any workable example must be simple enough that we can accurately quantify how much information is involved. To do this you need to know the size of your message set. My simple genome example illustrated how a message set of size 3 (1.585 bits of information) can grow to a message set of size 4 (2 bits of information) through a simple point mutation. I think you should focus on this simple example. Here's the message set with just three alleles before the mutation:
  • GGAAGC (green eyes)
  • GGAAGA (blue eyes)
  • GGCACG (yellow eyes)
And here's the message set with four alleles after the mutation produces a new allele:
  • GGAAGC (green eyes)
  • GGAAGA (blue eyes)
  • GGCACG (yellow eyes)
  • GGCAAG (brown eyes)
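And here's a minimal Java sketch of that whole process under the same simplifying assumptions (equally likely alleles, information per message = log2 of the message set size): a single point mutation changes the C at position 5 of GGCACG to an A, the new allele GGCAAG joins the population's message set, and the information per message grows from 1.585 to 2 bits:
import java.util.LinkedHashSet;
import java.util.Set;

public class PointMutation {
  // Information (in bits) per message for `size` equally likely messages.
  static double bits(int size) {
    return Math.log(size) / Math.log(2);
  }

  public static void main(String[] args) {
    Set<String> alleles = new LinkedHashSet<>();
    alleles.add("GGAAGC"); // green eyes
    alleles.add("GGAAGA"); // blue eyes
    alleles.add("GGCACG"); // yellow eyes
    System.out.printf("before: %d alleles, %.3f bits%n", alleles.size(), bits(alleles.size()));

    // Point mutation: base 5 of GGCACG changes from C to A, giving GGCAAG (brown eyes).
    String parent = "GGCACG";
    String mutant = parent.substring(0, 4) + "A" + parent.substring(5);
    alleles.add(mutant);
    System.out.printf("after:  %d alleles, %.3f bits%n", alleles.size(), bits(alleles.size()));
  }
}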
I'm using the definition of redundancy from information theory. This is the first sentence from the Wikipedia article on "Redundancy (information theory)":
your reference should actually read;
quote:
Redundancy in information theory is the number of bits used to transmit a message minus the number of bits of actual information in the message. Informally, it is the amount of wasted "space" used to transmit certain data. Data compression is a way to reduce or eliminate unwanted redundancy, while checksums are a way of adding desired redundancy for purposes of error detection when communicating over a noisy channel of limited capacity.
which is exactly my interpretation. When we are talking about a cell that has to pass on its information into a different time, it has no memory, the term redundancy changes its normal meaning.
As PaulK has already noted, you've misinterpreted what Wikipedia is saying. The more complete excerpt does not change the meaning at all; it just repeats the definition from the first sentence in different terms, and it adds the example of checksums. Redundancy is the number of unnecessary bits used to transmit or store information. You said in Message 339, "It depends how we define redundancy. I obviously have a different interpretation than you do." But if you agree with the Wikipedia article then we have the identical interpretation. Since you apparently still disagree with me, it's more likely that you misinterpreted Wikipedia.
Since a message from a message set of size 3 can be represented in 1.585 bits, and since the DNA representation actually uses 12 bits, the extra 10.415 bits are redundant.
I disagree, the 12 choices are determined by the environment or "situation" the cell finds itself. It's not redundancy but rather the inbuilt ability to adapt.
You still have no idea what redundancy is. As the Wikipedia article states, redundancy is the number of bits used to transmit a message minus the actual number of bits of information in the message. Therefore, since a message from a message set of size 3 can be represented in 1.585 bits, and since the DNA representation actually uses 12 bits, the extra 10.415 bits are redundant. You can't argue with this; it's just a mathematical fact.
Now you need to use my model just as I have asked greyseal to do. Smash it to produce new information. If you are successful then I will reconsider my position.
If by "my model" you're referring to your Java code examples, since you are unable to quantify the amount of information in them, I think it's time for you begin addressing the simple example I provided, or to provide one of your own where the amount of information can be quantified.
--Percy
Edited by Percy, : Minor clarification.

This message is a reply to:
 Message 365 by LucyTheApe, posted 09-18-2009 9:30 PM LucyTheApe has replied

Replies to this message:
 Message 398 by LucyTheApe, posted 10-03-2009 12:01 PM Percy has replied

  
Percy
Member
Posts: 22479
From: New Hampshire
Joined: 12-23-2000
Member Rating: 4.7


Message 378 of 458 (526855)
09-29-2009 12:29 PM
Reply to: Message 377 by Calypsis4
09-29-2009 11:49 AM


Re: Irreducible complexity - huh! What is it good for?
Calypsis4 writes:
Interesting debate. But even if one is to grant that evolution is true, what is the origin of genetic information? That has never been answered.
Are you asking about the origin of *new* genetic information? If so, then that's what this thread is about. Why don't you read through the last couple of pages and see if any of those posts provide a good enough answer for you?
Or are you asking about the origin of the *first* genetic information? If so, then this thread is not about that, but briefly, we only have speculation at this point.
--Percy

This message is a reply to:
 Message 377 by Calypsis4, posted 09-29-2009 11:49 AM Calypsis4 has replied

Replies to this message:
 Message 379 by Calypsis4, posted 09-29-2009 12:53 PM Percy has seen this message but not replied

  
Percy
Member
Posts: 22479
From: New Hampshire
Joined: 12-23-2000
Member Rating: 4.7


Message 388 of 458 (526910)
09-29-2009 3:33 PM
Reply to: Message 385 by Calypsis4
09-29-2009 3:09 PM


Re: Gaps
Calypsis4 writes:
Keep in mind that though a Lamborghini is not alive nor made of chemicals it must be understood that (if evolution is true) that there was no life before pre-biotic times and all things were non-living inanimate chemical/minerals.
As I said earlier when I asked you if you were asking about the origin of the *first* genetic information, this thread is not about that.
The topic of this thread concerns how mutation produces new information, see the introductory post, Message 1.
--Percy
Edited by Percy, : Grammar.

This message is a reply to:
 Message 385 by Calypsis4, posted 09-29-2009 3:09 PM Calypsis4 has replied

Replies to this message:
 Message 390 by Calypsis4, posted 09-29-2009 3:41 PM Percy has replied

  
Percy
Member
Posts: 22479
From: New Hampshire
Joined: 12-23-2000
Member Rating: 4.7


Message 391 of 458 (526914)
09-29-2009 3:56 PM
Reply to: Message 390 by Calypsis4
09-29-2009 3:41 PM


Re: Gaps
You might try perusing the threads over at the Origin of Life forum. Some possible candidates:
You could also propose a new thread over at Proposed New Topics.
--Percy

This message is a reply to:
 Message 390 by Calypsis4, posted 09-29-2009 3:41 PM Calypsis4 has not replied

  
Percy
Member
Posts: 22479
From: New Hampshire
Joined: 12-23-2000
Member Rating: 4.7


Message 399 of 458 (527974)
10-03-2009 4:41 PM
Reply to: Message 398 by LucyTheApe
10-03-2009 12:01 PM


Re: What is information?
LucyTheApe writes:
Percy writes:
Shannon's experiments? Shannon was a mathematician and did not conduct experiments to develop information theory. Obviously Shannon used his intelligence to do his work; we all do.
Here's one
I said that Shannon didn't conduct experiments to develop information theory, not that he never conducted any experiments ever. Information theory is a mathematical, not an experimental, science, but it does have real world applications.
For example; you don't know whether an adenine paired with a guanine is just a bit. Or whether an adenine paired with a guanine is other than a bit when next to a cytosine and a thymine.
That is precisely the point, that information theory is not concerned with meaning. If you're talking about meaning then you're not talking about information theory.
You think the amount of information in a program is equal to the number of bits output by its compiler?
No, that's how you're defining information, not me.
No, that's how you defined information. I inquired how you were calculating the amount of information in your program, and you replied like this:
LucyTheApe in Message 365 writes:
If you want to know how I calculated the information content then just compile the following files...
So how did you calculate the amount of information in your program that you provided in Message 339:
LucyTheApe in Message 339 writes:
The shell consists of 832 bits the first bit of code consisted of another 1752 bits and the second bit of code added another 376 bits.
I presume you didn't just make them up, so where did they come from?
You still have no idea what redundancy is. As the Wikipedia article states, redundancy is the number of bits used to transmit information above the actual number of bits in the message. Therefore a message from a message set of size 3 can be represented in 1.585 bits, and since the DNA representation actually uses 12 bits, the extra 10.415 bits are redundant. You can't argue with this, it's just a mathematical fact.
Yes I can argue with that, I already have.
Well, okay, yes, you have argued with it, but that log2(3) is 1.585 bits, and that DNA uses 12 bits to represent these 1.585 bits, and that the difference between them is 10.415 bits, are mathematical facts as undeniable as 2+2=4. You can argue if you like, but not rationally.
--Percy

This message is a reply to:
 Message 398 by LucyTheApe, posted 10-03-2009 12:01 PM LucyTheApe has not replied

  
Percy
Member
Posts: 22479
From: New Hampshire
Joined: 12-23-2000
Member Rating: 4.7


Message 400 of 458 (543092)
01-15-2010 10:20 AM
Reply to: Message 398 by LucyTheApe
10-03-2009 12:01 PM


Re: What is information?
Hi LucyTheApe,
This is a response to your Message 275 over in the "Adding information to the genome" thread.
In Message 339 you provided these figures for the amount of information in the code snippets you provided:
  • Shell: 832 bits
  • First section of code: 1752 bits
  • Second section of code: 376 bits
Could you please show us how you calculated those figures? Thanks.
--Percy
Edited by Percy, : Typo.
Edited by Percy, : Grammar.

This message is a reply to:
 Message 398 by LucyTheApe, posted 10-03-2009 12:01 PM LucyTheApe has not replied

  
Percy
Member
Posts: 22479
From: New Hampshire
Joined: 12-23-2000
Member Rating: 4.7


Message 403 of 458 (543451)
01-18-2010 7:11 AM
Reply to: Message 401 by Taq
01-15-2010 4:11 PM


Re: What is information?
Hi Taq,
Since LucyTheApe is apparently going to take their time replying, I'd like to respond to this:
Taq writes:
So for DNA, what is the sender and who is the receiver?
These are what I think are the two primary ways in which DNA can be interpreted as a sender of information:
  1. The sender is the organism's DNA and the receiver is the organism itself. The DNA message is translated into proteins that then travel throughout the body delivering the message.
  2. The sender is the organism's DNA and the receiver is the offspring, which receives a copy of its parent's DNA. The DNA copy may contain errors, and for sexual species the DNA will usually be a combination of DNA from both parents.
There are additional ways of answering the question, but I think these are the most obvious. Hopefully there's no reason why creationists wouldn't look at it the same way.
--Percy
Edited by Percy, : Typo.
Edited by Percy, : Typo.

This message is a reply to:
 Message 401 by Taq, posted 01-15-2010 4:11 PM Taq has replied

Replies to this message:
 Message 405 by Taq, posted 01-19-2010 6:59 PM Percy has replied
 Message 407 by LucyTheApe, posted 01-21-2010 7:59 AM Percy has replied

  
Percy
Member
Posts: 22479
From: New Hampshire
Joined: 12-23-2000
Member Rating: 4.7


Message 404 of 458 (543569)
01-19-2010 7:23 AM
Reply to: Message 398 by LucyTheApe
10-03-2009 12:01 PM


Re: What is information?
Hi LucyTheApe,
For some reason you keep popping in and out every month or three, and now you're apparently doing it again, so here's some additional information that may help break this pattern.
People don't believe you know much about information theory because you've demonstrated your lack of knowledge over and over again. You appear to be trying to solve the problem of how to win a debate on a topic you know little about, and so you've adopted this strange strategy of "post a couple messages then exit for a couple months."
I know you want to drop the discussion about how you calculated the amount of information in your programs, but the mere fact that you thought it was possible tells us you have no idea what you're doing. Until you finally understand that whatever calculations you were doing were wrong you're not going to be seeking answers to how one actually calculates information.
So you're going to have to explain how you calculated the amount of information in your programs. When you finally admit to yourself that you can't do it (something that everyone following this discussion already realizes), only then will we be able to make progress.
--Percy

This message is a reply to:
 Message 398 by LucyTheApe, posted 10-03-2009 12:01 PM LucyTheApe has replied

Replies to this message:
 Message 408 by LucyTheApe, posted 01-21-2010 8:28 AM Percy has seen this message but not replied

  
Percy
Member
Posts: 22479
From: New Hampshire
Joined: 12-23-2000
Member Rating: 4.7


Message 406 of 458 (543678)
01-20-2010 8:13 AM
Reply to: Message 405 by Taq
01-19-2010 6:59 PM


Re: What is information?
Taq writes:
I would add a 3rd option as the most important. Since DNA is always spoken of as a code with four letters, 3 codons, and so forth it is important to relate these to the actual chemical reactions.
You're going to have to connect the dots for me on this one. The actual machinery that implements the information communication channels is unimportant at an information theoretic level.
You've added a bit more detail and used different names, but it seems like much the same thing. We could break down the processes in which information from DNA is the driving element like this:
  1. Cell metabolism.
  2. Body metabolism.
  3. Heredity.
--Percy

This message is a reply to:
 Message 405 by Taq, posted 01-19-2010 6:59 PM Taq has not replied

  