Author Topic:   Question on genetic information
Percy
Member
Posts: 22494
From: New Hampshire
Joined: 12-23-2000
Member Rating: 4.9


Message 14 of 32 (373655)
01-02-2007 10:05 AM
Reply to: Message 9 by platypus
01-02-2007 2:18 AM


Re: macro and genetic information
Hi Platypus,
I'm primarily using your message as a jumping off point to reply to the opening post.
platypus writes:
Snakes lost their legs, which is the loss of an appendage and function, obviously a loss of information.
The 2006 Nobel Prize in Physiology or Medicine was awarded for research into gene regulation and expression, and when I read your post I wondered whether the snake's loss of legs was due to regulatory changes rather than to the loss of actual genes. In other words, while this is obviously a loss of form and function, it might not be a loss of information.
My brief search around the Internet indicated that snakes still possess the genes for limbs; they're just no longer expressed, having been silenced by regulatory processes. In the case of the hind limbs, the genes are switched off by additional genes. The forelimbs are more complicated, and I couldn't find a definitive answer in the five minutes I allotted myself for looking this up, but the genes for the entire body section where forelimbs would attach are no longer expressed.
An analogy might help those creationists out there who are trying to understand the ID argument about information. There is more than one way to turn off a light bulb. One way is to remove the wires that connect it to the battery, and that would be analogous to a loss of information. Another way is to insert a switch in the circuit and turn the switch off. That's analogous to a gain of information.
There are other ways to turn the light bulb off, of course. One is to break the light bulb, and I'm not sure if that's analogous to a gain or loss of information. Maybe it's just a change. Another way is to short out the light bulb by connecting another wire directly across its terminals, which is analogous to a gain of information. Another way is to remove the battery from the circuit, which would be analogous to a loss of information.
Analogies can be dangerous, so let me nip in the bud any efforts to carry this one too far. I'm just trying to explain something complex and unfamiliar by comparison to something simple and familiar. I'm definitely not saying that genetics and gene expression are the same thing as an electric circuit.
--Percy

This message is a reply to:
 Message 9 by platypus, posted 01-02-2007 2:18 AM platypus has replied

Replies to this message:
 Message 15 by platypus, posted 01-02-2007 2:10 PM Percy has not replied
 Message 17 by RAZD, posted 01-02-2007 11:50 PM Percy has not replied

  
Percy
Member
Posts: 22494
From: New Hampshire
Joined: 12-23-2000
Member Rating: 4.9


Message 26 of 32 (377677)
01-17-2007 10:35 PM
Reply to: Message 25 by Jaderis
01-17-2007 9:58 PM


Jaderis writes:
INFORMATION - What is conveyed or represented by a particular arrangement or sequence of things : genetically transmitted information.
Oh...wow. That definition sure clears things up.
Seriously..."Sequence of things"??? How on earth does anyone accept this as anything resembling a scientific or testable definition?
I was wondering if/when things would get to this point.
Information does have a formal definition. In fact, there is a branch of science called Information Theory, and the field was founded by Claude Shannon over half a century ago with his seminal paper titled A Mathematical Theory of Communication.
As Shannon points out, the fundamental problem of communication is reproducing at one point a message selected at another point from a fixed set of possible messages. For example, a very simple message might consist of a single character. The set of all possible messages in that case would be the set of all characters, which for an 8-bit character set such as extended ASCII is 256 different characters (standard ASCII is actually only 128, and a good number of the characters are non-printing, but we'll ignore that and keep this example simple).
How much information is contained in a message is a function of how many bits it takes to represent it. One character out of 256 possibilities takes 8 bits to represent, so the information contained in a single 8-bit character is 8 bits. For those already familiar with information theory: keep in mind I'm just trying to communicate the general principle and am purposely avoiding nuances such as noise, redundancy, etc.
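As a quick sanity check, the relationship between set size and bits can be expressed in a couple of lines of Python (my own sketch, not part of the original post):

```python
import math

# A message drawn from a set of 256 equally likely symbols
# carries log2(256) = 8 bits of information.
alphabet_size = 256
bits_per_symbol = math.log2(alphabet_size)
print(bits_per_symbol)  # 8.0
```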
I was actually heading off to bed when I saw this message so I'm not going to go beyond that brief explanation. I'd be glad to answer any questions tomorrow.
--Percy

This message is a reply to:
 Message 25 by Jaderis, posted 01-17-2007 9:58 PM Jaderis has replied

Replies to this message:
 Message 27 by Jaderis, posted 01-17-2007 10:53 PM Percy has replied

  
Percy
Member
Posts: 22494
From: New Hampshire
Joined: 12-23-2000
Member Rating: 4.9


Message 29 of 32 (377768)
01-18-2007 10:04 AM
Reply to: Message 27 by Jaderis
01-17-2007 10:53 PM


Jaderis writes:
Could you perhaps elaborate on the what this theory might imply with regards to transmitting genetic information?
WK already alluded to noise, which in the case of genetics is most often caused by replication error (copying mistakes during cell division). It is this error-prone process that leads to increasing amounts of information in the genome.
As I said earlier, the amount of information contained in a message is the number of bits required to represent the message. If there are only five possible messages, then it takes only three bits to encode a single message, like this for example:
000 Blue eyes
001 Brown eyes
010 Green eyes
011 Hazel eyes
100 Yellow eyes
In reproduction the goal is to pass the genetic information from the parents to the offspring, but what if there is a copying error? Say one of the parents has the genetic code for brown eyes (001 in my example, not an actual genetic code, of course), but during reproduction a copying error turns the 001 into 101. There was originally no code 101, but now there is, so our table looks like this:
000 Blue eyes
001 Brown eyes
010 Green eyes
011 Hazel eyes
100 Yellow eyes
101 ???
Genetically we don't know what the new code will do until we see what it does to the individual. The codes are instructions for building proteins, which are the workhorses of the body. Will the 101 code produce one of the already existing eye colors, or a new eye color? Body chemistry is usually too complex to predict ahead of time. That's why medicine relies on animal trials before testing begins on humans.
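The copying error just described can be made concrete with a small Python sketch (my own illustration; these are the same made-up 3-bit eye-color codes from the table, not real genetic codes):

```python
# Made-up 3-bit codes from the eye-color example - not real genetics.
codes = {
    "000": "Blue eyes",
    "001": "Brown eyes",
    "010": "Green eyes",
    "011": "Hazel eyes",
    "100": "Yellow eyes",
}

def flip_bit(code, position):
    """Simulate a copying error: flip one bit of a code string."""
    bits = list(code)
    bits[position] = "1" if bits[position] == "0" else "0"
    return "".join(bits)

mutant = flip_bit("001", 0)   # the error described above: 001 -> 101
print(mutant)                 # '101'
print(mutant in codes)        # False - not in the original message set
codes[mutant] = "???"         # the message set grows from five to six
print(len(codes))             # 6
```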
But notice that the mutation (that's what a copying error is) has increased our message set from five messages to six. The mathematical way to calculate information is to take the logarithm base 2 of the number of messages in the message set. So the amount of information communicated when sending a single message from a set of five messages is:
log₂ 5 ≈ 2.322 bits
When the message set size increases to six, the amount of information communicated by a single message is:
log₂ 6 ≈ 2.585 bits
So the mutation has increased the information in the genome by about 0.263 bits.
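The arithmetic can be verified in a few lines of Python (my own sketch, not from the original post); the gain works out to log₂ 6 − log₂ 5 ≈ 0.263 bits:

```python
import math

before = math.log2(5)   # ~2.322 bits per message with 5 possible messages
after = math.log2(6)    # ~2.585 bits per message with 6 possible messages
gain = after - before   # ~0.263 bits gained by the mutation
print(round(before, 3), round(after, 3), round(gain, 3))
```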
(Keep in mind that this is just a simple example to get the principle of how Information Theory can be applied to genetics - it's not a real world example. I used binary bits instead of nucleotide triplets, and eye color determination is actually spread amongst a number of genes, not just one.)
Is the definition of information as presented in Information Theory pretty much the same as the definition I quoted in my post?
The definition you quoted is very imprecise for purposes of understanding Information Theory.
Are we (evos) being disingenuous when we state that the creationists won't or haven't yet defined information (IOW, do they actually use the definition presented by Information Theory, and can it be applied accurately to genetics)?
WK covered this pretty well. I wouldn't be so kind myself. Both Gitt information and Dembski information are made-up definitions pulled from thin air, with occasional references to real-world theories to lend them an air of legitimacy.
--Percy

This message is a reply to:
 Message 27 by Jaderis, posted 01-17-2007 10:53 PM Jaderis has replied

Replies to this message:
 Message 31 by Jaderis, posted 01-18-2007 6:36 PM Percy has not replied

  
Copyright 2001-2023 by EvC Forum, All Rights Reserved
