EvC Forum: Understanding through Discussion

Author Topic:   Can mutation and selection increase information?
Percy
Member
Posts: 22394
From: New Hampshire
Joined: 12-23-2000
Member Rating: 5.2


Message 213 of 222 (818676)
08-31-2017 8:46 PM
Reply to: Message 211 by CRR
08-31-2017 8:25 PM


Re: Information
CRR writes:
This work is continuing although I suspect since information is a non material thing it might never be possible to define and measure it exactly.
Yeah, like statistics and percentages and other non-material things. We'll never be able to define and measure them exactly.
CRR writes:
Claude Shannon no more had the last word on information than Charles Darwin did on evolution.
You finally said something true, but you're still unable to describe how to measure the amount of information in a genome.
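For what it's worth, here is a minimal sketch of the kind of measurement Shannon's framework does allow, assuming a simple i.i.d. symbol model and a made-up fragment (Python, illustrative only):

# Sketch, not a claim about any particular genome: total Shannon information
# of a sequence under an i.i.d. symbol model, i.e. length times the
# per-symbol entropy H = -sum(p * log2(p)). The fragment is hypothetical.
from collections import Counter
from math import log2

def shannon_bits(sequence):
    counts = Counter(sequence)
    n = len(sequence)
    return -sum(c * log2(c / n) for c in counts.values())

fragment = "ATGGCATTAGCCGATAACGT"               # hypothetical 20-base fragment
print(round(shannon_bits(fragment), 1), "bits")  # at most 2 bits per base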
--Percy

This message is a reply to:
 Message 211 by CRR, posted 08-31-2017 8:25 PM CRR has not replied

  
Percy
Member
Posts: 22394
From: New Hampshire
Joined: 12-23-2000
Member Rating: 5.2


Message 221 of 222 (819019)
09-05-2017 9:27 AM
Reply to: Message 219 by CRR
09-04-2017 8:37 PM


Re: Biological Information
CRR writes:
Biological Information - What is It?
Werner Gitt, Robert Compton, and Jorge Fernandez

quote:
But what do we mean by the term biological information? We suggest that, at present, it cannot be unambiguously defined.
Yes, Gitt is correct: he doesn't have the ability to define, let alone measure, what he calls "biological information". That's because he confuses information with meaning, and we have no way of measuring meaning.
This also means that all your claims of increasing and decreasing biological information are nonsense. You can't measure it, so you can't know when it increases or decreases.
quote:
This leads us to ask the more general question: What precisely is information? Anyone who has studied this field is aware of three working definitions of information:
Classical Information Theory: Shannon (statistical) information [3]; dealing solely with the technical/engineering aspects of communication. This involves analyses including obtaining statistics on the material symbols for data transmission, storage and processing.
Algorithmic Information Theory: Solomonoff/Kolmogorov/Chaitin [4-6]; dealing with the ‘complexity’ (as this term is defined in the theory) of material symbols in data structures and of objects in general.
This latter is derivative of the former.
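To make the contrast concrete, a rough sketch (hypothetical sequences, nothing taken from the paper): Shannon's measure is computed directly from symbol statistics, while algorithmic complexity is uncomputable in general, though a compressor gives a crude upper bound:

# Sketch only: compressed size as a crude upper bound on algorithmic
# (Kolmogorov) complexity. The sequences are hypothetical examples.
import random
import zlib

def compressed_bits(sequence):
    return 8 * len(zlib.compress(sequence.encode("ascii"), 9))

random.seed(0)
repetitive = "AT" * 500                                          # highly regular
scrambled = "".join(random.choice("ACGT") for _ in range(1000))  # no pattern
print(compressed_bits(repetitive), "vs", compressed_bits(scrambled))
# The regular sequence compresses far better, i.e. it carries less
# algorithmic information than the scrambled one of the same length.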
quote:
Complex Specified Information (CSI) Theory: Dembski [7]; roughly the same as Classical Information Theory but adding the important concept of a ‘specification’.
CSI is made up.
quote:
With the combination of abstract code and syntax we are able to generate more complex language structures such as words and sentences. However, at this (formal language) stage meaning plays no role. It was at this level only that Shannon developed his Theory of Communication [3] into the highly useful statistical analyses of the material symbols, solely for the technical purposes of data transmission, storage and processing.
Give that last phrase a closer look: "data transmission, storage and processing." That's pretty much everything DNA does. The information in DNA can be measured and quantified. The made-up stuff that you're talking about cannot.
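As a concrete illustration (a hypothetical sequence, and just one of several possible encodings): with a four-letter alphabet, two bits per base suffice, so the storage and transmission cost of a sequence is a definite, measurable number:

# Sketch: DNA over {A, C, G, T} packs into 2 bits per base, so the storage
# and transmission cost is directly quantifiable. The sequence is hypothetical.
CODE = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}

def pack(sequence):
    bits = 0
    for base in sequence:
        bits = (bits << 2) | CODE[base]
    n_bytes = (2 * len(sequence) + 7) // 8
    return bits.to_bytes(n_bytes, "big")

sequence = "ATGGCATTAGCCGATAACGT"                       # 20 bases
print(len(sequence), "bases ->", len(pack(sequence)), "bytes")  # 20 -> 5 bytes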
--Percy

This message is a reply to:
 Message 219 by CRR, posted 09-04-2017 8:37 PM CRR has not replied

  