Author Topic:   How do "novel" features evolve?
caffeine
Member (Idle past 1024 days)
Posts: 1800
From: Prague, Czech Republic
Joined: 10-22-2008


Message 241 of 314 (661786)
05-10-2012 5:38 AM
Reply to: Message 235 by Panda
05-09-2012 12:46 PM


Logic
quote:
Then Premise 6 contradicts ("rules out") Premise 5?
To say (e.g.) that "This number is either odd or even, but it is not even." seems contradictory to me.
Surely it should just be: "This number is odd." or "LIFE is due to chance or design".
The premises are not contradictory.
"A or B" means that either A must be true, or B must be true. If neither of them is true, then we have a contradiction. If both of them are true, then we also have a contradiction. If just one of them is true, however, then there's no contradiction, and everything's hunky dory. (This all assumes we're using an exclusive or - in common speech 'or' can also mean 'and/or', but from context I don't think that's what zaius meant.)
This sort of argument is useful if the premises are actually correct. If it's true that either A or B must be true, and we have convincing evidence that B is not true, then logically this is convincing evidence that A is true, even if we have no direct evidence of that.
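The exclusive-or reasoning above can be sketched in a few lines of Python (a toy illustration; the function names are arbitrary, not from the thread):

```python
def xor(a: bool, b: bool) -> bool:
    """Exclusive or: true when exactly one of a and b is true."""
    return a != b

# The premise "A or B" (exclusive) is violated when neither or both hold.
for a in (False, True):
    for b in (False, True):
        print(f"A={a!s:5} B={b!s:5} -> premises hold: {xor(a, b)}")

# Disjunctive syllogism: if xor(A, B) holds and we have convincing
# evidence that B is false, then A must be true -- even with no
# direct evidence for A.
def infer_a_given_not_b() -> bool:
    b = False        # B has been ruled out
    return not b     # so A must be true

print(infer_a_given_not_b())  # True
```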

This message is a reply to:
 Message 235 by Panda, posted 05-09-2012 12:46 PM Panda has seen this message but not replied

Replies to this message:
 Message 242 by ScottyDouglas, posted 05-10-2012 6:53 AM caffeine has not replied
 Message 249 by zaius137, posted 05-10-2012 6:44 PM caffeine has not replied

  
ScottyDouglas
Member (Idle past 4331 days)
Posts: 79
Joined: 05-10-2012


Message 242 of 314 (661789)
05-10-2012 6:53 AM
Reply to: Message 241 by caffeine
05-10-2012 5:38 AM


Re: Logic
Evolution is both fact and theory. Theory because it is always changing from new information (also a hurt to evolution), and fact because the methods and patterns observed are proven.
Though the question is, has evolutionary theory really completely explained evolution? No it hasn't. Have they discovered any process of genetics which can evolve anything new? No.
Fossils speak of sudden appearances, the kinds of fast-paced change not exactly hand in hand with evolution. Nature is rich with biological designs which outright deny evolution.
Amongst animals that do are: the bombardier beetle, the giraffe, the gecko, and the hummingbird, to name a few.
While we are able to apprehend a Creator we cannot comprehend Him, and that is the problem.
If specific animals do not fit the theory of evolution the house of cards implodes.
Ultimately, with animals and fossils recorded not fully supporting evolution, the design of the world rests upon a Creator.

This message is a reply to:
 Message 241 by caffeine, posted 05-10-2012 5:38 AM caffeine has not replied

Replies to this message:
 Message 243 by Dr Adequate, posted 05-10-2012 8:29 AM ScottyDouglas has not replied
 Message 244 by RAZD, posted 05-10-2012 5:27 PM ScottyDouglas has replied

  
Dr Adequate
Member (Idle past 284 days)
Posts: 16113
Joined: 07-20-2006


(2)
Message 243 of 314 (661793)
05-10-2012 8:29 AM
Reply to: Message 242 by ScottyDouglas
05-10-2012 6:53 AM


Re: Logic
quote:
Evolution is both fact and theory. Theory because it is always changing from new information (also a hurt to evolution), and fact because the methods and patterns observed are proven.
Though the question is, has evolutionary theory really completely explained evolution? No it hasn't. Have they discovered any process of genetics which can evolve anything new? No.
Fossils speak of sudden appearances, the kinds of fast-paced change not exactly hand in hand with evolution. Nature is rich with biological designs which outright deny evolution.
Amongst animals that do are: the bombardier beetle, the giraffe, the gecko, and the hummingbird, to name a few.
While we are able to apprehend a Creator we cannot comprehend Him, and that is the problem.
If specific animals do not fit the theory of evolution the house of cards implodes.
Ultimately, with animals and fossils recorded not fully supporting evolution, the design of the world rests upon a Creator.
Thank you for your Gish Gallop. If there is any specific point in it that you feel capable of defending, please start a thread on it, where you will be shown how wrong you are. Otherwise please be informed that you are reciting ignorant trash invented by liars and fools to deceive the gullible.

This message is a reply to:
 Message 242 by ScottyDouglas, posted 05-10-2012 6:53 AM ScottyDouglas has not replied

  
RAZD
Member (Idle past 1404 days)
Posts: 20714
From: the other end of the sidewalk
Joined: 03-14-2004


Message 244 of 314 (661841)
05-10-2012 5:27 PM
Reply to: Message 242 by ScottyDouglas
05-10-2012 6:53 AM


MORE OFF TOPIC DRIFT
Hi ScottyDouglas, and welcome to the fray.
quote:
Evolution is both fact and theory ...
This thread is not about evolution in general, but specifically about how new features evolve. If the off topic posts continue I will have to ask admin to close the thread for a while.
Feel free to start a new thread and I will be happy to discuss your points.
Go to Proposed New Topics to post new topics.
You can improve the readability of your post by using extra lines between paragraphs.
Enjoy.
... as you are new here, some posting tips:
type [qs]quotes are easy[/qs] and it becomes:
quotes are easy
or type [quote]quotes are easy[/quote] and it becomes:
quote:
quotes are easy
also check out (help) links on any formatting questions when in the reply window.
For other formatting tips see Posting Tips
For a quick overview see EvC Forum Primer
If you have problems with replies see Report Discussion Problems Here 3.0

we are limited in our ability to understand
by our ability to understand
Rebel American Zen Deist
... to learn ... to think ... to live ... to laugh ...
to share.


Join the effort to solve medical problems, AIDS/HIV, Cancer and more with Team EvC! (click)

This message is a reply to:
 Message 242 by ScottyDouglas, posted 05-10-2012 6:53 AM ScottyDouglas has replied

Replies to this message:
 Message 246 by ScottyDouglas, posted 05-10-2012 5:45 PM RAZD has seen this message but not replied

  
RAZD
Member (Idle past 1404 days)
Posts: 20714
From: the other end of the sidewalk
Joined: 03-14-2004


Message 245 of 314 (661843)
05-10-2012 5:30 PM
Reply to: Message 239 by zaius137
05-10-2012 3:28 AM


STILL OFF TOPIC
Hi again zaius137,
quote:
Not at all, here is the Wiki demonstrating the relationship between entropy and information. ...
Which is still not the topic of this thread. Please start a new topic for this discussion.
Go to Proposed New Topics to post new topics.
Enjoy.


This message is a reply to:
 Message 239 by zaius137, posted 05-10-2012 3:28 AM zaius137 has replied

Replies to this message:
 Message 248 by zaius137, posted 05-10-2012 6:40 PM RAZD has seen this message but not replied

  
ScottyDouglas
Member (Idle past 4331 days)
Posts: 79
Joined: 05-10-2012


(1)
Message 246 of 314 (661846)
05-10-2012 5:45 PM
Reply to: Message 244 by RAZD
05-10-2012 5:27 PM


Re: MORE OFF TOPIC DRIFT
Thank you and God Bless
RAZD

This message is a reply to:
 Message 244 by RAZD, posted 05-10-2012 5:27 PM RAZD has seen this message but not replied

  
zaius137
Member (Idle past 3409 days)
Posts: 407
Joined: 05-08-2012


Message 247 of 314 (661875)
05-10-2012 6:36 PM
Reply to: Message 240 by Dr Adequate
05-10-2012 4:40 AM


Re: Information
quote:
Well, if your choice is Shannon entropy, then creating information is easy. Any insertion would do it, since the insertion increases the number of bits in the genome, and since the content of these bits is not completely predictable from their context.
You are really going to have to go into the math here. I do not see where increasing the size of the genome varies the probability estimation of that genome. Remember, if Shannon entropy is increased, the number of bits that express the uncertainty will increase, but the implied information content decreases. Uncertainty goes up; implied information goes down.
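The claim that an insertion increases the number of bits in the genome can be illustrated with a toy calculation (a sketch using empirical per-symbol Shannon entropy; the sequences and the 4-base insertion are invented for illustration):

```python
from collections import Counter
from math import log2

def shannon_entropy(seq: str) -> float:
    """Empirical Shannon entropy in bits per symbol."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())

genome = "ACGTACGTACGT"                       # hypothetical sequence
inserted = genome[:6] + "GGAT" + genome[6:]   # hypothetical 4-base insertion

# Total information content (bits) = length * bits per symbol.
before = len(genome) * shannon_entropy(genome)
after = len(inserted) * shannon_entropy(inserted)
print(before, after)  # the longer sequence carries more total bits
```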

This message is a reply to:
 Message 240 by Dr Adequate, posted 05-10-2012 4:40 AM Dr Adequate has replied

Replies to this message:
 Message 250 by Dr Adequate, posted 05-10-2012 7:36 PM zaius137 has not replied
 Message 251 by Percy, posted 05-11-2012 6:52 AM zaius137 has replied

  
zaius137
Member (Idle past 3409 days)
Posts: 407
Joined: 05-08-2012


Message 248 of 314 (661878)
05-10-2012 6:40 PM
Reply to: Message 245 by RAZD
05-10-2012 5:30 PM


Re: STILL OFF TOPIC
I apologize RAZD...
I thought I could bring all this together but I see the subject is extremely diffuse.

This message is a reply to:
 Message 245 by RAZD, posted 05-10-2012 5:30 PM RAZD has seen this message but not replied

  
zaius137
Member (Idle past 3409 days)
Posts: 407
Joined: 05-08-2012


Message 249 of 314 (661880)
05-10-2012 6:44 PM
Reply to: Message 241 by caffeine
05-10-2012 5:38 AM


Re: Logic
Thank you...

This message is a reply to:
 Message 241 by caffeine, posted 05-10-2012 5:38 AM caffeine has not replied

  
Dr Adequate
Member (Idle past 284 days)
Posts: 16113
Joined: 07-20-2006


Message 250 of 314 (661891)
05-10-2012 7:36 PM
Reply to: Message 247 by zaius137
05-10-2012 6:36 PM


Re: Information
quote:
You are really going to have to go into the math here.
Well, where do I start? You referred us to an article on Shannon entropy. I have a Ph.D. in mathematics, and I (perhaps foolishly) assumed that when you started talking about Shannon entropy, you knew what you were talking about.
Apparently you don't. So, tell us --- at what point do you start finding information theory incomprehensible? Maybe I could help you over that hump. But until I have, maybe you should stop talking about information theory.

This message is a reply to:
 Message 247 by zaius137, posted 05-10-2012 6:36 PM zaius137 has not replied

  
Percy
Member
Posts: 22388
From: New Hampshire
Joined: 12-23-2000
Member Rating: 5.2


Message 251 of 314 (661953)
05-11-2012 6:52 AM
Reply to: Message 247 by zaius137
05-10-2012 6:36 PM


Re: Information
zaius137 writes:
Uncertainty goes up; implied information goes down.
I'm not sure what you mean by the modifier "implied", but unless it has special significance I think you've got the relationship exactly backwards. The most information is exchanged when uncertainty about the state of the next bit to be communicated is greatest.
--Percy

This message is a reply to:
 Message 247 by zaius137, posted 05-10-2012 6:36 PM zaius137 has replied

Replies to this message:
 Message 252 by zaius137, posted 05-12-2012 2:57 AM Percy has replied

  
zaius137
Member (Idle past 3409 days)
Posts: 407
Joined: 05-08-2012


Message 252 of 314 (662072)
05-12-2012 2:57 AM
Reply to: Message 251 by Percy
05-11-2012 6:52 AM


Re: Information
quote:
The most information is exchanged when uncertainty about the state of the next bit to be communicated is greatest.
My last point on this, and I believe it may tie into this thread.
The uncertainties or probabilities directly produce the resulting entropy (Shannon entropy). For instance, a fair coin toss has an entropy of one bit (to transmit the outcome of a fair coin toss you need one bit). An unfair coin toss (say 70% heads and 30% tails) has an entropy of about 0.88 bits. The entropy is less because the outcome is more certain. A perfectly predictable outcome has the lowest entropy. As in cybernetics, information can reduce entropy. Consequently, I am implying an inverse relationship between information and entropy.
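The coin-toss numbers above can be checked directly with the binary entropy function (a short sketch; the function name is arbitrary):

```python
from math import log2

def binary_entropy(p: float) -> float:
    """Shannon entropy (bits) of a coin that lands heads with probability p."""
    if p in (0.0, 1.0):
        return 0.0  # a perfectly predictable outcome carries no surprise
    return -(p * log2(p) + (1 - p) * log2(1 - p))

print(binary_entropy(0.5))  # fair coin: 1.0 bit
print(binary_entropy(0.7))  # 70/30 coin: ~0.881 bits
print(binary_entropy(1.0))  # certain outcome: 0.0 bits
```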
cybernetics
The science or study of communication in organisms, organic processes, and mechanical or electronic systems; also, the replication or imitation of biological control systems with the use of technology.
As Wiener (1954) explains, just as entropy is a measure of disorganization, the information carried by a set of messages is a measure of organization (p. 17). In other words, information can reduce entropy.
Communication | College of Media, Communication and Information | University of Colorado Boulder
This relationship, as I encountered it, is presented in a book by A.E. Wilder-Smith and is an ultimate test for an intelligent designer.
http://www.wildersmith.org/library.htm
All appreciation to those holding PhDs in the field of mathematics but I would really like a citation relating to their point.
Edited by zaius137, : Spelling

This message is a reply to:
 Message 251 by Percy, posted 05-11-2012 6:52 AM Percy has replied

Replies to this message:
 Message 253 by PaulK, posted 05-12-2012 5:55 AM zaius137 has not replied
 Message 254 by Percy, posted 05-12-2012 7:38 AM zaius137 has replied

  
PaulK
Member
Posts: 17822
Joined: 01-10-2003
Member Rating: 2.3


Message 253 of 314 (662077)
05-12-2012 5:55 AM
Reply to: Message 252 by zaius137
05-12-2012 2:57 AM


Re: Information
I'm pretty convinced that you don't understand the subject.
quote:
The uncertainties or probabilities directly produce the resulting entropy (Shannon entropy). For instance, a fair coin toss has an entropy of one bit (to transmit the outcome of a fair coin toss you need one bit). An unfair coin toss (say 70% heads and 30% tails) has an entropy of about 0.88 bits. The entropy is less because the outcome is more certain. A perfectly predictable outcome has the lowest entropy. As in cybernetics, information can reduce entropy. Consequently, I am implying an inverse relationship between information and entropy.
The information carried by the bit is less because the outcome is more certain. The relationship doesn't seem to be inverse at all. The greater the entropy of the source, the more information is carried by each bit. (This makes sense when you consider compression.)
Wikipedia
Shannon's entropy measures the information contained in a message
In cybernetics information may be USED to reduce the entropy of a system where entropy is considered as disorganisation, but that is not the same thing at all.
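The compression point can be demonstrated with the standard library's zlib: a low-entropy source compresses to almost nothing, while a high-entropy source barely compresses at all (a sketch; the sample data are arbitrary):

```python
import random
import zlib

random.seed(0)
low = b"A" * 1000                                          # minimal-entropy source
high = bytes(random.randrange(256) for _ in range(1000))   # high-entropy source

print(len(zlib.compress(low)))   # a handful of bytes
print(len(zlib.compress(high)))  # roughly the original size
```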
quote:
This relationship, as I encountered it, is presented in a book by A.E. Wilder-Smith and is an ultimate test for an intelligent designer.
I suspect that that is the source of your confusion. Creationists do not have a good record of dealing with Information Theory.

This message is a reply to:
 Message 252 by zaius137, posted 05-12-2012 2:57 AM zaius137 has not replied

  
Percy
Member
Posts: 22388
From: New Hampshire
Joined: 12-23-2000
Member Rating: 5.2


Message 254 of 314 (662082)
05-12-2012 7:38 AM
Reply to: Message 252 by zaius137
05-12-2012 2:57 AM


Re: Information
zaius137 writes:
The uncertainties or probabilities directly produce the resulting entropy (Shannon entropy). For instance, a fair coin toss has an entropy of one bit (to transmit the outcome of a fair coin toss you need one bit). An unfair coin toss (say 70% heads and 30% tails) has an entropy of about 0.88 bits. The entropy is less because the outcome is more certain. A perfectly predictable outcome has the lowest entropy. As in cybernetics, information can reduce entropy. Consequently, I am implying an inverse relationship between information and entropy.
You described it accurately, but your conclusion is backwards. The lower the entropy the less information can be communicated.
Take your example of, "A perfectly predictable outcome has lowest entropy." A two-headed coin has a "perfectly predictable outcome" and therefore has the lowest possible entropy. When you flip the coin I don't need you to tell me the result because I already know. This means you have communicated no information when you tell me it came up "heads," because I already knew that.
Entropy and information have a positive, not an inverse, relationship.
--Percy

This message is a reply to:
 Message 252 by zaius137, posted 05-12-2012 2:57 AM zaius137 has replied

Replies to this message:
 Message 255 by zaius137, posted 05-12-2012 1:22 PM Percy has replied

  
zaius137
Member (Idle past 3409 days)
Posts: 407
Joined: 05-08-2012


Message 255 of 314 (662116)
05-12-2012 1:22 PM
Reply to: Message 254 by Percy
05-12-2012 7:38 AM


Re: Information
Remember where Shannon entropy is most appropriate. It gives insight into how many bits are needed to convey an independent variable over a communications channel. As the randomness of that variable increases (less innate information), the number of bits needed to convey that variable increases (more bits to convey that randomness).
Information decreases (the innate information of the variable); entropy increases (the bits needed to express the variable).
This relationship holds in communications and in biology.
quote:
The basic concept of entropy in information theory has to do with how much randomness is in a signal or in a random event. An alternative way to look at this is to talk about how much information is carried by the signal.
As an example consider some English text, encoded as a string of letters, spaces and punctuation (so our signal is a string of characters). Since some characters are not very likely (e.g. 'z') while others are very common (e.g. 'e') the string of characters is not really as random as it might be. On the other hand, since we cannot predict what the next character will be, it does have some 'randomness'. Entropy is a measure of this randomness, suggested by Claude E. Shannon in his 1949 paper A Mathematical Theory of Communication.
http://www.wordiq.com/definition/Shannon_entropy
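The quoted point, that English text carries less than the maximum possible entropy per character because letter frequencies are uneven, can be checked empirically (a sketch; the sample strings are arbitrary):

```python
from collections import Counter
from math import log2

def char_entropy(text: str) -> float:
    """Empirical entropy in bits per character."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in counts.values())

english = "the quick brown fox jumps over the lazy dog " * 20
uniform = "abcdefghijklmnopqrstuvwxyz"  # each letter exactly once

# English text (27 symbols incl. space) falls below the log2(27) ~ 4.75-bit
# maximum; a uniform alphabet hits its maximum of log2(26) ~ 4.70 bits.
print(char_entropy(english))
print(char_entropy(uniform))
```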
Edited by zaius137, : No reason given.

This message is a reply to:
 Message 254 by Percy, posted 05-12-2012 7:38 AM Percy has replied

Replies to this message:
 Message 256 by Percy, posted 05-13-2012 7:57 AM zaius137 has replied
 Message 260 by Dr Adequate, posted 05-14-2012 2:31 AM zaius137 has not replied

  
Copyright 2001-2023 by EvC Forum, All Rights Reserved