Topic: How do "novel" features evolve?
caffeine Member (Idle past 1273 days) Posts: 1800 From: Prague, Czech Republic Joined: |
zaius137 writes:

Then Premise 6 contradicts ("rules out") Premise 5? To say (e.g.) that "This number is either odd or even, but it is not even" seems contradictory to me. Surely it should just be: "This number is odd." or "LIFE is due to chance or design".

The premises are not contradictory. "A or B" means that either A must be true, or B must be true. If neither of them is true, then we have a contradiction. If both of them are true, then we also have a contradiction. If just one of them is true, however, then there's no contradiction, and everything's hunky dory. (This all assumes we're using an exclusive or; in common speech "or" can also mean "and/or", but from context I don't think that's what zaius meant.)

This sort of argument is useful if the premises are actually correct. If it's true that either A or B must be true, and we have convincing evidence that B is not true, then logically this is convincing evidence that A is true, even if we have no direct evidence of it.
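caffeine's exclusive-or reading can be sketched in a few lines of Python (an illustrative aside; the function name `xor` is mine, not from the thread):

```python
# Exclusive or: "A or B" where exactly one of A, B may be true.
# In Python, != applied to two bools is exactly XOR.
def xor(a: bool, b: bool) -> bool:
    return a != b

# Both false contradicts the premise, both true also contradicts it,
# and exactly one true satisfies it. So if "A xor B" holds and we have
# convincing evidence that B is false, A must be true.
assert xor(True, False) and xor(False, True)          # exactly one true: premise holds
assert not xor(True, True) and not xor(False, False)  # both or neither: contradiction
```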
ScottyDouglas Member (Idle past 4580 days) Posts: 79 Joined:
Evolution is both fact and theory. Theory, because it is always changing with new information (also a hurt to evolution), and fact, because the methods and patterns observed are proven.

Though the question is: has evolutionary theory really completely explained evolution? No, it hasn't. Have they discovered any process of genetics which can evolve anything new? No. Fossils speak of sudden appearances, the kinds of fast-paced change not exactly hand in hand with evolution. Nature is rich with biological designs which outright deny evolution. Among the animals that do are the bombardier beetle, the giraffe, the gecko, and the hummingbird, to name a few.

While we are able to apprehend a Creator we cannot comprehend Him, and that is the problem. If specific animals do not fit the theory of evolution, the house of cards implodes. Ultimately, with animals and fossils recorded not fully supporting evolution, the design of the world rests upon a Creator.
Dr Adequate Member Posts: 16113 Joined:
ScottyDouglas writes:

Evolution is both fact and theory. ... Ultimately, with animals and fossils recorded not fully supporting evolution, the design of the world rests upon a Creator.

Thank you for your Gish Gallop. If there is any specific point in it that you feel capable of defending, please start a thread on it, where you will be shown how wrong you are. Otherwise, please be informed that you are reciting ignorant trash invented by liars and fools to deceive the gullible.
RAZD Member (Idle past 1653 days) Posts: 20714 From: the other end of the sidewalk Joined: |
Hi ScottyDouglas, and welcome to the fray.
ScottyDouglas writes:

Evolution is both fact and theory ...

This thread is not about evolution in general, but specifically about how new features evolve. If the off-topic posts continue I will have to ask admin to close the thread for a while. Feel free to start a new thread, and I will be happy to discuss your points. Go to Proposed New Topics to post new topics.

You can improve the readability of your post by using extra lines between paragraphs.

Enjoy.
... as you are new here, some posting tips:

type [qs]quotes are easy[/qs] and it becomes:

quotes are easy

or type [quote]quotes are easy[/quote] and it becomes:

quote:

quotes are easy

Also check out the (help) links for any formatting questions when in the reply window. For other formatting tips see Posting Tips. For a quick overview see EvC Forum Primer. If you have problems with replies see Report Discussion Problems Here 3.0.

Rebel American Zen Deist ... to learn ... to think ... to live ... to laugh ... to share.

Join the effort to solve medical problems, AIDS/HIV, Cancer and more with Team EvC! (click)
RAZD Member (Idle past 1653 days) Posts: 20714 From: the other end of the sidewalk Joined: |
Hi again zaius137,
zaius137 writes:

Not at all. Here is the Wiki demonstrating the relationship between entropy and information. ...

Which is still not the topic of this thread. Please start a new topic for this discussion. Go to Proposed New Topics to post new topics.

Enjoy.
ScottyDouglas Member (Idle past 4580 days) Posts: 79 Joined:
Thank you and God Bless
RAZD
zaius137 Member (Idle past 3658 days) Posts: 407 Joined: |
quote:

Well, if your choice is Shannon entropy, then creating information is easy. Any insertion would do it, since the insertion increases the number of bits in the genome, and since the content of these bits is not completely predictable from their context.

You are really going to have to go into the math here. I do not see where increasing the size of the genome varies the probability estimation of that genome. Remember: if Shannon entropy is increased, the number of bits that express the uncertainty will increase, but the implied information content decreases. Uncertainty goes up; implied information goes down.
zaius137 Member (Idle past 3658 days) Posts: 407 Joined: |
I apologize RAZD...
I thought I could bring all this together but I see the subject is extremely diffuse.
zaius137 Member (Idle past 3658 days) Posts: 407 Joined: |
Thank you...
Dr Adequate Member Posts: 16113 Joined: |
zaius137 writes:

You are really going to have to go into the math here.

Well, where do I start? You referred us to an article on Shannon entropy. I have a Ph.D. in mathematics, and I (perhaps foolishly) assumed that when you started talking about Shannon entropy, you knew what you were talking about. Apparently you don't.

So, tell us --- at what point do you start finding information theory incomprehensible? Maybe I could help you over that hump. But until I have, maybe you should stop talking about information theory.
Percy Member Posts: 22929 From: New Hampshire Joined: Member Rating: 7.2 |
zaius137 writes:

Uncertainty goes up; implied information goes down.

I'm not sure what you mean by the modifier "implied", but unless it has special significance I think you've got the relationship exactly backwards. The most information is exchanged when uncertainty about the state of the next bit to be communicated is greatest.

--Percy
zaius137 Member (Idle past 3658 days) Posts: 407 Joined: |
Percy writes:

The most information is exchanged when uncertainty about the state of the next bit to be communicated is greatest.

My last point on this, and I believe it may tie into this thread. The uncertainties, or probabilities, directly produce the resulting entropy (Shannon entropy). For instance, a fair coin toss has an entropy of one: to transmit the outcome of a fair coin toss you need one bit. An unfair coin toss (say, heads 70% of the time and tails 30%) has an entropy of about 0.88. The entropy is less because the outcome is more certain. A perfectly predictable outcome has the lowest entropy. As in cybernetics, information can reduce entropy. Consequently, I am implying an inverse relationship between information and entropy.

quote:

cybernetics: The science or study of communication in organisms, organic processes, and mechanical or electronic systems; the replication or imitation of biological control systems with the use of technology.

As Wiener (1954) explains, just as entropy is a measure of disorganization, the information carried by a set of messages is a measure of organization (p. 17). In other words, information can reduce entropy. (Communication | College of Media, Communication and Information | University of Colorado Boulder)

This relationship, as I encountered it, is presented in a book by A.E. Wilder-Smith and is an ultimate test for an intelligent designer. http://www.wildersmith.org/library.htm

All appreciation to those holding PhDs in the field of mathematics, but I would really like a citation relating to their point.

Edited by zaius137, : Spelling
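The coin-toss figures above match the standard binary entropy formula H(p) = -p log2 p - (1 - p) log2 (1 - p); a short Python check (an editorial illustration, not part of the original post) makes the computation explicit:

```python
import math

def entropy(p: float) -> float:
    """Shannon entropy, in bits, of a coin landing heads with probability p."""
    if p in (0.0, 1.0):
        return 0.0  # perfectly predictable: no uncertainty, no bits needed
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

entropy(0.5)  # fair coin: 1.0 bit per toss
entropy(0.7)  # 70/30 coin: about 0.881 bits, the 0.88 figure above
entropy(1.0)  # two-headed coin: 0.0 bits
```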
PaulK Member Posts: 17906 Joined: Member Rating: 7.2 |
I'm pretty convinced that you don't understand the subject.
quote:

The entropy is less because the outcome is more certain.

The information carried by the bit is less because the outcome is more certain. The relationship doesn't seem to be inverse at all. The greater the entropy of the source, the more information is carried by each bit. (This makes sense when you consider compression.)

Wikipedia writes:

Shannon's entropy measures the information contained in a message

In cybernetics, information may be USED to reduce the entropy of a system (where entropy is considered as disorganisation), but that is not the same thing at all.
quote:

This relationship, as I encountered it, is presented in a book by A.E. Wilder-Smith and is an ultimate test for an intelligent designer.

I suspect that that is the source of your confusion. Creationists do not have a good record of dealing with Information Theory.
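PaulK's compression parenthetical can be demonstrated with a quick Python sketch (editorial illustration; the sample size and the choice of `zlib` are mine): the higher the entropy of a source, the less its output compresses, because each symbol already carries more information.

```python
import random
import zlib

random.seed(0)
N = 10_000
constant = b"A" * N                                         # zero entropy
skewed = bytes(random.choices(b"AB", weights=[9, 1], k=N))  # ~0.47 bits/symbol
uniform = bytes(random.getrandbits(8) for _ in range(N))    # ~8 bits/symbol

# Compressed size grows with source entropy: the constant stream collapses
# to almost nothing, while the uniform random stream barely shrinks at all.
sizes = {name: len(zlib.compress(data))
         for name, data in [("constant", constant),
                            ("skewed", skewed),
                            ("uniform", uniform)]}
```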
Percy Member Posts: 22929 From: New Hampshire Joined: Member Rating: 7.2 |
zaius137 writes:

The uncertainties, or probabilities, directly produce the resulting entropy (Shannon entropy). ... Consequently, I am implying an inverse relationship between information and entropy.

You described it accurately, but your conclusion is backwards. The lower the entropy, the less information can be communicated.

Take your example of "A perfectly predictable outcome has the lowest entropy." A two-headed coin has a perfectly predictable outcome and therefore the lowest possible entropy. When you flip the coin I don't need you to tell me the result, because I already know it. This means you have communicated no information when you tell me it came up heads, because I already knew that. Entropy and information have a positive, not an inverse, relationship.

--Percy
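Percy's two-headed coin can be put in terms of surprisal, the standard information content -log2(p) of a single outcome (the sketch below is an editorial illustration, not from the thread):

```python
import math

def surprisal(p: float) -> float:
    """Bits of information conveyed by observing an outcome of probability p."""
    return -math.log2(p)

# A certain outcome carries no information; rarer outcomes carry more.
surprisal(1.0)   # two-headed coin shows heads: 0 bits, nothing learned
surprisal(0.5)   # fair coin shows heads: exactly 1 bit
surprisal(0.25)  # a 1-in-4 outcome: 2 bits
```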
zaius137 Member (Idle past 3658 days) Posts: 407 Joined: |
Remember where Shannon entropy is most appropriate: it gives insight into how many bits are needed to convey an independent variable over a communications channel. As the randomness of that variable increases (less innate information), the number of bits needed to convey that variable increases (an increase in bits to convey that randomness).

Information decreases (the innate information of the variable); entropy increases (the bits needed to express the variable). This relationship holds in communications and in biology.

Edited by zaius137, : No reason given.
Copyright 2001-2023 by EvC Forum, All Rights Reserved