Author Topic:   faith in evolution
Brad McFall
Member (Idle past 5055 days)
Posts: 3428
From: Ithaca, NY, USA
Joined: 12-20-2001


Message 16 of 32 (5223)
02-21-2002 11:27 AM
Reply to: Message 1 by one_god
10-26-2001 5:57 PM


Do you mean fancied quantum computers that don't exist, and whose construction, without close unity with the science of genetics, could potentially (in the worst case) cause the collapse of the/a food web?

This message is a reply to:
 Message 1 by one_god, posted 10-26-2001 5:57 PM one_god has not replied

Replies to this message:
 Message 17 by joz, posted 02-21-2002 11:38 AM Brad McFall has replied

  
joz
Inactive Member


Message 17 of 32 (5225)
02-21-2002 11:38 AM
Reply to: Message 16 by Brad McFall
02-21-2002 11:27 AM


quote:
Originally posted by Brad McFall:
Do you mean fancied quantum computers that don't exist, and whose construction, without close unity with the science of genetics, could potentially (in the worst case) cause the collapse of the/a food web?
Au contraire Brad, they do exist...
http://domino.research.ibm.com/comm/bios.nsf/pages/quantum.html
So what was that about the collapse of the/a food web?

This message is a reply to:
 Message 16 by Brad McFall, posted 02-21-2002 11:27 AM Brad McFall has replied

Replies to this message:
 Message 29 by Brad McFall, posted 04-08-2002 2:24 PM joz has not replied

  
toff
Inactive Member


Message 18 of 32 (5275)
02-22-2002 4:24 AM
Reply to: Message 15 by joz
02-21-2002 10:50 AM


quote:
Originally posted by joz:
While I disagree with his line of reasoning, John Searle's attacks on functionalism are clearly non-trivial....
You would do well to acquaint yourself with his concept of intentionality as expressed in his "Chinese room" argument.....

I am well acquainted with his 'Chinese room' argument. It is an excellent tool for examining the idea of intentionality. Unfortunately, it says nothing either way about whether the processes of the brain (i.e., human thought) are algorithmic or non-algorithmic.

This message is a reply to:
 Message 15 by joz, posted 02-21-2002 10:50 AM joz has not replied

Replies to this message:
 Message 19 by Malachi, posted 02-28-2002 3:52 AM toff has not replied

  
Malachi
Inactive Member


Message 19 of 32 (5771)
02-28-2002 3:52 AM
Reply to: Message 18 by toff
02-22-2002 4:24 AM


Human thought is not algorithmic in the way that computers are. Computer processors solve one problem at a time, in order. They only get things done by solving those simple problems very quickly. The human brain is a collection of millions of tiny processors (nerve cells) that operate algorithmically at the individual level. It is the interaction between these millions of individual processors that allows for what we perceive as human thought. Some experimental work is being done to create computers with many small, simple processors working together instead of the single super-fast processor that most modern computers use. This would be much more similar to the way the human brain functions than the traditional single-processor, algorithmic way that today's computers work.
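A minimal sketch in Python (my own illustration, not anything from the post above) of the idea of many small, simple processors working together: each "cell" only computes a trivial weighted-sum-and-threshold step, and it is the combined pattern of outputs, evaluated concurrently, that carries the result. The weights, inputs, and threshold are arbitrary assumptions.

from concurrent.futures import ThreadPoolExecutor

def cell(weights, inputs, threshold=1.0):
    # One "nerve cell": fires (1) if its weighted input exceeds a threshold, else 0.
    return 1 if sum(w * x for w, x in zip(weights, inputs)) > threshold else 0

def layer(all_weights, inputs):
    # Many simple cells evaluated together; the pattern of firings is the result.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(lambda w: cell(w, inputs), all_weights))

if __name__ == "__main__":
    inputs = [0.5, 1.0, 0.25]
    weights = [[1.0, 0.5, 0.0], [0.0, 2.0, 0.0], [0.1, 0.1, 0.1]]
    print(layer(weights, inputs))   # prints [0, 1, 0]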

This message is a reply to:
 Message 18 by toff, posted 02-22-2002 4:24 AM toff has not replied

Replies to this message:
 Message 21 by Peter, posted 03-11-2002 9:01 AM Malachi has not replied

  
Peter
Member (Idle past 1501 days)
Posts: 2161
From: Cambridgeshire, UK.
Joined: 02-05-2002


Message 20 of 32 (6553)
03-11-2002 8:54 AM
Reply to: Message 14 by toff
02-21-2002 10:10 AM


quote:
Originally posted by toff:
The brain is a physical organ. It cannot be either algorithmic or non-algorithmic. How it WORKS is either algorithmic or non-algorithmic. If that's what you meant, then I'd love to see whatever evidence you have that our mental processes (i.e., human thought) are not algorithmic. Nobody has ever been able to find any.
The question toff is asking is about human thought: algorithmic or not?
The responses about the brain are analogous to asking 'is a micro-processor algorithmic?' It isn't, in itself; if it's not doing anything it's idle ... like a dead brain.
While in use, do the thought processes exhibit algorithmic activity?
Please say how any processing could be other than algorithmic.

This message is a reply to:
 Message 14 by toff, posted 02-21-2002 10:10 AM toff has not replied

  
Peter
Member (Idle past 1501 days)
Posts: 2161
From: Cambridgeshire, UK.
Joined: 02-05-2002


Message 21 of 32 (6555)
03-11-2002 9:01 AM
Reply to: Message 19 by Malachi
02-28-2002 3:52 AM


quote:
Originally posted by Malachi:
Human thought is not algorithmic in the way that computers are. Computer processors solve one problem at a time, in order.

What about parallel processing? Multi-processor systems (even your PC or Mac) evaluate parts of the code in parallel and put the results together when required.
quote:
Originally posted by Malachi:

They only get things done by solving those simple problems very quickly.

Speed is irrelevant to a computer's problem-solving capability. The speed of a calculation is only relevant to the urgency with which the result is required.
quote:
Originally posted by Malachi:

The human brain is a collection of millions of tiny processors (nerve cells) that operate algorithmically at the individual level. It is the interaction between these millions of individual processors that allows for what we perceive as human thought.

Are you saying that each neuron is capable of processing data?
quote:
Originally posted by Malachi:

Some experimental work is being done to create computers with many small, simple processors working together instead of the single super-fast processor that most modern computers use. This would be much more similar to the way the human brain functions than the traditional single-processor, algorithmic way that today's computers work.

I think you don't know what algorithmic means.
There are plenty of multi-processor computer systems about ... even your traditional PC or Mac has an ALU, FPU, bus controller, etc., which are effectively separate but co-operating processors.
Regardless of the number of processing nodes, the process can be algorithmic ... you just have multi-processor algorithms.
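A minimal sketch in Python (my own illustration, not something from Peter's post) of what a multi-processor algorithm can look like: the data is split into chunks, each chunk is summed by a separate process, and the partial results are combined. The steps are fixed and the answer matches the single-processor version, so spreading the work over several processors does not stop it being an algorithm; the chunking scheme and worker count here are arbitrary assumptions.

from multiprocessing import Pool

def chunk_sum(chunk):
    # Each worker process handles one small, simple piece of the problem.
    return sum(chunk)

def parallel_sum(data, workers=4):
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with Pool(workers) as pool:
        return sum(pool.map(chunk_sum, chunks))   # combine the partial sums

if __name__ == "__main__":
    data = list(range(1_000_000))
    assert parallel_sum(data) == sum(data)        # same result as the serial version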

This message is a reply to:
 Message 19 by Malachi, posted 02-28-2002 3:52 AM Malachi has not replied

  
Darwin Storm
Inactive Member


Message 22 of 32 (7457)
03-21-2002 12:01 AM


The human mind works in a non-linear way. We multitask, and there are a myriad of things being processed, constantly. Computers (currently) are linear. Even a multitasking computer breaks up tasks (i.e., it processes a bit of one program, stops, works on a bit of another one, then comes back and repeats). Some new computers are parallel, but even at this level, they are still dividing tasks and working on them in a fairly linear fashion.
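A minimal sketch in Python (my own illustration; the task names and step counts are made up) of the time-slicing just described: a single processor runs a bit of one task, sets it aside, runs a bit of another, and keeps cycling until everything is finished.

from collections import deque

def task(name, steps):
    # A task that yields after each small step, so it can be paused and resumed.
    for i in range(steps):
        yield f"{name}: step {i + 1} of {steps}"

def round_robin(tasks):
    queue = deque(tasks)
    while queue:
        current = queue.popleft()
        try:
            print(next(current))     # run one slice of the current task
            queue.append(current)    # then send it to the back of the queue
        except StopIteration:
            pass                     # this task is finished; drop it

if __name__ == "__main__":
    round_robin([task("editor", 3), task("music player", 2), task("download", 4)])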
However, research is currently looking into computer versions of neural networks. It is quite possible that this avenue of computer development could lead to a sentient computer.
Some interesting sites:
http://hem.hj.se/~de96klda/NeuralNetworks.htm#1%20Purpose
http://vv.carleton.ca/~neil/neural/neuron.html
Here is a morality question. If we create sentient computers, would society give them rights? How would we treat them? Would we just create a new form of slavery? As an atheist, I don't believe in a soul. However, I have an underlying appreciation for sentience. If we are able to create a sentient computer, I believe we have a moral obligation to grant such a being the same rights we would accord a human. Just curious what others' views on this would be.

Replies to this message:
 Message 23 by joz, posted 03-21-2002 6:42 PM Darwin Storm has not replied
 Message 24 by Peter, posted 03-27-2002 7:20 AM Darwin Storm has replied
 Message 26 by bretheweb, posted 03-27-2002 11:36 AM Darwin Storm has not replied

  
joz
Inactive Member


Message 23 of 32 (7531)
03-21-2002 6:42 PM
Reply to: Message 22 by Darwin Storm
03-21-2002 12:01 AM


quote:
Originally posted by Darwin Storm:
Here is a morality question. If we create sentient computers, would society give them rights? How would we treat them? Would we just create a new form of slavery? As an atheist, I don't believe in a soul. However, I have an underlying appreciation for sentience. If we are able to create a sentient computer, I believe we have a moral obligation to grant such a being the same rights we would accord a human. Just curious what others' views on this would be.
That's just what used to disturb me about Asimov's robot stories and his "three laws of robotics": the fact that sentient entities were created with no motivation but to serve their creators struck me as being morally no better than slavery on the part of those concerned with their manufacture and use....
Probably why I enjoyed "That Thou Art Mindful of Him" so much. Rather than finding it disturbing, as so many seem to, I found the robots' rationalisation of themselves as human (thus the three laws of robotics become the three laws of humanics) and, by their mental and physical superiority, as a superior kind of human, comforting in the sense that by exercising their capacity for logic they freed themselves of their built-in servitude....
(though I'm not really that keen on a bunch of super metal mickeys taking over the world....)

This message is a reply to:
 Message 22 by Darwin Storm, posted 03-21-2002 12:01 AM Darwin Storm has not replied

  
Peter
Member (Idle past 1501 days)
Posts: 2161
From: Cambridgeshire, UK.
Joined: 02-05-2002


Message 24 of 32 (7898)
03-27-2002 7:20 AM
Reply to: Message 22 by Darwin Storm
03-21-2002 12:01 AM


quote:
Originally posted by Darwin Storm:

The human mind works in a non-linear way.

What do you mean by non-linear?
quote:
Originally posted by Darwin Storm:

We multitask, and there are a myriad of things being processed, constantly. Computers (currently) are linear. Even a multitasking computer breaks up tasks (i.e., it processes a bit of one program, stops, works on a bit of another one, then comes back and repeats).

There is no such thing as a multi-tasking computer. The operating
system running on a particular platform either multi-tasks or
it doesn't.
quote:
Originally posted by Darwin Storm:

Some new computers are parallel, but even at this level, they are still dividing tasks and working on them in a fairly linear fashion.

Parallel computing has been around for a few decades; I don't think that qualifies as new (except in geological terms).
... again, what do you mean by linear?
quote:
Originally posted by Darwin Storm:

However, research is currently looking into computer versions of neural networks. It is quite possible that this avenue of computer development could lead to a sentient computer.
Some interesting sites:
http://hem.hj.se/~de96klda/NeuralNetworks.htm#1%20Purpose
http://vv.carleton.ca/~neil/neural/neuron.html

Again, pretty old research ... I did neural nets in my undergraduate degree about twelve years ago ... and they weren't new then.
quote:
Originally posted by Darwin Storm:

Here is a morality question. If we create sentient computers, would society give them rights? How would we treat them? Would we just create a new form of slavery? As an atheist, I don't believe in a soul. However, I have an underlying appreciation for sentience. If we are able to create a sentient computer, I believe we have a moral obligation to grant such a being the same rights we would accord a human. Just curious what others' views on this would be.

If they are sentient they should have rights ... if they only have the appearance of sentience, then they are mechanisms and rights make no sense.
Given that the consensus of opinion on chimps and gorillas is that, no matter how well they can learn sign language, they are just dumb animals, I would guess governments would bend to political and economic pressures (as always).

This message is a reply to:
 Message 22 by Darwin Storm, posted 03-21-2002 12:01 AM Darwin Storm has replied

Replies to this message:
 Message 25 by Darwin Storm, posted 03-27-2002 8:14 AM Peter has replied

  
Darwin Storm
Inactive Member


Message 25 of 32 (7902)
03-27-2002 8:14 AM
Reply to: Message 24 by Peter
03-27-2002 7:20 AM


Sorry if the majority of my post was vague and out of date. My main point is that computers are limited in the number and types of tasks they can perform. The human mind is much more complex, and doesn't work on a binary system. Neural nets seem to be based on the concept of processing information in an artificial system in a way similar to the human mind. My point was that if we continue down that road, it is quite possible we could create a sentient computer based on neural computer technology. I admit that my information in this field is quite weak. If you do know more about neural nets, I would be most interested.

This message is a reply to:
 Message 24 by Peter, posted 03-27-2002 7:20 AM Peter has replied

Replies to this message:
 Message 27 by Peter, posted 04-08-2002 11:57 AM Darwin Storm has not replied

  
bretheweb
Inactive Member


Message 26 of 32 (7906)
03-27-2002 11:36 AM
Reply to: Message 22 by Darwin Storm
03-21-2002 12:01 AM


Darwin
quote:
Here is a morality question. If we create sentient computers, would society give them rights?
Question: How are you defining sentience?
It seems to me that our anthropomorphic tendencies preclude accepting "dumb animals" as sentient beings, so why would an artificial intelligence be any different?
quote:
How would we treat them?
No better or worse than we treat other humans, I'm certain.
quote:
Would we just create a new form of slavery?
Probably not, as an AI would be horribly expensive, don't you think?
The creation of an AI robot would require a return on its investment, though.
Indentured servitude?
quote:
As an atheist, I don't believe in a soul. However, I have an underlying appreciation for sentience. If we are able to create a sentient computer, I believe we have a moral obligation to grant such a being the same rights we would accord a human. Just curious what others' views on this would be.
How about if the definition of sentience included other primates?
Would you consider granting them the same rights?
There is a movement afoot in California to endow pets (i.e., cats, dogs, horses) with similar legal rights as a way of protecting them from cruelty.
Call me human-centric, but I'm disturbed by the legal ramifications of such an act.
Joz:
quote:
I found the robots' rationalisation of themselves as human (thus the three laws of robotics become the three laws of humanics) and, by their mental and physical superiority, as a superior kind of human, comforting in the sense that by exercising their capacity for logic they freed themselves of their built-in servitude....
Excellent.
The problem then becomes "how do you make *humans* follow such a code?"
Ha!
While we can define "human being" from a genetic perspective, i.e., a member of the species Homo sapiens sapiens, etc., how do we adequately define it from a social/legal perspective if the definition of sentience comes to include other primates or AIs?
brett

This message is a reply to:
 Message 22 by Darwin Storm, posted 03-21-2002 12:01 AM Darwin Storm has not replied

  
Peter
Member (Idle past 1501 days)
Posts: 2161
From: Cambridgeshire, UK.
Joined: 02-05-2002


Message 27 of 32 (8324)
04-08-2002 11:57 AM
Reply to: Message 25 by Darwin Storm
03-27-2002 8:14 AM


quote:
Originally posted by Darwin Storm:

Sorry if the majority of my post was vague and out of date.

No problemo. Sorry if I was a little curt in that reply though.
quote:
Originally posted by Darwin Storm:

My main point is that computers are limited in the number and types of tasks they can perform. The human mind is much more complex, and doesn't work on a binary system.

I'm pretty sure that brains are limited in that way too. I read somewhere long ago that the conscious brain cannot process more than 7 or 8 things at once ... not sure how that was found or where I read it ... but there are limits on what a human can think about in one go.
That's not to be confused with simultaneously thinking, breathing, walking, etc. That's analogous to a modern car or airliner, with a number of separate control units responsible for different independent tasks.
I agree that the human mind is more complex ... but then it has far more component parts in the mechanism that generates it.
Not really sure how the underlying notation of the thoughts/calculations bears on the complexity of the processing.
quote:
Originally posted by Darwin Storm:

Neural nets seem to be based on the concept of processing information in an artificial system in a way similar to the human mind. My point was that if we continue down that road, it is quite possible we could create a sentient computer based on neural computer technology. I admit that my information in this field is quite weak. If you do know more about neural nets, I would be most interested.

Not really.
Neural nets are based on an assumption about human thought which stems from the brain's structure as neurons, etc.
For an artificial neural network to do anything it has to be trained first using a known input/output set. It can then produce consistent results for inputs which it has not seen before.
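A minimal sketch in Python (my own illustration, not anything Peter posted) of that train-then-generalise behaviour: a single artificial neuron is trained on a known input/output set (here the logical AND of two inputs) and afterwards gives consistent results, including for an input it never saw during training.

def predict(w, b, x1, x2):
    # Fire (1) if the weighted sum of the inputs clears the bias, otherwise 0.
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

def train(examples, epochs=20):
    w, b = [0, 0], 0
    for _ in range(epochs):
        for (x1, x2), target in examples:
            err = target - predict(w, b, x1, x2)
            w[0] += err * x1      # nudge the weights and bias toward the
            w[1] += err * x2      # known input/output pairs
            b += err
    return w, b

if __name__ == "__main__":
    known = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
    w, b = train(known)
    print([predict(w, b, x1, x2) for (x1, x2), _ in known])   # [0, 0, 0, 1]
    print(predict(w, b, 0.9, 0.9))                            # 1 -- an input it never saw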
My personal feeling is that that's not really how human brains operate.
The main question I had about HUMAN THOUGHT, though, was in what way any kind of processing can be NON-ALGORITHMIC. To calculate anything we must have some process which is undertaken in order to reach a result ... an algorithm.

This message is a reply to:
 Message 25 by Darwin Storm, posted 03-27-2002 8:14 AM Darwin Storm has not replied

Replies to this message:
 Message 28 by Mister Pamboli, posted 04-08-2002 12:26 PM Peter has not replied

  
Mister Pamboli
Member (Idle past 7599 days)
Posts: 634
From: Washington, USA
Joined: 12-10-2001


Message 28 of 32 (8328)
04-08-2002 12:26 PM
Reply to: Message 27 by Peter
04-08-2002 11:57 AM


quote:
Originally posted by Peter:
The main question I had about HUMAN THOUGHT, though, was in what way any kind of processing can be NON-ALGORITHMIC. To calculate anything we must have some process which is undertaken in order to reach a result ... an algorithm.

Not necessarily. Have you read Roger Penrose's book "The Emperor's New Mind" on this very subject? Not that I'm endorsing his conclusions, but I think he demonstrates quite effectively that non-algorithmic thought is a credible model of consciousness. Well worth a read.
You will find, or probably already know, that this is an area of intense (and frequently vicious) controversy. All good fun.

This message is a reply to:
 Message 27 by Peter, posted 04-08-2002 11:57 AM Peter has not replied

  
Brad McFall
Member (Idle past 5055 days)
Posts: 3428
From: Ithaca, NY, USA
Joined: 12-20-2001


Message 29 of 32 (8334)
04-08-2002 2:24 PM
Reply to: Message 17 by joz
02-21-2002 11:38 AM


I had posted this question about collapse on a nano-magazine discussion site; I have not looked back lately, but when I last checked there was no response. I know that there is much work being done in nano-science, but the surface, if I may use the word, does not seem to be commensurable with the surfaces I am aware of in ecological theory, short of lots of pits I would guess none of us could "see". I was actually thinking of a statement by Feynman that I had read years ago, before this nano work picked up steam.
The collapse MAY occur because this is unlike simply adding new hardware to society in the sense of dumping additively cumulative wastes. If one of the nano-bot computers actually builds fogs whose surfaces interact with hidden microbial players whose numbers we do not know, they could, like Becker's copper for a salamander, be a barrier to reproductive, multiplicative increase, out-competing in an exponential way. Unlike simply killing off a species or two (or lots of them), as we now do with technology, a whole ecology could be wiped out, leaving man running for his life to the stars. That would be a bleak future indeed, one I wish not to bequeath to the next generation.

This message is a reply to:
 Message 17 by joz, posted 02-21-2002 11:38 AM joz has not replied

Replies to this message:
 Message 30 by Quetzal, posted 04-09-2002 5:42 AM Brad McFall has replied

  
Quetzal
Member (Idle past 5894 days)
Posts: 3228
Joined: 01-09-2002


Message 30 of 32 (8372)
04-09-2002 5:42 AM
Reply to: Message 29 by Brad McFall
04-08-2002 2:24 PM


Aww, c'mon Brad. You're just railing against the next logical evolutionary step. So might the dinosaurs have railed against the future takeover by those horrible mammals if they had been capable of thinking about it. You're only upset 'cause we're doing the creating this time around.
Damn, I better check my medication levels. I actually understood that last post...
[This message has been edited by Quetzal, 04-09-2002]

This message is a reply to:
 Message 29 by Brad McFall, posted 04-08-2002 2:24 PM Brad McFall has replied

Replies to this message:
 Message 31 by Brad McFall, posted 04-09-2002 11:22 AM Quetzal has replied

  