Topic: The Future of Artificial Intelligence: Can machines become sentient (self-aware)?
nwr (Member, Posts: 6409, From: Geneva, Illinois, Member Rating: 5.3)
DevilsAdvocate writes:
    "Some humans also said we would never fly, go into outer space, etc, etc. But we have exceeded even our wildest expectations."

We still cannot fly. We had to change the meaning of "fly" to something that we can do before we were able to fly. In the old sense, flying the way birds do it, we cannot do that. Can we change the meaning of "sentience" to something that computers can do? Presumably we can, if we find that useful.

Let me respond in the form of a few questions:

1: Could we, in principle, build an artificially sentient system? My answer: yes, sure. I don't see anything magical going on.

2: Is that principle computation? My answer: no. I might be in a small minority there, though I sometimes suspect that the majority of mathematicians and computer scientists are actually very skeptical of AI but choose not to engage in the public debates.

3: Is that principle intentionality (the issue that John Searle attempted to raise in his "Chinese Room" argument)? My answer: no, though that might at least vaguely point in the right direction. With the last two answers, I am probably a minority of one.

4: Is it even worth doing? My answer: no. If it were easy to do, it would be worth doing for what we would learn in the attempt. However, it is going to turn out to be very hard, perhaps prohibitively hard, so there is no real payoff in building an artificially sentient being. Besides, the old-fashioned way is more fun.

DevilsAdvocate writes:
    "Never say never."

I have at least given a bit more detail of my thinking above. AI, as currently done, is mostly an attempt to automate epistemology (the "theory of knowledge" from philosophy). Epistemology is mostly nonsense.
Taq (Member, Posts: 10038, Member Rating: 5.3)
quote:
    "We are talking about tens of billions of neurons within the human brain, each with several thousand synapses (biochemical connections, equivalent to electronic switches or transistors) firing near simultaneously (yes, I know they do not all fire at exactly the same time). On top of that, these connections can change over time, so that some patterns are reinforced while others atrophy. Neurons themselves can change the threshold needed for them to fire. The brain physically adapts to input."

If we are going to copy the human brain for AI, then we need a system that can physically change over time in response to input. One interesting system is the field-programmable gate array (FPGA). This processor allows the programmer to change the actual wiring in the chip. You can actually evolve functional arrays by randomly changing the connections and selecting for function.

This type of setup has always intrigued me. It even seems to have analogy-like features:

quote:
    "In fact, out of the 37 logic gates the final product uses, five of them are not even connected to the rest of the circuit in any way - yet if their power supply is removed, the circuit stops working. It seems that evolution has exploited some subtle electromagnetic effect of these cells to come up with its solution."
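The random-change-and-select loop described for the evolved circuit can be mimicked in software with a toy genetic algorithm. This is only a sketch (plain Python, all names hypothetical, no real FPGA involved): it evolves a 4-entry truth table toward XOR by keeping the fittest candidates and refilling the population with mutated copies.

```python
import random

TARGET = [a ^ b for a in (0, 1) for b in (0, 1)]  # truth table for XOR

def fitness(table):
    """Count how many input combinations the candidate gets right."""
    return sum(t == g for t, g in zip(table, TARGET))

def mutate(table, rate=0.25):
    """Flip each output bit with a small probability."""
    return [bit ^ (random.random() < rate) for bit in table]

def evolve(pop_size=20, generations=200):
    population = [[random.randint(0, 1) for _ in range(4)] for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        if fitness(population[0]) == 4:
            break  # a perfect circuit has been found
        # keep the fittest half, refill with mutated copies of survivors
        survivors = population[: pop_size // 2]
        population = survivors + [
            mutate(random.choice(survivors))
            for _ in range(pop_size - len(survivors))
        ]
    return population[0]

best = evolve()
print(best, fitness(best))
```

The real hardware experiments work on circuit wiring rather than truth tables, and fitness is measured on the physical chip, which is how the unconnected-but-necessary gates arose.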
caffeine (Member, idle past 1045 days, Posts: 1800, From: Prague, Czech Republic)
quote:
    "By sentience, I think we can all agree we are talking human-like sentience. The ability to contemplate one's self and the ability to increase one's knowledge base, both on an individual level and collectively. Culture (the accumulation of moral and social norms) and science (the expanded accumulation of knowledge of ourselves and the universe around us) are only achievable at this level of sentience."

All well and good, but I wasn't saying we didn't agree on what sort of sentience we're discussing. Having agreed that we're talking about human-level sentience, we still have no idea how this comes about. How can we know whether we'll be able to artificially reproduce it when we don't know how or why it happens naturally?
New Cat's Eye (Inactive Member)
I think the breakthrough will come through recursive simulation of a very highly resolved complex system: one that takes into account all of the possible combinations of the subunits, then sorts against a criterion that improves the simulation.
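One way to read "simulate, then sort against a criterion that improves the simulation" is as a recursive refine-and-select search: score a handful of candidate configurations, keep the best, and recurse on a finer-grained neighbourhood around it. A minimal sketch, with a one-dimensional search space and an assumed scoring function standing in for "the simulation" (everything here is a hypothetical toy):

```python
def refine(score, lo, hi, depth=8, samples=5):
    """Recursively zoom in on the best-scoring region of [lo, hi]."""
    if depth == 0:
        return (lo + hi) / 2
    step = (hi - lo) / samples
    # evenly spaced candidate configurations across the current interval
    candidates = [lo + step * (i + 0.5) for i in range(samples)]
    best = max(candidates, key=score)  # "sort against a criterion"
    # recurse at higher resolution around the winner
    return refine(score, best - step / 2, best + step / 2, depth - 1, samples)

# toy criterion: closeness to an assumed optimum at 0.37
peak = refine(lambda x: -abs(x - 0.37), 0.0, 1.0)
print(peak)  # converges toward 0.37
```

Each level of recursion shrinks the interval by the sampling factor, so resolution grows exponentially with depth; real systems would have vastly larger configuration spaces, which is where the "prohibitively hard" objection bites.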
caffeine (Member, idle past 1045 days, Posts: 1800, From: Prague, Czech Republic)
quote:
    "I think we need better definitions of obey and disobey to go down this path. Your examples are flawed in the sense that they define 'disobey' as being materially incapable of performing the command. When a computer tells you 'Illegal Operation' or something equivalent, it is very simply a deterministic result of being physically incapable of doing what you told it. There is very literally no possible way the electrons can flow down the wires of the integrated circuit in the exact pattern you specified."

Which is kind of the point I was trying to make. When a human refuses to do something based on self-interest, or for any other reason for that matter, there's no reason to assume this isn't simply a deterministic product of the wiring of our brains.
DevilsAdvocate (Member, idle past 3122 days, Posts: 1548)
Ok, I accept the argument that it would be prohibitively difficult (though not impossible) to accomplish this feat in the near future (say 20-50 years down the road, given our current technological and political status). However, let's say we did create a human-like sentient machine 50 years down the road. How do you think this would play into our ideas about religion and morality?

I am curious about this on a religious level. If a machine could make independent, rational and sometimes moral decisions, would Christians and other religious people think that these sentient machines would need to be saved? Would they think they are capable or worthy of being saved? Would they think they are moral agents, or just imitations of God's creation by disobedient humans? I thought this would be an interesting discussion as well.

One of the saddest lessons of history is this: If we've been bamboozled long enough, we tend to reject any evidence of the bamboozle. We're no longer interested in finding out the truth. The bamboozle has captured us. It is simply too painful to acknowledge -- even to ourselves -- that we've been so credulous. - Carl Sagan, The Fine Art of Baloney Detection

"You can't convince a believer of anything; for their belief is not based on evidence, it's based on a deep seated need to believe." - Carl Sagan

"It is far better to grasp the Universe as it really is than to persist in delusion, however satisfying and reassuring." - Carl Sagan, The Demon-Haunted World
DevilsAdvocate (Member, idle past 3122 days, Posts: 1548)
caffeine writes:
    "Having agreed that we're talking about human-level sentience, we still have no idea how this comes about. How can we know whether we'll be able to artificially reproduce it when we don't know how or why it happens naturally?"

Good answer. I wonder whether our level of sentience is a natural byproduct of evolution, and how common it is. Considering we only know about life on this planet, this is currently an unanswerable question. However, the question arises: if other species (dolphins, elephants, etc.) were able to evolve without the impediment of human beings, would it be natural for them to evolve a more sentient level of cognition?

Also, can we develop a machine mind that mimics the natural evolution of the biological mind? If so, then this may be the key process that will help synthesize a human-like sentient machine mind. Who knows, but it is an interesting discussion.
nwr (Member, Posts: 6409, From: Geneva, Illinois, Member Rating: 5.3)
DevilsAdvocate writes:
    "However, let's say we did create a human-like sentient machine 50 years down the road. How do you think this would play into our idea about religion and morality?"

There would be a lot of anger from the religious right, perhaps even an attempt to assassinate the scientists involved. It would call into question their idea of a spiritual soul. Beyond that, it gets pretty hard to guess what would result.
CosmicChimp (Member, Posts: 311, From: Muenchen Bayern Deutschland)
Which big word did you not understand (or bother to look up)? Also, your name catsci isn't listed over at the gameknot.com chess site.
Edited by CosmicChimp: clarity
Dr Jack (Member, Posts: 3514, From: Immigrant in the land of Deutsch, Member Rating: 8.4)
Taq writes:
    "One interesting system is the field-programmable gate array. This processor allows the programmer to change the actual wiring in the chip. You can actually evolve functional arrays through randomly changing the connections and selecting for functions."

You realise that this, like neural nets, can achieve exactly nothing that conventional hardware can't, right?
New Cat's Eye (Inactive Member)
CosmicChimp writes:
    "Which big word did you not understand (or bother to look up)?"

What's a recursive simulation? What makes a complex system 'very highly resolved'? What are the subunits? How are they 'combined'? Where does the criterion for improvement come from? I don't know shit about this shit.

Okay, I did a quick google on "recursive simulation" and found:

quote:

How's that work? Do they "layer" the simulations, or are they more like "side by side"? I took one computer science course and just hated it. On top of that, my stuff never compiled well.

CosmicChimp writes:
    "Also, your name catsci isn't listed over at the gameknot.com chess site."

They musta ousted me from lack of participation. I guess I'll make another one. But I didn't like getting my ass handed to me in chess. Also, I've been playing this goofy browser-based RPG. Although, the good thing about those chess games is that the timeframe is all up to you.
Taq (Member, Posts: 10038, Member Rating: 5.3)
Dr Jack writes:
    "You realise that this, like neural nets, can achieve exactly nothing that conventional hardware can't, right?"

The part that interests me is the ability to rewire the processor on the fly. This is something the brain does as well. As far as I know, the CPU in your standard home PC does not do this.
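The reinforce-and-atrophy behaviour mentioned earlier in the thread can be illustrated with a toy Hebbian-style update: connections between co-active units are strengthened, while unused ones slowly decay. This is a minimal sketch with hypothetical names and constants, not a model of any real chip or brain:

```python
def hebbian_step(weights, pre, post, rate=0.1, decay=0.01):
    """Strengthen connections whose endpoints fire together; let the rest atrophy."""
    return [
        w + rate * a * b - decay * w  # reinforcement minus slow decay
        for w, a, b in zip(weights, pre, post)
    ]

weights = [0.5] * 4          # four connections, equal initial strength
for _ in range(100):
    pre = [1, 1, 0, 0]       # activity on the input side of each connection
    post = [1, 0, 0, 1]      # activity on the output side
    weights = hebbian_step(weights, pre, post)

print([round(w, 2) for w in weights])
```

Only the first connection has both endpoints active, so it grows while the other three decay toward zero; the "wiring" ends up shaped by the input history rather than by a fixed program, which is the property conventional CPUs lack.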
nwr (Member, Posts: 6409, From: Geneva, Illinois, Member Rating: 5.3)
DevilsAdvocate writes:
    "I wonder whether our level of sentience is a natural byproduct of evolution ..."

Yes, it is. That's just my opinion, of course; it is actually a controversial issue. Some people believe it is an epiphenomenon (a mere side effect of no practical use).

DevilsAdvocate writes:
    "... and how common it is."

That's a lot harder to say. We can't even compare two humans, so how could we compare a human and a whale?

DevilsAdvocate writes:
    "However, the question arises: if other species (dolphins, elephants, etc.) were able to evolve without the impediment of human beings, would it be natural for them to evolve a more sentient level of cognition?"

IMO, sentience is not an end in itself. Rather, it is part of the way we function. What is distinct about humans, say compared to elephants, is the extent to which we form large interactive social groups. A substantial part of our cognitive abilities are a component of that social adaptation.
New Cat's Eye (Inactive Member)
nwr writes:
    "IMO, sentience is not an end in itself. Rather, it is part of the way we function. What is distinct about humans, say compared to elephants, is the extent to which we form large interactive social groups. A substantial part of our cognitive abilities are a component of that social adaptation."

Seems to me like you'd need the sentience to form the sufficient social grouping, and vice versa. A catch-22. But once the ball started rolling, they'd feed off each other, and you'd get the snowball effect that we seem to be the result of.
DevilsAdvocate (Member, idle past 3122 days, Posts: 1548)
nwr writes:
    "Yes, it is. That's just my opinion, of course. It is actually a controversial issue. Some people believe it is an epiphenomenon (a mere side effect of no practical use)."

I agree that sentience is a by-product of the evolution of the brain; however, I do think self-awareness definitely has an effect on our own evolution, both past and present. Without sentience we would have no science, culture, etc.

nwr writes:
    "That's a lot harder to say. We can't even compare two humans, so how could we compare a human and a whale?"

True. It is a very subjective and hard-to-grasp concept.

nwr writes:
    "IMO, sentience is not an end in itself. Rather, it is part of the way we function."

Agreed.

nwr writes:
    "What is distinct about humans, say compared to elephants, is the extent to which we form large interactive social groups."

Aka 'culture', though some higher-intelligence animals have some rudimentary forms of this. Basically, accumulated extrasomatic knowledge passed down from generation to generation.
|
|
|
Do Nothing Button
Copyright 2001-2023 by EvC Forum, All Rights Reserved