There's no evidence of anything left over that we need the "spirit" to explain.
Whilst I'm in no sense a dualist, I don't think we can be quite as flippant as that. We are still no closer to explaining the subjective experience of consciousness. We have no idea whatsoever what mechanism could create it, nor any adaptive explanation for its purpose.
I've always thought that maybe the only way a brain could deal with reality is to model it (this would be our experience) and react to the model (this would be our volition).
This would be our consciousness. A bit like running virtual weather patterns and making decisions based on the model.
Perhaps the approximate model we react to is close enough to reality to keep us functioning, but without overburdening us with 'too much reality'.
I can't support this in any way, however.
My problem with this sort of explanation is that I'm not clear why you'd need consciousness for this to work. We can create computer programs which model outcomes and then make decisions based on this - like game playing programs, for instance. As far as we know, this doesn't require the computer to have any conscious awareness of what's happening.
If the self is wholly a product of the physical interactions within the brain, then the chemical and electrical interactions producing the model would work just the same without any subjective experience of them. When Deep Blue plays somebody at chess, it can run through a potential variation and 'decide' to reject a move as flawed without any conscious mind reviewing the process.
I'm not sure if I'm expressing myself very well here.
It seems self-evident to me. Our consciousness has allowed us to out-compete pretty much everything on the planet and could take us to others. Surely consciousness - along with intelligence, our ability to plan ahead, empathise, speak and understand others and to put ourselves in their heads - is such an obvious competitive advantage it barely needs thinking about? (Even if it does simply emerge from the development of another useful feature such as language.)
Unless I'm missing something, there's a gap in your line of reasoning.
If I've understood what you've posted so far, you share the view that your mind is wholly the product of the physical interactions within your brain. If we had access to sufficient knowledge and computational power, we would be able to explain thought processes and behaviour purely by referring to the chemical and electrical behaviour of the brain. All the planning, empathising, and computing which you engage in is produced purely by these deterministic processes.
Given that the work needed to produce human behaviour, with its competitive advantages, is all done by these physical processes, why do we need to be aware of doing it? Why do we need to feel the urge to do something, when what we're arguing is that both the feeling and the reaction to it are produced purely by the physical nature of our brains?
If the mind is not some spirit responding to urges and emotions originating in the physical matter of our brain, then it's unclear to me why there should be the perception of anything responding.
Nope, I'm not getting this. There is an old argument that consciousness is not needed to do what we do, i.e. that there's no competitive advantage to being conscious. But that's just silly - does anyone think it's possible to build a hospital without consciousness (with all that entails)?
But what is it that you're implying that consciousness actually does?
Let me try a different approach. Imagine a robot that had been programmed to behave like a human. It could receive the same external stimuli as we do, and had adaptive programming which could respond to them. It was programmed with certain drives and tendencies to mimic our instincts and biological drives. The programme could learn, analyse cost-benefit scenarios, and perform calculations. It behaves exactly the same as a human - it can communicate, it can plan rationally, and it can have irrational emotional responses, since these are included in its programming. Being a computer, however, at no point would it have any awareness of its behaviour or of the calculations being performed in its electronic brain.
In what way would the behaviour of this robot differ from the behaviour of a human? If it wouldn't, then what is the adaptive basis for consciousness?
The point is not about the stunning technical achievement involved. The point of bringing up the robot was to make clear that all the processes involved in creating our behaviour are, by Tangle's own argument, purely physical processes. They're the complex results of interactions within the brain.
What is consciousness supposed to be doing, exactly? If we're abandoning the concept of a spirit, or a soul, then there is no mechanism whereby your consciousness decides something and influences the physical processes which produce your behaviour. Your behaviour and, presumably, your consciousness are produced by deterministic physical processes in your brain - consciousness does not direct these processes.
This is what I mean when asking what purpose consciousness could possibly be serving.
I find it difficult to believe that drunk driving is completely a strict liability crime in the UK, although I may be wrong.
Driving offences usually are strict liability in the UK. Others include firearm possession, which has led to some odd cases: someone being convicted of possessing a firearm despite the fact that he could not reasonably have been expected to know it was in his possession (it was in a bag), and a more recent case of someone being prosecuted for bringing into a police station a gun he claimed to have found in a park.
ABE: And I thought the consciousness discussion was drifting off topic!