OK, we now have three very intelligent people discussing how to measure something. Would any of you care to define what it is you are trying to measure? A functional definition will suffice for a start, as in "consciousness is that which....."
that which: "establishes the communion between the self and its environment."
If your scientific model has no room for a soul of this kind in the brain, then consciousness can only occur if it is located somewhere outside the brain.
That is why it seems so rational to look for consciousness outside of the brain, but it is also still rational to look for it inside the brain, just so long as you're prepared to look for something that's actually capable of being tortured.
...are you asserting that Ms X did not have a soul?
I never said there was no room inside the brain for what you term the soul. So that leaves us with the question I already asked you: if not in the brain, where would you suggest we find this so-called "soul"?
I do not agree that it is rational to find consciousness outside of the brain. But as you have conceded, "it is also still rational to look for it inside the brain".
This has been my issue with this whole argument from the outset. You seemed to insinuate that there wasn't room in the brain for the fullness of consciousness.
And one or two others here have tried to imply that this consciousness lives on even after the physical brain has died. And the point about zombies gives no support for that notion either, because being a zombie doesn't eliminate the brain, which is still alive even though it's in a zombie's head.
And don't start bringing up near death experiences as evidence for the survival of the consciousness. They call it NEAR death for a good reason.
I assumed that you'd read the early pages of the Don's thread and would have understood my position. I didn't want to attack your position but merely defend the idea of looking for consciousness outside of the brain - I suspect that this universe is virtual and that it is likely to be incapable of hosting sentience as a result.
I must apologize for assuming too much, and for not giving enough time and effort to read that thread from start to finish. But in my own defense, it became very evident from reading several of Don's posts that most of them were only boring repetitions of his previous posts. So in my laziness and boredom I really didn't care about wasting time reading any more of his crap than I had to.
I will confess after reading your clarifications on the subject that I find much more agreement with you than I do with Don.
Nevertheless, the one issue I still disagree with you on is a viable consciousness outside the brain. I am willing to overlook that and submit that we can agree to disagree in a friendly manner.
However, finding any sort of cordial arrangement with Don has become impossible. His insults, calling some of us swine and monkeys, have him looking and sounding like a simple brat. I have no use for that sort of attitude, or for his unwillingness to discuss calmly; he only wants to argue his points as if nobody else is smart enough to understand his brilliance. Nothing but a waste!
The real trick is to starve a thread like that instead of feeding it. The level of attention he's getting is a substantial reward as it boosts his status - he is serving as some kind of teacher handing out reading material for the class to work through, but the quality of most of it is either shoddy or out of date. I don't understand why people are letting him manipulate them in that way when they could find far better things to read on the subject by themselves.
If this is a virtual universe, what is it a model of?
If it is an adequate model, then it should replicate or simulate all the characteristics of a real one.
If it is not an adequate model, what is its purpose, ...
...and why go looking for simulations that you know are absent?
If you create a virtual world for people to think they live in where they can have fun that they couldn't have had so conveniently (if at all) within reality, that's all the purpose you need. The model's functional incompleteness is not a barrier to it being useful in this way because the things it cannot handle can sit outside of it.
About defining Consciousness
Early working definitions always seem to be somewhat functional, and don't always hold in both directions. In other words, "Something that is conscious must be able to do XYZ, but something that does XYZ may not be conscious." Is that sort of definition "a start", or a failure?
So some malevolent being has created a model of the real universe just so that people can suffer and die, eh? Or is the real universe even more unpleasant than the world we think we live in?
About defining Consciousness
Definitions are generally brief, yet somehow must contain the elements that are necessary and sufficient. Consciousness appears to be very complex and multi-faceted, and even leaving aside its unknown aspects, it is difficult to sum up with a definition. The list grew longer and longer when I tried to write down what I thought were key elements: sensation, awareness, self-awareness, memory, intelligence, learning, creativity, problem solving, choice or volition, emotion, integrated information, symbols, qualia, attention switching, and possessing Theory of Mind – that is, the ability to imagine or attribute the same qualities to another animal that one believes is also conscious, and adopt their point of view. (The last one might not seem that important, or might be a consequence of the others, but if consciousness developed to foster social functioning, I suggest that empathy or the ability to alter point of view is important.)
Cooper might argue that computers can do many of these things, often better than we can, so consciousness must be something "else".
At the same time, it’s hard to conceive of consciousness functioning without, for example, memory.
Perhaps memory or intelligence is necessary but not sufficient, ...
... the same way ability to replicate, or respond to stimuli, is part of the definition of life but not enough.
But I digress. My point is that both objectively and subjectively, there appear to be levels or degrees of consciousness. How do you define something that is not a single entity but occurs across a broad spectrum? Neurologists often say "More is different," but AI people seem to say, "No, different is different," and nothing "emerges." Or perhaps what's missing is some key element that turns information into active experience. I can't decide.
Not a lot of progress there, because sentience is equally undefined (except possibly as consciousness).
If you want your security light to be sentient, presumably you want it to decide whether the moving target is a threat, based on previous knowledge of the general characteristics of a threat, or the absence of characteristics of a friend. Either way you are simply adding learning and a statistical algorithm, so you have to look at something you call a sentient sentinel and ask how it (or he) acts and thinks to determine the intentions of an approaching object. Then I guess you would distinguish between a human that makes some kind of instinctive guess and a machine that sticks to rigid or neural rules. But the problem then becomes that you are defining sentience or consciousness as nothing more than fallibility.
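The "learning plus a statistical algorithm" idea above can be made concrete with a toy sketch. This is purely illustrative, not a claim about how real security systems work: the feature names, example data, and nearest-centroid rule are all hypothetical choices, picked only to show that "learned characteristics of a threat vs. a friend" reduces to a simple statistical comparison.

```python
# Toy "sentient sentinel": label an approaching object as friend or threat
# by comparing it against previously observed examples of each, using a
# nearest-centroid rule. All features and data here are hypothetical.

def centroid(samples):
    """Mean value of each feature across the observed samples."""
    n = len(samples)
    return [sum(s[i] for s in samples) / n for i in range(len(samples[0]))]

def distance_sq(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def classify(observation, friend_samples, threat_samples):
    """Label the observation by whichever learned centroid it sits closer to."""
    f = centroid(friend_samples)
    t = centroid(threat_samples)
    return "friend" if distance_sq(observation, f) <= distance_sq(observation, t) else "threat"

# Hypothetical features: [approach speed (m/s), height (m), hour of day (0-24)]
friends = [[1.0, 1.7, 18], [1.2, 1.6, 17]]
threats = [[3.5, 1.8, 2], [4.0, 1.9, 3]]

print(classify([3.8, 1.8, 1], friends, threats))  # -> "threat"
```

The point of the sketch is the last paragraph's worry in miniature: the sentinel never "decides" anything, it just measures distances to past data, and the only way it differs from a human guesser is in whether the rule is rigid or fallible.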