The Naked Scientists

The Naked Scientists Forum

Author Topic: Quantifying Consciousness  (Read 14080 times)

Offline David Cooper

  • Neilep Level Member
  • ******
  • Posts: 1505
    • View Profile
Re: Quantifying Consciousness
« Reply #50 on: 11/01/2014 20:19:42 »
Pity about that. I can define a cow in such a way that a Martian could recognise a cow and a non-cow, and I can define a colour by example.

It's easy to describe something external, including an external colour, but you can't point to an internal quale representing a colour by example, as you doubtless know.

Quote
Even such abstractions as energy and entropy are definable such that we both know what the other is talking about, and when we measure energy or calculate entropy, we both get the same number. But consciousness or sentience seems to defeat the definitive powers of those who discuss it, which makes quantification doubly impossible.

The whole field is deeply unrewarding, with every little bit of progress taking us further away from where we want it to go. It's like jumping into a black hole to study what happens inside it.
 

Offline cheryl j

  • Neilep Level Member
  • ******
  • Posts: 1460
  • Thanked: 1 times
    • View Profile
Re: Quantifying Consciousness
« Reply #51 on: 12/01/2014 00:50:16 »
Pity about that. I can define a cow in such a way that a Martian could recognise a cow and a non-cow,
I’m not sure that is actually true. The Martian, if his consciousness is like ours, might only be able to recognize non-cows.
Ramachandran discusses a patient, John, a former fighter pilot, who had a stroke when a blood clot related to surgery for appendicitis clogged one of his cerebral arteries. He lost the ability to recognize familiar objects. He couldn’t recognize his wife’s face, but he recognized her voice. He couldn’t recognize his own face, but when looking in a mirror, said it must be him, because it moved when he did. Although he couldn’t recognize objects, he could deal with their spatial extent, dimensions, and movement. He could trim the hedge in his yard and make it nice and even. When shown a picture of a carrot, he said “It’s a long thing with a tuft at the end – a paint brush?” Nevertheless, he wrote this description of a carrot:
“A carrot is a root vegetable cultivated and eaten as human consumption worldwide. Grown from seed as an annual crop, the carrot produces long thin leaves growing from a root head. This is deep growing and large in comparison with the leaf growth, sometimes gaining a length of twelve inches under a leaf top of similar height when grown in good soil. Carrots may be eaten raw or cooked and can be harvested during any size or state of growth. The general shape of a carrot is an elongated cone, and its color ranges from red to yellow.”

John’s paragraph is impressive. It shows his brain has a lot of information associated with carrots. What’s more, some of that information relates to visual aspects – size, color, shape. But that information didn’t seem to help him recognize a carrot. I would have thought that once shown a carrot, with someone saying “this is the thing you just described,” that would be all it might take. But Ramachandran said he never regained the ability to connect the two things. Ramachandran’s story about John goes on for several pages and gets even weirder.



Quote

and I can define a colour by example.


That’s the trouble with qualia. Even if a quale is just a “symbol” for something else – red is the symbol in our brains for certain wavelengths in a spectrum of electromagnetic radiation – there is no other symbol that exactly replicates that symbol or what it symbolizes. Sometimes I wonder if the problem with qualia isn’t sentience or consciousness but the fact that there is no way to translate red into words, another symbol, that exactly describes or reproduces red in another brain. On the other hand, if emotions are qualia, and a person is angry, he can insult you and generate that quale inside of you, or describe some injustice done to him that makes you feel angry as well. I can make your neurons do what mine are doing, or something pretty close, when it comes to anger but not red.

I must admit, though, I get very confused when it comes to brains and computers, about the difference between input-output functions versus copying or translating information. I often think I am confusing one with the other. I would appreciate any help.
« Last Edit: 12/01/2014 01:57:27 by cheryl j »
 

Offline alancalverd

  • Global Moderator
  • Neilep Level Member
  • *****
  • Posts: 4716
  • Thanked: 154 times
  • life is too short to drink instant coffee
    • View Profile
Re: Quantifying Consciousness
« Reply #52 on: 12/01/2014 02:13:02 »
For most of us, visual information and recognition dominates our senses because the input bandwidth is enormous and tightly correlated with our movements, but I can imagine a system where the connection between the visual processor and learned information gets broken. It doesn't surprise me that the connection is fragile since the visual processor has to act very quickly and not retain previous data (otherwise moving objects would appear blurred) so there must be some filter between sight and medium- or long-term memory.

Not sure how this relates to consciousness, though. Presumably John was able to locate and pick up - and eat - a carrot, so in at least one sense he was conscious of it and his relationship to it.  Given his textbook-accurate description, could he draw a carrot, I wonder? Fascinating case.     
 

Offline David Cooper

  • Neilep Level Member
  • ******
  • Posts: 1505
    • View Profile
Re: Quantifying Consciousness
« Reply #53 on: 12/01/2014 22:25:29 »
On the other hand, if emotions are qualia, and a person is angry, he can insult you and generate that quale inside of you, or describe some injustice done to him that makes you feel angry as well. I can make your neurons do what mine are doing, or something pretty close, when it comes to anger but not red.

We don't necessarily feel anger the same way any more than we all see red the same way. It's likely to be very different between us and aliens though, even if they have a range of emotions which matches up with ours perfectly.

Quote
I must admit, though, I get very confused when it comes to brains and computers, about the difference between input-output functions versus copying or translating information. I often think I am confusing one with the other. I would appreciate any help.

Here are some ideas you may be able to build upon:-

Input and output functions simply collect values from external sources or send numbers out to external recipients. If you want to control a motor or muscle, you have to send values to it which will make it act the way you want it to act (so you either have to speak the language of the motor/muscle or have something else in between that translates the signal), and if you want to get input from a sensor like a microphone or an ear, you just take whatever values you get from it and then have to try to interpret them. Input and output is like talking to someone who speaks another language - you have to learn their language in order to communicate with them (or they have to learn yours, or you need an interpreter in between who can convert from one language to the other in either direction).

Copying a number is done by reading from one memory address and writing the value you've just read to another memory address. Translating a number may involve reading it from one address, looking up that value in a table to find a number to replace it with, and then writing that new number back to memory somewhere - this could happen when turning the symbol "9" (often represented by the value 57) into the value 9 in order to use it in an arithmetical calculation. These processes are only meaningful if the numbers represent something. Sometimes their meaning is the number itself (although there are different ways of mapping numbers to bytes too: e.g. 255 can either mean 255 or -1), but more often the meaning is something else which is mapped to that value, in which case there has to be some way of storing the details of that mapping and there have to be algorithms available to run which can process the data usefully. The meanings of some data may only be recognised by a piece of program code that processes it, meaning that the only mapping details that exist are tied up in the algorithm.
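The copy/translate distinction above can be sketched in a few lines of Python. This is only an illustration: the `bytearray` stands in for a stretch of raw memory, and the table name is made up.

```python
# A bytearray stands in for a small stretch of raw memory.
memory = bytearray(4)
memory[0] = ord("9")        # the symbol "9" stored as its byte value, 57

# Copying: read from one address, write the same value to another.
memory[1] = memory[0]       # memory[1] now also holds 57

# Translating: look the value up in a table and store its replacement,
# e.g. turning the symbol "9" into the number 9 for arithmetic.
digit_table = {ord(str(d)): d for d in range(10)}   # maps 48..57 to 0..9
memory[2] = digit_table[memory[0]]                  # memory[2] now holds 9

# The same byte can stand for different numbers depending on the mapping:
raw = 255
as_unsigned = raw                            # read as 0..255: gives 255
as_signed = raw - 256 if raw > 127 else raw  # read as two's complement: gives -1
```

The last two lines show the point about 255 versus -1: nothing in the byte itself says which mapping applies; that knowledge lives in the code that reads it.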

If we imagine a robot with an eye with a single pixel, the colour value could be sent from the eye to the robot's brain as a series of three bytes, the first representing red, the second green and the third blue. This already represents the colour detected by the pixel in a form the robot's brain wants to work with, so no translation step is needed, unless it feels the need to change the order. A different camera could use just two bytes for the colour, with 5 bits for each colour and one bit unused - the robot's brain would perhaps need to rearrange these sets of 5 bits into three different bytes before working with them.

Perhaps this robot eats plants in order to power itself, so it might be programmed to hunt for green things to eat. It can send values to its motors until the input has the greenest value and then go forwards while chomping. If the (adjusted) colour value from the camera is always stored in the same place, perhaps a feeling is generated in the material of the piece of memory holding that value, perhaps related to the magnetic field - there could be a distinct feeling felt there for each individual colour, so we could have a sentient robot. But this sentient robot cannot know that it is sentient, because it can't read the feeling - all it can do is read the colour value back from memory, unable to tell whether anything has been felt at all, never mind what kind of feeling it might be.
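The byte-rearranging step for the hypothetical two-byte camera can be sketched like this. The 5-bits-per-channel packing is a real pixel format (often called RGB555), but the robot, the cameras, and the `greenness` score are inventions of the post and of this sketch:

```python
def unpack_rgb555(two_bytes):
    """Rearrange a 2-byte, 5-bits-per-channel colour into three full bytes."""
    value = (two_bytes[0] << 8) | two_bytes[1]  # join the two bytes
    r = (value >> 10) & 0b11111                 # top 5 used bits
    g = (value >> 5) & 0b11111                  # middle 5 bits
    b = value & 0b11111                         # bottom 5 bits
    # Scale each 5-bit channel (0-31) up to a full byte (0-255).
    return tuple(round(c * 255 / 31) for c in (r, g, b))

def greenness(rgb):
    """A crude 'hunt for green things' score the plant-eating robot might maximise."""
    r, g, b = rgb
    return g - (r + b) / 2
```

For example, pure green packed as the two bytes `0x03, 0xE0` unpacks to `(0, 255, 0)`, which scores higher on `greenness` than pure red does - the kind of comparison the robot could use to steer.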
« Last Edit: 12/01/2014 22:28:46 by David Cooper »
 

Offline cheryl j

  • Neilep Level Member
  • ******
  • Posts: 1460
  • Thanked: 1 times
    • View Profile
Re: Quantifying Consciousness
« Reply #54 on: 13/01/2014 16:02:53 »


We don't necessarily feel anger the same way any more than we all see red the same way.

Possibly, but I would be surprised if it were true. One of the things that is more developed in humans is learning and the ability to predict the behaviour or intentions of others. A lot of research attributes this to mirror neurons, which fire not just when you perform an action but when you see another person performing it. And it does have to be a person or animal you believe is conscious – mirror neurons don’t, for example, fire in response to the up and down movement of a basketball. (I don't know about robots.) Mirror neurons may be responsible for mimicking, learning complex skills, and learning to talk. They may also have the evolutionary advantage of allowing one to learn from the misadventures of others without incurring the same risks and injuries oneself, but for this learning to be effective, I may have to, as Bill Clinton used to say, “feel your pain.”

Mirror neurons are considered the source of empathy and allow one to predict another person's next action or state of mind (is this person going to share their food with me or club me over the head?). If our emotional experiences were vastly different, I don’t think any of this would work very well or as quickly. I could learn that every time you seemed happy, I was about to be whacked on the head, but it might take a few times. Of course, as you said, all that really matters is that emotional states correlate, that our behavior or display matches up, not that our subjective experience be the same. But given the similarities in brain structure and biochemistry, it’s hard to think of a good reason why the subjective experience really should be different.

I sometimes wonder about the qualia red/green inversion question. If our colors were truly inverted or shifted, wouldn’t there be discrepancies in what we thought was the same or different when we compared mixed colors and added and subtracted shades of color? It seems like an obvious question, so surely someone has done the math.

Another thing about emotion. I always thought about it as a driver, or a positive or negative tag. But last night I was reading about patients with disruptions in connections between the amygdala and other parts of the brain. When there is a disruption between the amygdala and the part of the brain that recognizes faces, patients have the delusion that people close to them are imposters. My mother is not really my mother. She looks and acts the same, but it is not really her. My father is not my father, and my dog has been substituted with another identical dog. When the connections are missing from the amygdala to much larger areas that process sensory information, the patient feels like the entire world does not really exist, nothing is “real.” When it’s a missing connection to parts of the brain associated with the sense of self, patients believe they don’t exist, that they are actually dead. Not just depressed or lacking energy, but a literal belief that they are dead, that their body is a hollow shell, or that they are a ghost.

I suppose one might explain it by saying the association of emotion with objects is strongly conditioned, and when it goes missing they wrongly conclude the object has changed, not their emotional response to it. But that still seems odd: you would think they would just explain it by deciding “I just don’t like or care about my mother as much as I used to,” not “she's an imposter.”

At any rate, I was kind of intrigued by the association between emotion and our sense of what is real. Of course Ramachandran doesn’t rule out the idea that maybe normal people are the ones with the delusion or illusion that the world or the self is real. But other qualia seem to help us make similar distinctions. Qualia generated by the senses are vivid and non-negotiable, the qualia of dreams less so, and imagined qualia much fuzzier and more malleable.

I wonder if there is anything in AI like mirror neurons in learning, or what simulations of them do.

« Last Edit: 13/01/2014 16:38:08 by cheryl j »
 

Offline cheryl j

  • Neilep Level Member
  • ******
  • Posts: 1460
  • Thanked: 1 times
    • View Profile
Re: Quantifying Consciousness
« Reply #55 on: 13/01/2014 16:08:33 »
  Given his textbook-accurate description, could he draw a carrot, I wonder? Fascinating case.     

Actually, since John was a very capable artist (even if he couldn't identify what he was drawing), Ramachandran asked him to draw pictures of various plants from memory – a rose, a daffodil, a lupin, a tulip and an iris. The drawings sort of reflect his factual knowledge but look nothing like the real thing. (The daffodil looks like a mushroom. The rose looks like a pom-pom.) The drawings look like what you might end up with if you tried to tell someone over the phone how to draw something he had never seen before, and coincidentally, Ramachandran refers to the pictures as “Martian flowers.”
« Last Edit: 13/01/2014 17:49:16 by cheryl j »
 

Offline cheryl j

  • Neilep Level Member
  • ******
  • Posts: 1460
  • Thanked: 1 times
    • View Profile
Re: Quantifying Consciousness
« Reply #56 on: 13/01/2014 17:48:01 »
Actually, I changed my mind a bit about emotions. It occurred to me I do know people who find anger oddly pleasurable and enjoy a good scrap. For them it's exciting and invigorating (righteous indignation), whereas I usually find it upsetting or wildly frustrating. It jams my circuits.
 

Offline alancalverd

  • Global Moderator
  • Neilep Level Member
  • *****
  • Posts: 4716
  • Thanked: 154 times
  • life is too short to drink instant coffee
    • View Profile
Re: Quantifying Consciousness
« Reply #57 on: 13/01/2014 18:54:50 »
But your physiological response to annoyance or frustration is likely to be the same as everyone else's - adrenalin, sweat, tachycardia... it's really a question of whether you have become habituated to noradrenalin after the action has passed. I was disturbed to find, a few months after being widowed, that I actually enjoyed crying, and I wondered if those people who apparently never recover from such an event actually feel more comfortable being miserable - endorphin addiction? It certainly accounts for "curry addicts".
 

Offline David Cooper

  • Neilep Level Member
  • ******
  • Posts: 1505
    • View Profile
Re: Quantifying Consciousness
« Reply #58 on: 13/01/2014 19:05:49 »
We don't necessarily feel anger the same way any more than we all see red the same way.

Possibly, but I would be surprised if it were true.

If red and blue qualia were switched round in one person compared with another, that person would still function normally and be unable to determine whether they're seeing things the same way as everyone else or differently. The same could occur with the qualia of fear and disgust, and it might even be possible to see colours using sound qualia and to hear things through colour qualia. I'm not sure what would exchange well with an anger quale, but there may be a vast number of spare qualia which we never experience at all, and it could be that no two people experience the same qualia at all. It seems more likely to me that we all experience the same qualia in the same ways, for the most part at least. Colour-blind people might give us some reason to expect differences, though it's possible that red-green colour-blind people simply never experience red (or alternatively green) qualia at all, and that what results is a missing quale rather than an exchange.

Quote
I wonder if there is anything in AI like mirror neurons in learning, or what simulations of them do.

If you can't make a machine feel anything related to itself, you aren't going to be able to make it feel anything related to anyone it's observing either, but it will certainly try to calculate what they are likely to be feeling based on what it knows about how people feel. If it sees someone being punched in the stomach, it won't feel anything and won't be alarmed, but it will generate a lot of conclusions about potential damage, possible immorality and the possible need to intervene.
 

Offline David Cooper

  • Neilep Level Member
  • ******
  • Posts: 1505
    • View Profile
Re: Quantifying Consciousness
« Reply #59 on: 13/01/2014 19:13:02 »
I forgot to comment on this bit:-

Quote
I sometimes wonder about the qualia red/green inversion question. If our colours were truly inverted or shifted, wouldn’t there be discrepancies in what we thought was the same or different when we compared mixed colors and added and subtracted shades of color?  It seems like an obvious question, so surely someone has done the math.

People have done experiments with "glasses" that invert the image and make people see everything upside down. After a few days they adapt and see everything as normal. I'd like to try the same thing with switching colours round, and inverting some (or all) of them in terms of brightness. The range of colours and shades available should be the same, though fewer can be expressed in the blue, so just switching red and green round would be best - but we can work fairly well with a reduced range without it being obvious, so it may not be too important. If you brought up a baby wearing such a device, it would grow up thinking the colours it sees are absolutely normal, and that white is dark while black is light (if you're reversing the brightnesses). Whether someone could adapt well to it later in life is another issue, and it would be well worth doing the experiment.
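The proposed goggles amount to a per-pixel transform. Here is a minimal sketch of that mapping for a single RGB pixel; the function and parameter names are purely illustrative, not from any real device:

```python
def goggles(rgb, swap_red_green=True, invert_brightness=False):
    """Apply the hypothetical colour-switching goggles to one RGB pixel."""
    r, g, b = rgb
    if swap_red_green:        # red and green trade places
        r, g = g, r
    if invert_brightness:     # white becomes black and vice versa
        r, g, b = 255 - r, 255 - g, 255 - b
    return (r, g, b)
```

Under this mapping a pure-red pixel `(255, 0, 0)` is delivered as pure green `(0, 255, 0)`, and with brightness inversion switched on, white maps to black - the "white is dark while black is light" case described above.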
 

Offline bizerl

  • Sr. Member
  • ****
  • Posts: 279
    • View Profile
Re: Quantifying Consciousness
« Reply #60 on: 14/01/2014 01:18:35 »
I'd like to try the same thing with switching colours round, and inverting some (or all) of them in terms of brightness. The range of colours and shades available should be the same, though fewer can be expressed in the blue, so just switching red and green round would be best - but we can work fairly well with a reduced range without it being obvious, so it may not be too important. If you brought up a baby wearing such a device, it would grow up thinking the colours it sees are absolutely normal, and that white is dark while black is light (if you're reversing the brightnesses). Whether someone could adapt well to it later in life is another issue, and it would be well worth doing the experiment.

Is this similar to what happens when I wear tinted ski goggles and for a while everything looks orange, but after a while I forget I'm wearing them and colours just look "normal"? I can see a spectrum and it doesn't appear to have a tint. Then when I take the goggles off, everything looks extra blue until my "eyes" (or probably more accurately my "brain") re-adjust.

I've only skimmed over this thread, I must admit, but it seems that if consciousness were something external, it would be able to interact with other consciousnesses, and would provide something to observe as evidence. I personally believe that this is not the case and take the anthropic argument that the neurons and chemicals and what-nots that are all whizzing around doing their job in our heads create the effect of consciousness. The fact that there is a direct effect on our experience when all this grey matter is altered (i.e. with chemicals such as drugs, or with direct electrical stimulation) seems to support the idea that what we call "consciousness" is contained within the structure we call "brain".

Sorry if i'm repeating points from earlier, as I said, I've only skimmed this thread.
 

Offline cheryl j

  • Neilep Level Member
  • ******
  • Posts: 1460
  • Thanked: 1 times
    • View Profile
Re: Quantifying Consciousness
« Reply #61 on: 14/01/2014 01:57:33 »
Since additive and subtractive color mixing works differently, I was wondering if you'd end up with discrepancies if people really did have inverted qualia.

With mixing paint, there are several ways to make brown. If you did it one way and I did it another, and we had inverted red and green, would we agree on the final color?
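For the simple additive (light-mixing) case, part of this question has an arithmetic answer: swapping red and green commutes with channel-wise mixing, so a red-green-swapped observer would agree on every match. Whether nonlinear subtractive paint mixing preserves this is exactly the open part of the question. A toy check, with the averaging mix rule chosen purely for illustration:

```python
def mix(a, b):
    """Additive mixing of two RGB lights, averaged to stay in range."""
    return tuple((x + y) // 2 for x, y in zip(a, b))

def swap_rg(rgb):
    """The inverted-qualia mapping: red and green exchanged."""
    r, g, b = rgb
    return (g, r, b)

red, blue = (255, 0, 0), (0, 0, 255)
# Mix-then-swap equals swap-then-mix, so both observers agree on
# which mixtures look the same as which.
assert swap_rg(mix(red, blue)) == mix(swap_rg(red), swap_rg(blue))
```

The agreement here falls out of the mixing rule acting on each channel independently; a mixing process that couples the channels (as real pigments do) would not automatically have this property.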

This article has some interesting pictures.


http://plato.stanford.edu/entries/qualia-inverted/
 

Offline Aemilius

  • Sr. Member
  • ****
  • Posts: 311
  • Thanked: 2 times
    • View Profile
Re: Quantifying Consciousness
« Reply #62 on: 14/01/2014 09:48:54 »
Would any of you care to define what it is you are trying to measure? A functional definition will suffice for a start, as in "consciousness is that which....."

Consciousness is that which perceives.
« Last Edit: 14/01/2014 10:38:37 by Aemilius »
 

Offline cheryl j

  • Neilep Level Member
  • ******
  • Posts: 1460
  • Thanked: 1 times
    • View Profile
Re: Quantifying Consciousness
« Reply #63 on: 14/01/2014 16:32:22 »
Francis Crick was lecturing on consciousness at the Salk Institute when a student raised his hand and said "But Professor Crick, you say you are going to lecture on the neural mechanisms of consciousness, and you haven't even bothered to define the word properly."

Crick said "My dear chap, there was never a time in the history of biology when a group of us sat around a table saying 'let's define life first.' We just went out there and found out what it was - a double helix. We leave matters of semantic distinction to you philosophers."

I agree that not having an adequate definition can be a problem, but it might also be true that good definitions only follow once you actually know more about the phenomena you are interested in. And even if they don't follow, you still end up with a lot of useful knowledge that can be applied to other things and opens doors to other interesting questions that wouldn't have occurred to you before. I would trade a "perfect" definition of life for the discovery of DNA any day.
 

Offline David Cooper

  • Neilep Level Member
  • ******
  • Posts: 1505
    • View Profile
Re: Quantifying Consciousness
« Reply #64 on: 14/01/2014 20:02:11 »
Since additive and subtractive color mixing works differently, I was wondering if you'd end up with discrepancies if people really did have inverted qualia.

It would remain additive, unless it's already subtractive with qualia, in which case it would remain subtractive.

Quote
With mixing paint, there are several ways to make brown. If you did it one way and I did another,  and we had inverted red green, would we agree on the final color?

Are there multiple ways of making brown if you start out with only three colours of paint, those being magenta, yellow and cyan?
 

Offline Aemilius

  • Sr. Member
  • ****
  • Posts: 311
  • Thanked: 2 times
    • View Profile
Re: Quantifying Consciousness
« Reply #65 on: 15/01/2014 00:09:04 »
Francis Crick was lecturing on consciousness at the Salk Institute when a student raised his hand and said "But Professor Crick, you say you are going to lecture on the neural mechanisms of consciousness, and you haven't even bothered to define the word properly."

Crick said "My dear chap, there was never a time in the history of biology when a group of us sat around a table saying 'let's define life first.' We just went out there and found out what it was - a double helix. We leave matters of semantic distinction to you philosophers."

Looks to me as if he was just artfully dodging the issue of directly linking neural mechanisms to consciousness.

With his DNA research, Crick was exploring the physical, tangible "nuts and bolts" side of things. When exploring the nuts and bolts side of things in the physical sciences, one can often just go out there and start finding things out.

Consciousness, on the other hand, is the non-physical, intangible product of all the nuts and bolts, so if you're planning on just going out there and finding out what that is, you shall be out there for a very, very long time.
 
I agree that not having an adequate definition can be a problem....

Can be a problem? This thread has been impaled on it right from the start.
« Last Edit: 15/01/2014 00:25:41 by Aemilius »
 

Offline Aemilius

  • Sr. Member
  • ****
  • Posts: 311
  • Thanked: 2 times
    • View Profile
Re: Quantifying Consciousness
« Reply #66 on: 15/01/2014 05:33:38 »
Consciousness is that which: "establishes the communion between the self to its environment."

Consider the word "Myself".

This is a compound word consisting of two words: "My" and "self". The "My" establishes ownership of the following word, "self". To understand the significance of this union, one needs to grasp the notion that the "My" refers to the physical attributes of one's existence and the "self" extends to the ethereal portion of this alliance between body and mind.

There is no absolute evidence that this alliance exists without both participants being involved. Until that evidence surfaces, we can only speculate, and speculation is not science.

Consciousness is that which establishes the communion between the self to its environment? Are there any little wafers involved? I hate little wafers!

No real need for all that. Consciousness is simply that which perceives. It is perception that is the hallmark of consciousness.... whether for a man, a whale, a parakeet, a tree, a single cell, a leopard, what have you. 
« Last Edit: 15/01/2014 06:24:58 by Aemilius »
 

Offline cheryl j

  • Neilep Level Member
  • ******
  • Posts: 1460
  • Thanked: 1 times
    • View Profile
Re: Quantifying Consciousness
« Reply #67 on: 15/01/2014 17:19:43 »


Can be a problem? This thread has been impaled on it right from the start.

Perhaps, but I still tend to agree with Crick, in that neuroscience has provided a different perspective about the nature of the things that philosophers sometimes associate with consciousness, words like "self" or "intelligence" or "awareness" or "perception."

What I like about Ramachandran's experiments is that he often demonstrates that the abilities, or types of experience, that philosophers see as being "the same" or "one thing" aren't always, and that certain mental functions viewed as inseparable sometimes are separable. Who would have intuitively thought it might be possible to write but not be able to read what you or someone else has written, to speak (in a meaningful way) but not understand what is said to you, to see but not be consciously aware of what you are seeing, or to divide consciousness in half in a split-brain patient?

 
