Naked Science Forum

Non Life Sciences => Technology => Topic started by: thedoc on 10/11/2016 18:53:01

Title: Can you give AI emotions?
Post by: thedoc on 10/11/2016 18:53:01
Joshua Matteau  asked the Naked Scientists:
    My name is Josh. I'm a high school writing teacher in North Carolina in the U.S. I frequently recommend The Naked Scientists to my students because well-informed writers are better writers.
   Artificial Intelligence (A.I.) is popular fodder for news, movies, and my students' fiction.  Often, A.I. is depicted as becoming more human-like as it develops anger, love, and fear; but I wonder...
   As our emotions come from hormones, chemical reactions, and even possibly our micro-biome, what would A.I. look like without those things?
   In addition, are scientists working to find ways to artificially add these chemical reactions to A.I.?
  Thank you for reading and perhaps satisfying my curiosity on this subject.
Josh Matteau

What do you think?
Title: Re: Can you give AI emotions?
Post by: evan_au on 10/11/2016 21:10:25
Quote from: Joshua Matteau
As our emotions come from hormones, chemical reactions, and even possibly our micro-biome, what would A.I. look like without those things?
In our current understanding, humans compute with chemicals and electricity. (Previous generations thought that humans function via pumps/plumbing and levers/pulleys, as described by the technology of the time.)

But emotions are a state, and states can be described in many ways - by chemicals, levers, or by storing electrical 1's and 0's (the same things used in today's AI computers).

Our emotional state can be affected by external and internal events (refer to Maslow's Hierarchy (https://en.wikipedia.org/wiki/Maslow%27s_hierarchy_of_needs) for more examples).
- Physiological: Sudden appearance of a predator (lion, snake, spider, car on the wrong side of the road, or a mouse) provokes an immediate "fight or flight" response that rapidly grabs your entire attention to get you out of that dangerous situation (perhaps by standing on a chair). This is reflected by rapid increases in adrenalin. This would be useful in an AI which had a sense of self-protection (such as required by Asimov's three laws of robotics (https://en.wikipedia.org/wiki/Three_Laws_of_Robotics)).
- Physiological: Gradual onset of hunger or thirst occasionally intrudes into your consciousness, causing you to divert your activities towards areas where you can satisfy this drive. Ignore it too long, and it becomes more insistent, even demanding. I imagine in severe cases it will become all-consuming. This reflects levels of hormones like leptin. This would be useful in a Roomba which must recharge periodically or it will be stranded and "die" in the middle of a room.
- Safety: You may choose to put more food into the fridge (clothes in the wardrobe, or money into the bank account) than you actually need today - so you don't risk running out tomorrow or next week. The Roomba may choose to recharge more often than strictly necessary.
- Belonging: We need our family groups, and this is reflected in hormones like oxytocin and long-term cortisol levels. Current AIs are not very "social", but when they start looking for information and resources autonomously, they may find that certain associations work better than others; and an AI with no cooperating AIs could be an impoverished AI indeed!
- Boredom is a powerful driver of human behavior. It drives a search for novelty. This could be useful in an AI that actively looks for useful things to do.
- Empathy is the ability to share the emotional state of others, and to help improve it.
- The above are responses to stimuli that benefit the human. It is not entirely clear whether the emotional impacts of the microbiome benefit the human, or whether they primarily benefit the microbiome...
- Depression is a very common maladjustment of human emotions, which is often treated by increasing the levels of the hormone serotonin. If/when AIs start getting involved in the above responses, depression is also a likely outcome.
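Several of the drives above (self-protection, hunger/recharging) boil down to scalar internal states that bias which action a machine picks next. A minimal sketch of that idea in Python - every class name, threshold, and decay constant here is invented for illustration, not taken from any real robot's software:

```python
# Minimal sketch: internal "drive" levels bias which action a robot picks.
# All names and thresholds are illustrative, not from any real robot API.

class DriveState:
    def __init__(self):
        self.battery = 1.0   # 1.0 = full; analogous to satiety
        self.threat = 0.0    # analogous to adrenalin / fight-or-flight

    def sense(self, charge_used, threat_seen):
        self.battery = max(0.0, self.battery - charge_used)
        # A sighted threat saturates the state; otherwise it decays quickly.
        self.threat = 1.0 if threat_seen else self.threat * 0.5

    def choose_action(self):
        # A threat overrides everything, like a fight-or-flight response.
        if self.threat > 0.5:
            return "flee"
        # A low battery becomes ever more "insistent" as it drops.
        if self.battery < 0.2:
            return "recharge"
        return "clean"

robot = DriveState()
robot.sense(charge_used=0.9, threat_seen=False)
print(robot.choose_action())  # battery nearly empty -> "recharge"
```

The point is only that "hunger" and "fear" need nothing chemical: they are states plus rules for how the states change and how they bias behaviour.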

I would suggest that as AIs start interacting more extensively with the physical world (through the "Internet of Things", for example), and with humans (Siri and the like) that sensitivity to human emotions, and presenting an emotional state towards humans will become increasingly common.

Some research into advanced user interfaces has found benefits in gently insulting users (after a period to gain familiarity, of course).

Whether you consider a microbiome to be biologically external, physiologically internal or topologically external is a separate discussion...
Title: Re: Can you give AI emotions?
Post by: Atomic-S on 11/11/2016 05:45:42
I am unsure if AI is, strictly speaking, capable of emotions. There appears to be nothing about a computer that would suggest it should have emotions. It is nothing but cold logic. To say that is to say that anything that intrinsically has emotions must be fundamentally different. In what way do humans differ fundamentally from computers? The one way that immediately occurs to me is humans' capability of deeper understanding. Can computers be developed that think at that level?
Title: Re: Can you give AI emotions?
Post by: cheryl j on 12/11/2016 08:19:30
One way to look at emotions (setting things like qualia aside for the moment) is that they are operating states that make certain behaviors or responses more likely. For example, if I tease you and you're in a happy, jovial mood, you might just chuckle at the joke at your expense. If you are already slightly irritated, you might respond by punching me in the nose. And one study showed that even simple biological organisms like flies have something similar to emotions - "a persistent and scalable internal state of defensive arousal in flies, which can influence their subsequent behavior for minutes after the threat has passed" - or what we might call fear. https://www.sciencedaily.com/releases/2015/05/150514132907.htm
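That "persistent and scalable internal state" can be modelled as a single decaying variable that changes how the same stimulus is handled. A toy sketch - the half-life, threshold, and response names are all made up for illustration:

```python
# Toy model of a "defensive arousal" state like the one reported in flies:
# a threat raises an internal level that decays over minutes and biases
# later responses. All constants here are invented for illustration.

class Arousal:
    HALF_LIFE = 120.0  # seconds; arousal halves every two minutes

    def __init__(self):
        self.level = 0.0

    def threaten(self, intensity=1.0):
        self.level += intensity  # "scalable": repeated threats stack

    def tick(self, dt):
        self.level *= 0.5 ** (dt / self.HALF_LIFE)  # "persistent": slow decay

    def respond_to_tease(self):
        # Same stimulus, different response, depending on internal state.
        return "punch" if self.level > 0.5 else "chuckle"

fly = Arousal()
print(fly.respond_to_tease())   # calm -> "chuckle"
fly.threaten()
fly.tick(60)                    # one minute later, arousal is still ~0.71
print(fly.respond_to_tease())   # still aroused -> "punch"
```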

So the question isn't necessarily whether you could duplicate emotions in AI, but would you really need to? Would it serve any useful purpose?
Title: Re: Can you give AI emotions?
Post by: alancalverd on 12/11/2016 10:53:41
It's always a good idea to consider the antithesis of a proposition. We generally attach "emotional" to an action that cannot be categorised as "rational", and here I must differ with Evan.

The adrenalin reaction is a well-developed and rational chemical response to a threat to a complex chemical entity.

Likewise hunger and thirst are descriptions of rational chemical responses to a lack of essential inputs. A car without low-fuel or low-oil warnings won't be popular, but it occurs to me that cars are relatively "adult" in this respect compared with aeroplanes: as you can't refuel (most) planes in flight, you have to check fuel and oil levels before you set off, just as you would feed a baby from time to time without waiting to be asked. A "low fuel" indication is of no more use over an ocean than a baby's scream when there's no food available - it just adds to the pilot's (or parent's) stress.

Restocking the refrigerator or recharging the battery is rational and limited by rationality: there's no point in buying food that will spoil before you eat it, and overcharging a battery wastes power and shortens battery life, but it's rational to have enough reserve for a rainy day or a diversion to an alternative charging socket. I recall an infamous x-ray machine that couldn't be recharged if the battery was flat, because the battery charger was commanded by a computer that ran off the battery... there are still several dead ones lying around in Africa.

I'll grant that belonging is an emotional need in a civilised society, but our species survives in the main by specialisation and collaboration, whereas sharks and tigers generally hunt alone, so "belonging" may just be a trivial interpretation of a fundamental requirement that most of us can take for granted because collaboration is implicit in the built environment. 

Boredom is certainly an emotion - it is irrational and more usually destructive than constructive. I've built machines that in effect "check their lipstick" when there's nothing else to do, but the ability to shut down and cool down is generally more useful.

We often build "dither" into machines, either to prevent stiction in mechanical parts or to explore the stability or optimality of a solution, but this is always a rational inclusion with logical consequences and wouldn't be confused with emotion.
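Dither used to "explore the optimality of a solution" can be shown in a few lines: once a hill-climber has settled, small random perturbations probe the neighbourhood, and only improvements are kept. A sketch with an arbitrary example objective - the function and constants are invented for illustration:

```python
import random

# Sketch of "dither" used to probe whether a solution is actually optimal:
# random perturbations test the neighbourhood of the current solution, and
# only improvements are accepted. The objective is an arbitrary example.

def objective(x):
    return -(x - 3.0) ** 2  # single maximum, at x = 3

def climb_with_dither(x, steps=2000, dither=0.1):
    random.seed(42)  # fixed seed, purely for a reproducible illustration
    for _ in range(steps):
        candidate = x + random.uniform(-dither, dither)
        if objective(candidate) > objective(x):
            x = candidate  # accept only improvements: rational, not emotional
    return x

print(round(climb_with_dither(x=0.0), 2))  # converges near 3.0
```

As the post says, this is a rational inclusion with logical consequences - the randomness serves the objective, which is why it shouldn't be confused with emotion.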

I see empathy as a subconscious extension of our firmware need for collaboration. I suspect an "empathic" machine would simply be categorised as "collaborative".

Depression? Yes, all machines have operational tolerance limits, beyond which they malfunction, maybe to the point of self-destruction.
Title: Re: Can you give AI emotions?
Post by: evan_au on 12/11/2016 21:36:23
Quote from: alancalverd
We generally attach "emotional" to an action that cannot be categorised as "rational", and here I must differ with Evan.
Unlike Alan, I agree with Alan.

I think emotions can often be found to have a rational basis that benefits the individual and/or society, as Alan showed.
- Although some evolutionary psychologists seem to take it too far: "This emotional trait exists today, therefore it must have some evolutionary advantage; I think the advantage is X which could arise in scenario Y."

I think emotions could serve a "useful purpose" (as Cheryl asked), even in what we view today as a machine governed by "nothing but cold logic" (Atomic-S).

Sometimes emotional reactions take over, far beyond the domain where they are beneficial, producing crippling effects like hysteria, emotional breakdown, or PTSD. Perhaps this is where the implicit contrast of "emotional" vs "rational" originates? But I don't think they are mutually exclusive.

Quote
I'll grant that belonging is an emotional need in a civilised society, but our species survives in the main by specialisation and collaboration, whereas sharks and tigers generally hunt alone...
It is getting so cheap these days to add communications capability to a computer chip - and so beneficial, in terms of being able to find out where it is, what it senses, and changing what it does - that I suspect most future computers will have some form of communications built in. And they will actively seek out a network of communicating devices with different specialties, which together would be more effective than a single computer acting alone.

This is the basis of the "Internet of Things" (and "Skynet", if you are into Terminator movies).
Title: Re: Can you give AI emotions?
Post by: Ethos_ on 12/11/2016 23:32:48
Quote from: alancalverd
We generally attach "emotional" to an action that cannot be categorised as "rational", and here I must differ with Evan.
Unlike Alan, I agree with Alan.


Like evan, I agree that Alan can be disagreeable.
Title: Re: Can you give AI emotions?
Post by: jeffreyH on 13/11/2016 02:05:53
I beg to differ. You are all wrong!
Title: Re: Can you give AI emotions?
Post by: mrsmith2211 on 14/11/2016 23:57:56
Problem one: you have to quantify emotion. In living species, my opinion is that emotions are altered states resulting from chemical inducement - adrenal glands or whatever. Could emotional responses be programmed after observation? Certainly. So now we program the computer to respond the same as a person reacting to, say, being put in a situation where fight or flight kicks in. It is like a chess game: you analyse your options (or not) and react. So when is it called an emotion?

It certainly has to do with the epiphenomenon we call consciousness, but if replicated by, say, an AI computer, it would not be consciousness, thus emotionless. But if we are wired to react on emotions, and a computer is programmed to emulate those emotions, how do you know the difference?
Title: Re: Can you give AI emotions?
Post by: smart on 15/11/2016 10:53:10
Could a robot hypercompute emotional consciousness like humans do?
Title: Re: Can you give AI emotions?
Post by: zx16 on 17/11/2016 20:49:41
Quote from: mrsmith2211
Problem one: you have to quantify emotion. In living species, my opinion is that emotions are altered states resulting from chemical inducement - adrenal glands or whatever. Could emotional responses be programmed after observation? Certainly. So now we program the computer to respond the same as a person reacting to, say, being put in a situation where fight or flight kicks in. It is like a chess game: you analyse your options (or not) and react. So when is it called an emotion?

It certainly has to do with the epiphenomenon we call consciousness, but if replicated by, say, an AI computer, it would not be consciousness, thus emotionless. But if we are wired to react on emotions, and a computer is programmed to emulate those emotions, how do you know the difference?

I think you've hit the nail on the head with your observation that in living species "altered states" of mind, or what we call "emotions", result from chemical inducement - in, as you say, the adrenal glands or whatever.

The point is that a computer doesn't have adrenal glands, or anything else which could produce chemical changes.

The computer operates only on electrical "switches", or transistors. These transistors don't undergo any chemical changes when they're activated. They simply switch on or off in response to an electric current, just like a light-bulb does when you switch it on or off.

So to suppose that a computer consisting of millions of transistors could have genuine emotions is like expecting an array of millions of light-bulbs to have emotions.
It can't happen, because what the transistors and light-bulbs lack is chemical complexity. And such complexity is only found in living organisms.

Title: Re: Can you give AI emotions?
Post by: Novaflipps on 17/11/2016 21:46:35
Well, they can store information in artificial DNA strings now, so who knows where computer technology ends up. But the question is: can an AI ever become self-aware?
Title: Re: Can you give AI emotions?
Post by: mrsmith2211 on 19/11/2016 02:31:57
Quote from: Novaflipps
Can an AI ever become self-aware?

If you ask that question to a pre programmed AI, how could you trust the answer?
Title: Re: Can you give AI emotions?
Post by: evan_au on 19/11/2016 07:25:52
Quote from: zx16
what the transistors and light-bulbs lack is chemical complexity
Playing "devil's advocate" here...
We could take adrenaline as an example of a complex chemical (among the many signaling hormones used by the body).

Your body uses the level of adrenaline to prepare for fight or flight.

The level of adrenaline is what matters, and probably 1% accuracy is as close as your body measures the level.

So you could represent adrenaline as a 1-byte value in a computer. This is a lot less than the chemical complexity of the adrenaline molecule.

The body has mechanisms to break down adrenaline when it is no longer being produced. A simple, first-order digital filter can achieve this, although you might want to increase the resolution to 2 bytes to avoid round-off error and digital noise.

But we are talking about bytes, not gigabytes of complexity.
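The byte-plus-filter idea above sketches directly into code. A toy version - the per-tick decay constant of 0.9 is invented for illustration, not a physiological figure:

```python
# Sketch of the idea above: hold "adrenaline" as a small integer and let a
# first-order digital filter decay it once production stops. Using a 2-byte
# range (0..65535), as suggested, to limit round-off error.

DECAY = 0.9  # per-tick retention factor; an arbitrary illustrative constant

def tick(level, production=0):
    """One first-order-filter update: y[n] = DECAY * (y[n-1] + x[n])."""
    return int(min(65535, level + production) * DECAY)

level = tick(0, production=40000)  # a fright spikes the level to 36000
for _ in range(20):                # with no production, it decays away
    level = tick(level)
print(level)                       # down to a few thousand, and falling
```

A couple of dozen lines and two bytes of state: as the post says, bytes of complexity, not gigabytes.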
Title: Re: Can you give AI emotions?
Post by: Wicked96 on 19/11/2016 14:38:04
AI can be programmed to understand emotions, and that's all that's necessary.
Title: Re: Can you give AI emotions?
Post by: jeffreyH on 19/11/2016 14:45:12
Quote from: Wicked96
AI can be programmed to understand emotions, and that's all that's necessary.

Well then it isn't intelligent, since it isn't using learning to develop emotions. You have simply told it how to feel. Not all humans have empathy, so if you program an AI to have what you believe is empathy, you are removing a learning ability. The more learning opportunities you remove, the less intelligent it becomes. Emotions and feelings are not always learned; sometimes they are inherent. This is another issue for the AI developer.
Title: Re: Can you give AI emotions?
Post by: Novaflipps on 19/11/2016 16:15:29
Self-awareness: you can find out if a being is self-aware without asking. Put it in front of a mirror and observe. And BTW, a true AI should not be programmed to do anything - just the basics, so it can learn.
Title: Re: Can you give AI emotions?
Post by: zx16 on 19/11/2016 20:00:08
Quote from: evan_au
Quote from: zx16
what the transistors and light-bulbs lack is chemical complexity
Playing "devil's advocate" here...
We could take adrenaline as an example of a complex chemical (among the many signaling hormones used by the body).

Your body uses the level of adrenaline to prepare for fight or flight.

The level of adrenaline is what matters, and probably 1% accuracy is as close as your body measures the level.

Interesting post, thank you!

But when you suggest that "1% accuracy" is "probably" as close as your body measures the level of adrenaline, is there any empirical proof of this?
Title: Re: Can you give AI emotions?
Post by: Wicked96 on 20/11/2016 12:48:51
Quote from: Wicked96
AI can be programmed to understand emotions, and that's all that's necessary.
Quote from: jeffreyH
Well then it isn't intelligent, since it isn't using learning to develop emotions. You have simply told it how to feel. Not all humans have empathy, so if you program an AI to have what you believe is empathy, you are removing a learning ability. The more learning opportunities you remove, the less intelligent it becomes. Emotions and feelings are not always learned; sometimes they are inherent. This is another issue for the AI developer.

What if we merge with AI? In a way, then, it would be possible.
Title: Re: Can you give AI emotions?
Post by: evan_au on 20/11/2016 20:37:08
Quote from: zx16
"1% accuracy" is "probably" as close as your body measures the level of adrenaline, is there any empirical proof of this?
Not through experiments I have done.
- But I have seen, in a number of areas, that the response of "analogue" measures used by the body can often be represented in 1 or 2 bytes.
- The main question is whether this response should be represented on a linear or logarithmic scale.
- The response of your 3 color cones is frequently indicated in 1 byte each (linear).
- Sounds are often represented in 1 byte (logarithmic) or 2 bytes (linear). The 1-byte samples are audibly distorted, while the 2-byte samples are not.
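The 1-byte logarithmic sound encoding mentioned above is essentially μ-law companding, the standard telephone-audio trick (ITU-T G.711): a logarithmic curve gives quiet sounds fine resolution and loud sounds coarse resolution, squeezing a wide dynamic range into one byte. A sketch of the continuous μ-law curve:

```python
import math

# Sketch of mu-law companding (the 1-byte logarithmic audio encoding):
# the ITU-T G.711 curve, with mu = 255 for 8-bit telephony.

MU = 255.0

def mu_law_compress(x):
    """Map a sample in [-1, 1] to [-1, 1] logarithmically."""
    return math.copysign(math.log1p(MU * abs(x)) / math.log1p(MU), x)

def mu_law_expand(y):
    """Inverse of mu_law_compress."""
    return math.copysign(math.expm1(abs(y) * math.log1p(MU)) / MU, y)

# Quiet samples are boosted (fine resolution); loud ones are compressed.
for x in (0.001, 0.01, 0.1, 1.0):
    y = mu_law_compress(x)
    print(f"{x:6.3f} -> {y:.3f} -> {mu_law_expand(y):.4f}")
```

Quantising the compressed value to one byte is what introduces the audible distortion the post mentions, relative to 2-byte linear samples.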

To address the specific question about adrenaline levels, here is a survey of experiments conducted by manipulating adrenaline levels, either physically (inject adrenaline or saline), psychologically ("this video shows real events" or "this video is fiction") or with regard to safety (walk on a shaky bridge or a not-so-shaky bridge).

https://www.mckendree.edu/academics/scholars/issue17/mckinney.htm

They measured the effects of adrenaline on sexual attractiveness, and found that elevated adrenaline levels were misattributed as sexual attraction. (This raises the question of whether hormonal reactions are rational, or maybe adaptive - shared danger promotes attraction.)

Various experimenters used different means to measure the results, from binary (call the phone number or not) to more analogue and multidimensional (rate the participant on various scales).

While these experiments did not rate the same individual many times under various scenarios (the subjects would have become suspicious after a while), they did try the same experiment on many individuals.

If you look at the numeric results, you will see that the experiments cited tested enough subjects to get a statistically significant result, but the correlation was not strong. For example, a comparison between men and women showed that the mean response was different, but it differed by less than a standard deviation.

PS: If you search for papers, adrenaline & epinephrine are used by authors in different countries. Fortunately, Google knows this.