The Naked Scientists

Is there a universal moral standard?

  • 4236 Replies
  • 965520 Views
  • 2 Tags


David Cooper
Re: Is there a universal moral standard?
« Reply #220 on: 30/09/2019 20:20:35 »
Quote from: hamdani yusuf on 30/09/2019 00:46:37
Quote from: David Cooper on 28/09/2019 22:04:38
Science has no model that can make sense of sentience - it looks as if there can be no such thing. If we decide that that's the case, then there can be no such thing as suffering and there is no role for morality.


Quote from: David Cooper on 28/09/2019 22:04:38
Protecting sentient things is the purpose of morality. Calculating morality does not require the calculator to be sentient.
That requires sentience to be defined objectively.
How do you define fundamental things? When you reach them, their definitions are always circular. All you have is how they relate to other things.
 



hamdani yusuf (OP)
Re: Is there a universal moral standard?
« Reply #221 on: 01/10/2019 05:11:37 »
Quote from: David Cooper on 30/09/2019 20:20:35
How do you define fundamental things? When you reach them, their definitions are always circular. All you have is how they relate to other things.
You can compare the fundamental properties of one object with another's. For example, you can ask which rock has more mass or volume.
How do you compare and relate sentience to other things?
Unexpected results come from false assumptions.
 

David Cooper
Re: Is there a universal moral standard?
« Reply #222 on: 01/10/2019 18:20:08 »
Quote from: hamdani yusuf on 01/10/2019 05:11:37
How do you compare and relate sentience to other things?

We won't know until we find out how to read feelings out of whatever feels feelings. The only examples we think we know of are hidden inside our own heads where our brain makes claims that suggest that it's measuring feelings. That is something only science can explore, but getting to the evidence without destroying what you're looking for may be a tough task.
 

David Cooper
Re: Is there a universal moral standard?
« Reply #223 on: 01/10/2019 18:34:12 »
Quote from: Halc on 01/10/2019 13:51:02
But we know the rock isn't sentient since none of its particles exhibits free will.

Nothing exhibits free will.

Quote
If any particle was suffering, it could put itself in a situation where this was not the case.

People who are being tortured can't stop the torture so easily.

Quote
Since it isn't doing that, either it isn't sentient or the thing is completely contented.

That's really just an assumption given false backing by faulty reasoning.

Quote
Likewise, the motion of the particles in my body can be described by the laws of physics.  Not a single proton seems to be exerting free will.  Hence I cannot be sentient (your definition).

My definition doesn't involve free will (not least because there's no such thing), so don't attribute your definition to me.

Quote
What prevents me from flying like superman? I will that, yet cannot bring it about. My free will does not seem to have any ability to override physics, yet you claim otherwise when contrasting yourself to the actions of computers that, lacking said sentience, are confined to the laws of physics.

Again you're trying to attribute ideas to me that are the opposite of the ones I hold. There is no free will involved in anything.

Quote
I did not intend to debate morality from a dualist perspective. The perspective is religious (inherently non-empirical) and that typically has morality pretty much built in. I don't deny that. I just find your particular flavor of it self contradictory.

If sentience exists, there is no escape from "dualism". When discussing morality in a way where we decide that it matters, we have to work under the premise that there is such a thing as sentience. If there is no such thing as sentience, morality has no role as there's nothing needing to be protected from anything else. A computer that prints "Ouch!" to the screen when you tap the "O" key and "Ooh, I like that!" when you type "E" (for ecstasy) does not feel pain or pleasure on either occasion. We can simply switch the strings round and it will print "Ouch!" when you type "E" instead. It's just data without sentience. Anything you do with a program on a computer works like that without any feelings being tied to the process at all. When you run them on a Chinese Room processor, you can easily see that this is the case. If you ban "dualism", the human brain is like that too with no feelings involved and with all the claims about feelings coming out of the brain being false.

I'm not contradicting myself anywhere, but am simply telling it as it is. Sentience looks impossible and there is no room for it in any way that ties it to processes in our machines. In discussing morality though, we put that apparent impossibility aside for cases where the full mechanism remains beyond the current reach of science and we humour the idea that sentience might be real. On that basis, we can then explore the idea of morality while remaining fully aware that it is predicated on something that may be false.
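The "Ouch!"-printing computer described above can be sketched in a few lines (a hypothetical illustration added here, not code from any real system): the printed string is just data bound to a key by the program, and swapping the strings inverts the "reaction" without changing any experience.

```python
# Canned responses bound to keys; nothing is felt anywhere in this program.
RESPONSES = {"O": "Ouch!", "E": "Ooh, I like that!"}

def react(key):
    """Return the canned string for a key press."""
    return RESPONSES.get(key, "")

# Switch the strings round: the 'pain' and 'pleasure' swap,
# yet nothing about the machine's experience (there is none) has changed.
RESPONSES_SWAPPED = {"O": "Ooh, I like that!", "E": "Ouch!"}

def react_swapped(key):
    return RESPONSES_SWAPPED.get(key, "")
```

The same editability holds for any program, however complex: the claim printed is determined by data, not by feeling.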
 

hamdani yusuf (OP)
Re: Is there a universal moral standard?
« Reply #224 on: 02/10/2019 04:16:46 »
To prevent miscommunication, I think we should use common definitions of terms before proposing our own redefinitions.
Quote
Sentience is the capacity to feel, perceive, or experience subjectively. Eighteenth-century philosophers used the concept to distinguish the ability to think (reason) from the ability to feel (sentience).
http://en.wikipedia.org/wiki/Sentience

Quote
Definition of sentience
1: a sentient quality or state
2: feeling or sensation as distinguished from perception and thought
https://www.merriam-webster.com/dictionary/sentience

Quote
Definition of consciousness
1a: the quality or state of being aware especially of something within oneself
b: the state or fact of being conscious of an external object, state, or fact
c: AWARENESS
especially : concern for some social or political cause
The organization aims to raise the political consciousness of teenagers.
2: the state of being characterized by sensation, emotion, volition, and thought : MIND
3: the totality of conscious states of an individual
4: the normal state of conscious life
regained consciousness
5: the upper level of mental life of which the person is aware as contrasted with unconscious processes
https://www.merriam-webster.com/dictionary/consciousness

If you use words for significantly different meanings than commonly used definitions, maybe it's better to use different terminology, or even create a new word to express your intention.
 



David Cooper
Re: Is there a universal moral standard?
« Reply #225 on: 02/10/2019 19:46:23 »
Quote from: Halc on 02/10/2019 00:04:25
Quote from: David Cooper on 01/10/2019 18:34:12
Nothing exhibits free will.
I think I have misread your position.  You say nothing has free will, but haven't defined it.

Free will depends on an injection of magic somewhere to get round the problem of everything that happens having a cause. Even if that cause is something genuinely random, it still doesn't provide for free will. We do what we're forced to do by physics. At a higher level, we're simply trying to do the best thing (for ourselves) all the time, and the only way to break free of that rule in order to pretend to have free will is to do something less good; but even that seems to be the best thing to do, as it's an attempt to satisfy ourselves that we have free will and to feel better for believing that (if we're stupid enough not to realise that we failed).

Quote
It seems that you consider sentience to be a passive experiencer, lacking any agency in the physical world.  Morals are there as obligations to these external experiencers, to keep your movie audience contented so to speak.
Perhaps I am wrong about this epiphenomenal stance.  Kindly correct me if I've again got it wrong.

If sentience is real, it is just a passenger. It appears to have no useful role in the machine. However, people report pain being an unpleasant thing and they typically consider it important not to cause it without justification (which means using it with the aim of causing less suffering).

There are two different discussions involved in this. One is a discussion of how morality works, and it's predicated on sentience being a real thing. The other is a discussion of what sentience is and how it works. The second of these is the biggest puzzle of them all, and it would be easy to waste your whole life trying to crack it. I'd rather wait to see what happens when the claims about sentience generated by human brains are traced back to see what evidence they are based on. We may uncover a self-delusion mechanism which fools a machine with no self into thinking it has a self. Alternatively, we may find something new to science which stuns everyone. What I repeatedly come up against though is people who don't look deeply enough at how computers work who insist that sentience can be operating in there somewhere in the bit they don't understand, and they're absolutely sure they're right because they're absolutely sure that they themselves are sentient and that they are just machines. That's maybe a third discussion, and it's worth having in that it can be resolved just by filling in the gaps for those people so that they can see that sentience cannot be operating where they hoped it might be. Morality is also resolved. What is not resolved is the part that seems so impossible that it just makes sentience look impossible, and yet it feels too real for that to be the case.
« Last Edit: 02/10/2019 19:49:06 by David Cooper »
 

David Cooper
Re: Is there a universal moral standard?
« Reply #226 on: 03/10/2019 20:56:47 »
Quote from: Halc on 03/10/2019 01:54:12
Quote from: David Cooper on 02/10/2019 19:46:23
Quote from: Halc on 02/10/2019 00:04:25
You say nothing has free will, but haven't defined it.
Free will depends on an injection of magic somewhere to get round the problem of everything that happens having a cause.
OK, you define free will as 1) having this external thing (what you call a sentience), and 2) it having a will and being able to exert that.  This actually pretty much sums up the concept from a typical dualist, yes.

When I said it "depends on an injection of magic", I was ruling out free will on that basis - not endorsing it. Whatever the sentience does, it's caused to do that by the inputs and whatever algorithm its mechanism applies to them.

Quote
I on the other hand would describe that situation as possession, where my will is overridden by a stronger agent, and its freedom taken away.  You don't describe possession.  The body retains its physical will and this 'sentience' gets its jollies by being along for the ride.

I just see a whole lot of causation from the outside interacting with causation from the set up of whatever's on the inside, and every part of it is dictated by physics.

Quote
That said, you seem aware of the 'magic' that needs to happen.  Most are in stark denial of it, or posit it in some inaccessible place like the pineal gland despite the complete lack of neurons letting their shots be called by it.

I rule it out because it depends on magic.

Quote
You're an epiphenomenalist, a less mainstream stance.

I don't think that fits. If sentience is real, it has a causal role: without that, it cannot possibly cause claims about feelings being felt to be generated. It is still just a passenger though in that what it does is forced by the inputs.

Quote
Why do you posit it then?  Seem like the equivalent of positing the invisible pink unicorn that's always in the room.  If there's no distinction between the presence or absence of a thing, why posit it?

If there's no sentience, then torture is impossible and morality has no purpose. Most people believe that pain is real and that they strongly dislike it. If you are in that camp, then you're a unicornist yourself. If you are completely out of that camp, then you should be a nihilist with no self.

Quote
Why might you not have many of them, a whole cinema full all taking the same ride?

Indeed, there could be quintillions of sentiences in there all imagining themselves to be the only one, and they might not all be feeling the same thing as each other, but due to lack of memory, they aren't going to be capable of recognising any contradiction between what they feel and what the brain reads the feeling to be.

Quote
Most people don't define sentience as an epiphenomenal passenger, so most don't base their moral decisions on how it will make the unicorn feel.

If they believe in sentience, the sentient thing that feels is what morality is there to protect. If they don't have that, they don't need morality. But they want to have sentience without any sentient thing to experience feelings other than magical complexity.
 

David Cooper
Re: Is there a universal moral standard?
« Reply #227 on: 04/10/2019 20:38:32 »
Quote from: Halc on 03/10/2019 22:22:06
Quote from: David Cooper on 03/10/2019 20:56:47
When I said it "depends on an injection of magic", I was ruling out free will on that basis - not endorsing it.
Understood, but this is only true given a free-will definition that involves this kind of magic, as opposed to someone's definition under which free will simply means not being remote controlled.

The same magic is required for it regardless where it's controlled from. Whatever causes something is itself caused and is forced to cause what it causes. There is no such thing as choice in that whatever is chosen in the end was actually forced.

Quote
Quote
I just see a whole lot of causation from the outside interacting with causation from the set up of whatever's on the inside, and every part of it is dictated by physics.
That sounds like a description of semi-deterministic physics.

It's fully deterministic physics.

Quote
Quote
If sentience is real, it has a causal role: without that, it cannot possibly cause claims about feelings being felt to be generated. It is still just a passenger though in that what it does is forced by the inputs.
This seems to be a contradictory statement.

It isn't. X causes Y, then Y causes Z --> X causes Z. Y causes Z but is forced to by X.
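The X/Y/Z chain can be rendered as a toy deterministic pipeline (my own illustration; the function bodies are arbitrary stand-ins for whatever the causal links are):

```python
# "X causes Y, then Y causes Z": Y is the proximate cause of Z,
# but Y's value is itself fixed by X, so the whole chain is forced
# once X is given.
def y_from_x(x):
    return x + 1        # X forces Y

def z_from_y(y):
    return y * 2        # Y forces Z

def z_from_x(x):
    # Cutting out the middleman gives the same result: X causes Z.
    return z_from_y(y_from_x(x))
```

Y "does" cause Z here, yet Y has no latitude: its output is fully determined by X, which is the sense of "passenger" being argued.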

Quote
If I feel the warmth of green, I cannot cause the body to discuss said warmth without performing said magic on the physical body which supposedly is incapable of such feelings.  If it has any causal role, there's magic going on.

There's no magic in Y being forced by X to cause Z.

Quote
Quote
If there's no sentience, then torture is impossible and morality has no purpose.
Agree.  So I find your definitions rather implausible for this reason.  My view doesn't have this external passenger.

If you don't have something experiencing the feelings, you have no sentience there and the feelings don't exist either. By throwing away the "passenger" you lose the sentience.

Quote
The physical being is all there is and is sentient in itself (yes, a different definition of sentience), has free will because nothing else is overriding its physical will, and morality has a purpose because there are obligations to the physical thing.

There is no free will for Y in "X causes Y causes Z" and no free will for X either (because it's caused by W). Your big mistake there is in accepting something that's actually impossible. There is no free will. You want the physical being to be sentient in itself, and that's fine: that's like particles being sentient, and it's possible that all stuff is sentient. That doesn't solve the problem of how an information system can ever get that knowledge from it in order to report the existence of sentience.

Quote
I also don't think morality is about pain and suffering. Everybody that says that makes it sound like life is some kind of horrible thing to have to experience. Pleasure and pain are means to an end. If the pleasure and pain were the end (the point of morality), then we should just put everybody on heroin. Problem solved. Recognizing the greater purpose isn't a trivial task.

Morality is about suffering AND the opposite. It's a harm:benefit calculation in which the harm is ideally minimised and the benefit (all kinds of pleasure) maximised, but those two aims conflict with each other in places, so you're looking for the best compromise between the two to optimise quality of life.

Quote
Quote
Most people believe that pain is real and that they strongly dislike it. If you are in that camp, then you're a unicornist yourself.
Nonsense. I don't think I need the unicorn to feel my own pain for me. That you propose this indicates that the idea is beyond your comprehension, and not just an interpretation with which you don't agree.

But you are the unicorn. Of course you can't have a unicorn feel anything for you because then it would be the sentience rather than you. Don't attribute nonsense to me that comes out of your misreading of my position.

Quote
Quote
If they believe in sentience, the sentient thing that feels is what morality is there to protect.
Almost nobody believes in the sort of sentience you describe. Typically it's a separate experiencer capable of said magic (think Chalmers), or in my case, a sentience composed of a physical process (Dennett, or whoever that hero is supposed to be).

I don't give a damn where the sentient thing is or what it's made of: that's a job for science to uncover. The only thing that actually matters here is that for feelings to be real, something real has to experience them, and that is a sentience. No sentient thing --> no feelings can be felt --> no role for morality --> you can try to torture anyone as much as you like and no harm can be done. If you think sentience is real, you need to ask yourself where it is and how it interacts with the things that physics recognises as being real. Chalmers has sentient stuff, and that's not incompatible with physics, but he (and everyone else) cannot account for how that sentience can be detected by the brain in order for the information system of the brain to generate accounts of what that sentience is experiencing. Dennett appears to be a nihilist - it should be possible to torture him and have him assure everyone throughout that the pain isn't real and that he can't really be suffering. Dennett may be right, but it's hard to believe that when you're actually in pain (as I often am due to Crohn's disease). It feels too real to be a fiction: making something non-sentient believe that it's sentient is quite some trick.

What I object to most though is when people deny the existence of a sentient thing and yet insist on asserting that there is sentience in there. There isn't any in a computer which makes fake claims about being in pain when you type a particular key, and it's the same situation with any system which fakes the claims, no matter how complex the system is and how well hidden the cheating is. People who insist that there is sentience in such systems should also insist that it is there in the simple program that cheats by claiming that pain is experienced when a key is pressed just because it is programmed to print a string to the screen as a response which makes a claim and where the claim could be edited to make the opposite claim without changing the feeling of anything.
« Last Edit: 04/10/2019 20:43:39 by David Cooper »
 

Halc (Global Moderator)
Re: Is there a universal moral standard?
« Reply #228 on: 05/10/2019 06:54:54 »
Quote from: David Cooper
The same magic is required for it regardless where it's controlled from.
Blatantly false. A roomba is controlled from within itself and it requires no magic to do so. It just requires magic if the control is to come from outside the physical realm.
Quote
Whatever causes something is itself caused and is forced to cause what it causes.
Indeed. You make it sound like a bad thing. I thought of what it would be like if choices were not based on input caused by prior state. I'd be dead in a day.
Quote
There is no such thing as choice in that whatever is chosen in the end was actually forced.
If that were true, mammals would not have evolved better brains to make better choices, or to make, say, moral choices. We are ultimately responsible for our choices, as evidenced by what happens to those who make poor ones. Not sure what choice is if you don't think that's going on.
Mind you, I agree that if the physics of the universe is deterministic, then my choices are determined. I'm just saying that they're still choices.

Quote
Quote
If sentience is real, it has a causal role: without that, it cannot possibly cause claims about feelings being felt to be generated. It is still just a passenger though in that what it does is forced by the inputs.
This seems to be a contradictory statement.
It isn't. X causes Y, then Y causes Z --> X causes Z. Y causes Z but is forced to by X.

Which one (X, Y, or Z) is the sentience (your definition)?  I thought it was a passenger and has no arrow pointing from it. If so, it has no causal role. If it has one, then there's magic going on.

Quote
If you don't have something experiencing the feelings, you have no sentience there and the feelings don't exist either.
Only true in your interpretation. I for instance never said there wasn't something experiencing my feelings. I just don't think it's a separate entity, passenger or otherwise. I'm fine with you disagreeing with it, but do you find inconsistency with it, without begging your own interpretation?

Quote
Quote
I also don't think morality is about pain and suffering. Everybody that says that makes it sound like life is some kind of horrible thing to have to experience. Pleasure and pain are means to an end. If the pleasure and pain were the end (the point of morality), then we should just put everybody on heroin. Problem solved. Recognizing the greater purpose isn't a trivial task.
Morality is about suffering AND the opposite.
I find that thinking shallow.  Heroin it is then, the most moral thing you can do to others. It minimizes suffering and maximizes pleasure, resulting in the optimum quality of life.

Quote
Quote from: Halc
Quote from: Cooper
Most people believe that pain is real and that they strongly dislike it. If you are in that camp, then you're a unicornist yourself.
Nonsense. I don't think I need the unicorn to feel my own pain for me.
...  Don't attribute nonsense to me that comes out of your misreading of my position.
I wasn't commenting on your position. Your statement above concerned the camp that I'm in, implying that pain cannot be felt given a different interpretation of mind.

Quote
The only thing that actually matters here is that for feelings to be real, something real has to experience them, and that is a sentience. No sentient thing --> no feelings can be felt --> no role for morality --> you can try to torture anyone as much as you like and no harm can be done.
Totally agree. That's all that matters for the purpose of this topic. I'm not the one that drove this discussion down to assertions about the interpretation of mind. Only a moral nihilist denies that feelings matter to anything, and I'm not in with that crowd.

Quote
Dennett appears to be a nihilist
A word you seem to use for any monist position. You're begging your interpretation to draw this conclusion.

Answer my question.  How do you know about your passenger if it cannot make itself known to you?
« Last Edit: 05/10/2019 06:57:24 by Halc »
 



David Cooper
Re: Is there a universal moral standard?
« Reply #229 on: 05/10/2019 21:01:59 »
Quote from: Halc on 05/10/2019 06:54:54
Quote from: David Cooper
The same magic is required for it regardless where it's controlled from.
Blatantly false. A roomba is controlled from within itself and it requires no magic to do so. It just requires magic if the control is to come from outside the physical realm.

You appear to have lost track of the point. With the sentience, there are inputs and outputs, and the outputs are determined by the inputs. The inputs are the X (inputs) that cause Y in the sentient thing, and Y then causes Z (outputs). In the course of Y happening, feelings are supposedly generated, and some of the outputs document that in some way. A roomba reacts to inputs in the same way, but doesn't generate claims about sentience in Y.

Quote
If that were true, mammals would not have evolved better brains to make better choices, or to make, say, moral choices. We are ultimately responsible for our choices, as evidenced by what happens to those who make poor ones. Not sure what choice is if you don't think that's going on.
Mind you, I agree that if the physics of the universe is deterministic, then my choices are determined. I'm just saying that they're still choices.

Our choices are no different from the ones computers make. The computer making a choice between which string to print to the screen when you press a key is applying an algorithm to work out which one to print, but its choice is forced. You can run algorithms that produce apparently random numbers too, but the results are forced. We are like that: there are many factors that can determine what number you say if I ask you for a number between fifty and a hundred, but your choice is forced by the algorithms and perhaps disturbances in the system which affect the algorithms. That itch on your back may distract you for a moment and lead to a different number being chosen than the one you would have gone for if the distraction hadn't occurred when it did. It's all forced though. We can use interrupts in a computer to disrupt an algorithm which makes it miss an event at a timer that it's watching.
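The point about "apparently random" numbers being forced can be sketched with Python's standard pseudorandom generator (an illustration added here; Python is not mentioned in the thread): given the same starting state, the "choices" between fifty and a hundred come out identical every time.

```python
import random

def forced_choices(seed, n=5):
    """Ask the 'algorithm' for n numbers between fifty and a hundred.

    The output looks random, but it is entirely determined by the seed:
    same starting state, same sequence, every time.
    """
    rng = random.Random(seed)   # deterministic algorithm plus fixed state
    return [rng.randint(50, 100) for _ in range(n)]
```

A different seed (the equivalent of the itch on your back arriving at a different moment) generally produces a different sequence, but in neither case is anything unforced.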

Quote
I thought it was a passenger and has no arrow pointing from it. If so, it has no causal role. If it has one, then there's magic going on.

I told you before that it has a causal role: the generation of data documenting the experience of sentience cannot be triggered without outputs from the sentience to inform the system that the experience happened. This is the key thing that science will some day be able to explore, because for sentience to be real, that output from it must exist. (The problem then though is how the output can be understood rather than just making baseless assertions about what it represents.)

Quote
Quote
If you don't have something experiencing the feelings, you have no sentience there and the feelings don't exist either.
Only true in your interpretation. I for instance never said there wasn't something experiencing my feelings. I just don't think it's a separate entity, passenger or otherwise. I'm fine with you disagreeing with it, but do you find inconsistency with it, without begging your own interpretation?

True in any rational interpretation. I don't say that it has to be a separate entity, but simply that for feelings to be felt, something has to feel them. Things that don't exist can't do anything, actions can't be performed by nothing, and experiences can't be experienced by nothing.

Quote
Quote
Morality is about suffering AND the opposite.
I find that thinking shallow.

You're entitled to find reality shallow if you like, but if you remove those things, morality is 100% redundant.

Quote
Your statement above concerned the camp that I'm in, implying that pain cannot be felt given a different interpretation of mind.

An interpretation of mind in which feelings are experienced by nothing is a magical interpretation.

Quote
Quote
Dennett appears to be a nihilist
A word you seem to use for any monist position. You're begging your interpretation to draw this conclusion.

I use the word to describe a position which denies that feelings exist at all and that there is any sentience.

Quote
Answer my question.  How do you know about your passenger if it cannot make itself known to you?

From the start I said that sentience could be a property of all particles (stuff, energy), so in that sense it needn't be a passenger as it is the essential nature of that stuff. I call it a passenger when referring to its lack of any useful causal role that can be produced just by going straight from X to Z without the middleman. The only thing it appears to do that goes beyond that is drive the generation of data to document the experiencing of feelings, but it's hard to believe that it actually does that, as it should be impossible to interpret that part of the output in any way that would lead the system that generates the data documenting the experience to have any idea that there was an experience of feelings at all.
 

Halc (Global Moderator)
Re: Is there a universal moral standard?
« Reply #230 on: 06/10/2019 19:28:30 »
Quote from: David Cooper on 05/10/2019 21:01:59
The inputs are the X (inputs) that cause Y in the sentient thing, and Y then causes Z (outputs). In the course of Y happening, feelings are supposedly generated, and some of the outputs document that in some way.
OK, is Y the experiencing of the feelings, or is Y the physical feelings which are noticed by the sentient experiencer?  I'm trying to figure out whether the physical feelings or the sentient experience of those feelings is what is causing Z, the output.

I ask because of this:

Quote
Quote
I thought it was a passenger and has no arrow pointing from it. If so, it has no causal role. If it has one, then there's magic going on.
I told you before that it has a causal role: the generation of data documenting the experience of sentience cannot be triggered without outputs from the sentience to inform the system that the experience happened.
Here you are asserting output from the sentience, which you say cannot be done without some kind of magic that we both deny.

You say that physics is entirely deterministic, which means that output from something external to the physical system cannot cause any effects in said determined system.  In your quote just above, you assert the opposite, that the system is being informed of data from non-physical sources, which would make it non-deterministic, or which makes the sentience part of the deterministic physical system, in which case it isn't two systems, but just one.
 
Quote
I call it a passenger when referring to its lack of any useful causal role that can be produced just by going straight from X to Z
Here again you seem to deny the 'passenger' having a causal role, yet above you say it causes data about the feelings.  If I avoid standing in the rain because it gives me discomfort, then the discomfort definitely plays a causal role in my choosing to seek shelter.  There's not a direct link from rain to choice of seeking shelter if I don't know if the sentient experiencer prefers a wet environment or not.  Some things clearly have a preference for it, like say robins.
 

Offline evan_au

  • Global Moderator
  • Naked Science Forum GOD!
  • ********
  • 11033
  • Activity:
    8%
  • Thanked: 1486 times
Re: Is there a universal moral standard?
« Reply #231 on: 07/10/2019 01:42:15 »
Quote from: Halc
if the physics of the universe is deterministic
In quantum theory, physics is not deterministic (or at least, not determinable by us).

However, in mammals, I think moral decisions arise at a higher level than the quantum level - it is encoded in the strengths of synapses.

Quote from: David Cooper
Morality is ... a harm:benefit calculation in which the harm is ideally minimised and the benefit (all kinds of pleasure) maximised
As I understand it, people with certain brain structures have psychopathic tendencies
- However, they don't become full-blown psychopaths unless there is a trigger - such as being abandoned by their mother at a young age
- With this trigger, their moral rule seems to be: "The only harm that counts is harm to me, and the only pleasure that counts is what brings pleasure to me.".
- A psychopath does a very local harm:benefit calculation
- Other systems of morality take a more global view of whose harm and benefit they include in the calculation
- Sometimes it's just "my family", "my tribe", "my skin colour", "my nation" or "my religion"
- Without the trigger, people with psychopathic tendencies could live useful and productive lives - as lawyers, drill sergeants or surgeons, for example
- So the synapses driving their morality are formed with various inputs from DNA, and development before and after birth. These synaptic weights can be modified by the individual based on their teaching, experiences, and deductions.

I agree that in the end, we are all responsible for our decisions.

Listen to neuroscientist James Fallon: https://after-on.com/episodes/029
 
The following users thanked this post: hamdani yusuf

Offline Halc

  • Global Moderator
  • Naked Science Forum King!
  • ********
  • 2404
  • Activity:
    6%
  • Thanked: 1014 times
Re: Is there a universal moral standard?
« Reply #232 on: 07/10/2019 02:37:50 »
Quote from: evan_au on 07/10/2019 01:42:15
Quote from: Halc
if the physics of the universe is deterministic
In quantum theory, physics is not deterministic (or at least, not determinable by us).
His assertion, not mine.  And deterministic doesn't mean determinable.

That said, quantum theory actually doesn't say one way or another.  It is interpretation dependent.
MWI and Bohmian mechanics for instance, while very different interpretations, are both hard deterministic (no true randomness).  Neither asserts that one can predict where a photon will be measured no matter how much we measure ahead of time. That's just not what deterministic means.

Given his assertions on the subject, I assume David has bought into one of these deterministic interpretations, or that he hasn't put much thought into it. I know he's a presentist, though that hasn't come up in this topic. The typical presentist tends not to choose a deterministic interpretation of QM, but the combination is not contradictory.

Quote
However, in mammals, I think moral decisions arise at a higher level than the quantum level - it is encoded in the strengths of synapses.
Agree that it's not a quantum thing at all. Quantum stuff always comes up because dualism needs a way to allow a non-physical will to effect changes in a physical world, and QM is where the argument over whether such external interference is feasible plays out.

I think that to an extent, human morality is encoded into us at a deep level (DNA), but not completely.  Much of it is simply taught.  Being encoded in human DNA, it is human morality, not universal morality.
Look at the morality of wolves, which is quite strong and consistent from pack to pack.  That's DNA morality, not taught.  Wolves and bees are far more moral creatures than are humans, as measured by their adherence to their own code.
 



Offline hamdani yusuf (OP)

  • Naked Science Forum GOD!
  • *******
  • 11799
  • Activity:
    92.5%
  • Thanked: 285 times
Re: Is there a universal moral standard?
« Reply #233 on: 07/10/2019 09:51:18 »
Quote from: David Cooper on 05/10/2019 21:01:59
From the start I said that sentience could be a property of all particles (stuff, energy), so in that sense it needn't be a passenger as it is the essential nature of that stuff.
It seems to me that you used the Eastern philosophical definition of sentience.
Quote
Sentience is the capacity to feel, perceive, or experience subjectively.[1] Eighteenth-century philosophers used the concept to distinguish the ability to think (reason) from the ability to feel (sentience). In modern Western philosophy, sentience is the ability to experience sensations (known in philosophy of mind as "qualia"). In Eastern philosophy, sentience is a metaphysical quality of all things that require respect and care. The concept is central to the philosophy of animal rights because sentience is necessary for the ability to suffer, and thus is held to confer certain rights.
I had some issues regarding this view, as I stated in my previous post. What is the ultimate/terminal goal of moral rules derived from this view? What will happen if we ignore them? Why are they bad?

Neuroscience has shown that we can manipulate neurotransmitters to temporarily disable a human's ability to feel. Hence it is possible to kill a living organism, including a human, without involving any feeling on the part of the subject (see Coup de grâce), and hence without violating moral rules whose ultimate goal is to minimize pain and suffering while maximizing pleasure and happiness.
Unexpected results come from false assumptions.
 

Offline hamdani yusuf (OP)

  • Naked Science Forum GOD!
  • *******
  • 11799
  • Activity:
    92.5%
  • Thanked: 285 times
Re: Is there a universal moral standard?
« Reply #234 on: 07/10/2019 10:43:21 »
Quote from: evan_au on 07/10/2019 01:42:15
- So the synapses driving their morality are formed with various inputs from DNA, and development before and after birth. These synaptic weights can be modified by the individual based on their teaching, experiences, and deductions.
I think this view is consistent with my thought experiment posted here
Quote from: hamdani yusuf on 14/06/2018 15:33:02
The next step for cooperating more effectively is splitting duties among colony members: some responsible for defense, some for digesting food, etc. Though the cells are genetically identical, they can develop differently due to gene activation by their surroundings.
This requires longer and more complex genetic material in each of the organism's cells.

I've mentioned that consciousness comes as a continuum. Different levels of consciousness are likely the product of evolution, involving random changes and natural selection. I summarized the process in the thread quoted above.
The development of feeling is just one milestone in the development of consciousness through evolution.
Quote from: hamdani yusuf on 16/06/2018 15:33:07
Quote from: hamdani yusuf on 16/06/2018 07:20:52
So they need the ability to distinguish objects in their surroundings and categorize them, so they can choose appropriate actions.
Some organisms develop a pain and pleasure system to tell whether circumstances are good or bad for their survival. They try to avoid pain and seek pleasure, which is basically making the assumption that pain is bad while pleasure is good.
Though there are times when it could be a mistake to seek pleasure and avoid pain, mostly this rule of thumb brings overall benefit to the organisms.
Avoiding pain can prevent organisms from suffering further damage which may threaten their lives, while seeking pleasure can help them to get the basic needs for survival, such as food and sex.
From neuroscience, we know that pain and pleasure are electrochemical processes in the nervous system. Hence seeking pleasure and avoiding pain should be treated as instrumental goals only, not terminal goals in themselves. Otherwise organisms become inevitable victims of reward hacking, such as drug abuse.
The next milestone of consciousness that I know of is emotion. It involves expected future feelings. This ability requires additional memory capacity to build a rough model/simulation of the environment.
The next milestone is reason, which is the ability to think. It builds more robust and detailed models/simulations of the environment.
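The instrumental-vs-terminal point above can be sketched in code. This is a toy illustration only (the actions and values are invented placeholders, not a neuroscience model): an agent that treats the pleasure signal itself as the terminal goal gets reward-hacked, while one that treats it as a proxy for survival does not.

```python
# Toy sketch: pleasure as an instrumental proxy vs. a terminal goal.
# All names and numbers are hypothetical, chosen only for illustration.

def survival_value(action):
    """Invented mapping from actions to actual survival benefit."""
    return {"eat": 5, "flee_danger": 8, "take_drug": -10}[action]

def felt_pleasure(action):
    """The proxy signal: usually tracks survival, but can be hacked."""
    return {"eat": 5, "flee_danger": 2, "take_drug": 100}[action]

actions = ["eat", "flee_danger", "take_drug"]

# Agent whose terminal goal is the pleasure signal itself:
hacked_choice = max(actions, key=felt_pleasure)
# Agent that treats pleasure as instrumental, survival as terminal:
sound_choice = max(actions, key=survival_value)

print(hacked_choice)  # take_drug — the proxy signal is hacked
print(sound_choice)   # flee_danger
```

The drug action maximises the proxy while minimising the real goal, which is exactly the reward-hacking failure mode described above.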
Unexpected results come from false assumptions.
 

Offline David Cooper

  • Naked Science Forum King!
  • ******
  • 2876
  • Activity:
    0%
  • Thanked: 38 times
Re: Is there a universal moral standard?
« Reply #235 on: 07/10/2019 22:59:22 »
Quote from: Halc on 06/10/2019 19:28:30
OK, Is Y the experiencing the feelings, or is Y the physical feelings which are noticed by the sentient experiencer?  I'm trying to figure out if the physical feelings or the sentient experience of those feelings is what is causing Z, the output.

Let's look at something simple that might be sentient: a worm. You prod the worm with something sharp and it reacts as if in pain. The prodding is X and the reaction is Z. In between those two things is Y, and the pain (assuming that a worm can feel pain) is experienced there by something. You can do the exact same thing with a human, and if the human needs a feeling of pain as part of the mechanism, the worm likely has that too: evolution probably isn't going to add pain into this situation for us if it already has a system that works just fine without it while producing the same kind of response. We can't build a model of this with a feeling of pain in it: we simply don't know how to. The only models that we can build provide the same behaviour without pain being involved, or which assert that there is pain being experienced at some point without actually providing any way for the system to detect that this is happening and to report such a feeling being felt. The pain in any model we build is superfluous and would be impossible to detect. However, humans report the involvement of pain. Models only report it if they've been programmed to make baseless assertions which don't involve any measurement of pain or any other kinds of feelings.

There are two possibilities, though, if feelings are real: X causes Y (which is the experience of pain) and Y causes Z; or X causes Z directly while X also causes pain at Y, with Z then claiming that Y is what caused Z.

Quote
I ask because of this:

Quote
Quote
I thought it was a passenger and has no arrow pointing from it. If so, it has no causal role. If it has one, then there's magic going on.
I told you before that it has a causal role: the generation of data documenting the experience of sentience cannot be triggered without outputs from the sentience to inform the system that the experience happened.
Here you are asserting output from the sentience, which you say cannot be done without some kind of magic that we both deny.

If you don't have output from the sentience, it has no role in the system. Its actual role may not be to cause Z, but Z is generating claims that Z was caused by Y. That's what science needs to explore to see why something at Z is generating such a claim.

Quote
You say that physics is entirely deterministic, which means that output from something external to the physical system cannot cause any effects in said determined system.  In your quote just above, you assert the opposite, that the system is being informed of data from non-physical sources, which would make it non-deterministic, or which makes the sentience part of the deterministic physical system, in which case it isn't two systems, but just one.

I don't think there is such a thing as true randomness, but it isn't important to bring that into this, and I didn't say that physics is entirely deterministic. Randomness is not free will. We can allow true randomness at base level if you like, but those random events then cause things, and that takes us to a point where everything else is caused, all dictated by the initial circumstances and those random inputs. Computer chips have to be designed in such a way that randomness doesn't interfere with their functionality, so everything that a program does is fully deterministic. In our brains, that may not be the case, but random events making a bit of neural net behave slightly differently each time it fires are not free will, and are more likely to cause trouble than do anything useful. Neural nets actually get trained in order to minimise unreliability issues, but this typically never reaches perfection and leads to us making mistakes in almost everything we do.

I also never said that something external to the physical system was involved in any way. Whatever is sentient, if feelings exist at all, is necessarily part of the physical system. The outputs from the sentience are outputs from something that is part of the physical system (and in which feelings, if feelings exist at all, are experienced).
 
Quote
Here again you seem to deny the 'passenger' having a causal role,

No: I deny it having a useful role. You can cut out Y and have X cause the same Z as you get with X causes Y causes Z, except that when you have Y in the chain you also get these assertions being generated about there being feelings involved in the process.

Quote
If I avoid standing in the rain because it gives me discomfort, then the discomfort definitely plays a causal role in my choosing to seek shelter.

Then show me a model of that. It should be exactly like the pain one. A sensor on the skin sends a signal to the brain, and that's an input. The input is fed into a black box where discomfort is experienced. An output from the black box then causes the brain to move the animal into shelter. We can bypass the black box and the functionality is unchanged: the input wire can simply be connected directly to the output wire. Except, the black box also triggers claims to be generated about feelings being experienced inside the box, and it makes those known to the outside by sending a signal down another wire to say so. However, we could simply connect the input wire to both output wires and remove the black box, and the exact same functionality is produced, including the generation of claims about feelings being experienced in the black box even though the black box no longer exists in the system.
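The bypass argument can be made concrete with a toy sketch (all names are invented for illustration; this is just the wiring diagram above rendered as code, not a claim about real brains): the two systems produce identical outputs, including the identical claim about feelings.

```python
# Hypothetical wiring sketch of the black-box argument:
# X (rain signal) -> Y (black box, "discomfort") -> Z (behaviour + report)
# versus X wired straight to both outputs with no Y stage.

def with_black_box(rain_signal):
    # Y: the stage where discomfort is supposedly experienced
    discomfort = rain_signal            # functionally, Y just passes X on
    seek_shelter = discomfort           # Z1: behavioural output
    report = "I felt discomfort"        # Z2: claim about the feeling
    return seek_shelter, report

def bypassed(rain_signal):
    # Input wire connected directly to both output wires; no Y stage
    seek_shelter = rain_signal
    report = "I felt discomfort"        # same claim, generated without Y
    return seek_shelter, report

# The two systems are behaviourally indistinguishable:
print(with_black_box(True) == bypassed(True))    # True
print(with_black_box(False) == bypassed(False))  # True
```

Nothing in the outputs distinguishes the version with the discomfort stage from the version without it, which is the point being made: the feeling, if present, is undetectable from the functional side.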
 

Offline David Cooper

  • Naked Science Forum King!
  • ******
  • 2876
  • Activity:
    0%
  • Thanked: 38 times
Re: Is there a universal moral standard?
« Reply #236 on: 07/10/2019 23:05:34 »
Quote from: Halc on 07/10/2019 02:37:50
Agree that it's not a quantum thing at all. Quantum stuff always comes up because dualism needs a way to allow a non-physical will to effect changes in a physical world, and QM is where lies the argument that such external interference is or isn't feasible.

The reason that quantum stuff gets dragged into this is that it's impossible to model sentience without it. It looks impossible to model it with it too, but because it's complex, it provides some refuge for hope. If the brain doesn't perform some extraordinary kind of trick, sentience is fake and there is no role for morality: no self in any of us that needs to be protected.
 



Offline David Cooper

  • Naked Science Forum King!
  • ******
  • 2876
  • Activity:
    0%
  • Thanked: 38 times
Re: Is there a universal moral standard?
« Reply #237 on: 07/10/2019 23:18:52 »
Quote from: hamdani yusuf on 07/10/2019 09:51:18
Neuroscience has shown that we can manipulate neurotransmitters to temporarily disable a human's ability to feel. Hence it is possible to kill a living organism, including a human, without involving any feeling on the part of the subject (see Coup de grâce), and hence without violating moral rules whose ultimate goal is to minimize pain and suffering while maximizing pleasure and happiness.

You can kill everyone humanely without them feeling anything, but that's clearly immoral if you're producing inferior harm:benefit figures, and you would be doing so if you tried that. Imagine that you are going to live the lives of everyone in the system, going round and round through time to do so. There are a thousand people on an island and one of them decides that he can have a better life if he kills all the others, and by doing it humanely he imagines that it's not immoral. He doesn't know that he will also live the lives of all those other people and that he will be killing himself 999 times. If he knew, he would not do it because he'd realise that he's going to lose out heavily rather than gain.

Of course, in the real world we don't believe that we're going to live all those lives in turn, but the method for calculating morality is right regardless: this is the way that AGI should calculate it. Morality isn't about rewarding one selfish person at the expense of all the others, but about maximising pleasure (though not by force - we don't all want to be drugged for it) and minimising suffering.

Also, we're setting things up for future generations. We care about our children's children's children's children's children, and we don't want to set up a system that picks one of them to give the Earth to while the rest are humanely killed. Morality isn't about biasing things in favour of one individual or group, but about rewarding all.
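For what it's worth, the island scenario can be put into a toy tally (the numbers are invented placeholders, not measurements of anything): the killing looks like a gain under a purely selfish calculation, but a huge loss once every life in the system is counted, which is how the harm:benefit method is meant to be applied.

```python
# Toy version of the island thought experiment, with hypothetical values.
population = 1000
life_value = 10          # invented value of one ordinary life
improved_life = 15       # the killer's better life after killing the rest

# Selfish tally: only his own life counts.
selfish_gain = improved_life - life_value        # looks like a net gain

# "Live every life" tally: the 999 lives cut short count as 0 each.
total_before = population * life_value
total_after = improved_life + 999 * 0

print(selfish_gain)                  # 5
print(total_after - total_before)    # -9985: a massive net loss
```

The same act flips from apparently rational to obviously ruinous once the calculator is made to count everyone's outcome rather than one person's.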
 

Offline hamdani yusuf (OP)

  • Naked Science Forum GOD!
  • *******
  • 11799
  • Activity:
    92.5%
  • Thanked: 285 times
Re: Is there a universal moral standard?
« Reply #238 on: 08/10/2019 09:50:41 »
Quote from: David Cooper on 07/10/2019 23:18:52
Quote from: hamdani yusuf on 07/10/2019 09:51:18
Neuroscience has shown that we can manipulate neurotransmitters to temporarily disable a human's ability to feel. Hence it is possible to kill a living organism, including a human, without involving any feeling on the part of the subject (see Coup de grâce), and hence without violating moral rules whose ultimate goal is to minimize pain and suffering while maximizing pleasure and happiness.

You can kill everyone humanely without them feeling anything, but that's clearly immoral if you're producing inferior harm:benefit figures, and you would be doing so if you tried that. Imagine that you are going to live the lives of everyone in the system, going round and round through time to do so. There are a thousand people on an island and one of them decides that he can have a better life if he kills all the others, and by doing it humanely he imagines that it's not immoral. He doesn't know that he will also live the lives of all those other people and that he will be killing himself 999 times. If he knew, he would not do it because he'd realise that he's going to lose out heavily rather than gain.

Of course, in the real world we don't believe that we're going to live all those lives in turn, but the method for calculating morality is right regardless: this is the way that AGI should calculate it. Morality isn't about rewarding one selfish person at the expense of all the others, but about maximising pleasure (though not by force - we don't all want to be drugged for it) and minimising suffering.

Also, we're setting things up for future generations. We care about our children's children's children's children's children, and we don't want to set up a system that picks one of them to give the Earth to while the rest are humanely killed. Morality isn't about biasing things in favour of one individual or group, but about rewarding all.
Your calculation of harm:benefit here has nothing to do with feelings. Moral rules based on pleasure and suffering as their ultimate goals are vulnerable to reward hacking (such as drugs) and exploitation by utility monsters.
We know that killing a random person is immoral, even if we can make sure that the person doesn't feel any pain while dying. There must be a more fundamental reason for that conclusion than minimizing suffering, because no suffering is involved here.
I have mentioned that moral rules are created as methods to protect conscious beings from being harmed by other conscious beings. Those rules cannot protect us from unconscious threats such as natural disasters, germs, or beast attacks.
Unexpected results come from false assumptions.
 

Offline evan_au

  • Global Moderator
  • Naked Science Forum GOD!
  • ********
  • 11033
  • Activity:
    8%
  • Thanked: 1486 times
Re: Is there a universal moral standard?
« Reply #239 on: 08/10/2019 10:30:56 »
Quote from: David Cooper
standing in the rain ... we could simply connect the input wire to both output wires and remove the black box and the exact same functionality is produced, including the generation of claims about feelings being experienced in the black box even though the black box no longer exists in the system.
If you grew up with Scottish winters, standing in the rain is likely to give you hypothermia.
If you grew up in Darwin (Australia), standing in the rain cools you down a bit, and the water will evaporate fairly soon anyway.

We need the black box, because an individual human might be born in Edinburgh or Darwin.

If all reactions were hard-wired, without the integration of other sensations, experiences (and mothers yelling at children), humans would not have spread so far around the world.

Quote from: evan_au
Sometimes morality is just applied to "my family", "my tribe"...
And sometimes morality is just applied to "my species"
- while some people (like Buddhists) want to apply morality to all living animals (partly because they think that, one day, they might be one of those animals).
- Some people want to extend it to plants
- And there are undoubtedly impacts of fungicides on microbes living in the soil and living in symbiosis with trees and plants; or microbes living in symbiosis with our guts
- Some people even wish to extend morality to the whole planet...

...We haven't made enough of an impact on the universe to be worried about saving the universe - but planetary protection officers are worried about us polluting Mars with our microbes, and destroying any traces of native Martian life.
 



Tags: morality  / philosophy 
 


©The Naked Scientists® 2000–2017 | The Naked Scientists® and Naked Science® are registered trademarks created by Dr Chris Smith. Information presented on this website is the opinion of the individual contributors and does not reflect the general views of the administrators, editors, moderators, sponsors, Cambridge University or the public at large.