Universal morality as in universally applied by people/aliens - no. Universal morality as in absolute morality - yes. There is an absolute morality, and most attempts at formulating moral rules are attempts to reproduce that underlying absolute morality. The reason we find so much in common between different attempts at formulating systems of moral rules is that they are all tapping into an underlying absolute morality which they struggle to pin down precisely, but it is there.

What is absolute morality? The idea of "do unto others as you'd have them do unto you" captures most of it, but it's not quite right. "Always try your best to minimise harm (unless that harm is cancelled out by the gains for the one who suffers it)" was one of my attempts to formulate the rule properly, and it does the job a lot better, but I'm not sure it's completely right. The correct solution is more of a method than a rule: imagine that you are all the people (and indeed all the sentient beings) involved in a situation, and choose whichever action leaves you, as all of them combined, as happy as possible. You must imagine that you will have to live each of their lives in turn, so if one of them kills one of the others, you will be both the killer and the one killed, but that killing will still be the most moral action if it minimises your suffering and maximises your pleasure overall.

This is how intelligent machines will attempt to calculate what's moral in any situation, but they will often be incapable of accessing or crunching enough data in the time available to make ideal decisions - they can only ever do the best they can with what is available to them, playing the odds.

(This is a kind of utilitarianism. The strongest objection I've seen to utilitarianism is the Mere Addition Paradox, but there is a major mathematical fault in that paradox, and anyone rational should throw it in the bin where it belongs.)
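As a rough sketch of how a machine might run that calculation, the Python snippet below sums welfare across everyone affected and picks the action with the highest total. The data structure and the numbers are hypothetical illustrations, not anything from the post:

[code]
# Hypothetical sketch: score each candidate action by summing the welfare
# (pleasure minus suffering) of every sentient being affected, as if the
# decision-maker had to live each of those lives in turn.

def total_welfare(outcome):
    """Sum welfare over all beings affected by one outcome."""
    return sum(being["pleasure"] - being["suffering"] for being in outcome)

def most_moral_action(actions):
    """Pick the action whose outcome gives the highest total welfare."""
    return max(actions, key=lambda name: total_welfare(actions[name]))

# Invented numbers for two candidate actions affecting two beings each.
actions = {
    "act":     [{"pleasure": 5, "suffering": 1}, {"pleasure": 0, "suffering": 8}],
    "abstain": [{"pleasure": 2, "suffering": 2}, {"pleasure": 3, "suffering": 1}],
}

print(most_moral_action(actions))   # -> abstain (total 2 beats total -4)
[/code]

In practice the hard part is estimating those numbers, which is exactly the data-crunching limit the post describes.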
But there's your problem - there is no universally applicable rule! Witness the ecstatic joy of the Hitler Jugend, and the total misery they wrought on everyone, including, eventually, themselves.
Truth will never be decided by opinion polls.
I did say the Golden Rule is faulty. That's why I came up with a better rule (the harm-minimisation one) which removes the major problems with it, but I'm not sure it is perfect. What does appear to be perfect is the method of considering yourself to be all the people involved in a scenario. Let's apply it to the Trolley Problem. You are the person lying on one track which the trolley is not supposed to go down. In other lives, you are the ten people lying on another track which the trolley is scheduled to go down. In another life you are the person by the lever who has to make a decision. How many of yourself do you want to kill/save in this situation? Should you save the ten idiot versions of yourself who have lain down on a track which the trolley is scheduled to go down, or should you save the less idiotic version of yourself who has lain down on the other track on the stupid assumption that the trolley won't go that way? It's a calculation that needs a lot of guessing unless you have access to a lot of information about the eleven people in question so that you can work out whether it's better to die ten times as a mega-moron or once as a standard moron, but it's still a judgement that can be made on the basis of self-interest. All scenarios can be converted into calculations about self-interest on the basis that you are all of the players. This doesn't make the calculations easy, but it does provide a means of producing the best answer from the available information.
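As a rough illustration of that self-interest calculation, here is a minimal Python sketch; the "remaining life value" scores are invented stand-ins for whatever information is actually available about the eleven people:

[code]
# Hypothetical trolley calculation with invented scores: treat yourself as
# all eleven people and pick the choice that keeps the most value alive.

value_of_ten_on_scheduled_track = [6] * 10   # the ten who lay on the live track
value_of_one_on_other_track = [9]            # the one on the supposedly safe track

options = {
    "do_nothing": sum(value_of_one_on_other_track),       # saves 9, loses 60
    "pull_lever": sum(value_of_ten_on_scheduled_track),   # saves 60, loses 9
}

print(max(options, key=options.get))   # -> pull_lever
[/code]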
The Trolley Problem should never be dismissed as an academic exercise. Churchill's decision not to evacuate the Calais garrison in 1940 is a classic case of balancing the certain death of a few against the possible survival of many by delaying the German advance on Dunkirk. Imagine sending this signal:

Quote
Every hour you continue to exist is of the greatest help to the B.E.F. Government has therefore decided you must continue to fight. Have greatest possible admiration for your splendid stand. Evacuation will not (repeat not) take place, and craft required for above purposes are to return to Dover. Verity and Windsor to cover Commander Mine-sweeping and his retirement.
To answer why keeping conscient beings in existence is a fundamental moral rule, we can apply reductio ad absurdum to its alternative. Imagine a rule that actively seeks to destroy conscient beings: it is basically a meme that self-destructs by destroying its own medium. Likewise, conscient beings that don't follow the rule to actively keep themselves (or their copies) in existence will likely be outcompeted by those who do, or be struck by random events and cease to exist.
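As a toy illustration of that outcompetition point (all rates and the population size are invented), the sketch below lets agents with and without a self-preservation rule face the same random events, with only the rule-followers making copies of themselves; the rule-followers come to dominate:

[code]
import random

# Toy model with invented rates: agents either carry the "keep yourself
# (or copies of yourself) in existence" rule or they don't. Random events
# remove agents of both kinds equally; only rule-followers replicate.

random.seed(0)
agents = ["preserver"] * 50 + ["non_preserver"] * 50

for _ in range(20):
    survivors = [a for a in agents if random.random() > 0.10]                    # random events
    copies = [a for a in survivors if a == "preserver" and random.random() < 0.15]
    agents = survivors + copies

print(agents.count("preserver"), agents.count("non_preserver"))
# With these made-up rates the preservers end up far more numerous.
[/code]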
If we propose minimizing harm as a fundamental moral rule, we need to agree first on its definition.
If it's about inflicting pain, then giving painkillers would solve the problem, which is not the case.
If it's about causing death, then the death penalty and euthanasia are in direct violation.
Hence there must be a more fundamental reason why this proposed rule works in most cases but still has some exceptions.
Hence, keeping the existence of conscient beings is one of the most fundamental moral rules, if not the most.
Does it have any exceptions? Show me one.
Quote from: hamdani yusuf
Hence, keeping the existence of conscient beings is one of the most fundamental moral rules, if not the most.

There seems to be some debate about which are conscient (conscious?) beings to which this moral rule applies...
- Some apply it to just members of their own family or tribe
- Others apply it to just members of their own country or religion
- Thinking more broadly, are elephants conscious, or dolphins? How should we treat them?
- What about our pet dog or cat?
Finally we get to the last question: how. There are some basic strategies to preserve information which I borrow from the IT business:
- Choosing robust media.
- Creating multilayer protection.
- Creating backups.
- Creating diversity to avoid common mode failures.
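A minimal Python sketch of how those strategies might look in practice, assuming hypothetical stores and data (none of this is from the post): keep copies on several deliberately different media, and use checksums as an extra layer of verification.

[code]
import hashlib

# Hypothetical sketch of the listed strategies: several copies (backups) on
# deliberately different kinds of storage (diversity against common-mode
# failures), each verifiable against a checksum (one extra protection layer).

def checksum(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def back_up(data: bytes, stores: dict) -> str:
    """Write the same data to every store and return its checksum."""
    for store in stores.values():
        store[checksum(data)] = data
    return checksum(data)

def verify(digest: str, stores: dict) -> bool:
    """True if at least one store still holds an intact copy."""
    return any(checksum(store.get(digest, b"")) == digest for store in stores.values())

# Diverse "media": in a real setup these might be local disk, tape, and cloud.
stores = {"local_disk": {}, "tape_archive": {}, "cloud_bucket": {}}

digest = back_up(b"important records", stores)
stores["local_disk"].clear()    # one medium fails completely
print(verify(digest, stores))   # -> True: the other copies survive
[/code]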