How about choosing to kill one hardened criminal (no chance of parole, a pure burden to society) and using his organs to save 8 patients who would otherwise die before a donor could be found? That's immoral (at least to humans), and nobody does it. Why not? The doctor who did it would be thrown in jail, not have his standing updated in a positive way. Is the AI better than humans if it chooses to save the 8, or is it a monster for killing the one? There are other real-world scenarios, and humans always seem to find the kill-the-most option preferable.
Inevitably, people trying to answer the question will fill the gaps with their own assumptions
A self-replicating AI sets out to completely eradicate visual impairment (blindness) from society.
What if it then considers forcefully taking one eye from people who have two and implanting it into someone who has none?
Some information below could change the decision:
- how many resources are required?
- how hard is it to build fully functional synthetic organs?
Your points are mostly irrelevant. The 'there' can be any place of your choosing. The situation was simple: 8 people who will definitely die soon (within a month?) without the needed organ. All are young enough that they'd have decades of life expectancy after the surgery. Let's say there's a 90% chance of success with each person, and a 10% chance of rejection.
The one prisoner obviously.
attempt to obfuscate a simple situation
For the purpose of this exercise, impossible.
Everybody's insured
No attempt to answer I see.
Are you sure?
In the almost ideal conditions you described, the prisoner should be executed and the organs transplanted to those in need.
So you're saying the law is wrong? Because it forbids such practices even in the most ideal situations, even in the case of the voluntary donor. Would an AI that was put in charge (instead of 'following' some favorite person as per the OP) rewrite such laws? Might there be a reason for the law not being conditional on any of the factors you keep trying to drag in?
The synthetic organ option disqualifies the situation as a trolley problem.
How about choosing to kill one hardened criminal (no chance of parole, a pure burden to society) and use his organs to save 8 lives of patients who otherwise are going to die before a donor can be found.
I like your discussion, but it is far from what I wanted to discuss, which is biologically programmed AI.
In my theoretical world, there is not one but several AIs.
I don't wish to debate one AI's decisions without considering what the other AIs would do to make sure their objectives aren't hampered.
An AI with the goal of improving the prison system to benefit humans would challenge the first AI.
Still another AI above them who has the goal of improving AI disputes to better humanity would intervene.
Yet another AI wishes to better humanity by removing the ability to take lives from AI policy.
There are turtles all the way down.
But you would say that this is too slow a process.
That's why this is all digital.
All these hypothetical scenarios are simulated by different AIs drawing on a common processing stack. The 'thoughts' are logged in a shared history of thoughts, which every AI in the system can read. Since computers can think in timesteps of microseconds, days of debate would pass in seconds for us. Even if the end decision is not perfect, it will still be so much wiser than a human's that not considering it would be detrimental.
How does the superior obedient-slave AI come into all this? Well, its policy is the result of following one person, then moving on to the next person who produces a better outcome. So an AI following Halc would debate with an AI following Hamdani, and so on.
So this is the situation - Humans are still the masters. The AI is constrained to only benefit humans.