Quote from: hamdani yusuf on 26/02/2021 06:07:46
Here is another plot twist I just thought of. You have inadvertently switched the track when you learn about the situation, so the train is heading toward one person. Will you switch it back to its original track, which will kill 5 people, instead of letting your mistake inadvertently kill 1? In this scenario, both decisions involve your action. Hence there is no excuse of being a passive bystander.

Another variation of the plot twist: you didn't pull the lever inadvertently. You pulled it because you weren't aware of the one person on the second track. When you become aware of that person, will you switch back to the original track, which will eventually kill 5 people?
hmmmm
An understanding of a universal moral standard becomes a necessity when we engage with AGI and genetic engineering. Economic competition is forcing us to get there sooner rather than later.
No. What we need to do is to impress a human moral standard on semiautonomous machines.
A classic example of not doing so is the use of "altitude hold" in a simple autopilot. Most of the time this just saves you having to make continuous minor adjustments of power and trim as you burn fuel, fly into a different weather system, or the passengers start walking around. But there is a dangerous temptation to let "George" fly the plane in strong turbulence, because his reactions are quicker and he doesn't get tired.

This can be fatal. If you hit a strong downdraft, George will point the nose of the plane upwards to regain altitude and you may stall. Unfortunately the strong downdrafts are found in dense cloud, so the plane will quite suddenly flip, spin, or do a dozen ballistic things all at once and topple the gyro horizon, making recovery to aerodynamic flight quite a conundrum and in some cases impossible.

The proper thing to do is to fly by hand in strong turbulence, accepting that you will not maintain constant speed or altitude, but simply keep the wings generating lift even when the meal trays hit the ceiling, in a compromise with the forces of nature. Given the choice between the unpalatable and the unacceptable, you must accept the unpalatable to survive.
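A toy simulation can make the failure mode concrete. The controller gain, airframe numbers, and point-mass "physics" below are all invented for illustration (no real autopilot is this crude), but they show how a plain proportional altitude hold, pitching up against a sustained downdraft with no stall protection, trades airspeed for altitude until the aircraft stalls:

```python
# Hypothetical sketch: a naive altitude-hold loop in a strong downdraft.
# All numbers are illustrative, not taken from any real aircraft or autopilot.

STALL_SPEED_KT = 55.0

def altitude_hold_pitch(target_alt_ft, current_alt_ft, gain=0.01, max_pitch_deg=15.0):
    """Proportional altitude hold: pitch up in proportion to altitude error."""
    error = target_alt_ft - current_alt_ft
    return max(-max_pitch_deg, min(max_pitch_deg, gain * error))

def step(alt_ft, speed_kt, downdraft_fpm, pitch_deg, dt_s=1.0):
    """Toy point-mass update: pitching up bleeds airspeed and gains a little altitude."""
    speed_kt -= 0.2 * pitch_deg * dt_s                       # drag penalty for nose-up
    alt_ft += (2.0 * pitch_deg - downdraft_fpm / 60.0) * dt_s
    return alt_ft, speed_kt

stalled = False
alt, speed = 5000.0, 100.0
for _ in range(60):                                          # one minute in a 2000 fpm downdraft
    pitch = altitude_hold_pitch(5000.0, alt)
    alt, speed = step(alt, speed, downdraft_fpm=2000.0, pitch_deg=pitch)
    if speed < STALL_SPEED_KT:
        stalled = True
        break
```

In this toy model the controller never "wins": it holds the nose up, the altitude error keeps growing, and airspeed decays below stall speed well within the minute, which is the point of the anecdote above.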
Which takes us back to the "immoral but right" decisions of conflict, which cannot be universal because the rest of the universe is at best indifferent to human life, and at worst, opposed to it.
Alternatively, you can train the machine using reinforcement learning, given that you can provide a virtual environment accurate enough to represent the parts of the real world considered relevant to the application.
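As a minimal illustration of that idea, here is a tabular Q-learning sketch against a deliberately trivial stand-in environment — a five-cell corridor with a reward at the far end. Everything here is invented for illustration and is obviously nowhere near the fidelity a real application would demand:

```python
# Minimal tabular Q-learning in a toy environment (hypothetical illustration).
# The agent learns purely from simulated trial and error to walk right to the goal.
import random

N_STATES, ACTIONS = 5, (-1, +1)          # corridor cells; actions: move left or right
ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.1    # learning rate, discount, exploration rate

q = [[0.0, 0.0] for _ in range(N_STATES)]

def env_step(state, action_idx):
    """Toy environment: reward 1 only on reaching the rightmost cell."""
    nxt = min(max(state + ACTIONS[action_idx], 0), N_STATES - 1)
    reward = 1.0 if nxt == N_STATES - 1 else 0.0
    return nxt, reward, nxt == N_STATES - 1

random.seed(0)
for _ in range(500):                     # training episodes
    s, done = 0, False
    while not done:
        # epsilon-greedy: mostly exploit the current Q-table, sometimes explore
        a = random.randrange(2) if random.random() < EPSILON else q[s].index(max(q[s]))
        nxt, r, done = env_step(s, a)
        q[s][a] += ALPHA * (r + GAMMA * max(q[nxt]) - q[s][a])
        s = nxt

policy = [q[s].index(max(q[s])) for s in range(N_STATES)]
```

After training, the greedy policy moves right from every non-terminal cell. The hard part in practice is exactly the caveat above: building a virtual environment whose relevant dynamics match the real world closely enough.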
And look what happened to the 737 Max. One jammed sensor and several hundred dead. But apparently cheaper than putting a warning in the pilot's notes and spending an extra hour on type training.
Not sure how you could program that into anything except three humans, never mind getting Birmingham Control to reorganise all their traffic.
Quote from: alancalverd on 02/03/2021 09:53:46
And look what happened to the 737 Max. One jammed sensor and several hundred dead. But apparently cheaper than putting a warning in the pilot's notes and spending an extra hour on type training.

It was human error in the design phase.
Quote from: alancalverd on 02/03/2021 09:53:46
Not sure how you could program that into anything except three humans, never mind getting Birmingham Control to reorganise all their traffic.

Perhaps the plane's engine shouldn't be able to start with an incorrect fuel amount or type for the planned journey. That would be easier to fix.
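Such an interlock is at least simple to state in code. The sketch below is purely hypothetical — the fuel figures, the fixed reserve, and the check itself are invented for illustration, and real dispatch fuel planning involves far more than a single distance-times-burn calculation:

```python
# Hypothetical pre-start fuel interlock sketch (illustrative numbers only).

def fuel_required_kg(distance_nm, burn_kg_per_nm, reserve_kg=500.0):
    """Planned burn for the leg plus a fixed reserve (toy policy)."""
    return distance_nm * burn_kg_per_nm + reserve_kg

def start_permitted(fuel_on_board_kg, distance_nm, burn_kg_per_nm,
                    fuel_type, required_type):
    """Interlock: correct fuel type, and enough of it for the planned journey."""
    if fuel_type != required_type:
        return False
    return fuel_on_board_kg >= fuel_required_kg(distance_nm, burn_kg_per_nm)

# A 300 nm leg at 3 kg/nm needs 900 kg burn + 500 kg reserve = 1400 kg.
ok = start_permitted(1600.0, 300.0, 3.0, "Jet A-1", "Jet A-1")
short = start_permitted(1200.0, 300.0, 3.0, "Jet A-1", "Jet A-1")
wrong_type = start_permitted(1600.0, 300.0, 3.0, "Avgas", "Jet A-1")
```

The logic is trivial; whether such a lockout is wise in flight is another matter, as the diversion scenarios below suggest.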
Not at all. The design was perfect and simply introduced an interesting new characteristic in the flight envelope. The wrong decision was to substitute weak automation for an hour's training.
Yes, one more thing to go wrong. Suppose we have planned Belfast-Heathrow but a passenger gets sick, so we divert to Birmingham. Will the machine say "fuel overload - switch off engine"? Or Heathrow is fogged in, so we make a late divert to Southend. Is that another 30 miles (insufficient fuel - switch off engine), or had we planned to approach Heathrow from the east anyway, plus a go-around, plus a divert, plus extra taxi time?
Quote from: alancalverd on 03/03/2021 10:40:33
Not at all. The design was perfect and simply introduced an interesting new characteristic in the flight envelope. The wrong decision was to substitute weak automation for an hour's training.

Not providing redundancy for critical components which can cause a single-point failure is a sign of bad design. How did they fix the problem? By changing the design.
There were no critical components in the original design. The flying characteristics of any aircraft change in different phases of flight, but rather than teach people to fly the MAX throughout the modified takeoff envelope they installed two wholly unnecessary critical components (yes, the system incorporated redundancy) and software that made things worse if one failed.