Why is death tragic and how tragic?

Started by Barry, November 01, 2020, 12:12:11 PM

Streetwalker

Quote from: Barry on November 01, 2020, 12:12:11 PM


Ethically, is it better to save the life of a 20-year-old or an 80-year-old?

Age is just one of the factors that would lead me to an ethical decision.

T00ts

Quote from: Nalaar on November 01, 2020, 04:51:51 PM
It's considered progress because AI driven vehicles will have a much reduced chance of having an accident in the first place, saving many lives. However they will have to be programmed on how to respond in the worst case scenario where death cannot be avoided.

Yes, I can see that. I am quite prepared to stand alone in questioning the long-term results of such actions. I would warn, though, even against the tide of the majority, of unintended consequences and their effect on our eternal futures.

Nalaar

Quote from: T00ts on November 01, 2020, 04:45:58 PM
Yes, we could, but the drive is to make humans redundant and, even worse, worthless. I'm not sure that's such a good ambition for the future. There is a very big difference between speed reduction or whatever and deciding in advance how a machine reacts of its own volition. I have no idea why anyone would consider that progress.

It's considered progress because AI driven vehicles will have a much reduced chance of having an accident in the first place, saving many lives. However they will have to be programmed on how to respond in the worst case scenario where death cannot be avoided.
Don't believe everything you think.

T00ts

Quote from: Nalaar on November 01, 2020, 04:34:55 PM
We could drastically reduce fatal car crashes in human driven cars by restricting all engines to a maximum 20 mph. We don't do that for human driven vehicles, we won't do it for AI driven vehicles either.

Yes, we could, but the drive is to make humans redundant and, even worse, worthless. I'm not sure that's such a good ambition for the future. There is a very big difference between speed reduction or whatever and deciding in advance how a machine reacts of its own volition. I have no idea why anyone would consider that progress.

Nalaar

Quote from: T00ts on November 01, 2020, 04:25:44 PM
Well then I think it is fairly obvious.
A) any such vehicle should not travel so fast that it cannot stop in time to avoid killing anyone.
B) it would be preferable to have separate transport systems for them rather than trying to mix with humans.

I am sure there are other preventative measures. There just needs to be the will.

We could drastically reduce fatal car crashes in human driven cars by restricting all engines to a maximum 20 mph. We don't do that for human driven vehicles, we won't do it for AI driven vehicles either.
Don't believe everything you think.

T00ts

Quote from: Nalaar on November 01, 2020, 04:15:12 PM
The reality is this is going to happen, and we should be talking about what we want to happen in the worst case scenarios.

Well then I think it is fairly obvious.
A) any such vehicle should not travel so fast that it cannot stop in time to avoid killing anyone.
B) it would be preferable to have separate transport systems for them rather than trying to mix with humans.

I am sure there are other preventative measures. There just needs to be the will.

Nalaar

Quote from: T00ts on November 01, 2020, 04:07:31 PM
I realise that this is coming, but for someone like me there is a moral issue here which I find impossible to reconcile. From my position in recognising that mankind is not the ruler of the universe, I find the disconnect between what can be and what should be very disconcerting.

The reality is this is going to happen, and we should be talking about what we want to happen in the worst case scenarios.
Don't believe everything you think.

T00ts

Quote from: Nalaar on November 01, 2020, 03:48:30 PM
The machine is making the same choice as the human. It's making the decision quicker, and with more certainty about the outcomes, but at the end of the day it's the same choice.

We will have AI trucks on our roads in the coming years, and we're going to have to decide how we want them to respond when they are put in a position of choosing who dies in a given scenario.

I realise that this is coming, but for someone like me there is a moral issue here which I find impossible to reconcile. From my position in recognising that mankind is not the ruler of the universe, I find the disconnect between what can be and what should be very disconcerting.

Nalaar

Quote from: T00ts on November 01, 2020, 03:44:42 PM
I do not. Accidents happen indeed but what you are proposing is that the machine is programmed by a human being to make a choice. It is no longer an accident but manslaughter.  There is a basic lack of morality in the premise.

The machine is making the same choice as the human. It's making the decision quicker, and with more certainty about the outcomes, but at the end of the day it's the same choice.

We will have AI trucks on our roads in the coming years, and we're going to have to decide how we want them to respond when they are put in a position of choosing who dies in a given scenario.
Don't believe everything you think.

T00ts

Quote from: Nalaar on November 01, 2020, 03:37:46 PM
With human psychology we are subject to deep personal prejudice. I do not want my dad to die; however, it is the right option. Though of course, in the heat of the moment, my prejudice could overwhelm judgement.

In any case (personal relative or not) the 80 year old should die.

Do you disagree? If so, why?

I do not. Accidents happen indeed but what you are proposing is that the machine is programmed by a human being to make a choice. It is no longer an accident but manslaughter.  There is a basic lack of morality in the premise.

Nalaar

Quote from: T00ts on November 01, 2020, 03:35:01 PM
If you were driving, what would be your choice - as if I didn't know? What if the older man was your father?

With human psychology we are subject to deep personal prejudice. I do not want my dad to die; however, it is the right option. Though of course, in the heat of the moment, my prejudice could overwhelm judgement.

In any case (personal relative or not) the 80 year old should die.

Do you disagree? If so, why?
Don't believe everything you think.

T00ts

Quote from: Nalaar on November 01, 2020, 03:32:15 PM
The scenario presupposes no safe option for the truck which avoids fatalities. The truck is guaranteed to hit one of the humans, it must decide which. So I ask again, do you disagree with my assessment that I should be the one the truck chooses to hit?

If you were driving, what would be your choice - as if I didn't know? What if the older man was your father?

Nalaar

Quote from: T00ts on November 01, 2020, 03:29:18 PM
Most certainly. I would expect the person who programmed the truck to build in a response that would kill neither person. It's a machine, for goodness' sake, put together by human beings - although leaving out 'human' might be a better description. The fact that you appear to quite glibly accept such a scenario is quite disturbing.

The scenario presupposes no safe option for the truck which avoids fatalities. The truck is guaranteed to hit one of the humans, it must decide which. So I ask again, do you disagree with my assessment that I should be the one the truck chooses to hit?
Don't believe everything you think.

T00ts

Quote from: Nalaar on November 01, 2020, 03:15:46 PM
I agree with all of that.

I can easily construct an AI scenario like the one in the OP that should result in my death - if I were 80 years old walking down the street, and a 20-year-old unknowingly stepped out in front of an AI driven truck, and the truck had to decide whether to stay on the road and kill the 20-year-old, or veer off the road killing me instead, the AI should definitely choose to kill me. Do you disagree?

Most certainly. I would expect the person who programmed the truck to build in a response that would kill neither person. It's a machine, for goodness' sake, put together by human beings - although leaving out 'human' might be a better description. The fact that you appear to quite glibly accept such a scenario is quite disturbing.

Nalaar

Quote from: T00ts on November 01, 2020, 02:48:47 PM
Sadly my study tells me that the suicide bomber has got his ideology mixed up. That is not his fault since he knows no different, and I would hope that not he, but the false prophet/teacher who persuaded him of it, is judged accordingly. Fortunately that is not my role.

I agree with all of that.

Quote
I would be interested to learn what you would feel if at whatever age AI decided purely on logic that your time had come. How will you look into the eyes of those who might love you and see the fear and misery in their eyes? Many who lose family members regret not doing more for the deceased or blame themselves in some way. More to the point, how will you feel when you get the message to report for removal? What of your hopes or wishes? Perhaps in this brave new world, humans will have their human feelings removed. I do not envy you if it should come to that.

I can easily construct an AI scenario like the one in the OP that should result in my death - if I were 80 years old walking down the street, and a 20-year-old unknowingly stepped out in front of an AI driven truck, and the truck had to decide whether to stay on the road and kill the 20-year-old, or veer off the road killing me instead, the AI should definitely choose to kill me. Do you disagree?
Don't believe everything you think.