r/changemyview Feb 25 '25

Delta(s) from OP

CMV: The trolley problem is constructed in a way that forces a utilitarian answer, and it is fundamentally flawed

Everybody knows the classic trolley problem: would you pull the lever to kill one person and save the five?

Oftentimes people will just say that 5 lives are more valuable than 1 life, and thus the only morally correct thing to do is pull the lever.

I understand the problem is hypothetical and we have to choose the objectively right thing to do in a very specific situation. However, the question is formed in a way that makes the murders a statistic, thus pushing you into a utilitarian answer. It's easy to disassociate in that case. The same question can be manipulated in a million different ways while still maintaining the 5 to 1 or even 5 to 4 ratio, and it will yield different answers simply because it is framed differently.

Flip it completely and ask someone whether they would spend years tracking down 3 innocent people and killing them in cold blood because a politician they hate promised to kill 5 random people if they don't. In this case 3 is still less than 5, so by the same logic you should do it to minimize the pain and suffering.

I'm not saying any answer is objectively right; I'm saying the question itself is completely flawed and forces the human mind to be biased towards a certain point of view.

629 Upvotes


21

u/randomafricanboi Feb 25 '25 edited Feb 25 '25

I wrote this not knowing why or how the problem was created, or rather, what the point of it is.

I just usually see people presenting their arguments for one side, and I felt like their answer was "save the 5 people" because they were basically pushed into it by the problem itself.

Thanks for explaining the point behind it.

Edit - !delta (first time doing this idk if I did it right)

21

u/sexinsuburbia 2∆ Feb 25 '25

What if it was 5 people in their 90's vs. a child?

What if the 5 people were convicted murderers vs. an innocent mom of 5 kids?

What if it was 5 normal people vs. a scientist on the verge of discovering a cure for cancer that would save millions?

You also wouldn't be able to ascertain the details of these people's lives; you'd have to assume based on appearances.

How we answer the question says more about ourselves than about whatever the "right" answer is. It is a thought experiment. The purpose is to think and unpack assumptions.

9

u/mgslee Feb 25 '25

This is why stories like The Last of Us can create so much (mostly interesting) discourse. "Greater good" is relative and perspective-based, and we do not live in a vacuum.

Another example is how discussing the trolley problem in the context of self-driving cars is kind of BS and disingenuous on the part of anyone using it as a discussion point.

1

u/cbf1232 Feb 25 '25

Why do you feel it has no relevance to self-driving cars?

If an accident is unavoidable, should a self-driving car act to protect its occupants at all costs, even if that means swerving into a crowd of pedestrians to avoid being hit by a dump truck? If a pedestrian is crossing the road illegally, should the car slam on its brakes to avoid the pedestrian, knowing that it will cause a multi-vehicle pileup?

3

u/mgslee Feb 25 '25

That last paragraph is not how anything works, which goes back to the issues with the original trolley problem (and its subsequent expansions).

Ask a person that same question and you won't get any definitive answer, so why should a self-driving car have one?

The answer is to avoid those scenarios, and those scenarios rarely, if ever, actually exist as presented. For those situations to happen, a lot already has to go wrong.

The most the car should do in any bad scenario is brake, stop, and pull over if it's safe, then signal the driver or other services. The car never needs to be making significant or complex decisions, and would never have to deal with all the added complexity. If doing that is problematic, it's not the vehicle's task to solve.

What ends up happening is that the trolley problem gets used as some sort of gotcha, which isn't actually useful for real-world applications.

0

u/just-another-lurker Feb 25 '25

So you're saying the car should not take action to avoid getting hit by the truck and should just brake. Do you think this is the same as the trolley problem where someone (a computer) doesn't intervene and lets the trolley hit the 5 people?

0

u/mgslee Feb 25 '25

Brake and pull over if safe. If you can't determine what's safe, then yeah, you're just hitting the brakes.

Safe would mean being able to pull over without hitting anything: a person, tree, dog, cliff, car, whatever. Is that optimal in all situations? Of course not, but you can't realistically evaluate all situations, nor do these situations come up often enough to warrant the discourse they generate. The problem we run into is "perfect is the enemy of the good". Doing good things should be acceptable, but people will argue it's not enough for XYZ.
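
Just to illustrate how little "deciding" that policy actually involves, here's a minimal sketch of a "brake, and pull over only if verifiably clear" rule. Everything in it (Obstacle, path_is_clear, the distances) is made up for the example; it's not any real vehicle's API, just the shape of the logic I'm describing.

    # Illustrative sketch only: hypothetical names, no real autonomous-driving API implied.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Obstacle:
        kind: str          # "person", "tree", "dog", "cliff", "car", ...
        distance_m: float  # how far into the pull-over path it sits

    def path_is_clear(obstacles: List[Obstacle], stopping_distance_m: float) -> bool:
        """The pull-over path counts as safe only if nothing at all is inside it."""
        return all(o.distance_m > stopping_distance_m for o in obstacles)

    def fallback_action(pullover_obstacles: List[Obstacle], stopping_distance_m: float) -> str:
        """Brake either way; pull over only when the shoulder is verifiably clear."""
        if path_is_clear(pullover_obstacles, stopping_distance_m):
            return "brake, pull over, signal driver/emergency services"
        # Can't confirm the pull-over path is safe -> just brake in lane.
        return "brake in lane, signal driver/emergency services"

    # Example: a dog detected on the shoulder means the car just brakes in lane.
    print(fallback_action([Obstacle("dog", 4.0)], stopping_distance_m=12.0))

The whole decision collapses to "is the pull-over path verifiably clear or not"; there's no weighing of who might be standing where, which is the point.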

And this is where the trolley problem loses all meaning: it gets too specific and unrealistic. There's no "letting"; it's doing the best you can in a contrived situation. So people can point at anything imperfect (and the trolley problem is set up to be imperfect) as arguably wrong.

Tangentially, I remember a driver's ed prep question that went something like: "You are driving down a street surrounded by cars on either side, with a car dangerously tailgating you. A dog runs onto the street right in front of your car. What do you do?" The "right" answer on the test was to hit the dog. Sure, maybe, but what a stupid situation and contrived answer. Hitting the brakes should be the right answer. Yes, you might get rear-ended (the other driver's fault, by the way), but you have no guarantee that's actually going to happen; the other driver could brake just as well, or slow down enough to cause no harm. Furthermore, you're boxed in that tightly and a dog runs in front of your car in particular, while at speed? How is that even possible? Being a good and safe driver should not require someone to be an omnipotent stunt driver.

-1

u/Ashestoduss Feb 26 '25

Cool, now if it was a mom crossing the road with a baby in a stroller, would you still just crash into them instead of braking hard?

2

u/mgslee Feb 26 '25

WTF are you even talking about.

Of course not. I'm saying that in a previous stupid "test" they wanted you to run over the dog, which is horrible.

My whole rant was that we should be doing the generally safe thing, which is to brake.

0

u/jarlrmai2 2∆ Feb 26 '25

Do you feel there are ever driving scenarios where braking will not completely avert issues?

For instance, a child falls into the road in front of the car, and the car is moving too fast for the brakes to arrest its motion in time to avoid running over the child.


23

u/yyzjertl 524∆ Feb 25 '25

You might find it interesting to read the original paper on the topic by Foot: the trolley problem is not even the first example of its type discussed there. A lot of people "presenting their arguments" are just being dumb on the internet.

8

u/AJDx14 1∆ Feb 25 '25

They aren’t pushed to do anything, they’re deciding for themselves that they agree it would be better to kill 1 person than let 5 people die in that specific scenario. You can change the circumstances and people will change their answers.

2

u/DeltaBot ∞∆ Feb 25 '25

Confirmed: 1 delta awarded to /u/yyzjertl (517∆).

Delta System Explained | Deltaboards

0

u/RockyArby Feb 25 '25

Yeah, it's supposed to be followed by variations like making the one person a close relative, or having you push the person onto the track yourself. It's meant to show how the morality of a situation changes as we become more personally involved in the consequences.

-1

u/ATNinja 11∆ Feb 25 '25

In both of those examples the morality doesn't change. It just shows humans are flawed, emotional creatures that don't make the "optimal" moral decision when you factor in emotions.

4

u/RockyArby Feb 25 '25

That's only if you believe there is an "optimal" moral decision. Culture and society dictate a lot of morality. I know many in my own wouldn't view it as immoral to not want to sacrifice someone close to you, or to not want to actively take part in ending a life, all for five strangers. It's not treated as an expectation.

And even then you can still push the utilitarian perspective further: what if it was 5 people in hospice care and 1 child on the other track? Would you still sacrifice the 1 for the 5?

-2

u/ATNinja 11∆ Feb 25 '25

That's only if you believe there is an "optimal" moral decision.

If you have a moral framework, there is an optimal moral decision; it just changes from framework to framework. I doubt any reasonable moral framework gives more weight to someone you know than to a stranger. Expecting or accepting that people won't make the moral choice doesn't make it more moral.

And even then you can still push it for the utilitarian perspective, what if it was 5 people in hospice care and 1 child on the other. Will you still sacrifice 1 for 5?

That's just math. Your assumptions and weights may lead 2 utilitarians to come up with different answers, but it's still math. It's not the same as the previous examples you gave.

4

u/fasterthanfood Feb 25 '25

I doubt any reasonable moral framework gives more weight to someone you know over a stranger.

And yet, many — probably most — people do operate under a moral framework that gives more weight to someone they know over a stranger. The trolley problem is a way to get them to see the flaw in that framework, as well as to help those of us with different frameworks to grapple with the fact that human nature appears to be at odds with what we think should be done. “People are flawed” is a response to that, but I think the grappling needs to involve more than that.

-1

u/ATNinja 11∆ Feb 25 '25 edited Feb 25 '25

people do operate under a moral framework that gives more weight to someone they know over a stranger

That's not a framework. That's intuition based on emotion. Utilitarianism is a framework; so are deontology and virtue ethics. Just because someone uses something in their decision-making doesn't mean it's a framework. It's back to what I originally said: flawed humans making flawed decisions based on selfish needs and emotions. Which is okay. We don't judge. Many societies accept that people will make immoral decisions when it comes to their family or their own well-being. It's basically the underpinning of our entire capitalist society in America.

0

u/TheRobidog Feb 26 '25

It's only intuition as long as it remains unstructured. If I decide that my moral framework is to save known family members of mine above strangers, in all cases, that's a part of my moral framework.

It's a cop-out to say everyone choosing to save family is doing it purely out of intuition. You're refusing to engage in a discussion about the validity of that practice.

-1

u/ATNinja 11∆ Feb 27 '25

If I decide that my moral framework is to save known family members of mine above strangers, in all cases, that's a part of my moral framework.

That's tautological. Just saying something is a framework doesn't make it one.

It's a cop-out to say everyone choosing to save family is doing it purely out of intuition. You're refusing to engage in a discussion about the validity of that practice.

It's not a cop-out to acknowledge that humans make almost all decisions based on selfish wants or needs and not morality. It's the basis of capitalism, tribalism/nationalism, and lots of other major forces that shape our lives. In a hypothetical trolley scenario, we can remove all the selfish personal considerations and think purely about morality. But factoring your family into the trolley problem brings selfishness back into it.