An interesting example of practical ethical philosophy is being presented by automated cars. The cars will have software which enables them to make incredibly rapid decisions in the event of an imminent collision. They will be programmed to minimise human injuries, so if the choice is between driving into a bus queue or a wall, the car would choose the wall, because the occupants of the car are outnumbered by the people in the queue.

Now there is talk of equipping these cars with a user setting which would give the owner/occupant (can't really say driver) a choice: 'altruist', which would retain the imperative to minimise harm to the greatest number of people; or 'egoist', which would protect the occupants of the car at all costs, no matter how many others are killed or hurt.

So what setting would you choose? Anonymous poll, be honest. I reckon most of us would put 'altruist' if it was only us in the car. Now imagine that you can't change this setting after the initial selection, and that you will be driving your kids (or grandkids) around. This is quite a tricky question, so take your time and ponder. For one thing, any accident would not be your fault. You're not driving.
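As a toy sketch of what the two settings amount to, the choice boils down to sorting the available manoeuvres by a different key. Everything here is made up for illustration (the function name, the option list, the casualty numbers); no real car works off a three-field tuple:

```python
# Hypothetical sketch of the 'altruist'/'egoist' setting described above.
# All names and casualty figures are illustrative assumptions only.

def choose_manoeuvre(options, mode="altruist"):
    """Pick the manoeuvre with the fewest expected casualties.

    options: list of (name, occupant_casualties, bystander_casualties)
    mode: 'altruist' minimises total casualties;
          'egoist' minimises occupant casualties first, bystanders second.
    """
    if mode == "altruist":
        key = lambda o: o[1] + o[2]        # total harm, everyone counts equally
    else:  # egoist
        key = lambda o: (o[1], o[2])       # occupants first, then everyone else
    return min(options, key=key)[0]

# The bus-queue example: four occupants versus a queue of ten.
options = [
    ("hit wall", 4, 0),    # occupants die, queue spared
    ("hit queue", 0, 10),  # occupants spared, queue dies
]
```

With this sketch, the altruist setting drives into the wall (4 casualties beats 10) and the egoist setting drives into the queue (0 occupant casualties beats 4), which is exactly the fork in the road the poll is asking about.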
This isn't really going to work, is it? Everyone will vote altruist, but would be an egoist in practice. I am, of course, on a different level, so will vote altruist in good conscience.
Not sure, but that's why I made it anonymous. And if it doesn't work, fine; you win some, you lose some, it's all the same to me. After a bit of thought I went for egoist, on the grounds that in a world of self-driving cars no accident would be my fault, so I'll pass the buck to the machines. They will probably be German, so very easy to blame. In the real world I don't think humans should be given the choice; just let all the cars be programmed to altruism permanently.
Why should it, Strolls? Frankly, if I'm in a car with family, friends or other familiars and, for whatever reason, some sort of catastrophe is imminent and I somehow have the time and nous to make a rational decision, I think that decision would be one that optimises the chances of survival of said loved ones. Of course, it would be fantastic if that same decision was also best for complete strangers in the locale too, but I would unashamedly prioritise the well-being of those for whom I care over those I don't know from Adam. Sorry. But it's honest.
The question was about a pre-setting choice though, wasn't it? In the throes of a life-threatening situation, pretty much everyone would choose to save themselves and their loved ones before strangers. The more I think about it, the more it's a clever philosophical question by our friend Stan. If everyone were to choose altruism, we would all be equally safe, but as soon as one person chooses egoism… wait a minute, that doesn't work. I'm changing my vote.
I'd go for the altruistic option and inform anyone getting in the car of this beforehand. My making the decision should not necessarily make it everyone else's choice (wouldn't it be egotistical to force my decision on my passengers otherwise?).
Science says that the rationale of all species over all time is egotistical: "this is my gene pool, these are the creatures that need to survive... b@gger the rest of you". Some species (penguins, I think) kill other parents' chicks if there is insufficient food. Humans are (allegedly) the first species of sufficient intellectual level to ask this question. To answer your question, sb: even if the car was responsible and I was merely the "driver" who survived, how could I live with killing my grandchildren (I don't have any yet)? No, I couldn't. I wouldn't tell anyone, but my setting would be egotistical.
I wouldn't get in a car that had no human input into what it was doing, so you can ignore this post. I guess that makes me by default a control freak!
Yes, amazingly I was able to grasp what the question was about. As my instinct is always to put the safety of my loved ones over strangers, I would choose the pre-set that begins with E. There are, of course, scenarios where this choice might be uncomfortable, such as when I'm the sole occupant of the vehicle and outside are the Belles of St Trinian's, but I'll take my chances on the law of averages.
The selfish gene in action. There are plenty of examples of amazing altruistic behaviour in animals too. https://www.theguardian.com/science...ns-merciful-monkeys-can-animals-be-altruistic They all turn out to be genetically motivated. All those selecting altruism - you are probably doing it because it makes you feel good about yourself. Let the machines make the choice. We don’t have the equipment (and I have spent a lifetime convincing myself that I do). I like this question.
This isn't a difficult question imo. If there's a chance that my loved ones will be in the car with me then it has to be Egoist.
The interesting thing is, Col, I reckon that for most of us, if we were in control of the car, we would drive into the wall. Not because a different moral code kicks in, but because in a split-second reaction with no thinking time we would instinctively try to avoid the people in front of us. If you are in your self-driving car, do you think you should have a button to set egoist/altruist, or should the car make the decision?
Ok, let's try a variation of this: the classic trolley dilemma. A runaway tram is careering down the tracks. There are five workers, unaware of its approach, in its path. You are standing by a lever which diverts the tram onto another track, where it will kill only one worker. Do you pull the lever? Easy? What if the one worker is a friend or relation of yours?

Similar situation, but there is no lever. You are standing on a bridge over the track, with the five workers in the path of the tram. A massively obese bloke is standing next to you, and you are certain he is big enough to stop the tram and save the workers. Do you push him off the bridge into the path of the tram? The net effect, sacrificing one to save five, is identical to pulling the lever.
Not really. I chose altruism because I consider it the 'right' thing to do. If everyone in this scenario did the same we would all be equally safe. I and my family would only be endangered by the selfishness of others.
If the car is so badly put together that it can veer off the road into queues of people, I would probably buy a bicycle.
And by doing the ‘right’ thing your invisible internal reward system releases some endorphins in your brain which for a fleeting moment makes you feel good. It’s biology, you can’t control it. When I selected ‘egoist’ it had the opposite effect. It’s all algorithms. What’s interesting about this question is that it is potentially a real one that many of us will have to answer, unlike the academic games which test the same responses.
Does living in New Zealand stunt your imagination? An out-of-control sheep truck is going to hit your car head on. Your car can accept the collision, guaranteeing the death of all occupants; veer left into a wall; or veer right into the softer landing of a queue of people.