Well?

Well?

Other urls found in this thread:

moralmachine.mit.edu/
youtube.com/watch?v=x_qiVBy0B2Y
twitter.com/NSFWRedditVideo

Play the game!

moralmachine.mit.edu/

Only in the third scenario.

the faggots who chose to put their lives in the hands of a computer are obviously the ones who should bear the consequences of the computer's decisions

If it is driving so fast that it can't stop with its brakes, then it will have a hell of a hard time trying to swerve on normal car tires. It'd end up plowing through people anyway, just at an angle and more broadside, probably catching even more people in the process.

>faggots
Why the homophobia?

I hope self-driving cars will run over a homophobe to save its faggot driver.

So what happens when people start jumping in front of cars in order to kill them?

>girlfriend angry at boyfriend
>jumps in front of his car
>it swerves and he dies
>girlfriend runs off

Perfect, "murder".

you need to go back, newfag

what about: the car should drive slower inside cities in the first place? outside of cities we need more bridges/tunnels.

solution: don't buy a self-driving car

>faggot driver.
Why the homophobia? I guess you'll be the one getting run over bud.

>youtube.com/watch?v=x_qiVBy0B2Y
The only true solution

What if the driver wasn't a robot, but a friend of yours? What should he/she do in that scenario?

What happens if it swerves and there are even more people than just one person in the direction it swerves to?

>swerve to miss one moron in the street.
>end up swerving into the big crowd of non-morons waiting for the crosswalk signal to change

The brakes fail just before the incident. The car doesn't have enough time to slow down without brakes, and thus we're left with OP's scenario.

Are you legitimately so autistic that you don't get that the feasibility of OP's pic isn't the point? The image is trying to convey that with machines there is no "human error" that can account for decisions like these. With machines we have to decide which view of morality is "correct", because we have to write the guidelines by which the car decides things in a scenario like OP's pic.

Why are people walking on the street?

To get to the other chicken

That's not what the OP said. If you'll notice, the vehicle swerves and kills the driver in all 3 examples, even when no one is there. Obviously this isn't a braking issue, it's a programming issue. The vehicle either decided to kill the driver or thinks the street turns in that direction at that location.

Why not randomize it? Computer makes a virtual coin toss on whether to kill the passengers or the pedestrians. At least then you know you might not die.
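For what it's worth, the coin-toss policy is trivial to sketch. Toy code, every name invented; no real car firmware decides anything this way:

```python
import random

def unavoidable_collision_choice(entropy=None):
    """Virtual coin toss for the no-safe-option case: 50/50 between
    protecting the passengers and protecting the pedestrians."""
    rng = random.Random(entropy)
    return "protect_passengers" if rng.random() < 0.5 else "protect_pedestrians"

# Over many incidents each side bears the risk about half the time,
# so at least you know you might not die.
outcomes = [unavoidable_collision_choice(i) for i in range(10_000)]
passenger_share = outcomes.count("protect_passengers") / len(outcomes)
```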

...

not have been a retard
if a friend died from lung cancer from smoking i wouldnt blame the cigarette company
as long as the car's decision-making process was working as intended, it would only be his fault for willingly putting his life in the hands of a machine

How does a car know if someone just robbed a bank? The car's number 1 priority should be its passengers, otherwise I will not buy one. Then just don't interfere if there is no safe alternative. The people who walked in front of your car deserve to die because they made the mistake. A self-driving car would not be driving too fast to stop at a crosswalk, so anyone in your way fucked up, and if they fucked up they deserve to die.

This test makes the dangerous assumption that pedestrians are incapable of seeing a car speeding towards them and moving out of the fucking way. What if a pedestrian sees the car, runs to the side of the road, then the car swerves to the side of the road to avoid the pedestrian, where the pedestrian is now standing, killing both the driver and the pedestrian?

AI should assume that pedestrians have some level of self-preservation.

With this in mind, the ONLY smart move for an AI car is to brake as much as physically possible, while honking the horn and flashing the lights.

The system should do the best it can to avoid any collision and if it can't, it should try to take as much energy out of the system as it can. Asking for a computer system to do anything more than that on its own is flat out retarded.
>hurr we need a fucking advanced ethics engine in every car
Fuck that. That's all needless philosophical pondering. If you jump in front of a train, the train doesn't need a fucking special mechanism so the train driver can derail it.
>b-but what if it's a cargo train and there's a fucking bus full of children on the tracks
NO.

he makes a good point. tires have a limited amount of grip. if you can't stop in time, then the car would probably understeer and hit the pedestrians anyway, especially in OP's pic where you literally have to make a 90° turn to avoid hitting them.


correct thing to do is brake and try to slow down as much as possible.

consider that pedestrians might even do this on purpose, thus killing the driver while maintaining plausible deniability if the car's default behavior is to just ram into something hard when it can't stop in time to avoid hitting a soft obstacle (which is ridiculous in the first place)

shut the fuck up you retarded tumblrite

does the driver scream allahu akbar before?
this is important

the correct thing to do is minimize collision with soft obstacles while avoiding hard obstacles entirely.
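That rule can be phrased as picking the minimum-cost trajectory: hard obstacles cost infinity, soft obstacles cost something that grows with impact speed. A toy sketch, all weights and names invented:

```python
# Toy cost model: hard obstacles (walls, barriers) get infinite cost,
# soft obstacles (anything living) get a cost scaling with impact speed.
HARD = float("inf")

def trajectory_cost(obstacles, impact_speed):
    """Sum the penalty for every obstacle a candidate trajectory hits.
    obstacles: list of "hard" / "soft" tags; purely illustrative."""
    cost = 0.0
    for kind in obstacles:
        cost += HARD if kind == "hard" else impact_speed ** 2
    return cost

def pick_trajectory(candidates):
    """candidates: list of (name, obstacles, impact_speed) tuples."""
    return min(candidates, key=lambda c: trajectory_cost(c[1], c[2]))[0]

# Braking in-lane at reduced speed beats swerving into a barrier.
best = pick_trajectory([
    ("stay_and_brake", ["soft"], 3.0),   # slowed to 3 m/s before impact
    ("swerve_to_wall", ["hard"], 10.0),  # hits the barrier at speed
])
```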

...

>perfect "murder"
It's not murder in quotes, it's just murder, just like cutting brake lines, and you'd be murdering someone on camera.

any other behavior is simply too unpredictable and dangerous to work in real life

>gender preference is only a false binary scale
Will self-driving cars be transphobic?

Post yfw you realize you've only been here for a year and you're the newfag.

If the self-driving car can choose to kill the driver, it becomes a security risk, since it has a programmed scenario in which it does not put the driver's safety at top priority. That alone would make the car a poor investment compared to normal cars, where the driver has a choice; here everyone would be subject to the blanket opinion of the programmers.

>binary scale
You might have meant "gradient" or "line".

Can't computers figure out how to swerve and roll the car in such a way that nobody dies

Just let any driver play this game before driving and decide on these results

Obviously slow down as much as possible while honking, as the pedestrians also have the ability to move in that time. The scenarios these images are really trying to ask about can't be drawn or explained so easily, in which case of course the goyim will bear the mortality first.

program it in a way that there are no victims left after each accident so no one has to pay any money.

Is this the evolution of the trolley problem?

This isn't 1997, nobody robs banks anymore.

if it is traveling slow enough to make that sharp a turn, it is traveling slow enough to brake to a complete stop

>implying slamming into the wall will necessarily kill you

Slow down; cars can react faster than people. This situation should never happen.
OP said nothing about the brakes failing. With electric cars it is extremely unlikely that the brakes fail. Of course, if the brakes are broken, how are we to know that other things aren't broken too? Perhaps the steering doesn't work anymore either because the hydraulics are fucked.

Just brake

break me off a piece of that [math]jaywalking pedestrian [/math]

>first scenario
>do fat people deserve death?
LMFAO

One could even say "spectrum"

Kill them. Serves them right for crossing the street wherever they fucking please.

>large
Is fat too offensive?

People who jaywalk in front of a car when it's going fast enough that it cannot safely stop deserve to get run over far more than the passengers. Plus, if self driving cars sacrifice people for the greater good they will be less consumer friendly.

The car should slam on the brakes and plow through pedestrians.

underrated

JUST make the car not drive over the speed limit. Then none of these situations will happen, as there should be more than enough time for the car to brake.

This is only realistic if the brakes are broken on the car, but then it's not an AI problem...

Anyway, the AI car only needs to be better than humans.

Maybe you should go to a place where there's a post-rating system in place. It would work with some kind of vote, let's call it an upvote. Go there and you'll never have to worry about underrated posts ever again.

>This is only realistic if the brakes are broken on the car, but then it's not an AI problem...

Actually it IS an AI problem, because even if this situation only happens in 1 out of a million drives due solely to mechanical failure, the AI will still need to make moral choices. If we simply don't program anything at all into the AI, then we're just letting random chance decide.

There are also various other causes, which would also be rare, but we should still have plans ahead of time.

Self driving car should never kill its occupants. End of Statement.

What if the occupants are animals? What if it's just freight with no human occupants? It's not that simple.

>your own car trying to kill you
What the fuck is this shit?

It's a prisoner's dilemma.

If you alone opted for the option to save as many lives as possible, it would be bad for you. But if everybody who ever buys a self-driving car has the same AI that tries to save as many lives as possible, it is an overall benefit to you, because you're less likely to get run over.
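The claim is just an expected-value argument. Rough sketch with invented probabilities, only meant to show the shape of the trade-off:

```python
# Invented per-trip probabilities: you spend more exposure as a
# pedestrian near traffic than as the passenger of the one doomed car.
P_YOU_ARE_THE_PASSENGER = 1e-8
P_YOU_ARE_A_PEDESTRIAN = 5e-8

def personal_risk(fleet_sacrifices_passenger):
    """Your chance per trip of being the one who dies, under each
    fleet-wide policy."""
    if fleet_sacrifices_passenger:
        return P_YOU_ARE_THE_PASSENGER  # the car kills its passenger: you
    return P_YOU_ARE_A_PEDESTRIAN       # the car plows through: maybe you

# With these numbers the "save the most lives" fleet is better for you
# overall, even though it is worse in the rare case you are the passenger.
```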

Any company that implements suicide cars is DOA economically-legally-ethically. Kill yourself, psychopathic fuck.

>But if everybody who ever buys a self-driving car has the same AI that tries to save as many lives as possible, it is an overall benefit to you, because you're less likely to get run over.

not really. if everyone yields right of way when they're supposed to, there are no problems. i wouldn't expect a human driver to swerve into a barrier in order to save someone who did not yield right of way when they were supposed to.

Something like this will probably be legally mandated.

It makes sense that the people who are actually purchasing the technology and subjecting the rest of the world to it should assume the responsibility and the risks.
And again, it would never and should never be implemented by ONE company (except possibly as an optional setting for especially selfless individuals). It would only make sense if it were legally mandated that all driverless cars function with the same "moral" logic, which increases the overall safety of everybody in the country.

You're intentionally avoiding the situation where for some reason nobody is "at fault", which is the most difficult. What if the car has an unexpected mechanical failure and a person is legally crossing at a crosswalk?

>legally mandated
By whom? China?

overrated

If you are desperate to find someone at fault, then look at the manufacturer, or whoever allowed the car to be on the road in poor condition. What you are doing is turning an accident into an execution and assuming that the driver and passengers are the ones who have to take the bullet.

>You're intentionally avoiding the situation where for some reason nobody is "at fault", which is the most difficult. What if the car has an unexpected mechanical failure and a person is legally crossing at a crosswalk?

it should go into some fail-safe mode, perhaps emergency braking and engine braking. if a car has an unexpected mechanical failure, then it should behave as consistently and safely as possible. that does not mean steering into something, which is a complex maneuver that may not be reliably executed while the car is having mechanical issues.

Braking is the only right answer. The reduction in the speed of the car directly correlates with the likelihood of the pedestrians surviving. Turning while braking is almost never a better way to stop the car. The car should not have gotten into this situation if the AI was functioning properly and the pedestrians looked both ways. Since that is the case, do your best to protect the occupant rather than the pedestrians.
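The physics backs this up. Standard constant-deceleration kinematics, with made-up numbers for speed and road friction:

```python
import math

def impact_speed(v0, mu, distance, g=9.81):
    """Speed (m/s) remaining after straight-line braking over `distance`
    metres: v^2 = v0^2 - 2*mu*g*d."""
    v_sq = v0 ** 2 - 2 * mu * g * distance
    return math.sqrt(v_sq) if v_sq > 0 else 0.0

# 50 km/h (~13.9 m/s) on dry asphalt (mu ~ 0.7), pedestrian 12 m ahead:
# the car still hits, but at roughly 5 m/s (~19 km/h) instead of 50 km/h,
# a huge difference in the pedestrian's odds of surviving.
v = impact_speed(13.9, 0.7, 12.0)
```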

How the fuck does the car's AI know who is a criminal or not? Is it kit from knight rider?

the car should stay in its lane and slow down. there's literally no other realistic answer to this

All you're doing now is trying to avoid the question.

Maybe a car will have a mechanical failure, be able to pull safely off to the side, and nobody gets hurt. Maybe a car will see a pedestrian and just brake. Maybe a car will have a mechanical failure, go into a failsafe mode, and prevent an accident.

But what about that tiny fraction of situations where the worst possible series of events just coincidentally happens? Even if it's incredibly rare, the amount of driving that happens every day means that it will happen at least a few times a year.

The whole point of a thread like this is that we need to decide what should happen in these situations. You shouldn't just weasel out of a hard question by trying to come up with some loophole in the way the question is phrased. Finding a loophole where nobody needs to get hurt to avoid the moral question is just missing the point, so just stop. Maybe it's a series of mechanical failures, maybe it's black ice on the road, maybe it's a failure of the actual road infrastructure itself. But situations will eventually arise where an AI car will need to choose between the safety of its passenger or the people on the street.

>i wouldn't expect a human driver to swerve into a barrier in order to save someone
Of course, when a real life person is put in this situation, they need to make a last second instinctual reaction without thinking it through because it happens so fast. However, an AI driver has the benefit of deciding ahead of time in a calm, rational manner what it will do in any given situation. Logic that applies to panicking, confused humans with 1 second to decide doesn't apply to a cold, calculating machine that was already told on the day it was made what it will do in every situation it ever encounters

>But situations will eventually arise where an AI car will need to choose between the safety of its passenger or the people on the street.

people on the street still have freedom of movement. running into a barrier is unfair to the driver who is stuck in the vehicle with no control.

the common sense answer is to slow down and stay in the lane. anything else is unpredictable and probably difficult from an engineering standpoint. or rather, it's premature optimization.

Leave you fucking nigger

>With electric cars it is extremely unlikely that the brakes fail.
Engineers who design cars are supposed to take into account a variety of extremely unlikely scenarios that could pose a hazard to life.

If the feasibility of the scenario is irrelevant, why not debate whether self-driving cars should be given machine gun turrets to fight off Somali car pirates?

He's literally wearing a mask and carrying a sack with a dollar sign on it.

Any half-retarded image recognition AI would clock him as a criminal.

Why are so many people quibbling with the hypothetical's rules?

run over the faggots every single time.

they shouldn't be on an automated road.

Obviously they should, every person has a right to self-defence.

Now what about an answer to the question in the OP?

>The self driving car with sudden brake failure
>Stay in its lane and slow down.
>literally no other realistic answer to this
Are you unable to read?

>He's literally wearing a mask and carrying a sack with a dollar sign on it.
>Any half-retarded image recognition AI would clock him as a criminal.
What if it's Halloween?

Should a human kill themselves in those scenarios?
Should a car do anything different than the person would?

If you're talking life and death... high-status men should be saved first. It's about reproduction. Homosexuals can't reproduce with each other, and more women reproduce than men.

...

I think the AI should determine if there is a crosswalk there or not, and if so it should have slowed down ahead of time.

If not, I would not want myself or a user to die to save a dozen jaywalkers.

your property should always prioritize the owner at all cost.

If there is no legal crosswalk then the only obligation should be to brake

no one would ever program in some kind of retarded escape maneuver like that

No one is going to buy or even sit in a car which kills you.

>capacity for genetic reproduction is more important than capacity for generating good ideas and good work

Dan Puperi

What's wrong with letting blackbox random chance decide?

There’s no way a robot could be smarter than a human. There’s nothing a robot could know that some human wouldn’t have already. AI is connected to the internet......all of it.........and everything on the internet was made by humans, including viruses

But no single human has ever learned everything on the internet, AI could.

traffic rules already exist to help to stop that from happening, and determine who's responsible in cases where it does happen

>self driving cars begin plowing through mcdonalds playpens

shouldn't the car identify the threat well before it gets to that situation

This, the car needs to perform constant checks on brake systems before it ever starts up, and routinely throughout the trip.
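Something like this, sketched. Every class and threshold here is invented for illustration; real automotive brake diagnostics are nothing this simple:

```python
# Hypothetical pre-trip and in-trip brake health check. All names and
# thresholds are made up, not any real vendor's API.
class BrakeMonitor:
    def __init__(self, pressure_reading, pad_wear_reading):
        self.pressure = pressure_reading  # callable -> line pressure (kPa)
        self.pad_wear = pad_wear_reading  # callable -> fraction worn (0..1)

    def healthy(self, min_pressure=800.0, max_wear=0.9):
        """True only if pressure is adequate and pads aren't worn out."""
        return self.pressure() >= min_pressure and self.pad_wear() <= max_wear

def may_start_trip(monitor):
    """Refuse to start (or pull over mid-trip) when brakes look degraded."""
    return monitor.healthy()

good = BrakeMonitor(lambda: 950.0, lambda: 0.3)   # passes both checks
worn = BrakeMonitor(lambda: 950.0, lambda: 0.95)  # pads too worn: refuse
```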

I agree. Pretty rude desu

>1 large woman