In a Crash, Should Self-Driving Cars Save Passengers or Pedestrians? 2 Million People Weigh In (pbs.org)
In what is referred to as the "Moral Machine Experiment", a survey of more than two million people from nearly every country on the planet, people preferred to save humans over animals, young over old, and more people over fewer. From a report: Since 2016, scientists have posed this scenario to folks around the world through the "Moral Machine," an online platform hosted by the Massachusetts Institute of Technology that gauges how humans respond to ethical decisions made by artificial intelligence. On Wednesday, the team behind the Moral Machine released responses from more than two million people spanning 233 countries, dependencies and territories. They found a few universal decisions -- for instance, respondents preferred to save a person over an animal, and young people over older people -- but other responses differed by regional cultures and economic status.
The study's findings offer clues on how to ethically program driverless vehicles based on regional preferences, but the study also highlights underlying diversity issues in the tech industry -- namely that it leaves out voices in the developing world. The Moral Machine uses a quiz to give participants randomly generated sets of 13 questions. Each scenario has two choices: you save the car's passengers or you save the pedestrians. However, the characteristics of the passengers and pedestrians vary randomly -- including gender, age, social status and physical fitness. What they found: The researchers identified three relatively universal preferences. On average, people wanted to spare human lives over animals, to save more lives over fewer, and to prioritize young people over old ones. When respondents' preferences did differ, they were highly correlated with cultural and economic differences between countries. For instance, people who were more tolerant of illegal jaywalking tended to be from countries with weaker governance, nations with a large cultural distance from the U.S., and places that do not value individualism as highly. These distinct cultural preferences could dictate whether a jaywalking pedestrian deserves the same protection as pedestrians crossing the road legally in the event they're hit by a self-driving car. Further reading: The study; and MIT Technology Review.
Passengers... (Score:5, Insightful)
A self-driving car should protect its passengers first, or it wouldn't sell. Who would willingly ride in a vehicle that would intentionally sacrifice their life for any reason?
Re: (Score:3, Insightful)
The passengers have seatbelts, air bags, and crumple zones to lessen their injuries, though. Pedestrians might as well be naked.
Re:Passengers... (Score:5, Interesting)
The passengers have seatbelts, air bags, and crumple zones to lessen their injuries
The question is usually framed to already take that into account. The way I have heard it is:
Choice 1: Hit pedestrian.
Choice 2: Drive off a cliff and kill the passenger.
It may be an interesting philosophical question, but it has little to do with reality. A scenario like that is almost never going to happen, and even if it did, a human driver would be faced with the same split second dilemma and be no more likely to make the "correct" decision (whatever that is).
Far more important is that the SDC would have much better reaction time, more braking distance, better control of steering, and more situational awareness of other traffic, and would thus be better able to kill no one.
Re: (Score:2)
Re:Passengers... (Score:5, Insightful)
While this is an interesting hypothetical scenario, I might suggest that the number of times that this sort of thing has actually been any kind of real choice to have to make, particularly in a situation that was not preventable by paying enough attention to the road in the first place, is probably countable on one hand in the entire history of automobiles, if not actually entirely non-existent.
The ideal is that the self-driving car would be paying enough attention (tirelessly, I might add) to the road and what lies ahead that this sort of "kill the driver or kill the pedestrian" situation that people like to dream up wouldn't ever arise in practice... an automated car that is genuinely designed for safety simply would not drive faster, in any hypothetically reduced-visibility situation, than a speed at which it could still stop safely in the first place.
Re: (Score:2)
Re: (Score:3)
Negligible outliers aside, no human driver is going to choose any other being's life over their own if the choice is between them or someone else.
That's just human nature, self-preservation.
Re: (Score:3)
In such split-second decisions humans react on instinct. Unless specially trained, that instinct will be self-preservation or panic paralysis. Think of someone punching a person: most people will just try to cover themselves or freeze "like a deer in the headlights". People trained in hand-to-hand combat will get out of the way, block, redirect the punch or even use it to attack back. The question at hand is, should cars react exactly like humans, and if so, which humans? If not, how should the cars react
Re: (Score:2)
Re: (Score:3)
It may be an interesting philosophical question, but it has little to do with reality. A scenario like that is almost never going to happen, and even if it did, a human driver would be faced with the same split second dilemma and be no more likely to make the "correct" decision (whatever that is).
It's not just a philosophical question. A team of engineers has to sit down and write code, or at the very least models for machine learning, that will allow a self-driving car to make a reasonable decision in any conceivable scenario. The choice you give is just a marker for a whole class of decisions that some cars will have to make at some time. This is a real problem that these engineers have to face before these cars are on the road.
The fact that human drivers in the same situation could make a poor c
Re: (Score:3)
In the matter of legal innocence or guilt: the person chose to enter the vehicle and take a private risk. The other person chose to be protected by their government and take a walk upon government-owned and controlled land. The AI is programmed with choice by the programmer and funded by the corporation for profit. So the choice to be made is not one life over another; the choice is to commit premeditated murder to suit the convenience of the person who chose to enter the vehicle.
So in terms of legal choic
Re: (Score:2)
Re: Passengers... (Score:2)
Re: (Score:2)
Re: Passengers... (Score:4, Funny)
Re: (Score:3)
It is the driver's responsibility to take reasonable measures to avoid hitting pedestrians, but sometimes pedestrians just do something monumentally stupid (or suicidal) and get hit. If you are obeying a 50mph speed limit and a pedestrian steps right in front of your moving vehicle and you don't have time to react and stop, then the police will assess the situation and usually declare you not at fault.
If the police decide that you were breaking the law (speeding, using a phone while driving, driving a vehicle
Re: (Score:3)
Nice theory but where I live the law says otherwise. Pedestrians ALWAYS have the right of way.
No, that's not what the law says. It's just that millions of ignoramuses believe that that's what it says because multiple generations of driving instructors have been misleading them. Go look through the traffic laws; you won't find anything like that.
I found a reference that summarizes the various laws by state. Some states are more restrictive than others. http://www.ncsl.org/research/transportation/pedestrian-crossing-50-state-summary.aspx [ncsl.org]
Re: (Score:2)
In the scenarios in the linked survey, there are ones where the pedestrians have a green light and ones where they are walking against a red light. I always prioritized legal pedestrians over passengers and passengers over illegal pedestrians. Note that in every case the situation was caused by a brake failure - not something that the passengers are directly responsible for at that moment (unless they are knowingly using unlicensed/uninspected/unmaintained vehicles and that's why the brakes failed -
Re:Passengers... (Score:5, Insightful)
Re: (Score:2)
Hmm... I wonder if you would like to be sentenced to prison. You would be surrounded by inmates who think outside the rules and aren't knee-jerk authoritarians which would thrill you. On the other hand, you would also be subject to a lot of knee-jerk authoritarians with badges which you might not like -- but that would give you even more opportunities than usual to show how much you think outside the rules. Tough call :)
Re:Passengers... (Score:5, Insightful)
This, in a nutshell, is everything wrong with our society. We have way too many people who think that jaywalking and prison rape are equivalent.
In most countries, jaywalking is not even a crime. In America, it is mostly used by the police to target young people and minorities.
Re:Passengers... (Score:5, Interesting)
In most countries, jaywalking is not even a crime.
Citation needed.
Anyway...
Jaywalking is only called as such if it's a crime. Otherwise it's called "crossing the road". So let's focus on where it indeed is illegal to cross the road wherever you feel like.
Jaywalking was always a tricky thing. There are many variables to consider.
Is it okay to cross a road through an unmarked location, if the road is empty for hundreds of yards either way or with no car in sight?
Is it okay to play IRL Frogger in a busy intersection in the middle of the city?
Personally, I am a strong supporter of (enforcing) heavy fines in case of jaywalking anywhere within a city or town's borders, with the exception of single-lane, one-way streets with speed bumps or a speed limit below 15 mph. The reason for this is my belief that a civilized society is based on respecting the "small rules": no littering, no jaywalking, no unruly behavior, no making a lot of noise, you know, common sense things.
I'm from a country where jaywalking is punished... in theory. In practice, nobody gives a flying fuck, and as a result I stay at the red light with my little kids and everyone else just jaywalks, so I struggle to properly educate my kids to be civilized because everyone else shows them, through their apish behavior, that their dad is an idiot for following simple common sense rules. Am I an idiot for teaching my kids a civilized rule?
In the past I used to work as a cameraman for a local branch of a country-wide private TV channel. One of my tasks was to document all major incidents for the local police, as at the time they did not have their own cameraman. I have documented car accidents, fires, demolitions gone wrong, suicides, homicides, pretty much anything with victims (be they wounded or killed). I've seen the fatal effects of jaywalking, very closely and from a wide variety of angles. People who jaywalk have no fucking clue. I know exactly what I am keeping my kids away from, and I cringe every time I see parents dragging their kids across the road, in a hurry, because cars are coming at 30-40 mph. There was a case from back then where a parent with two kids jaywalked; one of the kids dropped his toy, pulled his hand from his father's, ran back to pick it up and was run over by a car. The other kid got scared and ran the other way, across the middle of the road, and got hit by another car. The parent was unscathed but ended up with one dead child and another crippled for life. All because he decided not to wait for 30 more seconds.
So yeah, it doesn't matter if jaywalking is a crime. Before it being a crime, it's a common sense rule. It became a crime because people lack common sense, so it needs to be hammered into their thick skulls with fines and such.
Re: (Score:3)
In most of Europe and Asia, jaywalking is not a crime... You can cross the road wherever you feel it's safe to do so, and in most cases designated crossings are generally safer.
However in a lot of countries (India, Myanmar, etc.) traffic frequently ignores crossings and it's dangerous to cross whenever there is any traffic around. In many countries it is actually safer to cross in the middle of a traffic jam because the traffic will be slow or stationary; if the traffic is actually moving it won't stop for yo
Re: Passengers... (Score:3, Insightful)
In most countries, jaywalking is not even a crime.
In the country I'm currently in traffic laws in general seem to be little more than suggestions, and right-of-way is a foreign concept. So what? They also have one of the highest vehicle fatality rates in the world. What kind of idiot thinks that "it's legal in other places" is a good argument?
In America, it is mostly used by the police to target young people and minorities.
Ah. That kind of idiot.
Re: (Score:2)
UK says get lost, we'll cross anywhere we like, it's perfectly legal and expected. (excludes motorways). We don't have 'jaywalking' laws here.
Re: (Score:3)
And as soon as that happens ... the inflatable "plastic passengers" which are used to fool surveillance cameras on "dual occupancy" or carpool lanes will start being weighted, so the car thinks it has an actual passenger on board, and will therefore be more likely to protect the driver by proxy.
My first guess would be that users would fill the inflatable legs and torso with water, to trip the weight sensor in the seat.
Re: Passengers... (Score:2)
Re: (Score:2)
In a "self-driving" car, the "driver" you refer to is a passenger.
Re:Passengers... (Score:5, Insightful)
A self-driving car should protect its passengers first, or it wouldn't sell. Who would willingly ride in a vehicle that would intentionally sacrifice their life for any reason?
No, actually, we're going to let the traffic engineers at the Department of Transportation set the rules, which will be the same as for humans (stay in lane, stop as fast as you can, DO NOT SWERVE) and the engineers won't even ask the public.
Re: (Score:2)
Legislation might require that the passengers in the car be prioritized below law abiding pedestrians in failure cases. This would encourage people to buy/rent/share the most reliable cars -- i.e., those that don't have as many failure cases that require making such decisions. It also makes the person responsible for the selection (the passengers) accountable for their actions.
Re: (Score:2)
A self-driving car should protect its passengers first, or it wouldn't sell. Who would willingly ride in a vehicle that would intentionally sacrifice their life for any reason?
And if your car harms other people, that alone will make them win any court case against you.
What do you think happens if you intentionally kill people with your car to minimise your own damage?
A stupid and pointless debate.... (Score:3)
1) By being able to operate a vehicle orders of magnitude faster and with far more information than a human, the chance that the car will ever even get into a situation where this decision would have to be made is very, very small.
2) If it gets into this situation where stopping entirely w/o injuring anyone is off the table, then the car will have so little time to react that making a decision to kill one group or the other and acting on it is a pointless exercise.
Also, there ar
Re: (Score:2)
A self-driving car should protect its passengers first, or it wouldn't sell. Who would willingly ride in a vehicle that would intentionally sacrifice their life for any reason?
This is great, I came to the exact opposite conclusion. The passengers of the vehicle signed up for the risk, the pedestrians did not. So peds who are not invested in the risk of using a self-driving car should be spared over the passengers if there's a choice to be made about who lives or dies.
Re:Only Americans are selfish, according to resear (Score:4, Insightful)
The US is a nation composed largely of immigrants and their offspring, many of whom have arrived comparatively recently. In many cases they came not because it was convenient (getting to the US from Poland or Italy, for example, was not "convenient" before air travel - esp. for poor people) or because it was easy or because it was low risk. They subjected themselves to substantial risk, expense, and inconvenience to make the trip and survive in the US.
These immigrants, of course, left behind those that didn't have the same drive or interest in creating a better situation for themselves and their families. It would not be surprising that those who had the gumption to better themselves rather than sacrifice themselves for the "common good" would be looking out for themselves and their families more strongly than those that lacked such gumption and remained behind.
As well, the US has historically been one of the most diverse populations in the world (due to the source of our population) so the tribal "common good" notion is probably unsurprisingly much stronger than in monocultures like Japan or most of the Nordic countries.
The US seems to have done pretty well - esp. in light of having to deal with its very diverse population.
Re: (Score:3)
So, if a pedestrian high on drugs illegally steps out in front of your car in the middle of a high speed road where there is no pedestrian crossing, you would choose to drive your car into a barrier to avoid the pedestrian even if that would result in the almost certain death of yourself and three other family members in your car? What if you were driving a carpool of kids and two of the passengers who would die aren't even your kids -- they are neighbor's kids?
Re: Passengers... (Score:3)
Re: (Score:3)
Apparently you've never lived in a town with train track crossings.
Re: Passengers... (Score:5, Insightful)
Re: (Score:2)
As do bicyclists that aren't obeying the rules of the road... which you see ALL the time.
Stupid False Question (Score:5, Insightful)
You never know for certain that a given course of action will cause a fatality. When you're driving, you try to avoid accidents. Self-driving cars will do the same. They'll compute the odds of an accident for all options and select the one with the lowest odds. It may be just a fraction of a percent less likely, but it will take that.
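In rough code terms, the idea is nothing fancier than picking the option with the lowest estimated risk. The sketch below is purely illustrative; the maneuver names and probability numbers are made up, not taken from any real system:

```python
# Hypothetical sketch of "compute the odds for all options, take the lowest".
# Maneuver names and probabilities are invented for illustration only; a real
# planner would derive them from sensor data and a vehicle dynamics model.

def pick_maneuver(options):
    """Return the candidate maneuver with the lowest estimated collision probability."""
    return min(options, key=lambda o: o["p_collision"])

candidates = [
    {"name": "brake_in_lane",          "p_collision": 0.031},
    {"name": "brake_and_swerve_left",  "p_collision": 0.029},
    {"name": "brake_and_swerve_right", "p_collision": 0.054},
]

best = pick_maneuver(candidates)
print(best["name"])  # brake_and_swerve_left, chosen even though it is only ~0.2% better
```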
Re: (Score:2)
In all of the cases in the survey I took at this site, SOME form of loss of life was unavoidable (although some were between humans and non-humans).
However, in many real world cases, there are likely to be differing probabilities of human death and, certainly, self-driving software should take that into account. A car hitting an elderly pedestrian squarely at 35 MPH is very likely to kill the pedestrian (maybe >80%?) but in a modern car with the highest crash protections, hitting a concrete barrier at the
Depends (Score:2)
Re: (Score:2)
You'd probably see a lot more of them depending on the country [satwcomic.com].
This is insane (Score:2)
No questionnaire can resolve this problem.
Cars must prioritize the safety of their occupants over everything else. If they do not, then people can murder others through machine logic without any hacking at all. Additionally, no computer will ever be advanced enough to know that the next decision it makes will be better or worse than just defaulting to saving its occupants.
It is also immoral to evaluate lives based on worthless criteria like age, gender, political, racial, class, or religious ideology.
Just imagi
Re: (Score:2)
>It is also immoral to evaluate lives based on worthless criteria like age, gender, political, racial, class, or religious ideology.
I agree with most of that. However, you shouldn't lump age into that. Most people would find it morally correct to save a kid over an adult. That being said, it is still best to not factor any of it in.
Re: (Score:2)
Saving a kid over the adult is not as cut and dry as it sounds. What about all the other kids that the adult may be financially supporting? Additionally, how far can that go? Is it better to save an 11-year-old vs a 12-year-old? Can a vehicle/machine really make that evaluation correctly each time?
I agree that most people would go with the cut and dry "save a kid over an adult" choice, myself as well, but it just really is not that simple when you really start to think about it. There are all sorts of additional
Re: (Score:2)
Saving a kid over the adult is not as cut and dry as it sounds. What about all the other kids that the adult may be financially supporting? Additionally, how far can that go? Is it better to save an 11-year-old vs a 12-year-old? Can a vehicle/machine really make that evaluation correctly each time?
The car needs more information. We need to have an identifying beacon of sorts so the car can identify us and evaluate our lives or just look up our social value ranking or some such metric such as predicted tax contributions over their lifetimes.
And if there are multiple people at stake it may have to decide whether killing 2 lower ranked people is preferable to killing 1 of a higher rank. Taking an average doesn't seem fair, but neither does just adding up the 2 lower ranked people's scores.
I dunno, is
Re: (Score:2)
Re: (Score:2)
Why?
Re: (Score:2)
My initial reaction on reading the summary was that there were no surprises, but then it referred to cultural differences and it suddenly struck me that actually it is surprising that there was universal preference to save the young. Certainly historically there have been cultures which valued the old (with their wisdom and experience) over the young.
Re: (Score:2)
That is a massively incorrect analogy. The trolley car problem is a super specific problem, with morals driven by either emotional-value or numerical-value criteria. It also comes with the supposition that no other fallout will result from the decision. Additionally, the problem does not address an "occupant" vs "external life" scenario in all cases. So in short the Trolley Car problem does not even come close to the real problem here... not by a long shot.
Self driving vehicles will be exposed to wildly dy
Re: (Score:2)
If I want you dead all I need to do is throw something that can trick a machine into thinking its a human baby into the road and watch the ensuing carnage!
And you will get caught by multiple cameras, and go to jail for a very long time. But also, if you want someone dead, you can just take a gun.
Stupid Question (Score:2)
Really, it's a clickbait question. An automated car seeing a pedestrian it may collide with will brake as hard as possible while avoiding other obstacles and staying in its prescribed lane. The chances of this happening, as well as of the car having to swerve into an obstacle that would also injure the driver, are so small as to be irrelevant noise.
Re: (Score:2)
I agree, they wasted time and money asking 2 million people a stupid bloody set of questions. I'm guessing they know fuck all about autonomous vehicle systems, because if they actually knew anything they might have been able to ask some useful questions.
Here's some more interesting questions:
If a $2000 lidar system can see ahead 500 yards and a $1000 lidar system can see ahead 250 yards, should the manufacturers be allowed to just install the $1000 system?
Or
Should the government be mandating what dista
Trolley problem by another name (Score:3)
Trolley problems fail rigor because they make a critical assumption: an artificial intelligence is smart enough that it knows the results of two choices each with negative outcomes, but is somehow not smart enough to have avoided that situation to begin with. An AI developer who is trying to produce the safest AI system possible is prioritizing the likely cases first and attempting to produce the best reaction in your typical crash. Nobody in development is concerned about the situation where you have a car speeding down a narrow road where a pedestrian steps out at just the right time and place such that the only course of action is to crash into them or crash into a power pole. That situation is rare and shouldn't be optimized yet.
Let's say that we're worried about optimizing that situation now and we somehow have an omniscient AI that still runs into this situation. Now our problem is probabilities. What's the probability that the pedestrian will jump out of the road in time and no crash will happen? What's the probability that the pedestrian will die from the crash? What's the probability that the passenger will die if we swerve into the light pole? Who is going to be harmed by that falling light pole?
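Stated as a computation, that framing reduces to comparing expected harm across the available actions. The probabilities, harm values and outcome categories below are assumptions invented purely for illustration:

```python
# A minimal sketch of the probability-weighing framing above. All numbers and
# outcome categories are made up for illustration; they are not from the study.

def expected_harm(outcomes):
    """Sum of probability * harm over the possible outcomes of one action."""
    return sum(p * harm for p, harm in outcomes)

# Action A: stay in lane and brake; the pedestrian may or may not get clear.
stay = [(0.6, 0.0),    # pedestrian gets out of the way, no harm
        (0.4, 1.0)]    # pedestrian is struck, severe harm

# Action B: swerve into the light pole; the passenger and bystanders bear the risk.
swerve = [(0.7, 0.3),  # passenger injured by the impact
          (0.3, 0.8)]  # pole falls or the impact is severe, greater harm

best = min([("stay", expected_harm(stay)), ("swerve", expected_harm(swerve))],
           key=lambda t: t[1])
print(best)  # whichever action minimizes expected harm under these assumed numbers
```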
Re: (Score:3)
No. Their action of stepping off the curb in front of you dooms them, not your reaction time. We need to start holding pedestrians accountable for their own safety rather than automatically assuming that the automobile is at fault. The whole notion of "pedestrian has the right of way no matter what" is ludicrous when analyzed objectively. The idea that just because the pedestrian stepped onto the road in a crosswalk means that all traffic must stop instantly and in contravention of the laws of physics and/o
Of course you should prioritize the pedestrians (Score:2)
Unless I'm the passenger.
Theoretical not actual issues. (Score:2, Interesting)
There will NEVER be a set rule of anything like "protect passengers over pedestrians", or vice versa, because that is not how computers work. And forget about age discrimination; that is just plain stupid. The computer will have a hard enough time deciding if an obstacle is a pedestrian; it won't have that kind of higher logic to estimate the age of the people.
It might not even be able to tell how many people are in the car let alone how many people are currently standing in the middle of the road.
The c
and when the sensors mess up and class a kid (Score:2)
and when the sensors mess up and class a kid as debris that's safe to run over??
Re: (Score:2)
If it's a kid in the road you're probably on a residential street. It's probable that if you're driving one of those streets, rather than trying to park on them using an assisted park feature, the AI will actually require you to stay in control of the car.
For later versions that actually work in residential driving, the car will be going 20-25 MPH rather than 50+ MPH, and will probably have specific programming to not run over anything, because anything might be a puppy/ball/etc. being chased by a four-year-old.
Re: (Score:2)
residential mode needs map data so we can just blame the map data provider for fucking up.
Re: (Score:2)
In theory, the computer should be able to figure out whether it's driving residential streets or not from GPS (to tell you the state), and traffic signs like speed limits. Generally the residential zones will have different speed limits than commercial ones.
But yes, you can also blame the map provider. Depending on the local liability laws and your contract with Google, it might even stick in court.
Very unlikely (Score:2)
and when the sensors mess up and class a kid as debris that's safe to run over??
First of all, the whole road would have to be covered in something as big as the kid to even think about running over an obstacle.
Secondly, very probably any software would simply stop if the road was filled with debris that large, or at worst go around it.
Thirdly, moving "debris" would rate a higher priority not to go over compared to static obstacles.
Fourthly, don't set your damn baby down on the road or Grandma will never even see it
Re: (Score:2)
I agree, whether you should swerve left or swerve right is a silly question. Just brake and maintain control of the vehicle. Reducing your kinetic energy helps everyone. If someone hits you from behind, it's their own fault for tailgating, and anyway you and they are both well protected by your steel cages and airbags.
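A quick back-of-the-envelope calculation shows why braking pays off even when a collision can't be avoided: impact energy scales with the square of speed. The mass and speeds below are arbitrary example values, not taken from the article:

```python
# Back-of-the-envelope illustration of "reducing your kinetic energy helps everyone":
# energy scales with v^2, so even partial braking sheds most of the impact energy.
# The mass and speeds are assumed example values.

def kinetic_energy(mass_kg, speed_mps):
    return 0.5 * mass_kg * speed_mps ** 2

MPH = 0.44704   # meters per second per mph
car_mass = 1500 # kg, roughly a mid-size car (assumed)

before = kinetic_energy(car_mass, 50 * MPH)
after = kinetic_energy(car_mass, 30 * MPH)
print(f"Braking from 50 to 30 mph sheds {1 - after / before:.0%} of the impact energy")
# -> roughly 64%, since energy scales with the square of speed
```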
Re: (Score:2)
If you've watched any of Google's visualizations, they clearly have systems working out where the car is not allowed to drive. A cyclist waving his arms around? Paint a red line across the lane behind him in case he is trying to turn. Train crossing with other vehicles? Paint a red line this side of the crossing until the way is clear.
If the vehicle is surprised, due to some sensor failure or erratic pedestrian, I'm certain the car would just hit the brakes or change lanes if possible. Then I'd expect a sm
NYC, glad to know (Score:5, Insightful)
Glad to know that NYC (and Boston, probably) has a large cultural distance from the rest of the US. Any place that's not tolerant of jaywalking isn't worth living in, since it puts the needs of steel sensory deprivation bubbles ahead of human needs...
"For instance, people who were more tolerant of illegal jaywalking tended to be from countries with weaker governance, nations who had a large cultural distance from the U.S. and places that do not value individualism as highly."
Re: (Score:2)
A modest proposal (Score:5, Insightful)
Wrong question (Score:2)
What if the pedestrian is in the road because they were ejected from another vehicle in a crash. Still feel justified in plowing through them?
What if terrorists are jumping in front of self-driving cars in the road. Should your car always crash anyway just in case?
The real question is why we should settle for some crap self-driving car design that uses RNG to decide whether or not to ram pedestrians or crash and burn? I should hope we can do better than that.
Moral philosophers are so cute (Score:5, Insightful)
We all know that whether the car decides to hit a jaywalker or not will depend on several variables:
1) Who is more likely to win a multi-million verdict in a Civil Suit: a jay-walker or the passenger?
2) Will drivers buy the AI software if it will decide to kill their entire families?
3) How well the engineers will work on a feature (deciding whether to hit a jaywalker or kill the passenger by driving off a cliff?) that is much less likely to be used in the real world than every other feature of the AI?
And variable 4, that moral philosophers have written a paper on this based on millions of data points from an online quiz, is not on the list.
Be predictable (Score:4, Interesting)
In a crash, self-driving cars should be predictable, rather than coming up with convoluted means to determine which group of pedestrians should be slammed.
Human drivers are erratic enough. No need to make computer-assisted drivers erratic as well.
Software developers... (Score:2)
Working to make cars more secure is highly beneficial. Working on deciding moral dilemmas, whether to kill one person or another, isn't beneficial in any way. One person dead, one way or another. So spending developer time on this kind of question is absolutely pointless until these cars are 100% safe, and then it is even more point
A machine shouldn't be making those decisions. (Score:2)
Hey /. : please stop putting up this BS (Score:2)
To an engineer they are engineering failures. And I don't know about you personally, maybe you're some daredevil alcoholic behind the wheel, but I've yet to ever encounter a life or death situation for anyone while driving. That includes ever even seeing anyone else in one. Considering self driving cars are supposed to be safer than human drivers to begin with, not only is even getting into a stupid trolley problem situati
Algorithm (Score:2)
What is this? The Wild West? (Score:2)
In a crash, obey road rules as much as practical. Normally, this means braking and staying in your lane. Stray outside your lane only if it won't kill someone.
Further, AI today is generally too clever by half. I don't think it's capable of making any such decisions.
Most respondents were young and/or no income (Score:2)
The highest responding age group was 20-year-olds, and the largest number of respondents (close to 40% of respondents) fell into the $0-$5,000 annual income bracket, so people without the means to purchase a self-driving car, hence responding to the questions from a "what others should do" perspective, not "what I would do". No surprise, people are usually very altruistic when asked what others should do. If the question was "what should your car do" or "what should your loved one's car do", the answers would be differ
engineers vs. philosophers (Score:4, Interesting)
It's a cute experiment with not exactly surprising results (humans prefer humans over animals - who'd have thought?).
But in the end, like the trolley experiment, it is informative and insightful and a bunch of other +5 mod points buzzwords, but the actual solution for the real world will be made by engineers, not by philosophers, and it will almost certainly not involve a "moral decision" subsystem. The primary effort of a practical AI is in making a decision so quickly that it can still minimize damage. Every CPU cycle wasted on evaluating the data in other ways is silly. It will rely for its decision on whatever data its sensors have already provided, and that data will not be in the shape or form of "there are 3 black people with this age range and these fitness indicators in the car, here are their yearly incomes, family relations and social responsibilities. Outside the car we can choose between the river, average temperature 2 degrees, giving the passengers this table of survival probabilities. Or crowd A, here is a data set of their apparent age, social status and survival probabilities. Or crowd B, here is their data set."
This is how the philosopher imagines the problem would be stated to the AI - or to a human in a survey.
But in reality, the question will be more likely something like: "Collision avoidance subsystem. Here's some noisy sensor data that looks like the road ends over there. A bunch of pixels to the left could be people, number unclear. A bunch of pixels to the right also seem to be people, trajectory prediction subsystem has just given up on them because they're running fuck knows where. Estimated time to impact: 0.5 seconds. You have 1 ms to plot a course somewhere or it doesn't make a difference anymore. Figure something out, I need to adjust the volume on the infotainment system and make the crash warning icon blink."
What we will end up with is some general heuristics, like "don't crash into people if you can avoid it" and then the AI will somehow come up with some result, and it will work ok in most cases in the simulator, and then it will be installed in cars.
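To make the "general heuristics" point concrete, here is a toy sketch of what such a planner might look like: score candidate paths against noisy detections, weight anything that might be a person heavily, and return the best option found within a hard time budget. Every name, weight and number here is hypothetical; it illustrates the shape of the problem, not any real system:

```python
# A rough sketch of the "general heuristics, not a moral-decision subsystem" idea.
# Labels, weights, the grid representation and the time budget are all assumed.

import time

# Assumed severity ordering: avoid anything that might be a person first.
HARM_WEIGHT = {"maybe_person": 1000.0, "vehicle": 10.0, "static": 1.0}

def score(path_cells, detections):
    """Lower is better: penalize paths that cross cells holding risky detections."""
    total = 0.0
    for cell, label, confidence in detections:
        if cell in path_cells:
            total += confidence * HARM_WEIGHT.get(label, 1.0)
    return total

def plan(candidate_paths, detections, budget_s=0.001):
    """Return the lowest-cost path found before the time budget runs out."""
    deadline = time.monotonic() + budget_s
    best, best_cost = candidate_paths[0], float("inf")
    for path in candidate_paths:
        if time.monotonic() > deadline:
            break  # out of time: use the best option evaluated so far
        cost = score(path, detections)
        if cost < best_cost:
            best, best_cost = path, cost
    return best

# Noisy detections as (grid cell, label, confidence): two blobs that might be people.
detections = [((2, 5), "maybe_person", 0.6), ((4, 5), "maybe_person", 0.4),
              ((3, 6), "static", 0.9)]
paths = [frozenset({(3, 4), (3, 5), (3, 6)}),   # stay in lane, hits the static obstacle
         frozenset({(3, 4), (2, 5), (2, 6)}),   # swerve left, toward a likely person
         frozenset({(3, 4), (4, 5), (4, 6)})]   # swerve right, toward a less likely person
print(plan(paths, detections))  # picks the in-lane path: the static obstacle costs least
```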
Re: (Score:3)
Also the reason why engineers are vastly overrepresented in mass-murder incidents of terrorism.
That is simply because they actually know how to build a bomb or make other things work, while most other people who'd like to do it simply can't put their thoughts into actions.
That is the problem.
Evolution would like to have a word with you. In the end, what works, wins. And don't get me wrong, morality is a part of the working set, as evidenced by the fact that societies with a moral system have a good track record on surviving.
But every society, with absolutely no exception, ever is also practical about it. The holy book,
Jaywalking = Weak governance? FFS. (Score:5, Insightful)
Jaywalking is not a crime in most countries. Pedestrians typically have right of way over cars. That may sound odd to Americans who haven't traveled, but most countries don't have a word for jaywalking because it is just walking.
So tolerance of jaywalking comes from it being fine in most places.
Re: (Score:2)
Re: (Score:3)
That's naive -- the software has to do something here, it receives a bunch of inputs and analyzes a bunch of possible outcomes and somehow has to score them to decide to take an action (including "do nothing" which is, in itself, an action).
For example, surely a car should swerve to avoid a car that has run a red light if that will avoid a collision of any sort rather than just run into the red light runner and likely kill or seriously injure individuals in both cars. But, what if swerving would mean impact
Re: (Score:3)
The difference is that with software, a car can know instantly whether the lane on either side is open, and can have shorter reaction time than a person (at least one would hope), which significantly changes the odds when it comes to swerving.
Also, given enough CPU horsepower thrown at the problem, a car could also ostensibly calculate the correct angle at which
Re: (Score:2)
Re: (Score:2)
The survey presented assumes brake FAILURE in the scenarios. I.e., the brakes are applied but they fail to stop the car as expected.
In such a scenario, the software is presented with several inputs including at least:
1. In spite of applying the brakes the car is not slowing down quickly enough (perhaps it might not even be because of a defect in the car -- perhaps a leaking tanker truck full of vegetable oil had passed ahead of the vehicle you are in just 20 seconds earlier).
2. The car is approaching an una
Re: (Score:2)
How often has anyone ever been in a situation where they have to make such a choice? Once every trillion miles driven?
Well, just American drivers alone drive 3.2 trillion miles a year, so it would be a multiple-times-per-year event in the US alone, using that as the rate per mile. Globally the frequency would be much higher since the US accounts for only about 2% of pedestrian fatalities worldwide (about 5000 out of over a quarter million).
This is kind of a semi-broken way of thinking about this; yes, the rate at which this situation happens is very low and likely to be lower for self-driving cars. But it's a black swan eve
Re: (Score:2)
So, a self driving car in a case like this (or any of the millions of other similar cases) should never take evasive action - just smash into the pedestrians or barrier in front of it? Or, do you suggest that self-driving cars should ignore bicyclists and pedestrians (as they are both light and probably won't kill any passengers in the car) when evaluating evasive maneuvers necessary to "save the car and the passengers"?
Consider how you would write the requirements spec or testing criteria for self driving
Re: (Score:3)
Re: (Score:3)
Thomas Jefferson raped his slaves and sold off his own children into slavery. Fuck him.
Thomas Jefferson is also dead, please stop advocating necrophilia.
I don't think Jefferson was evil, but ACs are (Score:4, Insightful)
From the very foundation that manages the estate of Thomas Jefferson at the home he built, Monticello [monticello.org], including his descendants, both black and white:
“Though enslaved, Sally Hemings helped shape her life and the lives of her children, who got an almost 50-year head start on emancipation, escaping the system that had engulfed their ancestors and millions of others. Whatever we may feel about it today, this was important to her.”
Pulitzer Prize-winning historian Annette Gordon-Reed, 2017
I don't think Thomas Jefferson was quite as evil as you make him out to be. He seems to have been more interested in keeping his relationship with Sally Hemings secret, rather than in keeping anyone a slave. I also challenge you to produce a record of Jefferson selling any of his children with Sally Hemings, or a record of any of Sally's children being abused. Jefferson went out of his way to provide Sally with a private bedroom adjoining his own. This woman had unfettered access to Jefferson. She could have easily killed him in his sleep, for decades, but she didn't. They also fell in love while in France, where mixed-race relationships were no big deal.
It's also not fair to use modern values to judge those from a different culture and era. If you have references to paint a clear picture of Jefferson as someone who was truly evil, rather than someone who was trying to avoid persecution for a forbidden love, I'd love to see them.
Jefferson did leave clear instructions that all his slaves were to be freed, but I don't think this happened until after he died. I do love history, but I do not claim to be knowledgeable about Jefferson, although I have visited his home.
If you want an example of evil in the founding fathers of US history - look at Alexander Hamilton. That SOB used anonymous news articles and stories to libel and belittle Aaron Burr, a rather competent military man who went on to become vice president, for decades. Both Burr and Jefferson were not terribly fond of Hamilton's Federalist agenda, which has issues reverberating in American politics to this day.
Burr eventually got tired of Hamilton's shit and challenged him to a duel, which was accepted. Hamilton, being inept with a pistol, his few competencies being running his mouth and flinging ink with his pen, lost the duel and died. A fitting end for an Anonymous Coward.
Re: (Score:3)
So, let me get this straight. The foundation that was formed to be public relations for Thomas Jefferson's estate is actually pro-Thomas Jefferson?
Astonishing.
Your words, my emphasis. The folks at Monticello can speak for themselves as to whether or not they are trying to accurately represent history or simply be public relations.
I'd suggest you simply visit, review their record and ask them, but you seem to have already made up your mind.
Re: (Score:2)
You do not need AI for that. Just a Muslim Jihadist in a Truck of Peace.
Should AI driven cars in muslim countries be programmed to swerve to run over pedestrians if a koran is in the road?
Re: (Score:2)
I'm always surprised at how many people think crossing the road somewhere without lights or a designated pedestrian crossing is "jaywalking". How would one legally get from one side of the road to the other when in the countryside without a crossing in sight?!
Re: (Score:2)
Re: (Score:2)
However, there are laws against some sorts of crossing between intersections in some areas. Where I live, there is one law that says that if you are within X feet (I don't recall what X is) of a controlled intersection, it is illegal to cross except at such an intersection.
Also, there are places where "no pedestrian" signs prohibit pedestrians so crossing a road from/to one of those areas is also illegal.
But, yes, where I live, you're certainly free to cross an undivided road on foot if there's no intersect
Re: (Score:3)
Not really. The term jay-walker [merriam-webster.com] descended from jay-driver: people who would refuse to abide by the rules of the road when operating motor vehicles or horse-drawn carriages. Jay-walker was applied to people who had no 'sidewalk etiquette' as well as those who wandered into the roadway. Jay-driver dropped out of use as motor vehicle faux pas began to be referred to by official violation names, whereas jay-walking remained in our lexicon specifically because the laws were slow to codify pedestrian misbehavior
Re: (Score:2)
Re: (Score:2)
Even in unexpected situations there is no 'either-or' scenario; the car will quickly plan a whole bunch of solutions, and there is no way ALL of them will end with someone dead. It MAY choose to crash into a brick wall at a lowered speed, inflating the airbag and likely leaving the driver concussed at worst but everyone alive.
Re: (Score:2)
Should it swerve, if swerving would result in no damage or injury to anyone, when it detects that it will impact the pedestrian at a lethal speed if it just slows down?
Assuming that your answer is "yes" - what if it detects a domestic dog in the path of swerving, should it kill the pedestrian instead? What if it detects a squirrel in the path of swerving? What about a mouse? What about a mailbox? Remember, you're writing the code and/or developing the training sets -- your call, the only way out of making the call
Re: (Score:2)
These maneuvers are not necessarily "insane" - but there may be something in the "escape path", with some probability of it being each of (a) a human, (b) a domestic pet, (c) a small wild rodent, or (d) a patch of flowers in front of a business or home.
It sounds like you are proposing that if p(a) = 0.00001% and p(d) = 99.999% you would still elect to kill the pedestrian? Most humans would not make that decision -- including humans on the jury for the civil case the pedestrian's widow and children file.
[Yes I know tha