Tesla Says Autopilot Was Engaged During Fatal Model X Crash (theverge.com) 422
An anonymous reader quotes a report from The Verge: Tesla says Autopilot was engaged at the time of a deadly Model X crash that occurred March 23rd in Mountain View, California. The company posted a statement online late Friday, after local news reported that the victim had made several complaints to Tesla about the vehicle's Autopilot technology prior to the crash in which he died. After recovering the logs from the crash site, Tesla acknowledged that Autopilot was on, with the adaptive cruise control follow distance set to a minimum. The company also said that the driver, identified as Apple engineer Wei "Walter" Huang, had his hands off the steering wheel and was not responding to warnings to re-take control. Tesla said in a statement: "The driver had received several visual and one audible hands-on warning earlier in the drive and the driver's hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken."
According to Mercury News, the driver of the car was headed southbound on California's Route 101 when his Model X crashed headfirst into the safety barrier section of a divider that separates the carpool lane from the off-ramp to the left. "The front end of his SUV was ripped apart, the vehicle caught fire, and two other cars crashed into the rear end. [The driver] was removed from the vehicle by rescuers and brought to Stanford Hospital, where he died from injuries sustained in the crash."
Another interesting tidbit (Score:4, Informative)
Re:Another interesting tidbit (Score:5, Interesting)
In addition, the data should account for accidents not caused by the sedan, such as a tractor trailer suddenly coming across the median and taking out a car, or other 'unpreventable' incidents that neither an autonomous nor a human-controlled car could avoid.
Comparison against total highway deaths is apples and oranges.
Musk is a smart man, smart enough to know how to use statistics properly. I believe he is quite aware his claims are not supported by existing data. It is disappointing and unnecessary. If I wanted to abuse statistics, I'd say the data clearly shows that on the particular day of this accident it was thousands of times safer to be in a human-driven vehicle when passing the deficient barrier than in an auto-piloted Tesla.
Re:Another interesting tidbit (Score:5, Interesting)
... he should not be making unsupported safety claims regarding autonomous driving, nor conflating them with Tesla Autopilot safety.
He has to. Tesla is on the brink of going under and unless he gets more cash to keep the business going, it'll be bust by the end of the year.
To get that cash, he has to keep in the news and make a lot of hype.
Re: Another interesting tidbit (Score:3)
To get that cash, he has to keep in the news and make a lot of hype.
Right, because sticking to what he's doing has been so ineffective...
Re: (Score:2)
Red Flag: anonymous personal attacks when the facts can't stand up to critique. Here the infamous "cult" tag is deployed. Cult being the trap Steve Jobs avoided when the Mac began to gain traction. Industry smartly knew that if they could brand Macintosh and Apple as a cult, it could never grow.
Re: (Score:3)
>And then Musk's compensation package?! WTF?!
>Tesla shareholders are just plain stupid.
I don't know - seems like a smart bet to me, *especially* if you don't have confidence in his long term abilities: Pay him peanuts unless he pulls an elephant out of his hat, and if he actually manages to do so, then give him a bigger slice of the elephant.
Re: Another interesting tidbit (Score:5, Insightful)
Indeed. It's often been described as win-win for shareholders, in that it doesn't cost them anything if Elon fails, but they're rich if he succeeds. But in reality, a lot of people who voted for it saw it as win-win-win - the "extra" win being that not only do they earn a lot of money, but so does Elon. Namely because they like the sort of things that Elon spends his money on. ;) He's not the sort of person who makes a ton of money and just retires and buys a private island and a superyacht and whatnot. There's no telling for sure what he'll spend his money on next, but you can rest assured that it'll be like something out of sci-fi ;)
Re: (Score:2)
I'm not sure it's jealousy on his part.
I'm not him, but I'm very glad I'm in my shoes and not Musk's. The chances seem slimmer that mine will be matched to orange clothing a few years from now.
Re: Another interesting tidbit (Score:5, Insightful)
I don't like luxury car producers like Tesla, and I also think the whole idea of the electric car is a bad direction. Yet there are other companies that produce epic fails: there were broken ignition keys, self-accelerating cars, and lest we forget, a company that tried to fulfill silly regulations by cheating and then failed to cover its tracks. There were many more. If Tesla sinks, it will not be because it failed to inspect the bolts.
Regarding electric cars: We've owned a Chevy Volt for the last 4 years and it's the best and most reliable car we've ever owned. There's no question in my mind that electric is the future for the vast majority of vehicles. I also own an F150 for hauling things, so I'm not opposed to gas vehicles where appropriate.
For most people electric cars are simply going to end up being better. Higher torque, lower maintenance requirements, and I think in the long run, likely better range and cheaper fueling costs. There will still be a place for gas engines but their advantages are becoming more limited every year.
Self driving cars are a totally different can of worms.
Re: Another interesting tidbit (Score:5, Insightful)
Because there is no competition to the Model 3. Plain and simple. The "competition" charges vastly slower, from less reliable networks, is slower, doesn't have as good handling, doesn't look as nice, doesn't have anywhere near as interesting options (long-range pack, dual motor AWD, air suspension, etc.), and on and on and on.
Look at, say, the 2018 Leaf. Yeah, you save $5k. You also get an econobox that looks like a catfish that only goes 2/3rds as far (which becomes even worse when you consider the need to leave yourself a safety buffer), charges at a max rate 1/3rd that of the Tesla before #RapidGate sets in, 1/5th the speed after #RapidGate sets in, with much worse performance.
With the Bolt you can pay more to go the same distance, perform worse, still charge at 1/3rd the max rate, and again have your car be a dorky-looking econobox.
People are waiting because they want what they're waiting for, and not what else is available. Yes, there also are some people waiting specifically because it's Tesla - they don't want to support companies that have continually tried to minimize how many EVs they need to build so that they can just get back to making ICEs. Others want specifically Teslas because all of the competition has terrible depreciation rates but Teslas don't, due to excellent pack management, over the air updates, etc.
Re: Another interesting tidbit (Score:5, Informative)
Look at, say, the 2018 Leaf. Yeah, you save $5k. You also get an econobox that looks like a catfish that only goes 2/3rds as far (which becomes even worse when you consider the need to leave yourself a safety buffer), charges at a max rate 1/3rd that of the Tesla before #RapidGate sets in, 1/5th the speed after #RapidGate sets in, with much worse performance.
That's not really a fair comparison, because you can't actually get a $35k Model 3 today, or any time in the foreseeable future. They are only selling the more expensive ones at the moment. And despite the problems (RapidGate is pretty serious) the Leaf 40 is selling quite well - so much so that they just bumped the price up 3%.
Also that $35k won't get you the features that a $5k cheaper Leaf will, such as ProPilot, which does auto-steering and auto parking that is actually more advanced than Tesla's. So it's not really a like-for-like comparison. Depending on the country the Leaf sometimes has a better charging network too, e.g. the UK, where there are far more CHAdeMO pumps than Superchargers.
The real competition for the Leaf 40 is the Hyundai Ioniq and to a lesser extent the Renault Zoe. The Ioniq isn't widely available and the Zoe has its own rapid charging issues, as well as being an inferior car in every way.
Having said that, the Leaf 40 is a massive disappointment compared to what people were expecting. Maybe the 60 will be better... At least it should have active battery cooling to fix the charging problems.
Re: (Score:3)
>That's not really a fair comparison, because you can't actually get a $35k Model 3 today
A friend of mine is getting his model 3 today. He's picking it up from the Tesla shop at 10.00am.
And how much did he pay for it?
Re: Another interesting tidbit (Score:5, Insightful)
Not true. The Zoe does make up 22.7% of European EV sales (vs. the Model S's 11.5%), but the i3 is only 10.8% of European EV sales. Furthermore, the fact that the Zoe has less than twice as many sales, yet is selling to a price bracket that represents a market 1 1/2 orders of magnitude larger, isn't exactly a bragging point, and doesn't bode well for it when the Model 3 arrives next year.
I'll take that as "no contest". The fact that you do most of your charging at home is completely irrelevant when you need to go on a road trip.
Wow, you've not seen Teslas plugged into superchargers at places that aren't supercharger stations? Imagine that.
Next, try actually going to a supercharger station if you want to see Teslas connected to superchargers.
Plugshare status reports say otherwise. The consistent stream of reports of down stations on our local EV group says otherwise. Yesterday literally a third of our CCS chargers in the country were down.
If you mean "looks like an econobox", yes.
Apparently you forgot that we're talking about the Model 3, not the Model S. The Model 3 is the same size as a BMW 3-series.
Re: (Score:3)
Electric cars and vehicles are definitely the way of the future. Especially when we get to the point where there are AI cars that can run as shuttles to and from regularly scheduled buses.
But, in the case of Tesla, I don't really get why anybody is giving them money to reserve a car that hasn't yet been built and may not be built for quite some time. I have noticed a lot more Teslas around here lately, but they're going to be really popular with poor people during the upcoming revolution as they'll be the second people stoned, just after those Prius drivers.
One is that there is no risk. You can get your money back, so no commitment required. Another is that only the first "x" number of buyers can get the tax credit, so best to get on the list. Of course, Tesla is also a popular brand and people with the money to spare want them.
Re: Another interesting tidbit (Score:4, Interesting)
A counterpoint. [model3ownersclub.com]
Re: (Score:3)
I hope you got your shorts in while the stock was around $350 ;) If not, hey, you can still jump in, I know some people who missed their chance to buy because they had set triggers for $250 but it only got down to $253 (I bought at $268 - as I have no pretension of being a psychic, I had no interest in trying to "time the bottom").
If you did short (nearly a third of Tesla's stock is in short positions), then - and I mean this completely seriously - I want to offer you my sincere thanks for doing so. I was
Re: (Score:3)
I take from this that you missed your chance to short the stock? Don't worry, you still can - they're going bankrupt, right? So why not short? Easy money, right? Come on, put your money where your mouth is. Don't tell me that you have literally no money in your bank account. Or do you not actually believe what you preach? If you believed you had a sure thing, then it's free money, and why on Earth would you choose a couple percent interest over that?
Come on, short it!
Re: (Score:3)
Right. They'll just pop down to their gigafactories after a quick recharge on their supercharging networks and build vehicles on lines and with components that are the result of billions of dollars of investment over years. Easy as pie!
Better stats (Score:2)
I agree these would be better numbers for grading Tesla.
You can't make the comparison using news stories though, because when a Tesla crashes without Autopilot engaged it's just another car crash. It doesn't make national news. This makes the numbers appear slanted against Autopilot.
Re: (Score:2)
I agree these would be better numbers for grading Tesla.
You can't make the comparison using news stories though, because when a Tesla crashes without Autopilot engaged it's just another car crash. It doesn't make national news. This makes the numbers appear slanted against Autopilot.
I don't know. It seems every fatal Tesla crash is already a story before they determine whether Autopilot was engaged or not. This story is one of those.
Re:Better stats (Score:5, Informative)
Well, I did find an NHTSA report from January of 2017 (after a previous fatality linked to Autopilot use). They found a 40% decrease in crashes among Tesla drivers after Autopilot Autosteer became available. Not super-definitive, but interesting.
5.4 Crash rates
ODI analyzed mileage and airbag deployment data supplied by Tesla for all MY 2014 through 2016 Model S and 2016 Model X vehicles equipped with the Autopilot Technology Package, either installed in the vehicle when sold or through an OTA update, to calculate crash rates by miles travelled prior to and after Autopilot installation.
Figure 11 shows the rates calculated by ODI for airbag deployment crashes in the subject Tesla vehicles before and after Autosteer installation. The data show that the Tesla vehicles' crash rate dropped by almost 40 percent after Autosteer installation.
page 10 on:
https://www.scribd.com/documen... [scribd.com]
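For anyone who wants the arithmetic behind that figure: the report's metric is airbag-deployment crashes per million miles, computed over fleet mileage before and after Autosteer installation. Here is a minimal sketch in Python; the crash counts and mileages are made-up placeholders chosen only to produce a drop of roughly 40 percent, not ODI's actual figures.

```python
# Crash rate as used in the ODI analysis: airbag-deployment crashes
# per million vehicle miles, before vs. after Autosteer installation.
# All counts below are illustrative placeholders, not ODI's data.

def crashes_per_million_miles(crashes: int, miles: float) -> float:
    return crashes / (miles / 1e6)

before = crashes_per_million_miles(crashes=1000, miles=780e6)  # ~1.28
after = crashes_per_million_miles(crashes=500, miles=630e6)    # ~0.79

reduction = 1 - after / before
print(f"before: {before:.2f}  after: {after:.2f}  drop: {reduction:.0%}")
# -> before: 1.28  after: 0.79  drop: 38%, i.e. "almost 40 percent"
```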
I'm trying to think this through (Score:3)
I apologize for the length of this post. It is as much about trying to state my thoughts clearly as it is about participating in a conversation.
For all of the following I will assume that the 40% reduction in crashes attributed to Autopilot by NHTSA is real.
I agree that off-highway driving is where you see the most crashes. Let's break all driving into two types: Autopilotable (P), and Manual-required (M). P includes all miles driven that allow the driver to use Autopilot, M includes the rest.
Let's arbit
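To make the setup above concrete, here is one hypothetical way the numbers could play out. Everything in this sketch is assumed for illustration (the 30% share of Autopilotable miles, highway miles being half as crash-prone); only the 40% figure comes from the NHTSA report, and confining it to P miles is itself an assumption.

```python
# Hypothetical worked example of the P/M mileage split above.
# Every number here is an assumption for illustration, not data.

p_share = 0.30   # fraction of miles that are Autopilotable (P)
m_rate = 1.0     # baseline crash rate on Manual-required (M) miles
p_rate = 0.5     # P (highway) miles assumed half as crash-prone
ap_cut = 0.40    # the ~40% reduction, applied to P miles only

baseline = p_share * p_rate + (1 - p_share) * m_rate                 # 0.85
with_ap = p_share * p_rate * (1 - ap_cut) + (1 - p_share) * m_rate   # 0.79

print(f"overall reduction: {1 - with_ap / baseline:.1%}")  # ~7.1%
# A 40% improvement confined to the safest 30% of miles moves the
# overall rate far less than the headline figure might suggest.
```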
Re: (Score:2)
There is special interest in autonomous driving, so therefore it will get attention. Would you rather have apathy?
Re: (Score:2)
A hell of a post. But let's step back and ask questions about this accident. For example, one that stands out to me is: the Tesla knew it was approaching a non-moving object, and it did not stop or veer. Why?
It's a perfectly good question. The Tesla manual clearly states that it cannot be relied upon to stay in the traffic lane or avoid obstacles in its path. But you'd think it would recognize such an obstacle. Why did it veer at all?
Re: (Score:3)
It's not clear from the article whether the requests to take the wheel were the normal periodic "I'm going to make sure you're paying attention" requests, or a "Hey, I don't know how to handle this situation, you need to do it!" request.
Re: (Score:2)
Only with rail you're limited to the slowest vehicle on the track, your stopping distances are longer so you have to leave larger gaps between vehicles, you can't swerve to avoid unexpected obstacles, a faulty vehicle blocks the entire track until it can be removed, every time you stop you cause delays to those behind you (or further increase the distances required between vehicles) unless you can get off the main track, you can only go where there's tracks and cannot make arbitrary turns, you have to follow
Driving can be extremely dangerous! Be safe! (Score:5, Insightful)
He apparently had plenty of money; he was driving a Tesla. He was an engineer, so he was educated.
It amazes me that often people don't recognize that driving a car is a potentially extremely dangerous activity. 100% attention is required at all times, particularly since other drivers often do things they shouldn't do.
Re: (Score:3, Informative)
Don't take Tesla's word that he had his hands off the wheel; he may have had them resting lightly on the wheel. They use a pressure sensor. I've got a Tesla Model X, and have been nagged many times, because my touch is a bit too light for it to detect.
Re:Driving can be extremely dangerous! Be safe! (Score:5, Informative)
Re: Driving can be extremely dangerous! Be safe (Score:5, Funny)
... or maybe you're not holding the wheel tight enough.
Well, he did work at Apple. Not holding it right is a distinct possibility...
Re: (Score:3)
Maybe he installed a big notch in the middle of his windshield, and he couldn't see the barrier through it ;)
Re:Driving can be extremely dangerous! Be safe! (Score:5, Insightful)
It amazes me that often people don't recognize that driving a car is a potentially extremely dangerous activity. 100% attention is required at all times, particularly since other drivers often do things they shouldn't do.
Then putting Autopilot in a vehicle is illogical. You don't put something in a vehicle that steers for the driver while relieving them of none of the duties of driving. Eventually they will get sidetracked; it's only human.
Re: Driving can be extremely dangerous! Be safe (Score:2)
That's what Google, to its credit, has been saying since day one. Autopilot is either a safety backup system like Meritor OnGuard, or it's totally in control. Driving is not a difficult task; it's no easier to monitor a computer driving than it is to drive. Consequently, if people aren't driving they are looking at their cell phones.
Re: (Score:3)
Then putting Autopilot in a vehicle is illogical. You don't put something in a vehicle that steers for the driver while relieving them of none of the duties of driving. Eventually they will get sidetracked; it's only human.
It makes perfect sense to me, along the lines of adaptive cruise control. It makes for a more relaxing drive in that I don't have to actively maintain speed and distance myself, but it does not relieve me from the need to pay attention in order to be able to intervene should the need arise.
Additionally, if I should have a momentary lapse of attention at an inopportune time, odds are that it does not happen at the exact moment my car fails to notice that the car in front slowed down, so it adds safety.
For so
Re: (Score:3)
If there's a general consensus in the English speaking world that autopilot is synonymous with autonomous, I agree the name was poorly chosen. That's not my impression of the general understanding of the word – we expect a pilot to remain in the cockpit and alert when the plane we're on is on autopilot, after all – but if the data shows otherwise then I'd be the first to argue for the feature to have its name changed.
Re: (Score:2)
It makes perfect sense to me, along the lines of adaptive cruise control. It makes for a more relaxing drive in that I don't have to actively maintain speed and distance myself, but it does not relieve me from the need to pay attention in order to be able to intervene should the need arise.
I find that adaptive cruise control makes the drive less relaxing, in that I have to monitor what it does constantly, whereas if I drive manually, most adjustments are reflexes requiring little conscious effort. Nor does using adaptive cruise control free up part of my attention to deal with other potential dangers, like the road ahead and the behaviour of other drivers.
Re: (Score:2)
I guess our brains are very different. I find it really relaxing to just pay attention to the road and steering, and let the car deal with the speed. Since I'm obviously watching the road anyway, it takes away some (literal) footwork without adding any additional workload.
If it was unreliable, it would just add extra anxiety of having to quickly correct for it all the time. But in the year I've had the car, it's been rock solid, even in the worst of slushy winter conditions.
On really winding roads, I do ten
Re: (Score:2)
The driver was given six seconds warning to take control, which is more than ample to react to an emergency situation.
Then given that this appears to have been an intelligent driver who was also aware of potential problems with the automatic control, we have to ask why that didn't happen. If we assume the driver didn't deliberately allow an accident to happen with tragic results, then evidently either something wasn't clear enough about the situation and what needed to be done, or something interfered with the driver's ability to act accordingly.
Re: (Score:2)
Comments like that really aren't helpful at this stage, and could be deeply hurtful if any of the victim's friends or family read them. Please engage your brain before posting.
Re:Driving can be extremely dangerous! Be safe! (Score:4, Interesting)
Then given that this appears to have been an intelligent driver who was also aware of potential problems with the automatic control, we have to ask why that didn't happen. If we assume the driver didn't deliberately allow an accident to happen with tragic results, then evidently either something wasn't clear enough about the situation and what needed to be done, or something interfered with the driver's ability to act accordingly.
No, that is not evident. Old Bill of Ockham tells me to look for a less complicated explanation, like that the driver had rolled 16 INT but 3 WIS.
Re: (Score:2)
What we need here isn't philosophical cliches or conjecture, it's facts, or at least possible explanations that are consistent with the evidence available and worth investigating.
Re: (Score:2)
What we need here isn't philosophical cliches or conjecture, it's facts
Then why your conjecture two posts up?
When looking for facts, look for the simplest facts first, and only look for more complicated ones if those fail. Don't devise complicated hypotheses based on biased views, and then declare them as the "only" options. Which seems to me like is what you did.
Philosophical cliches are useful when they stop us from knee-jerk reactions (another philosophical cliche) and jumping to conclusions (ditto).
Re: (Score:2)
There was no conjecture in my original comment. I acknowledged three specific possibilities, and implied that one of them was unlikely on the evidence so far.
Re:Driving can be extremely dangerous! Be safe! (Score:4, Insightful)
The core question is: why did the car not brake and stop in front of the obstacle?
Re:Driving can be extremely dangerous! Be safe! (Score:4, Interesting)
The core question is: why did the car not brake and stop in front of the obstacle?
The core answer is: Because the driver did not apply the brakes.
The original question is a good one and should not be just tossed off like this.
If you did a GIS on the accident you would quickly see that the car impacted a fixed obstacle with a clear view of it. The obstacle was marked with black-and-yellow safety stripes, exactly the sort to alert a human driver that it was there.
So why did the autopilot not see that obstacle and take action? (Either divert or stop?) What sensor system failed to see it? Does it have something to do with the material on the surface that holds the black-and-yellow paint?
If they get to the root cause of that they have a good chance of never having an accident like this again.
Re:Driving can be extremely dangerous! Be safe! (Score:4, Insightful)
Or maybe there had been warnings earlier in the drive, as Tesla's statement says, but then there was insufficient warning immediately before the fatal collision.
We simply don't know yet, based on the information released so far, and what is needed in a situation like this is facts, not speculation.
Re: Driving can be extremely dangerous! Be safe (Score:2)
Six seconds is pushing the brake to the floor and being at a complete stop, with plenty of room to spare; even more time to slightly turn the wheel to avoid hitting something 150m away and not moving.
Driver not paying attention AT ALL.
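For reference, that claim is easy to sanity-check: 150 meters covered in about five seconds implies roughly 30 m/s (about 67 mph). A back-of-envelope kinematics sketch, assuming a typical hard-braking deceleration of about 7 m/s^2 and a 1.5-second reaction time:

```python
# Back-of-envelope check of the "five seconds and 150 meters" window.
# Deceleration and reaction time are assumed typical values.

speed = 150 / 5                      # ~30 m/s (~108 km/h, ~67 mph)
decel = 7.0                          # m/s^2, assumed dry-pavement hard stop
react = 1.5                          # s, assumed driver reaction time

react_dist = speed * react           # ~45 m traveled before braking starts
brake_dist = speed**2 / (2 * decel)  # ~64 m to stop once braking
total = react_dist + brake_dist      # ~109 m, inside the 150 m available

print(f"reaction: {react_dist:.0f} m, braking: {brake_dist:.0f} m, "
      f"total: {total:.0f} m of 150 m available")
```

Which supports the parent's point, provided the driver was attentive and the full five-second window was actually available.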
Re: (Score:2)
Six seconds is pushing the brake to the floor and being at a complete stop, with plenty of room to spare; even more time to slightly turn the wheel to avoid hitting something 150m away and not moving.
Driver not paying attention AT ALL.
Six seconds is the amount of time the driver was not engaging the steering wheel. We do not know how much time there was after the car veered out of the traffic lane and before it hit the barrier; it may have been much less.
Re: (Score:3)
It amazes me that often people don't recognize that driving a car is a potentially extremely dangerous activity. 100% attention is required at all times, particularly since other drivers often do things they shouldn't do.
Unfortunately, nobody can claim they pay 100% attention at all times and be telling the truth. Everybody has a moment when they are distracted, good drivers quickly re-engage their minds.
Which is why we need to be very careful about technologies that give people a false sense of confidence that they can take their attention off of driving for longer periods of time.
Re:Driving can be extremely dangerous! Be safe! (Score:5, Interesting)
He was an engineer, so he was educated.
Educated but apparently not particularly smart, given that he had complained to Tesla several times about issues with the guidance system and yet continued to blindly rely on it.
Re: Driving can be extremely dangerous! Be safe (Score:2)
"Schooled" hardly means "educated" and "educated" definitely doesn't imply "informed" or "intelligent," as evidenced by this twit's (moment of silence) decision.Besides, no "computer engineer" (i.e. someone with a deep understanding of both analog and digital logic) would be willing to trust their lives to a cutting-edge machine that'st being tested not in a controlled environment but rather a fucking city.
wtf tesla? (Score:3)
So you design a car that can safely drive itself in traffic, can track whether the driver is actively using the controls, and knows that for six seconds the driver hasn't been using them while driving at speeds where the car can't protect them in a crash.
And you didn't design in, "Slow the fuck down because nobody is in control of the vehicle"?
Re: (Score:2)
So you design a car that can safely drive itself in traffic
Well, that premise is under a bit of debate due to incidents like this, isn't it?
Re: (Score:2)
You miss the point so entirely that you should perhaps be cautious about throwing around the word 'moron'.
Re: (Score:2)
My Toyota can't do any of that shit. It will happily drive me into a wall 100% of the time without my assistance.
No, it won't. It needs your assistance to get it up to speed and point it at the wall.
Most of the walls I see are free of embedded Toyotas.
Hands off the wheel for 6 seconds (Score:5, Interesting)
The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.
Having narrowly avoided two separate impending collisions while driving due to insects (one hornet loose in the cab and one bee in the eye through an open window), I have a macabre fascination with the last few seconds in a vehicle before the collision that takes the life of the human witness(es).
Sure, we live in an age of unrivaled electronic distractions, but there has always been ample incentive to pick the wrong five seconds to look away from the road. Outside of law enforcement, we'd never see the video, even if it did exist... but the new tech vehicles are getting makes the 'fly on the wall' view ever more likely.
Re: (Score:2)
Why were the insects causing you to drive?
Couldn't outrun the beasts.
Re: (Score:2)
Why were the insects causing you to drive?
This is Slashdot.
He, for one, welcomed our insect overlords.
Apple engineer (Score:4, Insightful)
Huang reportedly complained that the car’s Autopilot option kept veering the car toward the same barrier on Highway 101, near Mountain View, into which he crashed the car last Friday.
If you've noticed unsafe behaviour and have made complaints about it, why the fuck would you keep using it?
Not surprising that an Apple engineer has no common sense.
And the only common sense thing for Tesla to do is to disable the damn thing. People are too stupid to be trusted with anything.
Re: (Score:2)
Because he paid $120,000 for it. The real question is why is Tesla putting out a faulty expensive product and beta testing it on our roads?
This. Unless there was something unique about the circumstances that led to a fatal crash, it is reasonable to assume that a non-faulty self-driving car on a clear day would not ram itself into a divider, killing everyone involved.
Warning to take control are only good for defending against lawsuits. They are not actually useful due to desensitization / "crying wolf" phenomenon. If Tesla was serious about it, they would have implemented "Red Alert, High Probability of Crash" warning, not some BS "put your hands o
Re: (Score:3)
Except he probably did have his hands on the wheel. The Model X is notorious for nagging even when the user does have his or her hands on the wheel. Even though I almost always have my hands on the wheel, I get about five to ten nags per hour — often several nags within a minute or two.
In other words, Tesla's data is approximately the equivalent of spinning a roulette wheel of accident causes. It's crap, and is correct only slightly more often than chance.
Re: (Score:3)
Even though I almost always have my hands on the wheel, I get about five to ten nags per hour — often several nags within a minute or two.
Five to ten nags per hour is once every 6 mins. If you "almost always" have your hand on your wheel, then that sounds about right, because you don't always have your hands on the wheel, per your admission.
If there's one thing we've learned from over a century of operating machinery, it's that humans are not reliable narrators and tend to overestimate their abilities/compliance. This is why we have audits and logging and black boxes.
Re: (Score:2)
Huang reportedly complained that the car’s Autopilot option kept veering the car toward the same barrier on Highway 101, near Mountain View, into which he crashed the car last Friday.
...
A Tesla spokesperson said the company has been searching its service records, “And we cannot find anything suggesting that the customer ever complained to Tesla about the performance of Autopilot.”
The spokesperson added that there had been “a concern” raised about the car’s navigation not working properly, but “Autopilot’s performance is unrelated to navigation.”
The car usually worked okay, it just had an obsession with that particular barrier. But that was a "navigation" concern, not autopilot. Got it?
Re: Apple engineer (Score:2)
If you've noticed unsafe behaviour and have made complaints about it, why the fuck would you keep using it?
You really have to ask?? I would've thought it obvious; he was an Apple engineer!
The answer, of course, is courage.
Artificial Intelligence kills 2 in one week (Score:5, Insightful)
On average there are 700 deaths on US roads EVERY week and two more should not be national news. With safer cars this number has been dropping in the last decade, but this news is actually about computer AI making a choice, or by not making a choice, killing two people. It may not be full AI, but it is still a computer program in control. Two people died because of a computer program. With both accidents the "self-driving" AI program should have saved these people. Both times the person behind the wheel should have been able to avoid or lessen the collision if they were actually driving. We don't hear much about AI driving success in avoiding crashes just like we don't hear about planes that land safely. We only hear about failures. These features will get better with time and debugging (meaning more failures to come). Just as early commercial planes had their problems, so does AI self-driving. For now flying is safer than driving no matter who is in control of the car (0 commercial aviation deaths for 2017 in US), and improved technology can only help our chances of making it home safely even if it makes the wrong choice occasionally (well, on average).
Re: (Score:2)
I think we're at an interesting point, where robotic drivers aren't as safe as human drivers in general, but are safer than human drivers under certain circumstances.
What this means is that robotic assistance can be used to improve safety, but when misused will actually make things worse. So for the foreseeable future, every time one of these things is in a crash the question will arise as to whether the system failed, or the driver misused it.
In fact both scenarios are bound to happen, and will continue
not equivalent to Uber crash (Score:3)
Uber plowed into a pedestrian at full speed on a well-lit road, whereas this driver ignored six seconds of warnings to take control.
Autopilot still safer (Score:2)
Re: (Score:2)
FIFY, and I happen to agree with you.
Was this unsafe following? (Score:2)
Spin from Tesla (Score:3, Insightful)
Reading and re-reading the quote from Tesla, I see I was misled:
The driver had received several visual and one audible hands-on warning earlier in the drive and the driver's hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.
This does not mean that the warning fired during the six seconds prior to the collision. It wasn't telling him a collision was imminent. It says that "earlier in the drive" it warned him. So the warning could have been 45 minutes prior. Also, it sounds like the autopilot warning happens any time the user takes their hands off the wheel, not just when it needs help. It might be that autopilot drivers have a tendency to ignore the warning, like a dialog box that comes up so often people just click "OK" to it.
I begin to think that a semi-autopilot is a bad idea. If it is not reliable enough for a person to take their hands off the wheel, and they still must pay full attention to the road in case it makes a mistake, then they might as well drive. It is very hard to pay attention to something you aren't actively involved in. Airline pilots and lifeguards and factory quality inspectors know this. Those industries have specific policies and practices designed to keep people engaged and aware.
RoboKill (Score:2)
New classification of death - robotic process.
By algorithm, metaphor, and processing, a machine-made choice was fatal. This is the post-reality dawn of an age where humans are given a metaphor as a stand-in for reality to represent risk. Tesla chose to use a sound to implement a warning. What could go wrong? Did he have the windows down and couldn't hear? Maybe the cabin noise was chaotic or distracting, but the metaphor implementation failed the human-in-control.
So the cost of that weak metaphor is catastrophic. I think thou
Darwin wins again (Score:4, Insightful)
"...the victim had made several complaints to Tesla about the vehicle's Autopilot technology prior to the crash...
So the guy who has complained not once, but repeatedly, that his car's autopilot is inadequate engages it and completely ignores what it's doing.
This takes a special kind of stupid.
Somehow I found the strength to ignore the low-hanging fruit: that this potential Darwin Award winner was an Apple engineer.
Level 3 systems are insidious! (Score:2, Insightful)
A dazed driver cannot assume responsibility for a ton of steel travelling fast down the highway within five seconds!
In fact, studies have shown that a driver is not "up to speed" in driving capability for a long time after a requested takeover, up to 40 seconds.
Having a five-second limit is simply irresponsible, and 40 seconds is almost the same AD challenge as a level 4 system.
Abolish level 3!
Worst damage control statement ever (Score:2)
There are important missing facts about this... (Score:3)
The driver had complained about trouble with his car to Tesla before the fatal crash:
"Walter Huang's family tells Dan Noyes he took his Tesla to the dealer, complaining that -- on multiple occasions -- the auto-pilot veered toward that same barrier -- the one his Model X hit on Friday when he died."
If his Tesla had a history of doing something reckless, why would he re-enable it? Why would he also not have his hands on the wheel? Why didn't Tesla analyze the data in his car when he reported this to see what was going on? Seems like it would have been a pretty simple check: did the car attempt to steer towards the barrier or not?
Re: (Score:2)
In the case of Google's Go and chess ANNs, it is deep learning.
Deep learning only means you have a relatively deep artificial neural network, and train it.
Deep means: many layers.
Re: (Score:3)
Actually, Tesla said the driver ignored the warning earlier in the drive. It could have been an hour before.
Tesla autopilot warns you to take control every minute or so regardless of whether it is "confused" or not. If you don't, after sufficient warnings, it stops the car.
There is no information in the Tesla statement that isn't true of many, many Tesla autopilot journeys.
It looks to me, as a Tesla owner myself, that autopilot did, indeed, drive him into the barrier and that we have a reminder that every
Re: (Score:3)
If the car had "several visual and one audible hands-on warning" then maybe the autopilot should bring the car to a halt. However I suspect that what happened was more complicated and that we do not know the full story.
Re: Evolution in action (Score:5, Insightful)
Better yet, another story is saying that the man had noticed autopilot having a problem at this particular stretch of road in the past - if you've seen it trying to send you into a k-rail at that bit a few times, what the fuck are you doing letting it drive on that bit, and not paying attention when it tells you to? Did he fall asleep or something?
I sure as shit wouldn't be using it there if I've gone so far as to take it to the service center to have them look at it for trying to drive into exactly that barrier in the past...
Man: Hey Doc, when I shove my head up my ass, I have problems breathing!
Doctor: then pull your head out of your ass, and stop shoving it up there!
Re: (Score:2)
if you've seen it trying to send you into a k-rail at that bit a few times, what the fuck are you doing letting it drive on that bit, and not paying attention when it tells you to?
If he was a computer programmer, I'd say he was trying to reproduce the fault, in order to better understand the entry conditions. :/
so... (Score:2)
Um, success? So... yay?
Re: (Score:3, Insightful)
That story was incorrect. He had taken his car in because of problems with the navigation system, unrelated to autopilot. How that morphed into “autopilot problems at that particular stretch” is a classic example of the telephone game.
Re: (Score:3)
as the car is driving down route 101, do you honestly expect the computer onboard to STOP the car, right there? in many cases, there is not even a pullover (lay-by) lane.
I'd like to know what you propose, when the computer says 'I need you to do something, I'm not sure what I should do, myself' and the human ignores it for too long. stopping is NOT always the right thing! the correct answer is 'it depends'.
Re: (Score:2)
as the car is driving down route 101, do you honestly expect the computer onboard to STOP the car, right there? in many cases, there is not even a pullover (lay-by) lane.
I appreciate that ... but it is a case of the least bad thing to do. Yes: stopping on a busy motorway would get a lot of people annoyed at him and honking their horns - but he would still be alive -- assuming that the guy in the car behind him was not asleep as well. In engineering there is a concept of Fail safe [wikipedia.org]; when I was taught to drive, stopping was the fail-safe action; embarrassing and might get you a ticket, but usually better than continuing to move forwards.
Re: (Score:2)
as the car is driving down route 101, do you honestly expect the computer onboard to STOP the car, right there? in many cases, there is not even a pullover (lay-by) lane.
One workaround might be to turn on the hazard lights for the benefit of other drivers and in a loud voice state "DISENGAGING DRIVING ASSIST, TAKE CONTROL NOW". Then disengage power.
Re: (Score:2)
Where does 101 not have a pullover lane? I'm trying to remember the entire length, and I thought there were at least bike lanes the whole way.
Re: Evolution in action (Score:2)
After repeated warning, it does stop the car.
Re: (Score:2)
Does the wheel fight you so hard that you can't override it in an emergency? Ditto the brakes? Because if not, I would have a hard time believing that he was paying any attention to the road, which is the entire point of the whole "keep your hands on the wheel" thing.
Re:Unfortunately, People Will Get Hysterical (Score:5, Insightful)
People understand the risk of driving and they drive. What people don't understand is how long it will take to make these cars workable.
Re: (Score:3)
Oh god, imagine how much you could freak people out by replacing the standard autopilot-engagement sound files with "KILL ALL HUMANS" ;)