DVD Player Found In Tesla Autopilot Crash, Say Florida Officials (reuters.com) 485
An anonymous reader quotes a report from Reuters: A digital video disc player was found in the Tesla car that was on autopilot when its driver was killed in a collision with a truck in May, Florida Highway Patrol officials said on Friday. "There was a portable DVD player in the vehicle," said Sergeant Kim Montes of the FHP in a telephone interview. She said there was no camera found, mounted on the dash or of any kind, in the wreckage. A lawyer for a truck driver involved in the accident with the Tesla told Reuters his investigators had spoken to a witness who said the DVD player was playing a "Harry Potter" video after the accident, but the lawyer was unable to verify that beyond the witness account. Lawyers for the family of the victim, 40-year-old Joshua Brown, released a statement Friday saying the family is cooperating with the investigations "and hopes that information learned from this tragedy will trigger further innovation which enhances the safety of everyone on the roadways." Tesla said in a statement Friday, "Autopilot is by far the most advanced driver assistance system on the road, but it does not turn a Tesla into an autonomous vehicle and does not allow the driver to abdicate responsibility."
So what does it do then? (Score:5, Insightful)
Re: So what does it do then? (Score:5, Informative)
Re: (Score:3)
Re: (Score:3)
The only thing cruise control provides is comfort.
Re: (Score:2)
Except I drive a manual civic and don't have cruise control.
Cruise control has been an option on MT (including Civics) for quite a while. Generally, on MT, CC works in gears 4-5, but won't engage in 3rd gear. The stock CC will disengage automatically if you touch the brake (same as auto) or the clutch pedal (w/o actually engaging the clutch).
My 1984 Nissan Maxima 5-speed MT had cruise control. Worked about the same as my 1989 Honda civic MT (but maybe CC only came with the power windows package, I can't remember, it's been too long). Given how loaded civics hav
Re: (Score:2)
Generally, on MT, CC works in gears 4-5, but won't engage in 3rd gear.
I've used cruise control in both my 2001 Civic EX and 2002 CR-V EX in 3rd. I think the limiting factor is a speed below which CC won't engage (like 20 mph), not the actual gear you're in - though that's certainly a practical factor. I remember one time being in CC and decelerating via the CC controls and at some low speed it disengaged.
Re: (Score:2)
Except I drive a manual civic and don't have cruise control.
I have a 2001 Honda Civic Coupe EX and a 2002 Honda CR-V EX both with 5-speed manual transmissions and cruise control. Maybe your vehicle is older or a different trim line, but manual transmission and cruise control are not mutually-exclusive.
Re: (Score:2)
Re: So what does it do then? (Score:2)
Talk about building a car no one wants. I'd think they'd throw in as many features as they can to get people to buy that piece of crap.
Re: (Score:2)
"All it's doing is managing RPM. The cruise control disengages when you operate the clutch."
Maybe that's the case for your model. Mine doesn't control RPM but speed itself, actuating either the brakes or the gas as needed, and it even allows you to change gears. It disengages either when you operate the designated control or the brakes.
It's a Mercedes manual transmission model from 2001, so not exactly new.
Re:So what does it do then? (Score:4, Insightful)
It's a safety and convenience feature that is being abused by treating it as a true AI chauffeur. The autopilot is really a minimal set of enhancements -- things like:
intelligent cruise control (senses nearby cars and adjusts the cruise setting and braking based on their data)
auto-parallel parking and perpendicular parking
auto-lane change when hitting the turn signal
auto-driving (including making turns) in some instances -- mostly 5 mph areas
summoning (car backs out of driveway and comes to you)
Even the features used while driving are supposed to warn you and nag you if you take both hands off of the wheel and will slow the car down if you don't respond. It's not meant to be as full-featured as a Google self-driving car. Only someone watching a DVD player instead of driving the car would have hit that truck instead of slowing down -- assuming there's no massive glitch that disabled the driver's ability to hit the brake.
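The warn-then-slow behavior described above is essentially a timeout escalation. As a rough illustration only (the thresholds and response names here are invented, not Tesla's actual values), it might look like:

```python
# Hypothetical hands-off-wheel escalation logic; thresholds are invented
# for illustration and are NOT Tesla's real values.
WARN_AFTER_S = 15   # start visual/audible nagging
SLOW_AFTER_S = 30   # begin slowing the car if the driver still hasn't responded

def assist_response(seconds_hands_off: float) -> str:
    """Map time with hands off the wheel to the system's escalating response."""
    if seconds_hands_off < WARN_AFTER_S:
        return "normal"
    if seconds_hands_off < SLOW_AFTER_S:
        return "warn"            # nag the driver to retake the wheel
    return "slow-and-hazards"    # reduce speed until the driver responds
```

The point of an escalation like this is that the system never assumes the driver is attentive; if the driver ignores it, it degrades toward a safe state rather than continuing at speed.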
Re: (Score:2, Interesting)
No. Tesla is not staffed completely by idiots. They may be a new company, but I am sure they have many experienced driving experts working for them. You can be sure that Musk has read many reports showing that if you take away all need for user input while driving down the highway (possibly for hours at a time), when an incident happens the "driver" will be completely unable to respond in time to be of any help. It's not rocket science, this is not a new field never studied before.
This driver did what all d
Re:So what does it do then? (Score:5, Insightful)
The Tesla does not drive for you in autopilot mode. You still have to tell it when you want to change lanes (which this person supposedly did just before the crash). Whoever was driving was alert and attentive enough to decide to change lanes literally a moment before the crash, so they must have assessed the surrounding vehicles and determined it was safe to do so.
As for your assumptions about driving, I have no idea where you're getting your data from as all Google cars have drivers that are paid to be attentive and all Teslas explain the features are to assist in driving, not autonomous driving... and they slow down and alert you if you don't keep your hands on the wheel.
I've regularly driven 5 to 7 hours at a time visiting family and friends every few weekends, and I almost always use my cruise control on the interstate. I have no idea why a Tesla which has enhanced cruise control and little else other than a collision warning system would make a human being so much more bored and inattentive they'd drive straight into a truck after changing lanes. That's just nonsense. I keep the A/C on high and play music or podcasts to entertain me, but I never zone out, change lanes, and run into the back of trucks. Not sure who on earth would.
The Tesla's enhancements don't ask the driver to "do nothing" any more than my cruise control does. They still have to physically tell the car to change lanes, watch the road for crazy drivers, note when and where to turn off the main road (even driving interstates, one can go through many off-ramps, yet still be on the same interstate), etc. It's not like getting into a cab and telling the driver where you want to go.
I've seen people doing their own make-up, reading newspapers, and even watching TV in their vehicles while driving on the interstate. Eyes completely off the road in front of them, vehicle on cruise control (I presume). Those are morons... and my money is on this guy watching Harry Potter instead of being a responsible driver. Don't blame the vehicle for human laziness. There's no excuse for it.
Re: (Score:3)
And? Nobody was expecting Tesla to calculate the trajectory of the trailer and take an intelligent detour around it via a side street. It is well within the autopilot features to stop the car if there is an obstacle in the road in front of it. The size of the fucking trailer, mind you. Yes, it's not meant to be a full-featured self-driving car. But stopping before hitting an obstacle is very much expected.
Also, who tested autopilot at Tesla? It's not like tractor trailers are rare on the road. You just take
Are you being sarcastic? (Score:4, Interesting)
I do not understand. Why would you think they do in the first place? Perfect SF movie artificial intelligence has not been invented and installed in a car. Are you being serious?
Re: (Score:2)
Re: (Score:2)
I don't know the real reason, but I'm guessing it's a flashy feature meant to impress others since it's an expensive car... but, it is mostly billed as the car coming to you when the weather is poor and while it isn't meant to go far, the dream is to have it show up at the doorway when summoned from a large parking lot -- like a valet. The bigger dream is to be at work and summon your vehicle which is parked at home, but Tesla's not there yet.
Think rich person's digital valet service. When combined with
Re: (Score:2)
What exactly is the point of it? To lull you into a false sense of comfort and security?
I rather expect that Tesla will fix this particular problem quickly, if a fix is possible; so that the next time a white tractor trailer with high ground clearance is crossing in front of a Tesla (whose driver is not paying attention) on a sunny day, the Tesla will notice it and slow or stop, as necessary.
Whether or not that fix will make the Tesla system "safe enough" is still debatable.
Re: (Score:2)
Re: (Score:2)
I don't have a Tesla but the most useful thing I could imagine the Tesla autopilot for is actually stop and go traffic, where the car could do a great job of removing the tedium of constantly adjusting speed, you just watch the cars all around you.
It would also allow you to pay somewhat more attention to what drivers are doing behind you so you could avoid an accident - I've avoided several rear-end collisions just because I saw something bad was happening behind me and if I didn't move out of the way someh
Re: (Score:3)
"I don't have a Tesla but the most useful thing I could imagine the Tesla autopilot for is actually stop and go traffic, where the car could do a great job of removing the tedium of constantly adjusting speed, you just watch the cars all around you."
You don't need a Tesla for that. "Intelligent" cruise control that does exactly that has been in the market for quite a few years now.
Re: (Score:2)
Re: So what does it do then? (Score:2)
What does it mean that a Navy SEAL was into Harry Potter though?
Don't pick up the soap?
I'll Be Back (Score:2)
Re: (Score:2)
In related news, I haven't seen my cat since I bought the Roomba.
Re:I'll Be Back (Score:5, Funny)
Try looking on top of your roomba. Or in the box the roomba came in.
Re: (Score:2)
Try looking on top of your roomba. Or in the box the roomba came in.
Hmm... Or inside. How big is the Roomba and how small is the cat?
human nature doesn't mesh well with this. (Score:5, Insightful)
The problem is that if it slightly resembles a full-on AI based driverless system, that's how people are going to treat it no matter how many lawyeresque warnings you thrust in front of them and no matter how many forms they have to sign telling them it is just fancy lane assist.
It's just human nature: if people aren't actively involved in the driving process, their attention is going to wander. It's how we as humans are wired up. For a long trip, I'm not sure I could stay focused at all times, even though I'd know perfectly well I was risking my life if my attention wandered. If I'm driving, that's one thing, but if the car is doing 99.9% of it, the other 0.1% is going to pose a real serious problem.
If you build "almost an autopilot", that is a recipe for people treating it like what it resembles but isn't.
Re: (Score:3)
According to Snopes, this is an urban legend: http://www.snopes.com/autos/te... [snopes.com]
You can't do autonomous half-way like this. (Score:5, Interesting)
The car was basically equipped with a stay-in-lane and slow-down-if-you-approach-the-car-in-front-of-you kind of system, which is not an autonomous vehicle, nor can you take your eyes off the road. At best it reacts a bit faster if someone in front of you hits the brakes. Google did a talk on this and said in their tests, as soon as a car seems to be working by itself, drivers stopped paying attention to the road, so half-way-autonomous is a bad idea. People don't want to pay attention and they won't if the car seems to be doing a good enough job.
Only a fully autonomous car will be good enough.
Re: (Score:2)
Or not at all.
Re: (Score:2)
Yeah, no cars at all, better use horses. They aren't so stupid as to run into obstacles at full speed.
Re: (Score:3)
Horses are a good idea, but they're not up to the task of driving a car at freeway speeds. A better solution would be to outsource driving. Let your car be remotely piloted by a driver working for pennies via VR in India.
Re: (Score:2)
The NORMAL rate is a death every 96 million miles. So, what this means is a roughly 26% lower fatality rate.
In fact, Tesla has many instances of this already saving lives in situations where they would have died in other vehicles, or been injured in the Tesla. IOW, this is already proving itself to be safer.
Now, as to Mr. Brown going on with watching TV, that is sad. OTOH, millions of drivers text and drink EVERY DAY
Re: (Score:2)
actually, more than 130 million miles have been logged by drivers doing this. 1 life has been lost. The NORMAL rate is a death every 96 million miles. So, what this means is a roughly 26% lower fatality rate.
I don't know if "normal" is the word you're looking for, unless you're saying they need to start killing more people.
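Taking the figures quoted in this thread at face value (130+ million Autopilot miles with one fatality, versus one fatality per 96 million miles overall), the claimed improvement is easy to check, and it comes out closer to a quarter than a half. A back-of-the-envelope sketch, with the caveat that a single fatality makes any such rate estimate statistically fragile:

```python
# Figures as quoted in the thread; not independently verified.
tesla_miles_per_fatality = 130e6
us_miles_per_fatality = 96e6

tesla_rate = 1 / tesla_miles_per_fatality   # fatalities per mile
us_rate = 1 / us_miles_per_fatality

reduction = 1 - tesla_rate / us_rate
print(f"fatality-rate reduction: {reduction:.0%}")   # prints "fatality-rate reduction: 26%"
```

In other words, the rate ratio is 96/130 ≈ 0.74, a ~26% reduction; "50% less" would require roughly 192 million miles per fatality.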
Re: (Score:2)
It doesn't actually have to be fully autonomous to be useful. It doesn't need to know how to navigate and it doesn't need to be able to turn. It *DOES* need to know to stop to avoid accidents. It DOES need to be perfectly safe even if you fall asleep, even if you might not be in the right state when you wake up.
Re: (Score:2)
Indeed, autopilots on boats kill people all the time. (Most common case, the guy falls overboard while taking a pee and the autopilot sails away, leaving him to drown.) Yet I've never heard of anyone arguing that they should be banned (or even that they should only be allowed on boats with indoor plumbing).
Harry Potter my ass... (Score:5, Funny)
Re: (Score:2)
"How long was your wand again, Harry? 11 3/4?"
Re: (Score:2)
potter: British term for putter.
putter: a golf club...
Harry, potter my ass! Yikes.
Google vs Tesla approaches to self driving cars (Score:5, Interesting)
Re: Google vs Tesla approaches to self driving car (Score:2)
The other main difference is that Tesla has logged data from 50 million miles of autopilot data from all over the world, while Google has logged data from 1.5 million miles mainly in the Bay area.
I think this gap will widen exponentially, and good-enough AI for driving will come only through masses of data, so Tesla has a huge advantage.
Re: (Score:3)
I am a bit surprised about the belief that AIs (or machine learning) will solve all problems given enough data.
What do you think a neural net would have learned to do if trained to use VW's "AdBlue" as efficiently as possible but still to pass the NHTSA conformance test?
Who would you blame then? After all the constraints look reasonable. Would you want to be the engineer sued because he did not predict the neural net might learn something illegal?
Plus, there is obviously a problem with the way Tesla gathers
Re: (Score:2)
In Tesla's case, they have 130+ million miles logged on the system with exactly 1 fatality and no injuries.
However, the NORMAL case is that a fatality and numerous injuries are logged every 96 million miles in America. So, at this time, that works out to a roughly 26% lower fatality rate and a great deal fewer injuries.
Right there, it says that Tesla has the right solutions by saving more lives and within another year, they will have the count down even fur
Truck at fault (Score:2)
As far as I can see the truck driver was at fault, so why is such a big deal being made about this? Of course automation is going to make drivers lose concentration. That's been understood for decades.
Re: (Score:2)
The reason is that once AP is further along, states will decide to dedicate highway lanes to them so that they can move more people along more safely. In addition, at some point, they will then decide to allow these automated cars to move at the speeds for which highways were designed. When I was growing up, the highways were 90 MPH, while most had been designed for 120 MPH. Here in the west, we can do an easy 120 MPH, especially on a clear day. As such, these vehicles will then
Nice editing (Score:2)
Apparently a lawyer for the family has a mental defect that causes them to repeat statements. Either that, or the /. editors are once again showing their true dedication and attention to detail. Either way, things that were getting better following the most recent change of hands have already begun to erode.
Driver assistance system or autopilot system ? (Score:4, Insightful)
Then maybe they should stop using the misleading name "autopilot" for this functionality.
Re:Driver assistance system or autopilot system ? (Score:5, Insightful)
Most people understand "autopilot" to be something that keeps an airplane flying in a straight line. In that regard, the term isn't misleading.
Even a modern autopilot won't help you in an unexpected situation. You still need a real pilot to handle interesting things.
Re: (Score:3)
Most people understand "autopilot" to be something that keeps an airplane flying in a straight line.
I don't think most people understand that.
Re: (Score:3)
Most people understand "autopilot" to be something that keeps an airplane flying in a straight line.
I don't think most people understand that.
Most people don't know shit, and should disqualify themselves from making assumptions. When they don't, that's their fault, not anyone else's. Compare first aid. If you have first aid training and you help someone after an accident you're basically protected from liability only as long as you stay within your training. If you attempt to exceed it, you can potentially be held liable. If a person knows enough to make a reasonable assumption, and that conclusion turns out to be false because of a deliberate at
Re: (Score:2)
Aircraft can do fully automated landings now. It's quite commonplace even on passenger airliners. Pilots are still better at handling difficult conditions like crosswinds, but if visibility is too poor for a pilot to land they just flip the switch for automatic landing.
Re: (Score:2)
Aircraft can do fully automated landings now. It's quite commonplace even on passenger airliners. Pilots are still better at handling difficult conditions like crosswinds, but if visibility is too poor for a pilot to land they just flip the switch for automatic landing.
Sure, sure. But how often does a tractor-trailer cut in front of a plane while it's trying to land? Not sure even their AP is programmed for that.
Re: (Score:2)
Auto means self. Pilot means pilot.
If you call something an autopilot and it can't pilot the vehicle in the vast majority of situations autonomously, you're misrepresenting it.
Re: (Score:3)
Most people understand "autopilot" to be something that keeps an airplane flying in a straight line.
Rubbish. Most people probably think "autopilot" means that inflatable doll in the movie Airplane. I'd fully admit that I've got no idea precisely what a modern autopilot can and can't do or what the rules are for using them - what I do know is that (a) pilots are much more thoroughly trained and monitored than car drivers, and are more likely to follow the rules when flying on autopilot and (b) planes fly for thousands of miles on pre-set courses without passing within a mile of other traffic, and its proba
Re: (Score:2)
In general, Tesla drivers treat the AP as an aid to help with the driving. And it's really good at that, actually. Yes, I do own a Tesla.
Re: (Score:2)
Most Tesla owners think Autopilot means "I can watch Harry Potter while the car does all the driving for me", as has been eloquently proven in this example.
Actually, every regular driver, in order to use AP, has to sign off on a paragraph that tells you that it is NOT fully safe and warns you to keep watch on the situation.
In addition, with 130+ million miles logged before this first fatality AND zero injuries, vs. an average of 96 million miles per fatality and a large number of injuries in normal cars (the Tesla injury rate is far, far less than even Volvo's), this is already a much safer system than relying on normal drivers alone.
And
No camera? What about Lidar? (Score:2)
I thought that high-end consumer vehicles employed Lidar to detect physical objects in front of them?
And isn't it a requirement of Tesla to have the cameras installed before you install the autopilot software?
Tesla is still very much to blame here (Score:2)
So Tesla says the auto-pilot actually detected the trailer, but thought it was an "overhead sign" that was hanging high enough. What?! So it appears the sensors on Tesla are not precise enough to tell if the car can safely pass under something if it hangs over the road? I mean, come on, I'd be fine if the auto-pilot couldn't tell if the clearance is 10 feet or 12 feet. But a trailer? As far as I can find the standard floor height of a tractor trailer is 48". That means the clearance under is even less. It d
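The sanity check the poster is describing ("can the car safely pass under this?") is just a comparison of the measured overhead gap against vehicle height plus a margin. A minimal sketch, assuming a roughly Model S-sized car (about 57 inches tall) and an invented safety margin; none of this reflects Tesla's actual sensor logic:

```python
# Heights in inches. VEHICLE_HEIGHT_IN approximates a Model S; the margin is invented.
VEHICLE_HEIGHT_IN = 57
SAFETY_MARGIN_IN = 12   # buffer for sensor error, road grade, suspension travel

def can_pass_under(measured_clearance_in: float) -> bool:
    """True only if the overhead gap comfortably exceeds the car's height."""
    return measured_clearance_in >= VEHICLE_HEIGHT_IN + SAFETY_MARGIN_IN

# A trailer whose floor sits at 48" leaves less than 48" of clearance beneath it,
# so the check fails and the car should brake rather than try to pass under.
print(can_pass_under(48))   # prints "False"
```

The poster's point is exactly this: with a 48" trailer floor, any reasonable height comparison should fail by a wide margin, so misclassifying the trailer as a high overhead sign implies the measured clearance itself was badly wrong.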
Re: (Score:2)
It makes sense if the car was traveling uphill and the computer doesn't take that into account or the sensors are just fixed at a single point regardless of incline.
And yes, we had a story of an auto-parking Tesla hitting a trailer.
First Driverless Car Death (Score:2)
A person died for the novelty of a car that seemed to drive itself.
Who will be next?
the real question is, is this safer? (Score:2)
So, how does this compare to the average?
In America, somebody dies every 96 million miles. In addition, there are a large number of injuries, though to be fair, injuries should probably not be looked at as much as accident rate (Tesla is the safest car on the road, bar none; they make Volvo look dangerous). So, at this
Which DVD was in the player? (Score:2)
The witness says a Harry Potter movie was playing. If he was making this up, then there's a less than one in a thousand chance that the DVD player actually contains a disc with a Harry Potter movie. (The last disc of the series was released on DVD in 2011. A Tesla owner would be much more likely to be watching a more recent release.)
Investigators know which disc was in the player, so they know if the witness is telling the truth.
He saw the truck and drove into it intentionally. (Score:3)
He was confused as to where platform 9 3/4 was.
Re: (Score:2, Funny)
How dare you speak like this about Martian Citizen Zero? He is going to lead the entire human species to Mars! Sure, a few airlocks may open on the way, food dispensers may not work, but think how jealous the people left on Earth will be!
Re: (Score:2)
I am jealous you insensitive clod !
Re: (Score:3)
I have to remember that old joke. From Soviet times, when else
GDR-FRG border. GDR subordinate storms into the office of his superior.
Sub: Comrade! The Russians, they're on the moon!
Sup: All of them?
Sub: No... they just sent a capsule up.
Sup: Then why the fuck do you wake me? Just report when they're all gone.
Re:By far... (Score:5, Informative)
Yes, I know, jealousy of Musk is a big motivation for you people to hate on him, however if we ignore that for a second..

The point here is someone knowingly placed their life in the hands of automation, and paid the price for that. Real world conditions mean that automation is NEVER perfect, and this is new automation at the cutting edge (sigh) of such things.

The larger issue to me is why the DRIVER did not notice a truck across the road in front of them. Are we to believe that the software should have spotted it, and yet it was so hard to spot that a driver who was paying attention could not? That would certainly stretch the bounds of credibility quite far.

It seems quite clear here that the driver was not watching the road ahead - in fact was ignoring it enough to not notice a whole, large truck trailer unit turn in front of them in clear view. In other words they were, unfortunately for them, doing something stupid.

But no, people are going to try and blame automation, because otherwise it would be a dead person at fault.. And that is just not nice, right? However, this is NOT a case where a driver jumped on the brakes and they did not work, or tried to turn the car and it went straight ahead (at least none of that is being claimed). It is a case where a driver of a car at speed was not aware of the road directly ahead of them, that makes this border on a darwin here folks..
Maybe (Score:3)
It is a case where a driver of a car at speed was not aware of the road directly ahead of them, that makes this border on a darwin here folks..
Not necessarily. Maybe the rest of the automation had been so good that the driver saw the truck, but believed the car also saw the truck. If you are a passenger in a car, you don't pull the handbrake to avoid an accident when you expect the driver is going to press the foot brake.
That said, he was probably just watching Harry Potter.
Re:Maybe (Score:5, Interesting)
I hear what you are saying, but I suspect you are missing one basic part of human psychology.

I have spent quite some time around motor racing, including being a passenger with some very good track drivers (much better than I will ever be ;) in some very fast 2-seaters. There is one thing that will ALWAYS happen in such a situation: after a few laps the passenger will have a very sore braking leg. The reason is that it is pretty much impossible NOT to push your foot, even on a non-existent brake, as you hurtle beyond what you believe is the safe point towards a collision - unless you are unaware of the collision. You will literally try and push your foot through the floor trying to help the driver stop.

Of course I think the truck driver is being rather 'creative' here also, however in this case the telemetry will tell pretty much all, and even if we never know, the powers that be will know the speed, control inputs, etc. that the car had before, during, and after the crash.

None of this makes it any better for the driver, his family, the truck driver, or anyone else involved.

But come on people, pointing the finger at Tesla really is a step too far. It is like blaming the national mint for a bank robbery.
Re: (Score:3)
We are just now starting to see the testing of autonomous vehicle cases in court. Even if this isn't an autonomous vehicle, decisions on this level of car will be used as a model when trying incidents where fully autonomous cars are involved.
Re:Maybe (Score:4, Insightful)
That's a leap. The Tesla's autopilot features are more akin to cruise control or auto-braking when backing out if someone should walk behind the vehicle. The Teslas were never designed to be autonomous and are severely limited compared to a Google self-driving vehicle. They also clearly state that the driver is liable and should have proper control over the vehicle at all times (hands on wheel, foot near brake, eyes on road, etc.), so there's little wiggle room for anyone to be at fault other than the driver except in cases of severe malfunction where the driver is unable to regain control of the car at all.
Their most autonomous modes are to "summon" the vehicle at 5 mph or less in a parking lot and/or to parallel or perpendicular park on their own. I could see the potential for some lawsuits questioning who was at fault if the Tesla hit something while in summon mode with no one behind the wheel.... but, I would hope that would also be the owner's responsibility for not ensuring a safe, unobstructed path for summon to work properly. The Tesla's sensors are few and not very advanced compared to cars designed for autonomous driving. Basing liability laws on what they do would be a bit like basing laws for adults on toddler behavior. A three year old stripping down naked and smearing crayon and magic markers all over a public area would likely be the parents' or guardians' fault for lack of supervision... an adult performing the same behavior would likely be considered fully responsible and find him or herself fined, imprisoned, and/or institutionalized and possibly on the sex offender registry.
Re: (Score:3)
Or maybe the truck really was hard to see against the bright background, and the human driver did no better than the AI.
That's so unlikely that it's not even worth mentioning. And as has been mentioned, if your visibility is reduced you should slow down; the same goes for auto-pilot.
Re: (Score:2)
If that was the case then the A.I. cargo cult has gone way too far so we have to very actively inform people that they are not in a SF movie and the machine is way too dumb to do their thinking for them.
Re:By far... (Score:5, Informative)
The problem is not the Autopilot feature but the way it has been misleadingly and dangerously marketed.
Musk bragged to the press that Autopilot was "almost twice as good as a person," certainly sending the wrong message. His ex-wife posted a YouTube video of her driving while covering her eyes and dancing around while on Autopilot on a crowded highway. All this has encouraged a bunch of other YouTube videos of people behaving foolishly while on Autopilot.
https://www.yahoo.com/news/tesla-mixes-warnings-bravado-hands-free-driving-002343250--finance.html
Even the marketing name "Autopilot" is probably misleading to some people, who might interpret it as "the car drives itself without human assistance". It should have been more conservatively called "driver assist" or some such.
In the end their marketing stupidity is probably going to bite them financially. A dashboard warning doesn't excuse it. I say this regretfully as a Tesla stockholder.
Re: (Score:2)
My phone has a "lightning" charger. Should I hold it out in the rain and hope for a surge to fill up the battery?
Sure, if that's what you think it means, be my guest. However, the issue is not what you or I think it means but whether a jury can be convinced. In your case, good luck.
Re:By far... (Score:4, Informative)
FYI, it's envy, not jealousy. Jealousy would mean you were afraid of losing Elon. Envy means you wish you had all his cool stuff, fun life, and hot ex-wife. I only make this correction because I found out I was saying it wrong for 30 years...
Re: (Score:3)
The larger issue to me is why the DRIVER did not notice a truck across the road in front of them.
How can we be sure that the driver did *not* notice the truck?! The fact that he didn't step on the brakes, you say? Maybe he noticed it, but thought "whatever, my Tesla is smart enough to stop if it needs to". And by the time he realized that it wasn't stopping, he just didn't have enough reaction time to lift his foot off the floor and apply the brakes? Every time I see a demo of the smart cruise control, where the car can stop if there's an obstacle, drivers are told to resist stepping on the brakes and t
Re:By far... (Score:4, Insightful)
Then that is Darwin award territory.
Re: (Score:3)
But no, people are going to try and blame automation, because otherwise it would be a dead person at fault.. And that is just not nice, right? However, this is NOT a case where a driver jumped on the brakes and they did not work, or tried to turn the car and it went straight ahead (at least none of that is being claimed). It is a case where a driver of a car at speed was not aware of the road directly ahead of them, that makes this border on a darwin here folks..
Sure, the person was at fault for not paying attention while driving.
But people not paying attention while driving is the obvious outcome of giving a car an "autopilot" that operates on highways.
Re: (Score:3)
You mean he purchased an automobile?
The airbag blew up in his face?
No. It is a common tactic in most industries to imply human error was the cause immediately after an accident. This quickly placates the general public. When the results of the investigation prove the claim was unfounded, that information doesn't get remotely as much publicity.
Re:By far... (Score:5, Insightful)
but that does not change the fact that they released a fundamentally flawed and extremely dangerous product.
At this stage, that is your opinion and not a fact. Don't present it as such.
Re:By far... (Score:4, Insightful)
The fact is that Tesla states that: "Autopilot is by far the most advanced driver assistance system on the road, but it does not turn a Tesla into an autonomous vehicle and does not allow the driver to abdicate responsibility."
According to the GP, once you take human psychology into account, this is what makes it a fundamentally flawed and extremely dangerous product. People will watch Harry Potter movies in this car, they will have horrible response times because they don't need to pay attention, they will get into accidents when the 'driver assistant' fails, and Tesla will try to abdicate responsibility each and every time based on contractual terms.
Re: (Score:2)
That bit only makes sense if the car had no idea how far away the truck was. As the car approached the truck, the apparent clearance should have changed, and the car should have realized something was wrong if it didn't.
Re: (Score:2)
Re: (Score:2)
Re: (Score:3, Informative)
TCAS is only a warning system, and that's unlikely to change any time soon.
Airplanes on autopilot will happily fly into other airplanes, mountains, buildings, etc... TCAS and GPWS will give aural and visual warnings, but that's it. If the pilots ignore those warnings, the autopilot will just continue on its path.
By the way, TCAS relies on transponders in other aircraft that broadcast their position and altitude. Cars don't have those.
Re: By far... (Score:5, Insightful)
Allowing the user to have hands off for 30 seconds is problematic for Tesla. A lot can happen in 30 seconds; it's an arbitrary duration. Why not 5 seconds?
IMHO, it should not be called autopilot or autonomous driving because it's not truly that yet. "Assisted control" is more appropriate.
Re: By far... (Score:4, Insightful)
Re: (Score:3)
Another thing that probably could have prevented this: raw camera data.
There's no way that a truck was reflecting exactly the same amount of R, G and B (let alone full spectrums) as the sky. But it probably was the same pixel in the image taken by the camera: 255,255,255 (or higher if it's more than 8 bits). The process of converting raw data to linear space tends to truncate both dark and bright pixels; in reality you may have one pixel of a dark object in a shady place indoors that's numerous orders of m
Re:Rather have a Subaru (Score:4, Interesting)
More to the point about spectrums... mid to near IR should show operating vehicle engines, potentially exhaust, etc as hot pixels. And long-wave IR should show people and animals as hot pixels. Both of which sound *incredibly* useful.
That said, I'm not sure where traditional CCDs stop being sensitive... I imagine they don't go all the way down to the long-wave spectrum. They do of course make cooled IR cameras that capture long-wave, but they tend to be larger and more expensive. Hmm, let's see how far traditional uncooled CCDs can go... I'm seeing a number of pages putting the range limit at around 150 (or 300?) to 1100nm (human vision is 380 to 750nm, give or take). I wouldn't be surprised if some parts of an engine would glow reasonably well in the 1000+ nm range... but that's *if* you could see it, without something blocking the radiation. I doubt they could see exhaust, at least at the point it leaves the tailpipe. You'd need a specially designed, cooled camera if you want to see the longer wavelengths.
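A quick back-of-the-envelope check supports this: Wien's displacement law gives the peak emission wavelength of a warm object, and for body- or engine-temperature sources that peak lands deep in the far-IR, way past the ~1100nm cutoff of a silicon CCD. A small sketch (the temperatures are rough illustrative guesses, not measured values):

```python
# Wien's displacement law: peak blackbody emission wavelength vs. temperature.
# Shows why warm engines and people radiate far outside a silicon CCD's
# roughly 300-1100 nm sensitivity range.
WIEN_B = 2.898e-3  # Wien's displacement constant, in m*K

def peak_wavelength_nm(temp_k):
    """Peak emission wavelength (nanometers) for a blackbody at temp_k kelvin."""
    return WIEN_B / temp_k * 1e9

print(round(peak_wavelength_nm(310)))  # human body (~310 K): ~9348 nm
print(round(peak_wavelength_nm(400)))  # hot engine part (~400 K): ~7245 nm
```

Both peaks are several times longer than anything an uncooled silicon sensor can see, which is exactly why long-wave thermal imaging needs different (often cooled) detector materials.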
Re: (Score:3)
Regular lens glass isn't great for IR.
Many of the best IR lens materials can't stand humidity.
http://www.edmundoptics.com/re... [edmundoptics.com]
This was a software issue. The camera 'saw' the truck, but the edges didn't have high enough contrast.
Fundamentally though, this was an autopilot induced crash. If the driver had continued to pay attention, he could have avoided it. Airplanes had the same problem, driver assists are dangerous if they allow the driver to feel safe when not focused on driving.
Re:Rather have a Subaru (Score:4, Interesting)
Again, it depends on what you mean by "IR", which is a very broad spectrum range. Cameras often have to add a special IR filter to block near-IR because the lens doesn't block it on its own. You can see here the transmission spectrums of different types of glasses and plastics [spiedigitallibrary.org]. You can see that as a general rule they're good at blocking UV but not IR, at least near-IR (750-1400nm). They tend to block more IR the closer you get to the far-IR spectrum, however.
Images being overexposed will do that to you. And the overexposure of an image isn't a fundamental aspect of CCD hardware, it's a processing artifact.
Example: take this image [wordpress.com]. Note how the boundary between the car and the sky in this picture is completely lost. It's not like the CCD is receiving the exact same amount of photons from the car and the sky - they're actually going to be very different. But they're both truncated off at maximum brightness when saved into an "image" - and that image is then provided to the autopilot. In severe cases, the autopilot is highly disadvantaged, if not inherently doomed, no matter how good its software is. Human eyes don't have that limitation - we can see bright and dark areas simultaneously and make out details in both.
The CCD is getting the data that's needed. But the autopilot isn't.
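The clipping effect described above is easy to demonstrate numerically. In this toy sketch (the radiance and exposure numbers are made up for illustration), two surfaces with clearly different brightness both saturate to the same 8-bit value, so all contrast between them is lost in the stored image:

```python
# Toy model of 8-bit quantization with clipping: two surfaces of different
# radiance can map to the identical pixel value once the sensor saturates.
def quantize_8bit(radiance, exposure):
    """Convert a relative radiance to an 8-bit pixel value, clipping at 255."""
    value = radiance * exposure * 255
    return min(255, max(0, round(value)))

sky = 4.0            # bright sky (arbitrary relative radiance units)
white_trailer = 2.5  # white trailer side, noticeably dimmer than the sky

print(quantize_8bit(sky, 0.5), quantize_8bit(white_trailer, 0.5))
# both clip to 255: the boundary between truck and sky vanishes
```

With a shorter exposure (say 0.2), the same two radiances quantize to distinct values and the edge is recoverable, which is why high-dynamic-range capture or exposure bracketing matters for vision systems.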
Re: (Score:2)
Re: (Score:2)
Police lidars work at a hundred metres or so. That's enough for avoiding conflicts like this.
Re: (Score:2)
Been wondering what it would be like with 10s of cars, across multiple lanes and in both directions, having LIDAR and/or RADAR actively sweeping their surroundings....
Re: (Score:2)
It's a 2D version of TCAS [wikipedia.org], where the vehicles have the option of coming to a complete halt if they face an impossible situation. TCAS uses communication between aircraft so that they can negotiate a solution: if two aircraft approach head-on, one will tell its crew to pull up and the other to descend.
In your scenario, communication between vehicles would definitely be required.
Re: (Score:2)
People are idiots. Tesla's mistake was in failing to take this into account in their promotion. They made a very sophisticated driver assist, but if you call it "autopilot" then you're going to get some idiot who watches a movie while driving because he believes the driver assist is far more capable than it actually is.
Re: (Score:2)
Re: (Score:2)
The ONLY reason I want a self-driving car is so that I can study during my daily commute. I consider driving time wasted time. I already know the scenery, I don't care for news on the radio, and I don't enjoy driving. I really want self-driving cars to get to the point where I can safely pick up an O'Reilly book and study.