Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Fatal autopilot crash, NHTSA investigating...

Tesla has put itself in a difficult position by telling drivers to keep their hands on the wheel, but not reinforcing it with an alarm or deactivation for several minutes.

Obviously, they know drivers aren't keeping their hands on the wheel, and that not doing so is a major attraction of Autopilot for Tesla's customers. It helps to sell more cars.

They should change the nag interval to seconds, not minutes. Doing so would put the system in line with other steering assist systems on the market. That is, if Tesla is really concerned about safety and not just giving it lip service.
 

My Prius doesn't make an alarm when my hands aren't on the wheel. Maybe Toyota isn't serious about safety.
 
Toyota engineers just assumed you would be steering the car yourself, since, you know, it doesn't have an autosteering system.

:( but but but Larry and Sergey aren't using hands haha

[Image: Sergey Brin, Larry Page, and Eric Schmidt with a Google self-driving car]
 
Not sure if this has already been posted, but it looks like NHTSA is investigating a SECOND Autopilot crash that took place on July 1:

Second Possible Tesla Autopilot Crash Under Investigation

Also from CNBC's Twitter feed:

CNBC Now on Twitter

Why do I have a feeling this is going to get worse before it gets better?

I found this part of the Tesla quote interesting:
"As we do with all crash events, we immediately reached out to the customer to confirm they were ok and offer support"
A knucklehead ran a red light and destroyed the front end of my MS last month - no contact from Tesla except my call to the local store a day or two later, asking if they could look at the log to see if my MS had applied the brakes just prior to the knucklehead's impact. About a $30K estimate so far.
 

Did the airbags deploy in your crash?
 
Tesla has put itself in a difficult position by telling drivers to keep their hands on the wheel, but not reinforcing it with an alarm or deactivation for several minutes. ...

Or perhaps Tesla believes the vast majority of its customers are responsible drivers and shouldn't be adversely impacted with onerous "nag" messages and alarms, because a small minority "may" behave foolishly.
 
Tesla has put itself in a difficult position by telling drivers to keep their hands on the wheel, but not reinforcing it with an alarm or deactivation for several minutes. ...

Have you ever driven a Tesla?
 
Or perhaps Tesla believes the vast majority of its customers are responsible drivers and shouldn't be adversely impacted with onerous "nag" messages and alarms, because a small minority "may" behave foolishly.

If the driver is operating the vehicle responsibly, with his or her hands on the wheel as per Tesla's instructions, then they won't receive any "nags" or alarms. Therefore, no problem.
 
I'd like a simulator, or a series of "training" videos. The delivery guy spent, what, 90 minutes going through everything with me. I don't remember him saying that Autopilot cannot see a parked car if the car it is following changes lanes to avoid one. I learned that reading this forum. I now also know that it's in the manual - I haven't downloaded that, let alone read it, other than the bits and pieces I have read on the console screen. However, I would be up for viewing some "training" videos - they could fill in gaps in my knowledge and shorten the delivery guy's time (and cover second-hand sales, or "borrow my car" drivers), which might be important when trying to deliver 500,000 vehicles a year!
 
As far as I could see from the street view pics, the lane markers in this scenario were perfect?

Do you get an alert when the cam loses lane markers?

No, the car may just unexpectedly begin to drift. It happens when lane markers change from white to black/white, sometimes on poorly paved sections, or when small hills appear in front of the vehicle so the camera cannot see over them to keep viewing the lane markings; it may also occur at intersections. Anywhere that is not fully striped, or where the view is impaired, can result in lane tracking being lost, and if it is not following another vehicle, the Tesla will hunt (or, as my wife calls it, bob and weave) searching for one or both of the lane markers.
 
Tesla Motors Inc. (TSLA) Feuds With Fortune Over Disclosure - ValueWalk https://apple.news/AWyDzFfTWQnKXw0TVdBJOaw

Glad to see a journalist finally call out the lies published by major media outlets.
A great article, but your link doesn't work. Here's a working link and a highlight from the article.

Tesla Motors Inc. (TSLA) Feuds With Fortune Over Disclosure – ValueWalk

"Here’s what we did know at the time of the accident and subsequent filing:

  1. That Tesla Motors Autopilot had been safely used in over 100 million miles of driving by tens of thousands of customers worldwide, with zero confirmed fatalities and a wealth of internal data demonstrating safer, more predictable vehicle control performance when the system is properly used.
  2. That contrasted against worldwide accident data, customers using Autopilot are statistically safer than those not using it at all.
  3. That given its nature as a driver assistance system, a collision on Autopilot was a statistical inevitability, though by this point, not one that would alter the conclusion already borne out over millions of miles that the system provided a net safety benefit to society.
Given the fact that the “better-than-human” threshold had been crossed and robustly validated internally, news of a statistical inevitability did not materially change any statements previously made about the Autopilot system, its capabilities, or net impact on roadway safety.

Finally, the Fortune article makes two other false assumptions. First, they assume that this accident was caused by an Autopilot failure. To be clear, this accident was the result of a semi-tractor trailer crossing both lanes of a divided highway in front of an oncoming car. Whether driven under manual or assisted mode, this presented a challenging and unexpected emergency braking scenario for the driver to respond to. In the moments leading up to the collision, there is no evidence to suggest that Autopilot was not operating as designed and as described to users: specifically, as a driver assistance system that maintains a vehicle’s position in lane and adjusts the vehicle’s speed to match surrounding traffic.

Fortune never even addresses that point. Second, Fortune assumes that, putting all of these other problems aside, a single accident involving Autopilot, regardless of how many accidents Autopilot has stopped and how many lives it has saved, is material to Tesla’s investors. On the day the news broke about NHTSA’s decision to initiate a preliminary evaluation into the incident, Tesla’s stock traded up, not down, confirming that not only did our investors know better, but that our own internal assessment of the performance and risk profile of Autopilot were in line with market expectations.

The bottom line is that Fortune jumped the gun on a story before they had the facts. They then sought wrongly to defend that position by plucking boilerplate language from SEC filings that have no bearing on what happened, while failing to correct or acknowledge their original omissions and errors."

I note that the article cites the police report in reporting that the cause of the accident was primarily a truck driver turning left across oncoming traffic, creating a challenging emergency situation that would be tough for any driver or driver assistance technology. Virtually no other reporter or article bothers to even quote this most basic and relevant fact from the police report; instead they invent speculation, quote people who don't know what they are talking about, and print other nonsense.
 
Or perhaps Tesla believes the vast majority of its customers are responsible drivers and shouldn't be adversely impacted with onerous "nag" messages and alarms, because a small minority "may" behave foolishly.

No car company should assume that the "vast majority" of its customers are responsible drivers.
From my daily experience, the ratio of responsible to irresponsible drivers (including those who don't indicate (especially on the Autobahn and in roundabouts), ignore speed limits, take your right of way, ignore red lights, ignore pedestrians on zebra crossings, or text on their phones) is about 50:50, nowhere near the "vast majority".

Interestingly enough, of the three occasions I encountered a Model S directly (i.e. not simply seeing one drive by on the opposite side of the Autobahn), two involved irresponsible behaviour. One took my right of way so we almost crashed; the other was speeding far above the allowed limit. Certainly not representative, but at least an indication that even Tesla drivers might not all (or the "vast majority") be responsible drivers.

Tesla imho should indeed stay on the cautious side when deciding what to allow drivers to do. Far too much damage can be done to Tesla's image if the general public starts thinking "oh, that's the manufacturer that doesn't care as much for safety as the others" (great crash test results notwithstanding).
 
... a truck driver turning left across oncoming traffic creating a challenging emergency situation that would be tough for any driver or driver assistance technology. ...

If every driver who illegally turned left in front of me had caused an accident, I'd have been dead over 1,000,000 miles ago. It's pretty common.

If you ride a motorcycle on the street, you will experience it constantly. But I even get it when driving a bright red truck with my lights on.

You need to anticipate your next accident. Figure out how it will happen. And the first step is to assume that nobody on the road can see you. Drive like you are invisible, and driving becomes much safer.

The "pilot" did not see the truck driver start to do something stupid. The instant the truck started to violate your right-of-way, the "pilot" should have covered the brake and looked for an exit strategy, and picked where the crash was going to occur.

A good driver knows which cars are riskier than others in traffic:

  • They don't make eye contact.
  • They are looking down.
  • They are wandering in their lane.
  • They are making a lot of lane changes.
  • They are occupied eating, putting on makeup, or talking on a phone or with passengers.
  • Out-of-state plates, or a lot of dents on the car.

Add your own.

It will be a long time before software gets the seat-time to spot idiots before they become a threat. This truck driver might have been one such idiot that could have been spotted.
 
[and so on]

Why, then, do most (all? Lada is out of business after all) car companies sell cars that can vastly exceed speed limits, for instance? Would you take the position that they shouldn't do so?

Give it time. Governments are already aiming in that direction.

But it is not lawyers who are at the technological forefront of owner-defined HP attenuation. Few lawyers even know how to select tires.

Many cars can be programmed to limit HP and speed. Fords can be ordered with two keys: an "employee" key and a "boss" key. The Hellcat has a valet key. New GMs have "teen driver" lockdowns. I tune GM cars and trucks to accept Valet Mode switches easily. Turn a key and reduce HP by 50%, and set the speed limiter to any given number.

But? Nearly all fatal accidents happen under 45 mph. Look it up. Even though cars that go over 200 mph are now affordable and plentiful, the fatalities in those cars are not higher. My Chevy goes over 200 as sold, and it's cheaper to insure than our Volts.