
Fatal autopilot crash, NHTSA investigating...

In allowing incremental advancement of imperfect technology, it has always been the human operator who is responsible for operating the machine safely. A human should be able to control an airplane or a car; you can't say "because it's beta, I forgot how to fly or how to drive."
@Tam, well put. I agree.

I think Tesla labeled Autosteer as "beta" (whereas Daimler does not use that label on its Distronic system, even though it is far less capable) because Tesla is a more software-driven company with a deeper understanding of the perils and potential of modern technology. Autopilot is a "beta" product in the sense that it needs to be used carefully and thoughtfully, given the seriousness of high-speed driving errors, and its use requires a human driver to constantly supervise the operation of the Tesla while Autopilot is engaged. That is made clear in the owner's manual and through the onscreen warnings when the owner first activates Autopilot.
 
When the police report states that the truck driver failed to yield the right of way and turned left directly in front of the oncoming Tesla, it is amazing that Consumer Watchdog goes after Tesla, the car with the highest safety ratings.

I suggest sending them emails and filing a complaint about their own organization on their own website and Facebook page:

Consumer Watchdog

Consumer Watchdog (@ConsumerWD) | Twitter

jamie court (@RaisingHellNow) | Twitter

Create Consumer Complaint | Consumer Watchdog

Email: [email protected]

[email protected]

General number: 310-392-0522


Edit to add: I'm also leaving them a review on Facebook.

 
This is false. The Q50S function (which came out before Tesla's) is able to drive hands-free for up to 5 minutes at a time.
Infiniti’s ‘Auto Pilot’ Driving Straight Into the Future

Even for the older S-Class, although the timer nag is 12 seconds, the car keeps going hands-free even if you ignore it (it just sounds a warning tone).
Semi-Autonomous Cars Compared! Tesla Model S vs. BMW 750i, Infiniti Q50S, and Mercedes-Benz S65 AMG - Feature
The 2017 E-Class extends the nag to 1 minute. Although Mercedes says there is a controlled shutdown procedure (the car slows to a stop) if the nag is ignored, journalists who tested it were not able to confirm this actually exists.
New 2016 Mercedes E-Class: UK prices, specs and on sale date
http://jalopnik.com/2017-mercedes-benz-e300-mercedes-made-an-e-class-bette-1781983204

If the car keeps lane keeping active even after the nag, I'm not seeing how this necessarily enhances "safety" (assuming hands on the steering wheel means you are not distracted, which is not necessarily true in the first place).

Tesla's 7.1 nag is about 3 minutes.
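
For quick reference, here's a rough Python sketch consolidating the hands-off intervals quoted above. The figures are approximate (taken from this thread and the linked articles, not official specs), and actual behavior varies by model year, software version, and speed.

```python
# Approximate hands-off warning intervals quoted in this thread and the linked
# articles. Rough figures only; real behavior varies by model year, software
# version, and speed.
HANDS_OFF_WARNING_SECONDS = {
    "Infiniti Q50S": 5 * 60,          # reportedly hands-free up to ~5 minutes
    "Mercedes S-Class (older)": 12,   # ~12 s nag, but steering keeps working after the tone
    "Mercedes E-Class (2017)": 60,    # ~1 minute nag
    "Tesla Autosteer (7.1)": 3 * 60,  # ~3 minutes between nags
}

for system, seconds in sorted(HANDS_OFF_WARNING_SECONDS.items(), key=lambda kv: kv[1]):
    print(f"{system}: first hands-off warning after roughly {seconds} s")
```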

I don't see how that demonstrates that what I said is "false." With the exception of Infiniti, for the most part the industry reinforces hands on the wheel with a much shorter interval than Tesla does (either with an alarm alone, an alarm accompanied by deactivation, or an alarm followed by deactivation).

It seems it is mainly non-Tesla owners who suggest more onerous nags. I have yet to see actual owners who want this (rather, they want the opposite).

Since I am a non-Tesla owner, perhaps Tesla owners can educate me as to how nags are onerous if one keeps one's hands on the wheel at all times per Tesla's instructions. These are Tesla's instructions, not mine.
 

To educate you: in my 90D, while AP is engaged and my hands are lightly on the wheel, not steering but going with the flow of AP, the nags still occur. AP wants you to move the wheel with a little tension so it knows you are there. Bottom line: the nags occur with or without your hands on the wheel.
 
If I just rest one hand at the bottom of the wheel, it provides occasional resistance and the nags never appear. It's pretty easy and safe to comply with the keep-hands-on-wheel requirement. It's good to get feedback through the hands, in addition to the eyes, when the car has its occasional truck or exit lust.
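
In other words, the check seems to be torque-based rather than touch-based: light contact isn't always detected, but a little resistance is. Here's a minimal, purely illustrative sketch of that idea; the threshold, interval, and function names are made up and are not Tesla's actual parameters or code.

```python
import time

# Hypothetical values for illustration only; not Tesla's actual parameters.
TORQUE_THRESHOLD_NM = 0.3   # minimum steering torque counted as "hands detected"
NAG_INTERVAL_S = 180        # roughly the 7.1 nag interval discussed in this thread

def monitor_hands(read_steering_torque, alert_driver):
    """Nag the driver if no steering torque has been felt for NAG_INTERVAL_S."""
    last_detected = time.monotonic()
    while True:
        if abs(read_steering_torque()) >= TORQUE_THRESHOLD_NM:
            # A little resistance on the wheel resets the timer.
            last_detected = time.monotonic()
        elif time.monotonic() - last_detected > NAG_INTERVAL_S:
            # Hands resting with no torque still trigger the nag.
            alert_driver("Hold steering wheel")
            last_detected = time.monotonic()
        time.sleep(0.1)
```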
 
So Elon just tweeted that his definition of "beta" in terms of Autopilot is any system with <1 billion miles of real-world driving.

Not all the features of Autopilot are labeled beta, if I remember correctly. The beta-labeled functions are Summon, Auto Lane Change, and, I think, Autosteer/lane keeping.

(As a side note, I wonder how 1 billion miles of lane changing would be accumulated, or of Summon, given that Summon can only move the car a few dozen feet per activation.)
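
Just to illustrate that side note, here's a purely hypothetical sketch of how per-feature mileage might be tallied from drive logs. None of the names or segment lengths here reflect Tesla's actual telemetry; the 1-billion-mile figure is only Elon's tweeted rule of thumb.

```python
from collections import defaultdict

# Hypothetical per-feature mileage tally; purely illustrative.
feature_miles = defaultdict(float)

def log_segment(active_features, miles):
    """Credit a driven segment to whichever features were engaged."""
    for feature in active_features:
        feature_miles[feature] += miles

log_segment({"autosteer", "tacc"}, 12.0)  # 12 highway miles with Autosteer engaged
log_segment({"auto_lane_change"}, 0.05)   # a single lane change covers very little distance
log_segment({"summon"}, 0.005)            # Summon moves the car only a few dozen feet

BETA_THRESHOLD_MILES = 1_000_000_000      # the "<1 billion miles" rule of thumb
for feature, miles in feature_miles.items():
    status = "beta" if miles < BETA_THRESHOLD_MILES else "candidate to exit beta"
    print(f"{feature}: {miles:.3f} mi accumulated -> {status}")
```

Under a scheme like this, Summon would take a very long time to accumulate meaningful mileage, which is the point of the question above.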
 

Realistically, the system will never be perfect. It might reach the point where it is noticeably safer than regular driving, but that doesn't mean it will ever reach the point where nothing goes wrong. So I wouldn't put a number on when it is no longer "beta."

Even when a system reaches the point of autonomous driving, it will still be difficult for it to replicate everything a human does. It will very likely be much safer than a human being at the controls, but there will still be unanticipated situations that a human being will handle better than the system alone. Once every car is run autonomously it might be a very different situation; until then, an autonomous driving program will still have to deal with irrational and flawed human drivers controlling other vehicles.
 

Elon has since clarified his tweet: 1 billion miles of real-world driving is part of the minimum required to leave beta, but a feature does not automatically leave beta after 1 billion miles.
 
I don't get it. I thought we were at 140 million miles so far? How can we add six times as many miles in six months?

AP is getting used more and more. People who have had the hardware since 2014 and 2015 are probably using it more, and there are also a lot more cars with AP on the road now. Between the Model S and X, I think they're over 2,000 cars a week now. It's not a linear growth curve.
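
To see why the curve bends, here's a back-of-the-envelope sketch in Python. The ~140 million starting figure and the ~2,000 cars/week are from this thread; the existing fleet size and the per-car Autopilot usage below are my own rough assumptions, so treat the output as illustrative only.

```python
# Back-of-the-envelope sketch of compounding Autopilot mileage growth.
# STARTING_MILES and NEW_AP_CARS_PER_WEEK come from this thread; the fleet
# size and per-car usage are rough assumptions for illustration.
WEEKS = 26                       # roughly six months
STARTING_MILES = 140e6           # ~140 million Autopilot miles so far
AP_FLEET = 90_000                # assumed cars with Autopilot hardware already on the road
NEW_AP_CARS_PER_WEEK = 2_000     # production rate mentioned above
AP_MILES_PER_CAR_PER_WEEK = 250  # assumed average Autopilot miles per car per week

total = STARTING_MILES
fleet = AP_FLEET
for week in range(WEEKS):
    fleet += NEW_AP_CARS_PER_WEEK           # the fleet keeps growing every week
    total += fleet * AP_MILES_PER_CAR_PER_WEEK
print(f"After {WEEKS} weeks: ~{total / 1e6:.0f} million Autopilot miles")
```

Under those made-up assumptions the fleet adds on the order of three quarters of a billion Autopilot miles in half a year, simply because every week there are more cars logging miles.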
 
There is another thread on a non-fatal Model X crash into roadside wooden posts on a restricted two-lane country road with no median divider.

Two problems:

1) Incorrect use of Autopilot, because "Autosteer is intended for use on freeways and highways where access is limited by entry and exit ramps."

2) The driver failed to correct the steering.

It looks like more crashes will continue to happen until drivers learn how to use Autopilot appropriately.




 
It appears there was some sort of gag order about this. The news didn't hear a peep about it from anyone from May 7 until yesterday. The cops who investigated the crash didn't talk, the family didn't talk, the truck driver didn't talk, and nobody involved in the investigation talked. What I find amazing is that so many people knew about it for close to two months and nobody said anything until NHTSA released its preliminary report.

Few people would have known the car was on Autopilot. Witnesses, the truck driver, and the police might not have known; that knowledge might have been limited to the investigators and Tesla. Others would have seen a car without a roof, recognized an unsurvivable accident, and thought, poor soul.
 
It's going to be a long time before cars can handle every edge situation safely. Aircraft have been on the autonomous-flying track much longer than anyone has been working on it for cars. The joke at Boeing was that the new flight deck would be one pilot and a dog: the pilot's job was to feed the dog, and the dog's job was to bite the pilot if he touched anything. Planes can largely fly themselves, but they still run into edge situations where the pilot needs to step in and take control. Many aircraft accidents in recent years were due to the pilots stepping in during an edge situation and making mistakes they shouldn't have made.

The edge situations will be reduced, but don't expect them to be completely eliminated for a very long time. I expect they will be reduced from around 0.1% now to 0.01% or even 0.001% in the next few years. The best that can be done with the remaining edge conditions is to fail as safely as possible.

The driver of the Tesla must have been completely engaged in something else, or possibly asleep. Most people, if they are aware they are about to go under a semi trailer and know they can't stop, are going to duck. Nothing indicates the driver did anything to react.

Aren't most to do with weather, mechanical failure, or pilots misinterpreting faulty instrumentation? E.g. the Air France flight from South America:
Air France Flight 447 - Wikipedia, the free encyclopedia
The airspeed indicators (the pitot tubes) showed the plane doing a very different speed than it actually was (roughly 100 mph instead of 600 mph), and the copilot put it into an aerodynamic stall.
I recall that the captain came back in but didn't audibly take control, and the copilot pulled his stick in the opposite direction to the other pilot. Because of the electronic dual controls (no feedback from the other pilot's stick), the plane fought both of them.

Hmmm, just had a scary thought. A malevolent Tesla (or a hacker) could upload new software telling cars to accelerate to top speed and crash deliberately. To think: even if you don't buy Autopilot, the software is still in the car! Nothing is not computer-controlled.
 

Fundamentally, that accident was caused by the autopilot disengaging unexpectedly and revealing that at least one of the pilots had become so dependent on automation that he had forgotten (or never knew) how to fly the plane by hand. By the time they were able to reorient themselves to the situation, it was too late.