Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Tesla's Autopilot needs to be shut down and NHTSA needs to do their due diligence

I just find it funny how two people can say the same thing and get vastly different reactions.

OP now up to 57 dislikes.

wk057 now has 5 informatives, 2 likes, and 2 loves.

They are both wrong.

Another person who has both AP1 and AP2 reports that AP2 is better in some ways right now.

It [AP2] doesn't seem to wobble at the top of small hills the way my AP1 car does. The left turn at the fork in the road shown at the beginning of the video took several hair raising months for my AP1 car to learn to navigate without having to take over. The new car is doing it after only one week of driving. Hopefully that means it is learning faster.

It is absurd hyperbolic nonsense to say that AP2 is 20x worse than AP1.

Let's try this snag from YouTube....yep, this seems to work. Enjoy Sab's great German accent.


Yep, it looks like on roads with well-marked lanes, AP2 does pretty well -- at least as well as AP1 did in its early stages. No reason not to send people that technology while they are still refining it.

It would be better if Tesla communicated more clearly the limitations AP2 has now, when it should and shouldn't be used, and under what conditions.
 
  • Like
Reactions: croman
My point is that it needs to be shut down now due to the massive regression the latest version caused.
Very long thread, so this has probably been mentioned, but this is not a regression failure, as the software was completely rewritten for the new hardware.

It is clearly defective, and I agree with many of your points. But the bottom line is that this is supposed to be a driver's aid; you are not supposed to drive hands-free on neighborhood roads.

I can personally attest to AP1's inability to read yellow lines - I was hoping AP2 would be better at that out of the gate.

As for reversing position - do you mean they should ignore the statistical data which shows an AP vehicle is 40% less likely to be in an accident? If so, why? Would you rather the vehicle were forced to be under human control and statistically more likely to cause an accident? Strange.
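A back-of-envelope sketch of the arithmetic behind the "40% less likely" argument. Only the 40% figure comes from the post; the baseline accident rate and mileage below are hypothetical, chosen purely for illustration:

```python
# Sketch of the "40% fewer accidents" claim. The baseline rate and the
# mileage are invented numbers; only the 40% reduction is from the post.

baseline_per_million_miles = 2.0   # hypothetical human-driven accident rate
ap_reduction = 0.40                # the claimed 40% reduction
ap_per_million_miles = baseline_per_million_miles * (1 - ap_reduction)

miles = 10_000_000                 # hypothetical fleet mileage
expected_baseline = baseline_per_million_miles * miles / 1_000_000
expected_ap = ap_per_million_miles * miles / 1_000_000

print(f"Human control: {expected_baseline:.0f} expected accidents")
print(f"With AP:       {expected_ap:.0f} expected accidents")
```

If the 40% figure holds, the same fleet mileage under pure human control would be expected to produce more accidents - which is the whole crux of the disagreement in this thread.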
 
Did you even bother reading wk057's original post? He quite blatantly said it shouldn't have been released, just as the OP did.

Here, allow me to do the research you didn't:
AP2 should NOT be publicly utilized nor have even been released yet. Seriously. I don't know who in their right mind thought it was a good idea to start rolling out AP2 hardware before they even had basic feature parity with AP1, but those involved should be sacked.
Admittedly I did not look at what you quoted, I just finished reading the comment below and scrolled down to yours and assumed you were quoting this one:
I disagree. AP1 on day 1 was far more reliable and predictable than AP2 is months into releases. Day 1 AP1's major problem was diving for exit ramps and turn lanes, a predictable and avoidable behavior. AP2 does all of this plus loses the lanes completely, drifts into oncoming traffic, locks on to the lane and drives into oncoming traffic, dives for barriers and other vehicles, etc.... there's really no comparison. AP1 2 months into releases was infinitely more usable than AP2 is 2 months into releases also. AP1 today, IMO is less usable than previous versions due to the arbitrary speed restrictions, excessive nagging, and purely punitive "features", but that's another discussion entirely.

However what you did was not quoting either, but selective quoting with very strong edits. Here's what he actually said:
I'll just chime in with a quick note. I've utilized AP1 since day 1 of public release and have driven tens of thousands of miles with AP1 since that time. It's not perfect, but it's mostly predictable and I generally know where it's going to have trouble.

I recently did ~100 miles with AP2 in an X... And you know what? IMO, this should NOT be publicly utilized nor have even been released yet. Seriously. AP2 was all over the place and completely unpredictable on routes where I've used AP1 without incident over a hundred times. AP2 was nearly unusable. The initial release of AP1 was 20x more reliable, despite AP2 having a significant sensor advantage.

I don't know who in their right mind thought it was a good idea to start rolling out AP2 hardware before they even had basic feature parity with AP1, but those involved should be sacked.
Yes, he does say that AP2 should not have been released, however he critically said that this is only his opinion (the IMO is important and needs to remain in the quote) and based on his own personal experience using the two systems.

However, this is still way different from what the OP said. Examples:
- OP used the term "regression testing" as though he did not even know that AP2 uses new hardware and software that is different from AP1's. He claims he is an engineer, so I would not assume he is using the term "regression" in the loose way the general public would.
- OP talks about NHTSA not doing their job and Tesla putting the public in danger, which are much stronger accusations than @wk057's post (which only says AP2 should not have been released, with no other comment).
- OP points out "solutions" even though it doesn't appear he knows the details of Tesla's technology, including adding points about an unrelated accident (others already criticized this).
- OP trots out supposed credentials as if that supports his arguments (a classic appeal-to-authority fallacy). @wk057, on the other hand, had the humility to say IMO. Others already pointed this out, but @wk057 earned his reputation here by his contributions. I have frequently disagreed with him, but I believe I have never given him a disagree (or a dislike, before it was renamed), and I never will, out of respect for his contributions to this forum. I'm sure a lot of people feel the same.
- Later, OP points out he has never used either system and never plans to. @wk057 was speaking from personal experience (so he deserves the informative ratings he got). The OP does not deserve any informative ratings, because he does not appear informative in any way (in fact, he appears to misinform).
 
http://www.thedrive.com/news/7915/watch-this-tesla-autopilot-2-0-fail-terribly-in-a-model-s

That video shows the car making a quick move across the oncoming lane. It is beyond clear that Tesla's design and testing approach is reckless. Imagine if the car had been going faster. How does this not get caught in simulation or with simulators? How does this not get caught on test tracks? Using your customers as guinea pigs is bad enough, but now you are using them to check for massive system regressions? This video clearly shows that these cars regressed so far that Tesla's entire process needs to be investigated - especially around regression testing.
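For context on what the post means by scenario-level regression testing, a minimal sketch of such a suite. Everything here - the scenario names, the simulate() stub, the version strings - is hypothetical illustration, not Tesla's actual tooling:

```python
# Hypothetical sketch of a scenario-level regression suite for a
# driver-assist stack. All names here are invented for illustration.

SCENARIOS = [
    "well_marked_highway_lane_keep",
    "faded_yellow_lines",
    "exit_ramp_no_dive",
    "temporary_construction_barrier",
    "oncoming_traffic_two_lane_road",
]

def simulate(scenario: str, software_version: str) -> bool:
    """Stand-in for replaying a recorded or synthetic scenario in simulation.
    Returns True if the car stays in lane and avoids obstacles."""
    # A real harness would replay sensor logs or synthetic worlds here;
    # this stub just hard-codes one known failure for demonstration.
    known_failures = {("oncoming_traffic_two_lane_road", "v2.0")}
    return (scenario, software_version) not in known_failures

def regression_report(old: str, new: str) -> list[str]:
    """Scenarios that passed on the old version but fail on the new one."""
    return [s for s in SCENARIOS if simulate(s, old) and not simulate(s, new)]

print(regression_report("v1.0", "v2.0"))
```

The point of such a gate is that a release candidate which fails any scenario the previous version passed would be blocked before reaching customers.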

NHTSA needs to quickly reverse their stance on Tesla's Autopilot, at least long enough to actually do their homework, look into these issues, and drive toward a solution that protects the public and makes sure the right things are happening at these companies. They need to do their due diligence, go talk to actual experts in ALL of these areas, and not be so wowed by Mr. Musk. That fox owns the hen house and is going to get those hens killed. Musk's mantra that he is statistically saving lives is not only wrong; his system is putting the public in danger.

The Solution

  • Create a Scenario Matrix that cars will be officially tested against. Ensure this matrix covers a minimum set of scenarios that ensure driver and public safety. Gather folks from these companies, automakers, the insurance industry, traffic engineering, NHTSA, and academia, plus people who actually know how to create, design, and test to a massive exception-handling matrix like this - most likely from DoD, NASA, or Boeing. Ensure these standards are met before releasing any updates.
  • Bring that systems engineering experience into these companies. Commercial IT has never adopted most best engineering practices. Yeah, I know they make tons of money and really cool apps, games, and websites. The fact is that commercial IT rarely even looks into exception handling (cases where things do not go as planned), let alone a massive effort like this. That includes identifying the exceptions, designing to them, and testing them. They lack the experience in doing this, and their tools don't support it.
  • Stop this massively avoidable process of using customers and the public as guinea pigs. Musk says he needs 6 BILLION miles of driving to collect the data he needs. Look at what that means. Innocent and trusting people are being used not only to gather the first sets of data, most of which is for ACCIDENTS, but then to regression test after a system change. The reason for the 6 BILLION miles is that most of the data collected is repeated: they have to drive billions of miles because they are randomly stumbling onto the scenarios. The solution here is to use the matrix described above, with simulation and simulators, to do most of the discovery and testing. That can be augmented with test tracks and controlled public driving. (Note - by guinea pigs I mean the folks driving cars with autopilots engaged. Gathering data when they are in control is prudent.)
  • Ensure the black box data is updated often enough to gather all the data for any event (many times a second) or make sure the black box can withstand any crash. In the McCarthy/Speckman tragedy Tesla said they have no data on the crash. That is inexcusable. Also pass regulations that give the proper government organizations access to that data while ensuring it cannot be tampered with before they do so.
  • Investigate the McCarthy/Speckman crash. Determine whether that car contributed to the accident. That includes any autopilot use, as well as why the battery exploded and caused so much damage so fast. https://www.linkedin.com/pulse/how-much-responsibility-does-tesla-have-tragedy-michael-dekort
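The "scenario matrix vs. billions of random miles" argument in the list above can be sketched numerically. The scenario dimensions below are invented for illustration; the point is the coupon-collector effect, where random sampling mostly revisits scenarios it has already seen:

```python
# Sketch of "targeted scenario matrix" vs "random road miles" coverage.
# The dimensions and values are hypothetical, invented for illustration.
import itertools
import random

road = ["highway", "two_lane", "construction_zone", "intersection"]
weather = ["clear", "rain", "snow", "fog"]
event = ["normal", "barrier_in_lane", "faded_lines", "cut_in"]

matrix = list(itertools.product(road, weather, event))  # 64 combinations

# Targeted testing: one simulated run per cell covers every combination.
targeted_runs = len(matrix)

# Random discovery: how many random samples until every cell has been seen
# at least once? Most samples are repeats (coupon-collector behavior).
random.seed(0)
seen, runs = set(), 0
while len(seen) < len(matrix):
    seen.add(random.choice(matrix))
    runs += 1

print(f"targeted: {targeted_runs} runs, random: {runs} runs")
```

Even in this toy example, random sampling needs several times more runs than the matrix has cells, and the gap grows as rare scenarios get rarer - which is the post's argument for simulation against an explicit matrix rather than fleet-mile discovery.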
I am a former systems engineer and program and engineering manager at Lockheed Martin, where I worked on aircraft simulation and the Aegis Weapon System and was Software Engineering Manager for all of NORAD. I was also the whistleblower who raised the Deepwater Program issues - IEEE Xplore Full-Text PDF:
Have you used it yourself? I can assure you it's quite safe, but it's not fully autonomous so you still need to pay attention.
 
The video showing a car with AP1 hitting a traffic barrier demonstrates at least two potential problems with the current AP1 software.

The lane lines are ambiguous, which could make it difficult even for human drivers to determine where the lane actually is. The software didn't detect the temporary lane change and proceeded straight ahead, following the better-marked lane.

The bigger issue is that the software did not detect the traffic barrier placed ahead in the car's current lane. There is always a risk that construction barriers, traffic cones or police will direct cars out of the defined lanes. AP/EAP are supposed to operate under driver control, so ultimately the driver is responsible for taking control and avoiding the unexpected barrier - though the software should be able to detect a large traffic barrier and do something reasonable.

FSD is designed to operate without any driver monitoring - even without anyone in the car, if operating on the Tesla Network. That means FSD must be able to detect this situation and take actions at least as safe as those a human driver would take.

As with the accident where AP1 didn't detect the truck crossing in front of the AP car, Tesla should review this accident and determine whether the AP1 or AP2 sensor suite can detect this situation properly and whether the software can take reasonable action. If the full AP2 sensor suite can't detect this situation, then Tesla may have a huge problem with EAP/FSD, because it's not that unusual to encounter temporary road or lane changes...
 
In my country you'll never see this, because leaving a road blocked like that is not allowed - at a minimum the lines are repainted.
Here, AP or no AP, it's dangerous; the dash-cam car nearly hit the wall too, and only avoided it because the car in front of it had the accident.
The AP did its job: follow the lines.
 
  • Like
Reactions: Tomnook
In my country you'll never see this, because leaving a road blocked like that is not allowed - at a minimum the lines are repainted.
Here, AP or no AP, it's dangerous; the dash-cam car nearly hit the wall too, and only avoided it because the car in front of it had the accident.
The AP did its job: follow the lines.

Even in this country I expect and demand better. I agree that the follower car footage proves this was just a bad situation and the barrier was surprising even for human drivers. I think the footage absolves AP and puts the blame on the driver and the signage/lane repainting efforts of the construction/local government.
 
Wow! Maybe try just apologizing for being wrong.
(which would have been expected, since you admitted to not even reading the post in question?!?!)

Why waste time trying to find minor points to establish that WK057 and the OP are, in fact, different people? So desperate.

The OP and WK057 both think AP2 should not be released.
They are not wearing the same color shirt. <- attempt to address your next defensive post.

 
Wow! Maybe try just apologizing for being wrong.
(which would have been expected, since you admitted to not even reading the post in question?!?!)

Why waste time trying to find minor points to establish that WK057 and the OP are, in fact, different people? So desperate.

The OP and WK057 both think AP2 should not be released.
They are not wearing the same color shirt. <- attempt to address your next defensive post.
The points I raised are not minor, and they explain why people largely disagreed with the OP and not with wk057. This is not simply because they are "different people" but rather because of how they supported and justified their arguments: the OP appealed to authority and offered nothing informative to the discussion, while wk057 spoke from experience, having tried both AP1 and AP2, and stated that it was his opinion.
 
It's not that simple. What about people who get hurt because someone using AP2 drove into them?

It's not a "new" issue. The resolution is the same with any of these features. The driver is responsible, provided we aren't discussing people smart enough to attempt walking across a 65mph highway at night.

I'm sure the first people in an airplane well understood the risks of heavier-than-air flight... but someone had to do it. These people wanted to be first; they wanted to be engaged. No one is forcing you to turn on these driver-assist systems (or even purchase them). AP, EAP, and FSD are all options you choose to purchase and enable.

When Tesla (or someone) has a "full self driving" 100% enabled: THEN we can blame Tesla for crashes. Until that day, it's on the monkey in the driver's seat.

Me? I'm sitting tight until I see a Tesla pickup :)

I'm just waiting for the day when this whole argument goes 180.... "Sir, at the time of the crash, why were you driving?"
 
Autopilot is lame. If you need an assisted system to drive then you shouldn't be driving in the first place. I've said this so many times.

Be present. Be in the moment. Be aware. SIMPLE.
You're absolutely correct.

The issue is that I don't "need" an assisted system to drive, I "want" an assisted system to drive.

Which makes Autopilot freaking awesome on long trips.
 
  • Like
Reactions: kavyboy and bhzmark
You're absolutely correct.

The issue is that I don't "need" an assisted system to drive, I "want" an assisted system to drive.

Which makes Autopilot freaking awesome on long trips.
And that's exactly why it's dangerous. It's so freaking awesome that eventually you are going to be lulled into a moment of inattention, and if something bad happens at that moment, it's game over. As more and more people begin to use it, the probability of a bad event will increase and eventually will happen. At that point the system will be shut down.
 
And that's exactly why it's dangerous. It's so freaking awesome that eventually you are going to be lulled into a moment of inattention, and if something bad happens at that moment, it's game over. As more and more people begin to use it, the probability of a bad event will increase and eventually will happen. At that point the system will be shut down.
And for every one time that happens, it will have saved lives by avoiding approximately two or more accidents caused by human error. And that's exactly why AP is safer.
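The exchange above is really a disagreement about expected values. A minimal sketch: only the "two or more avoided per AP-caused accident" ratio comes from the post, and both rates below are hypothetical:

```python
# Hypothetical illustration of the expected-value argument above.
# Rates are per some fixed number of fleet miles; only the roughly 2:1
# ratio of avoided-to-caused accidents comes from the post.

ap_caused = 1.0            # accidents attributable to AP itself
human_error_avoided = 2.0  # human-error accidents AP prevented ("two or more")

net_change = human_error_avoided - ap_caused
# AP is a net safety win only while this stays positive; the pessimistic
# post is, in effect, predicting this ratio will not hold as usage grows.
print(f"net accidents avoided: {net_change}")
```

Neither post can settle the question without real fleet data; the sketch only shows what each side is implicitly claiming about the sign of `net_change`.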
 
  • Disagree
Reactions: alevek