
Fatal autopilot crash, NHTSA investigating...

Hi guys, I just read this article on Reuters.

Do you know if the 'AP' can initiate lane changes without manually operating the indicator switch?

I'm asking this because of the truck driver's statement:

Baressi, an independent owner-operator, said he saw the Tesla approaching in the left, eastbound lane. Then it crossed to the right lane and struck his trailer. “I don’t know why he went over to the slow lane when he had to have seen me,” he said.

No, but it can lose sight of one or both lane markers, and in doing so it may drift left or right as it seeks out (or, as I call it, hunts for) the other lane marking. The obstruction by the trailer may have inadvertently caused the Tesla to lose its visualization of the lane markers, and thus it may have drifted. I have driven over 9,600 miles in the past 9 months using Autopilot frequently and am well aware of its limitations. That's why you should "keep your hands on the wheel at all times" and pay attention to what lies ahead, in case your vehicle fails to interpret it properly.
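To illustrate what I mean by "hunting," here is a rough, purely hypothetical sketch of how a lane-keeping controller might estimate the lane center when one marker drops out. This is my own illustration, not Tesla's actual code; the lane width and sign conventions are assumptions.

```python
# Purely illustrative sketch of lane-keeping fallback logic; NOT Tesla's
# actual implementation. Assumes the camera reports each lane marker as an
# offset in meters from the car's centerline, or None if it is not seen.

LANE_WIDTH_M = 3.7  # assumed typical freeway lane width


def lane_center_offset(left_marker, right_marker):
    """Return the estimated offset of the lane center from the car.

    When one marker is lost, the controller has to guess its position from
    the other marker plus an assumed lane width; when both are lost there is
    nothing left to track and the car can drift ("hunt") until it reacquires
    a marking or the driver takes over.
    """
    if left_marker is not None and right_marker is not None:
        return (left_marker + right_marker) / 2.0
    if left_marker is not None:        # right marker lost (e.g. obstructed)
        return left_marker + LANE_WIDTH_M / 2.0
    if right_marker is not None:       # left marker lost
        return right_marker - LANE_WIDTH_M / 2.0
    return None                        # both lost: no lane to follow


# Example: right marker obscured, left marker seen 1.6 m to the left.
print(lane_center_offset(-1.6, None))   # -> 0.25 (an estimate, not a measurement)
print(lane_center_offset(None, None))   # -> None (driver must be in control)
```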
 
...When it can pass a DMV test, I will say it has reached some maturity. This "take control at the last moment" stuff doesn't work...

A 2012 YouTube video shows an Audi A6 that did not brake for a test object made of cardboard boxes wrapped in aluminum foil.


The tester expected too much of the Adaptive Cruise Control, assuming it would brake to a stop.

That system was imperfect in this crash-avoidance test, yet it was on the market before the classic, non-Autopilot Model S was even produced.

I suspect other current systems are imperfect too.

They just call them "Assist," like Volvo Pilot Assist or the Mercedes Driver Assistance Package... instead of "beta."

It's just another way of saying that you are still responsible because the system just helps you.

The road to passing a DMV test is incremental.

Notice we are very near the bottom and nowhere near SAE Level 5!

SAE automated vehicle classifications (a rough code summary follows the list):


  • Level 0: Automated system has no vehicle control, but may issue warnings.
  • Level 1: Driver must be ready to take control at any time. Automated system may include features such as Adaptive Cruise Control (ACC), Parking Assistance with automated steering, and Lane Keeping Assistance (LKA) Type II in any combination.
  • Level 2: The driver is obliged to detect objects and events and respond if the automated system fails to respond properly. The automated system executes accelerating, braking, and steering. The automated system can deactivate immediately upon takeover by the driver.
  • Level 3: Within known, limited environments (such as freeways), the driver can safely turn their attention away from driving tasks.
  • Level 4: The automated system can control the vehicle in all but a few environments such as severe weather. The driver must enable the automated system only when it is safe to do so. When enabled, driver attention is not required.
  • Level 5: Other than setting the destination and starting the system, no human intervention is required. The automatic system can drive to any location where it is legal to drive.
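To make the point about where we sit concrete, here is the list above restated as a tiny piece of code. This is my own paraphrase, not an official SAE artifact, and the placement of today's Autopilot-style systems at Level 2 is the claim made in this thread.

```python
# Rough paraphrase of the SAE levels listed above; not an official SAE table.
SAE_LEVELS = {
    0: "No vehicle control; warnings only",
    1: "Driver ready to take control at any time (ACC, parking assist, LKA)",
    2: "System steers/brakes/accelerates; driver must detect and respond",
    3: "Driver may divert attention within known, limited environments",
    4: "Self-driving in all but a few environments (e.g. severe weather)",
    5: "No human intervention beyond setting a destination",
}

AUTOPILOT_LEVEL = 2  # where current "Autopilot"-style systems sit

for level, summary in SAE_LEVELS.items():
    marker = "  <-- we are here" if level == AUTOPILOT_LEVEL else ""
    print(f"Level {level}: {summary}{marker}")
```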
 
Well, there you have it. I completely disagree, and that's fine.

Thank God no other deaths were involved. I can't fathom the ramifications this incident would have produced if my child, your child, or anyone for that matter had been killed in addition to Josh.

I'm thinking your stance may be different.

Simply put, that kind of thinking leads to more deaths.

This kind of accident, or something really similar, happens every day with all makes and models of cars. The only way to prevent them is really to remove the driver from the equation. Especially nowadays with alcohol, pot, texting, etc. People these days have a nervous twitch any time a screen is removed from their face for more than a few minutes.

We have to advance automotive technology to combat this epidemic.

There are two ways to go about this, and I think both camps have some merit. So I'm not going to argue with anyone who is in one camp or the other. Essentially one camp believes Level 2 semi-autonomous driving is necessary to reach Level 3, and the other camp thinks Level 2 semi-autonomous driving is too dangerous (it removes situational awareness too much without having the technology to fill the gap), so they want to wait and skip to Level 3. I'm in the Level 2 camp, but at my job I do use deep neural networks, so I'm a bit biased towards wanting more data, and the only way to get data is on the road. Tesla is using fleet learning to improve the technology in their next-generation systems. The Model 3 is benefiting from all of those who are currently driving a Model S with the AP hardware.

In the last couple of months we've lost two souls to automotive technology (at least in terms of headlines) who will be remembered for the lessons their accidents taught us. One was the actor driving a Jeep with a horrible UI issue that led to his death. The other was Joshua Brown, but we don't know exactly what happened. But we do know that steps need to be taken to improve the AEB technology to prevent/mitigate that type of accident. Both people will go down as saving lives, and saving people from being injured. Every car company will use them as lessons of what can happen.
 
What state of capacity are you talking about? Tesla's system came out in October 2015.

Just from the cars listed in the C&D article (there may be other systems even earlier that I won't bother looking for):

The Infiniti Q50's active lane control came out in August 2013:
First Drive: Infiniti's new Q50 feels driverless

Mercedes' Distronic Plus with Steering Assist came out with the 2014 E-Class:
First Drive: 2014 Mercedes-Benz E-Class

Both of these are Level 2 systems that came out before Tesla's and have the same caveat that the driver must pay attention while driving, given there are many situations they can't handle.

The beta label is irrelevant to the issue, as I pointed out in another post. It shows Tesla is aware the system still needs improvement, but doesn't really change the objective limitations of the system.

Edit, Bonus:
In August 2014, someone in the general public even pulled the same shenanigan in an Infiniti Q50 of leaving the driver's seat empty:
Infiniti Q50 Active Lane control is scarily self-driving
This was more than a year before Autopilot came out and someone did a similar stunt in a Tesla!

Here's a source (although it was another one I was referring to; I'll look later, too tired) describing what capacity I meant...the system as a whole.

Exclusive: The Tesla AutoPilot - An In-Depth Look At The Technology Behind the Engineering Marvel

I understand there were/are lane assistance systems in place; that's not what I meant, and I could have been clearer.

Tesla's AP was the first to use deep neural networks (DNNs), which of course is courtesy of Mobileye. No other manufacturer in the world used DNNs within their ADAS before Tesla did, to my knowledge...in a public release.
 
Here's a source (although it was another one I was referring to; I'll look later, too tired) describing what capacity I meant...the system as a whole.

Exclusive: The Tesla AutoPilot - An In-Depth Look At The Technology Behind the Engineering Marvel

I understand there were/are lane assistance systems in place; that's not what I meant, and I could have been clearer.

Tesla's AP was the first to use deep neural networks (DNNs), which of course is courtesy of Mobileye. No other manufacturer in the world used DNNs within their ADAS before Tesla did, to my knowledge...in a public release.

My understanding is the Mobileye EyeQ processors are computer vision processors. As of July 2015 they held 80% of the market for advanced driver-assistance systems. I'm not sure how Tesla could be the first, considering that DNNs are used to do object recognition and sign recognition. They were doing that before Tesla ever hit the road. Sure, they could have done those in other ways, but DNNs are perfect for much of what the Mobileye hardware can do.

Now it could be that Tesla was the first to radically modify the provided software and to train the DNNs themselves. That could be what the article meant.
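For anyone unfamiliar with what "using DNNs for object recognition" looks like in practice, here's a minimal, generic example using a publicly available pretrained network. This is purely illustrative and has nothing to do with Mobileye's or Tesla's proprietary pipelines; the input file name is made up.

```python
# Generic DNN-based image classification with torchvision; purely
# illustrative, not Mobileye's or Tesla's actual software.
import torch
from torchvision import models, transforms
from PIL import Image

model = models.resnet18(pretrained=True)   # network pretrained on ImageNet
model.eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

img = Image.open("road_scene.jpg")          # hypothetical camera frame
batch = preprocess(img).unsqueeze(0)        # add a batch dimension

with torch.no_grad():
    logits = model(batch)
    top_class = logits.argmax(dim=1).item()

print("Predicted ImageNet class index:", top_class)
```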
 
Simply put, that kind of thinking leads to more deaths.

This kind of accident, or something really similar, happens every day with all makes and models of cars. The only way to prevent them is really to remove the driver from the equation. Especially nowadays with alcohol, pot, texting, etc. People these days have a nervous twitch any time a screen is removed from their face for more than a few minutes.

We have to advance automotive technology to combat this epidemic.

There are two ways to go about this, and I think both camps have some merit. So I'm not going to argue with anyone who is in one camp or the other. Essentially one camp believes Level 2 semi-autonomous driving is necessary to reach Level 3, and the other camp thinks Level 2 semi-autonomous driving is too dangerous (it removes situational awareness too much without having the technology to fill the gap), so they want to wait and skip to Level 3. I'm in the Level 2 camp, but at my job I do use deep neural networks, so I'm a bit biased towards wanting more data, and the only way to get data is on the road. Tesla is using fleet learning to improve the technology in their next-generation systems. The Model 3 is benefiting from all of those who are currently driving a Model S with the AP hardware.

In the last couple of months we've lost two souls to automotive technology (at least in terms of headlines) who will be remembered for the lessons their accidents taught us. One was the actor driving a Jeep with a horrible UI issue that led to his death. The other was Joshua Brown, but we don't know exactly what happened. But we do know that steps need to be taken to improve the AEB technology to prevent/mitigate that type of accident. Both people will go down as saving lives, and saving people from being injured. Every car company will use them as lessons of what can happen.

This type of accident hasn't ever happened while AP was in use. It's the first death. So while the TYPE of accident isn't abnormal, the circumstance is certainly abnormal...hence the ongoing investigation.

We're on the same team here. I'm all for the advancement of automotive technology. I'm a huge advocate of it and spend more than enough of my time defending it and preaching it to others.

Up until this past week, I was in the Level 2 camp. However, after putting much thought into it, I'm now in the camp of waiting until Level 3; obviously some of you don't like that position. I get it, I won't thumbs-down you, I respect that.

I've honestly jumped camps on the approach and methodology of how AP is implemented. That's all. I believe wholeheartedly it's the future that will take us up to Level 4 one day.

My opinion on it is just that, my opinion. It's not to convince anyone or to persuade anyone to join my viewpoint. Conversely, I wouldn't be swayed to join your opinion no matter how much data is provided. I believe in what I believe in, and people can believe differently...I'm cool with that.

My belief is AP in the beta form it's in today should not have been released. If driver error is the cause of 1, 2, 10 or however many deaths while AP (as it is today) is in use...none of it is worth it. It can wait for Level 3, in my humble opinion.
 
This type of accident hasn't ever happened while AP was in use. It's the first death. So while the TYPE of accident isn't abnormal, the circumstance is certainly abnormal...hence the ongoing investigation.

We're on the same team here. I'm all for the advancement of automotive technology. I'm a huge advocate of it and spend more than enough of my time defending it and preaching it to others.

Up until this past week, I was in the Level 2 camp. However, after putting much thought into it, I'm now in the camp of waiting until Level 3; obviously some of you don't like that position. I get it, I won't thumbs-down you, I respect that.

I've honestly jumped camps on the approach and methodology of how AP is implemented. That's all. I believe wholeheartedly it's the future that will take us up to Level 4 one day.

My opinion on it is just that, my opinion. It's not to convince anyone or to persuade anyone to join my viewpoint. Conversely, I wouldn't be swayed to join your opinion no matter how much data is provided. I believe in what I believe in, and people can believe differently...I'm cool with that.

My belief is AP in the beta form it's in today should not have been released. If driver error is the cause of 1, 2, 10 or however many deaths while AP (as it is today) is in use...none of it is worth it. It can wait for Level 3, in my humble opinion.

We don't know how the crash relates to AP other than knowing it was on. The fact that we don't know is leading us to speculate about what happened. So I'd like to stick to the facts that we do know.

We know the AEB didn't activate, and that doesn't involve any beta element. That's mostly what was provided by Mobileye, with some enhancements by Tesla.

We know any kind of AEB is better than no AEB, so we're glad every Tesla that has the AP hardware has it. They do, even if the buyer decided not to opt for the Autopilot option.

Having a better AEB system requires better hardware. From your post history you come across as someone who has either pre-ordered the Model 3 or is interested in it. I hope you realize that the Model 3 is going to be really awesome partly because of how bold Tesla was with Autopilot and Summon on the Model S. In fact, you'll owe your experience to the blood, sweat and tears of those who came before you with the Model S: all the things the owners have taken issue with, and all the mishaps that have happened.

Now, I don't know if you'll have a Level 2 or a Level 3 experience with it. The realist in me says it will likely only be Level 2, but it will be Level 3 hardware-ready. There are lots of legal ramifications to Level 3, and it might be years away.

But even if it is Level 2, it's going to be a really awesome experience. You'll have more cameras in front and more sensors around the car. Your blind-spot/side monitoring will actually work, and you'll have all kinds of refinements to the AP system. Your Model 3 will likely see stopped cars 99.99% of the time versus a hell of a lot less with the current Tesla Model S.

It's going to be so good that there is no reason for me to convince you to change your opinion. Your opinion will change when you drive it.
 
Correction: AFAIK, Autopilot wouldn't automatically disengage when approaching a roundabout; it would need to be manually disengaged. If you didn't disengage it, I couldn't predict how it would react, but I'd bet a red screen of death would appear.

As roundabouts are common here, I often approach roundabouts with AP on. It doesn't understand them and I always need to override it. I guess the maximum speed limit at roundabouts is 50 km/h here.
 
Hi guys, I just read this article on Reuters.

Do you know if the 'AP' can initiate lane changes without manually operating the indicator switch?

I'm asking this because of the truck driver's statement:

Baressi, an independent owner-operator, said he saw the Tesla approaching in the left, eastbound lane. Then it crossed to the right lane and struck his trailer. “I don’t know why he went over to the slow lane when he had to have seen me,” he said.

Here is a theory:

* The road was probably quite empty.

At least it was the day the Google car came by.

* The car was probably going at 90mph.

This contradicts the police judgement, but it fits with the witness report of the lady who said she was doing 85 mph when she was overtaken. We only have this second-hand, though, from the landowner, who has no axe to grind. If the witness was using the speedometer and not the GPS, then she was probably only going 80 mph when she thought she was going 85 mph.
It fits with his "eight speeding tickets in six years", with the truck driver saying he was going fast, with the truck driver not spotting him, and with the large distance the car went after the accident. It fits with his risk-taking attitude (skydiving) and with his saying that AP usually gets you there a little slower than manual driving (same link). Since AP works up to 90 mph and he found it a little slow, it sounds like he enjoyed driving fast. At least one law officer thought the car was going way over the limit (110 mph). His friend said he "had the need for speed" and was "kind of a daredevil" who loved the excitement, loved speed and had no fear.

* 90 mph is 40 m/s, or 44 yards per second.

* There are 480 yards from the crest of the hill to the site of the accident.

* That represents 11 seconds at that speed.

* It takes perhaps 5 seconds for AP to perform a lane change. (guess!)

Let's say that Joshua looks up about half a mile before the accident (20 seconds before impact). The road is empty and he is in the left lane. He flicks the stalk to change lanes and looks down again just before cresting the hill (11 seconds before impact), then looks down for the next 10-11 seconds. The truck driver sees the car crest the hill in the left lane and guns the engine to get the long, slow truck out of the way.

The truck driver will see the car do a 5-second lane change and not slow down at all for the next 6 seconds before impact. This fits the quote from the truck driver: “I don’t know why he went over to the slow lane when he had to have seen me.”
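To sanity-check the arithmetic behind this theory (remember, the 90 mph figure and the 5-second lane change are guesses, not established facts):

```python
# Back-of-the-envelope check of the numbers in the theory above; the speed
# and the 5-second lane-change duration are guesses.
speed_mph = 90
speed_yd_per_s = speed_mph * 1760 / 3600      # 1 mile = 1760 yards
speed_m_per_s = speed_mph * 1609.34 / 3600    # 1 mile = 1609.34 meters

distance_yd = 480                              # crest of the hill to crash site
time_crest_to_impact = distance_yd / speed_yd_per_s

print(f"{speed_mph} mph = {speed_m_per_s:.1f} m/s = {speed_yd_per_s:.1f} yd/s")
print(f"{distance_yd} yd at that speed = {time_crest_to_impact:.1f} s")

lane_change_s = 5                              # guessed duration of an AP lane change
remaining_s = time_crest_to_impact - lane_change_s
print(f"Time left after a {lane_change_s}-second lane change: {remaining_s:.1f} s")
# -> roughly 11 s from crest to impact, leaving about 6 s after the lane change
```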
 
My belief is AP in the beta form it's in today should not have been released. If driver error is the cause of 1, 2, 10 or however many deaths while AP (as it is today) is in use...none of it is worth it. It can wait for Level 3, in my humble opinion.

Fair enough. What if there were 10 fatalities related to AP use as it is today, but it could be shown that, had AP not existed in its current form, there would have been 25 fatalities in that same time period/miles driven: does your opinion change?

Mike
 
IMO Tesla will have to change the name of Autopilot to something else in the aftermath of this incident.
I don't know if they will have to, but I think it would be wise if they marketed what they currently call "autopilot" as "Lane keeping and car following driver assistance". That more accurately describes what it does do and doesn't suggest things that it doesn't do.
 
I don't know if they will have to, but I think it would be wise if they marketed what they currently call "autopilot" as "Lane keeping and car following driver assistance". That more accurately describes what it does do and doesn't suggest things that it doesn't do.

That in no way more accurately describes what it does, except to people who have no idea what the word autopilot means.

PS: "Car following driver assistance" is called "Adaptive Cruise Control" or "Traffic Aware Cruise Control". Those terms have been in use since the '90s. There is no need to invent yet another term for it.
 
That in no way more accurately describes what it does, except to people who have no idea what the word autopilot means.

PS: "Car following driver assistance" is called "Adaptive Cruise Control" or "Traffic Aware Cruise Control". Those terms have been in use since the '90s. There is no need to invent yet another term for it.
Yes. People don't know what Tesla intends "autopilot" to mean. That's the point. In a video game, engaging autopilot means you can get up and go to the bathroom. The term doesn't convey the continued need for attention.
I was routinely asked, "is this the car that drives itself?" And now I'm asked "is this the car that drives itself into the side of a semi?"

Instead, call it exactly what it does: assist the driver with following cars and keeping in the lane. Name it exactly what it does. "Car following" is a better description than "traffic aware" or "adaptive cruise," and certainly better than "autopilot." The actual, as opposed to the imagined, functionality is important to get across.

Remember there are a whole bunch of people who don't follow car or airplane tech and terminology like the nerds on this forum. Many of them have reserved a Model 3, and more are buying a Model S. Using words that don't suggest capabilities that don't exist, and that instead describe exactly what the technology does, will help them understand the limitations, keep more people alive and uninjured, and reduce bad PR for Tesla.
 
Well, it's one thing for me not to use it, but a completely different scenario knowing others out there are using it irresponsibly. That's what I'm hung up on.

Tesla says you're responsible for your actions with this tech...OK, got it. They're attempting to deflect blame off themselves and onto you...OK, got that too.

But is it ethical, knowing this tech in beta form, when abused by its driver, can kill me or others out there? A beta, imperfect system is the emphasis here.

It's not about me having the choice to use it or not. It's about the public who are sharing these streets with the few who use it irresponsibly. For a company that knows this, that's too big a risk to take, IMO, as seen in the fact that no other auto manufacturer is using this combination of tech in their cars. Why? Risk vs. reward. They're not taking the chance yet. Of course they're more than happy for Tesla to take that risk for them, but so far they have not. It's coming, but Tesla was the first.

Wow, have you even read the other posts? This is about automatic emergency braking. Many cars have this. Some may be better than Tesla's, but many aren't. AEB helps in a lot of situations, and when it doesn't help, at least it doesn't hurt. Why single Tesla out?
 
And there's one of the issues I brought up earlier. Tesla obviously understands that those people who do this can harm or kill people by using the beta system. They also knew/know that if such an incident did occur, all hell would break loose.

So my question (primarily to myself) is: is it responsible to release a beta system like this to the public knowing those risks? They think yes; every other manufacturer says no (to this point)...I'm still fighting my demons to answer this question myself.

OK, I've tried to give you the benefit of the doubt, but what is your agenda? Are you a short? Do you have a Tesla hatred? "Obviously", really? You are making a huge assumption as to how Tesla views their system. You seem bent on slamming Tesla when others have lane keep assist. You seem to ignore that we have no idea yet whether lane keep assist even had anything to do with this accident. The trucker said the car changed lanes. That would imply a driver-initiated maneuver.

I went through thousands of posts on the suspension thread and noticed recent members with posts only slamming Tesla.

A lot of cars have driver assist features. Overall they improve either driver comfort or safety and, in general, do both. They are not perfect. They are getting better. Having had a driver fail to stay in his lane and, as a result, hit me, I wish everyone on the Connector in Atlanta had lane keep assist as good as Tesla's.
 
Appreciate the info.

My understanding of what Tesla and Mobileye are saying is that Mobileye can't do anything about crossing traffic, but Tesla's in-house software can. If that's the case, and it certainly sounds like it is, the software operating the radar confused the truck with an overhead sign.

That's the issue for me. The software isn't refined enough to distinguish an 18-wheeler crossing the road from an overhead sign.

I understand Tesla is covering their butts by saying "hey, it's not perfect, so you're responsible for your actions." I get that. What I'm questioning is: knowing that imperfection is there, is it ethical to release it in beta form?

Makes my head hurt just thinking about it. I feel like a have the devil on one shoulder and the angel on the other...and both are trying to pull me in opposite directions.

Please read my other posts where I discuss some of the technical issues in a bit of detail.

Here you go again. A radar can help prevent rear-end collisions. It doesn't respond if the object is high off the ground and its movement is either zero (stationary) or lateral. Read the Distronic Plus manual from Mercedes and you'll see the same issues. However, you probably aren't short on Mercedes, so you don't care about that. Overall these systems improve safety. They are tuned with a bias towards inaction, since false positives can be dangerous.
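To make the "doesn't respond to high or stationary objects" point concrete, here's a toy sketch of the kind of gating a forward radar system might apply before treating a return as a braking threat. The thresholds and field names are made up by me; the real systems from Mercedes, Tesla or anyone else are far more sophisticated.

```python
# Toy illustration of why a forward radar may ignore some returns; all
# thresholds are invented and real AEB filters are far more complex.
def consider_for_braking(target, ego_speed_mps):
    """Decide whether a radar return should be treated as a braking threat."""
    closing_speed = ego_speed_mps - target["speed_along_lane_mps"]

    # Returns well above bumper height tend to look like overhead signs/bridges.
    if target["height_m"] > 2.0:
        return False
    # Stationary returns outside the driving path are often filtered out to
    # avoid false positives on signs, parked cars and roadside clutter.
    if target["speed_along_lane_mps"] <= 0.5 and not target["in_path"]:
        return False
    # Fast lateral motion (cross traffic) is also commonly ignored.
    if abs(target["lateral_speed_mps"]) > 3.0:
        return False
    return closing_speed > 0


# A normal lead car ahead: low, moving with traffic, in our path -> braking threat.
lead_car = {"height_m": 0.7, "speed_along_lane_mps": 15.0,
            "lateral_speed_mps": 0.0, "in_path": True}
print(consider_for_braking(lead_car, ego_speed_mps=30.0))   # -> True

# A high-riding trailer broadside across the lane can look like an elevated,
# laterally moving return: exactly the kind of target such filters reject.
trailer = {"height_m": 2.4, "speed_along_lane_mps": 0.0,
           "lateral_speed_mps": 4.0, "in_path": True}
print(consider_for_braking(trailer, ego_speed_mps=40.0))    # -> False
```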
 
I'm fully aware of the approaches of other companies on this. I think I'm now agreeing with Google's approach.

Wow, OK, here is your approach: in some cases seat belts cause a death when a person would have been thrown clear. Let's get rid of seat belts till they are perfect. The same goes for airbags. Oh yeah, there are cases where anti-lock brakes stop in a longer distance than regular brakes. Let's get rid of those too.