
Fatal autopilot crash, NHTSA investigating...

First off, my condolences to J. Brown's family.
I think this is very simple: the truck driver made an error in judgment, and the Model S driver was either incapacitated and couldn't react, or was inattentive and didn't react. Mr. Brown lost his life to a combination of factors that are statistically very, very rare. I don't see how the Tesla is at fault in any way. When AP is activated, you agree to the parameters set forth and you push the button, end of story. The primary culprit is the truck driver.

I agree that the truck driver likely made an error in his judgment, but I feel there is a third possibility regarding Mr. Brown's actions beyond incapacitated or distracted.

It is possible that he saw the truck and thought or assumed it would stop, and moved calmly, without disengaging AP, from the left lane to the right lane to go around it. Meanwhile, the (slow-moving?) truck driver sees the Tesla bearing down on him and "gives her the gas" - completely blocking the road.

At that point all decisions and options are bad and you just run out of time.
 
I'm not saying Tesla didn't introduce some advancements. That would be unlikely (just as it is unlikely other manufacturers didn't introduce advancements in other areas).

My point was that given you seem to be crucifying Tesla for releasing a level 2 system "first" (and taking issue with the Beta label), I just needed to point out that they were not first with a level 2 system and that the beta label is largely irrelevant to the issue. Tesla uses the same legal disclaimers about driver responsibility as other systems, and this has nothing to do with the Beta label. All level 2 systems will have such disclaimers, simply because such systems by definition have many situations that require the driver to take over. Even after Autopilot leaves Beta, as long as it remains a level 2 system, it will need such disclaimers.

It is okay to take the position that level 2 systems are irresponsible (a position many will disagree with), but I think many people push back against this because you are suggesting Tesla is alone or first in this, which is not true.

I'm simply pointing out one specific issue, if you call that crucifying...whatever.

Tesla isn't perfect and that's OK; no company is. But to never question, or to always take a favorable position without looking at things objectively... well, I won't do that in this case.

As far as me or anyone else using the words "beta" or "imperfect system"... these are Tesla's words, so I'm not sure why you're questioning it.
 
Wow, OK, here is your approach: in some cases seat belts cause a death when a person would have been thrown clear. Let's get rid of seat belts until they are perfect. The same goes for air bags. Oh yeah, there are cases where antilock brakes stop in a longer distance than regular brakes. Let's get rid of those too.

Take a chill pill, dude; you sound like my 5-year-old. Blare all you want, but guess what: you're not changing my position. You can thumbs-down me, I'm OK with that ;)
 
It is possible that he saw the truck and thought or assumed it would stop, and moved calmly, without disengaging AP, from the left lane to the right lane to go around it. Meanwhile, the (slow-moving?) truck driver sees the Tesla bearing down on him and "gives her the gas" - completely blocking the road.

At that point all decisions and options are bad and you just run out of time.

There were definitely still options in this example. Semis, whether gunning it or not, aren't fast. The Tesla driver should have noticed that the truck was continuing to move across the lane, and at that point slammed on the brakes. Since the brakes weren't touched... something else happened that caused the driver to ignore the brake pedal.
 
...What is very clear is that you don't know what Autopilot means.

You think Autopilot should equal self-driving, aka autonomous driving, but it doesn't...

People may overestimate Autopilot due to its name.

Even in aviation, there must be one qualified human supervising the autopilot at all times. A co-pilot may take a break at one time, and a pilot may take a break at another, but never both together, leaving the autopilot with no human supervision or with an unqualified operator such as a flight attendant or a guest.

Names may be catchy, but as adults we have to ask: "What's the catch? Can both the captain and the co-pilot take a break at the same time?"

We need to understand the difference between marketing and engineering.

When companies advertise their certified EPA fuel economy numbers, we need to ask, "Under what conditions? Laboratory conditions?..."

That is how the adult world operates: we are responsible for finding the footnotes, and we need to dig deeper than just accepting marketing propaganda.
 
It is possible that he saw the truck and thought or assumed it would stop, and moved calmly, without disengaging AP, from the left lane to the right lane to go around it. Meanwhile, the (slow-moving?) truck driver sees the Tesla bearing down on him and "gives her the gas" - completely blocking the road.

At that point all decisions and options are bad and you just run out of time.
This is the scenario of the vast majority of accidents: two people making one or more simultaneous decisions that conflict, until time runs out. It happens all day, every day, even when walking. It even happened to me on a bike once. It has nothing to do with AP.
 
Lots of talk about Autopilot capability, which is understandable.
However, all systems, no matter how advanced, will have "edge" cases that they are unable to reliably detect.

The discussion really needs to be:

"How can driver-assist functions be implemented whilst ensuring the driver maintains the appropriate level of concentration for the technology in use?"
 
...Would autopilot detect and warn if it sees bike riders? Would it turn off autopilot?...

As I said earlier, this first beta version is for the simplest road conditions: an uncomplicated freeway.

That way it does not have to deal with crossing traffic, as in this case, or with pedestrians or bicyclists.

Autopilot does not turn itself off just because of pedestrians or bicyclists.

I haven't heard reports about it detecting pedestrians or bicyclists on the highway.

It is capable of detecting those small, slender obstacles, especially with its 16-foot-range ultrasonic sensors.

However, I am not sure it can effectively stop from a high speed of 65 MPH for pedestrians or bicyclists.
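As a back-of-the-envelope check (my own numbers, not a Tesla spec; I'm assuming roughly 0.8 g of braking on dry pavement), the physics alone rules out the ultrasonics handling this at highway speed:

[code]
# Rough stopping-distance check: could a car braking from 65 MPH stop
# within the ~16-foot range of the ultrasonic sensors?
# Assumption (not a Tesla spec): ~0.8 g deceleration on dry pavement.

MPH_TO_MPS = 0.44704           # miles per hour -> meters per second
FT_PER_M = 3.28084             # feet per meter
G = 9.81                       # gravitational acceleration, m/s^2

speed = 65 * MPH_TO_MPS        # ~29.1 m/s
decel = 0.8 * G                # ~7.8 m/s^2

stopping_distance_m = speed ** 2 / (2 * decel)   # d = v^2 / (2a)
print(f"~{stopping_distance_m * FT_PER_M:.0f} ft to stop")   # ~177 ft
[/code]

Roughly 177 feet to stop versus 16 feet of sensor range: even with instant detection, the car would still be doing about 60 MPH on impact, so the ultrasonics can only help at parking-lot speeds.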

Owners have reported that Autopilot does detect motorcyclists.
 
Lots of talk about Autopilot capability, which is understandable.
However, all systems, no matter how advanced, will have "edge" cases that they are unable to reliably detect.

The discussion really needs to be:

"How can driver-assist functions be implemented whilst ensuring the driver maintains the appropriate level of concentration for the technology in use?"
I'd say it's not the car's responsibility, and I personally don't want or need a nanny. The solution to this problem is education rather than hand-holding. It's going to be confusing for people as we transition from partial to full autonomy. They simply need to be fully aware of their own vehicle's capabilities. There needs to be no ambiguity.
 
This statement here is not true. They are using deep neural networks, which can work several orders of magnitude faster than brute force for the tasks at hand.

I understand. It is still very far from the situational awareness a human being would have, as in the examples I gave. As far as I can tell, it might be crunching numbers at an incredibly deep level, but it is still very far from what a human being does easily.

As I pointed out, AP is already far better than any human being at maintaining constant awareness of everything it can perceive.

Here's another example: you are driving along a mountain road and you see a boulder break loose and start rolling down the mountain. You would sense the potential danger and start to slow immediately. AP would not see any of that happen and would only react when the boulder came into its sensors' range. The car would be crushed.

And here is one for AP: AP is on and being used. The car in the lane next to you pulls into your lane without signaling. AP will react much better than any human in that situation. As human beings, we look for cues, and if they aren't there we might not respond.
 
...One other point: I recall the CEO talking about travelling from one city to another without touching the wheel. This gave me and my wife the impression the car drove itself. In light of recent news, I feel like this functionality is much more like cruise control (which I also do not use)...

Elon has a lot of visions that have nothing to do with current capability.

It's like how you can order your car in New York, summon it from Fremont, and have it find its way (including self-Supercharging) to your home and then hook up to a charger in your garage with the Snake Bot.

What counts is what the manual says, not what the visions, propaganda, and marketing say.

I think it is wise to treat Autopilot as if it were cruise control: brake normally, as you always do, and don't wait for Autopilot to do it for you.
 
Tesla elaborates with more info...

Tesla elaborates on Autopilot’s automatic emergency braking capacity over Mobileye’s system

From reading this article, here's what I gather:

- Mobileye's tech is incapable of processing lateral-crossing-vehicle data to activate Automatic Emergency Braking.

- Quote from Tesla: "Autopilot activates automatic emergency braking in response to any interruption of the ground plane in the path of the vehicle that cross-checks against a consistent radar signature".

- Quote from Tesla: "In the case of this accident, the high, white side of the box truck, combined with a radar signature that would have looked very similar to an overhead sign, caused automatic braking not to fire".

- Quote from Tesla: "AEB does not engage when an alternative collision avoidance strategy (e.g., driver steering) remains viable. Instead, when a collision threat is detected, forward collision warning alerts the driver to encourage them to take appropriate evasive action. AEB is a fallback safety feature that operates by design only at high levels of severity and should not be tested with live subjects."

My assessment (taking the drivers out of the equation) will focus on the tech.

Four things play out here:

1. The Mobileye chip
2. AP
3. The radar
4. Automatic Emergency Braking

As Mobileye said, the chip does not compute lateral vehicle crossings, so the chip is out of the equation. Next up: AP.

AP activates the automatic braking system according to what the radar is reading. In this case, the radar return was dismissed as a false positive because the system interpreted the high, white side of the truck as an overhead sign. This implies the radar did see the truck, but Tesla's in-house software classified it as a false positive. It also implies that if the truck had been a different color (say, orange), the software might have picked it up, which would have prompted AP to apply AEB.

Since the radar's software interpreted the truck as an overhead sign, it did not communicate anything to AP. Since nothing was communicated to AP, the Automatic Emergency Braking system was not activated.

Put all this together and, to me, we have a flaw in Tesla's in-house software, in how it interprets what the radar sees in front of the car.

From Tesla's statement, the radar return was interpreted as an overhead sign due to the truck's color and radar signature. That, to me, is a flaw. Would the software have interpreted it as a truck if the truck had been a different color? If so, that's a flaw.

Regardless of whose fault it was, truck or car, is Tesla's in-house radar software effectively making the correct decision? As it's currently programmed, it did make the "right" decision and classified the truck as an overhead sign, because that's how it's programmed.

But the logic of the programming is flawed. The programming should be able to tell the difference between an overhead sign and an 18-wheeler crossing the road, no matter the color or height of the truck.
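To make that concrete, here is a minimal sketch in Python of the kind of two-sensor cross-check gate Tesla's statement describes. This is my own reconstruction, not actual Autopilot code; the classes and labels are invented for illustration:

[code]
# Hypothetical camera/radar cross-check gate for AEB.
# My own reconstruction from Tesla's statement -- NOT real Autopilot
# code; the types and labels below are invented for illustration.

from dataclasses import dataclass

@dataclass
class RadarTrack:
    label: str       # e.g. "vehicle" or "overhead_sign"
    in_path: bool    # whether the return lies in the car's path

@dataclass
class CameraDetection:
    label: str       # e.g. "vehicle" or "sky"
    in_path: bool

def should_fire_aeb(radar: RadarTrack, camera: CameraDetection) -> bool:
    # AEB fires only when BOTH sensors agree there is an obstacle in
    # the path -- the "cross-check" in Tesla's statement.
    radar_ok = radar.in_path and radar.label == "vehicle"
    camera_ok = camera.in_path and camera.label == "vehicle"
    return radar_ok and camera_ok

# The accident as described: radar classifies the high white trailer
# as an overhead sign, and the camera loses it against the bright sky.
print(should_fire_aeb(RadarTrack("overhead_sign", True),
                      CameraDetection("sky", True)))   # False -> no braking
[/code]

On that logic, fixing the flaw means improving the classifiers that feed the gate rather than the gate itself: the AND exists to suppress false positives (phantom braking), at the cost of missing a real target whenever either sensor misclassifies it.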

Thoughts?

I think you need to read Tesla's statement again. It says the emergency action is implemented only when both the camera and the radar see an obstruction - in this case, a truck. If the radar sees it but the camera does not, no action is taken.

One must remember that the camera's image must be interpreted by software, just as the image from a human eye must be interpreted by the brain. The human brain is far more sophisticated than any AI software yet built, so we can identify familiar objects far better than software can. Software can't yet say, "that bright area is sky, but that similarly bright, but rectangular, area is likely to be a truck or a billboard," and proceed to look for other clues the way the human mind does.
 
Actually, the way neural networks work is modeled on the way the brain and neural synapses work.

Here's another example: you are driving along a mountain road and you see a boulder break loose and start rolling down the mountain. You would sense the potential danger and start to slow immediately. AP would not see any of that happen and would only react when the boulder came into its sensors' range. The car would be crushed.

You're right, humans are awesome at that...
[Image: ludovic-masciave-survives-boulder-accident-french-alps.jpg]
 
Wow, OK, here is your approach: in some cases seat belts cause a death when a person would have been thrown clear. Let's get rid of seat belts until they are perfect. The same goes for air bags. Oh yeah, there are cases where antilock brakes stop in a longer distance than regular brakes. Let's get rid of those too.

That is a ridiculous counter-argument, since you take the argument to a nonsensical extreme. The same can be done with your "argument":

"Well, let's approve everything, since nothing can be perfect; we approved the imperfect seat belt, so we have to approve everything."

Appeal to Extremes
 
Only people who own or drive a Tesla have the knowledge and experience to offer a well-informed opinion.
In this case it would be narrowed a bit more, because only owners/drivers of AP-equipped Model S or X cars have the prerequisite experience to support their opinions.
The ill-informed comments and bickering from non-owners/drivers are like background noise: useless, unneeded, and unwanted.

Well then, I don't know how the NHTSA will form an opinion unless the engineers there own a Tesla.
 
That is a ridiculous counter-argument, since you take the argument to a nonsensical extreme. The same can be done with your "argument":

"Well, let's approve everything, since nothing can be perfect; we approved the imperfect seat belt, so we have to approve everything."

Appeal to Extremes

Of course, you might want to read my other posts. The point is that the systems work. AEB avoids accidents under the conditions it was designed for, i.e., rear-end collisions. Its action in other cases is benign, in that it simply doesn't act. I was posting something meant to be a bit humorous while pointing out that the arguments were indeed appeals to the extremes. We have one accident under conditions the system was never designed to handle, and the argument is to remove the systems. There have been several posts saying the systems shouldn't be released until they are perfect. I was pointing out that the same standard, applied to other safety systems, would have us remove seat belts, airbags, and antilock brakes. None of those are perfect.
 
I went and read some of your past posts and admit I owe you an apology. In the past several months there have been so many people joining TMC just to bash Tesla that I overreacted, didn't research, and, bottom line, jumped the gun. Sorry for lumping you in with Dr ValueSeeker and his ilk.

It doesn't change how over the top I find some of your comments. Improving safety always has been, and always will be, an incremental thing. I don't want to increase rear-end accidents just because the system sometimes doesn't respond to side intrusions, just as I don't want to eliminate a system that misses bicyclists if it properly stops when the car in front slams on the brakes. In time these systems will improve. Eventually someone will complain because the system works great except when an airplane is making an emergency landing on an interstate.

It's all good. I respect everyone's position. We may not agree on some things, but that's ok.

I've driven the S for about 18k miles, with roughly 4-6k of those miles on AP. My father has an 85D, and I have his car 2-3 days a week. He's close to retiring and lives next door, so I'm lucky enough to almost call his car mine.

I was 12th in line for the 3 at my Dallas location, and that's when my obsession took off. Although I drive the S quite a bit, it wasn't until the 3..

I love AP, and in the right hands I have no issues with it. It's the few crazies that worry me.