You are so correct. Dang, I did miss that one. Don't forget cup holders. One shouldn't be drinking anything as it is distracting.
First off, my condolences to J. Brown's family.
I think this is very simple: the truck driver made an error in judgment, and the Model S driver was either incapacitated and couldn't react or was inattentive and didn't react. Mr. Brown lost his life to a combination of factors that are statistically very, very rare. I don't see how the Tesla is at fault in any way. When AP is activated you agree to the parameters set forth and you push the button, end of story. The primary culprit is the truck driver.
I'm not saying Tesla didn't introduce some advancements. That would be unlikely (just as it is unlikely that other manufacturers didn't introduce advancements in other areas).
My point was that given you seem to be crucifying Tesla for releasing a level 2 system "first" (and taking issue with the Beta label), I just needed to point out that they were not first with a level 2 system and that the beta label is largely irrelevant to the issue. Tesla uses the same legal disclaimers about driver responsibility as other systems, and this has nothing to do with the Beta label. All level 2 systems will have such disclaimers, simply because such systems by definition have many situations that require the driver to take over. Even after Autopilot leaves Beta, as long as it remains a level 2 system, it will need such disclaimers.
It is okay to take the position that level 2 systems are irresponsible (a position many will disagree with), but I think many people push back against this because you are suggesting Tesla is alone or first in this, which is not true.
Wow, ok here is your approach: In some cases seat belts cause a death when a person would have been thrown clear. Let's get rid of seat belts till they are perfect. The same goes for air bags. Oh yeah, there are cases where antilock brakes stop in a longer distance than regular brakes. Let's get rid of those too.
Take a chill pill, dude; you sound like my 5-year-old. Blare all you want, but guess what: you're not changing my position. You can thumbs-down me; I'm OK with that.
It is possible that he saw the truck and thought or assumed it would stop, and moved calmly, without disengaging AP, from the left lane to the right lane to go around it. Meanwhile, the (slow-moving?) truck driver sees the Tesla bearing down on him and "gives her the gas" - completely blocking the road.
At that point all decisions and options are bad and you just run out of time.
...What is very clear is you don't know what Autopilot means.
You think Autopilot should equal self driving aka Autonomous driving but it doesn't...
This is the scenario of the vast majority of accidents: two people making one or more simultaneous decisions that conflict until time runs out. It happens all day, every day, even when walking. It even happened to me on a bike once. It has nothing to do with AP.
...Would autopilot detect and warn if it sees bike riders? Would it turn off autopilot?...
I'd say it's not the car's responsibility, and I don't personally want/need a nanny. The solution to this problem is education instead of hand-holding. It's going to be confusing for people as we transition from partial to full autonomy. They simply need to be fully aware of their own vehicle's capabilities. There needs to be no ambiguity.
Lots of talk about Autopilot capability, which is understandable.
However, all systems, no matter how advanced, will have "edge" cases that they are unable to reliably detect.
The discussion really needs to be:
"How can driver-assist functions be implemented whilst ensuring the driver maintains the appropriate level of concentration for the technology in use?"
That statement is not true. They are using deep neural networks, and these can work several orders of magnitude faster than brute force for the tasks at hand.
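To make that concrete, here is a minimal sketch (purely my own illustration in Python/NumPy, nothing to do with Tesla's or Mobileye's actual code; the sizes and data are invented). A trained network's forward pass costs a fixed number of multiply-adds, while a brute-force template search grows with the amount of stored data:

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny two-layer network: 256-dim input -> 64 hidden units -> 10 classes.
W1, b1 = rng.standard_normal((256, 64)), np.zeros(64)
W2, b2 = rng.standard_normal((64, 10)), np.zeros(10)

def net_classify(x):
    """One forward pass: a fixed number of multiply-adds, independent of
    how much data the network was trained on."""
    h = np.maximum(x @ W1 + b1, 0.0)      # ReLU hidden layer
    return int(np.argmax(h @ W2 + b2))    # highest class score wins

# Brute force instead: compare the input against every stored template.
templates = rng.standard_normal((10_000, 256))   # cost grows with this
labels = rng.integers(0, 10, size=10_000)

def brute_force_classify(x):
    """Nearest-template lookup: cost scales linearly with stored data."""
    distances = np.linalg.norm(templates - x, axis=1)
    return int(labels[np.argmin(distances)])

x = rng.standard_normal(256)
print(net_classify(x), brute_force_classify(x))
```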
...One other point: I recall the CEO talking about travelling from one city to another without touching the wheel. This gave me and my wife the impression the car drove itself. In light of recent news, I feel like this functionality is much more like cruise control (which I also do not use)...
Tesla has elaborated with more info...
Tesla elaborates on Autopilot’s automatic emergency braking capacity over Mobileye’s system
From reading this article, here's what I gather:
- Mobileye's tech is incapable of processing data on laterally crossing vehicles to activate Automatic Emergency Braking.
- Quote from Tesla "Autopilot activates automatic emergency braking in response to any interruption of the ground plane in the path of the vehicle that cross-checks against a consistent radar signature".
- Quote from Tesla - "In the case of this accident, the high, white side of the box truck, combined with a radar signature that would have looked very similar to an overhead sign, caused automatic braking not to fire".
- Quote from Tesla - “AEB does not engage when an alternative collision avoidance strategy (e.g., driver steering) remains viable. Instead, when a collision threat is detected, forward collision warning alerts the driver to encourage them to take appropriate evasive action. AEB is a fallback safety feature that operates by design only at high levels of severity and should not be tested with live subjects.”
My assessment (taking the drivers out of the equation) will focus on the tech.
Four things play out here:
1. The mobileye chip
2. AP
3. The radar
4. Automatic Emergency Braking
As Mobileye said, the chip does not compute lateral vehicle crossings, so the chip is out of the equation. Next up, AP.
AP activates the automatic braking system according to what the radar is reading. In this case, the radar return was dismissed as a false positive because the system saw the truck as white and at the height of an overhead sign. This implies the radar did see the truck, but Tesla's in-house software classified it as a false positive. It also implies that if the truck had been a different color (say, orange), the radar's software might have picked it up, which would have prompted AP to apply AEB.
Since the radar's software interpreted the truck as an overhead sign, it did not communicate anything to AP. Since nothing was communicated to AP, the Emergency Braking System was not activated.
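Put as pseudocode, the chain I'm describing looks roughly like this (my reading of Tesla's statements, not actual Tesla code; the names, the 2-meter threshold, and the overhead-sign rule are all assumptions for illustration):

```python
from dataclasses import dataclass
from enum import Enum, auto

class Classification(Enum):
    OBSTACLE = auto()
    OVERHEAD_SIGN = auto()   # radar signature filtered out as not a threat

@dataclass
class RadarReturn:
    height_m: float          # apparent height of the return
    in_ground_plane: bool    # does it interrupt the ground plane ahead?

def classify(ret: RadarReturn) -> Classification:
    # The claimed flaw: a high return that doesn't clearly interrupt the
    # ground plane gets binned with overhead signs and is ignored.
    if ret.height_m > 2.0 and not ret.in_ground_plane:
        return Classification.OVERHEAD_SIGN
    return Classification.OBSTACLE

def autopilot_step(ret: RadarReturn) -> str:
    if classify(ret) is Classification.OVERHEAD_SIGN:
        return "no action"   # nothing communicated to AP, so no AEB
    # Per Tesla's quote: warn the driver first; AEB is a last-resort fallback.
    return "forward collision warning, then AEB if still closing"

# The trailer reads like a high return above an apparently clear ground
# plane, so the chain produces no warning and no braking.
print(autopilot_step(RadarReturn(height_m=2.5, in_ground_plane=False)))
```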
Put all this together and, to me, we have a flaw in Tesla's in-house software as it relates to how the radar's software interprets objects in front of the car.
From Tesla's statement, the radar interpreted the truck as an overhead sign due to its color and radar signature. That, to me, is a flaw. Would the software have interpreted it as a truck if the truck had been a different color? If so, that's a flaw.
Regardless of whose fault it was, truck or car, is Tesla's in-house radar software effectively making the correct decision? As it's currently programmed, it did make the "right" decision and classified the truck as an overhead sign, because that's how it's programmed.
But the logic of the programming is flawed. The programming should be able to tell the difference between an overhead sign and an 18-wheeler crossing the road, no matter the color or height of the truck (one possible approach is sketched below).
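For example, one color-blind way to separate the two (again, purely my own illustration, not a claim about how Tesla would fix it; all numbers are invented): a sign is stationary in the world frame, while a crossing trailer sweeps laterally through the lane over successive radar frames.

```python
import numpy as np

def is_crossing_obstacle(lateral_positions_m: np.ndarray,
                         dt_s: float,
                         min_lateral_speed_mps: float = 1.0) -> bool:
    """True if successive radar fixes show sustained lateral motion."""
    lateral_speeds = np.abs(np.diff(lateral_positions_m)) / dt_s
    return bool(np.median(lateral_speeds) > min_lateral_speed_mps)

# A sign: lateral position barely changes between 0.1 s radar frames.
print(is_crossing_obstacle(np.array([0.02, 0.01, 0.03, 0.02]), dt_s=0.1))  # False
# A trailer crossing at ~5 m/s sweeps half a meter per frame.
print(is_crossing_obstacle(np.array([4.0, 3.5, 3.0, 2.5]), dt_s=0.1))      # True
```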
Thoughts?
Here's another example: you are driving along a mountain road and you see a boulder break loose and start rolling down the mountain. You would sense the potential danger and start to slow immediately. AP would not see any of that happen and would only react once the boulder came within range of its sensors. The car would be crushed.
Wow, ok here is your approach: In some cases seat belts cause a death when a person would have been thrown clear. Let's get rid of seat belts till they are perfect. The same goes for air bags. Oh yeah, there are cases where antilock brakes stop in a longer distance than regular brakes. Let's get rid of those too.
Only people who own/drive a Tesla have the knowledge and experience to offer a well-informed opinion.
In this case it would be narrowed a bit more, because only owners/drivers of AP-equipped Model S or X cars would have the prerequisite experience to support their opinions.
The ill-informed comments and bickering from non-owners/drivers is like background noise: useless, unneeded and unwanted.
That is a ridiculous counterargument, since you take the argument to a nonsensical extreme; the same can be done for your "argument":
"Well, let's approve everything, since nothing can be perfect; we approved imperfect seat belts, so we have to approve everything."
Appeal to Extremes
I went and read some of your past posts and admit I owe you an apology. In the past several months there have been so many people joining TMC just to bash Tesla that I overreacted, didn't do my research, and, bottom line, jumped the gun. Sorry for lumping you in with Dr ValueSeeker and his ilk.
It doesn't change how over the top I find some of your comments. Improving safety always has been and always will be an incremental thing. I don't want to increase rear end accidents just because the system sometimes doesn't respond to side intrusions just like I don't want to eliminate a system that misses bicyclists if it properly stops when the car in front slams on the brakes. In time these systems will improve. Eventually someone will be complaining because the system works great except when an airplane is making an emergency landing on an interstate.
Actually, the way neural networks work is based on the way the brain and neural synapses work.
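For anyone curious, the analogy at its simplest (a minimal sketch of my own; the weights and inputs are made up for illustration): an artificial neuron sums weighted inputs, where the weights play the role of synapse strengths, and "fires" through a nonlinear activation.

```python
import numpy as np

def neuron(inputs: np.ndarray, weights: np.ndarray, bias: float) -> float:
    """Weighted sum of inputs, squashed by a sigmoid 'firing' function."""
    activation = np.dot(weights, inputs) + bias
    return 1.0 / (1.0 + np.exp(-activation))   # sigmoid: 0 = quiet, 1 = firing

# Three input signals with different synapse strengths.
x = np.array([0.9, 0.1, 0.4])
w = np.array([1.5, -2.0, 0.7])
print(neuron(x, w, bias=-0.2))   # a single scalar "firing rate"
```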