> Full Self Driving beta is a 100% accurate description. You can sit in the backseat when it's out of beta, not hard to understand at all.

This scenario scares me. This is even more reason to pull FSD after beta and then only allow trained W-2 employees to test it.
Sure, right now it's bad enough that every sane person will monitor it like a hawk. However, once it gets really good, say 1 year between errors, Tesla is going to have to do a much better job of explaining to people that just because it drove 10,000 miles without error doesn't mean it won't run over a pedestrian on mile 10,001.
> This scenario scares me. This is even more reason to pull FSD after beta and then only allow trained W-2 employees to test it.

Agree 100%, this is my concern too. Companies testing much more reliable AVs go to great lengths to ensure their test drivers are vigilant (I think some have two drivers in the car at all times). The argument here is that somehow it won't be an issue because the driver knows they're responsible, whereas employees know the company will be liable (though the Uber test driver who killed someone was charged with negligent homicide; it seems that case is still ongoing).
The ramp from "it could kill you any moment" to "it's still 10x more likely to crash than a human" to "equal to or better than a human" is a really dangerous space.
Even at 10x more dangerous than a human, it will still be extremely rare for any individual to experience even a single fault in several years of using it. Human brains don't have the capacity to evaluate the risk of such infrequent events from individual experience. We need to stop and evaluate the data.
The name it carries during that transition could help people understand the risk, but I'm not sure what the right name is.
Gut check on my assumptions: I've been driving 40 years and been in 3 accidents. One would have been prevented by FSD; in the other 2 I was rear-ended while waiting to turn left. No way will FSD, as currently imagined, be able to prevent those.
10x my rate of 1 preventable accident in 40 years is 10 accidents in 40 years, so one accident every 4 years, for me.
I think I'm pretty average. My dad's been driving 66 years. He's been in 1 accident, when he was about 21.
Most people I know have been in 0 to 1 accidents over decades of driving.
Whatever the actual numbers are, there is going to be a gap between "FSDb scares the crap out of you every day" and "FSD is better than 90% of human drivers," where it will be in a state of "drives really well for years but is still some factor more dangerous than most drivers."
How to navigate that gap will be super critical.
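The gut-check math above can be sketched out. This is a rough illustration only: the one-preventable-accident-in-40-years baseline comes from the post, and the intermediate multipliers are my own additions to show the shape of the gap.

```python
# Back-of-envelope sketch of the gut check above.
# Baseline from the post: 1 FSD-preventable accident in 40 years of driving.
# The 2x and 5x multipliers are my additions, not from the thread.

YEARS_DRIVEN = 40
BASELINE_ACCIDENTS = 1  # accidents FSD could plausibly have prevented

baseline_rate = BASELINE_ACCIDENTS / YEARS_DRIVEN  # accidents per year

for multiplier in (1, 2, 5, 10):
    rate = baseline_rate * multiplier
    print(f"{multiplier:>2}x human risk -> one accident every {1 / rate:.0f} years")
```

At 10x this gives one accident every 4 years, matching the figure in the post; even at that level, an individual driver could go years between events.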
> Would a W-2 employee help if the math presented is true? Let's assume it's right and the average person will only see an accident once every 4 years (10x worse than humans). That means an employee safety driver will not see an accident scenario for 4 years? At what point does the company say it's safe enough?

It's safe enough when it exceeds human performance. Tesla says that's a severe collision (>12 mph, i.e. airbag deployment) 1 per 2 million miles.
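To put that per-mile figure in per-driver terms, here is a rough conversion. The 12,000 miles per year figure is my assumption (roughly a typical annual mileage), not from the thread; only the 1-per-2-million-miles baseline comes from the post.

```python
# Rough conversion of the claimed severe-collision rate into
# time between events for one driver.
# Assumption (mine): ~12,000 miles driven per year.

MILES_PER_SEVERE_COLLISION = 2_000_000  # claimed human baseline from the post
ANNUAL_MILES = 12_000

for multiplier in (1, 10):  # 1x = human parity, 10x = ten times worse
    miles_between = MILES_PER_SEVERE_COLLISION / multiplier
    years = miles_between / ANNUAL_MILES
    print(f"{multiplier:>2}x worse than human: ~{years:.0f} years between severe collisions")
```

By this measure, even a system 10x worse than a human would go well over a decade between severe collisions for a single safety driver, which is the crux of the "at what point is it safe enough" question.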
> It's safe enough when it exceeds human performance. Tesla says that's a severe collision (>12mph = airbag deployment) 1 per 2 million miles.

Next question: how many severe collisions have occurred on FSD Beta? We have stats for AP, both from NHTSA and Tesla, but I haven't seen stats for FSD Beta except what Tesla reports, which is virtually 0.
It's not the status of the employee, it's the fact that these companies are watching the employees like a hawk and will fire them if they aren't being vigilant.
Here is Waymo's safety report for some context on the collision rates they saw in Chandler: https://storage.googleapis.com/sdc-...Waymo-Public-Road-Safety-Performance-Data.pdf
And their "fatigue risk management framework" (this is new; I haven't read it): https://storage.googleapis.com/sdc-prod/v1/safety-report/Waymo-Fatigue-Risk-Management.pdf
> Next question: how many severe collisions have occurred on FSD Beta? We have stats for AP, both from NHTSA and Tesla, but I haven't seen stats for FSD Beta except what Tesla reports, which is virtually 0.

I think zero in about 35 million miles, which is of course very good. On the other hand, someone could have a fatal collision on FSD beta, and that would instantly make it look very bad, since fatal collisions only occur about 1 per 100 million miles. And of course the population of people using FSD beta is not representative of the average population, so it's really too early to make a judgement.
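A quick way to see why zero events in 35 million miles is not yet informative: treat fatal collisions as a Poisson process at the quoted human rate of 1 per 100 million miles (a simplifying assumption of mine, not a claim from the thread).

```python
# Sketch: is "zero fatalities in 35M miles" statistically meaningful?
# Model fatal collisions as a Poisson process at the human rate
# quoted in the post (1 per 100 million miles).
import math

FSD_BETA_MILES = 35_000_000
HUMAN_FATAL_RATE = 1 / 100_000_000  # fatal collisions per mile

expected = FSD_BETA_MILES * HUMAN_FATAL_RATE  # expected fatalities at human rate
p_zero = math.exp(-expected)                  # Poisson P(k = 0)
print(f"Expected fatalities at the human rate: {expected:.2f}")   # ~0.35
print(f"P(zero fatalities even at human-level risk): {p_zero:.0%}")  # ~70%
```

Under this model, a fleet that was exactly as dangerous as human drivers would still show zero fatalities about 70% of the time over 35 million miles, so observing zero so far tells us very little either way.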
> Most people agree that with FSD beta in its current form driver engagement is not an issue. My prediction is that the first severe collision will be because the driver "wanted to see what would happen."

And *this* we've seen over and over, with videos of people jumping into the back seat while on AP.
> And *this* we've seen over and over, with videos of people jumping into the back seat while on AP.

Not exactly what I'm talking about, and extremely rare. I'm talking about more normal people who let it run red lights and stop signs for fun.
> Not exactly what I'm talking about, and extremely rare. I'm talking about more normal people who let it run red lights and stop signs for fun.

I don't have any statistics, but I don't believe it is rare. Stupid people do stupid things. If it's a minor accident, it does not make the evening news.
Like this: