Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Does Tesla have a self-driving data advantage?

I am prepared to be told by y'all that I am crazy. Maybe I am - but hear me out. And let me make something really clear - I wish Joshua Brown was alive and that his accident had never happened. But now that it has happened the past cannot be undone - and we ask ourselves - what next?

Tesla implemented Autopilot in an open regulatory environment - allowing it to gather a data set of hundreds of millions of miles to train its neural networks free of interference from the media, politicians and regulation. Was it risky? Yes - as we have all found out there was a really horrible corner case lurking.

But how do we know how many other corner cases were discovered and learned from prior to the fatality? What if the only way to discover them is through large-scale testing that may now become politically impossible for latecomers?

Yes, pioneers often get shot in the back - but sometimes first movers do have big advantages.

Neural networks can be quickly trained to be 99% effective on a relatively small data set. For example, Nvidia published a paper just this April showing that its Drive PX2 supercomputer system was trained end-to-end to drive in under 100 hours, purely by comparing a video feed to steering input. This differs markedly from Mobileye's task-compartmentalized, annotated-image training approach.
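To make the end-to-end idea concrete, here is a toy sketch in pure Python of behavioral cloning: a single linear "neuron" learns to map raw pixel values straight to a steering angle by minimizing its disagreement with a (synthetic) human driver. Everything here - the frame size, the learning rate, the fake ground truth - is an illustrative assumption; NVIDIA's actual system is a deep convolutional network trained on real camera footage.

```python
# Toy sketch of end-to-end behavioral cloning: learn steering directly
# from raw pixel inputs by regressing against the human's steering angle.
# All data here is synthetic; no hand-labeled objects or lane markings.
import random

random.seed(0)

N_PIXELS = 16          # stand-in for a flattened camera frame
LEARNING_RATE = 0.01

def make_frame():
    """Fake camera frame: a list of pixel intensities in [0, 1]."""
    return [random.random() for _ in range(N_PIXELS)]

def human_steering(frame):
    """Pretend ground truth: the human steers toward the brighter side."""
    left = sum(frame[: N_PIXELS // 2])
    right = sum(frame[N_PIXELS // 2 :])
    return (right - left) / N_PIXELS

# One linear "neuron": steering = w . frame + b
w = [0.0] * N_PIXELS
b = 0.0

for step in range(5000):
    frame = make_frame()
    target = human_steering(frame)          # what the human did
    pred = sum(wi * xi for wi, xi in zip(w, frame)) + b
    err = pred - target
    # Gradient descent on squared error -- the only training signal
    # is "match the human's steering", nothing is annotated by hand.
    for i in range(N_PIXELS):
        w[i] -= LEARNING_RATE * err * frame[i]
    b -= LEARNING_RATE * err

# After training, the model's steering should track the human's closely.
test_frame = make_frame()
pred = sum(wi * xi for wi, xi in zip(w, test_frame)) + b
final_err = abs(human_steering(test_frame) - pred)
print(round(final_err, 3))
```

The point of the sketch is the training signal, not the model: the network never sees a label like "truck" or "lane line", only the human's steering - which is exactly why coverage of rare situations, rather than model size, becomes the bottleneck.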

But of course, 99% isn't good enough - as the media is now screaming about.

If you really want statistical safety then the network must be exposed to a large enough data set that corner cases which no human can predict are found through simple trial and error.

Also - remember that even the scientists don't truly understand how their own neural networks work - they openly acknowledge this. To some degree tuning these algorithms - like the human brain - is an experimental art form - they are black boxes in the sense that we cannot predict with 100% accuracy what a neural network will output for any given set of inputs.

Only Tesla can now say to regulators that their neural network has been "hardened" through hundreds of millions of miles of testing - nobody else can make that claim.

If this is true, then Tesla has gathered the data it needs to build a statistically safe neural network - but the fatality has possibly slammed the door shut, at least temporarily, on its competition, because anybody who tries to copy Tesla's methodology - a fleet-learning neural network - is going to get shut down by politicians and the press saying "DIDN'T YOU IDIOTS SEE WHAT HAPPENED TO TESLA? HOW DARE YOU?"

The competition will have to run their autopilots in simulation mode. Even now - almost 2 years after Autopilot launched - the competition's lane assist systems are not learning anything once in their customers' fleets. Anyone who wants to replicate what Tesla has done will have to start from scratch.

We do not know exactly what level of proprietary neural network training Tesla is using - but for those of you who have not closely followed the hints dropped by Mobileye and Tesla, here are a few key ones:

1 - Mobileye said in a public presentation (I believe at its January 2016 CES talk) that EyeQ3 is the first SoC to have a neural network which not only processes information in realtime after being trained back at "headquarters" - but which is also capable of being set up to learn on the fly in customer cars. It refers to this as "DNN" - and the media widely reported it as a "coming soon" feature of EyeQ4 - but Mobileye's own CEO said in his talk that Tesla had already implemented it in Autopilot 1.0 using EyeQ3.

2 - Mobileye has also said that Tesla is the only automaker so far to have implemented this neural network learning.

3 - Mobileye said this in January 2016 - some 15 months after Tesla began gathering data in October of 2014.

What does this mean moving forward? Tesla can rightly tell the public that it has built a statistically safe system, and that the only way to do it is with hundreds of millions of miles of data.

Thus Tesla can argue that it should be allowed to continue operating Autopilot - but that the rest of the industry should go through the same fleet learning and high definition map building before they release systems.

However, it may now be politically impossible to launch such a project again for any other automaker - except in pure simulation mode, where an autopilot runs in the background comparing its own actions to those of the human driver.
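A minimal sketch of what such a background "simulation mode" could look like, assuming a trivial stand-in steering policy and a hypothetical drive-log format (neither reflects Tesla's actual implementation):

```python
# Minimal sketch of "simulation mode" (shadow mode): the autopilot runs in
# the background, proposing a steering action each frame, and we log the
# frames where it disagrees with the human by more than a tolerance.
# The policy, log format, and tolerance are all illustrative stand-ins.

TOLERANCE = 0.1  # max acceptable steering disagreement (arbitrary units)

def autopilot_policy(frame):
    # Stand-in policy: steer proportionally back toward lane center.
    return -0.5 * frame["lane_offset"]

def shadow_mode(drive_log):
    """Compare the background autopilot to the human, mile by mile."""
    disagreements = []
    for frame in drive_log:
        proposed = autopilot_policy(frame)
        actual = frame["human_steering"]
        if abs(proposed - actual) > TOLERANCE:
            # In a real fleet, this snippet would be uploaded for training.
            disagreements.append(frame["mile"])
    return disagreements

drive_log = [
    {"mile": 1, "lane_offset": 0.2, "human_steering": -0.1},   # agrees
    {"mile": 2, "lane_offset": 0.0, "human_steering": -0.8},   # human swerves
    {"mile": 3, "lane_offset": -0.4, "human_steering": 0.2},   # agrees
]
print(shadow_mode(drive_log))  # -> [2]
```

The appeal of this scheme is that disagreements are the interesting data: mile 2, where the human swerved and the background system would not have, is exactly the kind of corner case a fleet can surface without the system ever controlling the car.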

If this is true - then nobody else will be able to catch up to Tesla's real-world data set for at least a couple more years - and, even more importantly, they won't be able to offer the public a functioning autopilot equivalent to Tesla's in that time.

The short term hit to Tesla's reputation has been costly - true. But Tesla can rightly tell lawmakers that it is the only carmaker which has this robust data, and is enhancing its system even further, ensuring that Mr. Brown's incident never happens again.

But who else can say that? Nobody. Who else, right now, can say they have a functional high definition lane-by-lane map of at least a portion of American roads? Nobody - Mobileye and others are launching map building projects but they won't bear fruit for some time.
 
Tesla could also say:

"Dear competitor - in addition to our kind offer to sell you access to the only robust rapid charging network for EVs in the world, we are now also happy to sell you access to the only robustly trained self-driving neural network in the world - now that you can't build your own."
 
Fatality was best possible outcome for Tesla

I'd be banned for typing what I think about you posting that.

Did you read what I put at the top? My father died in a gruesome car accident and I spent years mourning his death and recovering from its impact on my life. This made me much more concerned with automotive safety than ever before - one reason I purchased a Tesla actually. I wish we could go back and change the past and bring Joshua Brown - and my father - back. Sometimes I catch myself thinking "Would AEB have saved dad? Would a cross traffic warning system have prevented him from pulling out into the road?" etc. etc. But now that it's happened - the political landscape has changed. The question is about the future. That is all.
 
Once the sensors were in the cars, and before the Auto Pilot software was enabled, Tesla was collecting data and "pretend" driving the car to see how closely it matched the real drivers. This was revealed to us at last year's TMC Connect, and several had already speculated as such. Thus the training required for the 'nets need not be under real Auto Pilot control at the time.

So no, I don't believe there will be a first mover advantage here to the data, even if actually driving on AP isn't "allowed".

Also, some have speculated (and I concur) that Tesla will share their database freely, as they've done with the patents, in order to help advance the state of the art of autonomous driving for humanity's betterment, not just TSLA's.
 

Right - the advantage I am imagining is that other automakers will possibly have to spend years doing the same thing you just described Tesla as doing - time during which Tesla will be able to offer a functional autopilot to people buying cars.

As for Tesla giving the data away - I hadn't heard of that possibility. If Tesla gives the data away to the competition then yes, my whole theory is wrong.
 
No, because I don't want Tesla to be the sole provider of good and increasingly better autonomy features. I think Tesla is great and even hold stock but I don't want to see progress get shut down just to leave Tesla with a monopoly which I think is a fantasy scenario that would never happen anyway. I want to see all cars, for everyone, get better and safer, because that makes it safer for me too.

It's interesting to note that Tesla's mantra is "accelerating the transition to sustainable transport", but does not currently include "accelerating the transition to safer transport." Perhaps they will add that. Remember that Elon opened up their patents in 2014 to help accelerate other EV companies. But it's not clear whether that pledge applies to the Autopilot/autonomy technologies and their patents.

It's possible Elon judges sustainable energy to be a far more important goal than reducing car accident fatalities, and is willing to keep the autonomy technology proprietary and not share those in order to maintain a competitive advantage for the business, but is willing to share just the battery/EV IP.

Does anyone know more about the current status of their patent pledge and whether AP falls under that pledge?
 
Did you read what I put at the top? My father died in a gruesome car accident which made me much more concerned with automotive safety than ever before - one reason I purchased a Tesla actually. I wish we could go back and change the past and bring Joshua Brown back. But now that it's happened - the political landscape has changed. The question is about the future. That is all.

Doesn't matter. You can make the same point without using that highly insensitive title.
 
Here's an article that says they offered it to the DOT and that Elon said he was considering offering the data to other automakers as well. The goal is to push regulators to approve actual autonomous driving, rather than hoarding data that might turn out to be worth little if regulators decide not to allow autonomous driving or decide to impose severe limitations.
http://jalopnik.com/tesla-reportedly-offered-its-autopilot-data-to-the-depa-1780633149
 
That Jalopnik article is just a rehash of this Electrek article: "Tesla offered to share all its Autopilot data with the US Department of Transport".

Whether Elon will ever offer Tesla's AP data to other car manufacturers remains to be seen. Whether other manufacturers would accept it - since that would require that they develop the capability to process, analyze, and use the data - also remains to be seen. Personally, I think their pride would prevent them from making the necessary effort to use the data, just as their pride may be preventing them from making EVs compatible with the Supercharger network.

@calisnow, thanks for your interesting and insightful post, I enjoyed reading it.
 
I am prepared to be told by y'all that I am crazy.
...
The short term hit to Tesla's reputation has been costly - true. But Tesla can rightly tell lawmakers that it is the only carmaker which has this robust data, and is enhancing its system even further, ensuring that Mr. Brown's incident never happens again.

But who else can say that? Nobody. Who else, right now, can say they have a functional high definition lane-by-lane map of at least a portion of American roads? Nobody - Mobileye and others are launching map building projects but they won't bear fruit for some time.

Ok, you're crazy. And that's a compliment.

Your point about the granular mapping is exactly why I have engaged DriverAssist (Autopilot) at almost every opportunity in the past 54K+ miles and 18 months-ish - including the twisty 1-lane roads next to pools of boiling, acidic water in parts of Yellowstone, the rural back roads of Vermont, and more highways, freeways, and interstates than I can count in the States and Canada. Because it all counts - especially the exceptions that get shared fleetwide. Agreed that as much as Tesla has led the way, other manufacturers should absolutely not be allowed to cut corners/take shortcuts. Because we know they will without oversight.

Respectfully, and taking the above response into account, if you could still possibly figure out a different thread title with the same impact, it might be a good thing. I appreciate your perspective - just thinking about if a younger member of the deceased's family happens to run across this thread; they may not take the time at first to appreciate the care and thoroughness (and caveats) that you've cogently and compellingly demonstrated above.
 
calisnow, you seem to be hypothesizing a total freeze on autopilot-like features based on the one incident under investigation. If that happens (and it won't), I'll argue that the winner would be Google, not Tesla. Google has been testing what is effectively Level 4 autonomy for quite some time. (Yes, it's technically/pedantically Level 3, but effectively it's an alpha test of Level 4.) Thus, one could argue that Google is roughly 2-4 years ahead of even Tesla. And so, despite the fact that the rest of the auto industry is mostly stupid, Google and others will (very soon) step in and offer a plug-and-play solution to these dinosaurs that instantly brings them up to Level 3/4 autonomy.

But, I'll speculate that there won't be the freeze that you're talking about. The truck driver will be cited, and Josh Brown will be found somewhat negligent (unfortunately). The NTSB will likely make some silly recommendation involving nag-screens or hands-on-wheel requirements, and we'll move forward -- until the 2nd fatal incident, which, in the social-media age is viewed as a pandemic and then all bets are off.
 
Your point about the granular mapping is exactly why I have engaged DriverAssist (Autopilot) at almost every opportunity in the past 54K+ miles and 18 months-ish ...Respectfully, and taking the above response into account, if you could still possibly figure out a different thread title with the same impact, it might be a good thing.

You're right - I reported the post to the moderators and asked for a title change. Unfortunately we cannot change post titles after a post goes live (as far as I know).

As for your efforts to help Tesla by using autopilot - good for you. Is there a big difference, however, between engaging autopilot and not engaging it - as far as helping the fleet learn? I thought that if you do not engage it, it is still running in the background in a simulation mode, comparing what it would choose to do at any given moment with what you the driver actually do.
 
Tesla should have anticipated this edge case based on the crazy antics that have been posted on YouTube over the last half year. They could have done a better job to prevent this fatality. With that said, I agree that their approach for autopilot (and really, for much of their products in general), which is to push things to the extreme limit has given them a level of experience and a dataset that put them years ahead of the next closest competitor. Was it worth the life of a person? Absolutely not. But what's done is done, and I agree that Tesla is in a position to move forward. Future software and hardware updates for autopilot will be incrementally better and edge cases will be further reduced.
 

Nicely done re the title change.

Thanks for mentioning the simulation mode - I had forgotten about that. I know more than one person who is scared of AP - they just never use it. Even on LA freeways where it does quite well and can reduce stress markedly. With simulation, every mile they drive still helps.

Heh. I could have saved myself a few harrowing moments on sub-standard roads, highways, tollways, and the obligatory goat path or two :).

In the end, I believe the data are cumulative - every time it breaks loose while engaged or every time *I* have to intervene, which can be often, an exception is logged (in addition to the times when AP is not actively engaged - so simulation only).

Tangentially, the most notable times it got abruptly disengaged were during high crosswinds. Blew the car right out of the lane, so that was that for Autosteer. One does wonder about automated recovery from that sort of scenario in the context of Elon's predicted NYC ---> LA trip solely via Summon in a few years.
 
I would be content with my own death if it acted to serve society in some measurable way.

But, I think in this case everyone is focusing on the wrong thing. In doing so we're not learning the lessons we should be learning from the accident. In doing that we're doing a great disservice to the one that died.

In every single case it seems like we're being swept up in completely exaggerated versions of what Tesla actually claimed. Tesla never claimed that the "neural network" could be retrained on every single incident. The only fleet learning that has been acknowledged is with high-definition maps as it pertains to lane keeping.

It was also never claimed that the Tesla MS was "self-driving," but the media keeps reporting it as if it were.

What we do know is that the Automatic Emergency Braking didn't activate - and yet how many media outlets have focused on that, or really questioned the value of those systems?

Those systems are not just on a Tesla, but on a LOT of cars on the road.

How many of them can really effectively mitigate an accident of a similar nature?

Shouldn't we be pushing for improvements to these systems? Not just for Tesla, but for every single manufacturer that has an AEB system.

This isn't just about us.
 
Tesla should have anticipated this edge case based on the crazy antics that have been posted on YouTube over the last half year.
"Edge cases" by definition are those use cases that are extremely difficult if not impossible to anticipate. In the real world there are so many bizarre and unpredictable circumstances that cannot be anticipated. But hindsight is 20/20.
Shouldn't we be pushing for improvements to this system? Not just for Tesla, but every single manufacture that has a AEB system.
Obviously Tesla is working hard to improve AP. Elon has stated it is a top priority at Tesla and that the head of the AP group reports directly to him. AP has shown clear improvement over the past year, and I am sure that we are going to see much more dramatic improvements in the near future. I do not see any other car manufacturer working nearly as hard to implement autonomous driving. Google is the only company with a superior system (though only on the roads they have precisely mapped), but after many years of development Google has yet to sell its system to any car company and have it made available to car buyers. So in the real world Tesla is ahead of Google when it comes to implementing some level of "self driving," and has already logged many more real-world driving miles.
 
What we do know is that the Automatic Emergency Braking didn't activate
Truly said.

I have never seen Tesla claim that AEB would work in 100% of scenarios, with or without AP. If something is broken it is AEB, not Autopilot.

I don't want this thread to turn into a discussion of how simulation mode should work, but I wonder why they can't gather simulated data for when AEB thinks it should have kicked in versus when the driver manually slammed the brakes to avoid a mishap.
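As a sketch of that idea, here is a hypothetical shadow logger that compares a naive AEB model (brake when time-to-collision drops below a threshold) against what the driver actually did, and records the mismatches. The TTC heuristic, the threshold, and the log format are all illustrative assumptions, not how any production AEB works:

```python
# Hedged sketch: log, in shadow, the moments where a simple AEB model
# (time-to-collision below a threshold) would have braked versus when
# the driver actually slammed the brakes. Mismatches in either direction
# are candidate training data.

TTC_THRESHOLD_S = 1.5   # brake if projected impact is under 1.5 seconds

def time_to_collision(distance_m, closing_speed_mps):
    """Seconds until impact at the current closing speed (inf if opening)."""
    if closing_speed_mps <= 0:
        return float("inf")
    return distance_m / closing_speed_mps

def aeb_shadow_events(sensor_log):
    """Return (t, aeb_would_brake, driver_braked) tuples for later upload."""
    events = []
    for rec in sensor_log:
        ttc = time_to_collision(rec["distance_m"], rec["closing_mps"])
        aeb_would_brake = ttc < TTC_THRESHOLD_S
        # A mismatch either way is interesting: AEB silent while the
        # driver slammed the brakes, or AEB firing when the driver didn't.
        if aeb_would_brake != rec["driver_braked"]:
            events.append((rec["t"], aeb_would_brake, rec["driver_braked"]))
    return events

sensor_log = [
    {"t": 0, "distance_m": 60.0, "closing_mps": 10.0, "driver_braked": False},
    {"t": 1, "distance_m": 12.0, "closing_mps": 10.0, "driver_braked": True},   # both brake
    {"t": 2, "distance_m": 40.0, "closing_mps": 5.0, "driver_braked": True},    # driver saw something AEB didn't
]
print(aeb_shadow_events(sensor_log))  # -> [(2, False, True)]
```

The logged event at t=2 - driver braking hard while the model saw no threat - is precisely the "AEB should have kicked in" case the post is asking about, and it can be collected without AEB ever acting.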
 
Obviously Tesla is working hard to improve AP. Elon has stated it is a top priority at Tesla and that the head of the AP group reports directly to him.

I don't doubt that AEB will be massively improved in the Autopilot 2.0 hardware, and that this kind of situation will be a priority when testing the next version - especially since the next version seems to be in testing right now, and the incident coincides with that testing.

My concern though is that with all of the focus on Tesla/AP, other manufacturers will be slow in addressing this issue. I'm also concerned by some of the poor performance that AEB systems have shown when tested - it kind of feels like manufacturers are feeding their customers a bunch of bull. In fact a lowly Subaru system tends to score the highest in comparison testing. Plus it can stop the car completely at up to a 31 mph speed differential, versus just cutting the speed down like the Tesla system.
 
"Edge cases" by definition are those use cases that are extremely difficult if not impossible to anticipate. In the real world there are so many bizarre and unpredictable circumstances that cannot be anticipated. But hindsight is 20/20.

Agree that the edge case of a white big rig cutting across the highway would be difficult to anticipate, but certainly Tesla is aware of drivers becoming so comfortable with Autopilot that they ignore the fine print and stop paying attention to the road.