Welcome to Tesla Motors Club
The next big milestone for FSD is v11. It is a significant upgrade, with fundamental changes to several parts of the FSD stack, including a totally new way of training the perception NN.

From AI day and Lex Fridman interview we have a good sense of what might be included.

- Object permanence both temporal and spatial
- Moving from “bag of points” to objects in NN
- Creating a 3D vector representation of the environment all in NN
- Planner optimization using NN / Monte Carlo Tree Search (MCTS)
- Change from processed images to “photon count” / raw image
- Change from single image perception to surround video
- Merging of city, highway and parking lot stacks a.k.a. Single Stack
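Tesla has not published its planner code, but the MCTS bullet above can be illustrated with a toy Monte Carlo planner: for each candidate first action, sample random action sequences, score them with a cost function, and keep the first action with the best average outcome. Everything here (the action set, the toy dynamics, and the cost model) is a made-up illustration, not Tesla's implementation.

```python
import random

# Toy Monte Carlo planner over a discretized action space, loosely
# illustrating MCTS-style planning. All values are illustrative.

ACTIONS = [-1.0, 0.0, 1.0]  # steer left, go straight, steer right (toy units)

def step(lane_offset, action):
    """Toy dynamics: the action nudges the car's lateral offset in the lane."""
    return lane_offset + 0.5 * action

def cost(lane_offset, action):
    """Penalize deviation from lane center and aggressive steering."""
    return lane_offset ** 2 + 0.1 * abs(action)

def rollout_value(lane_offset, first_action, depth=5, n_rollouts=64, rng=random):
    """Average cost of taking first_action, then random actions for depth-1 steps."""
    total = 0.0
    for _ in range(n_rollouts):
        offset = step(lane_offset, first_action)
        c = cost(offset, first_action)
        for _ in range(depth - 1):
            a = rng.choice(ACTIONS)
            offset = step(offset, a)
            c += cost(offset, a)
        total += c
    return total / n_rollouts

def plan(lane_offset, rng=random):
    """Pick the first action with the lowest average rollout cost."""
    return min(ACTIONS, key=lambda a: rollout_value(lane_offset, a, rng=rng))

# A car drifted right of lane center should steer back left (negative action).
print(plan(2.0, rng=random.Random(0)))
```

A real planner would replace the random rollouts with an NN policy guiding the tree expansion, which is roughly what AI Day described.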

Lex Fridman's interview of Elon, starting with the FSD-related topics.


Here is a detailed explanation of Beta 11 in layman's language by James Douma, an interview recorded after the Lex podcast.


Here is the AI Day explanation, in 4 parts.




Here is a useful blog post asking a few questions to Tesla about AI day. The useful part comes in comparison of Tesla's methods with Waymo and others (detailed papers linked).

 
It’s been getting progressively wonkier as each day passes. Somewhere, someone has to be adjusting parameters that are causing expected disengagements, for data. I’m convinced.
I am not sure I understand “causing disengagement for data” exactly, but interesting.

I have noticed environmental camera degradation where cabin humidity and temperature resulted in condensation inside the cabin pillars (Pillar glass fogs on the inside). Usually related to a cool morning after parking a wet car in the garage. High humidity in the garage gets inside the cabin. AC eventually clears this up as driving proceeds.

I now keep a dehumidifier running in the garage. No more fogging.

FWIW, recent comments about a breakthrough in FSD caused me to wonder if AI might be applied to map data in some novel way to minimize disengagements. YMMV
 
My car has been uploading around 50 GB of data every day that I drive, and almost nothing on days I don’t. Since the FSD software itself cannot change without an actual update, all I can think of is that map data is being tweaked every drive. HW4 follows a route it has taken many times, then makes subtle changes in how it approaches it: stopping sooner or later, creeping further, turning wider, selecting a different lane for no apparent reason, almost trying to get me to say “what are you doing, car?” and disengage. Then it sends that data, effectively giving Tesla data on the driver’s tolerances (which mistakes am I likely to ignore and put up with, and which ones are so flat-out wrong that I take over).

I just can’t imagine another reason for five seemingly random lane changes on a regular 2-lane road through town, selecting the wrong lane to proceed along the route where it has picked the correct one countless times before, and dozens of other odd behaviors along a route it performed flawlessly dozens of times on the same software update, same weather conditions, same time of day, and same traffic density; suddenly I find myself taking over 10 times. It’s almost like it’s “on purpose,” probing drivers’ discomfort points, presumably to reduce disengagements in the future.
 
Interesting, it sounds a bit like it is playing a game with you (now that vision has been solved) as a way to beat/learn to drive better than any human. Might work.

I wonder if it might be testing a pruned NN, to see whether it can manage with a simpler network. That might reduce latency in the end. Interesting stuff.
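Whether Tesla actually prunes its FSD nets is speculation, but pruning itself is a standard technique. A minimal sketch of magnitude pruning (zeroing the smallest-magnitude weights of a layer so the network becomes sparser and cheaper to run) looks like this:

```python
import numpy as np

# Toy magnitude pruning: zero out the smallest `sparsity` fraction of a
# layer's weights by absolute value. Generic technique, not Tesla's code.

def magnitude_prune(weights, sparsity):
    """Return a copy of weights with the smallest-magnitude fraction zeroed.

    Ties at the threshold may zero slightly more than the requested fraction.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    pruned = weights.copy()
    pruned[np.abs(weights) <= threshold] = 0.0
    return pruned

w = np.array([[0.9, -0.05, 0.3],
              [-0.01, 0.7, -0.2]])
print(magnitude_prune(w, 0.5))  # the three smallest-|w| entries become 0
```

In practice pruning is followed by fine-tuning to recover accuracy; the latency win only materializes if the hardware or runtime can exploit the sparsity.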
 
When I first installed 11.4.4, FSD performed the correct left turn onto the freeway ramp that the previous versions failed to make. Now 11.4.4 constantly gets it wrong again, like the previous versions (it moves to the right lane and misses the left-turn lane instead of staying in the leftmost lane to prepare for the left turn onto the ramp). Not sure how that happens.
 
I think if Tesla wants to train FSD to correct a bad lane selection or a bad turn, they should give the user a button to train FSD.
When the user presses the train button, they train FSD by making a perfect drive on the part of the route where FSD fails the maneuver. Tesla then uses the data recorded during the training period to fix the bug the user reported.
 
This
 
I mean, we all agreed to be in control at all times, so Tesla gets all the data and none of the liability.
 
Good morning Tesla owners in USA, Canada & rest of the world, outside of Europe.
As you may know, there is only an enormously crippled version of FSD here, due to EU regulations.
Tesla owners here want the same version as the USA and Canada, and to get this done a petition to the European Parliament needs 15,000 signatures.
So let’s go global: please help the Tesla owners with FSD in Europe and sign this petition. We are almost there.
Thank you all in advance!
 
Out of curiosity, in what ways is it different? Did they start with zero training?
 
No, it is the training. It had to be dialed back to comply with the new UN/ECE R79 regulation on driver assistance systems. For example, it has to complete a lane change within x seconds or abort, and it cannot exceed 3 m/s² of lateral acceleration in a turn/curve (such that it either has to slow down more than people normally do, or it has to disengage because it isn't allowed to complete the turn).

Or at least that is my understanding.
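Taking the 3 m/s² figure above at face value, lateral acceleration on a curve is a_lat = v²/r, so the cap translates into a maximum curve speed of v = √(3·r). A quick back-of-envelope check (the 3 m/s² value is the assumption from the post, not a verified regulatory number):

```python
import math

# Maximum speed through a curve under a lateral-acceleration cap,
# using a_lat = v^2 / r. A_MAX is the assumed 3 m/s^2 limit from above.

A_MAX = 3.0  # m/s^2, assumed lateral acceleration cap

def max_speed_kmh(radius_m):
    """Highest speed (km/h) keeping v^2/r at or below A_MAX on a curve of given radius."""
    return math.sqrt(A_MAX * radius_m) * 3.6  # m/s -> km/h

for r in (50, 100, 250):
    print(f"radius {r:>3} m -> {max_speed_kmh(r):.0f} km/h")
```

So a fairly tight 50 m curve would be limited to roughly 44 km/h, which is noticeably slower than many human drivers take such curves, consistent with the "slow down more than normal" behavior described.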
 