
Autopilot is already improving.

If Tesla auto-steer is making such amazing leaps and bounds after a week in production, it points to the notion that 6.2 software was NOT previously collecting AP sensor data (or at least not in the same way). The question I have is: Why wouldn't Tesla have simply placed the learning algorithms into 6.2 SW on a passive basis and started 7.0 off with all that collected data, rather than risking the week of 'close encounters' that has been documented on forums, YouTube, etc?
 
I was wondering the same thing myself. My only thought is that without driver interaction information (e.g., overriding turning off at an exit), the data is somewhat less valuable. 6.2 data wouldn't tell you when the autopilot was not working correctly. But there is still likely some value.
 
If Tesla auto-steer is making such amazing leaps and bounds after a week in production, it points to the notion that 6.2 software was NOT previously collecting AP sensor data (or at least not in the same way). The question I have is: Why wouldn't Tesla have simply placed the learning algorithms into 6.2 SW on a passive basis and started 7.0 off with all that collected data, rather than risking the week of 'close encounters' that has been documented on forums, YouTube, etc?

They are collecting data on when the drivers have corrected (changed) what the auto-steer wanted to do. They are not collecting information about the paths people take, rather only when it differs from what the auto-steering thought they were taking. That wouldn't have been possible in v6 because there was no auto-steer.
 
The AP could certainly have been projecting a course based on sensor input, but not actually acting on it, and then reporting back when user action differed from the projection - but it seems they chose not to do that.

Hopefully they'll file that under "lessons learned" at the post-implementation review meetings.
 
They are collecting data on when the drivers have corrected (changed) what the auto-steer wanted to do. They are not collecting information about the paths people take, rather only when it differs from what the auto-steering thought they were taking. That wouldn't have been possible in v6 because there was no auto-steer.

I don't see what difference it makes. Consider this scenario: If there's a highway exit to the right that autosteer would have taken since presumably without other data, it defaults to relying on the right lane marking, and subsequently a few hundred v6.2 users follow the main highway (to the left) by say a 30:1 ratio, then Tesla could easily mark the junction to override default logic and follow the left lane markings. All that would be required is that the autosteer algorithms were passively enabled and tracking lane markings in v6.2. They shouldn't have needed to wait for someone with v7.0 to actively correct auto-steer at this junction.
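The override this post describes could be sketched as a simple count-based aggregation: log which lane marking drivers actually followed at a junction, and flip the default once the ratio is lopsided. This is a purely hypothetical illustration (function names, threshold, and data format are all invented, not anything Tesla has described):

```python
from collections import Counter

# Hypothetical sketch: passively log which lane marking drivers followed
# at a junction, then override the default once the ratio is lopsided.
def choose_marker(observations, default="right", ratio_threshold=10.0):
    """observations: list of 'left'/'right' choices logged passively."""
    counts = Counter(observations)
    other = "left" if default == "right" else "right"
    if counts[default] == 0:
        return other if counts[other] > 0 else default
    if counts[other] / counts[default] >= ratio_threshold:
        return other
    return default

# A 30:1 ratio of drivers following the left (main-highway) marking:
obs = ["left"] * 300 + ["right"] * 10
print(choose_marker(obs))  # -> left
```

Nothing here requires an active auto-steer; the observations could come from a passive lane-tracking pass, which is exactly the poster's point.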
 
If Tesla auto-steer is making such amazing leaps and bounds after a week in production, it points to the notion that 6.2 software was NOT previously collecting AP sensor data (or at least not in the same way). The question I have is: Why wouldn't Tesla have simply placed the learning algorithms into 6.2 SW on a passive basis and started 7.0 off with all that collected data, rather than risking the week of 'close encounters' that has been documented on forums, YouTube, etc?

They did do exactly this with the most safety-critical aspect of the system— automatic emergency braking. Those of us at TMC Connect 2015 heard a full presentation by the engineering manager of the group that developed that aspect. He explained that they collected a lot of sensor and trial decision algorithm data on real cars in the field before actually turning on the braking.

Could they have done this with auto-steer? Perhaps, but a bit less clear, as others here have mentioned.
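The AEB approach described above is what is often called "shadow mode": the trial algorithm runs on live sensor data and its decisions are logged, but it never touches the actuators. A minimal sketch of the idea, with the frame format, policy, and all names invented for illustration:

```python
# Hypothetical "shadow mode" sketch: the trial policy runs on live sensor
# frames and its decisions are logged, but only the driver's action is applied.
def shadow_step(frame, trial_policy, log):
    proposed = trial_policy(frame)          # what the system *would* do
    actual = frame["driver_action"]         # what the human actually did
    if proposed != actual:
        log.append({"frame": frame["id"], "proposed": proposed, "driver": actual})
    return actual                           # the actuators only ever see this

policy = lambda f: "brake" if f["closing_fast"] else "coast"
frames = [
    {"id": 1, "closing_fast": True,  "driver_action": "brake"},
    {"id": 2, "closing_fast": True,  "driver_action": "coast"},  # disagreement
    {"id": 3, "closing_fast": False, "driver_action": "coast"},
]
log = []
for f in frames:
    shadow_step(f, policy, log)
print(len(log))  # -> 1 disagreement flagged for engineers to review
```

The same pattern would in principle apply to auto-steer: project a steering course, compare it against what the driver did, and upload only the disagreements.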
 
But wouldn't Google just use data from Android phones that are in cars, where the phones have GPS? This would give a de facto large data set already, although a bit dirty...
Even my 2001 S430 map system uses tire rotation to make a more accurate determination of position - it augments the GPS data. On Bjorn's channel Morgan discusses tunnel tracking where GPS signal is entirely unavailable (Tesla estimating its position in tunnels - YouTube). Which makes one think about the higher precision mapping Tesla is doing - could it not be that they are using the various onboard sensors to augment the GPS data?
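Augmenting GPS with wheel rotation is classic dead reckoning. A toy sketch of the idea (units, update rate, and tire circumference are invented; a real system would also fuse gyro and steering-angle data, and the estimate drifts with distance):

```python
import math

# Toy dead-reckoning sketch: with no GPS fix (e.g., in a tunnel), advance the
# position estimate from wheel-rotation distance and the current heading.
def dead_reckon(pos, heading_rad, wheel_revs, tire_circumference_m):
    dist = wheel_revs * tire_circumference_m
    x, y = pos
    return (x + dist * math.cos(heading_rad), y + dist * math.sin(heading_rad))

pos = (0.0, 0.0)               # last good GPS fix, heading held constant
for _ in range(100):           # 100 update ticks without GPS
    pos = dead_reckon(pos, 0.0, 5, 2.0)   # 5 revs/tick, 2 m tire circumference
print(pos)  # about 1 km past the last fix
```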
 
I just went for a longish drive on a lightly traveled freeway last night and I'm not seeing much improvement. If I'm in the right lane, AP will reliably take the exit unless there's a stripe across it (lots of the exits around here don't have the stripe), and keeping it on the road turns off AP. It hunts around in the lane, often going for one side or the other, and once in a while actually crossing the lane line. It seems to me that when the lane is straight and it's got good lines detected, it should go right down the middle, but it doesn't. It doesn't seem to worry much about cars in adjacent lanes drifting into my lane until the blind spot detector is actually going off. The non-blind-spot side detector is very chattery about walls and Jersey barriers that are well out of my lane, but says nothing about giant trucks that are encroaching. AP also seems to like to follow cars that are exiting.

Several of the obvious places I've wanted autopark, it doesn't give me the option: against a wall, pulling into a Supercharger, pulling into my own garage. I haven't been able to make it work. When autopark is happy, though, it works very well indeed.

--Snortybartfast
 
Generally, instantaneous locations from isolated GPS devices can be inaccurate by 20 to 50 feet,
and occasionally much more, but sometimes less. Usually, they are not sufficient to determine
if you are traveling on the edge lane of a freeway, or on a frontage road, or to determine which of
4 lanes you are traveling in. Could be the slow lane, the fast lane, or perhaps an HOV lane.

Sometimes I want to pass an exit, and other times I want to take the exit.
Usually, I would turn off the AP before beginning to slow for the exit.
So, usually, leaving AP turned On means that I want to follow the highway.

Presently, there is no good way to indicate that I want to follow left or
follow right when a lane "splits". And no way to slow a bit for a curve
without using the AP lever.
 
One question I've had on the "leaving freeway exit ramps unintentionally" issue is wouldn't this have been easy to program out of the system using gps data? Surely Elon Musk and the rest of the team realized the cars would follow exit ramps. Couldn't they have programmed the cars at the launch to remain on the freeway unless the driver used the turn signal to indicate that a freeway exit was desired?
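The suggested rule is easy to sketch. This is purely an illustration of the poster's proposal (all names and the steering-bias abstraction are invented), not a description of how Autopilot actually works:

```python
# Purely illustrative sketch of the proposal above: near a mapped exit, follow
# the ramp only if the driver signaled toward it; otherwise bias toward the
# through lane.
def steer_bias(near_exit, exit_side, turn_signal=None):
    if not near_exit:
        return "center"
    if turn_signal == exit_side:
        return exit_side                                   # driver wants the exit
    return "left" if exit_side == "right" else "right"     # hug the through-lane side

print(steer_bias(True, "right"))           # -> left (stay on the freeway)
print(steer_bias(True, "right", "right"))  # -> right (take the exit)
```

The hard part in practice is the "near a mapped exit" condition, which needs map data accurate enough to know an exit is coming, something GPS alone may not reliably provide (as a later post in this thread points out).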

- - - Updated - - -

Presently, there is no good way to indicate that I want to follow left or
follow right when a lane "splits". And no way to slow a bit for a curve
without using the AP lever.

I'm surprised the autopilot wasn't programmed to look at map data ahead of arriving at a curve, calculate if the current velocity was too much for that curve, and then slow down. Even the primitive cruise control in my 2004 Infiniti QX56 SUV slows down when the road curves and speeds up again when it straightens out.
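The physics behind curve slowdown is simple: for a curve of radius r, lateral acceleration is v²/r, so a comfort limit a_max implies a speed cap of √(a_max·r). A quick back-of-the-envelope calculator; the 2.5 m/s² threshold (~0.25 g) is an assumed comfort figure, not anything published by Tesla:

```python
import math

# Back-of-the-envelope curve-speed cap: keep lateral acceleration v^2/r
# under a comfort threshold. The 2.5 m/s^2 figure (~0.25 g) is an assumption.
def max_curve_speed_mph(radius_m, a_max_ms2=2.5):
    v_ms = math.sqrt(a_max_ms2 * radius_m)
    return v_ms * 2.23694    # m/s -> mph

print(round(max_curve_speed_mph(150)))   # a 150 m radius curve -> ~43 mph
```

Given map data with curve radii, computing this ahead of the curve is straightforward; the real difficulty is having map geometry accurate enough to trust.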
 
It would be interesting to test the same drive with different Teslas to see whether the whole fleet is learning at the same rate, or whether there's both car specific and fleet level learning going on. For example, if you've never driven a route before in your Tesla, will your Tesla have gained the same model of the highways in the route as a different Tesla that has? Or a lessened version that includes conclusions that have been identified as universally true ("nobody follows this right-hand lane marker and stays on the highway") but not all adaptations?
 
Yes, but even assuming what's on the Web site is true, that doesn't necessarily imply improvements over the span of a few days ("continually" is not the same as "continuously"). Short of an official announcement with more details, we don't know one way or the other. We're all just guessing.

I think a lot of people are vastly underestimating just how much faster computers do things than human brains. Your experience is telling you that a week is pretty damn fast and doesn't seem likely, but if a computer sees the same thing 7 times in a row, and other drivers have done the same (don't forget you might not be the only one training that particular exit ramp area), that's more than enough time for it to learn new behavior. It may even be that in addition to the global fleet learning, there is local per-car learning, similar to how it learns about daily commute times and heats the car up, etc. It's likely a combination of both. Heck, it may even have a different set of learning for each driver profile.
The big fear about AI is that because of how fast computers do things, it would learn things too quickly to even keep track of or control.
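Count-based "seven times in a row" learning like the post above describes could look something like this sketch. Everything here (the class, threshold, action names, and location keys) is invented for illustration:

```python
from collections import defaultdict

# Hypothetical count-based learner: after enough identical corrections at one
# location, switch the stored behavior for that spot.
class SpotLearner:
    def __init__(self, threshold=7):
        self.threshold = threshold
        self.corrections = defaultdict(int)
        self.behavior = {}

    def record_correction(self, spot, new_action):
        self.corrections[(spot, new_action)] += 1
        if self.corrections[(spot, new_action)] >= self.threshold:
            self.behavior[spot] = new_action

    def action(self, spot, default="follow_right_marker"):
        return self.behavior.get(spot, default)

learner = SpotLearner()
for _ in range(7):   # seven drivers correct the same exit-ramp spot
    learner.record_correction("exit_42", "follow_left_marker")
print(learner.action("exit_42"))  # -> follow_left_marker
```

Whether the fleet aggregates per car, per driver profile, or globally, the mechanism is the same; only the scope of the correction counts changes.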
 
What is it learning that will help me take the exit sometimes, and
drive on by the exit on other occasions?

In the big scheme of things to come with AP,
how is it intended that I will indicate "stay left", or "stay right" when
a lane widens, then becomes two lanes, that then actually split?

How is it intended that I tell the car to slow down for some
approaching situation, curve, construction, vehicles or
workers on the side of the road, etc?
 
What is it learning that will help me take the exit sometimes, and
drive on by the exit on other occasions?

In the big scheme of things to come with AP,
how is it intended that I will indicate "stay left", or "stay right" when
a lane widens, then becomes two lanes, that then actually split?

How is it intended that I tell the car to slow down for some
approaching situation, curve, construction, vehicles or
workers on the side of the road, etc?


Turn signal?
 
What is it learning that will help me take the exit sometimes, and
drive on by the exit on other occasions?

Currently AP is intended only for use on the highway. So in my view, it should currently always intend to stay on the highway. If the trainer (driver) is keeping the car on the highway most times, it should be learning to prioritize the trajectory and left lane marker as well as anything else that keeps it on the highway.

In the big scheme of things to come with AP,
how is it intended that I will indicate "stay left", or "stay right" when
a lane widens, then becomes two lanes, that then actually split?
How is it intended that I tell the car to slow down for some
approaching situation, curve, construction, vehicles or
workers on the side of the road, etc?

In the big scheme of things, you won't. The car will choose and anticipate. :smile:
 
I thought it was improving, but I also thought that might all be in my head, so I didn't want to mention it.

My commute involves some three-lane roads, a divided boulevard and a multi-lane state road with many businesses and traffic lights. AP is definitely not diving into turn lanes like it used to, and it's getting better about getting through intersections where there aren't any lane markings, even if there isn't a car in front of me. It still struggles with intersections that are on curves, as simply going straight for 50 feet will cause it to leave its lane, and then I get the panic beep.

I did notice once that I got a disconnect just as Slacker Radio was about to load the next song. Slacker became practically useless for the next 60-90 seconds, so I think autopilot uploaded a bunch of data and saturated the 3G network connection.

I really wish we had a better understanding of how the "learning" is supposed to work. I taught my son to drive, so I could probably teach this thing, too, if I knew how.
 
What is it learning that will help me take the exit sometimes, and
drive on by the exit on other occasions?

In the big scheme of things to come with AP,
how is it intended that I will indicate "stay left", or "stay right" when
a lane widens, then becomes two lanes, that then actually split?

How is it intended that I tell the car to slow down for some
approaching situation, curve, construction, vehicles or
workers on the side of the road, etc?

The car isn't supposed to take the exit, ever. If the lane splits, the car is supposed to keep with the current highway. That is what it's currently learning to do. If you want to take the exit, you get into an exit-only lane or you take over.

If you want the car to slow down, you press down on the stalk to decrease speed. Better yet, if a situation is approaching, you tap the brake pedal and take over.

Seriously, it's just lane-keeping; it's not magic. The type of intelligence you're looking for is years down the road.
 
Better yet, if a situation is approaching, you tap the brake pedal and take over.

^^^ This.

If there's a situation that I consider to be unusual (especially if it involves people nearby), my first reaction should be to take over manual control and just drive the car, not try to figure out how to tell the AP to do what I think it should do.
 
One question I've had on the "leaving freeway exit ramps unintentionally" issue is wouldn't this have been easy to program out of the system using gps data? Surely Elon Musk and the rest of the team realized the cars would follow exit ramps. Couldn't they have programmed the cars at the launch to remain on the freeway unless the driver used the turn signal to indicate that a freeway exit was desired?

- - - Updated - - -



I'm surprised the autopilot wasn't programmed to look at map data ahead of arriving at a curve, calculate if the current velocity was too much for that curve, and then slow down. Even the primitive cruise control in my 2004 Infiniti QX56 SUV slows down when the road curves and speeds up again when it straightens out.
It does slow down for curves.
I've driven a few winding mountain roads with AP and it will slow down for curves. On one recent AP trip I had the speed set at 60 and it took the curve at 35 (it could have gone a bit faster IMO).