Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Elon: "Feature complete for full self driving this year"

I think what this traffic control feature really demonstrates is that Tesla understands what's necessary for level 5 FSD. A lot of us questioned for the last 2-3 years whether Tesla had a good grasp of the factors involved in a level 5 system. It's true that there are still many factors unaccounted for in the public release, but it seems there's a lot more under the hood than we know.
 
I've also noticed light detection in difficult situations (e.g. a car in front obstructing the view, or around bends).

That's because it's at least in part using maps, not cameras for those.

Also why it has trouble handling things like double stop-signs (like in the video that's been posted a few times where it skips the first stop sign, trying to drive right through the intersection to the second one)
 
Came across a rather informative post on Reddit, where the OP did a series of tests to establish how/why the Traffic control feature works like it does: AutoPilot’s visual Stop Control is already quite impressive and learning fast : teslamotors
Thanks. In my quick testing today, the "confirm green light" behavior makes complete sense on AutoPilot without AutoSteer for an unprotected left turn. Here AutoPilot really isn't designed for turning on city streets yet, so waiting to accelerate even on a green is the correct behavior. This also happens to be the correct behavior for making a right turn at a green traffic light as you want the car to slow down anyway.

And related, I can now see why the behavior works without AutoSteer -- AutoPilot also currently isn't designed for city streets with parked cars especially on residential streets without lane markings. Now being able to go through the whole neighborhood with AutoPilot active the whole time is kinda neat.
 
...and this is why it is in BETA and Tesla warns repeatedly that this is not self driving and does not replace due diligence by the driver. It's learning and I am thrilled to be a part of the process.

I think Tesla was crystal clear about what to expect with this feature: namely, that occasionally there will be false positives or false negatives. I look forward to trying it out.

But, this feature in itself isn't what the concern is.

The concern was there is a stacking of features that can all cause false braking.

AEB can cause pretty severe false braking.
TACC itself can have false braking due to thinking that a semi is in your lane when it's not.
AP can cause false braking.
Incorrect speed limits for the road it's on can cause false braking if it thinks you're on a different kind of road than you're on.
NoA is probably the worst offender when it comes to false braking. Especially when the maps aren't up to date.

That's a lot of stuff to put onto a driver, and stuff that will increase the possibility of a rear-end crash. Normally crashes are a combination of things: false braking in itself doesn't cause the accident, it simply contributes to it, especially if the driver doesn't respond quickly.
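The stacking concern can be made concrete: if each feature independently contributes its own small chance of a false-brake event per drive, the combined chance grows quickly. A toy calculation, with every probability invented purely for illustration:

```python
# Toy illustration: combined chance of at least one false-braking event
# per drive when several independent features each contribute their own
# risk. All per-drive probabilities below are made up for the example.
false_brake_risk = {
    "AEB": 0.001,
    "TACC": 0.005,
    "AP lane keeping": 0.002,
    "Speed-limit map errors": 0.003,
    "NoA (stale maps)": 0.010,
    "Traffic light/stop control": 0.008,
}

p_none = 1.0
for p in false_brake_risk.values():
    p_none *= 1.0 - p  # chance this particular feature does NOT false-brake

p_any = 1.0 - p_none
print(f"Chance of at least one false-brake event per drive: {p_any:.1%}")
```

Even though each invented risk is well under 1%, the combined chance of some false-brake event per drive approaches 3% here, which is the "stacking" the post describes.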

Now we have this feature that can cause false braking.

To those of us who understand how these things work, it's not a big deal. If I encounter a lot of false positives with this latest feature, I'll likely just turn it off.

There might be some issues with trying to figure out what caused the braking. Most of us are looking not at our screens but at the road ahead. I wish there were a notification log, or a way of bringing up notification data in the dashcam playback.
 
This traffic sign control feature is super impressive, considering all the possible false positives and variables involved in stopping comfortably at an appropriate position. I've also noticed light detection in difficult situations (e.g. a car in front obstructing the view, or around bends).

This new feature is much more impressive than smart summon IMO and works better than I expected.

So you no longer think it's an accident waiting to happen?
 
Just tried it for the first time. What blew my mind more than anything else was the system's ability to know there is a stop sign ahead even when there is absolutely no visual cue that it is there. This happened 4 different times at signs that, from driving this route all the time, I knew were coming but are not visible until you are right on top of them (around a curve or over a rise). On each occasion it warned me of the upcoming sign well ahead of any visual reference. It must be using the mapping software to predict where signs might be. Amazing stuff.

Dan
 
Just tried it for the first time. What blew my mind more than anything else was the system's ability to know there is a stop sign ahead even when there is absolutely no visual cue that it is there. This happened 4 different times at signs that, from driving this route all the time, I knew were coming but are not visible until you are right on top of them (around a curve or over a rise). On each occasion it warned me of the upcoming sign well ahead of any visual reference. It must be using the mapping software to predict where signs might be. Amazing stuff.

Dan

It is using map data. Why do you think I tout the benefits of HD Maps? ;)

Now, I know Tesla does not use HD maps, but they do use map data, which provides some of the same information as HD maps. When you use map data, you can get info even when there are no visual cues. So, yeah, it's pretty important.
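As a rough sketch of how a map prior can produce a warning before any visual cue exists, consider the logic below. All names, thresholds, and the confirmation split are hypothetical, invented for illustration, and not Tesla's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class MapFeature:
    kind: str            # e.g. "stop_sign", "traffic_light"
    distance_m: float    # distance ahead along the planned route

WARN_DISTANCE_M = 150.0  # hypothetical early-warning threshold

def upcoming_warnings(route_features, camera_confirms):
    """Warn from the map prior alone; only mark 'confirmed' once vision agrees."""
    warnings = []
    for f in route_features:
        if f.distance_m <= WARN_DISTANCE_M:
            state = "confirmed" if camera_confirms(f) else "map-only"
            warnings.append((f.kind, round(f.distance_m), state))
    return warnings

# A stop sign hidden around a bend: the map knows, the camera doesn't yet.
route = [MapFeature("stop_sign", 120.0), MapFeature("traffic_light", 600.0)]
print(upcoming_warnings(route, camera_confirms=lambda f: False))
# -> [('stop_sign', 120, 'map-only')]
```

The point of splitting "map-only" from "confirmed" is exactly the behavior described in the thread: the car can warn and slow down on the map prior, then finalize the stop once the sign actually comes into view.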
 
It is using map data. Why do you think I tout the benefits of HD Maps? ;)
I'm not sure it has to be high definition necessarily. Perhaps it uses previous information from the fleet that there was a stop sign there the last time a Tesla drove that route so my car "expects" to see one there, gives the notification, slows down but then confirms visually when within visual contact. Whatever it is, it's flippin' amazing. And this is just the initial version! It's only going to get better from here.

Dan
 
I'm not sure it has to be high definition necessarily. Perhaps it uses previous information from the fleet that there was a stop sign there the last time a Tesla drove that route so my car "expects" to see one there, gives the notification, slows down but then confirms visually when within visual contact. Whatever it is, it's flippin' amazing. And this is just the initial version! It's only going to get better from here.

Dan

As far as I know, it is just using pre-loaded map data. I don't think the fleet is sharing map data in real-time or anything.

No, it does not need to be HD although autonomous driving companies like Waymo use HD because they believe the cm accuracy is important.

Yes, it is amazing. But I think that is mostly because most folks have never been in an autonomous or even semi-autonomous car in their lives. So for us, a feature like stopping at red lights is something we've never seen before, and it is definitely amazing to us! But the truth is that the tech is pretty common among autonomous cars. Every autonomous car from Waymo to Cruise to Mobileye has been using camera vision and HD maps for years to do traffic light response.

On a side note, it's a big reason why I bought a Tesla. I knew that some automakers had features like Super Cruise, but only on limited models, and Waymo has awesome L4 autonomous driving, but it is not available to the public yet. Only Tesla's AP would offer me expanding features like traffic light response and more for free in updates. And even if the features are not autonomous yet, I would be getting cool new AP features on a regular basis, and AP would get better over time.
 
As for stop lights and signs around corners, I wonder if Tesla is also using the signs that indicate a stop sign or light is coming up.

As for whether I still think this feature release is risky, I do but less so. It's at the threshold of being releasable. Apple, for example, would never release something like this.
 
By the way, there is a stretch of 4-lane road, non-highway, outside the city limits, where I can do auto lane changes with the stalk. But there is one stretch where it fails every time. It starts to do the auto lane change and then, halfway over, the car changes its mind and pushes me back into the right lane. The road looks pretty normal. The lane markings are clear. But it is slightly hilly and there is a discolored patch of asphalt on the road. Maybe that is throwing the camera vision off. I can't do auto lane changes on that part, since I know it would not be safe to other drivers to have my car abort a lane change like that. I look forward to "city NOA" where hopefully that gets fixed.
 
As far as I know, it is just using pre-loaded map data. I don't think the fleet is sharing map data in real-time or anything.

At least one anecdote showing that it may be incorporating fleet data in real-time: 2020.12.5.6: Traffic Light & Stop Sign Control

If anyone has a spare stop sign, it may be worth placing it at the entrance to your driveway, driving past it a few times, and seeing if the car remembers it later.
 
I can't come up with any reason you'd break things out that way OTHER than being able to recognize some FSD revenue sooner when you hit 1, but not 2.
You don't need to break up things on the website to be able to recognize partial revenue.

When it comes to rev recognition, as I wrote earlier, it's between the finance department and the auditors. Apart from GAAP, they would use reasonableness to figure out what is OK. Responding to traffic lights/stop signs is an important step, so some partial rev recognition is in order. Tesla would have to give reasons for why they are recognizing x% instead of y%, and the auditors have to buy that explanation.

In practice I'd guess finance would ask engineering what % of FSD they think is done, and use that. Then there is the question of how many cars out of the total have actually got the software download.

I had assumed 150M in revenue in Q2 & Q3 - and 100M in Q4. This was assuming City NOA is delivered in Q2. I'd have to reduce this quite a bit if sign recognition and response is all that Tesla delivers.
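As a back-of-the-envelope sketch of how that partial recognition might be computed (the deferred balance, completion percentage, and fleet-update fraction below are all invented for illustration, not Tesla's actual accounting):

```python
def recognizable_fsd_revenue(deferred_total_m, pct_feature_complete, pct_fleet_updated):
    """Toy model of partial revenue recognition, in $M: the deferred FSD
    balance, scaled by how much of the promised functionality shipped and
    by how much of the fleet actually received the download."""
    return deferred_total_m * pct_feature_complete * pct_fleet_updated

# Hypothetical numbers: $600M deferred FSD revenue, engineering calls the
# traffic-control release 25% of the remaining FSD scope, and 80% of
# eligible cars have the update installed.
print(recognizable_fsd_revenue(600, 0.25, 0.80))  # 120.0
```

This is exactly why the engineering estimate matters: halving the "% complete" figure halves the revenue the auditors will sign off on for the quarter.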
 
You don't need to break up things on the website to be able to recognize partial revenue.

Then why would they list that as its own feature at all- given it'd be a mandatory part of the other one?



I had assumed 150M in revenue in Q2 & Q3 - and 100M in Q4. This was assuming City NOA is delivered in Q2. I'd have to reduce this quite a bit if sign recognition and response is all that Tesla delivers.

So funny story,

They're not even promising anything like City NOA anymore.

Now if you buy FSD today it's just "autosteer" in the city.
 
Then why would they list that as its own feature at all- given it'd be a mandatory part of the other one?
Because marketing thinks it will sell ?

Anyway, since your original explanation is not correct, it's for you to defend your position. Not for me to supply alternate theories ;)

So funny story,

They're not even promising anything like City NOA anymore.

Now if you buy FSD today it's just "autosteer" in the city.
Musk has long gone silent on City NOA. A sure sign that things are moving slower than (he) expected.
 
Because marketing thinks it will sell ?

Anyway, since your original explanation is not correct, it's for you to defend your position. Not for me to supply alternate theories ;)

You "saying" it's not correct doesn't make it true.

I'm waiting on you to support your claim :)


You said "When it comes to rev recognition - as I wrote earlier, its between the finance department & the auditors. Apart from GAAP, they would use reasonableness to figure out what is ok"

But you then failed to explain why "we delivered an entire, specifically listed future feature" is not a criterion for reasonableness that would let them recognize a larger % of revenue more easily, or more fully, than if they had listed "automatic city driving" alone and then made up a % of it that stoplights earn them.
 
Although traffic control does seem to be very much a work-in-progress, it is very impressive to watch it stopping for a hand-held stop-sign: Fred Hassen on Twitter

The flip side of this -- earlier this week I drove through a construction zone with a worker holding a stop sign "down" and FSD correctly interpreted that as "don't stop" and kept driving.

Karpathy mentioned this example as one of the challenges in reading stop signs during his recent talk and it was cool to watch it work in real life.
 
The flip side of this -- earlier this week I drove through a construction zone with a worker holding a stop sign "down" and FSD correctly interpreted that as "don't stop" and kept driving.

Karpathy mentioned this example as one of the challenges in reading stop signs during his recent talk and it was cool to watch it work in real life.

Exactly the reason why no other company will be able to collect this type of data without a large fleet.