Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
There are two separate issues here:
1) FSD or autonomous driving - a year or two away - still has a driver in a pilot/copilot arrangement, and could save some of the 40,000 lives lost on US roads each year. This is low-hanging fruit that is within reach.

2) Driverless cars. This is the pursuit of Uber and Lyft etc., who want to save the cost of the driver. This is many more years away (10?) and only after (1) has been successfully accomplished.

So let's keep this distinction in mind, and not confuse these two very different outcomes.

IMO FSD, full self-driving, is a driverless car, which needs no intervention and no pilot.

But of course there are many different levels of autonomy: from level 3, where the car starts to become autonomous, making its own decisions and taking responsibility in very limited cases; through level 4, where the car can eventually drive without a driver in geofenced locations; up to level 5, where the car is as good as a human driver and can be thrown into every situation possible.

IMO Elon is promising level 5 for every car with at least AP 2.0. But that's very far into the future. Level 4 in a well-mapped environment, which Lyft, GM Cruise and Google are attempting to do, will come a lot sooner.

For those who did not see the Reddit post, here is the post in question: [Rumor/Speculation: Tesla has completed the Coast to Coast FSD trip with 30 human interventions]. Screen capture below.

To me, 30 disengagements on a coast-to-coast drive seems pretty impressive.
As long as we don't know how long the disengagements lasted, nor the situations they occurred in, there is still a lot that's unknown.

Hope they get this win soon.

[Attachment: screen capture of the Reddit post]



30 disengagements over 3,000 miles is a good improvement over the 2016 disengagement report, but in 2016 they were doing suburban driving, which should be harder. Waymo had 0.2 disengagements per 1,000 miles in 2016, and mostly on urban roads. But Tesla is moving in the right direction.
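To make the comparison concrete, here's a quick sanity check of the numbers quoted in this thread. Note the 30-intervention / 3,000-mile figures come from the rumor, not from any official report, so treat the result as illustrative.

```python
# Normalize both figures to the per-1,000-mile rate used in the
# California DMV disengagement reports, so they're comparable.

def per_1000_miles(disengagements, miles):
    """Disengagement rate per 1,000 miles driven."""
    return disengagements / (miles / 1000)

tesla_rumor = per_1000_miles(30, 3000)  # rumored coast-to-coast run
waymo_2016 = 0.2                        # per 1,000 mi, 2016 report

print(f"Tesla (rumored): {tesla_rumor} per 1,000 mi")   # 10.0
print(f"Ratio vs Waymo:  {tesla_rumor / waymo_2016:.0f}x")  # 50x
```

So even if the rumor is accurate, the rate would still be roughly 50 times Waymo's 2016 figure, though on a different mix of roads.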



I personally just hope this isn't some hard-coded promo event, where they show something that won't arrive in customer vehicles soon. Because if people buy FSD because of such an event and then don't get what was shown relatively soon, many will feel cheated. This already happened to many Model S and X owners who upgraded or bought new after the first FSD video, and it will be even more amplified if it happens again.
 
IMO FSD, full self-driving, is a driverless car, which needs no intervention and no pilot.

Originally they kind of pitched it like that, but since then they changed the wording to say: "The system is designed to be able to conduct short and long distance trips with no action required by the person in the driver’s seat."
 
I, for one, still have faith they'll get it done while I own my Model 3. Bonus: I won't have to buy a new car to get it either. They can prove me wrong, but keep in mind that they've been saying the same thing for the last 2 years and haven't changed their advertisement or stopped selling the feature. My conclusion: it's coming in 3 months, definitely 6! :)
 
Originally they kind of pitched it like that, but since then they changed the wording to say: "The system is designed to be able to conduct short and long distance trips with no action required by the person in the driver’s seat."
Interesting, they need to pay us back some costs then. Or "an agreement is only there to be broken" (old Norwegian saying).
 
Originally they kind of pitched it like that, but since then they changed the wording to say: "The system is designed to be able to conduct short and long distance trips with no action required by the person in the driver’s seat."

Kind of pitched it like that? Elon said the car could find a parking space on its own and even drive across the country for you, with no driver or person in the car needed.

Good job people saved screenshots so they can be held to what they actually sold!
I saved my screenshot 5 min ago.
[Attachment: screenshot, 2018-01-22 21:59]
 
Noticed another thing this evening on a 200 km drive. It's still winter, with partly poor lane markings.
The car seems to drive MUCH better on AP when trailing another car. After passing it, the swerving act begins. Back behind one again, it settles down. Anyone else noticed?
Should we demand that the (late) FSD launch is done without any lead cars? Or is that unfair?
 
Noticed another thing this evening on a 200 km drive. It's still winter, with partly poor lane markings.
The car seems to drive MUCH better on AP when trailing another car. After passing it, the swerving act begins. Back behind one again, it settles down. Anyone else noticed?
Should we demand that the (late) FSD launch is done without any lead cars? Or is that unfair?

Yes, if it has a car it can follow, it's a lot more stable.

I would consider it a cheat, if they had a dedicated car(s) they would follow.

The hard part would not be the highway driving; that is doable today if you cleverly avoid construction zones. The charging stops would be the hard part, even more so if the trip includes city driving, which it probably won't.

So if there is always a car in front at the charging stops, construction zones, or parts of the road with bad lane markings, some could argue that they are using lead cars to cheat. But it's unavoidable that there will be cars in front of the test car, so it's hard to prove anything.
 
It's Monday morning, I'll be an optimist for a little bit:

One nice thing is that, assuming FSD uses additional cameras, that could broaden the definition of "lead cars" to cars beside you or shortly behind you as well. And if there are no cars in front of you, beside you, or behind you, then it barely matters if you depart your lane a bit (after all, that's seemingly what humans do around mountain passes).

So, assuming they're doing something smart with the additional cameras, enabling them might address a huge subset of the Autopilot issues that trigger disengagement in EAP (e.g. getting too close to a nearby car).
 
This guy is basically repeating what I told you people 2 years ago... but people would rather accept a fairytale than believe the obvious.


What is surprising here? The learning set requires characterization of the scene items to train against. Once the net is trained on a subset of that data, it is tested against the remainder of the ground-truthed imagery. This helps to prevent overfitting.
If that test gets good marks, they can run against the full ground-truthed data set.
With a lot of real-world video, they can go a step further and run against that footage. People can then quickly evaluate the quality of lane and vehicle recognition using bounding-box/highlighting visualization. It could also get training via user-induced feedback (more left, more right, don't drive like an orange), but that requires more computational power to calculate the backpropagation.
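The train/held-out split described above can be sketched in a few lines. This is a generic illustration, not Tesla's pipeline; the toy "frames", the 80/20 ratio, and the seed are all placeholders.

```python
# Hold out part of the ground-truthed examples so the net is scored
# on imagery it never trained on - the basic overfitting check.
import random

def split_dataset(labeled_data, holdout_fraction=0.2, seed=42):
    """Shuffle labeled examples and split them into a training set
    and a held-out evaluation set."""
    rng = random.Random(seed)
    data = list(labeled_data)
    rng.shuffle(data)
    cut = int(len(data) * (1 - holdout_fraction))
    return data[:cut], data[cut:]

# Toy stand-in for labeled frames: (frame_id, label) pairs.
frames = [(i, i % 2) for i in range(100)]
train, holdout = split_dataset(frames)
print(len(train), len(holdout))  # 80 20
```

Training accuracy on `train` that far exceeds accuracy on `holdout` is the classic symptom of overfitting the poster mentions.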
 
What is surprising here? The learning set requires characterization of the scene items to train against. Once the net is trained on a subset of that data, it is tested against the remainder of the ground-truthed imagery. This helps to prevent overfitting.
If that test gets good marks, they can run against the full ground-truthed data set.
With a lot of real-world video, they can go a step further and run against that footage. People can then quickly evaluate the quality of lane and vehicle recognition using bounding-box/highlighting visualization. It could also get training via user-induced feedback (more left, more right, don't drive like an orange), but that requires more computational power to calculate the backpropagation.

I think the problem is we are over a year into this mess, and honestly we are still not seeing any noticeable improvements in driving. I'm not talking about lane keeping; that has been a massive failure, still tracking behind AP1.
 
I think the problem is we are over a year into this mess, and honestly we are still not seeing any noticeable improvements in driving. I'm not talking about lane keeping; that has been a massive failure, still tracking behind AP1.

I'd even say to forget about the neural network for a bit. There are huge deficiencies in the AP2 control scheme. I almost never get a smooth ride unless there are no cars around me. If I have to follow somebody, I get to "enjoy" sudden and aggressive maneuvering, including hard acceleration and braking. If I want to change lanes, the car dives into the next lane causing my head to be whipped to the side.

How can they move forward when they can't even get the vehicle controls right? This should not be an open problem. Classical theory already exists to provide smooth control.
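The "classical theory" the post alludes to includes simple techniques like clamping commanded acceleration and its rate of change (jerk), which is exactly what prevents head-snapping maneuvers. A minimal sketch, with limit values that are illustrative only, not anything from Tesla's stack:

```python
# Limit both acceleration (m/s^2) and jerk (m/s^3) between control
# ticks so a sudden target change ramps in smoothly.

def smooth_accel(target_accel, prev_accel, dt,
                 max_accel=2.0, max_jerk=1.5):
    """Return the next acceleration command, respecting both an
    absolute acceleration limit and a per-tick jerk limit."""
    # Clamp the raw command to the acceleration limit.
    a = max(-max_accel, min(max_accel, target_accel))
    # Then clamp how far it may move from the previous command.
    max_step = max_jerk * dt
    return max(prev_accel - max_step, min(prev_accel + max_step, a))

# A sudden -5 m/s^2 brake request, filtered at 10 Hz, ramps in gently:
a = 0.0
for _ in range(3):
    a = smooth_accel(-5.0, a, dt=0.1)
print(round(a, 2))  # -0.45 after three ticks
```

Real controllers layer more on top (feed-forward, gap tracking, comfort profiles), but even this level of rate limiting would rule out the sudden lunges described above.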
 
I'd even say to forget about the neural network for a bit. There are huge deficiencies in the AP2 control scheme. I almost never get a smooth ride unless there are no cars around me. If I have to follow somebody, I get to "enjoy" sudden and aggressive maneuvering, including hard acceleration and braking. If I want to change lanes, the car dives into the next lane causing my head to be whipped to the side.

How can they move forward when they can't even get the vehicle controls right? This should not be an open problem. Classical theory already exists to provide smooth control.

Driving home last night on a major highway with very well-marked lanes, the car lurched for the shoulder, and I still have no idea why. This is with .53, but I know the control scheme hasn't changed with 18.2, so either way it's still a bit rough around the edges.
 
You're being taken for a ride and not a self driving one if you don't think the video was almost completely faked.

To believe it wasn't faked, you would have to take a promo video and Elon's status updates at face value, believing we are always just a few months away from moving beyond HW1 abilities, and that the biggest hurdle is quality testing and regulation rather than non-existent software.

The reality is they were far from code-complete to pull off an FSD demo in Oct 2016. The 2017 releases were not about incrementally merging and releasing the code seen in the demo. Nor have they merely been internally investing development time refining FSD until it's perfect.

It is much more reasonable to think that most of the code released over the last 16 months is new, and therefore written after the FSD video. Here's the hard truth: all of that new code is obviously needed for FSD to work. Yet if you evaluate the abilities of the latest stable firmware, how would it have made the FSD video better if the video were made today? It wouldn't. Hence, the video was faked and staged to highlight the theoretical potential of the new hardware, given enough future development time.

Sure, there is likely some separate branch with traffic-light or stop-sign detection where the car will react in some rudimentary way. And that may land in stable soon (first half of 2018). But given Tesla's tolerance for releasing dangerous HW2 updates (AKA most of 2017), it will likely be unsafe to use, and unfortunately highway lane keeping won't miraculously get better because of it.

And that’s all fine. I like being a beta tester. I love autopilot. It blows my mind. But some things are said and done for marketing, sales, and stock value.
 