All right. Semantics. Let's get down to basics.

One poster said, "My car can drive around the block. A Waymo can't."

You said, or you're attempting to say, "The car didn't drive around the block. It's not even Level 2! You drove around the block!"

Yes. Which is, not just by SAE, but functionally and legally, a fundamentally different thing.

It's not, remotely, "semantics."

Your poor departed mother's story aside, that remains the case.

Because when FSD-b is turned on and I'm hanging onto the steering wheel with my foot off the pedals, in a completely English sense of the word, I'm not driving. The car is.

This is grossly, factually, wrong.

You are DRIVING the car. You are not JUST "ready to take over."

If you were, that'd be L3. Which the car is not capable of.

If you don't understand what actual parts of the driving task you are performing while using FSDb, you ought to stop using it, for everyone's safety, until you do understand.

That Forbes link explaining OEDR would be a great place to start.

Or you can repeat some more ignorance and insults on the topic- your call.
 
This is grossly, factually, wrong.

You are DRIVING the car. You are not JUST "ready to take over."
So, if I put a wheel weight on the steering wheel and a big bag of rice buckled into the seat, and set the nav to go around the block while I stand in front of my house, am I driving while it goes around the block? (Sure, legally I am responsible for what happens, but I certainly am not driving, as I am not even in the vehicle.)
 
All right. Semantics. Let's get down to basics.

One poster said, "My car can drive around the block. A Waymo can't."

You said, or you're attempting to say, "The car didn't drive around the block. It's not even Level 2! You drove around the block!"

My dear departed mother, who had a serious brain on her, had this habit, when she was losing an argument (which, granted, didn't happen often), of changing the definitions of things in the middle of the argument so that Her Argument Would Win.

And whether she was consciously aware of what she was doing or just being sneaky, once she got away with it, unrolling the argument to where she pulled this stunt was often impossible (what with her trying to impede the rollback and all), leaving other people infuriated and her smug. Living in that household, we five kids got really good at calling out this stunt when it would appear.

You, sir, have done that stunt. Yeah, legally, when one is sitting in a Tesla, no matter what's turned on or turned off, it's the person in the driver's seat who's the legally responsible party for making the car go wherever it's going to go. And black is white and white is black and we'll all get killed at the next zebra crossing.

Because when FSD-b is turned on and I'm hanging onto the steering wheel with my foot off the pedals, in a completely English sense of the word, I'm not driving. The car is. If the car does something stupid, which happens, then it's my job to take over the driving.

In responding to this rhetorical trick of yours I do feel like I've lost a few brain cells. I don't mind your debating things, but please don't do that? It just infuriates people.

Maybe this is a proper time for an "um"?

I think we all agree, by all accounts and parties (Tesla, the FSD user agreement, the FSD responsibility waivers, and thoughtful FSD owners), that we are responsible for all driving actions when FSD is on. Whether we intervene or not is irrelevant to who is driving. FSD owners are driving.

If FSD were designed to drive us, we would be sitting in the back seat reading a newspaper or napping; there would be no driver controls, and we wouldn't be signing so many waivers.
 
In responding to this rhetorical trick of yours I do feel like I've lost a few brain cells. I don't mind your debating things, but please don't do that? It just infuriates people.
This thing is not even up for debate. It's like conversing with a flat earther.


[attached screenshots]


They are not just saying that to avoid regulation. They literally can't say otherwise, and Uber found that out the hard way when they tried to make that argument. The FSDb software lacks the capability to perform the entire DDT (Dynamic Driving Task) and OEDR (Object and Event Detection and Response). It requires a person behind the wheel doing the object and event detection and responding immediately when there is a failure. It is an ADAS; therefore the driver (YOU) behind the wheel is responsible for driving at all times, even when the automated system is controlling the acceleration, braking, and lateral motion of the vehicle. That is the literal legal distinction between an L2 ADAS and an L4 ADS. It is not semantics just because the L2 ADAS is good.
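
To make the distinction concrete, here's a toy sketch of that gate (my own illustration; the class and function names are made up, not from J3016 or any real API):

```python
# Toy model of the J3016 point above -- not a real API, names are mine.
from dataclasses import dataclass

@dataclass
class DrivingAutomation:
    controls_motion: bool  # sustained steering plus acceleration/braking
    complete_oedr: bool    # detects AND responds to every object/event in its domain

def who_is_driving(system: DrivingAutomation) -> str:
    # The system is only "driving" if it performs the ENTIRE DDT,
    # including complete OEDR. Anything less and the human is the driver.
    if system.controls_motion and system.complete_oedr:
        return "system"  # L3+ ADS: the feature is driving
    return "human"       # L2 or below ADAS: you are driving

fsd_beta = DrivingAutomation(controls_motion=True, complete_oedr=False)
waymo_driver = DrivingAutomation(controls_motion=True, complete_oedr=True)
print(who_is_driving(fsd_beta))      # -> human
print(who_is_driving(waymo_driver))  # -> system
```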
 
So, if I put a wheel weight on the steering wheel and a big bag of rice buckled into the seat, and set the nav to go around the block while I stand in front of my house, am I driving while it goes around the block? (Sure, legally I am responsible for what happens, but I certainly am not driving, as I am not even in the vehicle.)

So first- the car won't go anywhere since the camera won't see a driver.... but ignoring that-

If you take a 1973 Pinto, tie a rope to the steering wheel to keep it straight, put a brick on the accelerator, then move the shifter to drive as you hop out, is the car "driving" itself?

Now take a 2023 Lexus with adaptive cruise and lane keeping, turn both on on a highway, and hop out while it's stopped in traffic... is the car "driving" itself?

The answer to both- as with your question- remains no.

Being able to trick a vehicle into executing the partial driving tasks it is able to partially automate does not mean the car can drive itself. It means exactly the same thing as it did with the human there-- it can automate specific SUBtasks of driving, not perform the entire dynamic driving task.

There's a reason there are actual definitions for this stuff, not just in SAE, but in actual law.


Again, as Bitdepth so aptly illustrates above- TESLA tells you this themselves. Not just in the DMV emails, but explicitly in writing when selling you the product. Then they tell you again in the manual. Then they tell you again when you activate the system in the car.

Why do people not believe Tesla themselves?
 
So, if I put a wheel weight on the steering wheel and a big bag of rice buckled into the seat, and set the nav to go around the block while I stand in front of my house, am I driving while it goes around the block? (Sure, legally I am responsible for what happens, but I certainly am not driving, as I am not even in the vehicle.)
You answered your own question. If FSDb were L4, even if you engaged the feature and strapped four bags of rice into the passenger seats, the automated driving system would be legally responsible for whatever happens. As it is L2, you are personally responsible for whatever happens.

In the same way, you can use Summon while not present in the vehicle, but you are legally responsible for watching your surroundings and disengaging if you see it is about to hit something; the same applies to FSDb.
 
I have some information for George: In Q4 Alphabet had a net income about the size of Tesla's revenue in Q4.
That's irrelevant, because Alphabet has shown multiple times that they will suddenly cut projects that have gone too long without a profit. They aren't running a charity, and they do view projects individually.

As a reminder, they had a shakeup last year and cut the whole trucking division (even though it had a lot of promise). So there is pressure on Waymo to show a path to profitability.
 
So first- the car won't go anywhere since the camera won't see a driver.... but ignoring that-
Sorry, there is no cabin camera in the pre-refresh Model S & X... so it would go just fine.

As it is L2, you are personally responsible for whatever happens.

Right, I would be responsible, but I would not be driving. The question was not about responsibility; it was about whether the car was capable of driving around the block on its own. (Which it is.)
 
Right, I would be responsible, but I would not be driving.



Driving includes a specific set of tasks, ALL of which must be performed for it to count as driving.

Some of which the car is not capable of doing at all.

The car would be MOVING, but that's not the same as DRIVING.

That's why I included the "brick on accelerator" and "just use regular LKA and radar cruise" examples. In those cases, likewise, the car is moving, but parts of the driving task aren't being done by the car because it can't do them.
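
Put as pseudocode, the same point looks like this (again my own toy illustration, with made-up subtask names):

```python
# Toy illustration; the subtask names are my own shorthand, not J3016's text.
DDT_SUBTASKS = {"lateral_control", "longitudinal_control",
                "object_event_detection", "object_event_response"}

def is_driving(performed_by_car: set) -> bool:
    # "Driving" means performing ALL of the DDT subtasks, not just some.
    return DDT_SUBTASKS <= performed_by_car

brick_on_accelerator = {"longitudinal_control"}
lka_plus_radar_cruise = {"lateral_control", "longitudinal_control"}

print(is_driving(brick_on_accelerator))   # False: the car is moving, not driving
print(is_driving(lka_plus_radar_cruise))  # False: OEDR still falls to the human
```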




The question was not about responsibility; it was about whether the car was capable of driving around the block on its own. (Which it is.)

It absolutely is not, and Tesla tells you so.


Why do you insist that Tesla is lying when they tell you the car cannot drive on its own?



It seems every time anyone invokes the SAE levels, it goes nowhere, and that's because the SAE levels suck because no one understands them


The fact that you do not understand a thing does not change the fact that many other people understand the thing.

It's a pretty weird take to keep projecting your own ignorance of a topic onto the entire rest of the world and insisting the thing you don't understand "sucks" because you refuse to educate yourself about it.
 
Incomplete OEDR is L2, not subject to autonomous regulation, by definition- so adding "better but still incomplete" OEDR doesn't change anything.
Yes, and that seems like the approach Tesla has been taking and will continue to take with 12.x to avoid regulation. Even as end-to-end takes on more detection and response that wasn't handled with 11.x, as long as FSD Beta stays "incomplete" even for, say, one very rare specific type of event, Tesla could avoid autonomous-vehicle regulations even if it could otherwise safely handle city streets for many thousands of miles without disengagement. Of course, Tesla will highlight that one final limitation in the manual to additionally emphasize, and show proof to regulators, that owners were told it's a driver assistance feature and that the design intent is explicitly "limited."

Hopefully this all helps Tesla deploy 12.x more widely to customers sooner, with regular updates, and without needing ongoing legal discussions with regulators.
 
Right, I would be responsible, but I would not be driving. The question was not about responsibility; it was about whether the car was capable of driving around the block on its own. (Which it is.)
Legally, you who engaged the feature are driving, even when your hands and feet do not touch the steering wheel or brake. Super Cruise is literally hands-free; you are still considered the driver because you engaged an L2 system. The expectation is that you are monitoring and will intervene as necessary to ensure safe operation of the feature. There is no room for misinterpretation.

[attached screenshot]

I don't know why this debate over how normal people define driving is even relevant.

It seems every time anyone invokes the SAE levels, it goes nowhere, and that's because the SAE levels suck because no one understands them, and they're not what people think they are.

Even if we were to agree with Knightshade, no one understands anything of value.
Just because you do not bother to understand it does not make it suck. It's a 41-page document; read it.

L0 = No automation
L1-L2 = Driver support / partial automation (you are still responsible for driving when the feature is engaged)
L3 = Conditional automation: the feature is responsible for driving and for knowing its own limitations, giving you enough time to take over if it cannot handle a situation.
L4-L5 = The feature is responsible for driving; you are at no point responsible for driving or doing anything.

What is so complicated about that?
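
If it helps, here's that summary as a toy script (my own paraphrase; none of these names come from the J3016 document itself):

```python
# Toy paraphrase of the level summary above -- names are mine, not J3016's.
from enum import IntEnum

class SAELevel(IntEnum):
    L0 = 0
    L1 = 1
    L2 = 2
    L3 = 3
    L4 = 4
    L5 = 5

def responsible_for_driving(level: SAELevel, takeover_requested: bool = False) -> str:
    if level <= SAELevel.L2:
        return "human"  # driver support: you drive whenever the feature is engaged
    if level == SAELevel.L3:
        # Conditional automation: the feature drives, but must warn you in
        # time to take over when it reaches a situation it can't handle.
        return "human" if takeover_requested else "system"
    return "system"     # L4/L5: never your responsibility

for level in SAELevel:
    print(level.name, "->", responsible_for_driving(level))
```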
 
So I made the point that a Waymo, dropped off in my driveway, would be incapable of driving itself around the block. On the other hand, a Tesla can do this without a problem.

Everyone knows what I was saying. Everyone knows the point I was making.

But now this thread is full of posts about what it means to "drive itself". Again, this is despite the fact that everyone knows the point I was making.

Boy do I appreciate moderated threads.
 
So I made the point that a Waymo, dropped off in my driveway, would be incapable of driving itself around the block. On the other hand, a Tesla can do this without a problem.

Everyone knows what I was saying. Everyone knows the point I was making.

Yes, you were saying you don't understand that Teslas cannot drive themselves. At all. Even though Tesla themselves explicitly tell you they can't.

You, too, might benefit from reading J3016... or the Forbes story about L2 vs L3 and OEDR... or, heck, just the Tesla owner's manual or the product description for FSD.

The fact remains your car cannot drive itself around the block. It can assist a human driver in doing so. So can lots of other consumer cars, to varying degrees (LKA systems, adaptive cruise, etc.)... NONE of them can drive itself, however- with the single exception of the Mercedes that offers L3 in a very narrow ODD. When their L3 system is engaged, the car is driving itself. A thing no Tesla can currently do.

Waymo can- though in a quite limited ODD (much less limited than Mercedes for capabilities, but more limited for geofencing).



If it's so simple, there wouldn't be a 40-page definition and explanation for it.

Most of the 40 pages are additional details for the people who somehow read the basic summary or flow chart and still remain baffled by it. Might be worth a read for you!
 
So I made the point that a Waymo, dropped off in my driveway, would be incapable of driving itself around the block. On the other hand, a Tesla can do this without a problem.

Everyone knows what I was saying. Everyone knows the point I was making.

But now this thread is full of posts about what it means to "drive itself". Again, this is despite the fact that everyone knows the point I was making.

Boy do I appreciate moderated threads.

Sorry I missed this whole discussion. What's your reasoning for a Waymo not being able to drive itself around the block?
 
None of these points provides any value to an autonomy discussion:

1) FSDb is L2 and therefore cannot be compared to L4 in any way
2) FSDb is not driving, so it's not doing anything for you, because you're driving
3) An L2's longitudinal and lateral control and planning are meaningless when compared to an L4's longitudinal and lateral control software, because the driver is responsible
4) L4 has complete OEDR and is therefore superior in every way to L2's limited OEDR