
Summon seems like a silly party trick to me

The biggest question for me is: Can Smart Summon be used from a meaningful distance non-stupidly?

Because of the way Smart Summon is set up, the driver responsible for the drive is always unable to see, from a distance, part of what is going on around the car. That in itself is a form of stupidity, because you should always have full awareness of what is happening around your car when it is moving. Yet the way Smart Summon is set up, if you are standing at a distance, you will not see the other side of your car, nor will you see through any obstacles blocking your line of sight...

...and what's worse, there is no really good solution to this. Vigilance doesn't help, because from a distance you simply can't see through your car, and probably not around all the other obstacles either. The blind spots this setup introduces are far greater than in any normal driving situation. The only things that help are removing the responsible driver from the equation, or staying close to the car and moving around it to monitor its movement. Perhaps camera views on the phone could help.

I agree, and in fact I've been testing Smart Summon from the driver's seat because it's the only good way I know to do so.

But my position is probably obvious, because I was arguing well before the release of Smart Summon that it wouldn't work because of the lack of a 360-degree downward-facing camera.

As to the line of sight? It really reminds me of drones, where the FAA rule is that you have to keep the drone in line of sight, yet the range on the controller is measured in MILES. It's basically this tiny dot in the sky, and there is NO WAY people can actually see the drone.

Basically there is a huge disconnect between what you're supposed to do, and reality.

In any case, where I've been testing it I can see the car the entire way, along with having plenty of warning if another car enters the testing area.

I decided to jump in the car since it was getting too close to a curb for comfort, and the latency in the dead man's switch meant I had less buffer than I wanted. It's a $60K+ car, so I'm going to watch over it like a hawk. I just don't get people who have no fear about their expensive car hitting something.
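
To put rough numbers on why that latency matters, here's a quick back-of-the-envelope sketch; the creep speed and delay figures are just assumptions I'm plugging in, not measured values:

```python
# Rough illustration of how round-trip latency eats into the stopping buffer
# when the car only moves while the phone button is held.
# All figures below are assumptions for illustration, not measured values.

creep_speed_mps = 1.5        # assumed Smart Summon creep speed, roughly 3.4 mph
release_to_stop_s = 0.7      # assumed phone-to-car latency plus braking delay
human_reaction_s = 0.5       # assumed time to notice trouble and release the button

extra_travel_m = creep_speed_mps * (human_reaction_s + release_to_stop_s)
print(f"Car travels roughly {extra_travel_m:.1f} m after you decide to stop it")
# With these assumptions that's about 1.8 m, which can erase what looked like
# a comfortable margin to a curb.
```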

As to safety, they could certainly add a layer of protection by requiring the pedestrian noisemaker for Summon. Of course, that would require adding it to cars made before September of this year (or whenever they made that change). Lots of owners would be resistant to that, as lots of owners absolutely hate the noisemaker.

But that would allow the car to make a noise before backing up, thereby alerting any pedestrians.
 
@Tam

The problem is that what you are displaying is one of the retroactive changes Tesla made to the text.

In fact, in 2016 Tesla said EAP would come as a single update in December 2016 and was merely pending validation and regulatory approval (in fact FSD was also just awaiting validation and regulatory approval, according to Tesla...):

[Image: screenshot of Tesla's Enhanced Autopilot upgrade page]


Yes, Tesla lied. No way could they have "expected" that given what we know of the reality.

But we did not know in 2016 that Tesla was lying.
 
...merely pending validation and regulatory approval (in fact FSD was also just awaiting validation and regulatory approval, according to Tesla...)

The hints, clues and legal wordings are all there!

"Expect" is not a certainty just like an expected delivery date.

Also, the fulfillment of the features is dependent on several factors:

1) Complete validation: It is hard to complete validation if no one has even written the software for a feature yet. The text clearly says that if validation is not complete, you shouldn't expect those features to be fulfilled.

2) Rolled out: The rollout indeed began in 12/2016, but it did not finish that year. There's no timeline indicating when the finish line will be. It could be forever!

Tesla didn't advertise the word "beta," but those words say the same thing: it's not a final product!
 
Oh, and just an FYI - I'm also a strong proponent of Lidar, but any mention of Lidar on TMC will get you flamed to hell and back. Pretty much more than anything except threatening to sue Tesla. If you threaten to sue Tesla you'll get like 300 dislikes on a post.

I'm thinking of suing Tesla for not using LIDAR. What do you think, class action?
 
Imagine my perspective for a moment. There are probably only a few dozen people on earth who know more about this subject than I do. I've been working on AVs for nine years. I can't use arguments from authority, because I'm not permitted to identify myself -- and such arguments don't usually work anyway.

Wait, you can't commit the logical fallacy of argument from authority (argumentum ab auctoritate) because you're pretending to be someone you're not. That's not how that works... well, it might be how ego works.
 
SERIOUSLY? This is your proof that it's OK to use? At 30 seconds into the video, the car emerges from behind a row of parked cars where the user had no line of sight at all on the vehicle. Anything can happen on the blind side of the car even when you CAN see it, but this guy is summoning for 30 seconds without even seeing the car. Awesome, just awesome. Someone will get hurt in a parking lot before this disaster is over, and I hope it is not too serious. It's pretty obvious that Tesla will have to abort this SS ability soon.
Thank you for mentioning that. I was debating saying something largely identical. I was looking at the two cars at the front of the right side of the row, waiting for either of them to move. Then I saw the car emerge from the back. I agree: out of line of sight. PLUS the car was creeping, which in and of itself is dangerous. As I said elsewhere, Tesla should have released FSD with the same caveats as NoA and AutoSteer; at least that mandates a driver in the driver's seat. Enhanced Summon is truly an autonomous car. Disclaimer: I am totally OK, even at the end of the year, with FSD that requires a driver to indicate they are actively watching what the car is doing and can take over at any time.
 
With AP it’s falling asleep or paying less attention than you would otherwise.
The only way someone can fall asleep for any length of time, before the car pulls over and shuts down, is to deliberately defeat the system. Even then, a Tesla on Autopilot is smarter than a car without semi-autonomous features when a driver drifts off. Paying less attention is a controllable issue. I am a software engineer. Oh, wait, there was an earlier discussion about the pros and cons of a poster establishing their credentials.
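
For anyone curious what "before the car pulls over and shuts down" looks like in practice, here is a toy sketch of that kind of nag-escalation logic; the timings and actions are invented for illustration and are not Tesla's actual parameters or code:

```python
# Toy sketch of an attention-nag escalation loop like the one described above.
# Timings and actions are made up for illustration only.

def escalate(seconds_without_driver_input: float) -> str:
    if seconds_without_driver_input < 15:
        return "no action"
    if seconds_without_driver_input < 30:
        return "visual nag on screen"
    if seconds_without_driver_input < 45:
        return "audible chime"
    if seconds_without_driver_input < 60:
        return "slow down, hazards on, prepare to stop"
    return "pull over, stop, and lock out Autopilot for the rest of the drive"

for t in (10, 20, 40, 70):
    print(t, "->", escalate(t))
```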
 
Interesting math. I certainly have nothing better to offer myself, so unless someone else does, let's look at those numbers of weeks over time.

The biggest open question that remains in my mind about the safety of Level 2 driver's aids is: how big a part does luck play in how they turn out and in how we find out (or not) about them? Luck in the events that occur, as well as luck in how reliably the role of the automation is assessed, publicized, or counted when an event occurs (or is not counted). I'm not yet convinced we have enough of these cars and their users, or enough publicity and ability to gather that data reliably, to know...

For example, it was widely thought that Joshua Brown was the first Autopilot-related fatality (cross traffic). It was only later that we found out about the earlier death in China (stationary object). There have been other suspicious deaths where the role of Autopilot cannot be ruled out... and how many cases have we simply not heard of at all? Who knows how many statistics simply get missed by us because we don't hear of them or can't get reliable data.

It certainly is an interesting thought experiment someone presented: what if every GM car suddenly had these features? Their use, and probably our ability to get data on them, would skyrocket simply due to numbers. For now Tesla is still small enough to fly under the radar a little, and it is difficult to get good readings when something is flying under the radar.

Of course Smart Summon may be a little easier to assess because nobody is in the car. With other Autopilot-related accidents, things get muddy as the roles of the driver and the car intertwine more.

The luck question of course plays both ways. One unfortunate, unlucky event (for Tesla, Joshua Brown probably was somewhat such an event) and the results may be more dire than the statistics would dictate.
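
To make the luck point concrete, here is a small sketch with made-up numbers: when events are rare and the accumulated fleet mileage is modest, pure chance swings the observed count a lot. The rate and mileage figures are assumptions for illustration only, not real statistics:

```python
# How much can luck move the needle when events are rare? A Poisson sketch.
# The rate and mileage figures are assumptions for illustration, not real data.

from math import exp, factorial

def poisson_pmf(k: int, mean: float) -> float:
    return exp(-mean) * mean**k / factorial(k)

assumed_rate_per_million_miles = 0.0125   # assumed ~1 fatal event per 80M miles
fleet_million_miles = 100.0               # assumed miles accumulated on the feature

mean_events = assumed_rate_per_million_miles * fleet_million_miles  # ~1.25 expected

for k in range(5):
    print(f"P({k} events) = {poisson_pmf(k, mean_events):.2f}")
# With an expectation of ~1.25 events, observing 0 events (looks great) or 3+
# events (looks alarming) are both quite plausible outcomes of pure chance.
```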

Interesting point. We have gotten a feel over time for what AP is good at and where it fails; the operational domain starts to become second nature to the driver. We obviously aren't there yet for SS, which is probably why we have the property-damage cases now and will for a bit as people learn the system. Anyone using it really should play with it off the beaten path to at least learn the basics of what to expect. It would be cool if it had a training requirement! The same will be the case for NoA city. We won't know what it's not good at until the videos show up. That part of the learning curve sucks.
 
But in the NoA City version there will, by definition, always be a safety driver until it is declared better than human drivers... or maybe at least as good. As I mentioned previously, with Enhanced Summon there is essentially no safety driver. So a good question is: with a non-controlling human in the driver's seat, if the car is operating via ES, would it even respond to human input on the brakes or steering? I know that in a normal power steering/power brake system, if the power is off, they don't work.
 

The steering and brakes are still directly connected; they can always be overridden by human input. It is NOT drive-by-wire... yet.
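
To illustrate the "human input always wins" point, here is a toy arbitration sketch; the signal names and thresholds are invented, and this is not how Tesla's actual controller is written:

```python
# Toy sketch of "human input always wins" arbitration during an automated
# maneuver. Signal names and thresholds are invented for illustration; the
# real vehicle does this in firmware, and (as noted above) the steering wheel
# and brake pedal remain mechanically/hydraulically connected regardless.

STEER_TORQUE_OVERRIDE_NM = 1.5   # assumed driver torque that counts as a takeover
BRAKE_PEDAL_OVERRIDE_PCT = 5.0   # assumed pedal travel that counts as a takeover

def arbitrate(auto_cmd: dict, driver_torque_nm: float, brake_pedal_pct: float) -> dict:
    """Return the command actually sent to the actuators."""
    if abs(driver_torque_nm) > STEER_TORQUE_OVERRIDE_NM or brake_pedal_pct > BRAKE_PEDAL_OVERRIDE_PCT:
        # Any meaningful human input cancels the automated maneuver.
        return {"mode": "manual", "steer": None, "accel": None}
    return {"mode": "auto", **auto_cmd}

print(arbitrate({"steer": 0.1, "accel": 0.3}, driver_torque_nm=0.2, brake_pedal_pct=0.0))
print(arbitrate({"steer": 0.1, "accel": 0.3}, driver_torque_nm=3.0, brake_pedal_pct=0.0))
```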