solaris007
Member
i simply don't understand? what is so hard/user unfriendly about identifying the blue steering wheel indicating autopilot is active or not?
solaris007 said: i simply don't understand? what is so hard/user unfriendly about identifying the blue steering wheel indicating autopilot is active or not?
The bong and a corresponding 15 pixels of blue somewhere on the dash aren't good enough.
AWDtsla said: There are many other bad design flaws here too, for example the park distance control chimes are completely useless.
You're not getting it. THIS IS NOT ABOUT BLAME. It is about proper user interface design.

I disagree completely. Everyone is always looking to blame someone or something other than themselves. At some point, somewhere, personal responsibility has to matter. There is a clear icon that changes color, easily visible, with an audible chime for engaging and disengaging. No matter what Tesla does here, there will always be someone who refuses to take responsibility for their actions.
Tesla's sensors are seeing in multiple depths to identify "3D objects." I don't want or need different chimes to tell me what I can clearly see from the sensor readout. If you went with the faster-beep option, it'd have to latch on to the closest point to base the beeps off of, which would be counterproductive, from my perspective, just based on how the sensors read things. I don't know how you're parking, but I rely on the backup camera and sensor display; I'm not constantly looking at my mirrors.
Jeff
The original poster said: After the police came (the other driver insisted we file an accident report) and I received a $120 ticket, I called Tesla's technical assistance. The gentleman I spoke with told me that the Autopilot function is "flawless" and that it was my responsibility, as the driver, to avoid any collision. I asked to speak with his supervisor who told me that this was the first time anything like this has ever happened, and found it "very strange." They were clearly intimating that I did something wrong.

His answer was half right. Autopilot is not "flawless"; however, it IS your responsibility. All actions of the car at this stage are your responsibility, and it is your responsibility to take over if the car is not doing what you need it to do. The only way the car could be at fault in any collision, in its current form, is if it failed to allow the driver to take over, and I have never heard of such a case with a Model S.
I took delivery of my new P90D last weekend (trading in my 2013 Model S 85) and downloaded Firmware 7.1 two days ago. I had used Autopilot for a few days on Firmware 7.0 and found it wonderful (astounding). Today was the first day I tried it on 7.1.
At about 8:30 this morning on I-90 (road conditions perfect, visibility good), I was doing about 60 MPH and switched on Autopilot. I initiated a lane change with my turn signal and the car switched lanes seamlessly. My car automatically modulated my speed (keeping a two-car distance) with the car in front of me, and I was cruising along happily when the car in front of me changed lanes and my car caught up to the car ahead of him. After following this new car for a few minutes, the traffic began to slow.
My car slowed as well. But when the car in front of me came to a complete stop (not a sudden emergency stop, but rather a gradual stop), I expected my car to do the same (as it had been doing previously). It didn't. I slammed on the brakes in that dreadful instant before I realized my car wouldn't stop in time, but I still hit the car in front of me (while going maybe 5-10 MPH). I'd like to mention that I consider myself a very safe driver and have never been involved in any accident before (I'm 52). I damaged that car's rear bumper and cracked the plastic cover on my new Tesla (see attached photo).
After the police came (the other driver insisted we file an accident report) and I received a $120 ticket, I called Tesla's technical assistance. The gentleman I spoke with told me that the Autopilot function is "flawless" and that it was my responsibility, as the driver, to avoid any collision. I asked to speak with his supervisor who told me that this was the first time anything like this has ever happened, and found it "very strange." They were clearly intimating that I did something wrong.
He didn't say accident.
As for sources: count me as one.
You're not getting it. THIS IS NOT ABOUT BLAME. It is about proper user interface design.
They are NOT 3D. They simply detect distance from objects. The audio should indicate distance without looking at anything, and a more advanced version would map the location of the sensor to a speaker in the audio system so that people that still have 2 functional ears could fully utilize this information.
It's not more complicated than my last car, except you have the front and rear. Audio should indicate both rate of closing and minimum distance reported at _any_ sensor, period. The more advanced stuff is for another day.

Well, I don't have two fully functional ears, so that would be a problem for me... I didn't mean 3D as in "3D"; I was struggling to describe how it sees things in a manner that's different from other cars I've owned with ultrasonics. My bad...
Regarding the interface design part, I think it's perfectly fine and was pointing out that no matter what you do to it personal responsibility still matters. That was all.
Jeff
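The scheme proposed above — beeps driven by the minimum distance reported by any sensor, sped up further by the rate of closing — could be sketched roughly like this. This is a hypothetical illustration, not Tesla's actual implementation; all names, ranges, and scaling factors here are made up for the example:

```python
# Hypothetical sketch of a park-distance audio policy: the beep interval
# follows the closest obstacle seen by ANY sensor, and shrinks further
# when the gap is closing quickly. Numbers are illustrative only.

def beep_interval_ms(distances_cm, prev_min_cm, dt_s,
                     max_range_cm=250.0, fastest_ms=80, slowest_ms=1000):
    """Return the pause between beeps in ms, or None for silence."""
    min_cm = min(distances_cm)
    if min_cm >= max_range_cm:
        return None  # nothing in range, stay silent

    # Base interval scales linearly with the closest obstacle:
    # far away -> slow beeps, very close -> fastest beeps.
    base = fastest_ms + (slowest_ms - fastest_ms) * (min_cm / max_range_cm)

    # Closing rate in cm/s: positive when the gap is shrinking.
    closing = (prev_min_cm - min_cm) / dt_s if dt_s > 0 else 0.0
    if closing > 0:
        # Speed the beeps up as we close faster, capped at 2x.
        base /= min(2.0, 1.0 + closing / 100.0)

    return max(fastest_ms, int(base))
```

For example, an obstacle at 50 cm that was at 60 cm a tenth of a second ago beeps about twice as fast as a stationary one at the same distance, while anything beyond the 250 cm range stays silent.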
It's second nature. You assume that AP is available, you double pull the cruise stalk, you hear bing bong (which means it failed, instead of bong bong, which means it succeeded, but you're thinking about something else so they sound the same to you), you let go of the wheel only to realize that AP didn't engage.
Ever do something so many times and have it work so many times, that the one time you didn't check if it worked, is when it failed? Yeah, this is like that.
Think about how many times you'll get into the Model S and your reflexes kick in: your hand reaches for the push-button/key-starter thing, or for the gear selector in the wrong place. What's so hard about reaching for the right place? Nothing at all, but when it's second nature you don't think about it and bam, no shifter there! (Or in this case, bam, AP didn't engage!)
but these are all well-known human error modes (lapses -> memory/omission/repetition, and attention slips combined with confirmation bias), which of course can be alleviated with a good user interface, alerts about the state of automation AND human training, systems knowledge (e.g. in which conditions the AP disengages) and acceptance of responsibility.
for example simply trusting that the pull of the lever and a bing bong results in AP activation is not enough, no matter how good "the UI". as long as the human is boss, verification is required, and perfectly easy by checking the icon, which is always at the same location, central on the IC and enhanced with color.
i am curious about what changes to the UI you would propose that would make the verification easier?
overriding the AP temporarily with the AP remaining engaged sounds interesting. even then, verification is required.
for me a good strategy is always having the same scanning pattern in a directional flow that scans the outside (80%), the IC (speed, TACC/AP state, energy, ...), in repeating sweeps, thus always aware of the state of the vehicle.
From January 11th,
Elon Musk says Tesla's autopilot is already better than human drivers - The Washington Post
"Musk said, he was not aware of any accidents caused by autopilot. He said the closest scenario were accidents where drivers mistakenly believed they were in autopilot mode."
I wonder if this was one of those times.
In fact Autopilot is so good that they're reducing where and when you can use it.... that makes sense...

LOL. Yeah, kinda what I was thinking. Autopilot still needs a ton of work (years) before it's ready for prime time.