A flight instructor teaches Tesla Autopilot

Because it occurred so fast, I don't have full confidence in my recollection. But this is what I believe I experienced.

I was driving in the far left HOV lane on AP when I heard a loud motorcycle approaching from the rear. The Tesla moved itself over a couple of feet to allow the lane-splitting motorcycle to pass, and after it did, the Tesla moved a few feet back to center.

Did this happen?

What you describe is consistent with a report we've heard before and makes sense. If the Tesla detects a motorcycle or any other vehicle uncomfortably close on one side, it will increase the wiggle room by favoring the other side of the lane until the threat is past, and then it will return to its normal location in the middle of the lane.
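In rough code terms, the behavior you describe might amount to something like the sketch below. To be clear, this is purely illustrative: the numbers and names are my inventions, not anything from Tesla.

```python
# Purely illustrative sketch of lane biasing -- not Tesla's actual logic.
# COMFORT_CLEARANCE and MAX_BIAS are invented numbers for demonstration.

COMFORT_CLEARANCE = 1.2  # hypothetical minimum side clearance, meters
MAX_BIAS = 0.6           # hypothetical maximum shift within the lane, meters

def lane_offset(left_clearance: float, right_clearance: float) -> float:
    """Lateral offset from lane center (positive = shift right)."""
    if left_clearance < COMFORT_CLEARANCE:
        # Something (e.g., a lane-splitting motorcycle) is close on the left:
        # favor the right side of the lane until the threat is past.
        return min(MAX_BIAS, COMFORT_CLEARANCE - left_clearance)
    if right_clearance < COMFORT_CLEARANCE:
        # Close on the right: favor the left side.
        return -min(MAX_BIAS, COMFORT_CLEARANCE - right_clearance)
    return 0.0  # no threat: return to the normal position at lane center
```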

My question for you is how quickly did the Tesla change its position in the lane? Was it smooth or fairly abrupt? Thanks for your report.
 
Absolutely smooth and seamless.
 
@papafox: My father had a plane for most of my teen years. I learned to fly when I dated a Vietnam jet jock (which really tells you how old I am). I find myself driving the car on AP like IFR flying: just looking at the dash. How about you?

Bonlaw, thanks for the quick reply. Glad to hear the autopilot showed finesse in such a situation. Your aviation experience shows in your terminology.

I like to characterize my attention as a "rapid scan," looking inside and out while using autopilot. I'm most curious to compare what I'm seeing with my eyes to what the autopilot image on the dash is displaying. I figure that if autopilot fails to keep my vehicle safe, it is my situational awareness of where the other traffic is and what it's doing that will give me the full picture so that I can take over and do exactly the right thing. With version 7.0 of the autopilot software, I felt the need to really keep my eyes on the road, so the way I compared the two sources of information was to run simultaneous videos of the autopilot display and the views through my dashcams, sync them with the music playing in the car, and review the most interesting moments at a later date. There's much that a person is too busy to notice when actually behind the wheel.

I'm pleased to say that on my most recent two-day road trip I never had the need to override the autopilot (the first long trip about which I can say this). I did purposely choose not to engage the autopilot when lane paint or other issues made the choice unwise. Now, when highway markings are good, I have the confidence to spend more time glancing in and out to judge the quality of the autopilot's performance.
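For what it's worth, the syncing step can be automated if both recordings picked up the same cabin audio (the music). Here's a rough sketch, assuming WAV audio extracted from each video at the same sample rate; the file names are hypothetical:

```python
# Rough sketch: align two recordings by cross-correlating their shared audio.
# Assumes WAV files extracted from each video, both at the same sample rate.
import numpy as np
from scipy import signal
from scipy.io import wavfile

def alignment_lag_seconds(dashcam_wav: str, display_wav: str) -> float:
    """Return the lag (in seconds) that best aligns the two audio tracks."""
    rate_a, a = wavfile.read(dashcam_wav)
    rate_b, b = wavfile.read(display_wav)
    assert rate_a == rate_b, "resample one file first if the rates differ"
    a = a.mean(axis=1) if a.ndim > 1 else a.astype(float)  # stereo -> mono
    b = b.mean(axis=1) if b.ndim > 1 else b.astype(float)
    corr = signal.correlate(a - a.mean(), b - b.mean(), mode="full", method="fft")
    lags = signal.correlation_lags(len(a), len(b), mode="full")
    return lags[np.argmax(corr)] / rate_a
```

Shift one video by that many seconds and the two views line up for side-by-side review.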
 
Keeping with the pilot analogy, a critical few seconds in flying occur when we transition from instruments to visual (for example, when landing). The eyes need to refocus on the outside world, situational awareness undergoes a major paradigm change, and we are essentially "not in complete control" for a second or two, at least in my experience. It's never been a problem, since the transition usually occurs when all is well, and you really don't have to worry about other planes next to you.

So, I don't own my MS yet (it's in production); I've only taken a test drive. But during that very short time, I noticed that in changing from looking at the dash (in particular, at the proximity warnings of cars in adjacent lanes) to looking outside and back, I seemed to experience that transition effect. I'm just wondering: does this go away with experience, or is there always that second of transition when your awareness has to change? It felt a bit disconcerting.
 
I find that I do not get that transition. Normally when I switch back to manual I do so consciously: I take care with the accelerator to pick up the exact speed of the car, and I make sure the car is on the line I want as opposed to its choice (usually the same, though still a decision).

Then I push the autopilot stalk forward, returning to manual. All this preparation has aligned my awareness with the conditions.
 
Hookemhorns: I'm not a pilot, but if you're flying in IMC in your aircraft, you don't have (or must assume you don't have) any outside visibility. So yes, I can see how the transition to VFR, say when you break out of the overcast on final approach to landing, is a bit jarring.

When you're driving a Tesla on autopilot, you have normal visibility out the windshield. I personally am not too fixated on the sensor display on the instrument cluster (although it'd probably be a natural thing to do on a test drive). Usually I do my normal instrument and visual scan so that if I do need to take over manual control, I have the situational awareness to do so reasonably smoothly. Also, as simonog pointed out, you often (but not always) can pick when to transition from autopilot to manual control.

Bruce.
 
Good question. When flying by instruments, you know that your scan is focused inside, with just the occasional glance out the windshield to see if the runway is visible yet. The move to visual cues outside the windshield is indeed a big transition, because you are now confident enough in what you see to control the aircraft by substituting your visual perception of what's in the windshield for the reliable instrument indications on the panel. Fortunately, I agree with simonog and bmah that there's really no such transition with Tesla autopilot if you're doing it right, because you have maintained your awareness of the road and other traffic even while the autopilot is operating.

The best way I can describe proper Tesla autopilot supervision is that you never give up control of the car to the autopilot; at least, you shouldn't with the current beta software release. Instead, your hands are on the wheel and you are allowing the Tesla to make little steering corrections and some not-so-little speed adjustments. The second you are unhappy with what the autopilot is doing, you simply take control by turning the wheel to where you believe it should be, or you touch the brake as needed. All this time you are checking your mirrors and looking out the windshield to see what threats might lie ahead, and thereby you're maintaining your situational awareness based upon visual cues.

In a real sense, your need to look inside the vehicle is reduced with autopilot. Since Traffic Aware Cruise Control (TACC) is taking care of your speed, there's no immediate need to glance at the speedometer. You are free to focus your attention outside the Tesla, where it will do you the most good. Nonetheless, there are times you will glance inside to see, for example, whether the blue lane edge markers are remaining solid blue or if they're fading in and out because of poor highway markings. Yes, in such a case you learn something valuable by looking inside (you now know that the lane-keeping feature is somewhat compromised by the poor quality of highway markings), but it is not necessary to look inside. If you hadn't looked inside to see the intermittent nature of the lane edge indications on the panel, you would not be unsafe, for your primary control of the vehicle always remains the visual picture you get from looking outside.
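If you wanted to boil that supervision model down to code, it would be something like the sketch below. The names and the torque threshold are mine, purely for illustration; the point is simply that any meaningful driver input wins instantly.

```python
# Illustrative only: the driver is the outer control loop at all times.
from dataclasses import dataclass

@dataclass
class DriverInput:
    wheel_torque: float   # torque the driver's hands apply to the wheel (Nm)
    brake_pressed: bool

def who_controls(driver: DriverInput, torque_threshold: float = 2.0) -> str:
    """Any meaningful driver input disengages autopilot immediately."""
    if driver.brake_pressed or abs(driver.wheel_torque) > torque_threshold:
        return "driver"      # you turned the wheel or touched the brake
    return "autopilot"       # small corrections continue under your supervision
```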
 
Today, Consumer Reports criticized Tesla for using the word "autopilot" to describe its driver-assistance functions of lane-keeping and traffic-aware cruise control. The criticism is unwarranted, I believe, because autopilot has never implied "turn it on and forget it" in airplanes or vehicles. During my 20+ years as an airline pilot, we monitored the autopilot like a hawk whenever it was in a critical situation, such as close to the ground on an autopilot-flown approach. A pilot would have to be nuts to trust the autopilot without monitoring in such a situation, because any number of issues (electrical anomaly, momentary loss of ground signal, etc.) could create an immediately hazardous condition. In a vehicle, hazardous situations are more common: passing close to another vehicle when road conditions are less than ideal for autopilot, glare on the road, etc.

Consumer Reports is therefore incorrect in criticizing Tesla for using the word "Autopilot", and not because of the system's performance in the vehicle. CR is off base because it has no clue how carefully flight crews must monitor the autopilot in an aircraft when it is in a critical situation. Bottom line: Tesla's use of the word "Autopilot" is appropriate.
 
Regardless of what a feature is named, its limitations and behavior need to be explained to the operator.

No name can convey complex ideas like "Autopilot may not see stationary vehicles that are revealed after a car moves out of your lane; be prepared to take over and stop in these conditions." Changing the name will not convey these limitations, nor will it make autopilot safer. These same situations need to be explained whether the system is called Drive Pilot or Distronic+ (MB), Pilot Assist (Volvo), or Autopilot (Tesla). Without explanation, all of these names fail to convey the limitations of the system and can result in dangerous situations. Will changing the name better describe complex driver-assistance systems and better prepare drivers to use them safely? I don't see how it could. Blaming operator error on the name is a cop-out.

If CR wants to blame Tesla in a constructive way, they should say that delivery experience specialists aren't providing adequate training on the subject of vehicle systems, specifically autopilot.

The only effective way to reduce Autopilot accidents will be to educate owners on the proper use and limitations of autopilot.
 
You agree to the terms of use when you enable the feature; you are therefore ultimately responsible. Tesla communicated it as best they could, and any other impression the consumer forms needs to be self-corrected and handled appropriately; otherwise, don't give them that capability. We know how to use the technology. Do you?
 
True, but you *learned* to use it. I've been using it since it first came out and learned its strengths and limitations by trial and error, gaining experience along the way. AP can definitely benefit from training to shorten the learning curve. This would also appeal to the regulators.
 

I think we are in agreement, so I'm not sure why the "dislike." I am saying that blaming the name for false impressions is a cop-out, because the user is responsible for their driving and for knowing how the systems they operate work. Tesla can help owners be responsible for their own actions by providing better autopilot education.
 
Don't sweat the "dislike"; check his profile, he hands out lots of them without explanation. It's a badge of honor. :)
 
I agree with those who advocate better education of autopilot users. Such education would probably not have made a difference in the Josh Brown accident (because he was very experienced at using autopilot and had formed his own habit patterns based upon the system's past performance), but it could be useful in the type of accident where a new autopilot-equipped Tesla owner thinks they have activated autopilot when they really haven't, and then is not in an "I'm ready to take over" mode when the first curve in the road comes up.

The Josh Brown accident is unique because it highlights a rare type of situation that could surprise an experienced user who has had a good experience with autopilot so far. This evening's post from Elon Musk, about using radar decoupled from the camera to create a lidar-like picture of what lies ahead, holds promise for eliminating this type of accident. Nonetheless, the driver remains in charge of the safe conduct of the drive and must be attentive, as we have seen. There will no doubt be other corner cases that pop up in the future, and one should not trust this new technology to the point of not maintaining situational awareness of the threats that lie ahead. No autopilot, whether in a car or an airplane, is without flaws, and much work plus a hardware update will be necessary before true autonomous driving is available to us.
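As I understand the idea (and this is strictly my own guess at the concept, not Tesla's implementation), "decoupling" means trusting radar returns in their own right: accumulate them over successive frames, correct for the car's own motion, and the returns that persist in the same spot form a coarse, lidar-like picture of fixed obstacles ahead. Something like:

```python
# Strictly a guess at the concept, not Tesla's implementation.
import numpy as np

def accumulate_radar(frames: list[np.ndarray],
                     ego_offsets: list[np.ndarray]) -> np.ndarray:
    """frames: per-frame (N, 2) arrays of radar hits in the car's frame.
    ego_offsets: the car's (x, y) displacement since each frame was taken.
    Returns all hits mapped into a common frame; clusters that persist
    across frames suggest real, stationary obstacles rather than noise."""
    return np.vstack([hits + shift for hits, shift in zip(frames, ego_offsets)])
```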
 
Sorry, I should have stated the portion I disagreed with: "If CR wants to blame Tesla in a constructive way, they should say that delivery experience specialists aren't providing adequate training on the subject of vehicle systems, specifically autopilot." I do agree that changing the name will have little effect on the consumer's impression of the feature.

You are correct that CR overreacted, but I disagree that the delivery specialists are not providing adequate training on the vehicle's systems. In fact, I took at least four test drives over the course of two years before purchasing my MS, and then spent several hours both with my DS and reading the online documentation end to end.

Am I the norm? Perhaps not, but users are forewarned and must accept the on-screen agreements before they can enable said features. Besides, what is to stop a non-Tesla dealership from selling a used MS or MX with the tech package and providing no training at all?

As stated by others, you obtained a driver's license by studying, practicing, and taking written and practical tests. If you passed, you are expected to behave responsibly and obey the rules of the road when behind the wheel of a several-thousand-pound vehicle hurtling down the highway. No amount of Autosteer or TACC is going to prevent an accident or crash if the driver chooses to a) ignore warnings, b) distract themselves with other media, documentation, or devices, or c) blatantly disregard all available information and documentation and activate features whose limitations they know nothing about.

Sadly, there will always be those who unintentionally aspire to win a Darwin Award, and driving in such a manner will certainly get you close to that goal. So yes, be vigilant and pay attention at all times while operating your motor vehicle, whether it's an ICE, an EV, or any other type of automobile. We all have to share the roads; please be as considerate of others as you would want them to be of you.
 
Precisely. This Jalopnik piece makes the same argument: http://jalopnik.com/why-teslas-autopilot-isnt-the-menace-you-think-it-is-1783682751
 
No amount of training, short of actual driving experience, will provide a new owner with comfort in using AP. The same concept applies when you first get your driver's license: you do all the book work and watch the horror films, but you only truly learn to drive when you are behind the wheel.
 
Larry, there truly are ways to greatly speed up the learning curve for safe autopilot driving. The first step would be to drive on a road without much traffic and practice taking control from the autopilot: with a twist of the steering wheel, with a push of the brake, and with a push of the button on the end of the autopilot stalk. What this teaches the driver is that the autopilot is only going to be driving until that half-second when you're not happy with its driving, and then you click it off. Next, you show the driver examples of challenges that exceed the autopilot's ability to perform satisfactorily. You drive on a road with poor lane-edge markings and either see the autopilot turn control over to you or steer in an unsatisfactory fashion. Then you take a turn at a slow, carefully chosen speed that nonetheless exceeds the autopilot's ability to execute the turn, briefing the driver beforehand that he or she will have to take over at some point during the turn. Through these simple exercises, you demonstrate that the driver is indeed in control of the outcome of the drive at all times, and that sometimes there's a need to say "not good enough" to the autopilot and take over.

The next step is to teach the driver to evaluate the performance of the autopilot based upon the conditions. On a scale of 1 to 10, how is the autopilot performing? At what number do you take over manually? You then give the driver a chance to drive a road that has some autopilot challenges, such as too steep a curve, poor markings, glare, or a construction zone. You ask the driver to tell you, as soon as possible, about the threat to autopilot that lies ahead (thereby training the driver to use some free brain bandwidth to anticipate problems before they become obvious). The driver gets practice taking over, and if the driver doesn't take over soon enough, the instructor (as was properly briefed ahead of time) says "take over." Thus, in a controlled environment that challenges the current autopilot technology, the driver becomes confident. Finally, you present the driver with a road that is totally unsuitable for autopilot usage, having briefed ahead of time that upon encountering such a road, the student is to turn off the autopilot and say "unsuitable road."
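A toy version of that "rate it 1 to 10" exercise, just to make the idea concrete (the takeover threshold here is a personal training parameter, not anything official):

```python
# Toy version of the "rate it 1-10, pick your takeover number" exercise.
# The threshold is a personal training parameter, nothing official.

def should_take_over(rating: int, takeover_threshold: int = 6) -> bool:
    """rating: the driver's 1-10 score of current autopilot performance."""
    return rating <= takeover_threshold

# Example: a construction zone degrades performance to a 5 -> take over.
assert should_take_over(5) is True
assert should_take_over(9, takeover_threshold=6) is False
```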

In flying, we would never expect students to just learn from their own mistakes. We challenge them, give them the ability to fly safely in marginal situations, and teach them to avoid situations beyond a certain level of difficulty. The same type of training can be given to autopilot drivers. Inexpensive simulators could be created if training in real Teslas becomes too scary. The goal, though, is to show the new driver what the marginal situations look like, how the autopilot performs in such situations, and how to be comfortable taking over.