
Autopilots and degradation of driving skills


Matias

In the future, automation will do most of the driving. But in certain situations it will hand the wheel back to the driver. This will happen when the car does not have enough information to drive safely, most likely in heavy rain or on a road covered with snow. So in practice the average driver will drive rarely, but when he or she is asked to take over, it will be in the worst weather.

In aviation the situation is the same. On an average flight the pilot flies manually for only about four minutes: two during ascent and two during descent. But in some rare situations the autopilot switches off and gives the stick back to the pilot. When this happened on AF 447, the pilots did not know what to do and the plane crashed.

Air France Flight 447 - Wikipedia, the free encyclopedia

Could this happen in a future self-driving car, when it suddenly gives the wheel back to the driver in heavy rain or on a snow-covered road?
 
One of the most frequently quoted mantras in aerospace is that when something goes wrong, you can't just pull over.

If we do get to the point you're describing where most drivers don't drive by hand much or have the experience, the answer will be for the car to pull over and stop before giving the driver the chance to decide whether or not to proceed.

Realistically, though, between being able to get data about conditions long before the car actually encounters them and having sensors that see things people can't, I think that by the time we're facing the degraded skill sets you describe, we'll be able to build cars that can handle those worst-case conditions better than human drivers anyway.

I'm pretty sure your scenario is at least thirty years away, though.
Walter
 
Brings to mind my first encounter with someone driving cluelessly while talking on a cellphone. On the Brooklyn-Queens Expressway in Queens, NYC - three lanes wide, and the perp was in all three at one time or another. Several cars were honking to no effect, so I backed off to avoid/enjoy it. It was just that strange an event, seeing it for the first time.
 
Actually, there is a different, fairly recent incident of this exact problem: Asiana Airlines Flight 214 - Wikipedia, the free encyclopedia

"Over-reliance on automation and lack of systems understanding by the pilots were cited as major factors contributing to the accident.[82]"

Asiana Plane Crash Caused by Pilots’ Overreliance on Automation (Skift)

If I recall correctly, the pilots there rely so much on automation that they rarely have to do anything manually at all, whereas in the US pilots are frequently tested on these systems and on manual control.

When it comes down to it, they basically just didn't know how to fly the plane, and they didn't understand its systems. At SFO, because of the way the airport is set up, manual landings are common, or at least more manual intervention is required. They didn't have a clue what to do because they had rarely had to fly without those systems. I'm probably generalizing, and I don't have time to read up on it, but the point is valid: if you don't know how to drive, you shouldn't be driving and relying on autopilot to do everything for you. There will be times when manual driving is required, so you need to be awake, aware, sane, and know how to use the brake, accelerator, and steering wheel.
 
In the meantime, automation has made aviation safer and crashes are less frequent. Expect the same to happen with cars: overall crashes will decline, but there will be outliers caused by inattentive or inexperienced drivers when automation fails.
 
Your hypothesis is not comparable. In the Air France case, the plane's design and the situation confused the pilots, and it was a dangerous situation in which they had only a short time to avoid a crash.

You're talking about a situation where an autonomous car (not an autopilot!) hands control back to a driver in conditions that aren't easy. The answer is that it's really not a big deal at all. Driving in adverse conditions is actually extremely easy: pay attention, keep your speed down, make no sudden maneuvers, avoid lane changes where possible, and put your flashers on if others are driving much faster than you.

The benefits will dramatically outweigh the disadvantages, and the great thing is that there's a lot of room in what the system can cost, since collision and medical costs are a substantial part of insurance, and there are huge indirect benefits to be gained from accident avoidance and mitigation.
 
Your hypothesis is not comparable. In the Air France case, the plane's design and the situation confused the pilots, and it was a dangerous situation in which they had only a short time to avoid a crash.

That's not quite correct. The only instruments that failed were the multiple airspeed indicators, and even those were working correctly again well before the plane crashed. Every other instrument was indicating a stall. And yet the Pilot Flying kept pulling the plane's nose up until it crashed into the ocean, and they actually had several minutes (4m20s from autopilot disconnect to the crash, I just checked) to correct it.
 
I think the need for driver competence will diminish more quickly than actual driver competence. As long as we still require drivers to know how to drive to get their license, and make sure that the AI controlling the car doesn't do things that are dangerous for a human to do (and thereby teach the driver bad habits), overall accident and fatality numbers are going to go down. It doesn't matter if the rate of accidents per human-driven mile goes up slightly, as long as the overall trend is toward fewer accidents per mile traveled.
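To make that last point concrete, here's a back-of-the-envelope sketch in Python. The numbers (automation share, per-mile accident rates) are purely illustrative assumptions, not real statistics; the point is just that the blended accident rate is a mileage-weighted average, so it can fall even while the rate on human-driven miles rises.

# Purely illustrative numbers, not real statistics.
human_only_rate = 2.0      # accidents per million miles, all-human driving today (assumed)
automation_share = 0.95    # fraction of future miles driven by the automation (assumed)
automated_rate = 0.2       # accidents per million automated miles (assumed)
degraded_human_rate = 3.0  # human rate after skills degrade (assumed worse than today)

# Blended future rate: mileage-weighted average of the two modes.
blended_rate = automation_share * automated_rate + (1 - automation_share) * degraded_human_rate

print(f"Today, all human driving: {human_only_rate:.2f} accidents per million miles")
print(f"Future, blended:          {blended_rate:.2f} accidents per million miles")
# 0.95 * 0.2 + 0.05 * 3.0 = 0.34, well below 2.0 even though human skills got worse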