The focus of FSD development has been on items like high-resolution maps and object detection - which is enough to handle typical driving situations.
What about unusual situations?
For example, consider an intersection with a police officer manually directing traffic using a combination of a whistle and hand signals. Will the FSD software be able to detect this? Will it be able to isolate the police officer in the intersection, rule out pedestrians in the area, and listen for a traffic whistle or recognize the officer's hand signals?
There are other circumstances where emergency vehicles or trains are nearby - but not yet visible - and use audible warnings (sirens or horns) to announce their approach, so vehicles can slow down before making visual contact. Will the FSD software be able to detect this?
When operating under EAP, it's the driver's responsibility to maintain control of the vehicle. Under FSD, there may not even be anyone inside the vehicle.
Has there been any mention by Tesla about how FSD will be able to react to these situations?
Another one... What if the software makes a mistake on the speed limit or fails to detect a stop sign or traffic signal? A police car starts pursuing the car with siren and lights, signaling the driver to pull over. Will FSD comply - pull over and come to a complete stop? Since there might not be any driver, the officer wouldn't be able to give anyone a ticket. And once the officer has taken down the vehicle information, how will FSD detect that it's OK to resume travel?
It will likely be the car's owner - not Tesla - who will be held responsible if these situations aren't handled properly.
It's difficult enough just to get normal driving working, but these unusual situations may be just as hard to handle - and is the AP2 sensor suite even sufficient for them (for example, does it have microphones to listen for audible warnings)?
Since Tesla claims the FSD hardware (sensors/processors) should be sufficient to get regulatory approval for full self-driving (once the software is ready), should we assume Tesla has anticipated these issues and has a plan to handle them with AP2.x?