stopcrazypp
Well-Known Member
Because the final FSD code is completing final validation and will be released in December 2016. Because of the 8 cameras. Because the car apparently can already detect pedestrians, road signs, lanes, brush, and drive itself around in the parking lot and on city streets. You mean that was just a scam when the car will be oblivious to what other cars are doing and won't see a 6 ft construction cone or merging lanes?
The fault in this accident is mostly Tesla's, due to poor programming. If the lane markers were ambiguous, the car should have followed the cars in front. This is ENHANCEDautoPILOT, not autoSTEER. This is FULL-SELF-DRIVE with advanced NVIDIA technology that will drive completely unassisted, coast to coast, including charging stops, by the end of 2017.
If there is a positive, maybe your video data from the crash will be uploaded to the "neural net" to train all future Teslas not to plow into construction cones. But I suspect this crash will have to be repeated a few hundred times before the magic Tesla AI picks up on this.
Yeah, all I am seeing is you confusing FSD with EAP. EAP software has nothing to do with FSD. EAP currently only uses 2 cameras (other than the recent update that takes snapshots). Who told you FSD was completing final validation in Dec 2016? You're confusing FSD with EAP.
The logic of the two has to be different because EAP always assumes the driver can take over (so it errs on the side of avoiding false positives), while FSD has to assume there is no driver (so it errs on the side of avoiding false negatives). EAP also operates under the assumption that no unauthorized lane changes can occur (with the driver handling ambiguous lanes). FSD, on the other hand, will be allowed to make its own lane changes and to cross lanes as necessary.
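That asymmetry boils down to tuning a detector's decision threshold differently depending on whether a human fallback exists. A minimal sketch of the idea (hypothetical function, confidence values, and thresholds; not Tesla's actual logic):

```python
def should_brake(obstacle_confidence: float, driver_present: bool) -> bool:
    """Decide whether to brake for a possible obstacle.

    With a supervising driver (EAP-style), a missed detection can be
    caught by the human, so we demand high confidence before acting
    (fewer false positives, i.e. less phantom braking). With no driver
    (FSD-style), a miss is unrecoverable, so we act on much weaker
    evidence (fewer false negatives). Thresholds are illustrative only.
    """
    threshold = 0.9 if driver_present else 0.3
    return obstacle_confidence >= threshold

# A construction cone detected with only 50% confidence:
print(should_brake(0.5, driver_present=True))   # EAP-style: False (ignore, driver handles it)
print(should_brake(0.5, driver_present=False))  # FSD-style: True (brake anyway)
```

The point of the sketch is that the same perception output leads to opposite actions: a system designed around a supervising driver can rationally ignore evidence that a driverless system must act on.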