AlanSubie4Life
Not sure how they can say it indiscriminately mows down children; all the children in the video are white.
Yeah. Though mostly they were covered up, so maybe it was profiling the choice of clothes.
Seems like a pretty big failure of FSD Beta 10.12 (more details: The Dangers of Tesla's Full Self-Driving Software - The Dawn Project )
But a little detail: it slowed from 38 mph or so to 25 mph (and perhaps 18 mph at the time of impact) before the first collision (they claim 25, but I'm not sure about that). And it was telling the driver to take control (too late). Of course, the audio from the car was not included in the videos.
They seemed unable to afford a proper in-car video mount, or a camera with sufficient depth of field to keep both the screen and the distance reasonably in focus. That makes it hard to see what is happening. And no audio, as mentioned.
It veered to the side on the last two tests.
I wonder how they had FCW (Forward Collision Warning) set (not sure if it would matter for anything beyond in-cabin alerts, but presumably it would not change the driving behavior).
The setup with cones seems a bit suboptimal to me. Why not do it with standard road markings? Not that this is an excuse; FSD Beta 10.12 should not be plowing into obvious obstacles. I just wonder (given that it seemed confused prior to the impact) whether it's actually a great test, or whether it may instead be uncovering some sort of poorly-coded corner case.
To be clear, I suspect it would hit the mannequin with standard road markings, unless they were in a crosswalk.
But I do wonder what is wrong with the VRU (vulnerable road user) identification here. I'm slightly surprised it fails so badly. I'm not at all surprised it hit the VRU, though.
I wonder how many tests they did?