Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Any details on headline - Arizona pedestrian is killed by Uber self-driving car

I was on board with you until this sentence. The record of the person should not matter.

We cannot blame the safety driver 100%. If the safety driver is to blame 100% then what blame does the Uber software have, 0%?

If the video is indeed representative of what the scene looked like (although I doubt that), I could probably not have stopped in time myself. But I expect the system to stop, because it is using at least 3 types of sensors.
That’s not how I meant it. I mean that if the reporting is correct about the driver having multiple past traffic violations, this is Uber not screening its safety drivers well enough, if only because of the potential optics. Of course blame is multifaceted; however, in this instance the company, with a reputation for poor hiring practices and aggressive corporate action, has screwed up. They killed someone through negligence.
 
I've got a question. I personally didn’t see the woman until there was no time to react. Are we sure it would have made a difference if the driver had been looking at the road? This is similar to a shed I almost hit; I only avoided it with a last-second maneuver.

Does Uber’s car use Volvo’s technology or their own? I wonder if this was on Uber's own system or Volvo’s. Would the car react in time, or would it just lessen the impact in its current form? I'm not sure it would stop and avoid the accident either. The car would probably brake at the last second to lessen the impact.

She was staring at the phone, typical Uber driver. A homeless person wandering across fast-moving traffic; I've seen plenty of those here too. How tragic. It’s like a perfect poop storm.

I feel bad for the people who paid for FSD.
 
I got a question. I personally didn’t see the woman until there was no time to react. Are we sure that it would make a difference if the driver is looking at the road? This is similar to a shed I almost hit, I only avoided it with a last sec maneuver. .....

A dashcam doesn't represent what the human eye can see. I'm sure that most people would have seen that there was something and reacted accordingly. Otherwise, how about reducing your speed?
 
I am reminded of a seminal case in what is now thought of as classic “soft side” computer science literature. Briefly, it involved a radiation therapy machine called the Therac-25. Operators soon discovered that, during the process of specifying how much radiation a patient was to receive, an error message that periodically appeared on the console could be made to go away by pressing the “P” key repeatedly. This soon became de rigueur, and ultimately one day they ended up cooking a patient who, if I recall correctly, subsequently died.

One line of questioning that soon arose involved responsibility. Who or what was responsible? The technicians? The hospital? The vendor of the X-ray system including the console? The programmers?

Fast forwarding to the case at hand, it appears that multiple parties are again responsible. What will occupy graduate seminars for years to come will be whether the ensuing division of blame is/was correct.

Clearly, crossing a poorly-lit road without supplemental illumination, in the path of oncoming traffic, is unwise.

Just as clearly, distracted driving is usually the fault of the driver.

Any false sense of security while driving a self-driving car must be examined in the context of what worked, failed, and would have worked if operational. For example, was the driver aware that Lidar had been disengaged and therefore the car was less... I hesitate to use the word “aware” but you get the idea... than usual?

Lidar or no, another question or six have to do with AEB.

This is probably going to be a lot bigger deal than another pedestrian fatality. May they RIP.

Point being that regardless of how the blame pie is sliced between the pedestrian, the driver, and the various hardware and software systems, this was a historic event. I hope, for the sake of the entire endeavor, that the findings will result in industry-wide improvements.

We already know from that middle-mile trucking startup, also in Arizona, which will soon have runs from Tucson to Phoenix, that its trucks can react in as little as 0.1 second versus about 1 second for humans. Not saying that would have saved the pedestrian’s life, or averted the accident entirely, but...
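To put those reaction times in perspective, here is a minimal sketch of how far a car travels before braking even begins. The 40 mph speed roughly matches the speed reported in this crash; all of this is back-of-envelope, not data from the vehicle.

```python
# Distance covered during the reaction interval, before any braking,
# comparing a ~0.1 s automated reaction with a ~1 s human reaction.
# Speed and reaction times are the figures discussed in this thread.

def reaction_distance_ft(speed_mph: float, reaction_s: float) -> float:
    """Feet traveled between hazard appearing and braking starting."""
    fps = speed_mph * 5280 / 3600  # mph -> feet per second
    return fps * reaction_s

auto = reaction_distance_ft(40, 0.1)   # ~5.9 ft
human = reaction_distance_ft(40, 1.0)  # ~58.7 ft
print(f"automated: {auto:.1f} ft, human: {human:.1f} ft")
```

At 40 mph, a full second of human reaction time costs roughly a car-length-and-a-half more travel than a tenth-of-a-second machine response.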
 
...I personally didn’t see the woman until there was no time to react...

At a minimum, a driver needs to make some attempt: hands on the steering wheel, slamming on the brake, even if it was too late. A driver needs to show there was an effort, even if it came too late.

As pointed out by @zmarty's link, human eyes can certainly see the pedestrian under those two dim streetlights even when the camera cannot.

...Does Uber’s car use Volvo’s technology or their own?...

Once Uber made its modifications, it's on Uber's dime and no longer Volvo's technology.

...Would the car react enough or would it just slow the impact in its current form?...

LIDAR shines its own laser beams, so it works in complete darkness, and the good units could have seen her three football fields away: 300 meters, or more than 900 feet.

It's a very easy, well-proven collision-avoidance scenario for any generic self-driving car.
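As a sanity check on that range claim, a quick sketch of the time budget a 300 m detection range would buy at roughly the speed reported in this crash. The 300 m figure is the poster's claim for "good" units, not a measured spec.

```python
# Seconds between first possible LIDAR detection and reaching the
# pedestrian, assuming constant speed and the claimed 300 m range.

def detection_time_s(range_m: float, speed_mph: float) -> float:
    speed_ms = speed_mph * 0.44704  # mph -> meters per second
    return range_m / speed_ms

print(f"{detection_time_s(300, 40):.1f} s of warning")  # ~16.8 s
```

Even if the effective range were a fraction of that, the time budget dwarfs the 1.5 seconds of visibility the dash-cam video suggests.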

Something is wrong at Uber here, as no LIDAR data was released. @zmarty's link heard that Uber turned the LIDAR off to test operations without it, without telling anyone.

...She was starring at the phone, typical uber driver...

"The safety driver is clearly relying on the fact that the car is driving itself. It's the old adage that if everyone is responsible no one is responsible," Smith said. "This is everything gone wrong that these systems, if responsibly implemented, are supposed to prevent."
 
Guess you all know the Swiss cheese model of safety; picture from Wikimedia Commons:

[Image: Swiss_cheese_model_of_accident_causation_with_additional_labels.png]

Align all the weaknesses, and an event occurs.
So far these weaknesses have been identified:
- night time
- crossing in an unlit area
- crossing outside the crosswalk
- one area poorly lit, others well lit
- no reflectors on the pedestrian
- no high beams in use
- high speed, but not speeding
- sensor failure - did not sense (speculation)
- software failure - return dismissed as noise (speculation)
- sensor disabled (speculation)
- safety driver unaware

Close even one of these gaps and the accident could have been avoided, or its severity reduced.
Uber and the others definitely need to work on their safety driver concept and more, while the rest of us need to wear reflectors at night.
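The layered-defense idea can be sketched numerically: an accident requires every layer to fail at once, so closing any single hole multiplies the overall risk down. All the failure probabilities below are invented purely for illustration.

```python
# Toy Swiss cheese model: accident probability is the product of
# independent layer-failure probabilities. Numbers are made up.
from math import prod

layers = {
    "lighting": 0.3,               # unlit crossing point
    "pedestrian_visibility": 0.5,  # no reflectors, dark clothing
    "sensors": 0.05,               # LIDAR/radar miss (speculative)
    "software": 0.05,              # return dismissed as noise (speculative)
    "safety_driver": 0.2,          # driver not watching the road
}

p_accident = prod(layers.values())
print(f"all holes aligned: {p_accident:.2e}")

# Close just one hole (an attentive safety driver) and the odds drop 20x:
attentive = {**layers, "safety_driver": 0.01}
print(f"attentive driver:  {prod(attentive.values()):.2e}")
```

The independence assumption is the usual caveat with this model, but the point stands: any single layer working would likely have prevented or softened the outcome.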

Now the big question, what does this mean for us FSD-buyers?
 
Daktari, when I was using AP2 yesterday in broad daylight, a guy was crossing the road some way up and my car slowed down as it saw the hazard, then picked speed back up once he was on the pavement. The issue I see is whether, in the dark, the camera would pick up that person, or whether the car would have to rely on radar and sonar. The difficulty with autonomous cars is that if they assume everything coming from the side is a hazard and will not stop, they will be stop-start all the time. So the car must calculate that something becomes a hazard once it enters a certain zone, and it could be that in this case the lady unfortunately crossed that point too close to the car for it to react.
 
Very interesting if this was an example of camera plus radar without lidar. Do we have any examples of how well Tesla AP senses and reacts to pedestrians in these situations? And this pedestrian had a metal bicycle alongside her... it should have been an easy case, I would think.

...the LIDAR was off in order to test operations using just camera and radar...

...a more advanced radar also should have detected her and her bicycle (though triggered no braking) as soon as she entered the lane to the left, probably 4 seconds before impact at least. Braking could trigger 2 seconds before, in theory enough time...

I would have hoped a Tesla would have spotted this and reacted, but I personally am not so sure it would see this and stop in time.
 
It certainly looks bad for Uber | Brad Ideas

"I have seen one report -- just a rumour from somebody who spoke to an un-named insider, that the LIDAR was off in order to test operations using just camera and radar.

[...]

The road is empty of other cars. Here are the big issues:

  1. On this empty road, the LIDAR is very capable of detecting her. If it was operating, there is no way that it did not detect her 3 to 4 seconds before the impact, if not before. She would have come into range just over 5 seconds before impact.
  2. On the dash-cam style video, we only see her 1.5 seconds before impact. However, the human eye and quality cameras have a much better dynamic range than this video, and should have also been able to see her even before 5 seconds. From just the dash-cam video, no human could brake in time with just 1.5 seconds warning. The best humans react in just under a second, many take 1.5 to 2.5 seconds.
  3. The human safety driver did not see her because she was not looking at the road. She seems to spend most of the time before the accident looking down to her right, in a style that suggests looking at a phone.
  4. While a basic radar which filters out objects which are not moving towards the car would not necessarily see her, a more advanced radar also should have detected her and her bicycle (though triggered no braking) as soon as she entered the lane to the left, probably 4 seconds before impact at least. Braking could trigger 2 seconds before, in theory enough time."

Quasi-theory:
Lidar returns the distance to the first (only) object the beam bounces off of. Because of this, in rain or snow it needs to filter out the noise created by transient close returns. One way to do this is with a low-pass filter that retains only the farthest distance for a point or region. That allows it to see through the clutter to the ground / actual object.

Now insert a person pushing a bicycle tangential to the sensor. The bike itself is a structure with a small cross-section, so it will cause near/far returns; additionally, the spokes are moving rapidly, also creating there/not-there returns.

If the wheel in front and the wheel behind are sufficient to trigger the rain/snow filter, the system could ignore the entire region where the person is moving (with sufficient filter strength). Further, if the system uses the lidar determination to gate video processing (to avoid trying to pattern recognize rain), then the vision system will also ignore the object.
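That quasi-theory can be sketched as a toy "keep only far returns" filter. To be clear, this is pure speculation matching the post above, not Uber's actual pipeline; the distances and the threshold are invented for illustration.

```python
# Toy clutter filter: keep only returns near the farthest distance in a
# region, discarding transient near hits (rain droplets, or here the
# intermittent returns off bicycle spokes and rims).

def filter_region(returns_m, clutter_ratio=0.5):
    """Keep returns whose distance is at least clutter_ratio times the
    farthest return in the region; nearer hits are treated as noise."""
    far = max(returns_m)
    return [r for r in returns_m if r >= clutter_ratio * far]

# Beams through a spoked wheel: mostly background at ~60 m, with
# intermittent near hits at ~25 m from the spokes and rims.
region = [60.1, 25.3, 60.0, 24.8, 59.9, 25.1]
print(filter_region(region))  # the ~25 m hits vanish as "rain"
```

With filtering this aggressive, the thin tangential object is erased from the point cloud, which is exactly the failure mode the theory proposes.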

Forgetting the case for a second, I personally am not so sure a Tesla would see this and stop in time. I think it would do an emergency stop at the end to lessen the impact. What do you guys think?

From the dash cam, I came up with maybe 2 seconds from shoe to impact. At 40 MPH, that is 117 feet (58.7 fps). An XC90 can panic-stop from 40 in about 65-70 feet in good conditions. So with one second of reaction time, the car could have mostly stopped (or swerved, code allowing). From others' posts, actual visibility would have been much better.
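Rechecking that arithmetic in a quick sketch. The 2-second window, the 65-70 ft XC90 stopping distance, and the 1-second reaction time are the figures from the post above, not measured data.

```python
# Does one second of reaction plus a panic stop fit in the 2 s window?

MPH_TO_FPS = 5280 / 3600

speed_fps = 40 * MPH_TO_FPS          # ~58.7 ft/s
available = speed_fps * 2.0          # ~117 ft from "shoe to impact"
reaction = speed_fps * 1.0           # ~59 ft lost to reaction time
braking = 70                         # worst of the quoted 65-70 ft
print(f"{available:.0f} ft available vs {reaction + braking:.0f} ft needed")
```

It comes out to roughly 117 ft available versus 129 ft needed, so not quite a full stop, but most of the speed scrubbed off before impact, which matches the "mostly stopped" conclusion.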
 
I think a Tesla today would have killed the person, but that's just wild speculation on my part. I have not seen or experienced what others have described with the car slowing down for people. Granted, I never trust the car enough to "test" it, unlike stopped-car testing, where I'm comfortable knowing I can stop the car myself when it doesn't react as I think it should. I'd never want to test this on a pedestrian...
 
I believe the blame is on Uber; they have a primary responsibility to test the technology in a safe way. If I had time, I'd post links to the many times their autonomous cars wrecked, which eventually got them kicked out of California. They are affecting things for everyone. Again, this is akin to a Tesla employee striking and killing a pedestrian while testing Autopilot on public roads. If I had time I'd link the articles I remember from the last couple of years about accidents and safety issues with their system.
 
As a photographer (my side gig), I can attest that cameras absolutely do not have the same contrast resolution as the human eye. Anyone who has tried to take a sunset photo and gotten only dark silhouettes or an overexposed sun can attest to the difficulty of reproducing the contrast range your eye can see. In addition, a dash camera generally uses a small sensor, which typically reduces dynamic range compared with a large-sensor DSLR, for example.

I believe the driver would have seen the person much earlier than the video implies and could have braked, either reducing the impact speed or missing the pedestrian entirely.

It is inexcusable that the driver was inattentive. Sure, a normal person can be inattentive at any time, but a driver responsible for testing new vehicle technology on public roads should be held to a higher standard. Uber’s test drivers should all be highly trained and kept on schedules that account for human factors. I work in flight test. Our test pilots aren’t just normal pilots; they are trained specifically for the test environment. We also monitor flights using engineering personnel, and those people are also specially trained. We maintain work schedule limitations for testing to account for human factors such as fatigue.

You may think, well those are airplanes, lives are obviously in danger! But this “test pilot” that Uber was using was driving a vehicle that can easily put lives in danger as well (her own as well as the public who hadn’t consented to participate in this test). She should have had a similar level of training and protocols in place to keep her attentive at all times and ready to react to any unusual situation. If it is true that the lidar was off, she should have been briefed about the conditions of the test and the limitations of the platform with the lidar system off.