Another tragic fatality with a semi in Florida. This time a Model 3

Alternatively:
Computer: beats me, but I'm positive it's not a drivable surface, so if it is going to intersect with my path, I'm stopping.

Like training bank tellers: you don't need to know all the different ways to counterfeit, you only need to be really good at recognizing real money. Or, for parents, recognizing your own kids.

Counterfeit Detection (Part 1) - Tim Challies
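In machine-learning terms that's one-class (anomaly) detection: model only the genuine class and flag anything that falls outside it. A minimal sketch using scikit-learn, with synthetic data purely for illustration:

```python
# Toy illustration of the "bank teller" idea: train only on genuine examples
# and flag anything that doesn't look like them. Synthetic data; the feature
# values and model settings are arbitrary.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
genuine = rng.normal(loc=0.0, scale=1.0, size=(500, 4))   # "real money" features
model = OneClassSVM(nu=0.05, gamma="scale").fit(genuine)  # never sees a fake

new_notes = np.vstack([
    rng.normal(0.0, 1.0, size=(3, 4)),  # more genuine notes
    rng.normal(6.0, 1.0, size=(3, 4)),  # counterfeits of a style never trained on
])
print(model.predict(new_notes))  # +1 = looks genuine, -1 = flagged
```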
So the computer won't take caution until the triangle is in its path, while passing it at full speed.

A driving coach doesn't have to tell the student about each case either, but the student driver will recognize potential dangers.
But the computer will only follow the known rules and instructions.


iPhone face recognition will not recognize the owner if he gets plastic surgery, while a friend will still recognize him.
 
What is radar braking and where is it enabled?

I just wish it would react better to cars halfway in the lane. It seems to have no reaction until the other car is almost all the way in the lane. The problem could be that they're training it using Tesla drivers :p

If this is ever going to be much safer than the best human driver (not just drivers in general; the very best human driver is the most reasonable metric), FSD with HW 3.0 will need to sense the vector, wheel angle, lane position, etc., of a vehicle in an adjacent lane so it can make a nice, gentle, reasonable reaction to the vehicle entering the lane, prior to entry, prioritizing maintenance of speed without compromising safety. Otherwise FSD is going to get rear-ended when it makes an abrupt reaction (and that would make it less safe than the best human driver; the best human drivers drive so that other humans can be incompetent and still not hit them).
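To make that concrete, here is roughly the kind of check I have in mind. A toy sketch only; every name, number, and threshold is mine, not anything Tesla actually runs:

```python
# Illustrative sketch: estimate an adjacent car's drift toward our lane and
# ease off early, instead of braking hard after it crosses the line.

def time_to_lane_entry(lateral_offset_m: float, lateral_vel_mps: float) -> float:
    """Seconds until the adjacent car's edge reaches our lane boundary.

    lateral_offset_m: distance from the car's near edge to our lane line.
    lateral_vel_mps:  drift toward us (positive = closing).
    """
    if lateral_vel_mps <= 0.0:
        return float("inf")  # drifting away or holding its lane
    return lateral_offset_m / lateral_vel_mps

def plan_reaction(tte_s: float, ttc_s: float) -> str:
    """Pick a gentle action well before the cut-in completes.

    tte_s: predicted time until the other car enters our lane.
    ttc_s: time-to-collision with it at the current closing speed.
    """
    if tte_s > 4.0:
        return "maintain speed"          # no imminent cut-in
    if ttc_s > tte_s + 2.0:
        return "maintain speed"          # it will merge with room to spare
    if ttc_s > tte_s:
        return "ease off accelerator"    # open the gap smoothly
    return "brake moderately"            # cut-off likely; still no panic stop

# Example: car edge 0.6 m from our lane line, drifting in at 0.3 m/s,
# and we'd otherwise reach it in 3 s.
tte = time_to_lane_entry(0.6, 0.3)       # 2.0 s
print(plan_reaction(tte, 3.0))           # -> "ease off accelerator"
```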

I used AP this weekend, and saw a minivan about 50 yards ahead that looked suspicious in the adjacent lane to the right (it was drifting purposefully left in its lane; I don't remember if there were other cues), and I was closing on it at 30 mph or so. AP wasn't doing anything, so I signaled for it to move over a lane to the left while the minivan was still in the adjacent lane, just as the minivan started to make the predicted switch to cut me off (I don't remember whether it ended up signaling or not). FSD needs to have similar (but superior) predictive abilities, as I am definitely not the best human driver.

FSD will need to be able to duplicate this behavior and pick up on the subtleties before release, and have predictive abilities exceeding those of humans. Seems like a tall order for the NN, but maybe not. In any case, creating that behavior is independent of the sensing suite (lidar, radar, cameras) that is in use. It also needs to adjust following distance based on how closely people behind the Tesla are following (that's a lot easier, though I'm not sure it can be done with the current sensing suite).

There's all this talk about this particular truck scenario getting fixed by FSD. And maybe this specific case will be, not sure. But in the end, FSD when it is released is also going to have fatal accidents which humans would generally not have. It's just the reality, even if FSD ends up being really quite good (which it may or may not be). Just wonder how people will react to those events. To start with, the fatal accidents will occur when the nags are still active (must have full driver attention). Whether they'll ever get to the point where Tesla can get approval for release of L3/L4, I don't know. But if they do, the fatal accidents will continue. It'll be interesting to see how things evolve.
 
So the computer won't take caution until the triangle is in its path, while passing it at full speed.
As I said, 'if the object is going to intersect the vehicle path', not 'if it is already in the vehicle path'.


But the computer will only follow the known rules and instructions.
Known rule: don't hit things (or: drive in a way to reduce the chance of hitting things). That alone prevents the majority of accidents.
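To spell out "going to intersect the vehicle path" as an actual check, here's a toy version; the constant-velocity prediction and every number in it are invented for illustration:

```python
# Minimal sketch of the stated rule: brake if a detected object's predicted
# position intersects our predicted path, before it is ever "in" the path.

def paths_intersect(ego_pos, ego_vel, obj_pos, obj_vel,
                    horizon_s=3.0, step_s=0.5, clearance_m=2.0) -> bool:
    """Roll both tracks forward under constant velocity and check clearance."""
    t = 0.0
    while t <= horizon_s:
        ex, ey = ego_pos[0] + ego_vel[0] * t, ego_pos[1] + ego_vel[1] * t
        ox, oy = obj_pos[0] + obj_vel[0] * t, obj_pos[1] + obj_vel[1] * t
        if ((ex - ox) ** 2 + (ey - oy) ** 2) ** 0.5 < clearance_m:
            return True
        t += step_s
    return False

# A stationary triangle 20 m ahead in our lane is flagged well before we
# reach it, not once it is already in the vehicle path.
print(paths_intersect((0, 0), (10, 0), (20, 0), (0, 0)))  # True
```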

iPhone face recognition will not recognize the owner if he gets plastic surgery, while a friend will still recognize him.

Facial recognition for security is specifically designed to be overfitted relative to general recognition NNs.
 
As I said, 'if the object is going to intersect the vehicle path', not 'if it is already in the vehicle path'.



Known rule: don't hit things (or: drive in a way to reduce the chance of hitting things). That alone prevents the majority of accidents.



Facial recognition for security is specifically designed to be overfitted relative to general recognition NNs.
Rule is "don't hit things"... but it is more likely to hit the triangle because it won't take precautions.
 
This
Tesla driver gets license suspended after drunkenly falling asleep on Autopilot

And this
Tesla on Autopilot drove 7 miles with sleeping drunk driver, police say
And plenty more examples. These should not be possible, but Tesla doesn't implement better controls, relying instead on a blurb in the owner's manual stating the driver must be in control.
5 people (who hit disagree) think it's great that the technological marvel accepts a drunk driver and drives him this far without waking him up. They seem to feel it's not necessary for the car (soon able to drive itself) to set a certain standard for its legally still-in-control driver.
The world's safest cars don't need to check up on the driver, right? Now we have a drunk driver on the highway who in another car might not have made it that far. Anyway, we have a Tesla driving around unchecked by its incapacitated driver, and it could easily have refused him as a driver, or even called the cops on him before setting off.
 
Nobody said ban. What I said is that better controls need to be implemented. Having a hand on the wheel with my eyes closed does not work. The vehicle needs a better monitoring system. Other vehicles do it, other vehicles that don't even have a self-driving (assisted-driving, whatever you want to call it) system. Sure, stupid is as stupid does, but I don't have to hand the guy the scissors before he runs off with them. It's not an on-or-off situation. It could be better. That's all I am saying. That, and I don't like that Tesla is content with the end user performing all the testing on public roads. If the system worked, it wouldn't inadvertently drive into trucks or steer into center dividers. It's these small issues that need to be worked out before the general public is ready to use it. The system gives too much of a false sense of security. Maybe not to those of us who realize the limitations, but we are the minority.
Another post with 5 disagrees. Forum members holding Tesla to lower standards than other brands set for themselves. Fascinating ethics among this demographic of 1%ers.
 
5 people (who hit disagree) think it's great that the technological marvel accepts a drunk driver and drives him this far without waking him up. They seem to feel it's not necessary for the car (soon able to drive itself) to set a certain standard for its legally still-in-control driver.
The world's safest cars don't need to check up on the driver, right? Now we have a drunk driver on the highway who in another car might not have made it that far. Anyway, we have a Tesla driving around unchecked by its incapacitated driver, and it could easily have refused him as a driver, or even called the cops on him before setting off.
No car has a breathalyzer as standard equipment.

If I might provide an analogy: modern medicine has few if any 100% effective treatments. There are drugs and procedures that may address a problem but may also have complications, including death. Even though it is known that surgery is not 100% safe, it is allowed. Yet a vehicle that improves some accident rates is criticized for all the cases it does not currently handle.
 
If multiple sensors (thereby reducing false positives) identify an obstacle, why should the car allow, for example, 100% forward acceleration and power to the tires? This is common when the operator confuses the accelerator with the brake; one such case here: Tesla Model 3 Crashes Into Dry Cleaners, Does Major Damage: Video, where a Tesla smashed through a storefront.
Teslas fight for your free will. If you want to ram a building in Ludicrous mode from a standstill, that's your right. And it will.
The safest car in the world need not lose a game of chicken to a building. And it's by design: a feature, not a bug. They could easily prevent it, but didn't, even after their instant maximum torque proved cumbersome. My grandmother actually died in such an accident, whether it was a throttle-by-wire error or driver error. People die and Tesla seems fine with that. How this management team is going to make FSD work is beyond me. OK, they lost some people and hired others. Will that change anything?
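The kind of interlock being asked for is not exotic. A toy version, with invented names and thresholds that are obviously not Tesla's actual code:

```python
# Sketch of an obstacle/pedal-misapplication interlock: if the car is (near)
# stationary and a confirmed solid obstacle sits a couple of meters ahead,
# cap the torque request instead of honoring a floored accelerator.
from typing import Optional

def limited_torque_request(pedal_pct: float, speed_mps: float,
                           obstacle_dist_m: Optional[float]) -> float:
    """Return the torque request (0-100%) after the obstacle interlock."""
    if (obstacle_dist_m is not None
            and obstacle_dist_m < 3.0      # confirmed object just ahead
            and speed_mps < 2.0            # starting from (near) standstill
            and pedal_pct > 50.0):         # pedal floored; likely misapplied
        return 20.0                        # enough to creep, not to launch
    return pedal_pct

print(limited_torque_request(100.0, 0.0, 1.5))   # -> 20.0 (storefront survives)
print(limited_torque_request(100.0, 0.0, None))  # -> 100.0 (open road, full power)
```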
 
Will FSD be able to incorporate such learnings and experience? I hope so, but some of these seem to be pretty tricky problems.
What keeps me up at night is that the team that allows the AP failures discussed here is also the team that will have to program what to do when a lone ball rolls onto the street from between a row of cars. Will machine learning figure it out? After how many kids killed by RoboTaxis? Imagine being the passenger when your RoboTaxi has just ripped a child to pieces, and having to explain it to the kids it was playing with and to the grown-ups and police that follow.
I see the errors in judgment and worry that my loved ones and I are already sharing the roads with Tesla drivers who, bit by bit, lose their interest in traffic, as the car always sounds a signal or throws in some steering correction. They've seen it do so for hundreds of kilometers already, right? Many Tesla drivers are outstanding drivers and humans, but EAP and Navigate on Autopilot simply mess with some people's sanity. We are discussing their deaths here and get to be thankful they died alone: that they didn't ram a van full of kids but a fire truck, not a van with kids but a semi.
 
What keeps me up at night is that the team that allows the AP failures discussed here is also the team that will have to program what to do when a lone ball rolls onto the street from between a row of cars. Will machine learning figure it out? After how many kids killed by RoboTaxis? Imagine being the passenger when your RoboTaxi has just ripped a child to pieces, and having to explain it to the kids it was playing with and to the grown-ups and police that follow.
I see the errors in judgment and worry that my loved ones and I are already sharing the roads with Tesla drivers who, bit by bit, lose their interest in traffic, as the car always sounds a signal or throws in some steering correction. They've seen it do so for hundreds of kilometers already, right? Many Tesla drivers are outstanding drivers and humans, but EAP and Navigate on Autopilot simply mess with some people's sanity. We are discussing their deaths here and get to be thankful they died alone: that they didn't ram a van full of kids but a fire truck, not a van with kids but a semi.

Which doesn't cover the case of the child running out between parked cars to go to their home across the street.

If you want to be worried, be worried about the rest of the cars on the road, with no AP attempting to cover for the drivers who caused 40,000 deaths and 4.5 million injuries last year in the US.
 
Which doesn't cover the case of the child running out between parked cars to go to their home across the street.

If you want to be worried, be worried about the rest of the cars on the road, with no AP attempting to cover for the drivers who caused 40,000 deaths and 4.5 million injuries last year in the US.
I was that kid. I got hit chasing a ball. In regular traffic I always checked left-right-left.
The kid running out without a prophetic ball to chase, that's a sad way to end up in a wheelchair or coffin.
Chasing a ball, that's criminal negligence. And nothing EAP has accomplished gives me any grain of hope that someone with common sense is at the helm teaching the software common sense.
A guy dies hitting a lane divider; the next day the same accident would still happen in the same way had the driver not been explicitly aware of this exact situation from the news. Not from the car sending an extra warning or shutting off for safety reasons. Nope, Tesla is happy to throw around their intelligence-insulting safety statistics and hide behind a single disclaimer.
There's a whole case on whether Boeing trained pilots sufficiently to understand the MAX upgrades. What does Tesla do? Go drive and notice the improvements!
 
AP is like guns: a human invention with widely known good uses, but one that occasionally kills people unintentionally.

I can almost get behind this analogy (not that that is of any importance), since for a gun to kill anyone it must:
  1. Be loaded
  2. Be pointed
  3. Have the trigger pulled
Three steps needed to launch the bullet.

For AP to kill someone it must:
  1. Be enabled
  2. Be put in a situation it cannot handle
  3. Be allowed to do whatever it wants without driver intervention

Whereas a normal car only needs number 3 to occur (all situations meet the criteria of #2).
 
I can almost get behind this analogy (not that that is of any importance), since for a gun to kill anyone it must:
  1. Be loaded
  2. Be pointed
  3. Have the trigger pulled
Three steps needed to launch the bullet.

For AP to kill someone it must:
  1. Be enabled
  2. Be put in a situation it cannot handle
  3. Be allowed to do whatever it wants without driver intervention

Whereas a normal car only needs number 3 to occur (all situations meet the criteria of #2).
Yes, with AP we have drivers not minding traffic because they feel AP has got this.
People have a good sense of survival instinct. This is how bad drivers get to die of old age.
AP just does what it knows. It doesn't sense the atmosphere of traffic. It doesn't know a semi or a fire truck from thin air. And Tesla does as little as they can get away with to make drivers formally aware that, in theory, maybe they are sometimes needed to pay attention when AP is engaged.

Keep the apologetic posts coming; they're going to love them on Twitter.
 
Another post with 5 disagrees. Forum members holding Tesla to lower standards than other brands set for themselves. Fascinating ethics among this demographic of 1%ers.
Umm, no. The low standard is that of human drivers. Sorry dude, but we as a whole are pretty terrible.

Nothing is ever going to be perfect. That's reality. Any automaker waiting for a perfected system will never release the system. That is pretty much why I give Tesla a pass on this stuff. I get to enjoy cool features while Tesla improves the system. If you have a problem with the situation, then don't use it.

And stop being so sensitive about the dislikes. Another fact of life, there will always be people who disagree with whatever position you take.
 
I was that kid. I got hit chasing a ball. In regular traffic I always checked left-right-left.
I'm sorry to hear about your accident. Can I assume you were not hit by a Tesla on AP?

A guy dies hitting a lane divider; the next day the same accident would still happen in the same way had the driver not been explicitly aware of this exact situation from the news.

That guy had complained about that exact junction multiple times and yet still crashed into it. If that is not proof that explicit warnings are not effective, I don't know what is.
Regardless, under the operating rules of AP, it should not have occurred.

There's a whole case on whether Boeing trained pilots sufficiently to understand the MAX upgrades. What does Tesla do? Go drive and notice the improvements!

AP is overridden by any driver input, be that steering, braking, or accelerating. The Boeing anti-stall function was not.
 