Autonomous Car Progress

Yes, at some point more sensors become overkill. But companies are not just piling on more and more sensors; that is a strawman. Companies look for the sensor package that they think will provide the right coverage and redundancy for safety while also being cost effective. Obviously some companies think they can do that with 20 sensors; others might think they need 30. But nobody is adding more sensors just because more is better. They add sensors that they think will serve a practical safety purpose.
Sorry, I think that is naive. Companies want to sell cars .. if they think the public will prefer a car with 20 sensors instead of 10, they will shove them on, assuming the cost is not too crazy .. or else add the extra 10 to a “premium driving” package for extra $$$. I remember helping a friend pull apart a Chinese car (long story) to fix the “premium” audio system that had 6 extra speakers over the regular one. The speakers were there all right .. but not connected to anything!!
 
Sorry, I think that is naive. Companies want to sell cars .. if they think the public will prefer a car with 20 sensors instead of 10, they will shove them on, assuming the cost is not too crazy .. or else add the extra 10 to a “premium driving” package for extra $$$. I remember helping a friend pull apart a Chinese car (long story) to fix the “premium” audio system that had 6 extra speakers over the regular one. The speakers were there all right .. but not connected to anything!!
Initial technology developments tend to be over-designed because you first need to make it work. After the early products are fielded, you can start eliminating and optimizing things to reduce complexity and cost.

You can't improve something until you actually have something.
 
Sorry, I think that is naive. Companies want to sell cars .. if they think the public will prefer a car with 20 sensors instead of 10, they will shove them on, assuming the cost is not too crazy .. or else add the extra 10 to a “premium driving” package for extra $$$. I remember helping a friend pull apart a Chinese car (long story) to fix the “premium” audio system that had 6 extra speakers over the regular one. The speakers were there all right .. but not connected to anything!!

Cost is a factor though. Even if automakers did think that the public would be attracted to a car with 20 sensors instead of 10, they are not going to add 20 sensors if it destroys their profit margin. And profit margins are historically pretty thin on consumer cars. There is very little room for excess cost. And speakers are cheap. Automakers can afford to add a few extra speakers that are not connected and charge money for them. Radar and lidar are more expensive than speakers. And yes, automakers do charge money for extra driving packages but that is with existing sensors. We don't see any automaker adding extra sensors that they don't use.

Here are a few examples of sensor suites:

Hozon Neta S
6 lidars
5 mm-wave radars
12 ultrasonics
13 high-res cameras

SAIC Feifan R7
1 front lidar
6 long-range mm-wave radars
2 4D radars
12 8MP cameras
12 ultrasonics

Arcfox Alpha S
3 lidars
6 mm-wave radars
12 ultrasonics
13 cameras

Lucid Air
13 cameras
5 radars
1 front lidar
12 ultrasonics

Mobileye Drive
13 cameras
3 long-range lidars
6 short-range lidars
6 radars

Obviously, these sensor packages are subject to change, but they give us a general idea of what to expect. Most have 12-13 cameras and 12 ultrasonic sensors. The L2 consumer cars usually have just one front lidar to keep cost down, while L4 systems like Mobileye Drive carry more lidar. And all of my examples have fewer than 10 radars.
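If it helps, here is a rough tally in Python of the suites above; the counts are just copied from my list (with long-range/short-range/4D variants of the same sensor type summed, and ultrasonics entered as 0 where none are listed), so treat them as approximate:

```python
# Counts copied from the list above; radar and lidar sub-types are summed,
# and Mobileye Drive's ultrasonics are entered as 0 because none are listed.
suites = {
    "Hozon Neta S":   {"lidar": 6, "radar": 5, "ultrasonic": 12, "camera": 13},
    "SAIC Feifan R7": {"lidar": 1, "radar": 8, "ultrasonic": 12, "camera": 12},
    "Arcfox Alpha S": {"lidar": 3, "radar": 6, "ultrasonic": 12, "camera": 13},
    "Lucid Air":      {"lidar": 1, "radar": 5, "ultrasonic": 12, "camera": 13},
    "Mobileye Drive": {"lidar": 9, "radar": 6, "ultrasonic": 0,  "camera": 13},
}
for name, s in suites.items():
    print(f"{name}: {sum(s.values())} sensors ({s['lidar']} lidar, {s['radar']} radar)")
```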

Like I said in my previous post, there are of course some differences between the sensor suites, especially in the type of lidar, camera or radar, but for the most part they keep the sensor count to the minimum they think can still give them good 360-degree coverage while staying cost effective. We don't see anyone adding more sensors just to brag to consumers that their car is "better".
 
A thought on why Tesla may be having difficulty resolving the remaining issues with FSD Beta. Tesla has a large number of vehicles on the road that were sold as FSD (robotaxi) capable vehicles. In some instances the addition of extra sensors/cameras might simplify or expedite the solution to a problem; however, it would come at considerable expense to retrofit the existing fleet. I believe this may be forcing Tesla to spend extra time and effort resolving the issues via software in order to mitigate the cost of a hardware retrofit. That said, I believe Tesla has done an amazing job on FSD. I think one of their greatest challenges is trying to deliver on overly aggressive commitments.
 
A thought on why Tesla may be having difficulty resolving the remaining issues with FSD Beta. Tesla has a large number of vehicles on the road that were sold as FSD (robotaxi) capable vehicles. In some instances the addition of extra sensors/cameras might simplify or expedite the solution to a problem; however, it would come at considerable expense to retrofit the existing fleet. I believe this may be forcing Tesla to spend extra time and effort resolving the issues via software in order to mitigate the cost of a hardware retrofit. That said, I believe Tesla has done an amazing job on FSD. I think one of their greatest challenges is trying to deliver on overly aggressive commitments.

That's part of it. But IMO, the biggest problem is that I don't think Elon fully understood what it takes to actually "solve FSD". Safe and reliable autonomous driving requires solving three huge problems: perception, prediction and planning. 1) The car needs to see the world accurately and completely (perception). 2) The car needs to predict what relevant road agents will do (prediction). 3) The car needs to plan, in real time, a path that follows the rules of the road, stays on the route to your destination and avoids hitting other objects (planning). Elon just saw it as a vision problem and assumed that with enough machine learning and data it should in theory be solvable, so let's just do that.
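To make that dependency concrete, here is a toy Python sketch of the perception -> prediction -> planning chain. All of the names and the simplistic constant-velocity / stay-in-lane logic are just my illustration, not anything from Tesla's actual stack:

```python
from dataclasses import dataclass

@dataclass
class Track:
    """One object the perception stage has detected and is tracking."""
    track_id: int
    x: float   # meters ahead of the ego car
    y: float   # lateral offset from our lane center, meters
    vx: float  # longitudinal velocity relative to ego, m/s
    vy: float  # lateral velocity, m/s

def predict(tracks: list[Track], horizon: float = 3.0, dt: float = 0.5) -> dict[int, list[tuple]]:
    """Toy prediction: extrapolate every track at constant velocity."""
    steps = int(horizon / dt)
    return {
        t.track_id: [(t.x + t.vx * k * dt, t.y + t.vy * k * dt) for k in range(1, steps + 1)]
        for t in tracks
    }

def plan(predictions: dict[int, list[tuple]], lane_half_width: float = 1.75) -> str:
    """Toy planning: brake if any predicted position lands in our lane ahead of us."""
    for points in predictions.values():
        if any(x > 0 and abs(y) < lane_half_width for x, y in points):
            return "brake"
    return "proceed"

# The chain is the point: an object perception never detects can't be predicted,
# and therefore can't be planned around.
tracks = [Track(track_id=1, x=20.0, y=0.5, vx=-4.0, vy=0.0)]  # car closing in within our lane
print(plan(predict(tracks)))  # -> "brake"
```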

Now, considering what Tesla has to work with, I think Tesla has done an amazing job on the perception part. But that still leaves a lot of work to be done on prediction and planning, and both depend on perception: any weakness in Tesla's vision will carry over into the prediction and planning parts. So at this point, Tesla is basically stuck trying to solve all three parts with just the 8 low-res cameras and hoping that with enough ML training they can get FSD to be "good enough".
 
Now, considering what Tesla has to work with, I think Tesla has done an amazing job on the perception part. But that still leaves a lot of work to be done on prediction and planning, and both depend on perception: any weakness in Tesla's vision will carry over into the prediction and planning parts. So at this point, Tesla is basically stuck trying to solve all three parts with just the 8 low-res cameras and hoping that with enough ML training they can get FSD to be "good enough".
+1 ^^^

No doubt that Tesla had the best guy in the business for perception (vision), and yes, what Tesla has done with perception given the limited sensor (camera) input may be considered an "amazing job." But the pressure from Elon (who had zero background in signal processing or AI) to stick with limited sensors (first just a forward-facing camera and radar, then 8 cameras and radar, then removing the radar because of supply chain issues, etc.) has hamstrung their efforts from the beginning, and may have doomed FSD from the start. I imagine Tesla knows this, and the Semi and Robotaxi will have a lot more sensors - maybe even LIDAR. But what Elon says on Twitter, IMO, is not driven by engineering acumen or intellectual honesty. It's driven by his desire to save his/Tesla's reputation and keep the stock price high.
 
+1 ^^^

No doubt that Tesla had the best guy in the business for perception (vision), and yes, what Tesla has done with perception given the limited sensor (camera) input may be considered an "amazing job." But the pressure from Elon (who had zero background in signal processing or AI) to stick with limited sensors (first just a forward-facing camera and radar, then 8 cameras and radar, then removing the radar because of supply chain issues, etc.) has hamstrung their efforts from the beginning, and may have doomed FSD from the start. I imagine Tesla knows this, and the Semi and Robotaxi will have a lot more sensors - maybe even LIDAR. But what Elon says on Twitter, IMO, is not driven by engineering acumen or intellectual honesty. It's driven by his desire to save his/Tesla's reputation and keep the stock price high.
The big concern for me here is that there is a very large fleet of vehicles (version 3 hardware) that were sold as being FSD/robotaxi capable, and thousands of FSD purchases that promised the same, which will need hardware (sensor) upgrades to attain that goal. This could result in a huge financial hit for Tesla one way or the other.
 
This is not something that happened to any appreciable degree - this problem was solved in 2019.

(No need to discuss further here - plenty of other threads.)
Just a reference to get people up to speed on the changes made since the initial 2016 launch:

I will say it would be fairly awkward if pre-2019 cars get a free hardware boost and post-2019 cars are left out, so I really doubt Tesla wants to do that. This is especially true given that the older buyers generally paid the least for FSD.
 
Here is a short video that shows the new Baidu robotaxi as well as gives some info on Baidu's plans for the future. The video mentions that Baidu's robotaxis are limited to areas with less traffic to reduce risk. And at the 2:10 mark, we see a left turn that the Baidu robotaxi does not handle well, IMO.


OMG that's awful and dangerous unless I'm misunderstanding the traffic rules. Did the oncoming traffic get a red?

Edit: OK, I checked further and it looks like it's still in the turning lane when it stopped. So I guess that's okay under Chinese law?
You can see the same thing here. Notice how everyone is queuing up across the turn lane.
 
OMG that's awful and dangerous unless I'm misunderstanding the traffic rules. Did the oncoming traffic get a red?

It is hard to tell. It seems the Baidu car did have a green light. The oncoming traffic does stop, but maybe it was just stopping for the Baidu car in the middle of the lane.

It makes me doubt Baidu's aggressive plans to expand so quickly (10 cities by next year, 100 cities after that) if they still have safety issues like this. Perhaps they are hoping to have these issues solved by then. And I am sure their ride-hailing areas will be properly geofenced to hopefully mitigate these issues. But still. We could see a situation where Baidu scales aggressively at first but then has some major safety problems.
 
OMG that's awful and dangerous unless I'm misunderstanding the traffic rules. Did the oncoming traffic get a red?

Edit: OK, I checked further and it looks like it's still in the turning lane when it stopped. So I guess that's okay under Chinese law?
You can see the same thing here. Notice how everyone is queuing up across the turn lane.
I noticed that too in the video, but chalked it up to perhaps traffic customs in China (maybe a semi-protected left?).

If it was an unprotected left and the Baidu car was supposed to yield but just stopped in the middle, that seems very much like the Cruise accident.
 
It is hard to tell. It seems the Baidu car did have a green light. The oncoming traffic does stop, but maybe it was just stopping for the Baidu car in the middle of the lane.

It makes me doubt Baidu's aggressive plans to expand so quickly (10 cities by next year, 100 cities after that) if they still have safety issues like this. Perhaps they are hoping to have these issues solved by then. And I am sure their ride-hailing areas will be properly geofenced to hopefully mitigate these issues. But still. We could see a situation where Baidu scales aggressively at first but then has some major safety problems.
The way China operates is that city governments have a lot of autonomy, and enforcement of laws and care for safety can be quite lax. If it makes the city government look good, they can easily sweep safety incidents under the rug (especially given how controlled the media is there and how media/public access to any incident reports is essentially zero).

Things can go almost unregulated for quite a long period before the government steps in. That happened with bike sharing, where large numbers of bikes were left on the street until eventually it got so bad they had to round them up, leading to the stunning pictures of them dumped in lots. When companies in the US tried to emulate that model with bikes and scooters (dumping a bunch of them on city streets), they got promptly stopped by city governments here.

Another factor is how accidents are handled in China: almost all of them are settled on the spot or at the police station with the police acting as mediator.
A Crash Course in Handling Traffic Accidents in China
It isn't like in the US, where even if you are a nobody you may get sued by a personal injury lawyer for your entire worth, and if you are a company you are even worse off and may be sued for millions. There also isn't "guanxi" in the US (of which a large company like Baidu would have plenty vs. a random person off the street), where you can pull strings to make the problem go away (this term is actually mentioned in the above article).

Basically, if it doesn't get to the point of a death (in which case there might be criminal proceedings), the risk to Baidu from any incident is relatively low compared with other countries.
 
Obviously, these sensor packages are subject to change, but they give us a general idea of what to expect. Most have 12-13 cameras and 12 ultrasonic sensors. The L2 consumer cars usually have just one front lidar to keep cost down, while L4 systems like Mobileye Drive carry more lidar. And all of my examples have fewer than 10 radars.
But look at the variation in each type of sensor .. they are all trying to do the same thing, so why such huge variation? Basically, this tells me that they are all more or less guessing.
 
But look at the variation in each type of sensor .. they are all trying to do the same thing, so why such huge variation? Basically, this tells me that they are all more or less guessing.

More like an educated "guess". Engineers build systems that they think will meet their requirements. There is always going to be some uncertainty but the designs are based on engineering principles.
 
The big concern for me here is that there is a very large fleet of vehicles (version 3 hardware) that were sold as being FSD/robotaxi capable, and thousands of FSD purchases that promised the same, which will need hardware (sensor) upgrades to attain that goal. This could result in a huge financial hit for Tesla one way or the other.
Well I should clarify that by "doomed FSD" I mean kept it from achieving L4 or L5 - anything more than a limited L3 on highways, perhaps. That's not to say that they won't release Autosteer on City Streets as an L2 ADAS feature and call it done while never admitting that they formally promised anything more, thus avoiding at least one form of the "huge financial hit" to which you refer. May not save them from stock losses, though.
 
That article is from 2019. It notes that adopting a BEV (bird's-eye view) representation dramatically increased accuracy.
Tesla incorporated that finding in 2020 with a switch to a "Bird's eye view network".

They had further improved on this by AI Day 2021, using transformers to do the image-to-BEV conversion:
Tesla’s arc of AI progress

As mentioned by others upthread, I think perception-wise Tesla has already done a really good job. However, they have barely scratched the surface on planning (I believe it is still largely hand-coded). I guess we'll find out more when AI Day comes around this year.
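For anyone curious what "using transformers to do the image-to-BEV conversion" roughly looks like, here is a minimal Python sketch of the general idea: a learned query for each BEV grid cell cross-attends to flattened multi-camera features. The class name, shapes, and single attention layer are my own assumptions for illustration; Tesla's real network is much bigger and hasn't been published as code:

```python
import torch
import torch.nn as nn

class ImageToBEV(nn.Module):
    """Toy transformer-style image-to-BEV fusion: one learned query per BEV
    grid cell cross-attends to image features from all cameras at once."""

    def __init__(self, embed_dim: int = 256, num_heads: int = 8,
                 bev_h: int = 50, bev_w: int = 50):
        super().__init__()
        self.bev_h, self.bev_w = bev_h, bev_w
        # One learned query vector per cell of the output bird's-eye-view grid.
        self.bev_queries = nn.Parameter(torch.randn(bev_h * bev_w, embed_dim))
        self.cross_attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)

    def forward(self, cam_tokens: torch.Tensor) -> torch.Tensor:
        # cam_tokens: (batch, num_cameras * tokens_per_camera, embed_dim),
        # i.e. per-camera backbone features flattened into one token sequence.
        b = cam_tokens.shape[0]
        queries = self.bev_queries.unsqueeze(0).expand(b, -1, -1)
        bev_tokens, _ = self.cross_attn(queries, cam_tokens, cam_tokens)
        # Fold the tokens back into a 2D BEV feature map for downstream heads
        # (objects, lanes, drivable space, etc.).
        return bev_tokens.transpose(1, 2).reshape(b, -1, self.bev_h, self.bev_w)

# Example: 8 cameras, 100 feature tokens per camera.
bev = ImageToBEV()(torch.randn(1, 8 * 100, 256))
print(bev.shape)  # torch.Size([1, 256, 50, 50])
```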
 