To the posters who wrote that they expect a "free upgrade" to HW4: do you expect a "free upgrade" to your A# chip every time Apple releases a new one? I didn't think so.
This analogy makes no sense.
Apple does not sell a software package with features they have not yet delivered and that the hardware they sold it on is not capable of delivering.
Tesla supported two 100% different streams and software stacks when they dumped Mobileye and went in-house to control this mission-critical tech.
No, they did not.
Mobileye, not Tesla, owned and maintained and improved most of the AP1 code.
Mobileye never gave Tesla access to most of it- which is why, after the breakup, AP1 saw no new features and only minor tweaks to the few areas Tesla had any control over, and even then not for terribly long.
Tesla didn't continue to do major SW development on AP1 because they never were doing it in the first place.
Tesla was still doing updates on the media computer side, but all cars had the same media computer at the time, so that was still just one stream.
They already have two entirely different neural network software streams.
Again, no they do not. Not sure where you're getting that idea.
Otherwise, what are they doing with the data coming from the LIDAR sensors?
They either have them connected to a second HW3 or HW4 board in the test mule running a modified NN (Neural Network).
Again- you are just making things up.
The LIDAR is just reporting ranging distance to things.
That's the primary use of LIDAR in cars.
Then they compare those distances to what vision thinks the distance to the objects are.
They don't need to USE the LIDAR data for anything else, and certainly don't need to write NNs to do anything with the data.
Why would they? Software to interpret LIDAR point cloud readings is off the shelf stuff and has been for years.
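The validation use described above needs nothing more than a per-object comparison. A minimal sketch in Python (the function names, numbers, and the 5% tolerance are all made up for illustration- this is not Tesla's actual pipeline):

```python
# Hypothetical sketch: treat LIDAR ranges as ground truth and score the
# vision stack's depth estimates against them. No NN work on the LIDAR
# side is needed- it only supplies reference distances.

def depth_errors(vision_depths, lidar_ranges):
    """Per-object relative error between the vision-estimated depth (m)
    and the LIDAR-reported range (m) for the same tracked objects."""
    return [abs(v - l) / l for v, l in zip(vision_depths, lidar_ranges)]

def flag_outliers(vision_depths, lidar_ranges, tolerance=0.05):
    """Indices of objects where vision disagrees with LIDAR by more than
    the tolerance (5% here- an arbitrary illustrative value)."""
    errs = depth_errors(vision_depths, lidar_ranges)
    return [i for i, e in enumerate(errs) if e > tolerance]

# Example: three tracked objects; vision is off by ~12% on the second.
vision = [24.1, 52.0, 80.3]
lidar = [24.0, 46.5, 80.0]
print(flag_outliers(vision, lidar))  # -> [1]
```

The flagged frames are the interesting ones- they tell you where the vision depth network still needs work, which is the whole point of carrying LIDAR on a test mule.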
More likely, HW4 supports additional inputs and has the processing power so the LIDAR is connected to the same HW4 board as the optical sensors are.
That's not likely at all, since both the CEO and the head of Tesla AI development have said, as recently as 1-2 months ago, that they're all-in on vision ONLY.
Human eye corneas do not have rain driving at them at 60+ mph like the glass that covers the video input sensors.
... what?
The glass on the front cameras is... the windshield.
Which absolutely covers your eyes as far as seeing the road.
In both cases windshield wipers help out.
We have fast windshield wipers to give us a slower frame rate than normal. Our brain fills-in the missing frames and decides what to do downstream a different part of our brains.
Again... what?
All 3 forward cameras, for example, are behind the same windshield and wipers as your eyes are.
Everyone is speculating at this point.
I mean, mainly you. And fairly wildly.
"Vision-only" just means no input from the frontal radar collision-avoidance sensor
No, it does not.
The front radar was not a "collision avoidance sensor"; it provided ranging distance to everything in front, including for use in path planning and speed control. Now vision is doing that.
Serious question: have you actually done any research into:
What Tesla has actually said about this stuff? (including hour+ long presentations from Andrej Karpathy on their entire design philosophy, contradicting your speculation)
What Tesla is actually running today on the driving computers? (greentheonly on Tesla has a lot on this for example).
Because it does not read like you have.
Agree. LIDAR will not be used in "production" consumer vehicles where the liability for FSD is on the driver and not Tesla.
Except consumer vehicles will be part of the Tesla Network-- the terms of FSD have explicitly cited this- for years.
As already pointed out to you.
And if what you claim were true, why is HW4 going into consumer vehicles on Aug 19, as you claim it is?
For Tesla Network, obviously, the liability has to be on Tesla. If adding LIDAR gives those vehicles a safety advantage (compare vision with LIDAR downstream for a lower error rate), then I believe Tesla will use it.
Your belief is not Tesla's belief.
Tesla's belief, as stated explicitly, repeatedly, and recently, is that they can achieve all the safety required with vision only on the current sensor package (i.e. without even radar).
We can be fanboys, but we don't have to drink the Kool-Aid and believe every word that comes out of Elon's mouth
It doesn't sound like you believe any of 'em.
Nor the guy actually heading the AI work, who also says vision only is the path forward, and who has given long presentations including on why LIDAR adds no value if you get vision right.
This is irrelevant to LIDAR possibly being used in future
It's relevant to your argument that Tesla will do it because it's what "everyone else" is doing.
Tesla's success has come from not doing what everyone else is doing- and from doing something a lot smarter, better, and usually with less cost and fewer parts.
Just like using vision only for self driving.
Tesla Network Tesla-Owned vehicles, as is the next point about "tons of parts", and firmware updates.
You realize the Tesla-owned ones will be existing Model 3s coming off lease, right?
Tesla specifically did not include a "lessee can buy at the end" clause- because they said they wanted the cars back for use as robotaxis.
0 of them have LIDAR.
Besides solving vision, they need to solve mud splashes, water, etc. obscuring the glass covering the sensors, not to mention snow blizzards and fog.
Does mud not get on LIDAR sensors?
Of course it does.
"adding more cameras" would work just as well if your entire reasoning is "with more of them it's less likely mud will cover one in a given direction"
And be a lot cheaper.
And not require complex, unneeded sensor fusion to be created by Tesla's software team.
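The redundancy argument is just independent-failure math: if each camera covering a direction can be mud-occluded with some probability, the chance that every camera covering that direction is blocked at once shrinks exponentially with camera count. A toy sketch (the 1% occlusion probability is a made-up illustrative number, and real occlusion events are not fully independent):

```python
# Toy redundancy math (illustrative numbers, not real failure rates):
# if each camera covering a direction is independently occluded with
# probability p, the direction is fully blind only when ALL n cameras
# covering it are occluded at the same time: p ** n.

def blind_probability(p_occluded: float, n_cameras: int) -> float:
    """Probability that every camera covering a direction is occluded,
    assuming independent occlusion events."""
    return p_occluded ** n_cameras

# With a (made-up) 1% chance of any single camera being mud-covered:
for n in (1, 2, 3):
    print(n, blind_probability(0.01, n))
# 1 camera  -> roughly 1 in 100
# 2 cameras -> roughly 1 in 10,000
# 3 cameras -> roughly 1 in 1,000,000
```

Each extra overlapping camera buys orders of magnitude, which is why overlapping coverage is the cheap answer to the mud problem.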
From what I've seen of the V9.x beta videos so far, adding some variant of the B-pillar forward/side cameras to the lower front fenders would do a LOT more to help FSD out than LIDAR would.... (and be a ton easier to retrofit into existing cars too- which matters, since everyone who bought FSD before 3/19 was promised at least L4 for their purchase)
There's no real requirement to "drive in blizzards"- Even L5 explicitly states it's only needing to work in conditions a human can reasonably be expected to drive in. 8 eyes (cameras) will still beat 2 in those conditions.
I don't think Waymo and others are morons using LIDAR, do you?
Tesla thinks they are.
Waymo has been at this for 12 years now, with Google's money, and their entire "consumer facing" product is.... a very few L4 robotaxis... in one tiny suburb of AZ with perfect weather... geofenced to that area... and still needing remote human backup assist.
WHAT A FUTURE THEY HAVE SHOWN US!
Maybe a different approach will work better. Tesla thinks so.
Let's agree to disagree on some of these issues and watch and learn what Tesla is up to on August 19 from Elon and Andrei.
Fair enough... though it's a shame I can't buy some puts on your predictions