
Tesla, TSLA & the Investment World: the Perpetual Investors' Roundtable

FYI, this is a common theme as I explore Max Pain for future weeks. Lots of Call trading at the 200 mark, but not much sticking for the Open Interest.
Quite the dance for a single day.

[attachment: Max Pain / open interest chart]
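(For anyone newer to the concept: "Max Pain" is just the settlement price at which expiring options pay out the least in aggregate to option holders. A minimal sketch of the usual calculation, with made-up open-interest numbers rather than the actual TSLA chain:)

```python
# Max Pain: the expiration price that minimizes the total payout to option
# holders (calls + puts), weighted by open interest. Numbers are hypothetical.
call_oi = {180: 5000, 190: 8000, 200: 12000, 210: 6000}  # strike -> open interest
put_oi  = {170: 7000, 180: 9000, 190: 6000, 200: 4000}

def total_payout(settle):
    calls = sum(oi * max(0, settle - k) for k, oi in call_oi.items())
    puts  = sum(oi * max(0, k - settle) for k, oi in put_oi.items())
    return (calls + puts) * 100  # 100 shares per contract

strikes = sorted(set(call_oi) | set(put_oi))
print(min(strikes, key=total_payout))  # 190 with these made-up numbers
```

High volume at 200 that never sticks in the next day's open interest barely moves this calculation, which is why I watch OI rather than volume.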
 
I agree. The best way to handle this type of turn is to not do it. Go right to a stop light and make a left or U-turn, or otherwise replan your trip. For years UPS has planned left turns out of its routes as much as possible. It would be entirely reasonable for FSD to handle this very dangerous left-turn situation by rerouting. It takes a little longer but is worth it.

Sometimes I'm so annoyed at the routes chosen that I have to consciously hold back vulgarities in my "disengagement notes"; I don't want my FSD to get turned off. It initially wows you, but then, like a person, once you get to know it a little better... Which leads me to...

I was driving down Page Mill Road outside the old Tesla HQ yesterday, and even requested Paint It Black by the Stones to amuse myself while I watched the car drive me around the same area. It's arguably better than the demo from several years ago, and yet I can't shake the feeling that the football is being spiked before the touchdown is scored, both in this forum and in general, and it gives me pause. Having watched the progress from Enhanced Autopilot in 2018 through FSDb over the last two years, with over 250,000 miles driven (not a typo), at least 1/3 and maybe even 1/2 of them with the car driving, my informed guess is that getting there is inevitable- in about another 6 years. Don't reach for that "disagree" button just yet, I'll give you even more reason in just a minute.

It also gives me pause that, amid a huge staff cut, probably the poorest-performing employee over the last two years has been the CEO, judged by the objective measures of company value and brand, and by the actions that have had a direct negative impact on them. Morale isn't something that only exists in armies; businesses and many other human organizations have it too. To paraphrase Elon: if Elon were one of Elon's employees, Elon would fire himself. How it will impact Tesla and $TSLA in the future remains to be seen. I'm clearly not as optimistic as I once was, but neither have I sold any shares over it yet.
 
With Baidu struggling to improve its own Robotaxi operations, there are obvious mutual interests.
It seems likely to me that Tesla Robotaxis update Baidu's maps, Baidu's Robotaxis benefit from those updates, and Tesla benefits from the existing maps and any updates done by Baidu.

Those who think a Chinese competitor can easily copy or reverse engineer the Tesla FSD solution haven't thought it through. Tesla FSD can be used for testing and validation of a competitor's system.

For both copying and testing it makes very little difference whether or not Tesla operates a Robotaxi fleet, or even supports FSD in China. The core FSD product is pretty much the same worldwide and it is in every vehicle.

What China clearly doesn't want is anything that allows geolocation: mostly the combination of video or pictures with map coordinates, though in some cases that may extend to video and pictures even without map data.

For driving, a Robotaxi mostly needs foreground information: the road surface, road signs, stop lights, other vehicles, pedestrians. The background landscape, urban or rural, makes little difference. It may be handy to know the distance to the nearest solid structure, but most of the time FSD doesn't need to critique the quality of the architecture.

So clearly there is some way of generating sufficient training data for China that keeps the Chinese government 100% happy.

The big benefits for the Chinese in allowing FSD are:
  • Accelerating the decline in fuel imports.
  • Increased productivity.
  • Fewer accidents, with their associated health and vehicle-repair costs and productivity impacts.
Aside from fewer accidents, the productivity benefits are:
  • Workers able to engage in other activities while commuting, or arriving less fatigued.
  • Drivers freed up for other, more productive tasks.
  • Some parking space freed up for more productive uses.

For the Chinese competitors, the main things working Tesla Robotaxis would provide are more encouragement and, probably, a greater ability to attract finance.

Some posters here seem to grasp any available straw to turn likely good news into bad news. Working Robotaxis in China will help with global adoption and will be profitable for as long as it lasts. In the long run, China probably doesn't want Tesla having a monopoly.

Chinese Robotaxis may find it hard to get approval or public acceptance in some parts of the world. Robotaxis will be cheap enough that, for many, price isn't a major consideration; safety, amenity, and trust in the brand will be the considerations.
 
Back when I bought TSLA for $30-$45/share in the 2012-13 timeframe, an equivalent jump in market cap to today's would have been a $386/share move (courtesy of the 5x and 3x splits). I've spent most of it on the photo in my profile, but there is a little left, and it's good to see it in perspective sometimes.
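(Checking the math: one 2012-13 share became 15 shares after the splits, so, assuming the day's move was roughly $25.75/share, the pre-split equivalent works out as below.)

```python
# One pre-split (2012-13) share became 15 shares after the 5:1 (2020)
# and 3:1 (2022) splits.
split_factor = 5 * 3
todays_move = 25.75  # assumption: approximate per-share gain on the day, in $
print(todays_move * split_factor)  # ~386 -- equivalent jump per original share
```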
 
FYI, this is a common theme as I explore Max Pain for future weeks. Lots of Call trading at the 200 mark, but not much sticking for the Open Interest.
Quite the dance for a single day.

[attachment: Max Pain / open interest chart]
I was under the impression volume is somewhat live but open interest doesn’t update until the following day. [they can’t make it too easy for us. 😆] That might explain the disparity you’re seeing.

I’m sleeping comfortably tonight, having sold 200Cs against all my shares for the week.

Sold early for sure but free money is free money. :)
 
One response from a Chinese owner, which I find interesting. Can anybody here comment on it?

Boon
@booncooll

As a Chinese Model Y owner, I'd say Baidu is not a good choice; Amap is more trustworthy. Many Chinese Tesla owners need to install a phone holder in their Tesla because they are not used to Baidu Maps, and use the Amap app on their phones instead. Amap is more accurate and easier to use than Baidu Maps. But whether to use Amap or Baidu Maps, Elon seems to have no choice.
Tesla using Baidu Maps should mean that all Tesla vehicles, including Robotaxis, improve Baidu Maps as they drive around.

Once integrated into Tesla navigation, using Baidu Maps will be seamless.

I expect the gap to the Amap app will quickly close.
 
This is the first time in my long investing life that I bought really cheap penny options the day before an ER and it went my way, with a 30% move in the underlying stock.

Gotta admit, this feels great. I can see why so many gamblers got addicted.

Do not copy, lol. What you don't hear about are the 1000s of similar contracts I've lost on.
 
Does it really perform worse, or do you just get used to the new release?
It does do worse in the beginning. For instance, bringing my daughter to school: entering the grounds, the car needs to turn right followed by an immediate left. The first couple of days it could not do it. Now it nails it every time.

Picking my daughter up from school, there is a four-lane street. At an intersection with traffic lights we need to turn right. Just before that right turn there is another road to the right. The first couple of times, my car moved to the right lane too soon with the indicator on, suggesting it wanted to turn right into that road rather than at the intersection. Now it nails it every time, moving to the right lane only after the other road.
 
That's a hard thing to justify, really. If we're being really optimistic and assume a 30% take rate for FSD and an average vehicle lifetime of 10 years, you get:

2m cars x $100/month x 12 x 10 x 0.3 = $7.2B/year in revenue. Triple the volume (a pipe dream with current factories and models) and you still get only ~$22B in revenue. Factor in a software gross margin of 80% and that's only ~$17B to the bottom line.
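(Spelled out, under the same assumptions: $100/month subscription, 10-year vehicle life, 30% take rate, and a steady-state fleet.)

```python
cars_per_year  = 2_000_000  # annual production
sub_per_month  = 100        # $ FSD subscription
lifetime_years = 10         # average vehicle life
take_rate      = 0.30       # optimistic adoption

# Steady-state fleet of subscribers, each paying monthly:
annual_revenue = cars_per_year * lifetime_years * take_rate * sub_per_month * 12
print(annual_revenue / 1e9)            # 7.2  ($B/year)
print(3 * annual_revenue / 1e9)        # 21.6 ($B/year) at triple the volume
print(3 * annual_revenue * 0.8 / 1e9)  # ~17.3 ($B/year) at 80% gross margin
```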

And yet Tesla was up $68bn today.
 
I have a quick question for the TA people.
Does today's rise mean there is a gap that is expected to fill?

TIA

edit for clarity

It looks like a weak island reversal pattern, as there was a gap down before.

And when I went into the details, there were a few trades at the previous day's close after the gap up. I'd consider that gap filled in a traditional sense. But knowing the market, it's probably some fat finger somewhere.

Island reversals usually don't fill the gap back, as they represent a complete change in a fundamental issue. That said, the pattern is pretty weak.
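(A rough sketch of what I mean, if you want to scan daily bars for the pattern yourself; the prices and the `gaps` helper are just illustrative.)

```python
# Crude island-reversal check on daily bars: a gap down into a range, then a
# gap up out of it, leaving the bars in between stranded as an "island".
bars = [  # (high, low), oldest first; hypothetical prices
    (195, 188),
    (183, 176),  # gap down: high 183 < prior low 188
    (185, 178),
    (182, 175),
    (194, 187),  # gap up: low 187 > prior high 182
]

def gaps(bars):
    found = []
    for i in range(1, len(bars)):
        hi, lo = bars[i]
        prev_hi, prev_lo = bars[i - 1]
        if hi < prev_lo:
            found.append((i, "down"))
        elif lo > prev_hi:
            found.append((i, "up"))
    return found

print(gaps(bars))  # [(1, 'down'), (4, 'up')] -- down gap then up gap = island
```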
 
Does it really perform worse, or do you just get used to the new release?
Interesting question. People have been batting this question back and forth for years on the FSD threads.

Here's an example: Near where I live, to get on US1 (yeah, that US1) one has to take an on-ramp. When getting on US1, one has maybe 100 yards (maybe less) to merge to the left to get into the slowest of three travel lanes on US1. If one misses that merge to the left and continues straight ahead on the on-ramp, one swiftly finds oneself on an off-ramp that takes one onto I-287 which will take one miles out of one's way.

I take this ramp onto US1 a lot. The first time with 12.3.3, and again on 12.3.4, I had to disengage, then cross various painted lines to get left onto US1; that is, the car didn't merge over fast enough, partly due to traffic, but mostly because it has a more genteel approach to shifting left, keeping an eye out for All That Traffic on three lanes of US1.

2nd, 3rd, and Nth merges: The car starts signalling earlier and gets over, although it's still slower than a human, and it kinda crosses white solid lines a bit with the right side of the car before getting over.

On 11.4.9 it would make it one out of three tries. On 12.x it's made it every time, except for the first try on each release.

It's not just me: Others have noted this behavior. Not necessarily at some on/off ramp or other, but at some "difficult" spot in the road, be it speed, stop signs, or just funny intersections.

Which brings us into the Debate. In no particular order, but possibly in terms of popularity, we have the following hypotheses:
  • It's The Map Data. People hypothesize that either at the start of a NAV-based trip, but possibly picked up dynamically so long as there's some kind of internet connection, Map Data, possibly crowd-sourced (based on interventions?) gets downloaded to the car, telling it to Watch It on certain intersections. This has been pointed out to explain poor initial behavior at stop signs, various kinds of merges, and strange intersections with multiple conflicting traffic signals. Right turn on red yes or no? Could be the Map Data.
  • It's The Hidden Transfer Data. With FSD one is supposed to have diagnostic data being sent to the Mother Ship; certain people with more expertise than I have monitored the amount of data going back (and somewhat, forth). Typical uploads (it's mainly uploads) are measured in 100's of MB; some have reported single GB numbers. Now, this is commonly supposed to be videos and such. But there's no particular reason that this might not have "correction" data, map or otherwise.
  • It's NN Learning Stuff. This is pretty much my horse, but I get yelled at all the time by people who say, "The binaries are signed with checksums and can't change, EVER!!!, until the next point release." Um. My CS chops say, "There's Always A Work Around.", and that work-around, if present, would presumably allow "error correction" to move various "fine control" sliders around, be it NN weights or other algorithmic stuff.
In the meantime, Official Tesla has said nothing about any of the above, or even hinted that Any Of The Above might be happening. Or not. And there's folks who disbelieve that any of the above are true or even possible.

But people have repetitively noted horrible behavior after a new release that has been cured by
  • Double-scroll-wheel reset
  • Powering down the car for a time
  • Waiting a day or whatever
  • Calibrating the cameras (extreme cases)
  • More recently, cleaning the glass under the cameras on the front windshield
Most of the above may be related to the common human method of tangling with obtuse systems by killing a chicken and studying the internal organs afterwards. Or just superstition. Or getting "Used to it", as you point out.

And, just to make things worse: whatever the heck is going on under the hood of a Tesla running FSD, it's absolutely clear that there's a definite probabilistic component to the car's behavior, because what seems to be different behavior in identical circumstances might just be minor deterministic differences in what the car drove through in the last ten seconds... or the last ten minutes... or the last ten days.

In other words, the car acts a bit like, say, a bunch of civilians standing on the goal line on a football field. Somebody yells, "Listen up! Everybody go to the other goal line!" Some people run, some people walk fast, some walk slow, but they all get there. Chaotic, probabilistic times for each civilian to the other goal line. FSD.. is a bit like that.
 
Which brings us into the Debate. In no particular order, but possibly in terms of popularity, we have the following hypotheses:
  • It's The Map Data. People hypothesize that either at the start of a NAV-based trip, but possibly picked up dynamically so long as there's some kind of internet connection, Map Data, possibly crowd-sourced (based on interventions?) gets downloaded to the car, telling it to Watch It on certain intersections. This has been pointed out to explain poor initial behavior at stop signs, various kinds of merges, and strange intersections with multiple conflicting traffic signals. Right turn on red yes or no? Could be the Map Data.


  • This isn't a hypothesis- we know this happens for a fact, @verygreen has posted some detailed info on this happening.


  • It's The Hidden Transfer Data. With FSD one is supposed to have diagnostic data being sent to the Mother Ship; certain people with more expertise than I have monitored the amount of data going back (and somewhat, forth). Typical uploads (it's mainly uploads) are measured in 100's of MB; some have reported single GB numbers. Now, this is commonly supposed to be videos and such. But there's no particular reason that this might not have "correction" data, map or otherwise.

There are campaigns, where Tesla will tell the fleet (or even narrow parts of it, like "only cars in this place" or "only Model X" or whatever) that on a given trigger it should upload the related data. For example, they need clips of "people who disengage at roundabouts," so a campaign is sent out for that; when it triggers in a given car, that data is marked for upload, and they use it for training better roundabout behavior.

I'm unsure how this is a separate item though, as what gets uploaded can only "help" if it comes back to the car as map-related data, so this is just a source for the stuff that happens in item 1.
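(Conceptually a campaign is just a fleet filter, an on-car trigger, and an upload action. A purely illustrative sketch; none of these field names are Tesla's actual schema.)

```python
# Illustrative shape of a "campaign": who participates, what fires it,
# and what gets uploaded. Field names are made up, not Tesla's.
campaign = {
    "fleet_filter": {"region": "US", "fsd_version": ">=12.3"},
    "trigger": {"type": "disengagement", "scene": "roundabout"},
    "action": {"upload": "clip", "seconds_before": 30, "seconds_after": 10},
}

def should_upload(event, campaign):
    # On-car: mark the clip for upload when the event matches the trigger.
    trig = campaign["trigger"]
    return all(event.get(k) == v for k, v in trig.items())

event = {"type": "disengagement", "scene": "roundabout", "speed_mph": 18}
print(should_upload(event, campaign))  # True -> clip queued for the mothership
```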


  • It's NN Learning Stuff. This is pretty much my horse, but I get yelled at all the time by people who say, "The binaries are signed with checksums and can't change, EVER!!!, until the next point release." Um. My CS chops say, "There's Always A Work Around.", and that work-around, if present, would presumably allow "error correction" to move various "fine control" sliders around, be it NN weights or other algorithmic stuff.

This has been directly debunked, to you, multiple times.

There's no way "around" firmware that gets CRC-checked on every boot and cannot change without an update of the entire blob, and nothing else in the system survives a reboot.

Greentheonly said:
they must update entire firmware at least on the autopilot. Can't change a single file - breaks dm-verity. Can't overlay and have it survive a reboot (no dev overlay hooks in prod firmwares)
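(To illustrate why there's no "work-around": the gist of that boot-time check, simplified here to one whole-file hash; dm-verity actually verifies per block via a Merkle tree.)

```python
import hashlib

def verify(blob: bytes, expected_sha256: str) -> bool:
    # Boot-time check, simplified: hash the firmware image and compare it
    # to the digest from the signed manifest.
    return hashlib.sha256(blob).hexdigest() == expected_sha256

firmware = b"...signed FSD firmware image..."
manifest_digest = hashlib.sha256(firmware).hexdigest()  # what got signed

print(verify(firmware, manifest_digest))              # True: unmodified, boots
print(verify(firmware + b" tweak", manifest_digest))  # False: any change is refused
```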




There is no on-car learning. Period, full stop.

By design.

NN learning on individual cars, as would be clear to someone with CS chops, would make it a nightmare to ever debug anything, let alone to integrate whatever each car learned with the next actual firmware update.



And, just to make things worse: whatever the heck is going on under the hood of a Tesla running FSD, it's absolutely clear that there's a definite probabilistic component to the car's behavior, because what seems to be different behavior in identical circumstances might just be minor deterministic differences in what the car drove through in the last ten seconds... or the last ten minutes... or the last ten days.


That's kind of the fundamental and inherent nature of NNs...
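(A toy illustration of that point: a perfectly deterministic "network" can still flip a borderline decision on an input difference no human would notice. This is a made-up toy, not anything from Tesla's stack.)

```python
import math

def decide(x):
    # Deterministic squashed activation with a hard decision threshold.
    score = math.tanh(3.0 * x - 1.5)
    return "merge" if score > 0.0 else "wait"

print(decide(0.500))  # wait  -- score sits exactly at the threshold
print(decide(0.501))  # merge -- an imperceptibly different input flips it
```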
 
Interesting question. People have been batting this question back and forth for years on the FSD threads. [...]
Thanks for your thoughts on FSD.

It would be great to get a response from @Discoducky. I see he noted that he had 12 days with no critical disengagements, and he is ex-Tesla FSD team.
 
I stopped watching his videos when I realized it is far faster to go to the light and turn left with all the other cars seen in Google Street View, instead of trying to be quicker but more dangerous, even as a human driver. It's a whopping 0.2 miles; during rush hour it's probably actually faster than waiting, and definitely safer.

[attachment: Google Street View of the intersection]
But you see, if a car cannot solve the unprotected left with 65 mph cross traffic over 6 lanes with a blocked view, it cannot solve robotaxi.
-said by no robotaxi engineer from Waymo, Cruise, or anyone else, for that matter.
 
But you see, if a car cannot solve the unprotected left with 65 mph cross traffic over 6 lanes with a blocked view, it cannot solve robotaxi.
-said by no robotaxi engineer from Waymo, Cruise, or anyone else, for that matter.

A quick hack for geofenced Robotaxi operation would be to mark certain turns as closed on the map, ensuring Robotaxis take alternative, safer routes.

FSD still needs to allow for the case where there is no safer alternative route.
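(That hack is just edge removal on the routing graph. A sketch using networkx, assuming it's available; the nodes and weights are made up to match the example above.)

```python
import networkx as nx

# Routing graph: nodes are intersections, edge weights are travel seconds.
G = nx.DiGraph()
G.add_edge("A", "B", weight=30)  # the risky unprotected left
G.add_edge("A", "C", weight=20)  # continue to the stoplight
G.add_edge("C", "B", weight=25)  # protected left back toward B

print(nx.shortest_path(G, "A", "B", weight="weight"))  # ['A', 'B'] -- risky left

# Geofence hack: mark the dangerous turn as closed, then re-route.
G.remove_edge("A", "B")
print(nx.shortest_path(G, "A", "B", weight="weight"))  # ['A', 'C', 'B']
```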