Not for true level 4 or 5 autonomy. Autopilot flakes out in too many common situations: some highway forks (it once drove me straight down the middle of one, toward the barrier, before I disengaged it), construction zones, pothole avoidance (maybe not as common in San Diego, but here in the northeast they're far too common), etc. NoA is aware of the traffic in its immediate vicinity -- one lane over, one or two cars ahead, and one car behind -- but seems oblivious to everything else. As a result, it suggests some quite stupid maneuvers, like merging into a lane that's closed off ahead, or one that's faster right there but slowing further on. It also fails to slow when approaching stopped traffic until the last moment, which is disconcerting at best. The car could probably drive through many of these situations, but not as well as a human, and some of these limitations could result in damage to the car or even injury to the occupants.
That said, NoA does reasonably well on most highway driving in terms of miles -- probably 90%, maybe a point or two more, which isn't numerically all that different from your estimate. The trouble is that it's just not good enough in the remaining conditions, and those are so varied, and so important from a safety perspective, that they'll take much longer to get right than the first 90% did. This is actually a common pattern in programming: you can reach 90% of the functionality fairly quickly, but the final 10% takes as long as, or longer than, everything before it. (Of course, Tesla's neural-net approach differs significantly from traditional programming, but I suspect a similar rule applies.)