(CleanTechnica) – If you’re a Tesla investor, fan, or owner, there has been so much incredible news in the past few days. I have a bunch of half-written articles that I need to finish in the coming days — about Tesla’s earnings, some follow-up analysis of Tesla’s battery news, another long-overdue follow-up to my Tesla warranty accounting articles, and an interview with Climeworks on its carbon dioxide removal system and the economics behind it.
But … I started writing on CleanTechnica because I wanted to dive into potential FUD surrounding the stock and determine for myself which reasons for investors to have FUD (“fear, uncertainty, and doubt”) about the company are legitimate — and share my findings with others for input. Last October, in one of my most-read articles, I admitted that I expected to find a lot more “there” there in the FUD arguments, and have been surprised that I haven’t.
I’m still open to finding issues, by the way, but I have come to believe that most of the FUD around Tesla stems from the fact that Tesla does not prioritize institutional investors. Those investors are poorly suited to a company like Tesla, which — as Elon mentioned on the last earnings call — operates more like a series of startups and not at all like a traditional automaker.
With that said, let’s get to the meat of this article — Tesla has started rolling out Full Self-Driving for a select few drivers to test. Tesla fans have been excited to see what the system can do.
But the stock price hasn’t moved, and the articles that have shown up in the media about it have been — unsurprisingly — sensationalist.
And I suppose this is what you’d expect. It’s a lot easier to get people to click on your article if you claim that Tesla is allowing “untrained consumers to validate beta-level software,” which is part of one of the subheadlines of one of those articles.
So, like I did when I discovered that Smart Summon was actually doing great back in October of last year, I wanted to dive into what we know about this Full Self-Driving (FSD) update now, and how we should all feel about it.
Tesla Smart Summon
Two Tesla Model 3 vehicles being smart summoned, with the car in the middle driven by a human.
It may seem counterintuitive to look at Smart Summon first, but it’s perhaps the most impressive FSD feature that every Tesla owner with the FSD package already has. Smart Summon is really impressive, but I think of it as more of a parlor trick than a truly useful feature. Yes, there have been a few examples where Smart Summon has been used to move the car out of deep puddles so that the owner could keep their shoes dry, but in my own day-to-day life, it’s hard to find places to use it.
I like seeing how the system learns, and I also like seeing how the updates change things, so I have tried to use it whenever I can, but Smart Summon isn’t great for a shopping trip where you need to load the car up with stuff.
I used to use it late at night at the local gym after I had worked out, mostly to see how it was updating (I could walk to the car quicker than the car could roll to me back then), but COVID convinced me to build a gym in the basement. I might have tried it at a restaurant, but I haven’t eaten at one of those since February either. I did try it once on a dirt road at a park, where I was amused as it very carefully made its way around a few larger gravel stones that I, as a driver, wouldn’t have noticed.
I do like to challenge Smart Summon when I can. It still can’t reliably make its way out of my driveway, which includes a sharp 90-degree turn down a wooded path.
You’re forgiven if you haven’t thought much about Smart Summon lately, as it’s been largely out of the news. And that’s the point — Smart Summon did exactly what I expected: it kept working as intended, and because it has worked pretty well, it isn’t big news anymore. The sensationalist headlines have died down, and the press has moved on.
At least, that’s what I thought, until I saw one of the Tesla FSD articles claiming that how well Smart Summon works is up for debate because a huge number of Tesla owners are “reporting bugs” in the system — referencing an article they wrote in September … of 2019.
That same article was the one that inspired my article in October 2019, which concluded that none of the three main claims against Smart Summon held up: one incident was circumstantial and caused by the other driver; in another, nothing actually happened beyond a close call, again due to the other driver; and in the third, a man took photos of “his car” in a driveway, claiming the car had Smart Summoned into the garage wall — with damage in a spot not consistent with a car driving forward as claimed.
This brought me to the conclusion that Smart Summon was doing well, and over time, I’ve become more convinced of this. If Smart Summon had run over a pedestrian, it would have been a big deal, and we all would have heard about it. If Smart Summon were regularly running into garages, it would have been a big deal, and we all would have heard about it. The fact that it hasn’t bodes really well.
I would like to encourage anyone who got the new Tesla FSD update to try Smart Summon, though, and film the results. I’m curious if the tentative nature of the system has been improved upon.
We also know that Autopilot has done a pretty solid job over the years. There have been a few exceptions, and we highlighted some of those in this article and earlier articles.
When I’ve shared information about my car online, I’ve regularly had people come in and claim that these Autopilot accidents prove the system is unsafe. My response is that the system operated exactly as designed, and the user opted to misuse it, creating those situations. Tesla makes it exceedingly clear that Autopilot needs to be overseen, and when combined with an attentive driver, I firmly believe it’s safer than a driver without it. But the system has limitations that are difficult for many to understand — the vehicle will sometimes plow into stationary objects ahead of it on the freeway, because the risk of that happening is lower than the risk that would be created if the vehicle slammed on the brakes for every overpass crossing the freeway.
The new FSD beta system that some owners are testing will be able to optically distinguish a plastic bag from an overturned commercial truck, but for now, the vast majority of Tesla owners have been trained on how to properly use Autopilot. Want proof? From the article above: we’ve had two major accidents on Autopilot this year. Last year, we also had two. In 2018, we had one.
That seems pretty flat, but you have to take into account that the number of Tesla vehicles on the road and using Autopilot has exploded since 2018. In 2019, Tesla sold more vehicles than in the previous two years combined. This year, even including the pandemic shutdowns, Tesla has already delivered about 85% of last year’s total. Just as relevant, Autopilot did not become standard until 2019, meaning the percentage of Tesla vehicles using Autopilot is even higher.
If anything, the data show that major Autopilot accidents are becoming rarer relative to the size of the fleet. It’s impossible to tell whether this is because most people are using the system properly, because the system has improved to account for more human error, or some combination of the two. But the numbers are positive.
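To make the fleet-size point concrete, here’s a rough back-of-envelope sketch. The accident counts come from this article; the cumulative fleet figures are my own rounded assumptions based on approximate Tesla delivery totals, not data from the article:

```python
# Back-of-envelope: major Autopilot accidents relative to fleet size.
# Accident counts are from the article; fleet sizes are rounded
# ASSUMPTIONS based on approximate cumulative Tesla deliveries.
accidents = {2018: 1, 2019: 2, 2020: 2}
fleet = {2018: 550_000, 2019: 900_000, 2020: 1_200_000}  # assumed cumulative deliveries

# Accidents per million vehicles on the road
rates = {year: accidents[year] / fleet[year] * 1_000_000 for year in accidents}

for year, rate in sorted(rates.items()):
    print(f"{year}: ~{rate:.1f} major accidents per million vehicles")
```

Even with the raw accident count doubling, the per-vehicle rate stays roughly flat across the three years — and since Autopilot only became standard equipment in 2019, the rate per active Autopilot user has likely fallen.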
Tesla Full Self-Driving
And that brings us to Full Self-Driving — and, honestly, we don’t know much of anything about it yet. Maybe 100 people got the update. Maybe 10,000, or 100,000. Maybe 20. It’s impossible to extrapolate anything about how it is doing so far without an idea of how much the system is actually being used.
I hope there are no major incidents, but if there are, I’m certain we’ll hear about them.
In the meantime, here are some videos and reflections from early beta users.
Regardless of what the bears or media say in clickbait headlines, Tesla isn’t a reckless company. It has made mistakes, but the goal is to learn from those mistakes as quickly as possible and continuously improve on safety. One of Elon Musk’s most famous quotes is:
“You should take the approach that you’re wrong. Your goal is to be less wrong.”
The ethos behind this quote pervades the company. A problem becomes an opportunity to change things and improve them. The engineers’ egos are secondary to the success of the project. You don’t get 13 improvements in 3 months to an already extremely innovative design without wanting to improve. It’s extremely unrealistic to believe Tesla took a shotgun approach to deciding who got the FSD beta. I think Tesla is serious when it says it wants to push this to the best drivers first. Not only would a major accident at this point be a huge problem for the company’s ambitions, but the best drivers will also train the neural net the best.
Tesla doesn’t want the neural net learning bad behaviors, so it only makes sense to start with the safest drivers. As the system grows and improves, more and more drivers will be included.
I don’t expect to get the FSD update until early next year at the soonest. I don’t think I’m a poor driver — quite the opposite, actually — but I think the rollout will be extremely slow.
I think the abilities the system learns will happen incredibly fast, though.
What do you think? Agree with me or disagree? Leave a comment below. I’d like to know what I’m missing!
*Disclaimer: I am a Tesla [NASDAQ:TSLA] shareholder who has purchased shares within the preceding 12 months. Research I do for articles, including this article, may compel me to increase or decrease stock positions. However, I will not do so within 48 hours after any article is published in which I discuss matters that I feel may materially affect stock price. I do not believe that my voice could or should influence stock price by itself, and I strongly caution anyone against using my work as your sole data point to choose to invest or divest in any company. My articles are my opinion, which was formulated using research based on publicly available data. However, my research or conclusions may be incorrect.