As both the editor of an AI newsletter and a Tesla owner, I’ve been closely following developments in autonomous driving technology. Naturally, I’m particularly drawn to discussions about Tesla’s Full Self-Driving (FSD) system.
I’ve written extensively about Tesla’s FSD, including my recent article, “Watch as My Tesla Drives Me Off the Road.” That particular incident took place on Election Day, during rush hour, on Highway 101, with FSD v13 engaged—the version Elon Musk claimed would “blow your mind.” Fortunately, Highway 101 is blessed with wide, generous shoulders, which is exactly where my Tesla decided to dive. Since I was fully alert, I immediately disengaged FSD and steered the car back onto the road. (And, thankfully, I survived without any parts of my body blown out of the car.)
The driver behind me, however, wasn’t reassured. They kept their distance—an impressive feat during rush hour—and likely thought I was under the influence of something. Maybe I was. Perhaps I was under the influence of believing that this software was truly “full self-driving” and could be trusted.
With that introduction, let’s dive into Electrek’s newly published article and the fresh data it brings to light about Tesla’s FSD.
The Electrek Article: A Critical Analysis
On January 13, 2025, Electrek published an article titled “Elon Musk Misrepresents Data That Shows Tesla Is Still Years Away from Unsupervised Self-Driving.” While the title might be a bit clunky, its message is clear—and, frankly, it could’ve been shortened to “Elon Musk Misrepresents Data” or “Tesla Is Still Years Away from Full Autonomy” without losing any impact.
Having written extensively about Tesla’s FSD on this Substack and on LinkedIn, I can confidently summarize the current state of the technology: it’s no Waymo.
The Electrek article raises important concerns about Tesla’s progress—or lack thereof—toward achieving truly autonomous driving. Let’s break it down:
The Pros of the Article:
It highlights the gap between Musk’s ambitious claims and Tesla’s actual FSD performance.
Crowdsourced data is used effectively to challenge Tesla’s assertions, offering an alternative perspective.
A timeline of Musk’s past predictions is included, providing helpful context for the current claims.
The Cons of the Article:
The article leans heavily on crowdsourced data, which, while valuable, is not as comprehensive as Tesla’s internal metrics.
Its tone might come across as biased against Tesla, potentially overshadowing legitimate progress.
By focusing so heavily on Musk’s statements, it risks underplaying the genuine technical strides Tesla’s engineers have made.
Elon Musk’s History of FSD Claims
Elon Musk has been the Optimist-in-Chief when it comes to Tesla’s FSD. His history of ambitious—and often missed—predictions is well-documented:
2016: Musk announced that all Tesla vehicles would have the hardware necessary for “full self-driving.”
2019: Musk predicted that Tesla would have over a million robotaxis on the road by mid-2020.
2020: Musk claimed FSD would be “feature complete” by the end of the year.
2023: Musk admitted his previous timelines were overly optimistic but promised full autonomy “later this year.”
This pattern of bold claims followed by delays has eroded trust among critics and even some longtime fans.
The Current State of FSD
As of January 2025, Tesla’s FSD system is still in beta, requiring constant driver supervision. The latest data reveals:
Over 360,000 participants in North America have access to the FSD Beta.
Tesla reports improvements in miles between driver interventions, but independent, crowdsourced data suggests these gains are modest. (A quick sketch of how that metric is computed follows after this list.)
Updates like FSD v13.2.4 have shown promise but continue to struggle in complex scenarios, such as construction zones and unprotected left turns.
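For readers who like to see the arithmetic, here is a minimal sketch of how a miles-between-interventions figure falls out of crowdsourced trip logs. The field names and sample numbers are hypothetical, not Tesla’s or any community tracker’s actual data.

```python
# A minimal sketch of computing "miles between interventions" from trip logs.
# The schema and the sample trips below are illustrative assumptions only.

trips = [
    {"miles": 42.0, "interventions": 1},
    {"miles": 130.5, "interventions": 0},
    {"miles": 18.2, "interventions": 2},
]

total_miles = sum(t["miles"] for t in trips)
total_interventions = sum(t["interventions"] for t in trips)

# Guard against division by zero when no interventions were logged.
miles_between_interventions = (
    total_miles / total_interventions if total_interventions else float("inf")
)

print(f"Miles between interventions: {miles_between_interventions:.1f}")
```

On this toy data, three interventions over about 190 miles works out to roughly 64 miles per intervention, which illustrates why a single headline number depends heavily on how much driving, and what kind of driving, sits behind it.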
Realistically, fully unsupervised driving remains years away. Experts predict:
2027-2028: Limited unsupervised operation in geofenced areas. (For instance, Tesla’s Cybercabs could operate within a tightly controlled environment like the Warner Bros. studio lot, since that area was pre-mapped and geofenced.)
2030-2032: Broader deployment across varied conditions.
However, these projections come with the caveat of Musk’s track record of missed deadlines.
Tesla’s Unique Approach to Autonomy
Tesla’s strategy for developing autonomous vehicles is both innovative and polarizing. Here’s how it stands out:
Vision-Based System: Tesla relies on cameras and neural networks, forgoing lidar and radar, which are standard for competitors like Waymo.
Fleet Learning: Tesla uses data from its extensive fleet to train and refine its AI.
Incremental Deployment: Regular software updates allow Tesla to iterate quickly.
End-to-End Neural Networks: Tesla aims to replace hand-written driving rules with neural networks trained on massive datasets (see the toy sketch below).
In-House Hardware: Tesla’s custom-designed FSD and Dojo chips optimize its AI systems.
While groundbreaking, this approach has drawn criticism for testing beta software on public roads, raising safety concerns.
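To make the “end-to-end” idea concrete, here is a toy sketch in PyTorch of a single network that maps raw camera pixels straight to control outputs, with no hand-coded rules in between. It illustrates the concept only; the layer sizes, input resolution, and the two outputs (steering and throttle) are my assumptions, not Tesla’s architecture.

```python
# Toy end-to-end driving network: camera frames in, control commands out.
# Purely illustrative; not Tesla's actual model or training setup.

import torch
import torch.nn as nn

class TinyDrivingNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 4 * 4, 64), nn.ReLU(),
            nn.Linear(64, 2),  # assumed outputs: [steering angle, throttle]
        )

    def forward(self, frames):
        return self.head(self.features(frames))

# One forward pass on a dummy batch of camera frames (batch, channels, H, W).
model = TinyDrivingNet()
controls = model(torch.randn(1, 3, 120, 160))
print(controls.shape)  # torch.Size([1, 2])
```

In practice, a network like this would be trained on logged driving data from the fleet (the “Fleet Learning” item above), which is where Tesla’s scale is supposed to pay off.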
The Larger Implications
The divide between Musk’s optimism and FSD’s current reality mirrors a broader issue: trust. Whether we’re placing our trust in a newly inaugurated leader or in an autonomous vehicle, promises must be backed by transparency, accountability, and results.
For Tesla, this means balancing innovation with caution, ensuring that progress doesn’t come at the expense of public safety. As exciting as the future of autonomous vehicles may be, we must critically evaluate the data, scrutinize bold claims, and prioritize ethical deployment.
Final Thoughts
In the end, the journey toward full autonomy isn’t just about AI—it’s about trust. For Tesla, and indeed the entire industry, transparency and realistic goal-setting will be key to navigating this road successfully. Until then, we’ll be keeping our hands on the wheel.
Vocabulary Key
FSD (Full Self-Driving): Tesla’s autonomous driving software, currently in beta, designed to eventually eliminate the need for human drivers.
Geofencing: A virtual boundary that restricts autonomous vehicles to specific, pre-mapped areas.
Neural Network: A machine learning model inspired by the human brain, used in Tesla’s FSD to process driving data and make decisions.
Miles Between Interventions: A metric used to evaluate autonomous driving performance, measuring the distance traveled before a human driver must take control.
Edge Cases: Uncommon or unpredictable scenarios (e.g., construction zones or sudden pedestrian crossings) that challenge autonomous vehicle software.
Beta Testing: A pre-release phase where software is tested in real-world conditions to identify and address issues before full deployment.
Lidar: A sensor technology that uses lasers to measure distances and create 3D maps, often used in autonomous vehicles but not in Tesla’s camera-based system.
FAQs
1. What is Tesla’s Full Self-Driving (FSD)?
Tesla’s FSD is an advanced driver-assistance system designed to enable Tesla vehicles to drive autonomously in various conditions. However, as of January 2025, it remains in beta and requires constant driver supervision.
2. Why does FSD require geofencing for unsupervised driving?
Geofencing restricts autonomous vehicles to specific, pre-mapped areas, reducing unpredictability. It allows the software to operate in a controlled environment with fewer edge cases, such as complex intersections or unmarked roads. (A toy geofence check is sketched below.)
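As a concrete illustration, here is a minimal sketch of the basic geofence question: is the vehicle’s current position inside a pre-mapped service area? The ray-casting test and the coordinates are illustrative assumptions, not any production geofencing system.

```python
# Toy geofence check: is a position inside a pre-mapped polygon?
# Illustrative only; real systems use geodetic coordinates and HD maps.

def inside_geofence(point, polygon):
    """Ray-casting point-in-polygon test; polygon is a list of (x, y) vertices."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count crossings of a horizontal ray extending to the right of the point.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# A toy rectangular service area and two test positions.
service_area = [(0, 0), (10, 0), (10, 6), (0, 6)]
print(inside_geofence((5, 3), service_area))   # True: allowed to operate
print(inside_geofence((12, 3), service_area))  # False: outside the geofence
```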
3. How does Tesla collect driving data?
Tesla vehicles gather real-world driving data through their onboard sensors and cameras. This data is anonymized, aggregated, and used to train Tesla’s neural networks, improving the FSD system.
4. What are the limitations of Tesla’s FSD Beta?
FSD Beta struggles with certain complex scenarios, such as construction zones, unprotected left turns, and adverse weather conditions. Additionally, it relies heavily on cameras and lacks backup sensors like lidar or radar.
5. When will Tesla achieve fully autonomous driving?
Predictions suggest limited unsupervised driving in geofenced areas by 2027-2028 and broader deployment by 2030-2032. However, these estimates are subject to change given Musk’s history of optimistic timelines.
What the WolfPack Is Reading:
r/SelfDrivingCars. The false promises of Tesla’s Full Self-Driving
USA TODAY, Tech News. Tesla announces fully self-driving cars. (October 19, 2016.) Highlights: Tesla Motors announced Wednesday that its electric cars will be the first in the nation to all be fitted with the hardware they need to drive themselves. CEO Elon Musk announced Wednesday that the automaker's Model S, X and forthcoming Model 3 sedan will start being outfitted with "the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver."
THE VERGE. Tesla wins again as lawsuit challenging Autopilot claims is dismissed. Elon Musk’s self-driving claims are mostly ‘puffery,’ not fraud, the judge said. (October 1, 2024.)
INSIDE EVs. Elon Musk's Self-Driving Promises Are Getting Old. Tesla's CEO has been making lofty predictions about robotic taxis and self-driving cars for nearly a decade. Is this time any different? (April 26, 2024.)
Washington Post. Tesla Sells Full Self-Driving. Is It A Fraud? (July 11, 2024.)
The Center for Auto Safety. A Dive into Automotive Safety with Dr. Jonathan Gitlin.
INSIDE EVs. 77% Of Tesla's Stock Value Rides On Self-Driving. CEO Elon Musk has written a very big check with all his promises about autonomous cars. When will that check come due? (November 18, 2024.)
The Dawn Project. The History of Tesla Full Self Driving. Highlights: Tesla also warns in a lengthy disclaimer that the software “may do the wrong thing at the worst time.”
#Tesla #FullSelfDriving #FSD #AutonomousVehicles #SelfDrivingCars #ElonMusk #RevolutionOrRisk #TeslaFSD #AI #Technology #Innovation #TeslaCybertruck #TeslaRobotaxi #TechNews