Deep Learning With The Wolf
A Tale of Two Self-Driving Cars: Experiencing Tesla and Waymo in a San Francisco Downpour


On a particularly stormy Thursday in San Francisco, I found myself in a unique position to experience two distinct approaches to autonomous driving technology. The weather conditions were challenging – heavy rain reduced visibility, roads were slick with large puddles, and traffic moved at a crawl. These less-than-ideal conditions provided the perfect testing ground to compare Tesla's Autopilot and Waymo's autonomous driving system.

It was a day of planes, trains, and automobiles. I had a meeting in San Francisco, but also needed to pick up someone from the airport. So I drove my Tesla to the airport in a heavy downpour, squeezed it into a much-too-tight parking spot, and “caught the BART” up to the city.

Riding BART is not as old-school as the New York City subway system, but not nearly as modern as the Metro in Washington, DC. The interiors of the cars have been modernized, but the ride itself is so loud it is difficult to hold a conversation. It still beats driving and trying to find parking.

After a successful meeting with a startup (we were in San Francisco, after all), it was time to head back to the train station.

We “summoned” the Waymo while walking away from our meeting site.

The software is very intuitive to use and reminiscent of summoning an Uber. Yet somehow the process felt less complicated than dealing with human drivers, and the Waymo arrived VERY quickly. Within two minutes, our ride had pulled up: a sleek-looking, Waymo-outfitted Jaguar.

After greeting you by name, the Waymo takes you through a safety briefing that sounds almost like an airline preflight announcement.

The Multi-Sensor Approach: Waymo's Autonomous System

The Waymo approach to autonomous driving is strikingly different from my Tesla.

As a Level 4 autonomous system, Waymo combines lidar, radar, cameras, and detailed mapping to create a comprehensive understanding of its environment. The difference was immediately noticeable in the smoothness of the ride and the system's confidence in handling the poor weather conditions.
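To make the "multi-sensor" idea concrete, here is a minimal sketch (my own illustration, not Waymo's actual software or numbers) of why fusing sensors helps: combining independent range measurements weighted by their precision yields an estimate more certain than either sensor alone. Lidar is typically far more precise at range measurement, while radar remains useful in rain; the sensor variances below are assumed values for illustration only.

```python
# Illustrative sketch of inverse-variance sensor fusion -- the basic
# principle behind combining measurements from multiple sensors.
# All numbers are assumptions, not real Waymo or sensor specs.

def fuse(est_a, var_a, est_b, var_b):
    """Combine two independent noisy measurements of the same quantity.

    Each estimate is weighted by the inverse of its variance, so the
    more precise sensor dominates. The fused variance is smaller than
    either input variance -- redundancy buys certainty.
    """
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused_est = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused_est, fused_var

# Hypothetical scenario: lidar reports an obstacle at 2.00 m with
# ~1 cm noise; radar reports 2.30 m with ~25 cm noise. The fused
# estimate stays close to the precise lidar reading.
est, var = fuse(2.00, 0.01**2, 2.30, 0.25**2)
```

The same weighting logic extends to many sensors and to full state estimators (e.g. Kalman filters); this two-sensor case just shows the core intuition.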

None of this reassured my fellow passenger, who initially described himself as “terrified” by the experience.

From the moment the Waymo vehicle began moving, I experienced a profound sense of ease that lasted throughout the entire journey. My extensive research into autonomous vehicle technology likely contributed to this comfort level. Understanding the sophisticated technology at work - the lidar's precise distance measurements, the radar's velocity tracking, and the redundant sensor systems - created a foundation of trust in the vehicle's capabilities.

This stands in stark contrast to the experience of using Level 2 autonomous systems like Tesla's Autopilot, where drivers must maintain constant vigilance and be ready to take control at any moment. The mental load of such continuous monitoring prevents true relaxation, as your mind must remain engaged in the driving task.

In the Waymo, however, I could genuinely sit back and observe the technology at work, knowing that multiple layers of safety systems were actively managing every aspect of the journey. This difference between having to stay perpetually alert versus being able to truly relax highlights one of the fundamental distinctions between Level 2 and Level 4 autonomous systems - not just in their technical capabilities, but in the vastly different user experiences they create.


One particularly memorable moment came when the Waymo vehicle navigated a tight right turn around a yellow cab. While the maneuver brought us closer to the taxi than most human drivers would attempt, the system had precisely calculated the available space using its lidar sensors. This illustrated how Waymo's approach combines multiple data sources to make decisions with mathematical precision, even in challenging conditions.

If your travels take you to San Francisco, Los Angeles, or Phoenix, I strongly recommend adding a Waymo ride to your itinerary. Beyond being a glimpse into the future of transportation, it offers a uniquely relaxing way to experience these cities.

As we wound our way to the train station, I found myself admiring San Francisco's architecture and streetscapes from an entirely new perspective, free from the usual stresses of urban navigation.

The experience transforms a simple journey into something remarkable – a perfect blend of technological innovation and urban exploration. It's not just a ride; it's a window into how we'll all experience cities in the years to come.

The Vision-Based Approach: Tesla's Autopilot

After my research assistant caught his train, I took BART to the airport to pick up my husband. I arrived with time to spare before his flight landed, but the timing meant we would be driving home at the peak of rush-hour traffic.

Slow-moving traffic typically presents ideal conditions for Tesla's Autopilot - the steady pace and predictable movements of surrounding vehicles play to the system's strengths.

Autopilot exemplifies Tesla's unique approach to autonomous driving: a vision-based system that interprets the world primarily through cameras and neural networks, much like the human brain processes visual information through our eyes. This strategy has allowed Tesla to deploy their technology widely, as cameras are far more cost-effective than the sophisticated lidar sensors used by companies like Waymo.

However, as our journey progressed, the limitations of this vision-only approach became increasingly apparent. We encountered numerous deep puddles concealing rough road surfaces, and the car's behavior highlighted a crucial technological gap.

Just as human eyes struggle to accurately gauge the depth of water on a road, Tesla's cameras couldn't determine water depth or assess the condition of the pavement lurking beneath. Without the precise depth perception that lidar provides, the system treated these hazards as normal sections of road.
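A rough way to see why camera-only depth estimation struggles: for stereo or multi-view geometry, depth error grows roughly with the square of distance, while lidar's time-of-flight error stays nearly constant with range. The sketch below uses entirely assumed camera parameters (focal length, baseline, pixel noise) purely to illustrate the scaling, not Tesla's actual hardware.

```python
# Illustrative scaling only -- focal length, baseline, and disparity
# noise are assumptions, not real vehicle camera specs.

def stereo_depth_error(depth_m, focal_px=1000.0, baseline_m=0.3,
                       disparity_err_px=0.5):
    """Approximate depth uncertainty for a stereo camera pair.

    From Z = f * B / d, a small disparity error d_err propagates to a
    depth error of roughly Z**2 / (f * B) * d_err -- so doubling the
    distance quadruples the uncertainty.
    """
    return depth_m**2 / (focal_px * baseline_m) * disparity_err_px

near = stereo_depth_error(5.0)    # hazard one car-length ahead
far = stereo_depth_error(50.0)    # same hazard ten times farther away
# far is 100x larger than near: the quadratic penalty of
# camera-based ranging at distance.
```

Estimating *water depth* is harder still, since refraction and reflections confuse the disparity match itself; the point here is only that passive cameras lose precision with range in a way lidar largely does not.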

The result was a notably uncomfortable ride as the car repeatedly "dove into" these puddles at full speed.

This behavior reveals a key distinction between Level 2 and Level 4 autonomous systems. Tesla's Autopilot, as a Level 2 system, focuses primarily on basic driving tasks - staying in lane, maintaining speed, and avoiding collisions. It lacks the sophisticated decision-making capabilities that would allow it to consider passenger comfort, such as avoiding rough patches of road when possible. In contrast, Waymo's Level 4 system incorporates passenger comfort into its decision-making, much like an experienced human driver - though even it occasionally encounters unavoidable rough patches, as any vehicle does.

The video captures a telling moment when my husband instinctively shouts "Stop!" at the car. While Tesla vehicles do have some basic voice command capabilities - like adjusting the temperature or opening the glovebox - they're far from the sophisticated AI interactions depicted in science fiction shows like Knight Rider.

The car certainly won't alter its driving behavior or disengage Autopilot based on verbal commands, no matter how emphatically delivered. This limitation reflects the current state of autonomous vehicle technology: despite impressive capabilities, these systems still require physical intervention through the steering wheel or pedals when human judgment determines a change is needed.

Throughout our journey, we found ourselves regularly disengaging Autopilot when approaching these water hazards. This need for constant human oversight illustrates a fundamental aspect of Level 2 autonomous systems - while they can reduce the mental strain of driving in challenging conditions, they cannot fully replace human judgment. Each intervention served as a reminder that we were testing the boundaries between human and machine capabilities.

We maintained our sense of humor throughout the experience, laughing as our sophisticated electric vehicle repeatedly demonstrated its enthusiasm for puddles like an automotive version of Charlie Brown's friend Pig-Pen. This blend of cutting-edge technology and decidedly unsophisticated behavior perfectly encapsulates the current state of autonomous driving - impressive capabilities mixed with surprisingly human-like limitations.

The contrast with my earlier Waymo experience couldn't be starker - while both vehicles are impressive feats of engineering, Tesla's Autopilot serves as a reminder that there's still a vast technological gulf between driver assistance systems and true autonomous driving. For all its capabilities, it's no Waymo.

Looking to the Future

These contrasting experiences highlight the two main paths toward autonomous driving. Tesla's vision-based approach mirrors human learning and perception, potentially offering broader adaptability but requiring significant advances in AI to achieve full autonomy. Waymo's multi-sensor approach provides greater precision and reliability within mapped areas, though it requires more extensive infrastructure and preparation.

The future of autonomous driving may ultimately combine elements of both approaches, but for now, these distinct strategies offer different tradeoffs between deployment scale and operational capability.


FAQ:

What's the difference between Level 2 and Level 4 autonomy? Level 2 systems, like Tesla's Autopilot, assist drivers but require constant supervision. Level 4 systems, like Waymo, can operate fully autonomously within specific areas and conditions.

Why doesn't Tesla use lidar sensors? Tesla believes that vision-based systems using cameras and AI can achieve human-level driving capabilities at a lower cost than lidar-based systems.

Can these systems drive in any weather condition? Weather conditions affect both systems, but multi-sensor approaches like Waymo's tend to handle poor weather more reliably due to their redundant sensing capabilities.


What the WolfPack Is Reading:

InsideEVs. Tesla Sales Are Tanking Across The World. Blame the Musk Effect, declining EV subsidies, or all of the above. But Tesla's global sales are off to a very bad start for 2025. (February 8, 2025.)

r/TeslaInvestorsClub. Why a New CEO is Needed. Musk is no longer focused on Tesla. (February 2, 2025.)

Electrek. Tesla Cybertruck crash on Full Self-Driving v13 goes viral. Fred Lambert. (February 9, 2025.)

Electrek. Elon Musk is about to masterfully move the goalpost on Tesla Full Self-Driving. (February 10, 2025.) Summary: “Electrek’s Take: I can almost guarantee what will happen: Tesla will launch this project and claim to have achieved ‘unsupervised self-driving.’ Elon and his Tesla influencer simps will pump this up while blurring the line between this product and FSD in customer vehicles to give the impression that Tesla is still a leader in self-driving. When, in fact, Tesla will only have achieved what Waymo delivered years ago.”


#AutonomousVehicles #SelfDrivingCars #Tesla #Waymo #TechInnovation #ArtificialIntelligence #FutureOfTransportation #AutonomousDriving #MobilityTech #TransportationTechnology #TeslaAutopilot #WaymoDrive
