DIY Robotaxis

"Self-driving cars are the natural extension of active safety and obviously something we should do." – Elon Musk

[Image: A Tesla Model X driving on a road with a person in the driver's seat. Photo by Timo Wielink on Unsplash.]
Published: Sunday, 03 November 2024 22:57 (EST)
By Nina Schmidt

Elon Musk’s vision of a future filled with autonomous vehicles is no secret. But while he’s busy planning Tesla-run robotaxi networks, some drivers are already taking matters into their own hands. And, as you might expect, things are getting a little... bumpy.

In April, a Tesla operating on Full Self-Driving (FSD) software collided with an SUV in Las Vegas, sparking fresh concerns about the growing number of DIY robotaxis. These are not the fully autonomous, heavily regulated vehicles you'd expect from companies like Waymo or Cruise. Nope, these are ordinary Teslas whose drivers use FSD to turn them into makeshift robotaxis for Uber and Lyft. And the results? Well, let's just say they're not always smooth rides.

Why Are Drivers Taking the Risk?

So, why are drivers so eager to use Tesla’s FSD software, despite its limitations? According to Fast Company, many Uber and Lyft drivers are using FSD to reduce stress and work longer hours. The software, which costs $99 a month, helps with steering, acceleration, and braking, allowing drivers to take a bit of a breather—at least in theory.

But here's the catch: FSD is not fully autonomous. It's classified as partial automation (SAE Level 2), meaning the driver must remain fully engaged and ready to take control at any moment. And as the Las Vegas crash shows, things can go wrong fast. Justin Yoon was behind the wheel with his hands off it when his Tesla failed to slow down for an SUV crossing the road. Yoon took control at the last second, but it was too late: the car was totaled, and both he and his passenger were injured.

Regulatory Gray Areas

One of the biggest issues with DIY robotaxis is the regulatory gray area they operate in. While dedicated robotaxi operators like Waymo and Cruise are subject to strict oversight, Tesla's FSD, as a driver-assist feature, is not. State and federal authorities consider Tesla drivers fully responsible for their vehicles, even when driver-assist software is engaged. That means there's legally nothing stopping Uber and Lyft drivers from using FSD to shuttle passengers around.

But just because it’s legal doesn’t mean it’s safe. FSD has been involved in several accidents, including two fatal ones, and it’s under increasing scrutiny from regulators. The U.S. National Highway Traffic Safety Administration (NHTSA) is aware of the Las Vegas crash and has reached out to Tesla for more information, but so far, there’s been no formal investigation into how ride-hail drivers are using FSD.

Is the Tech Ready?

Despite Musk’s grand ambitions, many drivers are finding that FSD still has some serious shortcomings. Some report issues like sudden, unexplained acceleration and braking, while others avoid using it in complex situations like airport pickups or construction zones. Sergio Avedian, a ride-hail driver in Los Angeles, told Fast Company that he uses FSD but isn’t completely comfortable with it. He estimates that 30-40% of Tesla ride-hail drivers use FSD regularly, but many are cautious about when and where they activate it.

And it’s not just drivers who are concerned. David Kidd, a senior research scientist at the Insurance Institute for Highway Safety, said that while Tesla’s advancements are impressive, they raise serious safety concerns. “From a safety standpoint, it raised a lot of hairs,” Kidd said. He believes that NHTSA should at least provide basic guidelines to prevent the misuse of driver-assist technologies like FSD.

What’s Next?

As more drivers turn to FSD to make their jobs easier, the pressure is on for regulators to step in. Missy Cummings, director of the George Mason University Autonomy and Robotics Center, believes that companies like Uber and Lyft should take the lead in banning the use of FSD for ride-hailing services before things get worse. “If Uber and Lyft were smart, they’d get ahead of it and they would ban that,” she said.

But for now, drivers like Kaz Barnes, who has completed over 2,000 trips using FSD, are eagerly awaiting the day when they can fully hand over control to their Teslas. “You would just kind of take off the training wheels,” Barnes said. “I hope to be able to do that with this car one day.”

Whether that day comes sooner or later, one thing is clear: the road to fully autonomous ride-hailing is still full of twists and turns.

According to Ars Technica, Tesla's FSD requires human intervention roughly every 13 miles. While it's great at giving pedestrians room, it has also been known to run red lights and cross into oncoming traffic. Yikes.
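
To put that 13-mile figure in perspective, here's a rough back-of-the-envelope sketch. The 200-mile shift length is an assumption for illustration, not a number from the reporting:

    # Back-of-the-envelope: interventions per ride-hail shift.
    # Assumes one intervention every ~13 miles (Ars Technica figure)
    # and a hypothetical 200-mile driving day (our assumption).
    MILES_PER_INTERVENTION = 13
    SHIFT_MILES = 200

    interventions = SHIFT_MILES / MILES_PER_INTERVENTION
    print(f"~{interventions:.0f} interventions per {SHIFT_MILES}-mile shift")
    # prints: ~15 interventions per 200-mile shift

In other words, a full-time ride-hail driver at that rate could expect to grab the wheel more than a dozen times a day.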

With stats like that, it’s no wonder the debate over DIY robotaxis is heating up. The question is, how many more accidents will it take before regulators step in?
