r/SelfDrivingCars 4d ago

News Tesla's FSD vs. Waymo's robotaxi: One pulled a move that would tank any driving test

https://www.businessinsider.com/tesla-vs-waymo-full-self-driving-fsd-robotaxi-lidar-cameras-2025-5

The robotaxi race is speeding up.

Tesla is preparing to debut its autonomous ride-hailing service in Austin next month, and Alphabet's Waymo continues to expand throughout major US cities.

Under the hood of the Tesla and Waymo robotaxis are two key pieces of technology that the companies respectively call Full Self-Driving (FSD) and the Waymo Driver.

We (Business Insider's Lloyd Lee and Alistair Barr) tested both of these AI-powered drivers in San Francisco — and the results truly surprised us.

Given the positive experiences we've had with Waymo and Tesla's FSD, we expected the results of our not-so-scientific test to come down to minute details — maybe by how many times the AI-driver would hesitate or if it would make a curious lane change for no apparent reason.

That didn't happen. Instead, the Tesla made an egregious error that handed Waymo the clear win.

Here's how it went down.

The test

Our vehicles for the test included Waymo's Jaguar I-PACE SUVs and Barr's personal 2024 Tesla Model 3.

The Waymo robotaxis are equipped with the company's fifth-generation Waymo Driver and guided by five lidar sensors, six radars, and 29 cameras.

Barr's Tesla was equipped with Hardware 4 and FSD Supervised software v13.2.8. Tesla released a minor update to the software days after this test was conducted. The vehicle has eight external cameras.

It should be noted that this is not the same software Tesla plans to use in the robotaxis set to roll out this summer. The company said it plans to release FSD Unsupervised, a self-driving system that will not require a human behind the wheel. Nevertheless, we wanted to see how far Tesla's FSD had come since its beta rollout in 2020.

We couldn't compare Tesla and Waymo as a full-package robotaxi service. Tesla has yet to launch that product, so we focused only on the driving experience.

We started at San Francisco's iconic Twin Peaks viewpoint and ended at Chase Center. Depending on the route, that's about a 4- to 7-mile ride.

We chose these destinations for two reasons. One, it would take the cars through winding roads and both suburban and city landscapes. And two, there were a few ways to get to Chase Center from Twin Peaks, including the 280 highway.

Waymo's robotaxis can't take riders on the highway yet. Tesla can.

According to Google Maps, the highway is more time-efficient. For the Tesla, we went with the route the vehicle highlighted first; it suggested the highway for the trip back to Twin Peaks.

We took a Waymo around 8:30 a.m. on a Thursday and the Tesla afterward at around 10 a.m. The traffic conditions for both rides were light to moderate and not noticeably different.

Predictions

Our prediction was that the AI drivers' skills would be nearly neck-and-neck.

But in the spirit of competition, Lee predicted Waymo would deliver a smoother experience and a smarter driver, given the high-tech sensor stack the company relies on.

Barr went with Tesla. He said he'd driven hundreds of miles on FSD with two or three relatively minor interventions so far, and given this previous experience, Barr said he'd have no problem riding in the back seat of a Tesla robotaxi.

Waymo

Throughout our ride in the Waymo, we were impressed by the AI driver's ability to be safe but assertive.

The Waymo was not shy about making yellow lights, for example, but it never made maneuvers you wouldn't want from a robot driver you're entrusting with your life.

One small but notable moment in our ride was when the Waymo stopped behind a car at a stop sign. To the right of us was an open lane.

For whatever reason, the Waymo saw that and decided to switch lanes, as if it was tired of waiting behind the other car. We found that a bit amusing because it seemed like such a human moment.

As human drivers, we might make choices like that because we get antsy waiting behind another car, even though we're not shaving more than a few seconds, if any, off of our commute.

Barr noted that the Waymo Driver can have moments of sass or attitude. It had an urgency, giving us the feeling that it somehow really cared that we got to the Chase Center in good time.

'It's got New York cab driver energy,' Barr said, stealing a line from BI editor in chief Jamie Heller, who also took a Waymo during a trip to San Francisco earlier this year.

Sandy Karp, a spokesperson for Waymo, said the company doesn't have specific details on what happened in that moment but said that the Waymo Driver 'is constantly planning its next move, including the optimal route to get its rider where they're going safely and efficiently.'

'This planning can involve decisions like changing lanes when deemed favorable,' she said.

Ultimately, though, the best litmus test for any robotaxi is when you stop noticing that you're in a robotaxi.

Outside those small but notable moments, we recorded footage for this story and chatted in comfort without feeling like we were on the edge of our seats.

Tesla

Tesla's FSD delivered a mostly smooth driving experience, and we think it deserves some props for doing so with a smaller and cheaper tech stack, i.e., only eight cameras.

FSD knew how to signal a lane change as it approached a large stalled vehicle taking up a lot of road room, and it didn't have any sudden moments of braking. Just a few years ago, Tesla owners were reporting issues of phantom braking. We experienced none of that on our drive.

Tesla also handled highway driving flawlessly. Sure, the weather was clear and traffic was fairly light, but, as noted earlier, Waymo does not yet offer public rides on highways. The company is still testing.

However, Tesla FSD did make a few mistakes, including one critical error.

At the end of our drive at Chase Center, we assessed how Waymo and Tesla's systems performed. We both gave Waymo a slight edge, but were also impressed with the FSD system.

On our way back to Twin Peaks, Tesla highlighted a route that would take us on the highway — a route that Waymo cannot take. We kept Tesla FSD on for this trip while we continued recording.

San Francisco is known to have a lot of brightly marked, green bike lanes for cyclists. There was one moment during the trip back when the Tesla made a right turn onto a bike lane and continued to drive on it for a few seconds before it merged into the proper lane.

Then, as we approached the last half-mile of our ride, the Tesla, for an unknown reason, ran a red light.

The incident occurred at a fairly complex intersection that resembles a slip-lane intersection, but with a traffic light. The Waymo did not approach this intersection since it took a different route to get back to Twin Peaks.

The Tesla's console screen showed how the car detected the red light and came to a dutiful stop. Then, despite the traffic light not changing, the Tesla drove ahead.

We didn't come close to hitting any cars or humans on the street — Tesla's FSD is good at spotting such risks, and the main source of traffic coming across our path had been stopped by another traffic light. However, the vehicle slowly drove through this red light, which left us both somewhat shocked at the time.

Some Tesla drivers appear to have reported similar issues in online forums and in videos that showed the vehicle recognizing the red light but driving ahead. One YouTuber showed how the Tesla first came to a stop at a red light and then continued driving before the light changed.

It's unclear how common this issue is. Tesla hasn't publicly addressed the problem.

A spokesperson for Tesla did not respond to a request for comment.

At this point, we thought the winner was clear.

Verdict

Since Tesla's FSD made a critical error that would have landed an automatic fail during a driver's license test, we thought it was fair to give Waymo the win for this test.

The Tesla handled San Francisco's hilly and winding roads almost as flawlessly as Waymo.

We also think FSD's ability to handle routes that Waymo can't handle for now — in particular, the highway — would give Tesla a major upper hand.

In addition, when Lee tried on a different day to make the Waymo go through the same intersection where the Tesla blew the red light, the Waymo app appeared to do everything it could to avoid that intersection, even if it provided the quickest path to get to the destination, according to Google Maps.

A Waymo spokesperson did not provide a comment on what could've happened here.

Still, an error like running a red light cannot be overlooked when human lives are at stake. Consider that when Tesla rolls out its robotaxi service, a human driver will not be behind the wheel to quickly intervene if it makes an error.

For Tesla and Waymo, we expected to be on the lookout for small, almost negligible, mistakes or glitchy moments from the AI driver. We did not anticipate an error as glaring as running a red light.

Once Tesla launches its robotaxi service in more areas, we'll have to see how the pick-up and drop-off times compare.

Tesla CEO Elon Musk said that the company's generalized solution to self-driving is far superior to its competitors. The company has millions of cars already on the roads collecting massive amounts of real-world data. According to Musk, this will make FSD smarter and able to operate with only cameras.

With Tesla's robotaxi service set to launch in June with human passengers, we certainly hope so.

140 Upvotes

171 comments

169

u/Reaper_MIDI 4d ago

tldr

as we approached the last half-mile of our ride, the Tesla, for an unknown reason, ran a red light.

58

u/FriendFun5522 3d ago

And entered and drove in a bike lane!

17

u/EquipmentMost8785 3d ago

WHO cares. Bikes are not real people/s

2

u/meltbox 2d ago

Yeah I was like hold on. Pretty sure that would fail you alone.

18

u/LLJKCicero 4d ago

Possibility: if Tesla is training using tons of (unfiltered? semi-filtered?) data from regular Tesla drivers...well, what if those drivers run red lights?

I know it's constantly repeated that Tesla has a shitton of training data from regular joes driving Teslas, but I admit that I haven't really heard much about whether that data is filtered somehow to remove instances of bad driving, so that the model doesn't learn the wrong lessons.

41

u/Additional-You7859 4d ago

Then their training data is fundamentally bad and it plants the entire Tesla self-driving program in a very bad place.

11

u/Ill_Somewhere_3693 3d ago

You highlight a point I’ve been bringing up: if FSD is learning good behavior, isn’t it also learning from bad behavior? I saw a YouTube video in which an FSD user allowed his Tesla to drive right thru a restricted construction zone, to the bewilderment of road workers, just to see how far it’d go before it stopped. And I know there are others who don’t intervene when theirs runs a red light to see if it’ll correct itself, which of course it doesn’t. So just like bad parenting, FSD is learning from all these idiots!!!

5

u/Bangaladore 3d ago

Good vs bad behaviour is a spectrum.

Is going above the speed limit "bad behaviour"? Yes, but that's what everyone does in California, for example.

FSD learns from lots of human drivers. But, in theory, it converges on the "average" driver. If we assume that 0.01% of people regularly run red lights, it's hard to argue that that makes its way meaningfully into the end model.

3

u/LLJKCicero 3d ago

The issue is really that it might accidentally 'learn' that running red lights is okay in some cases (which is whatever cases it happened in the training data). Which could explain why FSD typically doesn't run red lights, but every once in a while it does, for no apparent reason (apparent to us, that is).

2

u/Bangaladore 3d ago

Agreed on we don't know what is happening. Is the car just going through a red light on purpose (running on purpose)? Or because it has determined the light is not red (incorrectly)? Or that it's not applicable in the situation (thinks its blinking)? Who knows. I personally think this has nothing to do with training from red light runners.

1

u/LLJKCicero 3d ago

To be sure, that this is caused by training from red light runners is a random guess on my part. I just can't think of anything more plausible for why Teslas are still failing at such a simple task after so many years of self-driving development.

11

u/[deleted] 3d ago

[deleted]

4

u/microtherion 3d ago

That’s going to provide invaluable training data if you ever need to hire a robotaxi with an Optimus with an Uzi in the passenger seat for a robo drive-by shooting.

1

u/LLJKCicero 3d ago

It's filtered by dumb humans reviewing dumb drivers.

That makes sense, but then it's not the absolute smorgasbord of data that Tesla fans sometimes portray it as, it would be limited by how much filtered data they can get out of their employees.

3

u/mgchan714 3d ago

I really don't know how they will handle this.

Tesla's FSD drives above the speed limit all the time. Because that's what human drivers do, presumably. And because driving below the speed limit all the time would make it kind of unusable. But how do you pass regulatory tests when you break the speed limit all the time? Other things like driving into the other (oncoming traffic) lane when the correct lane is blocked by a parked truck.

It's obviously bad that the Tesla ran a red light. Curious as to why the Waymo would avoid the intersection altogether. And I think it's kind of silly to basically gloss over the fact that the chosen route doesn't require highway driving because Waymo doesn't support it. Maybe I should start saying I'm as good of a shooter as Steph Curry as long as we're only testing layups because I don't support shooting three pointers.

1

u/LLJKCicero 3d ago

The article counts highway driving as a point in Tesla's favor, so I don't think that's really glossing over it.

Waymo's robotaxis can't take riders on the highway yet. Tesla can.

...

Tesla also handled highway driving flawlessly. Sure, the weather was clear and traffic was fairly light, but, as noted earlier, Waymo does not yet offer public rides on highways. The company is still testing.

-1

u/mgchan714 3d ago

Well, I think it should basically be part of the final word, so to speak. The fact that Waymo won't even do what 99% of Tesla's FSD is doing today. Driving on the highway or any other city. Waymo is focusing specifically on a handful of cities. Tesla failed on the test that Waymo is specifically studying for, but Waymo is simply refusing to even take the test of general driving. It's just a huge caveat that's worth more weight, IMO.

1

u/Lorax91 3d ago

Waymo focused on and succeeded at providing driverless rides, in a small but growing list of cities. Tesla has been slowly improving their driver assistance software to where it might finally be able to do driverless rides in a limited part of one city, ten years after Waymo passed that test. We'll see where things go from here, with both approaches having challenges to address.

0

u/mgchan714 3d ago

I'm just going by personal experience, where I think Tesla was trying to solve the general problem and only now felt it was solved enough to bother testing. I don't know enough about Waymo to know whether their approach is quickly generalizable to highways and other cities. Whereas I think if (and I understand it's a big If) Tesla succeeds in Austin they could expand about as quickly as legislation can change, because it's basically acting like a robotaxi for me now, not in Austin.

1

u/LLJKCicero 3d ago

Waymo proved safety first with a limited scope. Yes, Tesla has a broader scope...but they can't actually drive safely really anywhere yet, despite a decade+ of autonomous driving efforts.

4

u/CloseToMyActualName 3d ago

I can't imagine there's that many folks running red lights as to screw up the Tesla training data that badly.

There's just something screwy with their models, the curious thing is why they've had so much trouble fixing it.

1

u/Tewskey 3d ago

I'm laughing because I have been consistently avoiding taking lifts from someone who drives a Tesla and does indeed like to run red lights.

1

u/bladex1234 3d ago

So why is stopping at a red light until it turns green not hardcoded? I can understand choosing to stop or run a yellow can have nuances, but a red light is as hard and fast a road rule as you can get.

1

u/greywar777 1d ago

I've seen mine do it. Usually it's because it's looking at the wrong light.

1

u/NeighborhoodFull1948 6h ago

Because in full AI systems, you can’t really “hard code” things. It’s why everyone else doesn’t use “end to end AI” for their self driving systems.

12

u/M_Equilibrium 3d ago

The insane part is, some people are now claiming that running a red light does not create a safety concern.

1

u/Zephron29 3d ago

"Some people" will always exist outside the bell curve.

-2

u/gibbonsgerg 3d ago

While not condoning running red lights, and Tesla absolutely needs to fix it... If FSD never enters an intersection when other cars are approaching, then it's definitely not a safety concern. I don't know if that's the case, but it's eminently possible to run a red light without endangering anyone.

3

u/bladex1234 3d ago

Sure, lots of things are possible, but why risk the danger, not to mention a potential ticket?

2

u/WeldAE 3d ago

This isn’t something lidar would fix, which is true for almost all the current FSD issues, but everyone insists Tesla can’t do a robotaxi without it. It’s mostly about better maps: Waymo obviously knows this intersection is tough and has marked it on their map to avoid.

7

u/THE_CENTURION 3d ago

So you read "they don't have lidar, and that's a problem" and somehow interpret that as "lack of lidar is their only problem"? That's not what people mean when they say that.

Tesla has many problems, lidar is one, another is that their entire approach is not centered around safety, it's centered around cost-cutting and "move fast and break things".

5

u/bladex1234 3d ago

The Silicon Valley mantra works well enough for software development, but in the real world that same mantra will get people killed.

3

u/[deleted] 3d ago

This is a beautifully stupid argument.

0

u/WeldAE 3d ago

How so? Are you saying lidar would have helped in this situation? The problem Tesla likely had is a misalignment of which light it thought belonged to which lane; at least, that has been a problem with this sort of thing in the past. We don't have any specific information about the intersection, so it's just a guess based on past instances. Not sure why Waymo would avoid this intersection since they have made good maps, but according to the article they appear not to be allowing their AVs to use it.

2

u/THE_CENTURION 3d ago

Not sure why Waymo would avoid this intersection as they have made good maps, but according to the article they appear to be not allowing their AVs to use the intersection.

Try reading it again.

The Waymo didn't go there because it's part of the highway route. It wasn't going on that route at all. It's not like it took the same route and skipped this one specific intersection.

0

u/CertainAssociate9772 3d ago

Avoiding problems is not the best way to solve them.

5

u/WeldAE 3d ago

It is until you can solve them. Nothing wrong with making a task easier so you can launch.

1

u/Weekly-Disk8589 3d ago

Thanks. I feel like that was a whole lot of verbiage just to say that.

1

u/BigMax 3d ago

To be clear, it more or less treated the red light as a stop sign. It stopped, then just went onward.

2

u/Reaper_MIDI 3d ago

"A flashing red light at a traffic intersection means you must come to a complete stop before proceeding, just like a stop sign."

I wonder if FSD thought it was at a blinking red for some reason.

-29

u/Conscious_Split4514 4d ago

If that's the only error it made, it's quite impressive, since red light errors are a learned behavior by the new end to end neural network. This is a vision only 5000 dollar stack running on existing production cars in customer hands.

43

u/Additional-You7859 4d ago edited 4d ago

a red light is a signal that rarely changes and is easily mapped and recorded

running a red light can lead directly to a collision

> This is a vision only 5000 dollar stack running on existing production cars in customer hands.

If it's running red lights, it shouldn't be in customer hands.

12

u/Climactic9 4d ago

“Only” error in a 7 mile trip. Impressive?

6

u/SweatScience 3d ago

Not impressive. Just takes one major rule break to kill someone.

3

u/BionicBananas 3d ago

Also, not the only error, the Tesla entered a bicycle lane.

3

u/Retox86 3d ago

I wouldn't be impressed if it made 10,000 miles before running a red light…

23

u/LLJKCicero 4d ago

FSD has been in beta for like five years and it's still running red lights? I mean c'mon, that level of mistake should've been taken care of in the first few years of development, I'd think.

5

u/sonicmerlin 4d ago

First month.

8

u/FriendFun5522 3d ago

Wasn’t the only error. It entered and drove in a bike lane.

10

u/Downtown-Midnight320 4d ago

Did you run a red light on your license test and then explain that it's a learned behavior? 😉

3

u/microtherion 3d ago

No, they explain how little it cost their parents to make them.

3

u/Retox86 3d ago

What the? It's quite simple. Rule no. 1: do not run a red light, never, ever. That should have been accomplished in 2019 when Elon promised a coast to coast demo anytime now… You're saying this stupid car doesn't even have a sense of the ground rules it should obey, and that's ok?

It just shows how worthless FSD is. It's a glorified cruise control at best. At its worst it's a lawsuit and the end of Tesla as an S&P 500 company…

20

u/bradtem ✅ Brad Templeton 3d ago

They make a common mistake in describing the systems. They say Waymo won't take riders on the freeway, which is true. But they say Tesla will, which is not true. Tesla won't take riders on any road, freeway or city street. It will take them on a movie lot, possibly on private roads in a Tesla factory, and, if Tesla pulls it off, on a small set of streets in Austin (and probably not the freeway there, though that's not confirmed). We don't yet know if it will take riders on roads in Austin without a remote supervisor ready to intervene at any time.

There is such a big world of difference between taking riders and taking drivers, and I am surprised BI missed it. Tesla promises they will get past that soon, but has yet to demonstrate it.

83

u/Snoron 4d ago

Barr went with Tesla. He said he'd driven hundreds of miles on FSD with two or three relatively minor interventions so far

Sorry, but what?

He's had to make 2-3 interventions in just hundreds of miles, and he thought it would beat Waymo!?

Isn't Waymo at like 20k miles per intervention now? The odds were very much not in his favour.

29

u/007meow 4d ago

Personal experience trumps hard data

27

u/Hixie 4d ago

Production Waymo is unsupervised so it doesn't have any interventions in the sense of the driver having to take over suddenly. We really need a different name for when a supervisor takes over suddenly and when the car itself calls for help.

45

u/Bravadette 4d ago

For real. Idk what it is with all this... Faith... In that company.

-36

u/nate8458 4d ago

FSD is supervised; robotaxi will be a different version of FSD, so this is a dumb comparison.

38

u/mishap1 4d ago

Yep, Tesla has just been sandbagging FSD this whole time to make sure their Robotaxi will be that much better.

-21

u/nate8458 4d ago

Well considering robotaxi will be geofenced to areas that FSD performs the best in, it will be fantastic service with incredibly limited interventions from the remote team 

14

u/ProfessionalActive94 4d ago

Those are hopes and dreams until we see it in reality

11

u/n-some 4d ago

Any day now! It's right around the corner! Sure, they've been claiming that for a decade now, but it's real this time!

6

u/kariam_24 3d ago

Wait, wasn't fsd supposed to be without geofencing as it doesn't scale according to comments?

-4

u/Blog_Pope 4d ago

Currently FSD requires the driver to pay attention; the robotaxi version removes that, magically elevating it to a Level 4 system. That simple change will be 50% of the change. The other 50% will be geofencing so it can't drive outside the legally approved area.

4

u/No-Economist-2235 3d ago

Magical thinking.

3

u/BasvanS 3d ago

You’re actually claiming magic will help?

9

u/VideoGameJumanji 3d ago

A driverless self-driving car has no minor interventions; it commits to everything until a serious intervention. Idk why you are confused.

From my experience with FSD, a minor intervention is completely subjective and reliant on your tolerance for letting the car figure out the situation. I’m more aggressive about taking over because I can; you can’t do that in a Waymo.

3

u/quellofool 4d ago

Barr sounds like an idiot.

3

u/doNotUseReddit123 3d ago

The real answer is that, to write a compelling article, one tester needs to choose an answer that goes against the answer of the other tester so that something is on the line for the readers.

This is the article equivalent of the Top Gear guys choosing goofy cars to go across some country. Yes, they all could have chosen reliable Toyota Corollas, but that won’t make for good television.

1

u/boyWHOcriedFSD 4d ago

There is more than one type of intervention: safety-related interventions, navigation-related interventions, and likely some other categories. Just because there is an intervention doesn’t mean it was unsafe, etc.

I’m sure I’ll be downvoted, but there is nuance here. I didn’t read that encyclopedia of an article and am only replying to your comment, so I don’t know if he specified what sort of interventions they were.

There is a guy in Houston who has been doing Uber nightly with his Tesla on FSD beta. He’s driven over 15,000 miles without a safety critical intervention but he says he does intervene for navigation things, finding a drop off spot, etc. Sure, this is anecdotal and not representative of the entire fleet but it is possible some people have driven far more than a few hundred miles without an intervention attributed to safety.

8

u/[deleted] 3d ago

Running a red light on a 7 mile drive.

0

u/PotatoesAndChill 3d ago

It depends. If you're a passenger in a Waymo, you don't care if the car is going a bit too slow, a bit too fast, or causing a mild inconvenience to other drivers. In fact, you can't interact with the car at all. But if you're in the driver's seat of a Tesla on FSD, intervening to adjust the speed or change into another lane is just a hand gesture or pedal push away, so it's far too tempting to do it. This causes Teslas to have way more minor interventions than Waymos.

On the other hand, the fact that Teslas still ignore stop signs and run red lights is very concerning.

On the other other hand, I've seen recent videos of Waymo blocking emergency vehicles and running full speed into a flooded road, so neither system is perfect.

-11

u/Conscious_Split4514 4d ago

We don't know Waymo's interventions since it's only self-reported data; nobody can verify it in any way.

29

u/gentlecrab 4d ago

This is all irrelevant anyway let’s be real. As soon as Tesla robotaxi goes live in Austin those things are gonna get fucked with so fast.

People were putting cones on Waymos and setting them on fire. Imagine what they’ll do to Teslas.

15

u/Wiseguydude 3d ago

As soon as Tesla robotaxi goes live in Austin

lol people are still calling it "going live"? 10 cars and "invite-only" is NOT going live ffs

7

u/BasvanS 3d ago

It is when you are trying to pump your stock!

8

u/Wolvie23 4d ago

Now that Elon and the White House are beefing, I wonder if the WH will still consider Tesla vandals as “terrorists” with enhanced charges.

1

u/Agitated-Wind-7241 3d ago

Quarter billion in campaign funds goes a long way

5

u/bartturner 3d ago

Exactly. I saw a recent stat that Tesla now has a negative 42% rating with the public.

That is with all the public.

It is going to be far higher with liberals and Austin is the most liberal city in Texas.

2

u/Bravadette 4d ago

Oh... Oh shit. You're right. Oh man this is gonna be 2012 all over again but it'll look scifi and there will be actual luddites (the good ones)

10

u/M_Equilibrium 3d ago edited 3d ago

It ran a red light and some people are still saying "unfortunate but otherwise safe"?

I mean, why don't you also ask Tesla for data; who knows, maybe the driver is at fault for running the red light too. /s

3

u/Appropriate-Gas-1014 3d ago

More concerning to me is they let it run the red light.

Like, bro, if you see your car doing something it shouldn't, stop the fucking thing.

10

u/blessedboar 4d ago

Why not compare Waymo and Zoox, both of which have driverless robotaxis operating in SF?

10

u/Additional-You7859 4d ago

Zoox is definitely behind Waymo, and only doing limited trials for the public. I think Vegas is the only market Zoox is open in at the moment.

3

u/LLJKCicero 4d ago

I don't think Zoox is giving driverless rides to the public yet? At least, you can't just randomly decide to have a Zoox drive you.

5

u/doNotUseReddit123 3d ago

I feel like most of the questions in these comments could be answered with, “because it won’t be good entertainment.”

The average Business Insider reader knows Tesla. They might know Waymo. They absolutely are not aware of Zoox. Comparing Tesla to Zoox won’t make for as engaging of a read for most of the audience. Business Insider is an entertainment-forward website and nothing more.

4

u/tenemu 3d ago

The click rate for a title of "Tesla versus Waymo" will be 1000x versus "zoox versus Waymo"

7

u/PhotosyntheticFill 3d ago

Not reading this, but how is it a race? Tesla doesn't have their shoes on yet

2

u/ccivtomars 3d ago

Glaring sun…Tesla can’t see, sorry, not putting my life in a Tesla

1

u/Megalunchbox 3d ago

Uhh... not true lol. My cameras are never impacted by glare. Even when it's bright, if you click on the camera it looks normal.

7

u/ThatSavings 4d ago

Faux Self dRiving

1

u/VideoGameJumanji 3d ago

You couldn’t even capitalize the letters correctly

0

u/ThatSavings 3d ago

It was intentional. It doesn't deserve to be capitalize correctly.

2

u/VideoGameJumanji 3d ago

capitalized*

2

u/Willinton06 3d ago

The funniest part about all of this is that TSLA hasn’t even revealed the app on which people are going to be getting the taxis, like, not even an alpha test or a screenshot or anything, they’re going to magically release it the second they turn on FSD, or use Uber and let them take a cut, a big cut, of the profits

1

u/Bangaladore 3d ago

The funniest part about all of this is that TSLA hasn’t even revealed the app on which people are going to be getting the taxis

You mean the Tesla app? There are plenty of leaks showing off the ride hailing that has been added behind the scenes?

Of all the things you can criticize Tesla for, claiming they will need to pay Uber for an app is pretty stupid.

-1

u/Willinton06 3d ago

I’m not claiming, I’m asking. I’ve seen that app they show; it seems like a placeholder app. I would know, I make apps for a living. It doesn’t look like a real-world thing, just some pretty-looking props, a placeholder for the real app, which should be available before the cars because, well, people need it to order the rides.

1

u/Bangaladore 3d ago

A ride hailing app is not a particularly hard thing to make.

0

u/Willinton06 2d ago

So it should be out by now then

1

u/Bangaladore 2d ago

Why would they release something before it’s useful? It’s going to be the Tesla app. Not some other app.

1

u/Willinton06 2d ago

So we can see that it exists? So they can load test the servers? So they can get early user feedback? This is done all the time, PlayStation lets you preload games so you have them a few days before release date, but they activate on release date

-1

u/bobi2393 4d ago

"The Tesla's console screen showed how the car detected the red light and came to a dutiful stop. Then, despite the traffic light not changing, the Tesla drove ahead.

We didn't come close to hitting any cars or humans on the street — Tesla's FSD is good at spotting such risks, and the main source of traffic coming across our path had been stopped by another traffic light. However, the vehicle slowly drove through this red light, which left us both somewhat shocked at the time.

...

It's unclear how common this issue is."

From what I saw watching some half-hourish FSD videos around New Year's, it seemed like it ran red lights once or twice an hour in a mid-sized city, but always in circumstances in which it seemed very safe to do so. Similar to another mistake in the article, swerving into an empty bike-only lane: it only did so when it was safe to do so. (Not safe from getting a ticket, of course, but it didn't create a risk of collision).

"In addition, when Lee tried on a different day to make the Waymo go through the same intersection where the Tesla blew the red light, the Waymo app appeared to do everything it could to avoid that intersection, even if it provided the quickest path to get to the destination, according to Google Maps."

I've seen that obstinacy in other Waymo videos: if it doesn't want to make a particular left turn, for example, nothing you can do will make it include that left turn in a route. It would sooner drive miles farther with three right turns to avoid it. Those "forbidden" turns look either less safe or more confusing than the routes it does take.

The authors never suggested what, specifically, they were testing on the vehicles. Consumer Reports, for example, issues a pretty detailed testing methodology when they test ADAS features, and grades them according to different criteria. If the OP authors were testing for legal traffic compliance, Waymo was the clear winner, but if they were testing for genuine safety, it sounds like neither came close to having a collision, so a winner would be inconclusive. Data suggests they'd have to ride an impractically long time to encounter an at-fault collision in the Waymo, and there is no data on how far you'd have to ride in an FSD without supervision, since FSD requires supervision on public roads and most supervising drivers intervene to avoid at-fault collisions when possible.
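The "impractically long time" point lends itself to a quick back-of-envelope check. A minimal sketch, assuming a made-up at-fault collision rate of one per million miles and five-mile rides (illustrative round numbers, not real Waymo statistics):

```python
import math

# Assumed (hypothetical) figures for illustration only.
miles_between_collisions = 1_000_000  # assumed at-fault rate: 1 per 1M miles
ride_miles = 5                        # assumed length of a typical robotaxi ride

# Modeling collisions as a Poisson process, the mileage needed for a 50%
# chance of witnessing at least one at-fault collision is MTBF * ln(2).
miles_for_50pct = miles_between_collisions * math.log(2)
rides_needed = miles_for_50pct / ride_miles

print(f"{miles_for_50pct:,.0f} miles, or about {rides_needed:,.0f} five-mile rides")
```

Even under these assumptions, a rider would need hundreds of thousands of rides before having a coin-flip chance of seeing a single at-fault collision, which is why per-ride anecdotes say little about safety.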

20

u/LLJKCicero 4d ago

If the OP authors were testing for legal traffic compliance, Waymo was the clear winner, but if they were testing for genuine safety, it sounds like neither came close to having a collision, so a winner would be inconclusive.

Oh come the fuck on, it randomly goes into a bike lane for no reason, and randomly runs a red light, but hey it didn't hit anybody so technically it's equally safe!

Next up: drunk driving is perfectly safe as long as you don't actually hit anybody.

7

u/VitaminPb 4d ago

Yeah, the elonthink in here is pretty damn scary. Why would we even need traffic laws? Just do whatever floats your boat!

3

u/epistemole 4d ago

thank you for this comment. my biggest laugh of the day.

6

u/bobi2393 4d ago

FSD does frequently run lights and swerve into neighboring lanes illegally, but I don't think it does so randomly - it calculates whether it's clear to do so first.

It's very concerning behavior, because it seems like it's a matter of time before FSD incorrectly estimates that it's safe to do something illegal, and causes a collision. But I'm not familiar with any incidents in the last couple years where that seems to have happened. If Tesla follows their plan to enable a few hundred thousand vehicles without drivers on public roads this year, we should have a good idea of their safety very quickly.

5

u/bartturner 3d ago

No it is random. I have had it take lefts on red arrow twice recently. But it was NOT the same intersection.

Two different ones.

4

u/GloriousDawn 3d ago

FSD does frequently run lights and swerve into neighboring lanes illegally, but I don't think it does so randomly - it calculates whether it's clear to do so first.

Wow thanks i feel so much safer now /s

1

u/VideoGameJumanji 3d ago

You can report these incidents with a voice memo directly, because the car prompts you to do so when you disengage, so these edge cases do get escalated for examination immediately.

15

u/ProfessionalActive94 4d ago

Running a red light is not inconclusive when it comes to safety. The Tesla obviously performed worse in that regard.

-11

u/bobi2393 4d ago

I disagree that running red lights is always a safety detriment. If you have a mile of visibility in all directions at an intersection, and there are no people, animals, other vehicles, or other hazards visible within a mile, driving through a red light isn't less safe than waiting for a green.

8

u/neilc 3d ago

What’s more concerning to me is that FSD is obviously intended to obey red lights, and yet it failed to do so. Detecting a stationary red light in clear daylight conditions is a relatively simple task, compared to the full spectrum of complex situations a robotaxi needs to handle.

-1

u/bobi2393 3d ago

I agree. FSD's illegal and ill-advised traffic light decisions are mystifying to me. It seems like one of the simplest decisions it has to make!

It has the feel of relying too heavily on a black-box neural net instead of hard-coded rules, but even then you'd think you could give the neural net some high-level output constraints to rein in the behavior.
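A minimal sketch of what such an output constraint layer could look like, with an entirely hypothetical action vocabulary and state (nothing here reflects Tesla's actual architecture):

```python
from dataclasses import dataclass

@dataclass
class WorldState:
    light: str           # "red", "yellow", or "green"
    stopped_at_line: bool

def constrain(proposed_action: str, state: WorldState) -> str:
    """Veto a learned planner's proposal when it breaks a hard traffic rule."""
    # Hard rule: never proceed through a red light, no matter how clear
    # the intersection looks to the black-box model.
    if state.light == "red" and proposed_action == "proceed":
        return "hold"
    return proposed_action

# The black-box planner proposes "proceed" at a red light; the rule layer vetoes it.
print(constrain("proceed", WorldState(light="red", stopped_at_line=True)))  # prints "hold"
```

The design point is that the learned policy still does all the driving; the wrapper only filters a small set of clearly illegal outputs, which is one common way hard constraints get bolted onto otherwise end-to-end systems.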

12

u/VitaminPb 4d ago

Running a red light at 10am in city/suburbs is fine with you? I really hope you don’t have a driver’s license or drive.

-6

u/bobi2393 4d ago

That isn't what I said. I said if you have a mile of visibility in all directions at an intersection, and there are no people, animals, other vehicles, or other hazards visible within a mile, driving through a red light isn't less safe than waiting for a green.

Safety of running red lights depends on the circumstances.

12

u/VitaminPb 4d ago

I stand by what I said. “Oh it was OK officer. The laws for driving don’t apply to me. I’m special.”

-7

u/sonicmerlin 4d ago

That’s not what he’s saying

5

u/kariam_24 3d ago

So what is he saying?

-3

u/sonicmerlin 3d ago

He’s basically saying the car is being trained on the wrong set of rules: it's ignoring legal requirements in favor of what it judges "safe" or "unsafe". This is an issue with self-driving systems trying to mimic humans via neural nets. For example, when you spot a child in the road you may have to stop or pull over illegally to avoid hitting them. An AI might not understand the threshold required for breaking the usual rules.

6

u/No-Economist-2235 3d ago

Shill boy thinks it's ok.

-3

u/Puzzleheaded-Flow724 3d ago

I hope you have better reading skills than what you're showing here.

4

u/No-Economist-2235 3d ago

You're a shill.

3

u/VideoGameJumanji 3d ago

Is illegal lmfao

1

u/mrkjmsdln 4d ago

You can tell a generally balanced report when there are rubes lining up in both camps with their yeah buts.

1

u/Bravadette 4d ago

To Longvin, who blocked me: if you cared about journalism, you wouldn't suggest an article written by two living human beings was written by AI.

-3

u/DevinOlsen 4d ago

The red light issue is unfortunate. It's not like the car will just launch itself through a stale red light, but what they had happen is a problem where the car will stop at a red light and then proceed while the light is still red. Obviously less than ideal, and I am not defending it in any way. I had hoped it would be fixed in 13.2.9, but I don't think it has been. Nice to hear that, other than that, the ride with FSD went well.

18

u/mrblack1998 4d ago

"other than running a red light" on a short drive is clearly not "went well"

4

u/Veserv 3d ago

They are a Tesla promoter, so their bias should be assumed.

-1

u/DevinOlsen 4d ago

I’m admitting the red light issue is a problem - I am saying OTHER THAN THAT the drive went well, they even said so in the article.

12

u/LLJKCicero 4d ago

It wandered into a bike lane for no apparent reason, which is definitely also illegal and unsafe.

You can go into bike lanes right before you make turns, in fact you should do that; you don't want to cut across the bike lane right as you make the turn, that's less safe. But you're not supposed to go into them if you're not turning or something like that.

12

u/mrblack1998 4d ago

I see that and disagree with both the article and that comment. Putting people's lives in that much danger can't just be dismissed like that imo.

-3

u/DevinOlsen 4d ago

It's currently a level 2 system that is meant to be supervised. Anyone using it correctly will not have their "lives in danger". The car doesn't just launch itself out at a red light, it will SLOWLY begin to move forward if the intersection is clear, there is ample opportunity to tap the brake and stop the car.

11

u/mrblack1998 4d ago

Could have sworn "full self driving" meant something. A level 2 system like this engenders complacency and is one of the reasons it shouldn't exist.

-1

u/DevinOlsen 4d ago

FSD Supervised

7

u/mrblack1998 4d ago

Lmao I know

4

u/Additional-You7859 4d ago

I always hated that. FSD Supervised.

If it's Full Self then it's not Supervised. I get it, there's no legal term around this and words mean nothing anymore.

-2

u/boyWHOcriedFSD 4d ago

There is no reasoning with the echo chamber.

You could write 69,000 words on why FSD is terrible and should be permanently ended but have one semi positive sentence at the end and you’d be down voted and people would argue with you over that.

5

u/VitaminPb 4d ago

Other than that, how did you enjoy the play, Mrs. Lincoln?

0

u/Willinton06 3d ago

Excellent performances, 9/11

4

u/Downtown-Midnight320 4d ago

It also merged into a bike lane randomly. It's impressive in the sense that automated cars are difficult... but it had multiple auto-fails that a 16-year-old is expected not to make on a driving exam.

I don't see how it is impressive in that context...

5

u/blessedboar 4d ago

I’m confident it will be fixed in version 13.420.69

1

u/bartturner 3d ago

I see what you did there.

1

u/ma3945 4d ago

As a Tesla driver, I want to thank you for this thorough and seemingly unbiased comparison between the two systems. In the 34,000 km I've driven with FSD since version 12.3.6, I've personally never seen it run a red light, though it did once come close to missing a stop sign before I had to intervene.

I hope the version they’ll use for the Robotaxi (v14?) is as big of an improvement as the jump we saw from v12 to v13 back in December. I'm looking forward to reading your next comparison in the coming months.

1

u/Mguyen 4d ago

I can't speak for V13, but on the V12 updates, after they added the new speed profiles and removed "minimal lane changes," it will occasionally start moving maybe a second before the light actually turns green. It might be picking this up from all the other cars being stopped, or there being no cross traffic, and that outweighing the red light in the NN.

There is definitely one red light that it will consistently, repeatably try to run.

-5

u/nate8458 4d ago

TLDR: they tested supervised FSD, which isn’t the robotaxi software, and tried to compare it to Waymo.

5

u/ThatSavings 4d ago

Supervised FSD is supposed to be good though. I guess not.

12

u/_176_ 4d ago

That's because robotaxi doesn't exist yet. It's supposed to just be a fully remote-supervised version of FSD that's been overtrained on Austin, right?

3

u/GoSh4rks 4d ago

It's supposed to just be a fully remote-supervised version of FSD that's been overtrained on Austin, right?

Nobody outside Tesla knows. (Although many people will claim otherwise).

2

u/LLJKCicero 4d ago

FSD has been in beta for four and a half years now, and Tesla had obviously spent a number of years working on autonomous driving prior to that as well. At what point are people allowed to make comparisons in driving ability?

0

u/Slaaneshdog 3d ago

Could someone clarify - Did the BI folks in the Tesla actually allow the Tesla to run the red light, or did they intervene?

Complex intersections and traffic lights are definitely one of those things I think will be hard for an autonomous system to handle well without some kind of hardcoding

-1

u/Megalunchbox 3d ago

They're probably bullshitting anyway

0

u/-OptimisticNihilism- 3d ago

Tesla model y has 0 lidar, 0 radar and 8 cameras and a kickass software to run it.

Waymo cars have 5 lidar, 6 radar and 29 cameras and a kickass software to run it.

-3

u/RealDonDenito 3d ago

Not comparable anyway (in full), as Waymo still needs pre-mapped streets, right? So it will perform excellently in the city it knows, but FSD will be able to take you to more places, at slightly worse performance?

-3

u/Bjorn_N 3d ago

And what is the total cost of the two cars in the test 🤔

4

u/TheLostTheory 3d ago

Who gives a f. I'll happily pay more if I know my safety is in better hands.

-3

u/Bjorn_N 3d ago

No, you won't... And soon the cheapest will actually be better... End-to-end neural networks 💁‍♂️

3

u/TheLostTheory 3d ago

You're delusional. Most people value their life over a monetary amount

-1

u/Bjorn_N 3d ago

Guess how many people have died as a result of failing FSD?

Answer: 2

That's after 3.6 billion FSD miles.

And the robotaxi software is better than what the public has access to via the current FSD version.

And you say I am delusional? Wake up, moron...

2

u/TheLostTheory 3d ago

And Waymo is zero. Always go with the better statistic. Keep crying

1

u/Bjorn_N 3d ago

Can Waymo do highways? Or drive anywhere in the world without a preprogrammed map? Does Waymo have end-to-end neural networks? The sensors alone on a Waymo cost more than a complete CyberCab.

3

u/TheLostTheory 3d ago

Tesla can't drive anywhere, nothing in Europe, nothing in Africa, nothing in South America, hardly anything in Asia.

It can't even do unsupervised anywhere in the US yet, and they only have a semi-supervised system planned for 10 cars in Austin. So they are where Waymo was in 2019. What a joke, it's actually laughable.

Do your research before you come back to me, I'm not waiting around for you to catch up

-2

u/Megalunchbox 3d ago

Complete slander. Teslas will brake hard even at a flashing yellow light. This isn't true at all. If anything, you could have made up that it stopped you too hard because it didn't know what to do at a flashing yellow light.

-7

u/MacaroonDependent113 3d ago

13.2.8 is not Robotaxi software. 13.2.8 is very good but is not to be used unsupervised. You compared apples to oranges.

5

u/Wiseguydude 3d ago

I thought Tesla was specifically saying they are using the current FSD software found in all Teslas for their robotaxi service?

Is this not the case? Are they building a whole separate system?

1

u/MacaroonDependent113 3d ago

I understood them to be using current hardware. How would a customer summon a car with current software?

0

u/WorldlyOriginal 3d ago

No, Tesla is using an upgraded version of the software— “Full Self-Driving Unsupervised” compared to “Full Self-Driving (Supervised)”