Robo-Taxis, Tesla Troubles, and the Path to Safer Roads
This week we cover recent controversies surrounding Tesla’s Full Self-Driving technology, revealing how manual interventions and YouTuber feedback fuel the myth of advanced AI automation. We then dig into broader issues of automotive safety, discussing open-source software’s pitfalls and the potential cybersecurity risks in modern vehicles. Other highlights include recalls from manufacturers including GM, Toyota, Nissan, Ford, and Lucid.
Links:
- https://www.autoblog.com/2024/07/09/tesla-self-driving-bias-musk-and-influencers-get-priority-in-autonomous-driving-ai-development/
- https://www.washingtonpost.com/technology/2024/06/27/tesla-death-wife-lawsuit-damages/
- https://www.popsci.com/technology/self-cleaning-tesla/
- https://jalopnik.com/fan-boy-finally-criticizes-tesla-after-2-broken-tesla-c-1851573471
- https://www.theverge.com/2024/7/6/24193094/phoenix-waymo-car-pulled-over-traffic-stop-oncoming-lane-police
- https://gothamist.com/news/jfk-airports-newest-ride-not-planes-but-self-driving-shuttles
- https://static.nhtsa.gov/odi/rcl/2024/RCLRPT-24V491-8985.PDF
- https://static.nhtsa.gov/odi/rcl/2024/RCLRPT-24V498-2475.PDF
- https://static.nhtsa.gov/odi/rcl/2024/RCLRPT-24V482-8101.PDF
- https://static.nhtsa.gov/odi/rcl/2024/RCLRPT-24V486-3579.PDF
- https://static.nhtsa.gov/odi/rcl/2024/RCLRPT-24V493-4480.PDF
- https://static.nhtsa.gov/odi/rcl/2024/RCLRPT-24V497-2271.PDF
- https://www.autosafety.org/support-us/
Subscribe using your favorite podcast service.
Transcript
note: this is a machine-generated transcript and may not be completely accurate. This is provided for convenience and should not be used for attribution.
[00:00:00] Introduction to the Podcast
[00:00:00] Anthony: You’re listening to There Auto Be A Law, the Center for Auto Safety podcast with executive director Michael Brooks, chief engineer Fred Perkins, and hosted by me, Anthony Cimino. For over 50 years, the Center for Auto Safety has worked to make cars safer.
Hello listeners. Coming to you live from the internet, it’s the Center for Auto Safety podcast. Mr. Fred Perkins is dialing in remote, so that’s why his voice has a very cool, kind of mid-rangy sound to it. How you doing, Fred?
[00:00:41] Fred: And good morning. Good morning to you all.
[00:00:45] Anthony: Great. Let’s jump right into it.
[00:00:47] Tesla’s Full Self-Driving Controversies
[00:00:47] Anthony: This has been a fun week for a little company called Tesla. Oh boy. I’m gonna start off with an article on Autoblog. So we’ve all heard about how full self driving Tesla is. The car drives [00:01:00] itself. You can sit back, have a cup of coffee, watch a movie, read a book. We got it, bro. How it’s all artificial intelligence and great and it does all this stuff and the system is learning and smart.
It’s more artificial than intelligent, because this article shows that basically there’s an entire team that will fix the maps for certain users, such as Elon Musk and some YouTubers who go out and complain. So instead of the car being able to identify roads and things properly, Elon will have a problem and be like, Ah, this road doesn’t work.
I hate this. And the people go, Oh, yes sir. Yeah, right away, sir. I will go fix that. At one point, apparently, they even had the California Department of Transportation repaint the lines in the road. But other than that, yeah, they have people sitting there manually marking up maps, being like, Nope, this is where the exit sign is, this is what a stop sign is.
They gave an example in there of a YouTuber who was having problems navigating Lombard Street in San Francisco, which is [00:02:00] basically a long series of right angles, just turning constantly, and the software couldn’t figure it out. So they put in these kind of hidden barriers so the car could navigate it.
Long story short, if you want to use full self driving, follow Elon’s commute and it will be more than okay. That’s my takeaway. Yeah.
[00:02:20] Fred: It’s good to be the king.
[00:02:25] Michael: It’s also good to be a YouTuber who’s very vocal about the problems you’re having with Tesla vehicles. Tesla essentially is scouring the internet for any YouTubers or other folks who are posting problems that they’ve had in Autopilot or in Full Self-Driving, and when they see these problems, they go back to their team, and it looks like sometimes they’re sending Tesla vehicles out to explore those areas where the problems take place. They even have these, [00:03:00] what they call a small army of human data annotators, which doesn’t sound very artificially intelligent to me, but these folks are basically continuously improving how the cars work and how they’re responding by reviewing the footage that’s posted on the internet.
What this article made me think is that if you’re using Autopilot or Full Self-Driving, you might have better performance if you’re driving along Elon’s route to work, wherever he is now, is he in Austin now? Or if you’re driving in the neighborhood of one of the YouTubers that posts a lot of videos of Tesla bugs as they happen.
So it’s somewhat concerning. They’re not doing this for John Q. Public and for random folks who aren’t making their problems known on the internet, which is most of us. And so it’s interesting. It’s also probably a great way to make sure that your PR shop is in order if you’re constantly responding [00:04:00] to the people who are complaining on the internet.
[00:04:04] Anthony: Now, Elon’s actually made fun of GM and Ford for their driver assistance systems because they use pre-mapped roads. Oh, look at them. But essentially that’s what Tesla’s doing, after the fact, for Elon and some YouTubers. From the article: It seemed like we were purposely making his car, Elon’s, better, to make Autopilot look different than it was, a former employee said.
It felt dishonest. Now, when I think Tesla and Elon Musk, I think, you know, the upper echelon of human honesty.
[00:04:39] Fred: Note that that honest statement was made by a former employee.
[00:04:45] Anthony: That is an excellent point.
[00:04:48] Michael: They make a good point. The YouTubers, these folks who are doing this for clicks and attention, are trying to break the system, right?
They’re trying to [00:05:00] go out and screw things up so they can run to the internet and expose Tesla, or run to the internet and say, Hey, Elon, can you fix this for me, buddy? And buddy up to Elon, that kind of thing. There is some good in the company going and fixing those problems.
However, it seems like it’s not a very well planned strategy when you need to address what’s happening on every road in America versus the areas where these YouTubers and influencers are driving.
[00:05:32] Tesla’s Legal Battles and Safety Concerns
[00:05:32] Anthony: Next up in Tesla news is an article in the Washington Post. We mentioned this a while back.
There was a wrongful death lawsuit against Tesla. It was a Tesla employee who relied on Full Self-Driving and unfortunately died. And so the wife of this person is suing Tesla, obviously saying, Hey, your stuff is crap and my husband died using it. And Tesla’s fighting this. From the article: Tesla’s response, submitted Thursday, states that the company disputes that it is liable for any [00:06:00] damages whatsoever, and requests that the case be moved from state to federal court.
I get why they’re fighting it. Sure. Who wants to admit that their software is garbage and could kill people? But what I don’t understand is, why do you want to move it from state court to federal court? How does that help Tesla?
[00:06:18] Michael: The case might be filed in a location where juries are typically more favorable to plaintiffs.
That’s a common objection of auto companies, not just Tesla. The biggest reason they’d want to fight this case is apparently the driver was intoxicated. The attorney for the driver is disputing those toxicology results in the case, and that’s a problem.
Ultimately, it occurred on this road called, I think, Bear Mountain or Bear Creek Road, which is a very curvy two-lane road where Autopilot or Full Self-Driving should not be operating in the first place, right? [00:07:00] They should be restricted to highways where there are exits, no cross traffic, very specific areas.
And this is our biggest problem with Tesla. I’m sure you’ve heard me discuss it ad nauseam here, but they need to be geofenced. They need to be pegged on a map to the highways where they can operate. Consumers should not be allowed to turn on these systems on roads where they haven’t been tested and where the software hasn’t been proven to work properly.
And we’ve seen a number of crashes that have taken place in areas where Autopilot or Full Self-Driving never should have been allowed to be turned on by the vehicle. And that’s part of Tesla’s over-reliance on its camera systems and its AI. They want their AI and their cameras to basically look out at the road and determine whether or not that’s the type of road where the technology could be used, when in fact Ford and GM, with their various [00:08:00] systems like Blue Cruise or Super Cruise and Ultra Cruise, are literally going by the map of where the divided highways are. We know where those divided highways are, Elon. You don’t need to prove your AI is great, because it’s clearly not working to determine the proper locations for the technology to be used.
Why not just peg it to a map and end this problem once and for all, instead of relying on cameras to do the work that we’ve already done with GPS and mapping?
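For readers curious what Michael’s map-pegging idea amounts to in practice, here is a toy geofence check: refuse to engage driver assistance unless the vehicle’s GPS position is close to a whitelisted divided-highway segment. This is a hypothetical sketch only; the whitelist, function names, coordinates, and 50-meter tolerance are all invented for illustration and do not describe any manufacturer’s actual implementation.

```python
import math

# Hypothetical whitelist: (lat, lon) points sampled along approved divided highways.
# A real system would use road-segment geometry from a map database.
APPROVED_HIGHWAY_POINTS = [
    (40.7484, -74.0350),
    (40.7600, -74.0200),
]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def may_engage(lat, lon, tolerance_m=50):
    """Allow driver assistance only near an approved highway point."""
    return any(
        haversine_m(lat, lon, p_lat, p_lon) <= tolerance_m
        for p_lat, p_lon in APPROVED_HIGHWAY_POINTS
    )

print(may_engage(40.7484, -74.0350))  # True: on a whitelisted stretch
print(may_engage(35.0000, -90.0000))  # False: random two-lane road, engage refused
```

The gating logic, not the map format, is the point: off the whitelist, the system simply declines to turn on, rather than asking cameras and AI to judge the road type.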
[00:08:28] Anthony: Quoting from the book of Elon in 2016, full self driving is pretty much solved. You’re wrong, Michael. Wow, that was almost
[00:08:37] Michael: ten years ago.
They had solved it.
[00:08:40] Anthony: I know. But hey they’re solving something else.
[00:08:43] Elon’s RoboTaxi Vision
[00:08:43] Anthony: Elon is talking about his RoboTaxi. RoboTaxi, coming out. And hey, you bought a Tesla? Why not have it drive around and get a job? At night, while you sleep, your car will be out there getting a job and paying for itself, taking strangers around [00:09:00] with their smells and foods and stinks in your car. So what do you need for that?
So Tesla’s patented a self-cleaning technology for its robo taxis that don’t exist yet. They may include disinfecting UV lights, heating plates, and maybe some sprays. I don’t know. I don’t know what they’re thinking. I like the idea of this. I want everywhere I am to be self-cleaning, that would be great. But I think this is just nonsense, because remember, he said all of the cars since 2015, all of them, are ready to be robo taxis.
But I don’t think any of them have UV cleaning or disinfecting systems in them.
[00:09:40] Michael: I think we all know, and most of the folks listening know, that none of those vehicles are ready to be robo taxis. They don’t even approach the technological level that is required to be robo taxis riding around without a supervisor. All the Tesla systems that have been produced [00:10:00] so far are going to require a human supervising the automation, and from what we’ve seen of Tesla’s Full Self-Driving, there’s no way they can put that technology in robo taxis, and I hope they’re not going to, because it requires a human there, and there are all sorts of steering nags. But from what we’ve heard from people who drive in Full Self-Driving, and we haven’t done it ourselves because frankly I’m scared to, you’re constantly having to manage and monitor the vehicle’s activity in order to ensure that you’re not crashing.
So if you put that same technology in a robo taxi with no driver, you’re going to see a lot of problems. I would suggest that you’re going to see a lot more problems than we’ve seen even with Cruise, and definitely a lot more problems than we see with Waymo. I think it’s going to be a disaster.
I’m sure we’ll talk about it a lot as we approach that big August 8th date on which they’re going to announce or reveal the [00:11:00] robo taxi. We’re still not sure what’s going to happen there. I would caution anyone watching what comes out on that date to be very skeptical of the claims being made.
[00:11:10] Anthony: I’m imagining it’s a guy dressing in a taxi outfit. That’s it, right?
[00:11:17] Fred: I want to remind the listeners that no robo taxi that’s out there now has shown any evidence of exceeding DOT technology readiness level five. In order for the technology to be considered mature, it has to be at least technology readiness level eight.
So there’s a long way to go for any of these technologies. And in fact, if Tesla comes out with a robo taxi tomorrow, or next month, or who knows when, it’s also going to be at technology readiness level five. All of the rest of it is just window dressing, and that’s true not only for [00:12:00] Tesla but for all of the other contenders in the robo taxi arena.
[00:12:06] Anthony: So, Fred, I want to ask you real quick, as our chief medical officer: we’ve seen things fail in cars, like airbags that explode and go off unintentionally, different things inside cars going off when they shouldn’t. If you have a self-driving car with this UV disinfecting technology, what could possibly go wrong if this self-cleaning technology goes off while I’m a passenger in it? Can I make microwave popcorn while this happens? Will I improve my tan?
[00:12:37] Fred: You’ve heard of pepper spray, right?
[00:12:39] Anthony: Oh yeah.
[00:12:41] Fred: Yeah. Who knows what can go wrong? But my opinion is that the self-cleaning vehicle feature is way down on the list of things you should be concerned about if you get into one of these things.
[00:12:59] Anthony: Alright, we’re [00:13:00] looking forward to your Yelp review. Hey listeners, do you own a Tesla, and do you think, hey, I want it to go out and prowl the streets at night like a dirty little car? Boy. But hey,
[00:13:13] Michael: I’ve seen the road. I’ll take care of it from there. The robo maid, I guess I’m calling it a robo maid.
The patent that Tesla has gotten looks like something that can do some minor disinfecting, but nothing that’s going to take care of all those drunk guys in your car, or animals, or any kind of real mess. It’s something that, you know, you as the owner are still going to have to put some elbow grease into fixing.
[00:13:43] Fred: The hamburger wrappers are going to stay on the floor.
[00:13:47] Michael: Yeah, it’s not going to get those McDonald’s French fries out from the sides of your seat. Sorry, kids.
[00:13:53] Anthony: I want to get into one of these now. I’m going to leave all sorts of stuff behind.
[00:13:57] Michael: Hey, it’s like your daily shower.
[00:13:59] Anthony: [00:14:00] Yeah. Hey, look, I ate a burger in a shower once. What are you talking about?
I don’t know.
[00:14:03] Fred: I think they could have worked on a self-cleaning refrigerator. That would have been a lot more practical. I would love that.
[00:14:10] Anthony: Honestly, I’m afraid to turn on the self cleaning function of my oven. Because I read the manual, and it’s we will make the oven hotter than it normally gets. And I’m like, eh, I don’t know.
I’ll wait for like fall when they haven’t turned the heat on in the building yet, and it’s chilly. Maybe I’ll do that. But anyway, listeners, let us know if you have a Tesla that you’d let prowl the streets at night. I’m honestly curious, maybe there’s something we’re missing. And let us know. And while you’re doing that, click subscribe.
Go to autosafety.org and click on donate. Tell all your friends, rate us five stars. And get ready for our final Tesla story of the day. I think it’s our final Tesla story.
[00:14:44] Tesla Fanboys and Cybertruck Issues
[00:14:44] Anthony: So we know Tesla’s got a lot of fanboys. I don’t think there’s a lot of fangirls. And they’re not fan men, they’re fan boys.
There’s an article in Jalopnik where this one Tesla fanboy buys a Cybertruck and it broke [00:15:00] twice. And so they’ve given him two different Cybertrucks in four months. Two weeks after taking delivery, this guy’s Cybertruck failed on the side of the road, throwing all kinds of error messages as he documented on his YouTube channel.
Later on, he talks about how there’s other issues like the windshield having what he described as a line or ripple in it, the driver’s seat being too close to the center console, the headliner trim sagging. And his response to this is stuff happens and we just have to work through it.
[00:15:29] Michael: Wow. I don’t know if the Center for Auto Safety would exist if every consumer out there had that opinion.
[00:15:35] Anthony: So last Christmas, I bought my wife a new gardening belt, because gardening is her profession. And it fell apart after six months. We talked to the company and were like, Hey, can you fix this?
And they’re like, Sure, send it in and we’ll fix it, but your wife’s using this wrong. I just thought, oh, did I just contact Tesla? What is going on here? So I want to know why these Tesla fanboys are like, yeah, I put down a hundred grand [00:16:00] and things are crappy, but I have to work through it. Are they like, do they not want to go to a dominatrix?
Is that what it is? Because it sounds like it’s,
[00:16:06] Michael: I think it’s just, first of all, I guess we should go ahead and just question their intelligence generally. But beyond that, it sounds like they have way, way too much money. Like they don’t, they have so much money that they don’t even care, which is scary in some ways.
The Tesla Cybertruck is now the best-selling electric pickup truck in the United States, beating out the Ford F-150 Lightning EV. That’s not something I expected to happen. The Cybertrucks,
[00:16:38] Fred: that’s like a rock emerging from a tide that’s going out. It’s a steadily smaller market share.
And a little spurt in the sales for the Cybertruck, because it’s new and flashy and part of the Elon religion, just means that it’s stable while the rest of them are receding. [00:17:00] I don’t think it has a lot of meaning.
[00:17:03] Michael: I think it’s temporary at best. I don’t think Americans are going to flock to the Cybertruck in large numbers.
They’re certainly not flocking to electric pickup trucks in large numbers at this point. Pickup trucks are the best-selling cars in America; at the top of the list every year are Ford, GM, and Chrysler pickup trucks.
Once electric vehicles get better batteries that work for farmers and for folks who have to drive long ways without stopping for hours to charge their vehicles, then I don’t think the Cybertruck is going to be anywhere near the top of the list. You’re right, Fred. It’s like the tide going out and the rocks.
[00:17:44] Fred: Yeah, I don’t think that, I don’t think that the electric trucks are ever going to get much purchase because you know what, they’re twice as heavy as other trucks and pickup trucks have to work in muddy conditions, so you’re automatically building in a lot of restrictions in the [00:18:00] operating scenarios for those electric trucks compared with a conventionally powered truck and farmers don’t like to get stuck.
People who have to use pickup trucks to earn a living don’t want to get stuck in the mud because they’ve got some kind of funky.
[00:18:17] Anthony: You can see a number of YouTube videos of a cyber trucks getting stuck in the mud and then it’s Ford F 150s pulling them out. It’s hilarious. Continuing on this article, it’s fascinating. One of the issues that was problem with this brand new cyber truck was that the ground wire for the electrical system was corroded.
And that’s just they had to have installed it corroded. Like, there’s no way that happens in a couple months. That, years maybe fascinating. But the guy, Lamar MK says, Tesla made it right guys, I got a brand new Cybertruck, thank you. And then that Cybertruck dies on him.
[00:18:59] Michael: So he’s going [00:19:00] to get three, three Cybertrucks in three months, right?
[00:19:03] Anthony: Definitely on a list somewhere. That’s for sure. Fred?
[00:19:10] Fred: It’s part of the religion, I think. I think a lot of people buy Teslas in part because they want to stick it to the man. They don’t seem to recognize that Elon is the man to whom they’re trying to stick it. And instead of sticking it to the man, they’re shoveling money into his maw. And once you’ve adopted a religion, boy, it’s really hard for people to shake you from that faith.
[00:19:34] Anthony: It just makes me think of that old saying, government, get your hands off my Medicare.
[00:19:38] Fred: Something like that.
[00:19:40] Anthony: Ah, alright, thank you. That’s the end of the world of Tesla.
[00:19:44] NHTSA’s Speed Prevention Campaign
[00:19:44] Anthony: Let’s go to NHTSA. This is Speed Week. Speed, is it Speed Week? Do they actually call it Speed Week, Michael? Speed Week.
[00:19:51] Michael: No, they do not call it Speed Week. They do an annual speeding prevention campaign. This year they brought in some NASCAR [00:20:00] drivers and some other folks, and they do a national media buy and put out ads that try to show people the consequences of speeding and crashes.
And they do some press around it, but they also released some data, which I think is probably the most interesting part of this. It shows that about 29 percent, between a quarter and a third, of all fatalities that happen every year are speeding related. And when you say speeding related, that’s where the speeding was reported by the police or the driver was charged with a speeding offense in that crash.
I think you could reasonably suggest that there are probably a lot more crashes, fatal crashes and others, happening that are speed related but just aren’t documented by the police in their reports. And 29 percent of all traffic fatalities is already an obscenely high number, and one more [00:21:00] reason why we think we need effective speed prevention technology in vehicles. Enforcement is not going to be good enough, and some people are going to speed no matter how many beeps and buzzes you put in front of their face. In some ways, we just need to move towards cars that can’t speed.
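The "cars that can’t speed" idea Michael describes is usually called intelligent speed assistance (ISA): the car looks up the mapped speed limit for its current road segment and refuses to exceed it. Here is a minimal sketch of that core logic; the segment IDs, limits, and function name are invented for illustration, not drawn from any real system.

```python
# Hypothetical mapped speed limits, keyed by road segment ID (km/h).
SPEED_LIMITS_KPH = {
    "merritt_pkwy_seg_12": 88,   # roughly 55 mph
    "residential_elm_st": 40,    # roughly 25 mph
}

def cap_commanded_speed(segment_id, requested_kph, default_limit_kph=50):
    """Clamp the driver's requested speed to the mapped limit for this segment.

    Unknown segments fall back to a conservative default limit.
    """
    limit = SPEED_LIMITS_KPH.get(segment_id, default_limit_kph)
    return min(requested_kph, limit)

print(cap_commanded_speed("merritt_pkwy_seg_12", 120))  # 88: capped at the limit
print(cap_commanded_speed("residential_elm_st", 35))    # 35: under the limit, unchanged
```

A real ISA system also has to handle map errors, temporary limits, and override policy, which is where most of the regulatory debate sits; the clamp itself is the easy part.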
[00:21:20] Anthony: Yeah, I was surprised by this.
It’s 2022 data, I guess it takes a while to compile all this stuff, but it said there were an estimated 300,595 people injured in speed-related traffic crashes in 2022. That’s amazing. Because I just had to drive down the Merritt Parkway in Connecticut, and I assumed that number would be much higher.
Because my trip back from Maine took eight and a half hours instead of five and a half hours because of a whole bunch of people speeding and crashing into each other.
[00:21:51] Michael: You can also see there are a lot of bad decisions being made by the drivers here who speed, because not only [00:22:00] are they speeding, a fairly large percentage of them are drinking and intoxicated.
And to add to that, more than half of speeding drivers in fatal crashes weren’t wearing their seatbelts, about 52 percent. So there doesn’t seem to be a lot going on upstairs for these drivers involved in these crashes. They don’t seem to be very concerned about their own safety or the safety of other folks on the road.
[00:22:31] Anthony: No, they definitely do not care about other people on the road. All right, here’s a more lighthearted, entertaining story.
[00:22:37] Waymo’s RoboTaxi Mishaps
[00:22:37] Anthony: Waymo, the self-driving robo taxi offering from Google, which also has fanboys, because people think, hey, these things are the best things ever. A cop in Phoenix had to pull one over because it had been driving into an oncoming traffic lane.
So the car’s going the wrong way, the cop lights up the sirens, the [00:23:00] Waymo runs away for a bit, and then eventually pulls over, and the cop gets out and is like, Hey, there’s nobody in this car, and has to give a ticket. And so there’s a video related to this.
[00:23:11] Michael: Yep, please go on. Did you say has to give a ticket?
Because my understanding was he couldn’t figure out a way to give the vehicle a ticket.
[00:23:19] Anthony: Yes, that’s the thing. The cop gets out of the car assuming there’s somebody there. This is a traffic stop, it is potentially a dangerous situation, I imagine the heart rate comes up. And there’s no one inside the car, and, who do I cite for committing this?
And we know in San Francisco that Cruise and Waymo were exempt from any sort of traffic citations, right? Yeah,
[00:23:42] Michael: In Arizona, it says in an article I’m reading, Arizona law allows officers to give out tickets when a robo taxi commits a traffic violation. But officers have to give them to the company that owns the vehicle.
And doing so is [00:24:00] not feasible, according to the Phoenix Police, at least. There’s a lot they need to work on there.
[00:24:05] Anthony: Yeah, so this related video, you can see the body cam footage from the cop and it’s fascinating and then there’s also a clip of somebody in law enforcement talking about how great Waymo is and saying, hey, they’ve given us all this training to help our officers move the vehicles out of the way if there’s an issue and to get into the vehicles and control it because the police are now valets for robo taxi companies.
Clearly the guy in law enforcement was on the take, I’m gonna say it. Yep, in my opinion, in my strong opinion, this guy was getting a little dough. Gettin a little bonus from this robo taxi company, cause either that or he’s an idiot.
[00:24:44] Michael: Yeah, I don’t think that Waymo is paying off police.
But I also don’t think that police need to be spending any of their time babysitting robo taxi operations at all, any more than they need to be spending their time [00:25:00] babysitting drivers on the road, which they do have to, to some extent. But what’s a little troubling here is that we saw Waymo vehicles driving into oncoming traffic a couple of months ago or more, and NHTSA opened an investigation citing that as one of the main problems it’s looking into. And here we are a couple of months later, and Waymo has not been able to figure that out. In this case they’re blaming bad signage in a construction zone, it seems. And humans make those mistakes too, I guess I’d have to say, but the car was driving for 30 seconds in the wrong lane, which suggests that it certainly didn’t have any clue that it was not in the proper lane.
[00:25:47] Anthony: Waymo: still driving drunk. Yeah, Waymo told Fox 10 Phoenix that its cars are three and a half times more likely to avoid a crash than a human being. They cite [00:26:00] no data for this that would be acceptable to any third grader.
[00:26:05] Michael: Correct.
[00:26:07] Fred: Part of this that I don’t understand is how they pulled it over, because I don’t think there’s an algorithm, and I could be wrong, but I’d be surprised if there’s an algorithm that watches for sirens or blinking blue lights in order to automatically pull over. So I’m guessing that there was probably a human being listening in on a microphone who heard the sirens or the beeps or whatever they do in Arizona and pulled the car over. And the video shows the cop chatting with the remote operator over the intercom inside the vehicle.
So I’m wondering if they’re now listening in on all conversations everywhere in the robo taxis, in the event of a cop trying to pull them over or give them directions or whatever, but also probably listening in on the conversations of the people who are in the car. To me, it seems a little creepy and indicative of [00:27:00] a real overreach on the part of Waymo.
[00:27:04] Anthony: Alexa, is Waymo spying on me?
[00:27:07] Michael: Yeah. How many remote drivers do they have operating these things, too? It makes me wonder. It’s difficult to monitor, I don’t know, a lot of cars at once. Is there a secret fleet of humans sitting in a cave somewhere who are driving these vehicles?
There’s a lot of questions here that make me wonder, we’ve talked about remote drivers and the latency issue and whether they should be physically operating the vehicle when you have those types of problems. But even beyond that, you know, are there maneuvers that are being performed remotely that we’re unaware of?
Are these things as fully robo as they’re saying they are, or is there a lot more human involvement there? It’s hard to say, because they’re not telling us. It’s creepy and it’s weird and it’s dangerous. I think this whole experiment should be pulled off the roads for another 10 [00:28:00] years.
So they can get some things together. And so we can get some laws and regulations that are consistent from state to state and federally around this.
[00:28:10] Fred: Agreed. Yeah. It’s also indicative of the technological immaturity. Again, these are only at TRL five. They’ve gotta be up to TRL eight or nine before they can be considered commercially acceptable.
[00:28:24] Gaslight Nominees of the Week
[00:28:24] Anthony: So I think this leads us into this week’s Gaslight nominees, because this is interesting: Waymo has about what, 500 cars on the road? Maybe they do have 500 people monitoring them, or however many are out in the fleet. Maybe it is a one-to-one relationship. Who knows? But my nominee, somewhat tangentially related, would be Tesla and their artificial intelligence we talked about earlier.
Our cars can map the roads and figure everything out for you. Don’t look behind the curtain, where there’s a team of people physically mapping the roads and making sure the cars don’t repeat these mistakes post crash, post incident, after it happens. Ipso [00:29:00] facto, DeLorean, Dominus Vobiscum. Not GM Cruise this week. I should get a point for that.
[00:29:09] Michael: No, you should get a point for that.
[00:29:11] Anthony: All right.
[00:29:12] Elon Musk’s Controversial Comments on Waymo
[00:29:12] Anthony: So, Michael, what’s your nominee?
[00:29:17] Michael: Oh, my nominee is Elon. After seeing the Waymo incident that we just discussed, he goes onto his little prop platform, you might’ve heard of it, it used to be called Twitter, and basically says that Waymo’s vehicles are straight out of the Silicon Valley show. The comedy writes itself, he says, with two great big laughing emojis. Which is really strange coming from a man whose automations have actually killed a lot of people on America’s roads. That Waymo vehicle was driving in the wrong lane, and Waymos may have been [00:30:00] involved in some minor injury incidents, but they have not burned their passengers to death. Some of Elon’s vehicles have. They have not crashed into highway barriers while using automations like Tesla vehicles have. And there are a host of problems that Tesla vehicles have had, not just Autopilot and Full Self-Driving, but some of their designs that we’ve discussed, as far as manual door releases and other things that trap people in the vehicles when they’re on fire. Frankly, it’s just disgusting to see him make light of Waymo’s situation when he’s been personally responsible, I’d say, for a significantly higher share of human tragedies.
[00:30:47] Anthony: We’re not even talking about his own family.
[00:30:51] Michael: Which is a mess.
[00:30:53] The Debate on Open Source Software in Automobiles
[00:30:53] Anthony: Fred, who’s your Gaslight nominee? And can I tie this directly into the Tao, Fred?
[00:30:57] Fred: You’ve now entered the Tao of Fred. Absolutely, [00:31:00] and thank you for that. I need your help with the citation because I don’t have my notes in front of me; I’m relaxing offshore. That was that magazine article from
[00:31:08] Anthony: Yeah, there
[00:31:12] Michael: There was an article in AutoNews discussing how automakers are supposedly sharing open source software as cars get more complicated.
I’ll do a direct quote from it.
[00:31:25] Anthony: So the direct quote from it is: “Everyone wants the same output, wants the same set of functionality, is trying to solve the same problem,” said Alex Oyler, director of SBD Automotive North America. “A dollar invested in supporting quality open source software is much more efficient than a dollar invested in trying to solve that problem yourself or asking a supplier to support it for you.”
The article starts off with: more software, higher development costs, and other challenges are forcing the industry into a more collaborative approach. I said that with a straight face.
[00:31:54] Fred: So this is my nominee for Gaslight Illumination, because hidden [00:32:00] in that statement is the whole idea that open source is intrinsically high quality, and of course it is not.
And we need to get into a little bit of explanation about what open source means, so I’m just going to go ahead with that and sneak into the Tao of Fred.
[00:32:15] Understanding Open Source and Its Risks
[00:32:15] Fred: Because there are a lot of terms that are confusing to people who really are not in the software industry, and open source is one of them. Basically, open source means the source code is freely available and you can use it without paying anybody. There’s a lot of software available on the internet, and a lot of it is what’s called object-oriented software. Those terms are very confusing to people who don’t use them every day. What object-oriented software basically means is that a software module can be treated like a brick in a foundation.
Okay, if you want to build a house, you put a lot of bricks in the foundation and you build on top of that. And that’s great, it’s a good way of doing things, but what if the bricks are crappy? What if the [00:33:00] bricks have built-in defects? No matter how many bricks you use, you still get those defects in your foundation, and sooner or later they’re going to come back to bite you.
A problem with open source is that nobody has a vested interest in making sure that all of those software modules, those bricks, if you will, are in fact high quality. You have to pay people to do that, or find people who are saints, to go through all of those individual modules, make sure that they work properly, make sure that they’ve got the right inputs and outputs, and make sure that there are no back doors in them that allow hackers to get in. In fact, a lot of the well-publicized software breaches that have occurred were caused by back doors built into open source software. There’s an operating system called Linux that’s around; a lot of people use that. But the versions of [00:34:00] Linux that are being used are really not open source. For example, there’s a company called Red Hat that makes a version of Linux that’s widely used throughout industry, particularly in the banking industry.
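Fred’s brick analogy can be sketched in a few lines of code. This is a purely illustrative example (the module and function names are made up, not from any real library): a subtle unit bug baked into one widely reused “brick” surfaces in every feature built on top of it.

```python
# A hypothetical shared "brick": a small reusable module with a
# built-in defect. Nothing here is real automotive code.

def distance_traveled_m(speed_kmh: float, seconds: float) -> float:
    """Defective brick: forgets to convert km/h to m/s."""
    return speed_kmh * seconds  # correct would be (speed_kmh / 3.6) * seconds

# Two unrelated features both reuse the same brick...
def trip_odometer_m(speed_kmh: float, seconds: float) -> float:
    return distance_traveled_m(speed_kmh, seconds)

def stopping_point_m(speed_kmh: float, seconds_to_brake: float) -> float:
    return distance_traveled_m(speed_kmh, seconds_to_brake)

# ...so both inherit the defect. At 100 km/h for one second the car
# actually travels about 27.8 m, but every consumer of the brick says 100 m.
print(trip_odometer_m(100, 1))    # 100.0 (should be ~27.8)
print(stopping_point_m(100, 1))   # 100.0 (should be ~27.8)
```

No matter how many features are stacked on top, auditing the foundation brick is the only place the bug can be caught, which is exactly the vetting work Fred says somebody has to be paid to do.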
Red Hat is a commercial company, so when you use Red Hat Linux, you’re really not using free community software. So, getting back to why this is a Gaslight: the article we’re quoting here talks about the virtues of open source as though it’s a slam dunk. If you have one company developing a software platform that represents, we’ll say, a car with four wheels, everybody else can build on top of that, because everybody else’s car has four wheels, right? That’s fine as far as it goes. But what if somebody has built a problem into that platform? That means that problem, that trap door, that vulnerability, whatever it is, goes out to every car in [00:35:00] the world. There’s no requirement from any government, as far as I know, that the certificate of safety that goes along with these cars addresses cybersecurity in sufficient depth to make sure that something like open source software is, in fact, secure and does not have the kind of defects that can lead to operational problems or the injury or death of the person inside the car.
[00:35:27] Cybersecurity Concerns in the Automotive Industry
[00:35:27] Fred: The urban myth of intrinsic AV safety benefits is really based on the false belief that the software and hardware running the car are trouble-free, defect-free, and secure.
None of those three is guaranteed with open source software. And I’m sure, Anthony, you know from your software experience that a lot of the open source software out there is crap, and some of it is very good but very hard to understand. Whatever problems there are in this open source [00:36:00] software propagate upward into the software you build on top of it that is specific to your particular application.
Now, in the case of banks and financial institutions using open source software, all that means is that they’ll lose a few hundred million dollars. But that’s okay; they’ve got insurance, so they don’t really worry about that. For you and your car, it’s an entirely different problem if the operating software is faulty or has intentional shortcomings built into it.
There is secure software available, but there’s no intrinsically secure open source software available; you have to pay for it. People seeking type certificates for aircraft are willing to do that. Aircraft operating systems are secure, and people pay for them, because then you’ve got a company with a vested interest in making sure the software works properly, is safe, is secure, and doesn’t have any [00:37:00] cybersecurity problems.
And in the entire history of aviation, I don’t think there’s a single instance of aircraft cybersecurity being breached, despite all the many aircraft out there and all the hackers in the world trying urgently to do that. So that’s my nomination, to AutoNews, because they skip over the huge problems associated with open source software quality and personal safety.
Instead they just say, gee, it’s really a good idea because it’s going to save some money for somebody. And by the way, the article also goes on to cite some commercial sources of software that are not open source as a way of supporting its argument for open source software. So that’s my nomination, and I’m not done ranting about open source software yet, but it’s time for a break so you can hear somebody else.
[00:37:55] Michael: Do you think, as it pertains to cybersecurity specifically, [00:38:00] that it makes more sense for manufacturers to diversify the type of software they’re using? If everyone uses the same platform, does that create a larger attack surface for someone to, for instance, take over all the vehicles in America, versus just those made by one manufacturer?
[00:38:22] Anthony: Can I answer this one?
[00:38:22] Michael: Yeah, go.
[00:38:24] Anthony: Yes. Remember those ads that said, Macs don’t get viruses, Macs don’t get malware? It was because, back in the day, hardly anybody used a Mac, so no one wanted to write software to attack them, and everyone went after Microsoft. But yes, having a variety of things makes it harder for people to write custom viruses, malware, and whatnot, because now I’ve got to target 800 things. If everyone standardizes on the same thing, now I have one attack surface to go after. Microsoft is huge, and their Outlook system has been hacked and hit by a lot of people, mainly [00:39:00] Russia and state actors a lot of the time, because, hey, it’s the biggest one there.
If all auto manufacturers used the exact same thing, I think that would be more dangerous, and it would also hurt innovation.
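The monoculture point can be put in back-of-the-envelope terms. Here is a toy sketch (all numbers and assumptions invented for illustration, not real fleet data): in this simple model, splitting a fleet across independent software platforms doesn’t change how many vehicles you expect to be exposed over time, but it sharply limits how many vehicles any single successful exploit can reach.

```python
# Toy model of software monoculture risk. All numbers are illustrative.

def single_breach_blast_radius(fleet_size: int, platforms: int) -> float:
    """Vehicles reachable by ONE working exploit, with the fleet
    split evenly across independent software platforms."""
    return fleet_size / platforms

def expected_exposed(fleet_size: int, platforms: int, p_exploit: float) -> float:
    """Expected exposed vehicles per year if each platform is independently
    exploited with probability p_exploit. This comes out the same no matter
    how many platforms exist: diversity limits blast radius, not rate."""
    return platforms * p_exploit * (fleet_size / platforms)

fleet = 1_000_000
print(single_breach_blast_radius(fleet, 1))    # 1000000.0 -- monoculture
print(single_breach_blast_radius(fleet, 10))   # 100000.0  -- diversified
print(expected_exposed(fleet, 1, 0.05))        # 50000.0
print(expected_exposed(fleet, 10, 0.05))       # 50000.0   -- same expectation
```

That matches both halves of the conversation: Anthony’s point that a monoculture gives attackers one target worth everything, and Fred’s point that diversity alone doesn’t make any platform secure.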
[00:39:12] Fred: Michael, what brand of cars has been hacked a lot because of a trick that went out over TikTok?
[00:39:22] Michael: Yeah, Hyundai and Kia. The thieves aren’t really injecting lines of code into the system; they’re basically breaking a plastic cover off the steering column and using a USB cable to turn the ignition on. But in a similar way, Hyundais and Kias were all designed without an immobilizer, and they’re the only vehicles in the US market that have been routinely hacked in that manner.
[00:39:47] Fred: But that’s a great example of the consequences, strengths, and weaknesses of having ubiquitous software. It’s really the same problem. If you’ve got the same software in all vehicles, then [00:40:00] a hacker only has to do one thing to get into all the vehicles. If you have different platforms, even if they’re insecure, at least the hackers have to do multiple things to get into them. But it’s a much better idea to use software that is intrinsically secure and software practices that are secure. Just bite the bullet and pay the software manufacturers what they need to stay in business and preserve the safety of the passengers. It seems like a really straightforward approach to me.
[00:40:32] Anthony: So I’m going to take a slightly different tack than you, Fred. Open source, I think, is great in a lot of ways, but you’re absolutely right that it’s not a magic solution to everything.
Take the example of Linux: it’s not developed by, what people think of as, volunteers. The biggest contributors are probably IBM and Microsoft, and maybe Oracle now, I don’t know. These are people who are paid to do this [00:41:00] stuff.
But the lack of focus on cybersecurity is an issue even with them, because that stuff takes a long time to really figure out. You find out 20 years after the fact that some protocol that’s been in use forever, oh, we just found a flaw in it. And it being open source or not doesn’t make that flaw disappear; it doesn’t make that problem solved.
It’s more of a cultural mindset of focusing on security. And I think the airline industry, as you’ve said, really focuses on that, because, hey, if our planes are going to fail, let’s make sure we can blame it on Pratt & Whitney, not our software. Bad example. But they really lock it down. And I’m sure from your experience in the defense industry there is a lot of strict security around things, but even that never gets to 100 percent perfection.
[00:41:50] Fred: Right. Perfection is always a utopian ideal, but you can still do much better. For example, from the aircraft industry, ARINC [00:42:00] is a company, a nonprofit, that is organized around securing aircraft communications and internal computer networks, and they do. You pay for it; you get what you pay for. Then look at the 737 MAX 8. That’s an example of what can happen even when you are using secure software: they built a little addition onto the platform they had developed over a long time, the 737, to incorporate an additional feature they thought was a great idea.
But even there, they didn’t do enough software acceptance testing, they didn’t do enough validation, and terrible things happened. So software is very difficult, and to minimize its risks and its strengths and weaknesses by just saying it’s open source and that’s going to be great, [00:43:00] when the article in this Gaslight nomination doesn’t really talk about either open source exclusively or any of the pitfalls associated with open source, is, I think, really a disservice to the public.
[00:43:18] Anthony: I think for consumers out there, especially as software becomes more and more intertwined with our cars, you’re going to hear a lot more excuses when things go wrong: oh, it was a software change we didn’t realize, or it was a software bug we didn’t catch. And they’ll try to minimize it and say it was just one person who did this. Think Volkswagen and Dieselgate: they tried to say it was one person. No. There are entire teams of people; they go through a review process. Even when they don’t get it right, they know exactly what they’re doing.
It happened last year, if anyone follows NASCAR, where one of the teams, all of a sudden, their boost could work when it shouldn’t, and they said it was just a software glitch. No, you had at least five people working on this and intentionally putting this stuff in. Look forward to that in the [00:44:00] future as things go wrong in your car, where they say it was just a software glitch. Because people think, oh, my phone or my computer, I have to update it every now and then because it had a glitch, or I had to unplug it and plug it back in; it was just a glitch. No, these are intentional failures. Does that sound good? Intentional failures? I don’t know.
[00:44:19] The Role of AI in Software Development
[00:44:19] Fred: It does sound good, but there’s another hazard, which is that a lot of lazy people are now using artificial intelligence to write the code for them. And Lord knows what those consequences are going to be. I don’t even want to go there today.
[00:44:35] Anthony: Every time I hear your voice over the phone, I’m waiting for another voice to come in and say, this call is being made from a correctional facility.
All right, let’s thank you for,
[00:44:47] Fred: Yeah, we are a correctional facility, aren’t we? That’s what we do. We’re trying to correct the industry.
[00:44:55] Anthony: Hey, yes, sir. Okay. Thank you for your Tao and your Gaslight all at once. [00:45:00] Let’s jump to something a little more fun.
Interesting.
[00:45:03] JFK’s New Automated Shuttle System
[00:45:03] Anthony: There’s an article in Gothamist about JFK’s new automated shuttle system. This is an example of what we’ve talked about in the past: hey, this is probably a good area to try autonomous vehicles, because the operational design domain is highly limited. You have a limited number of people, the speed is really limited, and so they’re testing out a couple of shuttles that will take people around the airport, essentially.
But my favorite part of the article is: they will also be staffed by onboard safety attendants who will greet and guide riders and currently serve as JFK shuttle bus drivers. In an airport, we can all think, hey, I probably need somebody when I get confused or overwhelmed, a little helping hand. And frankly, this is what I think of when they talk about robo-taxis. Maybe, I don’t know what percent of the time, sometimes you don’t need to interact with your driver at [00:46:00] all.
But if you have a disability or whatnot, you need a driver there. I can’t open the door here. A little human contact’s pretty good. I don’t know why they’d get rid of this, and I’m glad they’re not doing that at JFK. Michael?
[00:46:14] Michael: Yeah. I often struggle to imagine actual use cases for autonomous vehicles. I don’t think they’re going to take over cities the way a lot of proponents seem to think they are, and at this point they’re probably less safe than an Uber driver. They’re not going to be able to help you get your luggage in and out of the trunk and do some of the other things that humans come in handy for. But here you have an easily mappable roadway and parking lot. They’re basically going back and forth from terminal to parking lot all day, they’re operating at low speeds, [00:47:00] and they have a safety driver. And they’re making the passengers buckle up and sit down as they start the trip, even though there is standing room on the buses.
It sounds like they have safety really at the forefront of what they’re doing at JFK. And the Port Authority has done some other testing in situations like this as well. So this is a situation in which it can work, and perhaps save money and make getting to the airport a little more efficient. This could work, even though these shuttles are shaped like lima beans, according to the article.
[00:47:38] Anthony: Aesthetics aside, we get a middling thumbs-up from Michael Brooks. Yes.
[00:47:47] Weekly Vehicle Recalls
[00:47:47] Anthony: Ah, and with that, I guess let’s move on to recalls for the week.
How’s that sound? Strap
[00:47:51] Michael: in. Time for the week holdout option. Sounds good.
[00:47:54] Anthony: I know. Okay, let’s start off with General Motors, founder of this organization. [00:48:00] Can I call them that? Pretty much. General Motors is recalling 8,622 of the 2024-model Colorados and Canyons, and the defect relates to the fuel pump: the lock ring that secures the fuel pump to the fuel tank assembly may not have been fully locked during assembly by the supplier. There’s a lot of locking here.
[00:48:27] Michael: Basically, it’s a part that connects the fuel tank to the fuel pump, and in higher-speed crashes they can separate, which is going to allow fuel to spill out. And that’s when really bad things happen. This recall looks like it’s going to start sometime in August. It’s interesting to me that GM was aware of this problem in March, when they were notified by a supplier that had discovered one that wasn’t [00:49:00] locked, and in May GM started looking into it, which doesn’t seem to be an overly complex issue.
They seem to be really taking their time getting this fix out to consumers, and it’s a pretty dangerous problem. Anything that could result in a fuel spill, in our opinion, is a very dangerous problem. I’m a little disappointed it’s taken GM so long to get this one out, but fortunately, they are.
[00:49:28] Anthony: Alright, next one up: Nissan, 1,608 vehicles. This is the 2024 Nissan Sentra, and it’s listed under glazing materials. There are air bubbles present in the lower driver’s-side windshield area, and the customer may not be able to see the VIN clearly. Wait, so it’s not even impacting the driver’s view? It’s just, if you’re outside the car trying to look at the VIN number?
[00:49:54] Michael: This is basically a noncompliance; there’s not a [00:50:00] direct safety issue here. Although if you can’t see your VIN number, then you can’t look up recalls on the internet, and there may be some other problems there. There’s a Federal Motor Vehicle Safety Standard, 205, that governs glazing materials, basically the windows in your vehicle.
And apparently somewhere in 205 there is a requirement that the VIN be visible. It’s not explicitly written in 205, but FMVSS 205 incorporates ANSI and SAE standards that are only available if you pay for them. So I’m just going to guess that that part of the standard is in the ANSI or SAE documents we don’t have access to. So I can’t cite it, but it’s an interesting recall nonetheless.
[00:50:52] Anthony: Yeah. If you own one of these vehicles, get it taken care of. Why not? It’s free. Drink some bad coffee while you wait. Moving [00:51:00] on: Toyota, 11,000-plus vehicles. This is 2024 Lexus RX versions and the 2025 Lexus NX something. Basically, the head restraint can be removed without pressing the lock release button. I see how these are noncompliant, but these are not dramatic recalls like “the car will suddenly shift into first gear.”
[00:51:23] Michael: You say that; however, the head restraint probably prevents as many emergency room visits and injuries as almost any other part of the vehicle besides the seatbelt. Head restraints, headrests, are very important in preventing spinal injuries, whiplash, and all sorts of things in frontal and rear collisions. So they are incredibly important. And if you have one of those vehicles where you fold down a seat and have to remove the headrest, make sure you put it back in, because you do not want a [00:52:00] passenger or driver riding around without that protection.
[00:52:04] Anthony: All right. Look at that, I learned something new. Ford: 4,361 vehicles, the 2024 F-150. The windshield may not have been properly bonded to the vehicle, which could allow it to detach. Definitely get this one fixed. This one is a lot more dramatic than not being able to read your VIN number.
Oh my God. You don’t want to be driving down the highway and then all of a sudden the windshield’s in your face, like literally in your face. This is awful.
[00:52:30] Fred: That would be startling.
[00:52:32] Anthony: Yeah.
[00:52:34] Michael: Yeah. And basically the concern is that the windshields won’t stay on there in a crash. I don’t think these windshields are just popping off as folks drive down the road; however, they could pop out during a crash, and that’s important for a lot of reasons. Obviously it could be a good thing if you need to escape the vehicle quickly, but the glass, windshields, and windows in your car also provide a significant [00:53:00] amount of structure that preserves the vehicle’s integrity when you’re in a crash, in rollovers and other things.
You don’t think of windows as being particularly strong, but they do contribute to the ability of the passenger compartment to remain intact and protected in certain crashes. So it’s important that windshields stay in place.
[00:53:23] Anthony: Wow, I learned two things in a row from you, Michael Brooks.
This is fascinating. All right, but
[00:53:28] Michael: verify
[00:53:31] Anthony: Next one up: Ford Motor Company. Huh, they’re really listening to Recall Roundup. 30,735 vehicles, the 2022 to 2023 Ford Mustang. And this description is going to need the assistance of Fred Perkins: the secondary digital torque sensor in the steering gear was calibrated with an inverted polarity. Yeah, I hate when my polarity is inverted. This means that if the primary steering torque sensor fails and the secondary [00:54:00] digital torque sensor’s polarity is inverted, the steering wheel may begin oscillating without warning. Ooh. So you’re trying to steer and the steering wheel is like, nah, we’re going the other way.
Fred, how do you invert your polarity?
[00:54:15] Fred: Oh, basically: have you ever jump-started a car with a dead battery?
[00:54:23] Anthony: Yes.
[00:54:25] Fred: If you put the electrical connectors on in the wrong order, your battery tends to explode, right? You always need to go red to red and black to black. If you go red to black and black to red, your battery is going to explode.
That latter example is reverse polarity. Somehow they did that here, and I think what probably happens is that when you turn the wheel to the right, the sensor thinks you’re turning the wheel to the left. I think that’s what the reverse polarity means in this case, but it’s a secondary effect.
So only when you lose the primary [00:55:00] sensor does this come into play. It’ll happen rarely, but when it does, it’s going to be really bad. It’s not too different from the 737 MAX 8 problem; that was also a reverse-polarity problem, in a sense.
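Fred’s explanation boils down to a sign flip in the backup sensor path. Here is a minimal, hypothetical sketch of that failure mode (the function names and the assist gain are invented for illustration; this is not Ford’s actual control code):

```python
# Hypothetical dual-torque-sensor fallback. The secondary sensor was
# calibrated with inverted polarity, so its reading carries the wrong sign.

def read_primary(driver_torque_nm: float, healthy: bool):
    """Primary torque sensor: returns None when it has failed."""
    return driver_torque_nm if healthy else None

def read_secondary(driver_torque_nm: float) -> float:
    """Secondary sensor with the calibration defect: polarity inverted."""
    return -driver_torque_nm

def assist_torque(driver_torque_nm: float, primary_healthy: bool) -> float:
    """Power-steering assist follows the sensed driver torque (toy gain 0.5)."""
    sensed = read_primary(driver_torque_nm, primary_healthy)
    if sensed is None:            # primary lost: fall back to the secondary
        sensed = read_secondary(driver_torque_nm)
    return 0.5 * sensed

# Normal case: driver turns right (+2 Nm), assist also pushes right.
print(assist_torque(2.0, primary_healthy=True))   # 1.0

# Primary failed: the inverted secondary makes assist push LEFT against
# a rightward input, the opposing/oscillating condition in the recall.
print(assist_torque(2.0, primary_healthy=False))  # -1.0
```

This also shows why the defect is latent, as Fred says: as long as the primary sensor is healthy, the bad calibration never enters the control loop.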
[00:55:15] Michael: So here it looks like you get phantom steering of some sort: you’re trying to turn right, and the car is basically pushing your wheel in the opposite direction. So it looks like it would make steering more difficult, or, if you weren’t applying a lot of pressure on the steering wheel, the vehicle is somewhat steering itself the wrong way. So this is a fairly scary one.
That I don’t want to
[00:55:42] Fred: fix right away.
[00:55:53] Michael: Yeah. I don’t believe that every vehicle involved in the recall has this condition, but they’re going to bring them all in and inspect them to make sure.
[00:55:53] Anthony: Next up, Lucid, 5, 251 vehicles, 2022 to 2023 [00:56:00] Lucid Air. And my suspicion is that they only produced about 5, 000 of these vehicles.
I could be wrong. The vehicles are susceptible to me being unable to speak. The affected vehicles are susceptible to intermittent hardware connection failures, which could lead the high voltage interconnect software to remove high voltage from the high voltage bus while driving. What language is this in?
[00:56:26] Michael: It doesn’t matter. It’s Lucid. We’ve talked about them before. They’re where the phrase “turtle of death” comes from, because a turtle shows up on the instrument panel when the vehicle is losing power and being put into limp mode. They’ve had a lot of complaints from owners about [00:57:00] the vehicle losing power, and I think there have already been at least two recalls on the Lucid Air for similar issues. I would advise anyone considering a Lucid to look elsewhere right now, because they’ve just continually had problems with the battery system and the drive system, and those don’t look like they’re going away anytime soon.
But if you were one of the lucky people to buy a Lucid, you’ll have your owner notification in the mail in about a month. However, it looks like this is an over-the-air update, so you should be having this fix come to your vehicle before that date.
[00:57:32] Anthony: So if you or a loved one has a Lucid, or is considering purchasing a Lucid, don’t invest in that turtle of death. Instead, invest in safety: contribute to the Center for Auto Safety. Go to autosafety.org. Click on the donate button. Click twice. Click three times. Use your credit card number. Use a stranger’s credit card numbers. Use as many credit card numbers as you want. You can probably go onto some forum right now, pay ten bucks, get a million credit card numbers, and try them all. Why not?
[00:57:58] Michael: Yeah no, [00:58:00] no dark web credit cards, thank you.
[00:58:02] Anthony: He’s a little more conservative than me.
[00:58:04] Conclusion and Final Thoughts
[00:58:04] Anthony: And with that, that’s our show. Thank you so much. I would like to formally invite the CEO of Ford, Jim Farley, to join us, because we want to talk to him about vehicle weight. He seems to be interested in that, and we’re interested in that too.
So Jim, if you’re listening, come on, we’ll be kind.
[00:58:20] Fred: Thank you, gentlemen. And thank you listeners.
[00:58:24] Anthony: All right. Fred, we hope you get parole.
[00:58:28] Fred: Yeah, I’ve got to run to the ferry that’s leaving the prison island here.
[00:58:32] Anthony: Okay. Thanks everyone. Bye.
[00:58:38] Michael: For more information, visit www.autosafety.org.