Full Self-Driving Leads to Dangerous Complacency
Hello listener. Here are some highlights from this week's episode:
- NHTSA is making GM Cruise submit a report of its software updates every 90 days. Anthony suggests NHTSA ask for the commit logs and other developer documentation so they don't get gaslit.
- Tesla’s FSD requires a human to do the full self driving part.
- Consumer Reports looks at the Cybertruck and they think visibility sucks.
- Kia gets hacked and provides a good example of bad cybersecurity in the auto space.
- Fred explains latency and it’s got nothing to do with lace or lattes.
- plus recalls and more
This week's links:
- https://www.nhtsa.gov/press-releases/consent-order-cruise-crash-reporting
- https://arstechnica.com/cars/2024/09/tesla-full-self-driving-requires-human-intervention-every-13-miles/
- https://arstechnica.com/cars/2024/09/flaw-in-kia-web-portal-let-researchers-track-hack-cars/
- https://www.carscoops.com/2024/09/fisker-flip-flops-again-requires-owners-to-pay-for-recall-repairs-but-feds-beg-to-differ/
- https://arstechnica.com/tech-policy/2024/09/ubers-mandatory-arbitration-upheld-in-case-over-severe-crash-injuries/
- https://www.consumerreports.org/cars/hybrids-evs/tesla-cybertruck-review-a4750335741/
- https://www.theverge.com/2024/9/30/24258232/tesla-cybertruck-fsd-early-access-v12-roll-out
- https://insideevs.com/news/734637/zeekr-7x-window-breaking-flooding/
- https://static.nhtsa.gov/odi/rcl/2024/RCLRPT-24V708-3072.PDF
- https://static.nhtsa.gov/odi/rcl/2024/RCLRPT-24V694-7731.PDF
- https://static.nhtsa.gov/odi/rcl/2024/RCLRPT-24V704-8105.PDF
- https://static.nhtsa.gov/odi/rcl/2024/RCLRPT-24V715-5912.PDF
- https://static.nhtsa.gov/odi/rcl/2024/RCLRPT-24V702-4982.PDF
Subscribe using your favorite podcast service:
Transcript
note: this is a machine-generated transcript and may not be completely accurate. It is provided for convenience and should not be used for attribution.
[00:00:00] Anthony: You're listening to There Auto Be A Law, the Center for Auto Safety podcast, with executive director Michael Brooks, chief engineer Fred Perkins, and hosted by me, Anthony Cimino. For over 50 years, the Center for Auto Safety has worked to make cars safer.
Nice to everybody out there listening.
[00:00:30] Fred: Good morning. Good morning. Hey, I wanted to explain to our listeners, by the way, that we have a lot of bad news that comes through sometimes, and we can be mistaken for Cassandra, but for those keeping score at home, we’re more Aletheia than Cassandra, so I’ll just leave that as an exercise for our Greek mythology fans.
[00:00:50] Anthony: If you’d like to follow Fred on his Greek mythology podcast, go to Caesar, no, that’s the wrong thing. Look, I didn’t take Greek [00:01:00] mythology, okay? What I did do though is I’m going to start this week off in a different way. We’re starting off with my gas light. Okay. Cause my gas light this week is going to be, I think it’s cruise, and I think they’re going to be gaslighting NHTSA.
See, NHTSA fined GM Cruise for, basically, Cruise lying to them, which is never a good idea. Don't lie to government officials, especially customs officials, because that's the crime. And so as part of this consent order, Cruise had to pay basically their daily lunch budget, which is like one and a half million dollars.
Nothing. And part of this whole consent order says that Cruise has to report on their software updates. That means every 90 days, Cruise must submit a report on their ADS software and explain what happened. But it doesn't really go into details explaining, like, how they have to do this and whatnot.
And I think this is how Cruise gets to [00:02:00] gaslight NHTSA. Because I don't think NHTSA is equipped, and stop me if I'm wrong, to handle what software updates actually mean. So I'm going to put out there that I think NHTSA should have direct access to Cruise's developer logs, their commit logs, and their issue tracking system, so they can actually see what their software developers are doing versus what GM Cruise executives are claiming they're doing.
So is it a bit of a stretch for a gaslight? I don't know. Something. I don't know. I think,
[00:02:32] Fred: I guess what you're saying is that this is requiring Cruise to obey the law once every 90 days. Is that basically the gist of what you're saying?
[00:02:41] Anthony: What it sounds like, Michael, correct me, I'm sure, is that part of this consent order, where GM Cruise is being monitored now by NHTSA, is that every 90 days GM Cruise says, hey, this is what we did with our software.
And these are software-driven vehicles. And they're not really software [00:03:00] engineers, right? They don't have that kind of expertise. So GM Cruise can be like, we updated gamma 7.19 vertical 3. And they'll be like, oh, okay.
[00:03:11] Michael: I wouldn't make a blanket statement that NHTSA is not going to be able to evaluate that type of thing.
Earlier this year they started an office of autonomy, essentially, that's overseeing a lot of these things. And I'm certain that office is going to be involved in evaluating the Cruise updates. And it's not just software updates that Cruise is required to give NHTSA.
They have to give general reports on their operations, where they're planning to start testing in the future. They have to give reports on how they're complying with local traffic laws. And I don't know if that goes to some of the issues we saw, Cruise vehicles having trouble navigating streets, stopping in the middle of streets, causing traffic jams.
[00:04:00] But certainly it goes to whether they're obeying traffic laws. And they also have to do reporting on their safety framework to NHTSA. Some of these have 60-day periods, some of them 90-day periods, but they're meeting with NHTSA, I believe, monthly, essentially, and quarterly, to discuss these issues and to try to at least assuage NHTSA's concerns that Cruise has been less than forthcoming about their safety operations, which is the basis for the NHTSA fine.
And yes, it's $1.5 million. When I first see that I'm thinking, that's peanuts for a company the size of GM Cruise, and it really is. But ultimately what we're talking about here is a violation that involved one severe injury and a crash. And there's obviously some back and forth between NHTSA and GM about just how much Cruise covered up.
They filed the one- and ten-day reports under the standing general [00:05:00] order that they were required to do, but they omitted the fact that the vehicle had dragged a pedestrian until the 30-day report came in. They were gaslighting NHTSA along the way. And they've been fined for it.
They've got a structure that they have to adhere to now, and they're continuing operations. I can't say we're totally satisfied with the outcome, but it's a showing that they're going to enforce violations of the standing general order, which is a good thing. And they also, I believe, had GM go back and correct for other incidents that took place with their vehicles that weren't correctly reported.
All in all, it shows that the standing general order has some teeth and is going to continue to require manufacturers to be forthright in their reporting, which is a good thing.
[00:05:47] Anthony: I guess my concern is mainly around the, they have to report this every 90 calendar days. And the software on day one of that period, and 90 days later, could be totally different.[00:06:00]
So that's where it's going to become this vague, amorphous kind of blob. So is GM Cruise telling them, hey, this is what we have on day 90, and not explaining what they did over those previous 90 days? Because there's no reason that NHTSA can't have real-time access to the software changes they're doing. They don't need access to the actual software, but they can have access to the commit logs, of every time a software developer says, hey, this is the change I made.
There's no reason that NHTSA shouldn't have access to that. So then they can double-check that GM Cruise is not gaslighting them. So maybe this is not my gaslight nominee, it's my gaslight prevention tip for NHTSA. I don't know. Fred, you've worked on large-scale software systems and whatnot.
Am I totally off the mark here?
[00:06:46] Fred: No, I don't think you're totally off the mark, but it's impossible for NHTSA to evaluate software changes instantly. There's got to be some delay between the software being [00:07:00] implemented or even tried out, the time it takes for NHTSA to evaluate it, and also the time it takes for that software change to be reported.
There are finite amounts of time involved in all that. Granted, they can be smaller rather than larger, depending on the administrative burdens that you put on it. But I don't think that it's practical or possible to expect an instant review of the software, for the same reason we've said that it's impossible to put out a responsible software update to operating software overnight.
You've got to do the validation. You've got to do the integration. There's just a lot of overhead associated with making a software change in complex operational software. That burden is borne by both the developers and the people who are trying to evaluate, as a third party, what the consequences of the software changes are.
[00:07:57] Anthony: Yeah, I wouldn't say a real-time review or [00:08:00] anything like that, but they should be able to have access to all of the logs as they get committed. And NHTSA can then review it every 90 days, but they have exactly what happened every day. So they can see GM Cruise say, hey, this is what we did.
Whatever the terms, they haven't specified the scope of what this means yet. And that's more of my issue, because that could be a little amorphous. But as long as NHTSA has access, okay, GM's saying, this is what we did, and NHTSA can say yay or nay, they have access to everything GM actually did and they can review it at their leisure.
I think that could be...
[00:08:36] Fred: That could be provided. They could provide the same access to changes as they provide internally. Yeah.
[00:08:43] Anthony: Right.
[00:08:44] Fred: But of course, internally there's going to be different kinds of software. There's going to be software that's put on for a trial, and then the software that's going to be actually implemented in operational software released to the operating units.
So there's complexity in that as well. [00:09:00] I'm not saying anything that you're suggesting is a bad idea. I'm just reminding people of the complexity of a responsible software release process.
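The commit logs Anthony and Fred are discussing are just timestamped records of who changed what and when. As a rough sketch of how such records could be turned into something reviewable (the field layout, commit hashes, and messages below are purely illustrative, not anything Cruise or NHTSA actually uses), the output of a command like `git log --pretty=format:%h|%an|%ad|%s` parses into one tuple per change:

```python
def parse_commit_log(raw_log):
    """Parse pipe-delimited `git log` output into (hash, author, date,
    subject) tuples -- one record per software change a developer made."""
    return [
        tuple(line.split("|", 3))  # at most 3 splits -> 4 fields per line
        for line in raw_log.splitlines()
        if line.strip()  # skip blank lines
    ]

# Hypothetical example of what a 90-day reporting window might contain:
sample = (
    "a1b2c3d|dev1|Mon Sep 30 2024|Tune pedestrian detection threshold\n"
    "e4f5g6h|dev2|Tue Oct 1 2024|Fix post-crash pullover maneuver"
)
records = parse_commit_log(sample)
```

A regulator with this raw record could compare what developers actually changed against what the quarterly report claims, which is the whole point of Anthony's suggestion.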
[00:09:10] Anthony: Yeah. I just see this as a way for the regulators to have a better understanding of what's going into these systems.
Because I think that's the big issue that we, and a lot of people, have with that: there's a lot of ambiguity around, hey, this is what we did. And then we see it with Tesla, they'll do an update on like a weekly basis, if not more often. And that just seems reckless, because we know these things are not being tested.
We know they're just using the public as beta testers.
[00:09:39] Michael: Yeah, and I guess this also raises the question of, if NHTSA sees that Cruise needs to report these types of things to ensure that its operations are being conducted in a manner that's going to boost safety, then why not demand this type of information from every manufacturer who's testing autonomous vehicles on the road, [00:10:00] rather than just the poster child in GM Cruise, who's had a recent poor performance? Maybe it is time to put a little more oversight over the entire industry to ensure that they're deploying responsibly in areas that are appropriate.
[00:10:18] Anthony: So there we go. That's my vote: give NHTSA more funding so they can better monitor this stuff and force the manufacturers to provide greater clarity.
[00:10:30] Fred: My vote goes towards type certificates for vehicles being released on a public highway. This retroactive evaluation of software, that only becomes intense when somebody gets killed or injured, is really a bad idea.
There should be a process, as there is in Europe, for type certification of vehicles that are going to be released to the public, particularly when the public is being used as unwitting [00:11:00] test subjects for these dangerous vehicles.
[00:11:05] Anthony: So is this your gaslight nominee? Is this what you’re going into?
[00:11:09] Fred: I’m sorry.
[00:11:10] Anthony: Is this your gaslight nominee? Is this what we’re going into?
[00:11:13] Fred: Oh, we can do that. You want to do that now?
[00:11:16] Anthony: I kicked it off in a backwards way. So yeah, let’s do it.
[00:11:21] Fred: All right. My nominee this week is Foretellix, which is a company that is in the business of providing simulation software.
And they make some extraordinary claims, but I want to point out to people that the best kind of simulation that's available is hardware-in-the-loop, in which you take the actual analog system, because we live in an analog world, folks. So you take the actual analog system and you attach it to a computer that is simulating the inputs to that system, allowing the system to actually [00:12:00] generate the real outputs that the real hardware will create given those inputs. That's called hardware-in-the-loop. That's the best way of testing and the best way of using simulations. But Foretellix says, actually, in the first line of their website, that it is, quote, safety-driven verification and validation for automated driving systems and ADAS.
Then they go on to say the Foretify platform provides a unified V&V flow, which stands for verification and validation flow, that combines real-world test drives and hyperscale simulation in one platform. Boy, that sounds pretty good. It does. And they say our...
[00:12:45] Anthony: Sorry, I said their website looks fancy too.
[00:12:49] Fred: Yeah, I'd buy one of those. Our solutions help integrate the safety, development, and V&V activities, follow the latest ADS safety standards, [00:13:00] and deliver evidence to support the safety case. That sounds pretty good too, doesn't it? There's a little problem here, though: there are no ADS safety standards anywhere in the world, that I'm aware of, that they can satisfy by using simulation. So that's blowing a little bit of smoke there. They go on to talk about level 2 plus ADAS. Again, that sounds pretty good, but there is no level 2 plus ADAS anywhere in any industry standard or industry-accepted definition that I'm aware of, so that's completely made up.
Then, I guess the best part of this is that they say they can accelerate your compliance testing and expand beyond the NCAP requirements with additional maps and higher actor variability, using large-scale automated virtual simulation. Wow, that sounds pretty good. Of course, that's easily done, because [00:14:00] there are no NCAP requirements that are specifically associated with AVs or their simulation, so anything that you can do in a simulation exceeds the NCAP requirements. So this statement is both true and incredibly misleading. They apparently have some commercial customers, though, so apparently the standard for buying this crap is not too high.
And then they go on to say they can increase safety and quality, they can identify edge cases and unknowns, and generate evidence to support the safety case. There's no definition of what the safety case is, so it's probably easy to support that. They probably exceed it, too. They exceed it, yeah, particularly because there's no definition.
Now, it's true that simulations can identify some things that the software needs to avoid, but what is not true is that any simulation can replicate [00:15:00] the real world with enough fidelity to say that a system is safe to operate on the highways using only a simulation as the basis for that claim. So that's why I'm giving them my nomination for Gaslight of the Week, because they just made up mythical standards that they can exceed, and use that as a justification for saying that they can provide evidence that will automatically give you strong evidence of a system that's safe to operate on the highways, even though they never take the step of moving from their abstract digital simulation to the real world of analog systems, which is what you do when you have a hardware-in-the-loop system.
This assertion, that the safety of a real ADS on [00:16:00] real highways can be validated by digital simulations only, within unbounded digital simulation models, is really just dangerous bullshit. And so that's my nomination.
[00:16:09] Anthony: Okay, so Foretellix. I also hear they're the fastest company that'll help you exceed these safety standards.
A Baptist company? Sorry. The fastest. They could be Baptist as well. They’re the fastest way to become Baptist while you’re doing this because as you use one of their products, you’re like, Oh shit, I’m gonna die! Lord save me!
[00:16:32] Fred: They talk about the Jesus nut that they've got on helicopters, which is basically the nut that you use to attach the rotor to the rest of the helicopter.
And they call it the Jesus nut because if it fails, that’s who you’re likely to meet.
[00:16:46] Anthony: Oh, wow. Cause he’s one of the mechanics who repairs helicopters. Hello! Speaking of Baptists, Michael Brooks.
[00:16:54] Michael: All right. My Gaslight nominee of the week is someone who I think I've nominated [00:17:00] before. It is Governor Gavin Newsom in California, who did a couple of bad things this week.
The one that I'm going to call out for Gaslight is his veto of the California bill that was going to require intelligent speed assistance of some sort to be added to vehicles in California. I believe it was going to be required by, I don't remember, 2027, 2029. We do expect that to happen at some point on the federal level, but probably not for another five years or so.
A lot depends on the political outlook, which is uncertain at the moment. But in vetoing the bill, Newsom noted that adding California-specific requirements would create a patchwork of regulations that undermines this long-standing federal framework, which is speaking of NHTSA and its regulatory framework.
But I think almost anyone listening to the podcast [00:18:00] knows that California has made a career out of passing laws, and governors in California have made a career out of signing laws, that create that patchwork of regulations for manufacturers to tiptoe around or to comply with. We often see safety standards, not just in the vehicle space but in all sorts of other areas of life, and we see those coming out of California, and California is generally the first state that starts the process of creating a patchwork of regulations that sometimes ultimately ends up becoming a federal law that applies to every state. So it's very disingenuous of the governor to act as though California is somehow opposed to creating regulations before the federal government in the area of safety. And in other areas, California has long been a leader in requiring vehicles to have higher emission standards than the federal government and other states.
They certainly have had no [00:19:00] problem creating a patchwork of regulations in other consumer-related areas. But because Gavin Newsom is continuing to boost his credentials as an anti-safety tech bro, he's going right along with this language about regulations that is most commonly heard coming from the automobile industry lobbyists that are visiting him in his offices frequently, it looks like.
So that's the gaslight part of Gavin's week. He also signed a law that functionally destroys, in many ways, what has long been the best, or one of the best, lemon laws in the country. And he signed a bill that did not go through the appropriate process in the California legislature.
I spoke about this a few weeks ago. They used a gut-and-amend process, where they gutted a completely unrelated bill that had made it through committee, added language that was [00:20:00] approved by manufacturers, including General Motors and others, and then put that directly into law without the ability for groups like us to comment or to testify in front of the legislature about the harmful effects that the measure would have.
So Gavin Newsom gets my Gaslight of the Week, and a big thumbs down on his safety actions in the past couple of years.
[00:20:25] Anthony: Wow. That's pretty good. But Michael, Gavin Newsom has what we'd call Hollywood presidential man hair, and I don't want you to take this the wrong way, but you just do not have Hollywood presidential man hair.
So when it comes to things of importance, I'm going to have to listen to him. Maybe you need a pomade or something.
[00:20:46] Michael: Anthony, your hair is much, much more beautiful than mine will ever be. I'll take your word for that. As for judging our political leaders by hair, I don't know how to argue with that.
That sounds like a great way to go.
[00:20:57] Anthony: I think so, that, and he has a very strong jawline. [00:21:00] He looks presidential, but his actions, they'd make Ronald Reagan blush. Let's be honest, he's definitely out there. Do you have something to add, Fred? Was that what that lean-in was?
[00:21:11] Fred: I'm going to lose out in a hair sweepstakes, I think.
[00:21:15] Anthony: Or you’re winning in reverse.
[00:21:17] Fred: If the standard becomes the amount of light that’s reflected from somebody’s scalp, I’m in the running, but I’m not sure that’s what you’re talking about.
[00:21:27] Anthony: Okay. I’m not sure what we’re talking about either anymore.
But hey, let's do an update on a fun little story. I think it was last week. You guys remember Fisker? Fisker Ocean, they're a car company, they went bankrupt, and their cars have open recalls, and they told their customers, you gotta pay for the recalls! And then they're like, nah, we'll pay for the parts, but you gotta pay for the service.
And then at one point, briefly, they said, no, we got it all. Then they're like, nah, you guys, we're in bankruptcy, but we put enough money aside to [00:22:00] pay for the parts, but you gotta pay for the service. And then people like Michael Brooks are like, no! No, that is not the federal law. So Michael, what's going on with Fisker now and their lovely lawyers?
[00:22:12] Michael: It looks like, and I can't tell if they have been in touch with NHTSA or NHTSA was in touch with them, but what they're saying is they are diligently working to secure funding for the labor costs, and they have been able to fund the purchase of the service parts for those vehicles so that the recall can be completed. They're going to be required to provide funding for all of that ultimately, and they're not going to be able to plead bankruptcy. That's the federal law. It's a pretty hard line and there's no way around it. If you're a Fisker owner, I guess you can take some solace in the fact that the recalls that are currently open on these cars are going to be fixed, but [00:23:00] ultimately you've got a vehicle that is not going to have any manufacturer support going forward. You may not see any more recalls. And there are safety defects that show up on these vehicles, and you already have owners with problems with these vehicles that aren't under recalls. There are owners that I've seen have battery issues, and there's simply no way to get them fixed.
It's basically an expensive lawn ornament at this point for a lot of owners. And we would advise anyone who is even remotely considering a purchase of a Fisker vehicle at this point, I'm not sure if they're still selling them new, but if so, don't buy them new. Certainly don't buy them used.
Stay far away, because repairs are going to be incredibly difficult to get.
[00:23:45] Fred: I want to clarify that a lawn ornament is sometimes called yard art. Just for those who are keeping score at home.
[00:23:55] Michael: And if you want one, buy a $20 flamingo and not an [00:24:00] $80,000 Fisker.
[00:24:02] Anthony: Okay. There you go. There's some lawn care advice from the Center for Auto Safety.
Hey, if you like lawn care advice, go to autosafety.org and click on donate. And if you do, I promise we won't discuss pink flamingos anymore. Yeah, I made the assumption they were pink. Okay, what other color are they going to be? I don't know. So hey, let's do some Tesla news.
So Tesla, a lot of interesting news. So Consumer Reports put out their preliminary review of the Cybertruck. They're like, we're gonna do this 'cause people are into it, and we gotta do this. And they had their initial impressions, and the thing they don't like is visibility.
Basically, they said, with the exception of a straight-ahead view out of the windshield, the Cybertruck's visibility is abysmal! Look to the left or right and your view will be blocked by thick pillars surrounding the windshield. They also complain about the steering: at low speeds you have to turn it less versus at high speed, how does it work?
I'm thinking there's an old Jackie Mason [00:25:00] commercial for the Honda Prelude in the 90s. You turn it like this and then it goes like that. Anyway, you have to relearn how to use a steering wheel, is the thing they're saying, and it's very difficult to park this thing.
The cameras aren't great. They don't give a good view of what's around. They compare the cameras to those of a $27,000 Nissan Sentra, and they say the Nissan Sentra's cameras are better. There's also the fact that until yesterday, I think, Full Self-Driving and Autopilot were not on these hundred-thousand-dollar cars. But yesterday, October 1st, select people got Autopilot, or they managed to get Full Self-Driving. And TheVerge.com talks about how one Tesla fan put this up, of course they all put up videos of this stuff, and from their article: at about the 6-minute-20-second mark in the video, the driver needed to intervene when the Cybertruck almost drove into a median after making an automatic left turn at an [00:26:00] intersection.
Tesla will claim this uses end-to-end neural nets and it's on avocado toast and it's amazing. You need to pay attention. If you're driving any vehicle, I don't care what the features are, you're the one driving the car, okay? It may help you, like, stay in your center lane, but if you drive over somebody and drag them for 20 feet, you're the responsible one.
[00:26:23] Fred: There was somebody that I read who defined full self-driving, thinking maybe we're thinking of it the wrong way: they defined it as the requirement that you, yourself, drive full time.
[00:26:37] Michael: I like that. That's what you need to be doing. There are a couple of things that stood out to me in the Consumer Reports review.
The first was something that most car buyers would never accept, which was the original price. What was agreed to by Consumer Reports, I believe, when they put down their deposit was around $49,000. And ultimately, when they finally got [00:27:00] access to a vehicle, I think it was about five years after they put down their deposit, they ended up paying over $100,000 for the Cybertruck.
So that's something that most vehicle purchasers would rebel against. Tesla fans, not so much. But doubling the price of a vehicle from the time you put down a deposit to the time you accept the vehicle is clearly unacceptable in the eyes of most vehicle owners.
Also, beyond the visibility, which is a huge concern, the steering system in the Cybertruck is drive-by-wire. And not only is it drive-by-wire, it's apparently the only drive-by-wire vehicle that has ever been sold that uses drive-by-wire steering without any mechanical redundancy. So if your drive-by-wire steering software goes buggy, you don't have steering.
And while Tesla claims that they have some type [00:28:00] of redundancy and fail-safe mode built into this system, they do not have the fail-safe of mechanical redundancy, which is, I think, critically important in situations where your steering fails. How are you going to get your car over to the side of the road?
How are you going to navigate the situation that you're in when you have no steering and you only have acceleration and brake pedals as your primary controls? So that's a scary situation. It's something that I think manufacturers are going to have to get a handle on if they're using drive-by-wire steering.
There's got to be some type of mechanical redundancy so that folks can continue to operate the vehicle safely, even if steering becomes somewhat more difficult when you have an issue with the steering.
[00:28:47] Fred: In many systems that do not have mechanical backup of controls that are important for operations, they use parallel software systems, where you've got a completely different [00:29:00] set of logic running in two different parallel paths.
And then you compare the recommendations at the end of those two calculations to say, okay, if they agree, then I have high confidence that this is the right solution. If they disagree, then you have no confidence. So they often have to run three systems, so that you can vote and make sure that the software always agrees on the inputs and the outputs, because you've now got a system that's complex enough that you can actually reliably say that you've got the right answer.
This is never done in these AVs, because the complexity of the software that they're running prohibits a parallel system from being put in place with different logic that looks at the critical parts of the software. So what folks get in software-driven vehicles is inherently much less safe [00:30:00] than all the systems that are put in place in other kinds of systems, aircraft, for example, that rely upon software for safety-critical functionality.
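The two-out-of-three voting Fred describes can be sketched in a few lines. This is an illustration of the general technique (majority voting across redundant channels), not any vendor's actual implementation:

```python
def vote(a, b, c):
    """Majority vote across three independently developed channels.

    If at least two channels agree, return the agreed-upon output.
    If all three disagree, there is no confidence in any answer, so
    signal that the system should drop into its fail-safe mode.
    """
    if a == b or a == c:
        return a
    if b == c:
        return b
    raise RuntimeError("no two channels agree; enter fail-safe mode")

# e.g. two channels compute 12.0 degrees of steering and one computes
# 15.0: the voter returns 12.0 and the outlier channel is outvoted.
```

The point Fred is making is that this only works when the channels run genuinely different logic, so that one software bug cannot produce the same wrong answer on all three paths.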
[00:30:10] Anthony: I don't know about the software in these systems, 'cause Elon always claims that it's a supercomputer neural network, artificial intelligence behind this, but Elon doesn't know jack shit about engineering.
He doesn't know anything about software. And I don't know if what he's saying is actually true, 'cause the little bit that we have out, we know it's humans manually mapping out each road and saying, this is a curb, this is a divider line. So it may be complex in a spaghetti-code kind of way, but I'm not convinced there's intelligence behind this. The other data point I have on this is my buddy Kyle, when he was the head of GM Cruise. I think CNBC was interviewing him, and they were like, oh, do you have a supercomputer like Tesla? And he just said, yeah, you don't need a supercomputer to do this kind of stuff.
It made me think, oh, [00:31:00] this is another data point of Elon being full of it. So I wonder how complex this actual software is.
[00:31:05] Michael: You're using Kyle to bat off Elon, and that doesn't really work if they're both full of shit, right?
[00:31:11] Anthony: He’s full of shit when it comes to safety, but at least I think he knows a little bit more about engineering than Elon, which is not saying much.
I agree, but it was very interesting. It was like, yeah, we don't need a supercomputer to do these kinds of calculations. And the little bit we have out from Tesla is they've got minimum-wage workers literally mapping out what the roads look like. So I'm curious. 'Cause the other example I can think of is fly-by-wire, of course: Airbus, their airplanes. Yeah, it's a triple-redundant system.
But it's also run by people who have thousands of hours of training, who are regularly trained every year. Their software release process takes years because of the amount of validation and testing they have, because they don't want any plane to crash or suffer an issue. [00:32:00] I don't think we have any reason to think Tesla does that remotely.
[00:32:03] Fred: The parallel between Musk and Kyle is that they are both experts at selling bad ideas whose time, they believe, has come.
[00:32:14] Anthony: They're, yeah, they're experts at selling what we all grew up with as children. It doesn't matter how old you are, we all grew up with the same myth of a comic book idea of the future. See, this is another reason that I think NHTSA should have access to the software, at least the commit logs and the comments the developers write in, to find out what's real and what's Memorex.
[00:32:40] Fred: I agree with Michael. There should be a requirement somewhere that says, if you have a software-driven vehicle that has safety-critical functionality, you should have a mechanical backup to make sure that you're not going to kill the people who are using it when the battery goes dead.
[00:32:59] Anthony: Yeah, [00:33:00] and we've seen this with Teslas where the battery goes dead and people are locked inside the car.
They can’t,
[00:33:06] Michael: They can’t find the manual door release that’s been hidden under a panel.
[00:33:09] Anthony: And, the doors don’t work unless there’s power,
Which is insane.
[00:33:13] Fred: The Cybertruck that burned up in Texas a few weeks ago and killed the occupants. Michael, do you know if the NTSB has yet discovered why that happened or how that happened?
[00:33:24] Michael: No, I'm not even sure if the NTSB is investigating that crash or not. And I haven't seen anything else on that, but that's just another reminder of the dangers of battery fires in crashes.
[00:33:41] Anthony: While we’re still on this Tesla full self driving, uh, Ars Technica has a great article titled Tesla Full Self Driving Requires Human Intervention Every 13 Miles.
Oh my god, from our friend of the show, Jonathan Gitlin. Quoting from the article: The partially automated driving system exhibited [00:34:00] dangerous behavior that required human intervention more than 75 times over the course of 1,000 miles of driving in Southern California, averaging one intervention every 13 miles.
This company called AMCI Testing did this and said, quoting from them: But its seeming infallibility in anyone's first five minutes of full self-driving operation breeds a sense of awe that unavoidably leads to dangerous complacency. When drivers are operating with FSD engaged, driving with their hands in their laps or away from the steering wheel is incredibly dangerous.
And I think this is what we've discussed, what friend of the show Phil Koopman has mentioned: in that first five minutes, you're like, Oh, this thing is doing everything. I become complacent. I don't have to pay attention anymore. They're all gonna die. And that's the dangerous part.
[00:34:51] Fred: It is, and we see this over and over again when people talk about their test drives in the Cruise or the Waymo in San [00:35:00] Francisco.
I went from my hotel to the Presidio in 15 minutes, and it was perfectly fine. Therefore, it's safe for all users in all circumstances. Anecdotal information about safe test drives is not the same as endurance testing and comprehensive expansion of the testing to all circumstances that somebody is likely to encounter.
And complacency comes in very fast, and the reporters who are enjoying this nice ride, I think, do the public a disservice by simply drinking the Kool-Aid and talking about how wonderful it is and how we really need to do this worldwide.
[00:35:44] Michael: I wanted to note a couple of things. AMCI Testing has a website that has, I think, about six videos of the different scenarios in which they tested full self-driving. And those are interesting to look at to [00:36:00] see exactly what the system is doing on the roads.
And I'll note, as they noted in their overview, they found the system to be surprisingly capable, which, we note, is something that leads humans to believe that it's going to be safe all the time. But they describe it as surprisingly capable while simultaneously problematic and occasionally dangerously inept. It seems like a football player that's got a lot of potential, but also has a lot of downsides on the other side, and may be best avoided. And I'll also note that they said that when errors occur, they are occasionally sudden, dramatic, and dangerous.
In those circumstances, it is unlikely that a driver without their hands on the wheel will be able to intervene in time to prevent an accident or possibly a fatality. It sounds a lot like the way we've described it: we fall into [00:37:00] this idea that the system is doing everything that needs to be done and that we can go about our business without paying attention.
And in those moments, you're at the greatest danger. So everyone who is willing to use full self-driving out there: please pay attention and keep your hands on the wheel, and don't be lulled into a sense of confidence in the technology, because you might be the next person to die.
[00:37:28] Fred: Those sudden and dramatic circumstances that you talk about are exactly why the kind of simulations I was talking about earlier, offered by Foretellix, are too limiting and unrealistic for evaluation of the real-world operations of these vehicles. Shit happens, folks, and it happens fast and furious, and it has bad consequences.
And whenever the bad consequences occur, it’s generally a result of those things that are happening really fast and are completely unexpected. [00:38:00]
[00:38:00] Anthony: Speaking of dangerously, occasionally, what is it? Occasionally dangerously inept. Let's talk Kia. Yes, Kia. An article we're also linking to from Ars Technica. Quoting from the article: Today, a group of independent security researchers revealed that they'd found a flaw in a web portal operated by the carmaker Kia
That let the researchers reassign control of the internet-connected features of most modern Kia vehicles from the smartphone of a car's owner to the hacker's own phone or computer. Ugh. By exploiting the vulnerability and building their own custom app to send commands to target cars, they were able to scan virtually any internet-connected Kia vehicle's license plate, and within seconds gain the ability to track the car's location, unlock the car, honk its horn, or start its ignition at will.
Ha! Take that, TikTok generation! USB cable, nothing. Come on, kids. Step up your game. Hack it via your phone. Honk the [00:39:00] horns.
[00:39:01] Michael: Yeah, this is a really scary one. They don't even need anything other than the license plate in order to essentially hack into a person's vehicle system.
And then you can obtain that person's personal information: name, phone number, email address, physical address. And add the attacker as an invisible second user on the victim's vehicle. And I will note that the vulnerabilities we're talking about here have supposedly been fixed.
The hackers didn't release this tool, so this is hopefully not a problem going down the line for Kia owners. What it allowed these hackers to do, and the most concerning thing to me, is that they could start the vehicle remotely. And that would allow them to start a vehicle that's parked in a garage.
And we all know how bad buildup of carbon monoxide is. It can go from the garage to a house. This [00:40:00] could be done while people are sleeping. There's a significant safety issue just in that part of this hack. It's something that I'm glad these hackers caught and communicated to Kia so that they could fix it, but it raises questions about just how secure Kia's systems are overall. And also, there are a lot of concerns that consumers need to be aware of when they are using web apps and using their phone to connect to their vehicle.
Cybersecurity is not a guarantee. Anytime you're connecting your vehicle to the internet, there is going to be a chance that someone might be able to exploit the vehicle software, the web software, however you're connected, to make bad things happen. People should be very aware of that.
[00:40:51] Fred: And before you relax because you didn't buy a Kia, the authors go on to say, quote, and those bugs are just two among a slew of similar [00:41:00] web-based vulnerabilities that they've discovered within the last two years that have affected cars sold by Acura, Genesis, Honda, Hyundai, Infiniti, Toyota, and more, close quote.
Yeah, be afraid.
[00:41:16] Anthony: Oh, that's not good. Yeah, so the hackers said that what Kia's website allowed you to do is connect to your car just by entering the VIN. And there are services that allow you to just go online, enter in any license plate number, and it'll pull up the VIN. I guess that's all public data.
I don't know why Kia didn't do any simple security check on this. Maybe, I don't know, maybe software, safety, cars. Who knew it would be a problem? I had no idea. Oy vey. Speaking of which, let's do a little Tao. The Tao. And since we're in the world of digital, let's talk about digital latency, which is a type of undergarment [00:42:00] worn in the 1920s, right?
Is that what digital latency is?
[00:42:03] Fred: Still being used. Put out by Playtex, I think. So there's a lot of terms that we throw around, and I wanted to make sure people understood what they are, at least our astute listeners. So latency is one of those terms, and it's related to another term called throughput.
So latency basically is how quickly your computer can communicate with another computer, and it's something you can try at home. If you get on your computer, go to the terminal, you can type in ping, then a space, and then the address of some other server. And it'll tell you how long it takes your computer to get a message back from the other computer that says, yeah, I've heard you.
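The ping idea Fred describes can also be approximated in code. This is a rough sketch, assuming Python, and it times a TCP handshake instead of a true ICMP ping (which usually needs elevated privileges); the host name in the usage comment is just an example:

```python
import socket
import time

def tcp_rtt_ms(host, port=443, timeout=5.0):
    """Estimate round-trip latency by timing a TCP handshake.

    Not a real ICMP ping, but the handshake can't complete faster
    than one network round trip, so it gives a similar lower bound.
    """
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # connection established; we only wanted the timing
    return (time.perf_counter() - start) * 1000.0

# Example (needs network access; host is illustrative):
# print(f"{tcp_rtt_ms('autosafety.org'):.1f} ms")
```

The number you get back is the round trip Fred is talking about: the floor under how fast anything on the other end can possibly answer you.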
So I did a little research here, and I started to look at how far away servers can be and what their response would be. [00:43:00] The reason this is important is because a lot of the AV developers, I think all of them actually, are using remote assistants and remote locations for their servers, with human beings behind them, so that they can actually back up their autonomy claims.
None of them are truly autonomous. They've all got these backup humans, but I digress. So I started the ping test from my own phone, my own iPhone, starting in Massachusetts, and I found that the ping was 45 milliseconds to Chicago, and the response was 314 milliseconds from Melbourne, Australia, which is about a third of a second. So if you're driving your car at 60 miles an hour, a third of a second is about 25 to 30 feet, something like that. So before anything can even happen to [00:44:00] analyze what your car is doing, you've already traveled 30 feet. And a lot can happen in 30 feet. As we were discussing earlier, stuff happens fast, and you can't really respond. Throughput is a little different.
So throughput is, given that you've got a finite amount of time it takes for you to communicate with another server, the other server has got to do some things, do some calculations, and then it provides a result, and it comes back to you. And so this can be very lengthy. You can have a complex interaction with that server; it has to do a lot of thinking and a lot of research. So the throughput can be much larger than the latency. It can be no less than the latency, though, so the latency is a lower limit on how quickly you can get a response. There's also latency internal to the car's computer, right? So there's a certain amount of latency [00:45:00] there, but it's very small. But there's a lot of delay associated with the throughput, particularly if the computer is overloaded.
Things are happening, you're filling up the memory. When we talk about latency and throughput, you should remember that there's always a finite amount of time that is required before the computer in your car can respond to stimuli that are coming in, for example, from the cameras or the LiDAR or whatever sensors it's got, and then translate that into the logic that's required to determine which direction the car is going to be steered in and how the brakes are going to be applied.
So latency is part of that. The farther away from the car that you physically get, like with the remote assistance that's provided to all of these vehicles, the slower it's going to be. So that's the short message about this. You've [00:46:00] got latency, you've got throughput. Throughput is always at least as long as the latency, never shorter.
And that’s it for this week.
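Fred's latency arithmetic is easy to check. A minimal sketch in Python, using his measured round-trip times as example inputs:

```python
FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 3600

def distance_traveled_ft(speed_mph, latency_ms):
    """Feet a car covers before a round-trip response can even arrive."""
    feet_per_second = speed_mph * FEET_PER_MILE / SECONDS_PER_HOUR
    return feet_per_second * (latency_ms / 1000.0)

# At 60 mph (88 ft/s), Fred's measured round trips cost:
melbourne = distance_traveled_ft(60, 314)  # about 27.6 feet of blind travel
chicago = distance_traveled_ft(60, 45)     # about 4 feet of blind travel
```

And remember this is only the latency floor: the remote assistant's processing and reaction time (the throughput side) gets added on top.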
[00:46:10] Anthony: That's great. I think a good example, for those playing the home game, is next time you're on some sort of video chat with a friend or family member, have both of you pick up an instrument, like a guitar or something like that, and say, hey, let's play a song together. Because you think, hey, we're talking to each other in real time and everything, there isn't any delay.
Try playing something that's time dependent, like a song, with somebody else through these systems, and you'll be like, this is garbage. Because of that couple of milliseconds, you're like, oh my god, my relative can't play for crap. This is awful. Wait, maybe it's me, I'm too slow. They're too fast. That's a quick way to understand what latency is.
[00:46:48] Michael: Also, remote work doesn't work for bands.
[00:46:51] Anthony: No, it doesn't. It's a big problem. It does not work. But what I did while you were talking, Fred, is I went and I [00:47:00] pinged autosafety.org, and it took me 4.234 milliseconds to reach the nearest server. 4.234 milliseconds, pretty quick.
And in that period of time, you can go there and donate. Oh! And while Fred was talking, you could be telling your friends, hey, text them and say, I found this awesome podcast called There Auto Be A Law. I gave it five and a half stars. That's right, you get an extra half a star in there if you like us. So what do you think, folks?
Have you done this? I'm waiting. Fred muted himself, but I think he just did it. He definitely...
[00:47:34] Fred: No, I was just wondering if you had pinged Piggly Wiggly, because we're waiting for them to come on as our sponsor. Maybe there's a latency problem there.
[00:47:42] Anthony: There could be a latency problem or it’s a blocked artery problem.
I'm not really sure what the Piggly Wiggly situation is. Hey, anyway, before we get to recalls, here's a fun little story that a fan of the show sent in a couple of weeks ago that we haven't gotten to yet. There's a company called Zeekr. Zeekr. They are a [00:48:00] Chinese company, I believe, and they have this fun, weird video of their compact SUV driving into a lake to show how they have built into their car a way to automatically break the passenger windows.
And we've talked about this before: whether it's tempered glass or that other kind of glass, laminated glass, you need different tools. And so they have basically a window breaker built into the door panel. We have the link to this; watch the video. It's bizarre: this guy slowly sinking into a lake, taking his time and relaxing as the car's filling up with water.
And then he eventually breaks the glass. The question I have for the people at Zeekr is: this is an electric vehicle. I assume this was fresh water, not salt water, because imagine salt water, you're going to get some major shorts, as we've seen with flooding in the South. [00:49:00] Ooh, blah.
Couldn't you have just done this experiment on land?
[00:49:08] Michael: Yeah, I think they could have. It's an interesting demonstration by Zeekr. I'm not sure if it was worth the value of the vehicle they sunk, but they clearly have tempered glass in their side windows, which can be broken by those types of tools that they've basically installed into the doors.
I think you can pull a handle in the passenger and driver's doors and it will break the glass, because it's tempered. But it would not work if they had laminated windows, which are far more difficult to break and to escape through. In many circumstances, the typical tools that you see sold on Amazon or at your auto parts store are not going to work on laminated windows.
And so once again, this story just bears out a warning for everyone out [00:50:00] there: you should apprise yourself of what types of windows you have in your vehicle so that you know which tool is going to work best if you do need to escape the car quickly.
[00:50:13] Anthony: I have one more before we jump to recalls, and I know you're like, please get to recalls, because then I know the show's almost over.
Get to recalls. I can wake up out of my nap, or I can end my hypnosis session. But no, this is a good one in weird law: Uber. What was this? Basically, let's see: a married couple can't sue Uber over severe injuries they suffered in a 2022 car accident because of a mandatory arbitration provision in the ride-sharing company's terms of use.
So what happened is, the couple claims that their 12-year-old daughter, when using Uber Eats, which is a separate service, said, okay, I click to agree to the terms of service. I'm not a lawyer, but let's play one. Can a 12-year-old agree to terms of service? Is that enforceable?
Can a 12-year-old sign a [00:51:00] contract?
[00:51:00] Michael: Apparently, according to the Superior Court in New Jersey’s Appellate Division, yes, they reversed the lower court’s ruling that essentially said, you know, this click on a screen by a 12 year old shouldn’t void the rights of the parents to pursue a lawsuit in court.
Uber is basically trying to force them into Uber sponsored arbitration based on the fact that their daughter was ordering food and accepted their terms of use.
[00:51:31] Fred: I just want to jump in and say that this is not about the food, this is about an injury that happened later that the court said too bad because your daughter ordered some chicken wings.
[00:51:42] Michael: Yeah, and this is a problem. We've talked about binding or forced arbitration on a number of occasions. Essentially, all the software that you're using on your phone, there are going to be terms of use that pop up occasionally. I think I have one every week from Samsung, who [00:52:00] manufactures my cell phone, and it basically says, click here to accept our new terms of use.
And if I touch my screen in any way whatsoever, it will deem that I've accepted their terms of use, which, I'm sure, buried 83 pages down, include an arbitration agreement of some sort. I don't understand how we've come to a point in America where companies are allowed to bind individuals to this type of thing with only the very briefest of interactions.
They're not having to explain the terms of use to the consumer. In fact, if I wanted to keep up with Samsung's terms of use, I would have to read a very long document every week just so that I was comfortable clicking the button. And if I don't click the button, then guess what?
I don't know that I can use the service. So it's a contract of adhesion, and even beyond being a contract of adhesion, where Uber has really all the [00:53:00] power to enforce it, it's also just a very sneaky, underhanded way of getting consumers to agree to contractual provisions.
There's no bargaining taking place there. You're essentially saying, if you use our service, we're going to take away your civil rights. And it's something that I think should be eliminated in every industry in America, bar none. It's one of the ways that companies are escaping punishment for bad actions, and it's really damaging the rights of injured parties in cars.
[00:53:40] Anthony: Alright, one star review there. To the Appellate Court of New Jersey.
[00:53:45] Michael: Yeah, that was a bad decision.
[00:53:47] Anthony: Ah, let's see what happens as it goes up to the next level. What's after the Appellate Court of New Jersey? Is it the Supreme Court of New Jersey?
Or is it the Supreme Court of the U.S.?
[00:53:54] Michael: I assume it's the New Jersey Supreme Court.
[00:53:56] Anthony: Alright, let's see how they are.
[00:53:59] Fred: I thought it was Tony [00:54:00] Soprano.
[00:54:00] Anthony: Ayo! Hey, let's go into recalls, and you're coming out, and in 10, 9, you're clear, you understand where you're located, and at the snap of my fingers, your hypnosis is done.
Alright, recall roundup. First one, a rare entrant: Toyota, 42,199 vehicles, the 2023-2024 Toyota Corolla Cross Hybrid. Oh, a problem with their electronic control unit. Skid control ECU. Oh, the brake actuator and the skid control and the regenerative brake system. There's a possibility that brake fluid pressure may not be controlled as designed in a limited situation.
Michael translate this.
[00:54:48] Michael: So there's an electronic control unit that's embedded in the skid control system of these vehicles. Basically, when you hit the brakes, it's going to prevent you from locking the [00:55:00] brakes and sliding to a stop, versus pumping the brakes. There's software that's controlling the actuation of the brakes so that it makes slowing down quickly safer, particularly in rain or other bad conditions.
And in this case, the software is not working properly. It's not properly controlling the brake fluid pressure when you're in those situations, and it makes it harder to push the pedal, it reduces the braking force, and it increases the stopping distance, thereby increasing the risk of a crash in those situations.
So those vehicles have to go back to the Toyota dealer and have their skid control electronic control unit software updated. It looks like that's going to start in early to mid November for owners of these, what is it, a Corolla Cross Hybrid.
[00:55:54] Anthony: Yes. Why couldn't they just call it anti-lock brakes? Skid control. It just seems [00:56:00] embarrassing.
[00:56:01] Michael: I think it's part of the anti-lock brake system. I think it may also be part of the electronic stability control system. Either way, it's broke and they need to fix it.
[00:56:11] Anthony: Yes. Get that fixed. Next up, Chrysler, not a surprise: 15,835 vehicles, the 2017 to 2020 Fiat/Abarth 124 Spider, and the issue here is the airbag control module.
[00:56:29] Michael: Yeah. Most airbags these days have dual-stage deployments instead of a single-stage deployment. A single-stage deployment means that when an airbag is actuated, it deploys one way, at one force, every time.
One of the reasons that dual-stage airbags are more important is because they can deploy at [00:57:00] a lower level or a higher level, maybe even multiple levels, depending on the driver's weight and the driver's positioning in relation to the airbag. And when you're in higher-speed crashes, the airbags can deploy more forcefully to ensure that they're there to protect you, because things move a lot faster in higher-speed crashes.
And in this case, the airbag control modules in the Spider vehicles are commanding the dual-stage airbag deployment, the most powerful frontal airbag deployment, and that could injure passengers who are smaller or who are closer to the steering wheel, closer to the airbag. And it's doing that despite the fact that the single-stage deployment, the lesser of the two forces, should have been used. So that's a pretty significant recall. It's only about 15,000 vehicles affected, and they're all from 2017 to 2020. I [00:58:00] really think folks who have these vehicles should be on the lookout in early November when this recall comes out.
[00:58:07] Anthony: So throughout this recall, though, it mentions Mazda a lot, that Mazda notified Stellantis, which is Chrysler. Did Mazda make these cars and they just rebadged them? Or did Mazda make the airbag? What's going on? It's confusing.
[00:58:21] Michael: Sometimes you'll see that; occasionally there are manufacturers who cooperate in designing or in building certain vehicles.
I'm not sure what the provenance of the Fiat Spiders here is, but it appears that Mazda is involved in either the design, or they must've been teaming up somehow on the cars. Sometimes you'll see companies where Mazda puts out the B2000 pickup and Ford puts out the Ranger. They're the same vehicle in most respects other than their styling, but you'll see the companies issue recalls on behalf of one another in those circumstances.
[00:58:58] Anthony: I got it. [00:59:00] Next up, Mercedes-Benz, 27,190 vehicles: the 2021-2023 Mercedes-Maybach S580, which I think Fred has, and the regular Mercedes-Benz S580, which is just too poor for Fred. Hardware failure in the Camtronic system. Oh, I love a good Camtronic system. And injected fuel might be erroneously increased by the lambda control system.
They're just making shit up now.
[00:59:30] Michael: What's essentially happening is, in the event of an independent hardware failure, there's a hardware and software issue here where, for some reason, the fuel injector is increasing the amount of fuel that's injected significantly during the cylinder deactivation process.
And so what happens ultimately is you've got more gas going in, and the vehicle is not expecting that, and the exhaust temperatures [01:00:00] coming out the other end are so hot that they are going to damage the surrounding components, which could be engine wiring harnesses, catalytic converters, and you can lose propulsion of the vehicle, and there's also a risk of fire.
I believe that the fix for this is reprogramming the software in the engine control unit, and that'll be available later in November.
[01:00:24] Anthony: All right. Up next, again Chrysler: 336 vehicles, the 2024 Ram ProMaster. The battery-electric version may have been produced with incorrect motor control processor software that may allow a loss of propulsion.
[01:00:44] Michael: This is a vehicle with a gross vehicle weight of 9,350 pounds, which makes me happy that Chrysler's only sold 336 of them so far. But this is [01:01:00] an incorrect software issue as well. I think all four of the issues we've discussed so far today have been software problems.
And ultimately the vehicles are going to be fixed. It's a power inverter module that is the source of the issue, and essentially you're going to lose propulsion. As much as I'd like to see these giant heavy trucks lose propulsion, it's still a safety issue, and I don't want anything to happen to you. So please get it into your dealer. It looks like this recall is starting very soon, and owners may be notified in the next week or so.
[01:01:38] Anthony: Yeah.
[01:01:39] Fred: While you are at your dealer getting that checked, you might as well turn in the truck and get something a lot lighter.
[01:01:47] Anthony: Two trucks. How's that? Two trucks? They'll weigh less than that one truck. Yeah, but it won't say ProMaster. And [01:02:00] last up, General Motors: 18,325 vehicles, the 2013-2019 Chevrolet Express Cutaway. And apparently the cutaway on these cars could cut the brake lines. Is that right? Yeah. If the brake lines contact the body mounts, the brake line may experience wear or damage.
[01:02:20] Michael: Yeah. If you don't know what a cutaway vehicle is, it's basically, it looks like a cargo van that has been chopped off right behind the driver, and there's just a chassis there. GM and Chevy sell these to people who are going to put, not trailers, but cabs and other structures on the back of it. So there's a lot of things you can do on the back of one of these. If you look at most U-Haul vans, they started their life as a cutaway van. The chassis, the engine, and most of the [01:03:00] safety-related components are produced by General Motors, and then it's sold to another manufacturer who proceeds to put whatever they want on the back, typically a cargo area of some sort.
[01:03:15] Anthony: It's so weird looking. I just looked them up. Why not just get a truck? I don't get it. Anyway, I've seen the version that looks like an airport transport van. Eh, alright, that's less offensive. But some of these other ones, don't allow your children near them. Just keep away. All vans, for that matter. And on that note, hey, thanks for subscribing, telling all of your friends, and donating. That's another episode of There Auto Be A Law: Keeping My Mouth Shut.
[01:03:46] Fred: I thank you for listening.
[01:03:50] Anthony: Michael says goodbye to you. Bye
[01:03:52] Michael: bye.
[01:03:56] Fred: For more information, visit www. [01:04:00] autosafety. org.