Inactionable corporate puffery
Tesla dominates the auto safety world this week, and on this podcast that is a bad thing. First, Tesla recalls almost every car they've ever produced. Then their lawyers make the Tesla flock cry by admitting that Autopilot and Full Self Driving are nothing more than corporate puffery (at $12,000, that's gotta hurt). And finally, a study of insurance quotes shows that Tesla drivers are more likely to get into a crash than drivers of any other brand. I wonder what Earth will have to say about all of this?
This week's Tao of Fred dives into Duty of Care, and now Anthony no longer thinks everything will be better in the future.
This week's links:
- https://www.washingtonpost.com/technology/2023/12/16/tesla-autopilot-recall/
- https://www.wsj.com/business/autos/teslas-autopilot-system-sparked-lawsuits-before-the-recall-970cfe4e
- https://www.msn.com/en-us/autos/news/virginia-sheriffs-office-says-tesla-was-running-on-autopilot-moments-before-tractor-trailer-crash/ar-AA1lp6Zy
- https://jalopnik.com/tesla-drivers-have-more-car-crashes-than-anyone-else-s-1851110889
- https://www.washingtonpost.com/technology/2023/12/10/tesla-autopilot-crash/
- https://arstechnica.com/tech-policy/2023/12/tesla-fights-autopilot-false-advertising-claim-with-free-speech-argument/
- https://arstechnica.com/tech-policy/2023/12/disgraced-nikola-founder-trevor-milton-gets-4-year-sentence-for-lying-about-evs/
- https://jalopnik.com/mercedes-turquoise-automated-driving-lights-level-3-1851110043
- https://www.msn.com/en-us/autos/news/stellantis-makes-a-big-bet-on-ev-battery-swapping-in-new-deal-with-ample/ar-AA1l9r3t
- https://static.nhtsa.gov/odi/rcl/2023/RCLRPT-23V801-2565.PDF
- https://static.nhtsa.gov/odi/rcl/2023/RCLRPT-23V807-5655.PDF
- https://static.nhtsa.gov/odi/rcl/2023/RCLRPT-23V830-3081.PDF
- https://static.nhtsa.gov/odi/inv/2023/INOA-PE23023-13019.pdf
- https://www.autosafety.org/support-us/
Subscribe using your favorite podcast service:
Transcript
note: this is a machine generated transcript and may not be completely accurate. This is provided for convenience and should not be used for attribution.
Anthony: You’re listening to their auto be a law, the center for auto safety podcast with executive director, Michael Brooks, chief engineer, Fred Perkins, and hosted by me, Anthony Simonoff. For over 50 years, the center for auto safety has worked to make cars safer. I’ll
Fred: be careful.
Anthony: Okay, recording is in progress. Welcome to the Picked Last Podcast.
Michael: No. Good morning world. Is that recording locally for you? Yeah, it’s recording.
Anthony: Yeah. Hey man, we've already covered where we're recording. This is the Elon Musk happy hour. Cause it's not happy.
Michael: There’s been a lot of Tesla the past few weeks. I’m, you get tired of talking about it, but it’s just, there’s so much going on. The cyber truck is what really put it over the top for me that the autopilot stuff is continues to be interesting because they clearly haven’t done everything they can to fix their system, even with the recall.
Anthony: Yeah, so let's start with this recall. Tesla had a recall of pretty much every vehicle they've ever made, and a lot of the fanboys are like, it's not a recall, it's just a software update. Does that mean every time you get an upgrade for your phone, your phone's been recalled?
Michael: Yeah, that’s something that they just have to get over.
The federal standards on this kind of thing were written in the 60s and early 70s. A recall does not signify that the vehicle is being brought into the dealership to be repaired. That's what happens in a lot of circumstances, but a recall is simply the phrase used to note when a vehicle has a safety defect, the manufacturer is admitting to it, and there's going to be some sort of remedy, whether that remedy is over-the-air or not. I see a lot of knickers in bunches among the Tesla fanboys, who say it's not a recall because we're getting an over-the-air update, it's going to be fixed, and there's no actual point where the vehicle is brought back to the dealer. It's a recall because federal regulations say that's what a recall is: when there's a safety defect that requires a remedy, over-the-air update or not, it's a recall. So everybody should just relax on that one and let the word recall mean a little more than you think it does.
Anthony: Yeah. If NHTSA calls it a recall, and even Tesla calls it a recall, it's a recall, whether you like it or not. So what's the problem here? They had to update 2 million cars because, I think, they weren't giving drivers warnings that, hey, you're in Autopilot or Full Self Driving mode, but you've got to keep your hands on the steering wheel. That's what's happening.
Michael: Yeah. If you look at the Part 573 report, the remedy is basically putting additional restrictions around where and when you can enable Autopilot. One of the big problems here is that we don't think they're doing the "where" right. You're only supposed to be using this on controlled access highways, like the interstates, where there's no crossing traffic and no heavy trucks pulling out into the middle of the road where you could run under them with a Tesla not seeing them, as we've seen in a number of circumstances. But instead of doing geofencing, which we've discussed before, and which would basically mean getting a map out, figuring out where the controlled access highways are, and preventing Autopilot or Full Self Driving or any of the other junk from being used on any road but controlled access highways, Tesla is once again turning to the little camera and computer system in the car and saying the AI will figure out where we are and when we can turn this on. And that just doesn't work. GM and Ford are doing it using maps and locations. We know where the controlled access highways are in America; we don't need your car trying to figure it out and enabling Autopilot anyway. Just use geofencing and you solve that problem. It seems Tesla's being very resistant to that idea. But in the five-seven-three...
Fred: Excuse me, let me jump in. I want to remind the listeners what the problem is. The problem is that several people have been killed in Teslas because Autopilot, the brand name Autopilot, was engaged while they were driving on roads for which it was not designed. Now, a problem arises because in the advertising and promotional material associated with the cars, they are shown to be self driving, and the name itself implies it: the name Autopilot implies that the car is on autopilot. Yet there is fine print on the website and in the owner's documentation that says the system should only be engaged on limited access highways where certain conditions prevail: there is no cross traffic, access is controlled, et cetera. And what was found by both the NTSB and NHTSA is that people are frequently using this system on roads for which it was not designed.
So that’s the background behind what Michael’s explanation was.
Anthony: Yep, my favorite description of both Autopilot and Full Self Driving comes directly from Tesla itself, because we have a link to an article in the Wall Street Journal: Tesla faces at least a dozen lawsuits in the U.S. related to their driver assistance systems, that's Autopilot and Full Self Driving, and Tesla's lawyers have previously argued in court filings that its statements about its driver assistance technology are "legally protected forecasts, truthful opinions, or inactionable corporate puffery." Basically, the company is saying, hey, thanks for giving us 12 grand for garbage. Your own attorneys are saying this is nonsense; all the stuff we say about this being self driving, this being Autopilot, is just nonsense we're throwing out into the ether.
Michael: Yeah, "inactionable corporate puffery" is a great phrase, and it basically means we're blowing smoke up your ass and you can't do anything about it, which is what we're seeing.
Really? They’re basically saying on one hand. Marketing and, the legions of people who follow Tesla and their supporters are claiming that this is the next great thing. This is the next great biscuit, and we’re going to tell it to everybody. And what’s ultimately happening is, when people believe that they’ve got these advanced driving capabilities and only work some of the time and you allow them to turn it on whenever they want and outside of situations where it design, then You run into a big problem and they’ve run into this problem and when that problem occurs and you go to court, Tesla says we didn’t design it to do any of those things.
Those are, our users are using these vehicles. Abilities outside of areas or in conditions where we didn’t design them and we warn them not to so yes, the, their puffery is getting in the way of the areas where people are allowed to access. If you’re going to give people those types of tech to use in their vehicles, you have to put restrictions on it to limit it to where it’s actually been tested and can be proven to be used safely.
And if you don’t, then you may be on the hook as Tesla, hopefully soon as fine soon finds out.
Fred: Religion at least gives you a placebo effect. Folks, you need to be aware that even though there's AI in the cars, it's not sophisticated enough to benefit from the placebo effect. So don't squander your money on worthless promotions that a company offers to make you feel good. That's what this one does, and unfortunately, instead of making you feel good, it can kill you.
Anthony: Yeah, so a lot of people are under the impression, and not wrongly, that, hey, Tesla's creating self driving cars, because they have something called Full Self Driving; they're the only ones doing autonomous vehicles; they're at the head of the pack. And I want to take a different tack: I don't think they have any involvement at all in self driving cars or autonomous vehicles.
Michael: They don't. They can say it as much as they want, but in every legal filing they've ever made, they're calling it Level 2, which as we know is not anywhere near Level 4, which is where you really need to be before you start to say, hey, we've got a self driving car, or we've got a robotaxi. You could say we've got a bad, half-functional robotaxi, and you might be correct. But you're never going to be able to claim Level 4, particularly when you've got humans intervening in any way whatsoever. It's a very complex area, it's confusing, and Tesla has expertly used that to their advantage to sell a lot of vehicles that can't do what their marketing really leads consumers to believe they can.
Anthony: It’s interesting because I’m no lawyer, but it sounds like fraud to me, or at least it sounds like it’s, it sounds kinda like bait and switch, or some unethical behavior, saying, hey, this is what we’re selling. They’ll put it in their SEC filing saying we’re doing this, but their own lawyers say this is inactionable corporate puffery and they get away with it, whereas on the other hand, you have a company like Nikolai, Nikola where there, the Trevor, who ran that company, it did the same thing.
He’s hey, look, I have this autonomous truck. And he produced a video saying, look at it work. And it was just a, a car truck with a, the handbrake off and it rolled down a hill. And now he’s been sentenced for committing fraud and getting away with it. So is it just that? Like where’s that weird line difference Elon gets away with it, whereas this guy didn’t get away with it.
It was the extent of the lie that he didn’t produce anything? I don’t
Michael: I don't know. Here's where I think the line is, and it's not a very clear line, but it's the Theranos or Elizabeth Holmes line, where you built a box and what's going on inside of that box is nothing like what you said it is. That's similar to the Nikola case, where basically they were saying, yeah, we have this prototype and it can do all of this, when there's no such prototype; versus Tesla, where they're saying, hey, we've got this vehicle and it can do all these things.
It just doesn’t do them very well at all. There’s actually, a system there in the Tesla vehicles. It’s intended to, allow the vehicle to drive itself somewhat. We know they have modes like summon where the vehicle is going to drive itself across the parking lot to you. There are situations where the vehicle is doing what’s, it matches up with some of their claims, whereas in Nikola and in some of the other.
Fraud situations. There was never such a prototype in existence. There was never any piece of technology that even remotely came close to what the claims were. Here, at least, they’ve done a half assed job of it with Tesla. I think that’s, for me, looking at all those situations, that would be the defining line.
And it’s, it’s what blurring those lines is what Tesla’s been really good at as they built this business.
Anthony: So boys and girls, if you're going to create BS, at least build a bad prototype.
Michael: Yeah, at least do a half-assed job.
Anthony: Yeah, exactly. And then you can get away with whatever you want.
Fred: I want to talk about Nikola for just a second. I went to their website, where they put up a response after the conviction of their former CEO; four years in jail, I think, something like that. And they wrote, quote: "Nikola has a strong foundation and is in the process of achieving our mission to decarbonize the trucking industry, which is our focus," Nikola's statement said. What's really interesting about that is that their electric trucks will have less thermodynamic efficiency, and actually produce more carbon, than the diesel trucks they're claiming to replace. A large diesel engine in a truck generally gets about 55 percent overall thermodynamic efficiency, meaning the energy in the fuel converted into torque out of the engine. Typical power plants only get about 30 to 35 percent efficiency, and with a cogeneration system you can get up to 40, maybe 42 percent. So the actual large diesels they're trying to replace are much more efficient, and will put less carbon in the air, than these electric trucks that Nikola touts as decarbonizing the planet. This whole business is built on a lie. And yes, in the future, when we're all solar and everything is great and bluebirds fly everywhere, they will contribute to decarbonizing the world, but that future's a long way off, folks. This reminds me of my teacher, who once said: if there's a gold rush, sell shovels. And Nikola is really shoveling it here.
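Fred's argument is easy to check as a chain of efficiencies multiplied together. The sketch below uses assumed round numbers in the ranges he cites; it illustrates the reasoning, not measured data for any truck or grid, and the comparison only holds while the charging power comes from fossil-fired plants.

```python
# Back-of-the-envelope version of Fred's efficiency argument. Every
# number below is an assumed round figure for illustration, not measured
# data, and the comparison only holds when the grid power is fossil-fired.

def chain_efficiency(*stages: float) -> float:
    """Multiply per-stage efficiencies into one fuel-to-wheel figure."""
    result = 1.0
    for stage in stages:
        result *= stage
    return result

# Diesel truck: the big engine burns the fuel on board (Fred's ~55%).
diesel = chain_efficiency(0.55)

# Battery-electric truck on a fossil grid: power plant (~35%), grid
# transmission (~95%), battery charging (~90%), motor and drivetrain (~90%).
grid_ev = chain_efficiency(0.35, 0.95, 0.90, 0.90)

print(f"diesel fuel-to-wheel:  {diesel:.0%}")   # ~55%
print(f"grid EV fuel-to-wheel: {grid_ev:.0%}")  # ~27%
```

Swap a cleaner grid into the first stage and the chain tips the other way, which is exactly Fred's "when we're all solar" caveat.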
Anthony: Yeah, Nikola is an amazing company in that it was complete fiction. They raised billions of dollars from the public market on this complete fiction, sent one guy off to jail, and everyone else, the company, keeps the money and starts building something now. I wish them well; who knows what's going to happen with them. So let's jump back into Tesla, Autopilot, and geofencing. I think we'll hit that real quick again, because there are similar products from Ford and GM
Where it’s this system will only work on these roads at this time. If you try to engage it somewhere else, it won’t work. And now even Mercedes, they got, they have approval for level three system, which says, Hey, you don’t have to pay attention. You can go ahead and use this system on specific roads that are geofenced and mapped in Nevada and California.
And now it’s kinda neat, I think, that the Mercedes, when it’s in this level three mode, it has these exterior lights in turquoise, letting other drivers know that, hey, this car is literally driving itself, and it lets the police know, hey, this person might be watching a movie, but the car’s doing what it does instead.
I like that. And let pedestrians know
Fred: they should run like the wind.
Anthony: It’s good. This is really good that like you’re warning other drivers, Hey, you’re partaking in I want to say quite, quite a beta test on this because Mercedes has had the system in Germany running for a couple of years.
Like they, they’re not quite being like, I don’t know, just throw some things up and we’ll always just do a software update and fix it when we kill enough. iT’s
Michael: It's very basic, right? You're in traffic, you're under X speed, you're basically in bumper-to-bumper traffic following the vehicle in front of you. I think that even for the software to engage, it requires that you be behind a vehicle; you have to be the following vehicle. So it's not exactly astrophysics-type planning. You have to follow the vehicle in front of you. But... I completely lost my point. Somebody take it away and I'll come back to it.
Fred: No, but where this falls apart is that these systems that have been qualified, and you talked about Ford, GM, and Mercedes, are all based on high definition maps of the areas where they're allowed to be used, right? That was one of the key findings about the quality problems with Tesla. The problem is that most of the events that will challenge you are not on the maps. Collisions that occur on the road are not on the maps. Police by the side of the road are not on the maps. Floods, potholes, trucks crossing: those are not on the maps. So be cautious, folks, and mind those turquoise lights.
Michael: And that’s my point was about the turquoise lights and you look at That is that’s not just a warning to people to many folks on the roads And I’m thinking rush hour folks who don’t seem to be ever content with the lane they’re in You’re basically giving them a beacon that says, come cut me off because my car is going to slow down and let you in.
And, it’s going to be submissive to your advances versus actively resisting your ability to cut in line or do whatever. So there’s some. Some, I believe some folks who are going to take advantage of those warnings just like we’ve seen a lot of people take advantage of autonomous vehicles and cut them off and use the tech to, to do things that they wouldn’t really consider if there was a human operating the vehicle at that moment.
Anthony: That’s an interesting take. Let’s jump back to Tesla again, because we’re in this thread. And Elon always claims, he’s Hey, if you want to have the safest car on planet Earth, buy our car. If you don’t want to have a safe car and be dangerous, buy somebody else’s car. And there’s this myth that he’s created, basically, and all of his army has said, Yes, Tesla’s, everyone’s the safest thing.
With autopilot engaged, it’s safer. With full self driving, it’s safer. We’re saving lives. We’re doing all this, saving lives. They quote this one study that some company had access to cell phone data as this this consulting firm. They said, Oh, look, it’s got some better data here. But that is really dubious at best.
It’s everyone’s had a bunch of questions around really where the data come from. So they’ve got their data.
Michael: They’re just not showing it to
Anthony: us, right? Tesla has the data. Yeah, they won’t
Michael: share all of this data. They will not share. Any data about that’s that underlies their claims about how safe autopilot or full self driving is because it’s all part of the myth.
They’re trying to create that the computer is better than us as drivers. But ultimately, and just like we’ve talked with the self driving companies and crews who did a very similar thing, they were trying to be Tesla light at level four. They just won’t show us the data and ultimately, if you’re, if you’ve got it and it proves all this out and it bears out your hypothesis that these vehicles are safer than human drivers, why aren’t you showing it?
Anthony: Yeah. So who do we turn to when these companies won't do it? We turn to the insurance industry, and in this case, LendingTree. Per an article in Jalopnik, a new stat has emerged to set Tesla bros alight on social media platforms around the world: Tesla drivers are involved in crashes more than the drivers of any other car brand. For its analysis, LendingTree analyzed tens of millions of insurance quotes involving 30 different automakers between November 14, 2022 and November 14, 2023. The study revealed that Tesla drivers have 23.54 accidents per 1,000 drivers. The EV maker is followed by Ram at 22.76 and Subaru at 20.90. And that's mainly because Subaru drivers' Birkenstocks get stuck underneath the accelerator pedal and they can't undo it. Is that right, Mr. Perkins?
Fred: Yeah, that's happened to me,
Michael: many times.
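For reference, a rate like LendingTree's 23.54 is plain arithmetic, incidents scaled per 1,000 drivers. The counts in this sketch are invented solely to show the calculation; LendingTree has not published its underlying numbers here.

```python
# How a per-1,000-driver figure like LendingTree's 23.54 is computed.
# The raw counts below are invented purely to reproduce the arithmetic;
# LendingTree has not published its underlying numbers here.

def incidents_per_1000(incidents: int, drivers: int) -> float:
    """Scale a raw incident count to a rate per 1,000 drivers."""
    return incidents / drivers * 1000

# Hypothetical sample: 23,540 accidents among 1,000,000 quoted drivers.
print(incidents_per_1000(23_540, 1_000_000))  # 23.54, the cited Tesla rate
```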
Anthony: And the Washington Post has done an extensive dive into the fact that people using Autopilot are bad drivers.
They’re using it in places they shouldn’t use it and it’s unsafe.
Michael: The LendingTree data doesn't seem to differentiate between use in Autopilot or Full Self Driving mode versus just a Tesla driver going down the road. But their findings are in stark contrast to what Tesla has been promoting, which is that their drivers and their vehicles are safer than anything out there, without backing it up with any data. LendingTree has data, they've released it, and it says the opposite. So you would think at some point it's incumbent on Tesla to actually show us where they're getting this. At what point does inactionable corporate puffery become a fraud upon the citizens of the United States, who are relying on your statements to purchase a vehicle that might endanger them more than another one? But that all remains to be seen in courts across the country over the next few years, I think.
Anthony: If you’re in a Tesla, on a positive note, it does really well in crashes.
You can drive it off a cliff. Yeah,
Michael: You’ve got great crash worthiness there, although, the caveat is that, there are now, what, 15, 20 Cybertrucks on America’s roads, which probably don’t have the greatest crash worthiness. But
Fred: that remains to be seen as well. Do not drive your Tesla off a cliff.
Bad Anthony, bad.
Anthony: I was not suggesting people do that. Just push it off a cliff instead, and be like, oh, this was a dumb choice. Why do I keep getting picked last? You know why you keep getting picked last? Because you haven't donated to the Center for Auto Safety. People, it's getting towards the end of the year; you've got to get in those tax deductions. Or don't do it for a tax deduction; do it because you enjoy the dulcet tones of Fred Perkins, you enjoy the legal analysis of Michael Brooks, or just because you're like, eh, they make fun of Elon, and he sucks. Okay, that's a weird reason to donate, but hey, any reason to donate is a good reason, right? autosafety.org, click the red donate button. That's how we...
Fred: Do we have people collecting donations at the doors to Piggly Wiggly? Oh my God.
Michael: AI and automation impact lots of parts of our lives now, and my understanding is that there are now kiosks that have replaced the bell ringers, kiosks that let you submit a donation without any of the real holiday cheer that used to take place. They've replaced the bell ringers with a machine.
Anthony: Wait, the Salvation Army, they have the bell ringers, people dressed up as Santa, and now there's just a kiosk?
Michael: I believe they're rolling those out.
Fred: That’s just sad. Do they at least dress up the kiosks? Like Santa?
Michael: Oh, that’s what I wondering.
That’s a point is the kiosk ring its own little bell and that’s certainly not as charming.
Anthony: Can I sit in the kiosk's lap? All right, last Tesla story. Okay, this is just too funny. So we talked about Full Self Driving and Autopilot. The state of California is like, this is false advertising, you can't do that, and Tesla's response is, you're violating my First Amendment rights to lie and talk bullshit.
Fascinating little story there, but I'm sure...
Michael: Yeah, there are a lot of stupid interpretations of what the First Amendment means these days. I've seen some stupid interpretations of the 14th Amendment the last couple of days, too. But we're going to have to reckon with this. There's a big difference between an individual's right to free speech and a corporation's right to bullshit you into buying their vehicles until one kills you. There's got to be a reckoning here, and I don't think that Elon Musk is on the right side of this one. I also don't think that Alex Jones should be back on Twitter and given a platform to spew some of the actionable puffery that he continues to spew. So I don't know; I think we need some better arbiters of the First Amendment than Elon and his flock of lawyers.
Anthony: All right let’s jump to a different topic now. No more Tesla, at least that I’m aware of. Hey guys, what’s the number one resistance for people buying EVs? What’s always the thing, the top of their list?
Fred range envy. Yes.
Michael: The top of the list for EV owners has got to be, well, almost all the issues that we're seeing are around range: range when you're towing, or in the cold, or on a vacation, when you want to go a thousand miles and you don't want to have to charge three or four different times. So it's a lot about that. That's where gas and ICE vehicles really show their stuff: you pull into a gas station and you're out in 10 minutes, and you go another 300 to 500 miles, depending on what kind of fuel economy you have.
Anthony: Ten minutes? Really? It takes me that long just to figure out which kind of candy I want to buy inside the mart. But hey, you're right,
it is range. And we've talked about this before, and now Stellantis. What's Stellantis? You ever hear of Chrysler? They're now owned by Stellantis. Stellantis is becoming one of the first Western automakers to embrace battery swapping technology. That's right, you pull your car into a location, you put the battery in a bowl, you hang out for a bit, then you grab a new battery and leave. Wait, no, that's not what we're doing here. Stellantis is betting that the EV charging infrastructure in Europe and the U.S. will remain a barrier to adoption in the near future, necessitating other solutions. Battery swapping could theoretically help EV owners power up and get moving without having to wait for long stretches at a charging station. We've talked about this before: your battery's low, you pull into a place, boom, battery drops out, new battery pops in. You don't have to sit there, plug in, wait for an hour, and be like, where's the nearest Piggly Wiggly?
Fred: No, you don't have to; you just have to sit in line behind five other cars that are also waiting to get their batteries replaced, because it's Christmas Eve and they've run out of charged batteries.
I think this is a dumb idea whose time has come. And, dad joke alert: if you elect to not buy an electric vehicle because of your concern about how far it can go, are you deranged?
Anthony: End of joke.
Michael: I think it's going to be really difficult to get this idea up to scale, mainly because, and I don't know what Stellantis is doing, but I think ultimately, to make it work, you're going to need a lot more automakers to buy in, and they're making money producing their own batteries. Why would they want to move to a system where batteries are more standardized? All cars can pull into a gas pump and find the fuel they need there, but pulling into a facility and finding a battery that's specific to your make and model is going to be a lot more difficult than pulling into one and finding a battery that's been standardized to work on every vehicle.
So the way things are going now, you've got certain manufacturers making advances toward solid state battery technology; they're putting a lot of money into researching and getting those out, and they're going to want to recoup that investment. It's going to be much more difficult for them to do so if there is no standardization, and if there is standardization and we have battery swapping going on, I don't think they're going to be able to recoup as much of that as they would otherwise. So I think it's good that Stellantis is going this way, and I've been in favor of this idea, primarily because it would allow people who are only using their vehicle for 50 miles a day, to get to work and back, to change the size of the battery in their vehicle and thereby change the weight of the vehicle, which would be a big positive in crashes. We don't all need to be carrying around a battery that gets 300 to 400 miles of range all the time. But ultimately, scaling that solution to where it becomes the standard in the United States, while we're at the same time rolling out supercharger networks across the country that take a different tack at solving the problem... I wish Stellantis the best, but I don't know that it's ultimately going to be successful.
Fred: There is a Chinese company named NIO that has already put this kind of technology in place in Europe; I'm not sure how well they're doing. But I want to remind our listeners that there's a technology called the plug-in hybrid that has very good thermodynamic efficiency, is less expensive than battery powered vehicles, and has unlimited range. Just a thought, folks, when you're pulling out your wallet to buy your next car.
Anthony: All right. But what if I want a big fat battery so I can get my Hummer up to race car speed and hurl 9,000 pounds down the highway with zero training? What if I want to do that?
Fred: Then, in that case, bless you, merry gentleman, go on your way.
Michael: Freedom of choice, Anthony. You can buy that right now.
Anthony: This is a real question. The Cybertruck and the Hummer: the Cybertruck will go 0 to 60 in 2.2 seconds, which is the same sort of acceleration as an F1 car. But an F1 car is designed for that. It has massive safety features, and it's driven by people who've spent their entire lives driving those cars, with tons and tons of training, and an entire team watching thousands of points of telemetry. So I understand how F1 cars can do this. Why is it allowed? Why can any schmoe on the street buy a car with that type of power?
Michael: That's nuts. There are no laws or regulations preventing the deployment of those anywhere.
Anthony: Ah, there ought to be a law! Ha! Come on, nothing? Come on, guys.
Michael: guys. How much do F1 cars weigh, by the way? Aren’t they trying to make them lighter and faster, right?
Anthony: Yeah, they weigh a thousand kilograms,
Michael: So they’re coming in at about a fourth of the size of a. And they’re hybrid, right?
Yeah, and
Anthony: they’re hybrids. Yeah. They’re they have big batteries. Their engines are getting smaller, their batteries are getting bigger. Actually, wait, I think, no, I think they weigh a thousand pounds, not a thousand kilograms.
Michael: I don’t know, it would be, you’ve got to think it’s going to be much safer if we just allow, made F1 street legal and gave people that option over the hummers, right?
Anthony: Those cars are pretty safe. I've seen some crash test footage, and the drivers survive. But...
Fred: Yeah, I want to get back to the basic problem here, which is that horsepower is a really poor way to decide which vehicle you're going to get, because any vehicle can generate tremendous amounts of horsepower. The question is for how long and under what circumstances. You've got a container ship crossing the ocean generating a thousand horsepower for several weeks at a time, for 20 years. But the car you're buying that generates a thousand horsepower can do it for maybe a second, and during that one second your life will probably flash before your eyes, because you haven't been trained to do this, you're likely spinning the tires, and the car is going out of control very rapidly. So you're really squandering money if you use horsepower as a way of deciding which vehicle to buy. You should look at the whole spectrum of how this horsepower is going to be used and for what purpose. If you can safely maneuver your vehicle onto a controlled access highway with 100 horsepower, why do you really need 500 horsepower? If you can get a speeding ticket with your 125 horsepower pickup truck, why in the world do you want 750 horsepower? It's really dumb.
Anthony: Because I'm a little man and I want to prove to people I'm big. I said: because I'm a little man and I want to prove to people that I'm big.
Okay? What would you say? Would you say it's the responsibility of a person to take all reasonable measures necessary to prevent activities that can result in harm to other individuals and property? Wait, is it time for the Tao of Fred?
Fred: You've now entered the Tao of Fred: Duty of Care.
I love this one, actually, because I don't know what the hell I'm talking about for the most part. But happily, we've got Michael in the conversation to talk about it. Duty of care is a legal construct in tort law, Michael, that says you can't just argue that you didn't know what was coming: if you're engaged in an activity, you have to assume a certain amount of responsibility, and there's no way to weasel your way out of it. Is that what duty of care is all about, Michael?
Michael: Yeah, it's something that we really wish the folks over at Tesla would think about a little more when allowing people to turn on these systems.
Anthony: You just think about the word duty. Tesla just thinks about the word duty, and they think, oh, can we have a duty button? A duty mode? Huh. I think they've lost friends too often.
Fred: Friends used to be big on duty as well. Okay, duty of care sounds good, but there's really a big problem with duty of care in self driving vehicles, or autonomous vehicles.
In military and sophisticated commercial systems, there is something called bidirectional traceability. By definition, it's a two-way relationship between two artifacts: for example, the relationship between a requirement and the design of a given software component, and the relationship between that design and the requirement it satisfies. All right, yada yada. What this really means is that if your self driving vehicle is going to turn left, that behavior should be traceable to a requirement that says the vehicle will turn left under certain circumstances. And if you have a requirement that says it should turn left, then you should be able to run a test showing it really does turn left in those circumstances, right? So it's bidirectional: a cause should have an effect, and an effect should be traceable to a cause. The problem that arises in these vehicles is that they all use artificial intelligence, and artificial intelligence is a black box. It is not traceable. So there is no way you can use traditional software qualification techniques to say, hey, I have a requirement, and that's going to cause this thing to happen.
Similarly, when something happens, I can't trace it back to a line in the software that says it should happen. So the only way to determine whether your vehicle actually conforms to the requirements, in other words, to validate the software you've got, is to run a statistical test and look at the statistics to make sure it meets the reliability and performance requirements you want. So why not just do this? Because in order to demonstrate anything like 99 percent reliability that the software saying "turn left" will actually result in the vehicle turning left, you've got to do hundreds of tests, and you've got to do them in a very controlled environment. If you change anything in that controlled environment during the test, you've got to do the test again, right? You can't really fudge the test. Nobody does this, folks. The world doesn't have enough money to do this.
And if you think of something like the Cruise event in San Francisco, where the unfortunate woman was run over by the Cruise vehicle, imagine what it would take to run that test hundreds of times to generate a response that says: my software responded appropriately to this condition. It's virtually impossible. And that's only one of the millions of circumstances that need to be validated in the development of the vehicle. So if you're going to do this bidirectional traceability, you've got a real problem. There's not enough money in the world to do it for any individual vehicle. And if you change the software, that restarts the clock; you've got to do it all over again, right? So you do the test, you make a little tweak, it operates better the next time. Great, you've restarted the clock, and you've got to do all those hundreds of tests all over again.
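A minimal sketch of the statistics Fred is describing is the classic zero-failure (success-run) test: if a system passes n independent trials with no failures, you can claim reliability R at confidence C only when n >= ln(1 - C) / ln(R). The code below just evaluates that formula; the specific inputs are illustrative.

```python
import math

# The classic zero-failure (success-run) test: if a system passes n
# independent trials with no failures, you can claim reliability R with
# confidence C only when n >= ln(1 - C) / ln(R). Per Fred's point, any
# change to the software or the test environment resets n to zero.

def tests_required(reliability: float, confidence: float) -> int:
    """Zero-failure trials needed to demonstrate R at confidence C."""
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

print(tests_required(0.99, 0.95))  # 299 trials: 99% reliability, 95% confidence
print(tests_required(0.95, 0.50))  # 14 trials: the ARC-style 95%/50% claim
```

With these numbers, Fred's "hundreds of tests" for 99 percent reliability at high confidence lands at roughly 300 controlled trials per requirement, before any software change resets the count to zero.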
Anthony: Wait, so real quick. So if anything changes, even if you're driving down the road and one of your sensors gets dirty, one of your many cameras gets occluded somehow, does that mean...
Fred: Yeah, that invalidates the test.
Anthony: Oh, my God.
Fred: So why is this a problem?
Actuaries are the people in the insurance industry who figure out what the risk is, right? How in the world can an actuary, or the insurance industry, figure out what the risk is, and consequently what the insurance coverage should be, if it's literally impossible to run the series of tests required to validate the software's behavior through this black box, which is the artificial intelligence? It's simply not possible.
The world will never be able to look at any individual AV and say that under all circumstances it has even a 90 percent probability of doing the correct thing with 95 percent confidence. There's not enough money in the world to do that. There's not enough test space in the world to do that. AVs will never be able to satisfy the criterion of bidirectional traceability between requirements and performance for any extensive set of circumstances that endanger the lives of the passengers in the car or the vulnerable road users in the vicinity of the vehicle. So this duty of care is going to be a real problem for the developers. They will fight it tooth and nail, and we on the consumer advocacy side should also fight tooth and nail for the duty of care to be assigned to self driving vehicles.
Anthony: Is there any way forward you see on this? Because there are good things that come out of AI in terms of pattern recognition, so we can see, okay, this is a traffic light and this is what it's doing, things at that level.
Fred: It's easy to say this is a traffic light, but from the engineering perspective you've got to say: I recognize that this is a traffic light with X percent reliability and Y percent confidence. Now, the confidence thing is a little bit confusing. What the confidence level says is: if I repeat this test series under the exact same circumstances, within the limits of what the test can do, what is the likelihood that I will come up with the same reliability result as I did the previous time? If you have low confidence and you run the test again, you're very unlikely to come up with the same reliability result. That's a sophisticated concept, but it's an engineering approach to determining what the risk is. Now, this discussion actually bleeds over into the ARC recall that we'll be talking about, perhaps later or maybe another day, because the statistics exist.
Engineers know how to do this, at least good engineers do. It's possible to quantify the safety of a vehicle, theoretically. Practically, it's impossible, because the bean counters rule: they will never authorize enough money to do this under any circumstances for any AV, guaranteed. So how do you quantify the risk? The only way the insurance industry is going to be able to do it is to say: all right, we'll let these suckers drive down the road and see how many people they kill, and over time we'll figure out how many people are dying and set that as the bar for what we need in terms of insurance coverage for these vehicles. That's a pretty grim approach, but there's no other approach the bean counters are going to allow on the developer side.
Anthony: That’s just Tesla’s business model.
Fred: It is Tesla’s business model. And they’re rigors upon going down that road.
Michael: Yeah, there’s one thing I’m worried, I’m wondering about here in relation to Tesla.
They tout you know, the machine learning capability of their vehicles and if you can’t trace Those learning steps. It’s, you, it doesn’t seem like you can just plug in a camera to your car and send it down the road and it’s going to learn everything. And then you’re going to put people in those cars that are safe now even at level two and three, it seems like you have to validate everything that, that computer learns, or that AI learns during that process in order to Demonstrate that the vehicle is going to operate safely.
It seems like there’s an inherent tension, between that position.
Fred: Sorry, say that again, please. You drifted off for a second.
Michael: It seems like there’s an inherent. Contradiction between the process of machine learning as it’s been advertised by some of these manufacturers in relation to vehicles and a duty of care because you can’t truly validate whether the machine has learned in some respects.
Or you can’t trace the vehicle’s, activity to a specific line of code, like you say, you’re you’re relying upon the AI to produce results, but you can’t necessarily predict or control or document those results in a safety case, for instance.
Fred: Right. That’s absolutely right. AI is great for looking at a picture of a flower, comparing it to a database of flowers and saying, okay, this is probably a periwinkle.
I have 90 percent reliability in my result, and I’ve got 50 percent confidence. That’s great, but nobody’s going to die if your periwinkle is misidentified. When ARC was on Capitol Hill, they said that their inflators had 95 percent reliability. Or, excuse me, yeah, wasn’t it? 95 percent reliability with 50 percent confidence.
Are you kidding me? What a piece of junk. And how in the world, how in the world does NHTSA let this junk go out on the highway? And yes, it will save people’s lives because more often than not it’s going to work, probably. But for crying out loud, go to MIL STD 322, everybody can download it for free, I should say, and you will find it listed in their tables for what kind of reliability and confidence you can find If you do a certain number of tests for any kind of vehicle, right?
And so you’re looking at hundreds of tests to produce 99 percent reliability and is anything like 99 percent confidence. Anthony how confident should people be about an airbag inflator? Pick
Anthony: I would say I would like to be 100 percent confident.
Fred: Oh, you’re never going to make it in the OEM business.
Michael, pick some numbers.
Michael: 99.999999 percent confidence.
Fred: Oh, you guys are failures. You're never going to make it in the automobile industry. But if I were to sit here and say 95 percent sounds pretty good, and 50 percent confidence, that is perfectly acceptable from the NHTSA perspective and the industry perspective.
And it’s it’s just not good, folks. Just not good. There’s standards that are available. Freely over this miraculous thing called the Internet that anybody at NHTSA can use. And when I looked at the NHTSA response to the ARC information that you sent out this morning, Michael, it’s interesting that somebody there read that and they were told not to cite apparently, or they were told not to cite the specific reference.
Because when they talk about the shortcomings, they’re identified by the manufacturers in the data. It goes chapter and verse to exactly what we were talking about. Maybe we’ll get to that today, maybe not. Anyway, I’m going to wrap up duty of care for now, unless you folks have got some some other comments to make.
Anthony: I think that was very helpful. It makes me afraid of living in the future. Maybe things will not be better in the future, as you say; I don't know what we're all going to do. And that right there is a perfect reason to donate to the Center for Auto Safety, because where else are you going to get that kind of information? Nowhere. Ford's not going to tell you that. GM's not going to tell you that. And Tesla? Come on, they're the greatest thing since sliced bread. But they don't have to tell you that, because they're the greatest things ever, and if you don't believe them, go fuck yourself. Hey, speaking of that, I apologize for my aggressive language.
It’s time for some recalls! This week, it’s gonna be my headlights are all messed up. We’ll start with Infinity. Certain model year That’s funny. Certain model year 2022 to 24, Infinity QX60s, equipped with adaptive front light systems, manufactured from Different dates in Smyrna, Tennessee the headlights will inadvertently, they will have an incorrect tilt value.
As a result, when this vehicle speed is over 81 miles per hour, if the headlights are in auto mode, the headlights will adjust to the greatest downward angle. Wait a second, okay, so we’re just talking about, hey, you can’t trace AI and like machine learning and whatnot. Yeah, I don’t want that in my car, but I can’t even get the headlights working correctly.
Fred: So maybe it's designed to wink at you, to let people know that you're all on the same side.
Anthony: Yeah, that we’re doing 81 miles per hour. That’s a very specific number. If I hit 81 miles per hour, my headlights fail. First
Michael: of all, people Yeah, it’s really anything over that amount,
Anthony: and you should only be driving that fast if you’re in the state of Texas, because I believe that’s the only state that allows you to go that fast.
Or if you’re an F1 driver.
Michael: Yeah. And I think you've got to go back to the dealer for this one. It's somewhat software related, but they've got to reconfigure your settings to make sure your headlights don't point downward at an extreme angle when you're going too fast. Why would that be bad? It's going to prevent you from seeing things like deer, which, as we know from your experiences, are something you definitely don't want to hit at 81 miles per hour or greater.
Fred: Surprisingly massive, yes, that’s right.
Anthony: So the next bad headlights are from the car that I own, a Lamborghini. Ha. This is amazing: potentially 7,805 vehicles. I had no idea they made that many.
Michael: Yeah, that's a lot of them. Usually a Lamborghini recall covers about 15 vehicles. But this is a lot of model years.
Anthony: This is a lot of them: 2015 to 2024, the Huracán. The affected vehicles have a function on the infotainment system, of course the infotainment system, which allows the selection of headlight aiming for driving in left-hand-traffic countries: tourist mode. Wait, what? Additionally, the adaptive front light system may have been adjusted outside of production specifications for the U.S. market. Wait, what? You get to control headlight aiming for touring? I'm not familiar with these features. I didn't read this part of the owner's manual for my Lamborghini
Fred: Huracán. We're going to see a lot more of this. This is talking about the adaptive headlights, and what the adaptive headlights do is illuminate your side of the road when there's an approaching car, but dim the lights on the other side of the road so they don't dazzle the driver of the approaching car.
Anthony: Oh, I like that.
Fred: So this feature is associated with the fact that sometimes these cars are in countries with left hand drive and sometimes in countries with right hand drive. They basically screwed up and didn't get it right.
Anthony: I got it. I like that, because too many lights these days just blind me. I don't enjoy driving at night; I think some people have really bad headlights, mainly those trucks, and I don't like it. So this is cool. Okay, I'm on board with this. Hey, fellow Lamborghini owners, I will see you at our Swiss bank account, and then we'll go drag race and go, ha, look at the poors in their Ferraris. Ah, the next recall we've got is from a little company called Ford Motor Company. Potentially 5,100-plus vehicles: the 2022 to 2023 F-150 Lightning. Does not contain actual lightning. They are equipped with a 15-inch touchscreen; that's how the recall notice starts, and I don't even want to read any more. So, Michael, what is this one?
Michael: Basically, this one's pretty simple. Every time you turn your vehicle on, the electronic stability control is supposed to come on, and in this case it is not coming on every time folks turn on the vehicle. That's a problem in any car, but particularly in a large, heavy truck, your electronic stability control is incredibly important. It was put into place partly because of the Ford Firestone rollover problems around the year 2000, and it has really helped reduce the number of rollover accidents we see every year. So it's incredibly important that it comes on every time you turn on your vehicle. This one, it looks like they're going to fix through a Ford over-the-air update, which the Lightnings are capable of receiving, so no one will have to bring it into their dealer. And it looks like this one's going to be done with your next software update, sometime in November, so it should have already taken place, and yet we're just receiving notice and just now talking about it.
Anthony: Wait, so that Ford Firestone thing you're talking about in the two thousands: was there literally any sort of organization that was really pushing to get Ford to fix their stuff?
Michael: We were really pushing hard to get that done. Clarence Ditlow, who was our executive director at the time, testified on a number of occasions before Congress about the issue, and it was a giant problem. There was a dual problem there, in that the tires were manufactured poorly and subject to sidewall damage and tread separation issues, and when you lose a tire on a top-heavy Ford Explorer, as they were designed at the time, and you don't have any type of electronic stability control, or even if you just run off the road and don't lose a tire, you can have tripping incidents that make it really easy for those vehicles to roll over. Electronic stability control, in the years since NHTSA required it, has really helped prevent a lot of those types of incidents.
Anthony: Can I make the wild leap that the Center for Auto Safety is partly responsible for this recall? Because I just did. Hey, donate.
Michael: Yeah, I think you could make that leap.
A lot of people are involved. It’s never just us, that’s for sure.
Anthony: Nah, it’s always just us. Don’t listen to
Fred: them. I do want to add that the Ford F 150 is one of these electric vehicles that has excess horsepower. And the stability control is particularly important because if you are unfortunate enough to step on the accelerator.
All the way to the floor and everything kicks in and there’s any, and there’s any tire spin, your car is going to go wildly out of control very rapidly without the electronic stability control. So this is, I don’t think this was in the notice, but it’s a, it’s another danger that’s associated with this particular vehicle and the electronic stability
Anthony: control.
So that’s another vote for getting a excessively heavy electric vehicle that will go race car speeds. Oh wait, no you’re against that. I’m confused. Last recall, Kia of America, 2, 300 vehicles. This is 2023 Kia Soles. Certain 23 mile models manufactured in April of this year. Has problem with their side curtain airbags.
They will inadvertently inflate due to a welding error in the stored gas section of the hybrid inflator; the side curtain airbag may inflate inadvertently without a deployment command. What, more welding issues with these inflators?
Michael: It sounds a lot like some of the other things we've seen, maybe with ARC and others. But here, basically, you have a welding error that makes the diffuser disc within the inflator break. When that happens, the gas is released and the side curtain airbag inflates, regardless of whether you've been in an accident or there was any signal to the inflator to deploy. So it could literally happen at any time. And if you've seen side curtain airbags inflate, you've seen that they're really going to get in the way of a driver: not only the surprise of them, but continuing to drive afterward. Hopefully the vehicle is powered down when this occurs, but it seems like a really sketchy situation, having a side curtain airbag deploy while you're trying to focus on your driving tasks. So that's something you really want to look out for.
Fred: What might not be obvious is that this is a hybrid inflator, and what that means is that the inflator is filled with compressed gas as well as the explosive components. So in this particular case, it's only the compressed gas that is at issue; it's not a question of the actual pyrotechnics firing off. It probably is a flaccid inflation: probably enough to break the seal, but kind of drooping around rather than the full inflation you would expect with a more exciting event. I'll leave the metaphors there. But anyway, that's what hybrid inflator means: it's got the compressed gas, the seal breaks, and the gas flows out through the diffuser, which does exactly what you'd think it would do; it just softens the rate of flow of the gas into the airbag.
Anthony: For those of you just tuning in, this is not the urology podcast; this is the Center for Auto Safety's podcast. And with that, that's our show, unless you gentlemen want to wish fair tidings and whatnot. We'll be back next week. Are we back? Final episode of the year,
Michael: next week. We are back. Next week we will dive a little deeper into the ARC airbag inflator recall that NHTSA is pursuing. The manufacturers came out over the last couple of days with their comments on the record, and there's a unified group of manufacturers saying: we don't agree. It's setting up for either NHTSA to back down, or NHTSA to order the manufacturers to do a recall and the manufacturers to go to court. So this one's going to take a long time to get straight; it's probably going to be years in court before we see some resolution on that one.
Merry Christmas.
Anthony: Happy New Year. Well, it's not New Year yet; we'll be back for the New Year's episode, so happy holidays, everyone. Thanks for listening, and thanks in advance for donating. Even more, thanks for donating on behalf of your friends, family, loved ones, random people on the street, people who drive Teslas, people who want to drive a Tesla, people who've never seen a Tesla, people who want to be race car drivers... I can keep doing this for a long time; the two of you are just going to let me keep doing it, aren't you?
Hey, and with that, thank you very much. Until next week,
Fred: Bye bye. Bye bye. Thank you for listening. Bye bye.
Anthony: Michael, say goodbye.
Michael: Bye.
Fred: For more information, visit www. autosafety. org.