Driving the Future: The Rocky Road of Autonomous Vehicles

Subscribe using your favorite podcast service:

Transcript

note: this is a machine-generated transcript and may not be completely accurate. It is provided for convenience and should not be used for attribution.

[00:00:00] Introduction to the Center for Auto Safety Podcast

[00:00:00] Anthony: You're listening to There Auto Be A Law, the Center for Auto Safety Podcast, with executive director Michael Brooks, chief engineer Fred Perkins, and hosted by me, Anthony Cimino. For over 50 years, the Center for Auto Safety has worked to make cars safer.

Hey listeners, you're listening to the Center for Auto Safety Podcast, the only podcast that will sue you if you don't donate.

[00:00:33] Fred: Oh, that’s pretty negative there, Anthony.

[00:00:36] Anthony: I learned it from Elon Musk.

[00:00:37] Fred: I will. Yeah, he’s got an interesting way of approaching his customers, doesn’t he? But good morning, world.

[00:00:44] Anthony: Hey, so we’re back from our second break of the year. And thankfully nothing interesting happened in the world of auto safety at all. It’s gonna be a short episode.

[00:00:53] Vacation Car Reviews: Hyundai Kona and Kia

[00:00:53] Michael: I, on vacation, I did drive a Hyundai Kona. I got a choice of all sorts of mini [00:01:00] SUVs for my drive out to Washington State.

And I picked the Hyundai Kona. I kinda liked it. It wasn’t a bad car. It didn’t catch fire. It was great.

[00:01:11] Fred: That could be a great advertising slogan for them.

[00:01:13] Michael: Yeah, didn't catch fire. I also made sure, when I got out of it, to take a little time at the airport to delete all my personal and privacy-related data from their computer system. But it worked pretty well with my little Android phone, throwing up the Android Auto maps.

And the music that my daughter has to listen to while I drive. So everything went pretty well. Despite all of the negatives on Hyundai that you'll invariably hear on this podcast, I did enjoy driving the Hyundai Kona, which, by the way, looks like a spaceship.

[00:01:50] Anthony: I had the same experience with the Kia rental car I had back in February, which Hey, I thought it was great.

That’s when I first learned about the automatic backup [00:02:00] braking system when it hit the brakes there. I thought it was a fine car. I haven’t tried their EVs. I know apparently people are really big fans of their EVs. But hey, that’s not what we’re here to talk about, our favorite rental cars.

Instead, let's start off with something nice and light and easy.

[00:02:17] The ARC Airbags Recall Controversy

[00:02:17] Anthony: ARC airbags! How's that sound? Good. All right. So NHTSA, remember ARC airbags, this is a company where they're also selling exploding airbags, potentially, maybe, who knows, huh? And NHTSA is like, hey, you've got to recall this stuff.

And they're like, we're going to sue you. And so NHTSA has confirmed its initial decision that certain frontal driver and passenger airbag inflators manufactured by ARC Automotive and Delphi Automotive Systems contain a defect related to motor vehicle safety. From NHTSA's report, it looks like they're at the phase where they're just asking for consumer input, for public feedback.

[00:02:53] Michael: They already had that phase, and that phase, I believe, was in December when they issued [00:03:00] their initial decision. This supplemental decision is something I don't think we've really ever seen before in the recall process, but essentially they're just adding on to what they said in the original decision, putting it back out for another period of comment from the public and manufacturers, and they're going to go from there.

Not a lot has changed in this version. They did release some additional analysis of the failure rates, which is helpful. We've got, I think, 50 million airbags that are involved here, which were installed on about 49 million vehicles. Some of them are passenger and driver bags, which is the reason for that.

And we've had, I think it was seven incidents, field ruptures, which are basically explosions in normal people's cars out on normal roads and not in a laboratory. So when you're looking at rupture rates, you don't [00:04:00] look at those seven versus the 50 million airbags out there.

You look at those seven versus the number of vehicles that have actually had one of the inflators deploy in a crash, which is about 1.3 million, somewhere in there. When you look at it that way, it really makes the risk look a lot less infinitesimal than if you were looking at it out of 50 million. So far we've seen, I believe, one fatality and six injuries in the United States.

A couple other fatalities and injuries, I think four or five, maybe more than that, in other countries. But if you're in one of these vehicles with an ARC airbag and your airbag is triggered, it looks like the risk of an inflator rupturing is around one in 192,000, somewhere in there, which is a significant risk.

You're far more likely to have that happen when you're in a crash than to win the lottery, for example. It's not a [00:05:00] de minimis risk, which was a concern legally, since that's a standard for recalls.
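A rough back-of-envelope check of the framing Michael describes, using the approximate figures cited in the episode (about 7 field ruptures, roughly 1.3 million crash deployments, and roughly 50 million inflators in the fleet). The exact counts are NHTSA's; these are assumptions for illustration only:

```python
# Rough sketch of the rupture-rate framing discussed above. The counts here
# (7 field ruptures, ~1.3 million deployments, ~50 million fleet inflators)
# are the approximate figures cited in the episode, not exact NHTSA data.

field_ruptures = 7
crash_deployments = 1_300_000      # inflators that actually deployed in crashes
fleet_inflators = 50_000_000       # all inflators installed on vehicles

rate_per_deployment = field_ruptures / crash_deployments
rate_per_fleet_inflator = field_ruptures / fleet_inflators

print(f"about 1 rupture per {1 / rate_per_deployment:,.0f} deployments")    # ~1 in 186,000
print(f"about 1 rupture per {1 / rate_per_fleet_inflator:,.0f} inflators")  # ~1 in 7,100,000
```

Measured against deployments, the rate lands in the same ballpark as the roughly one-in-192,000 figure quoted above; measured against the whole fleet it looks dozens of times smaller, which is the distinction Michael is drawing.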

[00:05:07] Anthony: So they’re asking for public comment. What did they expect from the public?

We’re like, Hey, I’ve been maimed or Hey, I’ve not been maimed. I don’t understand.

[00:05:16] Michael: When they ask for public comment, typically you don’t see a lot of random members of the public commenting. The public would include organizations like Center for Auto Safety, Insurance Institute, a number of other organizations that would focus in this area.

But it also provides an opportunity for the manufacturers to respond, for ARC to respond, and bring up any further arguments they have that could show that these airbags are safe. I don't know that they're going to be able to come up with those. NHTSA has done a pretty good job of buttoning up their argument.

I still expect some resistance, particularly from the [00:06:00] manufacturers who haven't seen a rupture during deployments in the ARC bags that are installed on their vehicles, and we may still see lawsuits around this and delays around the actual implementation of the recall. But NHTSA has got a pretty good argument here that this is a recall that should occur. And I think we agree.

[00:06:19] Anthony: So from a consumer’s point of view what am I supposed to do?

[00:06:23] Michael: You can either stop driving the vehicle or keep driving it and take the risk. That's really the choice that we're given as consumers based on the current law in America.

Anytime there's a recall, or a proposed recall, or a defect that's been identified but there's not a fix available for it yet, really all you can do... I suppose you could request to have another airbag module installed in your vehicle, made by ARC and produced after 2018, when they started implementing procedures to prevent [00:07:00] this defect. Maybe that's something that would work, but it's going to be really expensive. Or you can not drive the car, or you can sell the car and make someone else deal with your problem. There are options there, but none of them are very good. Certainly none of them are as good as the manufacturer saying, okay, we're going to replace that thing for you.

[00:07:18] Fred: I want to remind people that the source of the problem here is that ARC elected, for reasons unknown, to produce airbag inflators for cars that were much less safe than the similar devices they know how to produce for military and commercial markets. This appears to have been an economic decision.

They just didn't want to spend the time and money required to do adequate testing of these devices. And so they said, okay, that's fine. It's only the public. Let's just go ahead and put this junk out in the cars and hope for the best. NHTSA decided that was not an acceptable solution. The statistics that ARC was touting as [00:08:00] evidence of their quality were actually just the opposite. So this was an intentional decision by ARC to degrade the quality of their products for the public, and it has lethal impacts.

[00:08:15] Anthony: Know what else is also an economic decision? Going to autosafety.org and clicking on donate. So, the Wall Street Journal. I was kidding earlier when I said nothing happened in the world of auto safety over the last week or two.

[00:08:27] Tesla’s Autopilot Crashes and Data Privacy Issues

[00:08:27] Anthony: The Wall Street Journal did this amazing investigation inside Tesla. With Tesla, we talk about crashes and we can never get access to the data. It's all redacted in the reports, so we don't know what's going on. The Wall Street Journal hired some hackers, got these crashed vehicles, pulled the internal Tesla computers out of the cars, and said, hey, what's actually going on here?

Let's look at the footage. Let's see what's going on. And the footage, most of it, unfortunately, you have to be a subscriber to the Wall Street Journal to see, but it's fascinating. Their vehicles are going full on [00:09:00] into a semi truck because the machine doesn't know what to do, or coming to a T junction and just going straight off the road and continuing to drive.

It's an amazing piece of journalism, an investigation that's really good. And one of the things in here that I was surprised about was that Tesla has reported over 1,200 Autopilot-related crashes to NHTSA since 2021. These incidents represent about 80 percent of all crashes reported to the regulator involving autonomous driving technology.

So again, they’re not the only people on the road that have ADAS systems, but Hey, they’re responsible for 80 percent of crashes. That’s a, yeah,

[00:09:42] Michael: I always try to condition that point, anytime someone brings it up, with the fact that Tesla's data is basically being beamed up to Tesla every second, unlike a lot of other manufacturers, where, [00:10:00] in order to hear about a crash like this, they would have to get a report from a consumer or a lawyer or someone before they reported it.

Tesla is aware of crashes in its vehicles because it's being told about them. The vehicles themselves are reporting things to Tesla. I know it doesn't happen all the time; sometimes some of those systems are damaged in the crash and the reporting doesn't take place. But Tesla is made aware of those and thus has a lot more to report to NHTSA.

So I think any conclusions drawn essentially from the sheer number of reports don't hold up on their own; you need more than that to bear out that there's a safety problem.

[00:10:43] Fred: A couple of details about this, though, that I thought were very interesting. One is that Tesla is reporting extensive data to themselves that they do not allow the customers to ever see.

The only way to get at it is to spend thousands of dollars and disassemble [00:11:00] your computer, with some destructive changes, to get at the data. The other thing that I thought was very interesting is that in the older data, the data showed the radar images and the radar results coming through, and Tesla does not include the radar in their current offerings.

I don't think they took that out because it was superfluous. Because...

[00:11:22] Michael: it's too expensive and their camera system is cheaper.

[00:11:26] Anthony: And if I drive with my eyeballs, I don’t have radar. I should be good.

[00:11:30] Fred: All right. So even with the radar, the Tesla was still smashing into stationary objects and moving objects with seemingly reckless abandon.

So I thought it was interesting that they decided that even though all this bad stuff is happening, let’s get rid of one of the systems that could keep this from happening. Interesting decision.

[00:11:50] Anthony: All right. Fred, can you help me with some math?

[00:11:53] Fred: Maybe.

[00:11:54] Anthony: Okay. It's not complex math. Okay. So what I'm thinking is, Tesla has sold roughly four and a half [00:12:00] million vehicles over its entire lifetime.

Since 2021, they've had 1,200 crashes involving Autopilot. Not all of their cars have Autopilot on them from the start. So if a man gets on a train...

[00:12:17] Michael: A lot of people don't use it, even if they have it.

[00:12:21] Anthony: I had a really dumb joke in there and you just stepped all over my premise.

[00:12:23] Michael: A man got, it had something to do with man, a train in Chicago.

[00:12:29] Anthony: So anyway, this, how does this compare? What do you think to other manufacturers? I have no idea.

[00:12:38] Fred: There’s just not enough data to make the comparison.

[00:12:40] Anthony: Okay. Cause that’s the fanboy argument is always that sounds like nothing. 1200 crashes.

Big deal.

[00:12:47] Michael: You should stop listening to fanboys. One of the things that I found somewhat interesting here is something that isn't brought up a lot. The Journal went and found, I [00:13:00] believe, about 200 different police reports. They filed public records requests all over the country to try to get the police reports behind these crashes.

And they found that a number of those crashes occurred when Autopilot was engaged and the vehicle lost traction on wet pavement and the like. It looks like Autopilot is also struggling to operate when the pavement's wet. That's not something we had heard of before, or I don't believe we've talked about or discussed.

We know the cameras that Tesla employs have problems in rain, in the dark, and in other conditions where you would expect camera performance to be degraded. But losing traction on wet pavement, I don't know if that's a camera issue or if the vehicle is unaware that it's on wet pavement or what's going on there, but that's a pretty clear problem.

[00:13:58] Fred: [00:14:00] It did lead to an interesting thought, though, that it's possible, using the statistical information that they've recovered, to put a warning system in place for these cars to warn the drivers when they're entering dangerous conditions. It should be possible, analyzing brake data and camera data, to determine that the pavement is wet,

for example, and that they're approaching a curve that's too tight. So one of the things the manufacturers should be doing is putting warning systems in place in these cars as part of the ADAS systems, so the drivers get advance notice of an imminent hazard rather than a notice of the hazard after they've had a problem.

[00:14:42] Michael: Yeah, and cars with even relatively simple technology, have been detecting the presence of rain for years using automatic windshield wipers. So it’s not that hard. I don’t know what, I don’t know what Tesla’s missing there. [00:15:00]

[00:15:00] Anthony: Yeah, even my car warns me. It's like, hey, roads may be icy. I assume that's just a simple temperature gauge.

But with this, NHTSA withholds all this information from the public. And part of that is Tesla saying this is proprietary information, we can't share this. Does any other manufacturer pull that level of, no, sorry, you can't see what happened here?

[00:15:22] Michael: No. So, in the standing general order reporting that manufacturers are required to do on level two plus vehicles, this goes beyond automatic emergency braking and simple lane keeping and things like that. It's when those systems are working together and you basically have a car that is semi driving itself on the road, controlling longitudinal and lateral movement. In those situations... okay, I lost my train of thought. What was the question there, Anthony?

[00:15:55] Anthony: So a man gets on a train in Chicago.

No, it was, this was a [00:16:00] question of what is proprietary information here. Other OEMs don't pull the same level of...

[00:16:05] Michael: So all of that's reported to NHTSA by manufacturers of those types of vehicles. And Tesla has routinely said that the data it's reporting is confidential when it's clearly not confidential data. For instance, in every crash narrative for every one of those 1,200 or so crashes that Tesla submitted, they've claimed that the actual narrative of the crash is confidential, when the narrative of the crash is: car A did X, Y, and Z. This happened on a public road. There were, if not witnesses to the actual crash, multiple witnesses there after the fact, and police reports documenting it that could be available to the public. So for Tesla to claim that information is confidential or a trade secret is a farce.

Why does NHTSA allow it? Because that allows NHTSA to play [00:17:00] nice with Tesla and keep getting Tesla's data.

[00:17:03] Anthony: But wait, NHTSA is the one in charge here, I think. Why do they have to play nice?

[00:17:07] Michael: Technically, at this point, while the standing general order is telling manufacturers they have to submit this information and providing penalties for those who don't, NHTSA wants the information for itself. They want to be able to access the information and evaluate it; public access to that information is a secondary consideration.

I think that if someone filed a Freedom of Information Act request and went to court to try to force NHTSA to turn over what Tesla has given them in certain fields that are clearly not trade secrets, and I think that narrative field is one, there are a number of other fields that Tesla's redacting in those reports, they would have a chance of winning. But NHTSA on its own volition is not going to [00:18:00] release the data that Tesla claims is confidential, even though it pretty clearly isn't.

[00:18:04] Fred: There’s a related unresolved issue here about data ownership.

You bought the car. You bought the computer that's in the car. It's recording data about you personally; it's recording your face, your location, your operations. But Tesla doesn't let you get at that data unless you disassemble the computer. Who actually owns that data, Michael, and what kind of recourse does a private person have to get at data that originates with them, originates in the car that they purchased, but is somehow inaccessible to them?

[00:18:38] Michael: Yeah, nobody really knows who owns that data right now. I guess the best answer is that there's no real federal guidance as to who owns the data generated by your car. And frankly, the way some cars are being sold now, there's a question as to whether you truly own everything in your [00:19:00] car.

It may sound odd to say, but there are a lot of conditions being put on Tesla owners, among owners of other vehicles, that really make you question whether or not you own your vehicle. For instance, I think BMW a few years back, and they may have stopped this by now, was putting seat heaters or other types of creature comforts into vehicles and installing them, but wouldn't allow you to turn them on unless you had a monthly subscription.

Whether or not you actually own your car and all of its functions is in debate now. I would suggest that, even if you owned all of the data in your Tesla, finding it and interpreting it is going to be incredibly difficult without a lot of assistance from Tesla explaining what X, Y, and Z are.

That's a question that's yet to be resolved, and also something we may discuss in our privacy discussion that's coming up later in [00:20:00] the podcast.

[00:19:59] Anthony: We'll discuss it now, even. But before that, hey, are you a law student? Cause if so, I got an assignment for ya. File some FOIA requests against NHTSA regarding Tesla's things. Come on, get some extra credit. Law school's boring. FOIA's fun. The F in FOIA stands for fun.

Come on, get some extra credit. Law school’s boring. FOIA’s fun. The F in FOIA stands for fun.

[00:20:18] Senators Push for FTC Investigation on Car Data Privacy

[00:20:18] Anthony: So, related to what we were just talking about: Senators Wyden and Markey want the FTC to investigate how car companies collect data and punish them if they violated the law. And exactly as we were just saying, new cars are a data privacy nightmare.

You think you bought the car, but when you first start it up it's click to agree, click to continue, click through all this stuff, and you're like, I just want to go to the supermarket, go to the Piggly Wiggly, and hang out with my boys from the Center for Auto Safety podcast. But you're agreeing to things, and you didn't go to law school, and you're being forced to agree after you've paid 20,000 plus dollars for something.

That seems [00:21:00] unfair. But the thing is, GM, I think, was the bad one here, where GM was taking your driving data and selling it off to the highest bidder. What's happening now?

[00:21:10] Michael: There's a lot of problems with privacy in cars. We've talked about that endlessly. What happened here was, essentially, Senator Wyden's office conducted a follow-up investigation into some of the things we've talked about previously, but they looked specifically at GM, Honda, and Hyundai, all of whom shared their vehicle data with data brokers.

And they're really just calling on them to do more and saying, if you don't do more, then we're probably going to have to look at this from a legislative angle and see what we can do to prevent manufacturers from operating this way.

Essentially, they're saying that when customers buy these vehicles and start hitting the buttons on their [00:22:00] screen to sign up for connected vehicle services, for OnStar, for those types of things, the companies aren't being sufficiently transparent about where the collected data is going to go. And we saw where GM had something called Smart Driver, which they've now discontinued because it was such a privacy nightmare, that customers were essentially being automatically signed up for when they signed up for OnStar.

And it was distributing their data directly to their insurance companies, who could then raise their rates based on how they saw the vehicle being driven.

[00:22:40] Anthony: It was worse than that, wasn't it? The data they're selling, not to be abstract, was your location data. So they were selling where you're driving to third party data brokers.

[00:22:49] Michael: That's another issue here too, because what GM is saying in regards to that data is that it's de-identified or anonymized before it is shipped to data brokers. So [00:23:00] while it gives the data brokers an idea of where X, Y, and Z GM vehicle may be driving at any one moment, it doesn't say that Anthony was driving it.

[00:23:13] Anthony: That's true. And I always park my car in the next town over so they cannot map it to my address.

[00:23:19] Michael: That is stupid. Yes. And that's the problem with anonymizing data. It takes a lot of thought to anonymize data, because there are a lot of parameters in there that can tip you off as to the where and when of what happened.

We have been taking, for instance, supposedly anonymized data from the federal crash system, the FARS database, for many years now. And while you can't always find an individual, you can use the latitude and longitude coordinates, the date, the time, and the vehicles involved to trace down crashes through [00:24:00] news articles or otherwise, and then go file for a police report.

And so it's not that hard to figure out where something in the data took place, even if the data has been somewhat anonymized. So Anthony is right. You're going to park your car by your house every night, and they're going to get a pretty good idea of who that is.

So that's one reason why we need more clarity up front as consumers. When we're clicking a button, we need more clarity as to what exactly that button means, without having to read through 40 pages of technical specifications on the system. GM, Honda, Hyundai, they all need to be straight up, right at the front, and say, we're going to sell your data to Verisk, which is a company that they're using.

And they're going to use that data for X, Y, and Z. They need to be just blatantly obvious about what they're going to do with the data so that customers can [00:25:00] say, okay, I'm fine with that, or, no way, man. And right now customers are simply being led by the nose, where you're like, if I want OnStar in my car, I've got to hit this accept button.

There’s no other way around it. It’s a contract of adhesion. It’s got all sorts of problems from a fairness perspective and consumers are basically being, shown a big carrot and going after the carrot while their privacy rights are leaving them.
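A minimal sketch of the re-identification risk Michael describes: even with names stripped, the most frequent overnight stop in a set of trip endpoints is usually a home address. The trip data, grid size, and hours here are hypothetical, chosen only to illustrate the idea:

```python
# Minimal sketch of re-identifying "anonymized" location data: the busiest
# overnight endpoint cluster is usually someone's home. All values here are
# hypothetical illustration data, not any manufacturer's actual format.
from collections import Counter

trips = [
    # (end_latitude, end_longitude, local_hour_of_arrival)
    (40.7421, -73.9912, 23),
    (40.7420, -73.9914, 22),
    (40.7583, -73.9855, 9),    # a daytime stop, e.g. work
    (40.7422, -73.9913, 1),
]

def likely_home(trips, night_start=20, night_end=6):
    """Bucket nighttime endpoints on a coarse grid and return the busiest cell."""
    cells = [
        (round(lat, 3), round(lon, 3))          # ~100 m grid cells
        for lat, lon, hour in trips
        if hour >= night_start or hour < night_end
    ]
    return Counter(cells).most_common(1)[0] if cells else None

print(likely_home(trips))   # ((40.742, -73.991), 3): the probable home cell
```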

[00:25:28] Fred: Michael, if this data exists, isn’t it by definition discoverable?

If you happen to live in certain states that restrict your access to medical care, couldn't this data be used in response to a subpoena to figure out which doctors you've been seeing?

[00:25:45] Michael: Yeah, a subpoena is going to cut through a lot of the layers, whatever layers of privacy the customer or General Motors or Honda or any of these companies have put into place, because a subpoena is [00:26:00] basically a judge saying, get the data for me and show it to me.

I don't care what agreement you have in place. Yeah, that's an issue. If your car is tracking your location, your phone is tracking your location, any device is tracking your location, and you don't want your location known, then if someone subpoenas your phone for a copy of your phone's location or your car's location, and they have cause to do that and a judge orders it, they're going to get it. And that could be a problem.

[00:26:30] Fred: So if this data is available to third parties, is there anything that would stop a prosecutor from just going shopping in the data and sifting through it to see who's gone to, say, 63 West 47th Street?

[00:26:44] Anthony: Hey, that was private. I'm not supposed to... Oh, okay. That was just an example. Okay.

[00:26:51] Fred: Sorry, I didn't mean...

[00:26:52] Michael: That's where the anonymized part of it comes in. If it's truly de-identified, then the prosecutor can shop all he wants, and he [00:27:00] won't be able to connect that location back to an individual owner or another person.

However, I guess there's a chance that that information could be used to get a subpoena to dig a little deeper, and then there may be a problem there. Or if the data is not anonymized as well as it should be, and a prosecutor is able to figure it out from the badly anonymized data, then there's an issue there.

I don’t think, bottom line, I don’t think anyone trusts the car companies to protect them from those kind of intrusions. And if you do, you shouldn’t.

[00:27:38] Anthony: How can I trust them? They're selling me faulty airbags. I don't know why my voice just did that. That was strange. But Fred, that was a great question.

But you missed the big point of Michael's earlier speech. What he said there that was the most important part was: Anthony's right. Okay, let's focus on that.

[00:27:55] Fred: That has never happened before. This is an interesting day.

[00:27:59] Anthony: I know. I'd like [00:28:00] to retire.

[00:28:01] Michael: It only took a hundred and, what, five episodes.

[00:28:04] Anthony: Yeah, quite a bit more than that, but hey who’s counting?

[00:28:08] Gaslight Illumination: GM’s Privacy Issues and Aurora’s Bold Claims

[00:28:08] Anthony: Let's go right into Gaslight Illumination, because, Michael, I think what you were just talking about really ties into your nominee.

[00:28:15] Michael: Yeah, my nominee was GM, because just after having to basically kill their Smart Driver program and being the primary bad guy in a slew of privacy problems, GM gave a statement. It's just typical GM: we share the desire to protect customers' privacy while enhancing safety and preserving innovation. We vehemently deny the assertion that we use manipulative design techniques to coerce consumers into enrolling in Smart Driver. Each consumer is given a choice at the time of enrolling and throughout the life of the product.

We established it to promote safety, safer driving behavior, [00:29:00] for the benefit of consumers who elected to participate. Which, that sounds great and all, but the fact is they were selling that data. They established the Smart Driver product because they knew they could sell that data and make more money off of it.

And it's very clear that this type of data is worth money. They're selling it to LexisNexis. They're selling it to this Verisk company. They're selling the data, honestly, I think, to anyone who's willing to buy it, maybe short of certain Russian or Chinese state actors who they would really get in trouble for selling it to.

So GM is trying to pretend that it's not the bad guy in this scenario, that they established very clear mechanisms for consumers to allow their data to be accessed, and that they didn't play a bad role in this. I don't know, they're essentially playing the victim, and [00:30:00]

they're not a victim. They are actively selling the data of their customers and owners to data brokers, and they just can't crawfish their way out of that. There's no backing out of that. They need to take responsibility for it and do better.

[00:30:17] Anthony: Didn’t, in the 1960s, didn’t GM invade somebody’s privacy and follow them around supermarkets, dig through their trash?

[00:30:26] Michael: Yeah, I believe they even hired a prostitute to do that. To do that, yeah.

[00:30:30] Anthony: What was the outcome there? Oh, you're listening to it: the Center for Auto Safety.

[00:30:37] Fred: Yeah, that ended well for them, didn’t it?

[00:30:39] Anthony: It worked out great for us and for you consumers. You’re welcome for airbags and seatbelts and this podcast. Fred, who’s your nominee?

[00:30:47] Fred: My nominee is our friends at Aurora. Aurora is producing software for self-driving heavy trucks, and they made the [00:31:00] modest claim, on a placard they erected at the conference I just attended, that Aurora is the gold standard for self-driving safety. And they had three engineers staffing their booth, and I said to them, that's a pretty bold claim. What is your basis for that? And one of the engineers said, we never made that claim. And so, in my best Vanna White imitation, I swept my arm and showed them the sign that he was standing in front of and said, actually, you did make that claim.

And he said, that's just some marketing person who made that claim. And I said, okay, well, tell me why you guys think you're doing a better job than anybody else. And he said, we've got a lot more transparency than other people, essentially, was his claim.

And yet they've got no operating miles yet with the self-driving truck. [00:32:00] So I nominate Aurora for the bold claim that they are, quote, the gold standard for self-driving safety, unquote, on the basis of bullshit, I'd say.

[00:32:12] Michael: Yeah, that’s craziness anyway, because they’re not even operating in cities really, right?

They’re just operating on interstates, which, as we know, are probably the safest place to drive.

[00:32:22] Anthony: I love it. That is the classic disconnect between marketing and engineering. Clearly some marketing person who doesn't understand anything about engineering sat down and said, oh, so tell me, what are you guys doing?

What are you working on? We've got some self-driving. Great, you guys are the gold standard. No, I said I got a new gold watch. Perfect, everyone gets free gold watches. I don't want to be in this meeting anymore. They're making eye contact. Let's go.

[00:32:46] Fred: Now, I do want to point out to our listeners that on our website, you will find the gold standard for AV safety standards in our AV Consumer Bill of Rights, and it’s interesting that [00:33:00] none of the speakers at this conference, which we’ll talk a little bit more about later, had any offerings that cover that comprehensive set of requirements that we’ve established there.

So I think there’s still a long way to go.

[00:33:14] Anthony: Shameful. They should all follow the Center for Auto Safety. So, my nominee, and I'm beating a dead horse again, but it is not GM Cruise specific this week, even though they're ridiculous. It is the entire autonomous vehicle industry.

All of them, this self-driving robotaxi nonsense. And I could be wrong, and I want someone to explain where I'm missing the math. Okay, so you've got companies like Tesla and GM investing ridiculous multi-billions and billions of dollars into this and then claiming, hey, this is going to be a 10 trillion dollar industry or some ridiculous number.

And so I just did basic math again. Essentially, they're going to compete with Uber. Uber's global [00:34:00] 2023 revenue was 37 billion dollars. That's their revenue, not their profits or anything like that; that's how much money they brought in. Now, Uber operates around the entire globe. No company makes any form of autonomous vehicle that can operate outside of limited driving areas.

I don't know how that's gonna work. Uber has over five million drivers. For comparison, Tesla has sold over four and a half million vehicles.

[00:34:24] Tesla’s Robo Taxi Ambitions

[00:34:24] Anthony: So Elon is claiming, hey, we've got this robotaxi coming. We're gonna make ridiculous sums of money off of it. It's gonna be great. And he said that in 2024 alone they're gonna invest 10 billion dollars into full self-driving.

To believe this nonsense, you have to assume that every vehicle Tesla has ever sold is automatically gonna become a robotaxi, it's gonna operate around the entire globe, and they will completely take over Uber's entire revenue stream. And even based on that, I still don't see how they break even.

I [00:35:00] don’t.

[00:35:01] Michael: So they're investing...

[00:35:03] Anthony: He said 10 billion alone.

[00:35:04] Economic Viability of Autonomous Vehicles

[00:35:04] Michael: Say hundreds of billions of dollars going into getting into an industry that's only generating around 40 billion dollars in revenue every year.

[00:35:16] Anthony: It's more than that, but yes. All these companies are investing billions and billions into a relatively small market for the amount of investment they're putting in.
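A back-of-envelope version of the argument above, in the spirit of Anthony's math. The revenue and investment figures are the round numbers mentioned in the episode, and the market share and margin are deliberately generous assumptions; this is illustrative arithmetic, not a financial model:

```python
# Back-of-envelope robotaxi economics, using the round numbers cited in the
# episode (Uber's ~$37B 2023 revenue, a claimed ~$10B/year of self-driving
# investment). Market share and margin are generous assumptions for illustration.

ride_hail_revenue = 37e9        # roughly the global ride-hail revenue pool today
annual_av_spend = 10e9          # one company's claimed yearly investment
assumed_share = 0.25            # generous share of that pool for a single operator
assumed_margin = 0.10           # generous profit margin; ride-hail margins are thin

annual_profit = ride_hail_revenue * assumed_share * assumed_margin
years_to_recoup = annual_av_spend / annual_profit
print(f"Years of profit to recoup one year of spending: {years_to_recoup:.0f}")
# About 11 years under these assumptions, before counting prior spending or
# the cost of building, maintaining, and insuring the vehicles themselves.
```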

[00:35:26] Michael: And we’re talking about robo taxis here not Aurora, the trucking. No, that’s different, right?

[00:35:30] Anthony: I'm saying literally just what GM Cruise wants to do, what Waymo wants to do, what Tesla is saying they're doing. They're not. No, they're not.

Ford and Volkswagen said, hey, let's try this, and then they did the math and they walked out. Apple said, oh no, we don't make cars, and they backed out. Smart companies, I think, have backed out after investing at least a billion dollars.

[00:35:50] Fred: That's an interesting observation, and I've started to think that it's odd that people are celebrating these investments. For example, there was just an announcement that [00:36:00] Waymo is going to get another 5 billion dollars from their friends at Alphabet.

It's interesting, because to me it seems as though they're a patient on life support who is just celebrating the fact that they need another transfusion. I just don't get the economic case for this.

[00:36:18] Anthony: That's the gaslight, I'm assuming. It is some tech nerds sitting in a room in California who met with their marketing department, and the marketing department is like, this is great, man. Yeah, you guys are the gold standard. Here's some more money. That's my only thing, but please, listeners, write in, tell me where I'm wrong. What am I missing? I know some people who work at GM Cruise listen to this.

What am I missing about your business? Where do you get some amazing thing? No one is gonna license your software. That is not going to happen. I'm sorry. No one is going to license Tesla's full self-driving. Never going to happen. So what do you think your business is? [00:37:00] And what's the problem you're trying to solve?

Getting rid of people?

[00:37:05] Fred: I don’t get it. They’re all pretty strong candidates. I’m not sure who the winner is for this week’s Gaslight Illumination Award.

[00:37:12] Anthony: I don’t know if you heard earlier, but it was Anthony that was right.

[00:37:14] Fred: Oh, that would be twice in one week. I’m not sure that’s going to happen twice in one episode.

[00:37:20] Anthony: So, related to that, I want to jump to an article in the New York Times where Elon says Tesla's future is robotaxis. I think I just pointed out why that's not true. I don't even know if we have to dig into that more. That's just him pumping and dumping his stock, cause he realized, oh, auto companies don't have that high a valuation.

But if I say it's an artificial intelligence robotics company that makes self-driving cars, people like ARK Investments will pump and dump. Oh, back to ARK Investments. We talked about them a couple weeks ago, and they put out these charts saying, hey, this is our analysis of Tesla, and a lot of this data is available upon request.

I’ve requested [00:38:00] it numerous times now. The last response I got from ARK was a bounce back from their email address that says, this email address is not valid.

[00:38:10] Michael: I think you’ve been blocked, Anthony.

[00:38:12] Anthony: No, I think ARK's a gaslighter too. I have a lot of Gaslight nominees. Anyway, Fred, let's jump into the Tao.

I'm looking at the clock. You've now entered the Tao of Fred. We're hanging out surfing in San Diego and then arm wrestling Navy SEALs.

[00:38:29] Fred: Let’s go. For now, everybody’s learning that. I wish I were. Sadly, I was at the conference.

[00:38:34] Conference Insights and Takeaways

[00:38:34] Fred: The conference was ARTS 24, the Automated Road Transportation Symposium 2024, sponsored by the National Academies' Transportation Research Board.

Now, the National Academies are science, medicine, and engineering, so it's a pretty geeky crowd. I was surprised by some of my takeaways from there. There were hundreds of people, a lot of our good [00:39:00] friends, a lot of people that I met for the first time. Notable by its absence was Tesla.

Apparently, they don't like to be inundated with data. I'm not sure that nobody was there, but none of the speakers that I saw were from Tesla, and I didn't see anybody with a name tag identifying them. I could have missed somebody. But my big takeaway was that their slogan might have been: now that the horses are running wild in the streets, let's discuss the need for corrals.

There was a lot of discussion of validation; we'll talk more about that. One declaration, which was left unchallenged because of the format, equated automation with safety improvements. This is going back to the old shibboleth about the 94 percent reduction, which is unfounded, but they just breezed through that. Maybe another theme for this would be, harkening back to the plague years: bring out your shibboleths and add them [00:40:00] to the pile. There's a lot of that going on, but as we know, in the future everything will be better. So let me push through that. A keynote speaker stated that, quote, safety assurance needs to be economically viable, close quote.

I think that puts the cart right before the horse, because safety has to be provided before you can investigate your economic viability, unless you're going to put something on the road that simply is not safe. And to me this is actually a reversal of the philosophy that should be put in place for AV development.

Anthony?

[00:40:37] Anthony: Was the keynote speaker Lee Iacocca? Literally, this is exactly him saying, safety doesn’t sell. Sorry you died in our Ford Pintos.

[00:40:46] Fred: I think Lee's passed into his great central liberty in the sky. He has. But it could have been him. By the rules of this, I'm not allowed to identify the names of the individuals who said [00:41:00] certain things.

So that'll have to remain a mystery. Anyway, it's just the way they do things.

[00:41:06] Anthony: Okay,

[00:41:06] Fred: so I need to be a little circumspect about that. Similarly, when I talked about our friends at Aurora earlier, notice I delicately mentioned no names so that they can all have deniability. I’m not after anybody’s career here.

Let's see, there was another interesting point for all of our listeners who want to be like me. A person named Tar Abel Newman, the chief of NHTSA's electronics and safety division, said, quote, we are hiring, close quote. So that's a good sign. They're staffing up, bringing people in.

[00:41:40] Safety Standards and Validation

[00:41:40] Fred: Validation activities are, I think, a really important part of this, and the International Standards Organization is actively producing a normative, or binding, standard for AV safety, or excuse me, design verification and validation. It seems a bit late, given the [00:42:00] fact that these things are already out on the street, but perhaps better late than never. Jeff Wishart, who I will identify because he's a good guy and one of our friends at Arizona State University, spoke about several reports that he's leading, especially the J3131 on AV validation and the J3237 on AV metrics.

Now, inquiring minds might wonder why these are being produced after the fact of deployment on the streets rather than before, but I guess that's imponderable. Again, there was a lot of that, and I spoke with some other people there who said that this is the same old crowd that keeps coming out year after year.

I hadn't attended it before, so I was in no position to dispute that, but he, like I, thought that there was probably a lot more boosterism than there really should be in a [00:43:00] conference organized by an ostensibly scientific organization. Moving on, there was one speaker from Europe who said that it would take 12 billion miles of driving, 12 billion with a big B, to prove a 20 percent improvement in safety compared with human-driven cars. Now, I thought I was being pretty skeptical about the statistics on this, but I was certainly put in my place by that announcement. None of the developers talked about having an independent red team that they use to challenge their safety assumptions.

An independent red team is a standard part of military and many commercial developments, because you don't want groupthink infecting your safety profile and your safety management system too badly. The one speaker I challenged directly on that said he simply didn't know whether or not his company had an independent red team.

Given that he was talking about safety, I found that a little bit hard [00:44:00] to accept, but I'm a generous guy, so I didn't repeat the question.

[00:44:05] Anthony: Can I go back to the 12 billion miles before you get a 20 percent reduction? Cause that's much greater than we've ever discussed on the show. What was that based on?

There's obviously not enough time on this show to go into the details, but from your expert point of view, was it based on something sound, or was it just throwing numbers out?

[00:44:28] Fred: I don’t know the answer to that. There wasn’t enough data presented to establish the basis for that.

And I haven't caught up with all of the underlying studies yet. I don't know if references were provided or not. There's more information available to people who want to sift through the presentations, but I haven't done that yet.

[00:44:44] Anthony: Okay. Please, for next week, if you...

[00:44:47] Michael: It sounds like he was saying you need 12 billion miles before you can show a 20 percent improvement.

It sounds like he's talking about statistical significance. I'm not sure. Yeah, no, he was talking about that.

[00:45:00] Fred: Exactly right. Yeah, but I don't know the details of his analysis. So let's see, what else have we got? Okay, Anthony, you got something?

[00:45:09] Anthony: No, that's just the one that I'm still very curious about.
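For the curious, here is a minimal sketch of the kind of statistical-power arithmetic that can produce a miles figure in the billions. We don't know the European speaker's actual method; the fatality rate, confidence level, and power below are assumptions for illustration only, using a standard one-sample Poisson rate comparison:

```python
# Sketch of why "billions of miles" can fall out of a power calculation.
# The rate, confidence, and power values are assumptions for illustration;
# they are not the conference speaker's actual inputs or method.
import math

human_rate = 1.1e-8            # assumed ~1.1 fatalities per 100 million miles
improvement = 0.20             # trying to demonstrate a 20 percent lower rate
av_rate = human_rate * (1 - improvement)

z_alpha = 1.645                # one-sided 95 percent confidence
z_beta = 0.84                  # 80 percent statistical power

# Miles of exposure needed so the expected shortfall in fatalities stands out
# from Poisson noise (normal approximation to a one-sample rate test).
miles = ((z_alpha * math.sqrt(human_rate) + z_beta * math.sqrt(av_rate)) ** 2
         / (human_rate - av_rate) ** 2)

print(f"{miles / 1e9:.0f} billion miles")   # on the order of 13 billion miles
```

Under these assumptions the answer comes out on the same order as the 12 billion miles quoted, which is why rare-event safety claims are so hard to prove from driving data alone.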

[00:45:13] Fred: Okay. Another kind of global observation is that safety management systems and safety case analysis are still immature; they're not ready for prime time. There's a lot of discussion about whether and how to do those things. They should both be done, but how they should be done and what the standards are for them are still completely up in the air.

Once again, why is that even an issue now, after the fact of these things roaming the streets? This is really an ass-backwards way of doing things. One of the workshops was on the role of standards in urban driverless operation. What? They're already driving in cities.

Are those governments going completely untethered, [00:46:00] without having any standards or ways of even knowing what data they should collect? This is a little bit crazy.

[00:46:05] Michael: It sounds like it should have been titled the role of no standards.

[00:46:09] Fred: Yeah, yeah. That would have been good.

[00:46:12] Fred: Your friends at Cruise, Anthony, introduced their new chief safety officer, Steve Kenner, and announced that they're jettisoning everything done in the past and have a new safety approach. No details were presented, so it's impossible to know what or when it might be, but they are, quote, enabling a safety culture at all levels at Cruise, close quote, and they're setting aspirational goals for their safety rather than saying it has to be better than a human driver.

There were no details about what those would be or how they were going to be implemented.

[00:46:48] Michael: But they did an interview with Automotive News that we probably won't get to until next week, where they laid out their new safety plan. And while it [00:47:00] looks better on the surface, there's still some unanswered questions.

[00:47:02] Anthony: Yeah. Friend of the show Phil Koopman, we'll link to it, has a breakdown of their safety plan, and the summary of that, and I apologize, Phil, if I get this wrong, is: this sounds like a good thing, but you're missing a lot of detail. But if I'm a GM shareholder, and I am not, at the next shareholder meeting I would be like, hey, let's stop giving this stupid thing money and make better cars.

Or if I’m a shareholder, I’d be like, Hey, let’s do some stock buybacks. Ooh, let’s increase the dividend. That’s really, you’re better off as a shareholder. Cause this is, it’s, you tried it, you failed. It’s not going to work. Just tell me why my math is wrong. Sorry.

[00:47:39] Fred: Just a couple other things to close out this episode of the Tao. Mark Rosekind, a former administrator of NHTSA and also a former member of the National Transportation Safety Board, talked about the model for aviation safety, where information is exchanged in a closed [00:48:00] environment so that nobody gets their pick bent, and how it would be a good idea to establish a similar closed environment for exchanging safety information about the AVs.

There was a lot of discussion about how that could be done. Apparently a company named MITRE is now trying to get that set up and going, and they've had one meeting. I think that's probably a good idea. Hopefully it'll come to pass, but that was discussed as well.

[00:48:33] Anthony: Will they let a rabble-rouser like you be involved in that, though?

I'm sorry? Will they let a rabble-rouser like you be involved in that? Cause my concern with that is, they're just having all these conversations about safety in private, and then the public will be like, oh, it's all good, bro.

[00:48:47] Fred: Oh, I think I’m too good looking. They won’t have engineers in there.

Obviously you're too tall to fit into the room. There you go. Another person, Kristen Pollard, who was on the NTSB, discussed their investigation of AV [00:49:00] crashes and had the requisite appalling video that she brought up, and she made the point that the lessons learned could be applied to conventional vehicles immediately.

There's no reason why the safety lessons being learned in AV development cannot be quickly applied to conventional cars, accelerating the safety benefits, which I applauded. She was surprised by the applause, I guess, and talked about the rules of the road that need to be enforced. And there was an interesting discussion about rules of the road.

We may come back to this with a guest in a couple of weeks. But think of this example. Let's say that you've got a school zone with a speed limit of 15 miles per hour. That's a rule of the road, right? Every human being knows what to do with that, because for human beings, rules of the road just have to be general markers.

We interpret them generously, and we interpret them in a way that's compatible with our ethical basis [00:50:00] and all of those things. But a car doesn't have that. So if you really want a car to exercise the same kind of care in a school zone, you've got to program it to do several things.

One is to not exceed the speed limit. But then you've also got the steering limits, so you need to program it to not drive over the children, except that if you're in imminent danger, you need to perhaps do something, maybe speed up or steer off the road, but ideally still avoid the children. Unless there's a bear eating the children, in which case you would probably want to speed up and try to kill the bear

[00:50:41] Anthony: and then put the bear in your car and drop it off in Central Park.

[00:50:45] Fred: Yeah. Okay. Only if you're running for president, but then you better not kill any kids the bear didn't already eat. So there's an awful lot of sophistication that a human being puts into the interpretation of that rule of the road, the [00:51:00] 15 mile per hour speed limit, that you need to explicitly identify and program into these cars.

That's the problem with these AVs, right? You've got rules of the road that were intended for human beings, with our interpretations and extrapolations and all of those things that we do automatically. Machines don't do that. Machines don't extrapolate. They don't interpret. They just execute exactly what you tell them to do.
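As a toy illustration of what "explicitly programming the rule of the road" means for Fred's school-zone example, here is a sketch of how many clauses a human driver infers that a machine has to be handed one by one. The structure, names, and thresholds are hypothetical; this is not any AV developer's actual policy code:

```python
# Toy sketch of the school-zone example above: every clause a human driver
# infers has to be spelled out for the machine. Hypothetical structure only.
from dataclasses import dataclass

@dataclass
class Scene:
    in_school_zone: bool
    speed_mph: float
    children_in_path: bool
    imminent_collision: bool   # e.g. another vehicle about to hit us

def school_zone_action(s: Scene) -> str:
    if not s.in_school_zone:
        return "follow normal rules"
    if s.children_in_path:
        return "brake to a stop"                  # pedestrians always win
    if s.imminent_collision:
        return "evade, still avoiding sidewalks"  # the exception humans apply implicitly
    if s.speed_mph > 15:
        return "slow to 15 mph"
    return "proceed at current speed"

print(school_zone_action(Scene(True, 22.0, False, False)))   # -> slow to 15 mph
```

Every branch here is something a person handles without thinking, and the list is nowhere near complete, which is exactly Fred's point about rules written for humans.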

With that, I'll just close out this episode. Thank you for being generous with the time, Anthony. Any questions, gentlemen? Did you have good tacos out in California? I did have some fish tacos, but I wanted to leave a three dollar tip and they charged me for a 300 dollar tip. And so I had to get that cleared up.

Interesting side light from my conversation.

[00:51:50] Michael: They forgot the decimal point.

[00:51:52] Fred: Yeah, something happened there.

[00:51:54] Anthony: You could probably work for Boeing.

[00:51:57] Fred: I called them up and they did issue the refund, [00:52:00] which was nice; they didn't fight me on that. But yeah, a 300 dollar tip for two tacos seemed a little excessive.

[00:52:06] Michael: Really upsetting to the staff who thought they had a sugar daddy.

[00:52:10] Anthony: That sounds like a productive thing. So my only question on this conference is, how many people representing the consumer space are present at this type of thing? Is it primarily industry?

[00:52:21] Fred: Apparently about one out of 400.

[00:52:26] Michael: Okay, so there were 401. I would guess there are probably a few more than that, right? How many people were there total?

[00:52:35] Fred: About 400.

[00:52:37] Michael: Okay. You were there. Phil was there, right? Did you see Phil? Phil was there. Yeah.

All right, advocates, maybe. I think it was probably a little higher than that, so don't lowball it too much.

[00:52:52] Anthony: What I'm getting at, listeners, is if you donate, and you donate generously, you could send all three of us to San Diego next year.

[00:52:59] Fred: Which would be [00:53:00] an excellent idea because they have multiple tracks and I can only cover half of it.

[00:53:04] Anthony: Absolutely. And I just want to hang out with Shamu.

[00:53:08] Recalls and Investigations

[00:53:08] Anthony: Anyway, let’s go to recalls. Recalls.

[00:53:12] Michael: Recall roundup.

[00:53:13] Anthony: Let's start off with Hyundai Motor: 49,719 vehicles. This is the 2024 Hyundai Santa Fe and the Santa Fe HEV.

And the problem here is the main floor wiring harness. It could become damaged due to contact with the passenger side second row bench seat's folding hinge assembly, and that could turn on the airbag warning light and/or interfere with airbag deployment, or cause an inability to deploy the airbags during a crash.

Ooh. So they ran a wire under the floor, but somehow they ran it up by the passenger side seat, near the seat hinge. None of the wires in my car are exposed. They're all behind some sort of facet, faucet, something.

[00:53:58] Michael: We’ve seen a lot of [00:54:00] little recalls. Not just, we see backup camera recalls and airbag recalls and all sorts of recalls that involve moving parts of the vehicle, contacting a wiring harness, which just should never happen.

And I'm not sure exactly why these manufacturers aren't doing something like testing the second row bench seat's folding hinge assembly a few times and seeing if it destroys a wiring harness connected to the airbags before you sell the car. It just doesn't make sense to me. Some of these decisions, it just seems like they're throwing some of these vehicles together without sufficient testing.

[00:54:40] Anthony: That’s weird. When I once worked for a fruit company, they had this, they had a machine that would literally open and close laptops. All day long to see if the hinges would break or fail at what point they would fall apart. But this seems like the wires would be exposed. Like I don’t understand. Why would that?

I’m confused. Hey Hyundai what happened here? [00:55:00] Moving on. Hyundai Motor Corporation, look at this! 12, 612 vehicles the 2010 to 2013. Hyundai Genesis coupe the ignition lock switch in conjunction with the one. So the ignition spring could fracture due to switch from the clutch pedal bracket. I don’t understand this.

A fracture.

[00:55:23] Fred: A car with a standard transmission has a clutch, and it's set up so that you have to depress the clutch before you start the car, for safety. Oh, okay. The car knows that the clutch has been depressed because it's got a switch in there. So if that switch doesn't work, then the car doesn't know that the transmission is still engaged, and you turn the key and all of a sudden the car lurches forward.

That’s what this is all about. That’s exactly it.

[00:55:48] Anthony: That makes sense now. Okay. Hey, I think our next recall is going to be the winner. At least per capita, per vehicles produced, they are absolutely the winner. [00:56:00] Ford, sorry, you're just gonna be in second place. Tesla is recalling 1.8 million vehicles.

Basically everything produced from 2021 onward. The problem is the hood latch assembly may not detect an open condition, which prevents driver notification that the hood is open. And then when you start driving away, the hood will go, ha, and fly open. And so this is a recall, because a recall is a notice of a safety defect, no matter how it is fixed.

Did I get that right?

[00:56:30] Fred: But hearkening back to an earlier observation, this is more reliable than the stated reliability of the ARC airbag inflators, just to put it in perspective. So Tesla recalling all of this without complaint contrasts with ARC suing NHTSA to avoid recalling the more hazardous component they put in their cars.

[00:56:56] Anthony: Okay. So if you have one of these Teslas, make sure your hood is [00:57:00] closed. I don't know why you're opening your hood anyway, unless you put something in the frunk, or to show off, hey, look, no engine in here, I can put a bag of Doritos in it. I don't know.

[00:57:11] Fred: You need to put your golf bag somewhere.

[00:57:14] Anthony: Oh, I put that in my other car.

I put that in my Cybertruck, which was also recalled. Alright, now we've got a couple investigations and then we're done. Chrysler: inoperative door locks and windows may prevent vehicle occupants from exiting the vehicle during an emergency. That sucks.

[00:57:29] Michael: Yeah, that’s an upgraded investigation on, I think it’s 2009 to 2020 Dodge Journeys.

There was a woman who was killed in a collision, I want to say last year, where she was unable to exit the vehicle because of the electronic door latches. The battery was out in the car, and there was just no way for her to get out of the vehicle. We have long had a problem with some of these Chrysler electrical systems.

We [00:58:00] filed a defect petition and tried to get them recalled many years ago because of a totally integrated power module, or TIPM, that's in these vehicles and was presenting consumers with all sorts of problems: windows rolling up and down, windshield wipers going off, being unable to open the door locks. Virtually anything powered by an electrical system in the vehicle was affected by this.

That's why it's called a totally integrated power module. And we think that's a contributor here, or something involving it. We think this may go beyond just the Dodge Journeys, but NHTSA is upgrading the investigation to an engineering analysis. That's usually what they do when they're looking for a recall from the manufacturer and the manufacturer hasn't been amenable to that, so they bump the investigation up another notch to put a little more pressure on the manufacturer and to get at the root cause of the defect.

[00:58:57] Anthony: All right.

[00:58:58] Conclusion and Listener Engagement

[00:58:58] Anthony: I think just because of time [00:59:00] constraints we’re going to wrap up there. Hey, listeners, thanks for tuning in.

Tell all your friends, click subscribe, thumbs up, thumbs down, thumbs sideways, whatever. Donate at autosafety.org. And if you're a law student, file some FOIA requests. Till next week.

[00:59:18] Fred: Bye bye. Thank you for listening.

[00:59:22] Michael: For more information, visit www.autosafety.org.