Elon mode and more Kyle failures
Do you like to pretend that your car can drive itself? Well… you might be a Tesla owner or the CEO of GM Cruise. Either way, please let us know where you might be so we can avoid you.
NHTSA issues a special order to Tesla, AVs are trained on pictures of white men, ARC doesn’t want to recall their airbag inflators, Cruise blocks an ambulance, recalls, and the Tao of Fred.
This week’s links:
- https://www.autosafety.org/support-us/
- https://www.msn.com/en-ca/money/topstories/us-regulator-asks-for-information-on-tesla-autopilot-monitoring-drivers/ar-AA1fX1Mf
- https://news.yahoo.com/autos/nhtsa-demands-tesla-explain-autopilot-145400006.html
- https://www.washingtonpost.com/business/2023/09/06/arc-airbag-recall-inflater-nhtsa/
- https://www.theguardian.com/us-news/2023/sep/05/san-francisco-cruise-robotaxi-death-ambulance
- https://techcrunch.com/2023/09/04/protestors-rally-at-cruise-hq-in-san-francisco/
- https://www.nhtsa.gov/recalls?nhtsaId=23V598000
- https://www.nhtsa.gov/recalls?nhtsaId=23V594000
- https://www.theguardian.com/business/2023/sep/06/toyota-blames-factory-shutdown-in-japan-on-insufficient-disk-space
Subscribe using your favorite podcast service:
Transcript
note: this is a machine-generated transcript and may not be completely accurate. This is provided for convenience and should not be used for attribution.
Anthony: You’re listening to There Auto Be a Law, the Center for Auto Safety podcast, with executive director Michael Brooks, chief engineer Fred Perkins, and hosted by me, Anthony Cimino. For over 50 years, the Center for Auto Safety has worked to make cars safer.
Michael: Good morning. Good morning, everyone.
Anthony: Wow! Wait, what is going on? Michael says good morning. He never says good morning at home. This week we’re gonna start off with our good friends at Tesla. Tesla, Tesla, Tesla. Now, if you haven’t been paying attention, Tesla sells something called Autopilot, Enhanced Autopilot.
Full Self Driving, nonsense mode, dum dum, give-us-your-money mode. And we’ve talked many times about how we have issues with, at the very least, just calling something Autopilot when it’s not. And now the good folks at NHTSA said, wait a second, we should look into this. And they’ve been looking into it, because these vehicles have been having some issues.

Now, most consumers have the belief that, hey, I have something called Autopilot, it drives itself. Tesla’s put out videos where the car is steering itself, doing all sorts of things. These are highly edited videos that are essentially lies. And then in very small, fine print on their website it says, humans must be paying attention at all times, keep your hands on the wheel, stay attentive.

These videos are just fiction, for entertainment purposes only. Something like that. They have some vague legal description like that. And then this hacker hacked into Tesla’s software and discovered something called Elon Mode. That’s what he’s calling it.

In the comments of the code, apparently it was listed as Executive Mode. That allows you to bypass the basic safety features that say, hey, put your hands on the wheel, dummy, let’s make sure you’re actually paying attention. And so part of NHTSA’s investigation is they’re looking into that, and they’re looking at Teslas crashing into things when they shouldn’t be crashing into things.
Michael, I’m sure you have some thoughts.
Michael: Oh, my thoughts on this: it’s just another little reactive order to Tesla that doesn’t address the overall problem here. Obviously, in the emergency vehicle crashes, we’re seeing drivers who are maybe drunk, maybe asleep. They’re not in control of the vehicle.
We believe, when they’re running into the back of fire trucks, they’ve got it in Autopilot mode, and how they’re able to do that is the problem here. There are a number of ways to disable the Autopilot driver monitoring: there have been devices, like weights, you could put on the steering wheel; you can leave your hand on the wheel and do whatever else you want, play a video game. All of those are ways to get around Tesla’s driver monitoring system, if you truly think that the vehicle is going to drive itself safely to your destination and you don’t want to be bothered, you’re too lazy to drive, or you want to do something else.
In this case, there’s an actual mode that’s been inserted by Tesla that allows that to occur. Basically, you’re not going to get nags if you’re not touching the steering wheel. The car can do whatever it wants, and you can do whatever you want. And that’s a huge problem, obviously, from a safety perspective. But it’s NHTSA once again choosing a reactive path.
They’re saying, oh, we found out. It’s like they’re scrubbing the wires every day, looking for a story, and saying, oh, there’s this Elon Mode, let’s go after that. And they put out a special order on this. Meanwhile, these Teslas are still hitting emergency vehicles with flashing lights, and nobody’s getting to the root of that problem.

And some of the other issues we’re seeing with Tesla point to a pretty clear problem with the ability of these vehicles to see and respond to threats, and we’re not seeing a move on that point. Instead, I mean, this is the third or fourth special order I’ve seen from them.

And they’ve focused in the past, even in other recalls, on things like Teslas that are programmed to be able to roll through stop signs. These constant little things that pop up, or that Elon decides are going to come out. I mean, he said six months ago he was going to loosen the reins on the steering wheel detection and the driver monitoring system, and it’s been done. Now this is going back and cleaning up that little mess while not addressing the overall big mess we’re worried about, which is the continued inability of these vehicles to detect threats and respond appropriately.
Anthony: So real quick, I’m going to do a quick summary, okay?
So Autopilot is basically lane keeping assist and automated cruise control; pretty much all new vehicles sold today have some form of this. Enhanced Autopilot does lane changes, so if you hit your blinker, it will safely, or potentially, move to the next lane, something like that. Again, a bunch of cars have this feature. Consumer Reports did a whole breakdown of every manufacturer that has it, and Tesla is just kind of, eh, middle of the road at best on this. And then Full Self Driving is where it says, hey, we’ll recognize stop signs, we’ll recognize traffic lights, and we’ll make unprotected left turns, and you can see some very funny videos about how dangerous that is.
With Tesla, this is great: there’s an article from Yahoo. Earlier this month, Musk livestreamed himself driving a Tesla in Palo Alto, California, while using a phone, a violation of Tesla’s own rules and California state law. And during this video, he had no nags.

The steering wheel didn’t go, hey, put your hands on the wheel, we’re gonna disable this function, anything like that. And I think that’s how this hacker went, wait a second, how did he get this? Let’s go past it. It’s very funny. Hidden at the bottom of this article: Tesla has long said its driver assistance software is not a substitute for a driver and that currently enabled features do not make the vehicle autonomous, but many owners don’t realize this. Michael’s muted, and he was saying something very smart, I’m sure, and very clever.
Michael: Yeah, someone, and something, in my background can’t keep the noise down.
Anthony: Is that a, is that a dog?
Michael: I wish it was a dog.
Anthony: Don’t worry about the background noise, we can fix it in post. Again, I think not only NHTSA but, as we’ve mentioned, the FTC should probably go after them because of this advertising. It’s not that great. Michael, from NHTSA, besides these special orders, is anything gonna happen?
Michael: I don’t know. I mean, they hinted a couple of weeks ago that something bigger was coming down the line. Hopefully it’s a resolution of why Tesla’s vision systems, and the planning systems they’ve got working here to avoid obstacles and prevent collisions, aren’t working, and particularly aren’t working in the situations we’ve seen that are problematic: fire trucks, emergency vehicles, roadside crews, motorcycles, and other issues. Fundamentally, I would say we think Tesla’s safety systems aren’t sufficiently mature to be operating, particularly not when there’s a mode enabled that ensures the human driver isn’t going to wake up or respond to what’s going on.

So that’s the resolution we’re hoping to see from the DOT, and ultimately not simple responses to all the goofy things Elon comes up with every few months.
Anthony: Like fart mode. Eh. Anyway, let’s jump into something a little more serious. Listeners, you remember ARC Automotive? We talked about them a few months back.

This is when NHTSA said they’d spent eight years investigating their airbags and started initiating maybe a recall. It was very confusing to me, my simple brain. But now NHTSA is pushing again for a voluntary recall, and ARC is fighting back, accusing NHTSA of overstepping its authority and calling the ruptures occasional, isolated failures.

Keep in mind, these are the inflators, the little canisters of whatever magic gas is inside there. There’s some extra welding slag in there, and when you get into an accident that activates the airbag, the airbag deploys, but so does a little bullet. Some shrapnel comes out and injures and/or kills people.

Bad thing to happen, no? And some individual car manufacturers have said, hey, we’re going to voluntarily recall these things, cause you know, a dead customer is not a repeat customer. ARC Automotive objects. And this is ongoing right now.
Michael: Yeah. So a few months ago, I want to say it was April and May-ish, we were talking about this. NHTSA had preliminarily notified ARC, the airbag inflator manufacturer, that there was a defect, the one Anthony just described. These are ammonium nitrate inflators; this is something the agency’s been looking at since Takata. It’s similar in that there’s an ammonium nitrate inflator, but the defect appears to be different. In the Takata situation, we have, I think, somewhere around 30 deaths and 100-plus injuries, and that’s because the ammonium nitrate in those inflators is degrading. When it explodes, it’s not a controlled explosion, it’s all at once; it basically destroys the entire inflator and can kill or injure passengers.

In the ARC situation, there’s the weld slag. The defect here is not a design defect like in Takata, where the defect is inherent and basically built into every bag. Here, it looks like it’s only in the inflators that have this weld slag present. I think we’ve talked before about whether that’s a Friday-afternoon issue: is it something that’s always present? We’re not really sure. And in this case, I think we’ve seen worldwide about two deaths and nine injuries, one death and six or seven injuries in the United States, from these airbags.
There are about 50 million of them out there. And it raises a question after NHTSA makes this initial determination, which is what happened yesterday. ARC came back a couple of months ago and said, not only are we not going to recall these things right now, but you don’t even have the authority to order us to do this.

Now here comes NHTSA, making an initial defect determination that requests that the manufacturer, ARC, conduct a recall. Where that goes from here is very similar to last time: ARC is going to come back and make a response. But the additional thing is that a month from now, NHTSA is going to hold a public hearing on this entire issue.

The agency is going to present all of their arguments, ARC is going to present their arguments, and there will be an opportunity for interested groups and individuals to present their arguments; I’m sure we will. At that point, functionally, there’s a decision to make.

ARC either says, hey, we’re going to recall these, or the agency says, yes, you’re going to recall them, and issues a final order that ARC can then decide to comply with, or they can take NHTSA to court. I expect we might see the latter take place here. NHTSA is making some arguments that I haven’t seen before around the de minimis question. There’s a really long and tortured history of recalls involving very few incidents across millions of products, and manufacturers coming back and saying, this shouldn’t be a recall, it doesn’t occur often enough for it to be considered a real safety issue. And in this case, where you have one death and six or seven injuries, I expect the manufacturer is going to try to make NHTSA prove, first of all, that this specific defect is what’s taking place, and then argue that the defect occurs so rarely that the agency shouldn’t be required to pursue a recall, here or in cases like this in the future.

There are a lot of issues there that are going to have to be worked out, probably in court, at some point.
Fred: There’s an overriding issue, which is that the industry knows how to make these safe. They haven’t done it for cars because they’re not required to do it for cars. But I want to remind our listeners that the military has standards that assure similar devices are safe, not at the beginning of life, but at the end of life.

So if their design life is, for example, 30 years, you can do a lot of accelerated life testing on devices like this, and then go ahead and exercise them so that you can develop very high confidence and very high reliability for these devices. It’s essential, because they’re life-saving, or life-determining, devices.

Why NHTSA hasn’t stepped up to this and put a standard in place that would once and for all solve this problem is a mystery to me. It seems like a straightforward thing to do. It’s something that industry knows how to do. The only reason for not doing it, I suppose, is that either they haven’t thought of it, or they just don’t think it’s important to save people’s lives.

It’s a mystery to me, but again, industry knows how to solve this problem. Industry solves this problem every day for military and commercial single-shot devices, which is what they’re called. It’s far past time for NHTSA to put a standard in place that gives automotive customers the same level of safety as people who use these devices in military and commercial applications.
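Fred doesn’t spell out the statistics on air, but the classic zero-failure reliability demonstration he’s alluding to fits in a few lines. This is a minimal sketch assuming the textbook binomial model; the reliability and confidence targets below are illustrative, not any actual military or ARC test plan.

```python
import math

def zero_failure_sample_size(reliability: float, confidence: float) -> int:
    """Units to test, with zero failures allowed, to demonstrate
    `reliability` at `confidence` under a binomial model:
    reliability**n <= 1 - confidence  =>  n >= ln(1 - C) / ln(R)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

# Illustrative target: demonstrate 99.9% end-of-life reliability with
# 90% confidence. The units fired would first be artificially aged via
# accelerated life testing, so the result applies at end of design life.
n = zero_failure_sample_size(reliability=0.999, confidence=0.90)
print(f"Fire {n} aged inflators; all must deploy correctly")  # ~2302 units
```

The reason aging comes first is exactly Fred’s point: the standard has to guarantee safety at the end of the design life, not just when the inflator leaves the factory.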
Anthony: But Fred, so you’re saying that... Michael’s muted again. I would have cut you off if I wasn’t. That’s okay, I got it. But Fred, you’re saying, okay, so they do this for the military customers. So what experience would a little company like ARC have with the military?
Come on.
Fred: You might recall that they’ve been in the military business for, oh, I don’t know, since World War II, something like that, building exactly this same class of devices.
Anthony: Oh there goes my facetious argument. Michael!
Michael: NHTSA has also pointed out something really interesting here, which is that these are safety devices.

This isn’t an axle, it’s not a tire. This is something that fundamentally is supposed to be keeping you safe in the car. So where you might see an axle shaft break while the vehicle is at a stoplight, or not moving, or moving slowly, and no one’s injured, here, when these things rupture, they are injuring and killing people.

It’s hard to escape shrapnel being ejected from an explosive device right in front of you. And with their heightened status as a safety device, there’s a bigger burden, and I think we certainly agree with that. There’s a bigger burden on manufacturers, when they’re creating something that’s explicitly a safety device, to do so in a way that prevents that device from becoming a danger to occupants. That’s a pretty powerful argument made in NHTSA’s initial defect determination, and I think that argument might win the day if this does end up in court.
Fred: I just want to remind our listeners what shrapnel is. When you say it like that, it’s kind of a euphemism.

Shrapnel is a red-hot piece of jagged metal that’s traveling at high velocity. So when we talk about shrapnel, we’re talking about red-hot pieces of steel traveling at velocities close to the speed of sound, directed at you. This is not a happy experience for anybody who’s been in the vicinity of shrapnel.
Michael: No, and in fact, in some of the first Takata cases that came out, there were some rather low-speed collisions involved, and police and emergency responders were initially investigating them as murders or suspicious deaths, because they had no idea what was going on at that point. So it’s certainly a violent and dangerous prospect.
Anthony: And listeners, that’s why I drive wearing full flak jackets and blast shields. I recommend you do the same, but if you don’t want to, go to autosafety.org and click that donate button. That’s right, there you go. That’s a horrible transition from hot metal shrapnel to donations, but hey, that’s what I do.
Is it time for Kyle? I think it’s time for Kyle. A little light-hearted fun. Time for Kyle. He needs his own theme music. Okay. You don’t know the Kyle I’m talking about? GM Cruise and those robotaxis of hell. In an incident on the 14th of August, first responders were treating a pedestrian who had been struck by a vehicle and had life-threatening injuries with significant bleeding.
Two autonomous cruise vehicles had stopped in nearby lanes and were not moving, blocking ingress and egress, according to the San Francisco Fire Department. As the emergency crews loaded the patient into an ambulance, the vehicles remained stopped in two lanes and police attempts to take over the vehicles manually were unsuccessful.
The fire department had to locate a police officer on scene and ask him to move his vehicle in order to leave the scene, which the report states further delayed patient care. Unfortunately, in this case, the patient was late getting to the hospital and did not make it. Now, GM Cruise disputes this, and they supplied video from their angle to tech-friendly outlets, which I found interesting, instead of supplying this video to, I don’t know, real journalists.

Yeah, or putting it on the internet for people to see. You’re right, why not put it on YouTube, or give it to, I don’t know, the New York Times, Wall Street Journal, Washington Post? Instead, they gave it to TechCrunch and some blogger at Forbes, who, as far as I can tell, hasn’t shared this video.
Michael: Where’s the video, Kyle?
Anthony: Kyle, where’s the video?
Michael: This one’s interesting in a number of ways. I mean, first of all, the person who ended up dying was struck by a vehicle driven by a human, which Cruise has brought up in their defense on a number of occasions. There’s some debate around whether the emergency vehicles were actually delayed, or delayed long enough, to hinder care being provided to the pedestrian victim of the previous incident. So there’s a lot of debate going back and forth between Cruise and San Francisco officials on this issue. We haven’t seen the video yet, so we don’t know.

We’re apparently, at this point, essentially relying on hearsay from both parties, but it would be interesting to see the video, and there’s not much else to go on at this point, really, without some actual tape of the incident.
Anthony: I go on the fact that, okay, emergency first responders have been doing this kind of thing for, I don’t know, a hundred years? Pretty long time, right? So in an emergency, they’re used to getting people to move their vehicles out of the way. And here, there’s no driver in the car, and the police are just kind of like, we couldn’t move it. And I guess what GM Cruise has said in the past is, we have an 800 number, and there are some YouTube videos you can download on how you can take over this vehicle. Which is insanity, and perhaps their argument is going to be, look, if you guys just got rid of all of your training for the last hundred years and just did this instead, it would have been okay.

Which is a dumb argument, and it’s one I’m going to suggest that GM Cruise makes, because, hey, when it comes to brains, let’s talk GM Cruise. I know. I know. But related to this, there was a large protest this past Monday outside of Cruise’s headquarters in San Francisco. I guess the residents of San Francisco are just sick and tired of these things.

It’s fascinating. Maybe it’s because they don’t have any say in the democracy of San Francisco: the residents are against it, the fire department’s against it, the police are against it, but the state of California says, go for it. And what I find fascinating, correct me if I’m wrong: Waymo is headquartered in Mountain View, California. These are Silicon Valley companies, and there’s no such thing as self-driving cars being allowed on the road in Silicon Valley, where these companies are headquartered. That seems surprising. When I worked in software, there was a saying: eat your own dog food. If you’re gonna get customers to buy this stuff, you’d better show that it’s really good. And these guys will not eat their own dog food. They’re letting their dogs shit all over San Francisco.
Michael: I mean, I think that’s somewhat accurate. They’re not listening to the people who are being affected by these cars. And we’ve continued, over and over, to question whether there’s an actual use case for these vehicles, or whether they’re just adding problems and commotion to the streets of San Francisco right now.

I think it’s certainly the latter. I don’t think you would see this type of strong response from emergency services, from city leaders and commissioners, if they didn’t feel like they needed to do more here to address the situation. But they’re powerless.

They functionally can’t do anything without running to the California DMV or the California Public Utilities Commission, which, as we know, has an interesting composition, including at least one former Cruise managing attorney. So there’s a huge problem for localities who are trying to keep their citizens safe.

And I think we need to listen to them a lot more than we’re doing, rather than rushing to get state laws out there that erase the ability for local folks to take charge of their streets.
Anthony: Crazy concept there. Related to driverless cars, and I think we’ve touched on this in the past, there was research out of King’s College London. Testing over 8,000 images through pieces of AI software, they looked at detection accuracy: can the cars identify adults and children as pedestrians?

The AI software had a detection accuracy for adults that was almost 20% higher than it was for children, and it was just over seven and a half percent more accurate for light-skinned pedestrians compared to their darker-skinned counterparts. The researchers say the major cause of this discrepancy is the images used in the AI training systems.

Basically, they’re putting a bunch of white adults into the training model, and the unfairness of the AI system is: I’m not used to seeing children and/or people with darker skin, hence I can’t identify them, hence there’s going to be a safety problem. In my mind, this seems like a pretty easy problem to solve. I don’t understand how you, actually, I do, as I say this out loud, I do understand how you create this problem for yourself.

You’re a white dude sitting in an office, like, hey, those are the only people who exist: white 25-to-35-year-old men. And that’s all I need photos of. Cause I’m on Grindr all day. Wait, what?
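The episode doesn’t describe how the King’s College team computed those gaps, but a disparity like "almost 20% higher for adults" is typically just per-group detection rate (recall) over labeled test images. Here is a minimal sketch with made-up counts; the numbers are hypothetical, not the study’s data.

```python
from collections import Counter

# One (group, was_detected) pair per pedestrian in the labeled test set.
# Counts below are invented purely to illustrate the computation.
results = ([("adult", True)] * 880 + [("adult", False)] * 120 +
           [("child", True)] * 700 + [("child", False)] * 300)

totals, detected = Counter(), Counter()
for group, hit in results:
    totals[group] += 1
    detected[group] += hit  # True counts as 1

for group in sorted(totals):
    rate = detected[group] / totals[group]
    print(f"{group}: detection rate {rate:.1%}")
# adult: 88.0%, child: 70.0% -> an 18-point gap, like the one reported
```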
Michael: And this sounds a lot like what we talked about with Beth Brooke a few weeks ago, about needing more female crash dummies, needing to get different bodies into different spots in the car, because right now you’re safest as a 50th-percentile white male in the driver’s seat. Here, we’re seeing something very similar, where the training the AI is using is mostly based on, presumably, Western, male, white people.

And so they’re not going to be able to detect anything else until those are introduced into the training regimen for this AI, or machine learning, whatever these systems are that the article is discussing. And this is just a fundamental matter of fairness. No one should have a decreased chance of being detected by an oncoming vehicle.

I don’t think anyone in America is willing to accept that as a proposition. And maybe that’s a little aspirational, but this is something that has to be worked out before these things get out on the roads and start making decisions that impact lives.
Fred: I think Michael put his finger on a fundamental problem here.

AI is really a misnomer. It’s not intelligence, it’s a correlation engine. And because it’s a correlation engine, it always works in retrospect. What does that mean? Training of these things is done by human beings. When a signal goes through the system, there’s a human being somewhere at the end of the system who says, okay, you interpreted this as a goat. No, it’s not a goat, it’s a human being. And then the system goes back and records that as a probability that what it thought was a goat is actually a human being. So unless you put in lots and lots of pictures with lots and lots of diversity, and you have human beings at the end of it who interpret all of those pictures without their own prejudices impacting their interpretation of the images, then you’re going to get a biased system, which is inadequate.

So I think calling it artificial intelligence really does a disservice to the whole public, because it’s not. If you think about it in chess, it doesn’t project forward into a novel situation. What it does is it says, all right, I’ve looked at these 80,000 chess games that have been played in the past, and because of that, I have a statistical probability of X that moving my king to knight four is going to allow me to win this game. But the human being, who’s actually got a brain, would be able to project forward, using a variety of algorithms built into a brain, like intelligence, and understand much better the overall landscape and what it takes to win that chess game.

The point of that ramble is that artificial intelligence isn’t intelligence. Artificial intelligence is only as good as the human beings who are looking at the images, or inputs, perceived by the artificial intelligence system and interpreting them with their own human biases and restrictions. Also, if you don’t bring enough data into the system, you’re not going to get a reasonable result out. They used to talk about computers as garbage in, garbage out. In this case, they’re not getting enough garbage in, and they’re not filtering through the garbage at the end.
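Fred’s "correlation engine" description can be made concrete with a toy version of the labeling loop he walks through: the system’s belief that a goat is really a human is nothing more than a tally of past human corrections, so anything the labelers rarely see, or mislabel, stays invisible. A deliberately oversimplified sketch, not any vendor’s actual pipeline:

```python
from collections import defaultdict

# corrections[model_guess][human_label] = times a reviewer said
# "you called it `model_guess`, it was actually `human_label`".
corrections = defaultdict(lambda: defaultdict(int))

def human_review(model_guess: str, human_label: str) -> None:
    """One pass through Fred's loop: a human confirms or corrects."""
    corrections[model_guess][human_label] += 1

def probability(model_guess: str, truth: str) -> float:
    """P(truth | model guessed `model_guess`), purely from past tallies."""
    seen = corrections[model_guess]
    total = sum(seen.values())
    return seen[truth] / total if total else 0.0

# The model keeps calling pedestrians "goat"; reviewers correct it,
# but the review set barely contains children.
for _ in range(95):
    human_review("goat", "adult pedestrian")
for _ in range(5):
    human_review("goat", "child pedestrian")

print(probability("goat", "child pedestrian"))  # 0.05: underrepresented
```

Nothing in those tallies projects forward to a situation the reviewers never showed it, which is Fred’s chess point in miniature.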
Anthony: The good news is that the researchers on this paper now hope manufacturers will be more transparent about how their commercial pedestrian detection AI models are trained, as well as how they perform, before they hit the streets.

I agree, but the positive view is: okay, these systems are very early on, they’re getting called out pretty early, and this is not an expensive fix. It doesn’t even cost them anything. It’s, hey, let’s just get a wider array of images to train these systems on, or else we all look like racists with artificial intelligence that hates children.
Fred: That’s true, but the state of the art is for the companies to just put these cars out on the street and then use the data they accumulate as they drive to try to improve the driving algorithms. This is really ass-backwards, right? They’re putting untrained systems out on the street, hoping that they’re going to train themselves without the benefit of the humans.

And the only parameter the human beings interpreting it, meaning the test drivers, have is whether or not there’s a catastrophic event that makes them take over control of the vehicle, or it crashes, or something else horrible happens. So this is a very, very poor way of training a machine learning system to avoid critical circumstances that can kill the person who’s inside the car.
Michael: But that’s what Tesla is based on, right? I mean, that whole concept is machine learning through these cameras to produce a robo taxi.
Fred: So where’s the bank of unbiased humans at the end who are doing this interpretation? I’m gonna bet you a nickel that there isn’t any such person. They’re just waiting for a catastrophic outcome and then saying, oops, I guess that’s bad, we’ll have to look into that.
Anthony: It depends on what bad is to you.
Fred, this makes me beg the question. I’m gonna ask you the question. I don’t know if you’re ready for it. How safe is safe enough? Ah, welcome to today’s Tao of Fred, my friend.
Fred: All right. The rationale for giving AVs a free pass in state legislation, meaning vacating all the normal restrictions on automobile safety and travel, is that they will inevitably, dramatically improve highway safety. And we’ve all heard the blather about a 94% reduction in accidents and collisions and all that sort of stuff, which is wrong; by the way, nobody has ever produced a study that says there will be a 94% reduction in human-induced crashes. But there was an interesting video that came out this week from our friend Phil Koopman, which talked about this in some interesting terms.

Now, we’re as guilty as others of saying the minimum standard for introduction onto the road should be that you’ve demonstrated the vehicle is as safe as human drivers. I want to emphasize that is a minimum standard, because Phil points out that if your benchmark is the overall statistics, which is one death for every 100,000,000 miles (and, by the way, serious injuries run roughly a factor of 10 more frequent, so about one for every 10,000,000 miles), then you are including all of the drunks and spectacularly bad drivers in your standard for what "as safe as a human driver" means.

If your standard is a human driver who’s not drunk and who’s attentive, we’re really, according to Phil, talking about one death roughly every billion miles driven. Now, it’s interesting to me that if we look at that, it actually comes back to the NHTSA study that talks about contributions to death from various factors.

And it’s very similar to the number you come up with if you look at NHTSA’s figures for deaths due to vehicle failures. Now, we’ve been saying on the show that deaths due to vehicle failures should be the parameter for AVs, because, after all, if they’re driving and something bad happens, like an injury or death, it’s because the vehicle has failed.

I think that we are converging with Phil on a standard of roughly one death per billion miles, something like that, before you can turn these things loose on the street and say that they are relatively safe compared to human drivers, and compared to human drivers in the best of circumstances.
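To put numbers on the two benchmarks Fred and Phil Koopman are comparing, here is the arithmetic on the rates cited in the episode; the fleet mileage is a made-up figure for illustration.

```python
# Fatality-rate benchmarks discussed in the episode (deaths per mile).
AVG_HUMAN_RATE = 1 / 100_000_000          # all drivers, drunks included
ATTENTIVE_HUMAN_RATE = 1 / 1_000_000_000  # sober, attentive driver
INJURY_FACTOR = 10                        # serious injuries ~10x deaths

fleet_miles_per_year = 50_000_000         # hypothetical AV fleet exposure

for label, rate in [("average-human bar", AVG_HUMAN_RATE),
                    ("attentive-human bar", ATTENTIVE_HUMAN_RATE)]:
    deaths = fleet_miles_per_year * rate
    injuries = deaths * INJURY_FACTOR
    print(f"{label}: {deaths:.2f} expected deaths/yr, "
          f"{injuries:.1f} serious injuries/yr")
# average-human bar:   0.50 deaths/yr, 5.0 serious injuries/yr
# attentive-human bar: 0.05 deaths/yr, 0.5 serious injuries/yr
```

The factor of 10 between the two bars is the whole argument: a fleet graded against the average driver, drunks included, is allowed roughly ten times the deaths of one graded against an attentive, sober driver.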
So the other thing he points out is that it is not okay to use a single safety parameter, because one of the things that could happen is risk transfer onto vulnerable populations. What that means is, let’s say you have an algorithm that always kills a pedestrian instead of killing a person inside the car.
Anthony: I have an algorithm that always kills a pedestrian instead of someone in the car. Was this interactive?
Fred: I’m not, I’m not sure where you’re headed with that.
Anthony: You said, let’s say, and I, I wanted to join in the conversation.
Fred: Thank you. I appreciate your support. But, okay, that’s an additional parameter: there should be no risk transfer onto vulnerable populations. There should also be a regulation in place that says there’s no negligent computer programming. You should, don’t, come on.
Anthony: Sorry, this makes me flash back to interviewing software engineers. This one guy came in and he flat out states, I write bug-free code.

He said that flat out, full of middle-aged-white-man confidence. I write bug-free code. And I reacted the same way I did just now: I burst out laughing, and I was like, then we can’t afford you. Have a nice day.
Fred: All right. This perspective is just a little different. What he meant by negligent computer programming is code that allows vehicles to violate the law: going through red lights,
Anthony: like rolling through stop signs.
Fred: Got it. Oh, yeah. To name one. And we’re not sure about the others. So that’s an important additional parameter. And then he also said, and this makes obvious sense, that vehicles should conform to industry standards for safely test driving vehicles before you start testing them on city streets.
Now, there is a standard from SAE, called SAE J3018, that addresses the safety of test driving autonomous vehicles. But it turns out that’s a very immature document. We’re currently in the process of updating that standard for what the test driving parameters should be. So I think that even though it’s a great idea, it’s immature, and the industry should look very cautiously at testing these things on the street when there is no applicable standard that’s been updated to reflect the current state of the art.
Anthony: Does anybody actually comply with SAE blah blah blah blah?
Fred: J3018? I don’t know of anybody who’s doing that. I don’t know of any company that has said in their literature that, yeah, we’re doing J3018. What they did instead is they started an industry group called the Advanced Autonomous Vehicle Safety Coalition, which contains no input from any regulator and no input from any consumer advocates.

It’s just the promoters’ ideas of what’s good for them. So yeah, there’s nothing out there that’s universally accepted.
Anthony: So Fred, you work with SAE, right? I do. Okay. I think part of the problem is, and maybe you can address this, changing these things from SAE J80-blah to the "let’s not be a dick" standard.

The "let’s not be a dick and hurt people" standard, because I think it’d be easier for legislators to get behind that. There’s not a single legislator, even as lunatic as some of them are, who’d be like, wait a second, no, I support hurting people. Instead we give them these very technical, dry names that not even the engineers’ mothers would love.
Fred: That’s true, but I think you’re being very sexist about your observation,
Anthony: Their fathers wouldn’t love them either.
Fred: No, about the whole dick thing. We need to be more universal. Anyway, there’s another parameter that Phil talks about, which is that other ethical and equity concerns also need to be included: discrimination for any of a variety of reasons, and, very importantly, liability associated with these things. Mercedes says, we’ll accept responsibility for any problem that occurs during the testing of Level 3 in Nevada, where it’s been approved. But it hasn’t been tested in the courts yet, and we suspect that what’s going to happen when they’re in front of the judge is they’re going to say, like Tesla does, nope, not our problem.

We have fine print that says you need to be in control at all times, and yeah, we were just kidding about that whole responsibility part.
Michael: Yeah, and they’re accepting responsibility for some actions in civil court. They’re not accepting responsibility criminally, because they can’t even make that determination, right?

The state could still charge the driver of one of those vehicles, regardless of what Mercedes wants. So they can only partially accept responsibility in that scenario, and it’s still going to be a legal burden on whoever’s operating the vehicle.
Fred: And Phil also points out that there should be no allowance for preemption of local concerns, like we’re seeing in San Francisco, where they’re just overruled. Because, after all, the people who are in charge of the city streets are the people in the city. They fix the potholes, they put up the stop signs, they know what’s going on in the city to the extent that anybody does. There should be no preemption by higher authorities that say you must vacate all of your rules because the steamroller’s coming and you damn well better make room for it.
Anthony: The Steamroller is a new model from GM, right? It’s part of their Hummer EV line, the Steamroller?
Fred: It is, and they’re injecting that steamroller technology into their entire AV line.
Anthony: Oh, good. Sorry, continue.
Fred: Yeah, they’re calling them the Kyle Mobiles. And finally, there should be an absolute prohibition against interruption of emergency services, whatever form that takes.

And if you cannot field a vehicle where you can guarantee that it’s not going to interrupt emergency services, you damn well shouldn’t put it on the road. People’s lives are at stake. And whether or not it turns out that Cruise was liable in San Francisco for the death of that person in the ambulance, it is very clear from all the evidence that the Cruise vehicle remained stationary in the second lane from the right, rather than pulling over to the right, like any human being would, right?
Michael: Yeah, it was screwing up traffic. Whether or not it screwed up that ambulance, it was once again screwing up traffic by having, what do we call it, an existential crisis in the lane by itself.
Fred: So anyway, for our listeners: how safe is safe enough? Who knows, but we’re nowhere close to that now.

And it’s very important to establish that, because how would a developer know they’ve reached the destination if they don’t know what the hell the destination is? You’ve got to have somebody set a standard for this, somebody to establish rules of the road where no rules of the road exist. That’s gotta be either us or NHTSA.
Anthony: So I vote us, because...

Michael: We would be faster. However, if we made the rules, we wouldn’t be taking into account the investors in Cruise. We would more be taking into account the people on the streets who are impacted by this, whereas they’re taking into account their bottom line.

So that’s a completely different calculation, one that often leads to humans literally getting thrown under the bus.
Fred: Absolutely. And the start of this technology was really about 10 years ago, more or less, when they were promising that self-driving vehicles would be on the street in 2014, 2015.

So the investors are getting a little nervous, I think, and a little pushy. It’s hard for them to get their money out if there’s no IPO in the offing, and these things are years away from an IPO. So I imagine Kyle’s getting a lot of pressure from his overseers at Cruise, saying, we need to get our money out here. We’ve only got 20 billion; we want 30 billion out. Get your ass in gear, Kyle. Move this money out.
Anthony: Whoa, the wealthy. Not only do they have to deal with this, but they’re stuck in the mud at Burning Man. Oh, the poor wealthy. Speaking of stuck in the mud, this is not stuck in the mud at all, but this is going to be one of our new favorite topics: the weight of electric vehicles.

We unfortunately can’t link to this article, but if you’re a subscriber to Automotive News, you can read it. It’s a good one. It’s talking about how vehicles keep getting larger and larger and weigh more and more, and that is dangerous. There’s a quote in here. And I, oh, I had it highlighted.

Oh, come on. Where did it go? It’s fine.
Michael: It’s okay, Anthony. I think our listeners have accepted your incompetence at this point.

Anthony: Oh, come on. No.

Fred: It’s a coffee defect.
Anthony: It’s not a coffee defect. This is a human failure. Okay. This is why I’m not artificially intelligent. But basically, the quote in there is from a young man named Michael Brooks, talking about how it’s a physics problem: the heavier these vehicles get, the more danger there is for other road users.

I want to point this out because, as a layperson, which I think I am most of the time, because I am laying down while recording this: you don’t think of road users, you think of, okay, other drivers. No! Road users are pedestrians, people on bicycles, people on unicycles. It’s everybody else on the road, and I think, at least, I tend to forget that. So it’s not just worrying about the other driver, it’s being aware that, oh, someone decided to bolt across the road in front of me. And this article is great because it says, hey, instead let’s focus on intelligent speed assistance technology, let’s focus on automatic emergency braking, let’s make these things weigh less.

But hey, you guys are the ones who were interviewed, not me.
Michael: This is obviously a problem we’ve talked about for a while now, and it’s happening right in front of our eyes. The data isn’t in yet, because there simply aren’t enough of these super-heavy vehicles on the road. The GM Hummer EV is slowly coming out; I think over a thousand or two thousand have been made now, and we’re not really sure how many of those have actually made it to the streets and are running around. The pickup truck and large SUV market is starting to pick up EVs, and those are going to be going out on the roads over the next few years.

So the data to go back and look at this isn’t there yet. We can’t yet show that there are more fatalities, more injuries. But our experience over 50 years suggests that weight is undefeated in crashes. The more weight you add to a crash, the worse outcomes you’re going to have.

It’s sheer physics, and these vehicles aren’t being dramatically redesigned to account for the additional weight. If you look at, for instance, an F-150 Lightning versus an F-150, they look very similar, except one weighs 1,800 pounds more than the other.

And another thing in that article, which just continues to be shocking to me, a figure we see going up every year: now almost 80% of new vehicles sold are SUVs or trucks. I own a sedan, and it doesn’t look like sedans are going to be available much longer. Maybe they’re going to go the way of the convertible, and manufacturers are simply going to stop making them, because the demand appears to be shrinking. And I don’t know if it’s just demand. I think a lot of it has to do with the fact that the big auto companies make more money on SUVs and large trucks, so they’re incentivized to make more of those to keep making more money. It ensures economic growth in their company, where the sedan market was apparently limited in profits for them. So we need smaller cars on the road, and we need a lot more of them in the next 10 years. Right now we’re headed in the opposite direction, and we think that’s going to have bad outcomes for safety.
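Michael’s "sheer physics" is easy to make concrete: at equal speed, crash energy scales linearly with mass, so the 1,800-pound gap he cites between an F-150 Lightning and an F-150 is extra energy that has to go somewhere in a collision. A back-of-the-envelope sketch; the base curb weight is an assumed round number for illustration.

```python
LB_TO_KG = 0.4536
MPH_TO_MS = 0.44704

def kinetic_energy_joules(weight_lb: float, speed_mph: float) -> float:
    """KE = 0.5 * m * v**2, with unit conversions from lb and mph."""
    mass_kg = weight_lb * LB_TO_KG
    speed_ms = speed_mph * MPH_TO_MS
    return 0.5 * mass_kg * speed_ms ** 2

BASE_TRUCK_LB = 4_700                # assumed curb weight for illustration
EV_TRUCK_LB = BASE_TRUCK_LB + 1_800  # the 1,800 lb delta from the episode
SPEED_MPH = 35                       # a typical urban impact speed

ratio = (kinetic_energy_joules(EV_TRUCK_LB, SPEED_MPH) /
         kinetic_energy_joules(BASE_TRUCK_LB, SPEED_MPH))
print(f"The heavier truck carries {ratio:.0%} of the base truck's "
      f"crash energy at the same speed")  # ~138%
```

Because the ratio is independent of speed, the extra 38% of crash energy follows the heavier truck into every collision, which is why redesigning the structure around the added weight matters.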
Fred: We need more public transportation too. We need more buses. We’ve talked before about the whole idea that the savings available from the use of electric vehicles are pretty marginal. If everybody were driving an electric vehicle, you might save 10 or 20% of the energy that’s currently going into gasoline consumption by these vehicles. But saving 10 to 20% overall, that’s not going to save the world, folks. What will save the world is for you to take the bus, take the train. One of the problems with that, of course, is that the buses and trains have to exist before you can use them. There’s really a high-level policy situation here that needs to be addressed, and it’s not going to be, I don’t think. A nation that has decided to use air travel instead of trains, even when the point-to-point time would be the same, is coming up with an odd energy solution.
Anthony: Hey, Fred, what’s good for GM is good for the U.S., right? Maybe not.
Fred: They can always build locomotives.
Anthony: Oh, a GM locomotive. I don’t know. I’m not...

Michael: Flying cars.

Anthony: Flying cars, flying, flying cars. The answer. Oh my word. I look forward to the recall roundup: we have flying cars. Let’s do some recalls and round them up. Ford! Ford re-re-re-re-recalls again. This is, according to Michael’s note, the fifth recall on these vehicles since March 2021.
Wait, what? Five recalls since March 2021. Damn. Hey, listeners, if you’re playing at home, guess what this recall concerns. Do do do do do do. Hey, if you said rearview camera, you’re correct. That’s right. This is potentially 169,000-plus Ford vehicles, 2020 to 2023 models: ’21 Navigators, some Transits, some Ford Broncos. The rearview camera, or 360-degree view camera if equipped, may not display a rearview image when the vehicle is placed in reverse.
I’m not gonna assume that’s what’s happening here, but I’m gonna tell you a story. When I was in college, I had a professor who worked on GM’s assembly line in the 80s, and he said, one of the things we did was a work slowdown. They took something that smells really bad and painted it inside the vents on some Chevy Malibus, so every time people put on the heat, it would smell like rotten eggs in the car.

I know, horrible. But GM had to keep bringing these cars into the dealer, and they had to replace all these vents, because they couldn’t figure out what it was. With all of these rearview camera issues, is this what’s going on? Is this some sort of labor slowdown? Because I just can’t understand why else this is happening.

Do we have any guesses?
Michael: This one just smacks to me. There have been so many recalls on this one issue, and every one of them was kind of a "we’re not really sure." In one of them, the root cause wasn’t determined: here, we’ll try this software fix. But ultimately, the recall that’s coming out now says, we’re going to replace the camera, finally.

So hopefully this is, finally, the hardware fix that wasn’t happening in the previous recalls on this issue. We see something similar in a lot of recalls now, where manufacturers appear to be scrambling to find a cheap software fix to pop onto these vehicles so that they don’t have to replace hardware. Instead of pulling into a dealership and getting a quick software update, or getting an over-the-air update, you’ve got a physical repair to make that involves a mechanic, and a lot more work and time and money goes into the situation.

So software updates as recalls are much preferred by the industry at this point, even though they often don’t address the physical defect that’s present in these vehicles. They’re mitigating the problem in many respects, and they offer manufacturers a cheaper alternative to physical repairs. So they’re very popular, and that’s what this situation looks like. And we’ve had this two-and-a-half-year period between the time the defect was identified and now, when they’re actually going to say, okay, we’ll replace the camera. That buys them a lot of time and saves them a lot of money in the process, and makes it look like they’re doing their due diligence, when, in fact, under the covers, we’re not sure what’s going on in a lot of these situations.

We’ve seen this a number of times recently, not just with Ford, also with Hyundai and Kia and other manufacturers, where these recalls seem to drag on: you get a new recall, you’re replacing something different this time, because the software in the first one didn’t work, right? If, in the first instance, these manufacturers made the actual repair that they know will fix the problem, which is a little more expensive, these issues could be fixed more quickly, and consumers would be a lot happier with their purchases. So we think ultimately these kinds of strategies are going to fail.

And it’s something the agency needs to keep an eye on, because they’ve got a few recall queries going on right now on similar issues, and as software, computers, and cars merge into one, this problem is only going to get bigger.
Fred: Hey, Michael, can people go to the Piggly Wiggly to get this fixed or do they have to go to a Ford dealer?
Michael: They have to go to a Ford dealer. And that’s another thing, and we talk about this in the Hyundai-Kia theft context: it would be really nice if the software recalls could be brought to consumers, rather than the other way around. You’re talking about putting a guy in a car with a USB stick who can drive around to consumers and plug it in for them, instead of having every consumer visit the dealership one at a time.

And if we want to create better efficiency on our roads, a mobile recall repair for software issues makes a whole lot of sense. It’s something that I hope a lot of manufacturers start doing.
Fred: Oh, that’s a great idea.
Anthony: Hey listeners, Fred may be on the take of the Piggly Wiggly Corporation, but we’re not.

So go to autosafety.org, click donate, become a monthly donor: five bucks a month, 10 bucks a month, 5,000 a month, whatever you want to do. We’ll appreciate it. Our next recall: again involving Ford, trunk emergency release may fail. Nope. Lie. This is Kia. I’m sorry, Ford. Trunk emergency release may fail.

This is Kia: 319,000 vehicles. Recalling certain 2016 to 2017 Rios, Optimas, Optima Hybrids, and Optima Plug-Ins. The trunk latch base may crack, preventing the opening of the trunk from the inside. And we’ve discussed the safety need around this very issue itself. So definitely get that fixed, right?
Michael: Absolutely. This is what we discussed with Janette Fennell from Kids and Cars. She and her husband had a terribly traumatic experience in the trunk of a car, and Federal Motor Vehicle Safety Standard 401 was the result, after a long fight to get it: basically, to ensure not only that there’s a glow-in-the-dark trunk release handle in the trunk, which is part of the standard, but also that the trunk can be opened from the inside, which is what’s important in this recall.

It appears it’s not damage to the glow-in-the-dark handle that you’re required to pull; rather, there’s a problem with it actually opening the trunk in this recall. And that’s important for anyone who’s trapped in a trunk. It’s not just important if you’re kidnapped; it’s important if you’re a kid and you climb in the trunk on a hot day.

So there are a lot of reasons why we need to be able to escape trunks. And Anthony’s kids probably have a few.
Fred: But I want to mention that most cars now have an electric trunk release, right? You can push a button on the dashboard and the trunk pops open. Yep. And if you have children, maybe they’re not like my children, but children tend to get into cars and push buttons to see what can happen.

If you have a Tesla, they can push a button and drive the car away, but that’s a different issue. This is a real safety issue for anybody with children, or anybody in their life who may like to push buttons and fool around. And there’s a lot of us out there.
Anthony: If you like to push buttons, maybe a career in engineering is for you.
All right, we’re out of time. I’m just gonna close on one fun little story that I didn’t share with either of you guys, because it’s too good. This is an article in the Guardian. The headline is, "Toyota blames factory shutdown in Japan on insufficient disk space." Yeah. Toyota shut down all of its factories in Japan because they ran out of hard drive space.

Yeah, and these are companies that we think can make autonomous vehicles. Give me a break. They reiterated, this is not a cyberattack. Basically: we made a mistake with our database and, I don’t know, didn’t have any extra hard drives laying around. I love that.

Yeah. Auto manufacturers: just as messed up as we are.
Michael: Those are the guys in charge of developing the new solid state battery that’s gonna save us all.
Anthony: Oh, hopefully they have enough disk space for that, but we’re outta disk space. Thank you, listeners. We’ll be in touch next week with more exciting news, and we’ll see whatever dumb shit Kyle does next week.
Fred: Thank you. Bye. Bye.
Michael: Thanks, everybody. For more information, visit www.autosafety.org.