Ground Transportation Podcast

Robot Drivers: Safer Than Humans… or Total Chaos?

Ken Lucci & James Blain Season 1 Episode 64

Send us a text

Autonomous vehicles promised safer roads, fewer accidents, and a seamless future — but the real-world footage tells a very different story. In this episode, Ken Lucci and James Blain react to a curated set of shocking, hilarious, and sometimes disturbing AV clips, including Tesla Autopilot mishaps, snowy-road confusion, near-miss lane changes, and even a tragic Cruise-vehicle incident.

From the “cardboard-box test” debate to the ethics of the trolley problem, they break down what the technology actually sees, how it makes decisions, and why operators shouldn’t assume AI thinks anything like a human. They also explore real implications for operators:

  • Will AVs really lower insurance premiums?
  • How close are we to driverless buses?
  • What happens to safety drivers and workforce transition?
  • Are regulators and engineers moving fast enough — or too fast?

It’s equal parts education and entertainment as Ken and James take on robo-taxis, Tesla logic, data collection, human instinct, AV ethics, and the future of ground transportation — one wild video at a time.

Whether you’re curious, skeptical, or cautiously optimistic about AV tech, this episode will have you laughing, thinking, and maybe gripping your steering wheel a little tighter.

At Driving Transactions, Ken Lucci and his team offer financial analysis and KPI reviews for specific purposes like improving profitability, enhancing enterprise value, business planning, and buying and selling companies. So if you have any of those needs, please give us a call or check us out at www.drivingtransactions.com.

Pax Training is your all-in-one solution designed to elevate your team's skills, boost passenger satisfaction, and keep your business ahead of the curve. Learn more at www.paxtraining.com/gtp

Connect with Kenneth Lucci, Principal Analyst at Driving Transactions:
https://www.drivingtransactions.com/

Connect with James Blain, President at PAX Training:
https://paxtraining.com/

James Blain:

there's a whole empty left lane here, you can see at the river. That guy's coming. Get the hell out the way. I don't cuss real often, but in my head all I'm hearing is that Ludacris song from when I was in high school: "Move, bitch, get out the way."

Get out the way, get out the way.

James Blain:

Hello everybody, and welcome back to another exciting episode of the Ground Transportation Podcast. Now, you know me, James Blain from PAX Training. You know my wonderful co-host Ken Lucci of Driving Transactions, right? But if you missed the last time we had him on, you might not know John. John is our wizard behind the scenes. He is the producer. He is back by popular demand. So kind of like Top Gear of old, you have those challenges the producer gives you, and you film them. If you don't know about cars, you won't get that reference. Don't worry, you'll figure it out later. But here's the fun thing. So we are once again handing the reins over to John. So John is gonna be our main pilot on this episode. Ken and I are along for the ride. And John, what are we talking about today? I think it's, I think it's autonomous vehicles, right?

Ken Lucci:

Unaccustomed as I am to talking about autonomous vehicles.

James Blain:

yeah. It's.

John Tyreman:

the way that this is gonna work guys, is I'm gonna play a couple videos that are going to be around a certain theme, and then we're gonna have a conversation around that. So

James Blain:

Do we get to determine the theme, or will you be giving us the theme?

John Tyreman:

I'll be giving you the theme.

Ken Lucci:

I love

James Blain:

It'd be much more fun if we had to figure it out on our own. See if we even got remotely close to

Ken Lucci:

It's, it's Friday. I am not that creative. Oh, here we go. Here we go. I like this. I like where this is going.

James Blain:

Okay. Oh, geez. Elon out the gate, out the hole.

John Tyreman:

Elon out of the gate. So this one is a, um, a Tesla in fully autonomous driving mode in the snow. Let's check it out. Seems to have taken that turn pretty well.

James Blain:

I'm watching that hand. I wanna, oh, there it is.

Ken Lucci:

Oh, there you go. There you go. There you go. Wait a minute. For those people who are not looking at this, we're looking at a Tesla that was going straight down the road in a snowstorm, but then magically it started going off to the left-hand side on its own, and

James Blain:

and it did the turn fine, like it turned onto the road fine. But you saw the road and you're like, Ugh.

Ken Lucci:

What do you think happened there?

James Blain:

I mean, so a couple things. So one, Teslas are heavy, right? They weigh a ton. So my

Ken Lucci:

Because of the, because of the battery.

James Blain:

Yeah, you've got kinda that skateboard. You got a big battery pack underneath. So I gotta tell you, I mean, just me, I'd be a little nervous driving a Tesla in the snow anyway, but I think what happened is it lost a little bit of traction. It starts sliding sideways. And I gotta tell you, if you were watching the wheel, like, John, I don't know if you can turn the audio off and pull it back a little bit. If you're watching the wheel coming outta the turn, it's a little twitchy. I was watching, 'cause this guy's got his hand on his thigh here, trying to like, I'm not gonna touch the wheel. Um, and it looks a little twitchy when he's doing that. I gotta tell you, at least in my opinion, I think it was already struggling. I think this was just asking for trouble.

Ken Lucci:

So, so let's back up a step. I mean, you know, heretofore Waymos have been in Phoenix. You know, besides dust storms, it's really easy, uh, dust storms aside, they're really easy from a weather perspective. It's been in San Francisco, besides the fog in the winter a little bit. It's been in

James Blain:

which is not by accident, right. They,

Ken Lucci:

right? Do you think?

James Blain:

choosing places that don't have

Ken Lucci:

Right. And the latest is Austin, Texas. So do you think that this is what we're going to come to, that when these autonomous vehicles are in a snowy weather condition, they're just not gonna be on the road, or what?

James Blain:

So, all right, so let's, let's get a couple things out of the gate because the first is humans learn in a linear fashion. I'm a little bit smarter tomorrow than I was today. I'm a little bit smarter the next day. I'm a little bit smarter. The

Ken Lucci:

I don't think that's true, 'cause I don't think I'm any smarter now at 60 than I was at 16.

James Blain:

By the way, we go downhill a lot more as AI goes up.

Ken Lucci:

There you go. There you go. Yep. I think I peaked when I was 40, so it's

James Blain:

There you go. Yeah. Yeah. Somebody peaked in high school.

Ken Lucci:

So, tell us what you mean by that. So, humans are linear; we learn from our experiences.

James Blain:

We are linear in the way that we progress, in the way that we learn. And the other part of that is our expectations of AI and technology are generally linear: we expect them to be a little bit better than they were. The problem is, with technology, and we've talked about this in the past with Moore's Law, 1, 2, 3 isn't how you count. With AI, when you're thinking of computers, it goes 1, 2, 4, 8, 16, 32, 64, 128, 256, 512, 1024, and then from there it starts just absolutely hockey-sticking. The same thing is happening with the advancement of AI and self-driving. Um, they've been doing this stuff all the way back to the nineties when NASA had their
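The doubling James describes is easy to see side by side with linear growth. A toy sketch in plain Python, purely for illustration (nothing here comes from any actual AV or AI system):

```python
# Toy illustration of the contrast James draws: human learning is roughly
# linear (a little better each step), while compute/AI capability doubles
# each step and eventually "hockey-sticks".

def linear_progress(steps):
    # 1, 2, 3, ... -- a little better each day
    return [n + 1 for n in range(steps)]

def doubling_progress(steps):
    # 1, 2, 4, 8, ... -- Moore's-Law-style doubling
    return [2 ** n for n in range(steps)]

print(linear_progress(11))    # [1, 2, ..., 11]
print(doubling_progress(11))  # [1, 2, 4, 8, 16, 32, 64, 128, 256, 512, 1024]
```

After eleven steps the linear curve has reached 11; the doubling curve has reached 1024, which is why expectations calibrated to human learning keep underestimating the technology.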

Ken Lucci:

Oh yeah, absolutely. So are you saying that the more and more that

James Blain:

It's gonna get better

Ken Lucci:

it's gonna get better? So, all right, so I'm gonna play devil's advocate. Why didn't they start the damn things in Anchorage, Alaska, to prove your point?

James Blain:

It is the

Ken Lucci:

Because doesn't it snow up there 340 days a year?

James Blain:

So, on behalf of my friends Charlie and Athena, I'm gonna say it's beautiful in the spring. Come visit, right? Go to Anchorage. It's incredible. You don't start by having a kid dead sprint, because at the end of the day, there are human engineers that are trying to do this. And as people, we can't just start with an infinite amount of money, an infinite amount of research, an infinite amount of everything. And the other thing is, believe it or not, the Finns and the Swedes are already driving these things on frozen lakes.

Ken Lucci:

very good point.

James Blain:

The thing to keep in mind is we can't beta test it with people on frozen lakes. So what we're seeing with these rollouts in these highly controlled environments is the public face of it: hey, we're ready for you to test this, we're ready for you to drive this. Versus, you know, going to Finland, going to Sweden, going to, you know, the northern parts of Russia to

Ken Lucci:

Well, wait a minute. To follow your logic, then they had to do Phoenix, San Francisco, LA to get the other parts of topography down. Driving on real-world stuff. And now they're gonna progress and they're gonna go into heavy rain areas. Oh, wait till these things get down into Miami and they get into a deluge at three o'clock in the afternoon. There's gonna be 150 of these Teslas parked on the highway, underneath the bridge, waiting for the rain. Waiting for the rain.

James Blain:

I I mean we, we all know about the parking lot incident, right?

[Clip audio] So I was like, where's that coming from? I looked down and I was like, I think it's coming from the Waymo cars. Yeah. Kind of bizarre, right? Neighbors in San Francisco dealing with some sleepless nights, as they say. Driverless cars in a parking lot nearby, confused, start honking at each other. Maybe they're just talking. In a statement, Waymo says it's heard the complaints and they're working on a fix.

James Blain:

So, I mean, these things are gonna happen, but would we rather that happen like, I don't know, in the middle of New York in the winter? Or do you want that to happen? You know, in Arizona, in the middle of spring where you know, if there's rain or anything, it just pulls over and

Ken Lucci:

Well, and to your point, you had said on an episode a while ago that the Tesla is constantly learning. So that episode where it had that close call will now be used for learning. It will have learned that, and it will be taking that into account in the future.

James Blain:

Yeah, ideally. Now I will also preface with: not every incident gets looked at, not every incident gets reported, and if the Tesla doesn't know it screwed up, it might not be in a training set. So keep in mind, a lot of this is, it's kinda like software. You know, if you are using software and you're having problems and you just get frustrated and don't tell anybody

Ken Lucci:

You gotta report it in. Yeah. All right. Let's go to the next one.
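James's caveat here, that an incident the car never detects and nobody reports may never enter a training set, can be sketched as a toy filter. Every field name below is invented for illustration; this is not any real Tesla or Waymo data schema:

```python
# Hypothetical sketch: only incidents the system flags itself, or that a
# human reports, ever reach the training set. Field names are made up.
incidents = [
    {"id": 1, "self_detected": True,  "human_reported": False},
    {"id": 2, "self_detected": False, "human_reported": True},
    {"id": 3, "self_detected": False, "human_reported": False},  # silently lost
]

training_set = [i for i in incidents
                if i["self_detected"] or i["human_reported"]]
print([i["id"] for i in training_set])  # [1, 2] -- incident 3 is never learned from
```

Which is exactly why "you gotta report it in": the unreported incident is invisible to the system that's supposed to learn from it.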

James Blain:

Oh, it's Tesla again. John's really beating on Elon today,

John Tyreman:

Yeah, we're gonna beat on Elon a little

Ken Lucci:

Are we gonna get a cease and desist order?

James Blain:

by the way, Elon, if you wanna be a guest and explain this, we would love to have you on.

Ken Lucci:

You know. Yeah. You know what, and it's kind. Yeah.

James Blain:

You, to us, are like the mad scientist. Like, I like to give Elon crap, but for God's sake, he was like, I want a rocket, I'm gonna build a rocket. So we'd love to have him as a guest.

Ken Lucci:

And you know, you go on Joe Rogan. I mean, are we not as good as Joe Rogan? Is that what you're saying, Elon?

James Blain:

are you, are you saying that we gotta smoke pot if Elon comes on? Because if that's what it takes to get Elon on, we will film an episode in Denver or California if we

Ken Lucci:

Well, that's all you, that's all on you. I,

James Blain:

Now, now that said, I cannot partake because I have a CDL and I'm not willing to give that up, but

Ken Lucci:

John will smoke with Elon then. All right. Go ahead.

James Blain:

We're gonna get John High with Elon.

John Tyreman:

And I can toke up together. All right, so, uh, I wanna switch things up, and I wanna preface this video with the big question. Let's watch this video, and I'd love for you to react and answer. So, James, how do you train someone, a driver, uh, an AV attendant if you will, to stay alert when their main job is to watch a machine that rarely makes mistakes, until it does?

James Blain:

He's given me an impossible question and now he wants me to watch a video. This is gonna be great, guys. Digging the techno music. We've got the rainbow just driving down the highway. Oh, snap. Oh, we got him. I love that. All right, so we've got the Tesla driving down the highway, and that looks like a Toyota, I don't know if that's a BRZ or what it is. It looks like a white Toyota crossover in the left-hand lane; we're in the far right lane, and it looks like it's coming over. This guy's in full self-driving. You got the little rainbow road out in front of it. You can hear the techno music running in the background. Um, this guy clearly does not have his hands on the wheel. I think he's holding the phone. I think that's why it takes him a second to get his hand back on the wheel. Play it again real quick, 'cause this one went really fast. I wanna make sure I have it before I... All right. So, okay. All right.

Ken Lucci:

Describe it again. It's driving down the road, and then you see this white SUV veering over into the lane in front of the Tesla. Why didn't the Tesla see it, James?

James Blain:

So, two things. This looks like almost a self-inflicted PIT maneuver on behalf of the guy coming over, right? So probably the front left, um, kind of fender area is probably right at the rear tires, at the door. And the other thing is this guy's use of a turn signal. It's kinda like, um, I don't know if you guys have seen the meme where it's like: I've gotta get to the left lane, I've gotta cross eight lanes, I'm gonna turn the turn signal on, good luck everybody. And they just kinda YOLO it over. I feel like the Toyota put the turn signal on and then was like, all right, turn signal's on, wham. Um, two things. I think, one, the thing that we forget is that even though this thing can see ahead, it's not going to process intention the way we do. So one of the cool things, and I believe he was actually a KU professor, so a professor came out to Orlando, and I'm on the executive committee for the Bus Industry Safety Council. And one of the sessions that we did in Orlando that I absolutely loved was they had a professor come talk, at a collegiate level, about basically distracted driving. And one of the things he did is he put up a picture, and he's like, here's a highway. And he goes, all right, I'm gonna have you guys look at it for a second. Did you look at it for a second? He goes, I'm gonna put up the next picture, and I want you to look at it, and I want you to tell me what's wrong. And he puts up the next picture. And it's basically like find-the-differences. Nobody, nobody in the freaking room got it. Why? Because we're trained to look at the road. We're looking at the cars. We're looking at the signs. We're looking at the highway. We're looking at everything we know that is important. Then he goes, all right, I'm gonna circle it. It was like a building off to the side of the highway. Why? Nobody cares. It doesn't matter to us. The building's not gonna cause the accident.
So I think what we're seeing here is it does not see the intention. But the other thing is, if you've ever ridden in a Tesla, you will see that there's a little bit of reaction time. The Tesla isn't as quick. I saw the same thing with the Zoox. If you go watch the Zoox video, when I'm in Vegas and I'm riding around in the Zoox, you'll see the exact same thing. It's got like 40 or 50 different options. And it's not like us, where it goes "oh crap" and it commits, right? A lot of times it's having to make decisions, and it did react, but it didn't react fast enough, and it didn't catch the intention. And frankly, it's not gonna have the same level of anticipation that we do for years, because processing is gonna have to get faster, and anticipation and training data's gotta get better.

Ken Lucci:

wow.

John Tyreman:

Um,

Ken Lucci:

So what you're saying to me is every real-life scenario that takes place has to be entered into its database or its memory bank. It needs to look at how it reacted and then adjust itself accordingly. It's gotta repeat that.

James Blain:

Basically. So, in our world, you know, I coach hockey with my son. So I think of it as with my son: I can have him shoot from the middle of the ice, and then I can have him shoot from, you know, the blue line, and then I can have him come up a little closer, and he's gonna be okay shooting in between. Um, the same thing happens with this, except the problem is, unlike a kid that can kind of infer it and you turn him loose, you've gotta have these massive sets of data for it to go through and figure out. And the other thing is, humans do really well in completely one-off, ridiculous situations, right? If I throw you into a weird, ridiculous situation, we do really well at improvisation. That's one of our gifts. If you've ever seen me give a talk, it's improvisation.

Ken Lucci:

Some, some of us do.

James Blain:

But it depends, because even if you suck at improvisation, if I put you in a life-or-death situation, your

Ken Lucci:

it's fight or

James Blain:

your everything, you're gonna react. And sometimes we screw it up. But the reason that we've done so great as a species is we tend to get it right. The problem with the AI is the AI has no adrenaline, it has no fight or flight. It's just processing. And so it's one of the interesting quirks that you see in these scenarios. Now, the other side is to actually answer John's, frankly, crappy question, 'cause I can't answer it. Right?

Ken Lucci:

don't blame John because you can't answer the question.

James Blain:

I, I,

John Tyreman:

about the role of training

James Blain:

All right,

John Tyreman:

and how it's

James Blain:

So here's the role of training. The role of training is that the safety driver's time is limited. The whole point of having self-driving is not to need the safety driver, and the idea that we're gonna have someone that sits there just staring at the road, waiting for his moment to do something, frankly, it makes no sense, because that defeats the whole purpose. That's like saying, hey, I'm gonna find the ability to fly, but I'm only ever gonna fly two feet off the ground, so if something happens, I can jump outta the plane. Like, it only makes sense in this teeter-totter phase

John Tyreman:

I disagree.

Ken Lucci:

yeah,

James Blain:

here. Uh, I've, I've started a great debate. Let's go boys.

Ken Lucci:

No. So, one of the things that occurred at the Chauffeur Driven/NLA Show is I had a great dinner with the guys from Windels Marx, and, uh, while Matt Daus was not there, Pat Russo was there, and we had a very spirited discussion about Waymos in New York. And the thought process on New York to me, reading all of the analyst reports and reading through the Waymo announcements and the Zoox announcements, is that those vehicles are gonna be in New York within three years. They had a,

James Blain:

and they're required to have a safety driver.

Ken Lucci:

But they had a different take. And their take was: because there's going to be some serious blowback on labor, where New York as a city is worried about displacing so many drivers. And I thought there was validity to that, that it's going to extend it. Now, let's not debate that,

James Blain:

Well, there's no debate there. You're absolutely right.

Ken Lucci:

Okay, but what if the plan in New York City was: look, we know you want to launch these AVs, but I think we'll all agree that New York City is extraordinarily challenging. Why don't we go into a period of five years where you have to have a body in there, so that we can make sure that we have that human reaction to the, you know, to the darting out the way you just saw it. I mean, that to me seems to be the best of both worlds, because at that point the vehicle could learn from what the human being does. Now, I know that if Elon was on here, God willing, we would love to have him on, but he would give me the 15 scientific reasons why that makes no sense. But what we just

James Blain:

No, you gotta split the middle, Ken. You gotta split the middle. All right, so here's the thing. So at IATR, not this year but the year before, one of the things that got brought up is

Ken Lucci:

tell the audience who IATR is.

James Blain:

So that's the International Association of Transportation Regulators. That's when all the regulators that regulate transportation come together, and the whole focus that year was on autonomous vehicles and electric vehicles. And everybody was super excited: this is gonna be awesome. And one of the representatives from New York was like, oh God. Oh God. And everybody's like, what's wrong, New York? It's almost like those cartoon memes you see: all right, New York, what's wrong? Uh, well, we've got all of these drivers, and their pension and their retirement relies on, and I'm not gonna call it a Ponzi scheme, it's not a Ponzi scheme, it's a retirement fund, but it relies on new people coming in behind and paying into it. So you have to have new drivers paying into the retirement fund to keep it going for the drivers. No, no, no, the Black Car Fund is something different. Um, but anyways, the concern there was: at the point where we stop having new drivers come in, we displace all of these workers immediately. What happens? Now, a couple things to note here. One: safety drivers make sense in the beginning. And, to not give John crap and give you a direct answer, the way that you get them to do the job effectively is you don't let them take their phone, you don't let them have distractions, and you make the only form of entertainment or input or anything they have actually driving the vehicle. To the point where, if you really wanted an effective safety driver, you would literally put up a black partition and lock him in the driver's seat, and his only option for any form of input or stimulus is to watch the damn road.

Ken Lucci:

And would you agree that that would train the AV?

James Blain:

Uh, yes and no. All right, so if we're looking to train the AV, we don't actually use an AV. If I wanna train an AV, I take a... let me think about it for a second.

Ken Lucci:

where, where's he going, John?

James Blain:

the AV doesn't learn the same way we do. If

Ken Lucci:

What rabbit hole is he going

James Blain:

So if you're looking purely to train an AV, what you do is you take the AV vehicle, you put it in manual mode, and you let the driver drive it. Then afterwards you have that driving graded by someone else, and you tell the AV: this is what the driver did great, we want you to do this; this is what the driver did wrong, we don't want you to do this. To the point that Uber is already doing that. So Uber recently announced that they are looking at employing drivers to train the AV, to basically do that. Because here's the thing: if you're trying to do that as a safety driver, the only thing you can do as a safety driver is either intervene when it screws up or tell it what it got wrong, which is the true function of a safety driver. That's why I said you almost have to put 'em in a completely partitioned box, because I don't want 'em distracted by the radio, I don't want him distracted by the phone, I don't want him distracted by the passenger. Your job is: you are the "oh God." When it's an "oh God" moment, you should take over and be there.

John Tyreman:

But that's just one job. And, and I think that that was kind of my point is we're you're, you're going down like safety driver,

James Blain:

the safety driver.

John Tyreman:

but if we are looking at this through a black car operator's lens, right? It's much more than just a driver. This is

James Blain:

are two different jobs.

John Tyreman:

well, right. And I guess maybe that was my mistake in not clarifying that, but that was kind of where I,

James Blain:

we're, we're talking computers. It's one or zero, you don't get to try again.

Ken Lucci:

No, it wasn't a shitty question just 'cause he couldn't answer it.

James Blain:

I did answer. I said it was a shitty question and then I answered it anyway.

Ken Lucci:

you know when you come off the road you are sassy. Okay. You are sassy. John and I have been doing this without you for the,

James Blain:

this, is why, this is why I don't do it while I'm on the road. Y'all wouldn't be able to handle me when I'm traveling

Ken Lucci:

we've been doing this alone for the past 30 days while you've been out there screwing around at all these association meetings.

James Blain:

By the way, I want all the associations to know that our sponsorships and ad dollars are considered to be screwing around. No, I'm just kidding.

Ken Lucci:

So, so no, what you just hit upon probably, and,

James Blain:

but it, you, you can't do both effectively.

Ken Lucci:

but, but to, but to, you know,

James Blain:

do both effectively.

Ken Lucci:

my realization, because you know how I've been with AVs, and because I've looked at the six cities in China that have 30,000 of these beasts on the road already. You know what Pat, and I forget the other gentleman's name, from Windels Marx pointed out to me is that China's entire business climate and relationship to government is completely different than ours.

James Blain:

Oh, yeah.

Ken Lucci:

completely different than ours. So when he said to me,

James Blain:

By the way, we're not saying anything bad about China. Please,

Ken Lucci:

Not at all. No. Well, geez, we don't want any cease and desist coming from them either. Um, and the gentleman from Windels Marx also used to manage the New York City fleet for many, many years. So what he said to me was the human element of it, which is: New York City will give massive pushback about all of these drivers losing their jobs. So what you were just saying to me seems to be a good solution, where,

James Blain:

but wait,

Ken Lucci:

Wait a minute. You mandate that the drivers have to drive these AV units.

James Blain:

Yeah, so, all right, so a couple things. So, one of my favorite Elon quotes, right? And we give Elon a bunch of crap 'cause he's basically a mad scientist. But one of my favorite Elon quotes is that the fundamental mistake of an engineer is optimizing a thing that shouldn't exist. Okay? And the thing that people have to understand is a safety driver is a step to get where we're going. So if you wanna look at an industry that's done this, it's the airline industry. We used to have, when you would cross the Atlantic, four-engine planes. You had a flight mechanic or flight engineer. You had a pilot in command and a second in command. Over time, what's happened? We've gone down. Now we're happy with two engines, not four. Now we're happy with two pilots, not three. The Air Line Pilots Association is now going through a huge push to try and keep two pilots in the cockpit, because Boeing and all these others have come back and said: guys, we can push a button and the fricking thing flies itself. What do we need two guys in here we're paying for? We

Ken Lucci:

no, no,

James Blain:

shortage and they're trying to go down to one pilot.

Ken Lucci:

no, we need

James Blain:

now, now this is where it gets interesting because it depends. So you look at your Cessnas, you look at all your small private, your small individual aircraft. You got a bunch of guys flying by themselves,

Ken Lucci:

Yeah, but they're in charge of themselves. They're not in charge of 300 souls. That's the issue I have. And are we really gonna take Boeing's word for it? For Christ's sake, they're building planes that fall out of the sky just because some computer says it should go up and the damn thing goes down. Do we wanna talk about the 737 MAX?

James Blain:

Neither James Blain nor PAX Training endorses any of the remarks made by Ken. No, I'm just kidding. Um, so here's the deal. So, interestingly enough, I think we're at a point where the fundamental idea of autonomous driving is that you don't need a driver. The problem that you have is what I talked about a second ago: we as humans trust really fast. If it works for the first 10 minutes, we think it's gonna work for the next 10 minutes. And so the problem is you've got all these people blindly trusting in it that don't understand it. I always go back to when, quote unquote, "autopilot" came out for RVs. It was cruise control, but because it was called autopilot, people would literally go to the back of the RV and wonder why, all of a sudden, they come outta the bathroom in a

Ken Lucci:

and try to make a sandwich.

James Blain:

Yeah. Yeah. So, to wrap this up in a nice little bow, the one thing that I would say is that we need to only have safety drivers until the data proves that the safety driver isn't needed. Having a safety driver for the sake of a pension, or to have someone have a job, or just to pay them to be there... in my mind, it makes more sense to try and find a better role, a better thing that person could be doing that we know AI or technology can't do, rather than having safety drivers longer than we need to just 'cause we have a pension fund. We have to find more effective uses. And not only that, think of coming home from work. How was your day, honey? I stared at a fricking robot driving the car, like I do every day. Well, I scrolled through Twitter on the side, trying to do it in a way that the camera wouldn't notice and tell me that I'm scrolling through Twitter. That's a meaningless

Ken Lucci:

But, to your point, in New York, you know, New York City is gonna push towards: no, we don't want this evolution in technology, because it's going to displace, you know, potentially 13,000 taxi drivers, and another, I don't know how many Ubers there are out there, let's just say another 15,000. When what you are saying is: come up with a plan where you have a period of time where all of the data is being collected, and at the same time come up with a plan to retrain the workforce.

James Blain:

Yeah. You have to, you have to build a bridge to transition, right? I, I'm gonna use the most cliche quote ever, but when one door closes, another door opens, right? Fighting progress is dumb.

Ken Lucci:

I, I, I also don't think you can. I, I just don't think you can.

James Blain:

But the other piece of that is that, you know, a lot of it is we get caught on these problems and we get this analysis paralysis: oh my God, there's 80,000 drivers, there's a million drivers, there's, you know, 10 million drivers, or whatever number it is. We can't solve this, we can't do this. Why? We found something that might be better done by technology. We know it eventually will be done better by technology. How do we find meaningful work for these people, that they can do effectively, a job that only people can do?

Ken Lucci:

I've already got the answer to that. It's to train all these guys to have CDL-P licenses and CDL licenses, because there's 85,000 school bus jobs open. There's 275,000 truck driving positions. So if you like to drive, we're gonna train you on a CDL or

James Blain:

the only problem is they're coming for that next, I would, I

Ken Lucci:

Wait a minute. But that'll be next. But that will be next. But still, I refuse to believe in my lifetime you're gonna have a 56-passenger vehicle with no driver in it. I might be wrong, but to your point, then

James Blain:

Well, I don't know how much longer you got left, Ken.

Ken Lucci:

don't know, maybe five years. I don't know. Probably five. I mean, good ones. Five. I've only got five good ones left in me

James Blain:

'Cause I think we're 10, I think in 20 years. Right. And Ken, if you're not around for that, I'm going to your grave and bitching every day. Right. I

Ken Lucci:

Let me tell you something. If I'm on this frigging podcast 20 years from now, we're gonna both have a problem anyway. No, but to your point, and that was the perspective I hadn't heard before: instead of fighting progress, New York, why don't you interview Uber, Waymo, Zoox. Collectively, you need to interview all these people and come up with a consensus of how these drivers are gonna be retrained to go someplace else. And guess what? You guys are mega rich. You guys are gonna have to pay for this shit.

James Blain:

Yeah. And hey, guess what? If you have to, and I'm not saying to do this, Uber, don't sue me, right? But maybe you go in and you say, hey, for example, in the state of Massachusetts, every single ride share ride, they have to pay 10 cents, and it goes into a fund that's supposed to be there to help people. It'd make total sense to me if they said, hey, if you wanna operate autonomous vehicles in the state of New York, X out of every ride has to go to the pension that helps transition all of the existing drivers out and take care of the ones that are left. And as those people shrink, we'll start scaling that back. We all know that'll never happen; it'll go to building something else. But there are plenty of ways to do that, that allow progress to continue while still creating a soft landing for the affected industries or displaced individuals. And it's just a matter of figuring it out.

Ken Lucci:

And before, this is the last comment, and then John, we're gonna get to this video. I've heard from an extraordinarily reliable source that Google had a meeting with the Waymo team and Google flat out said, we don't give a shit what it costs, just keep going much, much faster. Go as much faster as you possibly can. They've got the money and

James Blain:

they're winning the race.

Ken Lucci:

and this retraining of drivers, to me, is something that they could do that would grease the skids across the board and move us forward into autonomy. I don't see that happening on the robotics or AI side, but let's not go down that road. Anyway, I think that's a solution to what they should be doing in New York. If New York comes back and says, slow down, what are you gonna do with these drivers? You hit the nail on the head: they're gonna have to fund it. All right, so let's go. What's next? Tesla autopilot takes control to avoid two accidents.

James Blain:

All right. Let's

John Tyreman:

How about that headline? All right. Yeah. So this next series of videos will have a little bit of a different focus. The first batch was around humans that were taking over and correcting the autonomous vehicles. These next couple videos are actually the opposite. This is when the autonomous system takes over

James Blain:

the digital nanny.

John Tyreman:

the digital nanny. Exactly. So, uh, I'll, I'll pose this question before we get into the videos and then

James Blain:

be Ken's question.

John Tyreman:

Yeah. Maybe Ken, we can start with you. Do you think that AVs can actually drive down fleet insurance costs?

James Blain:

No.

John Tyreman:

Look.

Ken Lucci:

Well, the first question I have is, John, did you put that music in?

James Blain:

Yeah. Like,

John Tyreman:

that was not, that was not my selection of music. I would've picked something much

Ken Lucci:

Okay. So let me give a narration for

James Blain:

oh boy. Was buzzing.

Ken Lucci:

For those people in the studio audience that are not in front of YouTube: what happened was the Tesla was in the middle, and on the right-hand side there was a pickup truck with something on the back of it, and it swerved into the Tesla's lane. That vehicle swerved in front of the Tesla going left, so the Tesla went further left over to the other lane, but there was a car there, so that vehicle got in the way. So the Tesla did two maneuvers to get out of the way. Okay.

James Blain:

Yeah, this all could have been avoided by hitting the brakes, by the way. As the safety guy, I'm like, if he wouldn't have been absolutely flying down the center lane, and had just hit the brakes, he wouldn't have had to overcorrect. 'Cause I will tell you, Ken, where I thought that went, and the reason I gave that reaction: I totally thought it was gonna spin out, 'cause that thing was just flopping around

Ken Lucci:

Yeah. One thing I did leave out is that whoever was in that Tesla got a little bit of a jarring ride, because it did overcorrect a couple times. Um, okay, so tee up that question again.

John Tyreman:

Yeah, so the big question is: with autonomous vehicles, presumably with the right level of training and inputs, they would grow to the point where, in theory, they would be able to prevent more accidents. So would that be a viable way to drive down fleet insurance costs, which are plaguing the industry?

Ken Lucci:

Yeah. My answer is yes. One thing that the AVs have got going, that the traditional vehicles of the past don't have, is the data collection. The data collection in the autonomous space is front and center in every single platform. Okay? So the answer is yes: every single incident is being learned from, especially on the Waymo side, and with what we're seeing as far as Tesla reporting into the cloud. I also think that the data collection is going to be done in weeks and months instead of over many, many years. You're going to have so much data input that it's going to overwhelmingly make the case that these things are, yes, much safer, or no. And the way it's leaning right now is that an autonomous vehicle is much, much safer than a human being driving a vehicle. That's what it looks like right now. So yeah, I do, purely because of the data collection.

John Tyreman:

All right, well, here's another video, kind of in that same vein of a, uh, a Tesla, an autopilot on the highway. Uh, and this one is from wham bam Tesla cam. Uh, so let's take a look

Ken Lucci:

For those of you who want to continue to follow this, go onto YouTube

James Blain:

Yeah. Yeah. You gotta see these

Ken Lucci:

and find Wham Bam Tesla Cam.

James Blain:

Is that bam or is that.

Ken Lucci:

wham bam. Yeah,

John Tyreman:

can't really tell. But if you're also going to YouTube, also go to at Ground Transportation Podcast and give us a

James Blain:

Yeah, yeah, yeah. Forget those guys, right? We just gave them free advertising. We'll send them a bill for that call-out later. Go to ours instead.

John Tyreman:

All right. Well, we're, uh, you know, courtesy of wham bam Tesla cam, here is, uh, this video of the Tesla autopilot on the highway.

James Blain:

Let's see here.

Ken Lucci:

Oh,

James Blain:

Okay. Elon, what the hell? He had a whole empty lane next to him. Beep, beep, beep, you're gonna get hit. Move, stupid! Are you kidding me? It knows it's coming. It could have just moved. You saw in the rearview cam, the other car moved.

Ken Lucci:

But Okay. But, but the typical human reaction would've been to slam into the car in front of him.

James Blain:

Oh, hell no. If you're PAX-trained, you better have moved. If I find out you're PAX-trained and you slowed down and you didn't look in the rearview and you didn't just move, I'm

Ken Lucci:

Wait a minute. What you're saying is you're not confident that the Tesla is doing the 12 o'clock, three o'clock, six o'clock, nine o'clock

James Blain:

Well, but All right. All right, roll.

John Tyreman:

But if you get

James Blain:

Can we, can we roll this back?

John Tyreman:

wouldn't that car behind them slam into the car ahead, if you would get out of the way? Oh yeah. Let's roll it

James Blain:

Roll it back. Play the tape. Play the tape. All right. Look, you've got a whole empty shoulder to the left. Okay. I don't know if there's a mute on this, but there's a whole empty left lane here. You can see in the rearview, that guy's coming. Get the hell out the way. Like, I don't cuss real often, but in my head all I'm hearing is that Ludacris song from when I was in high school: move, bitch, get out the way.

Get out the way, get out the way.

James Blain:

Like

Ken Lucci:

there's no, there's no need for that kind of talk.

James Blain:

Says Mr. F-Bomb. But, but

Ken Lucci:

Are you, uh, wait a minute. Are you saying go to the shoulder on

James Blain:

But look, you've got an empty right lane. You've got an empty left lane. Yeah. You've got two empty lanes, and the guy behind him. If you watch that again, in that impact, what looks like some kind of Ford Focus or something went around the van that hit him. I'm telling you, in this one, anytime I'm slowing down, I'm looking in front of me and I'm looking behind me. And if I see that guy's brake lights, I'm instantly: oh crap, the guy behind me doesn't see it, here he comes. Tesla, beep beep? You should have moved. Effective drivers would've tried to avoid it. Are you successful? I don't know, but I can tell you he had two empty options, left and right, and neither of them were used. He's psyched because he didn't hit the car in front of him. I'm pissed because the passenger in my backseat is now not gonna make it to the airport on time, because he had an accident.

Ken Lucci:

Well, not to mention the fact the passenger in the back seat's gonna sue you for whiplash, you know? Um,

James Blain:

things, speaking

Ken Lucci:

Yeah. To your point, I think that the beep-beep-beep is not an ample warning for

James Blain:

Yeah. It might as well have just said: you are screwed.

Ken Lucci:

No. It should have sent brace, brace, brace. Rear crash coming.

James Blain:

I feel like, you know, that's the type of thing for me. Look, I get that not every accident is avoidable, but most "unavoidable" accidents are a result of putting yourself in a compromised position, and you can generally avoid putting yourself in a compromised position. And I think he was actually in a good position to make a move right or make a move left. We saw a Tesla practically fishtail out 'cause it moved outta the way earlier. In this case, it just decided to stay straight instead of moving outta the way.

Ken Lucci:

And lemme ask you a question. When you think of the possibilities of an accident occurring, doesn't that scenario almost always come up, where someone brakes in front of you? Okay? My whole point is that's not an autonomous-only thing; that's happened to every one of us.

James Blain:

day

Ken Lucci:

Okay,

James Blain:

We teach the safe zone, right? So for me, my big thing is, if you ride with me on the highway, especially, you know, all of us have our modes where we're not as high alert, but if you ride with me on the highway, I typically am going to purposely ride in what I call open pockets. I'm going to try and find an open pocket where I have some option to move. And I don't care how angry you get, I'm gonna make sure that there is space in front of me, because if I have to stop short, I have to be able to get outta the way. I have to make sure I've got plenty of room riding. Having that space to maneuver is crucial. And in this case, he had it; he just didn't use it.

Ken Lucci:

Good point. Okay, now

John Tyreman:

Now we're

Ken Lucci:

Romania.

John Tyreman:

Romania. Yep. Now we're off to Romania. And this one is a little bit of a different question. So thank you, Ken, for giving us your thoughts on whether you think AVs could actually drive down fleet insurance costs. But for this next question, I would love for both of you to chime in: who is at fault in an AV crash? So what we're gonna

James Blain:

it's Elon's fault.

John Tyreman:

is we are gonna go to Romania and we're gonna check out this video. Alright. You ready?

James Blain:

Yep. Let's go. Oh, and it almost still hit him.

John Tyreman:

Alright.

Ken Lucci:

so let's tee that up. For those of you in

James Blain:

A human driver would've struggled with that.

Ken Lucci:

So those of you in the studio audience who can't see this on YouTube: what happens is the white Tesla is driving by itself in the right-hand lane, but next to that lane there's pedestrians,

James Blain:

It's like those little bollards you see in

Ken Lucci:

Right. The road is separated by some bollards, but there's a sidewalk there, and a pedestrian falls into the roadway, and the Tesla completely, 100%, avoids him by going the only place that it could, which is into the next lane, into oncoming traffic. Okay.

James Blain:

And it still almost hits him, 'cause it hits that car and it bounces. I've been watching this; we've got it looping for us to watch here. When it bounces back, that rear right tire almost gets him. Well, I think he barely gets hit with the bumper, but that tire would've been on him,

Ken Lucci:

but uh, that is a fantastic save.

James Blain:

Yeah, but this is the trolley problem. And for those that aren't familiar with the trolley problem: you've got a trolley going down a track, okay, and you've got two different tracks. You can throw a switch. There's a trolley coming, and you have no way to stop it. You throw one switch, one person dies; you throw the other switch, three people die. Right now it's heading towards the three people. So the question becomes: if I go, oh God, I gotta save those three people, and I throw the switch and one person dies, I am now a direct murderer. I murdered that person. I am personally responsible for their death. If I do nothing, three people die, but that wasn't really my fault. I didn't do anything; I just watched them die. So the question becomes, in the trolley problem: do you throw the switch and kill the one person to save three? Or do you say, hey, I don't wanna kill anybody. If they die, they die. I'm hoping to God Jesus takes the wheel and derails the trolley and everybody's okay.

Ken Lucci:

But I look at this differently. This is a probability calculation. The Tesla looked at it and said: the probability is, if I hit that human, that human being's gonna die. If I go into the next lane and hit a vehicle, chances are lower that the

James Blain:

You're giving it, you're

Ken Lucci:

that,

James Blain:

much credit. You're giving it too much credit.

Ken Lucci:

I think it made the right decision.

James Blain:

It made the right decision based on the sequence of events, in my

Ken Lucci:

Oh, totally. And what choice did it have?

James Blain:

My point is the guy falls into the street first. All right, so I'm gonna go full programmer for a sec. Yeah, it's that jackoff's fault. Why are you falling into the street?

Ken Lucci:

Was he drinking? He can't walk on a sidewalk? What's your problem?

James Blain:

Romania, uh, no, but in all seriousness,

Ken Lucci:

He trips on a pothole and it throws him. He gets thrown into the street.

James Blain:

All right, so let's bring it back for a second. Computers work on a sequence of events. Computers are typically monolithic, so in this case, it sees someone fall into the road and it takes immediate action to not hit that person. There are bollards on the right, and there's an open lane on the left, okay? If you look, there's actually two lanes of oncoming traffic. Arguably, if it had seen the guy falling and not just been fully reactionary, it might've been able to, and a human driver couldn't do this, but it might've been able to calculate how to get into the far left lane and go around the green car on the far side.

Ken Lucci:

Uh, I dunno if there

James Blain:

But this is the same thing you see in movies, where the AI is shooting at 'em and the guy is dodging the bullets. When AI hits war, it's gonna suck, 'cause AI doesn't miss. AI isn't like us; AI would only take the shot if it knows you're dead. Right. So I think what we're seeing here is a sequence of events that led to an outcome. It made the decision to go to the other lane. It then had to figure out what to do with the green car, and the clock ran out. It just ran out of time. I don't think it calculated that. Um, 'cause now we have the I, Robot problem: the kid has a lower chance of surviving than I do, but he's got more life ahead of him. Right. And by the way, the I, Robot movie with Will Smith was based on the Isaac Asimov book that develops the

Ken Lucci:

He's taken us down a

James Blain:

I know. This is what you get when you bring up I, Robot. But the problem is ultimately going to become: these are moral and ethical human problems, not robot problems. That's

Ken Lucci:

in that case, it is a good point. He

James Blain:

And it only made the decision because of the sequence of events. We can't humanize it and say it didn't want to kill the pedestrian. It said, oh shit, there's something in the road, I don't want to hit it. And then it got to the other side. It

Ken Lucci:

I refuse to believe that. I think Tesla knew that it was a human being.

James Blain:

And your Roomba loves you and it's not mapping your house. It's keeping your house clean.

Ken Lucci:

Well, the only way to prove my

James Blain:

My robotic cat isn't spying on me. It really loves me.

Ken Lucci:

The only way to prove my hypothesis is to throw a cardboard box in front of,

James Blain:

I thought you were gonna say a child. Thank God it was cardboard box.

Ken Lucci:

And you know, if I'm looking at that Romanian video clip and I wanna prove my point, I'm duplicating that scenario and throwing a cardboard box out there, and I'm betting you that that Tesla goes right over the cardboard box.

James Blain:

uh, maybe

Ken Lucci:

Well, maybe there was a human, wait a minute, there's a

James Blain:

Elon. Come on the podcast.

John Tyreman:

Yeah, let's test that out. Elon, come on. Show us.

James Blain:

There's a, there's a robotic cat in the cardboard box that we're throwing in front of the robotic taxi to see. And then, and then we'll throw from the other side, we'll throw a real cat in a box and we'll see which box it runs over.

Ken Lucci:

We really need Elon. We need him on the podcast to settle these important questions. Okay, now we've got a robo-taxi company, Cruise.

John Tyreman:

All right, so the question remains who's at fault in an AV accident? So let's play this clip and then we'd love to get your reactions.

Ken Lucci:

Mm-hmm.

James Blain:

Oh, God rest that poor woman.

Ken Lucci:

But, but to be clear, she got hit by another car first.

James Blain:

Yeah, so John, correct me if I'm wrong, this is a pretty dark story. Basically the Cruise vehicle's driving along, this woman gets hit, and it tosses her under the Cruise vehicle. The sensors don't notice it, right? She gets dragged; I don't know what eventually stops it, but she basically gets dragged. And the argument here is that a human would've known something happened. Now that said, we should probably also pull up the clip of the guy driving 90 down the highway with no wheel on his vehicle, dragging the front caliper on the highway, 'cause I call BS. But the whole idea here being that a human would've known better. And like I said, this goes back to what I said a second ago: it is a tragic series of events, and the order of operations matters, because if I remember right, the whole deal is that because she got tossed under the Cruise vehicle, it didn't know she was down there getting dragged.

Ken Lucci:

Well, a couple things. First and foremost, Cruise is out of business, you know, and I firmly believe for good reason. When I look at the dollars that Waymo has put into AV compared to the dollars that GM put into it, I think that, number one, there's been an evolution of autonomous technology. Today, I think if the Cruise had encountered anything that had that kind of an impact, the tech is such that it would've stopped the vehicle. Would she still have died? Sure, she still would've died, because that sequence of events is tough to prevent. Right? She was tossed in front of it.

James Blain:

But these are human problems, right? You hear stories all the time, and this is again very tragic, but this is what we're talking about. You hear stories all the time of someone who gets hit by a subway, or has something massive fall on them, and there's no way to save them. They're pinned in a way that we cannot extract them and save their life. And so I think at a certain point, you know, you have to be careful of the hype versus the buzz, because there's a whole group of people ready to point the finger at AV and be like, that shit doesn't work, you're gonna kill us all. While there's another group that's like, oh no, that's great, we need to do that tomorrow. And again, back to the International Association of Transportation Regulators: at their conference this year, they had two guys back to back. They had the guy that basically was, hey, this is great, this is awesome, let's do it, we believe in them a hundred percent. And then they had a professor that was, hey, you're going too fast, this is too dangerous, don't do this. And what came outta that for me is that, especially in the US, it's left or right, it's red or blue, it's black or white. There's no in-between. And I think in our world there's a middle ground, because there's unfortunately accidents like this all the time. There's drunk drivers all the time. There's things that happen, you know, every X amount of time. And this is a tragedy, but if she got hit and thrown under the wheel, it's not like the AV could have jumped and not hit her. So, to a certain extent, some of these are just there. Like I said, not all events can be avoided. A lot can. Some circumstances are freak accidents

Ken Lucci:

But if she was thrown in front of that vehicle, 25 feet in front of it

James Blain:

and

Ken Lucci:

Correct. So, I hate to say it, but based on that video, I guarantee engineers looked at it and said, okay, wait a minute, now we need to put sensors on the front bumper. And I guarantee you that that senseless death caused some changes in technology. But to my way of thinking, we've gone through a second revolution on AV. The first revolution on AV was done by the automotive companies, and it was a miserable failure. And then Uber tried it, and again, they were just using basic vehicles. Waymo and Zoox started from the ground up. And Waymo and Tesla and

James Blain:

and Apple had a

Ken Lucci:

Zoox,

James Blain:

that they gave up on.

Ken Lucci:

They gave up on it. So I think the evolution has moved forward tremendously since Cruise. The other thing I'll say is we have, without question, a great source of data and a great source of what could happen and has happened, when you look at China having 30,000 robo-taxis on the road 23 hours a day: one hour for cleaning, one hour for a new battery. And the sad part is, I'm just gonna say it, I think China has a different way of looking at things, and there's a completely different relationship between regulatory and business. I mean, the,

James Blain:

Yeah. That's the home of the suicide net at the factory, right? People were jumping off the factories to commit suicide, so instead of fixing the working conditions, they put nets around so that when you would try to kill yourself, you didn't die. Like that. Clearly,

Ken Lucci:

It's a different way of thinking. But they are way ahead on autonomous vehicles, 'cause they don't give a shit if people drop to their death. No, I'm only kidding. I'm only kidding.

James Blain:

Well, but you've brought up a fundamental business problem, right? So I recently listened to a podcast on Dieselgate. So Dieselgate: basically, Volkswagen had this idea that they were gonna create clean diesel vehicles, and the CEO comes down and he says, we're doing this, it's happening, it's gonna work, it's gonna be there. The engineers couldn't make it happen. But the mandate was, I don't care what you're doing, you get it done. So what did they do? They cheated. They used what's called a defeat device. When you were testing the vehicle, it would notice that the steering wheel wasn't moving, it would notice all of these inputs, and it would go into like a limp mode, and in that limp mode it would give out certain numbers. And the only reason they got caught is some college kids got a grant to drive vehicles and test them. They had this jerry-rigged, cardboard-box testing setup, and they couldn't get the Volkswagen to do it. And they didn't believe it. Volkswagen had marketed so hard, they'd sold it so well, that the kids are like, we screwed up, we got something wrong. And the kids kept testing again and again and again. They couldn't get it to actually do it in the real world, and eventually it turned into a giant issue. But I think the other side of this, and it kind of goes to what you're saying about China, is that to a certain extent you have to be aware that there are times that business wants results that will be good for business, even if they're not feasible with the current technology. Right? I think that's, for me, what's scary about this. And we've seen the same thing, believe it or not, in nuclear power and the nuclear regulatory industry.
And that's why you have, you know, a lot of these nuclear meltdowns. If you ever watch the Chernobyl series on HBO Max, it's great, but it will scar you, because it was literally the Soviet Union deciding, hey, we're gonna test this thing and we're gonna run the test, and we don't really care about safety or anything like that, the test has gotta be done. And oh, by the way, we didn't tell anybody about the flaw in the reactor, because that would make us look bad. You get incidents like Three Mile Island: they start looking at Three Mile Island's maintenance record, and it's crap. You know, this is not nuclear power, but this is something that I hear all the time: what if it gets hacked? What if it fails? What if it gets struck by lightning? I mean, these are all things that, like I said earlier, I support the technology, but to a certain extent you've gotta do a safe and effective rollout, and you have to transition into something. You can't just say, hey, I don't feel like driving anymore, and instead of driving three hours I want to be able to work on my laptop, so I'm gonna adopt it tonight.

Ken Lucci:

Well, in the end, we had the CEO of Obi on, who talked about the fact that Obi is a ride share aggregator. Think booking.com for the ride share companies: you can find the best prices on Obi. And one of the statistics she gave was, when they did a survey of, I think it was 10,000 ride share customers, 78% of them said, yeah, as long as the safety was there, they would absolutely prefer an autonomous vehicle. So we're at an exciting inflection point. They've worked in Phoenix, they're working in Austin. We'll see how they're going to go in more congested cities, and cities that have older infrastructures, that are not all flat, that don't have the best weather. So I think it's gonna be critically important, more than ever before: the answer is data. The answer is safety data. One of the things that I really like about the AV evolution is that every single one of these vehicles is a force multiplier for law enforcement, because they have cameras on them. So to

James Blain:

So I think that only makes sense until you get someone like Apple that's like, nah, we're not giving it to you.

Ken Lucci:

Okay, but wait a minute. In the city of Miami, they have an autonomous police car that is going around, and one person can monitor six of those police cars. And at some point AI is going to be looking at this data and saying, wait a minute, this all looks good, except the guy that's running down the street with the gun in his hand.

James Blain:

Well, but the problem is now you start getting into the question that they have in England. Right? And, and so you look at England and they have more CCTVs than anywhere on the planet.

Ken Lucci:

yes, the city of London has

James Blain:

Right, and they were testing technology at one point that literally listened to tone and voice inflection, and anybody that sounded aggressive or raised their voice, it would flag. And this is me swinging the pendulum as hard as I can: at what point does this become Minority Report, and I'm in trouble for something I haven't done yet?

Ken Lucci:

Well, and what if you're inside the AV vehicle and it's listening to you? There's no question. I mean, I never thought I would live to see the day that we had robots and AI and the AV units, even as much adoption as there is now, even though

James Blain:

Oh, I'm still waiting on the Terminator. I expect to see the Terminator in my lifetime. And I don't mean that as a joke; I mean that in a serious sense, right? We are already seeing drone strikes, and now we're seeing it with Ukraine: you are seeing autonomous drone strikes. They program the target, they turn it loose, it determines what the target is and isn't, and if it doesn't have the target it wants, it picks a secondary one. It's not the Terminator in the sense of Skynet, but I genuinely believe that in my lifetime we will see that. We could have the Terminator tomorrow: Elon's got his humanoid robot; give him an AR-15, right, and all of a sudden he's the Terminator. He's

Ken Lucci:

You know, John, we really do miss him when he's not here, but he's taken us quite far afield today. All I wanna do is see autonomous vehicles in New York City. He wants to see RoboCop out there with an AR.

James Blain:

whoa, whoa, whoa, whoa. I I didn't say anything about putting a human brain in there. Okay. Now, you, you?

Ken Lucci:

Well, listen, we've given our audience something to think about. As usual, this is the Ground Transportation Podcast. We are gonna have more episodes with John, our producer, showing us videos. Please subscribe to the Ground Transportation Podcast, and go on YouTube so you can see all of the shenanigans that have gone on in video as well.

James Blain:

Give us a like, give us a subscribe. Let

Ken Lucci:

absolutely. And we've got some exciting stuff coming up in the future. So thanks again for, uh, joining us today and have a great weekend.

Thank you for listening to the Ground Transportation Podcast. If you enjoyed this episode, please remember to subscribe to the show on Apple, Spotify, YouTube, or wherever you get your podcasts. For more information about PAX Training and to contact James, go to paxtraining.com. And for more information about Driving Transactions and to contact Ken, go to drivingtransactions.com. We'll see you next time on the Ground Transportation Podcast.
