
Chris Urmson: Self-Driving Cars at Aurora, Google, CMU, and DARPA | Lex Fridman Podcast #28



link |
00:00:00.000
The following is a conversation with Chris Urmson.
link |
00:00:03.120
He was the CTO of the Google self driving car team,
link |
00:00:06.040
a key engineer and leader behind the Carnegie Mellon
link |
00:00:08.880
University, autonomous vehicle entries
link |
00:00:11.240
in the DARPA Grand Challenges and the winner
link |
00:00:14.120
of the DARPA Urban Challenge.
link |
00:00:16.160
Today, he's the CEO of Aurora Innovation,
link |
00:00:19.480
an autonomous vehicle software company.
link |
00:00:21.360
He started it with Sterling Anderson,
link |
00:00:23.600
who was the former director of Tesla Autopilot
link |
00:00:26.000
and Drew Bagnell, Uber's former autonomy and perception lead.
link |
00:00:30.160
Chris is one of the top roboticists and autonomous vehicle
link |
00:00:33.160
experts in the world and a long time voice of reason
link |
00:00:37.440
in a space that is shrouded in both mystery and hype.
link |
00:00:41.320
He both acknowledges the incredible challenges
link |
00:00:43.600
involved in solving the problem of autonomous driving
link |
00:00:46.560
and is working hard to solve it.
link |
00:00:49.680
This is the Artificial Intelligence Podcast.
link |
00:00:52.440
If you enjoy it, subscribe on YouTube,
link |
00:00:54.720
give it five stars on iTunes, support it on Patreon,
link |
00:00:57.920
or simply connect with me on Twitter
link |
00:00:59.760
at Lex Fridman, spelled F R I D M A N.
link |
00:01:03.280
And now, here's my conversation with Chris Urmson.
link |
00:01:09.160
You were part of both the DARPA Grand Challenge
link |
00:01:11.960
and the DARPA Urban Challenge teams at CMU with Red Whittaker.
link |
00:01:17.040
What technical or philosophical things
link |
00:01:19.720
have you learned from these races?
link |
00:01:22.280
I think the high order bit was that it could be done.
link |
00:01:26.640
I think that was the thing that was incredible about the first
link |
00:01:32.880
of the Grand Challenges, that I remember I was a grad
link |
00:01:36.440
student at Carnegie Mellon, and there
link |
00:01:41.440
was kind of this dichotomy, that it seemed really hard,
link |
00:01:46.320
so that would be cool and interesting.
link |
00:01:48.800
But at the time, we were the only robotics
link |
00:01:51.720
institute around, and so if we went into it and fell
link |
00:01:54.960
on our faces, that would be embarrassing.
link |
00:01:58.320
So I think just having the will to go do it,
link |
00:02:01.160
to try to do this thing that at the time was marked
link |
00:02:03.360
as darn near impossible, and then after a couple of tries,
link |
00:02:07.120
be able to actually make it happen, I think that was really
link |
00:02:11.360
exciting.
link |
00:02:12.360
But at which point did you believe it was possible?
link |
00:02:15.120
Did you, from the very beginning,
link |
00:02:17.000
did you personally, because you're
link |
00:02:18.360
one of the lead engineers, you actually
link |
00:02:20.320
had to do a lot of the work?
link |
00:02:21.800
Yeah, I was the technical director there,
link |
00:02:23.840
and did a lot of the work, along with a bunch
link |
00:02:26.120
of other really good people.
link |
00:02:28.440
Did I believe it could be done?
link |
00:02:29.760
Yeah, of course.
link |
00:02:31.120
Why would you go do something you thought was impossible,
link |
00:02:33.400
completely impossible?
link |
00:02:34.880
We thought it was going to be hard.
link |
00:02:36.280
We didn't know how we were going to be able to do it.
link |
00:02:38.080
We didn't know if we'd be able to do it the first time.
link |
00:02:42.880
Turns out we couldn't.
link |
00:02:46.000
That, yeah, I guess you have to.
link |
00:02:48.400
I think there's a certain benefit to naivete,
link |
00:02:52.920
that if you don't know how hard something really is,
link |
00:02:55.400
you try different things, and it gives you an opportunity
link |
00:02:59.560
that others who are wiser maybe don't have.
link |
00:03:04.080
What were the biggest pain points?
link |
00:03:05.680
Mechanical, sensors, hardware, software, algorithms
link |
00:03:09.360
for mapping, localization, just general perception,
link |
00:03:12.760
control, like hardware, software, first of all.
link |
00:03:15.440
I think that's the joy of this field,
link |
00:03:17.840
is that it's all hard.
link |
00:03:20.040
And that you have to be good at each part of it.
link |
00:03:25.160
So for the Grand Challenges, if I look back at it from today,
link |
00:03:32.280
it should be easy today.
link |
00:03:36.200
That it was a static world.
link |
00:03:38.880
There weren't other actors moving through it.
link |
00:03:40.720
That's what I mean by static.
link |
00:03:42.440
It was out in the desert, so you get really good GPS.
link |
00:03:47.080
So that worked, and we could map it roughly.
link |
00:03:51.320
And so in retrospect now, it's within the realm of things
link |
00:03:55.160
we could do back then.
link |
00:03:57.800
Just actually getting the vehicle,
link |
00:03:59.200
and there's a bunch of engineering work
link |
00:04:00.680
to get the vehicle so that we could control and drive it.
link |
00:04:04.200
That's still a pain today, but it was even more so back then.
link |
00:04:09.520
And then the uncertainty of exactly what they wanted us
link |
00:04:12.920
to do was part of the challenge as well.
link |
00:04:17.080
Right, you didn't actually know the track ahead of time.
link |
00:04:19.360
You knew approximately, but you didn't actually
link |
00:04:21.520
know the route that's going to be taken.
link |
00:04:23.560
That's right, we didn't even really,
link |
00:04:26.560
the way the rules had been described,
link |
00:04:28.640
you had to kind of guess.
link |
00:04:29.840
So if you think back to that challenge,
link |
00:04:33.440
the idea was that the government would give us,
link |
00:04:37.000
the DARPA would give us a set of waypoints
link |
00:04:40.360
and the width of the corridor you had to stay within along the line
link |
00:04:44.240
that went between each of those waypoints.
link |
00:04:46.840
And so the most devious thing they could have done
link |
00:04:49.280
is set a kilometer wide corridor across a field of scrub
link |
00:04:53.720
brush and rocks and said, go figure it out.
link |
00:04:58.520
Fortunately, it turned into basically driving along
link |
00:05:02.200
a set of trails, which is much more relevant to the application
link |
00:05:06.800
they were looking for.
link |
00:05:08.760
But no, it was a hell of a thing back in the day.
link |
00:05:12.080
So the legend, Red, was kind of leading that effort
link |
00:05:16.640
broadly speaking.
link |
00:05:19.120
So you're a leader now.
link |
00:05:22.040
What have you learned from Red about leadership?
link |
00:05:25.000
I think there's a couple of things.
link |
00:05:26.360
One is go and try those really hard things.
link |
00:05:30.880
That's where there is an incredible opportunity.
link |
00:05:34.760
I think the other big one, though,
link |
00:05:36.560
is to see people for who they can be, not who they are.
link |
00:05:41.720
It's one of the deepest lessons I learned from Red,
link |
00:05:46.080
was that he would look at undergraduates or graduate
link |
00:05:51.000
students and empower them to be leaders,
link |
00:05:56.120
to have responsibility, to do great things,
link |
00:06:01.400
that I think another person might look at them and think,
link |
00:06:04.760
oh, well, that's just an undergraduate student.
link |
00:06:06.600
What could they know?
link |
00:06:08.720
And so I think that trust, but verify, have confidence
link |
00:06:13.520
in what people can become, I think,
link |
00:06:14.880
is a really powerful thing.
link |
00:06:16.680
So through that, let's just fast forward through the history.
link |
00:06:20.480
Can you maybe talk through the technical evolution
link |
00:06:24.200
of autonomous vehicle systems from the first two
link |
00:06:27.480
Grand Challenges to the Urban Challenge to today?
link |
00:06:30.920
Are there major shifts in your mind,
link |
00:06:33.600
or is it the same kind of technology just made more robust?
link |
00:06:37.240
I think there's been some big, big steps.
link |
00:06:40.880
So for the Grand Challenge, the real technology
link |
00:06:46.600
that unlocked that was HD mapping.
link |
00:06:51.400
Prior to that, a lot of the off road robotics work
link |
00:06:55.200
had been done without any real prior model of what
link |
00:06:58.920
the vehicle was going to encounter.
link |
00:07:01.400
And so that innovation, that the fact
link |
00:07:03.960
that we could get decimeter resolution models,
link |
00:07:11.320
was really a big deal.
link |
00:07:13.560
And that allowed us to kind of bound
link |
00:07:17.480
the complexity of the driving problem the vehicle had
link |
00:07:19.680
and allowed it to operate at speed,
link |
00:07:21.040
because we could assume things about the environment
link |
00:07:23.800
that it was going to encounter.
link |
00:07:26.400
So that was one of the big steps there.
link |
00:07:31.320
For the Urban Challenge, one of the big technological
link |
00:07:38.520
innovations there was the multi beam LiDAR.
link |
00:07:41.960
And being able to generate high resolution,
link |
00:07:45.720
mid to long range 3D models of the world,
link |
00:07:48.680
and use that for understanding the world around the vehicle.
link |
00:07:54.120
And that was really kind of a game changing technology.
link |
00:07:59.120
And parallel with that, we saw a bunch
link |
00:08:02.880
of other technologies that had been kind of converging
link |
00:08:06.640
finally have their day in the sun.
link |
00:08:08.960
So Bayesian estimation had been, SLAM had been a big field
link |
00:08:16.800
in robotics.
link |
00:08:18.600
You would go to a conference a couple of years
link |
00:08:20.800
before that, and every paper would effectively
link |
00:08:23.800
have SLAM somewhere in it.
link |
00:08:25.640
And so seeing that Bayesian estimation techniques
link |
00:08:31.560
play out on a very visible stage,
link |
00:08:34.040
I thought that was pretty exciting to see.
link |
00:08:38.680
And mostly SLAM was done based on LiDAR at that time?
link |
00:08:41.760
Well, yeah.
link |
00:08:42.400
And in fact, we weren't really doing SLAM per se in real time,
link |
00:08:46.720
because we had a model ahead of time.
link |
00:08:48.120
We had a roadmap, but we were doing localization.
link |
00:08:51.560
And we were using the LiDAR or the cameras,
link |
00:08:54.080
depending on who exactly was doing it,
link |
00:08:55.920
to localize to a model of the world.
link |
00:08:58.080
And I thought that was a big step
link |
00:09:00.720
from kind of naively trusting GPS/INS before that.
link |
00:09:07.160
And again, lots of work had been going on in this field.
link |
00:09:10.400
Certainly, this was not doing anything particularly
link |
00:09:14.080
innovative in SLAM or in localization,
link |
00:09:17.400
but it was seeing that technology necessary
link |
00:09:20.160
in a real application on a big stage.
link |
00:09:21.800
I thought it was very cool.
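
To make the localization idea concrete, here is a minimal sketch, assuming lidar-detected landmarks matched against a prior map (the function and its setup are illustrative, not the team's actual pipeline): a few Gauss-Newton steps refine a rough GPS/INS pose so the observed landmarks line up with where the map says they should be.

```python
import numpy as np

def localize(prior_pose, map_landmarks, observed, iters=10):
    # prior_pose: rough (x, y, heading) from GPS/INS
    # map_landmarks: Nx2 landmark positions in the map frame
    # observed: Nx2 positions of the same landmarks in the vehicle
    # frame (e.g., from a lidar detector), matched by index
    x, y, th = prior_pose
    for _ in range(iters):
        c, s = np.cos(th), np.sin(th)
        R = np.array([[c, -s], [s, c]])
        pred = observed @ R.T + np.array([x, y])   # landmarks where this pose predicts them
        residual = (map_landmarks - pred).ravel()  # mismatch we want to drive to zero
        J = np.zeros((2 * len(observed), 3))       # d(pred)/d(x, y, heading)
        J[0::2, 0] = 1.0
        J[1::2, 1] = 1.0
        dR = np.array([[-s, -c], [c, -s]])         # derivative of R w.r.t. heading
        dth = observed @ dR.T
        J[0::2, 2] = dth[:, 0]
        J[1::2, 2] = dth[:, 1]
        delta = np.linalg.lstsq(J, residual, rcond=None)[0]
        x, y, th = x + delta[0], y + delta[1], th + delta[2]
    return x, y, th
```
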
link |
00:09:23.080
So for the Urban Challenge, those maps were already
link |
00:09:25.600
constructed offline in general?
link |
00:09:28.120
OK.
link |
00:09:28.600
And did people do that individually?
link |
00:09:30.920
Did individual teams do it individually?
link |
00:09:33.600
So they had their own different approaches there?
link |
00:09:36.440
Or did everybody kind of share that information,
link |
00:09:41.720
at least intuitively?
link |
00:09:42.880
So DARPA gave all the teams a model of the world, a map.
link |
00:09:49.560
And then one of the things that we had to figure out back then
link |
00:09:53.720
was, and it's still one of these things that trips people up
link |
00:09:56.720
today, is actually the coordinate system.
link |
00:10:00.240
So you get a latitude, longitude.
link |
00:10:03.000
And to so many decimal places, you
link |
00:10:05.120
don't really care about kind of the ellipsoid of the Earth
link |
00:10:07.800
that's being used.
link |
00:10:09.520
But when you want to get to 10 centimeter or centimeter
link |
00:10:12.720
resolution, you care whether the coordinate system is NAD83
link |
00:10:18.480
or WGS84. These are different ways
link |
00:10:22.720
to describe both the kind of nonsphericalness of the Earth,
link |
00:10:26.720
but also, and I can't remember which one,
link |
00:10:31.560
the tectonic shifts that are happening
link |
00:10:33.560
and how to transform the global datum as a function of that.
link |
00:10:36.920
So getting a map and then actually matching it
link |
00:10:40.400
to reality to centimeter resolution,
link |
00:10:41.880
that was kind of interesting and fun back then.
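
A quick worked example of the decimal-places point (a back-of-envelope sketch; the meter-level NAD83/WGS84 offset is the commonly cited ballpark, not a figure from the conversation):

```python
import math

# Meters of ground distance per decimal place of latitude.
R = 6_378_137.0                           # WGS84 equatorial radius, meters
meters_per_deg = math.pi * R / 180.0      # ~111 km per degree of latitude
for places in range(1, 9):
    print(f"{places} decimal places -> {meters_per_deg * 10**-places:12.4f} m")
# The 7th decimal place is roughly a centimeter. NAD83 and WGS84 differ
# by a meter or two in North America (and NAD83 moves with the plate),
# so at decimeter mapping resolution the choice of datum is not optional.
```
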
link |
00:10:44.000
So how much work was the perception doing there?
link |
00:10:46.800
So how much were you relying on localization based on maps
link |
00:10:52.440
without using perception to register to the maps?
link |
00:10:55.720
And I guess the question is how advanced
link |
00:10:57.960
was perception at that point?
link |
00:10:59.720
It's certainly behind where we are today.
link |
00:11:01.920
It's been more than a decade since the Urban Challenge.
link |
00:11:05.800
But the core of it was there, that we were tracking vehicles.
link |
00:11:13.080
We had to do that at 100 plus meter range
link |
00:11:15.600
because we had to merge with other traffic.
link |
00:11:18.280
We were using, again, Bayesian estimates
link |
00:11:21.200
for the state of these vehicles.
link |
00:11:23.800
We had to deal with a bunch of the problems
link |
00:11:25.560
that you think of today of predicting
link |
00:11:28.240
where that vehicle is going to be a few seconds into the future.
link |
00:11:31.040
We had to deal with the fact that there
link |
00:11:33.680
were multiple hypotheses for that because a vehicle
link |
00:11:36.000
at an intersection might be going right
link |
00:11:37.640
or it might be going straight or it might be making a left turn.
link |
00:11:41.440
And we had to deal with the challenge of the fact
link |
00:11:44.080
that our behavior was going to impact the behavior
link |
00:11:47.520
of that other operator.
link |
00:11:48.880
And we did a lot of that in relatively naive ways.
link |
00:11:53.400
But it kind of worked.
link |
00:11:54.720
Still had to have some kind of assumption.
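
A toy sketch of the multiple-hypothesis idea, assuming a made-up maneuver set and heading-rate models (placeholders, not anyone's production system): maintain a belief over what the tracked car is doing and apply a Bayes update as each measurement arrives.

```python
import numpy as np

HEADING_RATE = {"left": +0.3, "straight": 0.0, "right": -0.3}  # rad/s, assumed models

def update_belief(belief, observed_rate, sigma=0.1):
    """Bayes update: weight each maneuver hypothesis by how well its
    predicted heading rate explains the measurement, then normalize."""
    post = {}
    for maneuver, prior in belief.items():
        err = observed_rate - HEADING_RATE[maneuver]
        post[maneuver] = prior * np.exp(-0.5 * (err / sigma) ** 2)
    total = sum(post.values())
    return {m: p / total for m, p in post.items()}

belief = {m: 1 / 3 for m in HEADING_RATE}   # uniform prior at the intersection
for rate in [0.05, 0.20, 0.28]:             # the car drifts into a left turn
    belief = update_belief(belief, rate)
print(belief)                               # probability mass concentrates on "left"
```
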
link |
00:11:57.000
And so where does that 10 years later, where does that take us
link |
00:12:00.640
today from that artificial city construction
link |
00:12:04.200
to real cities to the urban environment?
link |
00:12:06.920
Yeah, I think the biggest thing is that the actors are truly
link |
00:12:13.600
unpredictable, that most of the time, the drivers on the road,
link |
00:12:18.680
the other road users are out there behaving well.
link |
00:12:24.000
But every once in a while, they're not.
link |
00:12:27.040
The variety of other vehicles is, you have all of them.
link |
00:12:33.320
In terms of behavior, or terms of perception, or both?
link |
00:12:35.760
Both.
link |
00:12:38.320
Back then, we didn't have to deal with cyclists.
link |
00:12:40.480
We didn't have to deal with pedestrians.
link |
00:12:42.800
Didn't have to deal with traffic lights.
link |
00:12:46.240
The scale over which you have to operate is now
link |
00:12:49.360
so much larger than the air base that we were thinking about back
link |
00:12:52.240
then.
link |
00:12:52.720
So, easy question.
link |
00:12:56.280
What do you think is the hardest part about driving?
link |
00:12:59.720
Easy question.
link |
00:13:00.480
Yeah.
link |
00:13:01.320
No, I'm joking.
link |
00:13:02.600
I'm sure nothing really jumps out at you as one thing.
link |
00:13:07.440
But in the jump from the urban challenge to the real world,
link |
00:13:12.920
is there something in particular that you foresee
link |
00:13:16.200
as a very serious, difficult challenge?
link |
00:13:18.480
I think the most fundamental difference
link |
00:13:21.120
is that we're doing it for real, that in that environment,
link |
00:13:28.960
it was both a limited complexity environment,
link |
00:13:31.840
because certain actors weren't there,
link |
00:13:33.240
because the roads were maintained.
link |
00:13:35.360
There were barriers keeping people separate from robots
link |
00:13:38.720
at the time.
link |
00:13:40.880
And it only had to work for 60 miles, which looking at it
link |
00:13:44.480
from 2006, seemed like a long way.
link |
00:13:48.960
Looking at it from now, we want things
link |
00:13:52.720
that will go and drive for half a million miles.
link |
00:13:57.200
And it's just a different game.
link |
00:14:00.960
So how important, you said lidar came into the game early on,
link |
00:14:06.080
and it's really the primary driver of autonomous vehicles
link |
00:14:08.880
today as a sensor.
link |
00:14:10.240
So how important is the role of lidar in the sensor suite
link |
00:14:12.880
in the near term?
link |
00:14:14.760
So I think it's essential.
link |
00:14:18.680
But I also believe that cameras are essential,
link |
00:14:20.520
and I believe the radar is essential.
link |
00:14:22.160
I think that you really need to use the composition of data
link |
00:14:27.400
from these different sensors if you
link |
00:14:28.920
want the thing to really be robust.
link |
00:14:32.600
The question I want to ask, let's see if we can untangle it,
link |
00:14:35.440
is what are your thoughts on the Elon Musk provocative statement
link |
00:14:40.240
that lidar is a crutch, that is a kind of, I guess,
link |
00:14:45.840
growing pains, and that much of the perception
link |
00:14:49.600
task can be done with cameras?
link |
00:14:52.160
So I think it is undeniable that people walk around
link |
00:14:56.920
without lasers in their foreheads,
link |
00:14:59.680
and they can get into vehicles and drive them.
link |
00:15:01.840
And so there's an existence proof
link |
00:15:05.560
that you can drive using passive vision.
link |
00:15:10.840
No doubt, can't argue with that.
link |
00:15:12.680
In terms of sensors, yeah.
link |
00:15:14.320
So there's proof.
link |
00:15:14.800
Yes, in terms of sensors, right?
link |
00:15:15.960
So there's an example that we all,
link |
00:15:18.720
many of us, go do every day.
link |
00:15:23.280
In terms of lidar being a crutch, sure.
link |
00:15:28.200
But in the same way that the combustion engine
link |
00:15:33.080
was a crutch on the path to an electric vehicle,
link |
00:15:35.240
in the same way that any technology ultimately gets
link |
00:15:40.840
replaced by some superior technology in the future.
link |
00:15:44.640
And really, the way that I look at this
link |
00:15:47.720
is that the way we get around on the ground, the way
link |
00:15:51.720
that we use transportation is broken.
link |
00:15:55.280
And that we have this, I think the number I saw this morning,
link |
00:15:59.720
37,000 Americans killed last year on our roads.
link |
00:16:04.040
And that's just not acceptable.
link |
00:16:05.360
And so any technology that we can bring to bear
link |
00:16:09.440
that accelerates this technology, self driving technology,
link |
00:16:12.840
coming to market and saving lives,
link |
00:16:15.720
is technology we should be using.
link |
00:16:18.280
And it feels just arbitrary to say, well, I'm not
link |
00:16:24.040
OK with using lasers, because that's whatever.
link |
00:16:27.800
But I am OK with using an 8 megapixel camera
link |
00:16:30.760
or a 16 megapixel camera.
link |
00:16:32.880
These are just bits of technology,
link |
00:16:34.640
and we should be taking the best technology from the tool
link |
00:16:36.880
bin that allows us to go and solve a problem.
link |
00:16:41.600
So I often talk, and obviously you do as well,
link |
00:16:45.160
to automotive companies.
link |
00:16:48.320
And if there's one word that comes up more often than anything,
link |
00:16:51.880
it's cost and drive costs down.
link |
00:16:55.320
So while it's true that it's a tragic number, the 37,000,
link |
00:17:01.440
the question is, and I'm not the one asking this question,
link |
00:17:04.880
because I hate this question, but we
link |
00:17:07.160
want to find the cheapest sensor suite that
link |
00:17:11.680
creates a safe vehicle.
link |
00:17:13.400
So in that uncomfortable trade off,
link |
00:17:18.240
do you foresee lidar coming down in cost in the future?
link |
00:17:23.680
Or do you see a day where level 4 autonomy is possible
link |
00:17:28.000
without lidar?
link |
00:17:29.880
I see both of those, but it's really a matter of time.
link |
00:17:32.880
And I think, really, maybe I would
link |
00:17:35.080
talk to the question you asked about the cheapest sensor.
link |
00:17:38.760
I don't think that's actually what you want.
link |
00:17:40.440
What you want is a sensor suite that is economically viable.
link |
00:17:45.720
And then after that, everything is about margin
link |
00:17:49.480
and driving cost out of the system.
link |
00:17:52.320
What you also want is a sensor suite that works.
link |
00:17:55.400
And so it's great to tell a story about how it would be better
link |
00:18:01.280
to have a self driving system with a $50 sensor instead
link |
00:18:04.560
of a $500 sensor.
link |
00:18:08.720
But if the $500 sensor makes it work and the $50 sensor
link |
00:18:11.560
doesn't work, who cares?
link |
00:18:15.680
So long as you can actually have an economic opportunity there.
link |
00:18:21.680
And the economic opportunity is important,
link |
00:18:23.760
because that's how you actually have a sustainable business.
link |
00:18:27.800
And that's how you can actually see this come to scale
link |
00:18:30.440
and be out in the world.
link |
00:18:32.520
And so when I look at lidar, I see
link |
00:18:36.400
a technology that has no underlying fundamental expense
link |
00:18:41.200
to it.
link |
00:18:43.240
It's going to be more expensive than an imager,
link |
00:18:46.120
because CMOS processes, or fab processes,
link |
00:18:51.400
are dramatically more scalable than mechanical processes.
link |
00:18:56.200
But we still should be able to drive cost
link |
00:18:58.160
out substantially on that side.
link |
00:19:00.440
And then I also do think that with the right business model,
link |
00:19:05.880
you can absorb more, certainly more cost
link |
00:19:08.440
on the bill of materials.
link |
00:19:09.480
Yeah, if the sensor suite works, extra value is provided.
link |
00:19:12.600
Thereby, you don't need to drive cost down to zero.
link |
00:19:15.480
It's basic economics.
link |
00:19:17.120
You've talked about your intuition
link |
00:19:18.840
that level two autonomy is problematic because
link |
00:19:22.720
of the human factors of vigilance decrement, complacency,
link |
00:19:27.280
overtrust, and so on, just us being human.
link |
00:19:29.600
When we overtrust the system, we start partaking even more
link |
00:21:33.000
in secondary activities like smartphone use
link |
00:21:36.480
and so on.
link |
00:19:38.720
Have your views evolved on this point in either direction?
link |
00:19:42.960
Can you speak to it?
link |
00:19:44.760
So I want to be really careful, because sometimes this
link |
00:19:48.240
gets twisted in a way that I certainly didn't intend.
link |
00:19:53.000
So active safety systems are a really important technology
link |
00:19:59.360
that we should be pursuing and integrating into vehicles.
link |
00:20:03.400
And there's an opportunity in the near term
link |
00:20:05.680
to reduce accidents, reduce fatalities,
link |
00:20:09.400
and we should be pushing on that.
link |
00:20:13.400
Level two systems are systems where
link |
00:20:17.280
the vehicle is controlling two axes,
link |
00:20:19.480
so braking slash throttle, and steering.
link |
00:20:24.800
And I think there are variants of level two systems that
link |
00:20:27.200
are supporting the driver that absolutely we
link |
00:20:30.200
should encourage to be out there.
link |
00:20:32.560
Where I think there's a real challenge is in the human factors
link |
00:20:37.920
part around this and the misconception
link |
00:20:40.800
from the public around the capability set that that enables
link |
00:20:44.920
and the trust that they should have in it.
link |
00:20:48.000
And that is where I'm actually incrementally more
link |
00:20:53.880
concerned around level three systems
link |
00:20:55.800
and how exactly a level two system is marketed and delivered
link |
00:20:59.960
and how much effort people have put into those human factors.
link |
00:21:03.240
So I still believe several things around this.
link |
00:21:07.000
One is people will over trust the technology.
link |
00:21:10.760
We've seen over the last few weeks
link |
00:21:12.720
a spate of people sleeping in their Tesla.
link |
00:21:16.280
I watched an episode last night of Trevor Noah talking
link |
00:21:23.240
about this, and this is a smart guy
link |
00:21:27.160
who has a lot of resources at his disposal describing
link |
00:21:31.040
a Tesla as a self driving car.
link |
00:21:32.880
And asking, why shouldn't people be sleeping in their Tesla?
link |
00:21:35.640
It's like, well, because it's not a self driving car
link |
00:21:38.800
and it is not intended to be.
link |
00:21:41.120
And these people will almost certainly die at some point
link |
00:21:48.400
or hurt other people.
link |
00:21:50.400
And so we need to really be thoughtful about how
link |
00:21:52.640
that technology is described and brought to market.
link |
00:21:56.280
I also think that because of the economic issue,
link |
00:22:00.760
economic challenges we were just talking about,
link |
00:22:03.320
that these level two driver
link |
00:22:06.960
systems, that technology path will
link |
00:22:08.400
diverge from the technology path that we
link |
00:22:11.560
need to be on to actually deliver truly self driving
link |
00:22:15.800
vehicles, ones where you can get in it and sleep
link |
00:22:19.120
and have the equivalent or better safety
link |
00:22:21.480
than a human driver behind the wheel.
link |
00:22:24.600
Because, again, the economics are very different
link |
00:22:28.440
in those two worlds.
link |
00:22:29.800
And so that leads to divergent technology.
link |
00:22:32.720
So you just don't see the economics of gradually
link |
00:22:36.920
increasing from level two and doing so quickly enough
link |
00:22:41.520
to where it doesn't create critical safety concerns.
link |
00:22:44.400
You believe that it needs to diverge at this point
link |
00:22:48.600
into different, basically different routes.
link |
00:22:50.600
And really that comes back to what
link |
00:22:53.760
are those L2 and L1 systems doing?
link |
00:22:56.840
And they are driver assistance functions
link |
00:22:59.800
where the people that are marketing that responsibly
link |
00:23:04.360
are being very clear and putting human factors in place
link |
00:23:07.960
such that the driver is actually responsible for the vehicle
link |
00:23:12.400
and that the technology is there to support the driver.
link |
00:23:15.200
And the safety cases that are built around those
link |
00:23:19.880
are dependent on that driver attention and attentiveness.
link |
00:23:24.320
And at that point, you can kind of give up, to some degree,
link |
00:23:30.360
for economic reasons, you can give up on, say, false negatives.
link |
00:23:34.280
And so the way to think about this
link |
00:23:36.200
is, for a forward collision mitigation braking system,
link |
00:23:40.760
if half the times the driver missed a vehicle in front of it,
link |
00:23:45.080
it hit the brakes and brought the vehicle to a stop,
link |
00:23:47.640
that would be an incredible, incredible advance
link |
00:23:51.200
in safety on our roads, right?
link |
00:23:52.960
That would be equivalent to seatbelts.
link |
00:23:55.080
But it would mean that if that vehicle wasn't being monitored,
link |
00:23:57.560
it would hit one out of two cars.
link |
00:24:00.560
And so economically, that's a perfectly good solution
link |
00:24:05.080
for a driver assistance system.
link |
00:24:06.200
What you should do at that point,
link |
00:24:07.360
if you can get it to work 50% of the time,
link |
00:24:09.200
is drive the cost out of that so you can get it
link |
00:24:11.040
on as many vehicles as possible.
link |
00:24:13.320
But driving the cost out of it doesn't drive up performance
link |
00:24:16.920
on the false negative case.
link |
00:24:18.840
And so you'll continue to not have a technology
link |
00:24:21.480
that could really be available for a self driven vehicle.
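
A back-of-envelope sketch of that asymmetry, with all rates assumed purely for illustration: a 50% catch rate halves the crashes a driver would have had, but the same miss rate is disqualifying once nobody is watching.

```python
# All rates below are assumptions for illustration, not industry data.
human_crash_rate = 1 / 500_000     # crashes per mile with an attentive driver (assumed)
recall = 0.5                       # fraction of imminent collisions the system catches

# As driver assistance: the driver handles events as before, and the
# system additionally catches half of what the driver misses.
assisted = human_crash_rate * (1 - recall)
print(f"assisted driving: {assisted:.1e} crashes/mile (half the human rate)")

# Unmonitored: the system alone faces every braking-critical event.
# Assume one such event per 1,000 miles; missing half of them means
# a crash every 2,000 miles, which is no self-driving car at all.
events_per_mile = 1 / 1_000
unmonitored = events_per_mile * (1 - recall)
print(f"unmonitored: {unmonitored:.1e} crashes/mile")
```
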
link |
00:24:25.720
So clearly the communication,
link |
00:24:28.480
and this probably applies to level four vehicles as well,
link |
00:24:31.640
the marketing and the communication
link |
00:24:34.440
of what the technology is actually capable of,
link |
00:24:37.080
how hard it is, how easy it is,
link |
00:24:38.440
all that kind of stuff is highly problematic.
link |
00:24:41.040
So say everything was perfectly communicated to everybody in the world
link |
00:24:45.680
and they were made to be completely aware
link |
00:24:48.400
of every single technology out there,
link |
00:24:50.040
what it's able to do.
link |
00:24:52.880
What's your intuition?
link |
00:24:54.160
And now we're maybe getting into philosophical ground.
link |
00:24:56.920
Is it possible to have a level two vehicle
link |
00:25:00.040
where we don't overtrust it?
link |
00:25:04.720
I don't think so.
link |
00:25:05.840
If people truly understood the risks and internalized it,
link |
00:25:11.200
then sure you could do that safely,
link |
00:25:14.320
but that's a world that doesn't exist.
link |
00:25:16.200
The people are going to,
link |
00:25:19.440
if the facts are put in front of them,
link |
00:25:20.800
they're gonna then combine that with their experience.
link |
00:25:24.480
And let's say they're using an L2 system
link |
00:25:28.400
and they go up and down the one on one every day
link |
00:25:31.040
and they do that for a month
link |
00:25:32.800
and it just worked every day for a month.
link |
00:25:36.320
Like that's pretty compelling.
link |
00:25:37.400
At that point, just even if you know the statistics,
link |
00:25:41.880
you're like, well, I don't know,
link |
00:25:43.520
maybe there's something a little funny about those.
link |
00:25:44.840
Maybe they're driving in difficult places.
link |
00:25:47.000
Like I've seen it with my own eyes, it works.
link |
00:25:49.960
And the problem is that that sample size that they have,
link |
00:25:52.480
so it's 30 miles up and down,
link |
00:25:54.000
so 60 miles times 30 days, so 1,800 miles.
link |
00:26:01.720
That's a drop in the bucket compared to the,
link |
00:26:05.240
what, 85 million miles between fatalities.
link |
00:26:07.640
And so they don't really have a true estimate
link |
00:26:11.400
based on their personal experience of the real risks,
link |
00:26:14.440
but they're gonna trust it anyway,
link |
00:26:15.640
because it's hard not to, it worked for a month.
link |
00:26:17.720
What's gonna change?
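
Putting rough numbers on why a clean month proves so little (the 100x multiplier is an arbitrary assumption for illustration): even a dramatically worse-than-human system would most likely produce 1,800 incident-free miles.

```python
import math

miles_driven = 60 * 30                      # 60 miles a day for a month
human_fatality_rate = 1 / 85_000_000        # per mile, the figure quoted above
system_rate = 100 * human_fatality_rate     # hypothetical system, 100x worse

# Poisson probability of observing zero events in those miles
p_no_event = math.exp(-system_rate * miles_driven)
print(f"P(no event in {miles_driven} miles) = {p_no_event:.4f}")  # ~0.998
```
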
link |
00:26:18.640
So even if you start a perfect understanding of the system,
link |
00:26:21.600
your own experience will make it drift.
link |
00:26:24.160
I mean, that's a big concern.
link |
00:26:25.920
Over a year, over two years even, it doesn't have to be months.
link |
00:26:29.480
And I think that as this technology moves from,
link |
00:26:35.440
what I would say is kind of the more technology savvy
link |
00:26:37.800
ownership group to the mass market,
link |
00:26:41.480
you may be able to have some of those folks
link |
00:26:44.640
who are really familiar with technology,
link |
00:26:46.320
they may be able to internalize it better.
link |
00:26:48.880
And your kind of immunization
link |
00:26:50.840
against this kind of false risk assessment
link |
00:26:53.400
might last longer, but as folks who aren't as savvy
link |
00:26:56.960
about that read the material
link |
00:27:00.200
and they compare that to their personal experience,
link |
00:27:02.200
I think there it's gonna move more quickly.
link |
00:27:08.200
So your work, the program that you've created at Google
link |
00:27:11.320
and now at Aurora is focused more on the second path
link |
00:27:16.640
of creating full autonomy.
link |
00:27:18.520
So it's such a fascinating,
link |
00:27:21.800
I think it's one of the most interesting AI problems
link |
00:27:24.600
of the century, right?
link |
00:27:25.640
It's a, I just talked to a lot of people,
link |
00:27:28.320
just regular people, I don't know, my mom
link |
00:27:30.400
about autonomous vehicles and you begin to grapple
link |
00:27:33.840
with ideas of giving control of your life over to a machine.
link |
00:27:38.080
It's philosophically interesting,
link |
00:27:40.040
it's practically interesting.
link |
00:27:41.760
So let's talk about safety.
link |
00:27:43.720
How do you think, we demonstrate,
link |
00:27:46.240
you've spoken about metrics in the past,
link |
00:27:47.880
how do you think we demonstrate to the world
link |
00:27:51.880
that an autonomous vehicle, an Aurora system is safe?
link |
00:27:56.160
This is one where it's difficult
link |
00:27:57.320
because there isn't a sound bite answer.
link |
00:27:59.280
That we have to show a combination of work
link |
00:28:05.960
that was done diligently and thoughtfully.
link |
00:28:08.360
And this is where something like a functional safety process
link |
00:28:10.840
as part of that is like, here's the way we did the work.
link |
00:28:15.320
That means that we were very thorough.
link |
00:28:17.200
So, if you believe what we said about
link |
00:18:20.560
the way we did it,
link |
00:28:21.480
then you can have some confidence that we were thorough
link |
00:28:23.440
in the engineering work we put into the system.
link |
00:28:27.000
And then on top of that, to kind of demonstrate
link |
00:28:30.160
that we weren't just thorough,
link |
00:28:32.000
we were actually good at what we did.
link |
00:28:35.320
There'll be a kind of a collection of evidence
link |
00:28:38.240
in terms of demonstrating that the capabilities
link |
00:28:40.480
work the way we thought they did, statistically
link |
00:28:43.960
and to whatever degree we can demonstrate that
link |
00:28:48.200
both in some combination of simulation,
link |
00:28:50.320
some combination of unit testing and decomposition testing,
link |
00:28:54.720
and then some part of it will be on road data.
link |
00:28:58.200
And I think the way we'll ultimately convey this
link |
00:29:03.320
to the public is there'll be clearly some conversation
link |
00:29:06.800
with the public about it,
link |
00:29:08.240
but we'll kind of invoke the trusted nodes
link |
00:29:12.080
and that we'll spend more time being able to go
link |
00:29:14.360
into more depth with folks like NHTSA
link |
00:29:17.280
and other federal and state regulatory bodies
link |
00:29:19.760
and kind of given that they are operating
link |
00:29:22.600
in the public interest and they're trusted
link |
00:29:26.240
that if we can show enough work to them
link |
00:29:28.680
that they're convinced,
link |
00:29:30.040
then I think we're in a pretty good place.
link |
00:29:33.840
That means that you work with people
link |
00:29:35.040
that are essentially experts at safety
link |
00:29:36.960
to try to discuss and show,
link |
00:29:39.040
do you think the answer is probably no,
link |
00:29:41.800
but just in case, do you think there exists a metric?
link |
00:29:44.360
So currently people have been using
link |
00:29:46.360
a number of disengagement.
link |
00:29:48.200
And it quickly turns into a marketing scheme
link |
00:29:50.160
where you sort of alter the experiments you run.
link |
00:29:54.320
I think you've spoken that you don't like it.
link |
00:29:56.320
Don't love it.
link |
00:29:57.160
No, in fact, I was on the record telling DMV
link |
00:29:59.720
that I thought this was not a great metric.
link |
00:30:02.000
Do you think it's possible to create a metric,
link |
00:30:05.360
a number that could demonstrate safety
link |
00:30:09.480
outside of fatalities?
link |
00:30:12.400
So I do and I think that it won't be just one number.
link |
00:30:16.640
So as we are internally grappling with this
link |
00:30:21.320
and at some point we'll be able to talk
link |
00:30:23.600
more publicly about it,
link |
00:30:25.080
is how do we think about human performance
link |
00:30:28.560
in different tasks, say detecting traffic lights
link |
00:30:32.200
or safely making a left turn across traffic?
link |
00:30:37.720
And what do we think the failure rates
link |
00:30:40.040
are for those different capabilities for people?
link |
00:30:42.520
And then demonstrating to ourselves
link |
00:30:44.760
and then ultimately folks in regulatory role
link |
00:30:48.480
and then ultimately the public,
link |
00:30:50.760
that we have confidence that our system
link |
00:30:52.400
will work better than that.
link |
00:30:54.800
And so these individual metrics
link |
00:30:57.040
will kind of tell a compelling story ultimately.
link |
00:31:01.760
I do think at the end of the day,
link |
00:31:03.920
what we care about in terms of safety
link |
00:31:06.640
is lives saved and injuries reduced.
link |
00:31:11.640
And then ultimately kind of casualty dollars
link |
00:31:16.440
that people aren't having to pay to get their car fixed.
link |
00:31:19.360
And I do think that in aviation,
link |
00:31:22.680
they look at a kind of an event pyramid
link |
00:31:25.880
where a crash is at the top of that
link |
00:31:28.600
and that's the worst event obviously.
link |
00:31:30.440
And then there's injuries and near miss events and whatnot
link |
00:31:34.240
and violation of operating procedures.
link |
00:31:37.320
And you kind of build a statistical model
link |
00:31:40.160
of the relevance of the low severity things
link |
00:31:44.440
and the high severity things.
link |
00:31:45.280
And I think that's something
link |
00:31:46.120
where we'll be able to look at as well
link |
00:31:48.240
because an event per 85 million miles
link |
00:31:51.920
is statistically a difficult thing
link |
00:31:54.480
even at the scale of the US to kind of compare directly.
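
A sketch of how such an event pyramid gets used statistically, with all counts and severity ratios invented for illustration: frequent low-severity events, scaled by assumed ratios between tiers, stand in for the rate at the top that you could never measure directly.

```python
# Fatalities are too rare to measure directly, so measure the frequent
# tiers and let an assumed severity model carry you up the pyramid.
miles = 1_000_000
observed = {"procedure_violation": 200, "near_miss": 40, "injury": 2}
per_fatality = {"procedure_violation": 50_000,   # assumed events per fatality
                "near_miss": 10_000,
                "injury": 500}

for tier, count in observed.items():
    implied = count / per_fatality[tier] / miles
    print(f"fatality rate implied by {tier}: {implied:.1e} per mile")
```
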
link |
00:31:59.440
And that event, the fatality that's connected
link |
00:32:02.280
to an autonomous vehicle is significantly,
link |
00:32:07.480
at least currently magnified
link |
00:32:09.160
in the amount of attention it gets.
link |
00:32:12.320
So that speaks to public perception.
link |
00:32:15.080
I think the most popular topic
link |
00:32:16.720
about autonomous vehicles in the public
link |
00:32:19.520
is the trolley problem formulation, right?
link |
00:32:23.080
Which has, let's not get into that too much
link |
00:32:27.040
but is misguided in many ways.
link |
00:32:29.600
But it speaks to the fact that people are grappling
link |
00:32:32.320
with this idea of giving control over to a machine.
link |
00:32:36.160
So how do you win the hearts and minds of the people
link |
00:32:41.560
that autonomy is something
link |
00:32:43.600
that could be a part of their lives?
link |
00:32:45.480
I think you let them experience it, right?
link |
00:32:47.640
I think it's right.
link |
00:32:50.440
I think people should be skeptical.
link |
00:32:52.720
I think people should ask questions.
link |
00:32:55.680
I think they should doubt
link |
00:32:58.040
because this is something new and different.
link |
00:33:00.960
They haven't touched it yet.
link |
00:33:01.960
And I think it's perfectly reasonable.
link |
00:33:03.680
But at the same time,
link |
00:33:07.360
it's clear there's an opportunity to make the road safer.
link |
00:33:09.360
It's clear that we can improve access to mobility.
link |
00:33:12.480
It's clear that we can reduce the cost of mobility.
link |
00:33:16.680
And that once people try that
link |
00:33:19.520
and understand that it's safe
link |
00:33:22.800
and are able to use in their daily lives,
link |
00:33:24.480
I think it's one of these things that will just be obvious.
link |
00:33:28.080
And I've seen this practically in demonstrations
link |
00:33:32.280
that I've given where I've had people come in
link |
00:33:35.640
and they're very skeptical.
link |
00:33:38.600
And they get in the vehicle.
link |
00:33:39.960
My favorite one is taking somebody out on the freeway
link |
00:33:42.640
and we're on the 101 driving at 65 miles an hour.
link |
00:33:46.080
And after 10 minutes, they kind of turn and ask,
link |
00:33:48.560
is that all it does?
link |
00:33:49.560
And you're like, it's a self driving car.
link |
00:33:52.160
I'm not sure exactly what you thought it would do, right?
link |
00:33:54.920
But it becomes mundane,
link |
00:33:58.920
which is exactly what you want a technology
link |
00:34:01.560
like this to be, right?
link |
00:34:02.760
We don't really...
link |
00:34:04.680
When I turn the light switch on in here,
link |
00:34:07.320
I don't think about the complexity of those electrons
link |
00:34:12.040
being pushed down a wire from wherever it was
link |
00:34:14.240
being generated.
link |
00:34:15.880
It's like, I just get annoyed if it doesn't work, right?
link |
00:34:19.120
And what I value is the fact
link |
00:34:21.440
that I can do other things in this space.
link |
00:34:23.120
I can see my colleagues.
link |
00:34:24.600
I can read stuff on a paper.
link |
00:34:26.200
I can not be afraid of the dark.
link |
00:34:29.240
And I think that's what we want this technology to be like
link |
00:34:32.840
is it's in the background
link |
00:34:34.160
and people get to have those life experiences
link |
00:34:36.520
and do so safely.
link |
00:34:37.880
So putting this technology in the hands of people
link |
00:34:41.600
speaks to scale of deployment, right?
link |
00:34:45.800
So, the dreaded question about the future,
link |
00:34:50.360
because nobody can predict the future?
link |
00:34:52.840
But just maybe speak poetically about
link |
00:34:57.080
when do you think we'll see a large scale deployment
link |
00:35:00.600
of autonomous vehicles, 10,000, those kinds of numbers.
link |
00:35:06.360
We'll see that within 10 years.
link |
00:35:09.280
I'm pretty confident.
link |
00:35:10.600
We...
link |
00:35:13.920
What's an impressive scale?
link |
00:35:15.920
What moment, so you've done the DARPA Challenge
link |
00:35:19.000
where there's one vehicle,
link |
00:35:20.240
at which moment does it become,
link |
00:35:22.000
wow, this is serious scale?
link |
00:35:23.720
So I think the moment it gets serious is when
link |
00:35:27.960
we really do have a driverless vehicle
link |
00:35:32.040
operating on public roads
link |
00:35:34.760
and that we can do that kind of continuously.
link |
00:35:37.760
Without a safety driver?
link |
00:35:38.640
Without a safety driver in the vehicle.
link |
00:35:40.240
I think at that moment,
link |
00:35:41.320
we've kind of crossed the zero to one threshold.
link |
00:35:45.720
And then it is about how do we continue to scale that?
link |
00:35:50.000
How do we build the right business models?
link |
00:35:53.720
How do we build the right customer experience around it
link |
00:35:56.040
so that it is actually a useful product out in the world?
link |
00:36:00.720
And I think that is really,
link |
00:36:03.360
at that point, it moves from a,
link |
00:36:05.720
what is this kind of mixed science engineering project
link |
00:36:08.960
into engineering and commercialization
link |
00:36:12.120
and really starting to deliver on the value
link |
00:36:15.600
that we all see here.
link |
00:36:18.000
And actually making that real in the world.
link |
00:36:20.680
What do you think that deployment looks like?
link |
00:36:22.240
Where do we first see the inkling of
link |
00:36:24.920
no safety driver, one or two cars here and there?
link |
00:36:28.600
Is it on the highway?
link |
00:36:29.760
Is it in specific routes in the urban environment?
link |
00:36:33.200
I think it's gonna be urban, suburban type environments.
link |
00:36:37.920
You know, with Aurora,
link |
00:36:38.920
when we thought about how to tackle this,
link |
00:36:42.400
it was kind of in vogue to think about trucking
link |
00:36:46.000
as opposed to urban driving.
link |
00:36:47.760
And again, the human intuition around this
link |
00:36:51.240
is that freeways are easier to drive on
link |
00:36:57.040
because everybody's kind of going in the same direction
link |
00:36:59.240
and lanes are a little wider, et cetera.
link |
00:37:01.560
And I think that that intuition is pretty good,
link |
00:37:03.280
except we don't really care about most of the time.
link |
00:37:06.000
We care about all of the time.
link |
00:37:08.360
And when you're driving on a freeway with a truck,
link |
00:37:10.840
say 70 miles an hour and you've got a 70,000 pound load
link |
00:37:15.840
with you, that's just an incredible amount
link |
00:37:17.840
of kinetic energy.
link |
00:37:18.840
And so when that goes wrong, it goes really wrong.
link |
00:37:22.600
And those challenges that you see occur more rarely
link |
00:37:27.760
so you don't get to learn as quickly.
link |
00:37:31.040
And they're incrementally more difficult
link |
00:37:33.640
than urban driving, but they're not easier
link |
00:37:35.920
than urban driving.
link |
00:37:37.400
And so I think this happens in moderate speed,
link |
00:37:41.600
urban environments, because there,
link |
00:37:43.840
if two vehicles crash at 25 miles per hour,
link |
00:37:46.560
it's not good, but probably everybody walks away.
link |
00:37:51.000
And those events where there's the possibility
link |
00:37:53.680
for that occurring happen frequently.
link |
00:37:55.720
So we get to learn more rapidly.
link |
00:37:57.920
We get to do that with lower risk for everyone.
link |
00:38:02.440
And then we can deliver value to people
link |
00:38:04.280
that need to get from one place to another.
link |
00:38:05.800
And then once we've got that solved,
link |
00:38:08.160
then the kind of the freeway driving part of this
link |
00:38:10.000
just falls out, but we're able to learn
link |
00:38:12.440
more safely, more quickly in the urban environment.
link |
00:38:15.160
So 10 years and then scale 20, 30 years.
link |
00:38:18.480
I mean, who knows if it's sufficiently compelling
link |
00:38:21.440
experience is created, it can be faster and slower.
link |
00:38:24.320
Do you think there could be breakthroughs
link |
00:38:27.120
and what kind of breakthroughs might there be
link |
00:38:29.880
that completely change that timeline?
link |
00:38:32.360
Again, not only am I asking to predict the future,
link |
00:38:35.320
I'm asking you to predict breakthroughs
link |
00:38:37.280
that haven't happened yet.
link |
00:38:38.280
So what's the, I think another way to ask that would be
link |
00:38:41.800
if I could wave a magic wand,
link |
00:38:44.240
what part of the system would I make work today
link |
00:38:46.640
to accelerate it as quickly as possible?
link |
00:38:48.600
Right.
link |
00:38:52.080
Don't say infrastructure, please don't say infrastructure.
link |
00:38:54.080
No, it's definitely not infrastructure.
link |
00:38:56.280
It's really that perception forecasting capability.
link |
00:39:00.520
So if tomorrow you could give me a perfect model
link |
00:39:04.760
of what's happening and what will happen
link |
00:39:07.520
for the next five seconds around a vehicle
link |
00:39:11.400
on the roadway, that would accelerate things
link |
00:39:14.480
pretty dramatically.
link |
00:39:15.320
In terms of what keeps you up at night,
link |
00:39:17.560
are you mostly bothered by cars, pedestrians, or cyclists?
link |
00:39:21.680
So I worry most about the vulnerable road users
link |
00:39:25.920
about the combination of cyclists and cars, right?
link |
00:39:28.000
Just cyclists and pedestrians
link |
00:39:29.480
because they're not in armor.
link |
00:39:33.240
The cars, they're bigger, they've got protection
link |
00:39:36.480
for the people and so the ultimate risk is lower there.
link |
00:39:39.440
Whereas a pedestrian or cyclist, they're out on the road
link |
00:39:44.080
and they don't have any protection.
link |
00:39:46.520
And so we need to pay extra attention to that.
link |
00:39:49.760
Do you think about a very difficult technical challenge
link |
00:39:55.760
of the fact that pedestrians,
link |
00:39:58.560
if you try to protect pedestrians by being careful
link |
00:40:01.400
and slow, they'll take advantage of that.
link |
00:40:04.600
So the game theoretic dance.
link |
00:40:07.560
Does that worry you from a technical perspective
link |
00:40:10.880
how we solve that?
link |
00:40:12.520
Because as humans, the way we solve that
link |
00:40:14.600
is kind of nudge our way through the pedestrians,
link |
00:40:17.280
which doesn't feel from a technical perspective
link |
00:40:20.040
as an appropriate algorithm.
link |
00:40:23.240
But do you think about how we solve that problem?
link |
00:40:25.960
Yeah, I think there's two different concepts there.
link |
00:40:31.400
So one is, am I worried that because these vehicles
link |
00:40:35.880
are self driving, people will kind of step on the road
link |
00:40:37.640
and take advantage of them.
link |
00:40:38.680
And I've heard this and I don't really believe it
link |
00:40:43.800
because if I'm driving down the road
link |
00:40:46.000
and somebody steps in front of me, I'm going to stop.
link |
00:40:48.920
Right?
link |
00:40:49.760
Like even if I'm annoyed, I'm not gonna just drive
link |
00:40:53.720
through a person standing on the road.
link |
00:40:55.200
Right.
link |
00:40:56.440
And so I think today people can take advantage of this
link |
00:41:00.440
and you do see some people do it.
link |
00:41:02.600
I guess there's an incremental risk
link |
00:41:04.200
because maybe they have lower confidence
link |
00:41:05.920
that I'm going to see them
link |
00:41:06.760
than they might have for an automated vehicle.
link |
00:41:09.360
And so maybe that shifts it a little bit.
link |
00:41:12.080
But I think people don't want to get hit by cars.
link |
00:41:14.400
And so I think that I'm not that worried
link |
00:41:17.120
about people walking out onto the 101
link |
00:41:18.800
and creating chaos more than they would today.
link |
00:41:24.400
Regarding kind of the nudging through a big stream
link |
00:41:27.040
of pedestrians leaving a concert or something.
link |
00:41:30.040
I think that is further down the technology pipeline.
link |
00:41:33.480
I think that you're right, that's tricky.
link |
00:41:36.920
I don't think it's necessarily,
link |
00:41:40.320
I think the algorithm people use for this is pretty simple.
link |
00:41:43.360
Right?
link |
00:41:44.200
It's kind of just move forward slowly
link |
00:41:45.040
and if somebody's really close, stop.
link |
00:41:47.600
And I think that that probably can be replicated
link |
00:41:50.840
pretty easily and particularly given that it's,
link |
00:41:54.040
you don't do this at 30 miles an hour, you do it at one,
link |
00:41:57.240
that even in those situations,
link |
00:41:59.080
the risk is relatively minimal.
link |
00:42:01.200
But it's not something we're thinking
link |
00:42:03.440
about in any serious way.
link |
00:42:04.560
And probably that's less an algorithm problem
link |
00:42:08.000
and more about creating a human experience.
link |
00:42:10.160
So the HCI people that create a visual display
link |
00:42:14.320
so that you're pleasantly, as a pedestrian,
link |
00:42:16.280
nudged out of the way.
link |
00:42:17.680
That's an experience problem, not an algorithm problem.
link |
00:42:22.880
Who's the main competitor to Aurora today?
link |
00:42:25.480
And how do you out compete them in the long run?
link |
00:42:28.600
So we really focus a lot on what we're doing here.
link |
00:42:31.200
I think that, I've said this a few times
link |
00:42:34.440
that this is a huge difficult problem
link |
00:42:37.960
and it's great that a bunch of companies are tackling it
link |
00:42:40.280
because I think it's so important for society
link |
00:42:42.320
that somebody gets there.
link |
00:42:45.200
So we don't spend a whole lot of time
link |
00:42:49.040
like thinking tactically about who's out there
link |
00:42:51.560
and how do we beat that person individually?
link |
00:42:55.480
What are we trying to do to go faster ultimately?
link |
00:42:58.680
Well, part of it is the leadership team we have
link |
00:43:02.600
has got pretty tremendous experience.
link |
00:43:04.160
And so we kind of understand the landscape
link |
00:43:06.400
and understand where the cul de sacs are to some degree.
link |
00:43:09.120
And we try and avoid those.
link |
00:43:12.600
I think part of it is
link |
00:43:14.240
just this great team we've built.
link |
00:43:16.240
People, this is a technology and a company
link |
00:43:19.040
that people believe in the mission of.
link |
00:43:22.280
And so it allows us to attract just awesome people
link |
00:43:24.760
to go work.
link |
00:43:26.760
We've got a culture, I think,
link |
00:43:28.000
that people appreciate, that allows them to focus,
link |
00:43:30.440
allows them to really spend time solving problems.
link |
00:43:33.080
And I think that keeps them energized.
link |
00:43:35.880
And then we've invested heavily in the infrastructure
link |
00:43:43.520
and architectures that we think will ultimately accelerate us.
link |
00:43:46.520
So because of the folks we're able to bring in early on,
link |
00:43:50.640
because of the great investors we have,
link |
00:43:53.520
we don't spend all of our time doing demos
link |
00:43:56.760
and kind of leaping from one demo to the next.
link |
00:43:58.680
We've been given the freedom to invest in
link |
00:44:03.960
infrastructure to do machine learning,
link |
00:44:05.480
infrastructure to pull data from our on road testing,
link |
00:44:08.600
infrastructure to use that to accelerate engineering.
link |
00:44:11.480
And I think that early investment
link |
00:44:14.480
and continuing investment in those kind of tools
link |
00:44:17.320
will ultimately allow us to accelerate
link |
00:44:19.400
and do something pretty incredible.
link |
00:44:21.920
Chris, beautifully put.
link |
00:44:23.400
It's a good place to end.
link |
00:44:24.640
Thank you so much for talking today.
link |
00:44:26.520
Thank you very much.
link |
00:44:27.360
I hope you enjoyed it.