
Chris Urmson: Self-Driving Cars at Aurora, Google, CMU, and DARPA | Lex Fridman Podcast #28



link |
00:00:00.000
The following is a conversation with Chris Urmson.
link |
00:00:03.120
He was a CTO of the Google self driving car team,
link |
00:00:06.000
a key engineer and leader behind the Carnegie Mellon
link |
00:00:08.880
University autonomous vehicle entries in the DARPA Grand
link |
00:00:12.000
Challenges and the winner of the DARPA Urban Challenge.
link |
00:00:16.160
Today, he's the CEO of Aurora Innovation, an autonomous
link |
00:00:20.100
vehicle software company.
link |
00:00:21.360
He started with Sterling Anderson,
link |
00:00:23.600
who was the former director of Tesla Autopilot,
link |
00:00:25.960
and Drew Bagnell, Uber's former autonomy and perception lead.
link |
00:00:30.120
Chris is one of the top roboticists and autonomous
link |
00:00:32.880
vehicle experts in the world, and a longtime voice
link |
00:00:36.320
of reason in a space that is shrouded
link |
00:00:38.840
in both mystery and hype.
link |
00:00:41.320
He both acknowledges the incredible challenges
link |
00:00:43.600
involved in solving the problem of autonomous driving
link |
00:00:46.480
and is working hard to solve it.
link |
00:00:49.760
This is the Artificial Intelligence podcast.
link |
00:00:52.400
If you enjoy it, subscribe on YouTube,
link |
00:00:54.720
give it five stars on iTunes, support it on Patreon,
link |
00:00:57.920
or simply connect with me on Twitter
link |
00:00:59.720
at Lex Fridman, spelled F R I D M A N.
link |
00:01:03.240
And now, here's my conversation with Chris Urmson.
link |
00:01:09.120
You were part of both the DARPA Grand Challenge
link |
00:01:11.960
and the DARPA Urban Challenge teams
link |
00:01:13.880
at CMU with Red Whittaker.
link |
00:01:17.040
What technical or philosophical things
link |
00:01:19.720
have you learned from these races?
link |
00:01:22.240
I think the high order bit was that it could be done.
link |
00:01:26.600
I think that was the thing that was
link |
00:01:30.200
incredible about the first of the Grand Challenges,
link |
00:01:34.880
that I remember I was a grad student at Carnegie Mellon,
link |
00:01:38.160
and there was kind of this dichotomy of it
link |
00:01:45.360
seemed really hard, so that would
link |
00:01:46.720
be cool and interesting.
link |
00:01:48.800
But at the time, we were the only robotics institute around,
link |
00:01:52.800
and so if we went into it and fell on our faces,
link |
00:01:55.560
that would be embarrassing.
link |
00:01:58.360
So I think just having the will to go do it,
link |
00:02:01.120
to try to do this thing that at the time
link |
00:02:02.880
was marked as darn near impossible,
link |
00:02:05.000
and then after a couple of tries,
link |
00:02:06.960
be able to actually make it happen,
link |
00:02:08.420
I think that was really exciting.
link |
00:02:12.320
But at which point did you believe it was possible?
link |
00:02:15.040
Did you from the very beginning?
link |
00:02:16.960
Did you personally?
link |
00:02:18.000
Because you're one of the lead engineers.
link |
00:02:19.800
You actually had to do a lot of the work.
link |
00:02:21.800
Yeah, I was the technical director there,
link |
00:02:23.880
and did a lot of the work, along with a bunch
link |
00:02:26.120
of other really good people.
link |
00:02:28.420
Did I believe it could be done?
link |
00:02:29.760
Yeah, of course.
link |
00:02:31.080
Why would you go do something you thought
link |
00:02:32.760
was completely impossible?
link |
00:02:34.800
We thought it was going to be hard.
link |
00:02:36.260
We didn't know how we were going to be able to do it.
link |
00:02:37.800
We didn't know if we'd be able to do it the first time.
link |
00:02:42.880
Turns out we couldn't.
link |
00:02:45.960
That, yeah, I guess you have to.
link |
00:02:48.400
I think there's a certain benefit to naivete, right?
link |
00:02:52.960
That if you don't know how hard something really is,
link |
00:02:55.440
you try different things, and it gives you an opportunity
link |
00:02:59.600
that others who are wiser maybe don't have.
link |
00:03:04.120
What were the biggest pain points?
link |
00:03:05.720
Mechanical, sensors, hardware, software,
link |
00:03:08.880
algorithms for mapping, localization,
link |
00:03:11.800
just general perception, control?
link |
00:03:13.680
Like hardware, software, first of all?
link |
00:03:15.320
I think that's the joy of this field, is that it's all hard
link |
00:03:20.120
and that you have to be good at each part of it.
link |
00:03:25.360
So for the Grand Challenges, if I look back at it from today,
link |
00:03:32.360
it should be easy today. It was a static world.
link |
00:03:38.960
There weren't other actors moving through it,
link |
00:03:40.800
is what that means.
link |
00:03:42.480
It was out in the desert, so you get really good GPS.
link |
00:03:47.080
So that went, and we could map it roughly.
link |
00:03:51.400
And so in retrospect now, it's within the realm of things
link |
00:03:55.160
we could do back then.
link |
00:03:57.840
Just actually getting the vehicle and the,
link |
00:03:59.720
there's a bunch of engineering work
link |
00:04:00.680
to get the vehicle so that we could control it and drive it.
link |
00:04:04.760
That's still a pain today, but it was even more so back then.
link |
00:04:09.600
And then the uncertainty of exactly what they wanted us to do
link |
00:04:14.280
was part of the challenge as well.
link |
00:04:17.040
Right, you didn't actually know the track heading in here.
link |
00:04:19.440
You knew approximately, but you didn't actually
link |
00:04:21.480
know the route that was going to be taken.
link |
00:04:23.520
That's right, we didn't know the route.
link |
00:04:24.920
We didn't even really, the way the rules had been described,
link |
00:04:28.600
you had to kind of guess.
link |
00:04:29.800
So if you think back to that challenge,
link |
00:04:33.360
the idea was that the government would give us,
link |
00:04:36.960
the DARPA would give us a set of waypoints
link |
00:04:40.320
and kind of the width that you had to stay within
link |
00:04:43.520
between the line that went between each of those waypoints.
link |
00:04:46.800
And so the most devious thing they could have done
link |
00:04:49.280
is set a kilometer wide corridor across a field
link |
00:04:53.280
of scrub brush and rocks and said, go figure it out.
link |
00:04:58.520
Fortunately, it really, it turned into basically driving
link |
00:05:01.920
along a set of trails, which is much more relevant
link |
00:05:05.000
to the application they were looking for.
link |
00:05:08.760
But no, it was a hell of a thing back in the day.
link |
00:05:12.080
So the legend, Red, was kind of leading that effort
link |
00:05:16.640
in terms of just broadly speaking.
link |
00:05:19.120
So you're a leader now.
link |
00:05:22.040
What have you learned from Red about leadership?
link |
00:05:25.000
I think there's a couple things.
link |
00:05:26.200
One is go and try those really hard things.
link |
00:05:31.080
That's where there is an incredible opportunity.
link |
00:05:34.480
I think the other big one, though,
link |
00:05:36.560
is to see people for who they can be, not who they are.
link |
00:05:41.720
It's one of the things that I actually,
link |
00:05:43.720
one of the deepest lessons I learned from Red
link |
00:05:46.080
was that he would look at undergraduates
link |
00:05:50.200
or graduate students and empower them to be leaders,
link |
00:05:56.120
to have responsibility, to do great things
link |
00:06:00.320
that I think another person might look at them
link |
00:06:04.480
and think, oh, well, that's just an undergraduate student.
link |
00:06:06.600
What could they know?
link |
00:06:08.680
And so I think that kind of trust but verify,
link |
00:06:12.720
have confidence in what people can become,
link |
00:06:14.480
I think is a really powerful thing.
link |
00:06:16.680
So through that, let's just fast forward through the history.
link |
00:06:20.440
Can you maybe talk through the technical evolution
link |
00:06:24.160
of autonomous vehicle systems
link |
00:06:26.200
from the first two Grand Challenges to the Urban Challenge
link |
00:06:29.960
to today, are there major shifts in your mind
link |
00:06:33.560
or is it the same kind of technology just made more robust?
link |
00:06:37.240
I think there's been some big, big steps.
link |
00:06:40.880
So for the Grand Challenge,
link |
00:06:43.720
the real technology that unlocked that was HD mapping.
link |
00:06:51.400
Prior to that, a lot of the off road robotics work
link |
00:06:55.160
had been done without any real prior model
link |
00:06:58.480
of what the vehicle was going to encounter.
link |
00:07:01.400
And so that innovation that the fact that we could get
link |
00:07:05.960
decimeter resolution models was really a big deal.
link |
00:07:13.440
And that allowed us to kind of bound the complexity
link |
00:07:18.200
of the driving problem the vehicle had
link |
00:07:19.680
and allowed it to operate at speed
link |
00:07:21.040
because we could assume things about the environment
link |
00:07:23.800
that it was going to encounter.
link |
00:07:25.360
So that was the big step there.
link |
00:07:31.280
For the Urban Challenge,
link |
00:07:37.240
one of the big technological innovations there
link |
00:07:39.280
was the multi beam LIDAR
link |
00:07:41.960
and being able to generate high resolution,
link |
00:07:45.760
mid to long range 3D models of the world
link |
00:07:48.680
and use that for understanding the world around the vehicle.
link |
00:07:53.680
And that was really kind of a game changing technology.
link |
00:07:58.600
In parallel with that,
link |
00:08:00.000
we saw a bunch of other technologies
link |
00:08:04.360
that had been kind of converging
link |
00:08:06.120
have their day in the sun.
link |
00:08:08.440
So Bayesian estimation had been,
link |
00:08:12.560
SLAM had been a big field in robotics.
link |
00:08:17.840
You would go to a conference a couple of years before that
link |
00:08:20.760
and every paper would effectively have SLAM somewhere in it.
link |
00:08:24.880
And so seeing that the Bayesian estimation techniques
link |
00:08:30.720
play out on a very visible stage,
link |
00:08:33.400
I thought that was pretty exciting to see.
link |
00:08:38.080
And mostly SLAM was done based on LIDAR at that time.
link |
00:08:41.560
Yeah, and in fact, we weren't really doing SLAM per se
link |
00:08:45.600
in real time because we had a model ahead of time,
link |
00:08:47.480
we had a roadmap, but we were doing localization.
link |
00:08:51.040
And we were using the LIDAR or the cameras
link |
00:08:53.560
depending on who exactly was doing it
link |
00:08:55.400
to localize to a model of the world.
link |
00:08:57.560
And I thought that was a big step
link |
00:09:00.160
from kind of naively trusting GPS, INS before that.
link |
00:09:06.640
And again, lots of work had been going on in this field.
link |
00:09:09.840
Certainly this was not doing anything
link |
00:09:13.040
particularly innovative in SLAM or in localization,
link |
00:09:16.840
but it was seeing that technology necessary
link |
00:09:20.200
in a real application on a big stage,
link |
00:09:21.800
I thought was very cool.
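
To make that localize-to-a-prior-map idea concrete, here is a minimal particle-filter-style sketch in Python. It is illustrative only, not the Urban Challenge code; the map-scoring function `prior_map_score` is an assumed helper that returns how well a set of points matches the prior map.

```python
# Illustrative sketch only: a toy 2D particle-filter localizer against a prior map.
import numpy as np

def predict(particles, odom, noise=(0.05, 0.05, 0.01)):
    """Move every particle (x, y, theta) by the odometry delta plus noise."""
    dx, dy, dth = odom
    n = len(particles)
    particles[:, 0] += dx + np.random.randn(n) * noise[0]
    particles[:, 1] += dy + np.random.randn(n) * noise[1]
    particles[:, 2] += dth + np.random.randn(n) * noise[2]
    return particles

def transform(points, pose):
    """Project sensor points into the map frame for a candidate pose (x, y, theta)."""
    x, y, th = pose
    c, s = np.cos(th), np.sin(th)
    R = np.array([[c, -s], [s, c]])
    return points @ R.T + np.array([x, y])

def update(particles, scan_points, prior_map_score):
    """Weight particles by how well the lidar scan matches the prior map, then resample.
    prior_map_score(points) -> likelihood is assumed to be provided by the map."""
    weights = np.array([prior_map_score(transform(scan_points, p)) for p in particles])
    weights = weights + 1e-12
    weights /= weights.sum()
    idx = np.random.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]   # the particle cloud is the localization estimate
```
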
link |
00:09:23.080
So for the urban challenge,
link |
00:09:24.000
those maps were already constructed offline in general.
link |
00:09:28.600
And did people do that individually,
link |
00:09:30.920
did individual teams do it individually
link |
00:09:33.600
so they had their own different approaches there
link |
00:09:36.440
or did everybody kind of share that information
link |
00:09:41.720
at least intuitively?
link |
00:09:42.880
So DARPA gave all the teams a model of the world, a map.
link |
00:09:49.640
And then one of the things that we had to figure out
link |
00:09:53.240
back then was, and it's still one of these things
link |
00:09:56.080
that trips people up today
link |
00:09:57.280
is actually the coordinate system.
link |
00:10:00.280
So you get a latitude longitude
link |
00:10:03.080
and to so many decimal places,
link |
00:10:05.040
you don't really care about kind of the ellipsoid
link |
00:10:07.360
of the earth that's being used.
link |
00:10:09.560
But when you want to get to 10 centimeter
link |
00:10:12.240
or centimeter resolution,
link |
00:10:14.400
you care whether the coordinate system is NAD 83
link |
00:10:18.520
or WGS 84. These are different ways to describe
link |
00:10:24.200
both the kind of non sphericalness of the earth,
link |
00:10:26.760
but also kind of the, I think,
link |
00:10:31.080
I can't remember which one,
link |
00:10:32.080
the tectonic shifts that are happening
link |
00:10:33.600
and how to transform the global datum as a function of that.
link |
00:10:37.000
So getting a map and then actually matching it to reality
link |
00:10:41.020
to centimeter resolution, that was kind of interesting
link |
00:10:42.880
and fun back then.
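
As a small illustration of the datum point (not from the conversation), here is a sketch using the pyproj library to express the same point in WGS 84 and NAD 83. The exact offset you get depends on which transformation pipeline pyproj has available, but it shows why the choice matters once you care about centimeters.

```python
# The "same" latitude/longitude refers to slightly different ground points in
# WGS 84 vs NAD 83, which matters at HD-map resolution.
from pyproj import Transformer

lat, lon = 34.0522, -118.2437          # an example point, WGS 84 (EPSG:4326)

# WGS 84 -> NAD 83 (EPSG:4269); always_xy=True means (lon, lat) argument order.
to_nad83 = Transformer.from_crs("EPSG:4326", "EPSG:4269", always_xy=True)
lon83, lat83 = to_nad83.transform(lon, lat)

print(f"WGS 84 : {lat:.8f}, {lon:.8f}")
print(f"NAD 83 : {lat83:.8f}, {lon83:.8f}")
# A few 1e-7 degrees of latitude is already centimeters on the ground.
```
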
link |
00:10:44.040
So how much work was the perception doing there?
link |
00:10:46.760
So how much were you relying on localization based on maps
link |
00:10:52.480
without using perception to register to the maps?
link |
00:10:55.760
And I guess the question is how advanced
link |
00:10:58.000
was perception at that point?
link |
00:10:59.800
It's certainly behind where we are today, right?
link |
00:11:01.960
We're more than a decade since the urban challenge.
link |
00:11:05.840
But the core of it was there.
link |
00:11:08.640
That we were tracking vehicles.
link |
00:11:13.120
We had to do that at 100 plus meter range
link |
00:11:15.640
because we had to merge with other traffic.
link |
00:11:18.320
We were using, again, Bayesian estimates
link |
00:11:21.240
for state of these vehicles.
link |
00:11:23.860
We had to deal with a bunch of the problems
link |
00:11:25.580
that you think of today,
link |
00:11:26.920
of predicting where that vehicle's going to be
link |
00:11:29.820
a few seconds into the future.
link |
00:11:31.060
We had to deal with the fact
link |
00:11:32.380
that there were multiple hypotheses for that
link |
00:11:35.320
because a vehicle at an intersection might be going right
link |
00:11:37.660
or it might be going straight
link |
00:11:38.780
or it might be making a left turn.
link |
00:11:41.500
And we had to deal with the challenge of the fact
link |
00:11:44.120
that our behavior was going to impact the behavior
link |
00:11:47.600
of that other operator.
link |
00:11:48.960
And we did a lot of that in relatively naive ways,
link |
00:11:53.480
but it kind of worked.
link |
00:11:54.820
Still had to have some kind of solution.
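
A toy version of that multiple-hypothesis prediction idea, for illustration only; the maneuver set, prior probabilities, and yaw rates below are invented, not the Urban Challenge values.

```python
# Keep one simple motion model per possible maneuver and a probability for each.
import math

def rollout(x, y, heading, speed, yaw_rate, horizon=3.0, dt=0.5):
    """Constant-speed, constant-yaw-rate rollout of a tracked vehicle."""
    traj = []
    for _ in range(int(horizon / dt)):
        heading += yaw_rate * dt
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
        traj.append((x, y))
    return traj

def predict_hypotheses(track):
    """Return {maneuver: (probability, predicted trajectory)} for one track."""
    x, y, heading, speed = track
    maneuvers = {
        "straight": (0.6,  0.0),    # prior probability, yaw rate (rad/s)
        "left":     (0.2,  0.3),
        "right":    (0.2, -0.3),
    }
    return {name: (p, rollout(x, y, heading, speed, yaw_rate))
            for name, (p, yaw_rate) in maneuvers.items()}

# e.g. a car 100 m ahead, heading toward us at 10 m/s:
predictions = predict_hypotheses((100.0, 0.0, math.pi, 10.0))
```
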
link |
00:11:57.080
And so where does that, 10 years later,
link |
00:11:59.960
where does that take us today
link |
00:12:01.520
from that artificial city construction
link |
00:12:04.260
to real cities to the urban environment?
link |
00:12:07.000
Yeah, I think the biggest thing
link |
00:12:09.160
is that the actors are truly unpredictable.
link |
00:12:15.720
That most of the time, the drivers on the road,
link |
00:12:18.800
the other road users are out there behaving well,
link |
00:12:24.080
but every once in a while they're not.
link |
00:12:27.080
The variety of other vehicles is, you have all of them.
link |
00:12:32.080
In terms of behavior, in terms of perception, or both?
link |
00:12:35.840
Both.
link |
00:12:38.740
Back then we didn't have to deal with cyclists,
link |
00:12:40.520
we didn't have to deal with pedestrians,
link |
00:12:42.800
didn't have to deal with traffic lights.
link |
00:12:46.260
The scale over which you have to operate now
link |
00:12:49.400
is much larger than the air base
link |
00:12:51.120
that we were thinking about back then.
link |
00:12:52.720
So what, easy question,
link |
00:12:56.280
what do you think is the hardest part about driving?
link |
00:12:59.720
Easy question.
link |
00:13:00.560
Yeah, no, I'm joking.
link |
00:13:02.560
I'm sure nothing really jumps out at you as one thing,
link |
00:13:07.440
but in the jump from the urban challenge to the real world,
link |
00:13:12.920
is there something in particular that
link |
00:13:15.320
you foresee as a very serious, difficult challenge?
link |
00:13:18.480
I think the most fundamental difference
link |
00:13:21.080
is that we're doing it for real.
link |
00:13:26.760
That in that environment,
link |
00:13:28.960
it was both a limited complexity environment
link |
00:13:31.880
because certain actors weren't there,
link |
00:13:33.240
because the roads were maintained,
link |
00:13:35.380
there were barriers keeping people separate
link |
00:13:37.360
from robots at the time,
link |
00:13:40.840
and it only had to work for 60 miles.
link |
00:13:43.300
Which, looking at it from 2006,
link |
00:13:46.160
it had to work for 60 miles, right?
link |
00:13:48.960
Looking at it from now,
link |
00:13:51.880
we want things that will go and drive
link |
00:13:53.720
for half a million miles,
link |
00:13:57.160
and it's just a different game.
link |
00:14:00.940
So how important,
link |
00:14:03.480
you said LiDAR came into the game early on,
link |
00:14:06.080
and it's really the primary driver
link |
00:14:07.880
of autonomous vehicles today as a sensor.
link |
00:14:10.240
So how important is the role of LiDAR
link |
00:14:11.920
in the sensor suite in the near term?
link |
00:14:14.800
So I think it's essential.
link |
00:14:17.920
I believe, but I also believe that cameras are essential,
link |
00:14:20.480
and I believe the radar is essential.
link |
00:14:22.120
I think that you really need to use
link |
00:14:26.280
the composition of data from these different sensors
link |
00:14:28.720
if you want the thing to really be robust.
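
As a rough illustration of composing data from different sensors, here is a toy late-fusion sketch; the gating distance and confidence rules are invented, and nothing here reflects Aurora's actual architecture.

```python
# Merge per-sensor detections that agree in position; agreement across
# modalities raises confidence in the fused object.
from dataclasses import dataclass

@dataclass
class Detection:
    x: float          # metres, vehicle frame
    y: float
    sensor: str       # "lidar" | "camera" | "radar"
    confidence: float

def fuse(detections, gate=1.5):
    """Greedy nearest-neighbour association: cluster detections within `gate`
    metres, average their positions, and accumulate (capped) confidence."""
    fused = []
    for det in detections:
        for obj in fused:
            if (obj["x"] - det.x) ** 2 + (obj["y"] - det.y) ** 2 < gate ** 2:
                n = len(obj["sensors"]) + 1
                obj["x"] = (obj["x"] * (n - 1) + det.x) / n
                obj["y"] = (obj["y"] * (n - 1) + det.y) / n
                obj["sensors"].append(det.sensor)
                obj["confidence"] = min(1.0, obj["confidence"] + det.confidence)
                break
        else:
            fused.append({"x": det.x, "y": det.y,
                          "sensors": [det.sensor], "confidence": det.confidence})
    # keep objects two modalities agree on, or a single very confident detection
    return [o for o in fused if len(set(o["sensors"])) >= 2 or o["confidence"] > 0.9]
```
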
link |
00:14:32.640
The question I wanna ask,
link |
00:14:34.360
let's see if we can untangle it,
link |
00:14:35.600
is what are your thoughts on the Elon Musk
link |
00:14:39.320
provocative statement that LiDAR is a crutch,
link |
00:14:42.340
that it's a kind of, I guess, growing pains,
link |
00:14:47.760
and that much of the perception task
link |
00:14:49.920
can be done with cameras?
link |
00:14:52.120
So I think it is undeniable
link |
00:14:55.440
that people walk around without lasers in their foreheads,
link |
00:14:59.360
and they can get into vehicles and drive them,
link |
00:15:01.880
and so there's an existence proof
link |
00:15:05.600
that you can drive using passive vision.
link |
00:15:10.880
No doubt, can't argue with that.
link |
00:15:12.720
In terms of sensors, yeah, so there's proof.
link |
00:15:14.680
Yeah, in terms of sensors, right?
link |
00:15:16.000
So there's an example that we all go do it,
link |
00:15:20.200
many of us every day.
link |
00:15:21.380
In terms of LiDAR being a crutch, sure.
link |
00:15:28.180
But in the same way that the combustion engine
link |
00:15:33.100
was a crutch on the path to an electric vehicle,
link |
00:15:35.260
in the same way that any technology ultimately gets
link |
00:15:40.840
replaced by some superior technology in the future,
link |
00:15:44.380
and really the way that I look at this
link |
00:15:47.740
is that the way we get around on the ground,
link |
00:15:51.460
the way that we use transportation is broken,
link |
00:15:55.280
and that we have this, I think the number I saw this morning,
link |
00:15:59.740
37,000 Americans killed last year on our roads,
link |
00:16:04.060
and that's just not acceptable.
link |
00:16:05.380
And so any technology that we can bring to bear
link |
00:16:09.460
that accelerates this self driving technology
link |
00:16:12.860
coming to market and saving lives
link |
00:16:14.640
is technology we should be using.
link |
00:16:18.280
And it feels just arbitrary to say,
link |
00:16:20.840
well, I'm not okay with using lasers
link |
00:16:26.240
because that's whatever,
link |
00:16:27.820
but I am okay with using an eight megapixel camera
link |
00:16:30.720
or a 16 megapixel camera.
link |
00:16:32.880
These are just bits of technology,
link |
00:16:34.640
and we should be taking the best technology
link |
00:16:36.360
from the tool bin that allows us to go and solve a problem.
link |
00:16:41.360
I often talk, well, obviously you do as well,
link |
00:16:45.160
to sort of automotive companies,
link |
00:16:48.280
and if there's one word that comes up more often
link |
00:16:51.360
than anything, it's cost, and trying to drive costs down.
link |
00:16:55.280
So while it's true that it's a tragic number, the 37,000,
link |
00:17:01.400
the question is, and I'm not the one asking this question
link |
00:17:04.880
because I hate this question,
link |
00:17:05.820
but we want to find the cheapest sensor suite
link |
00:17:09.960
that creates a safe vehicle.
link |
00:17:13.280
So in that uncomfortable trade off,
link |
00:17:18.220
do you foresee LiDAR coming down in cost in the future,
link |
00:17:23.680
or do you see a day where level four autonomy
link |
00:17:26.680
is possible without LiDAR?
link |
00:17:29.880
I see both of those, but it's really a matter of time.
link |
00:17:32.880
And I think really, maybe I would talk to the question
link |
00:17:36.040
you asked about the cheapest sensor.
link |
00:17:37.840
I don't think that's actually what you want.
link |
00:17:40.360
What you want is a sensor suite that is economically viable.
link |
00:17:45.680
And then after that, everything is about margin
link |
00:17:49.440
and driving costs out of the system.
link |
00:17:52.120
What you also want is a sensor suite that works.
link |
00:17:55.360
And so it's great to tell a story about
link |
00:17:59.600
how it would be better to have a self driving system
link |
00:18:03.260
with a $50 sensor instead of a $500 sensor.
link |
00:18:08.040
But if the $500 sensor makes it work
link |
00:18:10.520
and the $50 sensor doesn't work, who cares?
link |
00:18:15.680
So long as you can actually have an economic opportunity,
link |
00:18:20.020
there's an economic opportunity there.
link |
00:18:21.520
And the economic opportunity is important
link |
00:18:23.760
because that's how you actually have a sustainable business
link |
00:18:27.760
and that's how you can actually see this come to scale
link |
00:18:31.120
and be out in the world.
link |
00:18:32.400
And so when I look at LiDAR,
link |
00:18:35.960
I see a technology that has no underlying
link |
00:18:38.880
fundamental expense to it.
link |
00:18:42.420
It's going to be more expensive than an imager
link |
00:18:46.080
because CMOS processes or fab processes
link |
00:18:51.360
are dramatically more scalable than mechanical processes.
link |
00:18:56.200
But we still should be able to drive costs down
link |
00:18:58.320
substantially on that side.
link |
00:19:00.120
And then I also do think that with the right business model
link |
00:19:05.880
you can absorb more,
link |
00:19:07.560
certainly more cost on the bill of materials.
link |
00:19:09.480
Yeah, if the sensor suite works, extra value is provided,
link |
00:19:12.600
thereby you don't need to drive costs down to zero.
link |
00:19:15.480
It's the basic economics.
link |
00:19:17.100
You've talked about your intuition
link |
00:19:18.820
that level two autonomy is problematic
link |
00:19:22.200
because of the human factors of vigilance
link |
00:19:25.920
decrement, complacency, overtrust, and so on,
link |
00:19:28.040
just us being human.
link |
00:19:29.600
We overtrust the system,
link |
00:19:31.120
we start partaking even more
link |
00:19:34.240
in secondary activities like smartphones and so on.
link |
00:19:38.680
Have your views evolved on this point in either direction?
link |
00:19:43.000
Can you speak to it?
link |
00:19:44.800
So, and I want to be really careful
link |
00:19:47.480
because sometimes this gets twisted in a way
link |
00:19:50.380
that I certainly didn't intend.
link |
00:19:53.040
So active safety systems are a really important technology
link |
00:19:58.040
that we should be pursuing and integrating into vehicles.
link |
00:20:02.080
And there's an opportunity in the near term
link |
00:20:04.280
to reduce accidents, reduce fatalities,
link |
00:20:06.520
and we should be pushing on that.
link |
00:20:11.960
Level two systems are systems
link |
00:20:14.680
where the vehicle is controlling two axes.
link |
00:20:18.080
So braking/throttle and steering.
link |
00:20:23.480
And I think there are variants of level two systems
link |
00:20:25.680
that are supporting the driver.
link |
00:20:27.280
That absolutely we should encourage to be out there.
link |
00:20:31.080
Where I think there's a real challenge
link |
00:20:32.880
is in the human factors part around this
link |
00:20:37.640
and the misconception from the public
link |
00:20:41.240
around the capability set that that enables
link |
00:20:43.600
and the trust that they should have in it.
link |
00:20:46.640
And that is where I kind of,
link |
00:20:50.000
I'm actually incrementally more concerned
link |
00:20:52.920
around level three systems
link |
00:20:54.440
and how exactly a level two system is marketed and delivered
link |
00:20:58.440
and how much effort people have put into those human factors.
link |
00:21:01.840
So I still believe several things around this.
link |
00:21:05.640
One is people will overtrust the technology.
link |
00:21:09.440
We've seen over the last few weeks
link |
00:21:11.440
a spate of people sleeping in their Tesla.
link |
00:21:14.920
I watched an episode last night of Trevor Noah
link |
00:21:19.920
talking about this and him,
link |
00:21:23.920
this is a smart guy who has a lot of resources
link |
00:21:26.720
at his disposal describing a Tesla as a self driving car
link |
00:21:30.720
and asking why shouldn't people be sleeping in their Tesla?
link |
00:21:33.480
And it's like, well, because it's not a self driving car
link |
00:21:36.560
and it is not intended to be
link |
00:21:38.840
and these people will almost certainly die at some point
link |
00:21:46.400
or hurt other people.
link |
00:21:48.040
And so we need to really be thoughtful
link |
00:21:50.080
about how that technology is described
link |
00:21:51.840
and brought to market.
link |
00:21:54.240
I also think that because of the economic challenges
link |
00:21:59.240
we were just talking about,
link |
00:22:01.240
that these level two driver assistance systems,
link |
00:22:05.160
that technology path will diverge
link |
00:22:07.280
from the technology path that we need to be on
link |
00:22:10.200
to actually deliver truly self driving vehicles,
link |
00:22:14.080
ones where you can get in it and drive it.
link |
00:22:16.920
Can get in it and sleep and have the equivalent
link |
00:22:20.800
or better safety than a human driver behind the wheel.
link |
00:22:24.680
Because again, the economics are very different
link |
00:22:28.480
in those two worlds and so that leads
link |
00:22:30.880
to divergent technology.
link |
00:22:32.800
So you just don't see the economics
link |
00:22:34.680
of gradually increasing from level two
link |
00:22:38.560
and doing so quickly enough
link |
00:22:41.600
to where it doesn't cause critical safety concerns.
link |
00:22:44.480
You believe that it needs to diverge at this point
link |
00:22:48.680
into basically different routes.
link |
00:22:50.800
And really that comes back to what are those L2
link |
00:22:55.560
and L1 systems doing?
link |
00:22:57.080
And they are driver assistance functions
link |
00:22:59.840
where the people that are marketing that responsibly
link |
00:23:04.400
are being very clear and putting human factors in place
link |
00:23:08.000
such that the driver is actually responsible for the vehicle
link |
00:23:12.440
and that the technology is there to support the driver.
link |
00:23:15.160
And the safety cases that are built around those
link |
00:23:19.880
are dependent on that driver attention and attentiveness.
link |
00:23:24.040
And at that point, you can kind of give up
link |
00:23:29.160
to some degree for economic reasons,
link |
00:23:31.240
you can give up on say false negatives.
link |
00:23:34.800
And the way to think about this
link |
00:23:36.200
is for a forward collision mitigation braking system,
link |
00:23:39.320
if, half the time the driver missed a vehicle
link |
00:23:43.960
in front of it, it hit the brakes
link |
00:23:46.080
and brought the vehicle to a stop,
link |
00:23:47.680
that would be an incredible, incredible advance
link |
00:23:51.640
in safety on our roads, right?
link |
00:23:53.040
That would be equivalent to seat belts.
link |
00:23:55.000
But it would mean that if that vehicle
link |
00:23:56.600
wasn't being monitored, it would hit one out of two cars.
link |
00:24:00.600
And so economically, that's a perfectly good solution
link |
00:24:05.120
for a driver assistance system.
link |
00:24:06.280
What you should do at that point,
link |
00:24:07.240
if you can get it to work 50% of the time,
link |
00:24:09.240
is drive the cost out of that
link |
00:24:10.520
so you can get it on as many vehicles as possible.
link |
00:24:13.320
But driving the cost out of it
link |
00:24:14.760
doesn't drive up performance on the false negative case.
link |
00:24:18.800
And so you'll continue to not have a technology
link |
00:24:21.440
that could really be available for a self driven vehicle.
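
The arithmetic behind that argument, spelled out with hypothetical numbers (the human miss rate below is invented for illustration):

```python
# The same 50%-effective braking system is a big win layered on top of an
# attentive driver, and unacceptable as the only thing driving.
driver_miss_rate = 0.01      # hypothetical: fraction of hazards a human driver misses
system_catch_rate = 0.5      # the system brakes for half of the hazards

# As a driver-assistance overlay: a crash needs BOTH the driver and the system to miss.
assisted_crash_rate = driver_miss_rate * (1 - system_catch_rate)

# As the sole driver: a crash happens whenever the system misses.
unsupervised_crash_rate = 1 - system_catch_rate

print(f"human alone    : {driver_miss_rate:.3%} of hazards end in a crash")
print(f"human + system : {assisted_crash_rate:.3%}  (halved)")
print(f"system alone   : {unsupervised_crash_rate:.0%}   (hits one out of two)")
```
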
link |
00:24:25.680
So clearly the communication,
link |
00:24:28.440
and this probably applies to L4 vehicles as well,
link |
00:24:31.600
the marketing and communication
link |
00:24:34.440
of what the technology is actually capable of,
link |
00:24:37.040
how hard it is, how easy it is,
link |
00:24:38.400
all that kind of stuff is highly problematic.
link |
00:24:41.000
So say everybody in the world was communicated to perfectly
link |
00:24:45.640
and were made to be completely aware
link |
00:24:48.400
of every single technology out there,
link |
00:24:50.000
what it's able to do.
link |
00:24:52.840
What's your intuition?
link |
00:24:54.120
And now we're maybe getting into philosophical ground.
link |
00:24:56.880
Is it possible to have a level two vehicle
link |
00:25:00.000
where we don't over trust it?
link |
00:25:04.680
I don't think so.
link |
00:25:05.800
If people truly understood the risks and internalized it,
link |
00:25:11.160
then sure, you could do that safely.
link |
00:25:14.320
But that's a world that doesn't exist.
link |
00:25:16.160
The people are going to,
link |
00:25:18.720
if the facts are put in front of them,
link |
00:25:20.760
they're gonna then combine that with their experience.
link |
00:25:24.440
And let's say they're using an L2 system
link |
00:25:28.360
and they go up and down the 101 every day
link |
00:25:30.800
and they do that for a month.
link |
00:25:32.720
And it just worked every day for a month.
link |
00:25:36.200
Like that's pretty compelling at that point,
link |
00:25:39.000
just even if you know the statistics,
link |
00:25:41.800
you're like, well, I don't know,
link |
00:25:43.400
maybe there's something funny about those.
link |
00:25:44.760
Maybe they're driving in difficult places.
link |
00:25:46.920
Like I've seen it with my own eyes, it works.
link |
00:25:49.840
And the problem is that that sample size that they have,
link |
00:25:52.400
so it's 30 miles up and down,
link |
00:25:53.880
so 60 miles times 30 days,
link |
00:25:56.360
so 60, 180, 1,800 miles.
link |
00:25:58.720
Like that's a drop in the bucket
link |
00:26:03.280
compared to the, what, 85 million miles between fatalities.
link |
00:26:07.640
And so they don't really have a true estimate
link |
00:26:11.400
based on their personal experience of the real risks,
link |
00:26:14.440
but they're gonna trust it anyway,
link |
00:26:15.640
because it's hard not to.
link |
00:26:16.480
It worked for a month, what's gonna change?
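
The arithmetic behind "a drop in the bucket," using the figures quoted above:

```python
# A month of commuting gives essentially zero statistical information about
# fatality-level risk, yet it feels compelling.
daily_miles = 60                      # 30 miles each way on the 101
personal_sample = daily_miles * 30    # one month: 1,800 miles
miles_per_fatality = 85_000_000       # rough figure quoted above

fraction = personal_sample / miles_per_fatality
print(f"{personal_sample} miles is {fraction:.6%} of the distance between fatalities")
# roughly 0.002% of the mileage over which a fatality-rate estimate even makes sense
```
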
link |
00:26:18.640
So even if you start with a perfect understanding of the system,
link |
00:26:21.640
your own experience will make it drift.
link |
00:26:24.160
I mean, that's a big concern.
link |
00:26:25.920
Over a year, over two years even,
link |
00:26:28.160
it doesn't have to be months.
link |
00:26:29.440
And I think that as this technology moves
link |
00:26:32.920
from what I would say is kind of the more technology savvy
link |
00:26:37.760
ownership group to the mass market,
link |
00:26:42.640
you may be able to have some of those folks
link |
00:26:44.600
who are really familiar with technology,
link |
00:26:46.280
they may be able to internalize it better.
link |
00:26:48.840
And your kind of immunization
link |
00:26:50.800
against this kind of false risk assessment
link |
00:26:53.360
might last longer,
link |
00:26:54.280
but as folks who aren't as savvy about that
link |
00:26:58.680
read the material and they compare that
link |
00:27:00.880
to their personal experience,
link |
00:27:02.160
I think there it's going to move more quickly.
link |
00:27:08.160
So your work, the program that you've created at Google
link |
00:27:11.280
and now at Aurora is focused more on the second path
link |
00:27:16.600
of creating full autonomy.
link |
00:27:18.480
So it's such a fascinating,
link |
00:27:20.880
I think it's one of the most interesting AI problems
link |
00:27:24.560
of the century, right?
link |
00:27:25.600
It's, I just talked to a lot of people,
link |
00:27:28.280
just regular people, I don't know,
link |
00:27:29.440
my mom, about autonomous vehicles,
link |
00:27:31.720
and you begin to grapple with ideas
link |
00:27:34.520
of giving your life control over to a machine.
link |
00:27:38.080
It's philosophically interesting,
link |
00:27:40.040
it's practically interesting.
link |
00:27:41.760
So let's talk about safety.
link |
00:27:43.720
How do you think we demonstrate,
link |
00:27:46.240
you've spoken about metrics in the past,
link |
00:27:47.880
how do you think we demonstrate to the world
link |
00:27:51.880
that an autonomous vehicle, an Aurora system is safe?
link |
00:27:56.160
This is one where it's difficult
link |
00:27:57.320
because there isn't a soundbite answer.
link |
00:27:59.280
That we have to show a combination of work
link |
00:28:05.960
that was done diligently and thoughtfully,
link |
00:28:08.360
and this is where something like a functional safety process
link |
00:28:10.840
is part of that.
link |
00:28:11.680
It's like here's the way we did the work,
link |
00:28:15.280
that means that we were very thorough.
link |
00:28:17.160
So if you believe that what we said
link |
00:28:20.040
about this is the way we did it,
link |
00:28:21.440
then you can have some confidence
link |
00:28:22.720
that we were thorough in the engineering work
link |
00:28:25.200
we put into the system.
link |
00:28:26.920
And then on top of that,
link |
00:28:28.920
to kind of demonstrate that we weren't just thorough,
link |
00:28:32.000
we were actually good at what we did,
link |
00:28:35.280
there'll be a kind of a collection of evidence
link |
00:28:38.200
in terms of demonstrating that the capabilities
link |
00:28:40.440
worked the way we thought they did,
link |
00:28:42.920
statistically and to whatever degree
link |
00:28:45.320
we can demonstrate that,
link |
00:28:48.160
both in some combination of simulations,
link |
00:28:50.320
some combination of unit testing
link |
00:28:53.080
and decomposition testing,
link |
00:28:54.640
and then some part of it will be on road data.
link |
00:28:58.160
And I think the way we'll ultimately
link |
00:29:02.680
convey this to the public
link |
00:29:04.000
is there'll be clearly some conversation
link |
00:29:06.760
with the public about it,
link |
00:29:08.200
but we'll kind of invoke the trusted nodes,
link |
00:29:12.040
and that we'll spend more time
link |
00:29:13.880
being able to go into more depth with folks like NHTSA
link |
00:29:17.280
and other federal and state regulatory bodies
link |
00:29:19.720
and kind of given that they are
link |
00:29:22.080
operating in the public interest and they're trusted,
link |
00:29:26.240
that if we can show enough work to them
link |
00:29:28.640
that they're convinced,
link |
00:29:30.000
then I think we're in a pretty good place.
link |
00:29:33.800
That means you work with people
link |
00:29:35.000
that are essentially experts at safety
link |
00:29:36.920
to try to discuss and show.
link |
00:29:39.000
Do you think, the answer's probably no,
link |
00:29:41.720
but just in case,
link |
00:29:42.920
do you think there exists a metric?
link |
00:29:44.360
So currently people have been using
link |
00:29:46.320
number of disengagements.
link |
00:29:48.200
And it quickly turns into a marketing scheme
link |
00:29:50.120
where you alter the experiments you run to adjust it.
link |
00:29:54.280
I think you've spoken about how you don't like it.
link |
00:29:56.280
Don't love it.
link |
00:29:57.120
No, in fact, I was on the record telling DMV
link |
00:29:59.680
that I thought this was not a great metric.
link |
00:30:01.960
Do you think it's possible to create a metric,
link |
00:30:05.280
a number that could demonstrate safety
link |
00:30:09.440
outside of fatalities?
link |
00:30:12.320
So I do.
link |
00:30:13.440
And I think that it won't be just one number.
link |
00:30:17.600
So as we are internally grappling with this,
link |
00:30:21.280
and at some point we'll be able to talk
link |
00:30:23.560
more publicly about it,
link |
00:30:25.040
is how do we think about human performance
link |
00:30:28.520
in different tasks,
link |
00:30:29.840
say detecting traffic lights
link |
00:30:32.160
or safely making a left turn across traffic?
link |
00:30:37.680
And what do we think the failure rates are
link |
00:30:40.080
for those different capabilities for people?
link |
00:30:42.520
And then demonstrating to ourselves
link |
00:30:44.760
and then ultimately folks in the regulatory role
link |
00:30:48.480
and then ultimately the public
link |
00:30:50.760
that we have confidence that our system
link |
00:30:52.400
will work better than that.
link |
00:30:54.760
And so these individual metrics
link |
00:30:57.040
will kind of tell a compelling story ultimately.
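
One illustrative way such a per-capability comparison could be framed, with invented counts and an assumed human baseline; this is not Aurora's methodology, just a standard statistical sketch.

```python
# Given k observed failures in n trials of a capability (say, detecting a traffic
# light), compute a conservative upper bound on the failure rate and compare it
# to an assumed human failure rate for the same task.
from scipy.stats import beta

def failure_rate_upper_bound(failures, trials, confidence=0.95):
    """One-sided Clopper-Pearson upper bound on a binomial failure probability."""
    if failures == trials:
        return 1.0
    return beta.ppf(confidence, failures + 1, trials - failures)

human_rate = 1e-4                 # hypothetical human miss rate for this task
bound = failure_rate_upper_bound(failures=2, trials=100_000)

print(f"95% upper bound on system failure rate: {bound:.2e}")
print("better than human baseline" if bound < human_rate else "not demonstrated yet")
```
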
link |
00:31:01.760
I do think at the end of the day
link |
00:31:03.920
what we care about in terms of safety
link |
00:31:06.640
is life saved and injuries reduced.
link |
00:31:12.160
And then ultimately kind of casualty dollars
link |
00:31:16.440
that people aren't having to pay to get their car fixed.
link |
00:31:19.360
And I do think that in aviation
link |
00:31:22.680
they look at a kind of an event pyramid
link |
00:31:25.880
where a crash is at the top of that
link |
00:31:28.600
and that's the worst event obviously
link |
00:31:30.440
and then there's injuries and near miss events and whatnot
link |
00:31:34.240
and violation of operating procedures
link |
00:31:37.320
and you kind of build a statistical model
link |
00:31:40.160
of the relevance of the low severity things
link |
00:31:44.440
or the high severity things.
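
A sketch of that event-pyramid extrapolation, with invented rates and an assumed ratio; the point is only that frequent low-severity events can be used to estimate the rate of the rare severe event you cannot observe directly.

```python
# Heinrich-triangle-style extrapolation with made-up numbers.
observed = {"hard_brakes": 400, "near_misses": 25}   # per 1,000,000 miles, hypothetical
severe_per_near_miss = 1 / 600                       # assumed ratio from human-driving data

estimated_severe_rate = observed["near_misses"] * severe_per_near_miss  # per 1e6 miles
miles_per_severe_event = 1_000_000 / estimated_severe_rate

print(f"estimated severe events per million miles: {estimated_severe_rate:.3f}")
print(f"roughly one severe event per {miles_per_severe_event:,.0f} miles")
```
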
link |
00:31:45.280
And I think that's something
link |
00:31:46.120
where we'll be able to look at as well
link |
00:31:48.200
because an event per 85 million miles
link |
00:31:51.840
is statistically a difficult thing
link |
00:31:54.440
even at the scale of the U.S.
link |
00:31:56.800
to kind of compare directly.
link |
00:31:59.360
And that event fatality that's connected
link |
00:32:02.240
to an autonomous vehicle is significantly
link |
00:32:07.440
at least currently magnified
link |
00:32:09.160
in the amount of attention it gets.
link |
00:32:12.320
So that speaks to public perception.
link |
00:32:15.080
I think the most popular topic
link |
00:32:16.720
about autonomous vehicles in the public
link |
00:32:19.480
is the trolley problem formulation, right?
link |
00:32:23.080
Which has, let's not get into that too much
link |
00:32:27.000
but is misguided in many ways.
link |
00:32:29.600
But it speaks to the fact that people are grappling
link |
00:32:32.320
with this idea of giving control over to a machine.
link |
00:32:36.160
So how do you win the hearts and minds of the people
link |
00:32:41.560
that autonomy is something that could be a part
link |
00:32:44.600
of their lives?
link |
00:32:45.520
I think you let them experience it, right?
link |
00:32:47.640
I think it's right.
link |
00:32:50.440
I think people should be skeptical.
link |
00:32:52.800
I think people should ask questions.
link |
00:32:55.680
I think they should doubt
link |
00:32:57.000
because this is something new and different.
link |
00:33:00.120
They haven't touched it yet.
link |
00:33:01.880
And I think that's perfectly reasonable.
link |
00:33:03.640
And, but at the same time,
link |
00:33:07.320
it's clear there's an opportunity to make the road safer.
link |
00:33:09.320
It's clear that we can improve access to mobility.
link |
00:33:12.440
It's clear that we can reduce the cost of mobility.
link |
00:33:16.640
And that once people try that
link |
00:33:19.480
and understand that it's safe
link |
00:33:22.720
and are able to use it in their daily lives,
link |
00:33:24.440
I think it's one of these things
link |
00:33:25.280
that will just be obvious.
link |
00:33:28.040
And I've seen this practically in demonstrations
link |
00:33:32.240
that I've given where I've had people come in
link |
00:33:35.560
and they're very skeptical.
link |
00:33:38.840
Again, in a vehicle, my favorite one
link |
00:33:40.440
is taking somebody out on the freeway
link |
00:33:42.560
and we're on the 101 driving at 65 miles an hour.
link |
00:33:46.000
And after 10 minutes, they kind of turn and ask,
link |
00:33:48.400
is that all it does?
link |
00:33:49.480
And you're like, it's a self driving car.
link |
00:33:52.080
I'm not sure exactly what you thought it would do, right?
link |
00:33:54.840
But it becomes mundane,
link |
00:33:58.840
which is exactly what you want a technology
link |
00:34:01.480
like this to be, right?
link |
00:34:02.720
We don't really, when I turn the light switch on in here,
link |
00:34:07.280
I don't think about the complexity of those electrons
link |
00:34:12.000
being pushed down a wire from wherever it was
link |
00:34:14.200
and being generated.
link |
00:34:15.240
It's like, I just get annoyed if it doesn't work, right?
link |
00:34:19.080
And what I value is the fact
link |
00:34:21.400
that I can do other things in this space.
link |
00:34:23.080
I can see my colleagues.
link |
00:34:24.560
I can read stuff on a paper.
link |
00:34:26.160
I can not be afraid of the dark.
link |
00:34:30.360
And I think that's what we want this technology to be like
link |
00:34:33.320
is it's in the background
link |
00:34:34.640
and people get to have those life experiences
link |
00:34:37.120
and do so safely.
link |
00:34:38.440
So putting this technology in the hands of people
link |
00:34:42.160
speaks to scale of deployment, right?
link |
00:34:46.320
So what do you think, the dreaded question about the future,
link |
00:34:50.880
because nobody can predict the future,
link |
00:34:53.560
but just maybe speak poetically
link |
00:34:57.240
about when do you think we'll see a large scale deployment
link |
00:35:00.880
of autonomous vehicles, 10,000, those kinds of numbers?
link |
00:35:06.680
We'll see that within 10 years.
link |
00:35:09.240
I'm pretty confident.
link |
00:35:14.040
What's an impressive scale?
link |
00:35:16.040
What moment, so you've done the DARPA challenge
link |
00:35:19.200
where there's one vehicle.
link |
00:35:20.440
At which moment does it become, wow, this is serious scale?
link |
00:35:23.960
So I think the moment it gets serious
link |
00:35:26.520
is when we really do have a driverless vehicle
link |
00:35:32.240
operating on public roads
link |
00:35:35.000
and that we can do that kind of continuously.
link |
00:35:37.960
Without a safety driver.
link |
00:35:38.880
Without a safety driver in the vehicle.
link |
00:35:40.440
I think at that moment,
link |
00:35:41.560
we've kind of crossed the zero to one threshold.
link |
00:35:45.920
And then it is about how do we continue to scale that?
link |
00:35:50.200
How do we build the right business models?
link |
00:35:53.960
How do we build the right customer experience around it
link |
00:35:56.320
so that it is actually a useful product out in the world?
link |
00:36:00.960
And I think that is really,
link |
00:36:03.600
at that point it moves from
link |
00:36:05.920
what is this kind of mixed science engineering project
link |
00:36:09.200
into engineering and commercialization
link |
00:36:12.360
and really starting to deliver on the value
link |
00:36:15.840
that we all see here and actually making that real in the world.
link |
00:36:20.680
What do you think that deployment looks like?
link |
00:36:22.240
Where do we first see the inkling of no safety driver,
link |
00:36:26.440
one or two cars here and there?
link |
00:36:28.600
Is it on the highway?
link |
00:36:29.800
Is it in specific routes in the urban environment?
link |
00:36:33.160
I think it's gonna be urban, suburban type environments.
link |
00:36:37.880
Yeah, with Aurora, when we thought about how to tackle this,
link |
00:36:41.560
it was kind of in vogue to think about trucking
link |
00:36:46.040
as opposed to urban driving.
link |
00:36:47.800
And again, the human intuition around this
link |
00:36:51.280
is that freeways are easier to drive on
link |
00:36:57.080
because everybody's kind of going in the same direction
link |
00:36:59.280
and lanes are a little wider, et cetera.
link |
00:37:01.560
And I think that that intuition is pretty good,
link |
00:37:03.320
except we don't really care about most of the time.
link |
00:37:06.040
We care about all of the time.
link |
00:37:08.400
And when you're driving on a freeway with a truck,
link |
00:37:10.880
say 70 miles an hour,
link |
00:37:14.600
and you've got a 70,000 pound load with you,
link |
00:37:16.240
that's just an incredible amount of kinetic energy.
link |
00:37:18.880
And so when that goes wrong, it goes really wrong.
link |
00:37:22.640
And those challenges that you see occur more rarely,
link |
00:37:27.800
so you don't get to learn as quickly.
link |
00:37:31.120
And they're incrementally more difficult than urban driving,
link |
00:37:34.720
but they're not easier than urban driving.
link |
00:37:37.440
And so I think this happens in moderate speed
link |
00:37:41.640
urban environments because if two vehicles crash
link |
00:37:45.280
at 25 miles per hour, it's not good,
link |
00:37:48.120
but probably everybody walks away.
link |
00:37:51.080
And those events where there's the possibility
link |
00:37:53.720
for that occurring happen frequently.
link |
00:37:55.800
So we get to learn more rapidly.
link |
00:37:58.000
We get to do that with lower risk for everyone.
link |
00:38:02.520
And then we can deliver value to people
link |
00:38:04.360
that need to get from one place to another.
link |
00:38:05.880
And once we've got that solved,
link |
00:38:08.160
then the freeway driving part of this just falls out.
link |
00:38:11.320
But we're able to learn more safely,
link |
00:38:13.080
more quickly in the urban environment.
link |
00:38:15.200
So 10 years, and then scale 20, 30 years,
link |
00:38:18.760
who knows if a sufficiently compelling experience
link |
00:38:22.040
is created, it could be faster or slower.
link |
00:38:24.400
Do you think there could be breakthroughs
link |
00:38:27.160
and what kind of breakthroughs might there be
link |
00:38:29.920
that completely change that timeline?
link |
00:38:32.400
Again, not only am I asking you to predict the future,
link |
00:38:35.360
I'm asking you to predict breakthroughs
link |
00:38:37.360
that haven't happened yet.
link |
00:38:38.360
So what's the, I think another way to ask that
link |
00:38:41.440
would be if I could wave a magic wand,
link |
00:38:44.320
what part of the system would I make work today
link |
00:38:46.720
to accelerate it as quickly as possible?
link |
00:38:52.120
Don't say infrastructure, please don't say infrastructure.
link |
00:38:54.200
No, it's definitely not infrastructure.
link |
00:38:56.320
It's really that perception forecasting capability.
link |
00:39:00.600
So if tomorrow you could give me a perfect model
link |
00:39:04.840
of what's happened, what is happening
link |
00:39:06.960
and what will happen for the next five seconds
link |
00:39:10.360
around a vehicle on the roadway,
link |
00:39:13.040
that would accelerate things pretty dramatically.
link |
00:39:15.360
Are you, in terms of staying up at night,
link |
00:39:17.600
are you mostly bothered by cars, pedestrians or cyclists?
link |
00:39:21.760
So I worry most about the vulnerable road users
link |
00:39:25.960
about the combination of cyclists and cars, right?
link |
00:39:28.480
Or cyclists and pedestrians because they're not in armor.
link |
00:39:31.960
The cars, they're bigger, they've got protection
link |
00:39:36.480
for the people and so the ultimate risk is lower there.
link |
00:39:41.080
Whereas a pedestrian or a cyclist,
link |
00:39:43.240
they're out on the road and they don't have any protection
link |
00:39:46.480
and so we need to pay extra attention to that.
link |
00:39:49.720
Do you think about a very difficult technical challenge
link |
00:39:55.720
of the fact that pedestrians,
link |
00:39:58.520
if you try to protect pedestrians
link |
00:40:00.240
by being careful and slow, they'll take advantage of that.
link |
00:40:04.560
So the game theoretic dance, does that worry you
link |
00:40:09.040
of how, from a technical perspective, how we solve that?
link |
00:40:12.480
Because as humans, the way we solve that
link |
00:40:14.560
is kind of nudge our way through the pedestrians
link |
00:40:17.240
which doesn't feel, from a technical perspective,
link |
00:40:20.000
as an appropriate algorithm.
link |
00:40:23.200
But do you think about how we solve that problem?
link |
00:40:25.920
Yeah, I think there's two different concepts there.
link |
00:40:31.360
So one is, am I worried that because these vehicles
link |
00:40:35.820
are self driving, people will kind of step in the road
link |
00:40:37.600
and take advantage of them?
link |
00:40:38.640
And I've heard this and I don't really believe it
link |
00:40:43.760
because if I'm driving down the road
link |
00:40:45.960
and somebody steps in front of me, I'm going to stop.
link |
00:40:50.600
Even if I'm annoyed, I'm not gonna just drive
link |
00:40:53.660
through a person standing in the road.
link |
00:40:56.400
And so I think today people can take advantage of this
link |
00:41:00.400
and you do see some people do it.
link |
00:41:02.560
I guess there's an incremental risk
link |
00:41:04.180
because maybe they have lower confidence
link |
00:41:05.880
that I'm gonna see them than they might have
link |
00:41:07.720
for an automated vehicle and so maybe that shifts
link |
00:41:10.400
it a little bit.
link |
00:41:12.040
But I think people don't wanna get hit by cars.
link |
00:41:14.360
And so I think that I'm not that worried
link |
00:41:17.080
about people walking out into the 101
link |
00:41:18.760
and creating chaos more than they would today.
link |
00:41:24.400
Regarding kind of the nudging through a big stream
link |
00:41:27.040
of pedestrians leaving a concert or something,
link |
00:41:30.040
I think that is further down the technology pipeline.
link |
00:41:33.520
I think that you're right, that's tricky.
link |
00:41:36.960
I don't think it's necessarily,
link |
00:41:40.360
I think the algorithm people use for this is pretty simple.
link |
00:41:43.600
It's kind of just move forward slowly
link |
00:41:44.800
and if somebody's really close then stop.
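
That simple rule, written out as code, a caricature with invented thresholds rather than anything from a production planner:

```python
# "Move forward slowly, and if somebody's really close, stop."
CREEP_SPEED = 0.5      # m/s, walking-pace creep through the crowd
STOP_DISTANCE = 1.0    # m, stop if any pedestrian is closer than this

def creep_speed_command(pedestrian_distances_m):
    """Return the commanded speed given distances to nearby pedestrians."""
    if not pedestrian_distances_m:
        return CREEP_SPEED
    return 0.0 if min(pedestrian_distances_m) < STOP_DISTANCE else CREEP_SPEED

# e.g. nearest pedestrian 0.8 m away -> stop; 2.5 m away -> keep creeping
assert creep_speed_command([0.8, 3.0]) == 0.0
assert creep_speed_command([2.5]) == CREEP_SPEED
```
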
link |
00:41:46.800
And I think that that probably can be replicated
link |
00:41:50.880
pretty easily and particularly given that
link |
00:41:54.040
you don't do this at 30 miles an hour,
link |
00:41:55.720
you do it at one, that even in those situations
link |
00:41:59.080
the risk is relatively minimal.
link |
00:42:01.200
But it's not something we're thinking about
link |
00:42:03.640
in any serious way.
link |
00:42:04.560
And probably that's less an algorithm problem
link |
00:42:07.920
and more creating a human experience.
link |
00:42:10.160
So the HCI people can create a visual display
link |
00:42:14.300
so that you, as a pedestrian, are pleasantly
link |
00:42:16.260
nudged out of the way, that's an experience problem,
link |
00:42:20.760
not an algorithm problem.
link |
00:42:22.880
Who's the main competitor to Aurora today?
link |
00:42:25.480
And how do you outcompete them in the long run?
link |
00:42:28.640
So we really focus a lot on what we're doing here.
link |
00:42:31.200
I think that, I've said this a few times,
link |
00:42:34.480
that this is a huge difficult problem
link |
00:42:37.960
and it's great that a bunch of companies are tackling it
link |
00:42:40.320
because I think it's so important for society
link |
00:42:42.320
that somebody gets there.
link |
00:42:43.800
So we don't spend a whole lot of time
link |
00:42:49.120
thinking tactically about who's out there
link |
00:42:51.600
and how do we beat that person individually.
link |
00:42:55.240
What are we trying to do to go faster ultimately?
link |
00:42:59.760
Well part of it is the leadership team we have
link |
00:43:02.640
has got pretty tremendous experience.
link |
00:43:04.200
And so we kind of understand the landscape
link |
00:43:06.440
and understand where the cul de sacs are to some degree
link |
00:43:09.160
and we try and avoid those.
link |
00:43:10.980
I think there's a part of it,
link |
00:43:14.260
just this great team we've built.
link |
00:43:16.260
People, this is a technology and a company
link |
00:43:19.080
that people believe in the mission of
link |
00:43:22.320
and so it allows us to attract
link |
00:43:23.740
just awesome people to go work.
link |
00:43:26.800
We've got a culture I think that people appreciate
link |
00:43:29.320
that allows them to focus,
link |
00:43:30.460
allows them to really spend time solving problems.
link |
00:43:33.120
And I think that keeps them energized.
link |
00:43:35.900
And then we've
link |
00:43:38.940
invested heavily in the infrastructure
link |
00:43:43.500
and architectures that we think will ultimately accelerate us.
link |
00:43:46.540
So because of the folks we're able to bring in early on,
link |
00:43:50.660
because of the great investors we have,
link |
00:43:53.540
we don't spend all of our time doing demos
link |
00:43:56.780
and kind of leaping from one demo to the next.
link |
00:43:58.660
We've been given the freedom to invest in
link |
00:44:03.940
infrastructure to do machine learning,
link |
00:44:05.500
infrastructure to pull data from our on road testing,
link |
00:44:08.600
infrastructure to use that to accelerate engineering.
link |
00:44:11.500
And I think that early investment
link |
00:44:14.480
and continuing investment in those kind of tools
link |
00:44:17.340
will ultimately allow us to accelerate
link |
00:44:19.780
and do something pretty incredible.
link |
00:44:21.940
Chris, beautifully put.
link |
00:44:23.420
It's a good place to end.
link |
00:44:24.660
Thank you so much for talking today.
link |
00:44:26.500
Thank you very much. Really enjoyed it.