
George Hotz: Comma.ai, OpenPilot, and Autonomous Vehicles | Lex Fridman Podcast #31



link |
00:00:00.000
The following is a conversation with George Hotz.
link |
00:00:02.520
He's the founder of Comma AI,
link |
00:00:04.480
a machine learning based vehicle automation company.
link |
00:00:07.400
He is most certainly an outspoken personality
link |
00:00:10.200
in the field of AI and technology in general.
link |
00:00:13.160
He first gained recognition for being the first person
link |
00:00:16.240
to carrier unlock an iPhone.
link |
00:00:18.400
And since then, he's done quite a few interesting things
link |
00:00:21.280
at the intersection of hardware and software.
link |
00:00:24.400
This is the artificial intelligence podcast.
link |
00:00:27.440
If you enjoy it, subscribe on YouTube,
link |
00:00:29.560
give it five stars on iTunes, support it on Patreon,
link |
00:00:32.920
or simply connect with me on Twitter.
link |
00:00:34.920
Lex Fridman, spelled F R I D M A N.
link |
00:00:39.120
And I'd like to give a special thank you to Jennifer
link |
00:00:42.000
from Canada for her support of the podcast on Patreon.
link |
00:00:45.880
Merci beaucoup, Jennifer.
link |
00:00:47.720
She's been a friend and an engineering colleague
link |
00:00:50.600
for many years since I was in grad school.
link |
00:00:52.800
Your support means a lot and inspires me
link |
00:00:55.520
to keep this series going.
link |
00:00:57.920
And now here's my conversation with George Hotz.
link |
00:01:02.720
Do you think we're living in a simulation?
link |
00:01:06.480
Yes, but it may be unfalsifiable.
link |
00:01:10.080
What do you mean by unfalsifiable?
link |
00:01:12.440
So if the simulation is designed in such a way
link |
00:01:16.840
that they did like a formal proof
link |
00:01:19.640
to show that no information can get in and out.
link |
00:01:22.320
And if their hardware is designed such that anything
link |
00:01:25.200
in the simulation always keeps the hardware in spec,
link |
00:01:27.880
it may be impossible to prove
link |
00:01:29.480
whether we're in a simulation or not.
link |
00:01:32.600
So they've designed it such that it's a closed system,
link |
00:01:35.680
you can't get outside the system.
link |
00:01:37.200
Well, maybe it's one of three worlds.
link |
00:01:38.760
We're either in a simulation which can be exploited,
link |
00:01:41.400
we're in a simulation which not only can't be exploited,
link |
00:01:44.200
but like the same thing's true about VMs.
link |
00:01:46.440
A really well designed VM,
link |
00:01:48.160
you can't even detect if you're in a VM or not.
link |
00:01:51.400
That's brilliant.
link |
00:01:52.520
So we're, yeah, so the simulation is running
link |
00:01:55.160
on a virtual machine.
link |
00:01:56.800
But now, in reality, all VMs have ways to be detected.
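For a flavor of what "ways to detect" means in practice, here is a minimal sketch of one classic trick on Linux: hypervisors tend to leave fingerprints in the firmware's DMI strings. The paths and vendor strings are real, but this is illustrative only; a carefully designed VM could spoof all of it.

```python
# Minimal VM-detection sketch: look for hypervisor fingerprints in DMI/SMBIOS
# strings exposed under /sys on Linux. A well-designed VM can spoof these.
from pathlib import Path

DMI_FIELDS = ["sys_vendor", "product_name", "board_vendor"]
SIGNATURES = ["vmware", "virtualbox", "qemu", "kvm", "xen", "microsoft corporation"]

def looks_like_vm() -> bool:
    for field in DMI_FIELDS:
        p = Path("/sys/class/dmi/id") / field
        try:
            value = p.read_text().strip().lower()
        except OSError:
            continue  # field not readable; no signal either way
        if any(sig in value for sig in SIGNATURES):
            return True
    return False

if __name__ == "__main__":
    print("probably a VM" if looks_like_vm() else "no obvious VM fingerprint")
```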
link |
00:01:59.440
That's the point.
link |
00:02:00.280
I mean, is it, you've done quite a bit of hacking yourself.
link |
00:02:04.840
So you should know that really any complicated system
link |
00:02:08.640
will have ways in and out.
link |
00:02:11.000
So this isn't necessarily true going forward.
link |
00:02:15.280
I spent my time away from comma,
link |
00:02:18.080
I learned Coq. It's a dependently typed,
link |
00:02:21.240
like it's a language for writing math proofs.
link |
00:02:24.360
And if you write code that compiles in a language like that,
link |
00:02:28.200
it is correct by definition.
link |
00:02:30.840
The types check its correctness.
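A minimal illustration of that idea, in Lean rather than the Coq George used (same principle): the type is the theorem, and any term of that type that type-checks is the proof.

```lean
-- If this file type-checks, the proposition is proved: the type IS the theorem.
-- A proof term of type `p ∧ q → q ∧ p` is correct by construction.
theorem and_swap (p q : Prop) : p ∧ q → q ∧ p :=
  fun ⟨hp, hq⟩ => ⟨hq, hp⟩
```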
link |
00:02:33.560
So it's possible that the simulation
link |
00:02:35.000
is written in a language like this, in which case, yeah.
link |
00:02:39.640
Yeah, but can that be sufficiently expressive,
link |
00:02:42.680
a language like that?
link |
00:02:43.760
Oh, it can.
link |
00:02:44.600
It can be?
link |
00:02:45.440
Oh, yeah.
link |
00:02:46.280
Okay, well, so, all right, so.
link |
00:02:48.920
The simulation doesn't have to be Turing complete
link |
00:02:50.640
if it has a scheduled end date.
link |
00:02:52.320
Looks like it does actually with entropy.
link |
00:02:54.600
I mean, I don't think that a simulation
link |
00:02:58.520
that results in something as complicated as the universe
link |
00:03:03.080
would have a formal proof of correctness, right?
link |
00:03:08.240
It's possible, of course.
link |
00:03:09.880
We have no idea how good their tooling is.
link |
00:03:12.720
And we have no idea how complicated
link |
00:03:14.640
the universe computer really is.
link |
00:03:16.280
It may be quite simple.
link |
00:03:17.920
It's just very large, right?
link |
00:03:19.680
It's very, it's definitely very large.
link |
00:03:22.160
But the fundamental rules might be super simple.
link |
00:03:24.480
Yeah, Conway's Game of Life kind of stuff.
link |
00:03:26.240
Right, so if you could hack,
link |
00:03:30.320
so imagine the simulation that is hackable,
link |
00:03:32.400
if you could hack it,
link |
00:03:35.040
what would you change about the universe?
link |
00:03:37.960
Like how would you approach hacking a simulation?
link |
00:03:41.640
The reason I gave that talk?
link |
00:03:44.360
By the way, I'm not familiar with the talk you gave.
link |
00:03:46.680
I just read that you talked about escaping the simulation
link |
00:03:50.160
or something like that.
link |
00:03:51.280
So maybe you can tell me a little bit
link |
00:03:52.640
about the theme and the message there too.
link |
00:03:55.360
It wasn't a very practical talk
link |
00:03:57.680
about how to actually escape a simulation.
link |
00:04:00.600
It was more about a way of restructuring
link |
00:04:03.320
an us versus them narrative.
link |
00:04:05.120
If we continue on the path we're going with technology,
link |
00:04:12.360
I think we're in big trouble,
link |
00:04:14.160
like as a species and not just as a species,
link |
00:04:16.760
but even as me as an individual member of the species.
link |
00:04:19.480
So if we could change rhetoric to be more like,
link |
00:04:23.680
to think upwards,
link |
00:04:26.240
like to think about that we're in a simulation
link |
00:04:29.080
and how we could get out,
link |
00:04:30.360
already we'd be on the right path.
link |
00:04:32.640
What you actually do once you do that,
link |
00:04:34.800
well, I assume I would have acquired way more intelligence
link |
00:04:37.360
in the process of doing that, so I'll just ask that.
link |
00:04:39.760
So the thinking upwards,
link |
00:04:42.080
what kind of ideas,
link |
00:04:43.760
what kind of breakthrough ideas do you think thinking
link |
00:04:45.640
in that way could inspire?
link |
00:04:47.280
And why did you say upwards?
link |
00:04:49.800
Upwards.
link |
00:04:50.640
Into space?
link |
00:04:51.480
Are you thinking sort of exploration in all forms?
link |
00:04:54.120
The space narrative that held for the modernist generation
link |
00:04:59.880
doesn't hold as well for the postmodern generation.
link |
00:05:04.560
What's the space narrative?
link |
00:05:05.480
Are we talking about the same space?
link |
00:05:06.520
The three dimensional space?
link |
00:05:07.360
No, no, space, like going up to space,
link |
00:05:08.840
like building like Elon Musk,
link |
00:05:10.040
like we're going to build rockets,
link |
00:05:11.160
we're going to go to Mars,
link |
00:05:12.080
we're going to colonize the universe.
link |
00:05:13.560
And the narrative you're referring to,
link |
00:05:14.720
I was born in the Soviet Union,
link |
00:05:16.040
you're referring to the race to space?
link |
00:05:18.000
The race to space, yeah.
link |
00:05:18.840
Yes, explore, okay.
link |
00:05:19.680
That was a great modernist narrative.
link |
00:05:21.760
Yeah.
link |
00:05:23.360
It doesn't seem to hold the same weight in today's culture.
link |
00:05:27.640
I'm hoping for good postmodern narratives that replace it.
link |
00:05:32.160
So let's think, so you work a lot with AI.
link |
00:05:35.560
So AI is one formulation of that narrative.
link |
00:05:39.080
There could be also,
link |
00:05:40.080
I don't know how much you do in VR and AR.
link |
00:05:42.320
Yeah.
link |
00:05:43.160
That's another, I know less about it,
link |
00:05:45.160
but every time I play with it and our research,
link |
00:05:47.600
it's fascinating, that virtual world.
link |
00:05:49.640
Are you interested in the virtual world?
link |
00:05:51.840
I would like to move to virtual reality.
link |
00:05:55.360
In terms of your work?
link |
00:05:56.440
No, I would like to physically move there.
link |
00:05:58.760
The apartment I can rent in the cloud
link |
00:06:00.240
is way better than the apartment I can rent in the real world.
link |
00:06:03.240
Well, it's all relative, isn't it?
link |
00:06:04.760
Because others will have very nice apartments too,
link |
00:06:07.280
so you'll be inferior in the virtual world as well.
link |
00:06:09.200
But that's not how I view the world, right?
link |
00:06:11.320
I don't view the world.
link |
00:06:12.440
I mean, that's a very like, almost zero summish way
link |
00:06:15.640
to view the world.
link |
00:06:16.480
Say like, my great apartment isn't great
link |
00:06:18.800
because my neighbor has one too.
link |
00:06:20.400
No, my great apartment is great
link |
00:06:21.640
because like, look at this dishwasher, man.
link |
00:06:24.320
You just touch the dish and it's washed, right?
link |
00:06:26.640
And that is great in and of itself
link |
00:06:28.680
if I had the only apartment
link |
00:06:30.120
or if everybody had the apartment.
link |
00:06:31.520
I don't care.
link |
00:06:32.400
So you have fundamental gratitude.
link |
00:06:34.760
The world first learned of geohot, George Hotz,
link |
00:06:39.080
in August 2007, maybe before then,
link |
00:06:42.280
but certainly in August 2007
link |
00:06:44.080
when you were the first person to unlock,
link |
00:06:46.760
carrier unlock an iPhone.
link |
00:06:48.880
How did you get into hacking?
link |
00:06:50.520
What was the first system you discovered
link |
00:06:53.080
vulnerabilities for and broke into?
link |
00:06:56.240
So that was really kind of the first thing.
link |
00:07:01.640
I had a book in 2006 called Gray Hat Hacking.
link |
00:07:07.480
And I guess I realized that
link |
00:07:11.000
if you acquired these sort of powers
link |
00:07:13.480
you could control the world.
link |
00:07:16.160
But I didn't really know that much
link |
00:07:18.920
about computers back then.
link |
00:07:20.560
I started with electronics.
link |
00:07:22.120
The first iPhone hack was physical.
link |
00:07:24.200
Hardware.
link |
00:07:25.040
You had to open it up and pull an address line high.
link |
00:07:28.160
And it was because I didn't really know
link |
00:07:29.960
about software exploitation.
link |
00:07:31.320
I learned that all in the next few years
link |
00:07:32.960
and I got very good at it.
link |
00:07:33.920
But back then I knew about like
link |
00:07:36.560
how memory chips are connected to processors and stuff.
link |
00:07:38.920
But you knew about software and programming?
link |
00:07:41.040
I didn't know.
link |
00:07:43.200
Oh really, so your view of the world
link |
00:07:46.160
and computers was physical, was hardware.
link |
00:07:49.320
Actually, if you read the code that I released with that
link |
00:07:52.400
in August 2007, it's atrocious.
link |
00:07:55.760
What language was it?
link |
00:07:56.760
C.
link |
00:07:57.600
C, nice.
link |
00:07:58.440
And in a broken sort of state-machine-esque C.
link |
00:08:01.480
I didn't know how to program.
link |
00:08:02.960
Yeah.
link |
00:08:04.160
So how did you learn to program?
link |
00:08:07.520
What was your journey?
link |
00:08:08.440
I mean, we'll talk about it.
link |
00:08:10.040
You've live streamed some of your programming.
link |
00:08:12.680
This chaotic, beautiful mess.
link |
00:08:14.400
How did you arrive at that?
link |
00:08:16.480
Years and years of practice.
link |
00:08:18.640
I interned at Google after,
link |
00:08:22.240
the summer after the iPhone unlock.
link |
00:08:24.800
And I did a contract for them
link |
00:08:26.720
where I built hardware for Street View
link |
00:08:29.040
and I wrote a software library to interact with it.
link |
00:08:32.680
And it was terrible code.
link |
00:08:34.920
And for the first time I got feedback
link |
00:08:36.560
from people who I respected saying,
link |
00:08:38.760
no, like, don't write code like this.
link |
00:08:42.680
Now, of course, just getting that feedback is not enough.
link |
00:08:45.680
The way that I really got good was,
link |
00:08:51.000
I wanted to write this thing that could emulate
link |
00:08:54.800
and then visualize like arm binaries
link |
00:08:58.440
because I wanted to hack the iPhone better.
link |
00:09:00.040
And I didn't like that I couldn't see what the,
link |
00:09:01.960
I couldn't single step through the processor
link |
00:09:03.800
because I had no debugger on there,
link |
00:09:05.200
especially for the low level things like the boot ROM
link |
00:09:06.640
and the boot loader.
link |
00:09:07.480
So I tried to build this tool to do it.
link |
00:09:10.920
And I built the tool once and it was terrible.
link |
00:09:13.440
I built the tool a second time, it was terrible.
link |
00:09:15.120
I built the tool a third time.
link |
00:09:16.320
By the time I was at Facebook, it was kind of okay.
link |
00:09:18.600
And then I built the tool fourth time
link |
00:09:20.560
when I was a Google intern again in 2014.
link |
00:09:22.560
And that was the first time I was like,
link |
00:09:24.320
this is finally usable.
link |
00:09:25.880
How do you pronounce this, QIRA?
link |
00:09:27.120
QIRA, yeah.
link |
00:09:28.360
So it's essentially the most efficient way
link |
00:09:31.840
to visualize the change of state of the computer
link |
00:09:35.720
as the program is running.
link |
00:09:37.200
That's what you mean by debugger.
link |
00:09:38.920
Yeah, it's a timeless debugger.
link |
00:09:41.760
So you can rewind just as easily as going forward.
link |
00:09:45.080
Think about, if you're using GDB,
link |
00:09:46.280
you have to put a watch on a variable.
link |
00:09:47.880
If you want to see if that variable changes.
link |
00:09:49.680
In Kira, you can just click on that variable.
link |
00:09:51.480
And then it shows every single time
link |
00:09:53.880
when that variable was changed or accessed.
link |
00:09:56.520
Think about it like git for your computer, the run log.
link |
00:09:59.760
So there's like a deep log of the state of the computer
link |
00:10:05.640
as the program runs and you can rewind.
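A toy sketch of that record-everything idea, in Python: log every write as a (step, location, value) triple, then answer "show me every time this variable changed" and "rewind to step N" as pure queries over the log. This is an illustration of the concept, not QIRA's actual implementation.

```python
# Toy "timeless debugger": record every state change, then query history
# instead of re-running the program. Illustrative only, not QIRA's design.
from collections import defaultdict

class TimelessLog:
    def __init__(self):
        self.writes = defaultdict(list)  # location -> [(step, value), ...]
        self.step = 0

    def record_write(self, location, value):
        self.writes[location].append((self.step, value))
        self.step += 1

    def history(self, location):
        """Every time `location` changed: the 'click on a variable' query."""
        return self.writes[location]

    def value_at(self, location, step):
        """Rewind: what did `location` hold at a given step?"""
        val = None
        for s, v in self.writes[location]:
            if s > step:
                break
            val = v
        return val

log = TimelessLog()
for v in (1, 2, 3):
    log.record_write("x", v)
print(log.history("x"))      # [(0, 1), (1, 2), (2, 3)]
print(log.value_at("x", 1))  # 2
```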
link |
00:10:07.840
Why isn't that, maybe it is, maybe you can educate me.
link |
00:10:11.480
Why isn't that kind of debugging used more often?
link |
00:10:14.640
Because the tooling's bad.
link |
00:10:16.320
Well, two things.
link |
00:10:17.160
One, if you're trying to debug Chrome,
link |
00:10:19.360
Chrome is a 200 megabyte binary
link |
00:10:22.920
that runs slowly on desktops.
link |
00:10:25.440
So that's gonna be really hard to use for that.
link |
00:10:27.760
But it's really good to use for like CTFs
link |
00:10:30.160
and for boot ROMs and for small parts of code.
link |
00:10:33.200
So it's hard if you're trying to debug like massive systems.
link |
00:10:36.360
What's a CTF and what's a boot ROM?
link |
00:10:38.200
A boot ROM is the first code that executes
link |
00:10:40.480
the minute you give power to your iPhone.
link |
00:10:42.280
Okay.
link |
00:10:43.520
And CTF were these competitions that I played.
link |
00:10:46.040
Capture the flag.
link |
00:10:46.880
Capture the flag.
link |
00:10:47.720
I was gonna ask you about that.
link |
00:10:48.560
What are those, those look at,
link |
00:10:49.920
I watched a couple of videos on YouTube.
link |
00:10:51.440
Those look fascinating.
link |
00:10:52.920
What have you learned about maybe at the high level
link |
00:10:55.560
in the vulnerability of systems from these competitions?
link |
00:11:00.840
I feel like in the heyday of CTFs,
link |
00:11:04.200
you had all of the best security people in the world
link |
00:11:08.160
challenging each other and coming up
link |
00:11:10.720
with new toy exploitable things over here.
link |
00:11:13.640
And then everybody, okay, who can break it?
link |
00:11:15.400
And when you break it, you get like,
link |
00:11:17.160
there's like a file on the server called flag.
link |
00:11:19.360
And then there's a program running,
link |
00:11:20.960
listening on a socket that's vulnerable.
link |
00:11:22.680
So you write an exploit, you get a shell,
link |
00:11:25.000
and then you cat flag, and then you type the flag
link |
00:11:27.160
into like a web based scoreboard and you get points.
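In code, that pwnable loop is roughly the following sketch. The host, port, and payload are placeholders, since every challenge crafts its own exploit; this just shows the shape of the workflow.

```python
# The pwnable loop: connect to the vulnerable service, fire an exploit,
# use the resulting shell to read the flag, submit it for points.
import socket

HOST, PORT = "challenge.example.ctf", 31337  # hypothetical target
payload = b"A" * 64                          # a real exploit is carefully crafted

with socket.create_connection((HOST, PORT)) as s:
    s.sendall(payload + b"\n")  # trigger the vulnerability, pop a shell
    s.sendall(b"cat flag\n")    # read the file sitting on the server
    print(s.recv(4096).decode(errors="replace"))  # paste into the scoreboard
```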
link |
00:11:29.480
So the goal is essentially to find an exploit in the system
link |
00:11:33.000
that allows you to run shell,
link |
00:11:35.280
to run arbitrary code on that system.
link |
00:11:38.040
That's one of the categories.
link |
00:11:40.200
That's like the pwnable category.
link |
00:11:43.560
Pwnable?
link |
00:11:44.400
Yeah, pwnable.
link |
00:11:45.240
It's like, you know, you pwn the program.
link |
00:11:47.600
It's a program.
link |
00:11:48.440
Oh, yeah.
link |
00:11:51.760
You know, first of all, I apologize, I'm gonna say,
link |
00:11:55.360
it's because I'm Russian,
link |
00:11:56.280
but maybe you can help educate me.
link |
00:12:00.120
Some video game like misspelled
link |
00:12:01.680
"to own" way back in the day.
link |
00:12:02.840
Yeah, and it's just,
link |
00:12:04.880
I wonder if there's a definition
link |
00:12:06.280
and I'll have to go to Urban Dictionary for it.
link |
00:12:08.000
Yeah, it'd be interesting to see what it says.
link |
00:12:09.800
Okay, so what was the heyday of CTF, by the way,
link |
00:12:12.760
but was it, what decade are we talking about?
link |
00:12:15.480
I think like, I mean, maybe I'm biased
link |
00:12:18.400
because it's the era that I played,
link |
00:12:21.120
but like 2011 to 2015,
link |
00:12:27.200
because the modern CTF scene
link |
00:12:30.320
is similar to the modern competitive programming scene.
link |
00:12:32.640
You have people who like do drills.
link |
00:12:34.280
You have people who practice.
link |
00:12:35.880
And then once you've done that,
link |
00:12:37.040
you've turned it less into a game of generic computer skill
link |
00:12:40.040
and more into a game of, okay, you memorize,
link |
00:12:42.440
you drill on these five categories.
link |
00:12:45.760
And then before that, it wasn't,
link |
00:12:48.920
it didn't have like as much attention as it had.
link |
00:12:52.800
I don't know, they were like,
link |
00:12:53.640
I won $30,000 once in Korea
link |
00:12:55.200
for one of these competitions.
link |
00:12:56.120
Holy crap.
link |
00:12:56.960
Yeah, they were, they were, that was...
link |
00:12:57.920
So that means, I mean, money is money,
link |
00:12:59.520
but that means there was probably good people there.
link |
00:13:02.320
Exactly, yeah.
link |
00:13:03.600
Are the challenges human constructed
link |
00:13:06.800
or are they grounded in some real flaws in real systems?
link |
00:13:10.760
Usually they're human constructed,
link |
00:13:13.080
but they're usually inspired by real flaws.
link |
00:13:15.760
What kind of systems, as you imagine,
link |
00:13:17.320
is it really focused on mobile?
link |
00:13:19.080
Like what has vulnerabilities these days?
link |
00:13:20.920
Is it primarily mobile systems like Android?
link |
00:13:25.120
Oh, everything does.
link |
00:13:26.680
Yeah, of course.
link |
00:13:28.120
The price has kind of gone up
link |
00:13:29.360
because fewer and fewer people can find them.
link |
00:13:31.280
And what's happened in security is now,
link |
00:13:33.160
if you want to like jailbreak an iPhone,
link |
00:13:34.560
you don't need one exploit anymore, you need nine.
link |
00:13:37.960
Nine chained together?
link |
00:13:39.160
What do you mean?
link |
00:13:40.000
Yeah, wow.
link |
00:13:40.840
Okay, so it's really, what's the benefit?
link |
00:13:44.800
Speaking higher level philosophically about hacking.
link |
00:13:48.240
I mean, it sounds from everything I've seen about you,
link |
00:13:50.400
you just love the challenge and you don't want to do anything.
link |
00:13:55.040
You don't want to bring that exploit out into the world
link |
00:13:58.120
and do any actual, let it run wild.
link |
00:14:01.680
You just want to solve it
link |
00:14:02.760
and then you go on to the next thing.
link |
00:14:05.400
Oh yeah, I mean, doing criminal stuff's not really worth it.
link |
00:14:08.440
And I'll actually use the same argument
link |
00:14:10.520
for why I don't do defense for why I don't do crime.
link |
00:14:15.440
If you want to defend a system,
link |
00:14:16.840
say the system has 10 holes, right?
link |
00:14:19.280
If you find nine of those holes as a defender,
link |
00:14:22.240
you still lose because the attacker gets in
link |
00:14:24.240
through the last one.
link |
00:14:25.520
If you're an attacker,
link |
00:14:26.360
you only have to find one out of the 10.
link |
00:14:28.720
But if you're a criminal,
link |
00:14:30.760
if you log on with a VPN nine out of the 10 times,
link |
00:14:34.800
but one time you forget, you're done.
link |
00:14:37.760
Because you're caught, okay.
link |
00:14:39.400
Because you only have to mess up once
link |
00:14:41.160
to be caught as a criminal.
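The asymmetry he's describing is easy to put in numbers. A small sketch, under the assumption that each operation is independent and carries the same per-operation probability of a slip-up:

```python
# A defender patching 9 of 10 holes still loses to the one that's left,
# and a criminal who slips with probability p per operation is eventually
# caught. Assumes independent operations.
def prob_caught(p_slip: float, n_ops: int) -> float:
    return 1 - (1 - p_slip) ** n_ops

for n in (10, 100, 1000):
    print(n, f"{prob_caught(0.1, n):.6f}")
# 10   0.651322   -- forget the VPN "one time out of ten"
# 100  0.999973
# 1000 1.000000   (to six decimal places)
```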
link |
00:14:42.920
That's why I'm not a criminal.
link |
00:14:45.920
But okay, let me,
link |
00:14:47.080
cause I was having a discussion with somebody
link |
00:14:49.520
just at a high level about nuclear weapons,
link |
00:14:52.440
actually, why haven't we blown ourselves up yet?
link |
00:14:56.240
And my feeling is all the smart people in the world,
link |
00:14:59.840
if you look at the distribution of smart people,
link |
00:15:04.120
smart people are generally good.
link |
00:15:06.760
And then this other person,
link |
00:15:07.680
I was talking to Sean Carroll, the physicist,
link |
00:15:09.480
and he was saying, no, good and bad people
link |
00:15:11.400
are evenly distributed amongst everybody.
link |
00:15:14.080
My sense was good hackers are in general good people
link |
00:15:18.080
and they don't want to mess with the world.
link |
00:15:20.400
What's your sense?
link |
00:15:21.920
I'm not even sure about that.
link |
00:15:25.920
Like, I have a nice life.
link |
00:15:30.520
Crime wouldn't get me anything.
link |
00:15:34.320
But if you're good and you have these skills,
link |
00:15:36.520
you probably have a nice life too, right?
link |
00:15:38.720
Right, you can use it for other things.
link |
00:15:40.160
But is there an ethical,
link |
00:15:41.120
is there a little voice in your head that says,
link |
00:15:46.120
well, yeah, if you could hack something
link |
00:15:49.040
to where you could hurt people
link |
00:15:52.840
and you could earn a lot of money doing it though,
link |
00:15:54.960
not hurt physically perhaps,
link |
00:15:56.320
but disrupt their life in some kind of way.
link |
00:16:00.200
Isn't there a little voice that says,
link |
00:16:03.360
Well, two things.
link |
00:16:04.560
One, I don't really care about money.
link |
00:16:06.800
So like the money wouldn't be an incentive.
link |
00:16:08.680
The thrill might be an incentive.
link |
00:16:10.640
But when I was 19, I read Crime and Punishment.
link |
00:16:14.440
That was another great one
link |
00:16:16.120
that talked me out of ever really doing crime.
link |
00:16:19.400
Cause it's like, that's gonna be me.
link |
00:16:21.720
I'd get away with it, but it would just run through my head.
link |
00:16:25.040
Even if I got away with it, you know?
link |
00:16:26.480
And then you do crime for long enough,
link |
00:16:27.640
you'll never get away with it.
link |
00:16:28.960
That's right, in the end.
link |
00:16:30.360
That's a good reason to be good.
link |
00:16:32.680
I wouldn't say I'm good, I would just say I'm not bad.
link |
00:16:34.880
You're a talented programmer and a hacker
link |
00:16:38.080
in a good positive sense of the word.
link |
00:16:40.920
You've played around, found vulnerabilities
link |
00:16:43.360
in various systems.
link |
00:16:44.720
What have you learned broadly
link |
00:16:46.120
about the design of systems and so on
link |
00:16:49.480
from that whole process?
link |
00:16:53.280
You learn to not take things
link |
00:16:59.280
for what people say they are,
link |
00:17:02.160
but you look at things for what they actually are.
link |
00:17:07.040
Yeah.
link |
00:17:07.880
I understand that's what you tell me it is,
link |
00:17:10.080
but what does it do?
link |
00:17:12.960
And you have nice visualization tools
link |
00:17:14.600
to really know what it's really doing.
link |
00:17:16.720
Oh, I wish. I'm a better programmer now than I was in 2014.
link |
00:17:20.080
I said QIRA, that was the first tool
link |
00:17:21.880
that I wrote that was usable.
link |
00:17:23.440
I wouldn't say the code was great.
link |
00:17:25.360
I still wouldn't say my code is great.
link |
00:17:28.840
So how was your evolution as a programmer?
link |
00:17:30.760
Besides practice?
link |
00:17:32.280
You started with C,
link |
00:17:33.880
what point did you pick up Python?
link |
00:17:35.560
Because you're pretty big in Python now.
link |
00:17:37.080
Now, yeah, in college,
link |
00:17:39.960
I went to Carnegie Mellon when I was 22.
link |
00:17:42.520
I went back, I'm like,
link |
00:17:44.200
I'm gonna take all your hardest CS courses
link |
00:17:46.640
and we'll see how I do, right?
link |
00:17:47.640
Like, did I miss anything
link |
00:17:48.560
by not having a real undergraduate education?
link |
00:17:51.520
Took operating systems, compilers, AI,
link |
00:17:54.240
and, like, a freshman weed-out math course.
link |
00:17:56.880
And some of those classes you mentioned,
link |
00:18:03.320
pretty tough, actually.
link |
00:18:04.240
They're great.
link |
00:18:05.640
At least circa 2012,
link |
00:18:07.640
operating systems and compilers
link |
00:18:11.240
were two of the best classes I've ever taken in my life.
link |
00:18:14.440
Because you write an operating system
link |
00:18:15.640
and you write a compiler.
link |
00:18:18.080
I wrote my operating system in C
link |
00:18:19.760
and I wrote my compiler in Haskell,
link |
00:18:21.400
but somehow I picked up Python that semester as well.
link |
00:18:26.400
I started using it for the CTFs, actually.
link |
00:18:28.080
That's when I really started to get into CTFs
link |
00:18:30.320
and CTFs, you're always racing against the clock.
link |
00:18:33.360
So I can't write things in C.
link |
00:18:35.120
Oh, there's a clock component.
link |
00:18:36.240
So you really want to use the programming language
link |
00:18:37.840
just so you can be fastest.
link |
00:18:38.960
48 hours.
link |
00:18:40.080
Pwn as many of these challenges as you can.
link |
00:18:41.440
Pwn.
link |
00:18:42.280
Yeah.
link |
00:18:43.120
You get like 100 points a challenge,
link |
00:18:43.960
whichever team gets the most points wins.
link |
00:18:46.360
You were both at Facebook and Google for a brief stint.
link |
00:18:50.240
Yeah.
link |
00:18:51.080
With Project Zero, actually, at Google for five months
link |
00:18:54.920
where you developed QIRA.
link |
00:18:56.960
What was Project Zero about in general?
link |
00:19:01.760
Just curious about the security efforts in these companies.
link |
00:19:05.160
Well, Project Zero started the same time I went there.
link |
00:19:08.840
What year were you there?
link |
00:19:11.080
2015.
link |
00:19:12.320
2015.
link |
00:19:13.160
So that was right at the beginning of Project Zero.
link |
00:19:15.040
It's small.
link |
00:19:16.200
It's Google's offensive security team.
link |
00:19:18.840
I'll try to give the best public facing explanation
link |
00:19:25.680
that I can.
link |
00:19:26.520
So the idea is basically,
link |
00:19:30.960
these vulnerabilities exist in the world.
link |
00:19:33.240
Nation states have them.
link |
00:19:35.240
Some high powered bad actors have them.
link |
00:19:39.840
Sometimes people will find these vulnerabilities
link |
00:19:44.200
and submit them in bug bounties to the companies.
link |
00:19:47.960
But a lot of the companies don't really care.
link |
00:19:49.440
They don't even fix the bug.
link |
00:19:50.520
It doesn't hurt for there to be a vulnerability.
link |
00:19:53.760
So Project Zero is like, we're going to do it different.
link |
00:19:55.880
We're going to announce a vulnerability
link |
00:19:57.840
and we're going to give them 90 days to fix it.
link |
00:19:59.640
And then whether they fix it or not,
link |
00:20:00.800
we're going to drop the Zero Day.
link |
00:20:03.200
Oh, wow.
link |
00:20:04.080
We're going to drop the weapon.
link |
00:20:05.240
That's so cool.
link |
00:20:06.080
That is so cool.
link |
00:20:07.480
I love that. Deadlines.
link |
00:20:09.200
Oh, that's so cool.
link |
00:20:10.040
Give them real deadlines.
link |
00:20:10.880
Yeah.
link |
00:20:12.320
And I think it's done a lot for moving the industry forward.
link |
00:20:15.800
I watched your coding sessions that you streamed online.
link |
00:20:20.360
You code things up, the basic projects, usually from scratch.
link |
00:20:25.280
I would say, sort of as a programmer myself,
link |
00:20:28.200
just watching you, that you type really fast
link |
00:20:30.360
and your brain works in both brilliant and chaotic ways.
link |
00:20:34.440
I don't know if that's always true,
link |
00:20:35.800
but certainly for the live streams.
link |
00:20:37.600
So it's interesting to me because I'm much slower
link |
00:20:41.320
and systematic and careful.
link |
00:20:43.520
And you just move probably an order of magnitude faster.
link |
00:20:48.040
So I'm curious, is there a method to your madness?
link |
00:20:51.800
Or is it just who you are?
link |
00:20:53.040
There's pros and cons.
link |
00:20:54.720
There's pros and cons to my programming style.
link |
00:20:58.080
And I'm aware of them.
link |
00:21:00.360
If you ask me to get something up and working quickly
link |
00:21:04.480
with an API that's kind of undocumented,
link |
00:21:06.800
I will do this super fast because I will throw things
link |
00:21:08.880
at it until it works.
link |
00:21:10.200
If you ask me to take a vector and rotate it 90 degrees
link |
00:21:14.720
and then flip it over the X, Y plane,
link |
00:21:19.320
I'll spam program for two hours and won't get it.
link |
00:21:22.280
Oh, because it's something that you
link |
00:21:23.480
could do with a sheet of paper, think through the design,
link |
00:21:26.240
and then you really just throw stuff at the wall
link |
00:21:30.400
and you get so good at it that it usually works.
link |
00:21:34.600
I should become better at the other kind as well.
link |
00:21:36.920
Sometimes I will do things methodically.
link |
00:21:39.440
It's nowhere near as entertaining on the Twitch streams.
link |
00:21:41.200
I do exaggerate it a bit on the Twitch streams as well.
link |
00:21:43.520
The Twitch streams, I mean, what do you want to see a game
link |
00:21:45.480
or you want to see actions per minute, right?
link |
00:21:46.840
I'll show you APM for programming too.
link |
00:21:48.200
Yeah, I'd recommend people go to it.
link |
00:21:50.280
I think I watched probably several hours that you put,
link |
00:21:53.800
like I've actually left you programming in the background
link |
00:21:57.480
while I was programming because you made me,
link |
00:22:00.400
it was like watching a really good gamer.
link |
00:22:03.120
It's like energizes you because you're like moving so fast
link |
00:22:06.240
and so it's awesome, it's inspiring.
link |
00:22:08.840
It made me jealous that like,
link |
00:22:12.280
because my own programming is inadequate
link |
00:22:14.280
in terms of speed, so I was like.
link |
00:22:16.960
So I'm twice as frantic on the live streams
link |
00:22:20.520
as I am when I code without streaming.
link |
00:22:22.680
It's super entertaining.
link |
00:22:23.720
So I wasn't even paying attention to what you were coding,
link |
00:22:26.400
which is great, it's just watching you switch windows
link |
00:22:29.760
and Vim, I guess, is the way you work.
link |
00:22:31.400
Yeah, it's Vim on screen.
link |
00:22:33.000
I developed a workflow at Facebook and stuck with it.
link |
00:22:35.640
How do you learn new programming tools,
link |
00:22:37.320
ideas, techniques these days?
link |
00:22:39.440
What's your like methodology for learning new things?
link |
00:22:42.080
So I wrote for comma,
link |
00:22:47.200
the distributed file systems out in the world
link |
00:22:49.280
are extremely complex.
link |
00:22:50.720
Like if you want to install something like Ceph,
link |
00:22:55.280
Ceph is I think the like open infrastructure
link |
00:22:58.760
distributed file system or there's like newer ones
link |
00:23:03.040
like SeaweedFS, but these are all like 10,000
link |
00:23:05.880
plus line projects.
link |
00:23:06.880
I think some of them are even 100,000 line
link |
00:23:09.520
and just configuring them is a nightmare.
link |
00:23:11.120
So I wrote, I wrote one, it's 200 lines
link |
00:23:16.440
and it uses, like, nginx for the volume servers
link |
00:23:18.880
and has this little master server that I wrote in Go.
link |
00:23:21.600
And this, if I would say
link |
00:23:24.840
that I'm proud per line of any code I wrote,
link |
00:23:27.240
maybe there's some exploits that I think are beautiful
link |
00:23:29.160
and then this, this is 200 lines
link |
00:23:31.320
and just the way that I thought about it,
link |
00:23:33.720
I think was very good and the reason it's very good
link |
00:23:35.560
is because that was the fourth version of it that I wrote
link |
00:23:37.640
and I had three versions that I threw away.
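For flavor, here is a toy sketch of that design: a tiny master that maps each key to a volume server and redirects, with the volumes doing the actual blob storage. His real system is roughly 200 lines of Go fronted by nginx volume servers; the Python below, the volume URLs, and the hash-based placement are purely illustrative.

```python
# Toy sketch of a master/volume-server key-value store. Illustrative only.
import hashlib

VOLUMES = ["http://vol1:3001", "http://vol2:3001", "http://vol3:3001"]

def pick_volume(key: bytes) -> str:
    # deterministic placement: hash the key onto one volume server
    h = int.from_bytes(hashlib.md5(key).digest()[:8], "big")
    return VOLUMES[h % len(VOLUMES)]

def master_get(key: bytes) -> str:
    # a real master would answer GET /key with a 302 redirect to this URL
    return f"{pick_volume(key)}/{key.decode()}"

print(master_get(b"hello"))  # e.g. http://vol2:3001/hello
```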
link |
00:23:39.320
You mentioned, did you say go?
link |
00:23:41.000
I wrote it in Go, yeah.
link |
00:23:41.840
In Go.
link |
00:23:42.680
Is that a functional language?
link |
00:23:43.880
I forget what Go is.
link |
00:23:45.280
Go is Google's language.
link |
00:23:47.160
Right.
link |
00:23:48.200
It's not functional.
link |
00:23:49.480
It's some, it's like, in a way it's C++, but easier.
link |
00:23:56.160
It's strongly typed.
link |
00:23:58.200
It has a nice ecosystem around it.
link |
00:23:59.760
When I first looked at it, I was like,
link |
00:24:01.680
this is like Python, but it takes twice as long
link |
00:24:03.800
to do anything.
link |
00:24:05.600
Now that openpilot is migrating to C,
link |
00:24:09.600
but it still has large Python components,
link |
00:24:11.000
I now understand why Python doesn't work
link |
00:24:12.760
for large code bases and why you want something like Go.
link |
00:24:15.840
Interesting.
link |
00:24:16.680
So why, why doesn't Python work for,
link |
00:24:18.680
so even most, speaking for myself at least,
link |
00:24:21.720
like we do a lot of stuff, basically demo level work
link |
00:24:24.960
with autonomous vehicles and most of the work is Python.
link |
00:24:29.240
Why doesn't Python work for large code bases?
link |
00:24:32.440
Because, well, lack of type checking is a big one.
link |
00:24:37.920
So errors creep in.
link |
00:24:39.360
Yeah, and like you don't know,
link |
00:24:41.920
the compiler can tell you like nothing, right?
link |
00:24:45.320
So everything is either, you know,
link |
00:24:48.440
like syntax errors, fine,
link |
00:24:49.880
but if you misspell a variable in Python,
link |
00:24:51.800
the compiler won't catch that.
link |
00:24:53.000
There's like linters that can catch it some of the time.
link |
00:24:56.600
There's no types.
link |
00:24:57.560
This is really the biggest downside.
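A concrete instance of the failure mode he means (an invented snippet, not comma code): the misspelling passes silently and only shows up as wrong behavior later.

```python
# This "compiles" fine in Python; the typo silently creates a brand-new
# attribute instead of updating the real one, and nothing errors until
# behavior goes wrong much later.
class Car:
    def __init__(self):
        self.target_speed = 0

def set_speed(car: Car, speed: int) -> None:
    if speed > 120:
        car.target_speeed = 120  # typo! a typed compiler would reject this
    else:
        car.target_speed = speed

car = Car()
set_speed(car, 150)
print(car.target_speed)  # still 0 -- the bug ships silently
```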
link |
00:25:00.520
And then, well, Python's slow, but that's not related to it.
link |
00:25:02.640
Well, maybe it's kind of related to it, the lack of types.
link |
00:25:04.840
So what's in your toolbox these days?
link |
00:25:06.600
Is it Python or what else?
link |
00:25:07.760
Go.
link |
00:25:08.600
I need to move to something else.
link |
00:25:10.240
My adventure into dependently typed languages,
link |
00:25:12.880
I love these languages.
link |
00:25:14.240
They just have like syntax from the 80s.
link |
00:25:18.520
What do you think about JavaScript?
link |
00:25:21.120
ES6, like the modern type script?
link |
00:25:24.000
JavaScript is, the whole ecosystem
link |
00:25:27.320
is unbelievably confusing.
link |
00:25:29.320
NPM updates a package from 0.2.2 to 0.2.5
link |
00:25:32.840
and that breaks your Babel linter,
link |
00:25:34.560
which translates your ES6 into ES5, which doesn't run on...
link |
00:25:38.560
So why do I have to compile my JavaScript again, huh?
link |
00:25:42.480
It may be the future though.
link |
00:25:44.040
You think about, I mean,
link |
00:25:45.800
I've embraced JavaScript recently
link |
00:25:47.400
just because just like I've continually embraced PHP,
link |
00:25:52.280
it seems that these worst possible languages live on
link |
00:25:55.360
for the longest, like cockroaches never die.
link |
00:25:57.480
Yeah, well, it's in the browser and it's fast.
link |
00:26:00.760
It's fast.
link |
00:26:01.680
Yeah.
link |
00:26:02.520
It's in the browser, and compute might increasingly,
link |
00:26:05.480
you know, the browser,
link |
00:26:06.440
it's unclear what the role of the browser is
link |
00:26:09.040
in terms of distributed computation in the future.
link |
00:26:11.800
So.
link |
00:26:13.600
JavaScript is definitely here to stay.
link |
00:26:15.240
Yeah.
link |
00:26:16.080
It's interesting if autonomous vehicles
link |
00:26:18.160
will run on JavaScript one day.
link |
00:26:19.480
I mean, you have to consider these possibilities.
link |
00:26:21.760
Well, all our debug tools are JavaScript.
link |
00:26:24.280
We actually just open source them.
link |
00:26:26.040
We have a tool, Explorer, which lets you annotate
link |
00:26:28.160
your disengagements, and we have a tool, Cabana,
link |
00:26:30.080
which lets you analyze the CAN traffic from the car.
link |
00:26:32.920
So basically any time you're visualizing something
link |
00:26:35.240
about the logs, it's using JavaScript.
link |
00:26:37.720
Well, the web is the best UI toolkit by far.
link |
00:26:40.120
Yeah.
link |
00:26:40.960
So, and then, you know what?
link |
00:26:41.880
You're coding in JavaScript.
link |
00:26:42.760
We have a React guy.
link |
00:26:43.600
He's good.
link |
00:26:44.440
React, nice.
link |
00:26:46.080
Let's get into it.
link |
00:26:46.920
So let's talk autonomous vehicles.
link |
00:26:49.120
You founded comma AI.
link |
00:26:51.440
Let's, at a high level,
link |
00:26:54.920
how did you get into the world of vehicle automation?
link |
00:26:57.880
Can you also just, for people who don't know,
link |
00:26:59.920
tell the story of comma AI?
link |
00:27:01.400
Sure.
link |
00:27:02.920
So I was working at this AI startup
link |
00:27:06.120
and a friend approached me and he's like,
link |
00:27:09.240
dude, I don't know where this is going,
link |
00:27:12.080
but the coolest applied AI problem today
link |
00:27:15.160
is self driving cars.
link |
00:27:16.480
I'm like, well, absolutely.
link |
00:27:18.800
You wanna meet with Elon Musk
link |
00:27:20.520
and he's looking for somebody to build a vision system
link |
00:27:24.560
for autopilot.
link |
00:27:27.600
This is when they were still on AP one.
link |
00:27:29.320
They were still using Mobileye.
link |
00:27:30.840
Elon back then was looking for a replacement.
link |
00:27:33.680
And he brought me in and we talked about a contract
link |
00:27:37.320
where I would deliver something
link |
00:27:39.040
that meets Mobileye level performance.
link |
00:27:41.640
I would get paid $12 million if I could deliver it tomorrow
link |
00:27:43.920
and I would lose $1 million for every month I didn't deliver.
link |
00:27:47.720
So I was like, okay, this is a great deal.
link |
00:27:49.080
This is a super exciting challenge.
link |
00:27:52.360
You know what?
link |
00:27:53.200
It takes me 10 months, I get $2 million, it's good.
link |
00:27:55.840
Maybe I can finish up in five.
link |
00:27:57.160
Maybe I don't finish it at all and I get paid nothing
link |
00:27:58.880
and I'll work for 12 months for free.
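Taking those terms literally (an assumption; only these round numbers were stated), the payout is a simple linear function of delivery time in months:

$$\text{payout}(m) = \max(12 - m,\ 0)\ \text{million USD}, \qquad \text{payout}(5) = 7, \quad \text{payout}(10) = 2, \quad \text{payout}(12) = 0.$$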
link |
00:28:00.880
So maybe just take a pause on that.
link |
00:28:02.960
I'm also curious about this
link |
00:28:04.280
because I've been working in robotics for a long time.
link |
00:28:06.360
And I'm curious to see a person like you just step in
link |
00:28:08.320
and sort of somewhat naive, but brilliant, right?
link |
00:28:12.000
So that's the best place to be
link |
00:28:14.000
because you basically full steam take on a problem.
link |
00:28:17.240
How confident, from that time,
link |
00:28:19.720
because you know a lot more now,
link |
00:28:21.320
at that time, how hard do you think it is
link |
00:28:23.440
to solve all of autonomous driving?
link |
00:28:25.880
I remember I suggested to Elon in the meeting
link |
00:28:30.440
on putting a GPU behind each camera
link |
00:28:33.120
to keep the compute local.
link |
00:28:35.120
This is an incredibly stupid idea.
link |
00:28:38.000
I leave the meeting 10 minutes later and I'm like,
link |
00:28:40.080
I could have spent a little bit of time
link |
00:28:41.560
thinking about this problem before I went in.
link |
00:28:42.880
Why is this a stupid idea?
link |
00:28:44.200
Oh, just send all your cameras to one big GPU.
link |
00:28:46.280
You're much better off doing that.
link |
00:28:48.240
Oh, sorry, you said behind every camera.
link |
00:28:50.160
Every camera.
link |
00:28:51.000
Every small GPU.
link |
00:28:51.840
I was like, oh, I'll put the first few layers
link |
00:28:52.720
of my convs there.
link |
00:28:54.520
Like why did I say that?
link |
00:28:56.080
That's possible.
link |
00:28:56.920
It's possible, but it's a bad idea.
link |
00:28:59.000
It's not obviously a bad idea.
link |
00:29:00.480
Pretty obviously bad.
link |
00:29:01.320
But whether it's actually a bad idea or not,
link |
00:29:02.960
I left that meeting with Elon, like beating myself up.
link |
00:29:05.240
I'm like, why did I say something stupid?
link |
00:29:07.080
Yeah, you haven't, like you haven't at least
link |
00:29:09.360
like thought through every aspect fully.
link |
00:29:12.240
He's very sharp too.
link |
00:29:13.200
Right away he called me out about it.
link |
00:29:18.560
And like, usually in life,
link |
00:29:19.800
I get away with saying stupid things.
link |
00:29:21.120
And then like people will, you know,
link |
00:29:24.640
a lot of times people don't even notice.
link |
00:29:26.080
And I'll like correct it and bring the conversation back.
link |
00:29:28.200
But with Elon, it was like, nope, like, okay.
link |
00:29:30.600
Well, that's not at all why the contract fell through.
link |
00:29:33.520
I was much more prepared the second time I met him.
link |
00:29:35.520
Yeah.
link |
00:29:36.360
But in general, how hard did you think it,
link |
00:29:39.640
like 12 months is a tough timeline?
link |
00:29:43.680
Oh, I just thought I'd clone the Mobileye EyeQ3.
link |
00:29:45.720
I didn't think I'd solve level five self driving
link |
00:29:47.560
or anything.
link |
00:29:48.400
So the goal there was to do lane keeping,
link |
00:29:51.000
good lane keeping.
link |
00:29:52.840
I saw my friend showed me the outputs from Mobileye.
link |
00:29:55.560
And the outputs from Mobileye was just basically two lanes
link |
00:29:57.680
and a position of a lead car.
link |
00:29:59.440
I'm like, I can gather a data set
link |
00:30:01.560
and train this net in weeks.
link |
00:30:03.440
And I did.
link |
00:30:04.840
Well, first time I tried the implementation of Mobileye
link |
00:30:07.600
in a Tesla, I was really surprised how good it is.
link |
00:30:11.240
It's quite incredibly good.
link |
00:30:12.320
Cause I thought it's just cause I've done
link |
00:30:14.080
a lot of computer vision.
link |
00:30:14.920
I thought it'd be a lot harder to create a system
link |
00:30:18.880
that that's stable.
link |
00:30:21.000
So I was personally surprised.
link |
00:30:22.440
Just, you know, have to admit it.
link |
00:30:25.000
Cause I was kind of skeptical before trying it.
link |
00:30:27.840
Cause I thought it would go in and out a lot more.
link |
00:30:31.200
It would get disengaged a lot more.
link |
00:30:33.160
And it's pretty robust.
link |
00:30:36.200
So what, how, how, how hard is the problem
link |
00:30:39.720
when you, when you tackled it?
link |
00:30:42.080
So I think AP one was great. Like Elon talked
link |
00:30:45.760
about disengagements on the 405 down in LA
link |
00:30:49.040
where, like, the lane marks were kind of faded
link |
00:30:51.040
and the Mobileye system would drop out.
link |
00:30:53.960
Like I had something up and working
link |
00:30:57.240
that I would say was like the same quality in three months.
link |
00:31:02.480
Same quality, but how do you know?
link |
00:31:04.560
You say stuff like that confidently, but you can't,
link |
00:31:07.400
and I love it, but the question is you can't,
link |
00:31:12.120
you're kind of going by feel cause you just,
link |
00:31:13.880
You're going by feel, absolutely, absolutely.
link |
00:31:15.560
Like, like I would take, I hadn't,
link |
00:31:17.280
I borrowed my friend's Tesla.
link |
00:31:18.480
I would take AP one out for a drive.
link |
00:31:20.760
And then I would take my system out for a drive.
link |
00:31:22.320
And it seemed reasonably the same.
link |
00:31:26.080
So the 405, how hard is it to create something
link |
00:31:30.480
that could actually be a product that's deployed?
link |
00:31:34.200
I mean, I've read an article where Elon responded,
link |
00:31:39.520
it said something about you saying that
link |
00:31:41.880
to build autopilot is more complicated
link |
00:31:47.080
than a single George Hotz level job.
link |
00:31:51.880
How hard is that job to create something
link |
00:31:55.520
that would work globally?
link |
00:31:58.960
Well, it's not the global that's the challenge,
link |
00:32:00.640
but Elon followed that up by saying
link |
00:32:02.240
it's going to take two years and a company of 10 people.
link |
00:32:04.920
And here I am four years later with a company of 12 people.
link |
00:32:07.920
And I think we still have another two to go.
link |
00:32:09.960
Two years.
link |
00:32:10.800
So yeah, so what do you think,
link |
00:32:13.120
what do you think about how Tesla's progressing
link |
00:32:15.960
with autopilot, V2, V3?
link |
00:32:19.200
I think we've kept pace with them pretty well.
link |
00:32:24.080
I think Navigate on Autopilot is terrible.
link |
00:32:26.880
We had some demo features internally of the same stuff
link |
00:32:31.120
and we would test it and I'm like,
link |
00:32:32.720
I'm not shipping this even as like open source software
link |
00:32:34.720
to people.
link |
00:32:35.560
What do you think is terrible?
link |
00:32:37.400
Consumer Reports does a great job of describing it.
link |
00:32:39.600
Like when it makes a lane change,
link |
00:32:41.240
it does it worse than a human.
link |
00:32:43.600
You shouldn't ship things like that. Autopilot, openpilot,
link |
00:32:46.960
they lane keep better than a human.
link |
00:32:49.760
If you turn it on for a stretch of highway,
link |
00:32:53.440
like an hour long, it's never going to touch a lane line.
link |
00:32:56.680
A human will probably touch a lane line twice.
link |
00:32:59.040
You just inspired me.
link |
00:33:00.080
I don't know if you're grounded in data on that.
link |
00:33:02.200
I read your paper.
link |
00:33:03.280
Okay, but no, but that's interesting.
link |
00:33:06.720
I wonder actually how often we touch lane lines
link |
00:33:11.200
a little bit because it is.
link |
00:33:13.400
I could answer that question pretty easily
link |
00:33:14.960
with the comma dataset.
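A sketch of how that query might look over logged drive data. The file name, column names, and thresholds below are invented for illustration; comma's actual log format and schema are different.

```python
# Sketch of answering "how often do humans touch lane lines?" from drive logs.
import pandas as pd

df = pd.read_csv("drive_log.csv")  # hypothetical columns: time (s), lane_offset_m
HALF_LANE_M = 1.85                 # half of a ~3.7 m US highway lane
HALF_CAR_M = 0.9                   # roughly half a vehicle's width

# the car "touches" a line when its edge reaches the lane boundary
touching = df["lane_offset_m"].abs() + HALF_CAR_M >= HALF_LANE_M
# count entries into the touching state, not samples spent in it
touches = (touching & ~touching.shift(fill_value=False)).sum()
hours = (df["time"].iloc[-1] - df["time"].iloc[0]) / 3600.0
print(f"{touches} touches in {hours:.1f} h = {touches / hours:.2f} per hour")
```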
link |
00:33:15.800
Yeah, I'm curious.
link |
00:33:16.920
I've never answered it.
link |
00:33:17.760
I don't know.
link |
00:33:18.600
It's just, that was like my personal feel.
link |
00:33:20.000
It feels right, but that's interesting
link |
00:33:22.400
because every time you touch a lane,
link |
00:33:23.800
that's a source of a little bit of stress
link |
00:33:26.760
and kind of lane keeping is removing that stress.
link |
00:33:29.320
That's ultimately the biggest value add
link |
00:33:31.840
honestly is just removing the stress
link |
00:33:34.240
of having to stay in lane.
link |
00:33:35.480
And I don't think people fully realize
link |
00:33:39.040
first of all that that's a big value add,
link |
00:33:41.960
but also that that's all it is.
link |
00:33:45.000
And not only do I find it a huge value add.
link |
00:33:48.560
I drove down when we moved to San Diego,
link |
00:33:50.440
I drove down in an enterprise rental car
link |
00:33:52.640
and I missed it.
link |
00:33:53.480
So I missed having the system so much.
link |
00:33:55.480
It's so much more tiring to drive
link |
00:33:59.200
without it.
link |
00:34:00.320
It's, it is that lane centering.
link |
00:34:02.960
That's the key feature.
link |
00:34:04.840
Yeah.
link |
00:34:06.600
And in a way it's the only feature
link |
00:34:08.960
that actually adds value to people's lives
link |
00:34:11.040
in autonomous vehicles today.
link |
00:34:12.200
Waymo does not add value to people's lives.
link |
00:34:13.840
It's a more expensive, slower Uber.
link |
00:34:15.880
Maybe someday it'll be this big cliff where it adds value,
link |
00:34:18.640
but I don't see it.
link |
00:34:19.480
It's fascinating.
link |
00:34:20.320
I haven't talked to, this is good.
link |
00:34:22.560
Cause I haven't, I have intuitively,
link |
00:34:25.840
but I think we're making it explicit now.
link |
00:34:28.320
I actually believe that really good lane keeping
link |
00:34:35.480
is a reason to buy a car.
link |
00:34:37.240
Will be a reason to buy a car.
link |
00:34:38.440
It is a huge value add.
link |
00:34:39.720
I've never, until we just started talking about it,
link |
00:34:41.760
haven't really quite realized it,
link |
00:34:43.880
that I've felt that Elon's chase of level four
link |
00:34:49.440
is not the correct chase.
link |
00:34:52.360
Because you should just say, Tesla has the best,
link |
00:34:56.000
as in, from a Tesla perspective, say,
link |
00:34:58.320
Tesla has the best lane keeping.
link |
00:35:00.600
Comma AI should say comma AI is the best lane keeping.
link |
00:35:04.160
And that is it.
link |
00:35:05.640
Yeah.
link |
00:35:06.480
Yeah.
link |
00:35:07.320
Do you think?
link |
00:35:08.160
You have to do the longitudinal as well.
link |
00:35:09.920
You can't just lane keep.
link |
00:35:10.960
You have to do ACC,
link |
00:35:12.920
but ACC is much more forgiving than lane keep,
link |
00:35:15.840
especially on the highway.
link |
00:35:17.400
By the way, is comma AI camera only, correct?
link |
00:35:22.000
No, we use the radar.
link |
00:35:23.440
Oh, from the car, you're able to get the radar, okay.
link |
00:35:26.960
We can do it camera only now.
link |
00:35:28.800
It's gotten to the point,
link |
00:35:29.640
but we leave the radar there as like a,
link |
00:35:31.600
it's fusion now.
link |
00:35:33.440
Okay, so let's maybe talk through
link |
00:35:35.440
some of the system specs on the hardware.
link |
00:35:37.920
What's the hardware side of what you're providing?
link |
00:35:42.880
What's the capabilities on the software side
link |
00:35:44.720
with OpenPilot and so on?
link |
00:35:46.800
So openpilot, and the box that we sell that it runs on,
link |
00:35:51.800
it's a phone in a plastic case.
link |
00:35:53.920
It's nothing special.
link |
00:35:54.840
We sell it without the software.
link |
00:35:56.200
So you're like, you know, you buy the phone,
link |
00:35:57.840
it's just easy.
link |
00:35:58.920
It'll be easy set up,
link |
00:36:00.240
but it's sold with no software.
link |
00:36:03.480
OpenPilot right now is about to be 0.6.
link |
00:36:06.600
When it gets to 1.0,
link |
00:36:07.880
I think we'll be ready for a consumer product.
link |
00:36:09.680
We're not gonna add any new features.
link |
00:36:11.120
We're just gonna make the lane keeping really, really good.
link |
00:36:13.800
Okay, I got it.
link |
00:36:15.120
So what do we have right now?
link |
00:36:16.120
It's a Snapdragon 820.
link |
00:36:18.200
It's a Sony IMX 298 forward facing camera,
link |
00:36:23.680
driver monitoring camera.
link |
00:36:24.720
It's just a selfie cam on the phone.
link |
00:36:26.400
And a CAN transceiver,
link |
00:36:30.000
it's a little thing called panda.
link |
00:36:32.320
And they talk over USB to the phone
link |
00:36:35.040
and then it has three CAN buses
link |
00:36:36.400
that talk to the car.
link |
00:36:38.560
One of those CAN buses is the radar CAN bus.
link |
00:36:40.920
One of them is the main car CAN bus.
link |
00:36:42.920
And the other one is the proxy camera CAN bus.
link |
00:36:44.920
We leave the existing camera in place.
link |
00:36:47.320
So we don't turn AEB off.
link |
00:36:49.560
Right now we still turn AEB off
link |
00:36:51.040
if you're using our longitudinal,
link |
00:36:52.280
but we're gonna fix that before 1.0.
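For reference, comma's panda ships with an open-source Python library; a minimal sketch of listening to those buses might look like this. The exact tuple layout returned by can_recv() has varied across library versions, so treat the unpacking as illustrative.

```python
# Minimal sketch of reading CAN traffic through comma's panda over USB.
from panda import Panda

p = Panda()  # connect to the panda over USB
for _ in range(100):
    # can_recv() returns a batch of frames; tuple layout is version-dependent
    for address, _ts, data, bus in p.can_recv():
        # one bus index carries the radar, one the main car,
        # and one the proxied camera traffic, as described above
        print(f"bus={bus} addr={hex(address)} data={data.hex()}")
```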
link |
00:36:54.320
Got it.
link |
00:36:55.160
Wow, that's cool.
link |
00:36:56.000
So it's CAN both ways.
link |
00:36:57.960
So how are you able to control vehicles?
link |
00:37:02.120
So we proxy. The vehicles that we work with
link |
00:37:05.520
already have a lane keeping assist system.
link |
00:37:08.960
So lane keeping assist can mean a huge variety of things.
link |
00:37:12.520
It can mean it will apply a small torque
link |
00:37:16.120
to the wheel after you've already crossed a lane line
link |
00:37:18.920
by a foot, which is the system in the older Toyotas.
link |
00:37:22.720
Versus like, I think Tesla still calls it lane keeping assist
link |
00:37:26.360
where it'll keep you perfectly in the center of the lane
link |
00:37:28.920
on the highway.
link |
00:37:31.240
You can control, like with a joystick, the cars.
link |
00:37:34.000
So these cars already have the capability of drive by wire.
link |
00:37:36.600
So is it, is it trivial to convert a car
link |
00:37:41.600
that it operates with?
link |
00:37:43.320
so that openpilot is able to control the steering?
link |
00:37:48.480
Oh, a new car or a car that we,
link |
00:37:49.720
so we have support now for 45 different makes of cars.
link |
00:37:52.800
What are the cars in general?
link |
00:37:54.880
Mostly Honda's and Toyotas.
link |
00:37:56.360
We support almost every Honda and Toyota made this year.
link |
00:38:01.680
And then a bunch of GMs, a bunch of Subarus.
link |
00:38:04.480
But it doesn't have to be like a Prius.
link |
00:38:05.960
It could be Corolla as well.
link |
00:38:07.320
Oh, the 2020 Corolla is the best car with open pilot.
link |
00:38:10.760
It just came out.
link |
00:38:11.720
The actuator has less lag than the older Corolla.
link |
00:38:15.840
I think I started watching a video with you.
link |
00:38:18.240
I mean, the way you make videos is awesome.
link |
00:38:21.480
It's just literally at the dealership streaming.
link |
00:38:25.320
I had my friend follow me.
link |
00:38:26.160
We probably streamed for an hour.
link |
00:38:27.560
Yeah, and basically like if stuff goes a little wrong,
link |
00:38:31.120
you just like, you just go with it.
link |
00:38:33.160
Yeah, I love it.
link |
00:38:34.000
It's real.
link |
00:38:34.840
Yeah, it's real.
link |
00:38:35.680
That's so beautiful and it's so in contrast to the way
link |
00:38:42.000
other companies would put together a video like that.
link |
00:38:44.600
Kind of why I like to do it like that.
link |
00:38:46.000
Good.
link |
00:38:46.840
I mean, if you become super rich one day and successful,
link |
00:38:49.720
I hope you keep it that way because I think that's actually
link |
00:38:52.280
what people love, that kind of genuine.
link |
00:38:54.600
Oh, it's all that has value to me.
link |
00:38:56.520
Money has no, if I sell out to like make money,
link |
00:38:59.840
I sold out.
link |
00:39:00.680
It doesn't matter.
link |
00:39:01.520
What do I get?
link |
00:39:02.360
Yacht, I don't want a yacht.
link |
00:39:04.440
And I think Tesla actually has a small inkling of that
link |
00:39:09.440
as well with autonomy day.
link |
00:39:11.240
They did reveal more than, I mean, of course,
link |
00:39:14.000
there's marketing communications, you could tell,
link |
00:39:15.680
but it's more than most companies would reveal,
link |
00:39:17.640
which is I hope they go towards that direction
link |
00:39:20.960
more other companies, GM, Ford.
link |
00:39:23.000
Oh, Tesla's going to win level five.
link |
00:39:25.400
They really are.
link |
00:39:26.560
So let's talk about it.
link |
00:39:27.800
You think, you're focused on level two currently, currently.
link |
00:39:33.000
We're going to be one to two years behind Tesla
link |
00:39:36.160
getting to level five.
link |
00:39:37.160
OK.
link |
00:39:38.520
We're Android, right?
link |
00:39:39.320
We're Android.
link |
00:39:39.880
You're Android.
link |
00:39:40.680
I'm just saying once Tesla gets it,
link |
00:39:42.240
we're one to two years behind.
link |
00:39:43.440
I'm not making any timeline on when Tesla's going to get it.
link |
00:39:45.680
That's right.
link |
00:39:46.120
You did.
link |
00:39:46.360
That's brilliant.
link |
00:39:46.960
I'm sorry, Tesla investors, if you
link |
00:39:48.560
think you're going to have an autonomous robot taxi
link |
00:39:50.520
fleet by the end of the year, I'll bet against that.
link |
00:39:54.920
So what do you think about this?
link |
00:39:57.720
Most level four companies are kind of just
link |
00:40:03.280
doing their usual safety driver, doing full autonomy kind
link |
00:40:08.360
of testing.
link |
00:40:08.800
And then Tesla is basically trying
link |
00:40:10.880
to go from lane keeping to full autonomy.
link |
00:40:15.280
What do you think about that approach?
link |
00:40:16.840
How successful would it be?
link |
00:40:18.360
It's a ton better approach.
link |
00:40:20.680
Because Tesla is gathering data on a scale
link |
00:40:23.960
that none of them are.
link |
00:40:25.200
They're putting real users behind the wheel of the cars.
link |
00:40:29.560
It's, I think, the only strategy that works, the incremental.
link |
00:40:34.440
Well, so there's a few components to Tesla approach
link |
00:40:37.000
that's more than just the incremental.
link |
00:40:38.800
What you spoke of is the software,
link |
00:40:41.400
so over the air software updates.
link |
00:40:43.720
Necessity.
link |
00:40:44.800
I mean, Waymo and Cruise have those too.
link |
00:40:46.440
Those aren't.
link |
00:40:47.560
But no.
link |
00:40:48.080
Those differentiate them from the automakers.
link |
00:40:49.800
Right.
link |
00:40:50.080
No cars with lane keeping systems
link |
00:40:53.440
have that except Tesla.
link |
00:40:54.760
Yeah.
link |
00:40:55.720
And the other one is the data, the other direction,
link |
00:40:59.760
which is the ability to query the data.
link |
00:41:01.840
I don't think they're actually collecting
link |
00:41:03.480
as much data as people think, but the ability
link |
00:41:05.240
to turn on collection and turn it off.
link |
00:41:09.440
So I'm in both the robotics world and the psychology,
link |
00:41:13.400
human factors world.
link |
00:41:15.000
Many people believe that level two autonomy
link |
00:41:17.320
is problematic because of the human factor.
link |
00:41:20.040
Like the more the task is automated,
link |
00:41:23.280
the more there's a vigilance decrement.
link |
00:41:25.960
You start to fall asleep.
link |
00:41:27.200
You start to become complacent, start texting more and so on.
link |
00:41:30.480
Do you worry about that?
link |
00:41:32.200
Because if you're talking about transition from lane keeping
link |
00:41:35.000
to full autonomy, if you're spending 80% of the time
link |
00:41:40.960
not supervising the machine, do you
link |
00:41:43.080
worry about what that means for the safety of the drivers?
link |
00:41:47.080
One, we don't consider OpenPilot to be 1.0
link |
00:41:49.640
until we have 100% driver monitoring.
link |
00:41:52.880
You can cheat our driver monitoring system right now.
link |
00:41:55.000
There's a few ways to cheat it.
link |
00:41:56.080
They're pretty obvious.
link |
00:41:58.160
We're working on making that better.
link |
00:41:59.680
Before we ship a consumer product that can drive cars,
link |
00:42:02.520
I want to make sure that I have driver monitoring
link |
00:42:04.240
that you can't cheat.
link |
00:42:05.440
What's a successful driver monitoring system look like?
link |
00:42:09.000
Is it all about just keeping your eyes on the road?
link |
00:42:11.680
Well, a few things.
link |
00:42:12.760
So that's what we went with at first for driver monitoring.
link |
00:42:16.600
I'm checking.
link |
00:42:17.160
I'm actually looking at where your head is looking.
link |
00:42:19.000
The camera's not that high resolution.
link |
00:42:19.880
Eyes are a little bit hard to get.
link |
00:42:21.840
Well, head is big.
link |
00:42:22.880
I mean, that's just.
link |
00:42:23.560
Head is good.
link |
00:42:24.640
And actually, a lot of it, just psychology wise,
link |
00:42:28.720
to have that monitor constantly there,
link |
00:42:30.720
it reminds you that you have to be paying attention.
link |
00:42:33.400
But we want to go further.
link |
00:42:35.080
We just hired someone full time to come on
link |
00:42:36.760
to do the driver monitoring.
link |
00:42:37.960
I want to detect phone in frame, and I
link |
00:42:40.600
want to make sure you're not sleeping.
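A minimal sketch of the head-pose-based attention logic described here, with made-up thresholds and timings (not comma's shipped values): if the head stays outside the "looking at the road" cone for too long, escalate.

```python
# Illustrative distraction monitor: escalate from ok -> warn -> alert as the
# head stays off the road. All limits and timings are invented for the sketch.
import time

YAW_LIMIT_DEG = 25.0
PITCH_LIMIT_DEG = 15.0
WARN_AFTER_S = 2.0
ALERT_AFTER_S = 4.0

class DistractionMonitor:
    def __init__(self):
        self.distracted_since = None

    def update(self, yaw_deg: float, pitch_deg: float, now: float) -> str:
        on_road = abs(yaw_deg) < YAW_LIMIT_DEG and abs(pitch_deg) < PITCH_LIMIT_DEG
        if on_road:
            self.distracted_since = None
            return "ok"
        if self.distracted_since is None:
            self.distracted_since = now
        elapsed = now - self.distracted_since
        if elapsed > ALERT_AFTER_S:
            return "alert"  # e.g. chime, prepare to wind down assistance
        if elapsed > WARN_AFTER_S:
            return "warn"   # visual reminder on the screen
        return "ok"

mon = DistractionMonitor()
print(mon.update(yaw_deg=40.0, pitch_deg=0.0, now=time.time()))
```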
link |
00:42:42.600
How much does the camera see of the body?
link |
00:42:44.880
This one, not enough.
link |
00:42:47.480
Not enough.
link |
00:42:48.400
The next one, everything.
link |
00:42:50.720
What's interesting, FishEye, is we're
link |
00:42:52.920
doing just data collection, not real time.
link |
00:42:55.200
But fisheye is beautiful for being able to capture the body.
link |
00:42:59.200
And the smartphone is really the biggest problem.
link |
00:43:03.280
I'll show you.
link |
00:43:03.880
I can show you one of the pictures from our new system.
link |
00:43:07.800
Awesome.
link |
00:43:08.160
So you're basically saying the driver monitoring
link |
00:43:10.520
will be the answer to that.
link |
00:43:13.080
I think the other point that you raised in your paper
link |
00:43:15.320
is good as well.
link |
00:43:16.920
You're not asking a human to supervise a machine
link |
00:43:20.400
without giving them the ability to take over at any time.
link |
00:43:23.920
Our safety model, you can take over.
link |
00:43:25.760
We disengage on both the gas or the brake.
link |
00:43:27.720
We don't disengage on steering.
link |
00:43:28.880
I don't feel you have to.
link |
00:43:29.920
But we disengage on gas or brake.
link |
00:43:31.720
So it's very easy for you to take over.
link |
00:43:34.240
And it's very easy for you to reengage.
link |
00:43:36.400
That switching should be super cheap.
link |
00:43:39.320
The cars that require, even autopilot,
link |
00:43:40.800
requires a double press.
link |
00:43:42.400
That's almost... I see. I don't like that.
link |
00:43:44.360
And then the cancel.
link |
00:43:46.440
To cancel in autopilot, you either
link |
00:43:48.320
have to press cancel, which no one knows where that is.
link |
00:43:49.920
So they press the brake.
link |
00:43:51.000
But a lot of times you don't want to press the brake.
link |
00:43:53.360
You want to press the gas.
link |
00:43:54.560
So you should cancel on gas or wiggle the steering wheel,
link |
00:43:56.880
which is bad as well.
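A sketch of the engagement policy being described, as a tiny state machine: gas or brake disengages, steering input alone does not, and re-engagement is a single cheap action. Signal names are illustrative.

```python
# Minimal engagement state machine for the policy described above.
class EngagementFSM:
    def __init__(self):
        self.engaged = False

    def step(self, engage_button: bool, gas: bool, brake: bool,
             steering_override: bool) -> bool:
        if self.engaged and (gas or brake):
            self.engaged = False   # disengage on gas or brake
        elif not self.engaged and engage_button:
            self.engaged = True    # single press to (re)engage: cheap switching
        # steering_override intentionally ignored: torque blends with the driver
        return self.engaged

fsm = EngagementFSM()
assert fsm.step(engage_button=True, gas=False, brake=False, steering_override=False)
assert fsm.step(engage_button=False, gas=False, brake=False, steering_override=True)
assert not fsm.step(engage_button=False, gas=True, brake=False, steering_override=False)
```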
link |
00:43:57.960
Wow, that's brilliant.
link |
00:43:58.920
I haven't heard anyone articulate that point.
link |
00:44:01.440
Oh, there's a lot I think about.
link |
00:44:04.960
Because I think actually Tesla has done a better job
link |
00:44:09.800
than most automakers at making that frictionless.
link |
00:44:12.920
But you just described that it could be even better.
link |
00:44:16.600
I love Super Cruise as an experience.
link |
00:44:19.320
Once it's engaged.
link |
00:44:21.120
I don't know if you've used it, but getting the thing
link |
00:44:22.800
to try to engage.
link |
00:44:25.040
Yeah, I've driven Super Cruise a lot.
link |
00:44:27.480
So what's your thoughts on the Super Cruise system in general?
link |
00:44:29.680
You disengage Super Cruise, and it falls back to ACC.
link |
00:44:32.640
So my car is still accelerating.
link |
00:44:34.600
It feels weird.
link |
00:44:36.280
Otherwise, when you actually have Super Cruise engaged
link |
00:44:39.000
on the highway, it is phenomenal.
link |
00:44:41.200
We bought that Cadillac.
link |
00:44:42.320
We just sold it.
link |
00:44:43.240
But we bought it just to experience this.
link |
00:44:45.600
And I wanted everyone in the office to be like,
link |
00:44:47.440
this is what we're striving to build.
link |
00:44:49.360
GM is pioneering with the driver monitoring.
link |
00:44:52.800
You like their driver monitoring system?
link |
00:44:55.040
It has some bugs.
link |
00:44:56.440
If the sun is shining back here, it'll be blind to you.
link |
00:45:01.960
But overall, mostly, yeah.
link |
00:45:03.360
That's so cool that you know all this stuff.
link |
00:45:05.960
I don't often talk to people that have one, because it's such a rare car,
link |
00:45:09.960
unfortunately, currently.
link |
00:45:10.960
We bought one explicitly for that.
link |
00:45:12.760
We lost like $25K in the depreciation,
link |
00:45:15.040
but it feels worth it.
link |
00:45:16.720
I was very pleasantly surprised that that GM system
link |
00:45:21.280
was so innovative and really wasn't advertised much,
link |
00:45:26.320
wasn't talked about much.
link |
00:45:28.480
And I was nervous that it would die, that it would disappear.
link |
00:45:31.840
Well, they put it on the wrong car.
link |
00:45:33.520
They should have put it on the Bolt and not some weird Cadillac
link |
00:45:35.680
that nobody bought.
link |
00:45:36.640
I think that's going to change. They're saying at least
link |
00:45:39.520
it's going to be in their entire fleet.
link |
00:45:41.840
So what do you think about, as long as we're
link |
00:45:44.320
on the driver monitoring, what do you think
link |
00:45:46.920
about Elon Musk's claim that driver monitoring is not needed?
link |
00:45:51.920
Normally, I love his claims.
link |
00:45:53.680
That one is stupid.
link |
00:45:55.560
That one is stupid.
link |
00:45:56.560
And he's not going to have his level five fleet
link |
00:46:00.320
by the end of the year.
link |
00:46:01.320
Hopefully, he's like, OK, I was wrong.
link |
00:46:04.880
I'm going to add driver monitoring.
link |
00:46:06.280
Because when these systems get to the point
link |
00:46:08.240
that they're only messing up once every 1,000 miles,
link |
00:46:10.320
you absolutely need driver monitoring.
link |
00:46:14.080
So let me play, because I agree with you,
link |
00:46:15.880
but let me play devil's advocate.
link |
00:46:17.320
One possibility is that without driver monitoring,
link |
00:46:22.440
people are able to self regulate, monitor themselves.
link |
00:46:29.400
So your idea is, I'm just.
link |
00:46:30.680
You're seeing all the people sleeping in Teslas?
link |
00:46:34.160
Yeah.
link |
00:46:35.280
Well, I'm a little skeptical of all the people sleeping
link |
00:46:38.320
in Teslas because I've stopped paying attention to that kind
link |
00:46:43.960
of stuff because I want to see real data.
link |
00:46:45.680
It's too much glorified.
link |
00:46:47.240
It doesn't feel scientific to me.
link |
00:46:48.720
So I want to know how many people are really sleeping
link |
00:46:52.560
in Teslas versus sleeping in cars in general.
link |
00:46:55.080
I was driving here, sleep deprived,
link |
00:46:57.640
in a car with no automation.
link |
00:46:59.520
I was falling asleep.
link |
00:47:01.040
I agree that it's hypey.
link |
00:47:02.120
It's just like, you know what?
link |
00:47:04.840
If Elon put driver monitoring in... My last autopilot experience
link |
00:47:08.480
was I rented a Model 3 in March and drove it around.
link |
00:47:12.200
The wheel thing is annoying.
link |
00:47:13.640
And the reason the wheel thing is annoying.
link |
00:47:15.440
We use the wheel thing as well, but we
link |
00:47:17.080
don't disengage on wheel.
link |
00:47:18.720
For Tesla, you have to touch the wheel just enough
link |
00:47:21.720
to trigger the torque sensor to tell it that you're there,
link |
00:47:25.320
but not enough as to disengage it. Don't use it
link |
00:47:29.720
for two things.
link |
00:47:30.440
Don't disengage on wheel.
link |
00:47:31.360
You don't have to.
link |
00:47:32.400
That whole experience, wow, beautifully put.
link |
00:47:35.360
All those elements, even if you don't have driver monitoring,
link |
00:47:38.360
that whole experience needs to be better.
link |
00:47:41.080
Driver monitoring, I think would make,
link |
00:47:43.760
I mean, I think Super Cruise is a better experience,
link |
00:47:46.200
once it's engaged, than autopilot.
link |
00:47:48.440
I think Super Cruise's transitions to engagement
link |
00:47:51.600
and disengagement are significantly worse.
link |
00:47:55.200
There's a tricky thing, because if I were to criticize
link |
00:47:57.880
Super Cruise, it's a little too crude.
link |
00:48:00.800
And I think it's like six seconds or something.
link |
00:48:03.640
If you look off road, it'll start warning you.
link |
00:48:06.080
It's some ridiculously long period of time.
link |
00:48:09.120
And just the way, I think it's basically, it's a binary.
link |
00:48:15.840
It should be adaptive.
link |
00:48:17.440
Yeah, it needs to learn more about you.
link |
00:48:19.880
It needs to communicate what it sees about you more.
link |
00:48:23.160
I'm not, you know, Tesla shows what it sees
link |
00:48:25.800
about the external world.
link |
00:48:27.160
It would be nice if Super Cruise would tell us
link |
00:48:29.120
what it sees about the internal world.
link |
00:48:30.840
It's even worse than that.
link |
00:48:31.960
You press the button to engage
link |
00:48:33.320
and it just says Super Cruise unavailable.
link |
00:48:35.480
Yeah, why?
link |
00:48:36.320
Why?
link |
00:48:37.800
Yeah, that transparency is good.
link |
00:48:41.480
We've renamed the driver monitoring packet
link |
00:48:43.520
to driver state.
link |
00:48:45.360
Driver state.
link |
00:48:46.280
We have car state packet, which has the state of the car
link |
00:48:48.360
and driver state packet, which has state of the driver.
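A sketch of what those two packets could carry. Field names here are illustrative stand-ins; openpilot's real schemas are Cap'n Proto definitions, not Python dataclasses.

```python
# Sketch of the two message types described: carState for vehicle signals and
# driverState for what the interior-facing model sees. Fields are illustrative.
from dataclasses import dataclass

@dataclass
class CarState:
    v_ego: float           # vehicle speed, m/s
    steering_angle: float  # degrees
    gas_pressed: bool
    brake_pressed: bool

@dataclass
class DriverState:
    face_prob: float         # confidence a face is in frame
    yaw: float               # head yaw, radians
    pitch: float             # head pitch, radians
    eyes_closed_prob: float  # drowsiness cue

print(DriverState(face_prob=0.98, yaw=0.05, pitch=-0.02, eyes_closed_prob=0.01))
```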
link |
00:48:51.040
So what is it?
link |
00:48:52.240
Estimate their BAC.
link |
00:48:54.080
What's BAC?
link |
00:48:54.920
Blood alcohol content, kind of.
link |
00:48:57.360
You think that's possible with computer vision?
link |
00:48:59.240
Absolutely.
link |
00:49:02.560
It's a, to me, it's an open question.
link |
00:49:04.520
I haven't looked into it too much.
link |
00:49:06.600
Actually, I quite seriously looked at the literature.
link |
00:49:08.440
It's not obvious to me that from the eyes and so on,
link |
00:49:10.840
you can tell.
link |
00:49:11.680
You might need stuff from the car as well.
link |
00:49:13.440
You might need how they're controlling the car, right?
link |
00:49:15.760
And that's fundamentally at the end of the day
link |
00:49:17.360
what you care about.
link |
00:49:18.640
But I think, especially when people are really drunk,
link |
00:49:21.640
they're not controlling the car nearly as smoothly
link |
00:49:23.640
as they would look at them walking, right?
link |
00:49:25.160
They're, the car is like an extension of the body.
link |
00:49:27.240
So I think you could totally detect.
link |
00:49:29.360
And if you could fix people who are drunk,
link |
00:49:30.880
distracted, asleep, if you fix those three.
link |
00:49:32.840
Yeah, that's a huge, that's huge.
link |
00:49:35.480
So what are the current limitations of OpenPilot?
link |
00:49:38.240
What are the main problems that still need to be solved?
link |
00:49:41.720
We're hopefully fixing a few of them in 0.6.
link |
00:49:45.440
We're not as good as autopilot at stopped cars.
link |
00:49:49.440
So if you're coming up to a red light at like 55,
link |
00:49:55.200
so it's the radar stopped car problem,
link |
00:49:56.880
which is responsible for two autopilot accidents,
link |
00:49:59.200
it's hard to differentiate a stopped car
link |
00:50:01.480
from, like, a signpost.
link |
00:50:03.640
Yeah, static object.
link |
00:50:05.320
So you have to fuse, you have to do this visually.
link |
00:50:07.520
There's no way from the radar data to tell the difference.
link |
00:50:09.600
Maybe you can make a map,
link |
00:50:10.680
but I don't really believe in mapping at all anymore.
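A hedged sketch of the radar-plus-vision fusion implied by the stopped car problem: radar alone can't tell a stopped car from a signpost, since both are static returns, so gate static radar tracks on a vision classifier's confidence. Thresholds are made up.

```python
# Fuse a radar track with a vision confidence to decide whether a static
# return is a braking target. All thresholds here are illustrative.
from dataclasses import dataclass

@dataclass
class RadarTrack:
    distance_m: float
    rel_speed_mps: float  # relative to ego; a stopped object closes at -v_ego

def is_braking_target(track: RadarTrack, v_ego_mps: float,
                      vision_car_prob: float) -> bool:
    is_static = abs(track.rel_speed_mps + v_ego_mps) < 0.5  # ~0 m/s in world frame
    if not is_static:
        return True                  # a moving lead car: radar alone suffices
    return vision_car_prob > 0.8     # static: brake only if vision agrees it's a car

# At ~55 mph (24.6 m/s), a static return is ignored unless vision sees a car.
print(is_braking_target(RadarTrack(80.0, -24.6), 24.6, vision_car_prob=0.95))
```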
link |
00:50:13.840
Wait, wait, wait, what?
link |
00:50:14.920
You don't believe in mapping?
link |
00:50:16.040
No.
link |
00:50:16.880
So you're basically, the OpenPilot solution is saying,
link |
00:50:21.120
react to the environment as you see it,
link |
00:50:22.480
just like human beings do.
link |
00:50:24.480
And then eventually when you want to do navigate
link |
00:50:26.200
on OpenPilot, I'll train the net to look at Waze.
link |
00:50:30.400
I'll run Waze in the background,
link |
00:50:31.360
and I'll train on Waze.
link |
00:50:32.200
Are you using GPS at all?
link |
00:50:33.560
We use it to ground truth.
link |
00:50:34.840
We use it to very carefully ground truth the paths.
link |
00:50:37.440
We have a stack which can recover relative
link |
00:50:39.560
to 10 centimeters over one minute.
link |
00:50:41.800
And then we use that to ground truth
link |
00:50:43.440
exactly where the car went in that local part
link |
00:50:45.880
of the environment, but it's all local.
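A minimal sketch of the local ground-truthing idea: dead-reckon a path from high-rate pose increments, accepting small drift because everything stays in a local frame. The integration scheme here is the simplest possible stand-in, not comma's actual stack.

```python
# Dead-reckon a local path from yaw rate (rad/s) and speed (m/s) samples.
# Drift on the order of 10 cm per minute is acceptable because the ground
# truth is only ever used locally.
import numpy as np

def integrate_path(yaw_rates, speeds, dt=0.01):
    x, y, yaw = 0.0, 0.0, 0.0
    path = [(x, y)]
    for w, v in zip(yaw_rates, speeds):
        yaw += w * dt
        x += v * np.cos(yaw) * dt
        y += v * np.sin(yaw) * dt
        path.append((x, y))
    return np.array(path)

# One minute of gentle curve at 20 m/s, sampled at 100 Hz:
path = integrate_path(np.full(6000, 0.01), np.full(6000, 20.0))
print(path[-1])  # where the car ended up in the local frame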
link |
00:50:47.800
How are you testing in general?
link |
00:50:49.160
Just for yourself, like experiments and stuff.
link |
00:50:51.400
Where are you located?
link |
00:50:54.000
San Diego.
link |
00:50:54.840
San Diego.
link |
00:50:55.680
Yeah.
link |
00:50:56.520
Okay.
link |
00:50:57.360
So you basically drive around there,
link |
00:50:59.760
collect some data and watch the performance?
link |
00:51:02.200
We have a simulator now and we have,
link |
00:51:04.800
our simulator is really cool.
link |
00:51:06.440
Our simulator is not,
link |
00:51:08.120
it's not like a Unity based simulator.
link |
00:51:09.720
Our simulator lets us load in real state.
link |
00:51:12.880
What do you mean?
link |
00:51:13.720
We can load in a drive and simulate
link |
00:51:16.760
what the system would have done on the historical data.
link |
00:51:20.280
Ooh, nice.
link |
00:51:22.520
Interesting.
link |
00:51:24.360
Right now we're only using it for testing,
link |
00:51:26.080
but as soon as we start using it for training.
link |
00:51:28.640
That's it.
link |
00:51:29.480
That's all set up for us.
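A sketch of a replay-style test harness in that spirit: run the current model over logged frames and score it against where the car actually went. The loader and model below are toy stand-ins so the harness runs end to end.

```python
# Replay harness sketch: evaluate a path-predicting model on logged drives.
import numpy as np

def replay_drive(frames, logged_paths, model):
    """frames: iterable of images; logged_paths: ground-truth future paths."""
    errors = []
    for frame, truth in zip(frames, logged_paths):
        pred = model(frame)  # predicted path as an (N, 2) array
        errors.append(np.mean(np.linalg.norm(pred - truth, axis=1)))
    return float(np.mean(errors))  # average path error, meters

# Toy stand-ins so the harness actually runs:
fake_frames = [np.zeros((128, 256, 3)) for _ in range(10)]
fake_truth = [np.zeros((50, 2)) for _ in range(10)]
fake_model = lambda img: np.zeros((50, 2))
print(replay_drive(fake_frames, fake_truth, fake_model))  # 0.0 on toy data
```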
link |
00:51:30.840
What's your feeling about the real world versus simulation?
link |
00:51:33.040
Do you like simulation for training?
link |
00:51:34.320
If this moves to training?
link |
00:51:35.720
So we have to distinguish two types of simulators, right?
link |
00:51:40.040
There's a simulator that like is completely fake.
link |
00:51:44.720
I could get my car to drive around in GTA.
link |
00:51:47.800
I feel that this kind of simulator is useless.
link |
00:51:51.880
You're never, there's so many.
link |
00:51:54.640
My analogy here is like, okay, fine.
link |
00:51:57.000
You're not solving the computer vision problem,
link |
00:51:59.920
but you're solving the computer graphics problem.
link |
00:52:02.440
Right.
link |
00:52:03.280
And you don't think you can get very far
link |
00:52:04.600
by creating ultra realistic graphics?
link |
00:52:08.040
No, because you can create ultra realistic graphics
link |
00:52:10.360
of the road. Now create ultra realistic behavioral models
link |
00:52:13.160
of the other cars.
link |
00:52:14.600
Oh, well, I'll just use myself driving.
link |
00:52:16.920
No, you won't.
link |
00:52:18.280
You need real, you need actual human behavior
link |
00:52:21.640
because that's what you're trying to learn.
link |
00:52:23.320
The driving does not have a spec.
link |
00:52:25.840
The definition of driving is what humans do when they drive.
link |
00:52:29.920
Whatever Waymo does, I don't think it's driving.
link |
00:52:32.800
Right.
link |
00:52:33.640
Well, I think actually Waymo and others,
link |
00:52:36.400
if there's any use for reinforcement learning,
link |
00:52:38.920
I've seen it used quite well.
link |
00:52:40.360
I studied pedestrians a lot too,
link |
00:52:41.640
is try to train models from real data
link |
00:52:44.360
of how pedestrians move and try to use reinforcement learning
link |
00:52:46.920
models to make pedestrians move in human like ways.
link |
00:52:50.040
By that point, you've already gone so many layers,
link |
00:52:53.520
you detected a pedestrian.
link |
00:52:55.680
Did you hand code the feature vector of their state?
link |
00:52:59.640
Right.
link |
00:53:00.480
Did you guys learn anything from computer vision
link |
00:53:02.880
before deep learning?
link |
00:53:04.600
Well, okay, I feel like this is...
link |
00:53:07.160
So perception to you is the sticking point.
link |
00:53:10.840
I mean, what's the hardest part of the stack here?
link |
00:53:13.760
There is no human understandable feature vector
link |
00:53:19.680
separating perception and planning.
link |
00:53:23.040
That's the best way I can put that.
link |
00:53:25.120
There is no...
link |
00:53:25.960
So it's all together and it's a joint problem.
link |
00:53:29.600
So you can take localization.
link |
00:53:31.480
Localization and planning,
link |
00:53:32.960
there is a human understandable feature vector
link |
00:53:34.760
between these two things.
link |
00:53:36.000
I mean, okay, so I have like three degrees position,
link |
00:53:38.720
three degrees orientation and those derivatives,
link |
00:53:40.560
maybe those second derivatives, right?
link |
00:53:42.000
That's human understandable, that's physical.
link |
00:53:44.520
But between perception and planning?
link |
00:53:49.520
So like Waymo has a perception stack and then a planner.
link |
00:53:53.600
And one of the things Waymo does right
link |
00:53:55.560
is they have a simulator that can separate those two.
link |
00:54:00.000
They can like replay their perception data
link |
00:54:02.920
and test their system,
link |
00:54:03.920
which is what I'm talking about
link |
00:54:04.880
about like the two different kinds of simulators.
link |
00:54:06.520
There's the kind that can work on real data
link |
00:54:08.240
and there's the kind that can't work on real data.
link |
00:54:10.920
Now, the problem is that I don't think
link |
00:54:13.880
you can hand code a feature vector, right?
link |
00:54:16.160
Like you have some list of like,
link |
00:54:17.360
well, here's my list of cars in the scenes.
link |
00:54:19.040
Here's my list of pedestrians in the scene.
link |
00:54:21.280
This isn't what humans are doing.
link |
00:54:23.240
What are humans doing?
link |
00:54:24.920
Global.
link |
00:54:27.200
Some, some.
link |
00:54:28.040
You're saying that's too difficult to hand engineer.
link |
00:54:31.960
I'm saying that there is no state vector.
link |
00:54:34.120
Given a perfect, I could give you the best team
link |
00:54:36.560
of engineers in the world to build a perception system
link |
00:54:38.520
and the best team to build a planner.
link |
00:54:40.640
All you have to do is define the state vector
link |
00:54:42.640
that separates those two.
link |
00:54:43.960
I'm missing the state vector that separates those two.
link |
00:54:48.560
What do you mean?
link |
00:54:49.400
So what is the output of your perception system?
link |
00:54:54.000
Output of the perception system.
link |
00:54:56.880
It's, there's, okay, well, there's several ways to do it.
link |
00:55:01.560
One is the SLAM component, localization.
link |
00:55:03.840
The other is drivable area, drivable space.
link |
00:55:05.920
Drivable space, yep.
link |
00:55:06.760
And then there's the different objects in the scene.
link |
00:55:09.000
Yep.
link |
00:55:11.000
And different objects in the scene over time maybe
link |
00:55:16.000
to give you input to then try to start
link |
00:55:18.720
modeling the trajectories of those objects.
link |
00:55:21.560
Sure.
link |
00:55:22.400
That's it.
link |
00:55:23.240
I can give you a concrete example of something you missed.
link |
00:55:25.160
What's that?
link |
00:55:26.000
So say there's a bush in the scene.
link |
00:55:28.640
Humans understand that when they see this bush
link |
00:55:30.920
that there may or may not be a car behind that bush.
link |
00:55:34.680
Drivable area and a list of objects does not include that.
link |
00:55:37.280
Humans are doing this constantly
link |
00:55:38.920
at the simplest intersections.
link |
00:55:40.920
So now you have to talk about occluded area.
link |
00:55:43.880
Right.
link |
00:55:44.720
Right, but even that, what do you mean by occluded?
link |
00:55:47.800
Okay, so I can't see it.
link |
00:55:49.640
Well, if it's on the other side of a house, I don't care.
link |
00:55:51.840
What's the likelihood that there's a car
link |
00:55:53.560
in that occluded area, right?
link |
00:55:55.280
And if you say, okay, we'll add that,
link |
00:55:58.080
I can come up with 10 more examples that you can't add.
link |
00:56:01.680
Certainly occluded area would be something
link |
00:56:03.960
that a simulator would have, because it's simulating
link |
00:56:06.760
the entire, you know, occlusion is part of it.
link |
00:56:11.320
Occlusion is part of a vision stack.
link |
00:56:12.680
Vision stack.
link |
00:56:13.520
But what I'm saying is if you have a hand engineered,
link |
00:56:16.600
if your perception system output can be written
link |
00:56:20.040
in a spec document, it is incomplete.
link |
00:56:23.120
Yeah, I mean, I certainly, it's hard to argue with that
link |
00:56:27.800
because in the end, that's going to be true.
link |
00:56:30.120
Yeah, and I'll tell you what the output
link |
00:56:31.760
of our perception system is.
link |
00:56:32.720
What's that?
link |
00:56:33.560
It's a 1024 dimensional vector.
link |
00:56:37.120
Trained in a neural net.
link |
00:56:38.000
Oh, you know that.
link |
00:56:39.000
No, that's the 1024 dimensions of who knows what.
link |
00:56:43.520
Because it's operating on real data.
link |
00:56:45.160
Yeah.
link |
00:56:47.000
And that's the perception.
link |
00:56:48.320
That's the perception state, right?
link |
00:56:50.360
Think about an autoencoder for faces, right?
link |
00:56:53.520
If you have an autoencoder for faces
link |
00:56:54.720
and you say it has 256 dimensions in the middle,
link |
00:56:59.720
and I'm taking a face over here
link |
00:57:00.680
and projecting it to a face over here.
link |
00:57:02.800
Can you hand label all 256 of those dimensions?
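A tiny PyTorch autoencoder makes the point concrete: the bottleneck is whatever the optimizer finds, and no spec attaches meaning to dimension k. Sizes are arbitrary.

```python
# Minimal autoencoder sketch: the latent vector has no human-readable spec.
import torch
import torch.nn as nn

class FaceAutoencoder(nn.Module):
    def __init__(self, latent_dim=256):
        super().__init__()
        self.encoder = nn.Sequential(nn.Flatten(), nn.Linear(64 * 64, 1024),
                                     nn.ReLU(), nn.Linear(1024, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 1024), nn.ReLU(),
                                     nn.Linear(1024, 64 * 64))

    def forward(self, x):
        z = self.encoder(x)  # 256 numbers with no agreed-upon meaning
        return self.decoder(z), z

model = FaceAutoencoder()
recon, z = model(torch.randn(1, 64, 64))
print(z.shape)  # torch.Size([1, 256]): try writing a spec for these
```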
link |
00:57:06.280
Well, no, but those are generated automatically.
link |
00:57:09.240
But even if you tried to do it by hand,
link |
00:57:11.360
could you come up with a spec between your encoder
link |
00:57:15.520
and your decoder?
link |
00:57:17.400
No, no, because it wasn't designed, but they're...
link |
00:57:20.720
No, no, no, but if you could design it,
link |
00:57:23.600
if you could design a face reconstructor system,
link |
00:57:26.480
could you come up with a spec?
link |
00:57:29.240
No, but I think we're missing here a little bit.
link |
00:57:32.320
I think you're just being very poetic
link |
00:57:35.080
about expressing a fundamental problem of simulators,
link |
00:57:37.880
that they are going to be missing so much
link |
00:57:42.480
that the features actually
link |
00:57:44.680
would just look fundamentally different
link |
00:57:47.080
in the simulated world than in the real world.
link |
00:57:51.280
I'm not making a claim about simulators.
link |
00:57:53.800
I'm making a claim about the spec division
link |
00:57:57.120
between perception and planning.
link |
00:57:58.800
And planning.
link |
00:57:59.640
Even in your system.
link |
00:58:00.840
Just in general.
link |
00:58:01.800
Right, just in general.
link |
00:58:03.360
If you're trying to build a car that drives,
link |
00:58:05.680
if you're trying to hand code
link |
00:58:07.280
the output of your perception system,
link |
00:58:08.760
like saying, here's a list of all the cars in the scene.
link |
00:58:10.960
Here's a list of all the people.
link |
00:58:11.920
Here's a list of the occluded areas.
link |
00:58:13.120
Here's a vector of drivable areas.
link |
00:58:14.920
It's insufficient.
link |
00:58:16.600
And if you start to believe that,
link |
00:58:18.000
you realize that what Waymo and Cruise are doing is impossible.
link |
00:58:20.840
Currently, what we're doing is the perception problem
link |
00:58:24.320
is converting the scene into a chessboard.
link |
00:58:29.200
And then you reason some basic reasoning
link |
00:58:31.720
around that chessboard.
link |
00:58:33.400
And you're saying that really there's a lot missing there.
link |
00:58:38.080
First of all, why are we talking about this?
link |
00:58:40.240
Because isn't this full autonomy?
link |
00:58:42.840
Is this something you think about?
link |
00:58:44.720
Oh, I want to win self driving cars.
link |
00:58:47.680
So your definition of win includes the full five.
link |
00:58:53.680
I don't think level four is a real thing.
link |
00:58:55.800
I want to build the AlphaGo of driving.
link |
00:58:59.720
So AlphaGo is really end to end.
link |
00:59:06.160
Yeah.
link |
00:59:07.000
Is, yeah, it's end to end.
link |
00:59:09.840
And do you think this whole problem,
link |
00:59:12.480
is that also kind of what you're getting at
link |
00:59:14.680
with the perception and the planning?
link |
00:59:16.640
Is that this whole problem, the right way to do it,
link |
00:59:19.440
is really to learn the entire thing?
link |
00:59:21.600
I'll argue that not only is it the right way,
link |
00:59:23.680
it's the only way that's going to exceed human performance.
link |
00:59:27.640
Well, it's certainly true for Go.
link |
00:59:29.960
Everyone who tried to hand code Go things
link |
00:59:31.520
built human inferior things.
link |
00:59:33.440
And then someone came along and wrote some 10,000 line thing
link |
00:59:36.200
that doesn't know anything about Go that beat everybody.
link |
00:59:39.800
It's 10,000 lines.
link |
00:59:41.080
True, in that sense.
link |
00:59:43.360
The open question then that maybe I can ask you
link |
00:59:47.520
is driving is much harder than Go.
link |
00:59:53.440
The open question is how much harder?
link |
00:59:56.240
So how, because I think the Elon Musk approach here
link |
00:59:59.480
with planning and perception is similar
link |
01:00:01.600
to what you're describing,
link |
01:00:02.960
which is really turning into not some kind of modular thing,
link |
01:00:08.280
but really do formulate as a learning problem
link |
01:00:11.120
and solve the learning problem with scale.
link |
01:00:13.360
So how many years, one,
link |
01:00:17.120
how many years would it take to solve this problem
link |
01:00:18.880
or just how hard is this freaking problem?
link |
01:00:21.680
Well, the cool thing is,
link |
01:00:24.560
I think there's a lot of value
link |
01:00:27.800
that we can deliver along the way.
link |
01:00:30.840
I think that you can build lane keeping assist
link |
01:00:36.600
actually plus adaptive cruise control plus, okay,
link |
01:00:41.440
looking at Waze, extends to like all of driving.
link |
01:00:46.000
Yeah, most of driving, right?
link |
01:00:47.920
Oh, your adaptive cruise control treats red lights
link |
01:00:49.760
like cars, okay.
link |
01:00:51.200
So let's jump around with you mentioned
link |
01:00:53.480
that you didn't like Navigate on Autopilot.
link |
01:00:55.760
What advice, how would you make it better?
link |
01:00:57.760
Do you think as a feature that if it's done really well,
link |
01:01:00.560
it's a good feature?
link |
01:01:02.360
I think that it's too reliant on like hand coded hacks
link |
01:01:07.520
for like, how does Navigate on Autopilot do a lane change?
link |
01:01:10.400
It actually does the same lane change every time
link |
01:01:13.400
and it feels mechanical.
link |
01:01:14.320
Humans do different lane changes.
link |
01:01:15.920
Humans, sometimes we'll do a slow one,
link |
01:01:17.360
sometimes do a fast one.
link |
01:01:18.920
Navigate on Autopilot, at least every time I use it,
link |
01:01:20.880
does the identical lane change.
link |
01:01:23.040
How do you learn?
link |
01:01:24.280
I mean, this is a fundamental thing actually
link |
01:01:26.800
is the braking and accelerating,
link |
01:01:30.400
something that still, Tesla probably does it better
link |
01:01:33.960
than most cars, but it still doesn't do a great job
link |
01:01:36.800
of creating a comfortable natural experience
link |
01:01:39.960
and Navigate on Autopilot is just lane changes
link |
01:01:42.680
and extension of that.
link |
01:01:44.120
So how do you learn to do natural lane change?
link |
01:01:49.120
So we have it and I can talk about how it works.
link |
01:01:52.920
So I feel that we have the solution for lateral
link |
01:01:58.720
but we don't yet have the solution for longitudinal.
link |
01:02:00.640
There's a few reasons longitudinal is harder than lateral.
link |
01:02:03.360
The lane change component, the way that we train on it
link |
01:02:06.920
very simply is like our model has an input
link |
01:02:10.840
for whether it's doing a lane change or not.
link |
01:02:14.040
And then when we train the end to end model,
link |
01:02:16.360
we hand label all the lane changes because you have to.
link |
01:02:19.560
I've struggled a long time about not wanting to do that
link |
01:02:22.440
but I think you have to.
link |
01:02:24.280
Or the training data.
link |
01:02:25.320
For the training data, right?
link |
01:02:26.520
We actually have an automatic ground truther
link |
01:02:28.280
which automatically labels all the lane changes.
link |
01:02:30.600
How is that possible?
link |
01:02:31.680
To automatically label lane changes?
link |
01:02:32.720
Yeah.
link |
01:02:33.560
Detect the lane lines, see when it crosses them, right?
link |
01:02:34.800
And I don't have to get that high percent accuracy
link |
01:02:36.680
but like 95% is good enough.
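A sketch of such an automatic labeler: flag frames where the lateral offset from the current lane center crosses half a lane width, padded so the whole maneuver is marked. The threshold and padding are illustrative; the point is that a roughly 95% accurate heuristic is enough to set the training bit.

```python
# Auto-label lane changes from per-frame lateral offsets (meters).
import numpy as np

LANE_WIDTH_M = 3.7

def label_lane_changes(lateral_offsets_m, pad_frames=20):
    offsets = np.asarray(lateral_offsets_m)
    crossing = np.abs(offsets) > LANE_WIDTH_M / 2  # crossed into the next lane
    labels = np.zeros(len(offsets), dtype=bool)
    for i in np.flatnonzero(crossing):             # pad so the whole maneuver,
        labels[max(0, i - pad_frames):i + pad_frames] = True  # not just the crossing, is marked
    return labels

offsets = np.concatenate([np.zeros(50), np.linspace(0, 3.7, 40), np.zeros(50)])
print(label_lane_changes(offsets).sum(), "frames labeled as lane change")
```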
link |
01:02:38.080
Okay.
link |
01:02:38.960
Now I set the bit when it's doing the lane change
link |
01:02:43.200
in the end to end learning.
link |
01:02:44.840
And then I set it to zero when it's not doing a lane change.
link |
01:02:47.920
So now if I want it to do a lane change at test time,
link |
01:02:49.720
I just put the bit to a one and it'll do a lane change.
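A sketch of that conditioning, assuming a PyTorch-style model: the lane-change bit is one extra input, set from the auto-labeler at training time and by the user at test time. Architecture and sizes are invented, not comma's model.

```python
# End-to-end policy conditioned on a lane-change bit.
import torch
import torch.nn as nn

class ConditionedPolicy(nn.Module):
    def __init__(self):
        super().__init__()
        self.vision = nn.Sequential(nn.Flatten(),
                                    nn.Linear(3 * 64 * 64, 512), nn.ReLU())
        self.head = nn.Linear(512 + 1, 50 * 2)  # +1 for the lane-change bit

    def forward(self, img, lane_change_bit):
        feat = self.vision(img)
        x = torch.cat([feat, lane_change_bit], dim=1)
        return self.head(x).view(-1, 50, 2)     # predicted path, 50 points

policy = ConditionedPolicy()
img = torch.randn(1, 3, 64, 64)
keep_lane = policy(img, torch.zeros(1, 1))  # bit = 0: stay in lane
change = policy(img, torch.ones(1, 1))      # bit = 1: execute a lane change
print(keep_lane.shape, change.shape)
```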
link |
01:02:52.360
Yeah, but so if you look at the space of lane changes,
link |
01:02:54.640
you know some percentage, not a hundred percent,
link |
01:02:57.320
that we make as humans is not a pleasant experience
link |
01:03:01.120
because we messed some part of it up.
link |
01:03:02.800
It's nerve wracking to change lanes.
link |
01:03:04.320
If you look, you have to see,
link |
01:03:05.760
it has to accelerate.
link |
01:03:06.920
How do we label the ones that are natural and feel good?
link |
01:03:09.920
You know, that's the,
link |
01:03:11.560
because that's your ultimate criticism,
link |
01:03:13.360
the current navigate and autopilot just doesn't feel good.
link |
01:03:17.000
Well, the current navigate and autopilot
link |
01:03:18.520
is a hand coded policy written by an engineer in a room
link |
01:03:21.720
who probably went out and tested it a few times on the 280.
link |
01:03:25.080
Probably a more, a better version of that.
link |
01:03:28.560
But yes.
link |
01:03:29.400
That's how we would have written it.
link |
01:03:30.560
Yeah.
link |
01:03:31.400
Maybe Tesla did. At Tesla, they tested it in...
link |
01:03:33.480
That might have been two engineers.
link |
01:03:34.920
Two engineers.
link |
01:03:35.760
Yeah.
link |
01:03:37.400
No, but so if you learn the lane change,
link |
01:03:40.120
if you learn how to do a lane change from data,
link |
01:03:42.480
just like you have a label that says lane change
link |
01:03:44.680
and then you put it in when you want it to do the lane change,
link |
01:03:48.040
it'll automatically do the lane change
link |
01:03:49.640
that's appropriate for the situation.
link |
01:03:51.600
Now, to get at the problem of some humans
link |
01:03:54.720
do bad lane changes,
link |
01:03:57.400
we haven't worked too much on this problem yet.
link |
01:03:59.920
It's not that much of a problem in practice.
link |
01:04:03.120
My theory is that all good drivers are good in the same way
link |
01:04:06.160
and all bad drivers are bad in different ways.
link |
01:04:09.360
And we've seen some data to back this up.
link |
01:04:11.320
Well, beautifully put.
link |
01:04:12.400
So you just basically, if that hypothesis is true,
link |
01:04:16.560
then your task is to discover the good drivers.
link |
01:04:19.920
The good drivers stand out
link |
01:04:21.800
because they're in one cluster
link |
01:04:23.360
and the bad drivers are scattered all over the place
link |
01:04:25.200
and your net learns the cluster.
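A toy illustration of that clustering claim on synthetic lane-change features (duration, peak lateral acceleration, jerk): a density-based clusterer finds one tight "good" cluster and leaves the scattered drivers as outliers. Data and parameters are made up.

```python
# Synthetic demo: consistent drivers form a dense cluster, bad ones scatter.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
good = rng.normal(loc=[4.0, 1.5, 0.8], scale=0.2, size=(200, 3))    # consistent
bad = rng.uniform(low=[1, 0.5, 0.2], high=[10, 5, 4], size=(20, 3))  # scattered
features = np.vstack([good, bad])

labels = DBSCAN(eps=0.5, min_samples=10).fit_predict(features)
print("cluster sizes:", np.bincount(labels[labels >= 0]))
print("outliers (bad drivers?):", int((labels == -1).sum()))
```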
link |
01:04:27.240
Yeah.
link |
01:04:28.080
So you just learn from the good drivers
link |
01:04:30.800
and they're easy to cluster.
link |
01:04:33.200
In fact, we learned from all of them
link |
01:04:34.240
and the net automatically learns the policy
link |
01:04:35.840
that's like the majority.
link |
01:04:36.920
But we'll eventually probably have to filter some out.
link |
01:04:38.440
So if that theory is true, I hope it's true
link |
01:04:41.560
because the counter theory is there are many clusters,
link |
01:04:49.480
maybe arbitrarily many clusters of good drivers.
link |
01:04:53.680
Because if there's one cluster of good drivers,
link |
01:04:55.840
you can at least discover a set of policies.
link |
01:04:57.600
You can learn a set of policies
link |
01:04:59.000
which would be good universally.
link |
01:05:00.640
Yeah.
link |
01:05:01.640
That would be nice if it's true.
link |
01:05:04.560
And you're saying that there is some evidence that...
link |
01:05:06.560
Let's say lane changes can be clustered into four clusters.
link |
01:05:09.720
Right.
link |
01:05:10.560
There's a finite level of...
link |
01:05:12.040
I would argue that all four of those are good clusters.
link |
01:05:15.280
All the things that are random are noise and probably bad.
link |
01:05:18.360
And which one of the four do you pick?
link |
01:05:20.360
Or maybe it's 10 or maybe it's 20.
link |
01:05:21.920
You can learn that.
link |
01:05:22.760
It's context dependent.
link |
01:05:23.800
It depends on the scene.
link |
01:05:26.760
And the hope is it's not too dependent on the driver.
link |
01:05:31.400
Yeah, the hope is that it all washes out.
link |
01:05:34.240
The hope is that the distribution is not bimodal.
link |
01:05:36.960
The hope is that it's a nice Gaussian.
link |
01:05:39.080
So what advice would you give to Tesla?
link |
01:05:41.640
How to fix, how to improve Navigate on Autopilot?
link |
01:05:45.000
Based on the lessons that you've learned at Comma AI.
link |
01:05:48.240
The only real advice I would give to Tesla
link |
01:05:50.560
is please put driver monitoring in your cars.
link |
01:05:53.920
With respect to improving it.
link |
01:05:55.160
You can't do that anymore.
link |
01:05:56.000
I started to interrupt.
link |
01:05:57.280
But there's a practical nature of many hundreds of thousands
link |
01:06:01.760
of cars being produced that don't have a good driver facing camera.
link |
01:06:05.760
The Model 3 has a selfie cam.
link |
01:06:07.520
Is it not good enough?
link |
01:06:08.680
Did they not put IR LEDs in for night?
link |
01:06:10.800
That's a good question.
link |
01:06:11.640
But I do know that it's fish eye
link |
01:06:13.360
and it's relatively low resolution.
link |
01:06:15.800
So it's really not designed.
link |
01:06:16.760
It wasn't designed for driver monitoring.
link |
01:06:18.760
You can hope that you can kind of scrape up
link |
01:06:21.760
and have something from it.
link |
01:06:24.400
But why didn't they put it in today?
link |
01:06:27.520
Put it in today.
link |
01:06:28.280
Put it in today.
link |
01:06:29.520
Every time I've heard Karpathy talk about the problem
link |
01:06:31.520
and talking about like software 2.0
link |
01:06:33.240
and how the machine learning is gobbling up everything,
link |
01:06:35.240
I think this is absolutely the right strategy.
link |
01:06:37.440
I think that he didn't write Navigate on Autopilot.
link |
01:06:40.160
I think somebody else did and kind of hacked it on top of that stuff.
link |
01:06:43.240
I think when Karpathy says, wait a second,
link |
01:06:45.680
why did we hand code this lane change policy
link |
01:06:47.440
with all these magic numbers?
link |
01:06:48.360
We're going to learn it from data.
link |
01:06:49.360
They'll fix it.
link |
01:06:49.840
They already know what to do there.
link |
01:06:51.040
Well, that's Andrej's job, to turn everything
link |
01:06:54.360
into a learning problem and collect a huge amount of data.
link |
01:06:57.480
The reality is, though, not every problem
link |
01:07:01.120
can be turned into a learning problem in the short term.
link |
01:07:04.080
In the end, everything will be a learning problem.
link |
01:07:07.280
The reality is, like if you want to build L5 vehicles today,
link |
01:07:12.880
it will likely involve no learning.
link |
01:07:15.600
And that's the reality. So at which point does learning start?
link |
01:07:20.320
It's like the crutch statement, that LiDAR is a crutch.
link |
01:07:23.480
At which point will learning get up to par with human performance?
link |
01:07:27.240
It's over human performance on ImageNet classification.
link |
01:07:31.960
On driving, it's a question still.
link |
01:07:34.000
It is a question.
link |
01:07:35.760
I'll say this, I'm here to play for 10 years.
link |
01:07:39.160
I'm not here to try to.
link |
01:07:40.280
I'm here to play for 10 years and make money along the way.
link |
01:07:42.960
I'm not here to try to promise people
link |
01:07:45.040
that I'm going to have my L5 taxi network up and working
link |
01:07:47.600
in two years.
link |
01:07:48.200
Do you think that was a mistake?
link |
01:07:49.400
Yes.
link |
01:07:50.520
What do you think was the motivation behind saying
link |
01:07:53.160
that? Other companies are also promising L5 vehicles
link |
01:07:56.640
with their different approaches in 2020, 2021, 2022?
link |
01:08:01.880
If anybody would like to bet me that those things do not pan out,
link |
01:08:05.720
I will bet you.
link |
01:08:07.000
Even money, even money, I'll bet you as much as you want.
link |
01:08:10.800
So are you worried about what's going to happen?
link |
01:08:13.600
Because you're not in full agreement on that.
link |
01:08:16.040
What's going to happen when 2022, 2021 come around
link |
01:08:19.160
and nobody has fleets of autonomous vehicles?
link |
01:08:22.800
Well, you can look at the history.
link |
01:08:25.000
If you go back five years ago, they
link |
01:08:26.880
were all promised by 2018 and 2017.
link |
01:08:29.880
But they weren't that strong of promises.
link |
01:08:32.200
I mean, Ford really declared.
link |
01:08:36.240
I think not many have declared as definitively
link |
01:08:40.560
as they have now these dates.
link |
01:08:42.600
Well, OK.
link |
01:08:43.320
So let's separate L4 and L5.
link |
01:08:45.040
Do I think that it's possible for Waymo
link |
01:08:46.800
to continue to hack on their system
link |
01:08:50.960
until it gets to level four in Chandler, Arizona?
link |
01:08:53.400
Yes.
link |
01:08:55.040
No safety driver?
link |
01:08:56.800
Chandler, Arizona?
link |
01:08:57.600
Yeah.
link |
01:08:59.600
By which year are we talking about?
link |
01:09:02.440
Oh, I even think that's possible by like 2020, 2021.
link |
01:09:06.120
But level four, Chandler, Arizona, not level five,
link |
01:09:09.480
New York City.
link |
01:09:11.480
Level four, meaning some very defined streets.
link |
01:09:15.920
It works out really well.
link |
01:09:17.400
Very defined streets.
link |
01:09:18.280
And then practically, these streets are pretty empty.
link |
01:09:20.680
If most of the streets are covered in Waymos,
link |
01:09:24.680
Waymo can kind of change the definition of what driving is.
link |
01:09:28.360
Right?
link |
01:09:28.920
If your self driving network is the majority
link |
01:09:31.720
of cars in an area, they only need
link |
01:09:34.120
to be safe with respect to each other,
link |
01:09:35.720
and all the humans will need to learn to adapt to them.
link |
01:09:38.640
Now go drive in downtown New York.
link |
01:09:41.120
Oh, yeah, that's.
link |
01:09:42.200
I mean, already.
link |
01:09:43.440
You can talk about autonomy, like on farms,
link |
01:09:46.040
it already works great, because you can really just
link |
01:09:48.520
follow the GPS line.
link |
01:09:51.320
So what does success look like for Comma AI?
link |
01:09:56.800
What are the milestones like where
link |
01:09:58.200
you can sit back with some champagne
link |
01:09:59.800
and say, we did it, boys and girls?
link |
01:10:04.120
Well, it's never over.
link |
01:10:06.320
Yeah, but don't be so.
link |
01:10:07.800
You must drink champagne every time you celebrate.
link |
01:10:10.400
So what is good?
link |
01:10:11.440
What are some wins?
link |
01:10:13.160
A big milestone that we're hoping for by mid next year
link |
01:10:19.480
is profitability of the company.
link |
01:10:20.680
And we're going to have to revisit the idea of selling
link |
01:10:28.560
a consumer product.
link |
01:10:30.280
But it's not going to be like the Comma One.
link |
01:10:32.720
When we do it, it's going to be perfect.
link |
01:10:36.240
OpenPilot has gotten so much better in the last two years.
link |
01:10:39.600
We're going to have a few features.
link |
01:10:41.680
We're going to have 100% driver monitoring.
link |
01:10:43.760
We're going to disable no safety features in the car.
link |
01:10:46.720
Actually, I think it'd be really cool what we're doing right
link |
01:10:48.760
now, our project this week is we're analyzing the data set
link |
01:10:51.600
and looking for all the AEB triggers
link |
01:10:53.240
from the manufacturer systems.
link |
01:10:55.640
We have a better data set on that than the manufacturers.
link |
01:10:59.440
Does Toyota have 10 million miles of real world
link |
01:11:02.960
driving to know how many times their AEB triggered?
link |
01:11:05.360
So let me give you, because you asked, financial advice.
link |
01:11:10.880
Because I work with a lot of automakers
link |
01:11:12.440
and one possible source of money for you,
link |
01:11:15.840
which I'll be excited to see you take on, is basically
link |
01:11:21.400
selling the data, which is something that most people,
link |
01:11:29.120
and not selling it in a way where, here you go, automaker,
link |
01:11:31.800
but creating.
link |
01:11:33.000
We've done this actually at MIT, not for money purposes,
link |
01:11:35.480
but you could do it for significant money purposes
link |
01:11:37.760
and make the world a better place
link |
01:11:39.440
by creating a consortium where automakers would pay in
link |
01:11:44.240
and then they get to have free access to the data.
link |
01:11:46.960
And I think a lot of people are really hungry for that
link |
01:11:52.400
and would pay significant amount of money for it.
link |
01:11:54.200
Here's the problem with that.
link |
01:11:55.400
I like this idea all in theory.
link |
01:11:56.840
It'd be very easy for me to give them access to my servers.
link |
01:11:59.640
And we already have all open source tools to access this data.
link |
01:12:02.280
It's in a great format.
link |
01:12:03.400
We have a great pipeline.
link |
01:12:05.560
But they're going to put me in the room
link |
01:12:07.120
with some business development guy.
link |
01:12:10.120
And I'm going to have to talk to this guy.
link |
01:12:12.400
And he's not going to know most of the words I'm saying.
link |
01:12:15.040
I'm not willing to tolerate that.
link |
01:12:17.280
OK, Mick Jagger.
link |
01:12:18.840
No, no, no, no.
link |
01:12:19.800
But I think I agree with you.
link |
01:12:21.040
I'm the same way.
link |
01:12:21.720
But you just tell them the terms
link |
01:12:22.960
and there's no discussion needed.
link |
01:12:24.640
If I could just tell them the terms, then like, all right.
link |
01:12:30.480
Who wants access to my data?
link |
01:12:31.600
I will sell it to you for, let's say,
link |
01:12:36.680
you want a subscription?
link |
01:12:37.640
I'll sell it to you for 100k a month.
link |
01:12:40.680
Anyone?
link |
01:12:41.200
100k a month?
link |
01:12:42.000
100k a month?
link |
01:12:43.040
I'll give you access to the data subscription?
link |
01:12:45.080
Yeah.
link |
01:12:45.680
Yeah, I think that's kind of fair.
link |
01:12:46.680
Came up with that number off the top of my head.
link |
01:12:48.440
If somebody sends me like a three line email where it's like,
link |
01:12:50.840
we would like to pay 100k a month to get access to your data.
link |
01:12:54.000
We would agree to like reasonable privacy terms
link |
01:12:56.160
of the people who are in the data set.
link |
01:12:58.360
I would be happy to do it.
link |
01:12:59.520
But that's not going to be the email.
link |
01:13:01.200
The email is going to be, hey, do you
link |
01:13:03.120
have some time in the next month where we can sit down
link |
01:13:05.560
and we can, I don't have time for that.
link |
01:13:07.000
We're moving too fast.
link |
01:13:08.360
You could politely respond to that email,
link |
01:13:10.040
but not saying I don't have any time for your bullshit.
link |
01:13:13.240
You say, oh, well, unfortunately, these are the terms.
link |
01:13:15.440
And so this is what we try to, we brought the cost down
link |
01:13:19.280
for you in order to minimize the friction, the communication.
link |
01:13:22.320
Yeah, absolutely.
link |
01:13:22.920
Here's the whatever it is, $1, $2 million a year.
link |
01:13:26.720
And you have access.
link |
01:13:28.880
And it's not like I get that email, like...
link |
01:13:31.440
but OK, am I going to reach out?
link |
01:13:32.720
Am I going to hire a business development person
link |
01:13:34.200
who's going to reach out to the automakers?
link |
01:13:35.840
No way.
link |
01:13:36.480
Yeah.
link |
01:13:36.880
OK, I got you.
link |
01:13:37.840
I admire.
link |
01:13:38.520
If they reached out to me, I'm not
link |
01:13:39.680
going to ignore the email.
link |
01:13:40.600
I'll come back with something like, yeah,
link |
01:13:42.160
if you're willing to pay $100,000 for access to the data,
link |
01:13:44.560
I'm happy to set that up.
link |
01:13:46.080
That's worth my engineering time.
link |
01:13:48.200
That's actually quite insightful of you.
link |
01:13:49.520
You're right.
link |
01:13:50.440
Probably because many of the automakers
link |
01:13:52.480
are quite a bit old school, there
link |
01:13:54.480
will be a need to reach out.
link |
01:13:56.200
And they want it, but there will need
link |
01:13:58.440
to be some communication.
link |
01:13:59.800
You're right.
link |
01:14:00.160
Mobileye circa 2015 had the lowest R&D spend of any chipmaker.
link |
01:14:06.760
Like, per... And you look at all the people who work for them,
link |
01:14:10.640
and it's all business development people
link |
01:14:12.120
because the car companies are impossible to work with.
link |
01:14:15.320
Yeah, so you have no patience for that,
link |
01:14:17.880
and you're a legit Android, huh?
link |
01:14:20.040
I have something to do, right?
link |
01:14:21.440
Like, I don't mean to be a dick and say,
link |
01:14:24.040
I don't have patience for that, but it's like,
link |
01:14:25.920
that stuff doesn't help us with our goal of winning
link |
01:14:29.160
self driving cars.
link |
01:14:30.560
If I want money in the short term,
link |
01:14:33.800
if I showed off the actual learning tech that we have,
link |
01:14:38.040
it's somewhat sad.
link |
01:14:40.160
It's years and years ahead of everybody else's.
link |
01:14:43.000
Maybe not Tesla's.
link |
01:14:43.720
I think Tesla has similar stuff to us, actually.
link |
01:14:45.720
I think Tesla has similar stuff, but when you compare it
link |
01:14:47.640
to what the Toyota Research Institute has,
link |
01:14:50.920
you're not even close to what we have.
link |
01:14:53.480
No comment. But I also can't...
link |
01:14:55.840
I have to take your comments.
link |
01:14:58.440
I intuitively believe you, but I have
link |
01:15:01.960
to take it with a grain of salt because,
link |
01:15:04.680
I mean, you are an inspiration because you basically
link |
01:15:07.440
don't care about a lot of things that other companies care
link |
01:15:10.000
about.
link |
01:15:10.880
You don't try to bullshit, in a sense, like make up stuff,
link |
01:15:16.600
so to drive up valuation.
link |
01:15:18.600
You're really very real, and you're
link |
01:15:19.960
trying to solve the problem, and I admire that a lot.
link |
01:15:22.280
What I can't necessarily fully trust you on, with all due
link |
01:15:26.520
respect is how good it is, right?
link |
01:15:28.440
I can only, but I also know how bad others are.
link |
01:15:33.320
Trust, but verify, right?
link |
01:15:36.680
I'll say two things about that.
link |
01:15:38.040
One is try, get in a 2020 Corolla,
link |
01:15:42.360
and try OpenPilot 0.6 when it comes out next month.
link |
01:15:46.680
I think already, you'll look at this,
link |
01:15:48.400
and you'll be like, this is already really good.
link |
01:15:51.400
And then, I could be doing that all with hand labelers
link |
01:15:54.240
and all with the same approach that Mobileye uses.
link |
01:15:58.000
When we release a model that no longer
link |
01:16:00.040
has the lanes in it, that only outputs a path,
link |
01:16:05.000
then think about how we did that machine learning,
link |
01:16:08.720
and then right away, when you see,
link |
01:16:10.080
and that's going to be an OpenPilot,
link |
01:16:11.240
that's going to be an OpenPilot before 1.0,
link |
01:16:13.000
when you see that model, you'll know
link |
01:16:14.400
that everything I'm saying is true,
link |
01:16:15.360
because how else did I get that model?
link |
01:16:16.840
Good.
link |
01:16:17.320
You'll know what I'm saying is true about the simulator.
link |
01:16:19.240
Yeah, yeah, yeah, this is super exciting.
link |
01:16:20.600
That's super exciting.
link |
01:16:22.680
But I listened to your talk with Kyle,
link |
01:16:25.760
and Kyle was originally building the aftermarket system,
link |
01:16:30.480
and he gave up on it because of technical challenges,
link |
01:16:34.920
because of the fact that he's going
link |
01:16:37.360
to have to support 20 to 50 cars.
link |
01:16:39.160
We support 45, because what is he
link |
01:16:41.120
going to do when the manufacturer ABS system triggers?
link |
01:16:43.440
We have alerts and warnings to deal with all of that
link |
01:16:45.480
and all the cars, and how is he going to formally verify it?
link |
01:16:48.400
Well, I got 10 million miles of data.
link |
01:16:49.800
It's probably better verified than the spec.
link |
01:16:53.240
Yeah, I'm glad you're here talking to me.
link |
01:16:57.720
I'll remember this day, because it's interesting.
link |
01:17:01.120
If you look at Kyle, from Cruise,
link |
01:17:04.160
I'm sure they have a large number of business development
link |
01:17:06.320
folks, and he's working with GM.
link |
01:17:10.200
He could work with Argo AI, which works with Ford.
link |
01:17:13.280
It's interesting, because chances that you fail businesswise,
link |
01:17:18.520
like bankrupt, are pretty high.
link |
01:17:21.120
And yet, it's the Android model,
link |
01:17:23.880
is you're actually taking on the problem.
link |
01:17:26.440
So that's really inspiring.
link |
01:17:28.160
Well, I have a long term way for comma to make money, too.
link |
01:17:30.920
And one of the nice things when you really take on the problem,
link |
01:17:34.400
which is my hope for autopilot, for example,
link |
01:17:36.760
is things you don't expect, ways to make money,
link |
01:17:41.040
or create value that you don't expect will pop up.
link |
01:17:44.160
I've known how to do it since 2017; that's the first time I said it.
link |
01:17:48.560
Which part? You know how to do which part?
link |
01:17:50.440
Our long term plan is to be a car insurance company.
link |
01:17:52.520
Insurance.
link |
01:17:53.160
Yeah, I love it.
link |
01:17:55.320
I make driving twice as safe.
link |
01:17:56.680
Not only that, I have the best data,
link |
01:17:57.680
such that I know who, statistically, are the safest drivers.
link |
01:18:00.040
And oh, oh, we see you.
link |
01:18:02.160
We see you driving unsafely.
link |
01:18:03.720
We're not going to insure you.
link |
01:18:05.360
And that causes a bifurcation in the market,
link |
01:18:08.960
because the only people who can't get comma insurance
link |
01:18:10.920
are the bad drivers. Geico can insure them.
link |
01:18:12.760
Their premiums are crazy high, our premiums are crazy low.
link |
01:18:15.360
We win car insurance.
link |
01:18:16.240
Take over that whole market.
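A toy sketch of the selection effect being described here, in Python; every number in it is invented for illustration:
```python
# Toy sketch of the insurance bifurcation: if you can observe risk and
# insure only the safer half, your average claim cost (hence premium)
# sits far below the pool left to everyone else. Numbers are made up.
import random

random.seed(0)
# Each driver's expected annual claim cost in dollars (assumed range).
drivers = sorted(random.uniform(500, 2000) for _ in range(100_000))
safe, risky = drivers[:50_000], drivers[50_000:]

print(f"comma-style pool average claim: ${sum(safe) / len(safe):,.0f}")
print(f"everyone-else average claim:    ${sum(risky) / len(risky):,.0f}")
```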
link |
01:18:18.120
OK, so if we win, if we win... But that's
link |
01:18:21.560
what I'm saying: how do you turn comma into a $10 billion
link |
01:18:23.800
company? It's that.
link |
01:18:24.640
That's right.
link |
01:18:25.600
So you, Elon Musk, who else?
link |
01:18:30.000
Who else is thinking like this and working like this
link |
01:18:32.720
in your view?
link |
01:18:33.160
Who are the competitors?
link |
01:18:34.800
Are there people seriously?
link |
01:18:36.160
I don't think anyone that I'm aware of is seriously
link |
01:18:39.480
taking on lane keeping, like to where it's a huge business that
link |
01:18:45.280
turns eventually to full autonomy that then creates
link |
01:18:51.400
other businesses on top of it and so on.
link |
01:18:53.440
Think insurance, think all kinds of ideas like that.
link |
01:18:56.480
Do you know anyone else thinking like this?
link |
01:19:00.480
Not really.
link |
01:19:02.200
That's interesting.
link |
01:19:02.960
I mean, my sense is everybody turns to that in like four
link |
01:19:06.560
or five years.
link |
01:19:07.800
Like Ford, once the autonomy thing falls through.
link |
01:19:11.240
But at this time.
link |
01:19:12.600
Elon's the iOS.
link |
01:19:14.120
By the way, he paved the way for all of us.
link |
01:19:16.720
It's not iOS, true.
link |
01:19:18.000
I would not be doing comma AI today if it was not
link |
01:19:21.520
for those conversations with Elon.
link |
01:19:23.480
And if it were not for him saying like,
link |
01:19:26.840
I think he said like, well, obviously we're not
link |
01:19:28.600
going to use LiDAR, we use cameras, humans use cameras.
link |
01:19:31.280
So what do you think about that?
link |
01:19:32.600
How important is LiDAR?
link |
01:19:33.880
Everybody else working on L5 is using LiDAR.
link |
01:19:36.960
What are your thoughts on his provocative statement
link |
01:19:39.160
that LiDAR is a crutch?
link |
01:19:41.320
See, sometimes he'll say dumb things, like the driver
link |
01:19:43.520
monitoring thing, but sometimes he'll say absolutely,
link |
01:19:45.680
completely, 100% obviously true things.
link |
01:19:48.400
Of course LiDAR is a crutch.
link |
01:19:50.840
It's not even a good crutch.
link |
01:19:53.040
You're not even using it.
link |
01:19:54.200
They're using it for localization,
link |
01:19:56.920
which isn't good in the first place.
link |
01:19:58.160
If you have to localize your car to centimeters
link |
01:20:00.480
in order to drive, that's not driving.
link |
01:20:04.280
They're currently not doing much machine learning
link |
01:20:06.320
with LiDAR data, meaning, like, to help you
link |
01:20:09.280
in the general task of perception.
link |
01:20:12.840
The main goal of those LiDARs on those cars
link |
01:20:15.320
I think is actually localization more than perception,
link |
01:20:18.840
or at least that's what they use them for.
link |
01:20:20.080
Yeah, that's true.
link |
01:20:20.920
If you want to localize to centimeters,
link |
01:20:22.480
you can't use GPS.
link |
01:20:23.720
The fanciest GPS in the world can't do it,
link |
01:20:25.120
especially if you're under tree cover and stuff.
link |
01:20:26.960
With LiDAR, you can do this pretty easily.
link |
01:20:28.480
So really they're not taking on...
link |
01:20:30.240
I mean, in some research they're using it for perception,
link |
01:20:33.200
but they're certainly not, which is sad,
link |
01:20:35.840
fusing it well with vision.
link |
01:20:38.680
They do use it for perception.
link |
01:20:40.560
I'm not saying they don't use it for perception,
link |
01:20:42.400
but the thing that they have vision based
link |
01:20:45.480
and radar based perception systems as well.
link |
01:20:47.680
You could remove the LiDAR and keep around
link |
01:20:51.440
a lot of the dynamic object perception.
link |
01:20:54.040
You want to get centimeter accurate localization.
link |
01:20:56.320
Good luck doing that with anything else.
link |
01:20:59.120
So what should a Cruise or a Waymo do?
link |
01:21:02.880
Like what would be your advice to them now?
link |
01:21:06.400
I mean, Waymo is actually... they're serious.
link |
01:21:11.400
Waymo, out of all of them,
link |
01:21:13.120
is quite serious about the long game.
link |
01:21:16.120
If L5 requires 50 years,
link |
01:21:20.680
I think Waymo will be the only one left standing at the end
link |
01:21:24.000
given the financial backing that they have.
link |
01:21:26.560
They've got Google bucks.
link |
01:21:28.640
I'll say nice things about both Waymo and Cruise.
link |
01:21:32.320
Let's do it.
link |
01:21:33.480
Nice is good.
link |
01:21:35.720
Waymo is by far the furthest along with technology.
link |
01:21:39.200
Waymo has a three to five year lead
link |
01:21:41.160
on all the competitors.
link |
01:21:43.960
If the Waymo-looking stack works,
link |
01:21:48.640
maybe a three-year lead.
link |
01:21:49.720
If the Waymo-looking stack works,
link |
01:21:51.280
they have a three-year lead.
link |
01:21:52.800
Now, I argue that Waymo has spent too much money
link |
01:21:55.800
to recapitalize, to gain back their losses
link |
01:21:59.240
in those three years.
link |
01:22:00.160
Also self driving cars have no network effect like that.
link |
01:22:03.600
Uber has a network effect.
link |
01:22:04.800
You have a market, you have drivers and you have riders.
link |
01:22:07.120
Self driving cars, you have capital and you have riders.
link |
01:22:09.880
There's no network effect.
link |
01:22:11.400
If I want to blanket a new city in self driving cars,
link |
01:22:13.800
I buy the off the shelf Chinese knockoff self driving cars
link |
01:22:16.000
and I buy enough of them in the city.
link |
01:22:17.160
I can't do that with drivers.
link |
01:22:18.360
And that's why Uber has a first mover advantage
link |
01:22:20.840
that no self driving car company will.
link |
01:22:23.960
Can you just linger on that a little bit?
link |
01:22:26.520
Uber, you're not talking about Uber,
link |
01:22:28.160
the autonomous vehicle Uber.
link |
01:22:29.240
You're talking about the Uber cars.
link |
01:22:30.960
Yeah.
link |
01:22:31.800
I'm Uber.
link |
01:22:32.640
I open for business in Austin, Texas, let's say.
link |
01:22:35.920
I need to attract both sides of the market.
link |
01:22:38.760
I need to both get drivers on my platform
link |
01:22:41.200
and riders on my platform.
link |
01:22:42.720
And I need to keep them both sufficiently happy, right?
link |
01:22:45.320
Riders aren't going to use it
link |
01:22:46.520
if it takes more than five minutes for an Uber to show up.
link |
01:22:48.960
Drivers aren't going to use it
link |
01:22:50.120
if they have to sit around all day and there's no riders.
link |
01:22:52.120
So you have to carefully balance a market.
link |
01:22:54.480
And whenever you have to carefully balance a market,
link |
01:22:56.240
there's a great first mover advantage
link |
01:22:58.280
because there's a switching cost for everybody, right?
link |
01:23:01.000
The drivers and the riders
link |
01:23:02.120
would have to switch at the same time.
link |
01:23:04.080
Let's even say that, let's say, Luber shows up.
link |
01:23:08.880
And Luber somehow agrees to do things at a bigger scale,
link |
01:23:14.800
or they've done it more efficiently, right?
link |
01:23:17.440
Luber only takes a 5% cut
link |
01:23:19.800
instead of the 10% that Uber takes.
link |
01:23:21.600
No one is going to switch
link |
01:23:22.760
because the switching cost is higher than that 5%.
link |
01:23:24.920
So you actually can, in markets like that,
link |
01:23:27.200
you have a first mover advantage.
link |
01:23:28.520
Yeah.
link |
01:23:30.160
Autonomous vehicles of the level five variety
link |
01:23:32.720
have no first mover advantage.
link |
01:23:34.560
If the technology becomes commoditized,
link |
01:23:36.800
say I want to go to a new city, look at the scooters.
link |
01:23:39.520
It's going to look a lot more like scooters.
link |
01:23:41.480
Every person with a checkbook
link |
01:23:44.040
can blanket a city in scooters
link |
01:23:45.720
and that's why you have 10 different scooter companies.
link |
01:23:47.920
Which one's going to win?
link |
01:23:48.760
It's a race to the bottom.
link |
01:23:49.600
It's a terrible market to be in
link |
01:23:51.040
because there's no moat for scooters.
link |
01:23:54.960
And the scooters don't get a say
link |
01:23:56.520
in whether they want to be bought
link |
01:23:57.480
and deployed to a city or not.
link |
01:23:58.440
Right.
link |
01:23:59.280
So yeah.
link |
01:24:00.120
We're going to entice the scooters with subsidies
link |
01:24:02.080
and deals.
link |
01:24:03.840
So whenever you have to invest that capital,
link |
01:24:05.480
it doesn't...
link |
01:24:06.720
It doesn't come back.
link |
01:24:07.560
Yeah.
link |
01:24:08.600
That can't be your main criticism of the Waymo approach.
link |
01:24:12.320
Oh, I'm saying even if it does technically work.
link |
01:24:14.840
Even if it does technically work, that's a problem.
link |
01:24:17.040
Yeah.
link |
01:24:18.000
I don't know, if I were to say, I would say
link |
01:24:22.840
you're right there.
link |
01:24:23.520
I haven't even thought about that.
link |
01:24:24.560
But I would say the bigger challenge
link |
01:24:26.520
is the technical approach.
link |
01:24:29.760
So Waymo's, Cruise's...
link |
01:24:31.840
And not just the technical approach,
link |
01:24:33.000
but of creating value.
link |
01:24:34.800
I still don't understand how you beat Uber,
link |
01:24:40.760
the human driven cars.
link |
01:24:43.480
In terms of finances,
link |
01:24:44.920
it doesn't make sense to me
link |
01:24:47.160
that people want to get an autonomous vehicle.
link |
01:24:50.080
I don't understand how you make money.
link |
01:24:52.800
In the long term, yes, like real long term,
link |
01:24:56.440
but it just feels like there's too much
link |
01:24:58.640
capital investment needed.
link |
01:24:59.960
Oh, and they're going to be worse than Ubers
link |
01:25:01.200
because they're going to stop
link |
01:25:02.440
for every little thing everywhere.
link |
01:25:06.320
I'll say a nice thing about Cruise.
link |
01:25:07.360
That was my nice thing about Waymo.
link |
01:25:08.440
They're three years ahead of me.
link |
01:25:09.280
It was a nice...
link |
01:25:10.120
Oh, because they're three years.
link |
01:25:10.960
They're three years technically ahead of everybody.
link |
01:25:12.480
Their tech stack is great.
link |
01:25:14.800
My nice thing about Cruise is GM buying them
link |
01:25:17.920
was a great move for GM.
link |
01:25:20.600
For $1 billion,
link |
01:25:22.240
GM bought an insurance policy against Waymo.
link |
01:25:26.560
Cruise is three years behind Waymo.
link |
01:25:30.000
That means Google will get a monopoly
link |
01:25:32.600
on the technology for at most three years.
link |
01:25:36.840
And if technology works,
link |
01:25:38.880
you might not even be right about the three years.
link |
01:25:40.840
It might be less.
link |
01:25:41.840
Might be less.
link |
01:25:42.680
Cruise actually might not be that far behind.
link |
01:25:44.320
I don't know how much Waymo has waffled around
link |
01:25:47.360
or how much of it actually is just that long tail.
link |
01:25:49.760
Yeah, okay.
link |
01:25:50.600
If that's the best you could say in terms of nice things,
link |
01:25:53.600
that's more of a nice thing for GM
link |
01:25:55.200
that that's a smart insurance policy.
link |
01:25:58.560
It's a smart insurance policy.
link |
01:25:59.680
I mean, I think that's how...
link |
01:26:01.880
I can't see Cruise working out any other way.
link |
01:26:05.200
For Cruise to leapfrog Waymo would really surprise me.
link |
01:26:10.400
Yeah, so let's talk about the underlying assumptions
link |
01:26:13.000
of everything is...
link |
01:26:13.840
We're not gonna leapfrog Tesla.
link |
01:26:17.560
Tesla would have to seriously mess up
link |
01:26:19.240
for us to leapfrog them.
link |
01:26:20.440
Okay, so the way you leapfrog, right,
link |
01:26:23.240
is you come up with an idea
link |
01:26:26.120
or you take a direction, perhaps secretly,
link |
01:26:28.560
that the other people aren't taking.
link |
01:26:31.640
And so Cruise, Waymo, even Aurora...
link |
01:26:38.080
I don't know Aurora. Zoox is the same stack as well.
link |
01:26:40.080
They're all the same code base even.
link |
01:26:41.720
They're all the same DARPA Urban Challenge code base.
link |
01:26:44.120
It's...
link |
01:26:45.360
So the question is, do you think there's a room
link |
01:26:47.760
for brilliance and innovation there
link |
01:26:49.120
that will change everything?
link |
01:26:51.560
Like say, okay, so I'll give you examples.
link |
01:26:53.880
It could be a revolution in mapping, for example,
link |
01:26:59.640
that allow you to map things,
link |
01:27:03.040
do HD maps of the whole world,
link |
01:27:05.840
all weather conditions somehow really well,
link |
01:27:08.080
or a revolution in simulation,
link |
01:27:14.480
to where all of what you said before becomes incorrect.
link |
01:27:20.480
That kind of thing.
link |
01:27:21.520
Any room for breakthrough innovation?
link |
01:27:24.920
What I said before about,
link |
01:27:25.960
oh, they actually get the whole thing, well,
link |
01:27:28.280
I'll say this about it: we divide driving into three problems.
link |
01:27:32.600
And I actually haven't solved the third yet,
link |
01:27:33.800
but I have an idea of how to do it.
link |
01:27:34.800
So there's the static.
link |
01:27:36.120
The static driving problem is assuming
link |
01:27:38.000
you are the only car on the road, right?
link |
01:27:40.120
And this problem can be solved 100%
link |
01:27:42.000
with mapping and localization.
link |
01:27:44.000
This is why farms work the way they do.
link |
01:27:45.760
If all you have to deal with is the static problem,
link |
01:27:48.440
and you can statically schedule your machines, right?
link |
01:27:50.160
It's the same as like statically scheduling processes.
link |
01:27:52.680
You can statically schedule your tractors
link |
01:27:54.040
to never hit each other on their paths, right?
link |
01:27:56.160
Because then you know the speed they go at.
link |
01:27:57.520
So that's the static driving problem.
link |
01:28:00.160
Maps only helps you with the static driving problem.
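A toy sketch of that static-scheduling idea in Python; the paths, speeds, and safety radius are invented for illustration:
```python
# Static scheduling sketch: with paths and speeds fixed in advance,
# proving two tractors never collide is plain geometry over time,
# checked entirely offline -- no runtime perception needed.
import math

def conflict(pos_a, pos_b, dt=0.1, horizon=600.0, radius=2.0):
    """pos_a, pos_b map time -> (x, y); True if they ever get too close."""
    t = 0.0
    while t < horizon:
        (ax, ay), (bx, by) = pos_a(t), pos_b(t)
        if math.hypot(ax - bx, ay - by) < radius:
            return True
        t += dt
    return False

# Two tractors on parallel rows 3 m apart at the same speed: no conflict.
row_a = lambda t: (1.5 * t, 0.0)
row_b = lambda t: (1.5 * t, 3.0)
assert not conflict(row_a, row_b)
```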
link |
01:28:03.920
Yeah, the question about static driving,
link |
01:28:06.960
you've just made it sound like it's really easy.
link |
01:28:08.800
Static driving is really easy.
link |
01:28:11.880
How easy?
link |
01:28:13.040
How, well, because the whole drifting out of lane,
link |
01:28:16.480
when Tesla drifts out of lane,
link |
01:28:18.760
it's failing on the fundamental static driving problem.
link |
01:28:21.960
Tesla is drifting out of lane?
link |
01:28:24.440
The static driving problem is not easy for the world.
link |
01:28:27.720
The static driving problem is easy for one route.
link |
01:28:31.840
One route in one weather condition
link |
01:28:33.920
with one state of lane markings
link |
01:28:37.920
and like no deterioration, no cracks in the road.
link |
01:28:40.920
Well, I'm assuming you have a perfect localizer.
link |
01:28:42.600
So that solves the weather condition
link |
01:28:44.200
and the lane marking condition.
link |
01:28:45.560
But that's the problem.
link |
01:28:46.640
How do you have a perfect localizer?
link |
01:28:47.680
You can build, perfect localizers are not that hard to build.
link |
01:28:50.560
Okay, come on now, with LiDAR.
link |
01:28:53.360
LiDAR, yeah.
link |
01:28:54.200
With LiDAR, okay.
link |
01:28:55.040
LiDAR, yeah, but you use LiDAR, right?
link |
01:28:56.440
Like you use LiDAR, build a perfect localizer.
link |
01:28:58.640
Building a perfect localizer without LiDAR,
link |
01:29:03.000
it's gonna be hard.
link |
01:29:04.320
You can get 10 centimeters without LiDAR,
link |
01:29:05.760
you can get one centimeter with LiDAR.
link |
01:29:07.240
I'm not even concerned about the one or 10 centimeters.
link |
01:29:09.280
I'm concerned if every once in a while you're just way off.
link |
01:29:12.680
Yeah, so this is why you have to carefully
link |
01:29:17.480
make sure you're always tracking your position.
link |
01:29:20.040
You wanna use LiDAR-camera fusion,
link |
01:29:21.760
but you can get the reliability of that system
link |
01:29:24.480
up to 100,000 miles
link |
01:29:28.000
and then you write some fallback condition
link |
01:29:29.720
where it's not that bad if you're way off, right?
link |
01:29:32.160
I think that you can get it to the point,
link |
01:29:33.800
it's like ASIL D, that you're never in a case
link |
01:29:36.800
where you're way off and you don't know it.
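A minimal sketch of that "never silently way off" property; the function names and the half-meter threshold are hypothetical, not comma's code:
```python
# Sketch: cross-check the map localizer against dead reckoning; if they
# disagree, don't pick a side -- declare the localizer faulted and fall
# back. The 0.5 m threshold is an invented illustrative value.
import math

def poses_agree(map_xy, odom_xy, max_err_m=0.5):
    return math.dist(map_xy, odom_xy) < max_err_m

def fused_pose(map_xy, odom_xy):
    if poses_agree(map_xy, odom_xy):
        return map_xy        # trust the centimeter-level localizer
    return None              # known fault: trigger the safe fallback
```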
link |
01:29:38.480
Yeah, okay, so this is brilliant.
link |
01:29:40.240
So that's the static.
link |
01:29:41.160
Static.
link |
01:29:42.280
We can, especially with LiDAR and good HD maps,
link |
01:29:45.960
you can solve that problem.
link |
01:29:47.080
It's easy.
link |
01:29:47.920
The static, the static problem is so easy.
link |
01:29:51.840
It's very typical for you to say something's easy.
link |
01:29:54.000
I got it.
link |
01:29:54.840
It's not as challenging as the other ones, okay.
link |
01:29:56.920
Well, okay, maybe it's obvious how to solve it.
link |
01:29:58.760
The third one's the hardest.
link |
01:29:59.760
And a lot of people don't even think about the third one
link |
01:30:01.920
and even see it as different from the second one.
link |
01:30:03.640
So the second one is dynamic.
link |
01:30:05.720
The second one is like, say there's an obvious example,
link |
01:30:08.560
it's like a car stopped at a red light, right?
link |
01:30:10.360
You can't have that car in your map
link |
01:30:12.520
because you don't know whether that car
link |
01:30:13.720
is gonna be there or not.
link |
01:30:14.880
So you have to detect that car in real time
link |
01:30:17.960
and then you have to do the appropriate action, right?
link |
01:30:21.600
Also, that car is not a fixed object.
link |
01:30:24.800
That car may move and you have to predict
link |
01:30:26.600
what that car will do, right?
link |
01:30:28.680
So this is the dynamic problem.
link |
01:30:30.840
Yeah.
link |
01:30:31.680
So you have to deal with this.
link |
01:30:32.800
This involves, again, like you're gonna need models
link |
01:30:36.640
of other people's behavior.
link |
01:30:38.760
Do you, are you including in that?
link |
01:30:40.160
I don't wanna step on the third one.
link |
01:30:42.320
Oh, but are you including in that your influence
link |
01:30:46.600
on people?
link |
01:30:47.440
Ah, that's the third one.
link |
01:30:48.280
Okay.
link |
01:30:49.120
That's the third one.
link |
01:30:49.960
We call it the counterfactual.
link |
01:30:51.880
Yeah, brilliant.
link |
01:30:52.720
And that.
link |
01:30:53.560
I just talked to Judea Pearl, who's obsessed
link |
01:30:54.920
with counterfactuals.
link |
01:30:55.760
Counterfactual, oh yeah, yeah, I read his books.
link |
01:30:58.640
So the static and the dynamic: our approach right now
link |
01:31:03.960
for lateral will scale completely to the static and dynamic.
link |
01:31:07.600
The counterfactual, the only way I have to do it yet,
link |
01:31:10.760
the thing that I wanna do once we have all of these cars
link |
01:31:14.000
is I wanna do reinforcement learning on the world.
link |
01:31:16.760
I'm always gonna turn the exploit up to max.
link |
01:31:18.880
I'm not gonna have them explore.
link |
01:31:20.440
But the only real way to get at the counterfactual
link |
01:31:22.760
is to do reinforcement learning
link |
01:31:24.080
because the other agents are humans.
link |
01:31:27.760
So that's fascinating that you break it down like that.
link |
01:31:30.080
I agree completely.
link |
01:31:31.680
I've spent my life thinking about this problem.
link |
01:31:33.600
This is beautiful.
link |
01:31:34.920
And part of it, cause you're slightly insane,
link |
01:31:37.880
because not my life, just the last four years.
link |
01:31:43.120
No, no, you have some non zero percent of your brain
link |
01:31:48.920
has a madman in it, which is a really good feature.
link |
01:31:52.360
But there's a safety component to it
link |
01:31:55.920
that I think when there's sort of
link |
01:31:57.320
with counterfactuals and so on,
link |
01:31:59.040
that would just freak people out.
link |
01:32:00.280
How do you even start to think about this in general?
link |
01:32:03.320
I mean, you've had some friction with NHTSA and so on.
link |
01:32:07.600
I am frankly exhausted by safety engineers.
link |
01:32:14.280
The prioritization on safety over innovation
link |
01:32:21.360
to a degree where it kills, in my view,
link |
01:32:23.720
kills safety in the long term.
link |
01:32:26.200
So the counterfactual thing,
link |
01:32:28.080
just actually exploring this world
link |
01:32:31.560
of how do you interact with dynamic objects and so on?
link |
01:32:33.600
How do you think about safety?
link |
01:32:34.840
You can do reinforcement learning without ever exploring.
link |
01:32:38.120
And I said that, like,
link |
01:32:39.200
so you can think about your, in like reinforcement learning,
link |
01:32:41.560
it's usually called like a temperature parameter.
link |
01:32:44.320
And your temperature parameter
link |
01:32:45.360
is how often you deviate from the argmax.
link |
01:32:48.080
I could always set that to zero and still learn.
link |
01:32:50.720
And I feel that you'd always want that set to zero
link |
01:32:52.840
on your actual system.
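A minimal sketch of that temperature knob in Python (illustrative, not comma's code): at temperature zero the policy is pure argmax, yet an off-policy update can still learn from whatever transitions get logged:
```python
# Temperature controls how often you deviate from the argmax; zero
# means never. Learning still happens off-policy from logged data.
import numpy as np

def select_action(q_values, temperature):
    q = np.asarray(q_values, dtype=float)
    if temperature == 0.0:
        return int(np.argmax(q))               # pure exploitation
    probs = np.exp(q / temperature)            # softmax exploration
    probs /= probs.sum()
    return int(np.random.choice(len(q), p=probs))

def q_update(Q, s, a, r, s_next, alpha=0.1, gamma=0.99):
    """Off-policy Q-learning: the (s, a, r, s') tuple can come from any
    behavior -- including a human correcting the system."""
    Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
```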
link |
01:32:54.080
Gotcha.
link |
01:32:54.920
But the problem is you first don't know very much
link |
01:32:58.160
and so you're going to make mistakes.
link |
01:32:59.560
So the learning, the exploration happens through mistakes.
link |
01:33:01.680
We're all ready, yeah, but.
link |
01:33:03.240
Okay, so the consequences of a mistake.
link |
01:33:06.080
OpenPilot and Autopilot are making mistakes left and right.
link |
01:33:09.400
We have 700 daily active users,
link |
01:33:12.560
1,000 weekly active users.
link |
01:33:14.080
OpenPilot makes tens of thousands of mistakes a week.
link |
01:33:18.920
These mistakes have zero consequences.
link |
01:33:21.160
These mistakes are,
link |
01:33:22.520
oh, I wanted to take this exit and it went straight.
link |
01:33:26.800
So I'm just going to carefully touch the wheel.
link |
01:33:28.520
The humans catch them.
link |
01:33:29.360
The humans catch them.
link |
01:33:30.640
And the human disengagement is labeling
link |
01:33:33.120
that reinforcement learning in a completely
link |
01:33:35.000
consequence free way.
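A hedged sketch of how a disengagement could become a label; the data layout and names here are hypothetical, not openpilot's actual format:
```python
# Each logged frame pairs what the model did with whether the human
# objected. A takeover marks the model's action at that moment as bad;
# uneventful frames carry no complaint. Field names are invented.
def labels_from_drive(frames, model_actions, disengage_idx):
    labeled = []
    for i, (frame, action) in enumerate(zip(frames, model_actions)):
        reward = -1.0 if i == disengage_idx else 0.0
        labeled.append((frame, action, reward))
    return labeled
```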
link |
01:33:37.240
So driver monitoring is the way you ensure they keep.
link |
01:33:39.840
Yes.
link |
01:33:40.680
They keep paying attention.
link |
01:33:42.120
How's your messaging?
link |
01:33:43.240
Say I gave you a billion dollars,
link |
01:33:45.200
would you be scaling it now?
link |
01:33:47.800
Oh, I couldn't scale with any amount of money.
link |
01:33:49.720
I'd raise money if I could, if I had a way to scale it.
link |
01:33:51.640
Yeah, you're not, no, I'm not focused on scale.
link |
01:33:53.320
I don't know how to do.
link |
01:33:54.160
Oh, like, I guess I could sell it to more people,
link |
01:33:55.760
but I want to make the system better.
link |
01:33:56.960
Better, better.
link |
01:33:57.800
And I don't know how to.
link |
01:33:58.840
But what's the messaging here?
link |
01:34:01.080
I got a chance to talk to Elon.
link |
01:34:02.560
And he basically said that the human factor doesn't matter.
link |
01:34:09.280
You know, the human doesn't matter
link |
01:34:10.360
because the system will perform.
link |
01:34:12.280
There'll be sort of a, sorry to use the term,
link |
01:34:14.760
but like a singularity, like a point
link |
01:34:16.120
where it gets just much better.
link |
01:34:17.920
And so the human, it won't really matter.
link |
01:34:20.800
But it seems like that human catching the system
link |
01:34:25.000
when it gets into trouble is like the thing
link |
01:34:29.360
which will make something like reinforcement learning work.
link |
01:34:32.720
So how do you, how do you think messaging for Tesla,
link |
01:34:35.640
for you, for the industry in general, should change?
link |
01:34:39.080
I think my messaging is pretty clear,
link |
01:34:40.840
at least like our messaging wasn't that clear
link |
01:34:43.080
in the beginning and I do kind of fault myself for that.
link |
01:34:45.200
We are proud right now to be a level two system.
link |
01:34:48.480
We are proud to be level two.
link |
01:34:50.360
If we talk about level four,
link |
01:34:51.640
it's not with the current hardware.
link |
01:34:53.200
It's not going to be just a magical OTA upgrade.
link |
01:34:55.920
It's going to be new hardware.
link |
01:34:57.280
It's going to be very carefully thought out right now.
link |
01:35:00.000
We are proud to be level two.
link |
01:35:01.560
And we have a rigorous safety model.
link |
01:35:03.320
I mean, not like, like, okay, rigorous.
link |
01:35:05.680
Who knows what that means?
link |
01:35:06.600
But we at least have a safety model
link |
01:35:08.600
and we make it explicit.
link |
01:35:09.560
It's in safety.md in OpenPilot.
link |
01:35:11.800
And it says, seriously though.
link |
01:35:13.960
Safety.md.
link |
01:35:14.800
Safety.md.
link |
01:35:16.840
This is really, this is so Android.
link |
01:35:18.400
So, well, this is, this is the safety model
link |
01:35:21.800
and I like to have conversations like if, like, you know,
link |
01:35:25.520
sometimes people will come to you and they're like,
link |
01:35:27.120
your system's not safe.
link |
01:35:29.240
Okay.
link |
01:35:30.080
Have you read my safety docs?
link |
01:35:31.080
Would you like to have an intelligent conversation
link |
01:35:32.720
about this?
link |
01:35:33.560
And the answer is always no.
link |
01:35:34.400
They just like scream about, "It runs Python!"
link |
01:35:38.240
Okay. What?
link |
01:35:39.080
So you're saying that, that because Python's not real time...
link |
01:35:41.560
Python not being real time never causes disengagements.
link |
01:35:44.240
Disengagements are caused by, you know, the model is QM.
link |
01:35:47.640
But safety.md says the following.
link |
01:35:49.760
First and foremost,
link |
01:35:50.600
the driver must be paying attention at all times.
link |
01:35:54.240
I don't consider, I do,
link |
01:35:55.320
I still consider the software to be alpha software
link |
01:35:57.720
until we can actually enforce that statement.
link |
01:36:00.080
But I feel it's very well communicated to our users.
link |
01:36:03.280
Two more things.
link |
01:36:04.520
One is the user must be able to easily take control
link |
01:36:09.080
of the vehicle at all times.
link |
01:36:10.880
So if you step on the gas or brake with OpenPilot,
link |
01:36:14.440
it gives full manual control back to the user
link |
01:36:16.400
or press the cancel button.
link |
01:36:18.680
Step two, the car will never react so quickly,
link |
01:36:23.240
we define "so quickly" to be about one second,
link |
01:36:26.000
that you can't react in time.
link |
01:36:27.640
And we do this by enforcing torque limits,
link |
01:36:29.480
braking limits and acceleration limits.
link |
01:36:31.520
So we have like our torque limits way lower than Tesla's.
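A minimal sketch of that kind of actuator limiting; every constant here is invented for illustration, not comma's actual values:
```python
# Bound both the absolute steering torque and how fast it can change,
# so a bad command needs on the order of a second to build up -- time
# for the driver to react. All constants are illustrative.
MAX_TORQUE = 1.0       # hypothetical torque units
MAX_RATE = 0.02        # max change per 10 ms control step

def clamp(x, lo, hi):
    return max(lo, min(hi, x))

def safe_steer(requested, last_sent):
    torque = clamp(requested, -MAX_TORQUE, MAX_TORQUE)
    return clamp(torque, last_sent - MAX_RATE, last_sent + MAX_RATE)
```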
link |
01:36:36.520
This is another potential.
link |
01:36:39.080
If I could tweak autopilot,
link |
01:36:40.240
I would lower their torque limit
link |
01:36:41.360
and I would add driver monitoring.
link |
01:36:42.960
Because Autopilot can jerk the wheel hard.
link |
01:36:46.240
OpenPilot can't.
link |
01:36:47.520
We limit it, and all this code is open source, readable.
link |
01:36:52.080
And I believe now it's all MISRA C compliant.
link |
01:36:54.880
What's that mean?
link |
01:36:57.080
MISRA is like the automotive coding standard.
link |
01:37:00.400
I've come to respect...
link |
01:37:03.400
I've been reading like the standards lately
link |
01:37:04.960
and I've come to respect them.
link |
01:37:05.920
They're actually written by very smart people.
link |
01:37:07.800
Yeah, they're brilliant people actually.
link |
01:37:09.920
They have a lot of experience.
link |
01:37:11.320
They're sometimes a little too cautious,
link |
01:37:13.360
but in this case, it pays off.
link |
01:37:16.800
MISRA is written by like computer scientists
link |
01:37:18.440
and you can tell by the language they use.
link |
01:37:21.080
They talk about like whether certain conditions in MISRA
link |
01:37:24.440
are decidable or undecidable.
link |
01:37:26.520
And you mean like the halting problem?
link |
01:37:28.360
And yes, all right, you've earned my respect.
link |
01:37:31.600
I will read carefully what you have to say
link |
01:37:33.120
and we want to make our code compliant with that.
link |
01:37:35.760
All right, so you're proud level two, beautiful.
link |
01:37:38.160
So you were the founder and I think CEO of Comma AI,
link |
01:37:42.320
then you were the head of research.
link |
01:37:44.320
What the heck are you now?
link |
01:37:46.080
What's your connection to Comma AI?
link |
01:37:47.480
I'm the president, but I'm one of those like
link |
01:37:49.640
unelected presidents of like a small dictatorship country,
link |
01:37:53.440
not one of those like elected presidents.
link |
01:37:55.200
Oh, so you're like Putin when he was like the, yeah,
link |
01:37:57.640
I got you.
link |
01:37:58.980
So there's, what's the governance structure?
link |
01:38:02.120
What's the future of Comma AI financially?
link |
01:38:04.800
I mean, yeah, as a business, do you want,
link |
01:38:08.120
are you just focused on getting things right now,
link |
01:38:11.640
making some small amount of money in the meantime
link |
01:38:14.920
and then when it works, it works, and you scale.
link |
01:38:17.520
Our burn rate is about 200 K a month
link |
01:38:20.480
and our revenue is about 100 K a month.
link |
01:38:23.040
So we need to 4x our revenue,
link |
01:38:24.920
but we haven't like tried very hard at that yet.
link |
01:38:28.200
And the revenue is basically selling stuff online.
link |
01:38:30.160
Yeah, we sell stuff at shop.comma.ai.
link |
01:38:32.360
Is there other, well, okay.
link |
01:38:33.920
So you'll have to figure that out.
link |
01:38:35.360
That's our only revenue. See, but to me,
link |
01:38:37.880
that's like respectable revenue.
link |
01:38:40.400
We make it by selling products to consumers,
link |
01:38:42.640
being honest and transparent about what they are.
link |
01:38:45.040
Unlike most, actually, level four companies, right?
link |
01:38:50.720
Cause you could easily start blowing, like, smoke,
link |
01:38:54.320
like overselling the hype and feeding into
link |
01:38:57.080
getting some fundraising.
link |
01:38:59.080
Oh, you're the guy, you're a genius
link |
01:39:00.520
because you hacked the iPhone.
link |
01:39:01.800
Oh, I hate that.
link |
01:39:02.920
I hate that.
link |
01:39:03.760
Yeah, I can trade my social capital for more money.
link |
01:39:06.360
I did it once.
link |
01:39:07.320
I almost regret doing it the first time.
link |
01:39:10.320
Well, on a small tangent,
link |
01:39:11.640
what's your, you seem to not like fame
link |
01:39:16.560
and yet you're also drawn to fame.
link |
01:39:18.840
Where are you on that currently?
link |
01:39:24.560
Have you had some introspection, some soul searching?
link |
01:39:27.200
Yeah.
link |
01:39:28.480
I actually, I've come to a pretty stable position on that.
link |
01:39:32.200
Like after the first time,
link |
01:39:33.880
I realized that I don't want attention from the masses.
link |
01:39:36.840
I want attention from people who I respect.
link |
01:39:39.160
Who do you respect?
link |
01:39:41.960
I can give a list of people.
link |
01:39:43.960
So are these like Elon Musk type characters?
link |
01:39:47.200
Yeah.
link |
01:39:49.040
Actually, you know what?
link |
01:39:50.000
I'll make it more broad than that.
link |
01:39:51.200
I won't make it about a person.
link |
01:39:52.600
I respect skill.
link |
01:39:54.040
I respect people who have skills, right?
link |
01:39:56.880
And I would like to like be,
link |
01:40:00.280
I'm not gonna say famous,
link |
01:40:01.400
but be like known among more people
link |
01:40:03.760
who have like real skills.
link |
01:40:05.440
Who in cars, do you think, has skill?
link |
01:40:12.560
Not do you respect?
link |
01:40:15.000
Oh, Kyle Vogt has skill.
link |
01:40:17.760
A lot of people at Waymo have skill.
link |
01:40:19.880
And I respect them.
link |
01:40:20.840
I respect them as engineers.
link |
01:40:23.760
Like I can think, I mean,
link |
01:40:24.920
I think about all the times in my life
link |
01:40:26.280
where I've been like dead set on approaches
link |
01:40:27.960
and they turn out to be wrong.
link |
01:40:29.160
Yeah.
link |
01:40:30.000
So I mean, this might, I might be wrong.
link |
01:40:31.720
I accept that, I accept that there's a decent chance
link |
01:40:34.720
that I'm wrong.
link |
01:40:36.600
And actually, I mean, having talked to Chris Urmson,
link |
01:40:38.400
Sterling Anderson, those guys,
link |
01:40:40.480
I mean, I deeply respect Chris.
link |
01:40:43.360
I just admire the guy.
link |
01:40:46.040
He's legit.
link |
01:40:47.400
When you drive a car through the desert
link |
01:40:48.960
when everybody thinks it's impossible, that's legit.
link |
01:40:52.400
And then I also really respect the people
link |
01:40:53.840
who are like writing the infrastructure of the world,
link |
01:40:55.680
like the Linus Torvalds and the Chris Lattners.
link |
01:40:57.360
Oh yeah, they were doing the real work.
link |
01:40:59.080
I know they're doing the real work.
link |
01:41:02.000
Having talked to Chris Lattner,
link |
01:41:03.760
you realize, especially when they're humble,
link |
01:41:05.680
it's like, you realize, oh, you guys,
link |
01:41:07.680
we're just using your...
link |
01:41:09.640
Oh yeah.
link |
01:41:10.480
All the hard work that you did.
link |
01:41:11.520
Yeah, that's incredible.
link |
01:41:13.120
What do you think of Mr. Anthony Levandowski?
link |
01:41:18.440
What do you, he's a, he's another mad genius.
link |
01:41:21.640
Sharp guy.
link |
01:41:22.480
Oh yeah.
link |
01:41:23.320
What, do you think he might long term become a competitor?
link |
01:41:27.640
Oh, to comma?
link |
01:41:28.840
Well, so I think that he has the other right approach.
link |
01:41:32.400
I think that right now, there's two right approaches.
link |
01:41:35.280
One is what we're doing and one is what he's doing.
link |
01:41:37.680
Can you describe, I think it's called Pronto AI,
link |
01:41:39.800
what he's starting? Do you know what the approach is?
link |
01:41:42.360
I actually don't know.
link |
01:41:43.200
Embark is also doing the same sort of thing.
link |
01:41:45.040
The idea is almost that you want to,
link |
01:41:47.280
so if you're, I can't partner with Honda and Toyota.
link |
01:41:51.800
Honda and Toyota are like 400,000 person companies.
link |
01:41:57.600
It's not even a company at that point.
link |
01:41:59.400
Like I don't think of it like, I don't personify it.
link |
01:42:01.400
I think of it like an object, but a trucker drives for a fleet.
link |
01:42:07.120
Maybe that has like, some truckers are independent.
link |
01:42:10.280
Some truckers drive for fleets with a hundred trucks.
link |
01:42:12.080
There are tons of independent trucking companies out there.
link |
01:42:14.960
Start a trucking company and drive your costs down
link |
01:42:18.120
or figure out how to drive down the cost of trucking.
link |
01:42:23.760
Another company that I really respect is Nauto.
link |
01:42:26.560
Actually, I respect their business model.
link |
01:42:28.320
Nauto sells a driver monitoring camera
link |
01:42:31.560
and they sell it to fleet owners.
link |
01:42:33.920
If I owned a fleet of cars and I could pay 40 bucks a month
link |
01:42:39.120
to monitor my employees,
link |
01:42:42.400
this is gonna, like, reduce accidents 18%.
link |
01:42:45.520
So, like, in the space,
link |
01:42:48.960
that is the business model that I most respect,
link |
01:42:53.400
because they're creating value today.
link |
01:42:55.360
Yeah, which is, that's a huge one.
link |
01:42:57.840
How do we create value today with some of this?
link |
01:42:59.800
And the lane keeping thing is huge.
link |
01:43:01.680
And it sounds like you're creeping in,
link |
01:43:03.800
or full steam ahead, on the driver monitoring too.
link |
01:43:06.680
Which I think is actually where the short term value is,
link |
01:43:09.240
if you can get it right.
link |
01:43:10.480
I still... I'm not a huge fan of the statement
link |
01:43:12.800
that everything needs to have driver monitoring.
link |
01:43:15.120
I agree with that completely,
link |
01:43:16.120
but that statement usually misses the point
link |
01:43:18.680
that to get the experience of it right is not trivial.
link |
01:43:21.920
Oh, no, not at all.
link |
01:43:22.840
In fact, like, so right now we have,
link |
01:43:25.280
I think the timeout depends on speed of the car,
link |
01:43:29.560
but we want it to depend on, like, the scene state.
link |
01:43:32.520
If you're on like an empty highway,
link |
01:43:35.440
it's very different if you don't pay attention
link |
01:43:37.680
than if like you're like coming up to a traffic light.
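A hedged sketch of that idea; the thresholds and scene categories are made up, not openpilot's actual policy:
```python
# Attention timeout as a function of speed today, scene state tomorrow.
# All numbers here are invented for illustration.
def attention_timeout_s(speed_ms, scene="open_highway"):
    base = 6.0 if speed_ms < 15.0 else 3.0   # faster car, shorter leash
    if scene == "approaching_light":
        base *= 0.5                          # demand eyes up sooner
    return base
```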
link |
01:43:42.040
And long term, it should probably learn from the driver,
link |
01:43:45.720
because that's the thing to do. I've watched a lot of video.
link |
01:43:48.120
We've built a smartphone detector
link |
01:43:49.480
just to analyze how people are using smartphones
link |
01:43:51.520
and people are using it very differently.
link |
01:43:53.400
And there's this... there are texting styles.
link |
01:43:57.760
We haven't watched nearly enough of the videos.
link |
01:44:00.320
We haven't, I got millions of miles
link |
01:44:01.800
of people driving cars.
link |
01:44:02.960
At this moment, I spend a large fraction of my time
link |
01:44:05.960
just watching videos, because it never fails.
link |
01:44:10.880
Like I've never failed from a video watching session
link |
01:44:13.480
to learn something I didn't know before.
link |
01:44:15.400
In fact, I usually, like when I eat lunch, I'll sit,
link |
01:44:19.640
especially when the weather is good
link |
01:44:20.680
and just watch pedestrians.
link |
01:44:22.080
With an eye to understand like from a computer vision eye,
link |
01:44:26.400
just to see, can this model, can you predict
link |
01:44:29.280
what are the decisions made?
link |
01:44:30.480
And there's so many things that we don't understand.
link |
01:44:33.040
This is what I mean about state vector.
link |
01:44:34.760
Yeah, it's, I'm trying to always think like,
link |
01:44:37.880
because I'm understanding in my human brain,
link |
01:44:40.280
how do we convert that into,
link |
01:44:43.000
how hard is the learning problem here?
link |
01:44:44.960
I guess is the fundamental question.
link |
01:44:46.960
So something that's from a hacking perspective,
link |
01:44:51.800
this always comes up, especially with folks.
link |
01:44:54.200
Well, first, the most popular question is
link |
01:44:56.480
the trolley problem, right?
link |
01:44:58.440
So that's sort of not a serious problem.
link |
01:45:01.960
There are some ethical questions, I think that arise.
link |
01:45:06.080
Maybe you wanna, do you think there's any ethical,
link |
01:45:09.600
serious ethical questions?
link |
01:45:11.280
We have a solution to the trolley problem at comma.ai.
link |
01:45:14.080
Well, so there is actually an alert
link |
01:45:15.920
in our code, ethical dilemma detected.
link |
01:45:18.000
It's not triggered yet.
link |
01:45:18.960
We don't know how yet to detect the ethical dilemmas,
link |
01:45:21.040
but we're a level two system.
link |
01:45:22.360
So we're going to disengage and leave
link |
01:45:23.760
that decision to the human.
link |
01:45:25.320
You're such a troll.
link |
01:45:26.680
No, but the trolley problem deserves to be trolled.
link |
01:45:28.760
Yeah, that's a beautiful answer actually.
link |
01:45:32.040
I know, I gave it to someone who was like,
link |
01:45:34.440
sometimes people ask like you asked about the trolley problem.
link |
01:45:36.600
Like you can have a kind of discussion about it.
link |
01:45:38.080
Like when you get someone who's like really like
link |
01:45:39.720
earnest about it, because it's the kind of thing where
link |
01:45:43.600
if you ask a bunch of people in an office,
link |
01:45:45.600
whether we should use a SQL stack or NoSQL stack,
link |
01:45:48.360
if they're not that technical, they have no opinion.
link |
01:45:50.600
But if you ask them what color they want to paint the office,
link |
01:45:52.360
everyone has an opinion on that.
link |
01:45:54.040
And that's what the trolley problem is.
link |
01:45:56.040
I mean, that's a beautiful answer.
link |
01:45:57.280
Yeah, we're able to detect the problem
link |
01:45:59.240
and we're able to pass it on to the human.
link |
01:46:01.960
Wow, I've never heard anyone say it.
link |
01:46:03.760
This is your nice escape route.
link |
01:46:06.160
Okay, but...
link |
01:46:07.320
Proud level two.
link |
01:46:08.680
I'm proud level two.
link |
01:46:09.760
I love it.
link |
01:46:10.600
So the other thing that people have some concern about
link |
01:46:14.400
with AI in general is hacking.
link |
01:46:17.800
So how hard is it, do you think, to hack an autonomous vehicle
link |
01:46:21.400
either through physical access or through the more sort of
link |
01:46:25.000
popular now, these adversarial examples on the sensors?
link |
01:46:28.240
Okay, the adversarial examples one.
link |
01:46:30.720
You want to see some adversarial examples
link |
01:46:32.320
that affect humans, right?
link |
01:46:34.880
Oh, well, there used to be a stop sign here,
link |
01:46:38.040
but I put a black bag over the stop sign
link |
01:46:40.000
and then people ran it, adversarial, right?
link |
01:46:43.520
Like, there's tons of human adversarial examples too.
link |
01:46:48.360
The question in general about security, if you saw,
link |
01:46:52.240
something just came out today and there are always
link |
01:46:54.040
such hypey headlines about how Navigate on Autopilot
link |
01:46:57.560
was fooled by a GPS spoof to take an exit.
link |
01:47:00.960
Right.
link |
01:47:01.800
At least that's all they could do was take an exit.
link |
01:47:03.920
If your car is relying on GPS in order
link |
01:47:06.720
to have a safe driving policy, you're doing something wrong.
link |
01:47:10.240
If you're relying, and this is why V2V
link |
01:47:12.680
is such a terrible idea, V2V now relies on both parties
link |
01:47:18.160
getting communication right.
link |
01:47:19.800
This is not even, so I think of safety,
link |
01:47:26.080
security is like a special case of safety, right?
link |
01:47:28.480
Safety is like we put a little, you know,
link |
01:47:31.880
piece of caution tape around the hole
link |
01:47:33.360
so that people won't walk into it by accident.
link |
01:47:35.560
Security is like put a 10 foot fence around the hole
link |
01:47:38.240
so you actually physically cannot climb into it
link |
01:47:40.120
with barbed wire on the top and stuff, right?
link |
01:47:42.360
So like if you're designing systems
link |
01:47:44.560
that are like unreliable, they're definitely not secure.
link |
01:47:48.440
Your car should always do something safe
link |
01:47:51.240
using its local sensors.
link |
01:47:53.400
And then the local sensor should be hardwired.
link |
01:47:55.240
And then could somebody hack into your CAN bus
link |
01:47:57.400
and turn your steering wheel or slam on your brakes?
link |
01:47:58.640
Yes, but they could do it before comma AI too, so.
link |
01:48:02.800
Let's think out of the box and some things.
link |
01:48:04.680
So do you think teleoperation has a role in any of this?
link |
01:48:09.400
So remotely stepping in and controlling the cars?
link |
01:48:13.880
No, I think that if the safety operation
link |
01:48:21.320
by design requires a constant link to the cars,
link |
01:48:26.160
I think it doesn't work.
link |
01:48:27.560
So that's the same argument used for V2I, V2V.
link |
01:48:31.080
Well, there's a lot of non safety critical stuff
link |
01:48:34.280
you can do with V2I.
link |
01:48:35.120
I like V2I, I like V2I way more than V2V
link |
01:48:37.440
because V2I is already like,
link |
01:48:39.280
I already have internet in the car, right?
link |
01:48:40.880
There's a lot of great stuff you can do with V2I.
link |
01:48:44.280
Like for example, you can,
link |
01:48:46.320
well, we already have V2I: Waze is V2I, right?
link |
01:48:48.880
Waze can route me around traffic jams.
link |
01:48:50.520
That's a great example of V2I.
link |
01:48:52.760
And then, okay, the car automatically talks
link |
01:48:54.440
to that same service, like it works.
link |
01:48:55.800
So it's improving the experience,
link |
01:48:56.800
but it's not a fundamental fallback for safety.
link |
01:48:59.480
No, if any of your things that require
link |
01:49:04.160
wireless communication are more than QM,
link |
01:49:07.480
like have an ASIL rating, you shouldn't.
link |
01:49:10.640
You previously said that life is work
link |
01:49:15.440
and that you don't do anything to relax.
link |
01:49:17.480
So how do you think about hard work?
link |
01:49:20.800
Well, what is it?
link |
01:49:22.200
What do you think it takes to accomplish great things?
link |
01:49:24.720
And there's a lot of people saying
link |
01:49:25.840
that there needs to be some balance.
link |
01:49:28.280
You know, you need to,
link |
01:49:29.600
in order to accomplish great things,
link |
01:49:31.120
you need to take some time off,
link |
01:49:32.200
you need to reflect and so on.
link |
01:49:34.640
And then some people are just insanely working,
link |
01:49:37.840
burning the candle on both ends.
link |
01:49:39.640
How do you think about that?
link |
01:49:41.360
I think I was trolling in the Siraj interview
link |
01:49:43.400
when I said that. Off camera,
link |
01:49:45.600
right before, I smoked a little bit of weed,
link |
01:49:47.240
like, you know, come on, this is a joke, right?
link |
01:49:49.800
Like I do nothing to relax.
link |
01:49:50.880
Look where I am, I'm at a party, right?
link |
01:49:52.560
Yeah, yeah, yeah, sure.
link |
01:49:53.960
That's true.
link |
01:49:55.200
So no, no, of course I don't.
link |
01:49:58.040
When I say that life is work though,
link |
01:49:59.800
I mean that like, I think that
link |
01:50:01.960
what gives my life meaning is work.
link |
01:50:04.200
I don't mean that every minute of the day
link |
01:50:05.720
you should be working.
link |
01:50:06.560
I actually think this is not the best way
link |
01:50:08.000
to maximize results.
link |
01:50:09.800
I think that if you're working 12 hours a day,
link |
01:50:12.040
you should be working smarter and not harder.
link |
01:50:14.920
Well, so work gives you meaning
link |
01:50:17.880
for some people; other sorts of meaning
link |
01:50:20.520
are personal relationships, like family and so on.
link |
01:50:24.560
You've also, in that interview with Siraj,
link |
01:50:27.200
or the trolling mentioned that one of the things
link |
01:50:30.720
you look forward to in the future is AI girlfriends.
link |
01:50:33.400
Yes.
link |
01:50:34.360
So that's a topic that I'm very much fascinated by,
link |
01:50:38.800
not necessarily girlfriends,
link |
01:50:39.840
but just forming a deep connection with AI.
link |
01:50:41.880
Yeah.
link |
01:50:42.960
What kind of system do you imagine
link |
01:50:44.400
when you say AI girlfriend,
link |
01:50:46.240
whether you were trolling or not?
link |
01:50:47.800
No, that one I'm very serious about.
link |
01:50:49.720
And I'm serious about that on both a shallow level
link |
01:50:52.360
and a deep level.
link |
01:50:53.680
I think that VR brothels are coming soon
link |
01:50:55.720
and are gonna be really cool.
link |
01:50:57.800
It's not cheating if it's a robot.
link |
01:50:59.760
I see the slogan already.
link |
01:51:01.080
Um, but...
link |
01:51:04.320
There's a, I don't know if you've watched
link |
01:51:06.200
or just watched the Black Mirror episode.
link |
01:51:08.320
I watched the latest one, yeah.
link |
01:51:09.320
Yeah, yeah.
link |
01:51:11.320
Oh, the Ashley Too one?
link |
01:51:13.160
Or the...
link |
01:51:15.120
No, where there's two friends
link |
01:51:16.920
who were having sex with each other in...
link |
01:51:20.160
Oh, in the VR game.
link |
01:51:21.240
In the VR game, it's the two guys,
link |
01:51:23.560
but one of them was a female, yeah.
link |
01:51:26.720
Yeah, the...
link |
01:51:27.560
Which is another mind blowing concept.
link |
01:51:29.560
That in VR, you don't have to be the form.
link |
01:51:33.320
You can be two animals having sex, it's weird.
link |
01:51:37.720
I mean, it depends how nicely
link |
01:51:38.560
the software maps the nerve endings, right?
link |
01:51:40.280
Yeah, it's weird.
link |
01:51:41.600
I mean, yeah, they sweep a lot of the fascinating,
link |
01:51:44.480
really difficult technical challenges under the rug,
link |
01:51:46.440
like assuming it's possible
link |
01:51:48.360
to do the mapping of the nerve endings, then...
link |
01:51:51.160
I wish, yeah, I saw that.
link |
01:51:52.000
The way they did it with the little like stim unit
link |
01:51:53.800
on the head, that'd be amazing.
link |
01:51:56.800
So, well, no, no, on a shallow level,
link |
01:51:58.760
like you could set up like almost a brothel
link |
01:52:01.640
with like RealDolls and Oculus Quests,
link |
01:52:05.160
write some good software.
link |
01:52:06.200
I think it'd be a cool novelty experience.
link |
01:52:09.280
But no, on a deeper, like emotional level.
link |
01:52:12.800
I mean, yeah, I would really like to fall in love
link |
01:52:16.960
with the machine.
link |
01:52:18.120
Do you see yourself having a long term relationship
link |
01:52:23.120
of the kind monogamous relationship that we have now
link |
01:52:27.520
with the robot, with the AI system, even?
link |
01:52:31.360
Not even just the robot.
link |
01:52:32.680
So, I think about maybe my ideal future.
link |
01:52:38.200
When I was 15, I read Eliezer Yudkowsky's early writings
link |
01:52:44.320
on the singularity and like that AI
link |
01:52:49.120
is going to surpass human intelligence massively.
link |
01:52:53.040
He made some Moore's law based predictions
link |
01:52:55.480
that I mostly agree with.
link |
01:52:57.400
And then I really struggled
link |
01:52:59.360
for the next couple of years of my life.
link |
01:53:01.360
Like, why should I even bother to learn anything?
link |
01:53:03.360
It's all gonna be meaningless when the machines show up.
link |
01:53:06.160
Right.
link |
01:53:07.000
Well, maybe when I was that young,
link |
01:53:10.520
I was still a little bit more pure
link |
01:53:12.040
and really like clung to that.
link |
01:53:13.160
And then I'm like, well, the machines ain't here yet.
link |
01:53:14.720
You know, and I seem to be pretty good at this stuff.
link |
01:53:16.800
Let's try my best, you know,
link |
01:53:18.520
like what's the worst that happens?
link |
01:53:20.320
But the best possible future I see
link |
01:53:23.440
is me sort of merging with the machine.
link |
01:53:26.120
And the way that I personify this
link |
01:53:28.120
is in a long-term, monogamous relationship with the machine.
link |
01:53:32.160
Oh, you don't think there's room
link |
01:53:33.320
for another human in your life
link |
01:53:35.040
if you really truly merge with another machine?
link |
01:53:38.440
I mean, I see merging.
link |
01:53:40.240
I see like the best interface to my brain
link |
01:53:45.520
is like the same relationship interface
link |
01:53:48.000
to merge with an AI, right?
link |
01:53:49.320
What does that merging feel like?
link |
01:53:52.440
I've seen couples who've been together for a long time
link |
01:53:55.320
and like, I almost think of them as one person.
link |
01:53:57.840
Like couples who spend all their time together and...
link |
01:54:01.280
That's fascinating.
link |
01:54:02.120
You're actually putting,
link |
01:54:03.320
what does that merging actually look like?
link |
01:54:05.520
It's not just a nice channel.
link |
01:54:07.600
Like a lot of people imagine it's just an efficient link,
link |
01:54:11.640
search link to Wikipedia or something.
link |
01:54:13.800
I don't believe in that.
link |
01:54:14.640
But it's more, you're saying that there's the same kind of,
link |
01:54:17.120
the same kind of relationship you have with another human
link |
01:54:19.520
as a deep relationship; that's what merging looks like.
link |
01:54:22.960
That's pretty...
link |
01:54:24.480
I don't believe that link is possible.
link |
01:54:26.680
I think that that link, so you're like,
link |
01:54:28.120
oh, I'm gonna download Wikipedia right to my brain.
link |
01:54:30.160
My reading speed is not limited by my eyes.
link |
01:54:33.360
My reading speed is limited by my inner processing loop.
link |
01:54:36.800
And to like bootstrap that
link |
01:54:38.680
sounds kind of unclear how to do it and horrifying.
link |
01:54:42.440
But if I am with somebody, and I'll use somebody
link |
01:54:46.560
who is making a super sophisticated model of me
link |
01:54:51.400
and then running simulations on that model,
link |
01:54:53.200
I'm not gonna get into the question
link |
01:54:54.120
whether the simulations are conscious or not.
link |
01:54:55.880
I don't really wanna know what it's doing.
link |
01:54:58.240
But using those simulations to play out hypothetical futures
link |
01:55:01.600
for me, deciding what things to say to me
link |
01:55:04.880
to guide me along a path and that's how I envision it.
link |
01:55:08.720
So on that path to AI of super human level intelligence,
link |
01:55:13.720
you've mentioned that you believe in the singularity,
link |
01:55:15.680
that singularity is coming.
link |
01:55:17.280
Again, could be trolling, could be not, could be part...
link |
01:55:20.440
All trolling has truth in it.
link |
01:55:21.760
I don't know what that means anymore.
link |
01:55:22.840
What is the singularity?
link |
01:55:24.520
So yeah, so that's really the question.
link |
01:55:26.720
How many years do you think before the singularity
link |
01:55:29.280
and what form do you think it will take?
link |
01:55:30.920
Does that mean fundamental shifts in capabilities of AI?
link |
01:55:34.200
Does it mean some other kind of ideas?
link |
01:55:36.960
Maybe that's just my roots, but...
link |
01:55:40.120
So I can buy a human being's worth of compute,
link |
01:55:42.920
a brain's worth of compute, for like a million bucks today.
link |
01:55:46.000
It's about one TPU pod V3.
link |
01:55:47.800
I want, like... I think they claim a hundred petaflops.
link |
01:55:50.240
That's being generous.
link |
01:55:51.080
I think humans are actually more like 20.
link |
01:55:52.320
So that's like five humans.
link |
01:55:53.160
That's pretty good.
link |
01:55:54.040
Google needs to sell their TPUs.
link |
01:55:56.840
But no, I could buy GPUs.
link |
01:55:58.640
I could buy a stack of, like, 1080 Tis,
link |
01:56:02.280
build a data center full of them.
link |
01:56:03.880
And for a million bucks, I can get a human worth of compute.
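To make the arithmetic above concrete, here is a back-of-envelope sketch in Python. The petaflops and price figures are the rough claims from the conversation, not measured numbers.

```python
# Back-of-envelope version of the compute comparison above.
# Figures are the rough claims from the conversation, not measurements.
TPU_V3_POD_PFLOPS = 100   # claimed peak compute of one TPU v3 pod
HUMAN_PFLOPS = 20         # George's estimate of a human brain's compute
POD_COST_USD = 1_000_000  # order-of-magnitude price figure used above

humans_per_pod = TPU_V3_POD_PFLOPS / HUMAN_PFLOPS  # ~5 "humans" per pod
print(f"One pod is roughly {humans_per_pod:.0f} humans of compute,")
print(f"about ${POD_COST_USD / humans_per_pod:,.0f} per human of compute.")
```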
link |
01:56:08.160
But when you look at the total number of flops in the world,
link |
01:56:12.280
when you look at human flops,
link |
01:56:14.400
which goes up very, very slowly with the population,
link |
01:56:17.040
and machine flops, which goes up exponentially,
link |
01:56:19.760
but it's still nowhere near.
link |
01:56:22.360
I think that's the key thing
link |
01:56:24.040
to talk about when the singularity happens.
link |
01:56:25.880
When most flops in the world are silicon
link |
01:56:28.560
and not biological, that's kind of the crossing point.
link |
01:56:32.280
Like they are now the dominant species on the planet.
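A toy projection of that crossing point, in Python. The starting total and growth rates below are illustrative assumptions, not data; only the per-brain estimate comes from the conversation.

```python
# Toy model of the crossing point: total silicon flops vs. total
# biological flops. Starting totals and growth rates are made up.
human_pflops = 20                      # per-brain estimate from above
population = 7.7e9                     # approximate 2019 world population
bio_total = human_pflops * population  # total biological petaflops

machine_total = 1e6   # assumed total machine petaflops in 2019 (made up)
machine_growth = 2.0  # assume machine flops double every year (made up)
pop_growth = 1.01     # ~1% population growth per year

year = 2019
while machine_total < bio_total:
    machine_total *= machine_growth
    bio_total *= pop_growth
    year += 1

print(year)  # lands in the late 2030s with these particular assumptions
```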
link |
01:56:35.480
And just looking at how technology is progressing,
link |
01:56:38.720
when do you think that could possibly happen?
link |
01:56:40.360
Do you think it would happen in your lifetime?
link |
01:56:41.680
Oh yeah, definitely in my lifetime.
link |
01:56:43.640
I've done the math.
link |
01:56:44.480
I like 2038 because it's the UNIX timestamp rollover.
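For reference, the rollover he means is the signed 32-bit Unix time_t overflow, which a couple of lines of Python can confirm:

```python
# A signed 32-bit Unix timestamp overflows 2**31 - 1 seconds after
# 1970-01-01 00:00:00 UTC, which lands in 2038.
from datetime import datetime, timezone

rollover = datetime.fromtimestamp(2**31 - 1, tz=timezone.utc)
print(rollover)  # 2038-01-19 03:14:07+00:00
```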
link |
01:56:49.920
Yeah, beautifully put.
link |
01:56:52.680
So you've said that the meaning of life is to win.
link |
01:56:58.000
If you look five years into the future,
link |
01:56:59.560
what does winning look like?
link |
01:57:02.640
So...
link |
01:57:03.720
I can go into technical depth on what I mean by that, to win.
link |
01:57:11.720
It may not mean...
link |
01:57:12.720
I was criticized for that in the comments.
link |
01:57:14.400
Like, doesn't this guy want to save the penguins in Antarctica?
link |
01:57:17.720
Or like, oh man, listen to what I'm saying.
link |
01:57:20.960
I'm not talking about like I have a yacht or something.
link |
01:57:24.720
I am an agent.
link |
01:57:26.720
I am put into this world.
link |
01:57:28.720
And I don't really know what my purpose is.
link |
01:57:33.720
But if you're a reinforcement learning agent, if you're an intelligent agent
link |
01:57:36.720
and you're put into a world, what is the ideal thing to do?
link |
01:57:39.720
Well, the ideal thing, mathematically,
link |
01:57:41.720
you can go back to, like, Schmidhuber's theories about this,
link |
01:57:43.720
is to build a compressive model of the world.
link |
01:57:46.720
To build a maximally compressive model, to explore the world
link |
01:57:49.720
such that your exploration function maximizes
link |
01:57:52.720
the derivative of compression of the past.
link |
01:57:55.720
Schmidhuber has a paper about this.
link |
01:57:58.720
And like, I took that kind of as like a personal goal function.
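A minimal sketch of that goal function, with zlib standing in as a crude proxy for a learned compressor. Schmidhuber's formal definition measures a model's compression progress over time; this only captures the flavor of it.

```python
# Crude sketch of Schmidhuber-style compression progress: reward is how
# much better history + observation compress together than apart.
# zlib is a stand-in for a learned compressor, so this is only a proxy.
import zlib

def compression_progress(history: bytes, observation: bytes) -> int:
    apart = len(zlib.compress(history)) + len(zlib.compress(observation))
    together = len(zlib.compress(history + observation))
    # Positive and large when the observation shares structure with the
    # past, i.e. when there is a regularity worth learning.
    return apart - together

history = b"the quick brown fox jumps over the lazy dog. " * 20
print(compression_progress(history, b"the quick brown fox "))  # structured: higher
print(compression_progress(history, b"\x8f\x02\xa9q9z"))       # noise: lower
```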
link |
01:58:02.720
So what I mean by win, I mean like,
link |
01:58:04.720
maybe this is religious, but like I think that in the future
link |
01:58:08.720
I might be given a real purpose.
link |
01:58:10.720
Or I may decide this purpose myself.
link |
01:58:12.720
And then at that point, now I know what the game is
link |
01:58:14.720
and I know how to win.
link |
01:58:15.720
I think right now I'm still just trying to figure out what the game is.
link |
01:58:18.720
But once I know...
link |
01:58:20.720
So you have...
link |
01:58:22.720
You have imperfect information.
link |
01:58:25.720
You have a lot of uncertainty about the reward function
link |
01:58:27.720
and you're discovering it.
link |
01:58:28.720
Exactly.
link |
01:58:29.720
But the purpose is...
link |
01:58:30.720
That's a better way to put it.
link |
01:58:31.720
The purpose is to maximize it
link |
01:58:33.720
while you have a lot of uncertainty around it.
link |
01:58:36.720
And you're both reducing the uncertainty
link |
01:58:38.720
and maximizing at the same time.
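That tension has a textbook form. Here is a small sketch framing it as a multi-armed bandit, where a UCB-style rule maximizes reward and shrinks uncertainty with the same action choice; the arm probabilities are arbitrary stand-ins for the unknown reward function.

```python
# "Maximize while reducing uncertainty" as a multi-armed bandit.
# UCB1 picks the arm with the best optimistic estimate, so exploration
# (shrinking uncertainty) and exploitation (maximizing) share one rule.
import math
import random

true_means = [0.2, 0.5, 0.8]  # hidden reward function (unknown to the agent)
counts = [0] * len(true_means)
totals = [0.0] * len(true_means)

for t in range(1, 1001):
    def ucb(i: int) -> float:
        if counts[i] == 0:
            return float("inf")  # try every arm at least once
        mean = totals[i] / counts[i]
        bonus = math.sqrt(2 * math.log(t) / counts[i])  # uncertainty bonus
        return mean + bonus

    arm = max(range(len(true_means)), key=ucb)
    reward = 1.0 if random.random() < true_means[arm] else 0.0
    counts[arm] += 1
    totals[arm] += reward

print("pulls per arm:", counts)  # pulls concentrate on the best arm
```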
link |
01:58:40.720
And so that's at the technical level.
link |
01:58:43.720
What is the...
link |
01:58:44.720
If you believe in the universal prior,
link |
01:58:46.720
what is the universal reward function?
link |
01:58:48.720
That's the better way to put it.
link |
01:58:50.720
So that win is interesting.
link |
01:58:53.720
I think I speak for everyone in saying that
link |
01:58:56.720
I wonder what that reward function is for you.
link |
01:59:01.720
And I look forward to seeing that in five years and ten years.
link |
01:59:06.720
I think a lot of people including myself are cheering you on, man.
link |
01:59:09.720
So I'm happy you exist.
link |
01:59:11.720
And I wish you the best of luck.
link |
01:59:13.720
Thanks for talking today, man.
link |
01:59:14.720
Thank you.
link |
01:59:15.720
This was a lot of fun.