
George Hotz: Hacking the Simulation & Learning to Drive with Neural Nets | Lex Fridman Podcast #132



link |
00:00:00.000
The following is a conversation with George Hotz,
link |
00:00:02.440
AKA Geohot, his second time on the podcast.
link |
00:00:06.520
He's the founder of Comma AI,
link |
00:00:09.320
an autonomous and semi-autonomous vehicle technology company
link |
00:00:12.880
that seeks to be to Tesla Autopilot
link |
00:00:15.840
what Android is to iOS.
link |
00:00:18.920
They sell the Comma 2 device for $1,000
link |
00:00:22.760
that when installed in many of their supported cars
link |
00:00:25.600
can keep the vehicle centered in the lane
link |
00:00:27.960
even when there are no lane markings.
link |
00:00:30.760
It includes driver sensing
link |
00:00:32.400
that ensures that the driver's eyes are on the road.
link |
00:00:35.640
As you may know, I'm a big fan of driver sensing.
link |
00:00:38.440
I do believe Tesla Autopilot and others
link |
00:00:40.560
should definitely include it in their sensor suite.
link |
00:00:43.520
Also, I'm a fan of Android and a big fan of George
link |
00:00:47.120
for many reasons,
link |
00:00:48.240
including his nonlinear out of the box brilliance
link |
00:00:51.640
and the fact that he's a superstar programmer
link |
00:00:55.120
of a very different style than myself.
link |
00:00:57.360
Styles make fights and styles make conversations.
link |
00:01:01.160
So I really enjoyed this chat
link |
00:01:02.920
and I'm sure we'll talk many more times on this podcast.
link |
00:01:06.280
Quick mention of a sponsor
link |
00:01:07.680
followed by some thoughts related to the episode.
link |
00:01:10.160
First is Four Sigmatic,
link |
00:01:12.240
the maker of delicious mushroom coffee.
link |
00:01:15.360
Second is Decoding Digital,
link |
00:01:17.520
a podcast on tech and entrepreneurship
link |
00:01:19.760
that I listen to and enjoy.
link |
00:01:22.080
And finally, ExpressVPN,
link |
00:01:24.520
the VPN I've used for many years to protect my privacy
link |
00:01:27.800
on the internet.
link |
00:01:29.320
Please check out the sponsors in the description
link |
00:01:31.280
to get a discount and to support this podcast.
link |
00:01:34.840
As a side note, let me say that my work at MIT
link |
00:01:38.080
on autonomous and semi-autonomous vehicles
link |
00:01:40.480
led me to study the human side of autonomy
link |
00:01:43.080
enough to understand that it's a beautifully complicated
link |
00:01:46.600
and interesting problem space,
link |
00:01:48.600
much richer than what can be studied in the lab.
link |
00:01:51.800
In that sense, the data that Comma AI, Tesla Autopilot
link |
00:01:55.360
and perhaps others like Cadillac Super Cruise are collecting
link |
00:01:58.480
gives us a chance to understand
link |
00:02:00.560
how we can design safe semi-autonomous vehicles
link |
00:02:03.800
for real human beings in real world conditions.
link |
00:02:07.560
I think this requires bold innovation
link |
00:02:09.920
and a serious exploration of the first principles
link |
00:02:13.000
of the driving task itself.
link |
00:02:15.640
If you enjoyed this thing, subscribe on YouTube,
link |
00:02:17.880
review it with five stars on Apple Podcasts,
link |
00:02:20.160
follow on Spotify, support on Patreon
link |
00:02:22.760
or connect with me on Twitter at Lex Fridman.
link |
00:02:26.280
And now here's my conversation with George Hotz.
link |
00:02:31.360
So last time we started talking about the simulation,
link |
00:02:34.040
this time let me ask you,
link |
00:02:35.640
do you think there's intelligent life out there
link |
00:02:37.480
in the universe?
link |
00:02:38.600
I've always maintained my answer to the Fermi paradox.
link |
00:02:41.640
I think there has been intelligent life
link |
00:02:44.440
elsewhere in the universe.
link |
00:02:45.880
So intelligent civilizations existed
link |
00:02:47.920
but they've blown themselves up.
link |
00:02:49.240
So your general intuition is that
link |
00:02:50.760
intelligent civilizations quickly,
link |
00:02:54.560
like there's that parameter in the Drake equation.
link |
00:02:57.760
Your sense is they don't last very long.
link |
00:02:59.640
Yeah.
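(The parameter being referenced is L, the average lifetime of a communicating civilization in the Drake equation. A minimal sketch of how it dominates the estimate; every value below is an illustrative guess, not a measurement.)

```python
# Drake equation: N = R* * fp * ne * fl * fi * fc * L
# All parameter values here are illustrative guesses, not measurements.
def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
    """Expected number of detectable civilizations in the galaxy."""
    return R_star * f_p * n_e * f_l * f_i * f_c * L

# Generous values everywhere except the lifetime L: the "they don't last
# very long" intuition is the claim that L is the term doing the damage.
print(drake(R_star=2, f_p=0.9, n_e=1, f_l=0.5, f_i=0.5, f_c=0.5, L=300))  # 67.5
```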
link |
00:03:00.480
How are we doing on that?
link |
00:03:01.560
Like, have we lasted pretty good?
link |
00:03:03.680
Oh no.
link |
00:03:04.520
Are we due?
link |
00:03:05.360
Oh yeah.
link |
00:03:06.200
I mean, not quite yet.
link |
00:03:09.280
Well, as the saying goes,
link |
00:03:10.440
the IQ required to destroy the world
link |
00:03:13.520
falls by one point every year.
link |
00:03:15.440
Okay.
link |
00:03:16.280
Technology democratizes the destruction of the world.
link |
00:03:21.120
When can a meme destroy the world?
link |
00:03:23.120
It kind of is already, right?
link |
00:03:27.280
Somewhat.
link |
00:03:28.480
I don't think we've seen anywhere near the worst of it yet.
link |
00:03:32.240
Well, it's going to get weird.
link |
00:03:34.000
Well, maybe a meme can save the world.
link |
00:03:36.480
You thought about that?
link |
00:03:37.480
The meme Lord Elon Musk fighting on the side of good
link |
00:03:40.800
versus the meme Lord of the darkness,
link |
00:03:44.560
which is not saying anything bad about Donald Trump,
link |
00:03:48.280
but he is the Lord of the meme on the dark side.
link |
00:03:51.720
He's the Darth Vader of memes.
link |
00:03:53.760
I think in every fairy tale they always end it with,
link |
00:03:58.360
and they lived happily ever after.
link |
00:03:59.920
And I'm like, please tell me more
link |
00:04:00.960
about this happily ever after.
link |
00:04:02.440
I've heard 50% of marriages end in divorce.
link |
00:04:05.880
Why doesn't your marriage end up there?
link |
00:04:07.840
You can't just say happily ever after.
link |
00:04:09.280
So it's the thing about destruction
link |
00:04:12.280
is it's over after the destruction.
link |
00:04:14.840
We have to do everything right in order to avoid it.
link |
00:04:18.160
And one thing wrong,
link |
00:04:20.040
I mean, actually this is what I really like
link |
00:04:21.920
about cryptography.
link |
00:04:22.960
Cryptography, it seems like we live in a world
link |
00:04:24.600
where the defense wins versus like nuclear weapons.
link |
00:04:29.640
The opposite is true.
link |
00:04:30.920
It is much easier to build a warhead
link |
00:04:32.960
that splits into a hundred little warheads
link |
00:04:34.520
than to build something that can, you know,
link |
00:04:36.440
take out a hundred little warheads.
link |
00:04:38.880
The offense has the advantage there.
link |
00:04:41.400
So maybe our future is in crypto, but.
link |
00:04:44.520
So cryptography, right.
link |
00:04:45.720
The Goliath is the defense.
link |
00:04:49.760
And then all the different hackers are the Davids.
link |
00:04:54.280
And that equation is flipped for nuclear war.
link |
00:04:57.840
Cause there's so many,
link |
00:04:58.800
like one nuclear weapon destroys everything essentially.
link |
00:05:01.960
Yeah, and it is much easier to attack with a nuclear weapon
link |
00:05:06.960
than it is to like the technology required to intercept
link |
00:05:09.640
and destroy a rocket is much more complicated
link |
00:05:12.080
than the technology required to just, you know,
link |
00:05:13.800
orbital trajectory, send a rocket to somebody.
link |
00:05:17.480
So, okay.
link |
00:05:18.520
Your intuition that there were intelligent civilizations
link |
00:05:21.880
out there, but it's very possible
link |
00:05:24.360
that they're no longer there.
link |
00:05:26.240
That's kind of a sad picture.
link |
00:05:27.640
They enter some steady state.
link |
00:05:29.520
They all wirehead themselves.
link |
00:05:31.520
What's wirehead?
link |
00:05:33.360
Stimulate, stimulate their pleasure centers
link |
00:05:35.320
and just, you know, live forever in this kind of stasis.
link |
00:05:39.680
They become, well, I mean,
link |
00:05:42.560
I think the reason I believe this is because where are they?
link |
00:05:46.320
If there's some reason they stopped expanding,
link |
00:05:50.680
cause otherwise they would have taken over the universe.
link |
00:05:52.160
The universe isn't that big.
link |
00:05:53.440
Or at least, you know,
link |
00:05:54.280
let's just talk about the galaxy, right?
link |
00:05:56.120
That's 70,000 light years across.
link |
00:05:58.720
I took that number from Star Trek Voyager.
link |
00:05:59.920
I don't know how true it is, but yeah, that's not big.
link |
00:06:04.680
Right? 70,000 light years is nothing.
link |
00:06:07.320
For some possible technology that you can imagine
link |
00:06:10.040
that can leverage like wormholes or something like that.
link |
00:06:12.320
Or you don't even need wormholes.
link |
00:06:13.320
Just a von Neumann probe is enough.
link |
00:06:15.120
A von Neumann probe and a million years of sublight travel
link |
00:06:18.440
and you'd have taken over the whole universe.
link |
00:06:20.440
That clearly didn't happen.
link |
00:06:22.480
So something stopped it.
link |
00:06:24.000
So you mean if you, right,
link |
00:06:25.400
for like a few million years,
link |
00:06:27.040
if you sent out probes that travel close,
link |
00:06:29.880
what's sublight?
link |
00:06:30.720
You mean close to the speed of light?
link |
00:06:32.280
Let's say 0.1 C.
link |
00:06:33.720
And it just spreads.
link |
00:06:34.800
Interesting.
link |
00:06:35.640
Actually, that's an interesting calculation, huh?
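(The back-of-the-envelope version of that calculation, using the 70,000 light-year figure quoted above:)

```python
# How long would a 0.1c expansion wave take to cross the galaxy?
galaxy_diameter_ly = 70_000   # the Star Trek Voyager figure quoted above
speed = 0.1                   # fraction of the speed of light
print(galaxy_diameter_ly / speed)  # 700,000 years, ignoring replication stops
```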
link |
00:06:38.800
So what makes you think that we'd be able
link |
00:06:40.640
to communicate with them?
link |
00:06:42.280
Like, yeah, what's,
link |
00:06:45.160
why do you think we would be able
link |
00:06:47.920
to comprehend intelligent life that's out there?
link |
00:06:51.920
Like even if they were among us kind of thing,
link |
00:06:54.960
like, or even just flying around?
link |
00:06:57.600
Well, I mean, that's possible.
link |
00:07:01.200
It's possible that there is some sort of prime directive.
link |
00:07:04.640
That'd be a really cool universe to live in.
link |
00:07:07.040
And there's some reason
link |
00:07:08.000
they're not making themselves visible to us.
link |
00:07:10.920
But it makes sense that they would use the same,
link |
00:07:15.200
well, at least the same entropy.
link |
00:07:16.960
Well, you're implying the same laws of physics.
link |
00:07:18.800
I don't know what you mean by entropy in this case.
link |
00:07:20.800
Oh, yeah.
link |
00:07:21.920
I mean, if entropy is the scarce resource in the universe.
link |
00:07:25.040
So what do you think about like Stephen Wolfram
link |
00:07:26.960
and everything is a computation?
link |
00:07:28.840
And then what if they are traveling through
link |
00:07:31.440
this world of computation?
link |
00:07:32.640
So if you think of the universe
link |
00:07:34.240
as just information processing,
link |
00:07:36.600
then what you're referring to with entropy
link |
00:07:40.840
and then these pockets of interesting complex computation
link |
00:07:44.240
swimming around, how do we know they're not already here?
link |
00:07:47.160
How do we know that this,
link |
00:07:51.040
like all the different amazing things
link |
00:07:53.040
that are full of mystery on earth
link |
00:07:55.080
are just like little footprints of intelligence
link |
00:07:58.640
from light years away?
link |
00:08:01.160
Maybe.
link |
00:08:02.840
I mean, I tend to think that as civilizations expand,
link |
00:08:05.760
they use more and more energy
link |
00:08:07.800
and you can never overcome the problem of waste heat.
link |
00:08:10.240
So where is their waste heat?
link |
00:08:11.880
So with our crude methods, we'd
link |
00:08:13.560
be able to see like, there's a whole lot of energy here.
link |
00:08:18.840
But it could be something we're not,
link |
00:08:20.560
I mean, we don't understand dark energy, right?
link |
00:08:22.480
Dark matter.
link |
00:08:23.560
It could be just stuff we don't understand at all.
link |
00:08:26.160
Or they can have a fundamentally different physics,
link |
00:08:29.080
you know, like that we just don't even comprehend.
link |
00:08:32.200
Well, I think, okay,
link |
00:08:33.440
I mean, it depends how far out you wanna go.
link |
00:08:35.120
I don't think physics is very different
link |
00:08:36.840
on the other side of the galaxy.
link |
00:08:39.760
I would suspect that they have,
link |
00:08:41.920
I mean, if they're in our universe,
link |
00:08:43.680
they have the same physics.
link |
00:08:45.760
Well, yeah, that's the assumption we have,
link |
00:08:47.600
but there could be like super trippy things
link |
00:08:50.000
like our cognition only gets to a slice,
link |
00:08:57.040
and all the possible instruments that we can design
link |
00:08:59.440
only get to a particular slice of the universe.
link |
00:09:01.520
And there's something much like weirder.
link |
00:09:04.080
Maybe we can try a thought experiment.
link |
00:09:06.880
Would people from the past
link |
00:09:10.040
be able to detect the remnants of our,
link |
00:09:14.000
or would we be able to detect our modern civilization?
link |
00:09:16.600
I think the answer is obviously yes.
link |
00:09:18.840
You mean past from a hundred years ago?
link |
00:09:20.720
Well, let's even go back further.
link |
00:09:22.080
Let's go to a million years ago, right?
link |
00:09:24.400
The humans who were lying around in the desert
link |
00:09:26.520
probably didn't even have,
link |
00:09:27.720
maybe they just barely had fire.
link |
00:09:31.360
They would understand if a 747 flew overhead.
link |
00:09:35.200
Oh, in this vicinity, but not if a 747 flew on Mars.
link |
00:09:43.200
Like, cause they wouldn't be able to see far,
link |
00:09:45.080
cause we're not actually communicating that well
link |
00:09:47.240
with the rest of the universe.
link |
00:09:48.920
We're doing okay.
link |
00:09:50.160
Just sending out random, like, '50s tracks of music.
link |
00:09:54.200
True.
link |
00:09:55.160
And yeah, I mean, they'd have to, you know,
link |
00:09:57.800
we've only been broadcasting radio waves for 150 years.
link |
00:10:02.480
And well, there's your light cone.
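(The light cone point, as arithmetic: our radio emissions form a sphere whose radius in light-years equals the years we've been broadcasting.)

```python
# Radius of our radio bubble vs. the size of the galaxy.
years_broadcasting = 150                # the figure used above
bubble_radius_ly = years_broadcasting   # light travels 1 light-year per year
galaxy_diameter_ly = 70_000
print(bubble_radius_ly / galaxy_diameter_ly)  # ~0.002, about 0.2% of the way
```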
link |
00:10:04.560
So.
link |
00:10:05.800
Yeah. Okay.
link |
00:10:06.920
What do you make about all the,
link |
00:10:08.800
I recently came across this having talked to David Fravor.
link |
00:10:14.720
I don't know if you caught what the videos
link |
00:10:16.960
of the Pentagon released
link |
00:10:18.840
and the New York Times reporting of the UFO sightings.
link |
00:10:23.480
So I kind of looked into it, quote unquote.
link |
00:10:26.040
And there's actually been like hundreds
link |
00:10:30.800
of thousands of UFO sightings, right?
link |
00:10:33.800
And a lot of it you can explain away
link |
00:10:35.960
in different kinds of ways.
link |
00:10:37.080
So one is it could be interesting physical phenomena.
link |
00:10:40.160
Two, it could be people wanting to believe
link |
00:10:44.640
and therefore they conjure up a lot of different things
link |
00:10:46.760
that just, you know, when you see different kinds of lights,
link |
00:10:48.920
some basic physics phenomena,
link |
00:10:50.760
and then you just conjure up ideas
link |
00:10:53.640
of possible out there mysterious worlds.
link |
00:10:56.680
But, you know, it's also possible,
link |
00:10:58.960
like you have a case of David Fravor,
link |
00:11:02.480
who is a Navy pilot, who's, you know,
link |
00:11:06.160
as legit as it gets in terms of humans
link |
00:11:08.840
who are able to perceive things in the environment
link |
00:11:13.440
and make conclusions,
link |
00:11:15.320
whether those things are a threat or not.
link |
00:11:17.600
And he and several other pilots saw a thing,
link |
00:11:22.000
I don't know if you followed this,
link |
00:11:23.440
but they saw a thing that they've since then called the Tic Tac
link |
00:11:26.840
that moved in all kinds of weird ways.
link |
00:11:29.520
They don't know what it is.
link |
00:11:30.640
It could be technology developed by the United States
link |
00:11:36.640
and they're just not aware of it
link |
00:11:38.040
at the surface level of the Navy, right?
link |
00:11:40.000
It could be different kind of lighting technology
link |
00:11:42.280
or drone technology, all that kind of stuff.
link |
00:11:45.000
It could be the Russians and the Chinese,
link |
00:11:46.560
all that kind of stuff.
link |
00:11:48.000
And of course their mind, our mind,
link |
00:11:51.400
can also venture into the possibility
link |
00:11:54.160
that it's from another world.
link |
00:11:56.320
Have you looked into this at all?
link |
00:11:58.160
What do you think about it?
link |
00:11:59.640
I think all the news is a psyop.
link |
00:12:01.680
I think that the most plausible.
link |
00:12:05.240
Nothing is real.
link |
00:12:06.440
Yeah, I listened to the, I think it was Bob Lazar
link |
00:12:10.920
on Joe Rogan.
link |
00:12:12.360
And like, I believe everything this guy is saying.
link |
00:12:15.840
And then I think that it's probably just some like MKUltra
link |
00:12:18.440
kind of thing, you know?
link |
00:12:20.680
What do you mean?
link |
00:12:21.520
Like they, you know, they made some weird thing
link |
00:12:24.720
and they called it an alien spaceship.
link |
00:12:26.160
You know, maybe it was just to like
link |
00:12:27.480
stimulate young physicists minds.
link |
00:12:29.560
We'll tell them it's alien technology
link |
00:12:31.040
and we'll see what they come up with, right?
link |
00:12:33.600
Do you find any conspiracy theories compelling?
link |
00:12:36.080
Like have you pulled at the string
link |
00:12:38.320
of the rich complex world of conspiracy theories
link |
00:12:42.440
that's out there?
link |
00:12:43.920
I think that I've heard a conspiracy theory
link |
00:12:46.520
that conspiracy theories were invented by the CIA
link |
00:12:48.960
in the 60s to discredit true things.
link |
00:12:52.040
Yeah.
link |
00:12:53.880
So, you know, you can go to ridiculous conspiracy theories
link |
00:12:58.520
like Flat Earth and Pizza Gate.
link |
00:13:01.000
And, you know, these things are almost to hide
link |
00:13:05.440
like conspiracy theories that like,
link |
00:13:08.000
you know, remember when the Chinese like locked up
link |
00:13:09.640
the doctors who discovered coronavirus?
link |
00:13:11.360
Like I tell people this and I'm like,
link |
00:13:12.840
no, no, no, that's not a conspiracy theory.
link |
00:13:14.400
That actually happened.
link |
00:13:15.920
Do you remember the time that the money used to be backed
link |
00:13:18.000
by gold and now it's backed by nothing?
link |
00:13:20.080
This is not a conspiracy theory.
link |
00:13:21.680
This actually happened.
link |
00:13:23.800
Well, that's one of my worries today
link |
00:13:26.360
with the idea of fake news is that when nothing is real,
link |
00:13:32.880
then like you dilute the possibility of anything being true
link |
00:13:37.600
by conjuring up all kinds of conspiracy theories.
link |
00:13:41.000
And then you don't know what to believe.
link |
00:13:42.400
And then like the idea of truth of objectivity
link |
00:13:46.280
is lost completely.
link |
00:13:47.840
Everybody has their own truth.
link |
00:13:50.120
So you used to control information by censoring it.
link |
00:13:53.600
And then the internet happened and governments were like,
link |
00:13:55.880
oh shit, we can't censor things anymore.
link |
00:13:58.360
I know what we'll do.
link |
00:14:00.560
You know, it's the old story of, like,
link |
00:14:04.160
a leprechaun tells you where his gold is buried
link |
00:14:07.080
and you tie one flag and you make the leprechaun swear
link |
00:14:09.080
to not remove the flag.
link |
00:14:10.040
And you come back to the field later with a shovel
link |
00:14:11.720
and there's flags everywhere.
link |
00:14:14.640
That's one way to maintain privacy, right?
link |
00:14:16.320
It's like in order to protect the contents
link |
00:14:20.280
of this conversation, for example,
link |
00:14:21.800
we could just generate like millions of deep,
link |
00:14:25.120
fake conversations where you and I talk
link |
00:14:27.560
and say random things.
link |
00:14:29.120
So this is just one of them
link |
00:14:30.440
and nobody knows which one was the real one.
link |
00:14:32.720
This could be fake right now.
link |
00:14:34.440
Classic steganography technique.
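(A minimal sketch of the decoy idea just described, hiding one real item in a pile of fakes; generate_fake() is a hypothetical stand-in for a deepfake generator.)

```python
import random

def generate_fake():
    # Hypothetical stand-in for a model that fabricates plausible decoys.
    return f"decoy conversation {random.getrandbits(32)}"

real = "the actual conversation"
pile = [generate_fake() for _ in range(999)] + [real]
random.shuffle(pile)
# Without a secret (the index, i.e. a key), an observer can't pick the real one.
```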
link |
00:14:37.240
Okay, another absurd question about intelligent life.
link |
00:14:39.960
Cause you know, you're an incredible programmer
link |
00:14:43.960
outside of everything else we'll talk about
link |
00:14:45.520
just as a programmer.
link |
00:14:49.320
Do you think intelligent beings out there,
link |
00:14:52.960
the civilizations that were out there,
link |
00:14:54.520
had computers and programming?
link |
00:14:58.240
Did they, do we naturally have to develop something
link |
00:15:01.440
where we engineer machines and are able to encode
link |
00:15:05.480
both knowledge into those machines
link |
00:15:08.680
and instructions that process that knowledge,
link |
00:15:11.720
process that information to make decisions
link |
00:15:14.260
and actions and so on?
link |
00:15:15.680
And would those programming languages,
link |
00:15:18.320
if you think they exist, be at all similar
link |
00:15:21.320
to anything we've developed?
link |
00:15:24.120
So I don't see that much of a difference
link |
00:15:26.600
between quote unquote natural languages
link |
00:15:29.400
and programming languages.
link |
00:15:34.160
Yeah.
link |
00:15:35.000
I think there's so many similarities.
link |
00:15:36.680
So when asked the question,
link |
00:15:39.920
what do alien languages look like?
link |
00:15:42.380
I imagine they're not all that dissimilar from ours.
link |
00:15:46.480
And I think translating in and out of them
link |
00:15:51.560
wouldn't be that crazy.
link |
00:15:52.920
Well, it's difficult to compile like DNA to Python
link |
00:15:57.520
and then to C.
link |
00:15:59.160
There's a little bit of a gap in the kind of languages
link |
00:16:02.000
we use for Turing machines
link |
00:16:06.840
and the kind of languages nature seems to use.
link |
00:16:10.160
Maybe that's just, we just haven't understood
link |
00:16:13.800
the kind of language that nature uses well yet.
link |
00:16:16.440
DNA is a CAD model.
link |
00:16:19.120
It's not quite a programming language.
link |
00:16:21.200
It has no sort of a serial execution.
link |
00:16:25.280
It's not quite a, yeah, it's a CAD model.
link |
00:16:29.480
So I think in that sense,
link |
00:16:30.920
we actually completely understand it.
link |
00:16:32.520
The problem is, well, simulating on these CAD models,
link |
00:16:37.300
I played with it a bit this year,
link |
00:16:38.520
is super computationally intensive.
link |
00:16:41.120
If you wanna go down to like the molecular level
link |
00:16:43.720
where you need to go to see a lot of these phenomena
link |
00:16:45.880
like protein folding.
link |
00:16:48.080
So yeah, it's not that we don't understand it.
link |
00:16:52.200
It just requires a whole lot of compute to kind of compile it.
link |
00:16:55.160
For our human minds, it's inefficient,
link |
00:16:56.680
both for the data representation and for the programming.
link |
00:17:00.520
Yeah, it runs well on raw nature.
link |
00:17:02.800
It runs well on raw nature.
link |
00:17:03.880
And when we try to build emulators or simulators for that,
link |
00:17:07.840
well, they're mad slow, but I've tried it.
link |
00:17:10.640
It runs on that, yeah. You've commented elsewhere,
link |
00:17:14.280
I don't remember where,
link |
00:17:15.780
that one of the problems is simulating nature is tough.
link |
00:17:20.780
And if you want to sort of deploy a prototype,
link |
00:17:25.300
I forgot how you put it, but it made me laugh,
link |
00:17:28.020
but animals or humans would need to be involved
link |
00:17:31.700
in order to try to run some prototype code on,
link |
00:17:38.120
like if we're talking about COVID and viruses and so on,
link |
00:17:41.180
if you were trying to engineer
link |
00:17:42.900
some kind of defense mechanisms,
link |
00:17:45.020
like a vaccine against COVID and all that kind of stuff
link |
00:17:49.580
that doing any kind of experimentation,
link |
00:17:52.060
like you can with like autonomous vehicles
link |
00:17:53.980
would be very technically and ethically costly.
link |
00:17:59.660
I'm not sure about that.
link |
00:18:00.940
I think you can do tons of crazy biology and test tubes.
link |
00:18:05.060
I think my bigger complaint is more,
link |
00:18:08.480
oh, the tools are so bad.
link |
00:18:11.420
Like literally, you mean like libraries and?
link |
00:18:14.540
I don't know, I'm not pipetting shit.
link |
00:18:16.100
Like you're handing me a, I got a, no, no, no,
link |
00:18:20.140
there has to be some.
link |
00:18:22.580
Like automating stuff.
link |
00:18:24.140
And like the, yeah, but human biology is messy.
link |
00:18:28.300
Like it seems.
link |
00:18:29.140
But like, look at those Theranos videos.
link |
00:18:31.240
They were a joke.
link |
00:18:32.080
It's like a little gantry.
link |
00:18:33.460
It's like little X, Y gantry,
link |
00:18:34.660
high school science project with the pipette.
link |
00:18:36.660
I'm like, really?
link |
00:18:38.260
Gotta be something better.
link |
00:18:39.220
You can't build like nice microfluidics
link |
00:18:41.420
so I can program the computation-to-bio interface?
link |
00:18:45.300
I mean, this is gonna happen.
link |
00:18:47.180
But like right now, if you are asking me
link |
00:18:50.020
to pipette 50 milliliters of solution, I'm out.
link |
00:18:54.460
This is so crude.
link |
00:18:55.460
Yeah.
link |
00:18:56.700
Okay, let's get all the crazy out of the way.
link |
00:18:59.980
So a bunch of people asked me,
link |
00:19:02.260
since we talked about the simulation last time,
link |
00:19:05.060
we talked about hacking the simulation.
link |
00:19:06.860
Do you have any updates, any insights
link |
00:19:09.900
about how we might be able to go about hacking simulation
link |
00:19:13.740
if we indeed do live in a simulation?
link |
00:19:17.180
I think a lot of people misinterpreted
link |
00:19:19.900
the point of that South by talk.
link |
00:19:22.420
The point of the South by talk
link |
00:19:23.500
was not literally to hack the simulation.
link |
00:19:26.720
I think that this idea is literally just,
link |
00:19:33.520
I think theoretical physics.
link |
00:19:34.580
I think that's the whole goal, right?
link |
00:19:39.900
You want your grand unified theory, but then, okay,
link |
00:19:42.300
build a grand unified theory search for exploits, right?
link |
00:19:45.140
I think we're nowhere near actually there yet.
link |
00:19:47.640
My hope with that was just more to like,
link |
00:19:51.500
are you people kidding me
link |
00:19:52.740
with the things you spend time thinking about?
link |
00:19:54.980
Do you understand like kind of how small you are?
link |
00:19:58.020
You are bytes in God's computer, really?
link |
00:20:02.540
And the things that people get worked up about, you know?
link |
00:20:06.700
So basically, it was more a message
link |
00:20:10.060
of we should humble ourselves.
link |
00:20:12.540
That we get to, like what are we humans in this byte code?
link |
00:20:19.460
Yeah, and not just humble ourselves,
link |
00:20:22.380
but like I'm not trying to like make people guilty
link |
00:20:24.900
or anything like that.
link |
00:20:25.740
I'm trying to say like, literally,
link |
00:20:27.260
look at what you are spending time on, right?
link |
00:20:30.200
What are you referring to?
link |
00:20:31.040
You're referring to the Kardashians?
link |
00:20:32.460
What are we talking about?
link |
00:20:34.140
Twitter?
link |
00:20:34.980
No, the Kardashians, everyone knows that's kind of fun.
link |
00:20:38.060
I'm referring more to like the economy, you know?
link |
00:20:42.980
This idea that we gotta up our stock price.
link |
00:20:50.720
Or what is the goal function of humanity?
link |
00:20:55.380
You don't like the game of capitalism?
link |
00:20:57.600
Like you don't like the games we've constructed
link |
00:20:59.340
for ourselves as humans?
link |
00:21:00.660
I'm a big fan of capitalism.
link |
00:21:02.860
I don't think that's really the game we're playing right now.
link |
00:21:05.120
I think we're playing a different game
link |
00:21:07.260
where the rules are rigged.
link |
00:21:10.300
Okay, which games are interesting to you
link |
00:21:12.580
that we humans have constructed and which aren't?
link |
00:21:14.780
Which are productive and which are not?
link |
00:21:18.380
Actually, maybe that's the real point of the talk.
link |
00:21:21.880
It's like, stop playing these fake human games.
link |
00:21:25.100
There's a real game here.
link |
00:21:26.780
We can play the real game.
link |
00:21:28.660
The real game is, you know, nature wrote the rules.
link |
00:21:31.260
This is a real game.
link |
00:21:32.540
There still is a game to play.
link |
00:21:35.180
But if you look at, sorry to interrupt,
link |
00:21:36.940
I don't know if you've seen the Instagram account,
link |
00:21:38.420
nature is metal.
link |
00:21:40.220
The game that nature seems to be playing
link |
00:21:42.820
is a lot more cruel than we humans want to put up with.
link |
00:21:47.340
Or at least we see it as cruel.
link |
00:21:49.520
It's like the bigger thing eats the smaller thing
link |
00:21:53.660
and does it to impress another big thing
link |
00:21:58.180
so it can mate with that thing.
link |
00:22:00.460
And that's it.
link |
00:22:01.300
That seems to be the entirety of it.
link |
00:22:04.040
Well, there's no art, there's no music,
link |
00:22:07.260
there's no Comma AI, there's no Comma one,
link |
00:22:10.860
no Comma two, no George Hotz with his brilliant talks
link |
00:22:14.940
at South by Southwest.
link |
00:22:17.020
I disagree, though.
link |
00:22:17.860
I disagree that this is what nature is.
link |
00:22:19.620
I think nature just provided basically an open world MMORPG.
link |
00:22:26.780
And, you know, here it's open world.
link |
00:22:29.860
I mean, if that's the game you want to play,
link |
00:22:31.100
you can play that game.
link |
00:22:32.260
But isn't that beautiful?
link |
00:22:33.780
I don't know if you played Diablo.
link |
00:22:35.580
They used to have, I think, a cow level where it's...
link |
00:22:39.420
So everybody will go just, they figured out this,
link |
00:22:44.420
like the best way to gain like experience points
link |
00:22:48.420
is to just slaughter cows over and over and over.
link |
00:22:52.180
And so they figured out this little sub game
link |
00:22:55.900
within the bigger game that this is the most efficient way
link |
00:22:58.800
to get experience points.
link |
00:22:59.900
And everybody somehow agreed
link |
00:23:01.860
that getting experience points in RPG context
link |
00:23:04.460
where you always want to be getting more stuff,
link |
00:23:06.500
more skills, more levels, keep advancing.
link |
00:23:09.140
That seems to be good.
link |
00:23:10.440
So might as well sacrifice actual enjoyment
link |
00:23:14.740
of playing a game, exploring a world,
link |
00:23:17.640
and spending like hundreds of hours of your time
link |
00:23:21.580
at cow level.
link |
00:23:22.420
I mean, the number of hours I spent in cow level,
link |
00:23:26.400
I'm not like the most impressive person
link |
00:23:28.140
because people have spent probably thousands of hours there,
link |
00:23:30.460
but it's ridiculous.
link |
00:23:31.580
So that's a little absurd game that brought me joy
link |
00:23:35.220
in some weird dopamine drug kind of way.
link |
00:23:37.500
So you don't like those games.
link |
00:23:40.060
You don't think that's us humans fleeing the nature?
link |
00:23:46.500
I think so.
link |
00:23:47.340
And that was the point of the talk.
link |
00:23:49.640
Yeah.
link |
00:23:50.480
So how do we hack it then?
link |
00:23:51.460
Well, I want to live forever.
link |
00:23:52.740
And I want to live forever.
link |
00:23:55.820
And this is the goal.
link |
00:23:56.780
Well, that's a game against nature.
link |
00:23:59.220
Yeah, immortality is the good objective function to you?
link |
00:24:03.540
I mean, start there and then you can do whatever else
link |
00:24:05.100
you want because you got a long time.
link |
00:24:07.380
What if immortality makes the game just totally not fun?
link |
00:24:10.740
I mean, like, why do you assume immortality
link |
00:24:13.860
is somehow a good objective function?
link |
00:24:18.160
It's not immortality that I want.
link |
00:24:19.940
A true immortality where I could not die,
link |
00:24:22.580
I would prefer what we have right now.
link |
00:24:25.020
But I want to choose my own death, of course.
link |
00:24:27.180
I don't want nature to decide when I die,
link |
00:24:29.840
I'm going to win.
link |
00:24:30.780
I'm going to beat it.
link |
00:24:33.100
And then at some point, if you choose to commit suicide,
link |
00:24:36.920
like how long do you think you'd live?
link |
00:24:41.700
Until I get bored.
link |
00:24:43.140
See, I don't think people like brilliant people like you
link |
00:24:48.060
that really ponder living a long time
link |
00:24:52.380
are really considering how meaningless life becomes.
link |
00:24:58.620
Well, I want to know everything and then I'm ready to die.
link |
00:25:03.620
As long as there's...
link |
00:25:04.460
Yeah, but why do you want,
link |
00:25:05.280
isn't it possible that you want to know everything
link |
00:25:06.940
because it's finite?
link |
00:25:09.700
Like the reason you want to know quote unquote everything
link |
00:25:12.220
is because you don't have enough time to know everything.
link |
00:25:16.380
And once you have unlimited time,
link |
00:25:18.780
then you realize like, why do anything?
link |
00:25:22.220
Like why learn anything?
link |
00:25:25.140
I want to know everything and then I'm ready to die.
link |
00:25:27.100
So you have, yeah.
link |
00:25:28.460
It's not a, like, it's a terminal value.
link |
00:25:30.940
It's not in service of anything else.
link |
00:25:34.740
I'm conscious of the possibility, this is not a certainty,
link |
00:25:37.900
but the possibility of that engine of curiosity
link |
00:25:41.800
that you're speaking to is actually
link |
00:25:47.100
a symptom of the finiteness of life.
link |
00:25:49.780
Like without that finiteness, your curiosity would vanish.
link |
00:25:55.060
Like a morning fog.
link |
00:25:57.000
All right, cool.
link |
00:25:57.840
Bukowski talked about love like that.
link |
00:25:59.300
Then let me solve immortality
link |
00:26:01.340
and let me change the thing in my brain
link |
00:26:02.900
that reminds me of the fact that I'm immortal,
link |
00:26:04.700
so it tells me that life is finite.
link |
00:26:06.260
Shit, maybe I'll have it tell me that life ends next week.
link |
00:26:09.060
Right?
link |
00:26:10.660
I'm okay with some self manipulation like that.
link |
00:26:12.660
I'm okay with deceiving myself.
link |
00:26:14.420
Oh, Rika, changing the code.
link |
00:26:17.020
Yeah, well, if that's the problem, right?
link |
00:26:18.300
If the problem is that I will no longer have that,
link |
00:26:20.580
that curiosity, I'd like to have backup copies of myself,
link |
00:26:24.460
which I check in with occasionally
link |
00:26:27.460
to make sure they're okay with the trajectory
link |
00:26:29.240
and they can kind of override it.
link |
00:26:31.000
Maybe a nice, like, I think of like those WaveNets,
link |
00:26:33.180
those like logarithmic go back to the copies.
link |
00:26:35.180
Yeah, but sometimes it's not reversible.
link |
00:26:36.700
Like I've done this with video games.
link |
00:26:39.980
Once you figure out the cheat code
link |
00:26:41.620
or like you look up how to cheat old school,
link |
00:26:43.940
like single player, it ruins the game for you.
link |
00:26:46.860
Absolutely.
link |
00:26:47.700
It ruins that feeling.
link |
00:26:48.540
But again, that just means our brain manipulation
link |
00:26:51.900
technology is not good enough yet.
link |
00:26:53.260
Remove that cheat code from your brain.
link |
00:26:54.700
Here you go.
link |
00:26:55.820
So it's also possible that if we figure out immortality,
link |
00:27:00.500
that all of us will kill ourselves
link |
00:27:03.460
before we advance far enough
link |
00:27:06.100
to be able to revert the change.
link |
00:27:08.820
I'm not killing myself till I know everything, so.
link |
00:27:11.900
That's what you say now, because your life is finite.
link |
00:27:15.020
You know, I think yes, self-modifying systems
link |
00:27:19.060
come up with all these hairy complexities
link |
00:27:21.020
and can I promise that I'll do it perfectly?
link |
00:27:23.020
No, but I think I can put good safety structures in place.
link |
00:27:27.180
So that talk and your thinking here
link |
00:27:29.740
is not literally referring to a simulation
link |
00:27:36.180
and that our universe is a kind of computer program
link |
00:27:40.520
running on a computer.
link |
00:27:42.240
That's more of a thought experiment.
link |
00:27:45.180
Do you also think of the potential of the sort of Bostrom,
link |
00:27:51.780
Elon Musk and others that talk about an actual program
link |
00:27:57.820
that simulates our universe?
link |
00:27:59.700
Oh, I don't doubt that we're in a simulation.
link |
00:28:01.980
I just think that it's not quite that important.
link |
00:28:05.300
I mean, I'm interested only in simulation theory
link |
00:28:06.940
as far as like it gives me power over nature.
link |
00:28:09.740
If it's totally unfalsifiable, then who cares?
link |
00:28:13.140
I mean, what do you think that experiment would look like?
link |
00:28:15.300
Like somebody on Twitter
link |
00:28:17.740
asks George what signs we would look for
link |
00:28:20.780
to know whether or not we're in the simulation,
link |
00:28:22.980
which is exactly what you're asking is like,
link |
00:28:25.880
the step that precedes the step of knowing
link |
00:28:29.500
how to get more power from this knowledge
link |
00:28:32.200
is to get an indication that there's some power to be gained.
link |
00:28:35.220
So get an indication that there,
link |
00:28:37.900
you can discover and exploit cracks in the simulation
link |
00:28:42.100
or it doesn't have to be in the physics of the universe.
link |
00:28:45.340
Yeah.
link |
00:28:46.720
Show me, I mean, like a memory leak could be cool.
link |
00:28:51.620
Like some scrying technology, you know?
link |
00:28:54.000
What kind of technology?
link |
00:28:55.260
Scrying?
link |
00:28:56.220
What's that?
link |
00:28:57.060
Oh, that's a weird,
link |
00:28:58.460
scrying is the paranormal ability to like remote viewing,
link |
00:29:03.460
like being able to see somewhere where you're not.
link |
00:29:08.220
So, you know, I don't think you can do it
link |
00:29:10.100
by chanting in a room,
link |
00:29:11.220
but if we could find, it's a memory leak, basically.
link |
00:29:16.300
It's a memory leak.
link |
00:29:17.300
Yeah, you're able to access parts you're not supposed to.
link |
00:29:19.980
Yeah, yeah, yeah.
link |
00:29:20.820
And thereby discover a shortcut.
link |
00:29:22.100
Yeah, maybe memory leak means the other thing as well,
link |
00:29:24.720
but I mean like, yeah,
link |
00:29:25.560
like an ability to read arbitrary memory, right?
link |
00:29:28.100
And that one's not that horrifying, right?
link |
00:29:29.820
The write ones start to be horrifying.
link |
00:29:31.420
Read, write.
link |
00:29:32.260
The reading is not the problem.
link |
00:29:34.900
Yeah, it's like Heartbleed for the universe.
link |
00:29:37.220
Oh boy, the writing is a big, big problem.
link |
00:29:40.740
It's a big problem.
link |
00:29:43.060
It's the moment you can write anything,
link |
00:29:44.620
even if it's just random noise.
link |
00:29:47.740
That's terrifying.
link |
00:29:49.180
I mean, even without that,
link |
00:29:51.560
like even some of the, you know,
link |
00:29:52.580
the nanotech stuff that's coming, I think is.
link |
00:29:57.140
I don't know if you're paying attention,
link |
00:29:58.340
but actually Eric Weinstein came out
link |
00:30:00.500
with the theory of everything.
link |
00:30:02.260
I mean, that came out.
link |
00:30:03.580
He's been working on a theory of everything
link |
00:30:05.460
in the physics world called geometric unity.
link |
00:30:08.060
And then for me, from a computer science person like you,
link |
00:30:11.700
Stephen Wolfram's theory of everything,
link |
00:30:14.220
of like hypergraphs is super interesting and beautiful,
link |
00:30:17.660
but not from a physics perspective,
link |
00:30:19.460
but from a computational perspective.
link |
00:30:20.940
I don't know, have you paid attention to any of that?
link |
00:30:23.020
So again, like what would make me pay attention
link |
00:30:26.440
and like, why I hate string theory is,
link |
00:30:29.540
okay, make a testable prediction, right?
link |
00:30:31.780
I'm only interested in,
link |
00:30:33.700
I'm not interested in theories for their intrinsic beauty.
link |
00:30:36.060
I'm interested in theories
link |
00:30:37.060
that give me power over the universe.
link |
00:30:39.900
So if these theories do, I'm very interested.
link |
00:30:43.220
Can I just say how beautiful that is?
link |
00:30:45.100
Because a lot of physicists say,
link |
00:30:47.140
I'm interested in experimental validation
link |
00:30:49.980
and they skip out the part where they say
link |
00:30:52.940
to give me more power in the universe.
link |
00:30:55.500
I just love the clarity of that.
link |
00:30:57.500
No, I want.
link |
00:30:59.780
I want 100 gigahertz processors.
link |
00:31:02.020
I want transistors that are smaller than atoms.
link |
00:31:04.120
I want like power.
link |
00:31:08.100
That's true.
link |
00:31:10.580
And that's where, from aliens
link |
00:31:12.460
to this kind of technology, people are worried
link |
00:31:14.420
that governments, like who owns that power?
link |
00:31:19.320
Is it George Hotz?
link |
00:31:20.780
Is it thousands of distributed hackers across the world?
link |
00:31:25.020
Is it governments?
link |
00:31:26.580
Is it Mark Zuckerberg?
link |
00:31:28.660
There's a lot of people that,
link |
00:31:32.500
I don't know if anyone trusts any one individual with power.
link |
00:31:35.540
So they're always worried.
link |
00:31:37.380
It's the beauty of blockchains.
link |
00:31:39.340
That's the beauty of blockchains, which we'll talk about.
link |
00:31:43.140
On Twitter, somebody pointed me to a story,
link |
00:31:46.220
a bunch of people pointed me to a story a few months ago
link |
00:31:49.300
where you went into a restaurant in New York.
link |
00:31:51.660
And you can correct me if any of this is wrong.
link |
00:31:53.660
And ran into a bunch of folks from
link |
00:31:56.580
a crypto company who are trying to scale up Ethereum.
link |
00:32:01.780
And they had a technical deadline
link |
00:32:03.300
related to the Solidity-to-OVM compiler.
link |
00:32:07.380
So these are all Ethereum technologies.
link |
00:32:09.660
So you stepped in, they recognized you,
link |
00:32:14.700
pulled you aside, explained their problem.
link |
00:32:16.220
And you stepped in and helped them solve the problem,
link |
00:32:19.540
thereby creating legend status story.
link |
00:32:23.100
So can you tell me the story in a little more detail?
link |
00:32:28.980
It seems kind of incredible.
link |
00:32:31.540
Did this happen?
link |
00:32:32.380
Yeah, yeah, it's a true story, it's a true story.
link |
00:32:34.060
I mean, they wrote a very flattering account of it.
link |
00:32:39.340
So the company is called Optimism,
link |
00:32:43.820
a spin-off of Plasma.
link |
00:32:45.420
They're trying to build L2 solutions on Ethereum.
link |
00:32:47.820
So right now, every Ethereum node
link |
00:32:52.620
has to run every transaction on the Ethereum network.
link |
00:32:56.420
And this kind of doesn't scale, right?
link |
00:32:58.540
Because if you have N computers,
link |
00:33:00.020
well, if that becomes two N computers,
link |
00:33:02.340
you actually still get the same amount of compute, right?
link |
00:33:05.420
This is like O(1) scaling,
link |
00:33:09.120
because they all have to run it.
link |
00:33:10.140
Okay, fine, you get more blockchain security,
link |
00:33:12.840
but like, blockchain is already so secure.
link |
00:33:15.380
Can we trade some of that off for speed?
link |
00:33:17.780
So that's kind of what these L2 solutions are.
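(A toy model of the scaling argument: with full replication, every node re-executes every transaction, so throughput is set by a single node no matter how many nodes join. Numbers are made up for illustration.)

```python
# Illustrative numbers only: replicated execution means the chain moves
# at the speed of one node, regardless of node count.
def l1_throughput(tx_per_sec_per_node, num_nodes):
    return tx_per_sec_per_node  # num_nodes buys security, not speed

print(l1_throughput(15, 1_000))   # 15 tx/s
print(l1_throughput(15, 10_000))  # still 15 tx/s
```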
link |
00:33:20.500
They built this thing, which is kind of a
link |
00:33:23.020
sandbox for Ethereum contracts.
link |
00:33:26.300
So they can run it in this L2 world
link |
00:33:28.200
and it can't do certain things in the L1 world.
link |
00:33:30.900
Can I ask you for some definitions?
link |
00:33:32.420
What's L2?
link |
00:33:33.420
Oh, L2 is layer two.
link |
00:33:34.840
So L1 is like the base Ethereum chain.
link |
00:33:37.180
And then layer two is like a computational layer
link |
00:33:40.920
that runs elsewhere,
link |
00:33:44.420
but still is kind of secured by layer one.
link |
00:33:47.680
And I'm sure a lot of people know,
link |
00:33:49.720
but Ethereum is a cryptocurrency,
link |
00:33:51.940
probably one of the most popular cryptocurrencies,
link |
00:33:53.720
second to Bitcoin.
link |
00:33:55.320
And a lot of interesting technological innovation there.
link |
00:33:58.800
Maybe you could also slip in whenever you talk about this
link |
00:34:03.240
and things that are exciting to you in the Ethereum space.
link |
00:34:06.380
And why Ethereum?
link |
00:34:07.800
Well, I mean, Bitcoin is not Turing complete.
link |
00:34:12.260
Ethereum is not technically Turing complete
link |
00:34:13.820
with the gas limit, but close enough.
link |
00:34:16.040
With the gas limit?
link |
00:34:16.880
What's the gas limit, resources?
link |
00:34:19.080
Yeah, I mean, no computer is actually Turing complete.
link |
00:34:21.180
Right.
link |
00:34:23.160
You're gonna run out of RAM, you know?
link |
00:34:24.720
You can actually solve the halting problem.
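(That's the standard observation that a machine with finite memory has finitely many possible states, so halting becomes decidable: run it and watch for a repeated state. A minimal sketch:)

```python
def halts(step, state):
    """Decide halting for any machine with a finite state space
    (e.g., bounded RAM). step returns the next state, or None to halt."""
    seen = set()
    while state is not None:
        if state in seen:
            return False        # a state repeated, so it loops forever
        seen.add(state)
        state = step(state)
    return True

# A countdown halts; a mod-8 counter loops.
print(halts(lambda s: None if s == 0 else s - 1, 5))  # True
print(halts(lambda s: (s + 1) % 8, 0))                # False
```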
link |
00:34:25.560
What's the word gas limit?
link |
00:34:26.720
You just have so many brilliant words.
link |
00:34:28.660
I'm not even gonna ask.
link |
00:34:29.500
That's not my word, that's Ethereum's word.
link |
00:34:32.220
Gas limit.
link |
00:34:33.060
In Ethereum, you have to spend gas per instruction.
link |
00:34:35.360
So like different op codes use different amounts of gas
link |
00:34:37.820
and you buy gas with ether to prevent people
link |
00:34:40.680
from basically DDoSing the network.
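(A toy version of that gas metering; the opcode costs below are illustrative, not the real EVM fee schedule.)

```python
GAS_COST = {"ADD": 3, "MUL": 5, "SSTORE": 20_000}  # made-up subset of opcodes

def execute(opcodes, gas_limit):
    gas = gas_limit
    for op in opcodes:
        gas -= GAS_COST[op]
        if gas < 0:
            raise RuntimeError("out of gas")  # runaway programs die here
    return gas  # leftover gas

print(execute(["ADD", "MUL", "SSTORE"], gas_limit=25_000))  # 4992 left
```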
link |
00:34:42.740
So Bitcoin is proof of work.
link |
00:34:45.440
And then what's Ethereum?
link |
00:34:47.120
It's also proof of work.
link |
00:34:48.360
They're working on some proof of stake,
link |
00:34:49.960
Ethereum 2.0 stuff.
link |
00:34:51.040
But right now it's proof of work.
link |
00:34:52.720
It uses a different hash function from Bitcoin.
link |
00:34:54.700
That's more ASIC resistant, because you need RAM.
link |
00:34:57.340
So we're all talking about Ethereum 1.0.
link |
00:34:59.960
So what were they trying to do to scale this whole process?
link |
00:35:03.840
So they were like, well, if we could run contracts elsewhere
link |
00:35:07.800
and then only save the results of that computation,
link |
00:35:13.120
well, we don't actually have to do the compute on the chain.
link |
00:35:14.680
We can do the compute off chain
link |
00:35:15.680
and just post what the results are.
link |
00:35:17.440
Now, the problem with that is,
link |
00:35:18.800
well, somebody could lie about what the results are.
link |
00:35:21.120
So you need a resolution mechanism.
link |
00:35:23.240
And the resolution mechanism can be really expensive
link |
00:35:26.500
because you just have to make sure
link |
00:35:29.000
that the person who is saying,
link |
00:35:31.140
look, I swear that this is the real computation.
link |
00:35:33.800
I'm staking $10,000 on that fact.
link |
00:35:36.640
And if you prove it wrong,
link |
00:35:39.000
yeah, it might cost you $3,000 in gas fees to prove it wrong,
link |
00:35:42.820
but you'll get the $10,000 bounty.
link |
00:35:44.800
So you can secure using those kinds of systems.
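(In outline, the mechanism being described, a sketch of the optimistic, fraud-proof style of L2; the numbers and names are illustrative.)

```python
def settle(claimed, compute, inputs, bond=10_000, gas_to_challenge=3_000):
    """Posted results carry a bond; anyone can re-execute and challenge.
    compute() is the expensive on-chain re-execution, run only on dispute."""
    if compute(inputs) == claimed:
        return "claim stands, bond returned"
    return f"fraud proven, challenger nets {bond - gas_to_challenge}"

print(settle(43, compute=sum, inputs=[40, 2]))  # fraud proven, challenger nets 7000
```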
link |
00:35:47.740
So it's effectively a sandbox, which runs contracts.
link |
00:35:52.500
And like, it's like any kind of normal sandbox,
link |
00:35:55.600
you have to like replace syscalls
link |
00:35:57.960
with calls into the hypervisor.
link |
00:36:02.920
Sandbox, syscalls, hypervisor.
link |
00:36:05.080
What do these things mean?
link |
00:36:06.680
As long as it's interesting to talk about.
link |
00:36:09.240
Yeah, I mean, you can take like the Chrome sandbox
link |
00:36:11.280
is maybe the one to think about, right?
link |
00:36:12.720
So the Chrome process that's doing a rendering,
link |
00:36:16.000
can't, for example, read a file from the file system.
link |
00:36:18.800
If it tries to make an open syscall in Linux,
link |
00:36:21.800
it can't make the open syscall,
link |
00:36:23.720
no, no, no.
link |
00:36:24.780
You have to request from the kind of hypervisor process
link |
00:36:29.160
or like, I don't know what it's called in Chrome,
link |
00:36:31.680
but the, hey, could you open this file for me?
link |
00:36:36.400
And then it does all these checks
link |
00:36:37.520
and then it passes the file handle back in
link |
00:36:39.200
if it's approved.
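(A sketch of that broker pattern, with a made-up policy; the sandboxed process never issues the real syscall itself.)

```python
ALLOWED_PREFIXES = ("/tmp/renderer/",)  # hypothetical policy

def broker_open(path, mode="r"):
    """Privileged side: check policy, then perform the real open()
    on behalf of the sandboxed process and pass the handle back."""
    if not any(path.startswith(p) for p in ALLOWED_PREFIXES):
        raise PermissionError(f"sandbox policy denies {path}")
    return open(path, mode)

# Sandboxed code calls broker_open(...) where it would have called open(...).
```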
link |
00:36:41.160
So that's, yeah.
link |
00:36:42.640
So what's the, in the context of Ethereum,
link |
00:36:45.320
what are the boundaries of the sandbox
link |
00:36:47.200
that we're talking about?
link |
00:36:48.400
Well, like one of the calls that you,
link |
00:36:50.800
actually reading and writing any state
link |
00:36:53.960
to the Ethereum contract,
link |
00:36:55.480
or to the Ethereum blockchain.
link |
00:36:58.720
Writing state is one of those calls
link |
00:37:01.060
that you're going to have to sandbox in layer two,
link |
00:37:04.500
because if you let layer two just arbitrarily write
link |
00:37:08.120
to the Ethereum blockchain.
link |
00:37:09.480
So layer two is really sitting on top of layer one.
link |
00:37:15.120
So you're going to have a lot of different kinds of ideas
link |
00:37:17.160
that you can play with.
link |
00:37:18.680
And they're all, they're not fundamentally changing
link |
00:37:21.360
the source code level of Ethereum.
link |
00:37:25.140
Well, you have to replace a bunch of calls
link |
00:37:28.900
with calls into the hypervisor.
link |
00:37:31.120
So instead of doing the syscall directly,
link |
00:37:33.840
you replace it with a call to the hypervisor.
link |
00:37:37.360
So originally they were doing this
link |
00:37:39.320
by first running the, so Solidity is the language
link |
00:37:43.520
that most Ethereum contracts are written in.
link |
00:37:45.440
It compiles to a bytecode.
link |
00:37:47.320
And then they wrote this thing they called the transpiler.
link |
00:37:50.040
And the transpiler took the bytecode
link |
00:37:52.440
and it transpiled it into OVM safe bytecode.
link |
00:37:56.000
Basically bytecode that didn't make any
link |
00:37:57.520
of those restricted syscalls
link |
00:37:58.800
and added the calls to the hypervisor.
link |
00:38:01.680
This transpiler was a 3000 line mess.
link |
00:38:05.640
And it's hard to do.
link |
00:38:07.100
It's hard to do if you're trying to do it like that,
link |
00:38:09.060
because you have to kind of like deconstruct the bytecode,
link |
00:38:12.200
change things about it, and then reconstruct it.
link |
00:38:15.360
And I mean, as soon as I hear this, I'm like,
link |
00:38:17.760
well, why don't you just change the compiler, right?
link |
00:38:20.440
Why not the first place you build the bytecode,
link |
00:38:22.340
just do it in the compiler.
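(The two approaches, reduced to a toy; the opcode names are invented, and the actual change was the roughly 300-line compiler diff mentioned below.)

```python
def transpile(bytecode):
    """After-the-fact rewriting: every substitution changes instruction
    offsets, so all jump targets must be remapped too, the messy part."""
    out, offset_map = [], {}
    for i, op in enumerate(bytecode):
        offset_map[i] = len(out)
        out += ["PUSH_HYPERVISOR", "CALL"] if op == "SSTORE" else [op]
    return out, offset_map  # offset_map must still be applied to every jump

def emit(op):
    """Compiler approach: generate the safe sequence in the first place,
    before any offsets exist that could break."""
    return ["PUSH_HYPERVISOR", "CALL"] if op == "SSTORE" else [op]
```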
link |
00:38:25.560
So yeah, I asked them how much they wanted it.
link |
00:38:29.160
Of course, measured in dollars and I'm like, well, okay.
link |
00:38:33.040
And yeah.
link |
00:38:34.600
And you wrote the compiler.
link |
00:38:35.920
Yeah, I modified, I wrote a 300 line diff to the compiler.
link |
00:38:39.040
It's open source, you can look at it.
link |
00:38:40.840
Yeah, I looked at the code last night.
link |
00:38:43.000
It's cute, yeah, exactly.
link |
00:38:46.680
Cute is a good word for it.
link |
00:38:49.440
And it's C++.
link |
00:38:52.000
C++, yeah.
link |
00:38:54.320
So when asked how you were able to do it,
link |
00:38:57.040
you said, you just gotta think and then do it right.
link |
00:39:03.080
So can you break that apart a little bit?
link |
00:39:04.680
What's your process of one, thinking and two, doing it right?
link |
00:39:09.920
You know, the people that I was working for
link |
00:39:12.440
were amused that I said that.
link |
00:39:13.480
It doesn't really mean anything.
link |
00:39:14.800
Okay.
link |
00:39:16.480
I mean, is there some deep, profound insights
link |
00:39:19.640
to draw from like how you problem solve from that?
link |
00:39:23.600
This is always what I say.
link |
00:39:24.640
I'm like, do you wanna be a good programmer?
link |
00:39:26.060
Do it for 20 years.
link |
00:39:27.840
Yeah, there's no shortcuts.
link |
00:39:29.600
No.
link |
00:39:31.360
What are your thoughts on crypto in general?
link |
00:39:33.440
So what parts technically or philosophically
link |
00:39:38.080
do you find especially beautiful maybe?
link |
00:39:40.040
Oh, I'm extremely bullish on crypto longterm.
link |
00:39:42.800
Not any specific crypto project, but this idea of,
link |
00:39:48.680
well, two ideas.
link |
00:39:50.320
One, the Nakamoto Consensus Algorithm
link |
00:39:54.240
is I think one of the greatest innovations
link |
00:39:57.320
of the 21st century.
link |
00:39:58.600
This idea that people can reach consensus.
link |
00:40:01.260
You can reach a group consensus.
link |
00:40:03.400
Using a relatively straightforward algorithm is wild.
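(The skeleton of that algorithm is small enough to sketch: a hash puzzle proves work was spent, and the fork-choice rule is just "most work wins". Difficulty and structure here are toy values.)

```python
import hashlib

def mine(prev_hash, payload, difficulty=4):
    """Grind nonces until the block hash meets the target:
    an unfakeable receipt that work was done."""
    nonce = 0
    while True:
        h = hashlib.sha256(f"{prev_hash}{payload}{nonce}".encode()).hexdigest()
        if h.startswith("0" * difficulty):
            return nonce, h
        nonce += 1

def fork_choice(chain_a, chain_b):
    # The consensus rule itself: adopt the chain with the most work.
    return chain_a if len(chain_a) >= len(chain_b) else chain_b
```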
link |
00:40:08.220
And like, you know, Satoshi Nakamoto,
link |
00:40:14.120
people always ask me who I look up to.
link |
00:40:15.580
It's like, whoever that is.
link |
00:40:17.800
Who do you think it is?
link |
00:40:19.120
I mean, Elon Musk?
link |
00:40:21.020
Is it you?
link |
00:40:22.320
It is definitely not me.
link |
00:40:24.040
And I do not think it's Elon Musk.
link |
00:40:26.060
But yeah, this idea of groups reaching consensus
link |
00:40:31.060
in a decentralized yet formulaic way
link |
00:40:34.940
is one extremely powerful idea from crypto.
link |
00:40:40.540
Maybe the second idea is this idea of smart contracts.
link |
00:40:45.860
When you write a contract between two parties,
link |
00:40:49.100
any contract, this contract, if there are disputes,
link |
00:40:53.720
it's interpreted by lawyers.
link |
00:40:56.140
Lawyers are just really shitty overpaid interpreters.
link |
00:41:00.020
Imagine you had, let's talk about them in terms of a,
link |
00:41:02.060
in terms of like, let's compare a lawyer to Python, right?
link |
00:41:05.260
So lawyer, well, okay.
link |
00:41:07.140
That's really, I never thought of it that way.
link |
00:41:10.020
It's hilarious.
link |
00:41:11.620
So Python, I'm paying even 10 cents an hour.
link |
00:41:15.900
I'll use the nice Azure machine.
link |
00:41:17.180
I can run Python for 10 cents an hour.
link |
00:41:19.660
Lawyers cost $1,000 an hour.
link |
00:41:21.480
So Python is 10,000 X better on that axis.
link |
00:41:25.900
Lawyers don't always return the same answer.
link |
00:41:31.100
Python almost always does.
link |
00:41:36.700
Cost.
link |
00:41:37.740
Yeah, I mean, just cost, reliability,
link |
00:41:40.940
everything about Python is so much better than lawyers.
link |
00:41:43.860
So if you can make smart contracts,
link |
00:41:46.100
this whole concept of code is law.
link |
00:41:50.400
I love, and I would love to live in a world
link |
00:41:53.180
where everybody accepted that fact.
link |
00:41:55.620
So maybe you can talk about what smart contracts are.
link |
00:42:01.180
So let's say, let's say, you know,
link |
00:42:05.020
we have a, even something as simple
link |
00:42:08.700
as a safety deposit box, right?
link |
00:42:11.700
Safety deposit box that holds a million dollars.
link |
00:42:14.780
I have a contract with the bank that says
link |
00:42:17.040
two out of these three parties must be present
link |
00:42:22.620
to open the safety deposit box and get the money out.
link |
00:42:25.280
So that's a contract for the bank,
link |
00:42:26.300
and it's only as good as the bank and the lawyers, right?
link |
00:42:29.020
Let's say, you know, somebody dies and now,
link |
00:42:32.820
oh, we're gonna go through a big legal dispute
link |
00:42:34.580
about whether, oh, well, was it in the will,
link |
00:42:36.140
was it not in the will?
link |
00:42:37.540
What, like, it's just so messy,
link |
00:42:39.780
and the cost to determine truth is so expensive.
link |
00:42:44.620
Versus a smart contract, which just uses cryptography
link |
00:42:47.220
to check if two out of three keys are present.
link |
00:42:50.100
Well, I can look at that, and I can have certainty
link |
00:42:53.860
in the answer that it's going to return.
link |
00:42:55.980
And that's what all businesses want, is certainty.
link |
00:42:58.220
You know, they say businesses don't care.
link |
00:42:59.620
Viacom v. YouTube, YouTube's like,
link |
00:43:02.020
look, we don't care which way this lawsuit goes.
link |
00:43:04.100
Just please tell us so we can have certainty.
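A minimal sketch of that two-out-of-three release logic in Python. HMAC with shared secrets stands in for the public-key signatures a real smart contract would check on-chain, and all the names and amounts here are hypothetical:

import hashlib
import hmac

KEYS = {"alice": b"k1", "bob": b"k2", "bank": b"k3"}  # hypothetical parties

def sign(name: str, message: bytes) -> bytes:
    return hmac.new(KEYS[name], message, hashlib.sha256).digest()

def release_funds(message: bytes, signatures: dict) -> bool:
    # Count how many registered parties produced a valid signature.
    valid = sum(
        hmac.compare_digest(sig, sign(name, message))
        for name, sig in signatures.items() if name in KEYS
    )
    return valid >= 2  # two of three keys present: deterministic, no lawyers

msg = b"withdraw 1000000 to account 42"
sigs = {"alice": sign("alice", msg), "bank": sign("bank", msg)}
print(release_funds(msg, sigs))  # True: the box opens, with certainty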
link |
00:43:07.220
Yeah, I wonder how many agreements in this,
link |
00:43:09.100
because we're talking about financial transactions only
link |
00:43:12.720
in this case, correct, the smart contracts?
link |
00:43:15.620
Oh, you can go to anything.
link |
00:43:17.380
You can put a prenup in the Ethereum blockchain.
link |
00:43:21.820
A marriage smart contract?
link |
00:43:23.020
Sorry, divorce lawyer, sorry.
link |
00:43:24.900
You're going to be replaced by Python.
link |
00:43:29.580
Okay, so that's another beautiful idea.
link |
00:43:34.980
Do you think there's something that's appealing to you
link |
00:43:37.700
about any one specific implementation?
link |
00:43:40.260
So if you look 10, 20, 50 years down the line,
link |
00:43:45.100
do you see any, like, Bitcoin, Ethereum,
link |
00:43:48.140
any of the other hundreds of cryptocurrencies winning out?
link |
00:43:51.140
Is there, like, what's your intuition about the space?
link |
00:43:53.340
Or are you just sitting back and watching the chaos
link |
00:43:55.380
and, like, who cares what emerges?
link |
00:43:57.380
Oh, I don't.
link |
00:43:58.220
I don't speculate.
link |
00:43:59.060
I don't really care.
link |
00:43:59.900
I don't really care which one of these projects wins.
link |
00:44:02.620
I'm kind of in the Bitcoin as a meme coin camp.
link |
00:44:05.500
I mean, why does Bitcoin have value?
link |
00:44:07.000
It's technically kind of, you know,
link |
00:44:11.740
not great, like the block size debate.
link |
00:44:14.380
When I found out what the block size debate was,
link |
00:44:16.140
I'm like, are you guys kidding?
link |
00:44:18.020
What's the block size debate?
link |
00:44:21.180
You know what?
link |
00:44:22.020
It's really, it's too stupid to even talk about.
link |
00:44:23.820
People can look it up, but I'm like, wow.
link |
00:44:27.140
You know, Ethereum seems,
link |
00:44:28.420
the governance of Ethereum seems much better.
link |
00:44:31.300
I've come around a bit on proof of stake ideas.
link |
00:44:35.360
You know, very smart people thinking about some things.
link |
00:44:37.740
Yeah, you know, governance is interesting.
link |
00:44:40.340
It does feel like Vitalik,
link |
00:44:44.100
like it does feel like an open,
link |
00:44:46.100
even in these distributed systems,
link |
00:44:48.020
leaders are helpful
link |
00:44:51.220
because they kind of help you drive the mission
link |
00:44:54.580
and the vision and they put a face to a project.
link |
00:44:58.420
It's a weird thing about us humans.
link |
00:45:00.140
Geniuses are helpful, like Vitalik.
link |
00:45:02.820
Yeah, brilliant.
link |
00:45:06.540
Leaders are not necessarily, yeah.
link |
00:45:10.320
So you think the reason he's the face of Ethereum
link |
00:45:15.340
is because he's a genius.
link |
00:45:17.100
That's interesting.
link |
00:45:18.060
I mean, that was,
link |
00:45:21.460
it's interesting to think about
link |
00:45:22.940
that we need to create systems
link |
00:45:25.860
in which the quote unquote leaders that emerge
link |
00:45:30.340
are the geniuses in the system.
link |
00:45:33.060
I mean, that's arguably why
link |
00:45:35.000
the current state of democracy is broken
link |
00:45:36.980
is the people who are emerging as the leaders
link |
00:45:39.420
are not the most competent,
link |
00:45:40.980
are not the superstars of the system.
link |
00:45:43.340
And it seems like at least for now
link |
00:45:45.020
in the crypto world oftentimes
link |
00:45:47.220
the leaders are the superstars.
link |
00:45:49.380
Imagine at the debate they asked,
link |
00:45:51.720
what's the sixth amendment?
link |
00:45:53.640
What are the four fundamental forces in the universe?
link |
00:45:56.300
What's the integral of two to the X?
link |
00:45:59.940
I'd love to see those questions asked
link |
00:46:01.740
and that's what I want as our leader.
link |
00:46:03.380
It's a little bit.
link |
00:46:04.220
What's Bayes rule?
link |
00:46:07.160
Yeah, I mean, even, oh wow, you're hurting my brain.
link |
00:46:10.900
I mean, my standard was even lower,
link |
00:46:15.020
but I would have loved to see
link |
00:46:17.660
just this basic brilliance.
link |
00:46:20.600
Like I've talked to historians.
link |
00:46:22.180
There's just these people, they're not even, like,
link |
00:46:23.900
they don't have a PhD or even an education in history.
link |
00:46:26.720
They're just like a Dan Carlin type character
link |
00:46:30.040
who just like, holy shit.
link |
00:46:32.780
How did all this information get into your head?
link |
00:46:35.420
They're able to just connect Genghis Khan
link |
00:46:38.460
to the entirety of the history of the 20th century.
link |
00:46:41.820
They know everything about every single battle that happened
link |
00:46:46.180
and they know the Game of Thrones
link |
00:46:51.020
of the different power plays and all that happened there.
link |
00:46:55.100
And they know like the individuals
link |
00:46:56.700
and all the documents involved
link |
00:46:58.580
and they integrate that into their regular life.
link |
00:47:02.060
It's not like they're ultra history nerds.
link |
00:47:03.980
They're just, they know this information.
link |
00:47:06.380
That's what competence looks like.
link |
00:47:08.020
Yeah.
link |
00:47:09.060
Cause I've seen that with programmers too, right?
link |
00:47:10.660
That's what great programmers do.
link |
00:47:12.580
But yeah, it would be, it's really unfortunate
link |
00:47:15.780
that those kinds of people aren't emerging as our leaders.
link |
00:47:19.340
But for now, at least in the crypto world
link |
00:47:21.820
that seems to be the case.
link |
00:47:23.260
I don't know if that always, you could imagine
link |
00:47:26.700
that in a hundred years, it's not the case, right?
link |
00:47:28.820
Crypto world has one very powerful idea going for it
link |
00:47:31.940
and that's the idea of forks, right?
link |
00:47:35.020
I mean, imagine, we'll use a less controversial example.
link |
00:47:42.940
This was actually in my joke app in 2012.
link |
00:47:47.140
I was like, Barack Obama, Mitt Romney,
link |
00:47:49.460
let's let them both be president, right?
link |
00:47:51.060
Like imagine we could fork America
link |
00:47:52.940
and just let them both be president.
link |
00:47:54.500
And then the Americas could compete
link |
00:47:56.060
and people could invest in one,
link |
00:47:58.180
pull their liquidity out of one, put it in the other.
link |
00:48:00.500
You have this in the crypto world.
link |
00:48:02.420
Ethereum forks into Ethereum and Ethereum classic.
link |
00:48:05.540
And you can pull your liquidity out of one
link |
00:48:07.340
and put it in another.
link |
00:48:08.980
And people vote with their dollars,
link |
00:48:11.980
on which fork. Companies should be able to fork.
link |
00:48:16.420
I'd love to fork Nvidia, you know?
link |
00:48:20.260
Yeah, like different business strategies
link |
00:48:22.780
and then try them out and see what works.
link |
00:48:26.180
Like even take, yeah, take comma AI that closes its source
link |
00:48:34.780
and then take one that's open source and see what works.
link |
00:48:38.300
Take one that's purchased by GM
link |
00:48:41.060
and one that remains an Android renegade
link |
00:48:43.540
and all these different versions and see.
link |
00:48:45.300
The beauty of comma AI is someone can actually do that.
link |
00:48:47.900
Please take comma AI and fork it.
link |
00:48:50.620
That's right, that's the beauty of open source.
link |
00:48:53.020
So you're, I mean, we'll talk about autonomous vehicle space,
link |
00:48:56.420
but it does seem that you're really knowledgeable
link |
00:49:02.100
about a lot of different topics.
link |
00:49:03.940
So the natural question a bunch of people ask this,
link |
00:49:06.100
which is how do you keep learning new things?
link |
00:49:09.980
Do you have like practical advice
link |
00:49:12.420
if you were to introspect, like taking notes,
link |
00:49:15.820
allocate time, or do you just mess around
link |
00:49:19.220
and just allow your curiosity to drive?
link |
00:49:21.340
I'll write these people a self help book
link |
00:49:23.060
and I'll charge $67 for it.
link |
00:49:25.060
And I will write, I will write,
link |
00:49:28.500
I will write on the cover of the self help book.
link |
00:49:30.300
All of this advice is completely meaningless.
link |
00:49:32.500
You're gonna be a sucker and buy this book anyway.
link |
00:49:34.740
And the one lesson that I hope they take away from the book
link |
00:49:38.860
is that I can't give you a meaningful answer to that.
link |
00:49:42.580
That's interesting.
link |
00:49:44.020
Let me translate that.
link |
00:49:45.420
It's that you haven't really thought about what it is you do
link |
00:49:51.580
systematically because you could reduce it.
link |
00:49:53.900
And there's some people, I mean, I've met brilliant people
link |
00:49:56.980
that this is really clear with athletes.
link |
00:50:00.180
Some are just, you know, the best in the world
link |
00:50:03.580
at something and they have zero interest
link |
00:50:06.820
in writing like a self help book,
link |
00:50:09.500
or how to master this game.
link |
00:50:11.620
And then there's some athletes who become great coaches
link |
00:50:15.620
and they love the analysis, perhaps the over analysis.
link |
00:50:18.820
And you right now, at least at your age,
link |
00:50:20.740
which is an interesting, you're in the middle of the battle.
link |
00:50:23.180
You're like the warriors that have zero interest
link |
00:50:25.620
in writing books.
link |
00:50:27.820
So you're in the middle of the battle.
link |
00:50:29.140
So you have, yeah.
link |
00:50:30.580
This is a fair point.
link |
00:50:31.740
I do think I have a certain aversion
link |
00:50:34.060
to this kind of deliberate intentional way of living life.
link |
00:50:39.060
Eventually, the hilarity of this,
link |
00:50:41.820
especially since this is recorded,
link |
00:50:43.620
it will reveal beautifully the absurdity
link |
00:50:47.620
when you finally do publish this book.
link |
00:50:49.620
I guarantee you, you will.
link |
00:50:51.540
The story of comma AI, maybe it'll be a biography
link |
00:50:56.060
written about you.
link |
00:50:57.060
That'll be better, I guess.
link |
00:50:58.740
And you might be able to learn some cute lessons
link |
00:51:00.740
if you're starting a company like comma AI from that book.
link |
00:51:03.700
But if you're asking generic questions,
link |
00:51:05.660
like how do I be good at things?
link |
00:51:07.740
How do I be good at things?
link |
00:51:10.260
Dude, I don't know.
link |
00:51:11.900
Do them a lot.
link |
00:51:14.380
Do them a lot.
link |
00:51:15.220
But the interesting thing here is learning things
link |
00:51:18.580
outside of your current trajectory,
link |
00:51:22.020
which is what it feels like from an outsider's perspective.
link |
00:51:28.140
I don't know if there's advice on that,
link |
00:51:30.900
but it is an interesting curiosity.
link |
00:51:33.340
When you become really busy, you're running a company.
link |
00:51:38.020
Hard time.
link |
00:51:40.340
Yeah.
link |
00:51:41.580
But there's a natural inclination and trend.
link |
00:51:46.100
Just the momentum of life carries you
link |
00:51:48.980
into a particular direction of wanting to focus.
link |
00:51:51.060
And this kind of dispersion that curiosity can lead to
link |
00:51:55.140
gets harder and harder with time.
link |
00:51:58.220
Because you get really good at certain things
link |
00:52:00.940
and it sucks trying things that you're not good at,
link |
00:52:03.820
like trying to figure them out.
link |
00:52:05.340
When you do this with your live streams,
link |
00:52:07.380
you're on the fly figuring stuff out.
link |
00:52:10.060
You don't mind looking dumb.
link |
00:52:11.780
No.
link |
00:52:14.140
You just figure it out pretty quickly.
link |
00:52:16.660
Sometimes I try things and I don't figure them out quickly.
link |
00:52:19.180
My chess rating is like a 1400,
link |
00:52:20.980
despite putting like a couple of hundred hours in.
link |
00:52:23.300
It's pathetic.
link |
00:52:24.340
I mean, to be fair, I know that I could do it better
link |
00:52:26.860
if I did it better.
link |
00:52:27.700
Like don't play five minute games,
link |
00:52:29.860
play 15 minute games at least.
link |
00:52:31.380
Like I know these things, but it just doesn't,
link |
00:52:34.260
it doesn't stick nicely in my knowledge stream.
link |
00:52:37.260
All right, let's talk about Comma AI.
link |
00:52:39.300
What's the mission of the company?
link |
00:52:42.140
Let's like look at the biggest picture.
link |
00:52:44.860
Oh, I have an exact statement.
link |
00:52:46.820
Solve self driving cars
link |
00:52:48.660
while delivering shippable intermediaries.
link |
00:52:51.540
So longterm vision is have fully autonomous vehicles
link |
00:52:56.300
and make sure you're making money along the way.
link |
00:52:59.060
I think it doesn't really speak to money,
link |
00:53:00.220
but I can talk about what solve self driving cars means.
link |
00:53:03.260
Solve self driving cars of course means
link |
00:53:06.700
you're not building a new car,
link |
00:53:08.020
you're building a person replacement.
link |
00:53:10.460
That person can sit in the driver's seat
link |
00:53:12.340
and drive you anywhere a person can drive
link |
00:53:14.580
with a human or better level of safety,
link |
00:53:17.980
speed, quality, comfort.
link |
00:53:21.420
And what's the second part of that?
link |
00:53:23.260
Delivering shippable intermediaries is well,
link |
00:53:26.620
it's a way to fund the company, that's true.
link |
00:53:28.180
But it's also a way to keep us honest.
link |
00:53:31.980
If you don't have that, it is very easy
link |
00:53:34.900
with this technology to think you're making progress
link |
00:53:39.020
when you're not.
link |
00:53:40.180
I've heard it best described on Hacker News as
link |
00:53:43.420
you can set any arbitrary milestone,
link |
00:53:46.660
meet that milestone and still be infinitely far away
link |
00:53:49.500
from solving self driving cars.
link |
00:53:51.740
So it's hard to have like real deadlines
link |
00:53:53.820
when you're like Cruise or Waymo, when you don't have revenue.
link |
00:54:02.100
Is that, I mean, is revenue essentially
link |
00:54:06.060
the thing we're talking about here?
link |
00:54:07.820
Revenue is, capitalism is based around consent.
link |
00:54:11.460
Capitalism, the way that you get revenue
link |
00:54:13.100
is, well, I come from the real capitalism camp.
link |
00:54:16.340
There's definitely scams out there,
link |
00:54:17.340
but real capitalism is based around consent.
link |
00:54:19.580
It's based around this idea that like,
link |
00:54:20.740
if we're getting revenue, it's because we're providing
link |
00:54:22.980
at least that much value to another person.
link |
00:54:24.780
When someone buys $1,000 comma two from us,
link |
00:54:27.460
we're providing them at least $1,000 of value
link |
00:54:29.100
or they wouldn't buy it.
link |
00:54:30.180
Brilliant, so can you give a whirlwind overview
link |
00:54:32.940
of the products that Comma AI provides,
link |
00:54:34.940
like throughout its history and today?
link |
00:54:38.180
I mean, yeah, the past ones aren't really that interesting.
link |
00:54:40.780
It's kind of just been refinement of the same idea.
link |
00:54:45.180
The real only product we sell today is the Comma 2.
link |
00:54:48.300
Which is a piece of hardware with cameras.
link |
00:54:50.500
Mm, so the Comma 2, I mean, you can think about it
link |
00:54:54.700
kind of like a person.
link |
00:54:57.100
Future hardware will probably be
link |
00:54:58.340
even more and more person-like.
link |
00:55:00.300
So it has eyes, ears, a mouth, a brain,
link |
00:55:07.460
and a way to interface with the car.
link |
00:55:09.540
Does it have consciousness?
link |
00:55:10.980
Just kidding, that was a trick question.
link |
00:55:13.500
I don't have consciousness either.
link |
00:55:15.140
Me and the Comma 2 are the same.
link |
00:55:16.420
You're the same?
link |
00:55:17.260
I have a little more compute than it.
link |
00:55:18.660
It only has like the same compute as a bee, you know.
link |
00:55:23.260
You're more efficient energy wise
link |
00:55:25.300
for the compute you're doing.
link |
00:55:26.460
Far more efficient energy wise.
link |
00:55:29.020
20 petaflops, 20 watts, crazy.
link |
00:55:30.580
Do you lack consciousness?
link |
00:55:32.220
Sure.
link |
00:55:33.060
Do you fear death?
link |
00:55:33.900
You do, you want immortality.
link |
00:55:35.500
Of course I fear death.
link |
00:55:36.340
Does Comma AI fear death?
link |
00:55:38.100
I don't think so.
link |
00:55:39.580
Of course it does.
link |
00:55:40.500
It very much fears, well, it fears negative loss.
link |
00:55:42.980
Oh yeah.
link |
00:55:43.820
Okay, so Comma 2, when did that come out?
link |
00:55:49.220
That was a year ago?
link |
00:55:50.500
No, two.
link |
00:55:52.100
Early this year.
link |
00:55:53.540
Wow, time, it feels like, yeah.
link |
00:55:56.900
2020 feels like it's taken 10 years to get to the end.
link |
00:56:00.820
It's a long year.
link |
00:56:01.820
It's a long year.
link |
00:56:03.140
So what's the sexiest thing about Comma 2 feature wise?
link |
00:56:08.140
So, I mean, maybe you can also linger on, like, what is it?
link |
00:56:14.340
Like what's its purpose?
link |
00:56:15.740
Cause there's a hardware, there's a software component.
link |
00:56:18.540
You've mentioned the sensors,
link |
00:56:20.020
but also like what is it, its features and capabilities?
link |
00:56:23.060
I think our slogan summarizes it well.
link |
00:56:25.340
Comma slogan is make driving chill.
link |
00:56:28.860
I love it, okay.
link |
00:56:30.660
Yeah, I mean, it is, you know,
link |
00:56:33.020
if you like cruise control, imagine cruise control,
link |
00:56:35.340
but much, much more.
link |
00:56:36.660
So it can do adaptive cruise control things,
link |
00:56:41.020
which is like slow down for cars in front of it,
link |
00:56:42.980
maintain a certain speed.
link |
00:56:44.220
And it can also do lane keeping.
link |
00:56:46.260
So staying in the lane and doing it better
link |
00:56:48.700
and better and better over time.
link |
00:56:50.020
It's very much machine learning based.
link |
00:56:53.060
So this camera is, there's a driver facing camera too.
link |
00:57:01.100
What else is there?
link |
00:57:02.100
What am I thinking?
link |
00:57:02.940
So the hardware versus software.
link |
00:57:04.380
So open pilot versus the actual hardware of the device.
link |
00:57:09.020
What's, can you draw that distinction?
link |
00:57:10.580
What's one, what's the other?
link |
00:57:11.580
I mean, the hardware is pretty much a cell phone
link |
00:57:13.820
with a few additions.
link |
00:57:14.700
A cell phone with a cooling system
link |
00:57:16.940
and with a car interface connected to it.
link |
00:57:20.980
And by cell phone, you mean like Qualcomm Snapdragon.
link |
00:57:25.620
Yeah, the current hardware is a Snapdragon 821.
link |
00:57:29.380
It has a Wi-Fi radio, it has an LTE radio, it has a screen.
link |
00:57:32.180
We use every part of the cell phone.
link |
00:57:35.980
And then the interface with the car
link |
00:57:37.540
is specific to the car.
link |
00:57:38.460
So you keep supporting more and more cars.
link |
00:57:41.460
Yeah, so the interface to the car,
link |
00:57:42.660
I mean, the device itself just has four CAN buses.
link |
00:57:45.340
It has four CAN interfaces on it
link |
00:57:46.860
that are connected through the USB port to the phone.
link |
00:57:49.420
And then, yeah, on those four CAN buses,
link |
00:57:53.300
you connect it to the car.
link |
00:57:54.460
And there's a little harness to do this.
link |
00:57:56.460
Cars are actually surprisingly similar.
link |
00:57:58.460
So CAN is the protocol by which cars communicate.
link |
00:58:01.500
And then you're able to read stuff and write stuff
link |
00:58:04.260
to be able to control the car depending on the car.
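For a sense of what talking to those buses looks like, here is a minimal sketch using the python-can library, assuming a Linux SocketCAN interface named can0. The message IDs are invented; real IDs differ per car and live in per-car message definitions:

import can  # pip install python-can

bus = can.interface.Bus(channel="can0", bustype="socketcan")

# Read one frame off the bus (e.g., steering angle or wheel speed).
msg = bus.recv(timeout=1.0)
if msg is not None:
    print(hex(msg.arbitration_id), msg.data.hex())

# Write a frame (hypothetical ID and payload; never send arbitrary
# frames on a real car).
cmd = can.Message(arbitration_id=0x2E4, data=[0, 0, 0, 0],
                  is_extended_id=False)
bus.send(cmd)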
link |
00:58:06.900
So what's the software side?
link |
00:58:08.260
What's OpenPilot?
link |
00:58:10.380
So I mean, OpenPilot is,
link |
00:58:11.740
the hardware is pretty simple compared to OpenPilot.
link |
00:58:13.740
OpenPilot is, well, so you have a machine learning model,
link |
00:58:21.020
which, in OpenPilot, is a blob.
link |
00:58:24.540
It's just a blob of weights.
link |
00:58:25.620
It's not like people are like, oh, it's closed source.
link |
00:58:27.420
I'm like, it's a blob of weights.
link |
00:58:28.860
What do you expect?
link |
00:58:29.780
So it's primarily neural network based.
link |
00:58:33.380
You, well, OpenPilot is all the software
link |
00:58:36.020
kind of around that neural network.
link |
00:58:37.660
That if you have a neural network that says,
link |
00:58:39.060
here's where you wanna send the car,
link |
00:58:40.860
OpenPilot actually goes and executes all of that.
link |
00:58:44.780
It cleans up the input to the neural network.
link |
00:58:46.900
It cleans up the output and executes on it.
link |
00:58:49.060
So it connects, it's the glue
link |
00:58:50.860
that connects everything together.
link |
00:58:51.860
Runs the sensors, does a bunch of calibration
link |
00:58:54.420
for the neural network, deals with like,
link |
00:58:58.100
if the car is on a banked road,
link |
00:59:00.140
you have to counter steer against that.
link |
00:59:02.060
And the neural network can't necessarily know that
link |
00:59:03.820
by looking at the picture.
link |
00:59:06.420
So you do that with other sensors
link |
00:59:08.100
and fusion and a localizer.
link |
00:59:09.900
OpenPilot also is responsible
link |
00:59:11.780
for sending the data up to our servers.
link |
00:59:14.780
So we can learn from it, logging it, recording it,
link |
00:59:17.940
running the cameras, thermally managing the device,
link |
00:59:21.500
managing the disk space on the device,
link |
00:59:23.180
managing all the resources on the device.
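A toy sketch of that glue loop in Python. Every component here is a stub invented for illustration, not openpilot's actual code, but the shape is the one described: sensors in, model out, cleanup, actuation, logging:

import random

class StubCamera:
    def get_frame(self):
        return [random.random() for _ in range(8)]  # stand-in for pixels

class StubIMU:
    def roll(self):
        return 0.02  # radians of road bank the camera alone can't see

def model(inputs):
    # Stand-in for the neural network blob: a "desired lateral offset".
    return sum(inputs) / len(inputs) - 0.5

def plan(desired_offset, bank_angle):
    steer = 0.5 * desired_offset - 1.0 * bank_angle  # counter-steer the bank
    accel = 0.0
    return steer, accel

camera, imu = StubCamera(), StubIMU()
for _ in range(3):  # the real loop runs at camera frame rate
    offset = model(camera.get_frame())
    steer, accel = plan(offset, imu.roll())
    print(f"steer={steer:+.3f} accel={accel:+.1f}")  # would go out over CAN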
link |
00:59:24.780
So what, since we last spoke,
link |
00:59:26.860
I don't remember when, maybe a year ago,
link |
00:59:28.460
maybe a little bit longer,
link |
00:59:30.180
how has OpenPilot improved?
link |
00:59:33.100
We did exactly what I promised you.
link |
00:59:34.860
I promised you that by the end of the year,
link |
00:59:36.760
you'd be able to remove the lanes.
link |
00:59:40.500
The lateral policy is now almost completely end to end.
link |
00:59:46.060
You can turn the lanes off and it will
link |
00:59:48.600
drive slightly worse on the highway
link |
00:59:49.980
if you turn the lanes off,
link |
00:59:51.060
but you can turn the lanes off and it will drive well,
link |
00:59:54.260
trained completely end to end on user data.
link |
00:59:57.220
And this year we hope to do the same
link |
00:59:58.700
for the longitudinal policy.
link |
01:00:00.140
So that's the interesting thing is you're not doing,
link |
01:00:03.380
you don't appear to be, maybe you can correct me,
link |
01:00:05.380
you don't appear to be doing lane detection
link |
01:00:08.700
or lane marking detection or kind of the segmentation task
link |
01:00:12.420
or any kind of object detection task.
link |
01:00:15.100
You're doing what's traditionally more called
link |
01:00:17.580
like end to end learning.
link |
01:00:19.420
So, and trained on actual behavior of drivers
link |
01:00:24.180
when they're driving the car manually.
link |
01:00:27.780
And this is hard to do.
link |
01:00:29.820
It's not supervised learning.
link |
01:00:32.220
Yeah, but so the nice thing is there's a lot of data.
link |
01:00:34.740
So it's hard and easy, right?
link |
01:00:37.060
It's a...
link |
01:00:37.900
We have a lot of high quality data, yeah.
link |
01:00:40.020
Like more than you need in the second.
link |
01:00:41.700
Well...
link |
01:00:42.700
We have way more than we need.
link |
01:00:43.520
We have way more data than we need.
link |
01:00:44.980
I mean, it's an interesting question actually,
link |
01:00:47.040
because in terms of amount, you have more than you need,
link |
01:00:50.440
but the driving is full of edge cases.
link |
01:00:54.260
So how do you select the data you train on?
link |
01:00:58.260
I think this is an interesting open question.
link |
01:01:00.540
Like what's the cleverest way to select data?
link |
01:01:04.220
That's the question Tesla is probably working on.
link |
01:01:07.660
That's, I mean, the entirety of machine learning can be,
link |
01:01:09.900
they don't seem to really care.
link |
01:01:11.060
They just kind of select data.
link |
01:01:12.260
But I feel like that if you want to solve,
link |
01:01:14.860
if you want to create intelligent systems,
link |
01:01:16.220
you have to pick data well, right?
link |
01:01:18.900
And so do you have any hints, ideas of how to do it well?
link |
01:01:22.900
So in some ways that is...
link |
01:01:25.140
The definition I like of reinforcement learning
link |
01:01:27.400
versus supervised learning.
link |
01:01:29.300
In supervised learning, the weights depend on the data.
link |
01:01:32.720
Right?
link |
01:01:34.560
And this is obviously true,
link |
01:01:35.700
but in reinforcement learning,
link |
01:01:38.320
the data depends on the weights.
link |
01:01:40.420
Yeah.
link |
01:01:41.380
And actually both ways.
link |
01:01:42.540
That's poetry.
link |
01:01:43.980
So how does it know what data to train on?
link |
01:01:46.260
Well, let it pick.
link |
01:01:47.540
We're not there yet, but that's the eventual goal.
link |
01:01:49.460
So you're thinking this almost like
link |
01:01:51.100
a reinforcement learning framework.
link |
01:01:53.320
We're going to do RL on the world.
link |
01:01:55.380
Every time a car makes a mistake, user disengages,
link |
01:01:58.140
we train on that and do RL on the world.
link |
01:02:00.140
Ship out a new model, that's an epoch, right?
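Here is a toy simulation of that loop (deploy, collect disengagements, update, ship), with training collapsed to a single quality scalar. It is purely to show the epoch structure and the two-way dependence, data depending on weights and weights depending on data; nothing here resembles a real pipeline:

import random

def drive_one_mile(model_quality: float) -> bool:
    """True if the (simulated) user had to disengage on this mile."""
    return random.random() > model_quality

model_quality = 0.90  # roughly one disengagement per 10 miles
for epoch in range(5):  # each shipped release is one epoch
    # The data you collect depends on the weights you shipped...
    disengagements = sum(drive_one_mile(model_quality) for _ in range(1000))
    # ...and the next weights depend on the data you collected.
    model_quality += 0.5 * (1 - model_quality) * min(1.0, disengagements / 100)
    print(f"epoch {epoch}: {disengagements} disengagements per 1000 miles")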
link |
01:02:03.260
And for now you're not doing the Elon style promising
link |
01:02:08.540
that it's going to be fully autonomous.
link |
01:02:09.700
You really are sticking to level two
link |
01:02:12.420
and like it's supposed to be supervised.
link |
01:02:15.300
It is definitely supposed to be supervised
link |
01:02:16.900
and we enforce the fact that it's supervised.
link |
01:02:19.740
We look at our rate of improvement in disengagements.
link |
01:02:23.660
OpenPilot now has an unplanned disengagement
link |
01:02:25.700
about every hundred miles.
link |
01:02:27.440
This is up from 10 miles, like maybe,
link |
01:02:32.580
maybe a year ago.
link |
01:02:36.560
Yeah.
link |
01:02:37.400
So maybe we've seen 10 X improvement in a year,
link |
01:02:38.500
but a hundred miles is still a far cry
link |
01:02:41.740
from the hundred thousand you're going to need.
link |
01:02:43.880
So you're going to somehow need to get three more 10 Xs
link |
01:02:48.060
in there.
link |
01:02:49.880
And you're, what's your intuition?
link |
01:02:52.300
You're basically hoping that there's exponential
link |
01:02:54.540
improvement baked into the cake somewhere.
link |
01:02:56.940
Well, that's even, I mean, 10 X improvement,
link |
01:02:58.420
that's already assuming exponential, right?
link |
01:03:00.540
There's definitely exponential improvement.
link |
01:03:02.620
And I think when Elon talks about exponential,
link |
01:03:04.340
like these things, these systems are going to
link |
01:03:06.340
exponentially improve, just exponential doesn't mean
link |
01:03:09.820
you're getting a hundred gigahertz processors tomorrow.
link |
01:03:12.660
Right? Like it's going to still take a while
link |
01:03:15.060
because the gap between even our best system
link |
01:03:18.340
and humans is still large.
link |
01:03:20.300
So that's an interesting distinction to draw.
link |
01:03:22.340
So if you look at the way Tesla is approaching the problem
link |
01:03:26.100
and the way you're approaching the problem,
link |
01:03:28.340
which is very different than the rest of the self driving
link |
01:03:31.780
car world.
link |
01:03:32.720
So let's put them aside. You're treating most of
link |
01:03:35.380
the driving task as a machine learning problem.
link |
01:03:37.500
And the way Tesla is approaching it is with the multitask
link |
01:03:40.260
learning where you break the task of driving into hundreds
link |
01:03:44.100
of different tasks and you have this multiheaded
link |
01:03:47.180
neural network that's very good at performing each task.
link |
01:03:51.660
And there there's presumably something on top that's
link |
01:03:54.620
stitching stuff together in order to make control
link |
01:03:59.240
decisions, policy decisions about how you move the car.
link |
01:04:02.180
But what that allows you, there's a brilliance to this
link |
01:04:04.420
because it allows you to master each task,
link |
01:04:08.360
like lane detection, stop sign detection,
link |
01:04:13.380
the traffic light detection, drivable area segmentation,
link |
01:04:19.180
you know, vehicle, bicycle, pedestrian detection.
link |
01:04:23.000
There's some localization tasks in there.
link |
01:04:25.420
Also predicting of like, yeah,
link |
01:04:30.440
predicting how the entities in the scene are going to move.
link |
01:04:34.060
Like everything is basically a machine learning task.
link |
01:04:36.360
So there's a classification, segmentation, prediction.
link |
01:04:40.380
And it's nice because you can have this entire engine,
link |
01:04:44.460
data engine that's mining for edge cases for each one of
link |
01:04:48.740
these tasks.
link |
01:04:49.580
And you can have people like engineers that are basically
link |
01:04:52.200
masters of that task,
link |
01:04:53.820
like become the best person in the world at,
link |
01:04:56.600
as you talk about the cone guy for Waymo,
link |
01:04:59.860
becoming the best person in the world at cone detection.
link |
01:05:06.700
So that's a compelling notion from a supervised learning
link |
01:05:10.140
perspective, automating much of the process of edge case
link |
01:05:15.380
discovery and retraining neural network for each of the
link |
01:05:17.820
individual perception tasks.
link |
01:05:19.780
And then you're looking at the machine learning in a more
link |
01:05:22.460
holistic way, basically doing end to end learning on the
link |
01:05:27.240
driving tasks, supervised, trained on the data of the
link |
01:05:31.500
actual driving of people.
link |
01:05:34.420
who use comma AI, like actual human drivers,
link |
01:05:37.580
their manual control,
link |
01:05:38.700
plus the moments of disengagement that maybe with some
link |
01:05:44.400
labeling could indicate the failure of the system.
link |
01:05:47.340
So you have the,
link |
01:05:48.900
you have a huge amount of data for positive control of the
link |
01:05:52.420
vehicle, like successful control of the vehicle,
link |
01:05:55.560
both maintaining the lane as,
link |
01:05:58.340
as I think you're also working on longitudinal control of
link |
01:06:01.060
the vehicle and then failure cases where the vehicle does
link |
01:06:04.700
something wrong that needs disengagement.
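To make the contrast concrete, here is a toy PyTorch sketch of the two shapes: one shared backbone feeding a small head per hand-picked task, versus a single head that outputs only a path. All sizes and task names are invented; this is neither Tesla's nor comma's actual architecture:

import torch
import torch.nn as nn

backbone = nn.Sequential(nn.Conv2d(3, 16, 5, stride=4), nn.ReLU(),
                         nn.AdaptiveAvgPool2d(1), nn.Flatten())

# Multitask style: one small head per engineered task.
heads = nn.ModuleDict({
    "lanes":    nn.Linear(16, 8),    # lane line coefficients
    "lights":   nn.Linear(16, 3),    # red / yellow / green
    "vehicles": nn.Linear(16, 20),   # detections
})

# End-to-end style: one head that just outputs the path to drive.
end_to_end_head = nn.Linear(16, 33)

x = torch.randn(1, 3, 128, 256)      # a camera frame
features = backbone(x)
per_task = {name: head(features) for name, head in heads.items()}
path = end_to_end_head(features)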
link |
01:06:08.380
So like what,
link |
01:06:09.820
why do you think you're right and Tesla is wrong on this?
link |
01:06:14.220
And do you think,
link |
01:06:15.660
do you think you'll come around the Tesla way?
link |
01:06:17.540
Do you think Tesla will come around to your way?
link |
01:06:21.400
If you were to start a chess engine company,
link |
01:06:23.920
would you hire a Bishop guy?
link |
01:06:26.080
See, we have a,
link |
01:06:27.660
this is Monday morning quarterbacking.
link |
01:06:29.060
Yes, probably.
link |
01:06:36.340
Oh, our Rook guy.
link |
01:06:37.400
Oh, we stole the Rook guy from that company.
link |
01:06:39.420
Oh, we're going to have real good Rooks.
link |
01:06:40.780
Well, there's not many pieces, right?
link |
01:06:43.900
You can,
link |
01:06:46.220
there's not many guys and gals to hire.
link |
01:06:48.820
You just have a few that work in the Bishop,
link |
01:06:51.060
a few that work in the Rook.
link |
01:06:52.860
Is that not ludicrous today to think about
link |
01:06:55.340
in a world of AlphaZero?
link |
01:06:57.520
But AlphaZero is a chess game.
link |
01:06:58.860
So the fundamental question is,
link |
01:07:01.780
how hard is driving compared to chess?
link |
01:07:04.340
Because, so long term,
link |
01:07:07.200
end to end,
link |
01:07:08.980
will be the right solution.
link |
01:07:10.620
The question is how many years away is that?
link |
01:07:13.460
End to end is going to be the only solution for level five.
link |
01:07:15.740
It's the only way we'll get there.
link |
01:07:17.220
Of course, and of course,
link |
01:07:18.220
Tesla is going to come around to my way.
link |
01:07:19.740
And if you're a Rook guy out there, I'm sorry.
link |
01:07:22.980
The cone guy.
link |
01:07:24.980
I don't know.
link |
01:07:25.820
We're going to specialize each task.
link |
01:07:26.940
We're going to really understand Rook placement.
link |
01:07:29.100
Yeah.
link |
01:07:30.540
I understand the intuition you have.
link |
01:07:32.060
I mean, that,
link |
01:07:35.100
that is a very compelling notion
link |
01:07:36.820
that we can learn the task end to end,
link |
01:07:39.140
like the same compelling notion you might have
link |
01:07:40.820
for natural language conversation.
link |
01:07:42.620
But I'm not
link |
01:07:44.800
sure,
link |
01:07:47.060
because one thing you sneaked in there
link |
01:07:48.940
is the assertion that it's impossible to get to level five
link |
01:07:53.180
without this kind of approach.
link |
01:07:55.420
I don't know if that's obvious.
link |
01:07:57.140
I don't know if that's obvious either.
link |
01:07:58.300
I don't actually mean that.
link |
01:08:01.340
I think that it is much easier
link |
01:08:03.500
to get to level five with an end to end approach.
link |
01:08:05.700
I think that the other approach is doable,
link |
01:08:08.900
but the magnitude of the engineering challenge
link |
01:08:11.380
may exceed what humanity is capable of.
link |
01:08:13.780
But what do you think of the Tesla data engine approach,
link |
01:08:19.140
which to me is an active learning task,
link |
01:08:21.100
is kind of fascinating,
link |
01:08:22.500
is breaking it down into these multiple tasks
link |
01:08:25.700
and mining their data constantly for like edge cases
link |
01:08:29.500
for these different tasks.
link |
01:08:30.500
Yeah, but the tasks themselves are not being learned.
link |
01:08:32.460
This is feature engineering.
link |
01:08:35.700
Yeah, I mean, it's a higher abstraction level
link |
01:08:40.740
of feature engineering for the different tasks.
link |
01:08:43.340
Task engineering in a sense.
link |
01:08:44.740
It's slightly better feature engineering,
link |
01:08:46.780
but it still fundamentally is feature engineering.
link |
01:08:49.260
And if the history of AI
link |
01:08:51.300
has taught us anything,
link |
01:08:52.620
it's that feature engineering approaches
link |
01:08:54.540
will always be replaced and lose to end to end.
link |
01:08:57.660
Now, to be fair, I cannot really make promises on timelines,
link |
01:09:02.060
but I can say that when you look at the code for Stockfish
link |
01:09:05.700
and the code for AlphaZero,
link |
01:09:06.940
one is a lot shorter than the other,
link |
01:09:09.060
a lot more elegant,
link |
01:09:09.900
required a lot less programmer hours to write.
link |
01:09:12.580
Yeah, but there was a lot more murder of bad agents
link |
01:09:21.620
on the AlphaZero side.
link |
01:09:24.740
By murder, I mean agents that played a game
link |
01:09:29.220
and failed miserably.
link |
01:09:30.420
Yeah.
link |
01:09:31.460
Oh, oh.
link |
01:09:32.300
In simulation, that failure is less costly.
link |
01:09:34.740
Yeah.
link |
01:09:35.580
In real world, it's...
link |
01:09:37.300
Do you mean in practice,
link |
01:09:38.260
like AlphaZero has lost games miserably?
link |
01:09:40.660
No.
link |
01:09:41.500
Wow.
link |
01:09:42.540
I haven't seen that.
link |
01:09:43.380
No, but I know, but the requirement for AlphaZero is...
link |
01:09:47.380
A simulator.
link |
01:09:48.220
To be able to like evolution, human evolution,
link |
01:09:51.500
not human evolution, biological evolution of life on earth
link |
01:09:54.420
from the origin of life has murdered trillions
link |
01:09:58.700
upon trillions of organisms on the path to humans.
link |
01:10:02.260
Yeah.
link |
01:10:03.100
So the question is, can we stitch together
link |
01:10:05.860
a human like object without having to go
link |
01:10:07.940
through the entirety process of evolution?
link |
01:10:09.900
Well, no, but do the evolution in simulation.
link |
01:10:11.940
Yeah, that's the question.
link |
01:10:12.860
Can we simulate?
link |
01:10:13.700
So do you have a sense that it's possible
link |
01:10:15.060
to simulate some aspect?
link |
01:10:16.220
MuZero is exactly this.
link |
01:10:18.220
MuZero is the solution to this.
link |
01:10:21.300
MuZero I think is going to be looked back
link |
01:10:23.980
as the canonical paper.
link |
01:10:25.220
And I don't think deep learning is everything.
link |
01:10:26.860
I think that there's still a bunch of things missing
link |
01:10:28.700
to get there, but MuZero I think is going to be looked back
link |
01:10:31.420
as the kind of cornerstone paper
link |
01:10:34.260
of this whole deep learning era.
link |
01:10:37.060
And MuZero is the solution to self driving cars.
link |
01:10:39.580
You have to make a few tweaks to it,
link |
01:10:41.220
but MuZero does effectively that.
link |
01:10:42.820
It does those rollouts and those murdering
link |
01:10:45.500
in a learned simulator and a learned dynamics model.
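A stripped-down sketch of that MuZero idea in Python. The three functions mirror the paper's representation, dynamics, and prediction networks, but the bodies are arbitrary stubs standing in for trained models, and greedy search stands in for the MCTS the paper actually uses:

import random

def represent(observation):            # observation -> latent state
    return hash(observation) % 1000

def dynamics(state, action):           # (state, action) -> next state, reward
    return (state * 31 + action) % 1000, random.random()

def predict(state):                    # state -> value estimate
    return (state % 100) / 100.0

def plan(observation, actions, depth=3):
    """Pick the action whose imagined rollout scores best. Every rollout
    happens inside the learned model, never in the real world."""
    def rollout(state, d):
        if d == 0:
            return predict(state)
        best = float("-inf")
        for a in actions:
            nxt, r = dynamics(state, a)
            best = max(best, r + rollout(nxt, d - 1))
        return best
    s0 = represent(observation)
    scores = []
    for a in actions:
        nxt, r = dynamics(s0, a)
        scores.append((r + rollout(nxt, depth - 1), a))
    return max(scores)[1]

print(plan("camera_frame_42", actions=[0, 1, 2]))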
link |
01:10:50.180
That's interesting.
link |
01:10:51.020
It doesn't get enough love.
link |
01:10:51.860
I was blown away when I read that paper.
link |
01:10:54.220
I'm like, okay, I've always said a comma.
link |
01:10:57.060
I'm going to sit and I'm going to wait for the solution
link |
01:10:58.460
to self driving cars to come along.
link |
01:11:00.340
This year I saw it.
link |
01:11:01.180
It's MuZero.
link |
01:11:05.060
So.
link |
01:11:06.860
Sit back and let the winnings roll in.
link |
01:11:09.180
So your sense, just to elaborate a little bit,
link |
01:11:12.300
to linger on the topic.
link |
01:11:13.380
Your sense is neural networks will solve driving.
link |
01:11:16.300
Yes.
link |
01:11:17.140
Like we don't need anything else.
link |
01:11:18.820
I think the same way chess was maybe the chess
link |
01:11:21.260
and maybe Go are the pinnacle of like search algorithms
link |
01:11:25.060
and things that look kind of like A*.
link |
01:11:28.420
The pinnacle of this era is going to be self driving cars.
link |
01:11:34.700
But on the path of that, you have to deliver products
link |
01:11:38.180
and it's possible that the path to full self driving cars
link |
01:11:42.340
will take decades.
link |
01:11:44.420
I doubt it.
link |
01:11:45.340
How long would you put on it?
link |
01:11:47.820
Like what are we, you're chasing it, Tesla's chasing it.
link |
01:11:53.460
What are we talking about?
link |
01:11:54.340
Five years, 10 years, 50 years.
link |
01:11:56.180
Let's say in the 2020s.
link |
01:11:58.060
In the 2020s.
link |
01:11:59.540
The later part of the 2020s.
link |
01:12:03.580
With the neural network.
link |
01:12:05.500
Well, that would be nice to see.
link |
01:12:06.580
And then the path to that, you're delivering products,
link |
01:12:09.140
which is a nice L2 system.
link |
01:12:10.580
That's what Tesla's doing, a nice L2 system.
link |
01:12:13.060
Just gets better every time.
link |
01:12:14.380
L2, the only difference between L2 and the other levels
link |
01:12:16.660
is who takes liability.
link |
01:12:17.620
And I'm not a liability guy, I don't wanna take liability.
link |
01:12:20.100
I'm gonna be level two forever.
link |
01:12:22.700
Now on that little transition,
link |
01:12:25.780
I mean, how do you make the transition work?
link |
01:12:29.060
Is this where driver sensing comes in?
link |
01:12:32.780
Like how do you make the, cause you said a hundred miles,
link |
01:12:35.940
like, is there some sort of human factor psychology thing
link |
01:12:41.340
where people start to overtrust the system,
link |
01:12:43.140
all those kinds of effects,
link |
01:12:45.020
once it gets better and better and better and better,
link |
01:12:46.780
they get lazier and lazier and lazier.
link |
01:12:49.340
Is that, like, how do you get that transition right?
link |
01:12:52.460
First off, our monitoring is already adaptive.
link |
01:12:54.500
Our monitoring is already scene adaptive.
link |
01:12:56.620
Driver monitoring is just the camera
link |
01:12:58.940
that's looking at the driver.
link |
01:13:00.060
You have an infrared camera in the...
link |
01:13:03.060
Our policy for how we enforce the driver monitoring
link |
01:13:06.340
is scene adaptive.
link |
01:13:07.940
What's that mean?
link |
01:13:08.780
Well, for example, in one of the extreme cases,
link |
01:13:12.580
if the car is not moving,
link |
01:13:14.860
we do not actively enforce driver monitoring, right?
link |
01:13:19.460
If you are going through a,
link |
01:13:22.380
like a 45 mile an hour road with lights
link |
01:13:25.780
and stop signs and potentially pedestrians,
link |
01:13:27.980
we enforce a very tight driver monitoring policy.
link |
01:13:30.780
If you are alone on a perfectly straight highway,
link |
01:13:33.860
and this is, it's all machine learning.
link |
01:13:35.620
None of that is hand coded.
link |
01:13:36.940
Actually, the stopped case is hand coded, but...
link |
01:13:39.060
So there's some kind of machine learning
link |
01:13:41.100
estimation of risk.
link |
01:13:42.300
Yes.
link |
01:13:43.620
Yeah.
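A sketch of what a scene-adaptive policy like that could look like. The thresholds and the risk inputs here are invented for illustration; this is not comma's actual policy:

def allowed_eyes_off_seconds(speed_mps: float, scene_risk: float) -> float:
    if speed_mps < 0.5:
        return float("inf")  # car stopped: don't enforce at all (hand coded)
    base = 6.0               # budget on an empty, straight highway
    return max(0.5, base * (1.0 - scene_risk))  # risky scene: tight budget

def monitor(eyes_off_seconds, speed_mps, scene_risk) -> str:
    if eyes_off_seconds > allowed_eyes_off_seconds(speed_mps, scene_risk):
        return "alert"
    return "ok"

print(monitor(3.0, speed_mps=30.0, scene_risk=0.1))  # empty highway: ok
print(monitor(3.0, speed_mps=20.0, scene_risk=0.9))  # busy 45 mph road: alert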
link |
01:13:44.460
I mean, I've always been a huge fan of that.
link |
01:13:45.860
That's a...
link |
01:13:47.500
Because...
link |
01:13:48.340
Even if it's difficult to do, every step in that direction
link |
01:13:53.780
is a worthwhile step to take.
link |
01:13:55.100
It might be difficult to do really well.
link |
01:13:56.540
Like us humans are able to estimate risk pretty damn well,
link |
01:13:59.980
whatever the hell that is.
link |
01:14:01.500
That feels like one of the nice features of us humans.
link |
01:14:06.500
Cause like we humans are really good drivers
link |
01:14:08.980
when we're really like tuned in
link |
01:14:11.180
and we're good at estimating risk.
link |
01:14:12.940
Like when are we supposed to be tuned in?
link |
01:14:14.900
Yeah.
link |
01:14:15.980
And, you know, people are like,
link |
01:14:17.580
oh, well, you know,
link |
01:14:18.420
why would you ever make the driver monitoring policy
link |
01:14:20.420
less aggressive?
link |
01:14:21.260
Why would you always not keep it at its most aggressive?
link |
01:14:23.980
Because then people are just going to get fatigued from it.
link |
01:14:25.780
Yes.
link |
01:14:26.620
When they get annoyed.
link |
01:14:27.460
You want them...
link |
01:14:28.300
Yeah.
link |
01:14:29.140
You want the experience to be pleasant.
link |
01:14:30.900
Obviously I want the experience to be pleasant,
link |
01:14:32.500
but even just from a straight up safety perspective,
link |
01:14:35.980
if you alert people when they look around and they're like,
link |
01:14:39.740
why is this thing alerting me?
link |
01:14:41.020
There's nothing I could possibly hit right now.
link |
01:14:42.980
People will just learn to tune it out.
link |
01:14:45.220
People will just learn to tune it out,
link |
01:14:46.900
to put weights on the steering wheel,
link |
01:14:48.060
to do whatever to overcome it.
link |
01:14:49.820
And remember that you're always part
link |
01:14:52.580
of this adaptive system.
link |
01:14:53.580
So all I can really say about, you know,
link |
01:14:55.660
how this scales going forward is yeah,
link |
01:14:57.180
it's something we have to monitor for.
link |
01:14:59.420
Ooh, we don't know.
link |
01:15:00.260
This is a great psychology experiment at scale.
link |
01:15:02.060
Like we'll see.
link |
01:15:03.220
Yeah, it's fascinating.
link |
01:15:04.060
Track it.
link |
01:15:04.900
And making sure you have a good understanding of attention
link |
01:15:09.140
is a very key part of that psychology problem.
link |
01:15:11.420
Yeah.
link |
01:15:12.260
I think you and I probably have a different,
link |
01:15:14.100
come to it differently, but to me,
link |
01:15:16.660
it's a fascinating psychology problem
link |
01:15:19.700
to explore something much deeper than just driving.
link |
01:15:22.100
It's such a nice way to explore human attention
link |
01:15:26.700
and human behavior, which is why, again,
link |
01:15:30.180
we've probably both criticized Mr. Elon Musk
link |
01:15:34.100
on this one topic from different avenues.
link |
01:15:38.220
So both offline and online,
link |
01:15:39.740
I had little chats with Elon and like,
link |
01:15:44.380
I love human beings as a computer vision problem,
link |
01:15:48.620
as an AI problem, it's fascinating.
link |
01:15:51.020
He wasn't so much interested in that problem.
link |
01:15:53.260
It's like in order to solve driving,
link |
01:15:56.820
the whole point is you want to remove the human
link |
01:15:58.780
from the picture.
link |
01:16:01.300
And it seems like you can't do that quite yet.
link |
01:16:04.140
Eventually, yes, but you can't quite do that yet.
link |
01:16:07.900
So this is the moment where you can't yet say,
link |
01:16:12.300
I told you so to Tesla, but it's getting there
link |
01:16:17.700
because I don't know if you've seen this,
link |
01:16:19.220
there's some reporting that they're in fact
link |
01:16:21.100
starting to do driver monitoring.
link |
01:16:23.260
Yeah, they ship the model in shadow mode.
link |
01:16:26.260
With, I believe, only a visible light camera,
link |
01:16:29.620
it might even be fisheye.
link |
01:16:31.780
It's like a low resolution.
link |
01:16:33.260
Low resolution, visible light.
link |
01:16:34.820
I mean, to be fair, that's what we have in the Eon as well,
link |
01:16:37.100
our last generation product.
link |
01:16:38.820
This is the one area where I can say
link |
01:16:41.060
our hardware is ahead of Tesla.
link |
01:16:42.180
The rest of our hardware, way, way behind,
link |
01:16:43.820
but our driver monitoring camera.
link |
01:16:46.020
So you think, I think on the third row Tesla podcast,
link |
01:16:50.940
or somewhere else, I've heard you say that obviously,
link |
01:16:54.580
eventually they're gonna have driver monitoring.
link |
01:16:57.220
I think what I've said is Elon will definitely ship
link |
01:16:59.660
driver monitoring before he ships level five.
link |
01:17:01.700
Before level five.
link |
01:17:02.540
And I'm willing to bet 10 grand on that.
link |
01:17:04.660
And you bet 10 grand on that.
link |
01:17:07.060
I mean, now I don't wanna take the bet,
link |
01:17:08.260
but before, maybe someone would have,
link |
01:17:09.500
oh, I should have got my money in.
link |
01:17:10.500
Yeah.
link |
01:17:11.900
It's an interesting bet.
link |
01:17:12.980
I think you're right.
link |
01:17:16.460
I'm actually, on a human level,
link |
01:17:19.180
because he's been, he's made the decision.
link |
01:17:24.380
Like he said that driver monitoring is the wrong way to go.
link |
01:17:27.660
But like, you have to think of as a human, as a CEO,
link |
01:17:31.260
I think that's the right thing to say when,
link |
01:17:36.460
like sometimes you have to say things publicly
link |
01:17:40.140
that are different than when you actually believe,
link |
01:17:41.780
because when you're producing a large number of vehicles
link |
01:17:45.420
and the decision was made not to include the camera,
link |
01:17:47.860
like what are you supposed to say?
link |
01:17:49.460
Like our cars don't have the thing
link |
01:17:51.780
that I think is right to have.
link |
01:17:54.020
It's an interesting thing.
link |
01:17:55.780
But like on the other side, as a CEO,
link |
01:17:58.340
I mean, something you could probably speak to as a leader,
link |
01:18:01.220
I think about me as a human
link |
01:18:04.940
to publicly change your mind on something.
link |
01:18:07.020
How hard is that?
link |
01:18:08.420
Especially when assholes like George Haas say,
link |
01:18:10.620
I told you so.
link |
01:18:12.380
All I will say is I am not a leader
link |
01:18:14.740
and I am happy to change my mind.
link |
01:18:17.060
And I will.
link |
01:18:17.900
You think Elon will?
link |
01:18:20.580
Yeah, I do.
link |
01:18:22.300
I think he'll come up with a good way
link |
01:18:24.260
to make it psychologically okay for him.
link |
01:18:27.420
Well, it's such an important thing, man.
link |
01:18:29.700
Especially for a first principles thinker,
link |
01:18:31.420
because he made a decision that driver monitoring
link |
01:18:34.780
is not the right way to go.
link |
01:18:35.740
And I could see that decision.
link |
01:18:37.660
And I could even make that decision.
link |
01:18:39.260
Like I was on the fence too.
link |
01:18:41.660
Like I'm not a,
link |
01:18:42.500
driver monitoring is such an obvious,
link |
01:18:47.140
simple solution to the problem of attention.
link |
01:18:49.980
It's not obvious to me that just by putting a camera there,
link |
01:18:52.820
you solve things.
link |
01:18:54.260
You have to create an incredible, compelling experience.
link |
01:18:59.060
Just like you're talking about.
link |
01:19:01.060
I don't know if it's easy to do that.
link |
01:19:03.300
It's not at all easy to do that, in fact, I think.
link |
01:19:05.980
So as a creator of a car that's trying to create a product
link |
01:19:10.980
that people love, which is what Tesla tries to do, right?
link |
01:19:14.180
It's not obvious to me that as a design decision,
link |
01:19:18.340
whether adding a camera is a good idea.
link |
01:19:20.940
From a safety perspective either,
link |
01:19:22.460
like in the human factors community,
link |
01:19:25.100
everybody says that you should obviously
link |
01:19:27.380
have driver sensing, driver monitoring.
link |
01:19:30.500
But that's like saying it's obvious as parents,
link |
01:19:36.900
you shouldn't let your kids go out at night.
link |
01:19:39.780
But okay, but like,
link |
01:19:43.460
they're still gonna find ways to do drugs.
link |
01:19:45.620
Like, you have to also be good parents.
link |
01:19:49.860
So like, it's much more complicated than just the,
link |
01:19:52.580
you need to have driver monitoring.
link |
01:19:54.140
I totally disagree on, okay, if you have a camera there
link |
01:19:58.740
and the camera's watching the person,
link |
01:20:00.300
but never throws an alert, they'll never think about it.
link |
01:20:03.540
Right?
link |
01:20:04.380
The driver monitoring policy that you choose to,
link |
01:20:08.380
how you choose to communicate with the user
link |
01:20:10.180
is entirely separate from the data collection perspective.
link |
01:20:14.220
Right?
link |
01:20:15.060
Right?
link |
01:20:15.900
So, you know, like, there's one thing to say,
link |
01:20:20.980
like, you know, tell your teenager they can't do something.
link |
01:20:24.500
There's another thing to like, you know, gather the data.
link |
01:20:27.020
So you can make informed decisions.
link |
01:20:28.500
That's really interesting.
link |
01:20:29.340
But you have to make that,
link |
01:20:30.980
that's the interesting thing about cars.
link |
01:20:33.380
But even true with comma AI,
link |
01:20:35.020
like you don't have to manufacture the thing
link |
01:20:37.900
into the car, but you have to make a decision
link |
01:20:40.060
that anticipates the right strategy long term.
link |
01:20:44.180
So like, you have to start collecting the data
link |
01:20:46.620
and start making decisions.
link |
01:20:47.780
Started it three years ago.
link |
01:20:49.900
I believe that we have the best driver monitoring solution
link |
01:20:52.660
in the world.
link |
01:20:54.620
I think that when you compare it to Super Cruise
link |
01:20:57.220
is the only other one that I really know that shipped.
link |
01:20:59.460
And ours is better.
link |
01:21:01.420
What do you like and not like about Super Cruise?
link |
01:21:06.420
I mean, I had a few Super Cruise drives where
link |
01:21:08.740
the sun would be shining through the window,
link |
01:21:12.020
would blind the camera,
link |
01:21:13.180
and it would say I wasn't paying attention.
link |
01:21:14.580
When I was looking completely straight,
link |
01:21:16.100
I couldn't reset the attention with a steering wheel touch
link |
01:21:19.140
and Super Cruise would disengage.
link |
01:21:21.060
Like I was communicating to the car, I'm like, look,
link |
01:21:22.980
I am here, I am paying attention.
link |
01:21:24.420
Why are you really gonna force me to disengage?
link |
01:21:26.340
And it did.
link |
01:21:28.620
So it's a constant conversation with the user.
link |
01:21:32.100
And yeah, there's no way to ship a system
link |
01:21:33.660
like this if you can't OTA.
link |
01:21:35.500
We're shipping a new one every month.
link |
01:21:37.180
Sometimes we balance it with our users on Discord.
link |
01:21:40.140
Like sometimes we make the driver monitoring
link |
01:21:41.980
a little more aggressive and people complain.
link |
01:21:43.820
Sometimes they don't.
link |
01:21:45.500
We want it to be as aggressive as possible
link |
01:21:47.060
where people don't complain and it doesn't feel intrusive.
link |
01:21:49.180
So being able to update the system over the air
link |
01:21:51.100
is an essential component.
link |
01:21:52.540
I mean, that's probably to me, you mentioned,
link |
01:21:56.620
I mean, to me that is the biggest innovation of Tesla,
link |
01:22:01.060
that it made people realize that over the air updates
link |
01:22:04.860
is essential.
link |
01:22:06.460
Yeah.
link |
01:22:07.460
I mean, was that not obvious from the iPhone?
link |
01:22:10.140
The iPhone was the first real product that OTA'd, I think.
link |
01:22:13.060
Was it actually, that's brilliant, you're right.
link |
01:22:15.220
I mean, the game consoles used to not, right?
link |
01:22:17.060
The game consoles were maybe the second thing that did.
link |
01:22:18.980
Wow, I didn't really think about that. One of the amazing features
link |
01:22:22.260
of a smartphone, it's not just like the touchscreen
link |
01:22:26.180
that's the thing, it's the ability to constantly update.
link |
01:22:30.740
Yeah, it gets better.
link |
01:22:31.980
It gets better.
link |
01:22:35.100
Love my iOS 14.
link |
01:22:36.820
Yeah.
link |
01:22:38.300
Well, one thing that I probably disagree with you
link |
01:22:41.540
on driver monitoring is you said that it's easy.
link |
01:22:46.700
I mean, you tend to say stuff is easy.
link |
01:22:48.940
I'm sure the, I guess you said it's easy
link |
01:22:52.700
relative to the external perception problem.
link |
01:22:58.180
Can you elaborate why you think it's easy?
link |
01:23:00.940
Feature engineering works for driver monitoring.
link |
01:23:03.500
Feature engineering does not work for the external.
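(A minimal sketch of what a feature-engineered attention check could look like, assuming an upstream landmark model already gives you head-pose angles. The thresholds, names, and alert logic here are illustrative, not Comma's actual pipeline.)

    # Illustrative feature-engineered attention check; thresholds and
    # names are hypothetical, assuming head-pose angles (degrees) from
    # an upstream face-landmark detector.

    YAW_LIMIT = 25.0      # looking too far left/right
    PITCH_LIMIT = 15.0    # looking too far down (e.g., at a phone)
    MAX_OFF_ROAD_S = 2.0  # seconds of off-road gaze before alerting

    def gaze_on_road(yaw_deg, pitch_deg):
        return abs(yaw_deg) < YAW_LIMIT and pitch_deg > -PITCH_LIMIT

    def step_alert(timer_s, yaw, pitch, dt):
        """Accumulate off-road time; alert once it exceeds the threshold."""
        timer_s = 0.0 if gaze_on_road(yaw, pitch) else timer_s + dt
        return timer_s, timer_s > MAX_OFF_ROAD_S

    # Driver looks down for three seconds, sampled at 10Hz:
    timer, alert = 0.0, False
    for _ in range(30):
        timer, alert = step_alert(timer, yaw=0.0, pitch=-30.0, dt=0.1)
    print(alert)  # True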
link |
01:23:05.860
So human faces are not, human faces and the movement
link |
01:23:10.700
of human faces and head and body is not as variable
link |
01:23:14.740
as the external environment, is your intuition?
link |
01:23:17.140
Yes, and there's another big difference as well.
link |
01:23:20.140
Your reliability of a driver monitoring system
link |
01:23:22.500
doesn't actually need to be that high.
link |
01:23:24.380
The uncertainty, if you have something that's detecting
link |
01:23:27.220
whether the human's paying attention and it only works
link |
01:23:29.140
92% of the time, you're still getting almost all
link |
01:23:31.700
the benefit of that because the human,
link |
01:23:33.620
like you're training the human, right?
link |
01:23:35.500
You're dealing with a system that's really helping you out.
link |
01:23:39.100
It's a conversation.
link |
01:23:40.180
It's not like the external thing where guess what?
link |
01:23:43.460
If you swerve into a tree, you swerve into a tree, right?
link |
01:23:46.140
Like you get no margin for error there.
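(To make the 92% point concrete: if the system checks attention repeatedly, even a modestly reliable per-check detector almost never misses sustained inattention. A back-of-the-envelope sketch, assuming one independent check per second, which is a simplification.)

    # Per-check detection rate of 92%, one check per second. Chance that
    # sustained inattention goes unnoticed for n checks in a row
    # (independence between checks is a simplifying assumption):
    p_detect = 0.92
    for n in (1, 3, 5, 10):
        print(f"missed {n:2d}s straight: {(1 - p_detect) ** n:.1e}")
    # 1s: 8.0e-02, 3s: 5.1e-04, 5s: 3.3e-06, 10s: 1.1e-11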
link |
01:23:48.340
Yeah, I think that's really well put.
link |
01:23:49.620
I think that's the right, exactly the place
link |
01:23:54.020
where comparing to the external perception,
link |
01:23:58.660
the control problem, the driver monitoring is easier
link |
01:24:01.500
because you don't, the bar for success is much lower.
link |
01:24:05.260
Yeah, but I still think like the human face
link |
01:24:09.100
is more complicated actually than the external environment,
link |
01:24:12.140
but for driving, you don't give a damn.
link |
01:24:14.380
I don't need, yeah, I don't need something,
link |
01:24:15.620
I don't need something that complicated
link |
01:24:18.300
to have to communicate the idea to the human
link |
01:24:22.220
that I want to communicate, which is,
link |
01:24:23.980
yo, system might mess up here.
link |
01:24:25.740
You gotta pay attention.
link |
01:24:26.940
Yeah, see, that's my love and fascination is the human face.
link |
01:24:32.500
And it feels like this is a nice place to create products
link |
01:24:38.420
that create an experience in the car.
link |
01:24:40.100
So like, it feels like there should be
link |
01:24:42.640
more richer experiences in the car, you know?
link |
01:24:47.540
Like that's an opportunity for like something like Comma AI
link |
01:24:51.500
or just any kind of system like a Tesla
link |
01:24:53.900
or any of the autonomous vehicle companies
link |
01:24:56.220
is because there are so many more sensors
link |
01:24:59.180
and so much of it is software
link |
01:25:00.660
and you're doing machine learning anyway,
link |
01:25:02.940
there's an opportunity to create totally new experiences
link |
01:25:06.340
that we're not even anticipating.
link |
01:25:08.260
You don't think so?
link |
01:25:10.140
Nah.
link |
01:25:10.980
You think it's a box that gets you from A to B
link |
01:25:12.900
and you want to do it chill?
link |
01:25:14.920
Yeah, I mean, I think as soon as we get to level three
link |
01:25:16.940
on highways, okay, enjoy your candy crush,
link |
01:25:19.300
enjoy your Hulu, enjoy your, you know, whatever, whatever.
link |
01:25:23.660
Sure, you get this, you can look at screens basically
link |
01:25:26.260
versus right now where you have music and audio books.
link |
01:25:28.700
So level three is where you can kind of disengage
link |
01:25:31.020
in stretches of time.
link |
01:25:34.860
Well, you think level three is possible?
link |
01:25:37.460
Like on the highway going for 100 miles
link |
01:25:39.180
and you can just go to sleep?
link |
01:25:40.500
Oh yeah, sleep.
link |
01:25:43.620
So again, I think it's really all on a spectrum.
link |
01:25:47.340
I think that being able to use your phone
link |
01:25:50.060
while you're on the highway and like this all being okay
link |
01:25:53.500
and being aware that the car might alert you
link |
01:25:55.360
and you have five seconds to basically.
link |
01:25:57.180
So the five second thing, you think, is possible?
link |
01:25:59.060
Yeah, I think it is, oh yeah.
link |
01:26:00.420
Not in all scenarios, right?
link |
01:26:02.180
Some scenarios it's not.
link |
01:26:03.940
It's the whole risk thing that you mentioned is nice
link |
01:26:06.300
is to be able to estimate like how risky is this situation?
link |
01:26:10.660
That's really important to understand.
link |
01:26:12.620
One other thing you mentioned comparing Comma
link |
01:26:15.380
and Autopilot is that something about the haptic feel
link |
01:26:20.380
of the way Comma controls the car when things are uncertain.
link |
01:26:25.920
Like it behaves a little bit more uncertain
link |
01:26:27.760
when things are uncertain.
link |
01:26:29.240
That's kind of an interesting point.
link |
01:26:31.080
And then Autopilot is much more confident always
link |
01:26:34.080
even when it's uncertain until it runs into trouble.
link |
01:26:39.200
That's a funny thing.
link |
01:26:40.920
I actually mentioned that to Elon, I think.
link |
01:26:42.700
And then the first time we talked, he wasn't biting.
link |
01:26:46.280
It's like communicating uncertainty.
link |
01:26:48.880
I guess Comma doesn't really communicate uncertainty
link |
01:26:51.880
explicitly, it communicates it through haptic feel.
link |
01:26:55.040
Like what's the role of communicating uncertainty
link |
01:26:57.420
do you think?
link |
01:26:58.260
Oh, we do some stuff explicitly.
link |
01:26:59.840
Like we do detect the lanes when you're on the highway
link |
01:27:01.800
and we'll show you how many lanes we're using to drive with.
link |
01:27:04.380
You can look at where it thinks the lanes are.
link |
01:27:06.200
You can look at the path.
link |
01:27:08.480
And we want to be better about this.
link |
01:27:10.440
We're actually hiring, want to hire some new UI people.
link |
01:27:12.900
UI people, you mentioned this.
link |
01:27:14.320
Cause it's such an, it's a UI problem too, right?
link |
01:27:17.300
We have a great designer now, but you know,
link |
01:27:19.720
we need people who are just going to like build this
link |
01:27:21.160
and debug these UIs, Qt people.
link |
01:27:23.800
Qt.
link |
01:27:24.640
Is that what the UI is done with, is Qt?
link |
01:27:26.880
The new UI is in Qt.
link |
01:27:29.320
C++ Qt?
link |
01:27:32.000
Tesla uses it too.
link |
01:27:33.300
Yeah.
link |
01:27:34.200
We had some React stuff in there.
link |
01:27:37.760
React JS or just React?
link |
01:27:39.440
React is its own language, right?
link |
01:27:41.160
React Native, React is a JavaScript framework.
link |
01:27:44.480
Yeah.
link |
01:27:45.320
So it's all based on JavaScript, but it's, you know,
link |
01:27:48.960
I like C++.
link |
01:27:51.480
What do you think about Dojo with Tesla
link |
01:27:55.080
and their foray into what appears to be
link |
01:28:00.240
specialized hardware for training neural nets?
link |
01:28:05.120
I guess it's something, maybe you can correct me,
link |
01:28:07.360
from my shallow looking at it,
link |
01:28:10.000
it seems like something like Google did with TPUs,
link |
01:28:12.120
but specialized for driving data.
link |
01:28:15.680
I don't think it's specialized for driving data.
link |
01:28:18.360
It's just legit, just TPU.
link |
01:28:20.120
They want to go the Apple way,
link |
01:28:22.160
basically everything required in the chain is done in house.
link |
01:28:25.640
Well, so you have a problem right now,
link |
01:28:27.840
and this is one of my concerns.
link |
01:28:31.740
I really would like to see somebody deal with this.
link |
01:28:33.800
If anyone out there is doing it,
link |
01:28:35.280
I'd like to help them if I can.
link |
01:28:38.000
You basically have two options right now to train.
link |
01:28:40.620
One, your options are NVIDIA or Google.
link |
01:28:45.980
So Google is not even an option.
link |
01:28:50.120
Their TPUs are only available in Google Cloud.
link |
01:28:53.180
Google has absolutely onerous
link |
01:28:55.140
terms of service restrictions.
link |
01:28:58.060
They may have changed it,
link |
01:28:59.280
but back in Google's terms of service,
link |
01:29:00.620
it said explicitly you are not allowed to use Google Cloud ML
link |
01:29:03.740
for training autonomous vehicles
link |
01:29:05.340
or for doing anything that competes with Google
link |
01:29:07.360
without Google's prior written permission.
link |
01:29:09.300
Wow, okay.
link |
01:29:10.340
I mean, Google is not a platform company.
link |
01:29:14.140
I wouldn't touch TPUs with a 10 foot pole.
link |
01:29:16.760
So that leaves you with the monopoly.
link |
01:29:19.300
NVIDIA? NVIDIA.
link |
01:29:21.020
So, I mean.
link |
01:29:22.340
That you're not a fan of.
link |
01:29:23.900
Well, look, in 2016 I was a huge fan of NVIDIA.
link |
01:29:28.540
Jensen came and sat in the car.
link |
01:29:31.880
Cool guy.
link |
01:29:32.820
When the stock was $30 a share.
link |
01:29:35.540
NVIDIA stock has skyrocketed.
link |
01:29:38.220
I witnessed a real change
link |
01:29:39.920
in who was in management over there in like 2018.
link |
01:29:43.580
And now they are, let's exploit.
link |
01:29:46.700
Let's take every dollar we possibly can
link |
01:29:48.460
out of this ecosystem.
link |
01:29:49.580
Let's charge $10,000 for A100s
link |
01:29:51.740
because we know we got the best shit in the game.
link |
01:29:54.180
And let's charge $10,000 for an A100
link |
01:29:57.920
when it's really not that different from a 3080,
link |
01:30:00.100
which is $699.
link |
01:30:03.500
The margins that they are making
link |
01:30:05.100
off of those high end chips are so high
link |
01:30:08.640
that, I mean, I think they're shooting themselves
link |
01:30:10.260
in the foot just from a business perspective.
link |
01:30:12.220
Because there's a lot of people talking like me now
link |
01:30:14.980
who are like, somebody's gotta take NVIDIA down.
link |
01:30:19.060
Yeah.
link |
01:30:19.900
When they could dominate it.
link |
01:30:21.020
NVIDIA could be the new Intel.
link |
01:30:22.460
Yeah, to be inside everything essentially.
link |
01:30:26.940
And yet the winners in certain spaces
link |
01:30:30.660
like autonomous driving, the winners,
link |
01:30:33.780
only the people who are like desperately falling behind
link |
01:30:36.620
and trying to catch up and have a ton of money,
link |
01:30:38.540
like the big automakers are the ones
link |
01:30:40.660
interested in partnering with NVIDIA.
link |
01:30:43.220
Oh, and I think a lot of those things
link |
01:30:44.820
are gonna fall through.
link |
01:30:45.940
If I were NVIDIA, sell chips.
link |
01:30:49.340
Sell chips at a reasonable markup.
link |
01:30:52.260
To everybody.
link |
01:30:53.100
To everybody.
link |
01:30:53.920
Without any restrictions.
link |
01:30:54.940
Without any restrictions.
link |
01:30:56.140
Intel did this.
link |
01:30:57.360
Look at Intel.
link |
01:30:58.260
They had a great long run.
link |
01:30:59.940
NVIDIA is trying to turn their,
link |
01:31:01.580
they're like trying to productize their chips
link |
01:31:04.020
way too much.
link |
01:31:05.620
They're trying to extract way more value
link |
01:31:07.880
than they can sustainably.
link |
01:31:09.380
Sure, you can do it tomorrow.
link |
01:31:10.740
Is it gonna up your share price?
link |
01:31:12.140
Sure, if you're one of those CEOs
link |
01:31:13.540
who's like, how much can I strip mine this company?
link |
01:31:15.300
And I think, you know, and that's what's weird about it too.
link |
01:31:17.860
Like the CEO is the founder.
link |
01:31:19.380
It's the same guy.
link |
01:31:20.300
Yeah.
link |
01:31:21.120
I mean, I still think Jensen's a great guy.
link |
01:31:22.340
He is great.
link |
01:31:23.300
Why do this?
link |
01:31:25.180
You have a choice.
link |
01:31:26.660
You have a choice right now.
link |
01:31:27.940
Are you trying to cash out?
link |
01:31:28.820
Are you trying to buy a yacht?
link |
01:31:30.660
If you are, fine.
link |
01:31:32.100
But if you're trying to be
link |
01:31:34.220
the next huge semiconductor company, sell chips.
link |
01:31:37.240
Well, the interesting thing about Jensen
link |
01:31:40.140
is he is a big vision guy.
link |
01:31:42.060
So he has a plan like for 50 years down the road.
link |
01:31:48.700
So it makes me wonder like.
link |
01:31:50.500
How does price gouging fit into it?
link |
01:31:51.780
Yeah, how does that, like it's,
link |
01:31:54.000
it doesn't seem to make sense as a plan.
link |
01:31:57.060
I worry that he's listening to the wrong people.
link |
01:31:59.260
Yeah, that's the sense I have too sometimes.
link |
01:32:02.540
Because I, despite everything, I think NVIDIA
link |
01:32:07.940
is an incredible company.
link |
01:32:09.020
Well, one, so I'm deeply grateful to NVIDIA
link |
01:32:12.420
for the products they've created in the past.
link |
01:32:13.740
Me too.
link |
01:32:14.580
Right?
link |
01:32:15.400
And so.
link |
01:32:16.240
The 1080 Ti was a great GPU.
link |
01:32:18.000
Still have a lot of them.
link |
01:32:18.840
Still is, yeah.
link |
01:32:21.840
But at the same time, it just feels like,
link |
01:32:26.860
feels like you don't want to put all your stock in NVIDIA.
link |
01:32:29.380
And so like Elon is doing, what Tesla is doing
link |
01:32:32.940
with Autopilot and Dojo is the Apple way is,
link |
01:32:37.260
because they're not going to share Dojo with George Hotz.
link |
01:32:40.300
I know.
link |
01:32:42.340
They should sell that chip.
link |
01:32:43.780
Oh, they should sell that.
link |
01:32:44.700
Even their accelerator.
link |
01:32:46.400
The accelerator that's in all the cars, the 30 watt one.
link |
01:32:49.060
Sell it, why not?
link |
01:32:51.580
So open it up.
link |
01:32:52.700
Like make, why does Tesla have to be a car company?
link |
01:32:55.820
Well, if you sell the chip, here's what you get.
link |
01:32:58.060
Yeah.
link |
01:32:59.020
Make some money off the chips.
link |
01:33:00.260
It doesn't take away from your chip.
link |
01:33:02.080
You're going to make some money, free money.
link |
01:33:03.860
And also the world is going to build an ecosystem
link |
01:33:07.380
of tooling for you.
link |
01:33:09.020
Right?
link |
01:33:09.860
You're not going to have to fix the bug in your tanh layer.
link |
01:33:12.860
Someone else already did.
link |
01:33:15.140
Well, the question, that's an interesting question.
link |
01:33:16.780
I mean, that's the question Steve Jobs asked.
link |
01:33:18.740
That's the question Elon Musk is perhaps asking is,
link |
01:33:24.940
do you want Tesla stuff inside other vehicles?
link |
01:33:28.060
Inside, potentially inside like an iRobot vacuum cleaner.
link |
01:33:32.620
Yeah.
link |
01:33:34.860
I think you should decide where your advantages are.
link |
01:33:37.160
I'm not saying Tesla should start selling battery packs
link |
01:33:39.260
to automakers.
link |
01:33:40.380
Because battery packs to automakers,
link |
01:33:41.720
they are straight up in competition with you.
link |
01:33:43.660
If I were Tesla, I'd keep the battery technology totally.
link |
01:33:46.060
Yeah.
link |
01:33:46.900
As far as we make batteries.
link |
01:33:47.940
But the thing about the Tesla TPU is anybody can build that.
link |
01:33:53.160
It's just a question of, you know,
link |
01:33:54.600
are you willing to spend the money?
link |
01:33:57.460
It could be a huge source of revenue potentially.
link |
01:34:00.220
Are you willing to spend a hundred million dollars?
link |
01:34:02.420
Anyone can build it.
link |
01:34:03.660
And someone will.
link |
01:34:04.680
And a bunch of companies now are starting
link |
01:34:06.680
trying to build AI accelerators.
link |
01:34:08.020
Somebody is going to get the idea right.
link |
01:34:10.200
And yeah, hopefully they don't get greedy
link |
01:34:13.700
because they'll just lose to the next guy who finally,
link |
01:34:15.780
and then eventually the Chinese are going to make knockoff
link |
01:34:17.540
and video chips and that's.
link |
01:34:19.580
From your perspective,
link |
01:34:20.400
I don't know if you're also paying attention
link |
01:34:21.820
to stay on Tesla for a moment.
link |
01:34:24.180
So, Elon Musk has talked about a complete rewrite
link |
01:34:28.820
of the neural net that they're using.
link |
01:34:31.700
That seems to, again, I'm half paying attention,
link |
01:34:34.800
but it seems to involve basically a kind of integration
link |
01:34:39.000
of all the sensors to where it's a four dimensional view.
link |
01:34:44.180
You know, you have a 3D model of the world over time.
link |
01:34:47.540
And then you can, I think it's done both for the,
link |
01:34:52.100
for the actually, you know,
link |
01:34:53.280
so the neural network is able to,
link |
01:34:55.260
in a more holistic way,
link |
01:34:56.920
deal with the world and make predictions and so on,
link |
01:34:59.340
but also to make the annotation task more, you know, easier.
link |
01:35:04.780
Like you can annotate the world in one place
link |
01:35:08.180
and then kind of distribute itself across the sensors
link |
01:35:10.500
and across a different,
link |
01:35:12.860
like the hundreds of tasks that are involved
link |
01:35:15.180
in the HydraNet.
link |
01:35:16.540
What are your thoughts about this rewrite?
link |
01:35:19.140
Is it just like some details that are kind of obvious
link |
01:35:22.420
that are steps that should be taken,
link |
01:35:24.060
or is there something fundamental
link |
01:35:26.100
that could challenge your idea
link |
01:35:27.560
that end to end is the right solution?
link |
01:35:31.120
We're in the middle of a big rewrite now as well.
link |
01:35:33.160
We haven't shipped a new model in a bit.
link |
01:35:34.860
Of what kind?
link |
01:35:36.400
We're going from 2D to 3D.
link |
01:35:38.240
Right now, all our stuff, like for example,
link |
01:35:39.740
when the car pitches back,
link |
01:35:40.980
the lane lines also pitch back
link |
01:35:43.020
because we're assuming the flat world hypothesis.
link |
01:35:47.160
The new models do not do this.
link |
01:35:48.420
The new models output everything in 3D.
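(For context on the flat world hypothesis: a 2D model's lane points get lifted to 3D by intersecting camera rays with an assumed flat road, so unmodeled pitch bends the result. A rough sketch of that projection; the camera height, focal length, and pinhole setup are illustrative, not openpilot's actual geometry.)

    import numpy as np

    # Lift an image point onto the road by intersecting its camera ray
    # with an assumed flat ground plane. Camera height and focal length
    # are made-up illustrative values.

    CAM_HEIGHT = 1.2  # meters above the road
    FOCAL = 910.0     # focal length in pixels

    def image_to_ground(u_px, v_px, pitch_rad=0.0):
        """Pixel offset from image center -> 3D point on the assumed road."""
        ray = np.array([u_px / FOCAL, v_px / FOCAL, 1.0])  # x right, y down, z forward
        c, s = np.cos(pitch_rad), np.sin(pitch_rad)
        ray = np.array([ray[0],
                        c * ray[1] - s * ray[2],  # rotate the ray about x (pitch)
                        s * ray[1] + c * ray[2]])
        t = CAM_HEIGHT / ray[1]  # scale until the ray drops down to the road
        return ray * t

    # Same pixel; the car pitches back ~2 degrees but the model still
    # assumes a flat world, and the lane point jumps from ~24m to ~83m:
    print(image_to_ground(0.0, 45.0))                   # ~[0, 1.2, 24.3]
    print(image_to_ground(0.0, 45.0, np.radians(2.0)))  # ~[0, 1.2, 82.7]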
link |
01:35:50.420
But there's still no annotation.
link |
01:35:53.580
So the 3D is, it's more about the output.
link |
01:35:56.500
Yeah.
link |
01:35:57.340
We have Zs in everything.
link |
01:36:00.100
We've...
link |
01:36:00.940
Zs.
link |
01:36:01.760
Yeah.
link |
01:36:02.600
We added Zs.
link |
01:36:04.260
We unified a lot of stuff as well.
link |
01:36:06.620
We switched from TensorFlow to PyTorch.
link |
01:36:10.720
My understanding of what Tesla's thing is,
link |
01:36:13.740
is that their annotator now annotates
link |
01:36:15.660
across the time dimension.
link |
01:36:16.960
Mm hmm.
link |
01:36:19.980
I mean, cute.
link |
01:36:22.000
Why are you building an annotator?
link |
01:36:24.400
I find their entire pipeline.
link |
01:36:28.280
I find your vision, I mean,
link |
01:36:30.560
the vision of end to end very compelling,
link |
01:36:32.860
but I also like the engineering of the data engine
link |
01:36:35.880
that they've created.
link |
01:36:37.400
In terms of supervised learning pipelines,
link |
01:36:41.560
that thing is damn impressive.
link |
01:36:43.560
You're basically, the idea is that you have
link |
01:36:47.480
hundreds of thousands of people
link |
01:36:49.280
that are doing data collection for you
link |
01:36:51.200
by doing their experience.
link |
01:36:52.400
So that's kind of similar to the Comma AI model.
link |
01:36:55.220
And you're able to mine that data
link |
01:36:59.580
based on the kind of edge cases you need.
link |
01:37:02.940
I think it's harder to do in the end to end learning.
link |
01:37:07.360
The mining of the right edge cases.
link |
01:37:09.520
Like that's where feature engineering
link |
01:37:11.440
is actually really powerful
link |
01:37:14.120
because like us humans are able to do
link |
01:37:17.280
this kind of mining a little better.
link |
01:37:19.800
But yeah, there's obvious, as we know,
link |
01:37:21.980
there's obvious constraints and limitations to that idea.
link |
01:37:25.880
Karpathy just tweeted, he's like,
link |
01:37:28.280
you get really interesting insights
link |
01:37:29.640
if you sort your validation set by loss
link |
01:37:33.720
and look at the highest loss examples.
link |
01:37:36.400
Yeah.
link |
01:37:37.560
So yeah, I mean, you can do,
link |
01:37:39.180
we have a little data engine like thing.
link |
01:37:42.040
We're training a SegNet.
link |
01:37:43.560
I know it's not fancy.
link |
01:37:44.560
It's just like, okay, train the new SegNet,
link |
01:37:48.280
run it on 100,000 images
link |
01:37:50.080
and now take the thousand with highest loss.
link |
01:37:52.160
Select a hundred of those by human,
link |
01:37:54.160
put those, get those ones labeled, retrain, do it again.
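(A minimal sketch of the loop described here, and of Karpathy's sort-by-loss trick above: score a pool of images with per-sample loss, take the worst, send a subset to labelers, retrain. The model, data, and labeling step are stand-ins, and the pool is smaller than the 100,000 in the text just to keep the sketch light.)

    import torch
    import torch.nn as nn

    # Stand-in model and pool; reduction="none" keeps per-sample losses.
    model = nn.Sequential(nn.Flatten(), nn.Linear(32 * 32, 2))
    loss_fn = nn.CrossEntropyLoss(reduction="none")

    pool_x = torch.randn(20_000, 32, 32)      # image pool (stand-in data)
    pool_y = torch.randint(0, 2, (20_000,))   # cheap/noisy auto-labels

    with torch.no_grad():
        losses = torch.cat([
            loss_fn(model(pool_x[i:i + 1024]), pool_y[i:i + 1024])
            for i in range(0, len(pool_x), 1024)
        ])

    worst = torch.topk(losses, k=1000).indices    # highest-loss examples
    to_label = worst[torch.randperm(1000)[:100]]  # pick 100 for the humans

    # send_to_labelers(pool_x[to_label])  # hypothetical labeling step:
    # fold the returned labels into the training set, retrain, repeat.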
link |
01:37:57.840
And so it's a much less well written data engine.
link |
01:38:01.480
And yeah, you can take these things really far
link |
01:38:03.600
and it is impressive engineering.
link |
01:38:06.600
And if you truly need supervised data for a problem,
link |
01:38:09.920
yeah, things like data engine are the high end of that.
link |
01:38:12.560
What is attention?
link |
01:38:14.960
Is a human paying attention?
link |
01:38:15.960
I mean, we're going to probably build something
link |
01:38:17.960
that looks like data engine
link |
01:38:18.940
to push our driver monitoring further.
link |
01:38:21.120
But for driving itself,
link |
01:38:22.920
you have it all annotated beautifully by what the human does.
link |
01:38:26.400
Yeah, that's interesting.
link |
01:38:27.240
I mean, that applies to driver attention as well.
link |
01:38:30.040
Do you want to detect the eyes?
link |
01:38:31.200
Do you want to detect blinking and pupil movement?
link |
01:38:33.500
Do you want to detect all the like face alignments
link |
01:38:36.680
or landmark detection and so on,
link |
01:38:38.720
and then doing kind of reasoning based on that?
link |
01:38:41.560
Or do you want to take the entirety of the face over time
link |
01:38:43.880
and do end to end?
link |
01:38:45.320
I mean, it's obvious that eventually you have to do end
link |
01:38:48.480
to end with some calibration, some fixes and so on,
link |
01:38:51.360
but it's like, I don't know when that's the right move.
link |
01:38:55.760
Even if it's end to end, there actually is,
link |
01:38:58.420
there is no kind of, you have to supervise that with humans.
link |
01:39:03.380
Whether a human is paying attention or not
link |
01:39:05.480
is a completely subjective judgment.
link |
01:39:08.560
Like you can try to like automatically do it
link |
01:39:11.040
with some stuff, but you don't have,
link |
01:39:13.160
if I record a video of a human,
link |
01:39:15.080
I don't have true annotations anywhere in that video.
link |
01:39:18.400
The only way to get them is with,
link |
01:39:21.120
you know, other humans labeling it really.
link |
01:39:22.840
Well, I don't know.
link |
01:39:26.080
If you think deeply about it,
link |
01:39:28.540
you could, you might be able to just,
link |
01:39:30.160
depending on the task,
link |
01:39:31.000
maybe you can discover self-annotating things, like,
link |
01:39:34.320
you know, you can look at like steering wheel reversals
link |
01:39:36.920
or something like that.
link |
01:39:37.760
You can discover little moments of lapse of attention.
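(One concrete version of that self-annotating idea, assuming steering reversals — quick, large sign flips in steering rate — are a usable proxy for lapses. The thresholds and the synthetic signal are illustrative.)

    import numpy as np

    # Flag candidate lapses from steering-wheel reversals: quick, large
    # sign flips in steering rate. All thresholds are illustrative.

    def reversal_times(angle_deg, hz=20.0, min_rate=15.0):
        """Times (s) where steering rate flips sign at high magnitude."""
        rate = np.gradient(angle_deg) * hz               # deg/s
        flips = np.sign(rate[:-1]) * np.sign(rate[1:]) < 0
        strong = np.abs(rate[1:]) > min_rate             # fast correction
        return np.flatnonzero(flips & strong) / hz

    t = np.arange(0, 10, 1 / 20.0)
    angle = 2 * np.sin(0.5 * t)           # smooth, attentive steering
    angle[100:105] += [0, 4, -4, 4, 0]    # jerky correction around t=5s
    print(reversal_times(angle))          # clusters near 5.0 seconds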
link |
01:39:41.200
I mean, that's where psychology comes in.
link |
01:39:44.540
Is there an indicator?
link |
01:39:45.480
cause you have so much data to look at.
link |
01:39:48.000
So you might be able to find moments when there's like,
link |
01:39:51.140
just inattention that even with smartphone,
link |
01:39:54.320
if you want to detect smartphone use,
link |
01:39:56.700
you can start to zoom in.
link |
01:39:57.880
I mean, that's the gold mine, sort of, for Comma AI.
link |
01:40:01.480
I mean, Tesla is doing this too, right?
link |
01:40:02.920
They're doing annotation based on,
link |
01:40:06.920
it's like a self supervised learning too.
link |
01:40:10.500
It's just a small part of the entire picture.
link |
01:40:13.440
That's kind of the challenge of solving a problem
link |
01:40:17.760
in machine learning.
link |
01:40:18.660
If you can discover self annotating parts of the problem,
link |
01:40:24.000
right?
link |
01:40:25.020
Our driver monitoring team is half a person right now.
link |
01:40:27.760
I would, you know, once we have,
link |
01:40:29.280
once we have two, three people on that team,
link |
01:40:33.280
I definitely want to look at self annotating stuff
link |
01:40:35.280
for attention.
link |
01:40:38.200
Let's go back for a sec to a comma and what,
link |
01:40:43.720
you know, for people who are curious to try it out,
link |
01:40:46.240
how do you install a comma in say a 2020 Toyota Corolla
link |
01:40:51.120
or like, what are the cars that are supported?
link |
01:40:53.400
What are the cars that you recommend?
link |
01:40:55.500
And what does it take?
link |
01:40:57.880
You have a few videos out, but maybe through words,
link |
01:41:00.000
can you explain what's it take to actually install a thing?
link |
01:41:02.880
So we support, I think it's 91 cars, 91 makes and models.
link |
01:41:08.080
We've gotta get to 100 this year.
link |
01:41:10.160
Nice.
link |
01:41:11.000
The, yeah, the 2020 Corolla, great choice.
link |
01:41:16.680
The 2020 Sonata, it's using the stock longitudinal.
link |
01:41:21.200
It's using just our lateral control,
link |
01:41:23.280
but it's a very refined car.
link |
01:41:25.140
Their longitudinal control is not bad at all.
link |
01:41:28.200
So yeah, Corolla, Sonata,
link |
01:41:31.720
or if you're willing to get your hands a little dirty
link |
01:41:34.240
and look in the right places on the internet,
link |
01:41:35.940
the Honda Civic is great,
link |
01:41:37.520
but you're going to have to install a modified EPS firmware
link |
01:41:40.600
in order to get a little bit more torque.
link |
01:41:42.160
And I can't help you with that.
link |
01:41:43.380
Comma does not officially endorse that,
link |
01:41:45.960
but we have been doing it.
link |
01:41:47.560
We didn't ever release it.
link |
01:41:49.800
We waited for someone else to discover it.
link |
01:41:51.440
And then, you know.
link |
01:41:52.880
And you have a Discord server where people,
link |
01:41:55.680
there's a very active developer community, I suppose.
link |
01:42:00.360
So depending on the level of experimentation
link |
01:42:04.000
you're willing to do, that's the community.
link |
01:42:07.600
If you just want to buy it and you have a supported car,
link |
01:42:11.240
it's 10 minutes to install.
link |
01:42:13.920
There's YouTube videos.
link |
01:42:15.400
It's Ikea furniture level.
link |
01:42:17.040
If you can set up a table from Ikea,
link |
01:42:19.040
you can install a Comma 2 in your supported car
link |
01:42:21.200
and it will just work.
link |
01:42:22.600
Now you're like, oh, but I want this high end feature
link |
01:42:24.920
or I want to fix this bug.
link |
01:42:26.160
Okay, well, welcome to the developer community.
link |
01:42:29.520
So what, if I wanted to,
link |
01:42:31.000
this is something I asked you offline like a few months ago.
link |
01:42:34.680
If I wanted to run my own code to,
link |
01:42:39.800
so use Comma as a platform
link |
01:42:43.420
and try to run something like OpenPilot,
link |
01:42:46.040
what does it take to do that?
link |
01:42:48.320
So there's a toggle in the settings called enable SSH.
link |
01:42:51.840
And if you toggle that, you can SSH into your device.
link |
01:42:54.600
You can modify the code.
link |
01:42:55.620
You can upload whatever code you want to it.
link |
01:42:58.260
There's a whole lot of people.
link |
01:42:59.160
So about 60% of people are running stock comma.
link |
01:43:03.040
About 40% of people are running forks.
link |
01:43:05.480
And there's a community of,
link |
01:43:07.280
there's a bunch of people who maintain these forks
link |
01:43:10.320
and these forks support different cars
link |
01:43:13.040
or they have different toggles.
link |
01:43:15.700
We try to keep away from the toggles
link |
01:43:17.360
that, like, disable driver monitoring,
link |
01:43:18.920
but there's some people might want that kind of thing
link |
01:43:21.720
and like, yeah, you can, it's your car.
link |
01:43:24.560
I'm not here to tell you.
link |
01:43:29.400
We have some, we ban,
link |
01:43:31.080
if you're trying to subvert safety features,
link |
01:43:32.920
you're banned from our Discord.
link |
01:43:33.760
I don't want anything to do with you,
link |
01:43:35.260
but there's some forks doing that.
link |
01:43:37.920
Got it.
link |
01:43:39.880
So you encourage responsible forking.
link |
01:43:42.880
Yeah, yeah.
link |
01:43:43.720
We encourage, some people, yeah, some people,
link |
01:43:46.080
like there's forks that will do,
link |
01:43:48.160
some people just like having a lot of readouts on the UI,
link |
01:43:52.040
like a lot of like flashing numbers.
link |
01:43:53.440
So there's forks that do that.
link |
01:43:55.120
Some people don't like the fact that it disengages
link |
01:43:57.200
when you press the gas pedal.
link |
01:43:58.320
There's forks that disable that.
link |
01:44:00.480
Got it.
link |
01:44:01.320
Now the stock experience is what like,
link |
01:44:04.920
so it does both lane keeping
link |
01:44:06.240
and longitudinal control all together.
link |
01:44:08.960
So it's not separate like it is in autopilot.
link |
01:44:11.020
No, so, okay.
link |
01:44:12.520
Some cars we use the stock longitudinal control.
link |
01:44:15.040
We don't do the longitudinal control in all the cars.
link |
01:44:17.400
Some cars, the ACCs are pretty good in the cars.
link |
01:44:19.560
It's the lane keep that's atrocious in anything
link |
01:44:21.360
except for autopilot and super cruise.
link |
01:44:23.420
But, you know, you just turn it on and it works.
link |
01:44:27.880
What does the engagement look like?
link |
01:44:29.480
Yeah, so we have, I mean,
link |
01:44:30.960
I'm very concerned about mode confusion.
link |
01:44:32.960
I've experienced it on super cruise and autopilot
link |
01:44:36.960
where like autopilot, like autopilot disengages.
link |
01:44:39.820
I don't realize that the ACC is still on.
link |
01:44:42.400
The lead car moves slightly over
link |
01:44:44.680
and then the Tesla accelerates
link |
01:44:46.160
to like whatever my set speed is super fast.
link |
01:44:48.000
I'm like, what's going on here?
link |
01:44:51.320
We have engaged and disengaged.
link |
01:44:53.720
And this is similar to my understanding, I'm not a pilot,
link |
01:44:56.380
but my understanding is either the pilot is in control
link |
01:45:00.080
or the copilot is in control.
link |
01:45:02.080
And we have the same kind of transition system.
link |
01:45:05.120
Either open pilot is engaged or open pilot is disengaged.
link |
01:45:08.620
Engage with cruise control,
link |
01:45:10.160
disengage with either gas, brake, or cancel.
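(That pilot-or-copilot transition model is simple enough to write down. A toy version; the event names are made up, and the real openpilot logic is richer.)

    from enum import Enum, auto

    # Toy engaged/disengaged transition model; event names are made up.
    class Event(Enum):
        CRUISE_ENGAGE = auto()  # set/resume on the stock cruise stalk
        GAS = auto()
        BRAKE = auto()
        CANCEL = auto()

    def step(engaged: bool, event: Event) -> bool:
        if event is Event.CRUISE_ENGAGE:
            return True                   # pilot takes over
        if event in (Event.GAS, Event.BRAKE, Event.CANCEL):
            return False                  # any override hands back control
        return engaged                    # otherwise hold the current mode

    state = False
    for e in (Event.CRUISE_ENGAGE, Event.GAS, Event.CRUISE_ENGAGE, Event.BRAKE):
        state = step(state, e)
        print(e.name, "->", "ENGAGED" if state else "DISENGAGED")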
link |
01:45:13.320
Let's talk about money.
link |
01:45:14.560
What's the business strategy for Comma?
link |
01:45:17.360
Profitable.
link |
01:45:18.840
Well, so you're.
link |
01:45:19.680
We did it.
link |
01:45:20.520
So congratulations.
link |
01:45:23.200
What, so basically selling,
link |
01:45:25.760
so we should say Comma costs a thousand bucks, the Comma two?
link |
01:45:29.960
200 for the interface to the car as well.
link |
01:45:31.640
It's $1,200, I'll say that.
link |
01:45:34.360
Nobody's usually upfront like this.
link |
01:45:36.400
Yeah, you gotta add the tack-on, right?
link |
01:45:38.200
Yeah.
link |
01:45:39.040
I love it.
link |
01:45:39.860
I'm not gonna lie to you.
link |
01:45:41.080
Trust me, it will add $1,200 of value to your life.
link |
01:45:43.840
Yes, it's still super cheap.
link |
01:45:45.560
30 days, no questions asked, money back guarantee,
link |
01:45:47.880
and prices are only going up.
link |
01:45:50.400
If there ever is future hardware,
link |
01:45:52.320
it could cost a lot more than $1,200.
link |
01:45:53.840
So Comma three is in the works.
link |
01:45:56.960
It could be.
link |
01:45:57.800
All I will say is future hardware
link |
01:45:59.640
is going to cost a lot more than the current hardware.
link |
01:46:02.900
Yeah, the people that use,
link |
01:46:05.260
the people I've spoken with that use Comma,
link |
01:46:07.880
that use open pilot,
link |
01:46:10.320
first of all, they use it a lot.
link |
01:46:12.120
So people that use it, they fall in love with it.
link |
01:46:14.400
Oh, our retention rate is insane.
link |
01:46:16.840
It's a good sign.
link |
01:46:17.740
Yeah.
link |
01:46:18.580
It's a really good sign.
link |
01:46:19.680
70% of Comma two buyers are daily active users.
link |
01:46:23.720
Yeah, it's amazing.
link |
01:46:27.760
Oh, also, we don't plan on stopping selling the Comma two.
link |
01:46:30.520
Like it's, you know.
link |
01:46:31.960
So whatever you create that's beyond Comma two,
link |
01:46:36.600
it would be potentially a phase shift.
link |
01:46:40.760
Like it's so much better that,
link |
01:46:42.840
like you could use Comma two
link |
01:46:44.200
and you can use Comma whatever.
link |
01:46:45.760
Depends what you want.
link |
01:46:46.600
It's 3.41, 42.
link |
01:46:48.320
Yeah.
link |
01:46:49.160
You know, autopilot hardware one versus hardware two.
link |
01:46:52.200
The Comma two is kind of like hardware one.
link |
01:46:53.600
Got it, got it.
link |
01:46:54.440
You can still use both.
link |
01:46:55.260
Got it, got it.
link |
01:46:56.320
I think I heard you talk about retention rate
link |
01:46:58.000
with the VR headsets, that the average use is just once.
link |
01:47:01.240
Yeah.
link |
01:47:02.080
Just fast.
link |
01:47:02.900
I mean, it's such a fascinating way
link |
01:47:03.880
to think about technology.
link |
01:47:05.760
And this is a really, really good sign.
link |
01:47:07.420
And the other thing that people say about Comma
link |
01:47:09.000
is like they can't believe they're getting this for 1,000 bucks.
link |
01:47:12.060
Right?
link |
01:47:12.900
It seems like some kind of steal.
link |
01:47:17.020
So, but in terms of like longterm business strategies
link |
01:47:20.040
that basically to put,
link |
01:47:21.640
so it's currently in like a thousand plus cars.
link |
01:47:27.560
1,200.
link |
01:47:28.520
More, more.
link |
01:47:30.560
So yeah, dailies is about, dailies is about 2,000.
link |
01:47:35.560
Weeklies is about 2,500, monthlies is over 3,000.
link |
01:47:38.960
Wow.
link |
01:47:39.800
We've grown a lot since we last talked.
link |
01:47:42.120
Is the goal, like can we talk crazy for a second?
link |
01:47:44.800
I mean, is the goal to overtake Tesla?
link |
01:47:48.620
Let's talk, okay, so.
link |
01:47:49.960
I mean, Android did overtake iOS.
link |
01:47:51.480
That's exactly it, right?
link |
01:47:52.520
So they did it.
link |
01:47:55.360
I actually don't know the timeline of that one.
link |
01:47:57.720
But let's talk, because everything is in alpha now.
link |
01:48:02.160
The autopilot you could argue is in alpha
link |
01:48:03.960
in terms of towards the big mission
link |
01:48:05.840
of autonomous driving, right?
link |
01:48:07.680
And so what, yeah, is your goal to overtake
link |
01:48:11.600
millions of cars essentially?
link |
01:48:13.920
Of course.
link |
01:48:15.520
Where would it stop?
link |
01:48:16.760
Like it's open source software.
link |
01:48:18.000
It might not be millions of cars
link |
01:48:19.280
with a piece of Comma hardware, but yeah.
link |
01:48:21.360
I think open pilot at some point
link |
01:48:24.320
will cross over autopilot in users,
link |
01:48:26.920
just like Android crossed over iOS.
link |
01:48:29.200
How does Google make money from Android?
link |
01:48:31.240
It's complicated.
link |
01:48:34.840
Their own devices make money.
link |
01:48:37.400
Google, Google makes money
link |
01:48:39.480
by just kind of having you on the internet.
link |
01:48:42.240
Yes.
link |
01:48:43.080
Google search is built in, Gmail is built in.
link |
01:48:45.620
Android is just a shill
link |
01:48:46.540
for the rest of Google's ecosystem.
link |
01:48:48.200
Yeah, but the problem is, Android
link |
01:48:50.680
is a brilliant thing.
link |
01:48:52.440
I mean, Android arguably changed the world.
link |
01:48:55.080
So there you go.
link |
01:48:56.440
That's, you can feel good ethically speaking.
link |
01:49:00.800
But as a business strategy, it's questionable.
link |
01:49:04.320
Or sell hardware.
link |
01:49:05.720
Sell hardware.
link |
01:49:06.560
I mean, it took Google a long time to come around to it,
link |
01:49:08.120
but they are now making money on the Pixel.
link |
01:49:10.000
You're not about money, you're more about winning.
link |
01:49:13.240
Yeah, of course.
link |
01:49:14.160
No, but if only 10% of open pilot devices
link |
01:49:18.320
come from Comma AI.
link |
01:49:19.920
They still make a lot.
link |
01:49:20.820
That is still, yes.
link |
01:49:21.660
That is a ton of money for our company.
link |
01:49:22.840
But can't somebody create a better Comma using open pilot?
link |
01:49:27.120
Or are you basically saying, well, I'll compete with them?
link |
01:49:28.840
Well, I'll compete with you.
link |
01:49:29.680
Can you create a better Android phone than the Google Pixel?
link |
01:49:32.280
Right.
link |
01:49:32.800
I mean, you can, but like, you know.
link |
01:49:34.680
I love that.
link |
01:49:35.520
So you're confident, like, you know
link |
01:49:37.360
what the hell you're doing.
link |
01:49:38.480
Yeah.
link |
01:49:40.040
It's confidence and merit.
link |
01:49:43.520
I mean, our money comes from, we're
link |
01:49:44.960
a consumer electronics company.
link |
01:49:46.160
Yeah.
link |
01:49:46.660
And put it this way.
link |
01:49:48.040
So we sold like 3,000 Comma twos.
link |
01:49:51.600
2,500 right now.
link |
01:49:54.080
And like, OK, we're probably going
link |
01:49:59.720
to sell 10,000 units next year.
link |
01:50:01.920
10,000 units, even just $1,000 a unit, OK,
link |
01:50:04.520
we're at 10 million in revenue.
link |
01:50:09.400
Get that up to 100,000, maybe double the price of the unit.
link |
01:50:12.100
Now we're talking like 200 million revenue.
link |
01:50:13.560
We're talking like serious.
link |
01:50:14.440
Yeah, actually making money.
link |
01:50:15.840
One of the rare semi autonomous or autonomous vehicle companies
link |
01:50:19.360
that are actually making money.
link |
01:50:21.080
Yeah.
link |
01:50:22.600
You know, if you look at a model,
link |
01:50:24.880
and we were just talking about this yesterday.
link |
01:50:26.680
If you look at a model, and like you're AB testing your model,
link |
01:50:29.760
and if you're one branch of the AB test,
link |
01:50:32.360
the losses go down very fast in the first five epochs.
link |
01:50:35.320
That model is probably going to converge
link |
01:50:37.520
to something considerably better than the one
link |
01:50:39.360
where the losses are going down slower.
link |
01:50:41.360
Why do people think this is going to stop?
link |
01:50:43.000
Why do people think one day there's
link |
01:50:44.500
going to be a great like, well, Waymo's eventually
link |
01:50:46.840
going to surpass you guys?
link |
01:50:49.600
Well, they're not.
link |
01:50:52.000
Do you see like a world where like a Tesla or a car
link |
01:50:55.800
like a Tesla would be able to basically press a button
link |
01:50:59.160
and you like switch to open pilot?
link |
01:51:01.920
You know, you load in.
link |
01:51:04.480
No, so I think so first off, I think
link |
01:51:06.840
that we may surpass Tesla in terms of users.
link |
01:51:10.560
I do not think we're going to surpass Tesla ever
link |
01:51:12.520
in terms of revenue.
link |
01:51:13.520
I think Tesla can capture a lot more revenue per user
link |
01:51:16.380
than we can.
link |
01:51:17.320
But this mimics the Android iOS model exactly.
link |
01:51:20.520
There may be more Android devices,
link |
01:51:22.000
but there's a lot more iPhones than Google Pixels.
link |
01:51:24.320
So I think there'll be a lot more Tesla cars sold
link |
01:51:26.360
than pieces of Comma hardware.
link |
01:51:30.280
And then as far as a Tesla owner being
link |
01:51:34.420
able to switch to open pilot, do iPhones run Android?
link |
01:51:40.920
No, but it doesn't make sense.
link |
01:51:42.400
You can if you really want to do it,
link |
01:51:43.480
but it doesn't really make sense.
link |
01:51:44.440
Like it's not.
link |
01:51:45.320
It doesn't make sense.
link |
01:51:46.240
Who cares?
link |
01:51:46.740
What about if a large company like automakers, Ford, GM,
link |
01:51:51.640
Toyota came to George Hotz?
link |
01:51:53.720
Or on the tech space, Amazon, Facebook, Google
link |
01:51:58.000
came with a large pile of cash?
link |
01:52:01.080
Would you consider being purchased?
link |
01:52:07.360
Do you see that as a one possible?
link |
01:52:10.500
Not seriously, no.
link |
01:52:12.360
I would probably see how much shit they'll entertain from me.
link |
01:52:19.680
And if they're willing to jump through a bunch of my hoops,
link |
01:52:22.080
then maybe.
link |
01:52:22.960
But no, not the way that M&A works today.
link |
01:52:25.200
I mean, we've been approached.
link |
01:52:26.520
And I laugh in these people's faces.
link |
01:52:28.000
I'm like, are you kidding?
link |
01:52:31.000
Yeah.
link |
01:52:31.600
Because it's so demeaning.
link |
01:52:33.680
The M&A people are so demeaning to companies.
link |
01:52:36.960
They treat the startup world as their innovation ecosystem.
link |
01:52:41.340
And they think that I'm cool with going along with that,
link |
01:52:43.640
so I can have some of their scam fake Fed dollars.
link |
01:52:46.440
Fed coin.
link |
01:52:47.680
What am I going to do with more Fed coin?
link |
01:52:49.680
Fed coin.
link |
01:52:50.320
Fed coin, man.
link |
01:52:51.400
I love that.
link |
01:52:52.120
So that's the cool thing about podcasting,
link |
01:52:54.040
actually, is people criticize.
link |
01:52:56.200
I don't know if you're familiar with Spotify giving Joe Rogan
link |
01:53:00.560
$100 million.
link |
01:53:01.920
I don't know about that.
link |
01:53:03.720
And they respect, despite all the shit
link |
01:53:08.200
that people are talking about Spotify,
link |
01:53:11.440
people understand that podcasters like Joe Rogan
link |
01:53:15.300
know what the hell they're doing.
link |
01:53:17.200
So they give them money and say, just do what you do.
link |
01:53:21.160
And the equivalent for you would be like,
link |
01:53:25.440
George, do what the hell you do, because you're good at it.
link |
01:53:28.440
Try not to murder too many people.
link |
01:53:31.400
There's some kind of common sense things,
link |
01:53:33.480
like just don't go on a weird rampage of it.
link |
01:53:37.400
Yeah.
link |
01:53:38.080
It comes down to what companies I could respect, right?
link |
01:53:43.400
Could I respect GM?
link |
01:53:44.440
Never.
link |
01:53:46.520
No, I couldn't.
link |
01:53:47.480
I mean, could I respect a Hyundai?
link |
01:53:50.840
More so.
link |
01:53:52.520
That's a lot closer.
link |
01:53:53.560
Toyota?
link |
01:53:54.720
What's your?
link |
01:53:55.840
Nah.
link |
01:53:56.840
Nah.
link |
01:53:57.520
Korean is the way.
link |
01:53:59.400
I think that the Japanese, the Germans, the US,
link |
01:54:02.440
they all think they're too great.
link |
01:54:05.720
What about the tech companies?
link |
01:54:07.520
Apple?
link |
01:54:08.480
Apple is, of the tech companies that I could respect,
link |
01:54:11.000
Apple's the closest.
link |
01:54:12.120
Yeah.
link |
01:54:12.600
I mean, I could never.
link |
01:54:13.560
It would be ironic.
link |
01:54:14.440
It would be ironic if Comma AI is acquired by Apple.
link |
01:54:19.840
I mean, Facebook, look, I quit Facebook 10 years ago
link |
01:54:21.840
because I didn't respect the business model.
link |
01:54:24.320
Google has declined so fast in the last five years.
link |
01:54:28.480
What are your thoughts about Waymo and its present
link |
01:54:32.080
and its future?
link |
01:54:33.200
Let me start by saying something nice, which is I've
link |
01:54:39.240
visited them a few times and have ridden in their cars.
link |
01:54:45.240
And the engineering that they're doing,
link |
01:54:49.760
both the research and the actual development
link |
01:54:51.720
and the engineering they're doing
link |
01:54:53.360
and the scale they're actually achieving
link |
01:54:55.160
by doing it all themselves is really impressive.
link |
01:54:58.400
And the balance of safety and innovation.
link |
01:55:01.480
And the cars work really well for the routes they drive.
link |
01:55:07.520
It drives fast, which was very surprising to me.
link |
01:55:10.960
It drives the speed limit or faster than the speed limit.
link |
01:55:14.800
It goes.
link |
01:55:16.120
And it works really damn well.
link |
01:55:17.800
And the interface is nice.
link |
01:55:19.160
In Chandler, Arizona, yeah.
link |
01:55:20.360
Yeah, in Chandler, Arizona, very specific environment.
link |
01:55:22.560
So it gives me enough material in my mind
link |
01:55:27.360
to push back against the madmen of the world,
link |
01:55:30.360
like George Hotz, to be like, because you kind of imply
link |
01:55:36.040
there's zero probability they're going to win.
link |
01:55:38.760
And after I've used, after I've ridden in it, to me,
link |
01:55:43.560
it's not zero.
link |
01:55:44.440
Oh, it's not for technology reasons.
link |
01:55:46.760
Bureaucracy?
link |
01:55:48.000
No, it's worse than that.
link |
01:55:49.520
It's actually for product reasons, I think.
link |
01:55:51.840
Oh, you think they're just not capable of creating
link |
01:55:53.840
an amazing product?
link |
01:55:55.600
No, I think that the product that they're building
link |
01:55:58.280
doesn't make sense.
link |
01:56:01.040
So a few things.
link |
01:56:03.320
You say the Waymos are fast.
link |
01:56:05.840
Benchmark a Waymo against a competent Uber driver.
link |
01:56:09.040
Right.
link |
01:56:09.600
Right?
link |
01:56:10.080
The Uber driver's faster.
link |
01:56:11.080
It's not even about speed.
link |
01:56:12.200
It's the thing you said.
link |
01:56:13.240
It's about the experience of being stuck at a stop sign
link |
01:56:16.280
because pedestrians are crossing nonstop.
link |
01:56:20.120
I like when my Uber driver doesn't come to a full stop
link |
01:56:22.160
at the stop sign.
link |
01:56:22.960
Yeah.
link |
01:56:23.520
You know?
link |
01:56:24.440
And so let's say the Waymos are 20% slower than an Uber.
link |
01:56:31.320
Right?
link |
01:56:33.000
You can argue that they're going to be cheaper.
link |
01:56:35.160
And I argue that users already have the choice
link |
01:56:37.640
to trade off money for speed.
link |
01:56:39.360
It's called UberPool.
link |
01:56:42.280
I think it's like 15% of rides are UberPools.
link |
01:56:45.560
Right?
link |
01:56:46.040
Users are not willing to trade off money for speed.
link |
01:56:49.480
So the whole product that they're building
link |
01:56:52.320
is not going to be competitive with traditional ride sharing
link |
01:56:56.200
networks.
link |
01:56:56.720
Right.
link |
01:56:59.560
And also, whether there's profit to be made
link |
01:57:04.360
depends entirely on one company having a monopoly.
link |
01:57:07.360
I think that the level four autonomous ride sharing
link |
01:57:11.200
vehicles market is going to look a lot like the scooter market
link |
01:57:14.800
if the technology even comes to exist, which I question.
link |
01:57:18.680
Who's doing well in that market?
link |
01:57:20.400
It's a race to the bottom.
link |
01:57:22.320
Well, it could be closer like an Uber and a Lyft,
link |
01:57:25.680
where it's just one or two players.
link |
01:57:28.000
Well, the scooter people have given up
link |
01:57:31.160
trying to market scooters as a practical means
link |
01:57:34.760
of transportation.
link |
01:57:35.480
And they're just like, they're super fun to ride.
link |
01:57:37.640
Look at Wheels.
link |
01:57:38.360
I love those things.
link |
01:57:39.160
And they're great on that front.
link |
01:57:40.560
Yeah.
link |
01:57:41.040
But from an actual transportation product
link |
01:57:43.840
perspective, I do not think scooters are viable.
link |
01:57:46.240
And I do not think level four autonomous cars are viable.
link |
01:57:49.200
If you, let's play a fun experiment.
link |
01:57:51.560
If you ran, let's do a Tesla and let's do Waymo.
link |
01:57:56.960
If Elon Musk took a vacation for a year, he just said,
link |
01:58:01.880
screw it, I'm going to go live on an island, no electronics.
link |
01:58:05.400
And the board decides that we need to find somebody
link |
01:58:07.520
to run the company.
link |
01:58:09.160
And they did decide that you should run the company
link |
01:58:11.520
for a year.
link |
01:58:12.240
How do you run Tesla differently?
link |
01:58:14.920
I wouldn't change much.
link |
01:58:16.240
Do you think they're on the right track?
link |
01:58:17.920
I wouldn't change.
link |
01:58:18.680
I mean, I'd have some minor changes.
link |
01:58:21.640
But even my debate with Tesla about end
link |
01:58:25.520
to end versus SegNets, that's just software.
link |
01:58:29.120
Who cares?
link |
01:58:30.720
It's not like you're doing something terrible with SegNets.
link |
01:58:33.920
You're probably building something that's
link |
01:58:35.600
at least going to help you debug the end to end system a lot.
link |
01:58:39.440
It's very easy to transition from what they have
link |
01:58:42.200
to an end to end kind of thing.
link |
01:58:45.840
And then I presume you would, in the Model Y
link |
01:58:50.520
or maybe in the Model 3, start adding driver
link |
01:58:52.480
sensing with infrared.
link |
01:58:53.600
Yes, I would add infrared camera, infrared lights
link |
01:58:58.000
right away to those cars.
link |
01:59:02.120
And start collecting that data and do all that kind of stuff,
link |
01:59:04.920
yeah.
link |
01:59:05.440
Very much.
link |
01:59:06.080
I think they're already kind of doing it.
link |
01:59:07.780
It's an incredibly minor change.
link |
01:59:09.320
If I actually were CEO of Tesla, first off,
link |
01:59:11.160
I'd be horrified that I wouldn't be able to do
link |
01:59:13.080
a better job than Elon.
link |
01:59:14.000
And then I would try to understand
link |
01:59:16.480
the way he's done things before.
link |
01:59:17.760
You would also have to take over his Twitter.
link |
01:59:20.720
I don't tweet.
link |
01:59:22.200
Yeah, what's your Twitter situation?
link |
01:59:24.240
Why are you so quiet on Twitter?
link |
01:59:25.880
Since you do Comma, like, what's your social network presence like?
link |
01:59:30.280
Because on Instagram, you do live streams.
link |
01:59:34.280
You understand the music of the internet,
link |
01:59:39.240
but you don't always fully engage in it.
link |
01:59:41.520
You're part time.
link |
01:59:42.800
Well, I used to have a Twitter.
link |
01:59:44.160
Yeah, I mean, Instagram is a pretty place.
link |
01:59:47.600
Instagram is a beautiful place.
link |
01:59:49.120
It glorifies beauty.
link |
01:59:49.960
I like Instagram's values as a network.
link |
01:59:53.360
Twitter glorifies conflict, glorifies shots,
link |
02:00:00.320
taking shots at people.
link |
02:00:01.400
And it's like, you know, Twitter and Donald Trump
link |
02:00:05.280
are perfectly, they're perfect for each other.
link |
02:00:08.440
So Tesla's on the right track in your view.
link |
02:00:12.800
OK, so let's try, let's really try this experiment.
link |
02:00:16.600
If you ran Waymo, let's say they're,
link |
02:00:19.720
I don't know if you agree, but they
link |
02:00:21.160
seem to be at the head of the pack of the kind of,
link |
02:00:25.640
what would you call that approach?
link |
02:00:27.040
Like it's not necessarily lidar based
link |
02:00:29.000
because it's not about lidar.
link |
02:00:30.320
Level four robotaxi.
link |
02:00:31.520
Level four robotaxi, all in before making any revenue.
link |
02:00:37.040
So they're probably at the head of the pack.
link |
02:00:38.720
If you were said, hey, George, can you
link |
02:00:42.840
please run this company for a year, how would you change it?
link |
02:00:47.160
I would go.
link |
02:00:47.760
I would get Anthony Levandowski out of jail,
link |
02:00:49.800
and I would put him in charge of the company.
link |
02:00:56.400
Well, let's try to break that apart.
link |
02:00:58.200
Why do you want to destroy the company by doing that?
link |
02:01:01.600
Or do you mean you like renegade style thinking that pushes,
link |
02:01:09.400
that throws away bureaucracy and goes
link |
02:01:11.560
to first principle thinking?
link |
02:01:12.840
What do you mean by that?
link |
02:01:14.080
I think Anthony Levandowski is a genius,
link |
02:01:16.040
and I think he would come up with a much better idea of what
link |
02:01:19.640
to do with Waymo than me.
link |
02:01:22.360
So you mean that unironically.
link |
02:01:23.840
He is a genius.
link |
02:01:24.800
Oh, yes.
link |
02:01:25.400
Oh, absolutely.
link |
02:01:26.640
Without a doubt.
link |
02:01:27.600
I mean, I'm not saying there's no shortcomings,
link |
02:01:30.960
but in the interactions I've had with him, yeah.
link |
02:01:34.760
What?
link |
02:01:35.720
He's also willing to take, like, who knows
link |
02:01:38.200
what he would do with Waymo?
link |
02:01:39.400
I mean, he's also out there, like far more out there
link |
02:01:41.400
than I am.
link |
02:01:41.880
Yeah, there's big risks.
link |
02:01:43.480
What do you make of him?
link |
02:01:44.480
I was going to talk to him on this podcast,
link |
02:01:47.040
and I was going back and forth.
link |
02:01:48.880
I'm such a gullible, naive human.
link |
02:01:51.720
Like, I see the best in people.
link |
02:01:53.840
And I slowly started to realize that there
link |
02:01:56.720
might be some people out there that, like,
link |
02:02:02.400
have multiple faces to the world.
link |
02:02:05.280
They're, like, deceiving and dishonest.
link |
02:02:08.160
I still refuse to, like, I just, I trust people,
link |
02:02:13.040
and I don't care if I get hurt by it.
link |
02:02:14.640
But, like, you know, sometimes you
link |
02:02:16.520
have to be a little bit careful, especially platform
link |
02:02:18.760
wise and podcast wise.
link |
02:02:21.040
What do you, what am I supposed to think?
link |
02:02:23.160
So you think, you think he's a good person?
link |
02:02:26.400
Oh, I don't know.
link |
02:02:27.840
I don't really make moral judgments.
link |
02:02:30.000
It's difficult to.
link |
02:02:30.840
Oh, I mean this about the Waymo.
link |
02:02:32.480
I actually, I mean that whole idea very nonironically
link |
02:02:34.960
about what I would do.
link |
02:02:36.200
The problem with putting me in charge of Waymo
link |
02:02:38.160
is Waymo is already $10 billion in the hole, right?
link |
02:02:41.720
Whatever idea Waymo does, look, Comma's profitable, Comma's
link |
02:02:44.840
raised $8.1 million.
link |
02:02:46.680
That's small, you know, that's small money.
link |
02:02:48.180
Like, I can build a reasonable consumer electronics company
link |
02:02:50.840
and succeed wildly at that and still never be able to pay back
link |
02:02:54.200
Waymo's $10 billion.
link |
02:02:55.800
So I think the basic idea with Waymo, well,
link |
02:02:58.680
forget the $10 billion because they have some backing,
link |
02:03:00.880
but your basic thing is, like, what can we do
link |
02:03:04.000
to start making some money?
link |
02:03:05.640
Well, no, I mean, my bigger idea is, like,
link |
02:03:07.920
whatever the idea is that's gonna save Waymo,
link |
02:03:10.320
I don't have it.
link |
02:03:11.440
It's gonna have to be a big risk idea
link |
02:03:13.480
and I cannot think of a better person
link |
02:03:15.280
than Anthony Levandowski to do it.
link |
02:03:17.840
So that is completely what I would do as CEO of Waymo.
link |
02:03:20.240
I would call myself a transitionary CEO,
link |
02:03:22.640
do everything I can to fix that situation up.
link |
02:03:24.880
I'm gonna see.
link |
02:03:25.720
Yeah.
link |
02:03:27.920
Yeah.
link |
02:03:28.760
Because I can't do it, right?
link |
02:03:29.760
Like, I can't, I mean, I can talk about how
link |
02:03:33.440
what I really wanna do is just apologize
link |
02:03:35.260
for all those corny, you know, ad campaigns
link |
02:03:38.120
and be like, here's the real state of the technology.
link |
02:03:40.160
Yeah, that's, like, I have several criticisms.
link |
02:03:42.280
I'm a little bit more bullish on Waymo
link |
02:03:44.320
than you seem to be, but one criticism I have
link |
02:03:48.000
is it went into corny mode too early.
link |
02:03:50.880
Like, it's still a startup.
link |
02:03:52.200
It hasn't delivered on anything.
link |
02:03:53.660
So it should be, like, more renegade
link |
02:03:56.320
and show off the engineering that they're doing,
link |
02:03:59.280
which can be genuinely impressive,
link |
02:04:00.560
as opposed to doing these weird commercials
link |
02:04:02.320
of, like, your friendly car company.
link |
02:04:07.040
I mean, that's my biggest snipe at Waymo is always,
link |
02:04:10.080
that guy's a paid actor.
link |
02:04:11.440
That guy's not a Waymo user.
link |
02:04:12.800
He's a paid actor.
link |
02:04:13.640
Look here, I found his call sheet.
link |
02:04:15.360
Do kind of like what SpaceX is doing
link |
02:04:17.400
with the rocket launches.
link |
02:04:18.960
Just put the nerds up front, put the engineers up front,
link |
02:04:22.200
and just, like, show failures too, just.
link |
02:04:25.720
I love SpaceX's, yeah.
link |
02:04:27.880
Yeah, the thing that they're doing is right,
link |
02:04:29.840
and it just feels like the right.
link |
02:04:31.720
But.
link |
02:04:32.560
We're all so excited to see them succeed.
link |
02:04:34.440
Yeah.
link |
02:04:35.260
I can't wait to see when it won't fail, you know?
link |
02:04:37.320
Like, you lie to me, I want you to fail.
link |
02:04:39.400
You tell me the truth, you be honest with me,
link |
02:04:41.200
I want you to succeed.
link |
02:04:42.160
Yeah.
link |
02:04:44.120
Ah, yeah, and that requires the renegade CEO, right?
link |
02:04:50.200
I'm with you, I'm with you.
link |
02:04:51.480
I still have a little bit of faith in Waymo
link |
02:04:54.000
for the renegade CEO to step forward, but.
link |
02:04:57.860
It's not, it's not John Krafcik.
link |
02:05:00.600
Yeah, it's, you can't.
link |
02:05:02.980
It's not Chris Urmson.
link |
02:05:04.980
And those people may be very good at certain things.
link |
02:05:07.680
Yeah.
link |
02:05:08.520
But they're not renegades.
link |
02:05:10.360
Yeah, because these companies are fundamentally,
link |
02:05:12.040
even though we're talking about billion dollars,
link |
02:05:14.260
all these crazy numbers,
link |
02:05:15.640
they're still, like, early stage startups.
link |
02:05:19.300
I mean, and I just, if you are pre revenue
link |
02:05:21.840
and you've raised $10 billion,
link |
02:05:23.120
I have no idea, like, this just doesn't work.
link |
02:05:26.520
You know, it's against everything Silicon Valley stands for.
link |
02:05:28.040
Where's your minimum viable product?
link |
02:05:29.880
You know, where's your users?
link |
02:05:31.600
Where's your growth numbers?
link |
02:05:33.320
This is traditional Silicon Valley.
link |
02:05:36.280
Why do you not apply it? Do you think
link |
02:05:38.040
you're too big to fail already, like?
link |
02:05:41.760
How do you think autonomous driving will change society?
link |
02:05:45.780
So the mission is, for Comma, to solve self driving.
link |
02:05:52.520
Do you have, like, a vision of the world
link |
02:05:54.240
of how it'll be different?
link |
02:05:57.940
Is it as simple as A to B transportation?
link |
02:06:00.100
Or is there, like, cause these are robots.
link |
02:06:03.640
It's not about autonomous driving in and of itself.
link |
02:06:05.880
It's what the technology enables.
link |
02:06:09.760
It's, I think it's the coolest applied AI problem.
link |
02:06:12.200
I like it because it has a clear path to monetary value.
link |
02:06:17.720
But as far as that being the thing that changes the world,
link |
02:06:21.440
I mean, no, like, there's cute things we're doing at Comma.
link |
02:06:25.480
Like, who'd have thought you could stick a phone
link |
02:06:26.960
on the windshield and it'll drive.
link |
02:06:29.080
But like, really, the product that you're building
link |
02:06:31.160
is not something that people were not capable
link |
02:06:33.920
of imagining 50 years ago.
link |
02:06:35.760
So no, it doesn't change the world on that front.
link |
02:06:37.800
Could people have imagined the internet 50 years ago?
link |
02:06:39.480
Only true genius visionaries.
link |
02:06:42.560
Everyone could have imagined autonomous cars 50 years ago.
link |
02:06:45.120
It's like a car, but I don't drive it.
link |
02:06:47.000
See, I have this sense, and I told you, like,
link |
02:06:49.920
my long-term dream is robots
link |
02:06:55.880
with whom you have deep connections, right?
link |
02:06:59.320
And there's different trajectories towards that.
link |
02:07:03.600
And I've been thinking,
link |
02:07:04.440
so I've been thinking of launching a startup.
link |
02:07:07.060
I see autonomous vehicles
link |
02:07:09.820
as a potential trajectory to that.
link |
02:07:11.420
That's not the direction I would like to go,
link |
02:07:16.580
but I also see Tesla or even Comma AI,
link |
02:07:19.100
like, pivoting into robotics broadly defined
link |
02:07:24.900
at some stage, in the way that, like you're mentioning,
link |
02:07:27.140
nobody expected the internet to go.
link |
02:07:29.580
You know, what I say at Comma about this is,
link |
02:07:32.580
we could talk about this,
link |
02:07:33.420
but let's solve self driving cars first.
link |
02:07:35.740
You gotta stay focused on the mission.
link |
02:07:37.140
Don't, don't, don't, you're not too big to fail.
link |
02:07:39.260
For however much I think Comma's winning,
link |
02:07:41.380
like, no, no, no, no, no, you're winning
link |
02:07:43.420
when you solve level five self driving cars.
link |
02:07:45.020
And until then, you haven't won.
link |
02:07:46.780
And you know, again, you wanna be arrogant
link |
02:07:48.900
in the face of other people, great.
link |
02:07:50.620
You wanna be arrogant in the face of nature, you're an idiot.
link |
02:07:53.780
Stay mission focused, brilliantly put.
link |
02:07:56.420
Like I mentioned, thinking of launching a startup,
link |
02:07:58.660
I've been considering, actually, before COVID,
link |
02:08:01.220
I've been thinking of moving to San Francisco.
link |
02:08:03.340
Ooh, ooh, I wouldn't go there.
link |
02:08:06.020
So why is, well, and now I'm thinking
link |
02:08:09.020
about potentially Austin and we're in San Diego now.
link |
02:08:13.660
San Diego, come here.
link |
02:08:14.940
So why, what, I mean, you're such an interesting human.
link |
02:08:20.540
You've launched so many successful things.
link |
02:08:23.060
What, why San Diego?
link |
02:08:26.220
What do you recommend?
link |
02:08:27.060
Why not San Francisco?
link |
02:08:29.300
Have you thought, so in your case,
link |
02:08:31.820
San Diego with Qualcomm and Snapdragon,
link |
02:08:33.780
I mean, that's an amazing combination.
link |
02:08:36.860
But.
link |
02:08:37.700
That wasn't really why.
link |
02:08:38.540
That wasn't the why?
link |
02:08:39.500
No, I mean, Qualcomm was an afterthought.
link |
02:08:41.340
Qualcomm was, it was a nice thing to think about.
link |
02:08:42.900
It's like, you can have a tech company here.
link |
02:08:45.140
Yeah.
link |
02:08:45.980
And a good one, I mean, you know, I like Qualcomm, but.
link |
02:08:48.300
No.
link |
02:08:49.140
Well, so why is San Diego better than San Francisco?
link |
02:08:50.540
Why does San Francisco suck?
link |
02:08:51.900
Well, so, okay, so first off,
link |
02:08:53.020
we all kind of said like, we wanna stay in California.
link |
02:08:55.300
People like the ocean.
link |
02:08:57.620
You know, California, for its flaws,
link |
02:09:00.020
it's like a lot of the flaws of California
link |
02:09:02.380
are not necessarily California as a whole,
link |
02:09:03.900
and they're much more San Francisco specific.
link |
02:09:05.740
Yeah.
link |
02:09:06.700
San Francisco, so I think first tier cities in general
link |
02:09:09.860
have stopped wanting growth.
link |
02:09:13.140
Well, you have like in San Francisco, you know,
link |
02:09:15.460
the voting class always votes to not build more houses
link |
02:09:18.740
because they own all the houses.
link |
02:09:19.980
And they're like, well, you know,
link |
02:09:21.820
once people have figured out how to vote themselves
link |
02:09:23.860
more money, they're gonna do it.
link |
02:09:25.180
It is so insanely corrupt.
link |
02:09:27.900
It is not balanced at all, like political party wise,
link |
02:09:31.620
you know, it's a one party city and.
link |
02:09:34.700
For all the discussion of diversity,
link |
02:09:38.700
it's lacking real diversity of thought,
link |
02:09:42.140
of background, of approaches, of strategies, of ideas.
link |
02:09:48.540
It's kind of a strange place
link |
02:09:51.180
that it's the loudest people about diversity
link |
02:09:54.380
and the biggest lack of diversity.
link |
02:09:56.420
I mean, that's what they say, right?
link |
02:09:58.620
It's the projection.
link |
02:10:00.340
Projection, yeah.
link |
02:10:02.140
Yeah, it's interesting.
link |
02:10:02.980
And even people in Silicon Valley tell me
link |
02:10:04.500
and that's, like, high up people,
link |
02:10:07.940
everybody is like, this is a terrible place.
link |
02:10:10.100
It doesn't make sense.
link |
02:10:10.940
I mean, and coronavirus is really what killed it.
link |
02:10:13.220
San Francisco was the number one exodus
link |
02:10:17.340
during coronavirus.
link |
02:10:18.900
We still think San Diego is a good place to be.
link |
02:10:21.660
Yeah.
link |
02:10:23.460
Yeah, I mean, we'll see.
link |
02:10:24.740
We'll see what happens with California a bit longer term.
link |
02:10:29.780
Like Austin's an interesting choice.
link |
02:10:32.100
I wouldn't, I don't have really anything bad to say
link |
02:10:33.940
about Austin either,
link |
02:10:35.740
except for the extreme heat in the summer,
link |
02:10:37.940
which, but that's like very on the surface, right?
link |
02:10:40.180
I think as far as like an ecosystem goes, it's cool.
link |
02:10:43.700
I personally love Colorado.
link |
02:10:45.540
Colorado's great.
link |
02:10:47.100
Yeah, I mean, you have these states that are,
link |
02:10:49.140
like just way better run.
link |
02:10:51.380
California, you know, especially San Francisco,
link |
02:10:55.380
is not, and like, yeah.
link |
02:10:58.460
Can I ask you for advice to me and to others
link |
02:11:02.620
about what's it take to build a successful startup?
link |
02:11:07.540
Oh, I don't know.
link |
02:11:08.380
I haven't done that.
link |
02:11:09.220
Talk to someone who did that.
link |
02:11:10.740
Well, you've, you know,
link |
02:11:14.980
this is like another book of yours
link |
02:11:16.460
that I'll buy for $67, I suppose.
link |
02:11:18.860
So there's, um.
link |
02:11:20.700
One of these days I'll sell out.
link |
02:11:24.140
Yeah, that's right.
link |
02:11:24.980
Jailbreaks are going to be a dollar
link |
02:11:26.020
and books are going to be $67.
link |
02:11:27.860
How I jailbroke the iPhone, by George Hotz.
link |
02:11:32.060
That's right.
link |
02:11:32.900
How I jailbroke the iPhone, and you can too.
link |
02:11:35.620
You can too.
link |
02:11:36.860
$67.
link |
02:11:37.700
In 21 days.
link |
02:11:39.020
That's right.
link |
02:11:39.860
That's right.
link |
02:11:40.700
Oh God.
link |
02:11:41.540
Okay, I can't wait.
link |
02:11:42.380
But quite seriously, have you been introspective?
link |
02:11:44.940
You have built a very unique company.
link |
02:11:49.460
I mean, not you, but you and others.
link |
02:11:53.180
But I don't know.
link |
02:11:55.620
There's no, there's nothing.
link |
02:11:56.940
Have you been introspective?
link |
02:11:57.780
You haven't really sat down and thought about, like,
link |
02:12:01.780
well, like if you and I were having a bunch of,
link |
02:12:04.180
we're having some beers
link |
02:12:06.180
and you're seeing that I'm depressed
link |
02:12:08.100
and whatever, I'm struggling.
link |
02:12:09.860
There's no advice you can give?
link |
02:12:11.580
Oh, I mean.
link |
02:12:13.100
More beer?
link |
02:12:13.940
More beer?
link |
02:12:15.100
Um, yeah, I think it's all very like situation dependent.
link |
02:12:23.340
Here's, okay, if I can give a generic piece of advice,
link |
02:12:25.860
it's the technology always wins.
link |
02:12:28.140
The better technology always wins.
link |
02:12:30.740
And lying always loses.
link |
02:12:35.180