
George Hotz: Hacking the Simulation & Learning to Drive with Neural Nets | Lex Fridman Podcast #132



link |
00:00:00.000
The following is a conversation with George Hotz,
link |
00:00:02.440
AKA Geohot, his second time on the podcast.
link |
00:00:06.520
He's the founder of Comma AI,
link |
00:00:09.320
an autonomous and semi-autonomous vehicle technology company
link |
00:00:12.880
that seeks to be to Tesla Autopilot
link |
00:00:15.840
what Android is to iOS.
link |
00:00:18.920
They sell the Comma 2 device for $1,000
link |
00:00:22.760
that when installed in many of their supported cars
link |
00:00:25.600
can keep the vehicle centered in the lane
link |
00:00:27.960
even when there are no lane markings.
link |
00:00:30.760
It includes driver sensing
link |
00:00:32.400
that ensures that the driver's eyes are on the road.
link |
00:00:35.640
As you may know, I'm a big fan of driver sensing.
link |
00:00:38.440
I do believe Tesla Autopilot and others
link |
00:00:40.560
should definitely include it in their sensor suite.
link |
00:00:43.520
Also, I'm a fan of Android and a big fan of George
link |
00:00:47.120
for many reasons,
link |
00:00:48.240
including his nonlinear out of the box brilliance
link |
00:00:51.640
and the fact that he's a superstar programmer
link |
00:00:55.120
of a very different style than myself.
link |
00:00:57.360
Styles make fights and styles make conversations.
link |
00:01:01.160
So I really enjoyed this chat
link |
00:01:02.920
and I'm sure we'll talk many more times on this podcast.
link |
00:01:06.280
Quick mention of a sponsor
link |
00:01:07.680
followed by some thoughts related to the episode.
link |
00:01:10.160
First is Four Sigmatic,
link |
00:01:12.240
the maker of delicious mushroom coffee.
link |
00:01:15.360
Second is Decoding Digital,
link |
00:01:17.520
a podcast on tech and entrepreneurship
link |
00:01:19.760
that I listen to and enjoy.
link |
00:01:22.080
And finally, ExpressVPN,
link |
00:01:24.520
the VPN I've used for many years to protect my privacy
link |
00:01:27.800
on the internet.
link |
00:01:29.320
Please check out the sponsors in the description
link |
00:01:31.280
to get a discount and to support this podcast.
link |
00:01:34.840
As a side note, let me say that my work at MIT
link |
00:01:38.080
on autonomous and semi-autonomous vehicles
link |
00:01:40.480
led me to study the human side of autonomy
link |
00:01:43.080
enough to understand that it's a beautifully complicated
link |
00:01:46.600
and interesting problem space,
link |
00:01:48.600
much richer than what can be studied in the lab.
link |
00:01:51.800
In that sense, the data that Comma AI, Tesla Autopilot
link |
00:01:55.360
and perhaps others like Cadillac Super Cruise are collecting
link |
00:01:58.480
gives us a chance to understand
link |
00:02:00.560
how we can design safe semi-autonomous vehicles
link |
00:02:03.800
for real human beings in real world conditions.
link |
00:02:07.560
I think this requires bold innovation
link |
00:02:09.920
and a serious exploration of the first principles
link |
00:02:13.000
of the driving task itself.
link |
00:02:15.640
If you enjoyed this thing, subscribe on YouTube,
link |
00:02:17.880
review it with five stars on Apple Podcasts,
link |
00:02:20.160
follow on Spotify, support on Patreon
link |
00:02:22.760
or connect with me on Twitter at Lex Fridman.
link |
00:02:26.280
And now here's my conversation with George Hotz.
link |
00:02:31.360
So last time we started talking about the simulation,
link |
00:02:34.040
this time let me ask you,
link |
00:02:35.640
do you think there's intelligent life out there
link |
00:02:37.480
in the universe?
link |
00:02:38.600
I've always maintained my answer to the Fermi paradox.
link |
00:02:41.640
I think there has been intelligent life
link |
00:02:44.440
elsewhere in the universe.
link |
00:02:45.880
So intelligent civilizations existed
link |
00:02:47.920
but they've blown themselves up.
link |
00:02:49.240
So your general intuition is that
link |
00:02:50.760
intelligent civilizations quickly,
link |
00:02:54.560
like there's that parameter in the Drake equation.
link |
00:02:57.760
Your sense is they don't last very long.
link |
00:02:59.640
Yeah.
link |
00:03:00.480
How are we doing on that?
link |
00:03:01.560
Like, have we lasted pretty good?
link |
00:03:03.680
Oh no.
link |
00:03:04.520
Are we due?
link |
00:03:05.360
Oh yeah.
link |
00:03:06.200
I mean, not quite yet.
link |
00:03:09.280
Well, I was gonna tell you, since you asked:
link |
00:03:10.440
the IQ required to destroy the world
link |
00:03:13.520
falls by one point every year.
link |
00:03:15.440
Okay.
link |
00:03:16.280
Technology democratizes the destruction of the world.
link |
00:03:21.120
When can a meme destroy the world?
link |
00:03:23.120
It kind of is already, right?
link |
00:03:27.280
Somewhat.
link |
00:03:28.480
I don't think we've seen anywhere near the worst of it yet.
link |
00:03:32.240
Well, it's going to get weird.
link |
00:03:34.000
Well, maybe a meme can save the world.
link |
00:03:36.480
You thought about that?
link |
00:03:37.480
The meme Lord Elon Musk fighting on the side of good
link |
00:03:40.800
versus the meme Lord of the darkness,
link |
00:03:44.560
which is not saying anything bad about Donald Trump,
link |
00:03:48.280
but he is the Lord of the meme on the dark side.
link |
00:03:51.720
He's a Darth Vader of memes.
link |
00:03:53.760
I think in every fairy tale they always end it with,
link |
00:03:58.360
and they lived happily ever after.
link |
00:03:59.920
And I'm like, please tell me more
link |
00:04:00.960
about this happily ever after.
link |
00:04:02.440
I've heard 50% of marriages end in divorce.
link |
00:04:05.880
Why doesn't your marriage end up there?
link |
00:04:07.840
You can't just say happily ever after.
link |
00:04:09.280
So it's the thing about destruction
link |
00:04:12.280
is it's over after the destruction.
link |
00:04:14.840
We have to do everything right in order to avoid it.
link |
00:04:18.160
And one thing wrong,
link |
00:04:20.040
I mean, actually this is what I really like
link |
00:04:21.920
about cryptography.
link |
00:04:22.960
Cryptography, it seems like we live in a world
link |
00:04:24.600
where the defense wins versus like nuclear weapons.
link |
00:04:29.640
The opposite is true.
link |
00:04:30.920
It is much easier to build a warhead
link |
00:04:32.960
that splits into a hundred little warheads
link |
00:04:34.520
than to build something that can, you know,
link |
00:04:36.440
take out a hundred little warheads.
link |
00:04:38.880
The offense has the advantage there.
link |
00:04:41.400
So maybe our future is in crypto, but.
link |
00:04:44.520
So cryptography, right.
link |
00:04:45.720
The Goliath is the defense.
link |
00:04:49.760
And then all the different hackers are the Davids.
link |
00:04:54.280
And that equation is flipped for nuclear war.
link |
00:04:57.840
Cause there's so many,
link |
00:04:58.800
like one nuclear weapon destroys everything essentially.
link |
00:05:01.960
Yeah, and it is much easier to attack with a nuclear weapon
link |
00:05:06.960
than it is to like the technology required to intercept
link |
00:05:09.640
and destroy a rocket is much more complicated
link |
00:05:12.080
than the technology required to just, you know,
link |
00:05:13.800
orbital trajectory, send a rocket to somebody.
link |
00:05:17.480
So, okay.
link |
00:05:18.520
Your intuition that there were intelligent civilizations
link |
00:05:21.880
out there, but it's very possible
link |
00:05:24.360
that they're no longer there.
link |
00:05:26.240
That's kind of a sad picture.
link |
00:05:27.640
They enter some steady state.
link |
00:05:29.520
They all wirehead themselves.
link |
00:05:31.520
What's wirehead?
link |
00:05:33.360
Stimulate, stimulate their pleasure centers
link |
00:05:35.320
and just, you know, live forever in this kind of stasis.
link |
00:05:39.680
They become, well, I mean,
link |
00:05:42.560
I think the reason I believe this is because where are they?
link |
00:05:46.320
If there's some reason they stopped expanding,
link |
00:05:50.680
cause otherwise they would have taken over the universe.
link |
00:05:52.160
The universe isn't that big.
link |
00:05:53.440
Or at least, you know,
link |
00:05:54.280
let's just talk about the galaxy, right?
link |
00:05:56.120
That's 70,000 light years across.
link |
00:05:58.720
I took that number from Star Trek Voyager.
link |
00:05:59.920
I don't know how true it is, but yeah, that's not big.
link |
00:06:04.680
Right? 70,000 light years is nothing.
link |
00:06:07.320
For some possible technology that you can imagine
link |
00:06:10.040
that can leverage like wormholes or something like that.
link |
00:06:12.320
Or you don't even need wormholes.
link |
00:06:13.320
Just a von Neumann probe is enough.
link |
00:06:15.120
A von Neumann probe and a million years of sublight travel
link |
00:06:18.440
and you'd have taken over the whole universe.
link |
00:06:20.440
That clearly didn't happen.
link |
00:06:22.480
So something stopped it.
link |
00:06:24.000
So you mean if you, right,
link |
00:06:25.400
for like a few million years,
link |
00:06:27.040
if you sent out probes that travel close,
link |
00:06:29.880
what's sublight?
link |
00:06:30.720
You mean close to the speed of light?
link |
00:06:32.280
Let's say 0.1c.
link |
00:06:33.720
And it just spreads.
link |
00:06:34.800
Interesting.
link |
00:06:35.640
Actually, that's an interesting calculation, huh?
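A back-of-the-envelope sketch of that calculation, taking the 70,000-light-year figure and the 0.1c probe speed mentioned above at face value and ignoring time spent replicating along the way:

```python
# Rough crossing time for self-replicating probes at 0.1c across a galaxy
# quoted above as ~70,000 light years across.
GALAXY_DIAMETER_LY = 70_000  # light years (figure quoted from Star Trek Voyager)
PROBE_SPEED_C = 0.1          # fraction of the speed of light

# Covering d light years at a speed of v*c takes d/v years.
crossing_time_years = GALAXY_DIAMETER_LY / PROBE_SPEED_C
print(f"{crossing_time_years:,.0f} years")  # 700,000 years
```

Even padded generously for stops to build new probes, that lands around the "million years of sublight travel" mentioned above, a blink on galactic timescales.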
link |
00:06:38.800
So what makes you think that we'd be able
link |
00:06:40.640
to communicate with them?
link |
00:06:42.280
Like, yeah, what's,
link |
00:06:45.160
why do you think we would be able to be able
link |
00:06:47.920
to comprehend intelligent lives that are out there?
link |
00:06:51.920
Like even if they were among us kind of thing,
link |
00:06:54.960
like, or even just flying around?
link |
00:06:57.600
Well, I mean, that's possible.
link |
00:07:01.200
It's possible that there is some sort of prime directive.
link |
00:07:04.640
That'd be a really cool universe to live in.
link |
00:07:07.040
And there's some reason
link |
00:07:08.000
they're not making themselves visible to us.
link |
00:07:10.920
But it makes sense that they would use the same,
link |
00:07:15.200
well, at least the same entropy.
link |
00:07:16.960
Well, you're implying the same laws of physics.
link |
00:07:18.800
I don't know what you mean by entropy in this case.
link |
00:07:20.800
Oh, yeah.
link |
00:07:21.920
I mean, if entropy is the scarce resource in the universe.
link |
00:07:25.040
So what do you think about like Stephen Wolfram
link |
00:07:26.960
and everything is a computation?
link |
00:07:28.840
And then what if they are traveling through
link |
00:07:31.440
this world of computation?
link |
00:07:32.640
So if you think of the universe
link |
00:07:34.240
as just information processing,
link |
00:07:36.600
then what you're referring to with entropy
link |
00:07:40.840
and then these pockets of interesting complex computation
link |
00:07:44.240
swimming around, how do we know they're not already here?
link |
00:07:47.160
How do we know that this,
link |
00:07:51.040
like all the different amazing things
link |
00:07:53.040
that are full of mystery on earth
link |
00:07:55.080
are just like little footprints of intelligence
link |
00:07:58.640
from light years away?
link |
00:08:01.160
Maybe.
link |
00:08:02.840
I mean, I tend to think that as civilizations expand,
link |
00:08:05.760
they use more and more energy
link |
00:08:07.800
and you can never overcome the problem of waste heat.
link |
00:08:10.240
So where is there waste heat?
link |
00:08:11.880
So we'd be able to, with our crude methods,
link |
00:08:13.560
be able to see like, there's a whole lot of energy here.
link |
00:08:18.840
But it could be something we're not,
link |
00:08:20.560
I mean, we don't understand dark energy, right?
link |
00:08:22.480
Dark matter.
link |
00:08:23.560
It could be just stuff we don't understand at all.
link |
00:08:26.160
Or they can have a fundamentally different physics,
link |
00:08:29.080
you know, like that we just don't even comprehend.
link |
00:08:32.200
Well, I think, okay,
link |
00:08:33.440
I mean, it depends how far out you wanna go.
link |
00:08:35.120
I don't think physics is very different
link |
00:08:36.840
on the other side of the galaxy.
link |
00:08:39.760
I would suspect that they have,
link |
00:08:41.920
I mean, if they're in our universe,
link |
00:08:43.680
they have the same physics.
link |
00:08:45.760
Well, yeah, that's the assumption we have,
link |
00:08:47.600
but there could be like super trippy things
link |
00:08:50.000
like our cognition only gets to a slice,
link |
00:08:57.040
and all the possible instruments that we can design
link |
00:08:59.440
only get to a particular slice of the universe.
link |
00:09:01.520
And there's something much like weirder.
link |
00:09:04.080
Maybe we can try a thought experiment.
link |
00:09:06.880
Would people from the past
link |
00:09:10.040
be able to detect the remnants of our,
link |
00:09:14.000
or would we be able to detect our modern civilization?
link |
00:09:16.600
I think the answer is obviously yes.
link |
00:09:18.840
You mean past from a hundred years ago?
link |
00:09:20.720
Well, let's even go back further.
link |
00:09:22.080
Let's go to a million years ago, right?
link |
00:09:24.400
The humans who were lying around in the desert
link |
00:09:26.520
probably didn't even have,
link |
00:09:27.720
maybe they just barely had fire.
link |
00:09:31.360
They would understand if a 747 flew overhead.
link |
00:09:35.200
Oh, in this vicinity, but not if a 747 flew on Mars.
link |
00:09:43.200
Like, cause they wouldn't be able to see far,
link |
00:09:45.080
cause we're not actually communicating that well
link |
00:09:47.240
with the rest of the universe.
link |
00:09:48.920
We're doing okay.
link |
00:09:50.160
Just sending out random like fifties tracks of music.
link |
00:09:54.200
True.
link |
00:09:55.160
And yeah, I mean, they'd have to, you know,
link |
00:09:57.800
we've only been broadcasting radio waves for 150 years.
link |
00:10:02.480
And well, there's your light cone.
link |
00:10:04.560
So.
link |
00:10:05.800
Yeah. Okay.
link |
00:10:06.920
What do you make about all the,
link |
00:10:08.800
I recently came across this having talked to David Fravor.
link |
00:10:14.720
I don't know if you caught what the videos
link |
00:10:16.960
of the Pentagon released
link |
00:10:18.840
and the New York Times reporting of the UFO sightings.
link |
00:10:23.480
So I kind of looked into it, quote unquote.
link |
00:10:26.040
And there's actually been like hundreds
link |
00:10:30.800
of thousands of UFO sightings, right?
link |
00:10:33.800
And a lot of it you can explain away
link |
00:10:35.960
in different kinds of ways.
link |
00:10:37.080
So one is it could be interesting physical phenomena.
link |
00:10:40.160
Two, it could be people wanting to believe
link |
00:10:44.640
and therefore they conjure up a lot of different things
link |
00:10:46.760
that just, you know, when you see different kinds of lights,
link |
00:10:48.920
some basic physics phenomena,
link |
00:10:50.760
and then you just conjure up ideas
link |
00:10:53.640
of possible out there mysterious worlds.
link |
00:10:56.680
But, you know, it's also possible,
link |
00:10:58.960
like you have a case of David Fravor,
link |
00:11:02.480
who is a Navy pilot, who's, you know,
link |
00:11:06.160
as legit as it gets in terms of humans
link |
00:11:08.840
who are able to perceive things in the environment
link |
00:11:13.440
and make conclusions,
link |
00:11:15.320
whether those things are a threat or not.
link |
00:11:17.600
And he and several other pilots saw a thing,
link |
00:11:22.000
I don't know if you followed this,
link |
00:11:23.440
but they saw a thing that they've since then called the Tic Tac
link |
00:11:26.840
that moved in all kinds of weird ways.
link |
00:11:29.520
They don't know what it is.
link |
00:11:30.640
It could be technology developed by the United States
link |
00:11:36.640
and they're just not aware of it
link |
00:11:38.040
and the surface level from the Navy, right?
link |
00:11:40.000
It could be different kind of lighting technology
link |
00:11:42.280
or drone technology, all that kind of stuff.
link |
00:11:45.000
It could be the Russians and the Chinese,
link |
00:11:46.560
all that kind of stuff.
link |
00:11:48.000
And of course their mind, our mind,
link |
00:11:51.400
can also venture into the possibility
link |
00:11:54.160
that it's from another world.
link |
00:11:56.320
Have you looked into this at all?
link |
00:11:58.160
What do you think about it?
link |
00:11:59.640
I think all the news is a psyop.
link |
00:12:01.680
I think that the most plausible.
link |
00:12:05.240
Nothing is real.
link |
00:12:06.440
Yeah, I listened to the, I think it was Bob Lazar
link |
00:12:10.920
on Joe Rogan.
link |
00:12:12.360
And like, I believe everything this guy is saying.
link |
00:12:15.840
And then I think that it's probably just some like MKUltra
link |
00:12:18.440
kind of thing, you know?
link |
00:12:20.680
What do you mean?
link |
00:12:21.520
Like they, you know, they made some weird thing
link |
00:12:24.720
and they called it an alien spaceship.
link |
00:12:26.160
You know, maybe it was just to like
link |
00:12:27.480
stimulate young physicists' minds.
link |
00:12:29.560
We'll tell them it's alien technology
link |
00:12:31.040
and we'll see what they come up with, right?
link |
00:12:33.600
Do you find any conspiracy theories compelling?
link |
00:12:36.080
Like have you pulled at the string
link |
00:12:38.320
of the rich complex world of conspiracy theories
link |
00:12:42.440
that's out there?
link |
00:12:43.920
I think that I've heard a conspiracy theory
link |
00:12:46.520
that conspiracy theories were invented by the CIA
link |
00:12:48.960
in the 60s to discredit true things.
link |
00:12:52.040
Yeah.
link |
00:12:53.880
So, you know, you can go to ridiculous conspiracy theories
link |
00:12:58.520
like Flat Earth and Pizza Gate.
link |
00:13:01.000
And, you know, these things are almost to hide
link |
00:13:05.440
like conspiracy theories that like,
link |
00:13:08.000
you know, remember when the Chinese like locked up
link |
00:13:09.640
the doctors who discovered coronavirus?
link |
00:13:11.360
Like I tell people this and I'm like,
link |
00:13:12.840
no, no, no, that's not a conspiracy theory.
link |
00:13:14.400
That actually happened.
link |
00:13:15.920
Do you remember the time that the money used to be backed
link |
00:13:18.000
by gold and now it's backed by nothing?
link |
00:13:20.080
This is not a conspiracy theory.
link |
00:13:21.680
This actually happened.
link |
00:13:23.800
Well, that's one of my worries today
link |
00:13:26.360
with the idea of fake news is that when nothing is real,
link |
00:13:32.880
then like you dilute the possibility of anything being true
link |
00:13:37.600
by conjuring up all kinds of conspiracy theories.
link |
00:13:41.000
And then you don't know what to believe.
link |
00:13:42.400
And then like the idea of truth of objectivity
link |
00:13:46.280
is lost completely.
link |
00:13:47.840
Everybody has their own truth.
link |
00:13:50.120
So you used to control information by censoring it.
link |
00:13:53.600
And then the internet happened and governments were like,
link |
00:13:55.880
oh shit, we can't censor things anymore.
link |
00:13:58.360
I know what we'll do.
link |
00:14:00.560
You know, it's the old story of the leprechaun:
link |
00:14:04.160
a leprechaun tells you where his gold is buried,
link |
00:14:07.080
and you tie a flag there and you make the leprechaun swear
link |
00:14:09.080
to not remove the flag.
link |
00:14:10.040
And you come back to the field later with a shovel
link |
00:14:11.720
and there's flags everywhere.
link |
00:14:14.640
That's one way to maintain privacy, right?
link |
00:14:16.320
It's like in order to protect the contents
link |
00:14:20.280
of this conversation, for example,
link |
00:14:21.800
we could just generate like millions of deepfake
link |
00:14:25.120
conversations where you and I talk
link |
00:14:27.560
and say random things.
link |
00:14:29.120
So this is just one of them
link |
00:14:30.440
and nobody knows which one was the real one.
link |
00:14:32.720
This could be fake right now.
link |
00:14:34.440
Classic steganography technique.
link |
00:14:37.240
Okay, another absurd question about intelligent life.
link |
00:14:39.960
Cause you know, you're an incredible programmer
link |
00:14:43.960
outside of everything else we'll talk about
link |
00:14:45.520
just as a programmer.
link |
00:14:49.320
Do you think intelligent beings out there,
link |
00:14:52.960
the civilizations that were out there,
link |
00:14:54.520
had computers and programming?
link |
00:14:58.240
Did they, do we naturally have to develop something
link |
00:15:01.440
where we engineer machines and are able to encode
link |
00:15:05.480
both knowledge into those machines
link |
00:15:08.680
and instructions that process that knowledge,
link |
00:15:11.720
process that information to make decisions
link |
00:15:14.260
and actions and so on?
link |
00:15:15.680
And would those programming languages,
link |
00:15:18.320
if you think they exist, be at all similar
link |
00:15:21.320
to anything we've developed?
link |
00:15:24.120
So I don't see that much of a difference
link |
00:15:26.600
between quote unquote natural languages
link |
00:15:29.400
and programming languages.
link |
00:15:34.160
Yeah.
link |
00:15:35.000
I think there's so many similarities.
link |
00:15:36.680
So when asked the question,
link |
00:15:39.920
what do alien languages look like?
link |
00:15:42.380
I imagine they're not all that dissimilar from ours.
link |
00:15:46.480
And I think translating in and out of them
link |
00:15:51.560
wouldn't be that crazy.
link |
00:15:52.920
Well, it's difficult to compile like DNA to Python
link |
00:15:57.520
and then to C.
link |
00:15:59.160
There's a little bit of a gap in the kind of languages
link |
00:16:02.000
we use for Turing machines
link |
00:16:06.840
and the kind of languages nature seems to use a little bit.
link |
00:16:10.160
Maybe that's just, we just haven't understood
link |
00:16:13.800
the kind of language that nature uses well yet.
link |
00:16:16.440
DNA is a CAD model.
link |
00:16:19.120
It's not quite a programming language.
link |
00:16:21.200
It has no sort of a serial execution.
link |
00:16:25.280
It's not quite a, yeah, it's a CAD model.
link |
00:16:29.480
So I think in that sense,
link |
00:16:30.920
we actually completely understand it.
link |
00:16:32.520
The problem is, well, simulating on these CAD models,
link |
00:16:37.300
I played with it a bit this year,
link |
00:16:38.520
is super computationally intensive.
link |
00:16:41.120
If you wanna go down to like the molecular level
link |
00:16:43.720
where you need to go to see a lot of these phenomena
link |
00:16:45.880
like protein folding.
link |
00:16:48.080
So yeah, it's not that we don't understand it.
link |
00:16:52.200
It just requires a whole lot of compute to kind of compile it.
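As a toy illustration of where that compute goes (a made-up pair potential, not any real molecular dynamics package): even the crudest simulation step at the molecular level evaluates interactions between pairs of atoms, which grows roughly with the square of the atom count, and folding needs an enormous number of such steps.

```python
import itertools, math

def pairwise_energy(positions):
    # Naive O(n^2) sum over all atom pairs with a toy potential (arbitrary units).
    total = 0.0
    for p1, p2 in itertools.combinations(positions, 2):
        r = math.dist(p1, p2)
        total += 1.0 / r**12 - 1.0 / r**6
    return total

# A small protein is on the order of 10,000 atoms: ~50 million pairs per step,
# repeated over a very large number of femtosecond-scale time steps.
n_atoms = 10_000
print(n_atoms * (n_atoms - 1) // 2)  # 49,995,000 pairs per naive step
```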
link |
00:16:55.160
For our human minds, it's inefficient,
link |
00:16:56.680
both for the data representation and for the programming.
link |
00:17:00.520
Yeah, it runs well on raw nature.
link |
00:17:02.800
It runs well on raw nature.
link |
00:17:03.880
And when we try to build emulators or simulators for that,
link |
00:17:07.840
well, they're mad slow, but I've tried it.
link |
00:17:10.640
It runs in that, yeah, you've commented elsewhere,
link |
00:17:14.280
I don't remember where,
link |
00:17:15.780
that one of the problems is simulating nature is tough.
link |
00:17:20.780
And if you want to sort of deploy a prototype,
link |
00:17:25.300
I forgot how you put it, but it made me laugh,
link |
00:17:28.020
but animals or humans would need to be involved
link |
00:17:31.700
in order to try to run some prototype code on,
link |
00:17:38.120
like if we're talking about COVID and viruses and so on,
link |
00:17:41.180
if you were trying to engineer
link |
00:17:42.900
some kind of defense mechanisms,
link |
00:17:45.020
like a vaccine against COVID and all that kind of stuff
link |
00:17:49.580
that doing any kind of experimentation,
link |
00:17:52.060
like you can with like autonomous vehicles
link |
00:17:53.980
would be very technically and ethically costly.
link |
00:17:59.660
I'm not sure about that.
link |
00:18:00.940
I think you can do tons of crazy biology and test tubes.
link |
00:18:05.060
I think my bigger complaint is more,
link |
00:18:08.480
oh, the tools are so bad.
link |
00:18:11.420
Like literally, you mean like libraries and?
link |
00:18:14.540
I don't know, I'm not pipetting shit.
link |
00:18:16.100
Like you're handing me a, I got a, no, no, no,
link |
00:18:20.140
there has to be some.
link |
00:18:22.580
Like automating stuff.
link |
00:18:24.140
And like the, yeah, but human biology is messy.
link |
00:18:28.300
Like it seems.
link |
00:18:29.140
But like, look at those Theranos videos.
link |
00:18:31.240
They were a joke.
link |
00:18:32.080
It's like a little gantry.
link |
00:18:33.460
It's like little X, Y gantry,
link |
00:18:34.660
high school science project with the pipette.
link |
00:18:36.660
I'm like, really?
link |
00:18:38.260
Gotta be something better.
link |
00:18:39.220
You can't build like nice microfluidics
link |
00:18:41.420
and I can program the computation to bio interface.
link |
00:18:45.300
I mean, this is gonna happen.
link |
00:18:47.180
But like right now, if you are asking me
link |
00:18:50.020
to pipette 50 milliliters of solution, I'm out.
link |
00:18:54.460
This is so crude.
link |
00:18:55.460
Yeah.
link |
00:18:56.700
Okay, let's get all the crazy out of the way.
link |
00:18:59.980
So a bunch of people asked me,
link |
00:19:02.260
since we talked about the simulation last time,
link |
00:19:05.060
we talked about hacking the simulation.
link |
00:19:06.860
Do you have any updates, any insights
link |
00:19:09.900
about how we might be able to go about hacking simulation
link |
00:19:13.740
if we indeed do live in a simulation?
link |
00:19:17.180
I think a lot of people misinterpreted
link |
00:19:19.900
the point of that South by talk.
link |
00:19:22.420
The point of the South by talk
link |
00:19:23.500
was not literally to hack the simulation.
link |
00:19:26.720
I think that this idea is literally just,
link |
00:19:33.520
I think theoretical physics.
link |
00:19:34.580
I think that's the whole goal, right?
link |
00:19:39.900
You want your grand unified theory, but then, okay,
link |
00:19:42.300
build a grand unified theory search for exploits, right?
link |
00:19:45.140
I think we're nowhere near actually there yet.
link |
00:19:47.640
My hope with that was just more to like,
link |
00:19:51.500
are you people kidding me
link |
00:19:52.740
with the things you spend time thinking about?
link |
00:19:54.980
Do you understand like kind of how small you are?
link |
00:19:58.020
You are bytes in God's computer, really?
link |
00:20:02.540
And the things that people get worked up about, you know?
link |
00:20:06.700
So basically, it was more a message
link |
00:20:10.060
of we should humble ourselves.
link |
00:20:12.540
That we get to, like what are we humans in this byte code?
link |
00:20:19.460
Yeah, and not just humble ourselves,
link |
00:20:22.380
but like I'm not trying to like make people guilty
link |
00:20:24.900
or anything like that.
link |
00:20:25.740
I'm trying to say like, literally,
link |
00:20:27.260
look at what you are spending time on, right?
link |
00:20:30.200
What are you referring to?
link |
00:20:31.040
You're referring to the Kardashians?
link |
00:20:32.460
What are we talking about?
link |
00:20:34.140
Twitter?
link |
00:20:34.980
No, the Kardashians, everyone knows that's kind of fun.
link |
00:20:38.060
I'm referring more to like the economy, you know?
link |
00:20:42.980
This idea that we gotta up our stock price.
link |
00:20:50.720
Or what is the goal function of humanity?
link |
00:20:55.380
You don't like the game of capitalism?
link |
00:20:57.600
Like you don't like the games we've constructed
link |
00:20:59.340
for ourselves as humans?
link |
00:21:00.660
I'm a big fan of capitalism.
link |
00:21:02.860
I don't think that's really the game we're playing right now.
link |
00:21:05.120
I think we're playing a different game
link |
00:21:07.260
where the rules are rigged.
link |
00:21:10.300
Okay, which games are interesting to you
link |
00:21:12.580
that we humans have constructed and which aren't?
link |
00:21:14.780
Which are productive and which are not?
link |
00:21:18.380
Actually, maybe that's the real point of the talk.
link |
00:21:21.880
It's like, stop playing these fake human games.
link |
00:21:25.100
There's a real game here.
link |
00:21:26.780
We can play the real game.
link |
00:21:28.660
The real game is, you know, nature wrote the rules.
link |
00:21:31.260
This is a real game.
link |
00:21:32.540
There still is a game to play.
link |
00:21:35.180
But if you look at, sorry to interrupt,
link |
00:21:36.940
I don't know if you've seen the Instagram account,
link |
00:21:38.420
nature is metal.
link |
00:21:40.220
The game that nature seems to be playing
link |
00:21:42.820
is a lot more cruel than we humans want to put up with.
link |
00:21:47.340
Or at least we see it as cruel.
link |
00:21:49.520
It's like the bigger thing eats the smaller thing
link |
00:21:53.660
and does it to impress another big thing
link |
00:21:58.180
so it can mate with that thing.
link |
00:22:00.460
And that's it.
link |
00:22:01.300
That seems to be the entirety of it.
link |
00:22:04.040
Well, there's no art, there's no music,
link |
00:22:07.260
there's no comma AI, there's no comma one,
link |
00:22:10.860
no comma two, no George Hotz with his brilliant talks
link |
00:22:14.940
at South by Southwest.
link |
00:22:17.020
I disagree, though.
link |
00:22:17.860
I disagree that this is what nature is.
link |
00:22:19.620
I think nature just provided basically an open world MMORPG.
link |
00:22:26.780
And, you know, here it's open world.
link |
00:22:29.860
I mean, if that's the game you want to play,
link |
00:22:31.100
you can play that game.
link |
00:22:32.260
But isn't that beautiful?
link |
00:22:33.780
I don't know if you played Diablo.
link |
00:22:35.580
They used to have, I think, cow level where it's...
link |
00:22:39.420
So everybody will go just, they figured out this,
link |
00:22:44.420
like the best way to gain like experience points
link |
00:22:48.420
is to just slaughter cows over and over and over.
link |
00:22:52.180
And so they figured out this little sub game
link |
00:22:55.900
within the bigger game that this is the most efficient way
link |
00:22:58.800
to get experience points.
link |
00:22:59.900
And everybody somehow agreed
link |
00:23:01.860
that getting experience points in RPG context
link |
00:23:04.460
where you always want to be getting more stuff,
link |
00:23:06.500
more skills, more levels, keep advancing.
link |
00:23:09.140
That seems to be good.
link |
00:23:10.440
So might as well sacrifice actual enjoyment
link |
00:23:14.740
of playing a game, exploring a world,
link |
00:23:17.640
and spending like hundreds of hours of your time
link |
00:23:21.580
at cow level.
link |
00:23:22.420
I mean, the number of hours I spent in cow level,
link |
00:23:26.400
I'm not like the most impressive person
link |
00:23:28.140
because people have spent probably thousands of hours there,
link |
00:23:30.460
but it's ridiculous.
link |
00:23:31.580
So that's a little absurd game that brought me joy
link |
00:23:35.220
in some weird dopamine drug kind of way.
link |
00:23:37.500
So you don't like those games.
link |
00:23:40.060
You don't think that's us humans fleeing the nature?
link |
00:23:46.500
I think so.
link |
00:23:47.340
And that was the point of the talk.
link |
00:23:49.640
Yeah.
link |
00:23:50.480
So how do we hack it then?
link |
00:23:51.460
Well, I want to live forever.
link |
00:23:52.740
And I want to live forever.
link |
00:23:55.820
And this is the goal.
link |
00:23:56.780
Well, that's a game against nature.
link |
00:23:59.220
Yeah, immortality is the good objective function to you?
link |
00:24:03.540
I mean, start there and then you can do whatever else
link |
00:24:05.100
you want because you got a long time.
link |
00:24:07.380
What if immortality makes the game just totally not fun?
link |
00:24:10.740
I mean, like, why do you assume immortality
link |
00:24:13.860
is somehow a good objective function?
link |
00:24:18.160
It's not immortality that I want.
link |
00:24:19.940
A true immortality where I could not die,
link |
00:24:22.580
I would prefer what we have right now.
link |
00:24:25.020
But I want to choose my own death, of course.
link |
00:24:27.180
I don't want nature to decide when I die,
link |
00:24:29.840
I'm going to win.
link |
00:24:30.780
I'm going to beat it.
link |
00:24:33.100
And then at some point, if you choose to commit suicide,
link |
00:24:36.920
like how long do you think you'd live?
link |
00:24:41.700
Until I get bored.
link |
00:24:43.140
See, I don't think people like brilliant people like you
link |
00:24:48.060
that really ponder living a long time
link |
00:24:52.380
are really considering how meaningless life becomes.
link |
00:24:58.620
Well, I want to know everything and then I'm ready to die.
link |
00:25:03.620
As long as there's...
link |
00:25:04.460
Yeah, but why do you want,
link |
00:25:05.280
isn't it possible that you want to know everything
link |
00:25:06.940
because it's finite?
link |
00:25:09.700
Like the reason you want to know quote unquote everything
link |
00:25:12.220
is because you don't have enough time to know everything.
link |
00:25:16.380
And once you have unlimited time,
link |
00:25:18.780
then you realize like, why do anything?
link |
00:25:22.220
Like why learn anything?
link |
00:25:25.140
I want to know everything and then I'm ready to die.
link |
00:25:27.100
So you have, yeah.
link |
00:25:28.460
It's not a, like, it's a terminal value.
link |
00:25:30.940
It's not in service of anything else.
link |
00:25:34.740
I'm conscious of the possibility, this is not a certainty,
link |
00:25:37.900
but the possibility of that engine of curiosity
link |
00:25:41.800
that you're speaking to is actually
link |
00:25:47.100
a symptom of the finiteness of life.
link |
00:25:49.780
Like without that finiteness, your curiosity would vanish.
link |
00:25:55.060
Like a morning fog.
link |
00:25:57.000
All right, cool.
link |
00:25:57.840
Bukowski talked about love like that.
link |
00:25:59.300
Then let me solve immortality
link |
00:26:01.340
and let me change the thing in my brain
link |
00:26:02.900
that reminds me of the fact that I'm immortal,
link |
00:26:04.700
tells me that life is finite. Shit.
link |
00:26:06.260
Maybe I'll have it tell me that life ends next week.
link |
00:26:09.060
Right?
link |
00:26:10.660
I'm okay with some self manipulation like that.
link |
00:26:12.660
I'm okay with deceiving myself.
link |
00:26:14.420
Oh, Rika, changing the code.
link |
00:26:17.020
Yeah, well, if that's the problem, right?
link |
00:26:18.300
If the problem is that I will no longer have that,
link |
00:26:20.580
that curiosity, I'd like to have backup copies of myself,
link |
00:26:24.460
which I check in with occasionally
link |
00:26:27.460
to make sure they're okay with the trajectory
link |
00:26:29.240
and they can kind of override it.
link |
00:26:31.000
Maybe a nice, like, I think of like those WaveNets,
link |
00:26:33.180
those like logarithmic go back to the copies.
link |
00:26:35.180
Yeah, but sometimes it's not reversible.
link |
00:26:36.700
Like I've done this with video games.
link |
00:26:39.980
Once you figure out the cheat code
link |
00:26:41.620
or like you look up how to cheat old school,
link |
00:26:43.940
like single player, it ruins the game for you.
link |
00:26:46.860
Absolutely.
link |
00:26:47.700
It ruins that feeling.
link |
00:26:48.540
But again, that just means our brain manipulation
link |
00:26:51.900
technology is not good enough yet.
link |
00:26:53.260
Remove that cheat code from your brain.
link |
00:26:54.700
Here you go.
link |
00:26:55.820
So it's also possible that if we figure out immortality,
link |
00:27:00.500
that all of us will kill ourselves
link |
00:27:03.460
before we advance far enough
link |
00:27:06.100
to be able to revert the change.
link |
00:27:08.820
I'm not killing myself till I know everything, so.
link |
00:27:11.900
That's what you say now, because your life is finite.
link |
00:27:15.020
You know, I think yes, self modifying systems gets,
link |
00:27:19.060
comes up with all these hairy complexities
link |
00:27:21.020
and can I promise that I'll do it perfectly?
link |
00:27:23.020
No, but I think I can put good safety structures in place.
link |
00:27:27.180
So that talk and your thinking here
link |
00:27:29.740
is not literally referring to a simulation
link |
00:27:36.180
and that our universe is a kind of computer program
link |
00:27:40.520
running on a computer.
link |
00:27:42.240
That's more of a thought experiment.
link |
00:27:45.180
Do you also think of the potential of the sort of Bostrom,
link |
00:27:51.780
Elon Musk and others that talk about an actual program
link |
00:27:57.820
that simulates our universe?
link |
00:27:59.700
Oh, I don't doubt that we're in a simulation.
link |
00:28:01.980
I just think that it's not quite that important.
link |
00:28:05.300
I mean, I'm interested only in simulation theory
link |
00:28:06.940
as far as like it gives me power over nature.
link |
00:28:09.740
If it's totally unfalsifiable, then who cares?
link |
00:28:13.140
I mean, what do you think that experiment would look like?
link |
00:28:15.300
Like somebody on Twitter asks,
link |
00:28:17.740
asks George what signs we would look for
link |
00:28:20.780
to know whether or not we're in the simulation,
link |
00:28:22.980
which is exactly what you're asking is like,
link |
00:28:25.880
the step that precedes the step of knowing
link |
00:28:29.500
how to get more power from this knowledge
link |
00:28:32.200
is to get an indication that there's some power to be gained.
link |
00:28:35.220
So get an indication that there,
link |
00:28:37.900
you can discover and exploit cracks in the simulation
link |
00:28:42.100
or it doesn't have to be in the physics of the universe.
link |
00:28:45.340
Yeah.
link |
00:28:46.720
Show me, I mean, like a memory leak could be cool.
link |
00:28:51.620
Like some scrying technology, you know?
link |
00:28:54.000
What kind of technology?
link |
00:28:55.260
Scrying?
link |
00:28:56.220
What's that?
link |
00:28:57.060
Oh, that's a weird,
link |
00:28:58.460
scrying is the paranormal ability to like remote viewing,
link |
00:29:03.460
like being able to see somewhere where you're not.
link |
00:29:08.220
So, you know, I don't think you can do it
link |
00:29:10.100
by chanting in a room,
link |
00:29:11.220
but if we could find, it's a memory leak, basically.
link |
00:29:16.300
It's a memory leak.
link |
00:29:17.300
Yeah, you're able to access parts you're not supposed to.
link |
00:29:19.980
Yeah, yeah, yeah.
link |
00:29:20.820
And thereby discover a shortcut.
link |
00:29:22.100
Yeah, maybe memory leak means the other thing as well,
link |
00:29:24.720
but I mean like, yeah,
link |
00:29:25.560
like an ability to read arbitrary memory, right?
link |
00:29:28.100
And that one's not that horrifying, right?
link |
00:29:29.820
The write ones start to be horrifying.
link |
00:29:31.420
Read, write.
link |
00:29:32.260
It's the reading is not the problem.
link |
00:29:34.900
Yeah, it's like Heartbleed for the universe.
link |
00:29:37.220
Oh boy, the writing is a big, big problem.
link |
00:29:40.740
It's a big problem.
link |
00:29:43.060
It's the moment you can write anything,
link |
00:29:44.620
even if it's just random noise.
link |
00:29:47.740
That's terrifying.
link |
00:29:49.180
I mean, even without that,
link |
00:29:51.560
like even some of the, you know,
link |
00:29:52.580
the nanotech stuff that's coming, I think is.
link |
00:29:57.140
I don't know if you're paying attention,
link |
00:29:58.340
but actually Eric Weinstein came out
link |
00:30:00.500
with the theory of everything.
link |
00:30:02.260
I mean, that came out.
link |
00:30:03.580
He's been working on a theory of everything
link |
00:30:05.460
in the physics world called geometric unity.
link |
00:30:08.060
And then for me, from computer science person like you,
link |
00:30:11.700
Stephen Wolfram's theory of everything,
link |
00:30:14.220
of like hypergraphs is super interesting and beautiful,
link |
00:30:17.660
but not from a physics perspective,
link |
00:30:19.460
but from a computational perspective.
link |
00:30:20.940
I don't know, have you paid attention to any of that?
link |
00:30:23.020
So again, like what would make me pay attention
link |
00:30:26.440
and like why I hate string theory is,
link |
00:30:29.540
okay, make a testable prediction, right?
link |
00:30:31.780
I'm only interested in,
link |
00:30:33.700
I'm not interested in theories for their intrinsic beauty.
link |
00:30:36.060
I'm interested in theories
link |
00:30:37.060
that give me power over the universe.
link |
00:30:39.900
So if these theories do, I'm very interested.
link |
00:30:43.220
Can I just say how beautiful that is?
link |
00:30:45.100
Because a lot of physicists say,
link |
00:30:47.140
I'm interested in experimental validation
link |
00:30:49.980
and they skip out the part where they say
link |
00:30:52.940
to give me more power in the universe.
link |
00:30:55.500
I just love the.
link |
00:30:57.500
No, I want. The clarity of that.
link |
00:30:59.780
I want 100 gigahertz processors.
link |
00:31:02.020
I want transistors that are smaller than atoms.
link |
00:31:04.120
I want like power.
link |
00:31:08.100
That's true.
link |
00:31:10.580
And that's where, from aliens
link |
00:31:12.460
to this kind of technology, people are worried
link |
00:31:14.420
about governments, like who owns that power?
link |
00:31:19.320
Is it George Hotz?
link |
00:31:20.780
Is it thousands of distributed hackers across the world?
link |
00:31:25.020
Is it governments?
link |
00:31:26.580
Is it Mark Zuckerberg?
link |
00:31:28.660
There's a lot of people that,
link |
00:31:32.500
I don't know if anyone trusts any one individual with power.
link |
00:31:35.540
So they're always worried.
link |
00:31:37.380
It's the beauty of blockchains.
link |
00:31:39.340
That's the beauty of blockchains, which we'll talk about.
link |
00:31:43.140
On Twitter, somebody pointed me to a story,
link |
00:31:46.220
a bunch of people pointed me to a story a few months ago
link |
00:31:49.300
where you went into a restaurant in New York.
link |
00:31:51.660
And you can correct me if any of this is wrong.
link |
00:31:53.660
And ran into a bunch of folks from a company,
link |
00:31:56.580
a crypto company, who are trying to scale up Ethereum.
link |
00:32:01.780
And they had a technical deadline
link |
00:32:03.300
related to the Solidity to OVM compiler.
link |
00:32:07.380
So these are all Ethereum technologies.
link |
00:32:09.660
So you stepped in, they recognized you,
link |
00:32:14.700
pulled you aside, explained their problem.
link |
00:32:16.220
And you stepped in and helped them solve the problem,
link |
00:32:19.540
thereby creating legend status story.
link |
00:32:23.100
So can you tell me the story in a little more detail?
link |
00:32:28.980
It seems kind of incredible.
link |
00:32:31.540
Did this happen?
link |
00:32:32.380
Yeah, yeah, it's a true story, it's a true story.
link |
00:32:34.060
I mean, they wrote a very flattering account of it.
link |
00:32:39.340
So Optimism is the company called Optimism,
link |
00:32:43.820
spin off of Plasma.
link |
00:32:45.420
They're trying to build L2 solutions on Ethereum.
link |
00:32:47.820
So right now, every Ethereum node
link |
00:32:52.620
has to run every transaction on the Ethereum network.
link |
00:32:56.420
And this kind of doesn't scale, right?
link |
00:32:58.540
Because if you have N computers,
link |
00:33:00.020
well, if that becomes two N computers,
link |
00:33:02.340
you actually still get the same amount of compute, right?
link |
00:33:05.420
This is like O(1) scaling
link |
00:33:09.120
because they all have to run it.
link |
00:33:10.140
Okay, fine, you get more blockchain security,
link |
00:33:12.840
but like, blockchain is already so secure.
link |
00:33:15.380
Can we trade some of that off for speed?
link |
00:33:17.780
So that's kind of what these L2 solutions are.
link |
00:33:20.500
They built this thing, which is kind of,
link |
00:33:23.020
kind of a sandbox for Ethereum contracts.
link |
00:33:26.300
So they can run it in this L2 world
link |
00:33:28.200
and it can't do certain things in the L1 world, in L1.
link |
00:33:30.900
Can I ask you for some definitions?
link |
00:33:32.420
What's L2?
link |
00:33:33.420
Oh, L2 is layer two.
link |
00:33:34.840
So L1 is like the base Ethereum chain.
link |
00:33:37.180
And then layer two is like a computational layer
link |
00:33:40.920
that runs elsewhere,
link |
00:33:44.420
but still is kind of secured by layer one.
link |
00:33:47.680
And I'm sure a lot of people know,
link |
00:33:49.720
but Ethereum is a cryptocurrency,
link |
00:33:51.940
probably one of the most popular cryptocurrency
link |
00:33:53.720
second to Bitcoin.
link |
00:33:55.320
And a lot of interesting technological innovation there.
link |
00:33:58.800
Maybe you could also slip in whenever you talk about this
link |
00:34:03.240
and things that are exciting to you in the Ethereum space.
link |
00:34:06.380
And why Ethereum?
link |
00:34:07.800
Well, I mean, Bitcoin is not Turing complete.
link |
00:34:12.260
Ethereum is not technically Turing complete
link |
00:34:13.820
with the gas limit, but close enough.
link |
00:34:16.040
With the gas limit?
link |
00:34:16.880
What's the gas limit, resources?
link |
00:34:19.080
Yeah, I mean, no computer is actually Turing complete.
link |
00:34:21.180
Right.
link |
00:34:23.160
You're gonna run out of RAM, you know?
link |
00:34:24.720
I can actually solve the whole thing.
link |
00:34:25.560
What's the word gas limit?
link |
00:34:26.720
You just have so many brilliant words.
link |
00:34:28.660
I'm not even gonna ask.
link |
00:34:29.500
That's not my word, that's Ethereum's word.
link |
00:34:32.220
Gas limit.
link |
00:34:33.060
Ethereum, you have to spend gas per instruction.
link |
00:34:35.360
So like different op codes use different amounts of gas
link |
00:34:37.820
and you buy gas with ether to prevent people
link |
00:34:40.680
from basically DDoSing the network.
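A minimal sketch of that gas idea (the opcode names and prices here are illustrative, not the real EVM fee schedule): every instruction debits gas that was bought with ether, and execution halts once the gas runs out, so nobody can make every node do unbounded work for free.

```python
# Toy gas metering: each opcode has a cost; execution stops when gas runs out.
GAS_COST = {"ADD": 3, "MUL": 5, "SLOAD": 200, "SSTORE": 20_000}  # illustrative numbers

def execute(program, gas_limit):
    gas = gas_limit
    for op in program:
        cost = GAS_COST[op]
        if cost > gas:
            raise RuntimeError(f"out of gas at {op}")
        gas -= cost
        # ... the op itself would be performed here ...
    return gas  # leftover gas

print(execute(["SLOAD", "ADD", "MUL", "SSTORE"], gas_limit=25_000))  # 4792 left over
```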
link |
00:34:42.740
So Bitcoin is proof of work.
link |
00:34:45.440
And then what's Ethereum?
link |
00:34:47.120
It's also proof of work.
link |
00:34:48.360
They're working on some proof of stake,
link |
00:34:49.960
Ethereum 2.0 stuff.
link |
00:34:51.040
But right now it's proof of work.
link |
00:34:52.720
It uses a different hash function from Bitcoin.
link |
00:34:54.700
That's more ASIC resistant, because you need RAM.
link |
00:34:57.340
So we're all talking about Ethereum 1.0.
link |
00:34:59.960
So what were they trying to do to scale this whole process?
link |
00:35:03.840
So they were like, well, if we could run contracts elsewhere
link |
00:35:07.800
and then only save the results of that computation,
link |
00:35:13.120
well, we don't actually have to do the compute on the chain.
link |
00:35:14.680
We can do the compute off chain
link |
00:35:15.680
and just post what the results are.
link |
00:35:17.440
Now, the problem with that is,
link |
00:35:18.800
well, somebody could lie about what the results are.
link |
00:35:21.120
So you need a resolution mechanism.
link |
00:35:23.240
And the resolution mechanism can be really expensive
link |
00:35:26.500
because you just have to make sure
link |
00:35:29.000
that the person who is saying,
link |
00:35:31.140
look, I swear that this is the real computation.
link |
00:35:33.800
I'm staking $10,000 on that fact.
link |
00:35:36.640
And if you prove it wrong,
link |
00:35:39.000
yeah, it might cost you $3,000 in gas fees to prove wrong,
link |
00:35:42.820
but you'll get the $10,000 bounty.
link |
00:35:44.800
So you can secure using those kinds of systems.
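A toy sketch of that dispute game (the names, bond size, and flow are illustrative, not Optimism's actual protocol): a result gets posted with a bond, anyone can re-execute the computation and challenge it, and a wrong poster forfeits the bond to the challenger.

```python
# Toy optimistic dispute: post a claimed result with a bond; a challenger
# re-runs the computation and takes the bond if the claim was a lie.
def run_contract(inputs):
    return sum(inputs)  # stand-in for the real contract execution

class Claim:
    def __init__(self, poster, inputs, claimed_result, bond=10_000):
        self.poster, self.inputs = poster, inputs
        self.claimed_result, self.bond = claimed_result, bond

def challenge(claim, challenger):
    actual = run_contract(claim.inputs)  # the expensive on-chain re-execution
    if actual != claim.claimed_result:
        return f"{challenger} wins the {claim.bond} bond from {claim.poster}"
    return f"challenge fails; {claim.poster}'s result stands"

claim = Claim("sequencer", [1, 2, 3], claimed_result=7)  # lying: the real answer is 6
print(challenge(claim, "watcher"))
```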
link |
00:35:47.740
So it's effectively a sandbox, which runs contracts.
link |
00:35:52.500
And like, it's like any kind of normal sandbox,
link |
00:35:55.600
you have to like replace syscalls
link |
00:35:57.960
with calls into the hypervisor.
link |
00:36:02.920
Sandbox, syscalls, hypervisor.
link |
00:36:05.080
What do these things mean?
link |
00:36:06.680
As long as it's interesting to talk about.
link |
00:36:09.240
Yeah, I mean, you can take like the Chrome sandbox
link |
00:36:11.280
is maybe the one to think about, right?
link |
00:36:12.720
So the Chrome process that's doing a rendering,
link |
00:36:16.000
can't, for example, read a file from the file system.
link |
00:36:18.800
It has, if it tries to make an open syscall in Linux,
link |
00:36:21.800
the open syscall, you can't make it open syscall,
link |
00:36:23.720
no, no, no.
link |
00:36:24.780
You have to request from the kind of hypervisor process
link |
00:36:29.160
or like, I don't know what it's called in Chrome,
link |
00:36:31.680
but the, hey, could you open this file for me?
link |
00:36:36.400
And then it does all these checks
link |
00:36:37.520
and then it passes the file handle back in
link |
00:36:39.200
if it's approved.
link |
00:36:41.160
So that's, yeah.
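A minimal sketch of that brokered pattern, with nothing Chrome-specific about it: the sandboxed worker never calls open itself; it asks a privileged broker, which applies a policy check and only then performs the real syscall and hands the file object back.

```python
# Toy sandbox broker: the worker asks for files instead of opening them directly.
ALLOWED_PREFIXES = ("/tmp/render/",)  # illustrative policy

def broker_open(path, mode="r"):
    # The privileged side: check policy, then do the real open() syscall.
    if not path.startswith(ALLOWED_PREFIXES) or ".." in path:
        raise PermissionError(f"sandbox policy denies {path!r}")
    return open(path, mode)

def sandboxed_worker(path):
    # The renderer side: no direct open(), only requests through the broker.
    with broker_open(path) as f:
        return f.read()

# sandboxed_worker("/etc/passwd") would raise PermissionError;
# sandboxed_worker("/tmp/render/page.html") would be allowed through.
```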
link |
00:36:42.640
So what's the, in the context of Ethereum,
link |
00:36:45.320
what are the boundaries of the sandbox
link |
00:36:47.200
that we're talking about?
link |
00:36:48.400
Well, like one of the calls that you,
link |
00:36:50.800
actually reading and writing any state
link |
00:36:53.960
to the Ethereum contract,
link |
00:36:55.480
or to the Ethereum blockchain.
link |
00:36:58.720
Writing state is one of those calls
link |
00:37:01.060
that you're going to have to sandbox in layer two,
link |
00:37:04.500
because if you let layer two just arbitrarily write
link |
00:37:08.120
to the Ethereum blockchain.
link |
00:37:09.480
So layer two is really sitting on top of layer one.
link |
00:37:15.120
So you're going to have a lot of different kinds of ideas
link |
00:37:17.160
that you can play with.
link |
00:37:18.680
And they're all, they're not fundamentally changing
link |
00:37:21.360
the source code level of Ethereum.
link |
00:37:25.140
Well, you have to replace a bunch of calls
link |
00:37:28.900
with calls into the hypervisor.
link |
00:37:31.120
So instead of doing the syscall directly,
link |
00:37:33.840
you replace it with a call to the hypervisor.
link |
00:37:37.360
So originally they were doing this
link |
00:37:39.320
by first running the, so Solidity is the language
link |
00:37:43.520
that most Ethereum contracts are written in.
link |
00:37:45.440
It compiles to a bytecode.
link |
00:37:47.320
And then they wrote this thing they called the transpiler.
link |
00:37:50.040
And the transpiler took the bytecode
link |
00:37:52.440
and it transpiled it into OVM safe bytecode.
link |
00:37:56.000
Basically bytecode that didn't make any
link |
00:37:57.520
of those restricted syscalls
link |
00:37:58.800
and added the calls to the hypervisor.
link |
00:38:01.680
This transpiler was a 3000 line mess.
link |
00:38:05.640
And it's hard to do.
link |
00:38:07.100
It's hard to do if you're trying to do it like that,
link |
00:38:09.060
because you have to kind of like deconstruct the bytecode,
link |
00:38:12.200
change things about it, and then reconstruct it.
link |
00:38:15.360
And I mean, as soon as I hear this, I'm like,
link |
00:38:17.760
well, why don't you just change the compiler, right?
link |
00:38:20.440
Why not the first place you build the bytecode,
link |
00:38:22.340
just do it in the compiler.
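As a rough sketch of what either approach amounts to (hypothetical opcode and hypervisor names, nothing like the real EVM encoding or the actual OVM tooling): the transpiler walks already-built bytecode and swaps each restricted state-access op for a call into the hypervisor, whereas doing the substitution inside the compiler emits the hypervisor call in the first place and skips the deconstruct-and-reconstruct step.

```python
# Toy rewrite pass: replace restricted state-access ops with hypervisor calls.
RESTRICTED = {"SSTORE": "hv_setStorage", "SLOAD": "hv_getStorage"}  # illustrative names

def rewrite(instructions):
    out = []
    for op, *args in instructions:
        if op in RESTRICTED:
            out.append(("CALL_HYPERVISOR", RESTRICTED[op], *args))
        else:
            out.append((op, *args))
    return out

program = [("PUSH", 1), ("PUSH", 2), ("ADD",), ("SSTORE", "slot0")]
print(rewrite(program))
# [('PUSH', 1), ('PUSH', 2), ('ADD',), ('CALL_HYPERVISOR', 'hv_setStorage', 'slot0')]
```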
link |
00:38:25.560
So yeah, I asked them how much they wanted it.
link |
00:38:29.160
Of course, measured in dollars and I'm like, well, okay.
link |
00:38:33.040
And yeah.
link |
00:38:34.600
And you wrote the compiler.
link |
00:38:35.920
Yeah, I modified, I wrote a 300 line diff to the compiler.
link |
00:38:39.040
It's open source, you can look at it.
link |
00:38:40.840
Yeah, it's, yeah, I looked at the code last night.
link |
00:38:43.000
It's, yeah, exactly.
link |
00:38:46.680
Cute is a good word for it.
link |
00:38:49.440
And it's C++.
link |
00:38:52.000
C++, yeah.
link |
00:38:54.320
So when asked how you were able to do it,
link |
00:38:57.040
you said, you just gotta think and then do it right.
link |
00:39:03.080
So can you break that apart a little bit?
link |
00:39:04.680
What's your process of one, thinking and two, doing it right?
link |
00:39:09.920
You know, the people that I was working for
link |
00:39:12.440
were amused that I said that.
link |
00:39:13.480
It doesn't really mean anything.
link |
00:39:14.800
Okay.
link |
00:39:16.480
I mean, is there some deep, profound insights
link |
00:39:19.640
to draw from like how you problem solve from that?
link |
00:39:23.600
This is always what I say.
link |
00:39:24.640
I'm like, do you wanna be a good programmer?
link |
00:39:26.060
Do it for 20 years.
link |
00:39:27.840
Yeah, there's no shortcuts.
link |
00:39:29.600
No.
link |
00:39:31.360
What are your thoughts on crypto in general?
link |
00:39:33.440
So what parts technically or philosophically
link |
00:39:38.080
do you find especially beautiful maybe?
link |
00:39:40.040
Oh, I'm extremely bullish on crypto longterm.
link |
00:39:42.800
Not any specific crypto project, but this idea of,
link |
00:39:48.680
well, two ideas.
link |
00:39:50.320
One, the Nakamoto Consensus Algorithm
link |
00:39:54.240
is I think one of the greatest innovations
link |
00:39:57.320
of the 21st century.
link |
00:39:58.600
This idea that people can reach consensus.
link |
00:40:01.260
You can reach a group consensus.
link |
00:40:03.400
Using a relatively straightforward algorithm is wild.
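The primitive underneath that consensus is proof of work plus a longest-chain rule; a minimal sketch of the mining half (toy difficulty and block format, nothing like real Bitcoin parameters):

```python
import hashlib

def mine(block_data, difficulty=4):
    # Find a nonce whose SHA-256 hash starts with `difficulty` zero hex digits.
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}|{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

nonce, digest = mine("prev_hash|tx1,tx2,tx3")
print(nonce, digest)
# Finding the nonce takes ~16^4 tries; verifying it takes one hash.
# Nodes then accept the chain with the most accumulated work as the shared history.
```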
link |
00:40:08.220
And like, you know, Satoshi Nakamoto,
link |
00:40:14.120
people always ask me who I look up to.
link |
00:40:15.580
It's like, whoever that is.
link |
00:40:17.800
Who do you think it is?
link |
00:40:19.120
I mean, Elon Musk?
link |
00:40:21.020
Is it you?
link |
00:40:22.320
It is definitely not me.
link |
00:40:24.040
And I do not think it's Elon Musk.
link |
00:40:26.060
But yeah, this idea of groups reaching consensus
link |
00:40:31.060
in a decentralized yet formulaic way
link |
00:40:34.940
is one extremely powerful idea from crypto.
link |
00:40:40.540
Maybe the second idea is this idea of smart contracts.
link |
00:40:45.860
When you write a contract between two parties,
link |
00:40:49.100
any contract, this contract, if there are disputes,
link |
00:40:53.720
it's interpreted by lawyers.
link |
00:40:56.140
Lawyers are just really shitty overpaid interpreters.
link |
00:41:00.020
Imagine you had, let's talk about them in terms of a,
link |
00:41:02.060
in terms of like, let's compare a lawyer to Python, right?
link |
00:41:05.260
So lawyer, well, okay.
link |
00:41:07.140
That's really, I never thought of it that way.
link |
00:41:10.020
It's hilarious.
link |
00:41:11.620
So Python, I'm paying even 10 cents an hour.
link |
00:41:15.900
I'll use the nice Azure machine.
link |
00:41:17.180
I can run Python for 10 cents an hour.
link |
00:41:19.660
Lawyers cost $1,000 an hour.
link |
00:41:21.480
So Python is 10,000 X better on that axis.
link |
00:41:25.900
Lawyers don't always return the same answer.
link |
00:41:31.100
Python almost always does.
link |
00:41:36.700
Cost.
link |
00:41:37.740
Yeah, I mean, just cost, reliability,
link |
00:41:40.940
everything about Python is so much better than lawyers.
link |
00:41:43.860
So if you can make smart contracts,
link |
00:41:46.100
this whole concept of code is law.
link |
00:41:50.400
I love, and I would love to live in a world
link |
00:41:53.180
where everybody accepted that fact.
link |
00:41:55.620
So maybe you can talk about what smart contracts are.
link |
00:42:01.180
So let's say, let's say, you know,
link |
00:42:05.020
we have a, even something as simple
link |
00:42:08.700
as a safety deposit box, right?
link |
00:42:11.700
Safety deposit box that holds a million dollars.
link |
00:42:14.780
I have a contract with the bank that says
link |
00:42:17.040
two out of these three parties must be present
link |
00:42:22.620
to open the safety deposit box and get the money out.
link |
00:42:25.280
So that's a contract for the bank,
link |
00:42:26.300
and it's only as good as the bank and the lawyers, right?
link |
00:42:29.020
Let's say, you know, somebody dies and now,
link |
00:42:32.820
oh, we're gonna go through a big legal dispute
link |
00:42:34.580
about whether, oh, well, was it in the will,
link |
00:42:36.140
was it not in the will?
link |
00:42:37.540
What, like, it's just so messy,
link |
00:42:39.780
and the cost to determine truth is so expensive.
link |
00:42:44.620
Versus a smart contract, which just uses cryptography
link |
00:42:47.220
to check if two out of three keys are present.
link |
00:42:50.100
Well, I can look at that, and I can have certainty
link |
00:42:53.860
in the answer that it's going to return.
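As a sketch of that safety deposit box logic: the "contract" only has to check that at least two of the three registered keys signed the withdrawal. The snippet below is a toy illustration, not an actual on-chain contract; it uses HMAC as a stand-in for real public key signatures, and the key and message values are made up.

```python
import hmac
import hashlib

def sign(key: bytes, message: bytes) -> bytes:
    # Stand-in for a real digital signature (e.g. ECDSA on Ethereum).
    return hmac.new(key, message, hashlib.sha256).digest()

def two_of_three_ok(keys: list[bytes], message: bytes, signatures: list[bytes]) -> bool:
    # The "smart contract": release the funds only if at least 2 of the 3
    # registered keys produced a valid signature over this withdrawal message.
    valid = 0
    for key in keys:
        expected = sign(key, message)
        if any(hmac.compare_digest(expected, s) for s in signatures):
            valid += 1
    return valid >= 2

# Example: parties A and B sign, party C is absent -- the box opens.
keys = [b"key-A", b"key-B", b"key-C"]
msg = b"withdraw 1,000,000 to account X"
sigs = [sign(b"key-A", msg), sign(b"key-B", msg)]
assert two_of_three_ok(keys, msg, sigs)
```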
link |
00:42:55.980
And that's what, all businesses want is certainty.
link |
00:42:58.220
You know, they say businesses don't care.
link |
00:42:59.620
Viacom, YouTube, YouTube's like,
link |
00:43:02.020
look, we don't care which way this lawsuit goes.
link |
00:43:04.100
Just please tell us so we can have certainty.
link |
00:43:07.220
Yeah, I wonder how many agreements in this,
link |
00:43:09.100
because we're talking about financial transactions only
link |
00:43:12.720
in this case, correct, the smart contracts?
link |
00:43:15.620
Oh, you can go to anything.
link |
00:43:17.380
You can put a prenup in the Ethereum blockchain.
link |
00:43:21.820
A married smart contract?
link |
00:43:23.020
Sorry, divorce lawyer, sorry.
link |
00:43:24.900
You're going to be replaced by Python.
link |
00:43:29.580
Okay, so that's another beautiful idea.
link |
00:43:34.980
Do you think there's something that's appealing to you
link |
00:43:37.700
about any one specific implementation?
link |
00:43:40.260
So if you look 10, 20, 50 years down the line,
link |
00:43:45.100
do you see any, like, Bitcoin, Ethereum,
link |
00:43:48.140
any of the other hundreds of cryptocurrencies winning out?
link |
00:43:51.140
Is there, like, what's your intuition about the space?
link |
00:43:53.340
Or are you just sitting back and watching the chaos
link |
00:43:55.380
and look who cares what emerges?
link |
00:43:57.380
Oh, I don't.
link |
00:43:58.220
I don't speculate.
link |
00:43:59.060
I don't really care.
link |
00:43:59.900
I don't really care which one of these projects wins.
link |
00:44:02.620
I'm kind of in the Bitcoin as a meme coin camp.
link |
00:44:05.500
I mean, why does Bitcoin have value?
link |
00:44:07.000
It's technically kind of, you know,
link |
00:44:11.740
not great, like the block size debate.
link |
00:44:14.380
When I found out what the block size debate was,
link |
00:44:16.140
I'm like, are you guys kidding?
link |
00:44:18.020
What's the block size debate?
link |
00:44:21.180
You know what?
link |
00:44:22.020
It's really, it's too stupid to even talk about.
link |
00:44:23.820
People can look it up, but I'm like, wow.
link |
00:44:27.140
You know, Ethereum seems,
link |
00:44:28.420
the governance of Ethereum seems much better.
link |
00:44:31.300
I've come around a bit on proof of stake ideas.
link |
00:44:35.360
You know, very smart people thinking about some things.
link |
00:44:37.740
Yeah, you know, governance is interesting.
link |
00:44:40.340
It does feel like Vitalik,
link |
00:44:44.100
like it does feel like an open,
link |
00:44:46.100
even in these distributed systems,
link |
00:44:48.020
leaders are helpful
link |
00:44:51.220
because they kind of help you drive the mission
link |
00:44:54.580
and the vision and they put a face to a project.
link |
00:44:58.420
It's a weird thing about us humans.
link |
00:45:00.140
Geniuses are helpful, like Vitalik.
link |
00:45:02.820
Yeah, brilliant.
link |
00:45:06.540
Leaders are not necessarily, yeah.
link |
00:45:10.320
So you think the reason he's the face of Ethereum
link |
00:45:15.340
is because he's a genius.
link |
00:45:17.100
That's interesting.
link |
00:45:18.060
I mean, that was,
link |
00:45:21.460
it's interesting to think about
link |
00:45:22.940
that we need to create systems
link |
00:45:25.860
in which the quote unquote leaders that emerge
link |
00:45:30.340
are the geniuses in the system.
link |
00:45:33.060
I mean, that's arguably why
link |
00:45:35.000
the current state of democracy is broken
link |
00:45:36.980
is the people who are emerging as the leaders
link |
00:45:39.420
are not the most competent,
link |
00:45:40.980
are not the superstars of the system.
link |
00:45:43.340
And it seems like at least for now
link |
00:45:45.020
in the crypto world oftentimes
link |
00:45:47.220
the leaders are the superstars.
link |
00:45:49.380
Imagine at the debate they asked,
link |
00:45:51.720
what's the sixth amendment?
link |
00:45:53.640
What are the four fundamental forces in the universe?
link |
00:45:56.300
What's the integral of two to the X?
link |
00:45:59.940
I'd love to see those questions asked
link |
00:46:01.740
and that's what I want as our leader.
link |
00:46:03.380
It's a little bit.
link |
00:46:04.220
What's Bayes rule?
link |
00:46:07.160
Yeah, I mean, even, oh wow, you're hurting my brain.
link |
00:46:10.900
It's that my standard was even lower
link |
00:46:15.020
but I would have loved to see
link |
00:46:17.660
just this basic brilliance.
link |
00:46:20.600
Like I've talked to historians.
link |
00:46:22.180
There's just these, they're not even like
link |
00:46:23.900
they don't have a PhD or even an education in history.
link |
00:46:26.720
They're just like a Dan Carlin type character
link |
00:46:30.040
who just like, holy shit.
link |
00:46:32.780
How did all this information get into your head?
link |
00:46:35.420
They're able to just connect Genghis Khan
link |
00:46:38.460
to the entirety of the history of the 20th century.
link |
00:46:41.820
They know everything about every single battle that happened
link |
00:46:46.180
and they know the Game of Thrones
link |
00:46:51.020
of the different power plays and all that happened there.
link |
00:46:55.100
And they know like the individuals
link |
00:46:56.700
and all the documents involved
link |
00:46:58.580
and they integrate that into their regular life.
link |
00:47:02.060
It's not like they're ultra history nerds.
link |
00:47:03.980
They're just, they know this information.
link |
00:47:06.380
That's what competence looks like.
link |
00:47:08.020
Yeah.
link |
00:47:09.060
Cause I've seen that with programmers too, right?
link |
00:47:10.660
That's what great programmers do.
link |
00:47:12.580
But yeah, it would be, it's really unfortunate
link |
00:47:15.780
that those kinds of people aren't emerging as our leaders.
link |
00:47:19.340
But for now, at least in the crypto world
link |
00:47:21.820
that seems to be the case.
link |
00:47:23.260
I don't know if that always, you could imagine
link |
00:47:26.700
that in a hundred years, it's not the case, right?
link |
00:47:28.820
Crypto world has one very powerful idea going for it
link |
00:47:31.940
and that's the idea of forks, right?
link |
00:47:35.020
I mean, imagine, we'll use a less controversial example.
link |
00:47:42.940
This was actually in my joke app in 2012.
link |
00:47:47.140
I was like, Barack Obama, Mitt Romney,
link |
00:47:49.460
let's let them both be president, right?
link |
00:47:51.060
Like imagine we could fork America
link |
00:47:52.940
and just let them both be president.
link |
00:47:54.500
And then the Americas could compete
link |
00:47:56.060
and people could invest in one,
link |
00:47:58.180
pull their liquidity out of one, put it in the other.
link |
00:48:00.500
You have this in the crypto world.
link |
00:48:02.420
Ethereum forks into Ethereum and Ethereum Classic.
link |
00:48:05.540
And you can pull your liquidity out of one
link |
00:48:07.340
and put it in another.
link |
00:48:08.980
And people vote with their dollars,
link |
00:48:11.980
which forks, companies should be able to fork.
link |
00:48:16.420
I'd love to fork Nvidia, you know?
link |
00:48:20.260
Yeah, like different business strategies
link |
00:48:22.780
and then try them out and see what works.
link |
00:48:26.180
Like even take, yeah, take comma AI that closes its source
link |
00:48:34.780
and then take one that's open source and see what works.
link |
00:48:38.300
Take one that's purchased by GM
link |
00:48:41.060
and one that remains Android Renegade
link |
00:48:43.540
and all these different versions and see.
link |
00:48:45.300
The beauty of comma AI is someone can actually do that.
link |
00:48:47.900
Please take comma AI and fork it.
link |
00:48:50.620
That's right, that's the beauty of open source.
link |
00:48:53.020
So you're, I mean, we'll talk about autonomous vehicle space,
link |
00:48:56.420
but it does seem that you're really knowledgeable
link |
00:49:02.100
about a lot of different topics.
link |
00:49:03.940
So the natural question a bunch of people ask this,
link |
00:49:06.100
which is how do you keep learning new things?
link |
00:49:09.980
Do you have like practical advice
link |
00:49:12.420
if you were to introspect, like taking notes,
link |
00:49:15.820
allocate time, or do you just mess around
link |
00:49:19.220
and just allow your curiosity to drive?
link |
00:49:21.340
I'll write these people a self help book
link |
00:49:23.060
and I'll charge $67 for it.
link |
00:49:25.060
And I will write, I will write,
link |
00:49:28.500
I will write on the cover of the self help book.
link |
00:49:30.300
All of this advice is completely meaningless.
link |
00:49:32.500
You're gonna be a sucker and buy this book anyway.
link |
00:49:34.740
And the one lesson that I hope they take away from the book
link |
00:49:38.860
is that I can't give you a meaningful answer to that.
link |
00:49:42.580
That's interesting.
link |
00:49:44.020
Let me translate that.
link |
00:49:45.420
It's that you haven't really thought about what it is you do
link |
00:49:51.580
systematically because you could reduce it.
link |
00:49:53.900
And there's some people, I mean, I've met brilliant people
link |
00:49:56.980
that this is really clear with athletes.
link |
00:50:00.180
Some are just, you know, the best in the world
link |
00:50:03.580
at something and they have zero interest
link |
00:50:06.820
in writing like a self help book,
link |
00:50:09.500
or how to master this game.
link |
00:50:11.620
And then there's some athletes who become great coaches
link |
00:50:15.620
and they love the analysis, perhaps the over analysis.
link |
00:50:18.820
And you right now, at least at your age,
link |
00:50:20.740
which is an interesting, you're in the middle of the battle.
link |
00:50:23.180
You're like the warriors that have zero interest
link |
00:50:25.620
in writing books.
link |
00:50:27.820
So you're in the middle of the battle.
link |
00:50:29.140
So you have, yeah.
link |
00:50:30.580
This is a fair point.
link |
00:50:31.740
I do think I have a certain aversion
link |
00:50:34.060
to this kind of deliberate intentional way of living life.
link |
00:50:39.060
You're eventually, the hilarity of this,
link |
00:50:41.820
especially since this is recorded,
link |
00:50:43.620
it will reveal beautifully the absurdity
link |
00:50:47.620
when you finally do publish this book.
link |
00:50:49.620
I guarantee you, you will.
link |
00:50:51.540
The story of comma AI, maybe it'll be a biography
link |
00:50:56.060
written about you.
link |
00:50:57.060
That'll be better, I guess.
link |
00:50:58.740
And you might be able to learn some cute lessons
link |
00:51:00.740
if you're starting a company like comma AI from that book.
link |
00:51:03.700
But if you're asking generic questions,
link |
00:51:05.660
like how do I be good at things?
link |
00:51:07.740
How do I be good at things?
link |
00:51:10.260
Dude, I don't know.
link |
00:51:11.900
Do them a lot.
link |
00:51:14.380
Do them a lot.
link |
00:51:15.220
But the interesting thing here is learning things
link |
00:51:18.580
outside of your current trajectory,
link |
00:51:22.020
which is what it feels like from an outsider's perspective.
link |
00:51:28.140
I don't know if there's advice on that,
link |
00:51:30.900
but it is an interesting curiosity.
link |
00:51:33.340
When you become really busy, you're running a company.
link |
00:51:38.020
Hard time.
link |
00:51:40.340
Yeah.
link |
00:51:41.580
But there's a natural inclination and trend.
link |
00:51:46.100
Just the momentum of life carries you
link |
00:51:48.980
into a particular direction of wanting to focus.
link |
00:51:51.060
And this kind of dispersion that curiosity can lead to
link |
00:51:55.140
gets harder and harder with time.
link |
00:51:58.220
Because you get really good at certain things
link |
00:52:00.940
and it sucks trying things that you're not good at,
link |
00:52:03.820
like trying to figure them out.
link |
00:52:05.340
When you do this with your live streams,
link |
00:52:07.380
you're on the fly figuring stuff out.
link |
00:52:10.060
You don't mind looking dumb.
link |
00:52:11.780
No.
link |
00:52:14.140
You just figure it out pretty quickly.
link |
00:52:16.660
Sometimes I try things and I don't figure them out quickly.
link |
00:52:19.180
My chess rating is like a 1400,
link |
00:52:20.980
despite putting like a couple of hundred hours in.
link |
00:52:23.300
It's pathetic.
link |
00:52:24.340
I mean, to be fair, I know that I could do it better
link |
00:52:26.860
if I did it better.
link |
00:52:27.700
Like don't play five minute games,
link |
00:52:29.860
play 15 minute games at least.
link |
00:52:31.380
Like I know these things, but it just doesn't,
link |
00:52:34.260
it doesn't stick nicely in my knowledge stream.
link |
00:52:37.260
All right, let's talk about Comma AI.
link |
00:52:39.300
What's the mission of the company?
link |
00:52:42.140
Let's like look at the biggest picture.
link |
00:52:44.860
Oh, I have an exact statement.
link |
00:52:46.820
Solve self driving cars
link |
00:52:48.660
while delivering shippable intermediaries.
link |
00:52:51.540
So longterm vision is have fully autonomous vehicles
link |
00:52:56.300
and make sure you're making money along the way.
link |
00:52:59.060
I think it doesn't really speak to money,
link |
00:53:00.220
but I can talk about what solve self driving cars means.
link |
00:53:03.260
Solve self driving cars of course means
link |
00:53:06.700
you're not building a new car,
link |
00:53:08.020
you're building a person replacement.
link |
00:53:10.460
That person can sit in the driver's seat
link |
00:53:12.340
and drive you anywhere a person can drive
link |
00:53:14.580
with a human or better level of safety,
link |
00:53:17.980
speed, quality, comfort.
link |
00:53:21.420
And what's the second part of that?
link |
00:53:23.260
Delivering shippable intermediaries is well,
link |
00:53:26.620
it's a way to fund the company, that's true.
link |
00:53:28.180
But it's also a way to keep us honest.
link |
00:53:31.980
If you don't have that, it is very easy
link |
00:53:34.900
with this technology to think you're making progress
link |
00:53:39.020
when you're not.
link |
00:53:40.180
I've heard it best described on Hacker News as
link |
00:53:43.420
you can set any arbitrary milestone,
link |
00:53:46.660
meet that milestone and still be infinitely far away
link |
00:53:49.500
from solving self driving cars.
link |
00:53:51.740
So it's hard to have like real deadlines
link |
00:53:53.820
when you're like Cruz or Waymo when you don't have revenue.
link |
00:54:02.100
Is that, I mean, is revenue essentially
link |
00:54:06.060
the thing we're talking about here?
link |
00:54:07.820
Revenue is, capitalism is based around consent.
link |
00:54:11.460
Capitalism, the way that you get revenue
link |
00:54:13.100
is real capitalism. I come in the real capitalism camp.
link |
00:54:16.340
There's definitely scams out there,
link |
00:54:17.340
but real capitalism is based around consent.
link |
00:54:19.580
It's based around this idea that like,
link |
00:54:20.740
if we're getting revenue, it's because we're providing
link |
00:54:22.980
at least that much value to another person.
link |
00:54:24.780
When someone buys $1,000 comma two from us,
link |
00:54:27.460
we're providing them at least $1,000 of value
link |
00:54:29.100
or they wouldn't buy it.
link |
00:54:30.180
Brilliant, so can you give a whirlwind overview
link |
00:54:32.940
of the products that Comma AI provides,
link |
00:54:34.940
like throughout its history and today?
link |
00:54:38.180
I mean, yeah, the past ones aren't really that interesting.
link |
00:54:40.780
It's kind of just been refinement of the same idea.
link |
00:54:45.180
The real only product we sell today is the Comma 2.
link |
00:54:48.300
Which is a piece of hardware with cameras.
link |
00:54:50.500
Mm, so the Comma 2, I mean, you can think about it
link |
00:54:54.700
kind of like a person.
link |
00:54:57.100
Future hardware will probably be
link |
00:54:58.340
even more and more personlike.
link |
00:55:00.300
So it has eyes, ears, a mouth, a brain,
link |
00:55:07.460
and a way to interface with the car.
link |
00:55:09.540
Does it have consciousness?
link |
00:55:10.980
Just kidding, that was a trick question.
link |
00:55:13.500
I don't have consciousness either.
link |
00:55:15.140
Me and the Comma 2 are the same.
link |
00:55:16.420
You're the same?
link |
00:55:17.260
I have a little more compute than it.
link |
00:55:18.660
It only has like the same compute as a bee, you know.
link |
00:55:23.260
You're more efficient energy wise
link |
00:55:25.300
for the compute you're doing.
link |
00:55:26.460
Far more efficient energy wise.
link |
00:55:29.020
20 petaflops, 20 watts, crazy.
link |
00:55:30.580
Do you lack consciousness?
link |
00:55:32.220
Sure.
link |
00:55:33.060
Do you fear death?
link |
00:55:33.900
You do, you want immortality.
link |
00:55:35.500
Of course I fear death.
link |
00:55:36.340
Does Comma AI fear death?
link |
00:55:38.100
I don't think so.
link |
00:55:39.580
Of course it does.
link |
00:55:40.500
It very much fears, well, it fears negative loss.
link |
00:55:42.980
Oh yeah.
link |
00:55:43.820
Okay, so Comma 2, when did that come out?
link |
00:55:49.220
That was a year ago?
link |
00:55:50.500
No, two.
link |
00:55:52.100
Early this year.
link |
00:55:53.540
Wow, time, it feels like, yeah.
link |
00:55:56.900
2020 feels like it's taken 10 years to get to the end.
link |
00:56:00.820
It's a long year.
link |
00:56:01.820
It's a long year.
link |
00:56:03.140
So what's the sexiest thing about Comma 2 feature wise?
link |
00:56:08.140
So, I mean, maybe you can also linger on like, what is it?
link |
00:56:14.340
Like what's its purpose?
link |
00:56:15.740
Cause there's a hardware, there's a software component.
link |
00:56:18.540
You've mentioned the sensors,
link |
00:56:20.020
but also like what is it, its features and capabilities?
link |
00:56:23.060
I think our slogan summarizes it well.
link |
00:56:25.340
Comma slogan is make driving chill.
link |
00:56:28.860
I love it, okay.
link |
00:56:30.660
Yeah, I mean, it is, you know,
link |
00:56:33.020
if you like cruise control, imagine cruise control,
link |
00:56:35.340
but much, much more.
link |
00:56:36.660
So it can do adaptive cruise control things,
link |
00:56:41.020
which is like slow down for cars in front of it,
link |
00:56:42.980
maintain a certain speed.
link |
00:56:44.220
And it can also do lane keeping.
link |
00:56:46.260
So staying in the lane and doing it better
link |
00:56:48.700
and better and better over time.
link |
00:56:50.020
It's very much machine learning based.
link |
00:56:53.060
So this camera is, there's a driver facing camera too.
link |
00:57:01.100
What else is there?
link |
00:57:02.100
What am I thinking?
link |
00:57:02.940
So the hardware versus software.
link |
00:57:04.380
So open pilot versus the actual hardware of the device.
link |
00:57:09.020
What's, can you draw that distinction?
link |
00:57:10.580
What's one, what's the other?
link |
00:57:11.580
I mean, the hardware is pretty much a cell phone
link |
00:57:13.820
with a few additions.
link |
00:57:14.700
A cell phone with a cooling system
link |
00:57:16.940
and with a car interface connected to it.
link |
00:57:20.980
And by cell phone, you mean like Qualcomm Snapdragon.
link |
00:57:25.620
Yeah, the current hardware is a Snapdragon 821.
link |
00:57:29.380
It has wifi radio, it has an LTE radio, it has a screen.
link |
00:57:32.180
We use every part of the cell phone.
link |
00:57:35.980
And then the interface with the car
link |
00:57:37.540
is specific to the car.
link |
00:57:38.460
So you keep supporting more and more cars.
link |
00:57:41.460
Yeah, so the interface to the car,
link |
00:57:42.660
I mean, the device itself just has four CAN buses.
link |
00:57:45.340
It has four CAN interfaces on it
link |
00:57:46.860
that are connected through the USB port to the phone.
link |
00:57:49.420
And then, yeah, on those four CAN buses,
link |
00:57:53.300
you connect it to the car.
link |
00:57:54.460
And there's a little harness to do this.
link |
00:57:56.460
Cars are actually surprisingly similar.
link |
00:57:58.460
So CAN is the protocol by which cars communicate.
link |
00:58:01.500
And then you're able to read stuff and write stuff
link |
00:58:04.260
to be able to control the car depending on the car.
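As a rough illustration of what reading and writing on a CAN bus looks like, here is a minimal sketch using the third-party python-can library over Linux SocketCAN. This is generic example code, not comma's harness software; the channel name, arbitration ID, and payload bytes are placeholders, and the real message formats differ per car.

```python
import can  # third-party python-can library

# Open one of the CAN interfaces (on Linux, e.g. a SocketCAN device named can0).
bus = can.interface.Bus(channel="can0", bustype="socketcan")

# Read: every frame has an arbitration ID and up to 8 data bytes.
msg = bus.recv(timeout=1.0)
if msg is not None:
    print(f"id=0x{msg.arbitration_id:x} data={msg.data.hex()}")

# Write: sending a frame with a made-up ID and payload; on a real car the ID
# and byte layout (steering torque, gas, brake, checksums) depend on the model.
cmd = can.Message(arbitration_id=0x123, data=[0x00, 0x10, 0x00, 0x00], is_extended_id=False)
bus.send(cmd)
```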
link |
00:58:06.900
So what's the software side?
link |
00:58:08.260
What's OpenPilot?
link |
00:58:10.380
So I mean, OpenPilot is,
link |
00:58:11.740
the hardware is pretty simple compared to OpenPilot.
link |
00:58:13.740
OpenPilot is, well, so you have a machine learning model,
link |
00:58:21.020
which it's in OpenPilot, it's a blob.
link |
00:58:24.540
It's just a blob of weights.
link |
00:58:25.620
It's not like people are like, oh, it's closed source.
link |
00:58:27.420
I'm like, it's a blob of weights.
link |
00:58:28.860
What do you expect?
link |
00:58:29.780
So it's primarily neural network based.
link |
00:58:33.380
You, well, OpenPilot is all the software
link |
00:58:36.020
kind of around that neural network.
link |
00:58:37.660
That if you have a neural network that says,
link |
00:58:39.060
here's where you wanna send the car,
link |
00:58:40.860
OpenPilot actually goes and executes all of that.
link |
00:58:44.780
It cleans up the input to the neural network.
link |
00:58:46.900
It cleans up the output and executes on it.
link |
00:58:49.060
So it connects, it's the glue
link |
00:58:50.860
that connects everything together.
link |
00:58:51.860
Runs the sensors, does a bunch of calibration
link |
00:58:54.420
for the neural network, deals with like,
link |
00:58:58.100
if the car is on a banked road,
link |
00:59:00.140
you have to counter steer against that.
link |
00:59:02.060
And the neural network can't necessarily know that
link |
00:59:03.820
by looking at the picture.
link |
00:59:06.420
So you do that with other sensors
link |
00:59:08.100
and Fusion and Localizer.
link |
00:59:09.900
OpenPilot also is responsible
link |
00:59:11.780
for sending the data up to our servers.
link |
00:59:14.780
So we can learn from it, logging it, recording it,
link |
00:59:17.940
running the cameras, thermally managing the device,
link |
00:59:21.500
managing the disk space on the device,
link |
00:59:23.180
managing all the resources on the device.
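Paraphrased as code, the glue described here is essentially a real-time loop around the model. The sketch below is hypothetical pseudocode rather than OpenPilot's actual source; every object and function name in it is invented to show the shape of the pipeline.

```python
def control_loop(camera, sensors, model, planner, car, logger, system):
    """Hypothetical sketch of the glue around the driving model; every name
    here is invented for illustration, it is not OpenPilot's actual code."""
    while True:
        frame = camera.get_frame()                    # run the cameras
        frame = sensors.calibrate(frame)              # clean up the model's input
        desired_path = model.run(frame)               # neural net: where to send the car

        # Clean up the output: things the picture alone can't tell the network,
        # like counter-steering against a banked road, come from other sensors
        # (IMU, localizer) fused in here.
        bank_angle = sensors.estimate_bank_angle()
        steer, gas, brake = planner.plan(desired_path, bank_angle, car.state)

        car.send_can_commands(steer, gas, brake)      # execute on the plan
        logger.record(frame, desired_path, car.state) # log and upload for training
        system.manage_thermals_and_disk()             # housekeeping on the device
```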
link |
00:59:24.780
So what, since we last spoke,
link |
00:59:26.860
I don't remember when, maybe a year ago,
link |
00:59:28.460
maybe a little bit longer,
link |
00:59:30.180
how has OpenPilot improved?
link |
00:59:33.100
We did exactly what I promised you.
link |
00:59:34.860
I promised you that by the end of the year,
link |
00:59:36.760
where you'd be able to remove the lanes.
link |
00:59:40.500
The lateral policy is now almost completely end to end.
link |
00:59:46.060
You can turn the lanes off and it will drive,
link |
00:59:48.600
drive slightly worse on the highway
link |
00:59:49.980
if you turn the lanes off,
link |
00:59:51.060
but you can turn the lanes off and it will drive well,
link |
00:59:54.260
trained completely end to end on user data.
link |
00:59:57.220
And this year we hope to do the same
link |
00:59:58.700
for the longitudinal policy.
link |
01:00:00.140
So that's the interesting thing is you're not doing,
link |
01:00:03.380
you don't appear to be, maybe you can correct me,
link |
01:00:05.380
you don't appear to be doing lane detection
link |
01:00:08.700
or lane marking detection or kind of the segmentation task
link |
01:00:12.420
or any kind of object detection task.
link |
01:00:15.100
You're doing what's traditionally more called
link |
01:00:17.580
like end to end learning.
link |
01:00:19.420
So, and trained on actual behavior of drivers
link |
01:00:24.180
when they're driving the car manually.
link |
01:00:27.780
And this is hard to do.
link |
01:00:29.820
It's not supervised learning.
link |
01:00:32.220
Yeah, but so the nice thing is there's a lot of data.
link |
01:00:34.740
So it's hard and easy, right?
link |
01:00:37.060
It's a...
link |
01:00:37.900
We have a lot of high quality data, yeah.
link |
01:00:40.020
Like more than you need in the second.
link |
01:00:41.700
Well...
link |
01:00:42.700
We have way more than we need.
link |
01:00:43.520
We have way more data than we need.
link |
01:00:44.980
I mean, it's an interesting question actually,
link |
01:00:47.040
because in terms of amount, you have more than you need,
link |
01:00:50.440
but the driving is full of edge cases.
link |
01:00:54.260
So how do you select the data you train on?
link |
01:00:58.260
I think this is an interesting open question.
link |
01:01:00.540
Like what's the cleverest way to select data?
link |
01:01:04.220
That's the question Tesla is probably working on.
link |
01:01:07.660
That's, I mean, the entirety of machine learning can be,
link |
01:01:09.900
they don't seem to really care.
link |
01:01:11.060
They just kind of select data.
link |
01:01:12.260
But I feel like that if you want to solve,
link |
01:01:14.860
if you want to create intelligent systems,
link |
01:01:16.220
you have to pick data well, right?
link |
01:01:18.900
And so do you have any hints, ideas of how to do it well?
link |
01:01:22.900
So in some ways that is...
link |
01:01:25.140
The definition I like of reinforcement learning
link |
01:01:27.400
versus supervised learning.
link |
01:01:29.300
In supervised learning, the weights depend on the data.
link |
01:01:32.720
Right?
link |
01:01:34.560
And this is obviously true,
link |
01:01:35.700
but in reinforcement learning,
link |
01:01:38.320
the data depends on the weights.
link |
01:01:40.420
Yeah.
link |
01:01:41.380
And actually both ways.
link |
01:01:42.540
That's poetry.
link |
01:01:43.980
So how does it know what data to train on?
link |
01:01:46.260
Well, let it pick.
link |
01:01:47.540
We're not there yet, but that's the eventual.
link |
01:01:49.460
So you're thinking this almost like
link |
01:01:51.100
a reinforcement learning framework.
link |
01:01:53.320
We're going to do RL on the world.
link |
01:01:55.380
Every time a car makes a mistake, user disengages,
link |
01:01:58.140
we train on that and do RL on the world.
link |
01:02:00.140
Ship out a new model, that's an epoch, right?
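That drive, disengage, retrain, ship cycle can be sketched as a loop. The outline below is a hypothetical illustration of the idea as described, not comma's pipeline; all names in it are invented.

```python
def fleet_epoch(fleet, dataset, model):
    """One 'epoch' of RL on the world, as described: a hypothetical sketch
    with invented names, not comma's actual pipeline."""
    # 1. The current policy generates the data (the data depends on the weights).
    logs = fleet.collect_drives(model)

    # 2. Mistakes are labeled by the users themselves: a disengagement marks a
    #    segment where the model did something a human had to correct.
    mistakes = [segment for segment in logs if segment.disengaged]
    corrections = [segment.human_driving for segment in mistakes]

    # 3. Train on the corrections (plus ordinary driving) and ship a new model
    #    over the air; that shipped model then generates the next round of data.
    dataset.extend(corrections)
    model.train(dataset)
    fleet.ship_over_the_air(model)
    return model
```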
link |
01:02:03.260
And for now you're not doing the Elon style promising
link |
01:02:08.540
that it's going to be fully autonomous.
link |
01:02:09.700
You really are sticking to level two
link |
01:02:12.420
and like it's supposed to be supervised.
link |
01:02:15.300
It is definitely supposed to be supervised
link |
01:02:16.900
and we enforce the fact that it's supervised.
link |
01:02:19.740
We look at our rate of improvement in disengagements.
link |
01:02:23.660
OpenPilot now has an unplanned disengagement
link |
01:02:25.700
about every a hundred miles.
link |
01:02:27.440
This is up from 10 miles, like maybe,
link |
01:02:32.580
maybe a year ago.
link |
01:02:36.560
Yeah.
link |
01:02:37.400
So maybe we've seen 10 X improvement in a year,
link |
01:02:38.500
but a hundred miles is still a far cry
link |
01:02:41.740
from the a hundred thousand you're going to need.
link |
01:02:43.880
So you're going to somehow need to get three more 10 Xs
link |
01:02:48.060
in there.
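A quick back-of-the-envelope check of that claim, using the figures from the conversation (the 100,000 mile target is the number used here, not an official requirement):

```python
import math

current_miles_per_disengagement = 100
target_miles_per_disengagement = 100_000   # the figure used in the conversation
improvement_per_year = 10                  # roughly what the past year delivered

gap = target_miles_per_disengagement / current_miles_per_disengagement
tenfolds_needed = math.log10(gap)
years_at_current_rate = tenfolds_needed / math.log10(improvement_per_year)
print(gap, tenfolds_needed, years_at_current_rate)  # 1000.0 3.0 3.0
```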
link |
01:02:49.880
And you're, what's your intuition?
link |
01:02:52.300
You're basically hoping that there's exponential
link |
01:02:54.540
improvement built in, baked into the cake somewhere.
link |
01:02:56.940
Well, that's even, I mean, 10 X improvement,
link |
01:02:58.420
that's already assuming exponential, right?
link |
01:03:00.540
There's definitely exponential improvement.
link |
01:03:02.620
And I think when Elon talks about exponential,
link |
01:03:04.340
like these things, these systems are going to
link |
01:03:06.340
exponentially improve, just exponential doesn't mean
link |
01:03:09.820
you're getting a hundred gigahertz processors tomorrow.
link |
01:03:12.660
Right? Like it's going to still take a while
link |
01:03:15.060
because the gap between even our best system
link |
01:03:18.340
and humans is still large.
link |
01:03:20.300
So that's an interesting distinction to draw.
link |
01:03:22.340
So if you look at the way Tesla is approaching the problem
link |
01:03:26.100
and the way you're approaching the problem,
link |
01:03:28.340
which is very different than the rest of the self driving
link |
01:03:31.780
car world.
link |
01:03:32.720
So let's put them aside is you're treating most
link |
01:03:35.380
the driving task as a machine learning problem.
link |
01:03:37.500
And the way Tesla is approaching it is with the multitask
link |
01:03:40.260
learning where you break the task of driving into hundreds
link |
01:03:44.100
of different tasks and you have this multiheaded
link |
01:03:47.180
neural network that's very good at performing each task.
link |
01:03:51.660
And there there's presumably something on top that's
link |
01:03:54.620
stitching stuff together in order to make control
link |
01:03:59.240
decisions, policy decisions about how you move the car.
link |
01:04:02.180
But what that allows you, there's a brilliance to this
link |
01:04:04.420
because it allows you to master each task,
link |
01:04:08.360
like lane detection, stop sign detection,
link |
01:04:13.380
the traffic light detection, drivable area segmentation,
link |
01:04:19.180
you know, vehicle, bicycle, pedestrian detection.
link |
01:04:23.000
There's some localization tasks in there.
link |
01:04:25.420
Also predicting of like, yeah,
link |
01:04:30.440
predicting how the entities in the scene are going to move.
link |
01:04:34.060
Like everything is basically a machine learning task.
link |
01:04:36.360
So there's a classification, segmentation, prediction.
link |
01:04:40.380
And it's nice because you can have this entire engine,
link |
01:04:44.460
data engine that's mining for edge cases for each one of
link |
01:04:48.740
these tasks.
link |
01:04:49.580
And you can have people like engineers that are basically
link |
01:04:52.200
masters of that task,
link |
01:04:53.820
like become the best person in the world at,
link |
01:04:56.600
as you talk about the cone guy for Waymo,
link |
01:04:59.860
the becoming the best person in the world at cone detection.
link |
01:05:06.700
So that's a compelling notion from a supervised learning
link |
01:05:10.140
perspective, automating much of the process of edge case
link |
01:05:15.380
discovery and retraining neural network for each of the
link |
01:05:17.820
individual perception tasks.
link |
01:05:19.780
And then you're looking at the machine learning in a more
link |
01:05:22.460
holistic way, basically doing end to end learning on the
link |
01:05:27.240
driving tasks, supervised, trained on the data of the
link |
01:05:31.500
actual driving of people.
link |
01:05:34.420
that use Comma AI, like actual human drivers,
link |
01:05:37.580
their manual control,
link |
01:05:38.700
plus the moments of disengagement that maybe with some
link |
01:05:44.400
labeling could indicate the failure of the system.
link |
01:05:47.340
So you have the,
link |
01:05:48.900
you have a huge amount of data for positive control of the
link |
01:05:52.420
vehicle, like successful control of the vehicle,
link |
01:05:55.560
both maintaining the lane as,
link |
01:05:58.340
as I think you're also working on longitudinal control of
link |
01:06:01.060
the vehicle and then failure cases where the vehicle does
link |
01:06:04.700
something wrong that needs disengagement.
link |
01:06:08.380
So like what,
link |
01:06:09.820
why do you think you're right and Tesla is wrong on this?
link |
01:06:14.220
And do you think,
link |
01:06:15.660
do you think you'll come around the Tesla way?
link |
01:06:17.540
Do you think Tesla will come around to your way?
link |
01:06:21.400
If you were to start a chess engine company,
link |
01:06:23.920
would you hire a Bishop guy?
link |
01:06:26.080
See, we have a,
link |
01:06:27.660
this is Monday morning
link |
01:06:29.060
quarterbacking, but yes, probably.
link |
01:06:36.340
Oh, our Rook guy.
link |
01:06:37.400
Oh, we stole the Rook guy from that company.
link |
01:06:39.420
Oh, we're going to have real good Rooks.
link |
01:06:40.780
Well, there's not many pieces, right?
link |
01:06:43.900
You can,
link |
01:06:46.220
there's not many guys and gals to hire.
link |
01:06:48.820
You just have a few that work in the Bishop,
link |
01:06:51.060
a few that work in the Rook.
link |
01:06:52.860
Is that not ludicrous today to think about
link |
01:06:55.340
in a world of AlphaZero?
link |
01:06:57.520
But AlphaZero is a chess game.
link |
01:06:58.860
So the fundamental question is,
link |
01:07:01.780
how hard is driving compared to chess?
link |
01:07:04.340
Because, so long term,
link |
01:07:07.200
end to end,
link |
01:07:08.980
will be the right solution.
link |
01:07:10.620
The question is how many years away is that?
link |
01:07:13.460
End to end is going to be the only solution for level five.
link |
01:07:15.740
For the only way we'll get there.
link |
01:07:17.220
Of course, and of course,
link |
01:07:18.220
Tesla is going to come around to my way.
link |
01:07:19.740
And if you're a Rook guy out there, I'm sorry.
link |
01:07:22.980
The cone guy.
link |
01:07:24.980
I don't know.
link |
01:07:25.820
We're going to specialize each task.
link |
01:07:26.940
We're going to really understand Rook placement.
link |
01:07:29.100
Yeah.
link |
01:07:30.540
I understand the intuition you have.
link |
01:07:32.060
I mean, that,
link |
01:07:35.100
that is a very compelling notion
link |
01:07:36.820
that we can learn the task end to end,
link |
01:07:39.140
like the same compelling notion you might have
link |
01:07:40.820
for natural language conversation.
link |
01:07:42.620
But I'm not
link |
01:07:44.800
sure,
link |
01:07:47.060
because one thing you sneaked in there
link |
01:07:48.940
is the assertion that it's impossible to get to level five
link |
01:07:53.180
without this kind of approach.
link |
01:07:55.420
I don't know if that's obvious.
link |
01:07:57.140
I don't know if that's obvious either.
link |
01:07:58.300
I don't actually mean that.
link |
01:08:01.340
I think that it is much easier
link |
01:08:03.500
to get to level five with an end to end approach.
link |
01:08:05.700
I think that the other approach is doable,
link |
01:08:08.900
but the magnitude of the engineering challenge
link |
01:08:11.380
may exceed what humanity is capable of.
link |
01:08:13.780
But what do you think of the Tesla data engine approach,
link |
01:08:19.140
which to me is an active learning task,
link |
01:08:21.100
is kind of fascinating,
link |
01:08:22.500
is breaking it down into these multiple tasks
link |
01:08:25.700
and mining their data constantly for like edge cases
link |
01:08:29.500
for these different tasks.
link |
01:08:30.500
Yeah, but the tasks themselves are not being learned.
link |
01:08:32.460
This is feature engineering.
link |
01:08:35.700
Yeah, I mean, it's a higher abstraction level
link |
01:08:40.740
of feature engineering for the different tasks.
link |
01:08:43.340
Task engineering in a sense.
link |
01:08:44.740
It's slightly better feature engineering,
link |
01:08:46.780
but it's still fundamentally is feature engineering.
link |
01:08:49.260
And if anything about the history of AI
link |
01:08:51.300
has taught us anything,
link |
01:08:52.620
it's that feature engineering approaches
link |
01:08:54.540
will always be replaced and lose to end to end.
link |
01:08:57.660
Now, to be fair, I cannot really make promises on timelines,
link |
01:09:02.060
but I can say that when you look at the code for Stockfish
link |
01:09:05.700
and the code for AlphaZero,
link |
01:09:06.940
one is a lot shorter than the other,
link |
01:09:09.060
a lot more elegant,
link |
01:09:09.900
required a lot less programmer hours to write.
link |
01:09:12.580
Yeah, but there was a lot more murder of bad agents
link |
01:09:21.620
on the AlphaZero side.
link |
01:09:24.740
By murder, I mean agents that played a game
link |
01:09:29.220
and failed miserably.
link |
01:09:30.420
Yeah.
link |
01:09:31.460
Oh, oh.
link |
01:09:32.300
In simulation, that failure is less costly.
link |
01:09:34.740
Yeah.
link |
01:09:35.580
In real world, it's...
link |
01:09:37.300
Do you mean in practice,
link |
01:09:38.260
like AlphaZero has lost games miserably?
link |
01:09:40.660
No.
link |
01:09:41.500
Wow.
link |
01:09:42.540
I haven't seen that.
link |
01:09:43.380
No, but I know, but the requirement for AlphaZero is...
link |
01:09:47.380
A simulator.
link |
01:09:48.220
To be able to like evolution, human evolution,
link |
01:09:51.500
not human evolution, biological evolution of life on earth
link |
01:09:54.420
from the origin of life has murdered trillions
link |
01:09:58.700
upon trillions of organisms on the path to humans.
link |
01:10:02.260
Yeah.
link |
01:10:03.100
So the question is, can we stitch together
link |
01:10:05.860
a human like object without having to go
link |
01:10:07.940
through the entirety process of evolution?
link |
01:10:09.900
Well, no, but do the evolution in simulation.
link |
01:10:11.940
Yeah, that's the question.
link |
01:10:12.860
Can we simulate?
link |
01:10:13.700
So do you have a sense that it's possible
link |
01:10:15.060
to simulate some aspect?
link |
01:10:16.220
MuZero is exactly this.
link |
01:10:18.220
MuZero is the solution to this.
link |
01:10:21.300
MuZero I think is going to be looked back
link |
01:10:23.980
as the canonical paper.
link |
01:10:25.220
And I don't think deep learning is everything.
link |
01:10:26.860
I think that there's still a bunch of things missing
link |
01:10:28.700
to get there, but MuZero I think is going to be looked back
link |
01:10:31.420
as the kind of cornerstone paper
link |
01:10:34.260
of this whole deep learning era.
link |
01:10:37.060
And MuZero is the solution to self driving cars.
link |
01:10:39.580
You have to make a few tweaks to it,
link |
01:10:41.220
but MuZero does effectively that.
link |
01:10:42.820
It does those rollouts and that murdering
link |
01:10:45.500
in a learned simulator and a learned dynamics model.
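For context, MuZero plans by rolling action sequences forward inside a learned model instead of a hand-built simulator. The sketch below is a heavily simplified illustration of that idea: the three learned functions are dummy stubs, and random-shooting search stands in for MuZero's actual Monte Carlo tree search.

```python
import random

class LearnedModel:
    """Dummy stubs for MuZero's three learned functions (illustration only)."""
    def represent(self, observation):
        # h: raw observation -> latent state (a real model would be a neural net).
        return tuple(observation)

    def dynamics(self, state, action):
        # g: (latent state, action) -> (next latent state, predicted reward).
        return state, random.random()

    def predict(self, state):
        # f: latent state -> (policy prior, value estimate).
        return None, random.random()

def plan(model, observation, actions, horizon=5, n_rollouts=64):
    """Pick an action by imagining futures in the learned model. MuZero proper
    uses MCTS guided by the policy prior; random shooting keeps the sketch short."""
    root = model.represent(observation)
    best_action, best_return = None, float("-inf")
    for _ in range(n_rollouts):
        state, total = root, 0.0
        first = action = random.choice(actions)
        for _ in range(horizon):
            state, reward = model.dynamics(state, action)  # imagined step, no simulator
            total += reward
            action = random.choice(actions)
        _, value = model.predict(state)  # bootstrap with the value beyond the horizon
        total += value
        if total > best_return:
            best_action, best_return = first, total
    return best_action

# Example: choose among steer-left / straight / steer-right for a toy observation.
print(plan(LearnedModel(), observation=[0.0, 1.0], actions=["left", "straight", "right"]))
```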
link |
01:10:50.180
That's interesting.
link |
01:10:51.020
It doesn't get enough love.
link |
01:10:51.860
I was blown away when I read that paper.
link |
01:10:54.220
I'm like, okay, I've always said at Comma,
link |
01:10:57.060
I'm going to sit and I'm going to wait for the solution
link |
01:10:58.460
to self driving cars to come along.
link |
01:11:00.340
This year I saw it.
link |
01:11:01.180
It's MuZero.
link |
01:11:05.060
So.
link |
01:11:06.860
Sit back and let the winnings roll in.
link |
01:11:09.180
So your sense, just to elaborate a little bit,
link |
01:11:12.300
to linger on the topic.
link |
01:11:13.380
Your sense is neural networks will solve driving.
link |
01:11:16.300
Yes.
link |
01:11:17.140
Like we don't need anything else.
link |
01:11:18.820
I think the same way chess was maybe the chess
link |
01:11:21.260
and maybe Google are the pinnacle of like search algorithms
link |
01:11:25.060
and things that look kind of like A*.
link |
01:11:28.420
The pinnacle of this era is going to be self driving cars.
link |
01:11:34.700
But on the path of that, you have to deliver products
link |
01:11:38.180
and it's possible that the path to full self driving cars
link |
01:11:42.340
will take decades.
link |
01:11:44.420
I doubt it.
link |
01:11:45.340
How long would you put on it?
link |
01:11:47.820
Like what are we, you're chasing it, Tesla's chasing it.
link |
01:11:53.460
What are we talking about?
link |
01:11:54.340
Five years, 10 years, 50 years.
link |
01:11:56.180
Let's say in the 2020s.
link |
01:11:58.060
In the 2020s.
link |
01:11:59.540
The later part of the 2020s.
link |
01:12:03.580
With the neural network.
link |
01:12:05.500
Well, that would be nice to see.
link |
01:12:06.580
And then the path to that, you're delivering products,
link |
01:12:09.140
which is a nice L2 system.
link |
01:12:10.580
That's what Tesla's doing, a nice L2 system.
link |
01:12:13.060
Just gets better every time.
link |
01:12:14.380
L2, the only difference between L2 and the other levels
link |
01:12:16.660
is who takes liability.
link |
01:12:17.620
And I'm not a liability guy, I don't wanna take liability.
link |
01:12:20.100
I'm gonna be level two forever.
link |
01:12:22.700
Now on that little transition,
link |
01:12:25.780
I mean, how do you make the transition work?
link |
01:12:29.060
Is this where driver sensing comes in?
link |
01:12:32.780
Like how do you make the, cause you said a hundred miles,
link |
01:12:35.940
like, is there some sort of human factor psychology thing
link |
01:12:41.340
where people start to overtrust the system,
link |
01:12:43.140
all those kinds of effects,
link |
01:12:45.020
once it gets better and better and better and better,
link |
01:12:46.780
they get lazier and lazier and lazier.
link |
01:12:49.340
Is that, like, how do you get that transition right?
link |
01:12:52.460
First off, our monitoring is already adaptive.
link |
01:12:54.500
Our monitoring is already scene adaptive.
link |
01:12:56.620
Driver monitoring is just the camera
link |
01:12:58.940
that's looking at the driver.
link |
01:13:00.060
You have an infrared camera in the...
link |
01:13:03.060
Our policy for how we enforce the driver monitoring
link |
01:13:06.340
is scene adaptive.
link |
01:13:07.940
What's that mean?
link |
01:13:08.780
Well, for example, in one of the extreme cases,
link |
01:13:12.580
if the car is not moving,
link |
01:13:14.860
we do not actively enforce driver monitoring, right?
link |
01:13:19.460
If you are going through a,
link |
01:13:22.380
like a 45 mile an hour road with lights
link |
01:13:25.780
and stop signs and potentially pedestrians,
link |
01:13:27.980
we enforce a very tight driver monitoring policy.
link |
01:13:30.780
If you are alone on a perfectly straight highway,
link |
01:13:33.860
and this is, it's all machine learning.
link |
01:13:35.620
None of that is hand coded.
link |
01:13:36.940
Actually, the stop is hand coded, but...
link |
01:13:39.060
So there's some kind of machine learning
link |
01:13:41.100
estimation of risk.
link |
01:13:42.300
Yes.
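The policy being described, a hand-coded rule at the extreme plus a learned risk estimate in between, might be sketched like this. It is an invented illustration, not comma's implementation; the thresholds and the risk model interface are placeholders.

```python
def alert_timeout_seconds(scene, risk_model):
    """How long the driver may look away before the system alerts them.
    Hypothetical sketch of a scene-adaptive monitoring policy; the numbers
    and the risk_model interface are placeholders, not comma's values."""
    if scene.speed_mps < 0.1:
        # Hand-coded extreme case from the conversation: car not moving,
        # don't actively enforce driver monitoring.
        return None

    # Everything else comes from a learned estimate of how risky the scene is:
    # lights, stop signs, pedestrians push it up; an empty straight highway
    # with no one around pushes it down.
    risk = risk_model.estimate(scene.camera_frame, scene.speed_mps)

    if risk > 0.7:
        return 2.0   # busy 45 mph surface street: very tight policy
    if risk > 0.3:
        return 4.0
    return 8.0       # alone on a straight highway: more relaxed
```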
link |
01:13:43.620
Yeah.
link |
01:13:44.460
I mean, I've always been a huge fan of that.
link |
01:13:45.860
That's a...
link |
01:13:47.500
Because...
link |
01:13:48.340
It's difficult to do every step into that direction
link |
01:13:53.780
is a worthwhile step to take.
link |
01:13:55.100
It might be difficult to do really well.
link |
01:13:56.540
Like us humans are able to estimate risk pretty damn well,
link |
01:13:59.980
whatever the hell that is.
link |
01:14:01.500
That feels like one of the nice features of us humans.
link |
01:14:06.500
Cause like we humans are really good drivers
link |
01:14:08.980
when we're really like tuned in
link |
01:14:11.180
and we're good at estimating risk.
link |
01:14:12.940
Like when are we supposed to be tuned in?
link |
01:14:14.900
Yeah.
link |
01:14:15.980
And, you know, people are like,
link |
01:14:17.580
oh, well, you know,
link |
01:14:18.420
why would you ever make the driver monitoring policy
link |
01:14:20.420
less aggressive?
link |
01:14:21.260
Why would you not always keep it at its most aggressive?
link |
01:14:23.980
Because then people are just going to get fatigued from it.
link |
01:14:25.780
Yes.
link |
01:14:26.620
When they get annoyed.
link |
01:14:27.460
You want them...
link |
01:14:28.300
Yeah.
link |
01:14:29.140
You want the experience to be pleasant.
link |
01:14:30.900
Obviously I want the experience to be pleasant,
link |
01:14:32.500
but even just from a straight up safety perspective,
link |
01:14:35.980
if you alert people when they look around and they're like,
link |
01:14:39.740
why is this thing alerting me?
link |
01:14:41.020
There's nothing I could possibly hit right now.
link |
01:14:42.980
People will just learn to tune it out.
link |
01:14:45.220
People will just learn to tune it out,
link |
01:14:46.900
to put weights on the steering wheel,
link |
01:14:48.060
to do whatever to overcome it.
link |
01:14:49.820
And remember that you're always part
link |
01:14:52.580
of this adaptive system.
link |
01:14:53.580
So all I can really say about, you know,
link |
01:14:55.660
how this scales going forward is yeah,
link |
01:14:57.180
it's something we have to monitor for.
link |
01:14:59.420
Ooh, we don't know.
link |
01:15:00.260
This is a great psychology experiment at scale.
link |
01:15:02.060
Like we'll see.
link |
01:15:03.220
Yeah, it's fascinating.
link |
01:15:04.060
Track it.
link |
01:15:04.900
And making sure you have a good understanding of attention
link |
01:15:09.140
is a very key part of that psychology problem.
link |
01:15:11.420
Yeah.
link |
01:15:12.260
I think you and I probably have a different,
link |
01:15:14.100
come to it differently, but to me,
link |
01:15:16.660
it's a fascinating psychology problem
link |
01:15:19.700
to explore something much deeper than just driving.
link |
01:15:22.100
It's such a nice way to explore human attention
link |
01:15:26.700
and human behavior, which is why, again,
link |
01:15:30.180
we've probably both criticized Mr. Elon Musk
link |
01:15:34.100
on this one topic from different avenues.
link |
01:15:38.220
So both offline and online,
link |
01:15:39.740
I had little chats with Elon and like,
link |
01:15:44.380
I love human beings as a computer vision problem,
link |
01:15:48.620
as an AI problem, it's fascinating.
link |
01:15:51.020
He wasn't so much interested in that problem.
link |
01:15:53.260
It's like in order to solve driving,
link |
01:15:56.820
the whole point is you want to remove the human
link |
01:15:58.780
from the picture.
link |
01:16:01.300
And it seems like you can't do that quite yet.
link |
01:16:04.140
Eventually, yes, but you can't quite do that yet.
link |
01:16:07.900
So this is the moment where you can't yet say,
link |
01:16:12.300
I told you so to Tesla, but it's getting there
link |
01:16:17.700
because I don't know if you've seen this,
link |
01:16:19.220
there's some reporting that they're in fact
link |
01:16:21.100
starting to do driver monitoring.
link |
01:16:23.260
Yeah, they ship the model in shadow mode.
link |
01:16:26.260
With, I believe, only a visible light camera,
link |
01:16:29.620
it might even be fisheye.
link |
01:16:31.780
It's like a low resolution.
link |
01:16:33.260
Low resolution, visible light.
link |
01:16:34.820
I mean, to be fair, that's what we have in the Eon as well,
link |
01:16:37.100
our last generation product.
link |
01:16:38.820
This is the one area where I can say
link |
01:16:41.060
our hardware is ahead of Tesla.
link |
01:16:42.180
The rest of our hardware, way, way behind,
link |
01:16:43.820
but our driver monitoring camera.
link |
01:16:46.020
So you think, I think on the third row Tesla podcast,
link |
01:16:50.940
or somewhere else, I've heard you say that obviously,
link |
01:16:54.580
eventually they're gonna have driver monitoring.
link |
01:16:57.220
I think what I've said is Elon will definitely ship
link |
01:16:59.660
driver monitoring before he ships level five.
link |
01:17:01.700
Before level five.
link |
01:17:02.540
And I'm willing to bet 10 grand on that.
link |
01:17:04.660
And you bet 10 grand on that.
link |
01:17:07.060
I mean, now I don't wanna take the bet,
link |
01:17:08.260
but before, maybe someone would have,
link |
01:17:09.500
oh, I should have got my money in.
link |
01:17:10.500
Yeah.
link |
01:17:11.900
It's an interesting bet.
link |
01:17:12.980
I think you're right.
link |
01:17:16.460
I'm actually on a human level
link |
01:17:19.180
because he's been, he's made the decision.
link |
01:17:24.380
Like he said that driver monitoring is the wrong way to go.
link |
01:17:27.660
But like, you have to think of as a human, as a CEO,
link |
01:17:31.260
I think that's the right thing to say when,
link |
01:17:36.460
like sometimes you have to say things publicly
link |
01:17:40.140
that are different than when you actually believe,
link |
01:17:41.780
because when you're producing a large number of vehicles
link |
01:17:45.420
and the decision was made not to include the camera,
link |
01:17:47.860
like what are you supposed to say?
link |
01:17:49.460
Like our cars don't have the thing
link |
01:17:51.780
that I think is right to have.
link |
01:17:54.020
It's an interesting thing.
link |
01:17:55.780
But like on the other side, as a CEO,
link |
01:17:58.340
I mean, something you could probably speak to as a leader,
link |
01:18:01.220
I think about me as a human
link |
01:18:04.940
to publicly change your mind on something.
link |
01:18:07.020
How hard is that?
link |
01:18:08.420
Especially when assholes like George Hotz say,
link |
01:18:10.620
I told you so.
link |
01:18:12.380
All I will say is I am not a leader
link |
01:18:14.740
and I am happy to change my mind.
link |
01:18:17.060
And I will.
link |
01:18:17.900
You think Elon will?
link |
01:18:20.580
Yeah, I do.
link |
01:18:22.300
I think he'll come up with a good way
link |
01:18:24.260
to make it psychologically okay for him.
link |
01:18:27.420
Well, it's such an important thing, man.
link |
01:18:29.700
Especially for a first principles thinker,
link |
01:18:31.420
because he made a decision that driver monitoring
link |
01:18:34.780
is not the right way to go.
link |
01:18:35.740
And I could see that decision.
link |
01:18:37.660
And I could even make that decision.
link |
01:18:39.260
Like I was on the fence too.
link |
01:18:41.660
Like I'm not a,
link |
01:18:42.500
driver monitoring is such an obvious,
link |
01:18:47.140
simple solution to the problem of attention.
link |
01:18:49.980
It's not obvious to me that just by putting a camera there,
link |
01:18:52.820
you solve things.
link |
01:18:54.260
You have to create an incredible, compelling experience.
link |
01:18:59.060
Just like you're talking about.
link |
01:19:01.060
I don't know if it's easy to do that.
link |
01:19:03.300
It's not at all easy to do that, in fact, I think.
link |
01:19:05.980
So as a creator of a car that's trying to create a product
link |
01:19:10.980
that people love, which is what Tesla tries to do, right?
link |
01:19:14.180
It's not obvious to me that as a design decision,
link |
01:19:18.340
whether adding a camera is a good idea.
link |
01:19:20.940
From a safety perspective either,
link |
01:19:22.460
like in the human factors community,
link |
01:19:25.100
everybody says that you should obviously
link |
01:19:27.380
have driver sensing, driver monitoring.
link |
01:19:30.500
But that's like saying it's obvious as parents,
link |
01:19:36.900
you shouldn't let your kids go out at night.
link |
01:19:39.780
But okay, but like,
link |
01:19:43.460
they're still gonna find ways to do drugs.
link |
01:19:45.620
Like, you have to also be good parents.
link |
01:19:49.860
So like, it's much more complicated than just the,
link |
01:19:52.580
you need to have driver monitoring.
link |
01:19:54.140
I totally disagree on, okay, if you have a camera there
link |
01:19:58.740
and the camera's watching the person,
link |
01:20:00.300
but never throws an alert, they'll never think about it.
link |
01:20:03.540
Right?
link |
01:20:04.380
The driver monitoring policy that you choose to,
link |
01:20:08.380
how you choose to communicate with the user
link |
01:20:10.180
is entirely separate from the data collection perspective.
link |
01:20:14.220
Right?
link |
01:20:15.060
Right?
link |
01:20:15.900
So, you know, like, there's one thing to say,
link |
01:20:20.980
like, you know, tell your teenager they can't do something.
link |
01:20:24.500
There's another thing to like, you know, gather the data.
link |
01:20:27.020
So you can make informed decisions.
link |
01:20:28.500
That's really interesting.
link |
01:20:29.340
But you have to make that,
link |
01:20:30.980
that's the interesting thing about cars.
link |
01:20:33.380
But even true with Comma AI,
link |
01:20:35.020
like you don't have to manufacture the thing
link |
01:20:37.900
into the car, is you have to make a decision
link |
01:20:40.060
that anticipates the right strategy longterm.
link |
01:20:44.180
So like, you have to start collecting the data
link |
01:20:46.620
and start making decisions.
link |
01:20:47.780
Started it three years ago.
link |
01:20:49.900
I believe that we have the best driver monitoring solution
link |
01:20:52.660
in the world.
link |
01:20:54.620
I think that when you compare it to Super Cruise
link |
01:20:57.220
is the only other one that I really know that shipped.
link |
01:20:59.460
And ours is better.
link |
01:21:01.420
What do you like and not like about Super Cruise?
link |
01:21:06.420
I mean, I had a few Super Cruise,
link |
01:21:08.740
the sun would be shining through the window,
link |
01:21:12.020
would blind the camera,
link |
01:21:13.180
and it would say I wasn't paying attention.
link |
01:21:14.580
When I was looking completely straight,
link |
01:21:16.100
I couldn't reset the attention with a steering wheel touch
link |
01:21:19.140
and Super Cruise would disengage.
link |
01:21:21.060
Like I was communicating to the car, I'm like, look,
link |
01:21:22.980
I am here, I am paying attention.
link |
01:21:24.420
Why are you really gonna force me to disengage?
link |
01:21:26.340
And it did.
link |
01:21:28.620
So it's a constant conversation with the user.
link |
01:21:32.100
And yeah, there's no way to ship a system
link |
01:21:33.660
like this if you can OTA.
link |
01:21:35.500
We're shipping a new one every month.
link |
01:21:37.180
Sometimes we balance it with our users on Discord.
link |
01:21:40.140
Like sometimes we make the driver monitoring
link |
01:21:41.980
a little more aggressive and people complain.
link |
01:21:43.820
Sometimes they don't.
link |
01:21:45.500
We want it to be as aggressive as possible
link |
01:21:47.060
where people don't complain and it doesn't feel intrusive.
link |
01:21:49.180
So being able to update the system over the air
link |
01:21:51.100
is an essential component.
link |
01:21:52.540
I mean, that's probably to me, you mentioned,
link |
01:21:56.620
I mean, to me that is the biggest innovation of Tesla,
link |
01:22:01.060
that it made people realize that over the air updates
link |
01:22:04.860
is essential.
link |
01:22:06.460
Yeah.
link |
01:22:07.460
I mean, was that not obvious from the iPhone?
link |
01:22:10.140
The iPhone was the first real product that OTA'd, I think.
link |
01:22:13.060
Was it actually, that's brilliant, you're right.
link |
01:22:15.220
I mean, the game consoles used to not, right?
link |
01:22:17.060
The game consoles were maybe the second thing that did.
link |
01:22:18.980
Wow, I didn't really think about one of the amazing features
link |
01:22:22.260
of a smartphone isn't just like the touchscreen
link |
01:22:26.180
isn't the thing, it's the ability to constantly update.
link |
01:22:30.740
Yeah, it gets better.
link |
01:22:31.980
It gets better.
link |
01:22:35.100
Love my iOS 14.
link |
01:22:36.820
Yeah.
link |
01:22:38.300
Well, one thing that I probably disagree with you
link |
01:22:41.540
on driver monitoring is you said that it's easy.
link |
01:22:46.700
I mean, you tend to say stuff is easy.
link |
01:22:48.940
I'm sure the, I guess you said it's easy
link |
01:22:52.700
relative to the external perception problem.
link |
01:22:58.180
Can you elaborate why you think it's easy?
link |
01:23:00.940
Feature engineering works for driver monitoring.
link |
01:23:03.500
Feature engineering does not work for the external.
link |
01:23:05.860
So human faces are not, human faces and the movement
link |
01:23:10.700
of human faces and head and body is not as variable
link |
01:23:14.740
as the external environment, is your intuition?
link |
01:23:17.140
Yes, and there's another big difference as well.
link |
01:23:20.140
Your reliability of a driver monitoring system
link |
01:23:22.500
doesn't actually need to be that high.
link |
01:23:24.380
The uncertainty, if you have something that's detecting
link |
01:23:27.220
whether the human's paying attention and it only works
link |
01:23:29.140
92% of the time, you're still getting almost all
link |
01:23:31.700
the benefit of that because the human,
link |
01:23:33.620
like you're training the human, right?
link |
01:23:35.500
You're dealing with a system that's really helping you out.
link |
01:23:39.100
It's a conversation.
link |
01:23:40.180
It's not like the external thing where guess what?
link |
01:23:43.460
If you swerve into a tree, you swerve into a tree, right?
link |
01:23:46.140
Like you get no margin for error there.
link |
01:23:48.340
Yeah, I think that's really well put.
link |
01:23:49.620
I think that's the right, exactly the place
link |
01:23:54.020
where comparing to the external perception,
link |
01:23:58.660
the control problem, the driver monitoring is easier
link |
01:24:01.500
because you don't, the bar for success is much lower.
link |
01:24:05.260
Yeah, but I still think like the human face
link |
01:24:09.100
is more complicated actually than the external environment,
link |
01:24:12.140
but for driving, you don't give a damn.
link |
01:24:14.380
I don't need, yeah, I don't need something,
link |
01:24:15.620
I don't need something that complicated
link |
01:24:18.300
to have to communicate the idea to the human
link |
01:24:22.220
that I want to communicate, which is,
link |
01:24:23.980
yo, system might mess up here.
link |
01:24:25.740
You gotta pay attention.
link |
01:24:26.940
Yeah, see, that's my love and fascination is the human face.
link |
01:24:32.500
And it feels like this is a nice place to create products
link |
01:24:38.420
that create an experience in the car.
link |
01:24:40.100
So like, it feels like there should be
link |
01:24:42.640
more richer experiences in the car, you know?
link |
01:24:47.540
Like that's an opportunity for like something like Comma AI
link |
01:24:51.500
or just any kind of system like a Tesla
link |
01:24:53.900
or any of the autonomous vehicle companies
link |
01:24:56.220
is because software is, there's much more sensors
link |
01:24:59.180
and so much is on our software
link |
01:25:00.660
and you're doing machine learning anyway,
link |
01:25:02.940
there's an opportunity to create totally new experiences
link |
01:25:06.340
that we're not even anticipating.
link |
01:25:08.260
You don't think so?
link |
01:25:10.140
Nah.
link |
01:25:10.980
You think it's a box that gets you from A to B
link |
01:25:12.900
and you want to do it chill?
link |
01:25:14.920
Yeah, I mean, I think as soon as we get to level three
link |
01:25:16.940
on highways, okay, enjoy your candy crush,
link |
01:25:19.300
enjoy your Hulu, enjoy your, you know, whatever, whatever.
link |
01:25:23.660
Sure, you get this, you can look at screens basically
link |
01:25:26.260
versus right now where you have music and audio books.
link |
01:25:28.700
So level three is where you can kind of disengage
link |
01:25:31.020
in stretches of time.
link |
01:25:34.860
Well, you think level three is possible?
link |
01:25:37.460
Like on the highway going for 100 miles
link |
01:25:39.180
and you can just go to sleep?
link |
01:25:40.500
Oh yeah, sleep.
link |
01:25:43.620
So again, I think it's really all on a spectrum.
link |
01:25:47.340
I think that being able to use your phone
link |
01:25:50.060
while you're on the highway and like this all being okay
link |
01:25:53.500
and being aware that the car might alert you
link |
01:25:55.360
and you have five seconds to basically.
link |
01:25:57.180
So the five second thing is you think is possible?
link |
01:25:59.060
Yeah, I think it is, oh yeah.
link |
01:26:00.420
Not in all scenarios, right?
link |
01:26:02.180
Some scenarios it's not.
link |
01:26:03.940
It's the whole risk thing that you mentioned is nice
link |
01:26:06.300
is to be able to estimate like how risky is this situation?
link |
01:26:10.660
That's really important to understand.
link |
01:26:12.620
One other thing you mentioned comparing Comma
link |
01:26:15.380
and Autopilot is that something about the haptic feel
link |
01:26:20.380
of the way Comma controls the car when things are uncertain.
link |
01:26:25.920
Like it behaves a little bit more uncertain
link |
01:26:27.760
when things are uncertain.
link |
01:26:29.240
That's kind of an interesting point.
link |
01:26:31.080
And then Autopilot is much more confident always
link |
01:26:34.080
even when it's uncertain until it runs into trouble.
link |
01:26:39.200
That's a funny thing.
link |
01:26:40.920
I actually mentioned that to Elon, I think.
link |
01:26:42.700
And then the first time we talked, he wasn't biting.
link |
01:26:46.280
It's like communicating uncertainty.
link |
01:26:48.880
I guess Comma doesn't really communicate uncertainty
link |
01:26:51.880
explicitly, it communicates it through haptic feel.
link |
01:26:55.040
Like what's the role of communicating uncertainty
link |
01:26:57.420
do you think?
link |
01:26:58.260
Oh, we do some stuff explicitly.
link |
01:26:59.840
Like we do detect the lanes when you're on the highway
link |
01:27:01.800
and we'll show you how many lanes we're using to drive with.
link |
01:27:04.380
You can look at where it thinks the lanes are.
link |
01:27:06.200
You can look at the path.
link |
01:27:08.480
And we want to be better about this.
link |
01:27:10.440
We're actually hiring, want to hire some new UI people.
link |
01:27:12.900
UI people, you mentioned this.
link |
01:27:14.320
Cause it's such an, it's a UI problem too, right?
link |
01:27:17.300
We have a great designer now, but you know,
link |
01:27:19.720
we need people who are just going to like build this
link |
01:27:21.160
and debug these UIs, Qt people.
link |
01:27:23.800
Qt.
link |
01:27:24.640
Is that what the UI is done with, is Qt?
link |
01:27:26.880
The new UI is in Qt.
link |
01:27:29.320
C++ Qt?
link |
01:27:32.000
Tesla uses it too.
link |
01:27:33.300
Yeah.
link |
01:27:34.200
We had some React stuff in there.
link |
01:27:37.760
React JS or just React?
link |
01:27:39.440
React is its own language, right?
link |
01:27:41.160
React Native, React is a JavaScript framework.
link |
01:27:44.480
Yeah.
link |
01:27:45.320
So it's all based on JavaScript, but it's, you know,
link |
01:27:48.960
I like C++.
link |
01:27:51.480
What do you think about Dojo with Tesla
link |
01:27:55.080
and their foray into what appears to be
link |
01:28:00.240
specialized hardware for training neural nets?
link |
01:28:05.120
I guess it's something, maybe you can correct me,
link |
01:28:07.360
from my shallow looking at it,
link |
01:28:10.000
it seems like something like Google did with TPUs,
link |
01:28:12.120
but specialized for driving data.
link |
01:28:15.680
I don't think it's specialized for driving data.
link |
01:28:18.360
It's just legit, just TPU.
link |
01:28:20.120
They want to go the Apple way,
link |
01:28:22.160
basically everything required in the chain is done in house.
link |
01:28:25.640
Well, so you have a problem right now,
link |
01:28:27.840
and this is one of my concerns.
link |
01:28:31.740
I really would like to see somebody deal with this.
link |
01:28:33.800
If anyone out there is doing it,
link |
01:28:35.280
I'd like to help them if I can.
link |
01:28:38.000
You basically have two options right now to train.
link |
01:28:40.620
One, your options are NVIDIA or Google.
link |
01:28:45.980
So Google is not even an option.
link |
01:28:50.120
Their TPUs are only available in Google Cloud.
link |
01:28:53.180
Google has absolutely onerous
link |
01:28:55.140
terms of service restrictions.
link |
01:28:58.060
They may have changed it,
link |
01:28:59.280
but back in Google's terms of service,
link |
01:29:00.620
it said explicitly you are not allowed to use Google Cloud ML
link |
01:29:03.740
for training autonomous vehicles
link |
01:29:05.340
or for doing anything that competes with Google
link |
01:29:07.360
without Google's prior written permission.
link |
01:29:09.300
Wow, okay.
link |
01:29:10.340
I mean, Google is not a platform company.
link |
01:29:14.140
I wouldn't touch TPUs with a 10 foot pole.
link |
01:29:16.760
So that leaves you with the monopoly.
link |
01:29:19.300
NVIDIA? NVIDIA.
link |
01:29:21.020
So, I mean.
link |
01:29:22.340
That you're not a fan of.
link |
01:29:23.900
Well, look, I was a huge fan of NVIDIA in 2016.
link |
01:29:28.540
Jensen came and sat in the car.
link |
01:29:31.880
Cool guy.
link |
01:29:32.820
When the stock was $30 a share.
link |
01:29:35.540
NVIDIA stock has skyrocketed.
link |
01:29:38.220
I witnessed a real change
link |
01:29:39.920
in who was in management over there in like 2018.
link |
01:29:43.580
And now they are, let's exploit.
link |
01:29:46.700
Let's take every dollar we possibly can
link |
01:29:48.460
out of this ecosystem.
link |
01:29:49.580
Let's charge $10,000 for A100s
link |
01:29:51.740
because we know we got the best shit in the game.
link |
01:29:54.180
And let's charge $10,000 for an A100
link |
01:29:57.920
when it's really not that different from a 3080,
link |
01:30:00.100
which is $699.
link |
01:30:03.500
The margins that they are making
link |
01:30:05.100
off of those high end chips are so high
link |
01:30:08.640
that, I mean, I think they're shooting themselves
link |
01:30:10.260
in the foot just from a business perspective.
link |
01:30:12.220
Because there's a lot of people talking like me now
link |
01:30:14.980
who are like, somebody's gotta take NVIDIA down.
link |
01:30:19.060
Yeah.
link |
01:30:19.900
Whereas they could dominate it.
link |
01:30:21.020
NVIDIA could be the new Intel.
link |
01:30:22.460
Yeah, to be inside everything essentially.
link |
01:30:26.940
And yet the winners in certain spaces
link |
01:30:30.660
like autonomous driving, the winners,
link |
01:30:33.780
only the people who are like desperately falling behind
link |
01:30:36.620
and trying to catch up and have a ton of money,
link |
01:30:38.540
like the big automakers are the ones
link |
01:30:40.660
interested in partnering with NVIDIA.
link |
01:30:43.220
Oh, and I think a lot of those things
link |
01:30:44.820
are gonna fall through.
link |
01:30:45.940
If I were NVIDIA, sell chips.
link |
01:30:49.340
Sell chips at a reasonable markup.
link |
01:30:52.260
To everybody.
link |
01:30:53.100
To everybody.
link |
01:30:53.920
Without any restrictions.
link |
01:30:54.940
Without any restrictions.
link |
01:30:56.140
Intel did this.
link |
01:30:57.360
Look at Intel.
link |
01:30:58.260
They had a great long run.
link |
01:30:59.940
NVIDIA is trying to turn their,
link |
01:31:01.580
they're like trying to productize their chips
link |
01:31:04.020
way too much.
link |
01:31:05.620
They're trying to extract way more value
link |
01:31:07.880
than they can sustainably.
link |
01:31:09.380
Sure, you can do it tomorrow.
link |
01:31:10.740
Is it gonna up your share price?
link |
01:31:12.140
Sure, if you're one of those CEOs
link |
01:31:13.540
who's like, how much can I strip mine this company?
link |
01:31:15.300
And I think, you know, and that's what's weird about it too.
link |
01:31:17.860
Like the CEO is the founder.
link |
01:31:19.380
It's the same guy.
link |
01:31:20.300
Yeah.
link |
01:31:21.120
I mean, I still think Jensen's a great guy.
link |
01:31:22.340
He is great.
link |
01:31:23.300
Why do this?
link |
01:31:25.180
You have a choice.
link |
01:31:26.660
You have a choice right now.
link |
01:31:27.940
Are you trying to cash out?
link |
01:31:28.820
Are you trying to buy a yacht?
link |
01:31:30.660
If you are, fine.
link |
01:31:32.100
But if you're trying to be
link |
01:31:34.220
the next huge semiconductor company, sell chips.
link |
01:31:37.240
Well, the interesting thing about Jensen
link |
01:31:40.140
is he is a big vision guy.
link |
01:31:42.060
So he has a plan like for 50 years down the road.
link |
01:31:48.700
So it makes me wonder like.
link |
01:31:50.500
How does price gouging fit into it?
link |
01:31:51.780
Yeah, how does that, like it's,
link |
01:31:54.000
it doesn't seem to make sense as a plan.
link |
01:31:57.060
I worry that he's listening to the wrong people.
link |
01:31:59.260
Yeah, that's the sense I have too sometimes.
link |
01:32:02.540
Because I, despite everything, I think NVIDIA
link |
01:32:07.940
is an incredible company.
link |
01:32:09.020
Well, one, so I'm deeply grateful to NVIDIA
link |
01:32:12.420
for the products they've created in the past.
link |
01:32:13.740
Me too.
link |
01:32:14.580
Right?
link |
01:32:15.400
And so.
link |
01:32:16.240
The 1080 Ti was a great GPU.
link |
01:32:18.000
Still have a lot of them.
link |
01:32:18.840
Still is, yeah.
link |
01:32:21.840
But at the same time, it just feels like,
link |
01:32:26.860
feels like you don't want to put all your stock in NVIDIA.
link |
01:32:29.380
And so like Elon is doing, what Tesla is doing
link |
01:32:32.940
with Autopilot and Dojo is the Apple way is,
link |
01:32:37.260
because they're not going to share Dojo with George Hotz.
link |
01:32:40.300
I know.
link |
01:32:42.340
They should sell that chip.
link |
01:32:43.780
Oh, they should sell that.
link |
01:32:44.700
Even their accelerator.
link |
01:32:46.400
The accelerator that's in all the cars, the 30 watt one.
link |
01:32:49.060
Sell it, why not?
link |
01:32:51.580
So open it up.
link |
01:32:52.700
Like make, why does Tesla have to be a car company?
link |
01:32:55.820
Well, if you sell the chip, here's what you get.
link |
01:32:58.060
Yeah.
link |
01:32:59.020
Make some money off the chips.
link |
01:33:00.260
It doesn't take away from your chip.
link |
01:33:02.080
You're going to make some money, free money.
link |
01:33:03.860
And also the world is going to build an ecosystem
link |
01:33:07.380
of tooling for you.
link |
01:33:09.020
Right?
link |
01:33:09.860
You're not going to have to fix the bug in your tanh layer.
link |
01:33:12.860
Someone else already did.
link |
01:33:15.140
Well, the question, that's an interesting question.
link |
01:33:16.780
I mean, that's the question Steve Jobs asked.
link |
01:33:18.740
That's the question Elon Musk is perhaps asking is,
link |
01:33:24.940
do you want Tesla stuff inside other vehicles?
link |
01:33:28.060
Inside, potentially inside like an iRobot vacuum cleaner.
link |
01:33:32.620
Yeah.
link |
01:33:34.860
I think you should decide where your advantages are.
link |
01:33:37.160
I'm not saying Tesla should start selling battery packs
link |
01:33:39.260
to automakers.
link |
01:33:40.380
Because battery packs to automakers,
link |
01:33:41.720
they are straight up in competition with you.
link |
01:33:43.660
If I were Tesla, I'd keep the battery technology totally.
link |
01:33:46.060
Yeah.
link |
01:33:46.900
As far as we make batteries.
link |
01:33:47.940
But the thing about the Tesla TPU is anybody can build that.
link |
01:33:53.160
It's just a question of, you know,
link |
01:33:54.600
are you willing to spend the money?
link |
01:33:57.460
It could be a huge source of revenue potentially.
link |
01:34:00.220
Are you willing to spend a hundred million dollars?
link |
01:34:02.420
Anyone can build it.
link |
01:34:03.660
And someone will.
link |
01:34:04.680
And a bunch of companies now are starting
link |
01:34:06.680
to try to build AI accelerators.
link |
01:34:08.020
Somebody is going to get the idea right.
link |
01:34:10.200
And yeah, hopefully they don't get greedy
link |
01:34:13.700
because they'll just lose to the next guy who finally,
link |
01:34:15.780
and then eventually the Chinese are going to make knockoff
link |
01:34:17.540
NVIDIA chips and that's...
link |
01:34:19.580
From your perspective,
link |
01:34:20.400
I don't know if you're also paying attention
link |
01:34:21.820
to stay on Tesla for a moment.
link |
01:34:24.180
Elon Musk has talked about a complete rewrite
link |
01:34:28.820
of the neural net that they're using.
link |
01:34:31.700
That seems to, again, I'm half paying attention,
link |
01:34:34.800
but it seems to involve basically a kind of integration
link |
01:34:39.000
of all the sensors to where it's a four dimensional view.
link |
01:34:44.180
You know, you have a 3D model of the world over time.
link |
01:34:47.540
And then you can, I think it's done both for the,
link |
01:34:52.100
for the actually, you know,
link |
01:34:53.280
so the neural network is able to,
link |
01:34:55.260
in a more holistic way,
link |
01:34:56.920
deal with the world and make predictions and so on,
link |
01:34:59.340
but also to make the annotation task more, you know, easier.
link |
01:35:04.780
Like you can annotate the world in one place
link |
01:35:08.180
and then kind of distribute itself across the sensors
link |
01:35:10.500
and across a different,
link |
01:35:12.860
like the hundreds of tasks that are involved
link |
01:35:15.180
in the HydraNet.
link |
01:35:16.540
What are your thoughts about this rewrite?
link |
01:35:19.140
Is it just like some details that are kind of obvious
link |
01:35:22.420
that are steps that should be taken,
link |
01:35:24.060
or is there something fundamental
link |
01:35:26.100
that could challenge your idea
link |
01:35:27.560
that end to end is the right solution?
link |
01:35:31.120
We're in the middle of a big rewrite now as well.
link |
01:35:33.160
We haven't shipped a new model in a bit.
link |
01:35:34.860
Of what kind?
link |
01:35:36.400
We're going from 2D to 3D.
link |
01:35:38.240
Right now, all our stuff, like for example,
link |
01:35:39.740
when the car pitches back,
link |
01:35:40.980
the lane lines also pitch back
link |
01:35:43.020
because we're assuming the flat world hypothesis.
link |
01:35:47.160
The new models do not do this.
link |
01:35:48.420
The new models output everything in 3D.
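(A minimal sketch, not openpilot's actual code, of the flat-world hypothesis being described here: lane-line pixels are back-projected onto an assumed flat road plane, so if the car pitches and the assumed camera pitch doesn't change, the recovered lane points pitch back with it. The camera convention, intrinsics, and numbers below are illustrative assumptions.)

```python
import numpy as np

def pixel_to_ground(u, v, K, cam_height=1.2, pitch=0.0):
    """Intersect the camera ray through pixel (u, v) with a flat road plane.

    Camera frame: x right, y down, z forward; the road is assumed to be the
    plane y = cam_height, i.e. cam_height meters below the camera.
    Returns (lateral, forward) position on the road in meters, or None if the
    pixel is at or above the horizon.
    """
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])   # ray in camera frame
    c, s = np.cos(pitch), np.sin(pitch)
    R = np.array([[1, 0, 0],
                  [0, c, -s],
                  [0, s,  c]])                        # assumed camera pitch
    ray = R @ ray
    if ray[1] <= 1e-6:                                # at or above the horizon
        return None
    t = cam_height / ray[1]                           # scale ray to hit road
    ground = t * ray
    return ground[0], ground[2]

# Hypothetical intrinsics: focal length 910 px, principal point (640, 360).
K = np.array([[910.0,   0.0, 640.0],
              [  0.0, 910.0, 360.0],
              [  0.0,   0.0,   1.0]])

print(pixel_to_ground(700, 500, K))              # assumed pitch matches reality
print(pixel_to_ground(700, 500, K, pitch=0.03))  # car pitched ~1.7 degrees: the
                                                 # same pixel now lands elsewhere
```

The second call shows the artifact: the identical lane-line pixel maps to a noticeably different spot on the road once the pitch assumption is wrong, which is why outputting everything in 3D instead removes that failure mode.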
link |
01:35:50.420
But there's still no annotation.
link |
01:35:53.580
So the 3D is, it's more about the output.
link |
01:35:56.500
Yeah.
link |
01:35:57.340
We have Zs in everything.
link |
01:36:00.100
We've...
link |
01:36:00.940
Zs.
link |
01:36:01.760
Yeah.
link |
01:36:02.600
We had a Zs.
link |
01:36:03.420
We had a Zs.
link |
01:36:04.260
We unified a lot of stuff as well.
link |
01:36:06.620
We switched from TensorFlow to PyTorch.
link |
01:36:10.720
My understanding of what Tesla's thing is,
link |
01:36:13.740
is that their annotator now annotates
link |
01:36:15.660
across the time dimension.
link |
01:36:16.960
Mm hmm.
link |
01:36:19.980
I mean, cute.
link |
01:36:22.000
Why are you building an annotator?
link |
01:36:24.400
I find their entire pipeline.
link |
01:36:28.280
I find your vision, I mean,
link |
01:36:30.560
the vision of end to end very compelling,
link |
01:36:32.860
but I also like the engineering of the data engine
link |
01:36:35.880
that they've created.
link |
01:36:37.400
In terms of supervised learning pipelines,
link |
01:36:41.560
that thing is damn impressive.
link |
01:36:43.560
You're basically, the idea is that you have
link |
01:36:47.480
hundreds of thousands of people
link |
01:36:49.280
that are doing data collection for you
link |
01:36:51.200
by doing their experience.
link |
01:36:52.400
So that's kind of similar to the Comma AI model.
link |
01:36:55.220
And you're able to mine that data
link |
01:36:59.580
based on the kind of edge cases you need.
link |
01:37:02.940
I think it's harder to do in the end to end learning.
link |
01:37:07.360
The mining of the right edge cases.
link |
01:37:09.520
Like that's where feature engineering
link |
01:37:11.440
is actually really powerful
link |
01:37:14.120
because like us humans are able to do
link |
01:37:17.280
this kind of mining a little better.
link |
01:37:19.800
But yeah, there's obvious, as we know,
link |
01:37:21.980
there's obvious constraints and limitations to that idea.
link |
01:37:25.880
Karpathy just tweeted, he's like,
link |
01:37:28.280
you get really interesting insights
link |
01:37:29.640
if you sort your validation set by loss
link |
01:37:33.720
and look at the highest loss examples.
link |
01:37:36.400
Yeah.
link |
01:37:37.560
So yeah, I mean, you can do,
link |
01:37:39.180
we have a little data engine like thing.
link |
01:37:42.040
We're training a segnet.
link |
01:37:43.560
I know it's not fancy.
link |
01:37:44.560
It's just like, okay, train the new segnet,
link |
01:37:48.280
run it on 100,000 images
link |
01:37:50.080
and now take the thousand with highest loss.
link |
01:37:52.160
Select a hundred of those by human,
link |
01:37:54.160
put those, get those ones labeled, retrain, do it again.
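(A minimal PyTorch-flavored sketch of that loop, under the heavy assumption of a toy model and a toy labeling oracle; it is meant to show the shape of the train / rank-by-loss / label-the-worst / retrain cycle just described, not Comma's or Tesla's actual pipeline.)

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 1))
loss_fn = nn.MSELoss(reduction="none")
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

labeled_x = torch.randn(256, 16)
labeled_y = labeled_x.sum(dim=1, keepdim=True)       # toy target
pool_x = torch.randn(10_000, 16)                     # stand-in for "100,000 images"

def human_label(x):
    # Stand-in for the manual labeling step.
    return x.sum(dim=1, keepdim=True)

for round_ in range(3):
    # 1. Train on everything labeled so far.
    for _ in range(200):
        opt.zero_grad()
        loss = loss_fn(model(labeled_x), labeled_y).mean()
        loss.backward()
        opt.step()

    # 2. Run the model over the pool and rank by loss. (Here we cheat and use
    #    the toy oracle as the reference; in practice you would rank by a
    #    surrogate such as model disagreement or temporal flicker.)
    with torch.no_grad():
        pool_loss = loss_fn(model(pool_x), human_label(pool_x)).squeeze(1)
    worst = pool_loss.topk(100).indices               # take the highest-loss ones

    # 3. Get those labeled and fold them into the training set.
    #    (A real pipeline would also remove them from the unlabeled pool.)
    new_x = pool_x[worst]
    labeled_x = torch.cat([labeled_x, new_x])
    labeled_y = torch.cat([labeled_y, human_label(new_x)])
    print(f"round {round_}: train size {len(labeled_x)}, "
          f"worst pool loss {pool_loss.max():.3f}")
```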
link |
01:37:57.840
And so it's a much less well written data engine.
link |
01:38:01.480
And yeah, you can take these things really far
link |
01:38:03.600
and it is impressive engineering.
link |
01:38:06.600
And if you truly need supervised data for a problem,
link |
01:38:09.920
yeah, things like data engine are at the high end
link |
01:38:12.560
of what is attention?
link |
01:38:14.960
Is a human paying attention?
link |
01:38:15.960
I mean, we're going to probably build something
link |
01:38:17.960
that looks like data engine
link |
01:38:18.940
to push our driver monitoring further.
link |
01:38:21.120
But for driving itself,
link |
01:38:22.920
you have it all annotated beautifully by what the human does.
link |
01:38:26.400
Yeah, that's interesting.
link |
01:38:27.240
I mean, that applies to driver attention as well.
link |
01:38:30.040
Do you want to detect the eyes?
link |
01:38:31.200
Do you want to detect blinking and pupil movement?
link |
01:38:33.500
Do you want to detect all the like face alignments
link |
01:38:36.680
or landmark detection and so on,
link |
01:38:38.720
and then doing kind of reasoning based on that?
link |
01:38:41.560
Or do you want to take the entirety of the face over time
link |
01:38:43.880
and do end to end?
link |
01:38:45.320
I mean, it's obvious that eventually you have to do end
link |
01:38:48.480
to end with some calibration, some fixes and so on,
link |
01:38:51.360
but it's like, I don't know when that's the right move.
link |
01:38:55.760
Even if it's end to end, there actually is,
link |
01:38:58.420
there is no kind of, you have to supervise that with humans.
link |
01:39:03.380
Whether a human is paying attention or not
link |
01:39:05.480
is a completely subjective judgment.
link |
01:39:08.560
Like you can try to like automatically do it
link |
01:39:11.040
with some stuff, but you don't have,
link |
01:39:13.160
if I record a video of a human,
link |
01:39:15.080
I don't have true annotations anywhere in that video.
link |
01:39:18.400
The only way to get them is with,
link |
01:39:21.120
you know, other humans labeling it really.
link |
01:39:22.840
Well, I don't know.
link |
01:39:26.080
If you think deeply about it,
link |
01:39:28.540
you could, you might be able to just,
link |
01:39:30.160
depending on the task,
link |
01:39:31.000
maybe discover self annotating things like,
link |
01:39:34.320
you know, you can look at like steering wheel reversals
link |
01:39:36.920
or something like that.
link |
01:39:37.760
You can discover little moments of lapse of attention.
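(One hypothetical way to mine such self-annotating moments from logged signals: flag a sharp steering correction after a stretch of almost no steering activity and treat the surrounding frames as weak "possible inattention" labels. The sample rate, thresholds, and signal names below are made-up assumptions for illustration, not anything Comma actually ships.)

```python
import numpy as np

np.random.seed(0)

def candidate_inattention_events(steering_deg, hz=10,
                                 quiet_s=3.0, quiet_thresh=1.0,
                                 jerk_thresh=5.0):
    """Return indices where a sharp correction follows a quiet stretch.

    steering_deg: steering wheel angle samples in degrees at `hz` Hz.
    quiet_s / quiet_thresh: how long and how still (deg/s) the preceding
    stretch must be.  jerk_thresh: how fast the correction must be (deg/s).
    """
    steering_deg = np.asarray(steering_deg, dtype=float)
    rate = np.abs(np.diff(steering_deg, prepend=steering_deg[0])) * hz
    quiet_n = int(quiet_s * hz)
    events = []
    for i in range(quiet_n, len(steering_deg)):
        quiet = rate[i - quiet_n:i].max() < quiet_thresh   # barely steering
        jerk = rate[i] > jerk_thresh                        # sudden correction
        if quiet and jerk:
            events.append(i)
    return events

# Toy trace: ~8 s of nearly straight driving, then an abrupt correction.
trace = np.concatenate([np.random.normal(0.0, 0.01, 80),
                        [0.0, 4.0, 8.0, 6.0]])
print(candidate_inattention_events(trace))   # flags the correction around 8.1 s
```

The flagged timestamps would then be the "zoom in here" candidates for the camera data, exactly the kind of weak supervision being talked about.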
link |
01:39:41.200
I mean, that's where psychology comes in.
link |
01:39:44.540
Are there indicators,
link |
01:39:45.480
cause you have so much data to look at.
link |
01:39:48.000
So you might be able to find moments when there's like,
link |
01:39:51.140
just inattention. Even with smartphones,
link |
01:39:54.320
if you want to detect smartphone use,
link |
01:39:56.700
you can start to zoom in.
link |
01:39:57.880
I mean, that's the gold mine, sort of, of Comma AI.
link |
01:40:01.480
I mean, Tesla is doing this too, right?
link |
01:40:02.920
Is they're doing annotation based on,
link |
01:40:06.920
it's like a self supervised learning too.
link |
01:40:10.500
It's just a small part of the entire picture.
link |
01:40:13.440
That's kind of the challenge of solving a problem
link |
01:40:17.760
in machine learning.
link |
01:40:18.660
If you can discover self annotating parts of the problem,
link |
01:40:24.000
right?
link |
01:40:25.020
Our driver monitoring team is half a person right now.
link |
01:40:27.760
I would, you know, once we have,
link |
01:40:29.280
once we have two, three people on that team,
link |
01:40:33.280
I definitely want to look at self annotating stuff
link |
01:40:35.280
for attention.
link |
01:40:38.200
Let's go back for a sec to a comma and what,
link |
01:40:43.720
you know, for people who are curious to try it out,
link |
01:40:46.240
how do you install a comma in say a 2020 Toyota Corolla
link |
01:40:51.120
or like, what are the cars that are supported?
link |
01:40:53.400
What are the cars that you recommend?
link |
01:40:55.500
And what does it take?
link |
01:40:57.880
You have a few videos out, but maybe through words,
link |
01:41:00.000
can you explain what's it take to actually install a thing?
link |
01:41:02.880
So we support, I think it's 91 cars, 91 makes and models.
link |
01:41:08.080
We've got to get to 100 this year.
link |
01:41:10.160
Nice.
link |
01:41:11.000
The, yeah, the 2020 Corolla, great choice.
link |
01:41:16.680
The 2020 Sonata, it's using the stock longitudinal.
link |
01:41:21.200
It's using just our lateral control,
link |
01:41:23.280
but it's a very refined car.
link |
01:41:25.140
Their longitudinal control is not bad at all.
link |
01:41:28.200
So yeah, Corolla, Sonata,
link |
01:41:31.720
or if you're willing to get your hands a little dirty
link |
01:41:34.240
and look in the right places on the internet,
link |
01:41:35.940
the Honda Civic is great,
link |
01:41:37.520
but you're going to have to install a modified EPS firmware
link |
01:41:40.600
in order to get a little bit more torque.
link |
01:41:42.160
And I can't help you with that.
link |
01:41:43.380
Comma does not officially endorse that,
link |
01:41:45.960
but we have been doing it.
link |
01:41:47.560
We didn't ever release it.
link |
01:41:49.800
We waited for someone else to discover it.
link |
01:41:51.440
And then, you know.
link |
01:41:52.880
And you have a Discord server where people,
link |
01:41:55.680
there's a very active developer community, I suppose.
link |
01:42:00.360
So depending on the level of experimentation
link |
01:42:04.000
you're willing to do, that's the community.
link |
01:42:07.600
If you just want to buy it and you have a supported car,
link |
01:42:11.240
it's 10 minutes to install.
link |
01:42:13.920
There's YouTube videos.
link |
01:42:15.400
It's Ikea furniture level.
link |
01:42:17.040
If you can set up a table from Ikea,
link |
01:42:19.040
you can install a Comma 2 in your supported car
link |
01:42:21.200
and it will just work.
link |
01:42:22.600
Now you're like, oh, but I want this high end feature
link |
01:42:24.920
or I want to fix this bug.
link |
01:42:26.160
Okay, well, welcome to the developer community.
link |
01:42:29.520
So what, if I wanted to,
link |
01:42:31.000
this is something I asked you offline like a few months ago.
link |
01:42:34.680
If I wanted to run my own code to,
link |
01:42:39.800
so use Comma as a platform
link |
01:42:43.420
and try to run something like OpenPilot,
link |
01:42:46.040
what does it take to do that?
link |
01:42:48.320
So there's a toggle in the settings called enable SSH.
link |
01:42:51.840
And if you toggle that, you can SSH into your device.
link |
01:42:54.600
You can modify the code.
link |
01:42:55.620
You can upload whatever code you want to it.
link |
01:42:58.260
There's a whole lot of people.
link |
01:42:59.160
So about 60% of people are running stock comma.
link |
01:43:03.040
About 40% of people are running forks.
link |
01:43:05.480
And there's a community of,
link |
01:43:07.280
there's a bunch of people who maintain these forks
link |
01:43:10.320
and these forks support different cars
link |
01:43:13.040
or they have different toggles.
link |
01:43:15.700
We try to keep away from the toggles
link |
01:43:17.360
that are like disabled driver monitoring,
link |
01:43:18.920
but there's some people might want that kind of thing
link |
01:43:21.720
and like, yeah, you can, it's your car.
link |
01:43:24.560
I'm not here to tell you.
link |
01:43:29.400
We have some, we ban,
link |
01:43:31.080
if you're trying to subvert safety features,
link |
01:43:32.920
you're banned from our Discord.
link |
01:43:33.760
I don't want anything to do with you,
link |
01:43:35.260
but there's some forks doing that.
link |
01:43:37.920
Got it.
link |
01:43:39.880
So you encourage responsible forking.
link |
01:43:42.880
Yeah, yeah.
link |
01:43:43.720
We encourage, some people, yeah, some people,
link |
01:43:46.080
like there's forks that will do,
link |
01:43:48.160
some people just like having a lot of readouts on the UI,
link |
01:43:52.040
like a lot of like flashing numbers.
link |
01:43:53.440
So there's forks that do that.
link |
01:43:55.120
Some people don't like the fact that it disengages
link |
01:43:57.200
when you press the gas pedal.
link |
01:43:58.320
There's forks that disable that.
link |
01:44:00.480
Got it.
link |
01:44:01.320
Now the stock experience is what like,
link |
01:44:04.920
so it does both lane keeping
link |
01:44:06.240
and longitudinal control all together.
link |
01:44:08.960
So it's not separate like it is in autopilot.
link |
01:44:11.020
No, so, okay.
link |
01:44:12.520
Some cars we use the stock longitudinal control.
link |
01:44:15.040
We don't do the longitudinal control in all the cars.
link |
01:44:17.400
Some cars, the ACCs are pretty good in the cars.
link |
01:44:19.560
It's the lane keep that's atrocious in anything
link |
01:44:21.360
except for autopilot and super cruise.
link |
01:44:23.420
But, you know, you just turn it on and it works.
link |
01:44:27.880
What does this engagement look like?
link |
01:44:29.480
Yeah, so we have, I mean,
link |
01:44:30.960
I'm very concerned about mode confusion.
link |
01:44:32.960
I've experienced it on super cruise and autopilot
link |
01:44:36.960
where like autopilot, like autopilot disengages.
link |
01:44:39.820
I don't realize that the ACC is still on.
link |
01:44:42.400
The lead car moves slightly over
link |
01:44:44.680
and then the Tesla accelerates
link |
01:44:46.160
to like whatever my set speed is super fast.
link |
01:44:48.000
I'm like, what's going on here?
link |
01:44:51.320
We have engaged and disengaged.
link |
01:44:53.720
And this is similar to my understanding, I'm not a pilot,
link |
01:44:56.380
but my understanding is either the pilot is in control
link |
01:45:00.080
or the copilot is in control.
link |
01:45:02.080
And we have the same kind of transition system.
link |
01:45:05.120
Either open pilot is engaged or open pilot is disengaged.
link |
01:45:08.620
Engage with cruise control,
link |
01:45:10.160
disengage with either gas brake or cancel.
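(A toy sketch of that all-or-nothing transition model, with made-up event names rather than openpilot's real interface: one engaged flag, set by the cruise button, cleared by gas, brake, or cancel, so there is never a half-engaged state to get confused about.)

```python
# Hypothetical illustration of the engagement model described above.
DISENGAGE_EVENTS = {"gas", "brake", "cancel"}

class Engagement:
    def __init__(self):
        # The system controls both lateral and longitudinal only when this
        # is True; otherwise it controls nothing, avoiding mode confusion.
        self.engaged = False

    def on_event(self, event: str) -> bool:
        if event == "cruise_engage":
            self.engaged = True
        elif event in DISENGAGE_EVENTS:
            self.engaged = False       # everything disengages together
        return self.engaged

state = Engagement()
for e in ["cruise_engage", "brake", "gas", "cruise_engage", "cancel"]:
    print(e, "->", "engaged" if state.on_event(e) else "disengaged")
```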
link |
01:45:13.320
Let's talk about money.
link |
01:45:14.560
What's the business strategy for Comma?
link |
01:45:17.360
Profitable.
link |
01:45:18.840
Well, so you're.
link |
01:45:19.680
We did it.
link |
01:45:20.520
So congratulations.
link |
01:45:23.200
What, so basically selling,
link |
01:45:25.760
so we should say Comma costs a thousand bucks, Comma two?
link |
01:45:29.960
200 for the interface to the car as well.
link |
01:45:31.640
It's 1200, I'll send that.
link |
01:45:34.360
Nobody's usually upfront like this.
link |
01:45:36.400
Yeah, you gotta add the tack on, right?
link |
01:45:38.200
Yeah.
link |
01:45:39.040
I love it.
link |
01:45:39.860
I'm not gonna lie to you.
link |
01:45:41.080
Trust me, it will add $1,200 of value to your life.
link |
01:45:43.840
Yes, it's still super cheap.
link |
01:45:45.560
30 days, no questions asked, money back guarantee,
link |
01:45:47.880
and prices are only going up.
link |
01:45:50.400
If there ever is future hardware,
link |
01:45:52.320
it could cost a lot more than $1,200.
link |
01:45:53.840
So Comma three is in the works.
link |
01:45:56.960
It could be.
link |
01:45:57.800
All I will say is future hardware
link |
01:45:59.640
is going to cost a lot more than the current hardware.
link |
01:46:02.900
Yeah, the people that use,
link |
01:46:05.260
the people I've spoken with that use Comma,
link |
01:46:07.880
that use open pilot,
link |
01:46:10.320
first of all, they use it a lot.
link |
01:46:12.120
So people that use it, they fall in love with it.
link |
01:46:14.400
Oh, our retention rate is insane.
link |
01:46:16.840
It's a good sign.
link |
01:46:17.740
Yeah.
link |
01:46:18.580
It's a really good sign.
link |
01:46:19.680
70% of Comma two buyers are daily active users.
link |
01:46:23.720
Yeah, it's amazing.
link |
01:46:27.760
Oh, also, we don't plan on stopping selling the Comma two.
link |
01:46:30.520
Like it's, you know.
link |
01:46:31.960
So whatever you create that's beyond Comma two,
link |
01:46:36.600
it would be potentially a phase shift.
link |
01:46:40.760
Like it's so much better that,
link |
01:46:42.840
like you could use Comma two
link |
01:46:44.200
and you can use Comma whatever.
link |
01:46:45.760
Depends what you want.
link |
01:46:46.600
It's 3.41, 42.
link |
01:46:48.320
Yeah.
link |
01:46:49.160
You know, autopilot hardware one versus hardware two.
link |
01:46:52.200
The Comma two is kind of like hardware one.
link |
01:46:53.600
Got it, got it.
link |
01:46:54.440
You can still use both.
link |
01:46:55.260
Got it, got it.
link |
01:46:56.320
I think I heard you talk about retention rate
link |
01:46:58.000
with the VR headsets that the average is just once.
link |
01:47:01.240
Yeah.
link |
01:47:02.080
Just fast.
link |
01:47:02.900
I mean, it's such a fascinating way
link |
01:47:03.880
to think about technology.
link |
01:47:05.760
And this is a really, really good sign.
link |
01:47:07.420
And the other thing that people say about Comma
link |
01:47:09.000
is like they can't believe they're getting this for a thousand bucks.
link |
01:47:12.060
Right?
link |
01:47:12.900
It seems like some kind of steal.
link |
01:47:17.020
So, but in terms of like longterm business strategies
link |
01:47:20.040
that basically to put,
link |
01:47:21.640
so it's currently in like a thousand plus cars.
link |
01:47:27.560
1,200.
link |
01:47:28.520
More, more.
link |
01:47:30.560
So yeah, dailies is about, dailies is about 2,000.
link |
01:47:35.560
Weeklies is about 2,500, monthlies is over 3,000.
link |
01:47:38.960
Wow.
link |
01:47:39.800
We've grown a lot since we last talked.
link |
01:47:42.120
Is the goal, like can we talk crazy for a second?
link |
01:47:44.800
I mean, is the goal to overtake Tesla?
link |
01:47:48.620
Let's talk, okay, so.
link |
01:47:49.960
I mean, Android did overtake iOS.
link |
01:47:51.480
That's exactly it, right?
link |
01:47:52.520
So they did it.
link |
01:47:55.360
I actually don't know the timeline of that one.
link |
01:47:57.720
But let's talk, because everything is in alpha now.
link |
01:48:02.160
The autopilot you could argue is in alpha
link |
01:48:03.960
in terms of towards the big mission
link |
01:48:05.840
of autonomous driving, right?
link |
01:48:07.680
And so what, yeah, is your goal to overtake
link |
01:48:11.600
millions of cars essentially?
link |
01:48:13.920
Of course.
link |
01:48:15.520
Where would it stop?
link |
01:48:16.760
Like it's open source software.
link |
01:48:18.000
It might not be millions of cars
link |
01:48:19.280
with a piece of comma hardware, but yeah.
link |
01:48:21.360
I think open pilot at some point
link |
01:48:24.320
will cross over autopilot in users,
link |
01:48:26.920
just like Android crossed over iOS.
link |
01:48:29.200
How does Google make money from Android?
link |
01:48:31.240
It's complicated.
link |
01:48:34.840
Their own devices make money.
link |
01:48:37.400
Google, Google makes money
link |
01:48:39.480
by just kind of having you on the internet.
link |
01:48:42.240
Yes.
link |
01:48:43.080
Google search is built in, Gmail is built in.
link |
01:48:45.620
Android is just a shill
link |
01:48:46.540
for the rest of Google's ecosystem.
link |
01:48:48.200
Yeah, but the problem is Android is not,
link |
01:48:50.680
is a brilliant thing.
link |
01:48:52.440
I mean, Android arguably changed the world.
link |
01:48:55.080
So there you go.
link |
01:48:56.440
That's, you can feel good ethically speaking.
link |
01:49:00.800
But as a business strategy, it's questionable.
link |
01:49:04.320
Or sell hardware.
link |
01:49:05.720
Sell hardware.
link |
01:49:06.560
I mean, it took Google a long time to come around to it,
link |
01:49:08.120
but they are now making money on the Pixel.
link |
01:49:10.000
You're not about money, you're more about winning.
link |
01:49:13.240
Yeah, of course.
link |
01:49:14.160
No, but if only 10% of open pilot devices
link |
01:49:18.320
come from comma AI.
link |
01:49:19.920
They still make a lot.
link |
01:49:20.820
That is still, yes.
link |
01:49:21.660
That is a ton of money for our company.
link |
01:49:22.840
But can't somebody create a better comma using open pilot?
link |
01:49:27.120
Or are you basically saying, well, I'll compete them?
link |
01:49:28.840
Well, I'll compete you.
link |
01:49:29.680
Can you create a better Android phone than the Google Pixel?
link |
01:49:32.280
Right.
link |
01:49:32.800
I mean, you can, but like, you know.
link |
01:49:34.680
I love that.
link |
01:49:35.520
So you're confident, like, you know
link |
01:49:37.360
what the hell you're doing.
link |
01:49:38.480
Yeah.
link |
01:49:40.040
It's confidence and merit.
link |
01:49:43.520
I mean, our money comes from, we're
link |
01:49:44.960
a consumer electronics company.
link |
01:49:46.160
Yeah.
link |
01:49:46.660
And put it this way.
link |
01:49:48.040
So we sold like 3,000 comma twos.
link |
01:49:51.600
2,500 right now.
link |
01:49:54.080
And like, OK, we're probably going
link |
01:49:59.720
to sell 10,000 units next year.
link |
01:50:01.920
10,000 units, even just $1,000 a unit, OK,
link |
01:50:04.520
we're at 10 million in revenue.
link |
01:50:09.400
Get that up to 100,000, maybe double the price of the unit.
link |
01:50:12.100
Now we're talking like 200 million revenue.
link |
01:50:13.560
We're talking like series.
link |
01:50:14.440
Yeah, actually making money.
link |
01:50:15.840
One of the rare semi autonomous or autonomous vehicle companies
link |
01:50:19.360
that are actually making money.
link |
01:50:21.080
Yeah.
link |
01:50:22.600
You know, if you look at a model,
link |
01:50:24.880
and we were just talking about this yesterday.
link |
01:50:26.680
If you look at a model, and like you're AB testing your model,
link |
01:50:29.760
and if you're one branch of the AB test,
link |
01:50:32.360
the losses go down very fast in the first five epochs.
link |
01:50:35.320
That model is probably going to converge
link |
01:50:37.520
to something considerably better than the one
link |
01:50:39.360
where the losses are going down slower.
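(A purely illustrative sketch of that heuristic, with made-up loss curves: compare how far each A/B branch's loss falls over the first few epochs and keep the one that drops faster.)

```python
def early_loss_drop(losses, epochs=5):
    """How much the loss fell over the first `epochs` epochs."""
    window = losses[:epochs]
    return window[0] - window[-1]

branch_a = [2.3, 1.6, 1.2, 1.0, 0.9, 0.85, 0.82]   # drops fast early
branch_b = [2.3, 2.1, 1.9, 1.8, 1.7, 1.5, 1.3]     # drops slowly

better = "A" if early_loss_drop(branch_a) > early_loss_drop(branch_b) else "B"
print("keep branch", better)   # -> A, per the heuristic described above
```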
link |
01:50:41.360
Why do people think this is going to stop?
link |
01:50:43.000
Why do people think one day there's
link |
01:50:44.500
going to be a great like, well, Waymo's eventually
link |
01:50:46.840
going to surpass you guys?
link |
01:50:49.600
Well, they're not.
link |
01:50:52.000
Do you see like a world where like a Tesla or a car
link |
01:50:55.800
like a Tesla would be able to basically press a button
link |
01:50:59.160
and you like switch to open pilot?
link |
01:51:01.920
You know, you load in.
link |
01:51:04.480
No, so I think so first off, I think
link |
01:51:06.840
that we may surpass Tesla in terms of users.
link |
01:51:10.560
I do not think we're going to surpass Tesla ever
link |
01:51:12.520
in terms of revenue.
link |
01:51:13.520
I think Tesla can capture a lot more revenue per user
link |
01:51:16.380
than we can.
link |
01:51:17.320
But this mimics the Android iOS model exactly.
link |
01:51:20.520
There may be more Android devices,
link |
01:51:22.000
but there's a lot more iPhones than Google Pixels.
link |
01:51:24.320
So I think there'll be a lot more Tesla cars sold
link |
01:51:26.360
than pieces of Comma hardware.
link |
01:51:30.280
And then as far as a Tesla owner being
link |
01:51:34.420
able to switch to open pilot, do iPhones run Android?
link |
01:51:40.920
No, but it doesn't make sense.
link |
01:51:42.400
You can if you really want to do it,
link |
01:51:43.480
but it doesn't really make sense.
link |
01:51:44.440
Like it's not.
link |
01:51:45.320
It doesn't make sense.
link |
01:51:46.240
Who cares?
link |
01:51:46.740
What about if a large company like automakers, Ford, GM,
link |
01:51:51.640
Toyota came to George Hotz?
link |
01:51:53.720
Or on the tech space, Amazon, Facebook, Google
link |
01:51:58.000
came with a large pile of cash?
link |
01:52:01.080
Would you consider being purchased?
link |
01:52:07.360
Do you see that as a one possible?
link |
01:52:10.500
Not seriously, no.
link |
01:52:12.360
I would probably see how much shit they'll entertain from me.
link |
01:52:19.680
And if they're willing to jump through a bunch of my hoops,
link |
01:52:22.080
then maybe.
link |
01:52:22.960
But no, not the way that M&A works today.
link |
01:52:25.200
I mean, we've been approached.
link |
01:52:26.520
And I laugh in these people's faces.
link |
01:52:28.000
I'm like, are you kidding?
link |
01:52:31.000
Yeah.
link |
01:52:31.600
Because it's so demeaning.
link |
01:52:33.680
The M&A people are so demeaning to companies.
link |
01:52:36.960
They treat the startup world as their innovation ecosystem.
link |
01:52:41.340
And they think that I'm cool with going along with that,
link |
01:52:43.640
so I can have some of their scam fake Fed dollars.
link |
01:52:46.440
Fed coin.
link |
01:52:47.680
What am I going to do with more Fed coin?
link |
01:52:49.680
Fed coin.
link |
01:52:50.320
Fed coin, man.
link |
01:52:51.400
I love that.
link |
01:52:52.120
So that's the cool thing about podcasting,
link |
01:52:54.040
actually, is people criticize.
link |
01:52:56.200
I don't know if you're familiar with Spotify giving Joe Rogan
link |
01:53:00.560
$100 million.
link |
01:53:01.920
I don't know about that.
link |
01:53:03.720
And they respect, despite all the shit
link |
01:53:08.200
that people are talking about Spotify,
link |
01:53:11.440
people understand that podcasters like Joe Rogan
link |
01:53:15.300
know what the hell they're doing.
link |
01:53:17.200
So they give them money and say, just do what you do.
link |
01:53:21.160
And the equivalent for you would be like,
link |
01:53:25.440
George, do what the hell you do, because you're good at it.
link |
01:53:28.440
Try not to murder too many people.
link |
01:53:31.400
There's some kind of common sense things,
link |
01:53:33.480
like just don't go on a weird rampage of it.
link |
01:53:37.400
Yeah.
link |
01:53:38.080
It comes down to what companies I could respect, right?
link |
01:53:43.400
Could I respect GM?
link |
01:53:44.440
Never.
link |
01:53:46.520
No, I couldn't.
link |
01:53:47.480
I mean, could I respect a Hyundai?
link |
01:53:50.840
More so.
link |
01:53:52.520
That's a lot closer.
link |
01:53:53.560
Toyota?
link |
01:53:54.720
What's your?
link |
01:53:55.840
Nah.
link |
01:53:56.840
Nah.
link |
01:53:57.520
Korean is the way.
link |
01:53:59.400
I think that the Japanese, the Germans, the US, they're all
link |
01:54:02.440
too, they're all too, they all think they're too great.
link |
01:54:05.720
What about the tech companies?
link |
01:54:07.520
Apple?
link |
01:54:08.480
Apple is, of the tech companies that I could respect,
link |
01:54:11.000
Apple's the closest.
link |
01:54:12.120
Yeah.
link |
01:54:12.600
I mean, I could never.
link |
01:54:13.560
It would be ironic.
link |
01:54:14.440
It would be ironic if Comma AI is acquired by Apple.
link |
01:54:19.840
I mean, Facebook, look, I quit Facebook 10 years ago
link |
01:54:21.840
because I didn't respect the business model.
link |
01:54:24.320
Google has declined so fast in the last five years.
link |
01:54:28.480
What are your thoughts about Waymo and its present
link |
01:54:32.080
and its future?
link |
01:54:33.200
Let me start by saying something nice, which is I've
link |
01:54:39.240
visited them a few times and have ridden in their cars.
link |
01:54:45.240
And the engineering that they're doing,
link |
01:54:49.760
both the research and the actual development
link |
01:54:51.720
and the engineering they're doing
link |
01:54:53.360
and the scale they're actually achieving
link |
01:54:55.160
by doing it all themselves is really impressive.
link |
01:54:58.400
And the balance of safety and innovation.
link |
01:55:01.480
And the cars work really well for the routes they drive.
link |
01:55:07.520
It drives fast, which was very surprising to me.
link |
01:55:10.960
It drives the speed limit or faster than the speed limit.
link |
01:55:14.800
It goes.
link |
01:55:16.120
And it works really damn well.
link |
01:55:17.800
And the interface is nice.
link |
01:55:19.160
In Chandler, Arizona, yeah.
link |
01:55:20.360
Yeah, in Chandler, Arizona, very specific environment.
link |
01:55:22.560
So it gives me enough material in my mind
link |
01:55:27.360
to push back against the madmen of the world,
link |
01:55:30.360
like George Hotz, to be like, because you kind of imply
link |
01:55:36.040
there's zero probability they're going to win.
link |
01:55:38.760
And after I've used, after I've ridden in it, to me,
link |
01:55:43.560
it's not zero.
link |
01:55:44.440
Oh, it's not for technology reasons.
link |
01:55:46.760
Bureaucracy?
link |
01:55:48.000
No, it's worse than that.
link |
01:55:49.520
It's actually for product reasons, I think.
link |
01:55:51.840
Oh, you think they're just not capable of creating
link |
01:55:53.840
an amazing product?
link |
01:55:55.600
No, I think that the product that they're building
link |
01:55:58.280
doesn't make sense.
link |
01:56:01.040
So a few things.
link |
01:56:03.320
You say the Waymo's are fast.
link |
01:56:05.840
Benchmark a Waymo against a competent Uber driver.
link |
01:56:09.040
Right.
link |
01:56:09.600
Right?
link |
01:56:10.080
The Uber driver's faster.
link |
01:56:11.080
It's not even about speed.
link |
01:56:12.200
It's the thing you said.
link |
01:56:13.240
It's about the experience of being stuck at a stop sign
link |
01:56:16.280
because pedestrians are crossing nonstop.
link |
01:56:20.120
I like when my Uber driver doesn't come to a full stop
link |
01:56:22.160
at the stop sign.
link |
01:56:22.960
Yeah.
link |
01:56:23.520
You know?
link |
01:56:24.440
And so let's say the Waymo's are 20% slower than an Uber.
link |
01:56:31.320
Right?
link |
01:56:33.000
You can argue that they're going to be cheaper.
link |
01:56:35.160
And I argue that users already have the choice
link |
01:56:37.640
to trade off money for speed.
link |
01:56:39.360
It's called UberPool.
link |
01:56:42.280
I think it's like 15% of rides are UberPools.
link |
01:56:45.560
Right?
link |
01:56:46.040
Users are not willing to trade off money for speed.
link |
01:56:49.480
So the whole product that they're building
link |
01:56:52.320
is not going to be competitive with traditional ride sharing
link |
01:56:56.200
networks.
link |
01:56:56.720
Right.
link |
01:56:59.560
And also, whether there's profit to be made
link |
01:57:04.360
depends entirely on one company having a monopoly.
link |
01:57:07.360
I think that the level four autonomous ride sharing
link |
01:57:11.200
vehicles market is going to look a lot like the scooter market
link |
01:57:14.800
if even the technology does come to exist, which I question.
link |
01:57:18.680
Who's doing well in that market?
link |
01:57:20.400
It's a race to the bottom.
link |
01:57:22.320
Well, it could be closer like an Uber and a Lyft,
link |
01:57:25.680
where it's just one or two players.
link |
01:57:28.000
Well, the scooter people have given up
link |
01:57:31.160
trying to market scooters as a practical means
link |
01:57:34.760
of transportation.
link |
01:57:35.480
And they're just like, they're super fun to ride.
link |
01:57:37.640
Look at wheels.
link |
01:57:38.360
I love those things.
link |
01:57:39.160
And they're great on that front.
link |
01:57:40.560
Yeah.
link |
01:57:41.040
But from an actual transportation product
link |
01:57:43.840
perspective, I do not think scooters are viable.
link |
01:57:46.240
And I do not think level four autonomous cars are viable.
link |
01:57:49.200
If you, let's play a fun experiment.
link |
01:57:51.560
If you ran, let's do a Tesla and let's do Waymo.
link |
01:57:56.960
If Elon Musk took a vacation for a year, he just said,
link |
01:58:01.880
screw it, I'm going to go live on an island, no electronics.
link |
01:58:05.400
And the board decides that we need to find somebody
link |
01:58:07.520
to run the company.
link |
01:58:09.160
And they did decide that you should run the company
link |
01:58:11.520
for a year.
link |
01:58:12.240
How do you run Tesla differently?
link |
01:58:14.920
I wouldn't change much.
link |
01:58:16.240
Do you think they're on the right track?
link |
01:58:17.920
I wouldn't change.
link |
01:58:18.680
I mean, I'd have some minor changes.
link |
01:58:21.640
But even my debate with Tesla about end
link |
01:58:25.520
to end versus SegNets, that's just software.
link |
01:58:29.120
Who cares?
link |
01:58:30.720
It's not like you're doing something terrible with SegNets.
link |
01:58:33.920
You're probably building something that's
link |
01:58:35.600
at least going to help you debug the end to end system a lot.
link |
01:58:39.440
It's very easy to transition from what they have
link |
01:58:42.200
to an end to end kind of thing.
link |
01:58:45.840
And then I presume you would, in the Model Y
link |
01:58:50.520
or maybe in the Model 3, start adding driver
link |
01:58:52.480
sensing with infrared.
link |
01:58:53.600
Yes, I would add infrared camera, infrared lights
link |
01:58:58.000
right away to those cars.
link |
01:59:02.120
And start collecting that data and do all that kind of stuff,
link |
01:59:04.920
yeah.
link |
01:59:05.440
Very much.
link |
01:59:06.080
I think they're already kind of doing it.
link |
01:59:07.780
It's an incredibly minor change.
link |
01:59:09.320
If I actually were CEO of Tesla, first off,
link |
01:59:11.160
I'd be horrified that I wouldn't be able to do
link |
01:59:13.080
a better job than Elon.
link |
01:59:14.000
And then I would try to understand
link |
01:59:16.480
the way he's done things before.
link |
01:59:17.760
You would also have to take over his Twitter.
link |
01:59:20.720
I don't tweet.
link |
01:59:22.200
Yeah, what's your Twitter situation?
link |
01:59:24.240
Why are you so quiet on Twitter?
link |
01:59:25.880
Since you do Comma, like, what's your social network presence like?
link |
01:59:30.280
Because on Instagram, you do live streams.
link |
01:59:34.280
You understand the music of the internet,
link |
01:59:39.240
but you don't always fully engage into it.
link |
01:59:41.520
You're part time.
link |
01:59:42.800
Well, I used to have a Twitter.
link |
01:59:44.160
Yeah, I mean, Instagram is a pretty place.
link |
01:59:47.600
Instagram is a beautiful place.
link |
01:59:49.120
It glorifies beauty.
link |
01:59:49.960
I like Instagram's values as a network.
link |
01:59:53.360
Twitter glorifies conflict, glorifies shots,
link |
02:00:00.320
taking shots at people.
link |
02:00:01.400
And it's like, you know, Twitter and Donald Trump
link |
02:00:05.280
are perfectly, they're perfect for each other.
link |
02:00:08.440
So Tesla's on the right track in your view.
link |
02:00:12.800
OK, so let's try, let's really try this experiment.
link |
02:00:16.600
If you ran Waymo, let's say they're,
link |
02:00:19.720
I don't know if you agree, but they
link |
02:00:21.160
seem to be at the head of the pack of the kind of,
link |
02:00:25.640
what would you call that approach?
link |
02:00:27.040
Like it's not necessarily lighter based
link |
02:00:29.000
because it's not about lighter.
link |
02:00:30.320
Level four robotaxi.
link |
02:00:31.520
Level four robotaxi, all in before making any revenue.
link |
02:00:37.040
So they're probably at the head of the pack.
link |
02:00:38.720
If you were said, hey, George, can you
link |
02:00:42.840
please run this company for a year, how would you change it?
link |
02:00:47.160
I would go.
link |
02:00:47.760
I would get Anthony Levandowski out of jail,
link |
02:00:49.800
and I would put him in charge of the company.
link |
02:00:56.400
Well, let's try to break that apart.
link |
02:00:58.200
Why do you want to destroy the company by doing that?
link |
02:01:01.600
Or do you mean you like renegade style thinking that pushes,
link |
02:01:09.400
that throws away bureaucracy and goes
link |
02:01:11.560
to first principle thinking?
link |
02:01:12.840
What do you mean by that?
link |
02:01:14.080
I think Anthony Levandowski is a genius,
link |
02:01:16.040
and I think he would come up with a much better idea of what
link |
02:01:19.640
to do with Waymo than me.
link |
02:01:22.360
So you mean that unironically.
link |
02:01:23.840
He is a genius.
link |
02:01:24.800
Oh, yes.
link |
02:01:25.400
Oh, absolutely.
link |
02:01:26.640
Without a doubt.
link |
02:01:27.600
I mean, I'm not saying there's no shortcomings,
link |
02:01:30.960
but in the interactions I've had with him, yeah.
link |
02:01:34.760
What?
link |
02:01:35.720
He's also willing to take, like, who knows
link |
02:01:38.200
what he would do with Waymo?
link |
02:01:39.400
I mean, he's also out there, like far more out there
link |
02:01:41.400
than I am.
link |
02:01:41.880
Yeah, there's big risks.
link |
02:01:43.480
What do you make of him?
link |
02:01:44.480
I was going to talk to him on this podcast,
link |
02:01:47.040
and I was going back and forth.
link |
02:01:48.880
I'm such a gullible, naive human.
link |
02:01:51.720
Like, I see the best in people.
link |
02:01:53.840
And I slowly started to realize that there
link |
02:01:56.720
might be some people out there that, like,
link |
02:02:02.400
have multiple faces to the world.
link |
02:02:05.280
They're, like, deceiving and dishonest.
link |
02:02:08.160
I still refuse to, like, I just, I trust people,
link |
02:02:13.040
and I don't care if I get hurt by it.
link |
02:02:14.640
But, like, you know, sometimes you
link |
02:02:16.520
have to be a little bit careful, especially platform
link |
02:02:18.760
wise and podcast wise.
link |
02:02:21.040
What do you, what am I supposed to think?
link |
02:02:23.160
So you think, you think he's a good person?
link |
02:02:26.400
Oh, I don't know.
link |
02:02:27.840
I don't really make moral judgments.
link |
02:02:30.000
It's difficult to.
link |
02:02:30.840
Oh, I mean this about the Waymo.
link |
02:02:32.480
I actually, I mean that whole idea very nonironically
link |
02:02:34.960
about what I would do.
link |
02:02:36.200
The problem with putting me in charge of Waymo
link |
02:02:38.160
is Waymo is already $10 billion in the hole, right?
link |
02:02:41.720
Whatever idea Waymo does, look, Comma's profitable, Comma's
link |
02:02:44.840
raised $8.1 million.
link |
02:02:46.680
That's small, you know, that's small money.
link |
02:02:48.180
Like, I can build a reasonable consumer electronics company
link |
02:02:50.840
and succeed wildly at that and still never be able to pay back
link |
02:02:54.200
Waymo's $10 billion.
link |
02:02:55.800
So I think the basic idea with Waymo, well,
link |
02:02:58.680
forget the $10 billion because they have some backing,
link |
02:03:00.880
but your basic thing is, like, what can we do
link |
02:03:04.000
to start making some money?
link |
02:03:05.640
Well, no, I mean, my bigger idea is, like,
link |
02:03:07.920
whatever the idea is that's gonna save Waymo,
link |
02:03:10.320
I don't have it.
link |
02:03:11.440
It's gonna have to be a big risk idea
link |
02:03:13.480
and I cannot think of a better person
link |
02:03:15.280
than Anthony Levandowski to do it.
link |
02:03:17.840
So that is completely what I would do as CEO of Waymo.
link |
02:03:20.240
I would call myself a transitionary CEO,
link |
02:03:22.640
do everything I can to fix that situation up.
link |
02:03:24.880
I'm gonna see.
link |
02:03:25.720
Yeah.
link |
02:03:27.920
Yeah.
link |
02:03:28.760
Because I can't do it, right?
link |
02:03:29.760
Like, I can't, I mean, I can talk about how
link |
02:03:33.440
what I really wanna do is just apologize
link |
02:03:35.260
for all those corny, you know, ad campaigns
link |
02:03:38.120
and be like, here's the real state of the technology.
link |
02:03:40.160
Yeah, that's, like, I have several criticisms.
link |
02:03:42.280
I'm a little bit more bullish on Waymo
link |
02:03:44.320
than you seem to be, but one criticism I have
link |
02:03:48.000
is it went into corny mode too early.
link |
02:03:50.880
Like, it's still a startup.
link |
02:03:52.200
It hasn't delivered on anything.
link |
02:03:53.660
So it should be, like, more renegade
link |
02:03:56.320
and show off the engineering that they're doing,
link |
02:03:59.280
which can just be impressive,
link |
02:04:00.560
as opposed to doing these weird commercials
link |
02:04:02.320
of, like, your friendly car company.
link |
02:04:07.040
I mean, that's my biggest snipe at Waymo is always,
link |
02:04:10.080
that guy's a paid actor.
link |
02:04:11.440
That guy's not a Waymo user.
link |
02:04:12.800
He's a paid actor.
link |
02:04:13.640
Look here, I found his call sheet.
link |
02:04:15.360
Do kind of like what SpaceX is doing
link |
02:04:17.400
with the rocket launches.
link |
02:04:18.960
Just put the nerds up front, put the engineers up front,
link |
02:04:22.200
and just, like, show failures too, just.
link |
02:04:25.720
I love SpaceX's, yeah.
link |
02:04:27.880
Yeah, the thing that they're doing is right,
link |
02:04:29.840
and it just feels like the right.
link |
02:04:31.720
But.
link |
02:04:32.560
We're all so excited to see them succeed.
link |
02:04:34.440
Yeah.
link |
02:04:35.260
I can't wait to see when it won't fail, you know?
link |
02:04:37.320
Like, you lie to me, I want you to fail.
link |
02:04:39.400
You tell me the truth, you be honest with me,
link |
02:04:41.200
I want you to succeed.
link |
02:04:42.160
Yeah.
link |
02:04:44.120
Ah, yeah, and that requires the renegade CEO, right?
link |
02:04:50.200
I'm with you, I'm with you.
link |
02:04:51.480
I still have a little bit of faith in Waymo
link |
02:04:54.000
for the renegade CEO to step forward, but.
link |
02:04:57.860
It's not, it's not John Krafcik.
link |
02:05:00.600
Yeah, it's, you can't.
link |
02:05:02.980
It's not Chris Urmson.
link |
02:05:04.980
And those people may be very good at certain things.
link |
02:05:07.680
Yeah.
link |
02:05:08.520
But they're not renegades.
link |
02:05:10.360
Yeah, because these companies are fundamentally,
link |
02:05:12.040
even though we're talking about billion dollars,
link |
02:05:14.260
all these crazy numbers,
link |
02:05:15.640
they're still, like, early stage startups.
link |
02:05:19.300
I mean, and I just, if you are pre revenue
link |
02:05:21.840
and you've raised 10 billion dollars,
link |
02:05:23.120
I have no idea, like, this just doesn't work.
link |
02:05:26.520
You know, it's against everything Silicon Valley.
link |
02:05:28.040
Where's your minimum viable product?
link |
02:05:29.880
You know, where's your users?
link |
02:05:31.600
Where's your growth numbers?
link |
02:05:33.320
This is traditional Silicon Valley.
link |
02:05:36.280
Why does that not apply to you? Do you think
link |
02:05:38.040
you're too big to fail already?
link |
02:05:41.760
How do you think autonomous driving will change society?
link |
02:05:45.780
So the mission is, for comma, to solve self driving.
link |
02:05:52.520
Do you have, like, a vision of the world
link |
02:05:54.240
of how it'll be different?
link |
02:05:57.940
Is it as simple as A to B transportation?
link |
02:06:00.100
Or is there, like, cause these are robots.
link |
02:06:03.640
It's not about autonomous driving in and of itself.
link |
02:06:05.880
It's what the technology enables.
link |
02:06:09.760
It's, I think it's the coolest applied AI problem.
link |
02:06:12.200
I like it because it has a clear path to monetary value.
link |
02:06:17.720
But as far as that being the thing that changes the world,
link |
02:06:21.440
I mean, no, like, there's cute things we're doing at Comma.
link |
02:06:25.480
Like, who'd have thought you could stick a phone
link |
02:06:26.960
on the windshield and it'll drive.
link |
02:06:29.080
But like, really, the product that you're building
link |
02:06:31.160
is not something that people were not capable
link |
02:06:33.920
of imagining 50 years ago.
link |
02:06:35.760
So no, it doesn't change the world on that front.
link |
02:06:37.800
Could people have imagined the internet 50 years ago?
link |
02:06:39.480
Only true genius visionaries.
link |
02:06:42.560
Everyone could have imagined autonomous cars 50 years ago.
link |
02:06:45.120
It's like a car, but I don't drive it.
link |
02:06:47.000
See, I have this sense, and I told you, like,
link |
02:06:49.920
my longterm dream is robots with which you have deep,
link |
02:06:55.880
with whom you have deep connections, right?
link |
02:06:59.320
And there's different trajectories towards that.
link |
02:07:03.600
And I've been thinking,
link |
02:07:04.440
so I've been thinking of launching a startup.
link |
02:07:07.060
I see autonomous vehicles
link |
02:07:09.820
as a potential trajectory to that.
link |
02:07:11.420
That's not the direction I would like to go,
link |
02:07:16.580
but I also see Tesla or even Comma AI,
link |
02:07:19.100
like, pivoting into robotics broadly defined
link |
02:07:24.900
at some stage, in a way that, like you're mentioning,
link |
02:07:27.140
the internet didn't expect.
link |
02:07:29.580
You know, what I say at Comma about this is,
link |
02:07:32.580
we could talk about this,
link |
02:07:33.420
but let's solve self driving cars first.
link |
02:07:35.740
You gotta stay focused on the mission.
link |
02:07:37.140
Don't, don't, don't, you're not too big to fail.
link |
02:07:39.260
For however much I think Comma's winning,
link |
02:07:41.380
like, no, no, no, no, no, you're winning
link |
02:07:43.420
when you solve level five self driving cars.
link |
02:07:45.020
And until then, you haven't won.
link |
02:07:46.780
And you know, again, you wanna be arrogant
link |
02:07:48.900
in the face of other people, great.
link |
02:07:50.620
You wanna be arrogant in the face of nature, you're an idiot.
link |
02:07:53.780
Stay mission focused, brilliantly put.
link |
02:07:56.420
Like I mentioned, thinking of launching a startup,
link |
02:07:58.660
I've been considering, actually, before COVID,
link |
02:08:01.220
I've been thinking of moving to San Francisco.
link |
02:08:03.340
Ooh, ooh, I wouldn't go there.
link |
02:08:06.020
So why is, well, and now I'm thinking
link |
02:08:09.020
about potentially Austin and we're in San Diego now.
link |
02:08:13.660
San Diego, come here.
link |
02:08:14.940
So why, what, I mean, you're such an interesting human.
link |
02:08:20.540
You've launched so many successful things.
link |
02:08:23.060
What, why San Diego?
link |
02:08:26.220
What do you recommend?
link |
02:08:27.060
Why not San Francisco?
link |
02:08:29.300
Have you thought, so in your case,
link |
02:08:31.820
San Diego with Qualcomm and Snapdragon,
link |
02:08:33.780
I mean, that's an amazing combination.
link |
02:08:36.860
But.
link |
02:08:37.700
That wasn't really why.
link |
02:08:38.540
That wasn't the why?
link |
02:08:39.500
No, I mean, Qualcomm was an afterthought.
link |
02:08:41.340
Qualcomm was, it was a nice thing to think about.
link |
02:08:42.900
It's like, you can have a tech company here.
link |
02:08:45.140
Yeah.
link |
02:08:45.980
And a good one, I mean, you know, I like Qualcomm, but.
link |
02:08:48.300
No.
link |
02:08:49.140
Well, so why San Diego better than San Francisco?
link |
02:08:50.540
Why does San Francisco suck?
link |
02:08:51.900
Well, so, okay, so first off,
link |
02:08:53.020
we all kind of said like, we wanna stay in California.
link |
02:08:55.300
People like the ocean.
link |
02:08:57.620
You know, California, for its flaws,
link |
02:09:00.020
it's like a lot of the flaws of California
link |
02:09:02.380
are not necessarily California as a whole,
link |
02:09:03.900
and they're much more San Francisco specific.
link |
02:09:05.740
Yeah.
link |
02:09:06.700
San Francisco, so I think first tier cities in general
link |
02:09:09.860
have stopped wanting growth.
link |
02:09:13.140
Well, you have like in San Francisco, you know,
link |
02:09:15.460
the voting class always votes to not build more houses
link |
02:09:18.740
because they own all the houses.
link |
02:09:19.980
And they're like, well, you know,
link |
02:09:21.820
once people have figured out how to vote themselves
link |
02:09:23.860
more money, they're gonna do it.
link |
02:09:25.180
It is so insanely corrupt.
link |
02:09:27.900
It is not balanced at all, like political party wise,
link |
02:09:31.620
you know, it's a one party city and.
link |
02:09:34.700
For all the discussion of diversity,
link |
02:09:38.700
it's sorely lacking real diversity of thought,
link |
02:09:42.140
of background, of approaches, of strategies, of ideas.
link |
02:09:48.540
It's kind of a strange place
link |
02:09:51.180
that it's the loudest people about diversity
link |
02:09:54.380
and the biggest lack of diversity.
link |
02:09:56.420
I mean, that's what they say, right?
link |
02:09:58.620
It's the projection.
link |
02:10:00.340
Projection, yeah.
link |
02:10:02.140
Yeah, it's interesting.
link |
02:10:02.980
And even people in Silicon Valley tell me
link |
02:10:04.500
that's like high up people,
link |
02:10:07.940
everybody is like, this is a terrible place.
link |
02:10:10.100
It doesn't make sense.
link |
02:10:10.940
I mean, and coronavirus is really what killed it.
link |
02:10:13.220
San Francisco was the number one exodus
link |
02:10:17.340
during coronavirus.
link |
02:10:18.900
We still think San Diego is a good place to be.
link |
02:10:21.660
Yeah.
link |
02:10:23.460
Yeah, I mean, we'll see.
link |
02:10:24.740
We'll see what happens with California a bit longer term.
link |
02:10:29.780
Like Austin's an interesting choice.
link |
02:10:32.100
I wouldn't, I don't have really anything bad to say
link |
02:10:33.940
about Austin either,
link |
02:10:35.740
except for the extreme heat in the summer,
link |
02:10:37.940
which, but that's like very on the surface, right?
link |
02:10:40.180
I think as far as like an ecosystem goes, it's cool.
link |
02:10:43.700
I personally love Colorado.
link |
02:10:45.540
Colorado's great.
link |
02:10:47.100
Yeah, I mean, you have these states that are,
link |
02:10:49.140
like just way better run.
link |
02:10:51.380
California is, you know, it's especially San Francisco.
link |
02:10:55.380
It's not a tie horse and like, yeah.
link |
02:10:58.460
Can I ask you for advice to me and to others
link |
02:11:02.620
about what's it take to build a successful startup?
link |
02:11:07.540
Oh, I don't know.
link |
02:11:08.380
I haven't done that.
link |
02:11:09.220
Talk to someone who did that.
link |
02:11:10.740
Well, you've, you know,
link |
02:11:14.980
this is like another book of yours
link |
02:11:16.460
that I'll buy for $67, I suppose.
link |
02:11:18.860
So there's, um.
link |
02:11:20.700
One of these days I'll sell out.
link |
02:11:24.140
Yeah, that's right.
link |
02:11:24.980
Jailbreaks are going to be a dollar
link |
02:11:26.020
and books are going to be 67.
link |
02:11:27.860
How I jailbroke the iPhone by George Hotz.
link |
02:11:32.060
That's right.
link |
02:11:32.900
How I jailbroke the iPhone and you can too.
link |
02:11:35.620
You can too.
link |
02:11:36.860
67 dollars.
link |
02:11:37.700
In 21 days.
link |
02:11:39.020
That's right.
link |
02:11:39.860
That's right.
link |
02:11:40.700
Oh God.
link |
02:11:41.540
Okay, I can't wait.
link |
02:11:42.380
But, so if you introspect, like,
link |
02:11:44.940
you have built a very unique company.
link |
02:11:49.460
I mean, not you, but you and others.
link |
02:11:53.180
But I don't know.
link |
02:11:55.620
There's no, there's nothing.
link |
02:11:56.940
You haven't introspected?
link |
02:11:57.780
You haven't really sat down and thought about, like,
link |
02:12:01.780
well, like if you and I were having a bunch of,
link |
02:12:04.180
we're having some beers
link |
02:12:06.180
and you're seeing that I'm depressed
link |
02:12:08.100
and whatever, I'm struggling.
link |
02:12:09.860
There's no advice you can give?
link |
02:12:11.580
Oh, I mean.
link |
02:12:13.100
More beer?
link |
02:12:13.940
More beer?
link |
02:12:15.100
Um, yeah, I think it's all very like situation dependent.
link |
02:12:23.340
Here's, okay, if I can give a generic piece of advice,
link |
02:12:25.860
it's the technology always wins.
link |
02:12:28.140
The better technology always wins.
link |
02:12:30.740
And lying always loses.
link |
02:12:35.180
Build technology and don't lie.
link |
02:12:38.580
I'm with you.
link |
02:12:39.420
I agree very much.
link |
02:12:40.420
The long run, long run.
link |
02:12:41.340
Sure.
link |
02:12:42.180
That's the long run, yeah.
link |
02:12:43.020
The market can remain irrational longer
link |
02:12:44.860
than you can remain solvent.
link |
02:12:46.540
True fact.
link |
02:12:47.380
Well, this is an interesting point
link |
02:12:49.260
because I ethically and just as a human believe that
link |
02:12:54.380
like hype and smoke and mirrors is not
link |
02:12:58.980
a good strategy at any stage of a company.
link |
02:13:02.740
I mean, there's some like, you know,
link |
02:13:04.900
PR magic kind of like, you know.
link |
02:13:07.020
Oh, hype around a new product, right?
link |
02:13:08.540
If there's a call to action,
link |
02:13:09.780
if there's like a call to action,
link |
02:13:10.980
like buy my new GPU, look at it.
link |
02:13:13.060
It takes up three slots and it's this big.
link |
02:13:14.940
It's huge.
link |
02:13:15.780
Buy my GPU.
link |
02:13:16.620
Yeah, that's great.
link |
02:13:17.460
If you look at, you know,
link |
02:13:18.300
especially in the AI space broadly,
link |
02:13:20.780
but autonomous vehicles,
link |
02:13:22.780
like you can raise a huge amount of money on nothing.
link |
02:13:26.540
And the question to me is like, I'm against that.
link |
02:13:30.060
I'll never be part of that.
link |
02:13:31.500
I don't think, I hope not, willingly not.
link |
02:13:36.820
But like, is there something to be said
link |
02:13:40.020
to essentially lying to raise money,
link |
02:13:44.980
like fake it till you make it kind of thing?
link |
02:13:47.860
I mean, this is Billy McFarland and the Fyre Festival.
link |
02:13:50.140
Like we all experienced, you know,
link |
02:13:53.460
what happens with that.
link |
02:13:54.300
No, no, don't fake it till you make it.
link |
02:13:57.460
Be honest and hope you make it the whole way.
link |
02:14:00.540
The technology wins.
link |
02:14:01.940
Right, the technology wins.
link |
02:14:02.980
And like, there is, I'm not, like, the anti hype,
link |
02:14:06.820
you know, that's a Slava KPSS reference,
link |
02:14:08.900
but hype isn't necessarily bad.
link |
02:14:13.140
I loved camping out for the iPhones, you know,
link |
02:14:17.580
and as long as the hype is backed by like substance,
link |
02:14:21.020
as long as it's backed by something I can actually buy,
link |
02:14:23.780
and like it's real, then hype is great
link |
02:14:26.740
and it's a great feeling.
link |
02:14:28.700
It's when the hype is backed by lies
link |
02:14:30.860
that it's a bad feeling.
link |
02:14:32.060
I mean, a lot of people call Elon Musk a fraud.
link |
02:14:34.540
How could he be a fraud?
link |
02:14:35.620
I've noticed this, this kind of interesting effect,
link |
02:14:37.740
which is he does tend to over promise
link |
02:14:42.300
and, what's the better way to phrase it?
link |
02:14:45.660
Promise a timeline that he doesn't deliver on,
link |
02:14:49.220
he delivers much later on.
link |
02:14:51.740
What do you think about that?
link |
02:14:52.820
Cause I do that, I think that's a programmer thing too.
link |
02:14:56.220
I do that as well.
link |
02:14:57.540
You think that's a really bad thing to do or is that okay?
link |
02:15:01.300
I think that's, again, as long as like,
link |
02:15:03.900
you're working toward it and you're gonna deliver on it,
link |
02:15:06.940
it's not too far off, right?
link |
02:15:10.260
Right?
link |
02:15:11.100
Like, you know, the whole autonomous vehicle thing,
link |
02:15:14.900
it's like, I mean, I still think Tesla's on track
link |
02:15:18.620
to beat us.
link |
02:15:19.740
I still think even with their missteps,
link |
02:15:21.900
they have advantages we don't have.
link |
02:15:25.660
You know, Elon is better than me
link |
02:15:28.100
at like marshaling massive amounts of resources.
link |
02:15:33.060
So, you know, I still think given the fact
link |
02:15:36.260
they're maybe making some wrong decisions,
link |
02:15:38.220
they'll end up winning.
link |
02:15:39.300
And like, it's fine to hype it
link |
02:15:42.620
if you're actually gonna win, right?
link |
02:15:44.900
Like if Elon says, look, we're gonna be landing rockets
link |
02:15:47.220
back on earth in a year and it takes four,
link |
02:15:49.340
like, you know, he landed a rocket back on earth
link |
02:15:53.540
and he was working toward it the whole time.
link |
02:15:55.420
I think there's some amount of like,
link |
02:15:57.060
I think when it becomes wrong is if you know
link |
02:15:59.260
you're not gonna meet that deadline.
link |
02:16:00.460
If you're lying.
link |
02:16:01.540
Yeah, that's brilliantly put.
link |
02:16:03.260
Like this is what people don't understand, I think.
link |
02:16:06.860
Like Elon believes everything he says.
link |
02:16:09.500
He does, as far as I can tell, he does.
link |
02:16:12.140
And I detected that in myself too.
link |
02:16:14.780
Like if I, it's only bullshit
link |
02:16:17.620
if you're like conscious of yourself lying.
link |
02:16:21.460
Yeah, I think so.
link |
02:16:22.500
Yeah.
link |
02:16:23.420
Now you can't take that to such an extreme, right?
link |
02:16:25.660
Like in a way, I think maybe Billy McFarland
link |
02:16:27.620
believed everything he said too.
link |
02:16:30.020
Right, that's how you start a cult
link |
02:16:31.460
and everybody kills themselves.
link |
02:16:33.820
Yeah.
link |
02:16:34.660
Yeah, like it's, you need, you need,
link |
02:16:36.020
if there's like some factor on it, it's fine.
link |
02:16:39.300
And you need some people to like, you know,
link |
02:16:41.820
keep you in check, but like,
link |
02:16:44.580
if you deliver on most of the things you say
link |
02:16:46.580
and just the timelines are off, yeah.
link |
02:16:48.780
It does piss people off though.
link |
02:16:50.300
I wonder, but who cares?
link |
02:16:53.300
In a long arc of history, the people,
link |
02:16:55.260
everybody gets pissed off at the people who succeed,
link |
02:16:58.100
which is one of the things
link |
02:16:59.260
that frustrates me about this world,
link |
02:17:01.820
is they don't celebrate the success of others.
link |
02:17:07.660
Like there's so many people that want Elon to fail.
link |
02:17:12.900
It's so fascinating to me.
link |
02:17:14.860
Like what is wrong with you?
link |
02:17:18.180
Like, so Elon Musk talks about like people shorting,
link |
02:17:21.620
like they talk about financial,
link |
02:17:23.580
but I think it's much bigger than the financials.
link |
02:17:25.340
I've seen like the human factors community,
link |
02:17:27.860
they want, they want other people to fail.
link |
02:17:31.540
Why, why, why?
link |
02:17:32.940
Like even people, the harshest thing is like,
link |
02:17:36.740
you know, even people that like seem
link |
02:17:38.140
to really hate Donald Trump, they want him to fail
link |
02:17:41.980
or like the other president
link |
02:17:43.020
or they want Barack Obama to fail.
link |
02:17:45.900
It's like.
link |
02:17:47.140
Yeah, we're all on the same boat, man.
link |
02:17:49.740
It's weird, but I want that,
link |
02:17:51.660
I would love to inspire that part of the world to change
link |
02:17:54.100
because damn it, if the human species is gonna survive,
link |
02:17:58.540
we should celebrate success.
link |
02:18:00.460
Like it seems like the efficient thing to do
link |
02:18:02.780
in this objective function that we're all striving for
link |
02:18:06.300
is to celebrate the ones that like figure out
link |
02:18:09.060
how to like do better at that objective function
link |
02:18:11.820
as opposed to like dragging them down back into the mud.
link |
02:18:16.580
I think there is, this is the speech I always give
link |
02:18:19.100
about the commenters on Hacker News.
link |
02:18:21.700
So first off, something to remember
link |
02:18:23.260
about the internet in general is commenters
link |
02:18:26.140
are not representative of the population.
link |
02:18:29.300
I don't comment on anything.
link |
02:18:31.460
You know, commenters are representative
link |
02:18:34.180
of a certain sliver of the population.
link |
02:18:36.660
And on Hacker News, a common thing I'll see
link |
02:18:39.540
is when you'll see something that's like,
link |
02:18:42.100
you know, promises to be wild out there and innovative.
link |
02:18:47.500
There is some amount of, you know,
link |
02:18:49.700
bringing them back down to earth,
link |
02:18:50.940
but there's also some amount of if this thing succeeds,
link |
02:18:55.260
well, I'm 36 and I've worked
link |
02:18:57.580
at large tech companies my whole life.
link |
02:19:02.300
They can't succeed because if they succeed,
link |
02:19:05.020
that would mean that I could have done something different
link |
02:19:07.180
with my life, but we know that I couldn't have,
link |
02:19:09.220
we know that I couldn't have,
link |
02:19:10.140
and that's why they're gonna fail.
link |
02:19:11.940
And they have to root for them to fail
link |
02:19:13.100
to kind of maintain their world image.
link |
02:19:15.940
So tune it out.
link |
02:19:17.860
And they comment, well, it's hard, I, so one of the things,
link |
02:19:21.860
one of the things I'm considering startup wise
link |
02:19:25.260
is to change that.
link |
02:19:27.620
Cause I think the, I think it's also a technology problem.
link |
02:19:31.700
It's a platform problem.
link |
02:19:33.060
I agree.
link |
02:19:33.900
It's like, because the thing you said,
link |
02:19:35.980
most people don't comment.
link |
02:19:39.660
I think most people want to comment.
link |
02:19:42.940
They just don't because it's all the assholes
link |
02:19:45.180
who are commenting.
link |
02:19:46.020
Exactly, I don't want to be grouped in with them.
link |
02:19:47.620
You don't want to be at a party
link |
02:19:49.420
where everyone is an asshole.
link |
02:19:50.780
And so they, but that's a platform problem.
link |
02:19:54.500
I can't believe what Reddit's become.
link |
02:19:56.100
I can't believe the group thinking, Reddit comments.
link |
02:20:00.580
There's a, Reddit is an interesting one
link |
02:20:02.700
because they're subreddits.
link |
02:20:05.100
And so you can still see, especially small subreddits
link |
02:20:09.260
that like, that are a little like havens
link |
02:20:11.940
of like joy and positivity and like deep,
link |
02:20:16.140
even disagreement, but like nuanced discussion.
link |
02:20:18.860
But it's only like small little pockets,
link |
02:20:21.700
but that's emergent.
link |
02:20:23.460
The platform is not helping that or hurting that.
link |
02:20:26.780
So I guess naturally something about the internet,
link |
02:20:31.100
if you don't put in a lot of effort to encourage
link |
02:20:34.380
nuance and positive, good vibes,
link |
02:20:37.180
it's naturally going to decline into chaos.
link |
02:20:41.140
I would love to see someone do this well.
link |
02:20:42.860
Yeah.
link |
02:20:43.780
I think it's, yeah, very doable.
link |
02:20:45.540
I think actually, so I feel like Twitter
link |
02:20:49.740
could be overthrown.
link |
02:20:52.060
Joscha Bach talked about how, like,
link |
02:20:55.220
if you have like and retweet,
link |
02:20:58.020
like that's only positive wiring, right?
link |
02:21:02.120
The only way to do anything like negative there
link |
02:21:05.580
is with a comment.
link |
02:21:08.300
And that's like that asymmetry is what gives,
link |
02:21:12.420
you know, Twitter its particular toxicity.
link |
02:21:15.620
Whereas I find YouTube comments to be much better
link |
02:21:18.060
because YouTube comments have an up and a down
link |
02:21:21.540
and they don't show the downvotes.
link |
02:21:23.500
Without getting into depth of this particular discussion,
link |
02:21:26.820
the point is to explore possibilities
link |
02:21:29.580
and get a lot of data on it.
link |
02:21:30.860
Because I mean, I could disagree with what you just said.
link |
02:21:34.780
The point is it's unclear.
link |
02:21:36.100
It hasn't been explored in a really rich way.
link |
02:21:39.860
Like these questions of how to create platforms
link |
02:21:44.700
that encourage positivity.
link |
02:21:47.100
Yeah, I think it's a technology problem.
link |
02:21:49.580
And I think we'll look back at Twitter as it is now.
link |
02:21:51.980
Maybe it'll happen within Twitter,
link |
02:21:53.900
but most likely somebody overthrows them
link |
02:21:56.460
and we'll look back at Twitter and say,
link |
02:22:00.500
can't believe we put up with this level of toxicity.
link |
02:22:03.220
You need a different business model too.
link |
02:22:05.100
Any social network that fundamentally has advertising
link |
02:22:07.700
as a business model, this was in The Social Dilemma,
link |
02:22:10.460
which I didn't watch, but I liked it.
link |
02:22:11.740
It's like, you know, there's always the, you know,
link |
02:22:12.940
you're the product, you're not the,
link |
02:22:15.380
but they had a nuanced take on it that I really liked.
link |
02:22:17.980
And it said, the product being sold is influence over you.
link |
02:22:24.660
The product being sold is literally your,
link |
02:22:27.140
you know, influence on you.
link |
02:22:29.820
Like that can't be, if that's your idea, okay.
link |
02:22:33.660
Well, you know, guess what?
link |
02:22:35.420
It can't not be toxic.
link |
02:22:37.060
Yeah, maybe there's ways to spin it,
link |
02:22:39.420
like with giving a lot more control to the user
link |
02:22:42.540
and transparency to see what is happening to them
link |
02:22:44.660
as opposed to in the shadows, it's possible,
link |
02:22:47.300
but that can't be the primary source of.
link |
02:22:49.380
But the users aren't, no one's gonna use that.
link |
02:22:51.980
It depends, it depends, it depends.
link |
02:22:54.020
I think that the, you're not going to,
link |
02:22:57.140
you can't depend on self awareness of the users.
link |
02:23:00.220
It's a longer discussion because you can't depend on it,
link |
02:23:04.460
but you can reward self awareness.
link |
02:23:09.740
Like if for the ones who are willing to put in the work
link |
02:23:12.340
of self awareness, you can reward them and incentivize
link |
02:23:16.060
and perhaps be pleasantly surprised how many people
link |
02:23:20.180
are willing to be self aware on the internet.
link |
02:23:23.260
Like we are in real life.
link |
02:23:24.380
Like I'm putting in a lot of effort with you right now,
link |
02:23:26.940
being self aware about if I say something stupid or mean,
link |
02:23:30.300
I'll like look at your like body language.
link |
02:23:32.740
Like I'm putting in that effort.
link |
02:23:33.700
It's costly for an introvert, very costly.
link |
02:23:36.260
But on the internet, fuck it.
link |
02:23:39.620
Like most people are like, I don't care if this hurts
link |
02:23:42.940
somebody, I don't care if this is not interesting
link |
02:23:46.220
or if this is, yeah, it's a mean or whatever.
link |
02:23:48.660
I think so much of the engagement today on the internet
link |
02:23:50.980
is so disingenuous too.
link |
02:23:53.060
You're not doing this out of a genuine,
link |
02:23:54.620
this is what you think.
link |
02:23:55.500
You're doing this just straight up to manipulate others.
link |
02:23:57.660
Whether you're in, you just became an ad.
link |
02:23:59.660
Yeah, okay, let's talk about a fun topic,
link |
02:24:02.780
which is programming.
link |
02:24:04.420
Here's another book idea for you.
link |
02:24:05.780
Let me pitch.
link |
02:24:07.300
What's your perfect programming setup?
link |
02:24:09.700
So like this by George Hotz.
link |
02:24:12.660
So like what, listen, you're.
link |
02:24:17.500
Give me a MacBook Air, sit me in a corner of a hotel room
link |
02:24:20.340
and you know I'll still ask you.
link |
02:24:21.180
So you really don't care.
link |
02:24:22.020
You don't fetishize like multiple monitors, keyboard.
link |
02:24:27.020
Those things are nice and I'm not gonna say no to them,
link |
02:24:30.260
but do they automatically unlock tons of productivity?
link |
02:24:33.300
No, not at all.
link |
02:24:34.140
I have definitely been more productive on a MacBook Air
link |
02:24:36.300
in a corner of a hotel room.
link |
02:24:38.300
What about IDE?
link |
02:24:41.620
So which operating system do you love?
link |
02:24:45.980
What text editor do you use IDE?
link |
02:24:49.020
What, is there something that is like the perfect,
link |
02:24:53.020
if you could just say the perfect productivity setup
link |
02:24:57.020
for George Hotz.
link |
02:24:57.860
It doesn't matter.
link |
02:24:58.700
It literally doesn't matter.
link |
02:25:00.420
You know, I guess I code most of the time in Vim.
link |
02:25:03.100
Like literally I'm using an editor from the 70s.
link |
02:25:05.020
You know, you didn't make anything better.
link |
02:25:07.460
Okay, VS code is nice for reading code.
link |
02:25:09.060
There's a few things that are nice about it.
link |
02:25:10.820
I think that you can build much better tools.
link |
02:25:13.460
Like, IDA's xrefs work way better than VS Code's, why?
link |
02:25:18.660
Yeah, actually that's a good question, like why?
link |
02:25:20.760
I still use, sorry, Emacs for most.
link |
02:25:25.320
I've actually never, I have to confess something dark.
link |
02:25:28.860
So I've never used Vim.
link |
02:25:32.620
I think maybe I'm just afraid
link |
02:25:36.460
that my life has been like a waste.
link |
02:25:39.340
I'm so, I'm not evangelical about Emacs.
link |
02:25:43.500
I think this.
link |
02:25:44.620
This is how I feel about TensorFlow versus PyTorch.
link |
02:25:47.820
Having just like, we've switched everything to PyTorch now.
link |
02:25:50.300
Put months into the switch.
link |
02:25:51.900
I have felt like I've wasted years on TensorFlow.
link |
02:25:54.580
I can't believe it.
link |
02:25:56.300
I can't believe how much better PyTorch is.
link |
02:25:58.380
Yeah.
link |
02:25:59.620
I've used Emacs and Vim, doesn't matter.
link |
02:26:01.500
Yeah, it's still just my heart.
link |
02:26:03.060
Somehow I fell in love with Lisp.
link |
02:26:04.700
I don't know why.
link |
02:26:05.540
You can't, the heart wants what the heart wants.
link |
02:26:08.140
I don't understand it, but it just connected with me.
link |
02:26:10.480
Maybe it's the functional language
link |
02:26:11.700
that first I connected with.
link |
02:26:13.260
Maybe it's because so many of the AI courses
link |
02:26:15.860
before the deep learning revolution
link |
02:26:17.300
were taught with Lisp in mind.
link |
02:26:19.500
I don't know.
link |
02:26:20.340
I don't know what it is, but I'm stuck with it.
link |
02:26:22.340
But at the same time, like,
link |
02:26:23.540
why am I not using a modern IDE
link |
02:26:25.100
for some of this programming?
link |
02:26:26.300
I don't know.
link |
02:26:27.260
They're not that much better.
link |
02:26:28.500
I've used modern IDEs too.
link |
02:26:30.180
But at the same time, so to just,
link |
02:26:32.040
well, not to disagree with you,
link |
02:26:33.060
but like, I like multiple monitors.
link |
02:26:35.880
Like I have to do work on a laptop
link |
02:26:38.380
and it's a pain in the ass.
link |
02:26:41.340
And also I'm addicted to the Kinesis weird keyboard.
link |
02:26:45.420
You could see there.
link |
02:26:46.740
Yeah, yeah, yeah.
link |
02:26:48.540
Yeah, so you don't have any of that.
link |
02:26:50.100
You can just be on a MacBook.
link |
02:26:51.820
I mean, look at work.
link |
02:26:53.020
I have three 24 inch monitors.
link |
02:26:55.220
I have a Happy Hacking keyboard.
link |
02:26:56.580
I have a Razer DeathAdder mouse, like.
link |
02:26:59.820
But it's not essential for you.
link |
02:27:01.300
No.
link |
02:27:02.120
Let's go to a day in the life of George Hotz.
link |
02:27:04.680
What is the perfect day productivity wise?
link |
02:27:08.780
So we're not talking about like Hunter S. Thompson drugs.
link |
02:27:12.380
Yeah, yeah, yeah.
link |
02:27:13.220
And let's look at productivity.
link |
02:27:16.420
Like what's the day look like, like hour by hour?
link |
02:27:19.820
Is there any regularities that create
link |
02:27:23.260
a magical George Hotz experience?
link |
02:27:25.940
I can remember three days in my life.
link |
02:27:28.300
And I remember these days vividly
link |
02:27:30.300
when I've gone through kind of radical transformations
link |
02:27:36.620
to the way I think.
link |
02:27:37.900
And what I would give, I would pay $100,000
link |
02:27:40.140
if I could have one of these days tomorrow.
link |
02:27:42.980
The days have been so impactful.
link |
02:27:44.900
And one was first discovering Eliezer Yudkowsky
link |
02:27:47.780
on the singularity and reading that stuff.
link |
02:27:50.740
And like, you know, my mind was blown.
link |
02:27:54.260
The next was discovering the Hutter Prize
link |
02:27:57.820
and that AI is just compression.
link |
02:27:59.900
Like finally understanding AIXI and what all of that was.
link |
02:28:03.820
You know, I like read about it when I was 18, 19,
link |
02:28:05.300
I didn't understand it.
link |
02:28:06.300
And then the fact that like lossless compression
link |
02:28:08.380
implies intelligence, the day that I was shown that.
link |
02:28:12.240
And then the third one is controversial.
link |
02:28:14.140
The day I found a blog called Unqualified Reservations.
link |
02:28:17.340
And read that and I was like.
link |
02:28:20.620
Wait, which one is that?
link |
02:28:21.500
That's, what's the guy's name?
link |
02:28:22.980
Curtis Yarvin.
link |
02:28:24.020
Yeah.
link |
02:28:25.020
So many people tell me I'm supposed to talk to him.
link |
02:28:27.640
Yeah, the day.
link |
02:28:28.480
He looks, he sounds insane.
link |
02:28:30.140
Definitely. Or brilliant,
link |
02:28:31.300
but insane or both, I don't know.
link |
02:28:33.100
The day I found that blog was another like,
link |
02:28:35.140
this was during like Gamergate
link |
02:28:37.180
and kind of the run up to the 2016 election.
link |
02:28:39.020
And I'm like, wow, okay, the world makes sense now.
link |
02:28:42.860
This is like, I had a framework now to interpret this.
link |
02:28:45.620
Just like I got the framework for AI
link |
02:28:47.200
and a framework to interpret technological progress.
link |
02:28:49.420
Like those days when I discovered these new frameworks were.
link |
02:28:52.820
Oh, interesting.
link |
02:28:53.660
So it's not about, but what was special about those days?
link |
02:28:57.160
How did those days come to be?
link |
02:28:58.780
Is it just, you got lucky?
link |
02:28:59.980
Like, you just encountered the Hutter Prize
link |
02:29:04.900
on Hacker News or something like that?
link |
02:29:09.740
But you see, I don't think it's just,
link |
02:29:11.840
see, I don't think it's just that like,
link |
02:29:13.260
I could have gotten lucky at any point.
link |
02:29:14.820
I think that in a way.
link |
02:29:16.260
You were ready at that moment.
link |
02:29:17.820
Yeah, exactly.
link |
02:29:18.660
To receive the information.
link |
02:29:21.460
But is there some magic to the day today
link |
02:29:24.300
of like eating breakfast?
link |
02:29:27.220
And it's the mundane things.
link |
02:29:29.060
Nah.
link |
02:29:29.900
Nothing.
link |
02:29:30.720
Nah, I drift through life.
link |
02:29:32.900
Without structure.
link |
02:29:34.180
I drift through life hoping and praying
link |
02:29:36.220
that I will get another day like those days.
link |
02:29:38.540
And there's nothing in particular you do
link |
02:29:40.320
to be a receptacle for another, for day number four.
link |
02:29:46.060
No, I didn't do anything to get the other ones.
link |
02:29:48.060
So I don't think I have to really do anything now.
link |
02:29:51.100
I took a month long trip to New York
link |
02:29:53.420
and the Ethereum thing was the highlight of it,
link |
02:29:56.260
but the rest of it was pretty terrible.
link |
02:29:57.980
I did a two week road trip
link |
02:29:59.860
and I got, I had to turn around.
link |
02:30:01.740
I had to turn around driving in Gunnison, Colorado.
link |
02:30:06.620
I passed through Gunnison
link |
02:30:08.060
and the snow starts coming down.
link |
02:30:10.660
There's a pass up there called Monarch Pass
link |
02:30:12.160
in order to get through to Denver,
link |
02:30:13.000
you gotta get over the Rockies.
link |
02:30:14.700
And I had to turn my car around.
link |
02:30:16.800
I couldn't, I watched an F-150 go off the road.
link |
02:30:20.260
I'm like, I gotta go back.
link |
02:30:21.620
And like that day was meaningful.
link |
02:30:24.620
Cause like, it was real.
link |
02:30:26.040
Like I actually had to turn my car around.
link |
02:30:28.900
It's rare that anything even real happens in my life.
link |
02:30:31.240
Even as, you know, mundane as the fact that,
link |
02:30:34.300
yeah, there was snow, I had to turn around,
link |
02:30:36.100
stay in Gunnison and leave the next day.
link |
02:30:37.700
Something about that moment felt real.
link |
02:30:40.260
Okay, so actually it's interesting to break apart
link |
02:30:43.180
the three moments you mentioned, if it's okay.
link |
02:30:45.380
So I always have trouble pronouncing his name,
link |
02:30:48.540
but Eliezer Yudkowsky.
link |
02:30:53.300
So what, how did your worldview change
link |
02:30:57.820
in starting to consider the exponential growth of AI
link |
02:31:02.900
and AGI that he thinks about
link |
02:31:05.020
and the threats of artificial intelligence
link |
02:31:07.620
and all that kind of ideas?
link |
02:31:09.340
Can you, is it just like, can you maybe break apart
link |
02:31:12.460
like what exactly was so magical to you?
link |
02:31:15.660
Is it transformational experience?
link |
02:31:17.380
Today, everyone knows him for threats and AI safety.
link |
02:31:20.340
This was pre that stuff.
link |
02:31:22.260
There wasn't, I don't think, a mention of AI safety on the page.
link |
02:31:25.980
This is, this is old Yudkowsky stuff.
link |
02:31:27.860
He'd probably denounce it all now.
link |
02:31:29.060
He'd probably be like,
link |
02:31:29.900
that's exactly what I didn't want to happen.
link |
02:31:32.220
Sorry, man.
link |
02:31:33.060
Is there something specific you can take from his work
link |
02:31:37.700
that you can remember?
link |
02:31:38.700
Yeah, it was this realization
link |
02:31:40.780
that computers double in power every 18 months
link |
02:31:45.980
and humans do not, and they haven't crossed yet.
link |
02:31:50.140
But if you have one thing that's doubling every 18 months
link |
02:31:52.700
and one thing that's staying like this, you know,
link |
02:31:55.100
here's your log graph, here's your line, you know,
link |
02:31:58.940
calculate that.
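As a rough sketch of the arithmetic behind that picture, assuming an idealized 18 month doubling time and a flat human baseline (both simplifications for illustration, not figures from the conversation), the gap compounds very quickly:

    # Minimal sketch: exponential compute growth versus a flat human baseline.
    # The 18-month doubling time and the horizons below are illustrative
    # assumptions, not numbers quoted in the conversation.
    def compute_multiplier(years, doubling_time_years=1.5):
        return 2 ** (years / doubling_time_years)

    for years in (3, 9, 18, 30):
        print(f"after {years} years: {compute_multiplier(years):,.0f}x the starting compute")

Three years in, the exponential curve is only 4x ahead; thirty years in, it is roughly a million x, which is the crossing-lines picture being described.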
link |
02:31:59.860
And then that day opened the door
link |
02:32:03.540
to the exponential thinking, like thinking that like,
link |
02:32:06.420
you know what, with technology,
link |
02:32:07.740
we can actually transform the world.
link |
02:32:11.580
It opened the door to human obsolescence.
link |
02:32:13.740
It opened the door to realize that in my lifetime,
link |
02:32:16.980
humans are going to be replaced.
link |
02:32:20.620
And then the matching idea to that of artificial intelligence
link |
02:32:23.740
with the Hutter prize, you know, I'm torn.
link |
02:32:27.940
I go back and forth on what I think about it.
link |
02:32:30.220
Yeah.
link |
02:32:31.380
But the basic thesis is it's a nice compelling notion
link |
02:32:36.180
that we can reduce the task of creating
link |
02:32:38.340
an intelligent system, a generally intelligent system
link |
02:32:41.380
into the task of compression.
link |
02:32:43.940
So you can think of all of intelligence in the universe,
link |
02:32:46.300
in fact, as a kind of compression.
link |
02:32:50.220
Do you find that, was that just at the time
link |
02:32:52.420
you found that as a compelling idea
link |
02:32:53.900
or do you still find that a compelling idea?
link |
02:32:56.980
I still find that a compelling idea.
link |
02:32:59.180
I think that it's not that useful day to day,
link |
02:33:02.180
but actually one of maybe my quests before that
link |
02:33:06.260
was a search for the definition of the word intelligence.
link |
02:33:09.340
And I never had one.
link |
02:33:10.860
And I definitely have a definition of the word compression.
link |
02:33:14.700
It's a very simple, straightforward one.
link |
02:33:18.420
And you know what compression is,
link |
02:33:19.740
you know what lossless, it's lossless compression,
link |
02:33:21.100
not lossy, lossless compression.
link |
02:33:22.940
And that that is equivalent to intelligence,
link |
02:33:25.620
which I believe, I'm not sure how useful
link |
02:33:27.540
that definition is day to day,
link |
02:33:28.980
but like I now have a framework to understand what it is.
link |
02:33:32.340
And he just 10x'd the prize for that competition
link |
02:33:36.300
like recently a few months ago.
link |
02:33:37.700
You ever thought of taking a crack at that?
link |
02:33:39.580
Oh, I did.
link |
02:33:41.100
Oh, I did.
link |
02:33:41.940
I spent the next, after I found the prize,
link |
02:33:44.660
I spent the next six months of my life trying it.
link |
02:33:47.820
And well, that's when I started learning everything about AI.
link |
02:33:51.300
And then I worked at Vicarious for a bit
link |
02:33:53.340
and then I read all the deep learning stuff.
link |
02:33:55.340
And I'm like, okay, now I like I'm caught up to modern AI.
link |
02:33:58.460
Wow.
link |
02:33:59.300
And I had a really good framework to put it all in
link |
02:34:01.380
from the compression stuff, right?
link |
02:34:04.420
Like some of the first deep learning models I played with
link |
02:34:07.340
were basically GPT, but before transformers,
link |
02:34:12.020
when it was still RNNs doing character prediction.
link |
02:34:17.900
But by the way, on the compression side,
link |
02:34:19.940
I mean, especially with neural networks,
link |
02:34:22.940
what do you make of the lossless requirement
link |
02:34:25.500
with the Hutter prize?
link |
02:34:26.540
So, you know, human intelligence and neural networks
link |
02:34:31.460
can probably compress stuff pretty well,
link |
02:34:33.300
but it would be lossy.
link |
02:34:35.220
It's imperfect.
link |
02:34:36.660
You can turn a lossy compression
link |
02:34:37.860
to a lossless compressor pretty easily
link |
02:34:39.620
using an arithmetic encoder, right?
link |
02:34:41.060
You can take an arithmetic encoder
link |
02:34:42.540
and you can just encode the noise with maximum efficiency.
link |
02:34:45.980
Right?
link |
02:34:46.820
So even if you can't predict exactly
link |
02:34:48.900
what the next character is,
link |
02:34:50.460
the better a probability distribution,
link |
02:34:52.500
you can put over the next character.
link |
02:34:54.260
You can then use an arithmetic encoder to, right?
link |
02:34:57.340
You don't have to know whether it's an E or an I,
link |
02:34:59.300
you just have to put good probabilities on them
link |
02:35:01.060
and then, you know, code those.
link |
02:35:03.020
And if you have, it's a bits of entropy thing, right?
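A minimal sketch of that bits-of-entropy point, assuming an ideal entropy coder: the cost of losslessly encoding each character is roughly minus log base two of the probability the model assigned to it, so a better predictor is directly a better compressor. The distribution below is made up purely for illustration:

    import math

    def ideal_code_length_bits(text, model_probs):
        # Ideal lossless cost if an arithmetic coder is driven by model_probs;
        # unseen characters get a small floor probability (an assumption here).
        total = 0.0
        for ch in text:
            p = model_probs.get(ch, 1e-6)
            total += -math.log2(p)
        return total

    # Hypothetical next-character probabilities, not from any real model.
    probs = {"t": 0.2, "h": 0.1, "e": 0.4, "i": 0.1, " ": 0.2}
    print(ideal_code_length_bits("the tie", probs))  # about 16.3 bits

The per-character average of that quantity is the cross entropy of the model on the text, which is exactly what ties prediction quality to compressed size.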
link |
02:35:06.460
So let me, on that topic,
link |
02:35:07.740
it'd be interesting as a little side tour.
link |
02:35:10.260
What are your thoughts in this year about GPT3
link |
02:35:13.580
and these language models and these transformers?
link |
02:35:16.140
Is there something interesting to you as an AI researcher,
link |
02:35:20.660
or is there something interesting to you
link |
02:35:22.540
as an autonomous vehicle developer?
link |
02:35:24.740
Nah, I think it's overhyped.
link |
02:35:27.460
I mean, it's not, like, it's cool.
link |
02:35:29.020
It's cool for what it is, but no,
link |
02:35:30.820
we're not just gonna be able to scale up to GPT12
link |
02:35:33.500
and get general purpose intelligence.
link |
02:35:35.340
Like, your loss function is literally just,
link |
02:35:38.700
you know, cross entropy loss on the character, right?
link |
02:35:41.460
Like, that's not the loss function of general intelligence.
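For concreteness, the objective being described is the standard autoregressive language-modeling loss (written out here as the textbook formulation, not something quoted from the conversation): minimize the average negative log likelihood of the next token given everything before it,

    \mathcal{L}(\theta) = -\frac{1}{N} \sum_{i=1}^{N} \log p_\theta(x_i \mid x_{<i})

which, per the compression discussion above, is the same quantity an ideal coder would pay in bits, up to the log base.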
link |
02:35:44.860
Is that obvious to you?
link |
02:35:45.940
Yes.
link |
02:35:47.380
Can you imagine that, like,
link |
02:35:51.460
to play devil's advocate on yourself,
link |
02:35:53.300
is it possible that you can,
link |
02:35:55.860
the GPT12 will achieve general intelligence
link |
02:35:58.580
with something as dumb as this kind of loss function?
link |
02:36:01.980
I guess it depends what you mean by general intelligence.
link |
02:36:05.060
So there's another problem with the GPTs,
link |
02:36:07.860
and that's that they don't have a,
link |
02:36:11.060
they don't have longterm memory.
link |
02:36:13.020
Right.
link |
02:36:13.860
So, like, just GPT12,
link |
02:36:18.580
a scaled up version of GPT2 or GPT3,
link |
02:36:22.460
I find it hard to believe.
link |
02:36:26.060
Well, you can scale it in the sense that,
link |
02:36:28.780
it's a hard coded length,
link |
02:36:32.060
but you can make it wider and wider and wider.
link |
02:36:34.300
Yeah.
link |
02:36:36.380
You're gonna get cool things from those systems,
link |
02:36:40.820
but I don't think you're ever gonna get something
link |
02:36:44.820
that can, like, you know, build me a rocket ship.
link |
02:36:47.580
What about solving driving?
link |
02:36:49.420
So, you know, you can use Transformer with video,
link |
02:36:53.140
for example.
link |
02:36:54.740
You think, is there something in there?
link |
02:36:57.220
No, because, I mean, look, we use a GRU.
link |
02:37:01.260
We use a GRU.
link |
02:37:02.100
We could change that GRU out to a Transformer.
link |
02:37:05.900
I think driving is much more Markovian than language.
link |
02:37:09.500
So, Markovian, you mean, like, the memory,
link |
02:37:11.460
which aspect of Markovian?
link |
02:37:13.660
I mean that, like, most of the information
link |
02:37:16.340
in the state at T minus one is also in state T.
link |
02:37:19.540
I see, yeah.
link |
02:37:20.380
Right, and it kind of, like, drops off nicely like this,
link |
02:37:22.820
whereas sometime with language,
link |
02:37:23.900
you have to refer back to the third paragraph
link |
02:37:25.900
on the second page.
link |
02:37:27.260
I feel like.
link |
02:37:28.140
There's not many, like, you can say, like,
link |
02:37:30.060
speed limit signs, but there's really not many things
link |
02:37:32.420
in autonomous driving that look like that.
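Stated in the usual form, as a gloss on the claim rather than a quote: a process is Markovian when the next state depends only on the current one,

    P(s_t \mid s_{t-1}, s_{t-2}, \ldots, s_0) \approx P(s_t \mid s_{t-1})

and the argument is that the driving state comes close to satisfying this, while language, where a token can hinge on a paragraph from pages earlier, clearly does not.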
link |
02:37:33.860
But if you look at, to play devil's advocate,
link |
02:37:37.140
is the risk estimation thing that you've talked about
link |
02:37:39.740
is kind of interesting.
link |
02:37:41.180
Is, it feels like there might be some longer term
link |
02:37:45.700
aggregation of context necessary to be able to figure out,
link |
02:37:49.020
like, the context.
link |
02:37:51.900
Yeah, I'm not even sure I'm believing my devil's advocate.
link |
02:37:55.820
We have a nice, like, vision model,
link |
02:37:58.260
which outputs, like, a 1024
link |
02:38:00.180
dimensional perception space.
link |
02:38:03.300
Can I try Transformers on it?
link |
02:38:04.740
Sure, I probably will.
link |
02:38:06.580
At some point, we'll try Transformers,
link |
02:38:08.260
and then we'll just see.
link |
02:38:09.100
Do they do better?
link |
02:38:09.940
Sure, I'm.
link |
02:38:10.780
But it might not be a game changer, you're saying?
link |
02:38:12.180
No, well, I'm not.
link |
02:38:13.020
Like, might Transformers work better than GRUs
link |
02:38:15.300
for autonomous driving?
link |
02:38:16.140
Sure.
link |
02:38:16.980
Might we switch?
link |
02:38:17.820
Sure.
link |
02:38:18.660
Is this some radical change?
link |
02:38:19.500
No.
link |
02:38:20.340
Okay, we use a slightly different,
link |
02:38:21.540
you know, we switch from RNNs to GRUs.
link |
02:38:23.020
Like, okay, maybe it's GRUs to Transformers,
link |
02:38:24.980
but no, it's not.
link |
02:38:26.540
Yeah.
link |
02:38:27.900
Well, on the topic of general intelligence,
link |
02:38:30.500
I don't know how much I've talked to you about it.
link |
02:38:32.140
Like, what, do you think we'll actually build
link |
02:38:36.540
an AGI?
link |
02:38:38.060
Like, if you look at Ray Kurzweil with Singularity,
link |
02:38:40.740
do you have like an intuition about,
link |
02:38:43.380
you're kind of saying driving is easy.
link |
02:38:45.420
Yeah.
link |
02:38:46.260
And I tend to personally believe that solving driving
link |
02:38:52.500
will have really deep, important impacts
link |
02:38:56.580
on our ability to solve general intelligence.
link |
02:38:59.300
Like, I think driving doesn't require general intelligence,
link |
02:39:03.380
but I think they're going to be neighbors
link |
02:39:05.220
in a way that it's like deeply tied.
link |
02:39:08.380
Cause it's so, like driving is so deeply connected
link |
02:39:11.540
to the human experience that I think solving one
link |
02:39:15.020
will help solve the other.
link |
02:39:17.420
But, so I don't see, I don't see driving as like easy
link |
02:39:20.980
and almost like separate from general intelligence,
link |
02:39:23.540
but like, what's your vision of a future with a Singularity?
link |
02:39:26.700
Do you see there'll be a single moment,
link |
02:39:28.180
like a Singularity where it'll be a phase shift?
link |
02:39:30.620
Are we in the Singularity now?
link |
02:39:32.380
Like what, do you have crazy ideas about the future
link |
02:39:34.540
in terms of AGI?
link |
02:39:35.740
We're definitely in the Singularity now.
link |
02:39:38.020
We are?
link |
02:39:38.860
Of course, of course.
link |
02:39:40.180
Look at the bandwidth between people.
link |
02:39:41.500
The bandwidth between people goes up, right?
link |
02:39:44.820
The Singularity is just, you know, when the bandwidth, but.
link |
02:39:47.300
What do you mean by the bandwidth of people?
link |
02:39:48.620
Communications, tools, the whole world is networked.
link |
02:39:51.340
The whole world is networked
link |
02:39:52.260
and we raise the speed of that network, right?
link |
02:39:54.740
Oh, so you think the communication of information
link |
02:39:57.380
in a distributed way is an empowering thing
link |
02:40:00.980
for collective intelligence?
link |
02:40:02.300
Oh, I didn't say it's necessarily a good thing,
link |
02:40:03.660
but I think that's like,
link |
02:40:04.540
when I think of the definition of the Singularity,
link |
02:40:06.500
yeah, it seems kind of right.
link |
02:40:08.180
I see, like it's a change in the world
link |
02:40:12.060
beyond which like the world be transformed
link |
02:40:14.940
in ways that we can't possibly imagine.
link |
02:40:16.780
No, I mean, I think we're in the Singularity now
link |
02:40:18.380
in the sense that there's like, you know,
link |
02:40:19.700
one world and a monoculture and it's all linked.
link |
02:40:22.340
Yeah, I mean, I kind of share the intuition
link |
02:40:24.800
that the Singularity will originate
link |
02:40:27.700
from the collective intelligence of us humans
link |
02:40:31.220
versus the like some single system AGI type thing.
link |
02:40:35.340
Oh, I totally agree with that.
link |
02:40:37.020
Yeah, I don't really believe in like a hard take off AGI
link |
02:40:40.460
kind of thing.
link |
02:40:45.140
Yeah, I don't even think AI is all that different in kind
link |
02:40:49.500
from what we've already been building.
link |
02:40:52.060
With respect to driving,
link |
02:40:53.380
I think driving is a subset of general intelligence
link |
02:40:56.340
and I think it's a pretty complete subset.
link |
02:40:58.060
I think the tools we develop at Comma
link |
02:41:00.300
will also be extremely helpful
link |
02:41:02.300
to solving general intelligence
link |
02:41:04.020
and that's I think the real reason why I'm doing it.
link |
02:41:06.620
I don't care about self driving cars.
link |
02:41:08.420
It's a cool problem to beat people at.
link |
02:41:10.920
But yeah, I mean, yeah, you're kind of, you're of two minds.
link |
02:41:14.540
So one, you do have to have a mission
link |
02:41:16.340
and you wanna focus and make sure you get there.
link |
02:41:19.020
You can't forget that but at the same time,
link |
02:41:22.280
there is a thread that's much bigger
link |
02:41:26.060
than that connects the entirety of your effort.
link |
02:41:28.460
That's much bigger than just driving.
link |
02:41:31.180
With AI and with general intelligence,
link |
02:41:33.380
it is so easy to delude yourself
link |
02:41:35.220
into thinking you've figured something out when you haven't.
link |
02:41:37.300
If we build a level five self driving car,
link |
02:41:39.820
we have indisputably built something.
link |
02:41:42.660
Yeah.
link |
02:41:43.500
Is it general intelligence?
link |
02:41:44.800
I'm not gonna debate that.
link |
02:41:45.940
I will say we've built something
link |
02:41:47.520
that provides huge financial value.
link |
02:41:49.740
Yeah, beautifully put.
link |
02:41:50.580
That's the engineering credo.
link |
02:41:51.940
Like just build the thing.
link |
02:41:53.660
It's like, that's why I'm with Elon
link |
02:41:57.120
on go to Mars.
link |
02:41:58.780
Yeah, that's a great one.
link |
02:41:59.740
You can argue like who the hell cares about going to Mars.
link |
02:42:03.700
But the reality is set that as a mission, get it done.
link |
02:42:07.220
Yeah.
link |
02:42:08.060
And then you're going to crack some problem
link |
02:42:09.540
that you've never even expected
link |
02:42:11.580
in the process of doing that, yeah.
link |
02:42:13.900
Yeah, I mean, no, I think if I had a choice
link |
02:42:16.220
between humanity going to Mars
link |
02:42:17.460
and solving self driving cars,
link |
02:42:18.500
I think going to Mars is better, but I don't know.
link |
02:42:21.900
I'm more suited for self driving cars.
link |
02:42:23.580
I'm an information guy.
link |
02:42:24.420
I'm not a modernist, I'm a postmodernist.
link |
02:42:26.540
Postmodernist, all right, beautifully put.
link |
02:42:29.620
Let me drag you back to programming for a sec.
link |
02:42:32.220
What three, maybe three to five programming languages
link |
02:42:35.140
should people learn, do you think?
link |
02:42:36.580
Like if you look at yourself,
link |
02:42:38.220
what did you get the most out of from learning?
link |
02:42:42.700
Well, so everybody should learn C and assembly.
link |
02:42:45.980
We'll start with those two, right?
link |
02:42:47.380
Assembly?
link |
02:42:48.220
Yeah, if you can't code in assembly,
link |
02:42:49.940
you don't know what the computer's doing.
link |
02:42:51.720
You don't understand like,
link |
02:42:53.340
you don't have to be great in assembly,
link |
02:42:54.820
but you have to code in it.
link |
02:42:56.580
And then like, you have to appreciate assembly
link |
02:42:58.940
in order to appreciate all the great things C gets you.
link |
02:43:02.020
And then you have to code in C
link |
02:43:03.380
in order to appreciate all the great things Python gets you.
link |
02:43:06.340
So I'll just say assembly, C, and Python,
link |
02:43:07.860
we'll start with those three.
link |
02:43:09.740
The memory allocation of C and the fact that,
link |
02:43:14.700
so assembly gives you a sense
link |
02:43:16.340
of just how many levels of abstraction
link |
02:43:18.460
you get to work on in modern day programming.
link |
02:43:20.660
Yeah, yeah, yeah, yeah, graph coloring for
link |
02:43:22.900
register assignment in compilers.
link |
02:43:24.740
Like, you know, you gotta do,
link |
02:43:25.940
you know, the compiler,
link |
02:43:26.780
the computer only has a certain number of registers,
link |
02:43:28.340
yet you can have all the variables you want in a C function.
link |
02:43:31.140
So you get to start to build intuition about compilation,
link |
02:43:34.460
like what a compiler gets you.
link |
02:43:37.380
What else?
link |
02:43:38.500
Well, then there's kind of a,
link |
02:43:41.220
so those are all very imperative programming languages.
link |
02:43:45.800
Then there's two other paradigms for programming
link |
02:43:47.700
that everybody should be familiar with.
link |
02:43:49.220
And one of them is functional.
link |
02:43:51.260
You should learn Haskell and take that all the way through,
link |
02:43:54.140
learn a language with dependent types like Coq,
link |
02:43:57.260
learn that whole space,
link |
02:43:58.980
like the very PL theory heavy languages.
link |
02:44:02.740
And Haskell is your favorite functional?
link |
02:44:04.860
Is that the go to, you'd say?
link |
02:44:06.500
Yeah, I'm not a great Haskell programmer.
link |
02:44:08.580
I wrote a compiler in Haskell once.
link |
02:44:10.580
There's another paradigm,
link |
02:44:11.460
and actually there's one more paradigm
link |
02:44:12.540
that I'll even talk about after that,
link |
02:44:14.180
that I never used to talk about
link |
02:44:15.100
when I would think about this,
link |
02:44:15.940
but the next paradigm is learn Verilog or VHDL.
link |
02:44:20.240
Understand this idea of all of the instructions
link |
02:44:22.280
execute at once. If I have a block in Verilog
link |
02:44:26.860
and I write stuff in it, it's not sequential.
link |
02:44:29.900
They all execute at once.
link |
02:44:33.740
And then think like that, that's how hardware works.
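A small sketch of that point, in Python rather than real Verilog: in a clocked hardware block with non-blocking assignments, every right-hand side is read from the old state and all registers update together. The swap example below is hypothetical and only meant to contrast the two mental models.

```python
# A Python sketch (hypothetical, not real Verilog) of the "all at once" point:
# in a clocked Verilog block with non-blocking assignments, every right-hand
# side is evaluated from the old state, and all registers then update together.

def sequential_swap(state):
    # Software-style, line by line: the second assignment sees the new 'a'.
    state = dict(state)
    state["a"] = state["b"]
    state["b"] = state["a"]   # the old value of 'a' is already gone
    return state

def clocked_swap(state):
    # Hardware-style: compute every next value from the old state and commit
    # at once, the way "a <= b; b <= a;" behaves on a clock edge.
    return {"a": state["b"], "b": state["a"]}

print(sequential_swap({"a": 1, "b": 2}))  # {'a': 2, 'b': 2} -- 'a' was lost
print(clocked_swap({"a": 1, "b": 2}))     # {'a': 2, 'b': 1} -- a real swap
```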
link |
02:44:36.700
To be, so I guess assembly doesn't quite get you that.
link |
02:44:40.020
Assembly is more about compilation,
link |
02:44:42.260
and Verilog is more about the hardware,
link |
02:44:44.260
like giving you a sense of what
link |
02:44:46.420
the hardware is actually doing.
link |
02:44:48.580
Assembly, C, Python are straight,
link |
02:44:50.480
like they sit right on top of each other.
link |
02:44:52.420
In fact, C is, well, C is kind of coded in C,
link |
02:44:55.660
but you could imagine the first C was coded in assembly,
link |
02:44:57.900
and Python is actually coded in C.
link |
02:45:00.020
So you can go straight up that stack.
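To make that layering tangible, here is a small, platform dependent sketch that uses only Python's standard library ctypes module to reach down into the C library that Python itself is built on. The strings and numbers are arbitrary.

```python
# A sketch of the "Python sits on top of C" point: CPython itself is written
# in C, and from Python you can reach straight down into the C library it is
# built on. Standard library only; the libc lookup is platform dependent
# (written with Linux or macOS in mind).
import ctypes
import ctypes.util

libc = ctypes.CDLL(ctypes.util.find_library("c"))

# Call C's printf directly, bypassing Python's print machinery.
libc.printf(b"hello from C's printf, called from Python: %d\n", 42)

# Call C's strlen on a raw byte string; Python's len() on bytes is doing a
# close cousin of this under the hood.
print(libc.strlen(b"macaroni"))  # 8
```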
link |
02:45:03.620
Got it, and then Verilog gives you, that's brilliant.
link |
02:45:06.940
Okay.
link |
02:45:07.780
And then I think there's another one now.
link |
02:45:09.820
Everyone, Karpathy calls it programming 2.0,
link |
02:45:12.660
which is learn a, I'm not even gonna,
link |
02:45:16.500
don't learn TensorFlow, learn PyTorch.
link |
02:45:18.620
So machine learning.
link |
02:45:20.180
We've got to come up with a better term
link |
02:45:21.540
than programming 2.0, or, but yeah.
link |
02:45:26.100
It's a programming language, learn it.
link |
02:45:29.900
I wonder if it can be formalized a little bit better.
link |
02:45:32.660
It feels like we're in the early days
link |
02:45:34.900
of what that actually entails.
link |
02:45:37.100
Data driven programming?
link |
02:45:39.220
Data driven programming, yeah.
link |
02:45:41.700
But it's so fundamentally different
link |
02:45:43.080
as a paradigm than the others.
link |
02:45:44.900
Like it almost requires a different skillset.
link |
02:45:50.420
But you think it's still, yeah.
link |
02:45:53.740
And PyTorch versus TensorFlow, PyTorch wins.
link |
02:45:56.580
It's the fourth paradigm.
link |
02:45:57.460
It's the fourth paradigm that I've kind of seen.
link |
02:45:59.380
There's like this, you know,
link |
02:46:01.460
imperative functional hardware.
link |
02:46:04.840
I don't know a better word for it.
link |
02:46:06.340
And then ML.
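For that fourth paradigm, a minimal PyTorch sketch of what "learn it" looks like in practice: you declare a parameterized model and let optimization fit it to data instead of writing the function by hand. The toy line-fitting task below is hypothetical and assumes PyTorch is installed.

```python
# A minimal PyTorch sketch of the data-driven paradigm: instead of writing the
# function, declare a parameterized model and let optimization fit it to examples.
import torch
import torch.nn as nn

# Toy data: learn y = 3x + 1 from noisy samples.
x = torch.linspace(-1, 1, 100).unsqueeze(1)
y = 3 * x + 1 + 0.1 * torch.randn_like(x)

model = nn.Linear(1, 1)                      # one weight, one bias
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

for step in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)              # how wrong is the current "program"?
    loss.backward()                          # gradients instead of hand-written logic
    optimizer.step()

print(model.weight.item(), model.bias.item())  # should land near 3 and 1
```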
link |
02:46:08.180
Do you have advice for people that wanna,
link |
02:46:13.180
you know, get into programming, wanna learn programming?
link |
02:46:16.180
You have a video,
link |
02:46:19.940
what is programming noob lessons, exclamation point.
link |
02:46:22.860
And I think the top comment is like,
link |
02:46:24.620
warning, this is not for noobs.
link |
02:46:27.820
Do you have a noob, like a TLDW for that video,
link |
02:46:32.560
but also a noob friendly advice
link |
02:46:38.020
on how to get into programming?
link |
02:46:39.540
You're never going to learn programming
link |
02:46:41.400
by watching a video called Learn Programming.
link |
02:46:44.640
The only way to learn programming, I think,
link |
02:46:46.660
and it's the only one, because
link |
02:46:48.060
everyone I've ever met who can program well,
link |
02:46:50.140
learned it all in the same way.
link |
02:46:51.940
They had something they wanted to do
link |
02:46:54.960
and then they tried to do it.
link |
02:46:56.580
And then they were like, oh, well, okay.
link |
02:47:00.020
This is kind of, you know, it'd be nice
link |
02:47:01.380
if the computer could kind of do this.
link |
02:47:02.740
And then, you know, that's how you learn.
link |
02:47:04.440
You just keep pushing on a project.
link |
02:47:09.020
So the only advice I have for learning programming
link |
02:47:10.900
is go program.
link |
02:47:12.140
Somebody wrote to me a question like,
link |
02:47:14.680
we don't really, they're looking to learn
link |
02:47:17.100
about recurrent neural networks.
link |
02:47:19.060
And he's saying, like, my company's thinking
link |
02:47:20.540
of using recurrent neural networks for time series data,
link |
02:47:24.140
but we don't really have an idea of where to use it yet.
link |
02:47:27.420
We just want to, like, do you have any advice
link |
02:47:28.980
on how to learn about it? These are the kind of
link |
02:47:31.780
general machine learning questions.
link |
02:47:33.340
And I think the answer is, like,
link |
02:47:36.780
actually have a problem that you're trying to solve.
link |
02:47:39.020
And just.
link |
02:47:40.220
I see that stuff.
link |
02:47:41.220
Oh my God, when people talk like that,
link |
02:47:42.660
they're like, I heard machine learning is important.
link |
02:47:45.500
Could you help us integrate machine learning
link |
02:47:47.500
with macaroni and cheese production?
link |
02:47:51.300
You just, I don't even, you can't help these people.
link |
02:47:54.000
Like, who lets you run anything?
link |
02:47:55.980
Who lets that kind of person run anything?
link |
02:47:58.400
I think we're all, we're all beginners at some point.
link |
02:48:02.620
So.
link |
02:48:03.460
It's not like they're a beginner.
link |
02:48:04.860
It's like, my problem is not that they don't know
link |
02:48:07.340
about machine learning.
link |
02:48:08.620
My problem is that they think that machine learning
link |
02:48:10.840
has something to say about macaroni and cheese production.
link |
02:48:14.740
Or like, I heard about this new technology.
link |
02:48:17.020
How can I use it for why?
link |
02:48:19.860
Like, I don't know what it is, but how can I use it for why?
link |
02:48:23.500
That's true.
link |
02:48:24.340
You have to build up an intuition of how,
link |
02:48:26.140
cause you might be able to figure out a way,
link |
02:48:27.620
but like the prerequisites,
link |
02:48:29.300
you should have a macaroni and cheese problem to solve first.
link |
02:48:32.300
Exactly.
link |
02:48:33.460
And then two, you should have more traditional,
link |
02:48:36.940
like the learning process should involve
link |
02:48:39.340
more traditionally applicable problems
link |
02:48:41.940
in the space of whatever that is, machine learning,
link |
02:48:44.540
and then see if it can be applied to mac and cheese.
link |
02:48:47.060
At least start with, tell me about a problem.
link |
02:48:49.180
Like if you have a problem, you're like,
link |
02:48:50.740
you know, some of my boxes aren't getting
link |
02:48:52.500
enough macaroni in them.
link |
02:48:54.500
Can we use machine learning to solve this problem?
link |
02:48:56.820
That's much, much better than how do I apply
link |
02:48:59.380
machine learning to macaroni and cheese?
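In that spirit, a hedged sketch of the better framing: start from the concrete problem ("some boxes aren't getting enough macaroni"), turn it into data in and a decision out, and try the dumbest baseline before reaching for recurrent networks. Every number and name below is made up.

```python
# A hypothetical baseline for the concrete problem: flag under-filled boxes
# from fill-weight measurements, before any machine learning enters the picture.
import statistics

def flag_underfilled(fill_weights_grams, nominal=250.0, tolerance=0.05):
    """Return indices of boxes whose fill weight is more than 5% under nominal."""
    cutoff = nominal * (1 - tolerance)
    return [i for i, w in enumerate(fill_weights_grams) if w < cutoff]

# Made-up line readings from the filler, one weight per box.
weights = [251.2, 249.8, 236.0, 250.5, 243.1, 252.0]
print(flag_underfilled(weights))                            # [2] -- the box to investigate
print(statistics.mean(weights), statistics.stdev(weights))  # is the filler drifting?
```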
link |
02:49:01.600
One big thing, maybe this is me talking
link |
02:49:05.380
to the audience a little bit, cause I get these days
link |
02:49:07.860
so many messages asking for advice on how to, like, learn stuff, okay?
link |
02:49:15.060
My, this is not me being mean.
link |
02:49:18.160
I think this is quite profound actually,
link |
02:49:20.540
is you should Google it.
link |
02:49:22.780
Oh yeah.
link |
02:49:23.820
Like one of the like skills that you should really acquire
link |
02:49:29.700
as an engineer, as a researcher, as a thinker,
link |
02:49:33.100
there are two complementary skills.
link |
02:49:36.660
Like one is with a blank sheet of paper
link |
02:49:39.100
with no internet to think deeply.
link |
02:49:41.660
And then the other is to Google the crap
link |
02:49:44.740
out of the questions you have.
link |
02:49:45.940
Like that's actually a skill people often talk about,
link |
02:49:49.180
but like doing research, like pulling at the thread,
link |
02:49:52.020
like looking up different words,
link |
02:49:53.860
going into like GitHub repositories with two stars
link |
02:49:58.060
and like looking how they did stuff,
link |
02:49:59.700
like looking at the code or going on Twitter,
link |
02:50:03.460
seeing like there's little pockets of brilliant people
link |
02:50:05.780
that are like having discussions.
link |
02:50:07.620
Like if you're a neuroscientist,
link |
02:50:09.860
go into the signal processing community.
link |
02:50:11.620
If you're an AI person, go into the psychology community,
link |
02:50:15.860
like switch communities.
link |
02:50:18.020
I keep searching, searching, searching,
link |
02:50:19.980
because it's so much better to invest
link |
02:50:23.860
in like finding somebody else who already solved your problem
link |
02:50:27.260
than it is to try to solve the problem.
link |
02:50:30.900
And because they've often invested years of their life,
link |
02:50:34.380
like entire communities are probably already out there
link |
02:50:37.420
who have tried to solve your problem.
link |
02:50:39.180
I think they're the same thing.
link |
02:50:40.980
I think you go try to solve the problem.
link |
02:50:44.180
And then in trying to solve the problem,
link |
02:50:46.180
if you're good at solving problems,
link |
02:50:47.740
you'll stumble upon the person who solved it already.
link |
02:50:50.260
But the stumbling is really important.
link |
02:50:52.300
I think that's a skill that people should really put,
link |
02:50:54.260
especially in undergrad, like search.
link |
02:50:57.700
If you ask me a question,
link |
02:50:58.700
how should I get started in deep learning, like especially?
link |
02:51:04.100
Like that is just so Googleable.
link |
02:51:07.260
Like the whole point is you Google that
link |
02:51:10.060
and you get a million pages and just start looking at them.
link |
02:51:13.980
Start pulling at the threads, start exploring,
link |
02:51:16.420
start taking notes, start getting advice
link |
02:51:19.300
from a million people that already like spent their life
link |
02:51:22.820
answering that question, actually.
link |
02:51:25.060
Oh, well, yeah, I mean, that's definitely also, yeah,
link |
02:51:26.460
when people like ask me things like that, I'm like, trust me,
link |
02:51:28.500
the top answer on Google is much, much better
link |
02:51:30.340
than anything I'm going to tell you, right?
link |
02:51:32.860
Yeah.
link |
02:51:34.460
People ask, it's an interesting question.
link |
02:51:38.100
Let me know if you have any recommendations.
link |
02:51:39.940
What three books, technical or fiction or philosophical,
link |
02:51:43.700
had an impact on your life or you would recommend perhaps?
link |
02:51:49.180
Maybe we'll start with the least controversial,
link |
02:51:51.100
Infinite Jest, Infinite Jest is a...
link |
02:51:57.500
David Foster Wallace.
link |
02:51:58.860
Yeah, it's a book about wireheading, really.
link |
02:52:03.820
Very enjoyable to read, very well written.
link |
02:52:07.540
You know, you will grow as a person reading this book,
link |
02:52:11.180
it's effort, and I'll set that up for the second book,
link |
02:52:14.740
which is pornography, it's called Atlas Shrugged,
link |
02:52:17.500
which...
link |
02:52:21.020
Atlas Shrugged is pornography.
link |
02:52:22.580
I mean, it is, I will not defend the,
link |
02:52:25.540
I will not say Atlas Shrugged is a well written book.
link |
02:52:28.380
It is entertaining to read, certainly, just like pornography.
link |
02:52:31.380
The production value isn't great.
link |
02:52:33.540
You know, there's a 60 page monologue in there
link |
02:52:36.220
that Ayn Rand's editor really wanted to take out.
link |
02:52:38.740
And she paid, she paid out of her pocket
link |
02:52:42.580
to keep that 60 page monologue in the book.
link |
02:52:45.060
But it is a great book for a kind of framework
link |
02:52:53.220
of human relations.
link |
02:52:54.660
And I know a lot of people are like,
link |
02:52:55.960
yeah, but it's a terrible framework.
link |
02:52:58.060
Yeah, but it's a framework.
link |
02:53:00.500
Just for context, in a couple of days,
link |
02:53:02.360
I'm speaking for probably four plus hours
link |
02:53:06.260
with Yaron Brook, who's the main living,
link |
02:53:10.200
remaining Objectivist.
link |
02:53:13.340
Interesting.
link |
02:53:14.580
So I've always found this philosophy quite interesting
link |
02:53:19.380
on many levels.
link |
02:53:20.260
One is how repulsive some percent of,
link |
02:53:24.020
large percent of the population find it,
link |
02:53:26.300
which is always, always funny to me
link |
02:53:29.140
when people are like unable to even read a philosophy
link |
02:53:32.980
because of some, I think that says more
link |
02:53:36.700
about their psychological perspective on it.
link |
02:53:40.580
But there is something about objectivism
link |
02:53:45.300
and Ayn Rand's philosophy that's deeply connected
link |
02:53:48.780
to this idea of capitalism,
link |
02:53:50.740
of the ethical life is the productive life
link |
02:53:56.620
that was always compelling to me.
link |
02:54:00.740
It didn't seem as, like I didn't seem to interpret it
link |
02:54:03.160
in the negative sense that some people do.
link |
02:54:05.660
To be fair, I read that book when I was 19.
link |
02:54:07.980
So it had an impact at that point, yeah.
link |
02:54:09.620
Yeah, and the bad guys in the book have this slogan
link |
02:54:13.700
from each according to their ability
link |
02:54:15.340
to each according to their need.
link |
02:54:17.300
And I'm looking at this and I'm like,
link |
02:54:19.740
these are the most cart,
link |
02:54:20.580
this is Team Rocket level cartoonishness, right?
link |
02:54:22.940
No bad guy.
link |
02:54:23.820
And then when I realized that was actually the slogan
link |
02:54:25.780
of the communist party, I'm like, wait a second.
link |
02:54:29.940
Wait, no, no, no, no, no.
link |
02:54:31.660
You're telling me this really happened?
link |
02:54:34.100
Yeah, it's interesting.
link |
02:54:34.920
I mean, one of the criticisms of her work
link |
02:54:36.660
is she has a cartoonish view of good and evil.
link |
02:54:39.220
Like the reality, as Jordan Peterson says,
link |
02:54:44.300
is that each of us have the capacity for good and evil
link |
02:54:47.420
in us as opposed to like, there's some characters
link |
02:54:49.940
who are purely evil and some characters that are purely good.
link |
02:54:52.220
And that's in a way why it's pornographic.
link |
02:54:55.220
The production value, I love it.
link |
02:54:57.020
Like evil is punished and there's very clearly,
link |
02:55:01.020
there's no, just like porn doesn't have character growth.
link |
02:55:06.020
Well, you know, neither does Atlas Shrugged, like.
link |
02:55:09.540
Really, well put.
link |
02:55:10.860
But to 19 year old George Hotz, it was good enough.
link |
02:55:14.220
Yeah, yeah, yeah, yeah.
link |
02:55:15.380
What's the third?
link |
02:55:16.940
You have something?
link |
02:55:18.620
I could give, these two I'll just throw out.
link |
02:55:21.520
They're sci fi.
link |
02:55:22.420
Permutation City.
link |
02:55:24.280
Great thing to start thinking about copies of yourself.
link |
02:55:26.620
And then the...
link |
02:55:27.460
Who's that by?
link |
02:55:28.280
Sorry, I didn't catch that.
link |
02:55:29.120
That is Greg Egan.
link |
02:55:31.980
He's a, that might not be his real name.
link |
02:55:33.540
Some Australian guy, might not be Australian.
link |
02:55:35.740
I don't know.
link |
02:55:36.700
And then this one's online.
link |
02:55:38.740
It's called The Metamorphosis of Prime Intellect.
link |
02:55:43.020
It's a story set in a post singularity world.
link |
02:55:45.420
It's interesting.
link |
02:55:46.720
Is there, can you, either of the worlds,
link |
02:55:49.180
do you find something philosophical interesting in them
link |
02:55:51.540
that you can comment on?
link |
02:55:53.660
I mean, it is clear to me that
link |
02:55:57.860
Metamorphosis of Prime Intellect is like written by
link |
02:56:00.620
an engineer, which is,
link |
02:56:03.780
it's very almost a pragmatic take on a utopia, in a way.
link |
02:56:12.620
Positive or negative?
link |
02:56:15.260
That's up to you to decide reading the book.
link |
02:56:17.940
And the ending of it is very interesting as well.
link |
02:56:21.580
And I didn't realize what it was.
link |
02:56:23.660
I first read that when I was 15.
link |
02:56:25.260
I've reread that book several times in my life.
link |
02:56:27.540
And it's short, it's 50 pages.
link |
02:56:29.220
Everyone should go read it.
link |
02:56:30.740
What's, sorry, it's a little tangent.
link |
02:56:33.100
I've been working through Foundation.
link |
02:56:34.700
I've been, I haven't read much sci fi my whole life
link |
02:56:37.060
and I'm trying to fix that the last few months.
link |
02:56:40.180
That's been a little side project.
link |
02:56:42.180
What's to you as the greatest sci fi novel
link |
02:56:46.180
that people should read?
link |
02:56:47.740
Or is that?
link |
02:56:49.220
I mean, I would, yeah, I would say like, yeah,
link |
02:56:51.180
Permutation City, Metamorphosis of Prime Intellect.
link |
02:56:53.820
I don't know.
link |
02:56:54.660
I didn't like Foundation.
link |
02:56:56.240
I thought it was way too modernist.
link |
02:56:58.820
You like Dune and all of those.
link |
02:57:00.780
I've never read Dune.
link |
02:57:01.780
I've never read Dune.
link |
02:57:02.820
I have to read it.
link |
02:57:04.580
A Fire Upon the Deep is interesting.
link |
02:57:09.140
Okay, I mean, look, everyone should read,
link |
02:57:10.540
everyone should read Neuromancer.
link |
02:57:11.380
Everyone should read Snow Crash.
link |
02:57:12.820
If you haven't read those, like start there.
link |
02:57:15.500
Yeah, I haven't read Snow Crash.
link |
02:57:16.340
You haven't read Snow Crash?
link |
02:57:17.460
Oh, it's, I mean, it's very entertaining.
link |
02:57:19.980
Gödel, Escher, Bach.
link |
02:57:20.820
And if you want the controversial one,
link |
02:57:22.220
Bronze Age Mindset.
link |
02:57:25.340
All right, I'll look into that one.
link |
02:57:27.740
Those aren't sci fi, but just to round out books.
link |
02:57:30.360
So a bunch of people asked me on Twitter
link |
02:57:34.360
and Reddit and so on for advice.
link |
02:57:36.880
So what advice would you give a young person today
link |
02:57:39.440
about life?
link |
02:57:40.560
In other words, what, yeah, I mean, looking back,
link |
02:57:47.480
especially when you were younger, you did,
link |
02:57:50.480
and you continued it.
link |
02:57:51.560
You've accomplished a lot of interesting things.
link |
02:57:54.840
Is there some advice from those,
link |
02:57:57.840
from that life of yours that you can pass on?
link |
02:58:01.880
If college ever opens again,
link |
02:58:03.760
I would love to give a graduation speech.
link |
02:58:07.640
At that point, I will put a lot of somewhat satirical effort
link |
02:58:11.360
into this question.
link |
02:58:12.320
Yeah, so you haven't written anything at this point.
link |
02:58:15.880
Oh, you know what?
link |
02:58:16.760
Always wear sunscreen.
link |
02:58:18.240
This is water.
link |
02:58:19.560
Pick your plagiarizing.
link |
02:58:21.160
I mean, you know, but that's the,
link |
02:58:23.960
that's the like clean your room.
link |
02:58:26.240
You know, yeah, you can plagiarize from all of this stuff.
link |
02:58:28.600
And it's, there is no,
link |
02:58:35.920
self help books aren't designed to help you.
link |
02:58:37.680
They're designed to make you feel good.
link |
02:58:40.080
Like whatever advice I could give, you already know.
link |
02:58:44.120
Everyone already knows.
link |
02:58:45.840
Sorry, it doesn't feel good.
link |
02:58:50.240
Right?
link |
02:58:51.080
Like, you know, you know,
link |
02:58:53.040
if I tell you that you should, you know,
link |
02:58:56.880
eat well and read more and it's not gonna do anything.
link |
02:59:01.920
I think the whole like genre
link |
02:59:03.480
of those kinds of questions is meaningless.
link |
02:59:07.480
I don't know.
link |
02:59:08.320
If anything, it's don't worry so much about that stuff.
link |
02:59:10.560
Don't be so caught up in your head.
link |
02:59:12.440
Right.
link |
02:59:13.280
I mean, you're, yeah.
link |
02:59:14.280
In a sense that your whole life,
link |
02:59:16.560
your whole existence is like a moving version of that advice.
link |
02:59:20.680
I don't know.
link |
02:59:23.840
There's something, I mean,
link |
02:59:25.480
there's something in you that resists
link |
02:59:27.080
that kind of thinking and that in itself is,
link |
02:59:30.880
it's just illustrative of who you are.
link |
02:59:34.480
And there's something to learn from that.
link |
02:59:36.760
I think you're clearly not overthinking stuff.
link |
02:59:41.280
Yeah.
link |
02:59:42.120
And you know what?
link |
02:59:42.960
There's a gut thing.
link |
02:59:43.800
Even when I talk about my advice,
link |
02:59:45.000
I'm like, my advice is only relevant to me.
link |
02:59:47.400
It's not relevant to anybody else.
link |
02:59:48.720
I'm not saying you should go out,
link |
02:59:49.960
If you're the kind of person who overthinks things
link |
02:59:51.640
to stop overthinking things. It's not bad.
link |
02:59:54.080
It doesn't work for me.
link |
02:59:54.960
Maybe it works for you.
link |
02:59:55.800
I don't know.
link |
02:59:57.960
Let me ask you about love.
link |
02:59:59.400
Yeah.
link |
03:00:02.240
I think last time we talked about the meaning of life
link |
03:00:05.120
and it was kind of about winning.
link |
03:00:08.640
Of course.
link |
03:00:10.880
I don't think I've talked to you about love much,
link |
03:00:13.120
whether romantic or just love
link |
03:00:15.000
for the common humanity amongst us all.
link |
03:00:18.120
What role has love played in your life?
link |
03:00:21.400
In this quest for winning, where does love fit in?
link |
03:00:26.360
Well, the word love, I think means several different things.
link |
03:00:29.840
There's love in the sense of, maybe I could just say,
link |
03:00:32.960
there's like love in the sense of opiates
link |
03:00:34.400
and love in the sense of oxytocin
link |
03:00:37.880
and then love in the sense of,
link |
03:00:43.080
maybe like a love for math.
link |
03:00:44.480
I don't think it fits into either
link |
03:00:45.640
of those first two paradigms.
link |
03:00:49.160
So each of those, have they given something to you
link |
03:00:55.560
in your life?
link |
03:00:56.920
I'm not that big of a fan of the first two.
link |
03:01:00.800
Why?
link |
03:01:03.600
The same reason I'm not a fan of,
link |
03:01:06.360
the same reason I don't do opiates and don't take ecstasy.
link |
03:01:09.880
And there were times, look, I've tried both.
link |
03:01:14.200
I liked opiates way more than I liked ecstasy,
link |
03:01:18.400
but they're not, the ethical life is the productive life.
link |
03:01:24.400
So maybe that's my problem with those.
link |
03:01:27.080
And then like, yeah, a sense of, I don't know,
link |
03:01:29.440
like abstract love for humanity.
link |
03:01:32.200
I mean, the abstract love for humanity,
link |
03:01:34.520
I'm like, yeah, I've always felt that.
link |
03:01:36.200
And I guess it's hard for me to imagine
link |
03:01:39.680
not feeling it and maybe there's people who don't.
link |
03:01:41.560
And I don't know.
link |
03:01:43.560
Yeah, that's just like a background thing that's there.
link |
03:01:46.520
I mean, since we brought up drugs, let me ask you,
link |
03:01:51.760
this is becoming more and more a part of my life
link |
03:01:54.000
because I'm talking to a few researchers
link |
03:01:55.520
that are working on psychedelics.
link |
03:01:57.680
I've eaten shrooms a couple of times
link |
03:02:00.440
and it was fascinating to me that like the mind can go,
link |
03:02:04.680
like just fascinating the mind can go to places
link |
03:02:08.200
I didn't imagine it could go.
link |
03:02:09.480
And it was very friendly and positive and exciting
link |
03:02:12.720
and everything was kind of hilarious in the place.
link |
03:02:16.160
Wherever my mind went, that's where I went.
link |
03:02:18.200
Is, what do you think about psychedelics?
link |
03:02:20.960
Do you think they have, where do you think the mind goes?
link |
03:02:24.680
Have you done psychedelics?
link |
03:02:25.920
Where do you think the mind goes?
link |
03:02:28.640
Is there something useful to learn about the places it goes
link |
03:02:32.240
once you come back?
link |
03:02:33.840
I find it interesting that this idea
link |
03:02:38.040
that psychedelics have something to teach
link |
03:02:40.400
is almost unique to psychedelics, right?
link |
03:02:43.800
People don't argue this about amphetamines.
link |
03:02:46.560
And I'm not really sure why.
link |
03:02:50.240
I think all of the drugs have lessons to teach.
link |
03:02:53.800
I think there's things to learn from opiates.
link |
03:02:55.160
I think there's things to learn from amphetamines.
link |
03:02:56.600
I think there's things to learn from psychedelics,
link |
03:02:58.160
things to learn from marijuana.
link |
03:03:02.320
But also at the same time recognize
link |
03:03:05.200
that I don't think you're learning things about the world.
link |
03:03:07.400
I think you're learning things about yourself.
link |
03:03:09.160
Yes.
link |
03:03:10.680
And, you know, what's the, even, it might've even been,
link |
03:03:15.760
might've even been a Timothy Leary quote.
link |
03:03:17.240
I don't wanna misquote him,
link |
03:03:18.200
but the idea is basically like, you know,
link |
03:03:20.240
everybody should look behind the door,
link |
03:03:21.600
but then once you've seen behind the door,
link |
03:03:22.760
you don't need to keep going back.
link |
03:03:26.120
So, I mean, and that's my thoughts on all real drug use too.
link |
03:03:29.960
Except maybe for caffeine.
link |
03:03:32.680
It's a little experience that is good to have, but.
link |
03:03:37.160
Oh yeah, no, I mean, yeah, I guess,
link |
03:03:39.240
yes, psychedelics are definitely.
link |
03:03:41.880
So you're a fan of new experiences, I suppose.
link |
03:03:43.920
Yes.
link |
03:03:44.760
Because they all contain a little,
link |
03:03:45.880
especially the first few times,
link |
03:03:47.000
it contains some lessons that can be picked up.
link |
03:03:49.720
Yeah, and I'll revisit psychedelics maybe once a year.
link |
03:03:55.720
Usually smaller doses.
link |
03:03:58.760
Maybe they turn up the learning rate of your brain.
link |
03:04:01.440
I've heard that, I like that.
link |
03:04:03.160
Yeah, that's cool.
link |
03:04:04.280
Big learning rates have pros and cons.
link |
03:04:07.360
Last question, and this is a little weird one,
link |
03:04:09.440
but you've called yourself crazy in the past.
link |
03:04:14.120
First of all, on a scale of one to 10,
link |
03:04:16.120
how crazy would you say are you?
link |
03:04:18.040
Oh, I mean, it depends how you, you know,
link |
03:04:19.480
when you compare me to Elon Musk and Anthony Levandowski,
link |
03:04:21.680
not so crazy.
link |
03:04:23.520
So like a seven?
link |
03:04:25.920
Let's go with six.
link |
03:04:27.000
Six, six, six.
link |
03:04:29.320
What?
link |
03:04:31.440
Well, I like seven, seven's a good number.
link |
03:04:32.960
Seven, all right, well, I'm sure day by day it changes,
link |
03:04:36.320
right, so, but you're in that area.
link |
03:04:42.440
In thinking about that,
link |
03:04:43.440
what do you think is the role of madness?
link |
03:04:45.760
Is that a feature or a bug
link |
03:04:48.520
if you were to dissect your brain?
link |
03:04:51.680
So, okay, from like a mental health lens on crazy,
link |
03:04:57.040
I'm not sure I really believe in that.
link |
03:04:59.040
I'm not sure I really believe in like a lot of that stuff.
link |
03:05:02.840
Right, this concept of, okay, you know,
link |
03:05:05.080
when you get over to like hardcore bipolar and schizophrenia,
link |
03:05:09.560
these things are clearly real, somewhat biological.
link |
03:05:13.160
And then over here on the spectrum,
link |
03:05:14.480
you have like ADD and oppositional defiant disorder
link |
03:05:18.360
and these things that are like,
link |
03:05:20.760
wait, this is normal spectrum human behavior.
link |
03:05:22.800
Like this isn't, you know, where's the line here
link |
03:05:28.640
and why is this like a problem?
link |
03:05:31.320
So there's this whole, you know,
link |
03:05:33.280
the neurodiversity of humanity is huge.
link |
03:05:35.840
Like people think I'm always on drugs.
link |
03:05:37.640
People are saying this to me on my streams.
link |
03:05:38.920
And I'm like, guys, you know,
link |
03:05:39.760
like I'm real open with my drug use.
link |
03:05:41.320
I'd tell you if I was on drugs and yeah,
link |
03:05:44.080
I had like a cup of coffee this morning,
link |
03:05:45.640
but other than that, this is just me.
link |
03:05:47.240
You're witnessing my brain in action.
link |
03:05:51.400
So the word madness doesn't even make sense
link |
03:05:55.600
in the rich neurodiversity of humans.
link |
03:05:59.680
I think it makes sense, but only for like
link |
03:06:04.600
some insane extremes.
link |
03:06:07.040
Like if you are actually like visibly hallucinating,
link |
03:06:11.720
you know, that's okay.
link |
03:06:15.000
But there is the kind of spectrum on which you stand out.
link |
03:06:17.440
Like that's like, if I were to look, you know,
link |
03:06:22.000
at decorations on a Christmas tree or something like that,
link |
03:06:25.080
like if you were a decoration, that would catch my eye.
link |
03:06:28.760
Like that thing is sparkly, whatever the hell that thing is.
link |
03:06:35.360
There's something to that.
link |
03:06:37.360
Just like refusing to be boring
link |
03:06:42.120
or maybe boring is the wrong word,
link |
03:06:43.920
but to yeah, I mean, be willing to sparkle, you know?
link |
03:06:52.440
It's like somewhat constructed.
link |
03:06:54.200
I mean, I am who I choose to be.
link |
03:06:57.000
I'm gonna say things as true as I can see them.
link |
03:07:01.000
I'm not gonna lie.
link |
03:07:04.480
But that's a really important feature in itself.
link |
03:07:06.600
So like whatever the neurodiversity of your,
link |
03:07:09.080
whatever your brain is, not putting constraints on it
link |
03:07:13.800
that force it to fit into the mold of what society is like,
link |
03:07:18.720
defines what you're supposed to be.
link |
03:07:20.640
So you're one of the specimens
link |
03:07:22.360
that doesn't mind being yourself.
link |
03:07:27.800
Being right is super important,
link |
03:07:31.720
except at the expense of being wrong.
link |
03:07:37.240
Without breaking that apart,
link |
03:07:38.480
I think it's a beautiful way to end it.
link |
03:07:40.400
George, you're one of the most special humans I know.
link |
03:07:43.080
It's truly an honor to talk to you.
link |
03:07:44.600
Thanks so much for doing it.
link |
03:07:45.800
Thank you for having me.
link |
03:07:47.640
Thanks for listening to this conversation with George Hotz
link |
03:07:50.360
and thank you to our sponsors,
link |
03:07:52.320
Four Sigmatic, which is the maker
link |
03:07:54.640
of delicious mushroom coffee,
link |
03:07:57.160
Decoding Digital, which is a tech podcast
link |
03:07:59.880
that I listen to and enjoy,
link |
03:08:02.120
and ExpressVPN, which is the VPN I've used for many years.
link |
03:08:07.080
Please check out these sponsors in the description
link |
03:08:09.240
to get a discount and to support this podcast.
link |
03:08:13.120
If you enjoy this thing, subscribe on YouTube,
link |
03:08:15.520
review it with five stars on Apple Podcast,
link |
03:08:17.920
follow on Spotify, support on Patreon,
link |
03:08:20.680
or connect with me on Twitter at Lex Fridman.
link |
03:08:24.480
And now, let me leave you with some words
link |
03:08:27.040
from the great and powerful Linus Torvalds.
link |
03:08:30.720
Talk is cheap, show me the code.
link |
03:08:33.120
Thank you for listening and hope to see you next time.