Elon Musk: SpaceX, Mars, Tesla Autopilot, Self-Driving, Robotics, and AI | Lex Fridman Podcast #252



link |
00:00:00.000
The following is a conversation with Elon Musk,
link |
00:00:02.540
his third time on this, the Lex Fridman Podcast.
link |
00:00:06.980
Yeah, make yourself comfortable.
link |
00:00:08.440
Boo.
link |
00:00:09.280
No, wow, okay.
link |
00:00:10.120
You don't do the headphone thing?
link |
00:00:12.760
No.
link |
00:00:13.600
Okay.
link |
00:00:14.420
I mean, how close do I need to get up to the same place?
link |
00:00:15.840
The closer you are, the sexier you sound.
link |
00:00:17.560
Hey, babe.
link |
00:00:18.400
What's up?
link |
00:00:19.220
Can't get enough of your love, baby.
link |
00:00:20.880
I'm gonna clip that out
link |
00:00:25.080
any time somebody messages me about it.
link |
00:00:26.720
If you want my body and you think I'm sexy,
link |
00:00:30.120
come right out and tell me so.
link |
00:00:33.040
Do, do, do, do, do.
link |
00:00:36.000
So good.
link |
00:00:38.840
So good.
link |
00:00:39.680
Okay, serious mode activate.
link |
00:00:41.880
All right.
link |
00:00:43.240
Serious mode.
link |
00:00:44.360
Come on, you're Russian.
link |
00:00:45.200
You can be serious.
link |
00:00:46.020
Yeah, I know.
link |
00:00:46.860
Everyone's serious all the time in Russia.
link |
00:00:47.700
Yeah, yeah, we'll get there.
link |
00:00:50.600
We'll get there.
link |
00:00:51.920
Yeah.
link |
00:00:52.760
It's gotten soft.
link |
00:00:53.600
Allow me to say that the SpaceX launch
link |
00:00:57.760
of human beings to orbit on May 30th, 2020
link |
00:01:02.160
was seen by many as the first step
link |
00:01:03.920
in a new era of human space exploration.
link |
00:01:07.440
These human space flight missions were a beacon of hope
link |
00:01:10.620
to me and to millions over the past two years
link |
00:01:12.740
as our world has been going through
link |
00:01:14.680
one of the most difficult periods in recent human history.
link |
00:01:18.080
We saw, we see the rise of division, fear, cynicism,
link |
00:01:21.720
and the loss of common humanity
link |
00:01:24.620
right when it is needed most.
link |
00:01:26.380
So first, Elon, let me say thank you
link |
00:01:29.080
for giving the world hope and reason
link |
00:01:30.620
to be excited about the future.
link |
00:01:32.520
Oh, it's kind of easy to say.
link |
00:01:34.200
I do want to do that.
link |
00:01:35.480
Humanity has obviously a lot of issues
link |
00:01:38.320
and people at times do bad things,
link |
00:01:42.620
but despite all that, I love humanity
link |
00:01:47.260
and I think we should make sure we do everything we can
link |
00:01:52.560
to have a good future and an exciting future
link |
00:01:54.640
and one that maximizes the happiness of the people.
link |
00:01:58.320
Let me ask about Crew Dragon Demo 2.
link |
00:02:00.960
So that first flight with humans on board,
link |
00:02:04.360
how did you feel leading up to that launch?
link |
00:02:06.160
Were you scared?
link |
00:02:07.280
Were you excited?
link |
00:02:08.260
What was going through your mind?
link |
00:02:09.640
So much was at stake.
link |
00:02:11.380
Yeah, no, that was extremely stressful, no question.
link |
00:02:14.700
We obviously could not let them down in any way.
link |
00:02:19.140
So extremely stressful, I'd say, to say the least.
link |
00:02:25.940
I was confident that at the time that we launched
link |
00:02:28.380
that no one could think of anything at all to do
link |
00:02:34.140
that would improve the probability of success.
link |
00:02:36.500
And we racked our brains to think of any possible way
link |
00:02:40.100
to improve the probability of success.
link |
00:02:41.420
We could not think of anything more, nor could NASA.
link |
00:02:44.820
And so that's just the best that we could do.
link |
00:02:48.060
So then we went ahead and launched.
link |
00:02:51.860
Now, I'm not a religious person,
link |
00:02:54.340
but I nonetheless got on my knees and prayed for that mission.
link |
00:03:00.340
Were you able to sleep?
link |
00:03:01.940
No.
link |
00:03:02.780
How did it feel when it was a success,
link |
00:03:10.820
first when the launch was a success,
link |
00:03:12.780
and when they returned back home or back to Earth?
link |
00:03:16.940
It was a great relief.
link |
00:03:20.380
Yeah, for high stress situations,
link |
00:03:23.480
I find it's not so much elation as relief.
link |
00:03:26.060
And I think once, as we got more comfortable
link |
00:03:33.440
and proved out the systems,
link |
00:03:34.600
because we really got to make sure everything works,
link |
00:03:41.280
it was definitely a lot more enjoyable
link |
00:03:43.200
with the subsequent astronaut missions.
link |
00:03:46.520
And I thought the Inspiration mission
link |
00:03:50.160
was actually a very inspiring Inspiration 4 mission.
link |
00:03:53.960
I'd encourage people to watch the Inspiration documentary
link |
00:03:57.040
on Netflix, it's actually really good.
link |
00:04:00.280
And it really isn't, I was actually inspired by that.
link |
00:04:03.360
And so that one I felt I was kind of able
link |
00:04:07.760
to enjoy the actual mission
link |
00:04:09.320
and not just be super stressed all the time.
link |
00:04:10.880
So for people that somehow don't know,
link |
00:04:13.760
it's the all civilian, first time all civilian
link |
00:04:17.240
out to space, out to orbit.
link |
00:04:19.400
Yeah, and it was I think the highest orbit
link |
00:04:22.480
in like, I don't know, 30 or 40 years or something.
link |
00:04:26.120
The only one that was higher was the one shuttle,
link |
00:04:29.480
sorry, a Hubble servicing mission.
link |
00:04:32.160
And then before that, it would have been Apollo in 72.
link |
00:04:37.840
It's pretty wild.
link |
00:04:39.180
So it's cool, it's good.
link |
00:04:40.520
I think as a species, we want to be continuing
link |
00:04:46.620
to do better and reach higher ground.
link |
00:04:50.000
And like, I think it would be tragic, extremely tragic
link |
00:04:52.880
if Apollo was the high watermark for humanity,
link |
00:04:57.000
and that's as far as we ever got.
link |
00:05:00.080
And it's concerning that here we are 49 years
link |
00:05:06.360
after the last mission to the moon.
link |
00:05:09.060
And so almost half a century, and we've not been back.
link |
00:05:14.060
And that's worrying.
link |
00:05:16.260
It's like, does that mean we've peaked as a civilization
link |
00:05:20.280
or what?
link |
00:05:21.120
So like, I think we got to get back to the moon
link |
00:05:24.480
and build a base there, a science base.
link |
00:05:27.160
I think we could learn a lot about the nature
link |
00:05:28.880
of the universe if we have a proper science base
link |
00:05:31.120
on the moon, like we have a science base in Antarctica
link |
00:05:35.520
and many other parts of the world.
link |
00:05:38.080
And so that's like, I think the next big thing,
link |
00:05:41.880
we've got to have like a serious moon base
link |
00:05:45.680
and then get people to Mars and get out there
link |
00:05:49.960
and be a spacefaring civilization.
link |
00:05:52.580
I'll ask you about some of those details,
link |
00:05:54.500
but since you're so busy with the hard engineering
link |
00:05:57.520
challenges of everything that's involved,
link |
00:06:00.440
are you still able to marvel at the magic of it all,
link |
00:06:03.000
of space travel, of every time the rocket goes up,
link |
00:06:05.880
especially when it's a crewed mission?
link |
00:06:08.120
Or are you just so overwhelmed with all the challenges
link |
00:06:11.440
that you have to solve?
link |
00:06:12.560
And actually, sort of to add to that,
link |
00:06:16.280
the reason I wanted to ask this question of May 30th,
link |
00:06:19.240
it's been some time, so you can look back
link |
00:06:21.840
and think about the impact already.
link |
00:06:23.520
It's already, at the time it was an engineering problem
link |
00:06:26.360
maybe, now it's becoming a historic moment.
link |
00:06:29.320
Like it's a moment that, how many moments
link |
00:06:31.640
will be remembered about the 21st century?
link |
00:06:33.720
To me, that or something like that,
link |
00:06:37.300
maybe Inspiration4, one of those would be remembered
link |
00:06:39.780
as the early steps of a new age of space exploration.
link |
00:06:44.200
Yeah, I mean, during the launches itself,
link |
00:06:46.640
so I mean, I think maybe some people will know,
link |
00:06:49.120
but a lot of people don't know,
link |
00:06:50.200
is like I'm actually the chief engineer of SpaceX,
link |
00:06:52.120
so I've signed off on pretty much all the design decisions.
link |
00:06:59.040
And so if there's something that goes wrong
link |
00:07:03.120
with that vehicle, it's fundamentally my fault, so.
link |
00:07:09.960
So I'm really just thinking about all the things that,
link |
00:07:13.800
like, so when I see the rocket,
link |
00:07:15.600
I see all the things that could go wrong
link |
00:07:17.320
and the things that could be better
link |
00:07:18.520
and the same with the Dragon spacecraft.
link |
00:07:21.480
It's like, other people will see,
link |
00:07:23.440
oh, this is a spacecraft or a rocket
link |
00:07:25.600
and this looks really cool.
link |
00:07:27.040
I'm like, I've like a readout of like,
link |
00:07:29.600
these are the risks, these are the problems,
link |
00:07:32.600
that's what I see.
link |
00:07:33.840
Like, choo choo choo choo choo choo choo.
link |
00:07:36.320
So it's not what other people see
link |
00:07:38.840
when they see the product, you know.
link |
00:07:40.400
So let me ask you then to analyze Starship
link |
00:07:43.040
in that same way.
link |
00:07:44.360
I know you have, you'll talk about,
link |
00:07:46.520
in more detail about Starship in the near future, perhaps.
link |
00:07:49.560
Yeah, we can talk about it now if you want.
link |
00:07:51.800
But just in that same way, like you said,
link |
00:07:54.120
you see, when you see a rocket,
link |
00:07:57.620
you see sort of a list of risks.
link |
00:07:59.160
In that same way, you said that Starship
link |
00:08:01.040
is a really hard problem.
link |
00:08:03.240
So there's many ways I can ask this,
link |
00:08:05.480
but if you magically could solve one problem perfectly,
link |
00:08:09.560
one engineering problem perfectly,
link |
00:08:11.740
which one would it be?
link |
00:08:13.000
On Starship?
link |
00:08:14.120
On, sorry, on Starship.
link |
00:08:15.540
So is it maybe related to the efficiency,
link |
00:08:18.040
the engine, the weight of the different components,
link |
00:08:21.360
the complexity of various things,
link |
00:08:22.680
maybe the controls of the crazy thing it has to do to land?
link |
00:08:26.720
No, it's actually by far the biggest thing
link |
00:08:30.080
absorbing my time is engine production.
link |
00:08:35.080
Not the design of the engine,
link |
00:08:36.480
but I've often said prototypes are easy, production is hard.
link |
00:08:45.360
So we have the most advanced rocket engine
link |
00:08:48.680
that's ever been designed.
link |
00:08:52.720
Cause I'd say currently the best rocket engine ever
link |
00:08:55.260
is probably the RD-180 or RD-170,
link |
00:09:00.320
that's the Russian engine basically.
link |
00:09:03.640
And still, I think an engine should only count
link |
00:09:06.760
if it's gotten something to orbit.
link |
00:09:09.000
So our engine has not gotten anything to orbit yet,
link |
00:09:12.720
but it is, it's the first engine
link |
00:09:14.600
that's actually better than the Russian RD engines,
link |
00:09:20.240
which were amazing design.
link |
00:09:22.680
So you're talking about Raptor engine.
link |
00:09:24.120
What makes it amazing?
link |
00:09:25.520
What are the different aspects of it that make it,
link |
00:09:29.280
like what are you the most excited about
link |
00:09:31.920
if the whole thing works in terms of efficiency,
link |
00:09:35.080
all those kinds of things?
link |
00:09:37.120
Well, it's, the Raptor is a full flow
link |
00:09:42.000
staged combustion engine
link |
00:09:46.360
and it's operating at a very high chamber pressure.
link |
00:09:50.840
So one of the key figures of merit, perhaps the key figure
link |
00:09:54.800
of merit is what is the chamber pressure
link |
00:09:58.680
at which the rocket engine can operate?
link |
00:10:00.800
That's the combustion chamber pressure.
link |
00:10:03.040
So Raptor is designed to operate at 300 bar,
link |
00:10:06.960
possibly maybe higher, that's 300 atmospheres.
link |
00:10:10.320
So the record right now for operational engine
link |
00:10:15.880
is the RD engine that I mentioned, the Russian RD,
link |
00:10:17.880
which is I believe around 267 bar.
link |
00:10:22.560
And the difficulty of the chamber pressure
link |
00:10:25.840
increases on a nonlinear basis.
link |
00:10:27.860
So 10% more chamber pressure is more like 50% more difficult.
link |
00:10:36.960
But that chamber pressure is,
link |
00:10:38.640
that is what allows you to get a very high power density
link |
00:10:44.400
for the engine.
link |
00:10:45.220
So enabling a very high thrust to weight ratio
link |
00:10:53.300
and a very high specific impulse.
link |
00:10:57.020
So specific impulse is like a measure of the efficiency
link |
00:10:59.820
of a rocket engine.
link |
00:11:01.700
It's really the effective exhaust velocity
link |
00:11:07.820
of the gas coming out of the engine.
link |
00:11:10.120
So with a very high chamber pressure,
link |
00:11:17.160
you can have a compact engine
link |
00:11:21.600
that nonetheless has a high expansion ratio,
link |
00:11:24.000
which is the ratio between the exit nozzle and the throat.
link |
00:11:31.800
So you see a rocket engine's got sort of like an hourglass
link |
00:11:35.640
shape, it's like a chamber and then it necks down
link |
00:11:38.120
and there's a nozzle and the ratio of the exit diameter
link |
00:11:41.880
to the throat is the expansion ratio.
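As a rough sketch of the figures of merit just described, the snippet below puts them into code. The chamber pressures are the ones mentioned in the conversation; the exhaust velocity and nozzle diameters are made-up illustrative numbers, and note that expansion ratio is conventionally quoted as an area ratio, which is the square of the diameter ratio described above.

```python
# Minimal sketch of the engine figures of merit discussed above.
# All numbers are rough illustrations, not official engine data.
G0 = 9.80665  # standard gravity, m/s^2

def specific_impulse_s(effective_exhaust_velocity_m_s: float) -> float:
    # Isp in seconds is the effective exhaust velocity divided by g0.
    return effective_exhaust_velocity_m_s / G0

def area_expansion_ratio(exit_diameter_m: float, throat_diameter_m: float) -> float:
    # Conventional expansion ratio: exit area over throat area,
    # i.e. the square of the diameter ratio described in the conversation.
    return (exit_diameter_m / throat_diameter_m) ** 2

raptor_chamber_pressure_bar = 300  # design target mentioned above
rd_chamber_pressure_bar = 267      # operational record mentioned above

# An effective exhaust velocity of ~3,300 m/s corresponds to an Isp of ~336 s.
print(round(specific_impulse_s(3300), 1))
print(round(area_expansion_ratio(1.3, 0.3), 1))  # made-up diameters -> ~18.8
```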
link |
00:11:45.960
So why is it such a hard engine to manufacture at scale?
link |
00:11:51.400
It's very complex.
link |
00:11:53.120
So a lot of, what is complexity mean here?
link |
00:11:55.200
There's a lot of components involved.
link |
00:11:56.760
There's a lot of components and a lot of unique materials
link |
00:12:02.400
that, so we had to invent several alloys
link |
00:12:07.040
that don't exist in order to make this engine work.
link |
00:12:11.520
It's a materials problem too.
link |
00:12:14.560
It's a materials problem and in a staged combustion,
link |
00:12:19.760
a full flow staged combustion,
link |
00:12:21.200
there are many feedback loops in the system.
link |
00:12:24.160
So basically you've got propellants and hot gas
link |
00:12:31.820
flowing simultaneously to so many different places
link |
00:12:36.820
on the engine and they all have a recursive effect
link |
00:12:42.740
on each other.
link |
00:12:43.700
So you change one thing here, it has a recursive effect
link |
00:12:45.780
here, it changes something over there
link |
00:12:47.420
and it's quite hard to control.
link |
00:12:52.740
Like there's a reason no one's made this before.
link |
00:12:58.700
And the reason we're doing a staged combustion full flow
link |
00:13:03.700
is because it has the highest possible efficiency.
link |
00:13:12.740
So in order to make a fully reusable rocket,
link |
00:13:19.380
which that's the really the holy grail of orbital rocketry.
link |
00:13:25.340
You have to have, everything's gotta be the best.
link |
00:13:28.420
It's gotta be the best engine, the best airframe,
link |
00:13:30.900
the best heat shield, extremely light avionics,
link |
00:13:38.100
very clever control mechanisms.
link |
00:13:40.620
You've got to shed mass in any possible way that you can.
link |
00:13:45.100
For example, we are, instead of putting landing legs
link |
00:13:47.620
on the booster and ship, we are gonna catch them
link |
00:13:49.580
with a tower to save the weight of the landing legs.
link |
00:13:53.220
So that's like, I mean, we're talking about catching
link |
00:13:58.220
the largest flying object ever made with,
link |
00:14:03.340
on a giant tower with chopstick arms.
link |
00:14:06.660
It's like Karate Kid with the fly, but much bigger.
link |
00:14:12.060
I mean, pulling something like that off.
link |
00:14:12.900
This probably won't work the first time.
link |
00:14:17.620
Anyway, so this is bananas, this is banana stuff.
link |
00:14:19.820
So you mentioned that you doubt, well, not you doubt,
link |
00:14:23.100
but there's days or moments when you doubt
link |
00:14:26.460
that this is even possible.
link |
00:14:28.340
It's so difficult.
link |
00:14:30.120
The possible part is, well, at this point,
link |
00:14:35.420
I think we will get Starship to work.
link |
00:14:41.380
There's a question of timing.
link |
00:14:42.860
How long will it take us to do this?
link |
00:14:45.460
How long will it take us to actually achieve
link |
00:14:47.700
full and rapid reusability?
link |
00:14:50.500
Cause it will take probably many launches
link |
00:14:52.680
before we are able to have full and rapid reusability.
link |
00:14:57.660
But I can say that the physics pencils out.
link |
00:15:01.540
Like we're not, like at this point,
link |
00:15:06.620
I'd say we're confident that, like let's say,
link |
00:15:10.100
I'm very confident success is in the set
link |
00:15:12.620
of all possible outcomes.
link |
00:15:13.820
For a while there, I was not convinced
link |
00:15:18.340
that success was in the set of possible outcomes,
link |
00:15:20.820
which is very important actually.
link |
00:15:23.820
So you're saying there's a chance.
link |
00:15:29.700
I'm saying there's a chance, exactly.
link |
00:15:33.260
Just not sure how long it will take.
link |
00:15:38.260
We have a very talented team.
link |
00:15:39.580
They're working night and day to make it happen.
link |
00:15:43.420
And like I said, the critical thing to achieve
link |
00:15:48.260
for the revolution in space flight
link |
00:15:49.540
and for humanity to be a spacefaring civilization
link |
00:15:52.020
is to have a fully and rapidly reusable rocket,
link |
00:15:54.380
orbital rocket.
link |
00:15:56.500
There's not even been any orbital rocket
link |
00:15:58.820
that's been fully reusable ever.
link |
00:16:00.100
And this has always been the holy grail of rocketry.
link |
00:16:06.020
And many smart people, very smart people,
link |
00:16:09.700
have tried to do this before and they've not succeeded.
link |
00:16:12.700
So, cause it's such a hard problem.
link |
00:16:16.100
What's your source of belief in situations like this?
link |
00:16:21.220
When the engineering problem is so difficult,
link |
00:16:23.580
there's a lot of experts, many of whom you admire,
link |
00:16:27.500
who have failed in the past.
link |
00:16:29.220
And a lot of people, a lot of experts,
link |
00:16:37.220
maybe journalists, all the kind of,
link |
00:16:38.860
the public in general have a lot of doubt
link |
00:16:40.900
about whether it's possible.
link |
00:16:42.820
And you yourself know that even if it's a non null set,
link |
00:16:47.340
non empty set of success, it's still unlikely
link |
00:16:50.580
or very difficult.
link |
00:16:52.140
Like where do you go to, both personally,
link |
00:16:55.780
intellectually as an engineer, as a team,
link |
00:16:58.920
like for source of strength needed
link |
00:17:00.860
to sort of persevere through this.
link |
00:17:02.560
And to keep going with the project, take it to completion.
link |
00:17:18.560
A source of strength.
link |
00:17:19.800
That's really not how I think about things.
link |
00:17:23.600
I mean, for me, it's simply this,
link |
00:17:25.000
this is something that is important to get done
link |
00:17:28.080
and we should just keep doing it or die trying.
link |
00:17:32.480
And I don't need a source of strength.
link |
00:17:35.920
So quitting is not even like...
link |
00:17:39.000
That's not, it's not in my nature.
link |
00:17:41.520
And I don't care about optimism or pessimism.
link |
00:17:46.240
Fuck that, we're gonna get it done.
link |
00:17:47.880
Gonna get it done.
link |
00:17:51.720
Can you then zoom back in to specific problems
link |
00:17:55.160
with Starship or any engineering problems you work on?
link |
00:17:58.300
Can you try to introspect your particular
link |
00:18:00.900
biological neural network, your thinking process
link |
00:18:03.520
and describe how you think through problems,
link |
00:18:06.120
the different engineering and design problems?
link |
00:18:07.960
Is there like a systematic process?
link |
00:18:10.080
You've spoken about first principles thinking,
link |
00:18:11.880
but is there a kind of process to it?
link |
00:18:14.140
Well, yeah, like saying like physics is a law
link |
00:18:19.200
and everything else is a recommendation.
link |
00:18:21.640
Like I've met a lot of people that can break the law,
link |
00:18:23.200
but I haven't met anyone who could break physics.
link |
00:18:25.700
So the first, for any kind of technology problem,
link |
00:18:32.520
you have to sort of just make sure
link |
00:18:34.560
you're not violating physics.
link |
00:18:39.960
And first principles analysis, I think,
link |
00:18:45.120
is something that can be applied to really any walk of life,
link |
00:18:49.200
anything really.
link |
00:18:50.040
It's really just saying, let's boil something down
link |
00:18:54.680
to the most fundamental principles.
link |
00:18:58.440
The things that we are most confident are true
link |
00:19:00.840
at a foundational level.
link |
00:19:02.480
And that sets your axiomatic base,
link |
00:19:05.040
and then you reason up from there,
link |
00:19:07.000
and then you cross check your conclusion
link |
00:19:09.040
against the axiomatic truths.
link |
00:19:13.680
So some basics in physics would be like,
link |
00:19:18.120
are you violating conservation of energy or momentum
link |
00:19:20.280
or something like that?
link |
00:19:21.400
Then it's not gonna work.
link |
00:19:29.560
So that's just to establish, is it possible?
link |
00:19:32.960
And another good physics tool
link |
00:19:34.760
is thinking about things in the limit.
link |
00:19:36.480
If you take a particular thing
link |
00:19:38.360
and you scale it to a very large number
link |
00:19:41.680
or to a very small number, how do things change?
link |
00:19:45.960
Both like in number of things you manufacture
link |
00:19:48.920
or something like that, and then in time?
link |
00:19:51.640
Yeah, like let's say take an example of like manufacturing,
link |
00:19:55.920
which I think is just a very underrated problem.
link |
00:20:00.400
And like I said, it's much harder to
link |
00:20:06.720
take an advanced technology product
link |
00:20:09.520
and bring it into volume manufacturing
link |
00:20:11.000
than it is to design it in the first place.
link |
00:20:12.880
By orders of magnitude.
link |
00:20:14.480
So let's say you're trying to figure out
link |
00:20:17.800
is like why is this part or product expensive?
link |
00:20:23.960
Is it because of something fundamentally foolish
link |
00:20:27.400
that we're doing or is it because our volume is too low?
link |
00:20:31.320
And so then you say, okay, well, what if our volume
link |
00:20:32.840
was a million units a year?
link |
00:20:34.220
Is it still expensive?
link |
00:20:35.740
That's what I mean by thinking about things in the limit.
link |
00:20:38.120
If it's still expensive at a million units a year,
link |
00:20:40.140
then volume is not the reason why your thing is expensive.
link |
00:20:42.500
There's something fundamental about the design.
link |
00:20:44.700
And then you then can focus on reducing the complexity
link |
00:20:47.440
or something like that in the design.
link |
00:20:48.280
You gotta change the design to change the part
link |
00:20:50.600
to be something that is not fundamentally expensive.
link |
00:20:56.440
But like that's a common thing in rocketry
link |
00:20:58.920
because the unit volume is relatively low.
link |
00:21:01.840
And so a common excuse would be,
link |
00:21:04.120
well, it's expensive because our unit volume is low.
link |
00:21:06.480
And if we were in like automotive or something like that,
link |
00:21:08.760
or consumer electronics, then our costs would be lower.
link |
00:21:10.920
I'm like, okay, so let's say
link |
00:21:13.080
now you're making a million units a year.
link |
00:21:14.620
Is it still expensive?
link |
00:21:16.080
If the answer is yes, then economies of scale
link |
00:21:20.560
are not the issue.
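To make that "think in the limit" test concrete, here is a toy sketch of the reasoning just described; the function name, threshold, and numbers are hypothetical illustrations, not anything actually used in practice.

```python
# Toy version of the volume test described above: if a part would still be
# expensive at a million units per year, economies of scale are not the issue
# and the design itself has to change. Threshold and numbers are made up.

def volume_is_the_bottleneck(cost_at_1m_units_per_year: float,
                             raw_material_cost: float,
                             multiple: float = 2.0) -> bool:
    # True if, in the high-volume limit, the part becomes cheap relative to
    # its raw-material value, meaning low volume was the real reason it was
    # expensive. False means the design itself is fundamentally expensive.
    return cost_at_1m_units_per_year <= multiple * raw_material_cost

# Examples with made-up figures:
print(volume_is_the_bottleneck(cost_at_1m_units_per_year=50_000,
                               raw_material_cost=1_000))  # False -> fix the design
print(volume_is_the_bottleneck(cost_at_1m_units_per_year=1_800,
                               raw_material_cost=1_000))  # True -> scale helps
```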
link |
00:21:22.120
Do you throw into manufacturing,
link |
00:21:24.120
do you throw like supply chain?
link |
00:21:26.080
Talked about resources and materials and stuff like that.
link |
00:21:28.520
Do you throw that into the calculation
link |
00:21:29.960
of trying to reason from first principles,
link |
00:21:31.800
like how we're gonna make the supply chain work here?
link |
00:21:34.640
Yeah, yeah.
link |
00:21:35.760
And then the cost of materials, things like that.
link |
00:21:37.840
Or is that too much?
link |
00:21:38.960
Exactly, so like a good example,
link |
00:21:41.880
I think of thinking about things in the limit
link |
00:21:44.440
is if you take any machine or whatever,
link |
00:21:54.360
like take a rocket or whatever,
link |
00:21:56.040
and say, if you look at the raw materials in the rocket,
link |
00:22:03.640
so you're gonna have like aluminum, steel, titanium,
link |
00:22:07.560
Inconel, specialty alloys, copper,
link |
00:22:12.560
and you say, what's the weight of the constituent elements,
link |
00:22:19.200
of each of these elements,
link |
00:22:20.440
and what is their raw material value?
link |
00:22:22.600
And that sets the asymptotic limit
link |
00:22:25.720
for how low the cost of the vehicle can be
link |
00:22:29.480
unless you change the materials.
link |
00:22:31.800
So, and then when you do that,
link |
00:22:33.960
I call it like maybe the magic one number
link |
00:22:35.760
or something like that.
link |
00:22:36.600
So that would be like, if you had the,
link |
00:22:40.320
like just a pile of these raw materials,
link |
00:22:42.440
again, you could wave the magic wand
link |
00:22:43.680
and rearrange the atoms into the final shape.
link |
00:22:47.160
That would be the lowest possible cost
link |
00:22:49.600
that you could make this thing for
link |
00:22:51.000
unless you change the materials.
link |
00:22:52.720
So then, and that is almost always a very low number.
link |
00:22:57.760
So then what's actually causing things to be expensive
link |
00:23:01.200
is how you put the atoms into the desired shape.
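The "magic wand" floor described here is straightforward to compute once you have a bill of materials: sum each constituent's mass times its raw commodity price. The sketch below uses entirely made-up masses and prices, just to show the shape of the calculation.

```python
# Sketch of the raw-material cost floor described above. Masses and $/kg are
# invented for illustration; the point is the structure of the calculation,
# not the specific numbers.
bill_of_materials_kg = {
    "aluminum": 20_000,
    "steel": 60_000,
    "titanium": 2_000,
    "inconel": 3_000,
    "copper": 1_000,
}
raw_price_usd_per_kg = {
    "aluminum": 2.5,
    "steel": 0.8,
    "titanium": 10.0,
    "inconel": 25.0,
    "copper": 9.0,
}

floor_cost = sum(mass * raw_price_usd_per_kg[metal]
                 for metal, mass in bill_of_materials_kg.items())

# This is roughly what the item could asymptotically cost at very high volume,
# unless you change the materials; everything above this floor is the cost of
# getting the atoms into the desired shape.
print(f"raw-material floor: ${floor_cost:,.0f}")
```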
link |
00:23:06.000
Yeah, actually, if you don't mind me taking a tiny tangent,
link |
00:23:10.120
I often talk to Jim Keller,
link |
00:23:11.440
who's somebody that worked with you as a friend.
link |
00:23:14.440
Jim was, yeah, did great work at Tesla.
link |
00:23:17.760
So I suppose he carries the flame
link |
00:23:20.480
with the same kind of thinking
link |
00:23:22.680
that you're talking about now.
link |
00:23:26.240
And I guess I see that same thing
link |
00:23:27.880
at Tesla and SpaceX folks who work there,
link |
00:23:31.800
they kind of learn this way of thinking
link |
00:23:33.760
and it kind of becomes obvious almost.
link |
00:23:36.720
But anyway, I had argument, not argument,
link |
00:23:39.680
but he educated me about how cheap it might be
link |
00:23:44.800
to manufacture a Tesla bot.
link |
00:23:46.600
We just, we had an argument.
link |
00:23:48.360
How can you reduce the cost of scale of producing a robot?
link |
00:23:52.120
Because I've gotten the chance to interact quite a bit,
link |
00:23:55.960
obviously, in the academic circles with humanoid robots
link |
00:23:59.480
and then Boston Dynamics and stuff like that.
link |
00:24:01.800
And they're very expensive to build.
link |
00:24:04.520
And then Jim kind of schooled me on saying like,
link |
00:24:07.600
okay, like this kind of first principles thinking
link |
00:24:10.080
of how can we get the cost of manufacturing down?
link |
00:24:13.760
I suppose you do that,
link |
00:24:14.720
you have done that kind of thinking for Tesla bot
link |
00:24:17.640
and for all kinds of complex systems
link |
00:24:21.880
that are traditionally seen as complex.
link |
00:24:23.680
And you say, okay, how can we simplify everything down?
link |
00:24:27.160
Yeah, I mean, I think if you are really good
link |
00:24:30.320
at manufacturing, you can basically make,
link |
00:24:34.640
at high volume, you can basically make anything
link |
00:24:36.560
for a cost that asymptotically approaches
link |
00:24:40.640
the raw material value of the constituents
link |
00:24:42.880
plus any intellectual property that you need to license.
link |
00:24:46.480
Anything.
link |
00:24:49.440
But it's hard.
link |
00:24:50.280
It's not like that's a very hard thing to do,
link |
00:24:51.840
but it is possible for anything.
link |
00:24:54.720
Anything in volume can be made, like I said,
link |
00:24:57.480
for a cost that asymptotically approaches
link |
00:25:00.440
this raw material constituents
link |
00:25:02.760
plus intellectual property license rights.
link |
00:25:05.360
So what'll often happen in trying to design a product
link |
00:25:08.520
is people will start with the tools and parts
link |
00:25:11.960
and methods that they are familiar with
link |
00:25:14.520
and then try to create the product
link |
00:25:17.480
using their existing tools and methods.
link |
00:25:21.240
The other way to think about it is actually imagine the,
link |
00:25:25.080
try to imagine the platonic ideal of the perfect product
link |
00:25:28.800
or technology, whatever it might be.
link |
00:25:31.360
And say, what is the perfect arrangement of atoms
link |
00:25:35.680
that would be the best possible product?
link |
00:25:38.560
And now let us try to figure out
link |
00:25:39.840
how to get the atoms in that shape.
link |
00:25:43.880
I mean, it sounds,
link |
00:25:48.000
it's almost like a Rick and Morty absurd
link |
00:25:50.400
until you start to really think about it
link |
00:25:52.080
and you really should think about it in this way
link |
00:25:56.440
because everything else is kind of,
link |
00:25:59.000
if you think, you might fall victim to the momentum
link |
00:26:03.000
of the way things were done in the past
link |
00:26:04.440
unless you think in this way.
link |
00:26:06.000
Well, just as a function of inertia,
link |
00:26:07.680
people will want to use the same tools and methods
link |
00:26:10.640
that they are familiar with.
link |
00:26:13.720
That's what they'll do by default.
link |
00:26:15.720
And then that will lead to an outcome of things
link |
00:26:18.720
that can be made with those tools and methods
link |
00:26:20.480
but is unlikely to be the platonic ideal
link |
00:26:23.880
of the perfect product.
link |
00:26:25.040
So then, so that's why it's good to think of things
link |
00:26:28.360
in both directions.
link |
00:26:29.200
So like, what can we build with the tools that we have?
link |
00:26:31.240
But then, but also what is the perfect,
link |
00:26:34.600
the theoretical perfect product look like?
link |
00:26:36.320
And that theoretical perfect part
link |
00:26:38.600
is gonna be a moving target
link |
00:26:39.560
because as you learn more,
link |
00:26:41.720
the definition of that perfect product will change
link |
00:26:45.960
because you don't actually know what the perfect product is,
link |
00:26:47.720
but you can successfully approximate a more perfect product.
link |
00:26:52.120
So think about it like that and then saying,
link |
00:26:54.520
okay, now what tools, methods, materials,
link |
00:26:57.320
whatever do we need to create
link |
00:27:00.400
in order to get the atoms in that shape?
link |
00:27:03.880
But people rarely think about it that way.
link |
00:27:07.880
But it's a powerful tool.
link |
00:27:10.480
I should mention that the brilliant Shivon Zilis
link |
00:27:13.840
is hanging out with us in case you hear a voice
link |
00:27:17.840
of wisdom from outside, from up above.
link |
00:27:23.720
Okay, so let me ask you about Mars.
link |
00:27:26.320
You mentioned it would be great for science
link |
00:27:28.520
to put a base on the moon to do some research.
link |
00:27:32.960
But the truly big leap, again,
link |
00:27:36.760
in this category of seemingly impossible,
link |
00:27:38.920
is to put a human being on Mars.
link |
00:27:41.880
When do you think SpaceX will land a human being on Mars?
link |
00:27:44.920
Hmm, best case is about five years, worst case, 10 years.
link |
00:28:31.760
What are the determining factors, would you say,
link |
00:28:34.520
from an engineering perspective?
link |
00:28:36.240
Or is that not the bottlenecks?
link |
00:28:37.840
I don't know, order of magnitude or something like that.
link |
00:28:40.760
It's a lot, it's really next level.
link |
00:28:43.000
So, and the fundamental optimization of Starship
link |
00:28:49.200
is minimizing cost per ton to orbit,
link |
00:28:51.480
and ultimately cost per ton to the surface of Mars.
link |
00:28:54.760
This may seem like a mercantile objective,
link |
00:28:56.320
but it is actually the thing that needs to be optimized.
link |
00:29:00.320
Like there is a certain cost per ton to the surface of Mars
link |
00:29:04.000
where we can afford to establish a self sustaining city,
link |
00:29:08.800
and then above that, we cannot afford to do it.
link |
00:29:12.760
So right now, you couldn't fly to Mars for a trillion dollars.
link |
00:29:16.760
No amount of money could get you a ticket to Mars.
link |
00:29:19.120
So we need to get that,
link |
00:29:22.440
to something that is actually possible at all.
link |
00:29:27.760
But that's, we don't just want to have,
link |
00:29:32.240
with Mars, flags and footprints,
link |
00:29:33.800
and then not come back for a half century,
link |
00:29:35.880
like we did with the Moon.
link |
00:29:37.920
In order to pass a very important, great filter,
link |
00:29:43.360
I think we need to be a multi planet species.
link |
00:29:48.320
This may sound somewhat esoteric to a lot of people,
link |
00:29:51.400
but eventually, given enough time,
link |
00:29:57.160
there's something, Earth is likely to experience
link |
00:30:00.480
some calamity that could be something
link |
00:30:05.200
that humans do to themselves,
link |
00:30:06.720
or an external event like happened to the dinosaurs.
link |
00:30:12.680
But eventually, if none of that happens,
link |
00:30:17.920
and somehow magically we keep going,
link |
00:30:21.240
then the Sun is gradually expanding,
link |
00:30:24.520
and will engulf the Earth,
link |
00:30:26.600
and probably Earth gets too hot for life
link |
00:30:31.080
in about 500 million years.
link |
00:30:34.800
It's a long time, but that's only 10% longer
link |
00:30:37.280
than Earth has been around.
link |
00:30:39.360
And so if you think about the current situation,
link |
00:30:43.240
it's really remarkable, and kind of hard to believe,
link |
00:30:45.640
but Earth's been around four and a half billion years,
link |
00:30:50.120
and this is the first time in four and a half billion years
link |
00:30:52.360
that it's been possible to extend life beyond Earth.
link |
00:30:55.720
And that window of opportunity may be open
link |
00:30:58.120
for a long time, and I hope it is,
link |
00:30:59.520
but it also may be open for a short time.
link |
00:31:01.740
And I think it is wise for us to act quickly
link |
00:31:09.440
while the window is open, just in case it closes.
link |
00:31:13.800
Yeah, the existence of nuclear weapons, pandemics,
link |
00:31:17.800
all kinds of threats should kind of give us some motivation.
link |
00:31:22.800
I mean, civilization could die with a bang or a whimper.
link |
00:31:29.560
If it dies as a demographic collapse,
link |
00:31:32.760
then it's more of a whimper, obviously.
link |
00:31:35.160
And if it's World War III, it's more of a bang.
link |
00:31:38.480
But these are all risks.
link |
00:31:41.160
I mean, it's important to think of these things
link |
00:31:42.000
and just think of things like probabilities, not certainties.
link |
00:31:46.240
There's a certain probability
link |
00:31:47.680
that something bad will happen on Earth.
link |
00:31:50.200
I think most likely the future will be good.
link |
00:31:53.200
But there's like, let's say for argument's sake,
link |
00:31:56.400
a 1% chance per century of a civilization ending event.
link |
00:32:00.920
Like that was Stephen Hawking's estimate.
link |
00:32:05.080
I think he might be right about that.
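For a sense of what a one-percent-per-century risk implies, here is a quick back-of-envelope calculation, treating each century as independent, which is a strong simplifying assumption.

```python
# Quick arithmetic on the "1% chance per century" figure mentioned above,
# assuming centuries are independent, which is a simplification.
p_end_per_century = 0.01
for centuries in (10, 100, 1000):
    p_survive = (1 - p_end_per_century) ** centuries
    print(f"{centuries:>4} centuries: {p_survive:.1%} chance of no such event")
# 10 centuries   -> ~90.4%
# 100 centuries  -> ~36.6%
# 1000 centuries -> ~0.004%
```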
link |
00:32:07.720
So then we should basically think of this
link |
00:32:15.440
being a multi planet species,
link |
00:32:16.760
just like taking out a planet from the sky,
link |
00:32:18.600
multi planet species, just like taking out insurance
link |
00:32:20.720
for life itself.
link |
00:32:21.560
Like life insurance for life.
link |
00:32:27.280
It's turned into an infomercial real quick.
link |
00:32:29.360
Life insurance for life, yes.
link |
00:32:31.600
And we can bring the creatures from,
link |
00:32:36.240
plants and animals from Earth to Mars
link |
00:32:38.040
and breathe life into the planet
link |
00:32:41.280
and have a second planet with life.
link |
00:32:44.640
That would be great.
link |
00:32:46.000
They can't bring themselves there.
link |
00:32:47.560
So if we don't bring them to Mars,
link |
00:32:50.000
then they will just for sure all die
link |
00:32:52.280
when the sun expands anyway.
link |
00:32:54.320
And then that'll be it.
link |
00:32:56.200
What do you think is the most difficult aspect
link |
00:32:58.560
of building a civilization on Mars?
link |
00:33:00.960
Terraforming Mars, like from an engineering perspective,
link |
00:33:03.760
from a financial perspective, human perspective,
link |
00:33:07.240
to get a large number of folks there
link |
00:33:13.000
who will never return back to Earth?
link |
00:33:15.880
No, they could certainly return.
link |
00:33:17.160
Some will return back to Earth.
link |
00:33:18.440
They will choose to stay there
link |
00:33:19.960
for the rest of their lives.
link |
00:33:21.280
Yeah, many will.
link |
00:33:23.760
But we need the spaceships back,
link |
00:33:28.440
like the ones that go to Mars.
link |
00:33:29.480
We need them back.
link |
00:33:30.320
So you can hop on if you want.
link |
00:33:32.680
But we can't just not have the spaceships come back.
link |
00:33:34.960
Those things are expensive.
link |
00:33:35.800
We need them back.
link |
00:33:36.640
I'd like to come back and do another trip.
link |
00:33:38.760
I mean, do you think about the terraforming aspect,
link |
00:33:40.680
like actually building?
link |
00:33:41.640
Are you so focused right now on the spaceships part
link |
00:33:44.560
that's so critical to get to Mars?
link |
00:33:46.960
We absolutely, if you can't get there,
link |
00:33:48.200
nothing else matters.
link |
00:33:49.600
So, and like I said, we can't get there
link |
00:33:53.040
at some extraordinarily high cost.
link |
00:33:54.680
I mean, the current cost of, let's say,
link |
00:33:57.800
one ton to the surface of Mars
link |
00:34:00.080
is on the order of a billion dollars.
link |
00:34:02.680
So, because you don't just need the rocket
link |
00:34:04.840
and the launch and everything,
link |
00:34:05.680
you need like heat shield, you need guidance system,
link |
00:34:09.840
you need deep space communications,
link |
00:34:12.280
you need some kind of landing system.
link |
00:34:15.000
So, like rough approximation would be a billion dollars
link |
00:34:19.520
per ton to the surface of Mars right now.
link |
00:34:22.200
This is obviously way too expensive
link |
00:34:26.880
to create a self sustaining civilization.
link |
00:34:30.680
So we need to improve that by at least a factor of 1,000.
link |
00:34:36.840
A million per ton?
link |
00:34:38.440
Yes, ideally much less than a million per ton.
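As a rough worked check using only the figures from the conversation: a factor-of-1,000 improvement on roughly a billion dollars per ton gives about a million dollars per ton, and at the roughly million-ton threshold mentioned a bit later for a self-sustaining city, that still implies a program on the order of a trillion dollars, which is why "ideally much less."

```python
# Back-of-envelope arithmetic using the figures from the conversation.
current_cost_per_ton = 1e9      # ~$1B per ton to the surface of Mars today
improvement_factor = 1_000      # "improve that by at least a factor of 1,000"
target_cost_per_ton = current_cost_per_ton / improvement_factor  # ~$1M per ton

city_tonnage = 1e6              # ~a million tons, the guess given later on
print(f"target: ${target_cost_per_ton:,.0f} per ton")
print(f"implied total at that price: ${target_cost_per_ton * city_tonnage:,.0f}")
# ~$1,000,000 per ton and ~$1,000,000,000,000 total, hence "ideally much less."
```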
link |
00:34:40.880
But if it's not, like it's gotta be,
link |
00:34:44.280
so you have to say like, well,
link |
00:34:45.200
how much can society afford to spend
link |
00:34:47.960
or just want to spend on a self sustaining city on Mars?
link |
00:34:52.400
The self sustaining part is important.
link |
00:34:53.780
Like it's just the key threshold,
link |
00:34:57.720
the great filter will have been passed
link |
00:35:01.360
when the city on Mars can survive
link |
00:35:05.280
even if the spaceships from Earth stop coming
link |
00:35:07.720
for any reason, doesn't matter what the reason is,
link |
00:35:09.960
but if they stop coming for any reason,
link |
00:35:12.120
will it die out or will it not?
link |
00:35:13.720
And if there's even one critical ingredient missing,
link |
00:35:16.400
then it still doesn't count.
link |
00:35:18.320
It's like, you know, if you're on a long sea voyage
link |
00:35:20.440
and you've got everything except vitamin C,
link |
00:35:23.920
it's only a matter of time, you know, you're gonna die.
link |
00:35:26.280
So we're gonna get a Mars city
link |
00:35:28.880
to the point where it's self sustaining.
link |
00:35:32.260
I'm not sure this will really happen in my lifetime,
link |
00:35:33.880
but I hope to see it at least have a lot of momentum.
link |
00:35:37.320
And then you could say, okay,
link |
00:35:38.520
what is the minimum tonnage necessary
link |
00:35:40.380
to have a self sustaining city?
link |
00:35:44.060
And there's a lot of uncertainty about this.
link |
00:35:46.480
You could say like, I don't know,
link |
00:35:48.560
it's probably at least a million tons
link |
00:35:52.040
cause you have to set up a lot of infrastructure on Mars.
link |
00:35:55.440
Like I said, you can't be missing anything
link |
00:35:58.680
that in order to be self sustaining,
link |
00:36:00.960
you can't be missing, like you need,
link |
00:36:02.420
you know, semiconductor fabs, you need iron ore refineries,
link |
00:36:07.120
like you need lots of things, you know.
link |
00:36:09.360
So, and Mars is not super hospitable.
link |
00:36:13.640
It's the least inhospitable planet,
link |
00:36:15.680
but it's definitely a fixer upper of a planet.
link |
00:36:18.120
Outside of Earth.
link |
00:36:19.360
Yes.
link |
00:36:20.200
Earth is pretty good.
link |
00:36:21.020
Earth is like easy, yeah.
link |
00:36:22.280
And also I should, we should clarify in the solar system.
link |
00:36:25.480
Yes, in the solar system.
link |
00:36:26.600
There might be nice like vacation spots.
link |
00:36:29.760
There might be some great planets out there,
link |
00:36:31.160
but it's hopeless.
link |
00:36:32.760
Too hard to get there?
link |
00:36:33.800
Yeah, way, way, way, way, way too hard to say the least.
link |
00:36:37.440
Let me push back on that, not really a push back,
link |
00:36:39.720
but a quick curve ball of a question.
link |
00:36:42.000
So you did mention physics as the first starting point.
link |
00:36:44.840
So, general relativity allows for wormholes.
link |
00:36:51.400
They technically can exist.
link |
00:36:53.080
Do you think those can ever be leveraged
link |
00:36:55.720
by humans to travel faster than the speed of light?
link |
00:36:59.400
Well, the wormhole thing is debatable.
link |
00:37:03.600
The, we currently do not know of any means
link |
00:37:08.200
of going faster than the speed of light.
link |
00:37:11.760
There is like, there are some ideas about having space.
link |
00:37:21.400
Like so, you can only move at the speed of light
link |
00:37:26.520
through space, but if you can make space itself move,
link |
00:37:30.740
that's like, that's warping space.
link |
00:37:36.140
Space is capable of moving faster than the speed of light.
link |
00:37:39.580
Right.
link |
00:37:40.740
Like the universe in the Big Bang,
link |
00:37:42.100
the universe expanded at much,
link |
00:37:44.180
much more than the speed of light by a lot.
link |
00:37:46.820
Yeah.
link |
00:37:48.820
So, but the, if this is possible,
link |
00:37:56.380
the amount of energy required to warp space
link |
00:37:58.660
is so gigantic, it boggles the mind.
link |
00:38:03.100
So all the work you've done with propulsion,
link |
00:38:05.060
how much innovation is possible with rocket propulsion?
link |
00:38:08.100
Is this, I mean, you've seen it all,
link |
00:38:11.140
and you're constantly innovating in every aspect.
link |
00:38:14.420
How much is possible?
link |
00:38:15.340
Like how much, can you get 10x somehow?
link |
00:38:17.380
Is there something in there in physics
link |
00:38:19.660
that you can get significant improvement
link |
00:38:21.260
in terms of efficiency of engines
link |
00:38:22.620
and all those kinds of things?
link |
00:38:24.640
Well, as I was saying, really the Holy Grail
link |
00:38:27.940
is a fully and rapidly reusable orbital system.
link |
00:38:33.020
So right now, the Falcon 9
link |
00:38:38.100
is the only reusable rocket out there,
link |
00:38:41.460
but the booster comes back and lands,
link |
00:38:44.300
and you've seen the videos,
link |
00:38:46.180
and we get the nose cone fairing back,
link |
00:38:47.620
but we do not get the upper stage back.
link |
00:38:49.740
So that means that we have a minimum cost
link |
00:38:54.260
of building an upper stage.
link |
00:38:56.660
And you can think of like a two stage rocket
link |
00:38:59.300
of sort of like two airplanes,
link |
00:39:00.700
like a big airplane and a small airplane.
link |
00:39:03.180
And we get the big airplane back,
link |
00:39:04.600
but not the small airplane.
link |
00:39:05.860
And so it still costs a lot.
link |
00:39:07.860
So that upper stage is at least $10 million.
link |
00:39:13.300
And then the degree of,
link |
00:39:15.460
the booster is not as rapidly and completely reusable
link |
00:39:19.060
as we'd like, nor are the fairings.
link |
00:39:20.900
So our kind of minimum marginal cost
link |
00:39:25.180
not counting overhead, per flight
link |
00:39:27.180
is on the order of 15 to $20 million, maybe.
link |
00:39:33.400
So that's extremely good for,
link |
00:39:38.060
it's by far better than any rocket ever in history.
link |
00:39:41.660
But with full and rapid reusability,
link |
00:39:45.380
we can reduce the cost per ton to orbit
link |
00:39:48.840
by a factor of 100.
link |
00:39:51.880
But just think of it like,
link |
00:39:54.200
like imagine if you had an aircraft or something or a car.
link |
00:39:58.800
And if you had to buy a new car
link |
00:40:02.920
every time you went for a drive,
link |
00:40:05.080
that would be very expensive.
link |
00:40:06.760
It'd be silly, frankly.
link |
00:40:08.320
But in fact, you just refuel the car or recharge the car.
link |
00:40:13.480
And that makes your trip like,
link |
00:40:18.480
I don't know, a thousand times cheaper.
link |
00:40:20.320
So it's the same for rockets.
link |
00:40:23.800
If you, it's very difficult to make this complex machine
link |
00:40:27.240
that can go to orbit.
link |
00:40:28.680
And so if you cannot reuse it
link |
00:40:30.560
and have to throw even any part of,
link |
00:40:32.800
any significant part of it away,
link |
00:40:34.000
that massively increases the cost.
link |
00:40:36.640
So, you know, Starship in theory
link |
00:40:40.320
could do a cost per launch of like a million,
link |
00:40:44.360
maybe $2 million or something like that.
link |
00:40:46.400
And put over a hundred tons in orbit, which is crazy.
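As a sanity check on the "factor of 100" claim, the per-ton numbers can be compared directly. The per-flight costs below are the ones given in this conversation; the Falcon 9 payload figure is an outside assumption (roughly 15 tons to low Earth orbit in the reusable configuration), so treat the result as a rough sketch.

```python
# Rough check of the ~100x cost-per-ton claim, using the per-flight figures
# from the conversation. The Falcon 9 reusable-configuration payload (~15 t to
# LEO) is an assumption added here, not a number stated in the transcript.
falcon9_marginal_cost_usd = 17.5e6    # midpoint of the $15-20M range mentioned
falcon9_payload_tons = 15             # assumption for illustration

starship_cost_per_launch_usd = 1.5e6  # midpoint of the ~$1-2M figure mentioned
starship_payload_tons = 100           # "over a hundred tons" per the conversation

f9_per_ton = falcon9_marginal_cost_usd / falcon9_payload_tons            # ~$1.2M/ton
starship_per_ton = starship_cost_per_launch_usd / starship_payload_tons  # ~$15k/ton
print(round(f9_per_ton / starship_per_ton))  # roughly two orders of magnitude
```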
link |
00:40:53.560
Yeah, that's incredible.
link |
00:40:55.960
So you're saying like it's by far the biggest bang
link |
00:40:58.200
for the buck is to make it fully reusable
link |
00:41:00.720
versus like some kind of brilliant breakthrough
link |
00:41:04.160
in theoretical physics.
link |
00:41:05.760
No, no, there's no, there's no brilliant break.
link |
00:41:07.680
No, there's no, just make the rocket reusable.
link |
00:41:11.240
This is an extremely difficult engineering problem.
link |
00:41:13.520
Got it.
link |
00:41:14.360
No new physics is required.
link |
00:41:17.960
Just brilliant engineering.
link |
00:41:19.320
Let me ask a slightly philosophical fun question.
link |
00:41:22.200
Gotta ask, I know you're focused on getting to Mars,
link |
00:41:24.840
but once we're there on Mars, what do you,
link |
00:41:27.240
what form of government, economic system, political system
link |
00:41:32.360
do you think would work best
link |
00:41:33.760
for an early civilization of humans?
link |
00:41:37.200
Is, I mean, the interesting reason to talk about this stuff,
link |
00:41:41.560
it also helps people dream about the future.
link |
00:41:44.320
I know you're really focused
link |
00:41:45.560
about the short term engineering dream,
link |
00:41:48.040
but it's like, I don't know,
link |
00:41:49.320
there's something about imagining an actual civilization
link |
00:41:51.840
on Mars that gives people, really gives people hope.
link |
00:41:55.440
Well, it would be a new frontier and an opportunity
link |
00:41:57.680
to rethink the whole nature of government,
link |
00:41:59.640
just as was done in the creation of the United States.
link |
00:42:02.680
So, I mean, I would suggest having a direct democracy,
link |
00:42:14.680
people vote directly on things
link |
00:42:16.200
as opposed to representative democracy.
link |
00:42:18.440
So representative democracy, I think,
link |
00:42:21.520
is too subject to a special interest
link |
00:42:25.120
and a coercion of the politicians and that kind of thing.
link |
00:42:29.580
So I'd recommend that there was just direct democracy,
link |
00:42:39.360
people vote on laws, the population votes on laws themselves,
link |
00:42:42.680
and then the laws must be short enough
link |
00:42:44.520
that people can understand them.
link |
00:42:46.880
Yeah, and then like keeping a well informed populace,
link |
00:42:49.920
like really being transparent about all the information,
link |
00:42:52.240
about what they're voting for.
link |
00:42:53.080
Yeah, absolute transparency.
link |
00:42:54.800
Yeah, and not make it as annoying as those cookies
link |
00:42:57.480
we have to accept. Accept cookies.
link |
00:42:59.920
I've always, like, you know,
link |
00:43:01.760
there's like always like a slight amount of trepidation
link |
00:43:03.640
when you click accept cookies,
link |
00:43:05.360
like I feel as though there's like perhaps
link |
00:43:07.280
like a very tiny chance that it'll open a portal to hell
link |
00:43:10.680
or something like that.
link |
00:43:12.320
That's exactly how I feel.
link |
00:43:13.800
Why do they, why do they keep wanting me to accept it?
link |
00:43:16.640
What do they want with this cookie?
link |
00:43:19.000
Like somebody got upset with accepting cookies
link |
00:43:21.020
or something somewhere, who cares?
link |
00:43:23.040
Like so annoying to keep accepting all these cookies.
link |
00:43:26.920
To me, this is just a great example.
link |
00:43:29.600
Yes, you can have my damn cookie.
link |
00:43:30.720
I don't care, whatever.
link |
00:43:32.360
Heard it from Elon first.
link |
00:43:33.720
He accepts all your damn cookies.
link |
00:43:35.960
Yeah, and stop asking me.
link |
00:43:40.240
It's annoying.
link |
00:43:41.400
Yeah, it's one example of implementation
link |
00:43:46.000
of a good idea done really horribly.
link |
00:43:50.200
Yeah, it's somebody who was like,
link |
00:43:51.480
there's some good intentions of like privacy or whatever,
link |
00:43:54.920
but now everyone's just has to accept cookies
link |
00:43:57.320
and it's not, you know, you have billions of people
link |
00:43:59.360
who have to keep clicking accept cookie.
link |
00:44:00.720
It's super annoying.
link |
00:44:02.440
Then we just accept the damn cookie, it's fine.
link |
00:44:05.000
There is like, I think a fundamental problem that we're,
link |
00:44:08.660
because we've not really had a major,
link |
00:44:12.320
like a world war or something like that in a while.
link |
00:44:14.360
And obviously we would like to not have world wars.
link |
00:44:18.320
There's not been a cleansing function
link |
00:44:19.840
for rules and regulations.
link |
00:44:21.420
So wars did have, you know, some sort of silver lining
link |
00:44:24.340
in that there would be a reset on rules
link |
00:44:27.580
and regulations after a war.
link |
00:44:29.660
So World Wars I and II,
link |
00:44:30.500
there were huge resets on rules and regulations.
link |
00:44:34.260
Now, if society does not have a war
link |
00:44:37.900
and there's no cleansing function
link |
00:44:39.460
or garbage collection for rules and regulations,
link |
00:44:41.420
then rules and regulations will accumulate every year
link |
00:44:43.500
because they're immortal.
link |
00:44:45.020
There's no actual, humans die, but the laws don't.
link |
00:44:48.580
So we need a garbage collection function
link |
00:44:51.660
for rules and regulations.
link |
00:44:52.980
They should not just be immortal
link |
00:44:55.660
because some of the rules and regulations
link |
00:44:57.340
that are put in place will be counterproductive,
link |
00:45:00.260
done with good intentions, but counterproductive.
link |
00:45:02.140
Sometimes not done with good intentions.
link |
00:45:03.940
So if rules and regulations just accumulate every year
link |
00:45:09.100
and you get more and more of them,
link |
00:45:10.740
then eventually you won't be able to do anything.
link |
00:45:13.220
You're just like Gulliver with, you know,
link |
00:45:15.180
tied down by thousands of little strings.
link |
00:45:19.620
And we see that in, you know, US and like basically
link |
00:45:26.460
all economies that have been around for a while
link |
00:45:31.460
and regulators and legislators create new rules
link |
00:45:35.340
and regulations every year,
link |
00:45:36.580
but they don't put effort into removing them.
link |
00:45:38.740
And I think that's very important that we put effort
link |
00:45:40.300
into removing rules and regulations.
link |
00:45:44.020
But it gets tough because you get special interests
link |
00:45:45.620
that then are dependent on, like they have, you know,
link |
00:45:48.900
a vested interest in that whatever rule and regulation
link |
00:45:51.940
and then they fight to not get it removed.
link |
00:45:57.620
Yeah, so I mean, I guess the problem with the constitution
link |
00:46:00.940
is it's kind of like C versus Java
link |
00:46:04.140
because it doesn't have any garbage collection built in.
link |
00:46:06.740
I think there should be, when you first said
link |
00:46:09.020
the metaphor of garbage collection, I loved it.
link |
00:46:10.860
Yeah, from a coding standpoint.
link |
00:46:12.380
From a coding standpoint, yeah, yeah.
link |
00:46:14.340
It would be interesting if the laws themselves
link |
00:46:16.900
kind of had a built in thing
link |
00:46:19.020
where they kind of die after a while
link |
00:46:20.700
unless somebody explicitly publicly defends them.
link |
00:46:23.660
So that's sort of, it's not like somebody has to kill them.
link |
00:46:26.220
They kind of die themselves.
link |
00:46:28.100
They disappear.
link |
00:46:29.180
Yeah.
link |
00:46:32.540
Not to defend Java or anything, but you know, C++,
link |
00:46:36.220
you know, you could also have a great garbage collection
link |
00:46:38.540
in Python and so on.
link |
00:46:39.940
Yeah, so yeah, something needs to happen
link |
00:46:43.740
or just the civilization's arteries just harden over time
link |
00:46:48.580
and you can just get less and less done
link |
00:46:50.820
because there's just a rule against everything.
link |
00:46:54.820
So I think like, I don't know, for Mars or whatever,
link |
00:46:57.900
I'd say, or even for, you know, obviously for Earth as well,
link |
00:47:00.260
like I think there should be an active process
link |
00:47:02.620
for removing rules and regulations
link |
00:47:04.940
and questioning their existence.
link |
00:47:07.100
Just like if we've got a function
link |
00:47:10.260
for creating rules and regulations,
link |
00:47:11.580
because you can also think of rules and regulations
link |
00:47:13.140
like they're software or lines of code
link |
00:47:15.780
for operating civilization.
link |
00:47:18.860
That's the rules and regulations.
link |
00:47:21.300
So it's not like we shouldn't have rules and regulations,
link |
00:47:22.980
but you have code accumulation, but no code removal.
link |
00:47:27.100
And so it just gets to become basically archaic bloatware
link |
00:47:31.460
after a while.
link |
00:47:33.620
And it's just, it makes it hard for things to progress.
link |
00:47:37.900
So I don't know, maybe Mars, you'd have like,
link |
00:47:40.460
you know, any given law must have a sunset, you know,
link |
00:47:44.300
and require active voting to keep it up there, you know.
link |
00:47:52.140
And I actually also say like, and these are just,
link |
00:47:54.660
I don't know, recommendations or thoughts,
link |
00:47:58.260
it'll ultimately be up to the people on Mars to decide.
link |
00:48:00.700
But I think it should be easier to remove a law
link |
00:48:06.660
than to add one because of the,
link |
00:48:08.620
just to overcome the inertia of laws.
link |
00:48:10.700
So maybe it's like, for argument's sake,
link |
00:48:15.220
you need like say 60% vote to have a law take effect,
link |
00:48:19.780
but only a 40% vote to remove it.
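Purely as an illustration of the asymmetry being floated here, a minimal Python sketch follows; the 60%/40% thresholds are the figures mentioned above, and the five-year sunset period is an assumed placeholder, not something stated in the conversation.

```python
# A minimal sketch of the "garbage collection for laws" idea above, assuming
# the illustrative numbers from the conversation: 60% to enact, 40% to repeal,
# plus a hypothetical sunset period after which a law dies unless renewed.
from dataclasses import dataclass

ENACT_THRESHOLD = 0.60    # fraction of votes needed for a new law to take effect
REPEAL_THRESHOLD = 0.40   # smaller fraction needed to remove an existing law
SUNSET_YEARS = 5          # hypothetical sunset period, not from the conversation

@dataclass
class Law:
    name: str
    enacted_year: int

    def expired(self, current_year: int) -> bool:
        # The law "dies by itself" unless it is explicitly re-approved.
        return current_year - self.enacted_year >= SUNSET_YEARS

def enact(yes_fraction: float) -> bool:
    return yes_fraction >= ENACT_THRESHOLD

def repeal(yes_fraction: float) -> bool:
    return yes_fraction >= REPEAL_THRESHOLD

# A 55% majority is not enough to pass a law, but is enough to repeal one,
# and an unrenewed law sunsets on its own.
assert not enact(0.55) and repeal(0.55)
assert Law("hypothetical ordinance", enacted_year=2021).expired(2026)
```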
link |
00:48:23.420
So let me be the guy, you posted a meme on Twitter recently
link |
00:48:26.620
where there's like a row of urinals,
link |
00:48:30.140
and a guy just walks all the way across,
link |
00:48:33.100
and he tells you about crypto.
link |
00:48:36.300
I mean, that's happened to me so many times.
link |
00:48:38.340
I think maybe even literally.
link |
00:48:40.420
Yeah.
link |
00:48:41.820
Do you think, technologically speaking,
link |
00:48:43.500
there's any room for ideas of smart contracts or so on?
link |
00:48:47.340
Because you mentioned laws.
link |
00:48:49.300
That's an interesting use of things like smart contracts
link |
00:48:52.980
to implement the laws by which governments function.
link |
00:48:57.300
Like something built on Ethereum,
link |
00:48:58.980
or maybe a dog coin that enables smart contracts somehow.
link |
00:49:04.860
I don't quite understand this whole smart contract thing.
link |
00:49:08.260
You know.
link |
00:49:09.960
I mean, I'm too dumb to understand smart contracts.
link |
00:49:15.020
That's a good line.
link |
00:49:17.900
I mean, my general approach to any kind of deal or whatever
link |
00:49:21.460
is just make sure there's clarity of understanding.
link |
00:49:23.780
That's the most important thing.
link |
00:49:25.820
And just keep any kind of deal very short and simple,
link |
00:49:29.660
plain language, and just make sure everyone understands
link |
00:49:33.460
this is the deal, is it clear?
link |
00:49:36.580
And what are the consequences if various things
link |
00:49:40.660
don't happen?
link |
00:49:42.620
But usually deals are, business deals or whatever,
link |
00:49:47.220
are way too long and complex and overly lawyered
link |
00:49:50.940
and pointlessly.
link |
00:49:52.740
You mentioned that Doge is the people's coin.
link |
00:49:57.020
Yeah.
link |
00:49:57.860
And you said that you were literally going,
link |
00:49:59.660
SpaceX may consider literally putting a Doge coin
link |
00:50:04.660
on the moon, is this something you're still considering?
link |
00:50:09.940
Mars, perhaps, do you think there's some chance,
link |
00:50:13.620
we've talked about political systems on Mars,
link |
00:50:16.060
that Doge coin is the official currency of Mars
link |
00:50:20.220
at some point in the future?
link |
00:50:22.580
Well, I think Mars itself will need to have
link |
00:50:25.700
a different currency because you can't synchronize
link |
00:50:29.060
due to speed of light, or not easily.
link |
00:50:32.660
So it must be completely stand alone from Earth.
link |
00:50:36.480
Well, yeah, because Mars is, at closest approach,
link |
00:50:41.340
it's four light minutes away, roughly,
link |
00:50:43.020
and then at furthest approach, it's roughly
link |
00:50:45.580
20 light minutes away, maybe a little more.
link |
00:50:50.100
So you can't really have something synchronizing
link |
00:50:52.980
if you've got a 20 minute speed of light issue,
link |
00:50:55.500
if it's got a one minute blockchain.
link |
00:50:58.180
It's not gonna synchronize properly.
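For a rough sense of those numbers, here is a back-of-the-envelope sketch. The distances are approximate (a typical close approach and superior conjunction), and the roughly one-minute block interval is just the example used in the conversation.

```python
# Back-of-the-envelope check of the synchronization problem described above.
# Distances are approximate; a ~1-minute block interval is the example
# from the conversation, not a spec for any particular chain.
C = 299_792_458            # speed of light, m/s
AU = 1.496e11              # astronomical unit, m

closest_m  = 0.5 * AU      # roughly, at a typical close approach
farthest_m = 2.5 * AU      # roughly, near superior conjunction

for label, d in [("closest", closest_m), ("farthest", farthest_m)]:
    delay_min = d / C / 60
    print(f"{label}: one-way light delay ≈ {delay_min:.0f} minutes")

# With ~1-minute blocks, many new blocks are produced on Earth before a single
# message can even reach Mars, so a Mars economy would need its own locally
# synchronized chain (or a far longer block interval).
```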
link |
00:50:59.980
So Mars, I don't know if Mars would have
link |
00:51:03.960
a cryptocurrency as a thing, but probably, seems likely.
link |
00:51:07.640
But it would be some kind of localized thing on Mars.
link |
00:51:12.280
And you let the people decide.
link |
00:51:14.800
Yeah, absolutely.
link |
00:51:17.680
The future of Mars should be up to the Martians.
link |
00:51:20.720
Yeah, so, I think the cryptocurrency thing
link |
00:51:25.720
is an interesting approach to reducing
link |
00:51:30.800
the error in the database that is called money.
link |
00:51:41.400
I think I have a pretty deep understanding
link |
00:51:42.920
of what money actually is on a practical day to day basis
link |
00:51:46.720
because of PayPal.
link |
00:51:50.400
We really got in deep there.
link |
00:51:52.920
And right now, the money system, actually,
link |
00:51:57.560
for practical purposes, is really a bunch
link |
00:52:01.120
of heterogeneous mainframes running old COBOL.
link |
00:52:07.480
Okay, you mean literally.
link |
00:52:08.720
That's literally what's happening.
link |
00:52:10.840
In batch mode.
link |
00:52:12.760
Okay.
link |
00:52:13.600
In batch mode.
link |
00:52:14.420
Yeah, pity the poor bastards who have
link |
00:52:16.720
to maintain that code.
link |
00:52:19.000
Okay, that's a pain.
link |
00:52:22.240
Not even Fortran, it's COBOL.
link |
00:52:24.240
It's COBOL.
link |
00:52:26.040
And the banks are still buying mainframes in 2021
link |
00:52:30.120
and running ancient COBOL code.
link |
00:52:33.000
And the Federal Reserve is probably even older
link |
00:52:37.800
than what the banks have, and they have
link |
00:52:39.280
an old COBOL mainframe.
link |
00:52:41.880
And so, the government effectively has editing privileges
link |
00:52:47.080
on the money database.
link |
00:52:48.660
And they use those editing privileges to make more money,
link |
00:52:53.740
whatever they want.
link |
00:52:55.420
And this increases the error in the database that is money.
link |
00:52:59.060
So, I think money should really be viewed
link |
00:53:00.820
through the lens of information theory.
link |
00:53:03.580
And so, it's kind of like an internet connection.
link |
00:53:08.300
Like what's the bandwidth, total bit rate,
link |
00:53:12.580
what is the latency, jitter, packet drop,
link |
00:53:16.420
you know, errors in network communication.
link |
00:53:21.820
Just think of money like that, basically.
link |
00:53:24.460
I think that's probably the right way to think of it.
link |
00:53:26.460
And then say what system from an information theory
link |
00:53:31.460
standpoint allows an economy to function the best.
link |
00:53:35.620
And, you know, crypto is an attempt to reduce
link |
00:53:40.620
the error in money that is contributed
link |
00:53:48.780
by governments diluting the money supply
link |
00:53:53.300
as basically a pernicious form of taxation.
link |
00:53:58.900
So, both policy, in terms of inflation,
link |
00:54:01.900
and the actual technology, like COBOL,
link |
00:54:05.780
like cryptocurrency takes us into the 21st century
link |
00:54:08.880
in terms of the actual systems
link |
00:54:10.740
that allow you to do the transaction,
link |
00:54:12.140
to store wealth, all those kinds of things.
link |
00:54:16.920
Like I said, just think of money as information.
link |
00:54:18.580
People often will think of money
link |
00:54:20.900
as having power in and of itself.
link |
00:54:24.100
It does not.
link |
00:54:24.980
Money is information and it does not have power
link |
00:54:28.980
in and of itself.
link |
00:54:31.460
Like, you know, applying the physics tools
link |
00:54:35.060
of thinking about things in the limit is helpful.
link |
00:54:37.540
If you are stranded on a tropical island
link |
00:54:41.020
and you have a trillion dollars, it's useless
link |
00:54:47.660
because there's no resource allocation.
link |
00:54:50.540
Money is a database for resource allocation,
link |
00:54:52.660
but there's no resource to allocate except yourself.
link |
00:54:55.020
So, money is useless.
link |
00:55:01.020
If you're stranded on a desert island with no food,
link |
00:55:04.100
all the Bitcoin in the world will not stop you
link |
00:55:10.340
from starving.
link |
00:55:12.420
So, just think of money as a database
link |
00:55:20.820
for resource allocation across time and space.
link |
00:55:24.980
And then what system, in what form
link |
00:55:29.980
should that database or data system,
link |
00:55:37.020
what would be most effective?
link |
00:55:39.060
Now, there is a fundamental issue
link |
00:55:41.380
with say Bitcoin in its current form
link |
00:55:46.020
in that the transaction volume is very limited
link |
00:55:50.700
and the latency for a properly confirmed transaction
link |
00:55:55.700
is too long, much longer than you'd like.
link |
00:55:58.100
So, it's actually not great from a transaction volume
link |
00:56:02.460
standpoint or a latency standpoint.
link |
00:56:07.260
So, it is perhaps useful to solve an aspect
link |
00:56:12.420
of the money database problem, which is the sort of store
link |
00:56:17.420
of wealth or an accounting of relative obligations,
link |
00:56:22.020
I suppose, but it is not useful as a currency,
link |
00:56:27.500
as a day to day currency.
link |
00:56:28.780
But people have proposed different technological solutions.
link |
00:56:31.260
Like Lightning.
link |
00:56:32.100
Yeah, Lightning Network and the layer two technologies
link |
00:56:34.740
on top of that.
link |
00:56:35.580
I mean, it seems to be all kind of a trade off,
link |
00:56:38.820
but the point is, it's kind of brilliant to say
link |
00:56:41.060
that just think about information,
link |
00:56:42.460
think about what kind of database,
link |
00:56:44.100
what kind of infrastructure enables
link |
00:56:45.860
that exchange of information.
link |
00:56:46.700
Yeah, just say like you're operating an economy
link |
00:56:49.180
and you need to have some thing that allows
link |
00:56:55.260
for efficient value ratios
link |
00:56:59.740
between products and services.
link |
00:57:01.380
So, you've got this massive number of products
link |
00:57:03.220
and services and you need to, you can't just barter.
link |
00:57:06.380
It's just like, that would be extremely unwieldy.
link |
00:57:09.620
So, you need something that gives you the ratio
link |
00:57:13.420
of exchange between goods and services.
link |
00:57:20.540
And then something that allows you
link |
00:57:22.580
to shift obligations across time, like debt.
link |
00:57:26.580
Debt and equity shift obligations across time.
link |
00:57:29.180
Then what does the best job of that?
link |
00:57:33.300
Part of the reason why I think there's some
link |
00:57:36.020
merit to Dogecoin, even though it was obviously created
link |
00:57:38.660
as a joke, is that it actually does have
link |
00:57:44.100
a much higher transaction volume capability than Bitcoin.
link |
00:57:49.820
And the costs of doing a transaction,
link |
00:57:53.180
the Doge coin fee is very low.
link |
00:57:55.980
Like right now, if you want to do a Bitcoin transaction,
link |
00:57:58.220
the price of doing that transaction is very high.
link |
00:58:00.460
So, you could not use it effectively for most things.
link |
00:58:04.340
And nor could it even scale to a high volume.
link |
00:58:11.860
And when Bitcoin started, I guess around 2008
link |
00:58:15.260
or something like that, the internet connections
link |
00:58:18.740
were much worse than they are today.
link |
00:58:20.820
Like an order of magnitude, I mean,
link |
00:58:23.620
just way, way worse in 2008.
link |
00:58:26.860
So, like having a small block size or whatever
link |
00:58:31.860
and a long synchronization time made sense in 2008.
link |
00:58:37.820
But 2021, or fast forward 10 years,
link |
00:58:41.380
it's like comically low.
link |
00:58:50.540
And I think there's some value to having a linear increase
link |
00:58:55.580
in the amount of currency that is generated.
link |
00:58:58.620
So, because some amount of the currency,
link |
00:59:01.060
like if a currency is too deflationary,
link |
00:59:05.060
or I should say, if a currency is expected
link |
00:59:10.140
to increase in value over time,
link |
00:59:11.700
there's reluctance to spend it.
link |
00:59:14.220
Because you're like, oh, I'll just hold it and not spend it
link |
00:59:17.380
because its scarcity is increasing with time.
link |
00:59:19.380
So, if I spend it now, then I will regret spending it.
link |
00:59:22.180
So, I will just, you know, hodl it.
link |
00:59:24.980
But if there's some dilution of the currency occurring
link |
00:59:27.980
over time, that's more of an incentive
link |
00:59:29.860
to use it as a currency.
link |
00:59:31.260
So, Dogecoin, somewhat randomly, just has a fixed
link |
00:59:39.220
number of sort of coins or hash strings
link |
00:59:43.220
that are generated every year.
link |
00:59:46.540
So, there's some inflation, but it's not percentage based.
link |
00:59:49.860
It's not a percentage of the total amount of money,
link |
00:59:54.340
it's a fixed number.
link |
00:59:55.700
So, the percentage of inflation
link |
00:59:58.380
will necessarily decline over time.
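A quick sketch of why a fixed absolute issuance means a declining inflation rate. The roughly five billion coins per year is the order of Dogecoin's public issuance schedule; the starting supply below is just an illustrative assumption.

```python
# Sketch: if a currency mints a fixed number of new coins each year,
# the inflation *percentage* necessarily declines over time.
# Numbers are illustrative (Dogecoin mints roughly 5 billion coins/year).
ANNUAL_ISSUANCE = 5_000_000_000
supply = 100_000_000_000  # assumed starting total supply, for illustration

for year in range(0, 50, 10):
    total = supply + ANNUAL_ISSUANCE * year
    inflation = ANNUAL_ISSUANCE / total * 100
    print(f"year {year:2d}: supply {total/1e9:6.0f}B, inflation {inflation:.2f}%/yr")
# The rate trends toward zero: 5.0%, 3.3%, 2.5%, 2.0%, 1.7%, ...
```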
link |
01:00:02.700
So, I'm not saying that it's like the ideal system
link |
01:00:06.340
for a currency, but I think it actually is
link |
01:00:09.500
just fundamentally better than anything else I've seen
link |
01:00:13.460
just by accident, so.
link |
01:00:16.420
I like how you said around 2008.
link |
01:00:19.740
So, you're not, you know, some people suggested
link |
01:00:23.420
you might be Satoshi Nakamoto.
link |
01:00:24.820
You've previously said you're not.
link |
01:00:26.340
Let me ask.
link |
01:00:27.180
You're not for sure.
link |
01:00:28.780
Would you tell us if you were?
link |
01:00:30.140
Yes.
link |
01:00:30.980
Okay.
link |
01:00:33.460
Do you think it's a feature or a bug
link |
01:00:34.780
that he's anonymous or she or they?
link |
01:00:38.980
It's an interesting kind of quirk of human history
link |
01:00:41.700
that there is a particular technology
link |
01:00:43.580
that has a completely anonymous inventor
link |
01:00:46.140
or creator.
link |
01:01:03.380
Well, I mean, you can look at the evolution of ideas
link |
01:01:10.060
before the launch of Bitcoin
link |
01:01:11.860
and see who wrote, you know, about those ideas.
link |
01:01:19.540
And then, like, I don't know exactly,
link |
01:01:21.900
obviously I don't know who created Bitcoin
link |
01:01:24.180
for practical purposes,
link |
01:01:25.180
but the evolution of ideas is pretty clear for that.
link |
01:01:28.980
And like, it seems as though like Nick Szabo
link |
01:01:31.940
is probably more than anyone else responsible
link |
01:01:35.260
for the evolution of those ideas.
link |
01:01:37.100
So, he claims not to be Satoshi Nakamoto,
link |
01:01:41.100
but I'm not sure; that's neither here nor there,
link |
01:01:44.460
but he seems to be the one more responsible
link |
01:01:47.700
for the ideas behind Bitcoin than anyone else.
link |
01:01:50.340
So, perhaps singular figures
link |
01:01:52.820
aren't even as important as the figures involved
link |
01:01:55.740
in the evolution of ideas that led to the thing, so.
link |
01:01:58.100
Yeah.
link |
01:01:58.940
Yeah, it's, you know, perhaps it's sad
link |
01:02:02.260
to think about history,
link |
01:02:03.100
but maybe most names will be forgotten anyway.
link |
01:02:06.340
What is a name anyway?
link |
01:02:07.460
It's a name attached to an idea.
link |
01:02:11.260
What does it even mean, really?
link |
01:02:13.700
I think Shakespeare had a thing about roses and stuff,
link |
01:02:16.260
whatever he said.
link |
01:02:17.220
A rose by any other name, it smells sweet.
link |
01:02:22.420
I got Elon to quote Shakespeare.
link |
01:02:24.340
I feel like I accomplished something today.
link |
01:02:26.900
Shall I compare thee to a summer's day?
link |
01:02:28.860
What?
link |
01:02:30.820
I'm gonna clip that out.
link |
01:02:31.820
I said it to people.
link |
01:02:33.980
Not more temperate and more fair.
link |
01:02:39.060
Autopilot.
link |
01:02:40.620
Tesla autopilot.
link |
01:02:46.140
Tesla autopilot has been through an incredible journey
link |
01:02:48.500
over the past six years,
link |
01:02:50.540
or perhaps even longer in the minds of,
link |
01:02:52.780
in your mind and the minds of many involved.
link |
01:02:57.020
Yeah, I think that's where we first like connected really
link |
01:02:58.980
was the autopilot stuff, autonomy and.
link |
01:03:01.900
The whole journey was incredible to me to watch.
link |
01:03:05.220
I was, because I knew, well, part of it is I was at MIT
link |
01:03:10.340
and I knew the difficulty of computer vision.
link |
01:03:13.180
And I knew the whole, I had a lot of colleagues and friends
link |
01:03:15.780
about the DARPA challenge and knew how difficult it is.
link |
01:03:18.420
And so there was a natural skepticism
link |
01:03:20.140
when I first drove a Tesla with the initial system
link |
01:03:23.700
based on Mobileye.
link |
01:03:25.140
I thought there's no way, so first when I got in,
link |
01:03:28.900
I thought there's no way this car could maintain,
link |
01:03:32.580
like stay in the lane and create a comfortable experience.
link |
01:03:35.900
So my intuition initially was that the lane keeping problem
link |
01:03:39.500
is way too difficult to solve.
link |
01:03:41.780
Oh, lane keeping, yeah, that's relatively easy.
link |
01:03:43.900
Well, like, not just solve it, but solve it in the way
link |
01:03:47.820
that we talked about previously: a prototype
link |
01:03:50.860
versus a thing that actually creates a pleasant experience
link |
01:03:54.380
over hundreds of thousands of miles or millions.
link |
01:03:57.460
Yeah, so we had to wrap a lot of code
link |
01:04:00.460
around the Mobileye thing.
link |
01:04:01.740
It doesn't just work by itself.
link |
01:04:04.380
I mean, that's part of the story
link |
01:04:06.420
of how you approach things sometimes.
link |
01:04:07.980
Sometimes you do things from scratch.
link |
01:04:09.680
Sometimes at first you kind of see what's out there
link |
01:04:12.340
and then you decide to do from scratch.
link |
01:04:14.340
That was one of the boldest decisions I've seen
link |
01:04:17.180
is both in the hardware and the software
link |
01:04:18.820
to decide to eventually go from scratch.
link |
01:04:21.020
I thought, again, I was skeptical
link |
01:04:22.700
of whether that's going to be able to work out
link |
01:04:24.500
because it's such a difficult problem.
link |
01:04:26.860
And so it was an incredible journey
link |
01:04:28.900
what I see now with everything,
link |
01:04:31.460
the hardware, the compute, the sensors,
link |
01:04:33.220
the things I maybe care and love about most
link |
01:04:37.300
is the stuff that Andre Karpathy is leading
link |
01:04:40.060
with the data set selection,
link |
01:04:41.740
the whole data engine process,
link |
01:04:43.100
the neural network architectures,
link |
01:04:45.020
the way that's in the real world,
link |
01:04:47.300
that network is tested, validated,
link |
01:04:49.380
all the different test sets,
link |
01:04:52.340
versus the ImageNet model of computer vision,
link |
01:04:54.740
like what's in academia is like real world
link |
01:04:58.340
artificial intelligence.
link |
01:05:01.340
And Andre's awesome and obviously plays an important role,
link |
01:05:04.220
but we have a lot of really talented people driving things.
link |
01:05:09.540
And Ashok is actually the head of autopilot engineering.
link |
01:05:14.820
Andre's the director of AI.
link |
01:05:16.380
AI stuff, yeah, yeah.
link |
01:05:17.700
So yeah, I'm aware that there's an incredible team
link |
01:05:20.580
of just a lot going on.
link |
01:05:22.060
Yeah, obviously people will give me too much credit
link |
01:05:26.060
and they'll give Andre too much credit, so.
link |
01:05:28.700
And people should realize how much is going on
link |
01:05:31.700
under the hood.
link |
01:05:32.540
Yeah, it's just a lot of really talented people.
link |
01:05:36.540
The Tesla Autopilot AI team is extremely talented.
link |
01:05:40.020
It's like some of the smartest people in the world.
link |
01:05:43.700
So yeah, we're getting it done.
link |
01:05:45.060
What are some insights you've gained
link |
01:05:47.660
over those five, six years of autopilot
link |
01:05:51.300
about the problem of autonomous driving?
link |
01:05:54.260
So you leaped in having some sort of
link |
01:05:58.580
first principles kinds of intuitions,
link |
01:06:00.820
but nobody knows how difficult the problem is.
link |
01:06:05.340
I thought the self driving problem would be hard,
link |
01:06:07.140
but it was harder than I thought.
link |
01:06:08.980
It's not like I thought it would be easy.
link |
01:06:09.980
I thought it would be very hard,
link |
01:06:10.800
but it was actually way harder than even that.
link |
01:06:14.220
So, I mean, what it comes down to at the end of the day
link |
01:06:17.060
is to solve self driving,
link |
01:06:18.760
you basically need to recreate what humans do to drive,
link |
01:06:28.680
which is humans drive with optical sensors,
link |
01:06:31.480
eyes and biological neural nets.
link |
01:06:34.880
And so in order to,
link |
01:06:36.420
that's how the entire road system is designed to work
link |
01:06:39.120
with basically passive optical and neural nets,
link |
01:06:45.360
biologically.
link |
01:06:46.200
And now that we need to,
link |
01:06:47.920
so for actually for full self driving to work,
link |
01:06:50.080
we have to recreate that in digital form.
link |
01:06:52.960
So we have to, that means cameras with advanced neural nets
link |
01:07:01.520
in silicon form, and then it will obviously solve
link |
01:07:06.400
for full self driving.
link |
01:07:08.000
That's the only way.
link |
01:07:09.000
I don't think there's any other way.
link |
01:07:10.320
But the question is what aspects of human nature
link |
01:07:12.920
do you have to encode into the machine, right?
link |
01:07:15.560
Do you have to solve the perception problem, like detect?
link |
01:07:18.720
And then you first realize
link |
01:07:21.320
what is the perception problem for driving,
link |
01:07:23.080
like all the kinds of things you have to be able to see,
link |
01:07:25.400
like what do we even look at when we drive?
link |
01:07:27.900
There's, I just recently heard Andre's talk at MIT
link |
01:07:32.440
about car doors.
link |
01:07:33.720
I think it was the world's greatest talk of all time
link |
01:07:36.080
about car doors, the fine details of car doors.
link |
01:07:41.380
Like what is even an open car door, man?
link |
01:07:44.440
So like the ontology of that,
link |
01:07:46.880
that's a perception problem.
link |
01:07:48.000
We humans solve that perception problem,
link |
01:07:49.880
and Tesla has to solve that problem.
link |
01:07:51.640
And then there's the control and the planning
link |
01:07:53.380
coupled with the perception.
link |
01:07:54.980
You have to figure out like what's involved in driving,
link |
01:07:58.280
like especially in all the different edge cases.
link |
01:08:02.320
And then, I mean, maybe you can comment on this,
link |
01:08:06.540
how much game theoretic kind of stuff needs to be involved
link |
01:08:10.920
at a four way stop sign.
link |
01:08:12.700
As humans, when we drive, our actions affect the world.
link |
01:08:18.060
Like it changes how others behave.
link |
01:08:20.780
Most autonomous driving, if you,
link |
01:08:23.340
you're usually just responding to the scene
link |
01:08:27.420
as opposed to like really asserting yourself in the scene.
link |
01:08:31.360
Do you think?
link |
01:08:33.140
I think these sort of control logic conundrums
link |
01:08:37.580
are not the hard part.
link |
01:08:39.340
The, you know, let's see.
link |
01:08:45.540
What do you think is the hard part
link |
01:08:46.860
in this whole beautiful, complex problem?
link |
01:08:50.580
So it's a lot of freaking software, man.
link |
01:08:52.980
A lot of smart lines of code.
link |
01:08:57.300
For sure, in order to
link |
01:09:01.340
create an accurate vector space.
link |
01:09:03.980
So like you're coming from image space,
link |
01:09:08.280
which is like this flow of photons,
link |
01:09:12.540
going into the cameras,
link |
01:09:14.380
and then you have this massive bitstream
link |
01:09:21.220
in image space, and then you have to effectively compress
link |
01:09:29.580
a massive bitstream corresponding to photons
link |
01:09:34.580
that knocked off an electron in a camera sensor
link |
01:09:38.500
and turn that bitstream into a vector space.
link |
01:09:44.860
By vector space, I mean like, you know,
link |
01:09:47.860
you've got cars and humans and lane lines and curves
link |
01:09:55.060
and traffic lights and that kind of thing.
link |
01:09:59.380
Once you've got all of that in your head,
link |
01:10:03.180
once you have an accurate vector space,
link |
01:10:08.500
the control problem is similar to that of a video game,
link |
01:10:11.680
like a Grand Theft Auto or Cyberpunk,
link |
01:10:14.100
if you have accurate vector space.
link |
01:10:16.260
It's the control problem is,
link |
01:10:18.300
I wouldn't say it's trivial, it's not trivial,
link |
01:10:20.300
but it's not like some insurmountable thing.
link |
01:10:29.020
Having an accurate vector space is very difficult.
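To make the "vector space" idea concrete: the job is compressing an enormous stream of pixels into a short list of typed objects with positions and motion. The sketch below is purely illustrative; the classes and fields are hypothetical, not Tesla's actual data structures.

```python
# Illustrative sketch of what a "vector space" output might look like:
# instead of millions of pixels per frame, a handful of typed objects
# with position and motion. Not Tesla's actual representation.
from dataclasses import dataclass
from enum import Enum, auto

class Kind(Enum):
    CAR = auto()
    PEDESTRIAN = auto()
    LANE_LINE = auto()
    CURB = auto()
    TRAFFIC_LIGHT = auto()

@dataclass
class TrackedObject:
    kind: Kind
    position: tuple[float, float, float]   # meters, ego-centered frame
    velocity: tuple[float, float, float]   # meters/second

# A single frame's worth of "vector space": a few dozen objects,
# versus on the order of 10^7 raw pixel values coming off eight cameras.
scene: list[TrackedObject] = [
    TrackedObject(Kind.CAR, (12.0, -3.5, 0.0), (8.0, 0.0, 0.0)),
    TrackedObject(Kind.PEDESTRIAN, (20.0, 4.0, 0.0), (-1.2, 0.0, 0.0)),
]
```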
link |
01:10:32.140
Yeah, I think we humans don't give enough respect
link |
01:10:35.540
to how incredible the human perception system is
link |
01:10:37.900
to mapping the raw photons to the vector space
link |
01:10:42.460
representation in our heads.
link |
01:10:44.660
Your brain is doing an incredible amount of processing
link |
01:10:48.360
and giving you an image that is a very cleaned up image.
link |
01:10:51.360
Like when we look around here, we see,
link |
01:10:53.380
like you see color in the corners of your eyes,
link |
01:10:55.340
but actually your eyes have very few cones,
link |
01:10:59.420
like cone receptors in the peripheral vision.
link |
01:11:02.260
Your eyes are painting color in the peripheral vision.
link |
01:11:05.660
You don't realize it,
link |
01:11:06.500
but your eyes are actually painting color
link |
01:11:09.060
and your eyes also have like these blood vessels
link |
01:11:12.260
and all sorts of gnarly things and there's a blind spot,
link |
01:11:14.660
but do you see your blind spot?
link |
01:11:16.380
No, your brain is painting in the missing, the blind spot.
link |
01:11:21.180
You're gonna do these things online where you look here
link |
01:11:24.980
and look at this point and then look at this point
link |
01:11:27.380
and if it's in your blind spot,
link |
01:11:30.460
your brain will just fill in the missing bits.
link |
01:11:33.660
The peripheral vision is so cool.
link |
01:11:35.500
It makes you realize all the illusions for vision sciences,
link |
01:11:38.060
so it makes you realize just how incredible the brain is.
link |
01:11:40.620
The brain is doing crazy amount of post processing
link |
01:11:42.660
on the vision signals for your eyes.
link |
01:11:45.820
It's insane.
link |
01:11:49.180
And then even once you get all those vision signals,
link |
01:11:51.940
your brain is constantly trying to forget
link |
01:11:56.260
as much as possible.
link |
01:11:57.660
So human memory is,
link |
01:11:59.500
perhaps the weakest thing about the brain is memory.
link |
01:12:01.900
So because memory is so expensive to our brain
link |
01:12:05.260
and so limited,
link |
01:12:06.660
your brain is trying to forget as much as possible
link |
01:12:09.740
and distill the things that you see
link |
01:12:12.140
into the smallest amounts of information possible.
link |
01:12:16.500
So your brain is trying to not just get to a vector space,
link |
01:12:19.340
but get to a vector space that is the smallest possible
link |
01:12:22.860
vector space of only relevant objects.
link |
01:12:26.620
And I think you can sort of look inside your brain
link |
01:12:29.860
or at least I can,
link |
01:12:31.540
when you drive down the road and try to think about
link |
01:12:35.380
what your brain is actually doing consciously.
link |
01:12:38.940
And it's like you'll see a car,
link |
01:12:44.540
because you don't have cameras,
link |
01:12:46.860
you don't have eyes in the back of your head or on the side.
link |
01:12:48.940
So you basically have like two cameras on a slow gimbal.
link |
01:13:00.460
And eyesight is not that great.
link |
01:13:01.700
Okay, human eyes are like,
link |
01:13:04.300
and people are constantly distracted
link |
01:13:05.700
and thinking about things and texting
link |
01:13:07.180
and doing all sorts of things they shouldn't do in a car,
link |
01:13:09.260
changing the radio station.
link |
01:13:10.940
So, having arguments, all that kind of thing.
link |
01:13:15.060
so when's the last time you looked right and left
link |
01:13:22.540
and rearward, or even diagonally forward
link |
01:13:27.060
to actually refresh your vector space?
link |
01:13:30.140
So you're glancing around and what your mind is doing
link |
01:13:32.740
is trying to distill the relevant vectors,
link |
01:13:37.300
basically objects with a position and motion.
link |
01:13:40.220
And then editing that down to the least amount
link |
01:13:48.140
that's necessary for you to drive.
link |
01:13:49.940
It does seem to be able to edit it down
link |
01:13:53.260
or compress it even further into things like concepts.
link |
01:13:55.780
So it's not, it's like it goes beyond,
link |
01:13:57.660
the human mind seems to go sometimes beyond vector space
link |
01:14:01.260
to sort of space of concepts to where you'll see a thing.
link |
01:14:05.080
It's no longer represented spatially somehow.
link |
01:14:07.520
It's almost like a concept that you should be aware of.
link |
01:14:10.060
Like if this is a school zone,
link |
01:14:12.300
you'll remember that as a concept,
link |
01:14:14.940
which is a weird thing to represent,
link |
01:14:16.420
but perhaps for driving,
link |
01:14:17.480
you don't need to fully represent those things.
link |
01:14:20.460
Or maybe you get those kind of indirectly.
link |
01:14:25.860
You need to establish vector space
link |
01:14:27.740
and then actually have predictions for those vector spaces.
link |
01:14:32.740
So if you drive past, say a bus and you see that there's people,
link |
01:14:47.340
before you drove past the bus,
link |
01:14:48.500
you saw people crossing or some,
link |
01:14:50.580
just imagine there's like a large truck
link |
01:14:52.820
or something blocking site.
link |
01:14:55.500
But before you came up to the truck,
link |
01:14:57.420
you saw that there were some kids about to cross the road
link |
01:15:00.700
in front of the truck.
link |
01:15:01.540
Now you can no longer see the kids,
link |
01:15:03.260
but you would now know, okay,
link |
01:15:06.300
those kids are probably gonna pass by the truck
link |
01:15:09.020
and cross the road, even though you cannot see them.
link |
01:15:12.020
So you have to have memory,
link |
01:15:17.340
you have to need to remember that there were kids there
link |
01:15:19.100
and you need to have some forward prediction
link |
01:15:21.820
of what their position will be at the time of relevance.
link |
01:15:25.660
So with occlusions and computer vision,
link |
01:15:28.540
when you can't see an object anymore,
link |
01:15:30.840
even when it just walks behind a tree and reappears,
link |
01:15:33.460
that's a really, really,
link |
01:15:35.100
I mean, at least in academic literature,
link |
01:15:37.100
it's tracking through occlusions, it's very difficult.
link |
01:15:40.620
Yeah, we're doing it.
link |
01:15:41.940
I understand this.
link |
01:15:42.780
Yeah.
link |
01:15:43.600
So some of it.
link |
01:15:44.440
It's like object permanence,
link |
01:15:45.420
like same thing happens with humans with neural nets.
link |
01:15:47.700
Like when like a toddler grows up,
link |
01:15:50.100
like there's a point in time where they develop,
link |
01:15:54.420
they have a sense of object permanence.
link |
01:15:56.220
So before a certain age, if you have a ball or a toy
link |
01:15:59.580
or whatever, and you put it behind your back
link |
01:16:01.180
and you pop it out, if they don't,
link |
01:16:03.360
before they have object permanence,
link |
01:16:04.660
it's like a new thing every time.
link |
01:16:05.860
It's like, whoa, this toy went poof, it just disappeared,
link |
01:16:08.260
and now it's back again and they can't believe it.
link |
01:16:09.980
And that they can play peekaboo all day long
link |
01:16:12.140
because peekaboo is fresh every time.
link |
01:16:13.940
But then we figured out object permanence,
link |
01:16:18.220
then they realize, oh no, the object is not gone,
link |
01:16:20.360
it's just behind your back.
link |
01:16:22.660
Sometimes I wish we never did figure out object permanence.
link |
01:16:26.380
Yeah, so that's a...
link |
01:16:28.260
So that's an important problem to solve.
link |
01:16:31.620
Yes, so like an important evolution
link |
01:16:33.960
of the neural nets in the car is
link |
01:16:39.740
memory across both time and space.
link |
01:16:43.860
So now you have to say,
link |
01:16:47.100
like how long do you want to remember things for?
link |
01:16:48.960
And there's a cost to remembering things for a long time.
link |
01:16:53.260
So you can like run out of memory
link |
01:16:55.740
to try to remember too much for too long.
link |
01:16:58.600
And then you also have things that are stale
link |
01:17:01.420
if you remember them for too long.
link |
01:17:03.540
And then you also need things that are remembered over time.
link |
01:17:06.880
So even if you like say have like,
link |
01:17:10.640
for argument's sake, five seconds of memory on a time basis,
link |
01:17:14.580
but like let's say you're parked at a light
link |
01:17:17.100
and you saw, use a pedestrian example,
link |
01:17:20.660
that people were waiting to cross the road
link |
01:17:25.220
and you can't quite see them because of an occlusion,
link |
01:17:28.900
but they might wait for a minute before the light changes
link |
01:17:31.820
for them to cross the road.
link |
01:17:33.180
You still need to remember that that's where they were
link |
01:17:36.940
and that they're probably going
link |
01:17:38.420
to cross the road type of thing.
link |
01:17:40.580
So even if that exceeds your time based memory,
link |
01:17:44.740
it should not exceed your space memory.
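A minimal sketch of the kind of memory being described: an occluded object is kept alive and forward-predicted, and it is dropped based on distance from the ego car (space) rather than purely on a fixed time window. This is just the idea in code form, with hypothetical names and thresholds, not the actual autopilot logic.

```python
# Illustrative sketch of object permanence with forward prediction:
# objects that become occluded are remembered and extrapolated forward,
# and are only dropped once they are far from the ego vehicle (a "space"
# criterion), not merely because a fixed time window elapsed.
from dataclasses import dataclass

MAX_RANGE_M = 80.0  # hypothetical: forget objects this far from the car

@dataclass
class Track:
    x: float
    y: float          # position in ego frame, meters
    vx: float
    vy: float         # velocity, m/s
    visible: bool = True

    def predict(self, dt: float) -> None:
        # Constant-velocity forward prediction while occluded.
        self.x += self.vx * dt
        self.y += self.vy * dt

def update_tracks(tracks: list[Track], dt: float) -> list[Track]:
    kept = []
    for t in tracks:
        if not t.visible:
            t.predict(dt)      # keep predicting the kids hidden behind the truck
        if (t.x**2 + t.y**2) ** 0.5 < MAX_RANGE_M:
            kept.append(t)     # drop only once the object is spatially irrelevant
    return kept
```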
link |
01:17:48.140
And I just think the data engine side of that,
link |
01:17:50.500
so getting the data to learn all of the concepts
link |
01:17:53.540
that you're saying now is an incredible process.
link |
01:17:56.180
It's this iterative process of just,
link |
01:17:58.380
it's this hydranet of many.
link |
01:18:00.740
Hydranet.
link |
01:18:03.380
We're changing the name to something else.
link |
01:18:05.420
Okay, I'm sure it'll be equally as Rick and Morty like.
link |
01:18:09.700
There's a lot of, yeah.
link |
01:18:11.460
We've rearchitected the neural net,
link |
01:18:14.700
the neural nets in the cars so many times it's crazy.
link |
01:18:17.980
Oh, so every time there's a new major version,
link |
01:18:20.020
you'll rename it to something more ridiculous
link |
01:18:21.940
or memorable and beautiful, sorry.
link |
01:18:25.060
Not ridiculous, of course.
link |
01:18:28.140
If you see the full array of neural nets
link |
01:18:32.500
that are operating in the cars,
link |
01:18:34.180
it kind of boggles the mind.
link |
01:18:36.420
There's so many layers, it's crazy.
link |
01:18:39.860
So, yeah.
link |
01:18:43.620
And we started off with simple neural nets
link |
01:18:48.620
that were basically image recognition
link |
01:18:53.300
on a single frame from a single camera
link |
01:18:56.620
and then trying to knit those together
link |
01:19:00.700
with C, I should say we're really primarily running C here
link |
01:19:07.680
because C++ is too much overhead
link |
01:19:10.000
and we have our own C compiler.
link |
01:19:11.660
So to get maximum performance,
link |
01:19:13.580
we actually wrote our own C compiler
link |
01:19:15.780
and are continuing to optimize our C compiler
link |
01:19:18.060
for maximum efficiency.
link |
01:19:20.060
In fact, we've just recently done a new rev
link |
01:19:23.100
on our C compiler that'll compile directly
link |
01:19:25.260
to our autopilot hardware.
link |
01:19:26.980
So you want to compile the whole thing down
link |
01:19:28.960
with your own compiler, for efficiency here,
link |
01:19:32.660
because there's all kinds of compute,
link |
01:19:33.940
there's CPU, GPU, there's like basic types of things
link |
01:19:37.340
and you have to somehow figure out the scheduling
link |
01:19:39.140
across all of those things.
link |
01:19:40.140
And so you're compiling the code down that does all, okay.
link |
01:19:44.500
So that's why there's a lot of people involved.
link |
01:19:46.900
There's a lot of hardcore software engineering
link |
01:19:50.620
at a very sort of bare metal level
link |
01:19:54.700
because we're trying to do a lot of compute
link |
01:19:57.280
that's constrained to our full self driving computer.
link |
01:20:03.040
So we want to try to have the highest frames per second
link |
01:20:07.740
possible in a sort of very finite amount of compute
link |
01:20:14.580
and power.
link |
01:20:15.420
So we really put a lot of effort into the efficiency
link |
01:20:20.940
of our compute.
link |
01:20:23.820
And so there's actually a lot of work done
link |
01:20:26.060
by some very talented software engineers at Tesla
link |
01:20:29.660
that at a very foundational level
link |
01:20:33.140
to improve the efficiency of compute
link |
01:20:35.260
and how we use the trip accelerators,
link |
01:20:38.940
which are basically, you know,
link |
01:20:43.180
doing matrix math dot products,
link |
01:20:45.420
like a bazillion dot products.
link |
01:20:47.340
And it's like, what are neural nets?
link |
01:20:49.580
It's like compute wise, like 99% dot products.
link |
01:20:54.340
So, you know.
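The "99% dot products" point is easy to see in a toy example: a fully connected layer is one matrix multiply, which is just a grid of dot products, so nearly all the accelerator's work is multiply-accumulate. The sizes below are arbitrary illustrative choices.

```python
# Toy illustration of "neural nets are, compute wise, basically dot products":
# one fully connected layer is a single matrix multiply, i.e. every output
# neuron is a dot product of the input with one row of weights.
import numpy as np

x = np.random.randn(1, 1024)          # one input activation vector
W = np.random.randn(1024, 256)        # layer weights
b = np.random.randn(256)              # biases

y = np.maximum(x @ W + b, 0.0)        # matmul (1024*256 = 262,144 multiply-adds) + cheap ReLU

# The matmul dominates: 262,144 multiply-accumulates versus only 256 additions
# and 256 comparisons for the bias and ReLU, which is why accelerators are
# built almost entirely around dense dot-product throughput.
```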
link |
01:20:57.100
And you want to achieve as many high frame rates
link |
01:20:59.700
like a video game.
link |
01:21:00.580
You want full resolution, high frame rate.
link |
01:21:05.020
High frame rate, low latency, low jitter.
link |
01:21:10.020
So I think one of the things we're moving towards now
link |
01:21:18.340
is no post processing of the image
link |
01:21:22.540
through the image signal processor.
link |
01:21:26.700
So like what happens for cameras is that,
link |
01:21:32.580
almost all cameras is they,
link |
01:21:35.740
there's a lot of post processing done
link |
01:21:37.700
in order to make pictures look pretty.
link |
01:21:40.260
And so we don't care about pictures looking pretty.
link |
01:21:43.540
We just want the data.
link |
01:21:45.380
So we're moving to just raw photon counts.
link |
01:21:48.780
So the system will, like the image that the computer sees
link |
01:21:55.060
is actually much more than what you'd see
link |
01:21:57.860
if you represented it on a camera.
link |
01:21:59.100
It's got much more data.
link |
01:22:00.780
And even in a very low light conditions,
link |
01:22:02.560
you can see that there's a small photon count difference
link |
01:22:05.220
between this spot here and that spot there,
link |
01:22:08.780
which means that,
link |
01:22:09.620
so it can see in the dark incredibly well
link |
01:22:12.900
because it can detect these tiny differences
link |
01:22:15.100
in photon counts.
link |
01:22:16.980
Like much better than you could possibly imagine.
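One way to see why raw counts help in low light: display-oriented processing quantizes away tiny differences that the network could otherwise use. The bit depths and values below are illustrative stand-ins, not the actual camera pipeline.

```python
# Sketch of why raw photon counts beat "pretty" post-processed images in low
# light. Bit depths are illustrative: processed images are typically 8-bit,
# while raw sensor data carries more bits per pixel.
import numpy as np

raw = np.array([[52, 54],              # 12-bit-ish raw counts: a 2-count difference
                [52, 53]], dtype=np.uint16)

# Crude stand-in for tone mapping: quantize the 0..4095 range down to 0..255.
processed = (raw.astype(np.float32) / 4095.0 * 255.0).astype(np.uint8)

print(raw.max() - raw.min())              # 2 -> the network can still see a difference
print(processed.max() - processed.min())  # 0 -> the difference vanishes after processing
```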
link |
01:22:20.780
So, and then we also save 13 milliseconds on latency.
link |
01:22:27.420
So.
link |
01:22:29.260
From removing the post processing on the image?
link |
01:22:31.220
Yes.
link |
01:22:32.060
It's like,
link |
01:22:32.900
it's incredible.
link |
01:22:34.300
Cause we've got eight cameras
link |
01:22:35.820
and then there's roughly, I don't know,
link |
01:22:39.780
one and a half milliseconds or so,
link |
01:22:41.980
maybe 1.6 milliseconds of latency for each camera.
link |
01:22:46.220
And so like going to just,
link |
01:22:53.380
basically bypassing the image processor
link |
01:22:56.300
gets us back 13 milliseconds of latency,
link |
01:22:58.240
which is important.
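As a quick check, the 13 millisecond figure is roughly the per-camera image-signal-processor latency quoted above times the eight cameras:

```python
# Rough check of the latency saved by bypassing the image signal processor.
cameras, isp_latency_ms = 8, 1.6
print(cameras * isp_latency_ms)   # ≈ 12.8 ms, i.e. roughly the 13 ms quoted
```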
link |
01:22:59.320
And we track latency all the way from, you know,
link |
01:23:03.440
photon hits the camera to, you know,
link |
01:23:06.760
all the steps that it's got to go through to get,
link |
01:23:08.920
you know, go through the various neural nets
link |
01:23:12.280
and the C code.
link |
01:23:13.320
And there's a little bit of C++ there as well.
link |
01:23:17.800
Well, I mean, maybe a lot, but
link |
01:23:20.200
the core stuff, the heavy duty compute, is all in C.
link |
01:23:25.120
And so we track that latency all the way
link |
01:23:28.840
to an output command to the drive unit to accelerate,
link |
01:23:33.480
to the brakes to slow down, to the steering,
link |
01:23:36.600
you know, turn left or right.
link |
01:23:38.920
So, cause you go to output a command
link |
01:23:40.680
that's got to go to a controller.
link |
01:23:41.800
And like some of these controllers have an update frequency
link |
01:23:44.120
that's maybe 10 Hertz or something like that,
link |
01:23:46.480
which is slow.
link |
01:23:47.320
That's like now you lose a hundred milliseconds potentially.
link |
01:23:50.240
So, so then we want to update the,
link |
01:23:55.040
the drivers on the like say steering and braking control
link |
01:23:58.720
to have more like a hundred Hertz instead of 10 Hertz.
link |
01:24:02.920
And then you've got a 10 millisecond latency
link |
01:24:04.400
instead of a hundred millisecond worst case latency.
link |
01:24:06.400
And actually jitter is more of a challenge than latency.
link |
01:24:09.560
Cause latency is like,
link |
01:24:11.000
you can anticipate and predict,
link |
01:24:13.120
but if you've got a stack up of things going
link |
01:24:14.920
from the camera to the computer, through
link |
01:24:17.880
a series of other computers,
link |
01:24:19.400
and finally to an actuator on the car,
link |
01:24:22.120
if you have a stack up of tolerances of timing tolerances,
link |
01:24:26.960
then you can have quite a variable latency,
link |
01:24:29.000
which is called jitter.
link |
01:24:30.360
And that makes it hard to anticipate exactly
link |
01:24:34.840
what, how you should turn the car or accelerate,
link |
01:24:37.480
because if you've got maybe a hundred,
link |
01:24:40.640
50, 200 milliseconds of jitter,
link |
01:24:42.440
then you could be off by, you know, up to 0.2 seconds.
link |
01:24:45.560
And this can make, this could make a big difference.
link |
01:24:47.560
So you have to interpolate somehow
link |
01:24:50.160
to deal with the effects of jitter.
link |
01:24:52.400
So that you can make like robust control decisions.
link |
01:24:57.920
You have to, so the jitter is in the sensor information,
link |
01:25:01.620
or the jitter can occur at any stage in the pipeline.
link |
01:25:05.000
If you have just fixed latency,
link |
01:25:07.760
you can anticipate and, and like say, okay,
link |
01:25:11.960
we know that our information is for argument's sake,
link |
01:25:16.560
150 milliseconds stale.
link |
01:25:19.000
Like, so for 150 milliseconds
link |
01:25:22.840
from photon hitting the camera to where you can measure a change
link |
01:25:28.500
in the acceleration of the vehicle.
link |
01:25:33.720
So then you can say, okay, well,
link |
01:25:38.080
we know it's 150 milliseconds.
link |
01:25:39.400
So we're going to take that into account
link |
01:25:40.960
and, and compensate for that latency.
link |
01:25:44.240
However, if you've got then 150 milliseconds of latency
link |
01:25:47.320
plus a hundred milliseconds of jitter,
link |
01:25:49.200
which could be anywhere from
link |
01:25:50.760
zero to a hundred milliseconds on top.
link |
01:25:52.240
So, so then your latency could be from 150 to 250 milliseconds.
link |
01:25:55.720
Now you've got a hundred milliseconds
link |
01:25:56.680
that you don't know what to do with.
link |
01:25:58.040
And that's basically random.
link |
01:26:01.440
So getting rid of jitter is extremely important.
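A small sketch of why fixed latency is manageable but jitter is not: with a known delay you can forward-predict the state, while whatever part of the delay is random turns directly into prediction error. The numbers (150 ms known latency, up to 100 ms jitter, 10 Hz versus 100 Hz actuator updates) are the ones used in the conversation; the speed is an assumed example.

```python
# Sketch: compensating for a known sensing latency vs. suffering from jitter.
# Numbers follow the conversation: ~150 ms known latency, 0-100 ms jitter,
# and controller update rates of 10 Hz vs 100 Hz. Speed is an assumption.
import random

def predicted_position(pos_measured, speed, latency_s):
    # With a *known* latency you can extrapolate the measurement to "now".
    return pos_measured + speed * latency_s

speed = 20.0                          # m/s, assumed example speed
known_latency = 0.150                 # s, compensated exactly
jitter = random.uniform(0.0, 0.100)   # s, unknown extra delay

true_delay = known_latency + jitter
pos_measured = 0.0
pos_now_true = pos_measured + speed * true_delay
pos_now_est = predicted_position(pos_measured, speed, known_latency)

print(f"prediction error caused by jitter: {pos_now_true - pos_now_est:.2f} m (up to 2 m here)")

# Separately, a controller updated at 10 Hz can add up to 1/10 s = 100 ms of
# waiting before a command takes effect; at 100 Hz the worst case is 10 ms.
for hz in (10, 100):
    print(f"{hz} Hz controller: worst-case extra latency {1000 / hz:.0f} ms")
```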
link |
01:26:04.260
And that affects your control decisions
link |
01:26:05.840
and all those kinds of things.
link |
01:26:07.400
Okay.
link |
01:26:09.160
Yeah, the car's just going to fundamentally maneuver better
link |
01:26:11.140
with lower jitter.
link |
01:26:12.920
Got it.
link |
01:26:13.760
The cars will maneuver with superhuman ability
link |
01:26:16.760
and reaction time much faster than a human.
link |
01:26:20.360
I mean, I think over time the autopilot,
link |
01:26:24.320
full self driving will be capable of maneuvers
link |
01:26:26.400
that are far more than what like James Bond could do
link |
01:26:34.840
in like the best movie type of thing.
link |
01:26:36.360
That's exactly what I was imagining in my mind,
link |
01:26:38.620
as you said it.
link |
01:26:40.240
It's like an impossible maneuvers
link |
01:26:41.820
that a human couldn't do.
link |
01:26:43.020
Yeah, so.
link |
01:26:45.080
Well, let me ask sort of looking back the six years,
link |
01:26:48.720
looking out into the future,
link |
01:26:50.280
based on your current understanding,
link |
01:26:51.840
how hard do you think this,
link |
01:26:53.640
this full self driving problem,
link |
01:26:55.360
when do you think Tesla will solve level four FSD?
link |
01:27:01.040
I mean, it's looking quite likely
link |
01:27:02.340
that it will be next year.
link |
01:27:05.520
And what does the solution look like?
link |
01:27:07.040
Is it the current pool of FSD beta candidates,
link |
01:27:10.200
they start getting greater and greater
link |
01:27:13.120
as they have been, degrees of autonomy,
link |
01:27:15.760
and then there's a certain level
link |
01:27:17.760
beyond which they can do their own thing,
link |
01:27:20.800
they can read a book.
link |
01:27:22.880
Yeah, so.
link |
01:27:25.720
I mean, you can see that anybody
link |
01:27:26.880
who's been following the full self driving beta closely
link |
01:27:30.640
will see that the rate of disengagements
link |
01:27:35.260
has been dropping rapidly.
link |
01:27:37.440
So like disengagement being where the driver intervenes
link |
01:27:40.720
to prevent the car from doing something dangerous,
link |
01:27:44.000
potentially, so.
link |
01:27:49.740
So the interventions per million miles
link |
01:27:53.240
has been dropping dramatically.
link |
01:27:57.520
And the way that trend looks, at some point next year,
link |
01:28:01.040
the probability of an accident on FSD
link |
01:28:06.040
is less than that of the average human,
link |
01:28:09.520
and then significantly less than that of the average human.
link |
01:28:13.480
So it certainly appears like we will get there next year.
link |
01:28:21.080
Then of course, then there's gonna be a case of,
link |
01:28:24.000
okay, well, we now have to prove this to regulators
link |
01:28:26.040
and prove it to, you know, and we want a standard
link |
01:28:28.640
that is not just equivalent to a human,
link |
01:28:31.160
but much better than the average human.
link |
01:28:33.480
I think it's gotta be at least two or three times
link |
01:28:35.560
higher safety than a human.
link |
01:28:39.000
So two or three times lower probability of injury
link |
01:28:41.360
than a human before we would actually say like,
link |
01:28:44.840
okay, it's okay to go, it's not gonna be equivalent,
link |
01:28:46.800
it's gonna be much better.
link |
01:28:48.340
So if you look, FSD 10.6 just came out recently,
link |
01:28:53.400
10.7 is on the way, maybe 11 is on the way,
link |
01:28:57.120
so we're in the future.
link |
01:28:58.440
Yeah, we were hoping to get 11 out this year,
link |
01:29:01.040
but 11 actually has a whole bunch of fundamental rewrites
link |
01:29:07.180
on the neural net architecture,
link |
01:29:10.840
and some fundamental improvements
link |
01:29:14.040
in creating vector space, so.
link |
01:29:19.680
There is some fundamental like leap
link |
01:29:22.240
that really deserves the 11,
link |
01:29:24.000
I mean, that's a pretty cool number.
link |
01:29:25.160
Yeah, 11 would be a single stack
link |
01:29:29.920
for all, you know, one stack to rule them all.
link |
01:29:36.160
But there are just some really fundamental
link |
01:29:40.520
neural net architecture changes
link |
01:29:43.760
that will allow for much more capability,
link |
01:29:47.720
but at first they're gonna have issues.
link |
01:29:51.080
So like we have this working on like sort of alpha software
link |
01:29:54.680
and it's good, but it's basically taking a whole bunch
link |
01:30:00.880
of C and C++ code, deleting a massive amount of C++ code,
link |
01:30:05.360
and replacing it with a neural net.
link |
01:30:06.440
And Andre makes this point a lot,
link |
01:30:09.160
which is like neural nets are kind of eating software.
link |
01:30:12.480
Over time there's like less and less conventional software,
link |
01:30:15.880
more and more neural net, which is still software,
link |
01:30:18.240
but it, you know, still comes down to lines of software,
link |
01:30:21.120
but it's more neural net stuff
link |
01:30:25.320
and less, you know, heuristics basically.
link |
01:30:33.080
More matrix based stuff and less heuristics based stuff.
link |
01:30:38.080
And, you know, like one of the big changes will be,
link |
01:30:47.080
like right now the neural nets will deliver
link |
01:30:54.080
a giant bag of points to the C and C++ code.
link |
01:31:00.080
We call it the giant bag of points.
link |
01:31:03.080
And it's like, so you've got a pixel and something associated
link |
01:31:08.080
with that pixel.
link |
01:31:09.080
Like this pixel is probably car.
link |
01:31:11.080
The pixel is probably lane line.
link |
01:31:13.080
Then you've got to assemble this giant bag of points
link |
01:31:16.080
in the C code and turn it into vectors.
link |
01:31:21.080
And it does a pretty good job of it, but it's,
link |
01:31:26.080
we want to just, you know,
link |
01:31:30.080
we need another layer of neural nets on top of that
link |
01:31:35.080
to take the giant bag of points and distill that down
link |
01:31:40.080
to a vector space in the neural net part of the software,
link |
01:31:45.080
as opposed to the heuristics part of the software.
link |
01:31:48.080
This is a big improvement.
link |
01:31:51.080
Neural nets all the way down.
link |
01:31:52.080
That's what you want.
link |
01:31:53.080
It's not even all neural nets, but this will be just a,
link |
01:31:58.080
this is a game changer to not have the bag of points,
link |
01:32:01.080
the giant bag of points that has to be assembled
link |
01:32:04.080
with many lines of C++ and have the,
link |
01:32:09.080
and have a neural net just assemble those into a vector.
link |
01:32:12.080
So the neural net is outputting much, much less data.
link |
01:32:19.080
It's outputting, this is a lane line.
link |
01:32:22.080
This is a curb.
link |
01:32:23.080
This is drivable space.
link |
01:32:24.080
This is a car.
link |
01:32:25.080
This is a pedestrian or a cyclist or something like that.
link |
01:32:29.080
It's outputting, it's really outputting proper vectors
link |
01:32:35.080
to the C, C++ control code,
link |
01:32:39.080
as opposed to the sort of constructing the vectors in C,
link |
01:32:50.080
which we've done, I think, quite a good job of,
link |
01:32:52.080
but we're kind of hitting a local maximum
link |
01:32:55.080
on how well the C can do this.
link |
01:32:59.080
So this is really a big deal.
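To make the "giant bag of points" versus vectors contrast concrete: today the network emits a per-pixel labeling that downstream C/C++ heuristics must cluster into objects, whereas the change described has the network emit the objects directly. The sketch below is only illustrative of the shape of the data, not Tesla's actual interfaces.

```python
# Illustrative contrast between the two interfaces described above.
# Not Tesla's actual code; just the shape of the data at each stage.
import numpy as np

H, W = 960, 1280

# (1) "Giant bag of points": a per-pixel class map that downstream C/C++
#     heuristics must cluster into lane lines, cars, curbs, and so on.
bag_of_points = np.zeros((H, W), dtype=np.uint8)   # 0=free, 1=car, 2=lane_line, ...

# (2) Vector-space output: the network itself emits a short list of objects.
vector_output = [
    {"type": "lane_line", "polyline": [(0.0, -1.8), (60.0, -1.9)]},   # meters
    {"type": "car",       "position": (25.0, 0.4), "heading_deg": 2.0},
    {"type": "curb",      "polyline": [(0.0, 3.5), (60.0, 3.6)]},
]

# The second form is orders of magnitude smaller (a few objects versus the
# ~1.2 million pixels in the class map) and removes the hand-written
# clustering heuristics from the loop.
```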
link |
01:33:02.080
And just all of the networks in the car
link |
01:33:04.080
need to move to surround video.
link |
01:33:06.080
There's still some legacy networks that are not surround video.
link |
01:33:11.080
And all of the training needs to move to surround video
link |
01:33:14.080
and the efficiency of the training,
link |
01:33:16.080
it needs to get better and it is.
link |
01:33:18.080
And then we need to move everything to raw photon counts
link |
01:33:25.080
as opposed to processed images,
link |
01:33:29.080
which is quite a big reset on the training
link |
01:33:31.080
because the system's trained on post processed images.
link |
01:33:35.080
So we need to redo all the training
link |
01:33:38.080
to train against the raw photon counts
link |
01:33:41.080
instead of the post processed image.
link |
01:33:43.080
So ultimately, it's kind of reducing the complexity
link |
01:33:46.080
of the whole thing.
link |
01:33:47.080
So reducing the...
link |
01:33:50.080
Lines of code will actually go lower.
link |
01:33:52.080
Yeah, that's fascinating.
link |
01:33:54.080
So you're doing fusion of all the sensors
link |
01:33:56.080
and reducing the complexity of having to deal with these...
link |
01:33:58.080
Fusion of the cameras.
link |
01:33:59.080
Fusion of the cameras, really.
link |
01:34:00.080
Right, yes.
link |
01:34:03.080
Same with humans.
link |
01:34:05.080
Well, I guess we've got ears too.
link |
01:34:07.080
Yeah, we'll actually need to incorporate sound as well
link |
01:34:11.080
because you need to listen for ambulance sirens
link |
01:34:14.080
or fire trucks or somebody yelling at you or something.
link |
01:34:20.080
I don't know.
link |
01:34:21.080
There's a little bit of audio that needs to be incorporated as well.
link |
01:34:24.080
Do you need to go back for a break?
link |
01:34:26.080
Yeah, sure, let's take a break.
link |
01:34:27.080
Okay.
link |
01:34:28.080
Honestly, frankly, the ideas are the easy thing
link |
01:34:33.080
and the implementation is the hard thing.
link |
01:34:35.080
The idea of going to the moon is the easy part.
link |
01:34:37.080
Now going to the moon is the hard part.
link |
01:34:39.080
It's the hard part.
link |
01:34:40.080
And there's a lot of hardcore engineering
link |
01:34:42.080
that's got to get done at the hardware and software level.
link |
01:34:46.080
Like I said, optimizing the C compiler
link |
01:34:48.080
and just cutting out latency everywhere.
link |
01:34:55.080
If we don't do this, the system will not work properly.
link |
01:34:59.080
So the work of the engineers doing this,
link |
01:35:02.080
they are like the unsung heroes,
link |
01:35:05.080
but they are critical to the success of the situation.
link |
01:35:08.080
I think you made it clear.
link |
01:35:09.080
I mean, at least to me, it's super exciting.
link |
01:35:11.080
Everything that's going on outside of what Andre is doing.
link |
01:35:15.080
Just the whole infrastructure, the software.
link |
01:35:17.080
I mean, everything is going on with Data Engine,
link |
01:35:19.080
whatever it's called.
link |
01:35:21.080
The whole process is just work of art to me.
link |
01:35:24.080
The sheer scale of it boggles my mind.
link |
01:35:26.080
Like the training, the amount of work done with,
link |
01:35:29.080
like we've written all this custom software for training and labeling
link |
01:35:33.080
and to do auto labeling.
link |
01:35:34.080
Auto labeling is essential.
link |
01:35:38.080
Because especially when you've got surround video, it's very difficult.
link |
01:35:42.080
To label surround video from scratch is extremely difficult.
link |
01:35:48.080
Like it would take a human such a long time to even label one video clip,
link |
01:35:52.080
like several hours.
link |
01:35:54.080
Whereas for the auto label, basically we just apply heavy duty,
link |
01:36:00.080
like a lot of compute to the video clips to preassign
link |
01:36:06.080
and guess what all the things are that are going on in the surround video.
link |
01:36:09.080
And then there's like correcting it.
link |
01:36:10.080
Yeah.
link |
01:36:11.080
And then all the human has to do is like tweak,
link |
01:36:13.080
like say, adjust what is incorrect.
link |
01:36:16.080
This increases productivity by, in fact, a hundred or more.
link |
01:36:21.080
Yeah.
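To make the auto-labeling loop above concrete, here is a minimal sketch, assuming a heavy offline model that pre-assigns labels for every frame and a human pass that only corrects the ones that are wrong; the Label structure, the dummy model, and the error rate are illustrative assumptions, not Tesla's tooling.

```python
# Sketch of auto-label-then-correct: machine guesses everything, human tweaks a few.
from dataclasses import dataclass

@dataclass
class Label:
    frame: int
    obj: str
    box: tuple  # (x, y, w, h)

def auto_label(clip_frames, model):
    """Run a compute-heavy offline model over every frame to pre-assign labels."""
    return [Label(i, *model(frame)) for i, frame in enumerate(clip_frames)]

def human_review(auto_labels, corrections):
    """The human only adjusts the labels that came out wrong."""
    fixed = {c.frame: c for c in corrections}
    return [fixed.get(l.frame, l) for l in auto_labels]

def dummy_model(frame):
    # Stand-in for the heavy network; always guesses a car at a fixed box.
    return ("car", (0, 0, 10, 10))

frames = list(range(1000))  # stand-in for 1000 video frames
labels = auto_label(frames, dummy_model)
labels = human_review(labels, [Label(42, "truck", (0, 0, 12, 12))])
# If ~1% of 1000 auto-labels need a tweak, the human touches ~10 boxes instead of
# drawing 1000 from scratch, which is roughly the hundredfold speedup described above.
```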
link |
01:36:22.080
So you've presented Tesla Bot as primarily useful in the factory.
link |
01:36:25.080
First of all, I think humanoid robots are incredible.
link |
01:36:28.080
As a fan of robotics, I think the elegance of movement
link |
01:36:32.080
that humanoid robots, that bipedal robots show are just so cool.
link |
01:36:38.080
So it's really interesting that you're working on this
link |
01:36:40.080
and also talking about applying the same kind of all the ideas,
link |
01:36:44.080
some of which we've talked about with Data Engine,
link |
01:36:46.080
all the things that we're talking about with Tesla Autopilot,
link |
01:36:49.080
just transferring that over to just yet another robotics problem.
link |
01:36:54.080
I have to ask, since I care about human robot interaction,
link |
01:36:57.080
so the human side of that.
link |
01:36:59.080
So you've talked about mostly in the factory.
link |
01:37:01.080
Do you see part of this problem that Tesla Bot has to solve
link |
01:37:06.080
is interacting with humans and potentially having a place like in the home.
link |
01:37:10.080
So interacting, not just replacing labor, but also like, I don't know,
link |
01:37:15.080
being a friend or an assistant or something like that.
link |
01:37:18.080
Yeah, I think the possibilities are endless.
link |
01:37:27.080
It's not quite in Tesla's primary mission direction
link |
01:37:32.080
of accelerating sustainable energy,
link |
01:37:34.080
but it is an extremely useful thing that we can do for the world,
link |
01:37:38.080
which is to make a useful humanoid robot that is capable of interacting with the world
link |
01:37:44.080
and helping in many different ways.
link |
01:37:49.080
I think if you say extrapolate to many years in the future,
link |
01:37:59.080
I think work will become optional.
link |
01:38:04.080
There's a lot of jobs that if people weren't paid to do it,
link |
01:38:10.080
they wouldn't do it.
link |
01:38:12.080
It's not fun necessarily.
link |
01:38:14.080
If you're washing dishes all day,
link |
01:38:16.080
it's like, you know, even if you really like washing dishes,
link |
01:38:19.080
do you really want to do it for eight hours a day, every day?
link |
01:38:22.080
Probably not.
link |
01:38:25.080
And then there's like dangerous work.
link |
01:38:27.080
And basically, if it's dangerous, boring,
link |
01:38:30.080
it has like potential for repetitive stress injury, that kind of thing.
link |
01:38:34.080
Then that's really where humanoid robots would add the most value initially.
link |
01:38:40.080
So that's what we're aiming for is for the humanoid robots to do jobs
link |
01:38:47.080
that people don't voluntarily want to do.
link |
01:38:51.080
And then we'll have to pair that obviously
link |
01:38:53.080
with some kind of universal basic income in the future.
link |
01:38:56.080
So I think.
link |
01:39:00.080
So do you see a world when there's like hundreds of millions of Tesla bots
link |
01:39:05.080
doing different, performing different tasks throughout the world?
link |
01:39:12.080
Yeah, I haven't really thought about it that far into the future,
link |
01:39:14.080
but I guess there may be something like that.
link |
01:39:17.080
So.
link |
01:39:20.080
Can I ask a wild question?
link |
01:39:22.080
So the number of Tesla cars has been accelerating.
link |
01:39:25.080
There's been close to two million produced.
link |
01:39:28.080
Many of them have autopilot.
link |
01:39:30.080
I think we're over two million now.
link |
01:39:32.080
Do you think there will ever be a time when there will be more Tesla bots
link |
01:39:36.080
than Tesla cars?
link |
01:39:40.080
Yeah.
link |
01:39:42.080
Actually, it's funny you asked this question,
link |
01:39:44.080
because normally I do try to think pretty far into the future,
link |
01:39:47.080
but I haven't really thought that far into the future with the Tesla bot,
link |
01:39:52.080
or it's codenamed Optimus.
link |
01:39:55.080
I call it Optimus subprime.
link |
01:39:59.080
It's not like a giant transformer robot.
link |
01:40:04.080
So.
link |
01:40:07.080
But it's meant to be a general purpose, helpful bot.
link |
01:40:14.080
And basically, like the thing is,
link |
01:40:18.080
Tesla, I think, has the most advanced real world AI
link |
01:40:25.080
for interacting with the real world,
link |
01:40:26.080
which was developed as a function of making self driving work.
link |
01:40:30.080
And so along with custom hardware and like a lot of, you know,
link |
01:40:36.080
hardcore low level software to have it run efficiently
link |
01:40:39.080
and be power efficient because it's one thing to do neural nets
link |
01:40:43.080
if you've got a gigantic server room with 10,000 computers.
link |
01:40:45.080
But now let's say you have to distill that down
link |
01:40:48.080
into one computer that's running at low power in a humanoid robot or a car.
link |
01:40:53.080
That's actually very difficult.
link |
01:40:54.080
A lot of hardcore software work is required for that.
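One generic way to frame that distillation problem is teacher-student training: a large model trained on the cluster supervises a much smaller one that has to run on a single low-power computer. The sketch below is a standard textbook version with made-up model sizes and hyperparameters, not Tesla's actual method.

```python
# Generic knowledge-distillation sketch (illustrative, not Tesla's approach).
import torch
import torch.nn as nn
import torch.nn.functional as F

teacher = nn.Sequential(nn.Linear(512, 2048), nn.ReLU(), nn.Linear(2048, 10))  # big, cluster-side
student = nn.Sequential(nn.Linear(512, 128), nn.ReLU(), nn.Linear(128, 10))    # small, in-vehicle

def distillation_loss(x, labels, T=4.0, alpha=0.5):
    with torch.no_grad():
        t_logits = teacher(x)          # teacher runs offline at full precision
    s_logits = student(x)              # student is what would ship in the car or robot
    soft = F.kl_div(F.log_softmax(s_logits / T, dim=1),
                    F.softmax(t_logits / T, dim=1),
                    reduction="batchmean") * T * T
    hard = F.cross_entropy(s_logits, labels)
    return alpha * soft + (1 - alpha) * hard

x = torch.randn(32, 512)
labels = torch.randint(0, 10, (32,))
distillation_loss(x, labels).backward()  # only the student's weights receive gradients
```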
link |
01:40:59.080
So since we're kind of like solving the navigate the real world
link |
01:41:05.080
with neural nets problem for cars,
link |
01:41:08.080
which are kind of robots with four wheels,
link |
01:41:10.080
then it's like kind of a natural extension of that is to put it
link |
01:41:14.080
in a robot with arms and legs and actuators.
link |
01:41:20.080
So like the two hard things are: you basically need to
link |
01:41:31.080
have the robot be intelligent enough to interact in a sensible way
link |
01:41:34.080
with the environment.
link |
01:41:36.080
So you need real world AI and you need to be very good at manufacturing,
link |
01:41:43.080
which is a very hard problem.
link |
01:41:44.080
Tesla is very good at manufacturing and also has the real world AI.
link |
01:41:50.080
So making the humanoid robot work basically means developing custom motors
link |
01:41:58.080
and sensors that are different from what a car would use.
link |
01:42:04.080
But we also have, I think, the best expertise
link |
01:42:10.080
in developing advanced electric motors and power electronics.
link |
01:42:15.080
So it just has to be for a humanoid robot application rather than a car.
link |
01:42:22.080
Still, you do talk about love sometimes.
link |
01:42:25.080
So let me ask.
link |
01:42:26.080
This isn't like for like sex robots or something like that.
link |
01:42:29.080
Love is the answer.
link |
01:42:30.080
Yes.
link |
01:42:33.080
There is something compelling to us, not compelling,
link |
01:42:36.080
but we connect with humanoid robots or even like robots like with the dog
link |
01:42:41.080
and shapes of dogs.
link |
01:42:43.080
It just it seems like, you know, there's a huge amount of loneliness in this world.
link |
01:42:48.080
All of us seek companionship with other humans, friendship
link |
01:42:51.080
and all those kinds of things.
link |
01:42:52.080
We have a lot of here in Austin, a lot of people have dogs.
link |
01:42:56.080
There seems to be a huge opportunity to also have robots that decrease
link |
01:43:01.080
the amount of loneliness in the world or help us humans connect with each other.
link |
01:43:09.080
So in a way that dogs can.
link |
01:43:12.080
Do you think about that with TeslaBot at all?
link |
01:43:14.080
Or is it really focused on the problem of performing specific tasks,
link |
01:43:19.080
not connecting with humans?
link |
01:43:23.080
I mean, to be honest, I have not actually thought about it from the companionship
link |
01:43:27.080
standpoint, but I think it could actually end up being
link |
01:43:31.080
a very good companion.
link |
01:43:34.080
And it could develop like a personality over time that is unique,
link |
01:43:44.080
like, you know, it's not like all the robots are just the same,
link |
01:43:47.080
and that personality could evolve to be, you know,
link |
01:43:53.080
match the owner or the, you know, yes, the owner.
link |
01:43:59.080
Well, whatever you want to call it.
link |
01:44:02.080
The other half, right?
link |
01:44:05.080
In the same way that friends do.
link |
01:44:06.080
See, I think that's a huge opportunity.
link |
01:44:09.080
Yeah, no, that's interesting.
link |
01:44:14.080
Because, you know, like there's a Japanese phrase I like, Wabi Sabi,
link |
01:44:19.080
you know, the subtle imperfections are what makes something special.
link |
01:44:23.080
And the subtle imperfections of the personality of the robot mapped
link |
01:44:28.080
to the subtle imperfections of the robot's human friend.
link |
01:44:34.080
I don't know, owner sounds like maybe the wrong word,
link |
01:44:36.080
but could actually make an incredible buddy, basically.
link |
01:44:42.080
In that way, the imperfections.
link |
01:44:43.080
Like R2D2 or like a C3PO sort of thing, you know.
link |
01:44:46.080
So from a machine learning perspective,
link |
01:44:49.080
I think the flaws being a feature is really nice.
link |
01:44:53.080
You could be quite terrible at being a robot for quite a while
link |
01:44:57.080
in the general home environment or in general world.
link |
01:45:00.080
And that's kind of adorable.
link |
01:45:02.080
And that's like, those are your flaws and you fall in love with those flaws.
link |
01:45:06.080
So it's very different than autonomous driving
link |
01:45:09.080
where it's a very high stakes environment you cannot mess up.
link |
01:45:13.080
And so it's more fun to be a robot in the home.
link |
01:45:17.080
Yeah, in fact, if you think of like C3PO and R2D2,
link |
01:45:21.080
like they actually had a lot of like flaws and imperfections
link |
01:45:24.080
and silly things and they would argue with each other.
link |
01:45:29.080
Were they actually good at doing anything?
link |
01:45:32.080
I'm not exactly sure.
link |
01:45:34.080
They definitely added a lot to the story.
link |
01:45:38.080
But there's sort of quirky elements and, you know,
link |
01:45:43.080
that they would like make mistakes and do things.
link |
01:45:45.080
It was like it made them relatable, I don't know, endearing.
link |
01:45:52.080
So yeah, I think that that could be something that probably would happen.
link |
01:45:59.080
But our initial focus is just to make it useful.
link |
01:46:03.080
So I'm confident we'll get it done.
link |
01:46:06.080
I'm not sure what the exact timeframe is,
link |
01:46:08.080
but like we'll probably have, I don't know,
link |
01:46:11.080
a decent prototype towards the end of next year or something like that.
link |
01:46:15.080
And it's cool that it's connected to Tesla, the car.
link |
01:46:19.080
Yeah, it's using a lot of, you know,
link |
01:46:22.080
it would use the autopilot inference computer
link |
01:46:25.080
and a lot of the training that we've done for cars
link |
01:46:29.080
in terms of recognizing real world things
link |
01:46:32.080
could be applied directly to the robot.
link |
01:46:38.080
But there's a lot of custom actuators and sensors that need to be developed.
link |
01:46:42.080
And an extra module on top of the vector space for love.
link |
01:46:47.080
Yeah.
link |
01:46:48.080
That's what I'm saying.
link |
01:46:51.080
We can add that to the car too.
link |
01:46:53.080
That's true.
link |
01:46:55.080
Yeah, it could be useful in all environments.
link |
01:46:57.080
Like you said, a lot of people argue in the car,
link |
01:46:59.080
so maybe we can help them out.
link |
01:47:02.080
You're a student of history,
link |
01:47:03.080
fan of Dan Carlin's Hardcore History podcast.
link |
01:47:06.080
Yeah, it's great.
link |
01:47:07.080
Greatest podcast ever?
link |
01:47:08.080
Yeah, I think it is actually.
link |
01:47:11.080
It almost doesn't really count as a podcast.
link |
01:47:14.080
It's more like an audio book.
link |
01:47:16.080
So you were on the podcast with Dan.
link |
01:47:18.080
I just had a chat with him about it.
link |
01:47:21.080
He said you guys went military and all that kind of stuff.
link |
01:47:23.080
Yeah, it was basically, it should be titled Engineer Wars, essentially.
link |
01:47:32.080
Like when there's a rapid change in the rate of technology,
link |
01:47:36.080
then engineering plays a pivotal role in victory and battle.
link |
01:47:43.080
How far back in history did you go?
link |
01:47:45.080
Did you go World War II?
link |
01:47:47.080
Well, it was supposed to be a deep dive on fighters and bomber technology in World War II,
link |
01:47:55.080
but that ended up being more wide ranging than that,
link |
01:47:58.080
because I just went down the total rathole of studying all of the fighters and bombers of World War II
link |
01:48:04.080
and the constant rock, paper, scissors game that one country would make this plane,
link |
01:48:10.080
then it would make a plane to beat that, and that country would make a plane to beat that.
link |
01:48:15.080
And really what matters is the pace of innovation
link |
01:48:18.080
and also access to high quality fuel and raw materials.
link |
01:48:25.080
Germany had some amazing designs, but they couldn't make them
link |
01:48:29.080
because they couldn't get the raw materials,
link |
01:48:31.080
and they had a real problem with the oil and fuel, basically.
link |
01:48:37.080
The fuel quality was extremely variable.
link |
01:48:40.080
So the design wasn't the bottleneck?
link |
01:48:42.080
Yeah, the U.S. had kickass fuel that was very consistent.
link |
01:48:47.080
The problem is if you make a very high performance aircraft engine,
link |
01:48:50.080
in order to make it high performance, the fuel, the aviation gas,
link |
01:48:59.080
has to be a consistent mixture and it has to have a high octane.
link |
01:49:07.080
High octane is the most important thing, but it also can't have impurities and stuff
link |
01:49:11.080
because you'll foul up the engine.
link |
01:49:14.080
And Germany just never had good access to oil.
link |
01:49:16.080
They tried to get it by invading the Caucasus, but that didn't work too well.
link |
01:49:22.080
It never worked so well.
link |
01:49:23.080
It didn't work out for them.
link |
01:49:24.080
See you, Jerry.
link |
01:49:26.080
Nice to meet you.
link |
01:49:28.080
Germany was always struggling with basically shitty oil,
link |
01:49:31.080
and they couldn't count on high quality fuel for their aircraft,
link |
01:49:37.080
so they had to have all these additives and stuff.
link |
01:49:43.080
Whereas the U.S. had awesome fuel, and they provided that to Britain as well.
link |
01:49:48.080
So that allowed the British and the Americans to design aircraft engines
link |
01:49:53.080
that were super high performance, better than anything else in the world.
link |
01:49:58.080
Germany could design the engines, they just didn't have the fuel.
link |
01:50:01.080
And then also the quality of the aluminum alloys that they were getting
link |
01:50:06.080
was also not that great.
link |
01:50:09.080
Is this like, you talked about all this with Dan?
link |
01:50:11.080
Yep.
link |
01:50:12.080
Awesome.
link |
01:50:13.080
Broadly looking at history, when you look at Genghis Khan,
link |
01:50:16.080
when you look at Stalin, Hitler, the darkest moments of human history,
link |
01:50:22.080
what do you take away from those moments?
link |
01:50:24.080
Does it help you gain insight about human nature, about human behavior today,
link |
01:50:28.080
whether it's the wars or the individuals or just the behavior of people,
link |
01:50:32.080
any aspects of history?
link |
01:50:41.080
Yeah, I find history fascinating.
link |
01:50:49.080
There's just a lot of incredible things that have been done, good and bad,
link |
01:50:54.080
that help you understand the nature of civilization and individuals.
link |
01:51:06.080
Does it make you sad that humans do these kinds of things to each other?
link |
01:51:09.080
You look at the 20th century, World War II, the cruelty, the abuse of power,
link |
01:51:15.080
talk about communism, Marxism, and Stalin.
link |
01:51:20.080
I mean, there's a lot of human history.
link |
01:51:24.080
Most of it is actually people just getting on with their lives,
link |
01:51:28.080
and it's not like human history is just nonstop war and disaster.
link |
01:51:35.080
Those are actually just, those are intermittent and rare.
link |
01:51:38.080
If they weren't, then humans would soon cease to exist.
link |
01:51:46.080
But it's just that wars tend to be written about a lot,
link |
01:51:50.080
whereas something being like, well,
link |
01:51:54.080
a normal year where nothing major happened doesn't get written about much.
link |
01:51:58.080
But that's, most people just like farming and kind of living their life,
link |
01:52:04.080
being a villager somewhere.
link |
01:52:09.080
And every now and again, there's a war.
link |
01:52:16.080
And I would have to say, there aren't very many books where I just had to stop reading
link |
01:52:23.080
because it was just too dark.
link |
01:52:26.080
But the book about Stalin, The Court of the Red Tsar, I had to stop reading.
link |
01:52:32.080
It was just too dark and rough.
link |
01:52:37.080
Yeah.
link |
01:52:39.080
The 30s, there's a lot of lessons there to me,
link |
01:52:44.080
in particular that it feels like humans,
link |
01:52:48.080
like all of us have that, it's the old Solzhenitsyn line,
link |
01:52:53.080
that the line between good and evil runs through the heart of every man,
link |
01:52:56.080
that all of us are capable of evil, all of us are capable of good.
link |
01:52:59.080
It's almost like this kind of responsibility that all of us have
link |
01:53:04.080
to tend towards the good.
link |
01:53:07.080
And so to me, looking at history is almost like an example of,
link |
01:53:11.080
look, you have some charismatic leader that convinces you of things.
link |
01:53:16.080
It's too easy, based on that story, to do evil onto each other,
link |
01:53:21.080
onto your family, onto others.
link |
01:53:23.080
And so it's like our responsibility to do good.
link |
01:53:26.080
It's not like now is somehow different from history.
link |
01:53:29.080
That can happen again. All of it can happen again.
link |
01:53:32.080
And yes, most of the time, you're right,
link |
01:53:35.080
the optimistic view here is mostly people are just living life.
link |
01:53:39.080
And as you've often memed about, the quality of life was way worse
link |
01:53:44.080
back in the day, and this keeps improving over time
link |
01:53:47.080
through innovation, through technology.
link |
01:53:49.080
But still, it's somehow notable that these blips of atrocities happen.
link |
01:53:54.080
Sure.
link |
01:53:56.080
Yeah, I mean, life was really tough for most of history.
link |
01:54:02.080
I mean, for most of human history, a good year would be one
link |
01:54:07.080
where not that many people in your village died of the plague,
link |
01:54:11.080
starvation, freezing to death, or being killed by a neighboring village.
link |
01:54:16.080
It's like, well, it wasn't that bad.
link |
01:54:18.080
It was only like we lost 5% this year.
link |
01:54:20.080
That was a good year.
link |
01:54:23.080
That would be par for the course.
link |
01:54:25.080
Just not starving to death would have been the primary goal
link |
01:54:28.080
of most people throughout history,
link |
01:54:31.080
is making sure we'll have enough food to last through the winter
link |
01:54:34.080
and not freeze or whatever.
link |
01:54:36.080
So, now food is plentiful.
link |
01:54:42.080
I have an obesity problem.
link |
01:54:46.080
Well, yeah, the lesson there is to be grateful for the way things are now
link |
01:54:50.080
for some of us.
link |
01:54:53.080
We've spoken about this offline.
link |
01:54:56.080
I'd love to get your thought about it here.
link |
01:55:00.080
If I sat down for a long form in person conversation with the President of Russia,
link |
01:55:05.080
Vladimir Putin, would you potentially want to call in for a few minutes
link |
01:55:10.080
to join in on a conversation with him, moderated and translated by me?
link |
01:55:15.080
Sure, yeah.
link |
01:55:16.080
Sure, I'd be happy to do that.
link |
01:55:19.080
You've shown interest in the Russian language.
link |
01:55:22.080
Is this grounded in your interest in history of linguistics, culture, general curiosity?
link |
01:55:27.080
I think it sounds cool.
link |
01:55:29.080
Sounds cool and that looks cool.
link |
01:55:32.080
Well, it takes a moment to read Cyrillic.
link |
01:55:39.080
Once you know what the Cyrillic characters stand for,
link |
01:55:43.080
actually then reading Russian becomes a lot easier
link |
01:55:47.080
because there are a lot of words that are actually the same.
link |
01:55:49.080
Like bank is bank.
link |
01:55:54.080
So find the words that are exactly the same and now you start to understand Cyrillic.
link |
01:55:59.080
If you can sound it out, there's at least some commonality of words.
link |
01:56:06.080
What about the culture?
link |
01:56:09.080
You love great engineering, physics.
link |
01:56:12.080
There's a tradition of the sciences there.
link |
01:56:14.080
You look at the 20th century from rocketry.
link |
01:56:17.080
Some of the greatest rockets, some of the space exploration has been done in the former Soviet Union.
link |
01:56:24.080
So do you draw inspiration from that history?
link |
01:56:27.080
Just how this culture that in many ways, one of the sad things is because of the language,
link |
01:56:33.080
a lot of it is lost to history because it's not translated.
link |
01:56:37.080
Because it is in some ways an isolated culture.
link |
01:56:40.080
It flourishes within its borders.
link |
01:56:45.080
So do you draw inspiration from those folks, from the history of science engineering there?
link |
01:56:51.080
The Soviet Union, Russia and Ukraine as well have a really strong history in space flight.
link |
01:57:02.080
Some of the most advanced and impressive things in history were done by the Soviet Union.
link |
01:57:11.080
So one cannot help but admire the impressive rocket technology that was developed.
link |
01:57:20.080
After the fall of the Soviet Union, much less has happened since then.
link |
01:57:29.080
But still things are happening, but it's not quite at the frenetic pace that it was happening before the Soviet Union kind of dissolved into separate republics.
link |
01:57:46.080
Yeah, there's Roscosmos, the Russian agency.
link |
01:57:52.080
I look forward to a time when those countries with China are working together.
link |
01:57:57.080
The United States are all working together.
link |
01:58:00.080
Maybe a little bit of friendly competition.
link |
01:58:02.080
I think friendly competition is good.
link |
01:58:04.080
Governments are slow and the only thing slower than one government is a collection of governments.
link |
01:58:09.080
So the Olympics would be boring if everyone just crossed the finishing line at the same time.
link |
01:58:16.080
Nobody would watch.
link |
01:58:18.080
And people wouldn't try hard to run fast and stuff.
link |
01:58:22.080
So I think friendly competition is a good thing.
link |
01:58:25.080
This is also a good place to give a shout out to a video titled,
link |
01:58:29.080
The Entire Soviet Rocket Engine Family Tree by Tim Dodd, AKA Everyday Astronaut.
link |
01:58:34.080
It's like an hour and a half.
link |
01:58:35.080
It gives the full history of Soviet rockets.
link |
01:58:38.080
And people should definitely go check out and support Tim in general.
link |
01:58:41.080
That guy is super excited about the future, super excited about space flight.
link |
01:58:45.080
Every time I see anything by him, I just have a stupid smile on my face because he's so excited about stuff.
link |
01:58:50.080
Yeah, Tim Dodd is really great.
link |
01:58:53.080
If you're interested in anything to do with space, he's, in terms of explaining rocket technology to your average person, he's awesome.
link |
01:59:02.080
The best, I'd say.
link |
01:59:04.080
And I should say, like, part of the reason I switched us, Raptor at one point was going to be a hydrogen engine.
link |
01:59:14.080
But hydrogen has a lot of challenges.
link |
01:59:17.080
It's very low density.
link |
01:59:18.080
It's a deep cryogen.
link |
01:59:19.080
So it's only liquid very, very close to absolute zero.
link |
01:59:23.080
Requires a lot of insulation.
link |
01:59:26.080
So there are a lot of challenges there.
link |
01:59:30.080
And I was actually reading a bit about Russian rocket engine developments.
link |
01:59:35.080
And at least the impression I had was that the Soviet Union, Russia and Ukraine primarily were actually in the process of switching to Methalox.
link |
01:59:50.080
And there were some interesting tests and data for ISP.
link |
01:59:55.080
Like they were able to get like up to like a 380 second ISP with the Methalox engine.
link |
02:00:01.080
And I was like, well, OK, that's actually really impressive.
link |
02:00:05.080
So I think you could actually get a much lower cost, like in optimizing cost per ton to orbit, cost per ton to Mars.
link |
02:00:19.080
I think Methalox is the way to go.
link |
02:00:25.080
And I was partly inspired by the Russian work on the test stands with Methalox engines.
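For a back-of-the-envelope sense of why that roughly 380 second specific impulse matters, the Tsiolkovsky rocket equation ties Isp directly to the required mass ratio; the 330 s comparison value and the 9 km/s delta-v target below are assumed round numbers for illustration, not figures from the conversation.

```python
# Rocket-equation sketch: higher Isp -> smaller required mass ratio -> cheaper per ton.
import math

G0 = 9.80665  # standard gravity, m/s^2

def mass_ratio(delta_v, isp_seconds):
    """Initial/final mass ratio required for a given delta-v (Tsiolkovsky)."""
    return math.exp(delta_v / (isp_seconds * G0))

DELTA_V = 9000.0  # rough delta-v to low Earth orbit, m/s (assumed)

for isp in (330.0, 380.0):
    r = mass_ratio(DELTA_V, isp)
    print(f"Isp {isp:.0f} s -> mass ratio {r:.1f}, propellant fraction {1 - 1 / r:.1%}")
# Every bit of propellant fraction saved is mass that can go to structure and payload,
# which is what drives cost per ton to orbit (and to Mars) down.
```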
link |
02:00:32.080
And now for something completely different.
link |
02:00:35.080
Do you mind doing a bit of a meme review in the spirit of the great, the powerful PewDiePie?
link |
02:00:41.080
Let's say 1 to 11.
link |
02:00:42.080
Just go over a few documents printed out.
link |
02:00:45.080
We can try.
link |
02:00:46.080
Let's try this.
link |
02:00:49.080
I present to you document numero uno.
link |
02:00:56.080
I don't know. OK.
link |
02:00:58.080
Vlad the Impaler discovers marshmallows.
link |
02:01:03.080
That's not bad.
link |
02:01:07.080
So you get it? Because he likes impaling things.
link |
02:01:11.080
Yes, I get it.
link |
02:01:12.080
I don't know, three, whatever.
link |
02:01:14.080
That's not very good.
link |
02:01:19.080
This is grounded in some engineering, some history.
link |
02:01:28.080
Yeah, give us an eight out of ten.
link |
02:01:31.080
What do you think about nuclear power?
link |
02:01:33.080
I'm in favor of nuclear power.
link |
02:01:34.080
I think in a place that is not subject to extreme natural disasters, I think nuclear power is a great way to generate electricity.
link |
02:01:47.080
I don't think we should be shutting down nuclear power stations.
link |
02:01:51.080
Yeah, but what about Chernobyl?
link |
02:01:53.080
Exactly.
link |
02:01:56.080
I think there's a lot of fear of radiation and stuff.
link |
02:02:04.080
The problem is a lot of people just don't study engineering or physics.
link |
02:02:13.080
Just the word radiation just sounds scary.
link |
02:02:16.080
They can't calibrate what radiation means.
link |
02:02:21.080
But radiation is much less dangerous than you think.
link |
02:02:30.080
For example, Fukushima, when the Fukushima problem happened due to the tsunami,
link |
02:02:41.080
I got people in California asking me if they should worry about radiation from Fukushima.
link |
02:02:46.080
I'm like, definitely not, not even slightly, not at all.
link |
02:02:51.080
That is crazy.
link |
02:02:54.080
Just to show this is how the danger is so much overplayed compared to what it really is that I actually flew to Fukushima.
link |
02:03:09.080
I donated a solar power system for a water treatment plant, and I made a point of eating locally grown vegetables on TV in Fukushima.
link |
02:03:28.080
I'm still alive.
link |
02:03:31.080
So it's not even that the risk of these events is low, but the impact of them is...
link |
02:03:36.080
The impact is greatly exaggerated.
link |
02:03:38.080
It's human nature.
link |
02:03:40.080
People don't know what radiation is.
link |
02:03:42.080
I've had people ask me, what about radiation from cell phones causing brain cancer?
link |
02:03:46.080
I'm like, when you say radiation, do you mean photons or particles?
link |
02:03:49.080
They're like, I don't know, what do you mean photons or particles?
link |
02:03:52.080
Do you mean, let's say, photons?
link |
02:03:56.080
What frequency or wavelength?
link |
02:03:59.080
And they're like, I have no idea.
link |
02:04:01.080
Do you know that everything's radiating all the time?
link |
02:04:04.080
What do you mean?
link |
02:04:06.080
Like, yeah, everything's radiating all the time.
link |
02:04:08.080
Photons are being emitted by all objects all the time, basically.
link |
02:04:12.080
And if you want to know what it means to stand in front of nuclear fire, go outside.
link |
02:04:21.080
The sun is a gigantic thermonuclear reactor that you're staring right at it.
link |
02:04:28.080
Are you still alive?
link |
02:04:29.080
Yes.
link |
02:04:30.080
Okay, amazing.
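The "photons or particles, what frequency?" point can be made with a one-line photon-energy calculation: a radio-frequency photon from a phone carries far too little energy to ionize anything, unlike ultraviolet. The 2 GHz carrier frequency below is an assumed typical value, and this is only an illustration, not a dosimetry analysis.

```python
# Photon energy E = h * f, compared for a phone signal and near-ultraviolet light.
PLANCK_H = 6.62607015e-34   # J*s
EV = 1.602176634e-19        # joules per electronvolt

def photon_energy_ev(frequency_hz):
    return PLANCK_H * frequency_hz / EV

rf = photon_energy_ev(2.0e9)    # ~2 GHz phone carrier (assumed)
uv = photon_energy_ev(1.0e15)   # ~300 nm ultraviolet, near the ionizing range

print(f"RF photon: {rf:.2e} eV")  # ~8e-6 eV, millions of times below ionization energies
print(f"UV photon: {uv:.2f} eV")  # a few eV, where photons start to drive chemistry
```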
link |
02:04:32.080
Yeah, I guess radiation is one of the words that could be used as a tool to fear monger by certain people.
link |
02:04:39.080
That's it.
link |
02:04:40.080
I think people just don't understand.
link |
02:04:42.080
I mean, that's the way to fight that fear, I suppose, is to understand, is to learn.
link |
02:04:46.080
Yeah, just say, okay, how many people have actually died from nuclear accidents?
link |
02:04:50.080
It's practically nothing.
link |
02:04:51.080
And say, how many people have died from coal plants?
link |
02:04:57.080
And it's a very big number.
link |
02:04:59.080
So, like, obviously we should not be starting up coal plants and shutting down nuclear plants.
link |
02:05:05.080
It just doesn't make any sense at all.
link |
02:05:08.080
Coal plants are, like, I don't know, 100 to 1,000 times worse for health than nuclear power plants.
link |
02:05:15.080
You want to go to the next one?
link |
02:05:16.080
This is really bad.
link |
02:05:20.080
It's 90, 180, and 360 degrees.
link |
02:05:24.080
Everybody loves the math.
link |
02:05:25.080
Nobody gives a shit about 270.
link |
02:05:28.080
It's not super funny.
link |
02:05:30.080
I don't know, like, 2 or 3.
link |
02:05:32.080
This is not a, you know, LOL situation.
link |
02:05:37.080
Yeah.
link |
02:05:43.080
That was pretty good.
link |
02:05:44.080
The United States oscillating between establishing and destroying dictatorships.
link |
02:05:48.080
It's like a metronome.
link |
02:05:49.080
Is that a metronome?
link |
02:05:50.080
Yeah, it's out of 7 out of 10.
link |
02:05:53.080
It's kind of true.
link |
02:05:54.080
Oh, yeah, this is kind of personal for me.
link |
02:05:57.080
Next one.
link |
02:05:58.080
Oh, man.
link |
02:05:59.080
Is this Leica?
link |
02:06:00.080
Yeah.
link |
02:06:01.080
Well, no.
link |
02:06:02.080
Or it's like referring to Leica or something?
link |
02:06:03.080
As Leica's, like, husband.
link |
02:06:06.080
Husband.
link |
02:06:07.080
Yeah, yeah.
link |
02:06:08.080
Hello.
link |
02:06:09.080
Yes, this is dog.
link |
02:06:10.080
Your wife was launched into space.
link |
02:06:11.080
And then the last one is him with his eyes closed and a bottle of vodka.
link |
02:06:16.080
Yeah.
link |
02:06:17.080
Leica didn't come back.
link |
02:06:18.080
No.
link |
02:06:19.080
They don't tell you the full story of, you know, what the impact they had on the loved
link |
02:06:24.080
ones.
link |
02:06:25.080
True.
link |
02:06:26.080
Yeah.
link |
02:06:27.080
It's like 711 for me.
link |
02:06:28.080
Sure.
link |
02:06:29.080
The Soviet shadow.
link |
02:06:30.080
Oh, yeah.
link |
02:06:31.080
This keeps going on the Russian theme.
link |
02:06:33.400
First man in space.
link |
02:06:35.160
Nobody cares.
link |
02:06:36.160
First man on the moon.
link |
02:06:37.160
Well, I think people do care.
link |
02:06:38.160
No, I know.
link |
02:06:39.160
But...
link |
02:06:40.160
There is...
link |
02:06:41.160
Yuri Gagarin's name will be forever in history, I think.
link |
02:06:45.760
There is something special about placing, like, stepping foot onto another totally foreign
link |
02:06:52.080
land.
link |
02:06:53.080
It's not the journey, like, people that explore the oceans.
link |
02:06:56.960
It's not as important to explore the oceans as to land on a whole new continent.
link |
02:07:01.400
Yeah.
link |
02:07:02.400
Well, this is about you.
link |
02:07:05.080
Oh, yeah, I'd love to get your comment on this.
link |
02:07:08.200
Elon Musk, after sending 6.6 billion dollars to the UN to end world hunger, you have three
link |
02:07:14.680
hours.
link |
02:07:15.680
Yeah, well, I mean, obviously, 6 billion dollars is not going to end world hunger.
link |
02:07:24.240
So I mean, the reality is at this point, the world is producing far more food than it can
link |
02:07:30.680
really consume.
link |
02:07:31.680
Like, we don't have a caloric constraint at this point.
link |
02:07:35.920
So where there is hunger, it is almost always due to, like, civil war or strife or something
link |
02:07:43.280
like that. It's extremely rare for it to be just a matter of, like, lack
link |
02:07:52.860
of money.
link |
02:07:53.860
It's like, you know, it's like some civil war in some country and like one part of the
link |
02:07:59.000
country is literally trying to starve the other part of the country.
link |
02:08:03.040
So it's much more complex than something that money could solve.
link |
02:08:05.840
It's geopolitics.
link |
02:08:07.440
It's a lot of things.
link |
02:08:09.840
It's human nature.
link |
02:08:10.840
It's governments.
link |
02:08:11.840
It's money, monetary systems, all that kind of stuff.
link |
02:08:14.520
Yeah, food is extremely cheap these days.
link |
02:08:17.640
It's like, I mean, the US at this point, you know, among low income families, obesity is
link |
02:08:26.720
actually another problem.
link |
02:08:27.960
It's, like, obesity, it's not hunger.
link |
02:08:31.120
It's like too much, you know, too many calories.
link |
02:08:34.660
So it's not that nobody's hungry anywhere.
link |
02:08:37.880
It's just, this is not a simple matter of adding money and solving it.
link |
02:08:43.960
Hmm.
link |
02:08:44.960
What do you think that one gets?
link |
02:08:48.800
It's getting...
link |
02:08:49.800
Two.
link |
02:08:50.800
We're just going after empires, world, where did you get those artifacts?
link |
02:08:57.560
The British Museum.
link |
02:08:58.560
Shout out to Monty Python.
link |
02:09:01.600
We found them.
link |
02:09:02.600
Yeah.
link |
02:09:03.600
The British Museum is pretty great.
link |
02:09:05.800
I mean, admittedly Britain did take these historical artifacts from all around the world
link |
02:09:10.200
and put them in London, but, you know, it's not like people can't go see them.
link |
02:09:16.120
So London is a convenient place to see these ancient artifacts for, you know, for a large
link |
02:09:23.480
segment of the world.
link |
02:09:25.080
So I think, you know, on balance, the British Museum is a net good, although I'm sure a
link |
02:09:29.640
lot of countries will argue about that.
link |
02:09:31.440
Yeah.
link |
02:09:32.440
It's like you want to make these historical artifacts accessible to as many people as
link |
02:09:35.960
possible and the British Museum, I think, does a good job of that.
link |
02:09:41.520
Even if there's a darker aspect to like the history of empire in general, whatever the
link |
02:09:45.200
empire is, however things were done, it is the history that happened.
link |
02:09:52.120
You can't sort of erase that history, unfortunately.
link |
02:09:54.280
You could just become better in the future.
link |
02:09:55.920
That's the point.
link |
02:09:57.520
Yeah.
link |
02:09:58.520
I mean, it's like, well, how are we going to pass moral judgment on these things?
link |
02:10:04.040
Like it's like if, you know, if one is going to judge, say, the Russian Empire, you've
link |
02:10:10.400
got to judge, you know, what everyone was doing at the time and how were the British
link |
02:10:14.720
relative to everyone.
link |
02:10:18.200
And I think the British would actually get like a relatively good grade, relatively good
link |
02:10:22.640
grade, not in absolute terms, but compared to what everyone else was doing, they were
link |
02:10:30.040
not the worst.
link |
02:10:31.600
Like I said, you got to look at these things in the context of the history at the time
link |
02:10:35.400
and say, what were the alternatives and what are you comparing it against?
link |
02:10:39.200
And I do not think it would be the case that Britain would get a bad grade when looking
link |
02:10:47.220
at history at the time.
link |
02:10:48.560
You know, if you judge history from, you know, from what is morally acceptable today, you
link |
02:10:56.280
basically are going to give everyone a failing grade.
link |
02:10:58.400
I'm not clear.
link |
02:10:59.400
I don't think anyone would get a passing grade in their morality. Like if you
link |
02:11:04.720
could go back 300 years ago, like who's getting a passing grade?
link |
02:11:08.960
Basically no one.
link |
02:11:10.760
And we might not get a passing grade from generations that come after us.
link |
02:11:16.280
What does that one get?
link |
02:11:18.880
Sure.
link |
02:11:19.880
Six, seven.
link |
02:11:20.880
For the Monty Python, maybe.
link |
02:11:21.880
I always love Monty Python.
link |
02:11:22.880
They're great.
link |
02:11:23.880
The Life of Brian and the Quest of the Holy Grail are incredible.
link |
02:11:28.880
Yeah.
link |
02:11:29.880
Yeah.
link |
02:11:30.880
Yeah.
link |
02:11:31.880
Those serious eyebrows.
link |
02:11:32.880
How important do you think is facial hair to great leadership?
link |
02:11:37.480
Well, you got a new haircut.
link |
02:11:41.480
How does that affect your leadership?
link |
02:11:42.480
I don't know.
link |
02:11:43.480
Hopefully not.
link |
02:11:44.480
It doesn't.
link |
02:11:45.480
Is that the second no one?
link |
02:11:46.480
Yeah.
link |
02:11:47.480
The second is no one.
link |
02:11:48.480
There is no one competing with Brezhnev.
link |
02:11:49.480
No one, too.
link |
02:11:50.480
Those are like epic eyebrows.
link |
02:11:51.480
Yeah.
link |
02:11:52.480
Sure.
link |
02:11:53.480
That's ridiculous.
link |
02:11:54.480
Give it a six or seven, I don't know.
link |
02:11:55.480
I like this Shakespearean analysis of memes.
link |
02:11:56.480
Brezhnev, he had a flair for drama as well.
link |
02:11:57.480
Like, you know, showmanship.
link |
02:11:58.480
Yeah.
link |
02:11:59.480
Yeah.
link |
02:12:00.480
It must come from the eyebrows.
link |
02:12:01.480
All right.
link |
02:12:02.480
Invention, great engineering, look what I invented, that's the best thing since ripped
link |
02:12:20.960
up bread.
link |
02:12:21.960
Yeah.
link |
02:12:22.960
Because they invented sliced bread, am I just explaining memes at this point?
link |
02:12:29.000
This is what my life has become, you know, like a scribe that like
link |
02:12:40.000
runs around with the kings and just like writes down memes.
link |
02:12:44.120
I mean, when was the cheeseburger invented?
link |
02:12:46.280
That's like an epic invention.
link |
02:12:47.280
Yeah.
link |
02:12:48.280
Like, like, wow.
link |
02:12:49.280
Yeah.
link |
02:12:50.280
Versus just like a burger or a burger, I guess a burger in general is like, you know, then
link |
02:12:57.680
there's like, what is a burger, what's a sandwich, and then you start getting into, is a pizza a sandwich,
link |
02:13:01.800
and what is the original, it gets into an ontology argument.
link |
02:13:05.720
Yeah.
link |
02:13:06.720
But everybody knows like if you order like a burger or cheeseburger or whatever and you
link |
02:13:08.920
like, you got like, you know, tomato and some lettuce and onions and whatever and, you know,
link |
02:13:14.000
mayo and ketchup and mustard, it's like epic.
link |
02:13:16.040
Yeah.
link |
02:13:17.040
But I'm sure they've had bread and meat separately for a long time and it was kind of a burger
link |
02:13:21.080
on the same plate, but somebody who actually combined them into the same thing and then
link |
02:13:25.960
you bite it and hold it makes it convenient.
link |
02:13:29.040
It's a materials problem.
link |
02:13:30.040
Yeah.
link |
02:13:31.040
Like your hands don't get dirty and whatever.
link |
02:13:33.520
Yeah.
link |
02:13:34.520
It's brilliant.
link |
02:13:35.520
Well, that is not what I would have guessed, but everyone knows like you, if you order
link |
02:13:43.280
a cheeseburger, you know what you're getting, you know, it's not like some obtuse, like,
link |
02:13:46.800
well, I wonder what I'll get, you know, um, you know, uh, fries are, I mean, great.
link |
02:13:52.800
I mean, they were the devil, but fries are awesome.
link |
02:13:56.320
And uh, yeah, pizza is incredible.
link |
02:14:00.320
Food innovation doesn't get enough love, I guess is what we're getting at.
link |
02:14:05.480
Great.
link |
02:14:06.480
Um, uh, what about the, uh, Matthew McConaughey, Austinite here, uh, president Kennedy, do
link |
02:14:13.520
you know how to put men on the moon yet?
link |
02:14:15.200
NASA?
link |
02:14:16.200
No.
link |
02:14:17.200
President Kennedy, it'd be a lot cooler if you did.
link |
02:14:19.440
Pretty much sure, six, six or seven, I suppose.
link |
02:14:25.360
All right.
link |
02:14:26.360
And this is the last one that's funny.
link |
02:14:30.880
Someone drew a bunch of dicks all over the walls, Sistine Chapel boys' bathroom.
link |
02:14:35.680
Sure.
link |
02:14:36.680
I'll give it a nine.
link |
02:14:37.680
It's super.
link |
02:14:38.680
It's really true.
link |
02:14:39.680
All right.
link |
02:14:40.680
This is our highest ranking meme for today.
link |
02:14:41.680
I mean, it's true.
link |
02:14:42.680
Like, how do they get away with it?
link |
02:14:44.880
Lots of nakedness.
link |
02:14:45.880
I mean, dick pics are, I mean, just something throughout history.
link |
02:14:49.640
Uh, as long as people can draw things, there's been a dick pic.
link |
02:14:53.080
It's a staple of human history.
link |
02:14:55.080
It's a staple.
link |
02:14:56.080
It's just throughout human history.
link |
02:14:58.840
You tweeted that you aspire to comedy.
link |
02:15:00.680
You're friends with Joe Rogan.
link |
02:15:02.640
Might you, uh, do a short standup comedy set at some point in the future, maybe, um, open
link |
02:15:08.080
for Joe, something like that.
link |
02:15:09.640
Is that, is that...
link |
02:15:10.640
Really?
link |
02:15:11.640
Standup?
link |
02:15:12.640
Actual, just full on standup?
link |
02:15:13.640
Full on standup.
link |
02:15:14.640
Is that in there or is that...
link |
02:15:15.640
It's extremely difficult if, uh, at least that's what, uh, like Joe says and the comedians
link |
02:15:22.840
say.
link |
02:15:23.840
Huh.
link |
02:15:24.840
I wonder if I could, um, I mean, I, I, you know, I, I have done standup for friends,
link |
02:15:32.880
just, uh, impromptu, you know, I'll get, get on like a roof, uh, and they, they do laugh,
link |
02:15:40.320
but they're our friends too.
link |
02:15:41.440
So I don't know if, if you've got, you know, like a room of strangers, are they
link |
02:15:45.320
going to actually also find it funny, but I could try, see what happens.
link |
02:15:51.400
I think you'd learn something either way.
link |
02:15:53.560
Um, yeah.
link |
02:15:54.560
I kind of love, um, both the, when you bomb and when, when you do great, just watching
link |
02:16:00.080
people, how they deal with it, it's so difficult.
link |
02:16:03.320
It's so, you're so fragile up there.
link |
02:16:07.000
It's just you and you, you think you're going to be funny.
link |
02:16:09.760
And when it completely falls flat, it's just, it's beautiful to see people deal with like
link |
02:16:14.400
that.
link |
02:16:15.400
Yeah.
link |
02:16:16.400
I might have enough material to do standup.
link |
02:16:17.400
I've never thought about it, but I might have enough material.
link |
02:16:21.240
Um, I don't know, like 15 minutes or something.
link |
02:16:25.480
Oh yeah.
link |
02:16:26.480
Yeah.
link |
02:16:27.480
Do a Netflix special.
link |
02:16:28.480
Netflix special.
link |
02:16:29.480
Sure.
link |
02:16:30.480
Um, what's your favorite Rick and Morty concept, uh, just to spring that on you.
link |
02:16:36.240
Is there, there's a lot of sort of scientific engineering ideas explored there.
link |
02:16:39.960
There's the butter robot.
link |
02:16:42.880
That's a great, uh, that's a great show.
link |
02:16:44.800
Um, yeah.
link |
02:16:45.800
Rick and Morty is awesome.
link |
02:16:47.440
Somebody that's exactly like you from an alternate dimension showed up there.
link |
02:16:50.560
Elon Tusk.
link |
02:16:51.560
Yeah, that's right.
link |
02:16:52.560
That you voiced.
link |
02:16:53.560
Yeah.
link |
02:16:54.560
Rick and Morty certainly explores a lot of interesting concepts.
link |
02:16:57.200
Uh, so like what's the favorite one?
link |
02:17:00.200
I don't know.
link |
02:17:01.200
The butter robot certainly is, uh, you know, it's like, it's certainly possible to have
link |
02:17:04.960
too much sentience in a device.
link |
02:17:06.840
Um, like you don't want your toaster to be like a super genius toaster.
link |
02:17:12.080
It's going to hate, hate life because all it can do is make toast.
link |
02:17:15.280
But it's like, you don't want to have like a superintelligence stuck in a very limited
link |
02:17:19.720
device.
link |
02:17:20.720
Um, do you think it's too easy from a, if we're talking about from the engineering perspective
link |
02:17:24.960
of super intelligence, like with Marvin the robot, like, is it, it seems like it might
link |
02:17:30.760
be very easy to engineer just the depressed robot.
link |
02:17:33.800
Like it's not obvious how to engineer a robot that's going to find a fulfilling existence.
link |
02:17:40.760
Sometimes humans I suppose, but, um, I wonder if that's like the default, if you don't do
link |
02:17:47.160
a good job on building a robot, it's going to be sad a lot.
link |
02:17:52.800
Well we can reprogram robots easier than we can reprogram humans.
link |
02:17:58.160
So I guess if you let it evolve without tinkering, then it might get sad, uh, but you can change
link |
02:18:06.520
the optimization function and have it be a cheery robot.
link |
02:18:13.000
You uh, like I mentioned with, with SpaceX, you give a lot of people hope and a lot of
link |
02:18:17.560
people look up to you.
link |
02:18:18.560
Millions of people look up to you.
link |
02:18:19.920
Uh, if we think about young people in high school, maybe in college, um, what advice
link |
02:18:26.400
would you give to them about if they want to try to do something big in this world,
link |
02:18:31.680
they want to really have a big positive impact.
link |
02:18:33.680
What advice would you give them about their career, maybe about life in general?
link |
02:18:39.440
Try to be useful.
link |
02:18:40.440
Um, you know, do things that are useful to your fellow human beings, to the world.
link |
02:18:46.480
It's very hard to be useful.
link |
02:18:48.480
Um, very hard.
link |
02:18:51.680
Um, you know, are you contributing more than you consume?
link |
02:18:57.200
You know, like, uh, like can you try to have a positive net contribution to society?
link |
02:19:05.760
Um, I think that's the thing to aim for, you know, not, not to try to be sort of a leader
link |
02:19:11.600
for just for the sake of being a leader or whatever.
link |
02:19:15.200
Um, a lot of times the people you want as leaders are the people
link |
02:19:22.080
who don't want to be leaders.
link |
02:19:24.440
So, um, if you live a useful life, that is a good life, a life worth having lived.
link |
02:19:36.640
Um, you know, and I, like I said, I would, I would encourage people to use the mental
link |
02:19:45.040
tools of physics and apply them broadly in life.
link |
02:19:48.360
There are the best tools.
link |
02:19:49.360
When you think about education and self education, what do you recommend?
link |
02:19:54.040
So there's the university, there's a self study, there is a hands on sort of finding
link |
02:20:01.920
a company or a place or a set of people that do the thing you're passionate about and joining
link |
02:20:05.840
them as early as possible.
link |
02:20:08.080
Um, there's, uh, taking a road trip across Europe for a few years and writing some poetry,
link |
02:20:13.520
which, uh, which, which trajectory do you suggest?
link |
02:20:18.440
In terms of learning about how you can become useful, as you mentioned, how you can have
link |
02:20:24.000
the most positive impact.
link |
02:20:27.320
Well, I encourage people to read a lot of books, just read, basically try to ingest
link |
02:20:39.160
as much information as you can, uh, and try to also just develop a good general knowledge.
link |
02:20:46.000
Um, so, so you at least have like a rough lay of the land of the knowledge landscape.
link |
02:20:54.320
Like try to learn a little bit about a lot of things, um, cause you might not know what
link |
02:20:59.040
you're really interested in.
link |
02:21:00.040
How would you know what you're really interested in if you aren't at least doing a peripheral
link |
02:21:04.280
exploration, broadly, of the knowledge landscape?
link |
02:21:10.240
Um, and you talk to people from different walks of life and different, uh, industries
link |
02:21:17.320
and professions and skills and occupations, like just try to learn as much as possible.
link |
02:21:27.520
Man's search for meaning.
link |
02:21:31.120
Isn't the whole thing a search for meaning?
link |
02:21:32.920
Yeah.
link |
02:21:33.920
What's the meaning of life and all, you know, but just generally, like I said, I would encourage
link |
02:21:38.920
people to read broadly, um, in many different subject areas, um, and, and, and then try
link |
02:21:45.760
to find something where there's an overlap of your talents and, and what you're interested
link |
02:21:51.580
in.
link |
02:21:52.580
So people may be good at something, or they may have skill at a particular
link |
02:21:56.120
thing, but they don't like doing it.
link |
02:21:57.760
Um, so you want to try to find a thing that's a good combination
link |
02:22:04.680
of the things that you're inherently good at, but that you also like doing, um, and,
link |
02:22:12.600
um,
link |
02:22:13.600
And reading is a super fast shortcut to figure out where you're both
link |
02:22:18.680
good at it,
link |
02:22:19.680
you like doing it, and it will actually have positive impact.
link |
02:22:22.680
Well, you got to learn about things somehow.
link |
02:22:25.040
So read, reading a broad range, just really read, read it.
link |
02:22:31.320
You know, at one point as a kid I read through the encyclopedia, uh, so that was pretty helpful.
link |
02:22:38.760
Um, and, uh, there were also a lot of things that I didn't even know existed, so obviously
link |
02:22:44.280
and
link |
02:22:45.280
It's like as broad as it gets.
link |
02:22:46.280
Encyclopedias were digestible, I think, uh, you know, whatever, 40 years ago.
link |
02:22:51.240
Um, so, um, you know, maybe read through the condensed version of the Encyclopedia
link |
02:22:58.360
Britannica.
link |
02:22:59.360
And that, um, you can always like skip subjects or you read a few paragraphs and you know
link |
02:23:05.120
you're not interested, just jump to the next one.
link |
02:23:07.560
So, sort of read the encyclopedia, or scan, skim through it.
link |
02:23:12.360
Um, and, um, but I, you know, I put a lot of stock and certainly have a lot of respect
link |
02:23:19.920
for someone who puts in an honest day's work, uh, to do useful things and, and just generally
link |
02:23:27.480
to have like a, not a zero sum mindset, um, or, uh, like have, have more of a grow the
link |
02:23:34.560
pie mindset.
link |
02:23:35.840
Like, when I see people, perhaps, um,
link |
02:23:42.280
including some very smart people, kind of taking an attitude of, uh, doing
link |
02:23:48.320
things that seem like morally questionable, it's often because they have at a base sort
link |
02:23:53.640
of axiomatic level, a zero sum mindset.
link |
02:23:57.480
Um, and they, without realizing it, have a zero sum mindset,
link |
02:24:02.720
or at least they don't realize it consciously.
link |
02:24:04.920
Um, and so if you have a zero sum mindset, then the only way to get ahead is by taking
link |
02:24:09.260
things from others.
link |
02:24:11.520
If the pie is fixed, then the only way to have more pie
link |
02:24:17.160
is to take someone else's pie.
link |
02:24:19.360
But this is false.
link |
02:24:20.680
Like obviously the pie has grown dramatically over time, the economic pie.
link |
02:24:24.120
Um, so in reality, to overuse this analogy, there's
link |
02:24:31.640
a lot of pie. The pie is not fixed.
link |
02:24:37.960
Um, uh, so you really want to make sure you're not operating, um, without realizing
link |
02:24:43.640
it from a zero sum mindset where, where the only way to get ahead is to take things from
link |
02:24:47.960
others.
link |
02:24:48.960
And you try to take things from others, which is not good.
link |
02:24:52.040
It's much better to work on, uh, adding to the economic pie, maybe, you know, so creating,
link |
02:25:02.280
like I said, creating more than you consume, uh, doing more than you.
link |
02:25:06.200
Yeah.
link |
02:25:07.200
Um, so that's, that's a big deal.
link |
02:25:08.960
Um, I think there's like, you know, a fair number of people in, in finance that, uh,
link |
02:25:15.040
do have a bit of a zero sum mindset.
link |
02:25:16.680
I mean, it's all walks of life.
link |
02:25:19.160
I've seen that one of the, one of the reasons, uh, Rogan inspires me is he celebrates others
link |
02:25:25.680
a lot.
link |
02:25:26.680
He's not creating a constant competition,
link |
02:25:29.400
like there's a scarcity of resources.
link |
02:25:31.400
What happens when you celebrate others and you promote others, the ideas of others,
link |
02:25:36.800
it, uh, it actually grows that pie.
link |
02:25:38.960
I mean, like, uh, the resources become less scarce, and
link |
02:25:45.280
that applies in a lot of kinds of domains.
link |
02:25:46.760
It applies in academia, where a lot of people, uh, see funding for academic
link |
02:25:51.600
research as zero sum, and it is not.
link |
02:25:54.120
If you celebrate each other, if you make, if you get everybody to be excited about AI,
link |
02:25:58.080
about physics, about mathematics, I think there'll be more and more funding and
link |
02:26:02.440
I think everybody wins.
link |
02:26:03.640
Yeah.
link |
02:26:04.640
That applies, I think broadly.
link |
02:26:05.640
Uh,
link |
02:26:06.640
yeah, yeah, exactly.
link |
02:26:08.680
So last, last question about love and meaning, uh, what is the role of love?
link |
02:26:15.240
In the human condition, broadly, and more specifically to you, how has love, romantic love or otherwise
link |
02:26:21.720
made you a better person, a better human being?
link |
02:26:27.360
Better engineer?
link |
02:26:28.360
Now you're asking really perplexing questions.
link |
02:26:31.520
Um, it's hard to give a, I mean, there are many books, poems and songs written about
link |
02:26:41.240
what is love and what is, what exactly, you know, um, you know, what is love, baby don't
link |
02:26:50.960
hurt me.
link |
02:26:51.960
That's one of the great ones.
link |
02:26:53.920
Yes.
link |
02:26:54.920
Yeah.
link |
02:26:55.920
You earlier quoted Shakespeare, but that's really up there.
link |
02:26:58.120
Yeah.
link |
02:26:59.120
Love is a many-splendored thing.
link |
02:27:02.120
I mean, it's because we've talked about so many inspiring things, like being useful
link |
02:27:07.960
in the world, solving problems, alleviating suffering, but it seems like connection
link |
02:27:13.720
between humans is a source of joy, a source of meaning,
link |
02:27:20.320
and that's what love is, friendship, love.
link |
02:27:24.160
I just wonder if you think about that kind of thing where you talk about preserving the
link |
02:27:29.040
light of human consciousness and us becoming a multiplanetary species.
link |
02:27:35.240
I mean, to me at least, that means that if we're just alone and conscious and
link |
02:27:43.240
intelligent, it doesn't mean nearly as much as if we're with others, right?
link |
02:27:48.880
And there's some magic created when we're together, the friendship of
link |
02:27:54.560
it.
link |
02:27:55.560
And I think the highest form of it is love, which I think broadly is much bigger than
link |
02:27:59.440
just the romantic, but also, yes, romantic love and family and those kinds of things.
link |
02:28:05.800
Well, I mean, the reason I guess I care about us becoming a multiplanet species and a spacefaring
link |
02:28:10.420
civilization is that, foundationally, I love humanity, and so I wish to see it
link |
02:28:19.600
prosper and do great things and be happy, and if I did not love humanity, I would
link |
02:28:27.520
not care about these things.
link |
02:28:29.640
So when you look at the whole of it, the human history, all the people who've ever lived,
link |
02:28:35.080
all the people alive now, we're okay.
link |
02:28:41.840
On the whole, we're a pretty interesting bunch.
link |
02:28:44.960
Yeah.
link |
02:28:45.960
All things considered, and I've read a lot of history, including the darkest, worst parts
link |
02:28:51.680
of it, and despite all that, I think on balance, I still love humanity.
link |
02:28:59.560
You joked about it with the 42. What do you think is the meaning of this
link |
02:29:03.280
whole thing?
link |
02:29:05.600
Is there a non-numerical representation?
link |
02:29:08.520
Yeah.
link |
02:29:09.520
Well, really, I think what Douglas Adams was saying in Hitchhiker's Guide to the Galaxy
link |
02:29:13.120
is that the universe is the answer, and what we really need to figure out are
link |
02:29:22.000
what questions to ask about the answer that is the universe, and that the question is
link |
02:29:27.680
really the hard part.
link |
02:29:28.680
And if you can properly frame the question, then the answer, relatively speaking, is easy.
link |
02:29:33.640
So therefore, if you want to understand what questions to ask about the universe,
link |
02:29:40.840
if you want to understand the meaning of life, we need to expand the scope and scale of consciousness
link |
02:29:45.860
so that we're better able to understand the nature of the universe and understand
link |
02:29:51.120
the meaning of life.
link |
02:29:52.580
And ultimately, the most important part would be to ask the right question, thereby elevating
link |
02:30:00.240
the role of the interviewer to that of the most important human in the room.
link |
02:30:07.880
Good questions are... you know, it's hard to come up with good questions.
link |
02:30:12.880
Absolutely.
link |
02:30:13.880
But yeah, that is the foundation of my philosophy:
link |
02:30:20.000
I am curious about the nature of the universe, and obviously I will die.
link |
02:30:27.240
I don't know when I'll die, but I won't live forever.
link |
02:30:31.840
But I would like to know that we are on a path to understanding the nature of the
link |
02:30:36.560
universe and the meaning of life and what questions to ask about the answer that is
link |
02:30:40.040
the universe.
link |
02:30:41.040
And so if we expand the scope and scale of humanity and consciousness in general,
link |
02:30:46.160
which includes silicon consciousness, then that seems
link |
02:30:51.960
like a fundamentally good thing.
link |
02:30:53.920
Elon, like I said, I'm deeply grateful that you would spend your extremely valuable
link |
02:31:00.520
time with me today and also that you have given millions of people hope in this difficult
link |
02:31:07.040
time, this divisive time, this cynical time.
link |
02:31:11.380
So I hope you do continue doing what you're doing.
link |
02:31:14.160
Thank you so much for talking today.
link |
02:31:15.160
Oh, you're welcome.
link |
02:31:16.160
Thanks for your excellent questions.
link |
02:31:18.880
Thanks for listening to this conversation with Elon Musk.
link |
02:31:21.280
To support this podcast, please check out our sponsors in the description.
link |
02:31:25.480
And now let me leave you with some words from Elon Musk himself.
link |
02:31:29.640
When something is important enough, you do it, even if the odds are not in your favor.
link |
02:31:35.840
Thank you for listening and hope to see you next time.