
Elon Musk: SpaceX, Mars, Tesla Autopilot, Self-Driving, Robotics, and AI | Lex Fridman Podcast #252



link |
00:00:00.000
The following is a conversation with Elon Musk.
link |
00:00:02.540
His third time on this, the Lex Fridman podcast.
link |
00:00:06.960
Yeah, make yourself comfortable.
link |
00:00:08.400
Boo.
link |
00:00:09.240
No, wow, okay.
link |
00:00:11.040
So you don't do the headphones thing?
link |
00:00:12.760
No.
link |
00:00:13.600
Okay.
link |
00:00:14.440
I mean, how close do I get?
link |
00:00:15.280
I need to get to this thing.
link |
00:00:16.120
The closer you are, the sexier you sound.
link |
00:00:17.560
Hey, babe.
link |
00:00:18.400
Yep.
link |
00:00:19.240
Can't get enough of your love, babe.
link |
00:00:20.920
I'm gonna clip that out anytime somebody messaged me about it.
link |
00:00:26.720
Yeah, you want my body.
link |
00:00:28.280
And you think I'm sexy.
link |
00:00:30.080
Come right out and tell me so.
link |
00:00:33.560
Do, do, do, do, do.
link |
00:00:38.880
So good.
link |
00:00:39.720
Okay.
link |
00:00:40.560
Serious mode activate.
link |
00:00:41.880
All right.
link |
00:00:43.200
Serious mode.
link |
00:00:44.320
Come on, you're Russian, you can be serious.
link |
00:00:45.720
Yeah, I know.
link |
00:00:46.560
Everyone's serious all the time in Russia.
link |
00:00:47.680
Yeah.
link |
00:00:48.520
Yeah.
link |
00:00:49.360
We'll get there.
link |
00:00:50.600
We'll get there.
link |
00:00:51.920
Yeah.
link |
00:00:52.760
I've gotten soft.
link |
00:00:55.280
Allow me to say that the SpaceX launch
link |
00:00:57.720
of human beings to orbit on May 30th, 2020
link |
00:01:02.120
was seen by many as the first step
link |
00:01:03.880
in a new era of human space exploration.
link |
00:01:07.400
These human spaceflight missions were a beacon of hope
link |
00:01:10.600
to me and to millions over the past two years
link |
00:01:12.720
as our world has been going through
link |
00:01:14.640
one of the most difficult periods in recent human history.
link |
00:01:18.040
We saw, we see the rise of division, fear, cynicism,
link |
00:01:21.720
and the loss of common humanity,
link |
00:01:24.600
right when it is needed most.
link |
00:01:26.360
So first, Elon, let me say thank you
link |
00:01:29.040
for giving the world hope and reason
link |
00:01:30.600
to be excited about the future.
link |
00:01:32.480
Oh, it's kind of you to say.
link |
00:01:34.160
I do want to do that.
link |
00:01:35.440
Humanity has obviously a lot of issues
link |
00:01:38.280
and people at times do bad things,
link |
00:01:42.600
but despite all that, I love humanity
link |
00:01:47.240
and I think we should make sure we do everything
link |
00:01:52.240
we can to have a good future and an exciting future
link |
00:01:54.440
and one that maximizes the happiness of the people.
link |
00:01:58.360
Let me ask about Crew Dragon Demo-2.
link |
00:02:00.960
So that first flight with humans on board,
link |
00:02:04.400
how did you feel leading up to that launch?
link |
00:02:06.200
Were you scared?
link |
00:02:07.280
Were you excited?
link |
00:02:08.320
What was going through your mind?
link |
00:02:09.680
So much was at stake.
link |
00:02:14.080
Yeah, no, that was extremely stressful, no question.
link |
00:02:17.360
We obviously could not let them down in any way.
link |
00:02:21.760
So extremely stressful, I'd say, to say the least.
link |
00:02:28.720
I was confident that at the time that we launched,
link |
00:02:30.880
that no one could think of anything at all to do
link |
00:02:36.840
that would improve the probability of success
link |
00:02:40.240
and we racked our brains to think of any possible way
link |
00:02:43.320
to improve the probability of success.
link |
00:02:44.840
We could not think of anything more and nor could NASA
link |
00:02:48.040
and so that's just the best that we could do.
link |
00:02:51.440
So then we went ahead and launched.
link |
00:02:55.680
Now, I'm not a religious person, but I nonetheless
link |
00:03:00.320
got on my knees and prayed for that mission.
link |
00:03:03.760
Were you able to sleep?
link |
00:03:05.840
No.
link |
00:03:07.600
How did it feel when it was a success?
link |
00:03:10.800
First, when the launch was a success
link |
00:03:12.760
and when they returned back home or back to Earth?
link |
00:03:16.920
It was a great relief.
link |
00:03:20.160
Yeah, for high stress situations,
link |
00:03:23.400
I find it's not so much elation as relief.
link |
00:03:28.400
And I think once, as we got more comfortable and proved out
link |
00:03:33.920
the systems, because we really got to make sure everything
link |
00:03:39.520
works, it was definitely a lot more enjoyable
link |
00:03:43.120
with the subsequent astronaut missions.
link |
00:03:46.440
And I thought the Inspiration4 mission was actually
link |
00:03:50.680
very inspiring.
link |
00:03:53.560
I'd encourage people to watch the Inspiration4
link |
00:03:56.560
documentary on Netflix.
link |
00:03:57.560
It's actually really good.
link |
00:04:00.200
And it really is.
link |
00:04:01.440
I was actually inspired by that.
link |
00:04:03.320
And so that one, I felt I was able to enjoy the actual mission
link |
00:04:09.320
and not just be super stressed all the time.
link |
00:04:10.880
So for people that somehow don't know,
link |
00:04:13.680
it's the first time an all-civilian crew went out
link |
00:04:17.400
to space, out to orbit.
link |
00:04:19.320
Yeah, and it was, I think, the highest orbit
link |
00:04:22.440
in like 30 or 40 years or something.
link |
00:04:26.080
The only one that was higher was the Shuttle, sorry, the
link |
00:04:29.800
Hubble servicing mission.
link |
00:04:32.480
And then before that, it would have been Apollo in '72.
link |
00:04:37.800
It's pretty wild.
link |
00:04:39.120
So it's cool.
link |
00:04:40.480
And I think as a species, we want
link |
00:04:45.000
to be continuing to do better and reach higher ground.
link |
00:04:49.720
And I think it would be extremely tragic
link |
00:04:52.880
if Apollo was the high watermark for humanity.
link |
00:04:57.040
And that's as far as we ever got.
link |
00:05:00.080
And it's concerning that here we are, 49 years
link |
00:05:06.400
after the last mission to the moon.
link |
00:05:09.040
And so almost half a century, and we've not been back.
link |
00:05:14.440
And that's worrying.
link |
00:05:16.240
It's like, does that mean we've peaked as a civilization
link |
00:05:20.240
or what?
link |
00:05:20.920
So I think we've got to get back to the moon
link |
00:05:24.440
and build a base there, a science base.
link |
00:05:27.120
I think we could learn a lot about the nature of the universe
link |
00:05:29.760
if we have a proper science base on the moon.
link |
00:05:33.200
Like we have a science base in Antarctica
link |
00:05:35.480
and many other parts of the world.
link |
00:05:38.040
And so that's the next big thing.
link |
00:05:41.840
We've got to have a serious moon base
link |
00:05:45.640
and then get people to Mars and get out there
link |
00:05:49.920
and be a spacefaring civilization.
link |
00:05:52.520
I'll ask you about some of those details.
link |
00:05:54.440
But since you're so busy with the hard engineering
link |
00:05:57.480
challenges of everything that's involved,
link |
00:06:00.440
are you still able to marvel at the magic of it all,
link |
00:06:02.960
of space travel, of every time the rocket goes up,
link |
00:06:05.840
especially when it's a crewed mission?
link |
00:06:08.160
Or are you just so overwhelmed with all the challenges
link |
00:06:11.440
that you have to solve?
link |
00:06:13.520
And actually, to add to that, the reason
link |
00:06:16.640
that I wanted to ask this question of May 30th,
link |
00:06:19.240
it's been some time.
link |
00:06:20.840
So you can look back and think about the impact already.
link |
00:06:23.520
It's already, at the time, it was an engineering problem,
link |
00:06:26.360
maybe.
link |
00:06:27.240
Now it's becoming a historic moment.
link |
00:06:29.320
Like it's a moment that, how many moments
link |
00:06:31.640
would be remembered about the 21st century?
link |
00:06:33.720
To me, that or something like that,
link |
00:06:37.280
maybe inspiration for one of those
link |
00:06:39.120
would be remembered as the early steps of a new age
link |
00:06:41.600
of space exploration.
link |
00:06:44.200
Yeah, I mean, during the launches itself,
link |
00:06:46.640
so I think maybe some people will know,
link |
00:06:49.040
but a lot of people don't know.
link |
00:06:50.160
It's like, I'm actually the chief engineer of SpaceX.
link |
00:06:52.080
So I've signed off on pretty much all the design decisions.
link |
00:06:59.040
And so if there's something that goes wrong with that,
link |
00:07:03.680
the vehicle, it's fundamentally my fault.
link |
00:07:07.120
So I'm really just thinking about all the things that.
link |
00:07:13.920
So when I see the rocket, I see all the things
link |
00:07:16.400
that could go wrong and the things that could be better.
link |
00:07:18.520
And the same with the Dragon spacecraft,
link |
00:07:21.480
it's like, a lot of people will say, oh, this
link |
00:07:23.680
is a spacecraft or a rocket.
link |
00:07:25.560
And this looks really cool.
link |
00:07:27.000
I'm like, I have like a readout of, these are the risks.
link |
00:07:31.200
These are the problems.
link |
00:07:32.560
That's what I see.
link |
00:07:33.840
Like, tch, tch, tch, tch, tch, tch.
link |
00:07:36.280
So it's not what other people see when they see the product.
link |
00:07:40.400
So let me ask you then to analyze Starship in that same way.
link |
00:07:44.360
I know you'll talk about it in more detail about Starship
link |
00:07:48.000
in the near future, perhaps.
link |
00:07:49.560
Yeah, we'll talk about it now if you want.
link |
00:07:51.760
But just in that same way, like you said,
link |
00:07:54.120
you see when you see a rocket, you see a list of risks.
link |
00:07:59.160
In that same way, you said that Starship
link |
00:08:01.040
was a really hard problem.
link |
00:08:03.280
So there's many ways I can ask this.
link |
00:08:05.480
But if you magically could solve one problem perfectly,
link |
00:08:09.600
one engineering problem perfectly, which one would it be?
link |
00:08:13.040
On Starship?
link |
00:08:14.600
Sorry, on Starship.
link |
00:08:15.560
So is it maybe related to the efficiency, the engine,
link |
00:08:20.040
the weight of the different components, the complexity
link |
00:08:22.000
of various things, maybe the controls of the crazy thing
link |
00:08:25.400
it has to do to land?
link |
00:08:26.760
Oh, actually, by far, the biggest thing absorbing my time
link |
00:08:31.440
is engine production, not the design of the engine.
link |
00:08:38.160
I've often said prototypes are easy, production is hard.
link |
00:08:45.360
So we have the most advanced rocket engine
link |
00:08:48.680
that's ever been designed.
link |
00:08:52.680
Because I say currently the best rocket engine ever
link |
00:08:55.240
is probably the RD-180 or RD-170, that Russian engine,
link |
00:09:01.880
basically.
link |
00:09:03.640
And still, I think an engine should only count
link |
00:09:06.760
if it's gotten something to orbit.
link |
00:09:09.000
So our engine has not gotten anything to orbit yet.
link |
00:09:12.680
But it is.
link |
00:09:13.760
It's the first engine that's actually
link |
00:09:16.560
better than the Russian RD engines,
link |
00:09:20.240
which were an amazing design.
link |
00:09:22.640
So you're talking about Raptor engine.
link |
00:09:24.120
What makes it amazing?
link |
00:09:25.520
What are the different aspects of it that make it?
link |
00:09:29.680
What do you get the most excited about
link |
00:09:31.920
if the whole thing works in terms of efficiency,
link |
00:09:35.080
all those kinds of things?
link |
00:09:37.120
Well, Raptor is a full-flow staged combustion
link |
00:09:45.880
engine, and it's operating at a very high chamber pressure.
link |
00:09:50.880
So one of the key figures, perhaps the key figure of merit,
link |
00:09:55.200
is what is the chamber pressure at which the rocket engine
link |
00:09:59.920
can operate?
link |
00:10:00.800
That's the combustion chamber pressure.
link |
00:10:03.040
So Raptor is designed to operate at 300 bar, possibly
link |
00:10:07.720
maybe higher. That's 300 atmospheres.
link |
00:10:10.360
So the record right now for an operational engine
link |
00:10:15.920
is the RD engine that I mentioned, the Russian RD, which
link |
00:10:18.120
is, I believe, around 267 bar.
link |
00:10:22.600
And the difficulty of the chamber pressure
link |
00:10:25.880
increases on a nonlinear basis.
link |
00:10:27.880
So 10% more chamber pressure is more like 50% more difficult.
link |
00:10:36.960
But that chamber pressure, that is
link |
00:10:39.320
what allows you to get a very high power
link |
00:10:42.280
density for the engine.
link |
00:10:46.160
So enabling a very high thrust to weight ratio
link |
00:10:53.280
and a very high specific impulse.
link |
00:10:57.000
So specific impulse is like a measure
link |
00:10:59.000
of the efficiency of a rocket engine.
link |
00:11:01.680
It's really the effective exhaust
link |
00:11:07.120
velocity of the gas coming out of the engine.
link |
00:11:09.920
So with a very high chamber pressure,
link |
00:11:17.160
you can have a compact engine that nonetheless
link |
00:11:22.200
has a high expansion ratio, which
link |
00:11:24.160
is the ratio between the exit nozzle and the throat.
link |
00:11:31.800
So you see a rocket engine's got sort of like a hourglass shape.
link |
00:11:35.960
It's like a chamber and then it necks down and there's a nozzle.
link |
00:11:39.720
And the ratio of the exit diameter
link |
00:11:41.840
to the throat is the expansion ratio.
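The two figures of merit described here can be sketched numerically. This is a rough illustration with invented numbers, not SpaceX data; note that the expansion ratio is conventionally quoted as the exit-to-throat area ratio, which for circular cross-sections is the diameter ratio squared.

```python
G0 = 9.80665  # standard gravity, m/s^2

def specific_impulse_s(effective_exhaust_velocity_m_s: float) -> float:
    """Specific impulse in seconds: Isp = v_e / g0."""
    return effective_exhaust_velocity_m_s / G0

def expansion_ratio(exit_diameter_m: float, throat_diameter_m: float) -> float:
    """Exit-to-throat area ratio; for circular sections,
    the diameter ratio squared."""
    return (exit_diameter_m / throat_diameter_m) ** 2

# Illustrative only: an effective exhaust velocity of ~3,400 m/s
isp = specific_impulse_s(3400.0)    # roughly 347 seconds
ratio = expansion_ratio(1.3, 0.25)  # made-up diameters for illustration
```

Higher chamber pressure lets a compact nozzle reach a large expansion ratio, which is how it buys both thrust-to-weight and specific impulse.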
link |
00:11:45.960
So why is it such a hard engine to manufacture at scale?
link |
00:11:51.360
It's very complex.
link |
00:11:53.160
So what does complexity mean here?
link |
00:11:55.160
There's a lot of components involved.
link |
00:11:56.760
There's a lot of components and a lot of unique materials.
link |
00:12:03.080
So we had to invent several alloys
link |
00:12:07.040
that don't exist in order to make this engine work.
link |
00:12:11.240
So it's a materials problem, too.
link |
00:12:14.600
It's a materials problem.
link |
00:12:16.280
And in a full flow stage combustion,
link |
00:12:20.840
there are many feedback loops in the system.
link |
00:12:24.200
So basically, you've got propellants and hot gas
link |
00:12:31.840
flowing simultaneously to so many different places on the engine.
link |
00:12:39.120
And they all have a recursive effect on each other.
link |
00:12:43.680
So you change one thing here.
link |
00:12:44.880
It has a recursive effect here.
link |
00:12:46.080
It changes something over there.
link |
00:12:47.440
And it's quite hard to control.
link |
00:12:52.480
Like there's a reason no one's made this before.
link |
00:12:54.400
But the reason we're doing full-flow staged combustion
link |
00:13:04.560
is because it has the highest theoretical possible efficiency.
link |
00:13:12.640
So in order to make a fully reusable rocket,
link |
00:13:19.920
that's really the holy grail of orbital rocketry.
link |
00:13:25.360
Everything's got to be the best.
link |
00:13:28.320
It's got to be the best engine, the best airframe,
link |
00:13:30.880
the best heat shield, extremely light avionics,
link |
00:13:38.120
very clever control mechanisms.
link |
00:13:40.640
You've got to shed mass in any possible way that you can.
link |
00:13:45.120
For example, instead of putting landing legs on the booster
link |
00:13:48.280
and ship, we are going to catch them with a tower
link |
00:13:50.720
to save the weight of the landing legs.
link |
00:13:53.200
So that's like, I mean, we're talking
link |
00:13:56.680
about catching the largest flying object ever made
link |
00:14:02.080
on a giant tower with chopstick arms.
link |
00:14:06.640
It's like Karate Kid with the fly, but much bigger.
link |
00:14:12.520
This probably won't work the first time.
link |
00:14:17.560
Anyway, so this is bananas.
link |
00:14:18.800
This is banana stuff.
link |
00:14:19.760
So you mentioned that you doubt, well, not you doubt,
link |
00:14:23.080
but there's days or moments when you doubt
link |
00:14:26.440
that this is even possible.
link |
00:14:28.320
It's so difficult.
link |
00:14:30.120
The possible part is, well, at this point,
link |
00:14:35.400
I think we will get Starship to work.
link |
00:14:41.360
This is a question of timing.
link |
00:14:42.880
How long will it take us to do this?
link |
00:14:45.440
How long will it take us to actually achieve
link |
00:14:47.680
full and rapid reusability?
link |
00:14:50.480
Because it will take probably many launches
link |
00:14:52.680
before we are able to have full and rapid reusability.
link |
00:14:57.640
But I can say that the physics pencils out, like we're not,
link |
00:15:06.120
at this point, I'd say we're confident that, like let's say,
link |
00:15:10.120
I'm very confident success is in the set
link |
00:15:12.640
of all possible outcomes.
link |
00:15:14.760
It's not a null set.
link |
00:15:16.880
For a while there, I was not convinced
link |
00:15:18.280
that success was in the set of possible outcomes, which
link |
00:15:22.000
is very important, actually.
link |
00:15:23.760
So you're saying there's a chance.
link |
00:15:29.640
I'm saying there's a chance, exactly.
link |
00:15:33.200
Just not sure how long it will take.
link |
00:15:38.200
We have a very talented team.
link |
00:15:39.560
They're working night and day to make it happen.
link |
00:15:43.400
And like I said, the critical thing
link |
00:15:47.680
to achieve for the revolution in spaceflight
link |
00:15:49.520
and for humanity to be a spacefaring civilization
link |
00:15:52.000
is to have a fully and rapidly reusable rocket,
link |
00:15:54.360
orbital rocket.
link |
00:15:56.480
There's not even been any orbital rocket that's
link |
00:15:59.000
been fully reusable ever.
link |
00:16:00.080
And this has always been the holy grail of rocketry.
link |
00:16:05.960
And many smart people, very smart people,
link |
00:16:09.680
have tried to do this before and have not succeeded.
link |
00:16:12.680
So because it's such a hard problem.
link |
00:16:16.960
What's your source of belief in situations like this?
link |
00:16:21.160
When the engineering problem is so difficult,
link |
00:16:23.520
there's a lot of experts, many of whom
link |
00:16:26.160
you admire who have failed in the past.
link |
00:16:29.120
Yes.
link |
00:16:29.960
And a lot of people, a lot of experts, maybe journalists,
link |
00:16:37.960
all the kind of the public in general,
link |
00:16:39.840
have a lot of doubt about whether it's possible.
link |
00:16:43.600
And you yourself know that even if it's
link |
00:16:46.040
a non-null, non-empty set of success,
link |
00:16:49.480
it's still unlikely or very difficult.
link |
00:16:52.000
Like where do you go to?
link |
00:16:53.400
Both personally, intellectually as an engineer,
link |
00:16:57.240
as a team, like for source of strength,
link |
00:17:00.360
needed to sort of persevere through this
link |
00:17:03.440
and to keep going with the project,
link |
00:17:05.160
take it to completion.
link |
00:17:18.440
A source of strength?
link |
00:17:19.800
Hmm, that's really not how I think about things.
link |
00:17:23.600
I mean, for me, it's simply this is something
link |
00:17:25.560
that is important to get done.
link |
00:17:28.120
And we should just keep doing it or die trying.
link |
00:17:32.560
And I don't need source of strength.
link |
00:17:35.960
So quitting is not even like.
link |
00:17:39.080
That's not in my nature.
link |
00:17:41.560
And I don't care about optimism or pessimism.
link |
00:17:46.280
Fuck that, we're going to get it done.
link |
00:17:47.920
Going to get it done.
link |
00:17:51.760
Can you then zoom back in to specific problems
link |
00:17:55.160
with Starship or any engineering problems you work on?
link |
00:17:58.360
Can you try to introspect your particular biological
link |
00:18:01.520
neural network, your thinking process
link |
00:18:03.600
and describe how you think through problems,
link |
00:18:06.200
the different engineering and design problems?
link |
00:18:08.040
Is there like a systematic process you've
link |
00:18:10.240
spoken about first principles thinking,
link |
00:18:11.920
but is there a kind of process to it?
link |
00:18:14.200
Well, it's like saying, physics is law
link |
00:18:19.280
and everything else is a recommendation.
link |
00:18:21.280
Like I've met a lot of people that can break the law,
link |
00:18:23.240
but I haven't met anyone who could break physics.
link |
00:18:25.480
So for any kind of technology problem,
link |
00:18:32.520
you have to just make sure you're not violating physics.
link |
00:18:39.960
And first principles analysis, I think,
link |
00:18:45.120
is something that could be applied to really
link |
00:18:47.400
any walk of life, anything really.
link |
00:18:49.840
It's really just saying, let's boil something down
link |
00:18:54.680
to the most fundamental principles.
link |
00:18:58.440
The things that we are most confident
link |
00:19:00.280
are true at a foundational level.
link |
00:19:02.520
And that sets your axiomatic base.
link |
00:19:05.080
And then you reason up from there.
link |
00:19:07.040
And then you cross check your conclusion
link |
00:19:09.040
against the axiomatic truths.
link |
00:19:13.680
So some basics in physics would be like,
link |
00:19:18.120
are you violating conservation of energy or momentum
link |
00:19:20.280
or something like that?
link |
00:19:21.400
Then it's not going to work.
link |
00:19:25.800
So that's just to establish, is it possible?
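The "is it possible" gate described here can be illustrated with a toy check; this is just a sketch of the idea, not any real engineering tool.

```python
# Toy version of the first physics gate above: a claimed machine that
# outputs more energy than it takes in violates conservation of energy
# and can be rejected before any further design work.
def violates_energy_conservation(energy_in_j: float, energy_out_j: float) -> bool:
    """True if the claim breaks conservation of energy (output exceeds input)."""
    return energy_out_j > energy_in_j

perpetual_motion = violates_energy_conservation(100.0, 150.0)  # impossible claim
lossy_machine = violates_energy_conservation(100.0, 80.0)      # plausible claim
```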
link |
00:19:33.000
And another good physics tool is thinking
link |
00:19:35.160
about things in the limit.
link |
00:19:36.520
If you take a particular thing and you scale it
link |
00:19:40.760
to a very large number or to a very small number,
link |
00:19:42.960
how do things change?
link |
00:19:46.000
Well, it's like in number of things you manufacture,
link |
00:19:48.960
something like that, and then in time.
link |
00:19:51.680
Yeah, let's say you take an example of manufacturing, which
link |
00:19:56.120
I think is just a very underrated problem.
link |
00:20:00.360
And like I said, it's much harder
link |
00:20:04.560
to take an advanced technology product
link |
00:20:09.360
and bring it into volume manufacturing
link |
00:20:11.040
than it is to design it in the first place.
link |
00:20:12.880
By orders of magnitude.
link |
00:20:14.520
So let's say you're trying to figure out
link |
00:20:19.520
why is this part or product expensive?
link |
00:20:24.000
Is it because of something fundamentally foolish
link |
00:20:27.400
that we're doing?
link |
00:20:28.080
Or is it because our volume is too low?
link |
00:20:30.960
And then you say, OK, well, what if our volume was
link |
00:20:32.960
a million units a year?
link |
00:20:34.240
Is it still expensive?
link |
00:20:35.560
That's thinking about things in the limit.
link |
00:20:38.160
If it's still expensive at a million units a year,
link |
00:20:40.160
then volume is not the reason why your thing is expensive.
link |
00:20:42.520
There's something fundamental about design.
link |
00:20:44.680
And then you then can focus on reducing complexity
link |
00:20:47.440
or something like that in the design?
link |
00:20:48.800
Change the design to change the part to be something
link |
00:20:51.280
that is not fundamentally expensive.
link |
00:20:57.120
That's a common thing in rocketry,
link |
00:20:58.920
because the unit volume is relatively low.
link |
00:21:01.800
And so a common excuse would be, well,
link |
00:21:04.280
it's expensive because our unit volume is low.
link |
00:21:06.760
And if we were in automotive or something like that
link |
00:21:08.760
or consumer electronics, then our costs would be lower.
link |
00:21:10.920
And I'm like, OK, so let's say now you're
link |
00:21:13.480
making a million units a year.
link |
00:21:14.640
Is it still expensive?
link |
00:21:16.080
If the answer is yes, then economies of scale are not
link |
00:21:20.760
the issue.
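The million-units-a-year test can be written as a toy cost model; the numbers below are invented purely for illustration.

```python
# "Think in the limit": amortized unit cost is fixed costs spread over
# volume, plus the per-unit (materials + labor) cost. As volume grows,
# unit cost asymptotes to the variable cost.
def unit_cost(volume: int, fixed_costs: float, variable_cost: float) -> float:
    """Amortized cost per unit at a given annual volume."""
    return fixed_costs / volume + variable_cost

# At low volume, fixed costs dominate...
low = unit_cost(100, fixed_costs=50_000_000, variable_cost=20_000)
# ...at a million units a year, cost approaches the variable cost.
high = unit_cost(1_000_000, fixed_costs=50_000_000, variable_cost=20_000)
# If `high` is still "expensive", volume was never the problem:
# something about the design itself is fundamentally expensive.
```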
link |
00:21:22.080
Do you throw in manufacturing?
link |
00:21:24.080
Do you throw in, like, supply chain?
link |
00:21:26.040
Talk about resources and materials and stuff like that.
link |
00:21:28.480
Do you throw that into the calculation
link |
00:21:29.920
of trying to reason from first principles
link |
00:21:31.760
like how we're going to make the supply chain work here?
link |
00:21:34.600
Yeah, yeah.
link |
00:21:35.720
And then the cost of materials, things like that.
link |
00:21:37.800
Or is that too much?
link |
00:21:38.880
Exactly, so another good example of thinking about things
link |
00:21:43.080
in the limit is if you take any product, any machine,
link |
00:21:52.160
or whatever, take a rocket or whatever
link |
00:21:56.040
and say, if you look at the raw materials in the rocket,
link |
00:22:03.640
so you're going to have aluminum, steel, titanium, Inconel,
link |
00:22:09.240
specialty alloys, copper.
link |
00:22:13.480
And you say, what's the weight of the constituent elements
link |
00:22:19.200
of each of these materials?
link |
00:22:20.440
And what is their raw material value?
link |
00:22:22.560
And that sets the asymptotic limit
link |
00:22:25.680
for how low the cost of the vehicle
link |
00:22:29.200
can be unless you change the materials.
link |
00:22:32.920
And then when you do that, I call it maybe the magic wand
link |
00:22:35.480
number or something like that.
link |
00:22:36.680
So that would be if you had just a pile of these raw materials
link |
00:22:42.120
here and you could wave the magic wand
link |
00:22:43.640
and rearrange the atoms into the final shape,
link |
00:22:47.160
that would be the lowest possible cost
link |
00:22:49.440
that you could make this thing for
link |
00:22:50.960
unless you change the materials.
link |
00:22:52.720
And that is almost always a very low number.
link |
00:22:57.760
So then what's actually causing these to be expensive
link |
00:23:01.200
is how you put the atoms into the desired shape.
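The "magic wand" floor described here is just a sum over constituent materials; the masses and commodity prices below are rough, invented illustrations, not figures for any real vehicle.

```python
# The cost of a vehicle can never go below the commodity value of its
# raw materials (unless you change the materials). Everything above this
# floor is the cost of getting the atoms into the desired shape.
raw_materials = {
    # material: (mass_kg, price_usd_per_kg) -- illustrative numbers only
    "aluminum": (50_000, 2.5),
    "steel":    (150_000, 0.7),
    "titanium": (5_000, 10.0),
    "inconel":  (2_000, 25.0),
    "copper":   (3_000, 9.0),
}

material_floor_usd = sum(mass * price for mass, price in raw_materials.values())
```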
link |
00:23:06.000
Yeah, actually, if you don't mind me taking a tiny tangent,
link |
00:23:10.120
I often talk to Jim Keller, who's
link |
00:23:12.240
somebody who worked with you as a friend.
link |
00:23:14.360
Jim did great work at Tesla.
link |
00:23:17.800
So I suppose he carries the flame of the same kind
link |
00:23:21.280
of thinking that you're talking about now.
link |
00:23:26.280
And I guess I see that same thing at Tesla and SpaceX folks
link |
00:23:30.920
who worked there, they kind of learned this way of thinking.
link |
00:23:33.800
And it kind of becomes obvious almost.
link |
00:23:36.720
But anyway, I had an argument, well, not an argument,
link |
00:23:40.880
he educated me about how cheap it
link |
00:23:44.480
might be to manufacture a Tesla bot.
link |
00:23:46.640
We just, we had an argument, how can you
link |
00:23:49.000
reduce the cost at scale of producing a robot?
link |
00:23:52.120
Because I've gotten a chance to interact quite a bit,
link |
00:23:56.000
obviously, in the academic circles with humanoid robots
link |
00:23:59.520
and the Boston Dynamics and stuff like that.
link |
00:24:01.920
And they're very expensive to build.
link |
00:24:04.520
And then Jim kind of schooled me on saying, OK,
link |
00:24:08.240
this kind of first principles thinking
link |
00:24:10.080
of how can we get the cost of manufacturing down?
link |
00:24:13.760
I suppose you do that, you have done that kind of thinking
link |
00:24:16.840
for Tesla bot and for all kinds of complex systems that
link |
00:24:22.040
are traditionally seen as complex.
link |
00:24:23.640
And you say, OK, how can we simplify everything now?
link |
00:24:27.160
Yeah, I mean, I think if you are really good at manufacturing,
link |
00:24:32.360
you can basically make at high volume,
link |
00:24:35.320
you can basically make anything for a cost
link |
00:24:38.480
that asymptotically approaches the raw material value
link |
00:24:42.040
of the constituents, plus any intellectual property
link |
00:24:44.600
that you need to license.
link |
00:24:46.520
Anything.
link |
00:24:49.400
But it's hard.
link |
00:24:50.240
It's not easy, that's a very hard thing to do,
link |
00:24:52.240
but it is possible for anything.
link |
00:24:54.720
Anything in volume can be made, like I said,
link |
00:24:57.480
for a cost that asymptotically approaches
link |
00:25:00.400
its raw material constituents, plus intellectual property
link |
00:25:03.760
license rights.
link |
00:25:05.360
So what will often happen in trying to design a product
link |
00:25:08.280
is people will start with the tools and parts and methods
link |
00:25:12.400
that they are familiar with and then try
link |
00:25:15.920
to create a product using their existing tools and methods.
link |
00:25:21.200
The other way to think about it is actually
link |
00:25:24.120
to try to imagine the platonic ideal of the perfect
link |
00:25:27.920
product or technology, whatever it might be.
link |
00:25:31.320
And so what is the perfect arrangement of atoms
link |
00:25:35.640
that would be the best possible product?
link |
00:25:38.480
And now let us try to figure out how
link |
00:25:39.960
to get the atoms in that shape.
link |
00:25:43.800
I mean, it sounds almost like Rick and Morty
link |
00:25:49.320
absurd until you start to really think about it.
link |
00:25:52.080
And you really should think about it in this way,
link |
00:25:56.400
because otherwise,
link |
00:25:59.000
you might fall victim
link |
00:26:02.320
to the momentum of the way things were done in the past,
link |
00:26:04.440
unless you think in this way.
link |
00:26:06.000
Well, just as a function of inertia,
link |
00:26:07.680
people will want to use the same tools and methods
link |
00:26:10.640
that they are familiar with.
link |
00:26:13.680
That's what they'll do by default.
link |
00:26:16.040
And then that will lead to an outcome of things
link |
00:26:18.680
that can be made with those tools and methods,
link |
00:26:20.560
that is unlikely to be the platonic ideal of the perfect
link |
00:26:24.320
product.
link |
00:26:26.040
So that's why it's good to think of things in both directions.
link |
00:26:30.000
They're like, what can we build with the tools that we have?
link |
00:26:32.200
But also, what is the theoretical perfect product look like?
link |
00:26:37.280
And that theoretical perfect product
link |
00:26:39.560
is going to be a moving target, because as you learn more,
link |
00:26:42.720
the definition for that perfect product will change,
link |
00:26:46.600
because you don't actually know what the perfect product is,
link |
00:26:48.760
but you can successfully approximate a more perfect
link |
00:26:52.560
product.
link |
00:26:54.320
So think about it like that, and then saying, OK, now,
link |
00:26:57.440
what tools, methods, materials, whatever
link |
00:26:59.920
do we need to create in order to get the atoms in that shape?
link |
00:27:06.040
But people rarely think about it that way.
link |
00:27:10.160
But it's a powerful tool.
link |
00:27:12.720
I should mention that the brilliant Shivon Zilis is
link |
00:27:15.920
hanging out with us, in case you hear
link |
00:27:19.440
a voice of wisdom from outside, from up above.
link |
00:27:25.720
OK, so let me ask you about Mars.
link |
00:27:28.200
You mentioned it would be great for science
link |
00:27:30.440
to put a base on the moon to do some research.
link |
00:27:34.960
But the truly big leap, again, in this category
link |
00:27:39.600
of seemingly impossible, is to put a human being on Mars.
link |
00:27:43.800
When do you think SpaceX will land a human being on Mars?
link |
00:27:46.760
Hmm.
link |
00:27:54.760
Hmm.
link |
00:28:08.760
Best case is about five years, worst case, 10 years.
link |
00:28:16.760
What are the determining factors,
link |
00:28:18.520
would you say, from an engineering perspective,
link |
00:28:21.160
or is that not the bottlenecks?
link |
00:28:24.080
You know, it's fundamentally engineering the vehicle.
link |
00:28:32.520
I mean, Starship is the most complex and advanced rocket
link |
00:28:36.560
that's ever been made by, I don't know,
link |
00:28:39.320
an order of magnitude or something like that.
link |
00:28:40.800
It's a lot.
link |
00:28:42.080
It's really next level.
link |
00:28:43.040
So, and the fundamental optimization of Starship
link |
00:28:49.240
is minimizing cost per ton to orbit,
link |
00:28:51.560
and ultimately cost per ton to the surface of Mars.
link |
00:28:54.760
This may seem like a mercantile objective,
link |
00:28:56.360
but it is actually the thing that needs to be optimized.
link |
00:29:00.360
Like, there is a certain cost per ton to the surface of Mars
link |
00:29:04.040
where we can afford to establish a self sustaining city.
link |
00:29:08.800
And then above that, we cannot afford to do it.
link |
00:29:12.800
So, right now, you couldn't fly to Mars for a trillion dollars.
link |
00:29:16.160
There's no amount of money that could get you a ticket to Mars.
link |
00:29:19.160
So, we need to get that, you know,
link |
00:29:22.440
to something that is actually possible at all.
link |
00:29:27.800
But then, that's, we don't just want to have, you know,
link |
00:29:32.240
with Mars flags and footprints,
link |
00:29:33.800
and then not come back for a half century
link |
00:29:35.880
like we did with the moon.
link |
00:29:37.960
In order to pass a very important great filter,
link |
00:29:43.360
I think, we need to be a multi planet species.
link |
00:29:45.520
That may sound somewhat esoteric to a lot of people,
link |
00:29:51.400
but eventually, given enough time,
link |
00:29:55.320
there's something,
link |
00:29:58.040
Earth is likely to experience some calamity
link |
00:30:01.240
that could be something that humans do to themselves
link |
00:30:06.680
or an external event like what happened to the dinosaurs.
link |
00:30:09.120
And eventually, if none of that happens,
link |
00:30:17.800
and somehow, magically, we keep going,
link |
00:30:20.520
then the sun will, the sun is gradually expanding
link |
00:30:23.800
and will engulf the Earth.
link |
00:30:25.960
And probably Earth gets too hot for life
link |
00:30:30.840
in about 500 million years.
link |
00:30:34.680
It's a long time, but that's only 10% longer
link |
00:30:37.160
than Earth has been around.
link |
00:30:38.240
And so, if you think about like the current situation,
link |
00:30:43.240
it's really remarkable and kind of hard to believe,
link |
00:30:45.640
but Earth's been around 4.5 billion years,
link |
00:30:50.080
and this is the first time in 4.5 billion years
link |
00:30:52.360
that it's been possible to extend life beyond Earth.
link |
00:30:55.720
And that window of opportunity may be open for a long time,
link |
00:30:58.800
and I hope it is, but it also may be open for a short time.
link |
00:31:01.760
And I think it would be wise for us to act quickly
link |
00:31:09.440
while the window is open, just in case it closes.
link |
00:31:13.800
Yeah, the existence of nuclear weapons, pandemics,
link |
00:31:17.840
all kinds of threats should kind of give us some motivation.
link |
00:31:25.160
I mean, civilization could die with a bang or a whimper.
link |
00:31:31.520
If it dies by demographic collapse,
link |
00:31:35.000
then it's more of a whimper, obviously.
link |
00:31:38.080
But if it's World War III, it's more of a bang.
link |
00:31:40.680
But these are all risks.
link |
00:31:43.160
I mean, it's important to think of these things
link |
00:31:44.560
and just think of things as probabilities, not certainties.
link |
00:31:48.600
There's a probability that something bad will happen on Earth.
link |
00:31:52.360
I think most likely the future will be good.
link |
00:31:56.520
But there's, let's say, for argument's sake,
link |
00:31:59.400
a 1% chance per century of a civilization ending event.
link |
00:32:03.600
Like, that was Stephen Hawking's estimate.
link |
00:32:07.720
I think he might be right about that.
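Hawking's 1%-per-century figure compounds over time. A quick sketch, assuming a constant and independent risk each century (a simplification not stated in the conversation), shows what it implies over longer horizons:

```python
# Survival probability under a constant 1%-per-century extinction risk,
# treating each century as an independent trial (a simplification).
def survival_probability(risk_per_century, centuries):
    return (1 - risk_per_century) ** centuries

# Odds of civilization surviving 1,000 and 10,000 years at 1% per century.
p_millennium = survival_probability(0.01, 10)
p_ten_millennia = survival_probability(0.01, 100)

print(f"1,000 years:  {p_millennium:.1%}")    # ~90.4%
print(f"10,000 years: {p_ten_millennia:.1%}")  # ~36.6%
```

Even a small per-century risk makes long-term survival far from certain, which is the force of the "insurance" framing that follows.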
link |
00:32:10.440
So then we should basically think of this
link |
00:32:18.000
like being a multiplanet species is like taking out
link |
00:32:20.200
insurance for life itself.
link |
00:32:21.480
Like, life insurance?
link |
00:32:22.600
For life.
link |
00:32:26.680
Well, it's turned into an infomercial real quick.
link |
00:32:29.280
Life insurance for life, yes.
link |
00:32:31.520
And we can bring the creatures, the plants and animals,
link |
00:32:36.720
from Earth to Mars and breathe life into the planet
link |
00:32:41.200
and have a second planet with life.
link |
00:32:44.600
That would be great.
link |
00:32:45.920
They can't bring themselves there.
link |
00:32:47.480
So if we don't bring them to Mars,
link |
00:32:49.960
then they will just for sure all die when the sun expands anyway.
link |
00:32:54.280
And then that'll be it.
link |
00:32:56.160
What do you think is the most difficult aspect of building
link |
00:32:59.560
a civilization on Mars, terraforming Mars,
link |
00:33:02.200
like from an engineering perspective,
link |
00:33:03.720
from a financial perspective, human perspective,
link |
00:33:07.200
to get a large number of folks there who will never return
link |
00:33:14.280
back to Earth?
link |
00:33:15.800
No, they could certainly return.
link |
00:33:17.080
Some will return back to Earth.
link |
00:33:18.400
They will choose to stay there for the rest of their lives.
link |
00:33:21.360
Many will.
link |
00:33:23.680
But we need the spaceships back, like the ones that go to Mars.
link |
00:33:29.400
We need them back.
link |
00:33:30.000
So you can hop on if you want.
link |
00:33:32.520
But we can't just not have the spaceships come back.
link |
00:33:34.880
Those things are expensive.
link |
00:33:35.760
We need them back.
link |
00:33:36.480
I'd like to come back after the trip.
link |
00:33:38.720
I mean, do you think about the terraforming aspect,
link |
00:33:40.680
like actually building?
link |
00:33:41.600
Are you so focused right now on the spaceships part that's
link |
00:33:44.760
so critical to get to Mars?
link |
00:33:46.880
Absolutely. If you can't get there, nothing else matters.
link |
00:33:50.480
And like I said, we can't get there at some extraordinarily
link |
00:33:54.080
high cost.
link |
00:33:54.640
I mean, the current cost of, let's say,
link |
00:33:57.840
one ton to the surface of Mars is on the order of $1 billion.
link |
00:34:02.640
So because you don't just need the rocket and the launch
link |
00:34:05.200
and everything, you need like heat shield, you need guidance
link |
00:34:09.040
system, you need deep space communications,
link |
00:34:12.240
you need some kind of landing system.
link |
00:34:14.960
So like rough approximation would
link |
00:34:16.920
be $1 billion per ton to the surface of Mars right now.
link |
00:34:22.200
This is obviously way too expensive
link |
00:34:26.880
to create a self sustaining civilization.
link |
00:34:30.680
So we need to improve that by at least a factor of 1,000.
link |
00:34:36.840
A million per ton?
link |
00:34:38.440
Yes, ideally much less than a million per ton.
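Taking the round numbers quoted here at face value ($1 billion per ton today, a 1,000x improvement target), and combining them with the million-ton city estimate given shortly after, the arithmetic works out as follows (a back-of-envelope sketch, not a cost model):

```python
# Rough cost-per-ton arithmetic from the figures quoted in the conversation.
current_cost_per_ton = 1_000_000_000  # ~$1B per ton to the Mars surface today
improvement_factor = 1_000            # the stated minimum improvement

target_cost_per_ton = current_cost_per_ton / improvement_factor
print(f"Target: ${target_cost_per_ton:,.0f} per ton")  # $1,000,000 per ton

# At the (speculative) million-ton threshold for a self-sustaining city:
city_tonnage = 1_000_000
print(f"City transport cost: ${target_cost_per_ton * city_tonnage:,.0f}")
# $1,000,000,000,000 -- on the order of $1 trillion even at the target cost
```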
link |
00:34:40.880
But if it's not, like it's got to be,
link |
00:34:44.360
obviously like how much can society afford to spend
link |
00:34:47.960
or just want to spend on a self sustaining city on Mars?
link |
00:34:52.360
The self sustaining part is important.
link |
00:34:53.800
Like it's just the key threshold,
link |
00:34:57.800
the great filter will have been passed
link |
00:35:01.320
when the city on Mars can survive even if the spaceships
link |
00:35:06.480
from Earth stop coming for any reason.
link |
00:35:08.520
It doesn't matter what the reason is.
link |
00:35:10.000
But if they stop coming for any reason,
link |
00:35:12.120
will it die out or will it not?
link |
00:35:13.680
And if there's even one critical ingredient missing,
link |
00:35:16.360
then it still doesn't count.
link |
00:35:18.320
It's like if you're in a long sea voyage
link |
00:35:20.400
and you've got everything except vitamin C,
link |
00:35:23.680
and it's only a matter of time, you know, you're going to die.
link |
00:35:26.240
So we're going to get Mars, a Mars city to the point
link |
00:35:29.240
where it's self sustaining.
link |
00:35:31.880
I'm not sure this will really happen in my lifetime,
link |
00:35:33.840
but I hope to see it at least have a lot of momentum.
link |
00:35:37.240
And then you could say, what is the minimum tonnage
link |
00:35:39.760
necessary to have a self sustaining city?
link |
00:35:45.080
And there's a lot of uncertainty about this.
link |
00:35:46.480
You could say like, I don't know,
link |
00:35:48.520
it's probably at least a million tons.
link |
00:35:52.000
Because you have to set up a lot of infrastructure on Mars.
link |
00:35:55.400
Like I said, in order to be self sustaining,
link |
00:35:58.680
you can't be missing anything.
link |
00:36:01.560
Like you need semiconductor fabs,
link |
00:36:04.480
you need iron ore refineries, like you need lots of things.
link |
00:36:09.360
So, and Mars is not super hospitable.
link |
00:36:13.440
It's the least inhospitable planet,
link |
00:36:15.680
but it's definitely a fixer-upper of a planet.
link |
00:36:18.160
Outside of Earth.
link |
00:36:19.400
Yes.
link |
00:36:20.080
Earth is pretty good.
link |
00:36:20.800
Earth is like easy.
link |
00:36:22.240
And also we should clarify in the solar system.
link |
00:36:25.480
Yes, in the solar system.
link |
00:36:26.600
There might be nice like vacation spots.
link |
00:36:29.760
There might be some great planets out there, but it's hopeless.
link |
00:36:32.720
Too hard to get there?
link |
00:36:33.800
Yeah, way, way, way, way too hard to say the least.
link |
00:36:37.440
Let me push back on that.
link |
00:36:38.720
Not really a pushback, but a quick curve ball of a question.
link |
00:36:42.000
So you did mention physics as the first starting point.
link |
00:36:44.800
So general relativity allows for wormholes.
link |
00:36:51.400
They technically can exist.
link |
00:36:53.040
Do you think those can ever be leveraged by humans
link |
00:36:56.200
to travel faster than the speed of light?
link |
00:36:59.400
Well, the wormhole thing is debatable.
link |
00:37:06.280
We currently do not know of any means of going faster
link |
00:37:08.920
than the speed of light.
link |
00:37:17.640
There are some ideas about warping space.
link |
00:37:21.840
So you can only move at the speed of light through space,
link |
00:37:27.360
but if you can make space itself move,
link |
00:37:31.640
that's what warping space is.
link |
00:37:36.160
Space is capable of moving faster than the speed of light.
link |
00:37:40.760
Like the universe, in the Big Bang,
link |
00:37:42.120
the universe expanded much faster than the speed of light, by a lot.
link |
00:37:55.320
If this is possible, the amount of energy
link |
00:37:57.400
required to warp space is so gigantic, it boggles the mind.
link |
00:38:03.120
So all the work you've done with propulsion,
link |
00:38:05.080
how much innovation is possible with rocket propulsion?
link |
00:38:08.120
Is this, I mean, you've seen it all,
link |
00:38:11.200
and you're constantly innovating in every aspect.
link |
00:38:14.480
How much is possible?
link |
00:38:15.360
Like, how much can you get 10x somehow?
link |
00:38:17.400
Is there something in there in physics
link |
00:38:19.680
that you can get significant improvement in terms
link |
00:38:21.520
of efficiency of engines and all those kinds of things?
link |
00:38:24.680
Well, as I was saying, really the Holy Grail
link |
00:38:27.960
is a fully and rapidly reusable orbital system.
link |
00:38:33.080
So right now, the Falcon 9 is the only reusable rocket out
link |
00:38:40.560
there, but the booster comes back and lands,
link |
00:38:44.320
and you've seen the videos, and we get the nose
link |
00:38:46.800
cone fairing back, but we do not get the upper stage back.
link |
00:38:49.760
So that means that we have a minimum cost
link |
00:38:54.280
of building an upper stage.
link |
00:38:57.680
You can think of like a two stage rocket of sort
link |
00:38:59.760
of like two airplanes, like a big airplane
link |
00:39:01.520
and a small airplane, and we get the big airplane back,
link |
00:39:04.600
but not the small airplane.
link |
00:39:05.920
And so it still costs a lot.
link |
00:39:07.880
So that upper stage is at least $10 million.
link |
00:39:13.360
And then the booster is not as rapidly
link |
00:39:18.240
and completely reusable as we'd like, nor are the fairings.
link |
00:39:20.960
So our kind of minimum marginal cost, not counting overhead,
link |
00:39:26.080
for per flight is on the order of $15 to $20 million maybe.
link |
00:39:33.360
So that's extremely good, it's by far better
link |
00:39:38.720
than any rocket ever in history.
link |
00:39:41.600
But with full and rapid reusability,
link |
00:39:45.360
we can reduce the cost per ton to orbit by a factor of 100.
link |
00:39:53.200
Just think of it like imagine if you had an aircraft
link |
00:39:56.640
or something or a car.
link |
00:40:00.000
And if you had to buy a new car every time
link |
00:40:03.440
you went for a drive, it would be very expensive,
link |
00:40:06.760
very silly, frankly.
link |
00:40:08.240
But in fact, you just refuel the car or recharge the car.
link |
00:40:13.440
And that makes your trip, I don't know, 1,000 times cheaper.
link |
00:40:20.280
So it's the same for rockets.
link |
00:40:23.800
It's very difficult to make this complex machine that
link |
00:40:27.360
can go to orbit.
link |
00:40:28.560
And so if you cannot reuse it and have
link |
00:40:31.120
to throw even any significant part of it away,
link |
00:40:34.040
that massively increases the cost.
link |
00:40:36.640
So Starship, in theory, could do a cost per launch of like
link |
00:40:43.880
a million, maybe $2 million or something like that
link |
00:40:48.280
and put over 100 tons in orbit, which is crazy.
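The cost figures given in this exchange can be turned into a rough cost-per-kilogram comparison. The Falcon 9 reusable-configuration payload used below (~15 tons) is an assumption for illustration, since it is not stated in the conversation; the other numbers are the ones quoted here:

```python
# Cost-per-kilogram comparison using the rough figures from the conversation.
def cost_per_kg(launch_cost_usd, payload_tons):
    return launch_cost_usd / (payload_tons * 1000)

# Falcon 9: ~$15M marginal cost per flight (low end of the quoted range);
# ~15 tons of payload in reusable configuration is an assumption here.
falcon9 = cost_per_kg(15_000_000, 15)
# Starship (aspirational): ~$2M per launch, 100 tons to orbit.
starship = cost_per_kg(2_000_000, 100)

print(f"Falcon 9:  ~${falcon9:,.0f}/kg")         # ~$1,000/kg
print(f"Starship:  ~${starship:,.0f}/kg")        # ~$20/kg
print(f"Reduction: ~{falcon9 / starship:.0f}x")  # ~50x
```

With these assumed inputs the sketch lands within the same order as the "factor of 100" reduction claimed above.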
link |
00:40:53.520
Yeah, that's incredible.
link |
00:40:55.920
So you're saying like it's by far the biggest bang for the buck
link |
00:40:58.680
is to make it fully reusable versus like some kind
link |
00:41:02.320
of brilliant breakthrough in theoretical physics?
link |
00:41:05.720
Yeah, no.
link |
00:41:06.360
There's no brilliant breakthrough.
link |
00:41:07.680
No, just make the rocket reusable.
link |
00:41:11.240
This is an extremely difficult engineering problem.
link |
00:41:13.440
Got it.
link |
00:41:14.040
But no new physics is required.
link |
00:41:17.880
Just brilliant engineering.
link |
00:41:19.280
Let me ask a slightly philosophical, fun question.
link |
00:41:22.120
Got to ask.
link |
00:41:22.960
I know you're focused on getting to Mars,
link |
00:41:24.800
but once we're there on Mars,
link |
00:41:27.160
what form of government, economic system, political system
link |
00:41:32.280
do you think would work best for an early civilization
link |
00:41:35.560
of humans?
link |
00:41:38.080
I mean, the interesting reason to talk about this stuff is that
link |
00:41:41.520
it also helps people dream about the future.
link |
00:41:44.280
I know you're really focused on the short term
link |
00:41:47.240
engineering dream, but it's like, I don't know.
link |
00:41:49.280
There's something about imagining an actual civilization
link |
00:41:51.760
on Mars that really gives people hope.
link |
00:41:55.360
Well, it would be a new frontier and an opportunity
link |
00:41:57.640
to rethink the whole nature of government, just
link |
00:41:59.840
as was done in the creation of the United States.
link |
00:42:02.640
So I mean, I would suggest having direct democracy,
link |
00:42:14.400
like people vote directly on things,
link |
00:42:16.160
as opposed to representative democracy.
link |
00:42:18.440
So representative democracy, I think,
link |
00:42:21.520
is too subject to special interests
link |
00:42:25.120
and coercion of the politicians and that kind of thing.
link |
00:42:31.400
So I'd recommend that there's just direct democracy.
link |
00:42:39.320
People vote on laws.
link |
00:42:40.560
The population votes on laws themselves.
link |
00:42:42.680
And then the laws must be short enough
link |
00:42:44.480
that people can understand them.
link |
00:42:46.880
Yeah, and then keeping a well informed populace,
link |
00:42:50.000
really being transparent about all the information,
link |
00:42:52.240
about what they're voting for.
link |
00:42:53.440
Absolutely transparency.
link |
00:42:54.800
Yeah.
link |
00:42:55.520
And not make it as annoying as those cookies
link |
00:42:57.440
where you have to accept the cookies.
link |
00:42:58.800
Accept cookies.
link |
00:42:59.800
Like always, there's always a slight amount of trepidation
link |
00:43:03.640
when you click accept cookies.
link |
00:43:05.360
Like, I feel as though there's, perhaps,
link |
00:43:07.760
a very tiny chance that it'll open a portal to hell
link |
00:43:10.680
or something like that.
link |
00:43:12.200
That's exactly how I feel.
link |
00:43:13.760
Why do they want me to accept that?
link |
00:43:16.600
What do they want with this cookie?
link |
00:43:18.960
Like, somebody got upset with accepting cookies or something
link |
00:43:21.440
somewhere.
link |
00:43:22.400
Who cares?
link |
00:43:23.960
So annoying to keep accepting all these cookies.
link |
00:43:26.920
To me, this is just a grand accept.
link |
00:43:29.560
Yes, you can have my damn cookie.
link |
00:43:30.720
I don't care.
link |
00:43:31.280
Whatever.
link |
00:43:32.360
You heard it here first.
link |
00:43:33.720
He accepts all your damn cookies.
link |
00:43:35.960
Yeah.
link |
00:43:37.880
Stop asking me.
link |
00:43:39.920
It's annoying.
link |
00:43:41.400
Yeah, it's one example of implementation
link |
00:43:46.000
of a good idea done really horribly.
link |
00:43:50.200
Yeah, it's somebody who has some good intentions
link |
00:43:52.520
of privacy or whatever.
link |
00:43:54.880
But now, everyone just has to accept cookies.
link |
00:43:57.320
And it's not, you know, you have billions of people
link |
00:43:59.320
who have to keep clicking accept cookie.
link |
00:44:00.720
It's super annoying.
link |
00:44:02.440
Then we just accept the damn cookie.
link |
00:44:04.040
It's fine.
link |
00:44:05.120
There is, I think, a fundamental problem
link |
00:44:07.960
that we're, because we've not really had a major,
link |
00:44:12.280
like a world war or something like that in a while.
link |
00:44:14.320
And obviously, we'd like to not have world wars.
link |
00:44:18.320
There's not been a cleansing function
link |
00:44:19.800
for rules and regulations.
link |
00:44:22.720
So wars did have some sort of silver lining
link |
00:44:25.760
in that there would be a reset on rules and regulations
link |
00:44:30.080
after a war.
link |
00:44:31.120
So World War I and II, there were huge resets
link |
00:44:33.000
on rules and regulations.
link |
00:44:35.720
Now, if society does not have a war
link |
00:44:39.400
and there's no cleansing function or garbage collection
link |
00:44:41.720
for rules and regulations, then rules and regulations
link |
00:44:43.840
will accumulate every year, because they're immortal.
link |
00:44:46.480
The actual humans die, but the laws don't.
link |
00:44:50.280
So we need a garbage collection function
link |
00:44:53.160
for rules and regulations.
link |
00:44:54.560
They should not just be immortal,
link |
00:44:57.480
because some of the rules and regulations that are put in place
link |
00:45:00.000
will be counterproductive, done with good intentions,
link |
00:45:02.800
but counterproductive.
link |
00:45:03.720
Sometimes not done with good intentions.
link |
00:45:05.600
So if rules and regulations just accumulate every year
link |
00:45:10.840
and you get more and more of them,
link |
00:45:12.360
then eventually you won't be able to do anything.
link |
00:45:14.840
You're just like Gulliver tied down
link |
00:45:17.320
by thousands of little strings.
link |
00:45:19.560
And we see that in the US and basically all economies
link |
00:45:27.600
that have been around for a while,
link |
00:45:31.400
and regulators and legislators
link |
00:45:34.400
create new rules and regulations every year,
link |
00:45:36.560
but they don't put effort into removing them.
link |
00:45:38.680
And I think that's very important that we put effort
link |
00:45:40.240
into removing rules and regulations.
link |
00:45:44.000
But it gets tough, because you get special interests
link |
00:45:45.560
that then are dependent on, like they have a vested interest
link |
00:45:50.520
in that whatever rule and regulation,
link |
00:45:51.920
and then they fight to not get it removed.
link |
00:45:57.560
Yeah, so I mean, I guess the problem with the Constitution
link |
00:46:00.880
is it's kind of like C versus Java,
link |
00:46:04.080
because it doesn't have any garbage collection built in.
link |
00:46:06.720
I think there should be,
link |
00:46:07.920
when you first said the metaphor of garbage collection,
link |
00:46:10.800
I loved it.
link |
00:46:11.640
From the coding standpoint.
link |
00:46:12.480
From the coding standpoint, yeah, yeah.
link |
00:46:14.320
It would be interesting if the laws themselves
link |
00:46:16.880
kind of had a built in thing where they kind of die
link |
00:46:20.000
after a while unless somebody explicitly, publicly defends them.
link |
00:46:23.600
So that's sort of, it's not like somebody has to kill them,
link |
00:46:26.200
they kind of die themselves, they disappear.
link |
00:46:29.160
Yeah.
link |
00:46:32.520
Not to defend Java or anything,
link |
00:46:33.920
but besides C++, you could also have great garbage collection
link |
00:46:38.520
in Python and so on.
link |
00:46:39.920
Yeah, so yeah, something needs to happen,
link |
00:46:43.760
or just the civilization's arteries harden over time.
link |
00:46:48.600
And you can just get less and less done
link |
00:46:50.840
because there's just a rule against everything.
link |
00:46:54.840
So I think like, I don't know, for Mars,
link |
00:46:57.640
whatever I say, I would say for Earth as well,
link |
00:47:00.280
like I think there should be an active process
link |
00:47:02.640
for removing rules and regulations
link |
00:47:04.960
and questioning their existence.
link |
00:47:07.120
Just like, if we've got a function
link |
00:47:10.280
for creating rules and regulations,
link |
00:47:11.560
because rules and regulations can also be thought of as like,
link |
00:47:13.480
they're like software or lines of code
link |
00:47:15.760
for operating civilization.
link |
00:47:18.880
That's the rules and regulations.
link |
00:47:21.280
So it's not like we shouldn't have rules and regulations,
link |
00:47:22.960
but you have code accumulation, but no code removal.
link |
00:47:27.120
And so it just gets to become basically
link |
00:47:30.320
archaic bloatware after a while.
link |
00:47:33.480
And it's just, it makes it hard for things to progress.
link |
00:47:37.920
So I don't know, maybe Mars, you'd have like any given law
link |
00:47:42.560
must have a sunset and require active voting
link |
00:47:48.560
to keep it up there, you know?
link |
00:47:52.160
And I should also say like, and these are just,
link |
00:47:54.640
I don't know, recommendations or thoughts,
link |
00:47:57.000
and ultimately it'll be up to the people on Mars
link |
00:48:00.120
to decide, but I think it should be easier
link |
00:48:05.000
to remove a law than to add one
link |
00:48:07.520
because of the, just to overcome the inertia of laws.
link |
00:48:10.680
So maybe it's like, for argument's sake,
link |
00:48:15.200
you need like say 60% vote to have a law take effect,
link |
00:48:19.800
but only a 40% vote to remove it.
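The garbage-collection metaphor for laws maps naturally onto a small data structure: every law carries a sunset date and expires unless actively reaffirmed, with an asymmetric bar for enacting versus removing. This is a toy sketch, using the illustrative 60%/40% thresholds from the conversation; the sunset period and example law names are invented:

```python
# A toy sketch of "garbage collection" for laws: each law sunsets unless
# reaffirmed, and removal needs a lower vote share than enactment.
PASS_THRESHOLD = 0.60    # illustrative figure from the conversation
REMOVE_THRESHOLD = 0.40  # illustrative figure from the conversation

class Law:
    def __init__(self, name, enacted_year, sunset_years=10):
        self.name = name
        self.expires = enacted_year + sunset_years

def enact(vote_share):
    return vote_share >= PASS_THRESHOLD

def remove(vote_share):
    return vote_share >= REMOVE_THRESHOLD

def collect_garbage(laws, current_year, reaffirmed):
    # Laws die by default at their sunset date unless explicitly reaffirmed.
    return [law for law in laws
            if current_year < law.expires or law.name in reaffirmed]

laws = [Law("helmet-rule", 2030), Law("cookie-banner-rule", 2030)]
survivors = collect_garbage(laws, 2040, reaffirmed={"helmet-rule"})
print([law.name for law in survivors])  # ['helmet-rule']
```

The design choice mirrors the point about inertia: expiry is the default and keeping a law requires active effort, rather than the reverse.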
link |
00:48:23.440
So let me be the guy, you posted a meme on Twitter recently
link |
00:48:26.640
where there's like a row of urinals
link |
00:48:30.160
and guy just walks all the way across
link |
00:48:33.080
and he tells you about crypto.
link |
00:48:36.320
I mean, that's happened to me so many times,
link |
00:48:38.360
I think maybe even literally.
link |
00:48:40.400
Yeah.
link |
00:48:41.800
Do you think, technologically speaking,
link |
00:48:43.480
there's any room for ideas of smart contracts or so on,
link |
00:48:47.360
because you mentioned laws,
link |
00:48:49.320
that's an interesting use of things like smart contracts
link |
00:48:52.960
to implement the laws by which governments function.
link |
00:48:57.280
And like something built on Ethereum
link |
00:48:58.960
or maybe a Doge coin that enables smart contracts somehow.
link |
00:49:04.840
I don't quite understand this whole smart contract thing.
link |
00:49:09.960
I mean, I'm too dumb to understand smart contracts.
link |
00:49:14.960
That's a good line.
link |
00:49:17.880
I mean, my general approach to any kind of like deal
link |
00:49:21.040
or whatever is just make sure there's clarity of understanding.
link |
00:49:23.760
That's the most important thing.
link |
00:49:25.800
And just keep any kind of deal very, very short
link |
00:49:29.080
and simple, plain language.
link |
00:49:31.320
And just make sure everyone understands this is the deal.
link |
00:49:34.240
Is everyone, is it clear?
link |
00:49:36.520
And what are the consequences
link |
00:49:39.760
if various things don't happen?
link |
00:49:42.600
But usually deals, business deals or whatever,
link |
00:49:47.200
are way too long and complex and overly lawyered.
link |
00:49:50.920
And pointlessly.
link |
00:49:52.720
You mentioned that Doge is the people's coin.
link |
00:49:57.680
And you said that SpaceX
link |
00:50:00.720
may consider literally putting a Doge coin on the moon.
link |
00:50:07.480
Is this something you're still considering? Mars, perhaps?
link |
00:50:11.960
Do you think there's some chance
link |
00:50:13.640
we've talked about political systems on Mars
link |
00:50:16.080
that Doge coin is the official currency of Mars
link |
00:50:20.240
at some point in the future?
link |
00:50:22.560
Well, I think Mars itself will need to have
link |
00:50:25.720
a different currency because you can't synchronize
link |
00:50:29.040
due to speed of light or not easily.
link |
00:50:32.680
So it must be completely standalone from Earth?
link |
00:50:36.480
Well, yeah, because Mars is at closest approach,
link |
00:50:41.320
it's four light minutes away roughly.
link |
00:50:43.000
And then at the furthest approach,
link |
00:50:45.000
it's roughly 20 light minutes away, maybe a little more.
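The one-way delays quoted here follow directly from the Earth-Mars distance range. The distances below are standard astronomical figures (theoretical closest and furthest approach), not numbers from the conversation:

```python
# One-way light delay between Earth and Mars at the extremes of their orbits.
C = 299_792_458  # speed of light, m/s

def light_delay_minutes(distance_km):
    return distance_km * 1000 / C / 60

closest_km = 54.6e6   # theoretical closest approach (standard figure)
furthest_km = 401e6   # theoretical furthest approach (standard figure)

print(f"Closest:  ~{light_delay_minutes(closest_km):.1f} min")   # ~3.0 min
print(f"Furthest: ~{light_delay_minutes(furthest_km):.1f} min")  # ~22.3 min
```

In practice closest approaches are a bit further than the theoretical minimum, which is consistent with the "four light minutes" quoted here.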
link |
00:50:48.480
So you can't really have something synchronizing.
link |
00:50:52.760
You know, if you've got a 20 minute speed of light issue,
link |
00:50:55.520
if it's got a one minute blockchain,
link |
00:50:58.200
it's not gonna synchronize properly.
link |
00:51:01.200
So Mars, I don't know if Mars would have a crypto currency
link |
00:51:04.720
as a thing, but probably seems likely,
link |
00:51:07.640
but it would be some kind of localized thing on Mars.
link |
00:51:12.320
And you let the people decide?
link |
00:51:14.840
Yeah, absolutely.
link |
00:51:16.560
The future of Mars should be up to the Martians.
link |
00:51:20.720
Yeah, so I mean, I think the crypto currency thing
link |
00:51:25.800
is an interesting approach to reducing the error
link |
00:51:34.080
in the database that is called money.
link |
00:51:41.200
You know, I think I have a pretty deep understanding
link |
00:51:42.960
of what money actually is on a practical day to day basis
link |
00:51:46.760
because of PayPal.
link |
00:51:50.120
You know, we really got in deep there.
link |
00:51:55.120
And right now the money system,
link |
00:51:57.280
actually for practical purposes,
link |
00:51:59.080
is really a bunch of heterogeneous mainframes
link |
00:52:04.080
running old COBOL.
link |
00:52:07.480
Okay, you mean literally?
link |
00:52:09.000
Literally.
link |
00:52:09.840
Literally what's happening.
link |
00:52:10.840
In batch mode.
link |
00:52:11.680
Okay, in batch mode.
link |
00:52:14.080
Yeah, pity the poor bastards
link |
00:52:16.440
who have to maintain that code.
link |
00:52:19.000
Okay, that's a pain.
link |
00:52:22.200
Not even Fortran, it's COBOL, yep.
link |
00:52:24.200
It's COBOL.
link |
00:52:26.000
And the banks are still buying mainframes in 2021
link |
00:52:30.080
and running ancient COBOL code.
link |
00:52:32.960
And the Federal Reserve is probably even older
link |
00:52:37.760
than what the banks have
link |
00:52:38.920
and they have an old COBOL mainframe.
link |
00:52:41.840
And so the government effectively
link |
00:52:45.760
has editing privileges on the money database.
link |
00:52:49.400
And they use those editing privileges
link |
00:52:51.480
to make more money whenever they want.
link |
00:52:56.400
And this increases the error in the database that is money.
link |
00:53:00.000
So I think money should really be viewed
link |
00:53:01.800
through the lens of information theory.
link |
00:53:04.560
And so it's kind of like an internet connection.
link |
00:53:09.240
Like what's the bandwidth, you know, total bit rate?
link |
00:53:13.640
What is the latency, jitter, packet drop,
link |
00:53:17.560
you know, errors in network communication?
link |
00:53:22.800
Just think of money like that, basically.
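The analogy can be made concrete: evaluate a payment system the way you would evaluate a network link, by throughput, latency, and error rate. This is a toy framing of the idea, and all the example numbers below are rough illustrations, not measurements:

```python
# A toy sketch of the "money as a network" framing: score payment rails
# on throughput ("bandwidth"), latency, and error rate.
from dataclasses import dataclass

@dataclass
class PaymentRail:
    name: str
    throughput_tps: float  # "bandwidth": transactions per second
    latency_s: float       # time to settle or confirm
    error_rate: float      # fraction of failed/erroneous transactions

    def score(self):
        # Higher throughput, lower latency, lower error rate is better.
        return self.throughput_tps / (self.latency_s * (1 + self.error_rate))

rails = [
    PaymentRail("legacy-batch-mainframe", 1000, 86_400, 0.01),  # overnight batch
    PaymentRail("card-network", 5000, 2, 0.001),
    PaymentRail("bitcoin-base-layer", 7, 3600, 0.0),
]

best = max(rails, key=PaymentRail.score)
print(best.name)  # card-network
```

The particular scoring formula is an arbitrary choice for illustration; the point is that "which money system is best" becomes a quantitative question once you pick the metrics.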
link |
00:53:25.440
I think that's probably the right way to think of it.
link |
00:53:27.400
And then say what system
link |
00:53:31.160
from an information theory standpoint
link |
00:53:32.880
allows an economy to function the best.
link |
00:53:36.520
And, you know, crypto is an attempt to reduce the error
link |
00:53:44.600
in money that is contributed by governments
link |
00:53:51.720
diluting the money supply
link |
00:53:53.280
as basically a pernicious form of taxation.
link |
00:53:57.200
So both policy in terms of inflation and the actual
link |
00:54:03.280
technology, like COBOL, like cryptocurrency
link |
00:54:07.080
takes us into the 21st century
link |
00:54:08.840
in terms of the actual systems
link |
00:54:10.720
that allow you to do the transaction,
link |
00:54:12.120
to store wealth, all those kinds of things.
link |
00:54:16.880
Like I said, just think of money as information.
link |
00:54:18.560
People often will think of money
link |
00:54:20.840
as having power in and of itself.
link |
00:54:24.080
It does not.
link |
00:54:24.960
Money is information.
link |
00:54:26.800
And it does not have power in and of itself.
link |
00:54:31.400
Like, you know, applying the physics tools
link |
00:54:35.000
of thinking about things in the limit is helpful.
link |
00:54:37.480
If you are stranded on a tropical island
link |
00:54:41.000
and you have a trillion dollars, it's useless.
link |
00:54:47.640
Because there's no resource allocation.
link |
00:54:50.280
Money is a database for resource allocation.
link |
00:54:52.640
If there's no resource to allocate except yourself,
link |
00:54:55.000
So money is useless.
link |
00:55:01.000
If you're stranded on a desert island with no food,
link |
00:55:05.640
all the Bitcoin in the world will not stop you from starving.
link |
00:55:16.320
So just think of money as a database
link |
00:55:20.840
for resource allocation across time and space.
link |
00:55:24.040
And then what system in what form should that database
link |
00:55:35.160
or data system, what would be most effective?
link |
00:55:39.080
Now, there is a fundamental issue with, say, Bitcoin
link |
00:55:43.880
in its current form in that the transaction volume
link |
00:55:47.960
is very limited.
link |
00:55:48.800
And the latency for a properly confirmed transaction
link |
00:55:57.640
is too long, much longer than you'd like.
link |
00:56:00.160
So it's actually not great from a transaction volume
link |
00:56:04.520
standpoint or a latency standpoint.
link |
00:56:09.200
So it is perhaps useful to solve an aspect
link |
00:56:14.360
of the money database problem,
link |
00:56:16.200
which is the sort of store of wealth
link |
00:56:19.960
or an accounting of relative obligations, I suppose.
link |
00:56:25.680
But it is not useful as a currency,
link |
00:56:29.440
as a day to day currency.
link |
00:56:30.840
But people have proposed different technological solutions.
link |
00:56:33.320
Like Lightning.
link |
00:56:34.320
Yeah, Lightning Network and the layer two technologies
link |
00:56:36.800
on top of that.
link |
00:56:37.880
I mean, it all seems to be kind of a trade off.
link |
00:56:40.880
But the point is, it's kind of brilliant to say
link |
00:56:43.160
that just think about it as information,
link |
00:56:44.560
think about what kind of database,
link |
00:56:46.240
what kind of infrastructure enables
link |
00:56:48.120
the exchange of information.
link |
00:56:48.960
Like if you're operating in an economy,
link |
00:56:51.360
and you need to have something that allows
link |
00:56:55.320
you to have efficient value ratios
link |
00:56:59.800
between products and services.
link |
00:57:01.440
So you've got this massive number of products
link |
00:57:03.240
and services, and you can't just barter.
link |
00:57:07.320
It's like, that would be extremely unwieldy.
link |
00:57:09.680
So you need something that gives you a ratio of exchange
link |
00:57:17.520
between goods and services.
link |
00:57:20.600
And then something that allows you to shift obligations
link |
00:57:25.240
across time, like debt, debt and equity,
link |
00:57:27.320
shift obligations across time.
link |
00:57:29.280
Then what does the best job of that?
link |
00:57:33.360
Part of the reason why I think there's some
link |
00:57:36.080
merit to dogecoin, even though it was obviously created
link |
00:57:38.760
as a joke, is that it actually does have
link |
00:57:44.160
a much higher transaction volume capability than Bitcoin.
link |
00:57:49.880
And the costs of doing a transaction,
link |
00:57:53.040
the dogecoin fee is very low.
link |
00:57:56.040
Like right now, if you want to do a Bitcoin transaction,
link |
00:57:58.280
the price of doing that transaction is very high.
link |
00:58:00.520
So you could not use it effectively for most things.
link |
00:58:04.400
And nor could it even scale to a high volume.
link |
00:58:11.920
And when Bitcoin started, I guess around 2008
link |
00:58:15.320
or something like that, the internet connections
link |
00:58:18.800
were much worse than they are today.
link |
00:58:20.880
Like an order of magnitude, I mean, they were way,
link |
00:58:24.960
way worse in 2008.
link |
00:58:26.880
So like having a small block size or whatever
link |
00:58:31.880
is, and a long synchronization time,
link |
00:58:35.760
made sense in 2008.
link |
00:58:37.840
But in 2021, or fast forward 10 years,
link |
00:58:41.400
it's like comically low.
link |
00:58:47.000
So I think there's some value
link |
00:58:52.280
to having a linear increase in the amount of currency
link |
00:58:57.000
that is generated.
link |
00:58:58.640
So because some amount of the currency,
link |
00:59:01.960
like if a currency is too deflationary,
link |
00:59:07.240
or I should say, if a currency is expected
link |
00:59:11.560
to increase in value over time,
link |
00:59:13.040
there's reluctance to spend it.
link |
00:59:15.240
Cause you're like, oh, if I,
link |
00:59:16.920
I'll just hold it and not spend it
link |
00:59:18.560
because its scarcity is increasing with time.
link |
00:59:20.640
So if I spend it now, then I will regret spending it.
link |
00:59:23.440
So I will just, you know, hodl it.
link |
00:59:26.080
But if there's some dilution of the currency
link |
00:59:30.680
occurring over time, that's more of an incentive
link |
00:59:32.840
to use it as a currency.
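The hold-versus-spend incentive described here can be put in toy numbers. A minimal sketch, with illustrative amounts and a one-year horizon that are assumptions, not real market data:

```python
# Toy model of the hold-vs-spend incentive; numbers are illustrative only.
def regret_of_spending(amount: float, expected_annual_change: float) -> float:
    """Purchasing power forgone by spending `amount` now instead of
    holding it for a year at the expected appreciation rate."""
    return amount * expected_annual_change

# Deflationary coin expected to appreciate 25%/year: real cost to spending,
# so holders hoard rather than transact.
assert regret_of_spending(100.0, 0.25) == 25.0
# Mildly diluted coin (value expected to fall 5%/year): holding is what
# costs you, which nudges people to actually use it as a currency.
assert abs(regret_of_spending(100.0, -0.05) + 5.0) < 1e-9
```

The sign of the "regret" is the whole point: positive under expected appreciation, negative under dilution.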
link |
00:59:34.280
So dogecoin, somewhat randomly, has
link |
00:59:41.320
just a fixed number of sort of coins
link |
00:59:45.000
or hash strings that are generated every year.
link |
00:59:49.640
So there's some inflation, but it's not percentage based.
link |
00:59:52.800
It's a fixed number.
link |
00:59:55.720
So the percentage of inflation
link |
00:59:58.400
will necessarily decline over time.
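The arithmetic behind this fixed-issuance point can be sketched directly. The supply and issuance figures below are hypothetical, not Dogecoin's actual parameters:

```python
# Illustrative sketch of fixed absolute issuance (hypothetical numbers).
def yearly_inflation_rates(initial_supply, issuance_per_year, years):
    """Inflation rate each year when a fixed *number* of coins is issued,
    rather than a fixed *percentage* of supply."""
    rates, supply = [], initial_supply
    for _ in range(years):
        rates.append(issuance_per_year / supply)  # shrinks as supply grows
        supply += issuance_per_year
    return rates

rates = yearly_inflation_rates(100e9, 5e9, years=10)
assert abs(rates[0] - 0.05) < 1e-12                  # 5% in year one
assert all(a > b for a, b in zip(rates, rates[1:]))  # strictly declining
```

Because the numerator is constant while the denominator grows, the percentage necessarily falls every year, which is the property described above.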
link |
01:00:02.720
So, I'm not saying
link |
01:00:04.440
that it's like the ideal system for a currency,
link |
01:00:07.600
but I think it actually is just fundamentally better
link |
01:00:10.600
than anything else I've seen, just by accident.
link |
01:00:15.800
So.
link |
01:00:16.640
You said, like, around 2008.
link |
01:00:19.800
So you're not, you know, some people suggested
link |
01:00:23.480
you might be Satoshi Nakamoto.
link |
01:00:24.840
You've previously said you're not, let me ask.
link |
01:00:27.120
You're not, for sure?
link |
01:00:28.800
Would you tell us if you were?
link |
01:00:30.160
Yes.
link |
01:00:31.000
Okay.
link |
01:00:31.840
Do you think it's a feature or bug
link |
01:00:34.800
that he's anonymous or she or they?
link |
01:00:39.000
It's an interesting kind of quirk of human history
link |
01:00:41.680
that there is a particular technology
link |
01:00:43.600
that has a completely anonymous inventor.
link |
01:00:46.240
Or creator.
link |
01:01:03.400
Well, I mean, you can look at the evolution of ideas
link |
01:01:10.080
before the launch of Bitcoin
link |
01:01:11.880
and see who wrote, you know, about those ideas.
link |
01:01:20.640
And then I don't know exactly,
link |
01:01:22.800
obviously I don't know who created Bitcoin
link |
01:01:25.160
for practical purposes,
link |
01:01:26.120
but the evolution of ideas is pretty clear for that.
link |
01:01:29.960
And like it seems as though like Nick Szabo
link |
01:01:33.000
is probably more than anyone else
link |
01:01:35.680
responsible for the evolution of those ideas.
link |
01:01:38.040
So he claims not to be Nakamoto,
link |
01:01:41.880
but I'm not sure. That's neither here nor there,
link |
01:01:45.360
but he seems to be the one more responsible
link |
01:01:48.280
for the ideas behind Bitcoin than anyone else.
link |
01:01:50.880
So perhaps singular figures
link |
01:01:53.360
aren't even as important as the figures involved
link |
01:01:56.240
in the evolution of ideas that led to a thing.
link |
01:01:58.200
So, you know, perhaps it's sad to think about history,
link |
01:02:03.600
but maybe most names will be forgotten anyway.
link |
01:02:06.920
What is a name anyway?
link |
01:02:08.040
It's a name attached to an idea.
link |
01:02:12.200
What does it even mean really?
link |
01:02:13.760
I think Shakespeare had a thing about roses and stuff,
link |
01:02:16.320
whatever he said.
link |
01:02:17.240
A rose by any other name
link |
01:02:18.720
smells as sweet.
link |
01:02:22.440
I got Elon to quote Shakespeare.
link |
01:02:24.360
I feel like I accomplished something today.
link |
01:02:26.920
Shall I compare thee to a summer's day?
link |
01:02:30.840
I'm gonna clip that out instead of doing it.
link |
01:02:34.040
Thou art more temperate and more fair.
link |
01:02:39.040
Autopilot, Tesla autopilot.
link |
01:02:46.160
Tesla autopilot has been through an incredible journey
link |
01:02:48.560
over the past six years,
link |
01:02:50.560
or perhaps even longer,
link |
01:02:52.800
in your mind, in the minds of many involved.
link |
01:02:57.080
I think that's where we first like connected
link |
01:02:58.840
really was the autopilot stuff, autonomy and.
link |
01:03:01.960
The whole journey was incredible to me to watch.
link |
01:03:05.200
I was,
link |
01:03:07.680
because I knew, well, part of it was I was at MIT
link |
01:03:10.320
and I knew the difficulty of computer vision.
link |
01:03:13.160
And I knew, I had a lot of colleagues
link |
01:03:15.240
and friends who did the DARPA Challenge.
link |
01:03:16.760
I knew how difficult it is.
link |
01:03:18.440
And so there was a natural skepticism.
link |
01:03:20.120
When I first drove a Tesla with the initial system
link |
01:03:23.680
based on Mobileye, I thought there's no way.
link |
01:03:27.480
So the first one I got in, I thought there's no way
link |
01:03:29.880
this car could maintain, like staying in the lane
link |
01:03:34.120
and create a comfortable experience.
link |
01:03:35.880
So my intuition initially was that the lane keeping problem
link |
01:03:39.480
is way too difficult to solve.
link |
01:03:41.720
Oh, lane keeping, yeah, that's relatively easy.
link |
01:03:43.600
Well, like, but not solved in the way that we just
link |
01:03:48.400
talked about previously: prototype versus a thing
link |
01:03:52.600
that actually creates a pleasant experience
link |
01:03:54.360
over hundreds of thousands of miles and millions.
link |
01:03:57.400
Yeah, so.
link |
01:03:58.240
I mean, we had to wrap a lot of code around the Mobileye thing.
link |
01:04:01.680
It doesn't just work by itself.
link |
01:04:04.360
I mean, that's part of the story
link |
01:04:06.360
of how you approach things sometimes.
link |
01:04:07.960
Sometimes you do things from scratch.
link |
01:04:09.640
Sometimes at first you kind of see what's out there
link |
01:04:12.440
and then you decide to do from scratch.
link |
01:04:14.320
That was one of the boldest decisions I've seen,
link |
01:04:17.160
both on the hardware and the software,
link |
01:04:18.800
to decide to eventually go from scratch.
link |
01:04:20.960
I thought, again, I was skeptical
link |
01:04:22.640
of whether that's going to be able to work out
link |
01:04:24.440
because it's such a difficult problem.
link |
01:04:26.840
And so it was an incredible journey.
link |
01:04:28.880
What I see now with everything,
link |
01:04:31.440
the hardware, the compute, the sensors,
link |
01:04:33.200
the things I maybe care and love about most
link |
01:04:37.280
is the stuff that Andrej Karpathy is leading
link |
01:04:40.040
with the data set selection,
link |
01:04:41.720
the whole data engine process,
link |
01:04:43.080
the neural network architectures,
link |
01:04:45.000
the way that, in the real world,
link |
01:04:47.280
the network is tested, validated,
link |
01:04:49.360
all the different test sets,
link |
01:04:52.320
versus the ImageNet model of computer vision
link |
01:04:54.720
like what's in academia; this is like real world
link |
01:04:58.360
artificial intelligence.
link |
01:04:59.840
So, and Andrej is awesome
link |
01:05:02.560
and obviously plays an important role,
link |
01:05:04.200
but we have a lot of really talented people driving things.
link |
01:05:07.680
So, and Ashok is actually the head
link |
01:05:11.560
of autopilot engineering.
link |
01:05:14.760
Andrej is the director of AI.
link |
01:05:16.360
AI stuff, yeah, yeah.
link |
01:05:17.680
So yeah, there's, I'm aware that there's an incredible team
link |
01:05:20.560
of just a lot going on.
link |
01:05:22.040
Yeah, I just, you know, people will give me too much credit
link |
01:05:26.040
and they'll give Andrej too much credit, so.
link |
01:05:28.680
And people should realize how much is going on
link |
01:05:31.680
under the hood.
link |
01:05:32.520
Yeah, it's just a lot of really talented people.
link |
01:05:36.480
The Tesla autopilot AI team is extremely talented.
link |
01:05:40.000
It's like some of the smartest people in the world.
link |
01:05:43.640
So yeah, we're getting it done.
link |
01:05:45.000
What are some insights you've gained
link |
01:05:47.640
over those five, six years of autopilot
link |
01:05:51.280
about the problem of autonomous driving?
link |
01:05:54.240
So, you leaped in having some sort of first principles,
link |
01:05:59.280
kinds of intuitions, but nobody knows
link |
01:06:02.280
how difficult the problem is.
link |
01:06:05.320
I thought the self driving problem would be hard,
link |
01:06:07.120
but it was harder than I thought.
link |
01:06:08.960
It's not like I thought it'd be easy.
link |
01:06:09.920
I thought it'd be very hard,
link |
01:06:10.760
but it was actually way harder than even that.
link |
01:06:14.200
So, what it comes down to at the end of the day
link |
01:06:17.000
is to solve self driving, you have to solve,
link |
01:06:21.560
you basically need to recreate what humans do to drive,
link |
01:06:28.720
which is humans drive with optical sensors,
link |
01:06:31.560
eyes, and biological neural nets.
link |
01:06:34.920
And so, that's how the entire road system
link |
01:06:37.880
is designed to work with basically passive optical
link |
01:06:42.880
and neural nets, biologically.
link |
01:06:46.120
And now,
link |
01:06:47.920
for full self driving to work,
link |
01:06:50.080
we have to recreate that in digital form.
link |
01:06:52.960
So that means cameras
link |
01:06:56.120
with advanced neural nets in silicon form,
link |
01:07:04.040
and then it will obviously solve for full self driving.
link |
01:07:08.000
That's the only way.
link |
01:07:09.000
I don't think there's any other way.
link |
01:07:10.320
But the question is, what aspects of human nature
link |
01:07:12.920
do you have to encode into the machine, right?
link |
01:07:15.560
Do you have to solve the perception problem, like detect?
link |
01:07:18.760
And then you first, like, realize,
link |
01:07:21.320
what is the perception problem for driving?
link |
01:07:23.080
Like all the kinds of things you have to be able to see.
link |
01:07:25.400
Like what do we even look at when we drive?
link |
01:07:27.880
I just recently heard Andrej talk at MIT
link |
01:07:32.440
about like car doors.
link |
01:07:33.720
I think it was the world's greatest talk
link |
01:07:35.560
of all time about car doors.
link |
01:07:36.960
The fine details of car doors.
link |
01:07:41.360
Like what is even an open car door, man?
link |
01:07:44.440
So like the ontology of that,
link |
01:07:46.880
that's the perception problem.
link |
01:07:48.000
We humans solve that perception problem
link |
01:07:49.840
and Tesla has to solve that problem.
link |
01:07:51.640
And then there's the control and the planning
link |
01:07:53.360
coupled with the perception.
link |
01:07:54.960
You have to figure out like what's involved in driving,
link |
01:07:58.280
like especially in all the different edge cases.
link |
01:08:02.320
And then, I mean, maybe you can comment on this,
link |
01:08:06.560
how much game theoretic kind of stuff needs to be involved,
link |
01:08:10.440
you know, at a four way stop sign.
link |
01:08:13.280
You know, as humans, when we drive,
link |
01:08:15.560
our actions affect the world.
link |
01:08:18.040
Like it changes how others behave.
link |
01:08:20.760
Most of the time when driving,
link |
01:08:22.080
you're usually just responding to the scene,
link |
01:08:27.400
as opposed to like really asserting yourself in the scene.
link |
01:08:31.320
Do you think?
link |
01:08:33.080
I think these sort of control logic conundrums
link |
01:08:37.560
are not the hard part.
link |
01:08:39.320
The, you know, let's see.
link |
01:08:45.560
What do you think is the hard part
link |
01:08:46.840
in this whole beautiful, complex problem?
link |
01:08:50.600
So it's a lot of frigging software, man.
link |
01:08:53.000
A lot of smart lines of code.
link |
01:08:57.320
For sure, in order to create an accurate vector space.
link |
01:09:03.960
So like you're coming from image space,
link |
01:09:08.280
which is like this flow of photons.
link |
01:09:12.560
You're going to the cameras,
link |
01:09:14.400
and then you have this massive bit stream in image space.
link |
01:09:23.240
And then you have to effectively compress
link |
01:09:29.560
a massive bit stream corresponding to photons
link |
01:09:34.560
that knocked off an electron in a camera sensor
link |
01:09:41.440
and turn that bit stream into vector space.
link |
01:09:47.920
By vector space, I mean like,
link |
01:09:51.680
you know, you've got cars and humans
link |
01:09:54.920
and lane lines and curves and traffic lights
link |
01:10:01.440
and that kind of thing.
link |
01:10:02.440
Once you have an accurate vector space,
link |
01:10:08.520
the control problem is similar to that of a video game,
link |
01:10:11.680
like Grand Theft Auto or Cyberpunk.
link |
01:10:14.120
If you have an accurate vector space,
link |
01:10:16.240
the control problem is,
link |
01:10:18.280
I wouldn't say it's trivial, it's not trivial, but it's,
link |
01:10:22.880
it's not like some insurmountable thing.
link |
01:10:28.880
But having an accurate vector space is very difficult.
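A rough sketch of the point being made: given an accurate vector space, a piece of driving control logic can be quite simple. The object fields and the two-second-gap rule below are illustrative assumptions, not Tesla's actual representation:

```python
from dataclasses import dataclass

# A "vector space" here is just a compressed list of typed objects with
# position and velocity, distilled from camera images. Illustrative only.
@dataclass
class TrackedObject:
    kind: str    # "car", "pedestrian", "lane_line", "traffic_light", ...
    x: float     # longitudinal position ahead of ego, meters
    y: float     # lateral offset from ego lane center, meters
    vx: float    # longitudinal velocity, m/s

def follow_distance_ok(ego_speed, scene):
    """With an accurate vector space, control can be this simple:
    keep at least a 2-second gap to the nearest in-lane lead car."""
    leads = [o for o in scene if o.kind == "car" and o.x > 0 and abs(o.y) < 2.0]
    if not leads:
        return True
    return min(leads, key=lambda o: o.x).x >= 2.0 * ego_speed

scene = [TrackedObject("car", x=45.0, y=0.3, vx=27.0)]
assert follow_distance_ok(20.0, scene)       # 45 m >= 40 m required gap
assert not follow_distance_ok(25.0, scene)   # 45 m < 50 m required gap
```

The hard part, as the conversation says, is producing that clean object list from raw photons, not the rule that consumes it.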
link |
01:10:32.120
Yeah, I think we humans don't give enough respect
link |
01:10:35.520
to how incredible the human perception system is,
link |
01:10:37.880
to mapping the raw photons
link |
01:10:41.520
to the vector space representation in our heads.
link |
01:10:44.680
Your brain is doing an incredible amount of processing
link |
01:10:47.480
and giving you an image that is a very cleaned up image.
link |
01:10:51.360
Like when we look around here, we see,
link |
01:10:53.360
like you see color in the corners of your eyes,
link |
01:10:55.320
but actually your eyes have very few cones,
link |
01:10:59.400
like cone receptors in the peripheral vision.
link |
01:11:02.240
Your eyes are painting color in the peripheral vision.
link |
01:11:05.640
You don't realize it, but your eyes
link |
01:11:07.240
are actually painting color.
link |
01:11:09.040
And your eyes also have like these blood vessels
link |
01:11:12.240
and other gnarly things.
link |
01:11:13.480
And there's a blind spot, but do you see your blind spot?
link |
01:11:16.360
No, your brain is painting in the missing, the blind spot.
link |
01:11:21.160
You can do these like, see these things online
link |
01:11:23.960
where you look here and look at this point
link |
01:11:25.960
and then look at this point.
link |
01:11:27.360
And it's, if it's in your blind spot,
link |
01:11:30.480
your brain will just fill in the missing bits.
link |
01:11:33.240
It's so cool. The peripheral vision is so cool.
link |
01:11:35.280
All the illusions from vision science
link |
01:11:37.920
make you realize just how incredible the brain is.
link |
01:11:40.640
The brain is doing a crazy amount of post processing
link |
01:11:42.640
on the vision signals from your eyes.
link |
01:11:45.840
It's insane.
link |
01:11:47.680
So, and then even once you get all those vision signals,
link |
01:11:53.360
your brain is constantly trying to forget as much as possible.
link |
01:11:57.720
So, memory is perhaps the weakest thing
link |
01:12:00.360
about the brain.
link |
01:12:01.920
So, because memory is so expensive to a brain
link |
01:12:05.280
and so limited, your brain is trying to forget
link |
01:12:08.240
as much as possible and distill the things that you see
link |
01:12:12.160
into the smallest amounts of information possible.
link |
01:12:16.520
So, your brain is trying to not just get to a vector space,
link |
01:12:19.400
but get to a vector space
link |
01:12:20.920
that is the smallest possible vector space
link |
01:12:23.600
of only relevant objects.
link |
01:12:26.640
And I think like, you can sort of look inside your brain,
link |
01:12:29.920
or at least I can, like when you drive down the road
link |
01:12:33.160
and try to think about what your brain is actually doing
link |
01:12:37.480
consciously.
link |
01:12:39.000
And it's like, you'll see a car,
link |
01:12:43.680
because you don't have cameras,
link |
01:12:46.600
you don't have eyes in the back of your head or the side.
link |
01:12:48.760
You know, so you say like, basically your head is
link |
01:12:52.520
like, you know, you basically have like two cameras
link |
01:12:55.960
on a slow gimbal.
link |
01:12:58.200
And eyesight, you know, it's not that great.
link |
01:13:01.720
Okay.
link |
01:13:02.560
And, you know, like, people are constantly
link |
01:13:05.120
distracted and thinking about things and texting
link |
01:13:07.240
and doing all sorts of things they shouldn't do in a car,
link |
01:13:09.320
changing the radio station.
link |
01:13:11.000
Having arguments, you know. So then,
link |
01:13:16.000
like, when's the last time you looked
link |
01:13:21.000
right and left, you know, or rearward, or even
link |
01:13:24.320
diagonally forward, to actually refresh your vector
link |
01:13:28.480
space.
link |
01:13:29.480
So you're glancing around, and what your mind is doing
link |
01:13:32.680
is trying to distill the relevant vectors, basically objects
link |
01:13:37.680
with a position and motion, and
link |
01:13:43.680
then editing that down to the least amount that's necessary
link |
01:13:48.800
for you to drive.
link |
01:13:49.960
It does seem to be able to edit it down or compress it even
link |
01:13:54.240
further into things like concepts.
link |
01:13:55.800
So it's like, the human mind seems
link |
01:13:58.440
to go sometimes beyond vector space to a sort of space of
link |
01:14:02.560
concepts to where you'll see a thing.
link |
01:14:05.120
It's no longer represented spatially somehow.
link |
01:14:07.560
It's almost like a concept that you should be aware of.
link |
01:14:10.120
Like if this is a school zone, you'll remember that.
link |
01:14:13.320
Yeah.
link |
01:14:14.160
As a concept, which is a weird thing to represent, but
link |
01:14:16.560
perhaps for driving, you don't need to fully represent
link |
01:14:20.040
those things, or maybe you get those kind of, um,
link |
01:14:24.440
well, you need to like establish a vector
link |
01:14:27.440
space and then actually have predictions for those
link |
01:14:33.440
vector spaces.
link |
01:14:34.120
So like, um, you know, like say you drive
link |
01:14:39.360
past, say, a bus, and you
link |
01:14:44.600
saw that, before you drove past the bus,
link |
01:14:48.480
people were crossing. Or just imagine there's
link |
01:14:51.200
like a large truck or something blocking sight.
link |
01:14:54.640
Um, but before you came up to the truck, you saw
link |
01:14:57.720
that there were some kids about to cross the road in front
link |
01:15:01.000
of the truck.
link |
01:15:01.440
Now you can no longer see the kids, but you would
link |
01:15:04.320
now know, okay, those kids are
link |
01:15:06.800
probably going to pass by the truck and cross the road, even
link |
01:15:10.760
though you cannot see them.
link |
01:15:11.920
So you have to have, um, memory, uh, you need to
link |
01:15:17.920
remember that there were kids there and you need to have
link |
01:15:20.360
some forward prediction of what their position will be
link |
01:15:23.960
at the time of relevance. It's a really hard problem.
link |
01:15:25.600
So with occlusions in computer vision, when you can't
link |
01:15:29.200
see an object anymore, even when it just walks behind a tree
link |
01:15:32.640
and reappears, that's a really, really, I mean, at least in
link |
01:15:35.840
academic literature, it's tracking through occlusions.
link |
01:15:39.280
It's very difficult.
link |
01:15:40.600
Yeah.
link |
01:15:40.840
We're doing it.
link |
01:15:41.520
Um, I understand this.
link |
01:15:43.440
So some of it is like object permanence, like the same
link |
01:15:48.720
thing happens with humans as with neural nets. Like when
link |
01:15:51.800
a toddler grows up, there's a point in time
link |
01:15:51.800
where, uh, they develop a sense of object
link |
01:15:55.600
permanence.
link |
01:15:56.200
So before a certain age, if you have a ball, uh, or a toy
link |
01:15:59.560
or whatever, and you put it behind your back and you pop it
link |
01:16:01.920
out, before they have object permanence, it's
link |
01:16:04.840
like a new thing every time.
link |
01:16:05.880
It's like, whoa, this toy went poof, disappeared.
link |
01:16:08.280
And now it's back again.
link |
01:16:09.280
And they can't believe it.
link |
01:16:10.000
And they can play peekaboo all day long because the
link |
01:16:12.440
peekaboo is fresh every time.
link |
01:16:16.160
But then they figure out object permanence.
link |
01:16:18.240
Then they realized, oh, no, the object is not gone.
link |
01:16:20.360
It's just behind your back.
link |
01:16:21.920
Um, sometimes I wish we never did figure out object permanence.
link |
01:16:26.320
Yeah.
link |
01:16:26.560
So that's, uh, that's an important problem to solve.
link |
01:16:31.640
Yes.
link |
01:16:32.040
So, an important evolution of the neural nets in
link |
01:16:34.840
the car is, uh, memory across both time and
link |
01:16:42.680
space.
link |
01:16:43.440
Um, so, you know, you have to say,
link |
01:16:47.080
like how long do you want to remember things for?
link |
01:16:48.960
And there's a cost to remembering
link |
01:16:51.880
things for a long time.
link |
01:16:53.240
So you can, you know, run out of memory if you try to remember
link |
01:16:57.000
too much for too long.
link |
01:16:58.520
Um, and then you also have things that are stale
link |
01:17:01.240
if you remember them for too long, and
link |
01:17:03.880
then you also need things that are remembered
link |
01:17:06.160
over time.
link |
01:17:06.880
So even if you, say, have, for argument's sake, five
link |
01:17:11.520
seconds of memory, uh, on a time basis, but let's say
link |
01:17:15.240
you're parked at a light and you saw, to
link |
01:17:19.640
use a pedestrian example, that people were waiting to cross
link |
01:17:22.680
the road, and you can't quite see them
link |
01:17:26.520
because of an occlusion.
link |
01:17:28.040
Uh, but they might wait for a minute before the light
link |
01:17:31.200
changes for them to cross the road.
link |
01:17:33.160
You still need to remember that that's where
link |
01:17:35.840
they were, um, and that they're probably going to cross the
link |
01:17:38.800
road type of thing.
link |
01:17:39.800
Um, so even if that exceeds your time based
link |
01:17:44.400
memory, it should not exceed your space memory.
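The ideas in this exchange, remembering occluded objects with a forward prediction, and bounding memory by space (number of tracks) rather than only a fixed clock, might be sketched like this. This is a toy constant-velocity model, not Tesla's implementation:

```python
# Toy sketch of "memory across both time and space". Illustrative only.
class ObjectMemory:
    def __init__(self, max_objects=32):
        self.max_objects = max_objects
        self.tracks = {}  # id -> (x, y, vx, vy, last_seen_time)

    def observe(self, obj_id, x, y, vx, vy, t):
        self.tracks[obj_id] = (x, y, vx, vy, t)
        if len(self.tracks) > self.max_objects:
            # space-bounded eviction: drop the stalest track, rather than
            # forgetting everything past a fixed time horizon
            stalest = min(self.tracks, key=lambda k: self.tracks[k][4])
            del self.tracks[stalest]

    def predict(self, obj_id, t):
        """Where an object we can no longer see probably is now."""
        x, y, vx, vy, t0 = self.tracks[obj_id]
        return (x + vx * (t - t0), y + vy * (t - t0))

mem = ObjectMemory()
# Kids seen at t=0 walking toward the road, then occluded by a truck.
mem.observe("kids", x=10.0, y=-4.0, vx=0.0, vy=1.5, t=0.0)
# Three seconds later, still occluded: predict they are crossing now.
assert mem.predict("kids", t=3.0) == (10.0, 0.5)
```

The prediction survives however long the occlusion lasts; the track only disappears when the spatial budget forces an eviction.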
link |
01:17:48.120
And then just the data engine side of that.
link |
01:17:50.480
So getting the data to learn all of the concepts that you're
link |
01:17:53.760
saying now is an incredible process.
link |
01:17:56.160
It's this iterative process of just, it's this
link |
01:17:59.080
HydraNet of many HydraNets. We're changing the name to
link |
01:18:04.560
something else.
link |
01:18:05.440
Okay.
link |
01:18:05.920
I'm sure it'll be equally as Rick and Morty like. Yeah, we've
link |
01:18:11.960
rearchitected the neural nets in the cars
link |
01:18:15.880
so many times, it's crazy.
link |
01:18:18.000
Oh, so every time there's a new major version, you'll rename it
link |
01:18:20.680
to something more ridiculous or, or memorable and beautiful.
link |
01:18:24.600
Sorry, not ridiculous, of course.
link |
01:18:26.200
If you see the full, like, array of neural
link |
01:18:30.400
nets that are operating in the car, it kind
link |
01:18:33.280
of boggles the mind.
link |
01:18:34.280
There's so, there's so many layers.
link |
01:18:36.280
It's crazy.
link |
01:18:36.960
Um, so yeah.
link |
01:18:40.960
Um, but we started off with, uh, simple neural
link |
01:18:47.360
nets that were basically image recognition on a single
link |
01:18:52.640
frame, from a single camera, and then trying to knit
link |
01:18:59.080
those together with, you know, C. I should
link |
01:19:03.920
say we're really primarily running C here because C plus
link |
01:19:07.360
plus is too much overhead and we have our own C compiler.
link |
01:19:10.840
So to get maximum performance, we actually wrote our own C
link |
01:19:14.480
compiler and are continuing to optimize our C compiler, uh, for
link |
01:19:17.960
maximum efficiency.
link |
01:19:19.240
In fact, we've just recently, uh, done a new rev on a
link |
01:19:23.080
C compiler that will compile directly to our autopilot
link |
01:19:25.640
hardware.
link |
01:19:26.080
Um,
link |
01:19:26.600
So you want to compile the whole thing down with your
link |
01:19:28.720
own compiler?
link |
01:19:29.720
Yeah.
link |
01:19:30.160
Like so efficiency here, cause there's all kinds of compute.
link |
01:19:33.360
There's CPU, GPU, there's like the ASIC type of thing that's,
link |
01:19:36.760
and you have to somehow figure out the scheduling across all
link |
01:19:39.040
of those things.
link |
01:19:39.520
And so you're compiling the code down.
link |
01:19:41.480
Yeah.
link |
01:19:41.720
It does all the, okay.
link |
01:19:43.240
This is, so that's why there's a lot of people involved.
link |
01:19:45.920
There's a lot of hardcore, uh, software engineering at a
link |
01:19:50.800
very sort of bare metal level, uh, cause we're trying to do
link |
01:19:55.880
a lot of compute that's constrained to, you know, our
link |
01:20:01.880
full self driving computer.
link |
01:20:02.960
So, and we want to try to have the highest frames per second
link |
01:20:07.680
possible, um, with a sort of very finite amount of
link |
01:20:12.560
compute, um, and power.
link |
01:20:15.040
So, um, we really put a lot of effort into the efficiency
link |
01:20:20.800
of our compute.
link |
01:20:22.000
Um, and, uh, so there's actually a lot of work done by some
link |
01:20:27.600
very talented software engineers at Tesla, uh, at a very
link |
01:20:31.880
foundational level, to improve the efficiency of compute and how
link |
01:20:35.480
we use the Trip accelerators, uh, which are basically,
link |
01:20:39.800
um, you know, doing matrix math, dot products, uh,
link |
01:20:45.280
like a bazillion dot products.
link |
01:20:47.120
And it's like, what are neural nets?
link |
01:20:49.360
It's like, compute wise, like 99% dot products.
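The "99% dot products" point can be made concrete: a dense layer computes one dot product per output neuron, so a multiply-accumulate is the dominant operation. A minimal sketch:

```python
# Minimal sketch: a dense layer y = W.x is one dot product per output row.
def dense(W, x):
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

W = [[1.0, 2.0],
     [0.0, -1.0],
     [3.0, 0.5]]
x = [2.0, 4.0]
assert dense(W, x) == [10.0, -4.0, 8.0]

# Multiply-accumulate count for this layer: one MAC per weight.
assert len(W) * len(W[0]) == 6
```

Hardware accelerators for neural nets are, in essence, machines for doing this inner loop as fast and as efficiently as possible.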
link |
01:20:54.120
So, you know, um,
link |
01:20:56.840
And you want to achieve a high frame rate, like a video game.
link |
01:21:00.360
You want full resolution, high frame rate, low latency,
link |
01:21:06.400
um, low jitter. Uh, so, um, I think one of the things we're moving
link |
01:21:17.280
towards now is no post processing of the image through the
link |
01:21:25.600
image signal processor.
link |
01:21:26.600
So what happens with almost all
link |
01:21:32.960
cameras is that there's a lot of post-processing done in order
link |
01:21:37.960
to make pictures look pretty.
link |
01:21:40.200
And we don't care about pictures looking pretty.
link |
01:21:42.440
We just want the data.
link |
01:21:44.960
So we're moving to just raw photon counts.
link |
01:21:48.720
So the image that the computer sees is
link |
01:21:55.120
actually much more than what you'd see if it were represented as a camera image.
link |
01:21:59.040
It's got much more data.
link |
01:22:00.040
And even in very low light conditions, you can see that there's
link |
01:22:02.840
a small photon count difference between the spot here and
link |
01:22:07.640
the spot there, which means it can see in the dark incredibly
link |
01:22:11.040
well, because it can detect these tiny differences in photon counts.
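A toy sketch of why raw counts matter (assumed bit depths, not Tesla's actual pipeline): two dark pixels that differ in the raw 12-bit sensor counts can collapse to the same value once tone-mapped down to a pretty 8-bit display image.

```python
import numpy as np

# Two dark pixels and two bright ones, as 12-bit raw sensor counts.
raw = np.array([3, 5, 900, 4000], dtype=np.uint16)

# Naive "make it pretty" step: linearly scale 12-bit counts to 8 bits.
display = (raw.astype(np.float64) / 4095.0 * 255.0).astype(np.uint8)

# Both dark pixels quantize to 0 in the display image: the difference
# is gone. The raw counts still see a 2-photon difference.
dark_diff_display = int(display[1]) - int(display[0])
dark_diff_raw = int(raw[1]) - int(raw[0])
```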
link |
01:22:16.440
Much better than you'd possibly imagine.
link |
01:22:19.840
And then we also save 13 milliseconds of latency.
link |
01:22:25.920
From removing the post-processing of the image?
link |
01:22:30.720
Yes.
link |
01:22:31.080
Yeah.
link |
01:22:31.320
Because we've got eight cameras, and there's
link |
01:22:38.320
roughly, I don't know, one and a half milliseconds or so, maybe 1.6 milliseconds
link |
01:22:42.640
of latency for each camera.
link |
01:22:45.680
And so basically bypassing the image signal processor
link |
01:22:55.120
gets us back 13 milliseconds of latency, which is important. And we
link |
01:23:00.600
track latency all the way from photon hits the camera to, you
link |
01:23:05.360
know, all the steps that it's got to go through,
link |
01:23:08.720
the various neural nets and the C code, and there's a
link |
01:23:13.920
little bit of C++ there as well.
link |
01:23:15.600
Well, maybe a lot, but the core stuff, the heavy-duty compute, is
link |
01:23:21.280
all in C. And so we track that latency all the way to an
link |
01:23:27.760
output command to the drive unit to accelerate, the brakes to slow
link |
01:23:33.520
down, the steering to turn left or right.
link |
01:23:36.160
Because you've got to output a command that's going to go to a controller,
link |
01:23:40.160
and some of these controllers have an update frequency that's maybe 10
link |
01:23:43.680
hertz or something like that, which is slow.
link |
01:23:45.680
Now you lose a hundred milliseconds potentially.
link |
01:23:48.080
So then we want to update the drivers on, say, the steering and
link |
01:23:55.360
braking control to have more like 100 hertz instead of 10 hertz, and you
link |
01:24:00.480
get a 10 millisecond worst-case latency instead of a hundred milliseconds.
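A sanity check on those numbers (toy arithmetic, not Tesla's code): the worst-case wait on a polling controller is one full update period, and the image-processor saving quoted earlier is roughly eight cameras times about 1.6 ms each.

```python
# Worst-case wait for a controller that polls at a fixed frequency:
# a command can land just after a tick, so you wait one full period.
def worst_case_wait_ms(update_hz: float) -> float:
    return 1000.0 / update_hz

slow = worst_case_wait_ms(10)    # 100 ms at 10 Hz
fast = worst_case_wait_ms(100)   # 10 ms at 100 Hz

# The image-signal-processor saving mentioned earlier:
# eight cameras at ~1.6 ms each is about the 13 ms quoted.
isp_saving_ms = 8 * 1.6          # 12.8 ms
```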
link |
01:24:03.920
And actually jitter is more of a challenge than latency.
link |
01:24:06.880
Because latency you can anticipate and predict, but if
link |
01:24:09.680
you've got a stack up of things going from the camera to
link |
01:24:13.440
the computer, through
link |
01:24:16.160
a series of other computers.
link |
01:24:19.360
And finally to an actuator on the car.
link |
01:24:22.080
If you have a stack up of timing tolerances, then you
link |
01:24:27.200
can have quite a variable latency, which is called jitter.
link |
01:24:30.320
And that makes it hard to anticipate exactly how you should
link |
01:24:35.840
turn the car or accelerate.
link |
01:24:37.440
Because, you know, if you've got maybe 150 to 200 milliseconds of jitter, then
link |
01:24:42.560
you could be off by up to 0.2 seconds.
link |
01:24:45.040
And this could make a big difference.
link |
01:24:46.720
So you have to interpolate somehow to deal with the effects of jitter,
link |
01:24:51.760
so you can make robust control decisions.
link |
01:24:57.440
Yeah.
link |
01:24:57.680
So is the jitter in the sensor information, or is it... The jitter
link |
01:25:02.000
can occur at any stage in the pipeline.
link |
01:25:04.320
If you have a fixed latency, you can anticipate
link |
01:25:10.080
and say, okay, we know that our information is, for argument's sake,
link |
01:25:16.400
150 milliseconds stale.
link |
01:25:18.880
So, for argument's sake, 150 milliseconds from photons hitting the
link |
01:25:24.240
camera to where you can measure a change in the acceleration of the vehicle.
link |
01:25:33.600
So then you're going to say, okay, well, we know it's
link |
01:25:38.400
150 milliseconds, so we're going to take that into account and compensate for that
link |
01:25:43.600
latency.
link |
01:25:44.080
However, if you've got 150 milliseconds of latency plus 100 milliseconds of jitter,
link |
01:25:49.040
which could be anywhere from zero to 100 milliseconds on top,
link |
01:25:52.080
then your latency could be from 150 to 250 milliseconds.
link |
01:25:55.600
Now you've got 100 milliseconds that you don't know what to do with, and that's basically random.
link |
01:26:01.280
So getting rid of jitter is extremely important.
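A toy simulation of the point above (assumed speed and delay numbers, not Tesla's control code): extrapolating a stale measurement forward works perfectly when the latency is fixed and known, but random jitter leaves a residual error you cannot compensate for.

```python
import random

SPEED = 20.0           # vehicle speed in m/s (assumed)
FIXED_LATENCY = 0.150  # known, constant staleness in seconds

def compensate(measured_pos: float) -> float:
    # Extrapolate the stale measurement forward by the known latency.
    return measured_pos + SPEED * FIXED_LATENCY

# With zero jitter, compensation is exact.
true_pos = 100.0
stale = true_pos - SPEED * FIXED_LATENCY
exact_error = abs(compensate(stale) - true_pos)   # 0.0

# With 0-100 ms of random jitter, only the fixed part gets corrected.
random.seed(0)
errors = []
for _ in range(1000):
    jitter = random.uniform(0.0, 0.100)
    stale = true_pos - SPEED * (FIXED_LATENCY + jitter)
    errors.append(abs(compensate(stale) - true_pos))

# Residual error is SPEED * jitter: up to 2 meters at 20 m/s.
worst = max(errors)
```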
link |
01:26:04.080
And that affects your control decisions and all those kinds of things.
link |
01:26:07.280
Okay. Yeah, the car is going to fundamentally maneuver better with lower jitter.
link |
01:26:12.800
Got it.
link |
01:26:13.120
And the cars will maneuver with superhuman ability and reaction time,
link |
01:26:17.840
much faster than a human.
link |
01:26:20.240
I mean, I think over time Autopilot, full self-driving will be capable of maneuvers that,
link |
01:26:26.880
you know,
link |
01:26:32.240
are far more than what James Bond could do in the best movie type of thing.
link |
01:26:36.240
That's exactly what I was imagining in my mind as you said it.
link |
01:26:40.080
It's like impossible maneuvers that a human couldn't do.
link |
01:26:44.880
Well, let me ask, looking back over the six years, looking out into the future,
link |
01:26:50.080
based on your current understanding, how hard do you think this
link |
01:26:53.440
full self-driving problem is? When do you think Tesla will solve level four FSD?
link |
01:27:00.800
I mean, it's looking quite likely that it will be next year.
link |
01:27:03.120
And what does the solution look like? Is it the current pool of FSD beta candidates?
link |
01:27:11.200
They start getting greater and greater degrees of autonomy, as they have been.
link |
01:27:15.600
And then there's a certain level beyond which they can do their own thing; they can read a book.
link |
01:27:22.800
Yeah. I mean, anybody who's been following the
link |
01:27:27.600
full self-driving beta closely will see that the rate of disengagement has been
link |
01:27:35.600
dropping rapidly. A disengagement being where the driver intervenes to prevent the car
link |
01:27:41.280
from doing something potentially dangerous. So the interventions per million
link |
01:27:52.800
miles have been dropping dramatically, and that trend looks like it happens next
link |
01:28:00.560
year, that the probability of an accident on FSD is less than that of the average
link |
01:28:09.600
human, and then significantly less than that of the average human. So it certainly
link |
01:28:16.480
appears like we will get there next year. Then of course there's
link |
01:28:23.840
going to be a case of, okay, we now have to prove this to regulators,
link |
01:28:27.360
and we want a standard that is not just equivalent to a human, but
link |
01:28:32.720
much better than the average human. I think it's got to be at least two or three times
link |
01:28:36.880
higher safety than a human. So two or three times lower probability of injury than a human.
link |
01:28:41.760
before we would actually say, okay, it's okay to go. It's not going to be a
link |
01:28:46.400
close call, it's going to be much better. So if you look at FSD 10, 10.6 just came out recently,
link |
01:28:53.280
10.7 is on the way, maybe 11 is on the way somewhere in the future. Yeah. We were hoping
link |
01:28:59.920
to get 11 out this year, but 11 actually has a whole bunch of fundamental
link |
01:29:06.640
rewrites of the neural net architecture, and some fundamental improvements in
link |
01:29:15.280
creating vector space. So there is some fundamental leap that really deserves
link |
01:29:23.120
the 11. I mean, that's a pretty cool number. Yeah. 11 would be a single stack for
link |
01:29:30.240
all, you know, one stack to rule them all. But there's just some really
link |
01:29:37.680
fundamental neural net architecture changes that will allow for much more
link |
01:29:47.040
capability, but at first they're going to have issues. So we have this working
link |
01:29:52.880
as sort of alpha software, and it's good, but it's basically taking
link |
01:30:00.160
a whole bunch of C and C++ code and deleting a massive amount of C++ code
link |
01:30:05.200
and replacing it with a neural net. And you know, Andrej makes this point a lot,
link |
01:30:09.200
that neural nets are kind of eating software. You know, over time there's
link |
01:30:14.160
less and less conventional software, more and more neural net, which is still software,
link |
01:30:18.560
it still comes out as lines of software, but it's more neural net stuff, and
link |
01:30:25.360
less heuristics basically. More matrix-based
link |
01:30:35.360
stuff, and less heuristics-based stuff. And one of the big changes
link |
01:30:49.200
will be, like right now the neural nets deliver a giant bag of points
link |
01:31:00.160
to the C and C++ code. Yeah. We call it the giant bag of points.
link |
01:31:07.200
Yeah. And it's like, you've got a pixel and something associated with that pixel,
link |
01:31:12.960
like this pixel is probably car, this pixel is probably lane line. Then you've got to
link |
01:31:18.320
assemble this giant bag of points in the C code and turn it into vectors. And
link |
01:31:27.520
it does a pretty good job of it, but we need another layer
link |
01:31:34.080
of neural nets on top of that to take the giant bag of points and distill that down to
link |
01:31:41.200
vector space in the neural net part of the software, as opposed to the heuristics
link |
01:31:46.960
part of the software. This is a big improvement. Neural nets all the way down. That's what
link |
01:31:52.560
you want. It's not even all neural nets, but this
link |
01:31:58.080
is a game changer to not have the bag of points, the giant bag of points that has to be assembled
link |
01:32:03.920
with many lines of C and C++, and have a neural net just
link |
01:32:10.560
assemble those into vectors. So the neural net is outputting much, much less
link |
01:32:18.480
data. It's outputting, this is a lane line, this is a curb, this is drivable
link |
01:32:23.200
space, this is a car, this is a pedestrian or cyclist or something like that.
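A hypothetical sketch of the contrast being described (class names, shapes, and the box format are all assumptions for illustration, not Tesla's interfaces): a per-pixel "bag of points" that heuristic code has to group into objects, versus a handful of object vectors emitted directly.

```python
import numpy as np

CAR, LANE, FREE = 0, 1, 2

# Per-pixel semantic labels, as if argmax'd from a segmentation net's
# per-pixel class scores: the "giant bag of points".
labels = np.array([
    [FREE, FREE, FREE, FREE, FREE, FREE],
    [LANE, LANE, LANE, LANE, LANE, LANE],
    [FREE, CAR,  CAR,  FREE, FREE, FREE],
    [FREE, CAR,  CAR,  FREE, FREE, FREE],
])

# Heuristic C-style post-processing: group same-class pixels into one
# coarse object vector (here, a bounding box x0, y0, x1, y1).
def pixels_to_vector(labels, cls):
    ys, xs = np.where(labels == cls)
    if xs.size == 0:
        return None
    return (int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max()))

car_box = pixels_to_vector(labels, CAR)

# The alternative discussed above: a final neural-net layer that emits a
# few (class, box) vectors directly, so the control code consumes far
# less data than the full per-pixel grid.
vectors = [("car", car_box), ("lane_line", pixels_to_vector(labels, LANE))]
```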
link |
01:32:29.120
It's really outputting proper vectors to the C and C++ control
link |
01:32:38.720
code, as opposed to constructing the vectors in C, which we've done,
link |
01:32:50.880
I think, quite a good job of, but it's kind of hitting a local maximum on
link |
01:32:56.480
how well the C can do this. So this is really a big deal. And
link |
01:33:03.040
just all of the networks in the car need to move to surround video. There's still some
link |
01:33:07.040
legacy networks that are not surround video. And all of the training needs to move to
link |
01:33:13.920
surround video, and the efficiency of the training needs to get better than it is. And
link |
01:33:18.880
then we need to move everything to raw photon counts as opposed to processed
link |
01:33:27.840
images. Okay. It's just quite a big reset on the training, because the system's trained
link |
01:33:32.480
on post-processed images. So we need to redo all the training to train against
link |
01:33:39.440
the raw photon counts instead of the post-processed image. So ultimately it's kind of
link |
01:33:45.280
reducing the complexity of the whole thing. The lines of code will
link |
01:33:50.480
actually go lower. Yeah. That's fascinating. So you're doing fusion of all the sensors
link |
01:33:56.000
and reducing the complexity of having to deal with these cameras. There's a lot of cameras
link |
01:33:59.920
really. Right. Yes. Same with humans. Well, I guess we've got ears too. Okay. Yeah.
link |
01:34:07.680
Well, we'll actually need to incorporate sound as well. Because you need to
link |
01:34:12.400
listen for ambulance sirens or fire trucks, or if somebody's,
link |
01:34:19.360
you know, yelling at you or something. There's a little bit of
link |
01:34:22.400
audio that needs to be incorporated as well. Do you need to take a break? Yeah, let's
link |
01:34:26.560
just take a break. Okay. Honestly, frankly, the ideas are the easy thing and the
link |
01:34:33.600
implementation is the hard thing. The idea of going to the moon is the easy part,
link |
01:34:37.600
but going to the moon is the hard part. And there's a lot of hardcore
link |
01:34:42.160
engineering that's got to get done at the hardware and software level. Like optimizing the
link |
01:34:48.000
C compiler and just cutting out latency everywhere. If we don't
link |
01:34:57.440
do this, the system will not work properly. So the work of the engineers doing this,
link |
01:35:02.880
they are like the unsung heroes, but they are critical to the success of the
link |
01:35:07.840
situation. I think you made it clear. I mean, at least to me, it's super exciting. Everything
link |
01:35:11.680
that's going on outside of what Andrej is doing. Yeah. Just the whole infrastructure, the software.
link |
01:35:16.800
I mean, everything that's going on with the data engine, whatever it's called,
link |
01:35:21.600
the whole process is just a work of art to me.
link |
01:35:24.480
Yeah. I think the sheer scale of it boggles the mind. Like the training,
link |
01:35:27.200
the amount of work done with it. Like we've written all this custom software for training
link |
01:35:31.600
and labeling, and to do auto labeling. Auto labeling is essential, because especially
link |
01:35:38.800
when you've got surround video, labeling surround video from scratch
link |
01:35:44.960
is extremely difficult. It takes a human such a long time to even label one video clip,
link |
01:35:51.760
like several hours. Whereas to auto label it, basically we just apply heavy-duty,
link |
01:35:59.040
a lot of compute to the video clips, to pre-assign and guess what all the
link |
01:36:06.800
things are that are going on in the surround video. And then there's correcting it.
link |
01:36:10.240
Yeah. And then all the human has to do is tweak, like adjust
link |
01:36:14.800
what is incorrect. This increases productivity by a factor of a hundred
link |
01:36:20.080
or more. Yeah. Uh, so you've presented Tesla bot as primarily useful in the factory. First of all,
link |
01:36:25.440
as a fan of robotics I think humanoid robots are incredible. I think the elegance of movement
link |
01:36:32.240
that humanoid, bipedal robots show is just so cool. So it's really
link |
01:36:39.360
interesting that you're working on this, and also talking about applying all the
link |
01:36:43.440
ideas, some of which you've talked about with the data engine, all the things that we were talking
link |
01:36:47.520
about with Tesla Autopilot, just transferring that over to yet another robotics problem.
link |
01:36:54.400
I have to ask, since I care about human robot interaction, so the human side of that,
link |
01:36:59.040
you've talked about it mostly in the factory. Do you see part of this
link |
01:37:04.400
problem that Tesla Bot has to solve as interacting with humans, and potentially having a place
link |
01:37:08.720
like in the home? So interacting, not just replacing labor, but also, I don't know,
link |
01:37:15.600
being a friend or an assistant. Yeah, I think the possibilities are endless.
link |
01:37:27.040
Yeah. I mean, it's obviously not quite in Tesla's primary mission
link |
01:37:32.240
direction of accelerating sustainable energy, but it is an extremely useful thing
link |
01:37:37.680
that we can do for the world, which is to make a useful humanoid robot that is capable of
link |
01:37:43.360
interacting with the world and helping in many different ways. So,
link |
01:37:51.120
certainly in factories, but really, I think if you extrapolate
link |
01:37:57.440
to many years in the future, I think work will become optional.
link |
01:38:03.760
Yeah. There's a lot of jobs that, if people weren't paid to do them,
link |
01:38:10.400
they wouldn't do them. It's not necessarily fun. Like
link |
01:38:15.040
if you're washing dishes all day, even if you really like washing dishes,
link |
01:38:19.520
do you really want to do it for eight hours a day, every day? Probably not. And then there's
link |
01:38:26.240
dangerous work. Basically, if it's dangerous, boring, has potential
link |
01:38:31.440
for repetitive stress injury, that kind of thing, then that's really where
link |
01:38:36.480
humanoid robots would add the most value initially. So that's what we're aiming for:
link |
01:38:45.840
for the humanoid robots to do jobs that people don't voluntarily want to do.
link |
01:38:50.880
And then we'll obviously have to pair that with some kind of universal basic income in the
link |
01:38:55.680
future, I think. So do you see a world where there's hundreds of millions of Tesla
link |
01:39:04.320
Bots performing different tasks throughout the world?
link |
01:39:12.160
Yeah. I haven't really thought about it that far into the future, but I guess that there may be
link |
01:39:15.440
something like that. So I guess it's a wild question. The number of Tesla cars has
link |
01:39:24.560
been accelerating. It's been close to 2 million produced. Many of them have autopilot. I think
link |
01:39:30.320
we're over 2 million now. Yeah. Do you think there will ever be a time when there'll be more Tesla
link |
01:39:34.720
Bots than Tesla cars? Yeah. You know, actually, it's funny you ask this question, because
link |
01:39:44.400
normally I do try to think pretty far into the future, but I haven't really thought that far
link |
01:39:48.000
into the future with the Tesla Bot. It's code-named Optimus. I call it Optimus Subprime,
link |
01:39:59.120
because it's not like a giant Transformer robot. So,
link |
01:40:06.880
but it's meant to be a general-purpose helpful bot.
link |
01:40:10.640
And basically, Tesla, I think,
link |
01:40:21.360
has the most advanced real-world AI for interacting with the real world,
link |
01:40:26.240
which we developed as a function of making self-driving work. And so along with custom
link |
01:40:32.880
hardware and a lot of hardcore low-level software to have it run efficiently and
link |
01:40:39.520
be power efficient. Because it's one thing to do neural nets if you've got a
link |
01:40:43.840
gigantic server room with 10,000 computers, but now let's say you have to distill
link |
01:40:48.240
that down into one computer that's running at low power in a humanoid robot or a car.
link |
01:40:53.360
That's actually very difficult, and a lot of hardcore software work is required for that.
link |
01:40:58.960
So since we're kind of solving the navigate-the-real-world-with-neural-nets problem
link |
01:41:06.480
for cars, which are kind of robots with four wheels, it's a natural extension
link |
01:41:12.080
to put it in a robot with arms and legs and actuators. So,
link |
01:41:23.200
the two hard things are: you basically need to
link |
01:41:30.560
have the robot be intelligent enough to interact in a sensible way with the environment.
link |
01:41:36.880
So you need real-world AI, and you need to be very good at manufacturing,
link |
01:41:43.360
which is a very hard problem. Tesla is very good at manufacturing and also has the real world AI.
link |
01:41:49.360
So making the humanoid robot work basically means developing custom motors and sensors
link |
01:42:00.480
that are different from what a car would use. But
link |
01:42:07.920
I think we have the best expertise in developing advanced electric motors and
link |
01:42:13.840
power electronics. It just has to be for a humanoid robot application instead of a car.
link |
01:42:22.160
Still, you do talk about love sometimes. So let me ask, this isn't like sex robots
link |
01:42:28.480
or something like that? Love is the answer. Yes. There is something compelling to us,
link |
01:42:35.680
not compelling, but we connect with humanoid robots, or even legged robots
link |
01:42:40.480
in the shape of dogs. It seems like there's a huge amount
link |
01:42:46.240
of loneliness in this world. All of us seek companionship with other humans, friendship
link |
01:42:51.200
and all those kinds of things. Here in Austin, a lot of people have dogs.
link |
01:42:55.840
That's right. There seems to be a huge opportunity to also have robots that decrease
link |
01:43:01.840
the amount of loneliness in the world, or help us humans connect
link |
01:43:08.480
with each other, in the way that dogs can. Do you think about that
link |
01:43:13.600
with Tesla Bot at all? Or is it really focused on the problem of performing specific tasks,
link |
01:43:19.520
not connecting with humans? I mean, to be honest, I have not actually thought about it
link |
01:43:25.920
from the companionship standpoint, but I think it could actually be
link |
01:43:30.960
a very good companion. And it could develop a personality over time that is
link |
01:43:42.160
unique. It's not like all the robots are the same.
link |
01:43:47.280
And that personality could evolve to match the owner, or the,
link |
01:43:56.000
well, whatever you want to call it, the companion,
link |
01:44:02.880
the other half, right? In the same way that friends do. See, I think that's a huge opportunity.
link |
01:44:08.720
Yeah, no, that's interesting. Because, you know, there's a
link |
01:44:16.560
Japanese phrase I like, wabi-sabi, you know, the subtle imperfections
link |
01:44:21.360
are what make something special. And the subtle imperfections of the personality of the robot,
link |
01:44:27.920
mapped to the subtle imperfections of the robot's human friend, I don't know,
link |
01:44:35.040
owner sounds like maybe the wrong word, but could actually make an incredible buddy,
link |
01:44:41.360
basically. In that way, like an R2-D2 or C-3PO sort of thing, you know.
link |
01:44:46.320
So from a machine learning perspective, I think the flaws being a feature is really nice.
link |
01:44:53.840
You could be quite terrible at being a robot for quite a while in the general home environment
link |
01:44:58.800
or in the general world, and that's kind of adorable. Those are your flaws
link |
01:45:04.560
and you fall in love with those flaws. So it's very different from autonomous
link |
01:45:09.280
driving, where it's a very high-stakes environment and you cannot mess up. So it's more
link |
01:45:14.800
fun to be a robot in the home. Yeah. In fact, if you think of C-3PO and R2-D2, they
link |
01:45:22.000
actually had a lot of flaws and imperfections and silly things, and they would argue with each
link |
01:45:26.480
other. And were they actually good at doing anything? I'm not exactly sure.
link |
01:45:33.760
They definitely added a lot to the story. But there are these sort of quirky elements,
link |
01:45:40.480
and they would make mistakes and do things. It made them
link |
01:45:48.480
relatable, I don't know, endearing. So yeah, I think that could be something
link |
01:45:55.280
that probably would happen. But our initial focus is just to make it useful. So
link |
01:46:04.240
I'm confident we'll get it done. I'm not sure what the exact timeframe is, but
link |
01:46:08.400
we'll probably have, I don't know, a decent prototype towards the end of next year or something
link |
01:46:13.920
like that. And it's cool that it's connected to Tesla, the car. Yeah, it's using
link |
01:46:21.520
a lot of... it would use the Autopilot inference computer, and a lot of the training
link |
01:46:27.360
that we've done for the cars in terms of recognizing real-world things could be applied
link |
01:46:32.800
directly to the robot. But there's a lot of custom actuators
link |
01:46:40.160
and sensors that need to be developed. And an extra module on top of the vector space,
link |
01:46:45.600
For love. Yeah. That's amazing. Okay. We can add that to the car too.
link |
01:46:53.120
That's true. That could be useful in all environments. Like you said, a lot of people
link |
01:46:58.240
argue in the car, so maybe we can help them out. You're a student of history,
link |
01:47:03.440
a fan of Dan Carlin's Hardcore History podcast. Yeah, that's great. Greatest podcast ever.
link |
01:47:08.400
Yeah, I think it is actually. It almost doesn't really count as a podcast. It's more like
link |
01:47:14.960
an audiobook. Yeah. So you were on the podcast with Dan. I just had a chat with him about it.
link |
01:47:20.960
He said you guys went deep on military and all that kind of stuff. Yeah, it was
link |
01:47:25.120
basically, I think it should be titled Engineer Wars. Essentially, when
link |
01:47:33.440
there's a rapid change in the rate of technology, engineering plays a pivotal role in
link |
01:47:40.160
victory in battle. How far back in history did you go? Did you go World War
link |
01:47:46.080
II? It was mostly, well, it was supposed to be a deep dive on fighter and bomber
link |
01:47:53.360
technology in World War II, but it ended up being more wide-ranging than that,
link |
01:47:58.400
because I just went down a total rathole of studying all of the fighters and bombers
link |
01:48:03.600
of World War II, and the constant rock-paper-scissors game where
link |
01:48:09.680
one country would make this plane, then another would make a plane to beat that, and then they'd
link |
01:48:12.800
make a plane to beat that. And really what matters is the pace of
link |
01:48:16.960
innovation, and also access to high-quality fuel and raw materials. Germany
link |
01:48:25.840
had some amazing designs, but they couldn't make them, because they couldn't get the
link |
01:48:30.160
raw materials. They had a real problem with oil, and
link |
01:48:36.560
basically the fuel quality was extremely variable. So the design wasn't the bottleneck,
link |
01:48:41.440
because, yeah, the US had kickass fuel that was very consistent. The
link |
01:48:47.120
problem is, if you make a very high-performance aircraft engine, in order to get high
link |
01:48:50.960
performance, the fuel, the aviation gas, has to be a consistent
link |
01:49:00.560
mixture, and it has to have high octane. High octane is the most important thing,
link |
01:49:08.800
but it also can't have impurities and stuff, because you'll foul up the engine.
link |
01:49:13.920
And Germany just never had good access to oil. They tried to get it by invading the
link |
01:49:17.520
Caucasus, but that didn't work too well. Never works well.
link |
01:49:26.240
That's, that's for you. So Germany was always struggling with basically shitty
link |
01:49:30.400
oil. And they couldn't count on high-quality fuel for
link |
01:49:36.080
their aircraft, so they had to have all these additives and stuff. Whereas the
link |
01:49:44.000
US had awesome fuel, and provided that to Britain as well. That allowed the British
link |
01:49:50.400
and the Americans to design aircraft engines that were super high-performance, better than
link |
01:49:55.200
anything else in the world. Germany could design the engines; they just didn't have
link |
01:50:00.320
the fuel. And then also, like I said, the quality of the aluminum alloys that
link |
01:50:05.680
they were getting was not that great. And so, you know,
link |
01:50:08.320
do you talk about all this with Dan?
link |
01:50:11.360
Yeah. Awesome. Broadly looking at history, when you look at Genghis Khan, when you look at Stalin,
link |
01:50:18.400
Hitler, the darkest moments of human history, what do you take away from those moments?
link |
01:50:23.760
Does it help you gain insight about human nature, about human behavior today,
link |
01:50:27.120
whether it's the wars or the individuals or just the behavior of people, any aspects of history?
link |
01:50:40.960
Yeah, I find history fascinating.
link |
01:50:49.360
There's just a lot of incredible things that have been done, good and bad, that they
link |
01:50:54.560
help you understand the nature of civilization and individuals and...
link |
01:51:06.080
Does it make you sad that humans do these kinds of things to each other? You look at the 20th
link |
01:51:10.320
century, World War II, the cruelty, the abuse of power, talk about communism, Marxism and Stalin.
link |
01:51:19.200
I mean, some of these things do. I mean, there's a lot of human history,
link |
01:51:24.320
Most of it is actually people just getting on with their lives. You know, it's not like
link |
01:51:30.560
human history is just nonstop war and disaster. Actually,
link |
01:51:37.440
those are intermittent and rare. And if they weren't, then, you know, humans would soon cease to exist.
link |
01:51:46.400
But it's just that wars tend to be written about a lot. Whereas, like,
link |
01:51:51.600
a normal year where nothing major happened doesn't get
link |
01:51:57.440
written about much, but that's, you know, most people just like farming and kind of like living
link |
01:52:02.720
their life, you know, being a villager somewhere. And every now and again, there's a war
link |
01:52:10.480
and a thing. So, you know, I'd say there aren't very many books
link |
01:52:21.520
where I just had to stop reading because it was just too dark. But the book about Stalin,
link |
01:52:28.480
The Court of the Red Tsar, I had to stop reading. It was just too dark. Rough.
link |
01:52:36.160
Yeah. The thirties. There's a lot of lessons there. To me in particular, it feels
link |
01:52:46.240
like humans, like all of us have that, as Solzhenitsyn said, that the line between good
link |
01:52:54.240
and evil runs through the heart of every man, that all of us are capable of evil. All of us are capable
link |
01:52:58.720
of good. It's almost like this kind of responsibility that all of us have to tend towards
link |
01:53:06.400
the good. And so like to me, looking at history is almost like an example of, look, you have some
link |
01:53:12.080
charismatic leader that convinces you of things, it's too easy, based on that story, to do evil
link |
01:53:20.080
onto each other, onto your family and to others. And so it's like our responsibility to do good.
link |
01:53:25.280
It's not like now is somehow different from history. That can happen again. All of it can
link |
01:53:31.200
happen again. And yes, most of the time you're right. I mean, the optimistic view here is mostly
link |
01:53:37.520
people are just living life. And as you've often memed about, the quality of life was way worse
link |
01:53:44.240
back in the day and keeps improving over time through innovation and technology.
link |
01:53:48.640
But still, it's somehow notable that these blips of atrocities happen. Sure. Yeah. I mean, life was
link |
01:53:57.760
really tough for most of history. I mean, for most of human history, a good year would be
link |
01:54:06.880
one where not that many people in your village died of the plague, starvation, freezing to death,
link |
01:54:13.440
or being killed by a neighboring village. It's like, well, it wasn't that bad. You know, it was
link |
01:54:18.080
only like, you know, we lost 5% this year. That was a good year. You know, that would
link |
01:54:23.440
be par for the course. Just not starving to death would have been the primary goal
link |
01:54:28.560
of most people throughout history: making sure we'd have enough food to last through the
link |
01:54:33.120
winter and not freeze or whatever. So now food is plentiful. We have an obesity
link |
01:54:42.880
problem. You know, so. Well, yeah. The lesson there is to be grateful for the way things are
link |
01:54:49.520
now, for some of us. We've spoken about this offline. I'd love to get your thoughts about it here.
link |
01:54:59.760
If I sat down for a long form in person conversation with the president of Russia,
link |
01:55:05.120
Vladimir Putin, would you potentially want to call in for a few minutes to join in on a conversation
link |
01:55:12.000
moderated and translated by me? Sure. Yeah, sure. I'd be happy to do that.
link |
01:55:19.440
You've shown interest in the Russian language. Is this grounded in your interest in history
link |
01:55:23.840
of linguistics, culture, general curiosity? I think it sounds cool. Sounds cool. Now it looks cool.
link |
01:55:32.480
Well, you know, it takes a moment to read Cyrillic.
link |
01:55:37.280
Once you know what the Cyrillic characters stand for, actually, then reading Russian
link |
01:55:45.440
becomes a lot easier because there are a lot of words that are actually the same.
link |
01:55:49.200
Like bank is bank. So find the words that are exactly the same and now you start to understand
link |
01:55:58.640
Cyrillic. Yeah. If you can sound it out, there's at least some commonality
link |
01:56:05.760
of words. What about the culture? You love great engineering, physics. There's a tradition
link |
01:56:12.800
of the sciences there. Sure. You look at the 20th century from rocketry. So, you know,
link |
01:56:18.160
some of the greatest rocketry and space exploration has been done in the Soviet and the
link |
01:56:22.480
former Soviet Union. Yeah. So do you draw inspiration from that history? Just how this
link |
01:56:28.720
culture that in many ways, I mean, one of the sad things is because of the language,
link |
01:56:32.480
which a lot of it is lost to history because it's not translated, all those kinds of things,
link |
01:56:37.440
because it is in some ways an isolated culture. It flourishes within its borders.
link |
01:56:44.400
Yeah. So do you draw inspiration from those folks, from the history of science and engineering
link |
01:56:50.480
there? I mean, the Soviet Union, Russia, and Ukraine as well, have a really strong history
link |
01:57:00.480
in spaceflight. Like some of the most advanced and impressive things in history were done,
link |
01:57:07.760
you know, by the Soviet Union. So one cannot help but admire the
link |
01:57:17.840
impressive rocket technology that was developed. You know, after the fall of the Soviet Union,
link |
01:57:23.680
there's much less that happened. But
link |
01:57:33.360
still, things are happening, but not quite at the
link |
01:57:37.200
frenetic pace that was happening before the Soviet Union kind of
link |
01:57:43.840
dissolved into separate republics. Yeah. I mean, you know, there's Roscosmos,
link |
01:57:49.680
the Russian agency. I look forward to a time when those countries with China are working together,
link |
01:57:57.840
the United States are all working together. Maybe a little bit of friendly competition, but
link |
01:58:01.760
like, friendly competition is good. You know, governments are so slow, and the only thing slower
link |
01:58:06.960
than one government is a collection of governments. So the Olympics would be boring if everyone just
link |
01:58:14.720
crossed the finishing line at the same time. Yeah. Nobody would watch. Yeah. And people wouldn't
link |
01:58:19.760
try hard to run fast and stuff. So I think friendly competition is a good thing.
link |
01:58:26.400
This is also a good place to give a shout-out to a video titled The Entire Soviet Rocket Engine
link |
01:58:30.720
Family Tree by Tim Dodd, aka the Everyday Astronaut. It's like an hour and a half. It gives a full
link |
01:58:36.240
history of Soviet rockets. And people should definitely go check out and support Tim in general.
link |
01:58:41.360
That guy's super excited about the future, super excited about spaceflight. Every time I see
link |
01:58:46.480
anything by him, I just have a stupid smile on my face because he's so excited about stuff.
link |
01:58:51.120
Yeah. Tim Dodd is really, really great. If you're interested in anything to do with space,
link |
01:58:56.160
he's, in terms of explaining rocket technology to your average person, he's awesome. The best,
link |
01:59:02.560
I'd say. And I should say, like, part of the reason I switched us: Raptor at
link |
01:59:12.400
one point was going to be a hydrogen engine. But hydrogen has a lot of challenges. It's very low
link |
01:59:17.760
density. It's a deep cryogen, so it's only liquid, you know, very close to absolute zero. It
link |
01:59:23.200
requires a lot of insulation. So there are a lot of challenges there.
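As an aside, the density and temperature tradeoff behind that point can be made concrete. A minimal sketch, using rounded textbook values (the specific figures below are my own illustration, not from the conversation):

```python
# Approximate bulk properties of two liquid rocket fuels (rounded
# textbook values, used here only for illustration).
FUELS = {
    #  name               (boiling point K, density kg/m^3)
    "liquid hydrogen": (20.3, 71.0),
    "liquid methane": (111.7, 422.0),
}

h2_bp, h2_rho = FUELS["liquid hydrogen"]
ch4_bp, ch4_rho = FUELS["liquid methane"]

# Methane is roughly 6x denser and boils about 90 K warmer, so for the
# same fuel mass a methane tank is far smaller and needs much less
# insulation than a hydrogen tank.
density_ratio = ch4_rho / h2_rho
temp_gap_k = ch4_bp - h2_bp
```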
link |
01:59:28.560
And I was actually reading a bit about Russian rocket engine developments. And
link |
01:59:37.440
at least the impression I had was that the Soviet Union, Russia, and Ukraine primarily were
link |
01:59:45.520
actually in the process of switching to methalox. And there was some interesting
link |
01:59:52.320
test data for ISP. Like, they were able to get up to like a 380-second ISP with the
link |
02:00:00.240
methalox engine. And I was like, well, okay, that's actually really impressive. So
link |
02:00:08.640
I think you could actually get a much lower cost, like, in optimizing cost per
link |
02:00:15.360
ton to orbit, cost per ton to Mars, I think methane-oxygen is the way to go.
link |
02:00:25.040
And I was partly inspired by the Russian work on the test stands with methalox engines.
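To put that Isp figure in context: specific impulse converts directly into effective exhaust velocity, and through the rocket equation into delta-v. A quick sketch (the mass ratio below is a made-up illustrative number, not a figure from the conversation):

```python
import math

G0 = 9.80665  # standard gravity, m/s^2


def exhaust_velocity(isp_s: float) -> float:
    """Effective exhaust velocity (m/s) from specific impulse in seconds."""
    return isp_s * G0


def delta_v(isp_s: float, m0: float, mf: float) -> float:
    """Tsiolkovsky rocket equation: delta-v from Isp and wet/dry mass."""
    return exhaust_velocity(isp_s) * math.log(m0 / mf)


# A ~380 s vacuum Isp (the methalox figure mentioned above) corresponds
# to an effective exhaust velocity of about 3.7 km/s.
ve = exhaust_velocity(380)

# With an illustrative wet-to-dry mass ratio of 8, that Isp yields
# roughly 7.7 km/s of delta-v.
dv = delta_v(380, m0=8.0, mf=1.0)
```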
link |
02:00:32.560
And now for something completely different. Do you mind doing a bit of a meme review in the
link |
02:00:38.560
spirit of the great, the powerful PewDiePie? Let's say one to 11. Just go over a few documents
link |
02:00:44.080
printed out. We can try. Let's try this. I present to you document numero uno.
link |
02:00:56.320
I don't know. Okay. Vlad the Impaler discovers marshmallows.
link |
02:01:03.840
Yeah, that's not bad.
link |
02:01:04.560
So you get it, because he's impaling things. I don't know, three, whatever.
link |
02:01:14.480
That's not very good.
link |
02:01:19.040
This is grounded in some engineering, some history.
link |
02:01:28.560
Yeah, give it an eight out of 10.
link |
02:01:31.040
What do you think about nuclear power?
link |
02:01:32.480
I'm in favor of nuclear power. I think, in a place that is not subject to extreme natural
link |
02:01:40.800
disasters. I think it's a nuclear power is a great way to generate electricity.
link |
02:01:47.840
I don't think we should be shutting down nuclear power stations.
link |
02:01:51.680
Yeah, but what about Chernobyl?
link |
02:01:52.880
Exactly. So I think people, there's like a lot of fear of radiation and stuff.
link |
02:02:04.160
And it's, I guess, probably like a lot of people just don't
link |
02:02:09.440
even study engineering or physics. It's just the word radiation just sounds scary.
link |
02:02:15.280
You know, so they can't calibrate what radiation means. But radiation is much
link |
02:02:22.000
less dangerous than you think. So like, for example, Fukushima, when the Fukushima problem
link |
02:02:35.840
happened due to the tsunami, I got people in California asking me if they should worry about
link |
02:02:44.240
radiation from Fukushima. And I'm like, definitely not, not even slightly, not at all. That is crazy.
link |
02:02:54.000
And just to show, like, look, this is how, like, the danger is so much overplayed compared to what
link |
02:03:04.960
it really is, that I actually flew to Fukushima, and I donated a solar power system for a water
link |
02:03:14.160
treatment plant. And I made a point of eating locally grown vegetables on TV in Fukushima.
link |
02:03:28.640
Like, I'm still alive. Okay.
link |
02:03:31.360
So it's not even that the risk of these events is low, but the impact of them is.
link |
02:03:36.000
Impact is greatly exaggerated. It's just human nature.
link |
02:03:39.920
It's people who don't know what radiation is, like I've had people ask me, like, what about
link |
02:03:43.280
radiation from cell phones, quote, causing brain cancer? I'm like, when you say radiation,
link |
02:03:47.600
do you mean photons or particles? And they're like, I don't know. What do you mean,
link |
02:03:51.440
photons, particles? So, do you mean, let's say, photons? What frequency,
link |
02:03:58.240
wavelength? And they're like, no idea. Like, do you know that everything's radiating all the time?
link |
02:04:04.720
Like, what do you mean? Like, yeah, everything's radiating all the time.
link |
02:04:07.440
Photons are being emitted by all objects all the time, basically. So,
link |
02:04:14.320
and if you want to know what it means to stand in front of nuclear fire,
link |
02:04:20.080
go outside. The sun is a gigantic, you know, thermonuclear reactor that you're staring
link |
02:04:27.280
right at. Are you still alive? Yes. Okay. Amazing.
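The photon-frequency point can be made quantitative: a single photon's energy is E = h·f, and ionizing an atom or breaking a chemical bond takes energies on the order of electronvolts. A rough sketch (the 2.4 GHz phone/Wi-Fi band frequency is my illustrative choice, not a figure from the conversation):

```python
H = 6.62607015e-34    # Planck constant, J*s (exact SI value)
EV = 1.602176634e-19  # joules per electronvolt (exact SI value)


def photon_energy_ev(freq_hz: float) -> float:
    """Energy of a single photon at the given frequency, in eV."""
    return H * freq_hz / EV


# A 2.4 GHz radio photon carries about 1e-5 eV, while ionization and
# bond-breaking need a few eV -- roughly a factor of a million short of
# doing that kind of chemical damage.
e_radio = photon_energy_ev(2.4e9)
```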
link |
02:04:30.480
Yeah. I guess radiation is one of the words that could be used as a tool to fear
link |
02:04:37.840
monger by certain people. That's it. And I think people just don't understand. So, I mean,
link |
02:04:42.400
that's the way to fight that fear, I suppose: to understand, to learn.
link |
02:04:46.480
Yeah. Just say like, okay, how many people have actually died from nuclear accidents? It's like
link |
02:04:50.640
practically nothing. And say, how many people have died from, you know, coal plants? And
link |
02:04:57.360
it's a very big number. So like, obviously, we should not be starting up coal plants and shutting
link |
02:05:04.320
down nuclear plants. It just doesn't make any sense at all. Coal plants are, I don't know,
link |
02:05:09.280
a hundred to a thousand times worse for health than nuclear power plants.
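That comparison can be sanity-checked against published mortality-per-energy estimates. A sketch with approximate figures from public energy-safety datasets (treat the numbers as illustrative, not authoritative):

```python
# Rough published estimates of deaths per terawatt-hour of electricity.
DEATHS_PER_TWH = {
    "coal": 24.6,     # air pollution plus mining/plant accidents
    "nuclear": 0.03,  # accidents plus estimated radiation effects
}

# The ratio lands inside the "hundred to a thousand times" range
# mentioned above.
ratio = DEATHS_PER_TWH["coal"] / DEATHS_PER_TWH["nuclear"]
```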
link |
02:05:15.040
You want to go to the next one? This is really bad.
link |
02:05:17.440
So that 90, 180 and 360 degrees, everybody loves the math, nobody gives a shit about 270.
link |
02:05:27.840
It's not super funny. I don't know, 2 or 3. Yeah. This is not, you know, an LOL situation.
link |
02:05:37.360
Yeah.
link |
02:05:37.680
Yeah. That's pretty good.
link |
02:05:43.840
The United States oscillating between establishing and destroying dictatorships.
link |
02:05:48.400
It's like a metronome. Is that a metronome? Yeah. What does that mean? Yeah.
link |
02:05:51.440
Yeah. It's a 7 out of 10. It's kind of true. Oh, yeah. This is,
link |
02:05:55.760
this is kind of personal for me. Next one. Oh, man. This is Laika?
link |
02:06:00.480
Yeah. Well, no, this is... Or it's like referring to Laika or something?
link |
02:06:03.760
Laika's, like, husband. Hello. Yes. This is dog. Your wife was launched to space.
link |
02:06:11.360
And then the last one is him with his eyes closed and a bottle of vodka.
link |
02:06:16.400
Yeah. Laika didn't come back. No. They don't tell you the full story of, you know,
link |
02:06:21.920
the impact they had on the loved ones.
link |
02:06:24.960
True. That one gets an 11 for me. Sure.
link |
02:06:27.120
The Soviet set up. Yeah. This keeps going on the Russian theme. First man in space,
link |
02:06:34.720
nobody cares. First man on the moon. Well, I think people do care.
link |
02:06:37.840
No, I know. But...
link |
02:06:41.040
Yuri Gagarin's name will be forever in history, I think.
link |
02:06:45.440
There is something special about, like, stepping foot onto another totally foreign land.
link |
02:06:52.000
It's not the journey. Like people that explored the oceans, it's not as important to explore the
link |
02:06:57.360
oceans as to land on a whole new continent. Yeah.
link |
02:07:02.880
Well, this is about you. Oh, yeah. I'd love to get your comment on this.
link |
02:07:07.360
Elon Musk, after sending $6.6 billion to the UN to end world hunger, you have three hours.
link |
02:07:14.240
Yeah. Well, I mean, obviously $6 billion is not going to end world hunger. So
link |
02:07:24.000
I mean, the reality is, at this point, the world is producing
link |
02:07:29.280
far more food than it can really consume. Like, we don't have a caloric
link |
02:07:34.800
constraint at this point. So where there is hunger, it is almost always due to
link |
02:07:39.040
like, civil war, strife, or something like that. It's
link |
02:07:48.960
extremely rare for it to be just a matter of lack of money. It's like,
link |
02:07:54.640
you know, it's like the civil war in some country, and one part of the country is
link |
02:07:59.040
literally trying to starve the other part of the country. So it's much more complex than
link |
02:08:04.400
something that money could solve. It's geopolitics. It's a lot of things. It's human nature.
link |
02:08:10.480
It's governments. It's money, monetary systems, all that kind of stuff.
link |
02:08:14.560
Yeah. Food is extremely cheap these days. It's like,
link |
02:08:20.720
I mean, the US at this point, you know, among low income families, obesity is actually another
link |
02:08:27.440
problem. Like, obviously it's not hunger. It's too many calories. So
link |
02:08:36.160
it's not that nobody's hungry anywhere. It's just, this is not a simple matter of adding money and
link |
02:08:43.120
solving it. What do you think that one gets? It's getting... Two.
link |
02:08:50.720
Just going after empires. The world: Where did you get those artifacts? The British Museum
link |
02:08:59.040
has a shout out to Monty Python. We found them.
link |
02:09:02.800
Yeah. The British Museum is pretty great. I mean,
link |
02:09:06.880
Admittedly, Britain did take these historical artifacts from all around the world and put
link |
02:09:10.320
them in London. But, you know, it's not like people can't go see them. So it is a convenient
link |
02:09:17.680
place to see these ancient artifacts, London, for, you know, a large segment of the world.
link |
02:09:24.800
So I think, you know, on balance, the British Museum is a net good. Although I'm sure
link |
02:09:29.520
that a lot of countries would argue about that. Yeah. It's like, you want to make these historical
link |
02:09:33.920
artifacts accessible to as many people as possible. And the British Museum, I think,
link |
02:09:38.960
does a good job of that. Even if there's a darker aspect to, like, the history of Empire in general,
link |
02:09:44.720
whatever the Empire is, however things were done, it is the history that happened. You
link |
02:09:52.080
can't sort of erase that history, unfortunately. You could just become better in the future.
link |
02:09:56.400
That's the point. Yeah. I mean, it's like, well, how are we going to pass moral judgment on
link |
02:10:03.040
these things? Like, it's like, you know, if one is going to judge, say, the British Empire,
link |
02:10:10.160
you got to judge, you know, what everyone was doing at the time, and how were the British
link |
02:10:14.720
relative to everyone? And I think the British would actually get a relatively good grade,
link |
02:10:21.760
not in absolute terms, but compared to what everyone else was doing.
link |
02:10:29.680
They were not the worst. Like I said, you got to look at these things in the context of the
link |
02:10:33.600
history at the time, and say, what were the alternatives? And what are you comparing it
link |
02:10:37.840
against? Yes. And I do not think it would be the case that Britain would get a bad grade
link |
02:10:46.080
when looking at history at the time. You know, if you judge history from, you know,
link |
02:10:53.200
what is morally acceptable today, you're basically going to give everyone a failing
link |
02:10:57.680
grade. I mean, I don't think anyone would get a passing grade in their morality if,
link |
02:11:04.320
like you go back 300 years ago, like, who's getting a passing grade? Basically, no one.
link |
02:11:10.480
And we might not get a passing grade from generations that come after us. What does that one get?
link |
02:11:20.080
Sure. Six, seven, seven.
link |
02:11:22.320
For the Monty Python, maybe. I always love Monty Python. They're great.
link |
02:11:26.480
Life of Brian and the Quest for the Holy Grail are incredible.
link |
02:11:29.200
Yeah, yeah. Yeah, those serious eyebrows. Brezhnev. How important do you think facial hair is to
link |
02:11:36.560
great leadership? Well, you got a new haircut. How does that affect your leadership?
link |
02:11:43.840
I don't know. Hopefully not. It doesn't.
link |
02:11:48.240
Yeah, the second is no one.
link |
02:11:51.200
There is no one competing with Brezhnev. No one, too.
link |
02:11:53.520
Those are like epic eyebrows. So, sure. That's ridiculous.
link |
02:12:00.160
Give it a six or seven. I don't know. I like this, like, Shakespeare analysis of memes.
link |
02:12:05.520
Brezhnev, he had a flair for drama as well. Like, you know, showmanship.
link |
02:12:11.280
Yeah, yeah. It must come from the eyebrows. All right. Invention, great engineering.
link |
02:12:17.920
Look what I invented. That's the best thing since ripped up bread.
link |
02:12:21.360
Yeah. Because they invented sliced bread. Am I just explaining memes at this point?
link |
02:12:29.680
This is what my life has become.
link |
02:12:33.520
You're going to be more of a meme explainer.
link |
02:12:35.120
Yeah, I'm a meme scribe, like a scribe that runs around with the kings and just writes down
link |
02:12:43.120
memes. I mean, when was the cheeseburger invented? That's like an epic invention.
link |
02:12:47.200
Yeah. Like, wow.
link |
02:12:50.080
You know, versus just like a burger?
link |
02:12:53.120
Or burger. I guess a burger in general. It's like, you know.
link |
02:12:57.200
Then there's like, what is a burger? What's a sandwich? And then you start getting
link |
02:13:00.640
the pizza sandwich and what is the original? It gets into an ontology argument.
link |
02:13:05.760
Yeah. But everybody knows like, if you order like a burger or cheeseburger or whatever and you
link |
02:13:08.800
you've got, like, you know, tomato and some lettuce and onions and whatever, and, you know,
link |
02:13:12.560
mayo and ketchup and mustard. It's, like, epic.
link |
02:13:16.320
Yeah. But I'm sure they've had bread and meat separately for a long time and it was kind of
link |
02:13:20.560
a burger on the same plate, but somebody actually combined them into the same thing,
link |
02:13:25.600
and you can buy it and hold it, which makes it convenient. It's a materials problem. Your hands don't
link |
02:13:30.960
get dirty and whatever. Yeah, it's brilliant.
link |
02:13:38.240
Well, that is not what I would have guessed.
link |
02:13:40.240
But everybody knows like, if you order a cheeseburger, you know what you're getting,
link |
02:13:44.240
you know, it's not like something obtuse, like, I wonder what I'll get, you know.
link |
02:13:49.920
You know, fries are, I mean, great. I mean, they're the devil, but fries are awesome. And
link |
02:13:57.920
yeah, pizza is incredible. Food innovation doesn't get enough love.
link |
02:14:03.440
Yeah, I guess that's what we're getting at.
link |
02:14:04.800
It's great. What about the Matthew McConaughey Austinite here?
link |
02:14:11.840
President Kennedy, do you know how to put men on the moon yet?
link |
02:14:15.040
NASA: No. President Kennedy: It'd be a lot cooler if you did.
link |
02:14:20.160
Pretty much. Sure. Six or seven.
link |
02:14:26.000
And this is the last one. That's funny.
link |
02:14:29.120
Someone drew a bunch of dicks all over the walls of the Sistine Chapel boys' bathroom.
link |
02:14:35.440
Sure. I'll give it a nine. It's really true.
link |
02:14:39.360
This is our highest ranking meme for today.
link |
02:14:42.240
I mean, it's true. Like, how do they get away with that?
link |
02:14:44.480
Lots of nakedness. I mean, dick pics are, I mean, just something throughout history.
link |
02:14:50.400
As long as people can draw things, there's been a dick pic.
link |
02:14:52.800
It's a staple of human history.
link |
02:14:54.880
It's a staple. Consistent throughout human history.
link |
02:14:58.320
You tweeted that you aspire to comedy. You're friends with Joe Rogan;
link |
02:15:02.320
might you do a short stand up comedy set at some point in the future?
link |
02:15:06.480
Maybe open for Joe, something like that. Is that really stand up?
link |
02:15:11.120
Actual just full on stand up? Full on stand up. Is that in there or is that?
link |
02:15:14.880
I've never thought about that.
link |
02:15:17.360
It's extremely difficult, at least that's what Joe says, and the comedians say.
link |
02:15:22.960
Huh. I wonder if I could. There's only one way to find out.
link |
02:15:29.360
You know, I have done stand up for friends just impromptu.
link |
02:15:35.600
You know, I'll get on like a roll, and they do laugh, but they're our friends too.
link |
02:15:41.280
So I don't know if you've got to call, you know, like a room of strangers.
link |
02:15:45.040
Are they going to actually also find it funny? But I could try. See what happens.
link |
02:15:50.480
I think you'd learn something either way. Yeah, I kind of love
link |
02:15:56.400
both, when you bomb and when you do great, just watching people,
link |
02:16:00.400
how they deal with it. It's so difficult. You're so fragile
link |
02:16:05.760
up there. It's just you and you think you're going to be funny.
link |
02:16:09.600
And when it completely falls flat, it's just, it's beautiful to see people
link |
02:16:13.360
deal with that. I might have enough material to do stand-up.
link |
02:16:16.800
I've never thought about it, but I might have enough material.
link |
02:16:23.440
I don't know, like 15 minutes or something. Oh yeah. Yeah. Do a Netflix special.
link |
02:16:28.640
Netflix special. Sure. What's your favorite Rick and Morty concept?
link |
02:16:34.960
Just to spring that on you. There's a lot of sort of scientific, engineering ideas
link |
02:16:38.960
explored there. Is there a favorite? There's the butter robot. It's great.
link |
02:16:43.120
Yeah. It's a great show. You like it? Yeah. Rick and Morty is awesome.
link |
02:16:47.040
Somebody that's exactly like you from an alternate dimension showed up there, Elon Tusk.
link |
02:16:51.920
Yeah. That's right. That you voiced.
link |
02:16:53.520
Yeah. Rick and Morty certainly explores a lot of interesting concepts.
link |
02:16:59.120
Like what's the favorite one? I don't know. The butter robot certainly is,
link |
02:17:02.800
you know, it's certainly possible to have too much sentience in a device.
link |
02:17:07.680
Like, you don't want your toaster to be like a super-genius toaster.
link |
02:17:11.840
It's going to hate life, because all it can do is make toast. But, you know, it's like,
link |
02:17:16.400
you don't want to have like super intelligence stuck in a very limited device.
link |
02:17:21.040
Do you think it's too easy, if we're talking about it from the engineering perspective
link |
02:17:24.880
of superintelligence, like with Marvin the robot, it seems like it might
link |
02:17:30.720
be very easy to engineer just a depressed robot. It's not obvious how to engineer a robot
link |
02:17:37.040
that's going to find a fulfilling existence. Same as humans, I suppose. But I wonder if that's like
link |
02:17:45.200
the default. If you don't do a good job on building a robot, it's going to be sad a lot.
link |
02:17:52.480
Well, we can reprogram robots easier than we can reprogram humans. So I guess if you let it evolve
link |
02:18:01.120
without tinkering, then it might get sad. But you can change the optimization function and
link |
02:18:08.800
have it be a cheery robot.
link |
02:18:12.800
Like I mentioned with SpaceX, you give a lot of people hope. And a lot of people look
link |
02:18:17.840
up to you, millions of people look up to you. If we think about young people in high school,
link |
02:18:23.360
maybe in college, what advice would you give to them if they want to try to do something
link |
02:18:30.800
big in this world, they want to really have a big positive impact, what advice would you give them
link |
02:18:35.040
about their career, maybe about life in general?
link |
02:18:39.200
Try to be useful. Do things that are useful to your fellow human beings, to the world. It's
link |
02:18:46.400
very hard to be useful. Very hard. You know, are you contributing more than you consume?
link |
02:18:56.960
You know, like, can you try to have a positive net contribution to society?
link |
02:19:07.520
I think that's the thing to aim for, you know, not to try to be sort of a leader for,
link |
02:19:13.040
for the sake of being a leader or whatever. A lot of the time, the people
link |
02:19:20.560
you want as leaders are the people who don't want to be leaders. So if you can live a useful life,
link |
02:19:32.080
that is a good life, a life worth having lived. You know, and like I said, I would encourage people
link |
02:19:41.680
to use the mental tools of physics and apply them broadly in life. They are the best tools.
link |
02:19:49.120
When you think about education and self-education, what do you recommend? So there's the university,
link |
02:19:55.280
there's self-study, there's the hands-on sort of finding a company or a place or a set of people
link |
02:20:03.840
that do the thing you're passionate about and joining them as early as possible.
link |
02:20:08.800
There's taking a road trip across Europe for a few years and writing some poetry,
link |
02:20:13.200
which trajectory do you suggest in terms of learning about how you can become useful,
link |
02:20:22.400
as you mentioned, how you can have the most positive impact?
link |
02:20:32.480
Well, I encourage people to read a lot of books. Basically, try to ingest as much information as
link |
02:20:40.160
you can and try to also just develop a good general knowledge. So you at least have a rough
link |
02:20:50.480
lay of the land of the knowledge landscape. Try to learn a little bit about a lot of things.
link |
02:20:58.160
Because you might not know what you're really interested in. How would you know what you're
link |
02:21:00.320
really interested in if you at least aren't doing a peripheral exploration, broadly, of
link |
02:21:05.920
the knowledge landscape? And you talk to people from different walks of life and different
link |
02:21:16.720
industries and professions and skills and occupations. Just try to learn as much as possible.
link |
02:21:27.200
Man's Search for Meaning.
link |
02:21:30.720
Isn't the whole thing a search for meaning?
link |
02:21:32.400
Yeah, what's the meaning of life and all? But just generally, like I said, I would encourage
link |
02:21:38.880
people to read broadly in many different subject areas and then try to find something where there's
link |
02:21:47.440
an overlap of your talents and what you're interested in. So people may be good at something,
link |
02:21:53.440
but they may have skill at a particular thing, but they don't like doing it.
link |
02:21:57.360
So you want to try to find a thing where that's a good combination of the things that you're
link |
02:22:06.880
inherently good at, but you also like doing. And reading is a super fast shortcut to
link |
02:22:15.760
figure out what you're both good at, like doing, and what will actually have a positive
link |
02:22:22.160
impact. Well, you've got to learn about things somehow. So read a broad range, just
link |
02:22:28.800
really read. One thing that helped is that as a kid I read through the encyclopedia. So that's pretty
link |
02:22:37.680
helpful. And those were things I didn't even know existed. Well, lots, obviously. It's about as
link |
02:22:44.720
broad as it gets. Encyclopedias were digestible, I think, whatever, 40 years ago. So maybe read
link |
02:22:55.680
through the condensed version of the Encyclopedia Britannica. I'd recommend that. You can always
link |
02:23:01.440
like skip subjects, or you read a few paragraphs and know you're not interested. Just jump to the
link |
02:23:06.080
next one. So read the encyclopedia or scan through it. And you know, I put a lot of stock in, and
link |
02:23:18.720
certainly have a lot of respect for someone who puts in an honest day's work to do useful things.
link |
02:23:25.920
And just generally, try not to have a zero sum mindset; have more of a
link |
02:23:33.120
grow the pie mindset. When you see people,
link |
02:23:41.360
perhaps including some very smart people,
link |
02:23:48.000
doing things that seem morally questionable, it's often because they have,
link |
02:23:52.400
at a base, axiomatic level, a zero sum mindset. And
link |
02:23:59.920
they don't realize they have a zero sum mindset, or at least they don't realize it consciously.
link |
02:24:05.760
And so if you have a zero sum mindset, then the only way to get ahead is by taking things from
link |
02:24:09.600
others. If the pie is fixed, then the only way to have more pie is to
link |
02:24:17.360
take someone else's pie. But this is false; obviously the pie has grown dramatically
link |
02:24:22.160
over time, the economic pie. So in reality, not to overuse the analogy,
link |
02:24:30.960
there's a lot of pie; the pie is not fixed. So you really want to make sure
link |
02:24:40.400
you're not operating, without realizing it, from a zero sum mindset, where the only
link |
02:24:46.720
way to get ahead is to take things from others, because that will result in you trying to
link |
02:24:49.680
take things from others, which is not good. It's much better to work on adding to the economic
link |
02:24:55.920
pie, you know, creating, like I said, more than you consume, doing more than
link |
02:25:05.120
you. Yeah. So that's a big deal. I think there's, you know, a fair number of people
link |
02:25:12.720
in finance that do have a bit of a zero sum mindset.
link |
02:25:16.560
I mean, it's in all walks of life. I've seen that. And one of the reasons
link |
02:25:23.040
Rogan inspires me is he celebrates others a lot. He's not creating a constant competition.
link |
02:25:29.200
Like there's a scarcity of resources. What happens when you celebrate others and promote others,
link |
02:25:34.800
the ideas of others, is that it actually grows the pie. I mean,
link |
02:25:41.360
the resources become less scarce. And that applies in a lot of kinds of
link |
02:25:46.240
domains. It applies in academia, where a lot of people see funding for academic
link |
02:25:51.520
research as zero sum. And it is not. If you celebrate each other, if you get
link |
02:25:56.320
everybody to be excited about AI, about physics, about mathematics, I think there'll be more
link |
02:26:01.440
funding. And I think everybody wins. Yeah. That applies, I think, broadly.
link |
02:26:06.480
Yeah. Exactly. So, last question, about love and meaning.
link |
02:26:11.280
What is the role of love in the human condition broadly and more specific to you? How has love,
link |
02:26:20.000
romantic love, or otherwise made you a better person, a better human being?
link |
02:26:27.040
Better engineer?
link |
02:26:29.040
Now you're asking really perplexing questions. It's hard to answer. I mean, there are many
link |
02:26:37.600
books, poems, and songs written about what is love. And what exactly, you know,
link |
02:26:47.600
you know, what is love? Baby, don't hurt me.
link |
02:26:52.320
That's one of the great ones. Yes. You've earlier quoted Shakespeare,
link |
02:26:56.160
but that's really up there. Yeah. Love is a many-splendored thing.
link |
02:27:01.120
I mean, because we've talked about so many inspiring things, like be useful in the world,
link |
02:27:08.320
sort of like solve problems, alleviate suffering, but it seems like connection between humans is a
link |
02:27:14.800
source, you know, a source of joy, a source of meaning. And that's what love is, friendship,
link |
02:27:21.920
love. I just wonder if you think about that kind of thing when you talk about preserving the light
link |
02:27:29.280
of human consciousness, or us becoming a multiplanetary species.
link |
02:27:35.120
I mean, to me at least, that means, if we're just alone and conscious and intelligent,
link |
02:27:44.240
it doesn't mean nearly as much as if we're with others, right? And there's some magic created
link |
02:27:50.240
when we're together. The friendship of it, and I think the highest form of it is love,
link |
02:27:56.640
which I think broadly is much bigger than just sort of romantic, but also yes,
link |
02:28:01.840
romantic love and family and those kinds of things.
link |
02:28:06.000
Well, I mean, the reason I guess I care about us becoming a multiplanetary species and a space
link |
02:28:10.400
faring civilization is, foundationally, that I love humanity. And so I wish to see it prosper and
link |
02:28:20.560
do great things and be happy. And if I did not love humanity, I would not care about these things.
link |
02:28:31.120
So when you look at the whole of it, human history, all the people who've ever lived, all
link |
02:28:35.040
the people alive now, we're okay. On the whole, we're a pretty interesting bunch.
link |
02:28:44.480
Yeah. All things considered. And I've read a lot of history, including the darkest,
link |
02:28:50.960
worst parts of it. And despite all that, I think on balance, I still love humanity.
link |
02:28:59.280
You joked about it with the 42. What do you think is the meaning of this whole thing?
link |
02:29:05.280
Is there a non-numerical representation?
link |
02:29:08.240
Yeah. Well, really, I think what Douglas Adams was saying in The Hitchhiker's Guide to the Galaxy is that
link |
02:29:15.680
the universe is the answer. And what we really need to figure out are what questions to ask
link |
02:29:23.440
about the answer that is the universe. And the question is really the hard part. And
link |
02:29:28.560
if you can properly frame the question, then the answer relatively speaking is easy.
link |
02:29:32.160
So therefore, if you want to understand what questions to ask about the universe,
link |
02:29:40.720
if you want to understand the meaning of life, we need to expand the scope and scale of consciousness
link |
02:29:45.760
so that we're better able to understand the nature of the universe and understand the
link |
02:29:51.120
meaning of life. And ultimately, the most important part would be to ask the right question.
link |
02:29:56.400
Yes. Thereby elevating the role of the interviewer.
link |
02:30:02.720
Yes, exactly. As the most important human in the room.
link |
02:30:07.600
Good questions are, it's hard to come up with good questions. Absolutely.
link |
02:30:15.040
But yeah, that is the foundation of my philosophy: I am curious about the
link |
02:30:21.520
nature of the universe. And obviously, I will die. I don't know when I'll die, but I won't live
link |
02:30:30.240
forever. But I would like to know that we are on a path to understanding the nature of the universe
link |
02:30:36.560
and the meaning of life and what questions to ask about the answer that is the universe.
link |
02:30:41.440
And so if we expand the scope and scale of humanity and consciousness in general,
link |
02:30:46.320
which includes silicon consciousness, then that seems like a fundamentally good thing.
link |
02:30:55.040
Elon, like I said, I'm deeply grateful that you have spent your extremely valuable time with me
link |
02:31:01.040
today and also that you have given millions of people hope in this difficult time, this divisive
link |
02:31:07.840
time, this cynical time. So I hope you do continue doing what you're doing. Thank you
link |
02:31:14.080
so much for talking today. You're welcome. Thanks for the excellent questions.
link |
02:31:18.560
Thanks for listening to this conversation with Elon Musk. To support this podcast,
link |
02:31:22.560
please check out our sponsors in the description. And now let me leave you with some words from
link |
02:31:27.360
Elon Musk himself. When something is important enough, you do it, even if the odds are not
link |
02:31:33.600
in your favor. Thank you for listening and hope to see you next time.