
Michael Stevens: Vsauce | Lex Fridman Podcast #58



link |
00:00:00.000
The following is a conversation with Michael Stevens,
link |
00:00:02.680
the creator of Vsauce,
link |
00:00:04.500
one of the most popular educational YouTube channels
link |
00:00:07.000
in the world with over 15 million subscribers
link |
00:00:10.080
and over 1.7 billion views.
link |
00:00:13.040
His videos often ask and answer questions
link |
00:00:16.160
that are both profound and entertaining,
link |
00:00:18.780
spanning topics from physics to psychology.
link |
00:00:21.800
Popular questions include,
link |
00:00:23.640
what if everyone jumped at once?
link |
00:00:25.440
Or what if the sun disappeared?
link |
00:00:27.680
Or why are things creepy?
link |
00:00:29.820
Or what if the earth stopped spinning?
link |
00:00:32.840
As part of his channel,
link |
00:00:34.040
he created three seasons of Mind Field,
link |
00:00:36.320
a series that explored human behavior.
link |
00:00:38.760
His curiosity and passion are contagious
link |
00:00:41.640
and inspiring to millions of people.
link |
00:00:44.080
And so as an educator,
link |
00:00:45.240
his impact and contribution to the world
link |
00:00:47.540
is truly immeasurable.
link |
00:00:49.780
This is the Artificial Intelligence Podcast.
link |
00:00:52.840
If you enjoy it, subscribe on YouTube,
link |
00:00:55.200
give it five stars on Apple Podcasts,
link |
00:00:57.160
support it on Patreon,
link |
00:00:58.600
or simply connect with me on Twitter,
link |
00:01:00.640
at Lex Fridman, spelled F R I D M A N.
link |
00:01:04.900
I recently started doing ads
link |
00:01:06.400
at the end of the introduction.
link |
00:01:08.120
I'll do one or two minutes after introducing the episode
link |
00:01:10.780
and never any ads in the middle
link |
00:01:12.740
that break the flow of the conversation.
link |
00:01:14.760
I hope that works for you
link |
00:01:16.120
and doesn't hurt the listening experience.
link |
00:01:19.000
This show is presented by Cash App,
link |
00:01:21.500
the number one finance app in the App Store.
link |
00:01:24.000
I personally use Cash App to send money to friends,
link |
00:01:26.560
but you can also use it to buy, sell,
link |
00:01:28.280
and deposit Bitcoin in just seconds.
link |
00:01:30.720
Cash App also has a new investing feature.
link |
00:01:33.480
You can buy fractions of a stock, say $1 worth,
link |
00:01:36.320
no matter what the stock price is.
link |
00:01:38.280
Broker services are provided by Cash App Investing,
link |
00:01:40.880
a subsidiary of Square and member SIPC.
link |
00:01:44.200
I'm excited to be working with Cash App
link |
00:01:46.180
to support one of my favorite organizations called FIRST,
link |
00:01:49.120
best known for their FIRST Robotics and LEGO competitions.
link |
00:01:52.440
They educate and inspire hundreds of thousands of students
link |
00:01:56.000
in over 110 countries
link |
00:01:57.720
and have a perfect rating on Charity Navigator,
link |
00:02:00.320
which means the donated money
link |
00:02:01.640
is used to maximum effectiveness.
link |
00:02:04.280
When you get Cash App from the App Store, Google Play,
link |
00:02:07.180
and use code LEXPODCAST, you'll get $10,
link |
00:02:11.040
and Cash App will also donate $10 to First,
link |
00:02:13.760
which again is an organization
link |
00:02:15.480
that I've personally seen inspire girls and boys
link |
00:02:18.360
to dream of engineering a better world.
link |
00:02:21.660
And now here's my conversation with Michael Stevens.
link |
00:02:25.940
One of your deeper interests is psychology,
link |
00:02:30.260
understanding human behavior.
link |
00:02:32.620
You've pointed out how messy studying human behavior is
link |
00:02:35.480
and that it's far from the scientific rigor
link |
00:02:37.420
of something like physics, for example.
link |
00:02:40.880
How do you think we can take psychology
link |
00:02:43.640
from where it's been in the 20th century
link |
00:02:45.680
to something more like what the physicists,
link |
00:02:49.100
theoretical physicists are doing,
link |
00:02:50.460
something precise, something rigorous?
link |
00:02:52.380
Well, we could do it by finding
link |
00:02:57.600
the physical foundations of psychology, right?
link |
00:03:01.200
If all of our emotions and moods and feelings and behaviors
link |
00:03:05.840
are the result of mechanical behaviors of atoms
link |
00:03:11.120
and molecules in our brains,
link |
00:03:12.800
then can we find correlations?
link |
00:03:15.400
Perhaps like chaos makes that really difficult
link |
00:03:17.760
and the uncertainty principle and all these things.
link |
00:03:19.520
That we can't know the position and velocity
link |
00:03:22.760
of every single quantum state in a brain, probably.
link |
00:03:27.140
But I think that if we can get to that point with psychology,
link |
00:03:33.560
then we can start to think about consciousness
link |
00:03:37.380
in a physical and mathematical way.
link |
00:03:40.960
When we ask questions like, well, what is self-reference?
link |
00:03:44.760
How can you think about yourself thinking?
link |
00:03:47.320
What are some mathematical structures
link |
00:03:49.160
that could bring that about?
link |
00:03:52.400
There's ideas of, in terms of consciousness
link |
00:03:55.560
and breaking it down into physics,
link |
00:03:59.260
there's ideas of panpsychism where people believe
link |
00:04:02.400
that whatever consciousness is,
link |
00:04:04.840
is a fundamental part of reality.
link |
00:04:07.480
It's almost like a physics law.
link |
00:04:08.860
Do you think, what's your views on consciousness?
link |
00:04:11.560
Do you think it has this deep part of reality
link |
00:04:15.600
or is it something that's deeply human
link |
00:04:17.840
and constructed by us humans?
link |
00:04:21.640
Starting nice and light and easy.
link |
00:04:25.120
Nothing I ask you today has an actually proven answer.
link |
00:04:28.280
So we're just hypothesizing.
link |
00:04:29.720
So yeah, I mean, I should clarify, this is all speculation
link |
00:04:32.960
and I'm not an expert in any of these topics
link |
00:04:35.320
and I'm not God, but I think that consciousness
link |
00:04:39.840
is probably something that can be fully explained
link |
00:04:44.840
within the laws of physics.
link |
00:04:48.280
I think that our bodies and brains and the universe
link |
00:04:51.960
at the quantum level are so rich and complex.
link |
00:04:56.280
I'd be surprised if we couldn't find a room
link |
00:04:58.560
for consciousness there.
link |
00:05:00.800
And why should we be conscious?
link |
00:05:04.440
Why are we aware of ourselves?
link |
00:05:06.200
That is a very strange and interesting
link |
00:05:10.400
and important question.
link |
00:05:11.960
And I think for the next few thousand years,
link |
00:05:15.640
we're going to have to believe in answers purely on faith.
link |
00:05:20.200
But my guess is that we will find that,
link |
00:05:25.080
within the configuration space
link |
00:05:27.080
of possible arrangements of the universe,
link |
00:05:29.860
there are some that contain memories of others.
link |
00:05:34.240
Literally, Julian Barbour calls them time capsule states
link |
00:05:38.140
where you're like, yeah, not only do I have a scratch
link |
00:05:40.400
on my arm, but also this state of the universe
link |
00:05:43.160
also contains a memory in my head
link |
00:05:45.720
of being scratched by my cat three days ago.
link |
00:05:48.560
And for some reason, those kinds of states of the universe
link |
00:05:52.940
are more plentiful or more likely.
link |
00:05:55.800
When you say those states,
link |
00:05:57.120
the ones that contain memories of their past
link |
00:06:00.000
or ones that contain memories of their past
link |
00:06:02.760
and have degrees of consciousness?
link |
00:06:05.400
Just the first part, because I think the consciousness
link |
00:06:08.640
then emerges from the fact that a state of the universe
link |
00:06:13.160
that contains fragments or memories of other states
link |
00:06:19.660
is one where you're going to feel like there's time.
link |
00:06:22.560
You're going to feel like, yeah,
link |
00:06:24.160
things happened in the past.
link |
00:06:26.400
And I don't know what'll happen in the future
link |
00:06:27.980
because these states don't contain information
link |
00:06:29.600
about the future.
link |
00:06:30.760
For some reason, those kinds of states
link |
00:06:34.000
are either more common, more plentiful,
link |
00:06:38.240
or you could use the anthropic principle and just say,
link |
00:06:40.840
well, they're extremely rare,
link |
00:06:42.480
but until you are in one, or if you are in one,
link |
00:06:45.720
then you can ask questions,
link |
00:06:46.860
like you're asking me on this podcast.
link |
00:06:49.920
Why questions?
link |
00:06:50.760
Yeah, it's like, why are we conscious?
link |
00:06:52.560
Well, because if we weren't,
link |
00:06:53.400
we wouldn't be asking why we were.
link |
00:06:56.040
You've kind of implied that you have a sense,
link |
00:06:59.480
again, hypothesis, theorizing
link |
00:07:02.360
that the universe is deterministic.
link |
00:07:05.560
What's your thoughts about free will?
link |
00:07:08.040
Do you think of the universe as deterministic
link |
00:07:10.320
in a sense that it's unrolling a particular,
link |
00:07:14.000
like there's a,
link |
00:07:14.960
it's operating under a specific set of physical laws.
link |
00:07:17.920
And when you set the initial conditions,
link |
00:07:21.360
it will unroll in the exact same way
link |
00:07:23.500
in our particular line of the universe every time.
link |
00:07:28.440
That is a very useful way to think about the universe.
link |
00:07:31.600
It's done us well.
link |
00:07:32.440
It's brought us to the moon.
link |
00:07:33.740
It's brought us to where we are today, right?
link |
00:07:35.920
I would not say that I believe in determinism
link |
00:07:40.280
in that kind of an absolute form,
link |
00:07:43.200
or actually I just don't care.
link |
00:07:45.560
Maybe it's true,
link |
00:07:46.720
but I'm not gonna live my life like it is.
link |
00:07:49.920
What in your sense,
link |
00:07:50.880
because you've studied kind of how we humans
link |
00:07:54.120
think of the world.
link |
00:07:55.820
what in your view is the difference between our perception,
link |
00:07:59.240
like how we think the world is, and reality?
link |
00:08:02.320
Do you think there's a huge gap there?
link |
00:08:04.120
Like we delude ourselves that the whole thing is an illusion.
link |
00:08:07.240
Just everything about human psychology,
link |
00:08:09.360
the way we see things and how things actually are.
link |
00:08:12.880
All the things you've studied, what's your sense?
link |
00:08:14.760
How big is the gap between reality and perception?
link |
00:08:16.920
Well, again, purely speculative.
link |
00:08:18.920
I think that we will never know the answer.
link |
00:08:20.960
We cannot know the answer.
link |
00:08:22.600
There is no experiment to find an answer to that question.
link |
00:08:26.960
Everything we experience is an event in our brain.
link |
00:08:30.240
When I look at a cat, I'm not even,
link |
00:08:32.880
I can't prove that there's a cat there.
link |
00:08:36.660
All I am experiencing is the perception of a cat
link |
00:08:40.900
inside my own brain.
link |
00:08:43.100
I am only a witness to the events of my mind.
link |
00:08:46.300
I think it is very useful to infer that
link |
00:08:50.060
if I witness the event of cat in my head,
link |
00:08:54.060
it's because I'm looking at a cat that is literally there
link |
00:08:57.180
and it has its own feelings and motivations
link |
00:08:59.900
and should be pet and given food and water and love.
link |
00:09:03.100
I think that's the way you should live your life.
link |
00:09:05.940
But whether or not we live in a simulation,
link |
00:09:09.500
I'm a brain in a vat, I don't know.
link |
00:09:13.040
Do you care?
link |
00:09:14.900
I don't really.
link |
00:09:16.800
Well, I care because it's a fascinating question.
link |
00:09:19.580
And it's a fantastic way to get people excited about
link |
00:09:23.740
all kinds of topics, physics, psychology,
link |
00:09:26.260
consciousness, philosophy.
link |
00:09:28.100
But at the end of the day, what would the difference be?
link |
00:09:31.000
If you...
link |
00:09:31.840
The cat needs to be fed at the end of the day,
link |
00:09:33.780
otherwise it'll be a dead cat.
link |
00:09:35.620
Right, but if it's not even a real cat,
link |
00:09:38.420
then it's just like a video game cat.
link |
00:09:40.320
And right, so what's the difference between killing
link |
00:09:43.040
a digital cat in a video game because of neglect
link |
00:09:46.740
versus a real cat?
link |
00:09:48.320
It seems very different to us psychologically.
link |
00:09:50.540
Like I don't really feel bad about, oh my gosh,
link |
00:09:52.400
I forgot to feed my Tamagotchi, right?
link |
00:09:54.380
But I would feel terrible
link |
00:09:55.780
if I forgot to feed my actual cats.
link |
00:09:58.940
So can you just touch on the topic of simulation?
link |
00:10:03.420
Do you find this thought experiment that we're living
link |
00:10:06.220
in a simulation useful, inspiring or constructive
link |
00:10:10.500
in any kind of way?
link |
00:10:11.580
Do you think it's ridiculous?
link |
00:10:12.980
Do you think it could be true?
link |
00:10:14.980
Or is it just a useful thought experiment?
link |
00:10:17.540
I think it is extremely useful as a thought experiment
link |
00:10:20.920
because it makes sense to everyone,
link |
00:10:24.700
especially as we see virtual reality
link |
00:10:27.360
and computer games getting more and more complex.
link |
00:10:30.580
You're not talking to an audience in like Newton's time
link |
00:10:33.900
where you're like, imagine a clock
link |
00:10:36.660
that has mechanics in it so complex
link |
00:10:38.820
that it can create love.
link |
00:10:40.220
And everyone's like, no.
link |
00:10:42.380
But today you really start to feel, man,
link |
00:10:46.540
at what point is this little robot friend of mine
link |
00:10:48.960
gonna be like someone I don't want to cancel plans with?
link |
00:10:53.960
And so it's a great, the thought experiment
link |
00:10:59.000
of do we live in a simulation?
link |
00:11:00.280
Am I a brain in a vat that is just being given
link |
00:11:03.760
electrical impulses from some nefarious other beings
link |
00:11:08.960
so that I believe that I live on earth
link |
00:11:11.000
and that I have a body and all of this?
link |
00:11:13.040
And the fact that you can't prove it either way
link |
00:11:15.480
is a fantastic way to introduce people
link |
00:11:17.480
to some of the deepest questions.
link |
00:11:20.760
So you mentioned a little buddy
link |
00:11:23.080
that you would want to cancel an appointment with.
link |
00:11:25.700
So that's a lot of our conversations.
link |
00:11:27.800
That's what my research is, is artificial intelligence.
link |
00:11:32.240
And I apologize, but you're such a fun person
link |
00:11:34.520
to ask these big questions with.
link |
00:11:36.880
Well, I hope I can give some answers that are interesting.
link |
00:11:40.200
Well, because you've sharpened your brain's ability
link |
00:11:45.200
to explore some of the questions
link |
00:11:47.940
that many scientists are actually afraid of even touching,
link |
00:11:51.120
which is fascinating.
link |
00:11:52.380
I think you're in that sense ultimately a great scientist
link |
00:11:56.580
through this process of sharpening your brain.
link |
00:11:58.920
Well, I don't know if I am a scientist.
link |
00:12:01.740
I think science is a way of knowing
link |
00:12:04.920
and there are a lot of questions I investigate
link |
00:12:09.440
that are not scientific questions.
link |
00:12:11.960
On, like, Mind Field, we have definitely done
link |
00:12:14.200
scientific experiments and studies that had hypotheses
link |
00:12:17.680
and all of that, but not to be too like precious
link |
00:12:22.240
about what does the word science mean?
link |
00:12:24.200
But I think I would just describe myself as curious
link |
00:12:27.600
and I hope that that curiosity is contagious.
link |
00:12:29.920
So to you, the scientific method
link |
00:12:31.840
is deeply connected to science
link |
00:12:33.720
because your curiosity took you to asking questions.
link |
00:12:38.280
To me, asking a good question, even if you feel,
link |
00:12:43.440
society feels that it's not a question
link |
00:12:45.400
within the reach of science currently.
link |
00:12:47.320
To me, asking the question is the biggest step
link |
00:12:51.320
of the scientific process.
link |
00:12:53.320
The scientific method is the second part
link |
00:12:57.240
and that may be what traditionally is called science,
link |
00:12:59.400
but to me, asking the questions,
link |
00:13:00.840
being brave enough to ask the questions,
link |
00:13:03.000
being curious and not constrained
link |
00:13:05.000
by what you're supposed to think is just true,
link |
00:13:09.880
is what it means to be a scientist to me.
link |
00:13:11.560
It's certainly a huge part of what it means to be a human.
link |
00:13:16.380
If I were to say, you know what?
link |
00:13:17.520
I don't believe in forces.
link |
00:13:19.080
I think that when I push on a massive object,
link |
00:13:22.120
a ghost leaves my body and enters the object I'm pushing
link |
00:13:25.680
and these ghosts happen to just get really lazy
link |
00:13:28.040
when they're around massive things
link |
00:13:29.760
and that's why F equals MA.
link |
00:13:32.640
Oh, and by the way, the laziness of the ghost
link |
00:13:34.360
is in proportion to the mass of the object.
link |
00:13:36.320
So boom, prove me wrong.
link |
00:13:37.760
Every experiment, well, you can never find the ghost.
link |
00:13:41.060
And so none of that theory is scientific,
link |
00:13:45.880
but once I start saying, can I see the ghost?
link |
00:13:49.380
Why should there be a ghost?
link |
00:13:50.880
And if there aren't ghosts, what might I expect?
link |
00:13:53.100
And I start to do different tests to see,
link |
00:13:56.540
is this falsifiable?
link |
00:13:59.120
Are there things that should happen if there are ghosts
link |
00:14:01.520
or are there things that shouldn't happen?
link |
00:14:02.740
And do they, you know, what do I observe?
link |
00:14:05.120
Now I'm thinking scientifically.
link |
00:14:06.920
I don't think of science as, wow, a picture of a black hole.
link |
00:14:10.960
That's just a photograph.
link |
00:14:12.240
That's an image.
link |
00:14:13.060
That's data.
link |
00:14:13.900
That's a sensory and perception experience.
link |
00:14:16.480
Science is how we got that and how we understand it
link |
00:14:19.640
and how we believe in it
link |
00:14:20.680
and how we reduce our uncertainty around what it means.
link |
00:14:24.040
But I would say I'm deeply within the scientific community
link |
00:14:28.260
and I'm sometimes disheartened by the elitism
link |
00:14:31.600
of the thinking, sort of not allowing yourself
link |
00:14:34.640
to think outside the box.
link |
00:14:36.280
So allowing the possibility
link |
00:14:37.880
of going against the conventions of science,
link |
00:14:40.040
I think is a beautiful part of some
link |
00:14:43.920
of the greatest scientists in history.
link |
00:14:46.280
I don't know, I'm impressed by scientists every day
link |
00:14:49.840
and revolutions in our knowledge of the world occur
link |
00:14:58.480
only under very special circumstances.
link |
00:15:00.760
It is very scary to challenge conventional thinking
link |
00:15:05.680
and risky because let's go back to elitism and ego, right?
link |
00:15:10.440
If you just say, you know what?
link |
00:15:11.440
I believe in the spirits of my body
link |
00:15:14.000
and all forces are actually created by invisible creatures
link |
00:15:17.900
that transfer themselves between objects.
link |
00:15:22.360
If you ridicule every other theory
link |
00:15:26.680
and say that you're correct,
link |
00:15:28.800
then ego gets involved and you just don't go anywhere.
link |
00:15:31.680
But fundamentally the question of well, what is a force
link |
00:15:36.800
is incredibly important.
link |
00:15:38.560
We need to have that conversation,
link |
00:15:40.040
but it needs to be done in this very political way
link |
00:15:42.640
of like, let's be respectful of everyone
link |
00:15:44.640
and let's realize that we're all learning together
link |
00:15:46.920
and not shutting out other people.
link |
00:15:49.000
And so when you look at a lot of revolutionary ideas,
link |
00:15:54.720
they were not accepted right away.
link |
00:15:57.460
And, you know, Galileo had a couple of problems
link |
00:16:00.560
with the authorities and later thinkers, Descartes,
link |
00:16:04.800
was like, all right, look, I kind of agree with Galileo,
link |
00:16:06.980
but I'm gonna have to not say that.
link |
00:16:11.380
I'll have to create and invent and write different things
link |
00:16:13.740
that keep me from being in trouble,
link |
00:16:15.100
but we still slowly made progress.
link |
00:16:17.320
Revolutions are difficult in all forms
link |
00:16:19.240
and certainly in science.
link |
00:16:20.480
Before we get to AI, on topic of revolutionary ideas,
link |
00:16:23.840
let me ask on a Reddit AMA, you said that is the earth flat
link |
00:16:28.840
is one of the favorite questions you've ever answered,
link |
00:16:31.600
speaking of revolutionary ideas.
link |
00:16:33.640
So your video on that, people should definitely watch,
link |
00:16:37.080
is really fascinating.
link |
00:16:39.760
Can you elaborate why you enjoyed
link |
00:16:41.660
answering this question so much?
link |
00:16:43.800
Yeah, well, it's a long story.
link |
00:16:45.620
I remember a long time ago,
link |
00:16:49.380
I was living in New York at the time,
link |
00:16:50.940
so it had to have been like 2009 or something.
link |
00:16:54.000
I visited the Flat Earth forums
link |
00:16:57.600
and this was before the Flat Earth theories
link |
00:17:00.280
became as sort of mainstream as they are.
link |
00:17:03.160
Sorry to ask the dumb question, forums, online forums?
link |
00:17:06.720
Yeah, the Flat Earth Society,
link |
00:17:09.040
I don't know if it's .com or .org, but I went there
link |
00:17:11.200
and I was reading their ideas
link |
00:17:14.200
and how they responded to typical criticisms of,
link |
00:17:17.880
well, the earth isn't flat because what about this?
link |
00:17:20.280
And I could not tell, and I mentioned this in my video,
link |
00:17:23.760
I couldn't tell how many of these community members
link |
00:17:28.640
actually believed the earth was flat or were just trolling.
link |
00:17:32.400
And I realized that the fascinating thing is,
link |
00:17:36.320
how do we know anything?
link |
00:17:38.480
And what makes for a good belief
link |
00:17:41.640
versus a maybe not so tenable or good belief?
link |
00:17:45.240
And so that's really what my video
link |
00:17:47.560
about earth being flat is about.
link |
00:17:49.920
It's about, look, there are a lot of reasons
link |
00:17:52.320
that the earth is probably not flat,
link |
00:17:57.000
but a Flat Earth believer can respond
link |
00:18:00.600
to every single one of them, but it's all in an ad hoc way.
link |
00:18:04.120
And all of these, all of their rebuttals
link |
00:18:05.760
aren't necessarily gonna form
link |
00:18:07.040
a cohesive, noncontradictory whole.
link |
00:18:10.760
And I believe that's the episode
link |
00:18:12.520
where I talk about Occam's razor
link |
00:18:14.960
and Newton's flaming laser sword.
link |
00:18:17.400
And then I say, well, you know what, wait a second.
link |
00:18:19.480
We know that space contracts as you move.
link |
00:18:25.040
And so to a particle moving near the speed of light
link |
00:18:27.200
towards earth, earth would be flattened
link |
00:18:29.560
in the direction of that particle's travel.
link |
00:18:32.200
So to them, earth is flat.
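The flattening Michael describes is Lorentz contraction, and it's easy to put rough numbers on it. A minimal sketch; the proton speed chosen here is an arbitrary illustration, not a figure from the episode:

```python
import math

def contracted_diameter(d_rest_km: float, v_frac_c: float) -> float:
    """Lorentz-contract a length along the direction of travel.

    d_rest_km : rest-frame length (e.g. Earth's diameter, ~12,742 km)
    v_frac_c  : speed as a fraction of the speed of light
    """
    gamma = 1.0 / math.sqrt(1.0 - v_frac_c ** 2)
    return d_rest_km / gamma

# In the frame of a particle moving at 99.9999% of c toward Earth,
# Earth's diameter along the travel axis shrinks dramatically:
earth_d = 12_742
print(contracted_diameter(earth_d, 0.999999))  # ~18 km thick "pancake"
```

At that speed the Lorentz factor is about 707, so the planet really is flat-ish in that frame, in the one direction of travel only.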
link |
00:18:35.520
Like we need to be really generous to even wild ideas
link |
00:18:41.000
because they're all thinking,
link |
00:18:43.800
they're all the communication of ideas.
link |
00:18:45.840
And what else can it mean to be a human?
link |
00:18:48.240
Yeah, and I think I'm a huge fan
link |
00:18:50.960
of the Flat Earth theory, quote unquote,
link |
00:18:54.880
in the sense that to me it feels harmless
link |
00:18:57.520
to explore some of the questions
link |
00:18:59.000
of what it means to believe something,
link |
00:19:00.480
what it means to explore the edge of science and so on.
link |
00:19:05.000
Because, to me,
link |
00:19:07.160
nobody gets hurt whether the earth is flat or round,
link |
00:19:09.680
not literally, but I mean intellectually
link |
00:19:11.840
when we're just having a conversation.
link |
00:19:13.360
That said, again, to elitism,
link |
00:19:15.760
I find that scientists roll their eyes
link |
00:19:18.640
way too fast on the Flat Earth.
link |
00:19:21.400
The kind of dismissal that I see to this even notion,
link |
00:19:25.520
they haven't, like, sat down and said,
link |
00:19:27.760
what are the arguments that are being proposed?
link |
00:19:30.120
And this is why these arguments are incorrect.
link |
00:19:32.600
So that should be something
link |
00:19:35.040
that scientists should always do,
link |
00:19:37.480
even to ideas that seem the most ridiculous.
link |
00:19:42.160
So I like this, it's almost my test
link |
00:19:45.880
when I ask people what they think about Flat Earth theory,
link |
00:19:48.440
to see how quickly they roll their eyes.
link |
00:19:51.080
Well, yeah, I mean, let me go on record
link |
00:19:53.880
and say that the earth is not flat.
link |
00:19:58.400
It is a three-dimensional spheroid.
link |
00:20:02.040
However, I don't know that and it has not been proven.
link |
00:20:07.000
Science doesn't prove anything.
link |
00:20:08.800
It just reduces uncertainty.
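This "reduces uncertainty" framing is essentially Bayesian updating. A toy sketch with made-up probabilities, purely to illustrate the idea that evidence pushes a belief toward certainty without ever reaching it:

```python
def bayes_update(prior: float, p_obs_if_true: float, p_obs_if_false: float) -> float:
    """Posterior probability of a hypothesis after one observation (Bayes' rule)."""
    evidence = prior * p_obs_if_true + (1 - prior) * p_obs_if_false
    return prior * p_obs_if_true / evidence

# Hypothesis: "the earth is round". All numbers are invented for illustration.
p = 0.5                              # start maximally unsure
for _ in range(5):                   # five independent round-looking observations
    # a round-looking photo is near-certain if the earth is round,
    # and unlikely (trickery, simulation, elaborate ruse...) if it isn't
    p = bayes_update(p, p_obs_if_true=0.99, p_obs_if_false=0.05)
print(p)  # climbs very close to 1, but never equals 1: uncertainty reduced, not erased
```

The leftover sliver below 1 is exactly the brain-in-a-vat residue he mentions a few lines later: no finite pile of observations takes the posterior all the way to proof.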
link |
00:20:10.680
Could the earth actually be flat?
link |
00:20:13.960
Extremely unlikely, extremely unlikely.
link |
00:20:19.040
And so it is a ridiculous notion
link |
00:20:21.640
if we care about how probable and certain our ideas might be.
link |
00:20:26.680
But I think it's incredibly important
link |
00:20:28.400
to talk about science in that way
link |
00:20:32.120
and to not resort to, well, it's true.
link |
00:20:35.280
It's true in the same way that a mathematical theorem
link |
00:20:39.880
is true.
link |
00:20:41.280
And I think we're kind of like being pretty pedantic
link |
00:20:46.520
about defining this stuff.
link |
00:20:48.400
But like, sure, I could take a rocket ship out
link |
00:20:51.680
and I could orbit earth and look at it
link |
00:20:53.680
and it would look like a ball, right?
link |
00:20:56.880
But I still can't prove that I'm not living in a simulation,
link |
00:20:59.360
that I'm not a brain in a vat,
link |
00:21:00.520
that this isn't all an elaborate ruse
link |
00:21:02.640
created by some technologically advanced
link |
00:21:04.600
extraterrestrial civilization.
link |
00:21:06.520
So there's always some doubt and that's fine.
link |
00:21:11.000
That's exciting.
link |
00:21:12.280
And I think that kind of doubt, practically speaking,
link |
00:21:14.840
is useful when you start talking about quantum mechanics
link |
00:21:17.720
or string theory, sort of, it helps.
link |
00:21:20.360
To me, that kind of adds a little spice
link |
00:21:23.200
into the thinking process of scientists.
link |
00:21:26.440
So, I mean, just as a thought experiment,
link |
00:21:30.040
your video kind of, okay, say the earth is flat.
link |
00:21:33.400
What would the forces when you walk about this flat earth
link |
00:21:36.640
feel like to the human?
link |
00:21:38.360
That's a really nice thought experiment to think about.
link |
00:21:40.600
Right, because what's really nice about it
link |
00:21:42.440
is that it's a funny thought experiment,
link |
00:21:45.400
but you actually wind up accidentally learning
link |
00:21:48.040
a whole lot about gravity and about relativity
link |
00:21:51.640
and geometry.
link |
00:21:53.240
And I think that's really the goal of what I'm doing.
link |
00:21:56.280
I'm not trying to like convince people
link |
00:21:57.640
that the earth is round.
link |
00:21:58.840
I feel like you either believe that it is or you don't
link |
00:22:01.280
and like, that's, you know, how can I change that?
link |
00:22:04.680
What I can do is change how you think
link |
00:22:06.920
and how you are introduced to important concepts.
link |
00:22:10.920
Like, well, how does gravity operate?
link |
00:22:13.760
Oh, it's all about the center of mass of an object.
link |
00:22:16.480
So right, on a sphere, we're all pulled towards the middle,
link |
00:22:19.480
essentially the centroid geometrically,
link |
00:22:21.440
but on a disc, ooh, you're gonna be pulled at a weird angle
link |
00:22:24.560
if you're out near the edge.
link |
00:22:25.920
And that stuff's fascinating.
link |
00:22:28.400
Yeah, and to me, that was, that particular video
link |
00:22:34.520
opened my eyes even more to what gravity is.
link |
00:22:37.520
It's just a really nice visualization tool of,
link |
00:22:40.040
because you always imagine gravity with spheres,
link |
00:22:43.080
with masses that are spheres.
link |
00:22:44.600
Yeah.
link |
00:22:45.440
And imagining gravity on masses that are not spherical,
link |
00:22:48.280
some other shape, but in here, a plate, a flat object,
link |
00:22:53.400
is really interesting.
link |
00:22:54.240
It makes you really kind of visualize
link |
00:22:56.280
in a three dimensional way the force of gravity.
link |
00:22:57.800
Yeah, even if a disc the size of Earth would be impossible,
link |
00:23:05.880
I think anything larger than like the moon basically
link |
00:23:09.040
needs to be a sphere because gravity will round it out.
link |
00:23:15.240
So you can't have a teacup the size of Jupiter, right?
link |
00:23:18.120
There's a great book about the teacup in the universe
link |
00:23:21.040
that I highly recommend.
link |
00:23:22.920
I don't remember the author.
link |
00:23:24.720
I forget her name, but it's a wonderful book.
link |
00:23:26.800
So look it up.
link |
00:23:28.200
I think it's called Teacup in the Universe.
link |
00:23:30.200
Just to linger on this point briefly,
link |
00:23:32.680
your videos are generally super, people love them, right?
link |
00:23:37.120
If you look at the sort of number of likes versus dislikes
link |
00:23:39.600
this measure of YouTube, right, it's incredible.
link |
00:23:43.140
And as do I.
link |
00:23:45.240
But this particular flat Earth video
link |
00:23:48.160
has more dislikes than usual.
link |
00:23:51.800
What do you, on that topic in general,
link |
00:23:55.320
what's your sense, how big is the community,
link |
00:23:58.800
not just who believes in flat Earth,
link |
00:24:00.740
but sort of the anti-scientific community
link |
00:24:03.720
that naturally distrust scientists in a way
link |
00:24:08.200
that's not an open minded way,
link |
00:24:12.200
like really just distrust scientists
link |
00:24:13.720
like they're bought by some kind of mechanism
link |
00:24:17.040
of some kind of bigger system
link |
00:24:18.880
that's trying to manipulate human beings.
link |
00:24:21.080
What's your sense of the size of that community?
link |
00:24:24.040
You're one of the sort of great educators in the world
link |
00:24:28.960
that educates people on the exciting power of science.
link |
00:24:34.000
So you're kind of up against this community.
link |
00:24:38.080
What's your sense of it?
link |
00:24:39.960
I really have no idea.
link |
00:24:41.960
I haven't looked at the likes and dislikes
link |
00:24:44.200
on the flat Earth video.
link |
00:24:45.320
And so I would wonder if it has a greater percentage
link |
00:24:49.100
of dislikes than usual,
link |
00:24:51.340
is that because of people disliking it
link |
00:24:53.520
because they think that it's a video
link |
00:24:56.720
about Earth being flat and they find that ridiculous
link |
00:25:01.260
and they dislike it without even really watching much?
link |
00:25:04.200
Do they wish that I was more like dismissive
link |
00:25:07.360
of flat Earth theories?
link |
00:25:08.560
Yeah.
link |
00:25:09.760
That's possible too.
link |
00:25:10.600
I know there are a lot of response videos
link |
00:25:12.080
that kind of go through the episode and are pro flat Earth,
link |
00:25:18.600
but I don't know if there's a larger community
link |
00:25:21.860
of unorthodox thinkers today
link |
00:25:25.120
than there have been in the past.
link |
00:25:27.440
And I just wanna not lose them.
link |
00:25:29.960
I want them to keep listening and thinking
link |
00:25:32.580
and by calling them all idiots or something,
link |
00:25:36.640
that does no good because how idiotic are they really?
link |
00:25:41.020
I mean, the Earth isn't a sphere at all.
link |
00:25:45.300
We know that it's an oblate spheroid
link |
00:25:47.720
and that in and of itself is really interesting.
link |
00:25:50.340
And I investigated that in which way is down
link |
00:25:52.240
where I'm like, really down does not point
link |
00:25:54.200
towards the center of the Earth.
link |
00:25:56.240
It points in different directions,
link |
00:25:58.920
depending on what's underneath you and what's above you
link |
00:26:01.640
and what's around you.
link |
00:26:02.480
The whole universe is tugging on me.
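That line ("the whole universe is tugging on me") is just vector addition of Newtonian gravity: "down" is the direction of the net sum of every mass's pull. A minimal sketch with invented masses and positions, just to show the sideways deflection:

```python
import math

# Newtonian gravitational acceleration at a point is the vector sum of
# G*m/r^2 contributions from every mass, so "down" shifts with whatever
# masses happen to be around you. 2D for simplicity; numbers illustrative.
G = 6.674e-11  # m^3 kg^-1 s^-2

def net_gravity(point, masses):
    """Sum gravitational acceleration vectors at `point` from (x, y, mass) tuples."""
    ax = ay = 0.0
    for (mx, my, m) in masses:
        dx, dy = mx - point[0], my - point[1]
        r = math.hypot(dx, dy)
        a = G * m / r**2
        ax += a * dx / r
        ay += a * dy / r
    return ax, ay

masses = [(0.0, -6.4e6, 5.97e24),  # roughly Earth as a point mass below
          (5.0e4, 0.0, 1.0e15)]    # a mountain-scale mass 50 km to the east
ax, ay = net_gravity((0.0, 0.0), masses)
# The eastward component is tiny but nonzero: "down" is slightly deflected.
print(ax, ay)
```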
link |
00:26:06.000
And then you also show that gravity is non uniform
link |
00:26:10.080
across the globe.
link |
00:26:11.680
Like if you, there's this I guess thought experiment
link |
00:26:14.400
if you build a bridge all the way across the Earth
link |
00:26:19.400
and then just knock out its pillars, what would happen?
link |
00:26:23.440
And you describe how it would be like a very chaotic,
link |
00:26:27.080
unstable thing that's happening
link |
00:26:28.960
because gravity is non uniform throughout the Earth.
link |
00:26:31.760
Yeah, in small spaces, like the ones we work in,
link |
00:26:36.620
we can essentially assume that gravity is uniform,
link |
00:26:39.320
but it's not.
link |
00:26:40.880
It is weaker the further you are from the Earth.
link |
00:26:43.040
And it also is going to be,
link |
00:26:47.080
it's radially pointed towards the middle of the Earth.
link |
00:26:50.060
So a really large object will feel tidal forces
link |
00:26:54.160
because of that non uniformness.
link |
00:26:55.640
And we can take advantage of that with satellites, right?
link |
00:26:58.500
Gravitational induced torque.
link |
00:27:00.280
It's a great way to align your satellite
link |
00:27:01.680
without having to use fuel or any kind of engine.
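The tidal effect behind gravity-gradient stabilization falls straight out of the inverse-square law: the end of a long object closer to Earth feels a stronger pull than the far end. A rough sketch, with illustrative numbers rather than any real satellite:

```python
# Gravity weakens with altitude as 1/r^2, so the two ends of a long
# vertical object in orbit feel slightly different accelerations.
# That differential (tidal) pull is what gravity-gradient stabilization
# uses to keep a satellite aligned without fuel.
G = 6.674e-11      # m^3 kg^-1 s^-2
M_EARTH = 5.97e24  # kg
R_EARTH = 6.371e6  # m

def g_at(altitude_m):
    """Gravitational acceleration at a given altitude above Earth's surface."""
    r = R_EARTH + altitude_m
    return G * M_EARTH / r**2

# A 100 m boom with its lower end at 400 km altitude (ISS-like numbers).
g_near = g_at(400e3)
g_far = g_at(400e3 + 100.0)
print(g_near - g_far)  # small but nonzero: the restoring tidal difference
```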
link |
00:27:05.560
So let's jump back to it, artificial intelligence.
link |
00:27:08.320
What's your thought of the state of where we are at
link |
00:27:11.180
currently with artificial intelligence
link |
00:27:12.920
and what do you think it takes to build human level
link |
00:27:15.880
or superhuman level intelligence?
link |
00:27:17.960
I don't know what intelligence means.
link |
00:27:20.400
That's my biggest question at the moment.
link |
00:27:22.840
And I think it's because my instinct is always to go,
link |
00:27:25.080
well, what are the foundations here of our discussion?
link |
00:27:28.000
What does it mean to be intelligent?
link |
00:27:31.080
How do we measure the intelligence of an artificial machine
link |
00:27:35.320
or a program or something?
link |
00:27:37.480
Can we say that humans are intelligent?
link |
00:27:39.880
Because there's also a fascinating field
link |
00:27:42.600
of how do you measure human intelligence.
link |
00:27:44.280
Of course.
link |
00:27:45.360
But if we just take that for granted,
link |
00:27:47.080
saying that whatever this fuzzy intelligence thing
link |
00:27:50.000
we're talking about, humans kind of have it.
link |
00:27:53.080
What would be a good test for you?
link |
00:27:56.680
So Turing developed a test that's natural language
link |
00:27:59.160
conversation, would that impress you?
link |
00:28:01.600
A chat bot that you'd want to hang out
link |
00:28:03.360
and have a beer with for a bunch of hours
link |
00:28:06.600
or have dinner plans with.
link |
00:28:08.360
Is that a good test, natural language conversation?
link |
00:28:10.240
Is there something else that would impress you?
link |
00:28:12.260
Or is that also too difficult to think about?
link |
00:28:13.640
Oh yeah, I'm pretty much impressed by everything.
link |
00:28:16.120
I think that if there was a chat bot
link |
00:28:20.320
that was like incredibly, I don't know,
link |
00:28:23.360
really had a personality.
link |
00:28:24.460
And if it beat the Turing test, right?
link |
00:28:27.940
Like if I'm unable to tell that it's not another person
link |
00:28:33.360
but then I was shown a bunch of wires
link |
00:28:36.560
and mechanical components.
link |
00:28:39.060
And it was like, that's actually what you're talking to.
link |
00:28:42.680
I don't know if I would feel that guilty destroying it.
link |
00:28:46.440
I would feel guilty because clearly it's well made
link |
00:28:49.240
and it's a really cool thing.
link |
00:28:51.040
It's like destroying a really cool car or something
link |
00:28:53.800
but I would not feel like I was a murderer.
link |
00:28:56.280
So yeah, at what point would I start to feel that way?
link |
00:28:58.800
And this is such a subjective psychological question.
link |
00:29:02.680
If you give it movement or if you have it act as though
link |
00:29:07.680
or perhaps really feel pain as I destroy it
link |
00:29:11.840
and scream and resist, then I'd feel bad.
link |
00:29:15.640
Yeah, it's beautifully put.
link |
00:29:16.680
And let's just say act like it's in pain.
link |
00:29:20.480
So if you just have a robot that doesn't scream,
link |
00:29:25.640
just like moans in pain if you kick it,
link |
00:29:28.440
that immediately just puts it in a class
link |
00:29:30.920
that we humans, it becomes, we anthropomorphize it.
link |
00:29:35.280
It almost immediately becomes human.
link |
00:29:37.920
So that's a psychology question
link |
00:29:39.320
as opposed to sort of a physics question.
link |
00:29:40.920
Right, I think that's a really good instinct to have.
link |
00:29:43.080
If the robot.
link |
00:29:45.720
Screams.
link |
00:29:46.560
Screams and moans, even if you don't believe
link |
00:29:50.160
that it has the mental experience,
link |
00:29:52.640
the qualia of pain and suffering,
link |
00:29:55.280
I think it's still a good instinct to say,
link |
00:29:56.760
you know what, I'd rather not hurt it.
link |
00:29:59.960
The problem is that instinct can get us in trouble
link |
00:30:02.560
because then robots can manipulate that.
link |
00:30:05.560
And there's different kinds of robots.
link |
00:30:08.240
There's robots like the Facebook and the YouTube algorithm
link |
00:30:10.640
that recommends the video,
link |
00:30:11.920
and they can manipulate in the same kind of way.
link |
00:30:14.440
Well, let me ask you just to stick
link |
00:30:16.240
on artificial intelligence for a second.
link |
00:30:17.840
Do you have worries about existential threats from AI
link |
00:30:21.720
or existential threats from other technologies
link |
00:30:23.680
like nuclear weapons that could potentially destroy life
link |
00:30:27.720
on earth or damage it to a very significant degree?
link |
00:30:31.200
Yeah, of course I do.
link |
00:30:32.280
Especially the weapons that we create.
link |
00:30:35.280
There's all kinds of famous ways to think about this.
link |
00:30:38.080
And one is that, wow, what if we don't see
link |
00:30:41.440
advanced alien civilizations because of the danger
link |
00:30:46.800
of technology?
link |
00:30:50.600
What if we reach a point,
link |
00:30:51.800
and I think there's a channel, Thoughty2,
link |
00:30:55.000
geez, I wish I remembered the name of the channel,
link |
00:30:57.400
but he delves into this kind of limit
link |
00:30:59.960
of maybe once you discover radioactivity and its power,
link |
00:31:04.960
you've reached this important hurdle.
link |
00:31:07.160
And the reason that the skies are so empty
link |
00:31:09.400
is that no one's ever managed to survive as a civilization
link |
00:31:13.880
once they have that destructive power.
link |
00:31:16.840
And when it comes to AI, I'm not really very worried
link |
00:31:22.320
because I think that there are plenty of other people
link |
00:31:24.760
that are already worried enough.
link |
00:31:26.440
And oftentimes these worries are just,
link |
00:31:30.520
they just get in the way of progress.
link |
00:31:32.880
And they're questions that we should address later.
link |
00:31:37.680
And I think I talk about this in my interview
link |
00:31:41.960
with the self driving autonomous vehicle guy,
link |
00:31:47.440
as I think it was a bonus scene
link |
00:31:48.640
from the trolley problem episode.
link |
00:31:52.080
And I'm like, wow, what should a car do
link |
00:31:54.200
if this really weird contrived scenario happens
link |
00:31:56.960
where it has to swerve and save the driver, but kill a kid?
link |
00:32:00.120
And he's like, well, what would a human do?
link |
00:32:03.440
And if we resist technological progress
link |
00:32:07.200
because we're worried about all of these little issues,
link |
00:32:10.360
then it gets in the way.
link |
00:32:11.960
And we shouldn't avoid those problems,
link |
00:32:14.320
but we shouldn't allow them to be stumbling blocks
link |
00:32:16.760
to advancement.
link |
00:32:18.920
So the folks like Sam Harris or Elon Musk
link |
00:32:22.440
are saying that we're not worried enough.
link |
00:32:24.520
So the worry should not paralyze technological progress,
link |
00:32:28.520
but we're sort of marching,
link |
00:32:30.760
technology is marching forward without the key scientists,
link |
00:32:35.800
the developers of the technology,
link |
00:32:37.880
worrying about it overnight having some effects
link |
00:32:42.080
that would be very detrimental to society.
link |
00:32:45.200
So to push back on your thought of the idea
link |
00:32:49.520
that there's enough people worrying about it,
link |
00:32:51.240
Elon Musk says, there's not enough people
link |
00:32:53.040
worrying about it.
link |
00:32:54.640
That's the kind of balance is,
link |
00:32:58.960
it's like folks who are really focused
link |
00:33:01.240
on nuclear deterrence are saying
link |
00:33:03.680
there's not enough people worried
link |
00:33:04.840
about nuclear deterrence, right?
link |
00:33:06.080
So it's an interesting question of what is a good threshold
link |
00:33:10.220
of people to worry about these?
link |
00:33:12.600
And if it's too many people that are worried, you're right.
link |
00:33:15.160
It'll be like the press would over report on it
link |
00:33:18.720
and it'll halt technological progress.
link |
00:33:21.840
If not enough, then we can march straight ahead
link |
00:33:24.440
into that abyss that human beings might be destined for
link |
00:33:29.840
with the progress of technology.
link |
00:33:31.320
Yeah, I don't know what the right balance is
link |
00:33:33.680
of how many people should be worried
link |
00:33:35.720
and how worried should they be,
link |
00:33:36.960
but we're always worried about new technology.
link |
00:33:40.440
We know that Plato was worried about the written word.
link |
00:33:42.960
He was like, we shouldn't teach people to write
link |
00:33:45.000
because then they won't use their minds to remember things.
link |
00:33:48.360
There have been concerns over technology
link |
00:33:51.240
and its advancement since the beginning of recorded history.
link |
00:33:55.120
And so, I think, however,
link |
00:33:58.880
these conversations are really important to have
link |
00:34:01.060
because again, we learn a lot about ourselves.
link |
00:34:03.360
If we're really scared of some kind of AI
link |
00:34:06.200
like coming into being that is conscious or whatever
link |
00:34:09.360
and can self replicate, we already do that every day.
link |
00:34:13.120
It's called humans being born.
link |
00:34:14.560
They're not artificial, they're humans,
link |
00:34:17.200
but they're intelligent and I don't wanna live in a world
link |
00:34:20.140
where we're worried about babies being born
link |
00:34:21.940
because what if they become evil?
link |
00:34:24.200
Right.
link |
00:34:25.040
What if they become mean people?
link |
00:34:25.920
What if they're thieves?
link |
00:34:27.680
Maybe we should just like, what, not have babies born?
link |
00:34:31.760
Like maybe we shouldn't create AI.
link |
00:34:33.980
It's like, we will want to have safeguards in place
link |
00:34:39.720
in the same way that we know, look,
link |
00:34:41.740
a kid could be born that becomes some kind of evil person,
link |
00:34:44.400
but we have laws, right?
link |
00:34:47.880
And it's possible that with advanced genetics in general,
link |
00:34:51.600
be able to, it's a scary thought to say that,
link |
00:34:58.400
this, my child, if born would have an 83% chance
link |
00:35:05.200
of being a psychopath, right?
link |
00:35:08.720
Like being able to, if it's something genetic,
link |
00:35:11.500
if there's some sort of, and what to use that information,
link |
00:35:15.040
what to do with that information
link |
00:35:16.200
is a difficult ethical thought.
link |
00:35:20.000
Yeah, and I'd like to find an answer that isn't,
link |
00:35:22.040
well, let's not have them live.
link |
00:35:24.940
You know, I'd like to find an answer that is,
link |
00:35:26.860
well, all human life is worthy.
link |
00:35:30.340
And if you have an 83% chance of becoming a psychopath,
link |
00:35:33.640
well, you still deserve dignity.
link |
00:35:38.140
And you still deserve to be treated well.
link |
00:35:42.400
You still have rights.
link |
00:35:43.360
At least at this part of the world, at least in America,
link |
00:35:45.980
there's a respect for individual life in that way.
link |
00:35:49.540
That's, well, to me, but again, I'm in this bubble,
link |
00:35:54.040
is a beautiful thing.
link |
00:35:55.760
But there's other cultures where individual human life
link |
00:35:59.020
is not that important, where a society,
link |
00:36:02.720
so I was born in the Soviet Union,
link |
00:36:04.760
where the strength of nation and society together
link |
00:36:07.440
is more important than any one particular individual.
link |
00:36:10.280
So it's an interesting also notion,
link |
00:36:12.080
the stories we tell ourselves.
link |
00:36:13.540
I like the one where individuals matter,
link |
00:36:16.000
but it's unclear that that's what the future holds.
link |
00:36:19.200
Well, yeah, and I mean, let me even throw this out.
link |
00:36:21.480
Like, what is artificial intelligence?
link |
00:36:23.840
How can it be artificial?
link |
00:36:25.200
I really think that we get pretty obsessed
link |
00:36:28.000
and stuck on the idea that there is some thing
link |
00:36:30.740
that is a wild human, a pure human organism
link |
00:36:34.320
without technology.
link |
00:36:35.780
But I don't think that's a real thing.
link |
00:36:37.540
I think that humans and human technology are one organism.
link |
00:36:42.780
Look at my glasses, okay?
link |
00:36:44.500
If an alien came down and saw me,
link |
00:36:47.860
would they necessarily know that this is an invention,
link |
00:36:50.620
that I don't grow these organically from my body?
link |
00:36:53.220
They wouldn't know that right away.
link |
00:36:55.460
And the written word, and spoons, and cups,
link |
00:37:00.460
these are all pieces of technology.
link |
00:37:02.460
We are not alone as an organism.
link |
00:37:06.700
And so the technology we create,
link |
00:37:09.060
whether it be video games or artificial intelligence
link |
00:37:11.980
that can self replicate and hate us,
link |
00:37:14.020
it's actually all the same organism.
link |
00:37:16.780
When you're in a car, where do you end and the car begin?
link |
00:37:19.220
It seems like a really easy question to answer,
link |
00:37:21.100
but the more you think about it,
link |
00:37:22.660
the more you realize, wow,
link |
00:37:23.900
we are in this symbiotic relationship with our inventions.
link |
00:37:27.940
And there are plenty of people who are worried about it.
link |
00:37:30.020
And there should be,
link |
00:37:30.860
but it's inevitable.
link |
00:37:32.820
And I think that even just us think of ourselves
link |
00:37:35.900
as individual intelligences may be a silly notion
link |
00:37:41.380
because it's much better to think
link |
00:37:44.540
of the entirety of human civilization,
link |
00:37:46.780
all living organisms on earth, as a single living organism,
link |
00:37:50.740
as a single intelligent creature,
link |
00:37:52.180
because you're right, everything's intertwined.
link |
00:37:54.340
Everything is deeply connected.
link |
00:37:57.100
So we mentioned, you know,
link |
00:37:57.940
Musk, so you're a curious lover of science.
link |
00:38:03.140
What do you think of the efforts that Elon Musk is doing
link |
00:38:06.860
with space exploration, with electric vehicles,
link |
00:38:10.100
with autopilot, sort of getting into the space
link |
00:38:13.620
of autonomous vehicles, with boring under LA
link |
00:38:17.300
and a Neuralink trying to communicate brain machine
link |
00:38:21.740
interfaces, communicate between machines
link |
00:38:24.260
and human brains?
link |
00:38:28.540
Well, it's really inspiring.
link |
00:38:30.140
I mean, look at the fandom that he's amassed.
link |
00:38:34.860
It's not common for someone like that
link |
00:38:39.220
to have such a following.
link |
00:38:40.620
And so it's... Engineering nerd.
link |
00:38:42.420
Yeah, so it's really exciting.
link |
00:38:44.580
But I also think that a lot of responsibility
link |
00:38:46.420
comes with that kind of power.
link |
00:38:47.660
So like if I met him, I would love to hear how he feels
link |
00:38:50.860
about the responsibility he has.
link |
00:38:53.420
When there are people who are such a fan of your ideas
link |
00:38:59.980
and your dreams and share them so closely with you,
link |
00:39:04.620
you have a lot of power.
link |
00:39:06.420
And he didn't always have that, you know?
link |
00:39:09.660
He wasn't born as Elon Musk.
link |
00:39:11.980
Well, he was, but well, he was named that later.
link |
00:39:13.980
But the point is that I wanna know the psychology
link |
00:39:18.980
of becoming a figure like him.
link |
00:39:23.860
Well, I don't even know how to phrase the question right,
link |
00:39:25.660
but it's a question about what do you do
link |
00:39:27.980
when you're following, your fans become so large
link |
00:39:35.780
that it's almost bigger than you.
link |
00:39:37.980
And how do you responsibly manage that?
link |
00:39:41.020
And maybe it doesn't worry him at all.
link |
00:39:42.180
And that's fine too.
link |
00:39:43.500
But I'd be really curious.
link |
00:39:45.500
And I think there are a lot of people that go through this
link |
00:39:47.660
when they realize, whoa, there are a lot of eyes on me.
link |
00:39:50.380
There are a lot of people who really take what I say
link |
00:39:53.900
very earnestly and take it to heart and will defend me.
link |
00:39:57.700
And whew, that's, that's, that can be dangerous.
link |
00:40:04.260
And you have to be responsible with it.
link |
00:40:07.500
Both in terms of impact on society
link |
00:40:09.260
and psychologically for the individual,
link |
00:40:11.260
just the burden psychologically on Elon?
link |
00:40:15.020
Yeah, yeah, how does he think about that?
link |
00:40:18.820
Part of his persona.
link |
00:40:21.180
Well, let me throw that right back at you
link |
00:40:23.340
because in some ways you're just a funny guy
link |
00:40:28.540
that's gotten a humongous following,
link |
00:40:31.660
a funny guy with a curiosity.
link |
00:40:34.900
You've got a huge following.
link |
00:40:36.540
How do you psychologically deal with the responsibility?
link |
00:40:40.060
In many ways you have a reach
link |
00:40:41.980
in many ways bigger than Elon Musk.
link |
00:40:44.540
What is your, what is the burden that you feel in educating
link |
00:40:49.340
being one of the biggest educators in the world
link |
00:40:51.980
where everybody's listening to you
link |
00:40:53.500
and actually everybody, like most of the world
link |
00:40:58.380
that uses YouTube for educational material,
link |
00:41:01.060
trust you as a source of good, strong scientific thinking.
link |
00:41:07.540
It's a burden and I try to approach it
link |
00:41:11.020
with a lot of humility and sharing.
link |
00:41:16.020
Like I'm not out there doing
link |
00:41:18.580
a lot of scientific experiments.
link |
00:41:20.300
I am sharing the work of real scientists
link |
00:41:23.180
and I'm celebrating their work and the way that they think
link |
00:41:26.420
and the power of curiosity.
link |
00:41:29.500
But I wanna make it clear at all times that like,
link |
00:41:32.200
look, we don't know all the answers
link |
00:41:35.260
and I don't think we're ever going to reach a point
link |
00:41:37.660
where we're like, wow, and there you go.
link |
00:41:39.500
That's the universe.
link |
00:41:40.620
It's this equation, you plug in some conditions or whatever
link |
00:41:43.820
and you do the math
link |
00:41:44.660
and you know what's gonna happen tomorrow.
link |
00:41:46.100
I don't think we're ever gonna reach that point,
link |
00:41:47.920
but I think that there is a tendency
link |
00:41:51.920
to sometimes believe in science and become elitist
link |
00:41:56.100
and become, I don't know, hard when in reality
link |
00:41:58.860
it should humble you and make you feel smaller.
link |
00:42:01.780
I think there's something very beautiful
link |
00:42:03.080
about feeling very, very small and very weak
link |
00:42:07.420
and to feel that you need other people.
link |
00:42:10.940
So I try to keep that in mind and say,
link |
00:42:13.220
look, thanks for watching.
link |
00:42:14.340
Vsauce is not, I'm not Vsauce, you are.
link |
00:42:16.700
When I start the episodes, I say,
link |
00:42:18.080
hey, Vsauce, Michael here.
link |
00:42:20.500
Vsauce and Michael are actually a different thing
link |
00:42:22.260
in my mind.
link |
00:42:23.100
I don't know if that's always clear,
link |
00:42:24.380
but yeah, I have to approach it that way
link |
00:42:26.900
because it's not about me.
link |
00:42:30.080
Yeah, so it's not even,
link |
00:42:31.860
you're not feeling the responsibility.
link |
00:42:33.620
You're just sort of plugging into this big thing
link |
00:42:36.100
that is scientific exploration of our reality
link |
00:42:40.020
and you're a voice that represents a bunch,
link |
00:42:42.660
but you're just plugging into this big Vsauce ball
link |
00:42:47.660
that others, millions of others are plugged into.
link |
00:42:49.860
Yeah, and I'm just hoping to encourage curiosity
link |
00:42:53.060
and responsible thinking
link |
00:42:56.380
and an embracement of doubt
link |
00:43:01.980
and being okay with that.
link |
00:43:05.020
So I'm next week talking to Cristos Goodrow.
link |
00:43:08.200
I'm not sure if you're familiar who he is,
link |
00:43:09.980
but he's the VP of engineering,
link |
00:43:11.660
head of the quote unquote YouTube algorithm
link |
00:43:14.660
or the search and discovery.
link |
00:43:16.140
So let me ask, first high level,
link |
00:43:20.180
do you have a question for him
link |
00:43:25.100
that if you can get an honest answer that you would ask,
link |
00:43:28.860
but more generally,
link |
00:43:30.140
how do you think about the YouTube algorithm
link |
00:43:32.620
that drives some of the motivation behind,
link |
00:43:36.060
no, some of the design decisions you make
link |
00:43:38.900
as you ask and answer some of the questions you do,
link |
00:43:42.220
how would you improve this algorithm in your mind in general?
link |
00:43:45.140
So just what would you ask him?
link |
00:43:47.540
And outside of that,
link |
00:43:49.500
how would you like to see the algorithm improve?
link |
00:43:52.700
Well, I think of the algorithm as a mirror.
link |
00:43:56.780
It reflects what people put in
link |
00:43:58.940
and we don't always like what we see in that mirror.
link |
00:44:01.140
From the individual mirror
link |
00:44:02.780
to the individual mirror to the society.
link |
00:44:05.420
Both, in the aggregate,
link |
00:44:07.020
it's reflecting back what people on average want to watch.
link |
00:44:11.380
And when you see things being recommended to you,
link |
00:44:15.340
it's reflecting back what it thinks you want to see.
link |
00:44:19.220
And specifically, I would guess that it's
link |
00:44:22.300
not just what you want to see,
link |
00:44:23.560
but what you will click on
link |
00:44:25.560
and what you will watch some of and stay on YouTube
link |
00:44:30.560
because of.
link |
00:44:32.480
I don't think that, this is all me guessing,
link |
00:44:34.860
but I don't think that YouTube cares
link |
00:44:38.980
if you only watch like a second of a video,
link |
00:44:41.700
as long as the next thing you do is open another video.
link |
00:44:45.100
If you close the app or close the site,
link |
00:44:49.380
that's a problem for them
link |
00:44:50.780
because they're not a subscription platform.
link |
00:44:52.940
They're not like, look,
link |
00:44:53.780
you're giving us 20 bucks a month no matter what,
link |
00:44:56.120
so who cares?
link |
00:44:57.620
They need you to watch and spend time there and see ads.
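Nobody outside YouTube knows the actual objective, but the guess above (watch time over raw clicks) can be made concrete with a toy ranker: score each candidate by expected watch time, i.e. click probability times expected minutes watched, instead of click probability alone. Every number and field name here is invented:

```python
# Toy illustration of ranking by click probability vs. expected watch
# time. All candidate videos and their statistics are made up.
candidates = [
    {"title": "shocking clickbait", "p_click": 0.30, "avg_watch_min": 0.5},
    {"title": "deep explainer",     "p_click": 0.05, "avg_watch_min": 20.0},
    {"title": "cat video",          "p_click": 0.20, "avg_watch_min": 2.0},
]

by_click = max(candidates, key=lambda v: v["p_click"])
by_watch = max(candidates, key=lambda v: v["p_click"] * v["avg_watch_min"])

print(by_click["title"])  # prints "shocking clickbait": it wins on clicks...
print(by_watch["title"])  # prints "deep explainer": it wins on expected watch time
```

Under the click metric the clickbait wins; under expected watch time the long explainer does, which is the behavioral difference being discussed.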
link |
00:45:02.100
So one of the things I'm curious about
link |
00:45:03.620
whether they do consider longer term sort of develop,
link |
00:45:09.980
your longer term development as a human being,
link |
00:45:12.600
which I think ultimately will make you feel better
link |
00:45:15.100
about using YouTube in the longterm
link |
00:45:17.460
and allowing you to stick with it for longer.
link |
00:45:19.920
Because even if you feed the dopamine rush in the short term
link |
00:45:23.300
and you keep clicking on cat videos,
link |
00:45:25.580
eventually you sort of wake up like from a drug
link |
00:45:28.980
and say, I need to quit this.
link |
00:45:30.860
So I wonder how much you're trying to optimize
link |
00:45:32.700
for the longterm because when I look at the,
link |
00:45:35.820
your videos aren't exactly sort of, no offense,
link |
00:45:39.420
but they're not the most clickable.
link |
00:45:41.740
They're both the most clickable
link |
00:45:44.180
and I feel I watched the entire thing
link |
00:45:47.260
and I feel like a better human after I watched it, right?
link |
00:45:49.640
So like they're not just optimizing for the clickability
link |
00:45:54.640
because I hope, so my thought is how do you think of it?
link |
00:45:59.880
And does it affect your own content?
link |
00:46:02.240
Like how deep you go,
link |
00:46:03.280
how profound you explore the directions and so on.
link |
00:46:07.000
I've been really lucky in that I don't worry too much
link |
00:46:11.500
about the algorithm.
link |
00:46:12.520
I mean, look at my thumbnails.
link |
00:46:13.800
I don't really go too wild with them.
link |
00:46:17.200
And with Mind Field, where I'm in partnership with YouTube
link |
00:46:19.800
on the thumbnails, I'm often like, let's pull this back.
link |
00:46:22.400
Let's be mysterious.
link |
00:46:23.960
But usually I'm just trying to do
link |
00:46:25.640
what everyone else is not doing.
link |
00:46:27.520
So if everyone's doing crazy Photoshop kind of thumbnails,
link |
00:46:30.960
I'm like, what if the thumbnail's just a line?
link |
00:46:34.120
And what if the title is just a word?
link |
00:46:37.760
And I kind of feel like all of the Vsauce channels
link |
00:46:41.160
have cultivated an audience that expects that.
link |
00:46:43.160
And so they would rather Jake make a video
link |
00:46:45.340
that's just called stains than one called,
link |
00:46:48.440
I explored stains, shocking.
link |
00:46:50.920
But there are other audiences out there that want that.
link |
00:46:53.300
And I think most people kind of want
link |
00:46:57.080
what you see the algorithm favoring,
link |
00:46:58.800
which is mainstream traditional celebrity
link |
00:47:02.240
and news kind of information.
link |
00:47:03.640
I mean, that's what makes YouTube really different
link |
00:47:05.200
than other streaming platforms.
link |
00:47:06.680
No one's like, what's going on in the world?
link |
00:47:08.800
I'll open up Netflix to find out.
link |
00:47:10.520
But you do open up Twitter to find that out.
link |
00:47:12.720
You open up Facebook and you can open up YouTube
link |
00:47:14.840
because you'll see that the trending videos
link |
00:47:16.260
are like what happened amongst the traditional mainstream
link |
00:47:20.240
people in different industries.
link |
00:47:22.120
And that's what's being shown.
link |
00:47:24.040
And it's not necessarily YouTube saying,
link |
00:47:27.560
we want that to be what you see.
link |
00:47:29.240
It's that that's what people click on.
link |
00:47:31.400
When they see Ariana Grande, you know,
link |
00:47:33.320
reads a love letter from like her high school sweetheart,
link |
00:47:36.280
they're like, I wanna see that.
link |
00:47:38.000
And when they see a video from me
link |
00:47:39.320
that's got some lines and math and it's called Laws & Causes,
link |
00:47:41.800
they're like, well, I mean, I'm just on the bus.
link |
00:47:45.940
Like I don't have time to dive into a whole lesson.
link |
00:47:48.640
So, you know, before you get super mad at YouTube,
link |
00:47:52.360
you should say, really,
link |
00:47:53.640
they're just reflecting back human behavior.
link |
00:47:55.520
Is there something you would improve about the algorithm
link |
00:48:00.240
knowing of course, that as far as we're concerned,
link |
00:48:02.760
it's a black box, so we don't know how it works.
link |
00:48:04.600
Right, and I don't think that even anyone at YouTube
link |
00:48:06.760
really knows what it's doing.
link |
00:48:07.920
They know what they've tweaked, but then it learns.
link |
00:48:09.800
I think that it learns and it decides how to behave.
link |
00:48:13.800
And sometimes the YouTube employees are left going,
link |
00:48:16.640
I don't know.
link |
00:48:17.740
Maybe we should like change the value
link |
00:48:19.600
of how much it, you know, worries about watch time.
link |
00:48:22.640
And maybe it should worry more about something else.
link |
00:48:24.800
I don't know.
link |
00:48:25.640
But I mean, I would like to see,
link |
00:48:28.400
I don't know what they're doing and not doing.
link |
00:48:30.680
Well, is there a conversation
link |
00:48:32.740
that you think they should be having just internally,
link |
00:48:35.720
whether they're having it or not?
link |
00:48:37.300
Is there something,
link |
00:48:38.920
should they be thinking about the longterm future?
link |
00:48:41.140
Should they be thinking about educational content
link |
00:48:44.440
and whether that's educating about what just happened
link |
00:48:48.160
in the world today, news or educational content,
link |
00:48:50.680
like what you're providing,
link |
00:48:51.520
which is asking big sort of timeless questions
link |
00:48:54.360
about the way the world works.
link |
00:48:56.580
Well, it's interesting.
link |
00:48:58.120
What should they think about?
link |
00:48:59.400
Because it's called YouTube, not our tube.
link |
00:49:02.620
And that's why I think they have
link |
00:49:04.680
so many phenomenal educational creators.
link |
00:49:08.360
You don't have shows like 3Blue1Brown
link |
00:49:11.720
or Physics Girl or Looking Glass Universe or Up and Atom
link |
00:49:14.960
or Brain Scoop or, I mean, I could go on and on.
link |
00:49:18.720
They aren't on Amazon Prime and Netflix
link |
00:49:21.200
and they don't have commissioned shows from those platforms.
link |
00:49:24.000
It's all organically happening
link |
00:49:25.520
because there are people out there
link |
00:49:26.660
that want to share their passion for learning,
link |
00:49:30.140
that wanna share their curiosity.
link |
00:49:32.520
And YouTube could promote those kinds of shows more,
link |
00:49:37.440
but first of all, they probably wouldn't get as many clicks
link |
00:49:43.240
and YouTube needs to make sure that the average user
link |
00:49:45.340
is always clicking and staying on the site.
link |
00:49:47.720
They could still promote it more for the good of society,
link |
00:49:51.060
but then we're making some really weird claims
link |
00:49:52.720
about what's good for society
link |
00:49:53.960
because I think that cat videos
link |
00:49:55.600
are also an incredibly important part
link |
00:49:58.040
of what it means to be a human.
link |
00:50:00.360
I mentioned this quote before from Unamuno about,
link |
00:50:02.860
look, I've seen a cat like estimate distances
link |
00:50:05.400
and calculate a jump more often than I've seen a cat cry.
link |
00:50:09.440
And so things that play with our emotions
link |
00:50:12.440
and make us feel things can be cheesy and can feel cheap,
link |
00:50:15.360
but like, man, that's very human.
link |
00:50:18.000
And so even the dumbest vlog is still so important
link |
00:50:23.760
that I don't think I have a better claim to take its spot
link |
00:50:27.380
than it has to have that spot.
link |
00:50:29.840
It puts a mirror to us, the beautiful parts,
link |
00:50:32.600
the ugly parts, the shallow parts, the deep parts.
link |
00:50:35.880
You're right.
link |
00:50:36.720
What I would like to see is,
link |
00:50:39.840
I miss the days when engaging with content on YouTube
link |
00:50:43.360
helped push it into my subscribers' timelines.
link |
00:50:47.620
It used to be that when I liked a video,
link |
00:50:49.560
say from Veritasium, it would show up in the feed
link |
00:50:54.620
on the front page of the app or the website of my subscribers.
link |
00:50:58.340
And I knew that if I liked a video,
link |
00:51:00.440
I could send it 100,000 views or more.
link |
00:51:03.480
That no longer is true,
link |
00:51:05.360
but I think that was a good user experience.
link |
00:51:07.300
When I subscribe to someone, when I'm following them,
link |
00:51:09.800
I want to see more of what they like.
link |
00:51:13.060
I want them to also curate the feed for me.
link |
00:51:15.360
And I think that Twitter and Facebook are doing that
link |
00:51:17.900
in also some ways that are kind of annoying,
link |
00:51:20.320
but I would like that to happen more.
link |
00:51:22.340
And I think we would see communities being stronger
link |
00:51:25.160
on YouTube if it was that way instead of YouTube going,
link |
00:51:27.300
well, technically Michael liked this Veritasium video,
link |
00:51:29.760
but people are way more likely to click on Carpool Karaoke.
link |
00:51:33.620
So I don't even care who they are, just give them that.
link |
00:51:36.200
Not saying anything against Carpool Karaoke,
link |
00:51:38.960
that is an extremely important part of our society,
link |
00:51:43.260
what it means to be a human on earth, you know, but.
link |
00:51:46.780
I'll say it sucks, but.
link |
00:51:48.440
Yeah, but a lot of people would disagree with you
link |
00:51:51.380
and they should be able to see as much of that as they want.
link |
00:51:53.860
And I think even people who don't think they like it
link |
00:51:55.660
should still be really aware of it
link |
00:51:57.100
because it's such an important thing.
link |
00:51:59.400
It's such an influential thing.
link |
00:52:00.900
But yeah, I just wish that like new channels I discover
link |
00:52:03.260
and that I subscribe to,
link |
00:52:04.340
I wish that my subscribers found out about that
link |
00:52:06.940
because especially in the education community,
link |
00:52:10.020
a rising tide floats all boats.
link |
00:52:11.580
If you watch a video from Numberphile,
link |
00:52:14.020
you're just more likely to want to watch an episode from me,
link |
00:52:16.740
whether it be on Vsauce1 or D!NG.
link |
00:52:18.540
It's not competitive in the way that traditional TV was
link |
00:52:21.840
where it's like, well, if you tune into that show,
link |
00:52:23.180
it means you're not watching mine
link |
00:52:24.540
because they both air at the same time.
link |
00:52:26.140
So helping each other out through collaborations
link |
00:52:29.340
takes a lot of work,
link |
00:52:30.180
but just through engaging, commenting on their videos,
link |
00:52:32.820
liking their videos, subscribing to them, whatever,
link |
00:52:36.700
that I would love to see become easier and more powerful.
link |
00:52:41.500
So a quick and impossibly deep question,
link |
00:52:46.060
last question about mortality.
link |
00:52:48.980
You've spoken about death as an interesting topic.
link |
00:52:52.580
Do you think about your own mortality?
link |
00:52:55.940
Yeah, every day, it's really scary.
link |
00:52:59.740
So what do you think is the meaning of life
link |
00:53:04.020
that mortality makes very explicit?
link |
00:53:07.340
So why are you here on earth, Michael?
link |
00:53:12.620
What's the point of this whole thing?
link |
00:53:18.100
What does mortality in the context of the whole universe
link |
00:53:23.100
make you realize about yourself?
link |
00:53:26.400
Just you, Michael Stevens.
link |
00:53:29.140
Well, it makes me realize
link |
00:53:31.300
that I am destined to become a notion.
link |
00:53:35.620
I'm destined to become a memory and we can extend life.
link |
00:53:39.620
I think there's really exciting things being done
link |
00:53:42.900
to extend life,
link |
00:53:43.860
but we still don't know how to protect you
link |
00:53:47.100
from some accident that could happen,
link |
00:53:48.860
some unforeseen thing.
link |
00:53:50.300
Maybe we could save my connectome
link |
00:53:54.140
and recreate my consciousness digitally,
link |
00:53:56.580
but even that could be lost
link |
00:54:00.340
if it's stored on a physical medium or something.
link |
00:54:02.580
So basically, I just think that embracing
link |
00:54:07.780
and realizing how cool it is,
link |
00:54:09.020
that someday I will just be an idea.
link |
00:54:11.540
And there won't be a Michael anymore
link |
00:54:13.380
that can be like, no, that's not what I meant.
link |
00:54:16.180
It'll just be what people,
link |
00:54:17.540
they have to guess what I meant.
link |
00:54:19.620
And they'll remember me
link |
00:54:23.140
and how I live on as that memory
link |
00:54:25.780
will maybe not even be who I want it to be.
link |
00:54:29.600
But there's something powerful about that.
link |
00:54:31.940
And there's something powerful
link |
00:54:32.900
about letting future people run the show themselves.
link |
00:54:38.980
I think I'm glad to get out of their way at some point
link |
00:54:41.980
and say, all right, it's your world now.
link |
00:54:43.660
So you, the physical entity, Michael,
link |
00:54:47.300
have ripple effects in the space of ideas
link |
00:54:50.260
that far outlives you in ways that you can't control,
link |
00:54:54.260
but it's nevertheless fascinating to think,
link |
00:54:56.100
I mean, especially with you,
link |
00:54:57.580
you can imagine an alien species
link |
00:54:59.180
when they finally arrive and destroy all of us
link |
00:55:01.740
would watch your videos to try to figure out
link |
00:55:04.580
what were the questions that these people.
link |
00:55:05.980
But even if they didn't,
link |
00:55:08.620
I still think that there will be ripples.
link |
00:55:11.380
Like when I say memory,
link |
00:55:13.420
I don't specifically mean people remember my name
link |
00:55:16.580
and my birth date and like there's a photo of me
link |
00:55:18.940
on Wikipedia, like all that can be lost,
link |
00:55:20.860
but I still would hope that people ask questions
link |
00:55:23.980
and teach concepts in some of the ways
link |
00:55:26.460
that I have found useful and satisfying.
link |
00:55:28.340
Even if they don't know that I was the one
link |
00:55:29.780
who tried to popularize it, that's fine.
link |
00:55:32.580
But if Earth was completely destroyed,
link |
00:55:35.260
like burnt to a crisp, everything on it today,
link |
00:55:39.340
what would, the universe wouldn't care.
link |
00:55:42.700
Like Jupiter's not gonna go, oh no, and that could happen.
link |
00:55:49.980
So we do however have the power to launch things
link |
00:55:55.100
into space to try to extend how long our memory exists.
link |
00:56:02.900
And what I mean by that is,
link |
00:56:04.620
we are recording things about the world
link |
00:56:06.900
and we're learning things and writing stories
link |
00:56:08.420
and all of this and preserving that is truly
link |
00:56:13.020
what I think is the essence of being a human.
link |
00:56:16.580
We are autobiographers of the universe
link |
00:56:20.580
and we're really good at it.
link |
00:56:21.660
We're better than fossils.
link |
00:56:22.940
We're better than light spectrum.
link |
00:56:25.500
We're better than any of that.
link |
00:56:26.740
We collect much more detailed memories
link |
00:56:29.980
of what's happening, much better data.
link |
00:56:32.820
And so that should be our legacy.
link |
00:56:37.300
And I hope that that's kind of mine too
link |
00:56:40.140
in terms of people remembering something
link |
00:56:42.420
or having some kind of effect.
link |
00:56:44.780
But even if I don't, you can't not have an effect.
link |
00:56:47.860
This is not me feeling like,
link |
00:56:49.180
I hope that I have this powerful legacy.
link |
00:56:50.820
It's like, no matter who you are, you will.
link |
00:56:53.820
But you also have to embrace the fact
link |
00:56:57.700
that that impact might look really small and that's okay.
link |
00:57:01.340
One of my favorite quotes is from Tess of the d'Urbervilles.
link |
00:57:04.460
And it's along the lines of the measure of your life
link |
00:57:08.220
depends on not your external displacement
link |
00:57:10.940
but your subjective experience.
link |
00:57:13.100
If I am happy and those that I love are happy,
link |
00:57:16.700
can that be enough?
link |
00:57:17.700
Because if so, excellent.
link |
00:57:21.340
I think there's no better place to end it, Michael.
link |
00:57:23.460
Thank you so much.
link |
00:57:24.300
It was an honor to meet you.
link |
00:57:25.140
Thanks for talking to me.
link |
00:57:25.980
Thank you, it was a pleasure.
link |
00:57:27.660
Thanks for listening to this conversation
link |
00:57:29.180
with Michael Stevens.
link |
00:57:30.460
And thank you to our presenting sponsor, Cash App.
link |
00:57:33.340
Download it, use code LexPodcast,
link |
00:57:35.940
you'll get $10 and $10 will go to FIRST,
link |
00:57:38.980
a STEM education nonprofit that inspires
link |
00:57:41.300
hundreds of thousands of young minds
link |
00:57:43.300
to learn, to dream of engineering our future.
link |
00:57:47.020
If you enjoy this podcast, subscribe on YouTube,
link |
00:57:49.940
give it five stars on Apple Podcast,
link |
00:57:51.940
support it on Patreon, or connect with me on Twitter.
link |
00:57:55.500
And now, let me leave you with some words of wisdom
link |
00:57:58.580
from Albert Einstein.
link |
00:58:00.780
The important thing is not to stop questioning.
link |
00:58:03.980
Curiosity has its own reason for existence.
link |
00:58:06.900
One cannot help but be in awe
link |
00:58:09.220
when he contemplates the mysteries of eternity,
link |
00:58:11.740
of life, the marvelous structure of reality.
link |
00:58:14.980
It is enough if one tries merely to comprehend
link |
00:58:18.100
a little of this mystery every day.
link |
00:58:20.140
Thank you for listening and hope to see you next time.