
Liv Boeree: Poker, Game Theory, AI, Simulation, Aliens & Existential Risk | Lex Fridman Podcast #314



link |
00:00:00.000
Evolutionarily, if we see a lion running at us,
link |
00:00:03.040
we didn't have time to sort of calculate
link |
00:00:05.040
the lion's kinetic energy and is it optimal
link |
00:00:08.040
to go this way or that way, you just react.
link |
00:00:10.440
And physically our bodies are well attuned
link |
00:00:13.240
to actually make right decisions.
link |
00:00:14.540
But when you're playing a game like poker,
link |
00:00:16.640
this is not something that you ever evolved to do.
link |
00:00:19.320
And yet you're in that same fight or flight response.
link |
00:00:22.360
And so that's a really important skill
link |
00:00:24.440
to be able to develop to basically learn
link |
00:00:25.900
how to like meditate in the moment and calm yourself
link |
00:00:29.100
so that you can think clearly.
link |
00:00:32.520
The following is a conversation with Liv Boeree,
link |
00:00:35.740
formerly one of the best poker players in the world,
link |
00:00:38.380
who trained as an astrophysicist and is now a philanthropist
link |
00:00:42.880
and an educator on topics of game theory,
link |
00:00:45.780
physics, complexity, and life.
link |
00:00:49.120
This is the Lex Fridman Podcast.
link |
00:00:51.080
To support it, please check out our sponsors
link |
00:00:53.200
in the description.
link |
00:00:54.440
And now, dear friends, here's Liv Boeree.
link |
00:00:57.680
What role do you think luck plays in poker and in life?
link |
00:01:02.580
You can pick whichever one you want,
link |
00:01:04.260
poker or life, and/or life.
link |
00:01:06.940
The longer you play, the less influence luck has.
link |
00:01:10.900
Like with all things, the bigger your sample size,
link |
00:01:13.900
the more the quality of your decisions
link |
00:01:16.380
or your strategies matter.
link |
00:01:18.860
So to answer that question, yeah, in poker,
link |
00:01:21.580
it really depends.
link |
00:01:22.780
If you and I sat and played 10 hands right now,
link |
00:01:26.120
I might only win 52% of the time, 53% maybe.
link |
00:01:30.260
But if we played 10,000 hands,
link |
00:01:31.740
then I'll probably win like over 98, 99% of the time.
link |
00:01:35.180
So it's a question of sample sizes.
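The sample-size point above is easy to check numerically. A minimal sketch, modeling each hand as an independent win for the better player with probability 0.52 (a deliberate simplification; real hands are neither independent nor fixed-probability):

```python
import random

def win_prob_over_session(edge, hands, trials=500, seed=0):
    """Estimate how often the better player finishes a session ahead,
    treating each hand as an independent win with probability `edge`
    (a deliberate simplification of real poker)."""
    rng = random.Random(seed)
    ahead = 0
    for _ in range(trials):
        wins = sum(rng.random() < edge for _ in range(hands))
        if wins > hands // 2:  # strict majority of hands won
            ahead += 1
    return ahead / trials

# A 52% per-hand edge is barely visible over 10 hands,
# but close to a sure thing over 10,000.
print(win_prob_over_session(0.52, 10))
print(win_prob_over_session(0.52, 10_000, trials=100))
```

Over 10 hands the edge is hard to distinguish from a coin flip; over 10,000 the better player almost always finishes ahead.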
link |
00:01:38.180
And what are you figuring out over time?
link |
00:01:40.120
The betting strategy that this individual does
link |
00:01:42.260
or literally it doesn't matter
link |
00:01:43.740
against any individual over time?
link |
00:01:45.920
Against any individual over time, the better player
link |
00:01:47.980
because they're making better decisions.
link |
00:01:49.520
So what does that mean to make a better decision?
link |
00:01:51.260
Well, to get into the nitty gritty already,
link |
00:01:55.340
basically poker is a game of math.
link |
00:01:58.780
There are these strategies, you're familiar
link |
00:02:00.380
with like Nash equilibria, right?
link |
00:02:02.580
So there are these game theory optimal strategies
link |
00:02:06.340
that you can adopt.
link |
00:02:08.340
And the closer you play to them,
link |
00:02:10.380
the less exploitable you are.
link |
00:02:12.420
So because I've studied the game a bunch,
link |
00:02:15.980
although admittedly not for a few years,
link |
00:02:17.300
but back when I was playing all the time,
link |
00:02:20.060
I would study these game theory optimal solutions
link |
00:02:23.060
and try and then adopt those strategies
link |
00:02:24.940
when I go and play.
link |
00:02:25.780
So I'd play against you and I would do that.
link |
00:02:27.760
And because the objective,
link |
00:02:31.860
when you're playing game theory optimal,
link |
00:02:33.260
it's actually, it's a loss minimization thing
link |
00:02:36.020
that you're trying to do.
link |
00:02:37.380
Your best bet is to try and play a sort of similar style.
link |
00:02:42.760
You also need to try and adopt this loss minimization.
link |
00:02:46.220
But because I've been playing much longer than you,
link |
00:02:48.140
I'll be better at that.
link |
00:02:49.420
So first of all, you're not taking advantage
link |
00:02:51.940
of my mistakes.
link |
00:02:53.180
But then on top of that,
link |
00:02:55.020
I'll be better at recognizing
link |
00:02:56.940
when you are playing suboptimally
link |
00:02:59.540
and then deviating from this game theory optimal strategy
link |
00:03:02.220
to exploit your bad plays.
link |
00:03:05.140
Can you define game theory and Nash equilibria?
link |
00:03:08.700
Can we try to sneak up to it in a bunch of ways?
link |
00:03:10.980
Like what's a game theory framework of analyzing poker,
link |
00:03:14.380
analyzing any kind of situation?
link |
00:03:16.340
So game theory is just basically the study of decisions
link |
00:03:21.340
within a competitive situation.
link |
00:03:24.540
I mean, it's technically a branch of economics,
link |
00:03:26.780
but it also applies to like wider decision theory.
link |
00:03:30.380
And usually when you see it,
link |
00:03:35.380
it's these like little payoff matrices and so on.
link |
00:03:37.900
That's how it's depicted.
link |
00:03:38.780
But it's essentially just like study of strategies
link |
00:03:41.140
under different competitive situations.
link |
00:03:42.940
And as it happens, certain games,
link |
00:03:46.360
in fact, many, many games have these things
link |
00:03:48.820
called Nash equilibria.
link |
00:03:50.460
And what that means is when you're in a Nash equilibrium,
link |
00:03:52.420
basically it is not,
link |
00:03:55.460
there is no strategy that you can take
link |
00:03:59.700
that would be more beneficial
link |
00:04:01.180
than the one you're currently taking,
link |
00:04:02.780
assuming your opponent is also doing the same thing.
link |
00:04:05.700
So it'd be a bad idea,
link |
00:04:06.740
if we're both playing in a game theory optimal strategy,
link |
00:04:10.660
if either of us deviate from that,
link |
00:04:12.140
now we're putting ourselves at a disadvantage.
link |
00:04:16.620
Rock, paper, scissors
link |
00:04:17.460
is actually a really great example of this.
link |
00:04:18.840
Like if we were to start playing rock, paper, scissors,
link |
00:04:22.380
you know, you know nothing about me
link |
00:04:23.740
and we're gonna play for all our money,
link |
00:04:26.180
let's play 10 rounds of it.
link |
00:04:27.820
What would your sort of optimal strategy be?
link |
00:04:30.220
Do you think, what would you do?
link |
00:04:33.640
Let's see.
link |
00:04:35.180
I would probably try to be as random as possible.
link |
00:04:42.580
Exactly.
link |
00:04:43.500
You wanna, because you don't know anything about me,
link |
00:04:46.000
you don't want to give anything away about yourself.
link |
00:04:48.100
So ideally you'd have like a little dice
link |
00:04:49.540
or some sort of, you know, perfect randomizer
link |
00:04:52.560
that makes you randomize 33% of the time
link |
00:04:54.500
each of the three different things.
link |
00:04:56.060
And in response to that,
link |
00:04:58.140
well, actually I can kind of do anything,
link |
00:04:59.620
but I would probably just randomize back too,
link |
00:05:01.380
but actually it wouldn't matter
link |
00:05:02.400
because I know that you're playing randomly.
link |
00:05:05.140
So that would be us in a Nash equilibrium
link |
00:05:07.620
where we're both playing this like unexploitable strategy.
link |
00:05:10.620
However, if after a while you then notice
link |
00:05:13.020
that I'm playing rock a little bit more often than I should.
link |
00:05:16.380
Yeah, you're the kind of person that would do that,
link |
00:05:18.140
wouldn't you?
link |
00:05:18.980
Sure.
link |
00:05:19.800
Yes, yes, yes.
link |
00:05:20.640
I'm more of a scissors girl, but anyway.
link |
00:05:21.460
You are?
link |
00:05:22.500
No, I'm a, as I said, randomizer.
link |
00:05:25.580
So you notice I'm throwing rock too much
link |
00:05:27.140
or something like that.
link |
00:05:28.140
Now you'd be making a mistake
link |
00:05:29.300
by continuing playing this game theory optimal strategy,
link |
00:05:32.180
well, the previous one, because you are now,
link |
00:05:36.660
I'm making a mistake and you're not deviating
link |
00:05:39.060
and exploiting my mistake.
link |
00:05:41.300
So you'd want to start throwing paper a bit more often
link |
00:05:43.800
in whatever you figure is the right sort of percentage
link |
00:05:45.940
of the time that I'm throwing rock too often.
link |
00:05:48.060
So that's basically an example of where,
link |
00:05:51.460
what game theory optimal strategy is
link |
00:05:53.260
in terms of loss minimization,
link |
00:05:54.660
but it's not always the maximally profitable thing
link |
00:05:58.020
if your opponent is doing stupid stuff,
link |
00:06:00.140
as in that example.
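The exploit described here can be made concrete. Against any mixed strategy, each pure throw has a computable expected payoff, and a rock-heavy opponent makes paper strictly profitable. A quick sketch, scoring a win as +1, a loss as -1, and a tie as 0:

```python
def expected_values(opp_mix):
    """Expected payoff of each pure throw against an opponent's mix
    (win = +1, loss = -1, tie = 0)."""
    r, p, s = opp_mix["rock"], opp_mix["paper"], opp_mix["scissors"]
    return {
        "rock": s - p,       # rock beats scissors, loses to paper
        "paper": r - s,      # paper beats rock, loses to scissors
        "scissors": p - r,   # scissors beats paper, loses to rock
    }

uniform = {"rock": 1/3, "paper": 1/3, "scissors": 1/3}
rock_heavy = {"rock": 0.5, "paper": 0.25, "scissors": 0.25}

print(expected_values(uniform))     # all zero: no deviation gains anything
print(expected_values(rock_heavy))  # paper is now strictly profitable
```

Against the uniform Nash mix every response breaks even, which is exactly why neither player can do better by deviating; against the rock-heavy mix, paper picks up a quarter of a unit per round.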
link |
00:06:02.560
So that's kind of then how it works in poker,
link |
00:06:04.500
but it's a lot more complex.
link |
00:06:07.380
And the way poker players typically,
link |
00:06:10.620
nowadays they study, the games change so much.
link |
00:06:12.900
And I think we should talk about how it sort of evolved,
link |
00:06:15.420
but nowadays like the top pros
link |
00:06:17.780
basically spend all their time in between sessions
link |
00:06:20.820
running these simulators using like software
link |
00:06:24.220
where they do basically Monte Carlo simulations,
link |
00:06:26.140
sort of doing billions of fictitious self-play hands.
link |
00:06:31.140
You input a fictitious hand scenario,
link |
00:06:34.020
like, oh, what do I do with Jack nine suited
link |
00:06:36.060
on a King-ten-four two-spade board
link |
00:06:41.980
and against this bet size.
link |
00:06:43.940
So you'd input that press play,
link |
00:06:45.780
it'll run its billions of fake hands
link |
00:06:49.340
and then it will converge upon
link |
00:06:50.340
what the game theory optimal strategies are.
link |
00:06:53.460
And then you wanna try and memorize what these are.
link |
00:06:55.420
Basically they're like ratios of how often,
link |
00:06:57.480
what types of hands you want to bluff
link |
00:06:59.900
and what percentage of the time.
link |
00:07:01.300
So then there's this additional layer
link |
00:07:02.780
of randomization built in.
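As a toy analogue of that self-play loop: regret matching, the building block of the counterfactual-regret-minimization methods modern poker solvers are based on, converges on the unexploitable mix even in rock, paper, scissors. This is an illustrative sketch, not real solver code:

```python
import random

ACTIONS = ["rock", "paper", "scissors"]
BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}

def payoff(a, b):
    """+1 if throw a beats throw b, -1 if it loses, 0 on a tie."""
    if a == b:
        return 0
    return 1 if BEATS[a] == b else -1

def strategy_from_regrets(regrets):
    """Regret matching: play in proportion to positive accumulated
    regret, falling back to uniform if no regret is positive."""
    positives = [max(r, 0.0) for r in regrets]
    total = sum(positives)
    return [p / total for p in positives] if total > 0 else [1 / 3] * 3

def self_play(iterations=20_000, seed=1):
    """Two regret-matching agents play each other; the time-averaged
    strategy converges toward the Nash mix (1/3, 1/3, 1/3)."""
    rng = random.Random(seed)
    regrets = [[0.0] * 3, [0.0] * 3]
    strategy_sum = [0.0] * 3  # track player 0's average strategy
    for _ in range(iterations):
        strats = [strategy_from_regrets(r) for r in regrets]
        for a in range(3):
            strategy_sum[a] += strats[0][a]
        picks = [rng.choices(range(3), weights=s)[0] for s in strats]
        for i in range(2):
            opp = ACTIONS[picks[1 - i]]
            got = payoff(ACTIONS[picks[i]], opp)
            for a in range(3):
                # Regret of not having played action a instead.
                regrets[i][a] += payoff(ACTIONS[a], opp) - got
    total = sum(strategy_sum)
    return [s / total for s in strategy_sum]

print(self_play())  # each probability drifts toward 1/3
```

Note that it is the averaged strategy, not the latest one, that converges, which mirrors how solver outputs are ratios accumulated over billions of fictitious hands rather than a single deterministic line.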
link |
00:07:04.460
Yeah, those kinds of simulations incorporate
link |
00:07:06.420
all the betting strategies and everything else like that.
link |
00:07:09.460
So as opposed to some kind of very crude mathematical model
link |
00:07:12.340
of what's the probability you win
link |
00:07:13.860
just based on the quality of the card,
link |
00:07:16.380
it's including everything else too.
link |
00:07:18.980
The game theory of it.
link |
00:07:20.220
Yes, yeah, essentially.
link |
00:07:21.980
And what's interesting is that nowadays,
link |
00:07:23.780
if you want to be a top pro and you go and play
link |
00:07:25.620
in these really like the super high stakes tournaments
link |
00:07:27.540
or tough cash games, if you don't know this stuff,
link |
00:07:30.700
you're gonna get eaten alive in the long run.
link |
00:07:33.260
But of course you could get lucky over the short run
link |
00:07:35.160
and that's where this like luck factor comes in
link |
00:07:36.860
because luck is both a blessing and a curse.
link |
00:07:40.400
If luck didn't, if there wasn't this random element
link |
00:07:42.580
and there wasn't the ability for worse players
link |
00:07:45.260
to win sometimes, then poker would fall apart.
link |
00:07:48.220
You know, the same reason people don't play chess
link |
00:07:50.980
professionally for money against,
link |
00:07:53.380
you don't see people going and hustling chess
link |
00:07:55.500
like not knowing, trying to make a living from it
link |
00:07:57.860
because you know there's very little luck in chess,
link |
00:08:00.140
but there's quite a lot of luck in poker.
link |
00:08:01.420
Have you seen Beautiful Mind, that movie?
link |
00:08:03.940
Years ago.
link |
00:08:04.780
Well, what do you think about the game theoretic formulation
link |
00:08:07.220
of what is it, the hot blonde at the bar?
link |
00:08:09.980
Do you remember?
link |
00:08:10.820
Oh, yeah.
link |
00:08:11.660
What they illustrated is they're trying to pick up a girl
link |
00:08:14.860
at a bar and there's multiple girls.
link |
00:08:16.460
They're like friend, it's like a friend group
link |
00:08:18.140
and you're trying to approach,
link |
00:08:20.100
I don't remember the details, but I remember.
link |
00:08:21.980
Don't you like then speak to her friends first
link |
00:08:24.340
or something like that, feign disinterest.
link |
00:08:25.700
I mean, it's classic pickup artist stuff, right?
link |
00:08:27.420
You wanna.
link |
00:08:28.260
And they were trying to correlate that somehow,
link |
00:08:30.780
that being an optimal strategy game theoretically.
link |
00:08:35.360
Why?
link |
00:08:36.740
What, what, like, I don't think, I remember.
link |
00:08:38.940
I can't imagine that there is,
link |
00:08:39.780
I mean, there's probably an optimal strategy.
link |
00:08:41.940
Is it, does that mean that there's an actual Nash equilibrium
link |
00:08:45.020
of like picking up girls?
link |
00:08:46.660
Do you know the marriage problem?
link |
00:08:49.020
It's optimal stopping.
link |
00:08:50.940
Yes.
link |
00:08:51.780
So where it's an optimal dating strategy
link |
00:08:54.180
where you, do you remember what it is?
link |
00:08:56.580
Yeah, I think it's like something like,
link |
00:08:57.580
you know you've got like a set of a hundred people
link |
00:08:59.920
you're gonna look through and after,
link |
00:09:01.900
how many do you, now after that,
link |
00:09:05.620
after going on this many dates out of a hundred,
link |
00:09:08.020
at what point do you then go, okay, the next best person I see,
link |
00:09:10.780
is that the right one?
link |
00:09:11.620
And I think it's like something like 37%.
link |
00:09:14.700
Uh, it's one over e, whatever that is.
link |
00:09:17.740
Right, which I think is 37%.
link |
00:09:19.260
Yeah.
link |
00:09:21.260
We're gonna fact check that.
link |
00:09:24.860
Yeah.
link |
00:09:25.700
So, but it's funny under those strict constraints,
link |
00:09:28.420
then yes, after that many people,
link |
00:09:30.660
as long as you have a fixed size pool,
link |
00:09:32.900
then you just pick the next person
link |
00:09:36.460
that is better than anyone you've seen before.
link |
00:09:38.460
Anyone else you've seen, yeah.
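For the fact check: this is the classic secretary problem, and the threshold is n/e, about 37% since 1/e ≈ 0.368. Under the standard strict assumptions (known pool size, candidates in random order, no going back), a quick simulation recovers the success rate:

```python
import math
import random

def secretary_success_rate(n=100, trials=10_000, seed=0):
    """Fraction of runs where the look-then-leap rule finds the single
    best of n candidates: observe the first n/e without committing,
    then take the first who beats everyone seen so far."""
    cutoff = round(n / math.e)  # ~37 when n = 100
    rng = random.Random(seed)
    successes = 0
    for _ in range(trials):
        ranks = list(range(n))  # n - 1 marks the best candidate
        rng.shuffle(ranks)
        best_seen = max(ranks[:cutoff])
        # First later candidate who beats the benchmark, else the last one.
        picked = next((r for r in ranks[cutoff:] if r > best_seen), ranks[-1])
        successes += picked == n - 1
    return successes / trials

print(secretary_success_rate())  # hovers near 1/e, about 0.37
```

The success probability itself also tends to 1/e, so the rule finds the single best candidate only about 37% of the time even when followed perfectly.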
link |
00:09:40.340
Have you tried this?
link |
00:09:41.780
Have you incorporated it?
link |
00:09:42.620
I'm not one of those people.
link |
00:09:44.660
And we're gonna discuss this.
link |
00:09:45.820
I, and, what do you mean, those people?
link |
00:09:49.820
I try not to optimize stuff.
link |
00:09:52.560
I try to listen to the heart.
link |
00:09:55.300
I don't think, I like,
link |
00:09:59.780
my mind immediately is attracted to optimizing everything.
link |
00:10:04.300
Optimizing everything.
link |
00:10:06.220
And I think that if you really give in
link |
00:10:09.900
to that kind of addiction,
link |
00:10:11.060
that you lose the joy of the small things,
link |
00:10:14.880
the minutiae of life, I think.
link |
00:10:17.180
I don't know.
link |
00:10:18.020
I'm concerned about the addictive nature
link |
00:10:19.780
of my personality in that regard.
link |
00:10:21.600
In some ways,
link |
00:10:24.760
while I think, on average,
link |
00:10:26.520
people under-quantify things,
link |
00:10:30.140
or under-optimize.
link |
00:10:32.500
There are some people who, you know,
link |
00:10:34.100
with all these things, it's a balancing act.
link |
00:10:37.140
I've been on dating apps, but I've never used them.
link |
00:10:40.180
I'm sure they have data on this,
link |
00:10:42.060
because they probably have
link |
00:10:43.060
the optimal stopping control problem.
link |
00:10:45.260
Because there are a lot of people that use social,
link |
00:10:47.380
like dating apps, and are on there for a long time.
link |
00:10:51.020
So the interesting aspect is like, all right,
link |
00:10:56.300
how long before you stop looking
link |
00:10:58.740
before it actually starts affecting your mind negatively,
link |
00:11:01.720
such that you see dating as a kind of,
link |
00:11:07.380
A game.
link |
00:11:08.220
A kind of game versus an actual process
link |
00:11:12.780
of finding somebody that's going to make you happy
link |
00:11:14.420
for the rest of your life.
link |
00:11:15.780
That's really interesting.
link |
00:11:17.340
They have the data.
link |
00:11:18.180
I wish they would be able to release that data.
link |
00:11:20.420
And I do want to hop to it.
link |
00:11:21.260
It's OkCupid, right?
link |
00:11:22.220
I think they ran a huge, huge study on all of that.
link |
00:11:25.320
Yeah, they're more data driven, I think, OkCupid folks are.
link |
00:11:28.060
I think there's a lot of opportunity for dating apps,
link |
00:11:30.220
in general, even bigger than dating apps,
link |
00:11:32.340
people connecting on the internet.
link |
00:11:35.020
I just hope they're more data driven.
link |
00:11:37.580
And it doesn't seem that way.
link |
00:11:40.340
I think like, I've always thought that
link |
00:11:44.780
Goodreads should be a dating app.
link |
00:11:47.900
Like the.
link |
00:11:48.740
I've never used it.
link |
00:11:49.580
After what?
link |
00:11:50.400
Goodreads is just lists like books that you've read.
link |
00:11:55.140
And allows you to comment on the books you read
link |
00:11:57.360
and what books you're currently reading.
link |
00:11:58.940
But it's a giant social network of people reading books.
link |
00:12:01.340
And that seems to be a much better database
link |
00:12:03.380
of like interests.
link |
00:12:04.620
Of course it constrains you to the books you're reading,
link |
00:12:06.580
but like that really reveals so much more about the person.
link |
00:12:10.220
Allows you to discover shared interests
link |
00:12:12.460
because books are a kind of window
link |
00:12:13.940
into the way you see the world.
link |
00:12:16.020
Also like the kind of places,
link |
00:12:19.220
people you're curious about,
link |
00:12:20.340
the kind of ideas you're curious about.
link |
00:12:21.700
Are you a romantic or are you a cold, calculating rationalist?
link |
00:12:24.860
Are you into Ayn Rand or are you into Bernie Sanders?
link |
00:12:28.260
Are you into whatever?
link |
00:12:29.940
And I feel like that reveals so much more
link |
00:12:31.980
than like a person trying to look hot
link |
00:12:35.220
from a certain angle in a Tinder profile.
link |
00:12:37.100
Well, and it'd also be a really great filter
link |
00:12:38.940
in the first place for people.
link |
00:12:40.340
It's less people who read books
link |
00:12:41.940
and are willing to go and rate them
link |
00:12:44.700
and give feedback on them and so on.
link |
00:12:47.020
So that's already a really strong filter.
link |
00:12:48.820
Probably the type of people you'd be looking for.
link |
00:12:50.600
Well, at least be able to fake reading books.
link |
00:12:52.380
I mean, the thing about books,
link |
00:12:53.540
you don't really need to read it.
link |
00:12:54.500
You can just look at the Cliff Notes.
link |
00:12:55.860
Yeah, game the dating app by feigning intellectualism.
link |
00:12:59.060
Can I admit something very horrible about myself?
link |
00:13:02.340
Go on.
link |
00:13:03.180
The things that, you know,
link |
00:13:04.300
I don't have many things in my closet,
link |
00:13:05.700
but this is one of them.
link |
00:13:07.980
I've never actually read Shakespeare.
link |
00:13:10.700
I've only read Cliff Notes
link |
00:13:12.340
and I got a five in the AP English exam.
link |
00:13:15.020
Wow.
link |
00:13:15.860
And I...
link |
00:13:16.700
Which book?
link |
00:13:18.020
The which books have I read?
link |
00:13:19.660
Oh yeah, which was the exam on which book?
link |
00:13:21.500
Oh no, they include a lot of them.
link |
00:13:23.180
Oh.
link |
00:13:24.260
But Hamlet, I don't even know
link |
00:13:27.140
if you read Romeo and Juliet, Macbeth.
link |
00:13:30.060
I don't remember, but I don't understand it.
link |
00:13:32.660
It's like really cryptic.
link |
00:13:34.060
It's hard.
link |
00:13:34.900
It's really, I don't, and it's not that pleasant to read.
link |
00:13:37.980
It's like ancient speak.
link |
00:13:39.180
I don't understand it.
link |
00:13:40.300
Anyway, maybe I was too dumb.
link |
00:13:41.780
I'm still too dumb, but I did get...
link |
00:13:44.780
You got a five, which is...
link |
00:13:45.820
Yeah, yeah.
link |
00:13:46.660
I don't know how the US grading system...
link |
00:13:47.980
Oh no, so AP English is a,
link |
00:13:50.380
there's kind of this advanced versions of courses
link |
00:13:53.060
in high school, and you take a test
link |
00:13:54.900
that is like a broad test for that subject
link |
00:13:57.940
and includes a lot.
link |
00:13:58.900
It wasn't obviously just Shakespeare.
link |
00:14:00.580
I think a lot of it was also writing, written.
link |
00:14:04.900
You have like AP Physics, AP Computer Science,
link |
00:14:07.140
AP Biology, AP Chemistry, and then AP English
link |
00:14:11.140
or AP Literature, I forget what it was.
link |
00:14:13.460
But I think Shakespeare was a part of that.
link |
00:14:16.020
But I...
link |
00:14:16.860
And the point is you gamified it?
link |
00:14:19.140
Gamified.
link |
00:14:19.980
Well, in entirety, I was into getting As.
link |
00:14:22.580
I saw it as a game.
link |
00:14:24.180
I don't think any...
link |
00:14:27.060
I don't think all of the learning I've done
link |
00:14:30.340
has been outside of school.
link |
00:14:33.940
The deepest learning I've done has been outside of school
link |
00:14:36.220
with a few exceptions, especially in grad school,
link |
00:14:38.380
like deep computer science courses.
link |
00:14:40.420
But that was still outside of school
link |
00:14:41.780
because it was outside of getting...
link |
00:14:43.180
Sorry.
link |
00:14:44.020
It was outside of getting the A for the course.
link |
00:14:46.180
The best stuff I've ever done is when you read the chapter
link |
00:14:49.620
and you do many of the problems at the end of the chapter,
link |
00:14:52.300
which is usually not what's required for the course,
link |
00:14:54.900
like the hardest stuff.
link |
00:14:56.140
In fact, textbooks are freaking incredible.
link |
00:14:58.780
If you go back now and you look at like biology textbook
link |
00:15:02.220
or any of the computer science textbooks
link |
00:15:06.060
on algorithms and data structures,
link |
00:15:07.740
those things are incredibly...
link |
00:15:09.500
They have the best summary of a subject,
link |
00:15:11.940
plus they have practice problems of increasing difficulty
link |
00:15:15.300
that allow you to truly master the basics,
link |
00:15:17.700
like the fundamental ideas behind that.
link |
00:15:19.900
I got through my entire physics degree with one textbook
link |
00:15:24.380
that was just this really comprehensive one
link |
00:15:26.260
that they told us at the beginning of the first year,
link |
00:15:28.620
buy this, but you're gonna have to buy 15 other books
link |
00:15:31.660
for all your supplementary courses.
link |
00:15:33.340
And I was like, every time I just checked
link |
00:15:35.500
to see whether this book covered it and it did.
link |
00:15:38.020
And I think I only bought like two or three extra
link |
00:15:39.820
and thank God, cause they're super expensive textbooks.
link |
00:15:41.900
It's a whole racket they've got going on.
link |
00:15:44.300
Yeah, they are.
link |
00:15:45.140
They could just...
link |
00:15:46.220
You get the right one, it's just like a manual for...
link |
00:15:49.500
But what's interesting though,
link |
00:15:51.380
is this is the tyranny of having exams and metrics.
link |
00:15:56.780
The tyranny of exams and metrics, yes.
link |
00:15:58.660
I loved them because I'm very competitive
link |
00:16:00.980
and I liked finding ways to gamify things
link |
00:16:04.020
and then like sort of dust off my shoulders afterwards
link |
00:16:06.380
when I get a good grade
link |
00:16:07.420
or be annoyed at myself when I didn't.
link |
00:16:09.620
But yeah, you're absolutely right.
link |
00:16:10.860
And that the actual...
link |
00:16:12.820
How much of that physics knowledge I've retained,
link |
00:16:15.700
like I've learned how to cram and study
link |
00:16:19.660
and please an examiner,
link |
00:16:21.340
but did that give me the deep lasting knowledge
link |
00:16:23.980
that I needed?
link |
00:16:24.820
I mean, yes and no,
link |
00:16:27.020
but really like nothing makes you learn a topic better
link |
00:16:30.980
than when you actually then have to teach it yourself.
link |
00:16:34.260
Like I'm trying to wrap my teeth
link |
00:16:35.860
around this like game theory Moloch stuff right now.
link |
00:16:38.180
And there's no exam at the end of it that I can gamify.
link |
00:16:43.060
There's no way to gamify
link |
00:16:43.980
and sort of like shortcut my way through it.
link |
00:16:45.380
I have to understand it so deeply
link |
00:16:47.260
from like deep foundational levels
link |
00:16:49.660
to then to build upon it
link |
00:16:50.700
and then try and explain it to other people.
link |
00:16:52.340
And like, you're about to go and do some lectures, right?
link |
00:16:54.460
You can't sort of just like,
link |
00:16:58.620
you presumably can't rely on the knowledge
link |
00:17:00.660
that you got through when you were studying for an exam
link |
00:17:03.460
to reteach that.
link |
00:17:04.860
Yeah, and especially high level lectures,
link |
00:17:06.860
especially the kind of stuff you do on YouTube,
link |
00:17:09.420
you're not just regurgitating material.
link |
00:17:12.780
You have to think through what is the core idea here.
link |
00:17:17.940
And when you do the lectures live especially,
link |
00:17:20.780
you have to, there's no second takes.
link |
00:17:25.020
That is the luxury you get
link |
00:17:27.100
if you're recording a video for YouTube
link |
00:17:28.980
or something like that.
link |
00:17:30.220
But it definitely is a luxury you shouldn't lean on.
link |
00:17:35.060
I've gotten to interact with a few YouTubers
link |
00:17:37.340
that lean on that too much.
link |
00:17:39.300
And you realize, oh, you've gamified this system
link |
00:17:43.380
because you're not really thinking deeply about stuff.
link |
00:17:46.740
You're through the edit,
link |
00:17:48.420
both written and spoken,
link |
00:17:52.220
you're crafting an amazing video,
link |
00:17:53.900
but you yourself as a human being
link |
00:17:55.460
have not really deeply understood it.
link |
00:17:57.620
So live teaching or at least recording video
link |
00:18:00.780
with very few takes is a different beast.
link |
00:18:04.700
And I think it's the most honest way of doing it,
link |
00:18:07.300
like as few takes as possible.
link |
00:18:09.420
That's why I'm nervous about this.
link |
00:18:10.780
Don't go back and be like, ah, let's do that.
link |
00:18:14.020
Don't fuck this up, Liv.
link |
00:18:17.220
The tyranny of exams.
link |
00:18:18.500
I do think people talk about high school and college
link |
00:18:24.220
as a time to do drugs and drink and have fun
link |
00:18:27.220
and all this kind of stuff.
link |
00:18:28.340
But looking back, of course I did a lot of those things.
link |
00:18:33.340
Yes, no, yes, but it's also a time
link |
00:18:39.020
when you get to read textbooks or read books
link |
00:18:43.620
or learn with all the time in the world.
link |
00:18:48.740
You don't have these responsibilities of laundry
link |
00:18:54.860
and having to sort of pay for mortgage,
link |
00:18:59.860
all that kind of stuff, pay taxes, all this kind of stuff.
link |
00:19:03.020
In most cases, there's just so much time in the day
link |
00:19:06.500
for learning and you don't realize it at the time
link |
00:19:09.420
because at the time it seems like a chore,
link |
00:19:11.100
like why the hell is there so much homework?
link |
00:19:14.660
But you never get a chance to do this kind of learning,
link |
00:19:17.220
this kind of homework ever again in life,
link |
00:19:20.300
unless later in life you really make a big effort out of it.
link |
00:19:23.860
Like basically your knowledge gets solidified.
link |
00:19:26.780
You don't get to have fun and learn.
link |
00:19:28.380
Learning is really fulfilling and really fun
link |
00:19:32.740
if you're that kind of person.
link |
00:19:33.700
Like some people like, you know,
link |
00:19:37.220
like knowledge is not something that they think is fun.
link |
00:19:39.940
But if that's the kind of thing that you think is fun,
link |
00:19:43.020
that's the time to have fun and do the drugs and drink
link |
00:19:45.620
and all that kind of stuff.
link |
00:19:46.460
But the learning, just going back to those textbooks,
link |
00:19:50.780
the hours spent with the textbooks
link |
00:19:52.460
is really, really rewarding.
link |
00:19:54.420
Do people even use textbooks anymore?
link |
00:19:56.020
Yeah. Do you think?
link |
00:19:56.940
Kids these days with their TikTok and their...
link |
00:20:00.300
Well, not even that, but it's just like so much information,
link |
00:20:03.940
really high quality information,
link |
00:20:05.260
you know, it's now in digital format online.
link |
00:20:08.420
Yeah, but they're not, they are using that,
link |
00:20:10.340
but you know, college is still very, there's a curriculum.
link |
00:20:15.340
I mean, so much of school is about rigorous study
link |
00:20:18.860
of a subject and still on YouTube, that's not there.
link |
00:20:22.820
Right. YouTube has,
link |
00:20:23.900
YouTube has, Grant Sanderson talks about this.
link |
00:20:27.700
He's this math...
link |
00:20:29.180
3Blue1Brown.
link |
00:20:30.060
Yeah, 3Blue1Brown.
link |
00:20:31.660
He says like, I'm not a math teacher.
link |
00:20:33.860
I just take really cool concepts and I inspire people.
link |
00:20:37.740
But if you wanna really learn calculus,
link |
00:20:39.460
if you wanna really learn linear algebra,
link |
00:20:41.820
you should do the textbook.
link |
00:20:43.540
You should do that, you know.
link |
00:20:45.220
And there's still the textbook industrial complex
link |
00:20:48.980
that, like, charges like $200 for a textbook and somehow,
link |
00:20:53.340
I don't know, it's ridiculous.
link |
00:20:56.580
Well, they're like, oh, sorry, new edition, edition 14.6.
link |
00:21:00.540
Sorry, you can't use 14.5 anymore.
link |
00:21:02.860
It's like, what's different?
link |
00:21:03.700
We've got one paragraph different.
link |
00:21:05.340
So we mentioned offline, Daniel Negreanu.
link |
00:21:08.900
I'm gonna get a chance to talk to him on this podcast.
link |
00:21:11.620
And he's somebody that I found fascinating
link |
00:21:14.380
in terms of the way he thinks about poker,
link |
00:21:16.540
verbalizes the way he thinks about poker,
link |
00:21:18.580
the way he plays poker.
link |
00:21:20.180
So, and he's still pretty damn good.
link |
00:21:22.860
He's been good for a long time.
link |
00:21:24.900
So you mentioned that people are running
link |
00:21:27.300
these kinds of simulations and the game of poker has changed.
link |
00:21:30.540
Do you think he's adapting in this way?
link |
00:21:33.420
Like the top pros, do they have to adopt this way?
link |
00:21:36.540
Or is there still like over the years,
link |
00:21:41.700
you basically develop this gut feeling about,
link |
00:21:45.300
like you get to be like good the way,
link |
00:21:48.580
like AlphaZero is good.
link |
00:21:49.780
You look at the board and somehow from the fog
link |
00:21:54.180
comes out the right answer.
link |
00:21:55.380
Like this is likely what they have.
link |
00:21:58.020
This is likely the best way to move.
link |
00:22:00.460
And you don't really, you can't really put a finger
link |
00:22:03.100
on exactly why, but it just comes from your gut feeling.
link |
00:22:08.260
Or no?
link |
00:22:09.100
Yes and no.
link |
00:22:10.580
So gut feelings are definitely very important.
link |
00:22:14.740
You know, that we've got our two,
link |
00:22:16.980
you can distill it down to two modes
link |
00:22:18.180
of decision making, right?
link |
00:22:19.100
You've got your sort of logical linear voice in your head,
link |
00:22:22.180
system two, as it's often called,
link |
00:22:24.100
and your system one, your gut intuition.
link |
00:22:29.020
And historically in poker,
link |
00:22:32.420
the very best players were playing
link |
00:22:34.380
almost entirely by their gut.
link |
00:22:37.140
You know, often they do some kind of inspired play
link |
00:22:39.300
and you'd ask them why they do it
link |
00:22:40.500
and they wouldn't really be able to explain it.
link |
00:22:43.020
And that's not so much because their process
link |
00:22:46.500
was unintelligible, but it was more just because
link |
00:22:48.620
no one had the language with which to describe
link |
00:22:51.220
what optimal strategies were
link |
00:22:52.420
because no one really understood how poker worked.
link |
00:22:54.340
This was before, you know, we had analysis software.
link |
00:22:57.620
You know, no one was writing,
link |
00:22:59.940
I guess some people would write down their hands
link |
00:23:01.540
in a little notebook,
link |
00:23:02.740
but there was no way to assimilate all this data
link |
00:23:04.620
and analyze it.
link |
00:23:05.980
But then, you know, when computers became cheaper
link |
00:23:08.540
and software started emerging,
link |
00:23:09.980
and then obviously online poker,
link |
00:23:11.540
where it would like automatically save your hand histories.
link |
00:23:14.260
Now, all of a sudden you kind of had this body of data
link |
00:23:17.100
that you could run analysis on.
link |
00:23:19.620
And so that's when people started to see, you know,
link |
00:23:22.180
these mathematical solutions.
link |
00:23:26.780
And so what that meant is the role of intuition
link |
00:23:32.500
essentially became smaller.
link |
00:23:35.540
And it went more into, as we talked before about,
link |
00:23:38.660
you know, this game theory optimal style.
link |
00:23:40.620
But also, as I said, like game theory optimal
link |
00:23:43.340
is about loss minimization and being unexploitable.
link |
00:23:47.740
But if you're playing against people who aren't,
link |
00:23:49.620
because no person, no human being can play perfectly
link |
00:23:51.620
game theory optimal in poker, not even the best AIs.
link |
00:23:54.020
They're still like, you know,
link |
00:23:55.460
they're 99.99% of the way there or whatever,
link |
00:23:57.540
but it's kind of like the speed of light.
link |
00:23:59.260
You can't reach it perfectly.
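The "unexploitable" idea she describes can be sketched with a toy game. Everything below is illustrative, not from the conversation: rock-paper-scissors stands in for poker, and exploitability is how much a best-responding opponent wins per game against a fixed mixed strategy. At the game-theory-optimal mix it is exactly zero; any deviation leaks value.

```python
# Toy sketch of "unexploitable" play using rock-paper-scissors.
# PAYOFF[i][j] is the row player's payoff when row plays i and
# column plays j; rows/columns are (rock, paper, scissors).
PAYOFF = [
    [0, -1, 1],   # rock: loses to paper, beats scissors
    [1, 0, -1],   # paper: beats rock, loses to scissors
    [-1, 1, 0],   # scissors: loses to rock, beats paper
]

def exploitability(s):
    """Average winnings of a best-responding opponent against mix s.

    Zero means the strategy is game theory optimal (unexploitable);
    anything positive means a perfect opponent profits against it.
    """
    # Opponent's expected payoff for each pure counter (zero-sum game,
    # so the opponent's payoff is the negation of the row payoff).
    evs = [sum(s[i] * -PAYOFF[i][j] for i in range(3)) for j in range(3)]
    return max(evs)

print(exploitability([1/3, 1/3, 1/3]))  # 0.0 -> unexploitable equilibrium
print(exploitability([1, 0, 0]))        # 1 -> always-rock loses every game
print(exploitability([0.5, 0.3, 0.2]))  # > 0 -> any skew leaks value
```

The same logic is why a GTO poker strategy is described as loss minimization: it guarantees a floor against any opponent, at the cost of not maximally punishing their mistakes.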
link |
00:24:01.060
So there's still a role for intuition?
link |
00:24:03.780
Yes, so when, yeah,
link |
00:24:05.780
when you're playing this unexploitable style,
link |
00:24:08.460
but when your opponents start doing something,
link |
00:24:11.420
you know, suboptimal that you want to exploit,
link |
00:24:14.140
well, now that's where not only your like logical brain
link |
00:24:17.340
will need to be thinking, well, okay,
link |
00:24:18.700
I know I have this, I'm in the sort of top end
link |
00:24:21.300
of my range here with this hand.
link |
00:24:23.940
So that means I need to be calling X percent of the time
link |
00:24:27.780
and I put them on this range, et cetera.
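The "calling X percent of the time" she mentions comes from standard pot-odds arithmetic. A minimal sketch, with made-up chip amounts purely for illustration:

```python
# Sketch of the two frequencies behind "I need to be calling
# X percent of the time" (numbers are illustrative).

def min_defense_frequency(pot, bet):
    """Fraction of your range to continue with so that a pure
    bluff by the opponent doesn't automatically profit."""
    return pot / (pot + bet)

def required_equity_to_call(pot, bet):
    """Pot odds: the share of the final pot your hand must win
    for a call to break even."""
    return bet / (pot + 2 * bet)

# Facing a pot-sized bet of 100 into a pot of 100:
print(min_defense_frequency(100, 100))    # 0.5 -> defend half your range
print(required_equity_to_call(100, 100))  # ~0.333 -> need about 33% equity
```

The first number keeps you unexploitable by bluffs; the second tells you whether this particular hand is strong enough to be part of the calling region.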
link |
00:24:30.460
But then sometimes you'll have this gut feeling
link |
00:24:34.020
that will tell you, you know what, this time,
link |
00:24:37.300
I know mathematically I'm meant to call now.
link |
00:24:40.060
You know, I'm in the sort of top end of my range
link |
00:24:42.740
and this is the odds I'm getting.
link |
00:24:45.140
So the math says I should call,
link |
00:24:46.300
but there's something in your gut saying,
link |
00:24:48.340
they've got it this time, they've got it.
link |
00:24:49.940
Like they're beating you, your hand is worse.
link |
00:24:55.180
So then the real art,
link |
00:24:56.940
this is where the last remaining art in poker,
link |
00:24:59.620
the fuzziness is like, do you listen to your gut?
link |
00:25:03.660
How do you quantify the strength of it?
link |
00:25:06.100
Or can you even quantify the strength of it?
link |
00:25:08.340
And I think that's what Daniel has.
link |
00:25:13.100
I mean, I can't speak for how much he's studying
link |
00:25:15.340
with the simulators and that kind of thing.
link |
00:25:17.740
I think he has, like he must be to still be keeping up.
link |
00:25:22.260
But he has an incredible intuition for just,
link |
00:25:26.340
he's seen so many hands of poker in the flesh.
link |
00:25:29.180
He's seen so many people, the way they behave
link |
00:25:31.860
when the chips are, you know, when the money's on the line
link |
00:25:33.740
and you've got him staring you down in the eye.
link |
00:25:36.100
You know, he's intimidating.
link |
00:25:37.460
He's got this like kind of X factor vibe
link |
00:25:39.620
that he, you know, gives out.
link |
00:25:42.180
And he talks a lot, which is an interactive element,
link |
00:25:45.020
which is he's getting stuff from other people.
link |
00:25:47.180
Yes, yeah.
link |
00:25:48.020
And just like the subtlety.
link |
00:25:49.340
So he's like, he's probing constantly.
link |
00:25:51.460
Yeah, he's probing and he's getting this extra layer
link |
00:25:53.900
of information that others can't.
link |
00:25:55.940
Now that said though, he's good online as well.
link |
00:25:57.900
You know, I don't know how, again,
link |
00:25:59.660
would he be beating the top cash game players online?
link |
00:26:02.660
Probably not, no.
link |
00:26:03.900
But when he's in person and he's got that additional
link |
00:26:07.260
layer of information, he can not only extract it,
link |
00:26:09.700
but he knows what to do with it still so well.
link |
00:26:13.140
There's one player who I would say is the exception
link |
00:26:15.140
to all of this.
link |
00:26:16.980
And he's one of my favorite people to talk about
link |
00:26:18.940
in terms of, I think he might have cracked the simulation.
link |
00:26:23.460
It's Phil Hellmuth.
link |
00:26:25.540
He...
link |
00:26:27.060
In more ways than one, he's cracked the simulation,
link |
00:26:29.140
I think.
link |
00:26:29.980
Yeah, he somehow to this day is still
link |
00:26:33.700
and I love you Phil, I'm not in any way knocking you.
link |
00:26:37.500
He is still winning so much at the World Series
link |
00:26:41.220
of Poker specifically.
link |
00:26:43.100
He's now won 16 bracelets.
link |
00:26:44.780
The next nearest person I think has won 10.
link |
00:26:47.660
And he is consistently year in, year out going deep
link |
00:26:50.100
or winning these huge field tournaments,
link |
00:26:52.420
you know, with like 2000 people,
link |
00:26:54.580
which statistically he should not be doing.
link |
00:26:57.060
And yet you watch some of the plays he makes
link |
00:27:02.060
and they make no sense, like mathematically,
link |
00:27:04.260
they are so far from game theory optimal.
link |
00:27:07.020
And the thing is, if you went and stuck him
link |
00:27:08.540
in one of these high stakes cash games
link |
00:27:10.860
with a bunch of like GTO people,
link |
00:27:12.740
he's gonna get ripped apart.
link |
00:27:14.540
But there's something that he has that when he's
link |
00:27:16.500
in the halls of the World Series of Poker specifically,
link |
00:27:20.420
amongst sort of amateurish players,
link |
00:27:23.340
he gets them to do crazy shit like that.
link |
00:27:26.340
And, but my little pet theory is that also,
link |
00:27:30.540
he just, the card, he's like a wizard
link |
00:27:35.100
and he gets the cards to do what he needs them to.
link |
00:27:38.980
Because he just expects to win and he expects to receive,
link |
00:27:43.980
you know, to flop a set with a frequency far beyond
link |
00:27:47.300
what the real percentages are.
link |
00:27:50.780
And I don't even know if he knows what the real percentages
link |
00:27:52.500
are, he doesn't need to, because he gets there.
link |
00:27:54.700
I think he has found the cheat code,
link |
00:27:55.980
because when I've seen him play,
link |
00:27:57.380
he seems to be like annoyed that the long shot thing
link |
00:28:01.300
didn't happen.
link |
00:28:02.140
He's like annoyed and it's almost like everybody else
link |
00:28:05.820
is stupid because he was obviously going to win
link |
00:28:08.020
with the pair.
link |
00:28:08.860
If that silly thing hadn't happened.
link |
00:28:10.540
And it's like, you don't understand,
link |
00:28:11.460
the silly thing happens 99% of the time.
link |
00:28:13.940
And it's a 1%, not the other way around,
link |
00:28:15.700
but genuinely for his lived experience at the World Series,
link |
00:28:18.580
only at the World Series of Poker, it is like that.
link |
00:28:21.540
So I don't blame him for feeling that way.
link |
00:28:24.020
But he does, he has this X factor
link |
00:28:26.460
and the poker community has tried for years
link |
00:28:29.940
to rip him down saying like, he's no good,
link |
00:28:32.740
but he's clearly good because he's still winning
link |
00:28:34.540
or there's something going on.
link |
00:28:36.220
Whether that's he's figured out how to mess
link |
00:28:39.140
with the fabric of reality and how cards,
link |
00:28:42.580
a randomly shuffled deck of cards come out.
link |
00:28:44.300
I don't know what it is, but he's doing it right still.
link |
00:28:46.660
Who do you think is the greatest of all time?
link |
00:28:48.700
Would you put Hellmuth?
link |
00:28:52.340
Not Hellmuth definitely, he seems like the kind of person
link |
00:28:54.660
when mentioned he would actually watch this.
link |
00:28:56.460
So you might wanna be careful.
link |
00:28:58.380
Well, as I said, I love Phil and I have,
link |
00:29:02.420
I would say this to his face, I'm not saying anything.
link |
00:29:04.180
I don't, he's got, he truly, I mean,
link |
00:29:06.980
he is one of the greatest.
link |
00:29:09.540
I don't know if he's the greatest,
link |
00:29:10.980
he's certainly the greatest at the World Series of Poker.
link |
00:29:14.300
And he is the greatest at, despite the game switching
link |
00:29:17.980
into a pure game, almost an entire game of math,
link |
00:29:20.900
he has managed to keep the magic alive.
link |
00:29:22.860
And this like, just through sheer force of will,
link |
00:29:26.020
making the game work for him.
link |
00:29:27.620
And that is incredible.
link |
00:29:28.660
And I think it's something that should be studied
link |
00:29:30.460
because it's an example.
link |
00:29:32.020
Yeah, there might be some actual game theoretic wisdom.
link |
00:29:35.260
There might be something to be said about optimality
link |
00:29:37.420
from studying him.
link |
00:29:39.220
What do you mean by optimality?
link |
00:29:40.940
Meaning, or rather game design, perhaps.
link |
00:29:45.540
Meaning if what he's doing is working,
link |
00:29:48.460
maybe poker is more complicated
link |
00:29:51.620
than the one we're currently modeling it as.
link |
00:29:54.220
So like his, yeah.
link |
00:29:55.060
Or there's an extra layer,
link |
00:29:56.660
and I don't mean to get too weird and wooey,
link |
00:29:59.500
but, or there's an extra layer of ability
link |
00:30:04.700
to manipulate the things the way you want them to go
link |
00:30:07.580
that we don't understand yet.
link |
00:30:09.780
Do you think Phil Hellmuth understands them?
link |
00:30:12.020
Is he just generally?
link |
00:30:13.660
Hashtag positivity.
link |
00:30:14.780
Well, he wrote a book on positivity and he's.
link |
00:30:17.220
He has, he did, not like a trolling book.
link |
00:30:20.460
No.
link |
00:30:21.300
He's just straight up, yeah.
link |
00:30:23.980
Phil Hellmuth wrote a book about positivity.
link |
00:30:26.020
Yes.
link |
00:30:27.220
Okay, not ironic.
link |
00:30:28.620
And I think it's about sort of manifesting what you want
link |
00:30:32.700
and getting the outcomes that you want
link |
00:30:34.100
by believing so much in yourself
link |
00:30:36.460
and in your ability to win,
link |
00:30:37.620
like eyes on the prize.
link |
00:30:40.540
And I mean, it's working.
link |
00:30:42.140
The man's delivered.
link |
00:30:43.180
But where do you put like Phil Ivey
link |
00:30:45.340
and all those kinds of people?
link |
00:30:47.300
I mean, I'm too, I've been,
link |
00:30:49.700
to be honest too much out of the scene
link |
00:30:50.900
for the last few years to really,
link |
00:30:53.740
I mean, Phil Ivey's clearly got,
link |
00:30:54.940
again, he's got that X factor.
link |
00:30:57.900
He's so incredibly intimidating to play against.
link |
00:31:00.340
I've only played against him a couple of times,
link |
00:31:01.700
but when he like looks you in the eye
link |
00:31:03.660
and you're trying to run a bluff on him,
link |
00:31:04.980
oof, no one's made me sweat harder than Phil Ivey,
link |
00:31:07.100
But my bluff got through actually.
link |
00:31:10.340
That was actually one of the most thrilling moments
link |
00:31:11.780
I've ever had in poker was,
link |
00:31:12.940
it was in a Monte Carlo in a high roller.
link |
00:31:15.460
I can't remember exactly what the hand was,
link |
00:31:16.700
but I, you know, three-bet
link |
00:31:19.500
and then like just barreled all the way through.
link |
00:31:22.100
And he just like put his laser eyes into me.
link |
00:31:24.380
And I felt like he was just scouring my soul.
link |
00:31:28.180
And I was just like, hold it together, Liv,
link |
00:31:29.780
hold it together.
link |
00:31:30.620
And he, like, folded.
link |
00:31:31.460
And you knew your hand was weaker.
link |
00:31:33.060
Yeah, I mean, I was bluffing.
link |
00:31:34.420
I presume, which, you know,
link |
00:31:36.420
there's a chance I was bluffing with the best hand,
link |
00:31:37.780
but I'm pretty sure my hand was worse.
link |
00:31:40.700
And he folded.
link |
00:31:44.620
It was truly one of the deep highlights of my career.
link |
00:31:47.500
Did you show the cards or did you fold?
link |
00:31:50.020
You should never show in game.
link |
00:31:51.980
Like, because especially as I felt like
link |
00:31:53.660
I was one of the worst players at the table
link |
00:31:54.900
in that tournament.
link |
00:31:55.740
So giving that information,
link |
00:31:57.900
unless I had a really solid plan
link |
00:31:59.900
that I was now like advertising,
link |
00:32:01.980
oh, look, I'm capable of bluffing Phil Ivey.
link |
00:32:03.580
But like, why?
link |
00:32:05.140
It's much more valuable to take advantage
link |
00:32:07.620
of the impression that they have of me,
link |
00:32:09.020
which is like, I'm a scared girl
link |
00:32:10.220
playing a high roller for the first time.
link |
00:32:12.060
Keep that going, you know.
link |
00:32:14.740
Interesting.
link |
00:32:15.580
But isn't there layers to this?
link |
00:32:17.100
Like psychological warfare
link |
00:32:18.780
that the scared girl might be way smart
link |
00:32:22.420
and then like just to flip the tables?
link |
00:32:24.700
Do you think about that kind of stuff?
link |
00:32:25.540
Or is it better not to reveal information?
link |
00:32:28.500
I mean, generally speaking,
link |
00:32:29.500
you want to not reveal information.
link |
00:32:31.060
You know, the goal of poker is to be
link |
00:32:32.700
as deceptive as possible about your own strategies
link |
00:32:36.020
while eliciting as much out of your opponent
link |
00:32:38.700
about their own.
link |
00:32:39.740
So giving them free information,
link |
00:32:42.100
particularly if they're people
link |
00:32:42.940
who you consider very good players,
link |
00:32:44.820
any information I give them is going into
link |
00:32:47.100
their little database and being,
link |
00:32:49.220
I assume it's going to be calculated and used well.
link |
00:32:51.780
So I have to be really confident
link |
00:32:53.500
that my like meta gaming that I'm going to then do,
link |
00:32:56.700
oh, they've seen this, so therefore that.
link |
00:32:58.340
I'm going to be on the right level.
link |
00:33:00.660
So it's better just to keep that little secret
link |
00:33:03.380
to myself in the moment.
link |
00:33:04.340
So how much is bluffing part of the game?
link |
00:33:06.700
Huge amount.
link |
00:33:08.340
So yeah, I mean, maybe actually let me ask,
link |
00:33:10.420
like, what did it feel like with Phil Ivey
link |
00:33:12.980
or anyone else when it's a high stake,
link |
00:33:15.020
when it's a big, it's a big bluff?
link |
00:33:18.860
So a lot of money on the table and maybe,
link |
00:33:23.060
I mean, what defines a big bluff?
link |
00:33:24.660
Maybe a lot of money on the table,
link |
00:33:25.980
but also some uncertainty in your mind and heart
link |
00:33:29.180
about like self doubt.
link |
00:33:31.940
Well, maybe I miscalculated what's going on here,
link |
00:33:34.540
what the bet said, all that kind of stuff.
link |
00:33:36.740
Like, what does that feel like?
link |
00:33:38.820
I mean, it's, I imagine comparable to,
link |
00:33:45.980
you know, running a, I mean, any kind of big bluff
link |
00:33:49.020
where you have a lot of something
link |
00:33:52.380
that you care about on the line.
link |
00:33:54.260
You know, so if you're bluffing in a courtroom,
link |
00:33:57.220
not that anyone should ever do that,
link |
00:33:58.180
or, you know, something equatable to that.
link |
00:34:00.340
It's, you know, in that scenario, you know,
link |
00:34:04.100
I think it was the first time I'd ever played a 20,
link |
00:34:05.700
I'd won my way into this 25K tournament.
link |
00:34:09.180
So that was the buy in 25,000 euros.
link |
00:34:11.180
And I had satelliteed my way in
link |
00:34:13.020
because it was much bigger
link |
00:34:14.020
than I would ever normally play.
link |
00:34:16.060
And, you know, I hadn't, I wasn't that experienced
link |
00:34:18.260
at the time, and now I was sitting there
link |
00:34:20.140
against all the big boys, you know,
link |
00:34:21.620
the Negreanus, the Phil Iveys and so on.
link |
00:34:24.740
And then to like, you know,
link |
00:34:28.780
each time you put the bets out, you know,
link |
00:34:30.580
you put another bet out, your card.
link |
00:34:33.020
Yeah, I was on what's called a semi bluff.
link |
00:34:34.860
So there were some cards that could come
link |
00:34:36.300
that would make my hand very, very strong
link |
00:34:38.100
and therefore win.
link |
00:34:39.180
But most of the time, those cards don't come.
link |
00:34:40.940
So that is a semi bluff because you're representing,
link |
00:34:43.860
are you representing that you already have something?
link |
00:34:47.860
So I think in this scenario, I had a flush draw.
link |
00:34:51.340
So I had two clubs, two clubs came out on the flop.
link |
00:34:55.940
And then I'm hoping that on the turn
link |
00:34:57.260
and the river, one will come.
link |
00:34:59.140
So I have some future equity.
link |
00:35:00.980
I could hit a club and then I'll have the best hand
link |
00:35:02.780
in which case, great.
link |
00:35:04.620
And so I can keep betting and I'll want them to call,
link |
00:35:07.460
but I'm also got the other way of winning the hand
link |
00:35:09.660
where if my card doesn't come,
link |
00:35:11.940
I can keep betting and get them to fold their hand.
link |
00:35:14.740
And I'm pretty sure that's what the scenario was.
link |
00:35:18.020
So I had some future equity, but it's still, you know,
link |
00:35:21.020
most of the time I don't hit that club.
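The draw arithmetic behind "most of the time I don't hit that club" is standard card counting, sketched below. The deck math is generic; it isn't a reconstruction of her actual hand.

```python
# Chance a flush draw completes by the river (standard 52-card
# deck counting; illustrative, not the actual hand).

def flush_draw_prob(outs=9, unseen_after_flop=47):
    """Probability of hitting at least one out on the turn or river.

    With two clubs in hand and two on the flop, 9 of the remaining
    13 clubs are live "outs" among the 47 unseen cards.
    """
    miss_turn = (unseen_after_flop - outs) / unseen_after_flop            # 38/47
    miss_river = (unseen_after_flop - 1 - outs) / (unseen_after_flop - 1)  # 37/46
    return 1 - miss_turn * miss_river

print(round(flush_draw_prob(), 3))  # 0.35 -> hits about a third of the time
```

So roughly 35% of the time the "great" branch happens and the bet becomes a value bet; the other 65% the hand only wins if the opponent folds, which is what makes it a semi-bluff.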
link |
00:35:23.300
And so I would rather him just fold because I'm, you know,
link |
00:35:25.940
the pot is now getting bigger and bigger.
link |
00:35:27.820
And in the end, like I jam all in on the river.
link |
00:35:30.980
That's my entire tournament on the line.
link |
00:35:33.420
As far as I'm aware, this might be the one time
link |
00:35:35.180
I ever get to play a big 25K.
link |
00:35:37.220
You know, this was the first time I played one.
link |
00:35:38.700
So it was, it felt like the most momentous thing.
link |
00:35:42.020
And this was also when I was trying to build myself up at,
link |
00:35:43.900
you know, build my name, a name for myself in poker.
link |
00:35:46.700
I wanted to get respect.
link |
00:35:47.740
It would destroy everything for you.
link |
00:35:49.260
It felt like it in the moment.
link |
00:35:50.540
Like, I mean, it literally does feel
link |
00:35:51.780
like a form of life and death.
link |
00:35:52.820
Like your body physiologically
link |
00:35:54.340
is having that flight or fight response.
link |
00:35:56.140
What are you doing with your body?
link |
00:35:57.140
What are you doing with your face?
link |
00:35:58.340
Are you just like, what are you thinking about?
link |
00:36:01.340
More of a mixture of like, okay, what are the cards?
link |
00:36:04.700
So in theory, I'm thinking about like, okay,
link |
00:36:07.020
what are cards that make my hand look stronger?
link |
00:36:09.980
Which cards hit my perceived range from his perspective?
link |
00:36:13.860
Which cards don't?
link |
00:36:15.620
What's the right amount of bet size
link |
00:36:17.340
to maximize my fold equity in this situation?
link |
00:36:20.820
You know, that's the logical stuff
link |
00:36:22.100
that I should be thinking about.
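The fold-equity calculation she lists among "the logical stuff" can be sketched as follows. The bet sizes and equities here are made up for illustration; the formulas are the standard break-even ones, not her actual in-game numbers.

```python
# Sketch of the fold-equity logic: how often a bluff must work to
# break even, and how draw equity lowers that bar (illustrative numbers).

def breakeven_fold_freq(pot, bet):
    """A pure bluff profits if the opponent folds more often than this."""
    return bet / (pot + bet)

def semibluff_ev(pot, bet, fold_freq, equity):
    """EV of betting: they fold and we take the pot now, or they call
    and we win the bigger pot with probability `equity`."""
    called_ev = equity * (pot + bet) - (1 - equity) * bet
    return fold_freq * pot + (1 - fold_freq) * called_ev

print(breakeven_fold_freq(100, 75))      # ~0.43: a pure bluff needs 43% folds
print(semibluff_ev(100, 75, 0.3, 0.35))  # positive even with only 30% folds
```

That gap is the whole point of semi-bluffing: the flush-draw equity subsidizes the bluff, so it can show a profit even when the opponent folds less often than a pure bluff would require.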
link |
00:36:23.180
But I think in reality, because I was so scared,
link |
00:36:25.380
because there's this, at least for me,
link |
00:36:26.860
there's a certain threshold of like nervousness or stress
link |
00:36:30.420
beyond which the logical brain shuts off.
link |
00:36:33.420
And now it just gets into this like,
link |
00:36:36.900
just like, it feels like a game of wits, basically.
link |
00:36:38.740
It's like a game of nerve.
link |
00:36:39.780
Can you hold your resolve?
link |
00:36:41.780
And it certainly got to that, like by the river.
link |
00:36:44.300
I think by that point, I was like,
link |
00:36:45.780
I don't even know if this is a good bluff anymore,
link |
00:36:47.660
but fuck it, let's do it.
link |
00:36:50.060
Your mind is almost numb from the intensity of that feeling.
link |
00:36:53.100
I call it the white noise.
link |
00:36:55.100
And it happens in all kinds of decision making.
link |
00:36:58.860
I think anything that's really, really stressful.
link |
00:37:00.740
I can imagine someone in like an important job interview,
link |
00:37:02.940
if it's like a job they've always wanted,
link |
00:37:04.780
and they're getting grilled, you know,
link |
00:37:06.540
like Bridgewater style, where they ask these really hard,
link |
00:37:09.180
like mathematical questions.
link |
00:37:11.140
You know, it's a really learned skill
link |
00:37:13.420
to be able to like subdue your flight or fight response.
link |
00:37:17.780
You know, I think get from the sympathetic
link |
00:37:19.380
into the parasympathetic.
link |
00:37:20.340
So you can actually, you know, engage that voice
link |
00:37:23.820
in your head and do those slow logical calculations.
link |
00:37:26.020
Because evolutionarily, you know, if we see a lion
link |
00:37:29.020
running at us, we didn't have time to sort of calculate
link |
00:37:31.660
the lion's kinetic energy and, you know,
link |
00:37:33.980
is it optimal to go this way or that way?
link |
00:37:35.420
You just reacted.
link |
00:37:37.060
And physically, our bodies are well attuned
link |
00:37:39.860
to actually make right decisions.
link |
00:37:41.140
But when you're playing a game like poker,
link |
00:37:43.220
this is not something that you ever, you know,
link |
00:37:44.780
evolved to do.
link |
00:37:45.940
And yet you're in that same flight or fight response.
link |
00:37:48.980
And so that's a really important skill to be able to develop
link |
00:37:51.740
to basically learn how to like meditate in the moment
link |
00:37:54.900
and calm yourself so that you can think clearly.
link |
00:37:57.420
But as you were searching for a comparable thing,
link |
00:38:00.780
it's interesting because you just made me realize
link |
00:38:03.300
that bluffing is like an incredibly high stakes
link |
00:38:06.580
form of lying.
link |
00:38:08.180
You're lying.
link |
00:38:10.780
And I don't think you can.
link |
00:38:11.620
Telling a story.
link |
00:38:12.460
No, no, it's straight up lying.
link |
00:38:15.180
In the context of game, it's not a negative kind of lying.
link |
00:38:19.060
But it is, yeah, exactly.
link |
00:38:19.900
You're representing something that you don't have.
link |
00:38:23.340
And I was thinking like how often in life
link |
00:38:26.740
do we have such high stakes of lying?
link |
00:38:30.300
Because I was thinking certainly
link |
00:38:33.420
in high level military strategy,
link |
00:38:36.060
I was thinking when Hitler was lying to Stalin
link |
00:38:40.220
about his plans to invade the Soviet Union.
link |
00:38:44.540
And so you're talking to a person like your friends
link |
00:38:48.300
and you're fighting against the enemy,
link |
00:38:50.700
whatever the formulation of the enemy is.
link |
00:38:53.900
But meanwhile, whole time you're building up troops
link |
00:38:56.780
on the border.
link |
00:38:58.900
That's extremely.
link |
00:38:59.740
Wait, wait, so Hitler and Stalin were like
link |
00:39:01.420
pretending to be friends?
link |
00:39:02.460
Yeah.
link |
00:39:03.300
Well, my history knowledge is terrible.
link |
00:39:04.140
Oh yeah.
link |
00:39:04.980
That's crazy.
link |
00:39:05.820
Yeah, that they were, oh man.
link |
00:39:09.500
And it worked because Stalin,
link |
00:39:11.460
until the troops crossed the border
link |
00:39:14.260
and invaded in Operation Barbarossa
link |
00:39:17.500
where this storm of Nazi troops
link |
00:39:22.780
invaded large parts of the Soviet Union.
link |
00:39:25.100
And hence, one of the biggest wars in human history began.
link |
00:39:30.380
Stalin for sure thought that this was never going to be,
link |
00:39:34.900
that Hitler is not crazy enough to invade the Soviet Union.
link |
00:39:38.660
And it makes, geopolitically makes total sense
link |
00:39:41.780
to be collaborators.
link |
00:39:43.020
And ideologically, even though there's a tension
link |
00:39:46.260
between communism and fascism or national socialism,
link |
00:39:50.300
however you formulate it,
link |
00:39:51.500
it still feels like this is the right way
link |
00:39:54.340
to battle the West.
link |
00:39:55.780
Right.
link |
00:39:56.620
They were more ideologically aligned.
link |
00:39:58.660
They in theory had a common enemy, which is the West.
link |
00:40:01.380
So it made total sense.
link |
00:40:03.460
And in terms of negotiations
link |
00:40:05.420
and the way things were communicated,
link |
00:40:07.300
it seemed to Stalin that for sure,
link |
00:40:11.540
that they would remain, at least for a while,
link |
00:40:16.100
peaceful collaborators.
link |
00:40:17.620
And that, and everybody, because of that,
link |
00:40:21.380
in the Soviet Union believed that. It was a huge shock
link |
00:40:24.140
when Kiev was invaded.
link |
00:40:25.820
And you hear echoes of that when I travel to Ukraine,
link |
00:40:28.220
sort of the shock of the invasion.
link |
00:40:32.180
It's not just the invasion on one particular border,
link |
00:40:34.500
but the invasion of the capital city.
link |
00:40:36.700
And just like, holy shit, especially at that time,
link |
00:40:41.380
when you thought World War I,
link |
00:40:44.220
you realized that that was the war to end all wars.
link |
00:40:46.900
You would never have this kind of war.
link |
00:40:48.980
And holy shit, this person is mad enough
link |
00:40:52.380
to try to take on this monster in the Soviet Union.
link |
00:40:56.300
So it's no longer going to be a war
link |
00:40:58.300
of hundreds of thousands dead.
link |
00:40:59.820
It'll be a war of tens of millions dead.
link |
00:41:02.260
And yeah, but that, that's a very large scale kind of lie,
link |
00:41:08.260
but I'm sure there's in politics and geopolitics,
link |
00:41:11.460
that kind of lying happening all the time.
link |
00:41:14.620
And a lot of people pay financially
link |
00:41:17.260
and with their lives for that kind of lying.
link |
00:41:19.060
But in our personal lives, I don't know how often we,
link |
00:41:22.260
maybe we.
link |
00:41:23.100
I think people do.
link |
00:41:23.940
I mean, like think of spouses
link |
00:41:25.900
cheating on their partners, right?
link |
00:41:27.340
And then like having to lie,
link |
00:41:28.340
like where were you last night?
link |
00:41:29.540
Stuff like that. Oh shit, that's tough.
link |
00:41:30.580
Yeah, that's true.
link |
00:41:31.420
Like that's, I think, you know, I mean,
link |
00:41:34.220
unfortunately that stuff happens all the time, right?
link |
00:41:36.180
Or having like multiple families, that one is great.
link |
00:41:38.740
When each family doesn't know about the other one
link |
00:41:42.220
and like maintaining that life.
link |
00:41:44.620
There's probably a sense of excitement about that too.
link |
00:41:48.580
Or. It seems unnecessary, yeah.
link |
00:41:50.700
But. Why?
link |
00:41:51.740
Well, just lying.
link |
00:41:52.580
Like, you know, the truth finds a way of coming out.
link |
00:41:56.300
You know? Yes.
link |
00:41:57.140
But hence that's the thrill.
link |
00:41:59.220
Yeah, perhaps.
link |
00:42:00.220
Yeah, people.
link |
00:42:01.060
I mean, you know, that's why I think actually like poker.
link |
00:42:04.220
What's so interesting about poker
link |
00:42:05.940
is most of the best players I know,
link |
00:42:09.860
there are always exceptions, you know?
link |
00:42:10.900
There are always bad eggs.
link |
00:42:12.460
But actually poker players are very honest people.
link |
00:42:15.140
I would say they are more honest than the average,
link |
00:42:17.540
you know, if you just took random population sample.
link |
00:42:22.380
Because A, you know, I think, you know,
link |
00:42:26.100
humans like to have that.
link |
00:42:28.500
Most people like to have some kind of, you know,
link |
00:42:30.580
mysterious, you know, an opportunity to do something
link |
00:42:32.740
like a little edgy.
link |
00:42:34.900
So we get to sort of scratch that itch
link |
00:42:36.860
of being edgy at the poker table,
link |
00:42:38.340
where it's like, it's part of the game.
link |
00:42:40.420
Everyone knows what they're in for, and that's allowed.
link |
00:42:43.420
And you get to like really get that out of your system.
link |
00:42:47.740
And then also like poker players learned that, you know,
link |
00:42:51.220
I would play in a huge game against some of my friends,
link |
00:42:55.380
even my partner, Igor, where we will be, you know,
link |
00:42:57.740
absolutely going at each other's throats,
link |
00:42:59.660
trying to draw blood in terms of winning money
link |
00:43:02.060
off each other and like getting under each other's skin,
link |
00:43:04.020
winding each other up, doing the craftiest moves we can.
link |
00:43:08.340
But then once the game's done, you know,
link |
00:43:11.100
the winners and the losers will go off
link |
00:43:12.260
and get a drink together and have a fun time
link |
00:43:13.860
and like talk about it in this like weird academic way
link |
00:43:16.660
afterwards, because, and that's why games are so great.
link |
00:43:19.500
Cause you get to like live out like this competitive urge
link |
00:43:23.980
that, you know, most people have.
link |
00:43:26.420
What's it feel like to lose?
link |
00:43:28.660
Like we talked about bluffing when it worked out.
link |
00:43:31.380
What about when you, when you go broke?
link |
00:43:35.380
So like in a game, I'm, fortunately I've never gone broke.
link |
00:43:39.140
You mean like full life?
link |
00:43:40.820
Full life, no.
link |
00:43:42.460
I know plenty of people who have.
link |
00:43:46.740
And I don't think Igor would mind me saying he went,
link |
00:43:48.180
you know, he went broke once in poker bowl, you know,
link |
00:43:50.300
early on when we were together.
link |
00:43:51.820
I feel like you haven't lived unless you've gone broke.
link |
00:43:54.260
Oh yeah.
link |
00:43:55.660
Some sense.
link |
00:43:56.500
Right.
link |
00:43:57.340
Some fundamental sense.
link |
00:43:58.180
I mean, I'm happy, I've sort of lived through it,
link |
00:43:59.900
vicariously through him when he did it at the time.
link |
00:44:02.420
But yeah, what's it like to lose?
link |
00:44:04.140
Well, it depends.
link |
00:44:04.980
So it depends on the amount.
link |
00:44:06.020
It depends what percentage of your net worth
link |
00:44:07.660
you've just lost.
link |
00:44:10.060
It depends on your brain chemistry.
link |
00:44:11.540
It really, you know, varies from person to person.
link |
00:44:13.140
You have a very cold, calculating way of thinking about this.
link |
00:44:16.900
So it depends what percentage.
link |
00:44:18.980
Well, it did, it really does, right?
link |
00:44:20.700
Yeah, it's true, it's true.
link |
00:44:22.140
I mean, that's another thing poker trains you to do.
link |
00:44:24.740
You see, you see everything in percentages
link |
00:44:28.060
or you see everything in like ROI or expected hourlies
link |
00:44:30.820
or cost benefit, et cetera.
link |
00:44:32.420
You know, so that's, one of the things I've tried to do
link |
00:44:37.220
is calibrate the strength of my emotional response
link |
00:44:39.820
to the win or loss that I've received.
link |
00:44:43.380
Because it's no good if you like, you know,
link |
00:44:45.540
you have a huge emotional dramatic response to a tiny loss
link |
00:44:50.460
or on the flip side, you have a huge win
link |
00:44:53.220
and you're sort of so dead inside
link |
00:44:54.580
that you don't even feel it.
link |
00:44:55.420
Well, that's, you know, that's a shame.
link |
00:44:56.820
I want my emotions to calibrate with reality
link |
00:45:00.220
as much as possible.
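The "everything in percentages, ROI, expected hourlies" habit she describes is ordinary expected-value arithmetic. A minimal sketch, with a made-up pot size, bet size, and win probability purely for illustration:

```python
def call_ev(pot, bet, p_win):
    # Expected chips gained by calling: win the pot plus the opponent's
    # bet with probability p_win, otherwise lose the amount of the call.
    return p_win * (pot + bet) - (1 - p_win) * bet

# Illustrative numbers only: facing a 50-chip bet into a 150-chip pot
# with an assumed 30% chance of winning the hand.
ev = call_ev(pot=150, bet=50, p_win=0.30)
print(f"EV of calling: {ev:+.1f} chips")  # positive, so calling is profitable on average
```

The point of the framing is that a single outcome, win or lose, says little; the quality of the decision is the sign and size of the expectation.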
link |
00:45:02.860
So yeah, what's it like to lose?
link |
00:45:04.180
I mean, I've had times where I've lost, you know,
link |
00:45:07.180
busted out of a tournament that I thought I was gonna win in
link |
00:45:09.260
especially if I got really unlucky or I make a dumb play
link |
00:45:13.940
where I've gone away and like, you know, kicked the wall,
link |
00:45:17.540
punched a wall, I like nearly broke my hand one time.
link |
00:45:19.900
Like I'm a lot less competitive than I used to be.
link |
00:45:24.100
Like I was like pathologically competitive in my like
link |
00:45:27.540
late teens, early twenties, I just had to win at everything.
link |
00:45:30.900
And I think that sort of slowly waned as I've gotten older.
link |
00:45:33.540
According to you, yeah.
link |
00:45:34.860
According to me.
link |
00:45:35.700
I don't know if others would say the same, right?
link |
00:45:38.500
I feel like ultra competitive people,
link |
00:45:40.860
like I've heard Joe Rogan say this to me.
link |
00:45:43.020
It's like that he's a lot less competitive
link |
00:45:44.780
than he used to be.
link |
00:45:45.620
I don't know about that.
link |
00:45:47.380
Oh, I believe it.
link |
00:45:48.420
No, I totally believe it.
link |
00:45:49.260
Like, because as you get, you can still be,
link |
00:45:51.460
like I care about winning.
link |
00:45:53.020
Like when, you know, I play a game with my buddies online
link |
00:45:56.060
or, you know, whatever it is,
link |
00:45:57.460
Polytopia is my current obsession.
link |
00:45:58.780
Like when I.
link |
00:46:00.820
Thank you for passing on your obsession to me.
link |
00:46:02.980
Are you playing now?
link |
00:46:03.820
Yeah, I'm playing now.
link |
00:46:04.780
We gotta have a game.
link |
00:46:05.900
But I'm terrible and I enjoy playing terribly.
link |
00:46:08.140
I don't wanna have a game because that's gonna pull me
link |
00:46:10.060
into your monster of like a competitive play.
link |
00:46:13.940
It's important, it's an important skill.
link |
00:46:15.420
I'm enjoying playing on the, I can't.
link |
00:46:18.340
You just do the points thing, you know, against the bots.
link |
00:46:20.980
Yeah, against the bots.
link |
00:46:22.020
And I can't even do the, there's like a hard one
link |
00:46:26.100
and there's a very hard one.
link |
00:46:26.940
And then it's crazy, yeah.
link |
00:46:27.820
It's crazy.
link |
00:46:28.660
I can't, I don't even enjoy the hard one.
link |
00:46:29.900
The crazy, I really don't enjoy.
link |
00:46:32.020
Cause it's intense.
link |
00:46:33.660
You have to constantly try to win
link |
00:46:34.980
as opposed to enjoy building a little world and.
link |
00:46:37.980
Yeah, no, no, no.
link |
00:46:38.820
There's no time for exploration in Polytopia.
link |
00:46:40.100
You gotta get.
link |
00:46:40.940
Well, when, once you graduate from the crazies,
link |
00:46:42.740
then you can come play the.
link |
00:46:44.540
Graduate from the crazies.
link |
00:46:46.460
Yeah, so in order to be able to play a decent game
link |
00:46:48.620
against like, you know, our group,
link |
00:46:51.700
you'll need to be, you'll need to be consistently winning
link |
00:46:55.740
like 90% of games against 15 crazy bots.
link |
00:46:58.300
Yeah.
link |
00:46:59.180
And you'll be able to, like there'll be,
link |
00:47:00.740
I could teach you it within a day, honestly.
link |
00:47:03.340
How to beat the crazies?
link |
00:47:04.660
How to beat the crazies.
link |
00:47:05.740
And then, and then you'll be ready for the big leagues.
link |
00:47:08.140
Generalizes to more than just Polytopia.
link |
00:47:11.060
But okay, why were we talking about Polytopia?
link |
00:47:14.420
Losing hurts.
link |
00:47:15.980
Losing hurts, oh yeah.
link |
00:47:17.060
Yes, competitiveness over time.
link |
00:47:19.380
Oh yeah.
link |
00:47:20.340
I think it's more that, at least for me,
link |
00:47:23.340
I still care about playing,
link |
00:47:24.740
about winning when I choose to play something.
link |
00:47:26.460
It's just that I don't see the world
link |
00:47:28.700
as zero sum as I used to be, you know?
link |
00:47:31.860
I think as one gets older and wiser,
link |
00:47:34.940
you start to see the world more as a positive-sum thing.
link |
00:47:38.060
Or at least you're more aware of externalities,
link |
00:47:40.820
of scenarios, of competitive interactions.
link |
00:47:43.740
And so, yeah, I just like, I'm more,
link |
00:47:47.180
and I'm more aware of my own, you know, like,
link |
00:47:50.180
if I have a really strong emotional response to losing,
link |
00:47:52.780
and that makes me then feel shitty for the rest of the day,
link |
00:47:54.980
and then I beat myself up mentally for it.
link |
00:47:57.220
Like, I'm now more aware
link |
00:47:58.660
that that's an unnecessary negative externality.
link |
00:48:01.660
So I'm like, okay, I need to find a way to turn this down,
link |
00:48:03.940
you know, dial this down a bit.
link |
00:48:05.300
Was poker the thing that has,
link |
00:48:07.580
if you think back at your life,
link |
00:48:09.700
and think about some of the lower points of your life,
link |
00:48:12.180
like the darker places you've got in your mind,
link |
00:48:15.020
did it have to do something with poker?
link |
00:48:17.140
Like, did losing spark the descent into darkness,
link |
00:48:23.620
or was it something else?
link |
00:48:27.540
I think my darkest points in poker
link |
00:48:29.460
were when I was wanting to quit and move on to other things,
link |
00:48:34.500
but I felt like I hadn't ticked all the boxes
link |
00:48:38.340
I wanted to tick.
link |
00:48:39.980
Like, I wanted to be the most winningest female player,
link |
00:48:43.740
which is by itself a bad goal.
link |
00:48:45.540
You know, that was one of my initial goals,
link |
00:48:47.940
and I was like, well, I haven't, you know,
link |
00:48:48.900
and I wanted to win a WPT event.
link |
00:48:50.900
I've won one of these, I've won one of these,
link |
00:48:52.100
but I want one of those as well.
link |
00:48:53.540
And that sort of, again, like,
link |
00:48:57.380
it's a drive of like overoptimization to random metrics
link |
00:49:00.060
that I decided were important
link |
00:49:02.420
without much wisdom at the time, but then like carried on.
link |
00:49:06.540
That made me continue chasing it longer
link |
00:49:09.540
than I still actually had the passion to chase it for.
link |
00:49:12.860
And I don't have any regrets that, you know,
link |
00:49:15.740
I played for as long as I did, because who knows,
link |
00:49:17.540
you know, I wouldn't be sitting here,
link |
00:49:19.260
I wouldn't be living this incredible life
link |
00:49:21.020
that I'm living now.
link |
00:49:22.660
This is the height of your life right now.
link |
00:49:24.860
This is it, peak experience, absolute pinnacle
link |
00:49:28.100
here in your robot land with your creepy light.
link |
00:49:34.500
No, it is, I mean, I wouldn't change a thing
link |
00:49:37.340
about my life right now, and I feel very blessed to say that.
link |
00:49:40.900
So, but the dark times were in the sort of like 2016 to 18,
link |
00:49:47.980
even sooner really, where I was like,
link |
00:49:50.340
I had stopped loving the game,
link |
00:49:53.100
and I was going through the motions,
link |
00:49:55.420
and I would, and then I was like, you know,
link |
00:49:59.660
I would take the losses harder than I needed to,
link |
00:50:02.220
because I'm like, ah, it's another one.
link |
00:50:03.380
And it was, I was aware that like,
link |
00:50:04.820
I felt like my life was ticking away,
link |
00:50:06.180
and I was like, is this gonna be what's on my tombstone?
link |
00:50:07.940
Oh yeah, she played the game of, you know,
link |
00:50:09.580
this zero sum game of poker,
link |
00:50:11.700
slightly more optimally than her next opponent.
link |
00:50:14.100
Like, cool, great, legacy, you know?
link |
00:50:16.380
So, I just wanted, you know, there was something in me
link |
00:50:19.660
that knew I needed to be doing something
link |
00:50:21.340
more directly impactful and just meaningful.
link |
00:50:25.260
It was just like your search for meaning,
link |
00:50:26.260
and I think it's a thing a lot of poker players,
link |
00:50:27.940
even a lot of, I imagine any games players
link |
00:50:30.900
who sort of love intellectual pursuits,
link |
00:50:36.100
you know, I think you should ask Magnus Carlsen
link |
00:50:37.660
this question, I don't know what he's on.
link |
00:50:38.500
He's walking away from chess, right?
link |
00:50:39.780
Yeah, like, it must be so hard for him.
link |
00:50:41.500
You know, he's been on the top for so long,
link |
00:50:43.740
and it's like, well, now what?
link |
00:50:45.300
He's got this incredible brain, like, what to put it to?
link |
00:50:49.980
And, yeah, it's.
link |
00:50:52.100
It's this weird moment where I've just spoken with people
link |
00:50:55.620
that won multiple gold medals at the Olympics,
link |
00:50:58.380
and the depression hits hard after you win.
link |
00:51:02.860
Dopamine crash.
link |
00:51:04.180
Because it's a kind of a goodbye,
link |
00:51:05.620
saying goodbye to that person,
link |
00:51:06.900
to all the dreams you had that
link |
00:51:09.420
you thought would give meaning to your life,
link |
00:51:11.860
but in fact, life is full of constant pursuits of meaning.
link |
00:51:16.820
It doesn't, you don't like arrive and figure it all out,
link |
00:51:20.260
and there's endless bliss,
link |
00:51:21.700
and it continues going on and on.
link |
00:51:23.980
You constantly have to figure out how to rediscover yourself.
link |
00:51:27.860
And so for you, like that struggle to say goodbye to poker,
link |
00:51:31.580
you have to like find the next.
link |
00:51:33.780
There's always a bigger game.
link |
00:51:34.940
That's the thing.
link |
00:51:35.780
That's my motto is like, what's the next game?
link |
00:51:38.460
And more importantly,
link |
00:51:41.220
because obviously game usually implies zero sum,
link |
00:51:43.140
like what's a game which is like omni-win?
link |
00:51:46.580
Like what? Omni-win.
link |
00:51:47.700
Omni-win.
link |
00:51:48.540
Why is omni-win so important?
link |
00:51:50.740
Because if everyone plays zero sum games,
link |
00:51:54.700
that's a fast track to either completely stagnate
link |
00:51:57.180
as a civilization, or actually,
link |
00:51:59.220
far more likely to extinct ourselves.
link |
00:52:02.300
You know, like the playing field is finite.
link |
00:52:05.020
You know, nuclear powers are playing,
link |
00:52:09.620
you know, a game of poker with, you know,
link |
00:52:12.020
but their chips are nuclear weapons, right?
link |
00:52:14.500
And the stakes have gotten so large
link |
00:52:17.020
that if anyone makes a single bet, you know,
link |
00:52:19.140
fires some weapons, the playing field breaks.
link |
00:52:22.100
I made a video on this.
link |
00:52:22.940
Like, you know, the playing field is finite.
link |
00:52:26.900
And if we keep playing these adversarial zero sum games,
link |
00:52:31.540
thinking that we, you know,
link |
00:52:33.180
in order for us to win, someone else has to lose,
link |
00:52:35.420
or if we lose that, you know, someone else wins,
link |
00:52:37.740
that will extinct us.
link |
00:52:40.020
It's just a matter of when.
link |
00:52:41.100
What do you think about that mutually assured destruction,
link |
00:52:44.860
that very simple,
link |
00:52:46.340
almost to the point of caricaturing game theory idea
link |
00:52:49.980
that does seem to be at the core
link |
00:52:52.340
of why we haven't blown each other up yet
link |
00:52:54.060
with nuclear weapons.
link |
00:52:55.660
Do you think there's some truth to that,
link |
00:52:57.540
this kind of stabilizing force
link |
00:53:00.180
of mutually assured destruction?
link |
00:53:01.500
And do you think that's gonna hold up
link |
00:53:04.940
through the 21st century?
link |
00:53:07.340
I mean, it has held.
link |
00:53:09.980
Yes, there's definitely truth to it,
link |
00:53:11.700
that it was a, you know, it's a Nash equilibrium.
link |
00:53:14.740
Yeah, are you surprised it held this long?
link |
00:53:17.300
Isn't it crazy?
link |
00:53:18.460
It is crazy when you factor in all the like
link |
00:53:21.140
near miss accidental firings.
link |
00:53:24.620
Yes, that makes me wonder like, you know,
link |
00:53:28.060
are you familiar with the like quantum suicide
link |
00:53:29.900
thought experiment, where it's basically like,
link |
00:53:33.020
you have a, you know, like a Russian roulette
link |
00:53:36.580
type scenario hooked up to some kind of quantum event,
link |
00:53:40.620
you know, particle splitting or pair of particles splitting.
link |
00:53:45.820
And if it, you know, if it goes A,
link |
00:53:48.420
then the gun doesn't go off and it goes B,
link |
00:53:50.260
then it does go off and it kills you.
link |
00:53:52.500
Because you can only ever be in the universe,
link |
00:53:55.020
you know, assuming like the Everett branch,
link |
00:53:56.540
you know, multiverse theory,
link |
00:53:57.780
you'll always only end up in the branch
link |
00:54:00.220
where, you know, option A continually comes in,
link |
00:54:03.340
but you run that experiment enough times,
link |
00:54:05.460
it starts getting pretty damn, you know,
link |
00:54:07.060
the tree of outcomes gets huge,
link |
00:54:09.380
there's a million different scenarios,
link |
00:54:10.900
but you'll always find yourself in this,
link |
00:54:12.420
in the one where it didn't go off.
link |
00:54:14.860
And so from that perspective, you are essentially immortal
link |
00:54:20.780
because you will only find yourself
link |
00:54:22.580
in the set of observers that make it down that path.
link |
00:54:25.100
So it's kind of a...
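The observer-selection effect she sketches can be made concrete with a toy simulation. Everything here is invented for illustration (a 50/50 "quantum gun" per round, 10 rounds, 100,000 observers); the point is only that conditioning on survival hides how unlikely the surviving path was:

```python
import random

def quantum_roulette(rounds, p_fire=0.5, seed=None):
    # Simulate one observer: each round the "quantum gun" fires with
    # probability p_fire. Return the round it fired, or None if the
    # observer survived every round.
    rng = random.Random(seed)
    for i in range(rounds):
        if rng.random() < p_fire:
            return i
    return None

ROUNDS = 10
outcomes = [quantum_roulette(ROUNDS, seed=s) for s in range(100_000)]
survivors = [o for o in outcomes if o is None]

print(f"{len(survivors)} of {len(outcomes)} observers survived all {ROUNDS} rounds")
# Survival is rare overall (about 0.5**10, roughly 1 in 1000), but every
# surviving observer's own history contains nothing except safe outcomes:
# conditioned on being around to look back, the record always looks charmed.
```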
link |
00:54:26.620
That doesn't mean, that doesn't mean
link |
00:54:30.420
you're still not gonna be fucked at some point in your life.
link |
00:54:33.140
No, of course not, no, I'm not advocating
link |
00:54:34.860
like that we're all immortal because of this.
link |
00:54:36.420
It's just like a fun thought experiment.
link |
00:54:38.460
And the point is it like raises this thing
link |
00:54:40.100
of like these things called observer selection effects,
link |
00:54:42.380
which Bostrom, Nick Bostrom talks about a lot,
link |
00:54:44.420
and I think people should go read.
link |
00:54:46.180
It's really powerful,
link |
00:54:47.020
but I think it could be overextended, that logic.
link |
00:54:48.860
I'm not sure exactly how it can be.
link |
00:54:52.220
I just feel like you can get, you can overgeneralize
link |
00:54:57.300
that logic somehow.
link |
00:54:58.340
Well, no, I mean, it leads you into like solipsism,
link |
00:55:00.660
which is a very dangerous mindset.
link |
00:55:02.140
Again, if everyone like falls into solipsism of like,
link |
00:55:04.340
well, I'll be fine.
link |
00:55:05.740
That's a great way of creating a very,
link |
00:55:08.820
self-terminating environment.
link |
00:55:10.860
But my point is, is that with the nuclear weapons thing,
link |
00:55:14.540
there have been at least, I think it's 12 or 11 near misses
link |
00:55:19.540
of like just stupid things, like there was moonrise
link |
00:55:22.940
over Norway, and it made weird reflections
link |
00:55:25.860
of some glaciers in the mountains, which set off,
link |
00:55:28.820
I think the alarms of NORAD radar,
link |
00:55:33.820
and that put them on high alert, nearly ready to shoot.
link |
00:55:35.700
And it was only because the head of Russian military
link |
00:55:39.540
happened to be at the UN in New York at the time
link |
00:55:41.980
that they go like, well, wait a second,
link |
00:55:43.300
why would they fire now when their guy is there?
link |
00:55:47.140
And it was only that lucky happenstance,
link |
00:55:49.260
which doesn't happen very often where they didn't then
link |
00:55:50.900
escalate it into firing.
link |
00:55:51.980
And there's a bunch of these different ones.
link |
00:55:53.860
Stanislav Petrov, like, is the person
link |
00:55:56.660
who should be the most famous person on earth,
link |
00:55:57.940
cause he's probably on expectation,
link |
00:55:59.580
saved the most human lives of anyone,
link |
00:56:01.180
like billions of people by ignoring Russian orders to fire
link |
00:56:05.300
because he felt in his gut that actually
link |
00:56:06.700
this was a false alarm.
link |
00:56:07.540
And it turned out to be, you know, a very hard thing to do.
link |
00:56:11.060
And there's so many of those scenarios that I can't help
link |
00:56:13.500
but wonder at this point whether we're having this kind
link |
00:56:15.900
of like selection effect thing going on.
link |
00:56:17.940
Cause you look back and you're like, geez,
link |
00:56:19.820
that's a lot of near misses.
link |
00:56:20.820
But of course we don't know the actual probabilities
link |
00:56:22.860
that each one would have ended up
link |
00:56:24.340
in nuclear war.
link |
00:56:25.180
Maybe they were not that likely, but still the point is,
link |
00:56:27.740
it's a very dark, stupid game that we're playing.
link |
00:56:30.900
And it is an absolute moral imperative if you ask me
link |
00:56:35.340
to get as many people thinking about ways
link |
00:56:37.220
to make this, like, very precarious situation more stable.
link |
00:56:39.540
Cause we're in a Nash equilibrium,
link |
00:56:41.140
but it's not like we're in the bottom of a pit.
link |
00:56:42.820
You know, if you would like map it topographically,
link |
00:56:46.380
it's not like a stable ball at the bottom of a thing.
link |
00:56:48.380
We're not in a stable equilibrium.
link |
00:56:49.420
We're on the top of a hill with a ball balanced on top.
link |
00:56:52.380
And just at any little nudge could send it flying down
link |
00:56:55.580
and you know, nuclear war pops off
link |
00:56:57.100
and hellfire and bad times.
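The "that's a lot of near misses" intuition can be put into numbers. If each of the dozen or so incidents had some independent chance of escalating, the cumulative risk compounds quickly; the per-incident probabilities below are pure assumptions, since, as she notes, the real ones are unknown:

```python
# Chance that at least one of n independent near misses escalates,
# given an assumed per-incident escalation probability p.
def p_any_escalation(n, p):
    return 1 - (1 - p) ** n

for p in (0.01, 0.05, 0.10):
    print(f"p = {p:.2f} per incident -> "
          f"P(at least one of 12 escalates) = {p_any_escalation(12, p):.1%}")
```

Even a modest 5% chance per incident gives roughly a 46% chance of at least one escalation across 12 incidents, which is why surviving all of them looks like either luck or selection bias.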
link |
00:57:00.060
On the positive side,
link |
00:57:01.100
life on earth will probably still continue.
link |
00:57:03.220
And another intelligent civilization might still pop up.
link |
00:57:06.220
Maybe.
link |
00:57:07.060
Several millennia after.
link |
00:57:08.420
Pick your X risk, depends on the X risk.
link |
00:57:10.180
Nuclear war, sure.
link |
00:57:11.020
That's one of the perhaps less bad ones.
link |
00:57:12.820
Green goo through synthetic biology, very bad.
link |
00:57:17.620
Will turn, you know, destroy all, you know,
link |
00:57:22.220
organic matter through, you know,
link |
00:57:25.740
it's basically like a biological paperclip maximizer.
link |
00:57:28.140
Also bad.
link |
00:57:28.980
Or AI type, you know, mass extinction thing as well
link |
00:57:32.540
would also be bad.
link |
00:57:33.380
Shh, they're listening.
link |
00:57:35.060
There's a robot right behind you.
link |
00:57:36.500
Okay, wait.
link |
00:57:37.340
So let me ask you about this from a game theory perspective.
link |
00:57:40.380
Do you think we're living in a simulation?
link |
00:57:42.300
Do you think we're living inside a video game
link |
00:57:44.700
created by somebody else?
link |
00:57:48.100
Well, so what was the second part of the question?
link |
00:57:50.220
Do I think we're living in a simulation and?
link |
00:57:52.620
A simulation that is observed by somebody
link |
00:57:56.660
for purpose of entertainment.
link |
00:57:58.620
So like a video game.
link |
00:57:59.660
Are we listening?
link |
00:58:00.500
Are we, because there's a,
link |
00:58:03.140
it's like Phil Hellmuth type of situation, right?
link |
00:58:05.260
Like there's a creepy level of like,
link |
00:58:09.940
this is kind of fun and interesting.
link |
00:58:13.700
Like there's a lot of interesting stuff going on.
link |
00:58:16.940
Maybe that could be somehow integrated
link |
00:58:18.820
into the evolutionary process where the way we perceive
link |
00:58:23.660
and are.
link |
00:58:24.500
Are you asking me if I believe in God?
link |
00:58:27.340
Sounds like it.
link |
00:58:28.820
Kind of, but God seems to be not optimizing
link |
00:58:33.980
in the different formulations of God that we conceive of.
link |
00:58:37.420
He doesn't seem to be, or she, optimizing
link |
00:58:39.940
for like personal entertainment.
link |
00:58:44.220
Maybe the older gods did.
link |
00:58:45.860
But the, you know, just like, basically like a teenager
link |
00:58:49.820
in their mom's basement watching, creating a fun universe
link |
00:58:54.180
to observe what kind of crazy shit might happen.
link |
00:58:58.500
Okay, so to try and answer this.
link |
00:59:00.100
Do I think there is some kind of extraneous intelligence
link |
00:59:11.340
to like our, you know, classic measurable universe
link |
00:59:16.220
that we, you know, can measure with, you know,
link |
00:59:18.220
through our current physics and instruments?
link |
00:59:23.780
I think so, yes.
link |
00:59:25.100
Partly because I've had just small little bits of evidence
link |
00:59:29.100
in my own life, which have made me question.
link |
00:59:32.300
Like, so I was a diehard atheist, even five years ago.
link |
00:59:37.460
You know, I got into like the rationality community,
link |
00:59:39.820
big fan of LessWrong, it continues to be an incredible resource.
link |
00:59:45.140
But I've just started to have too many little snippets
link |
00:59:49.140
of experience, which don't make sense with the current sort
link |
00:59:54.140
of purely materialistic explanation of how reality works.
link |
01:00:04.540
Isn't that just like a humbling, practical realization
link |
01:00:09.540
that we don't know how reality works?
link |
01:00:12.540
Isn't that just a reminder to yourself that you're
link |
01:00:15.620
like, I don't know how reality works, isn't that just
link |
01:00:19.340
a reminder to yourself?
link |
01:00:20.180
Yeah, no, it's a reminder of epistemic humility
link |
01:00:22.500
because I fell too hard, you know, same as people,
link |
01:00:25.460
like I think, you know, many people who are just like,
link |
01:00:27.580
my religion is the way, this is the correct way,
link |
01:00:29.380
this is the work, this is the law, you are immoral
link |
01:00:32.260
if you don't follow this, blah, blah, blah.
link |
01:00:33.620
I think they are lacking epistemic humility.
link |
01:00:36.060
There's a little too much hubris there.
link |
01:00:38.100
But similarly, I think that sort of the Richard Dawkins
link |
01:00:40.380
realism is too rigid as well and doesn't, you know,
link |
01:00:48.580
there's a way to try and navigate these questions
link |
01:00:51.020
which still honors the scientific method,
link |
01:00:52.780
which I still think is our best sort of realm
link |
01:00:54.820
of like reasonable inquiry, you know, a method of inquiry.
link |
01:00:58.940
So an example, two kind of notable examples
link |
01:01:03.340
that like really rattled my cage.
link |
01:01:06.660
The first one was actually in 2010,
link |
01:01:09.340
quite early on in my poker career.
link |
01:01:13.460
And I remember the Icelandic volcano that erupted
link |
01:01:19.540
that like shut down kind of all Atlantic airspace.
link |
01:01:22.700
And it meant I got stuck down in the South of France.
link |
01:01:25.260
I was there for something else.
link |
01:01:27.140
And I couldn't get home and someone said,
link |
01:01:29.900
well, there's a big poker tournament happening in Italy.
link |
01:01:31.860
Maybe, do you wanna go?
link |
01:01:32.980
I was like, all right, sure.
link |
01:01:33.820
Like, let's, you know, got a train across,
link |
01:01:35.860
found a way to get there.
link |
01:01:37.860
And the buy in was 5,000 euros,
link |
01:01:39.860
which was much bigger than my bankroll would normally allow.
link |
01:01:42.500
And so I played a feeder tournament, won my way in
link |
01:01:46.620
kind of like I did with the Monte Carlo big one.
link |
01:01:49.820
So then I won my way, you know,
link |
01:01:50.900
from 500 euros into 5,000 euros to play this thing.
link |
01:01:54.180
And on day one of then the big tournament,
link |
01:01:57.980
which turned out to have,
link |
01:01:58.980
it was the biggest tournament ever held in Europe
link |
01:02:00.860
at the time.
link |
01:02:01.700
It got over like 1,200 people, absolutely huge.
link |
01:02:04.540
And I remember they dimmed the lights before, you know,
link |
01:02:08.380
the normal shuffle up and deal
link |
01:02:10.220
to tell everyone to start playing.
link |
01:02:12.100
And they played Chemical Brothers, Hey Boy, Hey Girl,
link |
01:02:16.100
which I don't know why it's notable,
link |
01:02:17.380
but it was just like a really,
link |
01:02:18.380
it was a song I always liked.
link |
01:02:19.220
It was like one of these like pump me up songs.
link |
01:02:21.220
And I was sitting there thinking, oh yeah, it's exciting.
link |
01:02:22.900
I'm playing this really big tournament.
link |
01:02:24.580
And out of nowhere, just suddenly this voice in my head,
link |
01:02:29.380
just, and it sounded like my own sort of, you know,
link |
01:02:32.220
when you think in your mind, you hear a voice kind of, right?
link |
01:02:34.980
At least I do.
link |
01:02:36.780
And so it sounded like my own voice and it said,
link |
01:02:38.620
you are going to win this tournament.
link |
01:02:41.220
And it was so powerful that I got this like wave of like,
link |
01:02:44.220
you know, sort of goosebumps down my body.
link |
01:02:46.780
And that I even, I remember looking around being like,
link |
01:02:48.780
did anyone else hear that?
link |
01:02:49.820
And obviously people are in their phones,
link |
01:02:51.020
like no one else heard it.
link |
01:02:51.860
And I was like, okay, six days later,
link |
01:02:56.700
I win the fucking tournament out of 1,200 people.
link |
01:02:59.780
And I don't know how to explain it.
link |
01:03:08.780
Okay, yes, maybe I have that feeling
link |
01:03:13.380
before every time I play.
link |
01:03:14.620
And it's just that I happened to, you know,
link |
01:03:16.140
because I won the tournament, I retroactively remembered it.
link |
01:03:18.820
But that's just.
link |
01:03:19.660
Or the feeling gave you a kind of,
link |
01:03:22.300
to use the term from earlier, Phil Hellmuthian.
link |
01:03:24.940
Well, exactly.
link |
01:03:25.780
Like it gave you a confidence, a deep confidence.
link |
01:03:28.340
And it did.
link |
01:03:29.180
It definitely did.
link |
01:03:30.020
Like, I remember then feeling this like sort of,
link |
01:03:32.060
well, although I remember then on day one,
link |
01:03:33.980
I then went and lost half my stack quite early on.
link |
01:03:35.900
And I remember thinking like, oh, well, that was bullshit.
link |
01:03:37.700
You know, what kind of premonition is this?
link |
01:03:40.260
Thinking, oh, I'm out.
link |
01:03:41.100
But you know, I managed to like keep it together
link |
01:03:42.940
and recover and then just went like pretty perfectly
link |
01:03:46.060
from then on.
link |
01:03:47.420
And either way, it definitely instilled me
link |
01:03:51.100
with this confidence.
link |
01:03:52.780
And I don't want to put, I can't put an explanation.
link |
01:03:55.820
Like, you know, was it some, you know, huge extraneous,
link |
01:04:01.460
you know, supernatural thing driving me?
link |
01:04:03.980
Or was it just my own self-confidence
link |
01:04:06.540
that just made me make the right decisions?
link |
01:04:07.980
I don't know.
link |
01:04:08.860
And I don't, I'm not going to put a frame on it.
link |
01:04:10.660
And I think.
link |
01:04:11.500
I think I know a good explanation.
link |
01:04:12.420
So we're a bunch of NPCs living in this world
link |
01:04:14.740
created by, in the simulation.
link |
01:04:16.580
And then people, not people, creatures from outside
link |
01:04:20.300
of the simulation sort of can tune in
link |
01:04:23.140
and play your character.
link |
01:04:24.220
And that feeling you got is somebody just like,
link |
01:04:27.220
they got to play a poker tournament through you.
link |
01:04:29.300
Honestly, it felt like that.
link |
01:04:30.660
It did actually feel a little bit like that.
link |
01:04:33.100
But it's been 12 years now.
link |
01:04:35.180
I've retold the story many times.
link |
01:04:36.700
Like, I don't even know how much I can trust my memory.
link |
01:04:38.940
You're just an NPC retelling the same story.
link |
01:04:41.500
Because they just played the tournament and left.
link |
01:04:43.780
Yeah, they're like, oh, that was fun.
link |
01:04:44.780
Cool.
link |
01:04:45.620
Yeah, cool.
link |
01:04:46.460
Next time.
link |
01:04:47.300
And now you're for the rest of your life left
link |
01:04:49.260
as a boring NPC retelling this story of greatness.
link |
01:04:51.860
But it was, and what was interesting was that after that,
link |
01:04:53.700
then I didn't obviously win a major tournament
link |
01:04:55.620
for quite a long time.
link |
01:04:56.780
And it left, that was actually another sort of dark period
link |
01:05:01.180
because I had this incredible,
link |
01:05:02.780
like the highs of winning that,
link |
01:05:04.340
just on a like material level were insane,
link |
01:05:06.180
winning the money.
link |
01:05:07.060
I was on the front page of newspapers
link |
01:05:08.540
because there was like this girl that came out of nowhere
link |
01:05:10.220
and won this big thing.
link |
01:05:12.180
And so again, like sort of chasing that feeling
link |
01:05:14.820
was difficult.
link |
01:05:16.700
But then on top of that, there was this feeling
link |
01:05:18.140
of like almost being touched by something bigger
link |
01:05:22.340
that was like, ah.
link |
01:05:24.060
So maybe, did you have a sense
link |
01:05:26.580
that I might be somebody special?
link |
01:05:29.540
Like this kind of,
link |
01:05:34.340
I think that's the confidence thing
link |
01:05:36.540
that maybe you could do something special in this world
link |
01:05:40.740
after all kind of feeling.
link |
01:05:42.300
I definitely, I mean, this is the thing
link |
01:05:45.340
I think everybody wrestles with to an extent, right?
link |
01:05:48.220
We are truly the protagonists in our own lives.
link |
01:05:51.540
And so it's a natural bias, human bias
link |
01:05:54.620
to feel special.
link |
01:05:58.860
And I think, and in some ways we are special.
link |
01:06:00.820
Every single person is special
link |
01:06:01.940
because you are that, the universe does,
link |
01:06:04.660
the world literally does revolve around you.
link |
01:06:06.220
That's the thing in some respect.
link |
01:06:08.580
But of course, if you then zoom out
link |
01:06:10.540
and take the amalgam of everyone's experiences,
link |
01:06:12.140
then no, it doesn't.
link |
01:06:12.980
So there is this shared sort of objective reality,
link |
01:06:15.700
but sorry, there's objective reality that is shared,
link |
01:06:17.740
but then there's also this subjective reality
link |
01:06:19.260
which is truly unique to you.
link |
01:06:20.740
And I think both of those things coexist.
link |
01:06:22.300
And it's not like one is correct and one isn't.
link |
01:06:24.460
And again, anyone who's like,
link |
01:06:26.300
oh no, your lived experience is everything
link |
01:06:28.180
versus your lived experience is nothing.
link |
01:06:30.100
No, it's a blend between these two things.
link |
01:06:32.500
They can exist concurrently.
link |
01:06:33.860
But there's a certain kind of sense
link |
01:06:35.300
that at least I've had my whole life.
link |
01:06:36.780
And I think a lot of people have this as like,
link |
01:06:38.500
well, I'm just like this little person.
link |
01:06:40.980
Surely I can't be one of those people
link |
01:06:42.900
that do the big thing, right?
link |
01:06:46.540
There's all these big people doing big things.
link |
01:06:48.660
There's big actors and actresses, big musicians.
link |
01:06:53.420
There's big business owners and all that kind of stuff,
link |
01:06:57.060
scientists and so on.
link |
01:06:58.700
I have my own subject experience that I enjoy and so on,
link |
01:07:02.420
but there's like a different layer.
link |
01:07:04.860
Like surely I can't do those great things.
link |
01:07:09.460
I mean, one of the things just having interacted
link |
01:07:11.500
with a lot of great people, I realized,
link |
01:07:14.140
no, they're like just the same humans as me.
link |
01:07:20.060
And that realization I think is really empowering.
link |
01:07:22.140
And to remind yourself.
link |
01:07:24.180
What are they?
link |
01:07:25.020
Huh?
link |
01:07:25.860
What are they?
link |
01:07:26.700
Are they?
link |
01:07:27.540
Well, in terms of.
link |
01:07:29.020
Depends on some, yeah.
link |
01:07:30.220
They're like a bag of insecurities and.
link |
01:07:33.900
Yes.
link |
01:07:36.940
Peculiar sort of, like their own little weirdnesses
link |
01:07:41.300
and so on, I should say also not.
link |
01:07:48.020
They have the capacity for brilliance,
link |
01:07:50.140
but they're not generically brilliant.
link |
01:07:52.860
Like, you know, we tend to say this person
link |
01:07:55.740
or that person is brilliant, but really no,
link |
01:07:59.620
they're just like sitting there and thinking through stuff
link |
01:08:02.740
just like the rest of us.
link |
01:08:04.780
Right.
link |
01:08:05.620
I think they're in the habit of thinking through stuff
link |
01:08:08.100
seriously and they've built up a habit of not allowing them,
link |
01:08:13.020
their mind to get trapped in a bunch of bullshit
link |
01:08:15.020
and minutia of day to day life.
link |
01:08:16.980
They really think big ideas, but those big ideas,
link |
01:08:21.420
it's like allowing yourself the freedom to think big,
link |
01:08:24.700
to realize that you can be one that actually solved
link |
01:08:27.980
this particular big problem.
link |
01:08:29.260
First identify a big problem that you care about,
link |
01:08:31.300
then like, I can actually be the one
link |
01:08:33.220
that solves this problem.
link |
01:08:34.860
And like allowing yourself to believe that.
link |
01:08:37.260
And I think sometimes you do need to have like
link |
01:08:39.380
that shock go through your body and a voice tells you,
link |
01:08:41.700
you're gonna win this tournament.
link |
01:08:42.900
Well, exactly.
link |
01:08:43.740
And whether it was, it's this idea of useful fictions.
link |
01:08:50.500
So again, like going through all like
link |
01:08:53.020
the classic rationalist training of LessWrong
link |
01:08:55.500
where it's like, you want your map,
link |
01:08:57.460
you know, the image you have of the world in your head
link |
01:09:00.100
to as accurately match up with how the world actually is.
link |
01:09:04.380
You want the map and the territory to perfectly align
link |
01:09:06.900
as, you know, you want it to be
link |
01:09:08.380
as an accurate representation as possible.
link |
01:09:11.860
I don't know if I fully subscribe to that anymore,
link |
01:09:13.860
having now had these moments of like feeling of something
link |
01:09:17.420
either bigger or just actually just being overconfident.
link |
01:09:20.340
Like there is value in overconfidence sometimes.
link |
01:09:23.700
If you, you know, take, you know,
link |
01:09:25.780
take Magnus Carlsen, right?
link |
01:09:30.540
If he, I'm sure from a young age,
link |
01:09:32.260
he knew he was very talented,
link |
01:09:34.380
but I wouldn't be surprised if he also had something
link |
01:09:37.340
in him to, well, actually maybe he's a bad example
link |
01:09:40.260
because he truly is the world's greatest,
link |
01:09:42.580
but someone who it was unclear
link |
01:09:44.380
whether they were gonna be the world's greatest,
link |
01:09:45.620
but ended up doing extremely well
link |
01:09:47.260
because they had this innate, deep self confidence,
link |
01:09:50.220
this like even overblown idea
link |
01:09:53.060
of how good their relative skill level is.
link |
01:09:54.860
That gave them the confidence to then pursue this thing
link |
01:09:56.900
and they're like with the kind of focus and dedication
link |
01:10:00.380
that it requires to excel in whatever it is
link |
01:10:02.380
you're trying to do, you know?
link |
01:10:03.420
And so there are these useful fictions
link |
01:10:06.060
and that's where I think I diverge slightly
link |
01:10:09.540
with the classic sort of rationalist community
link |
01:10:14.620
because that's a field that is worth studying
link |
01:10:21.300
of like how the stories we tell,
link |
01:10:23.220
what the stories we tell to ourselves,
link |
01:10:25.020
even if they are actually false,
link |
01:10:26.220
and even if we suspect they might be false,
link |
01:10:28.980
how it's better to sort of have that like little bit
link |
01:10:30.660
of faith, like value in faith, I think actually.
link |
01:10:34.380
And that's partly another thing
link |
01:10:35.860
that's now led me to explore the concept of God,
link |
01:10:40.620
whether you wanna call it a simulator,
link |
01:10:42.940
the classic theological thing.
link |
01:10:44.300
I think we're all like alluding to the same thing.
link |
01:10:46.460
Now, I don't know, I'm not saying,
link |
01:10:47.780
because obviously the Christian God
link |
01:10:49.180
is like all benevolent, endless love.
link |
01:10:53.500
The simulation, at least one of the simulation hypothesis
link |
01:10:56.700
is like, as you said, like a teenager in his bedroom
link |
01:10:58.700
who doesn't really care, doesn't give a shit
link |
01:11:00.340
about the individuals within there.
link |
01:11:02.580
It just like wants to see how the thing plays out
link |
01:11:05.140
because it's curious and it could turn it off like that.
link |
01:11:07.620
Where on the sort of psychopathy
link |
01:11:09.900
to benevolent spectrum God is, I don't know.
link |
01:11:13.820
But just having a little bit of faith
link |
01:11:20.180
that there is something else out there
link |
01:11:21.700
that might be interested in our outcome
link |
01:11:24.340
is I think an essential thing actually for people to find.
link |
01:11:27.940
A, because it creates commonality between,
link |
01:11:29.860
it's something we can all share.
link |
01:11:31.860
And it is uniquely humbling of all of us to an extent.
link |
01:11:35.020
It's like a common objective.
link |
01:11:37.900
But B, it gives people that little bit of like reserve
link |
01:11:41.580
when things get really dark.
link |
01:11:43.380
And I do think things are gonna get pretty dark
link |
01:11:44.860
over the next few years.
link |
01:11:47.180
But it gives that like,
link |
01:11:49.260
to think that there's something out there
link |
01:11:50.780
that actually wants our game to keep going.
link |
01:11:53.140
I keep calling it the game.
link |
01:11:55.020
It's a thing C and I, we call it the game.
link |
01:11:57.660
You and C, a.k.a. Grimes, call what the game?
link |
01:12:02.660
Everything, the whole thing?
link |
01:12:03.900
Yeah, we joke about like.
link |
01:12:05.500
So everything is a game.
link |
01:12:06.820
Well, the universe, like what if it's a game
link |
01:12:11.100
and the goal of the game is to figure out like,
link |
01:12:13.580
well, either how to beat it, how to get out of it.
link |
01:12:16.020
Maybe this universe is an escape room,
link |
01:12:18.820
like a giant escape room.
link |
01:12:20.140
And the goal is to figure out,
link |
01:12:22.900
put all the pieces to puzzle, figure out how it works
link |
01:12:25.820
in order to like unlock this like hyperdimensional key
link |
01:12:29.780
and get out beyond what it is.
link |
01:12:31.260
That's.
link |
01:12:32.100
No, but then, so you're saying it's like different levels
link |
01:12:34.380
and it's like a cage within a cage within a cage
link |
01:12:36.500
and never like one cage at a time,
link |
01:12:38.300
you figure out how to escape that.
link |
01:12:41.060
Like a new level up, you know,
link |
01:12:42.260
like us becoming multi planetary would be a level up
link |
01:12:44.380
or us, you know, figuring out how to upload
link |
01:12:46.700
our consciousnesses to the thing.
link |
01:12:48.260
That would probably be a leveling up or spiritually,
link |
01:12:51.380
you know, humanity becoming more combined
link |
01:12:53.580
and less adversarial and bloodthirsty
link |
01:12:56.900
and us becoming a little bit more enlightened.
link |
01:12:58.780
That would be a leveling up.
link |
01:12:59.620
You know, there's many different frames to it,
link |
01:13:01.860
whether it's physical, you know, digital
link |
01:13:05.140
or like metaphysical.
link |
01:13:05.980
I wonder what the levels, I think,
link |
01:13:07.380
I think level one for earth is probably
link |
01:13:10.900
the biological evolutionary process.
link |
01:13:14.060
So going from single cell organisms to early humans.
link |
01:13:18.380
Then maybe level two is whatever's happening inside our minds
link |
01:13:22.980
and creating ideas and creating technologies.
link |
01:13:26.100
That's like evolutionary process of ideas.
link |
01:13:31.340
And then multi planetary is interesting.
link |
01:13:34.620
Is that fundamentally different
link |
01:13:35.940
from what we're doing here on earth?
link |
01:13:37.900
Probably, because it allows us to like exponentially scale.
link |
01:13:41.780
It delays the Malthusian trap, right?
link |
01:13:46.380
It's a way to keep the playing field,
link |
01:13:50.380
to make the playing field get larger
link |
01:13:53.380
so that it can accommodate more of our stuff, more of us.
link |
01:13:57.260
And that's a good thing,
link |
01:13:58.460
but I don't know if it like fully solves this issue of,
link |
01:14:05.740
well, this thing called Moloch,
link |
01:14:06.740
which we haven't talked about yet,
link |
01:14:07.820
but which is basically,
link |
01:14:10.660
I call it the God of unhealthy competition.
link |
01:14:12.740
Yeah, let's go to Moloch.
link |
01:14:13.980
What's Moloch?
link |
01:14:14.980
You did a great video on Moloch and one aspect of it,
link |
01:14:18.380
the application of it to one aspect.
link |
01:14:20.420
Instagram beauty filters.
link |
01:14:22.420
True.
link |
01:14:23.580
Very niche, but I wanted to start off small.
link |
01:14:27.340
So Moloch was originally coined as,
link |
01:14:34.540
well, so apparently back in the like Canaanite times,
link |
01:14:39.540
it was to say ancient Carthaginian,
link |
01:14:41.420
I can never say it Carthaginian,
link |
01:14:43.220
somewhere around like 300 BC or 280, I don't know.
link |
01:14:47.620
There was supposedly this death cult
link |
01:14:49.700
who would sacrifice their children
link |
01:14:53.220
to this awful demon God thing they called Moloch
link |
01:14:56.980
in order to get power to win wars.
link |
01:14:59.380
So really dark, horrible things.
link |
01:15:00.900
And it was literally like about child sacrifice,
link |
01:15:02.580
whether they actually existed or not, we don't know,
link |
01:15:04.060
but in mythology they did.
link |
01:15:05.660
And this God that they worshiped
link |
01:15:07.220
was this thing called Moloch.
link |
01:15:09.220
And then I don't know,
link |
01:15:10.940
it seemed like it was kind of quiet throughout history
link |
01:15:13.660
in terms of mythology beyond that,
link |
01:15:15.420
until this movie Metropolis in 1927 talked about this,
link |
01:15:24.420
you see that there was this incredible futuristic city
link |
01:15:27.020
that everyone was living great in,
link |
01:15:29.100
but then the protagonist goes underground into the sewers
link |
01:15:31.380
and sees that the city is run by this machine.
link |
01:15:34.260
And this machine basically would just like kill the workers
link |
01:15:37.300
all the time because it was just so hard to keep it running.
link |
01:15:40.060
They were always dying.
link |
01:15:40.900
So there was all this suffering that was required
link |
01:15:42.980
in order to keep the city going.
link |
01:15:44.540
And then the protagonist has this vision
link |
01:15:45.900
that this machine is actually this demon Moloch.
link |
01:15:48.140
So again, it's like this sort of like mechanistic consumption
link |
01:15:50.860
of humans in order to get more power.
link |
01:15:54.780
And then Allen Ginsberg wrote a poem in the 50s,
link |
01:15:58.700
this incredible poem called Howl, about this thing Moloch.
link |
01:16:04.420
And a lot of people sort of quite understandably
link |
01:16:06.980
take the interpretation of that,
link |
01:16:08.740
that he's talking about capitalism.
link |
01:16:10.820
But then the sort of pièce de résistance
link |
01:16:13.740
that's moved Moloch into this idea of game theory
link |
01:16:16.340
was Scott Alexander of Slate Star Codex
link |
01:16:20.540
wrote this incredible,
link |
01:16:21.700
well, literally I think it might be my favorite piece
link |
01:16:23.420
of writing of all time.
link |
01:16:24.260
It's called Meditations on Moloch.
link |
01:16:26.180
Everyone must go read it.
link |
01:16:28.820
And...
link |
01:16:29.660
Slate Star Codex is a blog.
link |
01:16:30.940
It's a blog, yes.
link |
01:16:32.300
We can link to it in the show notes or something, right?
link |
01:16:35.220
No, don't.
link |
01:16:36.060
I, yes, yes.
link |
01:16:38.340
But I like how you assume
link |
01:16:41.260
I have a professional operation going on here.
link |
01:16:43.540
I mean...
link |
01:16:44.380
I shall try to remember to...
link |
01:16:45.220
You were gonna assume.
link |
01:16:46.060
What do you...
link |
01:16:46.900
What are you, what do you want?
link |
01:16:48.100
You're giving the impression of it.
link |
01:16:49.420
Yeah, I'll look, please.
link |
01:16:50.700
If I don't, please somebody in the comments remind me.
link |
01:16:53.260
I'll help you.
link |
01:16:54.100
If you don't know this blog,
link |
01:16:55.780
it's one of the best blogs ever probably.
link |
01:16:59.420
You should probably be following it.
link |
01:17:01.180
Yes.
link |
01:17:02.020
Are blogs still a thing?
link |
01:17:03.340
I think they are still a thing, yeah.
link |
01:17:04.660
Yeah, he's migrated onto Substack,
link |
01:17:06.220
but yeah, it's still a blog.
link |
01:17:07.860
Anyway.
link |
01:17:08.700
Substack better not fuck things up, but...
link |
01:17:10.860
I hope not, yeah.
link |
01:17:12.100
I hope they don't, I hope they don't turn Molochy,
link |
01:17:14.620
which will mean something to people when we continue.
link |
01:17:16.700
Yeah.
link |
01:17:17.540
When I stop interrupting for once.
link |
01:17:19.460
No, no, it's good.
link |
01:17:20.300
Go on, yeah.
link |
01:17:21.140
So anyway, so he writes,
link |
01:17:22.220
he writes this piece, Meditations on Moloch,
link |
01:17:24.940
and basically he analyzes the poem and he's like,
link |
01:17:28.220
okay, so it seems to be something relating
link |
01:17:29.700
to where competition goes wrong.
link |
01:17:32.460
And, you know, Moloch was historically this thing
link |
01:17:35.580
of like where people would sacrifice a thing
link |
01:17:38.820
that they care about, in this case, children,
link |
01:17:40.460
their own children, in order to gain power,
link |
01:17:43.220
a competitive advantage.
link |
01:17:45.340
And if you look at almost everything that sort of goes wrong
link |
01:17:48.860
in our society, it's that same process.
link |
01:17:52.300
So with the Instagram beauty filters thing,
link |
01:17:56.420
you know, if you're trying to become
link |
01:17:57.900
a famous Instagram model,
link |
01:18:01.940
you are incentivized to post the hottest pictures
link |
01:18:03.820
of yourself that you can, you know,
link |
01:18:05.620
you're trying to play that game.
link |
01:18:07.220
There's a lot of hot women on Instagram.
link |
01:18:08.660
How do you compete against them?
link |
01:18:10.020
You post really hot pictures
link |
01:18:11.540
and that's how you get more likes.
link |
01:18:14.220
As technology gets better, you know,
link |
01:18:16.780
more makeup techniques come along.
link |
01:18:19.260
And then more recently, these beauty filters
link |
01:18:22.020
where like at the touch of a button,
link |
01:18:23.420
it makes your face look absolutely incredible
link |
01:18:26.060
compared to your natural face.
link |
01:18:29.540
These technologies come along,
link |
01:18:30.940
it's everyone is incentivized to that short term strategy.
link |
01:18:35.620
But on net, it's bad for everyone
link |
01:18:39.020
because now everyone is kind of feeling
link |
01:18:40.460
like they have to use these things.
link |
01:18:41.660
And these things like they make you like,
link |
01:18:43.060
the reason why I talked about them in this video
link |
01:18:44.380
is because I noticed it myself, you know,
link |
01:18:45.860
like I was trying to grow my Instagram for a while,
link |
01:18:48.820
I've given up on it now.
link |
01:18:49.660
But yeah, and I noticed these filters,
link |
01:18:52.500
how good they made me look.
link |
01:18:53.420
And I'm like, well, I know that everyone else
link |
01:18:56.380
is kind of doing it.
link |
01:18:57.220
Go subscribe to Liv's Instagram.
link |
01:18:58.980
Please, so I don't have to use the filters.
link |
01:19:01.180
I'll post a bunch of, yeah, make it blow up.
link |
01:19:06.020
So yeah, you felt the pressure actually.
link |
01:19:08.460
Exactly, these short term incentives
link |
01:19:10.980
to do this like, this thing that like either sacrifices
link |
01:19:13.980
your integrity or something else
link |
01:19:17.300
in order to like stay competitive,
link |
01:19:20.100
which on aggregate turns like,
link |
01:19:22.700
creates this like sort of race to the bottom spiral
link |
01:19:24.820
where everyone else ends up in a situation
link |
01:19:26.580
which is worse off than if they hadn't started,
link |
01:19:28.220
you know, than they were before.
link |
01:19:29.300
Kind of like if, like at a football stadium,
link |
01:19:34.300
like the system is so badly designed,
link |
01:19:36.380
a competitive system of like everyone sitting
link |
01:19:38.140
and having a view that if someone at the very front
link |
01:19:40.220
stands up to get an even better view,
link |
01:19:42.060
it forces everyone else behind
link |
01:19:43.620
to like adopt that same strategy
link |
01:19:45.420
just to get to where they were before.
link |
01:19:46.980
But now everyone's stuck standing up.
link |
01:19:48.700
Like, so you need this like top down God's eye coordination
link |
01:19:51.860
to make it go back to the better state.
link |
01:19:53.820
But from within the system, you can't actually do that.
link |
01:19:56.100
So that's kind of what this Moloch thing is.
link |
01:19:57.660
It's this thing that makes people sacrifice values
link |
01:20:01.580
in order to optimize for winning the game in question,
link |
01:20:04.620
the short term game.
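The stadium dynamic Liv describes can be sketched as a toy best-response simulation. This is an editor's illustration, not from the conversation; the payoff weights and the seeded front-row stander are assumptions:

```python
# Toy model of the stadium multipolar trap: rows 0..n-1, row 0 at the front.
# A seated agent's view is blocked if anyone in front of them stands.
# Payoff weights are illustrative assumptions, not from the conversation.
VIEW, COMFORT = 2, 1

def utility(standing, row):
    has_view = standing[row] or not any(standing[:row])
    view = VIEW if has_view else 0
    comfort = COMFORT if not standing[row] else 0
    return view + comfort

def best_response_dynamics(n_rows):
    standing = [False] * n_rows
    standing[0] = True  # one front-row agent stands for a better view
    changed = True
    while changed:  # let seated agents switch while someone strictly gains
        changed = False
        for row in range(n_rows):
            if standing[row]:
                continue
            sit_u = utility(standing, row)
            stand_u = utility(standing[:row] + [True] + standing[row + 1:], row)
            if stand_u > sit_u:
                standing[row] = True
                changed = True
    return standing

n = 5
before = [utility([False] * n, r) for r in range(n)]  # everyone seated
after_state = best_response_dynamics(n)
after = [utility(after_state, r) for r in range(n)]
print(before)       # [3, 3, 3, 3, 3] -- view + comfort for all
print(after_state)  # [True, True, True, True, True] -- the cascade
print(after)        # [2, 2, 2, 2, 2] -- same view, comfort gone for all
```

With the view worth more than comfort, one stander triggers a cascade until everyone stands; every agent's payoff drops from 3 to 2, yet no one can improve by sitting back down alone, which is exactly the "top down coordination needed" point above.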
link |
01:20:05.500
But this Moloch, can you attribute it
link |
01:20:09.740
to any one centralized source
link |
01:20:11.700
or is it an emergent phenomena
link |
01:20:13.620
from a large collection of people?
link |
01:20:16.140
Exactly that.
link |
01:20:16.980
It's an emergent phenomena.
link |
01:20:18.860
It's a force of game theory.
link |
01:20:22.220
It's a force of bad incentives on a multi agent system
link |
01:20:25.580
where you've got more, you know,
link |
01:20:26.580
prisoner's dilemma is technically
link |
01:20:28.140
a kind of Moloch system as well,
link |
01:20:30.500
but it's just a two player thing.
link |
01:20:31.980
But another word for Moloch is a multipolar trap.
link |
01:20:35.740
Where basically you just got a lot of different people
link |
01:20:37.780
all competing for some kind of prize.
link |
01:20:40.740
And it would be better
link |
01:20:42.380
if everyone didn't do this one shitty strategy,
link |
01:20:44.740
but because that strategy gives you a short term advantage,
link |
01:20:47.260
everyone's incentivized to do it.
link |
01:20:48.500
And so everyone ends up doing it.
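The two-player case Liv mentions, the prisoner's dilemma, can be written out in a few lines. The payoff numbers here are the textbook illustrative values (an editor's sketch, not figures from the conversation):

```python
# Two-player version of the trap: the prisoner's dilemma.
# Payoffs follow the standard ordering T > R > P > S; values illustrative.
PAYOFF = {  # (my move, their move) -> my payoff
    ("C", "C"): 3,  # R: reward for mutual cooperation
    ("C", "D"): 0,  # S: sucker's payoff
    ("D", "C"): 5,  # T: temptation to defect
    ("D", "D"): 1,  # P: punishment for mutual defection
}

def best_response(their_move):
    # pick whichever of my moves maximizes my payoff given theirs
    return max(("C", "D"), key=lambda me: PAYOFF[(me, their_move)])

# Defecting dominates no matter what the other player does...
print(best_response("C"), best_response("D"))  # D D
# ...yet mutual defection leaves both worse off than mutual cooperation:
print(PAYOFF[("D", "D")], "<", PAYOFF[("C", "C")])  # 1 < 3
```

Each player's individually rational move produces the outcome both would reject if they could coordinate, which is the same shape as the many-agent version: a short-term advantage everyone is incentivized to take, leaving everyone worse off.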
link |
01:20:49.860
So the responsibility for,
link |
01:20:51.980
I mean, social media is a really nice place
link |
01:20:54.060
for a large number of people to play game theory.
link |
01:20:56.860
And so they also have the ability
link |
01:20:59.740
to then design the rules of the game.
link |
01:21:02.580
And is it on them to try to anticipate
link |
01:21:05.540
what kind of like to do the thing
link |
01:21:07.900
that poker players are doing to run simulation?
link |
01:21:11.100
Ideally that would have been great.
link |
01:21:12.380
If, you know, Mark Zuckerberg and Jack
link |
01:21:15.580
and all the, you know, the Twitter founders and everyone,
link |
01:21:17.260
if they had at least just run a few simulations
link |
01:21:20.580
of how their algorithms would, you know,
link |
01:21:23.220
if different types of algorithms would turn out for society,
link |
01:21:26.260
that would have been great.
link |
01:21:27.340
That's really difficult to do
link |
01:21:28.700
that kind of deep philosophical thinking
link |
01:21:30.420
about thinking about humanity actually.
link |
01:21:33.100
So not kind of this level of how do we optimize engagement
link |
01:21:40.580
or what brings people joy in the short term,
link |
01:21:42.860
but how is this thing going to change
link |
01:21:45.300
the way people see the world?
link |
01:21:47.940
How is it gonna get morphed in iterative games played
link |
01:21:52.940
into something that will change society forever?
link |
01:21:56.860
That requires some deep thinking.
link |
01:21:58.500
That's, I hope there's meetings like that inside companies,
link |
01:22:02.420
but I haven't seen them.
link |
01:22:03.260
There aren't, that's the problem.
link |
01:22:04.300
And it's difficult because like,
link |
01:22:07.500
when you're starting up a social media company,
link |
01:22:09.740
you know, you're aware that you've got investors to please,
link |
01:22:13.940
there's bills to pay, you know,
link |
01:22:17.100
there's only so much R&D you can afford to do.
link |
01:22:19.940
You've got all these like incredible pressures,
link |
01:22:21.940
bad incentives to get on and just build your thing
link |
01:22:24.420
as quickly as possible and start making money.
link |
01:22:26.620
And, you know, I don't think anyone intended
link |
01:22:29.100
when they built these social media platforms
link |
01:22:32.580
and just to like preface it.
link |
01:22:33.860
So the reason why, you know, social media is relevant
link |
01:22:36.340
because it's a very good example of like,
link |
01:22:38.740
everyone these days is optimizing for, you know, clicks,
link |
01:22:42.980
whether it's a social media platforms themselves,
link |
01:22:44.780
because, you know, every click gets more, you know,
link |
01:22:47.300
impressions and impressions pay for, you know,
link |
01:22:49.740
they get advertising dollars
link |
01:22:50.980
or whether it's individual influencers
link |
01:22:53.740
or, you know, whether it's a New York Times or whoever,
link |
01:22:56.100
they're trying to get their story to go viral.
link |
01:22:58.300
So everyone's got this bad incentive of using, you know,
link |
01:23:00.220
as you called it, the clickbait industrial complex.
link |
01:23:02.780
That's a very Molochy system
link |
01:23:04.220
because everyone is now using worse and worse tactics
link |
01:23:06.260
in order to like try and win this attention game.
link |
01:23:09.500
And yeah, so ideally these companies
link |
01:23:14.620
would have had enough slack in the beginning
link |
01:23:17.300
in order to run these experiments to see,
link |
01:23:19.780
okay, what are the ways this could possibly go wrong
link |
01:23:22.460
for people?
link |
01:23:23.300
What are the ways that Moloch,
link |
01:23:24.700
they should be aware of this concept of Moloch
link |
01:23:26.300
and realize that whenever you have
link |
01:23:28.300
a highly competitive multiagent system,
link |
01:23:31.300
which social media is a classic example of,
link |
01:23:32.900
millions of agents all trying to compete
link |
01:23:34.900
for likes and so on,
link |
01:23:35.940
and you try and bring all this complexity down
link |
01:23:39.940
into like very small metrics,
link |
01:23:41.900
such as number of likes, number of retweets,
link |
01:23:44.860
whatever the algorithm optimizes for,
link |
01:23:46.700
that is a guaranteed recipe for this stuff to go wrong
link |
01:23:49.780
and become a race to the bottom.
link |
01:23:51.540
I think there should be an honesty when founders,
link |
01:23:53.900
I think there's a hunger for that kind of transparency
link |
01:23:56.380
of like, we don't know what the fuck we're doing.
link |
01:23:58.180
This is a fascinating experiment.
link |
01:23:59.700
We're all running as a human civilization.
link |
01:24:02.780
Let's try this out.
link |
01:24:04.260
And like, actually just be honest about this,
link |
01:24:06.380
that we're all like these weird rats in a maze.
link |
01:24:10.420
None of us are controlling it.
link |
01:24:12.380
There's this kind of sense like the founders,
link |
01:24:15.100
the CEO of Instagram or whatever,
link |
01:24:17.540
Mark Zuckerberg has a control and he's like,
link |
01:24:19.980
like with strings playing people.
link |
01:24:21.820
No, they're.
link |
01:24:22.860
He's at the mercy of this like everyone else.
link |
01:24:24.580
He's just like trying to do his best.
link |
01:24:26.540
And like, I think putting on a smile
link |
01:24:29.260
and doing over polished videos
link |
01:24:32.980
about how Instagram and Facebook are good for you,
link |
01:24:36.860
I think is not the right way to actually ask
link |
01:24:39.980
some of the deepest questions we get to ask as a society.
link |
01:24:43.180
How do we design the game such that we build a better world?
link |
01:24:48.100
I think a big part of this as well is people,
link |
01:24:51.700
there's this philosophy, particularly in Silicon Valley
link |
01:24:56.980
of well, techno optimism,
link |
01:24:58.500
technology will solve all our issues.
link |
01:25:01.300
And there's a steel man argument to that where yes,
link |
01:25:03.700
technology has solved a lot of problems
link |
01:25:05.380
and can potentially solve a lot of future ones.
link |
01:25:08.140
But it can also, it's always a double edged sword.
link |
01:25:10.820
And particularly as you know,
link |
01:25:11.900
technology gets more and more powerful
link |
01:25:13.420
and we've now got like big data
link |
01:25:15.060
and we're able to do all kinds of like
link |
01:25:17.300
psychological manipulation with it and so on.
link |
01:25:22.740
Technology is not a values neutral thing.
link |
01:25:24.940
People think, I used to always think this myself.
link |
01:25:26.980
It's like this naive view that,
link |
01:25:28.900
oh, technology is completely neutral.
link |
01:25:30.700
It's just, it's the humans that either make it good or bad.
link |
01:25:33.820
No, to the point we're at now,
link |
01:25:36.620
the technology that we are creating,
link |
01:25:38.020
they are social technologies.
link |
01:25:39.420
They literally dictate how humans now form social groups
link |
01:25:45.420
and so on beyond that.
link |
01:25:46.260
And beyond that, it also then,
link |
01:25:47.780
that gives rise to like the memes
link |
01:25:49.580
that we then like coalesce around.
link |
01:25:51.980
And that, if you have the stack that way
link |
01:25:54.380
where it's technology driving social interaction,
link |
01:25:56.940
which then drives like memetic culture
link |
01:26:00.460
and like which ideas become popular, that's Moloch.
link |
01:26:04.780
And we need the other way around, we need it.
link |
01:26:06.820
So we need to figure out what are the good memes?
link |
01:26:08.420
What are the good values
link |
01:26:11.820
that we think we need to optimize for
link |
01:26:15.060
that like makes people happy and healthy
link |
01:26:17.180
and like keeps society as robust and safe as possible,
link |
01:26:21.340
then figure out what the social structure
link |
01:26:22.820
around those should be.
link |
01:26:23.660
And only then do we figure out technology,
link |
01:26:25.460
but we're doing the other way around.
link |
01:26:26.780
And as much as I love in many ways
link |
01:26:31.900
the culture of Silicon Valley,
link |
01:26:32.940
and like I do think that technology has,
link |
01:26:36.020
I don't wanna knock it.
link |
01:26:36.860
It's done so many wonderful things for us,
link |
01:26:38.180
same as capitalism.
link |
01:26:40.340
There are, we have to like be honest with ourselves.
link |
01:26:44.100
We're getting to a point where we are losing control
link |
01:26:47.220
of this very powerful machine that we have created.
link |
01:26:49.580
Can you redesign the machine within the game?
link |
01:26:53.220
Can you just have, can you understand the game enough?
link |
01:26:57.380
Okay, this is the game.
link |
01:26:58.860
And this is how we start to reemphasize
link |
01:27:01.900
the memes that matter,
link |
01:27:03.860
the memes that bring out the best in us.
link |
01:27:06.740
You know, like the way I try to be in real life
link |
01:27:11.340
and the way I try to be online
link |
01:27:13.260
is to be about kindness and love.
link |
01:27:15.900
And I feel like I'm sometimes get like criticized
link |
01:27:19.860
for being naive and all those kinds of things.
link |
01:27:22.040
But I feel like I'm just trying to live within this game.
link |
01:27:25.840
I'm trying to be authentic.
link |
01:27:27.520
Yeah, but also like, hey, it's kind of fun to do this.
link |
01:27:31.080
Like you guys should try this too, you know,
link |
01:27:33.900
and that's like trying to redesign
link |
01:27:37.180
some aspects of the game within the game.
link |
01:27:40.940
Is that possible?
link |
01:27:43.240
I don't know, but I think we should try.
link |
01:27:46.740
I don't think we have an option but to try.
link |
01:27:48.620
Well, the other option is to create new companies
link |
01:27:51.140
or to pressure companies that,
link |
01:27:55.300
or anyone who has control of the rules of the game.
link |
01:27:59.060
I think we need to be doing all of the above.
link |
01:28:01.100
I think we need to be thinking hard
link |
01:28:02.980
about what are the kind of positive, healthy memes.
link |
01:28:09.140
You know, as Elon said,
link |
01:28:10.900
he who controls the memes controls the universe.
link |
01:28:13.620
He said that.
link |
01:28:14.460
I think he did, yeah.
link |
01:28:16.300
But there's truth to that.
link |
01:28:17.860
It's very, there is wisdom in that
link |
01:28:19.380
because memes have driven history.
link |
01:28:22.700
You know, we are a cultural species.
link |
01:28:24.900
That's what sets us apart from chimpanzees
link |
01:28:27.140
and everything else.
link |
01:28:27.980
We have the ability to learn and evolve through culture
link |
01:28:32.500
as opposed to biology or like, you know,
link |
01:28:34.900
classic physical constraints.
link |
01:28:37.300
And that means culture is incredibly powerful
link |
01:28:40.740
and we can create and become victim
link |
01:28:43.820
to very bad memes or very good ones.
link |
01:28:46.620
But we do have some agency over which memes,
link |
01:28:49.300
you know, we, but not only put out there,
link |
01:28:51.940
but we also like subscribe to.
link |
01:28:54.540
So I think we need to take that approach.
link |
01:28:56.460
We also need to, you know,
link |
01:28:58.980
because I don't want, I'm making this video right now
link |
01:29:02.740
called The Attention Wars,
link |
01:29:03.740
which is about like how Moloch,
link |
01:29:05.460
like the media machine is this Moloch machine.
link |
01:29:08.500
Well, it's this kind of like blind dumb thing
link |
01:29:11.580
where everyone is optimizing for engagement
link |
01:29:13.140
in order to win their share of the attention pie.
link |
01:29:16.420
And then if you zoom out,
link |
01:29:17.300
it's really like Moloch that's pulling the strings
link |
01:29:19.180
because the only thing that benefits from this in the end,
link |
01:29:20.860
you know, like our information ecosystem is breaking down.
link |
01:29:23.980
Like we have, you look at the state of the US,
link |
01:29:26.420
it's in, we're in a civil war.
link |
01:29:28.340
It's just not a physical war.
link |
01:29:29.900
It's an information war.
link |
01:29:33.220
And people are becoming more fractured
link |
01:29:35.020
in terms of what their actual shared reality is.
link |
01:29:37.380
Like truly like an extreme left person,
link |
01:29:39.500
an extreme right person,
link |
01:29:40.420
like they literally live in different worlds
link |
01:29:43.380
in their minds at this point.
link |
01:29:45.220
And it's getting more and more amplified.
link |
01:29:47.180
And this force is like a razor blade
link |
01:29:50.260
pushing through everything.
link |
01:29:51.860
It doesn't matter how innocuous a topic is,
link |
01:29:53.340
it will find a way to split into this,
link |
01:29:55.220
you know, bifurcated culture, and it's fucking terrifying.
link |
01:29:58.260
Because that maximizes the tension.
link |
01:29:59.780
And that's like an emergent Moloch type force
link |
01:30:03.140
that takes anything, any topic
link |
01:30:06.820
and cuts through it so that it can split nicely
link |
01:30:11.140
into two groups.
link |
01:30:12.500
One that's...
link |
01:30:13.820
Well, it's whatever, yeah,
link |
01:30:16.380
all everyone is trying to do within the system
link |
01:30:17.980
is just maximize whatever gets them the most attention
link |
01:30:21.180
because they're just trying to make money
link |
01:30:22.300
so they can keep their thing going, right?
link |
01:30:24.460
And the best emotion for getting attention,
link |
01:30:29.140
well, because it's not just about attention on the internet,
link |
01:30:30.820
it's engagement, that's the key thing, right?
link |
01:30:33.020
In order for something to go viral,
link |
01:30:34.380
you need people to actually engage with it.
link |
01:30:35.940
They need to like comment or retweet or whatever.
link |
01:30:39.740
And of all the emotions that,
link |
01:30:43.540
there's like seven classic shared emotions
link |
01:30:45.780
that studies have found that all humans,
link |
01:30:47.500
even from like previously uncontacted tribes have.
link |
01:30:51.020
Some of those are negative, you know, like sadness,
link |
01:30:54.620
disgust, anger, et cetera, some are positive,
link |
01:30:57.580
happiness, excitement, and so on.
link |
01:31:01.140
The one that happens to be the most useful
link |
01:31:03.540
for the internet is anger.
link |
01:31:05.540
Because anger, it's such an active emotion.
link |
01:31:09.780
If you want people to engage, if someone's scared,
link |
01:31:13.060
and I'm not just like talking out my ass here,
link |
01:31:14.700
there are studies here that have looked into this.
link |
01:31:18.540
Whereas like if someone's like,
link |
01:31:19.580
disgusted or fearful, they actually tend to then be like,
link |
01:31:22.420
oh, I don't wanna deal with this.
link |
01:31:23.500
So they're less likely to actually engage
link |
01:31:25.100
and share it and so on, they're just gonna be like, ugh.
link |
01:31:27.180
Whereas if they're enraged by a thing,
link |
01:31:29.700
well now that triggers all the like,
link |
01:31:31.980
the old tribalism emotions.
link |
01:31:34.820
And so that's how then things get sort of spread,
link |
01:31:37.260
you know, much more easily.
link |
01:31:38.100
They out compete all the other memes in the ecosystem.
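The meme-competition dynamic described here can be sketched as a toy replicator model. This is purely illustrative; the engagement numbers and the model itself are my assumptions, not anything stated in the conversation:

```python
# Toy replicator sketch of attention-share competition between memes.
# Illustrative numbers only: memes gain share of attention in proportion
# to the engagement they trigger, so the anger-triggering meme
# out-competes the others over repeated rounds.

engagement = {"anger": 1.5, "happiness": 1.1, "sadness": 0.9}
share = {m: 1 / 3 for m in engagement}  # start with equal attention share

for _ in range(50):  # repeated rounds of the attention market
    avg = sum(share[m] * engagement[m] for m in engagement)
    # replicator update: grow each meme's share by its relative fitness
    share = {m: share[m] * engagement[m] / avg for m in engagement}

# The highest-engagement meme ends up dominating the attention pie.
assert max(share, key=share.get) == "anger"
assert share["anger"] > 0.9
```

The point of the sketch is that no individual chooses anger; it wins simply because the selection pressure is engagement.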
link |
01:31:42.180
And so this like, the attention economy,
link |
01:31:46.180
the wheels that make it go around are,
link |
01:31:48.340
is rage.
link |
01:31:49.740
I did a tweet, the problem with raging against the machine
link |
01:31:53.860
is that the machine has learned to feed off rage.
link |
01:31:56.020
Because it is feeding off our rage.
link |
01:31:57.860
That's the thing that's now keeping it going.
link |
01:31:59.180
So the more we get angry, the worse it gets.
link |
01:32:01.900
So the Moloch in this attention,
link |
01:32:04.700
in the war of attention is constantly maximizing rage.
link |
01:32:10.260
What it is optimizing for is engagement.
link |
01:32:12.420
And it happens to be that engagement
link |
01:32:14.300
is more propaganda, you know.
link |
01:32:20.460
I mean, it just sounds like everything is putting,
link |
01:32:23.220
more and more things are being put through this
link |
01:32:24.500
like propagandist lens of winning
link |
01:32:27.020
whatever the war is in question.
link |
01:32:28.900
Whether it's the culture war or the Ukraine war, yeah.
link |
01:32:31.420
Well, I think the silver lining of this,
link |
01:32:33.260
do you think it's possible that in the long arc
link |
01:32:36.540
of this process, you actually do arrive
link |
01:32:39.580
at greater wisdom and more progress?
link |
01:32:41.820
It just, in the moment, it feels like people are
link |
01:32:45.100
tearing each other to shreds over ideas.
link |
01:32:47.780
But if you think about it, one of the magic things
link |
01:32:49.580
about democracy and so on, is you have
link |
01:32:51.740
the blue versus red constantly fighting.
link |
01:32:53.900
It's almost like they're in discourse,
link |
01:32:58.180
creating devil's advocate, making devils out of each other.
link |
01:33:01.300
And through that process, discussing ideas.
link |
01:33:04.900
Like almost really embodying different ideas
link |
01:33:07.740
just to yell at each other.
link |
01:33:08.860
And through the yelling, over the period of decades,
link |
01:33:11.660
maybe centuries, figuring out a better system.
link |
01:33:15.380
Like in the moment, it feels fucked up.
link |
01:33:17.580
But in the long arc, it actually is productive.
link |
01:33:20.820
I hope so.
link |
01:33:23.340
That said, we are now in the era of,
link |
01:33:28.460
just as we have weapons of mass destruction
link |
01:33:30.540
with nuclear weapons, you know,
link |
01:33:32.420
that can break the whole playing field,
link |
01:33:35.260
we now are developing weapons
link |
01:33:37.700
of informational mass destruction.
link |
01:33:39.420
Information weapons, you know, WMDs
link |
01:33:41.940
that basically can be used for propaganda
link |
01:33:44.860
or just manipulating people however is needed,
link |
01:33:49.540
whether that's through dumb TikTok videos,
link |
01:33:53.100
or, you know, there are significant resources being put in.
link |
01:33:59.100
I don't mean to sound like, you know,
link |
01:34:01.500
to doom and gloom, but there are bad actors out there.
link |
01:34:04.300
That's the thing, there are plenty of good actors
link |
01:34:06.020
within the system who are just trying to stay afloat
link |
01:34:07.780
in the game, so effectively doing Moloch-y things.
link |
01:34:09.820
But then on top of that, we have actual bad actors
link |
01:34:12.540
who are intentionally trying to, like,
link |
01:34:14.980
manipulate the other side into doing things.
link |
01:34:17.420
And using, so because it's a digital space,
link |
01:34:19.660
they're able to use artificial actors, meaning bots.
link |
01:34:24.500
Exactly, botnets, you know,
link |
01:34:26.180
and this is a whole new situation
link |
01:34:29.780
that we've never had before.
link |
01:34:30.900
Yeah, it's exciting.
link |
01:34:32.340
You know what I want to do?
link |
01:34:34.020
You know what I want to do that,
link |
01:34:36.180
because there is, you know, people are talking about bots
link |
01:34:38.140
manipulating and, like, malicious bots
link |
01:34:41.580
that are basically spreading propaganda.
link |
01:34:43.580
I want to create, like, a bot army for, like,
link |
01:34:46.580
that fights that. For love?
link |
01:34:47.620
Yeah, exactly, for love, that fights, that, I mean.
link |
01:34:50.940
You know, there's, I mean, there's truth
link |
01:34:52.220
to fight fire with fire, it's like,
link |
01:34:54.140
but how you always have to be careful
link |
01:34:57.100
whenever you create, again, like,
link |
01:34:59.940
Moloch is very tricky. Yeah, yeah.
link |
01:35:01.860
Hitler was trying to spread the love, too.
link |
01:35:03.740
Well, yeah, so we thought, but, you know, I agree with you
link |
01:35:07.020
that, like, that is a thing that should be considered,
link |
01:35:09.140
but there is, again, everyone,
link |
01:35:11.460
the road to hell is paved with good intentions.
link |
01:35:13.580
And this is, there's always unforeseen circumstances,
link |
01:35:18.100
you know, outcomes, externalities
link |
01:35:20.380
of you trying to adopt a thing,
link |
01:35:21.420
even if you do it in the very best of faith.
link |
01:35:23.580
But you can learn lessons of history.
link |
01:35:25.020
If you can run some sims on it first, absolutely.
link |
01:35:28.540
But also there's certain aspects of a system,
link |
01:35:30.980
as we've learned through history, that do better than others.
link |
01:35:33.980
Like, for example, don't have a dictator,
link |
01:35:36.100
so, like, if I were to create this bot army,
link |
01:35:39.620
it's not good for me to have full control over it.
link |
01:35:42.900
Because in the beginning, I might have a good understanding
link |
01:35:45.340
of what's good and not, but over time,
link |
01:35:47.620
that starts to get deviated,
link |
01:35:49.140
because I'll get annoyed at some assholes,
link |
01:35:50.700
and I'll think, okay, wouldn't it be nice
link |
01:35:52.540
to get rid of those assholes?
link |
01:35:53.980
But then that power starts getting to your head,
link |
01:35:55.660
you become corrupted, that's basic human nature.
link |
01:35:58.020
So distribute the power somehow.
link |
01:35:59.940
We need a love botnet on a DAO.
link |
01:36:05.780
A DAO love botnet.
link |
01:36:07.780
Yeah, and without a leader, like without...
link |
01:36:10.700
Well, exactly, a distributed, right,
link |
01:36:12.340
but yeah, without any kind of centralized...
link |
01:36:14.300
Yeah, without even, you know,
link |
01:36:16.100
basically it's the more control,
link |
01:36:17.500
the more you can decentralize the control of a thing
link |
01:36:21.420
to people, you know, but the balance...
link |
01:36:25.180
But then you still need the ability to coordinate,
link |
01:36:26.700
because that's the issue when something is too,
link |
01:36:29.420
you know, that's really, to me, like the culture wars,
link |
01:36:32.460
the bigger war we're dealing with is actually between
link |
01:36:35.660
the sort of the, I don't know what even the term is for it,
link |
01:36:39.580
but like centralization versus decentralization.
link |
01:36:41.980
That's the tension we're seeing.
link |
01:36:44.100
Power and control by a few versus completely distributed.
link |
01:36:48.220
And the trouble is if you have a fully centralized thing,
link |
01:36:50.860
then you're at risk of tyranny, you know,
link |
01:36:52.700
Stalin type things can happen, or completely distributed.
link |
01:36:56.860
Now you're at risk of complete anarchy and chaos
link |
01:36:58.500
where you can't even coordinate, like, you know,
link |
01:37:01.100
when there's like a pandemic or anything like that.
link |
01:37:03.220
So it's like, what is the right balance to strike
link |
01:37:05.780
between these two structures?
link |
01:37:08.100
Can't Moloch really take hold
link |
01:37:09.660
in a fully decentralized system?
link |
01:37:11.060
That's one of the dangers too.
link |
01:37:12.500
Yes, very vulnerable to Moloch.
link |
01:37:14.420
So a dictator can commit huge atrocities,
link |
01:37:17.740
but they can also make sure the infrastructure works
link |
01:37:20.980
and trains run on time.
link |
01:37:23.180
They have that God's eye view at least.
link |
01:37:24.820
They have the ability to create like laws and rules
link |
01:37:27.340
that like force coordination, which stops Moloch.
link |
01:37:30.900
But then you're vulnerable to that dictator
link |
01:37:33.260
getting infected with like this,
link |
01:37:34.660
with some kind of psychopathy type thing.
link |
01:37:36.940
What's reverse Moloch?
link |
01:37:39.460
Sorry, great question.
link |
01:37:40.580
So that's where, so I've been working on this series.
link |
01:37:46.220
It's been driving me insane for the last year and a half.
link |
01:37:48.260
I did the first one a year ago.
link |
01:37:50.100
I can't believe it's nearly been a year.
link |
01:37:52.060
The second one, hopefully will be coming out
link |
01:37:53.500
in like a month.
link |
01:37:54.340
And my goal at the end of the series is to like present,
link |
01:37:58.700
cause basically I'm painting the picture of like
link |
01:38:00.180
what Moloch is and how it's affecting
link |
01:38:02.220
almost all these issues in our society
link |
01:38:04.300
and how it's driving.
link |
01:38:06.420
It's like kind of the generator function
link |
01:38:07.940
as people describe it of existential risk.
link |
01:38:11.020
And then at the end of that.
link |
01:38:11.860
Wait, wait, the generator function of existential risk.
link |
01:38:14.180
So you're saying Moloch is sort of the engine
link |
01:38:16.460
that creates a bunch of X risks.
link |
01:38:19.660
Yes, not all of them.
link |
01:38:20.540
Like a, you know, a.
link |
01:38:22.940
Just a cool phrase, generator function.
link |
01:38:24.620
It's not my phrase.
link |
01:38:25.460
It's Daniel Schmachtenberger.
link |
01:38:26.780
Oh, Schmachtenberger.
link |
01:38:27.620
I got that from him.
link |
01:38:28.460
Of course.
link |
01:38:29.300
All things, it's like all roads lead back
link |
01:38:31.380
to Daniel Schmachtenberger, I think.
link |
01:38:32.740
The dude is, the dude is brilliant.
link |
01:38:35.220
He's really brilliant.
link |
01:38:36.060
After that it's Mark Twain.
link |
01:38:37.260
But anyway, sorry.
link |
01:38:39.540
Totally rude interruptions from me.
link |
01:38:41.140
No, it's fine.
link |
01:38:42.020
So not all X risks.
link |
01:38:43.100
So like an asteroid technically isn't
link |
01:38:45.460
because it's, you know, it's just like
link |
01:38:47.940
this one big external thing.
link |
01:38:49.500
It's not like a competition thing going on.
link |
01:38:51.580
But, you know, synthetic bio, you know,
link |
01:38:54.700
bio weapons, that's one because everyone's incentivized
link |
01:38:57.540
to build, even for defense, you know,
link |
01:38:59.740
bad, bad viruses, you know,
link |
01:39:01.460
just to threaten someone else, et cetera.
link |
01:39:03.820
Or AI, technically the race to AGI
link |
01:39:05.820
is kind of potentially a Moloch-y situation.
link |
01:39:09.620
But yeah, so if Moloch is this like generator function
link |
01:39:15.060
that's driving all of these issues
link |
01:39:16.780
over the coming century that might wipe us out,
link |
01:39:19.100
what's the inverse?
link |
01:39:20.340
And so far what I've gotten to is this character
link |
01:39:24.100
that I want to put out there called Winwin.
link |
01:39:26.660
Because Moloch is the God of lose, lose, ultimately.
link |
01:39:28.660
It masquerades as the God of win, lose,
link |
01:39:30.260
but in reality it's lose, lose.
link |
01:39:31.780
Everyone ends up worse off.
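The lose-lose structure being described here is essentially a prisoner's dilemma: each actor's best response is to defect, yet mutual defection leaves everyone worse off than mutual cooperation. A minimal sketch with illustrative payoff numbers (my assumption, not from the conversation):

```python
# Prisoner's-dilemma sketch of "masquerades as win-lose, is really lose-lose".
# Payoff numbers are illustrative, not from the conversation.

PAYOFFS = {  # (my move, their move) -> my payoff
    ("cooperate", "cooperate"): 3,
    ("cooperate", "defect"): 0,
    ("defect", "cooperate"): 5,
    ("defect", "defect"): 1,
}

def best_response(their_move):
    """Pick the move that maximizes my payoff, holding theirs fixed."""
    return max(("cooperate", "defect"), key=lambda m: PAYOFFS[(m, their_move)])

# Defecting looks like the winning move no matter what the other side does...
assert best_response("cooperate") == "defect"
assert best_response("defect") == "defect"

# ...so both defect, and both get 1 instead of the 3 mutual cooperation gives:
assert PAYOFFS[("defect", "defect")] < PAYOFFS[("cooperate", "cooperate")]
```

The individually rational move produces the collectively worst stable outcome, which is the Moloch dynamic in miniature.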
link |
01:39:34.380
So I was like, well, what's the opposite of that?
link |
01:39:35.860
It's Winwin.
link |
01:39:36.700
And I was thinking for ages, like,
link |
01:39:37.660
what's a good name for this character?
link |
01:39:39.740
And then tomorrow I was like, okay, well,
link |
01:39:42.380
don't try and, you know, think through it logically.
link |
01:39:44.700
What's the vibe of Winwin?
link |
01:39:46.540
And to me, like in my mind, Moloch is like,
link |
01:39:49.220
and I addressed that in the video,
link |
01:39:50.580
like it's red and black.
link |
01:39:52.140
It's kind of like very, you know, hyper focused
link |
01:39:55.660
on it's one goal you must win.
link |
01:39:58.660
So Winwin is kind of actually like these colors.
link |
01:40:01.420
It's like purple, turquoise.
link |
01:40:04.140
It's loves games too.
link |
01:40:06.700
It loves a little bit of healthy competition,
link |
01:40:08.420
but constrained, like kind of like before,
link |
01:40:10.100
like knows how to ring fence zero sum competition
link |
01:40:12.860
into like just the right amount,
link |
01:40:14.940
whereby its externalities can be controlled
link |
01:40:17.460
and kept positive.
link |
01:40:19.180
And then beyond that, it also loves cooperation,
link |
01:40:21.260
coordination, love, all these other things.
link |
01:40:23.940
But it's also kind of like mischievous,
link |
01:40:26.780
like, you know, it will have a good time.
link |
01:40:28.500
It's not like kind of like boring, you know,
link |
01:40:30.300
like, oh God, it knows how to have fun.
link |
01:40:33.460
It can get like, it can get down,
link |
01:40:36.020
but ultimately it's like unbelievably wise
link |
01:40:39.740
and it just wants the game to keep going.
link |
01:40:42.740
And I call it Winwin.
link |
01:40:44.580
That's a good like pet name, Winwin.
link |
01:40:47.500
I think the, Winwin, right?
link |
01:40:49.900
And I think it's formal name when it has to do
link |
01:40:51.740
like official functions is Omnia.
link |
01:40:55.020
Omnia.
link |
01:40:55.860
Yeah.
link |
01:40:56.700
From like omniscience kind of, why Omnia?
link |
01:40:59.540
You just like Omnia?
link |
01:41:00.380
She's like Omniwin.
link |
01:41:01.220
Omniwin.
link |
01:41:02.060
But I'm open to suggestions.
link |
01:41:02.900
I would like, you know, and this is.
link |
01:41:04.100
I like Omnia.
link |
01:41:05.100
Yeah.
link |
01:41:05.940
But there is an angelic kind of sense to Omnia though.
link |
01:41:08.740
So Winwin is more fun.
link |
01:41:09.980
So it's more like, it embraces the fun aspect.
link |
01:41:14.980
The fun aspect.
link |
01:41:16.420
I mean, there is something about sort of,
link |
01:41:20.500
there's some aspect to Winwin interactions
link |
01:41:23.500
that requires embracing the chaos of the game
link |
01:41:31.580
and enjoying the game itself.
link |
01:41:33.740
I don't know.
link |
01:41:34.580
I don't know what that is.
link |
01:41:35.420
That's almost like a Zen like appreciation
link |
01:41:37.660
of the game itself, not optimizing
link |
01:41:40.500
for the consequences of the game.
link |
01:41:42.460
Right, well, it's recognizing the value
link |
01:41:45.260
of competition in and of itself,
link |
01:41:47.180
it's not like about winning.
link |
01:41:48.700
It's about you enjoying the process of having a competition
link |
01:41:51.460
and not knowing whether you're gonna win
link |
01:41:52.620
or lose this little thing.
link |
01:41:53.900
But then also being aware that, you know,
link |
01:41:56.580
what's the boundary?
link |
01:41:57.420
How big do I want competition to be?
link |
01:41:59.020
Because one of the reason why Moloch is doing so well now
link |
01:42:02.660
in our society, in our civilization is
link |
01:42:04.820
because we haven't been able to ring fence competition.
link |
01:42:07.820
You know, and so it's just having all
link |
01:42:09.380
these negative externalities
link |
01:42:10.580
and it's, we've completely lost control of it.
link |
01:42:13.420
You know, it's, I think my guess is,
link |
01:42:16.780
and now we're getting really like,
link |
01:42:18.620
you know, metaphysical technically,
link |
01:42:20.860
but I think we'll be in a more interesting universe
link |
01:42:26.780
if we have one that has both pure cooperation,
link |
01:42:29.700
you know, lots of cooperation
link |
01:42:31.340
and some pockets of competition
link |
01:42:32.980
than one that's purely competition or purely cooperation entirely.
link |
01:42:36.220
Like it's good to have some little zero sumness bits,
link |
01:42:39.340
but I don't know that fully
link |
01:42:41.500
and I'm not qualified as a philosopher to know that.
link |
01:42:44.100
And that's what reverse Moloch,
link |
01:42:45.460
so this kind of win, win creature is a system,
link |
01:42:49.620
is an antidote to the Moloch system.
link |
01:42:52.140
Yes.
link |
01:42:53.580
And I don't know how it's gonna do that.
link |
01:42:57.620
But it's good to kind of try to start
link |
01:42:59.980
to formulate different ideas,
link |
01:43:01.460
different frameworks of how we think about that.
link |
01:43:03.820
Exactly.
link |
01:43:04.660
At the small scale of a collection of individuals
link |
01:43:07.180
and a large scale of a society.
link |
01:43:09.100
Exactly.
link |
01:43:09.940
It's a meme, I think it's an example of a good meme.
link |
01:43:13.380
And I'm open, I'd love to hear feedback from people
link |
01:43:15.540
if they think it's, you know, they have a better idea
link |
01:43:17.620
or it's not, you know,
link |
01:43:18.460
but it's the direction of memes that we need to spread,
link |
01:43:21.660
this idea of like, look for the win-wins in life.
link |
01:43:25.140
Well, on the topic of beauty filters,
link |
01:43:27.220
so in that particular context
link |
01:43:28.660
where Moloch creates negative consequences,
link |
01:43:35.140
Dostoevsky said beauty will save the world.
link |
01:43:37.180
What is beauty anyway?
link |
01:43:41.260
It would be nice to just try to discuss
link |
01:43:45.340
what kind of thing we would like to converge towards
link |
01:43:49.300
in our understanding of what is beautiful.
link |
01:43:55.900
So to me, I think something is beautiful
link |
01:43:59.220
when it can't be reduced down to easy metrics.
link |
01:44:04.220
Like if you think of a tree, what is it about a tree,
link |
01:44:07.820
like a big, ancient, beautiful tree, right?
link |
01:44:09.420
What is it about it that we find so beautiful?
link |
01:44:11.780
It's not, you know, the sweetness of its fruit
link |
01:44:17.700
or the value of its lumber.
link |
01:44:20.140
It's this entirety of it that is,
link |
01:44:25.940
there's these immeasurable qualities,
link |
01:44:28.020
it's like almost like a qualia of it.
link |
01:44:30.020
That's both, like it walks this fine line between pattern,
link |
01:44:33.340
well, it's got lots of patternicity,
link |
01:44:34.620
but it's not overly predictable.
link |
01:44:36.660
Again, it walks this fine line between order and chaos.
link |
01:44:39.100
It's a very highly complex system.
link |
01:44:45.380
It's evolving over time,
link |
01:44:47.420
the definition of a complex versus,
link |
01:44:49.140
and this is another Schmachtenberger thing,
link |
01:44:50.860
a complex versus a complicated system.
link |
01:44:53.500
A complicated system can be sort of broken down
link |
01:44:55.660
into bits and pieces,
link |
01:45:00.340
understood and then put back together.
link |
01:45:01.980
A complex system is kind of like a black box.
link |
01:45:04.740
It does all this crazy stuff,
link |
01:45:06.700
but if you take it apart,
link |
01:45:07.940
you can't put it back together again,
link |
01:45:09.060
because there's all these intricacies.
link |
01:45:11.540
And also very importantly, like there's some of the parts,
link |
01:45:14.740
sorry, the sum of the whole is much greater
link |
01:45:16.340
than the sum of the parts.
link |
01:45:17.620
And that's where the beauty lies, I think.
link |
01:45:21.100
And I think that extends to things like art as well.
link |
01:45:23.300
Like there's something immeasurable about it.
link |
01:45:27.380
There's something we can't break down to a narrow metric.
link |
01:45:29.740
Does that extend to humans, you think?
link |
01:45:31.580
Yeah, absolutely.
link |
01:45:33.580
So how can Instagram reveal that kind of beauty,
link |
01:45:37.180
the complexity of a human being?
link |
01:45:39.060
Good question.
link |
01:45:41.820
And this takes us back to dating sites and Goodreads,
link |
01:45:44.580
I think.
link |
01:45:46.420
Very good question.
link |
01:45:47.500
I mean, well, I know what it shouldn't do.
link |
01:45:50.020
It shouldn't try and like, right now,
link |
01:45:52.980
you know, I was talking to like a social media expert
link |
01:45:56.180
recently, because I was like, oh, I hate that.
link |
01:45:58.020
There's such a thing as a social media expert?
link |
01:45:59.860
Oh, yeah, there are like agencies out there
link |
01:46:02.020
that you can like outsource,
link |
01:46:03.220
because I'm thinking about working with one to like,
link |
01:46:06.980
I want to start a podcast.
link |
01:46:09.260
You should, you should have done it a long time ago.
link |
01:46:11.860
Working on it.
link |
01:46:12.860
It's going to be called Win Win.
link |
01:46:14.660
And it's going to be about this like positive sum stuff.
link |
01:46:17.420
And the thing that, you know, they always come back and say,
link |
01:46:21.140
is like, well, you need to like figure out
link |
01:46:22.980
what your thing is.
link |
01:46:24.060
You know, you need to narrow down what your thing is
link |
01:46:26.420
and then just follow that.
link |
01:46:27.980
Have like a sort of a formula,
link |
01:46:31.060
because that's what people want.
link |
01:46:32.180
They want to know that they're coming back
link |
01:46:33.220
to the same thing.
link |
01:46:34.260
And that's the advice on YouTube, Twitter, you name it.
link |
01:46:37.420
And that's why, and the trouble with that
link |
01:46:40.180
is that it's a complexity reduction.
link |
01:46:42.460
And generally speaking, you know,
link |
01:46:44.100
complexity reduction is bad.
link |
01:46:45.340
It's making things more, it's an oversimplification.
link |
01:46:48.020
Not that simplification is always a bad thing.
link |
01:46:51.380
But when you're trying to take, you know,
link |
01:46:55.500
what is social media doing?
link |
01:46:56.540
It's trying to like encapsulate the human experience
link |
01:47:00.100
and put it into digital form and commodify it to an extent.
link |
01:47:04.700
That, so you do that, you compress people down
link |
01:47:07.580
into these like narrow things.
link |
01:47:09.500
And that's why I think it's kind of ultimately
link |
01:47:12.700
fundamentally incompatible with at least
link |
01:47:14.460
my definition of beauty.
link |
01:47:15.300
It's interesting because there is some sense in which
link |
01:47:21.020
a simplification sort of in the Einstein kind of sense
link |
01:47:25.180
of a really complex idea, a simplification in a way
link |
01:47:29.300
that still captures some core power of an idea of a person
link |
01:47:33.900
is also beautiful.
link |
01:47:36.100
And so maybe it's possible for social media to do that.
link |
01:47:39.340
A presentation, a sort of a sliver, a slice,
link |
01:47:44.340
a look into a person's life that reveals something
link |
01:47:48.420
real about them.
link |
01:47:50.220
But in a simple way, in a way that can be displayed
link |
01:47:52.980
graphically or through words.
link |
01:47:55.340
Some way, I mean, in some way Twitter can do
link |
01:47:58.460
that kind of thing.
link |
01:47:59.700
A very few set of words can reveal the intricacies
link |
01:48:03.540
of a person.
link |
01:48:04.660
Of course, the viral machine that spreads those words
link |
01:48:09.660
often results in people taking the thing out of context.
link |
01:48:14.660
People often don't read tweets in the context
link |
01:48:18.900
of the human being that wrote them.
link |
01:48:20.740
The full history of the tweets they've written,
link |
01:48:24.060
the education level, the humor level,
link |
01:48:26.540
the world view they're playing around with,
link |
01:48:30.380
all that context is forgotten and people just see
link |
01:48:32.460
the different words.
link |
01:48:33.620
So that can lead to trouble.
link |
01:48:35.500
But in a certain sense, if you do take it in context,
link |
01:48:39.780
it reveals some kind of quirky little beautiful idea
link |
01:48:43.260
or a profound little idea from that particular person
link |
01:48:47.180
that shows something about that person.
link |
01:48:48.540
So in that sense, Twitter can be more successful
link |
01:48:51.260
if we're talking about Moloch driving
link |
01:48:54.060
a better kind of incentive.
link |
01:48:56.940
Yeah, I mean, how they can, like if we were to rewrite,
link |
01:49:01.980
is there a way to rewrite the Twitter algorithm
link |
01:49:05.420
so that it stops being the fertile breeding ground
link |
01:49:11.620
of the culture wars?
link |
01:49:12.460
Because that's really what it is.
link |
01:49:15.300
I mean, maybe I'm giving Twitter too much power,
link |
01:49:19.340
but just the more I looked into it
link |
01:49:21.780
and I had conversations with Tristan Harris
link |
01:49:25.620
from the Center for Humane Technology.
link |
01:49:27.620
And he explained it as like,
link |
01:49:30.820
Twitter is where you have this amalgam of human culture
link |
01:49:34.100
and then this terribly designed algorithm
link |
01:49:36.180
that amplifies the craziest people
link |
01:49:39.780
and the angriest most divisive takes and amplifies them.
link |
01:49:45.500
And then the media, the mainstream media,
link |
01:49:47.740
because all the journalists are also on Twitter,
link |
01:49:49.860
they then are informed by that.
link |
01:49:52.860
And so they draw out the stories they can
link |
01:49:55.060
from this already like very boiling lava of rage
link |
01:50:00.700
and then spread that to their millions
link |
01:50:03.340
and millions of people who aren't even on Twitter.
link |
01:50:07.020
And so I honestly, I think if I could press a button,
link |
01:50:10.820
turn them off, I probably would at this point,
link |
01:50:13.580
because I just don't see a way
link |
01:50:14.740
of it being compatible with healthiness,
link |
01:50:17.300
but that's not gonna happen.
link |
01:50:19.580
And so at least one way to like stem the tide
link |
01:50:23.140
and make it less Molochy would be to change,
link |
01:50:29.980
at least if like it was on a subscription model,
link |
01:50:31.900
then it's now not optimizing for impressions.
link |
01:50:36.980
Cause basically what it wants is for people
link |
01:50:38.300
to keep coming back as often as possible.
link |
01:50:40.180
That's how they get paid, right?
link |
01:50:42.020
Every time an ad gets shown to someone
link |
01:50:43.740
and the way to do that is to get people constantly refreshing their feed.
link |
01:50:46.460
So you're trying to encourage addictive behaviors.
link |
01:50:49.380
Whereas if someone, if they moved on
link |
01:50:52.300
to at least a subscription model,
link |
01:50:53.900
then they're getting the money either way,
link |
01:50:56.940
whether someone comes back to the site once a month
link |
01:50:59.140
or 500 times a month,
link |
01:51:01.020
they get the same amount of money.
link |
01:51:02.060
So now that takes away that incentive,
link |
01:51:04.860
to use technology, to build,
link |
01:51:07.220
to design an algorithm that is maximally addictive.
link |
01:51:10.500
That would be one way, for example.
link |
01:51:12.500
Yeah, but you still want people to,
link |
01:51:14.620
yeah, I just feel like that just slows down,
link |
01:51:17.260
creates friction in the virality of things.
link |
01:51:20.940
But that's good.
link |
01:51:22.260
We need to slow down virality.
link |
01:51:24.420
It's good, it's one way.
link |
01:51:26.580
Virality is Moloch, to be clear.
link |
01:51:28.900
So Moloch is always negative then?
link |
01:51:34.300
Yes, by definition.
link |
01:51:36.500
Yes.
link |
01:51:37.340
Competition is not always negative.
link |
01:51:39.460
Competition is neutral.
link |
01:51:40.540
I disagree with you that all virality is negative then,
link |
01:51:44.180
is Moloch then.
link |
01:51:45.940
Because it's a good intuition,
link |
01:51:49.500
because we have a lot of data on virality being negative.
link |
01:51:52.900
But I happen to believe that the core of human beings,
link |
01:51:57.100
so most human beings want to be good
link |
01:52:00.380
more than they want to be bad to each other.
link |
01:52:03.340
And so I think it's possible,
link |
01:52:05.700
it might be just harder to engineer systems
link |
01:52:09.060
that enable virality,
link |
01:52:10.540
but it's possible to engineer systems that are viral
link |
01:52:13.860
that enable virality.
link |
01:52:15.700
And the kind of stuff that rises to the top
link |
01:52:19.140
is things that are positive.
link |
01:52:21.260
And positive, not like la la positive,
link |
01:52:24.180
it's more like win win,
link |
01:52:25.820
meaning a lot of people need to be challenged.
link |
01:52:28.380
Wise things, yes.
link |
01:52:29.620
You grow from it, it might challenge you,
link |
01:52:31.460
you might not like it, but you ultimately grow from it.
link |
01:52:34.740
And ultimately bring people together
link |
01:52:36.700
as opposed to tear them apart.
link |
01:52:38.460
I deeply want that to be true.
link |
01:52:40.540
And I very much agree with you that people at their core
link |
01:52:43.740
are on average good, care for each other,
link |
01:52:46.140
as opposed to not.
link |
01:52:46.980
Like I think it's actually a very small percentage
link |
01:52:50.540
of people are truly wanting to do
link |
01:52:52.900
just like destructive malicious things.
link |
01:52:54.420
Most people are just trying to win their own little game.
link |
01:52:56.500
And they don't mean to be,
link |
01:52:57.900
they're just stuck in this badly designed system.
link |
01:53:01.980
That said, the current structure, yes,
link |
01:53:04.620
is the current structure means that virality
link |
01:53:09.140
is optimized towards Moloch.
link |
01:53:10.780
That doesn't mean there aren't exceptions.
link |
01:53:12.380
Sometimes positive stories do go viral
link |
01:53:14.060
and I think we should study them.
link |
01:53:15.140
I think there should be a whole field of study
link |
01:53:17.060
into understanding, identifying memes
link |
01:53:20.340
that above a certain threshold of the population
link |
01:53:24.020
agree is a positive, happy, bringing people together meme.
link |
01:53:27.340
The kind of thing that brings families together
link |
01:53:29.860
that would normally argue about cultural stuff
link |
01:53:31.700
at the table, at the dinner table.
link |
01:53:34.820
Identify those memes and figure out what it was,
link |
01:53:37.420
what was the ingredient that made them spread that day.
link |
01:53:41.380
And also like not just like happiness
link |
01:53:44.740
and connection between humans,
link |
01:53:45.940
but connection between humans in other ways
link |
01:53:49.580
that enables like productivity, like cooperation,
link |
01:53:52.700
solving difficult problems and all those kinds of stuff.
link |
01:53:56.180
So it's not just about let's be happy
link |
01:53:59.020
and have a fulfilling lives.
link |
01:54:00.820
It's also like, let's build cool shit.
link |
01:54:03.260
Let's get excited.
link |
01:54:04.100
Which is the spirit of collaboration,
link |
01:54:05.100
which is deeply anti Moloch, right?
link |
01:54:06.700
That's, it's not using competition.
link |
01:54:09.300
It's like, Moloch hates collaboration and coordination
link |
01:54:13.140
and people working together.
link |
01:54:14.580
And that's, again, like the internet started out as that
link |
01:54:18.060
and it could have been that,
link |
01:54:20.660
but because of the way it was sort of structured
link |
01:54:23.380
in terms of, you know, very lofty ideals,
link |
01:54:26.860
they wanted everything to be open source,
link |
01:54:28.980
open source and also free.
link |
01:54:30.620
And, but they needed to find a way to pay the bills anyway,
link |
01:54:32.820
because they were still building this
link |
01:54:33.860
on top of our old economics system.
link |
01:54:36.300
And so the way they did that
link |
01:54:37.620
was through third party advertisement.
link |
01:54:40.260
But that meant that things were very decoupled.
link |
01:54:42.780
You know, you've got this third party interest,
link |
01:54:45.620
which means that you're then like,
link |
01:54:47.460
people having to optimize for that.
link |
01:54:48.980
And that is, you know, the actual consumer
link |
01:54:51.300
is actually the product,
link |
01:54:52.980
not the person you're making the thing for.
link |
01:54:56.940
In the end, you start making the thing for the advertiser.
link |
01:54:59.820
And so that's why it then like breaks down.
link |
01:55:03.660
Yeah, like it's, there's no clean solution to this.
link |
01:55:08.260
And I, it's a really good suggestion by you actually
link |
01:55:11.140
to like figure out how we can optimize virality
link |
01:55:16.100
for positive sum topics.
link |
01:55:19.540
I shall be the general of the love bot army.
link |
01:55:24.900
Distributed.
link |
01:55:26.220
Distributed, distributed, no, okay, yeah.
link |
01:55:28.420
The power, just even in saying that,
link |
01:55:30.220
the power already went to my head.
link |
01:55:32.220
No, okay, you've talked about quantifying your thinking.
link |
01:55:35.940
We've been talking about this,
link |
01:55:36.980
sort of a game theoretic view on life
link |
01:55:39.340
and putting probabilities behind estimates.
link |
01:55:42.260
Like if you think about different trajectories
link |
01:55:44.260
you can take through life,
link |
01:55:45.660
just actually analyzing life in game theoretic way,
link |
01:55:48.340
like your own life, like personal life.
link |
01:55:50.820
I think you've given an example
link |
01:55:52.820
that you had an honest conversation with Igor
link |
01:55:54.700
about like, how long is this relationship gonna last?
link |
01:55:57.780
So similar to our sort of marriage problem
link |
01:56:00.140
kind of discussion, having an honest conversation
link |
01:56:03.180
about the probability of things
link |
01:56:05.620
that we sometimes are a little bit too shy
link |
01:56:08.580
or scared to think of in probabilistic terms.
link |
01:56:11.820
Can you speak to that kind of way of reasoning,
link |
01:56:15.060
the good and the bad of that?
link |
01:56:17.020
Can you do this kind of thing with human relations?
link |
01:56:20.900
Yeah, so the scenario you're talking about, it was like.
link |
01:56:25.020
Yeah, tell me about that scenario.
link |
01:56:26.020
Yeah, I think it was about a year into our relationship
link |
01:56:30.900
and we were having a fairly heavy conversation
link |
01:56:34.500
because we were trying to figure out
link |
01:56:35.460
whether or not I was gonna sell my apartment.
link |
01:56:37.580
Well, you know, he had already moved in,
link |
01:56:40.180
but I think we were just figuring out
link |
01:56:41.780
what like our longterm plans would be.
link |
01:56:43.660
Should we buy a place together, et cetera.
link |
01:56:46.340
When you guys are having that conversation,
link |
01:56:47.700
are you like drunk out of your mind on wine
link |
01:56:49.820
or is he sober and you're actually having a serious?
link |
01:56:52.780
I think we were sober.
link |
01:56:53.700
How do you get to that conversation?
link |
01:56:54.940
Because most people are kind of afraid
link |
01:56:56.180
to have that kind of serious conversation.
link |
01:56:58.780
Well, so our relationship was very,
link |
01:57:01.540
well, first of all, we were good friends
link |
01:57:03.360
for a couple of years before we even got romantic.
link |
01:57:08.340
And when we did get romantic,
link |
01:57:12.460
it was very clear that this was a big deal.
link |
01:57:15.780
It wasn't just like another, it wasn't a random thing.
link |
01:57:20.420
So the probability of it being a big deal was high.
link |
01:57:22.900
It was already very high.
link |
01:57:24.220
And then we'd been together for a year
link |
01:57:26.220
and it had been pretty golden and wonderful.
link |
01:57:28.820
So, you know, there was a lot of foundation already
link |
01:57:32.620
where we felt very comfortable
link |
01:57:33.820
having a lot of frank conversations.
link |
01:57:35.220
But Igor's MO has always been much more that way than mine.
link |
01:57:38.460
He was always from the outset,
link |
01:57:39.980
like just in a relationship,
link |
01:57:42.600
radical transparency and honesty is the way
link |
01:57:45.340
because the truth is the truth,
link |
01:57:47.100
whether you want to hide it or not, you know,
link |
01:57:48.420
but it will come out eventually.
link |
01:57:50.300
And if you aren't able to accept difficult things yourself,
link |
01:57:56.540
then how could you possibly expect to be like
link |
01:57:58.540
the most integral version that, you know,
link |
01:58:00.420
you can be? The relationship needs this bedrock
link |
01:58:02.940
of like honesty as a foundation more than anything.
link |
01:58:06.620
Yeah, that's really interesting,
link |
01:58:07.700
but I would like to push against some of those ideas,
link |
01:58:09.960
but that's for down the line, yes, throw them up.
link |
01:58:13.780
I just rudely interrupt.
link |
01:58:15.260
That was fine.
link |
01:58:16.780
And so, you know, we'd been together for about a year
link |
01:58:19.860
and things were good
link |
01:58:20.700
and we were having this hard conversation
link |
01:58:23.460
and then he was like, well, okay,
link |
01:58:25.300
what's the likelihood that we're going to be together
link |
01:58:27.300
in three years then?
link |
01:58:28.300
Because I think it was roughly a three year time horizon.
link |
01:58:31.060
And I was like, ooh, ooh, interesting.
link |
01:58:32.900
And then we were like, actually wait,
link |
01:58:34.260
before you said out loud,
link |
01:58:35.100
let's both write down our predictions formally
link |
01:58:37.760
because we'd been like,
link |
01:58:38.600
we're just getting into like effective altruism
link |
01:58:40.460
and rationality at the time,
link |
01:58:41.860
which is all about making formal predictions
link |
01:58:43.900
as a means of measuring your own,
link |
01:58:48.980
well, your own foresight essentially in a quantified way.
link |
01:58:53.700
So we like both wrote down our percentages
link |
01:58:55.500
and we also did a one year prediction
link |
01:58:58.660
and a 10 year one as well.
link |
01:58:59.600
So we got percentages for all three
link |
01:59:02.000
and then we showed each other.
link |
01:59:03.900
And I remember like having this moment of like, ooh,
link |
01:59:06.260
because for the 10 year one, I was like, well, I mean,
link |
01:59:08.760
I love him a lot,
link |
01:59:09.600
but like a lot can happen in 10 years,
link |
01:59:11.800
and we've only been together for,
link |
01:59:14.260
so I was like, I think it's over 50%,
link |
01:59:16.220
but it's definitely not 90%.
link |
01:59:17.900
And I remember like wrestling,
link |
01:59:19.020
I was like, oh, but I don't want him to be hurt.
link |
01:59:20.220
I don't want him to,
link |
01:59:21.400
I don't want to give a number lower than his.
link |
01:59:22.700
And I remember thinking, I was like, ah, ah, don't game it.
link |
01:59:25.260
This is an exercise in radical honesty.
link |
01:59:28.100
So just give your real percentage.
link |
01:59:29.500
And I think mine was like 75%.
link |
01:59:31.300
And then we showed each other
link |
01:59:32.400
and luckily we were fairly well aligned.
link |
01:59:34.900
And, but honestly, even if we weren't.
link |
01:59:38.640
20%.
link |
01:59:39.480
Huh?
link |
01:59:40.320
It definitely would have,
link |
01:59:41.940
I, if his had been consistently lower than mine,
link |
01:59:45.540
that would have rattled me for sure.
link |
01:59:48.140
Whereas if it had been the other way around,
link |
01:59:49.920
I think he would have,
link |
01:59:50.740
he's just kind of like a water off a duck's back type of guy.
link |
01:59:53.000
It'd be like, okay, well, all right, we'll figure this out.
link |
01:59:55.240
Well, did you guys provide error bars on the estimate?
link |
01:59:57.880
Like the level of uncertainty?
link |
01:59:58.760
They came built in.
link |
01:59:59.600
We didn't give formal plus or minus error bars.
link |
02:00:02.220
I didn't draw any or anything like that.
link |
02:00:03.560
Well, I guess that's the question I have is,
link |
02:00:05.680
did you feel informed enough to make such decisions?
link |
02:00:12.320
Cause like, I feel like if you were,
link |
02:00:13.920
if I were to do this kind of thing rigorously,
link |
02:00:17.360
I would want some data.
link |
02:00:20.520
I would want to test it. One of the assumptions you have
link |
02:00:23.080
is you're not that different from other relationships.
link |
02:00:25.360
Right.
link |
02:00:26.200
And so I want to have some data about the way.
link |
02:00:29.360
You want the base rates.
link |
02:00:30.680
Yeah, and also actual trajectories of relationships.
link |
02:00:34.400
I would love to have like time series data
link |
02:00:39.080
about the ways that relationships fall apart or prosper,
link |
02:00:43.720
how they collide with different life events,
link |
02:00:46.700
losses, job changes, moving.
link |
02:00:50.000
Both partners find jobs, only one has a job.
link |
02:00:54.520
I want that kind of data
link |
02:00:56.320
and how often the different trajectories change in life.
link |
02:00:58.960
Like how informative is your past to your future?
link |
02:01:04.400
That's the whole thing.
link |
02:01:05.240
Like can you look at my life and have a good prediction
link |
02:01:09.800
about in terms of my characteristics and my relationships,
link |
02:01:13.560
what that's going to look like in the future or not?
link |
02:01:15.880
I don't even know the answer to that question.
link |
02:01:17.200
I'd be very ill informed in terms of making the probability estimate.
link |
02:01:20.740
I would be far, yeah, I just would be under informed.
link |
02:01:25.680
I would be under informed.
link |
02:01:26.680
I'd be over biasing to my prior experiences, I think.
link |
02:01:31.240
Right, but as long as you're aware of that
link |
02:01:33.120
and you're honest with yourself,
link |
02:01:34.480
and you're honest with the other person,
link |
02:01:35.760
say, look, I have really wide error bars on this
link |
02:01:37.640
for the following reasons, that's okay.
link |
02:01:40.440
I still think it's better than not trying
link |
02:01:42.040
to quantify it at all.
link |
02:01:43.200
If you're trying to make really major
link |
02:01:45.220
irreversible life decisions.
link |
02:01:46.760
And I feel also the romantic nature of that question
link |
02:01:49.480
for me personally, I try to live my life
link |
02:01:52.800
thinking it's very close to 100%.
link |
02:01:55.440
Like, allowing myself actually the,
link |
02:01:58.400
this is the difficulty of this is allowing myself
link |
02:02:01.280
to think differently, I feel like has
link |
02:02:04.520
a psychological consequence.
link |
02:02:06.460
That's where that, that's one of my pushbacks
link |
02:02:08.400
against radical honesty is this one particular perspective.
link |
02:02:14.480
So you're saying you would rather give
link |
02:02:16.240
a falsely high percentage to your partner.
link |
02:02:20.140
Going back to the wise sage, Phil.
link |
02:02:21.840
In order to sort of create this additional optimism.
link |
02:02:24.560
Hellmuth.
link |
02:02:25.520
Yes.
link |
02:02:26.360
Of fake it till you make it.
link |
02:02:29.360
The positive, the power of positive thinking.
link |
02:02:31.200
Hashtag positivity.
link |
02:02:32.040
Yeah, hashtag, it's with a hashtag.
link |
02:02:35.320
Well, so that, and this comes back to this idea
link |
02:02:37.440
of useful fictions, right?
link |
02:02:39.760
And I agree, I don't think there's a clear answer to this.
link |
02:02:42.920
And I think it's actually quite subjective.
link |
02:02:44.240
Some people this works better for than others.
link |
02:02:48.320
You know, to be clear, Igor and I weren't doing
link |
02:02:50.120
this formal prediction in earnest.
link |
02:02:51.960
Like we did it with very much tongue in cheek.
link |
02:02:55.520
It wasn't like we were going to make,
link |
02:02:57.380
I don't think it even would have drastically changed
link |
02:03:00.160
what we decided to do even.
link |
02:03:02.160
We kind of just did it more as a fun exercise.
link |
02:03:05.200
For the consequence of that fun exercise,
link |
02:03:06.840
you really actually kind of, there was a deep honesty to it.
link |
02:03:09.920
Exactly, it was a deep, and it was, yeah,
link |
02:03:11.720
it was just like this moment of reflection.
link |
02:03:13.240
I'm like, oh wow, I actually have to think
link |
02:03:14.680
like through this quite critically and so on.
link |
02:03:17.840
And it's also what was interesting was I got to like check
link |
02:03:22.840
in with what my desires were.
link |
02:03:25.960
So there was one thing of like what my actual prediction is,
link |
02:03:28.560
but what are my desires and could these desires
link |
02:03:30.400
be affecting my predictions and so on.
link |
02:03:32.440
And you know, that's a method of rationality.
link |
02:03:34.920
And I personally don't think it loses anything in terms of,
link |
02:03:37.080
I didn't take any of the magic away
link |
02:03:38.440
from our relationship, quite the opposite.
link |
02:03:39.920
Like it brought us closer together because it was like,
link |
02:03:42.600
we did this weird fun thing that I appreciate
link |
02:03:45.360
a lot of people find quite strange.
link |
02:03:47.800
And I think it was somewhat unique in our relationship
link |
02:03:51.800
that both of us are very, we both love numbers,
link |
02:03:54.840
we both love statistics, we're both poker players.
link |
02:03:58.320
So this was kind of like our safe space anyway.
link |
02:04:01.240
For others, one partner really might not
link |
02:04:04.640
like that kind of stuff at all, in which case
link |
02:04:06.360
this is not a good exercise to do.
link |
02:04:07.840
I don't recommend it to everybody.
link |
02:04:10.600
But I do think there's, it's interesting sometimes
link |
02:04:14.280
to poke holes in and probe at these things
link |
02:04:18.920
that we consider so sacred that we can't try
link |
02:04:21.600
to quantify them, which is interesting
link |
02:04:24.720
because that's in tension with like the idea
link |
02:04:26.120
of what we just talked about with beauty
link |
02:04:27.440
and like what makes something beautiful,
link |
02:04:28.720
the fact that you can't measure everything about it.
link |
02:04:31.000
And perhaps some things shouldn't be measured,
link |
02:04:32.960
maybe it's wrong to completely try and
link |
02:04:36.400
put a utilitarian frame on it,
link |
02:04:39.200
measuring the utility of a tree in its entirety.
link |
02:04:43.320
I don't know, maybe we should, maybe we shouldn't.
link |
02:04:44.760
I'm ambivalent on that.
link |
02:04:46.680
But overall, people have too many biases.
link |
02:04:52.680
People are overly biased against trying to do
link |
02:04:57.080
like a quantified cost benefit analysis
link |
02:04:59.880
on really tough life decisions.
link |
02:05:02.680
They're like, oh, just go with your gut.
link |
02:05:04.000
It's like, well, sure, but guts, our intuitions
link |
02:05:07.200
are best suited for things that we've got tons
link |
02:05:09.160
of experience in, then we can really trust it.
link |
02:05:11.920
If it's a decision we've made many times,
link |
02:05:13.360
but if it's like, should I marry this person
link |
02:05:16.040
or should I buy this house over that house?
link |
02:05:19.040
You only make those decisions a couple of times
link |
02:05:20.700
in your life, maybe.
link |
02:05:23.000
Well, I would love to know, there's a balance,
link |
02:05:26.940
probably is a personal balance to strike
link |
02:05:29.240
is the amount of rationality you apply
link |
02:05:33.140
to a question versus the useful fiction,
link |
02:05:38.160
the fake it till you make it.
link |
02:05:40.120
For example, just talking to soldiers in Ukraine,
link |
02:05:43.560
you ask them, what's the probability of you winning,
link |
02:05:49.220
Ukraine winning?
link |
02:05:52.480
Almost everybody I talk to is 100%.
link |
02:05:55.680
Wow.
link |
02:05:56.880
And you listen to the experts, right?
link |
02:05:59.000
They say all kinds of stuff.
link |
02:06:00.800
Right.
link |
02:06:01.640
They are, first of all, the morale there
link |
02:06:05.360
is higher than probably enough.
link |
02:06:07.200
I've never been to a war zone before this,
link |
02:06:09.920
but I've read about many wars
link |
02:06:12.560
and I think the morale in Ukraine is higher
link |
02:06:14.920
than almost any war I've read about.
link |
02:06:17.320
It's every single person in the country
link |
02:06:19.360
is proud to fight for their country.
link |
02:06:21.720
Wow.
link |
02:06:22.560
Everybody, not just soldiers, not everybody.
link |
02:06:25.640
Why do you think that is?
link |
02:06:26.560
Specifically more than in other wars?
link |
02:06:31.920
I think because there's perhaps a dormant desire
link |
02:06:36.680
for the citizens of this country to find
link |
02:06:40.520
the identity of this country because it's been
link |
02:06:43.280
going through this 30 year process
link |
02:06:45.340
of different factions and political bickering
link |
02:06:48.400
and they haven't had, as they talk about,
link |
02:06:50.960
they haven't had their independence war.
link |
02:06:52.600
They say all great nations have had an independence war.
link |
02:06:55.880
They had to fight for their independence,
link |
02:06:58.920
for the discovery of the identity of the core
link |
02:07:01.320
of the ideals that unify us and they haven't had that.
link |
02:07:04.600
There's constantly been factions, there's been divisions,
link |
02:07:07.220
there's been pressures from empires,
link |
02:07:09.820
from the United States and from Russia,
link |
02:07:12.320
from NATO and Europe, everybody telling them what to do.
link |
02:07:15.680
Now they want to discover who they are
link |
02:07:17.600
and there's that kind of sense that we're going to fight
link |
02:07:21.240
for the safety of our homeland,
link |
02:07:23.280
but we're also gonna fight for our identity.
link |
02:07:25.640
And that, on top of the fact that there's just,
link |
02:07:32.040
if you look at the history of Ukraine
link |
02:07:33.960
and there's certain other countries like this,
link |
02:07:36.760
there are certain cultures that are feisty
link |
02:07:39.880
in their pride of being the citizens of that nation.
link |
02:07:45.680
Ukraine is that, Poland was that.
link |
02:07:48.400
You just look at history.
link |
02:07:49.480
In certain countries, you do not want to occupy.
link |
02:07:54.240
I mean, both Stalin and Hitler talked about Poland
link |
02:07:56.880
in this way, they're like, this is a big problem
link |
02:08:00.260
if we occupy this land for prolonged periods of time.
link |
02:08:02.360
They're gonna be a pain in the ass.
link |
02:08:04.680
They're not going to want to be occupied.
link |
02:08:07.120
And certain other countries are pragmatic.
link |
02:08:09.520
They're like, well, leaders come and go.
link |
02:08:12.200
I guess this is good.
link |
02:08:14.400
Ukraine just doesn't have, Ukrainians,
link |
02:08:18.200
throughout the 20th century,
link |
02:08:19.280
don't seem to be the kind of people
link |
02:08:20.760
that just sit calmly and let the quote unquote occupiers
link |
02:08:28.200
impose their rules.
link |
02:08:30.160
That's interesting though, because you said it's always been
link |
02:08:32.400
under conflict and leaders have come and gone.
link |
02:08:35.520
So you would expect them to actually be the opposite
link |
02:08:37.120
under that like reasoning.
link |
02:08:39.160
Well, because it's a very fertile land,
link |
02:08:42.360
it's great for agriculture.
link |
02:08:43.480
So a lot of people want to,
link |
02:08:44.800
I mean, I think they've developed this culture
link |
02:08:46.520
because they've constantly been occupied
link |
02:08:47.920
by different peoples.
link |
02:08:51.040
And so maybe there is something to that
link |
02:08:54.600
where you've constantly had to feel like within the blood
link |
02:08:59.600
of the generations, there's the struggle against the man,
link |
02:09:04.840
against the imposition of rules, against oppression
link |
02:09:08.600
and all that kind of stuff, and that stays with them.
link |
02:09:10.560
So there's a will there, but a lot of other aspects
link |
02:09:15.320
are also part of it that has to do with
link |
02:09:18.000
the reverse Moloch kind of situation
link |
02:09:20.000
where social media has definitely played a part of it.
link |
02:09:23.120
Also different charismatic individuals
link |
02:09:25.040
have had to play a part.
link |
02:09:27.160
The fact that the president of the nation, Zelensky,
link |
02:09:31.120
stayed in Kiev during the invasion
link |
02:09:36.280
is a huge inspiration to them
link |
02:09:37.680
because most leaders, as you can imagine,
link |
02:09:41.440
when the capital of the nation is under attack,
link |
02:09:44.520
the wise thing, the smart thing
link |
02:09:46.800
that the United States advised Zelensky to do
link |
02:09:49.080
is to flee and to be the leader of the nation
link |
02:09:52.720
from a distant place.
link |
02:09:54.720
He said, fuck that, I'm staying put.
link |
02:09:58.840
Everyone around him, there was a pressure to leave
link |
02:10:01.720
and he didn't, and those singular acts
link |
02:10:07.840
really can unify a nation.
link |
02:10:09.160
There's a lot of people that criticize Zelensky
link |
02:10:11.360
within Ukraine.
link |
02:10:12.720
Before the war, he was very unpopular, even still,
link |
02:10:17.200
but they put that aside, especially that singular act
link |
02:10:22.080
of staying in the capital.
link |
02:10:24.080
Yeah, a lot of those kinds of things come together
link |
02:10:27.760
to create something within people.
link |
02:10:33.360
These things always, of course, though,
link |
02:10:39.920
how zoomed out of a view do you wanna take?
link |
02:10:43.600
Because, yeah, you describe it,
link |
02:10:45.000
it's like an anti Moloch thing happened within Ukraine
link |
02:10:47.960
because it brought the Ukrainian people together
link |
02:10:49.760
in order to fight a common enemy.
link |
02:10:51.760
Maybe that's a good thing, maybe that's a bad thing.
link |
02:10:53.360
In the end, we don't know
link |
02:10:54.200
how this is all gonna play out, right?
link |
02:10:56.720
But if you zoom it out from a level, on a global level,
link |
02:11:01.120
they're coming together to fight that,
link |
02:11:09.000
that could make a conflict larger.
link |
02:11:12.240
You know what I mean?
link |
02:11:13.120
I don't know what the right answer is here.
link |
02:11:15.680
It seems like a good thing that they came together,
link |
02:11:17.600
but we don't know how this is all gonna play out.
link |
02:11:20.000
If this all turns into nuclear war, we'll be like,
link |
02:11:21.920
okay, that was the bad, that was the...
link |
02:11:23.440
Oh yeah, so I was describing the reverse Moloch
link |
02:11:26.080
for the local level.
link |
02:11:27.440
Exactly, yes.
link |
02:11:28.280
Now, this is where the experts come in and they say,
link |
02:11:32.160
well, if you channel most of the resources of the nation
link |
02:11:37.520
and the nations supporting Ukraine into the war effort,
link |
02:11:42.760
are you not beating the drums of war
link |
02:11:45.800
that is much bigger than Ukraine?
link |
02:11:47.480
In fact, even the Ukrainian leaders
link |
02:11:50.880
are speaking of it this way.
link |
02:11:52.920
This is not a war between two nations.
link |
02:11:55.840
This is the early days of a world war,
link |
02:12:01.000
if we don't play this correctly.
link |
02:12:02.440
Yes, and we need cool heads from our leaders.
link |
02:12:07.760
So from Ukraine's perspective,
link |
02:12:10.200
Ukraine needs to win the war
link |
02:12:12.640
because what does winning the war mean
link |
02:12:15.440
is coming to peace negotiations, an agreement
link |
02:12:21.600
that guarantees no more invasions.
link |
02:12:24.360
And then you make an agreement
link |
02:12:25.640
about what land belongs to who.
link |
02:12:28.560
And that's, you stop that.
link |
02:12:30.240
And basically from their perspective
link |
02:12:34.120
you want to demonstrate to the rest of the world
link |
02:12:36.600
who's watching carefully, including Russia and China
link |
02:12:39.640
and different players on the geopolitical stage,
link |
02:12:42.280
that this kind of conflict is not going to be productive
link |
02:12:46.000
if you engage in it.
link |
02:12:47.240
So you want to teach everybody a lesson,
link |
02:12:49.240
let's not do World War III.
link |
02:12:50.920
It's gonna be bad for everybody.
link |
02:12:53.000
And so it's a lose, lose.
link |
02:12:54.760
It's a deep lose, lose.
link |
02:12:56.600
Doesn't matter.
link |
02:13:00.400
And I think that's actually a correct,
link |
02:13:05.680
when I zoom out, I mean, 99% of what I think about
link |
02:13:10.160
is just individual human beings and human lives
link |
02:13:12.320
and just that war is horrible.
link |
02:13:14.600
But when you zoom out and think
link |
02:13:15.920
from a geopolitics perspective,
link |
02:13:17.760
we should realize that it's entirely possible
link |
02:13:22.000
that we will see a World War III in the 21st century.
link |
02:13:26.400
And this is like a dress rehearsal for that.
link |
02:13:29.840
And so the way we play this as a human civilization
link |
02:13:34.840
will define whether we do or don't have a World War III.
link |
02:13:45.000
How we discuss war, how we discuss nuclear war,
link |
02:13:49.320
the kind of leaders we elect and prop up,
link |
02:13:55.200
the kind of memes we circulate,
link |
02:13:58.480
because you have to be very careful
link |
02:13:59.920
when you're being pro Ukraine, for example,
link |
02:14:04.480
you have to realize that you're being,
link |
02:14:07.640
you are also indirectly feeding
link |
02:14:11.520
the ever increasing military industrial complex.
link |
02:14:15.560
So you have to be extremely careful
link |
02:14:17.560
that when you say pro Ukraine or pro anybody,
link |
02:14:24.440
you're pro human beings, not pro the machine
link |
02:14:29.440
that creates narratives that says it's pro human beings.
link |
02:14:36.760
But it's actually, if you look at the raw use
link |
02:14:39.920
of funds and resources, it's actually pro making weapons
link |
02:14:44.840
and shooting bullets and dropping bombs.
link |
02:14:47.360
The real, we have to just somehow get the meme
link |
02:14:50.600
into everyone's heads that the real enemy is war itself.
link |
02:14:54.560
That's the enemy we need to defeat.
link |
02:14:57.040
And that doesn't mean to say that there isn't justification
link |
02:15:01.560
for small local scenarios, adversarial conflicts.
link |
02:15:07.280
If you have a leader who is starting wars,
link |
02:15:11.080
they're on the side of team war, basically.
link |
02:15:13.720
It's not that they're on the side of team country,
link |
02:15:15.240
whatever that country is,
link |
02:15:16.200
it's they're on the side of team war.
link |
02:15:17.920
So that needs to be stopped and put down.
link |
02:15:20.160
But you also have to find a way
link |
02:15:21.560
that your corrective measure doesn't actually then
link |
02:15:25.480
end up being coopted by the war machine
link |
02:15:27.360
and creating greater war.
link |
02:15:28.920
Again, the playing field is finite.
link |
02:15:31.080
The scale of conflict is now getting so big
link |
02:15:35.200
that the weapons that can be used are so mass destructive
link |
02:15:40.280
that we can't afford another giant conflict.
link |
02:15:42.560
We just, we won't make it.
link |
02:15:44.080
What existential threat in terms of us not making it
link |
02:15:48.000
are you most worried about?
link |
02:15:49.640
What existential threat to human civilization?
link |
02:15:51.920
We got like, let's go.
link |
02:15:52.760
Going down a dark path, huh?
link |
02:15:53.600
It's good, but no, well, no, it's a dark.
link |
02:15:56.600
No, it's like, well, while we're in the somber place,
link |
02:15:58.560
we might as well.
link |
02:16:02.280
Some of my best friends are dark paths.
link |
02:16:06.680
What worries you most?
link |
02:16:07.960
We mentioned asteroids, we mentioned AGI, nuclear weapons.
link |
02:16:14.160
The one that's on my mind the most,
link |
02:16:17.320
mostly because I think it's the one where we have
link |
02:16:19.400
actually a real chance to move the needle on
link |
02:16:21.920
in a positive direction or more specifically,
link |
02:16:24.520
stop some really bad things from happening,
link |
02:16:26.800
really dumb, avoidable things is bio risks.
link |
02:16:36.160
What kind of bio risks?
link |
02:16:37.480
There's so many fun options.
link |
02:16:39.320
So many, so of course, we have natural risks
link |
02:16:41.880
from natural pandemics, naturally occurring viruses
link |
02:16:45.000
or pathogens, and then also as time and technology goes on
link |
02:16:49.560
and technology becomes more and more democratized
link |
02:16:51.520
and into the hands of more and more people,
link |
02:16:54.000
the risk of synthetic pathogens.
link |
02:16:57.320
And whether or not you fall into the camp of COVID
link |
02:17:00.600
was gain of function accidental lab leak
link |
02:17:03.680
or whether it was purely naturally occurring,
link |
02:17:07.200
either way, we are facing a future where synthetic pathogens
link |
02:17:14.400
or like human meddled with pathogens
link |
02:17:17.840
either accidentally get out or get into the hands
link |
02:17:21.840
of bad actors who, whether they're omnicidal maniacs,
link |
02:17:25.800
you know, either way.
link |
02:17:28.560
And so that means we need more robustness for that.
link |
02:17:31.160
And you would think that us having this nice little dry run,
link |
02:17:33.920
which is what COVID was, as awful as it was,
link |
02:17:37.880
and all those poor people that died,
link |
02:17:39.480
it was still like child's play compared
link |
02:17:42.800
to what a future one could be in terms of fatality rate.
link |
02:17:45.800
And so you'd think that we would then be coming,
link |
02:17:48.400
we'd be much more robust in our pandemic preparedness.
link |
02:17:52.200
And meanwhile, the budget in the last two years for the US,
link |
02:17:59.000
sorry, they just did this, I can't remember the name
link |
02:18:02.400
of what the actual budget was,
link |
02:18:03.320
but it was like a multi trillion dollar budget
link |
02:18:05.440
that the US just set aside.
link |
02:18:07.360
And originally in that, you know, considering that COVID
link |
02:18:09.920
cost multiple trillions to the economy, right?
link |
02:18:12.400
The original allocation in this new budget
link |
02:18:14.640
for future pandemic preparedness was 60 billion,
link |
02:18:17.320
so tiny proportion of it.
link |
02:18:20.000
That's proceeded to get whittled down to like 30 billion,
link |
02:18:24.520
to 15 billion, all the way down to 2 billion
link |
02:18:27.960
out of multiple trillions.
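The whittling-down can be put in proportion with a quick calculation; the $3.5 trillion total below is an assumed stand-in, since the conversation doesn't name the exact budget figure:

```python
# Successive pandemic-preparedness allocations as shares of a multi-trillion budget.
# The 3.5 trillion total is an assumed placeholder; the allocations are the
# successively whittled-down figures mentioned in the conversation.
total_budget = 3.5e12
allocations = [60e9, 30e9, 15e9, 2e9]

for amount in allocations:
    share = 100 * amount / total_budget
    print(f"${amount / 1e9:.0f}B -> {share:.3f}% of the budget")
```

Even the original $60 billion would be under two percent of such a budget; the final $2 billion is a few hundredths of a percent.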
link |
02:18:28.880
For a thing that has just cost us multiple trillions,
link |
02:18:31.520
we've just finished, we're barely even,
link |
02:18:33.080
we're not even really out of it.
link |
02:18:34.720
It basically got whittled down to nothing
link |
02:18:36.560
because for some reason people think that,
link |
02:18:38.480
oh, all right, we've got the pandemic,
link |
02:18:40.000
out of the way, that was that one.
link |
02:18:42.360
And the reason for that is that people are,
link |
02:18:44.680
and I say this with all due respect
link |
02:18:47.200
to a lot of the science community,
link |
02:18:48.320
but there's an immense amount of naivety about,
link |
02:18:52.800
they think that nature is the main risk moving forward,
link |
02:18:56.800
and it really isn't.
link |
02:18:59.120
And I think nothing demonstrates this more
link |
02:19:00.440
than this project that I was just reading about
link |
02:19:03.240
that's sort of being proposed right now called DEEP VZN.
link |
02:19:07.760
And the idea is to go out into the wilds,
link |
02:19:10.360
and we're not talking about just like within cities,
link |
02:19:13.160
like deep into like caves that people don't go to,
link |
02:19:15.840
deep into the Arctic, wherever, scour the earth
link |
02:19:18.880
for whatever the most dangerous possible pathogens
link |
02:19:22.240
could be that they can find.
link |
02:19:24.560
And then not only try and find these, but also
link |
02:19:27.960
bring samples of them back to laboratories.
link |
02:19:31.200
And again, whether you think COVID was a lab leak or not,
link |
02:19:34.240
I'm not gonna get into that,
link |
02:19:35.560
but we have historically had so many, as a civilization,
link |
02:19:38.520
we've had so many lab leaks
link |
02:19:40.760
from even like the highest level security things.
link |
02:19:42.800
Like it's just, people should go and just read it.
link |
02:19:45.680
It's like a comedy show of just how many there are,
link |
02:19:48.680
how leaky these labs are,
link |
02:19:50.280
even when they do their best efforts.
link |
02:19:52.720
So bring these things then back to civilization.
link |
02:19:55.680
That's step one of the badness.
link |
02:19:57.240
Then the next step would be to then categorize them,
link |
02:20:01.040
do experiments on them and categorize them
link |
02:20:02.600
by their level of potential.
link |
02:20:04.320
And then the pièce de résistance of this plan
link |
02:20:07.320
is to then publish that information freely on the internet
link |
02:20:11.560
about all these pathogens, including their genome,
link |
02:20:13.720
which is literally like the building instructions
link |
02:20:15.680
of how to make them.
link |
02:20:18.080
And this is something that genuinely,
link |
02:20:20.280
a pocket of the like bio,
link |
02:20:23.080
of the scientific community thinks is a good idea.
link |
02:20:26.560
And I think on expectation, like the,
link |
02:20:28.880
and their argument is that, oh, this is good,
link |
02:20:31.440
because it might buy us some time to develop the vaccines,
link |
02:20:35.360
which, okay, sure, maybe would have made sense
link |
02:20:37.880
prior to mRNA technology, but you know, with
link |
02:20:40.480
mRNA, we can develop a vaccine now
link |
02:20:43.760
when we find a new pathogen within a couple of days.
link |
02:20:46.880
Now then there's all the trials and so on.
link |
02:20:48.520
Those trials would have to happen anyway,
link |
02:20:50.040
in the case of a brand new thing.
link |
02:20:51.440
So you're saving maybe a couple of days.
link |
02:20:53.400
So that's the upside.
link |
02:20:54.520
Meanwhile, the downside is you're not only giving,
link |
02:20:58.160
you're not only giving the vaccine,
link |
02:21:00.000
you're bringing the risk of these pathogens
link |
02:21:03.200
of like getting leaked,
link |
02:21:04.040
but you're literally handing it out
link |
02:21:06.400
to every bad actor on earth who would be doing cartwheels.
link |
02:21:10.320
And I'm talking about like Kim Jong Un, ISIS,
link |
02:21:13.280
people who like want,
link |
02:21:14.880
they think the rest of the world is their enemy.
link |
02:21:17.200
And in some cases they think that killing it themselves
link |
02:21:19.760
is like a noble cause.
link |
02:21:22.680
And you're literally giving them the building blocks
link |
02:21:24.240
of how to do this.
link |
02:21:25.080
It's the most batshit idea I've ever heard.
link |
02:21:26.920
Like on expectation, it's probably like minus EV
link |
02:21:29.560
of like multiple billions of lives
link |
02:21:31.640
if they actually succeeded in doing this.
link |
02:21:33.440
Certainly in the tens or hundreds of millions.
link |
02:21:35.760
So the cost benefit is so unbelievably, it makes no sense.
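Liv's "minus EV" point can be made concrete with a toy expected-value calculation; every number below is an invented placeholder for illustration, not a figure from the conversation:

```python
# Toy expected-value (EV) comparison for publishing dangerous pathogen genomes.
# All probabilities and fatality counts are invented placeholders.

# Upside: vaccine work starts a few days earlier in some future pandemic.
p_head_start_matters = 0.05   # assumed chance the head start ever matters
lives_saved = 50_000          # assumed lives saved by a few days' head start
ev_upside = p_head_start_matters * lives_saved

# Downside: the published genome ends up in a bad actor's hands.
p_misuse = 0.01               # assumed chance of catastrophic misuse
lives_lost = 100_000_000      # assumed fatalities in that scenario
ev_downside = p_misuse * lives_lost

net_ev = ev_upside - ev_downside  # heavily negative under these assumptions
print(f"upside {ev_upside:,.0f}, downside {ev_downside:,.0f}, net {net_ev:,.0f}")
```

With any remotely plausible inputs the downside term dominates by orders of magnitude, which is the shape of the argument regardless of the exact figures.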
link |
02:21:38.840
And I was trying to wrap my head around like why,
link |
02:21:42.120
like what's going wrong in people's minds
link |
02:21:44.920
to think that this is a good idea?
link |
02:21:46.640
And it's not that it's malice or anything like that.
link |
02:21:50.000
I think it's that people don't,
link |
02:21:53.600
the proponents, they're actually overly naive
link |
02:21:57.400
about the interactions of humanity.
link |
02:22:00.680
And well, like there are bad actors
link |
02:22:02.880
who will use this for bad things.
link |
02:22:04.920
Because not only will it,
link |
02:22:07.440
if you publish this information,
link |
02:22:08.920
even if a bad actor couldn't physically make it themselves,
link |
02:22:12.200
which given in 10 years time,
link |
02:22:14.080
like the technology is getting cheaper and easier to use.
link |
02:22:18.000
But even if they couldn't make it, they could now bluff it.
link |
02:22:20.400
Like what would you do if there's like some deadly new virus
link |
02:22:22.960
that was published on the internet
link |
02:22:26.120
in terms of its building blocks?
link |
02:22:27.560
Kim Jong Un could be like,
link |
02:22:28.600
hey, if you don't let me build my nuclear weapons,
link |
02:22:31.560
I'm gonna release this, I've managed to build it.
link |
02:22:33.600
Well, now he's actually got a credible bluff.
link |
02:22:35.440
We don't know.
link |
02:22:36.440
And so that's, it's just like handing the keys,
link |
02:22:39.320
it's handing weapons of mass destruction to people.
link |
02:22:42.360
Makes no sense.
link |
02:22:43.200
The possible, I agree with you,
link |
02:22:44.520
but the possible world in which it might make sense
link |
02:22:48.240
is if the good guys, which is a whole nother problem,
link |
02:22:53.760
defining who the good guys are,
link |
02:22:55.480
but the good guys are like an order of magnitude
link |
02:22:59.640
higher competence.
link |
02:23:01.840
And so they can stay ahead of the bad actors
link |
02:23:06.720
by just being very good at the defense.
link |
02:23:10.080
By very good, not meaning like a little bit better,
link |
02:23:13.720
but an order of magnitude better.
link |
02:23:15.880
But of course the question is,
link |
02:23:17.200
in each of those individual disciplines, is that feasible?
link |
02:23:21.680
Can you, can the bad actors,
link |
02:23:23.440
even if they don't have the competence leapfrog
link |
02:23:25.840
to the place where the good guys are?
link |
02:23:29.680
Yeah, I mean, I would agree in principle
link |
02:23:32.760
pertaining to this like particular plan of like,
link |
02:23:37.000
that, you know, with the thing I described,
link |
02:23:38.480
this DEEP VZN thing, where at least then
link |
02:23:40.320
that would maybe make sense for steps one and step two
link |
02:23:42.160
of like getting the information,
link |
02:23:43.400
but then why would you release it,
link |
02:23:45.640
the information to your literal enemies?
link |
02:23:47.240
You know, that's, that makes,
link |
02:23:49.000
that doesn't fit at all in that perspective
link |
02:23:52.560
of like trying to be ahead of them.
link |
02:23:53.560
You're literally handing them the weapon.
link |
02:23:55.000
But there's different levels of release, right?
link |
02:23:56.640
So there's the kind of secrecy
link |
02:24:00.520
where you don't give it to anybody,
link |
02:24:02.440
but there's a release where you incrementally give it
link |
02:24:05.480
to like major labs.
link |
02:24:07.720
So it's not public release, but it's like,
link |
02:24:09.400
you're giving it to major labs.
link |
02:24:10.240
Yeah, there's different layers of reasonability, but.
link |
02:24:12.880
But the problem there is it's going to,
link |
02:24:14.640
if you go anywhere beyond like complete secrecy,
link |
02:24:18.160
it's gonna leak.
link |
02:24:19.720
That's the thing, it's very hard to keep secrets.
link |
02:24:21.600
And so that's still. Information is.
link |
02:24:23.440
So you might as well release it to the public
link |
02:24:25.920
is that argument.
link |
02:24:26.920
So you either go complete secrecy
link |
02:24:29.600
or you release it to the public.
link |
02:24:31.720
So, which is essentially the same thing.
link |
02:24:33.960
It's going to leak anyway,
link |
02:24:35.920
if you don't do complete secrecy.
link |
02:24:38.240
Right, which is why you shouldn't get the information
link |
02:24:39.840
in the first place.
link |
02:24:40.680
Yeah, I mean, what, in that, I think.
link |
02:24:43.720
Well, that's a solution.
link |
02:24:44.920
Yeah, the solution is either don't get the information
link |
02:24:46.640
in the first place or B, keep it incredibly,
link |
02:24:50.880
incredibly contained.
link |
02:24:52.240
See, I think, I think it really matters
link |
02:24:54.560
which discipline we're talking about.
link |
02:24:55.800
So in the case of biology, I do think you're very right.
link |
02:24:59.680
We shouldn't even be, it should be forbidden
link |
02:25:02.520
to even like, think about that.
link |
02:25:06.640
Meaning don't collect, don't just even collect
link |
02:25:08.520
the information, but like, don't do,
link |
02:25:11.120
I mean, gain of function research is a really iffy area.
link |
02:25:14.880
Like you start.
link |
02:25:15.880
I mean, it's all about cost benefits, right?
link |
02:25:17.640
There are some scenarios where I could imagine
link |
02:25:19.240
the cost benefit of gain of function research
link |
02:25:21.280
is very, very clear, where you've evaluated
link |
02:25:24.400
all the potential risks, factored in the probability
link |
02:25:26.760
that things can go wrong.
link |
02:25:27.800
And like, you know, not only known unknowns,
link |
02:25:29.760
but unknown unknowns as well, tried to quantify that.
link |
02:25:32.320
And then even then it's like orders of magnitude
link |
02:25:34.400
better to do that.
link |
02:25:35.400
I'm behind that argument.
link |
02:25:36.720
But the point is that there's this like,
link |
02:25:38.920
naivety that's preventing people from even doing
link |
02:25:40.960
the cost benefit properly on a lot of the things
link |
02:25:43.280
because, you know, I get it.
link |
02:25:45.880
The science community, again, I don't want to bucket
link |
02:25:48.680
the science community, but like some people
link |
02:25:50.240
within the science community just think
link |
02:25:52.600
that everyone's good and everyone just cares
link |
02:25:54.800
about getting knowledge and doing the best for the world.
link |
02:25:56.880
And unfortunately, that's not the case.
link |
02:25:58.040
I wish we lived in that world, but we don't.
link |
02:26:00.880
Yeah, I mean, there's a lie.
link |
02:26:02.360
Listen, I've been criticizing the science community
link |
02:26:05.600
broadly quite a bit.
link |
02:26:07.360
There's so many brilliant people that brilliance
link |
02:26:09.840
is somehow a hindrance sometimes
link |
02:26:11.280
because it has a bunch of blind spots.
link |
02:26:13.160
And then you start to look at the history of science,
link |
02:26:16.400
how easily it's been used by dictators
link |
02:26:19.040
to any conclusion they want.
link |
02:26:20.880
And it's dark how you can use brilliant people
link |
02:26:24.880
that like playing the little game of science
link |
02:26:27.160
because it is a fun game.
link |
02:26:28.800
You know, you're building, you're going to conferences,
link |
02:26:30.720
you're building on top of each other's ideas,
link |
02:26:32.320
there's breakthroughs.
link |
02:26:33.240
Hi, I think I've realized how this particular molecule works
link |
02:26:37.040
and I could do this kind of experiment
link |
02:26:38.560
and everyone else is impressed.
link |
02:26:39.680
Ooh, cool.
link |
02:26:40.520
No, I think you're wrong.
link |
02:26:41.400
Let me show you why you're wrong.
link |
02:26:42.480
And that little game, everyone gets really excited
link |
02:26:44.840
and they get excited, oh, I came up with a pill
link |
02:26:47.080
that solves this problem
link |
02:26:48.040
and it's going to help a bunch of people.
link |
02:26:49.480
And I came up with a giant study
link |
02:26:51.360
that shows the exact probability it's going to help or not.
link |
02:26:54.480
And you get lost in this game
link |
02:26:56.320
and you forget to realize this game, just like Moloch,
link |
02:27:01.160
can have like...
link |
02:27:03.040
Unintended consequences.
link |
02:27:03.880
Yeah.
link |
02:27:04.720
Unintended consequences that might
link |
02:27:06.920
destroy human civilization.
link |
02:27:08.800
Right.
link |
02:27:09.640
Or divide human civilization
link |
02:27:12.840
or have dire geopolitical consequences.
link |
02:27:17.520
I mean, the effects of, I mean, it's just so...
link |
02:27:20.320
The most destructive effects of COVID
link |
02:27:22.880
have nothing to do with the biology of the virus,
link |
02:27:25.760
it seems like.
link |
02:27:27.520
I mean, I could just list them forever,
link |
02:27:29.560
but like one of them is the complete distrust
link |
02:27:32.200
of public institutions.
link |
02:27:34.160
The other one is because of that public distrust,
link |
02:27:36.360
I feel like if a much worse pandemic came along,
link |
02:27:39.840
we as a world have now cried wolf.
link |
02:27:42.440
And if an actual wolf now comes,
link |
02:27:45.440
people will be like, fuck masks, fuck...
link |
02:27:48.360
Fuck vaccines, yeah.
link |
02:27:49.200
Fuck everything.
link |
02:27:50.040
And they won't be...
link |
02:27:51.680
They'll distrust every single thing
link |
02:27:53.560
that any major institution is going to tell them.
link |
02:27:55.960
And...
link |
02:27:56.960
Because that's the thing,
link |
02:27:57.800
like there were certain actions made by
link |
02:28:03.720
certain public health figures where they told,
link |
02:28:07.760
they very knowingly told a white lie,
link |
02:28:10.760
it was intended in the best possible way,
link |
02:28:12.440
such as early on when there was clearly a shortage of masks.
link |
02:28:20.480
And so they said to the public,
link |
02:28:21.520
oh, don't get masks, there's no evidence that they work.
link |
02:28:25.000
Or don't get them, they don't work.
link |
02:28:27.240
In fact, it might even make it worse.
link |
02:28:29.000
You might even spread it more.
link |
02:28:30.240
Like that was the real like stinker.
link |
02:28:33.120
Yeah, no, no, unless you know how to do it properly,
link |
02:28:35.120
you're gonna get sicker
link |
02:28:36.680
or you're more likely to catch the virus,
link |
02:28:38.680
which is just absolute crap.
link |
02:28:41.320
And they put that out there.
link |
02:28:43.160
And it's pretty clear the reason why they did that
link |
02:28:45.280
was because there was actually a shortage of masks
link |
02:28:47.600
and they really needed it for health workers,
link |
02:28:50.160
which makes sense.
link |
02:28:51.000
Like, I agree, like, you know,
link |
02:28:52.760
but the cost of lying to the public
link |
02:28:57.840
when that then comes out,
link |
02:28:59.880
people aren't as stupid as they think they are.
link |
02:29:02.720
And that's, I think where this distrust of experts
link |
02:29:05.840
has largely come from.
link |
02:29:06.680
A, they've lied to people overtly,
link |
02:29:09.360
but B, people have been treated like idiots.
link |
02:29:13.160
Now, that's not to say that there aren't a lot
link |
02:29:14.440
of stupid people who have a lot of wacky ideas
link |
02:29:16.160
around COVID and all sorts of things.
link |
02:29:18.240
But if you treat the general public like children,
link |
02:29:21.560
they're going to see that, they're going to notice that.
link |
02:29:23.520
And that is going to just like absolutely decimate the trust
link |
02:29:26.760
in the public institutions that we depend upon.
link |
02:29:29.520
And honestly, the best thing that could happen,
link |
02:29:32.240
I wish like if like Fauci or, you know,
link |
02:29:34.720
and these other like leaders who, I mean, God,
link |
02:29:36.880
I can't imagine how nightmarish his job has been
link |
02:29:39.760
for the last few years, hell on earth.
link |
02:29:41.040
Like, so, you know, I, you know,
link |
02:29:43.480
I have a lot of sort of sympathy
link |
02:29:45.360
for the position he's been in.
link |
02:29:46.920
But like, if he could just come out and be like,
link |
02:29:48.800
okay, look, guys, hands up,
link |
02:29:51.160
we didn't handle this as well as we could have.
link |
02:29:53.960
These are all the things I would have done differently
link |
02:29:55.880
in hindsight.
link |
02:29:56.720
I apologize for this and this and this and this.
link |
02:29:58.680
That would go so far.
link |
02:30:01.360
And maybe I'm being naive, who knows,
link |
02:30:03.040
maybe this would backfire, but I don't think it would.
link |
02:30:04.760
Like to someone like me even,
link |
02:30:06.240
cause I've like, I've lost trust
link |
02:30:07.360
in a lot of these things.
link |
02:30:08.840
But I'm fortunate that I at least know people
link |
02:30:10.360
who I can go to who I think are good,
link |
02:30:11.680
like have good epistemics on this stuff.
link |
02:30:14.320
But you know, if they, if they could sort of put their hands
link |
02:30:16.240
up and go, okay, these are the spots where we screwed up.
link |
02:30:18.240
This, this, this, this was our reasons.
link |
02:30:21.280
Yeah, we actually told a little white lie here.
link |
02:30:22.760
We did it for this reason.
link |
02:30:23.720
We're really sorry.
link |
02:30:24.960
Where they just did the radical honesty thing,
link |
02:30:26.760
the radical transparency thing,
link |
02:30:28.680
that would go so far to rebuilding public trust.
link |
02:30:32.120
And I think that's what needs to happen.
link |
02:30:33.320
Otherwise.
link |
02:30:34.160
I totally agree with you.
link |
02:30:35.000
Unfortunately, yeah, his job was very tough
link |
02:30:38.680
and all those kinds of things, but I see arrogance
link |
02:30:42.920
and arrogance prevented him from being honest
link |
02:30:46.080
in that way previously.
link |
02:30:47.840
And I think arrogance will prevent him
link |
02:30:49.480
from being honest in that way.
link |
02:30:51.040
Now we need leaders.
link |
02:30:52.840
I think young people are seeing that,
link |
02:30:55.440
that kind of talking down to people
link |
02:30:59.960
from a position of power, I hope is the way of the past.
link |
02:31:04.360
People really like authenticity
link |
02:31:06.120
and they like leaders that are like a man
link |
02:31:10.600
and a woman of the people.
link |
02:31:12.320
And I think that just.
link |
02:31:15.000
I mean, he still has a chance to do that, I think.
link |
02:31:17.040
I mean, I don't wanna, you know, I don't think, you know,
link |
02:31:19.280
if I doubt he's listening, but if he is like,
link |
02:31:21.560
hey, I think, you know, I don't think he's irredeemable
link |
02:31:25.160
by any means.
link |
02:31:26.000
I think there's, you know, I don't have an opinion
link |
02:31:28.720
whether there was arrogance or there or not.
link |
02:31:30.680
Just know that I think like coming clean on the, you know,
link |
02:31:34.440
it's understandable to have fucked up during this pandemic.
link |
02:31:37.600
Like I won't expect any government to handle it well
link |
02:31:39.400
because it was so difficult, like so many moving pieces,
link |
02:31:42.840
so much like lack of information and so on.
link |
02:31:46.040
But the step to rebuilding trust is to go, okay, look,
link |
02:31:49.320
we're doing a scrutiny of where we went wrong.
link |
02:31:51.280
And for my part, I did this wrong in this part.
link |
02:31:53.880
And that would be huge.
link |
02:31:55.120
All of us can do that.
link |
02:31:56.000
I mean, I was struggling for a while
link |
02:31:57.360
whether I want to talk to him or not.
link |
02:32:00.000
I talked to his boss, Francis Collins.
link |
02:32:03.600
Another person that's screwed up in terms of trust,
link |
02:32:08.040
lost a little bit of my respect too.
link |
02:32:10.280
There seems to have been a kind of dishonesty
link |
02:32:14.640
in the backrooms in that they didn't trust people
link |
02:32:21.080
to be intelligent.
link |
02:32:23.040
Like we need to tell them what's good for them.
link |
02:32:26.000
We know what's good for them.
link |
02:32:27.480
That kind of idea.
link |
02:32:28.960
To be fair, the thing that's, what's it called?
link |
02:32:33.400
I heard the phrase today, nut picking.
link |
02:32:36.360
Social media does that.
link |
02:32:37.760
So you've got like nitpicking.
link |
02:32:39.040
Nut picking is where the craziest, stupidest,
link |
02:32:44.360
you know, if you have a group of people,
link |
02:32:45.880
let's say people who are vaccine,
link |
02:32:47.720
I don't like the term anti vaccine,
link |
02:32:48.720
people who are vaccine hesitant, vaccine speculative.
link |
02:32:51.720
So what social media did or the media or anyone,
link |
02:32:57.320
their opponents would do is pick the craziest example.
link |
02:33:01.000
So the ones who are like,
link |
02:33:02.280
I think I need to inject myself with motor oil
link |
02:33:06.240
up my ass or something.
link |
02:33:07.920
Select the craziest ones and then have that beamed out to everyone.
link |
02:33:11.920
So from like someone like Fauci or Francis's perspective,
link |
02:33:14.680
that's what they get because they're getting
link |
02:33:16.240
the same social media stuff as us.
link |
02:33:17.440
They're getting the same media reports.
link |
02:33:19.000
I mean, they might get some more information,
link |
02:33:20.920
but they too are gonna get the nuts portrayed to them.
link |
02:33:24.880
So they probably have a misrepresentation
link |
02:33:27.240
of what the actual public's intelligence is.
link |
02:33:29.320
Well, that's just, yes.
link |
02:33:31.280
And that just means they're not social media savvy.
link |
02:33:33.640
So one of the skills of being on social media
link |
02:33:36.000
is to be able to filter that in your mind,
link |
02:33:37.880
like to understand, to put into proper context.
link |
02:33:40.080
To realize that what you are seeing,
link |
02:33:41.800
social media is not anywhere near
link |
02:33:43.960
an accurate representation of humanity.
link |
02:33:46.600
Nut picking, and there's nothing wrong
link |
02:33:49.520
with putting motor oil up your ass.
link |
02:33:51.320
It's just one of the better aspects of,
link |
02:33:54.440
I do this every weekend.
link |
02:33:56.320
Okay.
link |
02:33:58.280
Where the hell did that analogy come from in my mind?
link |
02:33:59.800
Like what?
link |
02:34:00.640
I don't know.
link |
02:34:01.480
I think you need to, there's some Freudian thing
link |
02:34:03.680
you would need to deeply investigate with a therapist.
link |
02:34:06.600
Okay, what about AI?
link |
02:34:08.240
Are you worried about AGI, super intelligence systems
link |
02:34:13.040
or paperclip maximizer type of situation?
link |
02:34:16.180
Yes, I'm definitely worried about it,
link |
02:34:19.940
but I feel kind of bipolar in the,
link |
02:34:23.220
some days I wake up and I'm like.
link |
02:34:24.780
You're excited about the future?
link |
02:34:26.100
Well, exactly.
link |
02:34:26.940
I'm like, wow, we can unlock the mysteries of the universe.
link |
02:34:29.420
You know, escape the game.
link |
02:34:30.620
And this, you know,
link |
02:34:34.340
because I spend all my time thinking
link |
02:34:35.740
about these Moloch problems that, you know,
link |
02:34:37.460
what is the solution to them?
link |
02:34:39.260
What, you know, in some ways you need this like
link |
02:34:41.780
omnibenevolent, omniscient, omni-wise coordination mechanism
link |
02:34:49.100
that can like make us all not do the Moloch thing
link |
02:34:53.260
or like provide the infrastructure
link |
02:34:55.540
or redesign the system so that it's not vulnerable
link |
02:34:57.460
to this Moloch process.
link |
02:34:59.940
And in some ways, you know,
link |
02:35:01.180
that's the strongest argument to me
link |
02:35:02.540
for like the race to build AGI
link |
02:35:04.940
is that maybe, you know, we can't survive without it.
link |
02:35:08.020
But the flip side to that is the,
link |
02:35:12.460
the unfortunately now that there's multiple actors
link |
02:35:14.900
trying to build AI, AGI, you know,
link |
02:35:17.780
this was fine 10 years ago when it was just DeepMind,
link |
02:35:19.940
but then other companies started up
link |
02:35:22.060
and now it created a race dynamic.
link |
02:35:23.380
Now it's like, that's the whole thing.
link |
02:35:25.660
Is it the same, it's got the same problem.
link |
02:35:28.020
It's like whichever company is the one that like optimizes
link |
02:35:31.500
for speed at the cost of safety
link |
02:35:33.620
will get the competitive advantage.
link |
02:35:35.340
And so they're the more likely ones to build the AGI,
link |
02:35:37.380
you know, and that's the same cycle that you're in.
link |
02:35:40.260
And there's no clear solution to that
link |
02:35:41.580
because you can't just go like slapping,
link |
02:35:46.820
if you go and try and like stop all the different companies,
link |
02:35:51.140
then it will, you know, the good ones will stop
link |
02:35:54.620
because they're the ones, you know,
link |
02:35:55.660
within the West's reach,
link |
02:35:57.700
but then that leaves all the other ones to continue
link |
02:35:59.980
and then they're even more likely.
link |
02:36:00.980
So it's like, it's a very difficult problem
link |
02:36:03.140
with no clean solution.
link |
02:36:04.420
And, you know, at the same time, you know,
link |
02:36:08.580
I know at least some of the folks at DeepMind
link |
02:36:12.100
and they're incredible and they're thinking about this.
link |
02:36:13.740
They're very aware of this problem.
link |
02:36:15.020
And they're like, you know,
link |
02:36:17.020
I think some of the smartest people on earth.
link |
02:36:18.660
Yeah, the culture is important there
link |
02:36:20.580
because they are thinking about that
link |
02:36:21.940
and they're some of the best machine learning engineers.
link |
02:36:26.180
So it's possible to have a company or a community of people
link |
02:36:29.780
that are both great engineers
link |
02:36:31.700
and are thinking about the philosophical topics.
link |
02:36:33.740
Exactly, and importantly, they're also game theorists,
link |
02:36:36.700
you know, and because this is ultimately
link |
02:36:38.260
a game theory problem, the thing, this Moloch mechanism
link |
02:36:41.620
and like, you know, how do we avoid arms race scenarios?
link |
02:36:47.340
You need people who aren't naive to be thinking about this.
link |
02:36:50.020
And again, like luckily there's a lot of smart,
link |
02:36:51.980
non naive game theorists within that group.
link |
02:36:54.780
Yes, I'm concerned about it and I think it's again,
link |
02:36:57.660
a thing that we need people to be thinking about
link |
02:37:01.300
in terms of like, how do we create,
link |
02:37:03.820
how do we mitigate the arms race dynamics
link |
02:37:05.980
and how do we solve the thing of,
link |
02:37:10.260
Bostrom calls it the orthogonality thesis, whereby,
link |
02:37:14.500
because it's obviously there's a chance, you know,
link |
02:37:16.060
the belief, the hope is,
link |
02:37:17.500
is that you build something that's super intelligent
link |
02:37:19.780
and by definition of being super intelligent,
link |
02:37:22.980
it will also become super wise
link |
02:37:25.140
and have the wisdom to know what the right goals are.
link |
02:37:27.820
And hopefully those goals include keeping humanity alive.
link |
02:37:30.980
Right?
link |
02:37:31.980
But Bostrom says that actually those two things,
link |
02:37:35.300
you know, super intelligence and super wisdom
link |
02:37:39.220
aren't necessarily correlated.
link |
02:37:40.420
They're actually kind of orthogonal things.
link |
02:37:42.900
And how do we make it so that they are correlated?
link |
02:37:44.820
How do we guarantee it?
link |
02:37:45.660
Because we need it to be guaranteed really,
link |
02:37:47.020
to know that we're doing the thing safely.
link |
02:37:48.900
But I think that like merging of intelligence and wisdom,
link |
02:37:54.460
at least my hope is that this whole process
link |
02:37:56.940
happens sufficiently slowly
link |
02:37:58.860
that we're constantly having these kinds of debates
link |
02:38:01.500
that we have enough time to figure out
link |
02:38:06.140
how to modify each version of the system
link |
02:38:07.900
as it becomes more and more intelligent.
link |
02:38:09.580
Yes, buying time is a good thing, definitely.
link |
02:38:12.020
Anything that slows everything down.
link |
02:38:14.220
We just, everyone needs to chill out.
link |
02:38:16.060
We've got millennia to figure this out.
link |
02:38:19.540
Yeah.
link |
02:38:21.580
Or at least, at least, well, it depends.
link |
02:38:25.580
Again, some people think that, you know,
link |
02:38:27.820
we can't even make it through the next few decades
link |
02:38:29.820
without having some kind of
link |
02:38:33.260
omni wise coordination mechanism.
link |
02:38:36.300
And there's also an argument to that.
link |
02:38:37.940
Yeah, I don't know.
link |
02:38:39.380
Well, there is, I'm suspicious of that kind of thinking
link |
02:38:42.220
because it seems like the entirety of human history
link |
02:38:45.220
has people in it that are like predicting doom
link |
02:38:49.780
just around the corner.
link |
02:38:50.780
There's something about us
link |
02:38:52.700
that is strangely attracted to that thought.
link |
02:38:56.300
It's almost like fun to think about
link |
02:38:59.700
the destruction of everything.
link |
02:39:02.540
Just objectively speaking,
link |
02:39:05.260
I've talked and listened to a bunch of people
link |
02:39:08.180
and they are gravitating towards that.
link |
02:39:11.100
It's almost, I think it's the same thing
link |
02:39:13.180
that people love about conspiracy theories
link |
02:39:15.700
is they love to be the person that kind of figured out
link |
02:39:19.220
some deep fundamental thing about the,
link |
02:39:22.140
that's going to be, it's going to mark something
link |
02:39:24.780
extremely important about the history of human civilization
link |
02:39:28.300
because then I will be important.
link |
02:39:31.300
When in reality, most of us will be forgotten
link |
02:39:33.500
and life will go on.
link |
02:39:37.620
And one of the sad things about
link |
02:39:40.020
whenever anything traumatic happens to you,
link |
02:39:41.980
whenever you lose loved ones or just tragedy happens,
link |
02:39:46.700
you realize life goes on.
link |
02:39:49.140
Even after a nuclear war that will wipe out
link |
02:39:52.140
some large percentage of the population
link |
02:39:55.020
and will torture people for years to come
link |
02:40:00.380
because of the effects of a nuclear winter,
link |
02:40:05.460
people will still survive.
link |
02:40:07.260
Life will still go on.
link |
02:40:08.700
I mean, it depends on the kind of nuclear war.
link |
02:40:10.940
But in the case of nuclear war, it will still go on.
link |
02:40:13.500
That's one of the amazing things about life.
link |
02:40:15.820
It finds a way.
link |
02:40:17.060
And so in that sense, I just,
link |
02:40:18.940
I feel like the doom and gloom thing
link |
02:40:21.100
is a...
link |
02:40:23.780
Well, what we don't, yeah,
link |
02:40:24.620
we don't want a self-fulfilling prophecy.
link |
02:40:26.100
Yes, that's exactly.
link |
02:40:27.500
Yes, and I very much agree with that.
link |
02:40:29.660
And I even have a slight feeling
link |
02:40:32.980
from the amount of time we spent in this conversation
link |
02:40:35.780
talking about this,
link |
02:40:36.620
because it's like, is this even a net positive
link |
02:40:40.180
if it's like making everyone feel,
link |
02:40:41.900
oh, in some ways, like making people imagine
link |
02:40:44.980
these bad scenarios can be a self-fulfilling prophecy,
link |
02:40:47.460
but at the same time, that's weighed off
link |
02:40:51.180
with at least making people aware of the problem
link |
02:40:54.340
and gets them thinking.
link |
02:40:55.300
And I think particularly,
link |
02:40:56.260
the reason why I wanna talk about this to your audience
link |
02:40:58.380
is that on average, they're the type of people
link |
02:41:00.660
who gravitate towards these kinds of topics
link |
02:41:02.780
because they're intellectually curious
link |
02:41:04.260
and they can sort of sense that there's trouble brewing.
link |
02:41:07.860
They can smell that there's,
link |
02:41:09.420
I think there's a reason
link |
02:41:10.260
that people are thinking about this stuff a lot
link |
02:41:11.460
is because the probability,
link |
02:41:14.180
the probability, it's increased in probability
link |
02:41:16.820
over certainly over the last few years,
link |
02:41:19.420
trajectories have not gone favorably,
link |
02:41:21.700
let's put it since 2010.
link |
02:41:24.260
So it's right, I think, for people to be thinking about it,
link |
02:41:28.420
but that's where they're like,
link |
02:41:30.060
I think whether it's a useful fiction
link |
02:41:31.660
or whether it's actually true
link |
02:41:33.340
or whatever you wanna call it,
link |
02:41:34.420
I think having this faith,
link |
02:41:35.940
this is where faith is valuable
link |
02:41:38.060
because it gives you at least this like anchor of hope.
link |
02:41:41.140
And I'm not just saying it to like trick myself,
link |
02:41:43.860
like I do truly,
link |
02:41:44.780
I do think there's something out there that wants us to win.
link |
02:41:47.700
I think there's something that really wants us to win.
link |
02:41:49.660
And it just, you just have to be like,
link |
02:41:53.060
just like kind of, okay, now I sound really crazy,
link |
02:41:55.740
but like open your heart to it a little bit
link |
02:41:58.740
and it will give you the like,
link |
02:42:03.140
the sort of breathing room
link |
02:42:04.820
with which to marinate on the solutions.
link |
02:42:07.660
We are the ones who have to come up with the solutions,
link |
02:42:10.340
but we can use that.
link |
02:42:15.140
There's like, there's hashtag positivity.
link |
02:42:18.020
There's value in that.
link |
02:42:19.300
Yeah, you have to kind of imagine
link |
02:42:21.300
all the destructive trajectories that lay in our future
link |
02:42:24.940
and then believe in the possibility
link |
02:42:27.940
of avoiding those trajectories.
link |
02:42:29.460
All while, you said audience,
link |
02:42:32.260
all while sitting back, which is majority,
link |
02:42:35.100
the two people that listen to this
link |
02:42:36.660
are probably sitting on a beach,
link |
02:42:38.660
smoking some weed, just, that's a beautiful sunset.
link |
02:42:44.900
They're looking at just the waves going in and out.
link |
02:42:47.900
And ultimately there's a kind of deep belief there
link |
02:42:50.460
in the momentum of humanity to figure it all out.
link |
02:42:56.660
But we've got a lot of work to do.
link |
02:42:58.420
Which is what makes this whole simulation,
link |
02:43:01.020
this video game kind of fun.
link |
02:43:03.380
This battle of polytopia,
link |
02:43:05.820
I still, man, I love those games so much.
link |
02:43:08.620
So good.
link |
02:43:09.580
And that one for people who don't know,
link |
02:43:11.740
Battle of Polytopia is a big,
link |
02:43:13.700
it's like a, it's this really radical simplification
link |
02:43:17.940
of a civilization type of game.
link |
02:43:20.540
It still has a lot of the skill tree development,
link |
02:43:24.140
a lot of the strategy,
link |
02:43:27.060
but it's easy enough to play on a phone.
link |
02:43:29.860
Yeah.
link |
02:43:30.700
It's kind of interesting.
link |
02:43:31.540
They've really figured it out.
link |
02:43:33.260
It's one of the most elegantly designed games
link |
02:43:35.020
I've ever seen.
link |
02:43:35.860
It's incredibly complex.
link |
02:43:37.940
And yet being, again, it walks that line
link |
02:43:39.580
between complexity and simplicity
link |
02:43:40.980
in this really, really great way.
link |
02:43:44.180
And they use pretty colors
link |
02:43:45.180
that hack the dopamine reward circuits in our brains.
link |
02:43:49.020
Very well.
link |
02:43:49.860
Yeah, it's fun.
link |
02:43:50.860
Video games are so fun.
link |
02:43:52.300
Yeah.
link |
02:43:53.380
Most of this life is just about fun.
link |
02:43:55.540
Escaping all the suffering to find the fun.
link |
02:43:58.820
What's energy healing?
link |
02:43:59.940
I have in my notes, energy healing question mark.
link |
02:44:02.220
What's that about?
link |
02:44:05.500
Oh man.
link |
02:44:06.860
God, your audience is gonna think I'm mad.
link |
02:44:09.460
So the two crazy things that happened to me,
link |
02:44:13.180
the one was the voice in the head
link |
02:44:14.980
that said you're gonna win this tournament
link |
02:44:16.100
and then I won the tournament.
link |
02:44:18.260
The other craziest thing that's happened to me
link |
02:44:20.940
was in 2018,
link |
02:44:25.300
I started getting this like weird problem in my ear
link |
02:44:30.420
where it was kind of like low frequency sound distortion,
link |
02:44:35.260
where voices, particularly men's voices,
link |
02:44:37.300
became incredibly unpleasant to listen to.
link |
02:44:40.660
It would like create this,
link |
02:44:42.380
it would like be falsely amplified or something
link |
02:44:44.100
and it was almost like a physical sensation in my ear,
link |
02:44:46.340
which was really unpleasant.
link |
02:44:48.100
And it would like last for a few hours and then go away
link |
02:44:50.980
and then come back for a few hours and go away.
link |
02:44:52.540
And I went and got hearing tests
link |
02:44:54.460
and they found that like the bottom end,
link |
02:44:56.340
I was losing the hearing in that ear.
link |
02:45:00.140
And in the end, I got,
link |
02:45:03.740
doctors said they think it was this thing
link |
02:45:05.620
called Meniere's disease,
link |
02:45:07.220
which is this very unpleasant disease
link |
02:45:10.540
where people basically end up losing their hearing,
link |
02:45:12.100
but they get this like,
link |
02:45:14.140
it often comes with like dizzy spells and other things
link |
02:45:16.260
because it's like the inner ear gets all messed up.
link |
02:45:19.540
Now, I don't know if that's actually what I had,
link |
02:45:21.780
but that's what at least a couple of,
link |
02:45:23.420
one doctor said to me.
link |
02:45:24.940
But anyway, so I'd had three months of this stuff,
link |
02:45:26.980
this going on and it was really getting me down.
link |
02:45:28.820
And I was at Burning Man of all places.
link |
02:45:32.540
I don't mean to be that person talking about Burning Man,
link |
02:45:35.100
but I was there.
link |
02:45:36.380
And again, I'd had it and I was unable to listen to music,
link |
02:45:39.140
which is not what you want
link |
02:45:40.060
because Burning Man is a very loud, intense place.
link |
02:45:42.580
And I was just having a really rough time.
link |
02:45:44.060
And on the final night,
link |
02:45:45.660
I get talking to this girl who's like a friend of a friend.
link |
02:45:49.620
And I mentioned, I was like,
link |
02:45:51.020
oh, I'm really down in the dumps about this.
link |
02:45:52.540
And she's like, oh, well,
link |
02:45:53.380
I've done a little bit of energy healing.
link |
02:45:54.940
Would you like me to have a look?
link |
02:45:56.340
And I was like, sure.
link |
02:45:57.900
Now this was again,
link |
02:45:59.940
deep, I was, you know, no time in my life for this.
link |
02:46:03.340
I didn't believe in any of this stuff.
link |
02:46:04.900
I was just like, it's all bullshit.
link |
02:46:06.100
It's all wooey nonsense.
link |
02:46:08.980
But I was like, sure, I'll have a go.
link |
02:46:10.740
And she starts like with her hand and she says,
link |
02:46:14.020
oh, there's something there.
link |
02:46:15.340
And then she leans in and she starts like sucking
link |
02:46:18.100
over my ear, not actually touching me,
link |
02:46:19.900
but like close to it, like with her mouth.
link |
02:46:22.620
And it was really unpleasant.
link |
02:46:23.780
I was like, whoa, can you stop?
link |
02:46:24.780
She's like, no, no, no, there's something there.
link |
02:46:25.820
I need to get it.
link |
02:46:26.660
And I was like, no, no, no, I really don't like it.
link |
02:46:27.780
Please, this is really loud.
link |
02:46:29.140
She's like, I need to just bear with me.
link |
02:46:31.100
And she does it.
link |
02:46:31.940
And I don't know how long, for a few minutes.
link |
02:46:33.700
And then she eventually collapses on the ground,
link |
02:46:36.500
like freezing cold, crying.
link |
02:46:40.580
And I'm just like, I don't know what the hell is going on.
link |
02:46:42.700
Like I'm like thoroughly freaked out
link |
02:46:44.300
as is everyone else watching.
link |
02:46:45.420
Just like, what the hell?
link |
02:46:46.340
And we like warm her up and she was like,
link |
02:46:47.700
oh, what, oh, you know, she was really shaken up.
link |
02:46:51.340
And she's like, I don't know what that,
link |
02:46:54.700
she said it was something very unpleasant and dark.
link |
02:46:57.580
Don't worry, it's gone.
link |
02:46:58.700
I think you'll be fine in a couple,
link |
02:46:59.900
you'll have the physical symptoms for a couple of weeks
link |
02:47:01.420
and you'll be fine.
link |
02:47:02.700
But, you know, she was just like that, you know,
link |
02:47:05.740
so I was so rattled, A, because the potential that actually
link |
02:47:10.020
I'd had something bad in me that made someone feel bad
link |
02:47:13.100
and that she was scared.
link |
02:47:15.340
That was what, you know, I was like, wait,
link |
02:47:16.500
I thought you do this, this is the thing.
link |
02:47:19.300
Now you're terrified.
link |
02:47:20.340
Like you brought like some kind of exorcism or something.
link |
02:47:22.540
Like what the fuck is going on?
link |
02:47:24.620
So it, like just the most insane experience.
link |
02:47:29.700
And frankly, it took me like a few months
link |
02:47:31.740
to sort of emotionally recover from it.
link |
02:47:35.380
But my ear problem went away about a couple of weeks later
link |
02:47:39.540
and touch wood, I've not had any issues since.
link |
02:47:42.940
So.
link |
02:47:44.740
That gives you like hints
link |
02:47:48.140
that maybe there's something out there.
link |
02:47:51.060
I mean, I don't, again,
link |
02:47:53.340
I don't have an explanation for this.
link |
02:47:55.180
The most probable explanation was, you know,
link |
02:47:57.820
I was at Burning Man, I was in a very open state.
link |
02:47:59.860
Let's just leave it at that.
link |
02:48:01.820
And, you know, placebo is an incredibly powerful thing
link |
02:48:08.060
and a very not understood thing.
link |
02:48:10.300
So.
link |
02:48:11.140
Almost assigning the word placebo to it reduces it down
link |
02:48:13.740
to a way that it doesn't deserve to be reduced down.
link |
02:48:16.500
Maybe there's a whole science of what we call placebo.
link |
02:48:19.180
Maybe there's a, placebo's a door.
link |
02:48:21.540
Self healing, you know?
link |
02:48:24.020
And I mean, I don't know what the problem was.
link |
02:48:26.380
Like I was told it was Meniere's.
link |
02:48:27.700
I don't want to say I definitely had that
link |
02:48:29.500
because I don't want people to think that,
link |
02:48:30.700
oh, that's how, you know, if they do have that,
link |
02:48:32.220
because it's a terrible disease.
link |
02:48:33.180
And if they have that,
link |
02:48:34.020
that this is going to be a guaranteed way for it
link |
02:48:35.100
to fix it for them.
link |
02:48:35.940
I don't know.
link |
02:48:37.740
And I also don't, I don't,
link |
02:48:39.740
and you're absolutely right to say,
link |
02:48:41.060
like using even the word placebo is like,
link |
02:48:43.380
it comes with this like baggage of, of like frame.
link |
02:48:48.020
And I don't want to reduce it down.
link |
02:48:49.980
All I can do is describe the experience and what happened.
link |
02:48:52.900
I cannot put an ontological framework around it.
link |
02:48:56.580
I can't say why it happened, what the mechanism was,
link |
02:49:00.100
what the problem even was in the first place.
link |
02:49:02.980
I just know that something crazy happened
link |
02:49:05.220
and it was while I was in an open state.
link |
02:49:07.060
And fortunately for me, it made the problem go away.
link |
02:49:09.540
But what I took away from it, again,
link |
02:49:11.860
it was part of this, you know,
link |
02:49:13.580
this took me on this journey of becoming more humble
link |
02:49:16.020
about what I think I know.
link |
02:49:17.500
Because as I said before, I was like,
link |
02:49:18.860
I was in the like Richard Dawkins train of atheism
link |
02:49:21.740
in terms of there is no God.
link |
02:49:23.380
And everything like that is bullshit.
link |
02:49:24.820
We know everything, we know, you know,
link |
02:49:26.660
the only way we can get through,
link |
02:49:28.980
we know how medicine works and its molecules
link |
02:49:30.820
and chemical interactions and that kind of stuff.
link |
02:49:33.980
And now it's like, okay, well,
link |
02:49:36.580
there's clearly more for us to understand.
link |
02:49:40.820
And that doesn't mean that it's ascientific as well,
link |
02:49:43.540
because, you know, the beauty of the scientific method
link |
02:49:47.140
is that it still can apply to this situation.
link |
02:49:49.940
Like, I don't see why, you know,
link |
02:49:51.340
I would like to try and test this experimentally.
link |
02:49:54.300
I haven't really, like, you know,
link |
02:49:55.540
I don't know how we would go about doing that.
link |
02:49:57.100
We'd have to find other people with the same condition,
link |
02:49:58.860
I guess, and like, try and repeat the experiment.
link |
02:50:04.220
But it doesn't, just because something happens
link |
02:50:06.980
that's sort of out of the realms
link |
02:50:09.180
of our current understanding,
link |
02:50:10.100
it doesn't mean that it's,
link |
02:50:11.940
the scientific method can't be used for it.
link |
02:50:13.820
Yeah, I think the scientific method sits on a foundation
link |
02:50:17.900
of those kinds of experiences,
link |
02:50:20.220
because the scientific method is a process
link |
02:50:24.660
to carve away at the mystery all around us.
link |
02:50:30.980
And experiences like this is just a reminder
link |
02:50:33.740
that we're mostly shrouded in mystery still.
link |
02:50:36.700
That's it.
link |
02:50:37.540
It's just like a humility.
link |
02:50:38.940
Like, we haven't really figured this whole thing out.
link |
02:50:41.580
But at the same time, we have found ways
link |
02:50:44.500
to act, you know, we're clearly doing something right,
link |
02:50:47.500
because think of the technological scientific advancements,
link |
02:50:50.020
the knowledge that we have that would blow people's minds
link |
02:50:53.980
even from 100 years ago.
link |
02:50:55.620
Yeah, and we've even allegedly got out to space
link |
02:50:58.660
and landed on the moon, although I still haven't,
link |
02:51:01.220
I have not seen evidence of the Earth being round,
link |
02:51:04.340
but I'm keeping an open mind.
link |
02:51:08.140
Speaking of which, you studied physics
link |
02:51:10.380
and astrophysics, just to go to that,
link |
02:51:16.860
just to jump around through the fascinating life you've had,
link |
02:51:20.940
when did you, how did that come to be?
link |
02:51:23.620
Like, when did you fall in love with astronomy
link |
02:51:25.860
and space and things like this?
link |
02:51:28.580
As early as I can remember.
link |
02:51:30.460
I was very lucky that my mom, and my dad,
link |
02:51:33.380
but particularly my mom, my mom is like the most nature,
link |
02:51:38.300
she is Mother Earth, is the only way to describe her.
link |
02:51:41.220
Just, she's like Dr. Doolittle, animals flock to her
link |
02:51:44.540
and just like sit and look at her adoringly.
link |
02:51:47.100
As she sings.
link |
02:51:48.380
Yeah, she just is Mother Earth,
link |
02:51:51.060
and she has always been fascinated by,
link |
02:51:54.540
she doesn't have any, she never went to university
link |
02:51:57.100
or anything like that, she's actually phobic of maths,
link |
02:51:59.380
if I try and get her to like,
link |
02:52:00.860
you know, I was trying to teach her poker and she hated it.
link |
02:52:03.860
But she's so deeply curious,
link |
02:52:06.840
and that just got instilled in me when, you know,
link |
02:52:09.880
we would sleep out under the stars,
link |
02:52:11.200
whenever it was, you know, the two nights a year
link |
02:52:13.160
when it was warm enough in the UK to do that.
link |
02:52:15.760
And we would just lie out there until we fell asleep,
link |
02:52:18.680
looking at, looking for satellites,
link |
02:52:20.920
looking for shooting stars, and I was just always,
link |
02:52:24.320
I don't know whether it was from that,
link |
02:52:25.320
but I've always naturally gravitated to like the biggest,
link |
02:52:30.440
the biggest questions.
link |
02:52:31.960
And also the like, the most layers of abstraction I love,
link |
02:52:35.520
just like, what's the meta question?
link |
02:52:36.720
What's the meta question and so on.
link |
02:52:38.800
So I think it just came from that really.
link |
02:52:40.800
And then on top of that, like physics,
link |
02:52:43.760
you know, it also made logical sense
link |
02:52:45.880
in that it was a degree that,
link |
02:52:51.240
well, a subject that ticks the box of being,
link |
02:52:53.800
you know, answering these really big picture questions,
link |
02:52:55.520
but it was also extremely useful.
link |
02:52:57.840
It like has a very high utility in terms of,
link |
02:53:00.920
I didn't know necessarily,
link |
02:53:02.080
I thought I was gonna become like a research scientist.
link |
02:53:04.400
My original plan was,
link |
02:53:05.240
I wanna be a professional astronomer.
link |
02:53:07.040
So it's not just like a philosophy degree
link |
02:53:08.640
that asks the big questions,
link |
02:53:10.400
and it's not like biology and the path
link |
02:53:14.440
to go to medical school or something like that,
link |
02:53:16.160
which is all overly pragmatic, not overly,
link |
02:53:19.200
is very pragmatic, but this is, yeah,
link |
02:53:23.760
physics is a good combination of the two.
link |
02:53:26.480
Yeah, at least for me, it made sense.
link |
02:53:27.920
And I was good at it, I liked it.
link |
02:53:30.160
Yeah, I mean, it wasn't like I did an immense amount
link |
02:53:32.440
of soul searching to choose it or anything.
link |
02:53:34.320
It just was like this, it made the most sense.
link |
02:53:38.160
I mean, you have to make this decision in the UK age 17,
link |
02:53:41.240
which is crazy, because, you know, in US,
link |
02:53:43.640
you go the first year, you do a bunch of stuff, right?
link |
02:53:46.240
And then you choose your major.
link |
02:53:48.520
Yeah, I think the first few years of college,
link |
02:53:50.000
you focus on the drugs and only as you get closer
link |
02:53:53.080
to the end, do you start to think, oh shit,
link |
02:53:56.360
this wasn't about that.
link |
02:53:57.440
And I owe the government a lot of money.
link |
02:54:01.380
How many alien civilizations are out there?
link |
02:54:05.180
When you looked up at the stars with your mom
link |
02:54:07.580
and you were counting them, what's your mom think
link |
02:54:11.820
about the number of alien civilizations?
link |
02:54:13.980
I actually don't know.
link |
02:54:15.780
I would imagine she would take the viewpoint of,
link |
02:54:18.500
you know, she's pretty humble and she knows how many,
link |
02:54:21.300
she knows there's a huge number of potential spawn sites
link |
02:54:24.100
out there, so she would.
link |
02:54:25.380
Spawn sites?
link |
02:54:26.220
Spawn sites, yeah.
link |
02:54:27.540
You know, this is all spawn sites.
link |
02:54:29.500
Yeah, spawn sites in Polytopia.
link |
02:54:30.780
We spawned on Earth, you know, it's.
link |
02:54:33.020
Hmm, yeah, spawn sites.
link |
02:54:35.860
Why does that feel weird to say spawn?
link |
02:54:39.260
Because it makes me feel like there's only one source
link |
02:54:44.860
of life and it's spawning in different locations.
link |
02:54:47.420
That's why the word spawn.
link |
02:54:49.220
Because it feels like life that originated on Earth
link |
02:54:52.580
really originated here.
link |
02:54:54.700
Right, it is unique to this particular.
link |
02:54:58.780
Yeah, I mean, but I don't, in my mind, it doesn't exclude,
link |
02:55:02.600
you know, the completely different forms of life
link |
02:55:04.380
and different biochemical soups can't also spawn,
link |
02:55:09.160
but I guess it implies that there's some spark
link |
02:55:12.580
that is uniform, which I kind of like the idea of.
link |
02:55:16.180
And then I get to think about respawning,
link |
02:55:19.220
like after it dies, like what happens if life on Earth ends?
link |
02:55:23.500
Is it gonna restart again?
link |
02:55:26.140
Probably not, it depends.
link |
02:55:28.060
Maybe Earth is too.
link |
02:55:28.900
It depends on the type of, you know,
link |
02:55:30.460
what's the thing that kills it off, right?
link |
02:55:32.820
If it's a paperclip maximizer, not that exact example,
link |
02:55:36.100
but, you know, some kind of very self-replicating,
link |
02:55:40.820
high on the capabilities, very low on the wisdom type thing.
link |
02:55:44.380
So whether that's, you know, gray goo, green goo,
link |
02:55:47.420
you know, like nanobots or just a shitty misaligned AI
link |
02:55:51.180
that thinks it needs to turn everything into paperclips.
link |
02:55:54.360
You know, if it's something like that,
link |
02:55:57.680
then it's gonna be very hard for life,
link |
02:55:59.040
you know, complex life, because by definition,
link |
02:56:02.080
you know, a paperclip maximizer
link |
02:56:03.160
is the ultimate instantiation of Moloch.
link |
02:56:05.680
Deeply low complexity, over optimization on a single thing,
link |
02:56:08.620
sacrificing everything else, turning the whole world into.
link |
02:56:11.120
Although something tells me,
link |
02:56:12.200
like if we actually take a paperclip maximizer,
link |
02:56:14.640
it destroys everything.
link |
02:56:16.180
It's a really dumb system that just envelops
link |
02:56:19.200
the whole of Earth.
link |
02:56:20.760
And the universe beyond, yeah.
link |
02:56:22.360
Oh, I didn't know that part, but okay, great.
link |
02:56:26.600
That's the thought experiment.
link |
02:56:27.440
So it becomes a multi planetary paperclip maximizer?
link |
02:56:30.280
Well, it just propagates.
link |
02:56:31.760
I mean, it depends whether it figures out
link |
02:56:33.640
how to jump the vacuum gap.
link |
02:56:36.720
But again, I mean, this is all silly
link |
02:56:38.640
because it's a hypothetical thought experiment,
link |
02:56:40.280
which I think doesn't actually have
link |
02:56:41.200
much practical application to the AI safety problem,
link |
02:56:43.240
but it's just a fun thing to play around with.
link |
02:56:45.240
But if by definition, it is maximally intelligent,
link |
02:56:47.780
which means it is maximally good at
link |
02:56:50.200
navigating the environment around it
link |
02:56:53.080
in order to achieve its goal,
link |
02:56:54.720
but extremely bad at choosing goals in the first place.
link |
02:56:58.000
So again, we're talking on this orthogonality thing, right?
link |
02:57:00.000
It's very low on wisdom, but very high on capability.
link |
02:57:03.720
Then it will figure out how to jump the vacuum gap
link |
02:57:05.680
between planets and stars and so on,
link |
02:57:07.400
and thus just turn every atom it gets its hands on
link |
02:57:09.900
into paperclips.
link |
02:57:10.960
Yeah, by the way, for people who don't.
link |
02:57:12.800
Which is maximum virality, by the way.
link |
02:57:14.600
That's what virality is.
link |
02:57:16.240
But does not mean that virality is necessarily
link |
02:57:18.540
all about maximizing paperclips.
link |
02:57:20.040
In that case, it is.
link |
02:57:21.080
So for people who don't know,
link |
02:57:22.280
this is just a thought experiment example
link |
02:57:24.280
of an AI system that's very, that has a goal
link |
02:57:27.360
and is willing to do anything to accomplish that goal,
link |
02:57:30.400
including destroying all life on Earth
link |
02:57:32.280
and all human life and all of consciousness in the universe
link |
02:57:36.320
for the goal of producing a maximum number of paperclips.
link |
02:57:40.720
Okay.
link |
02:57:41.560
Or whatever its optimization function was
link |
02:57:43.960
that it was set at.
link |
02:57:45.000
But don't you think?
link |
02:57:45.840
It could be making, recreating Lexes.
link |
02:57:47.920
Maybe it'll tile the universe in Lex.
link |
02:57:50.600
Go on.
link |
02:57:51.640
I like this idea.
link |
02:57:52.480
No, I'm just kidding.
link |
02:57:53.320
That's better.
link |
02:57:54.800
That's more interesting than paperclips.
link |
02:57:56.240
That could be infinitely optimal
link |
02:57:57.680
if I were to say it to myself.
link |
02:57:58.520
But if you ask me, it's still a bad thing
link |
02:58:00.400
because it's permanently capping
link |
02:58:02.780
what the universe could ever be.
link |
02:58:04.200
It's like, that's its end state.
link |
02:58:05.880
Or achieving the optimal
link |
02:58:07.840
that the universe could ever achieve.
link |
02:58:09.240
But that's up to,
link |
02:58:10.320
different people have different perspectives.
link |
02:58:12.480
But don't you think within the paperclip world
link |
02:58:15.520
that would emerge, just like in the zeros and ones
link |
02:58:19.320
that make up a computer,
link |
02:58:20.200
that would emerge beautiful complexities?
link |
02:58:23.760
Like, it won't suppress, you know,
link |
02:58:27.820
as you scale to multiple planets and throughout,
link |
02:58:30.500
there'll emerge these little worlds
link |
02:58:33.280
that on top of the fabric of maximizing paperclips,
link |
02:58:38.140
there will be, that would emerge like little societies
link |
02:58:42.880
of paperclip.
link |
02:58:45.880
Well, then we're not describing
link |
02:58:47.520
a paperclip maximizer anymore.
link |
02:58:48.600
Because by the, like, if you think of what a paperclip is,
link |
02:58:51.600
it is literally just a piece of bent iron, right?
link |
02:58:55.600
So if it's maximizing that throughout the universe,
link |
02:58:58.960
it's taking every atom it gets its hand on
link |
02:59:01.040
into somehow turning it into iron or steel.
link |
02:59:04.240
And then bending it into that shape
link |
02:59:05.400
and then done and done.
link |
02:59:06.800
By definition, like paperclips,
link |
02:59:08.880
there is no way for, well, okay.
link |
02:59:11.960
So you're saying that paperclips somehow
link |
02:59:13.920
will just emerge and create through gravity or something.
link |
02:59:17.960
Well, no, no, no.
link |
02:59:18.800
Because there's a dynamic element to the whole system.
link |
02:59:21.600
It's not just, it's creating those paperclips
link |
02:59:24.720
and the act of creating, there's going to be a process.
link |
02:59:27.900
And that process will have a dance to it.
link |
02:59:30.640
Because it's not like sequential thing.
link |
02:59:32.640
There's a whole complex three dimensional system
link |
02:59:35.240
of paperclips, you know, like, you know,
link |
02:59:37.760
people like string theory, right?
link |
02:59:39.400
It's supposed to be strings that are interacting
link |
02:59:40.920
in fascinating ways.
link |
02:59:41.960
I'm sure paperclips are very string like,
link |
02:59:44.520
they can be interacting in very interesting ways
link |
02:59:46.520
as you scale exponentially through three dimensional.
link |
02:59:50.040
I mean, I'm sure the paperclip maximizer
link |
02:59:53.840
has to come up with a theory of everything.
link |
02:59:55.680
It has to create like wormholes, right?
link |
02:59:58.320
It has to break, like,
link |
03:00:01.040
it has to understand quantum mechanics.
link |
03:00:02.680
It has to understand general relativity.
link |
03:00:03.520
I love your optimism.
link |
03:00:04.840
This is where I'd say this,
link |
03:00:06.240
we're going into the realm of pathological optimism
link |
03:00:08.400
where if I, it's.
link |
03:00:09.360
I'm sure there'll be a,
link |
03:00:12.800
I think there's an intelligence
link |
03:00:14.880
that emerges from that system.
link |
03:00:16.320
So you're saying that basically intelligence
link |
03:00:18.480
is inherent in the fabric of reality and will find a way.
link |
03:00:21.680
Kind of like Goldblum says, life will find a way.
link |
03:00:23.880
You think life will find a way
link |
03:00:25.240
even out of this perfectly homogenous dead soup.
link |
03:00:29.520
It's not perfectly homogenous.
link |
03:00:31.600
It has to, it's perfectly maximal in the production.
link |
03:00:34.960
I don't know why people keep thinking it's homogenous.
link |
03:00:37.400
It maximizes the number of paperclips.
link |
03:00:39.560
That's the only thing.
link |
03:00:40.400
It's not trying to be homogenous.
link |
03:00:42.040
It's trying.
link |
03:00:42.880
It's trying to maximize paperclips.
link |
03:00:44.760
So you're saying, you're saying that because it,
link |
03:00:47.840
because, you know, kind of like in the Big Bang
link |
03:00:50.160
or, you know, it seems like, you know, things,
link |
03:00:52.200
there were clusters, there was more stuff here than there.
link |
03:00:54.880
That was enough of the patternicity
link |
03:00:56.600
that kickstarted the evolutionary process.
link |
03:00:58.440
It's the little weirdness that will make it beautiful.
link |
03:01:01.280
So yeah.
link |
03:01:02.120
Complexity emerges.
link |
03:01:03.000
Interesting, okay.
link |
03:01:04.200
Well, so how does that line up then
link |
03:01:05.600
with the whole heat death of the universe, right?
link |
03:01:08.080
Cause that's another sort of instantiation of this.
link |
03:01:10.080
It's like everything becomes so far apart and so cold
link |
03:01:13.200
and so perfectly mixed that it's like homogenous grayness.
link |
03:01:20.360
Do you think that even out of that homogenous grayness
link |
03:01:23.280
where there's no, you know, negative entropy,
link |
03:01:26.880
that, you know, there's no free energy that we understand
link |
03:01:30.880
even from that new stuff?
link |
03:01:33.920
Yeah, the paperclip maximizer
link |
03:01:36.560
or any other intelligence systems
link |
03:01:38.120
will figure out ways to travel to other universes
link |
03:01:40.960
to create Big Bangs within those universes
link |
03:01:43.480
or through black holes to create whole other worlds
link |
03:01:46.120
to break the, what we consider are the limitations
link |
03:01:49.840
of physics.
link |
03:01:52.720
The paperclip maximizer will find a way if a way exists.
link |
03:01:56.640
And we should be humbled to realize that we don't.
link |
03:01:59.000
Yeah, but because it just wants to make more paperclips.
link |
03:02:01.440
So it's gonna go into those universes
link |
03:02:02.640
and turn them into paperclips.
link |
03:02:03.760
Yeah, but we humans, not humans,
link |
03:02:07.480
but complex system exists on top of that.
link |
03:02:10.320
We're not interfering with it.
link |
03:02:13.000
This complexity emerges from the simple base state.
link |
03:02:17.360
The simple base.
link |
03:02:18.280
Whether it's, yeah, whether it's, you know,
link |
03:02:20.280
Planck lengths or paperclips as the base unit.
link |
03:02:22.640
Yeah, you can think of like the universe
link |
03:02:25.720
as a paperclip maximizer because it's doing some dumb stuff.
link |
03:02:28.920
Like physics seems to be pretty dumb.
link |
03:02:31.320
It has, like, I don't know if you can summarize it.
link |
03:02:34.120
Yeah, the laws are fairly basic
link |
03:02:37.360
and yet out of them amazing complexity emerges.
link |
03:02:39.640
And its goals seem to be pretty basic and dumb.
link |
03:02:43.440
If you can summarize its goals,
link |
03:02:45.280
I mean, I don't know what's a nice way maybe,
link |
03:02:49.480
maybe laws of thermodynamics could be good.
link |
03:02:52.400
I don't know if you can assign goals to physics,
link |
03:02:55.080
but if you formulate in the sense of goals,
link |
03:02:57.960
it's very similar to paperclip maximizing
link |
03:03:00.840
in the dumbness of the goals.
link |
03:03:02.960
But the pockets of complexity as it emerge
link |
03:03:06.520
is where beauty emerges.
link |
03:03:07.960
That's where life emerges.
link |
03:03:09.120
That's where intelligence, that's where humans emerge.
link |
03:03:12.120
And I think we're being very down
link |
03:03:14.000
on this whole paperclip maximizer thing.
link |
03:03:16.240
Now, the reason we hated it.
link |
03:03:17.680
I think, yeah, because what you're saying
link |
03:03:19.000
is that you think that the force of emergence itself
link |
03:03:24.160
is another like unwritten, not unwritten,
link |
03:03:27.480
but like another baked in law of reality.
link |
03:03:31.800
And you're trusting that emergence will find a way to,
link |
03:03:35.200
even out of seemingly the most Moloch-y,
link |
03:03:38.720
awful, plain outcome, emergence will still find a way.
link |
03:03:42.200
I love that as a philosophy.
link |
03:03:43.440
I think it's very nice.
link |
03:03:44.320
I would wield it carefully
link |
03:03:47.280
because there's large error bars on that
link |
03:03:50.440
and the certainty of that.
link |
03:03:53.000
How about we build the paperclip maximizer and find out.
link |
03:03:55.920
Classic, yeah.
link |
03:03:56.840
Moloch is doing cartwheels, man.
link |
03:03:59.120
Yeah.
link |
03:03:59.960
But the thing is it will destroy humans in the process,
link |
03:04:02.440
which is the reason we really don't like it.
link |
03:04:05.320
We seem to be really holding on
link |
03:04:07.240
to this whole human civilization thing.
link |
03:04:10.240
Would that make you sad if AI systems that are beautiful,
link |
03:04:13.840
that are conscious, that are interesting
link |
03:04:15.680
and complex and intelligent,
link |
03:04:18.160
ultimately lead to the death of humans?
link |
03:04:20.040
Would that make you sad?
link |
03:04:21.080
If humans led to the death of humans?
link |
03:04:23.000
Sorry.
link |
03:04:23.840
Like if they would supersede humans.
link |
03:04:25.520
Oh, if some AI?
link |
03:04:26.840
Yeah, AI would end humans.
link |
03:04:30.320
I mean, that's the reason why I'm like,
link |
03:04:32.560
in some ways less emotionally concerned about AI risk
link |
03:04:36.440
than say, bio risk.
link |
03:04:39.480
Because at least with AI, there's a chance,
link |
03:04:42.200
you know, if we're in this hypothetical
link |
03:04:44.120
where it wipes out humans,
link |
03:04:45.640
but it does it for some like higher purpose,
link |
03:04:48.240
it needs our atoms and energy to do something.
link |
03:04:51.040
At least now the universe is going on
link |
03:04:53.120
to do something interesting,
link |
03:04:55.240
whereas if it wipes everything, you know,
link |
03:04:56.760
bio like just kills everything on earth and that's it.
link |
03:05:00.120
And there's no more, you know,
link |
03:05:01.440
earth cannot spawn anything more meaningful
link |
03:05:03.320
in the few hundred million years it has left,
link |
03:05:05.960
because it doesn't have much time left.
link |
03:05:09.040
Then, yeah, I don't know.
link |
03:05:13.480
So one of my favorite books I've ever read is,
link |
03:05:16.120
Novacene by James Lovelock, who sadly just died.
link |
03:05:19.920
He wrote it when he was like 99.
link |
03:05:22.080
He died aged 102, so it's a fairly new book.
link |
03:05:25.560
And he sort of talks about that,
link |
03:05:27.000
that he thinks it's, you know,
link |
03:05:29.240
sort of building off this Gaia theory
link |
03:05:30.800
where like earth is like living,
link |
03:05:34.320
some form of intelligence itself,
link |
03:05:36.080
and that this is the next like step, right?
link |
03:05:38.360
Is this, whatever this new intelligence
link |
03:05:41.720
that is maybe silicon based
link |
03:05:43.360
as opposed to carbon based goes on to do.
link |
03:05:46.120
And it's a really sort of, in some ways an optimistic,
link |
03:05:48.280
but really fatalistic book.
link |
03:05:49.800
And I don't know if I fully subscribed to it,
link |
03:05:52.400
but it's a beautiful piece to read anyway.
link |
03:05:54.400
So am I sad by that idea?
link |
03:05:56.760
I think so, yes.
link |
03:05:57.920
And actually, yeah, this is the reason
link |
03:05:59.240
why I'm sad by the idea,
link |
03:06:00.120
because if something is truly brilliant
link |
03:06:02.520
and wise and smart and truly super intelligent,
link |
03:06:06.440
it should be able to figure out abundance.
link |
03:06:09.320
So if it figures out abundance,
link |
03:06:11.720
it shouldn't need to kill us off.
link |
03:06:12.800
It should be able to find a way for us.
link |
03:06:14.360
It should be, there's plenty, the universe is huge.
link |
03:06:17.440
There should be plenty of space for it to go out
link |
03:06:19.920
and do all the things it wants to do,
link |
03:06:21.760
and like give us a little pocket
link |
03:06:23.440
where we can continue doing our things
link |
03:06:24.880
and we can continue to do things and so on.
link |
03:06:27.400
And again, if it's so supremely wise,
link |
03:06:28.920
it shouldn't even be worried
link |
03:06:29.760
about the game theoretic considerations
link |
03:06:31.920
that by leaving us alive,
link |
03:06:33.000
we'll then go and create another like super intelligent agent
link |
03:06:35.120
that it then has to compete against,
link |
03:06:36.600
because it should be only wise and smart enough
link |
03:06:38.280
to not have to concern itself with that.
link |
03:06:40.680
Unless it deems humans to be kind of assholes.
link |
03:06:44.680
Like the humans are a source
link |
03:06:48.680
of a lose-lose kind of dynamics.
link |
03:06:51.880
Well, yes and no, we're not,
link |
03:06:55.440
Moloch is, that's why I think it's important to separate.
link |
03:06:57.600
But maybe humans are the source of Moloch.
link |
03:07:00.200
No, I think, I mean, I think game theory
link |
03:07:02.600
is the source of Moloch.
link |
03:07:03.800
And, you know, because Moloch exists
link |
03:07:05.560
in nonhuman systems as well.
link |
03:07:08.000
It happens within like agents within a game
link |
03:07:10.560
in terms of like, you know, it applies to agents,
link |
03:07:13.920
but like it can apply to, you know,
link |
03:07:17.400
a species that's on an island of animals,
link |
03:07:20.640
you know, rats out competing,
link |
03:07:22.680
the ones that like massively consume all the resources
link |
03:07:25.520
are the ones that are gonna win out
link |
03:07:26.840
over the more like chill, socialized ones.
link |
03:07:29.680
And so, you know, creates this Malthusian trap,
link |
03:07:31.520
like Moloch exists in little pockets in nature as well.
link |
03:07:34.480
So it's not a strictly human thing.
link |
03:07:35.960
I wonder if it's actually a result of consequences
link |
03:07:38.320
of the invention of predator and prey dynamics.
link |
03:07:41.520
Maybe it needs to, AI will have to kill off
link |
03:07:45.240
every organism that's.
link |
03:07:47.840
Now you're talking about killing off competition.
link |
03:07:50.240
Not competition, but just like the way,
link |
03:07:55.880
it's like the weeds or whatever
link |
03:07:59.960
in a beautiful flower garden.
link |
03:08:01.640
Parasites.
link |
03:08:02.480
The parasites, yeah, on the whole system.
link |
03:08:05.200
Now, of course, it won't do that completely.
link |
03:08:08.440
It'll put them in a zoo like we do with parasites.
link |
03:08:10.520
It'll ring fence.
link |
03:08:11.360
Yeah, and there'll be somebody doing a PhD
link |
03:08:13.120
on like they'll prod humans with a stick
link |
03:08:15.800
and see what they do.
link |
03:08:18.880
But I mean, in terms of letting us run wild
link |
03:08:22.400
outside of the, you know, a geographically
link |
03:08:25.080
constrained region that might be,
link |
03:08:27.760
that it might decide to against that.
link |
03:08:31.200
No, I think there's obviously the capacity
link |
03:08:33.160
for beauty and kindness and non Moloch behavior
link |
03:08:38.160
amidst humans, so I'm pretty sure AI will preserve us.
link |
03:08:42.440
Let me, I don't know if you answered the aliens question.
link |
03:08:46.240
No, I didn't.
link |
03:08:47.080
You had a good conversation with Toby Ord.
link |
03:08:49.600
Yes.
link |
03:08:50.440
About various sides of the universe.
link |
03:08:52.200
I think, did he say, now I'm forgetting,
link |
03:08:54.680
but I think he said it's a good chance we're alone.
link |
03:08:58.160
So the classic, you know, Fermi paradox question is,
link |
03:09:02.200
there are so many spawn points and yet, you know,
link |
03:09:09.200
it didn't take us that long to go from harnessing fire
link |
03:09:12.080
to sending out radio signals into space.
link |
03:09:15.120
So surely given the vastness of space we should be,
link |
03:09:18.640
and you know, even if only a tiny fraction of those
link |
03:09:20.720
create life and other civilizations too,
link |
03:09:23.120
we should be, the universe should be very noisy.
link |
03:09:24.840
There should be evidence of Dyson spheres or whatever,
link |
03:09:27.800
you know, like at least radio signals and so on,
link |
03:09:29.680
but seemingly things are very silent out there.
link |
03:09:32.520
Now, of course, it depends on who you speak to.
link |
03:09:33.800
Some people say that they're getting signals all the time
link |
03:09:35.840
and so on and like, I don't wanna make
link |
03:09:37.480
an epistemic statement on that,
link |
03:09:38.640
but it seems like there's a lot of silence.
link |
03:09:43.120
And so that raises this paradox.
link |
03:09:45.680
And then say, you know, the Drake equation.
link |
03:09:51.040
So the Drake equation is like basically just a simple thing
link |
03:09:55.120
of like trying to estimate the number of possible
link |
03:09:57.720
civilizations within the galaxy
link |
03:09:58.960
by multiplying the number of stars created per year
link |
03:10:02.080
by the number of stars that have planets,
link |
03:10:03.560
planets that are habitable, blah, blah, blah.
link |
03:10:04.760
So all these like different factors.
link |
03:10:06.520
And then you plug in numbers into that and you, you know,
link |
03:10:09.240
depending on like the range of, you know,
link |
03:10:11.400
your lower bound and your upper bound point estimates
link |
03:10:14.680
that you put in, you get out a number at the end
link |
03:10:16.800
for the number of civilizations.
link |
03:10:18.520
But what Toby and his crew did differently was,
link |
03:10:22.840
Toby is a researcher at the Future of Humanity Institute.
link |
03:10:25.640
They, instead of, they realized that it's like basically
link |
03:10:31.160
a statistical quirk that if you put in point estimates,
link |
03:10:33.800
even if you think you're putting in
link |
03:10:34.840
conservative point estimates,
link |
03:10:36.000
because on some of these variables,
link |
03:10:37.960
the uncertainty is so large,
link |
03:10:41.080
it spans like maybe even like a couple of hundred
link |
03:10:43.800
orders of magnitude.
link |
03:10:46.400
By putting in point estimates,
link |
03:10:47.440
it's always going to lead to overestimates.
link |
03:10:50.880
And so they, like by putting stuff on a log scale,
link |
03:10:54.320
or actually they did it on like a log log scale
link |
03:10:56.080
on some of them,
link |
03:10:57.400
and then like ran the simulation across the whole
link |
03:11:01.080
bucket of uncertainty,
link |
03:11:02.600
across all of those orders of magnitude.
link |
03:11:04.320
When you do that,
link |
03:11:05.840
then actually the number comes out much, much smaller.
link |
03:11:08.240
And that's the more statistically rigorous,
link |
03:11:10.440
you know, mathematically correct way
link |
03:11:11.920
of doing the calculation.
link |
03:11:13.360
It's still a lot of hand waving.
link |
03:11:14.680
As science goes, it's like definitely, you know,
link |
03:11:17.640
just waving, I don't know what an analogy is,
link |
03:11:19.840
but it's hand wavy.
link |
03:11:22.040
And anyway, when they did this,
link |
03:11:24.720
and then they did a Bayesian update on it as well,
link |
03:11:27.000
to like factor in the fact that there is no evidence
link |
03:11:30.120
that we're picking up because, you know,
link |
03:11:31.320
no evidence is actually a form of evidence, right?
link |
03:11:33.920
And the long and short of it comes out that the,
link |
03:11:37.680
we're roughly around 70% to be the only
link |
03:11:42.480
intelligent civilization in our galaxy thus far,
link |
03:11:45.080
and around 50, 50 in the entire observable universe,
link |
03:11:47.840
which sounds so crazily counterintuitive,
link |
03:11:50.200
but their math is legit.
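The statistical quirk described above can be sketched in a few lines: multiply single "conservative" point estimates and you get a large number of civilizations, but sample each factor across its full range of uncertainty on a log scale and most draws come out empty. This is only an illustrative sketch; the bounds below are invented for demonstration and are not the actual distributions used in Toby Ord and colleagues' analysis.

```python
import math
import random

random.seed(0)

def log_uniform(lo, hi):
    """Sample uniformly in log10 space between lo and hi."""
    return 10 ** random.uniform(math.log10(lo), math.log10(hi))

# Plugging in single "conservative" point values gives a big number.
point_estimate = 1e11 * 0.5 * 0.1 * 1e-3 * 0.01  # ~50,000 civilizations

# Sampling across the uncertainty instead (bounds are illustrative only):
N_TRIALS = 100_000
empty = 0
for _ in range(N_TRIALS):
    n = (log_uniform(1e10, 1e12)    # stars in the galaxy
         * log_uniform(0.1, 1.0)    # fraction of stars with planets
         * log_uniform(0.01, 1.0)   # habitable planets per star
         * log_uniform(1e-30, 1.0)  # chance life arises (huge uncertainty)
         * log_uniform(1e-3, 1.0))  # chance life becomes a civilization
    if n < 1:
        empty += 1
fraction_empty = empty / N_TRIALS

print(f"point estimate: {point_estimate:.0f} civilizations")
print(f"P(galaxy has no other civilization): {fraction_empty:.2f}")
```

Because one factor (abiogenesis) spans tens of orders of magnitude, most sampled galaxies contain fewer than one other civilization even though the point-estimate product predicts tens of thousands.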
link |
03:11:53.360
Well, yeah, the math around this particular equation,
link |
03:11:55.600
which the equation is ridiculous on many levels,
link |
03:11:57.760
but the powerful thing about the equation
link |
03:12:03.160
is there's the different things,
link |
03:12:05.200
different components that can be estimated,
link |
03:12:09.400
and the error bars on which can be reduced with science.
link |
03:12:13.800
And hence throughout, since the equation came out,
link |
03:12:17.320
the error bars have been coming out on different,
link |
03:12:19.560
different aspects.
link |
03:12:20.920
And so that, it almost kind of says,
link |
03:12:23.600
what, like this gives you a mission to reduce the error bars
link |
03:12:27.000
on these estimates over a period of time.
link |
03:12:30.000
And once you do, you can better and better understand,
link |
03:12:32.760
like in the process of redoing the error bars,
link |
03:12:34.720
you'll get to understand actually
link |
03:12:36.960
what is the right way to find out where the aliens are,
link |
03:12:41.480
how many of them there are, and all those kinds of things.
link |
03:12:43.960
So I don't think it's good to use that for an estimation.
link |
03:12:47.800
I think you do have to think from like,
link |
03:12:50.120
more like from first principles,
link |
03:12:51.840
just looking at what life is on Earth,
link |
03:12:55.320
and trying to understand the very physics based,
link |
03:12:59.320
biology, chemistry, biology based question of what is life,
link |
03:13:04.240
maybe computation based.
link |
03:13:05.960
What the fuck is this thing?
link |
03:13:07.920
And that, like how difficult is it to create this thing?
link |
03:13:12.160
It's one way to say like how many planets like this
link |
03:13:14.520
are out there, all that kind of stuff,
link |
03:13:16.160
but it feels like from our very limited knowledge
link |
03:13:20.560
perspective, the right way is to think how does,
link |
03:13:25.680
what is this thing and how does it originate?
link |
03:13:28.200
From very simple nonlife things,
link |
03:13:32.000
how does complex lifelike things emerge?
link |
03:13:37.120
From a rock to a bacteria, protein,
link |
03:13:42.120
and these like weird systems that encode information
link |
03:13:46.400
and pass information from self replicate,
link |
03:13:49.520
and then also select each other and mutate
link |
03:13:51.560
in interesting ways such that they can adapt
link |
03:13:53.440
and evolve and build increasingly more complex systems.
link |
03:13:56.560
Right, well it's a form of information processing, right?
link |
03:13:59.720
Right.
link |
03:14:00.560
Whereas information transfer, but then also
link |
03:14:04.000
an energy processing, which then results in,
link |
03:14:07.440
I guess information processing,
link |
03:14:09.080
maybe I'm getting bogged down.
link |
03:14:09.920
It's doing some modification and yeah,
link |
03:14:12.560
the input is some energy.
link |
03:14:14.440
Right, it's able to extract, yeah,
link |
03:14:17.600
extract resources from its environment
link |
03:14:20.640
in order to achieve a goal.
link |
03:14:22.680
But the goal doesn't seem to be clear.
link |
03:14:24.920
Right, well the goal is to make more of itself.
link |
03:14:29.440
Yeah, but in a way that increases,
link |
03:14:33.760
I mean I don't know if evolution
link |
03:14:36.320
is a fundamental law of the universe,
link |
03:14:39.840
but it seems to want to replicate itself
link |
03:14:44.200
in a way that maximizes the chance of its survival.
link |
03:14:47.640
Individual agents within an ecosystem do, yes, yes.
link |
03:14:51.320
Evolution itself doesn't give a fuck.
link |
03:14:53.280
Right.
link |
03:14:54.120
It's a very, it don't care.
link |
03:14:55.200
It's just like, oh, you optimize it.
link |
03:14:58.480
Well, at least it's certainly, yeah,
link |
03:15:01.520
it doesn't care about the welfare
link |
03:15:03.440
of the individual agents within it,
link |
03:15:05.040
but it does seem to, I don't know.
link |
03:15:06.440
I think the mistake is that we're anthropomorphizing.
link |
03:15:09.680
To even try and give evolution a mindset
link |
03:15:14.760
because it is, there's a really great post
link |
03:15:17.200
by Eliezer Yudkowsky on LessWrong,
link |
03:15:21.440
which is called An Alien God.
link |
03:15:25.360
And he talks about the mistake we make
link |
03:15:27.560
when we try and put our mind,
link |
03:15:30.080
think through things from an evolutionary perspective
link |
03:15:32.280
as though giving evolution some kind of agency
link |
03:15:35.160
and what it wants.
link |
03:15:37.320
Yeah, worth reading, but yeah.
link |
03:15:39.520
I would like to say that having interacted
link |
03:15:42.600
with a lot of really smart people
link |
03:15:43.960
that say that anthropomorphization is a mistake,
link |
03:15:46.920
I would like to say that saying
link |
03:15:48.720
that anthropomorphization is a mistake is a mistake.
link |
03:15:51.600
I think there's a lot of power in anthropomorphization,
link |
03:15:54.960
if I can only say that word correctly one time.
link |
03:15:57.720
I think that's actually a really powerful way
link |
03:16:00.760
to reason about things.
link |
03:16:01.800
And I think people, especially people in robotics
link |
03:16:04.320
seem to run away from it as fast as possible.
link |
03:16:07.120
And I just, I think.
link |
03:16:09.280
Can you give an example of like how it helps in robotics?
link |
03:16:13.160
Oh, in that our world is a world of humans
link |
03:16:19.040
and to see robots as fundamentally just tools
link |
03:16:24.200
runs away from the fact that we live in a world,
link |
03:16:27.960
a dynamic world of humans.
link |
03:16:30.000
That like these, all these game theory systems
link |
03:16:32.080
we've talked about, that a robot
link |
03:16:35.440
that ever has to interact with humans.
link |
03:16:37.560
And I don't mean like intimate friendship interaction.
link |
03:16:40.800
I mean, in a factory setting where it has to deal
link |
03:16:43.320
with the uncertainty of humans, all that kind of stuff.
link |
03:16:45.520
You have to acknowledge that the robot's behavior
link |
03:16:49.840
has an effect on the human, just as much as the human
link |
03:16:53.360
has an effect on the robot.
link |
03:16:54.600
And there's a dance there.
link |
03:16:56.120
And you have to realize that this entity,
link |
03:16:58.400
when a human sees a robot, this is obvious
link |
03:17:01.680
in a physical manifestation of a robot,
link |
03:17:04.240
they feel a certain way.
link |
03:17:05.800
They have a fear, they have uncertainty.
link |
03:17:07.920
They have their own personal life projections.
link |
03:17:11.880
If they have pets and dogs
link |
03:17:13.240
and the thing looks like a dog,
link |
03:17:14.840
They have their own memories of what a dog is like.
link |
03:17:17.120
They have certain feelings and that's gonna be useful
link |
03:17:19.760
in a safety setting, safety critical setting,
link |
03:17:22.240
which is one of the most nontrivial settings for a robot
link |
03:17:25.280
in terms of how to avoid any kind of dangerous situations.
link |
03:17:29.240
And a robot should really consider that
link |
03:17:32.520
in navigating its environment.
link |
03:17:34.880
And we humans are right to reason about how a robot
link |
03:17:38.760
should consider navigating its environment
link |
03:17:41.480
through anthropomorphization.
link |
03:17:42.840
I also think our brains are designed to think
link |
03:17:46.520
in human terms, like game theory,
link |
03:17:55.320
I think is best applied in the space of human decisions.
link |
03:18:01.680
And so...
link |
03:18:03.440
Right, you're dealing, I mean, with things like AI,
link |
03:18:06.160
AI is, they are, we can somewhat,
link |
03:18:10.520
like, I don't think it's,
link |
03:18:11.960
the reason I say anthropomorphization
link |
03:18:14.160
we need to be careful with is because there is a danger
link |
03:18:17.000
of overly applying, overly wrongly assuming
link |
03:18:20.840
that this artificial intelligence is going to operate
link |
03:18:24.200
in any similar way to us,
link |
03:18:25.880
because it is operating
link |
03:18:27.760
on a fundamentally different substrate.
link |
03:18:29.680
Like even dogs or even mice or whatever, in some ways,
link |
03:18:33.880
like anthropomorphizing them is less of a mistake, I think,
link |
03:18:37.520
than an AI, even though it's an AI we built and so on,
link |
03:18:40.200
because at least we know
link |
03:18:41.440
that they're running from the same substrate.
link |
03:18:43.360
And they've also evolved from the same,
link |
03:18:45.560
out of the same evolutionary process.
link |
03:18:48.680
They've followed this evolution
link |
03:18:50.000
of like needing to compete for resources
link |
03:18:52.400
and needing to find a mate and that kind of stuff.
link |
03:18:55.200
Whereas an AI that has just popped into existence
link |
03:18:58.280
somewhere on like a cloud server,
link |
03:19:00.480
let's say, you know, or whatever, however it runs
link |
03:19:02.840
and whatever, whether it,
link |
03:19:03.920
I don't know whether they have an internal experience.
link |
03:19:05.560
I don't think they necessarily do.
link |
03:19:07.040
In fact, I don't think they do.
link |
03:19:08.120
But the point is, is that to try and apply
link |
03:19:11.920
any kind of modeling of like thinking through problems
link |
03:19:14.880
and decisions in the same way that we do
link |
03:19:16.920
has to be done extremely carefully because they are,
link |
03:19:20.720
like, they're so alien,
link |
03:19:23.560
their method of whatever their form of thinking is,
link |
03:19:26.320
it's just so different because they've never had to evolve,
link |
03:19:29.160
you know, in the same way.
link |
03:19:30.640
Yeah, beautifully put.
link |
03:19:32.280
I was just playing devil's advocate.
link |
03:19:33.760
I do think in certain contexts,
link |
03:19:35.680
anthropomorphization is not gonna hurt you.
link |
03:19:37.920
Yes.
link |
03:19:38.760
Engineers run away from it too fast.
link |
03:19:39.960
I can see that.
link |
03:19:41.040
But for the most part, you're right.
link |
03:19:43.680
Do you have advice for young people today,
link |
03:19:48.680
like the 17 year old that you were,
link |
03:19:51.360
of how to live life?
link |
03:19:53.600
You can be proud of how to have a career
link |
03:19:56.440
you can be proud of in this world full of Molochs.
link |
03:20:00.880
Think about the win wins.
link |
03:20:02.440
Look for win win situations.
link |
03:20:05.520
And be careful not to, you know, overly use your smarts
link |
03:20:11.320
to convince yourself that something is win win
link |
03:20:13.000
when it's not.
link |
03:20:13.840
So that's difficult.
link |
03:20:14.680
And I don't know how to advise, you know, people on that
link |
03:20:17.600
because it's something I'm still figuring out myself.
link |
03:20:20.160
But have that as a sort of default MO.
link |
03:20:25.760
Don't see everything as a zero sum game.
link |
03:20:28.080
Try to find the positive sumness and like find ways
link |
03:20:30.680
if there doesn't seem to be one,
link |
03:20:32.680
consider playing a different game.
link |
03:20:34.200
So that I would suggest that.
link |
03:20:37.000
Do not become a professional poker player.
link |
03:20:38.760
'Cause people always ask that, like, oh, she's a pro.
link |
03:20:41.800
I wanna do that too.
link |
03:20:43.240
Fine, you could have done it if you were, you know,
link |
03:20:45.280
when I started out,
link |
03:20:46.240
it was a very different situation back then.
link |
03:20:48.280
Poker is, you know, a great game to learn
link |
03:20:52.600
in order to understand the ways to think.
link |
03:20:54.960
And I recommend people learn it,
link |
03:20:56.800
but don't try and make a living from it these days.
link |
03:20:58.400
It's almost, it's very, very difficult
link |
03:21:00.280
to the point of being impossible.
link |
03:21:03.640
And then really, really be aware of how much time
link |
03:21:08.800
you spend on your phone and on social media
link |
03:21:12.120
and really try and keep it to a minimum.
link |
03:21:14.280
Be aware that basically every moment that you spend on it
link |
03:21:17.040
is bad for you.
link |
03:21:18.120
So it doesn't mean to say you can never do it,
link |
03:21:20.160
but just have that running in the background.
link |
03:21:22.520
I'm doing a bad thing for myself right now.
link |
03:21:25.280
I think that's the general rule of thumb.
link |
03:21:28.680
Of course, about becoming a professional poker player,
link |
03:21:31.160
if there is a thing in your life that's like that
link |
03:21:35.400
and nobody can convince you otherwise, just fucking do it.
link |
03:21:40.960
Don't listen to anyone's advice.
link |
03:21:44.080
Find a thing that you can't be talked out of, too.
link |
03:21:46.360
That's a thing.
link |
03:21:47.360
I like that, yeah.
link |
03:21:50.160
You were a lead guitarist in a metal band?
link |
03:21:53.880
Oh.
link |
03:21:54.720
Did I write that down from something?
link |
03:21:56.960
What did you, what'd you do it for?
link |
03:22:00.080
The performing, was it the pure, the music of it?
link |
03:22:07.080
Was it just being a rock star?
link |
03:22:08.960
Why'd you do it?
link |
03:22:11.440
So we only ever played two gigs.
link |
03:22:15.400
We didn't last, you know, it wasn't a very,
link |
03:22:17.880
we weren't famous or anything like that.
link |
03:22:20.560
But I was very into metal.
link |
03:22:25.280
Like it was my entire identity,
link |
03:22:27.440
sort of from the age of 16 to 23.
link |
03:22:29.320
What's the best metal band of all time?
link |
03:22:31.640
Don't ask me that, it's so hard to answer.
link |
03:22:36.760
So I know I had a long argument with,
link |
03:22:40.800
I'm a guitarist, more like a classic rock guitarist.
link |
03:22:43.960
So, you know, I've had friends who are very big
link |
03:22:46.600
Pantera fans and so there was often arguments
link |
03:22:49.800
about what's the better metal band,
link |
03:22:52.480
Metallica versus Pantera.
link |
03:22:53.880
This is a more kind of 90s maybe discussion.
link |
03:22:57.360
But I was always on the side of Metallica,
link |
03:23:00.360
both musically and in terms of performance
link |
03:23:02.760
and the depth of lyrics and so on.
link |
03:23:06.120
So, but they were, basically everybody was against me.
link |
03:23:10.280
Because if you're a true metal fan,
link |
03:23:12.280
I guess the idea goes is you can't possibly
link |
03:23:14.520
be a Metallica fan.
link |
03:23:16.160
Because Metallica is pop, it's just like, they sold out.
link |
03:23:19.560
Metallica are metal.
link |
03:23:20.600
Like they were the, I mean, again, you can't say
link |
03:23:24.920
who was the godfather of metal, blah, blah, blah.
link |
03:23:26.520
But like they were so groundbreaking and so brilliant.
link |
03:23:32.680
I mean, you've named literally two of my favorite bands.
link |
03:23:34.920
Like when you asked that question, who are my favorites?
link |
03:23:37.240
Like those were two that came up.
link |
03:23:39.000
A third one is Children of Bodom,
link |
03:23:41.480
who I just think, oh, they just tick all the boxes for me.
link |
03:23:47.240
Yeah, I don't know.
link |
03:23:48.160
It's nowadays, like I kind of sort of feel
link |
03:23:51.800
like a repulsion to the, I was that myself.
link |
03:23:55.680
Like I'd be like, who do you prefer more?
link |
03:23:56.880
Come on, who's like, no, you have to rank them.
link |
03:23:58.880
But it's like this false zero sumness that's like, why?
link |
03:24:01.720
They're so additive.
link |
03:24:02.600
Like there's no conflict there.
link |
03:24:04.560
Although when people ask that kind of question
link |
03:24:06.960
about anything, movies, I feel like it's hard work.
link |
03:24:11.440
And it's unfair, but it's, you should pick one.
link |
03:24:14.880
Like, and that's actually the same kind of,
link |
03:24:17.760
it's like a fear of a commitment.
link |
03:24:19.480
When people ask me, what's your favorite band?
link |
03:24:21.040
It's like, but I, it's good to pick.
link |
03:24:24.280
Exactly.
link |
03:24:25.120
And thank you for the tough question, yeah.
link |
03:24:27.640
Well, maybe not in the context
link |
03:24:29.480
when a lot of people are listening.
link |
03:24:31.800
Yeah, I'm not just like, what, why does it matter?
link |
03:24:33.960
No, it does.
link |
03:24:35.560
Are you still into metal?
link |
03:24:37.400
Funny enough, I was listening to a bunch
link |
03:24:38.520
before I came over here.
link |
03:24:39.840
Oh, like, do you use it for like motivation
link |
03:24:43.680
or it gets you in a certain?
link |
03:24:44.640
Yeah, I was weirdly listening
link |
03:24:45.680
to 80s hair metal before I came.
link |
03:24:48.120
Does that count as metal?
link |
03:24:49.840
I think so, it's like proto metal and it's happy.
link |
03:24:53.440
It's optimistic, happy proto metal.
link |
03:24:56.760
Yeah, I mean, all these genres bleed into each other.
link |
03:25:00.040
But yeah, sorry, to answer your question
link |
03:25:01.240
about guitar playing, my relationship with it
link |
03:25:04.280
was kind of weird in that I was deeply uncreative.
link |
03:25:09.120
My objective would be to hear some really hard
link |
03:25:11.240
technical solo and then learn it, memorize it
link |
03:25:14.320
and then play it perfectly.
link |
03:25:15.960
But I was incapable of trying to write my own music.
link |
03:25:19.200
Like the idea was just absolutely terrifying.
link |
03:25:23.120
But I was also just thinking, I was like,
link |
03:25:24.640
it'd be kind of cool to actually try starting a band again
link |
03:25:28.600
and getting back into it and write.
link |
03:25:31.520
But it's scary.
link |
03:25:34.000
It's scary.
link |
03:25:34.840
I mean, I put out some guitar playing
link |
03:25:36.840
just other people's covers.
link |
03:25:38.120
I play Comfortably Numb on the internet.
link |
03:25:41.440
And it's scary too.
link |
03:25:42.480
It's scary putting stuff out there.
link |
03:25:45.320
And I had this similar kind of fascination
link |
03:25:47.680
with technical playing, both on piano and guitar.
link |
03:25:50.280
You know, one of the first,
link |
03:25:55.680
one of the reasons that I started learning guitar
link |
03:25:58.120
is from Ozzy Osbourne, Mr. Crowley's solo.
link |
03:26:01.760
And one of the first solos I learned is that,
link |
03:26:06.760
there's a beauty to it.
link |
03:26:07.600
There's a lot of beauty to it.
link |
03:26:08.440
It's tapping, right?
link |
03:26:09.280
Yeah, there's some tapping, but it's just really fast.
link |
03:26:14.480
Beautiful, like arpeggios.
link |
03:26:15.720
Yeah, arpeggios, yeah.
link |
03:26:16.960
But there's a melody that you can hear through it,
link |
03:26:19.600
but there's also build up.
link |
03:26:21.600
It's a beautiful solo,
link |
03:26:22.800
but it's also technically just visually the way it looks
link |
03:26:25.920
when a person's watching, you feel like a rockstar playing.
link |
03:26:29.640
But it ultimately has to do with technical.
link |
03:26:33.560
You're not developing the part of your brain
link |
03:26:36.440
that I think requires you to generate beautiful music.
link |
03:26:40.640
It is ultimately technical in nature.
link |
03:26:42.360
And so that took me a long time to let go of that
link |
03:26:45.840
and just be able to write music myself.
link |
03:26:48.960
And that's a different journey, I think.
link |
03:26:53.120
I think that journey is a little bit more inspired
link |
03:26:55.080
in the blues world, for example,
link |
03:26:57.080
where improvisation is more valued,
link |
03:26:58.640
obviously in jazz and so on.
link |
03:26:59.960
But I think ultimately it's a more rewarding journey
link |
03:27:04.960
because you get your relationship with the guitar
link |
03:27:08.360
then becomes a kind of escape from the world
link |
03:27:12.120
where you can create, I mean, creating stuff is.
link |
03:27:17.120
And it's something you work with,
link |
03:27:18.720
because my relationship with my guitar was like,
link |
03:27:20.240
it was something to tame and defeat.
link |
03:27:23.360
Yeah, it's a challenge.
link |
03:27:24.760
Which was kind of what my whole personality was back then.
link |
03:27:26.960
I was just very like, as I said, very competitive,
link |
03:27:29.520
very just like must bend this thing to my will.
link |
03:27:33.120
Whereas writing music, it's like a dance, you work with it.
link |
03:27:37.840
But I think because of the competitive aspect,
link |
03:27:39.880
for me at least, that's still there,
link |
03:27:42.440
which creates anxiety about playing publicly
link |
03:27:45.720
or all that kind of stuff.
link |
03:27:47.080
I think there's just like a harsh self criticism
link |
03:27:49.280
within the whole thing.
link |
03:27:50.680
It's really tough.
link |
03:27:53.320
I wanna hear some of your stuff.
link |
03:27:55.320
I mean, there's certain things that feel really personal.
link |
03:27:59.240
And on top of that, as we talked about poker offline,
link |
03:28:03.360
there's certain things that you get to a certain height
link |
03:28:05.560
in your life, and that doesn't have to be very high,
link |
03:28:07.160
but you get to a certain height
link |
03:28:09.080
and then you put it aside for a bit.
link |
03:28:11.760
And it's hard to return to it
link |
03:28:13.160
because you remember being good.
link |
03:28:15.960
And it's hard to, like you being at a very high level
link |
03:28:19.880
in poker, it might be hard for you to return to poker
link |
03:28:22.560
every once in a while and enjoy it,
link |
03:28:24.600
knowing that you're just not as sharp as you used to be
link |
03:28:26.920
because you're not doing it every single day.
link |
03:28:29.680
That's something I always wonder with,
link |
03:28:31.320
I mean, even just like in chess with Kasparov,
link |
03:28:33.760
some of these greats, just returning to it,
link |
03:28:36.200
it's almost painful.
link |
03:28:38.400
And I feel that way with guitar too,
link |
03:28:40.680
because I used to play every day a lot.
link |
03:28:44.160
So returning to it is painful
link |
03:28:46.080
because it's like accepting the fact
link |
03:28:48.720
that this whole ride is finite
link |
03:28:51.480
and that you have a prime,
link |
03:28:55.760
there's a time when you were really good
link |
03:28:57.400
and now it's over and now.
link |
03:28:58.800
We're on a different chapter of life.
link |
03:29:00.360
I was like, oh, but I miss that.
link |
03:29:02.960
But you can still discover joy within that process.
link |
03:29:06.560
It's been tough, especially with some level of like,
link |
03:29:10.080
as people get to know you, and people film stuff,
link |
03:29:13.680
you don't have the privacy of just sharing something
link |
03:29:18.960
with a few people around you.
link |
03:29:20.600
Yeah.
link |
03:29:21.600
That's a beautiful privacy.
link |
03:29:23.080
That's a good point.
link |
03:29:23.920
With the internet, it's just disappearing.
link |
03:29:26.040
Yeah, that's a really good point.
link |
03:29:27.760
Yeah.
link |
03:29:29.160
But all those pressures aside,
link |
03:29:31.120
if you really, you can step up
link |
03:29:32.920
and still enjoy the fuck out of a good musical performance.
link |
03:29:39.120
What do you think is the meaning of this whole thing?
link |
03:29:42.360
What's the meaning of life?
link |
03:29:43.600
Oh, wow.
link |
03:29:45.600
It's in your name, as we talked about.
link |
03:29:47.320
You have to live up.
link |
03:29:48.600
Do you feel the requirement
link |
03:29:50.960
to have to live up to your name?
link |
03:29:54.200
Because live?
link |
03:29:55.200
Yeah.
link |
03:29:56.040
No, because I don't see it.
link |
03:29:57.760
I mean, my, oh, again, it's kind of like,
link |
03:30:02.440
no, I don't know.
link |
03:30:03.280
Because my full name is Olivia.
link |
03:30:05.320
Yeah.
link |
03:30:06.160
So I can retreat in that and be like,
link |
03:30:07.200
oh, Olivia, what does that even mean?
link |
03:30:10.000
Live up to live.
link |
03:30:12.320
No, I can't say I do,
link |
03:30:13.880
because I've never thought of it that way.
link |
03:30:15.400
And then your name backwards is evil.
link |
03:30:17.280
That's what we also talked about.
link |
03:30:18.880
I mean, I feel the urge to live up to that,
link |
03:30:21.880
to be the inverse of evil or even better.
link |
03:30:25.880
Because I don't think, you know,
link |
03:30:27.480
is the inverse of evil good
link |
03:30:29.000
or is good something completely separate to that?
link |
03:30:32.000
I think my intuition says it's the latter,
link |
03:30:34.040
but I don't know.
link |
03:30:34.880
Anyway, again, getting in the weeds.
link |
03:30:36.880
What is the meaning of all this?
link |
03:30:39.280
Of life.
link |
03:30:41.240
Why are we here?
link |
03:30:43.880
I think to,
link |
03:30:44.720
explore, have fun and understand
link |
03:30:48.080
and make more of here and to keep the game going.
link |
03:30:51.080
Of here?
link |
03:30:51.920
More of here?
link |
03:30:52.760
More of this, whatever this is.
link |
03:30:55.280
More of experience.
link |
03:30:57.240
Just to have more of experience
link |
03:30:58.240
and ideally positive experience.
link |
03:31:01.240
And more complex, you know,
link |
03:31:05.240
I guess, try and put it into a sort of
link |
03:31:06.840
vaguely scientific term.
link |
03:31:10.240
I don't know.
link |
03:31:11.080
I don't know.
link |
03:31:11.920
I don't know.
link |
03:31:12.760
But make it so that the program required,
link |
03:31:18.360
the length of code required to describe the universe
link |
03:31:21.480
is as long as possible.
link |
03:31:23.560
And, you know, highly complex and therefore interesting.
link |
03:31:27.120
Because again, like,
link |
03:31:28.960
I know, you know, we bang the metaphor to death,
link |
03:31:32.200
but like, tiled with X, you know,
link |
03:31:36.000
tiled with paperclips,
link |
03:31:37.880
doesn't require that much of a code to describe.
link |
03:31:41.120
Obviously, maybe something emerges from it,
link |
03:31:42.720
but that steady state, assuming a steady state,
link |
03:31:44.840
it's not very interesting.
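The "length of code required to describe the universe" idea can be made concrete with a toy experiment. Compressed size is only a crude, computable stand-in for true description length (Kolmogorov complexity is uncomputable), and this illustration is mine, not from the conversation: a world "tiled with paperclips" is like a purely repetitive byte string, which compresses to almost nothing, while a diverse world is like random bytes, which barely compress at all.

```python
import random
import zlib

# Crude proxy for "length of code required to describe" a world:
# the zlib-compressed size of a byte string. True Kolmogorov
# complexity is uncomputable; compression gives an upper bound.
random.seed(0)  # deterministic so the experiment is reproducible

# A universe "tiled with paperclips": pure repetition, 90,000 bytes.
tiled = b"paperclip" * 10_000

# A "rich, diverse" universe: 90,000 essentially incompressible bytes.
diverse = bytes(random.randrange(256) for _ in range(90_000))

tiled_len = len(zlib.compress(tiled))
diverse_len = len(zlib.compress(diverse))

# The repetitive world needs only a tiny description; the diverse
# world's description stays close to its full size.
print(tiled_len, diverse_len)
```

One caveat to the sketch: pure noise also maximizes description length without being "interesting," which is why measures like logical depth or sophistication are sometimes proposed for the kind of structured complexity being described here.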
link |
03:31:45.680
Whereas it seems like our universe is over time
link |
03:31:49.720
becoming more and more complex and interesting.
link |
03:31:51.960
There's so much richness and beauty and diversity
link |
03:31:54.000
on this earth.
link |
03:31:54.840
And I want that to continue and get more.
link |
03:31:56.280
I want more diversity.
link |
03:31:58.640
And the very best sense of that word
link |
03:32:01.880
is to me the goal of all this.
link |
03:32:07.040
Yeah.
link |
03:32:07.880
And somehow have fun in the process.
link |
03:32:10.880
Yes.
link |
03:32:12.120
Because we do create a lot of fun things along
link |
03:28:14.360
the way, in this creative force
link |
03:32:18.000
and all the beautiful things we create,
link |
03:32:19.480
somehow there's like a funness to it.
link |
03:32:22.520
And perhaps that has to do with the finiteness of life,
link |
03:32:25.160
the finiteness of all these experiences,
link |
03:32:28.400
which is what makes them kind of unique.
link |
03:32:31.360
Like the fact that they end,
link |
03:32:32.920
there's this, whatever it is,
link |
03:32:35.480
falling in love or creating a piece of art
link |
03:32:40.480
or creating a bridge or creating a rocket
link |
03:32:45.560
or creating a, I don't know,
link |
03:32:48.880
just the businesses that build something
link |
03:32:53.160
or solve something.
link |
03:32:56.160
The fact that it is born and it dies
link |
03:33:00.840
somehow embeds it with fun, with joy
link |
03:33:07.040
for the people involved.
link |
03:33:08.440
I don't know what that is.
link |
03:33:09.920
The finiteness of it.
link |
03:33:11.400
It can do.
link |
03:33:12.240
Some people struggle with the,
link |
03:33:13.800
I mean, a big thing I think that one has to learn
link |
03:33:17.880
is being okay with things coming to an end.
link |
03:33:21.120
And in terms of like projects and so on,
link |
03:33:25.640
people cling onto things beyond
link |
03:33:27.120
what they're meant to be doing,
link |
03:33:28.520
beyond what is reasonable.
link |
03:33:32.040
And I'm gonna have to come to terms
link |
03:33:33.880
with this podcast coming to an end.
link |
03:33:35.800
I really enjoyed talking to you.
link |
03:33:37.080
I think it's obvious as we've talked about many times,
link |
03:33:40.160
you should be doing a podcast.
link |
03:33:41.360
You should, you're already doing a lot of stuff publicly
link |
03:33:45.960
to the world, which is awesome.
link |
03:33:47.360
And you're a great educator.
link |
03:33:48.440
You're a great mind.
link |
03:33:49.280
You're a great intellect.
link |
03:33:50.280
But it's also this whole medium of just talking
link |
03:33:52.840
is also fun.
link |
03:33:53.680
It is good.
link |
03:33:54.520
It's a fun one.
link |
03:33:55.360
It really is good.
link |
03:33:56.200
And it's just, it's nothing but like,
link |
03:33:58.840
oh, it's just so much fun.
link |
03:34:00.560
And you can just get into so many,
link |
03:34:03.160
yeah, there's this space to just explore
link |
03:34:05.240
and see what comes and emerges.
link |
03:34:07.440
And yeah.
link |
03:34:08.280
Yeah, to understand yourself better.
link |
03:34:09.320
And if you're talking to others,
link |
03:34:10.320
to understand them better and together with them.
link |
03:34:12.560
I mean, you should do your own podcast,
link |
03:34:15.320
but you should also do a podcast with C
link |
03:34:16.960
as we've talked about.
link |
03:34:18.640
The two of you have such different minds
link |
03:34:22.800
that like melt together in just hilarious ways,
link |
03:34:26.040
fascinating ways, just the tension of ideas there
link |
03:34:29.320
is really powerful.
link |
03:34:30.240
But in general, I think you got a beautiful voice.
link |
03:34:33.200
So thank you so much for talking today.
link |
03:34:35.320
Thank you for being a friend.
link |
03:34:36.520
Thank you for honoring me with this conversation
link |
03:34:39.320
and with your valuable time.
link |
03:34:40.400
Thanks, Liv.
link |
03:34:41.240
Thank you.
link |
03:34:42.560
Thanks for listening to this conversation with Liv Boeree.
link |
03:34:45.160
To support this podcast,
link |
03:34:46.280
please check out our sponsors in the description.
link |
03:34:48.800
And now let me leave you with some words
link |
03:34:50.880
from Richard Feynman.
link |
03:34:53.080
I think it's much more interesting to live not knowing
link |
03:34:56.240
than to have answers, which might be wrong.
link |
03:34:59.280
I have approximate answers and possible beliefs
link |
03:35:01.800
and different degrees of uncertainty about different things,
link |
03:35:05.120
but I'm not absolutely sure of anything.
link |
03:35:08.520
And there are many things I don't know anything about,
link |
03:35:11.480
such as whether it means anything to ask why we're here.
link |
03:35:15.440
I don't have to know the answer.
link |
03:35:17.560
I don't feel frightened not knowing things
link |
03:35:20.760
by being lost in a mysterious universe without any purpose,
link |
03:35:24.280
which is the way it really is as far as I can tell.
link |
03:35:27.520
Thank you for listening and hope to see you next time.